Pilkington Football Club is a football club based in St Helens, Merseyside, England. They are currently members of the Premier Division of the North West Counties League and play at Ruskin Drive.
History
The club was established in 1938 and joined both the Liverpool Business Houses League and the St Helens Combination. The name originates from the Pilkington glass factory in St Helens. They later became members of the Liverpool County Football Combination. In 1980 the club joined Division Two of the Mid-Cheshire League. When Division Two was abolished in 1983, they became part of the league's single division. The club finished bottom of the league in 1986–87.
A new second division of the Mid-Cheshire League was formed in 1987 and Pilkington became members of Division One. However, after finishing bottom of the league in 1988–89 and 1990–91, they were relegated to Division Two. The club were Division Two runners-up in 1997–98, resulting in promotion to Division One. They finished bottom of the division in 1999–2000, but were reprieved from relegation after another club left the league. The following season saw the club win the Liverpool Junior Cup, a trophy they won again in 2004–05. In 2007 the league was renamed the Cheshire League.
In 2010–11 Pilkington were runners-up in Division One, missing out on the title on goal difference. The division was renamed the Premier Division in 2014. The club finished bottom of the Premier Division in 2014–15 and were relegated to Division One. They won the league's Presidents Cup in 2016–17, and after finishing as runners-up in Division One in 2017–18, the club were promoted back to the Premier Division. The following season saw them win the Premier Division title, resulting in promotion to Division One North of the North West Counties League. They were Division One North champions in 2022–23 and were promoted to the Premier Division.
Ground
The club originally played at Crossley Road. They moved to Ruskin Drive at the beginning of the 1948–49 season.
Honours
Cheshire League
Premier Division champions 2018–19
Presidents Cup winners 2016–17
Liverpool Junior Cup
Winners 2000–01, 2004–05
See also
Pilkington F.C. players
References
External links
Football clubs in England
Football clubs in Merseyside
Sport in St Helens, Merseyside
1938 establishments in England
Association football clubs established in 1938
Liverpool County Football Combination
Cheshire Association Football League
North West Counties Football League clubs
Works association football teams in England
Fan-owned football clubs in England
|
The Ritz Cinema (better known as The Ritz) is a Grade II listed Art Deco former cinema located on Abbey Street, Nuneaton. It opened on 23 July 1937 as part of the Union Cinemas circuit, but ABC Cinemas took over the building in October of the same year. The Ritz stopped showing films in 1984, after which the building was converted to a bingo hall and traded as such until its closure.
History
The Ritz was designed by the cinema architects Verity and Beverley for Union Cinemas, a circuit founded by David Bernhard and his son, C.F. Bernhard. It opened to the public for the first time on 23 July 1937. The opening ceremony was performed by the Mayor of Nuneaton, Councillor T.L. Liggins, accompanied by the Mayoress, with nearly 2,000 people inside. Melody for Two and Mysterious Crossing were shown on the opening night, and a live show took place on stage.
Around the time the Ritz was being constructed, plans were unveiled for a Danilo Cinema in Nuneaton, but the scheme never went ahead because seven cinemas already existed in the town. Mortimer Dent, of the Danilo Cinema Company, remained interested in the area and ultimately chose Hinckley. The proposed Nuneaton location would have been the first of his Danilo cinemas.
The general contractors for the building, G. E. & W. Wincott of Nuneaton, were also the contractors of the Danilo Cinema in Hinckley. Reports state that there was 'a race' to see which of the two cinemas would open first. The Ritz won and opened just three days before the Danilo.
It closed as a Gala Bingo hall in 2007, with the closure blamed on the smoking ban. A year later, in 2008, the building received its Grade II listed protection.
Closure
While the building remained closed from 2008 until it was finally purchased in 2019, various community projects were launched and many parties showed interest in bringing it back into use.
The Ritz was placed back on the market in June 2008 at a lower asking price of £1.2 million, down from its original £1.5 million price tag, following "abortive negotiations" with an interested party. Around the same time, rumours circulated that fashion retailer Primark was taking on the site, but these were denied by chartered surveyors Wright Silverwood.
In November 2009, after failing to meet its £320,000 reserve price at auction, the building was sold privately to a 'mystery' investor, believed to be from Nuneaton, who was not named.
In February 2010, "serious interest" was shown in the building by two national leisure operators and a national retailer. None were bingo operators, as a covenant agreed with Gala Bingo while they held the building meant it could not be used as a bingo hall again after its closure.
In July 2013, flyposters were replaced with cinema-related art when the local Nuneaton-based group Art Alert visited the venue as part of their 'Friends of the Ritz' project. The same year, film director Ken Loach, who was born in Nuneaton, backed a £5m plan to bring the venue back into use, though nothing came of it.
On 27 February 2019, the building was sold for £339,500.
Compton organ
The Ritz had a Compton organ in its orchestra pit, played by organist Ken Stroud. The organ remains in use at a church in Essex.
References
Art Deco architecture in England
Buildings and structures in Nuneaton
Former cinemas in England
Grade II listed buildings in Warwickshire
|
```csharp
// The .NET Foundation licenses this file to you under the MIT license.
using System;
using System.Collections.Generic;

namespace Microsoft.NET.Build.Tasks.ConflictResolution
{
class PackageRank
{
private Dictionary<string, int> packageRanks;
public PackageRank(string[] packageIds)
{
var numPackages = packageIds?.Length ?? 0;
// cache ranks for fast lookup
packageRanks = new Dictionary<string, int>(numPackages, StringComparer.OrdinalIgnoreCase);
for (int i = numPackages - 1; i >= 0; i--)
{
var preferredPackageId = packageIds[i].Trim();
if (preferredPackageId.Length != 0)
{
// overwrite any duplicates, lowest rank will win.
packageRanks[preferredPackageId] = i;
}
}
}
/// <summary>
/// Gets the rank of a package; lower ranks are preferred.
/// </summary>
/// <param name="packageId">id of the package</param>
/// <returns>rank of the package, or int.MaxValue if it has no rank</returns>
public int GetPackageRank(string packageId)
{
int rank;
if (packageId != null && packageRanks.TryGetValue(packageId, out rank))
{
return rank;
}
return int.MaxValue;
}
}
}
```
|
```go
package migrations
import (
"context"
"database/sql"
"github.com/pressly/goose/v3"
)
func init() {
goose.AddMigrationContext(Up20200419222708, Down20200419222708)
}
func Up20200419222708(_ context.Context, tx *sql.Tx) error {
notice(tx, "A full rescan will be performed to change the search behaviour")
return forceFullRescan(tx)
}
func Down20200419222708(_ context.Context, tx *sql.Tx) error {
return nil
}
```
|
```javascript
'use strict'
var http = require('http')
, https = require('https')
, url = require('url')
, util = require('util')
, stream = require('stream')
, zlib = require('zlib')
, bl = require('bl')
, hawk = require('hawk')
, aws = require('aws-sign2')
, httpSignature = require('http-signature')
, mime = require('mime-types')
, stringstream = require('stringstream')
, caseless = require('caseless')
, ForeverAgent = require('forever-agent')
, FormData = require('form-data')
, helpers = require('./lib/helpers')
, cookies = require('./lib/cookies')
, getProxyFromURI = require('./lib/getProxyFromURI')
, Querystring = require('./lib/querystring').Querystring
, Har = require('./lib/har').Har
, Auth = require('./lib/auth').Auth
, OAuth = require('./lib/oauth').OAuth
, Multipart = require('./lib/multipart').Multipart
, Redirect = require('./lib/redirect').Redirect
, Tunnel = require('./lib/tunnel').Tunnel
var safeStringify = helpers.safeStringify
, isReadStream = helpers.isReadStream
, toBase64 = helpers.toBase64
, defer = helpers.defer
, copy = helpers.copy
, version = helpers.version
, globalCookieJar = cookies.jar()
var globalPool = {}
function filterForNonReserved(reserved, options) {
// Keep only the properties that are not reserved.
// Reserved values are passed in at call site.
var object = {}
for (var i in options) {
var notReserved = (reserved.indexOf(i) === -1)
if (notReserved) {
object[i] = options[i]
}
}
return object
}
function filterOutReservedFunctions(reserved, options) {
// Filter out properties that are functions and are reserved.
// Reserved values are passed in at call site.
var object = {}
for (var i in options) {
var isReserved = !(reserved.indexOf(i) === -1)
var isFunction = (typeof options[i] === 'function')
if (!(isReserved && isFunction)) {
object[i] = options[i]
}
}
return object
}
// Function for properly handling a connection error
function connectionErrorHandler(error) {
var socket = this
if (socket.res) {
if (socket.res.request) {
socket.res.request.emit('error', error)
} else {
socket.res.emit('error', error)
}
} else {
socket._httpMessage.emit('error', error)
}
}
// Return a simpler request object to allow serialization
function requestToJSON() {
var self = this
return {
uri: self.uri,
method: self.method,
headers: self.headers
}
}
// Return a simpler response object to allow serialization
function responseToJSON() {
var self = this
return {
statusCode: self.statusCode,
body: self.body,
headers: self.headers,
request: requestToJSON.call(self.request)
}
}
function Request (options) {
// if given the method property in options, set property explicitMethod to true
// extend the Request instance with any non-reserved properties
// remove any reserved functions from the options object
// set Request instance to be readable and writable
// call init
var self = this
// start with HAR, then override with additional options
if (options.har) {
self._har = new Har(self)
options = self._har.options(options)
}
stream.Stream.call(self)
var reserved = Object.keys(Request.prototype)
var nonReserved = filterForNonReserved(reserved, options)
util._extend(self, nonReserved)
options = filterOutReservedFunctions(reserved, options)
self.readable = true
self.writable = true
if (options.method) {
self.explicitMethod = true
}
self._qs = new Querystring(self)
self._auth = new Auth(self)
self._oauth = new OAuth(self)
self._multipart = new Multipart(self)
self._redirect = new Redirect(self)
self._tunnel = new Tunnel(self)
self.init(options)
}
util.inherits(Request, stream.Stream)
// Debugging
Request.debug = process.env.NODE_DEBUG && /\brequest\b/.test(process.env.NODE_DEBUG)
function debug() {
if (Request.debug) {
console.error('REQUEST %s', util.format.apply(util, arguments))
}
}
Request.prototype.debug = debug
Request.prototype.init = function (options) {
// init() contains all the code to setup the request object.
// the actual outgoing request is not started until start() is called
// this function is called from both the constructor and on redirect.
var self = this
if (!options) {
options = {}
}
self.headers = self.headers ? copy(self.headers) : {}
// Delete headers with value undefined since they break
// ClientRequest.OutgoingMessage.setHeader in node 0.12
for (var headerName in self.headers) {
if (typeof self.headers[headerName] === 'undefined') {
delete self.headers[headerName]
}
}
caseless.httpify(self, self.headers)
if (!self.method) {
self.method = options.method || 'GET'
}
if (!self.localAddress) {
self.localAddress = options.localAddress
}
self._qs.init(options)
debug(options)
if (!self.pool && self.pool !== false) {
self.pool = globalPool
}
self.dests = self.dests || []
self.__isRequestRequest = true
// Protect against double callback
if (!self._callback && self.callback) {
self._callback = self.callback
self.callback = function () {
if (self._callbackCalled) {
return // Print a warning maybe?
}
self._callbackCalled = true
self._callback.apply(self, arguments)
}
self.on('error', self.callback.bind())
self.on('complete', self.callback.bind(self, null))
}
// People use this property instead all the time, so support it
if (!self.uri && self.url) {
self.uri = self.url
delete self.url
}
// If there's a baseUrl, then use it as the base URL (i.e. uri must be
// specified as a relative path and is appended to baseUrl).
if (self.baseUrl) {
if (typeof self.baseUrl !== 'string') {
return self.emit('error', new Error('options.baseUrl must be a string'))
}
if (typeof self.uri !== 'string') {
return self.emit('error', new Error('options.uri must be a string when using options.baseUrl'))
}
if (self.uri.indexOf('//') === 0 || self.uri.indexOf('://') !== -1) {
return self.emit('error', new Error('options.uri must be a path when using options.baseUrl'))
}
// Handle all cases to make sure that there's only one slash between
// baseUrl and uri.
var baseUrlEndsWithSlash = self.baseUrl.lastIndexOf('/') === self.baseUrl.length - 1
var uriStartsWithSlash = self.uri.indexOf('/') === 0
if (baseUrlEndsWithSlash && uriStartsWithSlash) {
self.uri = self.baseUrl + self.uri.slice(1)
} else if (baseUrlEndsWithSlash || uriStartsWithSlash) {
self.uri = self.baseUrl + self.uri
} else if (self.uri === '') {
self.uri = self.baseUrl
} else {
self.uri = self.baseUrl + '/' + self.uri
}
delete self.baseUrl
}
// A URI is needed by this point, emit error if we haven't been able to get one
if (!self.uri) {
return self.emit('error', new Error('options.uri is a required argument'))
}
// If a string URI/URL was given, parse it into a URL object
if (typeof self.uri === 'string') {
self.uri = url.parse(self.uri)
}
// DEPRECATED: Warning for users of the old Unix Sockets URL Scheme
if (self.uri.protocol === 'unix:') {
return self.emit('error', new Error('`unix://` URL scheme is no longer supported. Please use the format `path_to_url`'))
}
// Support Unix Sockets
if (self.uri.host === 'unix') {
self.enableUnixSocket()
}
if (self.strictSSL === false) {
self.rejectUnauthorized = false
}
if (!self.uri.pathname) {self.uri.pathname = '/'}
if (!(self.uri.host || (self.uri.hostname && self.uri.port)) && !self.uri.isUnix) {
// Invalid URI: it may generate lot of bad errors, like 'TypeError: Cannot call method `indexOf` of undefined' in CookieJar
// Detect and reject it as soon as possible
var faultyUri = url.format(self.uri)
var message = 'Invalid URI "' + faultyUri + '"'
if (Object.keys(options).length === 0) {
// No option ? This can be the sign of a redirect
// As this is a case where the user cannot do anything (they didn't call request directly with this URL)
// they should be warned that it can be caused by a redirection (can save some hair)
message += '. This can be caused by a crappy redirection.'
}
// This error was fatal
return self.emit('error', new Error(message))
}
if (!self.hasOwnProperty('proxy')) {
self.proxy = getProxyFromURI(self.uri)
}
self.tunnel = self._tunnel.isEnabled(options)
if (self.proxy) {
self._tunnel.setup(options)
}
self._redirect.onRequest(options)
self.setHost = false
if (!self.hasHeader('host')) {
var hostHeaderName = self.originalHostHeaderName || 'host'
self.setHeader(hostHeaderName, self.uri.hostname)
if (self.uri.port) {
if ( !(self.uri.port === 80 && self.uri.protocol === 'http:') &&
!(self.uri.port === 443 && self.uri.protocol === 'https:') ) {
self.setHeader(hostHeaderName, self.getHeader('host') + (':' + self.uri.port) )
}
}
self.setHost = true
}
self.jar(self._jar || options.jar)
if (!self.uri.port) {
if (self.uri.protocol === 'http:') {self.uri.port = 80}
else if (self.uri.protocol === 'https:') {self.uri.port = 443}
}
if (self.proxy && !self.tunnel) {
self.port = self.proxy.port
self.host = self.proxy.hostname
} else {
self.port = self.uri.port
self.host = self.uri.hostname
}
if (options.form) {
self.form(options.form)
}
if (options.formData) {
var formData = options.formData
var requestForm = self.form()
var appendFormValue = function (key, value) {
if (value.hasOwnProperty('value') && value.hasOwnProperty('options')) {
requestForm.append(key, value.value, value.options)
} else {
requestForm.append(key, value)
}
}
for (var formKey in formData) {
if (formData.hasOwnProperty(formKey)) {
var formValue = formData[formKey]
if (formValue instanceof Array) {
for (var j = 0; j < formValue.length; j++) {
appendFormValue(formKey, formValue[j])
}
} else {
appendFormValue(formKey, formValue)
}
}
}
}
if (options.qs) {
self.qs(options.qs)
}
if (self.uri.path) {
self.path = self.uri.path
} else {
self.path = self.uri.pathname + (self.uri.search || '')
}
if (self.path.length === 0) {
self.path = '/'
}
// Auth must happen last in case signing is dependent on other headers
if (options.aws) {
self.aws(options.aws)
}
if (options.hawk) {
self.hawk(options.hawk)
}
if (options.httpSignature) {
self.httpSignature(options.httpSignature)
}
if (options.auth) {
if (Object.prototype.hasOwnProperty.call(options.auth, 'username')) {
options.auth.user = options.auth.username
}
if (Object.prototype.hasOwnProperty.call(options.auth, 'password')) {
options.auth.pass = options.auth.password
}
self.auth(
options.auth.user,
options.auth.pass,
options.auth.sendImmediately,
options.auth.bearer
)
}
if (self.gzip && !self.hasHeader('accept-encoding')) {
self.setHeader('accept-encoding', 'gzip')
}
if (self.uri.auth && !self.hasHeader('authorization')) {
var uriAuthPieces = self.uri.auth.split(':').map(function(item) {return self._qs.unescape(item)})
self.auth(uriAuthPieces[0], uriAuthPieces.slice(1).join(':'), true)
}
if (!self.tunnel && self.proxy && self.proxy.auth && !self.hasHeader('proxy-authorization')) {
var proxyAuthPieces = self.proxy.auth.split(':').map(function(item) {return self._qs.unescape(item)})
var authHeader = 'Basic ' + toBase64(proxyAuthPieces.join(':'))
self.setHeader('proxy-authorization', authHeader)
}
if (self.proxy && !self.tunnel) {
self.path = (self.uri.protocol + '//' + self.uri.host + self.path)
}
if (options.json) {
self.json(options.json)
}
if (options.multipart) {
self.multipart(options.multipart)
}
if (options.time) {
self.timing = true
self.elapsedTime = self.elapsedTime || 0
}
function setContentLength () {
if (!self.hasHeader('content-length')) {
var length
if (typeof self.body === 'string') {
length = Buffer.byteLength(self.body)
}
else if (Array.isArray(self.body)) {
length = self.body.reduce(function (a, b) {return a + b.length}, 0)
}
else {
length = self.body.length
}
if (length) {
self.setHeader('content-length', length)
} else {
self.emit('error', new Error('Argument error, options.body.'))
}
}
}
if (self.body) {
setContentLength()
}
if (options.oauth) {
self.oauth(options.oauth)
} else if (self._oauth.params && self.hasHeader('authorization')) {
self.oauth(self._oauth.params)
}
var protocol = self.proxy && !self.tunnel ? self.proxy.protocol : self.uri.protocol
, defaultModules = {'http:':http, 'https:':https}
, httpModules = self.httpModules || {}
self.httpModule = httpModules[protocol] || defaultModules[protocol]
if (!self.httpModule) {
return self.emit('error', new Error('Invalid protocol: ' + protocol))
}
if (options.ca) {
self.ca = options.ca
}
if (!self.agent) {
if (options.agentOptions) {
self.agentOptions = options.agentOptions
}
if (options.agentClass) {
self.agentClass = options.agentClass
} else if (options.forever) {
var v = version()
// use ForeverAgent in node 0.10- only
if (v.major === 0 && v.minor <= 10) {
self.agentClass = protocol === 'http:' ? ForeverAgent : ForeverAgent.SSL
} else {
self.agentClass = self.httpModule.Agent
self.agentOptions = self.agentOptions || {}
self.agentOptions.keepAlive = true
}
} else {
self.agentClass = self.httpModule.Agent
}
}
if (self.pool === false) {
self.agent = false
} else {
self.agent = self.agent || self.getNewAgent()
}
self.on('pipe', function (src) {
if (self.ntick && self._started) {
self.emit('error', new Error('You cannot pipe to this stream after the outbound request has started.'))
}
self.src = src
if (isReadStream(src)) {
if (!self.hasHeader('content-type')) {
self.setHeader('content-type', mime.lookup(src.path))
}
} else {
if (src.headers) {
for (var i in src.headers) {
if (!self.hasHeader(i)) {
self.setHeader(i, src.headers[i])
}
}
}
if (self._json && !self.hasHeader('content-type')) {
self.setHeader('content-type', 'application/json')
}
if (src.method && !self.explicitMethod) {
self.method = src.method
}
}
// self.on('pipe', function () {
// console.error('You have already piped to this stream. Pipeing twice is likely to break the request.')
// })
})
defer(function () {
if (self._aborted) {
return
}
var end = function () {
if (self._form) {
if (!self._auth.hasAuth) {
self._form.pipe(self)
}
else if (self._auth.hasAuth && self._auth.sentAuth) {
self._form.pipe(self)
}
}
if (self._multipart && self._multipart.chunked) {
self._multipart.body.pipe(self)
}
if (self.body) {
setContentLength()
if (Array.isArray(self.body)) {
self.body.forEach(function (part) {
self.write(part)
})
} else {
self.write(self.body)
}
self.end()
} else if (self.requestBodyStream) {
console.warn('options.requestBodyStream is deprecated, please pass the request object to stream.pipe.')
self.requestBodyStream.pipe(self)
} else if (!self.src) {
if (self._auth.hasAuth && !self._auth.sentAuth) {
self.end()
return
}
if (self.method !== 'GET' && typeof self.method !== 'undefined') {
self.setHeader('content-length', 0)
}
self.end()
}
}
if (self._form && !self.hasHeader('content-length')) {
// Before ending the request, we had to compute the length of the whole form, asyncly
self.setHeader(self._form.getHeaders(), true)
self._form.getLength(function (err, length) {
if (!err) {
self.setHeader('content-length', length)
}
end()
})
} else {
end()
}
self.ntick = true
})
}
// Must call this when following a redirect from https to http or vice versa
// Attempts to keep everything as identical as possible, but update the
// httpModule, Tunneling agent, and/or Forever Agent in use.
Request.prototype._updateProtocol = function () {
var self = this
var protocol = self.uri.protocol
if (protocol === 'https:' || self.tunnel) {
// previously was doing http, now doing https
// if it's https, then we might need to tunnel now.
if (self.proxy) {
if (self._tunnel.setup()) {
return
}
}
self.httpModule = https
switch (self.agentClass) {
case ForeverAgent:
self.agentClass = ForeverAgent.SSL
break
case http.Agent:
self.agentClass = https.Agent
break
default:
// nothing we can do. Just hope for the best.
return
}
// if there's an agent, we need to get a new one.
if (self.agent) {
self.agent = self.getNewAgent()
}
} else {
// previously was doing https, now doing http
self.httpModule = http
switch (self.agentClass) {
case ForeverAgent.SSL:
self.agentClass = ForeverAgent
break
case https.Agent:
self.agentClass = http.Agent
break
default:
// nothing we can do. just hope for the best
return
}
// if there's an agent, then get a new one.
if (self.agent) {
self.agent = null
self.agent = self.getNewAgent()
}
}
}
Request.prototype.getNewAgent = function () {
var self = this
var Agent = self.agentClass
var options = {}
if (self.agentOptions) {
for (var i in self.agentOptions) {
options[i] = self.agentOptions[i]
}
}
if (self.ca) {
options.ca = self.ca
}
if (self.ciphers) {
options.ciphers = self.ciphers
}
if (self.secureProtocol) {
options.secureProtocol = self.secureProtocol
}
if (self.secureOptions) {
options.secureOptions = self.secureOptions
}
if (typeof self.rejectUnauthorized !== 'undefined') {
options.rejectUnauthorized = self.rejectUnauthorized
}
if (self.cert && self.key) {
options.key = self.key
options.cert = self.cert
}
if (self.pfx) {
options.pfx = self.pfx
}
if (self.passphrase) {
options.passphrase = self.passphrase
}
var poolKey = ''
// different types of agents are in different pools
if (Agent !== self.httpModule.Agent) {
poolKey += Agent.name
}
// ca option is only relevant if proxy or destination are https
var proxy = self.proxy
if (typeof proxy === 'string') {
proxy = url.parse(proxy)
}
var isHttps = (proxy && proxy.protocol === 'https:') || this.uri.protocol === 'https:'
if (isHttps) {
if (options.ca) {
if (poolKey) {
poolKey += ':'
}
poolKey += options.ca
}
if (typeof options.rejectUnauthorized !== 'undefined') {
if (poolKey) {
poolKey += ':'
}
poolKey += options.rejectUnauthorized
}
if (options.cert) {
if (poolKey) {
poolKey += ':'
}
poolKey += options.cert.toString('ascii') + options.key.toString('ascii')
}
if (options.pfx) {
if (poolKey) {
poolKey += ':'
}
poolKey += options.pfx.toString('ascii')
}
if (options.ciphers) {
if (poolKey) {
poolKey += ':'
}
poolKey += options.ciphers
}
if (options.secureProtocol) {
if (poolKey) {
poolKey += ':'
}
poolKey += options.secureProtocol
}
if (options.secureOptions) {
if (poolKey) {
poolKey += ':'
}
poolKey += options.secureOptions
}
}
if (self.pool === globalPool && !poolKey && Object.keys(options).length === 0 && self.httpModule.globalAgent) {
// not doing anything special. Use the globalAgent
return self.httpModule.globalAgent
}
// we're using a stored agent. Make sure it's protocol-specific
poolKey = self.uri.protocol + poolKey
// generate a new agent for this setting if none yet exists
if (!self.pool[poolKey]) {
self.pool[poolKey] = new Agent(options)
// properly set maxSockets on new agents
if (self.pool.maxSockets) {
self.pool[poolKey].maxSockets = self.pool.maxSockets
}
}
return self.pool[poolKey]
}
Request.prototype.start = function () {
// start() is called once we are ready to send the outgoing HTTP request.
// this is usually called on the first write(), end() or on nextTick()
var self = this
if (self._aborted) {
return
}
self._started = true
self.method = self.method || 'GET'
self.href = self.uri.href
if (self.src && self.src.stat && self.src.stat.size && !self.hasHeader('content-length')) {
self.setHeader('content-length', self.src.stat.size)
}
if (self._aws) {
self.aws(self._aws, true)
}
// We have a method named auth, which is completely different from the http.request
// auth option. If we don't remove it, we're gonna have a bad time.
var reqOptions = copy(self)
delete reqOptions.auth
debug('make request', self.uri.href)
self.req = self.httpModule.request(reqOptions)
if (self.timing) {
self.startTime = new Date().getTime()
}
if (self.timeout && !self.timeoutTimer) {
var timeout = self.timeout < 0 ? 0 : self.timeout
// Set a timeout in memory - this block will throw if the server takes more
// than `timeout` to write the HTTP status and headers (corresponding to
// the on('response') event on the client). NB: this measures wall-clock
// time, not the time between bytes sent by the server.
self.timeoutTimer = setTimeout(function () {
var connectTimeout = self.req.socket && self.req.socket.readable === false
self.abort()
var e = new Error('ETIMEDOUT')
e.code = 'ETIMEDOUT'
e.connect = connectTimeout
self.emit('error', e)
}, timeout)
if (self.req.setTimeout) { // only works on node 0.6+
// Set an additional timeout on the socket, via the `setsockopt` syscall.
// This timeout sets the amount of time to wait *between* bytes sent
// from the server, and may or may not correspond to the wall-clock time
// elapsed from the start of the request.
//
// In particular, it's useful for erroring if the server fails to send
// data halfway through streaming a response.
self.req.setTimeout(timeout, function () {
if (self.req) {
self.req.abort()
var e = new Error('ESOCKETTIMEDOUT')
e.code = 'ESOCKETTIMEDOUT'
e.connect = false
self.emit('error', e)
}
})
}
}
self.req.on('response', self.onRequestResponse.bind(self))
self.req.on('error', self.onRequestError.bind(self))
self.req.on('drain', function() {
self.emit('drain')
})
self.req.on('socket', function(socket) {
self.emit('socket', socket)
})
self.on('end', function() {
if ( self.req.connection ) {
self.req.connection.removeListener('error', connectionErrorHandler)
}
})
self.emit('request', self.req)
}
Request.prototype.onRequestError = function (error) {
var self = this
if (self._aborted) {
return
}
if (self.req && self.req._reusedSocket && error.code === 'ECONNRESET'
&& self.agent.addRequestNoreuse) {
self.agent = { addRequest: self.agent.addRequestNoreuse.bind(self.agent) }
self.start()
self.req.end()
return
}
if (self.timeout && self.timeoutTimer) {
clearTimeout(self.timeoutTimer)
self.timeoutTimer = null
}
self.emit('error', error)
}
Request.prototype.onRequestResponse = function (response) {
var self = this
debug('onRequestResponse', self.uri.href, response.statusCode, response.headers)
response.on('end', function() {
if (self.timing) {
self.elapsedTime += (new Date().getTime() - self.startTime)
debug('elapsed time', self.elapsedTime)
response.elapsedTime = self.elapsedTime
}
debug('response end', self.uri.href, response.statusCode, response.headers)
})
// The check on response.connection is a workaround for browserify.
if (response.connection && response.connection.listeners('error').indexOf(connectionErrorHandler) === -1) {
response.connection.setMaxListeners(0)
response.connection.once('error', connectionErrorHandler)
}
if (self._aborted) {
debug('aborted', self.uri.href)
response.resume()
return
}
self.response = response
response.request = self
response.toJSON = responseToJSON
// XXX This is different on 0.10, because SSL is strict by default
if (self.httpModule === https &&
self.strictSSL && (!response.hasOwnProperty('socket') ||
!response.socket.authorized)) {
debug('strict ssl error', self.uri.href)
var sslErr = response.hasOwnProperty('socket') ? response.socket.authorizationError : self.uri.href + ' does not support SSL'
self.emit('error', new Error('SSL Error: ' + sslErr))
return
}
// Save the original host before any redirect (if it changes, we need to
// remove any authorization headers). Also remember the case of the header
// name because lots of broken servers expect Host instead of host and we
// want the caller to be able to specify this.
self.originalHost = self.getHeader('host')
if (!self.originalHostHeaderName) {
self.originalHostHeaderName = self.hasHeader('host')
}
if (self.setHost) {
self.removeHeader('host')
}
if (self.timeout && self.timeoutTimer) {
clearTimeout(self.timeoutTimer)
self.timeoutTimer = null
}
var targetCookieJar = (self._jar && self._jar.setCookie) ? self._jar : globalCookieJar
var addCookie = function (cookie) {
// Set the cookie if its domain is in the href's domain.
try {
targetCookieJar.setCookie(cookie, self.uri.href, {ignoreError: true})
} catch (e) {
self.emit('error', e)
}
}
response.caseless = caseless(response.headers)
if (response.caseless.has('set-cookie') && (!self._disableCookies)) {
var headerName = response.caseless.has('set-cookie')
if (Array.isArray(response.headers[headerName])) {
response.headers[headerName].forEach(addCookie)
} else {
addCookie(response.headers[headerName])
}
}
if (self._redirect.onResponse(response)) {
return // Ignore the rest of the response
} else {
// Be a good stream and emit end when the response is finished.
// Hack to emit end on close because of a core bug that never fires end
response.on('close', function () {
if (!self._ended) {
self.response.emit('end')
}
})
response.on('end', function () {
self._ended = true
})
var responseContent
if (self.gzip) {
var contentEncoding = response.headers['content-encoding'] || 'identity'
contentEncoding = contentEncoding.trim().toLowerCase()
if (contentEncoding === 'gzip') {
responseContent = zlib.createGunzip()
response.pipe(responseContent)
} else {
// Since previous versions didn't check for Content-Encoding header,
// ignore any invalid values to preserve backwards-compatibility
if (contentEncoding !== 'identity') {
debug('ignoring unrecognized Content-Encoding ' + contentEncoding)
}
responseContent = response
}
} else {
responseContent = response
}
if (self.encoding) {
if (self.dests.length !== 0) {
console.error('Ignoring encoding parameter as this stream is being piped to another stream which makes the encoding option invalid.')
} else if (responseContent.setEncoding) {
responseContent.setEncoding(self.encoding)
} else {
// Should only occur on node pre-v0.9.4 (joyent/node@9b5abe5) with
// zlib streams.
// If/When support for 0.9.4 is dropped, this should be unnecessary.
responseContent = responseContent.pipe(stringstream(self.encoding))
}
}
if (self._paused) {
responseContent.pause()
}
self.responseContent = responseContent
self.emit('response', response)
self.dests.forEach(function (dest) {
self.pipeDest(dest)
})
responseContent.on('data', function (chunk) {
self._destdata = true
self.emit('data', chunk)
})
responseContent.on('end', function (chunk) {
self.emit('end', chunk)
})
responseContent.on('error', function (error) {
self.emit('error', error)
})
responseContent.on('close', function () {self.emit('close')})
if (self.callback) {
self.readResponseBody(response)
}
//if no callback
else {
self.on('end', function () {
if (self._aborted) {
debug('aborted', self.uri.href)
return
}
self.emit('complete', response)
})
}
}
debug('finish init function', self.uri.href)
}
Request.prototype.readResponseBody = function (response) {
var self = this
debug('reading response\'s body')
var buffer = bl()
, strings = []
self.on('data', function (chunk) {
if (Buffer.isBuffer(chunk)) {
buffer.append(chunk)
} else {
strings.push(chunk)
}
})
self.on('end', function () {
debug('end event', self.uri.href)
if (self._aborted) {
debug('aborted', self.uri.href)
return
}
if (buffer.length) {
debug('has body', self.uri.href, buffer.length)
if (self.encoding === null) {
// response.body = buffer
// can't move to this until path_to_url
response.body = buffer.slice()
} else {
response.body = buffer.toString(self.encoding)
}
} else if (strings.length) {
// The UTF-8 BOM [0xEF,0xBB,0xBF] becomes the single code unit U+FEFF in the JS UTF-16/UCS-2 representation.
// Strip it out when the encoding is set to 'utf8', as upstream consumers won't expect it and it breaks JSON.parse().
if (self.encoding === 'utf8' && strings[0].length > 0 && strings[0][0] === '\uFEFF') {
strings[0] = strings[0].substring(1)
}
response.body = strings.join('')
}
if (self._json) {
try {
response.body = JSON.parse(response.body, self._jsonReviver)
} catch (e) {
debug('invalid JSON received', self.uri.href)
}
}
debug('emitting complete', self.uri.href)
if (typeof response.body === 'undefined' && !self._json) {
response.body = self.encoding === null ? new Buffer(0) : ''
}
self.emit('complete', response, response.body)
})
}
Request.prototype.abort = function () {
var self = this
self._aborted = true
if (self.req) {
self.req.abort()
}
else if (self.response) {
self.response.abort()
}
self.emit('abort')
}
Request.prototype.pipeDest = function (dest) {
var self = this
var response = self.response
// Called after the response is received
if (dest.headers && !dest.headersSent) {
if (response.caseless.has('content-type')) {
var ctname = response.caseless.has('content-type')
if (dest.setHeader) {
dest.setHeader(ctname, response.headers[ctname])
}
else {
dest.headers[ctname] = response.headers[ctname]
}
}
if (response.caseless.has('content-length')) {
var clname = response.caseless.has('content-length')
if (dest.setHeader) {
dest.setHeader(clname, response.headers[clname])
} else {
dest.headers[clname] = response.headers[clname]
}
}
}
if (dest.setHeader && !dest.headersSent) {
for (var i in response.headers) {
// If the response content is being decoded, the Content-Encoding header
// of the response doesn't represent the piped content, so don't pass it.
if (!self.gzip || i !== 'content-encoding') {
dest.setHeader(i, response.headers[i])
}
}
dest.statusCode = response.statusCode
}
if (self.pipefilter) {
self.pipefilter(response, dest)
}
}
Request.prototype.qs = function (q, clobber) {
var self = this
var base
if (!clobber && self.uri.query) {
base = self._qs.parse(self.uri.query)
} else {
base = {}
}
for (var i in q) {
base[i] = q[i]
}
var qs = self._qs.stringify(base)
if (qs === '') {
return self
}
self.uri = url.parse(self.uri.href.split('?')[0] + '?' + qs)
self.url = self.uri
self.path = self.uri.path
if (self.uri.host === 'unix') {
self.enableUnixSocket()
}
return self
}
Request.prototype.form = function (form) {
var self = this
if (form) {
if (!/^application\/x-www-form-urlencoded\b/.test(self.getHeader('content-type'))) {
self.setHeader('content-type', 'application/x-www-form-urlencoded')
}
self.body = (typeof form === 'string')
? self._qs.rfc3986(form.toString('utf8'))
: self._qs.stringify(form).toString('utf8')
return self
}
// create form-data object
self._form = new FormData()
self._form.on('error', function(err) {
err.message = 'form-data: ' + err.message
self.emit('error', err)
self.abort()
})
return self._form
}
Request.prototype.multipart = function (multipart) {
var self = this
self._multipart.onRequest(multipart)
if (!self._multipart.chunked) {
self.body = self._multipart.body
}
return self
}
Request.prototype.json = function (val) {
var self = this
if (!self.hasHeader('accept')) {
self.setHeader('accept', 'application/json')
}
self._json = true
if (typeof val === 'boolean') {
if (self.body !== undefined) {
if (!/^application\/x-www-form-urlencoded\b/.test(self.getHeader('content-type'))) {
self.body = safeStringify(self.body)
} else {
self.body = self._qs.rfc3986(self.body)
}
if (!self.hasHeader('content-type')) {
self.setHeader('content-type', 'application/json')
}
}
} else {
self.body = safeStringify(val)
if (!self.hasHeader('content-type')) {
self.setHeader('content-type', 'application/json')
}
}
if (typeof self.jsonReviver === 'function') {
self._jsonReviver = self.jsonReviver
}
return self
}
Request.prototype.getHeader = function (name, headers) {
var self = this
var result, re, match
if (!headers) {
headers = self.headers
}
Object.keys(headers).forEach(function (key) {
if (key.length !== name.length) {
return
}
re = new RegExp(name, 'i')
match = key.match(re)
if (match) {
result = headers[key]
}
})
return result
}
Request.prototype.enableUnixSocket = function () {
// Get the socket & request paths from the URL
var unixParts = this.uri.path.split(':')
, host = unixParts[0]
, path = unixParts[1]
// Apply unix properties to request
this.socketPath = host
this.uri.pathname = path
this.uri.path = path
this.uri.host = host
this.uri.hostname = host
this.uri.isUnix = true
}
Request.prototype.auth = function (user, pass, sendImmediately, bearer) {
var self = this
self._auth.onRequest(user, pass, sendImmediately, bearer)
return self
}
Request.prototype.aws = function (opts, now) {
var self = this
if (!now) {
self._aws = opts
return self
}
var date = new Date()
self.setHeader('date', date.toUTCString())
var auth =
{ key: opts.key
, secret: opts.secret
, verb: self.method.toUpperCase()
, date: date
, contentType: self.getHeader('content-type') || ''
, md5: self.getHeader('content-md5') || ''
, amazonHeaders: aws.canonicalizeHeaders(self.headers)
}
var path = self.uri.path
if (opts.bucket && path) {
auth.resource = '/' + opts.bucket + path
} else if (opts.bucket && !path) {
auth.resource = '/' + opts.bucket
} else if (!opts.bucket && path) {
auth.resource = path
} else if (!opts.bucket && !path) {
auth.resource = '/'
}
auth.resource = aws.canonicalizeResource(auth.resource)
self.setHeader('authorization', aws.authorization(auth))
return self
}
Request.prototype.httpSignature = function (opts) {
var self = this
httpSignature.signRequest({
getHeader: function(header) {
return self.getHeader(header, self.headers)
},
setHeader: function(header, value) {
self.setHeader(header, value)
},
method: self.method,
path: self.path
}, opts)
debug('httpSignature authorization', self.getHeader('authorization'))
return self
}
Request.prototype.hawk = function (opts) {
var self = this
self.setHeader('Authorization', hawk.client.header(self.uri, self.method, opts).field)
}
Request.prototype.oauth = function (_oauth) {
var self = this
self._oauth.onRequest(_oauth)
return self
}
Request.prototype.jar = function (jar) {
var self = this
var cookies
if (self._redirect.redirectsFollowed === 0) {
self.originalCookieHeader = self.getHeader('cookie')
}
if (!jar) {
// disable cookies
cookies = false
self._disableCookies = true
} else {
var targetCookieJar = (jar && jar.getCookieString) ? jar : globalCookieJar
var urihref = self.uri.href
// Fetch the cookies for the specified host
if (targetCookieJar) {
cookies = targetCookieJar.getCookieString(urihref)
}
}
// If cookies are enabled and the cookie string is non-empty
if (cookies && cookies.length) {
if (self.originalCookieHeader) {
// Don't overwrite existing Cookie header
self.setHeader('cookie', self.originalCookieHeader + '; ' + cookies)
} else {
self.setHeader('cookie', cookies)
}
}
self._jar = jar
return self
}
// Stream API
Request.prototype.pipe = function (dest, opts) {
var self = this
if (self.response) {
if (self._destdata) {
self.emit('error', new Error('You cannot pipe after data has been emitted from the response.'))
} else if (self._ended) {
self.emit('error', new Error('You cannot pipe after the response has been ended.'))
} else {
stream.Stream.prototype.pipe.call(self, dest, opts)
self.pipeDest(dest)
return dest
}
} else {
self.dests.push(dest)
stream.Stream.prototype.pipe.call(self, dest, opts)
return dest
}
}
Request.prototype.write = function () {
var self = this
if (!self._started) {
self.start()
}
return self.req.write.apply(self.req, arguments)
}
Request.prototype.end = function (chunk) {
var self = this
if (chunk) {
self.write(chunk)
}
if (!self._started) {
self.start()
}
self.req.end()
}
Request.prototype.pause = function () {
var self = this
if (!self.responseContent) {
self._paused = true
} else {
self.responseContent.pause.apply(self.responseContent, arguments)
}
}
Request.prototype.resume = function () {
var self = this
if (!self.responseContent) {
self._paused = false
} else {
self.responseContent.resume.apply(self.responseContent, arguments)
}
}
Request.prototype.destroy = function () {
var self = this
if (!self._ended) {
self.end()
} else if (self.response) {
self.response.destroy()
}
}
Request.defaultProxyHeaderWhiteList =
Tunnel.defaultProxyHeaderWhiteList.slice()
Request.defaultProxyHeaderExclusiveList =
Tunnel.defaultProxyHeaderExclusiveList.slice()
// Exports
Request.prototype.toJSON = requestToJSON
module.exports = Request
```
|
```xml
<?xml version="1.0" encoding="utf-8"?>
<!--EXPORTED BY TOOL, DON'T MODIFY IT!-->
<!--Source File: node_test\event_subtree_2.xml-->
<behavior name="node_test/event_subtree_2" agenttype="AgentNodeTest" version="5">
<pars>
<par name="_$local_task_param_$_0" type="int" value="0" />
<par name="_$local_task_param_$_1" type="bool" value="false" />
</pars>
<node class="Task" id="6">
<property Prototype="Self.AgentNodeTest::event_test_int_bool(0,false)" />
<property IsHTN="false" />
<node class="Sequence" id="0">
<node class="Action" id="1">
<property Method="Self.AgentNodeTest::setEventVarInt(int Self.AgentNodeTest::_$local_task_param_$_0)" />
<property ResultOption="BT_SUCCESS" />
</node>
<node class="Action" id="2">
<property Method="Self.AgentNodeTest::setEventVarBool(bool Self.AgentNodeTest::_$local_task_param_$_1)" />
<property ResultOption="BT_SUCCESS" />
</node>
</node>
</node>
</behavior>
```
|
```java
//
//
// path_to_url
//
// Unless required by applicable law or agreed to in writing, software
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
package google.registry.tools;
import static com.google.common.base.Preconditions.checkArgument;
import com.beust.jcommander.Parameter;
import com.beust.jcommander.Parameters;
import com.google.common.base.Joiner;
import com.google.common.collect.ImmutableSet;
import google.registry.model.tld.label.PremiumList;
import google.registry.model.tld.label.PremiumListDao;
import javax.annotation.Nullable;
/**
* Command to delete a {@link PremiumList}. This command will fail if the premium list is currently
* in use on a tld.
*/
@Parameters(separators = " =", commandDescription = "Delete a PremiumList.")
final class DeletePremiumListCommand extends ConfirmingCommand {
@Nullable PremiumList premiumList;
@Parameter(
names = {"-n", "--name"},
description = "The name of the premium list to delete.",
required = true)
private String name;
@Override
protected void init() {
checkArgument(
PremiumListDao.getLatestRevision(name).isPresent(),
"Cannot delete the premium list %s because it doesn't exist.",
name);
premiumList = PremiumListDao.getLatestRevision(name).get();
ImmutableSet<String> tldsUsedOn = premiumList.getReferencingTlds();
checkArgument(
tldsUsedOn.isEmpty(),
"Cannot delete premium list because it is used on these tld(s): %s",
Joiner.on(", ").join(tldsUsedOn));
}
@Override
protected String prompt() {
return "You are about to delete the premium list: \n" + premiumList.getName();
}
@Override
protected String execute() {
PremiumListDao.delete(premiumList);
return String.format("Deleted premium list '%s'.\n", premiumList.getName());
}
}
```
|
```groff
.\" $OpenBSD: vge.4,v 1.23 2021/09/08 20:29:21 jmc Exp $
.\" $FreeBSD: vge.4,v 1.6 2004/11/24 19:06:43 brueffer Exp $
.\"
.\" Bill Paul <wpaul@windriver.com>. All rights reserved.
.\"
.\" Redistribution and use in source and binary forms, with or without
.\" modification, are permitted provided that the following conditions
.\" are met:
.\" 1. Redistributions of source code must retain the above copyright
.\" notice, this list of conditions and the following disclaimer.
.\" 2. Redistributions in binary form must reproduce the above copyright
.\" notice, this list of conditions and the following disclaimer in the
.\" documentation and/or other materials provided with the distribution.
.\" 3. All advertising materials mentioning features or use of this software
.\" must display the following acknowledgement:
.\" This product includes software developed by Bill Paul.
.\" 4. Neither the name of the author nor the names of any co-contributors
.\" may be used to endorse or promote products derived from this software
.\" without specific prior written permission.
.\"
.\" THIS SOFTWARE IS PROVIDED BY Bill Paul AND CONTRIBUTORS ``AS IS'' AND
.\" ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
.\" IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
.\" ARE DISCLAIMED. IN NO EVENT SHALL Bill Paul OR THE VOICES IN HIS HEAD
.\" BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
.\" CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
.\" SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
.\" INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
.\" CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
.\" ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF
.\" THE POSSIBILITY OF SUCH DAMAGE.
.\"
.Dd $Mdocdate: September 8 2021 $
.Dt VGE 4
.Os
.Sh NAME
.Nm vge
.Nd VIA Velocity 10/100/1Gb Ethernet device
.Sh SYNOPSIS
.Cd "vge* at pci?"
.Cd "ciphy* at mii?"
.Cd "ipgphy* at mii?"
.Sh DESCRIPTION
The
.Nm
driver provides support for various NICs and embedded Ethernet interfaces
based on the VIA Networking Technologies VT6120, VT6122, VT6130 and VT6132
Gigabit Ethernet controller chips, including the following:
.Pp
.Bl -bullet -compact
.It
ZyXEL GN650-T 64-bit PCI Gigabit Ethernet NIC (ZX1701)
.It
ZyXEL GN670-T 32-bit PCI Gigabit Ethernet NIC (ZX1702)
.El
.Pp
The VT6120/VT6122 is a 33/66MHz 64-bit PCI device which combines a tri-speed
MAC with an integrated 10/100/1000 copper PHY.
(Some older cards use an external PHY.)
The VT6130/VT6132 is the PCI Express version.
The MAC supports IPv4 transmit/receive IP/TCP/UDP checksum offload,
VLAN tag insertion and stripping, a 64-entry CAM filter and a 64-entry
VLAN filter, 64-bit multicast hash filter, 4 separate transmit DMA
queues, flow control and jumbo frames (not on VT6130/VT6132).
The Velocity family has a 16K receive FIFO and 48K transmit FIFO.
.Pp
The
.Nm
driver takes advantage of the IPv4 transmit/receive IP/TCP/UDP checksum
offload, VLAN tag insertion and stripping, and the CAM filter support.
The CAM filter is used for multicast address filtering to provide
64 perfect multicast address filter support.
If it is necessary for the interface to join more than 64 multicast
groups, the driver will switch over to using the hash filter.
.Pp
The
.Nm
driver supports the following media types:
.Bl -tag -width 10baseTXUTP
.It Cm autoselect
Enable autoselection of the media type and options.
The user can manually override the autoselected mode by adding media
options to the appropriate
.Xr hostname.if 5
file.
.It Cm 10baseT/UTP
Set 10Mbps operation.
The
.Xr ifconfig 8
.Ic mediaopt
option can also be used to select either
.Cm full-duplex
or
.Cm half-duplex
modes.
.It Cm 100baseTX
Set 100Mbps (Fast Ethernet) operation.
The
.Xr ifconfig 8
.Ic mediaopt
option can also be used to select either
.Cm full-duplex
or
.Cm half-duplex
modes.
.It Cm 1000baseT
Set 1000baseT operation over twisted pair.
Both
.Cm full-duplex
and
.Cm half-duplex
modes are supported.
.El
.Pp
The
.Nm
driver supports the following media options:
.Bl -tag -width full-duplex
.It Cm full-duplex
Force full duplex operation.
.It Cm half-duplex
Force half duplex operation.
.El
.Pp
For more information on configuring this device, see
.Xr ifconfig 8 .
.Sh SEE ALSO
.Xr arp 4 ,
.Xr ciphy 4 ,
.Xr ifmedia 4 ,
.Xr intro 4 ,
.Xr ipgphy 4 ,
.Xr netintro 4 ,
.Xr pci 4 ,
.Xr hostname.if 5 ,
.Xr ifconfig 8
.Sh HISTORY
The
.Nm
device driver first appeared in
.Ox 3.7 .
.Sh AUTHORS
.An -nosplit
The
.Nm
driver was written by
.An Bill Paul Aq Mt wpaul@windriver.com
and ported to
.Ox
by
.An Peter Valchev Aq Mt pvalchev@openbsd.org .
```
|
Alan Garside (29 September 1926 – 23 May 2021) was an Australian soccer player who played as a forward for Granville and the Australian national team. He was awarded a cap for his appearance, along with a 2020–2021 Socceroos jersey bearing his name.
International career
Garside played his first and only international match against South Africa on 1 October 1955.
Honours
Granville
NSW Division One South Premiership: 1952
References
1926 births
2021 deaths
Men's association football forwards
Australia men's international soccer players
Australian men's soccer players
Soccer players from Sydney
|
```c++
/*
Centered Square Number is a centered figurate number that gives the number
of dots in a square with a dot in the center and all other dots surrounding
the center dot in successive square layers.
Nth Centered square number can be calculated by using formula n^2 + (n-1)^2.
*/
#include<iostream>
using namespace std;
int centeredSquare(int num)
{
// Using formula
return num * num + ((num - 1) * (num - 1));
}
int main()
{
int num;
cin>>num;
cout<<centeredSquare(num);
return 0;
}
/*
Input:
6
output:
61
*/
```
|
```csharp
/*
* CellEditKeyEngine - A engine that allows the behaviour of arbitrary keys to be configured
*
* Author: Phillip Piper
* Date: 3-March-2011 10:53 pm
*
* Change log:
* v2.8
* 2014-05-30 JPP - When a row is disabled, skip over it when looking for another cell to edit
* v2.5
* 2012-04-14 JPP - Fixed bug where, on a OLV with only a single editable column, tabbing
* to change rows would edit the cell above rather than the cell below
* the cell being edited.
* 2.5
* 2011-03-03 JPP - First version
*
*
* This program is free software: you can redistribute it and/or modify
* (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
*
* along with this program. If not, see <path_to_url
*
* If you wish to use this code in a closed source application, please contact phillip.piper@gmail.com.
*/
using System;
using System.Collections.Generic;
using System.Windows.Forms;
namespace BrightIdeasSoftware {
/// <summary>
/// Indicates the behaviour of a key when a cell "on the edge" is being edited
/// and the normal behaviour of that key would exceed the edge. For example,
/// for a key that normally moves one column to the left, the "edge" would be
/// the left most column, since the normal action of the key cannot be taken
/// (since there are no more columns to the left).
/// </summary>
public enum CellEditAtEdgeBehaviour {
/// <summary>
/// The key press will be ignored
/// </summary>
Ignore,
/// <summary>
/// The key press will result in the cell editing wrapping to the
/// cell on the opposite edge.
/// </summary>
Wrap,
/// <summary>
/// The key press will wrap, but the column will be changed to the
/// appropriate adjacent column. This only makes sense for keys where
/// the normal action is ChangeRow.
/// </summary>
ChangeColumn,
/// <summary>
/// The key press will wrap, but the row will be changed to the
/// appropriate adjacent row. This only makes sense for keys where
/// the normal action is ChangeColumn.
/// </summary>
ChangeRow,
/// <summary>
/// The key will result in the current edit operation being ended.
/// </summary>
EndEdit
};
/// <summary>
/// Indicates the normal behaviour of a key when used during a cell edit
/// operation.
/// </summary>
public enum CellEditCharacterBehaviour {
/// <summary>
/// The key press will be ignored
/// </summary>
Ignore,
/// <summary>
/// The key press will end the current edit and begin an edit
/// operation on the next editable cell to the left.
/// </summary>
ChangeColumnLeft,
/// <summary>
/// The key press will end the current edit and begin an edit
/// operation on the next editable cell to the right.
/// </summary>
ChangeColumnRight,
/// <summary>
/// The key press will end the current edit and begin an edit
/// operation on the row above.
/// </summary>
ChangeRowUp,
/// <summary>
/// The key press will end the current edit and begin an edit
/// operation on the row below
/// </summary>
ChangeRowDown,
/// <summary>
/// The key press will cancel the current edit
/// </summary>
CancelEdit,
/// <summary>
/// The key press will finish the current edit operation
/// </summary>
EndEdit,
/// <summary>
/// Custom verb that can be used for specialized actions.
/// </summary>
CustomVerb1,
/// <summary>
/// Custom verb that can be used for specialized actions.
/// </summary>
CustomVerb2,
/// <summary>
/// Custom verb that can be used for specialized actions.
/// </summary>
CustomVerb3,
/// <summary>
/// Custom verb that can be used for specialized actions.
/// </summary>
CustomVerb4,
/// <summary>
/// Custom verb that can be used for specialized actions.
/// </summary>
CustomVerb5,
/// <summary>
/// Custom verb that can be used for specialized actions.
/// </summary>
CustomVerb6,
/// <summary>
/// Custom verb that can be used for specialized actions.
/// </summary>
CustomVerb7,
/// <summary>
/// Custom verb that can be used for specialized actions.
/// </summary>
CustomVerb8,
/// <summary>
/// Custom verb that can be used for specialized actions.
/// </summary>
CustomVerb9,
/// <summary>
/// Custom verb that can be used for specialized actions.
/// </summary>
CustomVerb10,
};
/// <summary>
/// Instances of this class handle key presses during a cell edit operation.
/// </summary>
public class CellEditKeyEngine {
#region Public interface
/// <summary>
/// Sets the behaviour of a given key
/// </summary>
/// <param name="key"></param>
/// <param name="normalBehaviour"></param>
/// <param name="atEdgeBehaviour"></param>
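/// <remarks>
/// A hypothetical usage sketch (the engine variable is assumed, not part of
/// this class): make F2 finish the edit and plain arrow keys change columns,
/// wrapping at the edges:
/// <code>
/// engine.SetKeyBehaviour(Keys.F2, CellEditCharacterBehaviour.EndEdit, CellEditAtEdgeBehaviour.Ignore);
/// engine.SetKeyBehaviour(Keys.Left, CellEditCharacterBehaviour.ChangeColumnLeft, CellEditAtEdgeBehaviour.Wrap);
/// engine.SetKeyBehaviour(Keys.Right, CellEditCharacterBehaviour.ChangeColumnRight, CellEditAtEdgeBehaviour.Wrap);
/// </code>
/// </remarks>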
public virtual void SetKeyBehaviour(Keys key, CellEditCharacterBehaviour normalBehaviour, CellEditAtEdgeBehaviour atEdgeBehaviour) {
CellEditKeyMap[key] = normalBehaviour;
CellEditKeyAtEdgeBehaviourMap[key] = atEdgeBehaviour;
}
/// <summary>
/// Handle a key press
/// </summary>
/// <param name="olv"></param>
/// <param name="keyData"></param>
/// <returns>True if the key was completely handled.</returns>
public virtual bool HandleKey(ObjectListView olv, Keys keyData) {
if (olv == null) throw new ArgumentNullException("olv");
CellEditCharacterBehaviour behaviour;
if (!CellEditKeyMap.TryGetValue(keyData, out behaviour))
return false;
ListView = olv;
switch (behaviour) {
case CellEditCharacterBehaviour.Ignore:
break;
case CellEditCharacterBehaviour.CancelEdit:
HandleCancelEdit();
break;
case CellEditCharacterBehaviour.EndEdit:
HandleEndEdit();
break;
case CellEditCharacterBehaviour.ChangeColumnLeft:
case CellEditCharacterBehaviour.ChangeColumnRight:
HandleColumnChange(keyData, behaviour);
break;
case CellEditCharacterBehaviour.ChangeRowDown:
case CellEditCharacterBehaviour.ChangeRowUp:
HandleRowChange(keyData, behaviour);
break;
default:
return HandleCustomVerb(keyData, behaviour);
}
return true;
}
#endregion
#region Implementation properties
/// <summary>
/// Gets or sets the ObjectListView on which the current key is being handled.
/// This cannot be null.
/// </summary>
protected ObjectListView ListView {
get { return listView; }
set { listView = value; }
}
private ObjectListView listView;
/// <summary>
/// Gets the row of the cell that is currently being edited
/// </summary>
protected OLVListItem ItemBeingEdited {
get {
return (ListView == null || ListView.CellEditEventArgs == null) ? null : ListView.CellEditEventArgs.ListViewItem;
}
}
/// <summary>
/// Gets the index of the column of the cell that is being edited
/// </summary>
protected int SubItemIndexBeingEdited {
get {
return (ListView == null || ListView.CellEditEventArgs == null) ? -1 : ListView.CellEditEventArgs.SubItemIndex;
}
}
/// <summary>
/// Gets or sets the map that remembers the normal behaviour of keys
/// </summary>
protected IDictionary<Keys, CellEditCharacterBehaviour> CellEditKeyMap {
get {
if (cellEditKeyMap == null)
InitializeCellEditKeyMaps();
return cellEditKeyMap;
}
set {
cellEditKeyMap = value;
}
}
private IDictionary<Keys, CellEditCharacterBehaviour> cellEditKeyMap;
/// <summary>
/// Gets or sets the map that remembers the desired behaviour of keys
/// on edge cases.
/// </summary>
protected IDictionary<Keys, CellEditAtEdgeBehaviour> CellEditKeyAtEdgeBehaviourMap {
get {
if (cellEditKeyAtEdgeBehaviourMap == null)
InitializeCellEditKeyMaps();
return cellEditKeyAtEdgeBehaviourMap;
}
set {
cellEditKeyAtEdgeBehaviourMap = value;
}
}
private IDictionary<Keys, CellEditAtEdgeBehaviour> cellEditKeyAtEdgeBehaviourMap;
#endregion
#region Initialization
/// <summary>
/// Setup the default key mapping
/// </summary>
protected virtual void InitializeCellEditKeyMaps() {
cellEditKeyMap = new Dictionary<Keys, CellEditCharacterBehaviour>();
cellEditKeyMap[Keys.Escape] = CellEditCharacterBehaviour.CancelEdit;
cellEditKeyMap[Keys.Return] = CellEditCharacterBehaviour.EndEdit;
cellEditKeyMap[Keys.Enter] = CellEditCharacterBehaviour.EndEdit;
cellEditKeyMap[Keys.Tab] = CellEditCharacterBehaviour.ChangeColumnRight;
cellEditKeyMap[Keys.Tab | Keys.Shift] = CellEditCharacterBehaviour.ChangeColumnLeft;
cellEditKeyMap[Keys.Left | Keys.Alt] = CellEditCharacterBehaviour.ChangeColumnLeft;
cellEditKeyMap[Keys.Right | Keys.Alt] = CellEditCharacterBehaviour.ChangeColumnRight;
cellEditKeyMap[Keys.Up | Keys.Alt] = CellEditCharacterBehaviour.ChangeRowUp;
cellEditKeyMap[Keys.Down | Keys.Alt] = CellEditCharacterBehaviour.ChangeRowDown;
cellEditKeyAtEdgeBehaviourMap = new Dictionary<Keys, CellEditAtEdgeBehaviour>();
cellEditKeyAtEdgeBehaviourMap[Keys.Tab] = CellEditAtEdgeBehaviour.Wrap;
cellEditKeyAtEdgeBehaviourMap[Keys.Tab | Keys.Shift] = CellEditAtEdgeBehaviour.Wrap;
cellEditKeyAtEdgeBehaviourMap[Keys.Left | Keys.Alt] = CellEditAtEdgeBehaviour.Wrap;
cellEditKeyAtEdgeBehaviourMap[Keys.Right | Keys.Alt] = CellEditAtEdgeBehaviour.Wrap;
cellEditKeyAtEdgeBehaviourMap[Keys.Up | Keys.Alt] = CellEditAtEdgeBehaviour.ChangeColumn;
cellEditKeyAtEdgeBehaviourMap[Keys.Down | Keys.Alt] = CellEditAtEdgeBehaviour.ChangeColumn;
}
#endregion
#region Command handling
/// <summary>
/// Handle the end edit command
/// </summary>
protected virtual void HandleEndEdit() {
ListView.PossibleFinishCellEditing();
}
/// <summary>
/// Handle the cancel edit command
/// </summary>
protected virtual void HandleCancelEdit() {
ListView.CancelCellEdit();
}
/// <summary>
/// Placeholder that subclasses can override to handle any custom verbs
/// </summary>
/// <param name="keyData"></param>
/// <param name="behaviour"></param>
/// <returns></returns>
protected virtual bool HandleCustomVerb(Keys keyData, CellEditCharacterBehaviour behaviour) {
return false;
}
/// <summary>
/// Handle a change row command
/// </summary>
/// <param name="keyData"></param>
/// <param name="behaviour"></param>
protected virtual void HandleRowChange(Keys keyData, CellEditCharacterBehaviour behaviour) {
// If we couldn't finish editing the current cell, don't try to move it
if (!ListView.PossibleFinishCellEditing())
return;
OLVListItem olvi = ItemBeingEdited;
int subItemIndex = SubItemIndexBeingEdited;
bool isGoingUp = behaviour == CellEditCharacterBehaviour.ChangeRowUp;
// Try to find a row above (or below) the currently edited cell
// If we find one, start editing it and we're done.
OLVListItem adjacentOlvi = GetAdjacentItemOrNull(olvi, isGoingUp);
if (adjacentOlvi != null) {
StartCellEditIfDifferent(adjacentOlvi, subItemIndex);
return;
}
// There is no adjacent row in the direction we want, so we must be on an edge.
CellEditAtEdgeBehaviour atEdgeBehaviour;
if (!CellEditKeyAtEdgeBehaviourMap.TryGetValue(keyData, out atEdgeBehaviour))
atEdgeBehaviour = CellEditAtEdgeBehaviour.Wrap;
switch (atEdgeBehaviour) {
case CellEditAtEdgeBehaviour.Ignore:
break;
case CellEditAtEdgeBehaviour.EndEdit:
ListView.PossibleFinishCellEditing();
break;
case CellEditAtEdgeBehaviour.Wrap:
adjacentOlvi = GetAdjacentItemOrNull(null, isGoingUp);
StartCellEditIfDifferent(adjacentOlvi, subItemIndex);
break;
case CellEditAtEdgeBehaviour.ChangeColumn:
// Figure out the next editable column
List<OLVColumn> editableColumnsInDisplayOrder = EditableColumnsInDisplayOrder;
int displayIndex = Math.Max(0, editableColumnsInDisplayOrder.IndexOf(ListView.GetColumn(subItemIndex)));
if (isGoingUp)
displayIndex = (editableColumnsInDisplayOrder.Count + displayIndex - 1) % editableColumnsInDisplayOrder.Count;
else
displayIndex = (displayIndex + 1) % editableColumnsInDisplayOrder.Count;
subItemIndex = editableColumnsInDisplayOrder[displayIndex].Index;
// Wrap to the next row and start the cell edit
adjacentOlvi = GetAdjacentItemOrNull(null, isGoingUp);
StartCellEditIfDifferent(adjacentOlvi, subItemIndex);
break;
}
}
/// <summary>
/// Handle a change column command
/// </summary>
/// <param name="keyData"></param>
/// <param name="behaviour"></param>
protected virtual void HandleColumnChange(Keys keyData, CellEditCharacterBehaviour behaviour)
{
// If we couldn't finish editing the current cell, don't try to move it
if (!ListView.PossibleFinishCellEditing())
return;
// Changing columns only works in details mode
if (ListView.View != View.Details)
return;
List<OLVColumn> editableColumns = EditableColumnsInDisplayOrder;
OLVListItem olvi = ItemBeingEdited;
int displayIndex = Math.Max(0,
editableColumns.IndexOf(ListView.GetColumn(SubItemIndexBeingEdited)));
bool isGoingLeft = behaviour == CellEditCharacterBehaviour.ChangeColumnLeft;
// Are we trying to continue past one of the edges?
if ((isGoingLeft && displayIndex == 0) ||
(!isGoingLeft && displayIndex == editableColumns.Count - 1))
{
// Yes, so figure out our at edge behaviour
CellEditAtEdgeBehaviour atEdgeBehaviour;
if (!CellEditKeyAtEdgeBehaviourMap.TryGetValue(keyData, out atEdgeBehaviour))
atEdgeBehaviour = CellEditAtEdgeBehaviour.Wrap;
switch (atEdgeBehaviour)
{
case CellEditAtEdgeBehaviour.Ignore:
return;
case CellEditAtEdgeBehaviour.EndEdit:
HandleEndEdit();
return;
case CellEditAtEdgeBehaviour.ChangeRow:
case CellEditAtEdgeBehaviour.Wrap:
if (atEdgeBehaviour == CellEditAtEdgeBehaviour.ChangeRow)
olvi = GetAdjacentItem(olvi, isGoingLeft && displayIndex == 0);
if (isGoingLeft)
displayIndex = editableColumns.Count - 1;
else
displayIndex = 0;
break;
}
}
else
{
if (isGoingLeft)
displayIndex -= 1;
else
displayIndex += 1;
}
int subItemIndex = editableColumns[displayIndex].Index;
StartCellEditIfDifferent(olvi, subItemIndex);
}
#endregion
#region Utilities
/// <summary>
/// Start editing the indicated cell if that cell is not already being edited
/// </summary>
/// <param name="olvi">The row to edit</param>
/// <param name="subItemIndex">The cell within that row to edit</param>
protected void StartCellEditIfDifferent(OLVListItem olvi, int subItemIndex) {
if (ItemBeingEdited == olvi && SubItemIndexBeingEdited == subItemIndex)
return;
ListView.EnsureVisible(olvi.Index);
ListView.StartCellEdit(olvi, subItemIndex);
}
/// <summary>
/// Gets the adjacent item to the given item in the given direction.
/// If that item is disabled, continue in that direction until an enabled item is found.
/// </summary>
/// <param name="olvi">The row whose neighbour is sought</param>
/// <param name="up">The direction of the adjacentness</param>
/// <returns>An OLVListItem adjacent to the given item, or null if there are no more enabled items in that direction.</returns>
protected OLVListItem GetAdjacentItemOrNull(OLVListItem olvi, bool up) {
OLVListItem item = up ? ListView.GetPreviousItem(olvi) : ListView.GetNextItem(olvi);
while (item != null && !item.Enabled)
item = up ? ListView.GetPreviousItem(item) : ListView.GetNextItem(item);
return item;
}
/// <summary>
/// Gets the adjacent item to the given item in the given direction, wrapping if needed.
/// </summary>
/// <param name="olvi">The row whose neighbour is sought</param>
/// <param name="up">The direction of the adjacentness</param>
/// <returns>An OLVListItem adjacent to the given item, wrapping to the other end of the list if necessary.</returns>
protected OLVListItem GetAdjacentItem(OLVListItem olvi, bool up) {
return GetAdjacentItemOrNull(olvi, up) ?? GetAdjacentItemOrNull(null, up);
}
/// <summary>
/// Gets a collection of columns that are editable in the order they are shown to the user
/// </summary>
protected List<OLVColumn> EditableColumnsInDisplayOrder {
get {
List<OLVColumn> editableColumnsInDisplayOrder = new List<OLVColumn>();
foreach (OLVColumn x in ListView.ColumnsInDisplayOrder)
if (x.IsEditable)
editableColumnsInDisplayOrder.Add(x);
return editableColumnsInDisplayOrder;
}
}
#endregion
}
}
```
|
The Schou Brewery () is a former Norwegian brewery.
History
The company originated in a brewery that Johannes Thrane founded around 1800. Jørgen Young owned the brewery for some time before it was purchased by Christian Julius Schou (1792–1874) in 1837. The brewery was operated at several different locations in Oslo, and in 1873 operations were moved to a new facility at the Schousløkken property at Trondheimsveien (Trondheim Street) no. 2. The Schou Brewery took over the Foss Brewery in 1917, when the Foss Brewery was unable to receive raw materials from Germany during the First World War. In 1962, the Schou Brewery merged with Frydenlund Breweries to create the Merged Breweries Company (). That company operated until 1977 as the Frydenlund Schou Brewery (). In 1977 the company was taken over by Nora Industries (), which also owned the Ringnes brewery and Nora Mineral Water (). The Schou Brewery was Norway's oldest brewery when it shut down in 1981.
The Schou Brewery today
The neighborhood where the brewery was located is largely protected by the Urban Conservation Office, which has resulted in good conservation of major parts of the original brewery. The area is dominated by large brick industrial buildings from the last century, combined with newer modern premises. Until 2005, BI Norwegian Business School was a major tenant of the property, with the Norwegian School of Marketing (Norges Markedshøyskole) centrally located in the neighborhood. Several municipal businesses have office space in the area, including Oslo Water and Sewerage and the Real Estate and Urban Renewal Office. The brewery building is owned by the insurance company KLP. Since 2010, the Schou Cellar Microbrewery (Schouskjelleren Mikrobryggeri) has operated in the property's basement. Schous Pils is still brewed, but on a small scale by Ringnes Brewery and sold at some pubs and bars in Oslo.
Culture
For a long time, the municipality of Oslo sought to establish a "culture district" in Oslo in order to provide a gathering place for cultural activities. During the process of choosing a venue for a rock music museum, gradually both cultural and political forces agreed on the Schou neighborhood as a suitable place. In the end, Namsos was chosen as the site of the rock music museum, but private initiatives are underway to start a private museum in the Schou Brewery building.
In addition to an experience center for Norwegian rock, several other music-related ideas are planned for the brewery area. The municipality would like to use the brewery's former wort building (Vørterhuset) for rehearsal rooms designed for bands and solo performers. Another venue for musicians is Schou Corner (Schous Corner), a tavern that has featured live music all year long for a number of years. The Schou Corner building has protected status, but due to major subsidence damage to the structure there are plans to raze it and rebuild on a better foundation, and to continue the tavern business there. A private music school is under construction, and the music business Imerslund Musikk has opened a new shop in the oldest building in the neighborhood.
The Schou Cultural Brewery (Schous kulturbryggeri) is a large cultural venue in Oslo that transforms parts of the former Schou Brewery into a living cultural district. Four projects are located here:
The Pop Center (Popsenteret), an interactive museum for Norwegian popular music
The Practice Hotel (Øvingshotellet), with fifty rehearsal rooms for bands, soloists, and choirs
The National Stage (Riksscene), for Norwegian and international folk music, joik, and folk dancing
The Culture Station (Kulturstasjon), with dance, theater, and visual arts for children and young people
The state is responsible for the National Stage for folk dance and folk music. The municipality of Oslo is responsible for the other three projects. There is a widespread use of common functions, making the Schou Cultural Brewery a major cultural institution. Parts of the activity take place in the old brewery buildings, and a modern new building will also be built in the rear courtyard.
The Schou Cultural Brewery was set up as a project in the Culture and Sports Center (Kultur- og idrettsetaten), led by a steering committee appointed by the Oslo City Council. The Practice Hotel and the Culture Station opened in 2008, and the rest of the Schou Cultural Brewery was completed in 2009.
References
Breweries in Norway
Defunct breweries
Defunct food and drink companies of Norway
Grünerløkka
Food and drink companies disestablished in 1981
1981 disestablishments in Norway
|
Bondone may refer to:
Bondone, commune in Trentino, Italy
Monte Bondone, mountain in Trentino, Italy
Giotto di Bondone (1270-1337), Italian painter
|
Distorsio mcgintyi is a species of medium-sized sea snail, a marine gastropod mollusk in the family Personidae, the Distorsio snails.
Description
The length of the shell attains 32 mm.
Distribution
This species occurs in the Gulf of Mexico, mainly off Florida.
References
Rosenberg, G.; Moretzsohn, F.; García, E. F. (2009). Gastropoda (Mollusca) of the Gulf of Mexico, Pp. 579–699 in: Felder, D.L. and D.K. Camp (eds.), Gulf of Mexico–Origins, Waters, and Biota. Texas A&M Press, College Station, Texas.
Parth, M. (2017). Seashells and Chinese snuff bottles. The collection Manfred Parth. München: Verlag Dr Friedrich Pfeil. 288 pp.
External links
Olsson, A. A. & McGinty, T. L. (1951). A Distorsio new to the Florida fauna. The Nautilus. 65(1): 26-28, pl. 1, figs. 5-6, 9
Personidae
|
Birmingham Curzon Street railway station (formerly Birmingham station) was a railway station in central Birmingham, England. Initially used as a major early passenger terminus before being eclipsed by newer facilities and converted into a goods depot, it was a continuously active railway facility up until 1966.
The station was jointly built and operated by the London and Birmingham Railway (L&BR) and the Grand Junction Railway (GJR), being the meeting point between the two railways, as well as the terminus for the first intercity line to be built into London. As such, it served as a joint terminus for the scheduled passenger trains of both companies to major destinations such as London, Manchester and Liverpool, between 1838 and 1854. It was formally opened on 24 June 1838, and received its first train from London on 17 September of that year. Being incapable of permitting through trains, it quickly proved to be inadequate even after expansion efforts to accommodate longer trains. Thus, during the 1840s, the newly-created Midland Railway opted to build a larger and more suitable station, Birmingham New Street, half a mile away from the earlier station that would take over most of its passenger traffic in 1854.
During the 1850s, Curzon Street station found a new role handling freight traffic; conversion work was undertaken between 1860 and 1865 to turn it into a dedicated goods station. In addition, limited passenger traffic, such as special excursion trains, called at that station up until its closure to passengers in 1893. It was heavily used for railway freight into the British Rail era, only being closed to rail-based goods traffic in 1966. Many original features were demolished at this time, such as the platforms and trainshed, but the principal entrance building survived and was given Grade I listed status. While much of the site continued to be used for road-based parcel traffic, the principal building was used as office space for various purposes, including the occasional art event. During the 2010s, it was announced that the site and the principal building would be reused and integrated into the new Birmingham Curzon Street railway station, and host the high speed services on High Speed 2.
History
Background
The construction of the station, which was originally known simply as Birmingham station, is closely associated with the creation of the London and Birmingham Railway (L&BR), the first intercity line to be built into London and the largest project to have ever been undertaken in Britain at the time. At Birmingham, the L&BR connected with the Grand Junction Railway (GJR), which was constructed at the same time. It had been intended for the two railways to meet end-on so as to facilitate the running of through services; however, on account of the opposition of influential land owners, the GJR's desired alignment was blocked, necessitating the creation of two adjacent termini, one for each company, at Curzon Street.
The L&BR's station was built on the south side of the site, featuring a pair of platforms in parallel (one for arrivals and the other for departures), along with four carriage sidings next to the tracks leading to the two outer platforms; six lines in total served the station. A sizable train shed that was supported by a pair of wrought iron truss spans covered both the platforms and the six tracks, covering an area of 217 feet (66 metres) long and 113 feet (34 metres) wide. To the rear of the departure platform, a lengthy building accommodating booking offices, waiting rooms and a parcels office was present. Furthermore, it also featured a grand three-storey Principal Building complete with four massive Ionic columns that intentionally matched the Doric Euston Arch present at the London terminus.
The GJR's station was located on the northern side of the site in a triangular area of land. It featured parallel departure and arrivals platforms, which had to be staggered in order to fit into the available land. The GJR also built their own independent entrance building and booking office (now demolished) which were located behind the departure platform. Separate yards for passengers and their horse-drawn carriages were present between the station and Curzon Street. The GJR's facilities were mainly designed by Joseph Franklin.
Various additional railway facilities were also constructed nearby on land to the south and east of the station; these included carriage sheds for the L&BR, a sixteen-sided engine house, and freight handling areas for the transhipment of goods between the L&BR and the neighbouring Birmingham Canal. A dedicated L&BR freight depot was also established to the north of the station.
As a passenger station
While the station was formally opened on 24 June 1838, the delayed completion of Kilsby Tunnel meant that the first train from London did not arrive until 17 September of that year. That first train had traversed the 112 miles between the two cities in four hours and 48 minutes. During 1839, the GJR arrived at Curzon Street; although the line had opened two years earlier, one year before the L&BR, it originally ran to a temporary terminus at Vauxhall until a 28-span viaduct over the River Rea valley had been completed along with their side of the station.
By 1846, the station was already being extensively modified. The train shed was extended so that it could accommodate the running of longer trains, while the departure platform was extended to create a new bay platform for the use of the Birmingham and Gloucester Railway. Furthermore, the Principal Building had been extended along its northern side for the purpose of providing additional refreshment space for passengers, including a hotel.
Within only a few years of opening, the station had become quite heavily trafficked; however, the arrangement of the parallel platforms quickly proved to be an inconvenience to the travelling public and operators alike, as the inability to run through trains complicated many journeys. It was also inconveniently located on the eastern edge of Birmingham city centre. Accordingly, its use as a major passenger station was relatively short-lived. Following the merging of the L&BR and GJR into the London and North Western Railway (LNWR) in 1846, work started on the new and more conveniently located 'Grand Central' station, which would become known as Birmingham New Street and was shared with the Midland Railway.
Located only half a mile to the west of the preceding station, New Street was completed in 1854; unsurprisingly, the majority of passenger services were diverted away from the older station that same year. Furthermore, the smaller Lawley Street station, terminus of the Birmingham and Derby Junction Railway (a forerunner of the Midland Railway) was also opened a short distance to the east not long thereafter.
As a goods station
During November 1852, the name of the station was changed from Birmingham to Birmingham Curzon Street. The primary use of the station became the handling of goods; initially this was as an overflow to the adjacent goods depot, as rail freight increased considerably during the mid-1850s. During 1860, work commenced on the formal conversion of the site into a goods station, which included the closure of the nearby engine shed; it was at this time that the general station buildings were demolished, while the train sheds and the Principal Building were retained, the latter to serve as offices. The conversion was completed in 1865.
In early 1874, a portion of Curzon Street station (at the corner of New Canal Street and Banbury Street) was adapted and used from Easter that year as an 'excursion station' to relieve New Street station at peak times, such as holidays or fair days. The station provided frequent public holiday excursion services to Sutton Coldfield. These excursions continued until Easter 1893; their discontinuation was to facilitate the expansion of the main lines into New Street from two to four.
During the early years of its life as a goods station, horses were primarily used to shunt wagons around the depot, while capstans and turntables were also used to transfer wagons between tracks as well as to marshal them into trains. Different goods were handled across the site: while fruit and vegetables went through the old GJR arrival platform train shed, grain and flour were processed at the old GJR departure area. General freight was typically handled beneath the 1838 L&BR train sheds. In 1914, Curzon Street employed more than 2,000 people, along with roughly 600 horses and 900 wagons.
During 1923, as a result of the Railway Groupings, the ownership of Curzon Street station was transferred to the newly-created London Midland & Scottish Railway (LMSR). Amid the Second World War, an incendiary bomb struck the Principal Building, causing mostly superficial damage, while numerous other bombs impacted nearby; it was subsequently repaired. In 1952, the Principal Building was given a Grade I listing in recognition of its historical importance; in subsequent years, it would become the only surviving part of the original station and the world's oldest example of monumental railway architecture.
Curzon Street station continued to be used into the British Rail era as a goods station up until 1966. The platforms, along with the original train sheds, were demolished that same year. For several decades, the site was used as a Parcelforce depot; this was demolished in May 2006. For a time, the site was largely used as a car park.
The surviving entrance building
The surviving Grade I listed entrance building was designed by Philip Hardwick, having been intended to be used as the company's offices and boardroom. Built in 1838 at a cost of £28,000, it is among the world's oldest surviving pieces of monumental railway architecture. The architecture is Roman-inspired, following Hardwick's trip to Italy in 1818–19. It has tall pillars running up the front of the building, made out of a series of huge blocks of stone. The design mirrored the Euston Arch at the London end of the L&BR. In the original design, the building was to be flanked by two arches leading into the station: excavations have revealed that these were never built. The interior was modified in 1839 to accommodate a 'hotel' (the Victoria), although this was probably more in the nature of a refreshment room or public house, and later the booking hall, with a large iron balustraded stone staircase and offices. It is three storeys tall but relatively small. A detailed paper from Historic England can be found at the Warwickshire Railways website.
In 1841, a hotel extension – known originally as the Queen's Hotel – was added to the northern (Curzon Street) side of the building, but was eclipsed (and renamed the Railway Hotel) when a new Queen's Hotel was opened next to New Street station. During June 1900, the Railway Hotel was closed, after which the contents were sold and the space was converted into offices for the goods depot. On 27 January 1847, the Institution of Mechanical Engineers was established with George Stephenson as its first president in the Queen's Hotel; a plaque commemorating the centenary of the event was placed inside the station building when the hotel was demolished.
In 1897, Ansells Brewery built a purpose-built public house, The Woodman, opposite the station. It was still open as of 2020.
In separate instances, during 1970 and 1978, British Rail applied to demolish the Principal Building, but permission to proceed was refused on both occasions. Instead, in 1979, the ownership of the building was transferred to Birmingham City Council, which carried out extensive restoration and repairs over the following three years, at which time the newer hotel wing was demolished. Once the renovations were completed, the building was intermittently used as offices for various groups. Amongst these users was a University of Birmingham student theatre group, the 'Three Bugs Fringe Theatre'. The building was also proposed as a home for the Royal College of Organists, but the proposal foundered in 2005 for lack of funds.
A commemorative plaque was installed next to the station entrance in 1988 which reads: "THIS PLAQUE COMMEMORATES THE 150TH ANNIVERSARY OF THE ARRIVAL OF THE FIRST LONDON TO BIRMINGHAM TRAIN AT THIS STATION ON MONDAY 17TH SEPTEMBER 1838".
The building was unused except for the occasional art exhibition. Birmingham City Council had hoped to refurbish the building and find an alternative tenant. It was expected to be the centrepiece of the City Park and Masshouse development scheme, which is located around the site, most of the surrounding buildings having been demolished. However, these plans were superseded by the High Speed 2 proposal, which will incorporate the surviving entrance building into the eastern entrance of a new station. A masonry colonnade screen will connect the historic structure and the new HS2 station viaducts and eastern concourse at New Canal Street. The renovated building will have a visitor centre and office space that will be used by HS2 Ltd, Birmingham City University, and Historic England. Renovation of the building was funded through a housing and regeneration grant rather than the HS2 Act, and when funding ran out in May 2022, work was temporarily suspended. Internal refurbishment was "well advanced" but funding could not be secured for external facade repairs. HS2 said it was working to "identify further heritage funding to fully restore this iconic landmark for the city."
References
Notes
Citations
Further reading
External links
Curzon Street on warwickshirerailways.com
Birmingham.gov.uk
Lookingatbuildings entry
Photo and description
Rail Around Birmingham: Curzon Street railway station
Disused railway stations in Birmingham, West Midlands
Disused railway goods stations in Great Britain
Former London and Birmingham Railway stations
Grand Junction Railway
Railway stations in Great Britain opened in 1838
Railway stations in Great Britain closed in 1893
1838 establishments in England
Philip Hardwick buildings
Grade I listed railway stations
Grade I listed buildings in Birmingham
Art museums and galleries in Birmingham, West Midlands
Terminating vistas in the United Kingdom
|
```c++
// accompanying file LICENSE_1_0.txt or copy at
// path_to_url
#ifndef BOOST_TYPE_DWA20010120_HPP
# define BOOST_TYPE_DWA20010120_HPP
namespace boost {
// Just a simple "type envelope". Useful in various contexts, mostly to work
// around some MSVC deficiencies.
template <class T>
struct type {};
}
#endif // BOOST_TYPE_DWA20010120_HPP
```
|
Harriman-and-West Airport, also known as Harriman & West or Harriman-West, is a public airport located three nautical miles (5 km) west of the central business district of North Adams, a city in Berkshire County, Massachusetts, United States. It is owned by the City of North Adams and is operated by a five-member Airport Commission.
This airport is assigned a three-letter location identifier of AQW by the Federal Aviation Administration, but it does not have an International Air Transport Association (IATA) airport code.
Facilities and aircraft
Harriman-and-West Airport covers an area of which contains one asphalt paved runway (11/29) measuring 4,300 x 100 ft (1,311 x 30 m). There are 39 aircraft based on the field. For the 12-month period ending November 6, 2012, the airport had 31,755 aircraft operations, an average of 87 per day: 96% general aviation, 3% air taxi and 1% military.
References
External links
Airports in Berkshire County, Massachusetts
Buildings and structures in North Adams, Massachusetts
|
The Cheyney Wolves are the athletic sports teams for Cheyney University. They compete as an independent and formerly played in the Pennsylvania State Athletic Conference (PSAC). Women's sports include basketball, cheerleading and volleyball. Basketball is the only men's sport the university currently offers as of 2019.
Basketball
The men's basketball program is 7th all-time in NCAA win percentage, including 16 PSAC conference championships, four Final Fours, and one National Championship (1978), as coached by John Chaney, who coached from 1972 to 1982.
In 1982, coached by C. Vivian Stringer, the team competed in the championship game of the inaugural NCAA Division I women's basketball tournament despite being a Division II school. They are the only HBCU to reach a Division I Final Four. After Stringer left in 1983, she was replaced by Winthrop McGriff, who led them to the Final Four in the 1984 NCAA Division I women's basketball tournament, becoming the first Black man to lead a women's team to the Final Four and the only one for three decades.
Both Chaney and Stringer would be inducted into the Naismith Memorial Basketball Hall of Fame, making Cheyney one of three schools to have had future Naismith Hall of Fame men’s and women’s basketball head coaches employed at the same time.
Probation
During the 2007–08 through 2010–11 academic years, the university violated NCAA rules in the certification of initial, transfer and continuing eligibility involving all sports programs. During the four-year period, numerous student-athletes competed while ineligible due to improper certification. In amateurism certification alone, 109 student-athletes practiced, competed and received travel expenses and/or athletically related financial aid before the university received their amateurism certification status from the NCAA Eligibility Center. The committee also concluded that a former compliance director failed to monitor when she did not follow proper procedures in the certification of student-athletes’ eligibility. The entire athletics program was on probation until August 2019.
For the 2018-19 academic year, Cheyney withdrew from the PSAC and Division II and played that season as an independent. The football team, suspended since being unable to afford the trip to the Turkey Day Classic in November 2017, did not play.
By 2019, the status quo from 2018-19 continued; the Wolves offer only basketball and women's volleyball, both of which primarily play Division III and community college teams.
References
External links
|
Yvonne Murray is an Irish journalist. She has been the Global Security Reporter for RTÉ News, based in New York, since December 2022. She previously reported for RTÉ News on Chinese affairs from Taipei, Taiwan.
Career
Murray previously worked for the British Broadcasting Corporation (BBC). She also contributed television reports to Channel 4 News, worked on independent documentary productions and wrote for The Economist.
In 2018, she re-established RTÉ's presence in China, from where she contributed multimedia reports across RTÉ News outlets including Morning Ireland, Six One News, Nine O'Clock News, Prime Time and RTÉ News online. Her reporting for RTÉ examined Ireland's relationship with China, charted the rise of China's economy, the regime's tightening authoritarianism at home as well as its growing influence on the world stage and the deepening superpower rivalry between China and the United States. She reported on-the-ground from Hong Kong, on the mass detention of Uyghurs in Xinjiang, on cross-strait tensions from Taiwan and from Wuhan, on the outbreak of the COVID-19 pandemic.
On 31 March 2021, it was revealed that Murray and her family were forced to leave China amid concerns for the safety of her husband, John Sudworth who is China Correspondent for BBC News. She and her family lived in China for ten years and took the decision to relocate to Taiwan, after facing legal threats and pressure from Chinese authorities.
On 14 December 2022, RTÉ News appointed Murray as its new Global Security Reporter based at the United Nations in New York. In her role she would provide comprehensive reporting and analysis on global news, and specifically the work of the UN Security Council.
Personal life
Murray was born in Howth, Dublin, Ireland. She is married to John Sudworth, and has three children. Two of their three children were born in China and all three speak Chinese proficiently.
References
Irish women journalists
Irish women radio presenters
Living people
RTÉ newsreaders and journalists
People from Howth
Year of birth missing (living people)
Broadcasters from County Dublin
|
Ationg Tituh is a Malaysian politician from Sabah. He founded the Parti Gagasan Rakyat Sabah (GAGASAN) and became the party's inaugural president on 28 August 2013. His party was among the 20 official political parties registered in 2013.
Election results
References
Sabah State Legislative Assembly
Malaysian politicians
Year of birth missing (living people)
Living people
|
The Fondation Mérieux is an independent family foundation, recognized in France as being of public utility, created by Charles Mérieux. Its mission is to contribute to global health by strengthening local capacities in developing countries to reduce the impact of infectious diseases on vulnerable populations.
History
The FM was created in 1967 by Doctor Charles Mérieux, in homage to his father Marcel Mérieux, pupil of Louis Pasteur and founder of the Institut Mérieux in 1897. It was recognized in French law as "for public utility" in 1976.
The foundation is present in 18 countries, including Mali, Cambodia, Laos and Haiti, and builds its projects with the health authorities and local partners.
In October 2004, the FM was the beneficiary of a Franco-Chinese agreement that led to the creation of the Institut Pasteur de Shanghai and the donation of four truck-trailer BSL-3 laboratories, supplied by Labover.
In 2012, the FM continued its partnership with the Chinese Academy of Medical Sciences (CAMS). The partnership, which is renewed by contract every five years, began with Phase I in 2007. In this case, the FM was interested in the study of tuberculosis. Inter alia, Phase I of the CAMS-FM partnership trained "132 laboratory professionals at national, prefectural and municipal levels."
In 2015, the CAMS-FM partnership founded the Christophe Mérieux Laboratory (CML) at the Institute of Pathogen Biology in Beijing to focus on the study of pneumonia and tuberculosis. Researchers at the CML "benefit from training modules developed by the Emerging Pathogens Laboratory in Lyon", a BSL-4 lab which was also built by the FM in 1999 and has been operated by INSERM since 2005.
In 2015, the FM participated in the donation by the French government of CIRI's Biosafety Level 4 expertise to the Wuhan Institute of Virology. This laboratory was accused of leaking Covid-19, but no direct evidence has been found that SARS-CoV-2 was inside a laboratory in Wuhan before the pandemic, although the lab has not released its records, which have been sought by scientists and governments around the world.
In January 2017, a researcher who was financed by the CAMS-FM partnership participated in a study of human rhinovirus genotype A21.
In July 2020, it was announced that a researcher employed through the CAMS-FM partnership participated in the development of a low-cost COVID-19 screening test, which uses a variant of CRISPR technology.
Synopsis
The FM plays a role in the fight against infectious diseases by intervening in various areas:
Increase access to diagnosis, an essential tool for the surveillance and control of diseases
Reinforce research capacities, in particular by training teams on site and setting up research programs
Exchange and share knowledge, for the dissemination of global medical advances
Act for mothers and children, who are the most vulnerable to infectious diseases
See also
Mérieux Family
References
External links
Medical research
History of medicine
1967 establishments in France
Life sciences industry
|
```c++
#ifndef BOOST_SERIALIZATION_EXAMPLE_DEMO_PIMPL_A_HPP
#define BOOST_SERIALIZATION_EXAMPLE_DEMO_PIMPL_A_HPP
/////////1/////////2/////////3/////////4/////////5/////////6/////////7/////////8
// demo_pimpl_A.hpp
// Use, modification and distribution is subject to the Boost Software
// License, Version 1.0. (See accompanying file LICENSE_1_0.txt or copy at
// http://www.boost.org/LICENSE_1_0.txt)
// class whose declaration is hidden by a pointer
struct B;
struct A {
// class A contains a pointer to a "hidden" declaration
B *pimpl;
template<class Archive>
void serialize(Archive & ar, const unsigned int file_version);
A();
~A();
};
#endif // BOOST_SERIALIZATION_EXAMPLE_DEMO_PIMPL_A_HPP
```
|
```csharp
// <auto-generated />
using System;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Infrastructure;
using Microsoft.EntityFrameworkCore.Metadata;
using Microsoft.EntityFrameworkCore.Storage.ValueConversion;
using MyCompanyName.MyProjectName.Data;
using Volo.Abp.EntityFrameworkCore;
#nullable disable
namespace MyCompanyName.MyProjectName.Mvc.Migrations
{
[DbContext(typeof(MyProjectNameDbContext))]
partial class MyProjectNameDbContextModelSnapshot : ModelSnapshot
{
protected override void BuildModel(ModelBuilder modelBuilder)
{
#pragma warning disable 612, 618
modelBuilder
.HasAnnotation("_Abp_DatabaseProvider", EfCoreDatabaseProvider.SqlServer)
.HasAnnotation("ProductVersion", "8.0.0")
.HasAnnotation("Relational:MaxIdentifierLength", 128);
SqlServerModelBuilderExtensions.UseIdentityColumns(modelBuilder);
modelBuilder.Entity("Volo.Abp.AuditLogging.AuditLog", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uniqueidentifier");
b.Property<string>("ApplicationName")
.HasMaxLength(96)
.HasColumnType("nvarchar(96)")
.HasColumnName("ApplicationName");
b.Property<string>("BrowserInfo")
.HasMaxLength(512)
.HasColumnType("nvarchar(512)")
.HasColumnName("BrowserInfo");
b.Property<string>("ClientId")
.HasMaxLength(64)
.HasColumnType("nvarchar(64)")
.HasColumnName("ClientId");
b.Property<string>("ClientIpAddress")
.HasMaxLength(64)
.HasColumnType("nvarchar(64)")
.HasColumnName("ClientIpAddress");
b.Property<string>("ClientName")
.HasMaxLength(128)
.HasColumnType("nvarchar(128)")
.HasColumnName("ClientName");
b.Property<string>("Comments")
.HasMaxLength(256)
.HasColumnType("nvarchar(256)")
.HasColumnName("Comments");
b.Property<string>("ConcurrencyStamp")
.IsConcurrencyToken()
.IsRequired()
.HasMaxLength(40)
.HasColumnType("nvarchar(40)")
.HasColumnName("ConcurrencyStamp");
b.Property<string>("CorrelationId")
.HasMaxLength(64)
.HasColumnType("nvarchar(64)")
.HasColumnName("CorrelationId");
b.Property<string>("Exceptions")
.HasColumnType("nvarchar(max)");
b.Property<int>("ExecutionDuration")
.HasColumnType("int")
.HasColumnName("ExecutionDuration");
b.Property<DateTime>("ExecutionTime")
.HasColumnType("datetime2");
b.Property<string>("ExtraProperties")
.IsRequired()
.HasColumnType("nvarchar(max)")
.HasColumnName("ExtraProperties");
b.Property<string>("HttpMethod")
.HasMaxLength(16)
.HasColumnType("nvarchar(16)")
.HasColumnName("HttpMethod");
b.Property<int?>("HttpStatusCode")
.HasColumnType("int")
.HasColumnName("HttpStatusCode");
b.Property<Guid?>("ImpersonatorTenantId")
.HasColumnType("uniqueidentifier")
.HasColumnName("ImpersonatorTenantId");
b.Property<string>("ImpersonatorTenantName")
.HasMaxLength(64)
.HasColumnType("nvarchar(64)")
.HasColumnName("ImpersonatorTenantName");
b.Property<Guid?>("ImpersonatorUserId")
.HasColumnType("uniqueidentifier")
.HasColumnName("ImpersonatorUserId");
b.Property<string>("ImpersonatorUserName")
.HasMaxLength(256)
.HasColumnType("nvarchar(256)")
.HasColumnName("ImpersonatorUserName");
b.Property<Guid?>("TenantId")
.HasColumnType("uniqueidentifier")
.HasColumnName("TenantId");
b.Property<string>("TenantName")
.HasMaxLength(64)
.HasColumnType("nvarchar(64)")
.HasColumnName("TenantName");
b.Property<string>("Url")
.HasMaxLength(256)
.HasColumnType("nvarchar(256)")
.HasColumnName("Url");
b.Property<Guid?>("UserId")
.HasColumnType("uniqueidentifier")
.HasColumnName("UserId");
b.Property<string>("UserName")
.HasMaxLength(256)
.HasColumnType("nvarchar(256)")
.HasColumnName("UserName");
b.HasKey("Id");
b.HasIndex("TenantId", "ExecutionTime");
b.HasIndex("TenantId", "UserId", "ExecutionTime");
b.ToTable("AbpAuditLogs", (string)null);
});
modelBuilder.Entity("Volo.Abp.AuditLogging.AuditLogAction", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uniqueidentifier");
b.Property<Guid>("AuditLogId")
.HasColumnType("uniqueidentifier")
.HasColumnName("AuditLogId");
b.Property<int>("ExecutionDuration")
.HasColumnType("int")
.HasColumnName("ExecutionDuration");
b.Property<DateTime>("ExecutionTime")
.HasColumnType("datetime2")
.HasColumnName("ExecutionTime");
b.Property<string>("ExtraProperties")
.HasColumnType("nvarchar(max)")
.HasColumnName("ExtraProperties");
b.Property<string>("MethodName")
.HasMaxLength(128)
.HasColumnType("nvarchar(128)")
.HasColumnName("MethodName");
b.Property<string>("Parameters")
.HasMaxLength(2000)
.HasColumnType("nvarchar(2000)")
.HasColumnName("Parameters");
b.Property<string>("ServiceName")
.HasMaxLength(256)
.HasColumnType("nvarchar(256)")
.HasColumnName("ServiceName");
b.Property<Guid?>("TenantId")
.HasColumnType("uniqueidentifier")
.HasColumnName("TenantId");
b.HasKey("Id");
b.HasIndex("AuditLogId");
b.HasIndex("TenantId", "ServiceName", "MethodName", "ExecutionTime");
b.ToTable("AbpAuditLogActions", (string)null);
});
modelBuilder.Entity("Volo.Abp.AuditLogging.EntityChange", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uniqueidentifier");
b.Property<Guid>("AuditLogId")
.HasColumnType("uniqueidentifier")
.HasColumnName("AuditLogId");
b.Property<DateTime>("ChangeTime")
.HasColumnType("datetime2")
.HasColumnName("ChangeTime");
b.Property<byte>("ChangeType")
.HasColumnType("tinyint")
.HasColumnName("ChangeType");
b.Property<string>("EntityId")
.HasMaxLength(128)
.HasColumnType("nvarchar(128)")
.HasColumnName("EntityId");
b.Property<Guid?>("EntityTenantId")
.HasColumnType("uniqueidentifier");
b.Property<string>("EntityTypeFullName")
.IsRequired()
.HasMaxLength(128)
.HasColumnType("nvarchar(128)")
.HasColumnName("EntityTypeFullName");
b.Property<string>("ExtraProperties")
.HasColumnType("nvarchar(max)")
.HasColumnName("ExtraProperties");
b.Property<Guid?>("TenantId")
.HasColumnType("uniqueidentifier")
.HasColumnName("TenantId");
b.HasKey("Id");
b.HasIndex("AuditLogId");
b.HasIndex("TenantId", "EntityTypeFullName", "EntityId");
b.ToTable("AbpEntityChanges", (string)null);
});
modelBuilder.Entity("Volo.Abp.AuditLogging.EntityPropertyChange", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uniqueidentifier");
b.Property<Guid>("EntityChangeId")
.HasColumnType("uniqueidentifier");
b.Property<string>("NewValue")
.HasMaxLength(512)
.HasColumnType("nvarchar(512)")
.HasColumnName("NewValue");
b.Property<string>("OriginalValue")
.HasMaxLength(512)
.HasColumnType("nvarchar(512)")
.HasColumnName("OriginalValue");
b.Property<string>("PropertyName")
.IsRequired()
.HasMaxLength(128)
.HasColumnType("nvarchar(128)")
.HasColumnName("PropertyName");
b.Property<string>("PropertyTypeFullName")
.IsRequired()
.HasMaxLength(64)
.HasColumnType("nvarchar(64)")
.HasColumnName("PropertyTypeFullName");
b.Property<Guid?>("TenantId")
.HasColumnType("uniqueidentifier")
.HasColumnName("TenantId");
b.HasKey("Id");
b.HasIndex("EntityChangeId");
b.ToTable("AbpEntityPropertyChanges", (string)null);
});
modelBuilder.Entity("Volo.Abp.FeatureManagement.FeatureDefinitionRecord", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uniqueidentifier");
b.Property<string>("AllowedProviders")
.HasMaxLength(256)
.HasColumnType("nvarchar(256)");
b.Property<string>("DefaultValue")
.HasMaxLength(256)
.HasColumnType("nvarchar(256)");
b.Property<string>("Description")
.HasMaxLength(256)
.HasColumnType("nvarchar(256)");
b.Property<string>("DisplayName")
.IsRequired()
.HasMaxLength(256)
.HasColumnType("nvarchar(256)");
b.Property<string>("ExtraProperties")
.HasColumnType("nvarchar(max)")
.HasColumnName("ExtraProperties");
b.Property<string>("GroupName")
.IsRequired()
.HasMaxLength(128)
.HasColumnType("nvarchar(128)");
b.Property<bool>("IsAvailableToHost")
.HasColumnType("bit");
b.Property<bool>("IsVisibleToClients")
.HasColumnType("bit");
b.Property<string>("Name")
.IsRequired()
.HasMaxLength(128)
.HasColumnType("nvarchar(128)");
b.Property<string>("ParentName")
.HasMaxLength(128)
.HasColumnType("nvarchar(128)");
b.Property<string>("ValueType")
.HasMaxLength(2048)
.HasColumnType("nvarchar(2048)");
b.HasKey("Id");
b.HasIndex("GroupName");
b.HasIndex("Name")
.IsUnique();
b.ToTable("AbpFeatures", (string)null);
});
modelBuilder.Entity("Volo.Abp.FeatureManagement.FeatureGroupDefinitionRecord", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uniqueidentifier");
b.Property<string>("DisplayName")
.IsRequired()
.HasMaxLength(256)
.HasColumnType("nvarchar(256)");
b.Property<string>("ExtraProperties")
.HasColumnType("nvarchar(max)")
.HasColumnName("ExtraProperties");
b.Property<string>("Name")
.IsRequired()
.HasMaxLength(128)
.HasColumnType("nvarchar(128)");
b.HasKey("Id");
b.HasIndex("Name")
.IsUnique();
b.ToTable("AbpFeatureGroups", (string)null);
});
modelBuilder.Entity("Volo.Abp.FeatureManagement.FeatureValue", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uniqueidentifier");
b.Property<string>("Name")
.IsRequired()
.HasMaxLength(128)
.HasColumnType("nvarchar(128)");
b.Property<string>("ProviderKey")
.HasMaxLength(64)
.HasColumnType("nvarchar(64)");
b.Property<string>("ProviderName")
.HasMaxLength(64)
.HasColumnType("nvarchar(64)");
b.Property<string>("Value")
.IsRequired()
.HasMaxLength(128)
.HasColumnType("nvarchar(128)");
b.HasKey("Id");
b.HasIndex("Name", "ProviderName", "ProviderKey")
.IsUnique()
.HasFilter("[ProviderName] IS NOT NULL AND [ProviderKey] IS NOT NULL");
b.ToTable("AbpFeatureValues", (string)null);
});
modelBuilder.Entity("Volo.Abp.Identity.IdentityClaimType", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uniqueidentifier");
b.Property<string>("ConcurrencyStamp")
.IsConcurrencyToken()
.IsRequired()
.HasMaxLength(40)
.HasColumnType("nvarchar(40)")
.HasColumnName("ConcurrencyStamp");
b.Property<string>("Description")
.HasMaxLength(256)
.HasColumnType("nvarchar(256)");
b.Property<string>("ExtraProperties")
.IsRequired()
.HasColumnType("nvarchar(max)")
.HasColumnName("ExtraProperties");
b.Property<bool>("IsStatic")
.HasColumnType("bit");
b.Property<string>("Name")
.IsRequired()
.HasMaxLength(256)
.HasColumnType("nvarchar(256)");
b.Property<string>("Regex")
.HasMaxLength(512)
.HasColumnType("nvarchar(512)");
b.Property<string>("RegexDescription")
.HasMaxLength(128)
.HasColumnType("nvarchar(128)");
b.Property<bool>("Required")
.HasColumnType("bit");
b.Property<int>("ValueType")
.HasColumnType("int");
b.HasKey("Id");
b.ToTable("AbpClaimTypes", (string)null);
});
modelBuilder.Entity("Volo.Abp.Identity.IdentityLinkUser", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uniqueidentifier");
b.Property<Guid?>("SourceTenantId")
.HasColumnType("uniqueidentifier");
b.Property<Guid>("SourceUserId")
.HasColumnType("uniqueidentifier");
b.Property<Guid?>("TargetTenantId")
.HasColumnType("uniqueidentifier");
b.Property<Guid>("TargetUserId")
.HasColumnType("uniqueidentifier");
b.HasKey("Id");
b.HasIndex("SourceUserId", "SourceTenantId", "TargetUserId", "TargetTenantId")
.IsUnique()
.HasFilter("[SourceTenantId] IS NOT NULL AND [TargetTenantId] IS NOT NULL");
b.ToTable("AbpLinkUsers", (string)null);
});
modelBuilder.Entity("Volo.Abp.Identity.IdentityRole", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uniqueidentifier");
b.Property<string>("ConcurrencyStamp")
.IsConcurrencyToken()
.IsRequired()
.HasMaxLength(40)
.HasColumnType("nvarchar(40)")
.HasColumnName("ConcurrencyStamp");
b.Property<int>("EntityVersion")
.HasColumnType("int");
b.Property<string>("ExtraProperties")
.IsRequired()
.HasColumnType("nvarchar(max)")
.HasColumnName("ExtraProperties");
b.Property<bool>("IsDefault")
.HasColumnType("bit")
.HasColumnName("IsDefault");
b.Property<bool>("IsPublic")
.HasColumnType("bit")
.HasColumnName("IsPublic");
b.Property<bool>("IsStatic")
.HasColumnType("bit")
.HasColumnName("IsStatic");
b.Property<string>("Name")
.IsRequired()
.HasMaxLength(256)
.HasColumnType("nvarchar(256)");
b.Property<string>("NormalizedName")
.IsRequired()
.HasMaxLength(256)
.HasColumnType("nvarchar(256)");
b.Property<Guid?>("TenantId")
.HasColumnType("uniqueidentifier")
.HasColumnName("TenantId");
b.HasKey("Id");
b.HasIndex("NormalizedName");
b.ToTable("AbpRoles", (string)null);
});
modelBuilder.Entity("Volo.Abp.Identity.IdentityRoleClaim", b =>
{
b.Property<Guid>("Id")
.HasColumnType("uniqueidentifier");
b.Property<string>("ClaimType")
.IsRequired()
.HasMaxLength(256)
.HasColumnType("nvarchar(256)");
b.Property<string>("ClaimValue")
.HasMaxLength(1024)
.HasColumnType("nvarchar(1024)");
b.Property<Guid>("RoleId")
.HasColumnType("uniqueidentifier");
b.Property<Guid?>("TenantId")
.HasColumnType("uniqueidentifier")
.HasColumnName("TenantId");
b.HasKey("Id");
b.HasIndex("RoleId");
b.ToTable("AbpRoleClaims", (string)null);
});
modelBuilder.Entity("Volo.Abp.Identity.IdentitySecurityLog", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uniqueidentifier");
b.Property<string>("Action")
.HasMaxLength(96)
.HasColumnType("nvarchar(96)");
b.Property<string>("ApplicationName")
.HasMaxLength(96)
.HasColumnType("nvarchar(96)");
b.Property<string>("BrowserInfo")
.HasMaxLength(512)
.HasColumnType("nvarchar(512)");
b.Property<string>("ClientId")
.HasMaxLength(64)
.HasColumnType("nvarchar(64)");
b.Property<string>("ClientIpAddress")
.HasMaxLength(64)
.HasColumnType("nvarchar(64)");
b.Property<string>("ConcurrencyStamp")
.IsConcurrencyToken()
.IsRequired()
.HasMaxLength(40)
.HasColumnType("nvarchar(40)")
.HasColumnName("ConcurrencyStamp");
b.Property<string>("CorrelationId")
.HasMaxLength(64)
.HasColumnType("nvarchar(64)");
b.Property<DateTime>("CreationTime")
.HasColumnType("datetime2");
b.Property<string>("ExtraProperties")
.IsRequired()
.HasColumnType("nvarchar(max)")
.HasColumnName("ExtraProperties");
b.Property<string>("Identity")
.HasMaxLength(96)
.HasColumnType("nvarchar(96)");
b.Property<Guid?>("TenantId")
.HasColumnType("uniqueidentifier")
.HasColumnName("TenantId");
b.Property<string>("TenantName")
.HasMaxLength(64)
.HasColumnType("nvarchar(64)");
b.Property<Guid?>("UserId")
.HasColumnType("uniqueidentifier");
b.Property<string>("UserName")
.HasMaxLength(256)
.HasColumnType("nvarchar(256)");
b.HasKey("Id");
b.HasIndex("TenantId", "Action");
b.HasIndex("TenantId", "ApplicationName");
b.HasIndex("TenantId", "Identity");
b.HasIndex("TenantId", "UserId");
b.ToTable("AbpSecurityLogs", (string)null);
});
modelBuilder.Entity("Volo.Abp.Identity.IdentitySession", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uniqueidentifier");
b.Property<string>("ClientId")
.HasMaxLength(64)
.HasColumnType("nvarchar(64)");
b.Property<string>("Device")
.IsRequired()
.HasMaxLength(64)
.HasColumnType("nvarchar(64)");
b.Property<string>("DeviceInfo")
.HasMaxLength(64)
.HasColumnType("nvarchar(64)");
b.Property<string>("IpAddresses")
.HasMaxLength(256)
.HasColumnType("nvarchar(256)");
b.Property<DateTime?>("LastAccessed")
.HasColumnType("datetime2");
b.Property<string>("SessionId")
.IsRequired()
.HasMaxLength(128)
.HasColumnType("nvarchar(128)");
b.Property<DateTime>("SignedIn")
.HasColumnType("datetime2");
b.Property<Guid?>("TenantId")
.HasColumnType("uniqueidentifier")
.HasColumnName("TenantId");
b.Property<Guid>("UserId")
.HasColumnType("uniqueidentifier");
b.HasKey("Id");
b.HasIndex("Device");
b.HasIndex("SessionId");
b.HasIndex("TenantId", "UserId");
b.ToTable("AbpSessions", (string)null);
});
modelBuilder.Entity("Volo.Abp.Identity.IdentityUser", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uniqueidentifier");
b.Property<int>("AccessFailedCount")
.ValueGeneratedOnAdd()
.HasColumnType("int")
.HasDefaultValue(0)
.HasColumnName("AccessFailedCount");
b.Property<string>("ConcurrencyStamp")
.IsConcurrencyToken()
.IsRequired()
.HasMaxLength(40)
.HasColumnType("nvarchar(40)")
.HasColumnName("ConcurrencyStamp");
b.Property<DateTime>("CreationTime")
.HasColumnType("datetime2")
.HasColumnName("CreationTime");
b.Property<Guid?>("CreatorId")
.HasColumnType("uniqueidentifier")
.HasColumnName("CreatorId");
b.Property<Guid?>("DeleterId")
.HasColumnType("uniqueidentifier")
.HasColumnName("DeleterId");
b.Property<DateTime?>("DeletionTime")
.HasColumnType("datetime2")
.HasColumnName("DeletionTime");
b.Property<string>("Email")
.IsRequired()
.HasMaxLength(256)
.HasColumnType("nvarchar(256)")
.HasColumnName("Email");
b.Property<bool>("EmailConfirmed")
.ValueGeneratedOnAdd()
.HasColumnType("bit")
.HasDefaultValue(false)
.HasColumnName("EmailConfirmed");
b.Property<int>("EntityVersion")
.HasColumnType("int");
b.Property<string>("ExtraProperties")
.IsRequired()
.HasColumnType("nvarchar(max)")
.HasColumnName("ExtraProperties");
b.Property<bool>("IsActive")
.HasColumnType("bit")
.HasColumnName("IsActive");
b.Property<bool>("IsDeleted")
.ValueGeneratedOnAdd()
.HasColumnType("bit")
.HasDefaultValue(false)
.HasColumnName("IsDeleted");
b.Property<bool>("IsExternal")
.ValueGeneratedOnAdd()
.HasColumnType("bit")
.HasDefaultValue(false)
.HasColumnName("IsExternal");
b.Property<DateTime?>("LastModificationTime")
.HasColumnType("datetime2")
.HasColumnName("LastModificationTime");
b.Property<Guid?>("LastModifierId")
.HasColumnType("uniqueidentifier")
.HasColumnName("LastModifierId");
b.Property<DateTimeOffset?>("LastPasswordChangeTime")
.HasColumnType("datetimeoffset");
b.Property<bool>("LockoutEnabled")
.ValueGeneratedOnAdd()
.HasColumnType("bit")
.HasDefaultValue(false)
.HasColumnName("LockoutEnabled");
b.Property<DateTimeOffset?>("LockoutEnd")
.HasColumnType("datetimeoffset");
b.Property<string>("Name")
.HasMaxLength(64)
.HasColumnType("nvarchar(64)")
.HasColumnName("Name");
b.Property<string>("NormalizedEmail")
.IsRequired()
.HasMaxLength(256)
.HasColumnType("nvarchar(256)")
.HasColumnName("NormalizedEmail");
b.Property<string>("NormalizedUserName")
.IsRequired()
.HasMaxLength(256)
.HasColumnType("nvarchar(256)")
.HasColumnName("NormalizedUserName");
b.Property<string>("PasswordHash")
.HasMaxLength(256)
.HasColumnType("nvarchar(256)")
.HasColumnName("PasswordHash");
b.Property<string>("PhoneNumber")
.HasMaxLength(16)
.HasColumnType("nvarchar(16)")
.HasColumnName("PhoneNumber");
b.Property<bool>("PhoneNumberConfirmed")
.ValueGeneratedOnAdd()
.HasColumnType("bit")
.HasDefaultValue(false)
.HasColumnName("PhoneNumberConfirmed");
b.Property<string>("SecurityStamp")
.IsRequired()
.HasMaxLength(256)
.HasColumnType("nvarchar(256)")
.HasColumnName("SecurityStamp");
b.Property<bool>("ShouldChangePasswordOnNextLogin")
.HasColumnType("bit");
b.Property<string>("Surname")
.HasMaxLength(64)
.HasColumnType("nvarchar(64)")
.HasColumnName("Surname");
b.Property<Guid?>("TenantId")
.HasColumnType("uniqueidentifier")
.HasColumnName("TenantId");
b.Property<bool>("TwoFactorEnabled")
.ValueGeneratedOnAdd()
.HasColumnType("bit")
.HasDefaultValue(false)
.HasColumnName("TwoFactorEnabled");
b.Property<string>("UserName")
.IsRequired()
.HasMaxLength(256)
.HasColumnType("nvarchar(256)")
.HasColumnName("UserName");
b.HasKey("Id");
b.HasIndex("Email");
b.HasIndex("NormalizedEmail");
b.HasIndex("NormalizedUserName");
b.HasIndex("UserName");
b.ToTable("AbpUsers", (string)null);
});
modelBuilder.Entity("Volo.Abp.Identity.IdentityUserClaim", b =>
{
b.Property<Guid>("Id")
.HasColumnType("uniqueidentifier");
b.Property<string>("ClaimType")
.IsRequired()
.HasMaxLength(256)
.HasColumnType("nvarchar(256)");
b.Property<string>("ClaimValue")
.HasMaxLength(1024)
.HasColumnType("nvarchar(1024)");
b.Property<Guid?>("TenantId")
.HasColumnType("uniqueidentifier")
.HasColumnName("TenantId");
b.Property<Guid>("UserId")
.HasColumnType("uniqueidentifier");
b.HasKey("Id");
b.HasIndex("UserId");
b.ToTable("AbpUserClaims", (string)null);
});
modelBuilder.Entity("Volo.Abp.Identity.IdentityUserDelegation", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uniqueidentifier");
b.Property<DateTime>("EndTime")
.HasColumnType("datetime2");
b.Property<Guid>("SourceUserId")
.HasColumnType("uniqueidentifier");
b.Property<DateTime>("StartTime")
.HasColumnType("datetime2");
b.Property<Guid>("TargetUserId")
.HasColumnType("uniqueidentifier");
b.Property<Guid?>("TenantId")
.HasColumnType("uniqueidentifier")
.HasColumnName("TenantId");
b.HasKey("Id");
b.ToTable("AbpUserDelegations", (string)null);
});
modelBuilder.Entity("Volo.Abp.Identity.IdentityUserLogin", b =>
{
b.Property<Guid>("UserId")
.HasColumnType("uniqueidentifier");
b.Property<string>("LoginProvider")
.HasMaxLength(64)
.HasColumnType("nvarchar(64)");
b.Property<string>("ProviderDisplayName")
.HasMaxLength(128)
.HasColumnType("nvarchar(128)");
b.Property<string>("ProviderKey")
.IsRequired()
.HasMaxLength(196)
.HasColumnType("nvarchar(196)");
b.Property<Guid?>("TenantId")
.HasColumnType("uniqueidentifier")
.HasColumnName("TenantId");
b.HasKey("UserId", "LoginProvider");
b.HasIndex("LoginProvider", "ProviderKey");
b.ToTable("AbpUserLogins", (string)null);
});
modelBuilder.Entity("Volo.Abp.Identity.IdentityUserOrganizationUnit", b =>
{
b.Property<Guid>("OrganizationUnitId")
.HasColumnType("uniqueidentifier");
b.Property<Guid>("UserId")
.HasColumnType("uniqueidentifier");
b.Property<DateTime>("CreationTime")
.HasColumnType("datetime2")
.HasColumnName("CreationTime");
b.Property<Guid?>("CreatorId")
.HasColumnType("uniqueidentifier")
.HasColumnName("CreatorId");
b.Property<Guid?>("TenantId")
.HasColumnType("uniqueidentifier")
.HasColumnName("TenantId");
b.HasKey("OrganizationUnitId", "UserId");
b.HasIndex("UserId", "OrganizationUnitId");
b.ToTable("AbpUserOrganizationUnits", (string)null);
});
modelBuilder.Entity("Volo.Abp.Identity.IdentityUserRole", b =>
{
b.Property<Guid>("UserId")
.HasColumnType("uniqueidentifier");
b.Property<Guid>("RoleId")
.HasColumnType("uniqueidentifier");
b.Property<Guid?>("TenantId")
.HasColumnType("uniqueidentifier")
.HasColumnName("TenantId");
b.HasKey("UserId", "RoleId");
b.HasIndex("RoleId", "UserId");
b.ToTable("AbpUserRoles", (string)null);
});
modelBuilder.Entity("Volo.Abp.Identity.IdentityUserToken", b =>
{
b.Property<Guid>("UserId")
.HasColumnType("uniqueidentifier");
b.Property<string>("LoginProvider")
.HasMaxLength(64)
.HasColumnType("nvarchar(64)");
b.Property<string>("Name")
.HasMaxLength(128)
.HasColumnType("nvarchar(128)");
b.Property<Guid?>("TenantId")
.HasColumnType("uniqueidentifier")
.HasColumnName("TenantId");
b.Property<string>("Value")
.HasColumnType("nvarchar(max)");
b.HasKey("UserId", "LoginProvider", "Name");
b.ToTable("AbpUserTokens", (string)null);
});
modelBuilder.Entity("Volo.Abp.Identity.OrganizationUnit", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uniqueidentifier");
b.Property<string>("Code")
.IsRequired()
.HasMaxLength(95)
.HasColumnType("nvarchar(95)")
.HasColumnName("Code");
b.Property<string>("ConcurrencyStamp")
.IsConcurrencyToken()
.IsRequired()
.HasMaxLength(40)
.HasColumnType("nvarchar(40)")
.HasColumnName("ConcurrencyStamp");
b.Property<DateTime>("CreationTime")
.HasColumnType("datetime2")
.HasColumnName("CreationTime");
b.Property<Guid?>("CreatorId")
.HasColumnType("uniqueidentifier")
.HasColumnName("CreatorId");
b.Property<Guid?>("DeleterId")
.HasColumnType("uniqueidentifier")
.HasColumnName("DeleterId");
b.Property<DateTime?>("DeletionTime")
.HasColumnType("datetime2")
.HasColumnName("DeletionTime");
b.Property<string>("DisplayName")
.IsRequired()
.HasMaxLength(128)
.HasColumnType("nvarchar(128)")
.HasColumnName("DisplayName");
b.Property<int>("EntityVersion")
.HasColumnType("int");
b.Property<string>("ExtraProperties")
.IsRequired()
.HasColumnType("nvarchar(max)")
.HasColumnName("ExtraProperties");
b.Property<bool>("IsDeleted")
.ValueGeneratedOnAdd()
.HasColumnType("bit")
.HasDefaultValue(false)
.HasColumnName("IsDeleted");
b.Property<DateTime?>("LastModificationTime")
.HasColumnType("datetime2")
.HasColumnName("LastModificationTime");
b.Property<Guid?>("LastModifierId")
.HasColumnType("uniqueidentifier")
.HasColumnName("LastModifierId");
b.Property<Guid?>("ParentId")
.HasColumnType("uniqueidentifier");
b.Property<Guid?>("TenantId")
.HasColumnType("uniqueidentifier")
.HasColumnName("TenantId");
b.HasKey("Id");
b.HasIndex("Code");
b.HasIndex("ParentId");
b.ToTable("AbpOrganizationUnits", (string)null);
});
modelBuilder.Entity("Volo.Abp.Identity.OrganizationUnitRole", b =>
{
b.Property<Guid>("OrganizationUnitId")
.HasColumnType("uniqueidentifier");
b.Property<Guid>("RoleId")
.HasColumnType("uniqueidentifier");
b.Property<DateTime>("CreationTime")
.HasColumnType("datetime2")
.HasColumnName("CreationTime");
b.Property<Guid?>("CreatorId")
.HasColumnType("uniqueidentifier")
.HasColumnName("CreatorId");
b.Property<Guid?>("TenantId")
.HasColumnType("uniqueidentifier")
.HasColumnName("TenantId");
b.HasKey("OrganizationUnitId", "RoleId");
b.HasIndex("RoleId", "OrganizationUnitId");
b.ToTable("AbpOrganizationUnitRoles", (string)null);
});
modelBuilder.Entity("Volo.Abp.OpenIddict.Applications.OpenIddictApplication", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uniqueidentifier");
b.Property<string>("ApplicationType")
.HasMaxLength(50)
.HasColumnType("nvarchar(50)");
b.Property<string>("ClientId")
.HasMaxLength(100)
.HasColumnType("nvarchar(100)");
b.Property<string>("ClientSecret")
.HasColumnType("nvarchar(max)");
b.Property<string>("ClientType")
.HasMaxLength(50)
.HasColumnType("nvarchar(50)");
b.Property<string>("ClientUri")
.HasColumnType("nvarchar(max)");
b.Property<string>("ConcurrencyStamp")
.IsConcurrencyToken()
.IsRequired()
.HasMaxLength(40)
.HasColumnType("nvarchar(40)")
.HasColumnName("ConcurrencyStamp");
b.Property<string>("ConsentType")
.HasMaxLength(50)
.HasColumnType("nvarchar(50)");
b.Property<DateTime>("CreationTime")
.HasColumnType("datetime2")
.HasColumnName("CreationTime");
b.Property<Guid?>("CreatorId")
.HasColumnType("uniqueidentifier")
.HasColumnName("CreatorId");
b.Property<Guid?>("DeleterId")
.HasColumnType("uniqueidentifier")
.HasColumnName("DeleterId");
b.Property<DateTime?>("DeletionTime")
.HasColumnType("datetime2")
.HasColumnName("DeletionTime");
b.Property<string>("DisplayName")
.HasColumnType("nvarchar(max)");
b.Property<string>("DisplayNames")
.HasColumnType("nvarchar(max)");
b.Property<string>("ExtraProperties")
.IsRequired()
.HasColumnType("nvarchar(max)")
.HasColumnName("ExtraProperties");
b.Property<bool>("IsDeleted")
.ValueGeneratedOnAdd()
.HasColumnType("bit")
.HasDefaultValue(false)
.HasColumnName("IsDeleted");
b.Property<string>("JsonWebKeySet")
.HasColumnType("nvarchar(max)");
b.Property<DateTime?>("LastModificationTime")
.HasColumnType("datetime2")
.HasColumnName("LastModificationTime");
b.Property<Guid?>("LastModifierId")
.HasColumnType("uniqueidentifier")
.HasColumnName("LastModifierId");
b.Property<string>("LogoUri")
.HasColumnType("nvarchar(max)");
b.Property<string>("Permissions")
.HasColumnType("nvarchar(max)");
b.Property<string>("PostLogoutRedirectUris")
.HasColumnType("nvarchar(max)");
b.Property<string>("Properties")
.HasColumnType("nvarchar(max)");
b.Property<string>("RedirectUris")
.HasColumnType("nvarchar(max)");
b.Property<string>("Requirements")
.HasColumnType("nvarchar(max)");
b.Property<string>("Settings")
.HasColumnType("nvarchar(max)");
b.HasKey("Id");
b.HasIndex("ClientId");
b.ToTable("OpenIddictApplications", (string)null);
});
modelBuilder.Entity("Volo.Abp.OpenIddict.Authorizations.OpenIddictAuthorization", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uniqueidentifier");
b.Property<Guid?>("ApplicationId")
.HasColumnType("uniqueidentifier");
b.Property<string>("ConcurrencyStamp")
.IsConcurrencyToken()
.IsRequired()
.HasMaxLength(40)
.HasColumnType("nvarchar(40)")
.HasColumnName("ConcurrencyStamp");
b.Property<DateTime?>("CreationDate")
.HasColumnType("datetime2");
b.Property<DateTime>("CreationTime")
.HasColumnType("datetime2")
.HasColumnName("CreationTime");
b.Property<Guid?>("CreatorId")
.HasColumnType("uniqueidentifier")
.HasColumnName("CreatorId");
b.Property<Guid?>("DeleterId")
.HasColumnType("uniqueidentifier")
.HasColumnName("DeleterId");
b.Property<DateTime?>("DeletionTime")
.HasColumnType("datetime2")
.HasColumnName("DeletionTime");
b.Property<string>("ExtraProperties")
.IsRequired()
.HasColumnType("nvarchar(max)")
.HasColumnName("ExtraProperties");
b.Property<bool>("IsDeleted")
.ValueGeneratedOnAdd()
.HasColumnType("bit")
.HasDefaultValue(false)
.HasColumnName("IsDeleted");
b.Property<DateTime?>("LastModificationTime")
.HasColumnType("datetime2")
.HasColumnName("LastModificationTime");
b.Property<Guid?>("LastModifierId")
.HasColumnType("uniqueidentifier")
.HasColumnName("LastModifierId");
b.Property<string>("Properties")
.HasColumnType("nvarchar(max)");
b.Property<string>("Scopes")
.HasColumnType("nvarchar(max)");
b.Property<string>("Status")
.HasMaxLength(50)
.HasColumnType("nvarchar(50)");
b.Property<string>("Subject")
.HasMaxLength(400)
.HasColumnType("nvarchar(400)");
b.Property<string>("Type")
.HasMaxLength(50)
.HasColumnType("nvarchar(50)");
b.HasKey("Id");
b.HasIndex("ApplicationId", "Status", "Subject", "Type");
b.ToTable("OpenIddictAuthorizations", (string)null);
});
modelBuilder.Entity("Volo.Abp.OpenIddict.Scopes.OpenIddictScope", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uniqueidentifier");
b.Property<string>("ConcurrencyStamp")
.IsConcurrencyToken()
.IsRequired()
.HasMaxLength(40)
.HasColumnType("nvarchar(40)")
.HasColumnName("ConcurrencyStamp");
b.Property<DateTime>("CreationTime")
.HasColumnType("datetime2")
.HasColumnName("CreationTime");
b.Property<Guid?>("CreatorId")
.HasColumnType("uniqueidentifier")
.HasColumnName("CreatorId");
b.Property<Guid?>("DeleterId")
.HasColumnType("uniqueidentifier")
.HasColumnName("DeleterId");
b.Property<DateTime?>("DeletionTime")
.HasColumnType("datetime2")
.HasColumnName("DeletionTime");
b.Property<string>("Description")
.HasColumnType("nvarchar(max)");
b.Property<string>("Descriptions")
.HasColumnType("nvarchar(max)");
b.Property<string>("DisplayName")
.HasColumnType("nvarchar(max)");
b.Property<string>("DisplayNames")
.HasColumnType("nvarchar(max)");
b.Property<string>("ExtraProperties")
.IsRequired()
.HasColumnType("nvarchar(max)")
.HasColumnName("ExtraProperties");
b.Property<bool>("IsDeleted")
.ValueGeneratedOnAdd()
.HasColumnType("bit")
.HasDefaultValue(false)
.HasColumnName("IsDeleted");
b.Property<DateTime?>("LastModificationTime")
.HasColumnType("datetime2")
.HasColumnName("LastModificationTime");
b.Property<Guid?>("LastModifierId")
.HasColumnType("uniqueidentifier")
.HasColumnName("LastModifierId");
b.Property<string>("Name")
.HasMaxLength(200)
.HasColumnType("nvarchar(200)");
b.Property<string>("Properties")
.HasColumnType("nvarchar(max)");
b.Property<string>("Resources")
.HasColumnType("nvarchar(max)");
b.HasKey("Id");
b.HasIndex("Name");
b.ToTable("OpenIddictScopes", (string)null);
});
modelBuilder.Entity("Volo.Abp.OpenIddict.Tokens.OpenIddictToken", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uniqueidentifier");
b.Property<Guid?>("ApplicationId")
.HasColumnType("uniqueidentifier");
b.Property<Guid?>("AuthorizationId")
.HasColumnType("uniqueidentifier");
b.Property<string>("ConcurrencyStamp")
.IsConcurrencyToken()
.IsRequired()
.HasMaxLength(40)
.HasColumnType("nvarchar(40)")
.HasColumnName("ConcurrencyStamp");
b.Property<DateTime?>("CreationDate")
.HasColumnType("datetime2");
b.Property<DateTime>("CreationTime")
.HasColumnType("datetime2")
.HasColumnName("CreationTime");
b.Property<Guid?>("CreatorId")
.HasColumnType("uniqueidentifier")
.HasColumnName("CreatorId");
b.Property<Guid?>("DeleterId")
.HasColumnType("uniqueidentifier")
.HasColumnName("DeleterId");
b.Property<DateTime?>("DeletionTime")
.HasColumnType("datetime2")
.HasColumnName("DeletionTime");
b.Property<DateTime?>("ExpirationDate")
.HasColumnType("datetime2");
b.Property<string>("ExtraProperties")
.IsRequired()
.HasColumnType("nvarchar(max)")
.HasColumnName("ExtraProperties");
b.Property<bool>("IsDeleted")
.ValueGeneratedOnAdd()
.HasColumnType("bit")
.HasDefaultValue(false)
.HasColumnName("IsDeleted");
b.Property<DateTime?>("LastModificationTime")
.HasColumnType("datetime2")
.HasColumnName("LastModificationTime");
b.Property<Guid?>("LastModifierId")
.HasColumnType("uniqueidentifier")
.HasColumnName("LastModifierId");
b.Property<string>("Payload")
.HasColumnType("nvarchar(max)");
b.Property<string>("Properties")
.HasColumnType("nvarchar(max)");
b.Property<DateTime?>("RedemptionDate")
.HasColumnType("datetime2");
b.Property<string>("ReferenceId")
.HasMaxLength(100)
.HasColumnType("nvarchar(100)");
b.Property<string>("Status")
.HasMaxLength(50)
.HasColumnType("nvarchar(50)");
b.Property<string>("Subject")
.HasMaxLength(400)
.HasColumnType("nvarchar(400)");
b.Property<string>("Type")
.HasMaxLength(50)
.HasColumnType("nvarchar(50)");
b.HasKey("Id");
b.HasIndex("AuthorizationId");
b.HasIndex("ReferenceId");
b.HasIndex("ApplicationId", "Status", "Subject", "Type");
b.ToTable("OpenIddictTokens", (string)null);
});
modelBuilder.Entity("Volo.Abp.PermissionManagement.PermissionDefinitionRecord", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uniqueidentifier");
b.Property<string>("DisplayName")
.IsRequired()
.HasMaxLength(256)
.HasColumnType("nvarchar(256)");
b.Property<string>("ExtraProperties")
.HasColumnType("nvarchar(max)")
.HasColumnName("ExtraProperties");
b.Property<string>("GroupName")
.IsRequired()
.HasMaxLength(128)
.HasColumnType("nvarchar(128)");
b.Property<bool>("IsEnabled")
.HasColumnType("bit");
b.Property<byte>("MultiTenancySide")
.HasColumnType("tinyint");
b.Property<string>("Name")
.IsRequired()
.HasMaxLength(128)
.HasColumnType("nvarchar(128)");
b.Property<string>("ParentName")
.HasMaxLength(128)
.HasColumnType("nvarchar(128)");
b.Property<string>("Providers")
.HasMaxLength(128)
.HasColumnType("nvarchar(128)");
b.Property<string>("StateCheckers")
.HasMaxLength(256)
.HasColumnType("nvarchar(256)");
b.HasKey("Id");
b.HasIndex("GroupName");
b.HasIndex("Name")
.IsUnique();
b.ToTable("AbpPermissions", (string)null);
});
modelBuilder.Entity("Volo.Abp.PermissionManagement.PermissionGrant", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uniqueidentifier");
b.Property<string>("Name")
.IsRequired()
.HasMaxLength(128)
.HasColumnType("nvarchar(128)");
b.Property<string>("ProviderKey")
.IsRequired()
.HasMaxLength(64)
.HasColumnType("nvarchar(64)");
b.Property<string>("ProviderName")
.IsRequired()
.HasMaxLength(64)
.HasColumnType("nvarchar(64)");
b.Property<Guid?>("TenantId")
.HasColumnType("uniqueidentifier")
.HasColumnName("TenantId");
b.HasKey("Id");
b.HasIndex("TenantId", "Name", "ProviderName", "ProviderKey")
.IsUnique()
.HasFilter("[TenantId] IS NOT NULL");
b.ToTable("AbpPermissionGrants", (string)null);
});
modelBuilder.Entity("Volo.Abp.PermissionManagement.PermissionGroupDefinitionRecord", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uniqueidentifier");
b.Property<string>("DisplayName")
.IsRequired()
.HasMaxLength(256)
.HasColumnType("nvarchar(256)");
b.Property<string>("ExtraProperties")
.HasColumnType("nvarchar(max)")
.HasColumnName("ExtraProperties");
b.Property<string>("Name")
.IsRequired()
.HasMaxLength(128)
.HasColumnType("nvarchar(128)");
b.HasKey("Id");
b.HasIndex("Name")
.IsUnique();
b.ToTable("AbpPermissionGroups", (string)null);
});
modelBuilder.Entity("Volo.Abp.SettingManagement.Setting", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uniqueidentifier");
b.Property<string>("Name")
.IsRequired()
.HasMaxLength(128)
.HasColumnType("nvarchar(128)");
b.Property<string>("ProviderKey")
.HasMaxLength(64)
.HasColumnType("nvarchar(64)");
b.Property<string>("ProviderName")
.HasMaxLength(64)
.HasColumnType("nvarchar(64)");
b.Property<string>("Value")
.IsRequired()
.HasMaxLength(2048)
.HasColumnType("nvarchar(2048)");
b.HasKey("Id");
b.HasIndex("Name", "ProviderName", "ProviderKey")
.IsUnique()
.HasFilter("[ProviderName] IS NOT NULL AND [ProviderKey] IS NOT NULL");
b.ToTable("AbpSettings", (string)null);
});
modelBuilder.Entity("Volo.Abp.SettingManagement.SettingDefinitionRecord", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uniqueidentifier");
b.Property<string>("DefaultValue")
.HasMaxLength(2048)
.HasColumnType("nvarchar(2048)");
b.Property<string>("Description")
.HasMaxLength(512)
.HasColumnType("nvarchar(512)");
b.Property<string>("DisplayName")
.IsRequired()
.HasMaxLength(256)
.HasColumnType("nvarchar(256)");
b.Property<string>("ExtraProperties")
.HasColumnType("nvarchar(max)")
.HasColumnName("ExtraProperties");
b.Property<bool>("IsEncrypted")
.HasColumnType("bit");
b.Property<bool>("IsInherited")
.HasColumnType("bit");
b.Property<bool>("IsVisibleToClients")
.HasColumnType("bit");
b.Property<string>("Name")
.IsRequired()
.HasMaxLength(128)
.HasColumnType("nvarchar(128)");
b.Property<string>("Providers")
.HasMaxLength(1024)
.HasColumnType("nvarchar(1024)");
b.HasKey("Id");
b.HasIndex("Name")
.IsUnique();
b.ToTable("AbpSettingDefinitions", (string)null);
});
modelBuilder.Entity("Volo.Abp.TenantManagement.Tenant", b =>
{
b.Property<Guid>("Id")
.ValueGeneratedOnAdd()
.HasColumnType("uniqueidentifier");
b.Property<string>("ConcurrencyStamp")
.IsConcurrencyToken()
.IsRequired()
.HasMaxLength(40)
.HasColumnType("nvarchar(40)")
.HasColumnName("ConcurrencyStamp");
b.Property<DateTime>("CreationTime")
.HasColumnType("datetime2")
.HasColumnName("CreationTime");
b.Property<Guid?>("CreatorId")
.HasColumnType("uniqueidentifier")
.HasColumnName("CreatorId");
b.Property<Guid?>("DeleterId")
.HasColumnType("uniqueidentifier")
.HasColumnName("DeleterId");
b.Property<DateTime?>("DeletionTime")
.HasColumnType("datetime2")
.HasColumnName("DeletionTime");
b.Property<int>("EntityVersion")
.HasColumnType("int");
b.Property<string>("ExtraProperties")
.IsRequired()
.HasColumnType("nvarchar(max)")
.HasColumnName("ExtraProperties");
b.Property<bool>("IsDeleted")
.ValueGeneratedOnAdd()
.HasColumnType("bit")
.HasDefaultValue(false)
.HasColumnName("IsDeleted");
b.Property<DateTime?>("LastModificationTime")
.HasColumnType("datetime2")
.HasColumnName("LastModificationTime");
b.Property<Guid?>("LastModifierId")
.HasColumnType("uniqueidentifier")
.HasColumnName("LastModifierId");
b.Property<string>("Name")
.IsRequired()
.HasMaxLength(64)
.HasColumnType("nvarchar(64)");
b.Property<string>("NormalizedName")
.IsRequired()
.HasMaxLength(64)
.HasColumnType("nvarchar(64)");
b.HasKey("Id");
b.HasIndex("Name");
b.HasIndex("NormalizedName");
b.ToTable("AbpTenants", (string)null);
});
modelBuilder.Entity("Volo.Abp.TenantManagement.TenantConnectionString", b =>
{
b.Property<Guid>("TenantId")
.HasColumnType("uniqueidentifier");
b.Property<string>("Name")
.HasMaxLength(64)
.HasColumnType("nvarchar(64)");
b.Property<string>("Value")
.IsRequired()
.HasMaxLength(1024)
.HasColumnType("nvarchar(1024)");
b.HasKey("TenantId", "Name");
b.ToTable("AbpTenantConnectionStrings", (string)null);
});
modelBuilder.Entity("Volo.Abp.AuditLogging.AuditLogAction", b =>
{
b.HasOne("Volo.Abp.AuditLogging.AuditLog", null)
.WithMany("Actions")
.HasForeignKey("AuditLogId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
});
modelBuilder.Entity("Volo.Abp.AuditLogging.EntityChange", b =>
{
b.HasOne("Volo.Abp.AuditLogging.AuditLog", null)
.WithMany("EntityChanges")
.HasForeignKey("AuditLogId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
});
modelBuilder.Entity("Volo.Abp.AuditLogging.EntityPropertyChange", b =>
{
b.HasOne("Volo.Abp.AuditLogging.EntityChange", null)
.WithMany("PropertyChanges")
.HasForeignKey("EntityChangeId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
});
modelBuilder.Entity("Volo.Abp.Identity.IdentityRoleClaim", b =>
{
b.HasOne("Volo.Abp.Identity.IdentityRole", null)
.WithMany("Claims")
.HasForeignKey("RoleId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
});
modelBuilder.Entity("Volo.Abp.Identity.IdentityUserClaim", b =>
{
b.HasOne("Volo.Abp.Identity.IdentityUser", null)
.WithMany("Claims")
.HasForeignKey("UserId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
});
modelBuilder.Entity("Volo.Abp.Identity.IdentityUserLogin", b =>
{
b.HasOne("Volo.Abp.Identity.IdentityUser", null)
.WithMany("Logins")
.HasForeignKey("UserId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
});
modelBuilder.Entity("Volo.Abp.Identity.IdentityUserOrganizationUnit", b =>
{
b.HasOne("Volo.Abp.Identity.OrganizationUnit", null)
.WithMany()
.HasForeignKey("OrganizationUnitId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.HasOne("Volo.Abp.Identity.IdentityUser", null)
.WithMany("OrganizationUnits")
.HasForeignKey("UserId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
});
modelBuilder.Entity("Volo.Abp.Identity.IdentityUserRole", b =>
{
b.HasOne("Volo.Abp.Identity.IdentityRole", null)
.WithMany()
.HasForeignKey("RoleId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.HasOne("Volo.Abp.Identity.IdentityUser", null)
.WithMany("Roles")
.HasForeignKey("UserId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
});
modelBuilder.Entity("Volo.Abp.Identity.IdentityUserToken", b =>
{
b.HasOne("Volo.Abp.Identity.IdentityUser", null)
.WithMany("Tokens")
.HasForeignKey("UserId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
});
modelBuilder.Entity("Volo.Abp.Identity.OrganizationUnit", b =>
{
b.HasOne("Volo.Abp.Identity.OrganizationUnit", null)
.WithMany()
.HasForeignKey("ParentId");
});
modelBuilder.Entity("Volo.Abp.Identity.OrganizationUnitRole", b =>
{
b.HasOne("Volo.Abp.Identity.OrganizationUnit", null)
.WithMany("Roles")
.HasForeignKey("OrganizationUnitId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
b.HasOne("Volo.Abp.Identity.IdentityRole", null)
.WithMany()
.HasForeignKey("RoleId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
});
modelBuilder.Entity("Volo.Abp.OpenIddict.Authorizations.OpenIddictAuthorization", b =>
{
b.HasOne("Volo.Abp.OpenIddict.Applications.OpenIddictApplication", null)
.WithMany()
.HasForeignKey("ApplicationId");
});
modelBuilder.Entity("Volo.Abp.OpenIddict.Tokens.OpenIddictToken", b =>
{
b.HasOne("Volo.Abp.OpenIddict.Applications.OpenIddictApplication", null)
.WithMany()
.HasForeignKey("ApplicationId");
b.HasOne("Volo.Abp.OpenIddict.Authorizations.OpenIddictAuthorization", null)
.WithMany()
.HasForeignKey("AuthorizationId");
});
modelBuilder.Entity("Volo.Abp.TenantManagement.TenantConnectionString", b =>
{
b.HasOne("Volo.Abp.TenantManagement.Tenant", null)
.WithMany("ConnectionStrings")
.HasForeignKey("TenantId")
.OnDelete(DeleteBehavior.Cascade)
.IsRequired();
});
modelBuilder.Entity("Volo.Abp.AuditLogging.AuditLog", b =>
{
b.Navigation("Actions");
b.Navigation("EntityChanges");
});
modelBuilder.Entity("Volo.Abp.AuditLogging.EntityChange", b =>
{
b.Navigation("PropertyChanges");
});
modelBuilder.Entity("Volo.Abp.Identity.IdentityRole", b =>
{
b.Navigation("Claims");
});
modelBuilder.Entity("Volo.Abp.Identity.IdentityUser", b =>
{
b.Navigation("Claims");
b.Navigation("Logins");
b.Navigation("OrganizationUnits");
b.Navigation("Roles");
b.Navigation("Tokens");
});
modelBuilder.Entity("Volo.Abp.Identity.OrganizationUnit", b =>
{
b.Navigation("Roles");
});
modelBuilder.Entity("Volo.Abp.TenantManagement.Tenant", b =>
{
b.Navigation("ConnectionStrings");
});
#pragma warning restore 612, 618
}
}
}
```
|
Korathota Grama Niladhari Division is a Grama Niladhari Division of the Kaduwela Divisional Secretariat of Colombo District of Western Province, Sri Lanka. It has Grama Niladhari Division Code 488.
Korathota is surrounded by the Welihinda, Mullegama North, Dedigamuwa, Pahala Bomiriya B, Nawagamuwa, Nawagamuwa South, Shanthalokagama, Thunadahena and Kothalawala Grama Niladhari Divisions.
Demographics
Ethnicity
The Korathota Grama Niladhari Division has a Sinhalese majority (98.3%). In comparison, the Kaduwela Divisional Secretariat (which contains the Korathota Grama Niladhari Division) has a Sinhalese majority (95.6%).
Religion
The Korathota Grama Niladhari Division has a Buddhist majority (96.3%). In comparison, the Kaduwela Divisional Secretariat (which contains the Korathota Grama Niladhari Division) has a Buddhist majority (90.4%).
Grama Niladhari Divisions of Kaduwela Divisional Secretariat
References
|
The Rossland was a sternwheel steamboat that ran on the Arrow Lakes in British Columbia. It was named after Rossland, British Columbia, once a prosperous mining town in the region.
Design and construction
Rossland was the third steamboat built by the Canadian Pacific Railway for its steamboat lines running in the lakes of the Kootenays. She was designed by the superintendent of the C.P.R.'s Lake Service, the accomplished steamboat man James W. Troup, as an express passenger and tourism boat, intended to make the 256-mile round trip between Arrowhead and Robson in one day.
Rossland was built at Nakusp at the shipyard owned by the master builder Thomas J. Bulger and his sons James M. and David T. Bulger. Most inland steamers of the Pacific Northwest were built with a flat bottom and as shallow a draft as possible so that they could move far up the region's many shallow rivers to reach gold fields, farms or other areas where transportation was needed and roads or railroads were absent or inadequate. Rossland was an exception to this rule. She was intended to operate as a "lake boat", where depth of water was normally not a problem, and therefore she had a rounder and deeper bottom than the normal sternwheeler design. Her lake boat design made Rossland faster and more efficient on the deep water of the Arrow Lakes. Her powerful engines were built by B.C. Iron Works in Vancouver.
Service on Arrow Lakes
Following her launch, Rossland was towed to a nearby wharf by the vessel Nakusp for completion. Before passenger accommodations were installed, Rossland was put to work towing barges while the Lytton was being overhauled. Passenger service for Rossland began in early 1898. At her maximum speed of 22 miles per hour, Rossland was easily the fastest vessel on the lakes. However, she burned too much coal at this pace, and normally did not run so fast. Steamboats were prone to damage and even destruction by fire, as Nakusp had been in 1897. In 1899 Rossland caught fire below the town of Nakusp; Captain Forslund was able to beach the vessel and extinguish the flames. Steamboat operation on the Arrow Lakes was seasonal, as the lakes were generally frozen over during winter. The boats were moored in as safe a place as could be found during the freeze-up, and work was often done on them over the winter to prepare them for the next season, as was the case with Rossland.
Reconstruction
During the winter of 1908 to 1909, at a cost of $2,290, her Texas deck (the highest cabin on the ship except for the pilot house) was extended all the way back to the stern to provide additional passenger accommodations. The Rossland's hull, built entirely of wood, wore out quickly under heavy use and became waterlogged, as was typical for wooden steamboats. If repair of the hull was impractical, sometimes a new hull would be built and the boat's cabins (called the "house") and machinery would be transferred to it. In the winter of 1909-1910, this was done with the Rossland. She was brought into the shipyard at Nakusp, where builder James Bulger hauled her out of the water, unfastened her house and machinery, and jacked them up on timbers. Bulger and his workmen then launched the old hull back into the lake and built a new hull under the old house and machinery. The supports were removed, and the vessel was relaunched. With a new hull, Rossland was practically a new steamboat. The Texas was also extended slightly during the 1909-10 reconstruction.
Effects of the Great War
When Canada entered the Great War in 1914, the young men of British Columbia were mobilized and many C.P.R. employees volunteered for Canada's armed services. Engineers and deck and engine room hands were especially wanted by the navy. As the young men left, the local farms and businesses declined, and there was a fall-off in tourism as well. In her last years, Rossland, like other C.P.R. inland steamers, transported troops. The economic downturn caused by mobilization forced C.P.R. to take a number of its steamers out of operation. Rossland had been having boiler troubles, and rather than repair them, C.P.R. took her out of service.
Foundered at dock
On January 25, 1917, Rossland, moored at Nakusp, sank at the dock. Either her hull seams had opened, or the weight of ice and snow on her decks and house had pressed her down so far that water poured in through ports that had been left open. She sank quickly, heeled over sharply on her port side, with the water up to the pilot house. Rossland was raised in March 1917. Her long-time master, Captain Forslund, bought her hull and used it as a wharf boat at his place south of Needles.
See also
List of historical ships in British Columbia
Notes
Further reading
Faber, Jim, Steamer's Wake—Voyaging down the old marine highways of Puget Sound, British Columbia, and the Columbia River, Enetai Press, Seattle, WA 1985
Mills, Randall V., Sternwheelers up Columbia—A Century of Steamboating in the Oregon Country, University of Nebraska, Lincoln NE (1977 reprint of 1947 ed.)
Timmen, Fritz, Blow for the Landing—A Hundred Years of Steam Navigation on the Waters of the West'', Caxton Printers, Caldwell, Idaho
External links
Photographs of Rossland from the Provincial Archives of British Columbia
Rossland at Nakusp, circa 1898 The short Texas (upper cabin) dates this photograph as before the winter of 1909-1910.
Rossland at Nakusp, 1910 or later The longer Texas allows this photograph to be dated as after 1910. The close interaction between the railway and the Arrow Lakes steamers can be seen with the rail tracks and freight cars running right out to the docks.
Steamboats of the Arrow Lakes
Paddle steamers of British Columbia
History of British Columbia
1897 ships
Ships of CP Ships
Troopships of Canada
|
"Goteo" is a song by Argentine rapper Duki. It was released on August 6, 2019. The music video for the song has more than 100 million views on YouTube. The song has over 160 million plays on Spotify. In 2020 the song was nominated at the 21st Annual Latin Grammy Awards in the category of Best Rap/Hip-Hop Song.
Background
The song was produced by Asan; the music video was directed by Dano and filmed by Zazo Canvas, Nico Leonardo and Dano. The video was shot in Barcelona when Duki performed at the Razzmatazz in 2019.
Remix
The remix of the song was released on January 23, 2020, featuring C.R.O, Ronny J, Capo Plaza and Pablo Chill-E. In 2020 the remix was nominated for the Gardel Awards in the category "Best Urban/Trap Musical Collaboration".
Personnel
Primary artist
Duki — lead vocals
Additional personnel
Asan — producer
El Sidechain — mixing
Remix version
C.R.O — guest vocals
Ronny J — guest vocals
Pablo Chill-E — guest vocals
Capo Plaza — guest vocals
Charts
Weekly charts
Year-end charts
Certifications
Awards and accolades
See also
List of Billboard Argentina Hot 100 top-ten singles in 2019
References
2019 singles
2019 songs
Argentine songs
Duki (rapper) songs
Songs written by Duki (rapper)
Argentina Hot 100 Top Ten Singles
|
```c++
/*
*
* Use of this source code is governed by a BSD-style license that can be
* found in the LICENSE file.
*/
#include "SkWidget.h"
#include "SkCanvas.h"
#include "SkMath.h"
#include "SkShader.h"
#include "SkInterpolator.h"
#include "SkTime.h"
SkProgressView::SkProgressView(uint32_t flags) : SkView(flags), fOnShader(NULL), fOffShader(NULL)
{
fValue = 0;
fMax = 0;
fInterp = NULL;
fDoInterp = false;
}
SkProgressView::~SkProgressView()
{
delete fInterp;
SkSafeUnref(fOnShader);
SkSafeUnref(fOffShader);
}
void SkProgressView::setMax(U16CPU max)
{
if (fMax != max)
{
fMax = SkToU16(max);
if (fValue > 0)
this->inval(NULL);
}
}
void SkProgressView::setValue(U16CPU value)
{
if (fValue != value)
{
if (fDoInterp)
{
if (fInterp)
delete fInterp;
fInterp = new SkInterpolator(1, 2);
SkScalar x = (SkScalar)(fValue << 8);
fInterp->setKeyFrame(0, SkTime::GetMSecs(), &x, 0);
x = (SkScalar)(value << 8);
fInterp->setKeyFrame(1, SkTime::GetMSecs() + 333, &x);
}
fValue = SkToU16(value);
this->inval(NULL);
}
}
void SkProgressView::onDraw(SkCanvas* canvas)
{
if (fMax == 0)
return;
SkFixed percent;
if (fInterp)
{
SkScalar x;
if (fInterp->timeToValues(SkTime::GetMSecs(), &x) == SkInterpolator::kFreezeEnd_Result)
{
delete fInterp;
fInterp = NULL;
}
percent = (SkFixed)x; // now its 16.8
percent = SkMax32(0, SkMin32(percent, fMax << 8)); // now its pinned
percent = SkFixedDiv(percent, fMax << 8); // now its 0.16
this->inval(NULL);
}
else
{
U16CPU value = SkMax32(0, SkMin32(fValue, fMax));
percent = SkFixedDiv(value, fMax);
}
SkRect r;
SkPaint p;
r.set(0, 0, this->width(), this->height());
p.setAntiAlias(true);
r.fRight = r.fLeft + SkScalarMul(r.width(), SkFixedToScalar(percent));
p.setStyle(SkPaint::kFill_Style);
p.setColor(SK_ColorDKGRAY);
p.setShader(fOnShader);
canvas->drawRect(r, p);
p.setColor(SK_ColorWHITE);
p.setShader(fOffShader);
r.fLeft = r.fRight;
r.fRight = this->width() - SK_Scalar1;
if (r.width() > 0)
canvas->drawRect(r, p);
}
#include "SkImageDecoder.h"
static SkShader* inflate_shader(const char file[])
{
SkBitmap bm;
return SkImageDecoder::DecodeFile(file, &bm) ?
SkShader::CreateBitmapShader(bm, SkShader::kRepeat_TileMode, SkShader::kRepeat_TileMode) :
NULL;
}
void SkProgressView::onInflate(const SkDOM& dom, const SkDOM::Node* node)
{
this->INHERITED::onInflate(dom, node);
const char* s;
SkASSERT(fOnShader == NULL);
SkASSERT(fOffShader == NULL);
if ((s = dom.findAttr(node, "src-on")) != NULL)
fOnShader = inflate_shader(s);
if ((s = dom.findAttr(node, "src-off")) != NULL)
fOffShader = inflate_shader(s);
(void)dom.findBool(node, "do-interp", &fDoInterp);
}
```
|
View of the Cannaregio Canal is a small oil-on-canvas painting executed ca. 1770 by the Italian painter Francesco Guardi. It measures 48.9 × 77.5 cm. It is now in the reading room of the Frick Art Reference Library alongside the Regatta in Venice. Both paintings were gifted to the Frick Collection by Helen Clay Frick after her father's death. In the painting, Guardi captures a typical scene of Venetian life on the canals. In this particular veduta, Guardi depicts a section of the northern bank of the Cannaregio Canal, one of Venice’s largest canals, located in the Cannaregio sestiere (district) of the city.
The Palazzo Surian Bellotto is the most prominent building in the painting, named after the two different families who owned the palace: first the Surian family and then the Bellotto family.
The Ponte dei Tre Archi connects the northern bank with the small fragment of the southern bank. The bridge’s arch acts as a viewfinder, focusing the viewer’s gaze onto the tiny patch of pure blue, breaking up the otherwise entirely urban landscape.
Another building that features in this view is the Santa Maria delle Penitenti. It is recognizable by its circular window below the roof and placement behind the Ponte dei Tre Archi. This church was designed by Giorgio Massari in 1725 and the façade was never completed. Guardi takes care to ensure its recognizability despite its modest appearance.
References
External links
1770 paintings
Paintings by Francesco Guardi
Paintings in the Frick Collection
Cityscape paintings of Venice
|
Pershotravensk (, ) is a city and municipality in Synelnykove Raion, Dnipropetrovsk Oblast (province) of Ukraine. Population:
History
Until May 1960 it was known as the town of Shakhtarske. It has had city status since February 1966.
In 1974 the population was 23,600 people.
In January 1989 the population was 28,068 people.
In January 2013 the population was 29,019 people.
Until 18 July 2020, Pershotravensk was incorporated as a city of oblast significance. In July 2020, as part of the administrative reform of Ukraine, which reduced the number of raions of Dnipropetrovsk Oblast to seven, the city of Pershotravensk was merged into Synelnykove Raion.
Notable people
Oleksandr Kubrakov — economist, civil servant, politician
Yaroslav Rakitskyi — footballer
References
Cities in Dnipropetrovsk Oblast
Cities of regional significance in Ukraine
Populated places established in the Ukrainian Soviet Socialist Republic
|
The 2016–17 Australian Athletics Championships was the 95th edition of the national championship in outdoor track and field for Australia. It was held from 30 March – 2 April 2017 at the Sydney Olympic Park Athletic Centre in Sydney. It served as the selection meeting for Australia at the 2017 World Championships in Athletics. Distance events were held separately, with the 10,000 metres taking place at the Zatopek 10K on 8 December 2016 at Lakeside Stadium in Melbourne and the 5000 metres taking place at the Summer of Athletics Meet in Canberra on 11 March 2017.
Medal summary
Men
Women
References
External links
Athletics Australia website
2017
Australian Athletics Championships
Australian Championships
Athletics Championships
Sports competitions in Sydney
2010s in Sydney
|
```go
//
//
// path_to_url
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
package amqp09
const (
// Shared
urlsField = "urls"
tlsField = "tls"
// Input
queueField = "queue"
queueDeclareField = "queue_declare"
queueDeclareEnabledField = "enabled"
queueDeclareDurableField = "durable"
queueDeclareAutoDeleteField = "auto_delete"
bindingsDeclareField = "bindings_declare"
bindingsDeclareExchangeField = "exchange"
bindingsDeclareKeyField = "key"
consumerTagField = "consumer_tag"
autoAckField = "auto_ack"
nackRejectPattensField = "nack_reject_patterns"
prefetchCountField = "prefetch_count"
prefetchSizeField = "prefetch_size"
// Output
exchangeField = "exchange"
exchangeDeclareField = "exchange_declare"
exchangeDeclareEnabledField = "enabled"
exchangeDeclareTypeField = "type"
exchangeDeclareDurableField = "durable"
keyField = "key"
typeField = "type"
contentTypeField = "content_type"
contentEncodingField = "content_encoding"
metadataFilterField = "metadata"
priorityField = "priority"
persistentField = "persistent"
mandatoryField = "mandatory"
immediateField = "immediate"
timeoutField = "timeout"
correlationIDField = "correlation_id"
replyToField = "reply_to"
expirationField = "expiration"
messageIDField = "message_id"
userIDField = "user_id"
appIDField = "app_id"
)
```
|
Anti-spam appliances are software or hardware devices integrated with on-board software that implement e-mail spam filtering and/or anti-spam for instant messaging (also called "spim") and are deployed at the gateway or in front of the mail server. They are normally driven by an operating system optimized for spam filtering. Anti-spam appliances have existed in wide area networks and home networks since the early 2000s.
The anti-spam appliances that are found in wide area networks are usually built from server hardware and are generally used by companies, corporations, ISPs, and universities, and anti-spam appliances that are found in home networks are usually built from embedded hardware and are generally used by consumers. Anti-spam technology companies that have produced anti-spam appliances for wide area networks and/or home networks include, but are not limited to Proofpoint, IronPort, Barracuda Networks, D-Link, Spam Cube, Netgear and TrustEli.
Hardware anti-spam appliances may be selected instead of software for several reasons: customers may prefer to buy hardware rather than software because of its ease of installation, the appliance can largely manage itself once it is installed, and anti-spam appliances commonly provide other security features such as anti-virus protection.
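The gateway-stage decision described above — accept, hold, or refuse a message before it ever reaches the mail server — can be sketched in a few lines. The following is a hypothetical illustration only: the phrase list, scores, and thresholds are invented for the example and do not reflect the logic of any particular appliance.

```python
# Hypothetical sketch of the kind of decision a gateway anti-spam
# appliance makes before mail reaches the mail server.
# Rules and thresholds here are illustrative, not from any product.
from email.message import EmailMessage

SUSPECT_PHRASES = ("act now", "free money", "winner")

def score_message(msg: EmailMessage) -> int:
    """Return a crude spam score; higher means more spam-like."""
    score = 0
    subject = (msg["Subject"] or "").lower()
    body = msg.get_content().lower() if not msg.is_multipart() else ""
    for phrase in SUSPECT_PHRASES:
        if phrase in subject:
            score += 2          # suspicious phrase in the subject line
        if phrase in body:
            score += 1          # suspicious phrase in the body
    if not msg["From"]:
        score += 3              # a missing sender is a strong signal
    return score

def gateway_action(msg: EmailMessage, quarantine_at: int = 3, reject_at: int = 6) -> str:
    """Map a score to the action a gateway would take."""
    score = score_message(msg)
    if score >= reject_at:
        return "reject"         # refuse at the gateway; the server never sees it
    if score >= quarantine_at:
        return "quarantine"     # hold for administrator or user review
    return "deliver"            # relay on to the real mail server

msg = EmailMessage()
msg["From"] = "friend@example.org"
msg["Subject"] = "Lunch tomorrow?"
msg.set_content("See you at noon.")
print(gateway_action(msg))  # deliver
```

Real appliances layer many more signals (DNS blocklists, sender reputation, content fingerprints) on top of this kind of scoring, but the accept/quarantine/reject flow at the gateway is the same basic shape.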
References
Spam filtering
Spamming
Anti-spam
|
```javascript
/**
* @license
*/
/**
* @fileoverview This file contains helpers for constructing and rendering the
* critical request chains network tree.
*/
import {Globals} from './report-globals.js';
/** @typedef {import('./dom.js').DOM} DOM */
/** @typedef {import('./details-renderer.js').DetailsRenderer} DetailsRenderer */
/**
* @typedef CRCSegment
* @property {LH.Audit.Details.SimpleCriticalRequestNode[string]} node
* @property {boolean} isLastChild
* @property {boolean} hasChildren
* @property {number} startTime
* @property {number} transferSize
* @property {boolean[]} treeMarkers
*/
class CriticalRequestChainRenderer {
/**
* Create render context for critical-request-chain tree display.
* @param {LH.Audit.Details.SimpleCriticalRequestNode} tree
* @return {{tree: LH.Audit.Details.SimpleCriticalRequestNode, startTime: number, transferSize: number}}
*/
static initTree(tree) {
let startTime = 0;
const rootNodes = Object.keys(tree);
if (rootNodes.length > 0) {
const node = tree[rootNodes[0]];
startTime = node.request.startTime;
}
return {tree, startTime, transferSize: 0};
}
/**
* Helper to create context for each critical-request-chain node based on its
* parent. Calculates if this node is the last child, whether it has any
* children itself and what the tree looks like all the way back up to the root,
* so the tree markers can be drawn correctly.
* @param {LH.Audit.Details.SimpleCriticalRequestNode} parent
* @param {string} id
* @param {number} startTime
* @param {number} transferSize
* @param {Array<boolean>=} treeMarkers
* @param {boolean=} parentIsLastChild
* @return {CRCSegment}
*/
static createSegment(parent, id, startTime, transferSize, treeMarkers, parentIsLastChild) {
const node = parent[id];
const siblings = Object.keys(parent);
const isLastChild = siblings.indexOf(id) === (siblings.length - 1);
const hasChildren = !!node.children && Object.keys(node.children).length > 0;
// Copy the tree markers so that we don't change by reference.
const newTreeMarkers = Array.isArray(treeMarkers) ? treeMarkers.slice(0) : [];
// Add on the new entry.
if (typeof parentIsLastChild !== 'undefined') {
newTreeMarkers.push(!parentIsLastChild);
}
return {
node,
isLastChild,
hasChildren,
startTime,
transferSize: transferSize + node.request.transferSize,
treeMarkers: newTreeMarkers,
};
}
/**
* Creates the DOM for a tree segment.
* @param {DOM} dom
* @param {CRCSegment} segment
* @param {DetailsRenderer} detailsRenderer
* @return {Node}
*/
static createChainNode(dom, segment, detailsRenderer) {
const chainEl = dom.createComponent('crcChain');
// Hovering over request shows full URL.
dom.find('.lh-crc-node', chainEl).setAttribute('title', segment.node.request.url);
const treeMarkeEl = dom.find('.lh-crc-node__tree-marker', chainEl);
// Construct lines and add spacers for sub requests.
segment.treeMarkers.forEach(separator => {
const classSeparator = separator ?
'lh-tree-marker lh-vert' :
'lh-tree-marker';
treeMarkeEl.append(
dom.createElement('span', classSeparator),
dom.createElement('span', 'lh-tree-marker')
);
});
const classLastChild = segment.isLastChild ?
'lh-tree-marker lh-up-right' :
'lh-tree-marker lh-vert-right';
const classHasChildren = segment.hasChildren ?
'lh-tree-marker lh-horiz-down' :
'lh-tree-marker lh-right';
treeMarkeEl.append(
dom.createElement('span', classLastChild),
dom.createElement('span', 'lh-tree-marker lh-right'),
dom.createElement('span', classHasChildren)
);
// Fill in url, host, and request size information.
const url = segment.node.request.url;
const linkEl = detailsRenderer.renderTextURL(url);
const treevalEl = dom.find('.lh-crc-node__tree-value', chainEl);
treevalEl.append(linkEl);
if (!segment.hasChildren) {
const {startTime, endTime, transferSize} = segment.node.request;
const span = dom.createElement('span', 'lh-crc-node__chain-duration');
span.textContent =
' - ' + Globals.i18n.formatMilliseconds((endTime - startTime) * 1000) + ', ';
const span2 = dom.createElement('span', 'lh-crc-node__chain-duration');
span2.textContent = Globals.i18n.formatBytesToKiB(transferSize, 0.01);
treevalEl.append(span, span2);
}
return chainEl;
}
/**
* Recursively builds a tree from segments.
* @param {DOM} dom
* @param {DocumentFragment} tmpl
* @param {CRCSegment} segment
* @param {Element} elem Parent element.
* @param {LH.Audit.Details.CriticalRequestChain} details
* @param {DetailsRenderer} detailsRenderer
*/
static buildTree(dom, tmpl, segment, elem, details, detailsRenderer) {
elem.append(CRCRenderer.createChainNode(dom, segment, detailsRenderer));
if (segment.node.children) {
for (const key of Object.keys(segment.node.children)) {
const childSegment = CRCRenderer.createSegment(segment.node.children, key,
segment.startTime, segment.transferSize, segment.treeMarkers, segment.isLastChild);
CRCRenderer.buildTree(dom, tmpl, childSegment, elem, details, detailsRenderer);
}
}
}
/**
* @param {DOM} dom
* @param {LH.Audit.Details.CriticalRequestChain} details
* @param {DetailsRenderer} detailsRenderer
* @return {Element}
*/
static render(dom, details, detailsRenderer) {
const tmpl = dom.createComponent('crc');
const containerEl = dom.find('.lh-crc', tmpl);
// Fill in top summary.
dom.find('.lh-crc-initial-nav', tmpl).textContent = Globals.strings.crcInitialNavigation;
dom.find('.lh-crc__longest_duration_label', tmpl).textContent =
Globals.strings.crcLongestDurationLabel;
dom.find('.lh-crc__longest_duration', tmpl).textContent =
Globals.i18n.formatMilliseconds(details.longestChain.duration);
// Construct visual tree.
const root = CRCRenderer.initTree(details.chains);
for (const key of Object.keys(root.tree)) {
const segment = CRCRenderer.createSegment(root.tree, key, root.startTime, root.transferSize);
CRCRenderer.buildTree(dom, tmpl, segment, containerEl, details, detailsRenderer);
}
return dom.find('.lh-crc-container', tmpl);
}
}
// Alias b/c the name is really long.
const CRCRenderer = CriticalRequestChainRenderer;
export {
CriticalRequestChainRenderer,
};
```
```c++
// Licensed to the Apache Software Foundation (ASF) under one
// or more contributor license agreements.  See the NOTICE file
// distributed with this work for additional information
// regarding copyright ownership.  The ASF licenses this file
// to you under the Apache License, Version 2.0 (the
// "License"); you may not use this file except in compliance
// with the License.  You may obtain a copy of the License at
//
//   http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing,
// software distributed under the License is distributed on an
// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
// KIND, either express or implied.  See the License for the
// specific language governing permissions and limitations
// under the License.
#include "exprs/timestamp-functions.h"
#include <ctime>
#include <iomanip>
#include <boost/date_time/posix_time/posix_time_types.hpp>
#include <boost/date_time/gregorian/gregorian_types.hpp>
#include <gutil/strings/substitute.h>
#include "exprs/anyval-util.h"
#include "runtime/datetime-simple-date-format-parser.h"
#include "runtime/timestamp-value.inline.h"
#include "runtime/timestamp-value.h"
#include "udf/udf.h"
#include "udf/udf-internal.h"
#include "common/names.h"
using boost::gregorian::greg_month;
using boost::gregorian::max_date_time;
using boost::gregorian::min_date_time;
using boost::posix_time::not_a_date_time;
using boost::posix_time::ptime;
using boost::posix_time::time_duration;
using namespace impala_udf;
using namespace strings;
typedef boost::gregorian::date Date;
typedef boost::gregorian::days Days;
typedef boost::gregorian::months Months;
typedef boost::gregorian::weeks Weeks;
typedef boost::gregorian::years Years;
typedef boost::posix_time::hours Hours;
typedef boost::posix_time::microseconds Microseconds;
typedef boost::posix_time::milliseconds Milliseconds;
typedef boost::posix_time::minutes Minutes;
typedef boost::posix_time::nanoseconds Nanoseconds;
typedef boost::posix_time::seconds Seconds;
namespace impala {
using namespace datetime_parse_util;
StringVal TimestampFunctions::StringValFromTimestamp(FunctionContext* context,
const TimestampValue& tv, const StringVal& fmt) {
void* state = context->GetFunctionState(FunctionContext::THREAD_LOCAL);
DateTimeFormatContext* dt_ctx = reinterpret_cast<DateTimeFormatContext*>(state);
if (!context->IsArgConstant(1)) {
dt_ctx->Reset(reinterpret_cast<const char*>(fmt.ptr), fmt.len);
if (!SimpleDateFormatTokenizer::Tokenize(dt_ctx, FORMAT)) {
ReportBadFormat(context, datetime_parse_util::GENERAL_ERROR, fmt, false);
return StringVal::null();
}
}
return tv.ToStringVal(context, *dt_ctx);
}
template <class TIME>
StringVal TimestampFunctions::FromUnix(FunctionContext* context, const TIME& intp) {
if (intp.is_null) return StringVal::null();
const TimestampValue tv = TimestampValue::FromUnixTime(intp.val,
context->impl()->state()->time_zone_for_unix_time_conversions());
return tv.ToStringVal(context);
}
template <class TIME>
StringVal TimestampFunctions::FromUnix(FunctionContext* context, const TIME& intp,
const StringVal& fmt) {
if (fmt.is_null || fmt.len == 0) {
ReportBadFormat(context, datetime_parse_util::GENERAL_ERROR, fmt, false);
return StringVal::null();
}
if (intp.is_null) return StringVal::null();
const TimestampValue& t = TimestampValue::FromUnixTime(intp.val,
context->impl()->state()->time_zone_for_unix_time_conversions());
return StringValFromTimestamp(context, t, fmt);
}
BigIntVal TimestampFunctions::Unix(FunctionContext* context, const StringVal& string_val,
const StringVal& fmt) {
const TimestampVal& tv_val = ToTimestamp(context, string_val, fmt);
if (tv_val.is_null) return BigIntVal::null();
const TimestampValue& tv = TimestampValue::FromTimestampVal(tv_val);
const Timezone* tz = context->impl()->state()->time_zone_for_unix_time_conversions();
time_t result;
return (tv.ToUnixTime(tz, &result)) ? BigIntVal(result) : BigIntVal::null();
}
BigIntVal TimestampFunctions::Unix(FunctionContext* context, const TimestampVal& ts_val) {
if (ts_val.is_null) return BigIntVal::null();
const TimestampValue& tv = TimestampValue::FromTimestampVal(ts_val);
const Timezone* tz = context->impl()->state()->time_zone_for_unix_time_conversions();
time_t result;
return (tv.ToUnixTime(tz, &result)) ? BigIntVal(result) : BigIntVal::null();
}
BigIntVal TimestampFunctions::Unix(FunctionContext* context) {
time_t result;
const Timezone* tz = context->impl()->state()->time_zone_for_unix_time_conversions();
if (context->impl()->state()->now()->ToUnixTime(tz, &result)) {
return BigIntVal(result);
} else {
return BigIntVal::null();
}
}
BigIntVal TimestampFunctions::UtcToUnixMicros(FunctionContext* context,
const TimestampVal& ts_val) {
if (ts_val.is_null) return BigIntVal::null();
const TimestampValue& tv = TimestampValue::FromTimestampVal(ts_val);
int64_t result;
return (tv.UtcToUnixTimeMicros(&result)) ? BigIntVal(result) : BigIntVal::null();
}
TimestampVal TimestampFunctions::UnixMicrosToUtcTimestamp(FunctionContext* context,
const BigIntVal& unix_time_micros) {
if (unix_time_micros.is_null) return TimestampVal::null();
TimestampValue tv = TimestampValue::UtcFromUnixTimeMicros(unix_time_micros.val);
TimestampVal result;
tv.ToTimestampVal(&result);
return result;
}
TimestampVal TimestampFunctions::ToTimestamp(FunctionContext* context,
const BigIntVal& bigint_val) {
if (bigint_val.is_null) return TimestampVal::null();
const Timezone* tz = context->impl()->state()->time_zone_for_unix_time_conversions();
const TimestampValue& tv = TimestampValue::FromUnixTime(bigint_val.val, tz);
TimestampVal tv_val;
tv.ToTimestampVal(&tv_val);
return tv_val;
}
TimestampVal TimestampFunctions::ToTimestamp(FunctionContext* context,
const StringVal& date, const StringVal& fmt) {
if (fmt.is_null || fmt.len == 0) {
ReportBadFormat(context, datetime_parse_util::GENERAL_ERROR, fmt, false);
return TimestampVal::null();
}
if (date.is_null || date.len == 0) return TimestampVal::null();
void* state = context->GetFunctionState(FunctionContext::THREAD_LOCAL);
DateTimeFormatContext* dt_ctx = reinterpret_cast<DateTimeFormatContext*>(state);
if (!context->IsArgConstant(1)) {
dt_ctx->Reset(reinterpret_cast<const char*>(fmt.ptr), fmt.len);
if (!SimpleDateFormatTokenizer::Tokenize(dt_ctx, PARSE)) {
ReportBadFormat(context, datetime_parse_util::GENERAL_ERROR, fmt, false);
return TimestampVal::null();
}
}
const TimestampValue& tv = TimestampValue::ParseSimpleDateFormat(
reinterpret_cast<const char*>(date.ptr), date.len, *dt_ctx);
TimestampVal tv_val;
tv.ToTimestampVal(&tv_val);
return tv_val;
}
StringVal TimestampFunctions::FromTimestamp(FunctionContext* context,
const TimestampVal& date, const StringVal& fmt) {
if (date.is_null) return StringVal::null();
const TimestampValue& tv = TimestampValue::FromTimestampVal(date);
if (!tv.HasDate()) return StringVal::null();
return StringValFromTimestamp(context, tv, fmt);
}
BigIntVal TimestampFunctions::UnixFromString(FunctionContext* context,
const StringVal& sv) {
if (sv.is_null) return BigIntVal::null();
const TimestampValue& tv = TimestampValue::ParseSimpleDateFormat(
reinterpret_cast<const char *>(sv.ptr), sv.len);
const Timezone* tz = context->impl()->state()->time_zone_for_unix_time_conversions();
time_t result;
return (tv.ToUnixTime(tz, &result)) ? BigIntVal(result) : BigIntVal::null();
}
IntVal TimestampFunctions::Year(FunctionContext* context, const TimestampVal& ts_val) {
if (ts_val.is_null) return IntVal::null();
const TimestampValue& ts_value = TimestampValue::FromTimestampVal(ts_val);
if (!ts_value.HasDate()) return IntVal::null();
return IntVal(ts_value.date().year());
}
IntVal TimestampFunctions::Month(FunctionContext* context, const TimestampVal& ts_val) {
if (ts_val.is_null) return IntVal::null();
const TimestampValue& ts_value = TimestampValue::FromTimestampVal(ts_val);
if (!ts_value.HasDate()) return IntVal::null();
return IntVal(ts_value.date().month());
}
IntVal TimestampFunctions::Quarter(FunctionContext* context, const TimestampVal& ts_val) {
if (ts_val.is_null) return IntVal::null();
const TimestampValue& ts_value = TimestampValue::FromTimestampVal(ts_val);
if (!ts_value.HasDate()) return IntVal::null();
int m = ts_value.date().month();
return IntVal((m - 1) / 3 + 1);
}
IntVal TimestampFunctions::DayOfWeek(FunctionContext* context,
const TimestampVal& ts_val) {
if (ts_val.is_null) return IntVal::null();
const TimestampValue& ts_value = TimestampValue::FromTimestampVal(ts_val);
if (!ts_value.HasDate()) return IntVal::null();
// Sql has the result in [1,7] where 1 = Sunday. Boost has 0 = Sunday.
return IntVal(ts_value.date().day_of_week() + 1);
}
IntVal TimestampFunctions::DayOfMonth(FunctionContext* context,
const TimestampVal& ts_val) {
if (ts_val.is_null) return IntVal::null();
const TimestampValue& ts_value = TimestampValue::FromTimestampVal(ts_val);
if (!ts_value.HasDate()) return IntVal::null();
return IntVal(ts_value.date().day());
}
IntVal TimestampFunctions::DayOfYear(FunctionContext* context,
const TimestampVal& ts_val) {
if (ts_val.is_null) return IntVal::null();
const TimestampValue& ts_value = TimestampValue::FromTimestampVal(ts_val);
if (!ts_value.HasDate()) return IntVal::null();
return IntVal(ts_value.date().day_of_year());
}
IntVal TimestampFunctions::WeekOfYear(FunctionContext* context,
const TimestampVal& ts_val) {
if (ts_val.is_null) return IntVal::null();
const TimestampValue& ts_value = TimestampValue::FromTimestampVal(ts_val);
if (!ts_value.HasDate()) return IntVal::null();
return IntVal(ts_value.date().week_number());
}
IntVal TimestampFunctions::Hour(FunctionContext* context, const TimestampVal& ts_val) {
if (ts_val.is_null) return IntVal::null();
const TimestampValue& ts_value = TimestampValue::FromTimestampVal(ts_val);
if (!ts_value.HasTime()) return IntVal::null();
return IntVal(ts_value.time().hours());
}
IntVal TimestampFunctions::Minute(FunctionContext* context, const TimestampVal& ts_val) {
if (ts_val.is_null) return IntVal::null();
const TimestampValue& ts_value = TimestampValue::FromTimestampVal(ts_val);
if (!ts_value.HasTime()) return IntVal::null();
return IntVal(ts_value.time().minutes());
}
IntVal TimestampFunctions::Second(FunctionContext* context, const TimestampVal& ts_val) {
if (ts_val.is_null) return IntVal::null();
const TimestampValue& ts_value = TimestampValue::FromTimestampVal(ts_val);
if (!ts_value.HasTime()) return IntVal::null();
return IntVal(ts_value.time().seconds());
}
IntVal TimestampFunctions::Millisecond(FunctionContext* context,
const TimestampVal& ts_val) {
if (ts_val.is_null) return IntVal::null();
const TimestampValue& ts_value = TimestampValue::FromTimestampVal(ts_val);
if (!ts_value.HasTime()) return IntVal::null();
const boost::posix_time::time_duration& time = ts_value.time();
return IntVal(time.total_milliseconds() - time.total_seconds() * 1000);
}
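// Illustrative example (added comment, not in the original source): for a time
// of 01:02:03.456, total_milliseconds() is 3723456 and total_seconds() is 3723,
// so 3723456 - 3723 * 1000 = 456 — just the millisecond component.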
bool StringToTimeOfDay(
const StringVal& str_val, boost::posix_time::time_duration* time) {
if (str_val.is_null) return false;
boost::gregorian::date dummy_date;
return TimestampParser::ParseSimpleDateFormat(
reinterpret_cast<char*>(str_val.ptr), str_val.len, &dummy_date, time, true);
}
IntVal TimestampFunctions::Hour(FunctionContext* context, const StringVal& str_val) {
boost::posix_time::time_duration time;
if (!StringToTimeOfDay(str_val, &time)) {
return IntVal::null();
}
return IntVal(time.hours());
}
IntVal TimestampFunctions::Minute(FunctionContext* context, const StringVal& str_val) {
boost::posix_time::time_duration time;
if (!StringToTimeOfDay(str_val, &time)) {
return IntVal::null();
}
return IntVal(time.minutes());
}
IntVal TimestampFunctions::Second(FunctionContext* context, const StringVal& str_val) {
boost::posix_time::time_duration time;
if (!StringToTimeOfDay(str_val, &time)) {
return IntVal::null();
}
return IntVal(time.seconds());
}
IntVal TimestampFunctions::Millisecond(
FunctionContext* context, const StringVal& str_val) {
boost::posix_time::time_duration time;
if (!StringToTimeOfDay(str_val, &time)) {
return IntVal::null();
}
return IntVal(time.total_milliseconds() - time.total_seconds() * 1000);
}
TimestampVal TimestampFunctions::Now(FunctionContext* context) {
const TimestampValue* now = context->impl()->state()->now();
TimestampVal return_val;
now->ToTimestampVal(&return_val);
return return_val;
}
TimestampVal TimestampFunctions::UtcTimestamp(FunctionContext* context) {
const TimestampValue* utc_timestamp = context->impl()->state()->utc_timestamp();
TimestampVal return_val;
utc_timestamp->ToTimestampVal(&return_val);
return return_val;
}
// Writes 'num' as ASCII into 'dst'. If necessary, adds leading zeros to make the ASCII
// representation exactly 'len' characters. Both 'num' and 'len' must be >= 0.
static inline void IntToChar(uint8_t* dst, int num, int len) {
DCHECK_GE(len, 0);
DCHECK_GE(num, 0);
for (int i = len - 1; i >= 0; --i) {
*(dst + i) = '0' + (num % 10);
num /= 10;
}
}
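// Illustrative example (added comment, not in the original source):
// IntToChar(dst, 7, 4) writes the bytes '0', '0', '0', '7' into dst[0..3],
// i.e. the zero-padded string "0007". Digits are emitted right-to-left.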
StringVal TimestampFunctions::ToDate(FunctionContext* context,
const TimestampVal& ts_val) {
if (ts_val.is_null) return StringVal::null();
const TimestampValue ts_value = TimestampValue::FromTimestampVal(ts_val);
// Defensively, return NULL if the timestamp does not have a date portion. Some of
// our built-in functions might incorrectly return such a malformed timestamp.
if (!ts_value.HasDate()) return StringVal::null();
StringVal result(context, 10);
// Return NULL if 'result' allocation fails inside of the StringVal constructor.
if (UNLIKELY(result.is_null)) return StringVal::null();
result.len = 10;
// Fill in year, month, and day.
IntToChar(result.ptr, ts_value.date().year(), 4);
IntToChar(result.ptr + 5, ts_value.date().month(), 2);
IntToChar(result.ptr + 8, ts_value.date().day(), 2);
// Fill in dashes.
result.ptr[7] = '-';
result.ptr[4] = '-';
return result;
}
inline unsigned short GetLastDayOfMonth(int month, int year) {
switch (month) {
case 1: return 31;
case 2: return IsLeapYear(year) ? 29 : 28;
case 3: return 31;
case 4: return 30;
case 5: return 31;
case 6: return 30;
case 7: return 31;
case 8: return 31;
case 9: return 30;
case 10: return 31;
case 11: return 30;
case 12: return 31;
default:
DCHECK(false);
return -1;
}
}
/// The functions below help workaround IMPALA-1675: if a very large interval is added
/// to a date, boost fails to throw an exception.
template <class Interval>
bool IsOverMaxInterval(const int64_t count) {
DCHECK(false) << "NYI";
return false;
}
template <>
inline bool IsOverMaxInterval<Years>(const int64_t val) {
return val < -TimestampFunctions::MAX_YEAR_INTERVAL ||
TimestampFunctions::MAX_YEAR_INTERVAL < val;
}
template <>
inline bool IsOverMaxInterval<Months>(const int64_t val) {
return val < -TimestampFunctions::MAX_MONTH_INTERVAL ||
TimestampFunctions::MAX_MONTH_INTERVAL < val;
}
template <>
inline bool IsOverMaxInterval<Weeks>(const int64_t val) {
return val < -TimestampFunctions::MAX_WEEK_INTERVAL ||
TimestampFunctions::MAX_WEEK_INTERVAL < val;
}
template <>
inline bool IsOverMaxInterval<Days>(const int64_t val) {
return val < -TimestampFunctions::MAX_DAY_INTERVAL ||
TimestampFunctions::MAX_DAY_INTERVAL < val;
}
template <>
inline bool IsOverMaxInterval<Hours>(const int64_t val) {
return val < -TimestampFunctions::MAX_HOUR_INTERVAL ||
TimestampFunctions::MAX_HOUR_INTERVAL < val;
}
template <>
inline bool IsOverMaxInterval<Minutes>(const int64_t val) {
return val < -TimestampFunctions::MAX_MINUTE_INTERVAL ||
TimestampFunctions::MAX_MINUTE_INTERVAL < val;
}
template <>
inline bool IsOverMaxInterval<Seconds>(const int64_t val) {
return val < -TimestampFunctions::MAX_SEC_INTERVAL ||
TimestampFunctions::MAX_SEC_INTERVAL < val;
}
template <>
inline bool IsOverMaxInterval<Milliseconds>(const int64_t val) {
return val < -TimestampFunctions::MAX_MILLI_INTERVAL ||
TimestampFunctions::MAX_MILLI_INTERVAL < val;
}
template <>
inline bool IsOverMaxInterval<Microseconds>(const int64_t val) {
return val < -TimestampFunctions::MAX_MICRO_INTERVAL ||
TimestampFunctions::MAX_MICRO_INTERVAL < val;
}
template <>
inline bool IsOverMaxInterval<Nanoseconds>(const int64_t val) {
return false;
}
inline bool IsUnsupportedYear(int64_t year) {
return year < TimestampFunctions::MIN_YEAR || TimestampFunctions::MAX_YEAR < year;
}
/// The AddInterval() functions provide a unified interface for adding intervals of all
/// types. To subtract, the 'interval' can be negative. 'context' and 'interval' are
/// input params, 'datetime' is an output param.
template <typename Interval>
inline void AddInterval(FunctionContext* context, int64_t interval, ptime* datetime) {
*datetime += Interval(interval);
}
/// IMPALA-2086: Avoid boost changing Feb 28th into Feb 29th when the resulting year is
/// a leap year. Doing the work rather than using boost then adjusting the boost result
/// is a little faster (and about the same speed as the default boost logic).
template <>
inline void AddInterval<Years>(FunctionContext* context, int64_t interval,
ptime* datetime) {
const Date& date = datetime->date();
int year = date.year() + interval;
if (UNLIKELY(IsUnsupportedYear(year))) {
context->AddWarning(Substitute("Add/sub year resulted in an out of range year: $0",
year).c_str());
*datetime = ptime(not_a_date_time);
return;
}
greg_month month = date.month();
int day = date.day().as_number();
if (day == 29 && month == boost::gregorian::Feb && !IsLeapYear(year)) day = 28;
*datetime = ptime(boost::gregorian::date(year, month, day), datetime->time_of_day());
}
string TimestampFunctions::ShortDayName(FunctionContext* context,
const TimestampVal& ts) {
// Returning NULL here would construct std::string from a null pointer (UB).
if (ts.is_null) return string();
IntVal dow = DayOfWeek(context, ts);
DCHECK_GT(dow.val, 0);
DCHECK_LT(dow.val, 8);
return SHORT_DAY_NAMES[CAPITALIZED][dow.val - 1];
}
StringVal TimestampFunctions::LongDayName(FunctionContext* context,
const TimestampVal& ts) {
if (ts.is_null) return StringVal::null();
IntVal dow = DayOfWeek(context, ts);
DCHECK_GT(dow.val, 0);
DCHECK_LT(dow.val, 8);
const string& day_name = DAY_NAMES[CAPITALIZED][dow.val - 1];
return StringVal(reinterpret_cast<uint8_t*>(const_cast<char*>(day_name.data())),
day_name.size());
}
string TimestampFunctions::ShortMonthName(FunctionContext* context,
const TimestampVal& ts) {
// Returning NULL here would construct std::string from a null pointer (UB).
if (ts.is_null) return string();
IntVal mth = Month(context, ts);
DCHECK_GT(mth.val, 0);
DCHECK_LT(mth.val, 13);
return SHORT_MONTH_NAMES[CAPITALIZED][mth.val - 1];
}
StringVal TimestampFunctions::LongMonthName(FunctionContext* context,
const TimestampVal& ts) {
if (ts.is_null) return StringVal::null();
IntVal mth = Month(context, ts);
DCHECK_GT(mth.val, 0);
DCHECK_LT(mth.val, 13);
const string& mn = MONTH_NAMES[CAPITALIZED][mth.val - 1];
return StringVal(reinterpret_cast<uint8_t*>(const_cast<char*>(mn.data())), mn.size());
}
namespace {
inline cctz::time_point<cctz::sys_seconds> UnixTimeToTimePoint(time_t t) {
return std::chrono::time_point_cast<cctz::sys_seconds>(
std::chrono::system_clock::from_time_t(0)) + cctz::sys_seconds(t);
}
}
StringVal TimestampFunctions::TimeOfDay(FunctionContext* context) {
const TimestampVal curr = Now(context);
if (curr.is_null) return StringVal::null();
const string& day = ShortDayName(context, curr);
const string& month = ShortMonthName(context, curr);
IntVal dayofmonth = DayOfMonth(context, curr);
IntVal hour = Hour(context, curr);
IntVal min = Minute(context, curr);
IntVal sec = Second(context, curr);
IntVal year = Year(context, curr);
string tz_name;
if (context->impl()->state()->local_time_zone() != UTCPTR) {
// Calculate 'start' time point at which query execution started.
cctz::time_point<cctz::sys_seconds> start = UnixTimeToTimePoint(
context->impl()->state()->query_ctx().start_unix_millis / MILLIS_PER_SEC);
// Find 'tz_name' time-zone abbreviation that corresponds to 'local_time_zone' at
// 'start' time point.
cctz::time_zone::absolute_lookup start_lookup =
context->impl()->state()->local_time_zone()->lookup(start);
tz_name = (start_lookup.abbr != nullptr) ? start_lookup.abbr :
context->impl()->state()->local_time_zone()->name();
} else {
tz_name = "UTC";
}
stringstream result;
result << day << " " << month << " " << setw(2) << setfill('0')
<< dayofmonth.val << " " << setw(2) << setfill('0') << hour.val << ":"
<< setw(2) << setfill('0') << min.val << ":"
<< setw(2) << setfill('0') << sec.val << " " << year.val << " "
<< tz_name;
return AnyValUtil::FromString(context, result.str());
}
IntVal TimestampFunctions::TimestampCmp(FunctionContext* context,
const TimestampVal& ts1, const TimestampVal& ts2) {
if (ts1.is_null || ts2.is_null) return IntVal::null();
const TimestampValue& ts_value1 = TimestampValue::FromTimestampVal(ts1);
const TimestampValue& ts_value2 = TimestampValue::FromTimestampVal(ts2);
if (ts_value1 > ts_value2) return 1;
if (ts_value1 < ts_value2) return -1;
return 0;
}
DoubleVal TimestampFunctions::MonthsBetween(FunctionContext* context,
const TimestampVal& ts1, const TimestampVal& ts2) {
if (ts1.is_null || ts2.is_null) return DoubleVal::null();
const TimestampValue& ts_value1 = TimestampValue::FromTimestampVal(ts1);
const TimestampValue& ts_value2 = TimestampValue::FromTimestampVal(ts2);
if (!ts_value1.HasDate() || !ts_value2.HasDate()) return DoubleVal::null();
IntVal year1 = Year(context, ts1);
IntVal year2 = Year(context, ts2);
IntVal month1 = Month(context, ts1);
IntVal month2 = Month(context, ts2);
IntVal day1 = DayOfMonth(context, ts1);
IntVal day2 = DayOfMonth(context, ts2);
int days_diff = 0;
// If both timestamps are last days of different months they don't contribute
// a fractional value to the number of months, therefore there is no need to
// calculate difference in their days.
if (!(day1.val == GetLastDayOfMonth(month1.val, year1.val) &&
day2.val == GetLastDayOfMonth(month2.val, year2.val))) {
days_diff = day1.val - day2.val;
}
double months_between = (year1.val - year2.val) * 12 +
month1.val - month2.val + (static_cast<double>(days_diff) / 31.0);
return DoubleVal(months_between);
}
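// Illustrative examples (added comments, not in the original source):
//   MONTHS_BETWEEN('2020-03-31', '2020-02-29') = 1.0, since both inputs are
//   the last day of their month, so days_diff stays 0.
//   MONTHS_BETWEEN('2020-03-15', '2020-02-14') = 1 + 1/31 ≈ 1.0323, since
//   days_diff = 15 - 14 = 1 contributes a fractional 1/31 of a month.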
TimestampVal TimestampFunctions::NextDay(FunctionContext* context,
const TimestampVal& date, const StringVal& weekday) {
if (weekday.is_null) {
context->SetError("Invalid Day: NULL");
return TimestampVal::null();
}
string weekday_str = string(reinterpret_cast<const char*>(weekday.ptr), weekday.len);
transform(weekday_str.begin(), weekday_str.end(), weekday_str.begin(), ::tolower);
const auto it = DAYNAME_MAP.find(weekday_str);
if (it == DAYNAME_MAP.end()) {
context->SetError(Substitute("Invalid Day: $0", weekday_str).c_str());
return TimestampVal::null();
}
DCHECK_GE(it->second, 0);
DCHECK_LE(it->second, 6);
int delta_days = it->second + 1 - DayOfWeek(context, date).val;
delta_days = delta_days <= 0 ? delta_days + 7 : delta_days;
DCHECK_GE(delta_days, 1);
DCHECK_LE(delta_days, 7);
IntVal delta(delta_days);
return AddSub<true, IntVal, Days, false>(context, date, delta);
}
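// Illustrative example (added comment, not in the original source): for
// next_day('2024-01-10', 'Mon') — 2024-01-10 is a Wednesday, so
// DayOfWeek() = 4 while the target Monday maps to SQL day-of-week 2;
// delta_days = 2 - 4 = -2, adjusted to 5, giving 2024-01-15, the next Monday.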
TimestampVal TimestampFunctions::LastDay(FunctionContext* context,
const TimestampVal& ts) {
if (ts.is_null) return TimestampVal::null();
const TimestampValue& timestamp = TimestampValue::FromTimestampVal(ts);
if (!timestamp.HasDate()) return TimestampVal::null();
TimestampValue tsv(timestamp.date().end_of_month(), time_duration(0,0,0,0));
TimestampVal rt_date;
tsv.ToTimestampVal(&rt_date);
return rt_date;
}
IntVal TimestampFunctions::IntMonthsBetween(FunctionContext* context,
const TimestampVal& ts1, const TimestampVal& ts2) {
if (ts1.is_null || ts2.is_null) return IntVal::null();
const TimestampValue& ts_value1 = TimestampValue::FromTimestampVal(ts1);
const TimestampValue& ts_value2 = TimestampValue::FromTimestampVal(ts2);
if (!ts_value1.HasDate() || !ts_value2.HasDate()) return IntVal::null();
DoubleVal months_between = MonthsBetween(context, ts1, ts2);
return IntVal(static_cast<int32_t>(months_between.val));
}
/// The MONTH interval is a special case. The different ways of adding a month interval
/// are:
/// 1) ADD_MONTHS(<TIMESTAMP>, <NUMBER>)
/// 2) ADD_DATE(<TIMESTAMP>, INTERVAL <NUMBER> MONTH)
/// 3) <TIMESTAMP> + INTERVAL <NUMBER> MONTH
/// For other interval types, all three produce the same result. For MONTH and case #1, if
/// the input TIMESTAMP is the last day of the month, the result will always be the last
/// day of the month. Cases #2 and #3 are equivalent and do not have the special handling
/// as case #1. In all cases, if the result would be on a day that is beyond the last day
/// of the month, the day is reduced to be the last day of the month. A value of true
/// for the 'keep_max_day' argument corresponds to case #1.
inline void AddMonths(FunctionContext* context, int64_t months, bool keep_max_day,
ptime* datetime) {
int64_t years = months / 12;
months %= 12;
const Date& date = datetime->date();
int year = date.year() + years;
int month = date.month().as_number() + months;
if (month <= 0) {
--year;
month += 12;
} else if (month > 12) {
++year;
month -= 12;
}
if (UNLIKELY(IsUnsupportedYear(year))) {
context->AddWarning(Substitute("Add/sub month resulted in an out of range year: $0",
year).c_str());
*datetime = ptime(not_a_date_time);
return;
}
DCHECK_GE(month, 1);
DCHECK_LE(month, 12);
int day = date.day().as_number();
if (keep_max_day && GetLastDayOfMonth(date.month().as_number(), date.year()) == day) {
day = GetLastDayOfMonth(month, year);
} else {
day = min(day, static_cast<int>(GetLastDayOfMonth(month, year)));
}
*datetime = ptime(boost::gregorian::date(year, month, day), datetime->time_of_day());
}
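// Illustrative examples of the keep_max_day behavior documented above (added
// comments, not in the original source):
//   ADD_MONTHS('2020-01-31', 1) → '2020-02-29' (last day in, last day out);
//   with keep_max_day = false, '2020-01-31' + INTERVAL 1 MONTH is likewise
//   clamped to '2020-02-29' because February has no 31st. By contrast,
//   ADD_MONTHS('2020-02-29', 1) → '2020-03-31', while
//   '2020-02-29' + INTERVAL 1 MONTH → '2020-03-29' (no clamping needed).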
template <>
inline void AddInterval<Months>(FunctionContext* context, int64_t interval,
ptime* datetime) {
AddMonths(context, interval, false, datetime);
}
/// The AddInterval() functions below workaround various boost bugs in adding large
/// intervals -- if the interval is too large, some sort of overflow/wrap around
/// happens resulting in an incorrect value. There is no way to predict what input value
/// will cause a wrap around. The values below were chosen arbitrarily and shown to
/// work through testing.
template <>
inline void AddInterval<Hours>(FunctionContext* context, int64_t interval,
ptime* datetime) {
int64_t weeks = interval / (7 * 24);
int64_t hours = interval % (7 * 24);
AddInterval<Weeks>(context, weeks, datetime);
*datetime += Hours(hours);
}
template <>
inline void AddInterval<Minutes>(FunctionContext* context, int64_t interval,
ptime* datetime) {
int64_t days = interval / (60 * 24);
int64_t minutes = interval % (60 * 24);
AddInterval<Days>(context, days, datetime);
*datetime += Minutes(minutes);
}
/// Workaround a boost bug in adding large second intervals.
template <>
inline void AddInterval<Seconds>(FunctionContext* context, int64_t interval,
ptime* datetime) {
int64_t days = interval / (60 * 60 * 24);
int64_t seconds = interval % (60 * 60 * 24);
AddInterval<Days>(context, days, datetime);
*datetime += Seconds(seconds);
}
/// Workaround a boost bug in adding large millisecond intervals.
template <>
inline void AddInterval<Milliseconds>(FunctionContext* context, int64_t interval,
ptime* datetime) {
int64_t seconds = interval / 1000;
int64_t milliseconds = interval % 1000;
AddInterval<Seconds>(context, seconds, datetime);
*datetime += Milliseconds(milliseconds);
}
/// Workaround a boost bug in adding large microsecond intervals.
template <>
inline void AddInterval<Microseconds>(FunctionContext* context, int64_t interval,
ptime* datetime) {
int64_t seconds = interval / 1000000;
int64_t microseconds = interval % 1000000;
AddInterval<Seconds>(context, seconds, datetime);
*datetime += Microseconds(microseconds);
}
/// Workaround a boost bug in adding large nanosecond intervals.
template <>
inline void AddInterval<Nanoseconds>(FunctionContext* context, int64_t interval,
ptime* datetime) {
int64_t seconds = interval / 1000000000;
int64_t nanoseconds = interval % 1000000000;
AddInterval<Seconds>(context, seconds, datetime);
*datetime += Nanoseconds(nanoseconds);
}
inline void DcheckAddSubResult(const TimestampVal& input, const TimestampVal& result,
bool is_add, int64_t interval) {
#ifndef NDEBUG
if (!result.is_null) {
const TimestampValue& input_value = TimestampValue::FromTimestampVal(input);
const TimestampValue& result_value = TimestampValue::FromTimestampVal(result);
if (interval == 0) {
DCHECK_EQ(result_value, input_value);
} else if (is_add == (interval > 0)) {
DCHECK_GT(result_value, input_value);
} else {
DCHECK_LT(result_value, input_value);
}
}
#endif
}
/// Template parameters:
/// 'is_add': Set to false for subtraction.
/// 'AnyIntVal': TinyIntVal, SmallIntVal, ...
/// 'Interval': A boost interval type -- Years, Months, ...
/// 'is_add_months_keep_last_day': Should only be set to true if 'Interval' is Months.
/// When true, AddMonths() will be called with 'keep_max_day' set to true.
template <bool is_add, typename AnyIntVal, typename Interval,
bool is_add_months_keep_last_day>
TimestampVal TimestampFunctions::AddSub(FunctionContext* context,
const TimestampVal& timestamp, const AnyIntVal& num_interval_units) {
DCHECK_EQ(Date(max_date_time).year(), MAX_YEAR);
DCHECK_EQ(Date(min_date_time).year(), MIN_YEAR);
if (timestamp.is_null || num_interval_units.is_null) return TimestampVal::null();
const TimestampValue& value = TimestampValue::FromTimestampVal(timestamp);
if (!value.HasDate()) return TimestampVal::null();
// Adding/subtracting boost::gregorian::dates can throw if the result exceeds the
// min/max supported dates. (Sometimes the exception is thrown lazily and calling an
// accessor functions is needed to trigger validation.)
if (UNLIKELY(IsOverMaxInterval<Interval>(num_interval_units.val))) {
context->AddWarning(Substitute("Cannot $0 interval $1: Interval value too large",
is_add ? "add" : "subtract", num_interval_units.val).c_str());
return TimestampVal::null();
}
try {
ptime datetime;
value.ToPtime(&datetime);
if (is_add_months_keep_last_day) {
AddMonths(context, is_add ? num_interval_units.val : -num_interval_units.val, true,
&datetime);
} else {
AddInterval<Interval>(context,
is_add ? num_interval_units.val : -num_interval_units.val, &datetime);
}
// Validate that the ptime is not "special" (ie not_a_date_time) and has a valid year.
// If validation fails, an exception is thrown.
datetime.date().year();
const TimestampValue result_value(datetime);
TimestampVal result_val;
result_value.ToTimestampVal(&result_val);
DcheckAddSubResult(timestamp, result_val, is_add, num_interval_units.val);
return result_val;
} catch (const std::exception& e) {
context->AddWarning(Substitute("Cannot $0 interval $1: $2",
is_add ? "add" : "subtract", num_interval_units.val, e.what()).c_str());
return TimestampVal::null();
}
}
IntVal TimestampFunctions::DateDiff(FunctionContext* context,
const TimestampVal& ts_val1,
const TimestampVal& ts_val2) {
if (ts_val1.is_null || ts_val2.is_null) return IntVal::null();
const TimestampValue& ts_value1 = TimestampValue::FromTimestampVal(ts_val1);
const TimestampValue& ts_value2 = TimestampValue::FromTimestampVal(ts_val2);
if (!ts_value1.HasDate() || !ts_value2.HasDate()) {
return IntVal::null();
}
return IntVal((ts_value1.date() - ts_value2.date()).days());
}
// Explicit template instantiation is required for proper linking. These functions
// are only indirectly called via a function pointer provided by the opcode registry
// which does not trigger implicit template instantiation.
// Must be kept in sync with common/function-registry/impala_functions.py.
template StringVal
TimestampFunctions::FromUnix<IntVal>(FunctionContext* context, const IntVal& intp, const
StringVal& fmt);
template StringVal
TimestampFunctions::FromUnix<BigIntVal>(FunctionContext* context, const BigIntVal& intp,
const StringVal& fmt);
template StringVal
TimestampFunctions::FromUnix<IntVal>(FunctionContext* context, const IntVal& intp);
template StringVal
TimestampFunctions::FromUnix<BigIntVal>(FunctionContext* context, const BigIntVal& intp);
template TimestampVal
TimestampFunctions::AddSub<true, IntVal, Years, false>(FunctionContext* context,
const TimestampVal& ts_val, const IntVal& count);
template TimestampVal
TimestampFunctions::AddSub<true, BigIntVal, Years, false>(FunctionContext* context,
const TimestampVal& ts_val, const BigIntVal& count);
template TimestampVal
TimestampFunctions::AddSub<false, IntVal, Years, false>(FunctionContext* context,
const TimestampVal& ts_val, const IntVal& count);
template TimestampVal
TimestampFunctions::AddSub<false, BigIntVal, Years, false>(FunctionContext* context,
const TimestampVal& ts_val, const BigIntVal& count);
template TimestampVal
TimestampFunctions::AddSub<true, IntVal, Months, false>(FunctionContext* context,
const TimestampVal& ts_val, const IntVal& count);
template TimestampVal
TimestampFunctions::AddSub<true, IntVal, Months, true>(FunctionContext* context,
const TimestampVal& ts_val, const IntVal& count);
template TimestampVal
TimestampFunctions::AddSub<true, BigIntVal, Months, false>(FunctionContext* context,
const TimestampVal& ts_val, const BigIntVal& count);
template TimestampVal
TimestampFunctions::AddSub<true, BigIntVal, Months, true>(FunctionContext* context,
const TimestampVal& ts_val, const BigIntVal& count);
template TimestampVal
TimestampFunctions::AddSub<false, IntVal, Months, false>(FunctionContext* context,
const TimestampVal& ts_val, const IntVal& count);
template TimestampVal
TimestampFunctions::AddSub<false, IntVal, Months, true>(FunctionContext* context,
const TimestampVal& ts_val, const IntVal& count);
template TimestampVal
TimestampFunctions::AddSub<false, BigIntVal, Months, false>(FunctionContext* context,
const TimestampVal& ts_val, const BigIntVal& count);
template TimestampVal
TimestampFunctions::AddSub<false, BigIntVal, Months, true>(FunctionContext* context,
const TimestampVal& ts_val, const BigIntVal& count);
template TimestampVal
TimestampFunctions::AddSub<true, IntVal, Weeks, false>(FunctionContext* context,
const TimestampVal& ts_val, const IntVal& count);
template TimestampVal
TimestampFunctions::AddSub<true, BigIntVal, Weeks, false>(FunctionContext* context,
const TimestampVal& ts_val, const BigIntVal& count);
template TimestampVal
TimestampFunctions::AddSub<false, IntVal, Weeks, false>(FunctionContext* context,
const TimestampVal& ts_val, const IntVal& count);
template TimestampVal
TimestampFunctions::AddSub<false, BigIntVal, Weeks, false>(FunctionContext* context,
const TimestampVal& ts_val, const BigIntVal& count);
template TimestampVal
TimestampFunctions::AddSub<true, IntVal, Days, false>(FunctionContext* context,
const TimestampVal& ts_val, const IntVal& count);
template TimestampVal
TimestampFunctions::AddSub<true, BigIntVal, Days, false>(FunctionContext* context,
const TimestampVal& ts_val, const BigIntVal& count);
template TimestampVal
TimestampFunctions::AddSub<false, IntVal, Days, false>(FunctionContext* context,
const TimestampVal& ts_val, const IntVal& count);
template TimestampVal
TimestampFunctions::AddSub<false, BigIntVal, Days, false>(FunctionContext* context,
const TimestampVal& ts_val, const BigIntVal& count);
template TimestampVal
TimestampFunctions::AddSub<true, IntVal, Hours, false>(FunctionContext* context,
const TimestampVal& ts_val, const IntVal& count);
template TimestampVal
TimestampFunctions::AddSub<true, BigIntVal, Hours, false>(FunctionContext* context,
const TimestampVal& ts_val, const BigIntVal& count);
template TimestampVal
TimestampFunctions::AddSub<false, IntVal, Hours, false>(FunctionContext* context,
const TimestampVal& ts_val, const IntVal& count);
template TimestampVal
TimestampFunctions::AddSub<false, BigIntVal, Hours, false>(FunctionContext* context,
const TimestampVal& ts_val, const BigIntVal& count);
template TimestampVal
TimestampFunctions::AddSub<true, IntVal, Minutes, false>(FunctionContext* context,
const TimestampVal& ts_val, const IntVal& count);
template TimestampVal
TimestampFunctions::AddSub<true, BigIntVal, Minutes, false>(FunctionContext* context,
const TimestampVal& ts_val, const BigIntVal& count);
template TimestampVal
TimestampFunctions::AddSub<false, IntVal, Minutes, false>(FunctionContext* context,
const TimestampVal& ts_val, const IntVal& count);
template TimestampVal
TimestampFunctions::AddSub<false, BigIntVal, Minutes, false>(FunctionContext* context,
const TimestampVal& ts_val, const BigIntVal& count);
template TimestampVal
TimestampFunctions::AddSub<true, IntVal, Seconds, false>(FunctionContext* context,
const TimestampVal& ts_val, const IntVal& count);
template TimestampVal
TimestampFunctions::AddSub<true, BigIntVal, Seconds, false>(FunctionContext* context,
const TimestampVal& ts_val, const BigIntVal& count);
template TimestampVal
TimestampFunctions::AddSub<false, IntVal, Seconds, false>(FunctionContext* context,
const TimestampVal& ts_val, const IntVal& count);
template TimestampVal
TimestampFunctions::AddSub<false, BigIntVal, Seconds, false>(FunctionContext* context,
const TimestampVal& ts_val, const BigIntVal& count);
template TimestampVal
TimestampFunctions::AddSub<true, IntVal, Milliseconds, false>(FunctionContext* context,
const TimestampVal& ts_val, const IntVal& count);
template TimestampVal
TimestampFunctions::AddSub<true, BigIntVal, Milliseconds, false>(FunctionContext* context,
const TimestampVal& ts_val, const BigIntVal& count);
template TimestampVal
TimestampFunctions::AddSub<false, IntVal, Milliseconds, false>(FunctionContext* context,
const TimestampVal& ts_val, const IntVal& count);
template TimestampVal
TimestampFunctions::AddSub<false, BigIntVal, Milliseconds, false>(
FunctionContext* context, const TimestampVal& ts_val, const BigIntVal& count);
template TimestampVal
TimestampFunctions::AddSub<true, IntVal, Microseconds, false>(FunctionContext* context,
const TimestampVal& ts_val, const IntVal& count);
template TimestampVal
TimestampFunctions::AddSub<true, BigIntVal, Microseconds, false>(FunctionContext* context,
const TimestampVal& ts_val, const BigIntVal& count);
template TimestampVal
TimestampFunctions::AddSub<false, IntVal, Microseconds, false>(FunctionContext* context,
const TimestampVal& ts_val, const IntVal& count);
template TimestampVal
TimestampFunctions::AddSub<false, BigIntVal, Microseconds, false>(
FunctionContext* context, const TimestampVal& ts_val, const BigIntVal& count);
template TimestampVal
TimestampFunctions::AddSub<true, IntVal, Nanoseconds, false>(FunctionContext* context,
const TimestampVal& ts_val, const IntVal& count);
template TimestampVal
TimestampFunctions::AddSub<true, BigIntVal, Nanoseconds, false>(FunctionContext* context,
const TimestampVal& ts_val, const BigIntVal& count);
template TimestampVal
TimestampFunctions::AddSub<false, IntVal, Nanoseconds, false>(FunctionContext* context,
const TimestampVal& ts_val, const IntVal& count);
template TimestampVal
TimestampFunctions::AddSub<false, BigIntVal, Nanoseconds, false>(FunctionContext* context,
const TimestampVal& ts_val, const BigIntVal& count);
}
```
|
The Port Deposit Bridge (also known as the Susquehanna River Bridge or Rock Run Toll Bridge) was the earliest bridge crossing of the Susquehanna River below Columbia, Pennsylvania, providing the first reliable link between the northern and southern United States. The bridge was also the fifth and last of Theodore Burr's Susquehanna crossings. The wooden covered bridge was constructed just north of Port Deposit, Maryland, between 1817 and 1818 and lasted until 1857. It was built and operated by the Susquehanna Bridge and Bank Company.
History
The site for the bridge was surveyed in 1813. The bridge crossed from north of Port Deposit to just below Rock Run stream in Harford County, passing over Steel, Roberts and Wood Islands. Construction started in 1817 under Theodore Burr, who had just completed work on four Susquehanna bridges in Pennsylvania; the design used his Burr arch truss. "This ultimate achievement of Burr's on the Susquehanna, having in all eighteen 200-foot trussed arch wooden spans, eight between the west shore and a first island, two between that and a second island, and eight more between that and the east shore, and a total length of 4,170 feet, was to be performed in two years. For the years between 1812 and 1818 Burr's trussed arch had been an increasing triumph." The bridge opened in 1818.
On 1 January 1823, friction from an iron-shod sleigh caused a fire which burned large portions of the bridge. It was rebuilt and back in operation by 1828 or 1830. The bridge operated until 1854, when a herd of cattle caused two spans to collapse. It remained closed, and a large section of the eastern span was destroyed by the spring flood of 1857. It was superseded by the Conowingo Bridge further upstream, which reopened in 1859.
The ruins of the abutments are still clearly visible from the western shore or from above. The Jersey Toll House, located at the southwestern end of the bridge, still exists as part of Susquehanna State Park.
See also
List of crossings of the Susquehanna River
References
External links
Drawing of bridge from Susquehanna Bank and Bridge Co. $5 Note
Covered bridges in Maryland
Bridges over the Susquehanna River
Bridges completed in 1818
Wooden bridges in Maryland
Road bridges in Maryland
Former toll bridges in Maryland
Burr Truss bridges in the United States
Bridges in Cecil County, Maryland
|
```xml
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="14.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<ItemGroup Label="ProjectConfigurations">
<ProjectConfiguration Include="Debug|Win32">
<Configuration>Debug</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Debug|x64">
<Configuration>Debug</Configuration>
<Platform>x64</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Release|Win32">
<Configuration>Release</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="Release|x64">
<Configuration>Release</Configuration>
<Platform>x64</Platform>
</ProjectConfiguration>
</ItemGroup>
<PropertyGroup Label="Globals">
<ProjectGuid>{8713DCC9-E21C-485A-99E5-B8D1E5AD91B6}</ProjectGuid>
<Keyword>Win32Proj</Keyword>
<RootNamespace>LitColumns</RootNamespace>
<WindowsTargetPlatformVersion>10.0.10240.0</WindowsTargetPlatformVersion>
</PropertyGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.Default.props" />
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'" Label="Configuration">
<ConfigurationType>Application</ConfigurationType>
<UseDebugLibraries>true</UseDebugLibraries>
<PlatformToolset>v140</PlatformToolset>
<CharacterSet>Unicode</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'" Label="Configuration">
<ConfigurationType>Application</ConfigurationType>
<UseDebugLibraries>true</UseDebugLibraries>
<PlatformToolset>v140</PlatformToolset>
<CharacterSet>Unicode</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'" Label="Configuration">
<ConfigurationType>Application</ConfigurationType>
<UseDebugLibraries>false</UseDebugLibraries>
<PlatformToolset>v140</PlatformToolset>
<WholeProgramOptimization>true</WholeProgramOptimization>
<CharacterSet>Unicode</CharacterSet>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'" Label="Configuration">
<ConfigurationType>Application</ConfigurationType>
<UseDebugLibraries>false</UseDebugLibraries>
<PlatformToolset>v140</PlatformToolset>
<WholeProgramOptimization>true</WholeProgramOptimization>
<CharacterSet>Unicode</CharacterSet>
</PropertyGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.props" />
<ImportGroup Label="ExtensionSettings">
</ImportGroup>
<ImportGroup Label="Shared">
</ImportGroup>
<ImportGroup Label="PropertySheets" Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<ImportGroup Label="PropertySheets" Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<ImportGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'" Label="PropertySheets">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
</ImportGroup>
<PropertyGroup Label="UserMacros" />
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">
<LinkIncremental>true</LinkIncremental>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'">
<LinkIncremental>true</LinkIncremental>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
<LinkIncremental>false</LinkIncremental>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'">
<LinkIncremental>false</LinkIncremental>
</PropertyGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">
<ClCompile>
<PrecompiledHeader>
</PrecompiledHeader>
<WarningLevel>Level3</WarningLevel>
<Optimization>Disabled</Optimization>
<PreprocessorDefinitions>WIN32;_DEBUG;_WINDOWS;%(PreprocessorDefinitions)</PreprocessorDefinitions>
</ClCompile>
<Link>
<SubSystem>Windows</SubSystem>
<GenerateDebugInformation>true</GenerateDebugInformation>
</Link>
</ItemDefinitionGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Debug|x64'">
<ClCompile>
<PrecompiledHeader>
</PrecompiledHeader>
<WarningLevel>Level3</WarningLevel>
<Optimization>Disabled</Optimization>
<PreprocessorDefinitions>WIN32;_DEBUG;_WINDOWS;%(PreprocessorDefinitions)</PreprocessorDefinitions>
</ClCompile>
<Link>
<SubSystem>Windows</SubSystem>
<GenerateDebugInformation>true</GenerateDebugInformation>
</Link>
</ItemDefinitionGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
<ClCompile>
<WarningLevel>Level3</WarningLevel>
<PrecompiledHeader>
</PrecompiledHeader>
<Optimization>MaxSpeed</Optimization>
<FunctionLevelLinking>true</FunctionLevelLinking>
<IntrinsicFunctions>true</IntrinsicFunctions>
<PreprocessorDefinitions>WIN32;NDEBUG;_WINDOWS;%(PreprocessorDefinitions)</PreprocessorDefinitions>
</ClCompile>
<Link>
<SubSystem>Windows</SubSystem>
<GenerateDebugInformation>true</GenerateDebugInformation>
<EnableCOMDATFolding>true</EnableCOMDATFolding>
<OptimizeReferences>true</OptimizeReferences>
</Link>
</ItemDefinitionGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Release|x64'">
<ClCompile>
<WarningLevel>Level3</WarningLevel>
<PrecompiledHeader>
</PrecompiledHeader>
<Optimization>MaxSpeed</Optimization>
<FunctionLevelLinking>true</FunctionLevelLinking>
<IntrinsicFunctions>true</IntrinsicFunctions>
<PreprocessorDefinitions>WIN32;NDEBUG;_WINDOWS;%(PreprocessorDefinitions)</PreprocessorDefinitions>
</ClCompile>
<Link>
<SubSystem>Windows</SubSystem>
<GenerateDebugInformation>true</GenerateDebugInformation>
<EnableCOMDATFolding>true</EnableCOMDATFolding>
<OptimizeReferences>true</OptimizeReferences>
</Link>
</ItemDefinitionGroup>
<ItemGroup>
<ClCompile Include="..\..\Common\d3dApp.cpp" />
<ClCompile Include="..\..\Common\d3dUtil.cpp" />
<ClCompile Include="..\..\Common\DDSTextureLoader.cpp" />
<ClCompile Include="..\..\Common\GameTimer.cpp" />
<ClCompile Include="..\..\Common\GeometryGenerator.cpp" />
<ClCompile Include="..\..\Common\MathHelper.cpp" />
<ClCompile Include="FrameResource.cpp" />
<ClCompile Include="LitColumnsApp.cpp" />
</ItemGroup>
<ItemGroup>
<ClInclude Include="..\..\Common\d3dApp.h" />
<ClInclude Include="..\..\Common\d3dUtil.h" />
<ClInclude Include="..\..\Common\d3dx12.h" />
<ClInclude Include="..\..\Common\DDSTextureLoader.h" />
<ClInclude Include="..\..\Common\GameTimer.h" />
<ClInclude Include="..\..\Common\GeometryGenerator.h" />
<ClInclude Include="..\..\Common\MathHelper.h" />
<ClInclude Include="..\..\Common\UploadBuffer.h" />
<ClInclude Include="FrameResource.h" />
</ItemGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.targets" />
<ImportGroup Label="ExtensionTargets">
</ImportGroup>
</Project>
```
|
Cesare Candi (born Minerbio, near Bologna, 5 March 1869; died Genoa, 29 September 1947) was an Italian instrument maker.
References
External links
Alberto Giordano&C. The Genoese line
La Liuteria Italiana / Italian Violin Making in the 1800s and 1900s - Umberto Azzolina
I Maestri del Novecento - Carlo Vettori
La Liuteria Lombarda del '900 - Roberto Codazzi, Cinzia Manfredini 2002
Dictionary of 20th Century Italian Violin Makers - Marlin Brinser 1978
Liuteria Parmense
View a fine example of Cesare Candi - violin 1901 close-up:
Cesare Candi - violin 1901 top
Cesare Candi - violin 1901 back
Cesare Candi - violin 1901 scroll front
Cesare Candi - violin 1901 scroll back
Cesare Candi - violin 1901 scroll right
Living Museum
History of the Bolognese school of violin makers
1869 births
1947 deaths
Italian luthiers
Businesspeople from Genoa
|
4:4:4 may refer to:
Digital images or video in which all color components have the same sampling rate, thus not using chroma subsampling
Another name for the RGB color space
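The first meaning can be made concrete with a little arithmetic: in J:a:b notation, 4:4:4 keeps a chroma sample for every luma sample, while schemes such as 4:2:2 and 4:2:0 reduce chroma resolution. A rough sketch (the helper function is illustrative, not from any particular library):

```python
def chroma_plane_size(width, height, scheme):
    """Return (chroma_width, chroma_height) for a few common J:a:b schemes."""
    # Horizontal and vertical chroma subsampling factors per scheme.
    factors = {
        "4:4:4": (1, 1),  # no subsampling: chroma planes match the luma plane
        "4:2:2": (2, 1),  # half horizontal chroma resolution
        "4:2:0": (2, 2),  # half horizontal and half vertical resolution
    }
    h, v = factors[scheme]
    return width // h, height // v

# For a 1920x1080 frame, 4:4:4 keeps full-resolution chroma planes.
print(chroma_plane_size(1920, 1080, "4:4:4"))  # (1920, 1080)
print(chroma_plane_size(1920, 1080, "4:2:0"))  # (960, 540)
```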
See also
444 (disambiguation)
4:44 (disambiguation)
|
```kotlin
/*
 * Use of this source code is governed by the Apache 2.0 license
 * that can be found in the LICENSE file.
 */
// All classes and methods should be used in tests
@file:Suppress("UNUSED")
package conversions
import kotlin.native.concurrent.freeze
import kotlin.native.concurrent.isFrozen
import kotlin.native.internal.ObjCErrorException
import kotlin.native.ref.WeakReference
import kotlin.properties.ReadWriteProperty
import kotlin.reflect.KClass
import kotlin.reflect.KProperty
import kotlin.test.*
import kotlinx.cinterop.*
// Ensure loaded function IR classes aren't ordered by arity:
internal fun referenceFunction1(block: (Any?) -> Unit) {}
// Constants
const val dbl: Double = 3.14
const val flt: Float = 2.73F
const val integer: Int = 42
const val longInt: Long = 1984
// Vars
var intVar: Int = 451
var str = "Kotlin String"
var strAsAny: Any = "Kotlin String as Any"
// MIN/MAX values as Numbers
var minDoubleVal: kotlin.Number = Double.MIN_VALUE
var maxDoubleVal: kotlin.Number = Double.MAX_VALUE
// Infinities and NaN
val nanDoubleVal: Double = Double.NaN
val nanFloatVal: Float = Float.NaN
val infDoubleVal: Double = Double.POSITIVE_INFINITY
val infFloatVal: Float = Float.NEGATIVE_INFINITY
private fun <T> T.toNullable(): T? = this
fun box(booleanValue: Boolean) = booleanValue.toNullable()
fun box(byteValue: Byte) = byteValue.toNullable()
fun box(shortValue: Short) = shortValue.toNullable()
fun box(intValue: Int) = intValue.toNullable()
fun box(longValue: Long) = longValue.toNullable()
fun box(uByteValue: UByte) = uByteValue.toNullable()
fun box(uShortValue: UShort) = uShortValue.toNullable()
fun box(uIntValue: UInt) = uIntValue.toNullable()
fun box(uLongValue: ULong) = uLongValue.toNullable()
fun box(floatValue: Float) = floatValue.toNullable()
fun box(doubleValue: Double) = doubleValue.toNullable()
private inline fun <reified T> ensureEquals(actual: T?, expected: T) {
if (actual !is T) error(T::class)
if (actual != expected) error(T::class)
}
fun ensureEqualBooleans(actual: Boolean?, expected: Boolean) = ensureEquals(actual, expected)
fun ensureEqualBytes(actual: Byte?, expected: Byte) = ensureEquals(actual, expected)
fun ensureEqualShorts(actual: Short?, expected: Short) = ensureEquals(actual, expected)
fun ensureEqualInts(actual: Int?, expected: Int) = ensureEquals(actual, expected)
fun ensureEqualLongs(actual: Long?, expected: Long) = ensureEquals(actual, expected)
fun ensureEqualUBytes(actual: UByte?, expected: UByte) = ensureEquals(actual, expected)
fun ensureEqualUShorts(actual: UShort?, expected: UShort) = ensureEquals(actual, expected)
fun ensureEqualUInts(actual: UInt?, expected: UInt) = ensureEquals(actual, expected)
fun ensureEqualULongs(actual: ULong?, expected: ULong) = ensureEquals(actual, expected)
fun ensureEqualFloats(actual: Float?, expected: Float) = ensureEquals(actual, expected)
fun ensureEqualDoubles(actual: Double?, expected: Double) = ensureEquals(actual, expected)
// Boolean
val boolVal: Boolean = true
val boolAnyVal: Any = false
// Lists
val numbersList: List<Number> = listOf(1.toByte(), 2.toShort(), 13)
val anyList: List<Any> = listOf("Str", 42, 3.14, true)
// lateinit
lateinit var lateinitIntVar: Any
// lazy
val lazyVal: String by lazy {
println("Lazy value initialization")
"Lazily initialized string"
}
// Delegation
var delegatedGlobalArray: Array<String> by DelegateClass()
class DelegateClass: ReadWriteProperty<Nothing?, Array<String>> {
private var holder: Array<String> = arrayOf("property")
override fun getValue(thisRef: Nothing?, property: KProperty<*>): Array<String> {
return arrayOf("Delegated", "global", "array") + holder
}
override fun setValue(thisRef: Nothing?, property: KProperty<*>, value: Array<String>) {
holder = value
}
}
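// Usage sketch (illustrative; not part of the original test declarations):
// reading delegatedGlobalArray goes through DelegateClass.getValue, which
// prepends three fixed elements to the stored holder, while assignment is
// routed to setValue.
fun delegatedArrayDemo(): List<String> {
    delegatedGlobalArray = arrayOf("custom")  // setValue(): holder = ["custom"]
    return delegatedGlobalArray.toList()      // getValue(): prefix + holder
}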
// Getter with delegation
val delegatedList: List<String>
get() = delegatedGlobalArray.toList()
// Null
val nullVal: Any? = null
var nullVar: String? = ""
// Any
var anyValue: Any = "Str"
// Functions
fun emptyFun() { }
fun strFun(): String = "fooStr"
fun argsFun(i: Int, l: Long, d: Double, s: String): Any = s + i + l + d
fun funArgument(foo: () -> String): String = foo()
// Generic functions
fun <T, R> genericFoo(t: T, foo: (T) -> R): R = foo(t)
fun <T : Number, R : T> fooGenericNumber(r: R, foo: (T) -> Number): Number = foo(r)
fun <T> varargToList(vararg args: T): List<T> = args.toList()
// Extensions
fun String.subExt(i: Int): String {
return if (i < this.length) this[i].toString() else "nothing"
}
fun Any?.toString(): String = this?.toString() ?: "null"
fun Any?.print() = println(this.toString())
fun Char.boxChar(): Char? = this
fun Char?.isA(): Boolean = (this == 'A')
// Lambdas
val sumLambda = { x: Int, y: Int -> x + y }
// Inheritance
interface I {
fun iFun(): String = "I::iFun"
}
fun I.iFunExt() = iFun()
private interface PI {
fun piFun(): Any
fun iFun(): String = "PI::iFun"
}
class DefaultInterfaceExt : I
open class OpenClassI : I {
override fun iFun(): String = "OpenClassI::iFun"
}
class FinalClassExtOpen : OpenClassI() {
override fun iFun(): String = "FinalClassExtOpen::iFun"
}
open class MultiExtClass : OpenClassI(), PI {
override fun piFun(): Any {
return 42
}
override fun iFun(): String = super<PI>.iFun()
}
open class ConstrClass(open val i: Int, val s: String, val a: Any = "AnyS") : OpenClassI()
class ExtConstrClass(override val i: Int) : ConstrClass(i, "String") {
override fun iFun(): String = "ExtConstrClass::iFun::$i-$s-$a"
}
// Enum
enum class Enumeration(val enumValue: Int) {
ANSWER(42), YEAR(1984), TEMPERATURE(451)
}
fun passEnum(): Enumeration {
return Enumeration.ANSWER
}
fun receiveEnum(e: Int) {
println("ENUM got: ${get(e).enumValue}")
}
fun get(value: Int): Enumeration {
return Enumeration.values()[value]
}
// Data class values and generated properties: component# and toString()
data class TripleVals<T>(val first: T, val second: T, val third: T)
data class TripleVars<T>(var first: T, var second: T, var third: T) {
override fun toString(): String {
return "[$first, $second, $third]"
}
}
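// Usage sketch (illustrative; not part of the original test declarations):
// the data classes above get compiler-generated componentN() and copy();
// TripleVars additionally overrides toString().
fun tripleValsDemo(): String {
    val (a, b, c) = TripleVals(1, 2, 3)                      // component1..component3
    val vars = TripleVars("x", "y", "z").copy(second = "Y")  // generated copy()
    return "$a$b$c $vars"                                    // "123 [x, Y, z]"
}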
open class WithCompanionAndObject {
companion object {
val str = "String"
var named: I? = Named
}
object Named : OpenClassI() {
override fun iFun(): String = "WithCompanionAndObject.Named::iFun"
}
}
fun getCompanionObject() = WithCompanionAndObject.Companion
fun getNamedObject() = WithCompanionAndObject.Named
fun getNamedObjectInterface(): OpenClassI = WithCompanionAndObject.Named
typealias EE = Enumeration
fun EE.getAnswer() : EE = Enumeration.ANSWER
inline class IC1(val value: Int)
inline class IC2(val value: String)
inline class IC3(val value: TripleVals<Any?>?)
fun box(ic1: IC1): Any = ic1
fun box(ic2: IC2): Any = ic2
fun box(ic3: IC3): Any = ic3
fun concatenateInlineClassValues(ic1: IC1, ic1N: IC1?, ic2: IC2, ic2N: IC2?, ic3: IC3, ic3N: IC3?): String =
"${ic1.value} ${ic1N?.value} ${ic2.value} ${ic2N?.value} ${ic3.value} ${ic3N?.value}"
fun IC1.getValue1() = this.value
fun IC1?.getValueOrNull1() = this?.value
fun IC2.getValue2() = value
fun IC2?.getValueOrNull2() = this?.value
fun IC3.getValue3() = value
fun IC3?.getValueOrNull3() = this?.value
fun isFrozen(obj: Any): Boolean = obj.isFrozen
fun kotlinLambda(block: (Any) -> Any): Any = block
fun multiply(int: Int, long: Long) = int * long
class MyException : Exception()
class MyError : Error()
@Throws(MyException::class, MyError::class)
fun throwException(error: Boolean): Unit {
throw if (error) MyError() else MyException()
}
interface SwiftOverridableMethodsWithThrows {
@Throws(MyException::class) fun unit(): Unit
@Throws(MyException::class) fun nothing(): Nothing
@Throws(MyException::class) fun any(): Any
@Throws(MyException::class) fun block(): () -> Int
}
interface MethodsWithThrows : SwiftOverridableMethodsWithThrows {
@Throws(MyException::class) fun nothingN(): Nothing?
@Throws(MyException::class) fun anyN(): Any?
@Throws(MyException::class) fun blockN(): (() -> Int)?
@Throws(MyException::class) fun pointer(): CPointer<*>
@Throws(MyException::class) fun pointerN(): CPointer<*>?
@Throws(MyException::class) fun int(): Int
@Throws(MyException::class) fun longN(): Long?
@Throws(MyException::class) fun double(): Double
interface UnitCaller {
@Throws(MyException::class) fun call(methods: MethodsWithThrows): Unit
}
}
open class Throwing : MethodsWithThrows {
@Throws(MyException::class) constructor(doThrow: Boolean) {
if (doThrow) throw MyException()
}
override fun unit(): Unit = throw MyException()
override fun nothing(): Nothing = throw MyException()
override fun nothingN(): Nothing? = throw MyException()
override fun any(): Any = throw MyException()
override fun anyN(): Any? = throw MyException()
override fun block(): () -> Int = throw MyException()
override fun blockN(): (() -> Int)? = throw MyException()
override fun pointer(): CPointer<*> = throw MyException()
override fun pointerN(): CPointer<*>? = throw MyException()
override fun int(): Int = throw MyException()
override fun longN(): Long? = throw MyException()
override fun double(): Double = throw MyException()
}
class NotThrowing : MethodsWithThrows {
@Throws(MyException::class) constructor() {}
override fun unit(): Unit {}
override fun nothing(): Nothing = throw MyException()
override fun nothingN(): Nothing? = null
override fun any(): Any = Any()
override fun anyN(): Any? = Any()
override fun block(): () -> Int = { 42 }
override fun blockN(): (() -> Int)? = null
override fun pointer(): CPointer<*> = 1L.toCPointer<COpaque>()!!
override fun pointerN(): CPointer<*>? = null
override fun int(): Int = 42
override fun longN(): Long? = null
override fun double(): Double = 3.14
}
@Throws(Throwable::class)
fun testSwiftThrowing(methods: SwiftOverridableMethodsWithThrows) = with(methods) {
assertSwiftThrowing { unit() }
assertSwiftThrowing { nothing() }
assertSwiftThrowing { any() }
assertSwiftThrowing { block() }
}
private inline fun assertSwiftThrowing(block: () -> Unit) =
assertFailsWith<ObjCErrorException>(block = block)
@Throws(Throwable::class)
fun testSwiftNotThrowing(methods: SwiftOverridableMethodsWithThrows) = with(methods) {
unit()
assertEquals(42, any())
assertEquals(17, block()())
}
@Throws(MyError::class)
fun callUnit(methods: SwiftOverridableMethodsWithThrows) = methods.unit()
@Throws(Throwable::class)
fun callUnitCaller(caller: MethodsWithThrows.UnitCaller, methods: MethodsWithThrows) {
assertFailsWith<MyException> { caller.call(methods) }
}
interface ThrowsWithBridgeBase {
@Throws(MyException::class)
fun plusOne(x: Int): Any
}
abstract class ThrowsWithBridge : ThrowsWithBridgeBase {
abstract override fun plusOne(x: Int): Int
}
@Throws(Throwable::class)
fun testSwiftThrowing(test: ThrowsWithBridgeBase, flag: Boolean) {
assertFailsWith<ObjCErrorException> {
if (flag) {
test.plusOne(0)
} else {
val test1 = test as ThrowsWithBridge
val ignore: Int = test1.plusOne(1)
}
}
}
@Throws(Throwable::class)
fun testSwiftNotThrowing(test: ThrowsWithBridgeBase) {
assertEquals(3, test.plusOne(2))
val test1 = test as ThrowsWithBridge
assertEquals<Int>(4, test1.plusOne(3))
}
fun Any.same() = this
// path_to_url
val PROPERTY_NAME_MUST_NOT_BE_ALTERED_BY_SWIFT = 111
// path_to_url
class Deeply {
class Nested {
class Type {
val thirtyTwo = 32
}
interface IType
}
}
class WithGenericDeeply() {
class Nested {
class Type<T> {
val thirtyThree = 33
}
}
}
// path_to_url
class TypeOuter {
class Type {
val thirtyFour = 34
}
}
data class CKeywords(val float: Float, val `enum`: Int, var goto: Boolean)
interface Base1 {
fun same(value: Int?): Int?
}
interface ExtendedBase1 : Base1 {
override fun same(value: Int?): Int?
}
interface Base2 {
fun same(value: Int?): Int?
}
internal interface Base3 {
fun same(value: Int?): Int
}
open class Base23 : Base2, Base3 {
override fun same(value: Int?): Int = error("should not reach here")
}
fun call(base1: Base1, value: Int?) = base1.same(value)
fun call(extendedBase1: ExtendedBase1, value: Int?) = extendedBase1.same(value)
fun call(base2: Base2, value: Int?) = base2.same(value)
fun call(base3: Any, value: Int?) = (base3 as Base3).same(value)
fun call(base23: Base23, value: Int?) = base23.same(value)
interface Transform<T, R> {
fun map(value: T): R
}
interface TransformWithDefault<T> : Transform<T, T> {
override fun map(value: T): T = value
}
class TransformInheritingDefault<T> : TransformWithDefault<T>
interface TransformIntString {
fun map(intValue: Int): String
}
abstract class TransformIntToString : Transform<Int, String>, TransformIntString {
    abstract override fun map(intValue: Int): String
}
open class TransformIntToDecimalString : TransformIntToString() {
override fun map(intValue: Int): String = intValue.toString()
}
private class TransformDecimalStringToInt : Transform<String, Int> {
override fun map(stringValue: String): Int = stringValue.toInt()
}
fun createTransformDecimalStringToInt(): Transform<String, Int> = TransformDecimalStringToInt()
open class TransformIntToLong : Transform<Int, Long> {
override fun map(value: Int): Long = value.toLong()
}
class GH2931 {
class Data
class Holder {
val data = Data()
init {
freeze()
}
}
}
class GH2945(var errno: Int) {
fun testErrnoInSelector(p: Int, errno: Int) = p + errno
}
class GH2830 {
interface I
private class PrivateImpl : I
fun getI(): Any = PrivateImpl()
}
class GH2959 {
interface I {
val id: Int
}
private class PrivateImpl(override val id: Int) : I
fun getI(id: Int): List<I> = listOf(PrivateImpl(id))
}
fun runUnitBlock(block: () -> Unit): Boolean {
val blockAny: () -> Any? = block
return blockAny() === Unit
}
fun asUnitBlock(block: () -> Any?): () -> Unit = { block() }
fun runNothingBlock(block: () -> Nothing) = try {
block()
false
} catch (e: Throwable) {
true
}
fun asNothingBlock(block: () -> Any?): () -> Nothing = {
block()
TODO()
}
fun getNullBlock(): (() -> Unit)? = null
fun isBlockNull(block: (() -> Unit)?): Boolean = block == null
interface IntBlocks<T> {
fun getPlusOneBlock(): T
fun callBlock(argument: Int, block: T): Int
}
object IntBlocksImpl : IntBlocks<(Int) -> Int> {
override fun getPlusOneBlock(): (Int) -> Int = { it: Int -> it + 1 }
override fun callBlock(argument: Int, block: (Int) -> Int): Int = block(argument)
}
interface UnitBlockCoercion<T : Any> {
fun coerce(block: () -> Unit): T
fun uncoerce(block: T): () -> Unit
}
object UnitBlockCoercionImpl : UnitBlockCoercion<() -> Unit> {
override fun coerce(block: () -> Unit): () -> Unit = block
override fun uncoerce(block: () -> Unit): () -> Unit = block
}
fun isFunction(obj: Any?): Boolean = obj is Function<*>
fun isFunction0(obj: Any?): Boolean = obj is Function0<*>
abstract class MyAbstractList : List<Any?>
fun takeForwardDeclaredClass(obj: objcnames.classes.ForwardDeclaredClass) {}
fun takeForwardDeclaredProtocol(obj: objcnames.protocols.ForwardDeclaredProtocol) {}
class TestKClass {
fun getKotlinClass(clazz: ObjCClass) = getOriginalKotlinClass(clazz)
fun getKotlinClass(protocol: ObjCProtocol) = getOriginalKotlinClass(protocol)
fun isTestKClass(kClass: KClass<*>): Boolean = (kClass == TestKClass::class)
fun isI(kClass: KClass<*>): Boolean = (kClass == TestKClass.I::class)
interface I
}
// path_to_url
interface ForwardI2 : ForwardI1
interface ForwardI1 {
fun getForwardI2(): ForwardI2
}
abstract class ForwardC2 : ForwardC1()
abstract class ForwardC1 {
abstract fun getForwardC2(): ForwardC2
}
interface TestSR10177Workaround
interface TestClashes1 {
val clashingProperty: Int
}
interface TestClashes2 {
val clashingProperty: Any
val clashingProperty_: Any
}
class TestClashesImpl : TestClashes1, TestClashes2 {
override val clashingProperty: Int
get() = 1
override val clashingProperty_: Int
get() = 2
}
class TestInvalidIdentifiers {
class `$Foo`
class `Bar$`
fun `a$d$d`(`$1`: Int, `2`: Int, `3`: Int): Int = `$1` + `2` + `3`
var `$status`: String = ""
enum class E(val value: Int) {
`4$`(4),
`5$`(5),
`_`(6),
`__`(7)
}
companion object `Companion$` {
val `42` = 42
}
val `$` = '$'
val `_` = '_'
}
@Suppress("UNUSED_PARAMETER")
open class TestDeprecation() {
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) open class OpenHidden : TestDeprecation()
@Suppress("DEPRECATION_ERROR") class ExtendingHidden : OpenHidden() {
class Nested
}
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) interface HiddenInterface {
fun effectivelyHidden(): Any
}
@Suppress("DEPRECATION_ERROR") open class ImplementingHidden : Any(), HiddenInterface {
override fun effectivelyHidden(): Int = -1
}
@Suppress("DEPRECATION_ERROR")
fun callEffectivelyHidden(obj: Any): Int = (obj as HiddenInterface).effectivelyHidden() as Int
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) class Hidden : TestDeprecation() {
open class Nested {
class Nested
inner class Inner
}
inner class Inner {
inner class Inner
}
}
@Suppress("DEPRECATION_ERROR") class ExtendingNestedInHidden : Hidden.Nested()
@Suppress("DEPRECATION_ERROR") fun getHidden() = Hidden()
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) constructor(hidden: Byte) : this()
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) fun hidden() {}
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) val hiddenVal: Any? = null
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) var hiddenVar: Any? = null
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) open fun openHidden() {}
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) open val openHiddenVal: Any? = null
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) open var openHiddenVar: Any? = null
@Deprecated("error", level = DeprecationLevel.ERROR) open class OpenError : TestDeprecation()
@Suppress("DEPRECATION_ERROR") class ExtendingError : OpenError()
@Deprecated("error", level = DeprecationLevel.ERROR) interface ErrorInterface
@Suppress("DEPRECATION_ERROR") class ImplementingError : ErrorInterface
@Deprecated("error", level = DeprecationLevel.ERROR) class Error : TestDeprecation()
@Suppress("DEPRECATION_ERROR") fun getError() = Error()
@Deprecated("error", level = DeprecationLevel.ERROR) constructor(error: Short) : this()
@Deprecated("error", level = DeprecationLevel.ERROR) fun error() {}
@Deprecated("error", level = DeprecationLevel.ERROR) val errorVal: Any? = null
@Deprecated("error", level = DeprecationLevel.ERROR) var errorVar: Any? = null
@Deprecated("error", level = DeprecationLevel.ERROR) open fun openError() {}
@Deprecated("error", level = DeprecationLevel.ERROR) open val openErrorVal: Any? = null
@Deprecated("error", level = DeprecationLevel.ERROR) open var openErrorVar: Any? = null
@Deprecated("warning", level = DeprecationLevel.WARNING) open class OpenWarning : TestDeprecation()
@Suppress("DEPRECATION") class ExtendingWarning : OpenWarning()
@Deprecated("warning", level = DeprecationLevel.WARNING) interface WarningInterface
@Suppress("DEPRECATION") class ImplementingWarning : WarningInterface
@Deprecated("warning", level = DeprecationLevel.WARNING) class Warning : TestDeprecation()
@Suppress("DEPRECATION") fun getWarning() = Warning()
@Deprecated("warning", level = DeprecationLevel.WARNING) constructor(warning: Int) : this()
@Deprecated("warning", level = DeprecationLevel.WARNING) fun warning() {}
@Deprecated("warning", level = DeprecationLevel.WARNING) val warningVal: Any? = null
@Deprecated("warning", level = DeprecationLevel.WARNING) var warningVar: Any? = null
@Deprecated("warning", level = DeprecationLevel.WARNING) open fun openWarning() {}
@Deprecated("warning", level = DeprecationLevel.WARNING) open val openWarningVal: Any? = null
@Deprecated("warning", level = DeprecationLevel.WARNING) open var openWarningVar: Any? = null
constructor(normal: Long) : this()
fun normal() {}
val normalVal: Any? = null
var normalVar: Any? = null
open fun openNormal(): Int = 1
open val openNormalVal: Any? = null
open var openNormalVar: Any? = null
class HiddenOverride() : TestDeprecation() {
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) constructor(hidden: Byte) : this()
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) override fun openHidden() {}
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) override val openHiddenVal: Any? = null
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) override var openHiddenVar: Any? = null
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) constructor(error: Short) : this()
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) override fun openError() {}
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) override val openErrorVal: Any? = null
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) override var openErrorVar: Any? = null
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) constructor(warning: Int) : this()
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) override fun openWarning() {}
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) override val openWarningVal: Any? = null
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) override var openWarningVar: Any? = null
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) constructor(normal: Long) : this()
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) override fun openNormal(): Int = 2
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) override val openNormalVal: Any? = null
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) override var openNormalVar: Any? = null
}
class ErrorOverride() : TestDeprecation() {
@Deprecated("error", level = DeprecationLevel.ERROR) constructor(hidden: Byte) : this()
@Deprecated("error", level = DeprecationLevel.ERROR) override fun openHidden() {}
@Deprecated("error", level = DeprecationLevel.ERROR) override val openHiddenVal: Any? = null
@Deprecated("error", level = DeprecationLevel.ERROR) override var openHiddenVar: Any? = null
@Deprecated("error", level = DeprecationLevel.ERROR) constructor(error: Short) : this()
@Deprecated("error", level = DeprecationLevel.ERROR) override fun openError() {}
@Deprecated("error", level = DeprecationLevel.ERROR) override val openErrorVal: Any? = null
@Deprecated("error", level = DeprecationLevel.ERROR) override var openErrorVar: Any? = null
@Deprecated("error", level = DeprecationLevel.ERROR) constructor(warning: Int) : this()
@Deprecated("error", level = DeprecationLevel.ERROR) override fun openWarning() {}
@Deprecated("error", level = DeprecationLevel.ERROR) override val openWarningVal: Any? = null
@Deprecated("error", level = DeprecationLevel.ERROR) override var openWarningVar: Any? = null
@Deprecated("error", level = DeprecationLevel.ERROR) constructor(normal: Long) : this()
@Deprecated("error", level = DeprecationLevel.ERROR) override fun openNormal(): Int = 3
@Deprecated("error", level = DeprecationLevel.ERROR) override val openNormalVal: Any? = null
@Deprecated("error", level = DeprecationLevel.ERROR) override var openNormalVar: Any? = null
}
class WarningOverride() : TestDeprecation() {
@Deprecated("warning", level = DeprecationLevel.WARNING) constructor(hidden: Byte) : this()
@Deprecated("warning", level = DeprecationLevel.WARNING) override fun openHidden() {}
@Deprecated("warning", level = DeprecationLevel.WARNING) override val openHiddenVal: Any? = null
@Deprecated("warning", level = DeprecationLevel.WARNING) override var openHiddenVar: Any? = null
@Deprecated("warning", level = DeprecationLevel.WARNING) constructor(error: Short) : this()
@Deprecated("warning", level = DeprecationLevel.WARNING) override fun openError() {}
@Deprecated("warning", level = DeprecationLevel.WARNING) override val openErrorVal: Any? = null
@Deprecated("warning", level = DeprecationLevel.WARNING) override var openErrorVar: Any? = null
@Deprecated("warning", level = DeprecationLevel.WARNING) constructor(warning: Int) : this()
@Deprecated("warning", level = DeprecationLevel.WARNING) override fun openWarning() {}
@Deprecated("warning", level = DeprecationLevel.WARNING) override val openWarningVal: Any? = null
@Deprecated("warning", level = DeprecationLevel.WARNING) override var openWarningVar: Any? = null
@Deprecated("warning", level = DeprecationLevel.WARNING) constructor(normal: Long) : this()
@Deprecated("warning", level = DeprecationLevel.WARNING) override fun openNormal(): Int = 4
@Deprecated("warning", level = DeprecationLevel.WARNING) override val openNormalVal: Any? = null
@Deprecated("warning", level = DeprecationLevel.WARNING) override var openNormalVar: Any? = null
}
class NormalOverride() : TestDeprecation() {
constructor(hidden: Byte) : this()
override fun openHidden() {}
override val openHiddenVal: Any? = null
override var openHiddenVar: Any? = null
constructor(error: Short) : this()
override fun openError() {}
override val openErrorVal: Any? = null
override var openErrorVar: Any? = null
constructor(warning: Int) : this()
override fun openWarning() {}
override val openWarningVal: Any? = null
override var openWarningVar: Any? = null
constructor(normal: Long) : this()
override fun openNormal(): Int = 5
override val openNormalVal: Any? = null
override var openNormalVar: Any? = null
}
@Suppress("DEPRECATION_ERROR") fun test(hiddenNested: Hidden.Nested) {}
@Suppress("DEPRECATION_ERROR") fun test(hiddenNestedNested: Hidden.Nested.Nested) {}
@Suppress("DEPRECATION_ERROR") fun test(hiddenNestedInner: Hidden.Nested.Inner) {}
@Suppress("DEPRECATION_ERROR") fun test(hiddenInner: Hidden.Inner) {}
@Suppress("DEPRECATION_ERROR") fun test(hiddenInnerInner: Hidden.Inner.Inner) {}
@Suppress("DEPRECATION_ERROR") fun test(topLevelHidden: TopLevelHidden) {}
@Suppress("DEPRECATION_ERROR") fun test(topLevelHiddenNested: TopLevelHidden.Nested) {}
@Suppress("DEPRECATION_ERROR") fun test(topLevelHiddenNestedNested: TopLevelHidden.Nested.Nested) {}
@Suppress("DEPRECATION_ERROR") fun test(topLevelHiddenNestedInner: TopLevelHidden.Nested.Inner) {}
@Suppress("DEPRECATION_ERROR") fun test(topLevelHiddenInner: TopLevelHidden.Inner) {}
@Suppress("DEPRECATION_ERROR") fun test(topLevelHiddenInnerInner: TopLevelHidden.Inner.Inner) {}
@Suppress("DEPRECATION_ERROR") fun test(extendingHiddenNested: ExtendingHidden.Nested) {}
@Suppress("DEPRECATION_ERROR") fun test(extendingNestedInHidden: ExtendingNestedInHidden) {}
}
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) class TopLevelHidden {
class Nested {
class Nested
inner class Inner
}
inner class Inner {
inner class Inner
}
}
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) fun hidden() {}
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) val hiddenVal: Any? = null
@Deprecated("hidden", level = DeprecationLevel.HIDDEN) var hiddenVar: Any? = null
@Deprecated("error", level = DeprecationLevel.ERROR) fun error() {}
@Deprecated("error", level = DeprecationLevel.ERROR) val errorVal: Any? = null
@Deprecated("error", level = DeprecationLevel.ERROR) var errorVar: Any? = null
@Deprecated("warning", level = DeprecationLevel.WARNING) fun warning() {}
@Deprecated("warning", level = DeprecationLevel.WARNING) val warningVal: Any? = null
@Deprecated("warning", level = DeprecationLevel.WARNING) var warningVar: Any? = null
fun gc() {
kotlin.native.internal.GC.collect()
}
class TestWeakRefs(private val frozen: Boolean) {
private var obj: Any? = Any().also {
if (frozen) it.freeze()
}
fun getObj() = obj!!
fun clearObj() {
obj = null
}
fun createCycle(): List<Any> {
val node1 = Node(null)
val node2 = Node(node1)
node1.next = node2
if (frozen) node1.freeze()
return listOf(node1, node2)
}
private class Node(var next: Node?)
}
class SharedRefs {
class MutableData {
var x = 0
fun update() { x += 1 }
}
fun createRegularObject(): MutableData = create { MutableData() }
fun createLambda(): () -> Unit = create {
var mutableData = 0
{
println(mutableData++)
}
}
fun createCollection(): MutableList<Any> = create {
mutableListOf()
}
fun createFrozenRegularObject() = createRegularObject().freeze()
fun createFrozenLambda() = createLambda().freeze()
fun createFrozenCollection() = createCollection().freeze()
fun hasAliveObjects(): Boolean {
kotlin.native.internal.GC.collect()
return mustBeRemoved.any { it.get() != null }
}
private fun <T : Any> create(block: () -> T) = block()
.also { mustBeRemoved += WeakReference(it) }
private val mustBeRemoved = mutableListOf<WeakReference<*>>()
}
interface TestRememberNewObject {
fun getObject(): Any
fun waitForCleanup()
}
fun testRememberNewObject(test: TestRememberNewObject) {
val obj = autoreleasepool { test.getObject() }
test.waitForCleanup()
assertNotEquals("", obj.toString()) // Likely crashes if object is removed.
}
open class ClassForTypeCheck
fun testClassTypeCheck(x: Any) = x is ClassForTypeCheck
interface InterfaceForTypeCheck
fun testInterfaceTypeCheck(x: Any) = x is InterfaceForTypeCheck
interface IAbstractInterface {
fun foo(): Int
}
interface IAbstractInterface2 {
fun foo() = 42
}
fun testAbstractInterfaceCall(x: IAbstractInterface) = x.foo()
fun testAbstractInterfaceCall2(x: IAbstractInterface2) = x.foo()
abstract class AbstractInterfaceBase : IAbstractInterface {
override fun foo() = bar()
abstract fun bar(): Int
}
abstract class AbstractInterfaceBase2 : IAbstractInterface2
abstract class AbstractInterfaceBase3 : IAbstractInterface {
abstract override fun foo(): Int
}
var gh3525BaseInitCount = 0
open class GH3525Base {
init {
gh3525BaseInitCount++
}
}
var gh3525InitCount = 0
object GH3525 : GH3525Base() {
init {
gh3525InitCount++
}
}
class TestStringConversion {
lateinit var str: Any
}
fun foo(a: kotlin.native.concurrent.AtomicReference<*>) {}
interface GH3825 {
@Throws(MyException::class) fun call0(callback: () -> Boolean)
@Throws(MyException::class) fun call1(doThrow: Boolean, callback: () -> Unit)
@Throws(MyException::class) fun call2(callback: () -> Unit, doThrow: Boolean)
}
class GH3825KotlinImpl : GH3825 {
override fun call0(callback: () -> Boolean) {
if (callback()) throw MyException()
}
override fun call1(doThrow: Boolean, callback: () -> Unit) {
if (doThrow) throw MyException()
callback()
}
override fun call2(callback: () -> Unit, doThrow: Boolean) {
if (doThrow) throw MyException()
callback()
}
}
@Throws(Throwable::class)
fun testGH3825(gh3825: GH3825) {
var count = 0
assertFailsWith<ObjCErrorException> { gh3825.call0({ true }) }
gh3825.call0({ count += 1; false })
assertEquals(1, count)
assertFailsWith<ObjCErrorException> { gh3825.call1(true, { fail() }) }
gh3825.call1(false, { count += 1 })
assertEquals(2, count)
assertFailsWith<ObjCErrorException> { gh3825.call2({ fail() }, true) }
gh3825.call2({ count += 1 }, false)
assertEquals(3, count)
}
fun mapBoolean2String(): Map<Boolean, String> = mapOf(Pair(false, "false"), Pair(true, "true"))
fun mapByte2Short(): Map<Byte, Short> = mapOf(Pair(-1, 2))
fun mapShort2Byte(): Map<Short, Byte> = mapOf(Pair(-2, 1))
fun mapInt2Long(): Map<Int, Long> = mapOf(Pair(-4, 8))
fun mapLong2Long(): Map<Long, Long> = mapOf(Pair(-8, 8))
fun mapUByte2Boolean(): Map<UByte, Boolean> = mapOf(Pair(0x80U, true))
fun mapUShort2Byte(): Map<UShort, Byte> = mapOf(Pair(0x8000U, 1))
fun mapUInt2Long(): Map<UInt, Long> = mapOf(Pair(0x7FFF_FFFFU, 7), Pair(0x8000_0000U, 8))
fun mapULong2Long(): Map<ULong, Long> = mapOf(Pair(0x8000_0000_0000_0000UL, 8))
fun mapFloat2Float(): Map<Float, Float> = mapOf(Pair(3.14f, 100f))
fun mapDouble2String(): Map<Double, String> = mapOf(Pair(2.718281828459045, "2.718281828459045"))
fun mutBoolean2String(): MutableMap<Boolean, String> = mutableMapOf(Pair(false, "false"), Pair(true, "true"))
fun mutByte2Short(): MutableMap<Byte, Short> = mutableMapOf(Pair(-1, 2))
fun mutShort2Byte(): MutableMap<Short, Byte> = mutableMapOf(Pair(-2, 1))
fun mutInt2Long(): MutableMap<Int, Long> = mutableMapOf(Pair(-4, 8))
fun mutLong2Long(): MutableMap<Long, Long> = mutableMapOf(Pair(-8, 8))
fun mutUByte2Boolean(): MutableMap<UByte, Boolean> = mutableMapOf(Pair(128U, true))
fun mutUShort2Byte(): MutableMap<UShort, Byte> = mutableMapOf(Pair(32768U, 1))
fun mutUInt2Long(): MutableMap<UInt, Long> = mutableMapOf(Pair(0x8000_0000U, 8))
fun mutULong2Long(): MutableMap<ULong, Long> = mutableMapOf(Pair(0x8000_0000_0000_0000UL, 8))
fun mutFloat2Float(): MutableMap<Float, Float> = mutableMapOf(Pair(3.14f, 100f))
fun mutDouble2String(): MutableMap<Double, String> = mutableMapOf(Pair(2.718281828459045, "2.718281828459045"))
interface Foo_FakeOverrideInInterface<T> {
fun foo(t: T?)
}
interface Bar_FakeOverrideInInterface : Foo_FakeOverrideInInterface<String>
fun callFoo_FakeOverrideInInterface(obj: Bar_FakeOverrideInInterface) {
obj.foo(null)
}
```
|
```html
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=US-ASCII">
<title>Struct template is_interval_container<icl::interval_base_map< SubType, DomainT, CodomainT, Traits, Compare, Combine, Section, Interval, Alloc >></title>
<link rel="stylesheet" href="../../../../../../doc/src/boostbook.css" type="text/css">
<meta name="generator" content="DocBook XSL Stylesheets V1.79.1">
<link rel="home" href="../../index.html" title="Chapter 1. Boost.Icl">
<link rel="up" href="../../header/boost/icl/interval_base_map_hpp.html" title="Header <boost/icl/interval_base_map.hpp>">
<link rel="prev" href="has_inverse_ic_idp58327408.html" title="Struct template has_inverse<icl::interval_base_map< SubType, DomainT, CodomainT, Traits, Compare, Combine, Section, Interval, Alloc >>">
<link rel="next" href="absorbs_identi_idp58356128.html" title="Struct template absorbs_identities<icl::interval_base_map< SubType, DomainT, CodomainT, Traits, Compare, Combine, Section, Interval, Alloc >>">
</head>
<body bgcolor="white" text="black" link="#0000FF" vlink="#840084" alink="#0000FF">
<table cellpadding="2" width="100%"><tr>
<td valign="top"><img alt="Boost C++ Libraries" width="277" height="86" src="../../../../../../boost.png"></td>
<td align="center"><a href="../../../../../../index.html">Home</a></td>
<td align="center"><a href="../../../../../libraries.htm">Libraries</a></td>
<td align="center"><a href="path_to_url">People</a></td>
<td align="center"><a href="path_to_url">FAQ</a></td>
<td align="center"><a href="../../../../../../more/index.htm">More</a></td>
</tr></table>
<hr>
<div class="spirit-nav">
<a accesskey="p" href="has_inverse_ic_idp58327408.html"><img src="../../../../../../doc/src/images/prev.png" alt="Prev"></a><a accesskey="u" href="../../header/boost/icl/interval_base_map_hpp.html"><img src="../../../../../../doc/src/images/up.png" alt="Up"></a><a accesskey="h" href="../../index.html"><img src="../../../../../../doc/src/images/home.png" alt="Home"></a><a accesskey="n" href="absorbs_identi_idp58356128.html"><img src="../../../../../../doc/src/images/next.png" alt="Next"></a>
</div>
<div class="refentry">
<a name="boost.icl.is_interval_co_idp58341776"></a><div class="titlepage"></div>
<div class="refnamediv">
<h2><span class="refentrytitle">Struct template is_interval_container<icl::interval_base_map< SubType, DomainT, CodomainT, Traits, Compare, Combine, Section, Interval, Alloc >></span></h2>
<p>boost::icl::is_interval_container<icl::interval_base_map< SubType, DomainT, CodomainT, Traits, Compare, Combine, Section, Interval, Alloc >></p>
</div>
<h2 xmlns:rev="path_to_url~gregod/boost/tools/doc/revision" class="refsynopsisdiv-title">Synopsis</h2>
<div xmlns:rev="path_to_url~gregod/boost/tools/doc/revision" class="refsynopsisdiv"><pre class="synopsis"><span class="comment">// In header: <<a class="link" href="../../header/boost/icl/interval_base_map_hpp.html" title="Header <boost/icl/interval_base_map.hpp>">boost/icl/interval_base_map.hpp</a>>
</span><span class="keyword">template</span><span class="special"><</span><span class="keyword">typename</span> SubType<span class="special">,</span> <span class="keyword">typename</span> DomainT<span class="special">,</span> <span class="keyword">typename</span> CodomainT<span class="special">,</span>
<span class="keyword">typename</span> Traits<span class="special">,</span> <span class="identifier">ICL_COMPARE</span> Compare<span class="special">,</span> <span class="identifier">ICL_COMBINE</span> Combine<span class="special">,</span>
<span class="identifier">ICL_SECTION</span> Section<span class="special">,</span> <span class="identifier">ICL_INTERVAL</span><span class="special">(</span><span class="identifier">ICL_COMPARE</span><span class="special">)</span> Interval<span class="special">,</span>
<span class="identifier">ICL_ALLOC</span> Alloc<span class="special">></span>
<span class="keyword">struct</span> <a class="link" href="is_interval_co_idp58341776.html" title="Struct template is_interval_container<icl::interval_base_map< SubType, DomainT, CodomainT, Traits, Compare, Combine, Section, Interval, Alloc >>">is_interval_container</a><span class="special"><</span><span class="identifier">icl</span><span class="special">::</span><span class="identifier">interval_base_map</span><span class="special"><</span> <span class="identifier">SubType</span><span class="special">,</span> <span class="identifier">DomainT</span><span class="special">,</span> <span class="identifier">CodomainT</span><span class="special">,</span> <span class="identifier">Traits</span><span class="special">,</span> <span class="identifier">Compare</span><span class="special">,</span> <span class="identifier">Combine</span><span class="special">,</span> <span class="identifier">Section</span><span class="special">,</span> <span class="identifier">Interval</span><span class="special">,</span> <span class="identifier">Alloc</span> <span class="special">></span><span class="special">></span> <span class="special">{</span>
<span class="comment">// types</span>
<span class="keyword">typedef</span> <span class="identifier">is_interval_container</span><span class="special"><</span> <a class="link" href="interval_base_map.html" title="Class template interval_base_map">icl::interval_base_map</a><span class="special"><</span> <span class="identifier">SubType</span><span class="special">,</span> <span class="identifier">DomainT</span><span class="special">,</span> <span class="identifier">CodomainT</span><span class="special">,</span> <span class="identifier">Traits</span><span class="special">,</span> <span class="identifier">Compare</span><span class="special">,</span> <span class="identifier">Combine</span><span class="special">,</span> <span class="identifier">Section</span><span class="special">,</span> <span class="identifier">Interval</span><span class="special">,</span> <span class="identifier">Alloc</span> <span class="special">></span> <span class="special">></span> <a name="boost.icl.is_interval_co_idp58341776.type"></a><span class="identifier">type</span><span class="special">;</span>
<span class="comment">// <a class="link" href="is_interval_co_idp58341776.html#idp58353056-bb">public member functions</a></span>
<a class="link" href="is_interval_co_idp58341776.html#idp58353616-bb"><span class="identifier">BOOST_STATIC_CONSTANT</span></a><span class="special">(</span><span class="keyword">bool</span><span class="special">,</span> <span class="identifier">value</span> <span class="special">=</span> <span class="keyword">true</span><span class="special">)</span><span class="special">;</span>
<span class="special">}</span><span class="special">;</span></pre></div>
<div class="refsect1">
<a name="idp115381632"></a><h2>Description</h2>
<div class="refsect2">
<a name="idp115382048"></a><h3>
<a name="idp58353056-bb"></a><code class="computeroutput">is_interval_container</code> public member functions</h3>
<div class="orderedlist"><ol class="orderedlist" type="1"><li class="listitem"><pre class="literallayout"> <a name="idp58353616-bb"></a><span class="identifier">BOOST_STATIC_CONSTANT</span><span class="special">(</span><span class="keyword">bool</span><span class="special">,</span> <span class="identifier">value</span> <span class="special">=</span> <span class="keyword">true</span><span class="special">)</span><span class="special">;</span></pre></li></ol></div>
</div>
</div>
</div>
<table xmlns:rev="path_to_url~gregod/boost/tools/doc/revision" width="100%"><tr>
<td align="left"></td>
<td align="right"><div class="copyright-footer">GmbH<p>
Distributed under the Boost Software License, Version 1.0. (See accompanying
file LICENSE_1_0.txt or copy at <a href="path_to_url" target="_top">path_to_url</a>)
</p>
</div></td>
</tr></table>
<hr>
<div class="spirit-nav">
<a accesskey="p" href="has_inverse_ic_idp58327408.html"><img src="../../../../../../doc/src/images/prev.png" alt="Prev"></a><a accesskey="u" href="../../header/boost/icl/interval_base_map_hpp.html"><img src="../../../../../../doc/src/images/up.png" alt="Up"></a><a accesskey="h" href="../../index.html"><img src="../../../../../../doc/src/images/home.png" alt="Home"></a><a accesskey="n" href="absorbs_identi_idp58356128.html"><img src="../../../../../../doc/src/images/next.png" alt="Next"></a>
</div>
</body>
</html>
```
|
```c++
/* -*- mode: C++; c-basic-offset: 4; indent-tabs-mode: nil -*- */
// vim: ft=cpp:expandtab:ts=8:sw=4:softtabstop=4:
#ident "$Id$"
/*
COPYING CONDITIONS NOTICE:
This program is free software; you can redistribute it and/or modify
it under the terms of version 2 of the GNU General Public License as
published by the Free Software Foundation, and provided that the
following conditions are met:
* Redistributions of source code must retain this COPYING
CONDITIONS NOTICE, the COPYRIGHT NOTICE (below), the
DISCLAIMER (below), the UNIVERSITY PATENT NOTICE (below), the
PATENT MARKING NOTICE (below), and the PATENT RIGHTS
GRANT (below).
* Redistributions in binary form must reproduce this COPYING
CONDITIONS NOTICE, the COPYRIGHT NOTICE (below), the
DISCLAIMER (below), the UNIVERSITY PATENT NOTICE (below), the
PATENT MARKING NOTICE (below), and the PATENT RIGHTS
GRANT (below) in the documentation and/or other materials
provided with the distribution.
You should have received a copy of the GNU General Public License
along with this program; if not, write to the Free Software
Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA
02110-1301, USA.
COPYRIGHT NOTICE:
TokuFT, Tokutek Fractal Tree Indexing Library.
DISCLAIMER:
This program is distributed in the hope that it will be useful, but
WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
General Public License for more details.
UNIVERSITY PATENT NOTICE:
The technology is licensed by the Massachusetts Institute of
Technology, Rutgers State University of New Jersey, and the Research
Foundation of State University of New York at Stony Brook under
United States of America Serial No. 11/760379 and to the patents
and/or patent applications resulting from it.
PATENT MARKING NOTICE:
This software is covered by US Patent No. 8,185,551.
This software is covered by US Patent No. 8,489,638.
PATENT RIGHTS GRANT:
"THIS IMPLEMENTATION" means the copyrightable works distributed by
Tokutek as part of the Fractal Tree project.
"PATENT CLAIMS" means the claims of patents that are owned or
licensable by Tokutek, both currently or in the future; and that in
the absence of this license would be infringed by THIS
IMPLEMENTATION or by using or running THIS IMPLEMENTATION.
"PATENT CHALLENGE" shall mean a challenge to the validity,
patentability, enforceability and/or non-infringement of any of the
PATENT CLAIMS or otherwise opposing any of the PATENT CLAIMS.
Tokutek hereby grants to you, for the term and geographical scope of
the PATENT CLAIMS, a non-exclusive, no-charge, royalty-free,
irrevocable (except as stated in this section) patent license to
make, have made, use, offer to sell, sell, import, transfer, and
otherwise run, modify, and propagate the contents of THIS
IMPLEMENTATION, where such license applies only to the PATENT
CLAIMS. This grant does not include claims that would be infringed
only as a consequence of further modifications of THIS
IMPLEMENTATION. If you or your agent or licensee institute or order
or agree to the institution of patent litigation against any entity
(including a cross-claim or counterclaim in a lawsuit) alleging that
THIS IMPLEMENTATION constitutes direct or contributory patent
infringement, or inducement of patent infringement, then any rights
granted to you under this License shall terminate as of the date
such litigation is filed.  If you or your agent or exclusive
licensee institute or order or agree to the institution of a PATENT
CHALLENGE, then Tokutek may terminate any rights granted to you
under this License.
*/
#include "test.h"

static TOKUTXN const null_txn = 0;

static void test2 (int limit) {
    FT_HANDLE t;
    int r;
    int i;
    CACHETABLE ct;
    const char *fname = TOKU_TEST_FILENAME;
    if (verbose) printf("%s:%d checking\n", __FILE__, __LINE__);
    toku_cachetable_create(&ct, 0, ZERO_LSN, nullptr);
    unlink(fname);
    r = toku_open_ft_handle(fname, 1, &t, 1024, 256, TOKU_DEFAULT_COMPRESSION_METHOD, ct, null_txn, toku_builtin_compare_fun);
    if (verbose) printf("%s:%d did setup\n", __FILE__, __LINE__);
    assert(r==0);
    for (i=0; i<limit; i++) { // limit goes up to 4096
        DBT k,v;
        char key[100],val[100];
        snprintf(key,100,"hello%d",i);
        snprintf(val,100,"there%d",i);
        // toku_ft_insert returns void; the tree is verified after each insert instead.
        toku_ft_insert(t, toku_fill_dbt(&k, key, 1+strlen(key)), toku_fill_dbt(&v, val, 1+strlen(val)), null_txn);
        r = toku_verify_ft(t); assert(r==0);
        //printf("%s:%d did insert %d\n", __FILE__, __LINE__, i);
    }
    if (verbose) printf("%s:%d inserted\n", __FILE__, __LINE__);
    r = toku_verify_ft(t); assert(r==0);
    r = toku_close_ft_handle_nolsn(t, 0); assert(r==0);
    toku_cachetable_close(&ct);
    if (verbose) printf("test2 ok\n");
}

int
test_main (int argc, const char *argv[]) {
    default_parse_args(argc, argv);
    if (verbose) printf("test2 faster\n");
    test2(2);
    test2(27);
    test2(212);
    test2(4096);
    if (verbose) printf("test2 ok\n");
    return 0;
}
```
|
Julius Schulhoff (Julius Šulhov) (2 August 1825 – 15 March 1898) was a Bohemian pianist and composer of Jewish birth. As a composer, he was best known for his virtuosic salon pieces for solo piano, which included a grand sonata in F minor, twelve études, and various caprices, impromptus, waltzes, and mazurkas.
Life and career
Schulhoff was born in Prague, where he began studying piano with Siegfried Kisch and Ignaz Amadeus Tedesco and also trained in music theory with Václav Tomášek. He made his debut at Dresden in 1842 and soon afterwards appeared at the Leipzig Gewandhaus. Moving to Paris shortly afterwards, he met Frédéric Chopin, who encouraged him in his bid to become an established professional pianist. The concerts that Schulhoff gave at Chopin's suggestion were greeted with such acclaim that he embarked on a long tour through France and to London, continuing his travels through Spain (1851) and Russia (1853).
After this tour he returned to Paris, where he devoted himself entirely to composition and teaching. He continued as a piano teacher when he settled in Dresden in 1870 and later moved to Berlin in 1897. He died in Berlin in 1898, aged 72.
He was the great-uncle of the 20th-century composer Erwin Schulhoff.
References
Article in Jewish Encyclopedia
Sources
External links
musicologie.org Full score
1825 births
1898 deaths
Austrian classical composers
Austrian classical pianists
Male classical pianists
Czech Jews
Czech male classical composers
Austrian male classical composers
Czech classical pianists
Jewish classical composers
Jewish classical pianists
Musicians from Prague
Pupils of Václav Tomášek
Czech Romantic composers
19th-century classical composers
19th-century classical pianists
19th-century Czech male musicians
|
Samuel Lyde (1825–1860) was an English writer and Church of England missionary who lived and worked in Syria in the 1850s and wrote a pioneering book on the Alawite sect. In 1856, he sparked months of anti-Christian rioting in Ottoman Palestine when, during a visit there, he killed a beggar.
Life and missionary work
Lyde was born in 1825. He obtained a degree in 1848 after studying at Jesus College, Cambridge; in 1851 he was awarded an M.A., took holy orders as a clergyman of the Church of England, and became a fellow of Jesus College. Poor health, according to Lyde, prevented him from "exercising the duties of his profession in England, at least during the winter months" and, therefore, in the winter of 1850/1851 he made "the usual tour" of Egypt and Syria. While on the "tour", he decided, because of his health, to settle permanently in Syria, then a part of the Ottoman Empire. While visiting Beirut, the British consul suggested to him that he could occupy his time by working as a missionary to the Alawites, also known as Nusayris, a secretive mountain sect who later provided two of modern Syria's leaders: Bashar al-Assad and his father, Hafez al-Assad.
Lyde was persuaded by the idea. From 1853 to 1859, he lived among the Alawite community of the Kalbiyya district, and established a mission and school in Bhamra, a village overlooking the Mediterranean port of Latakia. However, he later wrote that living among them convinced him that the Alawites fulfilled St Paul's description of the heathen: "filled with all unrighteousness, fornication, wickedness, covetousness, maliciousness".
Lyde travelled to Palestine in 1856, and as he rode on his horse into Nablus he shot and killed a beggar who was trying to steal his coat. It was either an accidental discharge of the gun or Lyde had lost his nerve and fired. An anti-Christian riot ensued during which Christian houses were burned and several Greeks and Prussians were killed. Lyde took refuge in the town governor's house but was eventually put on trial for murder. The only witnesses were three women who accused him of attacking and deliberately killing the beggar. However, the testimony of women was inadmissible in Ottoman courts and he was acquitted of murder, although he was ordered to pay compensation to the man's family. The violent rioting continued for several months and even spread to Gaza.
Lyde developed a deranged mental state and had delusions that he was John the Baptist, Jesus Christ or God himself. However, he subsequently recovered sufficiently to write a book on the Alawites, which he completed in Cairo shortly before his death. He died in Alexandria in Egypt in April 1860. He was 35 years old. He bequeathed his mission at Bhamra to two American missionaries, R. J. Dodds and J. Beattie of the Reformed Presbyterian Church.
Publications and influence
Lyde wrote two books on the Alawites: The Anseyreeh and Ismaeleeh: A Visit to the Secret Sects of Northern Syria with a View to the Establishment of Schools (1853) and The Asian Mystery Illustrated in the History, Religion and Present State of the Ansaireeh or Nusairis of Syria (1860). The latter is considered to be a pioneering work, and was the first monograph to be written on the Alawite-Nusayri religion. It remained the only Western book on the subject until 1900, when René Dussaud published his Histoire et religion des Nosairîs.
His description of Alawite doctrines was based on a document called Kitab al-mashyakha ("The Manual of the Shaykhs"), which he said he had bought from a Christian merchant from Latakia. This document appears to have differed in certain respects from other sources on Alawite doctrine. For many years it was thought to have been lost and only available through the extracts quoted in translation by Lyde. In 2013, it was announced that the document Lyde had used had been discovered in the archives of the Old Library of Jesus College, Cambridge. Lyde had bequeathed it to his old college, and, apparently, had sent it to Cambridge shortly before his death.
His writing reveals a negative view of the Alawites and, in particular, he was critical of what he saw as their brigandage, feuds, lying and divorce. He went as far as saying that "the state of [Alawi] society was a perfect hell upon earth". The Asian Mystery became a popular book and has been described as "colourful" but "unreliable" in some respects. Nevertheless, Lyde's account remains an influential source on Alawites, and, for instance, is widely quoted on the internet.
Notes
References
External links
Full texts of Lyde's works via Google books:
The Anseyreeh and Ismaeleeh: A Visit to the Secret Sects of Northern Syria with a View to the Establishment of Schools (1853)
The Asian Mystery Illustrated in the History, Religion and Present State of the Ansaireeh or Nusairis of Syria (1860)
Church of England missions
English Anglican missionaries
1825 births
1860 deaths
Anglican missionaries in Syria
British orientalists
Fellows of Jesus College, Cambridge
English religious writers
People acquitted of murder
History of Ottoman Syria
19th-century people from the Ottoman Empire
|
Grace Neville (1898–1973) was an American screenwriter.
Biography
Grace Neville was born in Chattanooga, Tennessee, to Benjamin Neville and Helen Turnell. She joined the scenario department at Columbia Pictures in 1935. She later served as an officer in the Hollywood Studio Club, which aimed to prepare women for careers in the film industry. Neville, who never married, died in 1973 in West Hollywood, where she resided.
Selected filmography
Little Miss Roughneck (1938)
All American Sweetheart (1937)
Counsel for Crime (1937)
The Game That Kills (1937)
Motor Madness (1937)
Find the Witness (1937)
Shakedown (1936)
Dangerous Intrigue (1936)
Air Hawks (1935)
References
Bibliography
Larry Langman & Daniel Finn. A Guide to American Crime Films of the Thirties. Greenwood Press, 1995.
External links
1898 births
1973 deaths
Writers from Chattanooga, Tennessee
Screenwriters from Tennessee
American women screenwriters
20th-century American women writers
20th-century American screenwriters
|
The murder of Julie Pacey (1955/1956 – 26 September 1994) is a mysterious, still unsolved killing of a mother in her own home in Grantham, England, on Monday 26 September 1994. The 38-year-old 'vivacious mum' was found strangled to death with a cord in her first-floor bathroom by her 14-year-old daughter on her return home from school. A mysterious figure who became known as the 'Overalls Man', and who remains the prime suspect, was seen by numerous witnesses in the vicinity of Pacey's home in the days around the murder; this red-faced man had suspiciously turned up at Pacey's home three days previously, when she was alone, supposedly asking for directions. Although she had been sexually assaulted, investigators still do not know why she was apparently targeted in what appeared to be a pre-planned attack. The case has twice featured on Crimewatch, on which Pacey's murder was described as a "truly dreadful case".
As well as the mystery surrounding 'Overalls Man', the case is notable for a number of 'bizarre' matters associated with it. Even though Pacey's family car was an Audi, she was witnessed by multiple people in an unknown BMW in the days leading up to the murder, despite her family being adamant she would not have had access to one. A BMW was also reported to have been seen parked on her driveway on occasions, including on the day of the murder, and a BMW was seen speeding away from the scene after the killing. These sightings have never been explained. In another 'bizarre twist' that was reported nationally, the (innocent) actor who played the role of the killer in the Crimewatch reconstruction was sensationally investigated as a suspect in September 2015 after the 1994 appeal was re-shown, with some viewers who had watched the appeal mistakenly calling in to state that they recognised him as the killer.
Publicity on the case had returned in 2015 after it was revealed that a full DNA profile of the killer had been identified in what was described as a "landmark forensic breakthrough", although with the killer's DNA not matching any on the UK National DNA Database, police continue to appeal to the public to come forward to provide a name that can be investigated to be eliminated. Pacey's murder has been described as one of the region's "most mysterious crimes".
Pacey's background
Pacey was a 38-year-old mother of two who lived with her family in a luxury bungalow in Grantham, Lincolnshire. Her husband was Andrew Pacey, whom she had originally met at the age of eight, and they were childhood sweethearts. They had been married for eighteen years at the time of the murder and had two children, 14-year-old Helen and 11-year-old Matthew. Julie looked after children part-time as her job, at local preschool St Peter's Day Nursery in St Catherine's Road, which was opposite the town's police headquarters. Andrew later described her as someone who "got on with everyone" and was "kind and caring and considerate" and "lived for the family". Andrew worked for a long-established plumbing business in the town. The area they lived in was described as a peaceful neighbourhood.
Pacey was described as a "vivacious blonde" and a "vivacious 'perfect mum'" in the press, with neighbours saying she was "extremely attractive and popular".
Lead-up to murder
There were a number of suspicious occurrences in the lead-up to Pacey's murder which it is believed may have been related to the killing. At around 3:30 pm on Friday 23 September 1994, three days (almost exactly to the minute) before Pacey was killed, a local girl who came to the Paceys' home as usual after school to wait until her own mother returned from work saw a suspicious figure. As she walked towards the Paceys' house, she saw a "strange man" wearing blue overalls walking into the Paceys' driveway. When the girl walked into the driveway to enter the house, the unknown man walked back out and passed directly by her; the girl said he was also wearing brown workman's boots, was fairly chubby and looked around 35. She said he had a big round face that was "all pink", with a ruddy complexion, and that he had rough hands. As soon as the girl passed him and entered through the front door, Pacey asked her if she had seen the strange man who had just called, explaining that she had heard a knock at the door and assumed it was the girl as expected, but that the stranger had entered and asked for directions to Eskdale Road. The girl involved would later help create an artist's impression of the ruddy-faced individual in overalls.
Though the Pacey family car was an Audi 80, several people mysteriously later insisted to police that they had seen Pacey in a 5-series BMW in the period before the murder. On the very day of the murder, an acquaintance of Pacey's was driving down her road when she again saw this BMW. She had been driving behind it when it turned into Pacey's driveway, on which she also saw the family Audi as usual. This was at about 2:50 pm on the Monday Pacey was murdered, which was notable as Pacey had herself returned home in the Audi only five minutes earlier, at around 2:45 pm.
Murder
The murder occurred on Monday 26 September 1994. That day, Pacey had left for work at around 10 am. Her husband was out that day working on a plumbing contract at Pechiney Packaging in Springfield Road with his brother. Having returned home in the family Audi at 2:45 pm, Julie left again around half an hour later to go to the shops, and was seen driving the Audi in Highcliffe Road near her home. Suspiciously, a man in blue overalls – apparently the same man who had turned up at Pacey's home asking for directions three days earlier – was seen walking up the road at the same time, and he stepped into the road as Pacey was passing in her car, causing her to nearly hit him. After Pacey's car had passed, the man was witnessed suddenly turning round and running back in the direction her car had gone.
At around 4:15 pm Julie's daughter Helen arrived home from school to discover her mother dead in the first-floor bathroom. Pacey had been sexually assaulted and strangled. A ligature mark was found around her neck, and she had been strangled with a cord. She was fully clothed but some of her clothes had been disarranged.
Only one item was found to be missing from the home after the murder: an unusual French Luc Desroches watch, which Pacey had bought for the equivalent of only £10 while on holiday in France shortly before she died. The watch has never been found and was apparently taken from the home, for unknown reasons, by her killer.
Initial investigation
It was soon discovered by police that the suspicious man in blue overalls/dungarees had been additionally seen by a number of witnesses on the estate and in Grantham generally between Thursday 22 September, four days before the murder, and Tuesday 27 September, the day after. The witnesses generally agreed he had a particularly red face. It emerged that this man had been asking a number of people for directions to different places, strangely including an industrial site far away on the other side of town, and unusually never asked the witnesses to repeat any directions. The morning after the murder, at 9 am, he was seen again kicking grass nearby as if he was looking for something, and he later entered a shop in Grantham town centre, acting "totally suspiciously" according to the shopkeeper. Again, he was wearing the blue overalls. The mysterious man became known as 'Overalls Man'.
It was quickly established that there was a sexual motive for the crime, with Pacey having been sexually assaulted, although current investigators note that they have still found no specific motive for why she was apparently targeted. There were no signs of a break-in, no indications of a struggle, and she was found face down in the bathroom, which is where it is believed she was killed. It was believed she may have been strangled from behind. Her nails, which had been "beautifully manicured", were found to be completely unbroken, further indicating the lack of a struggle. It was believed that the daughter had found Pacey's body only minutes after she had been attacked. Police theorised that the man could have got in because Pacey had left her keys in the lock as she entered the house.
Pacey's case featured on Crimewatch in November 1994. It was said that it was "quite bizarre" that the family were adamant that Pacey never had access to a BMW, yet was apparently seen in it by numerous witnesses at times. The sightings had weight as some were made by people who knew Pacey well and knew she drove an Audi, and some of these witnesses stated that they had seen this BMW parked next to the Audi on the drive. To add to the mystery, a BMW with the same description was seen twice speeding away from the scene of the murder at around 3:20 pm, the time which Pacey was seen driving her Audi down Highcliffe Road as she encountered the man in the blue overalls.
On Monday 3 October 1994, exactly a week after the murder, police stopped more than 500 people walking or driving along Longcliffe Road to ask if they had any information on the murder. Questionnaires were handed out to mothers and children. A full-colour poster with the artist's impression of the 'Overalls Man' was distributed to every house on Pacey's estate.
Detectives believed that the watch that had been stolen may have been given to a wife or girlfriend by the killer, and they then may have started wearing it without realising its significance.
Pacey's parents later said that they believed that the killer may not have known Julie but "known of her", saying: "It is too much of a coincidence that it happened the one day when she wasn't looking after the little girl she minded and before the children came home from school". One of the lead detectives, Superintendent Roger Billingsley, declared in 1997 that the murder appeared to be a pre-meditated crime.
Continuing publicity
The case eventually went cold, but continued to receive publicity. In 2001, it was reported that links between Pacey's case and the murder of 21-year-old Sharon Harper four months earlier in Grantham had been investigated. The links were made as part of Operation Enigma, a wider national police investigation into the unsolved murders of dozens of women in the UK in the 1990s. Several 'clusters' of similar murders were identified which suggested that each could be the work of one man, and the Harper and Pacey cases were one of these 'clusters'. Barmaid Harper had been murdered after she left her Grantham pub workplace after midnight one night, and her body was found soon after in undergrowth near to her home having been strangled to death and beaten.
Publicity returned to the case in 2015 after significant developments. A second appeal was made on Crimewatch, 21 years after the original reconstruction, in which it was revealed that scientists had managed to isolate a full DNA profile of the killer. It was further noted that all investigations had failed to establish an actual motive for the crime, with Pacey simply being a family woman who lived for her children, and there being no reason why anyone should attack her. The crime was described as a "truly dreadful case". Significantly, it was revealed that the 'Overalls Man' who had been the key suspect in the original investigation had still not been eliminated or identified, suggesting that he may be the killer. It was also revealed that the owner of the BMW car, which it was stated "may or may not have been involved" with events that day, had not been identified. The DNA development, which was made using the modern DNA-17 profiling technique, was described as a "landmark forensic breakthrough". The DNA did not match anyone in the UK National DNA Database, and so appeals have been made to the public to identify the murderer by giving investigators a name that could be compared to the DNA profile, with lead detective Helen Evans saying: "The one drawback is that the DNA does not match anyone presently in our databases. For that reason we are asking the public for help and urging them to give us a name". Appeals were also made for the 'Overalls Man' to come forward so that he could potentially be eliminated through the DNA. However, he has not done so.
Shortly after the Crimewatch appeal, in early September 2015, the actor from the original 1994 reconstruction was himself investigated after calls to the programme, in what was described as a "bizarre twist". Some viewers who had watched the re-appeal, which included sections from the original 1994 reconstruction, became, as described by The Independent, 'confused', and called in to name the actor playing the killer as being responsible. In events that made national news, this caused police to visit and take the DNA of the man, the now 53-year-old Steve Watson, without realising that he was the original appeal actor. Watson stated that locals had recognised him in the street after the reconstruction and so named him as the killer, and police investigated him as a result even after he told them he was the actor, saying: "[My face] was on the screen for too long and even then, people in the street said, 'Oh, is that that murderer?' To hear those words you think, 'Please, it's just a reconstruction, surely you understand', but unfortunately they don't". He added that people had come up to him in the street and said "you're that murderer" or "you're that one that got that poor girl". Watson had previously received abuse from confused locals who thought he was the killer after he played the suspect in the 1994 appeal. He had taken the role because of his likeness to the suspect, which had caused problems before, including a farcical occasion when a police officer nearly arrested him after he turned up at the police station dressed as the suspect, ready to film the reconstruction with the TV crew. After the 2015 events, Watson described the situation as "ridiculous" to the press, and The Independent said that the calls were misguided responses to the appeal. No further action was taken against Watson after the DNA sample was taken, and in 2017 he was reported to have been found "entirely innocent".
See also
Other (still unsolved) UK cold cases where the offender's DNA is known:
Murder of Deborah Linsley
Murders of Eve Stratford and Lynne Weedon
Murder of Lisa Hession
Murders of Jacqueline Ansell-Lamb and Barbara Mayo
Murder of Lindsay Rimer
Murder of Janet Brown
Murder of Sheila Anderson
Murder of Linda Cook
Murder of Melanie Hall
Batman rapist – subject to Britain's longest-running serial rape investigation
References
September 1994 events in the United Kingdom
1990s in Lincolnshire
1994 in England
2015 in England
1994 murders in the United Kingdom
Unsolved murders in England
Incidents of violence against women
1994 crimes
Grantham
Violence against women in England
|
```typescript
export interface ArrangePoint {
tick: number
trackIndex: number
}
export namespace ArrangePoint {
export function sub(v1: ArrangePoint, v2: ArrangePoint) {
return {
tick: v1.tick - v2.tick,
trackIndex: v1.trackIndex - v2.trackIndex,
}
}
export function clamp(point: ArrangePoint, maxTrackIndex: number) {
return {
tick: Math.max(0, point.tick),
trackIndex: Math.max(0, Math.min(maxTrackIndex, point.trackIndex)),
}
}
}
```
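A quick usage sketch of the helpers above (the standalone re-declarations and sample values below are illustrative; in the source the functions live inside the `ArrangePoint` namespace):

```typescript
// Standalone re-declarations of the helpers above, for illustration only.
interface ArrangePoint {
  tick: number
  trackIndex: number
}

function sub(v1: ArrangePoint, v2: ArrangePoint): ArrangePoint {
  return { tick: v1.tick - v2.tick, trackIndex: v1.trackIndex - v2.trackIndex }
}

function clamp(point: ArrangePoint, maxTrackIndex: number): ArrangePoint {
  return {
    tick: Math.max(0, point.tick),
    trackIndex: Math.max(0, Math.min(maxTrackIndex, point.trackIndex)),
  }
}

// Dragging a selection backwards can produce a point off the grid;
// clamp snaps it back into the valid tick/track range.
const delta = sub({ tick: 100, trackIndex: 1 }, { tick: 340, trackIndex: 3 })
// delta is { tick: -240, trackIndex: -2 }
const snapped = clamp(delta, 7)
// snapped is { tick: 0, trackIndex: 0 }
```

Note that `clamp` keeps `tick` non-negative but, unlike `trackIndex`, imposes no upper bound on it.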
|
```text
CSS for when JavaScript is enabled
Vibration API
Navigation Timing API
Blobs
Geolocation
```
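Each capability in the list above can be probed at runtime. As a rough illustration (not from the original source), the sketch below takes the host object as a parameter so it can run outside a browser; `HostLike` and `detectFeatures` are names invented here, not a real API:

```typescript
// Runtime capability checks for the listed features. The host object is
// injected so the sketch can be exercised without a real browser window.
type HostLike = {
  navigator?: { vibrate?: unknown; geolocation?: unknown };
  performance?: { timing?: unknown };
  Blob?: unknown;
};

const isObj = (x: unknown): boolean => typeof x === "object" && x !== null;

function detectFeatures(host: HostLike) {
  return {
    vibration: typeof host.navigator?.vibrate === "function",   // Vibration API
    navigationTiming: isObj(host.performance?.timing),          // Navigation Timing API
    blobs: typeof host.Blob === "function",                     // Blob constructor
    geolocation: isObj(host.navigator?.geolocation),            // Geolocation
  };
}

// In a real page one would pass the global object, e.g.
// detectFeatures(window as unknown as HostLike), and gate calls on the result.
const caps = detectFeatures({ navigator: { geolocation: {} } });
```

The "CSS for when JavaScript is enabled" item is usually handled separately, by swapping a `no-js` class on `<html>` rather than by a runtime check like the ones above.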
|
Sir Gordon Frederick Tietjens (born 9 December 1955) is head coach of the Samoa rugby sevens team, and a celebrated former coach of the New Zealand men's national team in rugby sevens, the All Blacks Sevens. When the International Rugby Board inducted him into the IRB Hall of Fame in May 2012, it said that "Tietjens' roll of honour is without peer in Sevens, and perhaps in the Game of Rugby as a whole." According to Spiro Zavos, Tietjens is "The greatest of all the Sevens coaches".
As of his induction, he had coached the All Blacks Sevens to 10 series titles in the IRB Sevens World Series, the Rugby World Cup Sevens crown in 2001, and gold medals in four of the five Commonwealth Games in which the sport had been contested, losing the 2014 final in Glasgow. He has also added two more IRB Sevens series titles (2013 and 2014), and a second Rugby World Cup Sevens crown (also in 2013).
Player development
Tietjens has coached many young players who have gone on to become All Blacks, including Christian Cullen, Jonah Lomu, Joe Rokocoko, Mils Muliaina, Rico Gear, Cory Jane, Ben Smith, Rieko Ioane, Israel Dagg, and Liam Messam. Tietjens coached 44 players who went on to become All Blacks in the 15-a-side game and before his retirement, he was the only remaining active international coach from the amateur era.
Tietjens is currently assisted by Eric Rush, a former long-serving captain of the New Zealand Sevens team and a former Sevens star himself.
Olympics
In 2012, his contract as the NZ Sevens coach was extended through to 2016. This allows Tietjens to be part of Sevens rugby's first inclusion in the Olympic Games. According to then-World Rugby chairman Bernard Lapasset, sevens' inclusion in the Olympics was "in no small way down to Gordon Tietjens. Through his knowledge, passion, and expertise, he has driven the standards towards what we now celebrate as a truly global game of sevens."
In the 1999 New Year Honours, Tietjens was appointed a Member of the New Zealand Order of Merit (MNZM); in the 2007 New Year Honours, he was elevated to Companion of the same order (CNZM), and in the 2013 Queen's Birthday Honours, he was further promoted to a Knight Companion of the New Zealand Order of Merit (KNZM).
After an unsuccessful 2016 Olympics campaign, Tietjens stepped down from the All Black Sevens coaching position.
Samoa
In October 2016, Tietjens accepted the role of coach to the Samoa national sevens team.
In 2020, he announced that he would be stepping down from the head coaching role for the Samoa national sevens team, stating that he wanted Samoa to be able to plan for the future.
Achievements
Playing honours: New Zealand Barbarians
Member of the first national New Zealand Hong Kong Sevens Team (1983)
New Zealand Sevens Head Coach (1994–2016)
IRB Sevens World Series Champions (2000, 2001, 2002, 2003, 2004, 2005, 2007, 2008, 2011, 2012, 2013 & 2014)
Rugby World Cup Sevens Champions (2001 & 2013)
Commonwealth Games Rugby Sevens Gold Medalist (1998, 2002, 2006 & 2010)
NPC Coach of the Year in (2000 for Bay of Plenty)
NZRU Coach of the Year (2010)
IRB Hall of Fame Member
Knight Companion of The New Zealand Order of Merit
References
External links
1955 births
Living people
World Rugby Hall of Fame inductees
New Zealand rugby union coaches
Rugby sevens in New Zealand
Knights Companion of the New Zealand Order of Merit
Rugby football people awarded knighthoods
Olympic coaches for New Zealand
New Zealand national rugby sevens team coaches
New Zealand rugby union players
Waikato rugby union players
Bay of Plenty rugby union players
New Zealand male rugby sevens players
New Zealand international rugby sevens players
Samoa national rugby sevens team coaches
|
```text
Max Magic Find Bonus
0
cagao123, bungholio
0 00E35318 3C607F7F
0 00E3531C 6063FFFF
0 00E35320 9001FFFC
0 00E35324 C021FFFC
0 00E35328 4889C006
0 0089C000 48E3531A
/*
Nearly all dropped items
will be Legendary.
*/
#
Unlock All Acts
0
bungholio
0 0041C158 3860001F
/*
It does not unlock all
chapters within each act.
*/
#
Max Stats
0
GuitarMan
0 00F26AA0 2F86F010
0 00F26AA4 409E0014
0 00F26AA8 3CC04B18
0 00F26AAC 60C6967F
0 00F26AB0 90C50008
0 00F26AB4 80C50008
0 00F26AB8 2F86F00F
0 00F26ABC 409E0014
0 00F26AC0 3CC04B18
0 00F26AC4 60C6967F
0 00F26AC8 90C50008
0 00F26ACC 80C50008
0 00F26AD0 2F86F00E
0 00F26AD4 409E0014
0 00F26AD8 3CC04B18
0 00F26ADC 60C6967F
0 00F26AE0 90C50008
0 00F26AE4 80C50008
0 00F26AE8 2F86F026
0 00F26AEC 409E0014
0 00F26AF0 3CC04B18
0 00F26AF4 60C6967F
0 00F26AF8 90C50008
0 00F26AFC 80C50008
0 00F26B00 2F86F0BB
0 00F26B04 409E0010
0 00F26B08 3CC0447A
0 00F26B0C 90C50008
0 00F26B10 80C50008
0 00F26B14 2F86F03F
0 00F26B18 409E0010
0 00F26B1C 3CC04316
0 00F26B20 90C50008
0 00F26B24 80C50008
0 00F26B28 2F86F0A3
0 00F26B2C 409E0010
0 00F26B30 3CC04070
0 00F26B34 90C50008
0 00F26B38 80C50008
0 00F26B3C 2F86F04B
0 00F26B40 409E0010
0 00F26B44 3CC047C3
0 00F26B48 90C50008
0 00F26B4C 80C50008
0 00F26B50 2F86F0EA
0 00F26B54 409E000C
0 00F26B58 3CC0447A
0 00F26B5C 90C50008
0 00F26B60 80C50008
0 00F26B64 48571CEE
0 00571CE8 48F26AA2
#
No Cooldown
0
GuitarMan
0 009C0574 41820100
#
Huge Gold On Sell
0
GuitarMan
0 00479970 60000000
#
AoB Unlock All Acts
0
bungholio
B 00010000 04000000
B 907100802C0400004182012C8071007C 907100802C0400004182012C3860001F
/*
It does not unlock all
chapters within each act.
*/
#
```
|
```xml
<?xml version="1.0" encoding="UTF-8" ?>
<beans xmlns="path_to_url" xmlns:xsi="path_to_url"
xmlns:dubbo="path_to_url"
xsi:schemaLocation="path_to_url
path_to_url path_to_url path_to_url">
<bean id="demoService" class="com.zheng.demo.rpc.service.impl.DemoServiceImpl"/>
<dubbo:service interface="com.zheng.demo.rpc.api.DemoService" ref="demoService" timeout="5000" retries="0" />
</beans>
```
|
The Australian Brandenburg Orchestra (ABO) is an Australian period instrument orchestra specialising in the performance of baroque and classical music.
Founders
The orchestra's founder and artistic director is Paul Dyer.
In 2013 Dyer was appointed an Officer of the Order of Australia (AO) for his "distinguished service to the performing arts, particularly orchestral music as a director, conductor and musician, through the promotion of educational programs and support for emerging artists". In 2003 Paul was awarded the Australian Centenary Medal for his services to Australian society and the advancement of music and in 2010 the Sydney University Alumni Medal for Professional Achievement.
The other founder and current managing director is Bruce Applebaum.
History
The orchestra was formed in 1989 by Paul Dyer and Bruce Applebaum and their name pays tribute to the Brandenburg Concertos of J. S. Bach, who was central to the Baroque period.
Since its formation in 1989, the orchestra has become a leading voice in the Australian cultural landscape, distinguished from ensembles such as the Australian Chamber Orchestra and the Sydney Symphony Orchestra by its focus on historically informed performance.
The group under Paul Dyer and Bruce Applebaum's leadership commenced life as the Brandenburg Ensemble, then as the Brandenburg Orchestra, and finally the Australian Brandenburg Orchestra. Its first concert was in January 1990, in the Concert Hall of the Sydney Opera House.
Venues and Programming
The orchestra has used the City Recital Hall in Sydney as its main venue since the hall opened in 2000; the venue was custom made for the ABO. The group makes regular appearances in major concert halls and cultural venues in the east-coast cities of Brisbane and Melbourne.
The concerts include the music of well-known composers such as Mozart, Vivaldi and Handel, as well as lesser-known composers and rare works. The musicians always play from original-edition scores on replica 18th-century instruments.
The group has performed with guest artists such as Andreas Scholl, Emma Kirkby, Derek Lee Ragin, Andrew Manze, Philippe Jaroussky, Avi Avital, Dmitry Sinkovsky, Federico Guglielmo, Christina Pluhar and Elizabeth Wallfisch. Every December, the orchestra performs O Come All Ye Faithful and Stille Nacht in churches across Sydney, an annual highlight for many families in the city's eastern and northern suburbs.
Discography
Charting albums
Awards and nominations
AIR Awards
The Australian Independent Record Awards (commonly known as the AIR Awards) is an annual awards night to recognise, promote and celebrate the success of Australia's independent music sector.
Brandenburg Celebrates was nominated for Best Independent Classical Album at both the AIR Awards of 2015 and the AIR Awards of 2017.
ARIA Music Awards
The ARIA Music Awards is an annual awards ceremony that recognises excellence, innovation, and achievement across all genres of Australian music. They commenced in 1987.
The orchestra has been nominated for the ARIA Award for Best Classical Album for the following releases:

1993: The Brandenburg Orchestra
1995: Handel: Opera Arias (with Graham Pushee & Paul Dyer)
1998: Handel: Arias (with Yvonne Kenny & Paul Dyer)
1999: If Love's a Sweet Passion (with Sara Macliver & Paul Dyer)
2001: Vivaldi – Il Flauto Dolce (with Genevieve Lacey & Paul Dyer)
2005: Sanctuary
2009: Handel: Concerti Grossi Opus 6 (with Paul Dyer)
2010: Tapas – Tastes of the Baroque (with Paul Dyer)
2015: Brandenburg Celebrates
References
External links
Australian Brandenburg Orchestra (official website)
ARIA Award winners
Australian orchestras
Early music orchestras
Musical groups established in 1989
Music in Sydney
|
```csharp
// ==========================================================================
//  Squidex Headless CMS
// ==========================================================================

using System.Collections.Generic;

namespace Squidex.Translator.State.Old;

public class OldTranslatedText
{
    // Sorted containers keep the serialized output in a deterministic order.
    public SortedDictionary<string, string> Texts { get; set; } = [];

    public SortedSet<TextOrigin> Origins { get; set; } = [];
}
```
|
Kremis is a small town and commune in the Cercle of Yélimané in the Kayes Region of south-western Mali. In 2009 the commune had a population of 10,467.
References
External links
Communes of Kayes Region
|
Anta Rugāte (born Antonija Lūriņa on April 16, 1949 in Rundēni parish) is a Latvian journalist and politician. Rugāte served as a deputy of the Saeima.
References
Women deputies of the Saeima
1949 births
Living people
|
```javascript
ace.define("ace/theme/dawn.css",["require","exports","module"], function(require, exports, module){module.exports = ".ace-dawn .ace_gutter {\n background: #ebebeb;\n color: #333\n}\n\n.ace-dawn .ace_print-margin {\n width: 1px;\n background: #e8e8e8\n}\n\n.ace-dawn {\n background-color: #F9F9F9;\n color: #080808\n}\n\n.ace-dawn .ace_cursor {\n color: #000000\n}\n\n.ace-dawn .ace_marker-layer .ace_selection {\n background: rgba(39, 95, 255, 0.30)\n}\n\n.ace-dawn.ace_multiselect .ace_selection.ace_start {\n box-shadow: 0 0 3px 0px #F9F9F9;\n}\n\n.ace-dawn .ace_marker-layer .ace_step {\n background: rgb(255, 255, 0)\n}\n\n.ace-dawn .ace_marker-layer .ace_bracket {\n margin: -1px 0 0 -1px;\n border: 1px solid rgba(75, 75, 126, 0.50)\n}\n\n.ace-dawn .ace_marker-layer .ace_active-line {\n background: rgba(36, 99, 180, 0.12)\n}\n\n.ace-dawn .ace_gutter-active-line {\n background-color : #dcdcdc\n}\n\n.ace-dawn .ace_marker-layer .ace_selected-word {\n border: 1px solid rgba(39, 95, 255, 0.30)\n}\n\n.ace-dawn .ace_invisible {\n color: rgba(75, 75, 126, 0.50)\n}\n\n.ace-dawn .ace_keyword,\n.ace-dawn .ace_meta {\n color: #794938\n}\n\n.ace-dawn .ace_constant,\n.ace-dawn .ace_constant.ace_character,\n.ace-dawn .ace_constant.ace_character.ace_escape,\n.ace-dawn .ace_constant.ace_other {\n color: #811F24\n}\n\n.ace-dawn .ace_invalid.ace_illegal {\n text-decoration: underline;\n font-style: italic;\n color: #F8F8F8;\n background-color: #B52A1D\n}\n\n.ace-dawn .ace_invalid.ace_deprecated {\n text-decoration: underline;\n font-style: italic;\n color: #B52A1D\n}\n\n.ace-dawn .ace_support {\n color: #691C97\n}\n\n.ace-dawn .ace_support.ace_constant {\n color: #B4371F\n}\n\n.ace-dawn .ace_fold {\n background-color: #794938;\n border-color: #080808\n}\n\n.ace-dawn .ace_list,\n.ace-dawn .ace_markup.ace_list,\n.ace-dawn .ace_support.ace_function {\n color: #693A17\n}\n\n.ace-dawn .ace_storage {\n font-style: italic;\n color: #A71D5D\n}\n\n.ace-dawn .ace_string {\n color: 
#0B6125\n}\n\n.ace-dawn .ace_string.ace_regexp {\n color: #CF5628\n}\n\n.ace-dawn .ace_comment {\n font-style: italic;\n color: #5A525F\n}\n\n.ace-dawn .ace_heading,\n.ace-dawn .ace_markup.ace_heading {\n color: #19356D\n}\n\n.ace-dawn .ace_variable {\n color: #234A97\n}\n\n.ace-dawn .ace_indent-guide {\n background: url(data:image/png;base64,your_sha256_hashYLh/5+x/AAizA4hxNNsZAAAAAElFTkSuQmCC) right repeat-y\n}\n\n.ace-dawn .ace_indent-guide-active {\n background: url(\"data:image/png;base64,your_sha256_hashEwEAmpwYAAAAIGNIUk0AAHolAACAgwAA+f8AAIDpAAB1MAAA6mAAADqYAAAXb5JfxUYAAAAZSURBVHjaYvj///9/hivKyv8BAAAA//8DACLqBhbvk+/eAAAAAElFTkSuQmCC\") right repeat-y;\n} \n";
});
ace.define("ace/theme/dawn",["require","exports","module","ace/theme/dawn.css","ace/lib/dom"], function(require, exports, module){exports.isDark = false;
exports.cssClass = "ace-dawn";
exports.cssText = require("./dawn.css");
var dom = require("../lib/dom");
dom.importCssString(exports.cssText, exports.cssClass, false);
}); (function() {
ace.require(["ace/theme/dawn"], function(m) {
if (typeof module == "object" && typeof exports == "object" && module) {
module.exports = m;
}
});
})();
```
|
```csharp
using UnityEngine;
using UnityEditor;

namespace Skinner
{
    [CanEditMultipleObjects]
    [CustomEditor(typeof(SkinnerTrail))]
    public class SkinnerTrailEditor : Editor
    {
        SerializedProperty _source;
        SerializedProperty _template;
        SerializedProperty _speedLimit;
        SerializedProperty _drag;
        SerializedProperty _cutoffSpeed;
        SerializedProperty _speedToWidth;
        SerializedProperty _maxWidth;
        SerializedProperty _randomSeed;

        static GUIContent _labelSensitivity = new GUIContent("Sensitivity");

        void OnEnable()
        {
            _source = serializedObject.FindProperty("_source");
            _template = serializedObject.FindProperty("_template");
            _speedLimit = serializedObject.FindProperty("_speedLimit");
            _drag = serializedObject.FindProperty("_drag");
            _cutoffSpeed = serializedObject.FindProperty("_cutoffSpeed");
            _speedToWidth = serializedObject.FindProperty("_speedToWidth");
            _maxWidth = serializedObject.FindProperty("_maxWidth");
            _randomSeed = serializedObject.FindProperty("_randomSeed");
        }

        public override void OnInspectorGUI()
        {
            serializedObject.Update();

            // Changing the source or template requires a full reconfiguration.
            bool reconfigured = false;

            EditorGUI.BeginChangeCheck();
            EditorGUILayout.PropertyField(_source);
            EditorGUILayout.PropertyField(_template);
            reconfigured |= EditorGUI.EndChangeCheck();

            EditorGUILayout.Space();

            EditorGUILayout.PropertyField(_speedLimit);
            EditorGUILayout.PropertyField(_drag);

            EditorGUILayout.Space();

            EditorGUILayout.LabelField("Line Width Modifier (By Speed)", EditorStyles.boldLabel);
            EditorGUILayout.PropertyField(_cutoffSpeed);
            EditorGUILayout.PropertyField(_speedToWidth, _labelSensitivity);
            EditorGUILayout.PropertyField(_maxWidth);

            EditorGUILayout.Space();

            EditorGUI.BeginChangeCheck();
            EditorGUILayout.PropertyField(_randomSeed);
            reconfigured |= EditorGUI.EndChangeCheck();

            if (reconfigured)
                foreach (SkinnerTrail st in targets) st.UpdateConfiguration();

            serializedObject.ApplyModifiedProperties();
        }
    }
}
```
|
```typescript
export function pluralize(val: number, word: string, plural = `${word}s`) {
return [1, -1].includes(Number(val)) ? word : plural;
}
export function addPlural(value: number, word: string, plural = `${word}s`) {
return `${value} ${pluralize(value, word, plural)}`;
}
```
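A quick usage sketch of the two helpers above (repeated so the example runs standalone). Note that only exactly 1 or -1 count as singular, and that the default plural simply appends an "s", so irregular plurals must be passed explicitly:

```typescript
// The helpers from the snippet above, repeated so this example is self-contained.
function pluralize(val: number, word: string, plural = `${word}s`): string {
  return [1, -1].includes(Number(val)) ? word : plural;
}

function addPlural(value: number, word: string, plural = `${word}s`): string {
  return `${value} ${pluralize(value, word, plural)}`;
}

// Everything other than 1 or -1 (including 0) is treated as plural.
console.log(pluralize(1, "file"));         // "file"
console.log(pluralize(0, "file"));         // "files"
console.log(addPlural(-1, "degree"));      // "-1 degree"
console.log(addPlural(3, "box", "boxes")); // "3 boxes"
```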
|
```css
/* Vertical centering fluid blocks */
/* Vertical percentages are relative to container width, not height */
/* Clearfix for layouts */
/* Fixed navigation bar */
/* Avoid margin hacks with flexbox */
```
|
```python
import pytest
from ManageEnginePAM360 import Client, pam360_fetch_password, pam360_create_resource, \
pam360_create_account, pam360_update_resource, pam360_update_account, pam360_fetch_account_details, pam360_list_resources, \
pam360_list_accounts, pam360_update_account_password, pam360_fetch_resource_account_id
from test_data.context import FETCH_PASSWORD_CONTEXT, CREATE_RESOURCE_CONTEXT, \
CREATE_ACCOUNT_CONTEXT, UPDATE_RESOURCE_CONTEXT, UPDATE_ACCOUNT_CONTEXT, FETCH_ACCOUNT_DETAILS_CONTEXT, \
LIST_ALL_RESOURCE_CONTEXT, LIST_ALL_ACCOUNTS_CONTEXT, UPDATE_ACCOUNT_PASSWORD_CONTEXT, FETCH_RESOURCE_ACCOUNT_ID_CONTEXT
from test_data.responses import FETCH_PASSWORD_RAW_RESPONSE, \
CREATE_RESOURCE_RAW_RESPONSE, CREATE_ACCOUNT_RAW_RESPONSE, UPDATE_RESOURCE_RAW_RESPONSE, UPDATE_ACCOUNT_RAW_RESPONSE, \
FETCH_ACCOUNT_DETAILS_RAW_RESPONSE, LIST_ALL_RESOURCE_RAW_RESPONSE, LIST_ALL_ACCOUNTS_RAW_RESPONSE, \
UPDATE_ACCOUNT_PASSWORD_RAW_RESPONSE, FETCH_RESOURCE_ACCOUNT_ID_RAW_RESPONSE
FETCH_PASSWORD_ARGS = {
    "resource_id": "1",
    "account_id": "1"
}
CREATE_RESOURCE_ARGS = {
    "resource_name": "SOUTH-FIN-WINSERQA-09",
    "resource_type": "Windows",
    "account_name": "administrator",
    "password": "QA!K>35Hgg(x"
}
CREATE_ACCOUNT_ARGS = {
    "resource_id": "1",
    "account_name": "admin",
    "password": "t8BRq)<6h9g1"
}
UPDATE_RESOURCE_ARGS = {
    "resource_id": "1",
    "resource_name": "SOUTH-FIN-WINSERQA-09",
    "resource_url": "path_to_url"
}
UPDATE_ACCOUNT_ARGS = {
    "resource_id": "1",
    "account_id": "1",
    "account_name": "admin",
    "notes": "Windows server resources reserved for testing API"
}
FETCH_ACCOUNT_DETAILS_ARGS = {
    "resource_id": "1",
    "account_id": "1"
}
LIST_ALL_ACCOUNTS_ARGS = {
    "resource_id": "1"
}
UPDATE_ACCOUNT_PASSWORD_ARGS = {
    "resource_id": "1",
    "account_id": "1",
    "new_password": "A8>ne3J&0Z",
    "reset_type": "LOCAL",
    "reason": "Password Expired",
    "ticket_id": "1"
}
FETCH_RESOURCE_ACCOUNT_ID_ARGS = {
    "resource_name": "SOUTH-FIN-WINSERQA-09",
    "account_name": "administrator"
}
@pytest.mark.parametrize('command, args, http_response, context', [
    (pam360_fetch_password, FETCH_PASSWORD_ARGS, FETCH_PASSWORD_RAW_RESPONSE, FETCH_PASSWORD_CONTEXT),
    (pam360_create_resource, CREATE_RESOURCE_ARGS, CREATE_RESOURCE_RAW_RESPONSE, CREATE_RESOURCE_CONTEXT),
    (pam360_create_account, CREATE_ACCOUNT_ARGS, CREATE_ACCOUNT_RAW_RESPONSE, CREATE_ACCOUNT_CONTEXT),
    (pam360_update_resource, UPDATE_RESOURCE_ARGS, UPDATE_RESOURCE_RAW_RESPONSE, UPDATE_RESOURCE_CONTEXT),
    (pam360_update_account, UPDATE_ACCOUNT_ARGS, UPDATE_ACCOUNT_RAW_RESPONSE, UPDATE_ACCOUNT_CONTEXT),
    (pam360_fetch_account_details, FETCH_ACCOUNT_DETAILS_ARGS, FETCH_ACCOUNT_DETAILS_RAW_RESPONSE, FETCH_ACCOUNT_DETAILS_CONTEXT),
    (pam360_list_resources, {}, LIST_ALL_RESOURCE_RAW_RESPONSE, LIST_ALL_RESOURCE_CONTEXT),
    (pam360_list_accounts, LIST_ALL_ACCOUNTS_ARGS, LIST_ALL_ACCOUNTS_RAW_RESPONSE, LIST_ALL_ACCOUNTS_CONTEXT),
    (pam360_update_account_password, UPDATE_ACCOUNT_PASSWORD_ARGS, UPDATE_ACCOUNT_PASSWORD_RAW_RESPONSE,
     UPDATE_ACCOUNT_PASSWORD_CONTEXT),
    (pam360_fetch_resource_account_id, FETCH_RESOURCE_ACCOUNT_ID_ARGS, FETCH_RESOURCE_ACCOUNT_ID_RAW_RESPONSE,
     FETCH_RESOURCE_ACCOUNT_ID_CONTEXT),
])
def test_manageengine_pam360_commands(command, args, http_response, context, mocker):
    """Unit test

    Given
    - demisto args
    - raw response of the http request
    When
    - mock the http request result
    Then
    - create the context
    - validate the expected_result and the created context
    """
    client = Client(server_url="path_to_url", app_token="B698EF92-B151-4E5C-969D-CA7B50DF4E9D",
                    verify_certificate=False, proxy=False)
    mocker.patch.object(Client, '_http_request', return_value=http_response)
    outputs = command(client, **args)
    results = outputs.to_context()
    assert results.get("EntryContext") == context
```
|
Lisa Fowler (also Shaw) is a fictional character from the BBC soap opera EastEnders, played by Lucy Benjamin. The character was introduced as a "home-wrecking blonde" by executive producer Matthew Robinson on 7 December 1998. The character made her initial departure on 3 October 2002, when she was written out by producer Louise Berridge.
Berridge later reintroduced the character on two occasions in 2003. Despite several reports stating she would return, Benjamin did not reprise the role until 2010 when Bryan Kirkwood brought the character back for a single episode on 5 August 2010. The actress reprised the role again in 2017 for a brief stint, returning on 21 July 2017 and departing on 3 August 2017. On 27 May 2019, it was announced that Benjamin would once again be reprising the role for a "specific storyline" and Lisa returned on 2 September 2019. She departed once again on 24 January 2020. On 23 May 2023, it was announced that Benjamin had signed up to reprise the role for another short stint later in the summer, alongside Lisa's granddaughter Peggy Taylor. They returned on 10 July and departed on 20 July 2023.
Lisa is characterised as "a loyal, romantic Earth Mother" who is "feisty, independent and ambitious". Initially, show scriptwriters and Mal Young, BBC controller of drama, doubted Benjamin's casting in the role. However, after a volatile relationship with Phil Mitchell (Steve McFadden), Lisa became an "eternal victim". The character became instrumental in one of EastEnders' most highly publicised and anticipated storylines, dubbed "Who Shot Phil?", in 2001, in which an unknown assailant guns down Phil. An estimated 22 million viewers watched Lisa confess to Phil's attempted murder, causing the third-largest power surge on record. Other storylines included Lisa's pregnancy with Phil's child, in which she opted not to tell him and claimed that her partner, Mark Fowler (Todd Carty), was the father, giving birth to Louise Mitchell (Rachel Cox/Brittany Papple/Tilly Keeper); in the show's Christmas 2001 episodes, Louise's parentage is revealed. In Lisa's returns, the storylines have centred on her daughter Louise's pregnancy, her falling out with best friend Mel Owen (Tamzin Outhwaite) and coping with Mel's death, a feud with Sharon Watts (Letitia Dean) and partially making amends with Phil.
Storylines
1998–2010
Lisa Shaw first appeared in Albert Square as a trainee market inspector, but clashes with her boss Michael Rose (Russell Floyd) because of her poor time-keeping. However, she starts to fall for him, and the pair strike up an affair. Lisa soon begins to feel guilty, and demands that Michael chooses between her and his wife Susan (Tilly Vosburgh). Michael initially chooses Lisa, but does not tell Susan, and Lisa threatens to tell her. To escape the breakdown of his marriage, he flees Walford with Susan in February 1999, leaving heartbroken Lisa behind.
She later makes friends with Mel Healy (Tamzin Outhwaite) and moves in with Mark Fowler (Todd Carty). Although Mark has feelings for her, Lisa does not notice, having a fling with Gianni di Marco (Marc Bannerman) and a relationship with Phil Mitchell (Steve McFadden), resulting in Lisa's pregnancy. Phil asks her to have an abortion, but she refuses, so he asks her to move in after coming round to the idea, but she miscarries, and blames Phil. Lisa loses her job when Ian Beale (Adam Woodyatt) reports her for being absent, so she becomes dependent on Phil. He is happy with the situation, but she loses her confidence and becomes jealous of Phil's former wife, Kathy Mitchell (Gillian Taylforth) and son Ben (Morgan Whittle), hiding letters and a video from them. This irritates Phil, and he has sex with Mel when she is upset, following a fight with Lisa on Christmas Day. Regretting this, Mel agrees not to tell Lisa, but does so when Lisa says she intends to get pregnant again. Devastated, Lisa ends her relationship with Phil and moves in with Mark, but she is already pregnant. She keeps this secret, as Phil keeps belittling her, and on Mel and Steve Owen's (Martin Kemp) wedding night, Phil says that he never loved her, and suggests Mel give her tips in the bedroom. In revenge, Lisa steals a gun from the e20 nightclub and shoots Phil. He survives, and confronts her after leaving hospital, but realises that he drove her to it, and frames his enemy Dan Sullivan (Craig Fairbrass) instead. Wanting to keep Phil away from her child, Lisa and Mark marry, claiming that he is the baby's father. Everyone believes it, and their romance becomes real. However, Lisa gives birth to her baby, named Louise, and Lisa tells Sharon Watts (Letitia Dean) that Phil is Louise's father. Sharon, however, tells Phil that Louise is his daughter, and he forces them to accept he will be part of Louise's life. 
Despite initial animosity, when Lisa sees Phil with Louise, old feelings resurface, and they eventually reconcile. Following an affair, Lisa leaves Mark, and goes to live with Phil.
However, Phil wants Louise, not Lisa, and when she realises this, she and Louise emigrate to Portugal. Phil finds Lisa, and brings Louise home after blackmailing Lisa about shooting him, making her think that she is unstable and an unfit mother. Phil leaves her standing on the edge of a cliff, and she is presumed dead. On Phil's wedding day to Kate Morton (Jill Halfpenny), Lisa returns to reclaim her daughter. He is difficult, leading her to plan to shoot him again, but Den Watts (Leslie Grantham) persuades her to let him do it. He and his son, Dennis Rickman (Nigel Harman), frame Phil for armed robbery, and Lisa gains custody when he is imprisoned. She leaves the Square with Louise.
Several years later, Jack Branning (Scott Maslen), a former policeman, tells Phil that Lisa and Louise are living in South East London. After seeing Louise call another man "daddy", Phil decides that she is better off with Lisa. Two years later, Louise arrives alone in Walford, saying that Lisa has gone on holiday but not returned. Louise is taken into care, but when Phil discovers this, he has a DNA test to prove Louise is his daughter, and gains custody of her.
Later that year, Phil is visited by a social worker, Derek Evans (Simon Lowe), who says that Lisa has made an application to see Louise. Phil does not want Lisa to have contact with Louise, but Phil's mother Peggy Mitchell (Barbara Windsor) takes her to see Lisa on Louise's request. Peggy slaps Lisa for abandoning her daughter, and Lisa reveals she had a breakdown and thought Louise might be better off without her, but the neighbour she left Louise with promised to look after her. Lisa worries that Phil might hurt Louise physically after Phil has hit Peggy, who is unable to assure Lisa that this will not happen.
Peggy allows Louise to stay with Lisa permanently, as long as Phil can visit her. Phil steals the social worker's bag to get Lisa's address, and drives there to confront her, but finds the house abandoned and empty.
2017–2023
Seven years later, Louise (now Tilly Keeper) is burnt by bullies Madison Drake (Seraphina Beh) and Alexandra D'Costa (Sydney Craven), and Lisa visits her in hospital. Lisa and Sharon argue, but Sharon allows Lisa to stay. She tells Sharon that Louise wanted to live with Phil when she got a new boyfriend, which she admits was a relief. Lisa tells Phil she has had help with her life and has changed, but he ejects her from the hospital, telling her to stay away from Louise. She then stands up to Phil, and he allows her to visit Louise. Lisa then starts talking strangely. She sees her therapist, and lies that she has been doing fun things with Louise.
Lisa takes the keys to Phil's house and lets herself in, talking to and smashing photos, but cuts herself on glass. A doctor allows Lisa to take Louise outside of the hospital, but Lisa tells Louise she has been discharged, and takes her to a train station. Phil and Sharon inform the police that Louise is missing, and they trace Lisa through a cash withdrawal. Phil then discovers that Lisa has a mental health team. Lisa takes Louise to a hotel, and prevents her from using the phone by cutting the cord, despite Louise being in agony. Louise realises that Lisa is not taking her medication. When Lisa allows Louise to go, she chooses to stay, and comforts Lisa when she is distressed.
Phil tracks them down, but Lisa hits him over the head with a phone, while Louise passes out. Sharon realises that Lisa's problems are all Phil's fault, and Lisa tells Louise they will go to the hospital together. Later, Louise blames herself for Lisa's condition, telling Phil she left her despite knowing she was ill.
In August 2019, a heavily pregnant Louise and her fiancé Keanu Taylor (Danny Walters) flee Walford after Phil's son Ben frames Keanu for attacking Phil, when the real culprit was Stacey Slater (Lacey Turner). Ben convinces Mel to give him Lisa's address, and Mel lies to Ben about Lisa's whereabouts; although Mel eventually reveals her location, Lisa is not found.
Lisa returns in September 2019 to seek Mel's help to get Keanu out of the country. Lisa suggests that she, Louise, Mel, Keanu and Mel's son Hunter Owen (Charlie Winter) flee together, but Hunter escapes and holds The Queen Vic hostage. Hunter takes Louise, threatening to shoot her if the police do not give into his demands, but he is later shot and killed by a marksman.
Phil asks Louise and Keanu to move back in with him, which upsets Lisa, as she had planned for her and Louise to live together. A concerned Lisa, recognising the signs, tries to support Louise's best friend Bex Fowler (Jasmine Armfield), advising her that she does not have to go to Oxford University if she does not want to, but Bex ignores her, pretending she is fine; Bex later overdoses, but eventually recovers.
Lisa and Mel fall out over Sharon, and also over Mel's increasing obsession with Louise's pregnancy. However, when Mel is killed, Lisa is devastated and, genuinely believing Sharon killed Mel, she exposes to Phil the truth she learned from a vengeful Mel that Phil is not Sharon's baby's father. Lisa later begins a feud with Sharon, and on the day of Mel's funeral, she bans the Mitchells from attending, and reveals to the whole pub that Sharon's baby is not Phil's.
After residents of the Square believe Lisa is mentally ill again, she checks herself into a unit for a break. However, Sharon reveals to Lisa that Keanu is the father as Louise gives birth. Upon hearing this, Lisa leaves the unit, intending to tell Phil. Once Lisa sees them as a family, she decides not to tell Phil and Louise, but warns Keanu that if he ever hurts Louise, she will reveal everything. The birth of their granddaughter, Peggy Taylor, brings Lisa and Phil together, and they make amends.
Phil, however, becomes suspicious, and starts to believe Lisa. Mistakenly thinking Phil arranged Keanu's "murder", Lisa, Louise, Peggy and Phil flee to Portugal. In August 2021, Phil receives a call from Louise saying that Lisa was involved in a car accident in France, after a male driver drove into the back of her. Phil goes to care for Louise and Peggy while Lisa recovers.
In July 2023, Lisa returns to Walford with Peggy, having become her legal guardian due to Louise's "uncontrollable behaviour" as a result of Keanu's affair with Sharon. She refuses to let Keanu see Peggy unless he gives her large amounts of money for maintenance; these demands continue, and after various arguments with Phil and his fiancée Kat Slater (Jessie Wallace), it is then discovered that Lisa has been lying about Louise, and has been using the money to fund a gambling addiction. Keanu then breaks into Phil's safe and steals money from him to hand over to Lisa.
Phil discovers his money is missing and assumes it was Lisa. Kat calls the police; Lisa is arrested and held for questioning. Before Lisa is released, Keanu confesses to Phil that it was him who stole the money because Lisa was blackmailing him. Sharon discovers Lisa has a gambling problem, and tells Phil. Phil then refuses to hand over Peggy until Lisa gets her act together. A destitute Lisa is found sleeping on the square bench by Sharon, who invites her to stay with her for a couple of nights.
Sharon and Keanu have a row over these arrangements; Keanu then sees Lisa's bag, and hides her and Peggy's passports so that they will miss the flight back to Portugal the following day. Lisa searches Sharon's house, and furiously accuses Keanu of sabotage. With the help of Sharon and Martin Fowler (James Bye), Lisa finds the passports, and secretly orders a taxi to the airport. Keanu then realises Lisa is leaving with Peggy, and chases after the taxi, desperately watching on as Lisa departs with Peggy for Portugal.
Creation and development
Casting
The character's arrival was announced by the press in November 1998. She was one of Executive Producer Matthew Robinson's introductions, and was described as a "home-wrecking blonde". According to actress Lucy Benjamin, who played Lisa, the scriptwriters had doubts about casting her in the role: "I just knew in my heart that I'd be able to do it well. And I desperately wanted the security of a regular income. But after the audition they kept me hanging on for such a long time. They tortured me by saying yes then saying no. It's the only job I've ever cried about when I thought I didn't have it." Benjamin said that she was overjoyed when she was eventually given the part. In 2010, Matthew Robinson discussed Lisa's introduction with Walford Web, suggesting that Lisa and several other new characters introduced were an attempt to fill in character gaps in the soap resulting from a large number of axings. Lisa was conceptualised as part of a "totty" contingent. He indicated that the casting of Benjamin had been awkward, as BBC controller of drama Mal Young was hesitant about her hiring. He eventually relented and, according to Robinson, admitted after seeing her in the role that "she was, after all, a great asset to the show".
Personality
Prior to the 2000s, Lucy Benjamin described Lisa Shaw as "feisty, independent and ambitious". Later, she described her as "a loyal, romantic Earth Mother". It was revealed through dialogue on-screen that Lisa had nursed her mother through terminal cancer, and that she had not lost her virginity until the age of 28, when she came to Walford.
The character's demeanour altered somewhat circa 2000. Due to her romantic pairing with Phil Mitchell, she became "mentally battered". Rupert Smith, author of EastEnders: 20 years in Albert Square, classified the character as an "eternal victim": one who endures misfortune and misery, and is an endless sufferer. He adds, "She was intelligent, beautiful and young — and so, of course, Lisa had to lie down and let men walk all over her [...] a snivelling, suicidal wreck." Benjamin has noted that her character was "constantly crying".
Relationship with Phil Mitchell
Lisa became more prominently featured in 1999, when she was paired romantically with Phil Mitchell (Steve McFadden). The relationship was scripted as problematic, and included storylines about miscarriage, emotional and mental abuse, and infidelity, when Phil slept with Lisa's best friend, Mel Healy, (Tamzin Outhwaite).
One of EastEnders' most highly anticipated storylines involved the couple. "Who Shot Phil?" saw Phil Mitchell gunned down outside his home in March 2001 in a Dallas-style whodunnit mystery. Various key characters were in the frame for the deed, and viewers were left guessing for weeks as to which one was the real culprit. Several outcomes were allegedly filmed, and it was reported that only a few TV executives knew the identity of the would-be assassin; even the actors were kept in the dark. A spokesman commented, "The cast are only getting their own scripts. They are not being told anyone else's storylines. Not even Phil knows who shot him. It's top secret." Scriptwriters were reportedly given private security after a writer's laptop was stolen, in what was believed to be an attempt to gain the identity of the assailant. The storyline captivated the public's imagination, leading to thousands of bets being placed at the bookies across the UK: bookmaker William Hill said there were about 50,000 bets on who was responsible for the shooting.
An estimated 22 million viewers watched EastEnders on 5 April 2001 to find out that Lisa – Phil's spurned ex-girlfriend – was the culprit. The episode caused the third-largest power surge on record, and the Liverpool and Barcelona UEFA Cup semi-final was postponed for 15 minutes to accommodate a special 40-minute edition of the soap.
Lucy Benjamin told the Daily Star that keeping the secret that her character was responsible for the attempted murder had been the "worst two months of her life". She commented to The Mirror, "For two months I've carried this secret and it's been tough, really hard. I've had to lie to my colleagues — all the suspects were told to say it wasn't us. I've become very good at lying! The lies went on and on. The first person I told that it wasn't me was Todd Carty, who plays Mark [Fowler]. I thought he was a good chap to try my lies on to see if he believed it! And he did. I was delighted that it was me. I think Lisa had every reason to do it". At the time, Benjamin expressed fear that the high-profile storyline would spell the end of her character, who she had thought would be imprisoned. However, in a further plot twist, Phil framed Dan Sullivan (Craig Fairbrass) for the shooting. Subsequent storylines in Lisa and Phil's narrative centred on Lisa's pregnancy. In the storyline, Lisa, secretly expecting Phil's baby, married Mark, and claimed the baby was his. This secret was revealed in the Christmas Day 2001 episodes.
Departure (2002)
In June 2002, Lucy Benjamin was axed from her role of Lisa, whose departing storyline revolved around the rekindling of her romance with Phil. After beginning an affair, she left Mark, taking her daughter to live with the Mitchells. However, Lisa realised she had made a mistake when Phil and his family began excluding her from her daughter's life; she absconded with the baby in October 2002. In a subsequent plot, Phil retrieved his daughter off-screen, chasing Lisa to Portugal and returning with Louise.
Guest stints (2003–2017)
In January 2003, the BBC announced that Lucy Benjamin would reprise the role of Lisa for a special set of episodes that revealed Lisa's fate in flashbacks. The episodes were filmed on-location in and around Albufeira on the Portuguese Algarve. It was claimed that several endings to the episodes had been filmed, and that the outcome was a "closely guarded secret". Despite initial claims that Lisa would be killed off in the episodes, this did not occur; instead, Lisa gave Phil custody of Louise after he convinced her she was unstable, and threatened to tell the police that she had once shot him if she returned to England.
However, in June 2003, it was confirmed that Lisa would once again be returning to the serial. Benjamin said, "I couldn't believe it when I got the call a few months ago to ask if I would return to Walford. I didn't think Lisa would give up baby Lou without one last fight." The return storyline was temporary, allowing Lisa to usurp Phil and once again take custody of her daughter, with the help of Den Watts (Leslie Grantham). The character then disappeared, exiting in November 2003.
Following this exit, numerous press reports suggested that the character would be returning to the serial again; these turned out to be false, with an EastEnders spokesperson saying in 2006 and 2007 that there were no current plans to bring the character back.
When asked if she would return in 2004, Benjamin said, "I loved Lisa. She was a great character and I loved playing all those story lines. It was a wonderful opportunity for me but all that angst and that drama can sometimes get to you. She hasn't been killed off. Loads of characters are revived and brought back but I don't know if it's something I'd want to do again at the moment. It still feels like only yesterday I was there and it's good to kind of recharge your batteries and get out there and do other things. But I love the show and I'd never say never! And I liked playing Lisa. I thought she was a great character so you just don't know."
In April 2010, it was reported that Lisa would return for a single episode later in the year in a bid to retrieve Louise from Phil, who gained custody of her earlier in the year. Benjamin said of her return: "I'm really looking forward to returning to EastEnders for this episode. It will be great to see some familiar faces and work alongside old friends again." Executive producer Bryan Kirkwood commented, "Lucy's character Lisa was responsible for one of the biggest cliffhanger episodes in EastEnders, so it's a real treat to have her back on screen," while a spokesperson for the show added, "Lisa Fowler was a major part of Phil's life – she knows him as well as anyone, so she won't be happy about him looking after his daughter. With their history, you know that this storyline is going to be an explosive episode in the Mitchells' history." The episode was broadcast on 5 August 2010.
Discussing her brief return, Benjamin said, "What I liked about doing this storyline was that it did explain where Lisa had been. When I was watching it, I was thinking 'Where's her mother? Where is Lisa? It's just ridiculous'. But it does make sense. She is mentally unstable and she does have times like that and the little girl is old enough to make her own decisions and say, 'I want to live with my dad now'. So she had to let her go. I liked that I was able to come back and defend myself because Lisa was kind of being slaughtered in the Square for being a rubbish mum!"
It was reported on 20 July 2017 that Benjamin had reprised the role for a "surprise" appearance; she was expected to be on screen until August. Lisa returns to care for Louise after she is seriously injured, arriving at the hospital in the closing moments of the episode. Steve McFadden (Phil) also returned to the serial in the episodes following Lisa's return, and a show insider commented, "When that hospital door starts to open, [the audience will] be on tenterhooks to see who has rushed to help Louise. When it turns out to be Lisa, rather than Phil, there will be gasps of shock. It's such a brilliant surprise." The show made no official confirmation of the reports, although Lisa returned in the episode broadcast in the United Kingdom on 21 July 2017.
It was subsequently confirmed that Lisa would be returning for a "brief stint", and would share scenes with Phil, following his return. On her return, Benjamin commented, "It was great to be back at EastEnders as it has been such a long time since I had been there. I loved being back in Elstree for the few weeks I was there, seeing old friends again and working with really talented people. I loved every minute of it."
Benjamin's agent was approached by the show with a potential return for Lisa, which intrigued Benjamin, who had reservations about returning, as she felt the character was "a chapter that was closed". Benjamin was later contacted personally by Liza Mellody, the show's story producer, who explained a possible storyline that appealed to Benjamin. She found the storyline "too good an offer to refuse" and agreed to the return, safe in the knowledge that "it was absolutely worth coming back for".
The actress struggled to keep her return a secret, and only informed her mother and husband. She called the secrecy at the studios "all very cloak and dagger", revealing that she was told to enter through a separate entrance and wear oversized sunglasses. On show scripts, Lisa's lines were listed under the character 'Sam', and whenever Benjamin was filming, the monitors that relay footage to other sets were switched off.
Benjamin felt "a little apprehensive" on her first day back filming with the serial, but settled back in after 20 minutes, and enjoyed working with McFadden, Letitia Dean (Sharon Mitchell), Natalie Cassidy (Sonia Fowler), and Dean Gaffney (Robbie Jackson) again, which she compared to "being with old friends". Benjamin teased "great" scenes between Lisa and Sharon, Phil and Sonia, opining that it is "reminiscent of historical stuff". The actress also stated that she missed the workload associated with filming the serial and commented, "It was great to get my teeth back into doing what I do, really." Benjamin developed "a really lovely connection" with Tilly Keeper, who portrays Louise Mitchell, and found her "marvellous" to work with, calling her a "professional". They had never met before working together on EastEnders. The actress hoped her return has "a great impact" as she felt a "sense of responsibility" with portraying the storyline. She called the storyline dramatic and explained, "you'll see a lot of old Lisa being played out".
Benjamin explained the reasons for Lisa's return, saying she is back because Sonia contacted her, who "feels Louise needs her [Lisa] to be there" and Louise asks for Lisa in "a state of upset", where "she's a bit delirious". This allows Lisa the "choice about whether or not she wants to be involved." It was reported that Lisa would not get "a great reception", with others believing that Lisa has "been a bad mum", but Benjamin defended Lisa "not being there for Louise" as she "hasn't been informed about what's going on [...] and if things were going terribly wrong she'd know about it." Lisa has been absent from Louise's life due to "her own personal issues" and Lisa "thinks it's in Louise's best interest to have stayed away." Sharon will not "be particularly keen to see her for her own reasons", and Lisa staying away is seen as the best for Phil and his side of the family.
Benjamin said when it comes to Lisa and Phil, "there's always going to be drama [...] there's going to be fireworks" as Phil "isn't Lisa's favourite person" and "in true Lisa and Phil fashion, it will be quite explosive." Lisa sees Phil as the person that "triggers episodes of things going wrong for her" and the one who has "caused all of the problems going on her life", caused by him taking Louise away from her, which leads to her wariness around him, as she knows "what he's capable of doing." Benjamin believes Lisa and Phil will never "be the best of friends or see eye-to-eye", but Louise should be the person they put first and be adults for. Lisa sees Sharon as "the right woman for Phil" and regards her as "lovely", so she needs Sharon as an "ally" in order to help her relationship with Louise, but knows Sharon is "no fool", who "will stand her ground."
Benjamin added that Lisa "feels very guilty" about leaving Louise, and being excluded from her life, but her "driving force is her unconditional love for her child"; she has "hard times ahead with Louise, explaining why she hasn't been there. I think she just hopes her daughter will need her and see she's better off having her mum in her life."
Reintroduction (2019)
On 27 May 2019, it was announced that Benjamin would once again be reprising the role for a longer stint. Lisa, who was last seen in 2017, would be returning after discovering that her daughter Louise is pregnant. The friendship between Lisa and Mel Owen (Tamzin Outhwaite) would also be revisited, a friendship that was prominent during their original tenure in the soap.
On returning to the soap, Benjamin said: "Going back to EastEnders feels like going home. I love and adore playing Lisa, and am looking forward to seeing what's in store for her this time." The show's executive producer Jon Sen added: "Lisa is one of the most enduringly popular characters in the history of the show. We're chuffed Lucy has agreed to come back for a thrilling storyline that takes us into the heart of her past on the Square."
Departure (2020)
Benjamin confirmed during an interview on This Morning on 8 January 2020 that she had filmed her final scenes as Lisa, but was always "open for a return". In the interview, she also said that there would be no reunion between Lisa and Phil.
On Lisa's departure storyline, Benjamin said: "All Lisa knows is Keanu is the father of Sharon's baby and he's left the Square. That is all still to play out, the fact Lisa finds out more sinister things have happened to Keanu". Speaking about whether the door had been left open, Benjamin confessed: "I don't even know if I can talk about that. I knew I was going back for a stint and I've completed that stint." Benjamin commented, "It's so difficult. Lisa doesn't seem to go away, she's the bad penny."
Benjamin also said that she had enjoyed working with Tamzin Outhwaite, who played her on-screen best friend Mel Owen, once again. She said: "When Jon Sen put the call into me and said this is the storyline I said fantastic I get to work with Tamzin again and being with my mate was a glorious thing to get involved in. It was a massive storyline. A really big deal and done really well. Up until Mel died we had a great time exploring those characters and the dynamics between them. We got people who had never witnessed Mel and Lisa together seeing them for the first time."
Reception
In 2002, a survey conducted by Whitaker's revealed that 11% of British people questioned could not name a single world leader, but nearly half could list five characters in EastEnders. The most frequently named was Phil Mitchell (44%), followed by Mark Fowler (40%), Pauline Fowler (30%), Peggy Mitchell (28%) and Lisa Fowler (24%). Daily Mirror television critic Ian Hyland has described Lisa as miserable, suggesting in 2002 that she was "red-hot favourite to take over [Pauline Fowler's] misery mantle." Jamie McCallum from The Guardian mocked the character and her relationship with Phil in 2000, stating, "We should, however, pay tribute to Lisa. This was the latest in a string of gripping dilemmas, including Should I Date Phil?, Should I Shag Phil? and Should I Give Birth to a Descendant of Phil? For one who spends so much time deliberating, that girl makes a lot of duff decisions."
See also
List of EastEnders characters (1998)
References
External links
EastEnders characters
Television characters introduced in 1998
British female characters in television
Fictional characters with psychiatric disorders
Fictional victims of domestic abuse
Fictional kidnappers
Fictional criminals in soap operas
Beale family (EastEnders)
Fictional gamblers
Fictional blackmailers
|
Troika or troyka (from Russian тройка, meaning 'a set of three') may refer to:
Cultural tradition
Troika (dance), a Russian folk dance
Troika (driving), a traditional Russian harness driving combination, a cultural icon of Russia
Politics
Triumvirate, a political regime ruled or dominated by three powerful individuals, usually called a troika in the context of the Soviet Union and Russia
Troika (Soviet leadership), one of the temporary triumvirates in the Soviet Union
European troika, the decision group formed by the European Commission (EC), the European Central Bank (ECB) and the International Monetary Fund (IMF)
OSCE troika, the leadership of the Organization for Security and Co-operation in Europe: the chairman-in-office and the previous and incoming chairmen-in-office
NKVD troika, a commission of three for express judgment in the Soviet Union during the time of Joseph Stalin
Troika (Tunisia), a three-party alliance that governed Tunisia from 2011 to 2014
"The troika" during the U.S. presidency of Ronald Reagan: James Baker, Ed Meese, and Michael Deaver
Troika of tyranny, a term coined by John R. Bolton for three Central and South American nations (Cuba, Nicaragua and Venezuela)
Arts and entertainment
Literature
The Troika, a 1997 novel by Stepan Chapman
Troika, a 1979 novel by David Gurr
Troika, a 2011 novella by Alastair Reynolds
Troika, the fourth novel of the Indigo Saga by Louise Cooper
Tale of the Troika, a novel by Russian authors Arkady and Boris Strugatsky
Music
"Troika", the first movement of The Blizzard suite by Georgy Sviridov (1974, movie version — 1964)
Troika (Julia Kogan album), 2011
Troika (D'Virgilio, Morse & Jennings album), 2022
"Troika", the fourth movement of Suite from Lieutenant Kijé by Prokofiev
"Troika", the eleventh movement of The Seasons by Tchaikovsky
Other uses in arts and entertainment
Troika (video game), a 1991 video game
Troika, a painting by Vasily Perov
Troika (series), a 2010 Israeli-Russian short TV series by Leonid Prudovsky
Troyka, a British jazz band featuring pianist Kit Downes
Troika (1930 film), a German drama film
Troika (1969 film), an American experimental comedy film by Fredric Hobbs
Businesses
Troika Games, a video games developer 1998–2005
Troika Pottery, a Cornish pottery company 1963–1983
Troika Dialog, the former name of Sberbank CIB, a multinational investment banking and asset management firm
Other uses
Troika (chocolate), a confection by Nidar AS, Norway
Troika (crater), on Mars
Troika (ride), an amusement park ride
The Troika (Kuala Lumpur), a condominium in Kuala Lumpur, Malaysia
Troika card, a contactless reusable card designed to pay for public transport in Moscow, Russia
VAZ-2103, a Soviet sedan car nicknamed Troika
See also
Trojka (disambiguation)
Three of a kind (disambiguation)
Threesome (disambiguation)
Troyca, a Japanese animation studio
|
```javascript
// Mark transpiled classes as __PURE__ so that UglifyJS can remove them
module.exports = function visitor() {
return {
visitor: {
ClassExpression: function ClassExpression(path) {
path.addComment('leading', '#__PURE__');
}
}
};
};
```
|
Craigmore is a large suburb north of Adelaide, South Australia. It is in the City of Playford local government area, just east of Elizabeth and south of Gawler.
History
Craigmore is within the traditional territory of the Aboriginal Kaurna people of the Adelaide Plains. European settlement in the area began in the early 1850s, and the wider district was known as Smithfield after the township established by John Smith. Blair Farm was established by Gavin Scoular on land north of what is now Uley Road and east of Adams Road; through a series of land purchases from 1853 to 1867, Scoular built his holding up to a total of 577 acres. Thomas Hogarth, a member of the South Australian Legislative Council from 1866 until his retirement in 1885, established a property called Blair Place in 1850 on land south of and adjacent to Smith's Creek, east of what is now known as Adams Road. Many of the early settlers of the Smithfield district had emigrated from Scotland, and the suburb's name reflects those roots: in Scottish usage, "craig" means "rocky hill" and "more" means "big".
The modern development of Craigmore began in the 1970s with the construction of State Housing Trust estates. During 1975, the southeastern part of what is now Craigmore was built as a private development named Blair Park. Further private development occurred during the mid-1980s, the late 1990s and through the 2000s, and urban infill is still incomplete. This development has made Craigmore one of Adelaide's longest suburbs, stretching 3.75 kilometres parallel with Adams Road between Kakuna Crescent and Arthur Street.
Geography
Craigmore is situated on foothills approximately 29 km by road from the Adelaide GPO. Although a suburb of Adelaide and part of the Adelaide metropolitan area, it is only 11 km by road from Gawler. Most dwellings are built up to the start of the hills (One Tree Hill), which are used for cattle grazing and wine growing.
Adams Creek runs through the middle of Craigmore, and the suburb's elevation ranges from 86 metres to 149 metres at its highest point.
Demographics
The 2006 census recorded Craigmore as having a population of 10,319. Residents have mixed incomes, with older former public-housing stock in the middle of the suburb and larger, more expensive houses in newer estates such as Somerset Grove and Beckham Rise.
Community
The local newspaper is the News Review Messenger. Other regional and national newspapers such as The Advertiser and The Bunyip are also available.
Facilities
Craigmore is serviced by a high school which opened in 1970. Craigmore is also serviced by a shopping centre containing a Coles Supermarket, a liquor store BWS, an award-winning bakery and many other speciality shops. There is a YMCA, a number of public primary schools, Catherine McCauley School, and an R-12 Christian College (Hope Christian College). Close by, the large Munno Para Shopping City has many large stores including K-Mart, Foodland, and Harvey Norman, as well as the even bigger Elizabeth shopping centre. A new Woolworths shopping centre is also located in the adjacent Blakes Crossing. Uleybury Winery and restaurant is a short 3 km walk from Craigmore. Some sporting clubs that reside in the area are Craigmore Cricket Club and the Munno Para City Soccer Club.
In June 2010, Craigmore finally received television reception through the erection of a tower on the corner of Uley and Adams Roads in Elizabeth Park. In the same month, Craigmore also gained ADSL-equivalent internet speeds through wireless WiMAX. Prior to this, half of the dwellings in Craigmore were within an ADSL blackspot and had to rely on 3G, dial-up, or satellite internet.
Parks
Craigmore Park runs through the centre of the suburb. The linear park, known as the Elephant Walk, can be followed to Anderson Walk in Smithfield. As Craigmore borders the Adelaide Hills, there are lookout points on Craigmore Road and Uley Road, from which the Yorke Peninsula can sometimes be seen on a clear day.
Schools
Catherine McCauley Primary School
Craigmore South Primary School
Craigmore High School
Hope Christian College
Playford Primary School
Churches
Craigmore Christian Church
Craigmore Latter Day Saints Chapel
Transport
The Craigmore area is serviced by Adelaide Metro, which provides the 441, 442 and 443 services. All three services terminate at the Smithfield and Elizabeth interchanges, with train connections to Adelaide and Gawler. The 443 service covers most of the area served by the 440, 441 and 442 services, which run only at night.
Notable people
David Hicks; former Guantánamo Bay detainee.
See also
City of Playford
List of Adelaide suburbs
References
Suburbs of Adelaide
|
```c
/*
* Blowfish algorithm
*
* This file is part of FFmpeg.
*
 * FFmpeg is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * FFmpeg is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with FFmpeg; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
*/
#ifndef AVUTIL_BLOWFISH_H
#define AVUTIL_BLOWFISH_H
#include <stdint.h>
/**
* @defgroup lavu_blowfish Blowfish
* @ingroup lavu_crypto
* @{
*/
#define AV_BF_ROUNDS 16
typedef struct AVBlowfish {
    uint32_t p[AV_BF_ROUNDS + 2];
    uint32_t s[4][256];
} AVBlowfish;
/**
* Allocate an AVBlowfish context.
*/
AVBlowfish *av_blowfish_alloc(void);
/**
* Initialize an AVBlowfish context.
*
* @param ctx an AVBlowfish context
* @param key a key
* @param key_len length of the key
*/
void av_blowfish_init(struct AVBlowfish *ctx, const uint8_t *key, int key_len);
/**
* Encrypt or decrypt a buffer using a previously initialized context.
*
* @param ctx an AVBlowfish context
* @param xl left four bytes halves of input to be encrypted
* @param xr right four bytes halves of input to be encrypted
* @param decrypt 0 for encryption, 1 for decryption
*/
void av_blowfish_crypt_ecb(struct AVBlowfish *ctx, uint32_t *xl, uint32_t *xr,
                           int decrypt);
/**
* Encrypt or decrypt a buffer using a previously initialized context.
*
* @param ctx an AVBlowfish context
* @param dst destination array, can be equal to src
* @param src source array, can be equal to dst
* @param count number of 8 byte blocks
* @param iv initialization vector for CBC mode, if NULL ECB will be used
* @param decrypt 0 for encryption, 1 for decryption
*/
void av_blowfish_crypt(struct AVBlowfish *ctx, uint8_t *dst, const uint8_t *src,
                       int count, uint8_t *iv, int decrypt);
/**
* @}
*/
#endif /* AVUTIL_BLOWFISH_H */
```
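The `iv` parameter of `av_blowfish_crypt` switches between two modes, as its doc comment notes: a NULL `iv` means each 8-byte block is enciphered independently (ECB), while a non-NULL `iv` chains blocks in CBC mode. The chaining logic can be sketched in Python; a toy XOR transform stands in for the real Blowfish rounds here, so `toy_encrypt_block` and `crypt` are illustrative names only, not part of FFmpeg's API:

```python
# iv=None -> independent 8-byte blocks (ECB); otherwise each plaintext
# block is XORed with the previous ciphertext block (or the IV) before
# encryption (CBC), matching the av_blowfish_crypt doc comment.
BLOCK = 8  # av_blowfish_crypt operates on 8-byte blocks

def toy_encrypt_block(block, key):
    # Stand-in for the real Blowfish block transform (illustrative only).
    return bytes(b ^ k for b, k in zip(block, key))

def crypt(src, key, iv=None):
    assert len(src) % BLOCK == 0, "count is given in whole 8-byte blocks"
    out = bytearray()
    prev = iv
    for i in range(0, len(src), BLOCK):
        block = src[i:i + BLOCK]
        if prev is not None:  # CBC: mix in previous ciphertext (or IV)
            block = bytes(a ^ b for a, b in zip(block, prev))
        enc = toy_encrypt_block(block, key)
        if prev is not None:
            prev = enc        # chain the ciphertext forward
        out += enc
    return bytes(out)

key = bytes([0x13]) * BLOCK
plain = b"AAAAAAAA" * 2  # two identical plaintext blocks

ecb = crypt(plain, key)                    # iv=None -> ECB
cbc = crypt(plain, key, iv=bytes(BLOCK))   # all-zero IV -> CBC

print(ecb[:BLOCK] == ecb[BLOCK:])  # ECB repeats identical blocks: True
print(cbc[:BLOCK] == cbc[BLOCK:])  # CBC chaining hides repetition: False
```

The two prints illustrate why the `iv` argument matters: under ECB, identical plaintext blocks produce identical ciphertext blocks, while CBC chaining removes that pattern.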
|
European Combined Geodetic Network (ECGN) is a research project aimed at high-accuracy geoid determination. The purpose of ECGN is to connect height systems obtained via geometric positioning by GNSS with gravity-referenced heights at cm-level accuracy. The effects of the atmosphere, the oceans and time-dependent parameters of the solid Earth on the gravity field are investigated. ECGN uses data from the satellite gravity missions CHAMP, GRACE and GOCE to model the Earth's gravity field and is linked to other gravity-related projects (GMES, GEOSS, GGOS). The ECGN is considered a European contribution to the International Association of Geodesy (IAG) project Global Geodetic Observing System (GGOS). ECGN is managed by EUREF.
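The connection ECGN targets between GNSS heights and gravity-referenced heights rests on the standard relation H ≈ h − N, where h is the ellipsoidal height from geometric positioning, N is the geoid undulation from a gravity-based geoid model, and H is the orthometric height; reaching cm-level accuracy therefore hinges on a cm-accurate geoid. A minimal sketch with hypothetical numbers (not ECGN data):

```python
def orthometric_height(h_ellipsoidal, geoid_undulation):
    """H = h - N: combine a geometric (GNSS) ellipsoidal height h with the
    geoid undulation N from a gravity-based geoid model to obtain a
    gravity-referenced (orthometric) height H."""
    return h_ellipsoidal - geoid_undulation

# Illustrative values only: a station whose GNSS solution gives
# h = 102.35 m, where the geoid model puts the geoid N = 48.12 m
# above the ellipsoid.
H = orthometric_height(102.35, 48.12)
print(round(H, 2))  # 54.23
```

Any error in N propagates directly into H, which is why ECGN combines GNSS, levelling, and gravimetric observations at the same stations.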
References
External links
Official website
Geodesy
Gravimetry
|
James Monroe Smith may refer to:
James Monroe Smith (Georgia planter) (1839–1915), planter and state legislator in Georgia
James Monroe Smith (academic administrator) (1888–1949), American educator and academic administrator in Louisiana
James Monroe Smith (lawyer), member of the Chicago LGBT Hall of Fame
See also
James Smith (disambiguation)
|
```cython
from cpython.dict cimport PyDict_GetItem, PyDict_SetItem
from cpython.exc cimport PyErr_Clear, PyErr_GivenExceptionMatches, PyErr_Occurred
from cpython.list cimport PyList_Append, PyList_GET_ITEM, PyList_GET_SIZE
from cpython.object cimport PyObject_RichCompareBool, Py_NE
from cpython.ref cimport PyObject, Py_INCREF, Py_XDECREF
from cpython.sequence cimport PySequence_Check
from cpython.set cimport PySet_Add, PySet_Contains
from cpython.tuple cimport PyTuple_GET_ITEM, PyTuple_GetSlice, PyTuple_New, PyTuple_SET_ITEM
# Locally defined bindings that differ from `cython.cpython` bindings
from cytoolz.cpython cimport PtrIter_Next, PtrObject_GetItem
from collections import deque
from heapq import heapify, heappop, heapreplace
from itertools import chain, islice
from operator import itemgetter
from random import Random
from cytoolz.compatibility import map, zip, zip_longest
from cytoolz.utils import no_default
__all__ = ['remove', 'accumulate', 'groupby', 'merge_sorted', 'interleave',
'unique', 'isiterable', 'isdistinct', 'take', 'drop', 'take_nth',
'first', 'second', 'nth', 'last', 'get', 'concat', 'concatv',
'mapcat', 'cons', 'interpose', 'frequencies', 'reduceby', 'iterate',
'sliding_window', 'partition', 'partition_all', 'count', 'pluck',
'join', 'tail', 'diff', 'topk', 'peek', 'random_sample']
cpdef object identity(object x):
return x
cdef class remove:
""" remove(predicate, seq)
Return those items of sequence for which predicate(item) is False
>>> def iseven(x):
... return x % 2 == 0
>>> list(remove(iseven, [1, 2, 3, 4]))
[1, 3]
"""
def __cinit__(self, object predicate, object seq):
self.predicate = predicate
self.iter_seq = iter(seq)
def __iter__(self):
return self
def __next__(self):
cdef object val
val = next(self.iter_seq)
while self.predicate(val):
val = next(self.iter_seq)
return val
cdef class accumulate:
""" accumulate(binop, seq, initial='__no__default__')
Repeatedly apply binary function to a sequence, accumulating results
>>> from operator import add, mul
>>> list(accumulate(add, [1, 2, 3, 4, 5]))
[1, 3, 6, 10, 15]
>>> list(accumulate(mul, [1, 2, 3, 4, 5]))
[1, 2, 6, 24, 120]
Accumulate is similar to ``reduce`` and is good for making functions like
cumulative sum:
>>> from functools import partial, reduce
>>> sum = partial(reduce, add)
>>> cumsum = partial(accumulate, add)
Accumulate also takes an optional argument that will be used as the first
value. This is similar to reduce.
>>> list(accumulate(add, [1, 2, 3], -1))
[-1, 0, 2, 5]
>>> list(accumulate(add, [], 1))
[1]
See Also:
itertools.accumulate : In standard itertools for Python 3.2+
"""
def __cinit__(self, object binop, object seq, object initial='__no__default__'):
self.binop = binop
self.iter_seq = iter(seq)
self.result = self # sentinel
self.initial = initial
def __iter__(self):
return self
def __next__(self):
if self.result is self:
if self.initial != no_default:
self.result = self.initial
else:
self.result = next(self.iter_seq)
else:
self.result = self.binop(self.result, next(self.iter_seq))
return self.result
cdef inline object _groupby_core(dict d, object key, object item):
cdef PyObject *obj = PyDict_GetItem(d, key)
if obj is NULL:
val = []
PyList_Append(val, item)
PyDict_SetItem(d, key, val)
else:
PyList_Append(<object>obj, item)
cpdef dict groupby(object key, object seq):
"""
Group a collection by a key function
>>> names = ['Alice', 'Bob', 'Charlie', 'Dan', 'Edith', 'Frank']
>>> groupby(len, names) # doctest: +SKIP
{3: ['Bob', 'Dan'], 5: ['Alice', 'Edith', 'Frank'], 7: ['Charlie']}
>>> iseven = lambda x: x % 2 == 0
>>> groupby(iseven, [1, 2, 3, 4, 5, 6, 7, 8]) # doctest: +SKIP
{False: [1, 3, 5, 7], True: [2, 4, 6, 8]}
Non-callable keys imply grouping on a member.
>>> groupby('gender', [{'name': 'Alice', 'gender': 'F'},
... {'name': 'Bob', 'gender': 'M'},
... {'name': 'Charlie', 'gender': 'M'}]) # doctest:+SKIP
{'F': [{'gender': 'F', 'name': 'Alice'}],
'M': [{'gender': 'M', 'name': 'Bob'},
{'gender': 'M', 'name': 'Charlie'}]}
See Also:
countby
"""
cdef dict d = {}
cdef object item, keyval
cdef Py_ssize_t i, N
if callable(key):
for item in seq:
keyval = key(item)
_groupby_core(d, keyval, item)
elif isinstance(key, list):
N = PyList_GET_SIZE(key)
for item in seq:
keyval = PyTuple_New(N)
for i in range(N):
val = <object>PyList_GET_ITEM(key, i)
val = item[val]
Py_INCREF(val)
PyTuple_SET_ITEM(keyval, i, val)
_groupby_core(d, keyval, item)
else:
for item in seq:
keyval = item[key]
_groupby_core(d, keyval, item)
return d
cdef object _merge_sorted_binary(object seqs):
mid = len(seqs) // 2
L1 = seqs[:mid]
if len(L1) == 1:
seq1 = iter(L1[0])
else:
seq1 = _merge_sorted_binary(L1)
L2 = seqs[mid:]
if len(L2) == 1:
seq2 = iter(L2[0])
else:
seq2 = _merge_sorted_binary(L2)
try:
val2 = next(seq2)
except StopIteration:
return seq1
return _merge_sorted(seq1, seq2, val2)
cdef class _merge_sorted:
def __cinit__(self, seq1, seq2, val2):
self.seq1 = seq1
self.seq2 = seq2
self.val1 = None
self.val2 = val2
self.loop = 0
def __iter__(self):
return self
def __next__(self):
if self.loop == 0:
try:
self.val1 = next(self.seq1)
except StopIteration:
self.loop = 2
return self.val2
if self.val2 < self.val1:
self.loop = 1
return self.val2
return self.val1
elif self.loop == 1:
try:
self.val2 = next(self.seq2)
except StopIteration:
self.loop = 3
return self.val1
if self.val2 < self.val1:
return self.val2
self.loop = 0
return self.val1
elif self.loop == 2:
return next(self.seq2)
return next(self.seq1)
cdef object _merge_sorted_binary_key(object seqs, object key):
mid = len(seqs) // 2
L1 = seqs[:mid]
if len(L1) == 1:
seq1 = iter(L1[0])
else:
seq1 = _merge_sorted_binary_key(L1, key)
L2 = seqs[mid:]
if len(L2) == 1:
seq2 = iter(L2[0])
else:
seq2 = _merge_sorted_binary_key(L2, key)
try:
val2 = next(seq2)
except StopIteration:
return seq1
return _merge_sorted_key(seq1, seq2, val2, key)
cdef class _merge_sorted_key:
def __cinit__(self, seq1, seq2, val2, key):
self.seq1 = seq1
self.seq2 = seq2
self.key = key
self.val1 = None
self.key1 = None
self.val2 = val2
self.key2 = key(val2)
self.loop = 0
def __iter__(self):
return self
def __next__(self):
if self.loop == 0:
try:
self.val1 = next(self.seq1)
except StopIteration:
self.loop = 2
return self.val2
self.key1 = self.key(self.val1)
if self.key2 < self.key1:
self.loop = 1
return self.val2
return self.val1
elif self.loop == 1:
try:
self.val2 = next(self.seq2)
except StopIteration:
self.loop = 3
return self.val1
self.key2 = self.key(self.val2)
if self.key2 < self.key1:
return self.val2
self.loop = 0
return self.val1
elif self.loop == 2:
return next(self.seq2)
return next(self.seq1)
cdef object c_merge_sorted(object seqs, object key=None):
if len(seqs) == 0:
return iter([])
elif len(seqs) == 1:
return iter(seqs[0])
elif key is None:
return _merge_sorted_binary(seqs)
return _merge_sorted_binary_key(seqs, key)
def merge_sorted(*seqs, **kwargs):
"""
Merge and sort a collection of sorted collections
This works lazily and only keeps one value from each iterable in memory.
>>> list(merge_sorted([1, 3, 5], [2, 4, 6]))
[1, 2, 3, 4, 5, 6]
>>> ''.join(merge_sorted('abc', 'abc', 'abc'))
'aaabbbccc'
The "key" function used to sort the input may be passed as a keyword.
>>> list(merge_sorted([2, 3], [1, 3], key=lambda x: x // 3))
[2, 1, 3, 3]
"""
if 'key' in kwargs:
return c_merge_sorted(seqs, kwargs['key'])
return c_merge_sorted(seqs)
cdef class interleave:
""" interleave(seqs)
Interleave a sequence of sequences
>>> list(interleave([[1, 2], [3, 4]]))
[1, 3, 2, 4]
>>> ''.join(interleave(('ABC', 'XY')))
'AXBYC'
Both the individual sequences and the sequence of sequences may be infinite
Returns a lazy iterator
"""
def __cinit__(self, seqs):
self.iters = [iter(seq) for seq in seqs]
self.newiters = []
self.i = 0
self.n = PyList_GET_SIZE(self.iters)
def __iter__(self):
return self
def __next__(self):
# This implementation is similar to what is done in `toolz` in that we
# construct a new list of iterators, `self.newiters`, when a value is
# successfully retrieved from an iterator from `self.iters`.
cdef PyObject *obj
cdef object val
if self.i == self.n:
self.n = PyList_GET_SIZE(self.newiters)
self.i = 0
if self.n == 0:
raise StopIteration
self.iters = self.newiters
self.newiters = []
val = <object>PyList_GET_ITEM(self.iters, self.i)
self.i += 1
obj = PtrIter_Next(val)
# TODO: optimization opportunity. Previously, it was possible to
# continue on given exceptions, `self.pass_exceptions`, which is
# why this code is structured this way. Time to clean up?
while obj is NULL:
obj = PyErr_Occurred()
if obj is not NULL:
val = <object>obj
PyErr_Clear()
raise val
if self.i == self.n:
self.n = PyList_GET_SIZE(self.newiters)
self.i = 0
if self.n == 0:
raise StopIteration
self.iters = self.newiters
self.newiters = []
val = <object>PyList_GET_ITEM(self.iters, self.i)
self.i += 1
obj = PtrIter_Next(val)
PyList_Append(self.newiters, val)
val = <object>obj
Py_XDECREF(obj)
return val
cdef class _unique_key:
def __cinit__(self, object seq, object key):
self.iter_seq = iter(seq)
self.key = key
self.seen = set()
def __iter__(self):
return self
def __next__(self):
cdef object item, tag
item = next(self.iter_seq)
tag = self.key(item)
while PySet_Contains(self.seen, tag):
item = next(self.iter_seq)
tag = self.key(item)
PySet_Add(self.seen, tag)
return item
cdef class _unique_identity:
def __cinit__(self, object seq):
self.iter_seq = iter(seq)
self.seen = set()
def __iter__(self):
return self
def __next__(self):
cdef object item
item = next(self.iter_seq)
while PySet_Contains(self.seen, item):
item = next(self.iter_seq)
PySet_Add(self.seen, item)
return item
cpdef object unique(object seq, object key=None):
"""
Return only unique elements of a sequence
>>> tuple(unique((1, 2, 3)))
(1, 2, 3)
>>> tuple(unique((1, 2, 1, 3)))
(1, 2, 3)
Uniqueness can be defined by key keyword
>>> tuple(unique(['cat', 'mouse', 'dog', 'hen'], key=len))
('cat', 'mouse')
"""
if key is None:
return _unique_identity(seq)
else:
return _unique_key(seq, key)
cpdef object isiterable(object x):
"""
Is x iterable?
>>> isiterable([1, 2, 3])
True
>>> isiterable('abc')
True
>>> isiterable(5)
False
"""
try:
iter(x)
return True
except TypeError:
pass
return False
cpdef object isdistinct(object seq):
"""
All values in sequence are distinct
>>> isdistinct([1, 2, 3])
True
>>> isdistinct([1, 2, 1])
False
>>> isdistinct("Hello")
False
>>> isdistinct("World")
True
"""
if iter(seq) is seq:
seen = set()
for item in seq:
if PySet_Contains(seen, item):
return False
seen.add(item)
return True
else:
return len(seq) == len(set(seq))
cpdef object take(Py_ssize_t n, object seq):
"""
The first n elements of a sequence
>>> list(take(2, [10, 20, 30, 40, 50]))
[10, 20]
See Also:
drop
tail
"""
return islice(seq, n)
cpdef object tail(Py_ssize_t n, object seq):
"""
The last n elements of a sequence
>>> tail(2, [10, 20, 30, 40, 50])
[40, 50]
See Also:
drop
take
"""
if PySequence_Check(seq):
return seq[-n:]
return tuple(deque(seq, n))
cpdef object drop(Py_ssize_t n, object seq):
"""
The sequence following the first n elements
>>> list(drop(2, [10, 20, 30, 40, 50]))
[30, 40, 50]
See Also:
take
tail
"""
if n < 0:
raise ValueError('n argument for drop() must be non-negative')
cdef Py_ssize_t i
cdef object iter_seq
iter_seq = iter(seq)
try:
for i in range(n):
next(iter_seq)
except StopIteration:
pass
return iter_seq
cpdef object take_nth(Py_ssize_t n, object seq):
"""
Every nth item in seq
>>> list(take_nth(2, [10, 20, 30, 40, 50]))
[10, 30, 50]
"""
return islice(seq, 0, None, n)
cpdef object first(object seq):
"""
The first element in a sequence
>>> first('ABC')
'A'
"""
return next(iter(seq))
cpdef object second(object seq):
"""
The second element in a sequence
>>> second('ABC')
'B'
"""
seq = iter(seq)
next(seq)
return next(seq)
cpdef object nth(Py_ssize_t n, object seq):
"""
The nth element in a sequence
>>> nth(1, 'ABC')
'B'
"""
if PySequence_Check(seq):
return seq[n]
if n < 0:
raise ValueError('"n" must be non-negative when indexing an iterator')
seq = iter(seq)
while n > 0:
n -= 1
next(seq)
return next(seq)
cpdef object last(object seq):
"""
The last element in a sequence
>>> last('ABC')
'C'
"""
cdef object val
if PySequence_Check(seq):
return seq[-1]
val = no_default
for val in seq:
pass
if val == no_default:
raise IndexError
return val
cpdef object rest(object seq):
    """
    All elements of a sequence except the first
    >>> list(rest('ABC'))
    ['B', 'C']
    """
    seq = iter(seq)
    next(seq)
    return seq
cdef tuple _get_exceptions = (IndexError, KeyError, TypeError)
cdef tuple _get_list_exc = (IndexError, KeyError)
cpdef object get(object ind, object seq, object default='__no__default__'):
"""
Get element in a sequence or dict
Provides standard indexing
>>> get(1, 'ABC') # Same as 'ABC'[1]
'B'
Pass a list to get multiple values
>>> get([1, 2], 'ABC') # ('ABC'[1], 'ABC'[2])
('B', 'C')
Works on any value that supports indexing/getitem
For example here we see that it works with dictionaries
>>> phonebook = {'Alice': '555-1234',
... 'Bob': '555-5678',
... 'Charlie':'555-9999'}
>>> get('Alice', phonebook)
'555-1234'
>>> get(['Alice', 'Bob'], phonebook)
('555-1234', '555-5678')
Provide a default for missing values
>>> get(['Alice', 'Dennis'], phonebook, None)
('555-1234', None)
See Also:
pluck
"""
cdef Py_ssize_t i
cdef object val
cdef tuple result
cdef PyObject *obj
if isinstance(ind, list):
i = PyList_GET_SIZE(ind)
result = PyTuple_New(i)
# List of indices, no default
if default == no_default:
for i, val in enumerate(ind):
val = seq[val]
Py_INCREF(val)
PyTuple_SET_ITEM(result, i, val)
return result
# List of indices with default
for i, val in enumerate(ind):
obj = PtrObject_GetItem(seq, val)
if obj is NULL:
val = <object>PyErr_Occurred()
PyErr_Clear()
if not PyErr_GivenExceptionMatches(val, _get_list_exc):
raise val
Py_INCREF(default)
PyTuple_SET_ITEM(result, i, default)
else:
val = <object>obj
PyTuple_SET_ITEM(result, i, val)
return result
obj = PtrObject_GetItem(seq, ind)
if obj is NULL:
val = <object>PyErr_Occurred()
PyErr_Clear()
if default == no_default:
raise val
if PyErr_GivenExceptionMatches(val, _get_exceptions):
return default
raise val
Py_XDECREF(obj)
return <object>obj
cpdef object concat(object seqs):
"""
Concatenate zero or more iterables, any of which may be infinite.
An infinite sequence will prevent the rest of the arguments from
being included.
We use chain.from_iterable rather than ``chain(*seqs)`` so that seqs
can be a generator.
>>> list(concat([[], [1], [2, 3]]))
[1, 2, 3]
See also:
itertools.chain.from_iterable equivalent
"""
return chain.from_iterable(seqs)
def concatv(*seqs):
"""
Variadic version of concat
>>> list(concatv([], ["a"], ["b", "c"]))
['a', 'b', 'c']
See also:
itertools.chain
"""
return chain.from_iterable(seqs)
cpdef object mapcat(object func, object seqs):
"""
Apply func to each sequence in seqs, concatenating results.
>>> list(mapcat(lambda s: [c.upper() for c in s],
... [["a", "b"], ["c", "d", "e"]]))
['A', 'B', 'C', 'D', 'E']
"""
return concat(map(func, seqs))
cpdef object cons(object el, object seq):
"""
Add el to beginning of (possibly infinite) sequence seq.
>>> list(cons(1, [2, 3]))
[1, 2, 3]
"""
return chain((el,), seq)
cdef class interpose:
""" interpose(el, seq)
Introduce element between each pair of elements in seq
>>> list(interpose("a", [1, 2, 3]))
[1, 'a', 2, 'a', 3]
"""
def __cinit__(self, object el, object seq):
self.el = el
self.iter_seq = iter(seq)
self.do_el = False
try:
self.val = next(self.iter_seq)
except StopIteration:
self.do_el = True
def __iter__(self):
return self
def __next__(self):
if self.do_el:
self.val = next(self.iter_seq)
self.do_el = False
return self.el
else:
self.do_el = True
return self.val
cpdef dict frequencies(object seq):
"""
Find number of occurrences of each value in seq
>>> frequencies(['cat', 'cat', 'ox', 'pig', 'pig', 'cat']) #doctest: +SKIP
{'cat': 3, 'ox': 1, 'pig': 2}
See Also:
countby
groupby
"""
cdef dict d = {}
cdef PyObject *obj
cdef Py_ssize_t val
for item in seq:
obj = PyDict_GetItem(d, item)
if obj is NULL:
d[item] = 1
else:
val = <object>obj
d[item] = val + 1
return d
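# Pure-Python reference for ``frequencies`` (illustrative sketch only):
# for any finite sequence the result matches ``collections.Counter``.
from collections import Counter

def frequencies_py(seq):
    # Count occurrences of each hashable value.
    d = {}
    for item in seq:
        d[item] = d.get(item, 0) + 1
    return d

# frequencies_py(seq) == dict(Counter(seq))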
cdef inline object _reduceby_core(dict d, object key, object item, object binop,
object init, bint skip_init, bint call_init):
cdef PyObject *obj = PyDict_GetItem(d, key)
if obj is not NULL:
PyDict_SetItem(d, key, binop(<object>obj, item))
elif skip_init:
PyDict_SetItem(d, key, item)
elif call_init:
PyDict_SetItem(d, key, binop(init(), item))
else:
PyDict_SetItem(d, key, binop(init, item))
cpdef dict reduceby(object key, object binop, object seq, object init='__no__default__'):
"""
Perform a simultaneous groupby and reduction
The computation:
>>> result = reduceby(key, binop, seq, init) # doctest: +SKIP
is equivalent to the following:
>>> def reduction(group): # doctest: +SKIP
... return reduce(binop, group, init) # doctest: +SKIP
>>> groups = groupby(key, seq) # doctest: +SKIP
>>> result = valmap(reduction, groups) # doctest: +SKIP
But the former does not build the intermediate groups, allowing it to
operate in much less space. This makes it suitable for larger datasets
that do not fit comfortably in memory
The ``init`` keyword argument is the default initialization of the
reduction. This can be either a constant value like ``0`` or a callable
like ``lambda : 0`` as might be used in ``defaultdict``.
Simple Examples
---------------
>>> from operator import add, mul
>>> iseven = lambda x: x % 2 == 0
>>> data = [1, 2, 3, 4, 5]
>>> reduceby(iseven, add, data) # doctest: +SKIP
{False: 9, True: 6}
>>> reduceby(iseven, mul, data) # doctest: +SKIP
{False: 15, True: 8}
Complex Example
---------------
>>> projects = [{'name': 'build roads', 'state': 'CA', 'cost': 1000000},
... {'name': 'fight crime', 'state': 'IL', 'cost': 100000},
... {'name': 'help farmers', 'state': 'IL', 'cost': 2000000},
... {'name': 'help farmers', 'state': 'CA', 'cost': 200000}]
>>> reduceby('state', # doctest: +SKIP
... lambda acc, x: acc + x['cost'],
... projects, 0)
{'CA': 1200000, 'IL': 2100000}
Example Using ``init``
----------------------
>>> def set_add(s, i):
... s.add(i)
... return s
>>> reduceby(iseven, set_add, [1, 2, 3, 4, 1, 2, 3], set) # doctest: +SKIP
{True: set([2, 4]),
False: set([1, 3])}
"""
cdef dict d = {}
cdef object item, keyval
cdef Py_ssize_t i, N
cdef bint skip_init = init == no_default
cdef bint call_init = callable(init)
if callable(key):
for item in seq:
keyval = key(item)
_reduceby_core(d, keyval, item, binop, init, skip_init, call_init)
elif isinstance(key, list):
N = PyList_GET_SIZE(key)
for item in seq:
keyval = PyTuple_New(N)
for i in range(N):
val = <object>PyList_GET_ITEM(key, i)
val = item[val]
Py_INCREF(val)
PyTuple_SET_ITEM(keyval, i, val)
_reduceby_core(d, keyval, item, binop, init, skip_init, call_init)
else:
for item in seq:
keyval = item[key]
_reduceby_core(d, keyval, item, binop, init, skip_init, call_init)
return d
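# Pure-Python sketch of the ``reduceby`` core loop (illustrative only):
# each item is folded into its group's accumulator directly, so the
# intermediate groups described in the docstring are never materialized.
def reduceby_py(key, binop, seq, init):
    d = {}
    for item in seq:
        k = key(item)
        if k in d:
            d[k] = binop(d[k], item)
        else:
            # ``init`` may be a constant or a zero-argument callable
            d[k] = binop(init() if callable(init) else init, item)
    return d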
cdef class iterate:
""" iterate(func, x)
Repeatedly apply a function func onto an original input
Yields x, then func(x), then func(func(x)), then func(func(func(x))), etc..
>>> def inc(x): return x + 1
>>> counter = iterate(inc, 0)
>>> next(counter)
0
>>> next(counter)
1
>>> next(counter)
2
>>> double = lambda x: x * 2
>>> powers_of_two = iterate(double, 1)
>>> next(powers_of_two)
1
>>> next(powers_of_two)
2
>>> next(powers_of_two)
4
>>> next(powers_of_two)
8
"""
def __cinit__(self, object func, object x):
self.func = func
self.x = x
self.val = self # sentinel
def __iter__(self):
return self
def __next__(self):
if self.val is self:
self.val = self.x
else:
self.x = self.func(self.x)
return self.x
cdef class sliding_window:
""" sliding_window(n, seq)
A sequence of overlapping subsequences
>>> list(sliding_window(2, [1, 2, 3, 4]))
[(1, 2), (2, 3), (3, 4)]
This function creates a sliding window suitable for transformations like
sliding means / smoothing
>>> mean = lambda seq: float(sum(seq)) / len(seq)
>>> list(map(mean, sliding_window(2, [1, 2, 3, 4])))
[1.5, 2.5, 3.5]
"""
def __cinit__(self, Py_ssize_t n, object seq):
cdef Py_ssize_t i
self.iterseq = iter(seq)
self.prev = PyTuple_New(n)
for i in range(1, n):
seq = next(self.iterseq)
Py_INCREF(seq)
PyTuple_SET_ITEM(self.prev, i, seq)
self.n = n
def __iter__(self):
return self
def __next__(self):
cdef tuple current
cdef object item
cdef Py_ssize_t i
current = PyTuple_New(self.n)
for i in range(1, self.n):
item = self.prev[i]
Py_INCREF(item)
PyTuple_SET_ITEM(current, i-1, item)
item = next(self.iterseq)
Py_INCREF(item)
PyTuple_SET_ITEM(current, self.n-1, item)
self.prev = current
return current
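# Pure-Python sketch of ``sliding_window`` (illustrative only): a bounded
# deque holds the previous n - 1 items and each new item completes a tuple.
from collections import deque
from itertools import islice

def sliding_window_py(n, seq):
    it = iter(seq)
    window = deque(islice(it, n - 1), maxlen=n)
    for item in it:
        window.append(item)
        yield tuple(window)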
no_pad = '__no__pad__'
cpdef object partition(Py_ssize_t n, object seq, object pad='__no__pad__'):
"""
Partition sequence into tuples of length n
>>> list(partition(2, [1, 2, 3, 4]))
[(1, 2), (3, 4)]
If the length of ``seq`` is not evenly divisible by ``n``, the final tuple
is dropped if ``pad`` is not specified, or filled to length ``n`` by pad:
>>> list(partition(2, [1, 2, 3, 4, 5]))
[(1, 2), (3, 4)]
>>> list(partition(2, [1, 2, 3, 4, 5], pad=None))
[(1, 2), (3, 4), (5, None)]
See Also:
partition_all
"""
args = [iter(seq)] * n
if pad == '__no__pad__':
return zip(*args)
else:
return zip_longest(*args, fillvalue=pad)
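# The shared-iterator trick used by ``partition`` above, in plain Python
# (illustrative): every argument slot of zip draws from the SAME iterator,
# so each output tuple consumes n consecutive items.
it = iter([1, 2, 3, 4, 5, 6])
chunks = list(zip(it, it))
# chunks == [(1, 2), (3, 4), (5, 6)]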
cdef class partition_all:
""" partition_all(n, seq)
Partition all elements of sequence into tuples of length at most n
The final tuple may be shorter to accommodate extra elements.
>>> list(partition_all(2, [1, 2, 3, 4]))
[(1, 2), (3, 4)]
>>> list(partition_all(2, [1, 2, 3, 4, 5]))
[(1, 2), (3, 4), (5,)]
See Also:
partition
"""
def __cinit__(self, Py_ssize_t n, object seq):
self.n = n
self.iterseq = iter(seq)
def __iter__(self):
return self
def __next__(self):
cdef tuple result
cdef object item
cdef Py_ssize_t i = 0
result = PyTuple_New(self.n)
for item in self.iterseq:
Py_INCREF(item)
PyTuple_SET_ITEM(result, i, item)
i += 1
if i == self.n:
return result
# iterable exhausted before filling the tuple
if i == 0:
raise StopIteration
return PyTuple_GetSlice(result, 0, i)
cpdef object count(object seq):
"""
Count the number of items in seq
Like the builtin ``len`` but works on lazy sequences.
Not to be confused with ``itertools.count``
See also:
len
"""
if iter(seq) is not seq and hasattr(seq, '__len__'):
return len(seq)
cdef Py_ssize_t i = 0
for _ in seq:
i += 1
return i
cdef class _pluck_index:
def __cinit__(self, object ind, object seqs):
self.ind = ind
self.iterseqs = iter(seqs)
def __iter__(self):
return self
def __next__(self):
val = next(self.iterseqs)
return val[self.ind]
cdef class _pluck_index_default:
def __cinit__(self, object ind, object seqs, object default):
self.ind = ind
self.iterseqs = iter(seqs)
self.default = default
def __iter__(self):
return self
def __next__(self):
cdef PyObject *obj
cdef object val
val = next(self.iterseqs)
obj = PtrObject_GetItem(val, self.ind)
if obj is NULL:
val = <object>PyErr_Occurred()
PyErr_Clear()
if not PyErr_GivenExceptionMatches(val, _get_exceptions):
raise val
return self.default
Py_XDECREF(obj)
return <object>obj
cdef class _pluck_list:
def __cinit__(self, list ind not None, object seqs):
self.ind = ind
self.iterseqs = iter(seqs)
self.n = len(ind)
def __iter__(self):
return self
def __next__(self):
cdef Py_ssize_t i
cdef tuple result
cdef object val, seq
seq = next(self.iterseqs)
result = PyTuple_New(self.n)
for i, val in enumerate(self.ind):
val = seq[val]
Py_INCREF(val)
PyTuple_SET_ITEM(result, i, val)
return result
cdef class _pluck_list_default:
def __cinit__(self, list ind not None, object seqs, object default):
self.ind = ind
self.iterseqs = iter(seqs)
self.default = default
self.n = len(ind)
def __iter__(self):
return self
def __next__(self):
cdef Py_ssize_t i
cdef object val, seq
cdef tuple result
seq = next(self.iterseqs)
result = PyTuple_New(self.n)
for i, val in enumerate(self.ind):
obj = PtrObject_GetItem(seq, val)
if obj is NULL:
val = <object>PyErr_Occurred()
PyErr_Clear()
if not PyErr_GivenExceptionMatches(val, _get_list_exc):
raise val
Py_INCREF(self.default)
PyTuple_SET_ITEM(result, i, self.default)
else:
val = <object>obj
PyTuple_SET_ITEM(result, i, val)
return result
cpdef object pluck(object ind, object seqs, object default='__no__default__'):
"""
plucks an element or several elements from each item in a sequence.
``pluck`` maps ``itertoolz.get`` over a sequence and returns one or more
elements of each item in the sequence.
This is equivalent to running `map(curried.get(ind), seqs)`
``ind`` can be either a single string/index or a list of strings/indices.
``seqs`` should be a sequence containing sequences or dicts.
e.g.
>>> data = [{'id': 1, 'name': 'Cheese'}, {'id': 2, 'name': 'Pies'}]
>>> list(pluck('name', data))
['Cheese', 'Pies']
>>> list(pluck([0, 1], [[1, 2, 3], [4, 5, 7]]))
[(1, 2), (4, 5)]
See Also:
get
map
"""
if isinstance(ind, list):
if default != no_default:
return _pluck_list_default(ind, seqs, default)
if PyList_GET_SIZE(ind) < 10:
return _pluck_list(ind, seqs)
return map(itemgetter(*ind), seqs)
if default == no_default:
return _pluck_index(ind, seqs)
return _pluck_index_default(ind, seqs, default)
cdef class _getter_index:
def __cinit__(self, object ind):
self.ind = ind
def __call__(self, object seq):
return seq[self.ind]
cdef class _getter_list:
def __cinit__(self, list ind not None):
self.ind = ind
self.n = len(ind)
def __call__(self, object seq):
cdef Py_ssize_t i
cdef tuple result
cdef object val
result = PyTuple_New(self.n)
for i, val in enumerate(self.ind):
val = seq[val]
Py_INCREF(val)
PyTuple_SET_ITEM(result, i, val)
return result
cdef class _getter_null:
def __call__(self, object seq):
return ()
# TODO: benchmark getters (and compare against itemgetter)
cpdef object getter(object index):
if isinstance(index, list):
if PyList_GET_SIZE(index) == 0:
return _getter_null()
elif PyList_GET_SIZE(index) < 10:
return _getter_list(index)
return itemgetter(*index)
return _getter_index(index)
cpdef object join(object leftkey, object leftseq,
object rightkey, object rightseq,
object left_default='__no__default__',
object right_default='__no__default__'):
"""
Join two sequences on common attributes
This is a semi-streaming operation. The LEFT sequence is fully evaluated
and placed into memory. The RIGHT sequence is evaluated lazily and so can
be arbitrarily large.
>>> friends = [('Alice', 'Edith'),
... ('Alice', 'Zhao'),
... ('Edith', 'Alice'),
... ('Zhao', 'Alice'),
... ('Zhao', 'Edith')]
>>> cities = [('Alice', 'NYC'),
... ('Alice', 'Chicago'),
... ('Dan', 'Sydney'),
... ('Edith', 'Paris'),
... ('Edith', 'Berlin'),
... ('Zhao', 'Shanghai')]
>>> # Vacation opportunities
>>> # In what cities do people have friends?
>>> result = join(second, friends,
... first, cities)
>>> for ((a, b), (c, d)) in sorted(unique(result)):
... print((a, d))
('Alice', 'Berlin')
('Alice', 'Paris')
('Alice', 'Shanghai')
('Edith', 'Chicago')
('Edith', 'NYC')
('Zhao', 'Chicago')
('Zhao', 'NYC')
('Zhao', 'Berlin')
('Zhao', 'Paris')
Specify outer joins with keyword arguments ``left_default`` and/or
``right_default``. Here is a full outer join in which unmatched elements
are paired with None.
>>> identity = lambda x: x
>>> list(join(identity, [1, 2, 3],
... identity, [2, 3, 4],
... left_default=None, right_default=None))
[(2, 2), (3, 3), (None, 4), (1, None)]
Usually the key arguments are callables to be applied to the sequences. If
the keys are not obviously callable then it is assumed that indexing was
intended, e.g. the following is a legal change
>>> # result = join(second, friends, first, cities)
>>> result = join(1, friends, 0, cities) # doctest: +SKIP
"""
if left_default == no_default and right_default == no_default:
if callable(rightkey):
return _inner_join_key(leftkey, leftseq, rightkey, rightseq,
left_default, right_default)
elif isinstance(rightkey, list):
return _inner_join_indices(leftkey, leftseq, rightkey, rightseq,
left_default, right_default)
else:
return _inner_join_index(leftkey, leftseq, rightkey, rightseq,
left_default, right_default)
elif left_default != no_default and right_default == no_default:
if callable(rightkey):
return _right_outer_join_key(leftkey, leftseq, rightkey, rightseq,
left_default, right_default)
elif isinstance(rightkey, list):
return _right_outer_join_indices(leftkey, leftseq, rightkey, rightseq,
left_default, right_default)
else:
return _right_outer_join_index(leftkey, leftseq, rightkey, rightseq,
left_default, right_default)
elif left_default == no_default and right_default != no_default:
if callable(rightkey):
return _left_outer_join_key(leftkey, leftseq, rightkey, rightseq,
left_default, right_default)
elif isinstance(rightkey, list):
return _left_outer_join_indices(leftkey, leftseq, rightkey, rightseq,
left_default, right_default)
else:
return _left_outer_join_index(leftkey, leftseq, rightkey, rightseq,
left_default, right_default)
else:
if callable(rightkey):
return _outer_join_key(leftkey, leftseq, rightkey, rightseq,
left_default, right_default)
elif isinstance(rightkey, list):
return _outer_join_indices(leftkey, leftseq, rightkey, rightseq,
left_default, right_default)
else:
return _outer_join_index(leftkey, leftseq, rightkey, rightseq,
left_default, right_default)
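# Illustrative sketch of the semi-streaming strategy used by ``join``
# (not the actual implementation): the LEFT sequence is indexed into
# memory and the RIGHT sequence is streamed lazily against that index.
from collections import defaultdict

def inner_join_py(leftkey, leftseq, rightkey, rightseq):
    index = defaultdict(list)
    for left in leftseq:
        index[leftkey(left)].append(left)
    for right in rightseq:
        for left in index.get(rightkey(right), ()):
            yield (left, right)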
cdef class _join:
def __cinit__(self,
object leftkey, object leftseq,
object rightkey, object rightseq,
object left_default=no_default,
object right_default=no_default):
self.left_default = left_default
self.right_default = right_default
self._rightkey = rightkey
self.rightseq = iter(rightseq)
if isinstance(rightkey, list):
self.N = len(rightkey)
self.d = groupby(leftkey, leftseq)
self.seen_keys = set()
self.matches = []
self.right = None
self.is_rightseq_exhausted = False
def __iter__(self):
return self
cdef object rightkey(self):
pass
cdef class _right_outer_join(_join):
def __next__(self):
cdef PyObject *obj
if self.i == PyList_GET_SIZE(self.matches):
self.right = next(self.rightseq)
key = self.rightkey()
obj = PyDict_GetItem(self.d, key)
if obj is NULL:
return (self.left_default, self.right)
self.matches = <object>obj
self.i = 0
match = <object>PyList_GET_ITEM(self.matches, self.i) # skip error checking
self.i += 1
return (match, self.right)
cdef class _right_outer_join_key(_right_outer_join):
cdef object rightkey(self):
return self._rightkey(self.right)
cdef class _right_outer_join_index(_right_outer_join):
cdef object rightkey(self):
return self.right[self._rightkey]
cdef class _right_outer_join_indices(_right_outer_join):
cdef object rightkey(self):
keyval = PyTuple_New(self.N)
for i in range(self.N):
val = <object>PyList_GET_ITEM(self._rightkey, i)
val = self.right[val]
Py_INCREF(val)
PyTuple_SET_ITEM(keyval, i, val)
return keyval
cdef class _outer_join(_join):
def __next__(self):
cdef PyObject *obj
if not self.is_rightseq_exhausted:
if self.i == PyList_GET_SIZE(self.matches):
try:
self.right = next(self.rightseq)
except StopIteration:
self.is_rightseq_exhausted = True
self.keys = iter(self.d)
return next(self)
key = self.rightkey()
PySet_Add(self.seen_keys, key)
obj = PyDict_GetItem(self.d, key)
if obj is NULL:
return (self.left_default, self.right)
self.matches = <object>obj
self.i = 0
match = <object>PyList_GET_ITEM(self.matches, self.i) # skip error checking
self.i += 1
return (match, self.right)
else:
if self.i == PyList_GET_SIZE(self.matches):
key = next(self.keys)
while key in self.seen_keys:
key = next(self.keys)
obj = PyDict_GetItem(self.d, key)
self.matches = <object>obj
self.i = 0
match = <object>PyList_GET_ITEM(self.matches, self.i) # skip error checking
self.i += 1
return (match, self.right_default)
cdef class _outer_join_key(_outer_join):
cdef object rightkey(self):
return self._rightkey(self.right)
cdef class _outer_join_index(_outer_join):
cdef object rightkey(self):
return self.right[self._rightkey]
cdef class _outer_join_indices(_outer_join):
cdef object rightkey(self):
keyval = PyTuple_New(self.N)
for i in range(self.N):
val = <object>PyList_GET_ITEM(self._rightkey, i)
val = self.right[val]
Py_INCREF(val)
PyTuple_SET_ITEM(keyval, i, val)
return keyval
cdef class _left_outer_join(_join):
def __next__(self):
cdef PyObject *obj
if not self.is_rightseq_exhausted:
if self.i == PyList_GET_SIZE(self.matches):
obj = NULL
while obj is NULL:
try:
self.right = next(self.rightseq)
except StopIteration:
self.is_rightseq_exhausted = True
self.keys = iter(self.d)
return next(self)
key = self.rightkey()
PySet_Add(self.seen_keys, key)
obj = PyDict_GetItem(self.d, key)
self.matches = <object>obj
self.i = 0
match = <object>PyList_GET_ITEM(self.matches, self.i) # skip error checking
self.i += 1
return (match, self.right)
else:
if self.i == PyList_GET_SIZE(self.matches):
key = next(self.keys)
while key in self.seen_keys:
key = next(self.keys)
obj = PyDict_GetItem(self.d, key)
self.matches = <object>obj
self.i = 0
match = <object>PyList_GET_ITEM(self.matches, self.i) # skip error checking
self.i += 1
return (match, self.right_default)
cdef class _left_outer_join_key(_left_outer_join):
cdef object rightkey(self):
return self._rightkey(self.right)
cdef class _left_outer_join_index(_left_outer_join):
cdef object rightkey(self):
return self.right[self._rightkey]
cdef class _left_outer_join_indices(_left_outer_join):
cdef object rightkey(self):
keyval = PyTuple_New(self.N)
for i in range(self.N):
val = <object>PyList_GET_ITEM(self._rightkey, i)
val = self.right[val]
Py_INCREF(val)
PyTuple_SET_ITEM(keyval, i, val)
return keyval
cdef class _inner_join(_join):
def __next__(self):
cdef PyObject *obj = NULL
if self.i == PyList_GET_SIZE(self.matches):
while obj is NULL:
self.right = next(self.rightseq)
key = self.rightkey()
obj = PyDict_GetItem(self.d, key)
self.matches = <object>obj
self.i = 0
match = <object>PyList_GET_ITEM(self.matches, self.i) # skip error checking
self.i += 1
return (match, self.right)
cdef class _inner_join_key(_inner_join):
cdef object rightkey(self):
return self._rightkey(self.right)
cdef class _inner_join_index(_inner_join):
cdef object rightkey(self):
return self.right[self._rightkey]
cdef class _inner_join_indices(_inner_join):
cdef object rightkey(self):
keyval = PyTuple_New(self.N)
for i in range(self.N):
val = <object>PyList_GET_ITEM(self._rightkey, i)
val = self.right[val]
Py_INCREF(val)
PyTuple_SET_ITEM(keyval, i, val)
return keyval
cdef class _diff_key:
def __cinit__(self, object seqs, object key, object default=no_default):
self.N = len(seqs)
if self.N < 2:
raise TypeError('Too few sequences given (min 2 required)')
if default == no_default:
self.iters = zip(*seqs)
else:
self.iters = zip_longest(*seqs, fillvalue=default)
self.key = key
def __iter__(self):
return self
def __next__(self):
cdef object val, val2, items
cdef Py_ssize_t i
while True:
items = next(self.iters)
val = self.key(<object>PyTuple_GET_ITEM(items, 0))
for i in range(1, self.N):
val2 = self.key(<object>PyTuple_GET_ITEM(items, i))
if PyObject_RichCompareBool(val, val2, Py_NE):
return items
cdef class _diff_identity:
def __cinit__(self, object seqs, object default=no_default):
self.N = len(seqs)
if self.N < 2:
raise TypeError('Too few sequences given (min 2 required)')
if default == no_default:
self.iters = zip(*seqs)
else:
self.iters = zip_longest(*seqs, fillvalue=default)
def __iter__(self):
return self
def __next__(self):
cdef object val, val2, items
cdef Py_ssize_t i
while True:
items = next(self.iters)
val = <object>PyTuple_GET_ITEM(items, 0)
for i in range(1, self.N):
val2 = <object>PyTuple_GET_ITEM(items, i)
if PyObject_RichCompareBool(val, val2, Py_NE):
return items
cdef object c_diff(object seqs, object default=no_default, object key=None):
if key is None:
return _diff_identity(seqs, default=default)
else:
return _diff_key(seqs, key, default=default)
def diff(*seqs, **kwargs):
"""
Return those items that differ between sequences
>>> list(diff([1, 2, 3], [1, 2, 10, 100]))
[(3, 10)]
Shorter sequences may be padded with a ``default`` value:
>>> list(diff([1, 2, 3], [1, 2, 10, 100], default=None))
[(3, 10), (None, 100)]
A ``key`` function may also be applied to each item to use during
comparisons:
>>> list(diff(['apples', 'bananas'], ['Apples', 'Oranges'], key=str.lower))
[('bananas', 'Oranges')]
"""
N = len(seqs)
if N == 1 and isinstance(seqs[0], list):
seqs = seqs[0]
default = kwargs.get('default', no_default)
key = kwargs.get('key')
return c_diff(seqs, default=default, key=key)
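# Pure-Python sketch of ``diff`` (illustrative only; padding shown for an
# explicit ``default``): yield the tuples whose items differ after ``key``.
from itertools import zip_longest

def diff_py(*seqs, default=None, key=None):
    k = key if key is not None else (lambda x: x)
    for items in zip_longest(*seqs, fillvalue=default):
        vals = [k(x) for x in items]
        if any(v != vals[0] for v in vals[1:]):
            yield items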
cpdef object topk(Py_ssize_t k, object seq, object key=None):
"""
Find the k largest elements of a sequence
Operates lazily in ``n*log(k)`` time
>>> topk(2, [1, 100, 10, 1000])
(1000, 100)
Use a key function to change sorted order
>>> topk(2, ['Alice', 'Bob', 'Charlie', 'Dan'], key=len)
('Charlie', 'Alice')
See also:
heapq.nlargest
"""
cdef object item, val, top
cdef object it = iter(seq)
cdef object _heapreplace = heapreplace
cdef Py_ssize_t i = k
cdef list pq = []
if key is not None and not callable(key):
key = getter(key)
if k < 2:
if k < 1:
return ()
top = list(take(1, it))
if len(top) == 0:
return ()
it = concatv(top, it)
if key is None:
return (max(it),)
else:
return (max(it, key=key),)
for item in it:
if key is None:
PyList_Append(pq, (item, i))
else:
PyList_Append(pq, (key(item), i, item))
i -= 1
if i == 0:
break
if i != 0:
pq.sort(reverse=True)
k = 0 if key is None else 2
return tuple([item[k] for item in pq])
heapify(pq)
top = pq[0][0]
if key is None:
for item in it:
if top < item:
_heapreplace(pq, (item, i))
top = pq[0][0]
i -= 1
else:
for item in it:
val = key(item)
if top < val:
_heapreplace(pq, (val, i, item))
top = pq[0][0]
i -= 1
pq.sort(reverse=True)
k = 0 if key is None else 2
return tuple([item[k] for item in pq])
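# Equivalent behavior via the standard library (illustrative sketch):
# ``heapq.nlargest`` also keeps only k candidates in memory, matching the
# n*log(k) lazy behavior described in the docstring above.
import heapq

def topk_py(k, seq, key=None):
    return tuple(heapq.nlargest(k, seq, key=key))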
cpdef object peek(object seq):
"""
Retrieve the next element of a sequence
Returns the first element and an iterable equivalent to the original
sequence, still containing the element retrieved.
>>> seq = [0, 1, 2, 3, 4]
>>> first, seq = peek(seq)
>>> first
0
>>> list(seq)
[0, 1, 2, 3, 4]
"""
iterator = iter(seq)
item = next(iterator)
return item, chain((item,), iterator)
cdef class random_sample:
""" random_sample(prob, seq, random_state=None)
Return elements from a sequence with probability of prob
Returns a lazy iterator of random items from seq.
``random_sample`` considers each item independently and without
replacement. See below how the first time it returned 13 items and the
next time it returned 6 items.
>>> seq = list(range(100))
>>> list(random_sample(0.1, seq)) # doctest: +SKIP
[6, 9, 19, 35, 45, 50, 58, 62, 68, 72, 78, 86, 95]
>>> list(random_sample(0.1, seq)) # doctest: +SKIP
[6, 44, 54, 61, 69, 94]
Providing an integer seed for ``random_state`` will result in
deterministic sampling. Given the same seed it will return the same sample
every time.
>>> list(random_sample(0.1, seq, random_state=2016))
[7, 9, 19, 25, 30, 32, 34, 48, 59, 60, 81, 98]
>>> list(random_sample(0.1, seq, random_state=2016))
[7, 9, 19, 25, 30, 32, 34, 48, 59, 60, 81, 98]
``random_state`` can also be any object with a method ``random`` that
returns floats between 0.0 and 1.0 (exclusive).
>>> from random import Random
>>> randobj = Random(2016)
>>> list(random_sample(0.1, seq, random_state=randobj))
[7, 9, 19, 25, 30, 32, 34, 48, 59, 60, 81, 98]
"""
def __cinit__(self, object prob, object seq, random_state=None):
float(prob)
self.prob = prob
self.iter_seq = iter(seq)
if not hasattr(random_state, 'random'):
random_state = Random(random_state)
self.random_func = random_state.random
def __iter__(self):
return self
def __next__(self):
while True:
if self.random_func() < self.prob:
return next(self.iter_seq)
next(self.iter_seq)
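# Pure-Python sketch of ``random_sample`` (illustrative only): an
# independent Bernoulli trial decides whether each item is emitted.
from random import Random

def random_sample_py(prob, seq, random_state=None):
    rng = random_state if hasattr(random_state, 'random') else Random(random_state)
    return (x for x in seq if rng.random() < prob)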
```
|
Indykpol is one of the largest poultry processors and the largest turkey processor in Poland. It is headquartered in Olsztyn. The company was founded in 1991 as the private successor of the state-owned corporation Olsztyńskie Zakłady Drobiarskie. In 1993, it was transformed into a joint-stock company. Since October 12, 1994, it has been listed on the Warsaw Stock Exchange.
|
```php
<?php
/**
* Site/blog functions that work with the blogs table and related data.
*
* @package WordPress
* @subpackage Multisite
* @since MU
*/
/**
* Update the last_updated field for the current blog.
*
* @since MU
*
* @global wpdb $wpdb WordPress database abstraction object.
*/
function wpmu_update_blogs_date() {
global $wpdb;
update_blog_details( $wpdb->blogid, array('last_updated' => current_time('mysql', true)) );
/**
* Fires after the blog details are updated.
*
* @since MU
*
* @param int $blog_id Blog ID.
*/
do_action( 'wpmu_blog_updated', $wpdb->blogid );
}
/**
* Get a full blog URL, given a blog id.
*
* @since MU
*
* @param int $blog_id Blog ID
* @return string Full URL of the blog if found. Empty string if not.
*/
function get_blogaddress_by_id( $blog_id ) {
$bloginfo = get_blog_details( (int) $blog_id );
if ( empty( $bloginfo ) ) {
return '';
}
$scheme = parse_url( $bloginfo->home, PHP_URL_SCHEME );
$scheme = empty( $scheme ) ? 'http' : $scheme;
return esc_url( $scheme . '://' . $bloginfo->domain . $bloginfo->path );
}
/**
* Get a full blog URL, given a blog name.
*
* @since MU
*
* @param string $blogname The (subdomain or directory) name
* @return string
*/
function get_blogaddress_by_name( $blogname ) {
if ( is_subdomain_install() ) {
if ( $blogname == 'main' )
$blogname = 'www';
$url = rtrim( network_home_url(), '/' );
if ( !empty( $blogname ) )
$url = preg_replace( '|^([^\.]+://)|', "\${1}" . $blogname . '.', $url );
} else {
$url = network_home_url( $blogname );
}
return esc_url( $url . '/' );
}
/**
* Given a blog's (subdomain or directory) slug, retrieve its id.
*
* @since MU
*
* @global wpdb $wpdb WordPress database abstraction object.
*
* @param string $slug
* @return int A blog id
*/
function get_id_from_blogname( $slug ) {
global $wpdb;
$current_site = get_current_site();
$slug = trim( $slug, '/' );
$blog_id = wp_cache_get( 'get_id_from_blogname_' . $slug, 'blog-details' );
if ( $blog_id )
return $blog_id;
if ( is_subdomain_install() ) {
$domain = $slug . '.' . $current_site->domain;
$path = $current_site->path;
} else {
$domain = $current_site->domain;
$path = $current_site->path . $slug . '/';
}
$blog_id = $wpdb->get_var( $wpdb->prepare("SELECT blog_id FROM {$wpdb->blogs} WHERE domain = %s AND path = %s", $domain, $path) );
wp_cache_set( 'get_id_from_blogname_' . $slug, $blog_id, 'blog-details' );
return $blog_id;
}
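/**
 * Illustrative sketch (not part of WordPress core): how a slug maps to a
 * (domain, path) pair under the two install types, mirroring the logic in
 * get_id_from_blogname() above. The example network values are assumptions.
 */
function _example_slug_to_domain_path( $slug, $subdomain_install, $site_domain = 'example.com', $site_path = '/' ) {
	$slug = trim( $slug, '/' );
	if ( $subdomain_install ) {
		// Subdomain installs: the slug becomes a hostname prefix.
		return array( $slug . '.' . $site_domain, $site_path );
	}
	// Directory installs: the slug becomes a path segment.
	return array( $site_domain, $site_path . $slug . '/' );
}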
/**
* Retrieve the details for a blog from the blogs table and blog options.
*
* @since MU
*
* @global wpdb $wpdb WordPress database abstraction object.
*
* @param int|string|array $fields Optional. A blog ID, a blog slug, or an array of fields to query against.
* If not specified the current blog ID is used.
* @param bool $get_all Whether to retrieve all details or only the details in the blogs table.
* Default is true.
* @return object|false Blog details on success. False on failure.
*/
function get_blog_details( $fields = null, $get_all = true ) {
global $wpdb;
if ( is_array($fields ) ) {
if ( isset($fields['blog_id']) ) {
$blog_id = $fields['blog_id'];
} elseif ( isset($fields['domain']) && isset($fields['path']) ) {
$key = md5( $fields['domain'] . $fields['path'] );
$blog = wp_cache_get($key, 'blog-lookup');
if ( false !== $blog )
return $blog;
if ( substr( $fields['domain'], 0, 4 ) == 'www.' ) {
$nowww = substr( $fields['domain'], 4 );
$blog = $wpdb->get_row( $wpdb->prepare( "SELECT * FROM $wpdb->blogs WHERE domain IN (%s,%s) AND path = %s ORDER BY CHAR_LENGTH(domain) DESC", $nowww, $fields['domain'], $fields['path'] ) );
} else {
$blog = $wpdb->get_row( $wpdb->prepare( "SELECT * FROM $wpdb->blogs WHERE domain = %s AND path = %s", $fields['domain'], $fields['path'] ) );
}
if ( $blog ) {
wp_cache_set($blog->blog_id . 'short', $blog, 'blog-details');
$blog_id = $blog->blog_id;
} else {
return false;
}
} elseif ( isset($fields['domain']) && is_subdomain_install() ) {
$key = md5( $fields['domain'] );
$blog = wp_cache_get($key, 'blog-lookup');
if ( false !== $blog )
return $blog;
if ( substr( $fields['domain'], 0, 4 ) == 'www.' ) {
$nowww = substr( $fields['domain'], 4 );
$blog = $wpdb->get_row( $wpdb->prepare( "SELECT * FROM $wpdb->blogs WHERE domain IN (%s,%s) ORDER BY CHAR_LENGTH(domain) DESC", $nowww, $fields['domain'] ) );
} else {
$blog = $wpdb->get_row( $wpdb->prepare( "SELECT * FROM $wpdb->blogs WHERE domain = %s", $fields['domain'] ) );
}
if ( $blog ) {
wp_cache_set($blog->blog_id . 'short', $blog, 'blog-details');
$blog_id = $blog->blog_id;
} else {
return false;
}
} else {
return false;
}
} else {
if ( ! $fields )
$blog_id = get_current_blog_id();
elseif ( ! is_numeric( $fields ) )
$blog_id = get_id_from_blogname( $fields );
else
$blog_id = $fields;
}
$blog_id = (int) $blog_id;
$all = $get_all == true ? '' : 'short';
$details = wp_cache_get( $blog_id . $all, 'blog-details' );
if ( $details ) {
if ( ! is_object( $details ) ) {
if ( $details == -1 ) {
return false;
} else {
// Clear old pre-serialized objects. Cache clients do better with that.
wp_cache_delete( $blog_id . $all, 'blog-details' );
unset($details);
}
} else {
return $details;
}
}
// Try the other cache.
if ( $get_all ) {
$details = wp_cache_get( $blog_id . 'short', 'blog-details' );
} else {
$details = wp_cache_get( $blog_id, 'blog-details' );
// If short was requested and full cache is set, we can return.
if ( $details ) {
if ( ! is_object( $details ) ) {
if ( $details == -1 ) {
return false;
} else {
// Clear old pre-serialized objects. Cache clients do better with that.
wp_cache_delete( $blog_id, 'blog-details' );
unset($details);
}
} else {
return $details;
}
}
}
if ( empty($details) ) {
$details = $wpdb->get_row( $wpdb->prepare( "SELECT * FROM $wpdb->blogs WHERE blog_id = %d /* get_blog_details */", $blog_id ) );
if ( ! $details ) {
// Set the full cache.
wp_cache_set( $blog_id, -1, 'blog-details' );
return false;
}
}
if ( ! $get_all ) {
wp_cache_set( $blog_id . $all, $details, 'blog-details' );
return $details;
}
switch_to_blog( $blog_id );
$details->blogname = get_option( 'blogname' );
$details->siteurl = get_option( 'siteurl' );
$details->post_count = get_option( 'post_count' );
$details->home = get_option( 'home' );
restore_current_blog();
/**
* Filter a blog's details.
*
* @since MU
*
* @param object $details The blog details.
*/
$details = apply_filters( 'blog_details', $details );
wp_cache_set( $blog_id . $all, $details, 'blog-details' );
$key = md5( $details->domain . $details->path );
wp_cache_set( $key, $details, 'blog-lookup' );
return $details;
}
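// Usage sketch (hedged, not from this codebase): get_blog_details() accepts
// a blog ID, a blog slug, or a domain/path array. On a multisite install one
// might write:
//
//     $site = get_blog_details( array( 'domain' => 'example.com', 'path' => '/blog/' ) );
//     if ( $site ) {
//         echo $site->blogname; // populated because $get_all defaults to true
//     }
//
// The domain and path values above are illustrative placeholders.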
/**
* Clear the blog details cache.
*
* @since MU
*
* @param int $blog_id Optional. Blog ID. Defaults to current blog.
*/
function refresh_blog_details( $blog_id = 0 ) {
$blog_id = (int) $blog_id;
if ( ! $blog_id ) {
$blog_id = get_current_blog_id();
}
$details = get_blog_details( $blog_id, false );
if ( ! $details ) {
// Make sure clean_blog_cache() gets the blog ID
// when the blog has been previously cached as
// non-existent.
$details = (object) array(
'blog_id' => $blog_id,
'domain' => null,
'path' => null
);
}
clean_blog_cache( $details );
/**
* Fires after the blog details cache is cleared.
*
* @since 3.4.0
*
* @param int $blog_id Blog ID.
*/
do_action( 'refresh_blog_details', $blog_id );
}
/**
* Update the details for a blog. Updates the blogs table for a given blog id.
*
* @since MU
*
* @global wpdb $wpdb WordPress database abstraction object.
*
* @param int $blog_id Blog ID
* @param array $details Array of details keyed by blogs table field names.
* @return bool True if update succeeds, false otherwise.
*/
function update_blog_details( $blog_id, $details = array() ) {
global $wpdb;
if ( empty($details) )
return false;
if ( is_object($details) )
$details = get_object_vars($details);
$current_details = get_blog_details($blog_id, false);
if ( empty($current_details) )
return false;
$current_details = get_object_vars($current_details);
$details = array_merge($current_details, $details);
$details['last_updated'] = current_time('mysql', true);
$update_details = array();
$fields = array( 'site_id', 'domain', 'path', 'registered', 'last_updated', 'public', 'archived', 'mature', 'spam', 'deleted', 'lang_id');
foreach ( array_intersect( array_keys( $details ), $fields ) as $field ) {
if ( 'path' === $field ) {
$details[ $field ] = trailingslashit( '/' . trim( $details[ $field ], '/' ) );
}
$update_details[ $field ] = $details[ $field ];
}
$result = $wpdb->update( $wpdb->blogs, $update_details, array('blog_id' => $blog_id) );
if ( false === $result )
return false;
// If spam status changed, issue actions.
if ( $details['spam'] != $current_details['spam'] ) {
if ( $details['spam'] == 1 ) {
/**
* Fires when the blog status is changed to 'spam'.
*
* @since MU
*
* @param int $blog_id Blog ID.
*/
do_action( 'make_spam_blog', $blog_id );
} else {
/**
* Fires when the blog status is changed to 'ham'.
*
* @since MU
*
* @param int $blog_id Blog ID.
*/
do_action( 'make_ham_blog', $blog_id );
}
}
// If mature status changed, issue actions.
if ( $details['mature'] != $current_details['mature'] ) {
if ( $details['mature'] == 1 ) {
/**
* Fires when the blog status is changed to 'mature'.
*
* @since 3.1.0
*
* @param int $blog_id Blog ID.
*/
do_action( 'mature_blog', $blog_id );
} else {
/**
* Fires when the blog status is changed to 'unmature'.
*
* @since 3.1.0
*
* @param int $blog_id Blog ID.
*/
do_action( 'unmature_blog', $blog_id );
}
}
// If archived status changed, issue actions.
if ( $details['archived'] != $current_details['archived'] ) {
if ( $details['archived'] == 1 ) {
/**
* Fires when the blog status is changed to 'archived'.
*
* @since MU
*
* @param int $blog_id Blog ID.
*/
do_action( 'archive_blog', $blog_id );
} else {
/**
* Fires when the blog status is changed to 'unarchived'.
*
* @since MU
*
* @param int $blog_id Blog ID.
*/
do_action( 'unarchive_blog', $blog_id );
}
}
// If deleted status changed, issue actions.
if ( $details['deleted'] != $current_details['deleted'] ) {
if ( $details['deleted'] == 1 ) {
/**
* Fires when the blog status is changed to 'deleted'.
*
* @since 3.5.0
*
* @param int $blog_id Blog ID.
*/
do_action( 'make_delete_blog', $blog_id );
} else {
/**
* Fires when the blog status is changed to 'undeleted'.
*
* @since 3.5.0
*
* @param int $blog_id Blog ID.
*/
do_action( 'make_undelete_blog', $blog_id );
}
}
if ( isset( $details['public'] ) ) {
switch_to_blog( $blog_id );
update_option( 'blog_public', $details['public'] );
restore_current_blog();
}
refresh_blog_details($blog_id);
return true;
}
/**
* Clean the blog cache
*
* @since 3.5.0
*
* @param stdClass $blog The blog details as returned from get_blog_details()
*/
function clean_blog_cache( $blog ) {
$blog_id = $blog->blog_id;
$domain_path_key = md5( $blog->domain . $blog->path );
wp_cache_delete( $blog_id , 'blog-details' );
wp_cache_delete( $blog_id . 'short' , 'blog-details' );
wp_cache_delete( $domain_path_key, 'blog-lookup' );
wp_cache_delete( 'current_blog_' . $blog->domain, 'site-options' );
wp_cache_delete( 'current_blog_' . $blog->domain . $blog->path, 'site-options' );
wp_cache_delete( 'get_id_from_blogname_' . trim( $blog->path, '/' ), 'blog-details' );
wp_cache_delete( $domain_path_key, 'blog-id-cache' );
}
/**
* Retrieve option value for a given blog id based on name of option.
*
* If the option does not exist or does not have a value, then the return value
* will be false. This is useful to check whether you need to install an option
* and is commonly used during installation of plugin options and to test
* whether upgrading is required.
*
* If the option was serialized then it will be unserialized when it is returned.
*
* @since MU
*
* @param int $id A blog ID. Can be null to refer to the current blog.
* @param string $option Name of option to retrieve. Expected to not be SQL-escaped.
* @param mixed $default Optional. Default value to return if the option does not exist.
* @return mixed Value set for the option.
*/
function get_blog_option( $id, $option, $default = false ) {
$id = (int) $id;
if ( empty( $id ) )
$id = get_current_blog_id();
if ( get_current_blog_id() == $id )
return get_option( $option, $default );
switch_to_blog( $id );
$value = get_option( $option, $default );
restore_current_blog();
/**
* Filter a blog option value.
*
* The dynamic portion of the hook name, `$option`, refers to the blog option name.
*
* @since 3.5.0
*
* @param string $value The option value.
* @param int $id Blog ID.
*/
return apply_filters( "blog_option_{$option}", $value, $id );
}
/**
* Add a new option for a given blog id.
*
* You do not need to serialize values. If the value needs to be serialized, then
* it will be serialized before it is inserted into the database. Remember,
* resources can not be serialized or added as an option.
*
* You can create options without values and then update the values later.
* Existing options will not be updated and checks are performed to ensure that you
* aren't adding a protected WordPress option. Care should be taken to not name
* options the same as the ones which are protected.
*
* @since MU
*
* @param int $id A blog ID. Can be null to refer to the current blog.
* @param string $option Name of option to add. Expected to not be SQL-escaped.
* @param mixed $value Optional. Option value, can be anything. Expected to not be SQL-escaped.
* @return bool False if option was not added and true if option was added.
*/
function add_blog_option( $id, $option, $value ) {
$id = (int) $id;
if ( empty( $id ) )
$id = get_current_blog_id();
if ( get_current_blog_id() == $id )
return add_option( $option, $value );
switch_to_blog( $id );
$return = add_option( $option, $value );
restore_current_blog();
return $return;
}
/**
* Removes option by name for a given blog id. Prevents removal of protected WordPress options.
*
* @since MU
*
* @param int $id A blog ID. Can be null to refer to the current blog.
* @param string $option Name of option to remove. Expected to not be SQL-escaped.
* @return bool True, if option is successfully deleted. False on failure.
*/
function delete_blog_option( $id, $option ) {
$id = (int) $id;
if ( empty( $id ) )
$id = get_current_blog_id();
if ( get_current_blog_id() == $id )
return delete_option( $option );
switch_to_blog( $id );
$return = delete_option( $option );
restore_current_blog();
return $return;
}
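// Usage sketch (hedged): the *_blog_option() helpers mirror the single-site
// option API but handle the blog switch for you. For example, on blog 3:
//
//     add_blog_option( 3, 'my_plugin_flag', 'on' );   // false if the option already exists
//     $flag = get_blog_option( 3, 'my_plugin_flag' ); // 'on'
//     delete_blog_option( 3, 'my_plugin_flag' );
//
// 'my_plugin_flag' is a hypothetical option name used only for illustration.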
/**
* Update an option for a particular blog.
*
* @since MU
*
* @param int $id The blog id
* @param string $option The option key
* @param mixed $value The option value
* @return bool True on success, false on failure.
*/
function update_blog_option( $id, $option, $value, $deprecated = null ) {
$id = (int) $id;
if ( null !== $deprecated )
_deprecated_argument( __FUNCTION__, '3.1' );
if ( get_current_blog_id() == $id )
return update_option( $option, $value );
switch_to_blog( $id );
$return = update_option( $option, $value );
restore_current_blog();
refresh_blog_details( $id );
return $return;
}
/**
* Switch the current blog.
*
* This function is useful if you need to pull posts, or other information,
* from other blogs. You can switch back afterwards using restore_current_blog().
*
* Things that aren't switched:
* - autoloaded options. See #14992
* - plugins. See #14941
*
* @see restore_current_blog()
* @since MU
*
* @global wpdb $wpdb
* @global int $blog_id
* @global array $_wp_switched_stack
* @global bool $switched
* @global string $table_prefix
* @global WP_Object_Cache $wp_object_cache
*
* @param int $new_blog The id of the blog you want to switch to. Default: current blog
* @param bool $deprecated Deprecated argument
* @return true Always returns true.
*/
function switch_to_blog( $new_blog, $deprecated = null ) {
global $wpdb;
if ( empty( $new_blog ) )
$new_blog = $GLOBALS['blog_id'];
$GLOBALS['_wp_switched_stack'][] = $GLOBALS['blog_id'];
/*
* If we're switching to the same blog id that we're on,
* set the right vars, do the associated actions, but skip
* the extra unnecessary work
*/
if ( $new_blog == $GLOBALS['blog_id'] ) {
/**
* Fires when the blog is switched.
*
* @since MU
*
* @param int $new_blog New blog ID.
* @param int $prev_blog_id Previous blog ID (here the same as $new_blog).
*/
do_action( 'switch_blog', $new_blog, $new_blog );
$GLOBALS['switched'] = true;
return true;
}
$wpdb->set_blog_id( $new_blog );
$GLOBALS['table_prefix'] = $wpdb->get_blog_prefix();
$prev_blog_id = $GLOBALS['blog_id'];
$GLOBALS['blog_id'] = $new_blog;
if ( function_exists( 'wp_cache_switch_to_blog' ) ) {
wp_cache_switch_to_blog( $new_blog );
} else {
global $wp_object_cache;
if ( is_object( $wp_object_cache ) && isset( $wp_object_cache->global_groups ) )
$global_groups = $wp_object_cache->global_groups;
else
$global_groups = false;
wp_cache_init();
if ( function_exists( 'wp_cache_add_global_groups' ) ) {
if ( is_array( $global_groups ) ) {
wp_cache_add_global_groups( $global_groups );
} else {
wp_cache_add_global_groups( array( 'users', 'userlogins', 'usermeta', 'user_meta', 'useremail', 'userslugs', 'site-transient', 'site-options', 'site-lookup', 'blog-lookup', 'blog-details', 'rss', 'global-posts', 'blog-id-cache', 'networks' ) );
}
wp_cache_add_non_persistent_groups( array( 'comment', 'counts', 'plugins' ) );
}
}
if ( did_action( 'init' ) ) {
wp_roles()->reinit();
$current_user = wp_get_current_user();
$current_user->for_blog( $new_blog );
}
/** This filter is documented in wp-includes/ms-blogs.php */
do_action( 'switch_blog', $new_blog, $prev_blog_id );
$GLOBALS['switched'] = true;
return true;
}
/**
* Restore the current blog, after calling switch_to_blog()
*
* @see switch_to_blog()
* @since MU
*
* @global wpdb $wpdb
* @global array $_wp_switched_stack
* @global int $blog_id
* @global bool $switched
* @global string $table_prefix
* @global WP_Object_Cache $wp_object_cache
*
* @return bool True on success, false if we're already on the current blog
*/
function restore_current_blog() {
global $wpdb;
if ( empty( $GLOBALS['_wp_switched_stack'] ) )
return false;
$blog = array_pop( $GLOBALS['_wp_switched_stack'] );
if ( $GLOBALS['blog_id'] == $blog ) {
/** This filter is documented in wp-includes/ms-blogs.php */
do_action( 'switch_blog', $blog, $blog );
// If we still have items in the switched stack, consider ourselves still 'switched'
$GLOBALS['switched'] = ! empty( $GLOBALS['_wp_switched_stack'] );
return true;
}
$wpdb->set_blog_id( $blog );
$prev_blog_id = $GLOBALS['blog_id'];
$GLOBALS['blog_id'] = $blog;
$GLOBALS['table_prefix'] = $wpdb->get_blog_prefix();
if ( function_exists( 'wp_cache_switch_to_blog' ) ) {
wp_cache_switch_to_blog( $blog );
} else {
global $wp_object_cache;
if ( is_object( $wp_object_cache ) && isset( $wp_object_cache->global_groups ) )
$global_groups = $wp_object_cache->global_groups;
else
$global_groups = false;
wp_cache_init();
if ( function_exists( 'wp_cache_add_global_groups' ) ) {
if ( is_array( $global_groups ) ) {
wp_cache_add_global_groups( $global_groups );
} else {
wp_cache_add_global_groups( array( 'users', 'userlogins', 'usermeta', 'user_meta', 'useremail', 'userslugs', 'site-transient', 'site-options', 'site-lookup', 'blog-lookup', 'blog-details', 'rss', 'global-posts', 'blog-id-cache', 'networks' ) );
}
wp_cache_add_non_persistent_groups( array( 'comment', 'counts', 'plugins' ) );
}
}
if ( did_action( 'init' ) ) {
wp_roles()->reinit();
$current_user = wp_get_current_user();
$current_user->for_blog( $blog );
}
/** This filter is documented in wp-includes/ms-blogs.php */
do_action( 'switch_blog', $blog, $prev_blog_id );
// If we still have items in the switched stack, consider ourselves still 'switched'
$GLOBALS['switched'] = ! empty( $GLOBALS['_wp_switched_stack'] );
return true;
}
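// Usage sketch (hedged): switch_to_blog() pushes the current blog ID onto
// $_wp_switched_stack and restore_current_blog() pops it, so the two calls
// must always be balanced:
//
//     switch_to_blog( $other_blog_id );
//     $posts = get_posts( array( 'numberposts' => 5 ) );
//     restore_current_blog(); // exactly one restore per switch
//
// $other_blog_id is a placeholder for a real blog ID.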
/**
* Determines if switch_to_blog() is in effect
*
* @since 3.5.0
*
* @global array $_wp_switched_stack
*
* @return bool True if switched, false otherwise.
*/
function ms_is_switched() {
return ! empty( $GLOBALS['_wp_switched_stack'] );
}
/**
* Check if a particular blog is archived.
*
* @since MU
*
* @param int $id The blog id
* @return string Whether the blog is archived or not
*/
function is_archived( $id ) {
return get_blog_status($id, 'archived');
}
/**
* Update the 'archived' status of a particular blog.
*
* @since MU
*
* @param int $id The blog id
* @param string $archived The new status
* @return string $archived
*/
function update_archived( $id, $archived ) {
update_blog_status($id, 'archived', $archived);
return $archived;
}
/**
* Update a blog details field.
*
* @since MU
*
* @global wpdb $wpdb WordPress database abstraction object.
*
* @param int $blog_id Blog ID
* @param string $pref A field name
* @param string $value Value for $pref
* @param null $deprecated
* @return string|false $value
*/
function update_blog_status( $blog_id, $pref, $value, $deprecated = null ) {
global $wpdb;
if ( null !== $deprecated )
_deprecated_argument( __FUNCTION__, '3.1' );
if ( ! in_array( $pref, array( 'site_id', 'domain', 'path', 'registered', 'last_updated', 'public', 'archived', 'mature', 'spam', 'deleted', 'lang_id') ) )
return $value;
$result = $wpdb->update( $wpdb->blogs, array($pref => $value, 'last_updated' => current_time('mysql', true)), array('blog_id' => $blog_id) );
if ( false === $result )
return false;
refresh_blog_details( $blog_id );
if ( 'spam' == $pref ) {
if ( $value == 1 ) {
/** This filter is documented in wp-includes/ms-blogs.php */
do_action( 'make_spam_blog', $blog_id );
} else {
/** This filter is documented in wp-includes/ms-blogs.php */
do_action( 'make_ham_blog', $blog_id );
}
} elseif ( 'mature' == $pref ) {
if ( $value == 1 ) {
/** This filter is documented in wp-includes/ms-blogs.php */
do_action( 'mature_blog', $blog_id );
} else {
/** This filter is documented in wp-includes/ms-blogs.php */
do_action( 'unmature_blog', $blog_id );
}
} elseif ( 'archived' == $pref ) {
if ( $value == 1 ) {
/** This filter is documented in wp-includes/ms-blogs.php */
do_action( 'archive_blog', $blog_id );
} else {
/** This filter is documented in wp-includes/ms-blogs.php */
do_action( 'unarchive_blog', $blog_id );
}
} elseif ( 'deleted' == $pref ) {
if ( $value == 1 ) {
/** This filter is documented in wp-includes/ms-blogs.php */
do_action( 'make_delete_blog', $blog_id );
} else {
/** This filter is documented in wp-includes/ms-blogs.php */
do_action( 'make_undelete_blog', $blog_id );
}
} elseif ( 'public' == $pref ) {
/**
* Fires after the current blog's 'public' setting is updated.
*
* @since MU
*
* @param int $blog_id Blog ID.
* @param string $value The value of blog status.
*/
do_action( 'update_blog_public', $blog_id, $value ); // Moved here from update_blog_public().
}
return $value;
}
/**
* Get a blog details field.
*
* @since MU
*
* @global wpdb $wpdb WordPress database abstraction object.
*
* @param int $id The blog id
* @param string $pref A field name
* @return bool|string|null $value
*/
function get_blog_status( $id, $pref ) {
global $wpdb;
$details = get_blog_details( $id, false );
if ( $details )
return $details->$pref;
return $wpdb->get_var( $wpdb->prepare("SELECT %s FROM {$wpdb->blogs} WHERE blog_id = %d", $pref, $id) );
}
/**
* Get a list of most recently updated blogs.
*
* @since MU
*
* @global wpdb $wpdb WordPress database abstraction object.
*
* @param mixed $deprecated Not used
* @param int $start The offset
* @param int $quantity The maximum number of blogs to retrieve. Default is 40.
* @return array The list of blogs
*/
function get_last_updated( $deprecated = '', $start = 0, $quantity = 40 ) {
global $wpdb;
if ( ! empty( $deprecated ) )
_deprecated_argument( __FUNCTION__, 'MU' ); // never used
return $wpdb->get_results( $wpdb->prepare("SELECT blog_id, domain, path FROM $wpdb->blogs WHERE site_id = %d AND public = '1' AND archived = '0' AND mature = '0' AND spam = '0' AND deleted = '0' AND last_updated != '0000-00-00 00:00:00' ORDER BY last_updated DESC limit %d, %d", $wpdb->siteid, $start, $quantity ) , ARRAY_A );
}
/**
* Handler for updating the blog date when a post is published or an already published post is changed.
*
* @since 3.3.0
*
* @param string $new_status The new post status
* @param string $old_status The old post status
* @param object $post Post object
*/
function _update_blog_date_on_post_publish( $new_status, $old_status, $post ) {
$post_type_obj = get_post_type_object( $post->post_type );
if ( ! $post_type_obj || ! $post_type_obj->public ) {
return;
}
if ( 'publish' != $new_status && 'publish' != $old_status ) {
return;
}
// Post was freshly published, published post was saved, or published post was unpublished.
wpmu_update_blogs_date();
}
/**
* Handler for updating the blog date when a published post is deleted.
*
* @since 3.4.0
*
* @param int $post_id Post ID
*/
function _update_blog_date_on_post_delete( $post_id ) {
$post = get_post( $post_id );
$post_type_obj = get_post_type_object( $post->post_type );
if ( ! $post_type_obj || ! $post_type_obj->public ) {
return;
}
if ( 'publish' != $post->post_status ) {
return;
}
wpmu_update_blogs_date();
}
/**
* Handler for updating the blog posts count date when a post is deleted.
*
* @since 4.0.0
*
* @param int $post_id Post ID.
*/
function _update_posts_count_on_delete( $post_id ) {
$post = get_post( $post_id );
if ( ! $post || 'publish' !== $post->post_status ) {
return;
}
update_posts_count();
}
/**
* Handler for updating the blog posts count date when a post status changes.
*
* @since 4.0.0
*
* @param string $new_status The status the post is changing to.
* @param string $old_status The status the post is changing from.
*/
function _update_posts_count_on_transition_post_status( $new_status, $old_status ) {
if ( $new_status === $old_status ) {
return;
}
if ( 'publish' !== $new_status && 'publish' !== $old_status ) {
return;
}
update_posts_count();
}
```
|
```javascript
/*
* one or more contributor license agreements. See the NOTICE file distributed
* with this work for additional information regarding copyright ownership.
*/
export {default as DownloadButton} from './DownloadButton';
```
|
```java
package io.jpress.model.base;
import io.jboot.db.model.JbootModel;
import com.jfinal.plugin.activerecord.IBean;
/**
* Generated by JPress, do not modify this file.
*/
@SuppressWarnings("serial")
public abstract class BaseOption<M extends BaseOption<M>> extends JbootModel<M> implements IBean {
private static final long serialVersionUID = 1L;
/**
* ID
*/
public void setId(java.lang.Long id) {
set("id", id);
}
/**
* ID
*/
public java.lang.Long getId() {
return getLong("id");
}
/**
* KEY
*/
public void setKey(java.lang.String key) {
set("key", key);
}
/**
* KEY
*/
public java.lang.String getKey() {
return getStr("key");
}
/**
*
*/
public void setValue(java.lang.String value) {
set("value", value);
}
/**
*
*/
public java.lang.String getValue() {
return getStr("value");
}
/**
* ID
*/
public void setSiteId(java.lang.Long siteId) {
set("site_id", siteId);
}
/**
* ID
*/
public java.lang.Long getSiteId() {
return getLong("site_id");
}
}
```
|
Paul Hampton Adams (December 12, 1919 – September 24, 1995) was an American football player and coach. He played for the Pittsburgh Steelers of the National Football League (NFL) in 1947. Adams served as the head football coach at Morehead State University in Morehead, Kentucky from 1956 to 1958, compiling a record of 4–21–4.
Head coaching record
References
1919 births
1995 deaths
American football centers
Morehead State Eagles football coaches
Morehead State Eagles football players
Pittsburgh Steelers players
People from Lawrence County, Ohio
Coaches of American football from Ohio
Players of American football from Ohio
|
```php
<?php declare(strict_types=1);
namespace Tests\Utils\Unions;
use GraphQL\Type\Definition\Type;
use Nuwave\Lighthouse\Schema\TypeRegistry;
final class Person
{
public function __construct(private TypeRegistry $typeRegistry) {}
/** @param array<string, mixed> $value */
public function resolveType(array $value): Type
{
$type = isset($value['id'])
? 'User'
: 'Employee';
return $this->typeRegistry->get($type);
}
}
```
|
```csharp
//your_sha256_hash--------------
// <auto-generated>
// This code was generated by a tool.
// Runtime version: 4.0.30319.42000
//
// Changes to this file may cause incorrect behavior and will be lost if
// the code is regenerated.
// </auto-generated>
//your_sha256_hash--------------
namespace nspector.Properties {
using System;
/// <summary>
/// A strongly-typed resource class, for looking up localized strings, etc.
/// </summary>
// This class was auto-generated by the StronglyTypedResourceBuilder
// class via a tool like ResGen or Visual Studio.
// To add or remove a member, edit your .ResX file then rerun ResGen
// with the /str option, or rebuild your VS project.
[global::System.CodeDom.Compiler.GeneratedCodeAttribute("System.Resources.Tools.StronglyTypedResourceBuilder", "17.0.0.0")]
[global::System.Diagnostics.DebuggerNonUserCodeAttribute()]
[global::System.Runtime.CompilerServices.CompilerGeneratedAttribute()]
public class Resources {
private static global::System.Resources.ResourceManager resourceMan;
private static global::System.Globalization.CultureInfo resourceCulture;
[global::System.Diagnostics.CodeAnalysis.SuppressMessageAttribute("Microsoft.Performance", "CA1811:AvoidUncalledPrivateCode")]
internal Resources() {
}
/// <summary>
/// Returns the cached ResourceManager instance used by this class.
/// </summary>
[global::System.ComponentModel.EditorBrowsableAttribute(global::System.ComponentModel.EditorBrowsableState.Advanced)]
public static global::System.Resources.ResourceManager ResourceManager {
get {
if (object.ReferenceEquals(resourceMan, null)) {
global::System.Resources.ResourceManager temp = new global::System.Resources.ResourceManager("nspector.Properties.Resources", typeof(Resources).Assembly);
resourceMan = temp;
}
return resourceMan;
}
}
/// <summary>
/// Overrides the current thread's CurrentUICulture property for all
/// resource lookups using this strongly typed resource class.
/// </summary>
[global::System.ComponentModel.EditorBrowsableAttribute(global::System.ComponentModel.EditorBrowsableState.Advanced)]
public static global::System.Globalization.CultureInfo Culture {
get {
return resourceCulture;
}
set {
resourceCulture = value;
}
}
/// <summary>
/// Looks up a localized resource of type System.Drawing.Bitmap.
/// </summary>
public static System.Drawing.Bitmap apply {
get {
object obj = ResourceManager.GetObject("apply", resourceCulture);
return ((System.Drawing.Bitmap)(obj));
}
}
/// <summary>
/// Looks up a localized string similar to <?xml version="1.0" encoding="utf-8"?>
///<CustomSettingNames>
/// <Settings>
/// <CustomSetting>
/// <UserfriendlyName>VRR Support Indicator</UserfriendlyName>
/// <HexSettingID>0x8df510</HexSettingID>
/// <GroupName>2 - Sync and Refresh</GroupName>
/// <SettingValues>
/// <CustomSettingValue>
/// <UserfriendlyName>Off</UserfriendlyName>
/// <HexValue>0x00000000</HexValue>
/// </CustomSettingValue>
/// <CustomSettingValue>
/// <UserfriendlyName>On</UserfriendlyName>
/// <HexValue>0x0000000 [Rest of string was truncated]";.
/// </summary>
public static string CustomSettingNames {
get {
return ResourceManager.GetString("CustomSettingNames", resourceCulture);
}
}
/// <summary>
/// Looks up a localized resource of type System.Drawing.Bitmap.
/// </summary>
public static System.Drawing.Bitmap export1 {
get {
object obj = ResourceManager.GetObject("export1", resourceCulture);
return ((System.Drawing.Bitmap)(obj));
}
}
/// <summary>
/// Looks up a localized resource of type System.Drawing.Bitmap.
/// </summary>
public static System.Drawing.Bitmap filter_user {
get {
object obj = ResourceManager.GetObject("filter_user", resourceCulture);
return ((System.Drawing.Bitmap)(obj));
}
}
/// <summary>
/// Looks up a localized resource of type System.Drawing.Bitmap.
/// </summary>
public static System.Drawing.Bitmap find_set2 {
get {
object obj = ResourceManager.GetObject("find_set2", resourceCulture);
return ((System.Drawing.Bitmap)(obj));
}
}
/// <summary>
/// Looks up a localized resource of type System.Drawing.Bitmap.
/// </summary>
public static System.Drawing.Bitmap home_sm {
get {
object obj = ResourceManager.GetObject("home_sm", resourceCulture);
return ((System.Drawing.Bitmap)(obj));
}
}
/// <summary>
/// Looks up a localized resource of type System.Drawing.Icon similar to (Icon).
/// </summary>
public static System.Drawing.Icon Icon1 {
get {
object obj = ResourceManager.GetObject("Icon1", resourceCulture);
return ((System.Drawing.Icon)(obj));
}
}
/// <summary>
/// Looks up a localized resource of type System.Drawing.Bitmap.
/// </summary>
public static System.Drawing.Bitmap ieframe_1_18212 {
get {
object obj = ResourceManager.GetObject("ieframe_1_18212", resourceCulture);
return ((System.Drawing.Bitmap)(obj));
}
}
/// <summary>
/// Looks up a localized resource of type System.Drawing.Bitmap.
/// </summary>
public static System.Drawing.Bitmap ieframe_1_31073_002 {
get {
object obj = ResourceManager.GetObject("ieframe_1_31073_002", resourceCulture);
return ((System.Drawing.Bitmap)(obj));
}
}
/// <summary>
/// Looks up a localized resource of type System.Drawing.Bitmap.
/// </summary>
public static System.Drawing.Bitmap import1 {
get {
object obj = ResourceManager.GetObject("import1", resourceCulture);
return ((System.Drawing.Bitmap)(obj));
}
}
/// <summary>
/// Looks up a localized resource of type System.Drawing.Bitmap.
/// </summary>
public static System.Drawing.Bitmap n1_016 {
get {
object obj = ResourceManager.GetObject("n1-016", resourceCulture);
return ((System.Drawing.Bitmap)(obj));
}
}
/// <summary>
/// Looks up a localized resource of type System.Drawing.Bitmap.
/// </summary>
public static System.Drawing.Bitmap nv_btn {
get {
object obj = ResourceManager.GetObject("nv_btn", resourceCulture);
return ((System.Drawing.Bitmap)(obj));
}
}
/// <summary>
/// Looks up a localized resource of type System.Drawing.Bitmap.
/// </summary>
public static System.Drawing.Bitmap PortableDeviceStatus_3_16_011 {
get {
object obj = ResourceManager.GetObject("PortableDeviceStatus_3_16_011", resourceCulture);
return ((System.Drawing.Bitmap)(obj));
}
}
/// <summary>
/// Looks up a localized resource of type System.Drawing.Icon similar to (Icon).
/// </summary>
public static System.Drawing.Icon shield16 {
get {
object obj = ResourceManager.GetObject("shield16", resourceCulture);
return ((System.Drawing.Icon)(obj));
}
}
/// <summary>
/// Looks up a localized resource of type System.Drawing.Bitmap.
/// </summary>
public static System.Drawing.Bitmap text_binary {
get {
object obj = ResourceManager.GetObject("text_binary", resourceCulture);
return ((System.Drawing.Bitmap)(obj));
}
}
/// <summary>
/// Looks up a localized resource of type System.Drawing.Bitmap.
/// </summary>
public static System.Drawing.Bitmap transparent16 {
get {
object obj = ResourceManager.GetObject("transparent16", resourceCulture);
return ((System.Drawing.Bitmap)(obj));
}
}
/// <summary>
/// Looks up a localized resource of type System.Drawing.Bitmap.
/// </summary>
public static System.Drawing.Bitmap window_application_add {
get {
object obj = ResourceManager.GetObject("window_application_add", resourceCulture);
return ((System.Drawing.Bitmap)(obj));
}
}
/// <summary>
/// Looks up a localized resource of type System.Drawing.Bitmap.
/// </summary>
public static System.Drawing.Bitmap window_application_delete {
get {
object obj = ResourceManager.GetObject("window_application_delete", resourceCulture);
return ((System.Drawing.Bitmap)(obj));
}
}
}
}
```
|
```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Net.Http;
using System.Text.RegularExpressions;
using System.Threading.Tasks;
using Certify.Models;
using Certify.Models.Config;
using Certify.Models.Plugins;
using Certify.Models.Providers;
using Certify.Plugins;
using Newtonsoft.Json;
namespace Certify.Providers.DNS.AcmeDns
{
#pragma warning disable IDE1006 // Naming Styles
internal class AcmeDnsRegistration
{
public List<string> allowfrom { get; set; }
public string fulldomain { get; set; }
public string subdomain { get; set; }
public string password { get; set; }
public string username { get; set; }
public string subject { get; set; }
}
internal class AcmeDnsUpdate
{
public string txt { get; set; }
}
#pragma warning restore IDE1006 // Naming Styles
public class DnsProviderAcmeDnsProvider : PluginProviderBase<IDnsProvider, ChallengeProviderDefinition>, IDnsProviderProviderPlugin { }
public class DnsProviderAcmeDns : IDnsProvider
{
private Dictionary<string, string> _credentials;
private ILog _log;
private int? _customPropagationDelay = null;
public int PropagationDelaySeconds => (_customPropagationDelay != null ? (int)_customPropagationDelay : Definition.PropagationDelaySeconds);
public string ProviderId => Definition.Id;
public string ProviderTitle => Definition.Title;
public string ProviderDescription => Definition.Description;
public string ProviderHelpUrl => Definition.HelpUrl;
public bool IsTestModeSupported => Definition.IsTestModeSupported;
public List<ProviderParameter> ProviderParameters => Definition.ProviderParameters;
public static ChallengeProviderDefinition Definition
{
get
{
return new ChallengeProviderDefinition
{
Id = "DNS01.API.AcmeDns",
Title = "acme-dns DNS API",
Description = "Validates via an acme-dns server using CNAME redirection to an alternative DNS service dedicated to ACME challenge responses.",
HelpUrl = "path_to_url",
PropagationDelaySeconds = 5,
ProviderParameters = new List<ProviderParameter>{
new ProviderParameter{ Key="api",Name="API Url", IsRequired=true, IsCredential=false, IsPassword=false, Value="path_to_url", Description="Self hosted API is recommended: path_to_url" },
new ProviderParameter{ Key="allowfrom",Name="Optional Allow From IPs", IsCredential=false, IsRequired=false, IsPassword=false, Description="e.g. 192.168.100.1/24; 1.2.3.4/32; 2002:c0a8:2a00::0/40" },
new ProviderParameter{ Key="credentials_json",Name="Optional credentials JSON", IsCredential=false, IsRequired=false, IsPassword=false, Type=OptionType.MultiLineText, Description="(optional) Only used if already registered or service supports pre-registering." }
},
ChallengeType = SupportedChallengeTypes.CHALLENGE_TYPE_DNS,
Config = "Provider=Certify.Providers.DNS.AcmeDns",
HandlerType = ChallengeHandlerType.INTERNAL
};
}
}
private HttpClient _client;
private Dictionary<string, string> _parameters = new Dictionary<string, string>();
private JsonSerializerSettings _serializerSettings;
private string _settingsPath { get; set; }
private Uri _apiBaseUri { get; set; }
public DnsProviderAcmeDns()
{
_settingsPath = EnvironmentUtil.CreateAppDataPath();
_client = new HttpClient();
_client.DefaultRequestHeaders.Add("User-Agent", "Certify/DnsProviderAcmeDns");
_serializerSettings = new JsonSerializerSettings
{
Formatting = Formatting.None,
MissingMemberHandling = MissingMemberHandling.Ignore,
NullValueHandling = NullValueHandling.Ignore
};
}
private string _settingsStoreName { get; set; } = "acmedns";
/// <summary>
/// Checks for and loads an existing registration settings file and also returns the computed settings path used.
/// </summary>
/// <param name="storeName"></param>
/// <param name="settingsPath"></param>
/// <param name="domainId"></param>
/// <param name="apiPrefix"></param>
/// <param name="ensureSuffix"></param>
/// <returns></returns>
private (AcmeDnsRegistration registration, bool isNewRegistration, string fullSettingsPath) EnsureExistingRegistration(string storeName, string settingsPath, string domainId, string apiPrefix, string ensureSuffix = null)
{
var registrationSettingsPath = Path.Combine(settingsPath, storeName);
if (!System.IO.Directory.Exists(registrationSettingsPath))
{
System.IO.Directory.CreateDirectory(registrationSettingsPath);
}
var domainConfigFile = domainId.Replace("*.", "") + ".json";
var filenameRegex = new Regex(
$"[{Regex.Escape(new string(Path.GetInvalidFileNameChars()))}]",
RegexOptions.Singleline | RegexOptions.Compiled | RegexOptions.CultureInvariant
);
domainConfigFile = filenameRegex.Replace(domainConfigFile, "_");
registrationSettingsPath = Path.Combine(registrationSettingsPath, apiPrefix + "_" + domainConfigFile);
if (System.IO.File.Exists(registrationSettingsPath))
{
// registration exists
var reg = JsonConvert.DeserializeObject<AcmeDnsRegistration>(System.IO.File.ReadAllText(registrationSettingsPath));
if (ensureSuffix == null || reg.fulldomain.EndsWith(ensureSuffix))
{
// is an existing registration
return (registration: reg, isNewRegistration: false, fullSettingsPath: registrationSettingsPath);
}
else
{
// registration is not a valid Certify DNS registration; it must belong to another acme-dns service
_log?.Warning("Existing acme-dns registration found, new registration required for Certify DNS");
}
}
// registration config not matched
return (registration: null, isNewRegistration: true, fullSettingsPath: registrationSettingsPath);
}
private async Task<(AcmeDnsRegistration registration, bool isNewRegistration)> Register(string settingsPath, string domainId)
{
var apiPrefix = "";
if (_parameters.TryGetValue("api", out var apiBase) && !string.IsNullOrWhiteSpace(apiBase))
{
_apiBaseUri = new System.Uri(apiBase);
if (!_apiBaseUri.ToString().EndsWith("/"))
{
_apiBaseUri = new Uri($"{_apiBaseUri}/");
}
_client.BaseAddress = _apiBaseUri;
// we prefix the settings file with the encoded API url as these settings are
// only useful on the target API, changing the api should change all settings
apiPrefix = ToUrlSafeBase64String(_client.BaseAddress.Host);
}
else
{
_log?.Error("acme-dns requires a valid API URL.");
return (null, false);
}
// optionally load and use an existing registration from provided json
if (_parameters.TryGetValue("credentials_json", out var credentialsJson) && !string.IsNullOrWhiteSpace(credentialsJson))
{
// use an existing registration credentials file
try
{
var reg = JsonConvert.DeserializeObject<AcmeDnsRegistration>(credentialsJson);
if (!string.IsNullOrEmpty(reg.fulldomain))
{
return (registration: reg, isNewRegistration: false);
}
else
{
_log?.Error("Credentials JSON does not appear to be valid.");
}
}
catch
{
_log?.Error("Failed to parse credentials JSON. Leave the Credentials JSON blank if not known/required.");
_log?.Information("acme-dns API registration failed.");
return (null, false);
}
}
_settingsStoreName = "acmedns";
var registrationCheck = EnsureExistingRegistration(_settingsStoreName, settingsPath, domainId, apiPrefix);
// return existing if any
if (registrationCheck.registration != null)
{
return (registration: registrationCheck.registration, isNewRegistration: registrationCheck.isNewRegistration);
}
var registration = new AcmeDns.AcmeDnsRegistration();
if (_parameters.TryGetValue("allowfrom", out var allowFrom) && !string.IsNullOrWhiteSpace(allowFrom))
{
var rules = allowFrom.Split(';');
registration.allowfrom = new List<string>();
foreach (var r in rules)
{
registration.allowfrom.Add(r.Trim().ToLower());
}
}
var json = JsonConvert.SerializeObject(registration, _serializerSettings);
var req = new HttpRequestMessage(HttpMethod.Post, $"{_apiBaseUri}register");
if (_credentials?.ContainsKey("user") == true && _credentials?.ContainsKey("key") == true)
{
var basicCredentials = "Basic " + ToUrlSafeBase64String(_credentials["user"] + ":" + _credentials["key"]);
req.Headers.Add("Authorization", basicCredentials);
}
req.Content = new StringContent(json);
var response = await _client.SendAsync(req);
if (response.IsSuccessStatusCode)
{
// got new registration
var responseJson = await response.Content.ReadAsStringAsync();
registration = JsonConvert.DeserializeObject<AcmeDnsRegistration>(responseJson);
// save these settings for later
System.IO.File.WriteAllText(registrationCheck.fullSettingsPath, JsonConvert.SerializeObject(registration));
_log?.Information("API registration completed");
// is a new registration
return (registration, true);
}
else
{
// failed to register
_log?.Information("API registration failed");
return (null, false);
}
}
public async Task<ActionResult> Test()
{
return await Task.FromResult(new ActionResult { IsSuccess = true, Message = "Test completed, but no zones returned." });
}
public async Task<ActionResult> CreateRecord(DnsRecord request)
{
// create or load registration settings
var (registration, isNewRegistration) = await Register(_settingsPath, request.TargetDomainName);
if (registration == null)
{
return new ActionResult { IsSuccess = false, Message = $"Failed to register with acme-dns. Check API Url and required credentials (if any)." };
}
if (isNewRegistration)
{
return new ActionResult
{
IsSuccess = false,
Message = $"[Action Required] To complete setup, add a CNAME record in your DNS:\r\n\t{request.RecordName}\r\nwith the value:\r\n\t{registration.fulldomain}",
Result = new { Name = request.RecordName, Value = registration.fulldomain }
};
}
var req = new HttpRequestMessage(HttpMethod.Post, $"{_apiBaseUri}update");
req.Headers.Add("X-Api-User", registration.username);
req.Headers.Add("X-Api-Key", registration.password);
var update = new
{
subdomain = registration.subdomain,
txt = request.RecordValue
};
var json = JsonConvert.SerializeObject(update, _serializerSettings);
req.Content = new StringContent(json);
var result = await _client.SendAsync(req);
try
{
if (result.IsSuccessStatusCode)
{
var responseJson = await result.Content.ReadAsStringAsync();
var updateResult = JsonConvert.DeserializeObject<AcmeDnsUpdate>(responseJson);
return new ActionResult { IsSuccess = true, Message = $"Updated: {request.RecordName} :: {registration.fulldomain}" };
}
else
{
return new ActionResult { IsSuccess = false, Message = $"Update failed with status {result.StatusCode}. acme-dns API credentials may be invalid. Ensure the {request.RecordName} CNAME points to {registration.fulldomain}." };
}
}
catch (Exception exp)
{
return new ActionResult { IsSuccess = false, Message = $"Update failed: {exp.Message}" };
}
}
public async Task<ActionResult> DeleteRecord(DnsRecord request)
{
return await Task.FromResult(
new ActionResult { IsSuccess = true, Message = $"Dns Record Delete skipped (not applicable): {request.RecordName}" }
);
}
public async Task<List<DnsZone>> GetZones()
{
var results = new List<DnsZone>();
return await Task.FromResult(results);
}
public async Task<bool> InitProvider(Dictionary<string, string> credentials, Dictionary<string, string> parameters, ILog log = null)
{
_credentials = credentials;
_log = log;
_parameters = parameters;
if (parameters?.ContainsKey("propagationdelay") == true)
{
if (int.TryParse(parameters["propagationdelay"], out var customPropDelay))
{
_customPropagationDelay = customPropDelay;
}
}
return await Task.FromResult(true);
}
public static string ToUrlSafeBase64String(byte[] data)
{
var s = Convert.ToBase64String(data);
s = s.Split('=')[0]; // Remove any trailing '='s
s = s.Replace('+', '-'); // 62nd char of encoding
s = s.Replace('/', '_'); // 63rd char of encoding
return s;
}
public static string ToUrlSafeBase64String(string val)
{
var bytes = System.Text.UTF8Encoding.UTF8.GetBytes(val);
return ToUrlSafeBase64String(bytes);
}
}
}
```
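The `ToUrlSafeBase64String` helpers at the end of the provider implement the common unpadded URL-safe Base64 variant (RFC 4648, Section 5): encode as standard Base64, drop `=` padding, and swap `+`/`/` for `-`/`_`. A minimal JavaScript sketch of the same transformation, for comparison:

```javascript
'use strict';

// URL-safe Base64, mirroring ToUrlSafeBase64String above: standard Base64,
// then strip '=' padding and swap '+' for '-' and '/' for '_'.
function toUrlSafeBase64(str) {
  return Buffer.from(str, 'utf8')
    .toString('base64')
    .split('=')[0]
    .replace(/\+/g, '-')
    .replace(/\//g, '_');
}

console.log(toUrlSafeBase64('user:key')); // 'dXNlcjprZXk'
```

The provider uses this encoding both for the Basic authorization header and to derive a per-API settings filename prefix from the API host name.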
|
```cpp
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
// Original code copyright 2014 Foxit Software Inc. path_to_url
#ifndef CORE_FPDFAPI_PAGE_CPDF_CONTENTMARKS_H_
#define CORE_FPDFAPI_PAGE_CPDF_CONTENTMARKS_H_
#include <memory>
#include <vector>
#include "core/fpdfapi/page/cpdf_contentmarkitem.h"
#include "core/fxcrt/fx_system.h"
#include "core/fxcrt/retain_ptr.h"
class CPDF_Dictionary;
class CPDF_ContentMarks {
public:
CPDF_ContentMarks();
~CPDF_ContentMarks();
std::unique_ptr<CPDF_ContentMarks> Clone();
int GetMarkedContentID() const;
size_t CountItems() const;
bool ContainsItem(const CPDF_ContentMarkItem* pItem) const;
// The returned pointer is never null.
CPDF_ContentMarkItem* GetItem(size_t index);
const CPDF_ContentMarkItem* GetItem(size_t index) const;
void AddMark(ByteString name);
void AddMarkWithDirectDict(ByteString name, CPDF_Dictionary* pDict);
void AddMarkWithPropertiesHolder(const ByteString& name,
CPDF_Dictionary* pHolder,
const ByteString& property_name);
bool RemoveMark(CPDF_ContentMarkItem* pMarkItem);
void DeleteLastMark();
size_t FindFirstDifference(const CPDF_ContentMarks* other) const;
private:
class MarkData final : public Retainable {
public:
MarkData();
MarkData(const MarkData& src);
~MarkData() override;
size_t CountItems() const;
bool ContainsItem(const CPDF_ContentMarkItem* pItem) const;
CPDF_ContentMarkItem* GetItem(size_t index);
const CPDF_ContentMarkItem* GetItem(size_t index) const;
int GetMarkedContentID() const;
void AddMark(ByteString name);
void AddMarkWithDirectDict(ByteString name, CPDF_Dictionary* pDict);
void AddMarkWithPropertiesHolder(const ByteString& name,
CPDF_Dictionary* pHolder,
const ByteString& property_name);
bool RemoveMark(CPDF_ContentMarkItem* pMarkItem);
void DeleteLastMark();
private:
std::vector<RetainPtr<CPDF_ContentMarkItem>> m_Marks;
};
void EnsureMarkDataExists();
RetainPtr<MarkData> m_pMarkData;
};
#endif // CORE_FPDFAPI_PAGE_CPDF_CONTENTMARKS_H_
```
|
```javascript
'use strict';
module.exports = function (x) {
if (typeof x !== 'number') {
throw new TypeError('Expected a number');
}
return x === 300 ||
x === 301 ||
x === 302 ||
x === 303 ||
x === 305 ||
x === 307 ||
x === 308;
};
```
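A quick usage sketch of the status-code check above (the function is inlined here as a set-membership test rather than required as a module, so the snippet stands alone):

```javascript
'use strict';

// Same status-code set as the module above, expressed as list membership.
const isRedirect = (x) => {
  if (typeof x !== 'number') {
    throw new TypeError('Expected a number');
  }
  return [300, 301, 302, 303, 305, 307, 308].includes(x);
};

console.log(isRedirect(301)); // true
console.log(isRedirect(200)); // false
console.log(isRedirect(304)); // false: 304 Not Modified is not in the set
```

Note that 304 is deliberately excluded: it signals cache revalidation, not a redirect.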
|
```typescript
// Next.js API route support: path_to_url
import type { NextApiRequest, NextApiResponse } from "next";
import { FliptApiClient } from "@flipt-io/flipt";
import { v4 as uuidv4 } from "uuid";
const client = new FliptApiClient({
environment: process.env.FLIPT_ADDR ?? "path_to_url",
});
type Data = {
greeting: string;
};
export default async function handler(
_req: NextApiRequest,
res: NextApiResponse<Data>
) {
let language = "english";
try {
const evaluation = await client.evaluation.variant({
namespaceKey: "default",
flagKey: "language",
entityId: uuidv4(),
context: {},
});
language = evaluation.variantKey;
} catch (err) {
console.log(err);
}
let response: any = {
greeting:
language === "spanish"
? "Hola, from Next.js API route"
: "Hello, from Next.js API route",
};
res.status(200).json(response);
}
```
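The handler's try/catch illustrates a safe-default pattern: if the flag service is unreachable, the route falls back to a hard-coded variant instead of failing the request. A synchronous JavaScript sketch of that pattern (the function name is illustrative, not part of the Flipt API):

```javascript
'use strict';

// Evaluate a feature flag with a safe default: any lookup failure
// falls back to 'english' rather than surfacing an error to the caller.
function resolveLanguage(evaluate) {
  try {
    return evaluate().variantKey;
  } catch (err) {
    return 'english';
  }
}

console.log(resolveLanguage(() => ({ variantKey: 'spanish' }))); // 'spanish'
console.log(resolveLanguage(() => { throw new Error('flipt down'); })); // 'english'
```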
|
```java
package com.polidea.rxandroidble2.internal.util;
import android.os.ParcelUuid;
import android.util.SparseArray;
import androidx.annotation.RestrictTo;
import com.polidea.rxandroidble2.internal.RxBleLog;
import com.polidea.rxandroidble2.internal.logger.LoggerUtil;
import com.polidea.rxandroidble2.internal.scan.ScanRecordImplCompat;
import com.polidea.rxandroidble2.scan.ScanRecord;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.UUID;
import bleshadow.javax.inject.Inject;
public class ScanRecordParser {
// The following data type values are assigned by Bluetooth SIG.
// For more details refer to Bluetooth 4.1 specification, Volume 3, Part C, Section 18.
private static final int DATA_TYPE_FLAGS = 0x01;
private static final int DATA_TYPE_SERVICE_UUIDS_16_BIT_PARTIAL = 0x02;
private static final int DATA_TYPE_SERVICE_UUIDS_16_BIT_COMPLETE = 0x03;
private static final int DATA_TYPE_SERVICE_UUIDS_32_BIT_PARTIAL = 0x04;
private static final int DATA_TYPE_SERVICE_UUIDS_32_BIT_COMPLETE = 0x05;
private static final int DATA_TYPE_SERVICE_UUIDS_128_BIT_PARTIAL = 0x06;
private static final int DATA_TYPE_SERVICE_UUIDS_128_BIT_COMPLETE = 0x07;
private static final int DATA_TYPE_LOCAL_NAME_SHORT = 0x08;
private static final int DATA_TYPE_LOCAL_NAME_COMPLETE = 0x09;
private static final int DATA_TYPE_TX_POWER_LEVEL = 0x0A;
private static final int DATA_TYPE_SERVICE_DATA_16_BIT = 0x16;
private static final int DATA_TYPE_SERVICE_DATA_32_BIT = 0x20;
private static final int DATA_TYPE_SERVICE_DATA_128_BIT = 0x21;
private static final int DATA_TYPE_SERVICE_SOLICITATION_UUIDS_16_BIT = 0x14;
private static final int DATA_TYPE_SERVICE_SOLICITATION_UUIDS_32_BIT = 0x1F;
private static final int DATA_TYPE_SERVICE_SOLICITATION_UUIDS_128_BIT = 0x15;
private static final int DATA_TYPE_MANUFACTURER_SPECIFIC_DATA = 0xFF;
public static final UUID BASE_UUID =
UUID.fromString("00000000-0000-1000-8000-00805F9B34FB");
/** Length of bytes for 16 bit UUID */
public static final int UUID_BYTES_16_BIT = 2;
/** Length of bytes for 32 bit UUID */
public static final int UUID_BYTES_32_BIT = 4;
/** Length of bytes for 128 bit UUID */
public static final int UUID_BYTES_128_BIT = 16;
@Inject
public ScanRecordParser() {
}
public List<UUID> extractUUIDs(byte[] scanResult) {
ScanRecord record = parseFromBytes(scanResult);
List<ParcelUuid> uuids = record == null ? null : record.getServiceUuids();
if (uuids == null) {
return new ArrayList<>();
}
List<UUID> uuids2 = new ArrayList<>();
for (ParcelUuid uuid : uuids) {
uuids2.add(uuid.getUuid());
}
return uuids2;
}
/**
* Method is copied from post lollipop {@link android.bluetooth.le.ScanRecord}
*/
@RestrictTo(RestrictTo.Scope.LIBRARY_GROUP)
public ScanRecord parseFromBytes(byte[] scanRecord) {
if (scanRecord == null) {
return null;
}
int currentPos = 0;
int advertiseFlag = -1;
List<ParcelUuid> serviceUuids = new ArrayList<>();
List<ParcelUuid> serviceSolicitationUuids = new ArrayList<>();
String localName = null;
int txPowerLevel = Integer.MIN_VALUE;
SparseArray<byte[]> manufacturerData = new SparseArray<>();
Map<ParcelUuid, byte[]> serviceData = new HashMap<>();
try {
while (currentPos < scanRecord.length) {
// length is unsigned int.
int length = scanRecord[currentPos++] & 0xFF;
if (length == 0) {
break;
}
// Note the length includes the length of the field type itself.
int dataLength = length - 1;
// fieldType is unsigned int.
int fieldType = scanRecord[currentPos++] & 0xFF;
switch (fieldType) {
case DATA_TYPE_FLAGS:
advertiseFlag = scanRecord[currentPos] & 0xFF;
break;
case DATA_TYPE_SERVICE_UUIDS_16_BIT_PARTIAL:
case DATA_TYPE_SERVICE_UUIDS_16_BIT_COMPLETE:
parseServiceUuid(scanRecord, currentPos,
dataLength, UUID_BYTES_16_BIT, serviceUuids);
break;
case DATA_TYPE_SERVICE_UUIDS_32_BIT_PARTIAL:
case DATA_TYPE_SERVICE_UUIDS_32_BIT_COMPLETE:
parseServiceUuid(scanRecord, currentPos, dataLength,
UUID_BYTES_32_BIT, serviceUuids);
break;
case DATA_TYPE_SERVICE_UUIDS_128_BIT_PARTIAL:
case DATA_TYPE_SERVICE_UUIDS_128_BIT_COMPLETE:
parseServiceUuid(scanRecord, currentPos, dataLength,
UUID_BYTES_128_BIT, serviceUuids);
break;
case DATA_TYPE_SERVICE_SOLICITATION_UUIDS_16_BIT:
parseServiceSolicitationUuid(scanRecord, currentPos, dataLength,
UUID_BYTES_16_BIT, serviceSolicitationUuids);
break;
case DATA_TYPE_SERVICE_SOLICITATION_UUIDS_32_BIT:
parseServiceSolicitationUuid(scanRecord, currentPos, dataLength,
UUID_BYTES_32_BIT, serviceSolicitationUuids);
break;
case DATA_TYPE_SERVICE_SOLICITATION_UUIDS_128_BIT:
parseServiceSolicitationUuid(scanRecord, currentPos, dataLength,
UUID_BYTES_128_BIT, serviceSolicitationUuids);
break;
case DATA_TYPE_LOCAL_NAME_SHORT:
case DATA_TYPE_LOCAL_NAME_COMPLETE:
localName = new String(
extractBytes(scanRecord, currentPos, dataLength));
break;
case DATA_TYPE_TX_POWER_LEVEL:
txPowerLevel = scanRecord[currentPos];
break;
case DATA_TYPE_SERVICE_DATA_16_BIT:
case DATA_TYPE_SERVICE_DATA_32_BIT:
case DATA_TYPE_SERVICE_DATA_128_BIT:
int serviceUuidLength = UUID_BYTES_16_BIT;
if (fieldType == DATA_TYPE_SERVICE_DATA_32_BIT) {
serviceUuidLength = UUID_BYTES_32_BIT;
} else if (fieldType == DATA_TYPE_SERVICE_DATA_128_BIT) {
serviceUuidLength = UUID_BYTES_128_BIT;
}
byte[] serviceDataUuidBytes = extractBytes(scanRecord, currentPos,
serviceUuidLength);
ParcelUuid serviceDataUuid = parseUuidFrom(serviceDataUuidBytes);
byte[] serviceDataArray = extractBytes(scanRecord,
currentPos + serviceUuidLength, dataLength - serviceUuidLength);
serviceData.put(serviceDataUuid, serviceDataArray);
break;
case DATA_TYPE_MANUFACTURER_SPECIFIC_DATA:
// The first two bytes of the manufacturer specific data are
// manufacturer ids in little endian.
int manufacturerId = ((scanRecord[currentPos + 1] & 0xFF) << 8)
+ (scanRecord[currentPos] & 0xFF);
byte[] manufacturerDataBytes = extractBytes(scanRecord, currentPos + 2,
dataLength - 2);
manufacturerData.put(manufacturerId, manufacturerDataBytes);
break;
default:
// Just ignore, we don't handle such data type.
break;
}
currentPos += dataLength;
}
if (serviceUuids.isEmpty()) {
serviceUuids = null;
}
return new ScanRecordImplCompat(serviceUuids, serviceSolicitationUuids, manufacturerData, serviceData,
advertiseFlag, txPowerLevel, localName, scanRecord);
} catch (Exception e) {
RxBleLog.e(e, "Unable to parse scan record: %s", LoggerUtil.bytesToHex(scanRecord));
// As the record is invalid, ignore all the parsed results for this packet
// and return an empty record with raw scanRecord bytes in results
return new ScanRecordImplCompat(null, null, null, null, -1, Integer.MIN_VALUE, null, scanRecord);
}
}
private static ParcelUuid parseUuidFrom(byte[] uuidBytes) {
if (uuidBytes == null) {
throw new IllegalArgumentException("uuidBytes cannot be null");
}
int length = uuidBytes.length;
if (length != UUID_BYTES_16_BIT && length != UUID_BYTES_32_BIT
&& length != UUID_BYTES_128_BIT) {
throw new IllegalArgumentException("uuidBytes length invalid - " + length);
}
// Construct a 128 bit UUID.
if (length == UUID_BYTES_128_BIT) {
ByteBuffer buf = ByteBuffer.wrap(uuidBytes).order(ByteOrder.LITTLE_ENDIAN);
long msb = buf.getLong(8);
long lsb = buf.getLong(0);
return new ParcelUuid(new UUID(msb, lsb));
}
// For 16 bit and 32 bit UUID we need to convert them to 128 bit value.
// 128_bit_value = uuid * 2^96 + BASE_UUID
long shortUuid;
if (length == UUID_BYTES_16_BIT) {
shortUuid = uuidBytes[0] & 0xFF;
shortUuid += (uuidBytes[1] & 0xFF) << 8;
} else {
shortUuid = uuidBytes[0] & 0xFF;
shortUuid += (uuidBytes[1] & 0xFF) << 8;
shortUuid += (uuidBytes[2] & 0xFF) << 16;
shortUuid += (uuidBytes[3] & 0xFF) << 24;
}
long msb = BASE_UUID.getMostSignificantBits() + (shortUuid << 32);
long lsb = BASE_UUID.getLeastSignificantBits();
return new ParcelUuid(new UUID(msb, lsb));
}
// Parse service UUIDs.
private int parseServiceUuid(byte[] scanRecord, int currentPos, int dataLength,
int uuidLength, List<ParcelUuid> serviceUuids) {
while (dataLength > 0) {
byte[] uuidBytes = extractBytes(scanRecord, currentPos,
uuidLength);
serviceUuids.add(parseUuidFrom(uuidBytes));
dataLength -= uuidLength;
currentPos += uuidLength;
}
return currentPos;
}
/**
* Parse service Solicitation UUIDs.
*/
private int parseServiceSolicitationUuid(byte[] scanRecord, int currentPos,
int dataLength, int uuidLength, List<ParcelUuid> serviceSolicitationUuids) {
while (dataLength > 0) {
byte[] uuidBytes = extractBytes(scanRecord, currentPos, uuidLength);
serviceSolicitationUuids.add(parseUuidFrom(uuidBytes));
dataLength -= uuidLength;
currentPos += uuidLength;
}
return currentPos;
}
// Helper method to extract bytes from byte array.
private byte[] extractBytes(byte[] scanRecord, int start, int length) {
byte[] bytes = new byte[length];
System.arraycopy(scanRecord, start, bytes, 0, length);
return bytes;
}
}
```
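The `parseUuidFrom` logic above expands 16- and 32-bit Bluetooth UUIDs to the full 128-bit form via `uuid * 2^96 + BASE_UUID`. At the hex-string level that amounts to replacing the first 32 bits of the Bluetooth Base UUID, as this JavaScript sketch shows:

```javascript
'use strict';

// Bluetooth Base UUID; a short UUID replaces its first 32 bits.
const BASE_UUID = '00000000-0000-1000-8000-00805f9b34fb';

function expandBleUuid(shortUuid) {
  const hex = shortUuid.toString(16).padStart(8, '0');
  return hex + BASE_UUID.slice(8);
}

console.log(expandBleUuid(0x180f)); // '0000180f-0000-1000-8000-00805f9b34fb'
console.log(expandBleUuid(0x1800)); // '00001800-0000-1000-8000-00805f9b34fb'
```

0x180F is the assigned number of the Battery Service; the expanded string is the UUID you would see from a full scan record.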
|
```java
/*
* DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
*
 * This code is free software; you can redistribute it and/or modify it
 * under the terms of the GNU General Public License version 2 only, as
 * published by the Free Software Foundation. Oracle designates this
 * particular file as subject to the "Classpath" exception as provided
 * by Oracle in the LICENSE file that accompanied this code.
 *
 * This code is distributed in the hope that it will be useful, but WITHOUT
 * ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
 * FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License
 * version 2 for more details (a copy is included in the LICENSE file that
 * accompanied this code).
 *
 * You should have received a copy of the GNU General Public License version
 * 2 along with this work; if not, write to the Free Software Foundation,
 * Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA.
*
* Please contact Oracle, 500 Oracle Parkway, Redwood Shores, CA 94065 USA
* or visit www.oracle.com if you need additional information or have any
* questions.
*/
package jdk.graal.compiler.hotspot;
import java.util.Comparator;
import java.util.HashMap;
import java.util.Map;
import java.util.Map.Entry;
import java.util.SortedSet;
import java.util.TreeSet;
import jdk.graal.compiler.core.common.util.MethodKey;
import jdk.graal.compiler.debug.TTY;
import jdk.graal.compiler.options.Option;
import jdk.graal.compiler.options.OptionKey;
import jdk.graal.compiler.options.OptionType;
import jdk.graal.compiler.options.OptionValues;
import jdk.vm.ci.code.CompilationRequest;
import jdk.vm.ci.hotspot.HotSpotJVMCIRuntime;
import jdk.vm.ci.meta.ResolvedJavaMethod;
class CompilationCounters {
public static class Options {
// @formatter:off
@Option(help = "The number of compilations allowed for any method before " +
"the VM exits (a value of 0 means there is no limit).", type = OptionType.Debug)
public static final OptionKey<Integer> CompilationCountLimit = new OptionKey<>(0);
// @formatter:on
}
private final OptionValues options;
CompilationCounters(OptionValues options) {
TTY.println("Warning: Compilation counters enabled, excessive recompilation of a method will cause a failure!");
this.options = options;
}
private final Map<MethodKey, Integer> counters = new HashMap<>();
/**
* Counts the number of compilations for the {@link ResolvedJavaMethod} of the
* {@link CompilationRequest}. If the number of compilations exceeds
* {@link Options#CompilationCountLimit} this method prints an error message and exits the VM.
*
* @param method the method about to be compiled
*/
synchronized void countCompilation(ResolvedJavaMethod method) {
MethodKey key = new MethodKey(method);
Integer val = counters.get(key);
val = val != null ? val + 1 : 1;
counters.put(key, val);
if (val > Options.CompilationCountLimit.getValue(options)) {
TTY.printf("Error. Method %s was compiled too many times. Number of compilations: %d\n", method.format("%H.%n(%p)"),
val);
TTY.println("==================================== High compilation counters ====================================");
SortedSet<Map.Entry<MethodKey, Integer>> sortedCounters = new TreeSet<>(new CounterComparator());
for (Map.Entry<MethodKey, Integer> e : counters.entrySet()) {
sortedCounters.add(e);
}
for (Map.Entry<MethodKey, Integer> entry : sortedCounters) {
if (entry.getValue() >= Options.CompilationCountLimit.getValue(options) / 2) {
TTY.out.printf("%d\t%s%n", entry.getValue(), entry.getKey());
}
}
TTY.flush();
HotSpotGraalServices.exit(-1, HotSpotJVMCIRuntime.runtime());
}
}
static final class CounterComparator implements Comparator<Map.Entry<MethodKey, Integer>> {
@Override
public int compare(Entry<MethodKey, Integer> o1, Entry<MethodKey, Integer> o2) {
if (o1.getValue() < o2.getValue()) {
return -1;
}
if (o1.getValue() > o2.getValue()) {
return 1;
}
return String.valueOf(o1.getKey()).compareTo(String.valueOf(o2.getKey()));
}
}
}
```
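The counter pattern in `countCompilation` — bump a per-key count, then compare against a configurable limit — can be sketched independently of the Graal types:

```javascript
'use strict';

// Per-key counter with a limit check, mirroring countCompilation above.
// A limit of 0 means "no limit", matching the CompilationCountLimit option.
function makeCounter(limit) {
  const counts = new Map();
  return (key) => {
    const n = (counts.get(key) || 0) + 1;
    counts.set(key, n);
    return { count: n, overLimit: limit > 0 && n > limit };
  };
}

const count = makeCounter(2);
count('HashMap.get');
count('HashMap.get');
console.log(count('HashMap.get')); // { count: 3, overLimit: true }
console.log(count('String.length')); // { count: 1, overLimit: false }
```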
|
The Japanese Metal Industrial Workers' Union (Zenkindomei) was a trade union representing metal engineering workers in Japan.
The union was founded in 1951 and affiliated with the Japanese Federation of Labour. It later joined the Japanese Confederation of Labour (Domei), and by 1967 it was Domei's second-largest affiliate, with 220,044 members. In 1987, it moved to Domei's successor, the Japanese Trade Union Confederation. On 9 September 1999, it merged with the National Metal and Machinery Workers' Union to form JAM.
References
Engineering trade unions
Trade unions established in 1951
Trade unions disestablished in 1999
Trade unions in Japan
|
Dan C. Snyder aka Daniel Cotton Snyder is an American sculptor and ceramicist.
Background and education
Snyder was born on July 23, 1948, in Philadelphia, Pennsylvania. He earned a bachelor's degree from Pennsylvania State University, studying art and anthropology. His ceramic sculpture work at Penn State led him to Robert Arneson and to a Master of Fine Arts degree from the University of California, Davis, in 1972, as well as to associations with other faculty members William T. Wiley, Wayne Thiebaud, Roy De Forest, and Manuel Neri, and fellow students John E. Buck, Jock Reynolds, Bruce Gutton, Richard T. Notkin, Deborah Butterfield, and John Roloff.
Awards
While working towards his MFA at Davis, Snyder received a University of California, Davis Patent Grant in 1972 for research into lightweight clay bodies. He received a National Endowment for the Arts projects grant in 1973, which enabled him to obtain special permission to visit and study the Lascaux Cave in France, which is closed to the public, and the prehistoric rock drawings in Val Camonica, Italy. Having received the Rome Prize in 1973, Snyder became a fellow of the American Academy in Rome in sculpture in 1975, working alongside other Academy members Laurie Olin, Mark Balet, Frank Holmes, and Franklin D. Israel. In 2007 Snyder received the AIA Sierra Valley Honor Award for Excellence in Design for My Time, an interactive sundial artwork and streetscape design created in collaboration with the children of the Van Buskirk Community Center, Stockton, California, in association with LDA Partners.
Works
Snyder is known for his life-size, classically inspired ceramic figures, such as The Restoration of Hope 2, located in the Mondavi Art Garden of the Manetti Shrem Museum of Art, University of California, Davis, and for his painted cut-out sculptures, which he uses in public artworks such as Welcome North, Welcome South, Welcome East, Welcome West, commissioned by the San Francisco International Airport and the San Francisco Arts Commission in 1983.
Of his ceramic figures, Maria Porges wrote in American Ceramics, December 1990: “Snyder’s method of assembling the figure out of a shell of fragments often creates an astonishing lightness of being at the same time the whorl of vigorous modeled fragments out of which the angel emerges suggest clay’s original state. The spirit lives forever, Snyder implies, even though the flesh does not.” and, “By both drawing on the past and placing his work firmly in the context of the present, Snyder has brought his pieces into the dialogue about mortality and the body, which presently occupies the art world.”
About the painted cut-out figures featured in many of his public works and gallery exhibitions, Andre Michell Workman wrote in Artweek, April 11, 1981: “What is most compelling about Snyder’s new painted sculptures is the aura of happiness and optimism - the joie de vivre - which they exude.”
Snyder was represented by the Wenger Gallery (San Francisco, San Diego, Mexico City, and Los Angeles) from 1972 to 1989; by the Allrich Gallery in San Francisco from 1976, where he held numerous ceramic and mixed-media solo exhibitions until its closure in 1993; and by Sloan Miyasato Fine Arts, San Francisco.
Snyder has completed over twenty mixed-media public works, including Promenade, Stockton, CA; Chicken Corn Soup & Homemade Ice Cream, Rockville, MD; and The Secret Ingredient, Coca-Cola Company, Atlanta. He has also completed several public commissions that were community-based collaborations with children, including My Time, Stockton, CA; Fantasy Island, Sunnyvale, CA; Hey Buddy - Seen Any Moose?, Fairbanks, Alaska; Look What I Found, Tacoma, WA; and A Few of My Favorite Things, Albany, California.
Snyder’s work is represented in the public and corporate collections of the SFO Museum, Oakland Museum of California, Lannan Foundation, Los Angeles, Monterey Museum of Art, University of California, Davis, The American Academy in Rome, Crocker Art Museum, TRW inc, Bank of America Center San Francisco, Pennsylvania State University, Hyatt, Kaiser Permanente, Oliver Carr Co., Washington DC, and Milpitas, California.
Snyder taught at Pennsylvania State University (1975–1976), the University of California, Santa Cruz (1987–1993), and San Francisco State University (1994).
Film
Snyder's ceramic figurative sculptures were used in the film Basic Instinct (1992).
See also
List of fellows of the American Academy in Rome (1971–1990)
Further reading
Opitz, Glenn B. (1984). Dictionary of American Sculptors: 18th Century to Present. Newbury Books. ISBN 9780938290032
Jacques Cattell Press (1986). Who's Who in American Art. Jacques Cattell Press. ISBN 9780835218
References
External links
https://www.dansnyderartist.com
1948 births
Living people
20th-century American sculptors
American ceramists
Public art
American Academy in Rome
University of California, Davis alumni
Pennsylvania State University alumni
21st-century American sculptors
Artists from Philadelphia
Artists from California
American glass artists
|
```php
<?php
/*
 * Licensed under the Apache License, Version 2.0 (the "License"); you may not
 * use this file except in compliance with the License. You may obtain a copy of
 * the License at
 *
 * path_to_url
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
 * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
 * License for the specific language governing permissions and limitations under
 * the License.
 */
namespace Google\Service\DataLabeling;
class GoogleCloudDatalabelingV1beta1SearchExampleComparisonsRequest extends \Google\Model
{
/**
* @var int
*/
public $pageSize;
/**
* @var string
*/
public $pageToken;
/**
* @param int
*/
public function setPageSize($pageSize)
{
$this->pageSize = $pageSize;
}
/**
* @return int
*/
public function getPageSize()
{
return $this->pageSize;
}
/**
* @param string
*/
public function setPageToken($pageToken)
{
$this->pageToken = $pageToken;
}
/**
* @return string
*/
public function getPageToken()
{
return $this->pageToken;
}
}
// Adding a class alias for backwards compatibility with the previous class name.
class_alias(GoogleCloudDatalabelingV1beta1SearchExampleComparisonsRequest::class, 'your_sha256_hashExampleComparisonsRequest');
```
|
Leopoldo Tahier (born 17 June 1911, date of death unknown) was an Argentine freestyle swimmer. He competed in two events at the 1932 Summer Olympics.
References
External links
1911 births
Year of death missing
Argentine male freestyle swimmers
Olympic swimmers for Argentina
Swimmers at the 1932 Summer Olympics
Swimmers from Buenos Aires
|
```typescript
//
//
// Permission is hereby granted, free of charge, to any person obtaining a copy
// of this software and associated documentation files (the "Software"), to deal
// in the Software without restriction, including without limitation the rights
// to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
// copies of the Software, and to permit persons to whom the Software is
// furnished to do so, subject to the following conditions:
//
// The above copyright notice and this permission notice shall be included in all
// copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
// AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
// OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
// SOFTWARE.
/**
* Cloud functions to handle events from Google Cloud Storage.
* @packageDocumentation
*/
import { firebaseConfig } from "../../common/config";
import { copyIfPresent } from "../../common/encoding";
import { ResetValue } from "../../common/options";
import { initV2Endpoint, ManifestEndpoint } from "../../runtime/manifest";
import { CloudEvent, CloudFunction } from "../core";
import { wrapTraceContext } from "../trace";
import { Expression } from "../../params";
import * as options from "../options";
import { SecretParam } from "../../params/types";
import { withInit } from "../../common/onInit";
/**
* An object within Google Cloud Storage.
* Ref: path_to_url
*/
export interface StorageObjectData {
/**
* The name of the bucket containing this object.
*/
bucket: string;
/**
* Cache-Control directive for the object data, matching
* [path_to_url#section-5.2][RFC 7234 5.2].
*/
cacheControl?: string;
/**
* Number of underlying components that make up this object. Components are
* accumulated by compose operations.
* Attempting to set this field will result in an error.
*/
componentCount?: number;
/**
* Content-Disposition of the object data, matching
* [path_to_url 6266].
*/
contentDisposition?: string;
/**
* Content-Encoding of the object data, matching
* [path_to_url#section-3.1.2.2][RFC 7231 3.1.2.2]
*/
contentEncoding?: string;
/**
* Content-Language of the object data, matching
* [path_to_url#section-3.1.3.2][RFC 7231 3.1.3.2].
*/
contentLanguage?: string;
/**
* Content-Type of the object data, matching
* [path_to_url#section-3.1.1.5][RFC 7231 3.1.1.5].
* If an object is stored without a Content-Type, it is served as
* `application/octet-stream`.
*/
contentType?: string;
/**
* CRC32c checksum. For more information about using the CRC32c
* checksum, see
* [path_to_url#_JSONAPI][Hashes and
* ETags: Best Practices].
*/
crc32c?: string;
/**
* Metadata of customer-supplied encryption key, if the object is encrypted by
* such a key.
*/
customerEncryption?: CustomerEncryption;
/**
* HTTP 1.1 Entity tag for the object. See
* [path_to_url#section-2.3][RFC 7232 2.3].
*/
etag?: string;
/**
* The content generation of this object. Used for object versioning.
* Attempting to set this field will result in an error.
*/
generation: number;
/**
* The ID of the object, including the bucket name, object name, and
* generation number.
*/
id: string;
/**
* The kind of item this is. For objects, this is always "storage#object".
*/
kind?: string;
/**
* MD5 hash of the data; encoded using base64 as per
* [path_to_url#section-4][RFC 4648 4]. For more
* information about using the MD5 hash, see
* [path_to_url#_JSONAPI][Hashes and
* ETags: Best Practices].
*/
md5Hash?: string;
/**
* Media download link.
*/
mediaLink?: string;
/**
* User-provided metadata, in key/value pairs.
*/
metadata?: { [key: string]: string };
/**
* The version of the metadata for this object at this generation. Used for
* preconditions and for detecting changes in metadata. A metageneration
* number is only meaningful in the context of a particular generation of a
* particular object.
*/
metageneration: number;
/**
* The name of the object.
*/
name: string;
/**
* The link to this object.
*/
selfLink?: string;
/**
* Content-Length of the object data in bytes, matching
* [path_to_url#section-3.3.2][RFC 7230 3.3.2].
*/
size: number;
/**
* Storage class of the object.
*/
storageClass: string;
/**
* The creation time of the object.
* Attempting to set this field will result in an error.
*/
timeCreated?: Date | string;
/**
* The deletion time of the object. Will be returned if and only if this
* version of the object has been deleted.
*/
timeDeleted?: Date | string;
/**
* The time at which the object's storage class was last changed.
*/
timeStorageClassUpdated?: Date | string;
/**
* The modification time of the object metadata.
*/
updated?: Date | string;
}
/**
* Metadata of customer-supplied encryption key, if the object is encrypted by
* such a key.
*/
export interface CustomerEncryption {
/**
* The encryption algorithm.
*/
encryptionAlgorithm?: string;
/**
* SHA256 hash value of the encryption key.
*/
keySha256?: string;
}
/** A CloudEvent that contains StorageObjectData */
export interface StorageEvent extends CloudEvent<StorageObjectData> {
/** The name of the bucket containing this object. */
bucket: string;
}
/** @internal */
export const archivedEvent = "google.cloud.storage.object.v1.archived";
/** @internal */
export const finalizedEvent = "google.cloud.storage.object.v1.finalized";
/** @internal */
export const deletedEvent = "google.cloud.storage.object.v1.deleted";
/** @internal */
export const metadataUpdatedEvent = "google.cloud.storage.object.v1.metadataUpdated";
/** StorageOptions extend EventHandlerOptions with a bucket name */
export interface StorageOptions extends options.EventHandlerOptions {
/** The name of the bucket containing this object. */
bucket?: string | Expression<string>;
/**
* If true, do not deploy or emulate this function.
*/
omit?: boolean | Expression<boolean>;
/**
* Region where functions should be deployed.
*/
region?: options.SupportedRegion | string | Expression<string> | ResetValue;
/**
* Amount of memory to allocate to a function.
*/
memory?: options.MemoryOption | Expression<number> | ResetValue;
/**
* Timeout for the function in seconds, possible values are 0 to 540.
* HTTPS functions can specify a higher timeout.
*
* @remarks
* The minimum timeout for a gen 2 function is 1s. The maximum timeout for a
* function depends on the type of function: Event handling functions have a
* maximum timeout of 540s (9 minutes). HTTPS and callable functions have a
* maximum timeout of 3,600s (1 hour). Task queue functions have a maximum
* timeout of 1,800s (30 minutes)
*/
timeoutSeconds?: number | Expression<number> | ResetValue;
/**
* Min number of actual instances to be running at a given time.
*
* @remarks
* Instances will be billed for memory allocation and 10% of CPU allocation
* while idle.
*/
minInstances?: number | Expression<number> | ResetValue;
/**
* Max number of instances to be running in parallel.
*/
maxInstances?: number | Expression<number> | ResetValue;
/**
* Number of requests a function can serve at once.
*
* @remarks
* Can only be applied to functions running on Cloud Functions v2.
* A value of null restores the default concurrency (80 when CPU >= 1, 1 otherwise).
* Concurrency cannot be set to any value other than 1 if `cpu` is less than 1.
* The maximum value for concurrency is 1,000.
*/
concurrency?: number | Expression<number> | ResetValue;
/**
* Fractional number of CPUs to allocate to a function.
*
* @remarks
* Defaults to 1 for functions with <= 2GB RAM and increases for larger memory sizes.
* This is different from the defaults when using the gcloud utility and is different from
* the fixed amount assigned in Google Cloud Functions generation 1.
* To revert to the CPU amounts used in gcloud or in Cloud Functions generation 1, set this
* to the value "gcf_gen1"
*/
cpu?: number | "gcf_gen1";
/**
* Connect cloud function to specified VPC connector.
*/
vpcConnector?: string | Expression<string> | ResetValue;
/**
* Egress settings for VPC connector.
*/
vpcConnectorEgressSettings?: options.VpcEgressSetting | ResetValue;
/**
* Specific service account for the function to run as.
*/
serviceAccount?: string | Expression<string> | ResetValue;
/**
* Ingress settings which control where this function can be called from.
*/
ingressSettings?: options.IngressSetting | ResetValue;
/**
* User labels to set on the function.
*/
labels?: Record<string, string>;
/*
* Secrets to bind to a function.
*/
secrets?: (string | SecretParam)[];
/** Whether failed executions should be delivered again. */
retry?: boolean | Expression<boolean> | ResetValue;
}
/**
* Event handler sent only when a bucket has enabled object versioning.
* This event indicates that the live version of an object has become an
* archived version, either because it was archived or because it was
* overwritten by the upload of an object of the same name.
*
* @param handler - Event handler which is run every time a Google Cloud Storage archival occurs.
*/
export function onObjectArchived(
handler: (event: StorageEvent) => any | Promise<any>
): CloudFunction<StorageEvent>;
/**
* Event handler sent only when a bucket has enabled object versioning.
* This event indicates that the live version of an object has become an
* archived version, either because it was archived or because it was
* overwritten by the upload of an object of the same name.
*
* @param bucket - The name of the bucket containing this object.
* @param handler - Event handler which is run every time a Google Cloud Storage archival occurs.
*/
export function onObjectArchived(
bucket: string | Expression<string>,
handler: (event: StorageEvent) => any | Promise<any>
): CloudFunction<StorageEvent>;
/**
* Event handler sent only when a bucket has enabled object versioning.
* This event indicates that the live version of an object has become an
* archived version, either because it was archived or because it was
* overwritten by the upload of an object of the same name.
*
* @param opts - Options that can be set on an individual event-handling function.
* @param handler - Event handler which is run every time a Google Cloud Storage archival occurs.
*/
export function onObjectArchived(
opts: StorageOptions,
handler: (event: StorageEvent) => any | Promise<any>
): CloudFunction<StorageEvent>;
/**
* Event handler sent only when a bucket has enabled object versioning.
* This event indicates that the live version of an object has become an
* archived version, either because it was archived or because it was
* overwritten by the upload of an object of the same name.
*
* @param bucketOrOptsOrHandler - Options or string that may (or may not) define the bucket to be used.
* @param handler - Event handler which is run every time a Google Cloud Storage archival occurs.
*/
export function onObjectArchived(
bucketOrOptsOrHandler:
| string
| Expression<string>
| StorageOptions
| ((event: StorageEvent) => any | Promise<any>),
handler?: (event: StorageEvent) => any | Promise<any>
): CloudFunction<StorageEvent> {
return onOperation(archivedEvent, bucketOrOptsOrHandler, handler);
}
/**
* Event handler which fires every time a Google Cloud Storage object
* creation occurs.
*
* Sent when a new object (or a new generation of an existing object)
* is successfully created in the bucket. This includes copying or rewriting
* an existing object. A failed upload does not trigger this event.
*
* @param handler - Event handler which is run every time a Google Cloud Storage object creation occurs.
*/
export function onObjectFinalized(
handler: (event: StorageEvent) => any | Promise<any>
): CloudFunction<StorageEvent>;
/**
* Event handler which fires every time a Google Cloud Storage object
* creation occurs.
*
* Sent when a new object (or a new generation of an existing object)
* is successfully created in the bucket. This includes copying or rewriting
* an existing object. A failed upload does not trigger this event.
*
* @param bucket - The name of the bucket containing this object.
* @param handler - Event handler which is run every time a Google Cloud Storage object creation occurs.
*/
export function onObjectFinalized(
bucket: string | Expression<string>,
handler: (event: StorageEvent) => any | Promise<any>
): CloudFunction<StorageEvent>;
/**
* Event handler which fires every time a Google Cloud Storage object
* creation occurs.
*
* Sent when a new object (or a new generation of an existing object)
* is successfully created in the bucket. This includes copying or rewriting
* an existing object. A failed upload does not trigger this event.
*
* @param opts - Options that can be set on an individual event-handling function.
* @param handler - Event handler which is run every time a Google Cloud Storage object creation occurs.
*/
export function onObjectFinalized(
opts: StorageOptions,
handler: (event: StorageEvent) => any | Promise<any>
): CloudFunction<StorageEvent>;
/**
* Event handler which fires every time a Google Cloud Storage object
* creation occurs.
*
* Sent when a new object (or a new generation of an existing object)
* is successfully created in the bucket. This includes copying or rewriting
* an existing object. A failed upload does not trigger this event.
*
* @param bucketOrOptsOrHandler - Options or string that may (or may not) define the bucket to be used.
* @param handler - Event handler which is run every time a Google Cloud Storage object creation occurs.
*/
export function onObjectFinalized(
bucketOrOptsOrHandler:
| string
| Expression<string>
| StorageOptions
| ((event: StorageEvent) => any | Promise<any>),
handler?: (event: StorageEvent) => any | Promise<any>
): CloudFunction<StorageEvent> {
return onOperation(finalizedEvent, bucketOrOptsOrHandler, handler);
}
/**
* Event handler which fires every time a Google Cloud Storage deletion occurs.
*
* Sent when an object has been permanently deleted. This includes objects
* that are overwritten or are deleted as part of the bucket's lifecycle
* configuration. For buckets with object versioning enabled, this is not
* sent when an object is archived, even if archival occurs
* via the `storage.objects.delete` method.
*
* @param handler - Event handler which is run every time a Google Cloud Storage object deletion occurs.
*/
export function onObjectDeleted(
handler: (event: StorageEvent) => any | Promise<any>
): CloudFunction<StorageEvent>;
/**
* Event handler which fires every time a Google Cloud Storage deletion occurs.
*
* Sent when an object has been permanently deleted. This includes objects
* that are overwritten or are deleted as part of the bucket's lifecycle
* configuration. For buckets with object versioning enabled, this is not
* sent when an object is archived, even if archival occurs
* via the `storage.objects.delete` method.
*
* @param bucket - The name of the bucket containing this object.
* @param handler - Event handler which is run every time a Google Cloud Storage object deletion occurs.
*/
export function onObjectDeleted(
bucket: string | Expression<string>,
handler: (event: StorageEvent) => any | Promise<any>
): CloudFunction<StorageEvent>;
/**
* Event handler which fires every time a Google Cloud Storage deletion occurs.
*
* Sent when an object has been permanently deleted. This includes objects
* that are overwritten or are deleted as part of the bucket's lifecycle
* configuration. For buckets with object versioning enabled, this is not
* sent when an object is archived, even if archival occurs
* via the `storage.objects.delete` method.
*
* @param opts - Options that can be set on an individual event-handling function.
* @param handler - Event handler which is run every time a Google Cloud Storage object deletion occurs.
*/
export function onObjectDeleted(
opts: StorageOptions,
handler: (event: StorageEvent) => any | Promise<any>
): CloudFunction<StorageEvent>;
/**
* Event handler which fires every time a Google Cloud Storage deletion occurs.
*
* Sent when an object has been permanently deleted. This includes objects
* that are overwritten or are deleted as part of the bucket's lifecycle
* configuration. For buckets with object versioning enabled, this is not
* sent when an object is archived, even if archival occurs
* via the `storage.objects.delete` method.
*
* @param bucketOrOptsOrHandler - Options or string that may (or may not) define the bucket to be used.
* @param handler - Event handler which is run every time a Google Cloud Storage object deletion occurs.
*/
export function onObjectDeleted(
bucketOrOptsOrHandler:
| string
| Expression<string>
| StorageOptions
| ((event: StorageEvent) => any | Promise<any>),
handler?: (event: StorageEvent) => any | Promise<any>
): CloudFunction<StorageEvent> {
return onOperation(deletedEvent, bucketOrOptsOrHandler, handler);
}
/**
* Event handler which fires every time the metadata of an existing object
* changes.
*
* @param bucketOrOptsOrHandler - Options or string that may (or may not) define the bucket to be used.
* @param handler - Event handler which is run every time a Google Cloud Storage object metadata update occurs.
*/
export function onObjectMetadataUpdated(
handler: (event: StorageEvent) => any | Promise<any>
): CloudFunction<StorageEvent>;
/**
* Event handler which fires every time the metadata of an existing object
* changes.
*
* @param bucket - The name of the bucket containing this object.
* @param handler - Event handler which is run every time a Google Cloud Storage object metadata update occurs.
*/
export function onObjectMetadataUpdated(
bucket: string | Expression<string>,
handler: (event: StorageEvent) => any | Promise<any>
): CloudFunction<StorageEvent>;
/**
* Event handler which fires every time the metadata of an existing object
* changes.
*
* @param opts - Options that can be set on an individual event-handling function.
* @param handler - Event handler which is run every time a Google Cloud Storage object metadata update occurs.
*/
export function onObjectMetadataUpdated(
opts: StorageOptions,
handler: (event: StorageEvent) => any | Promise<any>
): CloudFunction<StorageEvent>;
/**
* Event handler which fires every time the metadata of an existing object
* changes.
*
* @param bucketOrOptsOrHandler - Options or string that may (or may not) define the bucket to be used.
* @param handler - Event handler which is run every time a Google Cloud Storage object metadata update occurs.
*/
export function onObjectMetadataUpdated(
bucketOrOptsOrHandler:
| string
| Expression<string>
| StorageOptions
| ((event: StorageEvent) => any | Promise<any>),
handler?: (event: StorageEvent) => any | Promise<any>
): CloudFunction<StorageEvent> {
return onOperation(metadataUpdatedEvent, bucketOrOptsOrHandler, handler);
}
/** @internal */
export function onOperation(
eventType: string,
bucketOrOptsOrHandler:
| string
| Expression<string>
| StorageOptions
| ((event: StorageEvent) => any | Promise<any>),
handler: (event: StorageEvent) => any | Promise<any>
): CloudFunction<StorageEvent> {
if (typeof bucketOrOptsOrHandler === "function") {
handler = bucketOrOptsOrHandler as (event: StorageEvent) => any | Promise<any>;
bucketOrOptsOrHandler = {};
}
const [opts, bucket] = getOptsAndBucket(bucketOrOptsOrHandler);
const func = (raw: CloudEvent<unknown>) => {
return wrapTraceContext(withInit(handler))(raw as StorageEvent);
};
func.run = handler;
Object.defineProperty(func, "__trigger", {
get: () => {
const baseOpts = options.optionsToTriggerAnnotations(options.getGlobalOptions());
const specificOpts = options.optionsToTriggerAnnotations(opts);
return {
platform: "gcfv2",
...baseOpts,
...specificOpts,
labels: {
...baseOpts?.labels,
...specificOpts?.labels,
},
eventTrigger: {
eventType,
resource: bucket, // TODO(colerogers): replace with 'bucket,' eventually
},
};
},
});
// TypeScript doesn't recognize defineProperty as adding a property and complains
// that __endpoint doesn't exist. We can either cast to any and lose all type safety
// or we can just assign a meaningless value before calling defineProperty.
func.__endpoint = {} as ManifestEndpoint;
// SDK may attempt to read FIREBASE_CONFIG env var to fetch the default bucket name.
// To prevent runtime errors when FIREBASE_CONFIG env var is missing, we use getters.
Object.defineProperty(func, "__endpoint", {
get: () => {
const baseOpts = options.optionsToEndpoint(options.getGlobalOptions());
const specificOpts = options.optionsToEndpoint(opts);
const endpoint: ManifestEndpoint = {
platform: "gcfv2",
...initV2Endpoint(options.getGlobalOptions(), opts),
...baseOpts,
...specificOpts,
labels: {
...baseOpts?.labels,
...specificOpts?.labels,
},
eventTrigger: {
eventType,
eventFilters: { bucket },
retry: false,
},
};
copyIfPresent(endpoint.eventTrigger, opts, "retry", "retry");
return endpoint;
},
});
return func;
}
/** @internal */
export function getOptsAndBucket(
bucketOrOpts: string | Expression<string> | StorageOptions
): [options.EventHandlerOptions, string | Expression<string>] {
let bucket: string | Expression<string>;
let opts: options.EventHandlerOptions;
// If bucket is a string or Expression<string>
if (typeof bucketOrOpts === "string" || "value" in bucketOrOpts) {
bucket = bucketOrOpts;
opts = {};
} else {
bucket = bucketOrOpts.bucket || firebaseConfig()?.storageBucket;
opts = { ...bucketOrOpts };
delete (opts as any).bucket;
}
if (!bucket) {
throw new Error(
"Missing bucket name. If you are unit testing, please provide a bucket name" +
" by providing bucket name directly in the event handler or by setting process.env.FIREBASE_CONFIG."
);
}
if (typeof bucket === "string" && !/^[a-z\d][a-z\d\\._-]{1,230}[a-z\d]$/.test(bucket)) {
throw new Error(`Invalid bucket name ${bucket}`);
}
return [opts, bucket];
}
```
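The bucket-name check at the end of `getOptsAndBucket` can be exercised on its own. A minimal sketch that lifts the regex out of the function (the helper name is illustrative, not part of the library):

```typescript
// The same pattern used in getOptsAndBucket above: 3-232 characters drawn
// from lowercase letters, digits, dots, underscores, and hyphens, beginning
// and ending with a letter or digit. (The doubled backslash, copied from the
// source, also admits a literal backslash inside the name.)
const BUCKET_NAME = /^[a-z\d][a-z\d\\._-]{1,230}[a-z\d]$/;

// Illustrative helper, not part of the firebase-functions API.
function isValidBucketName(bucket: string): boolean {
  return BUCKET_NAME.test(bucket);
}

isValidBucketName("my-project.appspot.com"); // a typical default bucket name passes
isValidBucketName("AB"); // rejected: uppercase and too short
```

Names failing this check cause `getOptsAndBucket` to throw before the function is ever deployed, which surfaces configuration typos at build time rather than at event-delivery time.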
|
Paradise is a village in the Nickerie District of Suriname, located about two kilometers from the town of Nieuw Nickerie. It has a population of about 966.
The village is named after the sugar and cotton plantation Paradise, which was founded in 1797. Paradise and Plaisance were the first two plantation concessions in Nickerie. On 31 December 1913, the plantation was disbanded and divided into 335 plots for small-scale agriculture. The plantation was also home to several dozen freed slaves. A railway station was planned at Paradise, but the plan was never realized.
See also
Nieuw Nickerie
References
External links
Populated places in Nickerie District
|
```csharp
/****************************************************************************
*
* path_to_url
* path_to_url
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in
* all copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
* THE SOFTWARE.
****************************************************************************/
using System;
namespace QFramework
{
public interface IMsg
{
/// <summary>
/// EventID
/// </summary>
int EventID { get; set; }
/// <summary>
/// Processed or not
/// </summary>
bool Processed { get; set; }
/// <summary>
/// reusable or not
/// </summary>
bool ReuseAble { get; set; }
int ManagerID { get; }
void Recycle2Cache();
}
/// <summary>
/// msgbody
/// </summary>
public class QMsg : IMsg, IPoolable, IPoolType
{
/// <summary>
/// EventID
/// TODO: rename to Id
/// </summary>
public virtual int EventID { get; set; }
/// <summary>
/// Processed or not
/// </summary>
public bool Processed { get; set; }
/// <summary>
/// reusable or not
/// </summary>
public bool ReuseAble { get; set; }
public int ManagerID
{
get { return EventID / QMsgSpan.Count * QMsgSpan.Count; }
}
public QMsg()
{
}
#region Object Pool
public static QMsg Allocate<T>(T eventId) where T : IConvertible
{
QMsg msg = SafeObjectPool<QMsg>.Instance.Allocate();
msg.EventID = eventId.ToInt32(null);
msg.ReuseAble = true;
return msg;
}
public virtual void Recycle2Cache()
{
SafeObjectPool<QMsg>.Instance.Recycle(this);
}
void IPoolable.OnRecycled()
{
Processed = false;
}
bool IPoolable.IsRecycled { get; set; }
#endregion
#region deprecated since v0.0.5
//[Obsolete("deprecated,use allocate instead")]
public QMsg(int eventID)
{
EventID = eventID;
}
#endregion
}
}
```
|
The Money Maze is an American television game show seen on ABC from December 23, 1974, to June 27, 1975. The show was hosted by Nick Clooney and was announced by Alan Kalter. It was produced by Daphne-Don Lipp Productions, of which Dick Cavett was a principal.
The object of the game was to negotiate a large maze built on the studio floor that housed several towers. A contestant would direct his or her spouse from a perch above the maze; the spouse would need to find his or her way to a specified tower inside the maze and press a button there in order to win prizes.
Clooney hosted Money Maze concurrently with his local daily talk show, The Nick Clooney Show, on then-ABC affiliate WKRC-TV in Cincinnati (now a CBS station). In fact, WKRC scheduled Money Maze on a delay at 10:30 a.m., immediately before Nick Clooney at 11:00, to provide a 90-minute block for the popular local personality.
Gameplay
Two married couples played against each other for the right to enter the maze. Three regular rounds were played. Each round had a particular topic, with eight related clues. Two clues would be shown on a screen; one couple would select a clue for the other to attempt to answer. A correct answer scored a point, and that couple would then select from two clues (a new clue plus the unused one from the last pair) for the opposing couple. An incorrect answer gave the opponents a chance to answer instead. If they did so correctly, they won the round and had a chance to answer as many of the remaining clues as possible; otherwise, play would continue in the round. If the two couples each answered four clues in the round, a tiebreaker would be played where two additional clues were shown. The first couple to activate a buzzer would select a clue to answer for one point, then try to answer the other for two points. If they were wrong on either, the other couple got a free attempt.
The winning couple in each round would then send one member into the maze, with the other directing from above. The "runner" would have 15 seconds to find a phone-booth-size "tower" with push-buttons on each side. Pressing the lit button before time expired won the prize and three points. Later in the show's run, couples were given the option of trying to also reach a second "money tower" within a total of 25 seconds for a $500 bonus and three additional points; if they accepted the risk but failed to reach both towers, the prize and the cash bonus were both lost.
Catch-Up Round
Clues proceeded as in earlier rounds, except that the couple trailing in score at that point in the game would do all the answering and the leading team would select the clues. The first clue was worth one point, the second worth two, and so on. If the trailing couple answered incorrectly at any time before their score surpassed their opponents', the round was over and the other couple won outright. If the trailing couple tied or passed the leading couple's score, the leading couple, now trailing, received only one chance for a final clue that would win the game.
The winner at the end of this round would play "The $10,000 Dash," a final maze run for a prize of up to $10,000. Both couples kept their money and prizes. If both couples were tied going into the Catch-Up Round, they each ran the maze for $10,000.
The $10,000 Dash
In the final run, five of the towers (out of eight available) would be lit. Four of them would have zeroes on top, and the fifth would have a "1" lit. The runner had 60 seconds to activate the "1" and hit the button at the maze exit to win anything at all. To win the $10,000, the runner had to activate the push-buttons on the five lit towers, reach the exit, and push its button within 60 seconds. The couple won $1 for reaching the "1," and the winnings were multiplied by 10 for each zero reached. However, if the runner activated only zeroes or did not hit the exit button before time ran out, the couple won nothing. The buttons could be hit in any order, but only one button on each tower was active.
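The payout rule above works out to a simple power of ten. A sketch of the arithmetic (function and parameter names are illustrative, not from the show):

```typescript
// $10,000 Dash payout as described above: the runner must activate the "1"
// tower and press the exit button to win anything; the base $1 is then
// multiplied by 10 for each of the (up to four) zero towers activated.
function dashWinnings(hitOne: boolean, zerosHit: number, reachedExit: boolean): number {
  if (!hitOne || !reachedExit) return 0; // only zeroes, or time ran out
  return 1 * 10 ** zerosHit;            // $1, times 10 per zero
}

dashWinnings(true, 4, true);  // all five towers plus the exit: $10,000
dashWinnings(true, 2, true);  // the "1" plus two zeroes: $100
dashWinnings(false, 4, true); // zeroes only: nothing
```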
Champions were retired upon winning the $10,000 Dash or after appearing for three days.
Broadcast history
ABC broadcast The Money Maze at 4:00 p.m. Eastern (3:00 Central), opposite Tattletales on CBS and Somerset on NBC; Money Maze did not perform well against either series in the ratings, and host Clooney claimed in a 1998 Cincinnati Post column that fewer than half of ABC's affiliates carried the show.
However, this was not the only reason the show faltered.
Set
The large maze, estimated by some sources at 50 × 100 feet, had the audience sitting in bleachers above and around three sides of the maze, with the stage facing the remaining side. It is also widely believed to have been the main factor in the show's undoing.
The set was so large and complex that it took nearly an entire day to set up the maze and another to break it down, tying up the studio for an extra two days for each five-show, one-day taping session. According to Mark Evanier, producer Don Segall described Money Maze as "the first game show where the stage crew took home more money than the contestants"; the rental fees for taping at a large studio for several days, plus overtime pay for setting up, striking, and storing the set, quickly eclipsed the show's prize budget.
Cancellation
ABC may have viewed the large expenses as a headache, even as Tattletales was pushed to 11:00 am on June 16 in favor of Musical Chairs. While Money Maze was scheduled to end on July 4, the network discontinued the show before the final week was taped. The last aired week (June 30 – July 4) consisted of repeats from the later format (with the $500 bonus tower), all containing $10,000 wins, with the Friday repeat being the last first-run show from the previous Friday. A new version of the 1960s game You Don't Say! replaced The Money Maze the next week.
Planned revival
In 2009, producer Ron Greenberg worked with Don Lipp and Phil Gurin on a new pilot for a revival on French TV network TF1. It is uncertain what has come of these plans.
Episode status
The pilot (titled The Moneymaze) and at least one episode from the series exist in ABC's archive. As with most other daytime game shows on the networks other than CBS from that era, the tapes were erased after broadcast for reuse due to their great expense at the time.
A brief clip from an episode aired in 2004 when Chuck Barris and George Clooney (Nick Clooney's son) were promoting Confessions of a Dangerous Mind. Another 1975 episode, recorded on an early home VCR by artist Andy Warhol, is held at the Paley Center for Media.
References
External links
1974 American television series debuts
1975 American television series endings
1970s American game shows
American Broadcasting Company original programming
English-language television shows
Lost television shows
|
Sanjai Gandhi is an attorney at law specializing in intellectual property rights. Gandhi has been instrumental in obtaining protection under the Geographical Indications of Goods (Registration and Protection) Act, 1999 for more than 15 geographical indications (GIs) for the state of Tamil Nadu, India. The products for which Gandhi has obtained GI protection include: Kancheepuram Silk Sarees, Bhavani Jamakkalam (bedsheet), Madurai Sungudi Saree, Salem White Silk, Kovai Cora Cotton, Arani Sari, Thanjavur Paintings, Thanjavur Dancing Doll, Ethomozhi Tall Coconut of Kanyakumari district, Tangaliya Shawl of Gujarat, Thanjavur Veenai, Mahabalipuram Stone Sculpture, Thirubuvanam Silk Sarees, Dindigul Locks, Srivilliputtur Palkova, Kandangi Saree, Arumbavur Wood Carving, and Thanjavur Pith Work.
Gandhi received the National Intellectual Property Award 2018 in the category of top individual/organisation for best facilitation of registration of GIs and promotion of registered GIs in India.
He has also been appointed as the nodal officer for the GI registry in Tamil Nadu. Although the Union Ministry of Commerce called upon all states to appoint such an officer, Tamil Nadu is the only state in India to have done so.
He has written a book in English titled Arts and Crafts of India: Registered GI Products. To spread awareness about IPRs, he has penned a book titled Intellectual Property in Tamil.
References
Living people
20th-century Indian lawyers
1975 births
People from Thanjavur district
|
```java
//
//
// path_to_url
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
package google.registry.rde;
import static google.registry.util.HexDumper.dumpHex;
import static java.nio.charset.StandardCharsets.UTF_8;
import com.google.common.collect.Ordering;
import com.google.common.io.BaseEncoding;
import com.google.re2j.Matcher;
import com.google.re2j.Pattern;
import google.registry.gcs.GcsUtils;
import google.registry.request.HttpException.NoContentException;
import google.registry.xjc.rde.XjcRdeRrType;
import google.registry.xml.XmlException;
import java.io.BufferedInputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import org.joda.time.DateTime;
import org.joda.time.ReadableInstant;
import org.joda.time.format.DateTimeFormatter;
import org.joda.time.format.ISODateTimeFormat;
/** Helper methods for RDE. */
public final class RdeUtils {
/** Number of bytes in head of XML deposit that will contain the information we want. */
private static final int PEEK_SIZE = 2048;
/** Regular expression for extracting creation timestamp from a raw XML deposit. */
private static final Pattern WATERMARK_PATTERN = Pattern.compile("[<:]watermark>\\s*([^<\\s]+)");
/** Standard ISO date/time formatter without milliseconds. Used for watermarks. */
private static final DateTimeFormatter DATETIME_FORMATTER =
ISODateTimeFormat.dateTimeNoMillis().withZoneUTC();
/**
* Look at some bytes from {@code xmlInput} to ensure it appears to be a FULL XML deposit and
* then use a regular expression to extract the watermark timestamp which is returned.
*/
public static DateTime peekWatermark(BufferedInputStream xmlInput)
throws IOException, XmlException {
xmlInput.mark(PEEK_SIZE);
byte[] peek = new byte[PEEK_SIZE];
if (xmlInput.read(peek) != PEEK_SIZE) {
throw new IOException(String.format("Failed to peek %,d bytes on input file", PEEK_SIZE));
}
xmlInput.reset();
String peekStr = new String(peek, UTF_8);
if (!peekStr.contains("urn:ietf:params:xml:ns:rde-1.0")) {
throw new XmlException(String.format(
"Does not appear to be an XML RDE deposit\n%s", dumpHex(peek)));
}
if (!peekStr.contains("type=\"FULL\"")) {
throw new XmlException("Only FULL XML RDE deposits supported at this time");
}
Matcher watermarkMatcher = WATERMARK_PATTERN.matcher(peekStr);
if (!watermarkMatcher.find()) {
throw new XmlException("Could not find RDE watermark in XML");
}
return DATETIME_FORMATTER.parseDateTime(watermarkMatcher.group(1));
}
/** Find the most recent folder in the given GCS bucket for the given watermark. */
public static String findMostRecentPrefixForWatermark(
DateTime watermark, String bucket, String tld, GcsUtils gcsUtils) throws NoContentException {
// The prefix is always in the format of: rde-2022-02-21t00-00-00z-2022-02-21t00-07-33z, where
// the first datetime is the watermark and the second one is the time when the RDE beam job
// launched. We search for the latest folder that starts with "rde-[watermark]".
String partialPrefix = String.format("rde-%s", watermark.toString("yyyy-MM-dd't'HH-mm-ss'z'"));
String latestFilenameSuffix = null;
try {
latestFilenameSuffix =
gcsUtils.listFolderObjects(bucket, partialPrefix).stream()
.max(Ordering.natural())
.orElse(null);
} catch (IOException e) {
throw new NoContentException(
String.format(
"Error reading folders starting with %s in bucket %s", partialPrefix, bucket));
}
if (latestFilenameSuffix == null) {
throw new NoContentException(
String.format("RDE deposit for TLD %s on %s does not exist", tld, watermark));
}
int firstSlashPosition = latestFilenameSuffix.indexOf('/');
return partialPrefix + latestFilenameSuffix.substring(0, firstSlashPosition + 1);
}
/**
* Generates an ID matching the regex {@code \w{1,13}} from a millisecond timestamp.
*
* <p>This routine works by turning the number of UTC milliseconds from the UNIX epoch into a
* big-endian byte-array which is then converted to a base32 string without padding that's no
* longer than 13 chars because {@code 13 = Ceiling[Log[32, 2^64]]}. How lucky!
*/
public static String timestampToId(ReadableInstant timestamp) {
byte[] bytes = ByteBuffer.allocate(8).putLong(timestamp.getMillis()).array();
return BaseEncoding.base32().omitPadding().encode(bytes);
}
static XjcRdeRrType makeXjcRdeRrType(String registrarId) {
XjcRdeRrType bean = new XjcRdeRrType();
bean.setValue(registrarId);
return bean;
}
private RdeUtils() {}
}
```
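The `timestampToId` encoding above (8 big-endian bytes of UTC milliseconds, base32 without padding, at most 13 characters because ⌈log₃₂ 2⁶⁴⌉ = 13) can be checked with a quick Python sketch; this mirrors the Java logic for illustration and is not part of the registry codebase:

```python
import base64
import struct

def timestamp_to_id(millis: int) -> str:
    """Mirror RdeUtils.timestampToId: pack the millisecond timestamp as an
    8-byte big-endian long, then RFC 4648 base32-encode without padding."""
    raw = struct.pack(">q", millis)  # 8 bytes, big-endian signed long
    return base64.b32encode(raw).decode("ascii").rstrip("=")

# 64 bits at 5 bits per base32 character always yields 13 characters,
# satisfying the \w{1,13} constraint from the Javadoc.
print(timestamp_to_id(0))   # AAAAAAAAAAAAA
```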
|
```cpp
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
#ifndef V8_CCTEST_UNICODE_HELPERS_H_
#define V8_CCTEST_UNICODE_HELPERS_H_
#include "src/unicode.h"
int Ucs2CharLength(unibrow::uchar c);
int Utf8LengthHelper(const char* s);
#endif // V8_CCTEST_UNICODE_HELPERS_H_
```
|
```typescript
import { G2Spec } from '../../../src';
export function carsPointScatterPlot(): G2Spec {
return {
type: 'point',
width: 1152,
height: 600,
data: {
type: 'fetch',
value: 'data/cars.csv',
},
encode: {
x: 'mpg',
y: 'hp',
color: 'steelblue',
},
labels: [
{
text: 'name',
stroke: '#fff',
textAlign: 'start',
textBaseline: 'middle',
dx: 10,
position: 'left',
fontSize: 10,
lineWidth: 2,
},
],
};
}
```
|
Otluca () is a village in the Beşiri District of Batman Province in Turkey. The village is populated by Kurds of the Reşkotan tribe and had a population of 267 in 2021.
References
Villages in Beşiri District
Kurdish settlements in Batman Province
|
```cpp
/**
* Authors:
* - Paul Asmuth <paul@eventql.io>
* - Laura Schlimmer <laura@eventql.io>
*
* This program is free software: you can redistribute it and/or modify it under
* or any later version.
*
* In accordance with Section 7(e) of the license, the licensing of the Program
* under the license does not imply a trademark license. Therefore any rights,
* title and interest in our trademarks remain entirely with us.
*
* This program is distributed in the hope that it will be useful, but WITHOUT
* ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS
* FOR A PARTICULAR PURPOSE. See the license for more details.
*
* You can be released from the requirements of the license by purchasing a
* commercial license. Buying such a license is mandatory as soon as you develop
* commercial activities involving this program without disclosing the source
* code of your own applications
*/
#pragma once
#include <eventql/util/stdtypes.h>
#include <eventql/util/return_code.h>
#include <eventql/sql/qtree/ValueExpressionNode.h>
#include <eventql/sql/qtree/qtree_coder.h>
#include "eventql/eventql.h"
namespace csql {
class IfExpressionNode : public ValueExpressionNode {
public:
static ReturnCode newNode(
RefPtr<ValueExpressionNode> conditional_expr,
RefPtr<ValueExpressionNode> true_branch_expr,
RefPtr<ValueExpressionNode> false_branch_expr,
RefPtr<ValueExpressionNode>* node);
Vector<RefPtr<ValueExpressionNode>> arguments() const override;
RefPtr<ValueExpressionNode> conditional() const;
RefPtr<ValueExpressionNode> trueBranch() const;
RefPtr<ValueExpressionNode> falseBranch() const;
RefPtr<QueryTreeNode> deepCopy() const override;
String toSQL() const override;
SType getReturnType() const override;
static void encode(
QueryTreeCoder* coder,
const IfExpressionNode& node,
OutputStream* os);
static RefPtr<QueryTreeNode> decode(
QueryTreeCoder* coder,
InputStream* is);
protected:
IfExpressionNode(
SType return_type,
RefPtr<ValueExpressionNode> conditional_expr,
RefPtr<ValueExpressionNode> true_branch_expr,
RefPtr<ValueExpressionNode> false_branch_expr);
SType return_type_;
RefPtr<ValueExpressionNode> conditional_expr_;
RefPtr<ValueExpressionNode> true_branch_expr_;
RefPtr<ValueExpressionNode> false_branch_expr_;
};
} // namespace csql
```
|
The Paralegal Institute is a nationally accredited two-year college based in Phoenix, Arizona, United States. The Paralegal Institute (TPI) offers programs specializing in paralegal, criminal justice, medical transcription and legal nurse consultation.
History
The Paralegal Institute was founded in 1974 by John Morrison and is now run by Kathleen Mirabile. It was a paraprofessional residential school until 1979, when it became a distance learning institution. Since then, The Paralegal Institute has received accreditation from the Distance Education and Training Council and was approved to grant degrees and diplomas by the State Board for Private Postsecondary Education.
The Paralegal Institute issues two-year associate degrees in paralegal studies and criminal justice that are recognized by the U.S. Department of Education and the Council for Higher Education Accreditation (CHEA). It has been accredited by the Distance Education and Training Council since 1979.
Programs
The Paralegal Institute offers six programs in four different areas.
Paralegal Diploma
Students must fulfill 30 credit hours within one year to complete the Paralegal Diploma program. Among this program's specialties are legal research, legal analysis and contracts. Students who complete this program are eligible to sit for the National Association for Legal Assistants Certification Exam.
Paralegal Associate Degree
Students have two years to finish the Paralegal associate degree program, which requires completion of the requirements of a paralegal diploma and five additional specialty courses. These can include programs on torts, environmental law and social security disability.
Criminal Justice Diploma
Students must fulfill 30 credit hours within one year to complete the Criminal Justice Diploma program. Among this program's specialties are criminal law, laws of evidence and other courses taught by licensed practicing attorneys.
Associate Degree in Criminal Justice
To earn an associate degree students must meet the requirements of the Criminal Justice Diploma in addition to five specialty courses. These include programs on white collar crime, corrections and criminology.
Legal Nurse Consultant
This program is offered to individuals who have medical credentials, and offers courses in personal injury law and legal nurse consulting. Students have two years to complete the Legal Nurse Consultant program.
Medical Transcriptionist
The Medical Transcriptionist program teaches students about medical terminology used in hospitals and clinics. Students have two years to complete the Medical Transcriptionist program.
References
External links
The Paralegal Institute - official web site
Distance Education Accreditation Commission
Distance education institutions based in the United States
Educational institutions established in 1974
1974 establishments in Arizona
|
```cpp
#pragma once
#include "tensors/backend.h"
#include "tensors/tensor.h"
#include "functional/functional.h"
#include "graph/node.h"
#include "tensors/tensor_operators.h"
#ifdef CUDNN
#include "tensors/gpu/cudnn_wrappers.h"
#endif
namespace marian {
struct UnaryNodeOp : public NaryNodeOp {
UnaryNodeOp(Expr a, Shape shape, Type value_type)
: NaryNodeOp({a}, shape, value_type) {}
UnaryNodeOp(Expr a, Type value_type)
: NaryNodeOp({a}, a->shape(), value_type) {}
UnaryNodeOp(Expr a, Shape shape)
: NaryNodeOp({a}, shape, a->value_type()) {}
UnaryNodeOp(Expr a)
: NaryNodeOp({a}, a->shape(), a->value_type()) {}
const std::string color() override { return "yellow"; }
};
struct ScalarAddNodeOp : public UnaryNodeOp {
private:
friend class SerializationHelpers;
float scalar_{0};
public:
ScalarAddNodeOp(Expr a, float scalar) : UnaryNodeOp(a), scalar_{scalar} {}
NodeOps forwardOps() override {
using namespace functional;
return {NodeOp(Element(_1 = _2 + scalar_, val_, child(0)->val()))};
}
NodeOps backwardOps() override {
using namespace functional;
return {NodeOp(Add(_1, child(0)->grad(), adj_))};
}
const std::string type() override { return "scalar_add"; }
virtual size_t hash() override {
if(!hash_) {
hash_ = NaryNodeOp::hash();
util::hash_combine(hash_, scalar_);
}
return hash_;
}
virtual bool equal(Expr node) override {
if(!NaryNodeOp::equal(node))
return false;
auto cnode = std::dynamic_pointer_cast<ScalarAddNodeOp>(node);
if(!cnode)
return false;
if(scalar_ != cnode->scalar_)
return false;
return true;
}
};
// Cast a tensor to a different type
struct CastNodeOp : public UnaryNodeOp {
public:
CastNodeOp(Expr a, Type type) : UnaryNodeOp(a, type) {}
NodeOps forwardOps() override {
using namespace functional;
return { NodeOp(CopyCast(val_, child(0)->val())) };
}
NodeOps backwardOps() override {
using namespace functional;
return { NodeOp(AddCast(child(0)->grad(), adj_)) };
}
const std::string type() override { return "cast"; }
};
struct ScalarMultNodeOp : public UnaryNodeOp {
private:
friend class SerializationHelpers;
float scalar_{0};
public:
ScalarMultNodeOp(Expr a, float scalar) : UnaryNodeOp(a), scalar_{scalar} {}
NodeOps forwardOps() override {
using namespace functional;
return {NodeOp(Element(_1 = scalar_ * _2, val_, child(0)->val()))};
}
NodeOps backwardOps() override {
using namespace functional;
return {NodeOp(Add(scalar_ * _1, child(0)->grad(), adj_))};
}
const std::string type() override { return "scalar_mult"; }
virtual size_t hash() override {
if(!hash_) {
hash_ = NaryNodeOp::hash();
util::hash_combine(hash_, scalar_);
}
return hash_;
}
virtual bool equal(Expr node) override {
if(!NaryNodeOp::equal(node))
return false;
auto cnode = std::dynamic_pointer_cast<ScalarMultNodeOp>(node);
if(!cnode)
return false;
if(scalar_ != cnode->scalar_)
return false;
return true;
}
};
struct ClipNodeOp : public UnaryNodeOp {
private:
float clip_{0};
public:
ClipNodeOp(Expr a, float clip) : UnaryNodeOp(a), clip_{clip} {}
NodeOps forwardOps() override {
using namespace functional;
return {NodeOp(Element(_1 = clip(_2, clip_), val_, child(0)->val()))};
}
NodeOps backwardOps() override {
using namespace functional;
return {NodeOp(
Add(bump(_1, clip_) * _2, child(0)->grad(), child(0)->val(), adj_))};
}
const std::string type() override { return "clip"; }
virtual size_t hash() override {
if(!hash_) {
hash_ = NaryNodeOp::hash();
util::hash_combine(hash_, clip_);
}
return hash_;
}
virtual bool equal(Expr node) override {
if(!NaryNodeOp::equal(node))
return false;
auto cnode = std::dynamic_pointer_cast<ClipNodeOp>(node);
if(!cnode)
return false;
if(clip_ != cnode->clip_)
return false;
return true;
}
};
struct SigmoidNodeOp : public UnaryNodeOp {
SigmoidNodeOp(Expr a) : UnaryNodeOp(a) {}
NodeOps forwardOps() override {
using namespace functional;
return {NodeOp(Element(_1 = sigmoid(_2), val_, child(0)->val()))};
}
NodeOps backwardOps() override {
using namespace functional;
return {NodeOp(Add(_1 * _2 * (1.0f - _2), child(0)->grad(), adj_, val_))};
}
const std::string type() override { return "sigmoid"; }
};
// struct Scalar2PowNodeOp : public UnaryNodeOp {
// private:
// float scalar_{0};
//
// public:
// template <typename... Args>
// Scalar2PowNodeOp(Expr a, float scalar, Args... args)
// : UnaryNodeOp(a, args...), scalar_{scalar} {}
//
// NodeOps forwardOps() {
// return {NodeOp(Element(_1 = Pow(_2, scalar_), val_, child(0)->val()))};
// }
//
// NodeOps backwardOps() {
// return {NodeOp(Add(scalar_ * Pow(_1, scalar_ - 1.f) * _2,
// child(0)->grad(), child(0)->val(), adj_))};
// }
//
// const std::string type() { return "scalar_pow2"; }
//};
//
// struct Scalar1PowNodeOp : public UnaryNodeOp {
// private:
// float scalar_{0};
//
// public:
// template <typename... Args>
// Scalar1PowNodeOp(float scalar, Expr a, Args... args)
// : UnaryNodeOp(a, args...), scalar_{scalar} {}
//
// NodeOps forwardOps() {
// return {NodeOp(Element(_1 = Pow(scalar_, _2), val_, child(0)->val()))};
// }
//
// NodeOps backwardOps() {
// return {NodeOp(Add(Pow(scalar_, _1) * log(scalar_) * _2, child(0)->grad(),
// child(0)->val(), adj_))};
// }
//
// const std::string type() { return "scalar_pow1"; }
//};
struct TanhNodeOp : public NaryNodeOp {
TanhNodeOp(const std::vector<Expr>& nodes)
: NaryNodeOp(nodes, newShape(nodes)) {}
Shape newShape(const std::vector<Expr>& nodes) {
return Shape::broadcast(nodes);
}
NodeOps forwardOps() override {
using namespace functional;
switch(children_.size()) {
case 1: return {NodeOp(Element(_1 = tanh(_2), val_, child(0)->val()))};
case 2:
return {NodeOp(Element(
_1 = tanh(_2 + _3), val_, child(0)->val(), child(1)->val()))};
case 3:
return {NodeOp(Element(_1 = tanh(_2 + _3 + _4),
val_,
child(0)->val(),
child(1)->val(),
child(2)->val()))};
default:
return {
NodeOp(Element(_1 = _2 + _3 + _4,
val_,
child(0)->val(),
child(1)->val(),
child(2)->val());
for(size_t i = 3; i < children_.size(); ++i)
Element(_1 = _1 + _2, val_, child(i)->val());
Element(_1 = tanh(_1), val_);)
};
}
}
NodeOps backwardOps() override {
using namespace functional;
NodeOps ops;
for(size_t i = 0; i < children_.size(); i++) {
ops.push_back(
NodeOp(Add(_1 * (1.0f - (_2 * _2)), child(i)->grad(), adj_, val_)));
}
return ops;
}
const std::string color() override { return "yellow"; }
const std::string type() override { return "tanh"; }
};
struct ReLUNodeOp : public UnaryNodeOp {
ReLUNodeOp(Expr a) : UnaryNodeOp(a) {}
NodeOps forwardOps() override {
// f(x) = max(0, x)
using namespace functional;
return {NodeOp(Element(_1 = ReLU(_2),
val_, // _1 := f(x) to be calculated
child(0)->val() // _2 := x
))};
}
NodeOps backwardOps() override {
using namespace functional;
// dJ/dx += dJ/df * binarystep(x)
return {NodeOp(Add(_1 * ReLUback(_2),
child(0)->grad(), // dJ/dx
adj_, // _1 := dJ/df
child(0)->val() // _2 := f(x) = max(0, x)
))};
}
const std::string type() override { return "ReLU"; }
};
/**
* Represents a <a
* href="path_to_url">parametric
* rectified linear unit</a> node in an expression graph.
* For \f$ \alpha = 0.01 \f$ (the default value) it is equivalent to Leaky
* ReLU.
*
* This node implements the activation function:
* \f[
* f(x, \alpha) =
* \begin{cases}
* \alpha x & \text{if } x \leq 0 \\
* x & \text{if } x > 0
* \end{cases}
* \f]
*
* and its derivative:
* \f[
* f^\prime(x, \alpha) =
* \begin{cases}
* \alpha & \text{if } x \leq 0 \\
* 1 & \text{if } x > 0
* \end{cases}
* \f]
*/
struct PReLUNodeOp : public UnaryNodeOp {
PReLUNodeOp(float alpha, Expr a) : UnaryNodeOp(a), alpha_(alpha) {}
NodeOps forwardOps() override {
using namespace functional;
return {NodeOp(Element(_1 = PReLU(_2, alpha_), val_, child(0)->val()))};
}
NodeOps backwardOps() override {
using namespace functional;
return {NodeOp(Add(
_1 * PReLUback(_2, alpha_), child(0)->grad(), adj_, child(0)->val()))};
}
const std::string type() override { return "PReLU"; }
virtual size_t hash() override {
if(!hash_) {
hash_ = NaryNodeOp::hash();
util::hash_combine(hash_, alpha_);
}
return hash_;
}
virtual bool equal(Expr node) override {
if(!NaryNodeOp::equal(node))
return false;
auto cnode = std::dynamic_pointer_cast<PReLUNodeOp>(node);
if(!cnode)
return false;
if(alpha_ != cnode->alpha_)
return false;
return true;
}
private:
float alpha_{0.01f};
};
/**
* Represents a <a href="path_to_url">swish</a> node
* in an expression graph.
*
* This node implements the activation function
* \f$ f(x) = x \cdot \sigma(bx) \f$
* and its derivative
* \f$ f^\prime(x) = bf(x) + \sigma(bx)(1 - bf(x)) \f$ .
*
*/
struct SwishNodeOp : public UnaryNodeOp {
SwishNodeOp(Expr a, float b = 1.f) : UnaryNodeOp(a), b_{b} {}
NodeOps forwardOps() override {
using namespace functional;
return {NodeOp(Element(_1 = _2 * sigmoid(b_ * _2), val_, child(0)->val()))};
}
NodeOps backwardOps() override {
using namespace functional;
// dJ/dx += dJ/df * (b*f(x) + sigmoid(b*x) * (1 - b*f(x)))
return {NodeOp(Add(_1 * (b_ * _3 + sigmoid(b_ * _2) * (1.f - (b_ * _3))),
child(0)->grad(), // dJ/dx
adj_, // _1 := dJ/df
child(0)->val(), // _2 := x
val_ // _3 := f(x) = x*sigmoid(b*x)
))};
}
const std::string type() override { return "swish"; }
virtual size_t hash() override {
if(!hash_) {
hash_ = NaryNodeOp::hash();
util::hash_combine(hash_, b_);
}
return hash_;
}
virtual bool equal(Expr node) override {
if(!NaryNodeOp::equal(node))
return false;
auto cnode = std::dynamic_pointer_cast<SwishNodeOp>(node);
if(!cnode)
return false;
if(b_ != cnode->b_)
return false;
return true;
}
float b_;
};
struct SoftmaxNodeOp : public UnaryNodeOp {
SoftmaxNodeOp(Expr a) : UnaryNodeOp(a) {}
NodeOps forwardOps() override {
return {
NodeOp(Softmax(val_, child(0)->val()))};
}
NodeOps backwardOps() override {
// For each row, the Jacobian times vector is given by:
// J * dy = p .* (dy - avg*1)
// where avg = p'*dy and p is the softmax output (probabilities).
//
// For more information, see sec. 2.5 of the following reference:
// André F. T. Martins and Ramón Astudillo.
// "From Softmax to Sparsemax: A Sparse Model of Attention and Multi-Label
// Classification." ICML 2016.
// path_to_url
// val_ is already masked if there is a mask, so no need to apply here.
return {NodeOp(SoftmaxGrad(child(0)->grad(), adj_, val_))};
}
const std::string type() override { return "softmax"; }
};
struct LogSoftmaxNodeOp : public UnaryNodeOp {
LogSoftmaxNodeOp(Expr a) : UnaryNodeOp(a) {}
NodeOps forwardOps() override { return {NodeOp(LogSoftmax(val_, child(0)->val()))}; }
NodeOps backwardOps() override {
// Based on the description for softmax, we have logsoftmax:
// J * dy = dy - avg*1
// where avg = exp(p)'*dy and p is the softmax output (probabilities).
return {NodeOp(LogSoftmaxGrad(child(0)->grad(), adj_, val_))};
}
const std::string type() override { return "logsoftmax"; }
};
enum class ReduceNodeOpCode {
sum, mean, rms, meanSqr, min, max, prod, logSumExp
};
struct ReduceNodeOp : public UnaryNodeOp {
friend class SerializationHelpers;
int axis_;
ReduceNodeOpCode opCode_;
int reducedDim_; // dimension of axis being reduced, e.g. used in mean()
ReduceNodeOp(Expr a, int axis, ReduceNodeOpCode opCode)
: UnaryNodeOp(a, newShape(a, axis)), opCode_(opCode)
{
reducedDim_ = a->shape()[axis]; // e.g. used in mean()
ABORT_IF(reducedDim_ != a->shape().elements() / shape().elements(),
"Bug in determining reducedDim {} != {}",
reducedDim_,
a->shape().elements() / shape().elements());
}
NodeOps forwardOps() override {
using namespace functional;
switch (opCode_) {
case ReduceNodeOpCode::sum:
return {NodeOp(Reduce(_1, val_, child(0)->val()))};
case ReduceNodeOpCode::mean:
return {NodeOp(Reduce(_1, 1.0f / (float)reducedDim_, val_, child(0)->val()))};
case ReduceNodeOpCode::rms:
return {NodeOp(Reduce(_1 * _1, 1.0f / (float)reducedDim_, val_, child(0)->val());
Element(_1 = sqrt(_1), val_))};
case ReduceNodeOpCode::meanSqr:
return {NodeOp(Reduce(_1 * _1, 1.0f / (float)reducedDim_, val_, child(0)->val()))};
case ReduceNodeOpCode::min:
return {NodeOp(Reduce(_1, min(_1,_2), std::numeric_limits<float>::max(), val_, child(0)->val()))};
case ReduceNodeOpCode::max:
return {NodeOp(Reduce(_1, max(_1,_2), std::numeric_limits<float>::lowest(), val_, child(0)->val()))};
case ReduceNodeOpCode::prod:
return {NodeOp(Reduce(_1, _1 * _2, 1.0f, val_, child(0)->val()))};
case ReduceNodeOpCode::logSumExp:
return {NodeOp(Reduce(_1, logaddexp(_1,_2), std::numeric_limits<float>::lowest(), val_, child(0)->val()))};
default:
ABORT("Unexpected reduction op-code {}", (int)opCode_);
}
}
NodeOps backwardOps() override {
using namespace functional;
#if 1 // @BUGBUG: This is a workaround for not correctly propagating non-trainable information. @TODO: Do this the right and general way.
if (adj_ == nullptr)
return {};
#endif
switch (opCode_) {
case ReduceNodeOpCode::sum:
return {NodeOp(Add(_1, child(0)->grad(), adj_))};
case ReduceNodeOpCode::mean:
return {NodeOp(Add(_1, 1.0f / (float)reducedDim_, child(0)->grad(), adj_))};
case ReduceNodeOpCode::rms: // WARNING: UNTESTED!!
// y = (sum_j x_j^2)^0.5
// dJ/dx_i = dJ/dy * 0.5 (sum_j x_j^2)^-0.5 * 2 x_i = dJ/dy * x_i / y --@REVIEW: is this correct?
// @TODO: do we need protection against div by 0? L'hospital rule?
return {NodeOp(Add(_1 * _2 / _3, child(0)->grad(), adj_, child(0)->val(), val_))};
case ReduceNodeOpCode::meanSqr: // WARNING: UNTESTED!!
// y = sum_j x_j^2
// dJ/dx_i = dJ/dy * sum_j dx_j^2/dx_i = dJ/dy * 2 x_i --@REVIEW: is this correct?
return {NodeOp(Add(_1 * 2.0f * _2, child(0)->grad(), adj_, child(0)->val()))};
case ReduceNodeOpCode::min: // WARNING: UNTESTED!!
case ReduceNodeOpCode::max: // WARNING: UNTESTED!!
// adj_ gets routed into the min/max value --@REVIEW: is this correct?
return {NodeOp(Add((_1 == _2) * _3, child(0)->grad(), child(0)->val(), val_, adj_))};
case ReduceNodeOpCode::logSumExp:
// y = log(sum_j exp(x_j))
// dJ/dx_i = dJ/dy * 1/(sum_j exp(x_j)) exp(x_i) = dJ/dy * exp(x_i - y)) --@REVIEW: is this correct?
return {NodeOp(Add(_1 * exp(_2 - _3), child(0)->grad(), adj_, child(0)->val(), val_))};
default:
ABORT("Unexpected reduction op-code {}", (int)opCode_);
}
}
Shape newShape(Expr a, int axis) {
Shape shape = a->shape();
axis_ = shape.axis(axis);
shape.set(axis_, 1);
return shape;
}
const std::string type() override {
switch (opCode_) {
case ReduceNodeOpCode::sum: return "sum";
case ReduceNodeOpCode::mean: return "mean";
case ReduceNodeOpCode::rms: return "rms";
case ReduceNodeOpCode::meanSqr: return "meanSqr";
case ReduceNodeOpCode::min: return "min";
case ReduceNodeOpCode::max: return "max";
case ReduceNodeOpCode::prod: return "prod";
case ReduceNodeOpCode::logSumExp: return "logSumExp";
default: ABORT("Unexpected reduction op-code {}", (int)opCode_);
}
}
const std::string color() override { return "orange"; }
virtual size_t hash() override {
if(!hash_) {
hash_ = NaryNodeOp::hash();
util::hash_combine(hash_, axis_);
util::hash_combine(hash_, (int)opCode_);
}
return hash_;
}
virtual bool equal(Expr node) override {
if(!NaryNodeOp::equal(node))
return false;
auto cnode = std::dynamic_pointer_cast<ReduceNodeOp>(node);
if(!cnode)
return false;
if(axis_ != cnode->axis_ || opCode_ != cnode->opCode_)
return false;
return true;
}
};
struct LogNodeOp : public UnaryNodeOp {
LogNodeOp(Expr a) : UnaryNodeOp(a) {}
NodeOps forwardOps() override {
using namespace functional;
return {NodeOp(Element(_1 = log(_2), val_, child(0)->val()))};
}
NodeOps backwardOps() override {
using namespace functional;
return {// NodeOp(Add(_1 * (1.f / _2), child(0)->grad(), adj_,
// child(0)->val()))};
NodeOp(Add(_1 / _2, child(0)->grad(), adj_, child(0)->val()))};
}
const std::string type() override { return "log"; }
};
struct ExpNodeOp : public UnaryNodeOp {
ExpNodeOp(Expr a) : UnaryNodeOp(a) {}
NodeOps forwardOps() override {
using namespace functional;
return {NodeOp(Element(_1 = exp(_2), val_, child(0)->val()))};
}
NodeOps backwardOps() override {
using namespace functional;
return {NodeOp(Add(_1 * exp(_2), child(0)->grad(), adj_, child(0)->val()))};
}
const std::string type() override { return "exp"; }
};
struct SinNodeOp : public UnaryNodeOp {
SinNodeOp(Expr a) : UnaryNodeOp(a) {}
NodeOps forwardOps() override {
using namespace functional;
return {NodeOp(Element(_1 = sin(_2), val_, child(0)->val()))};
}
NodeOps backwardOps() override {
using namespace functional;
return {NodeOp(Add(_1 * cos(_2), child(0)->grad(), adj_, child(0)->val()))};
}
const std::string type() override { return "sin"; }
};
struct CosNodeOp : public UnaryNodeOp {
CosNodeOp(Expr a) : UnaryNodeOp(a) {}
NodeOps forwardOps() override {
using namespace functional;
return {NodeOp(Element(_1 = cos(_2), val_, child(0)->val()))};
}
NodeOps backwardOps() override {
using namespace functional;
return {NodeOp(Add(_1 * -sin(_2), child(0)->grad(), adj_, child(0)->val()))};
}
const std::string type() override { return "cos"; }
};
struct TanNodeOp : public UnaryNodeOp {
TanNodeOp(Expr a) : UnaryNodeOp(a) {}
NodeOps forwardOps() override {
using namespace functional;
return {NodeOp(Element(_1 = tan(_2), val_, child(0)->val()))};
}
NodeOps backwardOps() override {
using namespace functional;
return {NodeOp(Add(_1 / sqr(cos(_2)), child(0)->grad(), adj_, child(0)->val()))};
}
const std::string type() override { return "tan"; }
};
struct SqrtNodeOp : public UnaryNodeOp {
float epsilon_;
SqrtNodeOp(Expr a, float epsilon) : UnaryNodeOp(a), epsilon_(epsilon) {}
NodeOps forwardOps() override {
using namespace functional;
return {NodeOp(Element(_1 = sqrt(_2 + epsilon_), val_, child(0)->val()))};
}
NodeOps backwardOps() override {
using namespace functional;
return {NodeOp(Add(0.5f * (1.f / _1) * _2, child(0)->grad(), val_, adj_))};
}
const std::string type() override { return "sqrt"; }
virtual size_t hash() override {
if(!hash_) {
size_t seed = NaryNodeOp::hash();
util::hash_combine(seed, epsilon_);
hash_ = seed;
}
return hash_;
}
virtual bool equal(Expr node) override {
if(!NaryNodeOp::equal(node))
return false;
auto cnode = std::dynamic_pointer_cast<SqrtNodeOp>(node);
if(!cnode)
return false;
if(epsilon_ != cnode->epsilon_)
return false;
return true;
}
};
struct SquareNodeOp : public UnaryNodeOp {
SquareNodeOp(Expr a) : UnaryNodeOp(a) {}
NodeOps forwardOps() override {
using namespace functional;
return {NodeOp(Element(_1 = _2 * _2, val_, child(0)->val()))};
}
NodeOps backwardOps() override {
using namespace functional;
return {
NodeOp(Add(2.f * _1 * _2, child(0)->grad(), child(0)->val(), adj_))};
}
const std::string type() override { return "square"; }
};
struct NegNodeOp : public UnaryNodeOp {
NegNodeOp(Expr a) : UnaryNodeOp(a) {}
NodeOps forwardOps() override {
using namespace functional;
return {NodeOp(Element(_1 = -_2, val_, child(0)->val()))};
}
NodeOps backwardOps() override {
using namespace functional;
return {NodeOp(Add(-_1, child(0)->grad(), adj_))};
}
const std::string type() override { return "negate"; }
};
struct TransposeNodeOp : public UnaryNodeOp {
TransposeNodeOp(Expr a, const std::vector<int>& axes)
: UnaryNodeOp(a, newShape(a, axes)), axes_{axes}, axesBw_(axes.size()) {
for(int i = 0; i < axes_.size(); ++i)
axesBw_[axes_[i]] = i;
}
NodeOps forwardOps() override {
return {NodeOp(TransposeND(val_, child(0)->val(), axes_))};
}
NodeOps backwardOps() override {
return {NodeOp(TransposeNDGrad(child(0)->grad(), adj_, axesBw_))};
}
Shape newShape(Expr a, const std::vector<int>& axes) {
Shape shape = a->shape();
ABORT_IF(shape.size() != axes.size(),
"Shape and transpose axes have different number of dimensions");
for(size_t i = 0; i < shape.size(); ++i)
shape.set(i, a->shape()[axes[i]]);
return shape;
}
virtual size_t hash() override {
if(!hash_) {
size_t seed = NaryNodeOp::hash();
for(auto ax : axes_)
util::hash_combine(seed, ax);
hash_ = seed;
}
return hash_;
}
virtual bool equal(Expr node) override {
if(!NaryNodeOp::equal(node))
return false;
auto cnode = std::dynamic_pointer_cast<TransposeNodeOp>(node);
if(!cnode)
return false;
if(axes_ != cnode->axes_)
return false;
return true;
}
const std::string type() override { return "transpose"; }
const std::string color() override { return "orange"; }
private:
friend class SerializationHelpers;
std::vector<int> axes_;
std::vector<int> axesBw_;
};
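// Worked example (illustrative, not from the original source): the inverse
// permutation computed in the constructor, axesBw_[axes_[i]] = i, routes
// gradients back to the input layout. For axes_ = {1, 2, 0} (rotate axes left)
// we get axesBw_[1] = 0, axesBw_[2] = 1, axesBw_[0] = 2, i.e. axesBw_ = {2, 0, 1},
// the right-rotation that undoes the forward transpose. A permutation such as
// {0, 2, 1} (swap the last two axes) is its own inverse.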
class ReshapeNodeOp : public UnaryNodeOp {
protected:
friend class SerializationHelpers;
Expr reshapee_;
public:
ReshapeNodeOp(Expr a, Shape shape) : UnaryNodeOp(a, shape), reshapee_(a) {
ABORT_IF(a->shape().elements() != shape.elements(),
"Reshape must not change the number of elements (from {} to {})", a->shape().toString(), shape.toString());
Node::destroy_ = false;
}
~ReshapeNodeOp() {}
void allocate() override {}
void free() override {}
void forward() override {}
void backward() override {}
void init_dependent() override { reshapee_->init_dependent(); }
void set_zero_adjoint() override { reshapee_->set_zero_adjoint(); }
Tensor& val() override {
auto childVal = reshapee_->val();
auto temp = TensorBase::New(childVal->memory(), shape(), childVal->type(), childVal->getBackend());
val_.swap(temp);
return val_;
};
Tensor& grad() override {
auto childGrad = reshapee_->grad();
auto temp = TensorBase::New(childGrad->memory(), shape(), childGrad->type(), childGrad->getBackend());
adj_.swap(temp);
return adj_;
};
const std::string type() override { return "reshape"; }
const std::string color() override { return "grey"; }
virtual size_t hash() override {
if(!hash_) {
size_t seed = NaryNodeOp::hash();
for(auto s : shape())
util::hash_combine(seed, s);
hash_ = seed;
}
return hash_;
}
virtual bool equal(Expr node) override {
if(!NaryNodeOp::equal(node))
return false;
auto cnode = std::dynamic_pointer_cast<ReshapeNodeOp>(node);
if(!cnode)
return false;
if(shape() != cnode->shape())
return false;
return true;
}
};
// @TODO: add version with access to backward step
// This allows attaching a lambda function to any node during execution. It is otherwise
// a no-op, i.e. it consumes no memory and takes no time to execute (it is a reshape onto
// itself) beyond the compute in the lambda function. The lambda gets called after the
// forward step of the argument node.
class CallbackNodeOp : public ReshapeNodeOp {
private:
typedef std::function<void(Expr)> LambdaNodeCallback;
std::unique_ptr<LambdaNodeCallback> callback_;
public:
CallbackNodeOp(Expr node, LambdaNodeCallback callback)
: ReshapeNodeOp(node, node->shape()),
callback_(new LambdaNodeCallback(callback)) {
}
void forward() override {
(*callback_)(ReshapeNodeOp::reshapee_);
}
const std::string type() override { return "callback"; }
virtual size_t hash() override {
size_t seed = ReshapeNodeOp::hash();
util::hash_combine(seed, callback_.get());
return seed;
}
virtual bool equal(Expr node) override {
if(!ReshapeNodeOp::equal(node))
return false;
auto cnode = std::dynamic_pointer_cast<CallbackNodeOp>(node);
if(!cnode)
return false;
if(callback_ != cnode->callback_) // pointer compare on purpose
return false;
return true;
}
};
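// Usage sketch (hypothetical; assumes the marian `Expression<T>` node factory and the
// LOG macro): attach a side-effecting lambda that runs right after the wrapped node's
// forward step, e.g. to inspect activations:
//
//   auto probed = Expression<CallbackNodeOp>(x, [](Expr node) {
//     LOG(info, "forward done for node of type {}", node->type());
//   });
//
// Note that the lambda's address takes part in hash() and equal(), so two nodes wrapping
// different lambdas are never merged by common-subexpression elimination.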
class DropoutReluInplaceNodeOp : public ReshapeNodeOp {
private:
Expr mask_;
public:
DropoutReluInplaceNodeOp(Expr node, Expr mask)
: ReshapeNodeOp(node, node->shape()),
mask_(mask) {}
void forward() override {
using namespace marian::functional;
Element(_1 = ReLU(_1 * _2), val(), mask_->val());
}
void backward() override {
using namespace marian::functional;
Element(_1 = _1 * ReLUback(_2) * _3, grad(), val(), mask_->val());
}
const std::string type() override { return "dropoutReluInplace"; }
virtual size_t hash() override {
size_t seed = ReshapeNodeOp::hash();
util::hash_combine(seed, mask_->hash());
return seed;
}
virtual bool equal(Expr node) override {
if(!ReshapeNodeOp::equal(node))
return false;
auto cnode = std::dynamic_pointer_cast<DropoutReluInplaceNodeOp>(node);
if(!cnode)
return false;
if(mask_ != cnode->mask_)
return false;
return true;
}
};
// @TODO: review if still required, as this is an ugly hack anyway.
// Memoryless operator that clips gradients during the backward step.
// Executes this as an additional operation on the gradient.
class ClipGradientNodeOp : public UnaryNodeOp {
private:
Expr clipee_;
float clipValue_{0};
public:
ClipGradientNodeOp(Expr a, float clipValue)
: UnaryNodeOp(a), clipee_(a), clipValue_(clipValue) {
Node::destroy_ = false;
}
~ClipGradientNodeOp() {}
void allocate() override {}
void free() override {}
void forward() override {}
void backward() override {
using namespace marian::functional;
Element(_1 = clip(_1, clipValue_), adj_);
}
void init_dependent() override { clipee_->init_dependent(); }
void set_zero_adjoint() override { clipee_->set_zero_adjoint(); }
Tensor& val() override {
auto childVal = clipee_->val();
auto temp = TensorBase::New(childVal->memory(), shape(), childVal->type(), childVal->getBackend());
val_.swap(temp);
return val_;
};
Tensor& grad() override {
auto childGrad = clipee_->grad();
auto temp = TensorBase::New(childGrad->memory(), shape(), childGrad->type(), childGrad->getBackend());
adj_.swap(temp);
return adj_;
};
const std::string type() override { return "clipGradient"; }
const std::string color() override { return "grey"; }
virtual size_t hash() override {
if(!hash_) {
size_t seed = NaryNodeOp::hash();
util::hash_combine(seed, clipValue_);
hash_ = seed;
}
return hash_;
}
virtual bool equal(Expr node) override {
if(!NaryNodeOp::equal(node))
return false;
auto cnode = std::dynamic_pointer_cast<ClipGradientNodeOp>(node);
if(!cnode)
return false;
if(clipValue_ != cnode->clipValue_)
return false;
return true;
}
};
// narrow an axis to [begin, end)
// The resulting object must be consecutive in memory.
class SliceViewNodeOp : public UnaryNodeOp {
private:
friend class SerializationHelpers;
Expr viewedNode_; // viewed underlying node
Slice slice_; // index range
int axis_; // and axis along which it is viewed
size_t byteOffset_, byteSize_; // viewed segment in bytes (memory-consecutive)
public:
SliceViewNodeOp(Expr a, int axis, Slice slice)
: UnaryNodeOp(a, newShape(a, axis, slice), a->value_type()), viewedNode_(a), slice_(slice), axis_(axis) {
Node::destroy_ = false;
auto byteStride = a->shape().stride(axis) * sizeOf(value_type());
byteOffset_ = slice.begin * byteStride;
byteSize_ = shape()[axis] * byteStride;
}
static Shape newShape(Expr a, int& axis, Slice& slice) { // note: normalizes slice and axis in-place
const auto& shape = a->shape();
axis = shape.axis(axis); // normalize negative axis
slice = shape.slice(slice, axis); // normalize negative slice values
// enforce consecutive memory
if (slice.begin != 0 || slice.end != shape[axis] || slice.stride != 1) { // unless it's a no-op
ABORT_IF(slice.stride != 1, "Strides other than 1 are presently not supported by sliceView()");
for(int i = 0; i < axis; ++i)
ABORT_IF(shape[i] != 1, "Non-consecutive slices are presently not supported by sliceView()");
}
Shape outShape = shape;
outShape.set(axis, slice.end - slice.begin);
return outShape;
}
void allocate() override {}
void free() override {}
void forward() override {}
void backward() override {}
void init_dependent() override { viewedNode_->init_dependent(); }
void set_zero_adjoint() override { viewedNode_->set_zero_adjoint(); } // lazily allocate and zero out gradient (only runs once)
Tensor& val() override {
auto childVal = viewedNode_->val();
auto mem = MemoryPiece::New(childVal->memory()->data() + byteOffset_, byteSize_);
auto temp = TensorBase::New(mem, shape(), childVal->type(), childVal->getBackend());
val_.swap(temp);
return val_;
};
Tensor& grad() override {
auto childGrad = viewedNode_->grad();
auto mem = MemoryPiece::New(childGrad->memory()->data() + byteOffset_, byteSize_);
auto temp = TensorBase::New(mem, shape(), childGrad->type(), childGrad->getBackend());
adj_.swap(temp);
return adj_;
};
const std::string type() override { return "sliceView"; }
const std::string color() override { return "grey"; }
virtual size_t hash() override {
if(!hash_) {
hash_ = NaryNodeOp::hash();
util::hash_combine(hash_, slice_.begin);
util::hash_combine(hash_, slice_.end);
util::hash_combine(hash_, slice_.stride);
util::hash_combine(hash_, axis_);
}
return hash_;
}
virtual bool equal(Expr node) override {
if(!NaryNodeOp::equal(node))
return false;
auto cnode = std::dynamic_pointer_cast<SliceViewNodeOp>(node);
if(!cnode)
return false;
if(slice_ != cnode->slice_)
return false;
if(axis_ != cnode->axis_)
return false;
return true;
}
};
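// Worked example (illustrative, not from the original source): for a float32 tensor of
// shape {1, 1, 6, 512}, slicing axis 2 with the range [2, 4) gives
//   byteStride = stride(2) * sizeof(float) = 512 * 4 = 2048 bytes,
//   byteOffset_ = 2 * 2048 = 4096, byteSize_ = 2 * 2048 = 4096,
// i.e. rows 2 and 3 viewed in place without a copy. All axes before the sliced one must
// have extent 1, as enforced in newShape(), so the viewed segment is memory-consecutive.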
struct ShiftNodeOp : public UnaryNodeOp {
ShiftNodeOp(Expr a, Shape shift, float padValue)
: UnaryNodeOp(a, a->shape()), shift_(shift), padValue_(padValue) {}
NodeOps forwardOps() override {
return {NodeOp(
Shift(val_, child(0)->val(), shift_, padValue_, /*invert=*/false))};
}
NodeOps backwardOps() override {
// last parameter beta=1 says to use += (out = in + beta * out)
// @TODO: check need for padValue_
return {NodeOp(ShiftGrad(child(0)->grad(), adj_, shift_, true))};
}
const std::string type() override { return "shift"; }
virtual size_t hash() override {
if(!hash_) {
size_t seed = NaryNodeOp::hash();
for(auto i : shift_)
util::hash_combine(seed, i);
util::hash_combine(seed, padValue_);
hash_ = seed;
}
return hash_;
}
virtual bool equal(Expr node) override {
if(!NaryNodeOp::equal(node))
return false;
auto cnode = std::dynamic_pointer_cast<ShiftNodeOp>(node);
if(!cnode)
return false;
if(shift_ != cnode->shift_)
return false;
if(padValue_ != cnode->padValue_)
return false;
return true;
}
private:
friend class SerializationHelpers;
Shape shift_; // shift offsets in each dimension
float padValue_; // what value to shift in
};
struct AbsNodeOp : public UnaryNodeOp {
AbsNodeOp(Expr a) : UnaryNodeOp(a) {}
NodeOps forwardOps() override {
using namespace functional;
return {NodeOp(Element(_1 = abs(_2), val_, child(0)->val()))};
}
NodeOps backwardOps() override {
using namespace functional;
return {NodeOp(Add(sgn(_1) * _2, child(0)->grad(), child(0)->val(), adj_))};
}
const std::string type() override { return "abs"; }
};
#ifdef CUDNN
class PoolingOp : public UnaryNodeOp {
public:
PoolingOp(Expr x,
int height,
int width,
int padHeight,
int padWidth,
int strideHeight,
int strideWidth,
std::string mode)
: UnaryNodeOp(x),
pooling_(height,
width,
padHeight,
padWidth,
strideHeight,
strideWidth,
mode) {}
NodeOps forwardOps() override {
return {NodeOp(pooling_.forward(child(0)->val(), val_))};
}
NodeOps backwardOps() override {
return {NodeOp(
pooling_.backward(child(0)->val(), child(0)->grad(), val_, adj_))};
}
const std::string type() override { return "layer_pooling"; }
protected:
PoolingWrapper pooling_;
};
#endif
class PoolingWithMaskingOp : public UnaryNodeOp {
public:
PoolingWithMaskingOp(Expr x, Expr mask, int width, bool isEven = false)
: UnaryNodeOp(x), mask_(mask), width_(width), isEven_(isEven) {
auto xShape = x->shape();
int dimBatch = xShape[0];
int dimWord = xShape[1];
int cols = (isEven_) ? xShape[2] - 1 : xShape[2];
int dimSentence = (cols / width_) + (cols % width_ != 0);
shape_ = {dimBatch, dimWord, dimSentence};
}
NodeOps forwardOps() override {
return {NodeOp(PoolingWithMaskingForward(
val_, child(0)->val(), mask_->val(), width_, isEven_))};
}
NodeOps backwardOps() override {
return {NodeOp(PoolingWithMaskingBackward(adj_,
child(0)->grad(),
child(0)->val(),
mask_->val(),
width_,
isEven_))};
}
const std::string type() override { return "layer_pooling"; }
protected:
friend class SerializationHelpers;
Expr mask_;
int width_;
bool isEven_;
};
} // namespace marian
```
|
```c++
// Distributed under the Boost Software License, Version 1.0.
//
// See accompanying file LICENSE_1_0.txt or copy at
// path_to_url
#include <boost/mp11/utility.hpp>
#include <boost/core/lightweight_test_trait.hpp>
#include <type_traits>
using boost::mp11::mp_invoke;
template<class...> struct X {};
template<template<class...> class F, class... T> using Y = X<F<T>...>;
template<class Q, class... T> using Z = X<mp_invoke<Q, T>...>;
template<class T, class U> struct P {};
template<class T, class U> using first = T;
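// Background (summarizing standard Boost.Mp11 semantics): mp_quote<F> wraps the template
// F into a "quoted" metafunction with a nested template alias fn, so that
// mp_invoke<mp_quote<F>, T...> evaluates to F<T...>. The tests below exercise this for an
// identity alias, a binary class template, and an alias that discards its second argument.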
int main()
{
using boost::mp11::mp_identity_t;
using boost::mp11::mp_quote;
{
using Q = mp_quote<mp_identity_t>;
BOOST_TEST_TRAIT_TRUE((std::is_same<mp_invoke<Q, void>, void>));
BOOST_TEST_TRAIT_TRUE((std::is_same<mp_invoke<Q, int[]>, int[]>));
}
{
using Q = mp_quote<mp_identity_t>;
#if defined( BOOST_MSVC ) && BOOST_WORKAROUND( BOOST_MSVC, <= 1800 )
#else
using R1 = Y<Q::fn, void, char, int>;
BOOST_TEST_TRAIT_TRUE((std::is_same<R1, X<void, char, int>>));
#endif
#if defined( BOOST_MSVC ) && BOOST_WORKAROUND( BOOST_MSVC, < 1920 && BOOST_MSVC >= 1900 )
#else
using R2 = Z<Q, void, char, int>;
BOOST_TEST_TRAIT_TRUE((std::is_same<R2, X<void, char, int>>));
#endif
}
{
using Q = mp_quote<P>;
BOOST_TEST_TRAIT_TRUE((std::is_same<mp_invoke<Q, void, void>, P<void, void>>));
BOOST_TEST_TRAIT_TRUE((std::is_same<mp_invoke<Q, char[], int[]>, P<char[], int[]>>));
}
{
using Q = mp_quote<first>;
BOOST_TEST_TRAIT_TRUE((std::is_same<mp_invoke<Q, void, int>, void>));
BOOST_TEST_TRAIT_TRUE((std::is_same<mp_invoke<Q, char[], int[]>, char[]>));
}
return boost::report_errors();
}
```
|
```java
package com.jsh.erp.controller;
import com.alibaba.fastjson.JSONArray;
import com.alibaba.fastjson.JSONObject;
import com.jsh.erp.datasource.entities.MaterialAttribute;
import com.jsh.erp.datasource.entities.Person;
import com.jsh.erp.service.materialAttribute.MaterialAttributeService;
import com.jsh.erp.utils.BaseResponseInfo;
import io.swagger.annotations.Api;
import io.swagger.annotations.ApiOperation;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;
import javax.annotation.Resource;
import javax.servlet.http.HttpServletRequest;
import java.util.List;
/**
* @author ji sheng hua jshERP
*/
@RestController
@RequestMapping(value = "/materialAttribute")
@Api(tags = {""})
public class MaterialAttributeController {
private Logger logger = LoggerFactory.getLogger(MaterialAttributeController.class);
@Resource
private MaterialAttributeService materialAttributeService;
/**
* Get the list of material attribute names (as value/name pairs for selection)
* @param request
* @return
*/
@GetMapping(value = "/getNameList")
@ApiOperation(value = "")
public JSONArray getNameList(HttpServletRequest request)throws Exception {
JSONArray dataArray = new JSONArray();
try {
List<MaterialAttribute> materialAttributeList = materialAttributeService.getMaterialAttribute();
if (null != materialAttributeList) {
for (MaterialAttribute materialAttribute : materialAttributeList) {
JSONObject item = new JSONObject();
item.put("value", materialAttribute.getId().toString());
item.put("name", materialAttribute.getAttributeName());
dataArray.add(item);
}
}
} catch(Exception e){
logger.error(e.getMessage(), e);
}
return dataArray;
}
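// Example response shape (illustrative, not from the original source):
// [ { "value": "1", "name": "color" }, { "value": "2", "name": "size" } ]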
/**
* Get the attribute value list by attribute id
* @param request
* @return
*/
@GetMapping(value = "/getValueListById")
@ApiOperation(value = "id")
public JSONArray getValueListById(@RequestParam("id") Long id,
HttpServletRequest request)throws Exception {
JSONArray dataArray = new JSONArray();
try {
dataArray = materialAttributeService.getValueArrById(id);
} catch(Exception e){
logger.error(e.getMessage(), e);
}
return dataArray;
}
}
```
|
```csharp
// ***********************************************************************
//
// Permission is hereby granted, free of charge, to any person obtaining
// a copy of this software and associated documentation files (the
// "Software"), to deal in the Software without restriction, including
// without limitation the rights to use, copy, modify, merge, publish,
// distribute, sublicense, and/or sell copies of the Software, and to
// permit persons to whom the Software is furnished to do so, subject to
// the following conditions:
//
// The above copyright notice and this permission notice shall be
// included in all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
// EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
// NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
// LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
// OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
// WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
// ***********************************************************************
using System;
using NUnit.Framework.Api;
namespace NUnit.Framework.Internal.Filters
{
/// <summary>
/// NotFilter negates the operation of another filter
/// </summary>
[Serializable]
public class NotFilter : TestFilter
{
ITestFilter baseFilter;
bool topLevel = false;
/// <summary>
/// Construct a not filter on another filter
/// </summary>
/// <param name="baseFilter">The filter to be negated</param>
public NotFilter( ITestFilter baseFilter)
{
this.baseFilter = baseFilter;
}
/// <summary>
/// Indicates whether this is a top-level NotFilter,
/// requiring special handling of Explicit
/// </summary>
public bool TopLevel
{
get { return topLevel; }
set { topLevel = value; }
}
/// <summary>
/// Gets the base filter
/// </summary>
public ITestFilter BaseFilter
{
get { return baseFilter; }
}
/// <summary>
/// Check whether the filter matches a test
/// </summary>
/// <param name="test">The test to be matched</param>
/// <returns>True if it matches, otherwise false</returns>
public override bool Match( ITest test )
{
if (topLevel && test.RunState == RunState.Explicit)
return false;
return !baseFilter.Pass( test );
}
/// <summary>
/// Determine whether any descendant of the test matches the filter criteria.
/// </summary>
/// <param name="test">The test to be matched</param>
/// <returns>True if at least one descendant matches the filter criteria</returns>
protected override bool MatchDescendant(ITest test)
{
if (!test.HasChildren || test.Tests == null || topLevel && test.RunState == RunState.Explicit)
return false;
foreach (ITest child in test.Tests)
{
if (Match(child) || MatchDescendant(child))
return true;
}
return false;
}
}
}
```
|
Michael Burrows is an Australian singer and songwriter from Melbourne, Victoria.
Burrows' demos were first discovered by Neil Finn (Crowded House and Fleetwood Mac) through Medicine Mondiale, an Auckland-based social enterprise. Soon after, Finn recorded the vocals for two of Burrows' songs at Roundhead Studios in Auckland, New Zealand.
In 2015, Burrows was handpicked by Martha Wainwright to open for her sold-out acoustic tour in Australia, despite not having any music officially released.
From there, Burrows began working with Grammy Award-winning songwriter Frank Myers, fellow Grammy Award winner Steve Marcantonio, and Jimmy Nichols, former musical director for Faith Hill and Reba McEntire. With them, Burrows recorded several tracks at the legendary Ocean Way Studios in Nashville and released his debut single "Please Don't Cry" on 27 April 2018. The song peaked at No. 17 on the MPE Rock Stream Charts and No. 11 on the Download Charts. In September 2018, Burrows released his second single "Turn This Love Around". The song debuted at No. 39 on Australia's iTunes Store, No. 2 on the iTunes Rock Chart, and No. 14 on the Australian Independent Record Labels Association's singles chart.
In January 2019, "Turn This Love Around" made its U.S. premiere on Atwood Magazine and spent 10 weeks on Billboard's Adult Contemporary chart, peaking at No. 24. The single's success carried over to his follow-up single "Brightest Star", which peaked on the same chart at No. 25. Both singles appeared on Burrows' debut EP Turn This Love Around, released in June, and both garnered Grammy consideration for 2019. In 2020, Burrows released the single "Brand New Heartache", co-written with Lior Attar and Simon Starr, performing it live as soon as COVID restrictions were lifted. The song received extensive radio play in Australia, also crossing over to country radio across all of the major networks.
Burrows' music blends folk rock, Americana and pop, and has been compared to some of his greatest influences, including The Beatles, Eagles, Wilco, America, and Dawes.
Discography

Singles
"Please Don't Cry" (2018)
"Turn This Love Around" (2018)
"Brightest Star" (2019)

EPs
Turn This Love Around (2019)
Australian male singer-songwriters
Australian singer-songwriters
Living people
Year of birth missing (living people)
Musicians from Melbourne
|