Sengoku Basara (戦国BASARA) is a series of video games developed and published by Capcom, and a larger media franchise based on it, including three anime series, an anime film, a live-action show, and numerous drama CDs, light novels, manga, and stage plays. Its story is loosely based on real events of the titular Sengoku period in the history of feudal Japan. Sengoku Basara was popular in Japan when the games released: the series won multiple awards, became a cultural phenomenon, has been cited as an example of games as art, and gained a passionate fanbase.
While Sengoku Basara was mainly popular in Japan, it also gained some popularity elsewhere in Asia. Though considered niche outside of Asia, it maintains a small following in other countries. The franchise began with the first Sengoku Basara video game, released in Japan on July 21, 2005, for the PlayStation 2.
Sengoku Basara's creator and producer was Hiroyuki Kobayashi, and its director was Makoto Yamamoto; both worked on every console and handheld game in the series except Sengoku Basara Sanada Yukimura-den. Sengoku Basara serves as one of Capcom's flagship series in Japan. As of June 30, 2023, the game series had sold 4.1 million copies worldwide.
Games
Console and handheld games
Sengoku Basara (Devil Kings)
Sengoku Basara (戦国BASARA) is the first game in the series, released in Japan on July 21, 2005, for the PS2 as a hack-and-slash action game. Devil Kings, an English-language version of the game, featured altered gameplay and a completely different dark fantasy story with original characters, supposedly oriented toward a Western audience. The Devil Kings branding was never used again due to the negative response the localization received from critics, fans, and players.
Sengoku Basara 2
Sengoku Basara 2 (戦国BASARA2) is the sequel to the original Sengoku Basara, released in Japan for the PS2 on July 27, 2006. The game was ported to the Wii in 2007 as part of the Sengoku Basara 2 Heroes: Double Pack, and an expansion, Sengoku Basara 2 Heroes, was released the same year. The game marks the series' first anniversary and is credited with starting the "Sengoku Boom" throughout Japan: a renewed interest in Japanese history (mostly the Sengoku period of feudal Japan, hence the name) in which people visited museums, castles, and battlefields to learn about the real-life history behind the various Sengoku Basara characters, bought merchandise related to the game series and the Sengoku period, and bought other video games set during the Sengoku period.
Sengoku Basara 2 Heroes
Sengoku Basara 2 Heroes (戦国BASARA2 英雄外伝 HEROES) is an expansion to Sengoku Basara 2 and the first expansion in the series. The game was released in Asia for the PS2 and Wii (Japan only) on November 29, 2007; the Wii version includes Sengoku Basara 2 as part of the Sengoku Basara 2 Heroes: Double Pack. Characters that were unplayable in Sengoku Basara 2 are playable in Sengoku Basara 2 Heroes. The game also sparked a major boom in tourism to Shiroishi City, the hometown of Katakura Kojūrō.
Sengoku Basara X
Sengoku Basara X (戦国BASARA X) is a 2D fighting game developed by Arc System Works, creators of the Guilty Gear and BlazBlue series. It was released in Japanese arcades on April 9, 2008, and ported to the PS2 in Asia on June 26 of the same year.
Sengoku Basara Battle Heroes
Sengoku Basara Battle Heroes (戦国BASARA バトルヒーローズ) is a PSP-exclusive beat 'em up action game developed by Access Games, released in Japan on April 9, 2009. The game is a spin-off of the series.
Sengoku Basara 3 (Sengoku Basara: Samurai Heroes)
Sengoku Basara 3 (戦国BASARA3) is the third game in the main series and the sequel to Sengoku Basara 2, released in Asia on July 29, 2010, for the PS3 and Wii (Japan only). It is the first game in the series since the original to be localized outside of Asia, released in North America on October 12, 2010, in Australia on October 14, 2010, and in Europe on October 15, 2010. An expansion, Sengoku Basara 3 Utage, was released in 2011. The game marks the series' fifth anniversary and currently ranks as the best-selling Sengoku Basara game in the series (a distinction previously held by Sengoku Basara 2 Heroes).
Sengoku Basara Chronicle Heroes
Sengoku Basara Chronicle Heroes (戦国BASARA クロニクルヒーローズ) was released for the PSP in Asia on July 21, 2011, and serves as a sequel to Sengoku Basara Battle Heroes.
Sengoku Basara 3 Utage
Sengoku Basara 3 Utage (戦国BASARA3 宴) was released for the PS3 and Wii (Japan only) in Asia on November 10, 2011, as an expansion to Sengoku Basara 3. Characters that were unplayable in Sengoku Basara 3 are playable in Sengoku Basara 3 Utage. "Utage" is Japanese for "party".
Sengoku Basara HD Collection
Sengoku Basara HD Collection (戦国BASARA HDコレクション) was released for the PS3 in Asia on August 30, 2012. It includes Sengoku Basara, Sengoku Basara 2, and Sengoku Basara 2 Heroes in 720p HD on one disc.
Sengoku Basara 4
Sengoku Basara 4 (戦国BASARA4) is the fourth game in the main series, released in Asia on January 23, 2014, for the PS3. The game serves both as a sequel to Sengoku Basara 3 and as a soft reboot of the series, so new fans can enjoy it without having to play previous games to understand the full story. It is also the first console game in the series to receive a collector's edition, digital release, DLC, and updates, a practice every subsequent game followed. An expanded version, Sengoku Basara 4 Sumeragi, was released in 2015.
Sengoku Basara 4 Sumeragi
Sengoku Basara 4 Sumeragi (戦国BASARA4 皇) was released for the PS3 and PS4 in Asia on July 23, 2015, and was the first game Capcom developed for the PS4. It contains all of the content of Sengoku Basara 4 along with new content, and marks the series' tenth anniversary. Characters that were unplayable in Sengoku Basara 4 are playable in Sengoku Basara 4 Sumeragi. An "Anniversary Edition" containing all of the DLC was released for the PS4 in Japan on July 21, 2020. "Sumeragi" is Japanese for "emperor".
Sengoku Basara Sanada Yukimura-den
Sengoku Basara Sanada Yukimura-den (戦国BASARA 真田幸村伝) is a spin-off game focusing on the life of one of the series' main protagonists, Sanada Yukimura, released in Asia for the PS3 (Japan only) and PS4 on August 25, 2016. The game is more historically accurate than previous entries but retains a few of the series' original elements, such as the rivalry between Date Masamune and Sanada Yukimura. It is the first and only console game in the series made without the involvement of Hiroyuki Kobayashi and Makoto Yamamoto, and it currently ranks as the worst-selling Sengoku Basara game in the series (a distinction previously held by Sengoku Basara HD Collection). "Sanada Yukimura-den" is Japanese for "Legend of Sanada Yukimura".
Mobile games
Sengoku Basara Battle Party
Sengoku Basara Battle Party (戦国BASARA バトルパーティー) was a free-to-play mobile gacha RPG for Android and iOS based on the Sengoku Basara franchise, available through Google Play and the App Store. Capcom announced the game on May 14, 2019, and released it in Japan on June 24, 2019. Capcom uploaded two trailers for the game to YouTube, on May 15 and July 1, 2019, and a series of live streams on July 6, August 29, September 26, November 13, and December 23, 2019, and January 30 and June 24, 2020. A Capcom collaboration between Sengoku Basara and Monster Hunter, titled "Sengoku Basara Battle Party X Monster Hunter: World -Collaboration-", ran from November 28 to December 26, 2019, with a trailer uploaded to YouTube on November 28, 2019. A second collaboration, between Sengoku Basara and Devil May Cry, titled "Sengoku Basara Battle Party X Devil May Cry 4 -Collaboration-", ran from January 14 to February 13, 2020, with a trailer uploaded on January 14, 2020. The game was shut down on December 21, 2020, due to "external issues" cited by Capcom's management; it can still be downloaded but can no longer be played.
Adaptations
The Sengoku Basara franchise has had several different forms of media.
Notably, an anime series written by Yasuyuki Muto was produced. The first anime, Sengoku Basara, started broadcasting in Japan on April 2, 2009. Its sequel, Sengoku Basara II, began broadcast in Japan on July 11, 2010, and a film finale, Sengoku Basara -The Last Party-, was released in Japanese theaters on June 4, 2011. All three anime adaptations were licensed and published in the United States in 2012 by Funimation under the titles Sengoku Basara: Samurai Kings, Sengoku Basara: Samurai Kings 2, and Sengoku Basara: Samurai Kings -The Last Party-. An anime based on Sengoku Basara 3, Sengoku Basara Judge End, began broadcast in Japan on July 6, 2014, and was licensed and published in the United States in 2016 by Funimation as Sengoku Basara: End of Judgement. Another anime, Gakuen Basara, based on the Gakuen Basara manga series, started broadcasting in Japan on October 4, 2018.
Several manga adaptations of the series have been serialized in manga magazines and later released in tankōbon format in Japan. Kairi Shimotsuki created the first manga adaptation, based on the first game, titled Sengoku Basara Ranse Ranbu and released as a three-volume series in 2006. A manga adaptation of the second game was created by Yak Haibara: the four-volume series Sengoku Basara 2 was published in Japan from 2007 to 2009, and in the United States from 2012 to 2013 by UDON under the title Sengoku Basara: Samurai Legends, a change made because the Sengoku Basara 2 video game was not released in the United States.
Radio shows have been produced, with the first series released on four CD volumes. A stage play based on Sengoku Basara 3 was announced in Japan on July 17, 2011, ran from October 14 to October 30 of that year, and was released on DVD in Japan on February 23, 2012. The stage play received "universal acclaim" in Japan and is considered the beginning of the Sengoku Basara stage play series' huge success and popularity in Japan, with later stage plays becoming even more successful and popular. There have been a total of 17 stage plays in the series as of 2019, with Capcom producing one or two per year (one each in 2009, 2010, 2011, 2015, and 2019, and two each in 2012, 2013, 2014, 2016, 2017, and 2018).
In 2012, the Takarazuka Revue announced that its Flower Troupe would perform a Sengoku Basara musical. Ranju Tomu and Ranno Hana starred, and Asumi Rio and Nozomi Futo also featured in the adaptation, which played at the Tokyu Theater Orb from June 15 to July 1, 2013. The staging of the musical was much more lavish than that of the stage plays, with far more special effects and less emphasis on action and stunts. Reviews were "generally favorable". As of 2019, it remains the only Sengoku Basara musical to have been performed.
A live-action television drama titled Sengoku Basara Moonlight Party began broadcasting in Japan on July 12, 2012, on the Mainichi Broadcasting System.
In August 2015, Capcom produced a collaborative stage play with Sengoku Basara and Devil May Cry titled "Sengoku Basara VS Devil May Cry". In the play, Dante, Lady, Trish, and Vergil come across some mysterious historical ruins while chasing after a demon, and are sent back in time to Japan's Warring States (Sengoku) period. There, the group meets Date Masamune, Sanada Yukimura, and other characters from the Sengoku Basara franchise. The play ran at the AiiA 2.5 Theater Tokyo for 18 performances from August 20–30. Masanari Ujigawa directed and composed the stage play with Hideaki Itsuno and Izaki Matsuno collaborating on the scenario. Kazushi Miyakoda and Tetsuya Yamaura produced the play with Hiroyuki Kobayashi and Makoto Yamamoto as supervisors.
Related products
A large range of merchandise has been created for the series in Japan, including books, CD soundtracks, drama CDs, figures, magazines, radio CDs, and trading cards.
Cards based on Sengoku Basara were included in Capcom's free-to-play digital collectible card game, Teppen, on October 1, 2020, through an expansion released worldwide titled, "The Tale of Amatsu no Kuni". Sengoku Basara character Oichi was included as a playable hero along with more Sengoku Basara cards in another expansion for Teppen that released worldwide on January 5, 2021. The expansion is titled, "The Battle of Amatsu no Kuni."
References
External links
Official Sengoku BASARA 3 Stage Play website
Capcom franchises
Video games adapted into television shows
Video game franchises
Video games about samurai
Video games set in feudal Japan
|
```go
// Use of this source code is governed by a BSD-style
// license that can be found in the LICENSE file.

// +build linux

package ipv6

import (
	"unsafe"

	"golang.org/x/net/bpf"
	"golang.org/x/net/internal/socket"
)

func (so *sockOpt) setAttachFilter(c *socket.Conn, f []bpf.RawInstruction) error {
	prog := sockFProg{
		Len:    uint16(len(f)),
		Filter: (*sockFilter)(unsafe.Pointer(&f[0])),
	}
	b := (*[sizeofSockFprog]byte)(unsafe.Pointer(&prog))[:sizeofSockFprog]
	return so.Set(c, b)
}
```
|
```html
<html lang="en">
<head>
<title>iswprint - Untitled</title>
<meta http-equiv="Content-Type" content="text/html">
<meta name="description" content="Untitled">
<meta name="generator" content="makeinfo 4.11">
<link title="Top" rel="start" href="index.html#Top">
<link rel="up" href="Ctype.html#Ctype" title="Ctype">
<link rel="prev" href="iswlower.html#iswlower" title="iswlower">
<link rel="next" href="iswpunct.html#iswpunct" title="iswpunct">
<link href="path_to_url" rel="generator-home" title="Texinfo Homepage">
<meta http-equiv="Content-Style-Type" content="text/css">
<style type="text/css"><!--
pre.display { font-family:inherit }
pre.format { font-family:inherit }
pre.smalldisplay { font-family:inherit; font-size:smaller }
pre.smallformat { font-family:inherit; font-size:smaller }
pre.smallexample { font-size:smaller }
pre.smalllisp { font-size:smaller }
span.sc { font-variant:small-caps }
span.roman { font-family:serif; font-weight:normal; }
span.sansserif { font-family:sans-serif; font-weight:normal; }
--></style>
</head>
<body>
<div class="node">
<p>
<a name="iswprint"></a>
Next: <a rel="next" accesskey="n" href="iswpunct.html#iswpunct">iswpunct</a>,
Previous: <a rel="previous" accesskey="p" href="iswlower.html#iswlower">iswlower</a>,
Up: <a rel="up" accesskey="u" href="Ctype.html#Ctype">Ctype</a>
<hr>
</div>
<h3 class="section">3.23 <code>iswprint</code>—printable wide character test</h3>
<p><a name="index-iswprint-140"></a><strong>Synopsis</strong>
<pre class="example">        #include &lt;wctype.h&gt;
int iswprint(wint_t <var>c</var>);
</pre>
<p><strong>Description</strong><br>
<code>iswprint</code> is a function which classifies wide-character values that
are printable.
<pre class="sp">
</pre>
<strong>Returns</strong><br>
<code>iswprint</code> returns non-zero if <var>c</var> is a printable wide character.
<pre class="sp">
</pre>
<strong>Portability</strong><br>
<code>iswprint</code> is C99.
<p>No supporting OS subroutines are required.
<pre class="sp">
</pre>
</body></html>
```
|
```cmake
board_runner_args(jlink "--device=nRF51822_xxAC" "--speed=4000")
include(${ZEPHYR_BASE}/boards/common/nrfjprog.board.cmake)
include(${ZEPHYR_BASE}/boards/common/jlink.board.cmake)
```
|
Tama is a village in the administrative district of Gmina Ruda Maleniecka, within Końskie County, Świętokrzyskie Voivodeship, in south-central Poland. It lies approximately north-west of Ruda Maleniecka, west of Końskie, and north-west of the regional capital Kielce.
References
Villages in Końskie County
|
```javascript
/**
* @license Apache-2.0
*
*
*
* path_to_url
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
*/
'use strict';
// MODULES //
var resolve = require( 'path' ).resolve;
var exec = require( 'child_process' ).exec;
var tape = require( 'tape' );
var IS_BROWSER = require( '@stdlib/assert/is-browser' );
var IS_WINDOWS = require( '@stdlib/assert/is-windows' );
var EXEC_PATH = require( '@stdlib/process/exec-path' );
var readFileSync = require( '@stdlib/fs/read-file' ).sync;
// VARIABLES //
var fpath = resolve( __dirname, '..', 'bin', 'cli' );
var opts = {
'skip': IS_BROWSER || IS_WINDOWS
};
// FIXTURES //
var PKG_VERSION = require( './../package.json' ).version;
// TESTS //
tape( 'command-line interface', function test( t ) {
t.ok( true, __filename );
t.end();
});
tape( 'when invoked with a `--help` flag, the command-line interface prints the help text to `stderr`', opts, function test( t ) {
var expected;
var cmd;
expected = readFileSync( resolve( __dirname, '..', 'docs', 'usage.txt' ), {
'encoding': 'utf8'
});
cmd = [
EXEC_PATH,
fpath,
'--help'
];
exec( cmd.join( ' ' ), done );
function done( error, stdout, stderr ) {
if ( error ) {
t.fail( error.message );
} else {
t.strictEqual( stdout.toString(), '', 'does not print to `stdout`' );
t.strictEqual( stderr.toString(), expected+'\n', 'expected value' );
}
t.end();
}
});
tape( 'when invoked with a `-h` flag, the command-line interface prints the help text to `stderr`', opts, function test( t ) {
var expected;
var cmd;
expected = readFileSync( resolve( __dirname, '..', 'docs', 'usage.txt' ), {
'encoding': 'utf8'
});
cmd = [
EXEC_PATH,
fpath,
'-h'
];
exec( cmd.join( ' ' ), done );
function done( error, stdout, stderr ) {
if ( error ) {
t.fail( error.message );
} else {
t.strictEqual( stdout.toString(), '', 'does not print to `stdout`' );
t.strictEqual( stderr.toString(), expected+'\n', 'expected value' );
}
t.end();
}
});
tape( 'when invoked with a `--version` flag, the command-line interface prints the version to `stderr`', opts, function test( t ) {
var cmd = [
EXEC_PATH,
fpath,
'--version'
];
exec( cmd.join( ' ' ), done );
function done( error, stdout, stderr ) {
if ( error ) {
t.fail( error.message );
} else {
t.strictEqual( stdout.toString(), '', 'does not print to `stdout`' );
t.strictEqual( stderr.toString(), PKG_VERSION+'\n', 'expected value' );
}
t.end();
}
});
tape( 'when invoked with a `-V` flag, the command-line interface prints the version to `stderr`', opts, function test( t ) {
var cmd = [
EXEC_PATH,
fpath,
'-V'
];
exec( cmd.join( ' ' ), done );
function done( error, stdout, stderr ) {
if ( error ) {
t.fail( error.message );
} else {
t.strictEqual( stdout.toString(), '', 'does not print to `stdout`' );
t.strictEqual( stderr.toString(), PKG_VERSION+'\n', 'expected value' );
}
t.end();
}
});
tape( 'the command-line interface prints either `true` or `false` to `stdout` indicating whether an environment provides native `Map` support', opts, function test( t ) {
var cmd = [
EXEC_PATH,
fpath
];
exec( cmd.join( ' ' ), done );
function done( error, stdout, stderr ) {
var str;
if ( error ) {
t.fail( error.message );
} else {
str = stdout.toString();
t.strictEqual( str === 'true\n' || str === 'false\n', true, 'prints either `true` or `false` to `stdout`' );
t.strictEqual( stderr.toString(), '', 'does not print to `stderr`' );
}
t.end();
}
});
```
|
Administratively, Lesotho is divided into ten districts, each headed by a district administrator. Each district has a capital known as a camptown.
The districts are further subdivided into 80 constituencies, which consist of 129 local community councils.
References
Lesotho
|
Nodar Khizanishvili (born 31 January 1953 in Batumi, Adjar ASSR) is a retired Soviet football player of Georgian ethnicity. He is the father of Zurab Khizanishvili.
Honours
Soviet Top League winner: 1978.
Soviet Cup winner: 1976, 1979.
UEFA Cup Winners' Cup winner: 1981.
International career
He played his only game for USSR on 14 April 1982 in a friendly against Argentina, coming on for the final eight minutes in place of Leonid Buryak.
References
External links
Profile
1953 births
Living people
Sportspeople from Batumi
Men's footballers from Georgia (country)
Soviet men's footballers
Soviet Union men's international footballers
FC Dinamo Tbilisi players
FC Dinamo Batumi players
FC Torpedo Kutaisi players
Soviet Top League players
Men's association football defenders
|
```java
package com.netty.rpc.server.registry;
import com.netty.rpc.config.Constant;
import com.netty.rpc.protocol.RpcProtocol;
import com.netty.rpc.protocol.RpcServiceInfo;
import com.netty.rpc.util.ServiceUtil;
import com.netty.rpc.zookeeper.CuratorClient;
import org.apache.curator.framework.CuratorFramework;
import org.apache.curator.framework.state.ConnectionState;
import org.apache.curator.framework.state.ConnectionStateListener;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
/**
*
*
* @author luxiaoxun
*/
public class ServiceRegistry {
private static final Logger logger = LoggerFactory.getLogger(ServiceRegistry.class);
private CuratorClient curatorClient;
private List<String> pathList = new ArrayList<>();
public ServiceRegistry(String registryAddress) {
this.curatorClient = new CuratorClient(registryAddress, 5000);
}
public void registerService(String host, int port, Map<String, Object> serviceMap) {
// Register service info
List<RpcServiceInfo> serviceInfoList = new ArrayList<>();
for (String key : serviceMap.keySet()) {
String[] serviceInfo = key.split(ServiceUtil.SERVICE_CONCAT_TOKEN);
if (serviceInfo.length > 0) {
RpcServiceInfo rpcServiceInfo = new RpcServiceInfo();
rpcServiceInfo.setServiceName(serviceInfo[0]);
if (serviceInfo.length == 2) {
rpcServiceInfo.setVersion(serviceInfo[1]);
} else {
rpcServiceInfo.setVersion("");
}
logger.info("Register new service: {} ", key);
serviceInfoList.add(rpcServiceInfo);
} else {
logger.warn("Can not get service name and version: {} ", key);
}
}
try {
RpcProtocol rpcProtocol = new RpcProtocol();
rpcProtocol.setHost(host);
rpcProtocol.setPort(port);
rpcProtocol.setServiceInfoList(serviceInfoList);
String serviceData = rpcProtocol.toJson();
byte[] bytes = serviceData.getBytes();
String path = Constant.ZK_DATA_PATH + "-" + rpcProtocol.hashCode();
path = this.curatorClient.createPathData(path, bytes);
pathList.add(path);
logger.info("Register {} new service, host: {}, port: {}", serviceInfoList.size(), host, port);
} catch (Exception e) {
logger.error("Register service fail, exception: {}", e.getMessage());
}
curatorClient.addConnectionStateListener(new ConnectionStateListener() {
@Override
public void stateChanged(CuratorFramework curatorFramework, ConnectionState connectionState) {
if (connectionState == ConnectionState.RECONNECTED) {
logger.info("Connection state: {}, register service after reconnected", connectionState);
registerService(host, port, serviceMap);
}
}
});
}
public void unregisterService() {
logger.info("Unregister all service");
for (String path : pathList) {
try {
this.curatorClient.deletePath(path);
} catch (Exception ex) {
logger.error("Delete service path error: " + ex.getMessage());
}
}
this.curatorClient.close();
}
}
```
|
Barry T. Hynes is an American politician who served as a Boston City Councilor and as Boston's City Clerk.
Hynes grew up in Boston's Dorchester neighborhood. His father, John Hynes, was Mayor of Boston from 1950 to 1960. Hynes graduated from Boston College High School and the University of Notre Dame.
Hynes served on the Boston City Council from 1964 to 1967. He was president of the Council in 1967. Hynes was a candidate for Mayor of Boston in 1967, but dropped out one month before the preliminary election. After leaving the council, he worked as an aide to Mayor Kevin White. In 1978, Hynes was named Boston City Clerk. He resigned in 1983 to work full-time at his travel agency.
Outside politics Hynes worked as a stockbroker, real estate broker, and novelist.
School Initiatives
Hynes is founder of Nativity Preparatory School in Boston and Nativity Preparatory School of New Bedford, as well as co-founder of Paraclete in South Boston with Sister Ann Fox.
References
Boston city clerks
Presidents of the Boston City Council
Year of birth missing (living people)
University of Notre Dame alumni
Living people
Boston College High School alumni
|
```cpp
//===- llvm/ADT/CoalescingBitVector.h - A coalescing bitvector --*- C++ -*-===//
//
// See path_to_url for license information.
//
//===your_sha256_hash------===//
///
/// \file
/// A bitvector that uses an IntervalMap to coalesce adjacent elements
/// into intervals.
///
//===your_sha256_hash------===//
#ifndef LLVM_ADT_COALESCINGBITVECTOR_H
#define LLVM_ADT_COALESCINGBITVECTOR_H
#include "llvm/ADT/IntervalMap.h"
#include "llvm/ADT/STLExtras.h"
#include "llvm/ADT/SmallVector.h"
#include "llvm/ADT/iterator_range.h"
#include "llvm/Support/Debug.h"
#include "llvm/Support/raw_ostream.h"
#include <initializer_list>
namespace llvm {
/// A bitvector that, under the hood, relies on an IntervalMap to coalesce
/// elements into intervals. Good for representing sets which predominantly
/// contain contiguous ranges. Bad for representing sets with lots of gaps
/// between elements.
///
/// Compared to SparseBitVector, CoalescingBitVector offers more predictable
/// performance for non-sequential find() operations.
///
/// \tparam IndexT - The type of the index into the bitvector.
template <typename IndexT> class CoalescingBitVector {
static_assert(std::is_unsigned<IndexT>::value,
"Index must be an unsigned integer.");
using ThisT = CoalescingBitVector<IndexT>;
/// An interval map for closed integer ranges. The mapped values are unused.
using MapT = IntervalMap<IndexT, char>;
using UnderlyingIterator = typename MapT::const_iterator;
using IntervalT = std::pair<IndexT, IndexT>;
public:
using Allocator = typename MapT::Allocator;
/// Construct by passing in a CoalescingBitVector<IndexT>::Allocator
/// reference.
CoalescingBitVector(Allocator &Alloc)
: Alloc(&Alloc), Intervals(Alloc) {}
/// \name Copy/move constructors and assignment operators.
/// @{
CoalescingBitVector(const ThisT &Other)
: Alloc(Other.Alloc), Intervals(*Other.Alloc) {
set(Other);
}
ThisT &operator=(const ThisT &Other) {
clear();
set(Other);
return *this;
}
CoalescingBitVector(ThisT &&Other) = delete;
ThisT &operator=(ThisT &&Other) = delete;
/// @}
/// Clear all the bits.
void clear() { Intervals.clear(); }
/// Check whether no bits are set.
bool empty() const { return Intervals.empty(); }
/// Count the number of set bits.
unsigned count() const {
unsigned Bits = 0;
for (auto It = Intervals.begin(), End = Intervals.end(); It != End; ++It)
Bits += 1 + It.stop() - It.start();
return Bits;
}
/// Set the bit at \p Index.
///
/// This method does /not/ support setting a bit that has already been set,
/// for efficiency reasons. If possible, restructure your code to not set the
/// same bit multiple times, or use \ref test_and_set.
void set(IndexT Index) {
assert(!test(Index) && "Setting already-set bits not supported/efficient, "
"IntervalMap will assert");
insert(Index, Index);
}
/// Set the bits set in \p Other.
///
/// This method does /not/ support setting already-set bits, see \ref set
/// for the rationale. For a safe set union operation, use \ref operator|=.
void set(const ThisT &Other) {
for (auto It = Other.Intervals.begin(), End = Other.Intervals.end();
It != End; ++It)
insert(It.start(), It.stop());
}
/// Set the bits at \p Indices. Used for testing, primarily.
void set(std::initializer_list<IndexT> Indices) {
for (IndexT Index : Indices)
set(Index);
}
/// Check whether the bit at \p Index is set.
bool test(IndexT Index) const {
const auto It = Intervals.find(Index);
if (It == Intervals.end())
return false;
assert(It.stop() >= Index && "Interval must end after Index");
return It.start() <= Index;
}
/// Set the bit at \p Index. Supports setting an already-set bit.
void test_and_set(IndexT Index) {
if (!test(Index))
set(Index);
}
/// Reset the bit at \p Index. Supports resetting an already-unset bit.
void reset(IndexT Index) {
auto It = Intervals.find(Index);
if (It == Intervals.end())
return;
// Split the interval containing Index into up to two parts: one from
// [Start, Index-1] and another from [Index+1, Stop]. If Index is equal to
// either Start or Stop, we create one new interval. If Index is equal to
// both Start and Stop, we simply erase the existing interval.
IndexT Start = It.start();
if (Index < Start)
// The index was not set.
return;
IndexT Stop = It.stop();
assert(Index <= Stop && "Wrong interval for index");
It.erase();
if (Start < Index)
insert(Start, Index - 1);
if (Index < Stop)
insert(Index + 1, Stop);
}
/// Set union. If \p RHS is guaranteed to not overlap with this, \ref set may
/// be a faster alternative.
void operator|=(const ThisT &RHS) {
// Get the overlaps between the two interval maps.
SmallVector<IntervalT, 8> Overlaps;
getOverlaps(RHS, Overlaps);
// Insert the non-overlapping parts of all the intervals from RHS.
for (auto It = RHS.Intervals.begin(), End = RHS.Intervals.end();
It != End; ++It) {
IndexT Start = It.start();
IndexT Stop = It.stop();
SmallVector<IntervalT, 8> NonOverlappingParts;
getNonOverlappingParts(Start, Stop, Overlaps, NonOverlappingParts);
for (IntervalT AdditivePortion : NonOverlappingParts)
insert(AdditivePortion.first, AdditivePortion.second);
}
}
/// Set intersection.
void operator&=(const ThisT &RHS) {
// Get the overlaps between the two interval maps (i.e. the intersection).
SmallVector<IntervalT, 8> Overlaps;
getOverlaps(RHS, Overlaps);
// Rebuild the interval map, including only the overlaps.
clear();
for (IntervalT Overlap : Overlaps)
insert(Overlap.first, Overlap.second);
}
/// Reset all bits present in \p Other.
void intersectWithComplement(const ThisT &Other) {
SmallVector<IntervalT, 8> Overlaps;
if (!getOverlaps(Other, Overlaps)) {
// If there is no overlap with Other, the intersection is empty.
return;
}
// Delete the overlapping intervals. Split up intervals that only partially
// intersect an overlap.
for (IntervalT Overlap : Overlaps) {
IndexT OlapStart, OlapStop;
std::tie(OlapStart, OlapStop) = Overlap;
auto It = Intervals.find(OlapStart);
IndexT CurrStart = It.start();
IndexT CurrStop = It.stop();
assert(CurrStart <= OlapStart && OlapStop <= CurrStop &&
"Expected some intersection!");
// Split the overlap interval into up to two parts: one from [CurrStart,
// OlapStart-1] and another from [OlapStop+1, CurrStop]. If OlapStart is
// equal to CurrStart, the first split interval is unnecessary. Ditto for
// when OlapStop is equal to CurrStop, we omit the second split interval.
It.erase();
if (CurrStart < OlapStart)
insert(CurrStart, OlapStart - 1);
if (OlapStop < CurrStop)
insert(OlapStop + 1, CurrStop);
}
}
bool operator==(const ThisT &RHS) const {
// We cannot just use std::equal because it checks the dereferenced values
// of an iterator pair for equality, not the iterators themselves. In our
// case that results in comparison of the (unused) IntervalMap values.
auto ItL = Intervals.begin();
auto ItR = RHS.Intervals.begin();
while (ItL != Intervals.end() && ItR != RHS.Intervals.end() &&
ItL.start() == ItR.start() && ItL.stop() == ItR.stop()) {
++ItL;
++ItR;
}
return ItL == Intervals.end() && ItR == RHS.Intervals.end();
}
bool operator!=(const ThisT &RHS) const { return !operator==(RHS); }
class const_iterator {
friend class CoalescingBitVector;
public:
using iterator_category = std::forward_iterator_tag;
using value_type = IndexT;
using difference_type = std::ptrdiff_t;
using pointer = value_type *;
using reference = value_type &;
private:
// For performance reasons, make the offset at the end different than the
// one used in \ref begin, to optimize the common `It == end()` pattern.
static constexpr unsigned kIteratorAtTheEndOffset = ~0u;
UnderlyingIterator MapIterator;
unsigned OffsetIntoMapIterator = 0;
// Querying the start/stop of an IntervalMap iterator can be very expensive.
// Cache these values for performance reasons.
IndexT CachedStart = IndexT();
IndexT CachedStop = IndexT();
void setToEnd() {
OffsetIntoMapIterator = kIteratorAtTheEndOffset;
CachedStart = IndexT();
CachedStop = IndexT();
}
/// MapIterator has just changed, reset the cached state to point to the
/// start of the new underlying iterator.
void resetCache() {
if (MapIterator.valid()) {
OffsetIntoMapIterator = 0;
CachedStart = MapIterator.start();
CachedStop = MapIterator.stop();
} else {
setToEnd();
}
}
/// Advance the iterator to \p Index, if it is contained within the current
/// interval. The public-facing method which supports advancing past the
/// current interval is \ref advanceToLowerBound.
void advanceTo(IndexT Index) {
assert(Index <= CachedStop && "Cannot advance to OOB index");
if (Index < CachedStart)
// We're already past this index.
return;
OffsetIntoMapIterator = Index - CachedStart;
}
const_iterator(UnderlyingIterator MapIt) : MapIterator(MapIt) {
resetCache();
}
public:
const_iterator() { setToEnd(); }
bool operator==(const const_iterator &RHS) const {
// Do /not/ compare MapIterator for equality, as this is very expensive.
// The cached start/stop values make that check unnecessary.
return std::tie(OffsetIntoMapIterator, CachedStart, CachedStop) ==
std::tie(RHS.OffsetIntoMapIterator, RHS.CachedStart,
RHS.CachedStop);
}
bool operator!=(const const_iterator &RHS) const {
return !operator==(RHS);
}
IndexT operator*() const { return CachedStart + OffsetIntoMapIterator; }
const_iterator &operator++() { // Pre-increment (++It).
if (CachedStart + OffsetIntoMapIterator < CachedStop) {
// Keep going within the current interval.
++OffsetIntoMapIterator;
} else {
// We reached the end of the current interval: advance.
++MapIterator;
resetCache();
}
return *this;
}
const_iterator operator++(int) { // Post-increment (It++).
const_iterator tmp = *this;
operator++();
return tmp;
}
/// Advance the iterator to the first set bit AT, OR AFTER, \p Index. If
/// no such set bit exists, advance to end(). This is like std::lower_bound.
/// This is useful if \p Index is close to the current iterator position.
/// However, unlike \ref find(), this has worst-case O(n) performance.
void advanceToLowerBound(IndexT Index) {
if (OffsetIntoMapIterator == kIteratorAtTheEndOffset)
return;
// Advance to the first interval containing (or past) Index, or to end().
while (Index > CachedStop) {
++MapIterator;
resetCache();
if (OffsetIntoMapIterator == kIteratorAtTheEndOffset)
return;
}
advanceTo(Index);
}
};
const_iterator begin() const { return const_iterator(Intervals.begin()); }
const_iterator end() const { return const_iterator(); }
/// Return an iterator pointing to the first set bit AT, OR AFTER, \p Index.
/// If no such set bit exists, return end(). This is like std::lower_bound.
/// This has worst-case logarithmic performance (roughly O(log(gaps between
/// contiguous ranges))).
const_iterator find(IndexT Index) const {
auto UnderlyingIt = Intervals.find(Index);
if (UnderlyingIt == Intervals.end())
return end();
auto It = const_iterator(UnderlyingIt);
It.advanceTo(Index);
return It;
}
/// Return a range iterator which iterates over all of the set bits in the
/// half-open range [Start, End).
iterator_range<const_iterator> half_open_range(IndexT Start,
IndexT End) const {
assert(Start < End && "Not a valid range");
auto StartIt = find(Start);
if (StartIt == end() || *StartIt >= End)
return {end(), end()};
auto EndIt = StartIt;
EndIt.advanceToLowerBound(End);
return {StartIt, EndIt};
}
void print(raw_ostream &OS) const {
OS << "{";
for (auto It = Intervals.begin(), End = Intervals.end(); It != End;
++It) {
OS << "[" << It.start();
if (It.start() != It.stop())
OS << ", " << It.stop();
OS << "]";
}
OS << "}";
}
#if !defined(NDEBUG) || defined(LLVM_ENABLE_DUMP)
LLVM_DUMP_METHOD void dump() const {
// LLDB swallows the first line of output after calling dump(). Add
// newlines before/after the braces to work around this.
dbgs() << "\n";
print(dbgs());
dbgs() << "\n";
}
#endif
private:
void insert(IndexT Start, IndexT End) { Intervals.insert(Start, End, 0); }
/// Record the overlaps between \p this and \p Other in \p Overlaps. Return
/// true if there is any overlap.
bool getOverlaps(const ThisT &Other,
SmallVectorImpl<IntervalT> &Overlaps) const {
for (IntervalMapOverlaps<MapT, MapT> I(Intervals, Other.Intervals);
I.valid(); ++I)
Overlaps.emplace_back(I.start(), I.stop());
assert(llvm::is_sorted(Overlaps,
[](IntervalT LHS, IntervalT RHS) {
return LHS.second < RHS.first;
}) &&
"Overlaps must be sorted");
return !Overlaps.empty();
}
/// Given the set of overlaps between this and some other bitvector, and an
/// interval [Start, Stop] from that bitvector, determine the portions of the
/// interval which do not overlap with this.
void getNonOverlappingParts(IndexT Start, IndexT Stop,
const SmallVectorImpl<IntervalT> &Overlaps,
SmallVectorImpl<IntervalT> &NonOverlappingParts) {
IndexT NextUncoveredBit = Start;
for (IntervalT Overlap : Overlaps) {
IndexT OlapStart, OlapStop;
std::tie(OlapStart, OlapStop) = Overlap;
// [Start;Stop] and [OlapStart;OlapStop] overlap iff OlapStart <= Stop
// and Start <= OlapStop.
bool DoesOverlap = OlapStart <= Stop && Start <= OlapStop;
if (!DoesOverlap)
continue;
// Cover the range [NextUncoveredBit, OlapStart). This puts the start of
// the next uncovered range at OlapStop+1.
if (NextUncoveredBit < OlapStart)
NonOverlappingParts.emplace_back(NextUncoveredBit, OlapStart - 1);
NextUncoveredBit = OlapStop + 1;
if (NextUncoveredBit > Stop)
break;
}
if (NextUncoveredBit <= Stop)
NonOverlappingParts.emplace_back(NextUncoveredBit, Stop);
}
Allocator *Alloc;
MapT Intervals;
};
} // namespace llvm
#endif // LLVM_ADT_COALESCINGBITVECTOR_H
```
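The container above stores each run of set bits as a closed interval inside an `llvm::IntervalMap`, which coalesces touching intervals automatically. Independently of the LLVM headers, the coalescing idea itself can be sketched with a plain `std::map`. All names here are illustrative; this is not the LLVM implementation:

```cpp
#include <algorithm>
#include <cassert>
#include <cstdint>
#include <iterator>
#include <map>

// Minimal sketch of interval coalescing: keep closed intervals
// [start, stop] keyed by start, and merge any interval that overlaps
// or abuts a newly inserted one. Assumes Stop < UINT64_MAX so the
// Stop + 1 "abuts" check cannot wrap.
class TinyCoalescingSet {
  std::map<uint64_t, uint64_t> Intervals; // start -> stop (closed)

public:
  void insert(uint64_t Start, uint64_t Stop) {
    // Step back one interval if the previous one touches [Start, Stop].
    auto It = Intervals.lower_bound(Start);
    if (It != Intervals.begin()) {
      auto Prev = std::prev(It);
      if (Prev->second + 1 >= Start)
        It = Prev;
    }
    // Absorb every interval that overlaps or abuts the new one.
    while (It != Intervals.end() && It->first <= Stop + 1) {
      Start = std::min(Start, It->first);
      Stop = std::max(Stop, It->second);
      It = Intervals.erase(It);
    }
    Intervals[Start] = Stop;
  }

  bool test(uint64_t Index) const {
    auto It = Intervals.upper_bound(Index);
    if (It == Intervals.begin())
      return false;
    --It;
    return Index <= It->second;
  }

  std::size_t numIntervals() const { return Intervals.size(); }
};
```

Inserting `[1,3]`, `[5,6]`, then `[4,4]` leaves a single interval `[1,6]`, which is the compression the real class relies on for fast unions and intersections.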
|
Bertram Patrick Pockney (17 March 1927 – 22 June 2004) was a Russian studies academic with an interest in Soviet economics. He graduated from the London School of Economics in 1947. He taught at Holland Park School and later also lectured on Soviet economics at Battersea College of Advanced Technology. In the meantime, he completed a BA at the University of London (1960). He was appointed to a lectureship at the University of Surrey in 1965 and was promoted to a senior lectureship in Russian studies in 1970. He was later appointed to a professorship, delivering his inaugural lecture in 1984. He retired in 1992 and worked as a consultant on industry in Russia and Eastern Europe. In 1991, he authored Soviet Statistics since 1950.
References
1927 births
2004 deaths
Russian studies scholars
Alumni of the London School of Economics
Academics of the University of Surrey
|
Discoderus impotens is a species of ground beetle in the family Carabidae. It is found in North America.
References
Further reading
Harpalinae
Articles created by Qbugbot
Beetles described in 1858
|
Joseph Carpenter may refer to:
Joseph Edwards Carpenter (1813–1885), English playwright and songwriter
Joseph Estlin Carpenter (1844–1927), Unitarian minister, principal of Manchester College, Oxford
Joe Carpenter (rugby union), English rugby union player
Joe Carpenter (R.O.D), fictional character in Japanese novel, Read or Die
|
```python
"""
Car model for Hybrid A* path planning
author: Zheng Zh (@Zhengzh)
"""
import sys
import pathlib
root_dir = pathlib.Path(__file__).parent.parent.parent
sys.path.append(str(root_dir))
from math import cos, sin, tan, pi
import matplotlib.pyplot as plt
import numpy as np
from utils.angle import rot_mat_2d
WB = 3.0 # rear to front wheel
W = 2.0 # width of car
LF = 3.3 # distance from rear to vehicle front end
LB = 1.0 # distance from rear to vehicle back end
MAX_STEER = 0.6 # [rad] maximum steering angle
BUBBLE_DIST = (LF - LB) / 2.0 # distance from rear to center of vehicle.
BUBBLE_R = np.hypot((LF + LB) / 2.0, W / 2.0) # bubble radius
# vehicle rectangle vertices
VRX = [LF, LF, -LB, -LB, LF]
VRY = [W / 2, -W / 2, -W / 2, W / 2, W / 2]
def check_car_collision(x_list, y_list, yaw_list, ox, oy, kd_tree):
for i_x, i_y, i_yaw in zip(x_list, y_list, yaw_list):
cx = i_x + BUBBLE_DIST * cos(i_yaw)
cy = i_y + BUBBLE_DIST * sin(i_yaw)
ids = kd_tree.query_ball_point([cx, cy], BUBBLE_R)
if not ids:
continue
if not rectangle_check(i_x, i_y, i_yaw,
[ox[i] for i in ids], [oy[i] for i in ids]):
return False # collision
return True # no collision
def rectangle_check(x, y, yaw, ox, oy):
# transform obstacles to base link frame
rot = rot_mat_2d(yaw)
for iox, ioy in zip(ox, oy):
tx = iox - x
ty = ioy - y
converted_xy = np.stack([tx, ty]).T @ rot
rx, ry = converted_xy[0], converted_xy[1]
if not (rx > LF or rx < -LB or ry > W / 2.0 or ry < -W / 2.0):
return False # collision
return True # no collision
def plot_arrow(x, y, yaw, length=1.0, width=0.5, fc="r", ec="k"):
"""Plot arrow."""
if not isinstance(x, float):
for (i_x, i_y, i_yaw) in zip(x, y, yaw):
plot_arrow(i_x, i_y, i_yaw)
else:
plt.arrow(x, y, length * cos(yaw), length * sin(yaw),
fc=fc, ec=ec, head_width=width, head_length=width, alpha=0.4)
def plot_car(x, y, yaw):
car_color = '-k'
c, s = cos(yaw), sin(yaw)
rot = rot_mat_2d(-yaw)
car_outline_x, car_outline_y = [], []
for rx, ry in zip(VRX, VRY):
converted_xy = np.stack([rx, ry]).T @ rot
car_outline_x.append(converted_xy[0]+x)
car_outline_y.append(converted_xy[1]+y)
arrow_x, arrow_y, arrow_yaw = c * 1.5 + x, s * 1.5 + y, yaw
plot_arrow(arrow_x, arrow_y, arrow_yaw)
plt.plot(car_outline_x, car_outline_y, car_color)
def pi_2_pi(angle):
return (angle + pi) % (2 * pi) - pi
def move(x, y, yaw, distance, steer, L=WB):
x += distance * cos(yaw)
y += distance * sin(yaw)
yaw += pi_2_pi(distance * tan(steer) / L) # distance/2
return x, y, yaw
def main():
x, y, yaw = 0., 0., 1.
plt.axis('equal')
plot_car(x, y, yaw)
plt.show()
if __name__ == '__main__':
main()
```
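The `move` function above is the standard kinematic bicycle update: the rear axle translates along the current heading, and the heading turns by `distance * tan(steer) / WB`. A self-contained restatement (duplicating `pi_2_pi` so it runs without the module's local imports):

```python
from math import cos, sin, tan, pi


def pi_2_pi(angle):
    """Wrap an angle to the half-open range [-pi, pi)."""
    return (angle + pi) % (2 * pi) - pi


def move(x, y, yaw, distance, steer, wheel_base=3.0):
    """One kinematic-bicycle step: translate the rear axle along the
    current heading, then turn by distance * tan(steer) / wheel_base."""
    x += distance * cos(yaw)
    y += distance * sin(yaw)
    yaw += pi_2_pi(distance * tan(steer) / wheel_base)
    return x, y, yaw


# Driving straight for one metre leaves the heading unchanged.
print(move(0.0, 0.0, 0.0, 1.0, 0.0))  # (1.0, 0.0, 0.0)
```

With a non-zero steering angle the yaw increment is positive for a left turn, which is what Hybrid A* uses to expand successor poses.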
|
Donald A. Guardian (born June 12, 1953) is an American Republican Party politician who has represented the 2nd Legislative District in the New Jersey General Assembly since taking office on January 11, 2022, when he became the first openly gay Republican legislator in state history. He served as the Mayor of Atlantic City, New Jersey from 2014 to 2017.
Early life and education
Raised in Palisades Park, New Jersey and West New York, Guardian graduated from Don Bosco Preparatory High School. He graduated in 1975 from Upsala College.
Before being elected mayor, Guardian served as an executive with the Boy Scouts of America and at the Claridge Hotel, and headed Atlantic City's Special Improvement District for two decades.
After his time as Mayor, Guardian was named as Business Administrator by the Toms River, New Jersey Township Council, for which he was paid an annual salary of $175,000.
Elective office
On January 19, 2013, Guardian announced he was challenging incumbent mayor Lorenzo Langford. He won the Republican primary unopposed and, on November 5, 2013, defeated Langford by 50% to 47%, becoming Atlantic City's first openly gay mayor and its first Republican mayor since 1990. In the 2017 election, Guardian lost re-election to Democratic city councilman Frank Gilliam.
Guardian was elected to the State Assembly in 2021, alongside fellow Republican Claire Swift. They defeated the Democratic slate of incumbent Assemblyman John Armato and Atlantic County Commissioner Caren Fitzpatrick; the district's other Assemblyman, Democrat Vince Mazzeo, did not seek reelection to the Assembly in order to make what would ultimately be an unsuccessful bid for the district's State Senate seat.
On January 11, 2022, Guardian was sworn in to the New Jersey General Assembly. His election makes him the first openly gay Republican legislator in state history and the only openly gay member of the New Jersey Legislature.
Committees
Committee assignments for the current session are:
Environment and Solid Waste
Special Committee on Infrastructure and Natural Resources
Tourism, Gaming and the Arts
District 2
Each of the 40 districts in the New Jersey Legislature has one representative in the New Jersey Senate and two members in the New Jersey General Assembly. The representatives from the 2nd District for the 2022–23 Legislative Session are:
Senator Vincent J. Polistina (R)
Assemblyman Don Guardian (R)
Assemblywoman Claire Swift (R)
References
External links
Legislative webpage
City of Atlantic City: Mayor Donald A. Guardian
Campaign website
Atlantic City Mayor Don Guardian on Instagram
Atlantic City Mayor Don Guardian on Twitter
Atlantic City Mayor Don Guardian on Facebook
1953 births
Living people
LGBT state legislators in New Jersey
Gay politicians
LGBT mayors of places in the United States
LGBT people from New Jersey
Mayors of Atlantic City, New Jersey
Republican Party members of the New Jersey General Assembly
21st-century American politicians
21st-century American LGBT people
People from Palisades Park, New Jersey
People from West New York, New Jersey
Don Bosco Preparatory High School alumni
Upsala College alumni
|
```yaml
{{- if .Values.influxdb.enabled }}
# Fluentbit deployment for Fission
#
# Requires:
# - service account: fission-svc
apiVersion: v1
kind: ConfigMap
metadata:
name: {{ .Release.Name }}-fission-fluentbit
data:
{{- if .Files.Get "config/fluentbit.conf" }}
fluentbit.conf: |
{{ .Files.Get "config/fluentbit.conf" | indent 3 }}
{{ else }}
{{ fail "invalid chart" }}
{{- end }}
{{- if .Files.Get "config/parsers.conf" }}
parsers.conf: |
{{ .Files.Get "config/parsers.conf" | indent 3 }}
{{ else }}
{{ fail "invalid chart" }}
{{- end }}
{{- if .Values.logger.podSecurityPolicy.enabled }}
---
apiVersion: policy/v1beta1
kind: PodSecurityPolicy
metadata:
name: {{ .Release.Name }}-fission-logger-privileged
labels:
chart: "{{ .Chart.Name }}-{{ .Chart.Version }}"
svc: logger
spec:
privileged: true
seLinux:
rule: RunAsAny
supplementalGroups:
rule: RunAsAny
runAsUser:
rule: RunAsAny
fsGroup:
rule: RunAsAny
volumes:
- '*'
{{- if .Values.logger.podSecurityPolicy.additionalCapabilities }}
allowedCapabilities:
{{- range .Values.logger.podSecurityPolicy.additionalCapabilities }}
- {{ . }}
{{- end }}
{{- end }}
---
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
name: psp:{{ .Release.Name }}-fission-logger-privileged
labels:
chart: "{{ .Chart.Name }}-{{ .Chart.Version }}"
svc: logger
rules:
- apiGroups: ['policy']
resources: ['podsecuritypolicies']
verbs: ['use']
resourceNames:
- {{ .Release.Name }}-fission-logger-privileged
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
name: psp:{{ .Release.Name }}-fission-logger-privileged
labels:
chart: "{{ .Chart.Name }}-{{ .Chart.Version }}"
svc: logger
roleRef:
kind: Role
name: psp:{{ .Release.Name }}-fission-logger-privileged
apiGroup: rbac.authorization.k8s.io
subjects:
- kind: ServiceAccount
name: default
namespace: {{ .Release.Namespace }}
{{- end }}
---
apiVersion: apps/v1
kind: DaemonSet
metadata:
name: logger
labels:
chart: "{{ .Chart.Name }}-{{ .Chart.Version }}"
svc: logger
spec:
selector:
matchLabels:
svc: logger
template:
metadata:
labels:
svc: logger
spec:
initContainers:
- name: init
image: {{ .Values.busyboxImage | quote }}
imagePullPolicy: {{ .Values.pullPolicy }}
command: ['mkdir', '-p', '/var/log/fission']
volumeMounts:
- name: container-log
mountPath: /var/log/
readOnly: false
{{- if .Values.logger.enableSecurityContext }}
securityContext:
privileged: true
{{- end }}
containers:
- name: logger
image: {{ include "fission-bundleImage" . | quote }}
imagePullPolicy: {{ .Values.pullPolicy }}
env:
- name: NODE_NAME
valueFrom:
fieldRef:
apiVersion: v1
fieldPath: spec.nodeName
command: ["/fission-bundle"]
args: ["--logger"]
volumeMounts:
- name: container-log
mountPath: /var/log/
readOnly: false
- name: docker-log
mountPath: /var/lib/docker/containers
readOnly: true
{{- if .Values.logger.enableSecurityContext }}
securityContext:
privileged: true
{{- end }}
- name: fluentbit
{{- if .Values.repository }}
image: "{{ .Values.logger.fluentdImageRepository }}/{{ .Values.logger.fluentdImage }}:{{ .Values.logger.fluentdImageTag }}"
{{ else }}
image: "{{ .Values.logger.fluentdImage }}:{{ .Values.logger.fluentdImageTag }}"
{{- end }}
imagePullPolicy: {{ .Values.pullPolicy }}
# CMD ["/fluent-bit/bin/fluent-bit", "-c", "/fluent-bit/etc/fluent-bit.conf"]
command: ["/fluent-bit/bin/fluent-bit", "-c", "/fluent-bit/etc/fluentbit.conf"]
env:
- name: INFLUXDB_ADDRESS
value: influxdb
- name: INFLUXDB_PORT
value: "8086"
- name: INFLUXDB_DBNAME
value: "fissionFunctionLog"
- name: INFLUXDB_USERNAME
valueFrom:
secretKeyRef:
name: influxdb
key: username
- name: INFLUXDB_PASSWD
valueFrom:
secretKeyRef:
name: influxdb
key: password
- name: LOG_PATH
value: /var/log/fission/*.log
{{- if .Values.logger.enableSecurityContext }}
securityContext:
privileged: true
{{- end }}
volumeMounts:
- name: container-log
mountPath: /var/log/
readOnly: false
- name: docker-log
mountPath: /var/lib/docker/containers
readOnly: true
- name: fluentbit-config
mountPath: /fluent-bit/etc/
readOnly: true
serviceAccountName: fission-fluentbit
volumes:
- name: container-log
hostPath:
path: /var/log/
- name: docker-log
hostPath:
path: /var/lib/docker/containers
# Fluentbit config location: /fluent-bit/etc/*.conf
- name: fluentbit-config
configMap:
name: {{ .Release.Name }}-fission-fluentbit
{{- with .Values.imagePullSecrets }}
imagePullSecrets:
{{- toYaml . | nindent 8 }}
{{- end }}
updateStrategy:
type: RollingUpdate
{{- end }}
```
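The templates above pull their images and feature flags from chart values. A hypothetical `values.yaml` fragment showing the keys the template reads (the key names come from the template itself; the values are placeholders, not the chart's defaults):

```yaml
# Hypothetical values fragment; values are illustrative placeholders.
influxdb:
  enabled: true            # gates this entire manifest
busyboxImage: busybox:1.36
pullPolicy: IfNotPresent
repository: ""             # when set, fluentdImageRepository is used as a prefix
logger:
  enableSecurityContext: false
  fluentdImageRepository: index.docker.io
  fluentdImage: fluent/fluent-bit
  fluentdImageTag: "1.9"
  podSecurityPolicy:
    enabled: false
    additionalCapabilities: []
imagePullSecrets: []
```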
|
```c++
/*
[auto_generated]
boost/numeric/odeint/integrate/integrate_const.hpp
[begin_description]
Constant integration of ODEs, meaning that the state of the ODE is observed on constant time intervals.
The routines make full use of adaptive and dense-output methods.
[end_description]
(See accompanying file LICENSE_1_0.txt or
copy at path_to_url
*/
#ifndef BOOST_NUMERIC_ODEINT_INTEGRATE_INTEGRATE_CONST_HPP_INCLUDED
#define BOOST_NUMERIC_ODEINT_INTEGRATE_INTEGRATE_CONST_HPP_INCLUDED
#include <boost/type_traits/is_same.hpp>
#include <boost/numeric/odeint/stepper/stepper_categories.hpp>
#include <boost/numeric/odeint/iterator/integrate/null_observer.hpp>
#include <boost/numeric/odeint/iterator/integrate/detail/integrate_const.hpp>
#include <boost/numeric/odeint/iterator/integrate/detail/integrate_adaptive.hpp>
namespace boost {
namespace numeric {
namespace odeint {
/*
* Integrates with constant time step dt.
*/
template< class Stepper , class System , class State , class Time , class Observer >
size_t integrate_const(
Stepper stepper , System system , State &start_state ,
Time start_time , Time end_time , Time dt ,
Observer observer
)
{
typedef typename odeint::unwrap_reference< Stepper >::type::stepper_category stepper_category;
// we want to get as fast as possible to the end
if( boost::is_same< null_observer , Observer >::value )
{
return detail::integrate_adaptive(
stepper , system , start_state ,
start_time , end_time , dt ,
observer , stepper_category() );
}
else
{
return detail::integrate_const( stepper , system , start_state ,
start_time , end_time , dt ,
observer , stepper_category() );
}
}
/**
* \brief Second version to solve the forwarding problem,
* can be called with Boost.Range as start_state.
*/
template< class Stepper , class System , class State , class Time , class Observer >
size_t integrate_const(
Stepper stepper , System system , const State &start_state ,
Time start_time , Time end_time , Time dt ,
Observer observer
)
{
typedef typename odeint::unwrap_reference< Stepper >::type::stepper_category stepper_category;
// we want to get as fast as possible to the end
if( boost::is_same< null_observer , Observer >::value )
{
return detail::integrate_adaptive(
stepper , system , start_state ,
start_time , end_time , dt ,
observer , stepper_category() );
}
else
{
return detail::integrate_const( stepper , system , start_state ,
start_time , end_time , dt ,
observer , stepper_category() );
}
}
/**
* \brief integrate_const without observer calls
*/
template< class Stepper , class System , class State , class Time >
size_t integrate_const(
Stepper stepper , System system , State &start_state ,
Time start_time , Time end_time , Time dt
)
{
return integrate_const( stepper , system , start_state , start_time , end_time , dt , null_observer() );
}
/**
* \brief Second version to solve the forwarding problem,
* can be called with Boost.Range as start_state.
*/
template< class Stepper , class System , class State , class Time >
size_t integrate_const(
Stepper stepper , System system , const State &start_state ,
Time start_time , Time end_time , Time dt
)
{
return integrate_const( stepper , system , start_state , start_time , end_time , dt , null_observer() );
}
/********* DOXYGEN *********/
/**
* \fn integrate_const( Stepper stepper , System system , State &start_state , Time start_time , Time end_time , Time dt , Observer observer )
* \brief Integrates the ODE with constant step size.
*
* Integrates the ODE defined by system using the given stepper.
* This method ensures that the observer is called at constant intervals dt.
* If the Stepper is a normal stepper without step size control, dt is also
* used for the numerical scheme. If a ControlledStepper is provided, the
* algorithm might reduce the step size to meet the error bounds, but it is
* ensured that the observer is always called at equidistant time points
* t0 + n*dt. If a DenseOutputStepper is used, the step size also may vary
* and the dense output is used to call the observer at equidistant time
* points.
*
* \param stepper The stepper to be used for numerical integration.
* \param system Function/Functor defining the rhs of the ODE.
* \param start_state The initial condition x0.
* \param start_time The initial time t0.
* \param end_time The final integration time tend.
* \param dt The time step between observer calls, _not_ necessarily the
* time step of the integration.
* \param observer Function/Functor called at equidistant time intervals.
* \return The number of steps performed.
*/
} // namespace odeint
} // namespace numeric
} // namespace boost
#endif // BOOST_NUMERIC_ODEINT_INTEGRATE_INTEGRATE_CONST_HPP_INCLUDED
```
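The doxygen block above states the contract of `integrate_const`: whatever the stepper does internally, the observer fires exactly at the equidistant times t0 + n*dt. A minimal self-contained sketch of that contract with a plain explicit Euler stepper (illustrative names, not Boost.Odeint code):

```cpp
#include <cassert>
#include <cstddef>
#include <functional>

// Illustrative fixed-step driver: advance dx/dt = f(x, t) with
// explicit Euler steps and invoke `observer` exactly at the grid
// points t0 + n*dt, mirroring integrate_const's observation contract.
inline std::size_t euler_integrate_const(
    const std::function<double(double, double)> &f, double &x,
    double t0, double t1, double dt,
    const std::function<void(double, double)> &observer) {
  std::size_t steps = 0;
  double t = t0;
  observer(x, t); // the initial condition is observed, as in odeint
  while (t + dt <= t1 + dt * 1e-9) {
    x += dt * f(x, t);                            // one Euler step
    t = t0 + static_cast<double>(steps + 1) * dt; // stay on the grid
    observer(x, t);
    ++steps;
  }
  return steps;
}
```

With a controlled or dense-output stepper, odeint preserves the same outward behaviour (observations at t0 + n*dt) while internally varying the step size to meet error bounds.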
|
The Source Newspaper, also known in French as La Source, is an intercultural newspaper in Vancouver, British Columbia, Canada. In publication since 1999, it has its editorial offices at the corner of Granville and Robson Streets in Downtown Vancouver. Styling itself as "A Forum For Diversity", it is dedicated to diversity and intercultural harmony. Published in English and French, it has readers across British Columbia, from Nelson in the east to Terrace in the north, across Vancouver Island, and throughout the Lower Mainland. Its founding publisher is Mamadou Gangué.
References
Newspapers published in Vancouver
Newspapers established in 1999
1999 establishments in British Columbia
|
Roderick Allen may refer to:
Roderick R. Allen (1894–1970), Major General of the U.S. Army
Rod Allen (born 1959), baseball analyst
Rod Allen (advertising executive) (1929–2007)
See also
Rodney Allen (disambiguation)
Allen (surname)
|
Thomas or Tom Gorman may refer to:
Thomas Kiely Gorman (1892–1980), Roman Catholic bishop of Dallas
Tom Gorman (American football) (1910–1975), American football player and coach
Tom Gorman (rugby league) (1901–1978), Australian rugby league player
Tom Gorman (tennis) (born 1946), American tennis player
Tommie Gorman (born 1956), Irish journalist
Tommy Gorman (1886–1961), Canadian lacrosse player and founder of the National Hockey League (NHL)
Baseball
Tom Gorman (1980s pitcher) (born 1957), American baseball relief pitcher
Tom Gorman (right-handed pitcher) (1925–1992), American baseball relief pitcher
Tom Gorman (umpire) (1919–1986), American baseball pitcher and umpire
See also
Gorman Thomas (born 1950), American former professional baseball player
|
Michal Ondo (born 20 February 1985), nicknamed Máchal, is a Czech professional darts player and a member of the team Barbaři Sedlec.
Career
Ondo started playing in 2001 and ranks among the best and most successful Czech players.
In soft-tip darts he is a four-time national champion in teams, and in 2017 he won the singles title.
His first successful year in steel darts came in 2011, when he made three big achievements: he became national champion in singles and, competing abroad for the first time, finished 3rd at both the Hungaria Open and the Vienna Open. In 2012 he became national champion in teams with DC Bizoni. In 2014 he finished 3rd at the Romania Open, and in the same year he won the Czech Cup and the national championship in doubles. In 2015 he triumphed again in the Czech Cup and won the doubles tournament at the biggest darts event in the Czech Republic, the Czech Open. The following year, 2016, he won the Czech Cup for the third time and repeated his 3rd-place finish at the Hungaria Open.
Between 2011 and 2017 he attempted to qualify for the BDO World Darts Championship, but never succeeded. He took part in six editions of a major tournament, the World Masters, reaching the last 48 in 2015.
In 2018, he finished 3rd in the WDF Europe Cup, probably his biggest achievement so far. In the same year, he became national champion with the team Barbaři Sedlec. In November he took part in the Prague Darts Masters exhibition, where he played against Peter Wright.
In 2019 he won the national championship in doubles for the third time, this time with Michal Šmejda. He also competed in the 2019 PDC European Q-School, three times reaching the last 128 and, in the final tournament, the last 64. In November he qualified for another exhibition, Prague Darts Master Souboj legend, in which Phil Taylor also played.
In 2020 he tried PDC European Q-School again, making it into the last 32 on Day 3. On the other days he was eliminated in the last 512, last 256 and last 128, which was not enough to secure a Tour Card. In April 2020 he became one of the ten players in the newly founded 2020 Tipsport Premier League.
Major tournaments results
WDF
BDO
References
External links
1985 births
Living people
Czech darts players
Professional Darts Corporation associate players
People from Vysoké Mýto
Sportspeople from the Pardubice Region
|
Sir Ralph Sydenham (died 1671) was an English politician who sat in the House of Commons from 1641 to 1642. He supported the Royalist cause in the English Civil War.
Sydenham was the son of Sir John Sydenham of Brimpton. He was knighted in Scotland on 17 July 1617.
In 1641, Sydenham was elected Member of Parliament for Bossiney in the Long Parliament in place of Sir John Clotworthy, who chose to sit for Maldon. Sydenham followed the King to Oxford and was thus disabled from sitting in parliament on 29 September 1642. He compounded for his delinquency with a fine of £500. He lived at Youlston, Devon.
Following the Restoration in 1660, Sydenham was made Master of Charterhouse and remained in post until his death in 1671.
In 1629, Sydenham married Mary, the widow of Sir Arthur Chichester, at St Mary Abbots Church, Kensington and had a family.
References
Year of birth missing
1671 deaths
Members of the pre-1707 English Parliament for constituencies in Cornwall
Cavaliers
English MPs 1640–1648
|
Lucimar Teodoro (born 1 May 1981) is a Brazilian track and field athlete who specialises in the 400 metres sprint and the 400 metres hurdles.
Teodoro was born in Guararapes, São Paulo, and began competing in senior athletics in 2001. She attended her first Olympics at the 2004 Athens Games as part of the 4×400 metres relay team; however, the team did not reach the final of the event. Teodoro won both the sprint and hurdling 400 m titles at the 2005 national championships and reached the semi-finals of the sprint event at the 2005 World Championships in Athletics. She competed at the following World Championships in Osaka, this time in the 400 m hurdles, but did not progress to the final. At the 2008 Beijing Olympics, Teodoro represented Brazil in the 400 m hurdles and the 4×400 metres relay.
She set a new South American record in the 400 m hurdles with a time of 55.84 seconds in Belém at the Grande Prêmio Brasil Caixa meet in May 2009. Following this, she finished second at the Troféu Brasil Caixa de Atletismo to Luciana França, who challenged Teodoro's record by running a new personal best of 55.90 seconds. In July, she won the gold at the 2009 Lusophony Games.
However, she tested positive for the banned substance fenproporex soon after, was provisionally banned from competition, and was not selected for the 2009 World Championships in Athletics. After admitting her use of a banned substance, she was suspended from competition for the minimum period of two years by the Brazilian Athletics Confederation.
Achievements
Personal bests
All information taken from IAAF profile.
See also
List of doping cases in sport
References
External links
1981 births
Living people
Brazilian female sprinters
Brazilian female hurdlers
Athletes (track and field) at the 2003 Pan American Games
Athletes (track and field) at the 2004 Summer Olympics
Athletes (track and field) at the 2008 Summer Olympics
Doping cases in athletics
Olympic athletes for Brazil
Brazilian sportspeople in doping cases
Pan American Games athletes for Brazil
Pan American Games medalists in athletics (track and field)
Pan American Games bronze medalists for Brazil
Medalists at the 2003 Pan American Games
Athletes from São Paulo (state)
21st-century Brazilian women
20th-century Brazilian women
People from Guararapes
|
Inga cuspidata is a species of plant in the family Fabaceae. It is found only in Panama.
References
cuspidata
Flora of Panama
Vulnerable plants
Taxonomy articles created by Polbot
|
```go
/*
path_to_url
Unless required by applicable law or agreed to in writing, software
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
*/
package docker
import (
"bytes"
"fmt"
"io"
"github.com/containerd/containerd/errdefs"
"github.com/containerd/containerd/log"
)
const maxRetry = 3
type httpReadSeeker struct {
size int64
offset int64
rc io.ReadCloser
open func(offset int64) (io.ReadCloser, error)
closed bool
errsWithNoProgress int
}
func newHTTPReadSeeker(size int64, open func(offset int64) (io.ReadCloser, error)) (io.ReadCloser, error) {
return &httpReadSeeker{
size: size,
open: open,
}, nil
}
func (hrs *httpReadSeeker) Read(p []byte) (n int, err error) {
if hrs.closed {
return 0, io.EOF
}
rd, err := hrs.reader()
if err != nil {
return 0, err
}
n, err = rd.Read(p)
hrs.offset += int64(n)
if n > 0 || err == nil {
hrs.errsWithNoProgress = 0
}
if err == io.ErrUnexpectedEOF {
// connection closed unexpectedly. try reconnecting.
if n == 0 {
hrs.errsWithNoProgress++
if hrs.errsWithNoProgress > maxRetry {
return // too many retries for this offset with no progress
}
}
if hrs.rc != nil {
if clsErr := hrs.rc.Close(); clsErr != nil {
log.L.WithError(clsErr).Error("httpReadSeeker: failed to close ReadCloser")
}
hrs.rc = nil
}
if _, err2 := hrs.reader(); err2 == nil {
return n, nil
}
} else if err == io.EOF {
// The CRI's imagePullProgressTimeout relies on responseBody.Close to
// update the process monitor's status. If the err is io.EOF, close
// the connection since there is no more available data.
if hrs.rc != nil {
if clsErr := hrs.rc.Close(); clsErr != nil {
log.L.WithError(clsErr).Error("httpReadSeeker: failed to close ReadCloser after io.EOF")
}
hrs.rc = nil
}
}
return
}
func (hrs *httpReadSeeker) Close() error {
if hrs.closed {
return nil
}
hrs.closed = true
if hrs.rc != nil {
return hrs.rc.Close()
}
return nil
}
func (hrs *httpReadSeeker) Seek(offset int64, whence int) (int64, error) {
if hrs.closed {
return 0, fmt.Errorf("Fetcher.Seek: closed: %w", errdefs.ErrUnavailable)
}
abs := hrs.offset
switch whence {
case io.SeekStart:
abs = offset
case io.SeekCurrent:
abs += offset
case io.SeekEnd:
if hrs.size == -1 {
return 0, fmt.Errorf("Fetcher.Seek: unknown size, cannot seek from end: %w", errdefs.ErrUnavailable)
}
abs = hrs.size + offset
default:
return 0, fmt.Errorf("Fetcher.Seek: invalid whence: %w", errdefs.ErrInvalidArgument)
}
if abs < 0 {
return 0, fmt.Errorf("Fetcher.Seek: negative offset: %w", errdefs.ErrInvalidArgument)
}
if abs != hrs.offset {
if hrs.rc != nil {
if err := hrs.rc.Close(); err != nil {
log.L.WithError(err).Error("Fetcher.Seek: failed to close ReadCloser")
}
hrs.rc = nil
}
hrs.offset = abs
}
return hrs.offset, nil
}
func (hrs *httpReadSeeker) reader() (io.Reader, error) {
if hrs.rc != nil {
return hrs.rc, nil
}
if hrs.size == -1 || hrs.offset < hrs.size {
// only try to reopen the body request if we are seeking to a value
// less than the actual size.
if hrs.open == nil {
return nil, fmt.Errorf("cannot open: %w", errdefs.ErrNotImplemented)
}
rc, err := hrs.open(hrs.offset)
if err != nil {
return nil, fmt.Errorf("httpReadSeeker: failed open: %w", err)
}
if hrs.rc != nil {
if err := hrs.rc.Close(); err != nil {
log.L.WithError(err).Error("httpReadSeeker: failed to close ReadCloser")
}
}
hrs.rc = rc
} else {
// There is an edge case here where offset == size of the content. If
// we seek, we will probably get an error for content that cannot be
// sought (?). In that case, we should err on committing the content,
// as the length is already satisfied but we just return the empty
// reader instead.
hrs.rc = io.NopCloser(bytes.NewReader([]byte{}))
}
return hrs.rc, nil
}
```
|
```pod
=pod
=head1 NAME
DSA_generate_parameters - generate DSA parameters
=head1 SYNOPSIS
#include <openssl/dsa.h>
DSA *DSA_generate_parameters(int bits, unsigned char *seed,
int seed_len, int *counter_ret, unsigned long *h_ret,
void (*callback)(int, int, void *), void *cb_arg);
=head1 DESCRIPTION
DSA_generate_parameters() generates primes p and q and a generator g
for use in the DSA.
B<bits> is the length of the prime to be generated; the DSS allows a
maximum of 1024 bits.
If B<seed> is B<NULL> or B<seed_len> E<lt> 20, the primes will be
generated at random. Otherwise, the seed is used to generate
them. If the given seed does not yield a prime q, a new random
seed is chosen.
DSA_generate_parameters() places the iteration count in
*B<counter_ret> and a counter used for finding a generator in
*B<h_ret>, unless these are B<NULL>.
A callback function may be used to provide feedback about the progress
of the key generation. If B<callback> is not B<NULL>, it will be
called as follows:
=over 4
=item *
When a candidate for q is generated, B<callback(0, m++, cb_arg)> is called
(m is 0 for the first candidate).
=item *
When a candidate for q has passed a test by trial division,
B<callback(1, -1, cb_arg)> is called.
While a candidate for q is tested by Miller-Rabin primality tests,
B<callback(1, i, cb_arg)> is called in the outer loop
(once for each witness that confirms that the candidate may be prime);
i is the loop counter (starting at 0).
=item *
When a prime q has been found, B<callback(2, 0, cb_arg)> and
B<callback(3, 0, cb_arg)> are called.
=item *
Before a candidate for p (other than the first) is generated and tested,
B<callback(0, counter, cb_arg)> is called.
=item *
When a candidate for p has passed the test by trial division,
B<callback(1, -1, cb_arg)> is called.
While it is tested by the Miller-Rabin primality test,
B<callback(1, i, cb_arg)> is called in the outer loop
(once for each witness that confirms that the candidate may be prime).
i is the loop counter (starting at 0).
=item *
When p has been found, B<callback(2, 1, cb_arg)> is called.
=item *
When the generator has been found, B<callback(3, 1, cb_arg)> is called.
=back
=head1 RETURN VALUE
DSA_generate_parameters() returns a pointer to the DSA structure, or
B<NULL> if the parameter generation fails. The error codes can be
obtained by L<ERR_get_error(3)|ERR_get_error(3)>.
=head1 BUGS
Seed lengths E<gt> 20 are not supported.
=head1 SEE ALSO
L<dsa(3)|dsa(3)>, L<ERR_get_error(3)|ERR_get_error(3)>, L<rand(3)|rand(3)>,
L<DSA_free(3)|DSA_free(3)>
=head1 HISTORY
DSA_generate_parameters() appeared in SSLeay 0.8. The B<cb_arg>
argument was added in SSLeay 0.9.0.
In versions up to OpenSSL 0.9.4, B<callback(1, ...)> was called
in the inner loop of the Miller-Rabin test whenever it reached the
squaring step (the parameters to B<callback> did not reveal how many
witnesses had been tested); since OpenSSL 0.9.5, B<callback(1, ...)>
is called as in BN_is_prime(3), i.e. once for each witness.
=cut
```
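The callback protocol described above can be sketched without calling OpenSSL at all. The function below matches the `void (*callback)(int, int, void *)` signature that DSA_generate_parameters() expects, and `simulate_q_search()` replays the documented event sequence for one q search that survives trial division and two Miller-Rabin witnesses. The event log and helper names are illustrative, not part of OpenSSL.

```cpp
#include <utility>
#include <vector>

// Records every (type, num) pair the callback receives, so the documented
// sequence can be inspected after the fact.
inline std::vector<std::pair<int, int>>& recorded_events() {
    static std::vector<std::pair<int, int>> events;
    return events;
}

// Same signature as the callback parameter of DSA_generate_parameters().
inline void progress_cb(int type, int num, void* /*cb_arg*/) {
    recorded_events().emplace_back(type, num);
}

// Replays the man page's callback sequence for a q search whose first
// candidate passes trial division and two Miller-Rabin witnesses.
inline void simulate_q_search() {
    progress_cb(0, 0, nullptr);   // candidate for q generated (m == 0)
    progress_cb(1, -1, nullptr);  // candidate passed trial division
    progress_cb(1, 0, nullptr);   // Miller-Rabin witness 0 passed
    progress_cb(1, 1, nullptr);   // Miller-Rabin witness 1 passed
    progress_cb(2, 0, nullptr);   // prime q found
    progress_cb(3, 0, nullptr);
}
```

In a real application the callback would typically print a progress character per event rather than record it.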
|
```c
/****************************************************************************
*
* gxvprop.c
*
* TrueTypeGX/AAT prop table validation (body).
*
* suzuki toshiya, Masatake YAMATO, Red Hat K.K.,
* David Turner, Robert Wilhelm, and Werner Lemberg.
*
* This file is part of the FreeType project, and may only be used,
* modified, and distributed under the terms of the FreeType project
* license, LICENSE.TXT. By continuing to use, modify, or distribute
* this file you indicate that you have read the license and
* understand and accept it fully.
*
*/
/****************************************************************************
*
* gxvalid is derived from both gxlayout module and otvalid module.
* Development of gxlayout is supported by the Information-technology
* Promotion Agency(IPA), Japan.
*
*/
#include "gxvalid.h"
#include "gxvcommn.h"
/**************************************************************************
*
* The macro FT_COMPONENT is used in trace mode. It is an implicit
* parameter of the FT_TRACE() and FT_ERROR() macros, used to print/log
* messages during execution.
*/
#undef FT_COMPONENT
#define FT_COMPONENT gxvprop
/*************************************************************************/
/*************************************************************************/
/***** *****/
/***** Data and Types *****/
/***** *****/
/*************************************************************************/
/*************************************************************************/
#define GXV_PROP_HEADER_SIZE ( 4 + 2 + 2 )
#define GXV_PROP_SIZE_MIN GXV_PROP_HEADER_SIZE
typedef struct GXV_prop_DataRec_
{
FT_Fixed version;
} GXV_prop_DataRec, *GXV_prop_Data;
#define GXV_PROP_DATA( field ) GXV_TABLE_DATA( prop, field )
#define GXV_PROP_FLOATER 0x8000U
#define GXV_PROP_USE_COMPLEMENTARY_BRACKET 0x1000U
#define GXV_PROP_COMPLEMENTARY_BRACKET_OFFSET 0x0F00U
#define GXV_PROP_ATTACHING_TO_RIGHT 0x0080U
#define GXV_PROP_RESERVED 0x0060U
#define GXV_PROP_DIRECTIONALITY_CLASS 0x001FU
/*************************************************************************/
/*************************************************************************/
/***** *****/
/***** UTILITY FUNCTIONS *****/
/***** *****/
/*************************************************************************/
/*************************************************************************/
static void
gxv_prop_zero_advance_validate( FT_UShort gid,
GXV_Validator gxvalid )
{
FT_Face face;
FT_Error error;
FT_GlyphSlot glyph;
GXV_NAME_ENTER( "zero advance" );
face = gxvalid->face;
error = FT_Load_Glyph( face,
gid,
FT_LOAD_IGNORE_TRANSFORM );
if ( error )
FT_INVALID_GLYPH_ID;
glyph = face->glyph;
if ( glyph->advance.x != (FT_Pos)0 ||
glyph->advance.y != (FT_Pos)0 )
{
GXV_TRACE(( " found non-zero advance in zero-advance glyph\n" ));
FT_INVALID_DATA;
}
GXV_EXIT;
}
/* Pass 0 as GLYPH to check the default property */
static void
gxv_prop_property_validate( FT_UShort property,
FT_UShort glyph,
GXV_Validator gxvalid )
{
if ( glyph != 0 && ( property & GXV_PROP_FLOATER ) )
gxv_prop_zero_advance_validate( glyph, gxvalid );
if ( property & GXV_PROP_USE_COMPLEMENTARY_BRACKET )
{
FT_UShort offset;
char complement;
offset = (FT_UShort)( property & GXV_PROP_COMPLEMENTARY_BRACKET_OFFSET );
if ( offset == 0 )
{
GXV_TRACE(( " found zero offset to property\n" ));
FT_INVALID_OFFSET;
}
complement = (char)( offset >> 8 );
if ( complement & 0x08 )
{
/* Top bit is set: negative */
/* Calculate the absolute offset */
complement = (char)( ( complement & 0x07 ) + 1 );
/* The gid for complement must be greater than 0 */
if ( glyph <= complement )
{
GXV_TRACE(( " found non-positive glyph complement\n" ));
FT_INVALID_DATA;
}
}
else
{
/* The gid for complement must be in the face. */
gxv_glyphid_validate( (FT_UShort)( glyph + complement ), gxvalid );
}
}
else
{
if ( property & GXV_PROP_COMPLEMENTARY_BRACKET_OFFSET )
GXV_TRACE(( "glyph %d cannot have complementary bracketing\n",
glyph ));
}
/* this is introduced in version 2.0 */
if ( property & GXV_PROP_ATTACHING_TO_RIGHT )
{
if ( GXV_PROP_DATA( version ) == 0x00010000UL )
{
GXV_TRACE(( " found older version (1.0) in new version table\n" ));
FT_INVALID_DATA;
}
}
if ( property & GXV_PROP_RESERVED )
{
GXV_TRACE(( " found non-zero bits in reserved bits\n" ));
FT_INVALID_DATA;
}
if ( ( property & GXV_PROP_DIRECTIONALITY_CLASS ) > 11 )
{
/* TODO: Too restricted. Use the validation level. */
if ( GXV_PROP_DATA( version ) == 0x00010000UL ||
GXV_PROP_DATA( version ) == 0x00020000UL )
{
GXV_TRACE(( " found too old version in directionality class\n" ));
FT_INVALID_DATA;
}
}
}
static void
gxv_prop_LookupValue_validate( FT_UShort glyph,
GXV_LookupValueCPtr value_p,
GXV_Validator gxvalid )
{
gxv_prop_property_validate( value_p->u, glyph, gxvalid );
}
/*
+===============+ --------+
| lookup header | |
+===============+ |
| BinSrchHeader | |
+===============+ |
| lastGlyph[0] | |
+---------------+ |
| firstGlyph[0] | | head of lookup table
+---------------+ | +
| offset[0] | -> | offset [byte]
+===============+ | +
| lastGlyph[1] | | (glyphID - firstGlyph) * 2 [byte]
+---------------+ |
| firstGlyph[1] | |
+---------------+ |
| offset[1] | |
+===============+ |
|
... |
|
16bit value array |
+===============+ |
| value | <-------+
...
*/
static GXV_LookupValueDesc
gxv_prop_LookupFmt4_transit( FT_UShort relative_gindex,
GXV_LookupValueCPtr base_value_p,
FT_Bytes lookuptbl_limit,
GXV_Validator gxvalid )
{
FT_Bytes p;
FT_Bytes limit;
FT_UShort offset;
GXV_LookupValueDesc value;
/* XXX: check range? */
offset = (FT_UShort)( base_value_p->u +
relative_gindex * sizeof ( FT_UShort ) );
p = gxvalid->lookuptbl_head + offset;
limit = lookuptbl_limit;
GXV_LIMIT_CHECK ( 2 );
value.u = FT_NEXT_USHORT( p );
return value;
}
/*************************************************************************/
/*************************************************************************/
/***** *****/
/***** prop TABLE *****/
/***** *****/
/*************************************************************************/
/*************************************************************************/
FT_LOCAL_DEF( void )
gxv_prop_validate( FT_Bytes table,
FT_Face face,
FT_Validator ftvalid )
{
FT_Bytes p = table;
FT_Bytes limit = 0;
GXV_ValidatorRec gxvalidrec;
GXV_Validator gxvalid = &gxvalidrec;
GXV_prop_DataRec proprec;
GXV_prop_Data prop = &proprec;
FT_Fixed version;
FT_UShort format;
FT_UShort defaultProp;
gxvalid->root = ftvalid;
gxvalid->table_data = prop;
gxvalid->face = face;
FT_TRACE3(( "validating `prop' table\n" ));
GXV_INIT;
GXV_LIMIT_CHECK( 4 + 2 + 2 );
version = FT_NEXT_LONG( p );
format = FT_NEXT_USHORT( p );
defaultProp = FT_NEXT_USHORT( p );
GXV_TRACE(( " version 0x%08x\n", version ));
GXV_TRACE(( " format 0x%04x\n", format ));
GXV_TRACE(( " defaultProp 0x%04x\n", defaultProp ));
/* only versions 1.0, 2.0, 3.0 are defined (1996) */
if ( version != 0x00010000UL &&
version != 0x00020000UL &&
version != 0x00030000UL )
{
GXV_TRACE(( " found unknown version\n" ));
FT_INVALID_FORMAT;
}
/* only formats 0x0000, 0x0001 are defined (1996) */
if ( format > 1 )
{
GXV_TRACE(( " found unknown format\n" ));
FT_INVALID_FORMAT;
}
gxv_prop_property_validate( defaultProp, 0, gxvalid );
if ( format == 0 )
{
FT_TRACE3(( "(format 0, no per-glyph properties, "
"remaining %d bytes are skipped)", limit - p ));
goto Exit;
}
/* format == 1 */
GXV_PROP_DATA( version ) = version;
gxvalid->lookupval_sign = GXV_LOOKUPVALUE_UNSIGNED;
gxvalid->lookupval_func = gxv_prop_LookupValue_validate;
gxvalid->lookupfmt4_trans = gxv_prop_LookupFmt4_transit;
gxv_LookupTable_validate( p, limit, gxvalid );
Exit:
FT_TRACE4(( "\n" ));
}
/* END */
```
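The per-glyph property word that `gxv_prop_property_validate` checks packs several bit fields. A standalone sketch of the decoding follows; the struct and field names are mine, but the masks mirror the `GXV_PROP_*` macros defined in the source above.

```cpp
#include <cstdint>

// Masks copied from the GXV_PROP_* macros in gxvprop.c.
constexpr std::uint16_t kFloater             = 0x8000u;
constexpr std::uint16_t kUseComplBracket     = 0x1000u;
constexpr std::uint16_t kComplBracketOffset  = 0x0F00u;
constexpr std::uint16_t kAttachingToRight    = 0x0080u;
constexpr std::uint16_t kReserved            = 0x0060u;
constexpr std::uint16_t kDirectionalityClass = 0x001Fu;

// Decoded view of one 16-bit 'prop' property value (names are illustrative).
struct PropBits {
    bool         floater;
    bool         use_complementary_bracket;
    std::uint8_t complementary_bracket_offset;  // raw 4-bit field, sign bit 0x08
    bool         attaching_to_right;
    bool         reserved_nonzero;              // must be zero in a valid table
    std::uint8_t directionality_class;          // validator accepts 0..11
};

inline PropBits decode_prop(std::uint16_t property) {
    PropBits b;
    b.floater                      = (property & kFloater) != 0;
    b.use_complementary_bracket    = (property & kUseComplBracket) != 0;
    b.complementary_bracket_offset =
        static_cast<std::uint8_t>((property & kComplBracketOffset) >> 8);
    b.attaching_to_right           = (property & kAttachingToRight) != 0;
    b.reserved_nonzero             = (property & kReserved) != 0;
    b.directionality_class         =
        static_cast<std::uint8_t>(property & kDirectionalityClass);
    return b;
}
```

This is the same extraction the validator performs inline, gathered into one place for readability.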
|
Mohamed Hamid Hazzaz (30 November 1945 – 13 January 2018), also known as Hamid El-Hazzaz, was a Moroccan football goalkeeper. Hazzaz spent his entire career playing for Maghreb de Fès.
Club career
Hazzaz joined the MAS Fez first team in 1962. He won the Moroccan league with the team three times, in 1965, 1979, and 1983, and the Throne Cup once in 1980. He retired in 1984.
International career
Hazzaz first played for Morocco against Tunisia in 1968 as part of the 1970 FIFA World Cup qualifiers. He was named to the squad for the 1970 FIFA World Cup, where he played in Morocco's final group match against Bulgaria. The match finished in a 1–1 draw, making Morocco the first African side to avoid defeat at the World Cup.
Hazzaz represented Morocco at the 1972 Summer Olympics in Munich and the 1976 Africa Cup of Nations. He also played in the 1976 Arab Games and the 1978 Africa Cup of Nations. In both the 1976 and 1978 Africa Cup of Nations, he was chosen as the goalkeeper for the Team of the Tournament. He retired from international football in 1979.
References
External links
1945 births
2018 deaths
Moroccan men's footballers
Morocco men's international footballers
Men's association football goalkeepers
Maghreb de Fès players
Botola players
1970 FIFA World Cup players
Footballers at the 1972 Summer Olympics
Olympic footballers for Morocco
1976 African Cup of Nations players
1978 African Cup of Nations players
Footballers from Fez, Morocco
Africa Cup of Nations-winning players
|
Asselineau is a French surname. Notable people with the surname include:
Charles Asselineau (1820–1874), French writer
François Asselineau (born 1957), French politician and civil servant
(born 1965), French ice hockey player
(1808–1889), French painter
Roger Asselineau (1915–2002), French scholar and writer
French-language surnames
|
Calais is a city in France. The name can also refer to:
Places
Calais, Maine, United States, a city
Calais, Vermont, United States, a town
Arrondissement of Calais, France
Calais (constituency), the electoral area of Calais, France, represented in the Parliament of England before the French reconquest in the 16th century
Calais, Alberta, Canada, an unincorporated community
Calais, Limpopo, South Africa, a village
Mount Calais, Alexander Island, Antarctica
Calais, a crater on Saturn's moon Phoebe
Automobiles
Cadillac Calais
Oldsmobile Cutlass Calais
Holden Calais
Given name
Calaïs, one of the Boreads in Greek mythology
Saint Calais, French hermit-saint, namesake of the commune of Saint-Calais
Calais Campbell (born 1986), American National Football League player
Other uses
Calais (beetle), a genus of click beetles
Calais (Reuters product), an internet toolkit
Calais RUFC, a former French football club
Calais Railroad, Maine
|
- Debugging `ssh` client issues
- `Firewall` as a service
- Quick port test with `netcat`
- Find services running on your host
- Disable `IPv6`
|
```html
<html lang="en">
<head>
<title>Virtual Base Classes - STABS</title>
<meta http-equiv="Content-Type" content="text/html">
<meta name="description" content="STABS">
<meta name="generator" content="makeinfo 4.11">
<link title="Top" rel="start" href="index.html#Top">
<link rel="up" href="Cplusplus.html#Cplusplus" title="Cplusplus">
<link rel="prev" href="Inheritance.html#Inheritance" title="Inheritance">
<link rel="next" href="Static-Members.html#Static-Members" title="Static Members">
<link href="path_to_url" rel="generator-home" title="Texinfo Homepage">
<!--
Contributed by Cygnus Support. Written by Julia Menapace, Jim Kingdon,
and David MacKenzie.
Permission is granted to copy, distribute and/or modify this document
under the terms of the GNU Free Documentation License, Version 1.1 or
any later version published by the Free Software Foundation; with no
Invariant Sections, with no Front-Cover Texts, and with no Back-Cover
Texts. A copy of the license is included in the section entitled ``GNU
Free Documentation License''.-->
<meta http-equiv="Content-Style-Type" content="text/css">
<style type="text/css"><!--
pre.display { font-family:inherit }
pre.format { font-family:inherit }
pre.smalldisplay { font-family:inherit; font-size:smaller }
pre.smallformat { font-family:inherit; font-size:smaller }
pre.smallexample { font-size:smaller }
pre.smalllisp { font-size:smaller }
span.sc { font-variant:small-caps }
span.roman { font-family:serif; font-weight:normal; }
span.sansserif { font-family:sans-serif; font-weight:normal; }
--></style>
</head>
<body>
<div class="node">
<p>
<a name="Virtual-Base-Classes"></a>
Next: <a rel="next" accesskey="n" href="Static-Members.html#Static-Members">Static Members</a>,
Previous: <a rel="previous" accesskey="p" href="Inheritance.html#Inheritance">Inheritance</a>,
Up: <a rel="up" accesskey="u" href="Cplusplus.html#Cplusplus">Cplusplus</a>
<hr>
</div>
<h3 class="section">8.13 Virtual Base Classes</h3>
<p>A derived class object consists of a concatenation in memory of the data
areas defined by each base class, starting with the leftmost and ending
with the rightmost in the list of base classes. The exception to this
rule is for virtual inheritance. In the example above, class <code>D</code>
inherits virtually from base class <code>B</code>. This means that an
instance of a <code>D</code> object will not contain its own <code>B</code> part but
merely a pointer to a <code>B</code> part, known as a virtual base pointer.
<p>In a derived class stab, the base offset part of the derivation
information, described above, shows how the base class parts are
ordered. The base offset for a virtual base class is always given as 0.
Notice that the base offset for <code>B</code> is given as 0 even though
<code>B</code> is not the first base class. The first base class <code>A</code>
starts at offset 0.
<p>The field information part of the stab for class <code>D</code> describes the field
which is the pointer to the virtual base class <code>B</code>. The vbase pointer
name is ‘<samp><span class="samp">$vb</span></samp>’ followed by a type reference to the virtual base class.
Since the type id for <code>B</code> in this example is 25, the vbase pointer name
is ‘<samp><span class="samp">$vb25</span></samp>’.
<!-- FIXME!! fake linebreaks below -->
<pre class="smallexample"> .stabs "D:Tt31=s32!3,000,20;100,25;0264,28;$vb25:24,128;Ddat:1,
160,32;A_virt::32=##1;:i;2A*-2147483647;20;;B_virt::32:i;
2A*-2147483647;25;;C_virt::32:i;2A*-2147483647;28;;D_virt:
:32:i;2A*-2147483646;31;;;~%20;",128,0,0,0
</pre>
<p>Following the name and a semicolon is a type reference describing the
type of the virtual base class pointer, in this case 24. Type 24 was
defined earlier as the type of the <code>B</code> class <code>this</code> pointer. The
<code>this</code> pointer for a class is a pointer to the class type.
<pre class="example"> .stabs "this:P24=*25=xsB:",64,0,0,8
</pre>
<p>Finally the field offset part of the vbase pointer field description
shows that the vbase pointer is the first field in the <code>D</code> object,
before any data fields defined by the class. The layout of a <code>D</code>
class object is as follows: <code>Adat</code> at 0, the vtable pointer for
<code>A</code> at 32, <code>Cdat</code> at 64, the vtable pointer for <code>C</code> at 96, the
virtual base pointer for <code>B</code> at 128, and <code>Ddat</code> at 160.
</body></html>
```
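The layout discussed above can be reconstructed in a few lines of C++. This is a minimal sketch of the hierarchy the section describes (member values and the exact base order are illustrative, and real offsets are ABI-dependent): `D` inherits virtually from `B`, so a `D` object holds a virtual base pointer, the `$vb25` field in the stab, rather than an embedded `B` subobject.

```cpp
// A, B, C each carry one data member and a virtual function, as in the text.
struct A { int Adat = 1; virtual void A_virt() {} };
struct B { int Bdat = 2; virtual void B_virt() {} };
struct C { int Cdat = 3; virtual void C_virt() {} };

// D inherits virtually from B, so it stores only a pointer to its B part.
struct D : A, virtual B, C { int Ddat = 4; virtual void D_virt() {} };

// Upcasting a D* to B* must go through the virtual base pointer at run time,
// not a fixed compile-time offset.
inline int bdat_via_virtual_base(D& d) {
    B* pb = &d;  // resolved via the vbase pointer
    return pb->Bdat;
}
```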
|
```cpp
// This file is part of Eigen, a lightweight C++ template library
// for linear algebra.
//
//
// This Source Code Form is subject to the terms of the Mozilla
// Public License, v. 2.0. If a copy of the MPL was not distributed
// with this file, You can obtain one at path_to_url
#ifndef EIGEN_GEOMETRY_SSE_H
#define EIGEN_GEOMETRY_SSE_H
namespace Eigen {
namespace internal {
template<class Derived, class OtherDerived>
struct quat_product<Architecture::SSE, Derived, OtherDerived, float, Aligned16>
{
static inline Quaternion<float> run(const QuaternionBase<Derived>& _a, const QuaternionBase<OtherDerived>& _b)
{
Quaternion<float> res;
const __m128 mask = _mm_setr_ps(0.f,0.f,0.f,-0.f);
__m128 a = _a.coeffs().template packet<Aligned16>(0);
__m128 b = _b.coeffs().template packet<Aligned16>(0);
__m128 s1 = _mm_mul_ps(vec4f_swizzle1(a,1,2,0,2),vec4f_swizzle1(b,2,0,1,2));
__m128 s2 = _mm_mul_ps(vec4f_swizzle1(a,3,3,3,1),vec4f_swizzle1(b,0,1,2,1));
pstore(&res.x(),
_mm_add_ps(_mm_sub_ps(_mm_mul_ps(a,vec4f_swizzle1(b,3,3,3,3)),
_mm_mul_ps(vec4f_swizzle1(a,2,0,1,0),
vec4f_swizzle1(b,1,2,0,0))),
_mm_xor_ps(mask,_mm_add_ps(s1,s2))));
return res;
}
};
template<class Derived, int Alignment>
struct quat_conj<Architecture::SSE, Derived, float, Alignment>
{
static inline Quaternion<float> run(const QuaternionBase<Derived>& q)
{
Quaternion<float> res;
const __m128 mask = _mm_setr_ps(-0.f,-0.f,-0.f,0.f);
pstore(&res.x(), _mm_xor_ps(mask, q.coeffs().template packet<Alignment>(0)));
return res;
}
};
template<typename VectorLhs,typename VectorRhs>
struct cross3_impl<Architecture::SSE,VectorLhs,VectorRhs,float,true>
{
static inline typename plain_matrix_type<VectorLhs>::type
run(const VectorLhs& lhs, const VectorRhs& rhs)
{
__m128 a = lhs.template packet<traits<VectorLhs>::Alignment>(0);
__m128 b = rhs.template packet<traits<VectorRhs>::Alignment>(0);
__m128 mul1=_mm_mul_ps(vec4f_swizzle1(a,1,2,0,3),vec4f_swizzle1(b,2,0,1,3));
__m128 mul2=_mm_mul_ps(vec4f_swizzle1(a,2,0,1,3),vec4f_swizzle1(b,1,2,0,3));
typename plain_matrix_type<VectorLhs>::type res;
pstore(&res.x(),_mm_sub_ps(mul1,mul2));
return res;
}
};
template<class Derived, class OtherDerived, int Alignment>
struct quat_product<Architecture::SSE, Derived, OtherDerived, double, Alignment>
{
static inline Quaternion<double> run(const QuaternionBase<Derived>& _a, const QuaternionBase<OtherDerived>& _b)
{
const Packet2d mask = _mm_castsi128_pd(_mm_set_epi32(0x0,0x0,0x80000000,0x0));
Quaternion<double> res;
const double* a = _a.coeffs().data();
Packet2d b_xy = _b.coeffs().template packet<Alignment>(0);
Packet2d b_zw = _b.coeffs().template packet<Alignment>(2);
Packet2d a_xx = pset1<Packet2d>(a[0]);
Packet2d a_yy = pset1<Packet2d>(a[1]);
Packet2d a_zz = pset1<Packet2d>(a[2]);
Packet2d a_ww = pset1<Packet2d>(a[3]);
// two temporaries:
Packet2d t1, t2;
/*
* t1 = ww*xy + yy*zw
* t2 = zz*xy - xx*zw
* res.xy = t1 +/- swap(t2)
*/
t1 = padd(pmul(a_ww, b_xy), pmul(a_yy, b_zw));
t2 = psub(pmul(a_zz, b_xy), pmul(a_xx, b_zw));
#ifdef EIGEN_VECTORIZE_SSE3
EIGEN_UNUSED_VARIABLE(mask)
pstore(&res.x(), _mm_addsub_pd(t1, preverse(t2)));
#else
pstore(&res.x(), padd(t1, pxor(mask,preverse(t2))));
#endif
/*
* t1 = ww*zw - yy*xy
* t2 = zz*zw + xx*xy
* res.zw = t1 -/+ swap(t2) = swap( swap(t1) +/- t2)
*/
t1 = psub(pmul(a_ww, b_zw), pmul(a_yy, b_xy));
t2 = padd(pmul(a_zz, b_zw), pmul(a_xx, b_xy));
#ifdef EIGEN_VECTORIZE_SSE3
EIGEN_UNUSED_VARIABLE(mask)
pstore(&res.z(), preverse(_mm_addsub_pd(preverse(t1), t2)));
#else
pstore(&res.z(), psub(t1, pxor(mask,preverse(t2))));
#endif
return res;
}
};
template<class Derived, int Alignment>
struct quat_conj<Architecture::SSE, Derived, double, Alignment>
{
static inline Quaternion<double> run(const QuaternionBase<Derived>& q)
{
Quaternion<double> res;
const __m128d mask0 = _mm_setr_pd(-0.,-0.);
const __m128d mask2 = _mm_setr_pd(-0.,0.);
pstore(&res.x(), _mm_xor_pd(mask0, q.coeffs().template packet<Alignment>(0)));
pstore(&res.z(), _mm_xor_pd(mask2, q.coeffs().template packet<Alignment>(2)));
return res;
}
};
} // end namespace internal
} // end namespace Eigen
#endif // EIGEN_GEOMETRY_SSE_H
```
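A scalar reference implementation is a useful cross-check for SIMD kernels like the ones above. The sketch below computes the same Hamilton product and conjugate, with coefficients stored (x, y, z, w) as in Eigen's `Quaternion`; the type alias and function names are mine, and this is plain portable C++ rather than Eigen's vectorized path.

```cpp
#include <array>

using Quat = std::array<double, 4>;  // {x, y, z, w}

// Hamilton product a * b, term-for-term what the SSE kernels compute.
inline Quat quat_mul(const Quat& a, const Quat& b) {
    return {
        a[3]*b[0] + a[0]*b[3] + a[1]*b[2] - a[2]*b[1],  // x
        a[3]*b[1] + a[1]*b[3] + a[2]*b[0] - a[0]*b[2],  // y
        a[3]*b[2] + a[2]*b[3] + a[0]*b[1] - a[1]*b[0],  // z
        a[3]*b[3] - a[0]*b[0] - a[1]*b[1] - a[2]*b[2],  // w
    };
}

// Conjugate: negate the vector part, matching the sign masks in quat_conj.
inline Quat quat_conj_ref(const Quat& q) {
    return { -q[0], -q[1], -q[2], q[3] };
}
```

Checking an optimized kernel against a reference like this (e.g. `i * j == k`, identity element, conjugate signs) is a common way to validate the swizzle and sign-mask choices in the SIMD code.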
|
```typescript
import axios from '@nextcloud/axios'
import { getCurrentUser } from '@nextcloud/auth'
import { loadState } from '@nextcloud/initial-state'
import { generateOcsUrl } from '@nextcloud/router'
import { defineComponent } from 'vue'
export default defineComponent({
props: {
resourceId: {
type: Number,
required: true,
},
resourceType: {
type: String,
default: 'files',
},
},
data() {
return {
editorData: {
actorDisplayName: getCurrentUser()!.displayName as string,
actorId: getCurrentUser()!.uid as string,
key: 'editor',
},
userData: {},
}
},
methods: {
/**
* Autocomplete @mentions
*
* @param {string} search the query
* @param {Function} callback the callback to process the results with
*/
async autoComplete(search, callback) {
const { data } = await axios.get(generateOcsUrl('core/autocomplete/get'), {
params: {
search,
itemType: 'files',
itemId: this.resourceId,
sorter: 'commenters|share-recipients',
limit: loadState('comments', 'maxAutoCompleteResults'),
},
})
// Save user data so it can be used by the editor to replace mentions
data.ocs.data.forEach(user => { this.userData[user.id] = user })
return callback(Object.values(this.userData))
},
/**
* Make sure we have all mentions as Array of objects
*
* @param mentions the mentions list
*/
// eslint-disable-next-line @typescript-eslint/no-explicit-any
genMentionsData(mentions: any[]): Record<string, object> {
Object.values(mentions)
.flat()
.forEach(mention => {
this.userData[mention.mentionId] = {
// TODO: support groups
icon: 'icon-user',
id: mention.mentionId,
label: mention.mentionDisplayName,
source: 'users',
primary: getCurrentUser()?.uid === mention.mentionId,
}
})
return this.userData
},
},
})
```
|
```kotlin
package com.apollographql.apollo.ast
import okio.FileSystem
internal actual val HOST_FILESYSTEM: FileSystem
get() = TODO("Not yet implemented")
```
|
```xml
<?xml version="1.0" encoding="UTF-8"?>
<definitions id="definitions"
xmlns="path_to_url"
xmlns:flowable="path_to_url"
targetNamespace="Examples"
xmlns:tns="Examples">
<process id="process">
<startEvent id="theStart" />
<sequenceFlow sourceRef="theStart" targetRef="task" />
<userTask id="task" />
<sequenceFlow sourceRef="task" targetRef="sendEventTask" />
<serviceTask id="sendEventTask" flowable:type="send-event">
<extensionElements>
<flowable:eventType>myEvent</flowable:eventType>
<flowable:channelKey>out-channel</flowable:channelKey>
<flowable:eventInParameter source="test" target="eventProperty" />
</extensionElements>
<multiInstanceLoopCharacteristics isSequential="false">
<loopCardinality>3</loopCardinality>
</multiInstanceLoopCharacteristics>
</serviceTask>
<sequenceFlow sourceRef="sendEventTask" targetRef="taskAfter" />
<userTask id="taskAfter" />
<sequenceFlow sourceRef="taskAfter" targetRef="theEnd" />
<endEvent id="theEnd" />
</process>
</definitions>
```
|
```java
/*
This file is part of the iText (R) project.
Authors: Apryse Software.
This program is offered under a commercial and under the AGPL license.
For commercial licensing, contact us at path_to_url For AGPL licensing, see below.
AGPL licensing:
This program is free software: you can redistribute it and/or modify
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
along with this program. If not, see <path_to_url
*/
package com.itextpdf.layout.renderer;
import com.itextpdf.io.font.otf.Glyph;
import com.itextpdf.io.font.otf.GlyphLine;
import com.itextpdf.io.util.TextUtil;
import com.itextpdf.kernel.font.PdfFont;
import com.itextpdf.kernel.font.PdfFontFactory;
import com.itextpdf.test.ExtendedITextTest;
import java.io.IOException;
import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.Tag;
@Tag("UnitTest")
public class TextPreprocessingUtilTest extends ExtendedITextTest {
private static PdfFont pdfFont;
@BeforeAll
public static void initializeFont() throws IOException {
pdfFont = PdfFontFactory.createFont();
}
@Test
public void enSpaceTest() {
specialWhitespaceGlyphTest('\u2002');
}
@Test
public void emSpaceTest() {
specialWhitespaceGlyphTest('\u2003');
}
@Test
public void thinSpaceTest() {
specialWhitespaceGlyphTest('\u2009');
}
@Test
public void horizontalTabulationTest() {
specialWhitespaceGlyphTest('\t');
}
@Test
public void regularSymbolTest() {
GlyphLine glyphLine = new GlyphLine();
Glyph regularGlyph = pdfFont.getGlyph('a');
glyphLine.add(0, regularGlyph);
TextPreprocessingUtil.replaceSpecialWhitespaceGlyphs(glyphLine, pdfFont);
Glyph glyph = glyphLine.get(0);
Assertions.assertEquals(regularGlyph, glyph);
}
private void specialWhitespaceGlyphTest(int unicode) {
GlyphLine glyphLine = new GlyphLine();
// Create a new glyph, because it is a special glyph, and it is not contained in the regular font
glyphLine.add(0, new Glyph(0, unicode));
TextPreprocessingUtil.replaceSpecialWhitespaceGlyphs(glyphLine, pdfFont);
Glyph glyph = glyphLine.get(0);
Glyph space = pdfFont.getGlyph('\u0020');
Assertions.assertEquals(space.getCode(), glyph.getCode());
Assertions.assertEquals(space.getWidth(), glyph.getWidth());
Assertions.assertEquals(space.getUnicode(), glyph.getUnicode());
Assertions.assertArrayEquals(TextUtil.convertFromUtf32(unicode), glyph.getChars());
}
}
```
|
Jack Henderson is an American thriller writer. He writes the series of novels featuring hacker John Fagan, aka phr33k, and FBI agent Jeannie Reese.
Biography
Henderson was born in Springfield, Missouri, on October 3, 1958, and was brought up in Buffalo, Missouri.
Henderson did not complete a university degree, although he did complete two years of further education in Performing Arts before leaving college to work in regional theatre. While finding work in this capacity in New York City, Henderson started writing technical documentation for the medical industry, requiring him to perform a great deal of research into the topics discussed.
Henderson has stated that he put in "nearly a year of research and fact-finding" before starting to write his first novel, Maximum Impact (known as Circumference of Darkness in the US). He has since written a follow-up novel, Seven Seconds, and ghost-wrote the book The Overton Window for political commentator Glenn Beck. The Overton Window contains marked similarities to Circumference of Darkness.
Bibliography
The John Fagan series
Maximum Impact (2007, Sphere) (aka Circumference of Darkness (Bantam) in the US)
Seven Seconds (2009, Sphere)
Other works
The Overton Window (2010, Threshold)
References
External links
1958 births
Living people
American thriller writers
Writers from Springfield, Missouri
People from Buffalo, Missouri
Novelists from New York City
American male novelists
Novelists from Missouri
|
```c
/*
*
* in the file LICENSE in the source distribution or at
* path_to_url
*/
#include <openssl/asn1.h>
#include <openssl/asn1t.h>
#include <openssl/bio.h>
#include <openssl/err.h>
#include <stdio.h>
/* Experimental NDEF ASN1 BIO support routines */
/*
* The usage is quite simple, initialize an ASN1 structure, get a BIO from it
* then any data written through the BIO will end up translated to
* appropriate format on the fly. The data is streamed out and does *not*
* need to be all held in memory at once. When the BIO is flushed the output
* is finalized and any signatures etc written out. The BIO is a 'proper'
* BIO and can handle non blocking I/O correctly. The usage is simple. The
* implementation is *not*...
*/
/* BIO support data stored in the ASN1 BIO ex_arg */
typedef struct ndef_aux_st {
/* ASN1 structure this BIO refers to */
ASN1_VALUE *val;
const ASN1_ITEM *it;
/* Top of the BIO chain */
BIO *ndef_bio;
/* Output BIO */
BIO *out;
/* Boundary where content is inserted */
unsigned char **boundary;
/* DER buffer start */
unsigned char *derbuf;
} NDEF_SUPPORT;
static int ndef_prefix(BIO *b, unsigned char **pbuf, int *plen, void *parg);
static int ndef_prefix_free(BIO *b, unsigned char **pbuf, int *plen,
void *parg);
static int ndef_suffix(BIO *b, unsigned char **pbuf, int *plen, void *parg);
static int ndef_suffix_free(BIO *b, unsigned char **pbuf, int *plen,
void *parg);
/*
* On success, the returned BIO owns the input BIO as part of its BIO chain.
* On failure, NULL is returned and the input BIO is owned by the caller.
*
* Unfortunately cannot constify this due to CMS_stream() and PKCS7_stream()
*/
BIO *BIO_new_NDEF(BIO *out, ASN1_VALUE *val, const ASN1_ITEM *it)
{
NDEF_SUPPORT *ndef_aux = NULL;
BIO *asn_bio = NULL;
const ASN1_AUX *aux = it->funcs;
ASN1_STREAM_ARG sarg;
BIO *pop_bio = NULL;
if (!aux || !aux->asn1_cb) {
ERR_raise(ERR_LIB_ASN1, ASN1_R_STREAMING_NOT_SUPPORTED);
return NULL;
}
ndef_aux = OPENSSL_zalloc(sizeof(*ndef_aux));
asn_bio = BIO_new(BIO_f_asn1());
if (ndef_aux == NULL || asn_bio == NULL)
goto err;
/* ASN1 bio needs to be next to output BIO */
out = BIO_push(asn_bio, out);
if (out == NULL)
goto err;
pop_bio = asn_bio;
if (BIO_asn1_set_prefix(asn_bio, ndef_prefix, ndef_prefix_free) <= 0
|| BIO_asn1_set_suffix(asn_bio, ndef_suffix, ndef_suffix_free) <= 0
|| BIO_ctrl(asn_bio, BIO_C_SET_EX_ARG, 0, ndef_aux) <= 0)
goto err;
/*
* Now let the callback prepend any digest, cipher, etc., that the BIO's
* ASN1 structure needs.
*/
sarg.out = out;
sarg.ndef_bio = NULL;
sarg.boundary = NULL;
/*
* The asn1_cb() must not have mutated asn_bio on error, leaving it in the
* middle of some partially built, but not returned, BIO chain.
*/
if (aux->asn1_cb(ASN1_OP_STREAM_PRE, &val, it, &sarg) <= 0) {
/*
* ndef_aux is now owned by asn_bio so we must not free it in the err
* clean up block
*/
ndef_aux = NULL;
goto err;
}
/*
* We must not fail now because the callback has prepended additional
* BIOs to the chain
*/
ndef_aux->val = val;
ndef_aux->it = it;
ndef_aux->ndef_bio = sarg.ndef_bio;
ndef_aux->boundary = sarg.boundary;
ndef_aux->out = out;
return sarg.ndef_bio;
err:
/* BIO_pop() is NULL safe */
(void)BIO_pop(pop_bio);
BIO_free(asn_bio);
OPENSSL_free(ndef_aux);
return NULL;
}
static int ndef_prefix(BIO *b, unsigned char **pbuf, int *plen, void *parg)
{
NDEF_SUPPORT *ndef_aux;
unsigned char *p;
int derlen;
if (parg == NULL)
return 0;
ndef_aux = *(NDEF_SUPPORT **)parg;
derlen = ASN1_item_ndef_i2d(ndef_aux->val, NULL, ndef_aux->it);
if (derlen < 0)
return 0;
if ((p = OPENSSL_malloc(derlen)) == NULL)
return 0;
ndef_aux->derbuf = p;
*pbuf = p;
ASN1_item_ndef_i2d(ndef_aux->val, &p, ndef_aux->it);
if (*ndef_aux->boundary == NULL)
return 0;
*plen = *ndef_aux->boundary - *pbuf;
return 1;
}
static int ndef_prefix_free(BIO *b, unsigned char **pbuf, int *plen,
void *parg)
{
NDEF_SUPPORT *ndef_aux;
if (parg == NULL)
return 0;
ndef_aux = *(NDEF_SUPPORT **)parg;
if (ndef_aux == NULL)
return 0;
OPENSSL_free(ndef_aux->derbuf);
ndef_aux->derbuf = NULL;
*pbuf = NULL;
*plen = 0;
return 1;
}
static int ndef_suffix_free(BIO *b, unsigned char **pbuf, int *plen,
void *parg)
{
NDEF_SUPPORT **pndef_aux = (NDEF_SUPPORT **)parg;
if (!ndef_prefix_free(b, pbuf, plen, parg))
return 0;
OPENSSL_free(*pndef_aux);
*pndef_aux = NULL;
return 1;
}
static int ndef_suffix(BIO *b, unsigned char **pbuf, int *plen, void *parg)
{
NDEF_SUPPORT *ndef_aux;
unsigned char *p;
int derlen;
const ASN1_AUX *aux;
ASN1_STREAM_ARG sarg;
if (parg == NULL)
return 0;
ndef_aux = *(NDEF_SUPPORT **)parg;
aux = ndef_aux->it->funcs;
/* Finalize structures */
sarg.ndef_bio = ndef_aux->ndef_bio;
sarg.out = ndef_aux->out;
sarg.boundary = ndef_aux->boundary;
if (aux->asn1_cb(ASN1_OP_STREAM_POST,
&ndef_aux->val, ndef_aux->it, &sarg) <= 0)
return 0;
derlen = ASN1_item_ndef_i2d(ndef_aux->val, NULL, ndef_aux->it);
if (derlen < 0)
return 0;
if ((p = OPENSSL_malloc(derlen)) == NULL)
return 0;
ndef_aux->derbuf = p;
*pbuf = p;
derlen = ASN1_item_ndef_i2d(ndef_aux->val, &p, ndef_aux->it);
if (*ndef_aux->boundary == NULL)
return 0;
*pbuf = *ndef_aux->boundary;
*plen = derlen - (*ndef_aux->boundary - ndef_aux->derbuf);
return 1;
}
```
|
Xinjiang Tianshan Leopard F.C. () is a defunct professional Chinese football club that participated in the China League One division under licence from the Chinese Football Association (CFA). The team was based in Ürümqi, Xinjiang. Its majority shareholder was the Urumqi Juntai Real Estate Co., Ltd. (Juntai Group).
The club dissolved in February 2023.
History
Hubei China-Kyle was established in December 2011 by the China-Kyle Special Steel Co., Ltd, which brought in Li Jianzhong () as chairman and Li Jun () as the club's first manager. With aid from the Hubei Football Association, the club formed a team and registered to play in the third tier of the Chinese football league system for the 2012 league season. Huangshi Stadium was chosen as the home ground and all blue as the home uniform. In their debut season they finished fourth in the South Group and advanced to the play-offs, where, after beating Hebei Zhongji and Shenzhen Fengpeng, they ultimately finished runners-up to Guizhou Zhicheng; that position nonetheless ensured promotion to the China League One division.
In its first appearance in the second division the club struggled with the higher level of finance and professionalism the division required. Before the start of the season the team could not afford plane tickets to its training camp in Dongguan, Guangdong, and had to travel there by coach. Despite the financial constraints, Li Jun was able to avoid relegation on the final day of the season when the team beat Chengdu Tiancheng 2–0 at home. At the start of the 2014 league season the club publicly declared that it was looking for investment and was willing to leave Hubei Province to obtain it. Speculation grew that the club would move to Xi'an, but talks with the city broke down. The Xinjiang Uygur Autonomous Region Sports Bureau, however, expressed interest in investing in the club, initially taking over its under-20 team. On 14 February 2014 the bureau followed through with its investment by providing the Xinjiang Sports Centre, training facilities and sponsorship, and Hubei China-Kyle moved to Xinjiang's capital city, Ürümqi, and changed its name to Xinjiang Tianshan Leopard. The club gained sponsorship from the local real estate company Urumqi Juntai Real Estate Co., Ltd. (Juntai Group), which became the club's main investor throughout the season.
In the 2018 China League One season, Xinjiang suffered a shock by finishing last in the division, but owing to the dissolutions of Dalian Transcendence and Yanbian Funde, and Zhejiang Yiteng being unable to apply for a League One licence, Xinjiang nonetheless managed to stay in League One.
The club dissolved in February 2023.
Name history
2011–2013 Hubei China-Kyle F.C. 湖北华凯尔
2014–2022 Xinjiang Tianshan Leopard 新疆天山雪豹
Managerial history
Li Jun (2012–2018)
Paul Put (2018)
Fernando (2019–2020)
Polat Kutulk (caretaker) (2020)
Pei Encai (2021–2022)
Results
All-time league rankings
As of the end of 2022 season.
In group stage.
Key
Pld = Played
W = Games won
D = Games drawn
L = Games lost
F = Goals for
A = Goals against
Pts = Points
Pos = Final position
DNQ = Did not qualify
DNE = Did not enter
NH = Not Held
– = Does Not Exist
R1 = Round 1
R2 = Round 2
R3 = Round 3
R4 = Round 4
F = Final
SF = Semi-finals
QF = Quarter-finals
R16 = Round of 16
Group = Group stage
GS2 = Second Group stage
QR1 = First Qualifying Round
QR2 = Second Qualifying Round
QR3 = Third Qualifying Round
References
Defunct football clubs in China
2014 establishments in China
2023 disestablishments in China
Association football clubs established in 2014
Association football clubs disestablished in 2023
Huangshi
Sport in Hubei
Sport in Xinjiang
|
```objective-c
//
// ZJTest2ViewController.m
// ZJScrollPageView
//
// Created by zeroj on 2017/4/26.
//
#import "ZJTest2ViewController.h"
@interface ZJTest2ViewController ()
@end
@implementation ZJTest2ViewController
- (void)viewDidLoad {
[super viewDidLoad];
self.view.backgroundColor = [UIColor orangeColor];
// Do any additional setup after loading the view.
}
- (void)viewWillAppear:(BOOL)animated {
[super viewWillAppear:animated];
NSLog(@"Test2_viewWillAppear------");
}
- (void)viewDidAppear:(BOOL)animated {
[super viewDidAppear:animated];
NSLog(@"Test2_viewDidAppear-----");
}
- (void)viewWillDisappear:(BOOL)animated {
[super viewWillDisappear:animated];
NSLog(@"Test2_viewWillDisappear-----");
}
- (void)viewDidDisappear:(BOOL)animated {
[super viewDidDisappear:animated];
NSLog(@"Test2_viewDidDisappear--------");
}
- (void)didReceiveMemoryWarning {
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
/*
#pragma mark - Navigation
// In a storyboard-based application, you will often want to do a little preparation before navigation
- (void)prepareForSegue:(UIStoryboardSegue *)segue sender:(id)sender {
// Get the new view controller using [segue destinationViewController].
// Pass the selected object to the new view controller.
}
*/
@end
```
|
```php
<?php
/*
*
*
* path_to_url
*
* Unless required by applicable law or agreed to in writing, software
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
*/
namespace Google\Service\Aiplatform;
class GoogleCloudAiplatformV1ListFeaturestoresResponse extends \Google\Collection
{
protected $collection_key = 'featurestores';
protected $featurestoresType = GoogleCloudAiplatformV1Featurestore::class;
protected $featurestoresDataType = 'array';
/**
* @var string
*/
public $nextPageToken;
/**
* @param GoogleCloudAiplatformV1Featurestore[]
*/
public function setFeaturestores($featurestores)
{
$this->featurestores = $featurestores;
}
/**
* @return GoogleCloudAiplatformV1Featurestore[]
*/
public function getFeaturestores()
{
return $this->featurestores;
}
/**
* @param string
*/
public function setNextPageToken($nextPageToken)
{
$this->nextPageToken = $nextPageToken;
}
/**
* @return string
*/
public function getNextPageToken()
{
return $this->nextPageToken;
}
}
// Adding a class alias for backwards compatibility with the previous class name.
class_alias(GoogleCloudAiplatformV1ListFeaturestoresResponse::class, 'your_sha256_hashesResponse');
```
|
```java
package com.dexvis.javafx.scene.control;
import java.util.ArrayList;
import java.util.List;
import javafx.collections.ObservableList;
import javafx.event.ActionEvent;
import javafx.scene.Scene;
import javafx.scene.control.ContextMenu;
import javafx.scene.control.Label;
import javafx.scene.control.ListCell;
import javafx.scene.control.ListView;
import javafx.scene.control.MenuItem;
import javafx.scene.control.SelectionMode;
import javafx.scene.image.ImageView;
import javafx.scene.input.ClipboardContent;
import javafx.scene.input.DragEvent;
import javafx.scene.input.Dragboard;
import javafx.scene.input.KeyCode;
import javafx.scene.input.KeyEvent;
import javafx.scene.input.MouseEvent;
import javafx.scene.input.TransferMode;
import javafx.scene.layout.HBox;
import javafx.scene.paint.Color;
import javafx.stage.Stage;
import javafx.util.Callback;
import com.dexvis.dex.DexConstants;
import com.dexvis.dex.wf.DexTask;
import com.thoughtworks.xstream.annotations.XStreamOmitField;
public class DexTaskList extends ListView<DexTaskItem> implements DexConstants
{
@XStreamOmitField
private int insertionPoint = -1;
@XStreamOmitField
private ModalText modalText;
@XStreamOmitField
private Stage stage = null;
@XStreamOmitField
private List<DexTaskItem> copyTasks = new ArrayList<DexTaskItem>();
public DexTaskList()
{
super();
setCellFactory(new Callback<ListView<DexTaskItem>, ListCell<DexTaskItem>>()
{
@Override
public ListCell<DexTaskItem> call(ListView<DexTaskItem> list)
{
return new DexTaskItemCell();
}
});
getSelectionModel().setSelectionMode(SelectionMode.MULTIPLE);
setOnKeyPressed(event -> keyPress(event));
setOnDragOver(event -> onDragOver(event));
setOnDragDropped(event -> onDragDropped(event));
ContextMenu ctxMenu = new ContextMenu();
MenuItem disableTask = new MenuItem("Disable");
MenuItem enableTask = new MenuItem("Enable");
MenuItem renameTask = new MenuItem("Rename");
MenuItem configTask = new MenuItem("Configure");
disableTask.setOnAction(action -> disableTask(action));
enableTask.setOnAction(action -> enableTask(action));
renameTask.setOnAction(action -> renameTask(action));
configTask.setOnAction(action -> configTask(action));
ctxMenu.getItems().addAll(disableTask, enableTask, renameTask, configTask);
setOnDragDetected(event -> onDragDetected(event));
setContextMenu(ctxMenu);
}
public void setStage(Stage stage)
{
this.stage = stage;
}
public class DexTaskItemCell extends ListCell<DexTaskItem>
{
private HBox hbox = new HBox();
private ImageView imageView = new ImageView();
private Label label = new Label("UNNAMED");
public DexTaskItemCell()
{
this.hbox.getChildren().addAll(imageView, label);
DexTaskItem item = getItem();
if (item != null)
{
imageView.setImage(getItem().getImage());
hbox.opacityProperty().bind(item.getOpacity());
label.textProperty().bind(item.getName());
setLabelColor(label, item);
}
setOnDragEntered(event -> onDragEntered(event));
setOnDragExited(event -> onDragExited(event));
}
private void setLabelColor(Label label, DexTaskItem item)
{
if (item != null && item.getActive() != null && item.getActive().get())
{
label.setTextFill(Color.BLACK);
}
else
{
label.setTextFill(Color.RED);
}
}
@Override
public void updateItem(DexTaskItem item, boolean empty)
{
super.updateItem(item, empty);
if (empty)
{
setText(null);
setGraphic(null);
}
else
{
if (item != null)
{
imageView.setImage(item.getImage());
label.textProperty().bind(item.getName());
setLabelColor(label, item);
hbox.opacityProperty().bind(item.getOpacity());
}
setGraphic(hbox);
}
}
public void onDragEntered(DragEvent evt)
{
System.out.println("On Cell Drag Entered");
/* the drag-and-drop gesture entered the target */
/* show to the user that it is an actual gesture target */
int index = getIndex();
insertionPoint = index;
label.setTextFill(Color.RED);
evt.consume();
}
public void onDragExited(DragEvent evt)
{
System.out.println("On Cell Drag Exited");
int index = getIndex();
insertionPoint = index;
setLabelColor(label, getItem());
evt.consume();
}
}
public void renameTask(ActionEvent evt)
{
// TODO: Replaced the ActionEvent handler with an expression and, weirdly, it
// worked. I am actually not sure why, how, or whether it truly did work or
// introduced some subtle bug that will take me hours to find later. Hence
// the TODO.
modalText = new ModalText(stage, "Change Name", "Enter New Name:",
getSelectionModel().getSelectedItem().getName().get(),
event -> changeName(event));
}
public void configTask(ActionEvent evt)
{
try
{
Stage configStage = new Stage();
DexTask task = getSelectionModel().getSelectedItem().getTask().getValue();
JsonGuiPane configGui = task.getConfigurationGui();
Scene configScene = new Scene(configGui, 800, 600);
configStage.setScene(configScene);
configStage.show();
}
catch(Exception ex)
{
ex.printStackTrace();
}
}
public void changeName(ActionEvent evt)
{
getSelectionModel().getSelectedItem().setName(modalText.getText());
}
public void enableTask(ActionEvent evt)
{
System.out.println("ENABLE EVENT: " + evt);
ObservableList<Integer> selected = getSelectionModel().getSelectedIndices();
ObservableList<DexTaskItem> items = getItems();
for (int i : selected)
{
DexTaskItem task = items.get(i);
task.setActive(true);
}
forcedRefresh();
}
public void disableTask(ActionEvent evt)
{
System.out.println("DISABLE EVENT: " + evt);
ObservableList<Integer> selected = getSelectionModel().getSelectedIndices();
ObservableList<DexTaskItem> items = getItems();
for (int i : selected)
{
DexTaskItem task = items.get(i);
task.setActive(false);
}
forcedRefresh();
}
public void enableAll()
{
for (DexTaskItem item : getItems())
{
item.setActive(true);
}
forcedRefresh();
}
public void disableAll()
{
for (DexTaskItem item : getItems())
{
item.setActive(false);
}
forcedRefresh();
}
private void forcedRefresh()
{
ObservableList<DexTaskItem> items = getItems();
setItems(null);
setItems(items);
}
public void keyPress(KeyEvent evt)
{
System.out.println("*** keypress: " + evt);
if (evt.getCode().equals(KeyCode.DELETE))
{
int delIndex = getSelectionModel().getSelectedIndex();
int size = getItems().size();
if (delIndex >= 0 && delIndex < size)
{
System.out.println("Deleting Task: " + (delIndex + 1) + " of " + size);
getItems().remove(delIndex);
}
}
else if (evt.getCode().equals(KeyCode.C) && evt.isControlDown())
{
System.out.println("Control-C");
copyTasks.clear();
ObservableList<Integer> selected = getSelectionModel()
.getSelectedIndices();
ObservableList<DexTaskItem> items = getItems();
for (int selectedIndex : selected)
{
copyTasks.add(items.get(selectedIndex).clone());
}
}
else if (evt.getCode().equals(KeyCode.V) && evt.isControlDown())
{
if (copyTasks == null)
{
return;
}
int insertionIndex = getSelectionModel().getSelectedIndex();
// Need to clone the entire list.
List<DexTaskItem> copiedTasks = new ArrayList<DexTaskItem>();
for (DexTaskItem task : copyTasks)
{
copiedTasks.add((DexTaskItem) (task.clone()));
}
getItems().addAll(insertionIndex, copiedTasks);
}
else
{
// System.out.println("Ignoring keypress");
}
}
public List<DexTaskItem> getCopyTasks()
{
return copyTasks;
}
public void setCopyTasks(List<DexTaskItem> copyTasks)
{
this.copyTasks = copyTasks;
}
public void clearCopyTasks()
{
this.copyTasks.clear();
}
public void onDragOver(DragEvent evt)
{
evt.acceptTransferModes(TransferMode.ANY);
evt.consume();
}
public void onDragDropped(DragEvent evt)
{
// System.out.println("On Drag Dropped");
Dragboard db = evt.getDragboard();
boolean success = false;
try
{
if (db.hasContent(DEX_TASK_CREATE))
{
System.out.println("DND-RECEIVING: '" + db.getContent(DEX_TASK_CREATE) + "'");
Class clazz = (Class) db.getContent(DEX_TASK_CREATE);
DexTask task = (DexTask) clazz.newInstance();
DexTaskItem item = new DexTaskItem(task);
int insertionPoint = getInsertionPoint();
System.out.println("Inserting at: " + insertionPoint + ", list size: "
+ getItems().size());
if (insertionPoint >= 0 && insertionPoint <= getItems().size())
{
getItems().add(insertionPoint - 1, item);
}
else
{
getItems().add(item);
}
success = true;
}
else if (db.hasContent(DEX_TASK_LIST_MOVE))
{
int movingTo = getInsertionPoint();
if (movingTo < 0)
{
movingTo = 0;
}
int movingFrom = (int) db.getContent(DEX_TASK_LIST_MOVE);
if (movingFrom < movingTo)
{
DexTaskItem movingItem = getItems().remove(movingFrom);
getItems().add(movingTo - 1, movingItem);
}
else if (movingFrom > movingTo)
{
DexTaskItem movingItem = getItems().remove(movingFrom);
getItems().add(movingTo, movingItem);
}
System.out.println("MOVING: " + movingFrom + "->" + movingTo);
}
// Kludgey, but resets all items to active/inactive default opacity.
List<DexTaskItem> items = getItems();
for (DexTaskItem item : items)
{
System.out.println("Setting item: " + item.getName() + " opacity="
+ (item.getActive().get() ? "1.0" : "0.5"));
item.getOpacity().set(item.getActive().get() ? 1.0 : .5);
}
}
catch(Exception ex)
{
ex.printStackTrace();
}
evt.setDropCompleted(success);
evt.consume();
}
public void onDragDetected(MouseEvent evt)
{
System.out.println("On Drag Detected");
DexTaskList source = (DexTaskList) evt.getSource();
/* drag was detected, start a drag-and-drop gesture */
/* allow any transfer mode */
int movingFrom = source.getSelectionModel().getSelectedIndex();
DexTaskItem item = source.getSelectionModel().getSelectedItem();
Dragboard db = source.startDragAndDrop(TransferMode.COPY_OR_MOVE);
/* Put a string on a dragboard */
ClipboardContent content = new ClipboardContent();
if (content != null && item != null && item.getTask() != null
&& item.getTask().get() != null)
{
content.put(DEX_TASK_LIST_MOVE, movingFrom);
db.setContent(content);
}
evt.consume();
}
public int getInsertionPoint()
{
return insertionPoint;
}
}
```
|
The Cape Post (1879-1880) was a newspaper that briefly operated in the Cape Colony.
Founding
It was founded in December 1879 by former Cape Argus editor Patrick McLoughlin as an outlet for his radical liberal opposition to British imperialism. Officially, the paper's purpose was to encourage spontaneous unity in southern Africa, to counter the Colonial Office's scheme to impose a system of British-ruled confederation on the region.
While McLoughlin served as business manager, he co-edited it with the controversial firebrand Francis Reginald Statham who had been invited to Cape Town especially for this purpose. Both men also did much of the writing. The offices were based in Cape Town.
Political controversy
Although the paper received strong support from powerful local leaders like Saul Solomon, John Molteno, Charles Fairbridge and John X. Merriman, it was under strong imperial pressure, and went against the prevailing mood in much of the Cape Colony.
At the time, the inclusive Molteno Government had just been overthrown, and British control (in the form of a proposed "Confederation") was being solidified across southern Africa. Resulting wars were flaring up from the Transvaal (First Boer War) and the Transkei (Ninth Frontier War) to Zululand (Anglo-Zulu War). Public opinion had become strongly militant in the wake of these events, and the new publication came under sustained attack, both legal and public.
Controversy arose quickly during the notorious "Koegas affair" (1879–80). This concerned the murder of San people (Bushmen) by farmers near the northern frontier. In the subsequent murder trial the farmers were acquitted, and the resulting outrage focused on Attorney General Thomas Upington. The Cape Argus and Cape Post accused Upington of deliberately allowing the trial to take place in a racist and hostile town that could be expected to acquit the murderers, due to prejudice and local influence. The culmination of the outrage was a public campaign, led by the Cape Post editors among others, accusing Upington and his colleagues of allowing white juries to acquit whites accused of murdering blacks.
Closure
The paper quickly ran into financial difficulties and was forced to close in 1880. Its two editors dispersed: Statham left the country, and McLoughlin moved to Oudtshoorn in the Karoo, where he shot himself soon afterwards.
References
Defunct newspapers published in South Africa
Newspapers established in 1879
Publications disestablished in 1880
1879 establishments in the Cape Colony
|
```php
<?php
/*
*
* File ini bagian dari:
*
* OpenSID
*
* Sistem informasi desa sumber terbuka untuk memajukan desa
*
* Aplikasi dan source code ini dirilis berdasarkan lisensi GPL V3
*
* Hak Cipta 2009 - 2015 Combine Resource Institution (path_to_url
* Hak Cipta 2016 - 2024 Perkumpulan Desa Digital Terbuka (path_to_url
*
* Dengan ini diberikan izin, secara gratis, kepada siapa pun yang mendapatkan salinan
* dari perangkat lunak ini dan file dokumentasi terkait ("Aplikasi Ini"), untuk diperlakukan
* tanpa batasan, termasuk hak untuk menggunakan, menyalin, mengubah dan/atau mendistribusikan,
* asal tunduk pada syarat berikut:
*
* Pemberitahuan hak cipta di atas dan pemberitahuan izin ini harus disertakan dalam
* setiap salinan atau bagian penting Aplikasi Ini. Barang siapa yang menghapus atau menghilangkan
* pemberitahuan ini melanggar ketentuan lisensi Aplikasi Ini.
*
* PERANGKAT LUNAK INI DISEDIAKAN "SEBAGAIMANA ADANYA", TANPA JAMINAN APA PUN, BAIK TERSURAT MAUPUN
* TERSIRAT. PENULIS ATAU PEMEGANG HAK CIPTA SAMA SEKALI TIDAK BERTANGGUNG JAWAB ATAS KLAIM, KERUSAKAN ATAU
* KEWAJIBAN APAPUN ATAS PENGGUNAAN ATAU LAINNYA TERKAIT APLIKASI INI.
*
* @package OpenSID
* @author Tim Pengembang OpenDesa
* @copyright Hak Cipta 2009 - 2015 Combine Resource Institution (path_to_url
* @copyright Hak Cipta 2016 - 2024 Perkumpulan Desa Digital Terbuka (path_to_url
* @license path_to_url GPL V3
* @link path_to_url
*
*/
use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;
return new class () extends Migration {
/**
* Run the migrations.
*
* @return void
*/
public function up()
{
Schema::table('analisis_respon_bukti', static function (Blueprint $table) {
$table->foreign(['config_id'], 'analisis_respon_bukti_config_fk')->references(['id'])->on('config')->onUpdate('CASCADE')->onDelete('CASCADE');
});
}
/**
* Reverse the migrations.
*
* @return void
*/
public function down()
{
Schema::table('analisis_respon_bukti', static function (Blueprint $table) {
$table->dropForeign('analisis_respon_bukti_config_fk');
});
}
};
```
|
```scss
@include b(affix) {
@include define(zindex, 10);
z-index: var(--zindex);
}
```
|
```ruby
# frozen_string_literal: true
require "spec_helper"
require "nokogiri"
module Decidim
describe AuthorizationFormBuilder do
let(:record) do
DummyAuthorizationHandler.new({})
end
let(:helper) { Class.new(ActionView::Base).new(ActionView::LookupContext.new(ActionController::Base.view_paths), {}, []) }
let(:builder) { described_class.new(:authorization_handler, record, helper, {}) }
before do
allow(helper).to receive(:authorizations_path).and_return("/authorizations")
end
def find(selector)
subject.css(selector).first
end
describe "all_fields" do
subject { Nokogiri::HTML(builder.all_fields) }
it "includes the handler name" do
expect(find("input#authorization_handler_handler_name")["value"]).to eq("dummy_authorization_handler")
end
it "includes the public handler attributes" do
expect(find("input#authorization_handler_birthday")["type"]).to eq("date")
expect(find("input#authorization_handler_postal_code")["type"]).to eq("text")
expect(find("input#authorization_handler_document_number")["type"]).to eq("text")
expect(find("input#authorization_handler_name_and_surname")["type"]).to eq("text")
end
it "does not include other handler attributes" do
expect(find("input#authorization_handler_id")).to be_nil
expect(find("input#authorization_handler_user")).to be_nil
end
context "when there are scopes" do
let(:user) { create(:user) }
let!(:scope) { create(:scope, organization: user.organization) }
let(:record) do
DummyAuthorizationHandler.new(user:)
end
it "includes a scopes selector" do
expect(find("select#authorization_handler_scope_id").children.first["value"]).to eq(scope.id.to_s)
end
end
end
describe "input" do
it "renders a single field for an attribute" do
html = Nokogiri::HTML(builder.input(:birthday))
expect(html.css("label[for='authorization_handler_birthday']").length).to eq(1)
expect(html.css("input[type='date']").length).to eq(1)
expect(html.css("#authorization_handler_birthday").length).to eq(1)
end
context "when specifying the input type" do
it "renders it" do
html = Nokogiri::HTML(builder.input(:document_number, as: :email_field))
expect(html.css("label[for='authorization_handler_document_number']").length).to eq(1)
expect(html.css(".label-required").length).to eq(1)
expect(html.css("input[type='email']").length).to eq(1)
expect(html.css(".form-error").length).to eq(1)
end
end
end
describe "public_attributes (private)" do
subject { builder.send(:public_attributes) }
let(:public_attributes) do
{
"handler_name" => :string,
"document_number" => :string,
"postal_code" => :string,
"birthday" => :"decidim/attributes/localized_date",
"scope_id" => :integer,
"name_and_surname" => :string
}
end
it { is_expected.to eq(public_attributes) }
end
end
end
```
|
```objective-c
// 2018 and later: Unicode, Inc. and others.
#include "unicode/utypes.h"
#if !UCONFIG_NO_FORMATTING
#ifndef __SOURCE_NUMBER_UTYPES_H__
#define __SOURCE_NUMBER_UTYPES_H__
#include "unicode/numberformatter.h"
#include "number_types.h"
#include "number_decimalquantity.h"
#include "formatted_string_builder.h"
#include "formattedval_impl.h"
U_NAMESPACE_BEGIN
namespace number::impl {
/** Helper function used in upluralrules.cpp */
const DecimalQuantity* validateUFormattedNumberToDecimalQuantity(
const UFormattedNumber* uresult, UErrorCode& status);
/**
* Struct for data used by FormattedNumber.
*
* This struct is held internally by the C++ version FormattedNumber since the member types are not
* declared in the public header file.
*
* Exported as U_I18N_API for tests
*/
class U_I18N_API UFormattedNumberData : public FormattedValueStringBuilderImpl {
public:
UFormattedNumberData() : FormattedValueStringBuilderImpl(kUndefinedField) {}
virtual ~UFormattedNumberData();
UFormattedNumberData(UFormattedNumberData&&) = default;
UFormattedNumberData& operator=(UFormattedNumberData&&) = default;
// The formatted quantity.
DecimalQuantity quantity;
// The output unit for the formatted quantity.
// TODO(units,hugovdm): populate this correctly for the general case - it's
// currently only implemented for the .usage() use case.
MeasureUnit outputUnit;
// The gender of the formatted output.
const char *gender = "";
};
} // namespace number::impl
U_NAMESPACE_END
#endif //__SOURCE_NUMBER_UTYPES_H__
#endif /* #if !UCONFIG_NO_FORMATTING */
```
|
Cecrita guttivitta, the saddled prominent moth, is a species of moth of the family Notodontidae. It is found in North America, including Alabama, Arkansas, Connecticut, Delaware, Florida, Georgia, Illinois, Indiana, Iowa, Kansas, Kentucky, Maine, Maryland, Massachusetts, Minnesota, New Brunswick, New Hampshire, New Jersey, New York, North Carolina, Ohio, Oklahoma, Ontario, Pennsylvania, South Carolina, Tennessee, Vermont, Virginia, West Virginia and Wisconsin.
The wingspan is about 40 mm. Adults are brownish to greenish grey with white or black spots on the forewings. There is one generation per year.
The larvae feed on the foliage of a wide range of woody plants, including apple, birch, blueberry, dogwood, hazel, maple, oak, sumac and walnut.
Gallery
References
Moths described in 1855
Notodontidae
Moths of North America
|
Blood money, also called bloodwit, is money or some sort of compensation paid by an offender (usually a murderer) or their family group to the family or kin group of the victim.
Particular examples and uses
Blood money is, colloquially, the reward for bringing a criminal to justice. A common meaning in other contexts is the money-penalty paid by a murderer to the kinsfolk of the victim. These fines completely protect the offender (or the kinsfolk thereof) from the vengeance of the injured family. The system was common among Germanic peoples as part of ancient Germanic law before the introduction of Christianity (weregild), and a scale of payments, graduated according to the heinousness of the crime, was fixed by laws, which further settled who could exact the blood money and who was entitled to share it. Homicide was not the only crime thus expiable: blood money could be exacted for most crimes of violence. Some acts, such as killing someone in a church or while asleep, or within the precincts of the royal palace, and corporal infamy (rape), were "bot-less"; the death penalty was inflicted instead. Such a criminal was outlawed and could be killed on sight, or, in the case of rape, thrown into a bog, according to Tacitus.
In Islam
In Islamic terms, Qisas can in some cases result in blood money being paid out to the family of victims. The amount varies from country to country and from case to case.
In Judaism
As a person's life is considered as being the property of God, Judaism forbids the taking of blood-money for the life of a murdered victim.
In Japan
In Japanese culture it is common to give blood money, or mimaikin, to a victim's family. Such was the case with Lucie Blackman's father, who accepted £450,000 as blood money for the murder of his daughter.
In Korea
Under the Korean legal system, it is common for those accused of both minor (such as defamation) and serious crimes to offer blood money (hapuigeum, 합의금) to the victim, and if accepted then the perpetrator is usually excused from further punishment. Despite being common practice, its use in high-profile cases does sometimes result in protests.
Other meanings or uses
In Christianity
In the Christian Bible, the term is used to refer to the thirty pieces of silver Judas Iscariot received in exchange for revealing the identity of Jesus Christ to the forces sent by the Pharisees and/or the Sanhedrin. After the crucifixion of Christ, Judas returned the payment to the chief priests, who "took the silver pieces and said, 'It is not lawful to put them into the treasury, because it is the price of blood.'"
In shipping
"Shanghaiing" was the practice of the forced conscription of sailors. Boarding masters, whose job it was to find crews for ships, were paid "by the body," and thus had a strong incentive to place as many seamen on ships as possible. This pay was called blood money.
See also
Anglo-Saxon law
Blood feud
Blood law
Blood libel
Danegeld
Diyya
Ericfine
Feud
Galanas
Germanic law
Główszczyzna
Kanun
Leges inter Brettos et Scottos
Leibzoll
Religious minority
Protection money
Tallage
Weregild
Wrongful death
|
```javascript
'use strict';
import React, {Component} from 'react';
import PropTypes from 'prop-types';
import {withTranslation} from '../lib/i18n';
import {requiresAuthenticatedUser, Title, withPageHelpers} from '../lib/page';
import {Table} from '../lib/table';
import {HTTPMethod} from '../lib/axios';
import {withComponentMixins} from "../lib/decorator-helpers";
import {tableAddRestActionButton, tableRestActionDialogInit, tableRestActionDialogRender} from "../lib/modals";
@withComponentMixins([
withTranslation,
withPageHelpers,
requiresAuthenticatedUser
])
export default class UserShares extends Component {
constructor(props) {
super(props);
this.sharesTables = {};
this.state = {};
tableRestActionDialogInit(this);
}
static propTypes = {
user: PropTypes.object
}
render() {
const t = this.props.t;
const renderSharesTable = (entityTypeId, title, typeName) => {
const columns = [
{ data: 0, title: t('name') },
{ data: 1, title: t('role') },
{
actions: data => {
const actions = [];
const autoGenerated = data[3];
const perms = data[4];
if (!autoGenerated && perms.includes('share')) {
const name = data[0];
const entityId = data[2];
tableAddRestActionButton(
actions,
this,
{
method: HTTPMethod.PUT,
url: 'rest/shares',
data: {
entityTypeId,
entityId,
userId: this.props.user.id
},
refreshTables: () => {
for (const key in this.sharesTables) {
this.sharesTables[key].refresh();
}
}
},
{ icon: 'trash-alt', label: t('unshare') },
t('confirmUnsharing'),
t('areYouSureYouWantToRemoveTheSharingOfThe', {typeName, name}),
t('removingSharingOfTheTypeNameName', {typeName, name}),
t('sharingOfTheTypeNameNameRemoved', {typeName, name}),
null
);
}
return actions;
}
}
];
return (
<div>
<h3>{title}</h3>
<Table ref={node => this.sharesTables[entityTypeId] = node} withHeader dataUrl={`rest/shares-table-by-user/${entityTypeId}/${this.props.user.id}`} columns={columns} />
</div>
);
};
return (
<div>
{tableRestActionDialogRender(this)}
<Title>{t('sharesForUserUsername', {username: this.props.user.username})}</Title>
{renderSharesTable('namespace', t('namespaces'), t('namespace-1'))}
{renderSharesTable('list', t('lists'), t('list-1'))}
{renderSharesTable('template', t('templates'), t('template-1'))}
{renderSharesTable('mosaicoTemplate', t('mosaicoTemplates'), t('mosaicoTemplate'))}
{renderSharesTable('campaign', t('campaigns'), t('campaign-1'))}
                {renderSharesTable('customForm', t('customForms-1'), t('customForms-2'))}
{renderSharesTable('report', t('reports'), t('report-1'))}
{renderSharesTable('reportTemplate', t('reportTemplates'), t('reportTemplate-2'))}
{renderSharesTable('sendConfiguration', t('sendConfigurations-1'), t('sendConfiguration'))}
</div>
);
}
}
```
|
Jemeker Thompson-Hairston is an American former drug dealer who rose to the top of the cocaine trade during the peak of the 1980s crack epidemic in the United States. She was based in "South Central" Los Angeles and had cocaine distributors in multiple US cities working for her.
Biography
When Thompson was 8 years old, she and her mother were evicted from their apartment in South Los Angeles (formerly known as South Central Los Angeles), and their belongings were strewn outside. In the Netflix documentary episode about her life as a drug dealer, Thompson states that "I knew then that I wanted money and that I wanted to control everything." She ran track in high school and was the girlfriend of Anthony "Daff" Mosley. Daff was dealing marijuana and Thompson began collecting payments for him.
Thompson began selling between 3 and 4 kilos of cocaine a week while still in high school. She and "Daff" married in 1980 and began selling crack cocaine. As Thompson's drug dealing business boomed, she and her husband bought a home in Encino, California, and in 1982 had a son. When "Daff" was murdered, Drug Enforcement Administration agents, who had been watching Thompson, thought she would stop drug dealing. Instead, in 1984, she began to grow her business again by recruiting distributors around the U.S., and for some time she obtained cocaine directly from producers in Colombia. At age 26, she opened a hair distribution business (selling wigs and hair extensions), which Thompson said was legitimate but which authorities said was used to launder drug profits.
When Percy "Cheese" Bratton, her business partner, was arrested with kilos of cocaine in his car, he made a deal with investigators, providing the evidence needed to indict Thompson. Thompson tried to avoid capture but, at the age of 31, was arrested at her son's 6th grade graduation ceremony. She was charged with conspiracy to distribute cocaine and five counts of money laundering, was indicted and convicted, and served 12 years of her 15-year prison sentence. During her time in prison, Thompson converted to Christianity, and after her release she began a Christian ministry called Second Chance Evangelical Ministries. Her son became a professional skateboarder.
Thompson-Hairston is married to Champ Hairston.
Memoir
The crack epidemic in the U.S. was a period during the 1980s when crack cocaine was so cheap and widespread that people used it as payment for everyday goods. Thompson wrote about her life during that time in Queen Pin, co-written with David Ritz and published in 2010. In a mixed review, Kirkus Reviews said that it "lacks the salacious elements that make criminal memoirs compelling," but noted that it "may appeal to believers" due to its religious content.
Drug Lords episode
Thompson was the subject of an episode in the second season of the Netflix series Drug Lords. On July 10, 2018, Netflix aired an episode featuring Thompson's story. In the documentary she talked about making millions in the drug trade, her time in prison, converting to Christianity and doing Christian ministry since her release from prison.
Gangsters: America's Most Evil
Thompson was the subject of the 8th episode of the first season of Gangsters: America's Most Evil. A&E aired the episode on September 12, 2012, which documented that Thompson was one of the top contributors to the 1980s crack epidemic in the United States.
Christian ministry
Thompson's change from drug dealer to evangelist was featured on the 700 Club, a program on the Christian Broadcasting Network.
See also
Women in the drug economy in the United States
|
```typescript
import { LoginBackgroundSliderPage } from './login-background-slider';
import { NgModule } from '@angular/core';
import { IonicPageModule } from 'ionic-angular';
@NgModule({
declarations: [
LoginBackgroundSliderPage,
],
imports: [
IonicPageModule.forChild(LoginBackgroundSliderPage),
],
exports: [
LoginBackgroundSliderPage
]
})
export class LoginBackgroundSliderPageModule { }
```
|
Benderville is an unincorporated community located in the town of Scott, Brown County, Wisconsin, United States. Benderville is located on County Highway A near the southeastern shore of Green Bay, northeast of the city of Green Bay.
|
```vue
<template>
<div>
<div v-if="layout ==='poster'" class="row poster-ui-controls">
<!-- Only show the list title when not in tabs & only for the poster layout -->
<div v-if="!stateLayout.splitHomeInTabs && (showsInLists && showsInLists.length > 1)" class="show-list-title listTitle">
<button type="button" class="nav-show-list move-show-list">
<span class="icon-bar" />
<span class="icon-bar" />
<span class="icon-bar" />
</button>
<h3 class="header">{{listTitle}}</h3>
</div>
<div class="col-lg-12">
<div class="show-option">
<input type="search" v-model="filterByName" class="form-control form-control-inline input-sm input200" placeholder="Filter Show Name">
</div>
<div class="show-option">
<!-- These need to patch apiv2 on change! -->
<select v-model="posterUiSortDir" id="postersortdirection" class="form-control form-control-inline input-sm" placeholder="Direction">
<option :value="1">Ascending</option>
<option :value="0">Descending</option>
</select>
</div>
<div class="show-option pull-right">
<select v-model="posterUiSortBy" id="postersort" class="form-control form-control-inline input-sm" placeholder="Sort By">
<option v-for="option in posterSortByOptions" :value="option.value" :key="option.value">
{{ option.text }}
</option>
</select>
</div>
<poster-size-slider />
</div>
</div>
<!-- We're still loading -->
<template v-if="!showsLoading.finished && shows.length === 0">
<state-switch state="loading" :theme="stateLayout.themeName" />
<span>Loading</span>
</template>
<template v-else-if="shows.length >= 1">
<component :class="[['simple', 'small', 'banner'].includes(layout) ? 'table-layout' : '']" :is="mappedLayout" v-bind="$props" />
</template>
</div>
</template>
<script>
import { mapActions, mapGetters, mapState } from 'vuex';
import Banner from './banner.vue';
import Simple from './simple.vue';
import Poster from './poster.vue';
import Smallposter from './smallposter.vue';
import { AppLink, PosterSizeSlider, StateSwitch } from '../helpers';
export default {
name: 'show-list',
components: {
AppLink,
Banner,
Simple,
Poster,
PosterSizeSlider,
Smallposter,
StateSwitch
},
props: {
layout: {
validator: layout => [
null,
'',
'poster',
'banner',
'simple',
'small'
].includes(layout),
required: true
},
shows: {
type: Array,
required: true
},
listTitle: {
type: String
},
header: {
type: Boolean
}
},
data() {
return {
postSortDirOptions: [
{ value: '0', text: 'Descending' },
{ value: '1', text: 'Ascending' }
],
posterSortByOptions: [
{ text: 'Name', value: 'name' },
{ text: 'Next episode', value: 'date' },
{ text: 'Network', value: 'network' },
{ text: 'Progress', value: 'progress' },
{ text: 'Indexer', value: 'indexer' }
]
};
},
computed: {
...mapState({
stateLayout: state => state.config.layout,
showsLoading: state => state.shows.loading
}),
...mapGetters({
showsInLists: 'showsInLists'
}),
filterByName: {
get() {
const { local } = this.stateLayout;
const { showFilterByName } = local;
return showFilterByName;
},
set(value) {
const { setLayoutLocal } = this;
setLayoutLocal({ key: 'showFilterByName', value });
}
},
mappedLayout() {
const { layout } = this;
if (layout === 'small') {
return 'smallposter';
}
return layout;
},
posterUiSortBy: {
get() {
const { stateLayout } = this;
const { posterSortby } = stateLayout;
return posterSortby;
},
set(value) {
const { setPosterSortBy } = this;
setPosterSortBy({ value });
}
},
posterUiSortDir: {
get() {
const { stateLayout } = this;
const { posterSortdir } = stateLayout;
return posterSortdir;
},
set(value) {
const { setPosterSortDir } = this;
setPosterSortDir({ value });
}
}
},
methods: {
...mapActions({
setPosterSortBy: 'setPosterSortBy',
setPosterSortDir: 'setPosterSortDir',
setLayoutLocal: 'setLayoutLocal'
})
}
};
</script>
<style scoped>
/* Configure the show-list-title for in the poster layout. */
.show-list-title {
display: flex;
float: left;
margin-top: 6px;
}
button.nav-show-list {
height: 20px;
}
.show-list-title > h3 {
margin: 0;
}
/* Configure the show-list-title for in the table layouts. */
.table-layout >>> .show-list-title {
display: flex;
float: left;
}
.table-layout >>> button.nav-show-list {
height: 20px;
}
.table-layout >>> .show-list-title > h3 {
margin: 0;
}
/** Use this as table styling for all table layouts */
.table-layout >>> .vgt-table {
width: 100%;
margin-right: auto;
margin-left: auto;
text-align: left;
border-spacing: 0;
}
.table-layout >>> .vgt-table th,
.table-layout >>> .vgt-table td {
padding: 4px;
vertical-align: middle;
}
/* remove extra border from left edge */
.table-layout >>> .vgt-table th:first-child,
.table-layout >>> .vgt-table td:first-child {
border-left: none;
}
.table-layout >>> .vgt-table th {
text-align: center;
text-shadow: -1px -1px 0 rgba(0, 0, 0, 0.3);
background-color: rgb(51, 51, 51);
white-space: nowrap;
color: #fff;
border-collapse: collapse;
font-weight: normal;
position: relative;
background-image: none;
padding: 4px;
cursor: default;
}
.table-layout >>> .vgt-table span.break-word {
word-wrap: break-word;
}
.table-layout >>> .vgt-table thead th.sorting.sorting-asc {
background-position-x: right;
background-position-y: bottom;
}
.table-layout >>> .vgt-table thead th.sorting {
background-repeat: no-repeat;
}
.table-layout >>> .vgt-table thead th.sortable button {
-webkit-appearance: none;
-moz-appearance: none;
appearance: none;
background: transparent;
border: none;
position: absolute;
top: 0;
left: 0;
width: 100%;
height: 100%;
}
.table-layout >>> .vgt-table input.tablesorter-filter {
width: 98%;
height: auto;
-webkit-box-sizing: border-box;
-moz-box-sizing: border-box;
box-sizing: border-box;
}
.table-layout >>> .vgt-table tr.tablesorter-filter-row,
.table-layout >>> .vgt-table tr.tablesorter-filter-row td {
text-align: center;
}
/* optional disabled input styling */
.table-layout >>> .vgt-table input.tablesorter-filter-row .disabled {
display: none;
}
.tablesorter-header-inner {
padding: 0 2px;
text-align: center;
}
.table-layout >>> .vgt-table tfoot tr {
text-align: center;
border-collapse: collapse;
}
.table-layout >>> .vgt-table tfoot a {
text-decoration: none;
}
.table-layout >>> .vgt-table th.vgt-row-header {
text-align: left;
}
.table-layout >>> .vgt-table .season-header {
display: inline;
margin-left: 5px;
}
.table-layout >>> .vgt-table tr.spacer {
height: 25px;
}
.table-layout >>> .vgt-dropdown {
float: right;
}
.table-layout >>> .vgt-dropdown > .button-group {
position: relative;
}
.table-layout >>> .dropdown-toggle {
position: absolute;
z-index: 1;
top: 0.1em;
right: 0.1em;
width: 1em;
transition: width 0.2s ease-in-out;
}
.table-layout >>> .dropdown-toggle:hover,
.table-layout >>> .dropdown-toggle:active {
width: 2em;
}
.table-layout >>> .vgt-dropdown-menu {
position: absolute;
z-index: 1;
float: left;
min-width: 160px;
padding: 5px 0;
margin: 2px 0 0;
font-size: 14px;
text-align: left;
list-style: none;
background-clip: padding-box;
border-radius: 3px;
right: 0;
top: 2em;
}
.table-layout >>> .vgt-dropdown-menu > li > span {
display: block;
padding: 3px 5px;
clear: both;
font-weight: 400;
line-height: 1.42857143;
white-space: nowrap;
}
.table-layout >>> .indexer-image :not(:last-child) {
margin-right: 5px;
}
.table-layout >>> .vgt-input {
height: 23px;
line-height: 23px;
font-size: 0.9em;
width: 100%;
background-color: #fff;
background-image: none;
border: 1px solid #ccc;
border-radius: 3px;
padding: 0 10px;
margin: 0;
}
.table-layout >>> .vgt-select {
height: 23px;
line-height: 23px;
font-size: 0.9em;
width: 100%;
background-color: #fff;
background-image: none;
border: 1px solid #ccc;
border-radius: 3px;
padding: 0 10px;
}
</style>
```
|
```python
"""
TensorFlow policy class used for PG.
"""
import logging
from typing import Dict, List, Optional, Tuple, Type, Union
from rllib_pg.pg.pg import PGConfig
from rllib_pg.pg.utils import post_process_advantages
from ray.rllib.evaluation.episode import Episode
from ray.rllib.evaluation.postprocessing import Postprocessing
from ray.rllib.models.action_dist import ActionDistribution
from ray.rllib.models.modelv2 import ModelV2
from ray.rllib.policy import Policy
from ray.rllib.policy.dynamic_tf_policy_v2 import DynamicTFPolicyV2
from ray.rllib.policy.eager_tf_policy_v2 import EagerTFPolicyV2
from ray.rllib.policy.sample_batch import SampleBatch
from ray.rllib.policy.tf_mixins import LearningRateSchedule
from ray.rllib.utils.annotations import override
from ray.rllib.utils.framework import try_import_tf
from ray.rllib.utils.typing import AgentID, TensorType, TFPolicyV2Type
tf1, tf, tfv = try_import_tf()
logger = logging.getLogger(__name__)
# We need this builder function because we want to share the same
# custom logics between TF1 dynamic and TF2 eager policies.
def get_pg_tf_policy(name: str, base: TFPolicyV2Type) -> TFPolicyV2Type:
"""Construct a PGTFPolicy inheriting either dynamic or eager base policies.
Args:
name: Name to assign to the returned policy class.
base: Base class for this policy. DynamicTFPolicyV2 or EagerTFPolicyV2.
Returns:
A TF Policy to be used with PGTrainer.
"""
class PGTFPolicy(
LearningRateSchedule,
base,
):
def __init__(
self,
observation_space,
action_space,
config: PGConfig,
existing_model=None,
existing_inputs=None,
):
# First thing first, enable eager execution if necessary.
base.enable_eager_execution_if_necessary()
# Enforce AlgorithmConfig for PG Policies.
if isinstance(config, dict):
config = PGConfig.from_dict(config)
# Initialize base class.
base.__init__(
self,
observation_space,
action_space,
config,
existing_inputs=existing_inputs,
existing_model=existing_model,
)
LearningRateSchedule.__init__(self, config.lr, config.lr_schedule)
# Note: this is a bit ugly, but loss and optimizer initialization must
# happen after all the MixIns are initialized.
self.maybe_initialize_optimizer_and_loss()
@override(base)
def loss(
self,
model: ModelV2,
dist_class: Type[ActionDistribution],
train_batch: SampleBatch,
) -> Union[TensorType, List[TensorType]]:
"""The basic policy gradients loss function.
Calculates the vanilla policy gradient loss based on:
L = -E[ log(pi(a|s)) * A]
Args:
model: The Model to calculate the loss for.
dist_class: The action distr. class.
train_batch: The training data.
Returns:
Union[TensorType, List[TensorType]]: A single loss tensor or a list
of loss tensors.
"""
# Pass the training data through our model to get distribution parameters.
dist_inputs, _ = model(train_batch)
# Create an action distribution object.
action_dist = dist_class(dist_inputs, model)
# Calculate the vanilla PG loss based on:
# L = -E[ log(pi(a|s)) * A]
loss = -tf.reduce_mean(
action_dist.logp(train_batch[SampleBatch.ACTIONS])
* tf.cast(train_batch[Postprocessing.ADVANTAGES], dtype=tf.float32)
)
self.policy_loss = loss
return loss
@override(base)
def postprocess_trajectory(
self,
sample_batch: SampleBatch,
other_agent_batches: Optional[
Dict[AgentID, Tuple["Policy", SampleBatch]]
] = None,
episode: Optional["Episode"] = None,
) -> SampleBatch:
sample_batch = super().postprocess_trajectory(
sample_batch, other_agent_batches, episode
)
return post_process_advantages(
self, sample_batch, other_agent_batches, episode
)
@override(base)
def extra_learn_fetches_fn(self) -> Dict[str, TensorType]:
return {
"learner_stats": {"cur_lr": self.cur_lr},
}
@override(base)
def stats_fn(self, train_batch: SampleBatch) -> Dict[str, TensorType]:
"""Returns the calculated loss and learning rate in a stats dict.
Args:
train_batch: The data used for training.
Returns:
Dict[str, TensorType]: The stats dict.
"""
return {
"policy_loss": self.policy_loss,
"cur_lr": self.cur_lr,
}
PGTFPolicy.__name__ = name
PGTFPolicy.__qualname__ = name
return PGTFPolicy
PGTF1Policy = get_pg_tf_policy("PGTF1Policy", DynamicTFPolicyV2)
PGTF2Policy = get_pg_tf_policy("PGTF2Policy", EagerTFPolicyV2)
```
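The vanilla policy-gradient loss computed in `loss()` above, L = -E[log pi(a|s) * A], can be sanity-checked with a tiny standalone sketch. This is plain Python with no RLlib or TensorFlow dependency, and the log-probabilities and advantages are made-up numbers purely for illustration:

```python
# Vanilla policy-gradient loss: L = -mean(log pi(a|s) * A)
# Hypothetical log-probabilities of the actions actually taken:
logp = [-0.5, -1.2, -0.1]
# Hypothetical advantage estimates for those actions:
adv = [1.0, -0.5, 2.0]

# Negative mean of the elementwise product, mirroring tf.reduce_mean above.
loss = -sum(lp * a for lp, a in zip(logp, adv)) / len(logp)
print(round(loss, 4))  # → 0.0333
```

Minimizing this loss increases the log-probability of actions with positive advantage and decreases it for actions with negative advantage, which is exactly what the TF graph version does batch-wise.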
|
Hucisko Nienadowskie is a village in the administrative district of Gmina Dubiecko, within Przemyśl County, Subcarpathian Voivodeship, in south-eastern Poland. It lies approximately north-east of Dubiecko, north-west of Przemyśl, and south-east of the regional capital Rzeszów.
|
```go
// _ _
// __ _____ __ ___ ___ __ _| |_ ___
// \ \ /\ / / _ \/ _` \ \ / / |/ _` | __/ _ \
// \ V V / __/ (_| |\ V /| | (_| | || __/
// \_/\_/ \___|\__,_| \_/ |_|\__,_|\__\___|
//
//
// CONTACT: hello@weaviate.io
//
package v1
import (
"testing"
"github.com/weaviate/weaviate/entities/models"
"github.com/weaviate/weaviate/usecases/byteops"
"github.com/stretchr/testify/require"
"github.com/weaviate/weaviate/entities/schema"
pb "github.com/weaviate/weaviate/grpc/generated/protocol/v1"
)
type innerTest struct {
datatype schema.DataType
out *pb.Value
shouldError bool
}
func makeTestList(succeedingInnerTests map[schema.DataType]*pb.Value) []innerTest {
dtypes := append(schema.PrimitiveDataTypes, schema.DeprecatedPrimitiveDataTypes...)
list := make([]innerTest, len(dtypes))
for idx, dtype := range dtypes {
out, ok := succeedingInnerTests[dtype]
if ok {
list[idx] = innerTest{
datatype: dtype,
out: out,
shouldError: false,
}
} else {
list[idx] = innerTest{
datatype: dtype,
out: nil,
shouldError: true,
}
}
}
return list
}
func TestNewPrimitiveValue(t *testing.T) {
float_val := float32(1.1)
tests := []struct {
name string
in any
tests map[bool][]innerTest
}{
{
name: "bools",
in: []bool{true, false},
tests: map[bool][]innerTest{
false: makeTestList(map[schema.DataType]*pb.Value{
schema.DataTypeBooleanArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{
Values: []*pb.Value{
{Kind: &pb.Value_BoolValue{BoolValue: true}},
{Kind: &pb.Value_BoolValue{BoolValue: false}},
},
}}},
}),
true: makeTestList(map[schema.DataType]*pb.Value{
schema.DataTypeBooleanArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{
Kind: &pb.ListValue_BoolValues{BoolValues: &pb.BoolValues{Values: []bool{true, false}}},
}}},
}),
},
},
{
name: "strings",
in: []string{"a string", "another string"},
tests: map[bool][]innerTest{
false: makeTestList(map[schema.DataType]*pb.Value{
schema.DataTypeDateArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{Values: []*pb.Value{
{Kind: &pb.Value_DateValue{DateValue: "a string"}},
{Kind: &pb.Value_DateValue{DateValue: "another string"}},
}}}},
schema.DataTypeStringArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{Values: []*pb.Value{
{Kind: &pb.Value_StringValue{StringValue: "a string"}},
{Kind: &pb.Value_StringValue{StringValue: "another string"}},
}}}},
schema.DataTypeTextArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{Values: []*pb.Value{
{Kind: &pb.Value_StringValue{StringValue: "a string"}},
{Kind: &pb.Value_StringValue{StringValue: "another string"}},
}}}},
schema.DataTypeUUIDArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{Values: []*pb.Value{
{Kind: &pb.Value_UuidValue{UuidValue: "a string"}},
{Kind: &pb.Value_UuidValue{UuidValue: "another string"}},
}}}},
}),
true: makeTestList(map[schema.DataType]*pb.Value{
schema.DataTypeDateArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{
Kind: &pb.ListValue_DateValues{DateValues: &pb.DateValues{Values: []string{"a string", "another string"}}},
}}},
schema.DataTypeStringArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{
Kind: &pb.ListValue_TextValues{TextValues: &pb.TextValues{Values: []string{"a string", "another string"}}},
}}},
schema.DataTypeTextArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{
Kind: &pb.ListValue_TextValues{TextValues: &pb.TextValues{Values: []string{"a string", "another string"}}},
}}},
schema.DataTypeUUIDArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{
Kind: &pb.ListValue_UuidValues{UuidValues: &pb.UuidValues{Values: []string{"a string", "another string"}}},
}}},
}),
},
},
{
name: "float64s",
in: []float64{1.1, 2.2, 3.3},
tests: map[bool][]innerTest{
false: makeTestList(map[schema.DataType]*pb.Value{
schema.DataTypeNumberArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{Values: []*pb.Value{
{Kind: &pb.Value_NumberValue{NumberValue: 1.1}},
{Kind: &pb.Value_NumberValue{NumberValue: 2.2}},
{Kind: &pb.Value_NumberValue{NumberValue: 3.3}},
}}}},
schema.DataTypeIntArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{Values: []*pb.Value{
{Kind: &pb.Value_IntValue{IntValue: 1}},
{Kind: &pb.Value_IntValue{IntValue: 2}},
{Kind: &pb.Value_IntValue{IntValue: 3}},
}}}},
}),
true: makeTestList(map[schema.DataType]*pb.Value{
schema.DataTypeNumberArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{
Kind: &pb.ListValue_NumberValues{NumberValues: &pb.NumberValues{Values: byteops.Float64ToByteVector([]float64{1.1, 2.2, 3.3})}},
}}},
schema.DataTypeIntArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{
Kind: &pb.ListValue_IntValues{IntValues: &pb.IntValues{Values: byteops.IntsToByteVector([]float64{1, 2, 3})}},
}}},
}),
},
},
{
name: "empty array",
in: []interface{}{},
tests: map[bool][]innerTest{
false: makeTestList(map[schema.DataType]*pb.Value{
schema.DataTypeBooleanArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{Values: []*pb.Value{}}}},
schema.DataTypeDateArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{Values: []*pb.Value{}}}},
schema.DataTypeNumberArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{Values: []*pb.Value{}}}},
schema.DataTypeIntArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{Values: []*pb.Value{}}}},
schema.DataTypeStringArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{Values: []*pb.Value{}}}},
schema.DataTypeTextArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{Values: []*pb.Value{}}}},
schema.DataTypeUUIDArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{Values: []*pb.Value{}}}},
}),
true: makeTestList(map[schema.DataType]*pb.Value{
schema.DataTypeBooleanArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{
Kind: &pb.ListValue_BoolValues{BoolValues: &pb.BoolValues{Values: []bool{}}},
}}},
schema.DataTypeDateArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{
Kind: &pb.ListValue_DateValues{DateValues: &pb.DateValues{Values: []string{}}},
}}},
schema.DataTypeNumberArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{
Kind: &pb.ListValue_NumberValues{NumberValues: &pb.NumberValues{Values: byteops.Float64ToByteVector([]float64{})}},
}}},
schema.DataTypeIntArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{
Kind: &pb.ListValue_IntValues{IntValues: &pb.IntValues{Values: byteops.IntsToByteVector([]float64{})}},
}}},
schema.DataTypeStringArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{
Kind: &pb.ListValue_TextValues{TextValues: &pb.TextValues{Values: []string{}}},
}}},
schema.DataTypeTextArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{
Kind: &pb.ListValue_TextValues{TextValues: &pb.TextValues{Values: []string{}}},
}}},
schema.DataTypeUUIDArray: {Kind: &pb.Value_ListValue{ListValue: &pb.ListValue{
Kind: &pb.ListValue_UuidValues{UuidValues: &pb.UuidValues{Values: []string{}}},
}}},
}),
},
},
{
name: "bool",
in: true,
tests: map[bool][]innerTest{
false: makeTestList(map[schema.DataType]*pb.Value{
schema.DataTypeBoolean: {Kind: &pb.Value_BoolValue{BoolValue: true}},
}),
true: makeTestList(map[schema.DataType]*pb.Value{
schema.DataTypeBoolean: {Kind: &pb.Value_BoolValue{BoolValue: true}},
}),
},
},
{
name: "string",
in: "a string",
tests: map[bool][]innerTest{
false: makeTestList(map[schema.DataType]*pb.Value{
schema.DataTypeDate: {Kind: &pb.Value_DateValue{DateValue: "a string"}},
schema.DataTypeString: {Kind: &pb.Value_StringValue{StringValue: "a string"}},
schema.DataTypeText: {Kind: &pb.Value_StringValue{StringValue: "a string"}},
schema.DataTypeUUID: {Kind: &pb.Value_UuidValue{UuidValue: "a string"}},
schema.DataTypeBlob: {Kind: &pb.Value_BlobValue{BlobValue: "a string"}},
}),
true: makeTestList(map[schema.DataType]*pb.Value{
schema.DataTypeDate: {Kind: &pb.Value_DateValue{DateValue: "a string"}},
schema.DataTypeString: {Kind: &pb.Value_TextValue{TextValue: "a string"}},
schema.DataTypeText: {Kind: &pb.Value_TextValue{TextValue: "a string"}},
schema.DataTypeUUID: {Kind: &pb.Value_UuidValue{UuidValue: "a string"}},
schema.DataTypeBlob: {Kind: &pb.Value_BlobValue{BlobValue: "a string"}},
}),
},
},
{
name: "float64",
in: 1.1,
tests: map[bool][]innerTest{
false: makeTestList(map[schema.DataType]*pb.Value{
schema.DataTypeNumber: {Kind: &pb.Value_NumberValue{NumberValue: 1.1}},
schema.DataTypeInt: {Kind: &pb.Value_IntValue{IntValue: 1}},
}),
true: makeTestList(map[schema.DataType]*pb.Value{
schema.DataTypeNumber: {Kind: &pb.Value_NumberValue{NumberValue: 1.1}},
schema.DataTypeInt: {Kind: &pb.Value_IntValue{IntValue: 1}},
}),
},
},
{
name: "geo",
in: &models.GeoCoordinates{Longitude: &float_val, Latitude: &float_val},
tests: map[bool][]innerTest{
false: makeTestList(map[schema.DataType]*pb.Value{
schema.DataTypeGeoCoordinates: {Kind: &pb.Value_GeoValue{GeoValue: &pb.GeoCoordinate{Latitude: float_val, Longitude: float_val}}},
}),
true: makeTestList(map[schema.DataType]*pb.Value{
schema.DataTypeGeoCoordinates: {Kind: &pb.Value_GeoValue{GeoValue: &pb.GeoCoordinate{Latitude: float_val, Longitude: float_val}}},
}),
},
},
{
name: "phone number",
in: &models.PhoneNumber{Input: "1234567890"},
tests: map[bool][]innerTest{
false: makeTestList(map[schema.DataType]*pb.Value{
schema.DataTypePhoneNumber: {Kind: &pb.Value_PhoneValue{PhoneValue: &pb.PhoneNumber{Input: "1234567890"}}},
}),
true: makeTestList(map[schema.DataType]*pb.Value{
schema.DataTypePhoneNumber: {Kind: &pb.Value_PhoneValue{PhoneValue: &pb.PhoneNumber{Input: "1234567890"}}},
}),
},
},
}
for _, tt := range tests {
for uses125, innerTests := range tt.tests {
for _, test := range innerTests {
m := NewMapping(uses125)
out, err := m.NewPrimitiveValue(tt.in, test.datatype)
if test.shouldError {
if err == nil {
t.Logf("expected an error for %v and %s", tt.in, test.datatype)
}
require.Error(t, err)
} else {
require.NoError(t, err)
require.Equal(t, test.out, out)
}
}
}
}
}
```
|
```c++
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
#if V8_TARGET_ARCH_ARM64
#include "src/codegen.h"
#include "src/ic/ic.h"
#include "src/ic/ic-compiler.h"
#include "src/ic/stub-cache.h"
namespace v8 {
namespace internal {
#define __ ACCESS_MASM(masm)
// "type" holds an instance type on entry and is not clobbered.
// Generated code branch on "global_object" if type is any kind of global
// JS object.
static void GenerateGlobalInstanceTypeCheck(MacroAssembler* masm, Register type,
Label* global_object) {
__ Cmp(type, JS_GLOBAL_OBJECT_TYPE);
__ Ccmp(type, JS_GLOBAL_PROXY_TYPE, ZFlag, ne);
__ B(eq, global_object);
}
// Helper function used from LoadIC GenerateNormal.
//
// elements: Property dictionary. It is not clobbered if a jump to the miss
// label is done.
// name: Property name. It is not clobbered if a jump to the miss label is
// done
// result: Register for the result. It is only updated if a jump to the miss
// label is not done.
// The scratch registers need to be different from elements, name and result.
// The generated code assumes that the receiver has slow properties,
// is not a global object and does not have interceptors.
static void GenerateDictionaryLoad(MacroAssembler* masm, Label* miss,
Register elements, Register name,
Register result, Register scratch1,
Register scratch2) {
DCHECK(!AreAliased(elements, name, scratch1, scratch2));
DCHECK(!AreAliased(result, scratch1, scratch2));
Label done;
// Probe the dictionary.
NameDictionaryLookupStub::GeneratePositiveLookup(masm, miss, &done, elements,
name, scratch1, scratch2);
// If probing finds an entry check that the value is a normal property.
__ Bind(&done);
static const int kElementsStartOffset =
NameDictionary::kHeaderSize +
NameDictionary::kElementsStartIndex * kPointerSize;
static const int kDetailsOffset = kElementsStartOffset + 2 * kPointerSize;
__ Ldr(scratch1, FieldMemOperand(scratch2, kDetailsOffset));
__ Tst(scratch1, Smi::FromInt(PropertyDetails::TypeField::kMask));
__ B(ne, miss);
// Get the value at the masked, scaled index and return.
__ Ldr(result,
FieldMemOperand(scratch2, kElementsStartOffset + 1 * kPointerSize));
}
// Helper function used from StoreIC::GenerateNormal.
//
// elements: Property dictionary. It is not clobbered if a jump to the miss
// label is done.
// name: Property name. It is not clobbered if a jump to the miss label is
// done.
// value: The value to store (never clobbered).
//
// The generated code assumes that the receiver has slow properties,
// is not a global object and does not have interceptors.
static void GenerateDictionaryStore(MacroAssembler* masm, Label* miss,
Register elements, Register name,
Register value, Register scratch1,
Register scratch2) {
DCHECK(!AreAliased(elements, name, value, scratch1, scratch2));
Label done;
// Probe the dictionary.
NameDictionaryLookupStub::GeneratePositiveLookup(masm, miss, &done, elements,
name, scratch1, scratch2);
// If probing finds an entry in the dictionary check that the value
// is a normal property that is not read only.
__ Bind(&done);
static const int kElementsStartOffset =
NameDictionary::kHeaderSize +
NameDictionary::kElementsStartIndex * kPointerSize;
static const int kDetailsOffset = kElementsStartOffset + 2 * kPointerSize;
static const int kTypeAndReadOnlyMask =
PropertyDetails::TypeField::kMask |
PropertyDetails::AttributesField::encode(READ_ONLY);
__ Ldrsw(scratch1, UntagSmiFieldMemOperand(scratch2, kDetailsOffset));
__ Tst(scratch1, kTypeAndReadOnlyMask);
__ B(ne, miss);
// Store the value at the masked, scaled index and return.
static const int kValueOffset = kElementsStartOffset + kPointerSize;
__ Add(scratch2, scratch2, kValueOffset - kHeapObjectTag);
__ Str(value, MemOperand(scratch2));
// Update the write barrier. Make sure not to clobber the value.
__ Mov(scratch1, value);
__ RecordWrite(elements, scratch2, scratch1, kLRHasNotBeenSaved,
kDontSaveFPRegs);
}
// Checks the receiver for special cases (value type, slow case bits).
// Falls through for regular JS objects and returns the map of the
// receiver in 'map_scratch' if the receiver is not a SMI.
static void GenerateKeyedLoadReceiverCheck(MacroAssembler* masm,
Register receiver,
Register map_scratch,
Register scratch,
int interceptor_bit, Label* slow) {
DCHECK(!AreAliased(map_scratch, scratch));
// Check that the object isn't a smi.
__ JumpIfSmi(receiver, slow);
// Get the map of the receiver.
__ Ldr(map_scratch, FieldMemOperand(receiver, HeapObject::kMapOffset));
// Check bit field.
__ Ldrb(scratch, FieldMemOperand(map_scratch, Map::kBitFieldOffset));
__ Tbnz(scratch, Map::kIsAccessCheckNeeded, slow);
__ Tbnz(scratch, interceptor_bit, slow);
// Check that the object is some kind of JS object EXCEPT JS Value type.
// In the case that the object is a value-wrapper object, we enter the
// runtime system to make sure that indexing into string objects works
// as intended.
STATIC_ASSERT(JS_OBJECT_TYPE > JS_VALUE_TYPE);
__ Ldrb(scratch, FieldMemOperand(map_scratch, Map::kInstanceTypeOffset));
__ Cmp(scratch, JS_OBJECT_TYPE);
__ B(lt, slow);
}
// Loads an indexed element from a fast case array.
//
// receiver - holds the receiver on entry.
// Unchanged unless 'result' is the same register.
//
// key - holds the smi key on entry.
// Unchanged unless 'result' is the same register.
//
// elements - holds the elements of the receiver and its prototypes. Clobbered.
//
// result - holds the result on exit if the load succeeded.
// Allowed to be the same as 'receiver' or 'key'.
// Unchanged on bailout so 'receiver' and 'key' can be safely
// used by further computation.
static void GenerateFastArrayLoad(MacroAssembler* masm, Register receiver,
Register key, Register elements,
Register scratch1, Register scratch2,
Register result, Label* slow,
LanguageMode language_mode) {
DCHECK(!AreAliased(receiver, key, elements, scratch1, scratch2));
Label check_prototypes, check_next_prototype;
Label done, in_bounds, absent;
// Check for fast array.
__ Ldr(elements, FieldMemOperand(receiver, JSObject::kElementsOffset));
__ AssertFastElements(elements);
// Check that the key (index) is within bounds.
__ Ldr(scratch1, FieldMemOperand(elements, FixedArray::kLengthOffset));
__ Cmp(key, scratch1);
__ B(lo, &in_bounds);
// Out of bounds. Check the prototype chain to see if we can just return
// 'undefined'.
__ Cmp(key, Operand(Smi::FromInt(0)));
__ B(lt, slow); // Negative keys can't take the fast OOB path.
__ Bind(&check_prototypes);
__ Ldr(scratch2, FieldMemOperand(receiver, HeapObject::kMapOffset));
__ Bind(&check_next_prototype);
__ Ldr(scratch2, FieldMemOperand(scratch2, Map::kPrototypeOffset));
// scratch2: current prototype
__ JumpIfRoot(scratch2, Heap::kNullValueRootIndex, &absent);
__ Ldr(elements, FieldMemOperand(scratch2, JSObject::kElementsOffset));
__ Ldr(scratch2, FieldMemOperand(scratch2, HeapObject::kMapOffset));
// elements: elements of current prototype
// scratch2: map of current prototype
__ CompareInstanceType(scratch2, scratch1, JS_OBJECT_TYPE);
__ B(lo, slow);
__ Ldrb(scratch1, FieldMemOperand(scratch2, Map::kBitFieldOffset));
__ Tbnz(scratch1, Map::kIsAccessCheckNeeded, slow);
__ Tbnz(scratch1, Map::kHasIndexedInterceptor, slow);
__ JumpIfNotRoot(elements, Heap::kEmptyFixedArrayRootIndex, slow);
__ B(&check_next_prototype);
__ Bind(&absent);
if (is_strong(language_mode)) {
// Strong mode accesses must throw in this case, so call the runtime.
__ B(slow);
} else {
__ LoadRoot(result, Heap::kUndefinedValueRootIndex);
__ B(&done);
}
__ Bind(&in_bounds);
// Fast case: Do the load.
__ Add(scratch1, elements, FixedArray::kHeaderSize - kHeapObjectTag);
__ SmiUntag(scratch2, key);
__ Ldr(scratch2, MemOperand(scratch1, scratch2, LSL, kPointerSizeLog2));
// In case the loaded value is the_hole we have to check the prototype chain.
__ JumpIfRoot(scratch2, Heap::kTheHoleValueRootIndex, &check_prototypes);
// Move the value to the result register.
// 'result' can alias with 'receiver' or 'key' but these two must be
// preserved if we jump to 'slow'.
__ Mov(result, scratch2);
__ Bind(&done);
}
// Checks whether a key is an array index string or a unique name.
// Falls through if a key is a unique name.
// The map of the key is returned in 'map_scratch'.
// If the jump to 'index_string' is done the hash of the key is left
// in 'hash_scratch'.
static void GenerateKeyNameCheck(MacroAssembler* masm, Register key,
Register map_scratch, Register hash_scratch,
Label* index_string, Label* not_unique) {
DCHECK(!AreAliased(key, map_scratch, hash_scratch));
// Is the key a name?
Label unique;
__ JumpIfObjectType(key, map_scratch, hash_scratch, LAST_UNIQUE_NAME_TYPE,
not_unique, hi);
STATIC_ASSERT(LAST_UNIQUE_NAME_TYPE == FIRST_NONSTRING_TYPE);
__ B(eq, &unique);
// Is the string an array index with cached numeric value?
__ Ldr(hash_scratch.W(), FieldMemOperand(key, Name::kHashFieldOffset));
__ TestAndBranchIfAllClear(hash_scratch, Name::kContainsCachedArrayIndexMask,
index_string);
// Is the string internalized? We know it's a string, so a single bit test is
// enough.
__ Ldrb(hash_scratch, FieldMemOperand(map_scratch, Map::kInstanceTypeOffset));
STATIC_ASSERT(kInternalizedTag == 0);
__ TestAndBranchIfAnySet(hash_scratch, kIsNotInternalizedMask, not_unique);
__ Bind(&unique);
// Fall through if the key is a unique name.
}
void LoadIC::GenerateNormal(MacroAssembler* masm, LanguageMode language_mode) {
Register dictionary = x0;
DCHECK(!dictionary.is(LoadDescriptor::ReceiverRegister()));
DCHECK(!dictionary.is(LoadDescriptor::NameRegister()));
Label slow;
__ Ldr(dictionary, FieldMemOperand(LoadDescriptor::ReceiverRegister(),
JSObject::kPropertiesOffset));
GenerateDictionaryLoad(masm, &slow, dictionary,
LoadDescriptor::NameRegister(), x0, x3, x4);
__ Ret();
// Dictionary load failed, go slow (but don't miss).
__ Bind(&slow);
GenerateRuntimeGetProperty(masm, language_mode);
}
void LoadIC::GenerateMiss(MacroAssembler* masm) {
// The return address is in lr.
Isolate* isolate = masm->isolate();
ASM_LOCATION("LoadIC::GenerateMiss");
DCHECK(!AreAliased(x4, x5, LoadWithVectorDescriptor::SlotRegister(),
LoadWithVectorDescriptor::VectorRegister()));
__ IncrementCounter(isolate->counters()->load_miss(), 1, x4, x5);
// Perform tail call to the entry.
__ Push(LoadWithVectorDescriptor::ReceiverRegister(),
LoadWithVectorDescriptor::NameRegister(),
LoadWithVectorDescriptor::SlotRegister(),
LoadWithVectorDescriptor::VectorRegister());
int arg_count = 4;
__ TailCallRuntime(Runtime::kLoadIC_Miss, arg_count, 1);
}
void LoadIC::GenerateRuntimeGetProperty(MacroAssembler* masm,
LanguageMode language_mode) {
// The return address is in lr.
__ Push(LoadDescriptor::ReceiverRegister(), LoadDescriptor::NameRegister());
// Do tail-call to runtime routine.
__ TailCallRuntime(is_strong(language_mode) ? Runtime::kGetPropertyStrong
: Runtime::kGetProperty,
2, 1);
}
void KeyedLoadIC::GenerateMiss(MacroAssembler* masm) {
// The return address is in lr.
Isolate* isolate = masm->isolate();
DCHECK(!AreAliased(x10, x11, LoadWithVectorDescriptor::SlotRegister(),
LoadWithVectorDescriptor::VectorRegister()));
__ IncrementCounter(isolate->counters()->keyed_load_miss(), 1, x10, x11);
__ Push(LoadWithVectorDescriptor::ReceiverRegister(),
LoadWithVectorDescriptor::NameRegister(),
LoadWithVectorDescriptor::SlotRegister(),
LoadWithVectorDescriptor::VectorRegister());
// Perform tail call to the entry.
int arg_count = 4;
__ TailCallRuntime(Runtime::kKeyedLoadIC_Miss, arg_count, 1);
}
void KeyedLoadIC::GenerateRuntimeGetProperty(MacroAssembler* masm,
LanguageMode language_mode) {
// The return address is in lr.
__ Push(LoadDescriptor::ReceiverRegister(), LoadDescriptor::NameRegister());
// Do tail-call to runtime routine.
__ TailCallRuntime(is_strong(language_mode) ? Runtime::kKeyedGetPropertyStrong
: Runtime::kKeyedGetProperty,
2, 1);
}
static void GenerateKeyedLoadWithSmiKey(MacroAssembler* masm, Register key,
Register receiver, Register scratch1,
Register scratch2, Register scratch3,
Register scratch4, Register scratch5,
Label* slow,
LanguageMode language_mode) {
DCHECK(!AreAliased(key, receiver, scratch1, scratch2, scratch3, scratch4,
scratch5));
Isolate* isolate = masm->isolate();
Label check_number_dictionary;
// If we can load the value, it should be returned in x0.
Register result = x0;
GenerateKeyedLoadReceiverCheck(masm, receiver, scratch1, scratch2,
Map::kHasIndexedInterceptor, slow);
// Check the receiver's map to see if it has fast elements.
__ CheckFastElements(scratch1, scratch2, &check_number_dictionary);
GenerateFastArrayLoad(masm, receiver, key, scratch3, scratch2, scratch1,
result, slow, language_mode);
__ IncrementCounter(isolate->counters()->keyed_load_generic_smi(), 1,
scratch1, scratch2);
__ Ret();
__ Bind(&check_number_dictionary);
__ Ldr(scratch3, FieldMemOperand(receiver, JSObject::kElementsOffset));
__ Ldr(scratch2, FieldMemOperand(scratch3, JSObject::kMapOffset));
// Check whether we have a number dictionary.
__ JumpIfNotRoot(scratch2, Heap::kHashTableMapRootIndex, slow);
__ LoadFromNumberDictionary(slow, scratch3, key, result, scratch1, scratch2,
scratch4, scratch5);
__ Ret();
}
static void GenerateKeyedLoadWithNameKey(MacroAssembler* masm, Register key,
Register receiver, Register scratch1,
Register scratch2, Register scratch3,
Register scratch4, Register scratch5,
Label* slow) {
DCHECK(!AreAliased(key, receiver, scratch1, scratch2, scratch3, scratch4,
scratch5));
Isolate* isolate = masm->isolate();
Label probe_dictionary, property_array_property;
// If we can load the value, it should be returned in x0.
Register result = x0;
GenerateKeyedLoadReceiverCheck(masm, receiver, scratch1, scratch2,
Map::kHasNamedInterceptor, slow);
// If the receiver is a fast-case object, check the stub cache. Otherwise
// probe the dictionary.
__ Ldr(scratch2, FieldMemOperand(receiver, JSObject::kPropertiesOffset));
__ Ldr(scratch3, FieldMemOperand(scratch2, HeapObject::kMapOffset));
__ JumpIfRoot(scratch3, Heap::kHashTableMapRootIndex, &probe_dictionary);
// The handlers in the stub cache expect a vector and slot. Since we won't
// change the IC from any downstream misses, a dummy vector can be used.
Register vector = LoadWithVectorDescriptor::VectorRegister();
Register slot = LoadWithVectorDescriptor::SlotRegister();
DCHECK(!AreAliased(vector, slot, scratch1, scratch2, scratch3, scratch4));
Handle<TypeFeedbackVector> dummy_vector =
TypeFeedbackVector::DummyVector(masm->isolate());
int slot_index = dummy_vector->GetIndex(
FeedbackVectorSlot(TypeFeedbackVector::kDummyKeyedLoadICSlot));
__ LoadRoot(vector, Heap::kDummyVectorRootIndex);
__ Mov(slot, Operand(Smi::FromInt(slot_index)));
Code::Flags flags = Code::RemoveTypeAndHolderFromFlags(
Code::ComputeHandlerFlags(Code::LOAD_IC));
masm->isolate()->stub_cache()->GenerateProbe(masm, Code::KEYED_LOAD_IC, flags,
receiver, key, scratch1,
scratch2, scratch3, scratch4);
// Cache miss.
KeyedLoadIC::GenerateMiss(masm);
// Do a quick inline probe of the receiver's dictionary, if it exists.
__ Bind(&probe_dictionary);
__ Ldr(scratch1, FieldMemOperand(receiver, HeapObject::kMapOffset));
__ Ldrb(scratch1, FieldMemOperand(scratch1, Map::kInstanceTypeOffset));
GenerateGlobalInstanceTypeCheck(masm, scratch1, slow);
// Load the property.
GenerateDictionaryLoad(masm, slow, scratch2, key, result, scratch1, scratch3);
__ IncrementCounter(isolate->counters()->keyed_load_generic_symbol(), 1,
scratch1, scratch2);
__ Ret();
}
void KeyedLoadIC::GenerateMegamorphic(MacroAssembler* masm,
LanguageMode language_mode) {
// The return address is in lr.
Label slow, check_name, index_smi, index_name;
Register key = LoadDescriptor::NameRegister();
Register receiver = LoadDescriptor::ReceiverRegister();
DCHECK(key.is(x2));
DCHECK(receiver.is(x1));
__ JumpIfNotSmi(key, &check_name);
__ Bind(&index_smi);
// Now the key is known to be a smi. Execution also jumps here from below,
// where a numeric string is converted to a smi.
GenerateKeyedLoadWithSmiKey(masm, key, receiver, x7, x3, x4, x5, x6, &slow,
language_mode);
// Slow case.
__ Bind(&slow);
__ IncrementCounter(masm->isolate()->counters()->keyed_load_generic_slow(), 1,
x4, x3);
GenerateRuntimeGetProperty(masm, language_mode);
__ Bind(&check_name);
GenerateKeyNameCheck(masm, key, x0, x3, &index_name, &slow);
GenerateKeyedLoadWithNameKey(masm, key, receiver, x4, x5, x6, x7, x3, &slow);
__ Bind(&index_name);
__ IndexFromHash(x3, key);
// Now jump to the place where smi keys are handled.
__ B(&index_smi);
}
static void StoreIC_PushArgs(MacroAssembler* masm) {
if (FLAG_vector_stores) {
__ Push(StoreDescriptor::ReceiverRegister(),
StoreDescriptor::NameRegister(), StoreDescriptor::ValueRegister(),
VectorStoreICDescriptor::SlotRegister(),
VectorStoreICDescriptor::VectorRegister());
} else {
__ Push(StoreDescriptor::ReceiverRegister(),
StoreDescriptor::NameRegister(), StoreDescriptor::ValueRegister());
}
}
void KeyedStoreIC::GenerateMiss(MacroAssembler* masm) {
ASM_LOCATION("KeyedStoreIC::GenerateMiss");
StoreIC_PushArgs(masm);
int args = FLAG_vector_stores ? 5 : 3;
__ TailCallRuntime(Runtime::kKeyedStoreIC_Miss, args, 1);
}
static void KeyedStoreGenerateMegamorphicHelper(
MacroAssembler* masm, Label* fast_object, Label* fast_double, Label* slow,
KeyedStoreCheckMap check_map, KeyedStoreIncrementLength increment_length,
Register value, Register key, Register receiver, Register receiver_map,
Register elements_map, Register elements) {
DCHECK(!AreAliased(value, key, receiver, receiver_map, elements_map, elements,
x10, x11));
Label transition_smi_elements;
Label transition_double_elements;
Label fast_double_without_map_check;
Label non_double_value;
Label finish_store;
__ Bind(fast_object);
if (check_map == kCheckMap) {
__ Ldr(elements_map, FieldMemOperand(elements, HeapObject::kMapOffset));
__ Cmp(elements_map,
Operand(masm->isolate()->factory()->fixed_array_map()));
__ B(ne, fast_double);
}
// HOLECHECK: guards "A[i] = V"
// We have to go to the runtime if the current value is the hole because there
// may be a callback on the element.
Label holecheck_passed;
__ Add(x10, elements, FixedArray::kHeaderSize - kHeapObjectTag);
__ Add(x10, x10, Operand::UntagSmiAndScale(key, kPointerSizeLog2));
__ Ldr(x11, MemOperand(x10));
__ JumpIfNotRoot(x11, Heap::kTheHoleValueRootIndex, &holecheck_passed);
__ JumpIfDictionaryInPrototypeChain(receiver, elements_map, x10, slow);
__ bind(&holecheck_passed);
// Smi stores don't require further checks.
__ JumpIfSmi(value, &finish_store);
// Escape to elements kind transition case.
__ CheckFastObjectElements(receiver_map, x10, &transition_smi_elements);
__ Bind(&finish_store);
if (increment_length == kIncrementLength) {
// Add 1 to receiver->length.
__ Add(x10, key, Smi::FromInt(1));
__ Str(x10, FieldMemOperand(receiver, JSArray::kLengthOffset));
}
Register address = x11;
__ Add(address, elements, FixedArray::kHeaderSize - kHeapObjectTag);
__ Add(address, address, Operand::UntagSmiAndScale(key, kPointerSizeLog2));
__ Str(value, MemOperand(address));
Label dont_record_write;
__ JumpIfSmi(value, &dont_record_write);
// Update write barrier for the elements array address.
__ Mov(x10, value); // Preserve the value which is returned.
__ RecordWrite(elements, address, x10, kLRHasNotBeenSaved, kDontSaveFPRegs,
EMIT_REMEMBERED_SET, OMIT_SMI_CHECK);
__ Bind(&dont_record_write);
__ Ret();
__ Bind(fast_double);
if (check_map == kCheckMap) {
// Check for fast double array case. If this fails, call through to the
// runtime.
__ JumpIfNotRoot(elements_map, Heap::kFixedDoubleArrayMapRootIndex, slow);
}
// HOLECHECK: guards "A[i] double hole?"
// We have to see if the double version of the hole is present. If so go to
// the runtime.
__ Add(x10, elements, FixedDoubleArray::kHeaderSize - kHeapObjectTag);
__ Add(x10, x10, Operand::UntagSmiAndScale(key, kPointerSizeLog2));
__ Ldr(x11, MemOperand(x10));
__ CompareAndBranch(x11, kHoleNanInt64, ne, &fast_double_without_map_check);
__ JumpIfDictionaryInPrototypeChain(receiver, elements_map, x10, slow);
__ Bind(&fast_double_without_map_check);
__ StoreNumberToDoubleElements(value, key, elements, x10, d0,
&transition_double_elements);
if (increment_length == kIncrementLength) {
// Add 1 to receiver->length.
__ Add(x10, key, Smi::FromInt(1));
__ Str(x10, FieldMemOperand(receiver, JSArray::kLengthOffset));
}
__ Ret();
__ Bind(&transition_smi_elements);
// Transition the array appropriately depending on the value type.
__ Ldr(x10, FieldMemOperand(value, HeapObject::kMapOffset));
__ JumpIfNotRoot(x10, Heap::kHeapNumberMapRootIndex, &non_double_value);
// Value is a double. Transition FAST_SMI_ELEMENTS ->
// FAST_DOUBLE_ELEMENTS and complete the store.
__ LoadTransitionedArrayMapConditional(
FAST_SMI_ELEMENTS, FAST_DOUBLE_ELEMENTS, receiver_map, x10, x11, slow);
AllocationSiteMode mode =
AllocationSite::GetMode(FAST_SMI_ELEMENTS, FAST_DOUBLE_ELEMENTS);
ElementsTransitionGenerator::GenerateSmiToDouble(masm, receiver, key, value,
receiver_map, mode, slow);
__ Ldr(elements, FieldMemOperand(receiver, JSObject::kElementsOffset));
__ B(&fast_double_without_map_check);
__ Bind(&non_double_value);
// Value is not a double, FAST_SMI_ELEMENTS -> FAST_ELEMENTS.
__ LoadTransitionedArrayMapConditional(FAST_SMI_ELEMENTS, FAST_ELEMENTS,
receiver_map, x10, x11, slow);
mode = AllocationSite::GetMode(FAST_SMI_ELEMENTS, FAST_ELEMENTS);
ElementsTransitionGenerator::GenerateMapChangeElementsTransition(
masm, receiver, key, value, receiver_map, mode, slow);
__ Ldr(elements, FieldMemOperand(receiver, JSObject::kElementsOffset));
__ B(&finish_store);
__ Bind(&transition_double_elements);
// Elements are FAST_DOUBLE_ELEMENTS, but value is an Object that's not a
// HeapNumber. Make sure that the receiver is a Array with FAST_ELEMENTS and
// transition array from FAST_DOUBLE_ELEMENTS to FAST_ELEMENTS
__ LoadTransitionedArrayMapConditional(FAST_DOUBLE_ELEMENTS, FAST_ELEMENTS,
receiver_map, x10, x11, slow);
mode = AllocationSite::GetMode(FAST_DOUBLE_ELEMENTS, FAST_ELEMENTS);
ElementsTransitionGenerator::GenerateDoubleToObject(
masm, receiver, key, value, receiver_map, mode, slow);
__ Ldr(elements, FieldMemOperand(receiver, JSObject::kElementsOffset));
__ B(&finish_store);
}
void KeyedStoreIC::GenerateMegamorphic(MacroAssembler* masm,
LanguageMode language_mode) {
ASM_LOCATION("KeyedStoreIC::GenerateMegamorphic");
Label slow;
Label array;
Label fast_object;
Label extra;
Label fast_object_grow;
Label fast_double_grow;
Label fast_double;
Label maybe_name_key;
Label miss;
Register value = StoreDescriptor::ValueRegister();
Register key = StoreDescriptor::NameRegister();
Register receiver = StoreDescriptor::ReceiverRegister();
DCHECK(receiver.is(x1));
DCHECK(key.is(x2));
DCHECK(value.is(x0));
Register receiver_map = x3;
Register elements = x4;
Register elements_map = x5;
__ JumpIfNotSmi(key, &maybe_name_key);
__ JumpIfSmi(receiver, &slow);
__ Ldr(receiver_map, FieldMemOperand(receiver, HeapObject::kMapOffset));
// Check that the receiver does not require access checks and is not observed.
// The generic stub does not perform map checks or handle observed objects.
__ Ldrb(x10, FieldMemOperand(receiver_map, Map::kBitFieldOffset));
__ TestAndBranchIfAnySet(
x10, (1 << Map::kIsAccessCheckNeeded) | (1 << Map::kIsObserved), &slow);
// Check if the object is a JS array or not.
Register instance_type = x10;
__ CompareInstanceType(receiver_map, instance_type, JS_ARRAY_TYPE);
__ B(eq, &array);
// Check that the object is some kind of JS object EXCEPT JS Value type. In
// the case that the object is a value-wrapper object, we enter the runtime
// system to make sure that indexing into string objects works as intended.
STATIC_ASSERT(JS_VALUE_TYPE < JS_OBJECT_TYPE);
__ Cmp(instance_type, JS_OBJECT_TYPE);
__ B(lo, &slow);
// Object case: Check key against length in the elements array.
__ Ldr(elements, FieldMemOperand(receiver, JSObject::kElementsOffset));
// Check array bounds. Both the key and the length of FixedArray are smis.
__ Ldrsw(x10, UntagSmiFieldMemOperand(elements, FixedArray::kLengthOffset));
__ Cmp(x10, Operand::UntagSmi(key));
__ B(hi, &fast_object);
__ Bind(&slow);
// Slow case, handle jump to runtime.
// Live values:
// x0: value
// x1: key
// x2: receiver
PropertyICCompiler::GenerateRuntimeSetProperty(masm, language_mode);
// Never returns to here.
__ bind(&maybe_name_key);
__ Ldr(x10, FieldMemOperand(key, HeapObject::kMapOffset));
__ Ldrb(x10, FieldMemOperand(x10, Map::kInstanceTypeOffset));
__ JumpIfNotUniqueNameInstanceType(x10, &slow);
if (FLAG_vector_stores) {
// The handlers in the stub cache expect a vector and slot. Since we won't
// change the IC from any downstream misses, a dummy vector can be used.
Register vector = VectorStoreICDescriptor::VectorRegister();
Register slot = VectorStoreICDescriptor::SlotRegister();
DCHECK(!AreAliased(vector, slot, x5, x6, x7, x8));
Handle<TypeFeedbackVector> dummy_vector =
TypeFeedbackVector::DummyVector(masm->isolate());
int slot_index = dummy_vector->GetIndex(
FeedbackVectorSlot(TypeFeedbackVector::kDummyKeyedStoreICSlot));
__ LoadRoot(vector, Heap::kDummyVectorRootIndex);
__ Mov(slot, Operand(Smi::FromInt(slot_index)));
}
Code::Flags flags = Code::RemoveTypeAndHolderFromFlags(
Code::ComputeHandlerFlags(Code::STORE_IC));
masm->isolate()->stub_cache()->GenerateProbe(masm, Code::STORE_IC, flags,
receiver, key, x5, x6, x7, x8);
// Cache miss.
__ B(&miss);
__ Bind(&extra);
// Extra capacity case: Check if there is extra capacity to
// perform the store and update the length. Used for adding one
// element to the array by writing to array[array.length].
// Check for room in the elements backing store.
// Both the key and the length of FixedArray are smis.
__ Ldrsw(x10, UntagSmiFieldMemOperand(elements, FixedArray::kLengthOffset));
__ Cmp(x10, Operand::UntagSmi(key));
__ B(ls, &slow);
__ Ldr(elements_map, FieldMemOperand(elements, HeapObject::kMapOffset));
__ Cmp(elements_map, Operand(masm->isolate()->factory()->fixed_array_map()));
__ B(eq, &fast_object_grow);
__ Cmp(elements_map,
Operand(masm->isolate()->factory()->fixed_double_array_map()));
__ B(eq, &fast_double_grow);
__ B(&slow);
__ Bind(&array);
// Array case: Get the length and the elements array from the JS
// array. Check that the array is in fast mode (and writable); if it
// is the length is always a smi.
__ Ldr(elements, FieldMemOperand(receiver, JSObject::kElementsOffset));
// Check the key against the length in the array.
__ Ldrsw(x10, UntagSmiFieldMemOperand(receiver, JSArray::kLengthOffset));
__ Cmp(x10, Operand::UntagSmi(key));
__ B(eq, &extra); // We can handle the case where we are appending 1 element.
__ B(lo, &slow);
KeyedStoreGenerateMegamorphicHelper(
masm, &fast_object, &fast_double, &slow, kCheckMap, kDontIncrementLength,
value, key, receiver, receiver_map, elements_map, elements);
KeyedStoreGenerateMegamorphicHelper(masm, &fast_object_grow,
&fast_double_grow, &slow, kDontCheckMap,
kIncrementLength, value, key, receiver,
receiver_map, elements_map, elements);
__ bind(&miss);
GenerateMiss(masm);
}
void StoreIC::GenerateMegamorphic(MacroAssembler* masm) {
Register receiver = StoreDescriptor::ReceiverRegister();
Register name = StoreDescriptor::NameRegister();
DCHECK(!AreAliased(receiver, name, StoreDescriptor::ValueRegister(), x3, x4,
x5, x6));
// Probe the stub cache.
Code::Flags flags = Code::RemoveTypeAndHolderFromFlags(
Code::ComputeHandlerFlags(Code::STORE_IC));
masm->isolate()->stub_cache()->GenerateProbe(masm, Code::STORE_IC, flags,
receiver, name, x3, x4, x5, x6);
// Cache miss: Jump to runtime.
GenerateMiss(masm);
}
void StoreIC::GenerateMiss(MacroAssembler* masm) {
StoreIC_PushArgs(masm);
// Tail call to the entry.
int args = FLAG_vector_stores ? 5 : 3;
__ TailCallRuntime(Runtime::kStoreIC_Miss, args, 1);
}
void StoreIC::GenerateNormal(MacroAssembler* masm) {
Label miss;
Register value = StoreDescriptor::ValueRegister();
Register receiver = StoreDescriptor::ReceiverRegister();
Register name = StoreDescriptor::NameRegister();
Register dictionary = x5;
DCHECK(!AreAliased(value, receiver, name,
VectorStoreICDescriptor::SlotRegister(),
VectorStoreICDescriptor::VectorRegister(), x5, x6, x7));
__ Ldr(dictionary, FieldMemOperand(receiver, JSObject::kPropertiesOffset));
GenerateDictionaryStore(masm, &miss, dictionary, name, value, x6, x7);
Counters* counters = masm->isolate()->counters();
__ IncrementCounter(counters->store_normal_hit(), 1, x6, x7);
__ Ret();
// Cache miss: Jump to runtime.
__ Bind(&miss);
__ IncrementCounter(counters->store_normal_miss(), 1, x6, x7);
GenerateMiss(masm);
}
Condition CompareIC::ComputeCondition(Token::Value op) {
switch (op) {
case Token::EQ_STRICT:
case Token::EQ:
return eq;
case Token::LT:
return lt;
case Token::GT:
return gt;
case Token::LTE:
return le;
case Token::GTE:
return ge;
default:
UNREACHABLE();
return al;
}
}
bool CompareIC::HasInlinedSmiCode(Address address) {
// The address of the instruction following the call.
Address info_address = Assembler::return_address_from_call_start(address);
InstructionSequence* patch_info = InstructionSequence::At(info_address);
return patch_info->IsInlineData();
}
// Activate a SMI fast-path by patching the instructions generated by
// JumpPatchSite::EmitJumpIf(Not)Smi(), using the information encoded by
// JumpPatchSite::EmitPatchInfo().
void PatchInlinedSmiCode(Address address, InlinedSmiCheck check) {
// The patch information is encoded in the instruction stream using
// instructions which have no side effects, so we can safely execute them.
// The patch information is encoded directly after the call to the helper
// function which is requesting this patch operation.
Address info_address = Assembler::return_address_from_call_start(address);
InlineSmiCheckInfo info(info_address);
// Check and decode the patch information instruction.
if (!info.HasSmiCheck()) {
return;
}
if (FLAG_trace_ic) {
PrintF("[ Patching ic at %p, marker=%p, SMI check=%p\n", address,
info_address, reinterpret_cast<void*>(info.SmiCheck()));
}
// Patch and activate code generated by JumpPatchSite::EmitJumpIfNotSmi()
// and JumpPatchSite::EmitJumpIfSmi().
// Changing
// tb(n)z xzr, #0, <target>
// to
// tb(!n)z test_reg, #0, <target>
Instruction* to_patch = info.SmiCheck();
PatchingAssembler patcher(to_patch, 1);
DCHECK(to_patch->IsTestBranch());
DCHECK(to_patch->ImmTestBranchBit5() == 0);
DCHECK(to_patch->ImmTestBranchBit40() == 0);
STATIC_ASSERT(kSmiTag == 0);
STATIC_ASSERT(kSmiTagMask == 1);
int branch_imm = to_patch->ImmTestBranch();
Register smi_reg;
if (check == ENABLE_INLINED_SMI_CHECK) {
DCHECK(to_patch->Rt() == xzr.code());
smi_reg = info.SmiRegister();
} else {
DCHECK(check == DISABLE_INLINED_SMI_CHECK);
DCHECK(to_patch->Rt() != xzr.code());
smi_reg = xzr;
}
if (to_patch->Mask(TestBranchMask) == TBZ) {
// This is JumpIfNotSmi(smi_reg, branch_imm).
patcher.tbnz(smi_reg, 0, branch_imm);
} else {
DCHECK(to_patch->Mask(TestBranchMask) == TBNZ);
// This is JumpIfSmi(smi_reg, branch_imm).
patcher.tbz(smi_reg, 0, branch_imm);
}
}
} // namespace internal
} // namespace v8
#endif // V8_TARGET_ARCH_ARM64
```
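The masked property-details test in `GenerateDictionaryStore` above (`Tst scratch1, kTypeAndReadOnlyMask` followed by `B ne, miss`) can be summarized in a short sketch. The bit positions below are illustrative assumptions for the sake of the example, not V8's real `PropertyDetails` encoding:

```python
# Conceptual model of the check GenerateDictionaryStore performs before a
# fast-path store: bail out to the miss label unless the property is a
# "normal" data property that is not read-only.

TYPE_MASK = 0b111         # assumed width of the PropertyDetails type field
NORMAL = 0b000            # a normal data property has all type bits clear
READ_ONLY = 0b1000        # assumed position of the READ_ONLY attribute bit

TYPE_AND_READ_ONLY_MASK = TYPE_MASK | READ_ONLY

def can_store_fast(details: int) -> bool:
    """Fast path is taken only if every masked bit is clear, i.e. the
    property type is NORMAL and the READ_ONLY attribute is not set."""
    return (details & TYPE_AND_READ_ONLY_MASK) == 0

print(can_store_fast(NORMAL))     # True: normal, writable property
print(can_store_fast(READ_ONLY))  # False: read-only, go to miss
print(can_store_fast(0b010))      # False: non-normal type, go to miss
```

A single `Tst`/`B ne` pair covers both conditions at once because the type field and the read-only bit are folded into one mask, which is why the assembly needs only one branch.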
|
Iván Pozo (born 26 August 1979 in Vigo, Spain) is a retired Spanish professional boxer. Pozo is a former EBU European flyweight champion and a former WBO Inter-Continental flyweight champion. Later in his career he fought at bantamweight, defending his WBC Mundo Hispano title by defeating Adonis Rivas by unanimous decision on March 4, 2011.
References
1979 births
Flyweight boxers
Living people
Spanish male boxers
20th-century Spanish people
21st-century Spanish people
|
Iry-pat ( "member of the elite") was an ancient Egyptian ranking title, that is, a title announcing a high position in the hierarchy of the country. Iry-pat was the highest ranking title at the royal court, and only the most important officials could bear it. The title is already attested in the First Dynasty: one of its first holders was Merka, an official under king Qa'a.
In the New Kingdom, the title was often held by the crown prince, announcing that its holder was the second-ranking ruler in the country. It is therefore sometimes translated as Hereditary Prince or Crown Prince. Under Tutankhamun, Horemheb was officially designated the iry-pat, or successor to this pharaoh, but he did not succeed the boy king: Ay intervened to seize the throne instead for about four years before Horemheb assumed power as pharaoh.
References
Ancient Egyptian titles
|
Ehatisaht, also known as Ehatisaht Village and Ahateset, is a former First Nations village of the Nuu-chah-nulth people on northern Vancouver Island on the north shore of Esperanza Inlet. The native language is Nuučaan̓uɫ.
See also
Ehattesaht First Nation
References
Northern Vancouver Island
Settlements in British Columbia
Nuu-chah-nulth
|
```csharp
using System;
using UnityEngine;
namespace Microsoft.MixedReality.Toolkit.Utilities.Gltf.Schema
{
/// <summary>
/// The material appearance of a primitive.
/// </summary>
[Serializable]
public class GltfMaterial : GltfChildOfRootProperty
{
/// <summary>
/// A set of parameter values that are used to define the metallic-roughness
/// material model from Physically-Based Rendering (PBR) methodology.
/// </summary>
public GltfPbrMetallicRoughness pbrMetallicRoughness;
/// <summary>
/// A set of parameter values used to light flat-shaded materials
/// </summary>
public GltfMaterialCommonConstant commonConstant;
/// <summary>
/// A tangent space normal map. Each texel represents the XYZ components of a
/// normal vector in tangent space.
/// </summary>
public GltfNormalTextureInfo normalTexture;
/// <summary>
/// The occlusion map is a greyscale texture, with white indicating areas that
/// should receive full indirect lighting and black indicating no indirect
/// lighting.
/// </summary>
public GltfOcclusionTextureInfo occlusionTexture;
/// <summary>
/// The emissive map controls the color and intensity of the light being emitted
/// by the material. This texture contains RGB components in sRGB color space.
/// If a fourth component (A) is present, it is ignored.
/// </summary>
public GltfTextureInfo emissiveTexture;
/// <summary>
/// The RGB components of the emissive color of the material.
/// If an emissiveTexture is specified, this value is multiplied with the texel
/// values.
/// <items>
/// <minimum>0.0</minimum>
/// <maximum>1.0</maximum>
/// </items>
/// <minItems>3</minItems>
/// <maxItems>3</maxItems>
/// </summary>
public float[] emissiveFactor = { 0f, 0f, 0f };
/// <summary>
/// The material's alpha rendering mode enumeration specifying the interpretation of the
/// alpha value of the main factor and texture. In `OPAQUE` mode, the alpha value is
/// ignored and the rendered output is fully opaque. In `MASK` mode, the rendered output
/// is either fully opaque or fully transparent depending on the alpha value and the
/// specified alpha cutoff value. In `BLEND` mode, the alpha value is used to composite
/// the source and destination areas. The rendered output is combined with the background
/// using the normal painting operation (i.e. the Porter and Duff over operator).
/// </summary>
public string alphaMode;
/// <summary>
/// Specifies the cutoff threshold when in `MASK` mode. If the alpha value is greater than
/// or equal to this value then it is rendered as fully opaque, otherwise, it is rendered
/// as fully transparent. This value is ignored for other modes.
/// </summary>
public double alphaCutoff = 0.5f;
/// <summary>
/// Specifies whether the material is double sided. When this value is false, back-face
/// culling is enabled. When this value is true, back-face culling is disabled and double
/// sided lighting is enabled. The back-face must have its normals reversed before the
/// lighting equation is evaluated.
/// </summary>
public bool doubleSided;
/// <summary>
/// Unity Material wrapper for the GltfMaterial
/// </summary>
public Material Material { get; internal set; }
}
}
```
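The `alphaMode` doc comment above describes three distinct compositing behaviours. As a rough, library-independent sketch (the function name and structure are illustrative, not part of glTF or this toolkit), a renderer's interpretation of the mode can be modelled like this:

```python
def resolve_alpha(alpha_mode, alpha, alpha_cutoff=0.5):
    """Interpret a fragment's alpha per the glTF alphaMode rules.

    OPAQUE: alpha ignored, output fully opaque.
    MASK:   fully opaque if alpha >= cutoff, else fully transparent.
    BLEND:  alpha used as-is for source-over compositing.
    """
    if alpha_mode == "OPAQUE":
        return 1.0
    if alpha_mode == "MASK":
        return 1.0 if alpha >= alpha_cutoff else 0.0
    if alpha_mode == "BLEND":
        return alpha
    raise ValueError(f"unknown alphaMode: {alpha_mode!r}")
```

Note that in `MASK` mode the comparison is `>=` against `alphaCutoff`, matching the comment on that field: a texel exactly at the cutoff renders fully opaque.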
|
```haskell
{-# LANGUAGE AllowAmbiguousTypes #-}
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE DeriveAnyClass #-}
{-# LANGUAGE DeriveDataTypeable #-}
{-# LANGUAGE DerivingStrategies #-}
{-# LANGUAGE DuplicateRecordFields #-}
{-# LANGUAGE LambdaCase #-}
{-# LANGUAGE NamedFieldPuns #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE PolyKinds #-}
{-# LANGUAGE RecordWildCards #-}
{-# LANGUAGE TypeFamilies #-}
{-# LANGUAGE UndecidableInstances #-}
module PlutusTx.Blueprint.Schema where
import Control.Lens.Plated (Plated)
import Data.Aeson (ToJSON (..), (.=))
import Data.Aeson qualified as Aeson
import Data.Aeson.Extra (optionalField, requiredField)
import Data.Aeson.KeyMap qualified as KeyMap
import Data.ByteString (ByteString)
import Data.ByteString.Base16 qualified as Base16
import Data.Data (Data, Typeable)
import Data.Function ((&))
import Data.Kind (Type)
import Data.List.NonEmpty (NonEmpty, nonEmpty)
import Data.Text (Text)
import Data.Text.Encoding qualified as Text
import GHC.Generics (Generic)
import Numeric.Natural (Natural)
import PlutusTx.Blueprint.Definition.Id (DefinitionId, definitionIdToText)
import PlutusTx.Blueprint.Schema.Annotation (SchemaInfo, comment, description, title)
import Prelude hiding (max, maximum, min, minimum)
{- | Blueprint schema definition, as defined by the CIP-0057:
path_to_url#core-vocabulary
The 'referencedTypes' phantom type parameter is used to track the types used in the contract
making sure their schemas are included in the blueprint and that they are referenced
in a type-safe way.
-}
data Schema (referencedTypes :: [Type])
= SchemaInteger SchemaInfo IntegerSchema
| SchemaBytes SchemaInfo BytesSchema
| SchemaList SchemaInfo (ListSchema referencedTypes)
| SchemaMap SchemaInfo (MapSchema referencedTypes)
| SchemaConstructor SchemaInfo (ConstructorSchema referencedTypes)
| SchemaBuiltInData SchemaInfo
| SchemaBuiltInUnit SchemaInfo
| SchemaBuiltInBoolean SchemaInfo
| SchemaBuiltInInteger SchemaInfo
| SchemaBuiltInBytes SchemaInfo
| SchemaBuiltInString SchemaInfo
| SchemaBuiltInPair SchemaInfo (PairSchema referencedTypes)
| SchemaBuiltInList SchemaInfo (Schema referencedTypes)
| SchemaOneOf (NonEmpty (Schema referencedTypes))
| SchemaAnyOf (NonEmpty (Schema referencedTypes))
| SchemaAllOf (NonEmpty (Schema referencedTypes))
| SchemaNot (Schema referencedTypes)
| SchemaDefinitionRef DefinitionId
deriving stock (Eq, Ord, Show, Generic, Data)
deriving anyclass instance (Typeable referencedTypes) => Plated (Schema referencedTypes)
instance ToJSON (Schema referencedTypes) where
toJSON = \case
SchemaInteger info MkIntegerSchema{..} ->
dataType info "integer"
& optionalField "multipleOf" multipleOf
& optionalField "minimum" minimum
& optionalField "maximum" maximum
& optionalField "exclusiveMinimum" exclusiveMinimum
& optionalField "exclusiveMaximum" exclusiveMaximum
& Aeson.Object
SchemaBytes info MkBytesSchema{..} ->
dataType info "bytes"
& optionalField "enum" (fmap toHex <$> nonEmpty enum)
& optionalField "maxLength" maxLength
& optionalField "minLength" minLength
& Aeson.Object
where
toHex :: ByteString -> Text
toHex = Text.decodeUtf8 . Base16.encode
SchemaList info MkListSchema{..} ->
dataType info "list"
& requiredField "items" itemSchema
& optionalField "minItems" minItems
& optionalField "maxItems" maxItems
& optionalField "uniqueItems" uniqueItems
& Aeson.Object
SchemaMap info MkMapSchema{..} ->
dataType info "map"
& requiredField "keys" keySchema
& requiredField "values" valueSchema
& optionalField "minItems" minItems
& optionalField "maxItems" maxItems
& Aeson.Object
SchemaConstructor info MkConstructorSchema{..} ->
dataType info "constructor"
& requiredField "index" index
& requiredField "fields" fieldSchemas
& Aeson.Object
SchemaBuiltInData info ->
Aeson.Object $ infoFields info
SchemaBuiltInUnit info ->
Aeson.Object $ dataType info "#unit"
SchemaBuiltInBoolean info ->
Aeson.Object $ dataType info "#boolean"
SchemaBuiltInInteger info ->
Aeson.Object $ dataType info "#integer"
SchemaBuiltInBytes info ->
Aeson.Object $ dataType info "#bytes"
SchemaBuiltInString info ->
Aeson.Object $ dataType info "#string"
SchemaBuiltInPair info MkPairSchema{left, right} ->
dataType info "#pair"
& requiredField "left" left
& requiredField "right" right
& Aeson.Object
SchemaBuiltInList info schema ->
dataType info "#list"
& requiredField "items" schema
& Aeson.Object
SchemaOneOf schemas ->
Aeson.object ["oneOf" .= schemas]
SchemaAnyOf schemas ->
Aeson.object ["anyOf" .= schemas]
SchemaAllOf schemas ->
Aeson.object ["allOf" .= schemas]
SchemaNot schema ->
Aeson.object ["not" .= schema]
SchemaDefinitionRef definitionId ->
Aeson.object ["$ref" .= ("#/definitions/" <> definitionIdToText definitionId)]
where
dataType :: SchemaInfo -> String -> Aeson.Object
dataType info ty = requiredField "dataType" ty (infoFields info)
infoFields :: SchemaInfo -> Aeson.Object
infoFields info =
KeyMap.empty
& optionalField "title" (title info)
& optionalField "description" (description info)
& optionalField "$comment" (comment info)
data IntegerSchema = MkIntegerSchema
{ multipleOf :: Maybe Integer
-- ^ An instance is valid if division by this value results in an integer.
, minimum :: Maybe Integer
-- ^ An instance is valid only if it is greater than or exactly equal to "minimum".
, maximum :: Maybe Integer
-- ^ An instance is valid only if it is less than or exactly equal to "maximum".
, exclusiveMinimum :: Maybe Integer
-- ^ An instance is valid only if it is strictly greater than "exclusiveMinimum".
, exclusiveMaximum :: Maybe Integer
-- ^ An instance is valid only if it is strictly less than "exclusiveMaximum".
}
deriving stock (Eq, Ord, Show, Generic, Data)
emptyIntegerSchema :: IntegerSchema
emptyIntegerSchema =
MkIntegerSchema
{ multipleOf = Nothing
, minimum = Nothing
, maximum = Nothing
, exclusiveMinimum = Nothing
, exclusiveMaximum = Nothing
}
data BytesSchema = MkBytesSchema
{ enum :: [ByteString]
-- ^ An instance validates successfully if once hex-encoded,
-- its value matches one of the specified values.
, minLength :: Maybe Natural
-- ^ An instance is valid if its length is greater than, or equal to, this value.
, maxLength :: Maybe Natural
-- ^ An instance is valid if its length is less than, or equal to, this value.
}
deriving stock (Eq, Ord, Show, Generic, Data)
emptyBytesSchema :: BytesSchema
emptyBytesSchema = MkBytesSchema{enum = [], minLength = Nothing, maxLength = Nothing}
data ListSchema (referencedTypes :: [Type]) = MkListSchema
{ itemSchema :: Schema referencedTypes
-- ^ Element schema
, minItems :: Maybe Natural
-- ^ An array instance is valid if its size is greater than, or equal to, this value.
, maxItems :: Maybe Natural
-- ^ An array instance is valid if its size is less than, or equal to, this value.
, uniqueItems :: Maybe Bool
-- ^ If this value is false, the instance validates successfully.
-- If it is set to True, the instance validates successfully if all of its elements are unique.
}
deriving stock (Eq, Ord, Show, Generic, Data)
mkListSchema :: Schema referencedTypes -> ListSchema referencedTypes
mkListSchema itemSchema =
MkListSchema
{ itemSchema
, minItems = Nothing
, maxItems = Nothing
, uniqueItems = Nothing
}
data MapSchema (referencedTypes :: [Type]) = MkMapSchema
{ keySchema :: Schema referencedTypes
-- ^ Key schema
, valueSchema :: Schema referencedTypes
-- ^ Value schema
, minItems :: Maybe Natural
-- ^ A map instance is valid if its size is greater than, or equal to, this value.
, maxItems :: Maybe Natural
-- ^ A map instance is valid if its size is less than, or equal to, this value.
}
deriving stock (Eq, Ord, Show, Generic, Data)
data ConstructorSchema (referencedTypes :: [Type]) = MkConstructorSchema
{ index :: Natural
-- ^ Constructor index
, fieldSchemas :: [Schema referencedTypes]
-- ^ Field schemas
}
deriving stock (Eq, Ord, Show, Generic, Data)
data PairSchema (referencedTypes :: [Type]) = MkPairSchema
{ left :: Schema referencedTypes
-- ^ Schema of the first element
, right :: Schema referencedTypes
-- ^ Schema of the second element
}
deriving stock (Eq, Ord, Show, Generic, Data)
```
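To see the wire format the `ToJSON` instance above produces, here is a sketch in plain Python (the function is hypothetical, for illustration only) of the object emitted for an integer schema: the `dataType` field is always present, while optional keyword fields are simply dropped when absent, mirroring `optionalField`:

```python
def integer_schema_json(title=None, minimum=None, maximum=None):
    """Approximate the JSON emitted for a SchemaInteger (illustrative only)."""
    obj = {}
    if title is not None:           # infoFields: optional "title"
        obj["title"] = title
    obj["dataType"] = "integer"     # dataType: required field
    if minimum is not None:         # optionalField "minimum"
        obj["minimum"] = minimum
    if maximum is not None:         # optionalField "maximum"
        obj["maximum"] = maximum
    return obj
```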
|
```smalltalk
Class {
#name : 'ClyUnknownScopeTest',
#superclass : 'ClyScopeTest',
#category : 'Calypso-NavigationModel-Tests',
#package : 'Calypso-NavigationModel-Tests'
}
{ #category : 'running' }
ClyUnknownScopeTest >> createSampleScope [
^ClyUnknownScope new
]
{ #category : 'tests' }
ClyUnknownScopeTest >> testAdoptQuery [
| query adoptedQuery |
scope := self createSampleScope.
query := ClyReturningScopeBasisExampleQuery new.
adoptedQuery := scope adoptQuery: query.
self assert: adoptedQuery identicalTo: query
]
{ #category : 'tests' }
ClyUnknownScopeTest >> testConvertingToAnotherScopeClass [
| convertedScope |
scope := self createSampleScope.
convertedScope := scope asScope: ClyExampleScope.
self assert: convertedScope identicalTo: scope
]
{ #category : 'tests' }
ClyUnknownScopeTest >> testConvertingToNewBasis [
scope := self createSampleScope.
self assert: (scope withNewBasisObjects: #(newBasis)) identicalTo: scope.
self assert: scope basisObjects isEmpty
]
{ #category : 'tests' }
ClyUnknownScopeTest >> testCreationUnifiedInstance [
scope := self createSampleScope.
self assert: scope asUnifiedInstance identicalTo: scope
]
{ #category : 'tests' }
ClyUnknownScopeTest >> testGettingSingletonInstanceFromScratch [
ClyUnknownScope reset.
self assert: ClyUnknownScope instance identicalTo: ClyUnknownScope instance
]
{ #category : 'tests' }
ClyUnknownScopeTest >> testHasEmptyBasisObjects [
scope := self createSampleScope.
self assert: scope basisObjects isEmpty
]
{ #category : 'tests' }
ClyUnknownScopeTest >> testIsBasedOnEmptyBasis [
scope := self createSampleScope.
self assert: scope isBasedOnEmptyBasis
]
{ #category : 'tests' }
ClyUnknownScopeTest >> testIsNotBasedOnAnyObject [
scope := self createSampleScope.
self deny: (scope isBasedOn: #anyObject)
]
{ #category : 'tests' }
ClyUnknownScopeTest >> testIsNotBasedOnMultipleBasis [
scope := self createSampleScope.
self deny: scope isBasedOnMultipleBasis
]
{ #category : 'tests' }
ClyUnknownScopeTest >> testIsNotBasedOnSingleBasis [
scope := self createSampleScope.
self deny: scope isBasedOnSingleBasis
]
```
|
Nomex is a flame-resistant meta-aramid material developed in the early 1960s by DuPont and first marketed in 1967.
Properties
Nomex and related aramid polymers are related to nylon, but have aromatic backbones, and hence are more rigid and more durable. Nomex is an example of a meta variant of the aramids (Kevlar is a para aramid). Unlike Kevlar, Nomex strands cannot align during filament polymerization and have less strength: its ultimate tensile strength is 340 MPa. However, it has excellent thermal, chemical, and radiation resistance for a polymer material. It can withstand temperatures of up to 370 °C.
Production
Nomex is produced by condensation reaction from the monomers m-phenylenediamine and isophthaloyl chloride.
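Schematically (a simplified scheme, omitting reaction conditions), the polycondensation joins the two monomers and eliminates hydrogen chloride:

```latex
n\,\mathrm{H_2N{-}C_6H_4{-}NH_2}
  + n\,\mathrm{ClOC{-}C_6H_4{-}COCl}
  \longrightarrow
  \left[\mathrm{{-}HN{-}C_6H_4{-}NH{-}CO{-}C_6H_4{-}CO{-}}\right]_n
  + 2n\,\mathrm{HCl}
```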
It is sold in both fiber and sheet forms and is used as a fabric where resistance to heat and flame is required. Nomex sheet is a calendered paper, manufactured much like conventional paper. Nomex Type 410 paper was the first Nomex paper developed and remains one of the highest-volume grades, used mostly for electrical insulation. Nomex fiber is made in the United States.
Wilfred Sweeny (1926–2011), the DuPont scientist responsible for discoveries leading to Nomex, earned a DuPont Lavoisier Medal in 2002 partly for this work.
Applications
Nomex Paper is used in electrical laminates such as circuit boards and transformer cores as well as fireproof honeycomb structures where it is saturated with a phenolic resin. Honeycomb structures such as these, as well as mylar-Nomex laminates, are used extensively in aircraft construction. Firefighting, military aviation, and vehicle racing industries use Nomex to create clothing and equipment that can withstand intense heat.
A Nomex hood is a common piece of racing and firefighting equipment. It is placed on the head on top of a firefighter's face mask. The hood protects the portions of the head not covered by the helmet and face mask from the intense heat of the fire.
Wildland firefighters wear Nomex shirts and trousers as part of their personal protective equipment during wildfire suppression activities.
Racing car drivers wear driving suits constructed of Nomex and/or other fire-retardant materials, along with Nomex gloves, long underwear, balaclavas, socks, helmet linings and shoes, to protect them in the event of a fire.
Military pilots and aircrew wear flight suits made of over 92 percent Nomex to protect them from cockpit fires. Troops riding in ground vehicles often wear Nomex for fire protection. Kevlar thread is often used to hold the fabric together at seams.
Military tank drivers also typically use Nomex hoods as protection against fire.
In the U.S. space program, Nomex has been used for the Thermal Micrometeoroid Garment on the Extravehicular Mobility Unit (in conjunction with Kevlar and Gore-Tex) and ACES pressure suit, both for fire and extreme environment (water immersion to near vacuum) protection, and as thermal blankets on the payload bay doors, fuselage, and upper wing surfaces of the Space Shuttle Orbiter. It has also been used for the airbags for the Mars Pathfinder and Mars Exploration Rover missions, the Galileo atmospheric probe, the Cassini-Huygens Titan probe, as an external covering on the AERCam Sprint, and is planned to be incorporated into NASA's upcoming Crew Exploration Vehicle.
Nomex has been used as an acoustic material in Troy, NY, at Rensselaer Polytechnic Institute's Experimental Media and Performing Arts Center (EMPAC) main concert hall. A ceiling canopy of Nomex reflects high and mid frequency sound, providing reverberation, while letting lower frequency sound partially pass through the canopy. According to RPI President Shirley Ann Jackson, EMPAC is the first venue in the world to use Nomex as an architectural material for acoustic reasons.
Nomex (like Kevlar) is also used in the production of loudspeaker drivers.
Honeycomb-structured Nomex paper is used as a spacer between layers of lead in the ATLAS Liquid Argon Calorimeter, and as a laminate core for hull and deck construction in custom boats such as Stiletto Catamarans like the Stiletto 27.
Nomex is used in industrial applications as a filter in exhaust filtration systems, typically a baghouse, that deal with hot gas emissions found in asphalt plants, cement plants, steel smelting facilities, and non-ferrous metal production facilities.
Nomex is used in some classical guitar tops in order to create a 'composite' soundboard. When Nomex is laminated between 2 spruce or cedar 'skins', a rigid and lightweight plate is produced, which can improve the efficiency of the soundboard. While the 'laminated' technique was created by Matthias Dammann, the use of Nomex within was first employed by luthier Gernot Wagner.
History
The deaths in fiery crashes of race car drivers Fireball Roberts at Charlotte, and Eddie Sachs and Dave MacDonald at Indianapolis in 1964, led to the use of flame-resistant fabrics such as Nomex. In early 1966 Competition Press and Autoweek reported: "During the past season, experimental driving suits were worn by Walt Hansgen, Masten Gregory, Marvin Panch and Group 44's Bob Tullius; these four representing a fairly good cross section in the sport. The goal was to get use-test information on the comfort and laundering characteristics of Nomex. The Chrysler-Plymouth team at the recent Motor Trend 500 at Riverside also wore these suits."
See also
Aramid
Gore-Tex
Kevlar
Marlan
PET film
Silica Aerogel
Thermal Micrometeoroid Garment
Twaron
Vectran
References
External links
DuPont Nomex
Dupont.com - 40th anniversary of Nomex - 2007
Comparison of single-layer Nomex suits
Flame retardant fabrics
Synthetic materials
Firefighting equipment
Synthetic fibers
Brand name materials
DuPont products
|
```c
#include <stddef.h>
#include <string.h>
#include <inttypes.h>
#include <sys/lock.h>
#include <sys/param.h>
#include "esp_attr.h"
#include "esp_check.h"
#include "esp_sleep.h"
#include "esp_log.h"
#include "freertos/FreeRTOS.h"
#include "freertos/task.h"
#include "esp_heap_caps.h"
#include "hal/rtc_hal.h"
#include "soc/rtc_periph.h"
#include "soc/soc_caps.h"
#include "esp_private/sleep_cpu.h"
#include "esp_private/sleep_event.h"
#include "sdkconfig.h"
static __attribute__((unused)) const char *TAG = "sleep";
typedef struct {
uint32_t start;
uint32_t end;
} cpu_domain_dev_regs_region_t;
typedef struct {
cpu_domain_dev_regs_region_t *region;
int region_num;
uint32_t *regs_frame;
} cpu_domain_dev_sleep_frame_t;
/**
* Internal structure which holds all requested light sleep cpu retention parameters
*/
typedef struct {
rtc_cntl_sleep_retent_t retent;
} sleep_cpu_retention_t;
static DRAM_ATTR __attribute__((unused)) sleep_cpu_retention_t s_cpu_retention;
esp_err_t esp_sleep_cpu_pd_low_init(void)
{
if (s_cpu_retention.retent.cpu_pd_mem == NULL) {
void *buf = heap_caps_aligned_calloc(SOC_RTC_CNTL_CPU_PD_DMA_ADDR_ALIGN, 1,
SOC_RTC_CNTL_CPU_PD_RETENTION_MEM_SIZE + RTC_HAL_DMA_LINK_NODE_SIZE,
MALLOC_CAP_RETENTION);
if (buf) {
s_cpu_retention.retent.cpu_pd_mem = rtc_cntl_hal_dma_link_init(buf,
buf + RTC_HAL_DMA_LINK_NODE_SIZE, SOC_RTC_CNTL_CPU_PD_RETENTION_MEM_SIZE, NULL);
} else {
return ESP_ERR_NO_MEM;
}
}
return ESP_OK;
}
esp_err_t esp_sleep_cpu_pd_low_deinit(void)
{
if (s_cpu_retention.retent.cpu_pd_mem) {
heap_caps_free(s_cpu_retention.retent.cpu_pd_mem);
s_cpu_retention.retent.cpu_pd_mem = NULL;
}
return ESP_OK;
}
void sleep_enable_cpu_retention(void)
{
rtc_cntl_hal_enable_cpu_retention(&s_cpu_retention.retent);
}
void IRAM_ATTR sleep_disable_cpu_retention(void)
{
rtc_cntl_hal_disable_cpu_retention(&s_cpu_retention.retent);
}
esp_err_t esp_sleep_cpu_retention_init(void)
{
return esp_sleep_cpu_pd_low_init();
}
esp_err_t esp_sleep_cpu_retention_deinit(void)
{
return esp_sleep_cpu_pd_low_deinit();
}
bool cpu_domain_pd_allowed(void)
{
return (s_cpu_retention.retent.cpu_pd_mem != NULL);
}
esp_err_t sleep_cpu_configure(bool light_sleep_enable)
{
#if ESP_SLEEP_POWER_DOWN_CPU
if (light_sleep_enable) {
ESP_RETURN_ON_ERROR(esp_sleep_cpu_retention_init(), TAG, "Failed to enable CPU power down during light sleep.");
} else {
ESP_RETURN_ON_ERROR(esp_sleep_cpu_retention_deinit(), TAG, "Failed to release CPU retention memory");
}
#endif
return ESP_OK;
}
```
|
```java
package org.flowable.engine.delegate;
import java.util.concurrent.CompletableFuture;
import org.flowable.common.engine.api.async.AsyncTaskInvoker;
/**
* Convenience class which always uses the {@link AsyncTaskInvoker} to execute the async data.
* Provides intermediate methods to prepare the execution data before executing and do the
* actual execution without the need to work with futures.
*
* @param <Input> the input of the execution
* @param <Output> the output of the execution
* @author Filip Hrisafov
* @see MapBasedFlowableFutureJavaDelegate
* @see FutureJavaDelegate
*/
public interface FlowableFutureJavaDelegate<Input, Output> extends FutureJavaDelegate<Output> {
@Override
default CompletableFuture<Output> execute(DelegateExecution execution, AsyncTaskInvoker taskInvoker) {
Input inputData = prepareExecutionData(execution);
return taskInvoker.submit(() -> execute(inputData));
}
/**
Method invoked before the execution to extract the needed data from the execution
* on the main thread.
* This should be used to prepare and extract data from the execution before doing the execution in a different thread.
*
* @param execution the execution from which to extract data
* @return the data for the delegate
*/
Input prepareExecutionData(DelegateExecution execution);
/**
* Perform the actual execution of the delegate in another thread.
* This uses {@link #prepareExecutionData(DelegateExecution)} to get the needed data
from the {@link DelegateExecution} and returns the output data that is then passed to {@link #afterExecution(DelegateExecution, Object)}.
*
* <b>IMPORTANT:</b> This is a completely new thread which does not participate in the transaction of the process.
*
* @param inputData the input data for the execution created via {@link #prepareExecutionData(DelegateExecution)}
* @return the output data of the execution
* @see #execute(DelegateExecution, AsyncTaskInvoker)
*/
Output execute(Input inputData);
/**
* Method invoked with the result from {@link #execute(Object)}.
* This should be used to set data on the {@link DelegateExecution}.
* This is on the same thread as {@link #prepareExecutionData(DelegateExecution)} and participates in the process transaction.
*
* @param execution the execution to which data can be set
* @param executionData the execution data
*/
@Override
void afterExecution(DelegateExecution execution, Output executionData);
}
```
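The contract above is a three-phase handshake: `prepareExecutionData` on the process-engine thread, `execute` on a worker thread, and `afterExecution` back on the engine thread. A minimal sketch of the same pattern in Python (illustrative only, not the Flowable API; all names are invented):

```python
from concurrent.futures import ThreadPoolExecutor

def run_delegate(state, prepare, execute, after):
    """Three-phase delegate pattern: prepare and apply on the caller's
    thread, run the heavy work on a worker thread."""
    input_data = prepare(state)                 # phase 1: caller's thread
    with ThreadPoolExecutor(max_workers=1) as pool:
        output = pool.submit(execute, input_data).result()  # phase 2: worker
    after(state, output)                        # phase 3: caller's thread

state = {"x": 3}
run_delegate(
    state,
    prepare=lambda s: s["x"],
    execute=lambda n: n * n,   # must not touch `state`: runs on another thread
    after=lambda s, out: s.update(result=out),
)
```

As the Javadoc stresses, `execute` runs outside the engine's transaction, so all reads and writes against the execution must stay in phases 1 and 3.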
|
```scala
package akka.stream.alpakka.googlecloud.bigquery.scaladsl.spray
import akka.util.ByteString
import spray.json.{deserializationError, JsBoolean, JsFalse, JsNumber, JsString, JsTrue, JsValue}
/**
* Provides the BigQueryJsonFormats for BigQuery table cells of the most important Scala types.
*/
trait BigQueryBasicFormats {
implicit object IntJsonFormat extends BigQueryJsonFormat[Int] {
def write(x: Int) = JsNumber(x)
def read(value: JsValue) = value match {
case JsNumber(x) if x.isValidInt => x.intValue
case BigQueryNumber(x) if x.isValidInt => x.intValue
case x => deserializationError("Expected Int as JsNumber or JsString, but got " + x)
}
}
implicit object LongJsonFormat extends BigQueryJsonFormat[Long] {
def write(x: Long) =
if (-9007199254740991L <= x & x <= 9007199254740991L)
JsNumber(x)
else
JsString(x.toString)
def read(value: JsValue) = value match {
case JsNumber(x) if x.isValidLong => x.longValue
case BigQueryNumber(x) if x.isValidLong => x.longValue
case x => deserializationError("Expected Long as JsNumber or JsString, but got " + x)
}
}
implicit object FloatJsonFormat extends BigQueryJsonFormat[Float] {
def write(x: Float) = JsNumber(x)
def read(value: JsValue) = value match {
case JsNumber(x) => x.floatValue
case BigQueryNumber(x) => x.floatValue
case x => deserializationError("Expected Float as JsNumber or JsString, but got " + x)
}
}
implicit object DoubleJsonFormat extends BigQueryJsonFormat[Double] {
def write(x: Double) = JsNumber(x)
def read(value: JsValue) = value match {
case JsNumber(x) => x.doubleValue
case BigQueryNumber(x) => x.doubleValue
case x => deserializationError("Expected Double as JsNumber or JsString, but got " + x)
}
}
implicit object ByteJsonFormat extends BigQueryJsonFormat[Byte] {
def write(x: Byte) = JsNumber(x)
def read(value: JsValue) = value match {
case JsNumber(x) if x.isValidByte => x.byteValue
case BigQueryNumber(x) if x.isValidByte => x.byteValue
case x => deserializationError("Expected Byte as JsNumber or JsString, but got " + x)
}
}
implicit object ShortJsonFormat extends BigQueryJsonFormat[Short] {
def write(x: Short) = JsNumber(x)
def read(value: JsValue) = value match {
case JsNumber(x) if x.isValidShort => x.shortValue
case BigQueryNumber(x) if x.isValidShort => x.shortValue
case x => deserializationError("Expected Short as JsNumber or JsString, but got " + x)
}
}
implicit object BigDecimalJsonFormat extends BigQueryJsonFormat[BigDecimal] {
def write(x: BigDecimal) = {
require(x ne null)
JsString(x.toString)
}
def read(value: JsValue) = value match {
case JsNumber(x) => x
case BigQueryNumber(x) => x
case x => deserializationError("Expected BigDecimal as JsNumber or JsString, but got " + x)
}
}
implicit object BigIntJsonFormat extends BigQueryJsonFormat[BigInt] {
def write(x: BigInt) = {
require(x ne null)
JsString(x.toString)
}
def read(value: JsValue) = value match {
case JsNumber(x) => x.toBigInt
case BigQueryNumber(x) => x.toBigInt
case x => deserializationError("Expected BigInt as JsNumber or JsString, but got " + x)
}
}
implicit object UnitJsonFormat extends BigQueryJsonFormat[Unit] {
def write(x: Unit) = JsNumber(1)
def read(value: JsValue): Unit = {}
}
implicit object BooleanJsonFormat extends BigQueryJsonFormat[Boolean] {
def write(x: Boolean) = JsBoolean(x)
def read(value: JsValue) = value match {
case JsTrue | JsString("true") => true
case JsFalse | JsString("false") => false
case x => deserializationError("Expected Boolean as JsBoolean or JsString, but got " + x)
}
}
implicit object CharJsonFormat extends BigQueryJsonFormat[Char] {
def write(x: Char) = JsString(String.valueOf(x))
def read(value: JsValue) = value match {
case JsString(x) if x.length == 1 => x.charAt(0)
case x => deserializationError("Expected Char as single-character JsString, but got " + x)
}
}
implicit object StringJsonFormat extends BigQueryJsonFormat[String] {
def write(x: String) = {
require(x ne null)
JsString(x)
}
def read(value: JsValue) = value match {
case JsString(x) => x
case x => deserializationError("Expected String as JsString, but got " + x)
}
}
implicit object SymbolJsonFormat extends BigQueryJsonFormat[Symbol] {
def write(x: Symbol) = JsString(x.name)
def read(value: JsValue) = value match {
case JsString(x) => Symbol(x)
case x => deserializationError("Expected Symbol as JsString, but got " + x)
}
}
implicit object ByteStringJsonFormat extends BigQueryJsonFormat[ByteString] {
import java.nio.charset.StandardCharsets.US_ASCII
def write(x: ByteString) = JsString(x.encodeBase64.decodeString(US_ASCII))
def read(value: JsValue) = value match {
case BigQueryBytes(x) => x
case x => deserializationError("Expected ByteString as JsString, but got " + x)
}
}
}
```
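The magic constant in `LongJsonFormat` is 2^53 − 1: the largest value such that every whole number up to it is exactly representable as an IEEE-754 double, which is the only number type many JSON consumers (notably JavaScript) have. A quick check of why longs outside that window must travel as strings:

```python
SAFE_MAX = 2**53 - 1          # 9007199254740991, the bound used above

assert float(SAFE_MAX) == SAFE_MAX       # exact up to the bound
assert float(2**53) == float(2**53 + 1)  # beyond it, distinct longs collide
```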
|
```html
<?xml version="1.0" encoding="utf-8" ?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "path_to_url">
<html xmlns="path_to_url" xml:lang="en" lang="en">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
<meta name="generator" content="Docutils 0.7: path_to_url" />
<title>The MPL Reference Manual: erase</title>
<link rel="stylesheet" href="../style.css" type="text/css" />
</head>
<body class="docframe refmanual">
<table class="header"><tr class="header"><td class="header-group navigation-bar"><span class="navigation-group"><a href="./end.html" class="navigation-link">Prev</a> <a href="./erase-key.html" class="navigation-link">Next</a></span><span class="navigation-group-separator"> | </span><span class="navigation-group"><a href="./end.html" class="navigation-link">Back</a> <a href="./erase-key.html" class="navigation-link">Along</a></span><span class="navigation-group-separator"> | </span><span class="navigation-group"><a href="./intrinsic-metafunctions.html" class="navigation-link">Up</a> <a href="../refmanual.html" class="navigation-link">Home</a></span><span class="navigation-group-separator"> | </span><span class="navigation-group"><a href="./refmanual_toc.html" class="navigation-link">Full TOC</a></span></td>
<td class="header-group page-location"><a href="../refmanual.html" class="navigation-link">Front Page</a> / <a href="./sequences.html" class="navigation-link">Sequences</a> / <a href="./intrinsic-metafunctions.html" class="navigation-link">Intrinsic Metafunctions</a> / <a href="./erase.html" class="navigation-link">erase</a></td>
</tr></table><div class="header-separator"></div>
<div class="section" id="erase">
<h1><a class="toc-backref" href="./intrinsic-metafunctions.html#id1432">erase</a></h1>
<div class="section" id="id238">
<h3><a class="subsection-title" href="#synopsis" name="synopsis">Synopsis</a></h3>
<pre class="literal-block">
template&lt;
      typename Sequence
    , typename First
    , typename Last = <em>unspecified</em>
    &gt;
struct <a href="./erase.html" class="identifier">erase</a>
{
typedef <em>unspecified</em> type;
};
</pre>
</div>
<div class="section" id="id239">
<h3><a class="subsection-title" href="#description" name="description">Description</a></h3>
<p><tt class="literal"><span class="pre"><a href="./erase.html" class="identifier">erase</a></span></tt> performs a removal of one or more adjacent elements in the sequence
starting from an arbitrary position.</p>
</div>
<div class="section" id="id240">
<h3><a class="subsection-title" href="#header" name="header">Header</a></h3>
<pre class="literal-block">
#include &lt;<a href="../../../../boost/mpl/erase.hpp" class="header">boost/mpl/erase.hpp</a>&gt;
</pre>
</div>
<div class="section" id="id241">
<h3><a class="subsection-title" href="#model-of" name="model-of">Model of</a></h3>
<p><a class="reference internal" href="./tag-dispatched-metafunction.html">Tag Dispatched Metafunction</a></p>
</div>
<div class="section" id="id242">
<h3><a class="subsection-title" href="#parameters" name="parameters">Parameters</a></h3>
<table border="1" class="docutils table">
<colgroup>
<col width="15%" />
<col width="36%" />
<col width="48%" />
</colgroup>
<thead valign="bottom">
<tr><th class="head">Parameter</th>
<th class="head">Requirement</th>
<th class="head">Description</th>
</tr>
</thead>
<tbody valign="top">
<tr><td><tt class="literal"><span class="pre">Sequence</span></tt></td>
<td><a class="reference internal" href="./extensible-sequence.html">Extensible Sequence</a> or
<a class="reference internal" href="./extensible-associative-sequence.html">Extensible Associative Sequence</a></td>
<td>A sequence to erase from.</td>
</tr>
<tr><td><tt class="literal"><span class="pre">First</span></tt></td>
<td><a class="reference internal" href="./forward-iterator.html">Forward Iterator</a></td>
<td>An iterator to the beginning of the range to
be erased.</td>
</tr>
<tr><td><tt class="literal"><span class="pre">Last</span></tt></td>
<td><a class="reference internal" href="./forward-iterator.html">Forward Iterator</a></td>
<td>An iterator past-the-end of the range to be
erased.</td>
</tr>
</tbody>
</table>
</div>
<div class="section" id="id243">
<h3><a class="subsection-title" href="#expression-semantics" name="expression-semantics">Expression semantics</a></h3>
<div class="expression-semantics compound">
<p class="compound-first">For any <a class="reference internal" href="./extensible-sequence.html">Extensible Sequence</a> <tt class="literal"><span class="pre">s</span></tt>, and iterators <tt class="literal"><span class="pre">pos</span></tt>, <tt class="literal"><span class="pre">first</span></tt> and <tt class="literal"><span class="pre">last</span></tt> into <tt class="literal"><span class="pre">s</span></tt>:</p>
<pre class="compound-middle literal-block">
typedef <a href="./erase.html" class="identifier">erase</a><s,first,last>::type r;
</pre>
<table class="compound-middle docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field"><th class="field-name">Return type:</th><td class="field-body"><p class="first"><a class="reference internal" href="./extensible-sequence.html">Extensible Sequence</a>.</p>
</td>
</tr>
<tr class="field"><th class="field-name">Precondition:</th><td class="field-body"><p class="first"><tt class="literal"><span class="pre">[first,last)</span></tt> is a valid range in <tt class="literal"><span class="pre">s</span></tt>.</p>
</td>
</tr>
<tr class="field"><th class="field-name">Semantics:</th><td class="field-body"><p class="first"><tt class="literal"><span class="pre">r</span></tt> is a new sequence, <a class="reference internal" href="./terminology.html#concept-identical">concept-identical</a> to <tt class="literal"><span class="pre">s</span></tt>, of the following elements:
[<tt class="literal"><span class="pre"><a href="./begin.html" class="identifier">begin</a><s>::type</span></tt>, <tt class="literal"><span class="pre">first</span></tt>), [<tt class="literal"><span class="pre">last</span></tt>, <tt class="literal"><span class="pre"><a href="./end.html" class="identifier">end</a><s>::type</span></tt>).</p>
</td>
</tr>
<tr class="field"><th class="field-name">Postcondition:</th><td class="field-body"><p class="first">The relative order of the elements in <tt class="literal"><span class="pre">r</span></tt> is the same as in <tt class="literal"><span class="pre">s</span></tt>;</p>
<pre class="last literal-block">
<a href="./size.html" class="identifier">size</a><r>::value == <a href="./size.html" class="identifier">size</a><s>::value - <a href="./distance.html" class="identifier">distance</a><first,last>::value
</pre>
</td>
</tr>
</tbody>
</table>
<!-- ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -->
<pre class="compound-middle literal-block">
typedef <a href="./erase.html" class="identifier">erase</a><s,pos>::type r;
</pre>
<table class="compound-last docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field"><th class="field-name">Return type:</th><td class="field-body"><p class="first"><a class="reference internal" href="./extensible-sequence.html">Extensible Sequence</a>.</p>
</td>
</tr>
<tr class="field"><th class="field-name">Precondition:</th><td class="field-body"><p class="first"><tt class="literal"><span class="pre">pos</span></tt> is a dereferenceable iterator in <tt class="literal"><span class="pre">s</span></tt>.</p>
</td>
</tr>
<tr class="field"><th class="field-name">Semantics:</th><td class="field-body"><p class="first">Equivalent to</p>
<pre class="last literal-block">
typedef <a href="./erase.html" class="identifier">erase</a>< s,pos,<a href="./next.html" class="identifier">next</a><pos>::type >::type r;
</pre>
</td>
</tr>
</tbody>
</table>
</div>
<div class="expression-semantics compound">
<p class="compound-first">For any <a class="reference internal" href="./extensible-associative-sequence.html">Extensible Associative Sequence</a> <tt class="literal"><span class="pre">s</span></tt>, and iterator <tt class="literal"><span class="pre">pos</span></tt> into <tt class="literal"><span class="pre">s</span></tt>:</p>
<pre class="compound-middle literal-block">
typedef <a href="./erase.html" class="identifier">erase</a><s,pos>::type r;
</pre>
<table class="compound-last docutils field-list" frame="void" rules="none">
<col class="field-name" />
<col class="field-body" />
<tbody valign="top">
<tr class="field"><th class="field-name">Return type:</th><td class="field-body"><a class="reference internal" href="./extensible-sequence.html">Extensible Sequence</a>.</td>
</tr>
<tr class="field"><th class="field-name">Precondition:</th><td class="field-body"><tt class="literal"><span class="pre">pos</span></tt> is a dereferenceable iterator to <tt class="literal"><span class="pre">s</span></tt>.</td>
</tr>
<tr class="field"><th class="field-name">Semantics:</th><td class="field-body">Erases the element at a specific position <tt class="literal"><span class="pre">pos</span></tt>; equivalent to
<tt class="literal"><span class="pre"><a href="./erase-key.html" class="identifier">erase_key</a><s,</span> <span class="pre"><a href="./deref.html" class="identifier">deref</a><pos>::type</span> <span class="pre">>::type</span></tt>.</td>
</tr>
<tr class="field"><th class="field-name">Postcondition:</th><td class="field-body"><tt class="literal"><span class="pre"><a href="./size.html" class="identifier">size</a><r>::value</span> <span class="pre">==</span> <span class="pre"><a href="./size.html" class="identifier">size</a><s>::value</span> <span class="pre">-</span> <span class="pre">1</span></tt>.</td>
</tr>
</tbody>
</table>
</div>
</div>
<div class="section" id="id244">
<h3><a class="subsection-title" href="#complexity" name="complexity">Complexity</a></h3>
<table border="1" class="docutils table">
<colgroup>
<col width="45%" />
<col width="55%" />
</colgroup>
<thead valign="bottom">
<tr><th class="head">Sequence archetype</th>
<th class="head">Complexity (the range form)</th>
</tr>
</thead>
<tbody valign="top">
<tr><td><a class="reference internal" href="./extensible-associative-sequence.html">Extensible Associative Sequence</a></td>
<td>Amortized constant time.</td>
</tr>
<tr><td><a class="reference internal" href="./extensible-sequence.html">Extensible Sequence</a></td>
<td>Quadratic in the worst case, linear at best.</td>
</tr>
</tbody>
</table>
</div>
<div class="section" id="id245">
<h3><a class="subsection-title" href="#example" name="example">Example</a></h3>
<pre class="literal-block">
typedef <a href="./vector-c.html" class="identifier">vector_c</a><int,1,0,5,1,7,5,0,5> values;
typedef <a href="./find.html" class="identifier">find</a>< values, <a href="./integral-c.html" class="identifier">integral_c</a><int,7> >::type pos;
typedef <a href="./erase.html" class="identifier">erase</a><values,pos>::type result;
<a href="./assert-relation.html" class="identifier">BOOST_MPL_ASSERT_RELATION</a>( <a href="./size.html" class="identifier">size</a><result>::value, ==, 7 );
typedef <a href="./find.html" class="identifier">find</a><result, <a href="./integral-c.html" class="identifier">integral_c</a><int,7> >::type iter;
<a href="./assert.html" class="identifier">BOOST_MPL_ASSERT</a>(( is_same< iter, <a href="./end.html" class="identifier">end</a><result>::type > ));
</pre>
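<p>The range form can be sketched in the same way (an illustrative example, not part of the original reference; it assumes the <tt class="literal"><span class="pre">advance_c</span></tt> iterator metafunction to build the two iterators):</p>
<pre class="literal-block">
typedef <a href="./vector-c.html" class="identifier">vector_c</a><int,1,0,5,1,7,5,0,5> values;
typedef <a href="./advance.html" class="identifier">advance_c</a>< <a href="./begin.html" class="identifier">begin</a><values>::type, 2 >::type first;
typedef <a href="./advance.html" class="identifier">advance_c</a>< <a href="./begin.html" class="identifier">begin</a><values>::type, 5 >::type last;
typedef <a href="./erase.html" class="identifier">erase</a><values,first,last>::type result;
<a href="./assert-relation.html" class="identifier">BOOST_MPL_ASSERT_RELATION</a>( <a href="./size.html" class="identifier">size</a><result>::value, ==, 5 );
</pre>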
</div>
<div class="section" id="id246">
<h3><a class="subsection-title" href="#see-also" name="see-also">See also</a></h3>
<p><a class="reference internal" href="./extensible-sequence.html">Extensible Sequence</a>, <a class="reference internal" href="./extensible-associative-sequence.html">Extensible Associative Sequence</a>, <a class="reference internal" href="./erase-key.html">erase_key</a>, <a class="reference internal" href="./pop-front.html">pop_front</a>, <a class="reference internal" href="./pop-back.html">pop_back</a>, <a class="reference internal" href="./insert.html">insert</a></p>
<!-- Sequences/Intrinsic Metafunctions//erase_key -->
</div>
</div>
<div class="footer-separator"></div>
<table class="footer"><tr class="footer"><td class="header-group navigation-bar"><span class="navigation-group"><a href="./end.html" class="navigation-link">Prev</a> <a href="./erase-key.html" class="navigation-link">Next</a></span><span class="navigation-group-separator"> | </span><span class="navigation-group"><a href="./end.html" class="navigation-link">Back</a> <a href="./erase-key.html" class="navigation-link">Along</a></span><span class="navigation-group-separator"> | </span><span class="navigation-group"><a href="./intrinsic-metafunctions.html" class="navigation-link">Up</a> <a href="../refmanual.html" class="navigation-link">Home</a></span><span class="navigation-group-separator"> | </span><span class="navigation-group"><a href="./refmanual_toc.html" class="navigation-link">Full TOC</a></span></td></tr></table>
<div class="footer">Distributed under the Boost Software License, Version 1.0. (See accompanying file LICENSE_1_0.txt or copy at <a class="reference external" href="path_to_url" target="_top">path_to_url</a>.)</div>
</html>
```
|
Troy Anthony Carter (born October 26, 1963) is an American politician serving as the U.S. representative for Louisiana's 2nd congressional district since 2021. He was previously a member of the Louisiana State Senate for the 7th district. A member of the Democratic Party, Carter also previously served on the New Orleans City Council and as a member of the Louisiana House of Representatives. He is currently the only Democrat in Louisiana's congressional delegation.
Early life and education
Carter was born in New Orleans. After graduating from Oliver Perry Walker High School in Algiers, he attended Xavier University of Louisiana, where he earned a Bachelor of Arts degree in business administration and political science. He has completed programs at the Harvard Kennedy School and Carnegie Mellon University's School of Urban and Public Affairs.
Early career
Carter has been an adjunct political science instructor at Xavier University of Louisiana. Before his election to the state legislature, he served six years as executive assistant to New Orleans mayor Sidney Barthelemy.
Carter was elected as a member of the Louisiana House of Representatives in 1991, becoming the first African-American to serve District 102 in the Louisiana House. As a state representative in 1993, he introduced legislation to prohibit discrimination against LGBTQ individuals. After his election to the Louisiana Senate, he filed similar legislation in 2017 and 2020.
In 1994, he was elected to represent District C on the New Orleans City Council. He served until 2002, when he unsuccessfully ran for mayor, losing the primary election to Ray Nagin and Richard Pennington. He was an unsuccessful candidate for Louisiana's 2nd congressional district seat in 2006 against then-incumbent William J. Jefferson.
After several years out of public office, Carter was elected to the Louisiana Senate in 2015, receiving 12,935 votes (56.8%) in the runoff election to Jeff Arnold's 9,852 (43.2%). As a state senator, he authored or co-sponsored 75 bills that became law. In addition to chairing the Louisiana Senate Democratic Caucus, Carter chaired the Senate's Labor and Industrial Relations Committee.
Carter also chairs the Algiers Development District.
U.S. House of Representatives
Elections
2021 special
On November 18, 2020, U.S. Representative Cedric Richmond announced that he would resign from Louisiana's 2nd congressional district in January 2021 after having been selected by President-elect Joe Biden to be Senior Advisor to the President and the administration's director of the Office of Public Liaison. Carter then ran to fill the seat in Congress in the special election. On March 20, 2021, Carter finished first in the top-two primary and advanced, with runner-up Senator Karen Carter Peterson, to the runoff election held on April 24.
Carter was endorsed by Cedric Richmond, John Breaux, 8 congressional Democrats, Helena Moreno, Cleo Fields, Sharon Weston Broome, the AFL–CIO, the Louisiana Democratic Party, The Times-Picayune/The New Orleans Advocate, The Louisiana Weekly, and Gambit.
In the runoff, Carter received 48,511 votes (55.2%) to Peterson's 39,295 (44.8%).
Tenure
He was sworn in as the U.S. Representative for Louisiana's 2nd congressional district on May 11, 2021, increasing the Democratic Party's majority to 219-212 over the Republican Party in the United States House of Representatives. On August 12, 2022, he voted to pass the Inflation Reduction Act of 2022.
Committee assignments
Committee on Transportation and Infrastructure
Committee on Small Business
Caucus memberships
New Democrat Coalition
Congressional Progressive Caucus
Political positions
Carter opposes conservative measures that have sought to restrict abortion and expand gun rights. During his term of office as a state senator, he had two priorities: raising the state's minimum wage and strengthening anti-discrimination laws against the LGBTQ+ community. He supports the infrastructure policy of the Biden administration.
Carter voted to provide Israel with support following the 2023 Hamas attack on Israel.
Personal life
Carter's wife Andreé serves in the United States Army Reserve, and achieved the rank of brigadier general. They have two sons. The family lives on the Westbank of New Orleans, where Carter was born and raised.
Carter is a Baptist.
See also
List of African-American United States representatives
References
External links
Representative Troy Carter official U.S. House website
Troy Carter for Congress campaign website
1963 births
21st-century American politicians
African-American state legislators in Louisiana
African-American people in Louisiana politics
Baptists from Louisiana
Democratic Party members of the United States House of Representatives from Louisiana
Living people
Democratic Party Louisiana state senators
Democratic Party members of the Louisiana House of Representatives
New Orleans City Council members
Xavier University of Louisiana alumni
21st-century African-American politicians
20th-century African-American people
|
```cpp
//===-- DirectXInstrInfo.h - Define InstrInfo for DirectX -------*- C++ -*-===//
//
// See path_to_url for license information.
//
//===----------------------------------------------------------------------===//
//
// This file declares the DirectX specific subclass of TargetInstrInfo.
//
//===----------------------------------------------------------------------===//
#ifndef LLVM_DIRECTX_DIRECTXINSTRINFO_H
#define LLVM_DIRECTX_DIRECTXINSTRINFO_H
#include "DirectXRegisterInfo.h"
#include "llvm/CodeGen/TargetInstrInfo.h"
#define GET_INSTRINFO_HEADER
#include "DirectXGenInstrInfo.inc"
namespace llvm {
struct DirectXInstrInfo : public DirectXGenInstrInfo {
explicit DirectXInstrInfo() : DirectXGenInstrInfo() {}
~DirectXInstrInfo() override;
};
} // namespace llvm
#endif // LLVM_DIRECTX_DIRECTXINSTRINFO_H
```
|
```javascript
'use strict';
// Flags: --expose_gc
// This test ensures that userland-only AsyncResources cause a destroy event to
// be emitted when they get gced.
const common = require('../common');
const assert = require('assert');
const async_hooks = require('async_hooks');
const destroyedIds = new Set();
async_hooks.createHook({
destroy: common.mustCallAtLeast((asyncId) => {
destroyedIds.add(asyncId);
}, 1)
}).enable();
let asyncId = null;
{
const res = new async_hooks.AsyncResource('foobar');
asyncId = res.asyncId();
}
setImmediate(() => {
global.gc();
setImmediate(() => assert.ok(destroyedIds.has(asyncId)));
});
```
|
```c
/* DO NOT EDIT
This file was automatically generated by Pidl
from wzcsvc.idl and wzcsvc.cnf.
Pidl is a perl based IDL compiler for DCE/RPC idl files.
It is maintained by the Samba team, not the Wireshark team.
Instructions on how to download and install Pidl can be
found at path_to_url
*/
#ifndef __PACKET_DCERPC_WZCSVC_H
#define __PACKET_DCERPC_WZCSVC_H
#endif /* __PACKET_DCERPC_WZCSVC_H */
```
|
```java
/*
*
* *
* * *
* * *
* * * path_to_url
* * *
* * * Unless required by applicable law or agreed to in writing, software
* * * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* *
*
*/
package com.chillingvan.canvasgl.textureFilter;
import com.chillingvan.canvasgl.ICanvasGL;
import com.chillingvan.canvasgl.glcanvas.BasicTexture;
import com.chillingvan.canvasgl.glcanvas.GLCanvas;
/**
* Created by Matthew on 2016/10/14.
*/
public interface TextureFilter {
String getVertexShader();
String getFragmentShader();
String getOesFragmentProgram();
void onPreDraw(int program, BasicTexture texture, ICanvasGL canvas);
void destroy();
}
```
|
```python
#
#
# path_to_url
#
# Unless required by applicable law or agreed to in writing, software
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
import numpy as np
import tensorflow as tf
from numpy.testing import assert_allclose
import gpflow
from gpflow.config import default_float
def test_sparse_mcmc_likelihoods_and_gradients() -> None:
"""
This test makes sure that when the inducing points are the same as the data
points, the sparse mcmc is the same as full mcmc
"""
rng = np.random.RandomState(0)
X, Y = rng.randn(10, 1), rng.randn(10, 1)
v_vals = rng.randn(10, 1)
likelihood = gpflow.likelihoods.StudentT()
model_1 = gpflow.models.GPMC(
data=(X, Y), kernel=gpflow.kernels.Exponential(), likelihood=likelihood
)
model_2 = gpflow.models.SGPMC(
data=(X, Y),
kernel=gpflow.kernels.Exponential(),
inducing_variable=X.copy(),
likelihood=likelihood,
)
model_1.V = tf.convert_to_tensor(v_vals, dtype=default_float())
model_2.V = tf.convert_to_tensor(v_vals, dtype=default_float())
model_1.kernel.lengthscales.assign(0.8)
model_2.kernel.lengthscales.assign(0.8)
model_1.kernel.variance.assign(4.2)
model_2.kernel.variance.assign(4.2)
assert_allclose(
model_1.log_posterior_density(), model_2.log_posterior_density(), rtol=1e-5, atol=1e-5
)
```
|
Scorpaena afuerae, the Peruvian scorpionfish, is a species of marine ray-finned fish belonging to the family Scorpaenidae, the scorpionfishes. This species is found in the eastern Pacific Ocean.
Taxonomy
Scorpaena afuerae was first formally described in 1946 by the American ichthyologist Samuel Frederick Hildebrand with the type locality given as Lobos de Afuera Island off Peru. The type locality is reflected in the specific name.
Description
Scorpaena afuerae has a large, spiny head with deep pits in front of and to the rear of the eyes. The suborbital ridge has 3-4 spines. There are teeth in the centre of the roof of the mouth and at its sides. The preoperculum has 5 spines, the uppermost being the largest, with the spine next to it also being comparatively strong. There is a row of fringed skin flaps along the edge of the preoperculum. There are 3 spines in a row to the rear of the eye and 2 on the upper margin of the operculum. There are skin flaps over the front of the upper jaw but none on the lower jaw or on the body. The dorsal fin has 12 spines and 9-10 soft rays, while the anal fin has 3 spines and 5 soft rays. The pectoral fin has 19-21 rays, with the upper rays branched and the lower rays unbranched and thickened. This species has a uniform red colouration with indistinct stripes and spots. It has a maximum published length of , although is more typical.
Distribution and habitat
Scorpaena afuerae is found in the eastern Pacific from Chile to Baja California Sur in Mexico and Cocos Island. It is a demersal species of rocky areas and rubble at depths of .
Biology
Scorpaena afuerae rests on the bottom during the day but becomes active at night, when it feeds on small crustaceans, octopuses and small fish. It is a rare species.
References
afuerae
Fish described in 1946
Taxa named by Samuel Frederick Hildebrand
|
```php
<?php
$x = require $x;
$y = require $x &&
require $y;
if (require $x) {
return (require $x);
}
/* false-positives */
require_once $x;
return (require __DIR__ . 'semicolons.php');
```
|
HDMS Thetis is a Thetis-class ocean patrol vessel belonging to the Royal Danish Navy.
In the mid-1990s the ship served as a platform for seismic operations in the waters near Greenland. In 2002 she took over from her sister ship Hvidbjørnen as the platform for the Commander, Danish Task Group. The role was handed over to Absalon in September 2007.
From February to April 2008, Thetis served as protection against pirates for World Food Programme-chartered ships carrying food aid off the Horn of Africa. A squad of soldiers from the Frogman Corps was deployed aboard the ship.
In 2009 the ship served as staff ship for the NATO Mine Countermeasure Group 1.
References
External links
Thetis-class ocean patrol vessels
Ships built in Svendborg
1989 ships
Frigates of the Royal Danish Navy
|
Joseph Marie may refer to:
Joseph Marie, baron de Gérando, French jurist, philanthropist and philosopher
Joseph Marie, Count Dessaix, French general
|
```c++
// (See accompanying file LICENSE.md or copy at path_to_url
#ifndef BOOST_HANA_TEST_LAWS_ITERABLE_HPP
#define BOOST_HANA_TEST_LAWS_ITERABLE_HPP
#include <boost/hana/any_of.hpp>
#include <boost/hana/assert.hpp>
#include <boost/hana/at.hpp>
#include <boost/hana/back.hpp>
#include <boost/hana/bool.hpp>
#include <boost/hana/concept/comparable.hpp>
#include <boost/hana/concept/foldable.hpp>
#include <boost/hana/concept/sequence.hpp>
#include <boost/hana/core/make.hpp>
#include <boost/hana/core/to.hpp>
#include <boost/hana/core/when.hpp>
#include <boost/hana/drop_front.hpp>
#include <boost/hana/drop_front_exactly.hpp>
#include <boost/hana/equal.hpp>
#include <boost/hana/eval_if.hpp>
#include <boost/hana/find_if.hpp>
#include <boost/hana/for_each.hpp>
#include <boost/hana/front.hpp>
#include <boost/hana/functional/always.hpp>
#include <boost/hana/functional/capture.hpp>
#include <boost/hana/integral_constant.hpp>
#include <boost/hana/is_empty.hpp>
#include <boost/hana/lazy.hpp>
#include <boost/hana/length.hpp>
#include <boost/hana/minus.hpp>
#include <boost/hana/not.hpp>
#include <boost/hana/optional.hpp>
#include <boost/hana/range.hpp>
#include <boost/hana/tuple.hpp>
#include <laws/base.hpp>
namespace boost { namespace hana { namespace test {
template <typename It, typename = hana::when<true>>
struct TestIterable : TestIterable<It, laws> {
using TestIterable<It, laws>::TestIterable;
};
template <typename It>
struct TestIterable<It, laws> {
template <typename Xs>
TestIterable(Xs xs) {
hana::for_each(xs, [](auto xs) {
static_assert(Iterable<decltype(xs)>{}, "");
BOOST_HANA_CONSTANT_CHECK(
hana::is_empty(xs) ^iff^ hana::is_empty(hana::to<tuple_tag>(xs))
);
only_when_(hana::not_(hana::is_empty(xs)), hana::make_lazy([](auto xs) {
BOOST_HANA_CHECK(hana::equal(
hana::front(xs),
hana::front(hana::to<tuple_tag>(xs))
));
BOOST_HANA_CHECK(hana::equal(
hana::to<tuple_tag>(hana::drop_front_exactly(xs)),
hana::drop_front_exactly(hana::to<tuple_tag>(xs))
));
// methods
// back(xs) == at(xs, length(xs)-1)
BOOST_HANA_CHECK(hana::equal(
hana::back(xs),
hana::at(xs, hana::minus(hana::length(xs), hana::size_c<1>))
));
})(xs));
// drop_front(xs, 0) == xs
BOOST_HANA_CHECK(hana::equal(
hana::drop_front(xs, size_c<0>),
xs
));
// at(xs, n) == front(drop_front(xs, n))
hana::for_each(hana::make_range(size_c<0>, hana::length(xs)),
hana::capture(xs)([](auto xs, auto n) {
BOOST_HANA_CHECK(hana::equal(
hana::at(xs, n),
hana::front(hana::drop_front(xs, n))
));
}));
// Searchable
hana::eval_if(hana::is_empty(xs),
hana::make_lazy([](auto xs) {
BOOST_HANA_CONSTANT_CHECK(
hana::not_(hana::any_of(xs, hana::always(true_c)))
);
BOOST_HANA_CONSTANT_CHECK(hana::equal(
hana::find_if(xs, hana::always(true_c)),
nothing
));
})(xs),
hana::make_lazy([](auto xs) {
BOOST_HANA_CHECK(
hana::any_of(xs, hana::always(true_c))
);
BOOST_HANA_CHECK(
hana::not_(hana::any_of(xs, hana::always(false_c)))
);
BOOST_HANA_CHECK(hana::equal(
hana::find_if(xs, hana::always(true_c)),
hana::just(hana::front(xs))
));
})(xs)
);
});
}
};
template <typename S>
struct TestIterable<S, when<Sequence<S>::value>>
: TestIterable<S, laws>
{
template <int i>
using x = ct_eq<i>;
template <int i = 0>
struct invalid { };
struct undefined { };
template <typename Xs>
TestIterable(Xs xs) : TestIterable<S, laws>{xs} {
constexpr auto list = make<S>;
//////////////////////////////////////////////////////////////////
// front
//////////////////////////////////////////////////////////////////
BOOST_HANA_CONSTANT_CHECK(equal(
front(list(x<0>{})),
x<0>{}
));
BOOST_HANA_CONSTANT_CHECK(equal(
front(list(x<0>{}, invalid<>{})),
x<0>{}
));
BOOST_HANA_CONSTANT_CHECK(equal(
front(list(x<0>{}, invalid<1>{}, invalid<2>{})),
x<0>{}
));
BOOST_HANA_CONSTEXPR_CHECK(equal(
front(list(1)), 1
));
BOOST_HANA_CONSTEXPR_CHECK(equal(
front(list(1, '2')), 1
));
BOOST_HANA_CONSTEXPR_CHECK(equal(
front(list(1, '2', 3.3)), 1
));
//////////////////////////////////////////////////////////////////
// back
//////////////////////////////////////////////////////////////////
BOOST_HANA_CONSTANT_CHECK(equal(
back(list(x<0>{})),
x<0>{}
));
BOOST_HANA_CONSTANT_CHECK(equal(
back(list(invalid<0>{}, x<1>{})),
x<1>{}
));
BOOST_HANA_CONSTANT_CHECK(equal(
back(list(invalid<0>{}, invalid<1>{}, x<2>{})),
x<2>{}
));
BOOST_HANA_CONSTEXPR_CHECK(equal(
back(list(1)), 1
));
BOOST_HANA_CONSTEXPR_CHECK(equal(
back(list(1, '2')), '2'
));
BOOST_HANA_CONSTEXPR_CHECK(equal(
back(list(1, '2', 3.3)), 3.3
));
}
};
}}} // end namespace boost::hana::test
#endif // !BOOST_HANA_TEST_LAWS_ITERABLE_HPP
```
|
```html
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Title</title>
<script src="./jessibuca.js"></script>
<style>
html, body {
width: 100%;
height: 100%;
overflow: hidden;
}
html {
-ms-text-size-adjust: 100%;
-webkit-text-size-adjust: 100%;
-webkit-tap-highlight-color: transparent;
}
body {
line-height: 1.6;
position: relative;
font-family: "Microsoft Yahei", tahoma, arial, "Hiragino Sans GB";
}
input {
outline: 0;
}
* {
margin: 0;
padding: 0;
}
* {
-webkit-tap-highlight-color: transparent
}
a img {
border: 0;
}
a {
text-decoration: none;
}
a,
input,
button,
textarea{
}
input,
button,
textarea {
color: inherit;
font: inherit;
}
li {
list-style: none;
}
ol,
ul {
margin: 0;
padding: 0;
list-style: none;
}
</style>
<style>
html,body{
width: 100%;
height: 100%;
position: relative;
}
.root {
position: relative;
width: 100%;
text-align: center;
/*display: flex;*/
/*align-items: center;*/
/*justify-content: center;*/
/*margin-top: 3rem;*/
}
.container-shell {
backdrop-filter: blur(5px);
background: hsla(0, 0%, 50%, 0.5);
padding: 30px 4px 10px 4px;
/* border: 2px solid black; */
width: auto;
position: relative;
border-radius: 5px;
box-shadow: 0 10px 20px;
}
.container-shell:before {
content: "jessibuca demo player";
position: absolute;
color: darkgray;
top: 4px;
left: 10px;
text-shadow: 1px 1px black;
}
#container {
margin: 0 auto;
background: rgba(13, 14, 27, 0.7);
width: 640px;
height: 398px;
}
.input {
display: flex;
margin-top: 10px;
/*color: white;*/
place-content: stretch;
}
.input2 {
bottom: 0px;
}
.input input {
flex: auto;
}
.err {
position: absolute;
top: 40px;
left: 10px;
color: red;
}
.option {
position: absolute;
top: 4px;
right: 10px;
display: flex;
place-content: center;
font-size: 12px;
}
.option span {
color: white;
}
.page {
background: url('./bg.jpg');
background-repeat: no-repeat;
background-position: top;
}
@media (max-width: 720px) {
#container {
width: 90vw;
height: 52.7vw;
}
}
</style>
</head>
<body class="page">
<div class="root">
<div class="container-shell">
<div id="container"></div>
<div class="input">
<div>
<input
type="checkbox"
id="useWebFullScreen"
/><span>web</span>
</div>
</div>
<div class="input">
<div>URL</div>
<input
autocomplete="on"
id="playUrl"
value="path_to_url"
/>
<button id="play"></button>
<button id="pause" style="display: none"></button>
</div>
<div class="input" style="line-height: 30px">
<button id="fullscreen"></button>
<button id="destroy"></button>
</div>
</div>
</div>
<script>
var $player = document.getElementById('play');
var $pause = document.getElementById('pause');
var $playHref = document.getElementById('playUrl');
var $container = document.getElementById('container');
var $destroy = document.getElementById('destroy');
var $fullscreen = document.getElementById('fullscreen');
var $useWebFullScreen = document.getElementById('useWebFullScreen');
      var showOperateBtns = true;
      var forceNoOffscreen = true;
var jessibuca = null;
function create() {
jessibuca = new Jessibuca({
container: $container,
videoBuffer: 0.2, //
isResize: false,
text: "",
loadingText: "",
debug: true,
showBandwidth: showOperateBtns, //
operateBtns: {
fullscreen: showOperateBtns,
screenshot: showOperateBtns,
play: showOperateBtns,
audio: showOperateBtns,
},
forceNoOffscreen: forceNoOffscreen,
useWebFullScreen: $useWebFullScreen.checked,
isNotMute: false,
},);
// jessibuca.onLog = msg => console.error(msg);
// jessibuca.onRecord = (status) => console.log('onRecord', status);
// jessibuca.onPause = () => console.log('onPause');
// jessibuca.onPlay = () => console.log('onPlay');
// jessibuca.onFullscreen = msg => console.log('onFullscreen', msg);
// jessibuca.onMute = msg => console.log('onMute', msg);
jessibuca.on('fullscreen',(value)=>{
console.log('onFullscreen', value);
})
$player.style.display = 'inline-block';
$pause.style.display = 'none';
$destroy.style.display = 'none';
$fullscreen.style.display = 'none';
}
create();
function replay() {
if (jessibuca) {
jessibuca.destroy().then(()=>{
create();
play();
});
}
else {
create();
play();
}
}
function play() {
var href = $playHref.value;
if (href) {
jessibuca.play(href);
$player.style.display = 'none';
$pause.style.display = 'inline-block';
$destroy.style.display = 'inline-block';
$fullscreen.style.display = 'inline-block';
}
}
$player.addEventListener('click', function () {
play()
}, false)
$pause.addEventListener('click', function () {
$player.style.display = 'inline-block';
$pause.style.display = 'none';
jessibuca.pause();
})
$destroy.addEventListener('click', function () {
if (jessibuca) {
jessibuca.destroy();
}
create();
})
$fullscreen.addEventListener('click',function () {
if(jessibuca){
jessibuca.setFullscreen(true)
}
})
$useWebFullScreen.addEventListener('click', function () {
replay()
})
</script>
</body>
</html>
```
|
Irene Dufaux (born on 7 November 1960) is a Swiss sport shooter. She competed in rifle shooting events at the 1988 Summer Olympics.
Olympic results
References
1960 births
Living people
ISSF rifle shooters
Swiss female sport shooters
Shooters at the 1988 Summer Olympics
Olympic shooters for Switzerland
|
```objective-c
//
// TLChatViewController+Conversation.m
// TLChat
//
// Created by on 2017/12/26.
//
#import "TLChatViewController+Conversation.h"
@implementation TLChatViewController (Conversation)
- (instancetype)initWithConversation:(TLConversation *)conversation
{
if (conversation.convType == TLConversationTypePersonal) {
return [self initWithUserId:conversation.partnerID];
}
else if (conversation.convType == TLConversationTypeGroup){
return [self initWithGroupId:conversation.partnerID];
}
return [super init];
}
@end
```
|
```python
# Owner(s): ["module: dynamo"]
import functools
import operator
import re
import sys
import warnings
from itertools import product
from unittest import expectedFailure as xfail, skipIf as skipif, SkipTest
import pytest
from pytest import raises as assert_raises
from torch.testing._internal.common_utils import (
instantiate_parametrized_tests,
parametrize,
run_tests,
skipIfTorchDynamo,
TEST_WITH_TORCHDYNAMO,
TestCase,
xpassIfTorchDynamo,
)
if TEST_WITH_TORCHDYNAMO:
import numpy as np
from numpy.testing import (
assert_,
assert_array_equal,
assert_equal,
assert_warns,
HAS_REFCOUNT,
)
else:
import torch._numpy as np
from torch._numpy.testing import (
assert_,
assert_array_equal,
assert_equal,
assert_warns,
HAS_REFCOUNT,
)
skip = functools.partial(skipif, True)
@instantiate_parametrized_tests
class TestIndexing(TestCase):
def test_index_no_floats(self):
a = np.array([[[5]]])
assert_raises(IndexError, lambda: a[0.0])
assert_raises(IndexError, lambda: a[0, 0.0])
assert_raises(IndexError, lambda: a[0.0, 0])
assert_raises(IndexError, lambda: a[0.0, :])
assert_raises(IndexError, lambda: a[:, 0.0])
assert_raises(IndexError, lambda: a[:, 0.0, :])
assert_raises(IndexError, lambda: a[0.0, :, :])
assert_raises(IndexError, lambda: a[0, 0, 0.0])
assert_raises(IndexError, lambda: a[0.0, 0, 0])
assert_raises(IndexError, lambda: a[0, 0.0, 0])
assert_raises(IndexError, lambda: a[-1.4])
assert_raises(IndexError, lambda: a[0, -1.4])
assert_raises(IndexError, lambda: a[-1.4, 0])
assert_raises(IndexError, lambda: a[-1.4, :])
assert_raises(IndexError, lambda: a[:, -1.4])
assert_raises(IndexError, lambda: a[:, -1.4, :])
assert_raises(IndexError, lambda: a[-1.4, :, :])
assert_raises(IndexError, lambda: a[0, 0, -1.4])
assert_raises(IndexError, lambda: a[-1.4, 0, 0])
assert_raises(IndexError, lambda: a[0, -1.4, 0])
# Note torch validates index arguments "depth-first", so will prioritise
# raising TypeError over IndexError, e.g.
#
# >>> a = np.array([[[5]]])
# >>> a[0.0:, 0.0]
# IndexError: only integers, slices (`:`), ellipsis (`...`),
# numpy.newaxis # (`None`) and integer or boolean arrays are
# valid indices
# >>> t = torch.as_tensor([[[5]]]) # identical to a
# >>> t[0.0:, 0.0]
# TypeError: slice indices must be integers or None or have an
# __index__ method
#
assert_raises((IndexError, TypeError), lambda: a[0.0:, 0.0])
assert_raises((IndexError, TypeError), lambda: a[0.0:, 0.0, :])
def test_slicing_no_floats(self):
a = np.array([[5]])
# start as float.
assert_raises(TypeError, lambda: a[0.0:])
assert_raises(TypeError, lambda: a[0:, 0.0:2])
assert_raises(TypeError, lambda: a[0.0::2, :0])
assert_raises(TypeError, lambda: a[0.0:1:2, :])
assert_raises(TypeError, lambda: a[:, 0.0:])
# stop as float.
assert_raises(TypeError, lambda: a[:0.0])
assert_raises(TypeError, lambda: a[:0, 1:2.0])
assert_raises(TypeError, lambda: a[:0.0:2, :0])
assert_raises(TypeError, lambda: a[:0.0, :])
assert_raises(TypeError, lambda: a[:, 0:4.0:2])
# step as float.
assert_raises(TypeError, lambda: a[::1.0])
assert_raises(TypeError, lambda: a[0:, :2:2.0])
assert_raises(TypeError, lambda: a[1::4.0, :0])
assert_raises(TypeError, lambda: a[::5.0, :])
assert_raises(TypeError, lambda: a[:, 0:4:2.0])
# mixed.
assert_raises(TypeError, lambda: a[1.0:2:2.0])
assert_raises(TypeError, lambda: a[1.0::2.0])
assert_raises(TypeError, lambda: a[0:, :2.0:2.0])
assert_raises(TypeError, lambda: a[1.0:1:4.0, :0])
assert_raises(TypeError, lambda: a[1.0:5.0:5.0, :])
assert_raises(TypeError, lambda: a[:, 0.4:4.0:2.0])
# should still get the DeprecationWarning if step = 0.
assert_raises(TypeError, lambda: a[::0.0])
@skip(reason="torch allows slicing with non-0d array components")
def test_index_no_array_to_index(self):
# No non-scalar arrays.
a = np.array([[[1]]])
assert_raises(TypeError, lambda: a[a:a:a])
# Conversely, using scalars doesn't raise in NumPy, e.g.
#
# >>> i = np.int64(1)
# >>> a[i:i:i]
# array([], shape=(0, 1, 1), dtype=int64)
#
def test_none_index(self):
# `None` index adds newaxis
a = np.array([1, 2, 3])
assert_equal(a[None], a[np.newaxis])
assert_equal(a[None].ndim, a.ndim + 1)
@skip
def test_empty_tuple_index(self):
# Empty tuple index creates a view
a = np.array([1, 2, 3])
assert_equal(a[()], a)
assert_(a[()].tensor._base is a.tensor)
a = np.array(0)
assert_(isinstance(a[()], np.int_))
def test_same_kind_index_casting(self):
# Indexes should be cast with same-kind and not safe, even if that
# is somewhat unsafe. So test various different code paths.
index = np.arange(5)
u_index = index.astype(np.uint8) # i.e. cast to default uint indexing dtype
arr = np.arange(10)
assert_array_equal(arr[index], arr[u_index])
arr[u_index] = np.arange(5)
assert_array_equal(arr, np.arange(10))
arr = np.arange(10).reshape(5, 2)
assert_array_equal(arr[index], arr[u_index])
arr[u_index] = np.arange(5)[:, None]
assert_array_equal(arr, np.arange(5)[:, None].repeat(2, axis=1))
arr = np.arange(25).reshape(5, 5)
assert_array_equal(arr[u_index, u_index], arr[index, index])
def test_empty_fancy_index(self):
# Empty list index creates an empty array
# with the same dtype (but with weird shape)
a = np.array([1, 2, 3])
assert_equal(a[[]], [])
assert_equal(a[[]].dtype, a.dtype)
b = np.array([], dtype=np.intp)
assert_equal(a[[]], [])
assert_equal(a[[]].dtype, a.dtype)
b = np.array([])
assert_raises(IndexError, a.__getitem__, b)
def test_ellipsis_index(self):
a = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
assert_(a[...] is not a)
assert_equal(a[...], a)
# `a[...]` was `a` in numpy <1.9.
# Slicing with ellipsis can skip an
# arbitrary number of dimensions
assert_equal(a[0, ...], a[0])
assert_equal(a[0, ...], a[0, :])
assert_equal(a[..., 0], a[:, 0])
# Slicing with ellipsis always results
# in an array, not a scalar
assert_equal(a[0, ..., 1], np.array(2))
# Assignment with `(Ellipsis,)` on 0-d arrays
b = np.array(1)
b[(Ellipsis,)] = 2
assert_equal(b, 2)
@xpassIfTorchDynamo # (reason="torch._numpy arrays do not have a base attribute")
def test_ellipsis_index_2(self):
a = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
assert_(a[...] is not a)
assert_equal(a[...], a)
# `a[...]` was `a` in numpy <1.9.
assert_(a[...].base is a)
def test_single_int_index(self):
# Single integer index selects one row
a = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
assert_equal(a[0], [1, 2, 3])
assert_equal(a[-1], [7, 8, 9])
# Index out of bounds produces IndexError
assert_raises(IndexError, a.__getitem__, 1 << 30)
# Index overflow produces IndexError
# Note torch raises RuntimeError here
assert_raises((IndexError, RuntimeError), a.__getitem__, 1 << 64)
def test_single_bool_index(self):
# Single boolean index
a = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
assert_equal(a[np.array(True)], a[None])
assert_equal(a[np.array(False)], a[None][0:0])
def test_boolean_shape_mismatch(self):
arr = np.ones((5, 4, 3))
index = np.array([True])
assert_raises(IndexError, arr.__getitem__, index)
index = np.array([False] * 6)
assert_raises(IndexError, arr.__getitem__, index)
index = np.zeros((4, 4), dtype=bool)
assert_raises(IndexError, arr.__getitem__, index)
assert_raises(IndexError, arr.__getitem__, (slice(None), index))
def test_boolean_indexing_onedim(self):
# Indexing a 2-dimensional array with
# boolean array of length one
a = np.array([[0.0, 0.0, 0.0]])
b = np.array([True], dtype=bool)
assert_equal(a[b], a)
# boolean assignment
a[b] = 1.0
assert_equal(a, [[1.0, 1.0, 1.0]])
@skip(reason="NP_VER: fails on CI")
def test_boolean_assignment_value_mismatch(self):
# A boolean assignment should fail when the shape of the values
# cannot be broadcast to the subscription. (see also gh-3458)
a = np.arange(4)
def f(a, v):
a[a > -1] = v
assert_raises((RuntimeError, ValueError, TypeError), f, a, [])
assert_raises((RuntimeError, ValueError, TypeError), f, a, [1, 2, 3])
assert_raises((RuntimeError, ValueError, TypeError), f, a[:1], [1, 2, 3])
def test_boolean_indexing_twodim(self):
# Indexing a 2-dimensional array with
# 2-dimensional boolean array
a = np.array([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
b = np.array([[True, False, True], [False, True, False], [True, False, True]])
assert_equal(a[b], [1, 3, 5, 7, 9])
assert_equal(a[b[1]], [[4, 5, 6]])
assert_equal(a[b[0]], a[b[2]])
# boolean assignment
a[b] = 0
assert_equal(a, [[0, 2, 0], [4, 0, 6], [0, 8, 0]])
def test_boolean_indexing_list(self):
# Regression test for #13715. It's a use-after-free bug which the
# test won't directly catch, but it will show up in valgrind.
a = np.array([1, 2, 3])
b = [True, False, True]
# Two variants of the test because the first takes a fast path
assert_equal(a[b], [1, 3])
assert_equal(a[None, b], [[1, 3]])
def test_reverse_strides_and_subspace_bufferinit(self):
# This tests that the strides are not reversed for simple and
# subspace fancy indexing.
a = np.ones(5)
b = np.zeros(5, dtype=np.intp)[::-1]
c = np.arange(5)[::-1]
a[b] = c
# If the strides are not reversed, the 0 in the arange comes last.
assert_equal(a[0], 0)
# This also tests that the subspace buffer is initialized:
a = np.ones((5, 2))
c = np.arange(10).reshape(5, 2)[::-1]
a[b, :] = c
assert_equal(a[0], [0, 1])
def test_reversed_strides_result_allocation(self):
# Test a bug when calculating the output strides for a result array
# when the subspace size was 1 (and test other cases as well)
a = np.arange(10)[:, None]
i = np.arange(10)[::-1]
assert_array_equal(a[i], a[i.copy("C")])
a = np.arange(20).reshape(-1, 2)
def test_uncontiguous_subspace_assignment(self):
# During development there was a bug activating a skip logic
# based on ndim instead of size.
a = np.full((3, 4, 2), -1)
b = np.full((3, 4, 2), -1)
a[[0, 1]] = np.arange(2 * 4 * 2).reshape(2, 4, 2).T
b[[0, 1]] = np.arange(2 * 4 * 2).reshape(2, 4, 2).T.copy()
assert_equal(a, b)
@skip(reason="torch does not limit dims to 32")
def test_too_many_fancy_indices_special_case(self):
# Just documents behaviour, this is a small limitation.
a = np.ones((1,) * 32) # 32 is NPY_MAXDIMS
assert_raises(IndexError, a.__getitem__, (np.array([0]),) * 32)
def test_scalar_array_bool(self):
# NumPy bools can be used as boolean index (python ones as of yet not)
a = np.array(1)
assert_equal(a[np.bool_(True)], a[np.array(True)])
assert_equal(a[np.bool_(False)], a[np.array(False)])
# After deprecating bools as integers:
# a = np.array([0,1,2])
# assert_equal(a[True, :], a[None, :])
# assert_equal(a[:, True], a[:, None])
#
# assert_(not np.may_share_memory(a, a[True, :]))
def test_everything_returns_views(self):
# Before `...` would return a itself.
a = np.arange(5)
assert_(a is not a[()])
assert_(a is not a[...])
assert_(a is not a[:])
def test_broaderrors_indexing(self):
a = np.zeros((5, 5))
assert_raises(IndexError, a.__getitem__, ([0, 1], [0, 1, 2]))
assert_raises(IndexError, a.__setitem__, ([0, 1], [0, 1, 2]), 0)
def test_trivial_fancy_out_of_bounds(self):
a = np.zeros(5)
ind = np.ones(20, dtype=np.intp)
ind[-1] = 10
assert_raises(IndexError, a.__getitem__, ind)
assert_raises((IndexError, RuntimeError), a.__setitem__, ind, 0)
ind = np.ones(20, dtype=np.intp)
ind[0] = 11
assert_raises(IndexError, a.__getitem__, ind)
assert_raises((IndexError, RuntimeError), a.__setitem__, ind, 0)
def test_trivial_fancy_not_possible(self):
# Test that the fast path for trivial assignment is not incorrectly
# used when the index is not contiguous or 1D, see also gh-11467.
a = np.arange(6)
idx = np.arange(6, dtype=np.intp).reshape(2, 1, 3)[:, :, 0]
assert_array_equal(a[idx], idx)
# this case must not go into the fast path, note that idx is
# a non-contiguous, non-1D array here.
a[idx] = -1
res = np.arange(6)
res[0] = -1
res[3] = -1
assert_array_equal(a, res)
def test_memory_order(self):
# This is not necessary to preserve. Memory layouts for
# more complex indices are not as simple.
a = np.arange(10)
b = np.arange(10).reshape(5, 2).T
assert_(a[b].flags.f_contiguous)
# Takes a different implementation branch:
a = a.reshape(-1, 1)
assert_(a[b, 0].flags.f_contiguous)
@skipIfTorchDynamo() # XXX: flaky, depends on implementation details
def test_small_regressions(self):
# Reference count of intp for index checks
a = np.array([0])
if HAS_REFCOUNT:
refcount = sys.getrefcount(np.dtype(np.intp))
# item setting always checks indices in separate function:
a[np.array([0], dtype=np.intp)] = 1
a[np.array([0], dtype=np.uint8)] = 1
assert_raises(IndexError, a.__setitem__, np.array([1], dtype=np.intp), 1)
assert_raises(IndexError, a.__setitem__, np.array([1], dtype=np.uint8), 1)
if HAS_REFCOUNT:
assert_equal(sys.getrefcount(np.dtype(np.intp)), refcount)
def test_tuple_subclass(self):
arr = np.ones((5, 5))
# A tuple subclass should also be an nd-index
class TupleSubclass(tuple):
pass
index = ([1], [1])
index = TupleSubclass(index)
assert_(arr[index].shape == (1,))
# Unlike the non nd-index:
assert_(arr[index,].shape != (1,))
@xpassIfTorchDynamo # (reason="XXX: low-prio behaviour to support")
def test_broken_sequence_not_nd_index(self):
# If we have an object which claims to be a sequence, but fails
# on item getting, this should not be converted to an nd-index (tuple)
# If this object happens to be a valid index otherwise, it should work
# This object here is very dubious and probably bad though:
class SequenceLike:
def __index__(self):
return 0
def __len__(self):
return 1
def __getitem__(self, item):
raise IndexError("Not possible")
arr = np.arange(10)
assert_array_equal(arr[SequenceLike()], arr[SequenceLike(),])
# also test that field indexing does not segfault
# for a similar reason, by indexing a structured array
arr = np.zeros((1,), dtype=[("f1", "i8"), ("f2", "i8")])
assert_array_equal(arr[SequenceLike()], arr[SequenceLike(),])
def test_indexing_array_weird_strides(self):
# See also gh-6221
# the shapes used here come from the issue and create the correct
# size for the iterator buffering size.
x = np.ones(10)
x2 = np.ones((10, 2))
ind = np.arange(10)[:, None, None, None]
ind = np.broadcast_to(ind, (10, 55, 4, 4))
# single advanced index case
assert_array_equal(x[ind], x[ind.copy()])
# higher dimensional advanced index
zind = np.zeros(4, dtype=np.intp)
assert_array_equal(x2[ind, zind], x2[ind.copy(), zind])
def test_indexing_array_negative_strides(self):
# From gh-8264,
# core dumps if negative strides are used in iteration
arro = np.zeros((4, 4))
arr = arro[::-1, ::-1]
slices = (slice(None), [0, 1, 2, 3])
arr[slices] = 10
assert_array_equal(arr, 10.0)
@parametrize("index", [True, False, np.array([0])])
@parametrize("num", [32, 40])
@parametrize("original_ndim", [1, 32])
def test_too_many_advanced_indices(self, index, num, original_ndim):
# These are limitations based on the number of arguments we can process.
# For `num=32` (and all boolean cases), the result is actually defined;
# but the use of NpyIter (NPY_MAXARGS) limits it for technical reasons.
if not (isinstance(index, np.ndarray) and original_ndim < num):
# unskipped cases fail because of assigning too many indices
raise SkipTest("torch does not limit dims to 32")
arr = np.ones((1,) * original_ndim)
with pytest.raises(IndexError):
arr[(index,) * num]
with pytest.raises(IndexError):
arr[(index,) * num] = 1.0
def test_nontuple_ndindex(self):
a = np.arange(25).reshape((5, 5))
assert_equal(a[[0, 1]], np.array([a[0], a[1]]))
assert_equal(a[[0, 1], [0, 1]], np.array([0, 6]))
raise SkipTest(
"torch happily consumes non-tuple sequences with multi-axis "
"indices (i.e. slices) as an index, whereas NumPy invalidates "
"them, assumedly to keep things simple. This invalidation "
"behaviour is just too niche to bother emulating."
)
assert_raises(IndexError, a.__getitem__, [slice(None)])
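# Hedged illustration of the error split exercised by test_index_no_floats
# and test_slicing_no_floats above: a scalar float index raises IndexError,
# while a float slice bound raises TypeError. This helper deliberately uses
# real NumPy rather than the TEST_WITH_TORCHDYNAMO-selected np, and its name
# is local to this sketch, not part of the suite.
def _demo_float_index_error_kinds():
    import numpy as real_np
    a = real_np.array([[[5]]])
    seen = []
    for op in (lambda: a[0.5], lambda: a[0.5:]):
        try:
            op()
        except (IndexError, TypeError) as exc:
            seen.append(type(exc).__name__)
    return seen  # ["IndexError", "TypeError"]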
@instantiate_parametrized_tests
class TestBroadcastedAssignments(TestCase):
def assign(self, a, ind, val):
a[ind] = val
return a
def test_prepending_ones(self):
a = np.zeros((3, 2))
a[...] = np.ones((1, 3, 2))
# Fancy with subspace with and without transpose
a[[0, 1, 2], :] = np.ones((1, 3, 2))
a[:, [0, 1]] = np.ones((1, 3, 2))
# Fancy without subspace (with broadcasting)
a[[[0], [1], [2]], [0, 1]] = np.ones((1, 3, 2))
def test_prepend_not_one(self):
assign = self.assign
s_ = np.s_
a = np.zeros(5)
# Too large and not only ones.
try:
assign(a, s_[...], np.ones((2, 1)))
except Exception as e:
self.assertTrue(isinstance(e, (ValueError, RuntimeError)))
assert_raises(
(ValueError, RuntimeError), assign, a, s_[[1, 2, 3],], np.ones((2, 1))
)
assert_raises(
(ValueError, RuntimeError), assign, a, s_[[[1], [2]],], np.ones((2, 2, 1))
)
def test_simple_broadcasting_errors(self):
assign = self.assign
s_ = np.s_
a = np.zeros((5, 1))
try:
assign(a, s_[...], np.zeros((5, 2)))
except Exception as e:
self.assertTrue(isinstance(e, (ValueError, RuntimeError)))
try:
assign(a, s_[...], np.zeros((5, 0)))
except Exception as e:
self.assertTrue(isinstance(e, (ValueError, RuntimeError)))
assert_raises(
(ValueError, RuntimeError), assign, a, s_[:, [0]], np.zeros((5, 2))
)
assert_raises(
(ValueError, RuntimeError), assign, a, s_[:, [0]], np.zeros((5, 0))
)
assert_raises(
(ValueError, RuntimeError), assign, a, s_[[0], :], np.zeros((2, 1))
)
@parametrize(
"index", [(..., [1, 2], slice(None)), ([0, 1], ..., 0), (..., [1, 2], [1, 2])]
)
def test_broadcast_error_reports_correct_shape(self, index):
values = np.zeros((100, 100)) # will never broadcast below
arr = np.zeros((3, 4, 5, 6, 7))
with pytest.raises((ValueError, RuntimeError)) as e:
arr[index] = values
shape = arr[index].shape
r_inner_shape = "".join(f"{side}, ?" for side in shape[:-1]) + str(shape[-1])
assert re.search(rf"[\(\[]{r_inner_shape}[\]\)]$", str(e.value))
def test_index_is_larger(self):
# Simple case of fancy index broadcasting of the index.
a = np.zeros((5, 5))
a[[[0], [1], [2]], [0, 1, 2]] = [2, 3, 4]
assert_((a[:3, :3] == [2, 3, 4]).all())
def test_broadcast_subspace(self):
a = np.zeros((100, 100))
v = np.arange(100)[:, None]
b = np.arange(100)[::-1]
a[b] = v
assert_((a[::-1] == v).all())
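# Hedged sketch of the fancy-index broadcasting used in test_index_is_larger
# and test_broadcast_subspace above: a (3, 1) row index broadcasts against a
# (3,) column index to address a (3, 3) block, and the assigned value
# broadcasts to the same shape. Real NumPy is used on purpose; the helper
# name is illustration only, not part of the suite.
def _demo_fancy_broadcast_assignment():
    import numpy as real_np
    a = real_np.zeros((3, 3))
    # rows (3, 1) x cols (3,) -> broadcast index shape (3, 3)
    a[[[0], [1], [2]], [0, 1, 2]] = [7, 8, 9]
    return a.tolist()  # every row becomes [7.0, 8.0, 9.0]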
class TestFancyIndexingCast(TestCase):
@xpassIfTorchDynamo # (
# reason="XXX: low-prio to support assigning complex values on floating arrays"
# )
def test_boolean_index_cast_assign(self):
# Setup the boolean index and float arrays.
shape = (8, 63)
bool_index = np.zeros(shape).astype(bool)
bool_index[0, 1] = True
zero_array = np.zeros(shape)
# Assigning float is fine.
zero_array[bool_index] = np.array([1])
assert_equal(zero_array[0, 1], 1)
# Fancy indexing works, although we get a cast warning.
assert_warns(
np.ComplexWarning, zero_array.__setitem__, ([0], [1]), np.array([2 + 1j])
)
assert_equal(zero_array[0, 1], 2) # No complex part
# Cast complex to float, throwing away the imaginary portion.
assert_warns(
np.ComplexWarning, zero_array.__setitem__, bool_index, np.array([1j])
)
assert_equal(zero_array[0, 1], 0)
@xfail # (reason="XXX: requires broadcast() and broadcast_to()")
class TestMultiIndexingAutomated(TestCase):
"""
These tests use code to mimic the C-Code indexing for selection.
NOTE:
* This still lacks tests for complex item setting.
* If you change behavior of indexing, you might want to modify
these tests to try more combinations.
* Behavior was written to match numpy version 1.8. (though a
first version matched 1.7.)
* Only tuple indices are supported by the mimicking code.
(and tested as of writing this)
* Error types should match most of the time as long as there
is only one error. For multiple errors, what gets raised
will usually not be the same one. They are *not* tested.
Update 2016-11-30: It is probably not worth maintaining this test
indefinitely and it can be dropped if maintenance becomes a burden.
"""
def setUp(self):
self.a = np.arange(np.prod([3, 1, 5, 6])).reshape(3, 1, 5, 6)
self.b = np.empty((3, 0, 5, 6))
self.complex_indices = [
"skip",
Ellipsis,
0,
# Boolean indices, up to 3-d for some special cases of eating up
# dimensions, also need to test all False
np.array([True, False, False]),
np.array([[True, False], [False, True]]),
np.array([[[False, False], [False, False]]]),
# Some slices:
slice(-5, 5, 2),
slice(1, 1, 100),
slice(4, -1, -2),
slice(None, None, -3),
# Some Fancy indexes:
np.empty((0, 1, 1), dtype=np.intp), # empty and can be broadcast
np.array([0, 1, -2]),
np.array([[2], [0], [1]]),
np.array([[0, -1], [0, 1]], dtype=np.dtype("intp")),
np.array([2, -1], dtype=np.int8),
np.zeros([1] * 31, dtype=int), # trigger too large array.
np.array([0.0, 1.0]),  # invalid datatype
]
# Some simpler indices that still cover a bit more
self.simple_indices = [Ellipsis, None, -1, [1], np.array([True]), "skip"]
# Very simple ones to fill the rest:
self.fill_indices = [slice(None, None), 0]
def _get_multi_index(self, arr, indices):
"""Mimic multi dimensional indexing.
Parameters
----------
arr : ndarray
Array to be indexed.
indices : tuple of index objects
Returns
-------
out : ndarray
An array equivalent to the indexing operation (but always a copy).
`arr[indices]` should be identical.
no_copy : bool
Whether the indexing operation requires a copy. If this is `True`,
`np.may_share_memory(arr, arr[indices])` should be `True` (with
some exceptions for scalars and possibly 0-d arrays).
Notes
-----
While the function may mostly match the errors of normal indexing this
is generally not the case.
"""
in_indices = list(indices)
indices = []
# if False, this is a fancy or boolean index
no_copy = True
# number of fancy/scalar indexes that are not consecutive
num_fancy = 0
# number of dimensions indexed by a "fancy" index
fancy_dim = 0
# NOTE: This is a funny twist (and probably OK to change).
# The boolean array has illegal indexes, but this is
# allowed if the broadcast fancy-indices are 0-sized.
# This variable is to catch that case.
error_unless_broadcast_to_empty = False
# We need to handle Ellipsis and make arrays from indices, also
# check if this is fancy indexing (set no_copy).
ndim = 0
ellipsis_pos = None # defined here mostly to replace all but first.
for i, indx in enumerate(in_indices):
if indx is None:
continue
if isinstance(indx, np.ndarray) and indx.dtype == bool:
no_copy = False
if indx.ndim == 0:
raise IndexError
# boolean indices can have higher dimensions
ndim += indx.ndim
fancy_dim += indx.ndim
continue
if indx is Ellipsis:
if ellipsis_pos is None:
ellipsis_pos = i
continue # do not increment ndim counter
raise IndexError
if isinstance(indx, slice):
ndim += 1
continue
if not isinstance(indx, np.ndarray):
# This could be open for changes in numpy.
# numpy should maybe raise an error if casting to intp
# is not safe. It rejects np.array([1., 2.]) but not
# [1., 2.] as index (same for e.g. np.take).
# (Note the importance of empty lists if changing this here)
try:
indx = np.array(indx, dtype=np.intp)
except ValueError:
raise IndexError from None
in_indices[i] = indx
elif indx.dtype.kind != "b" and indx.dtype.kind != "i":
raise IndexError(
"arrays used as indices must be of integer (or boolean) type"
)
if indx.ndim != 0:
no_copy = False
ndim += 1
fancy_dim += 1
if arr.ndim - ndim < 0:
# we can't take more dimensions than we have, not even for 0-d
# arrays. since a[()] makes sense, but not a[(),]. We will
# raise an error later on, unless a broadcasting error occurs
# first.
raise IndexError
if ndim == 0 and None not in in_indices:
# Well we have no indexes or one Ellipsis. This is legal.
return arr.copy(), no_copy
if ellipsis_pos is not None:
in_indices[ellipsis_pos : ellipsis_pos + 1] = [slice(None, None)] * (
arr.ndim - ndim
)
for ax, indx in enumerate(in_indices):
if isinstance(indx, slice):
# convert to an index array
indx = np.arange(*indx.indices(arr.shape[ax]))
indices.append(["s", indx])
continue
elif indx is None:
# this is like taking a slice with one element from a new axis:
indices.append(["n", np.array([0], dtype=np.intp)])
arr = arr.reshape(arr.shape[:ax] + (1,) + arr.shape[ax:])
continue
if isinstance(indx, np.ndarray) and indx.dtype == bool:
if indx.shape != arr.shape[ax : ax + indx.ndim]:
raise IndexError
try:
flat_indx = np.ravel_multi_index(
np.nonzero(indx), arr.shape[ax : ax + indx.ndim], mode="raise"
)
except Exception:
error_unless_broadcast_to_empty = True
# fill with 0s instead, and raise error later
flat_indx = np.array([0] * indx.sum(), dtype=np.intp)
# concatenate axis into a single one:
if indx.ndim != 0:
arr = arr.reshape(
arr.shape[:ax]
+ (np.prod(arr.shape[ax : ax + indx.ndim]),)
+ arr.shape[ax + indx.ndim :]
)
indx = flat_indx
else:
# This could be changed, a 0-d boolean index can
# make sense (even outside the 0-d indexed array case)
# Note that originally this could be interpreted as
# integer in the full integer special case.
raise IndexError
else:
# If the index is a singleton, the bounds check is done
# before the broadcasting. This used to be different in <1.9
if indx.ndim == 0:
if indx >= arr.shape[ax] or indx < -arr.shape[ax]:
raise IndexError
if indx.ndim == 0:
# The index is a scalar. This used to be two fold, but if
# fancy indexing was active, the check was done later,
# possibly after broadcasting it away (1.7. or earlier).
# Now it is always done.
if indx >= arr.shape[ax] or indx < -arr.shape[ax]:
raise IndexError
if len(indices) > 0 and indices[-1][0] == "f" and ax != ellipsis_pos:
# NOTE: There could still have been a 0-sized Ellipsis
# between them. Checked that with ellipsis_pos.
indices[-1].append(indx)
else:
# We have a fancy index that is not after an existing one.
# NOTE: A 0-d array triggers this as well, while one may
# expect it to not trigger it, since a scalar would not be
# considered fancy indexing.
num_fancy += 1
indices.append(["f", indx])
if num_fancy > 1 and not no_copy:
# We have to flush the fancy indexes left
new_indices = indices[:]
axes = list(range(arr.ndim))
fancy_axes = []
new_indices.insert(0, ["f"])
ni = 0
ai = 0
for indx in indices:
ni += 1
if indx[0] == "f":
new_indices[0].extend(indx[1:])
del new_indices[ni]
ni -= 1
for ax in range(ai, ai + len(indx[1:])):
fancy_axes.append(ax)
axes.remove(ax)
ai += len(indx) - 1 # axis we are at
indices = new_indices
# and now we need to transpose arr:
arr = arr.transpose(*(fancy_axes + axes))
# We only have one 'f' index now and arr is transposed accordingly.
# Now handle newaxis by reshaping...
ax = 0
for indx in indices:
if indx[0] == "f":
if len(indx) == 1:
continue
# First of all, reshape arr to combine fancy axes into one:
orig_shape = arr.shape
orig_slice = orig_shape[ax : ax + len(indx[1:])]
arr = arr.reshape(
arr.shape[:ax]
+ (np.prod(orig_slice).astype(int),)
+ arr.shape[ax + len(indx[1:]) :]
)
# Check if broadcasting works
res = np.broadcast(*indx[1:])
# unfortunately the indices might be out of bounds. So check
# that first, and use mode='wrap' then. However only if
# there are any indices...
if res.size != 0:
if error_unless_broadcast_to_empty:
raise IndexError
for _indx, _size in zip(indx[1:], orig_slice):
if _indx.size == 0:
continue
if np.any(_indx >= _size) or np.any(_indx < -_size):
raise IndexError
if len(indx[1:]) == len(orig_slice):
if np.prod(orig_slice) == 0:
# Work around for a crash or IndexError with 'wrap'
# in some 0-sized cases.
try:
mi = np.ravel_multi_index(
indx[1:], orig_slice, mode="raise"
)
except Exception as exc:
# This happens with 0-sized orig_slice (sometimes?)
# here it is a ValueError, but indexing gives a:
raise IndexError("invalid index into 0-sized") from exc
else:
mi = np.ravel_multi_index(indx[1:], orig_slice, mode="wrap")
else:
# Maybe never happens...
raise ValueError
arr = arr.take(mi.ravel(), axis=ax)
try:
arr = arr.reshape(arr.shape[:ax] + mi.shape + arr.shape[ax + 1 :])
except ValueError:
# too many dimensions, probably
raise IndexError from None
ax += mi.ndim
continue
# If we are here, we have a 1D array for take:
arr = arr.take(indx[1], axis=ax)
ax += 1
return arr, no_copy
def _check_multi_index(self, arr, index):
"""Check a multi index item getting and simple setting.
Parameters
----------
arr : ndarray
Array to be indexed, must be a reshaped arange.
index : tuple of indexing objects
Index being tested.
"""
# Test item getting
try:
mimic_get, no_copy = self._get_multi_index(arr, index)
except Exception as e:
if HAS_REFCOUNT:
prev_refcount = sys.getrefcount(arr)
assert_raises(type(e), arr.__getitem__, index)
assert_raises(type(e), arr.__setitem__, index, 0)
if HAS_REFCOUNT:
assert_equal(prev_refcount, sys.getrefcount(arr))
return
self._compare_index_result(arr, index, mimic_get, no_copy)
def _check_single_index(self, arr, index):
"""Check a single index item getting and simple setting.
Parameters
----------
arr : ndarray
Array to be indexed, must be an arange.
index : indexing object
Index being tested. Must be a single index and not a tuple
of indexing objects (see also `_check_multi_index`).
"""
try:
mimic_get, no_copy = self._get_multi_index(arr, (index,))
except Exception as e:
if HAS_REFCOUNT:
prev_refcount = sys.getrefcount(arr)
assert_raises(type(e), arr.__getitem__, index)
assert_raises(type(e), arr.__setitem__, index, 0)
if HAS_REFCOUNT:
assert_equal(prev_refcount, sys.getrefcount(arr))
return
self._compare_index_result(arr, index, mimic_get, no_copy)
def _compare_index_result(self, arr, index, mimic_get, no_copy):
"""Compare mimicked result to indexing result."""
raise SkipTest("torch does not support subclassing")
arr = arr.copy()
indexed_arr = arr[index]
assert_array_equal(indexed_arr, mimic_get)
# Check if we got a view, unless it's a 0-sized or 0-d array.
# (then it's not a view, and that does not matter)
if indexed_arr.size != 0 and indexed_arr.ndim != 0:
assert_(np.may_share_memory(indexed_arr, arr) == no_copy)
# Check reference count of the original array
if HAS_REFCOUNT:
if no_copy:
# refcount increases by one:
assert_equal(sys.getrefcount(arr), 3)
else:
assert_equal(sys.getrefcount(arr), 2)
# Test non-broadcast setitem:
b = arr.copy()
b[index] = mimic_get + 1000
if b.size == 0:
return # nothing to compare here...
if no_copy and indexed_arr.ndim != 0:
# change indexed_arr in-place to manipulate original:
indexed_arr += 1000
assert_array_equal(arr, b)
return
# Use the fact that the array is originally an arange:
arr.flat[indexed_arr.ravel()] += 1000
assert_array_equal(arr, b)
def test_boolean(self):
a = np.array(5)
assert_equal(a[np.array(True)], 5)
a[np.array(True)] = 1
assert_equal(a, 1)
# NOTE: This is different from normal broadcasting, as
# arr[boolean_array] works like in a multi index. Which means
# it is aligned to the left. This is probably correct for
# consistency with arr[boolean_array,] also no broadcasting
# is done at all
self._check_multi_index(self.a, (np.zeros_like(self.a, dtype=bool),))
self._check_multi_index(self.a, (np.zeros_like(self.a, dtype=bool)[..., 0],))
self._check_multi_index(self.a, (np.zeros_like(self.a, dtype=bool)[None, ...],))
def test_multidim(self):
# Automatically test combinations with complex indexes on 2nd (or 1st)
# spot and the simple ones in one other spot.
with warnings.catch_warnings():
# This is so that np.array(True) is not accepted in a full integer
# index, when running the file separately.
warnings.filterwarnings("error", "", DeprecationWarning)
warnings.filterwarnings("error", "", np.VisibleDeprecationWarning)
def isskip(idx):
return isinstance(idx, str) and idx == "skip"
for simple_pos in [0, 2, 3]:
tocheck = [
self.fill_indices,
self.complex_indices,
self.fill_indices,
self.fill_indices,
]
tocheck[simple_pos] = self.simple_indices
for index in product(*tocheck):
index = tuple(i for i in index if not isskip(i))
self._check_multi_index(self.a, index)
self._check_multi_index(self.b, index)
# Check very simple item getting:
self._check_multi_index(self.a, (0, 0, 0, 0))
self._check_multi_index(self.b, (0, 0, 0, 0))
# Also check (simple cases of) too many indices:
assert_raises(IndexError, self.a.__getitem__, (0, 0, 0, 0, 0))
assert_raises(IndexError, self.a.__setitem__, (0, 0, 0, 0, 0), 0)
assert_raises(IndexError, self.a.__getitem__, (0, 0, [1], 0, 0))
assert_raises(IndexError, self.a.__setitem__, (0, 0, [1], 0, 0), 0)
def test_1d(self):
a = np.arange(10)
for index in self.complex_indices:
self._check_single_index(a, index)
class TestFloatNonIntegerArgument(TestCase):
"""
These test that ``TypeError`` is raised when you try to use
non-integers as arguments for indexing and slicing, e.g. ``a[0.0:5]``
and ``a[0.5]``, or other functions like ``array.reshape(1., -1)``.
"""
def test_valid_indexing(self):
# These should raise no errors.
a = np.array([[[5]]])
a[np.array([0])]
a[[0, 0]]
a[:, [0, 0]]
a[:, 0, :]
a[:, :, :]
def test_valid_slicing(self):
# These should raise no errors.
a = np.array([[[5]]])
a[::]
a[0:]
a[:2]
a[0:2]
a[::2]
a[1::2]
a[:2:2]
a[1:2:2]
def test_non_integer_argument_errors(self):
a = np.array([[5]])
assert_raises(TypeError, np.reshape, a, (1.0, 1.0, -1))
assert_raises(TypeError, np.reshape, a, (np.array(1.0), -1))
assert_raises(TypeError, np.take, a, [0], 1.0)
assert_raises((TypeError, RuntimeError), np.take, a, [0], np.float64(1.0))
@skip(
reason=("torch doesn't have scalar types with distinct element-wise behaviours")
)
def test_non_integer_sequence_multiplication(self):
# NumPy scalar sequence multiply should not work with non-integers
def mult(a, b):
return a * b
assert_raises(TypeError, mult, [1], np.float64(3))
# following should be OK
mult([1], np.int_(3))
def test_reduce_axis_float_index(self):
d = np.zeros((3, 3, 3))
assert_raises(TypeError, np.min, d, 0.5)
assert_raises(TypeError, np.min, d, (0.5, 1))
assert_raises(TypeError, np.min, d, (1, 2.2))
assert_raises(TypeError, np.min, d, (0.2, 1.2))
class TestBooleanIndexing(TestCase):
# Using a boolean as integer argument/indexing is an error.
def test_bool_as_int_argument_errors(self):
a = np.array([[[1]]])
assert_raises(TypeError, np.reshape, a, (True, -1))
# Note that operator.index(np.array(True)) does not work, a boolean
# array is thus also deprecated, but not with the same message:
# assert_warns(DeprecationWarning, operator.index, np.True_)
assert_raises(TypeError, np.take, args=(a, [0], False))
raise SkipTest("torch consumes boolean tensors as ints, no bother raising here")
assert_raises(TypeError, np.reshape, a, (np.bool_(True), -1))
assert_raises(TypeError, operator.index, np.array(True))
def test_boolean_indexing_weirdness(self):
# Weird boolean indexing things
a = np.ones((2, 3, 4))
assert a[False, True, ...].shape == (0, 2, 3, 4)
assert a[True, [0, 1], True, True, [1], [[2]]].shape == (1, 2)
assert_raises(IndexError, lambda: a[False, [0, 1], ...])
def test_boolean_indexing_fast_path(self):
# These used to either give the wrong error, or incorrectly give no
# error.
a = np.ones((3, 3))
# This used to incorrectly work (and give an array of shape (0,))
idx1 = np.array([[False] * 9])
with pytest.raises(IndexError):
a[idx1]
# This used to incorrectly give a ValueError: operands could not be broadcast together
idx2 = np.array([[False] * 8 + [True]])
with pytest.raises(IndexError):
a[idx2]
# This is the same as it used to be. The above two should work like this.
idx3 = np.array([[False] * 10])
with pytest.raises(IndexError):
a[idx3]
# This used to give ValueError: non-broadcastable operand
a = np.ones((1, 1, 2))
idx = np.array([[[True], [False]]])
with pytest.raises(IndexError):
a[idx]
class TestArrayToIndexDeprecation(TestCase):
"""Creating an index from array not 0-D is an error."""
def test_array_to_index_error(self):
# so no exception is expected. The raising is effectively tested above.
a = np.array([[[1]]])
assert_raises((TypeError, RuntimeError), np.take, a, [0], a)
raise SkipTest(
"Multi-dimensional tensors are indexable just as long as they only "
"contain a single element, no bother raising here"
)
assert_raises(TypeError, operator.index, np.array([1]))
raise SkipTest("torch consumes tensors as ints, no bother raising here")
assert_raises(TypeError, np.reshape, a, (a, -1))
class TestNonIntegerArrayLike(TestCase):
"""Tests that array_likes only valid if can safely cast to integer.
For instance, lists give IndexError when they cannot be safely cast to
an integer.
"""
@skip(
reason=(
"torch consumes floats by way of falling back on its deprecated "
"__index__ behaviour, no bother raising here"
)
)
def test_basic(self):
a = np.arange(10)
assert_raises(IndexError, a.__getitem__, [0.5, 1.5])
assert_raises(IndexError, a.__getitem__, (["1", "2"],))
# The following is valid
a.__getitem__([])
class TestMultipleEllipsisError(TestCase):
"""An index can only have a single ellipsis."""
@xfail # (
# reason=(
# "torch currently consumes multiple ellipsis, no bother raising "
# "here. See path_to_url#issue-917252204"
# )
# )
def test_basic(self):
a = np.arange(10)
assert_raises(IndexError, lambda: a[..., ...])
assert_raises(IndexError, a.__getitem__, ((Ellipsis,) * 2,))
assert_raises(IndexError, a.__getitem__, ((Ellipsis,) * 3,))
if __name__ == "__main__":
run_tests()
```
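The non-integer-argument behaviour these tests pin down can be reproduced against stock NumPy (outside the torch compatibility layer); a minimal sketch of the `reshape` case:

```python
import numpy as np

a = np.array([[5]])

# Float shape entries are rejected, as test_non_integer_argument_errors asserts.
try:
    np.reshape(a, (1.0, -1))
    raised = False
except TypeError:
    raised = True

print(raised)  # True
```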
|
Highway 397 (AR 397 and Hwy. 397) is a north-south state highway in Boone County, Arkansas. The highway is maintained by the Arkansas Department of Transportation.
Route description
The ArDOT maintains Highway 397 like all other parts of the state highway system. As a part of these responsibilities, the Department tracks the volume of traffic using its roads in surveys using a metric called average annual daily traffic (AADT). ArDOT estimates the traffic level for a segment of roadway for any average day of the year in these surveys. As of 2019, AADT was estimated at 4,500 vehicles per day (VPD) along the northern part and 2,200 VPD near the southern terminus. No segment of Highway 397 has been listed as part of the National Highway System, a network of roads important to the nation's economy, defense, and mobility.
The Highway 397 designation begins at a junction with Highway 43 in the Ozark Mountains just outside Harrison, the county seat of Boone County, Arkansas. Highway 397 passes the Grubb Springs School, listed on the National Register of Historic Places, and runs through a rural area before crossing Dry Branch and briefly serving as the western city limits of Harrison. Highway 397 next intersects Highway 392; the two highways form a brief concurrency westbound along a section line road. Highway 397 turns north alone, crossing Dry Jordan Creek and again serving as the western boundary of Harrison. The highway enters an industrial area of Harrison before terminating at Industrial Park Road, a city street.
History
In 1973, the Arkansas General Assembly passed Act 9 of 1973. The act directed county judges and legislators to designate up to of county roads as state highways in each county. As a result of this legislation, Highway 397 was created between Highway 43 and Industrial Park Road in Harrison by the Arkansas State Highway Commission on April 25, 1973.
Major intersections
See also
References
External links
397
Transportation in Boone County, Arkansas
|
Cherry Muhanji is the pen name of Jeannette Delaine Washington (born April 26, 1939, in Detroit, Michigan), an American writer.
She is best known for her novel Her, which won a Ferro-Grumley Award and a Lambda Literary Award in 1991, and the anthology Tight Spaces, which she copublished with Kesho Y. Scott and Egyirba High and which won an American Book Award in 1988. She has also published poetry and short stories in literary magazines and anthologies and is currently working on a memoir.
Muhanji holds a doctorate in English, anthropology and African American World Studies from the University of Iowa. She has taught at various colleges and universities, including the University of Minnesota, Goddard College and Portland State University.
Muhanji's only novel, Her, was released in 1990. It explores the relationships between a community of black women in Detroit.
References
1939 births
American women novelists
American women poets
African-American LGBT people
Living people
20th-century American novelists
American LGBT poets
American LGBT novelists
African-American poets
Poets from Michigan
20th-century American women writers
20th-century American poets
American Book Award winners
Novelists from Michigan
Lambda Literary Award for Debut Fiction winners
African-American novelists
20th-century African-American women writers
20th-century African-American writers
21st-century African-American people
21st-century African-American women
21st-century American LGBT people
20th-century American LGBT people
|
```c
/*
*
* in the file LICENSE in the source distribution or at
* path_to_url
*/
#include <stdio.h>
#include "internal/cryptlib.h"
#include <openssl/conf.h>
#include <openssl/asn1.h>
#include <openssl/ocsp.h>
#include "ocsp_local.h"
#include <openssl/x509v3.h>
#include "../x509/ext_dat.h"
/*
* OCSP extensions and a couple of CRL entry extensions
*/
static int i2r_ocsp_crlid(const X509V3_EXT_METHOD *method, void *nonce,
BIO *out, int indent);
static int i2r_ocsp_acutoff(const X509V3_EXT_METHOD *method, void *nonce,
BIO *out, int indent);
static int i2r_object(const X509V3_EXT_METHOD *method, void *obj, BIO *out,
int indent);
static void *ocsp_nonce_new(void);
static int i2d_ocsp_nonce(const void *a, unsigned char **pp);
static void *d2i_ocsp_nonce(void *a, const unsigned char **pp, long length);
static void ocsp_nonce_free(void *a);
static int i2r_ocsp_nonce(const X509V3_EXT_METHOD *method, void *nonce,
BIO *out, int indent);
static int i2r_ocsp_nocheck(const X509V3_EXT_METHOD *method,
void *nocheck, BIO *out, int indent);
static void *s2i_ocsp_nocheck(const X509V3_EXT_METHOD *method,
X509V3_CTX *ctx, const char *str);
static int i2r_ocsp_serviceloc(const X509V3_EXT_METHOD *method, void *in,
BIO *bp, int ind);
const X509V3_EXT_METHOD ossl_v3_ocsp_crlid = {
NID_id_pkix_OCSP_CrlID, 0, ASN1_ITEM_ref(OCSP_CRLID),
0, 0, 0, 0,
0, 0,
0, 0,
i2r_ocsp_crlid, 0,
NULL
};
const X509V3_EXT_METHOD ossl_v3_ocsp_acutoff = {
NID_id_pkix_OCSP_archiveCutoff, 0, ASN1_ITEM_ref(ASN1_GENERALIZEDTIME),
0, 0, 0, 0,
0, 0,
0, 0,
i2r_ocsp_acutoff, 0,
NULL
};
const X509V3_EXT_METHOD ossl_v3_crl_invdate = {
NID_invalidity_date, 0, ASN1_ITEM_ref(ASN1_GENERALIZEDTIME),
0, 0, 0, 0,
0, 0,
0, 0,
i2r_ocsp_acutoff, 0,
NULL
};
const X509V3_EXT_METHOD ossl_v3_crl_hold = {
NID_hold_instruction_code, 0, ASN1_ITEM_ref(ASN1_OBJECT),
0, 0, 0, 0,
0, 0,
0, 0,
i2r_object, 0,
NULL
};
const X509V3_EXT_METHOD ossl_v3_ocsp_nonce = {
NID_id_pkix_OCSP_Nonce, 0, NULL,
ocsp_nonce_new,
ocsp_nonce_free,
d2i_ocsp_nonce,
i2d_ocsp_nonce,
0, 0,
0, 0,
i2r_ocsp_nonce, 0,
NULL
};
const X509V3_EXT_METHOD ossl_v3_ocsp_nocheck = {
NID_id_pkix_OCSP_noCheck, 0, ASN1_ITEM_ref(ASN1_NULL),
0, 0, 0, 0,
0, s2i_ocsp_nocheck,
0, 0,
i2r_ocsp_nocheck, 0,
NULL
};
const X509V3_EXT_METHOD ossl_v3_ocsp_serviceloc = {
NID_id_pkix_OCSP_serviceLocator, 0, ASN1_ITEM_ref(OCSP_SERVICELOC),
0, 0, 0, 0,
0, 0,
0, 0,
i2r_ocsp_serviceloc, 0,
NULL
};
static int i2r_ocsp_crlid(const X509V3_EXT_METHOD *method, void *in, BIO *bp,
int ind)
{
OCSP_CRLID *a = in;
if (a->crlUrl) {
if (BIO_printf(bp, "%*scrlUrl: ", ind, "") <= 0)
goto err;
if (!ASN1_STRING_print(bp, (ASN1_STRING *)a->crlUrl))
goto err;
if (BIO_write(bp, "\n", 1) <= 0)
goto err;
}
if (a->crlNum) {
if (BIO_printf(bp, "%*scrlNum: ", ind, "") <= 0)
goto err;
if (i2a_ASN1_INTEGER(bp, a->crlNum) <= 0)
goto err;
if (BIO_write(bp, "\n", 1) <= 0)
goto err;
}
if (a->crlTime) {
if (BIO_printf(bp, "%*scrlTime: ", ind, "") <= 0)
goto err;
if (!ASN1_GENERALIZEDTIME_print(bp, a->crlTime))
goto err;
if (BIO_write(bp, "\n", 1) <= 0)
goto err;
}
return 1;
err:
return 0;
}
static int i2r_ocsp_acutoff(const X509V3_EXT_METHOD *method, void *cutoff,
BIO *bp, int ind)
{
if (BIO_printf(bp, "%*s", ind, "") <= 0)
return 0;
if (!ASN1_GENERALIZEDTIME_print(bp, cutoff))
return 0;
return 1;
}
static int i2r_object(const X509V3_EXT_METHOD *method, void *oid, BIO *bp,
int ind)
{
if (BIO_printf(bp, "%*s", ind, "") <= 0)
return 0;
if (i2a_ASN1_OBJECT(bp, oid) <= 0)
return 0;
return 1;
}
/*
* OCSP nonce. This needs special treatment because it doesn't have an
* ASN1 encoding at all: it just contains arbitrary data.
*/
static void *ocsp_nonce_new(void)
{
return ASN1_OCTET_STRING_new();
}
static int i2d_ocsp_nonce(const void *a, unsigned char **pp)
{
const ASN1_OCTET_STRING *os = a;
if (pp) {
memcpy(*pp, os->data, os->length);
*pp += os->length;
}
return os->length;
}
static void *d2i_ocsp_nonce(void *a, const unsigned char **pp, long length)
{
ASN1_OCTET_STRING *os, **pos;
pos = a;
if (pos == NULL || *pos == NULL) {
os = ASN1_OCTET_STRING_new();
if (os == NULL)
goto err;
} else {
os = *pos;
}
if (!ASN1_OCTET_STRING_set(os, *pp, length))
goto err;
*pp += length;
if (pos)
*pos = os;
return os;
err:
if ((pos == NULL) || (*pos != os))
ASN1_OCTET_STRING_free(os);
ERR_raise(ERR_LIB_OCSP, ERR_R_ASN1_LIB);
return NULL;
}
static void ocsp_nonce_free(void *a)
{
ASN1_OCTET_STRING_free(a);
}
static int i2r_ocsp_nonce(const X509V3_EXT_METHOD *method, void *nonce,
BIO *out, int indent)
{
if (BIO_printf(out, "%*s", indent, "") <= 0)
return 0;
if (i2a_ASN1_STRING(out, nonce, V_ASN1_OCTET_STRING) <= 0)
return 0;
return 1;
}
/* Nocheck is just a single NULL. Don't print anything and always set it */
static int i2r_ocsp_nocheck(const X509V3_EXT_METHOD *method, void *nocheck,
BIO *out, int indent)
{
return 1;
}
static void *s2i_ocsp_nocheck(const X509V3_EXT_METHOD *method,
X509V3_CTX *ctx, const char *str)
{
return ASN1_NULL_new();
}
static int i2r_ocsp_serviceloc(const X509V3_EXT_METHOD *method, void *in,
BIO *bp, int ind)
{
int i;
OCSP_SERVICELOC *a = in;
ACCESS_DESCRIPTION *ad;
if (BIO_printf(bp, "%*sIssuer: ", ind, "") <= 0)
goto err;
if (X509_NAME_print_ex(bp, a->issuer, 0, XN_FLAG_ONELINE) <= 0)
goto err;
for (i = 0; i < sk_ACCESS_DESCRIPTION_num(a->locator); i++) {
ad = sk_ACCESS_DESCRIPTION_value(a->locator, i);
if (BIO_printf(bp, "\n%*s", (2 * ind), "") <= 0)
goto err;
if (i2a_ASN1_OBJECT(bp, ad->method) <= 0)
goto err;
if (BIO_puts(bp, " - ") <= 0)
goto err;
if (GENERAL_NAME_print(bp, ad->location) <= 0)
goto err;
}
return 1;
err:
return 0;
}
```
|
```groovy
package fastdex.build.transform
import com.android.build.api.transform.Transform
import com.android.build.api.transform.TransformException
import com.android.build.api.transform.TransformInvocation
import fastdex.build.util.Constants
import fastdex.build.util.JarOperation
import fastdex.build.variant.FastdexVariant
import fastdex.common.utils.FileUtils
/**
* Created by tong on 17/10/31.
*/
class FastdexDexBuilderTransform extends TransformProxy {
FastdexDexBuilderTransform(Transform base,File streamOutputFolder, FastdexVariant fastdexVariant) {
super(base,streamOutputFolder,fastdexVariant)
}
@Override
void transform(TransformInvocation transformInvocation) throws TransformException, IOException, InterruptedException {
if (fastdexVariant.hasDexCache) {
project.logger.error("\n==fastdex patch transform start,we will generate dex file")
if (fastdexVariant.projectSnapshoot.diffResultSet.isJavaFileChanged()) {
FileUtils.deleteDir(streamOutputFolder)
File patchJar = new File(streamOutputFolder,Constants.PATCH_JAR)
//jar
JarOperation.generatePatchJar(fastdexVariant,transformInvocation,patchJar)
}
else {
project.logger.error("==fastdex no java files have changed, just ignore")
}
}
else {
fastdexBuilder.injectInputAndSaveClassPath(transformInvocation)
base.transform(transformInvocation)
}
}
}
```
|
Estola obscuroides is a species of beetle in the family Cerambycidae. It was described by Stephan von Breuning in 1942. It is known from Paraguay.
References
Estola
Beetles described in 1942
|
```shell
#!/bin/bash
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
#
# path_to_url
#
# Unless required by applicable law or agreed to in writing,
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# specific language governing permissions and limitations
#
# This is a work around for testing t3c which uses systemctl.
# systemctl does not work in a container very well so this script
# replaces systemctl in the container and always returns a
# successful result to t3c.
USAGE="\nsystemctl COMMAND NAME\n"
if [ -z "$1" ] || [ -z "$2" ]; then
echo -e "$USAGE"
exit 0
else
COMMAND=$1
NAME=$2
fi
if [ "$2" != "trafficserver.service" ]; then
echo -e "\nFailed to start ${NAME}: Unit not found.\n"
exit 0
fi
case $COMMAND in
enable)
;;
restart)
/opt/trafficserver/bin/trafficserver restart
;;
status)
/opt/trafficserver/bin/trafficserver status
;;
start)
/opt/trafficserver/bin/trafficserver start
;;
stop)
/opt/trafficserver/bin/trafficserver stop
;;
esac
exit $?
```
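The stub's argument handling can be sketched in isolation to show what t3c sees for the accepted and rejected cases (`fake_systemctl` is an illustrative name, not part of the stub):

```shell
#!/bin/bash
# Standalone sketch of the stub's validation: two args required,
# and only trafficserver.service is recognized as a unit.
fake_systemctl() {
  if [ -z "$1" ] || [ -z "$2" ]; then
    echo "usage: systemctl COMMAND NAME"
    return 0
  fi
  if [ "$2" != "trafficserver.service" ]; then
    echo "Failed to start ${2}: Unit not found."
    return 0
  fi
  echo "dispatch: $1"
}

fake_systemctl restart trafficserver.service  # prints "dispatch: restart"
```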
|
```javascript
/*
* All rights reserved.
*
* This source code is licensed under the license found in the LICENSE file in
* the root directory of this source tree.
*/
import Calendar from 'components/Calendar/Calendar.react';
import { Directions } from 'lib/Constants';
import Icon from 'components/Icon/Icon.react';
import { monthDayStringUTC, monthsFrom, daysFrom } from 'lib/DateUtils';
import Popover from 'components/Popover/Popover.react';
import Position from 'lib/Position';
import PropTypes from 'lib/PropTypes';
import React from 'react';
import styles from 'components/DateRange/DateRange.scss';
export default class DateRange extends React.Component {
constructor(props) {
super();
const val = props.value || {};
this.state = {
open: false,
position: null,
start: val.start || monthsFrom(new Date(), -1),
end: val.end || new Date(),
};
this.wrapRef = React.createRef();
}
toggle() {
this.setState(() => {
if (this.state.open) {
return { open: false };
}
const pos = Position.inWindow(this.wrapRef.current);
if (this.props.align === Directions.RIGHT) {
pos.x += this.wrapRef.current.clientWidth;
}
return {
open: true,
position: pos,
};
});
}
setStart(start) {
let end = this.state.end;
if (start > end) {
end = daysFrom(start, 1);
}
this.setState({ start, end });
}
setEnd(end) {
let start = this.state.start;
if (start > end) {
start = daysFrom(end, -1);
}
this.setState({ start, end });
}
close() {
this.setState({
open: false,
});
this.props.onChange({ start: this.state.start, end: this.state.end });
}
rangeString() {
return `${monthDayStringUTC(this.state.start)} - ${monthDayStringUTC(this.state.end)}`;
}
render() {
let popover = null;
let content = null;
if (this.state.open) {
const classes = [styles.open];
if (this.props.align === Directions.RIGHT) {
classes.push(styles.right);
}
const renderShade =
this.state.start.getFullYear() < this.state.end.getFullYear() ||
this.state.start.getMonth() !== this.state.end.getMonth();
popover = (
<Popover
fixed={true}
position={this.state.position}
onExternalClick={this.close.bind(this)}
>
<div className={classes.join(' ')}>
<div className={styles.calendars}>
<Calendar
value={this.state.start}
onChange={start => this.setStart(start)}
shadeAfter={renderShade}
/>
<Calendar
value={this.state.end}
onChange={end => this.setEnd(end)}
shadeBefore={renderShade}
/>
</div>
<div className={styles.range} onClick={this.close.bind(this)}>
<span>{this.rangeString()}</span>
<Icon width={18} height={18} name="calendar-solid" fill="#169CEE" />
</div>
</div>
</Popover>
);
} else {
content = (
<div className={styles.range}>
<span>{this.rangeString()}</span>
<Icon width={18} height={18} name="calendar-solid" fill="#169CEE" />
</div>
);
}
return (
<div className={styles.wrap} onClick={this.toggle.bind(this)} ref={this.wrapRef}>
{content}
{popover}
</div>
);
}
}
DateRange.propTypes = {
value: PropTypes.object.describe(
'The value of the range. It has two props, "start" and "end," which are both Dates.'
),
onChange: PropTypes.func.describe(
'A function called when the date range is closed. It receives an object with two Date properties: start and end.'
),
align: PropTypes.string.describe(
'The side to align the range selector with. Possible options are Constants.Directions.LEFT or Constants.Directions.RIGHT.'
),
};
```
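The ordering invariant that `setStart`/`setEnd` maintain can be sketched standalone; the `daysFrom` helper below is a stand-in for the one imported from `lib/DateUtils`, not its actual implementation:

```javascript
// Stand-in for lib/DateUtils daysFrom: shift a date by n days.
function daysFrom(date, n) {
  const d = new Date(date);
  d.setDate(d.getDate() + n);
  return d;
}

// The same clamping rule setStart applies: if the new start passes the
// current end, push the end to one day after the start.
function clampStart(start, end) {
  if (start > end) {
    end = daysFrom(start, 1);
  }
  return { start, end };
}

const r = clampStart(new Date('2024-03-10'), new Date('2024-03-01'));
console.log(r.end > r.start); // true
```

`setEnd` mirrors this with `daysFrom(end, -1)`, nudging the start backwards instead.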
|
```javascript
const snippets = ["@let foo = 'Hello' + ', World'; "];
runFormatTest(
{
importMeta: import.meta,
snippets,
},
["angular"],
{ embeddedLanguageFormatting: "off" },
);
runFormatTest(
{
importMeta: import.meta,
snippets,
},
["angular"],
{ semi: false },
);
```
|
"Montana" is the regional anthem of the U.S. state of Montana. It was written by Charles Cohan and composed by Joseph E. Howard and was adopted as the state song on February 20, 1945.
Lyrics
Tell me of that Treasure State,
Story always new,
Tell of its beauties grand
And its hearts so true.
Mountains of sunset fire
The land I love the best
Let me grasp the hand of one
From out the golden West
Montana, Montana, Glory of the West
Of all the states from coast to coast, You're easily the best
Montana, Montana, Where skies are always blue
M-O-N-T-A-N-A
Montana I love you!
Each country has its flower;
Each one plays a part,
Each bloom brings a longing hope
To some lonely heart.
Bitterroot to me is dear
Growing in my land
Sing then that glorious air
The one I understand.
Montana, Montana, Glory of the West
Of all the states from coast to coast, You're easily the best
Montana, Montana, Where skies are always blue
M-O-N-T-A-N-A
Montana I love you!
External links
Song history by Travel Montana
Songs about Montana
Montana culture
Montana 1945
Music of Montana
|
```java
/**
* This file is part of Skript.
*
* Skript is free software: you can redistribute it and/or modify
* (at your option) any later version.
*
* Skript is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
*
* along with Skript. If not, see <path_to_url
*
*/
package ch.njol.skript.expressions;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import org.bukkit.enchantments.Enchantment;
import org.bukkit.event.Event;
import org.bukkit.inventory.meta.ItemMeta;
import org.eclipse.jdt.annotation.Nullable;
import ch.njol.skript.aliases.ItemType;
import ch.njol.skript.classes.Changer.ChangeMode;
import ch.njol.skript.doc.Description;
import ch.njol.skript.doc.Examples;
import ch.njol.skript.doc.Name;
import ch.njol.skript.doc.Since;
import ch.njol.skript.expressions.base.PropertyExpression;
import ch.njol.skript.lang.Expression;
import ch.njol.skript.lang.SkriptParser.ParseResult;
import ch.njol.skript.lang.util.SimpleExpression;
import ch.njol.skript.util.EnchantmentType;
import ch.njol.util.Kleenean;
import ch.njol.util.coll.CollectionUtils;
@Name("Item Enchantments")
@Description("All the enchantments an <a href='classes.html#itemtype'>item type</a> has.")
@Examples("clear enchantments of event-item")
@Since("2.2-dev36")
public class ExprEnchantments extends SimpleExpression<EnchantmentType> {
static {
PropertyExpression.register(ExprEnchantments.class, EnchantmentType.class, "enchantments", "itemtypes");
}
@SuppressWarnings("null")
private Expression<ItemType> items;
@SuppressWarnings({"null","unchecked"})
@Override
public boolean init(Expression<?>[] exprs, int matchedPattern, Kleenean isDelayed, ParseResult parseResult) {
items = (Expression<ItemType>) exprs[0];
return true;
}
@Override
public boolean isSingle() {
return false;
}
@Override
@Nullable
protected EnchantmentType[] get(Event e) {
List<EnchantmentType> enchantments = new ArrayList<>();
for (ItemType item : items.getArray(e)) {
EnchantmentType[] enchants = item.getEnchantmentTypes();
if (enchants == null)
continue;
Collections.addAll(enchantments, enchants);
}
return enchantments.toArray(new EnchantmentType[0]);
}
@Override
@Nullable
public Class<?>[] acceptChange(ChangeMode mode) {
// Enchantments don't get automatically converted to EnchantmentTypes when more than one is given,
// meaning an Enchantment array can't be transformed to an EnchantmentType array automatically.
// So we have to do it manually.
return CollectionUtils.array(Enchantment[].class, EnchantmentType[].class);
}
@Override
public void change(Event e, @Nullable Object[] delta, ChangeMode mode) {
ItemType[] source = items.getArray(e);
EnchantmentType[] enchants = new EnchantmentType[delta != null ? delta.length : 0];
if (delta != null && delta.length != 0) {
for (int i = 0; i<delta.length; i++) {
if (delta[i] instanceof EnchantmentType)
enchants[i] = (EnchantmentType) delta[i];
else
enchants[i] = new EnchantmentType((Enchantment) delta[i]);
}
}
switch (mode) {
case ADD:
for (ItemType item : source)
item.addEnchantments(enchants);
break;
case REMOVE:
case REMOVE_ALL:
for (ItemType item : source) {
ItemMeta meta = item.getItemMeta();
assert meta != null;
for (EnchantmentType enchant : enchants) {
Enchantment ench = enchant.getType();
assert ench != null;
if (enchant.getInternalLevel() == -1
|| meta.getEnchantLevel(ench) == enchant.getLevel()) {
// Remove directly from meta since it's more efficient in this case
meta.removeEnchant(ench);
}
item.setItemMeta(meta);
}
}
break;
case SET:
for (ItemType item : source) {
item.clearEnchantments();
item.addEnchantments(enchants);
}
break;
case DELETE:
case RESET:
for (ItemType item : source)
item.clearEnchantments();
break;
}
}
@Override
public Class<? extends EnchantmentType> getReturnType() {
return EnchantmentType.class;
}
@Override
public String toString(@Nullable Event e, boolean debug) {
return "the enchantments of " + items.toString(e, debug);
}
}
```
|
Ludwik Żychliński (born 1837 in the Grand Duchy of Posen, died after 1901 in Brusno, Lubaczów County) was a Polish patriotic activist, military officer, and diarist. Żychliński took part in Giuseppe Garibaldi's expedition to Sicily (1860). During the American Civil War, he fought on the Union side (1862–1863). He was a participant in the January Uprising of 1863–1864.
Notable works
Machiawell, jego życie i pewne wybitne strony, zawarte w jego dwóch głównych pismach historyczno-politycznych, 1861
Memoirs:
Pamiętniki z wojny amerykańskiej 1862 r.
Pamiętniki byłego dowódcy Dzieci Warszawy...
Przygody Wielkopolanina w Azji i Ameryce...
Wrażenia i przygody zesłanego w Sybir Wielkopolanina
References
External links
Extended biography
January Uprising participants
Polish diarists
Union Army soldiers
1837 births
Year of death missing
Date of birth unknown
Polish people of the American Civil War
|
```go
/*
path_to_url
Unless required by applicable law or agreed to in writing, software
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
*/
package nodevariant
import (
"github.com/pkg/errors"
"github.com/spf13/cobra"
"k8s.io/kubeadm/kinder/pkg/build/alter"
"k8s.io/kubeadm/kinder/pkg/constants"
)
type flagpole struct {
Image string
BaseImage string
InitArtifacts string
ImageTars []string
ImageNamePrefix string
UpgradeArtifacts string
Kubeadm string
Kubelet string
PrePullAdditionalImages bool
Path []string
}
// NewCommand returns a new cobra.Command for building the node image
func NewCommand() *cobra.Command {
flags := &flagpole{}
cmd := &cobra.Command{
Args: cobra.NoArgs,
Use: "node-image-variant",
Aliases: []string{"node-variant", "variant", "nv"},
Short: "build the node image variant",
Long: "build the variant for a node image by adding packages, images or replacing the kubeadm binary",
RunE: func(cmd *cobra.Command, args []string) error {
return runE(flags, cmd, args)
},
}
cmd.Flags().StringVar(
&flags.Image, "image",
constants.DefaultNodeImage,
"name:tag of the resulting image to be built",
)
cmd.Flags().StringVar(
&flags.BaseImage, "base-image",
constants.DefaultBaseImage,
"name:tag of the source image; this can be a kindest/base image or kindest/node image",
)
cmd.Flags().StringVar(
&flags.InitArtifacts, "with-init-artifacts",
"",
"version/build-label/path to a folder with Kubernetes binaries & image tarballs to be used for the kubeadm init workflow",
)
cmd.Flags().StringSliceVar(
&flags.ImageTars, "with-images",
nil,
"version/build-label/path to images tar or folder with images tars to be added to the images",
)
cmd.Flags().StringVar(
&flags.ImageNamePrefix, "image-name-prefix",
"",
"add a name prefix to images tars included in the image",
)
cmd.Flags().StringVar(
&flags.UpgradeArtifacts, "with-upgrade-artifacts",
"",
"version/build-label/path to a folder with Kubernetes binaries & image tarballs to be used for testing the kubeadm-upgrade workflow",
)
cmd.Flags().StringVar(
&flags.Kubeadm, "with-kubeadm",
"",
"override the kubeadm binary existing in the image with the given version/build-label/file or folder containing the kubeadm binary",
)
cmd.Flags().StringVar(
&flags.Kubelet, "with-kubelet",
"",
"override the kubelet binary existing in the image with the given version/build-label/file or folder containing the kubelet binary",
)
cmd.Flags().BoolVar(
&flags.PrePullAdditionalImages, "with-kubeadm-additional-images",
true,
"pre-pull kubeadm additional required images such as etcd, coredns and pause, etc",
)
cmd.Flags().StringSliceVar(
&flags.Path, "with-path",
nil,
"sourcePath:destPath pairs; copies file/dir at sourcePath on the host to destPath inside the image, destPath has to be absolute",
)
return cmd
}
func runE(flags *flagpole, cmd *cobra.Command, args []string) error {
ctx, err := alter.NewContext(
// base build options
alter.WithBaseImage(flags.BaseImage),
alter.WithImage(flags.Image),
// bits to be added to the image
alter.WithInitArtifacts(flags.InitArtifacts),
alter.WithKubeadm(flags.Kubeadm),
alter.WithKubelet(flags.Kubelet),
alter.WithImageTars(flags.ImageTars),
alter.WithUpgradeArtifacts(flags.UpgradeArtifacts),
alter.WithPrePullAdditionalImages(flags.PrePullAdditionalImages),
// bits options
alter.WithImageNamePrefix(flags.ImageNamePrefix),
alter.WithPath(flags.Path),
)
if err != nil {
return errors.Wrap(err, "error creating alter context")
}
if err := ctx.Alter(); err != nil {
return errors.Wrap(err, "error altering node image")
}
return nil
}
```
|
East Wakefield is an unincorporated community in the town of Wakefield in Carroll County, New Hampshire. It is located in the eastern part of Wakefield along New Hampshire Route 153, northeast of Wakefield village and directly south of Pine River Pond. Balch Pond and Ivanhoe Pond are also nearby. The area is a popular summer home location.
East Wakefield has a different ZIP code (03830) from the rest of the town of Wakefield.
References
Unincorporated communities in New Hampshire
Unincorporated communities in Carroll County, New Hampshire
Wakefield, New Hampshire
|
```java
/*
* one or more contributor license agreements. See the NOTICE file distributed
* with this work for additional information regarding copyright ownership.
*/
package io.camunda.operate.store.opensearch.client.sync;
import static io.camunda.operate.store.opensearch.dsl.QueryDSL.ids;
import static io.camunda.operate.store.opensearch.dsl.QueryDSL.term;
import static io.camunda.operate.store.opensearch.dsl.RequestDSL.clearScrollRequest;
import static io.camunda.operate.store.opensearch.dsl.RequestDSL.deleteByQueryRequestBuilder;
import static io.camunda.operate.store.opensearch.dsl.RequestDSL.deleteRequestBuilder;
import static io.camunda.operate.store.opensearch.dsl.RequestDSL.getRequest;
import static io.camunda.operate.store.opensearch.dsl.RequestDSL.scrollRequest;
import static io.camunda.operate.store.opensearch.dsl.RequestDSL.time;
import static java.lang.String.format;
import io.camunda.operate.opensearch.ExtendedOpenSearchClient;
import io.camunda.operate.store.NotFoundException;
import io.camunda.operate.store.opensearch.client.OpenSearchFailedShardsException;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Collection;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Optional;
import java.util.function.Consumer;
import java.util.function.Function;
import org.opensearch.client.opensearch.OpenSearchClient;
import org.opensearch.client.opensearch._types.Result;
import org.opensearch.client.opensearch._types.aggregations.Aggregate;
import org.opensearch.client.opensearch._types.query_dsl.Query;
import org.opensearch.client.opensearch.core.DeleteByQueryRequest;
import org.opensearch.client.opensearch.core.DeleteByQueryResponse;
import org.opensearch.client.opensearch.core.DeleteRequest;
import org.opensearch.client.opensearch.core.DeleteResponse;
import org.opensearch.client.opensearch.core.GetResponse;
import org.opensearch.client.opensearch.core.IndexRequest;
import org.opensearch.client.opensearch.core.IndexResponse;
import org.opensearch.client.opensearch.core.SearchRequest;
import org.opensearch.client.opensearch.core.SearchResponse;
import org.opensearch.client.opensearch.core.UpdateRequest;
import org.opensearch.client.opensearch.core.UpdateResponse;
import org.opensearch.client.opensearch.core.search.Hit;
import org.opensearch.client.opensearch.core.search.HitsMetadata;
import org.slf4j.Logger;
public class OpenSearchDocumentOperations extends OpenSearchRetryOperation {
public static final String SCROLL_KEEP_ALIVE_MS = "60000ms";
// this scroll timeout value is used for reindex and delete by query requests
public static final String INTERNAL_SCROLL_KEEP_ALIVE_MS = "30000ms";
public static final int TERMS_AGG_SIZE = 10000;
public static final int TOPHITS_AGG_SIZE = 100;
public OpenSearchDocumentOperations(
final Logger logger, final OpenSearchClient openSearchClient) {
super(logger, openSearchClient);
}
private static Function<Exception, String> defaultSearchErrorMessage(final String index) {
return e -> format("Failed to search index: %s! Reason: %s", index, e.getMessage());
}
private static String defaultDeleteErrorMessage(final String index) {
return format("Failed to delete index: %s", index);
}
private static String defaultPersistErrorMessage(final String index) {
return format("Failed to persist index: %s", index);
}
private void clearScroll(final String scrollId) {
if (scrollId != null) {
try {
openSearchClient.clearScroll(clearScrollRequest(scrollId));
} catch (final Exception e) {
logger.warn("Error occurred when clearing the scroll with id [{}]", scrollId);
}
}
}
private void checkFailedShards(final SearchRequest request, final SearchResponse<?> response) {
if (!response.shards().failures().isEmpty()) {
throw new OpenSearchFailedShardsException(
format(
"Shards failed executing request (request=%s, failed shards=%s)",
request, response.shards().failures()));
}
}
public <R> Map<String, Aggregate> unsafeScrollWith(
final SearchRequest.Builder searchRequestBuilder,
final Consumer<List<Hit<R>>> hitsConsumer,
final Consumer<HitsMetadata<R>> hitsMetadataConsumer,
final Class<R> clazz,
final boolean retry)
throws IOException {
final var request = searchRequestBuilder.scroll(time(SCROLL_KEEP_ALIVE_MS)).build();
return retry
? executeWithRetries(() -> scrollWith(request, hitsConsumer, hitsMetadataConsumer, clazz))
: scrollWith(request, hitsConsumer, hitsMetadataConsumer, clazz);
}
// Note: OpenSearch returns aggregations only on the initial search response,
// so they are captured here before iterating over subsequent scroll pages.
private <R> Map<String, Aggregate> scrollWith(
final SearchRequest request,
final Consumer<List<Hit<R>>> hitsConsumer,
final Consumer<HitsMetadata<R>> hitsMetadataConsumer,
final Class<R> clazz)
throws IOException {
String scrollId = null;
try {
SearchResponse<R> response = openSearchClient.search(request, clazz);
final var aggregates = response.aggregations();
if (hitsMetadataConsumer != null) {
hitsMetadataConsumer.accept(response.hits());
}
scrollId = response.scrollId();
List<Hit<R>> hits = response.hits().hits();
while (!hits.isEmpty() && scrollId != null) {
checkFailedShards(request, response);
if (hitsConsumer != null) {
hitsConsumer.accept(hits);
}
response = openSearchClient.scroll(scrollRequest(scrollId), clazz);
scrollId = response.scrollId();
hits = response.hits().hits();
}
return aggregates;
} finally {
// clearScroll is a no-op for a null scrollId
clearScroll(scrollId);
}
}
private <R> Map<String, Aggregate> safeScrollWith(
final SearchRequest.Builder requestBuilder,
final Class<R> entityClass,
final Consumer<List<Hit<R>>> hitsConsumer) {
return safeScrollWith(requestBuilder, entityClass, hitsConsumer, null);
}
private <R> Map<String, Aggregate> safeScrollWith(
final SearchRequest.Builder requestBuilder,
final Class<R> entityClass,
final Consumer<List<Hit<R>>> hitsConsumer,
final Consumer<HitsMetadata<R>> hitsMetadataConsumer) {
return safe(
() ->
unsafeScrollWith(
requestBuilder, hitsConsumer, hitsMetadataConsumer, entityClass, false),
defaultSearchErrorMessage(getIndex(requestBuilder)));
}
private <R> AggregatedResult<R> scroll(
final SearchRequest.Builder searchRequestBuilder, final Class<R> clazz, final boolean retry)
throws IOException {
final var result = scrollHits(searchRequestBuilder, clazz, retry);
return new AggregatedResult<>(
result.values().stream().map(Hit::source).toList(), result.aggregates());
}
public <R> AggregatedResult<Hit<R>> scrollHits(
final SearchRequest.Builder searchRequestBuilder, final Class<R> clazz) throws IOException {
final List<Hit<R>> result = new ArrayList<>();
final var aggregates =
unsafeScrollWith(searchRequestBuilder, result::addAll, null, clazz, false);
return new AggregatedResult<>(result, aggregates);
}
public <R> AggregatedResult<Hit<R>> scrollHits(
final SearchRequest.Builder searchRequestBuilder, final Class<R> clazz, final boolean retry)
throws IOException {
final List<Hit<R>> result = new ArrayList<>();
final var aggregates =
unsafeScrollWith(searchRequestBuilder, result::addAll, null, clazz, retry);
return new AggregatedResult<>(result, aggregates);
}
public <R> void scrollWith(
final SearchRequest.Builder requestBuilder,
final Class<R> entityClass,
final Consumer<List<Hit<R>>> hitsConsumer) {
safeScrollWith(requestBuilder, entityClass, hitsConsumer);
}
public <R> void scrollWith(
final SearchRequest.Builder requestBuilder,
final Class<R> entityClass,
final Consumer<List<Hit<R>>> hitsConsumer,
final Consumer<HitsMetadata<R>> hitsMetadataConsumer) {
safeScrollWith(requestBuilder, entityClass, hitsConsumer, hitsMetadataConsumer);
}
public <R> AggregatedResult<R> scrollValuesAndAggregations(
final SearchRequest.Builder requestBuilder, final Class<R> entityClass) {
return safe(
() -> scroll(requestBuilder, entityClass, false),
defaultSearchErrorMessage(getIndex(requestBuilder)));
}
public <R> AggregatedResult<R> scrollValuesAndAggregations(
final SearchRequest.Builder requestBuilder, final Class<R> entityClass, final boolean retry) {
return safe(
() -> scroll(requestBuilder, entityClass, retry),
defaultSearchErrorMessage(getIndex(requestBuilder)));
}
public <R> List<R> scrollValues(
final SearchRequest.Builder requestBuilder, final Class<R> entityClass) {
return scrollValuesAndAggregations(requestBuilder, entityClass).values();
}
public <R> List<R> scrollValues(
final SearchRequest.Builder requestBuilder, final Class<R> entityClass, final boolean retry) {
return scrollValuesAndAggregations(requestBuilder, entityClass, retry).values();
}
private <R> SearchResponse<R> unsafeSearch(
final SearchRequest request, final Class<R> entityClass) throws IOException {
final var response = openSearchClient.search(request, entityClass);
checkFailedShards(request, response);
return response;
}
public <R> SearchResponse<R> search(
final SearchRequest.Builder requestBuilder, final Class<R> entityClass) {
return search(requestBuilder, entityClass, false);
}
public <R> SearchResponse<R> search(
final SearchRequest.Builder requestBuilder, final Class<R> entityClass, final boolean retry) {
final var request = requestBuilder.build();
return retry
? executeWithRetries(() -> unsafeSearch(request, entityClass))
: safe(
() -> unsafeSearch(request, entityClass),
defaultSearchErrorMessage(getIndex(requestBuilder)));
}
public <R> List<R> searchValues(
final SearchRequest.Builder requestBuilder, final Class<R> entityClass) {
return searchValues(requestBuilder, entityClass, false);
}
public <R> List<R> searchValues(
final SearchRequest.Builder requestBuilder, final Class<R> entityClass, final boolean retry) {
return search(requestBuilder, entityClass, retry).hits().hits().stream()
.map(Hit::source)
.toList();
}
public Map<String, Aggregate> searchAggregations(final SearchRequest.Builder requestBuilder) {
requestBuilder.size(0);
return search(requestBuilder, Void.class).aggregations();
}
public <R> R searchUnique(
final SearchRequest.Builder requestBuilder, final Class<R> entityClass, final String key) {
final SearchResponse<R> response = search(requestBuilder, entityClass);
if (response.hits().total().value() == 1) {
return response.hits().hits().get(0).source();
} else if (response.hits().total().value() > 1) {
throw new NotFoundException(
format("Could not find unique %s with key '%s'.", getIndex(requestBuilder), key));
} else {
throw new NotFoundException(
format("Could not find %s with key '%s'.", getIndex(requestBuilder), key));
}
}
public long docCount(final SearchRequest.Builder requestBuilder) {
requestBuilder.size(0);
return search(requestBuilder, Void.class).hits().total().value();
}
public Map<String, String> getIndexNames(final String index, final Collection<String> ids) {
final Map<String, String> result = new HashMap<>();
final var searchRequestBuilder =
new SearchRequest.Builder().index(index).query(ids(ids)).source(s -> s.fetch(false));
final Consumer<List<Hit<Void>>> hitsConsumer =
hits -> hits.forEach(hit -> result.put(hit.id(), hit.index()));
safeScrollWith(searchRequestBuilder, Void.class, hitsConsumer);
return result;
}
// TODO check unused
public boolean documentExistsWithGivenRetries(final String name, final String id) {
return executeWithGivenRetries(
10,
format("Exists document from %s with id %s", name, id),
() -> openSearchClient.exists(e -> e.index(name).id(id)).value(),
null);
}
public <R> Optional<R> getWithRetries(
final String index, final String id, final Class<R> entityClass) {
return executeWithRetries(
() -> {
final GetResponse<R> response = openSearchClient.get(getRequest(index, id), entityClass);
return response.found() ? Optional.ofNullable(response.source()) : Optional.empty();
});
}
public DeleteResponse delete(final String index, final String id) {
final var deleteRequestBuilder = new DeleteRequest.Builder().index(index).id(id);
return safe(
() -> openSearchClient.delete(deleteRequestBuilder.build()),
e -> defaultDeleteErrorMessage(index));
}
public DeleteByQueryResponse delete(final String index, final String field, final String value) {
final var deleteRequestBuilder =
new DeleteByQueryRequest.Builder().index(index).query(term(field, value));
return safe(
() -> openSearchClient.deleteByQuery(deleteRequestBuilder.build()),
e -> defaultDeleteErrorMessage(index));
}
public boolean deleteWithRetries(final String index, final Query query) {
return executeWithRetries(
() -> {
final DeleteByQueryRequest request =
deleteByQueryRequestBuilder(index).query(query).build();
final DeleteByQueryResponse response = openSearchClient.deleteByQuery(request);
return response.failures().isEmpty() && response.deleted() > 0;
});
}
public long deleteByQuery(final String index, final Query query) {
return executeWithRetries(
() -> {
final DeleteByQueryRequest request =
deleteByQueryRequestBuilder(index).query(query).build();
final DeleteByQueryResponse response = openSearchClient.deleteByQuery(request);
return response.deleted();
});
}
public boolean deleteWithRetries(final String index, final String id) {
return executeWithRetries(
() ->
openSearchClient.delete(deleteRequestBuilder(index, id).build()).result()
== Result.Deleted);
}
public <A> IndexResponse index(final IndexRequest.Builder<A> requestBuilder) {
return safe(
() -> openSearchClient.index(requestBuilder.build()),
e -> defaultPersistErrorMessage(getIndex(requestBuilder)));
}
public <A> boolean indexWithRetries(final IndexRequest.Builder<A> requestBuilder) {
final IndexRequest<A> indexRequest = requestBuilder.build();
return executeWithRetries(
() -> {
final IndexResponse response = openSearchClient.index(indexRequest);
return List.of(Result.Created, Result.Updated).contains(response.result());
});
}
public <A> UpdateResponse<Void> update(
final UpdateRequest.Builder<Void, A> requestBuilder,
final Function<Exception, String> errorMessageSupplier) {
return safe(
() -> openSearchClient.update(requestBuilder.build(), Void.class), errorMessageSupplier);
}
public <R> SearchResponse<R> fixedSearch(final SearchRequest request, final Class<R> classR) {
if (openSearchClient instanceof final ExtendedOpenSearchClient extendedOpenSearchClient) {
return safe(
() -> extendedOpenSearchClient.fixedSearch(request, classR),
defaultSearchErrorMessage(request.index().toString()));
} else {
throw new UnsupportedOperationException(
"ExtendedOpenSearchClient is required to execute fixedSearch! Provided: "
+ openSearchClient.getClass().getName());
}
}
public Map<String, Object> searchAsMap(final SearchRequest.Builder requestBuilder) {
final var request = requestBuilder.size(0).build();
if (openSearchClient instanceof final ExtendedOpenSearchClient extendedOpenSearchClient) {
return safe(
() -> extendedOpenSearchClient.searchAsMap(request),
defaultSearchErrorMessage(request.index().toString()));
} else {
throw new UnsupportedOperationException(
"ExtendedOpenSearchClient is required to execute searchAsMap! Provided: "
+ openSearchClient.getClass().getName());
}
}
public record AggregatedResult<R>(List<R> values, Map<String, Aggregate> aggregates) {}
}