Adis Jahović (born 18 March 1987) is a Macedonian footballer who plays as a forward for Turkish club Bodrumspor.
Club career
Jahović began playing football at the age of 7 with FK Makedonija Gjorče Petrov, and moved to Bosnia at the age of 18. Jahović was later banned for ten games for making an extremely threatening remark towards the referee of a match against Biel-Bienne.
On 29 August 2014, he transferred to FC Krylia Sovetov Samara. With Krylia, he won the 2014–15 Russian National Football League and earned promotion to the 2015–16 Russian Premier League.
On 31 August 2016, he joined Turkish Süper Lig club Göztepe S.K. He compiled an impressive record there, scoring 35 goals in 49 league games.
On 30 January 2018, he left Göztepe and joined Konyaspor for a €1.5 million transfer fee. In 2019, Jahović signed with Yeni Malatyaspor.
On 1 February 2021, he rejoined Göztepe S.K. On 10 April 2021, he scored a hat-trick in a 3–2 win against Hatayspor.
International career
Jahović was born in Macedonia and is of Bosniak origin. He plays for the Macedonia national football team, for which he debuted on 14 November 2012 in a friendly match against Slovenia. He has earned a total of 15 caps, scoring 3 goals. In an October 2016 FIFA World Cup qualification match against Israel, he missed a 90th-minute penalty kick in a 2–1 defeat and announced his international retirement soon after the match. However, five years later he returned to the national team lineup.
Career statistics
International goals
Scores and results list Macedonia's goal tally first.
Honours
Rijeka
Croatian Super Cup: 2014
Krylia Sovetov Samara
Russian National Football League: 2014–15
References
External links
Profile at MacedonianFootball
1987 births
Footballers from Skopje
Macedonian people of Bosnia and Herzegovina descent
Living people
Men's association football forwards
Macedonian men's footballers
North Macedonia men's international footballers
FK Napredok players
FK Makedonija G.P. players
FK Željezničar Sarajevo players
FK Velež Mostar players
FK Sarajevo players
FC Wil players
FC Zürich players
FC Vorskla Poltava players
HNK Rijeka players
PFC Krylia Sovetov Samara players
Göztepe S.K. footballers
Konyaspor footballers
Yeni Malatyaspor footballers
Antalyaspor footballers
Macedonian First Football League players
Premier League of Bosnia and Herzegovina players
Swiss Challenge League players
Swiss Super League players
Ukrainian Premier League players
Croatian Football League players
Russian First League players
Russian Premier League players
TFF First League players
Süper Lig players
Macedonian expatriate men's footballers
Expatriate men's footballers in Bosnia and Herzegovina
Expatriate men's footballers in Switzerland
Expatriate men's footballers in Ukraine
Expatriate men's footballers in Croatia
Expatriate men's footballers in Russia
Expatriate men's footballers in Turkey
Macedonian expatriate sportspeople in Bosnia and Herzegovina
Macedonian expatriate sportspeople in Switzerland
Macedonian expatriate sportspeople in Ukraine
Macedonian expatriate sportspeople in Croatia
Macedonian expatriate sportspeople in Russia
Macedonian expatriate sportspeople in Turkey
Bodrum F.K. footballers
|
Roger Shimomura (born Roger Yutaka Shimomura in 1939 in Seattle) is an American artist and a retired professor at the University of Kansas, having taught there from 1969 to 2004. His art, showcased across the United States, Japan, Canada, Mexico, and Israel, often combines American popular culture, traditional Asian tropes, and stereotypical racial imagery to provoke thought and debate on issues of identity and social perception.
Early life
Roger Shimomura was born on June 26, 1939, at the Shimomura family home in Seattle, Washington's Central District. He was delivered by his grandmother, Toku, a professional midwife who would become an important figure in his life and art. His father, Eddy Kazuo Shimomura, was a pharmacist, and his mother, Aya, was a homemaker. Both parents were U.S.-born nisei whose parents had emigrated from Japan in the early 1900s.
After the attack on Pearl Harbor, signing of Executive Order 9066, and the beginning of Japanese incarceration, his family was forcibly relocated on short notice and incarcerated at Camp Harmony in Puyallup, Washington. They were transported from there to the more permanent Minidoka camp in Idaho. After about two years at Minidoka, the family moved to Chicago (outside the West Coast Japanese exclusion zone), where Shimomura's father had secured a job in a pharmacy. The family lived there for a few months before returning to Seattle at war's end in 1945. Shimomura's younger sister Carolyn had died of meningitis during their stay in Chicago.
The family returned to their home in the Central District, and his father resumed his pharmacy work. Shimomura was not yet fully aware of the implications of the racial hierarchies around him. As a child in the postwar years, he played a game called "Kill the Jap" with the sons of artist Paul Horiuchi, who lived across the street. As he grew up, he felt increasing anguish over the conflict between his father's wish that he become a doctor, and his own desire to follow in the footsteps of his three uncles, who were all commercial artists.
Education
After graduating from Garfield High School, Shimomura began studying graphic design at the University of Washington, earning his BA in 1961. He was required to join the ROTC (Reserve Officers' Training Corps) program. Despite his severe distaste for the program, he did very well in his military studies. From 1962 to 1964 he served as an artillery officer in the 1st Cavalry Division, stationed at Fort Lewis, Washington, and in Korea.
After leaving the Army, Shimomura began working as a commercial artist and designer, including work on the Polynesian Pavilion at the New York World's Fair in 1964. He became frustrated with the limitations of the field. He began taking painting classes at the University of Washington, where, under the influence of the emerging Pop Art movement, he discovered the possibilities in combining fine art with his lifelong interest in pop culture. Transferring to Syracuse University in New York, where he experimented with filmmaking and performance art, he received his MFA in 1969. That same year he took a job teaching art at the University of Kansas in Lawrence, Kansas. He taught there for the next 35 years, until his retirement in 2004.
Career
In 2013, he told an interviewer,
"My biggest influences initially were the California Funk ceramics artists. Their irreverence helped me break out of my conservative Asian thinking mode. These clay artists said in their works that nothing was sacred, that we needed a fresh start and needed to examine everything. There was a sense that art could take a leadership role in this revolution." He has also expressed admiration for the Pop Art movement, citing Andy Warhol as "my biggest influence, visually, historically, and stylistically".
Shimomura's paintings often take stereotypical American images of Asians: glowering, buck-toothed wartime "Japs", Fu Manchu, subservient geishas, martial artists, and skewer them through over-the-top exaggeration or juxtaposition with images of idealized American society. Pop culture icons such as Mickey Mouse, Coca-Cola, and Pikachu appear incongruously in bright, flat-perspective landscapes, sometimes with absurdly altered portraits of Shimomura himself. His more subtle works often combine traditional Japanese woodblock printing with impressions of the incarceration camps, taken from both his own youthful memories and passages from the diary that his grandmother Toku kept for many years.
While continuing to teach at the University of Kansas, Shimomura gradually became one of the most recognized artists in the United States, amassing awards and exhibitions in many of the country's major museums and arts institutions. Since his retirement from teaching in 2004, he has continued painting, giving lectures, and exhibiting.
Collections
His works are in the permanent collections of the Metropolitan Museum of Art (New York), the Whitney Museum of American Art (New York), the Smithsonian American Art Museum (Washington, D.C.), the Denver Art Museum, the Japanese American National Museum (Los Angeles, CA), the Seattle Art Museum, the Japanese Cultural and Community Center of Washington, the Detroit Institute of Arts, the New York Public Library, the Philadelphia Museum of Art, the Asian American Arts Centre (New York), the Phoenix Art Museum, the Tacoma Art Museum, the Marianna Kistler Beach Museum of Art at Kansas State University, the Ulrich Museum of Art at Wichita State University, the Mulvane Art Museum at Washburn University and other museums and institutions.
Exhibitions
His paintings and prints have been the subject of more than 150 solo exhibitions. In addition, he has participated in hundreds of group shows in museums, galleries, schools, and other institutions in the US, Japan, Canada, Mexico, and Israel.
His experimental theater pieces have been performed at such venues as the Smithsonian Institution (Washington, D.C.), the Franklin Furnace (New York City), the Walker Art Center (Minneapolis), and the Bellevue Arts Museum (Bellevue, Washington).
Awards
Honors and awards include:
150th Anniversary Timeless Award, University of Washington College of Arts & Sciences, Seattle (2012)
United States Artists Ford Fellow for Visual Arts (2011)
First Kansas Master Artist Award in the Visual Arts, Kansas Arts Commission, Topeka, Kansas (2008)
Joan Mitchell Foundation Painting Award, New York City (2003)
Kansas Governor's Arts Award, Governor Joan Finney, Topeka, Kansas (1994)
In 1999, the Seattle Urban League designated a scholarship in his name that has been awarded annually to a Seattle resident pursuing a career in art.
References
Works cited
Emily Stamey, The Prints of Roger Shimomura: A Catalogue Raisonné, 1968–2005, University of Washington Press, 2007.
External links
Fall 2009 newsletter of The Wing Luke Asian Museum, includes artist's statement for Shimomura exhibit Yellow Peril and reproduces the titular painting.
Oral Histories: Roger Shimomura, C-Span, July 9, 2011
1939 births
Living people
Japanese-American internees
Artists from Seattle
American artists of Japanese descent
Garfield High School (Seattle) alumni
University of Kansas faculty
Artists from Kansas
Franklin Furnace artists
|
```javascript
define(["Tone/component/Compressor", "helper/Basic", "helper/PassAudio", "helper/PassAudioStereo", "Test"],
function (Compressor, Basic, PassAudio, PassAudioStereo, Test) {
describe("Compressor", function(){
Basic(Compressor);
context("Compression", function(){
it("handles input and output connections", function(){
var comp = new Compressor();
Test.connect(comp);
comp.connect(Test);
comp.dispose();
});
it("passes the incoming signal through", function(done){
var comp;
PassAudio(function(input, output){
comp = new Compressor();
input.connect(comp);
comp.connect(output);
}, function(){
comp.dispose();
done();
});
});
it("passes the incoming stereo signal through", function(done){
var comp;
PassAudioStereo(function(input, output){
comp = new Compressor();
input.connect(comp);
comp.connect(output);
}, function(){
comp.dispose();
done();
});
});
it("can get and set values through an object", function(){
var comp = new Compressor();
var values = {
"ratio" : 22,
"threshold" : -30,
"release" : 0.5,
"attack" : 0.03,
"knee" : 20
};
comp.set(values);
expect(comp.get()).to.have.keys(["ratio", "threshold", "release", "attack", "knee"]);
comp.dispose();
});
it("can get/set all interfaces", function(){
var comp = new Compressor();
var values = {
"ratio" : 22,
"threshold" : -30,
"release" : 0.5,
"attack" : 0.03,
"knee" : 20
};
comp.ratio.value = values.ratio;
comp.threshold.value = values.threshold;
comp.release.value = values.release;
comp.attack.value = values.attack;
comp.knee.value = values.knee;
expect(comp.ratio.value).to.equal(values.ratio);
expect(comp.threshold.value).to.equal(values.threshold);
expect(comp.release.value).to.equal(values.release);
expect(comp.attack.value).to.be.closeTo(values.attack, 0.01);
expect(comp.knee.value).to.equal(values.knee);
comp.dispose();
});
});
});
});
```
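The `threshold`, `ratio`, and `knee` parameters exercised in these tests correspond to the standard static gain curve of a soft-knee dynamic range compressor. A minimal Python sketch of that curve (illustrative only, not part of Tone.js; default values mirror the test fixture above):

```python
def compressed_level_db(level_db, threshold=-30.0, ratio=22.0, knee=20.0):
    """Static curve of a soft-knee compressor: input level in dB -> output level in dB."""
    if level_db <= threshold - knee / 2:
        # Below the knee region: signal passes through at unity gain.
        return level_db
    if level_db >= threshold + knee / 2:
        # Above the knee region: excess over the threshold is divided by the ratio.
        return threshold + (level_db - threshold) / ratio
    # Inside the knee: quadratic interpolation smoothly joins the two segments.
    x = level_db - threshold + knee / 2
    return level_db + (1 / ratio - 1) * x * x / (2 * knee)
```

The two outer branches meet the quadratic section continuously at `threshold ± knee/2`, which is what distinguishes a soft knee from a hard one.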
|
```kotlin
package de.westnordost.streetcomplete.quests.parking_type
import de.westnordost.streetcomplete.R
import de.westnordost.streetcomplete.data.osm.geometry.ElementGeometry
import de.westnordost.streetcomplete.data.osm.osmquests.OsmFilterQuestType
import de.westnordost.streetcomplete.data.user.achievements.EditTypeAchievement.CAR
import de.westnordost.streetcomplete.osm.Tags
class AddParkingType : OsmFilterQuestType<ParkingType>() {

    override val elementFilter = """
        nodes, ways, relations with
          amenity = parking
          and (!parking or parking = yes)
    """
    override val changesetComment = "Specify parking types"
    override val wikiLink = "Tag:amenity=parking"
    override val icon = R.drawable.ic_quest_parking
    override val achievements = listOf(CAR)

    override fun getTitle(tags: Map<String, String>) = R.string.quest_parkingType_title

    override fun createForm() = AddParkingTypeForm()

    override fun applyAnswerTo(answer: ParkingType, tags: Tags, geometry: ElementGeometry, timestampEdited: Long) {
        tags["parking"] = answer.osmValue
    }
}
```
|
The Mangmoom Card (Thai for "spider") is a planned stored-value card for rapid transit systems in Bangkok. Currently, many commuters carry multiple cards, since the existing Rabbit Card only works on the BTS Skytrain and Bangkok BRT, while the MRT Plus card works on the MRT Blue Line and MRT Purple Line. The card was initially planned to launch in August 2016 but was delayed until at least 2018 as the Office of Transport and Traffic Policy and Planning required more time to integrate the ticketing systems of the different rail networks. In November 2016, Transport Minister Arkhom Termpittayapaisith announced the card would be available to use on the Skytrain, MRT, and Airport Rail Link by the middle of 2017.
In April 2017, it was announced that the MRTA would act as a central clearing house for the ticketing system, with the Airport Rail Link and some electric rail services joining by mid-2017, buses and the MRT Purple line joining by October 2017, and the BTS Skytrain and MRT Blue Line later. In October 2017, it was announced that the card was again delayed to mid-2018 and will only work with buses and the Airport Rail Link at launch.
In June 2018, it was announced the card would finally launch on the 23rd of that month, with 200,000 cards issued to the public, and would only work on the MRT Blue Line and MRT Purple Line at launch, with the Airport Rail Link to be added by October.
In September 2018, it was announced that compatibility with the Airport Rail Link would be delayed until the end of 2018, and with BMTA buses until March 2019. However, as of May 2023, no new Mangmoom Cards are being issued; the card issued by Bangkok MRT is now the MRT Card. The Blue and Purple lines accept EMV contactless payments, as will the Yellow Line. According to the OTP's plan, the joint ticketing system spanning the different public transport operators will be completed by 2027.
See also
Electronic money
List of smart cards
References
Contactless smart cards
Transport in Bangkok
|
NGI may refer to:
NGI Airport or Gau Airport, an airport in Fiji
National Geographic Institute (Belgium), the Belgian national mapping agency
Navigazione Generale Italiana, an Italian shipping company
Next Generation Identification, a project of the US Federal Bureau of Investigation (FBI)
Next Generation Interceptor, a US missile defense interceptor program
Next Generation Internet (disambiguation)
Northern Gulf Institute, a US National Oceanic and Atmospheric Administration (NOAA) Cooperative Institute
Norwegian Geotechnical Institute, a private geoscience research and consulting foundation
National Genetics Institute, genetics laboratory co-founded by Andrew Conrad
Nehemiah Global Initiative, a non-governmental organization founded by Kenneth Bae
See also
Ngi language, a language of Cameroon
|
```go
//
// path_to_url
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
package validations
import (
"errors"
"strings"
dto "github.com/prometheus/client_model/go"
)
// LintHistogramSummaryReserved detects when other types of metrics use names or labels
// reserved for use by histograms and/or summaries.
func LintHistogramSummaryReserved(mf *dto.MetricFamily) []error {
// These rules do not apply to untyped metrics.
t := mf.GetType()
if t == dto.MetricType_UNTYPED {
return nil
}
var problems []error
isHistogram := t == dto.MetricType_HISTOGRAM
isSummary := t == dto.MetricType_SUMMARY
n := mf.GetName()
if !isHistogram && strings.HasSuffix(n, "_bucket") {
problems = append(problems, errors.New(`non-histogram metrics should not have "_bucket" suffix`))
}
if !isHistogram && !isSummary && strings.HasSuffix(n, "_count") {
problems = append(problems, errors.New(`non-histogram and non-summary metrics should not have "_count" suffix`))
}
if !isHistogram && !isSummary && strings.HasSuffix(n, "_sum") {
problems = append(problems, errors.New(`non-histogram and non-summary metrics should not have "_sum" suffix`))
}
for _, m := range mf.GetMetric() {
for _, l := range m.GetLabel() {
ln := l.GetName()
if !isHistogram && ln == "le" {
problems = append(problems, errors.New(`non-histogram metrics should not have "le" label`))
}
if !isSummary && ln == "quantile" {
problems = append(problems, errors.New(`non-summary metrics should not have "quantile" label`))
}
}
}
return problems
}
```
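The suffix and label rules enforced by `LintHistogramSummaryReserved` can be sketched compactly in Python (an illustrative mirror of the logic above, not the Go API; names and signature are hypothetical):

```python
# Suffixes reserved for specific metric types: a metric may carry the suffix
# only if its type appears in the allowed set.
RESERVED_SUFFIXES = {
    "_bucket": {"histogram"},
    "_count": {"histogram", "summary"},
    "_sum": {"histogram", "summary"},
}

def lint_reserved(name, metric_type, labels=()):
    """Return a list of problems for reserved histogram/summary names and labels."""
    if metric_type == "untyped":
        return []  # the rules do not apply to untyped metrics
    problems = []
    for suffix, allowed in RESERVED_SUFFIXES.items():
        if name.endswith(suffix) and metric_type not in allowed:
            problems.append(f"metrics of type {metric_type!r} should not have {suffix!r} suffix")
    if "le" in labels and metric_type != "histogram":
        problems.append('non-histogram metrics should not have "le" label')
    if "quantile" in labels and metric_type != "summary":
        problems.append('non-summary metrics should not have "quantile" label')
    return problems
```

As in the Go version, each violation is collected rather than short-circuiting, so a caller sees every problem with a metric family at once.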
|
```java
package com.amazonaws.serverless.proxy.spring.slowapp;
import org.springframework.beans.factory.InitializingBean;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.stereotype.Component;
@SpringBootApplication(exclude = {
org.springframework.boot.autoconfigure.security.reactive.ReactiveUserDetailsServiceAutoConfiguration.class,
org.springframework.boot.autoconfigure.security.reactive.ReactiveSecurityAutoConfiguration.class,
org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration.class,
org.springframework.boot.autoconfigure.orm.jpa.HibernateJpaAutoConfiguration.class
})
public class SlowTestApplication {

    @Component
    public static class SlowDownInit implements InitializingBean {
        public static final int INIT_SLEEP_TIME_MS = 13_000;

        @Override
        public void afterPropertiesSet() throws Exception {
            Thread.sleep(INIT_SLEEP_TIME_MS);
        }
    }
}
```
|
Basophobia or basiphobia may refer to:
Basophobia, fear associated with astasia-abasia or fear of walking/standing erect
Basophobia, fear of falling
|
```shell
#!/bin/bash
SCRIPTS_PATH="$(dirname "$(realpath "$0")")"
RESOURCES_PATH=$SCRIPTS_PATH/../../resources
INSTALL_PATH=$SCRIPTS_PATH/../../install
QT_DIR_OPTION=""
PACKAGES_PATH=$SCRIPTS_PATH/../../packages
SIGN=false
CERT_FILE_OPTION=""
CERT_PSSW=""
#checking for parameters
for i in "$@"
do
case $i in
-i=*|--install_path=*)
INSTALL_PATH="${i#*=}"
shift # past argument=value
;;
-qt=*|--qt_dir=*)
QT_DIR_OPTION=qt="${i#*=}"
shift # past argument=value
;;
-p=*|--packages_path=*)
PACKAGES_PATH="${i#*=}"
shift # past argument=value
;;
-cf=*|--cert_file=*)
CERT_FILE_OPTION=cf="${i#*=}"
shift # past argument=value
;;
-cp=*|--cert_pssw=*)
CERT_PSSW="${i#*=}"
if [ -n "$CERT_PSSW" ]; then
SIGN=true
fi
shift # past argument=value
;;
*)
# unknown option
;;
esac
done
bash "$SCRIPTS_PATH/internal/2a_portable.sh" -i="$INSTALL_PATH" $QT_DIR_OPTION
echo "======= Portable Version Created ======="
if [ "$SIGN" = true ] ; then
bash "$SCRIPTS_PATH/internal/2b_sign_dlls.sh" -i="$INSTALL_PATH" $CERT_FILE_OPTION -cp="$CERT_PSSW"
echo "======= Portable Version Signed ======="
fi
bash "$SCRIPTS_PATH/internal/2c_installer.sh" -i="$INSTALL_PATH" -p="$PACKAGES_PATH"
echo "======= Installer Created ======="
if [ "$SIGN" = true ] ; then
bash "$SCRIPTS_PATH/internal/2b_sign_dlls.sh" -i="$PACKAGES_PATH" $CERT_FILE_OPTION -cp="$CERT_PSSW"
echo "======= Installer Signed ======="
fi
```
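The `case`-based `-key=value` parsing loop above is a common shell idiom; the same behavior can be sketched in Python (a hypothetical helper, not part of these build scripts):

```python
def parse_kv_args(argv):
    """Parse `-key=value` / `--key=value` arguments, mirroring the case/esac loop."""
    opts = {}
    for arg in argv:
        if arg.startswith("-") and "=" in arg:
            key, _, value = arg.partition("=")
            opts[key.lstrip("-")] = value
        # anything else falls through silently, like the `*)` unknown-option branch
    return opts
```

For example, `parse_kv_args(["-i=/opt/app", "--qt_dir=/usr/lib/qt"])` yields `{"i": "/opt/app", "qt_dir": "/usr/lib/qt"}`, matching how the shell script extracts `"${i#*=}"` after each `=`.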
|
```objective-c
//===your_sha256_hash------===//
//
// See path_to_url for license information.
//
//===your_sha256_hash------===//
#ifndef _LIBCPP___ALGORITHM_CLAMP_H
#define _LIBCPP___ALGORITHM_CLAMP_H
#include <__algorithm/comp.h>
#include <__assert>
#include <__config>
#if !defined(_LIBCPP_HAS_NO_PRAGMA_SYSTEM_HEADER)
# pragma GCC system_header
#endif
_LIBCPP_BEGIN_NAMESPACE_STD
#if _LIBCPP_STD_VER > 14
template<class _Tp, class _Compare>
_LIBCPP_NODISCARD_EXT inline
_LIBCPP_INLINE_VISIBILITY constexpr
const _Tp&
clamp(const _Tp& __v, const _Tp& __lo, const _Tp& __hi, _Compare __comp)
{
_LIBCPP_ASSERT(!__comp(__hi, __lo), "Bad bounds passed to std::clamp");
return __comp(__v, __lo) ? __lo : __comp(__hi, __v) ? __hi : __v;
}
template<class _Tp>
_LIBCPP_NODISCARD_EXT inline
_LIBCPP_INLINE_VISIBILITY constexpr
const _Tp&
clamp(const _Tp& __v, const _Tp& __lo, const _Tp& __hi)
{
return _VSTD::clamp(__v, __lo, __hi, __less<_Tp>());
}
#endif
_LIBCPP_END_NAMESPACE_STD
#endif // _LIBCPP___ALGORITHM_CLAMP_H
```
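The semantics of the two `clamp` overloads above (including the precondition that the bounds are ordered under the comparator) can be expressed in a few lines of Python, shown here as an illustrative sketch rather than a libc++ equivalent:

```python
def clamp(v, lo, hi, comp=lambda a, b: a < b):
    """std::clamp semantics: requires that comp(hi, lo) is false (ordered bounds)."""
    assert not comp(hi, lo), "Bad bounds passed to clamp"
    if comp(v, lo):
        return lo
    return hi if comp(hi, v) else v
```

With the default comparator this is the familiar `min(max(v, lo), hi)`; passing a custom `comp` reverses or redefines the ordering, exactly as the `_Compare` overload does.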
|
```java
/// Source : path_to_url
/// Author : liuyubobobo
/// Time : 2017-11-17
import java.util.LinkedList;
/// Non-Recursive
/// Time Complexity: O(n), where n is the node's number of the tree
/// Space Complexity: O(h), where h is the height of the tree
public class Solution2 {
// Definition for a binary tree node.
public class TreeNode {
int val;
TreeNode left;
TreeNode right;
TreeNode(int x) { val = x; }
}
public TreeNode invertTree(TreeNode root) {
if(root == null)
return null;
LinkedList<TreeNode> queue = new LinkedList<TreeNode>();
queue.addLast(root);
while(!queue.isEmpty()){
TreeNode curNode = queue.removeFirst();
TreeNode tempNode = curNode.left;
curNode.left = curNode.right;
curNode.right = tempNode;
if(curNode.left != null)
queue.addLast(curNode.left);
if(curNode.right != null)
queue.addLast(curNode.right);
}
return root;
}
}
```
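The same iterative breadth-first inversion translates directly to Python; the sketch below mirrors the Java solution above (class and function names are illustrative):

```python
from collections import deque

class TreeNode:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def invert_tree(root):
    """Invert a binary tree iteratively with a FIFO queue (BFS)."""
    if root is None:
        return None
    queue = deque([root])
    while queue:
        node = queue.popleft()
        node.left, node.right = node.right, node.left  # swap the two children
        if node.left:
            queue.append(node.left)
        if node.right:
            queue.append(node.right)
    return root
```

Time complexity is O(n) over the n nodes, and the queue holds at most one level of the tree, so space is bounded by the tree's maximum width.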
|
```hlsl
#include "common.hlsl"
struct VS_OUTPUT {
float4 pos : POSITION;
float4 texCoord : TEXCOORD0;
#ifndef SHADOW_DEPTH
float4 hpos : TEXCOORD1;
#endif
};
#ifdef VERTEX
VS_OUTPUT main(VS_INPUT In) {
VS_OUTPUT Out;
int index = int(In.aCoord.w);
float4 rBasisRot = uBasis[index];
float4 rBasisPos = uBasis[index + 1];
float3 cpos = In.aCoord.xyz - normalize(In.aNormal.xyz) * SHADOW_NORMAL_BIAS;
float3 coord = mulBasis(rBasisRot, rBasisPos.xyz, cpos);
Out.texCoord = In.aTexCoord * INV_SHORT_HALF;
Out.pos = mul(uViewProj, float4(coord, rBasisPos.w));
#ifndef SHADOW_DEPTH
Out.hpos = Out.pos;
#endif
return Out;
}
#else // PIXEL
#if defined(ALPHA_TEST) || !defined(SHADOW_DEPTH)
#define PS_PARAMS VS_OUTPUT In
#else
#define PS_PARAMS
#endif
float4 main(PS_PARAMS) : COLOR0 {
#ifdef ALPHA_TEST
clip(SAMPLE_2D_LINEAR(sDiffuse, In.texCoord.xy).a - ALPHA_REF);
#endif
#ifdef SHADOW_DEPTH
return 0.0;
#else
return pack(In.hpos.z / In.hpos.w);
#endif
}
#endif
```
|
```c
#include <stdlib.h>
#include <string.h>
#include "config.h"
#include "node.h"
static void S_node_unlink(cmark_node *node);
#define NODE_MEM(node) cmark_node_mem(node)
static CMARK_INLINE bool S_is_block(cmark_node *node) {
if (node == NULL) {
return false;
}
return node->type >= CMARK_NODE_FIRST_BLOCK &&
node->type <= CMARK_NODE_LAST_BLOCK;
}
static CMARK_INLINE bool S_is_inline(cmark_node *node) {
if (node == NULL) {
return false;
}
return node->type >= CMARK_NODE_FIRST_INLINE &&
node->type <= CMARK_NODE_LAST_INLINE;
}
static bool S_can_contain(cmark_node *node, cmark_node *child) {
cmark_node *cur;
if (node == NULL || child == NULL) {
return false;
}
// Verify that child is not an ancestor of node or equal to node.
cur = node;
do {
if (cur == child) {
return false;
}
cur = cur->parent;
} while (cur != NULL);
if (child->type == CMARK_NODE_DOCUMENT) {
return false;
}
switch (node->type) {
case CMARK_NODE_DOCUMENT:
case CMARK_NODE_BLOCK_QUOTE:
case CMARK_NODE_ITEM:
return S_is_block(child) && child->type != CMARK_NODE_ITEM;
case CMARK_NODE_LIST:
return child->type == CMARK_NODE_ITEM;
case CMARK_NODE_CUSTOM_BLOCK:
return true;
case CMARK_NODE_PARAGRAPH:
case CMARK_NODE_HEADING:
case CMARK_NODE_EMPH:
case CMARK_NODE_STRONG:
case CMARK_NODE_LINK:
case CMARK_NODE_IMAGE:
case CMARK_NODE_CUSTOM_INLINE:
return S_is_inline(child);
default:
break;
}
return false;
}
cmark_node *cmark_node_new_with_mem(cmark_node_type type, cmark_mem *mem) {
cmark_node *node = (cmark_node *)mem->calloc(1, sizeof(*node));
cmark_strbuf_init(mem, &node->content, 0);
node->type = (uint16_t)type;
switch (node->type) {
case CMARK_NODE_HEADING:
node->as.heading.level = 1;
break;
case CMARK_NODE_LIST: {
cmark_list *list = &node->as.list;
list->list_type = CMARK_BULLET_LIST;
list->start = 0;
list->tight = false;
break;
}
default:
break;
}
return node;
}
cmark_node *cmark_node_new(cmark_node_type type) {
extern cmark_mem DEFAULT_MEM_ALLOCATOR;
return cmark_node_new_with_mem(type, &DEFAULT_MEM_ALLOCATOR);
}
// Free a cmark_node list and any children.
static void S_free_nodes(cmark_node *e) {
cmark_node *next;
while (e != NULL) {
cmark_strbuf_free(&e->content);
switch (e->type) {
case CMARK_NODE_CODE_BLOCK:
cmark_chunk_free(NODE_MEM(e), &e->as.code.info);
cmark_chunk_free(NODE_MEM(e), &e->as.code.literal);
break;
case CMARK_NODE_TEXT:
case CMARK_NODE_HTML_INLINE:
case CMARK_NODE_CODE:
case CMARK_NODE_HTML_BLOCK:
cmark_chunk_free(NODE_MEM(e), &e->as.literal);
break;
case CMARK_NODE_LINK:
case CMARK_NODE_IMAGE:
cmark_chunk_free(NODE_MEM(e), &e->as.link.url);
cmark_chunk_free(NODE_MEM(e), &e->as.link.title);
break;
case CMARK_NODE_CUSTOM_BLOCK:
case CMARK_NODE_CUSTOM_INLINE:
cmark_chunk_free(NODE_MEM(e), &e->as.custom.on_enter);
cmark_chunk_free(NODE_MEM(e), &e->as.custom.on_exit);
break;
default:
break;
}
if (e->last_child) {
// Splice children into list
e->last_child->next = e->next;
e->next = e->first_child;
}
next = e->next;
NODE_MEM(e)->free(e);
e = next;
}
}
void cmark_node_free(cmark_node *node) {
S_node_unlink(node);
node->next = NULL;
S_free_nodes(node);
}
cmark_node_type cmark_node_get_type(cmark_node *node) {
if (node == NULL) {
return CMARK_NODE_NONE;
} else {
return (cmark_node_type)node->type;
}
}
const char *cmark_node_get_type_string(cmark_node *node) {
if (node == NULL) {
return "NONE";
}
switch (node->type) {
case CMARK_NODE_NONE:
return "none";
case CMARK_NODE_DOCUMENT:
return "document";
case CMARK_NODE_BLOCK_QUOTE:
return "block_quote";
case CMARK_NODE_LIST:
return "list";
case CMARK_NODE_ITEM:
return "item";
case CMARK_NODE_CODE_BLOCK:
return "code_block";
case CMARK_NODE_HTML_BLOCK:
return "html_block";
case CMARK_NODE_CUSTOM_BLOCK:
return "custom_block";
case CMARK_NODE_PARAGRAPH:
return "paragraph";
case CMARK_NODE_HEADING:
return "heading";
case CMARK_NODE_THEMATIC_BREAK:
return "thematic_break";
case CMARK_NODE_TEXT:
return "text";
case CMARK_NODE_SOFTBREAK:
return "softbreak";
case CMARK_NODE_LINEBREAK:
return "linebreak";
case CMARK_NODE_CODE:
return "code";
case CMARK_NODE_HTML_INLINE:
return "html_inline";
case CMARK_NODE_CUSTOM_INLINE:
return "custom_inline";
case CMARK_NODE_EMPH:
return "emph";
case CMARK_NODE_STRONG:
return "strong";
case CMARK_NODE_LINK:
return "link";
case CMARK_NODE_IMAGE:
return "image";
}
return "<unknown>";
}
cmark_node *cmark_node_next(cmark_node *node) {
if (node == NULL) {
return NULL;
} else {
return node->next;
}
}
cmark_node *cmark_node_previous(cmark_node *node) {
if (node == NULL) {
return NULL;
} else {
return node->prev;
}
}
cmark_node *cmark_node_parent(cmark_node *node) {
if (node == NULL) {
return NULL;
} else {
return node->parent;
}
}
cmark_node *cmark_node_first_child(cmark_node *node) {
if (node == NULL) {
return NULL;
} else {
return node->first_child;
}
}
cmark_node *cmark_node_last_child(cmark_node *node) {
if (node == NULL) {
return NULL;
} else {
return node->last_child;
}
}
void *cmark_node_get_user_data(cmark_node *node) {
if (node == NULL) {
return NULL;
} else {
return node->user_data;
}
}
int cmark_node_set_user_data(cmark_node *node, void *user_data) {
if (node == NULL) {
return 0;
}
node->user_data = user_data;
return 1;
}
const char *cmark_node_get_literal(cmark_node *node) {
if (node == NULL) {
return NULL;
}
switch (node->type) {
case CMARK_NODE_HTML_BLOCK:
case CMARK_NODE_TEXT:
case CMARK_NODE_HTML_INLINE:
case CMARK_NODE_CODE:
return cmark_chunk_to_cstr(NODE_MEM(node), &node->as.literal);
case CMARK_NODE_CODE_BLOCK:
return cmark_chunk_to_cstr(NODE_MEM(node), &node->as.code.literal);
default:
break;
}
return NULL;
}
int cmark_node_set_literal(cmark_node *node, const char *content) {
if (node == NULL) {
return 0;
}
switch (node->type) {
case CMARK_NODE_HTML_BLOCK:
case CMARK_NODE_TEXT:
case CMARK_NODE_HTML_INLINE:
case CMARK_NODE_CODE:
cmark_chunk_set_cstr(NODE_MEM(node), &node->as.literal, content);
return 1;
case CMARK_NODE_CODE_BLOCK:
cmark_chunk_set_cstr(NODE_MEM(node), &node->as.code.literal, content);
return 1;
default:
break;
}
return 0;
}
int cmark_node_get_heading_level(cmark_node *node) {
if (node == NULL) {
return 0;
}
switch (node->type) {
case CMARK_NODE_HEADING:
return node->as.heading.level;
default:
break;
}
return 0;
}
int cmark_node_set_heading_level(cmark_node *node, int level) {
if (node == NULL || level < 1 || level > 6) {
return 0;
}
switch (node->type) {
case CMARK_NODE_HEADING:
node->as.heading.level = level;
return 1;
default:
break;
}
return 0;
}
cmark_list_type cmark_node_get_list_type(cmark_node *node) {
if (node == NULL) {
return CMARK_NO_LIST;
}
if (node->type == CMARK_NODE_LIST) {
return node->as.list.list_type;
} else {
return CMARK_NO_LIST;
}
}
int cmark_node_set_list_type(cmark_node *node, cmark_list_type type) {
if (!(type == CMARK_BULLET_LIST || type == CMARK_ORDERED_LIST)) {
return 0;
}
if (node == NULL) {
return 0;
}
if (node->type == CMARK_NODE_LIST) {
node->as.list.list_type = type;
return 1;
} else {
return 0;
}
}
cmark_delim_type cmark_node_get_list_delim(cmark_node *node) {
if (node == NULL) {
return CMARK_NO_DELIM;
}
if (node->type == CMARK_NODE_LIST) {
return node->as.list.delimiter;
} else {
return CMARK_NO_DELIM;
}
}
int cmark_node_set_list_delim(cmark_node *node, cmark_delim_type delim) {
if (!(delim == CMARK_PERIOD_DELIM || delim == CMARK_PAREN_DELIM)) {
return 0;
}
if (node == NULL) {
return 0;
}
if (node->type == CMARK_NODE_LIST) {
node->as.list.delimiter = delim;
return 1;
} else {
return 0;
}
}
int cmark_node_get_list_start(cmark_node *node) {
if (node == NULL) {
return 0;
}
if (node->type == CMARK_NODE_LIST) {
return node->as.list.start;
} else {
return 0;
}
}
int cmark_node_set_list_start(cmark_node *node, int start) {
if (node == NULL || start < 0) {
return 0;
}
if (node->type == CMARK_NODE_LIST) {
node->as.list.start = start;
return 1;
} else {
return 0;
}
}
int cmark_node_get_list_tight(cmark_node *node) {
if (node == NULL) {
return 0;
}
if (node->type == CMARK_NODE_LIST) {
return node->as.list.tight;
} else {
return 0;
}
}
int cmark_node_set_list_tight(cmark_node *node, int tight) {
if (node == NULL) {
return 0;
}
if (node->type == CMARK_NODE_LIST) {
node->as.list.tight = tight == 1;
return 1;
} else {
return 0;
}
}
const char *cmark_node_get_fence_info(cmark_node *node) {
if (node == NULL) {
return NULL;
}
if (node->type == CMARK_NODE_CODE_BLOCK) {
return cmark_chunk_to_cstr(NODE_MEM(node), &node->as.code.info);
} else {
return NULL;
}
}
int cmark_node_set_fence_info(cmark_node *node, const char *info) {
if (node == NULL) {
return 0;
}
if (node->type == CMARK_NODE_CODE_BLOCK) {
cmark_chunk_set_cstr(NODE_MEM(node), &node->as.code.info, info);
return 1;
} else {
return 0;
}
}
const char *cmark_node_get_url(cmark_node *node) {
if (node == NULL) {
return NULL;
}
switch (node->type) {
case CMARK_NODE_LINK:
case CMARK_NODE_IMAGE:
return cmark_chunk_to_cstr(NODE_MEM(node), &node->as.link.url);
default:
break;
}
return NULL;
}
int cmark_node_set_url(cmark_node *node, const char *url) {
if (node == NULL) {
return 0;
}
switch (node->type) {
case CMARK_NODE_LINK:
case CMARK_NODE_IMAGE:
cmark_chunk_set_cstr(NODE_MEM(node), &node->as.link.url, url);
return 1;
default:
break;
}
return 0;
}
const char *cmark_node_get_title(cmark_node *node) {
if (node == NULL) {
return NULL;
}
switch (node->type) {
case CMARK_NODE_LINK:
case CMARK_NODE_IMAGE:
return cmark_chunk_to_cstr(NODE_MEM(node), &node->as.link.title);
default:
break;
}
return NULL;
}
int cmark_node_set_title(cmark_node *node, const char *title) {
if (node == NULL) {
return 0;
}
switch (node->type) {
case CMARK_NODE_LINK:
case CMARK_NODE_IMAGE:
cmark_chunk_set_cstr(NODE_MEM(node), &node->as.link.title, title);
return 1;
default:
break;
}
return 0;
}
const char *cmark_node_get_on_enter(cmark_node *node) {
if (node == NULL) {
return NULL;
}
switch (node->type) {
case CMARK_NODE_CUSTOM_INLINE:
case CMARK_NODE_CUSTOM_BLOCK:
return cmark_chunk_to_cstr(NODE_MEM(node), &node->as.custom.on_enter);
default:
break;
}
return NULL;
}
int cmark_node_set_on_enter(cmark_node *node, const char *on_enter) {
if (node == NULL) {
return 0;
}
switch (node->type) {
case CMARK_NODE_CUSTOM_INLINE:
case CMARK_NODE_CUSTOM_BLOCK:
cmark_chunk_set_cstr(NODE_MEM(node), &node->as.custom.on_enter, on_enter);
return 1;
default:
break;
}
return 0;
}
const char *cmark_node_get_on_exit(cmark_node *node) {
if (node == NULL) {
return NULL;
}
switch (node->type) {
case CMARK_NODE_CUSTOM_INLINE:
case CMARK_NODE_CUSTOM_BLOCK:
return cmark_chunk_to_cstr(NODE_MEM(node), &node->as.custom.on_exit);
default:
break;
}
return NULL;
}
int cmark_node_set_on_exit(cmark_node *node, const char *on_exit) {
if (node == NULL) {
return 0;
}
switch (node->type) {
case CMARK_NODE_CUSTOM_INLINE:
case CMARK_NODE_CUSTOM_BLOCK:
cmark_chunk_set_cstr(NODE_MEM(node), &node->as.custom.on_exit, on_exit);
return 1;
default:
break;
}
return 0;
}
int cmark_node_get_start_line(cmark_node *node) {
if (node == NULL) {
return 0;
}
return node->start_line;
}
int cmark_node_get_start_column(cmark_node *node) {
if (node == NULL) {
return 0;
}
return node->start_column;
}
int cmark_node_get_end_line(cmark_node *node) {
if (node == NULL) {
return 0;
}
return node->end_line;
}
int cmark_node_get_end_column(cmark_node *node) {
if (node == NULL) {
return 0;
}
return node->end_column;
}
// Unlink a node without adjusting its next, prev, and parent pointers.
static void S_node_unlink(cmark_node *node) {
if (node == NULL) {
return;
}
if (node->prev) {
node->prev->next = node->next;
}
if (node->next) {
node->next->prev = node->prev;
}
// Adjust first_child and last_child of parent.
cmark_node *parent = node->parent;
if (parent) {
if (parent->first_child == node) {
parent->first_child = node->next;
}
if (parent->last_child == node) {
parent->last_child = node->prev;
}
}
}
void cmark_node_unlink(cmark_node *node) {
if (node == NULL) {
return;
}
S_node_unlink(node);
node->next = NULL;
node->prev = NULL;
node->parent = NULL;
}
int cmark_node_insert_before(cmark_node *node, cmark_node *sibling) {
if (node == NULL || sibling == NULL) {
return 0;
}
if (!node->parent || !S_can_contain(node->parent, sibling)) {
return 0;
}
S_node_unlink(sibling);
cmark_node *old_prev = node->prev;
// Insert 'sibling' between 'old_prev' and 'node'.
if (old_prev) {
old_prev->next = sibling;
}
sibling->prev = old_prev;
sibling->next = node;
node->prev = sibling;
// Set new parent.
cmark_node *parent = node->parent;
sibling->parent = parent;
// Adjust first_child of parent if inserted as first child.
if (parent && !old_prev) {
parent->first_child = sibling;
}
return 1;
}
int cmark_node_insert_after(cmark_node *node, cmark_node *sibling) {
if (node == NULL || sibling == NULL) {
return 0;
}
if (!node->parent || !S_can_contain(node->parent, sibling)) {
return 0;
}
S_node_unlink(sibling);
cmark_node *old_next = node->next;
// Insert 'sibling' between 'node' and 'old_next'.
if (old_next) {
old_next->prev = sibling;
}
sibling->next = old_next;
sibling->prev = node;
node->next = sibling;
// Set new parent.
cmark_node *parent = node->parent;
sibling->parent = parent;
// Adjust last_child of parent if inserted as last child.
if (parent && !old_next) {
parent->last_child = sibling;
}
return 1;
}
int cmark_node_replace(cmark_node *oldnode, cmark_node *newnode) {
if (!cmark_node_insert_before(oldnode, newnode)) {
return 0;
}
cmark_node_unlink(oldnode);
return 1;
}
int cmark_node_prepend_child(cmark_node *node, cmark_node *child) {
if (!S_can_contain(node, child)) {
return 0;
}
S_node_unlink(child);
cmark_node *old_first_child = node->first_child;
child->next = old_first_child;
child->prev = NULL;
child->parent = node;
node->first_child = child;
if (old_first_child) {
old_first_child->prev = child;
} else {
// Also set last_child if node previously had no children.
node->last_child = child;
}
return 1;
}
int cmark_node_append_child(cmark_node *node, cmark_node *child) {
if (!S_can_contain(node, child)) {
return 0;
}
S_node_unlink(child);
cmark_node *old_last_child = node->last_child;
child->next = NULL;
child->prev = old_last_child;
child->parent = node;
node->last_child = child;
if (old_last_child) {
old_last_child->next = child;
} else {
// Also set first_child if node previously had no children.
node->first_child = child;
}
return 1;
}
static void S_print_error(FILE *out, cmark_node *node, const char *elem) {
if (out == NULL) {
return;
}
fprintf(out, "Invalid '%s' in node type %s at %d:%d\n", elem,
cmark_node_get_type_string(node), node->start_line,
node->start_column);
}
int cmark_node_check(cmark_node *node, FILE *out) {
cmark_node *cur;
int errors = 0;
if (!node) {
return 0;
}
cur = node;
for (;;) {
if (cur->first_child) {
if (cur->first_child->prev != NULL) {
S_print_error(out, cur->first_child, "prev");
cur->first_child->prev = NULL;
++errors;
}
if (cur->first_child->parent != cur) {
S_print_error(out, cur->first_child, "parent");
cur->first_child->parent = cur;
++errors;
}
cur = cur->first_child;
continue;
}
next_sibling:
if (cur == node) {
break;
}
if (cur->next) {
if (cur->next->prev != cur) {
S_print_error(out, cur->next, "prev");
cur->next->prev = cur;
++errors;
}
if (cur->next->parent != cur->parent) {
S_print_error(out, cur->next, "parent");
cur->next->parent = cur->parent;
++errors;
}
cur = cur->next;
continue;
}
if (cur->parent->last_child != cur) {
S_print_error(out, cur->parent, "last_child");
cur->parent->last_child = cur;
++errors;
}
cur = cur->parent;
goto next_sibling;
}
return errors;
}
```
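The tree-manipulation functions above all share the same doubly-linked-list bookkeeping: `S_node_unlink` detaches a node from its siblings and parent without resetting the node's own pointers, and the insert/append functions re-wire `prev`/`next`/`parent` plus the parent's `first_child`/`last_child`. A condensed, self-contained sketch of that bookkeeping, using a hypothetical `node` struct rather than cmark's real `cmark_node` (and omitting the `S_can_contain` type checks):

```c
#include <assert.h>
#include <stddef.h>

/* Simplified stand-in for cmark_node: only the link fields. */
typedef struct node {
  struct node *next, *prev, *parent;
  struct node *first_child, *last_child;
} node;

/* Mirror of S_node_unlink: detach from siblings and parent,
 * but leave the node's own pointers untouched. */
static void node_unlink(node *n) {
  if (n == NULL) return;
  if (n->prev) n->prev->next = n->next;
  if (n->next) n->next->prev = n->prev;
  if (n->parent) {
    if (n->parent->first_child == n) n->parent->first_child = n->next;
    if (n->parent->last_child == n) n->parent->last_child = n->prev;
  }
}

/* Mirror of cmark_node_append_child, minus the containment check. */
static void append_child(node *parent, node *child) {
  node_unlink(child);
  node *old_last = parent->last_child;
  child->next = NULL;
  child->prev = old_last;
  child->parent = parent;
  parent->last_child = child;
  if (old_last) old_last->next = child;
  else parent->first_child = child; /* node previously had no children */
}
```

Because `node_unlink` is called first, appending a node that is already linked somewhere else moves it rather than corrupting the old list, which is the same invariant the cmark functions preserve.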
|
```java
/**
* Tencent is pleased to support the open source community by making MSEC available.
*
* path_to_url
*
* Unless required by applicable law or agreed to in writing, software distributed under the
* License is distributed on an "AS IS" basis, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
* either express or implied. See the License for the specific language governing permissions
* and limitations under the License.
*/
package beans.service;
import beans.request.Alarm;
import beans.response.AlarmResponse;
import msec.monitor.Monitor;
import msec.org.DBUtil;
import msec.org.JsonRPCHandler;
import msec.org.Tools;
import org.apache.log4j.Logger;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.Socket;
import java.util.ArrayList;
/**
* Created by Administrator on 2016/4/27.
*/
public class DeleteAlarmSetting extends JsonRPCHandler {
public static void delAlarmSetting(String svc, String attr, int type) throws Exception
{
Logger logger = Logger.getLogger(DeleteAlarmSetting.class);
Socket sock = new Socket();
ArrayList<String> ret = new ArrayList<String>();
try {
sock.setSoTimeout(2000);
sock.connect(new InetSocketAddress(MonitorBySvcOrIP.monitor_server_ip, MonitorBySvcOrIP.monitor_server_port), 3000);
OutputStream out = sock.getOutputStream();
// build a request that clears the alarm threshold for this service/attribute
Monitor.ReqSetAlarmAttr.Builder b = Monitor.ReqSetAlarmAttr.newBuilder();
b.setServicename(svc);
b.setAttrname(attr);
switch (type)
{
case Monitor.AlarmType.ALARM_DIFF_PERCENT_VALUE:
b.setDiffPercent(0);
break;
case Monitor.AlarmType.ALARM_MAX_VALUE:
b.setMax(0);
break;
case Monitor.AlarmType.ALARM_DIFF_VALUE:
b.setDiff(0);
break;
case Monitor.AlarmType.ALARM_MIN_VALUE:
b.setMin(0);
break;
default:
throw new Exception("invalid type:"+type);
}
Monitor.ReqSetAlarmAttr reqSetAlarmAttr = b.build();
Monitor.ReqMonitor.Builder bb = Monitor.ReqMonitor.newBuilder();
bb.setSetalarmattr(reqSetAlarmAttr);
Monitor.ReqMonitor req = bb.build();
// send the 4-byte length prefix (which counts itself)
byte[] lenB = Tools.int2Bytes(req.getSerializedSize()+4);
out.write(lenB);
// write the serialized protobuf body
req.writeTo(out);
logger.info("send request to monitor.");
// read the 4-byte response length
InputStream in = sock.getInputStream();
byte[] buf = new byte[1024*1024];
int totalLen = 4;
int totalReceived = 0;
while (totalReceived < totalLen) {
int len = in.read(buf, totalReceived, totalLen - totalReceived);
if (len <= 0) {
throw new Exception("connection to monitor server closed while reading length.");
}
totalReceived += len;
}
totalLen = Tools.bytes2int(buf);
logger.info("monitor server response length:"+totalLen);
if (totalLen < 4 || totalLen > buf.length) {
throw new Exception("invalid response length from monitor server.");
}
totalLen -= 4;
totalReceived = 0;
buf = new byte[totalLen];
while (totalReceived < totalLen) {
int len = in.read(buf, totalReceived, totalLen - totalReceived);
if (len <= 0) {
throw new Exception("connection to monitor server closed while reading body.");
}
totalReceived += len;
}
logger.info("received monitor server response successfully.");
Monitor.RespMonitor monitor = Monitor.RespMonitor.parseFrom(buf);
if (monitor.getResult() != 0) {
logger.error("monitor server returns errcode:"+monitor.getResult());
throw new Exception("monitor server. result="+monitor.getResult());
}
Monitor.RespSetAlarmAttr resp = monitor.getSetalarmattr();
} finally {
if (sock != null && sock.isConnected()) {
try {
sock.close();
} catch (Exception e) {
}
}
}
}
public AlarmResponse exec(Alarm request)
{
AlarmResponse response = new AlarmResponse();
Logger logger = Logger.getLogger(DeleteAlarmSetting.class);
String svc = "";
String date = "";
String attr = "";
String result;
result = checkIdentity();
if (!result.equals("success"))
{
response.setStatus(99);
response.setMessage(result);
return response;
}
svc = request.getService_name();
attr = request.getAttr_name();
int type = request.getAlarm_type();
if (svc == null || svc.length() == 0 )
{
response.setMessage("service name should NOT be empty!");
response.setStatus(100);
return response;
}
try {
delAlarmSetting(svc, attr, type);
response.setMessage("success");
response.setStatus(0);
return response;
}
catch (Exception e)
{
response.setMessage(e.getMessage());
response.setStatus(100);
return response;
}
}
}
```
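The socket exchange above relies on a simple framing convention: each protobuf message is prefixed by a 4-byte length that counts the prefix itself, which is why the code writes `getSerializedSize()+4` and later subtracts 4 from the received length. A minimal sketch of that convention, using a hypothetical `Framing` class in place of MSEC's `Tools` (big-endian byte order is an assumption here):

```java
import java.nio.ByteBuffer;

class Framing {
    // Encode an int as 4 big-endian bytes (in the spirit of Tools.int2Bytes).
    static byte[] int2Bytes(int v) {
        return ByteBuffer.allocate(4).putInt(v).array();
    }

    // Decode the first 4 bytes of a buffer (in the spirit of Tools.bytes2int).
    static int bytes2int(byte[] buf) {
        return ByteBuffer.wrap(buf).getInt();
    }

    // Build a full frame for a serialized message body.
    static byte[] frame(byte[] body) {
        ByteBuffer out = ByteBuffer.allocate(4 + body.length);
        out.putInt(4 + body.length); // length includes the prefix itself
        out.put(body);
        return out.array();
    }
}
```

Under this convention a reader first pulls exactly 4 bytes, decodes the total length, validates it against its buffer, and then reads `length - 4` further bytes, which is exactly the loop structure in `delAlarmSetting`.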
|
```scss
@import '../../../themes/mixins/buttons';
@import '../sidebarConfig';
.hideThumb {
display: none !important;
}
.scrollbarThumb {
background-color: var(--theme-sidebar-wallets-scrollbar-background-color);
border-radius: 2px;
width: 4px !important;
&:active {
background-color: var(
--theme-sidebar-wallets-scrollbar-background-color-active
);
}
&:hover {
background-color: var(
--theme-sidebar-wallets-scrollbar-background-color-hover
);
}
}
.wallets {
flex: 1 1 auto;
}
.addWalletButton {
@include button;
align-items: center;
background-color: var(--theme-sidebar-menu-add-button-background-color);
color: var(--theme-sidebar-menu-add-button-text-color);
display: flex;
height: $sidebar-button-height;
width: $sidebar-width - $sidebar-minimized-category-width;
&:hover {
background-color: var(
--theme-sidebar-menu-add-button-background-color-hover
);
}
.icon {
line-height: 12px;
margin: 0 20px;
width: 25px;
& > svg {
width: 25px;
path {
fill: var(--theme-icon-add-wallet-from-sidebar-color);
}
}
}
span {
font-family: var(--font-regular);
font-size: 18px;
}
&.active {
background-color: var(
--theme-sidebar-menu-add-button-background-color-active
);
cursor: default;
}
}
.walletSortControls {
align-items: center;
display: flex;
justify-content: center;
margin-bottom: 14px;
margin-top: 14px;
width: 100%;
.walletSortOffset {
margin-right: 12px;
}
}
.walletSearchContainer {
align-items: center;
display: flex;
height: 84px;
padding: 0 20px;
}
```
|
Abad is a village and municipality in the Agdash Rayon of Azerbaijan. It has a population of 1,716.
References
Populated places in Agdash District
|
```c++
/*
*
* Use of this source code is governed by a BSD-style license that can be
* found in the LICENSE file.
*/
#ifndef GrGLBufferImpl_DEFINED
#define GrGLBufferImpl_DEFINED
#include "SkTypes.h"
#include "gl/GrGLFunctions.h"
class GrGLGpu;
/**
* This class serves as the implementation of GrGL*Buffer classes. It was written to avoid code
* duplication in those classes.
*/
class GrGLBufferImpl : SkNoncopyable {
public:
struct Desc {
GrGLuint fID; // set to 0 to indicate buffer is CPU-backed and not a VBO.
size_t fSizeInBytes;
bool fDynamic;
};
GrGLBufferImpl(GrGLGpu*, const Desc&, GrGLenum bufferType);
~GrGLBufferImpl() {
// either release or abandon should have been called by the owner of this object.
SkASSERT(0 == fDesc.fID);
}
void abandon();
void release(GrGLGpu* gpu);
GrGLuint bufferID() const { return fDesc.fID; }
size_t baseOffset() const { return reinterpret_cast<size_t>(fCPUData); }
void bind(GrGLGpu* gpu) const;
void* map(GrGLGpu* gpu);
void unmap(GrGLGpu* gpu);
bool isMapped() const;
bool updateData(GrGLGpu* gpu, const void* src, size_t srcSizeInBytes);
private:
void validate() const;
Desc fDesc;
GrGLenum fBufferType; // GL_ARRAY_BUFFER or GL_ELEMENT_ARRAY_BUFFER
void* fCPUData;
void* fMapPtr;
size_t fGLSizeInBytes; // In certain cases we make the size of the GL buffer object
// smaller or larger than the size in fDesc.
typedef SkNoncopyable INHERITED;
};
#endif
```
|
```c++
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
#ifndef V8_INTL_SUPPORT
#error Internationalization is expected to be enabled.
#endif // V8_INTL_SUPPORT
#include "src/objects/js-segment-iterator.h"
#include <map>
#include <memory>
#include <string>
#include "src/heap/factory.h"
#include "src/isolate.h"
#include "src/objects-inl.h"
#include "src/objects/intl-objects.h"
#include "src/objects/js-segment-iterator-inl.h"
#include "src/objects/managed.h"
#include "unicode/brkiter.h"
namespace v8 {
namespace internal {
MaybeHandle<String> JSSegmentIterator::GetSegment(Isolate* isolate,
int32_t start,
int32_t end) const {
return Intl::ToString(isolate, *(unicode_string()->raw()), start, end);
}
Handle<String> JSSegmentIterator::GranularityAsString() const {
switch (granularity()) {
case JSSegmenter::Granularity::GRAPHEME:
return GetReadOnlyRoots().grapheme_string_handle();
case JSSegmenter::Granularity::WORD:
return GetReadOnlyRoots().word_string_handle();
case JSSegmenter::Granularity::SENTENCE:
return GetReadOnlyRoots().sentence_string_handle();
case JSSegmenter::Granularity::COUNT:
UNREACHABLE();
}
}
MaybeHandle<JSSegmentIterator> JSSegmentIterator::Create(
Isolate* isolate, icu::BreakIterator* break_iterator,
JSSegmenter::Granularity granularity, Handle<String> text) {
CHECK_NOT_NULL(break_iterator);
// 1. Let iterator be ObjectCreate(%SegmentIteratorPrototype%).
Handle<Map> map = Handle<Map>(
isolate->native_context()->intl_segment_iterator_map(), isolate);
Handle<JSObject> result = isolate->factory()->NewJSObjectFromMap(map);
Handle<JSSegmentIterator> segment_iterator =
Handle<JSSegmentIterator>::cast(result);
segment_iterator->set_flags(0);
segment_iterator->set_granularity(granularity);
// 2. Let iterator.[[SegmentIteratorSegmenter]] be segmenter.
Handle<Managed<icu::BreakIterator>> managed_break_iterator =
Managed<icu::BreakIterator>::FromRawPtr(isolate, 0, break_iterator);
segment_iterator->set_icu_break_iterator(*managed_break_iterator);
// 3. Let iterator.[[SegmentIteratorString]] be string.
Managed<icu::UnicodeString> unicode_string =
Intl::SetTextToBreakIterator(isolate, text, break_iterator);
segment_iterator->set_unicode_string(unicode_string);
// 4. Let iterator.[[SegmentIteratorIndex]] be 0.
// step 4 is stored inside break_iterator.
// 5. Let iterator.[[SegmentIteratorBreakType]] be undefined.
segment_iterator->set_is_break_type_set(false);
return segment_iterator;
}
// ecma402 #sec-segment-iterator-prototype-breakType
Handle<Object> JSSegmentIterator::BreakType() const {
if (!is_break_type_set()) {
return GetReadOnlyRoots().undefined_value_handle();
}
icu::BreakIterator* break_iterator = icu_break_iterator()->raw();
int32_t rule_status = break_iterator->getRuleStatus();
switch (granularity()) {
case JSSegmenter::Granularity::GRAPHEME:
return GetReadOnlyRoots().undefined_value_handle();
case JSSegmenter::Granularity::WORD:
if (rule_status >= UBRK_WORD_NONE && rule_status < UBRK_WORD_NONE_LIMIT) {
// "words" that do not fit into any of other categories. Includes spaces
// and most punctuation.
return GetReadOnlyRoots().none_string_handle();
}
if ((rule_status >= UBRK_WORD_NUMBER &&
rule_status < UBRK_WORD_NUMBER_LIMIT) ||
(rule_status >= UBRK_WORD_LETTER &&
rule_status < UBRK_WORD_LETTER_LIMIT) ||
(rule_status >= UBRK_WORD_KANA &&
rule_status < UBRK_WORD_KANA_LIMIT) ||
(rule_status >= UBRK_WORD_IDEO &&
rule_status < UBRK_WORD_IDEO_LIMIT)) {
// words that appear to be numbers, letters, kana characters,
// ideographic characters, etc
return GetReadOnlyRoots().word_string_handle();
}
return GetReadOnlyRoots().undefined_value_handle();
case JSSegmenter::Granularity::SENTENCE:
if (rule_status >= UBRK_SENTENCE_TERM &&
rule_status < UBRK_SENTENCE_TERM_LIMIT) {
// sentences ending with a sentence terminator ('.', '?', '!', etc.)
// character, possibly followed by a hard separator (CR, LF, PS, etc.)
return GetReadOnlyRoots().term_string_handle();
}
if ((rule_status >= UBRK_SENTENCE_SEP &&
rule_status < UBRK_SENTENCE_SEP_LIMIT)) {
// sentences that do not contain an ending sentence terminator ('.',
// '?', '!', etc.) character, but are ended only by a hard separator
// (CR, LF, PS, etc.) or mandatory line breaks
return GetReadOnlyRoots().sep_string_handle();
}
return GetReadOnlyRoots().undefined_value_handle();
case JSSegmenter::Granularity::COUNT:
UNREACHABLE();
}
}
// ecma402 #sec-segment-iterator-prototype-index
Handle<Object> JSSegmentIterator::Index(
Isolate* isolate, Handle<JSSegmentIterator> segment_iterator) {
icu::BreakIterator* icu_break_iterator =
segment_iterator->icu_break_iterator()->raw();
CHECK_NOT_NULL(icu_break_iterator);
return isolate->factory()->NewNumberFromInt(icu_break_iterator->current());
}
// ecma402 #sec-segment-iterator-prototype-next
MaybeHandle<JSReceiver> JSSegmentIterator::Next(
Isolate* isolate, Handle<JSSegmentIterator> segment_iterator) {
Factory* factory = isolate->factory();
icu::BreakIterator* icu_break_iterator =
segment_iterator->icu_break_iterator()->raw();
// 3. Let _previousIndex be iterator.[[SegmentIteratorIndex]].
int32_t prev = icu_break_iterator->current();
// 4. Let done be AdvanceSegmentIterator(iterator, forwards).
int32_t index = icu_break_iterator->next();
segment_iterator->set_is_break_type_set(true);
if (index == icu::BreakIterator::DONE) {
// 5. If done is true, return CreateIterResultObject(undefined, true).
return factory->NewJSIteratorResult(isolate->factory()->undefined_value(),
true);
}
// 6. Let newIndex be iterator.[[SegmentIteratorIndex]].
Handle<Object> new_index = factory->NewNumberFromInt(index);
// 8. Let segment be the substring of string from previousIndex to
// newIndex, inclusive of previousIndex and exclusive of newIndex.
Handle<String> segment;
ASSIGN_RETURN_ON_EXCEPTION(isolate, segment,
segment_iterator->GetSegment(isolate, prev, index),
JSReceiver);
// 9. Let breakType be iterator.[[SegmentIteratorBreakType]].
Handle<Object> break_type = segment_iterator->BreakType();
// 10. Let result be ! ObjectCreate(%ObjectPrototype%).
Handle<JSObject> result = factory->NewJSObject(isolate->object_function());
// 11. Perform ! CreateDataProperty(result "segment", segment).
CHECK(JSReceiver::CreateDataProperty(isolate, result,
factory->segment_string(), segment,
Just(kDontThrow))
.FromJust());
// 12. Perform ! CreateDataProperty(result, "breakType", breakType).
CHECK(JSReceiver::CreateDataProperty(isolate, result,
factory->breakType_string(), break_type,
Just(kDontThrow))
.FromJust());
// 13. Perform ! CreateDataProperty(result, "index", newIndex).
CHECK(JSReceiver::CreateDataProperty(isolate, result, factory->index_string(),
new_index, Just(kDontThrow))
.FromJust());
// 14. Return CreateIterResultObject(result, false).
return factory->NewJSIteratorResult(result, false);
}
// ecma402 #sec-segment-iterator-prototype-following
Maybe<bool> JSSegmentIterator::Following(
Isolate* isolate, Handle<JSSegmentIterator> segment_iterator,
Handle<Object> from_obj) {
Factory* factory = isolate->factory();
icu::BreakIterator* icu_break_iterator =
segment_iterator->icu_break_iterator()->raw();
// 3. If from is not undefined,
if (!from_obj->IsUndefined()) {
// a. Let from be ? ToIndex(from).
uint32_t from;
Handle<Object> index;
ASSIGN_RETURN_ON_EXCEPTION_VALUE(
isolate, index,
Object::ToIndex(isolate, from_obj, MessageTemplate::kInvalidIndex),
Nothing<bool>());
if (!index->ToArrayIndex(&from)) {
THROW_NEW_ERROR_RETURN_VALUE(
isolate,
NewRangeError(MessageTemplate::kParameterOfFunctionOutOfRange,
factory->NewStringFromStaticChars("from"),
factory->NewStringFromStaticChars("following"), index),
Nothing<bool>());
}
// b. Let length be the length of iterator.[[SegmentIteratorString]].
uint32_t length =
static_cast<uint32_t>(icu_break_iterator->getText().getLength());
// c. If from >= length, throw a RangeError exception.
if (from >= length) {
THROW_NEW_ERROR_RETURN_VALUE(
isolate,
NewRangeError(MessageTemplate::kParameterOfFunctionOutOfRange,
factory->NewStringFromStaticChars("from"),
factory->NewStringFromStaticChars("following"),
from_obj),
Nothing<bool>());
}
// d. Let iterator.[[SegmentIteratorIndex]] be from.
segment_iterator->set_is_break_type_set(true);
icu_break_iterator->following(from);
return Just(false);
}
// 4. return AdvanceSegmentIterator(iterator, forward).
// 4. .... or if direction is backwards and position is 0, return true.
// 4. If direction is forwards and position is the length of string ... return
// true.
segment_iterator->set_is_break_type_set(true);
return Just(icu_break_iterator->next() == icu::BreakIterator::DONE);
}
// ecma402 #sec-segment-iterator-prototype-preceding
Maybe<bool> JSSegmentIterator::Preceding(
Isolate* isolate, Handle<JSSegmentIterator> segment_iterator,
Handle<Object> from_obj) {
Factory* factory = isolate->factory();
icu::BreakIterator* icu_break_iterator =
segment_iterator->icu_break_iterator()->raw();
// 3. If from is not undefined,
if (!from_obj->IsUndefined()) {
// a. Let from be ? ToIndex(from).
uint32_t from;
Handle<Object> index;
ASSIGN_RETURN_ON_EXCEPTION_VALUE(
isolate, index,
Object::ToIndex(isolate, from_obj, MessageTemplate::kInvalidIndex),
Nothing<bool>());
if (!index->ToArrayIndex(&from)) {
THROW_NEW_ERROR_RETURN_VALUE(
isolate,
NewRangeError(MessageTemplate::kParameterOfFunctionOutOfRange,
factory->NewStringFromStaticChars("from"),
factory->NewStringFromStaticChars("preceding"), index),
Nothing<bool>());
}
// b. Let length be the length of iterator.[[SegmentIteratorString]].
uint32_t length =
static_cast<uint32_t>(icu_break_iterator->getText().getLength());
// c. If from > length or from = 0, throw a RangeError exception.
if (from > length || from == 0) {
THROW_NEW_ERROR_RETURN_VALUE(
isolate,
NewRangeError(MessageTemplate::kParameterOfFunctionOutOfRange,
factory->NewStringFromStaticChars("from"),
factory->NewStringFromStaticChars("preceding"),
from_obj),
Nothing<bool>());
}
// d. Let iterator.[[SegmentIteratorIndex]] be from.
segment_iterator->set_is_break_type_set(true);
icu_break_iterator->preceding(from);
return Just(false);
}
// 4. return AdvanceSegmentIterator(iterator, backwards).
// 4. .... or if direction is backwards and position is 0, return true.
segment_iterator->set_is_break_type_set(true);
return Just(icu_break_iterator->previous() == icu::BreakIterator::DONE);
}
} // namespace internal
} // namespace v8
```
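The WORD branch of `BreakType()` above works because ICU assigns word-break rule statuses in contiguous numeric ranges (none, number, letter, kana, ideographic). A standalone sketch of the same classification, with the range bounds written out as local constants — an assumption based on the values in ICU's `ubrk.h`; real code should use the `UBRK_*` enumerators directly:

```cpp
#include <cassert>
#include <string>

// Assumed range bounds, mirroring UBRK_WORD_* in ICU's ubrk.h.
enum WordStatus {
  kWordNone = 0,     kWordNoneLimit = 100,
  kWordNumber = 100, kWordNumberLimit = 200,
  kWordLetter = 200, kWordLetterLimit = 300,
  kWordKana = 300,   kWordKanaLimit = 400,
  kWordIdeo = 400,   kWordIdeoLimit = 500,
};

// Same range test BreakType() performs for WORD granularity:
// "none" for spaces and most punctuation, "word" for number/letter/
// kana/ideographic statuses, "undefined" for anything else.
std::string ClassifyWordStatus(int s) {
  if (s >= kWordNone && s < kWordNoneLimit) return "none";
  if ((s >= kWordNumber && s < kWordNumberLimit) ||
      (s >= kWordLetter && s < kWordLetterLimit) ||
      (s >= kWordKana && s < kWordKanaLimit) ||
      (s >= kWordIdeo && s < kWordIdeoLimit))
    return "word";
  return "undefined";
}
```

The SENTENCE branch follows the same pattern with the `UBRK_SENTENCE_TERM` and `UBRK_SENTENCE_SEP` ranges mapping to "term" and "sep".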
|
Naty Saidoff (born 1954/1955) is an Israeli-born American diamond dealer, real estate investor and philanthropist. He grew up on a kibbutz in Israel and emigrated to the United States for college. He started his career as a diamond dealer. He is the founder of Capital Foresight Investment, a real estate investment company with rental properties across the United States, and is a major property investor in Downtown Los Angeles. He supports pro-Israel non-profit organizations.
Early life
Naty Saidoff was born in 1954 or 1955 in Israel. He grew up on a kibbutz.
Saidoff graduated from the University of California, Los Angeles (UCLA), where he earned a bachelor's degree in Economics.
Career
Saidoff started his career as a diamond dealer.
Saidoff is the founder of Capital Foresight Investment, a real estate investment company. He owns rental properties in Denver, Colorado; Austin, Texas; Orange County, California; and Long Beach, California, where he acquired the Walker Building with Bill Lindborg in 1999 and turned the derelict hotel into a residential building.
As an investor in Bristol 423, Saidoff acquired the historic Leland Hotel, the King Edward Hotel, the Hotel Alexandria and the Baltimore Hotel with Izek Shomof and his son, Eric Shomof, in 2012. Additionally, they acquired the Santa Fe Lofts at Sixth Street and Main Street, the Binford Lofts at 837 Traction Avenue, the Title Insurance Building at 433 South Spring Street and its adjacent building at 419 South Spring Street, the Maxfield Building at 819 South Santee Street and the Capitol Garment Building at 217 East Eighth Street.
Philanthropy
Saidoff supports pro-Israel advocacy non-profit organizations. In 2015, he stated, "I think the best way to be a Zionist is to live in Israel, and since I don’t, the second-best way is to be an activist and donate money and do what I do now."
Saidoff serves as a Vice President of StandWithUs while his wife serves on its board. The Saidoffs received the StandWithUs Honorary Award in 2003. Additionally, he serves on the board of the Israeli-American Council as well as on the national board of governors of the American Jewish Committee. The Saidoffs received the AJC Community Service Award in 2006. In 2014, Saidoff and his wife were invited to meet Pope Francis in Vatican City on an AJC delegation.
With his wife, Saidoff endowed the Debbie & Naty Saidoff Center, an Adult Degree Completion Program in San Antonio, Texas.
Personal life
Saidoff and his wife, Debra, have a son, Joshua. They reside in Bel Air.
References
Living people
1950s births
Israeli emigrants to the United States
People from Bel Air, Los Angeles
University of California, Los Angeles alumni
Businesspeople from Los Angeles
Philanthropists from California
Diamond dealers
American philanthropists
American real estate businesspeople
21st-century American Jews
|
The 2009–10 season of Unirea Urziceni began on 25 July with the first training session, led by head coach Dan Petrescu. After several friendlies, the first competitive game was the Romanian Supercup against CFR Cluj on 26 July 2009. The match ended 1–1 in regular time, but CFR Cluj won the cup after a penalty shootout in which Răzvan Pădureţu, Raul Rusescu and Sorin Frunză missed their kicks.
Unirea made several squad changes, signing former Steaua captain Sorin Paraschiv, former Rapid captain Vasile Maftei and Antonio Semedo. Sorin Rădoi was loaned to Unirea Alba Iulia.
Pre-season and friendlies
Competitions
Overall record
Supercupa României
Liga I
League table
Results summary
Results by round
Matches
Cupa României
Round of 32
UEFA Champions League
Group stage
UEFA Europa League
Knockout phase
Round of 32
Players
Individual statistics
|}
Transfers
In
Out
Goals
Club
Coaching staff
References
FC Unirea Urziceni seasons
Unirea Urziceni
Unirea Urziceni
|
```php
<?php
/**
* A copy of the License is located at
*
* path_to_url
*
* or in the "license" file accompanying this file. This file is distributed
* on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either
* express or implied. See the License for the specific language governing
* permissions and limitations under the License.
*/
namespace Aws\Glacier\Model\MultipartUpload;
use Aws\Glacier\Model\MultipartUpload\UploadPartGenerator;
use Aws\Common\Client\AwsClientInterface;
use Aws\Common\Model\MultipartUpload\AbstractTransferState;
use Aws\Common\Model\MultipartUpload\UploadIdInterface;
/**
* State of a multipart upload
*/
class TransferState extends AbstractTransferState
{
const ALREADY_UPLOADED = '-';
/**
* @var UploadPartGenerator Glacier upload helper object that contains part information
*/
protected $partGenerator;
/**
* {@inheritdoc}
*/
public static function fromUploadId(AwsClientInterface $client, UploadIdInterface $uploadId)
{
$transferState = new static($uploadId);
$listParts = $client->getIterator('ListParts', $uploadId->toParams());
foreach ($listParts as $part) {
list($firstByte, $lastByte) = explode('-', $part['RangeInBytes']);
$partSize = (float) $listParts->getLastResult()->get('PartSizeInBytes');
$partData = array(
'partNumber' => $firstByte / $partSize + 1,
'checksum' => $part['SHA256TreeHash'],
'contentHash' => self::ALREADY_UPLOADED,
'size' => $lastByte - $firstByte + 1,
'offset' => $firstByte
);
$transferState->addPart(UploadPart::fromArray($partData));
}
return $transferState;
}
/**
* @param UploadPartGenerator $partGenerator Glacier upload helper object
*
* @return $this
*/
public function setPartGenerator(UploadPartGenerator $partGenerator)
{
$this->partGenerator = $partGenerator;
return $this;
}
/**
* @return UploadPartGenerator Glacier upload helper object
*/
public function getPartGenerator()
{
return $this->partGenerator;
}
}
```
|
```javascript
import * as p from '@fluentui/react/lib/HoverCard';
console.log(p);
export default {
name: 'HoverCard',
};
```
|
Silesian Upland or Silesian Highland is a highland located in Silesia and Lesser Poland, Poland.
Its highest point is the St. Anne Mountain (406 m).
See also
Silesian Lowlands
Silesian-Lusatian Lowlands
Silesian Foothills
Silesian-Moravian Foothills
Landforms of Silesian Voivodeship
Plateaus of Poland
|
Palazzo Hercolani is a palace in Forlì, Emilia-Romagna, Italy. There is also a Palazzo Hercolani in Bologna.
Until 1844 it belonged to the ancient Hercolani family of Forlì. The last Hercolani heir to live in the palace was Fabrizio Gaddi Hercolani, son of Cesarina Hercolani and Lepido Gaddi Hercolani. In 1844 Fabrizio sold the palace to Count Sesto Matteucci, who carried out a great number of renovations, although the palace's plan was left unchanged.
The palace was the product of the unification of three different buildings. Sesto Matteucci gave the palace front a more homogeneous look and improved the palace's facilities and decoration.
In 1866 the palace became a possession of the Guarini family, when Vittoria Matteucci married Count Filippo Guarini. In 1946 the palace was sold by the last Guarini heirs to the cooperative society Carlo Marx.
The palace houses a work by the Italian painter Pompeo Randi named La Beata Vergine del Fuoco con i Santi Mercuriale, Pellegrino, Marcolino e Valeriano, showing the Sesto Matteucci coat of arms and a landscape of Forlì in the background. In all, the palace houses six frescoed ceilings and two further wall frescoes. All the decorations date back to the 19th century.
During the 1980s the palace was renovated by Assicoop Romagna, which currently has its seat there. Thanks to its unique elegance and facilities, the palace often hosts important events in the town.
External links
The presentation of a book written by Nando dalla Chiesa (Undersecretary for Research and University) in the courtyard of Palazzo Hercolani
Hercolani, Palazzo
|
The Boston Rowing Marathon is a rowing head race taking place on the third Sunday of September annually in Lincolnshire, England, over the exceptionally long distance of 49.2 km (30.6 miles). The course is along the River Witham from Lincoln to Boston.
Overview
The event started as a one-off in 1946, the result of a pub bet, and was big news for the town. A single coxed four of seniors rowed from central Boston to the Brayford Pool in central Lincoln (against the river flow) and took around six hours, over a distance, measured at the time, of 34 miles.
On 26 October 1947, a teenage coxed four from Boston Rowing Club took on the challenge of beating the 1946 time. Aubrey Fox, Deg Borman, Bill Gale and Bill Lockwood (also known as Dennis) completed the course in 4 hrs 11 mins and were each awarded an inscribed tankard by the mayor of Boston, TM Moffatt. In 1948 a solo rower covered the course (time unknown).
In 1949, Crowland Rowing Club became involved and the course was reversed to finish at Boston. It was easier to row with the flow and preferable to finish at the boathouse with the pub next door. In 1950 the event was opened to all competitors and has remained so to this day. The event is now organised by Boston Rowing Club.
The long distance of the event makes it unique in British rowing and thus attracts a lot of entries; some competing for a time, others only wanting to complete the distance. The event is also unusual in accepting entries from all crews and categories.
The current record for the 30.6 mile course is 2 h 59 min 45 s, set in 1991 by a University of London Boat Club men's eight. The race was cancelled in 2000, due to that year's fuel crisis, in 2011 due to an unusually prolific growth of water weed, and again in 2020 due to the COVID-19 pandemic. In 2021 the event was again cancelled due to COVID-19, but a solo club rower completed the course on the scheduled day.
Course
The start is a set of landing stages at Lincoln Rowing Centre, Stamp End Lock, Waterside South, Lincoln. The centre was founded in 2006 in response to the marathon being the only rowing hosted in Lincoln. The race follows the straightened Witham downstream to the finish line at Boston Rowing Club's boathouse, 660 m north of the first bridges in Boston (overlapping road and rail bridges). The course contains one lock.
See also
The Great River Race on the River Thames, founded in 1988, is a 22-mile rowing race, but is only open to traditional boats.
The Ringvaart Regatta, founded in 1976, is a 100 km rowing race on the Ringvaart in the Netherlands.
References
External links
Website
Rowing competitions in the United Kingdom
Sport in Lincoln, England
Recurring sporting events established in 1946
Sport in Boston, Lincolnshire
|
Fánk is a traditional Hungarian sweet doughnut. The most commonly used ingredients are flour, yeast, butter, egg yolk, a little rum, salt, milk, and oil for deep frying. After the dough has risen for approximately 30 minutes, the result is an extremely light doughnut-like pastry. Fánk is traditionally served with powdered sugar and lekvár, a thick Hungarian jam, most commonly apricot.
See also
List of doughnut varieties
External links
Doughnuts
Hungarian desserts
Hungarian pastries
|
The Basilica of Holy Trinity () in Kraków, Poland, is a Catholic basilica. Built in the Gothic style, it also houses a monastery of the Order of Preachers (Dominicans). Its history dates from the year 1223.
Hyacinth, a Catholic saint, is buried in the church, as well as Polish monarch Leszek II the Black and Renaissance humanist Filippo Buonaccorsi.
References
External links
Holy Trinity
Dominican churches in Poland
1200s establishments
Burial sites of the Piast dynasty
The Most Holy Virgin Mary, Queen of Poland
|
```go
//
// Permission is hereby granted, free of charge, to any person obtaining a copy
// of this software and associated documentation files (the "Software"), to deal
// in the Software without restriction, including without limitation the rights
// to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
// copies of the Software, and to permit persons to whom the Software is
// furnished to do so, subject to the following conditions:
//
// The above copyright notice and this permission notice shall be included in
// all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
// AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
// OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
// THE SOFTWARE.
// Package httpd contains http routers.
package httpd
import (
"net/http"
"strings"
"github.com/m3db/m3/src/query/api/v1/options"
"github.com/m3db/m3/src/x/headers"
)
// router dispatches query requests to either the PromQL or the M3 query
// handler, selected per request by header or URL parameter.
type router struct {
promqlHandler func(http.ResponseWriter, *http.Request)
m3QueryHandler func(http.ResponseWriter, *http.Request)
defaultQueryEngine options.QueryEngine
}
// NewQueryRouter returns an unconfigured query router; Setup must be
// called before serving requests.
func NewQueryRouter() options.QueryRouter {
return &router{}
}
// Setup stores the handlers and the default engine, falling back to the
// Prometheus engine when the configured default is unrecognized.
func (r *router) Setup(opts options.QueryRouterOptions) {
defaultEngine := opts.DefaultQueryEngine
if defaultEngine != options.PrometheusEngine && defaultEngine != options.M3QueryEngine {
defaultEngine = options.PrometheusEngine
}
r.defaultQueryEngine = defaultEngine
r.promqlHandler = opts.PromqlHandler
r.m3QueryHandler = opts.M3QueryHandler
}
// ServeHTTP resolves the engine for the request (a URL parameter overrides
// the header), echoes the choice back in a response header, and dispatches
// to the matching handler.
func (r *router) ServeHTTP(w http.ResponseWriter, req *http.Request) {
engine := strings.ToLower(req.Header.Get(headers.EngineHeaderName))
urlParam := req.URL.Query().Get(EngineURLParam)
if len(urlParam) > 0 {
engine = strings.ToLower(urlParam)
}
if !options.IsQueryEngineSet(engine) {
engine = string(r.defaultQueryEngine)
}
w.Header().Add(headers.EngineHeaderName, engine)
if engine == string(options.M3QueryEngine) {
r.m3QueryHandler(w, req)
return
}
r.promqlHandler(w, req)
}
```
|
Shirley Kaneda (born 1951) is an abstract painter and artist based in New York City.
Early life
Shirley Kaneda is an American artist who was born in Tokyo to Korean-born parents. She was educated in English, attending The American School in Japan. She came to New York in 1970 to attend Parsons School of Design and has since lived and worked in New York City. She became a naturalized American citizen in 1986.
Work
Shirley Kaneda started exhibiting her work in the late 1980s in New York at such venues as White Columns as well as in the landmark show, Conceptual Abstraction at Sidney Janis Gallery. She has had numerous solo exhibitions both nationally and internationally at such venues as Jack Shainman Gallery, Feigen Contemporary, Danese Gallery and Gallery Richard in New York as well as Mark Moore Gallery in LA, Bernard Jacobson Gallery in London, Annandale Gallery in Sydney, Australia, Raffaella Cortese Gallery in Milan, Galerie Schuster in Germany, Centre d’Art D’Ivry in Paris and Centre d’Art Contemporain in Sete, France, among others.
The critic Matt Biro has said, “For more than three decades Shirley Kaneda has expanded the possibilities of abstract painting in a number of unique and thought-provoking ways. An artist who pushes the limits of painterly form today, Kaneda is an analytical and historically informed painter. She excels in juxtaposing a wide variety of gestures, shapes, and patterns in a manner that suggests an archaeology of twentieth century modernism.”
In rethinking abstraction, she has focused on two of its greatest deficits—its inherent decorativeness and opticality. "By re-establishing the content of the aesthetic or how it’s addressed, the range of qualities represented by the decorative can be utilized, which appeal primarily to the senses to establish a form of signification for them that will make their content and presence tangible." Furthermore, in discussing her work, she has said, “The idea of regaining art's importance in the area of aesthetics through the decorative is linked to the view that art may still show us a way of intensifying our perception and reflexivity. One of the reasons for constructing my paintings the way I have is that I hope they will hold back automatic responses, disrupting expectations, and pre-conceptions, hopefully connecting the viewer with overlooked or unconscious modes of thought. In my view this can open up the supposedly closed system of abstract painting's subjects and aesthetics. And in terms of how the decorative can be understood metaphorically in my painting, I use it to promote such non-heroic themes as, beauty, fluidity, variation and so on. By exploiting and building on discriminatory concepts, I hope to continue the process of demystifying such traditionally masculine values as the heroic, the aggressive, and the rational.”
Kaneda was a Guggenheim Fellow in 1999, and has received grants from the National Endowment for the Arts, Pollock-Krasner Foundation, the Elizabeth Foundation, as well as The American Academy of Arts and Letters Purchase Award, and CCA Andratx Artist Residency. Her work has been reviewed in various publications such as The New York Times, Art in America, The New Yorker, Art News, Time Out, Contemporary, Art Critical, Huffington Post, Art Issues among many others. Her works are in the collection of such museums as The American University Museum at the Katzen Art Center, Neuberger Museum of Art, SUNY Purchase, David Winton Bell Gallery, Brown University, Escalette Collection of Art, Chapman University, Princeton University Art Museum, and Virginia Museum of Fine Arts as well as in many private and corporate collections. She has also written essays and criticism for Arts Magazine, Art Journal, Journal of Contemporary Painting, Women and Performance among others. In 1991, she wrote the essay, “Painting and Its Others, The Feminine in Abstract Painting", for ARTS MAGAZINE which has been anthologized in “PAINTING,” edited by Terry R. Myers, in the series Documents of Contemporary Art, published by White Chapel Gallery and the MIT Press.
She has also conducted many interviews for Bomb Magazine since 1991 with artists such as Jonathan Lasker, Philip Taaffe, Valerie Jaudon, Shirley Jaffe, Robert Mangold, Mira Schor, and Charline Von Heyl among others. Kaneda was Assistant Professor at Virginia Commonwealth University as the Thalheimer Faculty Fellow (1999–2001), Associate Professor at Claremont Graduate University (2001–2003), and was tenured Professor of Painting at Pratt Institute in New York from 2003 to 2017.
References
Mathew Biro, CONTEMPORARY, "Shirley Kaneda, Fluid Transitions" #81, 2006
Shirley Kaneda, "Artist Statement," Fall 2018 Lectures, November 13, 2018, New York Studio School
Terry R. Myers, PAINTING, "Shirley Kaneda, Painting and Its Others:In the Realm of the Feminine, 1991//072," The MIT Press, 2011, p. 72-80
External links
Shirley Kaneda (her own website)
Shirley Kaneda (her faculty page at Pratt Institute)
(video of a Kaneda exhibit in Paris)
1951 births
Living people
20th-century Japanese painters
20th-century Japanese women artists
21st-century Japanese women artists
Abstract painters
American art educators
Claremont Graduate University faculty
Parsons School of Design alumni
Pratt Institute faculty
Virginia Commonwealth University faculty
|
```javascript
/**
* @license Apache-2.0
*
*
*
* path_to_url
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
'use strict';
// MODULES //
var bench = require( '@stdlib/bench' );
var IS_BROWSER = require( '@stdlib/assert/is-browser' );
var isArray = require( '@stdlib/assert/is-array' );
var pkg = require( './../package.json' ).name;
var afinn111 = require( './../lib' );
// VARIABLES //
var opts = {
'skip': IS_BROWSER
};
// MAIN //
bench( pkg, opts, function benchmark( b ) {
var data;
var i;
b.tic();
for ( i = 0; i < b.iterations; i++ ) {
data = afinn111();
if ( data.length === 0 ) {
b.fail( 'should have a length greater than 0' );
}
}
b.toc();
if ( !isArray( data ) ) {
b.fail( 'should return an array' );
}
b.pass( 'benchmark finished' );
b.end();
});
```
|
Rhianna Patrick (born 1977) is a Torres Strait Islander radio personality. Born in Brisbane, she lived in Weipa as a child before returning to Brisbane at the age of 10. She graduated with a Bachelor of Arts from the University of Queensland in 1999, with a double major in journalism, and has worked in both radio and television.
Career
While studying she worked at the 4AAA Indigenous radio station as its breakfast announcer for nearly three years, and also as the rugby reporter on the station's sports show. In 2002 she joined the ABC in Sydney as a cadet and spent a year in Mackay. She returned to Sydney in 2004, working as a newsreader in the Triple J news team for the ABC, where she has continued her career in radio and television.
She worked on the Indigenous television program Message Stick in various roles including as an Associate Producer/Researcher.
She produced, directed and wrote A Close Shave (2002), a documentary about a young Torres Strait Islander boy; directed Coming of the Light (2006); wrote Troubled Waters (2007); and co-wrote Stylin'up (2007), all part of the Message Stick series.
For five years until early 2014, Patrick hosted Speaking Out on radio. She then moved to Brisbane until mid-April to act as stand-in host of Afternoon in south-east Queensland.
In 2014 she also presented stories on Awaye!, Radio National's Indigenous arts and culture programme. In 2017 she presented a Sunday-evening radio series covering films, music, nostalgia and current news topics, as well as playing music.
Patrick was scheduled to appear at two events at the 2017 Brisbane Writers Festival in Brisbane, Queensland, Australia.
Private life
She is married to David White, the former producer of Speaking Out.
References
External links
Speaking Out Website
Message Stick: Rhianna Patrick
Australian women radio presenters
1977 births
Living people
Triple J announcers
|
```csharp
using System;
using Newtonsoft.Json;
namespace TemplateProject2
{
class Program
{
static void Main(string[] args)
{
Console.WriteLine("Hello World!");
string result = JsonConvert.SerializeObject(new { a = "test" });
Console.WriteLine(result); // {"a":"test"}
}
}
}
```
|
Weightlifting at the 2018 Asian Games was held at the Jakarta International Expo Hall A, Jakarta, Indonesia, from 20 to 27 August 2018.
China and Kazakhstan did not participate as their weightlifting federations were suspended by the IWF from October 2017 to October 2018.
Schedule
Medalists
Men
Women
Medal table
Participating nations
A total of 164 athletes from 29 nations competed in weightlifting at the 2018 Asian Games:
References
External links
Weightlifting at the 2018 Asian Games
Results
Results book
2018 Asian Games events
2018
Asian Games
Asian Games
|
```java
/* ===========================================================
* JFreeChart : a free chart library for the Java(tm) platform
* ===========================================================
*
*
* Project Info: path_to_url
*
 * This library is free software; you can redistribute it and/or modify it
 * under the terms of the GNU Lesser General Public License as published by
 * the Free Software Foundation; either version 2.1 of the License, or
 * (at your option) any later version.
 *
 * This library is distributed in the hope that it will be useful, but
 * WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY
 * or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public
 * License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with this library; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301,
 * USA.
*
* [Oracle and Java are registered trademarks of Oracle and/or its affiliates.
* Other names may be trademarks of their respective owners.]
*
* ----------------------
* DefaultAxisEditor.java
* ----------------------
*
* Original Author: David Gilbert;
* Contributor(s): Andrzej Porebski;
* Arnaud Lelievre;
*
*/
package org.jfree.chart.swing.editor;
import java.awt.BorderLayout;
import java.awt.Color;
import java.awt.Font;
import java.awt.Paint;
import java.awt.event.ActionEvent;
import java.awt.event.ActionListener;
import java.util.ResourceBundle;
import javax.swing.BorderFactory;
import javax.swing.JButton;
import javax.swing.JCheckBox;
import javax.swing.JColorChooser;
import javax.swing.JLabel;
import javax.swing.JOptionPane;
import javax.swing.JPanel;
import javax.swing.JTabbedPane;
import javax.swing.JTextField;
import org.jfree.chart.axis.Axis;
import org.jfree.chart.axis.LogAxis;
import org.jfree.chart.axis.NumberAxis;
import org.jfree.chart.api.RectangleInsets;
/**
* A panel for editing the properties of an axis.
*/
class DefaultAxisEditor extends JPanel implements ActionListener {
/** The axis label. */
private final JTextField label;
/** The label font. */
private Font labelFont;
/** The label paint. */
private final PaintSample labelPaintSample;
/** A field showing a description of the label font. */
private final JTextField labelFontField;
/** The font for displaying tick labels on the axis. */
private Font tickLabelFont;
/**
* A field containing a description of the font for displaying tick labels
* on the axis.
*/
private final JTextField tickLabelFontField;
/** The paint (color) for the tick labels. */
private final PaintSample tickLabelPaintSample;
/**
* An empty sub-panel for extending the user interface to handle more
* complex axes.
*/
private JPanel slot1;
/**
* An empty sub-panel for extending the user interface to handle more
* complex axes.
*/
private JPanel slot2;
/** A flag that indicates whether or not the tick labels are visible. */
private final JCheckBox showTickLabelsCheckBox;
/** A flag that indicates whether or not the tick marks are visible. */
private final JCheckBox showTickMarksCheckBox;
// /** Insets text field. */
// private InsetsTextField tickLabelInsetsTextField;
//
// /** Label insets text field. */
// private InsetsTextField labelInsetsTextField;
/** The tick label insets. */
private final RectangleInsets tickLabelInsets;
/** The label insets. */
private final RectangleInsets labelInsets;
/** A tabbed pane for... */
private final JTabbedPane otherTabs;
/** The resourceBundle for the localization. */
protected static ResourceBundle localizationResources
= ResourceBundle.getBundle("org.jfree.chart.editor.LocalizationBundle");
/**
* A static method that returns a panel that is appropriate for the axis
* type.
*
* @param axis the axis whose properties are to be displayed/edited in
* the panel.
*
* @return A panel or {@code null} if axis is {@code null}.
*/
public static DefaultAxisEditor getInstance(Axis axis) {
if (axis != null) {
// figure out what type of axis we have and instantiate the
// appropriate panel
if (axis instanceof NumberAxis) {
    return new DefaultNumberAxisEditor((NumberAxis) axis);
}
if (axis instanceof LogAxis) {
    return new DefaultLogAxisEditor((LogAxis) axis);
}
return new DefaultAxisEditor(axis);
}
else {
return null;
}
}
/**
* Standard constructor: builds a panel for displaying/editing the
* properties of the specified axis.
*
* @param axis the axis whose properties are to be displayed/edited in
* the panel.
*/
public DefaultAxisEditor(Axis axis) {
this.labelFont = axis.getLabelFont();
this.labelPaintSample = new PaintSample(axis.getLabelPaint());
this.tickLabelFont = axis.getTickLabelFont();
this.tickLabelPaintSample = new PaintSample(axis.getTickLabelPaint());
// Insets values
this.tickLabelInsets = axis.getTickLabelInsets();
this.labelInsets = axis.getLabelInsets();
setLayout(new BorderLayout());
JPanel general = new JPanel(new BorderLayout());
general.setBorder(
BorderFactory.createTitledBorder(
BorderFactory.createEtchedBorder(),
localizationResources.getString("General")
)
);
JPanel interior = new JPanel(new LCBLayout(5));
interior.setBorder(BorderFactory.createEmptyBorder(0, 5, 0, 5));
interior.add(new JLabel(localizationResources.getString("Label")));
this.label = new JTextField(axis.getLabel());
interior.add(this.label);
interior.add(new JPanel());
interior.add(new JLabel(localizationResources.getString("Font")));
this.labelFontField = new FontDisplayField(this.labelFont);
interior.add(this.labelFontField);
JButton b = new JButton(localizationResources.getString("Select..."));
b.setActionCommand("SelectLabelFont");
b.addActionListener(this);
interior.add(b);
interior.add(new JLabel(localizationResources.getString("Paint")));
interior.add(this.labelPaintSample);
b = new JButton(localizationResources.getString("Select..."));
b.setActionCommand("SelectLabelPaint");
b.addActionListener(this);
interior.add(b);
// interior.add(
// new JLabel(localizationResources.getString("Label_Insets"))
// );
// b = new JButton(localizationResources.getString("Edit..."));
// b.setActionCommand("LabelInsets");
// b.addActionListener(this);
// this.labelInsetsTextField = new InsetsTextField(this.labelInsets);
// interior.add(this.labelInsetsTextField);
// interior.add(b);
//
// interior.add(
// new JLabel(localizationResources.getString("Tick_Label_Insets"))
// );
// b = new JButton(localizationResources.getString("Edit..."));
// b.setActionCommand("TickLabelInsets");
// b.addActionListener(this);
// this.tickLabelInsetsTextField
// = new InsetsTextField(this.tickLabelInsets);
// interior.add(this.tickLabelInsetsTextField);
// interior.add(b);
general.add(interior);
add(general, BorderLayout.NORTH);
this.slot1 = new JPanel(new BorderLayout());
JPanel other = new JPanel(new BorderLayout());
other.setBorder(BorderFactory.createTitledBorder(
BorderFactory.createEtchedBorder(),
localizationResources.getString("Other")));
this.otherTabs = new JTabbedPane();
this.otherTabs.setBorder(BorderFactory.createEmptyBorder(0, 5, 0, 5));
JPanel ticks = new JPanel(new LCBLayout(3));
ticks.setBorder(BorderFactory.createEmptyBorder(4, 4, 4, 4));
this.showTickLabelsCheckBox = new JCheckBox(
localizationResources.getString("Show_tick_labels"),
axis.isTickLabelsVisible()
);
ticks.add(this.showTickLabelsCheckBox);
ticks.add(new JPanel());
ticks.add(new JPanel());
ticks.add(
new JLabel(localizationResources.getString("Tick_label_font"))
);
this.tickLabelFontField = new FontDisplayField(this.tickLabelFont);
ticks.add(this.tickLabelFontField);
b = new JButton(localizationResources.getString("Select..."));
b.setActionCommand("SelectTickLabelFont");
b.addActionListener(this);
ticks.add(b);
this.showTickMarksCheckBox = new JCheckBox(
localizationResources.getString("Show_tick_marks"),
axis.isTickMarksVisible()
);
ticks.add(this.showTickMarksCheckBox);
ticks.add(new JPanel());
ticks.add(new JPanel());
this.otherTabs.add(localizationResources.getString("Ticks"), ticks);
other.add(this.otherTabs);
this.slot1.add(other);
this.slot2 = new JPanel(new BorderLayout());
this.slot2.add(this.slot1, BorderLayout.NORTH);
add(this.slot2);
}
/**
* Returns the current axis label.
*
* @return The current axis label.
*/
public String getLabel() {
return this.label.getText();
}
/**
* Returns the current label font.
*
* @return The current label font.
*/
public Font getLabelFont() {
return this.labelFont;
}
/**
* Returns the current label paint.
*
* @return The current label paint.
*/
public Paint getLabelPaint() {
return this.labelPaintSample.getPaint();
}
/**
* Returns a flag that indicates whether or not the tick labels are visible.
*
* @return {@code true} if tick mark labels are visible.
*/
public boolean isTickLabelsVisible() {
return this.showTickLabelsCheckBox.isSelected();
}
/**
* Returns the font used to draw the tick labels (if they are showing).
*
* @return The font used to draw the tick labels.
*/
public Font getTickLabelFont() {
return this.tickLabelFont;
}
/**
* Returns the current tick label paint.
*
* @return The current tick label paint.
*/
public Paint getTickLabelPaint() {
return this.tickLabelPaintSample.getPaint();
}
/**
* Returns the current value of the flag that determines whether or not
* tick marks are visible.
*
* @return {@code true} if tick marks are visible.
*/
public boolean isTickMarksVisible() {
return this.showTickMarksCheckBox.isSelected();
}
/**
* Returns the current tick label insets value
*
* @return The current tick label insets value.
*/
public RectangleInsets getTickLabelInsets() {
return (this.tickLabelInsets == null)
? new RectangleInsets(0, 0, 0, 0)
: this.tickLabelInsets;
}
/**
* Returns the current label insets value
*
* @return The current label insets value.
*/
public RectangleInsets getLabelInsets() {
return (this.labelInsets == null)
? new RectangleInsets(0, 0, 0, 0) : this.labelInsets;
}
/**
* Returns a reference to the tabbed pane.
*
* @return A reference to the tabbed pane.
*/
public JTabbedPane getOtherTabs() {
return this.otherTabs;
}
/**
* Handles user interaction with the property panel.
*
* @param event information about the event that triggered the call to
* this method.
*/
@Override
public void actionPerformed(ActionEvent event) {
String command = event.getActionCommand();
if (command.equals("SelectLabelFont")) {
attemptLabelFontSelection();
}
else if (command.equals("SelectLabelPaint")) {
attemptModifyLabelPaint();
}
else if (command.equals("SelectTickLabelFont")) {
attemptTickLabelFontSelection();
}
// else if (command.equals("LabelInsets")) {
// editLabelInsets();
// }
// else if (command.equals("TickLabelInsets")) {
// editTickLabelInsets();
// }
}
/**
* Presents a font selection dialog to the user.
*/
private void attemptLabelFontSelection() {
FontChooserPanel panel = new FontChooserPanel(this.labelFont);
int result = JOptionPane.showConfirmDialog(this, panel,
localizationResources.getString("Font_Selection"),
JOptionPane.OK_CANCEL_OPTION, JOptionPane.PLAIN_MESSAGE);
if (result == JOptionPane.OK_OPTION) {
this.labelFont = panel.getSelectedFont();
this.labelFontField.setText(
this.labelFont.getFontName() + " " + this.labelFont.getSize()
);
}
}
/**
* Allows the user the opportunity to change the outline paint.
*/
private void attemptModifyLabelPaint() {
Color c;
c = JColorChooser.showDialog(
this, localizationResources.getString("Label_Color"), Color.BLUE
);
if (c != null) {
this.labelPaintSample.setPaint(c);
}
}
/**
* Presents a tick label font selection dialog to the user.
*/
public void attemptTickLabelFontSelection() {
FontChooserPanel panel = new FontChooserPanel(this.tickLabelFont);
int result = JOptionPane.showConfirmDialog(this, panel,
localizationResources.getString("Font_Selection"),
JOptionPane.OK_CANCEL_OPTION, JOptionPane.PLAIN_MESSAGE);
if (result == JOptionPane.OK_OPTION) {
this.tickLabelFont = panel.getSelectedFont();
this.tickLabelFontField.setText(
this.tickLabelFont.getFontName() + " "
+ this.tickLabelFont.getSize()
);
}
}
// /**
// * Presents insets chooser panel allowing user to modify tick label's
// * individual insets values. Updates the current insets text field if
// * edit is accepted.
// */
// private void editTickLabelInsets() {
// InsetsChooserPanel panel = new InsetsChooserPanel(
// this.tickLabelInsets);
// int result = JOptionPane.showConfirmDialog(
// this, panel, localizationResources.getString("Edit_Insets"),
// JOptionPane.PLAIN_MESSAGE
// );
//
// if (result == JOptionPane.OK_OPTION) {
// this.tickLabelInsets = panel.getInsets();
// this.tickLabelInsetsTextField.setInsets(this.tickLabelInsets);
// }
// }
//
// /**
// * Presents insets chooser panel allowing user to modify label's
// * individual insets values. Updates the current insets text field if edit
// * is accepted.
// */
// private void editLabelInsets() {
// InsetsChooserPanel panel = new InsetsChooserPanel(this.labelInsets);
// int result = JOptionPane.showConfirmDialog(
// this, panel, localizationResources.getString("Edit_Insets"),
// JOptionPane.PLAIN_MESSAGE
// );
//
// if (result == JOptionPane.OK_OPTION) {
// this.labelInsets = panel.getInsets();
// this.labelInsetsTextField.setInsets(this.labelInsets);
// }
// }
/**
* Sets the properties of the specified axis to match the properties
* defined on this panel.
*
* @param axis the axis.
*/
public void setAxisProperties(Axis axis) {
axis.setLabel(getLabel());
axis.setLabelFont(getLabelFont());
axis.setLabelPaint(getLabelPaint());
axis.setTickMarksVisible(isTickMarksVisible());
// axis.setTickMarkStroke(getTickMarkStroke());
axis.setTickLabelsVisible(isTickLabelsVisible());
axis.setTickLabelFont(getTickLabelFont());
axis.setTickLabelPaint(getTickLabelPaint());
axis.setTickLabelInsets(getTickLabelInsets());
axis.setLabelInsets(getLabelInsets());
}
}
```
|
```scss
// UTILITY MIXINS
// --------------------------------------------------
// For clearing floats like a boss h5bp.com/q
@mixin clearfix {
*zoom: 1;
&:before,
&:after {
display: table;
content: "";
// Fixes Opera/contenteditable bug:
// path_to_url#comment-36952
line-height: 0;
}
&:after {
clear: both;
}
}
// Center-align a block level element
// ----------------------------------
@mixin center-block() {
display: block;
margin-left: auto;
margin-right: auto;
}
// ROUND CORNERS
// --------------------------------------------------
// .border-radius(VALUE,VALUE,VALUE,VALUE);
@mixin border-radius($topright: 0, $bottomright: 0, $bottomleft: 0, $topleft: 0) {
-webkit-border-top-right-radius : $topright;
-webkit-border-bottom-right-radius : $bottomright;
-webkit-border-bottom-left-radius : $bottomleft;
-webkit-border-top-left-radius : $topleft;
-moz-border-radius-topright : $topright;
-moz-border-radius-bottomright : $bottomright;
-moz-border-radius-bottomleft : $bottomleft;
-moz-border-radius-topleft : $topleft;
border-top-right-radius : $topright;
border-bottom-right-radius : $bottomright;
border-bottom-left-radius : $bottomleft;
border-top-left-radius : $topleft;
-webkit-background-clip : padding-box;
-moz-background-clip : padding;
background-clip : padding-box;
}
// .rounded(VALUE);
@mixin rounded($radius:4px) {
-webkit-border-radius : $radius;
-moz-border-radius : $radius;
border-radius : $radius;
}
// TYPOGRAPHY
// --------------------------------------------------
// Full-fat vertical rhythm
// ------------------------
@mixin font-size($size) {
font-size: 0px + $size;
font-size: 0rem + $size / $doc-font-size;
line-height: 0 + round($doc-line-height / $size*10000) / 10000;
margin-bottom: 0px + $doc-line-height;
margin-bottom: 0rem + ($doc-line-height / $doc-font-size);
}
// Just the REMs
// -------------
@mixin font-rem($size) {
font-size: 0px + $size;
font-size: 0rem + $size / $doc-font-size;
}
// Just font-size and line-height
// ------------------------------
@mixin font($size) {
font-size: 0px + $size;
font-size: 0rem + $size / $doc-font-size;
line-height: 0 + round($doc-line-height / $size*10000) / 10000;
}
// GRADIENTS
// --------------------------------------------------
@mixin horizontal($startColor : $color_gallery, $midColor : $color_stack, $endColor : $color_gallery) {
background-color: $endColor;
background-image : -webkit-linear-gradient(left, $startColor, $midColor, $endColor); // Safari 5.1+, Chrome 10+
background-image : -moz-linear-gradient(left, $startColor, $midColor, $endColor); // FF 3.6+
background-image : -ms-linear-gradient(left, $startColor, $midColor, $endColor); // IE10
background-image : -o-linear-gradient(left, $startColor, $midColor, $endColor); // Opera 11.10
background-image : linear-gradient(to right, $startColor, $midColor, $endColor); // W3C
background-repeat : repeat-x;
}
// TRANSFORMATIONS
// --------------------------------------------------
// .transition(PROPERTY DURATION DELAY(OPTIONAL) TIMING-FUNCTION);
@mixin transition($transition) {
-webkit-transition : $transition;
-moz-transition : $transition;
-ms-transition : $transition;
-o-transition : $transition;
transition : $transition;
}
```
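A hypothetical usage of the mixins above (the selector name and color values are illustrative, and `$doc-font-size` / `$doc-line-height` are assumed to be defined elsewhere in the project, since the typography mixins reference them):

```scss
// Example consumer rule, assuming $doc-font-size: 16 and $doc-line-height: 24
.feature-card {
  @include clearfix;                          // contain floated children
  @include rounded(6px);                      // uniform 6px corner radius
  @include horizontal(#fff, #eee, #ddd);      // left-to-right gradient
  @include transition(opacity 0.2s ease-in);  // vendor-prefixed transition
  @include font-size(16);                     // px + rem sizing with rhythm margin
}
```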
|
```html+django
{{#imports }}
{{{.}}}
{{/imports}}
{{#moduleDclns }}
{{{.}}}
{{/moduleDclns}}
{{#varDclns}}
{{#isAny }}
{{{prefix}}} {{{typeLHS}}} {{{name}}} = // value
<{{{type}}}> __recall_any("");
{{/isAny}}
{{^isAny }}
{{{prefix}}} {{{typeLHS}}} {{{name}}} = // value
<{{{type}}}> __recall_any_error("");
{{/isAny}}
{{/varDclns}}
{{{lastVarDcln}}}
// Define built-in functions
any|error {{{exprVarName}}} = ();
function __java_recall(handle context_id, handle name) returns any|error { }
function __java_memorize(handle context_id, handle name, any|error value) { }
function __recall_any(string name) returns any { }
function __recall_any_error(string name) returns any|error { }
function __memorize(string name, any|error value) { }
public function __run() returns any { }
function __stmts() returns any { }
function init() { }
public function main() {
{{#varDclns}}
__memorize("{{{encodedName}}}", {{{name}}});
{{/varDclns}}
{{#newVarNames}}
__memorize("unknown",
{{{.}}} // Variable cannot be referenced after its declaration
);
{{/newVarNames}}
_ = java:fromString("");
}
```
|
```lua
-- LAVA PITS ------------------------------------------------------------
register_level "the_lava_pits"
{
name = "The Lava Pits",
entry = "On level @1 he entered the Lava Pits.",
welcome = "You descend into the Lava Pits. Dammit, it's hot in here!",
level = 22,
Create = function ()
level:set_generator_style( 1 )
generator.fill( "plava", area.FULL )
local lavapits_armor = {
level = 25,
type = {ITEMTYPE_ARMOR,ITEMTYPE_BOOTS},
weights = { is_unique = 5, ulavaarmor = 3 }, -- multiplicative!
}
local translation = {
['.'] = "floor",
['='] = "lava",
['>'] = "stairs",
['/'] = { "floor", item = "shell" },
['|'] = { "floor", item = "cell" },
['-'] = { "floor", item = "rocket" },
['!'] = { "floor", item = "epack" },
['?'] = { "floor", item = core.ifdiff( 3, nil, "epack" ) },
['*'] = { "floor", item = core.ifdiff( 4, nil, "epack" ) },
['A'] = { "floor", being = core.bydiff{ "sergeant", "knight", "mancubus", "revenant" } },
['1'] = { "floor", item = level:roll_item( lavapits_armor ) },
['2'] = { "floor", item = level:roll_item( lavapits_armor ) },
['3'] = { "floor", item = level:roll_item( lavapits_armor ) },
['4'] = { "floor", item = level:roll_item( lavapits_armor ) },
['5'] = { "floor", item = level:roll_item( lavapits_armor ) },
['6'] = { "floor", item = level:roll_item( lavapits_armor ) },
['7'] = { "floor", item = level:roll_item( lavapits_armor ) },
['8'] = { "floor", item = level:roll_item( lavapits_armor ) },
['9'] = { "floor", item = level:roll_item( lavapits_armor ) },
['0'] = { "floor", item = level:roll_item( lavapits_armor ) },
}
local map = [=[
============================================================================
============================================================================
==========================A./==============================!8.==============
=========================...-.======================/.=====.A.==============
=========================...|.=====================..-.=====================
==========================...=======================..============...=======
===========================.=====================================...0.======
==...======================A============================...=======...=======
=.....=====================.===========================../|.================
=..>..=====================.=============...============...============.|.==
=.....=====================.============.....==========================./.==
==...======================A============..A..===============================
===========================.=============...========../===========...A======
==========================...======================..A..=========.....?=====
=========================...|.======================..-===========.A7534====
=========================...-.===============..====================*621=====
==========================A./================.9=============================
============================================================================
]=]
generator.place_tile( translation, map, 2, 2 )
local second = core.bydiff{ "lostsoul", "cacodemon", "cacodemon", "pain" }
level:summon{ "lostsoul", 14 + 2*DIFFICULTY, cell = "lava" }
level:summon{ second, 5 + DIFFICULTY, cell = "lava" }
level:player(4,11)
level.status = 0
end,
OnKillAll = function ()
local result = level.status
if result == 0 then
ui.msg("That seems to be all of them... wait! Something is moving there, or is it just lava glow?")
level.status = 1
local element = level:summon("lava_elemental")
element.inv:add( item.new("lava_element") )
elseif result == 1 then
ui.msg("Tough son of a bitch... now to get that shiny object he left behind...")
level.status = 2
end
end,
OnExit = function ()
local result = level.status
if result == 0 then
ui.msg("Too hot dammit, I'm leaving this party...")
player:add_history("He decided it was too hot there.")
elseif result == 1 then
ui.msg("There goes my beard... at least I'm still alive.")
player:add_history("He fled there from the monstrous lava elemental.")
elseif result == 2 then
ui.msg("Lava elementals my ass. I don't care.")
player:add_badge("lava1")
if core.is_challenge("challenge_aoi") then player:add_badge("lava2") end
player:add_history("He managed to clear the Lava Pits completely!")
end
end,
}
```
|
```c
/*your_sha256_hash---------
*
* ts_public.h
* Public interface to various tsearch modules, such as
* parsers and dictionaries.
*
*
* src/include/tsearch/ts_public.h
*
*your_sha256_hash---------
*/
#ifndef _PG_TS_PUBLIC_H_
#define _PG_TS_PUBLIC_H_
#include "tsearch/ts_type.h"
/*
* Parser's framework
*/
/*
* returning type for prslextype method of parser
*/
typedef struct
{
int lexid;
char *alias;
char *descr;
} LexDescr;
/*
* Interface to headline generator
*/
typedef struct
{
uint32 selected:1,
in:1,
replace:1,
repeated:1,
skip:1,
unused:3,
type:8,
len:16;
WordEntryPos pos;
char *word;
QueryOperand *item;
} HeadlineWordEntry;
typedef struct
{
HeadlineWordEntry *words;
int32 lenwords;
int32 curwords;
int32 vectorpos; /* positions a-la tsvector */
char *startsel;
char *stopsel;
char *fragdelim;
int16 startsellen;
int16 stopsellen;
int16 fragdelimlen;
} HeadlineParsedText;
/*
* Common useful things for tsearch subsystem
*/
extern char *get_tsearch_config_filename(const char *basename,
const char *extension);
/*
* Often useful stopword list management
*/
typedef struct
{
int len;
char **stop;
} StopList;
extern void readstoplist(const char *fname, StopList *s,
char *(*wordop) (const char *));
extern bool searchstoplist(StopList *s, char *key);
/*
* Interface with dictionaries
*/
/* return struct for any lexize function */
typedef struct
{
/*----------
* Number of current variant of split word. For example the Norwegian
* word 'fotballklubber' has two variants to split: ( fotball, klubb )
* and ( fot, ball, klubb ). So, dictionary should return:
*
* nvariant lexeme
* 1 fotball
* 1 klubb
* 2 fot
* 2 ball
* 2 klubb
*
* In general, a TSLexeme will be considered to belong to the same split
* variant as the previous one if they have the same nvariant value.
* The exact values don't matter, only changes from one lexeme to next.
*----------
*/
uint16 nvariant;
uint16 flags; /* See flag bits below */
char *lexeme; /* C string */
} TSLexeme;
/* Flag bits that can appear in TSLexeme.flags */
#define TSL_ADDPOS 0x01
#define TSL_PREFIX 0x02
#define TSL_FILTER 0x04
/*
* Struct for supporting complex dictionaries like thesaurus.
* 4th argument for dictlexize method is a pointer to this
*/
typedef struct
{
bool isend; /* in: marks for lexize_info about text end is
* reached */
bool getnext; /* out: dict wants next lexeme */
void *private_state; /* internal dict state between calls with
* getnext == true */
} DictSubState;
#endif /* _PG_TS_PUBLIC_H_ */
```
|
```c
/*
* buffer.c -- generic memory buffer.
*
*
* See LICENSE for the license.
*
*/
#include "config.h"
#include <stdlib.h>
#include <stdio.h>
#include "buffer.h"
static void
buffer_cleanup(void *arg)
{
buffer_type *buffer = (buffer_type *) arg;
assert(!buffer->_fixed);
free(buffer->_data);
}
buffer_type *
buffer_create(region_type *region, size_t capacity)
{
buffer_type *buffer
= (buffer_type *) region_alloc(region, sizeof(buffer_type));
if (!buffer)
return NULL;
buffer->_data = (uint8_t *) xalloc(capacity);
buffer->_position = 0;
buffer->_limit = buffer->_capacity = capacity;
buffer->_fixed = 0;
buffer_invariant(buffer);
region_add_cleanup(region, buffer_cleanup, buffer);
return buffer;
}
void
buffer_create_from(buffer_type *buffer, void *data, size_t size)
{
assert(data);
buffer->_position = 0;
buffer->_limit = buffer->_capacity = size;
buffer->_data = (uint8_t *) data;
buffer->_fixed = 1;
buffer_invariant(buffer);
}
void
buffer_clear(buffer_type *buffer)
{
buffer_invariant(buffer);
buffer->_position = 0;
buffer->_limit = buffer->_capacity;
}
void
buffer_flip(buffer_type *buffer)
{
buffer_invariant(buffer);
buffer->_limit = buffer->_position;
buffer->_position = 0;
}
void
buffer_rewind(buffer_type *buffer)
{
buffer_invariant(buffer);
buffer->_position = 0;
}
void
buffer_set_capacity(buffer_type *buffer, size_t capacity)
{
buffer_invariant(buffer);
assert(buffer->_position <= capacity);
buffer->_data = (uint8_t *) xrealloc(buffer->_data, capacity);
buffer->_limit = buffer->_capacity = capacity;
}
void
buffer_reserve(buffer_type *buffer, size_t amount)
{
buffer_invariant(buffer);
assert(!buffer->_fixed);
if (buffer->_capacity < buffer->_position + amount) {
size_t new_capacity = buffer->_capacity * 3 / 2;
if (new_capacity < buffer->_position + amount) {
new_capacity = buffer->_position + amount;
}
buffer_set_capacity(buffer, new_capacity);
}
buffer->_limit = buffer->_capacity;
}
int
buffer_printf(buffer_type *buffer, const char *format, ...)
{
va_list args;
int written;
size_t remaining;
buffer_invariant(buffer);
assert(buffer->_limit == buffer->_capacity);
remaining = buffer_remaining(buffer);
va_start(args, format);
written = vsnprintf((char *) buffer_current(buffer), remaining,
format, args);
va_end(args);
if (written >= 0 && (size_t) written >= remaining) {
buffer_reserve(buffer, written + 1);
va_start(args, format);
written = vsnprintf((char *) buffer_current(buffer),
buffer_remaining(buffer),
format, args);
va_end(args);
}
buffer->_position += written;
return written;
}
```
|
Euil "Snitz" Snider (December 9, 1905 – February 9, 1978) was an American sprinter. He competed in the men's 400 metres at the 1928 Summer Olympics. He graduated from Oak Grove High School in Jefferson County, Alabama, and was on the track team at Auburn University. He was the head football coach at Bessemer High School from 1933 to 1963 and was later inducted into the Alabama Sports Hall of Fame. In 1972, Bessemer Stadium, where he had coached the Tigers for thirty years, was renamed "Snitz Snider Stadium" in his honor. Snider died on February 9, 1978.
Head coaching record
References
External links
1905 births
1978 deaths
American male sprinters
American men's basketball players
Auburn Tigers football players
Auburn Tigers men's basketball players
College track and field coaches in the United States
High school baseball coaches in the United States
High school basketball coaches in Alabama
High school football coaches in Alabama
Samford Bulldogs athletic directors
Samford Bulldogs football coaches
Athletes (track and field) at the 1928 Summer Olympics
Olympic track and field athletes for the United States
Sportspeople from Jefferson County, Alabama
Coaches of American football from Alabama
Players of American football from Alabama
Baseball coaches from Alabama
Basketball coaches from Alabama
Basketball players from Alabama
Track and field athletes from Alabama
|
The Opening Ceremony of the XV Pan American Games took place on 13 July 2007. Considered an audition for the 2014 World Cup and 2016 Olympics, the Opening Ceremony was praised by the media for its creativity and Olympic-style production value. The Los Angeles Times reported:
Brazil's 2nd largest city still must overcome doubts about crime and traffic, among other things, if it hopes to make good on its quixotic bid to play host to the 2016 Olympic Games. But Rio sure has the opening ceremony down pat. On Friday the city inaugurated the 15th Pan American Games with a lavish and creative 3½-hour show that featured a symphony orchestra, three 100-foot-long coral snakes, Miss Brazil, an alligator the size of a 747, fireworks, a 1,500-piece percussion band and thousands of dancers dressed as everything from ocean waves to water lilies.
Approximately 90,000 spectators and 5,000 athletes packed Rio de Janeiro's Maracanã Stadium for the occasion. The ceremony included a cast of 4,000 and a $17 million (US) budget. The Executive Producer of the Opening Ceremony was Scott Givens. More than 250 people were part of the creative and production teams, with another 1,000 backstage volunteers. Scott Givens' team was responsible for the Opening Ceremony, sports production, the presentation of 2,252 medals, and the Closing Ceremony.
The theme of the show was based on the theme of the Rio 2007 Games: Viva Essa Energia (Share this Energy).
Proceedings
The show began at 17:30 local time (UTC-3) and lasted for two and a half hours.
The ceremony began with the Dragoons of Independence, the first regiment of the Brazilian army and the official guard of the President of the Republic, carrying the Brazilian flag, followed by a performance of the Brazilian national anthem sung by Elza Soares, the former wife of footballer Mané Garrincha. The countdown, starting from the number 15, showed the cities that had hosted previous editions of the Pan American Games.
The first part was named "Viva essa energia" ("Live this energy" in Portuguese). It started with an Afro-Brazilian boy named Cainan playing a drum and leading 1,150 rhythmists from 17 samba schools in a presentation of the official song of the XV Pan American Games, "Viva essa Energia", composed by Arnaldo Antunes, former singer of the Brazilian rock band Titãs, and Liminha, former member of the Brazilian rock band Os Mutantes, and sung by samba singer Ana Costa.
The parade of the athletes was set to samba, chorinho and bossa nova rhythms played by the samba schools' percussion sections, which formed a huge corridor through which the athletes of the 42 nations passed. There was uncertainty over the entry of Panama's delegation: because of interference by the Panamanian government in its national Olympic committee, the IOC had banned the country from official events, but PASO requested that the Panamanian athletes be allowed to participate under its organization's flag, and the IOC reversed its decision and authorised Panama's participation.
The PASO anthem, composed by André Mehmari, premiered at these Games, performed by the Orquestra Sinfônica Brasileira conducted by Roberto Minczuk.
The show was divided into three parts: "A energia do Sol" (The energy of the sun), "A energia da água" (The energy of water), and "A energia do homem" (The energy of man). The show was coordinated by Rosa Magalhães from the Rio de Janeiro samba school GRES Imperatriz Leopoldinense.
The oath of the athletes was performed by Brazilian taekwondo athlete Natália Falavigna.
Contrary to plan and tradition, the games were not opened by Brazil's head of state, President Luiz Inácio Lula da Silva, but by the head of the Brazilian Olympic Committee, Carlos Arthur Nuzman. Prior to the official opening, Lula had been repeatedly booed whenever the in-stadium camera showed him on the large screen set up inside the stadium.
Parade of Nations
References
External links
Opening ceremony video
Parade of Nations
See also
2006 Asian Games opening ceremony
2008 Summer Olympics opening ceremony
2016 Summer Olympics opening ceremony
2016 Summer Olympics Parade of Nations
Opening Ceremony
Pan American Games opening ceremonies
Ceremonies in Brazil
|
Olivier Bouygues (born 14 September 1950) is deputy CEO of the French company Bouygues, and CEO of the family holding company SCDM.
Early life
He was born on 14 September 1950, and educated at the École Nationale Supérieure du Pétrole et des Moteurs.
Career
In 1974, he joined the Bouygues group, beginning his career in the civil works branch. From 1983 to 1988, he worked at Bouygues Offshore, where he held the posts of Director of Boscam, a Cameroon subsidiary, and then Director of the France Works and Special Projects division. From 1988 to 1992, he was Chairman and CEO of Maison Bouygues. In 1992, he became Group Executive Vice President of Utilities Management, which grouped the international and French activities of Saur. In 2002, he was appointed Deputy CEO of Bouygues.
He is a director of TF1 Group.
Personal life
He is co-owner of the Château Montrose vineyard, with his brother Martin Bouygues. The brothers purchased the winery in 2006.
References
External links
1950 births
Living people
Olivier
French billionaires
French businesspeople
|
```java
/*
*
*
* path_to_url
*
* Unless required by applicable law or agreed to in writing,
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* specific language governing permissions and limitations
*/
package org.ballerinalang.test.expressions.conversion;
import io.ballerina.runtime.api.utils.StringUtils;
import io.ballerina.runtime.api.values.BArray;
import io.ballerina.runtime.api.values.BMap;
import io.ballerina.runtime.internal.types.BMapType;
import org.ballerinalang.test.BCompileUtil;
import org.ballerinalang.test.BRunUtil;
import org.ballerinalang.test.CompileResult;
import org.testng.Assert;
import org.testng.annotations.AfterClass;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Test;
/**
* Test cases for converting types which can be stamped.
*
* @since 0.985.0
*/
public class NativeConversionWithStampTypesTest {
private CompileResult compileResult;
@BeforeClass
public void setup() {
compileResult = BCompileUtil.compile("test-src/expressions/conversion/native-conversion-stampable-values.bal");
}
@Test(description = "Test converting a record which can be stamped into another record and check previous values "
+ "are not changed")
public void testConvertStampRecordToRecord() {
Object arr = BRunUtil.invoke(compileResult, "testConvertStampRecordToRecord");
BArray results = (BArray) arr;
BMap<String, Object> person = (BMap<String, Object>) results.get(0);
BMap<String, Object> employee = (BMap<String, Object>) results.get(1);
Assert.assertEquals(results.size(), 2);
Assert.assertEquals(person.get(StringUtils.fromString("name")).toString(), "Watson");
Assert.assertEquals(employee.get(StringUtils.fromString("name")).toString(), "Waruna");
Assert.assertEquals(person.get(StringUtils.fromString("age")).toString(), "25");
Assert.assertEquals(employee.get(StringUtils.fromString("age")).toString(), "30");
Assert.assertEquals(person.get(StringUtils.fromString("school")).toString(), "ABC College");
Assert.assertEquals(employee.get(StringUtils.fromString("school")).toString(), "ABC College");
}
@Test(description = "Test converting a record into a json and check previous values are not changed")
public void testConvertStampRecordToJSON() {
Object arr = BRunUtil.invoke(compileResult, "testConvertStampRecordToJSON");
BArray results = (BArray) arr;
BMap<String, Object> employee = (BMap<String, Object>) results.get(0);
BMap<String, Object> json = (BMap<String, Object>) results.get(1);
Assert.assertEquals(results.size(), 2);
Assert.assertEquals(json.getType().getClass(), BMapType.class);
Assert.assertEquals(employee.size(), 4);
Assert.assertEquals(json.size(), 4);
Assert.assertEquals(employee.get(StringUtils.fromString("name")).toString(), "John");
Assert.assertEquals(json.get(StringUtils.fromString("name")).toString(), "Waruna");
Assert.assertEquals(employee.get(StringUtils.fromString("school")).toString(), "DEF College");
Assert.assertEquals(json.get(StringUtils.fromString("school")).toString(), "ABC College");
}
@Test(description = "Test converting a record into a map and check previous values are not changed")
public void testConvertStampRecordToMap() {
BRunUtil.invoke(compileResult, "testConvertStampRecordToMap");
}
@Test(description = "Test converting a tuple into a map and check previous values are not changed")
public void testConvertStampTupleToMap() {
BRunUtil.invoke(compileResult, "testConvertStampTupleToMap");
}
@Test
public void testConvertMapJsonWithDecimalToOpenRecord() {
BRunUtil.invoke(compileResult, "testConvertMapJsonWithDecimalToOpenRecord");
}
@Test
public void testConvertMapJsonWithDecimalUnionTarget() {
BRunUtil.invoke(compileResult, "testConvertMapJsonWithDecimalUnionTarget");
}
@Test
public void testConvertToUnionWithActualType() {
BRunUtil.invoke(compileResult, "testConvertToUnionWithActualType");
}
@AfterClass
public void tearDown() {
compileResult = null;
}
}
```
|
```html+erb
<div class="form__wrapper">
<div class="item__edit-form">
<%= decidim_form_for(@form, url: taxonomy_item_path(taxonomy_id: taxonomy.id, id: taxonomy_item.id ), remote: true, html: { id: "taxonomy-item-form", class: "form-defaults form new_taxonomy_" }) do |f| %>
<h1>
<%= t ".title", taxonomy: translated_attribute(taxonomy.name) %>
<%= f.submit t(".update"), class: "button button__sm button__secondary" %>
</h1>
<%= render partial: "form", object: f %>
<% end %>
</div>
</div>
```
|
Mount Holly is a mountain located in the Catskill Mountains of New York east-southeast of Walton, New York. Located to the east is Colchester Mountain and southeast is Starkweather Hill.
References
Mountains of Delaware County, New York
Mountains of New York (state)
|
Mutya ng Pilipinas 2001 was the 33rd edition of Mutya ng Pilipinas. It was held at the now-closed NBC Tent in Taguig, Metro Manila on June 9, 2001.
At the end of the event, Josephine Canonizado crowned Darlene Carbungco as Mutya ng Pilipinas Asia Pacific 2001. Completing the court of winners, Michelle Ann Peñez was named First Runner-Up, Mimilannie Lisondra was named Second Runner-Up, Liza Diño was named Third Runner-Up, and Janice Gay Alop was named Fourth Runner-Up.
Results
Color keys
The contestant was a Runner-up in an International pageant.
The contestant was not able to compete in an International pageant.
The contestant did not place.
Special awards
Major awards
Contestants
25 delegates have been selected to compete this year.
Notes
Crossovers
Mutya #2 Nuriza Abeja Jr. was a semifinalist at Binibining Pilipinas 2002.
Mutya #4 Sandra Rebancos was a candidate at Binibining Pilipinas 2002.
Mutya #5 Athena Claveria was a candidate at Binibining Pilipinas 2002.
Mutya #20 Anna Liza Bernal was a semifinalist at Binibining Pilipinas 2000.
Post-pageant notes
Mutya ng Pilipinas Asia Pacific, Darlene Carbungco competed at Miss Asia Pacific 2001 in Makati, Philippines, and placed 4th runner-up.
Mutya 1st runner-up, Michelle Ann Peñez competed at Miss Tourism Universe 2001 and placed 1st runner-up.
Mutya 2nd runner-up, Mimilannie Lisondra did not compete at the Miss Intercontinental 2001 pageant.
Mutya 3rd runner-up, Mary Liza Diño competed at Miss Tourism International 2001–2002 in Malaysia but was unplaced.
References
2001 beauty pageants
2001 in the Philippines
2001
|
```kotlin
package mega.privacy.android.app.domain.repository
import kotlinx.coroutines.flow.Flow
import mega.privacy.android.app.domain.model.ShakeEvent
/**
* Repository to handle interaction to gateway
*/
interface ShakeDetectorRepository {
/**
* Function to call @VibratorGateway to vibrate device
*/
fun vibrateDevice()
/**
* Function to monitor sensor event and return flow of @ShakeEvent
*
* @return Flow of @ShakeEvent
*/
fun monitorShakeEvents(): Flow<ShakeEvent>
}
```
|
```php
<?php
declare(strict_types=1);
namespace Psalm\Node\Expr\BinaryOp;
use PhpParser\Node\Expr\BinaryOp\Mul;
use Psalm\Node\VirtualNode;
final class VirtualMul extends Mul implements VirtualNode
{
}
```
|
West Pride is a gay, lesbian, bisexual and transgender cultural festival in Gothenburg, Sweden, started in June 2007. The annual event is arranged by the Gothenburg Municipality and Västra Götaland region, in cooperation with RFSL and other LGBT organisations. It takes place at the city's foremost cultural institutions such as Gothenburg City Theatre, the Röhsska Museum and the Museum of World Culture. Gothenburg Film Festival shows queer films during the festival.
History
In 2017, the city supported the event by displaying 1,000 rainbow flags. Parade participants in 2017 carried signs urging acceptance of refugees and immigrants, and urging other European and Middle Eastern countries to accept LGBTQ people.
See also
List of LGBT film festivals
References
External links
West Pride - Homepage of the festival
LGBT events in Sweden
Culture in Gothenburg
LGBT film festivals
Recurring events established in 2007
2007 establishments in Sweden
LGBT festivals in Europe
|
```raw token data
Vlan20 is down (VLAN/BD is down), line protocol is down
Hardware is EtherSVI, address is 5087.89a1.d8d5
Internet Address is 10.1.20.3/24
MTU 1500 bytes, BW 1000000 Kbit, DLY 10 usec,
reliability 255/255, txload 1/255, rxload 1/255
Encapsulation ARPA, loopback not set
Keepalive not supported
ARP type: ARPA
Last clearing of "show interface" counters never
L3 in Switched:
ucast: 0 pkts, 0 bytes
mgmt0 is up
admin state is up,
Hardware: GigabitEthernet, address: 5087.89a1.d8ce (bia 5087.89a1.d8ce)
Description: out of band mgmt interface
Internet Address is 10.1.100.21/24
MTU 1500 bytes, BW 100000 Kbit, DLY 10 usec
reliability 255/255, txload 1/255, rxload 1/255
Encapsulation ARPA, medium is broadcast
full-duplex, 100 Mb/s
Auto-Negotiation is turned on
Auto-mdix is turned off
EtherType is 0x0000
1 minute input rate 744 bits/sec, 0 packets/sec
1 minute output rate 1608 bits/sec, 0 packets/sec
Rx
3093618 input packets 224819 unicast packets 2319461 multicast packets
549338 broadcast packets 267225622 bytes
Tx
394746 output packets 206870 unicast packets 187869 multicast packets
7 broadcast packets 117206151 bytes
Ethernet1/1 is up
admin state is up, Dedicated Interface
Hardware: 1000/10000 Ethernet, address: 5087.89a1.d8d6 (bia 5087.89a1.d8d6)
MTU 1500 bytes, BW 10000000 Kbit, DLY 10 usec
reliability 255/255, txload 1/255, rxload 1/255
Encapsulation ARPA, medium is broadcast
Port mode is access
full-duplex, 10 Gb/s, media type is 10G
Beacon is turned off
Auto-Negotiation is turned on
Input flow-control is off, output flow-control is off
Auto-mdix is turned off
Rate mode is dedicated
Switchport monitor is off
EtherType is 0x8100
EEE (efficient-ethernet) : n/a
Last link flapped 2week(s) 1day(s)
Last clearing of "show interface" counters 2w1d
111 interface resets
30 seconds input rate 80 bits/sec, 0 packets/sec
30 seconds output rate 256 bits/sec, 0 packets/sec
Load-Interval #2: 5 minute (300 seconds)
input rate 112 bps, 0 pps; output rate 248 bps, 0 pps
RX
0 unicast packets 68448 multicast packets 0 broadcast packets
68448 input packets 19188106 bytes
0 jumbo packets 0 storm suppression packets
0 runts 0 giants 0 CRC 0 no buffer
0 input error 0 short frame 0 overrun 0 underrun 0 ignored
0 watchdog 0 bad etype drop 0 bad proto drop 0 if down drop
0 input with dribble 0 input discard
0 Rx pause
TX
0 unicast packets 754962 multicast packets 0 broadcast packets
754962 output packets 62262008 bytes
0 jumbo packets
0 output error 0 collision 0 deferred 0 late collision
0 lost carrier 0 no carrier 0 babble 0 output discard
0 Tx pause
```
|
The Fairfax-Brewster School was a private K-6 elementary school in Bailey's Crossroads, Virginia. The school was founded in 1954 by Stuart A. Reiss and Robert S. Reiss, with Robert's wife Olga also serving in an administrative role. The school began operating in 1955 with an average enrollment of 21 students. The Fairfax-Brewster School opened a summer camp the following year, also serving students in kindergarten through 6th grade. By 1962, 21 students attended the summer camp.
The proximity of the school's founding to the Brown v. Board of Education ruling desegregating public schools has led some legal scholars to describe Fairfax-Brewster as a segregation academy.
By 1972, enrollment at the Fairfax-Brewster School had grown to 236 students during the school year and 223 students at the summer camp. No black student had ever been enrolled in the school or summer camp. The school faced a federal lawsuit in 1973 (Runyon v. McCrary) after denying admission to a black child, Colin M. Gonzales. The school denied having discriminated against black students, saying that Gonzales was not admitted because he would not qualify to begin first grade. The court found that Gonzales was denied admission solely because of his race, a decision that was upheld on appeal to the 4th Circuit Court of Appeals and the Supreme Court.
The Reiss family continued to own and operate the Fairfax-Brewster School until Olga and Robert retired in 1987 and 1988, respectively. By 1989, Norma Brill had become the owner and director of the school and summer camp. The school was sold to Chancellor Beacon Academies in 2000, which was later acquired by Imagine Schools.
In 2006, the school was torn down and several homes were built on the property, most of which have an address on Brill Court, a street named after former owner Norma Brill.
References
Segregation academies in Virginia
Schools in Fairfax County, Virginia
Defunct schools in Virginia
Educational institutions established in 1955
1955 establishments in Virginia
Demolished buildings and structures in Virginia
Buildings and structures demolished in 2006
|
The Society for Artistic Research (SAR) is an international nonprofit, artistic and scientific society devoted to developing, linking and internationally disseminating artistic research as a specific practice of producing knowledge. SAR also aims to facilitate co-operation and communication among those interested in the study and practices of artistic research.
History
SAR was founded in 2010 in Bern, Switzerland (as an initiative of the two artists Florian Dombois and Michael Schwab together with Henk Borgdorff) by about 80 artists, researchers and academics from around the globe. It is the only international society for artistic research in the world. It has an international membership drawn from both academic and non-academic institutions and individuals.
Objectives
to promote the practices of artistic research done in and outside of academic institutions
to facilitate co-operation and communication among those interested in artistic research
to hold, or to participate in the holding of, conferences and meetings for the communication of artistic based knowledge, and to publicise and disseminate by other means knowledge and views concerning artistic research practices and results
Activities
The society publishes the triannual Journal for Artistic Research (JAR), an international, online, open access and peer-reviewed journal for the identification, publication and dissemination of artistic research and its methodologies, from all arts disciplines.
The society publishes the Research Catalogue (RC), a searchable, documentary database of artistic research, to which anyone can contribute.
The society awards the Annual Prize for Excellent Research Catalogue Exposition for innovative, experimental new formats of publications.
The society hosts an annual convention consisting of a conference covering various specialist topics organised around a theme.
Current executive board (elected until the year shown)
Florian Schneider, President (Trondheim, Norway; 2026)
Geir Strøm, Vice-President (Bergen, Norway; 2024)
Jaana Erkkilä-Hill, Vice-President (Helsinki, Finland; 2024)
Angela Bartram (Derby, UK; 2024)
Michaela Glanz (Vienna, Austria; 2024)
Blanka Chládková (Brno, The Czech Republic; 2026)
Esa Kirkkopelto (Helsinki, Finland; 2026)
Johan A. Haarberg, Executive Officer (Bergen, Norway)
Jessica Kaiser, SAR Executive Consultant (Graz, Austria; 2022)
Board 2020-2022: Deniz Peters, President / Geir Strøm, Vice-President (Bergen, Norway; since 2018) / Jaana Erkkilä-Hill, Vice-President (Helsinki, Finland; since 2020), Angela Bartram (Derby, UK; since 2018) / Chrysa Parkinson (Stockholm, Sweden; 2020-2021) / Gabriele Schmid (Ottersberg, Germany; 2015-2022); Johan A. Haarberg, SAR Executive Officer (Bergen, Norway) / Jessica Kaiser, SAR Executive Consultant (Gras, Austria; 2022)
Board 2019-2020: Deniz Peters, President / Geir Strøm, Vice-President (Bergen, Norway; since 2018) / Giaco Schiesser, Vice-President (Zurich, Switzerland; since 2013) / Angela Bartram (Derby, UK) / Alexander Damianisch (Wien, Österreich; since 2013) / Leena Rouhiainen, Helsinki, Finland; since 2015) / Gabriele Schmid (Ottersberg, Germany); Johan A. Haarberg, SAR Executive Officer (Bergen, Norway)
Board 2017-2019: Henk Borgdorff, President (Leiden / The Hague, Netherlands) / Geir Strøm, Vice-President (Bergen, Norway; since 2018) / Giaco Schiesser, Vice-President (Zurich, Switzerland) / Angela Bartram (Derby, UK) / Alexander Damianisch (Wien, Österreich) / Leena Rouhiainen, Helsinki, Finland) / Gabriele Schmid (Ottersberg, Germany); Johan A. Haarberg, SAR Executive Officer (Bergen, Norway; since 2018)
Board 2015-2017: Henk Borgdorff, President (Leiden / The Hague, Netherlands) / Johan A. Haarberg, Vice-President (Bergen, Norway) / Giaco Schiesser, Vice-President (Zurich, Switzerland) / Alexander Damianisch (Wien, Österreich) / Anya Lewin (Plymouth, UK) / Leena Rouhiainen, Helsinki, Finland) / Gabriele Schmid (Ottersberg, Germany)
Board 2013-2015: Gerhard Eckel, President (Graz, Austria) / Johan A. Haarberg, Vice-President (Bergen, Norway) / Rolf Hughes, Vice-President (Stockholm, Sweden) / Alexander Damianisch (Vienna, Austria) / Julie Harboe (Lucerne, Switzerland) / Efva Lilja (Stockholm, Sweden) / Giaco Schiesser (Zurich, Switzerland)
Board 2011-2013: Anna Lindal, President (Gothenburg, Sweden) / Florian Dombois, Vice-President (Bern, Switzerland) / Rolf Hughes, Vice-President (Stockholm, Sweden) / Barbara Bolt (Melbourne, Australia) / Gerhard Eckel (Graz, Austria) / Kim Gorus (Antwerp, Belgium) / Johan A. Haarberg (Bergen, Norway)
Board 2010-2011: Florian Dombois, President (Bern, Switzerland) / Anna Lindal, Vice-President (Gothenburg, Sweden) / Darla Crispin, Vice-President (Ghent, Belgium) / Jan Kaila (Helsinki, Finland) / Sofie van Loo (Antwerp, Belgium) / George Petelin (Brisbane, Australia) / Stephen Scrivener (London, Great Britain)
References
External links
Official website
Journal for Artistic Research
Research Catalogue
Arts organisations based in Switzerland
Organisations based in Bern
|
```java
/*
 * path_to_url
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 */
/**
* Classes for connecting and getting metadata for Cassandra schemas and tables.
*
* @author tgianos
* @since 1.0.0
*/
package com.netflix.metacat.connector.cassandra;
```
|
```xml
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="12.0" xmlns="path_to_url">
<PropertyGroup>
<MicroBuildV2Dir>$(SolutionDir)packages\MicroBuild.Core.0.2.0\build</MicroBuildV2Dir>
<MicroBuildV2Props>$(MicroBuildV2Dir)\MicroBuild.Core.props</MicroBuildV2Props>
<MicroBuildV2Targets>$(MicroBuildV2Dir)\MicroBuild.Core.targets</MicroBuildV2Targets>
</PropertyGroup>
<Import Project="$(MicroBuildV2Props)"
Condition="'$(VSO_MICROBUILD_V2)'=='True' AND Exists('$(MicroBuildV2Props)')" />
<Target Name="EnsureNuGetPackageBuildImports" BeforeTargets="PrepareForBuild">
<PropertyGroup>
<ErrorText>This project references NuGet package(s) that are missing on this computer. Use NuGet Package Restore to download them. For more information, see path_to_url The missing file is {0}.</ErrorText>
</PropertyGroup>
<Error Text="$([System.String]::Format('$(ErrorText)', '$(MicroBuildV2Props)'))"
Condition="'$(VSO_MICROBUILD_V2)'=='True' AND !Exists('$(MicroBuildV2Props)')" />
<Error Text="$([System.String]::Format('$(ErrorText)', '$(MicroBuildV2Targets)'))"
Condition="'$(VSO_MICROBUILD_V2)'=='True' AND !Exists('$(MicroBuildV2Targets)')" />
</Target>
<Import Project="$(MicroBuildV2Targets)"
Condition="'$(VSO_MICROBUILD_V2)'=='True' AND Exists('$(MicroBuildV2Targets)')" />
</Project>
```
|
```csharp
using System.Linq;
using GameServerCore;
using GameServerCore.Enums;
using static LeagueSandbox.GameServer.API.ApiFunctionManager;
using LeagueSandbox.GameServer.Scripting.CSharp;
using GameServerCore.Scripting.CSharp;
using LeagueSandbox.GameServer.GameObjects.AttackableUnits.AI;
using LeagueSandbox.GameServer.GameObjects.SpellNS;
namespace Spells
{
public class Shatter : ISpellScript
{
public SpellScriptMetadata ScriptMetadata { get; private set; } = new SpellScriptMetadata()
{
TriggersSpellCasts = true
// TODO
};
public void OnSpellPostCast(Spell spell)
{
var owner = spell.CastInfo.Owner;
var armor = owner.Stats.Armor.Total;
var damage = spell.CastInfo.SpellLevel * 40 + armor * 0.2f;
var reduce = spell.CastInfo.SpellLevel * 5 + armor * 0.05f;
AddParticleTarget(owner, owner, "Shatter_nova", owner);
foreach (var enemy in GetUnitsInRange(owner.Position, 375, true)
.Where(x => x.Team == CustomConvert.GetEnemyTeam(owner.Team)))
{
                var hasBuff = HasBuff((ObjAIBase)enemy, "TaricWDis");
                if (enemy is ObjAIBase)
                {
                    enemy.TakeDamage(owner, damage, DamageType.DAMAGE_TYPE_MAGICAL, DamageSource.DAMAGE_SOURCE_SPELL, false);
                    var p2 = AddParticleTarget(owner, enemy, "Shatter_tar", enemy);
                    AddBuff("TaricWDis", 4.0f, 1, spell, enemy, owner);
                    // If the debuff was already active, skip to the next target
                    // so the armor shred cannot stack on the same unit.
                    if (hasBuff)
                    {
                        continue;
                    }
                    enemy.Stats.Armor.FlatBonus -= reduce;
CreateTimer(4f, () =>
{
enemy.Stats.Armor.FlatBonus += reduce;
RemoveParticle(p2);
});
}
}
}
}
}
```
|
Below is a list of Catholic schools in the state of New South Wales. It is correct as of June 2023.
Systemic primary schools
Catholic high and K–12 schools
Special schools
See also
List of non-government schools in New South Wales
Catholic Education in the Diocese of Parramatta
Catholic education in Australia
The Seminary of the Good Shepherd
External links
Catholic Education Commission NSW website
Catholic Education Office Sydney
Roman Catholic Archdiocese of Sydney
Roman Catholic Diocese of Parramatta
Roman Catholic Diocese of Broken Bay
Roman Catholic Diocese of Wollongong
Roman Catholic Diocese of Maitland-Newcastle
Roman Catholic Diocese of Armidale
Roman Catholic Diocese of Bathurst in Australia
Roman Catholic Archdiocese of Canberra and Goulburn
Roman Catholic Diocese of Lismore
Roman Catholic Diocese of Wagga Wagga
|
```php
<?php declare(strict_types=1);
namespace Nuwave\Lighthouse\Schema\Directives;
use Illuminate\Support\Collection;
use Nuwave\Lighthouse\Execution\Arguments\ArgumentSet;
use Nuwave\Lighthouse\Schema\Values\FieldValue;
use Nuwave\Lighthouse\Support\Contracts\ArgDirective;
use Nuwave\Lighthouse\Support\Contracts\ArgDirectiveForArray;
use Nuwave\Lighthouse\Support\Contracts\Directive;
use Nuwave\Lighthouse\Support\Contracts\FieldMiddleware;
use Nuwave\Lighthouse\Support\Utils;
abstract class ArgTraversalDirective extends BaseDirective implements FieldMiddleware
{
public function handleField(FieldValue $fieldValue): void
{
$fieldValue->addArgumentSetTransformer(fn (ArgumentSet $argumentSet): ArgumentSet => $this->transformRecursively($argumentSet));
}
protected function transformRecursively(ArgumentSet $argumentSet): ArgumentSet
{
foreach ($argumentSet->arguments as $argument) {
$directivesForArray = $argument->directives->filter(
Utils::instanceofMatcher(ArgDirectiveForArray::class),
);
$argument->value = $this->transform($argument->value, $directivesForArray);
$directivesForArgument = $argument->directives->filter(
Utils::instanceofMatcher(ArgDirective::class),
);
$argument->value = Utils::mapEach(
function ($value) use ($directivesForArgument) {
if ($value instanceof ArgumentSet) {
$value = $this->transform($value, $directivesForArgument);
return $this->transformRecursively($value);
}
return $this->transform($value, $directivesForArgument);
},
$argument->value,
);
}
return $argumentSet;
}
/**
* @param mixed $value The client given value
* @param \Illuminate\Support\Collection<int, \Nuwave\Lighthouse\Support\Contracts\Directive> $directivesForArgument
*
* @return mixed The transformed value
*/
protected function transform(mixed $value, Collection $directivesForArgument): mixed
{
foreach ($directivesForArgument as $directive) {
$value = $this->applyDirective($directive, $value);
}
return $value;
}
/**
* @param mixed $value The client given value
*
* @return mixed The transformed value
*/
abstract protected function applyDirective(Directive $directive, mixed $value): mixed;
}
```
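The traversal above applies directives depth-first: array-level directives run on the whole argument value, element-level directives run on each item, and nested argument sets are recursed into. A rough, language-agnostic sketch of the same control flow in Python (the names and dict-based stand-in for `ArgumentSet` are illustrative, not part of Lighthouse):

```python
def transform_recursively(argument_set, transform):
    """Apply `transform` to every argument value, recursing into
    nested dicts that stand in for nested ArgumentSet instances."""
    out = {}
    for name, value in argument_set.items():
        if isinstance(value, dict):       # nested argument set: recurse
            out[name] = transform_recursively(value, transform)
        elif isinstance(value, list):     # list argument: map each element
            out[name] = [transform(v) for v in value]
        else:
            out[name] = transform(value)
    return out

args = {"name": " alice ", "friends": [" bob ", " eve "], "meta": {"city": " oslo "}}
trimmed = transform_recursively(args, str.strip)
# {'name': 'alice', 'friends': ['bob', 'eve'], 'meta': {'city': 'oslo'}}
```

The real directive handles the extra wrinkle that array-level and element-level directives are distinct filter results (`ArgDirectiveForArray` vs `ArgDirective`); the sketch collapses both into one `transform` for brevity.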
|
```python
AdminExecuteSequence = [
(u'InstallInitialize', None, 1500),
(u'InstallFinalize', None, 6600),
(u'InstallFiles', None, 4000),
(u'InstallAdminPackage', None, 3900),
(u'FileCost', None, 900),
(u'CostInitialize', None, 800),
(u'CostFinalize', None, 1000),
(u'InstallValidate', None, 1400),
]
AdminUISequence = [
(u'FileCost', None, 900),
(u'CostInitialize', None, 800),
(u'CostFinalize', None, 1000),
(u'ExecuteAction', None, 1300),
(u'ExitDialog', None, -1),
(u'FatalError', None, -3),
(u'UserExit', None, -2),
]
AdvtExecuteSequence = [
(u'InstallInitialize', None, 1500),
(u'InstallFinalize', None, 6600),
(u'CostInitialize', None, 800),
(u'CostFinalize', None, 1000),
(u'InstallValidate', None, 1400),
(u'CreateShortcuts', None, 4500),
(u'MsiPublishAssemblies', None, 6250),
(u'PublishComponents', None, 6200),
(u'PublishFeatures', None, 6300),
(u'PublishProduct', None, 6400),
(u'RegisterClassInfo', None, 4600),
(u'RegisterExtensionInfo', None, 4700),
(u'RegisterMIMEInfo', None, 4900),
(u'RegisterProgIdInfo', None, 4800),
]
InstallExecuteSequence = [
(u'InstallInitialize', None, 1500),
(u'InstallFinalize', None, 6600),
(u'InstallFiles', None, 4000),
(u'FileCost', None, 900),
(u'CostInitialize', None, 800),
(u'CostFinalize', None, 1000),
(u'InstallValidate', None, 1400),
(u'CreateShortcuts', None, 4500),
(u'MsiPublishAssemblies', None, 6250),
(u'PublishComponents', None, 6200),
(u'PublishFeatures', None, 6300),
(u'PublishProduct', None, 6400),
(u'RegisterClassInfo', None, 4600),
(u'RegisterExtensionInfo', None, 4700),
(u'RegisterMIMEInfo', None, 4900),
(u'RegisterProgIdInfo', None, 4800),
(u'AllocateRegistrySpace', u'NOT Installed', 1550),
(u'AppSearch', None, 400),
(u'BindImage', None, 4300),
(u'CCPSearch', u'NOT Installed', 500),
(u'CreateFolders', None, 3700),
(u'DeleteServices', u'VersionNT', 2000),
(u'DuplicateFiles', None, 4210),
(u'FindRelatedProducts', None, 200),
(u'InstallODBC', None, 5400),
(u'InstallServices', u'VersionNT', 5800),
(u'IsolateComponents', None, 950),
(u'LaunchConditions', None, 100),
(u'MigrateFeatureStates', None, 1200),
(u'MoveFiles', None, 3800),
(u'PatchFiles', None, 4090),
(u'ProcessComponents', None, 1600),
(u'RegisterComPlus', None, 5700),
(u'RegisterFonts', None, 5300),
(u'RegisterProduct', None, 6100),
(u'RegisterTypeLibraries', None, 5500),
(u'RegisterUser', None, 6000),
(u'RemoveDuplicateFiles', None, 3400),
(u'RemoveEnvironmentStrings', None, 3300),
(u'RemoveExistingProducts', None, 6700),
(u'RemoveFiles', None, 3500),
(u'RemoveFolders', None, 3600),
(u'RemoveIniValues', None, 3100),
(u'RemoveODBC', None, 2400),
(u'RemoveRegistryValues', None, 2600),
(u'RemoveShortcuts', None, 3200),
(u'RMCCPSearch', u'NOT Installed', 600),
(u'SelfRegModules', None, 5600),
(u'SelfUnregModules', None, 2200),
(u'SetODBCFolders', None, 1100),
(u'StartServices', u'VersionNT', 5900),
(u'StopServices', u'VersionNT', 1900),
(u'MsiUnpublishAssemblies', None, 1750),
(u'UnpublishComponents', None, 1700),
(u'UnpublishFeatures', None, 1800),
(u'UnregisterClassInfo', None, 2700),
(u'UnregisterComPlus', None, 2100),
(u'UnregisterExtensionInfo', None, 2800),
(u'UnregisterFonts', None, 2500),
(u'UnregisterMIMEInfo', None, 3000),
(u'UnregisterProgIdInfo', None, 2900),
(u'UnregisterTypeLibraries', None, 2300),
(u'ValidateProductID', None, 700),
(u'WriteEnvironmentStrings', None, 5200),
(u'WriteIniValues', None, 5100),
(u'WriteRegistryValues', None, 5000),
]
InstallUISequence = [
(u'FileCost', None, 900),
(u'CostInitialize', None, 800),
(u'CostFinalize', None, 1000),
(u'ExecuteAction', None, 1300),
(u'ExitDialog', None, -1),
(u'FatalError', None, -3),
(u'UserExit', None, -2),
(u'AppSearch', None, 400),
(u'CCPSearch', u'NOT Installed', 500),
(u'FindRelatedProducts', None, 200),
(u'IsolateComponents', None, 950),
(u'LaunchConditions', None, 100),
(u'MigrateFeatureStates', None, 1200),
(u'RMCCPSearch', u'NOT Installed', 600),
(u'ValidateProductID', None, 700),
]
tables=['AdminExecuteSequence', 'AdminUISequence', 'AdvtExecuteSequence', 'InstallExecuteSequence', 'InstallUISequence']
```
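Each table lists `(action, condition, sequence_number)` tuples; the installer runs the actions in ascending order of sequence number, with negative numbers reserved for terminal dialogs (`ExitDialog`, `UserExit`, `FatalError`). A small helper, sketched here (it is not part of the generated file), makes that ordering explicit:

```python
# A miniature table in the same (action, condition, number) shape as above
sample_sequence = [
    (u'FileCost', None, 900),
    (u'CostInitialize', None, 800),
    (u'CostFinalize', None, 1000),
    (u'ExecuteAction', None, 1300),
    (u'ExitDialog', None, -1),
]

def ordered_actions(sequence):
    """Return action names in installer execution order, skipping
    terminal dialogs (negative sequence numbers)."""
    runnable = [(num, name) for name, _cond, num in sequence if num > 0]
    return [name for _num, name in sorted(runnable)]

print(ordered_actions(sample_sequence))
# ['CostInitialize', 'FileCost', 'CostFinalize', 'ExecuteAction']
```

This also explains the numbering gaps in the tables: costing (800–1000) always precedes validation (1400) and file installation (4000), leaving room to splice custom actions in between.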
|
```javascript
import '@kitware/vtk.js/favicon';
// Load the rendering pieces we want to use (for both WebGL and WebGPU)
import '@kitware/vtk.js/Rendering/Profiles/Geometry';
import vtkFullScreenRenderWindow from '@kitware/vtk.js/Rendering/Misc/FullScreenRenderWindow';
import vtkActor from '@kitware/vtk.js/Rendering/Core/Actor';
import vtkHttpDataSetReader from '@kitware/vtk.js/IO/Core/HttpDataSetReader';
import vtkMapper from '@kitware/vtk.js/Rendering/Core/Mapper';
// Force the loading of HttpDataAccessHelper to support gzip decompression
import '@kitware/vtk.js/IO/Core/DataAccessHelper/HttpDataAccessHelper';
// -----------------------------------------------------------
// Standard rendering code setup
// -----------------------------------------------------------
const fullScreenRenderer = vtkFullScreenRenderWindow.newInstance();
const renderer = fullScreenRenderer.getRenderer();
const renderWindow = fullScreenRenderer.getRenderWindow();
// -----------------------------------------------------------
// Example code
// -----------------------------------------------------------
// Server is not sending the .gz and with the compress header
// Need to fetch the true file name and uncompress it locally
// -----------------------------------------------------------
const reader = vtkHttpDataSetReader.newInstance({ fetchGzip: true });
reader.setUrl(`${__BASE_PATH__}/data/cow.vtp`).then(() => {
reader.loadData().then(() => {
renderer.resetCamera();
renderWindow.render();
});
});
const mapper = vtkMapper.newInstance();
mapper.setInputConnection(reader.getOutputPort());
const actor = vtkActor.newInstance();
actor.setMapper(mapper);
renderer.addActor(actor);
// -----------------------------------------------------------
// Make some variables global so that you can inspect and
// modify objects in your browser's developer console:
// -----------------------------------------------------------
global.source = reader;
global.mapper = mapper;
global.actor = actor;
global.renderer = renderer;
global.renderWindow = renderWindow;
```
|
The Satyrinae, the satyrines or satyrids, commonly known as the browns, are a subfamily of the Nymphalidae (brush-footed butterflies). They were formerly considered a distinct family, Satyridae. This group contains nearly half of the known diversity of brush-footed butterflies. The true number of Satyrinae species is estimated to exceed 2,400.
Overview
They are generally weak fliers and often shun bright sunlight, preferring moist and semishaded habitats. The caterpillars feed chiefly on monocotyledonous plants such as palms, grasses, and bamboos. The Morphinae are sometimes united with this group.
The taxonomy and systematics of the subfamily are under heavy revision. The early pioneering work of L. D. Miller did much to bring order to the group. Dyndirus (Capronnier, 1874) is a satyrid incertae sedis. Apart from this genus, according to the latest studies on the classification of the Nymphalidae, all satyrines have been assigned, at least preliminarily, to one of the tribes. For detailed lists, see the tribe pages.
References
Further reading
Glassberg, Jeffrey Butterflies through Binoculars, The West (2001)
Guppy, Crispin S. and Shepard, Jon H. Butterflies of British Columbia (2001)
James, David G. and Nunnallee, David Life Histories of Cascadia Butterflies (2011)
Pelham, Jonathan Catalogue of the Butterflies of the United States and Canada (2008)
Pyle, Robert Michael The Butterflies of Cascadia (2002)
External links
Satyrinae of the Western Palearctic
Tree of Life: Satyrinae
Insect Life Forms - Satyridae
Butterflies and Moths of North America
Butterflies of America
Taxa named by Jean Baptiste Boisduval
Butterfly subfamilies
|
The Allegiant Travel Company is an American travel and hospitality company that is the parent of Allegiant Air. Other subsidiaries include Sunseeker Resorts and Allegiant Nonstop. The Allegiant Travel Company is headquartered in the Las Vegas suburb of Summerlin, Nevada and is publicly traded on the Nasdaq exchange under the stock ticker symbol, ALGT.
History
The Allegiant Travel Company was founded in 1999 as the parent company of Allegiant Air, which itself had been founded in 1997. Initially based out of Fresno, California, the company reorganized in 2000 with Maurice J. Gallagher Jr. gaining an almost 20 percent stake in the company. He had previously been a prominent creditor of Allegiant and was one of the co-founders of ValuJet Airlines.
In May 2006, Allegiant Travel Company filed plans for an initial public offering (IPO). It officially began trading on the Nasdaq exchange under the stock ticker symbol, ALGT, in December of that year.
By 2019, Allegiant Travel's primary subsidiary, Allegiant Air, had switched from a fleet predominately composed of MD-80s to one exclusively composed of Airbus jets.
Subsidiaries
Allegiant Air
Allegiant Air was founded in 1997 and is the ninth-largest commercial airline in the United States as of January 2020. Part of Allegiant Air's business model includes earning commissions by selling passengers ancillary items like rental cars, hotel rooms, tickets to events, amusement park passes, and other add-ons. The airline has a fleet composed of 85 Airbus jets that serves more than 500 routes across the country.
Sunseeker Resorts
Plans for the inaugural Sunseeker Resort in Charlotte County, Florida (known as Sunseeker Resort Charlotte Harbor) were announced in August 2017. Construction on the project was initially halted due to the COVID-19 pandemic, but resumed in 2021 with plans for the 500 room, 180 extended-stay suite resort.
Other
The Allegiant Travel Company also counts the golf course management software firm Teesnap among its subsidiaries. Teesnap was founded in 2013 and has been owned by Allegiant since its outset. Its software was in use at 590 golf courses as of July 2019, though Allegiant was also seeking a buyer for the subsidiary at that time. Another Allegiant subsidiary, Game Plane, created an eponymous game show that was filmed on Allegiant Air flights and ran during 2014 and 2015 on the Discovery Family Channel. Allegiant Travel also operated an information technology company called Allegiant Systems with the goal of selling software systems to other airlines.
Allegiant Travel formerly operated family entertainment centers in Utah and Michigan. Known as Allegiant Nonstop, the company closed the centers in 2020.
Sponsorships
The Allegiant Travel Company is the official sponsor of several sports teams and venues.
Allegiant Stadium
In August 2019, Allegiant was awarded the naming rights for the home of the Las Vegas Raiders and the UNLV Rebels football team, Allegiant Stadium. It is also the official airline of the Raiders.
Other sport sponsorships
In July 2018, Allegiant was named the official airline of Minor League Baseball (MiLB). In December of that year, it announced a credit card partnership with the MiLB that would allow Allegiant credit card holders to earn points in relation to their local baseball teams and communities.
Allegiant is also the official domestic airline partner of the Vegas Golden Knights. In September 2019, the company unveiled a Golden Knights-themed plane that featured a livery with the team's logo. In January 2020, Allegiant signed a deal to become the official airline of the Indianapolis Colts, and the following year became the official airline of the Pac-12 Conference.
See also
List of Allegiant Air destinations
List of S&P 600 companies
References
External links
Allegiant Air
Sunseeker Resorts
Hospitality companies established in 1999
Companies listed on the Nasdaq
Companies based in Las Vegas
1999 establishments in Nevada
|
Melittia chlorophila is a moth of the family Sesiidae. It was described by Erich Martin Hering in 1935 and is known from Sierra Leone.
References
Sesiidae
Moths of Africa
|
```c
#ifndef __PROTOCOL_STACK_HELPERS_H
#define __PROTOCOL_STACK_HELPERS_H
#include "ktypes.h"
#include "protocols/classification/defs.h"
// get_protocol_layer retrieves the `protocol_layer_t` associated to the given `protocol_t`.
// Example:
// get_protocol_layer(PROTOCOL_HTTP) => LAYER_APPLICATION
// get_protocol_layer(PROTOCOL_TLS) => LAYER_ENCRYPTION
static __always_inline protocol_layer_t get_protocol_layer(protocol_t proto) {
u16 layer_bit = proto&(LAYER_API_BIT|LAYER_APPLICATION_BIT|LAYER_ENCRYPTION_BIT);
switch(layer_bit) {
case LAYER_API_BIT:
return LAYER_API;
case LAYER_APPLICATION_BIT:
return LAYER_APPLICATION;
case LAYER_ENCRYPTION_BIT:
return LAYER_ENCRYPTION;
}
return LAYER_UNKNOWN;
}
// set_protocol adds `proto` to the given `stack`
static __always_inline void set_protocol(protocol_stack_t *stack, protocol_t proto) {
if (!stack || proto == PROTOCOL_UNKNOWN) {
return;
}
protocol_layer_t layer = get_protocol_layer(proto);
if (!layer) {
return;
}
    // this is the number of the protocol without the layer bit set
__u8 proto_num = (__u8)proto;
switch(layer) {
case LAYER_API:
stack->layer_api = proto_num;
return;
case LAYER_APPLICATION:
stack->layer_application = proto_num;
return;
case LAYER_ENCRYPTION:
stack->layer_encryption = proto_num;
return;
default:
return;
}
}
// is_fully_classified returns true if all layers are set or if
// `mark_as_fully_classified` was previously called for this `stack`
static __always_inline bool is_fully_classified(protocol_stack_t *stack) {
if (!stack) {
return false;
}
return stack->flags&FLAG_FULLY_CLASSIFIED ||
(stack->layer_api > 0 &&
stack->layer_application > 0 &&
stack->layer_encryption > 0);
}
// mark_as_fully_classified is intended to be used as an "optimization" helper
// so a protocol stack can be treated as fully classified even if some layers
// are missing.
// For example, if we determine from a socket-filter program that a
// connection has Kafka traffic, we can call `set_protocol(stack, PROTOCOL_KAFKA)`
// and then `mark_as_fully_classified(stack)` to indicate that no further
// classification attempts are necessary, since there can't be an encryption
// layer protocol nor an API-level protocol above Kafka.
static __always_inline void mark_as_fully_classified(protocol_stack_t *stack) {
if (!stack) {
return;
}
stack->flags |= FLAG_FULLY_CLASSIFIED;
}
// get_protocol_from_stack returns the `protocol_t` value that belongs to the given `layer`
// Example: If we had a `protocol_stack_t` with HTTP, calling `get_protocol_from_stack(stack, LAYER_APPLICATION)
// would return PROTOCOL_HTTP;
__maybe_unused static __always_inline protocol_t get_protocol_from_stack(protocol_stack_t *stack, protocol_layer_t layer) {
if (!stack) {
return PROTOCOL_UNKNOWN;
}
__u16 proto_num = 0;
__u16 layer_bit = 0;
switch(layer) {
case LAYER_API:
proto_num = stack->layer_api;
layer_bit = LAYER_API_BIT;
break;
case LAYER_APPLICATION:
proto_num = stack->layer_application;
layer_bit = LAYER_APPLICATION_BIT;
break;
case LAYER_ENCRYPTION:
proto_num = stack->layer_encryption;
layer_bit = LAYER_ENCRYPTION_BIT;
break;
default:
break;
}
if (!proto_num) {
return PROTOCOL_UNKNOWN;
}
return proto_num | layer_bit;
}
// is_protocol_layer_known returns true when `stack` contains a protocol at the given `layer`
__maybe_unused static __always_inline bool is_protocol_layer_known(protocol_stack_t *stack, protocol_layer_t layer) {
if (!stack) {
return false;
}
protocol_t proto = get_protocol_from_stack(stack, layer);
return proto != PROTOCOL_UNKNOWN;
}
// merge_protocol_stacks modifies `this` by merging it with `that`
static __always_inline void merge_protocol_stacks(protocol_stack_t *this, protocol_stack_t *that) {
if (!this || !that) {
return;
}
if (!this->layer_api) {
this->layer_api = that->layer_api;
}
if (!this->layer_application) {
this->layer_application = that->layer_application;
}
if (!this->layer_encryption) {
this->layer_encryption = that->layer_encryption;
}
this->flags |= that->flags;
}
static __always_inline void set_protocol_flag(protocol_stack_t *stack, u8 flag) {
if (!stack) {
return;
}
stack->flags |= flag;
}
#endif
```
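The helpers above depend on each `protocol_t` value carrying both an 8-bit protocol number and a single layer bit in its upper byte; `set_protocol` stores only the low byte per layer, and `get_protocol_from_stack` reverses the split by OR-ing the layer bit back in. The encoding can be sketched in Python (the bit positions and the `PROTOCOL_HTTP` value here are illustrative, not the actual constants from `protocols/classification/defs.h`):

```python
# Illustrative layer bits (the real values live in defs.h)
LAYER_API_BIT         = 1 << 13
LAYER_APPLICATION_BIT = 1 << 14
LAYER_ENCRYPTION_BIT  = 1 << 15
LAYER_BITS = LAYER_API_BIT | LAYER_APPLICATION_BIT | LAYER_ENCRYPTION_BIT

def split(proto):
    """Split a protocol value into (layer bit, 8-bit protocol number),
    mirroring what get_protocol_layer and set_protocol extract."""
    return proto & LAYER_BITS, proto & 0xFF

def join(layer_bit, proto_num):
    """Rebuild the full protocol value, as get_protocol_from_stack does."""
    return proto_num | layer_bit

PROTOCOL_HTTP = LAYER_APPLICATION_BIT | 1   # hypothetical numbering
layer_bit, num = split(PROTOCOL_HTTP)
assert join(layer_bit, num) == PROTOCOL_HTTP
```

Because the layer bit is implied by which stack field a number is stored in, each layer slot only needs one byte, which keeps `protocol_stack_t` compact enough for per-connection eBPF map values.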
|
The Capture of Olovo (1 November – 17 December 1941) was a battle fought between allied forces of Chetnik Detachments of the Yugoslav Army (Chetniks) and Yugoslav Partisans against Axis forces of the Independent State of Croatia garrisoned in Olovo in the first year of World War II in Yugoslavia.
Background
On 21 September 1941 Chetniks attacked the militia guards protecting a wooden bridge on the railway between Olovo and Kladanj. They killed one militiaman and captured 9, without damaging the bridge. On 29 September the Chetniks burned a wooden bridge between Olovo and Zavidovići.
On 28 October parts of the Partisan Romanija Detachment, in cooperation with Chetniks, captured the village of Knežina after three days of fighting. The Croatian Home Guard and Muslim militiamen fled Knežina and retreated to Olovo.
On 14 November 1941 Captain Janko Streharski was appointed commander of the Olovo garrison. The 4th company of the Sarajevo Reserve Battalion was commanded by Lieutenant Ante Marinković.
Forces
Four Chetnik companies with 400 Chetniks, together with parts of the Partisan Romanija Detachment (the Knežina, Bjelogoračka and Crepoljska companies) and the Zvijezda Detachment (the Nišić battalion and the Crnovrška and Vlahinjska companies), a total of 800 Partisans, launched an unsuccessful attack on Olovo on 1 November 1941.
The Axis forces in Olovo belonged to the III Domobran Corps commanded by Mihajlo Lukić. In mid-December 1941 the garrison in Olovo consisted of 2 companies of the Croatian Home Guard, 180 militiamen, 40 gendarmes and a battery of mountain guns. The north-east positions around Olovo were defended by the 4th company of the Sarajevo Reserve Battalion (166 Home Guardsmen) reinforced with one machine gun. The south-east positions were held by the 17th company of the 6th Infantry Regiment (70 Home Guardsmen, less one platoon). The western positions were defended by two militia units of 130 and 40 militiamen. A battery of two mountain guns operated from positions west of the railway station in Olovo. One platoon of the 17th company of the 6th Infantry Regiment was kept in reserve, while the flanks were protected by 50 militiamen in the village of Ponijerka.
Offensive
Artillery preparation
According to some contemporary Croatian reports, about 240 Chetniks were killed during attacks on Axis-controlled Olovo in the period 1–24 November 1941. On 17 November at 7 a.m. the insurgents attacked the Olovo garrison. The attack began with Chetnik artillery fire, which destroyed a militia guard post, killing or wounding 24 militiamen, while the remaining 6 fled. The Chetnik artillery was then aimed at the most important position of the Olovo garrison, the so-called "Stijena", which was defended by the 4th company of the Sarajevo Reserve Battalion supported by one machine gun. The machine-gun position was quickly destroyed by the Chetnik artillery. Another machine gun was sent as a replacement, but it too was quickly destroyed.
Infantry assault and capture of Olovo
Around 10 a.m. the insurgents stopped their artillery fire and replaced it with barrages of rifle fire from the insurgent infantry units. The commander of the 4th company of the Sarajevo Reserve Battalion, Ante Marinković, was wounded during this attack, and his company had to retreat from "Stijena" at 12:30. After being reinforced by a reserve platoon, the company recaptured "Stijena" for a short time, only to retreat again when attacked by more numerous Chetnik forces. When the Chetniks permanently captured "Stijena" they burned straw as a signal of their success to the other insurgents. This boosted the insurgents' morale, and they attacked the positions of the Olovo garrison more fiercely; the defenders began to abandon their positions. To avoid the capture of his forces, garrison commander Streharski retreated to positions west of the village of Solun. On 17 December 1941 Olovo was captured by the Chetnik and Partisan rebel units.
On 18 December Streharski continued his retreat under fire until his forces reached Careva Ćuprija.
Aftermath
At the end of 1941 joint Partisan-Chetnik administration still existed in many Eastern Bosnian towns, including Olovo.
Post-war Yugoslav sources emphasize that on 21 January 1942 part of the German 750th Regiment of the 718th Infantry Division recaptured Olovo after weak Chetnik resistance. In 1943 the Partisan 2nd Serbian Brigade retook Olovo and burned its railway station along with its wagons and equipment.
References
Sources
Olovo
Olovo
Olovo
Olovo Municipality
|
```c++
/*
    path_to_url

    Unless required by applicable law or agreed to in writing, software
    distributed under the License is distributed on an "AS IS" BASIS,
    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
*/
//! \file test_malloc_pools.cpp
//! \brief Test for [memory_allocation] functionality
#define __TBB_NO_IMPLICIT_LINKAGE 1
#include "common/test.h"
#define HARNESS_TBBMALLOC_THREAD_SHUTDOWN 1
#include "common/utils.h"
#include "common/utils_assert.h"
#include "common/spin_barrier.h"
#include "common/tls_limit.h"
#include "tbb/scalable_allocator.h"
#include <atomic>
template<typename T>
static inline T alignUp (T arg, uintptr_t alignment) {
return T(((uintptr_t)arg+(alignment-1)) & ~(alignment-1));
}
struct PoolSpace: utils::NoCopy {
size_t pos;
int regions;
size_t bufSize;
char *space;
static const size_t BUF_SIZE = 8*1024*1024;
PoolSpace(size_t bufSz = BUF_SIZE) :
pos(0), regions(0),
bufSize(bufSz), space(new char[bufSize]) {
memset(space, 0, bufSize);
}
~PoolSpace() {
delete []space;
}
};
static PoolSpace *poolSpace;
struct MallocPoolHeader {
void *rawPtr;
size_t userSize;
};
static std::atomic<int> liveRegions;
static void *getMallocMem(intptr_t /*pool_id*/, size_t &bytes)
{
void *rawPtr = malloc(bytes+sizeof(MallocPoolHeader)+1);
if (!rawPtr)
return nullptr;
// +1 to check working with unaligned space
void *ret = (void *)((uintptr_t)rawPtr+sizeof(MallocPoolHeader)+1);
MallocPoolHeader *hdr = (MallocPoolHeader*)ret-1;
hdr->rawPtr = rawPtr;
hdr->userSize = bytes;
liveRegions++;
return ret;
}
static int putMallocMem(intptr_t /*pool_id*/, void *ptr, size_t bytes)
{
MallocPoolHeader *hdr = (MallocPoolHeader*)ptr-1;
ASSERT(bytes == hdr->userSize, "Invalid size in pool callback.");
free(hdr->rawPtr);
liveRegions--;
return 0;
}
void TestPoolReset()
{
rml::MemPoolPolicy pol(getMallocMem, putMallocMem);
rml::MemoryPool *pool;
pool_create_v1(0, &pol, &pool);
for (int i=0; i<100; i++) {
REQUIRE(pool_malloc(pool, 8));
REQUIRE(pool_malloc(pool, 50*1024));
}
int regionsBeforeReset = liveRegions.load(std::memory_order_acquire);
bool ok = pool_reset(pool);
REQUIRE(ok);
for (int i=0; i<100; i++) {
REQUIRE(pool_malloc(pool, 8));
REQUIRE(pool_malloc(pool, 50*1024));
}
REQUIRE_MESSAGE(regionsBeforeReset == liveRegions.load(std::memory_order_relaxed),
"Expected no new regions allocation.");
ok = pool_destroy(pool);
REQUIRE(ok);
REQUIRE_MESSAGE(!liveRegions.load(std::memory_order_relaxed), "Expected all regions were released.");
}
class SharedPoolRun: utils::NoAssign {
static long threadNum;
static utils::SpinBarrier startB,
mallocDone;
static rml::MemoryPool *pool;
static void **crossThread,
**afterTerm;
public:
static const int OBJ_CNT = 100;
static void init(int num, rml::MemoryPool *pl, void **crThread, void **aTerm) {
threadNum = num;
pool = pl;
crossThread = crThread;
afterTerm = aTerm;
startB.initialize(threadNum);
mallocDone.initialize(threadNum);
}
void operator()( int id ) const {
const int ITERS = 1000;
void *local[ITERS];
startB.wait();
for (int i=id*OBJ_CNT; i<(id+1)*OBJ_CNT; i++) {
afterTerm[i] = pool_malloc(pool, i%2? 8*1024 : 9*1024);
memset(afterTerm[i], i, i%2? 8*1024 : 9*1024);
crossThread[i] = pool_malloc(pool, i%2? 9*1024 : 8*1024);
memset(crossThread[i], i, i%2? 9*1024 : 8*1024);
}
for (int i=1; i<ITERS; i+=2) {
local[i-1] = pool_malloc(pool, 6*1024);
memset(local[i-1], i, 6*1024);
local[i] = pool_malloc(pool, 16*1024);
memset(local[i], i, 16*1024);
}
mallocDone.wait();
int myVictim = threadNum-id-1;
for (int i=myVictim*OBJ_CNT; i<(myVictim+1)*OBJ_CNT; i++)
pool_free(pool, crossThread[i]);
for (int i=0; i<ITERS; i++)
pool_free(pool, local[i]);
}
};
long SharedPoolRun::threadNum;
utils::SpinBarrier SharedPoolRun::startB,
SharedPoolRun::mallocDone;
rml::MemoryPool *SharedPoolRun::pool;
void **SharedPoolRun::crossThread,
**SharedPoolRun::afterTerm;
// single pool shared by different threads
void TestSharedPool()
{
rml::MemPoolPolicy pol(getMallocMem, putMallocMem);
rml::MemoryPool *pool;
pool_create_v1(0, &pol, &pool);
void **crossThread = new void*[utils::MaxThread * SharedPoolRun::OBJ_CNT];
void **afterTerm = new void*[utils::MaxThread * SharedPoolRun::OBJ_CNT];
for (int p=utils::MinThread; p<=utils::MaxThread; p++) {
SharedPoolRun::init(p, pool, crossThread, afterTerm);
SharedPoolRun thr;
void *hugeObj = pool_malloc(pool, 10*1024*1024);
REQUIRE(hugeObj);
utils::NativeParallelFor( p, thr );
pool_free(pool, hugeObj);
for (int i=0; i<p*SharedPoolRun::OBJ_CNT; i++)
pool_free(pool, afterTerm[i]);
}
delete []afterTerm;
delete []crossThread;
bool ok = pool_destroy(pool);
REQUIRE(ok);
REQUIRE_MESSAGE(!liveRegions.load(std::memory_order_relaxed), "Expected all regions to be released.");
}
void *CrossThreadGetMem(intptr_t pool_id, size_t &bytes)
{
if (poolSpace[pool_id].pos + bytes > poolSpace[pool_id].bufSize)
return nullptr;
void *ret = poolSpace[pool_id].space + poolSpace[pool_id].pos;
poolSpace[pool_id].pos += bytes;
poolSpace[pool_id].regions++;
return ret;
}
int CrossThreadPutMem(intptr_t pool_id, void* /*raw_ptr*/, size_t /*raw_bytes*/)
{
poolSpace[pool_id].regions--;
return 0;
}
class CrossThreadRun: utils::NoAssign {
static long number_of_threads;
static utils::SpinBarrier barrier;
static rml::MemoryPool **pool;
static char **obj;
public:
static void initBarrier(unsigned thrds) { barrier.initialize(thrds); }
static void init(long num) {
number_of_threads = num;
pool = new rml::MemoryPool*[number_of_threads];
poolSpace = new PoolSpace[number_of_threads];
obj = new char*[number_of_threads];
}
static void destroy() {
for (long i=0; i<number_of_threads; i++)
REQUIRE_MESSAGE(!poolSpace[i].regions, "Memory leak detected");
delete []pool;
delete []poolSpace;
delete []obj;
}
CrossThreadRun() {}
void operator()( int id ) const {
rml::MemPoolPolicy pol(CrossThreadGetMem, CrossThreadPutMem);
const int objLen = 10*id;
pool_create_v1(id, &pol, &pool[id]);
obj[id] = (char*)pool_malloc(pool[id], objLen);
REQUIRE(obj[id]);
memset(obj[id], id, objLen);
{
const size_t lrgSz = 2*16*1024;
void *ptrLarge = pool_malloc(pool[id], lrgSz);
REQUIRE(ptrLarge);
memset(ptrLarge, 1, lrgSz);
// consume all small objects
while (pool_malloc(pool[id], 5 * 1024));
// releasing the large object will not give a chance to allocate more,
// since only a fixed pool can look at other bins (aligned/notAligned)
pool_free(pool[id], ptrLarge);
CHECK(!pool_malloc(pool[id], 5*1024));
}
barrier.wait();
int myPool = number_of_threads-id-1;
for (int i=0; i<10*myPool; i++)
REQUIRE(myPool==obj[myPool][i]);
pool_free(pool[myPool], obj[myPool]);
bool ok = pool_destroy(pool[myPool]);
REQUIRE(ok);
}
};
long CrossThreadRun::number_of_threads;
utils::SpinBarrier CrossThreadRun::barrier;
rml::MemoryPool **CrossThreadRun::pool;
char **CrossThreadRun::obj;
// pools created, used and destroyed by different threads
void TestCrossThreadPools()
{
for (int p=utils::MinThread; p<=utils::MaxThread; p++) {
CrossThreadRun::initBarrier(p);
CrossThreadRun::init(p);
utils::NativeParallelFor( p, CrossThreadRun() );
for (int i=0; i<p; i++)
REQUIRE_MESSAGE(!poolSpace[i].regions, "Region leak detected");
CrossThreadRun::destroy();
}
}
// buffer is too small for the pool to be created, but it must not leak resources
void TestTooSmallBuffer()
{
poolSpace = new PoolSpace(8*1024);
rml::MemPoolPolicy pol(CrossThreadGetMem, CrossThreadPutMem);
rml::MemoryPool *pool;
pool_create_v1(0, &pol, &pool);
bool ok = pool_destroy(pool);
REQUIRE(ok);
REQUIRE_MESSAGE(!poolSpace[0].regions, "Memory leak detected.");
delete poolSpace;
}
class FixedPoolHeadBase : utils::NoAssign {
size_t size;
std::atomic<bool> used;
char* data;
public:
FixedPoolHeadBase(size_t s) : size(s), used(false) {
data = new char[size];
}
void *useData(size_t &bytes) {
bool wasUsed = used.exchange(true);
REQUIRE_MESSAGE(!wasUsed, "The buffer must not be used twice.");
bytes = size;
return data;
}
~FixedPoolHeadBase() {
delete []data;
}
};
template<size_t SIZE>
class FixedPoolHead : FixedPoolHeadBase {
public:
FixedPoolHead() : FixedPoolHeadBase(SIZE) { }
};
static void *fixedBufGetMem(intptr_t pool_id, size_t &bytes)
{
return ((FixedPoolHeadBase*)pool_id)->useData(bytes);
}
class FixedPoolUse: utils::NoAssign {
static utils::SpinBarrier startB;
rml::MemoryPool *pool;
size_t reqSize;
int iters;
public:
FixedPoolUse(unsigned threads, rml::MemoryPool *p, size_t sz, int it) :
pool(p), reqSize(sz), iters(it) {
startB.initialize(threads);
}
void operator()( int /*id*/ ) const {
startB.wait();
for (int i=0; i<iters; i++) {
void *o = pool_malloc(pool, reqSize);
ASSERT(o, "Invalid object");
pool_free(pool, o);
}
}
};
utils::SpinBarrier FixedPoolUse::startB;
class FixedPoolNomem: utils::NoAssign {
utils::SpinBarrier *startB;
rml::MemoryPool *pool;
public:
FixedPoolNomem(utils::SpinBarrier *b, rml::MemoryPool *p) :
startB(b), pool(p) {}
void operator()(int id) const {
startB->wait();
void *o = pool_malloc(pool, id%2? 64 : 128*1024);
ASSERT(!o, "All memory must be consumed.");
}
};
class FixedPoolSomeMem: utils::NoAssign {
utils::SpinBarrier *barrier;
rml::MemoryPool *pool;
public:
FixedPoolSomeMem(utils::SpinBarrier *b, rml::MemoryPool *p) :
barrier(b), pool(p) {}
void operator()(int id) const {
barrier->wait();
utils::Sleep(2*id);
void *o = pool_malloc(pool, id%2? 64 : 128*1024);
barrier->wait();
pool_free(pool, o);
}
};
bool haveEnoughSpace(rml::MemoryPool *pool, size_t sz)
{
if (void *p = pool_malloc(pool, sz)) {
pool_free(pool, p);
return true;
}
return false;
}
void TestFixedBufferPool()
{
const int ITERS = 7;
const size_t MAX_OBJECT = 7*1024*1024;
void *ptrs[ITERS];
rml::MemPoolPolicy pol(fixedBufGetMem, nullptr, 0, /*fixedSizePool=*/true,
/*keepMemTillDestroy=*/false);
rml::MemoryPool *pool;
{
FixedPoolHead<MAX_OBJECT + 1024*1024> head;
pool_create_v1((intptr_t)&head, &pol, &pool);
{
utils::NativeParallelFor( 1, FixedPoolUse(1, pool, MAX_OBJECT, 2) );
for (int i=0; i<ITERS; i++) {
ptrs[i] = pool_malloc(pool, MAX_OBJECT/ITERS);
REQUIRE(ptrs[i]);
}
for (int i=0; i<ITERS; i++)
pool_free(pool, ptrs[i]);
utils::NativeParallelFor( 1, FixedPoolUse(1, pool, MAX_OBJECT, 1) );
}
// each thread asks for a MAX_OBJECT/p/2 object,
// /2 is to cover fragmentation
for (int p=utils::MinThread; p<=utils::MaxThread; p++) {
utils::NativeParallelFor( p, FixedPoolUse(p, pool, MAX_OBJECT/p/2, 10000) );
}
{
const int p = 128;
utils::NativeParallelFor( p, FixedPoolUse(p, pool, MAX_OBJECT/p/2, 1) );
}
{
size_t maxSz;
const int p = 256;
utils::SpinBarrier barrier(p);
// Find the maximal useful object size. Start with MAX_OBJECT/2,
// as the pool might be fragmented by BootStrapBlocks consumed during
// the FixedPoolUse runs above.
size_t l, r;
REQUIRE(haveEnoughSpace(pool, MAX_OBJECT/2));
for (l = MAX_OBJECT/2, r = MAX_OBJECT + 1024*1024; l < r-1; ) {
size_t mid = (l+r)/2;
if (haveEnoughSpace(pool, mid))
l = mid;
else
r = mid;
}
maxSz = l;
REQUIRE_MESSAGE(!haveEnoughSpace(pool, maxSz+1), "Expect to find boundary value.");
// consume all available memory
void *largeObj = pool_malloc(pool, maxSz);
REQUIRE(largeObj);
void *o = pool_malloc(pool, 64);
if (o) // pool fragmented, skip FixedPoolNomem
pool_free(pool, o);
else
utils::NativeParallelFor( p, FixedPoolNomem(&barrier, pool) );
pool_free(pool, largeObj);
// keep some space unoccupied
largeObj = pool_malloc(pool, maxSz-512*1024);
REQUIRE(largeObj);
utils::NativeParallelFor( p, FixedPoolSomeMem(&barrier, pool) );
pool_free(pool, largeObj);
}
bool ok = pool_destroy(pool);
REQUIRE(ok);
}
// check that a fresh, untouched pool can successfully fulfil requests from 128 threads
{
FixedPoolHead<MAX_OBJECT + 1024*1024> head;
pool_create_v1((intptr_t)&head, &pol, &pool);
int p=128;
utils::NativeParallelFor( p, FixedPoolUse(p, pool, MAX_OBJECT/p/2, 1) );
bool ok = pool_destroy(pool);
REQUIRE(ok);
}
}
static size_t currGranularity;
static void *getGranMem(intptr_t /*pool_id*/, size_t &bytes)
{
REQUIRE_MESSAGE(!(bytes%currGranularity), "Region size does not match granularity.");
return malloc(bytes);
}
static int putGranMem(intptr_t /*pool_id*/, void *ptr, size_t bytes)
{
REQUIRE_MESSAGE(!(bytes%currGranularity), "Region size does not match granularity.");
free(ptr);
return 0;
}
void TestPoolGranularity()
{
rml::MemPoolPolicy pol(getGranMem, putGranMem);
const size_t grans[] = {4*1024, 2*1024*1024, 6*1024*1024, 10*1024*1024};
for (unsigned i=0; i<sizeof(grans)/sizeof(grans[0]); i++) {
pol.granularity = currGranularity = grans[i];
rml::MemoryPool *pool;
pool_create_v1(0, &pol, &pool);
for (int sz=500*1024; sz<16*1024*1024; sz+=101*1024) {
void *p = pool_malloc(pool, sz);
REQUIRE_MESSAGE(p, "Can't allocate memory in pool.");
pool_free(pool, p);
}
bool ok = pool_destroy(pool);
REQUIRE(ok);
}
}
static size_t putMemAll, getMemAll, getMemSuccessful;
static void *getMemMalloc(intptr_t /*pool_id*/, size_t &bytes)
{
getMemAll++;
void *p = malloc(bytes);
if (p)
getMemSuccessful++;
return p;
}
static int putMemFree(intptr_t /*pool_id*/, void *ptr, size_t /*bytes*/)
{
putMemAll++;
free(ptr);
return 0;
}
void TestPoolKeepTillDestroy()
{
const int ITERS = 50*1024;
void *ptrs[2*ITERS+1];
rml::MemPoolPolicy pol(getMemMalloc, putMemFree);
rml::MemoryPool *pool;
// first create a default pool that returns memory back to the callback,
// then use the keepMemTillDestroy policy
for (int keep=0; keep<2; keep++) {
getMemAll = putMemAll = 0;
if (keep)
pol.keepAllMemory = 1;
pool_create_v1(0, &pol, &pool);
for (int i=0; i<2*ITERS; i+=2) {
ptrs[i] = pool_malloc(pool, 7*1024);
ptrs[i+1] = pool_malloc(pool, 10*1024);
}
ptrs[2*ITERS] = pool_malloc(pool, 8*1024*1024);
REQUIRE(!putMemAll);
for (int i=0; i<2*ITERS; i++)
pool_free(pool, ptrs[i]);
pool_free(pool, ptrs[2*ITERS]);
size_t totalPutMemCalls = putMemAll;
if (keep)
REQUIRE(!putMemAll);
else {
REQUIRE(putMemAll);
putMemAll = 0;
}
size_t getCallsBefore = getMemAll;
void *p = pool_malloc(pool, 8*1024*1024);
REQUIRE(p);
if (keep)
REQUIRE_MESSAGE(getCallsBefore == getMemAll, "Must not lead to new getMem call");
size_t putCallsBefore = putMemAll;
bool ok = pool_reset(pool);
REQUIRE(ok);
REQUIRE_MESSAGE(putCallsBefore == putMemAll, "Pool is not releasing memory during reset.");
ok = pool_destroy(pool);
REQUIRE(ok);
REQUIRE(putMemAll);
totalPutMemCalls += putMemAll;
REQUIRE_MESSAGE(getMemAll == totalPutMemCalls, "Memory leak detected.");
}
}
static bool memEqual(char *buf, size_t size, int val)
{
bool memEq = true;
for (size_t k=0; k<size; k++)
if (buf[k] != val)
memEq = false;
return memEq;
}
void TestEntries()
{
const int SZ = 4;
const int ALGN = 4;
size_t size[SZ] = {8, 8000, 9000, 100*1024};
size_t algn[ALGN] = {8, 64, 4*1024, 8*1024*1024};
rml::MemPoolPolicy pol(getGranMem, putGranMem);
currGranularity = 1; // do not check granularity in this test
rml::MemoryPool *pool;
pool_create_v1(0, &pol, &pool);
for (int i=0; i<SZ; i++)
for (int j=0; j<ALGN; j++) {
char *p = (char*)pool_aligned_malloc(pool, size[i], algn[j]);
REQUIRE((p && 0==((uintptr_t)p & (algn[j]-1))));
memset(p, j, size[i]);
size_t curr_algn = algn[rand() % ALGN];
size_t curr_sz = size[rand() % SZ];
char *p1 = (char*)pool_aligned_realloc(pool, p, curr_sz, curr_algn);
REQUIRE((p1 && 0==((uintptr_t)p1 & (curr_algn-1))));
REQUIRE(memEqual(p1, utils::min(size[i], curr_sz), j));
memset(p1, j+1, curr_sz);
size_t curr_sz1 = size[rand() % SZ];
char *p2 = (char*)pool_realloc(pool, p1, curr_sz1);
REQUIRE(p2);
REQUIRE(memEqual(p2, utils::min(curr_sz1, curr_sz), j+1));
pool_free(pool, p2);
}
bool ok = pool_destroy(pool);
REQUIRE(ok);
bool fail = rml::pool_destroy(nullptr);
REQUIRE(!fail);
fail = rml::pool_reset(nullptr);
REQUIRE(!fail);
}
rml::MemoryPool *CreateUsablePool(size_t size)
{
rml::MemoryPool *pool;
rml::MemPoolPolicy okPolicy(getMemMalloc, putMemFree);
putMemAll = getMemAll = getMemSuccessful = 0;
rml::MemPoolError res = pool_create_v1(0, &okPolicy, &pool);
if (res != rml::POOL_OK) {
REQUIRE_MESSAGE((!getMemAll && !putMemAll), "No callbacks after fail.");
return nullptr;
}
void *o = pool_malloc(pool, size);
if (!getMemSuccessful) {
// no memory from callback, valid reason to leave
REQUIRE_MESSAGE(!o, "The pool must be unusable.");
return nullptr;
}
REQUIRE_MESSAGE(o, "Created pool must be useful.");
REQUIRE_MESSAGE((getMemSuccessful == 1 || getMemSuccessful == 5 || getMemAll > getMemSuccessful),
"Multiple requests are allowed when an unsuccessful request occurred or bootstrap memory cannot be searched.");
REQUIRE(!putMemAll);
pool_free(pool, o);
return pool;
}
void CheckPoolLeaks(size_t poolsAlwaysAvailable)
{
const size_t MAX_POOLS = 16*1000;
const int ITERS = 20, CREATED_STABLE = 3;
rml::MemoryPool *pools[MAX_POOLS];
size_t created, maxCreated = MAX_POOLS;
int maxNotChangedCnt = 0;
// expect that within ITERS runs the maximum number of pools that can be
// created stabilizes and remains stable for CREATED_STABLE consecutive runs
for (int j=0; j<ITERS && maxNotChangedCnt<CREATED_STABLE; j++) {
for (created=0; created<maxCreated; created++) {
rml::MemoryPool *p = CreateUsablePool(1024);
if (!p)
break;
pools[created] = p;
}
REQUIRE_MESSAGE(created>=poolsAlwaysAvailable,
"Expect that a reasonable number of pools can always be created.");
for (size_t i=0; i<created; i++) {
bool ok = pool_destroy(pools[i]);
REQUIRE(ok);
}
if (created < maxCreated) {
maxCreated = created;
maxNotChangedCnt = 0;
} else
maxNotChangedCnt++;
}
REQUIRE_MESSAGE(maxNotChangedCnt == CREATED_STABLE, "The number of created pools must be stabilized.");
}
void TestPoolCreation()
{
putMemAll = getMemAll = getMemSuccessful = 0;
rml::MemPoolPolicy nullPolicy(nullptr, putMemFree),
emptyFreePolicy(getMemMalloc, nullptr),
okPolicy(getMemMalloc, putMemFree);
rml::MemoryPool *pool;
rml::MemPoolError res = pool_create_v1(0, &nullPolicy, &pool);
REQUIRE_MESSAGE(res==rml::INVALID_POLICY, "pool with empty pAlloc can't be created");
res = pool_create_v1(0, &emptyFreePolicy, &pool);
REQUIRE_MESSAGE(res==rml::INVALID_POLICY, "pool with empty pFree can't be created");
REQUIRE_MESSAGE((!putMemAll && !getMemAll), "no callback calls are expected");
res = pool_create_v1(0, &okPolicy, &pool);
REQUIRE(res==rml::POOL_OK);
bool ok = pool_destroy(pool);
REQUIRE(ok);
REQUIRE_MESSAGE(putMemAll == getMemSuccessful, "no leaks after pool_destroy");
// 32 is a guess for a number of pools that is acceptable everywhere
CheckPoolLeaks(32);
// try to consume all but 16 TLS keys
LimitTLSKeysTo limitTLSTo(16);
// ...and check that we can create at least 16 pools
CheckPoolLeaks(16);
}
struct AllocatedObject {
rml::MemoryPool *pool;
};
const size_t BUF_SIZE = 1024*1024;
class PoolIdentityCheck : utils::NoAssign {
rml::MemoryPool** const pools;
AllocatedObject** const objs;
public:
PoolIdentityCheck(rml::MemoryPool** p, AllocatedObject** o) : pools(p), objs(o) {}
void operator()(int id) const {
objs[id] = (AllocatedObject*)pool_malloc(pools[id], BUF_SIZE/2);
REQUIRE(objs[id]);
rml::MemoryPool *act_pool = rml::pool_identify(objs[id]);
REQUIRE(act_pool == pools[id]);
for (size_t total=0; total<2*BUF_SIZE; total+=256) {
AllocatedObject *o = (AllocatedObject*)pool_malloc(pools[id], 256);
REQUIRE(o);
act_pool = rml::pool_identify(o);
REQUIRE(act_pool == pools[id]);
pool_free(act_pool, o);
}
if( id&1 ) { // make every second returned object "small"
pool_free(act_pool, objs[id]);
objs[id] = (AllocatedObject*)pool_malloc(pools[id], 16);
REQUIRE(objs[id]);
}
objs[id]->pool = act_pool;
}
};
void TestPoolDetection()
{
const int POOLS = 4;
rml::MemPoolPolicy pol(fixedBufGetMem, nullptr, 0, /*fixedSizePool=*/true,
/*keepMemTillDestroy=*/false);
rml::MemoryPool *pools[POOLS];
FixedPoolHead<BUF_SIZE*POOLS> head[POOLS];
AllocatedObject *objs[POOLS];
for (int i=0; i<POOLS; i++)
pool_create_v1((intptr_t)(head+i), &pol, &pools[i]);
// if an object were somehow released to a different pool, subsequent allocation
// from the affected pools would become impossible
for (int k=0; k<10; k++) {
PoolIdentityCheck check(pools, objs);
if( k&1 )
utils::NativeParallelFor( POOLS, check);
else
for (int i=0; i<POOLS; i++) check(i);
for (int i=0; i<POOLS; i++) {
rml::MemoryPool *p = rml::pool_identify(objs[i]);
REQUIRE(p == objs[i]->pool);
pool_free(p, objs[i]);
}
}
for (int i=0; i<POOLS; i++) {
bool ok = pool_destroy(pools[i]);
REQUIRE(ok);
}
}
void TestLazyBootstrap()
{
rml::MemPoolPolicy pol(getMemMalloc, putMemFree);
const size_t sizes[] = {8, 9*1024, 0};
for (int i=0; sizes[i]; i++) {
rml::MemoryPool *pool = CreateUsablePool(sizes[i]);
bool ok = pool_destroy(pool);
REQUIRE(ok);
REQUIRE_MESSAGE(getMemSuccessful == putMemAll, "Memory leak detected.");
}
}
class NoLeakOnDestroyRun: utils::NoAssign {
rml::MemoryPool *pool;
utils::SpinBarrier *barrier;
public:
NoLeakOnDestroyRun(rml::MemoryPool *p, utils::SpinBarrier *b) : pool(p), barrier(b) {}
void operator()(int id) const {
void *p = pool_malloc(pool, id%2? 8 : 9000);
REQUIRE((p && liveRegions.load(std::memory_order_relaxed)));
barrier->wait();
if (!id) {
bool ok = pool_destroy(pool);
REQUIRE(ok);
REQUIRE_MESSAGE(!liveRegions.load(std::memory_order_relaxed), "Expected all regions to be released.");
}
// other threads must wait until pool destruction,
// so that per-thread cleanup is not run before it
barrier->wait();
}
};
void TestNoLeakOnDestroy()
{
liveRegions.store(0, std::memory_order_release);
for (int p=utils::MinThread; p<=utils::MaxThread; p++) {
rml::MemPoolPolicy pol(getMallocMem, putMallocMem);
utils::SpinBarrier barrier(p);
rml::MemoryPool *pool;
pool_create_v1(0, &pol, &pool);
utils::NativeParallelFor(p, NoLeakOnDestroyRun(pool, &barrier));
}
}
static int putMallocMemError(intptr_t /*pool_id*/, void *ptr, size_t bytes)
{
MallocPoolHeader *hdr = (MallocPoolHeader*)ptr-1;
REQUIRE_MESSAGE(bytes == hdr->userSize, "Invalid size in pool callback.");
free(hdr->rawPtr);
liveRegions--;
return -1;
}
void TestDestroyFailed()
{
rml::MemPoolPolicy pol(getMallocMem, putMallocMemError);
rml::MemoryPool *pool;
pool_create_v1(0, &pol, &pool);
void *ptr = pool_malloc(pool, 16);
REQUIRE(ptr);
bool fail = pool_destroy(pool);
REQUIRE_MESSAGE(fail==false, "putMallocMemError callback returns an error, "
"expect pool_destroy() failure");
}
void TestPoolMSize() {
rml::MemoryPool *pool = CreateUsablePool(1024);
const int SZ = 10;
// Original allocation requests, random numbers from small to large
size_t requestedSz[SZ] = {8, 16, 500, 1000, 2000, 4000, 8000, 1024*1024, 4242+4242, 8484+8484};
// Unlike large objects, small objects do not store their original size along with the object itself.
// On Power architecture TLS bins are divided differently.
size_t allocatedSz[SZ] =
#if __powerpc64__ || __ppc64__ || __bgp__
{8, 16, 512, 1024, 2688, 5376, 8064, 1024*1024, 4242+4242, 8484+8484};
#else
{8, 16, 512, 1024, 2688, 4032, 8128, 1024*1024, 4242+4242, 8484+8484};
#endif
for (int i = 0; i < SZ; i++) {
void* obj = pool_malloc(pool, requestedSz[i]);
size_t objSize = pool_msize(pool, obj);
REQUIRE_MESSAGE(objSize == allocatedSz[i], "pool_msize returned the wrong value");
pool_free(pool, obj);
}
bool destroyed = pool_destroy(pool);
REQUIRE(destroyed);
}
//! \brief \ref error_guessing
TEST_CASE("Too small buffer") {
TestTooSmallBuffer();
}
//! \brief \ref error_guessing
TEST_CASE("Pool reset") {
TestPoolReset();
}
TEST_CASE("Shared pool") {
TestSharedPool();
}
//! \brief \ref error_guessing
TEST_CASE("Cross thread pools") {
TestCrossThreadPools();
}
//! \brief \ref interface
TEST_CASE("Fixed buffer pool") {
TestFixedBufferPool();
}
//! \brief \ref interface
TEST_CASE("Pool granularity") {
TestPoolGranularity();
}
//! \brief \ref error_guessing
TEST_CASE("Keep pool till destroy") {
TestPoolKeepTillDestroy();
}
//! \brief \ref error_guessing
TEST_CASE("Entries") {
TestEntries();
}
//! \brief \ref interface
TEST_CASE("Pool creation") {
TestPoolCreation();
}
//! \brief \ref error_guessing
TEST_CASE("Pool detection") {
TestPoolDetection();
}
//! \brief \ref error_guessing
TEST_CASE("Lazy bootstrap") {
TestLazyBootstrap();
}
//! \brief \ref error_guessing
TEST_CASE("No leak on destroy") {
TestNoLeakOnDestroy();
}
//! \brief \ref error_guessing
TEST_CASE("Destroy failed") {
TestDestroyFailed();
}
//! \brief \ref interface
TEST_CASE("Pool msize") {
TestPoolMSize();
}
```
|
Songs for Beginners is the debut solo studio album by English singer-songwriter Graham Nash. Released in May 1971, it was one of four high-profile albums (all charting within the top fifteen) released by each member of Crosby, Stills, Nash & Young in the wake of their chart-topping Déjà Vu album of 1970, along with After the Gold Rush (Neil Young, September 1970), Stephen Stills (Stephen Stills, November 1970) and If I Could Only Remember My Name (David Crosby, February 1971). Songs for Beginners peaked at No. 15 on the Billboard Top Pop Albums chart, and the single "Chicago" made it to No. 35 on the Billboard Hot 100. It has been certified a gold record by the RIAA.
History
Nash brought in an impressive group of guests to assist in the recording, including David Crosby, Jerry Garcia, Phil Lesh, Dave Mason, David Lindley, Rita Coolidge, and Neil Young (under Young's early 1970s pseudonym Joe Yankee). The making of this album directly followed Nash's break-up with longtime girlfriend Joni Mitchell. Many of the songs are about their time together. The Top 40 track "Chicago" concerned both the 1968 Democratic National Convention and the trial of the Chicago Eight, articulating the outrage Nash felt concerning those proceedings.
"Wounded Bird" was written for Stephen Stills, about the pains he was going through in his relationship with Judy Collins. "Better Days" was also written for Stills, after Rita Coolidge left him for Nash.
A first-generation compact disc was released in the late 1980s and reissued in 2011. A remixed version supervised by Nash was issued on 180-gram vinyl only by Classic Records in 2001. A deluxe edition of Songs for Beginners was released on 23 September 2008 as a CD+DVD-Audio pack, featuring bonus multichannel high-resolution audio, an all-new 2008 video interview with Nash, a photo gallery, and complete lyrics, along with the remastered 11-track CD album.
The song "Simple Man" features in the opening sequence of the 2007 film Reign Over Me, and a copy of the album appears in it. The same song was also used in the final minutes of the finale of the HBO series Looking. The song "Better Days" appears in episode 2 of Fox TV's The Passage, released in 2019. A demo version of "Be Yourself" plays during the closing credits of the film Up in the Air. "Military Madness" has been covered live by Death Cab For Cutie, and was covered by indie-rock band Woods on their 2009 album Songs of Shame.
In 2018, the song "Better Days" was used as the closing credit song in the Showtime miniseries Escape at Dannemora, Episode 7. In 2021, "Better Days" was played over the closing credits of the HBO Max series Hacks, Episode 6.
Track listing
Personnel
Graham Nash — vocals; guitar all tracks except "Better Days" and "Simple Man"; piano on "Better Days", "Simple Man", "Chicago" and "We Can Change the World"; organ on "Better Days", "There's Only One", "Chicago" and "We Can Change the World"; paper and comb on "Sleep Song"; tambourine on "Chicago" and "We Can Change the World"
Rita Coolidge — piano on "Be Yourself" and "There's Only One"; electric piano on "Be Yourself"; backing vocals on "Military Madness", "Better Days", "Simple Man", "There's Only One", "Chicago" and "We Can Change the World"
Jerry Garcia — pedal steel guitar on "I Used to Be a King" and "Man in the Mirror"
Neil Young — piano on "Better Days", "Man in the Mirror" and "I Used to Be a King"
Dorian Rudnytsky — cello on "Simple Man" and "Sleep Song"
Dave Mason — electric guitar on "Military Madness"
David Crosby — electric guitar on "I Used to Be a King"
Joel Bernstein — piano on "Military Madness"
Bobby Keys — saxophone on "There's Only One"
David Lindley — fiddle on "Simple Man"
Sermon Posthumas — bass clarinet on "Better Days"
Chris Ethridge — bass on "Man in the Mirror", "There's Only One", "Chicago" and "We Can Change the World"
Calvin "Fuzzy" Samuels — bass on "Military Madness", "Better Days" and "Be Yourself"
Phil Lesh — bass on "I Used to Be a King"
John Barbata — drums on "Military Madness", "I Used to Be a King", "Be Yourself", "Man in the Mirror", "There's Only One", "Chicago" and "We Can Change the World"; tambourine on "Chicago"
Dallas Taylor — drums on "Better Days"
P.P. Arnold — backing vocals on "Military Madness"
Venetta Fields, Sherlie Matthews, Clydie King, Dorothy Morrison — backing vocals on "There's Only One", "Chicago" and "We Can Change the World"
Production personnel
Graham Nash – producer
Bill Halverson, Russ Gary, Larry Cox — recording engineers
Glyn Johns – Mixing
Doug Sax – mastering
Gary Burden — art direction
Joel Bernstein, Graham Nash – photography
Charts
Singles
Certification
References
1971 debut albums
Atlantic Records albums
Graham Nash albums
Albums produced by Graham Nash
|
"I Was the One" is a song by Elvis Presley, written by Aaron Schroeder, Bill Peppers, Claude Demetrius and Hal Blair.
Presley recorded it at RCA's Studios, Nashville, on January 11, 1956. It was released as the B-side of the "Heartbreak Hotel" single (RCA Victor 20-6420 (78 rpm record) and RCA Victor 47-6420 (single)) in 1956, and was produced by Steve Sholes.
Other versions
The Swedish band Streaplers has also recorded the song; their version was released on the LP Speed (Bohus Bglp 5010) in 1978.
Country musician Jimmie Dale Gilmore also recorded a version on his album, Spinning Around the Sun.
References
1956 singles
Elvis Presley songs
1956 songs
Songs written by Aaron Schroeder
Songs written by Claude Demetrius
Song recordings produced by Stephen H. Sholes
Songs written by Hal Blair
|
Jenico Preston, 7th Viscount Gormanston (born at Gormanston, County Meath 1631; died at Limerick 17 March 1691), was an Irish peer, Jacobite soldier and landowner.
Life
The elder son of Nicholas Preston, 6th Viscount Gormanston, and Mary Barnewall, eldest daughter of Nicholas, 1st Viscount Barnewall, by his wife Bridget, Dowager Countess of Tyrone, eldest daughter and co-heiress of Henry FitzGerald, 12th Earl of Kildare, he succeeded to his father's title in 1643. In July 1647 Lord Gormanston is recorded as attending a Jesuit school at Kilkenny, and in 1651 he went into exile with King Charles II after his lands were confiscated following the defeat of the Royalist and Confederate forces by Cromwell. The Gormanston estates, held by his father before the Irish Rebellion of 1641, were restored to him in 1660 upon the Stuart Restoration.
Commissioned into the Irish Army in September 1685 as a lieutenant in the regiment of Colonel Richard Talbot, Duke of Tyrconnell, he was promoted captain in March 1686. He then successfully petitioned King James II to reverse his father's outlawry and also in 1686 was sworn of the Privy Council of Ireland. Appointed Alderman of Drogheda in 1687 and Burgess of Athboy in 1689 by Royal Charter, Viscount Gormanston served as Lord Lieutenant of Meath (1689–1691).
A member of the Irish House of Lords, he sat in the short-lived Patriot Parliament called by James II in 1689. Lord Gormanston was appointed a Commissioner of the Treasury in 1690.
Promoted lieutenant-colonel in the Irish Army, Lord Gormanston served at the battles of Cavan and the Boyne in 1690. On 17 March 1691, during the Siege of Limerick, the 7th Viscount died leaving no male heir, and was succeeded in the family title by his nephew, Jenico Preston (1640–1697), as de jure 8th Viscount.
Married twice, firstly to Lady Frances Leke (who died without issue in 1682), daughter of Francis Leke, 1st Earl of Scarsdale, and by his second wife, Margaret Molyneux, daughter of Caryll, 3rd Viscount Molyneux, Lord Gormanston had an only daughter, Mary Preston (who married her cousin, Anthony Preston, de jure 9th Viscount Gormanston).
The 7th Viscount was posthumously indicted for high treason and declared an outlaw by decree of King William III on 16 April 1691, and the viscountcy was thereby attainted (it was later restored in 1800).
See also
Gormanston Castle
Lodge's Peerage of Ireland
References
External links
www.nli.ie
www.patrickcomerford.com
1631 births
1691 deaths
17th-century Irish people
Irish Jacobites
Jacobite military personnel of the Williamite War in Ireland
Lord-Lieutenants of Meath
Members of the Irish House of Lords
Members of the Privy Council of Ireland
Viscounts in the Peerage of Ireland
|
Lena Horne at the Waldorf Astoria is a 1957 live album by Lena Horne, conducted by Lennie Hayton and recorded in stereo at the Waldorf-Astoria Hotel in New York City on the evening of December 31, 1956. One of the first non-classical live albums to be recorded in stereo, the monaural release peaked at No. 24 on the Billboard albums chart and became the best-selling record by a female artist in the history of the RCA Victor label. The album was reissued on CD in 2002 by Collectables Records, together with Horne's 1961 live album Lena Horne at the Sands.
Track listing
"Today I Love Everybody" (Harold Arlen, Dorothy Fields) - 2:55
"Let Me Love You" (Bart Howard, Lou Levy) - 3:06
"Come Runnin'" (Roc Hillman) - 2:42
Cole Porter Medley: "How's Your Romance?"/"After You"/"Love of My Life"/"It's All Right with Me" - 7:21
"Mood Indigo"/"I'm Beginning to See the Light" (Duke Ellington, Mitchell Parish, Barney Bigard, Irving Mills)/(Ellington, Don George, Johnny Hodges, Harry James) - 4:30
"How You Say It" (Matt Dubey, Harold Karr) - 3:16
"Honeysuckle Rose" (Fats Waller, Andy Razaf) - 2:59
"Day In, Day Out" (Rube Bloom, Johnny Mercer) - 2:07
"New Fangled Tango" (Dubey, Karr) - 3:06
"I Love to Love" (Herbert Baker) - 4:20
"From This Moment On" (Porter) - 1:57
Personnel
Lena Horne - vocals
Nat Brandwynne & His Orchestra - orchestra
Lennie Hayton - conductor
References
RCA Victor live albums
Lena Horne live albums
1957 live albums
Albums conducted by Lennie Hayton
Albums arranged by Lennie Hayton
|
Sanjay Kirpal is a Fijian politician and Member of the Parliament of Fiji for the FijiFirst Party. He was elected to Parliament in the 2018 election. He is a registered valuer by profession and owns a valuation company, Professional Valuations Limited.
He has worked for the Ministry of Lands and holds a Bachelor of Arts degree in Land Management and Development.
References
Living people
Indian members of the Parliament of Fiji
FijiFirst politicians
Year of birth missing (living people)
|
```cmake
#.rst:
# clapack config for vcpkg
# ------------
#
# Find clapack as a valid LAPACK implementation.
#
# The module defines the same outputs as FindLAPACK by cmake
include(${CMAKE_ROOT}/Modules/SelectLibraryConfigurations.cmake)
include(${CMAKE_ROOT}/Modules/FindPackageHandleStandardArgs.cmake)
set(CLAPACK_VERSION "3.2.1")
set(LAPACK_VERSION "${CLAPACK_VERSION}")
#set(CMAKE_THREAD_PREFER_PTHREAD TRUE)
find_package(Threads)
find_package(clapack CONFIG REQUIRED) # This will be found!
if(NOT TARGET lapack)
message(FATAL_ERROR "Target lapack was not created by find_package(clapack)!")
endif()
if(NOT TARGET LAPACK::LAPACK)
add_library(LAPACK::LAPACK INTERFACE IMPORTED)
target_link_libraries(LAPACK::LAPACK INTERFACE lapack)
set(lib_prop IMPORTED_LOCATION)
#if(@VCPKG_LIBRARY_LINKAGE@ STREQUAL "dynamic" AND WIN32)
# set(lib_prop IMPORTED_IMPLIB)
#endif()
get_property(LAPACK_LIBRARY_RELEASE TARGET lapack PROPERTY ${lib_prop}_RELEASE)
get_property(LAPACK_LIBRARY_DEBUG TARGET lapack PROPERTY ${lib_prop}_DEBUG)
get_property(LAPACK_INCLUDE_DIR TARGET lapack PROPERTY INTERFACE_INCLUDE_DIRECTORIES) # Doesn't make much sense but ok.
select_library_configurations(LAPACK)
get_property(LAPACK_LINKER_FLAGS_RELEASE TARGET lapack PROPERTY IMPORTED_LINK_INTERFACE_LIBRARIES_RELEASE)
get_property(LAPACK_LINKER_FLAGS_DEBUG TARGET lapack PROPERTY IMPORTED_LINK_INTERFACE_LIBRARIES_DEBUG)
list(TRANSFORM LAPACK_LINKER_FLAGS_DEBUG PREPEND "$<$<CONFIG:DEBUG>:")
list(TRANSFORM LAPACK_LINKER_FLAGS_DEBUG APPEND ">")
list(TRANSFORM LAPACK_LINKER_FLAGS_RELEASE PREPEND "$<$<NOT:$<CONFIG:DEBUG>>:")
list(TRANSFORM LAPACK_LINKER_FLAGS_RELEASE APPEND ">")
set(LAPACK_LIBRARIES "${LAPACK_LIBRARIES};${LAPACK_LINKER_FLAGS_DEBUG};${LAPACK_LINKER_FLAGS_RELEASE}")
set(LAPACK95_LIBRARIES "${LAPACK_LIBRARIES}")
set(LAPACK95_FOUND "TRUE")
set(LAPACK_LINKER_FLAGS "${LAPACK_LIBRARIES}")
endif()
find_package_handle_standard_args(LAPACK DEFAULT_MSG LAPACK_LIBRARY LAPACK_INCLUDE_DIR )
mark_as_advanced(LAPACK_INCLUDE_DIR LAPACK_LIBRARY)
```
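A consumer project would normally not reference clapack directly; the wrapper above exposes the standard FindLAPACK outputs, so ordinary `find_package(LAPACK)` usage suffices. A minimal sketch, assuming the vcpkg toolchain is active (the `solver` target and `main.c` are placeholder names):

```cmake
# Hypothetical consumer CMakeLists.txt; resolves to the wrapper above
# when vcpkg's toolchain file supplies it as the LAPACK find module.
cmake_minimum_required(VERSION 3.15)
project(solver C)

find_package(LAPACK REQUIRED)

add_executable(solver main.c)
# The imported LAPACK::LAPACK target carries the include directories
# and the per-configuration link flags assembled by the wrapper.
target_link_libraries(solver PRIVATE LAPACK::LAPACK)
```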
|
Saltlick Creek is a tributary of the Little Kanawha River, long, located in central West Virginia in the United States. Via the Little Kanawha and Ohio rivers, it is part of the watershed of the Mississippi River, draining an area of in a rural region on the unglaciated portion of the Allegheny Plateau.
Saltlick Creek flows for its entire length in Braxton County. It rises approximately south of Flatwoods and flows generally northward, through the communities of Corley, Rollyson, and Gem to its mouth at the Little Kanawha River in Burnsville. Downstream of Rollyson, the creek is paralleled by West Virginia Route 5.
According to the West Virginia Department of Environmental Protection, approximately 80% of the Saltlick Creek watershed is forested, mostly deciduous. Approximately 18% is used for pasture and agriculture.
According to the Geographic Names Information System, Saltlick Creek has historically been known by the variant names "Salt Lick," "Salt Lick Creek," "Salt Lick Fork," and "Saltlick Fork." The creek was named for a nearby mineral lick.
See also
List of rivers of West Virginia
References
Rivers of West Virginia
Little Kanawha River
Rivers of Braxton County, West Virginia
|
```csharp
namespace Roslynator.CSharp.Analysis;

internal static class DiagnosticPropertyKeys
{
    internal static readonly string ImplicitToCollectionExpression = nameof(ImplicitToCollectionExpression);
    internal static readonly string CollectionExpressionToImplicit = nameof(CollectionExpressionToImplicit);
    internal static readonly string ExplicitToCollectionExpression = nameof(ExplicitToCollectionExpression);
    internal static readonly string VarToExplicit = nameof(VarToExplicit);
}
```
|
```yaml
contexts:
  - name: master
    prelude: |
      $LOAD_PATH.unshift(File.expand_path("ext/arrow"))
      $LOAD_PATH.unshift(File.expand_path("lib"))
prelude: |-
  require "arrow"
  require "faker"
  state = ENV.fetch("FAKER_RANDOM_SEED", 17).to_i
  Faker::Config.random = Random.new(state)
  n_rows = 1000
  n_columns = 10
  type = Arrow::ListDataType.new(name: "values", type: :double)
  fields = {}
  arrays = {}
  n_columns.times do |i|
    column_name = "column_#{i}"
    fields[column_name] = type
    arrays[column_name] = n_rows.times.map do
      n_elements = Faker::Number.within(range: 1 ... 100)
      n_elements.times.map do
        Faker::Number.normal(mean: 0, standard_deviation: 1e+6)
      end
    end
  end
  record_batch = Arrow::RecordBatch.new(fields, arrays)
  def pure_ruby_raw_records(record_batch)
    n_rows = record_batch.n_rows
    n_columns = record_batch.n_columns
    columns = record_batch.columns
    records = []
    i = 0
    while i < n_rows
      record = []
      j = 0
      while j < n_columns
        record << columns[j][i]
        j += 1
      end
      records << record
      i += 1
    end
    records
  end
benchmark:
  pure_ruby: |-
    pure_ruby_raw_records(record_batch)
  raw_records: |-
    record_batch.raw_records
```
|
```go
package util

import (
    "testing"
)

func TestGetOSVersionLinux(t *testing.T) {
    testCases := []struct {
        name              string
        fakeOSReleasePath string
        expectedOSVersion string
        expectErr         bool
    }{
        {
            name:              "COS",
            fakeOSReleasePath: "testdata/os-release-cos",
            expectedOSVersion: "cos 77-12293.0.0",
            expectErr:         false,
        },
        {
            name:              "Debian",
            fakeOSReleasePath: "testdata/os-release-debian",
            expectedOSVersion: "debian 9 (stretch)",
            expectErr:         false,
        },
        {
            name:              "Ubuntu",
            fakeOSReleasePath: "testdata/os-release-ubuntu",
            expectedOSVersion: "ubuntu 16.04.6 LTS (Xenial Xerus)",
            expectErr:         false,
        },
        {
            name:              "centos",
            fakeOSReleasePath: "testdata/os-release-centos",
            expectedOSVersion: "centos 7 (Core)",
            expectErr:         false,
        },
        {
            name:              "rocky",
            fakeOSReleasePath: "testdata/os-release-rocky",
            expectedOSVersion: "rocky 8.5 (Green Obsidian)",
            expectErr:         false,
        },
        {
            name:              "rhel",
            fakeOSReleasePath: "testdata/os-release-rhel",
            expectedOSVersion: "rhel 7.7 (Maipo)",
            expectErr:         false,
        },
        {
            name:              "ol",
            fakeOSReleasePath: "testdata/os-release-ol",
            expectedOSVersion: "ol 9.0",
            expectErr:         false,
        },
        {
            name:              "amzn",
            fakeOSReleasePath: "testdata/os-release-amzn",
            expectedOSVersion: "amzn 2",
            expectErr:         false,
        },
        {
            name:              "sles",
            fakeOSReleasePath: "testdata/os-release-sles",
            expectedOSVersion: "sles 15-SP4",
            expectErr:         false,
        },
        {
            name:              "mariner",
            fakeOSReleasePath: "testdata/os-release-mariner",
            expectedOSVersion: "mariner 2.0.20240123",
            expectErr:         false,
        },
        {
            name:              "azurelinux",
            fakeOSReleasePath: "testdata/os-release-azurelinux",
            expectedOSVersion: "azurelinux 3.0.20240328",
            expectErr:         false,
        },
        {
            name:              "Unknown",
            fakeOSReleasePath: "testdata/os-release-unknown",
            expectedOSVersion: "",
            expectErr:         true,
        },
        {
            name:              "Empty",
            fakeOSReleasePath: "testdata/os-release-empty",
            expectedOSVersion: "",
            expectErr:         true,
        },
    }
    for _, tt := range testCases {
        tc := tt
        t.Run(tc.name, func(t *testing.T) {
            t.Parallel()
            osVersion, err := getOSVersion(tc.fakeOSReleasePath)
            if tc.expectErr && err == nil {
                t.Errorf("Expect to get error, but got no returned error.")
            }
            if !tc.expectErr && err != nil {
                t.Errorf("Expect to get no error, but got returned error: %v", err)
            }
            if !tc.expectErr && osVersion != tc.expectedOSVersion {
                t.Errorf("Wanted: %+v. \nGot: %+v", tc.expectedOSVersion, osVersion)
            }
        })
    }
}
```
|
Pyrausta dissimulans is a moth in the family Crambidae. It was described by Harrison Gray Dyar Jr. in 1914. It is found in Mexico.
References
Moths described in 1914
dissimulans
Moths of Central America
|
Ren Hui (; born 11 August 1983) is a Chinese speed skater who won a bronze medal in the women's 500 m at the 2006 Winter Olympics and a bronze at the World Single Distance Championships.
Records
Competitions
Olympic Games
2006 Winter Olympics – Women's 500 metres
2006 Winter Olympics – Women's 1000 metres
2010 Winter Olympics – Women's 1000 metres
World Single Distance Speed Skating Championships
2004 World Single Distance Speed Skating Championships – 500 m
World Sprint Speed Skating Championships
2010 World Sprint Speed Skating Championships
ISU Speed Skating World Cup
2006–07 ISU Speed Skating World Cup
2007–08 ISU Speed Skating World Cup
2008–09 ISU Speed Skating World Cup – Women's 100 metres
2008–09 ISU Speed Skating World Cup – Women's 500 metres
2008–09 ISU Speed Skating World Cup – Women's 1000 metres
2009–10 ISU Speed Skating World Cup – Women's 500 metres
2009–10 ISU Speed Skating World Cup – Women's 1000 metres
Asian Winter Games
2007 Asian Winter Games – 1000 m
See also
China at the 2006 Winter Olympics
China at the 2010 Winter Olympics
References
http://www.skateresults.com/skaters/ren_hui
1983 births
Living people
Chinese female speed skaters
Speed skaters at the 2006 Winter Olympics
Speed skaters at the 2010 Winter Olympics
Olympic speed skaters for China
Olympic bronze medalists for China
Olympic medalists in speed skating
Sportspeople from Heilongjiang
Medalists at the 2006 Winter Olympics
Asian Games medalists in speed skating
Speed skaters at the 2007 Asian Winter Games
Medalists at the 2007 Asian Winter Games
Asian Games bronze medalists for China
20th-century Chinese women
21st-century Chinese women
|
```java
package org.apache.pulsar.client.impl;
import static java.lang.String.format;
import static org.apache.pulsar.client.api.PulsarClientException.FailedFeatureCheck.SupportsGetPartitionedMetadataWithoutAutoCreation;
import io.netty.buffer.ByteBuf;
import io.opentelemetry.api.common.Attributes;
import java.net.InetSocketAddress;
import java.net.URI;
import java.util.Optional;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;
import org.apache.commons.lang3.mutable.MutableObject;
import org.apache.pulsar.client.api.PulsarClientException;
import org.apache.pulsar.client.api.SchemaSerializationException;
import org.apache.pulsar.client.impl.metrics.LatencyHistogram;
import org.apache.pulsar.common.api.proto.CommandGetTopicsOfNamespace.Mode;
import org.apache.pulsar.common.api.proto.CommandLookupTopicResponse;
import org.apache.pulsar.common.api.proto.CommandLookupTopicResponse.LookupType;
import org.apache.pulsar.common.lookup.GetTopicsResult;
import org.apache.pulsar.common.naming.NamespaceName;
import org.apache.pulsar.common.naming.TopicName;
import org.apache.pulsar.common.partition.PartitionedTopicMetadata;
import org.apache.pulsar.common.protocol.Commands;
import org.apache.pulsar.common.protocol.schema.BytesSchemaVersion;
import org.apache.pulsar.common.schema.SchemaInfo;
import org.apache.pulsar.common.util.Backoff;
import org.apache.pulsar.common.util.BackoffBuilder;
import org.apache.pulsar.common.util.FutureUtil;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class BinaryProtoLookupService implements LookupService {
private final PulsarClientImpl client;
private final ServiceNameResolver serviceNameResolver;
private final boolean useTls;
private final ExecutorService executor;
private final String listenerName;
private final int maxLookupRedirects;
private final ConcurrentHashMap<TopicName, CompletableFuture<LookupTopicResult>>
lookupInProgress = new ConcurrentHashMap<>();
private final ConcurrentHashMap<TopicName, CompletableFuture<PartitionedTopicMetadata>>
partitionedMetadataInProgress = new ConcurrentHashMap<>();
private final LatencyHistogram histoGetBroker;
private final LatencyHistogram histoGetTopicMetadata;
private final LatencyHistogram histoGetSchema;
private final LatencyHistogram histoListTopics;
public BinaryProtoLookupService(PulsarClientImpl client,
String serviceUrl,
boolean useTls,
ExecutorService executor)
throws PulsarClientException {
this(client, serviceUrl, null, useTls, executor);
}
public BinaryProtoLookupService(PulsarClientImpl client,
String serviceUrl,
String listenerName,
boolean useTls,
ExecutorService executor)
throws PulsarClientException {
this.client = client;
this.useTls = useTls;
this.executor = executor;
this.maxLookupRedirects = client.getConfiguration().getMaxLookupRedirects();
this.serviceNameResolver = new PulsarServiceNameResolver();
this.listenerName = listenerName;
updateServiceUrl(serviceUrl);
LatencyHistogram histo = client.instrumentProvider().newLatencyHistogram("pulsar.client.lookup.duration",
"Duration of lookup operations", null,
Attributes.builder().put("pulsar.lookup.transport-type", "binary").build());
histoGetBroker = histo.withAttributes(Attributes.builder().put("pulsar.lookup.type", "topic").build());
histoGetTopicMetadata =
histo.withAttributes(Attributes.builder().put("pulsar.lookup.type", "metadata").build());
histoGetSchema = histo.withAttributes(Attributes.builder().put("pulsar.lookup.type", "schema").build());
histoListTopics = histo.withAttributes(Attributes.builder().put("pulsar.lookup.type", "list-topics").build());
}
@Override
public void updateServiceUrl(String serviceUrl) throws PulsarClientException {
serviceNameResolver.updateServiceUrl(serviceUrl);
}
/**
* Calls broker binaryProto-lookup api to find broker-service address which can serve a given topic.
*
* @param topicName
* topic-name
* @return broker-socket-address that serves given topic
*/
public CompletableFuture<LookupTopicResult> getBroker(TopicName topicName) {
long startTime = System.nanoTime();
final MutableObject<CompletableFuture> newFutureCreated = new MutableObject<>();
try {
return lookupInProgress.computeIfAbsent(topicName, tpName -> {
CompletableFuture<LookupTopicResult> newFuture =
findBroker(serviceNameResolver.resolveHost(), false, topicName, 0);
newFutureCreated.setValue(newFuture);
newFuture.thenRun(() -> {
histoGetBroker.recordSuccess(System.nanoTime() - startTime);
}).exceptionally(x -> {
histoGetBroker.recordFailure(System.nanoTime() - startTime);
return null;
});
return newFuture;
});
} finally {
if (newFutureCreated.getValue() != null) {
newFutureCreated.getValue().whenComplete((v, ex) -> {
lookupInProgress.remove(topicName, newFutureCreated.getValue());
});
}
}
}
/**
* calls broker binaryProto-lookup api to get metadata of partitioned-topic.
*
*/
@Override
public CompletableFuture<PartitionedTopicMetadata> getPartitionedTopicMetadata(
TopicName topicName, boolean metadataAutoCreationEnabled, boolean useFallbackForNonPIP344Brokers) {
final MutableObject<CompletableFuture> newFutureCreated = new MutableObject<>();
try {
return partitionedMetadataInProgress.computeIfAbsent(topicName, tpName -> {
CompletableFuture<PartitionedTopicMetadata> newFuture = getPartitionedTopicMetadata(
serviceNameResolver.resolveHost(), topicName, metadataAutoCreationEnabled,
useFallbackForNonPIP344Brokers);
newFutureCreated.setValue(newFuture);
return newFuture;
});
} finally {
if (newFutureCreated.getValue() != null) {
newFutureCreated.getValue().whenComplete((v, ex) -> {
partitionedMetadataInProgress.remove(topicName, newFutureCreated.getValue());
});
}
}
}
private CompletableFuture<LookupTopicResult> findBroker(InetSocketAddress socketAddress,
boolean authoritative, TopicName topicName, final int redirectCount) {
CompletableFuture<LookupTopicResult> addressFuture = new CompletableFuture<>();
if (maxLookupRedirects > 0 && redirectCount > maxLookupRedirects) {
addressFuture.completeExceptionally(
new PulsarClientException.LookupException("Too many redirects: " + maxLookupRedirects));
return addressFuture;
}
client.getCnxPool().getConnection(socketAddress).thenAccept(clientCnx -> {
long requestId = client.newRequestId();
ByteBuf request = Commands.newLookup(topicName.toString(), listenerName, authoritative, requestId);
clientCnx.newLookup(request, requestId).whenComplete((r, t) -> {
if (t != null) {
// lookup failed
log.warn("[{}] failed to send lookup request : {}", topicName, t.getMessage());
if (log.isDebugEnabled()) {
log.debug("[{}] Lookup response exception: {}", topicName, t);
}
addressFuture.completeExceptionally(t);
} else {
URI uri = null;
try {
// (1) build response broker-address
if (useTls) {
uri = new URI(r.brokerUrlTls);
} else {
String serviceUrl = r.brokerUrl;
uri = new URI(serviceUrl);
}
InetSocketAddress responseBrokerAddress =
InetSocketAddress.createUnresolved(uri.getHost(), uri.getPort());
// (2) redirect to given address if response is: redirect
if (r.redirect) {
findBroker(responseBrokerAddress, r.authoritative, topicName, redirectCount + 1)
.thenAccept(addressFuture::complete)
.exceptionally((lookupException) -> {
Throwable cause = FutureUtil.unwrapCompletionException(lookupException);
// lookup failed
if (redirectCount > 0) {
if (log.isDebugEnabled()) {
log.debug("[{}] lookup redirection failed ({}) : {}", topicName,
redirectCount, cause.getMessage());
}
} else {
log.warn("[{}] lookup failed : {}", topicName,
cause.getMessage(), cause);
}
addressFuture.completeExceptionally(cause);
return null;
});
} else {
// (3) received correct broker to connect
if (r.proxyThroughServiceUrl) {
// Connect through proxy
addressFuture.complete(
new LookupTopicResult(responseBrokerAddress, socketAddress, true));
} else {
// Normal result with direct connection to broker
addressFuture.complete(
new LookupTopicResult(responseBrokerAddress, responseBrokerAddress, false));
}
}
} catch (Exception parseUrlException) {
// Failed to parse url
log.warn("[{}] invalid url {} : {}", topicName, uri, parseUrlException.getMessage(),
parseUrlException);
addressFuture.completeExceptionally(parseUrlException);
}
}
client.getCnxPool().releaseConnection(clientCnx);
});
}).exceptionally(connectionException -> {
addressFuture.completeExceptionally(FutureUtil.unwrapCompletionException(connectionException));
return null;
});
return addressFuture;
}
private CompletableFuture<PartitionedTopicMetadata> getPartitionedTopicMetadata(InetSocketAddress socketAddress,
TopicName topicName, boolean metadataAutoCreationEnabled, boolean useFallbackForNonPIP344Brokers) {
long startTime = System.nanoTime();
CompletableFuture<PartitionedTopicMetadata> partitionFuture = new CompletableFuture<>();
client.getCnxPool().getConnection(socketAddress).thenAccept(clientCnx -> {
boolean finalAutoCreationEnabled = metadataAutoCreationEnabled;
if (!metadataAutoCreationEnabled && !clientCnx.isSupportsGetPartitionedMetadataWithoutAutoCreation()) {
if (useFallbackForNonPIP344Brokers) {
log.info("[{}] Using original behavior of getPartitionedTopicMetadata(topic) in "
+ "getPartitionedTopicMetadata(topic, false) "
+ "since the target broker does not support PIP-344 and fallback is enabled.", topicName);
finalAutoCreationEnabled = true;
} else {
partitionFuture.completeExceptionally(
new PulsarClientException.FeatureNotSupportedException("The feature of "
+ "getting partitions without auto-creation is not supported by the broker. "
+ "Please upgrade the broker to version that supports PIP-344 to resolve this "
+ "issue.",
SupportsGetPartitionedMetadataWithoutAutoCreation));
return;
}
}
long requestId = client.newRequestId();
ByteBuf request = Commands.newPartitionMetadataRequest(topicName.toString(), requestId,
finalAutoCreationEnabled);
clientCnx.newLookup(request, requestId).whenComplete((r, t) -> {
if (t != null) {
histoGetTopicMetadata.recordFailure(System.nanoTime() - startTime);
log.warn("[{}] failed to get Partitioned metadata : {}", topicName,
t.getMessage(), t);
partitionFuture.completeExceptionally(t);
} else {
try {
histoGetTopicMetadata.recordSuccess(System.nanoTime() - startTime);
partitionFuture.complete(new PartitionedTopicMetadata(r.partitions));
} catch (Exception e) {
partitionFuture.completeExceptionally(new PulsarClientException.LookupException(
format("Failed to parse partition-response redirect=%s, topic=%s, partitions with %s,"
+ " error message %s",
r.redirect, topicName, r.partitions,
e.getMessage())));
}
}
client.getCnxPool().releaseConnection(clientCnx);
});
}).exceptionally(connectionException -> {
partitionFuture.completeExceptionally(FutureUtil.unwrapCompletionException(connectionException));
return null;
});
return partitionFuture;
}
@Override
public CompletableFuture<Optional<SchemaInfo>> getSchema(TopicName topicName) {
return getSchema(topicName, null);
}
@Override
public CompletableFuture<Optional<SchemaInfo>> getSchema(TopicName topicName, byte[] version) {
long startTime = System.nanoTime();
CompletableFuture<Optional<SchemaInfo>> schemaFuture = new CompletableFuture<>();
if (version != null && version.length == 0) {
schemaFuture.completeExceptionally(new SchemaSerializationException("Empty schema version"));
return schemaFuture;
}
InetSocketAddress socketAddress = serviceNameResolver.resolveHost();
client.getCnxPool().getConnection(socketAddress).thenAccept(clientCnx -> {
long requestId = client.newRequestId();
ByteBuf request = Commands.newGetSchema(requestId, topicName.toString(),
Optional.ofNullable(BytesSchemaVersion.of(version)));
clientCnx.sendGetSchema(request, requestId).whenComplete((r, t) -> {
if (t != null) {
histoGetSchema.recordFailure(System.nanoTime() - startTime);
log.warn("[{}] failed to get schema : {}", topicName,
t.getMessage(), t);
schemaFuture.completeExceptionally(t);
} else {
histoGetSchema.recordSuccess(System.nanoTime() - startTime);
schemaFuture.complete(r);
}
client.getCnxPool().releaseConnection(clientCnx);
});
}).exceptionally(ex -> {
schemaFuture.completeExceptionally(FutureUtil.unwrapCompletionException(ex));
return null;
});
return schemaFuture;
}
public String getServiceUrl() {
return serviceNameResolver.getServiceUrl();
}
@Override
public InetSocketAddress resolveHost() {
return serviceNameResolver.resolveHost();
}
@Override
public CompletableFuture<GetTopicsResult> getTopicsUnderNamespace(NamespaceName namespace,
Mode mode,
String topicsPattern,
String topicsHash) {
CompletableFuture<GetTopicsResult> topicsFuture = new CompletableFuture<>();
AtomicLong opTimeoutMs = new AtomicLong(client.getConfiguration().getOperationTimeoutMs());
Backoff backoff = new BackoffBuilder()
.setInitialTime(100, TimeUnit.MILLISECONDS)
.setMandatoryStop(opTimeoutMs.get() * 2, TimeUnit.MILLISECONDS)
.setMax(1, TimeUnit.MINUTES)
.create();
getTopicsUnderNamespace(serviceNameResolver.resolveHost(), namespace, backoff, opTimeoutMs, topicsFuture, mode,
topicsPattern, topicsHash);
return topicsFuture;
}
private void getTopicsUnderNamespace(InetSocketAddress socketAddress,
NamespaceName namespace,
Backoff backoff,
AtomicLong remainingTime,
CompletableFuture<GetTopicsResult> getTopicsResultFuture,
Mode mode,
String topicsPattern,
String topicsHash) {
long startTime = System.nanoTime();
client.getCnxPool().getConnection(socketAddress).thenAccept(clientCnx -> {
long requestId = client.newRequestId();
ByteBuf request = Commands.newGetTopicsOfNamespaceRequest(
namespace.toString(), requestId, mode, topicsPattern, topicsHash);
clientCnx.newGetTopicsOfNamespace(request, requestId).whenComplete((r, t) -> {
if (t != null) {
histoListTopics.recordFailure(System.nanoTime() - startTime);
getTopicsResultFuture.completeExceptionally(t);
} else {
histoListTopics.recordSuccess(System.nanoTime() - startTime);
if (log.isDebugEnabled()) {
log.debug("[namespace: {}] Success get topics list in request: {}",
namespace, requestId);
}
getTopicsResultFuture.complete(r);
}
client.getCnxPool().releaseConnection(clientCnx);
});
}).exceptionally((e) -> {
long nextDelay = Math.min(backoff.next(), remainingTime.get());
if (nextDelay <= 0) {
getTopicsResultFuture.completeExceptionally(
new PulsarClientException.TimeoutException(
format("Could not get topics of namespace %s within configured timeout",
namespace.toString())));
return null;
}
((ScheduledExecutorService) executor).schedule(() -> {
log.warn("[namespace: {}] Could not get connection while getTopicsUnderNamespace -- Will try again in"
+ " {} ms", namespace, nextDelay);
remainingTime.addAndGet(-nextDelay);
getTopicsUnderNamespace(socketAddress, namespace, backoff, remainingTime, getTopicsResultFuture,
mode, topicsPattern, topicsHash);
}, nextDelay, TimeUnit.MILLISECONDS);
return null;
});
}
@Override
public void close() throws Exception {
// no-op
}
public static class LookupDataResult {
public final String brokerUrl;
public final String brokerUrlTls;
public final int partitions;
public final boolean authoritative;
public final boolean proxyThroughServiceUrl;
public final boolean redirect;
public LookupDataResult(CommandLookupTopicResponse result) {
this.brokerUrl = result.hasBrokerServiceUrl() ? result.getBrokerServiceUrl() : null;
this.brokerUrlTls = result.hasBrokerServiceUrlTls() ? result.getBrokerServiceUrlTls() : null;
this.authoritative = result.isAuthoritative();
this.redirect = result.hasResponse() && result.getResponse() == LookupType.Redirect;
this.proxyThroughServiceUrl = result.isProxyThroughServiceUrl();
this.partitions = -1;
}
public LookupDataResult(int partitions) {
super();
this.partitions = partitions;
this.brokerUrl = null;
this.brokerUrlTls = null;
this.authoritative = false;
this.proxyThroughServiceUrl = false;
this.redirect = false;
}
}
private static final Logger log = LoggerFactory.getLogger(BinaryProtoLookupService.class);
}
```
|
Lehigh was an unincorporated community in Barbour County, West Virginia, United States.
References
Unincorporated communities in West Virginia
Unincorporated communities in Barbour County, West Virginia
|
Song of the Crippled Bull is the debut EP by American progressive death metal band Black Crown Initiate. It was released independently on July 17, 2013, and quickly became a featured release in various underground publications. The EP was recorded by Carson Slovak at Atrium Audio in Lancaster, Pennsylvania. Slovak is also credited with designing the artwork.
In an interview with Terrorizer, guitarist Andy Thomas discussed the concept of the EP and some of the themes surrounding the music:
"Basically, there are a couple of themes running through the album. Microcosmic and macrocosmic, in their nature. The overarching apparent theme is based on Hindu texts. A four stage cycle in the universe that goes from the creation to destruction, and then recreation. And the last phase is called Caliuga. It’s a phase of complete depravity, everything is almost ruined. If I look around that’s what I see. It’s symbolized by iron, an extension of the Iron Age, and also by a one-legged bull. The first phase is known as the Golden Age, which you read about in all this different cultures, but that’s symbolized by a bull with all his legs. We’re now living in the age of the one-legged bull, which is what The Song Of The Crippled Bull is all about."
Track listing
Reception
"The band’s debut EP, Song of the Crippled Bull, is an epic four-part suite of progressive death metal sanctity that shouldn’t be ignored. The utter brutality matched with a sense of grandeur and melodic reprieve is immensely impressive."
- No Clean Singing
"A very good start for an up and coming metal band, and a promising release that shows what they are capable of becoming."
- Sputnik Music
"'Song of The Crippled Bull' is fantastic on almost every level, and the major complaint is that it’s a giant tease for their debut LP."
- Metal Underground
"Another new exemplar of the fast-growing modern prog-death scene – think mechanized hyperspeed technicality and shimmering chord flurries."
- Metalsucks
References
2013 albums
Black Crown Initiate albums
|
Hesperomannia (island-aster) is a genus of flowering plants in the family Asteraceae.
Hesperomannia is endemic to Hawaii and consists of four species:
Hesperomannia arborescens (Lanai hesperomannia)
Hesperomannia arbuscula (Maui hesperomannia)
Hesperomannia lydgatei (Kauai hesperomannia)
Hesperomannia swezeyi
Although traditionally classified in the tribe Mutisieae, molecular evidence shows that it belongs in the tribe Vernonieae, most closely related to the African Vernonia species.
References
Asteraceae genera
Endemic flora of Hawaii
Taxonomy articles created by Polbot
|
Anaeromicropila populeti is a bacterium of the family Lachnospiraceae.
References
Bacteria described in 1985
Lachnospiraceae
|
```c++
//==- SemaRISCVVectorLookup.cpp - Name Lookup for RISC-V Vector Intrinsic -==//
//
// See path_to_url for license information.
//
//===----------------------------------------------------------------------===//
//
// This file implements name lookup for RISC-V vector intrinsic.
//
//===----------------------------------------------------------------------===//
#include "clang/AST/ASTContext.h"
#include "clang/AST/Decl.h"
#include "clang/Basic/Builtins.h"
#include "clang/Basic/TargetInfo.h"
#include "clang/Lex/Preprocessor.h"
#include "clang/Sema/Lookup.h"
#include "clang/Sema/RISCVIntrinsicManager.h"
#include "clang/Sema/Sema.h"
#include "clang/Support/RISCVVIntrinsicUtils.h"
#include "llvm/ADT/SmallVector.h"
#include <optional>
#include <string>
#include <vector>
using namespace llvm;
using namespace clang;
using namespace clang::RISCV;
namespace {
// Function definition of a RVV intrinsic.
struct RVVIntrinsicDef {
/// Full function name with suffix, e.g. vadd_vv_i32m1.
std::string Name;
/// Overloaded function name, e.g. vadd.
std::string OverloadName;
/// Mapping to which clang built-in function, e.g. __builtin_rvv_vadd.
std::string BuiltinName;
/// Function signature, first element is return type.
RVVTypes Signature;
};
struct RVVOverloadIntrinsicDef {
// Indexes of RISCVIntrinsicManagerImpl::IntrinsicList.
SmallVector<size_t, 8> Indexes;
};
} // namespace
static const PrototypeDescriptor RVVSignatureTable[] = {
#define DECL_SIGNATURE_TABLE
#include "clang/Basic/riscv_vector_builtin_sema.inc"
#undef DECL_SIGNATURE_TABLE
};
static const RVVIntrinsicRecord RVVIntrinsicRecords[] = {
#define DECL_INTRINSIC_RECORDS
#include "clang/Basic/riscv_vector_builtin_sema.inc"
#undef DECL_INTRINSIC_RECORDS
};
// Get subsequence of signature table.
static ArrayRef<PrototypeDescriptor> ProtoSeq2ArrayRef(uint16_t Index,
uint8_t Length) {
return ArrayRef(&RVVSignatureTable[Index], Length);
}
static QualType RVVType2Qual(ASTContext &Context, const RVVType *Type) {
QualType QT;
switch (Type->getScalarType()) {
case ScalarTypeKind::Void:
QT = Context.VoidTy;
break;
case ScalarTypeKind::Size_t:
QT = Context.getSizeType();
break;
case ScalarTypeKind::Ptrdiff_t:
QT = Context.getPointerDiffType();
break;
case ScalarTypeKind::UnsignedLong:
QT = Context.UnsignedLongTy;
break;
case ScalarTypeKind::SignedLong:
QT = Context.LongTy;
break;
case ScalarTypeKind::Boolean:
QT = Context.BoolTy;
break;
case ScalarTypeKind::SignedInteger:
QT = Context.getIntTypeForBitwidth(Type->getElementBitwidth(), true);
break;
case ScalarTypeKind::UnsignedInteger:
QT = Context.getIntTypeForBitwidth(Type->getElementBitwidth(), false);
break;
case ScalarTypeKind::Float:
switch (Type->getElementBitwidth()) {
case 64:
QT = Context.DoubleTy;
break;
case 32:
QT = Context.FloatTy;
break;
case 16:
QT = Context.Float16Ty;
break;
default:
llvm_unreachable("Unsupported floating point width.");
}
break;
case Invalid:
llvm_unreachable("Unhandled type.");
}
if (Type->isVector())
QT = Context.getScalableVectorType(QT, *Type->getScale());
if (Type->isConstant())
QT = Context.getConstType(QT);
// Transform the type to a pointer as the last step, if necessary.
if (Type->isPointer())
QT = Context.getPointerType(QT);
return QT;
}
namespace {
class RISCVIntrinsicManagerImpl : public sema::RISCVIntrinsicManager {
private:
Sema &S;
ASTContext &Context;
RVVTypeCache TypeCache;
// List of all RVV intrinsic.
std::vector<RVVIntrinsicDef> IntrinsicList;
// Mapping function name to index of IntrinsicList.
StringMap<size_t> Intrinsics;
// Mapping function name to RVVOverloadIntrinsicDef.
StringMap<RVVOverloadIntrinsicDef> OverloadIntrinsics;
// Create IntrinsicList
void InitIntrinsicList();
// Create RVVIntrinsicDef.
void InitRVVIntrinsic(const RVVIntrinsicRecord &Record, StringRef SuffixStr,
StringRef OverloadedSuffixStr, bool IsMask,
RVVTypes &Types, bool HasPolicy, Policy PolicyAttrs);
// Create FunctionDecl for a vector intrinsic.
void CreateRVVIntrinsicDecl(LookupResult &LR, IdentifierInfo *II,
Preprocessor &PP, unsigned Index,
bool IsOverload);
public:
RISCVIntrinsicManagerImpl(clang::Sema &S) : S(S), Context(S.Context) {
InitIntrinsicList();
}
// Create RISC-V vector intrinsic and insert into symbol table if found, and
// return true, otherwise return false.
bool CreateIntrinsicIfFound(LookupResult &LR, IdentifierInfo *II,
Preprocessor &PP) override;
};
} // namespace
void RISCVIntrinsicManagerImpl::InitIntrinsicList() {
const TargetInfo &TI = Context.getTargetInfo();
bool HasVectorFloat32 = TI.hasFeature("zve32f");
bool HasVectorFloat64 = TI.hasFeature("zve64d");
bool HasZvfh = TI.hasFeature("experimental-zvfh");
bool HasRV64 = TI.hasFeature("64bit");
bool HasFullMultiply = TI.hasFeature("v");
// Construction of RVVIntrinsicRecords need to sync with createRVVIntrinsics
// in RISCVVEmitter.cpp.
for (auto &Record : RVVIntrinsicRecords) {
// Create Intrinsics for each type and LMUL.
BasicType BaseType = BasicType::Unknown;
ArrayRef<PrototypeDescriptor> BasicProtoSeq =
ProtoSeq2ArrayRef(Record.PrototypeIndex, Record.PrototypeLength);
ArrayRef<PrototypeDescriptor> SuffixProto =
ProtoSeq2ArrayRef(Record.SuffixIndex, Record.SuffixLength);
ArrayRef<PrototypeDescriptor> OverloadedSuffixProto = ProtoSeq2ArrayRef(
Record.OverloadedSuffixIndex, Record.OverloadedSuffixSize);
PolicyScheme UnMaskedPolicyScheme =
static_cast<PolicyScheme>(Record.UnMaskedPolicyScheme);
PolicyScheme MaskedPolicyScheme =
static_cast<PolicyScheme>(Record.MaskedPolicyScheme);
const Policy DefaultPolicy;
llvm::SmallVector<PrototypeDescriptor> ProtoSeq =
RVVIntrinsic::computeBuiltinTypes(BasicProtoSeq, /*IsMasked=*/false,
/*HasMaskedOffOperand=*/false,
Record.HasVL, Record.NF,
UnMaskedPolicyScheme, DefaultPolicy);
llvm::SmallVector<PrototypeDescriptor> ProtoMaskSeq =
RVVIntrinsic::computeBuiltinTypes(
BasicProtoSeq, /*IsMasked=*/true, Record.HasMaskedOffOperand,
Record.HasVL, Record.NF, MaskedPolicyScheme, DefaultPolicy);
bool UnMaskedHasPolicy = UnMaskedPolicyScheme != PolicyScheme::SchemeNone;
bool MaskedHasPolicy = MaskedPolicyScheme != PolicyScheme::SchemeNone;
SmallVector<Policy> SupportedUnMaskedPolicies =
RVVIntrinsic::getSupportedUnMaskedPolicies();
SmallVector<Policy> SupportedMaskedPolicies =
RVVIntrinsic::getSupportedMaskedPolicies(Record.HasTailPolicy,
Record.HasMaskPolicy);
for (unsigned int TypeRangeMaskShift = 0;
TypeRangeMaskShift <= static_cast<unsigned int>(BasicType::MaxOffset);
++TypeRangeMaskShift) {
unsigned int BaseTypeI = 1 << TypeRangeMaskShift;
BaseType = static_cast<BasicType>(BaseTypeI);
if ((BaseTypeI & Record.TypeRangeMask) != BaseTypeI)
continue;
// Check requirement.
if (BaseType == BasicType::Float16 && !HasZvfh)
continue;
if (BaseType == BasicType::Float32 && !HasVectorFloat32)
continue;
if (BaseType == BasicType::Float64 && !HasVectorFloat64)
continue;
if (((Record.RequiredExtensions & RVV_REQ_RV64) == RVV_REQ_RV64) &&
!HasRV64)
continue;
if ((BaseType == BasicType::Int64) &&
((Record.RequiredExtensions & RVV_REQ_FullMultiply) ==
RVV_REQ_FullMultiply) &&
!HasFullMultiply)
continue;
// Expanded with different LMUL.
for (int Log2LMUL = -3; Log2LMUL <= 3; Log2LMUL++) {
if (!(Record.Log2LMULMask & (1 << (Log2LMUL + 3))))
continue;
std::optional<RVVTypes> Types =
TypeCache.computeTypes(BaseType, Log2LMUL, Record.NF, ProtoSeq);
// Skip creating a new intrinsic if any of the types are illegal.
if (!Types.has_value())
continue;
std::string SuffixStr = RVVIntrinsic::getSuffixStr(
TypeCache, BaseType, Log2LMUL, SuffixProto);
std::string OverloadedSuffixStr = RVVIntrinsic::getSuffixStr(
TypeCache, BaseType, Log2LMUL, OverloadedSuffixProto);
// Create non-masked intrinsic.
InitRVVIntrinsic(Record, SuffixStr, OverloadedSuffixStr, /*IsMasked=*/false,
*Types, UnMaskedHasPolicy, DefaultPolicy);
// Create non-masked policy intrinsic.
if (Record.UnMaskedPolicyScheme != PolicyScheme::SchemeNone) {
for (auto P : SupportedUnMaskedPolicies) {
llvm::SmallVector<PrototypeDescriptor> PolicyPrototype =
RVVIntrinsic::computeBuiltinTypes(
BasicProtoSeq, /*IsMasked=*/false,
/*HasMaskedOffOperand=*/false, Record.HasVL, Record.NF,
UnMaskedPolicyScheme, P);
std::optional<RVVTypes> PolicyTypes = TypeCache.computeTypes(
BaseType, Log2LMUL, Record.NF, PolicyPrototype);
InitRVVIntrinsic(Record, SuffixStr, OverloadedSuffixStr,
/*IsMasked=*/false, *PolicyTypes, UnMaskedHasPolicy, P);
}
}
if (!Record.HasMasked)
continue;
// Create masked intrinsic.
std::optional<RVVTypes> MaskTypes =
TypeCache.computeTypes(BaseType, Log2LMUL, Record.NF, ProtoMaskSeq);
InitRVVIntrinsic(Record, SuffixStr, OverloadedSuffixStr, /*IsMasked=*/true,
*MaskTypes, MaskedHasPolicy, DefaultPolicy);
if (Record.MaskedPolicyScheme == PolicyScheme::SchemeNone)
continue;
// Create masked policy intrinsic.
for (auto P : SupportedMaskedPolicies) {
llvm::SmallVector<PrototypeDescriptor> PolicyPrototype =
RVVIntrinsic::computeBuiltinTypes(
BasicProtoSeq, /*IsMasked=*/true, Record.HasMaskedOffOperand,
Record.HasVL, Record.NF, MaskedPolicyScheme, P);
std::optional<RVVTypes> PolicyTypes = TypeCache.computeTypes(
BaseType, Log2LMUL, Record.NF, PolicyPrototype);
InitRVVIntrinsic(Record, SuffixStr, OverloadedSuffixStr,
/*IsMasked=*/true, *PolicyTypes, MaskedHasPolicy, P);
}
} // End for different LMUL
} // End for different TypeRange
}
}
// Compute the name and signature for an intrinsic with concrete types.
void RISCVIntrinsicManagerImpl::InitRVVIntrinsic(
const RVVIntrinsicRecord &Record, StringRef SuffixStr,
StringRef OverloadedSuffixStr, bool IsMasked, RVVTypes &Signature,
bool HasPolicy, Policy PolicyAttrs) {
// Function name, e.g. vadd_vv_i32m1.
std::string Name = Record.Name;
if (!SuffixStr.empty())
Name += "_" + SuffixStr.str();
// Overloaded function name, e.g. vadd.
std::string OverloadedName;
if (!Record.OverloadedName)
OverloadedName = StringRef(Record.Name).split("_").first.str();
else
OverloadedName = Record.OverloadedName;
if (!OverloadedSuffixStr.empty())
OverloadedName += "_" + OverloadedSuffixStr.str();
// clang built-in function name, e.g. __builtin_rvv_vadd.
std::string BuiltinName = "__builtin_rvv_" + std::string(Record.Name);
RVVIntrinsic::updateNamesAndPolicy(IsMasked, HasPolicy, Name, BuiltinName,
OverloadedName, PolicyAttrs);
// Put into IntrinsicList.
size_t Index = IntrinsicList.size();
IntrinsicList.push_back({Name, OverloadedName, BuiltinName, Signature});
// Record the mapping from name to index.
Intrinsics.insert({Name, Index});
// Get the RVVOverloadIntrinsicDef.
RVVOverloadIntrinsicDef &OverloadIntrinsicDef =
OverloadIntrinsics[OverloadedName];
// and append the new index for this overload.
OverloadIntrinsicDef.Indexes.push_back(Index);
}
void RISCVIntrinsicManagerImpl::CreateRVVIntrinsicDecl(LookupResult &LR,
IdentifierInfo *II,
Preprocessor &PP,
unsigned Index,
bool IsOverload) {
ASTContext &Context = S.Context;
RVVIntrinsicDef &IDef = IntrinsicList[Index];
RVVTypes Sigs = IDef.Signature;
size_t SigLength = Sigs.size();
RVVType *ReturnType = Sigs[0];
QualType RetType = RVVType2Qual(Context, ReturnType);
SmallVector<QualType, 8> ArgTypes;
QualType BuiltinFuncType;
// Skip return type, and convert RVVType to QualType for arguments.
for (size_t i = 1; i < SigLength; ++i)
ArgTypes.push_back(RVVType2Qual(Context, Sigs[i]));
FunctionProtoType::ExtProtoInfo PI(
Context.getDefaultCallingConvention(false, false, true));
PI.Variadic = false;
SourceLocation Loc = LR.getNameLoc();
BuiltinFuncType = Context.getFunctionType(RetType, ArgTypes, PI);
DeclContext *Parent = Context.getTranslationUnitDecl();
FunctionDecl *RVVIntrinsicDecl = FunctionDecl::Create(
Context, Parent, Loc, Loc, II, BuiltinFuncType, /*TInfo=*/nullptr,
SC_Extern, S.getCurFPFeatures().isFPConstrained(),
/*isInlineSpecified*/ false,
/*hasWrittenPrototype*/ true);
// Create Decl objects for each parameter, adding them to the
// FunctionDecl.
const auto *FP = cast<FunctionProtoType>(BuiltinFuncType);
SmallVector<ParmVarDecl *, 8> ParmList;
for (unsigned IParm = 0, E = FP->getNumParams(); IParm != E; ++IParm) {
ParmVarDecl *Parm =
ParmVarDecl::Create(Context, RVVIntrinsicDecl, Loc, Loc, nullptr,
FP->getParamType(IParm), nullptr, SC_None, nullptr);
Parm->setScopeInfo(0, IParm);
ParmList.push_back(Parm);
}
RVVIntrinsicDecl->setParams(ParmList);
// Add function attributes.
if (IsOverload)
RVVIntrinsicDecl->addAttr(OverloadableAttr::CreateImplicit(Context));
// Setup alias to __builtin_rvv_*
IdentifierInfo &IntrinsicII = PP.getIdentifierTable().get(IDef.BuiltinName);
RVVIntrinsicDecl->addAttr(
BuiltinAliasAttr::CreateImplicit(S.Context, &IntrinsicII));
// Add to symbol table.
LR.addDecl(RVVIntrinsicDecl);
}
bool RISCVIntrinsicManagerImpl::CreateIntrinsicIfFound(LookupResult &LR,
IdentifierInfo *II,
Preprocessor &PP) {
StringRef Name = II->getName();
// Lookup the function name from the overload intrinsics first.
auto OvIItr = OverloadIntrinsics.find(Name);
if (OvIItr != OverloadIntrinsics.end()) {
const RVVOverloadIntrinsicDef &OvIntrinsicDef = OvIItr->second;
for (auto Index : OvIntrinsicDef.Indexes)
CreateRVVIntrinsicDecl(LR, II, PP, Index,
/*IsOverload*/ true);
// If we added overloads, we need to resolve the lookup result.
LR.resolveKind();
return true;
}
// Lookup the function name from the intrinsics.
auto Itr = Intrinsics.find(Name);
if (Itr != Intrinsics.end()) {
CreateRVVIntrinsicDecl(LR, II, PP, Itr->second,
/*IsOverload*/ false);
return true;
}
// It's not an RVV intrinsic.
return false;
}
namespace clang {
std::unique_ptr<clang::sema::RISCVIntrinsicManager>
CreateRISCVIntrinsicManager(Sema &S) {
return std::make_unique<RISCVIntrinsicManagerImpl>(S);
}
} // namespace clang
```
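The lookup-then-create flow in `CreateIntrinsicIfFound` above can be sketched independently of Clang: intrinsic names are registered in a name-to-index map up front, and a full definition is materialized only on first lookup. The names below (`LazyIntrinsicManager`, `IntrinsicDef`) are illustrative stand-ins, not Clang APIs.

```c++
#include <cassert>
#include <map>
#include <string>
#include <vector>

// Stand-in for a declared intrinsic: just its recorded names here.
struct IntrinsicDef {
  std::string Name;
  std::string BuiltinName;
};

// Minimal sketch of the lazy-creation pattern: InitIntrinsicList populates
// an index, and createIfFound materializes a definition on first lookup.
class LazyIntrinsicManager {
  std::vector<IntrinsicDef> IntrinsicList;
  std::map<std::string, size_t> Intrinsics; // name -> index into IntrinsicList
  std::vector<std::string> Created;         // stands in for the symbol table

public:
  void add(const std::string &Name) {
    size_t Index = IntrinsicList.size();
    IntrinsicList.push_back({Name, "__builtin_rvv_" + Name});
    Intrinsics.insert({Name, Index});
  }
  // Returns true and "declares" the intrinsic if the name is known,
  // mirroring CreateIntrinsicIfFound's contract.
  bool createIfFound(const std::string &Name) {
    auto It = Intrinsics.find(Name);
    if (It == Intrinsics.end())
      return false; // not an intrinsic we manage
    Created.push_back(IntrinsicList[It->second].BuiltinName);
    return true;
  }
  size_t createdCount() const { return Created.size(); }
};
```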
|
```javascript
// Generated by ReScript, PLEASE EDIT WITH CARE
'use strict';
function treeHeight(n) {
if (n !== undefined) {
return n.height;
} else {
return 0;
}
}
function copy(n) {
if (n === undefined) {
return n;
}
let v = n.value;
let h = n.height;
let l = n.left;
let r = n.right;
return {
value: v,
height: h,
left: copy(l),
right: copy(r)
};
}
exports.treeHeight = treeHeight;
exports.copy = copy;
/* No side effect */
```
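The generated JavaScript above encodes an absent subtree as `undefined`, maps it to height 0, and deep-copies nodes recursively. The same pattern can be sketched in C++ with a null `unique_ptr` playing the role of `undefined`; the `Node`, `treeHeight`, and `copy` names below are illustrative, not part of any library.

```c++
#include <cassert>
#include <memory>
#include <utility>

// An absent subtree is a null pointer (undefined in the JS above).
struct Node {
  int value = 0;
  int height = 0;
  std::unique_ptr<Node> left, right;
};

// Height of an absent tree is 0, matching treeHeight in the JS.
int treeHeight(const Node *n) { return n ? n->height : 0; }

// Recursive deep copy: the clone shares no nodes with the original.
std::unique_ptr<Node> copy(const Node *n) {
  if (!n)
    return nullptr;
  auto out = std::make_unique<Node>();
  out->value = n->value;
  out->height = n->height;
  out->left = copy(n->left.get());
  out->right = copy(n->right.get());
  return out;
}
```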
|
```scala
/*
*/
package akka.stream.alpakka.jms.scaladsl
import javax.jms.{Connection, ConnectionFactory}
import org.apache.activemq.ActiveMQConnection
/**
* A naive connection factory that caches a single connection; not thread-safe.
*/
class CachedConnectionFactory(connFactory: ConnectionFactory) extends ConnectionFactory {
var cachedConnection: ActiveMQConnection = null
override def createConnection(): Connection = {
if (cachedConnection == null) {
cachedConnection = connFactory.createConnection().asInstanceOf[ActiveMQConnection]
}
cachedConnection
}
// Ignores the credentials; returns the cached connection, which is null if
// createConnection() has not been called yet.
override def createConnection(s: String, s1: String): Connection = cachedConnection
// added in JMS 2.0
// see path_to_url
def createContext(x$1: Int): javax.jms.JMSContext = ???
def createContext(x$1: String, x$2: String, x$3: Int): javax.jms.JMSContext = ???
def createContext(x$1: String, x$2: String): javax.jms.JMSContext = ???
def createContext(): javax.jms.JMSContext = ???
}
```
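The Scala factory above memoizes a single connection but is explicitly not thread-safe. The same pattern can be made safe with `std::call_once`, which guarantees the delegate factory runs exactly once even under concurrent callers. The types below (`Connection`, `ConnectionFactory`) are illustrative stand-ins, not JMS APIs.

```c++
#include <cassert>
#include <memory>
#include <mutex>

// Stand-in connection type; not a JMS Connection.
struct Connection {
  int id = 0;
};

// Stand-in delegate factory that counts how often it is invoked.
struct ConnectionFactory {
  int next = 0;
  std::shared_ptr<Connection> createConnection() {
    auto c = std::make_shared<Connection>();
    c->id = next++;
    return c;
  }
};

// Thread-safe memoizing wrapper: std::call_once ensures the delegate's
// createConnection() is called at most once.
class CachedConnectionFactory {
  ConnectionFactory &delegate;
  std::once_flag once;
  std::shared_ptr<Connection> cached;

public:
  explicit CachedConnectionFactory(ConnectionFactory &f) : delegate(f) {}
  std::shared_ptr<Connection> createConnection() {
    std::call_once(once, [&] { cached = delegate.createConnection(); });
    return cached;
  }
};
```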
|
Bare Skin is a bestselling novel by Zlatko Topčić, published in 2004.
Topčić also wrote the drama of the same name (2007) and the screenplay for the multiple award-winning feature film The Abandoned (2010; working title: Bare Skin), about the same theme.
References
2004 novels
Fiction set in the 21st century
Culture of Bosnia and Herzegovina
Bosnia and Herzegovina literature
Novels set in Bosnia and Herzegovina
Novels about rape
|
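The `polygon_90_set_data` class below defers its expensive maintenance behind `dirty_`/`unsorted_` flags: inserts merely mark the set dirty, and sorting plus boolean merging happen lazily when a query calls `clean()`. A minimal sketch of that lazy-maintenance pattern, using an illustrative `LazySet` that is not part of Boost.Polygon:

```c++
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <vector>

// Inserts are O(1) and only set flags; queries pay the sort/merge cost once.
class LazySet {
  mutable std::vector<int> data_;
  mutable bool dirty_ = false;    // duplicates may be present
  mutable bool unsorted_ = false; // order not guaranteed

public:
  void insert(int v) {
    data_.push_back(v);
    dirty_ = unsorted_ = true;
  }
  // Sort only if an insert invalidated the ordering.
  void sort() const {
    if (unsorted_) {
      std::sort(data_.begin(), data_.end());
      unsorted_ = false;
    }
  }
  // clean() plays the role of the boolean-OR merge: sort, then deduplicate.
  void clean() const {
    sort();
    if (dirty_) {
      data_.erase(std::unique(data_.begin(), data_.end()), data_.end());
      dirty_ = false;
    }
  }
  std::size_t size() const {
    clean();
    return data_.size();
  }
};
```

As in `polygon_90_set_data`, the flags and data are `mutable` so that logically-const queries can still normalize the representation on demand.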
```c++
/*
Distributed under the Boost Software License, Version 1.0.
(See accompanying file LICENSE_1_0.txt or copy at
path_to_url)
*/
#ifndef BOOST_POLYGON_POLYGON_90_SET_DATA_HPP
#define BOOST_POLYGON_POLYGON_90_SET_DATA_HPP
#include "isotropy.hpp"
#include "point_concept.hpp"
#include "transform.hpp"
#include "interval_concept.hpp"
#include "rectangle_concept.hpp"
#include "segment_concept.hpp"
#include "detail/iterator_points_to_compact.hpp"
#include "detail/iterator_compact_to_points.hpp"
#include "polygon_traits.hpp"
//manhattan boolean algorithms
#include "detail/boolean_op.hpp"
#include "detail/polygon_formation.hpp"
#include "detail/rectangle_formation.hpp"
#include "detail/max_cover.hpp"
#include "detail/property_merge.hpp"
#include "detail/polygon_90_touch.hpp"
#include "detail/iterator_geometry_to_set.hpp"
namespace boost { namespace polygon{
template <typename ltype, typename rtype, typename op_type>
class polygon_90_set_view;
template <typename T>
class polygon_90_set_data {
public:
typedef T coordinate_type;
typedef std::vector<std::pair<coordinate_type, std::pair<coordinate_type, int> > > value_type;
typedef typename std::vector<std::pair<coordinate_type, std::pair<coordinate_type, int> > >::const_iterator iterator_type;
typedef polygon_90_set_data operator_arg_type;
// default constructor
inline polygon_90_set_data() : orient_(HORIZONTAL), data_(), dirty_(false), unsorted_(false) {}
// constructor
inline polygon_90_set_data(orientation_2d orient) : orient_(orient), data_(), dirty_(false), unsorted_(false) {}
// constructor from an iterator pair over vertex data
template <typename iT>
inline polygon_90_set_data(orientation_2d, iT input_begin, iT input_end) :
orient_(HORIZONTAL), data_(), dirty_(false), unsorted_(false) {
dirty_ = true;
unsorted_ = true;
for( ; input_begin != input_end; ++input_begin) { insert(*input_begin); }
}
// copy constructor
inline polygon_90_set_data(const polygon_90_set_data& that) :
orient_(that.orient_), data_(that.data_), dirty_(that.dirty_), unsorted_(that.unsorted_) {}
template <typename ltype, typename rtype, typename op_type>
inline polygon_90_set_data(const polygon_90_set_view<ltype, rtype, op_type>& that);
// copy with orientation change constructor
inline polygon_90_set_data(orientation_2d orient, const polygon_90_set_data& that) :
orient_(orient), data_(), dirty_(false), unsorted_(false) {
insert(that, false, that.orient_);
}
// destructor
inline ~polygon_90_set_data() {}
// assignment operator
inline polygon_90_set_data& operator=(const polygon_90_set_data& that) {
if(this == &that) return *this;
orient_ = that.orient_;
data_ = that.data_;
dirty_ = that.dirty_;
unsorted_ = that.unsorted_;
return *this;
}
template <typename ltype, typename rtype, typename op_type>
inline polygon_90_set_data& operator=(const polygon_90_set_view<ltype, rtype, op_type>& that);
template <typename geometry_object>
inline polygon_90_set_data& operator=(const geometry_object& geometry) {
data_.clear();
insert(geometry);
return *this;
}
// insert iterator range
inline void insert(iterator_type input_begin, iterator_type input_end, orientation_2d orient = HORIZONTAL) {
if(input_begin == input_end || (!data_.empty() && &(*input_begin) == &(*(data_.begin())))) return;
dirty_ = true;
unsorted_ = true;
if(orient == orient_)
data_.insert(data_.end(), input_begin, input_end);
else {
for( ; input_begin != input_end; ++input_begin) {
insert(*input_begin, false, orient);
}
}
}
// insert iterator range
template <typename iT>
inline void insert(iT input_begin, iT input_end, orientation_2d orient = HORIZONTAL) {
if(input_begin == input_end) return;
dirty_ = true;
unsorted_ = true;
for( ; input_begin != input_end; ++input_begin) {
insert(*input_begin, false, orient);
}
}
inline void insert(const polygon_90_set_data& polygon_set) {
insert(polygon_set.begin(), polygon_set.end(), polygon_set.orient());
}
inline void insert(const std::pair<std::pair<point_data<coordinate_type>, point_data<coordinate_type> >, int>& edge, bool is_hole = false,
orientation_2d = HORIZONTAL) {
std::pair<coordinate_type, std::pair<coordinate_type, int> > vertex;
vertex.first = edge.first.first.x();
vertex.second.first = edge.first.first.y();
vertex.second.second = edge.second * (is_hole ? -1 : 1);
insert(vertex, false, VERTICAL);
vertex.first = edge.first.second.x();
vertex.second.first = edge.first.second.y();
vertex.second.second *= -1;
insert(vertex, false, VERTICAL);
}
template <typename geometry_type>
inline void insert(const geometry_type& geometry_object, bool is_hole = false, orientation_2d = HORIZONTAL) {
iterator_geometry_to_set<typename geometry_concept<geometry_type>::type, geometry_type>
begin_input(geometry_object, LOW, orient_, is_hole), end_input(geometry_object, HIGH, orient_, is_hole);
insert(begin_input, end_input, orient_);
}
inline void insert(const std::pair<coordinate_type, std::pair<coordinate_type, int> >& vertex, bool is_hole = false,
orientation_2d orient = HORIZONTAL) {
data_.push_back(vertex);
if(orient != orient_) std::swap(data_.back().first, data_.back().second.first);
if(is_hole) data_.back().second.second *= -1;
dirty_ = true;
unsorted_ = true;
}
inline void insert(coordinate_type major_coordinate, const std::pair<interval_data<coordinate_type>, int>& edge) {
std::pair<coordinate_type, std::pair<coordinate_type, int> > vertex;
vertex.first = major_coordinate;
vertex.second.first = edge.first.get(LOW);
vertex.second.second = edge.second;
insert(vertex, false, orient_);
vertex.second.first = edge.first.get(HIGH);
vertex.second.second *= -1;
insert(vertex, false, orient_);
}
template <typename output_container>
inline void get(output_container& output) const {
get_dispatch(output, typename geometry_concept<typename output_container::value_type>::type());
}
template <typename output_container>
inline void get(output_container& output, size_t vthreshold) const {
get_dispatch(output, typename geometry_concept<typename output_container::value_type>::type(), vthreshold);
}
template <typename output_container>
inline void get_polygons(output_container& output) const {
get_dispatch(output, polygon_90_concept());
}
template <typename output_container>
inline void get_rectangles(output_container& output) const {
clean();
form_rectangles(output, data_.begin(), data_.end(), orient_, rectangle_concept());
}
template <typename output_container>
inline void get_rectangles(output_container& output, orientation_2d slicing_orientation) const {
if(slicing_orientation == orient_) {
get_rectangles(output);
} else {
polygon_90_set_data<coordinate_type> ps(*this);
ps.transform(axis_transformation(axis_transformation::SWAP_XY));
output_container result;
ps.get_rectangles(result);
for(typename output_container::iterator itr = result.begin(); itr != result.end(); ++itr) {
::boost::polygon::transform(*itr, axis_transformation(axis_transformation::SWAP_XY));
}
output.insert(output.end(), result.begin(), result.end());
}
}
// equivalence operator
inline bool operator==(const polygon_90_set_data& p) const {
if(orient_ == p.orient()) {
clean();
p.clean();
return data_ == p.data_;
} else {
return false;
}
}
// inequivalence operator
inline bool operator!=(const polygon_90_set_data& p) const {
return !((*this) == p);
}
// get iterator to begin vertex data
inline iterator_type begin() const {
return data_.begin();
}
// get iterator to end vertex data
inline iterator_type end() const {
return data_.end();
}
const value_type& value() const {
return data_;
}
// clear the contents of the polygon_90_set_data
inline void clear() { data_.clear(); dirty_ = unsorted_ = false; }
// find out if Polygon set is empty
inline bool empty() const { clean(); return data_.empty(); }
// get the Polygon set size in vertices
inline std::size_t size() const { clean(); return data_.size(); }
// get the current Polygon set capacity in vertices
inline std::size_t capacity() const { return data_.capacity(); }
// reserve size of polygon set in vertices
inline void reserve(std::size_t size) { return data_.reserve(size); }
// find out if Polygon set is sorted
inline bool sorted() const { return !unsorted_; }
// find out if Polygon set is clean
inline bool dirty() const { return dirty_; }
// get the scanline orientation of the polygon set
inline orientation_2d orient() const { return orient_; }
// Start BM
// The problem: given two polygon sets with different scanline orientations,
// the orientation of one must be changed to match the other; otherwise the
// resulting boolean operation produces spurious results.
// First attempt: copy the polygon data from one set into a new set with the
// corrected orientation, using the copy constructor that takes an
// orientation (see above in this file) --> that copy constructor throws.
// Second attempt (below): also fails to produce the desired results when the
// test case is run.
// The puzzling part: with this whole section commented out, all of the
// operations these operators (^=, -=, &=) perform still work. So why are
// they needed? Hence this section remains commented out.
// End BM
// polygon_90_set_data<coordinate_type>& operator-=(const polygon_90_set_data& that) {
// sort();
// that.sort();
// value_type data;
// std::swap(data, data_);
// applyBooleanBinaryOp(data.begin(), data.end(),
// that.begin(), that.end(), boolean_op::BinaryCount<boolean_op::BinaryNot>());
// return *this;
// }
// polygon_90_set_data<coordinate_type>& operator^=(const polygon_90_set_data& that) {
// sort();
// that.sort();
// value_type data;
// std::swap(data, data_);
// applyBooleanBinaryOp(data.begin(), data.end(),
// that.begin(), that.end(), boolean_op::BinaryCount<boolean_op::BinaryXor>());
// return *this;
// }
// polygon_90_set_data<coordinate_type>& operator&=(const polygon_90_set_data& that) {
// sort();
// that.sort();
// value_type data;
// std::swap(data, data_);
// applyBooleanBinaryOp(data.begin(), data.end(),
// that.begin(), that.end(), boolean_op::BinaryCount<boolean_op::BinaryAnd>());
// return *this;
// }
// polygon_90_set_data<coordinate_type>& operator|=(const polygon_90_set_data& that) {
// insert(that);
// return *this;
// }
void clean() const {
sort();
if(dirty_) {
boolean_op::default_arg_workaround<int>::applyBooleanOr(data_);
dirty_ = false;
}
}
void sort() const {
if(unsorted_) {
polygon_sort(data_.begin(), data_.end());
unsorted_ = false;
}
}
template <typename input_iterator_type>
void set(input_iterator_type input_begin, input_iterator_type input_end, orientation_2d orient) {
data_.clear();
reserve(std::distance(input_begin, input_end));
data_.insert(data_.end(), input_begin, input_end);
orient_ = orient;
dirty_ = true;
unsorted_ = true;
}
void set(const value_type& value, orientation_2d orient) {
data_ = value;
orient_ = orient;
dirty_ = true;
unsorted_ = true;
}
//extents
template <typename rectangle_type>
bool
extents(rectangle_type& extents_rectangle) const {
clean();
if(data_.empty()) return false;
if(orient_ == HORIZONTAL)
set_points(extents_rectangle, point_data<coordinate_type>(data_[0].second.first, data_[0].first),
point_data<coordinate_type>(data_[data_.size() - 1].second.first, data_[data_.size() - 1].first));
else
set_points(extents_rectangle, point_data<coordinate_type>(data_[0].first, data_[0].second.first),
point_data<coordinate_type>(data_[data_.size() - 1].first, data_[data_.size() - 1].second.first));
for(std::size_t i = 1; i < data_.size() - 1; ++i) {
if(orient_ == HORIZONTAL)
encompass(extents_rectangle, point_data<coordinate_type>(data_[i].second.first, data_[i].first));
else
encompass(extents_rectangle, point_data<coordinate_type>(data_[i].first, data_[i].second.first));
}
return true;
}
polygon_90_set_data&
bloat2(typename coordinate_traits<coordinate_type>::unsigned_area_type west_bloating,
typename coordinate_traits<coordinate_type>::unsigned_area_type east_bloating,
typename coordinate_traits<coordinate_type>::unsigned_area_type south_bloating,
typename coordinate_traits<coordinate_type>::unsigned_area_type north_bloating) {
std::vector<rectangle_data<coordinate_type> > rects;
clean();
rects.reserve(data_.size() / 2);
get(rects);
rectangle_data<coordinate_type> convolutionRectangle(interval_data<coordinate_type>(-((coordinate_type)west_bloating),
(coordinate_type)east_bloating),
interval_data<coordinate_type>(-((coordinate_type)south_bloating),
(coordinate_type)north_bloating));
for(typename std::vector<rectangle_data<coordinate_type> >::iterator itr = rects.begin();
itr != rects.end(); ++itr) {
convolve(*itr, convolutionRectangle);
}
clear();
insert(rects.begin(), rects.end());
return *this;
}
static void modify_pt(point_data<coordinate_type>& pt, const point_data<coordinate_type>& prev_pt,
const point_data<coordinate_type>& current_pt, const point_data<coordinate_type>& next_pt,
coordinate_type west_bloating,
coordinate_type east_bloating,
coordinate_type south_bloating,
coordinate_type north_bloating) {
bool pxl = prev_pt.x() < current_pt.x();
bool pyl = prev_pt.y() < current_pt.y();
bool nxl = next_pt.x() < current_pt.x();
bool nyl = next_pt.y() < current_pt.y();
bool pxg = prev_pt.x() > current_pt.x();
bool pyg = prev_pt.y() > current_pt.y();
bool nxg = next_pt.x() > current_pt.x();
bool nyg = next_pt.y() > current_pt.y();
// Exactly two of these eight if statements will execute: prev and next each
// differ from current in exactly one coordinate.
if(pxl)
pt.y(current_pt.y() - south_bloating);
if(pxg)
pt.y(current_pt.y() + north_bloating);
if(nxl)
pt.y(current_pt.y() + north_bloating);
if(nxg)
pt.y(current_pt.y() - south_bloating);
if(pyl)
pt.x(current_pt.x() + east_bloating);
if(pyg)
pt.x(current_pt.x() - west_bloating);
if(nyl)
pt.x(current_pt.x() - west_bloating);
if(nyg)
pt.x(current_pt.x() + east_bloating);
}
static void resize_poly_up(std::vector<point_data<coordinate_type> >& poly,
coordinate_type west_bloating,
coordinate_type east_bloating,
coordinate_type south_bloating,
coordinate_type north_bloating) {
point_data<coordinate_type> first_pt = poly[0];
point_data<coordinate_type> second_pt = poly[1];
point_data<coordinate_type> prev_pt = poly[0];
point_data<coordinate_type> current_pt = poly[1];
for(std::size_t i = 2; i < poly.size(); ++i) {
point_data<coordinate_type> next_pt = poly[i];
modify_pt(poly[i-1], prev_pt, current_pt, next_pt, west_bloating, east_bloating, south_bloating, north_bloating);
prev_pt = current_pt;
current_pt = next_pt;
}
point_data<coordinate_type> next_pt = first_pt;
modify_pt(poly.back(), prev_pt, current_pt, next_pt, west_bloating, east_bloating, south_bloating, north_bloating);
prev_pt = current_pt;
current_pt = next_pt;
next_pt = second_pt;
modify_pt(poly[0], prev_pt, current_pt, next_pt, west_bloating, east_bloating, south_bloating, north_bloating);
remove_colinear_pts(poly);
}
static bool resize_poly_down(std::vector<point_data<coordinate_type> >& poly,
coordinate_type west_shrinking,
coordinate_type east_shrinking,
coordinate_type south_shrinking,
coordinate_type north_shrinking) {
rectangle_data<coordinate_type> extents_rectangle;
set_points(extents_rectangle, poly[0], poly[0]);
point_data<coordinate_type> first_pt = poly[0];
point_data<coordinate_type> second_pt = poly[1];
point_data<coordinate_type> prev_pt = poly[0];
point_data<coordinate_type> current_pt = poly[1];
encompass(extents_rectangle, current_pt);
for(std::size_t i = 2; i < poly.size(); ++i) {
point_data<coordinate_type> next_pt = poly[i];
encompass(extents_rectangle, next_pt);
modify_pt(poly[i-1], prev_pt, current_pt, next_pt, west_shrinking, east_shrinking, south_shrinking, north_shrinking);
prev_pt = current_pt;
current_pt = next_pt;
}
if(delta(extents_rectangle, HORIZONTAL) < std::abs(west_shrinking + east_shrinking))
return false;
if(delta(extents_rectangle, VERTICAL) < std::abs(north_shrinking + south_shrinking))
return false;
point_data<coordinate_type> next_pt = first_pt;
modify_pt(poly.back(), prev_pt, current_pt, next_pt, west_shrinking, east_shrinking, south_shrinking, north_shrinking);
prev_pt = current_pt;
current_pt = next_pt;
next_pt = second_pt;
modify_pt(poly[0], prev_pt, current_pt, next_pt, west_shrinking, east_shrinking, south_shrinking, north_shrinking);
return remove_colinear_pts(poly);
}
static bool remove_colinear_pts(std::vector<point_data<coordinate_type> >& poly) {
bool found_colinear = true;
while(found_colinear && poly.size() >= 4) {
found_colinear = false;
typename std::vector<point_data<coordinate_type> >::iterator itr = poly.begin();
itr += poly.size() - 1; //get last element position
typename std::vector<point_data<coordinate_type> >::iterator itr2 = poly.begin();
typename std::vector<point_data<coordinate_type> >::iterator itr3 = itr2;
++itr3;
std::size_t count = 0;
for( ; itr3 < poly.end(); ++itr3) {
if(((*itr).x() == (*itr2).x() && (*itr).x() == (*itr3).x()) ||
((*itr).y() == (*itr2).y() && (*itr).y() == (*itr3).y()) ) {
++count;
found_colinear = true;
} else {
itr = itr2;
++itr2;
}
*itr2 = *itr3;
}
itr3 = poly.begin();
if(((*itr).x() == (*itr2).x() && (*itr).x() == (*itr3).x()) ||
((*itr).y() == (*itr2).y() && (*itr).y() == (*itr3).y()) ) {
++count;
found_colinear = true;
}
poly.erase(poly.end() - count, poly.end());
}
return poly.size() >= 4;
}
polygon_90_set_data&
bloat(typename coordinate_traits<coordinate_type>::unsigned_area_type west_bloating,
typename coordinate_traits<coordinate_type>::unsigned_area_type east_bloating,
typename coordinate_traits<coordinate_type>::unsigned_area_type south_bloating,
typename coordinate_traits<coordinate_type>::unsigned_area_type north_bloating) {
std::list<polygon_45_with_holes_data<coordinate_type> > polys;
get(polys);
clear();
for(typename std::list<polygon_45_with_holes_data<coordinate_type> >::iterator itr = polys.begin();
itr != polys.end(); ++itr) {
//polygon_90_set_data<coordinate_type> psref;
//psref.insert(view_as<polygon_90_concept>((*itr).self_));
//rectangle_data<coordinate_type> prerect;
//psref.extents(prerect);
resize_poly_up((*itr).self_.coords_, (coordinate_type)west_bloating, (coordinate_type)east_bloating,
(coordinate_type)south_bloating, (coordinate_type)north_bloating);
iterator_geometry_to_set<polygon_90_concept, view_of<polygon_90_concept, polygon_45_data<coordinate_type> > >
begin_input(view_as<polygon_90_concept>((*itr).self_), LOW, orient_, false, true, COUNTERCLOCKWISE),
end_input(view_as<polygon_90_concept>((*itr).self_), HIGH, orient_, false, true, COUNTERCLOCKWISE);
insert(begin_input, end_input, orient_);
//polygon_90_set_data<coordinate_type> pstest;
//pstest.insert(view_as<polygon_90_concept>((*itr).self_));
//psref.bloat2(west_bloating, east_bloating, south_bloating, north_bloating);
//if(!equivalence(psref, pstest)) {
// std::cout << "test failed\n";
//}
for(typename std::list<polygon_45_data<coordinate_type> >::iterator itrh = (*itr).holes_.begin();
itrh != (*itr).holes_.end(); ++itrh) {
//rectangle_data<coordinate_type> rect;
//psref.extents(rect);
//polygon_90_set_data<coordinate_type> psrefhole;
//psrefhole.insert(prerect);
//psrefhole.insert(view_as<polygon_90_concept>(*itrh), true);
//polygon_45_data<coordinate_type> testpoly(*itrh);
if(resize_poly_down((*itrh).coords_,(coordinate_type)west_bloating, (coordinate_type)east_bloating,
(coordinate_type)south_bloating, (coordinate_type)north_bloating)) {
iterator_geometry_to_set<polygon_90_concept, view_of<polygon_90_concept, polygon_45_data<coordinate_type> > >
begin_input2(view_as<polygon_90_concept>(*itrh), LOW, orient_, true, true),
end_input2(view_as<polygon_90_concept>(*itrh), HIGH, orient_, true, true);
insert(begin_input2, end_input2, orient_);
//polygon_90_set_data<coordinate_type> pstesthole;
//pstesthole.insert(rect);
//iterator_geometry_to_set<polygon_90_concept, view_of<polygon_90_concept, polygon_45_data<coordinate_type> > >
// begin_input2(view_as<polygon_90_concept>(*itrh), LOW, orient_, true, true);
//pstesthole.insert(begin_input2, end_input, orient_);
//psrefhole.bloat2(west_bloating, east_bloating, south_bloating, north_bloating);
//if(!equivalence(psrefhole, pstesthole)) {
// std::cout << (winding(testpoly) == CLOCKWISE) << std::endl;
// std::cout << (winding(*itrh) == CLOCKWISE) << std::endl;
// polygon_90_set_data<coordinate_type> c(psrefhole);
// c.clean();
// polygon_90_set_data<coordinate_type> a(pstesthole);
// polygon_90_set_data<coordinate_type> b(pstesthole);
// a.sort();
// b.clean();
// std::cout << "test hole failed\n";
// //std::cout << testpoly << std::endl;
//}
}
}
}
return *this;
}
polygon_90_set_data&
shrink(typename coordinate_traits<coordinate_type>::unsigned_area_type west_shrinking,
typename coordinate_traits<coordinate_type>::unsigned_area_type east_shrinking,
typename coordinate_traits<coordinate_type>::unsigned_area_type south_shrinking,
typename coordinate_traits<coordinate_type>::unsigned_area_type north_shrinking) {
std::list<polygon_45_with_holes_data<coordinate_type> > polys;
get(polys);
clear();
for(typename std::list<polygon_45_with_holes_data<coordinate_type> >::iterator itr = polys.begin();
itr != polys.end(); ++itr) {
//polygon_90_set_data<coordinate_type> psref;
//psref.insert(view_as<polygon_90_concept>((*itr).self_));
//rectangle_data<coordinate_type> prerect;
//psref.extents(prerect);
//polygon_45_data<coordinate_type> testpoly((*itr).self_);
if(resize_poly_down((*itr).self_.coords_, -(coordinate_type)west_shrinking, -(coordinate_type)east_shrinking,
-(coordinate_type)south_shrinking, -(coordinate_type)north_shrinking)) {
iterator_geometry_to_set<polygon_90_concept, view_of<polygon_90_concept, polygon_45_data<coordinate_type> > >
begin_input(view_as<polygon_90_concept>((*itr).self_), LOW, orient_, false, true, COUNTERCLOCKWISE),
end_input(view_as<polygon_90_concept>((*itr).self_), HIGH, orient_, false, true, COUNTERCLOCKWISE);
insert(begin_input, end_input, orient_);
//iterator_geometry_to_set<polygon_90_concept, view_of<polygon_90_concept, polygon_45_data<coordinate_type> > >
// begin_input2(view_as<polygon_90_concept>((*itr).self_), LOW, orient_, false, true, COUNTERCLOCKWISE);
//polygon_90_set_data<coordinate_type> pstest;
//pstest.insert(begin_input2, end_input, orient_);
//psref.shrink2(west_shrinking, east_shrinking, south_shrinking, north_shrinking);
//if(!equivalence(psref, pstest)) {
// std::cout << "test failed\n";
//}
for(typename std::list<polygon_45_data<coordinate_type> >::iterator itrh = (*itr).holes_.begin();
itrh != (*itr).holes_.end(); ++itrh) {
//rectangle_data<coordinate_type> rect;
//psref.extents(rect);
//polygon_90_set_data<coordinate_type> psrefhole;
//psrefhole.insert(prerect);
//psrefhole.insert(view_as<polygon_90_concept>(*itrh), true);
//polygon_45_data<coordinate_type> testpoly(*itrh);
resize_poly_up((*itrh).coords_, -(coordinate_type)west_shrinking, -(coordinate_type)east_shrinking,
-(coordinate_type)south_shrinking, -(coordinate_type)north_shrinking);
iterator_geometry_to_set<polygon_90_concept, view_of<polygon_90_concept, polygon_45_data<coordinate_type> > >
begin_input2(view_as<polygon_90_concept>(*itrh), LOW, orient_, true, true),
end_input2(view_as<polygon_90_concept>(*itrh), HIGH, orient_, true, true);
insert(begin_input2, end_input2, orient_);
//polygon_90_set_data<coordinate_type> pstesthole;
//pstesthole.insert(rect);
//iterator_geometry_to_set<polygon_90_concept, view_of<polygon_90_concept, polygon_45_data<coordinate_type> > >
// begin_input2(view_as<polygon_90_concept>(*itrh), LOW, orient_, true, true);
//pstesthole.insert(begin_input2, end_input, orient_);
//psrefhole.shrink2(west_shrinking, east_shrinking, south_shrinking, north_shrinking);
//if(!equivalence(psrefhole, pstesthole)) {
// std::cout << (winding(testpoly) == CLOCKWISE) << std::endl;
// std::cout << (winding(*itrh) == CLOCKWISE) << std::endl;
// polygon_90_set_data<coordinate_type> c(psrefhole);
// c.clean();
// polygon_90_set_data<coordinate_type> a(pstesthole);
// polygon_90_set_data<coordinate_type> b(pstesthole);
// a.sort();
// b.clean();
// std::cout << "test hole failed\n";
// //std::cout << testpoly << std::endl;
//}
}
}
}
return *this;
}
polygon_90_set_data&
shrink2(typename coordinate_traits<coordinate_type>::unsigned_area_type west_shrinking,
typename coordinate_traits<coordinate_type>::unsigned_area_type east_shrinking,
typename coordinate_traits<coordinate_type>::unsigned_area_type south_shrinking,
typename coordinate_traits<coordinate_type>::unsigned_area_type north_shrinking) {
rectangle_data<coordinate_type> externalBoundary;
if(!extents(externalBoundary)) return *this;
::boost::polygon::bloat(externalBoundary, 10); //bloat by differential amount
//insert a hole that encompasses the data
insert(externalBoundary, true); //note that the set is in a dirty state now
sort(); //does not apply implicit OR operation
std::vector<rectangle_data<coordinate_type> > rects;
rects.reserve(data_.size() / 2);
//begin does not apply implicit or operation, this is a dirty range
form_rectangles(rects, data_.begin(), data_.end(), orient_, rectangle_concept());
clear();
rectangle_data<coordinate_type> convolutionRectangle(interval_data<coordinate_type>(-((coordinate_type)east_shrinking),
(coordinate_type)west_shrinking),
interval_data<coordinate_type>(-((coordinate_type)north_shrinking),
(coordinate_type)south_shrinking));
for(typename std::vector<rectangle_data<coordinate_type> >::iterator itr = rects.begin();
itr != rects.end(); ++itr) {
rectangle_data<coordinate_type>& rect = *itr;
convolve(rect, convolutionRectangle);
//insert rectangle as a hole
insert(rect, true);
}
convolve(externalBoundary, convolutionRectangle);
//insert duplicate of external boundary as solid to cancel out the external hole boundaries
insert(externalBoundary);
clean(); //we have negative values in the set, so we need to apply an OR operation to make it valid input to a boolean
return *this;
}
polygon_90_set_data&
shrink(direction_2d dir, typename coordinate_traits<coordinate_type>::unsigned_area_type shrinking) {
if(dir == WEST)
return shrink(shrinking, 0, 0, 0);
if(dir == EAST)
return shrink(0, shrinking, 0, 0);
if(dir == SOUTH)
return shrink(0, 0, shrinking, 0);
return shrink(0, 0, 0, shrinking);
}
polygon_90_set_data&
bloat(direction_2d dir, typename coordinate_traits<coordinate_type>::unsigned_area_type shrinking) {
if(dir == WEST)
return bloat(shrinking, 0, 0, 0);
if(dir == EAST)
return bloat(0, shrinking, 0, 0);
if(dir == SOUTH)
return bloat(0, 0, shrinking, 0);
return bloat(0, 0, 0, shrinking);
}
polygon_90_set_data&
resize(coordinate_type west, coordinate_type east, coordinate_type south, coordinate_type north);
polygon_90_set_data& move(coordinate_type x_delta, coordinate_type y_delta) {
for(typename std::vector<std::pair<coordinate_type, std::pair<coordinate_type, int> > >::iterator
itr = data_.begin(); itr != data_.end(); ++itr) {
if(orient_ == orientation_2d(VERTICAL)) {
(*itr).first += x_delta;
(*itr).second.first += y_delta;
} else {
(*itr).second.first += x_delta;
(*itr).first += y_delta;
}
}
return *this;
}
// transform set
template <typename transformation_type>
polygon_90_set_data& transform(const transformation_type& transformation) {
direction_2d dir1, dir2;
transformation.get_directions(dir1, dir2);
int sign = dir1.get_sign() * dir2.get_sign();
for(typename std::vector<std::pair<coordinate_type, std::pair<coordinate_type, int> > >::iterator
itr = data_.begin(); itr != data_.end(); ++itr) {
if(orient_ == orientation_2d(VERTICAL)) {
transformation.transform((*itr).first, (*itr).second.first);
} else {
transformation.transform((*itr).second.first, (*itr).first);
}
(*itr).second.second *= sign;
}
if(dir1 != EAST || dir2 != NORTH)
unsorted_ = true; //some mirroring or rotation must have happened
return *this;
}
// scale set
polygon_90_set_data& scale_up(typename coordinate_traits<coordinate_type>::unsigned_area_type factor) {
for(typename std::vector<std::pair<coordinate_type, std::pair<coordinate_type, int> > >::iterator
itr = data_.begin(); itr != data_.end(); ++itr) {
(*itr).first *= (coordinate_type)factor;
(*itr).second.first *= (coordinate_type)factor;
}
return *this;
}
polygon_90_set_data& scale_down(typename coordinate_traits<coordinate_type>::unsigned_area_type factor) {
typedef typename coordinate_traits<coordinate_type>::coordinate_distance dt;
for(typename std::vector<std::pair<coordinate_type, std::pair<coordinate_type, int> > >::iterator
itr = data_.begin(); itr != data_.end(); ++itr) {
(*itr).first = scaling_policy<coordinate_type>::round((dt)((*itr).first) / (dt)factor);
(*itr).second.first = scaling_policy<coordinate_type>::round((dt)((*itr).second.first) / (dt)factor);
}
unsorted_ = true; //scaling down can make coordinates equal that were not previously equal
return *this;
}
template <typename scaling_type>
polygon_90_set_data& scale(const anisotropic_scale_factor<scaling_type>& scaling) {
for(typename std::vector<std::pair<coordinate_type, std::pair<coordinate_type, int> > >::iterator
itr = data_.begin(); itr != data_.end(); ++itr) {
if(orient_ == orientation_2d(VERTICAL)) {
scaling.scale((*itr).first, (*itr).second.first);
} else {
scaling.scale((*itr).second.first, (*itr).first);
}
}
unsorted_ = true;
return *this;
}
template <typename scaling_type>
polygon_90_set_data& scale_with(const scaling_type& scaling) {
for(typename std::vector<std::pair<coordinate_type, std::pair<coordinate_type, int> > >::iterator
itr = data_.begin(); itr != data_.end(); ++itr) {
if(orient_ == orientation_2d(VERTICAL)) {
scaling.scale((*itr).first, (*itr).second.first);
} else {
scaling.scale((*itr).second.first, (*itr).first);
}
}
unsorted_ = true;
return *this;
}
polygon_90_set_data& scale(double factor) {
typedef typename coordinate_traits<coordinate_type>::coordinate_distance dt;
for(typename std::vector<std::pair<coordinate_type, std::pair<coordinate_type, int> > >::iterator
itr = data_.begin(); itr != data_.end(); ++itr) {
(*itr).first = scaling_policy<coordinate_type>::round((dt)((*itr).first) * (dt)factor);
(*itr).second.first = scaling_policy<coordinate_type>::round((dt)((*itr).second.first) * (dt)factor);
}
unsorted_ = true; //scaling can make coordinates equal that were not previously equal
return *this;
}
polygon_90_set_data& self_xor() {
sort();
if(dirty_) { //if it is clean it is a no-op
boolean_op::default_arg_workaround<boolean_op::UnaryCount>::applyBooleanOr(data_);
dirty_ = false;
}
return *this;
}
polygon_90_set_data& self_intersect() {
sort();
if(dirty_) { //if it is clean it is a no-op
interval_data<coordinate_type> ivl((std::numeric_limits<coordinate_type>::min)(), (std::numeric_limits<coordinate_type>::max)());
rectangle_data<coordinate_type> rect(ivl, ivl);
insert(rect, true);
clean();
}
return *this;
}
inline polygon_90_set_data& interact(const polygon_90_set_data& that) {
typedef coordinate_type Unit;
if(that.dirty_) that.clean();
typename touch_90_operation<Unit>::TouchSetData tsd;
touch_90_operation<Unit>::populateTouchSetData(tsd, that.data_, 0);
std::vector<polygon_90_data<Unit> > polys;
get(polys);
std::vector<std::set<int> > graph(polys.size()+1, std::set<int>());
for(std::size_t i = 0; i < polys.size(); ++i){
polygon_90_set_data<Unit> psTmp(that.orient_);
psTmp.insert(polys[i]);
psTmp.clean();
touch_90_operation<Unit>::populateTouchSetData(tsd, psTmp.data_, i+1);
}
touch_90_operation<Unit>::performTouch(graph, tsd);
clear();
for(std::set<int>::iterator itr = graph[0].begin(); itr != graph[0].end(); ++itr){
insert(polys[(*itr)-1]);
}
dirty_ = false;
return *this;
}
template <class T2, typename iterator_type_1, typename iterator_type_2>
void applyBooleanBinaryOp(iterator_type_1 itr1, iterator_type_1 itr1_end,
iterator_type_2 itr2, iterator_type_2 itr2_end,
T2 defaultCount) {
data_.clear();
boolean_op::applyBooleanBinaryOp(data_, itr1, itr1_end, itr2, itr2_end, defaultCount);
}
private:
orientation_2d orient_;
mutable value_type data_;
mutable bool dirty_;
mutable bool unsorted_;
private:
//functions
template <typename output_container>
void get_dispatch(output_container& output, rectangle_concept ) const {
clean();
form_rectangles(output, data_.begin(), data_.end(), orient_, rectangle_concept());
}
template <typename output_container>
void get_dispatch(output_container& output, polygon_90_concept tag) const {
get_fracture(output, true, tag);
}
template <typename output_container>
void get_dispatch(output_container& output, polygon_90_concept tag,
size_t vthreshold) const {
get_fracture(output, true, tag, vthreshold);
}
template <typename output_container>
void get_dispatch(output_container& output, polygon_90_with_holes_concept tag) const {
get_fracture(output, false, tag);
}
template <typename output_container>
void get_dispatch(output_container& output, polygon_90_with_holes_concept tag,
size_t vthreshold) const {
get_fracture(output, false, tag, vthreshold);
}
template <typename output_container>
void get_dispatch(output_container& output, polygon_45_concept tag) const {
get_fracture(output, true, tag);
}
template <typename output_container>
void get_dispatch(output_container& output, polygon_45_with_holes_concept tag) const {
get_fracture(output, false, tag);
}
template <typename output_container>
void get_dispatch(output_container& output, polygon_concept tag) const {
get_fracture(output, true, tag);
}
template <typename output_container>
void get_dispatch(output_container& output, polygon_with_holes_concept tag) const {
get_fracture(output, false, tag);
}
template <typename output_container, typename concept_type>
void get_fracture(output_container& container, bool fracture_holes, concept_type tag) const {
clean();
::boost::polygon::get_polygons(container, data_.begin(), data_.end(), orient_, fracture_holes, tag);
}
template <typename output_container, typename concept_type>
void get_fracture(output_container& container, bool fracture_holes, concept_type tag,
size_t vthreshold) const {
clean();
::boost::polygon::get_polygons(container, data_.begin(), data_.end(), orient_, fracture_holes, tag, vthreshold);
}
};
template <typename coordinate_type>
polygon_90_set_data<coordinate_type>&
polygon_90_set_data<coordinate_type>::resize(coordinate_type west,
coordinate_type east,
coordinate_type south,
coordinate_type north) {
move(-west, -south);
coordinate_type e_total = west + east;
coordinate_type n_total = south + north;
if((e_total < 0) ^ (n_total < 0)) {
//different signs
if(e_total < 0) {
shrink(0, -e_total, 0, 0);
if(n_total != 0)
return bloat(0, 0, 0, n_total);
else
return (*this);
} else {
shrink(0, 0, 0, -n_total); //shrink first
if(e_total != 0)
return bloat(0, e_total, 0, 0);
else
return (*this);
}
} else {
if(e_total < 0) {
return shrink(0, -e_total, 0, -n_total);
}
return bloat(0, e_total, 0, n_total);
}
}
template <typename coordinate_type, typename property_type>
class property_merge_90 {
private:
std::vector<std::pair<property_merge_point<coordinate_type>, std::pair<property_type, int> > > pmd_;
public:
inline property_merge_90() : pmd_() {}
inline property_merge_90(const property_merge_90& that) : pmd_(that.pmd_) {}
inline property_merge_90& operator=(const property_merge_90& that) { pmd_ = that.pmd_; return *this; }
inline void insert(const polygon_90_set_data<coordinate_type>& ps, const property_type& property) {
merge_scanline<coordinate_type, property_type, polygon_90_set_data<coordinate_type> >::
populate_property_merge_data(pmd_, ps.begin(), ps.end(), property, ps.orient());
}
template <class GeoObjT>
inline void insert(const GeoObjT& geoObj, const property_type& property) {
polygon_90_set_data<coordinate_type> ps;
ps.insert(geoObj);
insert(ps, property);
}
//merge properties of input geometries and store the resulting geometries of regions
//with unique sets of merged properties to polygons sets in a map keyed by sets of properties
// T = std::map<std::set<property_type>, polygon_90_set_data<coordinate_type> > or
// T = std::map<std::vector<property_type>, polygon_90_set_data<coordinate_type> >
template <typename ResultType>
inline void merge(ResultType& result) {
merge_scanline<coordinate_type, property_type, polygon_90_set_data<coordinate_type>, typename ResultType::key_type> ms;
ms.perform_merge(result, pmd_);
}
};
//ConnectivityExtraction computes the graph of connectivity between rectangle, polygon and
//polygon set graph nodes where an edge is created whenever the geometry in two nodes overlap
template <typename coordinate_type>
class connectivity_extraction_90 {
private:
typedef typename touch_90_operation<coordinate_type>::TouchSetData tsd;
tsd tsd_;
unsigned int nodeCount_;
public:
inline connectivity_extraction_90() : tsd_(), nodeCount_(0) {}
inline connectivity_extraction_90(const connectivity_extraction_90& that) : tsd_(that.tsd_),
nodeCount_(that.nodeCount_) {}
inline connectivity_extraction_90& operator=(const connectivity_extraction_90& that) {
tsd_ = that.tsd_;
nodeCount_ = that.nodeCount_;
return *this;
}
//insert a polygon set graph node, the value returned is the id of the graph node
inline unsigned int insert(const polygon_90_set_data<coordinate_type>& ps) {
ps.clean();
touch_90_operation<coordinate_type>::populateTouchSetData(tsd_, ps.begin(), ps.end(), nodeCount_);
return nodeCount_++;
}
template <class GeoObjT>
inline unsigned int insert(const GeoObjT& geoObj) {
polygon_90_set_data<coordinate_type> ps;
ps.insert(geoObj);
return insert(ps);
}
//extract connectivity and store the edges in the graph
//graph must be indexable by graph node id and the indexed value must be a std::set of
//graph node id
template <class GraphT>
inline void extract(GraphT& graph) {
touch_90_operation<coordinate_type>::performTouch(graph, tsd_);
}
};
}
}
#endif
```
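
The `shrink2` member above shrinks a region by decomposing it into rectangles, convolving each one with a "shrink rectangle" whose intervals are `[-east, west]` horizontally and `[-north, south]` vertically, and re-inserting the results as holes inside a bloated copy of the external boundary. A minimal Python sketch of the per-rectangle convolution (Minkowski sum) step, using hypothetical `(xl, yl, xh, yh)` tuples rather than the library's rectangle type:

```python
def convolve(a, b):
    """Minkowski sum of two axis-aligned rectangles given as (xl, yl, xh, yh)."""
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2], a[3] + b[3])

# Shrink rectangle for west=1, east=2, south=3, north=4:
# horizontal interval [-east, west], vertical interval [-north, south].
shrink_rect = (-2, -4, 1, 3)

# Each decomposed rectangle is grown into a hole that overlaps the original
# boundary by the opposite side's shrink amount.
hole = convolve((0, 0, 10, 10), shrink_rect)
print(hole)  # (-2, -4, 11, 13)
```

The holes then cancel against the bloated external boundary during `clean()`, leaving the shrunken solid region.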
|
Karin Smirnov or Smirnoff (née Strindberg; 26 February 1880 – 10 May 1973) was a Finno-Swedish writer. She was the daughter of August Strindberg and Siri von Essen.
Smirnov was a socialist; she married Russian Bolshevik . She wrote plays, as well as books about her mother and father and their marriage.
Finnish writers in Swedish
Swedish-speaking Finns
1880 births
1973 deaths
Strindberg family
|
```c
/***************************************************************************
* *
* ########### ########### ########## ########## *
* ############ ############ ############ ############ *
* ## ## ## ## ## ## ## *
* ## ## ## ## ## ## ## *
* ########### #### ###### ## ## ## ## ###### *
* ########### #### # ## ## ## ## # # *
* ## ## ###### ## ## ## ## # # *
* ## ## # ## ## ## ## # # *
* ############ ##### ###### ## ## ## ##### ###### *
* ########### ########### ## ## ## ########## *
* *
* S E C U R E M O B I L E N E T W O R K I N G *
* *
* This file is part of NexMon. *
* *
* *
* NexMon is free software: you can redistribute it and/or modify *
* it under the terms of the GNU General Public License as published by *
* the Free Software Foundation, either version 3 of the License, or *
* (at your option) any later version. *
* *
* NexMon is distributed in the hope that it will be useful, *
* but WITHOUT ANY WARRANTY; without even the implied warranty of *
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the *
* GNU General Public License for more details. *
* *
* You should have received a copy of the GNU General Public License *
* along with NexMon. If not, see <path_to_url *
* *
**************************************************************************/
#pragma NEXMON targetregion "patch"
#include <firmware_version.h> // definition of firmware version macros
#include <debug.h> // contains macros to access the debug hardware
#include <wrapper.h> // wrapper definitions for functions that already exist in the firmware
#include <structs.h> // structures that are used by the code in the firmware
#include <helper.h> // useful helper functions
#include <patcher.h> // macros used to create patches such as BLPatch, BPatch, ...
#include <rates.h> // rates used to build the ratespec for frame injection
#include <bcmwifi_channels.h>
#include <monitormode.h> // definitions such as MONITOR_...
#define RADIOTAP_MCS
#define RADIOTAP_VENDOR
#include <ieee80211_radiotap.h>
// plcp length in bytes
#define PLCP_LEN 6
extern void prepend_ethernet_ipv4_udp_header(struct sk_buff *p);
static int
channel2freq(struct wl_info *wl, unsigned int channel)
{
int freq = 0;
void *ci = 0;
wlc_phy_chan2freq_acphy(wl->wlc->band->pi, channel, &freq, &ci);
return freq;
}
static void
wl_monitor_radiotap(struct wl_info *wl, struct wl_rxsts *sts, struct sk_buff *p, unsigned char tunnel_over_udp)
{
struct osl_info *osh = wl->wlc->osh;
unsigned int p_len_new;
struct sk_buff *p_new;
if (tunnel_over_udp) {
p_len_new = p->len + sizeof(struct ethernet_ip_udp_header) +
sizeof(struct nexmon_radiotap_header);
} else {
p_len_new = p->len + sizeof(struct nexmon_radiotap_header);
}
// We figured out that frames larger than 2032 will not arrive in user space
if (p_len_new > 2032) {
printf("ERR: frame too large\n");
return;
} else {
p_new = pkt_buf_get_skb(osh, p_len_new);
}
if (!p_new) {
printf("ERR: no free sk_buff\n");
return;
}
if (tunnel_over_udp)
skb_pull(p_new, sizeof(struct ethernet_ip_udp_header));
struct nexmon_radiotap_header *frame = (struct nexmon_radiotap_header *) p_new->data;
memset(p_new->data, 0, sizeof(struct nexmon_radiotap_header));
frame->header.it_version = 0;
frame->header.it_pad = 0;
frame->header.it_len = sizeof(struct nexmon_radiotap_header) + PLCP_LEN;
frame->header.it_present =
(1<<IEEE80211_RADIOTAP_TSFT)
| (1<<IEEE80211_RADIOTAP_FLAGS)
| (1<<IEEE80211_RADIOTAP_RATE)
| (1<<IEEE80211_RADIOTAP_CHANNEL)
| (1<<IEEE80211_RADIOTAP_DBM_ANTSIGNAL)
| (1<<IEEE80211_RADIOTAP_DBM_ANTNOISE)
| (1<<IEEE80211_RADIOTAP_MCS)
| (1<<IEEE80211_RADIOTAP_VENDOR_NAMESPACE);
frame->tsf.tsf_l = sts->mactime;
frame->tsf.tsf_h = 0;
frame->flags = IEEE80211_RADIOTAP_F_FCS;
frame->chan_freq = channel2freq(wl, CHSPEC_CHANNEL(sts->chanspec));
if (frame->chan_freq > 3000)
frame->chan_flags |= IEEE80211_CHAN_5GHZ;
else
frame->chan_flags |= IEEE80211_CHAN_2GHZ;
if (sts->encoding == WL_RXS_ENCODING_OFDM)
frame->chan_flags |= IEEE80211_CHAN_OFDM;
if (sts->encoding == WL_RXS_ENCODING_DSSS_CCK)
frame->chan_flags |= IEEE80211_CHAN_CCK;
frame->data_rate = sts->datarate;
frame->dbm_antsignal = sts->signal;
frame->dbm_antnoise = sts->noise;
if (sts->encoding == WL_RXS_ENCODING_HT) {
frame->mcs[0] =
IEEE80211_RADIOTAP_MCS_HAVE_BW
| IEEE80211_RADIOTAP_MCS_HAVE_MCS
| IEEE80211_RADIOTAP_MCS_HAVE_GI
| IEEE80211_RADIOTAP_MCS_HAVE_FMT
| IEEE80211_RADIOTAP_MCS_HAVE_FEC
| IEEE80211_RADIOTAP_MCS_HAVE_STBC;
switch(sts->htflags) {
case WL_RXS_HTF_40:
frame->mcs[1] |= IEEE80211_RADIOTAP_MCS_BW_40;
break;
case WL_RXS_HTF_20L:
frame->mcs[1] |= IEEE80211_RADIOTAP_MCS_BW_20L;
break;
case WL_RXS_HTF_20U:
frame->mcs[1] |= IEEE80211_RADIOTAP_MCS_BW_20U;
break;
case WL_RXS_HTF_SGI:
frame->mcs[1] |= IEEE80211_RADIOTAP_MCS_SGI;
break;
case WL_RXS_HTF_STBC_MASK:
frame->mcs[1] |= ((sts->htflags & WL_RXS_HTF_STBC_MASK) >> WL_RXS_HTF_STBC_SHIFT) << IEEE80211_RADIOTAP_MCS_STBC_SHIFT;
break;
case WL_RXS_HTF_LDPC:
frame->mcs[1] |= IEEE80211_RADIOTAP_MCS_FEC_LDPC;
break;
}
frame->mcs[2] = sts->mcs;
}
frame->vendor_oui[0] = 'N';
frame->vendor_oui[1] = 'E';
frame->vendor_oui[2] = 'X';
frame->vendor_sub_namespace = 0;
frame->vendor_skip_length = PLCP_LEN;
memcpy(p_new->data + sizeof(struct nexmon_radiotap_header), p->data, p->len);
if (tunnel_over_udp) {
prepend_ethernet_ipv4_udp_header(p_new);
}
//wl_sendup(wl, 0, p_new);
wl->dev->chained->funcs->xmit(wl->dev, wl->dev->chained, p_new);
}
void
wl_monitor_hook(struct wl_info *wl, struct wl_rxsts *sts, struct sk_buff *p) {
unsigned char monitor = wl->wlc->monitor & 0xFF;
if (monitor & MONITOR_RADIOTAP) {
wl_monitor_radiotap(wl, sts, p, 0);
}
if (monitor & MONITOR_IEEE80211) {
wl_monitor(wl, sts, p);
}
if (monitor & MONITOR_LOG_ONLY) {
printf("frame received\n");
}
if (monitor & MONITOR_DROP_FRM) {
;
}
if (monitor & MONITOR_IPV4_UDP) {
wl_monitor_radiotap(wl, sts, p, 1);
}
}
// Hook the call to wl_monitor in wlc_monitor
__attribute__((at(0x18DA30, "", CHIP_VER_BCM4339, FW_VER_6_37_32_RC23_34_40_r581243)))
__attribute__((at(0x18DB20, "", CHIP_VER_BCM4339, FW_VER_6_37_32_RC23_34_43_r639704)))
BLPatch(wl_monitor_hook, wl_monitor_hook);
// activate badfcs, if MONITOR_ACTIVATE_BADFCS is set
void
wlc_mctrl_hook(struct wlc_info *wlc, uint32 mask, uint32 val)
{
if (wlc->monitor & MONITOR_ACTIVATE_BADFCS)
wlc_mctrl(wlc, MCTL_PROMISC | MCTL_KEEPBADFCS | MCTL_KEEPCONTROL, MCTL_PROMISC | MCTL_KEEPBADFCS | MCTL_KEEPCONTROL);
else
wlc_mctrl(wlc, mask, val);
}
__attribute__((at(0x34CB6, "flashpatch", CHIP_VER_BCM4339, FW_VER_ALL)))
BLPatch(wlc_mctrl_hook, wlc_mctrl_hook);
```
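
The radiotap header built in `wl_monitor_radiotap` advertises which fields follow via the `it_present` bitmask, one bit per field in the order fixed by the radiotap specification. A small Python sketch of composing and testing such a mask; the bit positions below follow the standard radiotap numbering (the same values the included `ieee80211_radiotap.h` is expected to define):

```python
# Standard radiotap present-flag bit positions (subset).
IEEE80211_RADIOTAP_TSFT = 0
IEEE80211_RADIOTAP_FLAGS = 1
IEEE80211_RADIOTAP_RATE = 2
IEEE80211_RADIOTAP_CHANNEL = 3
IEEE80211_RADIOTAP_DBM_ANTSIGNAL = 5
IEEE80211_RADIOTAP_DBM_ANTNOISE = 6
IEEE80211_RADIOTAP_MCS = 19
IEEE80211_RADIOTAP_VENDOR_NAMESPACE = 30

# Mirrors the it_present assignment in the patch above.
it_present = (
    (1 << IEEE80211_RADIOTAP_TSFT)
    | (1 << IEEE80211_RADIOTAP_FLAGS)
    | (1 << IEEE80211_RADIOTAP_RATE)
    | (1 << IEEE80211_RADIOTAP_CHANNEL)
    | (1 << IEEE80211_RADIOTAP_DBM_ANTSIGNAL)
    | (1 << IEEE80211_RADIOTAP_DBM_ANTNOISE)
    | (1 << IEEE80211_RADIOTAP_MCS)
    | (1 << IEEE80211_RADIOTAP_VENDOR_NAMESPACE)
)

def has_field(present, bit):
    """A parser checks each bit to know which optional fields are serialized."""
    return bool(present & (1 << bit))
```

A capture tool such as Wireshark walks this mask bit by bit to decide how to parse the bytes that follow the fixed header.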
|
```shell
# See genscripts.sh and ../scripttempl/elf.sc for the meaning of these.
SCRIPT_NAME=elf
ELFSIZE=64
TEMPLATE_NAME=elf32
EXTRA_EM_FILE=ia64elf
OUTPUT_FORMAT="elf64-ia64-little"
ARCH=ia64
MACHINE=
MAXPAGESIZE=0x10000
# FIXME: It interferes with linker relaxation. Disable it until it is
# fixed.
if test "0" = "1" -a -n "$CREATE_SHLIB"; then
# Optimize shared libraries for 16K page size
COMMONPAGESIZE=0x4000
fi
TEXT_START_ADDR="0x4000000000000000"
DATA_ADDR="0x6000000000000000 + (. & (${MAXPAGESIZE} - 1))"
GENERATE_SHLIB_SCRIPT=yes
GENERATE_PIE_SCRIPT=yes
NOP=0x00300000010070000002000001000400 # a bundle full of nops
OTHER_GOT_SECTIONS="
.IA_64.pltoff ${RELOCATING-0} : { *(.IA_64.pltoff) }"
OTHER_PLT_RELOC_SECTIONS="
.rela.IA_64.pltoff ${RELOCATING-0} : { *(.rela.IA_64.pltoff) }"
OTHER_READONLY_SECTIONS=
OTHER_READWRITE_SECTIONS=
test -z "$CREATE_PIE" && OTHER_READONLY_SECTIONS="
.opd ${RELOCATING-0} : { *(.opd) }"
test -n "$CREATE_PIE" && OTHER_READWRITE_SECTIONS="
.opd ${RELOCATING-0} : { *(.opd) }"
test -n "$CREATE_PIE" && OTHER_GOT_RELOC_SECTIONS="
.rela.opd ${RELOCATING-0} : { *(.rela.opd) }"
OTHER_READONLY_SECTIONS="${OTHER_READONLY_SECTIONS}
.IA_64.unwind_info ${RELOCATING-0} : { *(.IA_64.unwind_info${RELOCATING+* .gnu.linkonce.ia64unwi.*}) }
.IA_64.unwind ${RELOCATING-0} : { *(.IA_64.unwind${RELOCATING+* .gnu.linkonce.ia64unw.*}) }"
```
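
The `DATA_ADDR` expression above places the data segment at a fixed base while preserving the location counter's offset within a maximum-size page, so virtual addresses stay congruent to file offsets modulo the page size. A quick Python sketch of that congruence, where `dot` stands in for the linker's `.` location counter:

```python
MAXPAGESIZE = 0x10000

def data_addr(dot):
    # Mirror of: 0x6000000000000000 + (. & (MAXPAGESIZE - 1))
    return 0x6000000000000000 + (dot & (MAXPAGESIZE - 1))

# The low 16 bits (the within-page offset) carry over to the new base.
addr = data_addr(0x4000000000012345)
print(hex(addr))  # 0x6000000000002345
```

Keeping that offset intact lets the loader map the data segment with simple page-aligned `mmap` calls instead of copying.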
|
Saber Azizi (born 13 January 1996) is an Afghan professional footballer who plays as a defender for Ariana FC.
Club career
Born in Sweden, Azizi joined the BK Olympic youth academy. He made his debut for Landskrona BoIS against Norrby IF.
Ariana FC
Azizi joined Ariana FC – a club from Malmö founded by Afghan immigrants – in July 2018.
International career
In September 2016, he received his first call-up to the Afghanistan senior side for the friendly against Lebanon.
In October 2017, he played his first competitive game for Afghanistan, against Jordan in the qualification for the 2019 AFC Asian Cup.
Career statistics
References
External links
Saber Azizi at FotbollTransfers
1996 births
Living people
Men's association football defenders
Afghan men's footballers
Afghanistan men's international footballers
Swedish men's footballers
Swedish people of Afghan descent
Sportspeople of Afghan descent
BK Olympic players
Landskrona BoIS players
FC Rosengård 1917 players
Footballers from Malmö
|
Teófilo Leuterio Sison (February 29, 1880 – April 13, 1975) was a Philippine legislator and the first Secretary of National Defense of the Philippine Commonwealth.
Early life
Sison was born on February 29, 1880, in Dagupan, Pangasinan, to Benito Sison and Escolástica Leuterio.
He studied at the College of San Alberto Magno, obtaining a Bachelor of Arts degree in 1896, and at the University of Santo Tomas, earning another B.A. in the same year. He taught in the public schools of Binmaley, Pangasinan, from October 1900 until June 1901.
Career
On July 1, 1901, he was appointed interpreter for the Court of First Instance Third Judicial District. It was during his term as court interpreter that he married Filomena Solis in Lingayen, Pangasinan on November 19, 1910. He served in such capacity until July 1, 1914, when he was reappointed to a similar position in the 5th District where he remained until September 30, 1914.
After he passed the Philippine Bar examination on September 7, 1914, he established his own law office and engaged in the active practice of his profession.
Legislative career
In June 1916, he was elected Municipal Councilor of Lingayen, a position he held until October 1919. He went on to become Provincial Governor of Pangasinan during the June 1922 election and was re-elected in the general elections of 1925.
In June 1928, he was elected Senator for the Second Senatorial District, comprising the provinces of Pangasinan, La Union and Zambales. As Senator during the period 1928–1931, he was Chairman of the Committees on Civil Service and National Enterprise, and member of the following committees: Finance, Public Works and Communication, Appointments, Justice, Municipal and Provincial Governments, Election and Privileges, City of Manila, Commerce and Industry, Labor and Immigration.
During the 9th Legislative Assembly, he was chairman of the Committee of Justice and member of the following committees: Finance, Public Works and Communication, Appointments, Public Instruction, External Relations, Banks Corporations and Franchise, Commerce and Industry, City of Manila, Municipal and Provincial Governments, Labor and Immigration, Civil Service and Library.
Secretary of National Defense
He was appointed Secretary of National Defense on November 1, 1939, during the presidency of Manuel Quezon pursuant to the enactment of Commonwealth Act No. 1 or the National Defense Act.
Death
He died two months after his 95th birthday on April 13, 1975. He was buried at Loyola Memorial Park in Marikina.
See also
List of secretaries of the Department of National Defense of the Philippines
References
Sison's Biography
1880 births
1975 deaths
Senators of the 10th Philippine Legislature
Senators of the 9th Philippine Legislature
Senators of the 8th Philippine Legislature
Nacionalista Party politicians
Governors of Pangasinan
People from Dagupan
Secretaries of National Defense of the Philippines
20th-century Filipino lawyers
University of Santo Tomas alumni
Burials at the Loyola Memorial Park
Quezon administration cabinet members
Members of the Senate of the Philippines from the 2nd district
|
```python
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# path_to_url
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Environment configuration object for Estimators."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import copy
import json
import os
import six
from tensorflow.core.protobuf import config_pb2
from tensorflow.python.platform import tf_logging as logging
from tensorflow.python.training import server_lib
_USE_DEFAULT = object()
# A list of the property names in RunConfig that the user is allowed to change.
_DEFAULT_REPLACEABLE_LIST = [
'model_dir',
'tf_random_seed',
'save_summary_steps',
'save_checkpoints_steps',
'save_checkpoints_secs',
'session_config',
'keep_checkpoint_max',
'keep_checkpoint_every_n_hours',
'log_step_count_steps'
]
_SAVE_CKPT_ERR = (
'`save_checkpoints_steps` and `save_checkpoints_secs` cannot be both set.'
)
_TF_CONFIG_ENV = 'TF_CONFIG'
_TASK_ENV_KEY = 'task'
_TASK_TYPE_KEY = 'type'
_TASK_ID_KEY = 'index'
_CLUSTER_KEY = 'cluster'
_SERVICE_KEY = 'service'
_LOCAL_MASTER = ''
_GRPC_SCHEME = 'grpc://'
def _get_master(cluster_spec, task_type, task_id):
"""Returns the appropriate string for the TensorFlow master."""
if not cluster_spec:
raise RuntimeError(
'Internal error: `_get_master` does not expect empty cluster_spec.')
jobs = cluster_spec.jobs
# Lookup the master in cluster_spec using task_type and task_id,
# if possible.
if task_type not in jobs:
raise ValueError(
'%s is not a valid task_type in the cluster_spec:\n'
'%s\n\n'
'Note that these values may be coming from the TF_CONFIG environment '
'variable.' % (task_type, cluster_spec))
addresses = cluster_spec.job_tasks(task_type)
if not 0 <= task_id < len(addresses):
raise ValueError(
'%d is not a valid task_id for task_type %s in the cluster_spec:\n'
'%s\n\n'
'Note that these values may be coming from the TF_CONFIG environment '
'variable.' % (task_id, task_type, cluster_spec))
return _GRPC_SCHEME + addresses[task_id]
def _count_ps(cluster_spec):
"""Counts the number of parameter servers in cluster_spec."""
if not cluster_spec:
raise RuntimeError(
'Internal error: `_count_ps` does not expect empty cluster_spec.')
return len(cluster_spec.as_dict().get(TaskType.PS, []))
def _count_worker(cluster_spec, chief_task_type):
"""Counts the number of workers (including chief) in cluster_spec."""
if not cluster_spec:
raise RuntimeError(
'Internal error: `_count_worker` does not expect empty cluster_spec.')
return (len(cluster_spec.as_dict().get(TaskType.WORKER, [])) +
len(cluster_spec.as_dict().get(chief_task_type, [])))
def _validate_service(service):
"""Validates the service key."""
if service is not None and not isinstance(service, dict):
raise TypeError(
'If "service" is set in TF_CONFIG, it must be a dict. Given %s' %
type(service))
return service
def _validate_task_type_and_task_id(cluster_spec, task_env, chief_task_type):
"""Validates the task type and index in `task_env` according to cluster."""
if chief_task_type not in cluster_spec.jobs:
raise ValueError(
'If "cluster" is set in TF_CONFIG, it must have one "%s" node.' %
chief_task_type)
if len(cluster_spec.job_tasks(chief_task_type)) > 1:
raise ValueError(
'The "cluster" in TF_CONFIG must have only one "%s" node.' %
chief_task_type)
task_type = task_env.get(_TASK_TYPE_KEY, None)
task_id = task_env.get(_TASK_ID_KEY, None)
if not task_type:
raise ValueError(
'If "cluster" is set in TF_CONFIG, task type must be set.')
if task_id is None:
raise ValueError(
'If "cluster" is set in TF_CONFIG, task index must be set.')
task_id = int(task_id)
# Check the task id bounds. Upper bound is not necessary as
# - for evaluator, there is no upper bound.
# - for non-evaluator, task id is upper bounded by the number of jobs in
# cluster spec, which will be checked later (when retrieving the `master`)
if task_id < 0:
raise ValueError('Task index must be non-negative number.')
return task_type, task_id
def _validate_save_ckpt_with_replaced_keys(new_copy, replaced_keys):
"""Validates the save ckpt properties."""
# Ensure one (and only one) of save_steps and save_secs is not None.
# Also, if user sets one save ckpt property, say steps, the other one (secs)
# should be set as None to improve usability.
save_steps = new_copy.save_checkpoints_steps
save_secs = new_copy.save_checkpoints_secs
if ('save_checkpoints_steps' in replaced_keys and
'save_checkpoints_secs' in replaced_keys):
# If user sets both properties explicitly, we need to error out if both
# are set or neither of them are set.
if save_steps is not None and save_secs is not None:
raise ValueError(_SAVE_CKPT_ERR)
elif 'save_checkpoints_steps' in replaced_keys and save_steps is not None:
new_copy._save_checkpoints_secs = None # pylint: disable=protected-access
elif 'save_checkpoints_secs' in replaced_keys and save_secs is not None:
new_copy._save_checkpoints_steps = None # pylint: disable=protected-access
def _validate_properties(run_config):
"""Validates the properties."""
def _validate(property_name, cond, message):
property_value = getattr(run_config, property_name)
if property_value is not None and not cond(property_value):
raise ValueError(message)
_validate('model_dir', lambda dir: dir,
message='model_dir should be non-empty')
_validate('save_summary_steps', lambda steps: steps >= 0,
message='save_summary_steps should be >= 0')
_validate('save_checkpoints_steps', lambda steps: steps >= 0,
message='save_checkpoints_steps should be >= 0')
_validate('save_checkpoints_secs', lambda secs: secs >= 0,
message='save_checkpoints_secs should be >= 0')
_validate('session_config',
lambda sc: isinstance(sc, config_pb2.ConfigProto),
message='session_config must be instance of ConfigProto')
_validate('keep_checkpoint_max', lambda keep_max: keep_max >= 0,
message='keep_checkpoint_max should be >= 0')
_validate('keep_checkpoint_every_n_hours', lambda keep_hours: keep_hours > 0,
message='keep_checkpoint_every_n_hours should be > 0')
_validate('log_step_count_steps', lambda num_steps: num_steps > 0,
message='log_step_count_steps should be > 0')
_validate('tf_random_seed', lambda seed: isinstance(seed, six.integer_types),
message='tf_random_seed must be integer.')
class TaskType(object):
MASTER = 'master'
PS = 'ps'
WORKER = 'worker'
CHIEF = 'chief'
EVALUATOR = 'evaluator'
class RunConfig(object):
"""This class specifies the configurations for an `Estimator` run."""
def __init__(self,
model_dir=None,
tf_random_seed=None,
save_summary_steps=100,
save_checkpoints_steps=_USE_DEFAULT,
save_checkpoints_secs=_USE_DEFAULT,
session_config=None,
keep_checkpoint_max=5,
keep_checkpoint_every_n_hours=10000,
log_step_count_steps=100):
"""Constructs a RunConfig.
All distributed training related properties `cluster_spec`, `is_chief`,
`master` , `num_worker_replicas`, `num_ps_replicas`, `task_id`, and
`task_type` are set based on the `TF_CONFIG` environment variable, if the
pertinent information is present. The `TF_CONFIG` environment variable is a
JSON object with attributes: `cluster` and `task`.
`cluster` is a JSON serialized version of `ClusterSpec`'s Python dict from
`server_lib.py`, mapping task types (usually one of the `TaskType` enums) to
a list of task addresses.
  `task` has two attributes: `type` and `index`, where `type` can be any of
  the task types in `cluster`. When `TF_CONFIG` contains said information,
  the following properties are set on this class:
* `cluster_spec` is parsed from `TF_CONFIG['cluster']`. Defaults to {}. If
present, must have one and only one node in the `chief` attribute of
`cluster_spec`.
  * `task_type` is set to `TF_CONFIG['task']['type']`. Must be set if
    `cluster_spec` is present; must be `worker` (the default value) if
    `cluster_spec` is not set.
  * `task_id` is set to `TF_CONFIG['task']['index']`. Must be set if
    `cluster_spec` is present; must be 0 (the default value) if
    `cluster_spec` is not set.
* `master` is determined by looking up `task_type` and `task_id` in the
`cluster_spec`. Defaults to ''.
* `num_ps_replicas` is set by counting the number of nodes listed
in the `ps` attribute of `cluster_spec`. Defaults to 0.
* `num_worker_replicas` is set by counting the number of nodes listed
in the `worker` and `chief` attributes of `cluster_spec`. Defaults to 1.
* `is_chief` is determined based on `task_type` and `cluster`.
There is a special node with `task_type` as `evaluator`, which is not part
of the (training) `cluster_spec`. It handles the distributed evaluation job.
Example of non-chief node:
```
cluster = {'chief': ['host0:2222'],
'ps': ['host1:2222', 'host2:2222'],
'worker': ['host3:2222', 'host4:2222', 'host5:2222']}
os.environ['TF_CONFIG'] = json.dumps(
{'cluster': cluster,
'task': {'type': 'worker', 'index': 1}})
config = ClusterConfig()
assert config.master == 'host4:2222'
assert config.task_id == 1
assert config.num_ps_replicas == 2
assert config.num_worker_replicas == 4
assert config.cluster_spec == server_lib.ClusterSpec(cluster)
assert config.task_type == 'worker'
assert not config.is_chief
```
Example of chief node:
```
cluster = {'chief': ['host0:2222'],
'ps': ['host1:2222', 'host2:2222'],
'worker': ['host3:2222', 'host4:2222', 'host5:2222']}
os.environ['TF_CONFIG'] = json.dumps(
{'cluster': cluster,
'task': {'type': 'chief', 'index': 0}})
config = ClusterConfig()
assert config.master == 'host0:2222'
assert config.task_id == 0
assert config.num_ps_replicas == 2
assert config.num_worker_replicas == 4
assert config.cluster_spec == server_lib.ClusterSpec(cluster)
assert config.task_type == 'chief'
assert config.is_chief
```
Example of evaluator node (evaluator is not part of training cluster):
```
cluster = {'chief': ['host0:2222'],
'ps': ['host1:2222', 'host2:2222'],
'worker': ['host3:2222', 'host4:2222', 'host5:2222']}
os.environ['TF_CONFIG'] = json.dumps(
{'cluster': cluster,
'task': {'type': 'evaluator', 'index': 0}})
config = ClusterConfig()
assert config.master == ''
assert config.evaluator_master == ''
assert config.task_id == 0
assert config.num_ps_replicas == 0
assert config.num_worker_replicas == 0
assert config.cluster_spec == {}
assert config.task_type == 'evaluator'
assert not config.is_chief
```
N.B.: If `save_checkpoints_steps` or `save_checkpoints_secs` is set,
`keep_checkpoint_max` might need to be adjusted accordingly, especially in
  distributed training. For example, setting `save_checkpoints_secs` to 60
  without adjusting `keep_checkpoint_max` (defaults to 5) leads to a situation
  where checkpoints are garbage collected after 5 minutes. In distributed
  training, the evaluation job starts asynchronously and might fail to load or
  find the checkpoint due to a race condition.
Args:
model_dir: directory where model parameters, graph, etc are saved. If
`None`, will use a default value set by the Estimator.
tf_random_seed: Random seed for TensorFlow initializers.
Setting this value allows consistency between reruns.
save_summary_steps: Save summaries every this many steps.
save_checkpoints_steps: Save checkpoints every this many steps. Can not be
specified with `save_checkpoints_secs`.
save_checkpoints_secs: Save checkpoints every this many seconds. Can not
be specified with `save_checkpoints_steps`. Defaults to 600 seconds if
both `save_checkpoints_steps` and `save_checkpoints_secs` are not set
in constructor. If both `save_checkpoints_steps` and
`save_checkpoints_secs` are None, then checkpoints are disabled.
session_config: a ConfigProto used to set session parameters, or None.
keep_checkpoint_max: The maximum number of recent checkpoint files to
keep. As new files are created, older files are deleted. If None or 0,
all checkpoint files are kept. Defaults to 5 (that is, the 5 most recent
checkpoint files are kept.)
keep_checkpoint_every_n_hours: Number of hours between each checkpoint
to be saved. The default value of 10,000 hours effectively disables
the feature.
log_step_count_steps: The frequency, in number of global steps, that the
global step/sec will be logged during training.
Raises:
ValueError: If both `save_checkpoints_steps` and `save_checkpoints_secs`
are set.
"""
if (save_checkpoints_steps == _USE_DEFAULT and
save_checkpoints_secs == _USE_DEFAULT):
save_checkpoints_steps = None
save_checkpoints_secs = 600
elif save_checkpoints_secs == _USE_DEFAULT:
save_checkpoints_secs = None
elif save_checkpoints_steps == _USE_DEFAULT:
save_checkpoints_steps = None
elif (save_checkpoints_steps is not None and
save_checkpoints_secs is not None):
raise ValueError(_SAVE_CKPT_ERR)
RunConfig._replace(
self,
allowed_properties_list=_DEFAULT_REPLACEABLE_LIST,
model_dir=model_dir,
tf_random_seed=tf_random_seed,
save_summary_steps=save_summary_steps,
save_checkpoints_steps=save_checkpoints_steps,
save_checkpoints_secs=save_checkpoints_secs,
session_config=session_config,
keep_checkpoint_max=keep_checkpoint_max,
keep_checkpoint_every_n_hours=keep_checkpoint_every_n_hours,
log_step_count_steps=log_step_count_steps)
self._init_distributed_setting_from_environment_var()
def _init_distributed_setting_from_environment_var(self):
"""Initialize distributed properties based on environment variable."""
tf_config = json.loads(os.environ.get(_TF_CONFIG_ENV) or '{}')
if tf_config:
logging.info('TF_CONFIG environment variable: %s', tf_config)
self._service = _validate_service(tf_config.get(_SERVICE_KEY))
self._cluster_spec = server_lib.ClusterSpec(tf_config.get(_CLUSTER_KEY, {}))
task_env = tf_config.get(_TASK_ENV_KEY, {})
if self._cluster_spec and TaskType.MASTER in self._cluster_spec.jobs:
return self._init_distributed_setting_from_environment_var_with_master(
tf_config)
if self._cluster_spec:
# Distributed mode.
self._task_type, self._task_id = _validate_task_type_and_task_id(
self._cluster_spec, task_env, TaskType.CHIEF)
if self._task_type != TaskType.EVALUATOR:
self._master = _get_master(
self._cluster_spec, self._task_type, self._task_id)
self._num_ps_replicas = _count_ps(self._cluster_spec)
self._num_worker_replicas = _count_worker(
self._cluster_spec, chief_task_type=TaskType.CHIEF)
else:
# Evaluator is not part of the training cluster.
self._cluster_spec = server_lib.ClusterSpec({})
self._master = _LOCAL_MASTER
self._num_ps_replicas = 0
self._num_worker_replicas = 0
self._is_chief = self._task_type == TaskType.CHIEF
else:
# Local mode.
self._task_type = task_env.get(_TASK_TYPE_KEY, TaskType.WORKER)
self._task_id = int(task_env.get(_TASK_ID_KEY, 0))
if self._task_type != TaskType.WORKER:
raise ValueError(
'If "cluster" is not set in TF_CONFIG, task type must be WORKER.')
if self._task_id != 0:
raise ValueError(
'If "cluster" is not set in TF_CONFIG, task index must be 0.')
self._master = ''
self._is_chief = True
self._num_ps_replicas = 0
self._num_worker_replicas = 1
def _init_distributed_setting_from_environment_var_with_master(self,
tf_config):
"""Initialize distributed properties for legacy cluster with `master`."""
    # There is no technical reason why a user cannot have both chief and
    # master in the same cluster, but it is very confusing (which one is
    # really the chief?), so block this case.
if TaskType.CHIEF in self._cluster_spec.jobs:
raise ValueError('If `master` node exists in `cluster`, job '
'`chief` is not supported.')
task_env = tf_config.get(_TASK_ENV_KEY, {})
self._task_type, self._task_id = _validate_task_type_and_task_id(
self._cluster_spec, task_env, TaskType.MASTER)
if self._task_type == TaskType.EVALUATOR:
raise ValueError('If `master` node exists in `cluster`, task_type '
'`evaluator` is not supported.')
self._master = _get_master(
self._cluster_spec, self._task_type, self._task_id)
self._num_ps_replicas = _count_ps(self._cluster_spec)
self._num_worker_replicas = _count_worker(
self._cluster_spec, chief_task_type=TaskType.MASTER)
self._is_chief = self._task_type == TaskType.MASTER
@property
def cluster_spec(self):
return self._cluster_spec
@property
def evaluation_master(self):
return ''
@property
def is_chief(self):
return self._is_chief
@property
def master(self):
return self._master
@property
def num_ps_replicas(self):
return self._num_ps_replicas
@property
def num_worker_replicas(self):
return self._num_worker_replicas
@property
def task_id(self):
return self._task_id
@property
def task_type(self):
return self._task_type
@property
def tf_random_seed(self):
return self._tf_random_seed
@property
def save_summary_steps(self):
return self._save_summary_steps
@property
def save_checkpoints_secs(self):
return self._save_checkpoints_secs
@property
def session_config(self):
return self._session_config
@property
def save_checkpoints_steps(self):
return self._save_checkpoints_steps
@property
def keep_checkpoint_max(self):
return self._keep_checkpoint_max
@property
def keep_checkpoint_every_n_hours(self):
return self._keep_checkpoint_every_n_hours
@property
def log_step_count_steps(self):
return self._log_step_count_steps
@property
def model_dir(self):
return self._model_dir
@property
def service(self):
"""Returns the platform defined (in TF_CONFIG) service dict."""
return self._service
def replace(self, **kwargs):
"""Returns a new instance of `RunConfig` replacing specified properties.
Only the properties in the following list are allowed to be replaced:
- `model_dir`.
- `tf_random_seed`,
- `save_summary_steps`,
- `save_checkpoints_steps`,
- `save_checkpoints_secs`,
- `session_config`,
- `keep_checkpoint_max`,
- `keep_checkpoint_every_n_hours`,
- `log_step_count_steps`,
In addition, either `save_checkpoints_steps` or `save_checkpoints_secs`
can be set (should not be both).
Args:
**kwargs: keyword named properties with new values.
Raises:
ValueError: If any property name in `kwargs` does not exist or is not
allowed to be replaced, or both `save_checkpoints_steps` and
`save_checkpoints_secs` are set.
Returns:
a new instance of `RunConfig`.
"""
return RunConfig._replace(
copy.deepcopy(self),
allowed_properties_list=_DEFAULT_REPLACEABLE_LIST,
**kwargs)
@staticmethod
def _replace(config, allowed_properties_list=None, **kwargs):
"""See `replace`.
N.B.: This implementation assumes that for key named "foo", the underlying
property the RunConfig holds is "_foo" (with one leading underscore).
Args:
config: The RunConfig to replace the values of.
allowed_properties_list: The property name list allowed to be replaced.
**kwargs: keyword named properties with new values.
Raises:
ValueError: If any property name in `kwargs` does not exist or is not
allowed to be replaced, or both `save_checkpoints_steps` and
`save_checkpoints_secs` are set.
Returns:
a new instance of `RunConfig`.
"""
allowed_properties_list = allowed_properties_list or []
for key, new_value in six.iteritems(kwargs):
if key in allowed_properties_list:
setattr(config, '_' + key, new_value)
continue
raise ValueError(
'Replacing {} is not supported. Allowed properties are {}.'.format(
key, allowed_properties_list))
_validate_save_ckpt_with_replaced_keys(config, kwargs.keys())
_validate_properties(config)
return config
```
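The distributed-setup logic above hinges on how the `TF_CONFIG` environment variable is parsed into a cluster spec and a task description. The following framework-free sketch illustrates the chief-based branch of `_init_distributed_setting_from_environment_var`; the function name `parse_tf_config` is illustrative only, and the real `RunConfig` performs considerably more validation (task bounds, legacy `master` clusters, the `evaluator` special case):

```python
import json


def parse_tf_config(env):
    """Parse a TF_CONFIG-style JSON string into (cluster, task_type, task_id, is_chief).

    Illustrative sketch only: mirrors the chief-based branch of
    RunConfig._init_distributed_setting_from_environment_var.
    """
    tf_config = json.loads(env or '{}')
    cluster = tf_config.get('cluster', {})
    task = tf_config.get('task', {})
    if cluster:
        # Distributed mode: task type and index are required.
        task_type = task['type']
        task_id = int(task['index'])
        is_chief = task_type == 'chief'
    else:
        # Local mode: a single worker that is always chief.
        task_type, task_id, is_chief = 'worker', 0, True
    return cluster, task_type, task_id, is_chief


env = json.dumps({
    'cluster': {'chief': ['host0:2222'], 'worker': ['host1:2222']},
    'task': {'type': 'worker', 'index': 0},
})
cluster, task_type, task_id, is_chief = parse_tf_config(env)
```

With an empty environment the same function falls through to local mode, matching the behavior documented in the constructor's docstring.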
|
Said Hadjerrouit is a professor of informatics and computer science at the University of Agder in Kristiansand, Norway. He received a doctoral degree (Dr.Ing.) in 1992 in the field of medical expert systems and artificial intelligence, and a master's degree (1985) in software engineering from the Technische Universität Berlin, Germany. His teaching in Berlin focused mostly on informatics and society, philosophical and ethical issues of computing, and computers in developing countries. In 1991, he moved from Berlin to Kristiansand, Norway, and worked at the Institute of Electronic Data Processing at the University of Agder. In 1994, he moved to the Institute of Mathematical Sciences at the same university, where he was appointed associate professor, teaching object-oriented programming, Web engineering, software development, and databases. From 2004, his work shifted to the didactics of informatics and computer science education, ICT in mathematics education, ICT-enhanced learning, Web-based learning resources, social software, and Web 2.0 technology. In 2008, Hadjerrouit made a major shift in his research focus from the didactics of informatics and computer science to mathematics education and the use of digital tools in teaching and learning mathematics. He has taught the doctoral course "Theories in the Learning and Teaching of Mathematics" since 2014. He also supervises two PhD students in the fields of the flipped classroom and the documentational approach to mathematics education. Hadjerrouit has more than 140 publications in international journals and conference proceedings. He received Best Paper awards at the Society for Information Technology and Teacher Education conference (SITE 2010) in San Diego, California, United States, and at the IADIS e-Society conference 2012 in Berlin, Germany.
Hadjerrouit is leader of PhD committee for Specialisation in Mathematical Sciences (FDM).
Hadjerrouit is a member of the Agder Academy of Sciences and Letters, which has members from Agder, Norway, and from abroad. He is also a member of ISI and of the editorial review board of JELLO.
Selected publications
Hansen, N. K., & Hadjerrouit, S. (2023). Analyzing Students' Computational Thinking and Programming Skills for Mathematical Problem Solving. In: Open and Inclusive Educational Practice in the Digital World. Springer Nature, pp. 155-173.
Stigberg, H., Hadjerrouit, S., Kaufmann, O. T., & Marentakis, G. (2022). Analysing tensions faced by pre-service mathematics teachers engaging in digital fabrication. In: Proceedings of the 45th Conference of the International Group for the Psychology of Mathematics Education (PME 45). Universidad de Alicante, pp. 4-51 - 4-58.
Hansen, N. K., & Hadjerrouit, S. (2021). Exploring Students' Computational Thinking for Mathematical Problem-Solving: A Case Study. In: Proceedings of the 18th International Conference on Cognition and Exploratory Learning in Digital Age (CELDA 2021). IADIS Press, pp. 251-260.
Hadjerrouit, S., & Hansen, N. K. (2020). Students Engaging in Mathematical Problem-Solving through Computational Thinking and Programming Activities: A Synthesis of Two Opposite Experiences. In: Proceedings of the 17th International Conference on Cognition and Exploratory Learning in the Digital Age (CELDA 2020). IADIS Press, pp. 91-98.
Fleischmann, Y., Gueudet, G., Hadjerrouit, S., & Nicolas, P. (2020). Tertiary education in the digital age. International Network for Didactic Research in University Mathematics: INDRUM 2020, Volume 3, pp. 29-46.
Hadjerrouit, S. (2020). Exploring the Affordances of SimReal for Learning Mathematics in Teacher Education: A Socio-Cultural Perspective. In: Computer Supported Education. 11th International Conference, CSEDU 2019, Heraklion, Crete, Greece, May 2-4, 2019, Revised Selected Papers. Springer Nature, pp. 26-50.
Hadjerrouit, S. (2020). Using Affordances and Constraints to Evaluate the Use of a Formative e-Assessment System in Mathematics Education. In: CSEDU 2020 - Proceedings of the 12th International Conference on Computer Supported Education, Volume 1. SciTePress, pp. 366-373.
Fredriksen, H., & Hadjerrouit, S. (2020). Exploring engineering students' participation in flipped mathematics classroom: A discursive approach. Nordisk matematikkdidaktikk, Volume 25(1), pp. 45-64.
Fredriksen, H., & Hadjerrouit, S. (2019). An activity theory perspective on contradictions in flipped mathematics classrooms at the university level. International Journal of Mathematical Education in Science and Technology, pp. 1-11.
Hadjerrouit, S. (2017). Assessing the Affordances of SimReal+ and their Applicability to Support the Learning of Mathematics in Teacher Education. Issues in Informing Science and Information Technology (IISIT). Informing Science Institute.
Hadjerrouit, S. (2015). Evaluating the Interactive Learning Tool SimReal+ for Visualizing and Simulating Mathematical Concepts. In: Proceedings of the 12th International Conference on Cognition and Exploratory Learning in the Digital Age (CELDA 2015). IADIS Press, pp. 101-108.
Hadjerrouit, S. (2015). Assessing the Level of Collaborative Writing in a Wiki-Based Environment: A Case Study in Teacher Education. In: Competencies in Teaching, Learning and Educational Leadership in the Digital Age. London: Springer Verlag, pp. 197-216.
Hadjerrouit, S. (2014). Wiki as a Collaborative Writing Tool in Teacher Education: Evaluation and Suggestions for Effective Use. Computers in Human Behavior, Volume 32, pp. 301-312.
Hadjerrouit, S. (2012). Wiki-based collaborative learning in higher education: A pedagogical evaluation. International Journal of Innovation and Learning, Vol. 12, No. 1, pp. 6-26.
Hadjerrouit, S. (2010). Developing Web-Based Learning Resources in School Education: A User-Centered Approach. Interdisciplinary Journal of E-Learning and Learning Objects, Volume 6, pp. 115-135.
Hadjerrouit, S. (2009). Teaching and Learning School Informatics: A Concept-Based Pedagogical Approach. Informatics in Education, Volume 8, No. 2, pp. 227-250.
Hadjerrouit, S. (2005). Constructivism as Guiding Philosophy for Software Engineering Education. ACM SIGCSE Bulletin, Volume 37, Number 4, December 2005, pp. 45-49.
References
University of Agder
Norwegian computer scientists
Living people
Year of birth missing (living people)
|
```scss
.main-content-area {
--content-area-margin: 1.25rem;
--content-area-padding: 1.25rem;
--content-area-sidebar-width: #{$content-sidebar-width};
--content-area-sidebar-bg: #{$content-sidebar-bg};
--content-area-header-padding: 0.625rem 1.125rem;
--content-area-header-bg: #{$main-content-header-bg};
--content-area-header-border: #{$main-content-header-border};
--content-area-footer-padding: 0.625rem 1.125rem;
--content-area-footer-bg: #{$main-content-footer-bg};
--content-area-footer-border: #{$main-content-footer-border};
margin: var(--content-area-margin);
scroll-margin: var(--content-area-margin);
}
.main-content-container {
background: white;
display: flex;
}
.main-content,
.main-content-sidebar {
padding: var(--content-area-padding);
}
.main-content {
// Allow nesting of tab panes directly within
@extend .tab-content;
flex: 1;
min-width: 0;
}
.main-content-header,
.main-content-footer {
display: flex;
justify-content: space-between;
}
.main-content-header {
margin: calc(var(--content-area-padding) * -1) calc(var(--content-area-padding) * -1) var(--content-area-padding) !important;
padding: var(--content-area-header-padding);
background: var(--content-area-header-bg);
border-bottom: var(--content-area-header-border);
}
.main-content-footer {
margin: var(--content-area-padding) calc(var(--content-area-padding) * -1) calc(var(--content-area-padding) * -1) !important;
padding: var(--content-area-footer-padding);
background: var(--content-area-footer-bg);
border-top: var(--content-area-footer-border);
}
.main-content,
.main-content-sidebar,
.tab-pane {
> *:first-child {
margin-top: 0;
}
> *:last-child {
margin-bottom: 0;
}
}
.main-content-sidebar {
background: var(--content-area-sidebar-bg);
flex: 0 0 var(--content-area-sidebar-width);
max-width: var(--content-area-sidebar-width);
}
.utilities {
display: flex;
justify-content: flex-start;
margin-bottom: 0.75rem;
}
@include media-breakpoint-down(lg) {
.main-content-container {
flex-direction: column;
}
.main-content-sidebar {
flex-basis: 0;
max-width: none;
}
}
@include media-breakpoint-down(md) {
.main-content-area {
--content-area-margin: 0.625rem;
--content-area-padding: 0.875rem;
--content-area-header-padding: 0.625rem 0.875rem;
--content-area-footer-padding: 0.625rem 0.875rem;
}
.main-content-header,
.main-content-footer {
flex-direction: column;
align-items: center;
}
}
```
|
```c
#include "../libc/_raw_print.h"
int main(void) {
print_int(7);
print_int(42);
print_int(123);
print_int(0);
//print_int(-99);
return 0;
}
```
|
Tom Servo is a fictional character from the American science fiction comedy television show Mystery Science Theater 3000 (MST3K). Tom is one of two wise-cracking, robotic main characters of the show, built by Joel Robinson to act as a companion and help stave off madness as he was forced to watch low-quality films (Tom and the other bots, Crow, Gypsy, and Cambot, are made from parts that would have otherwise allowed Joel to actually control the film). At least during the Comedy Central era, he was somewhat more mature than his theatre companion, Crow T. Robot. Tom, more often than the others, signals the need to exit the theater to perform host segments.
Overview
Tom Servo is a red puppet that has a gumball machine (Carousel Executive Snack Dispenser) for a head, a body composed of a toy "Money Lover Barrel" coin bank and a toy car engine block, and a bowl-shaped hovercraft skirt (a Halloween "Boo Bowl") instead of legs. His arm assembly (according to the 1998 MST3K Official Bot Builders Booklet Workbench Edition) is from a Mr Moonie doll. Mr Moonie (sometimes known as "Seymour Bunz" or "C More Bunz") was a gag doll popular during the time of filming, usually found suction-cupped to people's car windows to moon other drivers. The doll would drop its pants and moon people when a rubber bladder was squeezed. The entire assembly, including the rubber bladder, was kept intact when installed into Tom's barrel body. The hands were apparently from a clown or hobo doll figure, but are now made from a mold. The springs were custom-made by a local manufacturer and attached with glue to the wrists and arms. His arms would spring up and down whenever the clear balloon inside was gripped by Kevin, as shown in pictures from The Colossal Episode Guide and The Scrapbook tape. The Patrick Swayze Christmas segment shows Kevin in the puppet trench working Tom Servo with a bladder in each hand. Although the arms appear never to be functional as arms, a point commented on occasionally throughout the series, they actually do move up and down thanks to the Mr Moonie doll bladder inside Tom, and can be seen doing so in a fair few episodes as early as Season 2. His shoulders are made from the front of an Eveready Floating Lantern. Because Servo's head is transparent, chromakeyed images appear projected through it, and thus a second puppet, entirely spray-painted black, was built for use in the theater segments. This black Servo also appeared in a host segment in episode #609, The Skydivers. Tom moves about by hovering.
In seasons one through ten he is unable to hover much higher than the ground; in season eleven (and briefly in the film version), however, he is capable of hovering at apparently any height and with great agility. In episode #413 Manhunt in Space, it is revealed that Servo suffers from red-green colorblindness.
Servo's appearance has changed over time. In the pilot for MST3K, the robot who would become Servo was named "Beeper," who just spoke in beeps that only Crow could understand (similar to R2-D2 and C-3PO from the Star Wars films). He was an all-silver robot vaguely shaped like the ultimate Servo, with funnel-shaped shoulders, silver rubber tube arms, a plastic flowerpot for a hoverskirt, and a small fishbowl for a head. He was renamed 'Servo' after a vending machine called the Servotron. The character was absent in K01: Invaders from the Deep, but re-appeared again for the remainder of the KTMA series as the now-familiar Servo puppet with the gumball machine head in K02: Revenge of the Mysterons from Mars.
In Season 1 on the Comedy Channel, he was given a red color, longer black tube arms, squared white shoulders, a different hoverskirt, and the Carousel Snack Dispenser gumball machine head with a white beak. Around episode 105: The Corpse Vanishes, Servo's head was replaced with a slightly modified version of his "Carousel" head. The "neck" was slightly wider and the beak (now silver at this point) appeared smaller. This version of Servo's head would be used for the remainder of Season 1. For Season 2, the black tubing used for his arms was replaced by a pair of small silver springs and the more familiar Carousel Dispenser head design (KTMA/pre episode 105) returned with a silver beak. This physical form was pretty much the same throughout the remainder of the series, save for a brief flirtation (during episodes #205: Rocket Attack USA and #206: Ring of Terror) with a slim cylindrical gumball-machine head to try to reduce the screen area Servo's head obscured. It was introduced as a "haircut" that Joel gave Servo, but was quickly abandoned. By mid season 3 an extra cap from another Carousel Dispenser was added just below the "bubble" making Tom's head appear slightly taller and slimmer. Briefly in early season 4, Servo's white hands were changed to beige before returning to white after only a few episodes. During a host segment in the Sci-Fi era, he briefly acquired the body of a "beautiful butterfly."
Servo's voice and personality also changed during the show's early years. While Josh Weinstein operated Servo during the KTMA season, Servo originally spoke with a sleazy, Bullwinkle-type voice in episode K02 (the earliest seen of the regular Servo puppet), then a rather slow squeaky voice from K03–K05, and was somewhat immobile during host segments but oddly very active in the theater. In episode K06, Weinstein switched to a lower voice that Servo repeatedly proclaimed as his new "MIGHTY VOICE!" When Weinstein left at the end of Season 1, Kevin Murphy took over Servo's operation and tried to match Weinstein's Servo voice and personality, but gradually developed a somewhat new Servo sound and character (though Murphy has a fairly deep voice himself). This was explained as tinkering by Joel. During Murphy's tenure, Servo took many opportunities to showcase his excellent singing. He also has an extensive underwear collection (as seen in Mystery Science Theater 3000: The Movie), as well as a large number of duplicates of himself that he made in episode #420: The Human Duplicators (also seen in episode #612: The Starfighters, episode #910: The Touch of Satan, episode #913: Quest of the Delta Knights, episode #1003: Merlin's Shop of Mystical Wonders, episode #1004: Future War, and episode #1013: Diabolik). Female Servo duplicates were featured in The Touch of Satan and Quest of the Delta Knights.
Whenever a member of the cast is required to dress in drag for a sketch, Servo usually does the honors. This is both because of the dichotomy of women's clothes amusingly contrasted with puppeteer Murphy's strong baritone voice and because, in Murphy's words, "Servo looks better in a dress than Crow." Also, Servo is the only robot (other than Cambot in seasons 5–10) whose entire body can be seen on the show, since Crow's legs are behind the desk and Gypsy's body is several yards long.
Servo normally has a condescending personality and at times can make literary and technical references that are above his companion's heads. He frequently attempts to seem physically imposing to others, once acquiring "lifts" for his hover skirt to increase his size (accused by Mike of suffering "short man's disease") and on another occasion showing off a small arsenal he had acquired while drifting through space. Almost invariably, however, any attempts at confronting danger or displaying his intellectual skill cause him such frustration that he ends up crying, often needing consolation from Joel or Mike. He also has demonstrated a tendency to take jokes and skits way over the top, having to be brought back down to earth by the others.
Furthermore, he is easily rattled by sarcastic remarks from Crow, such as when Crow made fun of Servo's infatuation with a boy's pet turtle, Tibby, in the episode Gamera.
Despite his sensitivity and frustration, he has a keen intellect, and in the episode The Gunslinger he reveals that he is able to teleport at will, though he demonstrates this only on rare other occasions.
In Season 11, Tom appears to have received many upgrades, including the ability to fly or hover, as seen many times. He also seems to have working hands and arms, as he is shown wielding a baseball bat. The original Servo is (presumably) destroyed in episode 1101 Reptilicus, but a clone identical to the original Servo in every way remains behind, taking his place. In Season 13, where the Satellite of Love set is green screen, Servo's head is permanently clouded to avoid chroma-key problems.
Other appearances
Tom Servo also appeared in the Cops-style Star Wars spoof Troops as a droid purloined by Jawas.
Servo, along with Crow, has a cameo appearance (appropriately in silhouette) in the Futurama episode "Raging Bender". Crow shushes Fry as he begins to riff a newsreel they were watching.
Servo makes an appearance in silhouette in the Homestar Runner cartoon "A Jorb Well Done", during a short scene in a theater.
The prototype web browser Servo is named after Tom Servo.
In Gold Digger/Ninja High School Issue #1 "A Science Affair", Tom Servo can be seen in the background, marked as "Servo-tron".
In the Archie Comics series Sonic the Hedgehog, issue #52, Sonic is sent into a 1920s variation of Mobius. In searching for the handheld computer Nicole, Sonic does battle with a number of robots, three of them resembling Crow T. Robot, Tom Servo and Cambot.
References
The Mystery Science Theater 3000 Amazing Colossal Episode Guide (1996).
The Official Mystery Science Theater 3000 Bot Building Booklet (1998), Best Brains, Inc.
Satellite News: The Official Mystery Science Theater 3000 Web Site
The Mystery Science Theater 3000 Scrapbook VHS, Best Brains, Inc.
Mystery Science Theater 3000: The Movie DVD (1996).
Mystery Science Theater 3000, episode #K02 (Revenge of the Mysterons)
Mystery Science Theater 3000, episode #K06 (Gamera vs. Gaos)
Mystery Science Theater 3000, episode #201 (Rocketship X-M)
Mystery Science Theater 3000, episode #205 (Rocket Attack U.S.A.)
Mystery Science Theater 3000, episode #206 (Ring of Terror)
Mystery Science Theater 3000, episode #1004 (Future War)
Mystery Science Theater 3000, episode #1013 (Diabolik)
External links
A page with instructions for building a Tom Servo
Parts list for the above link
Details of Tom Servo's construction through the entire run of the series
Television characters introduced in 1988
Servo, Tom
|
```csharp
using System.IO;
namespace Amazon.Lambda.Annotations.SourceGenerator.FileIO
{
public interface IFileManager
{
string ReadAllText(string path);
void WriteAllText(string path, string content);
bool Exists(string path);
FileStream Create(string path);
}
}
```
|
```python
import demistomock as demisto # noqa: F401
from CommonServerPython import * # noqa: F401
import copy
import urllib3
QUERIES = {
"tag": "get_taginfo",
"signature": "get_siginfo",
"file_type": "get_file_type",
"clamav": "get_clamavinfo",
"imphash": "get_imphash",
"yara_rule": "get_yarainfo",
"issuer_cn": "get_issuerinfo"
}
EXCEPTIONS_MESSAGES = {
'illegal_sha256_hash': 'Illegal SHA256 hash provided.',
'file_not_found': 'The file was not found or is unknown to MalwareBazaar.',
'hash_not_found': 'The file (hash) you wanted to query is unknown to MalwareBazaar.',
'illegal_hash': 'The hash you provided is not a valid SHA256 hash.',
'user_blacklisted': 'Your API key is blacklisted.',
'no_results': 'Your query yielded no results.',
'not_found': 'The value you wanted to query is unknown to MalwareBazaar.',
'illegal': 'The text you provided is not valid.'
}
VENDOR_NAME = 'MalwareBazaar'
LIST_HEADERS = ['md5_hash', 'sha256_hash', 'sha1_hash', 'file_name', 'file_type', 'file_size', 'tags', 'first_seen',
'last_seen']
FILE_HEADERS = ['md5_hash', 'sha256_hash', 'sha1_hash', 'file_name', 'file_type', 'file_size', 'tags', 'first_seen',
'last_seen', 'signature', 'ssdeep', 'reporter', 'imphash', 'yara_rules_names']
class Client(BaseClient):
def __init__(self, server_url, verify, proxy, headers, api_key):
self.api_key = api_key
super().__init__(base_url=server_url, verify=verify, proxy=proxy, headers=headers)
def file_request(self, hash):
response = self._http_request('POST',
files={
'query': (None, "get_info"),
'hash': (None, hash)
})
return response
def malwarebazaar_download_sample_request(self, sha256_hash):
response = self._http_request('POST',
files={
'query': (None, "get_file"),
'sha256_hash': (None, sha256_hash)
},
resp_type="response")
return response
def malwarebazaar_comment_add_request(self, sha256_hash, comment):
if self.api_key is None:
raise Exception('API Key is required for this command')
response = self._http_request('POST',
headers={"API-KEY": self.api_key},
files={
'query': (None, "add_comment"),
'sha256_hash': (None, sha256_hash),
'comment': (None, comment)
})
return response
def malwarebazaar_samples_list_request(self, sample_input, value, limit, query):
files = {
'query': (None, query),
sample_input: (None, value),
}
if not sample_input == 'issuer_cn':
files.update({'limit': (None, limit)})
response = self._http_request('POST',
files=files)
return response
def file_process(hash, reliability, raw_response, response_data) -> CommandResults:
"""
creates CommandResults for every file in the list inserted to file_command
Args:
hash:
raw_response:
response_data:
Returns:
CommandResults for the relevant file
"""
dbot_score = Common.DBotScore(
indicator=hash,
indicator_type=DBotScoreType.FILE,
integration_name=VENDOR_NAME,
score=Common.DBotScore.BAD,
reliability=reliability,
malicious_description=response_data.get('comment')
)
signature = response_data.get('signature')
relationship = EntityRelationship(name='indicator-of',
entity_a=hash,
entity_a_type='File',
entity_b=signature,
entity_b_type=FeedIndicatorType.indicator_type_by_server_version(
"STIX Malware"),
source_reliability=reliability,
brand=VENDOR_NAME)
table_name = f'{VENDOR_NAME} File reputation for: {hash}'
    human_readable_data = copy.deepcopy(response_data)
    human_readable_data.update({'yara_rules_names': []})
    rules = human_readable_data.get('yara_rules') or []
    for rule in rules:
        human_readable_data['yara_rules_names'].append(rule.get('rule_name'))
    md = tableToMarkdown(table_name, t=human_readable_data, headerTransform=string_to_table_header, removeNull=True,
                         headers=FILE_HEADERS)
file_object = Common.File(md5=response_data.get('md5_hash'), sha256=response_data.get('sha256_hash'),
sha1=response_data.get('sha1_hash'), size=response_data.get('file_size'),
file_type=response_data.get('file_type'), dbot_score=dbot_score,
relationships=[relationship])
return CommandResults(
outputs_prefix='MalwareBazaar.File',
outputs_key_field='md5_hash',
outputs=response_data,
raw_response=raw_response,
indicator=file_object,
relationships=[relationship],
readable_output=md
)
def check_query_status(response, is_list_command=False, sample_type=None):
"""
checks whether the request to the API returned with the proper result
Args:
sample_type: string, type of sample (tag, signature, etc.)
is_list_command: bool
response: response from API
"""
not_found_error = '_not_found'
illegal_error = 'illegal_'
query_status = response.get("query_status")
    if query_status not in ("ok", "success"):
if is_list_command:
if query_status == sample_type + not_found_error:
raise Exception(EXCEPTIONS_MESSAGES.get('not_found'))
            if query_status == illegal_error + sample_type:
raise Exception(EXCEPTIONS_MESSAGES.get('illegal'))
if query_status in EXCEPTIONS_MESSAGES:
raise Exception(EXCEPTIONS_MESSAGES.get(query_status))
else:
raise Exception(query_status)
def file_command(client: Client, args: Dict[str, Any]) -> List[CommandResults]:
"""
Args:
client:
args: file - list of files hash
Returns:
file reputation for the given hashes
"""
reliability = demisto.params().get('integrationReliability', DBotScoreReliability.A)
if DBotScoreReliability.is_valid_type(reliability):
reliability = DBotScoreReliability.get_dbot_score_reliability_from_str(reliability)
else:
raise Exception("Please provide a valid value for the Source Reliability parameter.")
file_list = argToList(args.get('file'))
command_results: List[CommandResults] = []
for hash in file_list:
raw_response = client.file_request(hash)
if raw_response.get('query_status') == 'hash_not_found':
command_results.append(create_indicator_result_with_dbotscore_unknown(hash, DBotScoreType.FILE, reliability))
else:
check_query_status(raw_response)
response_data = raw_response.get('data')[0]
if file_name := response_data.get('file_name'):
response_data['file_name'] = '' if file_name == 'file' else file_name
command_results.append(file_process(hash, reliability, raw_response, response_data))
return command_results
def malwarebazaar_download_sample_command(client: Client, args: Dict[str, Any]) -> Dict[str, Any]:
"""
Args:
client:
args: sha256_hash of file
Returns:
zip file contains the malware sample from MalwareBazaar
"""
sha256_hash = args.get("sha256_hash")
response = client.malwarebazaar_download_sample_request(sha256_hash)
filename = f'{sha256_hash}.zip'
return fileResult(filename, response.content)
def malwarebazaar_comment_add_command(client: Client, args: Dict[str, Any]) -> CommandResults:
"""
Args:
client:
args: sha256_hash of file, comment to add in context of this file
Returns:
query status of the request to MalwareBazaar (success or error)
"""
sha256_hash = args.get("sha256_hash")
comment = args.get("comment")
response = client.malwarebazaar_comment_add_request(sha256_hash, comment)
check_query_status(response)
readable_output = f'Comment added to {sha256_hash} malware sample successfully'
outputs = {
'sha256_hash': sha256_hash,
'comment': comment,
}
return CommandResults(
outputs_prefix='MalwareBazaar.MalwarebazaarCommentAdd',
outputs_key_field='sha256_hash',
outputs=outputs,
readable_output=readable_output,
raw_response=response,
)
def malwarebazaar_samples_list_command(client: Client, args: Dict[str, Any]) -> CommandResults:
"""
Args:
client:
args: sample_type - {clamav, file_type, imphash, signature, tag, yara_rule}
sample_value
limit (optional) - number of results (default 50)
Returns:
query results from API
"""
sample_input = args.get("sample_type") or ''
value = args.get("sample_value")
limit = arg_to_number(args.get("limit")) if "limit" in args else None
page = arg_to_number(args.get("page")) if "page" in args else None
page_size = arg_to_number(args.get("page_size")) if "page_size" in args else None
    # if limit was provided, request `limit` results from the API; otherwise use pagination
    # (if neither is provided, 50 results are requested by default)
if limit is None:
if page is not None and page_size is not None:
if page <= 0:
raise Exception('Chosen page number must be greater than 0')
limit = page_size * page
else:
limit = 50
    # 1000 is the maximum value we can get from the API
limit = min(limit, 1000)
query = QUERIES.get(sample_input)
response = client.malwarebazaar_samples_list_request(sample_input, value, limit, query)
check_query_status(response, True, args.get('sample_type'))
response_data = response.get('data')
# take required results from response if pagination by page and page_size
if page is not None and page_size is not None:
response_data = response_data[-1 * page_size:]
readable_output = tableToMarkdown('Sample List', t=response_data, headerTransform=string_to_table_header,
removeNull=True, headers=LIST_HEADERS)
return CommandResults(
outputs_prefix='MalwareBazaar.MalwarebazaarSamplesList',
outputs_key_field='sha256_hash',
readable_output=readable_output,
outputs=response_data,
raw_response=response
)
def test_module(client: Client) -> None:
if client.api_key:
        response = client.malwarebazaar_comment_add_request(
            'your_sha256_hash',  # placeholder value; replace with a real SHA256 sample hash
            "test comment")
else:
response = client.malwarebazaar_samples_list_request('tag', 'TrickBot', '2', QUERIES.get('tag'))
check_query_status(response)
return_results('ok')
def main() -> None:
params: Dict[str, Any] = demisto.params()
args: Dict[str, Any] = demisto.args()
url = params.get('url')
api_key = params.get('credentials', {}).get('password') or None
verify_certificate: bool = not params.get('insecure', False)
proxy = params.get('proxy', False)
command = demisto.command()
demisto.debug(f'Command being called is {command}')
try:
urllib3.disable_warnings()
client: Client = Client(urljoin(url, '/api/v1/'), verify_certificate, proxy, headers={}, api_key=api_key)
commands = {
'file': file_command,
'malwarebazaar-download-sample': malwarebazaar_download_sample_command,
'malwarebazaar-comment-add': malwarebazaar_comment_add_command,
'malwarebazaar-samples-list': malwarebazaar_samples_list_command,
}
if command == 'test-module':
test_module(client)
elif command in commands:
return_results(commands[command](client, args))
else:
raise NotImplementedError(f'{command} command is not implemented.')
except Exception as e:
return_error(str(e))
if __name__ in ['__main__', 'builtin', 'builtins']:
main()
```
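The pagination handling in `malwarebazaar_samples_list_command` (compute a request limit from `page` and `page_size`, cap it at the API maximum of 1000, then keep only the last `page_size` rows) can be sketched in isolation. This is a minimal sketch, not part of the integration: `paginate` and `fake_api` are hypothetical names, and `fetch` stands in for the real API call.

```python
# Standalone sketch of the pagination logic from malwarebazaar_samples_list_command.
# `fetch` stands in for the real API call and simply returns `limit` dummy rows.

def paginate(fetch, limit=None, page=None, page_size=None):
    if limit is None:
        if page is not None and page_size is not None:
            if page <= 0:
                raise ValueError('Chosen page number must be greater than 0')
            # request enough rows to cover every page up to `page`
            limit = page * page_size
        else:
            limit = 50  # default used by the integration
    limit = min(limit, 1000)  # 1000 is the API maximum
    data = fetch(limit)
    # keep only the requested page: the last `page_size` rows
    if page is not None and page_size is not None:
        data = data[-page_size:]
    return data


fake_api = lambda n: list(range(n))  # stub: n dummy rows
print(paginate(fake_api, page=3, page_size=10))  # rows 20..29
```

Note that requesting `page * page_size` rows and slicing the tail means deep pages are capped by the 1000-row API maximum, matching the integration's behavior.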
|