Leila Kayondo is a Ugandan musician.
Early life and education
Kayondo sang in school choirs and took part in school festivals while at Seeta Boarding Primary School and then at Naalya Secondary School Namugongo for her O-Level. She continued on the same path when she joined Greenville International School for A-Level. She graduated from Uganda Christian University in Mukono, where she earned a bachelor's degree in social work and social administration.
Music
Kayondo started her music career in Dream Gals, an all-girl music group formed through a competition in which she took part. The group had the hit songs "Weekend" and "Wandekangawo".
In 2009, Kayondo left the group to embark on a solo career. She has had hit songs such as "Awo" and "Relaxing". In 2017 she signed in Uganda to Striker Entertainment, a Nigeria-based record label, releasing two hit singles, "Respeck" and "Musaayi".
References
1988 births
Uganda Christian University alumni
21st-century Ugandan women singers
Living people
Kumusha
|
Oleksandr Kovalenko may refer to:
Oleksandr Kovalenko (footballer, born 1974), Ukrainian football goalkeeper
Oleksandr Kovalenko (footballer) (1976–2010), Ukrainian footballer
Oleksandr Mykhailovych Kovalenko (1875–1963), Ukrainian political activist, scientist, military officer and writer
Oleksandr Mykolayovych Kovalenko (1935–2021), Ukrainian economist and politician
See also
Kovalenko
|
```go
package consul

import (
	"context"

	consul "github.com/hashicorp/consul/api"
	"github.com/prometheus/client_golang/prometheus"
	"github.com/prometheus/client_golang/prometheus/promauto"
	"github.com/weaveworks/common/instrument"
)

type consulInstrumentation struct {
	kv            kv
	consulMetrics *consulMetrics
}

type consulMetrics struct {
	consulRequestDuration *instrument.HistogramCollector
}

func newConsulMetrics(registerer prometheus.Registerer) *consulMetrics {
	consulRequestDurationCollector := instrument.NewHistogramCollector(promauto.With(registerer).NewHistogramVec(prometheus.HistogramOpts{
		Name:    "consul_request_duration_seconds",
		Help:    "Time spent on consul requests.",
		Buckets: prometheus.DefBuckets,
	}, []string{"operation", "status_code"}))
	consulMetrics := consulMetrics{consulRequestDurationCollector}
	return &consulMetrics
}

func (c consulInstrumentation) CAS(p *consul.KVPair, options *consul.WriteOptions) (bool, *consul.WriteMeta, error) {
	var ok bool
	var result *consul.WriteMeta
	err := instrument.CollectedRequest(options.Context(), "CAS", c.consulMetrics.consulRequestDuration, instrument.ErrorCode, func(ctx context.Context) error {
		options = options.WithContext(ctx)
		var err error
		ok, result, err = c.kv.CAS(p, options)
		return err
	})
	return ok, result, err
}

func (c consulInstrumentation) Get(key string, options *consul.QueryOptions) (*consul.KVPair, *consul.QueryMeta, error) {
	var kvp *consul.KVPair
	var meta *consul.QueryMeta
	err := instrument.CollectedRequest(options.Context(), "Get", c.consulMetrics.consulRequestDuration, instrument.ErrorCode, func(ctx context.Context) error {
		options = options.WithContext(ctx)
		var err error
		kvp, meta, err = c.kv.Get(key, options)
		return err
	})
	return kvp, meta, err
}

func (c consulInstrumentation) List(path string, options *consul.QueryOptions) (consul.KVPairs, *consul.QueryMeta, error) {
	var kvps consul.KVPairs
	var meta *consul.QueryMeta
	err := instrument.CollectedRequest(options.Context(), "List", c.consulMetrics.consulRequestDuration, instrument.ErrorCode, func(ctx context.Context) error {
		options = options.WithContext(ctx)
		var err error
		kvps, meta, err = c.kv.List(path, options)
		return err
	})
	return kvps, meta, err
}

func (c consulInstrumentation) Delete(key string, options *consul.WriteOptions) (*consul.WriteMeta, error) {
	var meta *consul.WriteMeta
	err := instrument.CollectedRequest(options.Context(), "Delete", c.consulMetrics.consulRequestDuration, instrument.ErrorCode, func(ctx context.Context) error {
		options = options.WithContext(ctx)
		var err error
		meta, err = c.kv.Delete(key, options)
		return err
	})
	return meta, err
}

func (c consulInstrumentation) Put(p *consul.KVPair, options *consul.WriteOptions) (*consul.WriteMeta, error) {
	var result *consul.WriteMeta
	err := instrument.CollectedRequest(options.Context(), "Put", c.consulMetrics.consulRequestDuration, instrument.ErrorCode, func(ctx context.Context) error {
		options = options.WithContext(ctx)
		var err error
		result, err = c.kv.Put(p, options)
		return err
	})
	return result, err
}
```
|
Lineidae is a family of nemertean worms. It contains the following genera:
Aetheolineus Senz, 1993
Ammolineus Senz, 2001
Antarctolineus Muller & Scripcariu, 1964
Apatronemertes Wilfert & Gibson, 1974
Australineus Gibson, 1990
Cephalurichus Gibson, 1985
Colemaniella Gibson, 1982
Corsoua Corrêa, 1963
Craticulineus Gibson, 1984
Diplopleura Stimpson, 1857
Eousia Gibson, 1990
Euborlasia Vaillant, 1890
Flaminga Corrêa, 1957
Fragilonemertes Riser, 1998
Gastropion Moretto, 1998
Heteroenopleus Wern, 1998
Heterolineus Friedrich, 1935
Heteronemertes Chernyshev, 1995
Hinumanemertes Iwata, 1970
Kirsteueria Gibson, 1978
Kohnia Sundberg & Gibson, 1995
Leucocephalonemertes Cantell, 1996
Lineopsella Friedrich, 1970
Lineopselloides Gibson, 1990
Lineopsis
Lineus Sowerby, 1806
Micconemertes Gibson, 1997
Micrella
Micrellides Gibson, 1985
Micrura Ehrenberg, 1871
Micrurimorpha Korotkevich, 1980
Micrurinella Friedrich, 1960
Myorhynchonemertes Senz, 1997
Nemertoscolex Greeff, 1879
Neolineus
Nipponomicrura Chernyshev, 1995
Notospermus Huschke, 1829
Paralineopsis Iwata, 1993
Paralineus Schütz, 1911
Paramicrura Gibson & Sundberg, 1992
Paramicrurinella Gibson, 1985
Parvicirrus Ris, 1993
Pontolineus
Pseudomicrura Strand & Sundberg, 2011
Pussylineus Corrêa, 1956
Quasiutolineides Senz, 2001
Ramphogordius Rathke, 1843
Rhamphogordius
Riseriellus
Tarrhomyos Ris, 1993
Tenuilineus Ris, 1993
Uricholemma Sundberg & Gibson, 1995
Utolineides Senz, 1997
Utolineus Gibson, 1990
Yininemertes Sun & Lu, 2008
Zygeupolia Thompson, 1900
References
Heteronemertea
Nemertea families
|
```c
#include "ruby.h"
#include "rubyspec.h"

#ifdef __cplusplus
extern "C" {
#endif

static VALUE binding_spec_get_binding(VALUE self) {
  return rb_funcall(self, rb_intern("binding"), 0);
}

void Init_binding_spec(void) {
  VALUE cls = rb_define_class("CApiBindingSpecs", rb_cObject);
  rb_define_method(cls, "get_binding", binding_spec_get_binding, 0);
}

#ifdef __cplusplus
}
#endif
```
|
Covered bonds are debt securities issued by a bank or mortgage institution and collateralised against a pool of assets that, in case of failure of the issuer, can cover claims at any point of time. They are subject to specific legislation to protect bond holders. Unlike asset-backed securities created in securitization, the covered bonds continue as obligations of the issuer; in essence, the investor has recourse against the issuer and the collateral, sometimes known as "dual recourse". Typically, covered bond assets remain on the issuer's consolidated balance sheet (usually with an appropriate capital charge).
As of the beginning of 2019, the volume of outstanding covered bonds worldwide was €2,577 billion; the largest markets were Denmark (€406 billion), Germany (€370 billion), France (€321 billion) and Spain (€232 billion).
History
Covered bonds were created in Prussia in 1769 by Frederick the Great and in Denmark in 1795. Danish covered bond lending emerged after the Great Fire of Copenhagen in 1795, when a quarter of the city burnt to the ground. After the fire, a great need arose for an organized credit market, as a large number of new buildings were needed in a short period of time. Today nearly all real estate in Denmark is financed with covered bonds, and Denmark is the third-largest issuer in Europe.
In Prussia these Pfandbriefe were sold by estates of the country and regulated under public law. They were secured by real estate and, subsidiarily, by the issuing estate. In about 1850, the first mortgage banks were allowed to sell Pfandbriefe as a means to refinance mortgage loans. With the mortgage banks law of 1900, the whole German Empire was given a standardized legal foundation for the issuance of Pfandbriefe.
Structure
A covered bond is a corporate bond with one important enhancement: recourse to a pool of assets that secures or "covers" the bond if the issuer (usually a financial institution) becomes insolvent. These assets act as additional credit cover; they do not have any bearing on the contractual cash flow to the investor, as is the case with securitized assets.
For the investor, one major advantage to a covered bond is that the debt and the underlying asset pool remain on the issuer's financials, and issuers must ensure that the pool consistently backs the covered bond. In the event of default, the investor has recourse to both the pool and the issuer.
Because non-performing loans and prematurely repaid debt must be replaced in the pool, the success of the product for the issuer depends on the institution's ability to maintain the credit quality of the cover pool.
Redemption regimes
There are three major redemption regimes for covered bonds:
Hard-bullet covered bonds: payments have to be made when due according to the original schedule. Failure to pay on the Standard Maturity Date (SMD) triggers default of the covered bonds, and the covered bonds accelerate. Until a few years ago, hard-bullet structures were regarded as market practice. If the covered bond issuer cannot meet its outstanding payment obligations, investors obtain access to the programme's cover pool. If redemption of an issue is pending, the available liquid funds are insufficient to redeem the bond, and liquidity cannot be generated by other means, the collateral in the pool is sold under a hard-bullet structure. On the one hand, investors can therefore expect prompt repayment; on the other hand, this is associated with refinancing risk or market value risk: the risk that the market values of the assets may be reduced and, in extreme circumstances, that the full repayment amount is not covered by the sales proceeds.
Soft-bullet covered bonds: payments have to be made when due according to the original schedule. Failure to pay on the SMD does not trigger covered bond default. The extension period grants more time (typically at least 12 months) to repay the covered bonds, setting a new Final Maturity Date (FMD). Failure to pay on the FMD triggers default and acceleration of the covered bond. Soft bullet structures and, more rarely, CPT structures as well (cf. the next bullet point), exist to counter the refinancing risk mentioned above. With regard to possible extension periods, a postponement of maturity by twelve months has become established under soft bullet structures.
Conditional pass-through covered bonds (CPT): payments have to be made when due according to the original schedule. Failure to pay by the SMD does not trigger default of that covered bond. The affected covered bond goes into pass-through mode. All other outstanding covered bonds are not affected and would only trigger the pass-through mode one after another if they are not redeemed on their respective SMDs. The original repayment date can be postponed for far longer in the case of a CPT structure. This also reduces the refinancing risk to a minimum at the same time. In contrast to the soft bullet structure, once the pass-through structure is triggered, the outstanding covered bond issues are redeemed firstly from the inflows generated from the assets associated with them and also from the sale of assets, if they can be sold at adequate market prices. However, in contrast to the soft bullet structure, the date at which investors can expect the outstanding claims to be serviced cannot be determined ex ante. Rather, in the worst-case scenario, they can only be determined upon maturity of the assets with the longest term.
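The three regimes above differ mainly in what happens when the issuer misses a payment on the SMD. As a rough mental model only (the function and field names below are invented for illustration, not market terminology from any legal framework):

```javascript
// Hypothetical sketch of the three redemption regimes: what happens when a
// payment is missed on the Standard Maturity Date (SMD)?
function onMissedPayment(regime) {
  switch (regime) {
    case 'hard-bullet':
      // immediate default: the bonds accelerate and cover-pool assets are sold
      return { default: true, action: 'accelerate; liquidate cover pool' };
    case 'soft-bullet':
      // no default yet: maturity extends (typically 12 months) to the Final
      // Maturity Date; only a missed payment on the FMD triggers default
      return { default: false, action: 'extend to Final Maturity Date' };
    case 'cpt':
      // no default: only the affected bond switches to pass-through mode,
      // repaid from cover-pool cash flows as the underlying assets mature
      return { default: false, action: 'affected bond enters pass-through mode' };
    default:
      throw new Error('unknown regime: ' + regime);
  }
}
```

The sketch captures the key asymmetry: only the hard-bullet regime turns a missed SMD payment directly into a default event.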
No uniform trigger events have so far become established on the market to trigger an extension period (beyond the repayment date originally agreed under a soft bullet or CPT structures). Examples of possible triggers within the soft bullet and CPT structures include (i) the issuer's insolvency and postponement of redemption to a later repayment date by an independent trustee or (ii) the postponement of the original repayment date by the issuer.
If investors' claims can be serviced when they originally fall due, there are no differences between the three payment structures as far as investors are concerned. However, rating agencies view soft-bullet structures, and even more so CPT structures, as positive factors in their rating assessments because the refinancing risk is lower.
Covered bond markets where hard-bullet structures prevail are Germany, France, Spain and Sweden. Typical soft-bullet markets are the UK, Switzerland, Norway, Italy, the Netherlands, Canada and Australia. CPT structures have been seen in the Netherlands, Italy and Poland.
Rating agencies' approach to rate covered bonds
A comprehensive high-level overview of the way rating agencies evaluate the credit risk of a covered bond programme can be found in Chapter IV of the European Covered Bond Council's (ECBC) Covered Bond Fact Book.
The Fact Book is updated annually and also contains more in-depth summaries provided directly by the rating agencies.
Rating agencies usually apply a two-step analysis when rating covered bonds:
At the first stage, a quantitative model is applied that produces a maximum potential rating based on (1) the probability that the issuer will cease making payments under the covered bonds (this maximum potential rating is called a CB anchor); and (2) the estimated losses that will accrue to the cover pool should the issuer cease making payments under the covered bonds (this event is called a CB anchor event).
Then the maximum potential rating produced at the previous stage is refined to account for certain risks, particularly refinancing risk, arising on the occurrence of a CB anchor event. This is done by applying the so-called timely payment indicator (TPI) framework. The TPI framework limits the rating uplift that covered bonds may achieve over the CB anchor and may constrain the final covered bond rating to a lower level than the maximum potential rating under the quantitative model.
Usually the issuer's rating is used as a reference point (CB anchor) from which a probability of default for the issuer's payment obligations is derived.
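Treating ratings as numeric notches (higher is better), the two-step logic described above can be caricatured as: the quantitative model proposes a maximum potential rating above the CB anchor, and the TPI framework then caps the uplift over the anchor. A minimal sketch, with invented names and notch values that do not reflect any agency's actual methodology:

```javascript
// Hypothetical two-step rating sketch; ratings modelled as notches (higher = better).
function coveredBondRating(cbAnchor, modelUplift, tpiMaxUplift) {
  // Step 1: the quantitative model yields a maximum potential rating above the anchor.
  var maxPotential = cbAnchor + modelUplift;
  // Step 2: the TPI framework limits the uplift the bonds may achieve over the anchor,
  // possibly constraining the final rating below the model's maximum.
  return Math.min(maxPotential, cbAnchor + tpiMaxUplift);
}
```

For example, with an anchor of 10 notches, a model uplift of 5 and a TPI cap of 3, the final rating is 13: the TPI constraint, not the quantitative model, is binding.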
See also
Financial instruments
European Covered Bond Council
Securitization
The Cover
References
External links
FDIC Policy Statement on Covered Bonds
ECBC Technical Issues Covered Bond Comparative Framework Database
Covered Bond Label Foundation
US Covered Bonds
Morrison & Foerster: Covered Bonds
Bonds (finance)
|
Oleg Aleksandrovich Akulov (born 6 January 1973) is a Russian football coach and a former player.
References
1973 births
Living people
Russian men's footballers
FC Zhemchuzhina-Sochi players
Russian Premier League players
FC Kuban Krasnodar players
FC Akhmat Grozny players
Russian football managers
FC Kristall Smolensk players
Men's association football forwards
FC Volga Ulyanovsk players
FC Akademiya Tolyatti players
FC Kuzbass Kemerovo players
|
Flamel may refer to:
People
Nicolas Flamel (1340–1418), reputed to be an alchemist who concocted the philosopher's stone
Perenelle Flamel (died 1397), reputed to be an alchemist; wife of Nicolas
Places
House of Nicolas Flamel, a building in Paris where Nicolas Flamel once lived, now home to a restaurant
Rue Nicolas Flamel, a street in Paris, France, which contains the House of Nicolas Flamel
Other uses
Nicolas Flamel (Harry Potter), a fictionalized version of Nicolas Flamel
Perenelle Flamel (Harry Potter), a fictionalized version of Perenelle Flamel
Flamel Technologies, a French pharma company
|
Mike Hintz is a former defensive back in the National Football League. He played with the Chicago Bears during the 1987 NFL season.
References
Sportspeople from Eau Claire, Wisconsin
Players of American football from Wisconsin
Chicago Bears players
American football defensive backs
Wisconsin–Platteville Pioneers football players
1965 births
Living people
National Football League replacement players
|
Olev may refer to:
EML Olev (M415), Estonian minelayer
Office of Low Emission Vehicles (OLEV), part of the UK government Department for Transport
Online Electric Vehicle (OLEV)
People
Given name
Olev Eskola (1914–1990), Estonian actor
Olev Olesk (1921–2017), Estonian politician
Olev Raju (born 1948), Estonian economist and politician
Olev Roomet (1901–1987), Estonian musician
Olev Siinmaa (1881–1948), Estonian architect
Olev Subbi (1930–2013), Estonian artist
Olev Tinn (1920–1971), Estonian actor
Olev Toomet (1929–2009), Estonian politician
Olev Vinn (born 1971), Estonian paleobiologist and paleontologist
Surname
Naum Olev (1939–2009), Russian lyricist
Estonian masculine given names
Masculine given names
|
Waki is a village in the administrative district of Gmina Kościelec, within Koło County, Greater Poland Voivodeship, in west-central Poland. It lies approximately north-west of Kościelec, west of Koło, and east of the regional capital Poznań.
References
Waki
|
Augustin Chaho (in French) or Agosti Xaho (in Basque) was an important Romantic Basque writer. He was born in Tardets (Atharratze in Basque), Soule, in the French Basqueland, on 10 October 1811, and died in Bayonne (Baiona in Basque), Labourd, on 23 October 1858. He is considered a precursor of left-wing Basque patriotism.
It is usually said that he studied in Paris with Charles Nodier. In Paris he developed his esoteric thought.
He wrote Travel to Navarre during the Insurrection of the Basques (1830–1835) (1836, in French), about his experiences in the First Carlist War, which he interpreted as an ethnic war of the Basques against Spain that would bring about a Basque republic; The Legend of Aitor, in which he invented a national creation myth that enjoyed wide acceptance for some time; and Azti-Begia (The Soothsayer's Eye, in Souletin Basque).
He was a republican supporter, and became a councillor in Bayonne and in the Basses-Pyrénées department. He headed the revolution of 1848 in Bayonne. After the Bonapartist coup of 1851, he escaped to Vitoria (Gasteiz in Basque), in Alava, in the Spanish Basqueland.
References
Azurmendi, J. 2020: Pentsamenduaren historia Euskal Herrian, Andoain, Jakin.
Coyos Etxebarne, B. 2013: Agosti Xahori omenaldia: bere sortzearen bigarren mendeurrena, Donostia, Eusko Ikaskuntza.
Urkizu, P. 1992: Agosti Chahoren bizitza eta idazlanak: (1811-1858), Bilbo, Euskaltzaindia.
Zabalo, J. 2004: Xaho. El genio de Zuberoa, Tafalla, Txalaparta.
Zabaltza, X. 2011: Agosti Xaho. Aitzindari bakartia (1811-1858), Gasteiz, Eusko Jaurlaritzaren Argitalpen Zerbitzu Nagusia.
External links
CHAHO, Joseph Augustin in the Spanish-language Bernardo Estornés Lasa - Auñamendi Encyclopedia
Online edition of 'Azti-Begia'
Augustin Chaho. Precursor incomprendido. Un précurseur incompris (1811-1858), in Spanish and French, by Xabier Zabaltza
Aïtor. Légende cantabre / Aitor. - Leyenda cántabra / Aitor. - Kantabriar kondaira
Basque writers
French Basque politicians
People of the First Carlist War
French people of the Revolutions of 1848
Basque-language writers
1811 births
1858 deaths
French male writers
|
```javascript
/**
* @license Apache-2.0
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
*    path_to_url
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
'use strict';
// MODULES //
var isnan = require( '@stdlib/math/base/assert/is-nan' );
var gamma = require( '@stdlib/math/base/special/gamma' );
var mean = require( '@stdlib/stats/base/dists/weibull/mean' );
// MAIN //
/**
* Returns the variance of a Weibull distribution.
*
* @param {PositiveNumber} k - shape parameter
* @param {PositiveNumber} lambda - scale parameter
* @returns {PositiveNumber} variance
*
* @example
* var v = variance( 1.0, 1.0 );
* // returns 1.0
*
* @example
* var v = variance( 4.0, 12.0 );
* // returns ~9.311
*
* @example
* var v = variance( 8.0, 2.0 );
* // returns ~0.078
*
* @example
* var v = variance( 1.0, -0.1 );
* // returns NaN
*
* @example
* var v = variance( -0.1, 1.0 );
* // returns NaN
*
* @example
* var v = variance( 2.0, NaN );
* // returns NaN
*
* @example
* var v = variance( NaN, 2.0 );
* // returns NaN
*/
function variance( k, lambda ) {
	var mu;
	if (
		isnan( k ) ||
		isnan( lambda ) ||
		k <= 0.0 ||
		lambda <= 0.0
	) {
		return NaN;
	}
	mu = mean( k, lambda );
	return ( lambda*lambda * ( gamma( 1.0 + (2.0/k) ) ) ) - ( mu*mu );
}
// EXPORTS //
module.exports = variance;
```
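For readers without the stdlib packages installed, the formula above, Var(X) = λ²Γ(1 + 2/k) − μ² with μ = λΓ(1 + 1/k), can be checked with a self-contained sketch. Here `gammaFn` is a standard Lanczos approximation of the gamma function, and `weibullVariance` is an invented name, not the stdlib API:

```javascript
// Lanczos approximation of the gamma function (g = 7, standard coefficients).
function gammaFn(x) {
  var C = [
    0.99999999999980993, 676.5203681218851, -1259.1392167224028,
    771.32342877765313, -176.61502916214059, 12.507343278686905,
    -0.13857109526572012, 9.9843695780195716e-6, 1.5056327351493116e-7
  ];
  if (x < 0.5) {
    // reflection formula for the left half of the real line
    return Math.PI / (Math.sin(Math.PI * x) * gammaFn(1.0 - x));
  }
  x -= 1.0;
  var a = C[0];
  var t = x + 7.5; // x + g + 0.5
  for (var i = 1; i < 9; i++) {
    a += C[i] / (x + i);
  }
  return Math.sqrt(2.0 * Math.PI) * Math.pow(t, x + 0.5) * Math.exp(-t) * a;
}

// Var(X) = λ²·Γ(1 + 2/k) − μ², with μ = λ·Γ(1 + 1/k); NaN for invalid parameters.
function weibullVariance(k, lambda) {
  if (!(k > 0.0) || !(lambda > 0.0)) { // also rejects NaN inputs
    return NaN;
  }
  var mu = lambda * gammaFn(1.0 + 1.0 / k);
  return lambda * lambda * gammaFn(1.0 + 2.0 / k) - mu * mu;
}
```

For k = 1, λ = 1 the distribution is standard exponential, so the variance is exactly 1, matching the first doc-comment example above; `weibullVariance(4.0, 12.0)` likewise agrees with the ~9.311 shown there.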
|
The 2011–12 Burkinabé Premier League was the 50th edition of top-flight football in Burkina Faso. A total of twenty teams competed in the season.
Teams
Sources:
Group A
ASFA
AS Kouritenga
Canon du Sud
Étoile Filante
Kadiogo
Sanmatenga FC
Santos
SONABEL
USFA
Ouagadougou
Group B
ASEC Koudougou
ASFB
AS Maya
Bobo Sport
Bouloumpoukou FC
Comoé
RC Bobo Dioulasso
US FRAN
Yatenga
Sourou Sport de Tougan
First stage
Sources:
Group A
Group B
Super division
Table
References
Premier League
Burkina Faso
Burkinabé Premier League seasons
|
```php
<?php
if ( ! class_exists( 'acf_field_checkbox' ) ) :
class acf_field_checkbox extends acf_field {
/**
* This function will setup the field type data
*
* @type function
* @date 5/03/2014
* @since 5.0.0
*
* @param n/a
* @return n/a
*/
function initialize() {
// vars
$this->name = 'checkbox';
$this->label = __( 'Checkbox', 'acf' );
$this->category = 'choice';
$this->description = __( 'A group of checkbox inputs that allow the user to select one, or multiple values that you specify.', 'acf' );
$this->preview_image = acf_get_url() . '/assets/images/field-type-previews/field-preview-checkbox.png';
$this->doc_url = acf_add_url_utm_tags( 'path_to_url', 'docs', 'field-type-selection' );
$this->defaults = array(
'layout' => 'vertical',
'choices' => array(),
'default_value' => '',
'allow_custom' => 0,
'save_custom' => 0,
'toggle' => 0,
'return_format' => 'value',
'custom_choice_button_text' => __( 'Add new choice', 'acf' ),
);
}
/**
* Create the HTML interface for your field
*
* @param $field (array) the $field being rendered
*
* @type action
* @since 3.6
* @date 23/01/13
*
* @param $field (array) the $field being edited
* @return n/a
*/
function render_field( $field ) {
// reset vars
$this->_values = array();
$this->_all_checked = true;
// ensure array
$field['value'] = acf_get_array( $field['value'] );
$field['choices'] = acf_get_array( $field['choices'] );
// hidden input
acf_hidden_input( array( 'name' => $field['name'] ) );
// vars
$li = '';
$ul = array(
'class' => 'acf-checkbox-list',
);
// append to class
$ul['class'] .= ' ' . ( $field['layout'] == 'horizontal' ? 'acf-hl' : 'acf-bl' );
$ul['class'] .= ' ' . $field['class'];
// checkbox saves an array
$field['name'] .= '[]';
// choices
if ( ! empty( $field['choices'] ) ) {
// choices
$li .= $this->render_field_choices( $field );
// toggle
if ( $field['toggle'] ) {
$li = $this->render_field_toggle( $field ) . $li;
}
}
// custom
if ( $field['allow_custom'] ) {
$li .= $this->render_field_custom( $field );
}
// return
echo '<ul ' . acf_esc_attrs( $ul ) . '>' . "\n" . $li . '</ul>' . "\n"; //phpcs:ignore WordPress.Security.EscapeOutput.OutputNotEscaped -- escaped by specific render methods above.
}
/**
* description
*
* @type function
* @date 15/7/17
* @since 5.6.0
*
* @param $post_id (int)
* @return $post_id (int)
*/
function render_field_choices( $field ) {
// walk
return $this->walk( $field['choices'], $field );
}
/**
* Validates values for the checkbox field
*
* @since 6.0.0
*
* @param boolean $valid If the field is valid.
* @param mixed $value The value to validate.
* @param array $field The main field array.
* @param string $input The input element's name attribute.
* @return boolean
*/
public function validate_value( $valid, $value, $field, $input ) {
if ( ! is_array( $value ) || empty( $field['allow_custom'] ) ) {
return $valid;
}
foreach ( $value as $value ) {
if ( empty( $value ) && $value !== '0' ) {
return __( 'Checkbox custom values cannot be empty. Uncheck any empty values.', 'acf' );
}
}
return $valid;
}
/**
* description
*
* @type function
* @date 15/7/17
* @since 5.6.0
*
* @param $post_id (int)
* @return $post_id (int)
*/
function render_field_toggle( $field ) {
// vars
$atts = array(
'type' => 'checkbox',
'class' => 'acf-checkbox-toggle',
'label' => __( 'Toggle All', 'acf' ),
);
// custom label
if ( is_string( $field['toggle'] ) ) {
$atts['label'] = $field['toggle'];
}
// checked
if ( $this->_all_checked ) {
$atts['checked'] = 'checked';
}
// return
return '<li>' . acf_get_checkbox_input( $atts ) . '</li>' . "\n";
}
/**
* description
*
* @type function
* @date 15/7/17
* @since 5.6.0
*
* @param $post_id (int)
* @return $post_id (int)
*/
function render_field_custom( $field ) {
// vars
$html = '';
// loop
foreach ( $field['value'] as $value ) {
// ignore if already exists
if ( isset( $field['choices'][ $value ] ) ) {
continue;
}
// vars
$esc_value = esc_attr( $value );
$text_input = array(
'name' => $field['name'],
'value' => $value,
);
// bail early if choice already exists
if ( in_array( $esc_value, $this->_values ) ) {
continue;
}
// append
$html .= '<li><input class="acf-checkbox-custom" type="checkbox" checked="checked" />' . acf_get_text_input( $text_input ) . '</li>' . "\n";
}
// append button
$html .= '<li><a href="#" class="button acf-add-checkbox">' . esc_attr( $field['custom_choice_button_text'] ) . '</a></li>' . "\n";
// return
return $html;
}
function walk( $choices = array(), $args = array(), $depth = 0 ) {
// bail early if no choices
if ( empty( $choices ) ) {
return '';
}
// defaults
$args = wp_parse_args(
$args,
array(
'id' => '',
'type' => 'checkbox',
'name' => '',
'value' => array(),
'disabled' => array(),
)
);
// vars
$html = '';
// sanitize values for 'selected' matching
if ( $depth == 0 ) {
$args['value'] = array_map( 'esc_attr', $args['value'] );
$args['disabled'] = array_map( 'esc_attr', $args['disabled'] );
}
// loop
foreach ( $choices as $value => $label ) {
// open
$html .= '<li>';
// optgroup
if ( is_array( $label ) ) {
$html .= '<ul>' . "\n";
$html .= $this->walk( $label, $args, $depth + 1 );
$html .= '</ul>';
// option
} else {
// vars
$esc_value = esc_attr( $value );
$atts = array(
'id' => $args['id'] . '-' . str_replace( ' ', '-', $value ),
'type' => $args['type'],
'name' => $args['name'],
'value' => $value,
'label' => $label,
);
// selected
if ( in_array( $esc_value, $args['value'] ) ) {
$atts['checked'] = 'checked';
} else {
$this->_all_checked = false;
}
// disabled
if ( in_array( $esc_value, $args['disabled'] ) ) {
$atts['disabled'] = 'disabled';
}
// store value added
$this->_values[] = $esc_value;
// append
$html .= acf_get_checkbox_input( $atts );
}
// close
$html .= '</li>' . "\n";
}
// return
return $html;
}
/**
* Create extra options for your field. This is rendered when editing a field.
* The value of $field['name'] can be used (like below) to save extra data to the $field
*
* @type action
* @since 3.6
* @date 23/01/13
*
* @param $field - an array holding all the field's data
*/
function render_field_settings( $field ) {
// Encode choices (convert from array).
$field['choices'] = acf_encode_choices( $field['choices'] );
$field['default_value'] = acf_encode_choices( $field['default_value'], false );
acf_render_field_setting(
$field,
array(
'label' => __( 'Choices', 'acf' ),
'instructions' => __( 'Enter each choice on a new line.', 'acf' ) . '<br />' . __( 'For more control, you may specify both a value and label like this:', 'acf' ) . '<br /><span class="acf-field-setting-example">' . __( 'red : Red', 'acf' ) . '</span>',
'type' => 'textarea',
'name' => 'choices',
)
);
acf_render_field_setting(
$field,
array(
'label' => __( 'Default Value', 'acf' ),
'instructions' => __( 'Enter each default value on a new line', 'acf' ),
'type' => 'textarea',
'name' => 'default_value',
)
);
acf_render_field_setting(
$field,
array(
'label' => __( 'Return Value', 'acf' ),
'instructions' => __( 'Specify the returned value on front end', 'acf' ),
'type' => 'radio',
'name' => 'return_format',
'layout' => 'horizontal',
'choices' => array(
'value' => __( 'Value', 'acf' ),
'label' => __( 'Label', 'acf' ),
'array' => __( 'Both (Array)', 'acf' ),
),
)
);
}
/**
* Renders the field settings used in the "Validation" tab.
*
* @since 6.0
*
* @param array $field The field settings array.
* @return void
*/
function render_field_validation_settings( $field ) {
acf_render_field_setting(
$field,
array(
'label' => __( 'Allow Custom Values', 'acf' ),
'name' => 'allow_custom',
'type' => 'true_false',
'ui' => 1,
'instructions' => __( "Allow 'custom' values to be added", 'acf' ),
)
);
acf_render_field_setting(
$field,
array(
'label' => __( 'Save Custom Values', 'acf' ),
'name' => 'save_custom',
'type' => 'true_false',
'ui' => 1,
'instructions' => __( "Save 'custom' values to the field's choices", 'acf' ),
'conditions' => array(
'field' => 'allow_custom',
'operator' => '==',
'value' => 1,
),
)
);
}
/**
* Renders the field settings used in the "Presentation" tab.
*
* @since 6.0
*
* @param array $field The field settings array.
* @return void
*/
function render_field_presentation_settings( $field ) {
acf_render_field_setting(
$field,
array(
'label' => __( 'Layout', 'acf' ),
'instructions' => '',
'type' => 'radio',
'name' => 'layout',
'layout' => 'horizontal',
'choices' => array(
'vertical' => __( 'Vertical', 'acf' ),
'horizontal' => __( 'Horizontal', 'acf' ),
),
)
);
acf_render_field_setting(
$field,
array(
'label' => __( 'Add Toggle All', 'acf' ),
'instructions' => __( 'Prepend an extra checkbox to toggle all choices', 'acf' ),
'name' => 'toggle',
'type' => 'true_false',
'ui' => 1,
)
);
}
/**
* This filter is applied to the $field before it is saved to the database
*
* @type filter
* @since 3.6
* @date 23/01/13
*
* @param $field - the field array holding all the field options
* @param $post_id - the field group ID (post_type = acf)
*
* @return $field - the modified field
*/
function update_field( $field ) {
// Decode choices (convert to array).
$field['choices'] = acf_decode_choices( $field['choices'] );
$field['default_value'] = acf_decode_choices( $field['default_value'], true );
return $field;
}
/**
* This filter is applied to the $value before it is updated in the db
*
* @type filter
* @since 3.6
* @date 23/01/13
*
* @param $value - the value which will be saved in the database
* @param $post_id - the post_id of which the value will be saved
* @param $field - the field array holding all the field options
*
* @return $value - the modified value
*/
function update_value( $value, $post_id, $field ) {
// bail early if is empty
if ( empty( $value ) ) {
return $value;
}
// select -> update_value()
$value = acf_get_field_type( 'select' )->update_value( $value, $post_id, $field );
// save_other_choice
if ( $field['save_custom'] ) {
// get raw $field (may have been changed via repeater field)
// if field is local, it won't have an ID
$selector = $field['ID'] ? $field['ID'] : $field['key'];
$field = acf_get_field( $selector );
if ( ! $field ) {
return false;
}
// bail early if no ID (JSON only)
if ( ! $field['ID'] ) {
return $value;
}
// loop
foreach ( $value as $v ) {
// ignore if already exists
if ( isset( $field['choices'][ $v ] ) ) {
continue;
}
// unslash (fixes serialize single quote issue)
$v = wp_unslash( $v );
// sanitize (remove tags)
$v = sanitize_text_field( $v );
// append
$field['choices'][ $v ] = $v;
}
// save
acf_update_field( $field );
}
// return
return $value;
}
/**
* This function will translate field settings
*
* @type function
* @date 8/03/2016
* @since 5.3.2
*
* @param $field (array)
* @return $field
*/
function translate_field( $field ) {
return acf_get_field_type( 'select' )->translate_field( $field );
}
/**
* This filter is applied to the $value after it is loaded from the db and before it is returned to the template
*
* @type filter
* @since 3.6
* @date 23/01/13
*
* @param $value (mixed) the value which was loaded from the database
* @param $post_id (mixed) the post_id from which the value was loaded
* @param $field (array) the field array holding all the field options
*
* @return $value (mixed) the modified value
*/
function format_value( $value, $post_id, $field ) {
// Bail early if is empty.
if ( acf_is_empty( $value ) ) {
return array();
}
// Always convert to array of items.
$value = acf_array( $value );
// Return.
return acf_get_field_type( 'select' )->format_value( $value, $post_id, $field );
}
/**
* Return the schema array for the REST API.
*
* @param array $field
* @return array
*/
public function get_rest_schema( array $field ) {
$schema = array(
'type' => array( 'integer', 'string', 'array', 'null' ),
'required' => isset( $field['required'] ) && $field['required'],
'items' => array(
'type' => array( 'string', 'integer' ),
),
);
if ( isset( $field['default_value'] ) && '' !== $field['default_value'] ) {
$schema['default'] = $field['default_value'];
}
// If we allow custom values, nothing else to do here.
if ( ! empty( $field['allow_custom'] ) ) {
return $schema;
}
$schema['items']['enum'] = acf_get_field_type( 'select' )->format_rest_choices( $field['choices'] );
return $schema;
}
}
// initialize
acf_register_field_type( 'acf_field_checkbox' );
endif; // class_exists check
```
|
```asciidoc
The example below is based on the following sample graph:
[source,cypher]
----
CREATE (Keanu:Person {name:'Keanu Reeves', born:1964})
CREATE (TomH:Person {name:'Tom Hanks', born:1956})
CREATE (TheMatrix:Movie {title:'The Matrix', released:1999, tagline:'Welcome to the Real World'})
CREATE (TheMatrixReloaded:Movie {title:'The Matrix Reloaded', released:2003, tagline:'Free your mind'})
CREATE (TheMatrixRevolutions:Movie {title:'The Matrix Revolutions', released:2003, tagline:'Everything that has a beginning has an end'})
CREATE (SomethingsGottaGive:Movie {title:"Something's Gotta Give", released:2003})
CREATE (TheDevilsAdvocate:Movie {title:"The Devil's Advocate", released:1997, tagline:'Evil has its winning ways'})
CREATE (YouveGotMail:Movie {title:"You've Got Mail", released:1998, tagline:'At odds in life... in love on-line.'})
CREATE (SleeplessInSeattle:Movie {title:'Sleepless in Seattle', released:1993, tagline:'What if someone you never met, someone you never saw, someone you never knew was the only someone for you?'})
CREATE (ThatThingYouDo:Movie {title:'That Thing You Do', released:1996, tagline:'In every life there comes a time when that thing you dream becomes that thing you do'})
CREATE (CloudAtlas:Movie {title:'Cloud Atlas', released:2012, tagline:'Everything is connected'})
CREATE (Keanu)-[:ACTED_IN {roles:['Neo']}]->(TheMatrix)
CREATE (Keanu)-[:ACTED_IN {roles:['Neo']}]->(TheMatrixReloaded)
CREATE (Keanu)-[:ACTED_IN {roles:['Neo']}]->(TheMatrixRevolutions)
CREATE (Keanu)-[:ACTED_IN {roles:['Julian Mercer']}]->(SomethingsGottaGive)
CREATE (Keanu)-[:ACTED_IN {roles:['Kevin Lomax']}]->(TheDevilsAdvocate)
CREATE (TomH)-[:ACTED_IN {roles:['Joe Fox']}]->(YouveGotMail)
CREATE (TomH)-[:ACTED_IN {roles:['Sam Baldwin']}]->(SleeplessInSeattle)
CREATE (TomH)-[:ACTED_IN {roles:['Mr. White']}]->(ThatThingYouDo)
CREATE (TomH)-[:ACTED_IN {roles:['Zachry', 'Dr. Henry Goose', 'Isaac Sachs', 'Dermot Hoggins']}]->(CloudAtlas);
----
[source,cypher]
----
CALL apoc.meta.stats();
----
.Results
[opts="header"]
|===
| labelCount | relTypeCount | propertyKeyCount | nodeCount | relCount | labels | relTypes | relTypesCount | stats
| 9 | 5 | 17 | 11 | 9 | {Movie: 9, Person: 2} | {`(:Person)-[:ACTED_IN]->()`: 9, `()-[:ACTED_IN]->(:Movie)`: 9, `()-[:ACTED_IN]->()`: 9} | {ACTED_IN: 9} | {relTypeCount: 5, propertyKeyCount: 17, labelCount: 9, nodeCount: 11, relCount: 9, labels: {Movie: 9, Person: 2}, relTypes: {`(:Person)-[:ACTED_IN]->()`: 9, `()-[:ACTED_IN]->(:Movie)`: 9, `()-[:ACTED_IN]->()`: 9}}
|===
Note that the `relTypesCount` field returns, for each relationship type, a count equal to `MATCH ()-[r:RELATIONSHIP_TYPE]->() RETURN count(r)` (unique relationships rather than unique patterns).
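For instance, with the sample graph above, the `ACTED_IN` entry can be cross-checked with a direct count (a hypothetical verification query, not part of the procedure's output):

[source,cypher]
----
MATCH ()-[r:ACTED_IN]->() RETURN count(r) // returns 9, matching relTypesCount
----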
```
|
```xml
<?xml version='1.0' encoding='UTF-8'?>
<definitions xmlns="path_to_url" xmlns:xsi="path_to_url" xmlns:xsd="path_to_url" xmlns:activiti="path_to_url" xmlns:bpmndi="path_to_url" xmlns:omgdc="path_to_url" xmlns:omgdi="path_to_url" typeLanguage="path_to_url" expressionLanguage="path_to_url" targetNamespace="path_to_url" xmlns:modeler="path_to_url" modeler:version="1.0en" modeler:exportDateTime="20141210124445777" modeler:modelId="924474" modeler:modelVersion="1" modeler:modelLastUpdated="1418215482206">
<process id="variablesFetchingTestProcess" name="VariablesTest" isExecutable="true">
<startEvent id="startEvent1"/>
<userTask id="sid-05767243-5EFD-496E-882E-14B47D3274B3" name="Task A"/>
<sequenceFlow id="sid-15D04314-3280-4CAF-8A23-0415598CF5B0" sourceRef="startEvent1" targetRef="sid-05767243-5EFD-496E-882E-14B47D3274B3"/>
<sequenceFlow id="sid-90E99CC2-B930-48F4-AE60-425DD6A4C657" sourceRef="sid-05767243-5EFD-496E-882E-14B47D3274B3" targetRef="sid-6EDB2B4B-48CB-4018-BFDB-A751B0B1950F"/>
<parallelGateway id="sid-6EDB2B4B-48CB-4018-BFDB-A751B0B1950F"/>
<userTask id="sid-157D27B4-4E96-47AB-A1BA-ADB350659DDB" name="Task B"/>
<sequenceFlow id="sid-2514CA53-0998-4C08-880D-14FCCE4B59E1" sourceRef="sid-6EDB2B4B-48CB-4018-BFDB-A751B0B1950F" targetRef="sid-157D27B4-4E96-47AB-A1BA-ADB350659DDB"/>
<sequenceFlow id="sid-EA019938-D594-4400-B498-A143E3546C6A" sourceRef="sid-6EDB2B4B-48CB-4018-BFDB-A751B0B1950F" targetRef="sid-92FDA83E-7CB0-4538-B26B-71D470395CA5"/>
<userTask id="sid-92FDA83E-7CB0-4538-B26B-71D470395CA5" name="Task C"/>
<userTask id="sid-A5706B24-A6CA-4131-94BD-659FC3028482" name="Task D"/>
<sequenceFlow id="sid-9CFBE731-A527-49EF-919F-9FC2B497EC63" sourceRef="sid-6EDB2B4B-48CB-4018-BFDB-A751B0B1950F" targetRef="sid-A5706B24-A6CA-4131-94BD-659FC3028482"/>
<parallelGateway id="sid-D5DF4CD1-DBBC-46DA-92EE-669BCC00CFB5"/>
<sequenceFlow id="sid-9EB6750A-F86E-4451-AED9-5EFCCA5A899A" sourceRef="sid-92FDA83E-7CB0-4538-B26B-71D470395CA5" targetRef="sid-D5DF4CD1-DBBC-46DA-92EE-669BCC00CFB5"/>
<sequenceFlow id="sid-1ADF62AD-B29D-45AF-8920-A9B43714C041" sourceRef="sid-A5706B24-A6CA-4131-94BD-659FC3028482" targetRef="sid-D5DF4CD1-DBBC-46DA-92EE-669BCC00CFB5"/>
<userTask id="sid-8E05A38C-6F00-4BEC-BE68-F38A72A1AFBD" name="Task E"/>
<sequenceFlow id="sid-469049E9-EFE8-4903-B80D-A38F3DDFA531" sourceRef="sid-D5DF4CD1-DBBC-46DA-92EE-669BCC00CFB5" targetRef="sid-8E05A38C-6F00-4BEC-BE68-F38A72A1AFBD"/>
<endEvent id="sid-0DCD8F1E-37C8-4F8D-AF19-B705CF07103F"/>
<sequenceFlow id="sid-A6AC0534-5433-40EF-ACCC-C47C48E8286C" sourceRef="sid-8E05A38C-6F00-4BEC-BE68-F38A72A1AFBD" targetRef="sid-0DCD8F1E-37C8-4F8D-AF19-B705CF07103F"/>
<sequenceFlow id="sid-20096242-1571-4D68-9FFC-8C6B0A07000B" sourceRef="sid-157D27B4-4E96-47AB-A1BA-ADB350659DDB" targetRef="sid-8AA6253E-EF3E-4F7F-A120-1EDCD83E2EB3"/>
<serviceTask id="sid-8AA6253E-EF3E-4F7F-A120-1EDCD83E2EB3" name="service task 1" activiti:class="org.activiti.engine.test.api.variables.VariablesTest$TestJavaDelegate10" />
<sequenceFlow id="sid-550458E7-E271-47D2-9B3E-D21A6CE1704A" sourceRef="sid-8AA6253E-EF3E-4F7F-A120-1EDCD83E2EB3" targetRef="sid-6A4C5B66-2BD8-4D65-B074-8510193AD98D"/>
<serviceTask id="sid-6A4C5B66-2BD8-4D65-B074-8510193AD98D" name="service task 2" activiti:class="org.activiti.engine.test.api.variables.VariablesTest$TestJavaDelegate11" />
<serviceTask id="sid-90AAEC84-20C3-407E-8B4D-8FCFC252054F" name="service task 3" activiti:class="org.activiti.engine.test.api.variables.VariablesTest$TestJavaDelegate12" />
<sequenceFlow id="sid-AA39E536-60F4-4A64-BE77-EE632A328ED2" sourceRef="sid-6A4C5B66-2BD8-4D65-B074-8510193AD98D" targetRef="sid-90AAEC84-20C3-407E-8B4D-8FCFC252054F"/>
<sequenceFlow id="sid-19C0E404-8A1C-4806-A37A-320F4404EBF0" sourceRef="sid-90AAEC84-20C3-407E-8B4D-8FCFC252054F" targetRef="sid-D5DF4CD1-DBBC-46DA-92EE-669BCC00CFB5"/>
</process>
<bpmndi:BPMNDiagram id="BPMNDiagram_variablesTest">
<bpmndi:BPMNPlane bpmnElement="variablesTest" id="BPMNPlane_variablesTest">
<bpmndi:BPMNShape bpmnElement="startEvent1" id="BPMNShape_startEvent1">
<omgdc:Bounds height="30.0" width="30.0" x="90.0" y="300.0"/>
</bpmndi:BPMNShape>
<bpmndi:BPMNShape bpmnElement="sid-05767243-5EFD-496E-882E-14B47D3274B3" id="BPMNShape_sid-05767243-5EFD-496E-882E-14B47D3274B3">
<omgdc:Bounds height="80.0" width="100.0" x="165.0" y="275.0"/>
</bpmndi:BPMNShape>
<bpmndi:BPMNShape bpmnElement="sid-6EDB2B4B-48CB-4018-BFDB-A751B0B1950F" id="BPMNShape_sid-6EDB2B4B-48CB-4018-BFDB-A751B0B1950F">
<omgdc:Bounds height="40.0" width="40.0" x="310.0" y="295.0"/>
</bpmndi:BPMNShape>
<bpmndi:BPMNShape bpmnElement="sid-157D27B4-4E96-47AB-A1BA-ADB350659DDB" id="BPMNShape_sid-157D27B4-4E96-47AB-A1BA-ADB350659DDB">
<omgdc:Bounds height="80.0" width="100.0" x="395.0" y="165.0"/>
</bpmndi:BPMNShape>
<bpmndi:BPMNShape bpmnElement="sid-92FDA83E-7CB0-4538-B26B-71D470395CA5" id="BPMNShape_sid-92FDA83E-7CB0-4538-B26B-71D470395CA5">
<omgdc:Bounds height="80.0" width="100.0" x="395.0" y="275.0"/>
</bpmndi:BPMNShape>
<bpmndi:BPMNShape bpmnElement="sid-A5706B24-A6CA-4131-94BD-659FC3028482" id="BPMNShape_sid-A5706B24-A6CA-4131-94BD-659FC3028482">
<omgdc:Bounds height="80.0" width="100.0" x="395.0" y="390.0"/>
</bpmndi:BPMNShape>
<bpmndi:BPMNShape bpmnElement="sid-D5DF4CD1-DBBC-46DA-92EE-669BCC00CFB5" id="BPMNShape_sid-D5DF4CD1-DBBC-46DA-92EE-669BCC00CFB5">
<omgdc:Bounds height="40.0" width="40.0" x="711.0" y="295.0"/>
</bpmndi:BPMNShape>
<bpmndi:BPMNShape bpmnElement="sid-8E05A38C-6F00-4BEC-BE68-F38A72A1AFBD" id="BPMNShape_sid-8E05A38C-6F00-4BEC-BE68-F38A72A1AFBD">
<omgdc:Bounds height="80.0" width="100.0" x="796.0" y="275.0"/>
</bpmndi:BPMNShape>
<bpmndi:BPMNShape bpmnElement="sid-0DCD8F1E-37C8-4F8D-AF19-B705CF07103F" id="BPMNShape_sid-0DCD8F1E-37C8-4F8D-AF19-B705CF07103F">
<omgdc:Bounds height="28.0" width="28.0" x="941.0" y="301.0"/>
</bpmndi:BPMNShape>
<bpmndi:BPMNShape bpmnElement="sid-8AA6253E-EF3E-4F7F-A120-1EDCD83E2EB3" id="BPMNShape_sid-8AA6253E-EF3E-4F7F-A120-1EDCD83E2EB3">
<omgdc:Bounds height="80.0" width="100.0" x="395.0" y="30.0"/>
</bpmndi:BPMNShape>
<bpmndi:BPMNShape bpmnElement="sid-6A4C5B66-2BD8-4D65-B074-8510193AD98D" id="BPMNShape_sid-6A4C5B66-2BD8-4D65-B074-8510193AD98D">
<omgdc:Bounds height="80.0" width="100.0" x="540.0" y="30.0"/>
</bpmndi:BPMNShape>
<bpmndi:BPMNShape bpmnElement="sid-90AAEC84-20C3-407E-8B4D-8FCFC252054F" id="BPMNShape_sid-90AAEC84-20C3-407E-8B4D-8FCFC252054F">
<omgdc:Bounds height="80.0" width="100.0" x="681.0" y="30.0"/>
</bpmndi:BPMNShape>
<bpmndi:BPMNEdge bpmnElement="sid-20096242-1571-4D68-9FFC-8C6B0A07000B" id="BPMNEdge_sid-20096242-1571-4D68-9FFC-8C6B0A07000B">
<omgdi:waypoint x="445.0" y="165.0"/>
<omgdi:waypoint x="445.0" y="110.0"/>
</bpmndi:BPMNEdge>
<bpmndi:BPMNEdge bpmnElement="sid-19C0E404-8A1C-4806-A37A-320F4404EBF0" id="BPMNEdge_sid-19C0E404-8A1C-4806-A37A-320F4404EBF0">
<omgdi:waypoint x="731.0" y="110.0"/>
<omgdi:waypoint x="731.0" y="295.0"/>
</bpmndi:BPMNEdge>
<bpmndi:BPMNEdge bpmnElement="sid-9CFBE731-A527-49EF-919F-9FC2B497EC63" id="BPMNEdge_sid-9CFBE731-A527-49EF-919F-9FC2B497EC63">
<omgdi:waypoint x="330.5" y="334.5"/>
<omgdi:waypoint x="330.5" y="430.0"/>
<omgdi:waypoint x="395.0" y="430.0"/>
</bpmndi:BPMNEdge>
<bpmndi:BPMNEdge bpmnElement="sid-A6AC0534-5433-40EF-ACCC-C47C48E8286C" id="BPMNEdge_sid-A6AC0534-5433-40EF-ACCC-C47C48E8286C">
<omgdi:waypoint x="896.0" y="315.0"/>
<omgdi:waypoint x="941.0" y="315.0"/>
</bpmndi:BPMNEdge>
<bpmndi:BPMNEdge bpmnElement="sid-2514CA53-0998-4C08-880D-14FCCE4B59E1" id="BPMNEdge_sid-2514CA53-0998-4C08-880D-14FCCE4B59E1">
<omgdi:waypoint x="330.5" y="295.5"/>
<omgdi:waypoint x="330.5" y="205.0"/>
<omgdi:waypoint x="395.0" y="205.0"/>
</bpmndi:BPMNEdge>
<bpmndi:BPMNEdge bpmnElement="sid-469049E9-EFE8-4903-B80D-A38F3DDFA531" id="BPMNEdge_sid-469049E9-EFE8-4903-B80D-A38F3DDFA531">
<omgdi:waypoint x="750.5833333333334" y="315.4166666666667"/>
<omgdi:waypoint x="796.0" y="315.2183406113537"/>
</bpmndi:BPMNEdge>
<bpmndi:BPMNEdge bpmnElement="sid-90E99CC2-B930-48F4-AE60-425DD6A4C657" id="BPMNEdge_sid-90E99CC2-B930-48F4-AE60-425DD6A4C657">
<omgdi:waypoint x="265.0" y="315.2164502164502"/>
<omgdi:waypoint x="310.4130434782609" y="315.4130434782609"/>
</bpmndi:BPMNEdge>
<bpmndi:BPMNEdge bpmnElement="sid-AA39E536-60F4-4A64-BE77-EE632A328ED2" id="BPMNEdge_sid-AA39E536-60F4-4A64-BE77-EE632A328ED2">
<omgdi:waypoint x="640.0" y="70.0"/>
<omgdi:waypoint x="681.0" y="70.0"/>
</bpmndi:BPMNEdge>
<bpmndi:BPMNEdge bpmnElement="sid-EA019938-D594-4400-B498-A143E3546C6A" id="BPMNEdge_sid-EA019938-D594-4400-B498-A143E3546C6A">
<omgdi:waypoint x="349.5833333333333" y="315.4166666666667"/>
<omgdi:waypoint x="395.0" y="315.2183406113537"/>
</bpmndi:BPMNEdge>
<bpmndi:BPMNEdge bpmnElement="sid-9EB6750A-F86E-4451-AED9-5EFCCA5A899A" id="BPMNEdge_sid-9EB6750A-F86E-4451-AED9-5EFCCA5A899A">
<omgdi:waypoint x="495.0" y="315.0"/>
<omgdi:waypoint x="711.0" y="315.0"/>
</bpmndi:BPMNEdge>
<bpmndi:BPMNEdge bpmnElement="sid-550458E7-E271-47D2-9B3E-D21A6CE1704A" id="BPMNEdge_sid-550458E7-E271-47D2-9B3E-D21A6CE1704A">
<omgdi:waypoint x="495.0" y="70.0"/>
<omgdi:waypoint x="540.0" y="70.0"/>
</bpmndi:BPMNEdge>
<bpmndi:BPMNEdge bpmnElement="sid-1ADF62AD-B29D-45AF-8920-A9B43714C041" id="BPMNEdge_sid-1ADF62AD-B29D-45AF-8920-A9B43714C041">
<omgdi:waypoint x="495.0" y="430.0"/>
<omgdi:waypoint x="731.0" y="430.0"/>
<omgdi:waypoint x="731.0" y="335.0"/>
</bpmndi:BPMNEdge>
<bpmndi:BPMNEdge bpmnElement="sid-15D04314-3280-4CAF-8A23-0415598CF5B0" id="BPMNEdge_sid-15D04314-3280-4CAF-8A23-0415598CF5B0">
<omgdi:waypoint x="120.0" y="315.0"/>
<omgdi:waypoint x="165.0" y="315.0"/>
</bpmndi:BPMNEdge>
</bpmndi:BPMNPlane>
</bpmndi:BPMNDiagram>
</definitions>
```
|
```go
// Code generated by type_noasm.go.tmpl. DO NOT EDIT.

// Licensed to the Apache Software Foundation (ASF) under one
// or more contributor license agreements. See the NOTICE file
// distributed with this work for additional information
// regarding copyright ownership. The ASF licenses this file
// to you under the Apache License, Version 2.0 (the
// "License"); you may not use this file except in compliance
// with the License. You may obtain a copy of the License at
//
// path_to_url
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

//go:build noasm
// +build noasm

package math
func initInt64Go() {
Int64.sum = sum_int64_go
}
```
|
```python
#!/usr/bin/env python
# -*- coding: utf-8 -*-
#
# This file is part of satpy.
#
# satpy is free software: you can redistribute it and/or modify it under the
# terms of the GNU General Public License as published by the Free Software
# Foundation, either version 3 of the License, or (at your option) any later
# version.
#
# satpy is distributed in the hope that it will be useful, but WITHOUT ANY
# WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR
# A PARTICULAR PURPOSE. See the GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License along with
# satpy. If not, see <path_to_url>.
"""The glm_l2 reader tests package."""
import os
import unittest
from unittest import mock
import numpy as np
import xarray as xr
def setup_fake_dataset():
"""Create a fake dataset to avoid opening a file."""
# flash_extent_density
fed = (np.arange(10.).reshape((2, 5)) + 1.) * 50.
fed = (fed + 1.) / 0.5
fed = fed.astype(np.int16)
fed = xr.DataArray(
fed,
dims=("y", "x"),
attrs={
"scale_factor": 0.5,
"add_offset": -1.,
"_FillValue": 0,
"units": "Count per nominal 3136 microradian^2 pixel per 1.0 min",
"grid_mapping": "goes_imager_projection",
"standard_name": "flash_extent_density",
"long_name": "Flash extent density",
}
)
dqf = xr.DataArray(
fed.data.copy().astype(np.uint8),
dims=("y", "x"),
attrs={
"_FillValue": -1,
"units": "1",
"grid_mapping": "goes_imager_projection",
"standard_name": "status_flag",
"long_name": "GLM data quality flags",
"flag_meanings": "valid invalid",
}
)
# create a variable that won't be configured to test available_datasets
not_configured = xr.DataArray(
fed.data.copy(),
dims=("y", "x"),
attrs={
"scale_factor": 0.5,
"add_offset": -1.,
"_FillValue": 0,
"units": "1",
"grid_mapping": "goes_imager_projection",
"standard_name": "test",
"long_name": "Test",
}
)
x__ = xr.DataArray(
range(5),
attrs={"scale_factor": 2., "add_offset": -1.},
dims=("x",),
)
y__ = xr.DataArray(
range(2),
attrs={"scale_factor": -2., "add_offset": 1.},
dims=("y",),
)
proj = xr.DataArray(
[],
attrs={
"semi_major_axis": 1.,
"semi_minor_axis": 1.,
"perspective_point_height": 1.,
"longitude_of_projection_origin": -90.,
"latitude_of_projection_origin": 0.,
"sweep_angle_axis": u"x"
}
)
fake_dataset = xr.Dataset(
data_vars={
"flash_extent_density": fed,
"not_configured": not_configured,
"DQF": dqf,
"x": x__,
"y": y__,
"goes_imager_projection": proj,
"nominal_satellite_subpoint_lat": np.array(0.0),
"nominal_satellite_subpoint_lon": np.array(-89.5),
"nominal_satellite_height": np.array(35786.02)
},
attrs={
"time_coverage_start": "2017-09-20T17:30:40Z",
"time_coverage_end": "2017-09-20T17:41:17Z",
"spatial_resolution": "2km at nadir",
}
)
return fake_dataset
class TestGLML2FileHandler(unittest.TestCase):
"""Tests for the GLM L2 reader."""
@mock.patch("satpy.readers.abi_base.xr")
def setUp(self, xr_):
"""Create a fake file handler to test."""
from satpy.readers.glm_l2 import NCGriddedGLML2
fake_dataset = setup_fake_dataset()
xr_.open_dataset.return_value = fake_dataset
self.reader = NCGriddedGLML2("filename",
{"platform_shortname": "G16",
"scene_abbr": "C", "scan_mode": "M3"},
{"filetype": "glm_l2_imagery"})
def test_basic_attributes(self):
"""Test getting basic file attributes."""
import datetime as dt
assert self.reader.start_time == dt.datetime(2017, 9, 20, 17, 30, 40)
assert self.reader.end_time == dt.datetime(2017, 9, 20, 17, 41, 17)
def test_get_dataset(self):
"""Test the get_dataset method."""
from satpy.tests.utils import make_dataid
key = make_dataid(name="flash_extent_density")
res = self.reader.get_dataset(key, {"info": "info"})
exp = {"instrument_ID": None,
"modifiers": (),
"name": "flash_extent_density",
"orbital_parameters": {"projection_altitude": 1.0,
"projection_latitude": 0.0,
"projection_longitude": -90.0,
# 'satellite_nominal_altitude': 35786.02,
"satellite_nominal_latitude": 0.0,
"satellite_nominal_longitude": -89.5},
"orbital_slot": None,
"platform_name": "GOES-16",
"platform_shortname": "G16",
"production_site": None,
"scan_mode": "M3",
"scene_abbr": "C",
"scene_id": None,
"spatial_resolution": "2km at nadir",
"sensor": "glm",
"timeline_ID": None,
"grid_mapping": "goes_imager_projection",
"standard_name": "flash_extent_density",
"long_name": "Flash extent density",
"units": "Count per nominal 3136 microradian^2 pixel per 1.0 min"}
assert res.attrs == exp
def test_get_dataset_dqf(self):
"""Test the get_dataset method with special DQF var."""
from satpy.tests.utils import make_dataid
key = make_dataid(name="DQF")
res = self.reader.get_dataset(key, {"info": "info"})
exp = {"instrument_ID": None,
"modifiers": (),
"name": "DQF",
"orbital_parameters": {"projection_altitude": 1.0,
"projection_latitude": 0.0,
"projection_longitude": -90.0,
# 'satellite_nominal_altitude': 35786.02,
"satellite_nominal_latitude": 0.0,
"satellite_nominal_longitude": -89.5},
"orbital_slot": None,
"platform_name": "GOES-16",
"platform_shortname": "G16",
"production_site": None,
"scan_mode": "M3",
"scene_abbr": "C",
"scene_id": None,
"spatial_resolution": "2km at nadir",
"sensor": "glm",
"timeline_ID": None,
"grid_mapping": "goes_imager_projection",
"units": "1",
"_FillValue": -1,
"standard_name": "status_flag",
"long_name": "GLM data quality flags",
"flag_meanings": "valid invalid"}
assert res.attrs == exp
assert np.issubdtype(res.dtype, np.integer)
class TestGLML2Reader(unittest.TestCase):
"""Test high-level reading functionality of GLM L2 reader."""
yaml_file = "glm_l2.yaml"
@mock.patch("satpy.readers.abi_base.xr")
def setUp(self, xr_):
"""Create a fake reader to test."""
from satpy._config import config_search_paths
from satpy.readers import load_reader
self.reader_configs = config_search_paths(os.path.join("readers", self.yaml_file))
fake_dataset = setup_fake_dataset()
xr_.open_dataset.return_value = fake_dataset
r = load_reader(self.reader_configs)
loadables = r.select_files_from_pathnames([
"your_sha256_hash00350.nc",
"your_sha256_hash2862200350.nc",
])
assert len(loadables) == 2
r.create_filehandlers(loadables)
self.reader = r
def test_available_datasets(self):
"""Test that resolution is added to YAML configured variables."""
# make sure we have some files
assert self.reader.file_handlers
available_datasets = list(self.reader.available_dataset_ids)
# flash_extent_density, DQF, and not_configured are available in our tests
assert len(available_datasets) == 3
for ds_id in available_datasets:
assert ds_id["resolution"] == 2000
# make sure not_configured was discovered
names = [dataid["name"] for dataid in available_datasets]
assert "not_configured" in names
```
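The pattern these tests rely on, patching the module-level `xr` reference so `open_dataset` returns an in-memory stand-in instead of touching disk, can be sketched in isolation. This is a minimal illustration using only the standard library; `FakeDataset` and `load_var` are hypothetical names for this sketch, not satpy APIs:

```python
from unittest import mock


class FakeDataset:
    """Stand-in for the object that xr.open_dataset would normally return."""

    def __init__(self, data):
        self._data = data

    def __getitem__(self, key):
        return self._data[key]


def load_var(opener, path, name):
    # A hypothetical loader: in satpy, `opener` would be xr.open_dataset.
    return opener(path)[name]


fake = FakeDataset({"flash_extent_density": [1, 2, 3]})
opener = mock.MagicMock(return_value=fake)

# No file named "no_such_file.nc" is ever opened: the mock short-circuits I/O.
value = load_var(opener, "no_such_file.nc", "flash_extent_density")
assert value == [1, 2, 3]
opener.assert_called_once_with("no_such_file.nc")
```

The test classes above do the same thing one level up: `mock.patch("satpy.readers.abi_base.xr")` swaps the whole module object, and setting `xr_.open_dataset.return_value` plays the role of `opener` here.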
|
```c++
//===- DeclTemplate.cpp - Template Declaration AST Node Implementation ----===//
//
// Part of the LLVM Project, under the Apache License v2.0 with LLVM Exceptions.
// See path_to_url for license information.
// SPDX-License-Identifier: Apache-2.0 WITH LLVM-exception
//
//===----------------------------------------------------------------------===//
//
// This file implements the C++ related Decl classes for templates.
//
//===----------------------------------------------------------------------===//
#include "clang/AST/DeclTemplate.h"
#include "clang/AST/ASTContext.h"
#include "clang/AST/ASTMutationListener.h"
#include "clang/AST/DeclCXX.h"
#include "clang/AST/DeclarationName.h"
#include "clang/AST/Expr.h"
#include "clang/AST/ExternalASTSource.h"
#include "clang/AST/TemplateBase.h"
#include "clang/AST/TemplateName.h"
#include "clang/AST/Type.h"
#include "clang/AST/TypeLoc.h"
#include "clang/Basic/Builtins.h"
#include "clang/Basic/LLVM.h"
#include "clang/Basic/SourceLocation.h"
#include "llvm/ADT/ArrayRef.h"
#include "llvm/ADT/FoldingSet.h"
#include "llvm/ADT/PointerUnion.h"
#include "llvm/ADT/STLExtras.h"
#include "llvm/ADT/SmallVector.h"
#include "llvm/Support/Casting.h"
#include "llvm/Support/ErrorHandling.h"
#include <algorithm>
#include <cassert>
#include <cstdint>
#include <memory>
#include <optional>
#include <utility>
using namespace clang;
//===----------------------------------------------------------------------===//
// TemplateParameterList Implementation
//===----------------------------------------------------------------------===//
TemplateParameterList::TemplateParameterList(const ASTContext& C,
SourceLocation TemplateLoc,
SourceLocation LAngleLoc,
ArrayRef<NamedDecl *> Params,
SourceLocation RAngleLoc,
Expr *RequiresClause)
: TemplateLoc(TemplateLoc), LAngleLoc(LAngleLoc), RAngleLoc(RAngleLoc),
NumParams(Params.size()), ContainsUnexpandedParameterPack(false),
HasRequiresClause(RequiresClause != nullptr),
HasConstrainedParameters(false) {
for (unsigned Idx = 0; Idx < NumParams; ++Idx) {
NamedDecl *P = Params[Idx];
begin()[Idx] = P;
bool IsPack = P->isTemplateParameterPack();
if (const auto *NTTP = dyn_cast<NonTypeTemplateParmDecl>(P)) {
if (!IsPack && NTTP->getType()->containsUnexpandedParameterPack())
ContainsUnexpandedParameterPack = true;
if (NTTP->hasPlaceholderTypeConstraint())
HasConstrainedParameters = true;
} else if (const auto *TTP = dyn_cast<TemplateTemplateParmDecl>(P)) {
if (!IsPack &&
TTP->getTemplateParameters()->containsUnexpandedParameterPack())
ContainsUnexpandedParameterPack = true;
} else if (const auto *TTP = dyn_cast<TemplateTypeParmDecl>(P)) {
if (const TypeConstraint *TC = TTP->getTypeConstraint()) {
if (TC->getImmediatelyDeclaredConstraint()
->containsUnexpandedParameterPack())
ContainsUnexpandedParameterPack = true;
}
if (TTP->hasTypeConstraint())
HasConstrainedParameters = true;
} else {
llvm_unreachable("unexpected template parameter type");
}
// FIXME: If a default argument contains an unexpanded parameter pack, the
// template parameter list does too.
}
if (HasRequiresClause) {
if (RequiresClause->containsUnexpandedParameterPack())
ContainsUnexpandedParameterPack = true;
*getTrailingObjects<Expr *>() = RequiresClause;
}
}
bool TemplateParameterList::containsUnexpandedParameterPack() const {
if (ContainsUnexpandedParameterPack)
return true;
if (!HasConstrainedParameters)
return false;
// An implicit constrained parameter might have had a use of an unexpanded
// pack added to it after the template parameter list was created. All
// implicit parameters are at the end of the parameter list.
for (const NamedDecl *Param : llvm::reverse(asArray())) {
if (!Param->isImplicit())
break;
if (const auto *TTP = dyn_cast<TemplateTypeParmDecl>(Param)) {
const auto *TC = TTP->getTypeConstraint();
if (TC && TC->getImmediatelyDeclaredConstraint()
->containsUnexpandedParameterPack())
return true;
}
}
return false;
}
TemplateParameterList *
TemplateParameterList::Create(const ASTContext &C, SourceLocation TemplateLoc,
SourceLocation LAngleLoc,
ArrayRef<NamedDecl *> Params,
SourceLocation RAngleLoc, Expr *RequiresClause) {
void *Mem = C.Allocate(totalSizeToAlloc<NamedDecl *, Expr *>(
Params.size(), RequiresClause ? 1u : 0u),
alignof(TemplateParameterList));
return new (Mem) TemplateParameterList(C, TemplateLoc, LAngleLoc, Params,
RAngleLoc, RequiresClause);
}
unsigned TemplateParameterList::getMinRequiredArguments() const {
unsigned NumRequiredArgs = 0;
for (const NamedDecl *P : asArray()) {
if (P->isTemplateParameterPack()) {
if (std::optional<unsigned> Expansions = getExpandedPackSize(P)) {
NumRequiredArgs += *Expansions;
continue;
}
break;
}
if (const auto *TTP = dyn_cast<TemplateTypeParmDecl>(P)) {
if (TTP->hasDefaultArgument())
break;
} else if (const auto *NTTP = dyn_cast<NonTypeTemplateParmDecl>(P)) {
if (NTTP->hasDefaultArgument())
break;
} else if (cast<TemplateTemplateParmDecl>(P)->hasDefaultArgument())
break;
++NumRequiredArgs;
}
return NumRequiredArgs;
}
unsigned TemplateParameterList::getDepth() const {
if (size() == 0)
return 0;
const NamedDecl *FirstParm = getParam(0);
if (const auto *TTP = dyn_cast<TemplateTypeParmDecl>(FirstParm))
return TTP->getDepth();
else if (const auto *NTTP = dyn_cast<NonTypeTemplateParmDecl>(FirstParm))
return NTTP->getDepth();
else
return cast<TemplateTemplateParmDecl>(FirstParm)->getDepth();
}
static bool AdoptTemplateParameterList(TemplateParameterList *Params,
DeclContext *Owner) {
bool Invalid = false;
for (NamedDecl *P : *Params) {
P->setDeclContext(Owner);
if (const auto *TTP = dyn_cast<TemplateTemplateParmDecl>(P))
if (AdoptTemplateParameterList(TTP->getTemplateParameters(), Owner))
Invalid = true;
if (P->isInvalidDecl())
Invalid = true;
}
return Invalid;
}
void TemplateParameterList::
getAssociatedConstraints(llvm::SmallVectorImpl<const Expr *> &AC) const {
if (HasConstrainedParameters)
for (const NamedDecl *Param : *this) {
if (const auto *TTP = dyn_cast<TemplateTypeParmDecl>(Param)) {
if (const auto *TC = TTP->getTypeConstraint())
AC.push_back(TC->getImmediatelyDeclaredConstraint());
} else if (const auto *NTTP = dyn_cast<NonTypeTemplateParmDecl>(Param)) {
if (const Expr *E = NTTP->getPlaceholderTypeConstraint())
AC.push_back(E);
}
}
if (HasRequiresClause)
AC.push_back(getRequiresClause());
}
bool TemplateParameterList::hasAssociatedConstraints() const {
return HasRequiresClause || HasConstrainedParameters;
}
bool TemplateParameterList::shouldIncludeTypeForArgument(
const PrintingPolicy &Policy, const TemplateParameterList *TPL,
unsigned Idx) {
if (!TPL || Idx >= TPL->size() || Policy.AlwaysIncludeTypeForTemplateArgument)
return true;
const NamedDecl *TemplParam = TPL->getParam(Idx);
if (const auto *ParamValueDecl =
dyn_cast<NonTypeTemplateParmDecl>(TemplParam))
if (ParamValueDecl->getType()->getContainedDeducedType())
return true;
return false;
}
namespace clang {
void *allocateDefaultArgStorageChain(const ASTContext &C) {
return new (C) char[sizeof(void*) * 2];
}
} // namespace clang
//===----------------------------------------------------------------------===//
// TemplateDecl Implementation
//===----------------------------------------------------------------------===//
TemplateDecl::TemplateDecl(Kind DK, DeclContext *DC, SourceLocation L,
DeclarationName Name, TemplateParameterList *Params,
NamedDecl *Decl)
: NamedDecl(DK, DC, L, Name), TemplatedDecl(Decl), TemplateParams(Params) {}
void TemplateDecl::anchor() {}
void TemplateDecl::
getAssociatedConstraints(llvm::SmallVectorImpl<const Expr *> &AC) const {
TemplateParams->getAssociatedConstraints(AC);
if (auto *FD = dyn_cast_or_null<FunctionDecl>(getTemplatedDecl()))
if (const Expr *TRC = FD->getTrailingRequiresClause())
AC.push_back(TRC);
}
bool TemplateDecl::hasAssociatedConstraints() const {
if (TemplateParams->hasAssociatedConstraints())
return true;
if (auto *FD = dyn_cast_or_null<FunctionDecl>(getTemplatedDecl()))
return FD->getTrailingRequiresClause();
return false;
}
bool TemplateDecl::isTypeAlias() const {
switch (getKind()) {
case TemplateDecl::TypeAliasTemplate:
case TemplateDecl::BuiltinTemplate:
return true;
default:
return false;
};
}
//===----------------------------------------------------------------------===//
// RedeclarableTemplateDecl Implementation
//===----------------------------------------------------------------------===//
void RedeclarableTemplateDecl::anchor() {}
RedeclarableTemplateDecl::CommonBase *
RedeclarableTemplateDecl::getCommonPtr() const {
if (Common)
return Common;
// Walk the previous-declaration chain until we either find a declaration
// with a common pointer or we run out of previous declarations.
SmallVector<const RedeclarableTemplateDecl *, 2> PrevDecls;
for (const RedeclarableTemplateDecl *Prev = getPreviousDecl(); Prev;
Prev = Prev->getPreviousDecl()) {
if (Prev->Common) {
Common = Prev->Common;
break;
}
PrevDecls.push_back(Prev);
}
// If we never found a common pointer, allocate one now.
if (!Common) {
// FIXME: If any of the declarations is from an AST file, we probably
// need an update record to add the common data.
Common = newCommon(getASTContext());
}
// Update any previous declarations we saw with the common pointer.
for (const RedeclarableTemplateDecl *Prev : PrevDecls)
Prev->Common = Common;
return Common;
}
void RedeclarableTemplateDecl::loadLazySpecializationsImpl() const {
// Grab the most recent declaration to ensure we've loaded any lazy
// redeclarations of this template.
CommonBase *CommonBasePtr = getMostRecentDecl()->getCommonPtr();
if (CommonBasePtr->LazySpecializations) {
ASTContext &Context = getASTContext();
uint32_t *Specs = CommonBasePtr->LazySpecializations;
CommonBasePtr->LazySpecializations = nullptr;
for (uint32_t I = 0, N = *Specs++; I != N; ++I)
(void)Context.getExternalSource()->GetExternalDecl(Specs[I]);
}
}
template<class EntryType, typename... ProfileArguments>
typename RedeclarableTemplateDecl::SpecEntryTraits<EntryType>::DeclType *
RedeclarableTemplateDecl::findSpecializationImpl(
llvm::FoldingSetVector<EntryType> &Specs, void *&InsertPos,
ProfileArguments&&... ProfileArgs) {
using SETraits = SpecEntryTraits<EntryType>;
llvm::FoldingSetNodeID ID;
EntryType::Profile(ID, std::forward<ProfileArguments>(ProfileArgs)...,
getASTContext());
EntryType *Entry = Specs.FindNodeOrInsertPos(ID, InsertPos);
return Entry ? SETraits::getDecl(Entry)->getMostRecentDecl() : nullptr;
}
template<class Derived, class EntryType>
void RedeclarableTemplateDecl::addSpecializationImpl(
llvm::FoldingSetVector<EntryType> &Specializations, EntryType *Entry,
void *InsertPos) {
using SETraits = SpecEntryTraits<EntryType>;
if (InsertPos) {
#ifndef NDEBUG
void *CorrectInsertPos;
assert(!findSpecializationImpl(Specializations,
CorrectInsertPos,
SETraits::getTemplateArgs(Entry)) &&
InsertPos == CorrectInsertPos &&
"given incorrect InsertPos for specialization");
#endif
Specializations.InsertNode(Entry, InsertPos);
} else {
EntryType *Existing = Specializations.GetOrInsertNode(Entry);
(void)Existing;
assert(SETraits::getDecl(Existing)->isCanonicalDecl() &&
"non-canonical specialization?");
}
if (ASTMutationListener *L = getASTMutationListener())
L->AddedCXXTemplateSpecialization(cast<Derived>(this),
SETraits::getDecl(Entry));
}
ArrayRef<TemplateArgument> RedeclarableTemplateDecl::getInjectedTemplateArgs() {
TemplateParameterList *Params = getTemplateParameters();
auto *CommonPtr = getCommonPtr();
if (!CommonPtr->InjectedArgs) {
auto &Context = getASTContext();
SmallVector<TemplateArgument, 16> TemplateArgs;
Context.getInjectedTemplateArgs(Params, TemplateArgs);
CommonPtr->InjectedArgs =
new (Context) TemplateArgument[TemplateArgs.size()];
std::copy(TemplateArgs.begin(), TemplateArgs.end(),
CommonPtr->InjectedArgs);
}
return llvm::ArrayRef(CommonPtr->InjectedArgs, Params->size());
}
//===----------------------------------------------------------------------===//
// FunctionTemplateDecl Implementation
//===----------------------------------------------------------------------===//
FunctionTemplateDecl *
FunctionTemplateDecl::Create(ASTContext &C, DeclContext *DC, SourceLocation L,
DeclarationName Name,
TemplateParameterList *Params, NamedDecl *Decl) {
bool Invalid = AdoptTemplateParameterList(Params, cast<DeclContext>(Decl));
auto *TD = new (C, DC) FunctionTemplateDecl(C, DC, L, Name, Params, Decl);
if (Invalid)
TD->setInvalidDecl();
return TD;
}
FunctionTemplateDecl *FunctionTemplateDecl::CreateDeserialized(ASTContext &C,
unsigned ID) {
return new (C, ID) FunctionTemplateDecl(C, nullptr, SourceLocation(),
DeclarationName(), nullptr, nullptr);
}
RedeclarableTemplateDecl::CommonBase *
FunctionTemplateDecl::newCommon(ASTContext &C) const {
auto *CommonPtr = new (C) Common;
C.addDestruction(CommonPtr);
return CommonPtr;
}
void FunctionTemplateDecl::LoadLazySpecializations() const {
loadLazySpecializationsImpl();
}
llvm::FoldingSetVector<FunctionTemplateSpecializationInfo> &
FunctionTemplateDecl::getSpecializations() const {
LoadLazySpecializations();
return getCommonPtr()->Specializations;
}
FunctionDecl *
FunctionTemplateDecl::findSpecialization(ArrayRef<TemplateArgument> Args,
void *&InsertPos) {
return findSpecializationImpl(getSpecializations(), InsertPos, Args);
}
void FunctionTemplateDecl::addSpecialization(
FunctionTemplateSpecializationInfo *Info, void *InsertPos) {
addSpecializationImpl<FunctionTemplateDecl>(getSpecializations(), Info,
InsertPos);
}
void FunctionTemplateDecl::mergePrevDecl(FunctionTemplateDecl *Prev) {
using Base = RedeclarableTemplateDecl;
// If we haven't created a common pointer yet, then it can just be created
// with the usual method.
if (!Base::Common)
return;
Common *ThisCommon = static_cast<Common *>(Base::Common);
Common *PrevCommon = nullptr;
SmallVector<FunctionTemplateDecl *, 8> PreviousDecls;
for (; Prev; Prev = Prev->getPreviousDecl()) {
if (Prev->Base::Common) {
PrevCommon = static_cast<Common *>(Prev->Base::Common);
break;
}
PreviousDecls.push_back(Prev);
}
// If the previous redecl chain hasn't created a common pointer yet, then just
// use this common pointer.
if (!PrevCommon) {
for (auto *D : PreviousDecls)
D->Base::Common = ThisCommon;
return;
}
// Ensure we don't leak any important state.
assert(ThisCommon->Specializations.size() == 0 &&
"Can't merge incompatible declarations!");
Base::Common = PrevCommon;
}
//===----------------------------------------------------------------------===//
// ClassTemplateDecl Implementation
//===----------------------------------------------------------------------===//
ClassTemplateDecl *ClassTemplateDecl::Create(ASTContext &C, DeclContext *DC,
SourceLocation L,
DeclarationName Name,
TemplateParameterList *Params,
NamedDecl *Decl) {
bool Invalid = AdoptTemplateParameterList(Params, cast<DeclContext>(Decl));
auto *TD = new (C, DC) ClassTemplateDecl(C, DC, L, Name, Params, Decl);
if (Invalid)
TD->setInvalidDecl();
return TD;
}
ClassTemplateDecl *ClassTemplateDecl::CreateDeserialized(ASTContext &C,
unsigned ID) {
return new (C, ID) ClassTemplateDecl(C, nullptr, SourceLocation(),
DeclarationName(), nullptr, nullptr);
}
void ClassTemplateDecl::LoadLazySpecializations() const {
loadLazySpecializationsImpl();
}
llvm::FoldingSetVector<ClassTemplateSpecializationDecl> &
ClassTemplateDecl::getSpecializations() const {
LoadLazySpecializations();
return getCommonPtr()->Specializations;
}
llvm::FoldingSetVector<ClassTemplatePartialSpecializationDecl> &
ClassTemplateDecl::getPartialSpecializations() const {
LoadLazySpecializations();
return getCommonPtr()->PartialSpecializations;
}
RedeclarableTemplateDecl::CommonBase *
ClassTemplateDecl::newCommon(ASTContext &C) const {
auto *CommonPtr = new (C) Common;
C.addDestruction(CommonPtr);
return CommonPtr;
}
ClassTemplateSpecializationDecl *
ClassTemplateDecl::findSpecialization(ArrayRef<TemplateArgument> Args,
void *&InsertPos) {
return findSpecializationImpl(getSpecializations(), InsertPos, Args);
}
void ClassTemplateDecl::AddSpecialization(ClassTemplateSpecializationDecl *D,
void *InsertPos) {
addSpecializationImpl<ClassTemplateDecl>(getSpecializations(), D, InsertPos);
}
ClassTemplatePartialSpecializationDecl *
ClassTemplateDecl::findPartialSpecialization(
ArrayRef<TemplateArgument> Args,
TemplateParameterList *TPL, void *&InsertPos) {
return findSpecializationImpl(getPartialSpecializations(), InsertPos, Args,
TPL);
}
static void ProfileTemplateParameterList(ASTContext &C,
llvm::FoldingSetNodeID &ID, const TemplateParameterList *TPL) {
const Expr *RC = TPL->getRequiresClause();
ID.AddBoolean(RC != nullptr);
if (RC)
RC->Profile(ID, C, /*Canonical=*/true);
ID.AddInteger(TPL->size());
for (NamedDecl *D : *TPL) {
if (const auto *NTTP = dyn_cast<NonTypeTemplateParmDecl>(D)) {
ID.AddInteger(0);
ID.AddBoolean(NTTP->isParameterPack());
NTTP->getType().getCanonicalType().Profile(ID);
ID.AddBoolean(NTTP->hasPlaceholderTypeConstraint());
if (const Expr *E = NTTP->getPlaceholderTypeConstraint())
E->Profile(ID, C, /*Canonical=*/true);
continue;
}
if (const auto *TTP = dyn_cast<TemplateTypeParmDecl>(D)) {
ID.AddInteger(1);
ID.AddBoolean(TTP->isParameterPack());
ID.AddBoolean(TTP->hasTypeConstraint());
if (const TypeConstraint *TC = TTP->getTypeConstraint())
TC->getImmediatelyDeclaredConstraint()->Profile(ID, C,
/*Canonical=*/true);
continue;
}
const auto *TTP = cast<TemplateTemplateParmDecl>(D);
ID.AddInteger(2);
ID.AddBoolean(TTP->isParameterPack());
ProfileTemplateParameterList(C, ID, TTP->getTemplateParameters());
}
}
void
ClassTemplatePartialSpecializationDecl::Profile(llvm::FoldingSetNodeID &ID,
ArrayRef<TemplateArgument> TemplateArgs, TemplateParameterList *TPL,
ASTContext &Context) {
ID.AddInteger(TemplateArgs.size());
for (const TemplateArgument &TemplateArg : TemplateArgs)
TemplateArg.Profile(ID, Context);
ProfileTemplateParameterList(Context, ID, TPL);
}
void ClassTemplateDecl::AddPartialSpecialization(
ClassTemplatePartialSpecializationDecl *D,
void *InsertPos) {
if (InsertPos)
getPartialSpecializations().InsertNode(D, InsertPos);
else {
ClassTemplatePartialSpecializationDecl *Existing
= getPartialSpecializations().GetOrInsertNode(D);
(void)Existing;
assert(Existing->isCanonicalDecl() && "Non-canonical specialization?");
}
if (ASTMutationListener *L = getASTMutationListener())
L->AddedCXXTemplateSpecialization(this, D);
}
void ClassTemplateDecl::getPartialSpecializations(
SmallVectorImpl<ClassTemplatePartialSpecializationDecl *> &PS) const {
llvm::FoldingSetVector<ClassTemplatePartialSpecializationDecl> &PartialSpecs
= getPartialSpecializations();
PS.clear();
PS.reserve(PartialSpecs.size());
for (ClassTemplatePartialSpecializationDecl &P : PartialSpecs)
PS.push_back(P.getMostRecentDecl());
}
ClassTemplatePartialSpecializationDecl *
ClassTemplateDecl::findPartialSpecialization(QualType T) {
ASTContext &Context = getASTContext();
for (ClassTemplatePartialSpecializationDecl &P :
getPartialSpecializations()) {
if (Context.hasSameType(P.getInjectedSpecializationType(), T))
return P.getMostRecentDecl();
}
return nullptr;
}
ClassTemplatePartialSpecializationDecl *
ClassTemplateDecl::findPartialSpecInstantiatedFromMember(
ClassTemplatePartialSpecializationDecl *D) {
Decl *DCanon = D->getCanonicalDecl();
for (ClassTemplatePartialSpecializationDecl &P : getPartialSpecializations()) {
if (P.getInstantiatedFromMember()->getCanonicalDecl() == DCanon)
return P.getMostRecentDecl();
}
return nullptr;
}
QualType
ClassTemplateDecl::getInjectedClassNameSpecialization() {
Common *CommonPtr = getCommonPtr();
if (!CommonPtr->InjectedClassNameType.isNull())
return CommonPtr->InjectedClassNameType;
// C++0x [temp.dep.type]p2:
// The template argument list of a primary template is a template argument
// list in which the nth template argument has the value of the nth template
// parameter of the class template. If the nth template parameter is a
// template parameter pack (14.5.3), the nth template argument is a pack
// expansion (14.5.3) whose pattern is the name of the template parameter
// pack.
ASTContext &Context = getASTContext();
TemplateParameterList *Params = getTemplateParameters();
SmallVector<TemplateArgument, 16> TemplateArgs;
Context.getInjectedTemplateArgs(Params, TemplateArgs);
CommonPtr->InjectedClassNameType
= Context.getTemplateSpecializationType(TemplateName(this),
TemplateArgs);
return CommonPtr->InjectedClassNameType;
}
//===----------------------------------------------------------------------===//
// TemplateTypeParm Allocation/Deallocation Method Implementations
//===----------------------------------------------------------------------===//
TemplateTypeParmDecl *TemplateTypeParmDecl::Create(
const ASTContext &C, DeclContext *DC, SourceLocation KeyLoc,
SourceLocation NameLoc, unsigned D, unsigned P, IdentifierInfo *Id,
bool Typename, bool ParameterPack, bool HasTypeConstraint,
std::optional<unsigned> NumExpanded) {
auto *TTPDecl =
new (C, DC,
additionalSizeToAlloc<TypeConstraint>(HasTypeConstraint ? 1 : 0))
TemplateTypeParmDecl(DC, KeyLoc, NameLoc, Id, Typename,
HasTypeConstraint, NumExpanded);
QualType TTPType = C.getTemplateTypeParmType(D, P, ParameterPack, TTPDecl);
TTPDecl->setTypeForDecl(TTPType.getTypePtr());
return TTPDecl;
}
TemplateTypeParmDecl *
TemplateTypeParmDecl::CreateDeserialized(const ASTContext &C, unsigned ID) {
return new (C, ID)
TemplateTypeParmDecl(nullptr, SourceLocation(), SourceLocation(), nullptr,
false, false, std::nullopt);
}
TemplateTypeParmDecl *
TemplateTypeParmDecl::CreateDeserialized(const ASTContext &C, unsigned ID,
bool HasTypeConstraint) {
return new (C, ID,
additionalSizeToAlloc<TypeConstraint>(HasTypeConstraint ? 1 : 0))
TemplateTypeParmDecl(nullptr, SourceLocation(), SourceLocation(), nullptr,
false, HasTypeConstraint, std::nullopt);
}
SourceLocation TemplateTypeParmDecl::getDefaultArgumentLoc() const {
return hasDefaultArgument()
? getDefaultArgumentInfo()->getTypeLoc().getBeginLoc()
: SourceLocation();
}
SourceRange TemplateTypeParmDecl::getSourceRange() const {
if (hasDefaultArgument() && !defaultArgumentWasInherited())
return SourceRange(getBeginLoc(),
getDefaultArgumentInfo()->getTypeLoc().getEndLoc());
// TypeDecl::getSourceRange returns a range containing name location, which is
// wrong for unnamed template parameters. e.g:
// it will return <[[typename>]] instead of <[[typename]]>
else if (getDeclName().isEmpty())
return SourceRange(getBeginLoc());
return TypeDecl::getSourceRange();
}
unsigned TemplateTypeParmDecl::getDepth() const {
return getTypeForDecl()->castAs<TemplateTypeParmType>()->getDepth();
}
unsigned TemplateTypeParmDecl::getIndex() const {
return getTypeForDecl()->castAs<TemplateTypeParmType>()->getIndex();
}
bool TemplateTypeParmDecl::isParameterPack() const {
return getTypeForDecl()->castAs<TemplateTypeParmType>()->isParameterPack();
}
void TemplateTypeParmDecl::setTypeConstraint(NestedNameSpecifierLoc NNS,
DeclarationNameInfo NameInfo, NamedDecl *FoundDecl, ConceptDecl *CD,
const ASTTemplateArgumentListInfo *ArgsAsWritten,
Expr *ImmediatelyDeclaredConstraint) {
assert(HasTypeConstraint &&
"HasTypeConstraint=true must be passed at construction in order to "
"call setTypeConstraint");
assert(!TypeConstraintInitialized &&
"TypeConstraint was already initialized!");
new (getTrailingObjects<TypeConstraint>()) TypeConstraint(NNS, NameInfo,
FoundDecl, CD, ArgsAsWritten, ImmediatelyDeclaredConstraint);
TypeConstraintInitialized = true;
}
//===----------------------------------------------------------------------===//
// NonTypeTemplateParmDecl Method Implementations
//===----------------------------------------------------------------------===//
NonTypeTemplateParmDecl::NonTypeTemplateParmDecl(
DeclContext *DC, SourceLocation StartLoc, SourceLocation IdLoc, unsigned D,
unsigned P, IdentifierInfo *Id, QualType T, TypeSourceInfo *TInfo,
ArrayRef<QualType> ExpandedTypes, ArrayRef<TypeSourceInfo *> ExpandedTInfos)
: DeclaratorDecl(NonTypeTemplateParm, DC, IdLoc, Id, T, TInfo, StartLoc),
TemplateParmPosition(D, P), ParameterPack(true),
ExpandedParameterPack(true), NumExpandedTypes(ExpandedTypes.size()) {
if (!ExpandedTypes.empty() && !ExpandedTInfos.empty()) {
auto TypesAndInfos =
getTrailingObjects<std::pair<QualType, TypeSourceInfo *>>();
for (unsigned I = 0; I != NumExpandedTypes; ++I) {
new (&TypesAndInfos[I].first) QualType(ExpandedTypes[I]);
TypesAndInfos[I].second = ExpandedTInfos[I];
}
}
}
NonTypeTemplateParmDecl *
NonTypeTemplateParmDecl::Create(const ASTContext &C, DeclContext *DC,
SourceLocation StartLoc, SourceLocation IdLoc,
unsigned D, unsigned P, IdentifierInfo *Id,
QualType T, bool ParameterPack,
TypeSourceInfo *TInfo) {
AutoType *AT =
C.getLangOpts().CPlusPlus20 ? T->getContainedAutoType() : nullptr;
return new (C, DC,
additionalSizeToAlloc<std::pair<QualType, TypeSourceInfo *>,
Expr *>(0,
AT && AT->isConstrained() ? 1 : 0))
NonTypeTemplateParmDecl(DC, StartLoc, IdLoc, D, P, Id, T, ParameterPack,
TInfo);
}
NonTypeTemplateParmDecl *NonTypeTemplateParmDecl::Create(
const ASTContext &C, DeclContext *DC, SourceLocation StartLoc,
SourceLocation IdLoc, unsigned D, unsigned P, IdentifierInfo *Id,
QualType T, TypeSourceInfo *TInfo, ArrayRef<QualType> ExpandedTypes,
ArrayRef<TypeSourceInfo *> ExpandedTInfos) {
AutoType *AT = TInfo->getType()->getContainedAutoType();
return new (C, DC,
additionalSizeToAlloc<std::pair<QualType, TypeSourceInfo *>,
Expr *>(
ExpandedTypes.size(), AT && AT->isConstrained() ? 1 : 0))
NonTypeTemplateParmDecl(DC, StartLoc, IdLoc, D, P, Id, T, TInfo,
ExpandedTypes, ExpandedTInfos);
}
NonTypeTemplateParmDecl *
NonTypeTemplateParmDecl::CreateDeserialized(ASTContext &C, unsigned ID,
bool HasTypeConstraint) {
return new (C, ID, additionalSizeToAlloc<std::pair<QualType,
TypeSourceInfo *>,
Expr *>(0,
HasTypeConstraint ? 1 : 0))
NonTypeTemplateParmDecl(nullptr, SourceLocation(), SourceLocation(),
0, 0, nullptr, QualType(), false, nullptr);
}
NonTypeTemplateParmDecl *
NonTypeTemplateParmDecl::CreateDeserialized(ASTContext &C, unsigned ID,
unsigned NumExpandedTypes,
bool HasTypeConstraint) {
auto *NTTP =
new (C, ID,
additionalSizeToAlloc<std::pair<QualType, TypeSourceInfo *>, Expr *>(
NumExpandedTypes, HasTypeConstraint ? 1 : 0))
NonTypeTemplateParmDecl(nullptr, SourceLocation(), SourceLocation(),
0, 0, nullptr, QualType(), nullptr,
std::nullopt, std::nullopt);
NTTP->NumExpandedTypes = NumExpandedTypes;
return NTTP;
}
SourceRange NonTypeTemplateParmDecl::getSourceRange() const {
if (hasDefaultArgument() && !defaultArgumentWasInherited())
return SourceRange(getOuterLocStart(),
getDefaultArgument()->getSourceRange().getEnd());
return DeclaratorDecl::getSourceRange();
}
SourceLocation NonTypeTemplateParmDecl::getDefaultArgumentLoc() const {
return hasDefaultArgument()
? getDefaultArgument()->getSourceRange().getBegin()
: SourceLocation();
}
//===----------------------------------------------------------------------===//
// TemplateTemplateParmDecl Method Implementations
//===----------------------------------------------------------------------===//
void TemplateTemplateParmDecl::anchor() {}
TemplateTemplateParmDecl::TemplateTemplateParmDecl(
DeclContext *DC, SourceLocation L, unsigned D, unsigned P,
IdentifierInfo *Id, TemplateParameterList *Params,
ArrayRef<TemplateParameterList *> Expansions)
: TemplateDecl(TemplateTemplateParm, DC, L, Id, Params),
TemplateParmPosition(D, P), ParameterPack(true),
ExpandedParameterPack(true), NumExpandedParams(Expansions.size()) {
if (!Expansions.empty())
std::uninitialized_copy(Expansions.begin(), Expansions.end(),
getTrailingObjects<TemplateParameterList *>());
}
TemplateTemplateParmDecl *
TemplateTemplateParmDecl::Create(const ASTContext &C, DeclContext *DC,
SourceLocation L, unsigned D, unsigned P,
bool ParameterPack, IdentifierInfo *Id,
TemplateParameterList *Params) {
return new (C, DC) TemplateTemplateParmDecl(DC, L, D, P, ParameterPack, Id,
Params);
}
TemplateTemplateParmDecl *
TemplateTemplateParmDecl::Create(const ASTContext &C, DeclContext *DC,
SourceLocation L, unsigned D, unsigned P,
IdentifierInfo *Id,
TemplateParameterList *Params,
ArrayRef<TemplateParameterList *> Expansions) {
return new (C, DC,
additionalSizeToAlloc<TemplateParameterList *>(Expansions.size()))
TemplateTemplateParmDecl(DC, L, D, P, Id, Params, Expansions);
}
TemplateTemplateParmDecl *
TemplateTemplateParmDecl::CreateDeserialized(ASTContext &C, unsigned ID) {
return new (C, ID) TemplateTemplateParmDecl(nullptr, SourceLocation(), 0, 0,
false, nullptr, nullptr);
}
TemplateTemplateParmDecl *
TemplateTemplateParmDecl::CreateDeserialized(ASTContext &C, unsigned ID,
unsigned NumExpansions) {
auto *TTP =
new (C, ID, additionalSizeToAlloc<TemplateParameterList *>(NumExpansions))
TemplateTemplateParmDecl(nullptr, SourceLocation(), 0, 0, nullptr,
nullptr, std::nullopt);
TTP->NumExpandedParams = NumExpansions;
return TTP;
}
SourceLocation TemplateTemplateParmDecl::getDefaultArgumentLoc() const {
return hasDefaultArgument() ? getDefaultArgument().getLocation()
: SourceLocation();
}
void TemplateTemplateParmDecl::setDefaultArgument(
const ASTContext &C, const TemplateArgumentLoc &DefArg) {
if (DefArg.getArgument().isNull())
DefaultArgument.set(nullptr);
else
DefaultArgument.set(new (C) TemplateArgumentLoc(DefArg));
}
//===----------------------------------------------------------------------===//
// TemplateArgumentList Implementation
//===----------------------------------------------------------------------===//
TemplateArgumentList::TemplateArgumentList(ArrayRef<TemplateArgument> Args)
: Arguments(getTrailingObjects<TemplateArgument>()),
NumArguments(Args.size()) {
std::uninitialized_copy(Args.begin(), Args.end(),
getTrailingObjects<TemplateArgument>());
}
TemplateArgumentList *
TemplateArgumentList::CreateCopy(ASTContext &Context,
ArrayRef<TemplateArgument> Args) {
void *Mem = Context.Allocate(totalSizeToAlloc<TemplateArgument>(Args.size()));
return new (Mem) TemplateArgumentList(Args);
}
FunctionTemplateSpecializationInfo *FunctionTemplateSpecializationInfo::Create(
ASTContext &C, FunctionDecl *FD, FunctionTemplateDecl *Template,
TemplateSpecializationKind TSK, const TemplateArgumentList *TemplateArgs,
const TemplateArgumentListInfo *TemplateArgsAsWritten, SourceLocation POI,
MemberSpecializationInfo *MSInfo) {
const ASTTemplateArgumentListInfo *ArgsAsWritten = nullptr;
if (TemplateArgsAsWritten)
ArgsAsWritten = ASTTemplateArgumentListInfo::Create(C,
*TemplateArgsAsWritten);
void *Mem =
C.Allocate(totalSizeToAlloc<MemberSpecializationInfo *>(MSInfo ? 1 : 0));
return new (Mem) FunctionTemplateSpecializationInfo(
FD, Template, TSK, TemplateArgs, ArgsAsWritten, POI, MSInfo);
}
//===----------------------------------------------------------------------===//
// ClassTemplateSpecializationDecl Implementation
//===----------------------------------------------------------------------===//
ClassTemplateSpecializationDecl::
ClassTemplateSpecializationDecl(ASTContext &Context, Kind DK, TagKind TK,
DeclContext *DC, SourceLocation StartLoc,
SourceLocation IdLoc,
ClassTemplateDecl *SpecializedTemplate,
ArrayRef<TemplateArgument> Args,
ClassTemplateSpecializationDecl *PrevDecl)
: CXXRecordDecl(DK, TK, Context, DC, StartLoc, IdLoc,
SpecializedTemplate->getIdentifier(), PrevDecl),
SpecializedTemplate(SpecializedTemplate),
TemplateArgs(TemplateArgumentList::CreateCopy(Context, Args)),
SpecializationKind(TSK_Undeclared) {
}
ClassTemplateSpecializationDecl::ClassTemplateSpecializationDecl(ASTContext &C,
Kind DK)
: CXXRecordDecl(DK, TTK_Struct, C, nullptr, SourceLocation(),
SourceLocation(), nullptr, nullptr),
SpecializationKind(TSK_Undeclared) {}
ClassTemplateSpecializationDecl *
ClassTemplateSpecializationDecl::Create(ASTContext &Context, TagKind TK,
DeclContext *DC,
SourceLocation StartLoc,
SourceLocation IdLoc,
ClassTemplateDecl *SpecializedTemplate,
ArrayRef<TemplateArgument> Args,
ClassTemplateSpecializationDecl *PrevDecl) {
auto *Result =
new (Context, DC) ClassTemplateSpecializationDecl(
Context, ClassTemplateSpecialization, TK, DC, StartLoc, IdLoc,
SpecializedTemplate, Args, PrevDecl);
Result->setMayHaveOutOfDateDef(false);
// If the template decl is incomplete, copy the external lexical storage from
// the base template. This allows instantiations of incomplete types to
// complete using the external AST if the template's declaration came from an
// external AST.
if (!SpecializedTemplate->getTemplatedDecl()->isCompleteDefinition())
Result->setHasExternalLexicalStorage(
SpecializedTemplate->getTemplatedDecl()->hasExternalLexicalStorage());
Context.getTypeDeclType(Result, PrevDecl);
return Result;
}
ClassTemplateSpecializationDecl *
ClassTemplateSpecializationDecl::CreateDeserialized(ASTContext &C,
unsigned ID) {
auto *Result =
new (C, ID) ClassTemplateSpecializationDecl(C, ClassTemplateSpecialization);
Result->setMayHaveOutOfDateDef(false);
return Result;
}
void ClassTemplateSpecializationDecl::getNameForDiagnostic(
raw_ostream &OS, const PrintingPolicy &Policy, bool Qualified) const {
NamedDecl::getNameForDiagnostic(OS, Policy, Qualified);
const auto *PS = dyn_cast<ClassTemplatePartialSpecializationDecl>(this);
if (const ASTTemplateArgumentListInfo *ArgsAsWritten =
PS ? PS->getTemplateArgsAsWritten() : nullptr) {
printTemplateArgumentList(
OS, ArgsAsWritten->arguments(), Policy,
getSpecializedTemplate()->getTemplateParameters());
} else {
const TemplateArgumentList &TemplateArgs = getTemplateArgs();
printTemplateArgumentList(
OS, TemplateArgs.asArray(), Policy,
getSpecializedTemplate()->getTemplateParameters());
}
}
ClassTemplateDecl *
ClassTemplateSpecializationDecl::getSpecializedTemplate() const {
if (const auto *PartialSpec =
SpecializedTemplate.dyn_cast<SpecializedPartialSpecialization*>())
return PartialSpec->PartialSpecialization->getSpecializedTemplate();
return SpecializedTemplate.get<ClassTemplateDecl*>();
}
SourceRange
ClassTemplateSpecializationDecl::getSourceRange() const {
if (ExplicitInfo) {
SourceLocation Begin = getTemplateKeywordLoc();
if (Begin.isValid()) {
// Here we have an explicit (partial) specialization or instantiation.
assert(getSpecializationKind() == TSK_ExplicitSpecialization ||
getSpecializationKind() == TSK_ExplicitInstantiationDeclaration ||
getSpecializationKind() == TSK_ExplicitInstantiationDefinition);
if (getExternLoc().isValid())
Begin = getExternLoc();
SourceLocation End = getBraceRange().getEnd();
if (End.isInvalid())
End = getTypeAsWritten()->getTypeLoc().getEndLoc();
return SourceRange(Begin, End);
}
// An implicit instantiation of a class template partial specialization
// uses ExplicitInfo to record the TypeAsWritten, but the source
// locations should be retrieved from the instantiation pattern.
using CTPSDecl = ClassTemplatePartialSpecializationDecl;
auto *ctpsd = const_cast<CTPSDecl *>(cast<CTPSDecl>(this));
CTPSDecl *inst_from = ctpsd->getInstantiatedFromMember();
assert(inst_from != nullptr);
return inst_from->getSourceRange();
}
else {
// No explicit info available.
llvm::PointerUnion<ClassTemplateDecl *,
ClassTemplatePartialSpecializationDecl *>
inst_from = getInstantiatedFrom();
if (inst_from.isNull())
return getSpecializedTemplate()->getSourceRange();
if (const auto *ctd = inst_from.dyn_cast<ClassTemplateDecl *>())
return ctd->getSourceRange();
return inst_from.get<ClassTemplatePartialSpecializationDecl *>()
->getSourceRange();
}
}
//===----------------------------------------------------------------------===//
// ConceptDecl Implementation
//===----------------------------------------------------------------------===//
ConceptDecl *ConceptDecl::Create(ASTContext &C, DeclContext *DC,
SourceLocation L, DeclarationName Name,
TemplateParameterList *Params,
Expr *ConstraintExpr) {
bool Invalid = AdoptTemplateParameterList(Params, DC);
auto *TD = new (C, DC) ConceptDecl(DC, L, Name, Params, ConstraintExpr);
if (Invalid)
TD->setInvalidDecl();
return TD;
}
ConceptDecl *ConceptDecl::CreateDeserialized(ASTContext &C,
unsigned ID) {
ConceptDecl *Result = new (C, ID) ConceptDecl(nullptr, SourceLocation(),
DeclarationName(),
nullptr, nullptr);
return Result;
}
//===----------------------------------------------------------------------===//
// ImplicitConceptSpecializationDecl Implementation
//===----------------------------------------------------------------------===//
ImplicitConceptSpecializationDecl::ImplicitConceptSpecializationDecl(
DeclContext *DC, SourceLocation SL,
ArrayRef<TemplateArgument> ConvertedArgs)
: Decl(ImplicitConceptSpecialization, DC, SL),
NumTemplateArgs(ConvertedArgs.size()) {
setTemplateArguments(ConvertedArgs);
}
ImplicitConceptSpecializationDecl::ImplicitConceptSpecializationDecl(
EmptyShell Empty, unsigned NumTemplateArgs)
: Decl(ImplicitConceptSpecialization, Empty),
NumTemplateArgs(NumTemplateArgs) {}
ImplicitConceptSpecializationDecl *ImplicitConceptSpecializationDecl::Create(
const ASTContext &C, DeclContext *DC, SourceLocation SL,
ArrayRef<TemplateArgument> ConvertedArgs) {
return new (C, DC,
additionalSizeToAlloc<TemplateArgument>(ConvertedArgs.size()))
ImplicitConceptSpecializationDecl(DC, SL, ConvertedArgs);
}
ImplicitConceptSpecializationDecl *
ImplicitConceptSpecializationDecl::CreateDeserialized(
const ASTContext &C, unsigned ID, unsigned NumTemplateArgs) {
return new (C, ID, additionalSizeToAlloc<TemplateArgument>(NumTemplateArgs))
ImplicitConceptSpecializationDecl(EmptyShell{}, NumTemplateArgs);
}
void ImplicitConceptSpecializationDecl::setTemplateArguments(
ArrayRef<TemplateArgument> Converted) {
assert(Converted.size() == NumTemplateArgs);
std::uninitialized_copy(Converted.begin(), Converted.end(),
getTrailingObjects<TemplateArgument>());
}
//===----------------------------------------------------------------------===//
// ClassTemplatePartialSpecializationDecl Implementation
//===----------------------------------------------------------------------===//
void ClassTemplatePartialSpecializationDecl::anchor() {}
ClassTemplatePartialSpecializationDecl::
ClassTemplatePartialSpecializationDecl(ASTContext &Context, TagKind TK,
DeclContext *DC,
SourceLocation StartLoc,
SourceLocation IdLoc,
TemplateParameterList *Params,
ClassTemplateDecl *SpecializedTemplate,
ArrayRef<TemplateArgument> Args,
const ASTTemplateArgumentListInfo *ArgInfos,
ClassTemplatePartialSpecializationDecl *PrevDecl)
: ClassTemplateSpecializationDecl(Context,
ClassTemplatePartialSpecialization,
TK, DC, StartLoc, IdLoc,
SpecializedTemplate, Args, PrevDecl),
TemplateParams(Params), ArgsAsWritten(ArgInfos),
InstantiatedFromMember(nullptr, false) {
if (AdoptTemplateParameterList(Params, this))
setInvalidDecl();
}
ClassTemplatePartialSpecializationDecl *
ClassTemplatePartialSpecializationDecl::
Create(ASTContext &Context, TagKind TK, DeclContext *DC,
SourceLocation StartLoc, SourceLocation IdLoc,
TemplateParameterList *Params,
ClassTemplateDecl *SpecializedTemplate,
ArrayRef<TemplateArgument> Args,
const TemplateArgumentListInfo &ArgInfos,
QualType CanonInjectedType,
ClassTemplatePartialSpecializationDecl *PrevDecl) {
const ASTTemplateArgumentListInfo *ASTArgInfos =
ASTTemplateArgumentListInfo::Create(Context, ArgInfos);
auto *Result = new (Context, DC)
ClassTemplatePartialSpecializationDecl(Context, TK, DC, StartLoc, IdLoc,
Params, SpecializedTemplate, Args,
ASTArgInfos, PrevDecl);
Result->setSpecializationKind(TSK_ExplicitSpecialization);
Result->setMayHaveOutOfDateDef(false);
Context.getInjectedClassNameType(Result, CanonInjectedType);
return Result;
}
ClassTemplatePartialSpecializationDecl *
ClassTemplatePartialSpecializationDecl::CreateDeserialized(ASTContext &C,
unsigned ID) {
auto *Result = new (C, ID) ClassTemplatePartialSpecializationDecl(C);
Result->setMayHaveOutOfDateDef(false);
return Result;
}
//===----------------------------------------------------------------------===//
// FriendTemplateDecl Implementation
//===----------------------------------------------------------------------===//
void FriendTemplateDecl::anchor() {}
FriendTemplateDecl *
FriendTemplateDecl::Create(ASTContext &Context, DeclContext *DC,
SourceLocation L,
MutableArrayRef<TemplateParameterList *> Params,
FriendUnion Friend, SourceLocation FLoc) {
TemplateParameterList **TPL = nullptr;
if (!Params.empty()) {
TPL = new (Context) TemplateParameterList *[Params.size()];
llvm::copy(Params, TPL);
}
return new (Context, DC)
FriendTemplateDecl(DC, L, TPL, Params.size(), Friend, FLoc);
}
FriendTemplateDecl *FriendTemplateDecl::CreateDeserialized(ASTContext &C,
unsigned ID) {
return new (C, ID) FriendTemplateDecl(EmptyShell());
}
//===----------------------------------------------------------------------===//
// TypeAliasTemplateDecl Implementation
//===----------------------------------------------------------------------===//
TypeAliasTemplateDecl *
TypeAliasTemplateDecl::Create(ASTContext &C, DeclContext *DC, SourceLocation L,
DeclarationName Name,
TemplateParameterList *Params, NamedDecl *Decl) {
bool Invalid = AdoptTemplateParameterList(Params, DC);
auto *TD = new (C, DC) TypeAliasTemplateDecl(C, DC, L, Name, Params, Decl);
if (Invalid)
TD->setInvalidDecl();
return TD;
}
TypeAliasTemplateDecl *TypeAliasTemplateDecl::CreateDeserialized(ASTContext &C,
unsigned ID) {
return new (C, ID) TypeAliasTemplateDecl(C, nullptr, SourceLocation(),
DeclarationName(), nullptr, nullptr);
}
RedeclarableTemplateDecl::CommonBase *
TypeAliasTemplateDecl::newCommon(ASTContext &C) const {
auto *CommonPtr = new (C) Common;
C.addDestruction(CommonPtr);
return CommonPtr;
}
//===----------------------------------------------------------------------===//
// ClassScopeFunctionSpecializationDecl Implementation
//===----------------------------------------------------------------------===//
void ClassScopeFunctionSpecializationDecl::anchor() {}
ClassScopeFunctionSpecializationDecl *
ClassScopeFunctionSpecializationDecl::CreateDeserialized(ASTContext &C,
unsigned ID) {
return new (C, ID) ClassScopeFunctionSpecializationDecl(
nullptr, SourceLocation(), nullptr, nullptr);
}
//===----------------------------------------------------------------------===//
// VarTemplateDecl Implementation
//===----------------------------------------------------------------------===//
VarTemplateDecl *VarTemplateDecl::getDefinition() {
VarTemplateDecl *CurD = this;
while (CurD) {
if (CurD->isThisDeclarationADefinition())
return CurD;
CurD = CurD->getPreviousDecl();
}
return nullptr;
}
VarTemplateDecl *VarTemplateDecl::Create(ASTContext &C, DeclContext *DC,
SourceLocation L, DeclarationName Name,
TemplateParameterList *Params,
VarDecl *Decl) {
bool Invalid = AdoptTemplateParameterList(Params, DC);
auto *TD = new (C, DC) VarTemplateDecl(C, DC, L, Name, Params, Decl);
if (Invalid)
TD->setInvalidDecl();
return TD;
}
VarTemplateDecl *VarTemplateDecl::CreateDeserialized(ASTContext &C,
unsigned ID) {
return new (C, ID) VarTemplateDecl(C, nullptr, SourceLocation(),
DeclarationName(), nullptr, nullptr);
}
void VarTemplateDecl::LoadLazySpecializations() const {
loadLazySpecializationsImpl();
}
llvm::FoldingSetVector<VarTemplateSpecializationDecl> &
VarTemplateDecl::getSpecializations() const {
LoadLazySpecializations();
return getCommonPtr()->Specializations;
}
llvm::FoldingSetVector<VarTemplatePartialSpecializationDecl> &
VarTemplateDecl::getPartialSpecializations() const {
LoadLazySpecializations();
return getCommonPtr()->PartialSpecializations;
}
RedeclarableTemplateDecl::CommonBase *
VarTemplateDecl::newCommon(ASTContext &C) const {
auto *CommonPtr = new (C) Common;
C.addDestruction(CommonPtr);
return CommonPtr;
}
VarTemplateSpecializationDecl *
VarTemplateDecl::findSpecialization(ArrayRef<TemplateArgument> Args,
void *&InsertPos) {
return findSpecializationImpl(getSpecializations(), InsertPos, Args);
}
void VarTemplateDecl::AddSpecialization(VarTemplateSpecializationDecl *D,
void *InsertPos) {
addSpecializationImpl<VarTemplateDecl>(getSpecializations(), D, InsertPos);
}
VarTemplatePartialSpecializationDecl *
VarTemplateDecl::findPartialSpecialization(ArrayRef<TemplateArgument> Args,
TemplateParameterList *TPL, void *&InsertPos) {
return findSpecializationImpl(getPartialSpecializations(), InsertPos, Args,
TPL);
}
void
VarTemplatePartialSpecializationDecl::Profile(llvm::FoldingSetNodeID &ID,
ArrayRef<TemplateArgument> TemplateArgs, TemplateParameterList *TPL,
ASTContext &Context) {
ID.AddInteger(TemplateArgs.size());
for (const TemplateArgument &TemplateArg : TemplateArgs)
TemplateArg.Profile(ID, Context);
ProfileTemplateParameterList(Context, ID, TPL);
}
void VarTemplateDecl::AddPartialSpecialization(
VarTemplatePartialSpecializationDecl *D, void *InsertPos) {
if (InsertPos)
getPartialSpecializations().InsertNode(D, InsertPos);
else {
VarTemplatePartialSpecializationDecl *Existing =
getPartialSpecializations().GetOrInsertNode(D);
(void)Existing;
assert(Existing->isCanonicalDecl() && "Non-canonical specialization?");
}
if (ASTMutationListener *L = getASTMutationListener())
L->AddedCXXTemplateSpecialization(this, D);
}
void VarTemplateDecl::getPartialSpecializations(
SmallVectorImpl<VarTemplatePartialSpecializationDecl *> &PS) const {
llvm::FoldingSetVector<VarTemplatePartialSpecializationDecl> &PartialSpecs =
getPartialSpecializations();
PS.clear();
PS.reserve(PartialSpecs.size());
for (VarTemplatePartialSpecializationDecl &P : PartialSpecs)
PS.push_back(P.getMostRecentDecl());
}
VarTemplatePartialSpecializationDecl *
VarTemplateDecl::findPartialSpecInstantiatedFromMember(
VarTemplatePartialSpecializationDecl *D) {
Decl *DCanon = D->getCanonicalDecl();
for (VarTemplatePartialSpecializationDecl &P : getPartialSpecializations()) {
if (P.getInstantiatedFromMember()->getCanonicalDecl() == DCanon)
return P.getMostRecentDecl();
}
return nullptr;
}
//===----------------------------------------------------------------===//
// VarTemplateSpecializationDecl Implementation
//===----------------------------------------------------------------===//
VarTemplateSpecializationDecl::VarTemplateSpecializationDecl(
Kind DK, ASTContext &Context, DeclContext *DC, SourceLocation StartLoc,
SourceLocation IdLoc, VarTemplateDecl *SpecializedTemplate, QualType T,
TypeSourceInfo *TInfo, StorageClass S, ArrayRef<TemplateArgument> Args)
: VarDecl(DK, Context, DC, StartLoc, IdLoc,
SpecializedTemplate->getIdentifier(), T, TInfo, S),
SpecializedTemplate(SpecializedTemplate),
TemplateArgs(TemplateArgumentList::CreateCopy(Context, Args)),
SpecializationKind(TSK_Undeclared), IsCompleteDefinition(false) {}
VarTemplateSpecializationDecl::VarTemplateSpecializationDecl(Kind DK,
ASTContext &C)
: VarDecl(DK, C, nullptr, SourceLocation(), SourceLocation(), nullptr,
QualType(), nullptr, SC_None),
SpecializationKind(TSK_Undeclared), IsCompleteDefinition(false) {}
VarTemplateSpecializationDecl *VarTemplateSpecializationDecl::Create(
ASTContext &Context, DeclContext *DC, SourceLocation StartLoc,
SourceLocation IdLoc, VarTemplateDecl *SpecializedTemplate, QualType T,
TypeSourceInfo *TInfo, StorageClass S, ArrayRef<TemplateArgument> Args) {
return new (Context, DC) VarTemplateSpecializationDecl(
VarTemplateSpecialization, Context, DC, StartLoc, IdLoc,
SpecializedTemplate, T, TInfo, S, Args);
}
VarTemplateSpecializationDecl *
VarTemplateSpecializationDecl::CreateDeserialized(ASTContext &C, unsigned ID) {
return new (C, ID)
VarTemplateSpecializationDecl(VarTemplateSpecialization, C);
}
void VarTemplateSpecializationDecl::getNameForDiagnostic(
raw_ostream &OS, const PrintingPolicy &Policy, bool Qualified) const {
NamedDecl::getNameForDiagnostic(OS, Policy, Qualified);
const auto *PS = dyn_cast<VarTemplatePartialSpecializationDecl>(this);
if (const ASTTemplateArgumentListInfo *ArgsAsWritten =
PS ? PS->getTemplateArgsAsWritten() : nullptr) {
printTemplateArgumentList(
OS, ArgsAsWritten->arguments(), Policy,
getSpecializedTemplate()->getTemplateParameters());
} else {
const TemplateArgumentList &TemplateArgs = getTemplateArgs();
printTemplateArgumentList(
OS, TemplateArgs.asArray(), Policy,
getSpecializedTemplate()->getTemplateParameters());
}
}
VarTemplateDecl *VarTemplateSpecializationDecl::getSpecializedTemplate() const {
if (const auto *PartialSpec =
SpecializedTemplate.dyn_cast<SpecializedPartialSpecialization *>())
return PartialSpec->PartialSpecialization->getSpecializedTemplate();
return SpecializedTemplate.get<VarTemplateDecl *>();
}
void VarTemplateSpecializationDecl::setTemplateArgsInfo(
const TemplateArgumentListInfo &ArgsInfo) {
TemplateArgsInfo =
ASTTemplateArgumentListInfo::Create(getASTContext(), ArgsInfo);
}
void VarTemplateSpecializationDecl::setTemplateArgsInfo(
const ASTTemplateArgumentListInfo *ArgsInfo) {
TemplateArgsInfo =
ASTTemplateArgumentListInfo::Create(getASTContext(), ArgsInfo);
}
//===----------------------------------------------------------------===//
// VarTemplatePartialSpecializationDecl Implementation
//===----------------------------------------------------------------===//
void VarTemplatePartialSpecializationDecl::anchor() {}
VarTemplatePartialSpecializationDecl::VarTemplatePartialSpecializationDecl(
ASTContext &Context, DeclContext *DC, SourceLocation StartLoc,
SourceLocation IdLoc, TemplateParameterList *Params,
VarTemplateDecl *SpecializedTemplate, QualType T, TypeSourceInfo *TInfo,
StorageClass S, ArrayRef<TemplateArgument> Args,
const ASTTemplateArgumentListInfo *ArgInfos)
: VarTemplateSpecializationDecl(VarTemplatePartialSpecialization, Context,
DC, StartLoc, IdLoc, SpecializedTemplate, T,
TInfo, S, Args),
TemplateParams(Params), ArgsAsWritten(ArgInfos),
InstantiatedFromMember(nullptr, false) {
if (AdoptTemplateParameterList(Params, DC))
setInvalidDecl();
}
VarTemplatePartialSpecializationDecl *
VarTemplatePartialSpecializationDecl::Create(
ASTContext &Context, DeclContext *DC, SourceLocation StartLoc,
SourceLocation IdLoc, TemplateParameterList *Params,
VarTemplateDecl *SpecializedTemplate, QualType T, TypeSourceInfo *TInfo,
StorageClass S, ArrayRef<TemplateArgument> Args,
const TemplateArgumentListInfo &ArgInfos) {
const ASTTemplateArgumentListInfo *ASTArgInfos
= ASTTemplateArgumentListInfo::Create(Context, ArgInfos);
auto *Result =
new (Context, DC) VarTemplatePartialSpecializationDecl(
Context, DC, StartLoc, IdLoc, Params, SpecializedTemplate, T, TInfo,
S, Args, ASTArgInfos);
Result->setSpecializationKind(TSK_ExplicitSpecialization);
return Result;
}
VarTemplatePartialSpecializationDecl *
VarTemplatePartialSpecializationDecl::CreateDeserialized(ASTContext &C,
unsigned ID) {
return new (C, ID) VarTemplatePartialSpecializationDecl(C);
}
static TemplateParameterList *
createMakeIntegerSeqParameterList(const ASTContext &C, DeclContext *DC) {
// typename T
auto *T = TemplateTypeParmDecl::Create(
C, DC, SourceLocation(), SourceLocation(), /*Depth=*/1, /*Position=*/0,
/*Id=*/nullptr, /*Typename=*/true, /*ParameterPack=*/false,
/*HasTypeConstraint=*/false);
T->setImplicit(true);
// T ...Ints
TypeSourceInfo *TI =
C.getTrivialTypeSourceInfo(QualType(T->getTypeForDecl(), 0));
auto *N = NonTypeTemplateParmDecl::Create(
C, DC, SourceLocation(), SourceLocation(), /*Depth=*/0, /*Position=*/1,
/*Id=*/nullptr, TI->getType(), /*ParameterPack=*/true, TI);
N->setImplicit(true);
// <typename T, T ...Ints>
NamedDecl *P[2] = {T, N};
auto *TPL = TemplateParameterList::Create(
C, SourceLocation(), SourceLocation(), P, SourceLocation(), nullptr);
// template <typename T, ...Ints> class IntSeq
auto *TemplateTemplateParm = TemplateTemplateParmDecl::Create(
C, DC, SourceLocation(), /*Depth=*/0, /*Position=*/0,
/*ParameterPack=*/false, /*Id=*/nullptr, TPL);
TemplateTemplateParm->setImplicit(true);
// typename T
auto *TemplateTypeParm = TemplateTypeParmDecl::Create(
C, DC, SourceLocation(), SourceLocation(), /*Depth=*/0, /*Position=*/1,
/*Id=*/nullptr, /*Typename=*/true, /*ParameterPack=*/false,
/*HasTypeConstraint=*/false);
TemplateTypeParm->setImplicit(true);
// T N
TypeSourceInfo *TInfo = C.getTrivialTypeSourceInfo(
QualType(TemplateTypeParm->getTypeForDecl(), 0));
auto *NonTypeTemplateParm = NonTypeTemplateParmDecl::Create(
C, DC, SourceLocation(), SourceLocation(), /*Depth=*/0, /*Position=*/2,
/*Id=*/nullptr, TInfo->getType(), /*ParameterPack=*/false, TInfo);
NamedDecl *Params[] = {TemplateTemplateParm, TemplateTypeParm,
NonTypeTemplateParm};
// template <template <typename T, T ...Ints> class IntSeq, typename T, T N>
return TemplateParameterList::Create(C, SourceLocation(), SourceLocation(),
Params, SourceLocation(), nullptr);
}
static TemplateParameterList *
createTypePackElementParameterList(const ASTContext &C, DeclContext *DC) {
// std::size_t Index
TypeSourceInfo *TInfo = C.getTrivialTypeSourceInfo(C.getSizeType());
auto *Index = NonTypeTemplateParmDecl::Create(
C, DC, SourceLocation(), SourceLocation(), /*Depth=*/0, /*Position=*/0,
/*Id=*/nullptr, TInfo->getType(), /*ParameterPack=*/false, TInfo);
// typename ...T
auto *Ts = TemplateTypeParmDecl::Create(
C, DC, SourceLocation(), SourceLocation(), /*Depth=*/0, /*Position=*/1,
/*Id=*/nullptr, /*Typename=*/true, /*ParameterPack=*/true,
/*HasTypeConstraint=*/false);
Ts->setImplicit(true);
// template <std::size_t Index, typename ...T>
NamedDecl *Params[] = {Index, Ts};
return TemplateParameterList::Create(C, SourceLocation(), SourceLocation(),
llvm::ArrayRef(Params), SourceLocation(),
nullptr);
}
static TemplateParameterList *createBuiltinTemplateParameterList(
const ASTContext &C, DeclContext *DC, BuiltinTemplateKind BTK) {
switch (BTK) {
case BTK__make_integer_seq:
return createMakeIntegerSeqParameterList(C, DC);
case BTK__type_pack_element:
return createTypePackElementParameterList(C, DC);
}
llvm_unreachable("unhandled BuiltinTemplateKind!");
}
void BuiltinTemplateDecl::anchor() {}
BuiltinTemplateDecl::BuiltinTemplateDecl(const ASTContext &C, DeclContext *DC,
DeclarationName Name,
BuiltinTemplateKind BTK)
: TemplateDecl(BuiltinTemplate, DC, SourceLocation(), Name,
createBuiltinTemplateParameterList(C, DC, BTK)),
BTK(BTK) {}
void TypeConstraint::print(llvm::raw_ostream &OS, PrintingPolicy Policy) const {
if (NestedNameSpec)
NestedNameSpec.getNestedNameSpecifier()->print(OS, Policy);
ConceptName.printName(OS, Policy);
if (hasExplicitTemplateArgs()) {
OS << "<";
// FIXME: Find corresponding parameter for argument
for (auto &ArgLoc : ArgsAsWritten->arguments())
ArgLoc.getArgument().print(Policy, OS, /*IncludeType*/ false);
OS << ">";
}
}
TemplateParamObjectDecl *TemplateParamObjectDecl::Create(const ASTContext &C,
QualType T,
const APValue &V) {
DeclContext *DC = C.getTranslationUnitDecl();
auto *TPOD = new (C, DC) TemplateParamObjectDecl(DC, T, V);
C.addDestruction(&TPOD->Value);
return TPOD;
}
TemplateParamObjectDecl *
TemplateParamObjectDecl::CreateDeserialized(ASTContext &C, unsigned ID) {
auto *TPOD = new (C, ID) TemplateParamObjectDecl(nullptr, QualType(), APValue());
C.addDestruction(&TPOD->Value);
return TPOD;
}
void TemplateParamObjectDecl::printName(llvm::raw_ostream &OS,
const PrintingPolicy &Policy) const {
OS << "<template param ";
printAsExpr(OS, Policy);
OS << ">";
}
void TemplateParamObjectDecl::printAsExpr(llvm::raw_ostream &OS) const {
printAsExpr(OS, getASTContext().getPrintingPolicy());
}
void TemplateParamObjectDecl::printAsExpr(llvm::raw_ostream &OS,
const PrintingPolicy &Policy) const {
getType().getUnqualifiedType().print(OS, Policy);
printAsInit(OS, Policy);
}
void TemplateParamObjectDecl::printAsInit(llvm::raw_ostream &OS) const {
printAsInit(OS, getASTContext().getPrintingPolicy());
}
void TemplateParamObjectDecl::printAsInit(llvm::raw_ostream &OS,
const PrintingPolicy &Policy) const {
getValue().printPretty(OS, Policy, getType(), &getASTContext());
}
TemplateParameterList *clang::getReplacedTemplateParameterList(Decl *D) {
switch (D->getKind()) {
case Decl::Kind::ClassTemplate:
return cast<ClassTemplateDecl>(D)->getTemplateParameters();
case Decl::Kind::ClassTemplateSpecialization: {
const auto *CTSD = cast<ClassTemplateSpecializationDecl>(D);
auto P = CTSD->getSpecializedTemplateOrPartial();
if (const auto *CTPSD =
P.dyn_cast<ClassTemplatePartialSpecializationDecl *>())
return CTPSD->getTemplateParameters();
return cast<ClassTemplateDecl *>(P)->getTemplateParameters();
}
case Decl::Kind::ClassTemplatePartialSpecialization:
return cast<ClassTemplatePartialSpecializationDecl>(D)
->getTemplateParameters();
case Decl::Kind::TypeAliasTemplate:
return cast<TypeAliasTemplateDecl>(D)->getTemplateParameters();
case Decl::Kind::BuiltinTemplate:
return cast<BuiltinTemplateDecl>(D)->getTemplateParameters();
case Decl::Kind::CXXDeductionGuide:
case Decl::Kind::CXXConversion:
case Decl::Kind::CXXConstructor:
case Decl::Kind::CXXDestructor:
case Decl::Kind::CXXMethod:
case Decl::Kind::Function:
return cast<FunctionDecl>(D)
->getTemplateSpecializationInfo()
->getTemplate()
->getTemplateParameters();
case Decl::Kind::FunctionTemplate:
return cast<FunctionTemplateDecl>(D)->getTemplateParameters();
case Decl::Kind::VarTemplate:
return cast<VarTemplateDecl>(D)->getTemplateParameters();
case Decl::Kind::VarTemplateSpecialization: {
const auto *VTSD = cast<VarTemplateSpecializationDecl>(D);
auto P = VTSD->getSpecializedTemplateOrPartial();
if (const auto *VTPSD =
P.dyn_cast<VarTemplatePartialSpecializationDecl *>())
return VTPSD->getTemplateParameters();
return cast<VarTemplateDecl *>(P)->getTemplateParameters();
}
case Decl::Kind::VarTemplatePartialSpecialization:
return cast<VarTemplatePartialSpecializationDecl>(D)
->getTemplateParameters();
case Decl::Kind::TemplateTemplateParm:
return cast<TemplateTemplateParmDecl>(D)->getTemplateParameters();
case Decl::Kind::Concept:
return cast<ConceptDecl>(D)->getTemplateParameters();
default:
llvm_unreachable("Unhandled templated declaration kind");
}
}
```
|
Karuyeh (, also Romanized as Kārūyeh; also known as Kāravīeh and Karveh) is a village in Golestan Rural District, in the Central District of Falavarjan County, Isfahan Province, Iran. At the 2006 census, its population was 1,096, in 281 families.
References
Populated places in Falavarjan County
|
```csharp
/*
The source code in this file is covered under a dual-license scenario:
- RCL: for OPC Foundation Corporate Members in good-standing
- GPL V2: everybody else
This source code is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
*/
using System;
using System.Threading;
using System.Threading.Tasks;
namespace Opc.Ua.Bindings
{
/// <summary>
/// Creates a transport channel for the ITransportChannel interface.
/// Implements the UA-SC security and UA Binary encoding.
/// The socket layer requires a IMessageSocketFactory implementation.
/// </summary>
public class UaSCUaBinaryTransportChannel : ITransportChannel, IMessageSocketChannel
{
private const int kChannelCloseDefault = 1_000;
#region Constructors
/// <summary>
/// Create a transport channel from a message socket factory.
/// </summary>
/// <param name="messageSocketFactory">The message socket factory.</param>
public UaSCUaBinaryTransportChannel(IMessageSocketFactory messageSocketFactory)
{
m_messageSocketFactory = messageSocketFactory;
}
#endregion
#region IDisposable Members
/// <summary>
/// Frees any unmanaged resources.
/// </summary>
public void Dispose()
{
Dispose(true);
GC.SuppressFinalize(this);
}
/// <summary>
/// An overrideable version of the Dispose.
/// </summary>
protected virtual void Dispose(bool disposing)
{
if (disposing)
{
var channel = Interlocked.Exchange(ref m_channel, null);
Utils.SilentDispose(channel);
}
}
#endregion
#region IMessageSocketChannel Members
/// <summary>
/// Returns the channel's underlying message socket if connected / available.
/// </summary>
public IMessageSocket Socket
{
get { return m_channel?.Socket; }
}
#endregion
#region ITransportChannel Members
/// <summary>
/// A mask indicating which features are implemented.
/// </summary>
public TransportChannelFeatures SupportedFeatures =>
TransportChannelFeatures.Open | TransportChannelFeatures.BeginOpen |
TransportChannelFeatures.BeginSendRequest | TransportChannelFeatures.SendRequestAsync |
(Socket?.MessageSocketFeatures ?? 0);
/// <summary>
/// Gets the description for the endpoint used by the channel.
/// </summary>
public EndpointDescription EndpointDescription => m_settings.Description;
/// <summary>
/// Gets the configuration for the channel.
/// </summary>
public EndpointConfiguration EndpointConfiguration => m_settings.Configuration;
/// <summary>
/// Gets the context used when serializing messages exchanged via the channel.
/// </summary>
public IServiceMessageContext MessageContext => m_quotas.MessageContext;
/// <summary>
/// Gets the channel's current security token.
/// </summary>
public ChannelToken CurrentToken
{
get { return m_channel?.CurrentToken; }
}
/// <summary>
/// Gets or sets the default timeout for requests sent via the channel.
/// </summary>
public int OperationTimeout
{
get { return m_operationTimeout; }
set { m_operationTimeout = value; }
}
/// <summary>
/// Initializes a secure channel with the endpoint identified by the URL.
/// </summary>
/// <param name="url">The URL for the endpoint.</param>
/// <param name="settings">The settings to use when creating the channel.</param>
/// <exception cref="ServiceResultException">Thrown if any communication error occurs.</exception>
public void Initialize(
Uri url,
TransportChannelSettings settings)
{
SaveSettings(url, settings);
Interlocked.Exchange(ref m_channel, CreateChannel());
}
/// <summary>
/// Initializes a secure channel with the endpoint identified by the connection.
/// </summary>
/// <param name="connection">The connection to use.</param>
/// <param name="settings">The settings to use when creating the channel.</param>
/// <exception cref="ServiceResultException">Thrown if any communication error occurs.</exception>
public void Initialize(
ITransportWaitingConnection connection,
TransportChannelSettings settings)
{
SaveSettings(connection.EndpointUrl, settings);
Interlocked.Exchange(ref m_channel, CreateChannel(connection));
}
/// <summary>
/// Opens a secure channel with the endpoint identified by the URL.
/// </summary>
/// <exception cref="ServiceResultException">Thrown if any communication error occurs.</exception>
public void Open()
{
// opens when the first request is called to preserve previous behavior.
}
/// <summary>
/// Begins an asynchronous operation to open a secure channel with the endpoint identified by the URL.
/// </summary>
/// <param name="callback">The callback to call when the operation completes.</param>
/// <param name="callbackData">The callback data to return with the callback.</param>
/// <returns>
/// The result which must be passed to the EndOpen method.
/// </returns>
/// <exception cref="ServiceResultException">Thrown if any communication error occurs.</exception>
/// <seealso cref="Open"/>
public IAsyncResult BeginOpen(AsyncCallback callback, object callbackData)
{
lock (m_lock)
{
// create the channel.
Interlocked.Exchange(ref m_channel, CreateChannel(null));
// begin connect operation.
return m_channel.BeginConnect(this.m_url, m_operationTimeout, callback, callbackData);
}
}
/// <summary>
/// Completes an asynchronous operation to open a secure channel.
/// </summary>
/// <param name="result">The result returned from the BeginOpen call.</param>
/// <exception cref="ServiceResultException">Thrown if any communication error occurs.</exception>
/// <seealso cref="Open"/>
public void EndOpen(IAsyncResult result)
{
m_channel.EndConnect(result);
}
/// <summary>
/// Closes any existing secure channel and opens a new one.
/// </summary>
/// <exception cref="ServiceResultException">Thrown if any communication error occurs.</exception>
/// <remarks>
/// Calling this method will cause outstanding requests over the current secure channel to fail.
/// </remarks>
public void Reconnect() => Reconnect(null);
/// <summary>
/// Closes any existing secure channel and opens a new one.
/// </summary>
/// <param name="connection">A reverse connection, null otherwise.</param>
/// <exception cref="ServiceResultException">Thrown if any communication error occurs.</exception>
/// <remarks>
/// Calling this method will cause outstanding requests over the current secure channel to fail.
/// </remarks>
public void Reconnect(ITransportWaitingConnection connection)
{
Utils.LogInfo("TransportChannel RECONNECT: Reconnecting to {0}.", m_url);
lock (m_lock)
{
// the new channel must be created first because WinSock will reuse sockets and this
// can result in messages sent over the old socket arriving as messages on the new socket.
// if this happens the new channel is shutdown because of a security violation.
UaSCUaBinaryClientChannel channel = Interlocked.Exchange(ref m_channel, null);
try
{
// reconnect.
Interlocked.Exchange(ref m_channel, CreateChannel(connection));
// begin connect operation.
IAsyncResult result = m_channel.BeginConnect(m_url, m_operationTimeout, null, null);
m_channel.EndConnect(result);
}
finally
{
// close existing channel.
if (channel != null)
{
try
{
channel.Close(kChannelCloseDefault);
}
catch (Exception e)
{
// do nothing.
Utils.LogTrace(e, "Ignoring exception while closing transport channel during Reconnect.");
}
finally
{
channel.Dispose();
}
}
}
}
}
/// <summary>
/// Begins an asynchronous operation to close the existing secure channel and open a new one.
/// </summary>
/// <param name="callback">The callback to call when the operation completes.</param>
/// <param name="callbackData">The callback data to return with the callback.</param>
/// <returns>
/// The result which must be passed to the EndReconnect method.
/// </returns>
/// <exception cref="ServiceResultException">Thrown if any communication error occurs.</exception>
/// <seealso cref="Reconnect()"/>
public IAsyncResult BeginReconnect(AsyncCallback callback, object callbackData)
{
throw new NotImplementedException();
}
/// <summary>
/// Completes an asynchronous operation to close the existing secure channel and open a new one.
/// </summary>
/// <param name="result">The result returned from the BeginReconnect call.</param>
/// <exception cref="ServiceResultException">Thrown if any communication error occurs.</exception>
/// <seealso cref="Reconnect()"/>
public void EndReconnect(IAsyncResult result)
{
throw new NotImplementedException();
}
/// <summary>
/// Closes the secure channel.
/// </summary>
/// <exception cref="ServiceResultException">Thrown if any communication error occurs.</exception>
public void Close()
{
UaSCUaBinaryClientChannel channel = Interlocked.Exchange(ref m_channel, null);
channel?.Close(kChannelCloseDefault);
}
/// <summary>
/// Closes the secure channel (async).
/// </summary>
/// <exception cref="ServiceResultException">Thrown if any communication error occurs.</exception>
public Task CloseAsync(CancellationToken ct)
{
UaSCUaBinaryClientChannel channel = Interlocked.Exchange(ref m_channel, null);
if (channel != null)
{
return channel.CloseAsync(kChannelCloseDefault, ct);
}
return Task.CompletedTask;
}
/// <summary>
/// Begins an asynchronous operation to close the secure channel.
/// </summary>
/// <param name="callback">The callback to call when the operation completes.</param>
/// <param name="callbackData">The callback data to return with the callback.</param>
/// <returns>
/// The result which must be passed to the EndClose method.
/// </returns>
/// <exception cref="ServiceResultException">Thrown if any communication error occurs.</exception>
/// <seealso cref="Close"/>
public IAsyncResult BeginClose(AsyncCallback callback, object callbackData)
{
throw new NotImplementedException();
}
/// <summary>
/// Completes an asynchronous operation to close the secure channel.
/// </summary>
/// <param name="result">The result returned from the BeginClose call.</param>
/// <exception cref="ServiceResultException">Thrown if any communication error occurs.</exception>
/// <seealso cref="Close"/>
public void EndClose(IAsyncResult result)
{
throw new NotImplementedException();
}
/// <summary>
/// Sends a request over the secure channel.
/// </summary>
/// <param name="request">The request to send.</param>
/// <returns>The response returned by the server.</returns>
/// <exception cref="ServiceResultException">Thrown if any communication error occurs.</exception>
public IServiceResponse SendRequest(IServiceRequest request)
{
IAsyncResult result = BeginSendRequest(request, null, null);
return EndSendRequest(result);
}
/// <summary>
/// Sends a request over the secure channel (async version).
/// </summary>
/// <param name="request">The request to send.</param>
/// <param name="ct">The cancellation token.</param>
/// <returns>The response returned by the server.</returns>
/// <exception cref="ServiceResultException">Thrown if any communication error occurs.</exception>
public Task<IServiceResponse> SendRequestAsync(IServiceRequest request, CancellationToken ct)
{
var operation = BeginSendRequest(request, null, null);
return EndSendRequestAsync(operation, ct);
}
/// <summary>
/// Begins an asynchronous operation to send a request over the secure channel.
/// </summary>
/// <param name="request">The request to send.</param>
/// <param name="callback">The callback to call when the operation completes.</param>
/// <param name="callbackData">The callback data to return with the callback.</param>
/// <returns>
/// The result which must be passed to the EndSendRequest method.
/// </returns>
/// <exception cref="ServiceResultException">Thrown if any communication error occurs.</exception>
/// <seealso cref="SendRequest"/>
public IAsyncResult BeginSendRequest(IServiceRequest request, AsyncCallback callback, object callbackData)
{
UaSCUaBinaryClientChannel channel = m_channel;
if (channel == null)
{
channel = CreateChannel();
var currentChannel = Interlocked.CompareExchange(ref m_channel, channel, null);
if (currentChannel != null)
{
Utils.SilentDispose(channel);
channel = currentChannel;
}
}
return channel.BeginSendRequest(request, m_operationTimeout, callback, callbackData);
}
/// <summary>
/// Completes an asynchronous operation to send a request over the secure channel.
/// </summary>
/// <param name="result">The result returned from the BeginSendRequest call.</param>
/// <returns></returns>
/// <exception cref="ServiceResultException">Thrown if any communication error occurs.</exception>
/// <seealso cref="SendRequest"/>
public IServiceResponse EndSendRequest(IAsyncResult result)
{
UaSCUaBinaryClientChannel channel = m_channel;
if (channel == null)
{
throw ServiceResultException.Create(StatusCodes.BadSecureChannelClosed, "Channel has been closed.");
}
return channel.EndSendRequest(result);
}
/// <summary>
/// Completes an asynchronous operation to send a request over the secure channel.
/// </summary>
/// <param name="result">The result returned from the BeginSendRequest call.</param>
/// <param name="ct"></param>
/// <returns></returns>
/// <exception cref="ServiceResultException">Thrown if any communication error occurs.</exception>
/// <seealso cref="SendRequest"/>
public Task<IServiceResponse> EndSendRequestAsync(IAsyncResult result, CancellationToken ct)
{
UaSCUaBinaryClientChannel channel = m_channel;
if (channel == null)
{
throw ServiceResultException.Create(StatusCodes.BadSecureChannelClosed, "Channel has been closed.");
}
return channel.EndSendRequestAsync(result, ct);
}
/// <summary>
/// Saves the settings so the channel can be opened later.
/// </summary>
/// <param name="url">The URL.</param>
/// <param name="settings">The settings.</param>
private void SaveSettings(Uri url, TransportChannelSettings settings)
{
// save the settings.
m_url = url;
m_settings = settings;
m_operationTimeout = settings.Configuration.OperationTimeout;
// initialize the quotas.
EndpointConfiguration configuration = m_settings.Configuration;
m_quotas = new ChannelQuotas {
MaxBufferSize = configuration.MaxBufferSize,
MaxMessageSize = TcpMessageLimits.AlignRoundMaxMessageSize(configuration.MaxMessageSize),
ChannelLifetime = configuration.ChannelLifetime,
SecurityTokenLifetime = configuration.SecurityTokenLifetime,
MessageContext = new ServiceMessageContext() {
MaxArrayLength = configuration.MaxArrayLength,
MaxByteStringLength = configuration.MaxByteStringLength,
MaxMessageSize = TcpMessageLimits.AlignRoundMaxMessageSize(configuration.MaxMessageSize),
MaxStringLength = configuration.MaxStringLength,
MaxEncodingNestingLevels = configuration.MaxEncodingNestingLevels,
MaxDecoderRecoveries = configuration.MaxDecoderRecoveries,
NamespaceUris = m_settings.NamespaceUris,
ServerUris = new StringTable(),
Factory = m_settings.Factory
},
CertificateValidator = settings.CertificateValidator
};
// create the buffer manager.
m_bufferManager = new BufferManager("Client", settings.Configuration.MaxBufferSize);
}
/// <summary>
/// Creates the channel used to send requests; the connection is opened when the first request is sent.
/// </summary>
/// <param name="connection">A reverse connection, null otherwise.</param>
private UaSCUaBinaryClientChannel CreateChannel(ITransportWaitingConnection connection = null)
{
IMessageSocket socket = null;
if (connection != null)
{
socket = connection.Handle as IMessageSocket;
if (socket == null)
{
throw new ArgumentException("Connection Handle is not of type IMessageSocket.");
}
}
// create the channel.
var channel = new UaSCUaBinaryClientChannel(
Guid.NewGuid().ToString(),
m_bufferManager,
m_messageSocketFactory,
m_quotas,
m_settings.ClientCertificate,
m_settings.ClientCertificateChain,
m_settings.ServerCertificate,
m_settings.Description);
// use socket for reverse connections, ignore otherwise
if (socket != null)
{
channel.Socket = socket;
channel.Socket.ChangeSink(channel);
channel.ReverseSocket = true;
}
return channel;
}
#endregion
#region Private Fields
private readonly object m_lock = new object();
private Uri m_url;
private int m_operationTimeout;
private TransportChannelSettings m_settings;
private ChannelQuotas m_quotas;
private BufferManager m_bufferManager;
private UaSCUaBinaryClientChannel m_channel;
private IMessageSocketFactory m_messageSocketFactory;
#endregion
}
}
```
|
Sporting Club de Paris is a French futsal club based in the 13th arrondissement of Paris and an official affiliate of Sporting CP, whose emblem it bears. Sporting is a five-time French champion and a six-time winner of the Coupe de France de Futsal.
Sporting Club de Paris is France's most successful futsal club and regularly represents France in the UEFA Futsal Cup, the most prestigious European club competition in futsal.
Honors
Division 1 Futsal: 5
2010-11, 2011-12, 2012-13, 2013-14, 2021-22
Coupe de France: 6
2009-10, 2010-11, 2011-12, 2012-13, 2014-15, 2018-19
Current squad
UEFA Club Competitions record
Appearances: 5
References
External links
Club's official site
Futsal clubs in France
Futsal clubs established in 2008
2008 establishments in France
Sports clubs and teams in Paris
|
```go
// Licensed to the Apache Software Foundation (ASF) under one or more
// contributor license agreements. See the NOTICE file distributed with
// this work for additional information regarding copyright ownership.
// The ASF licenses this file to You under the Apache License, Version 2.0
// (the "License"); you may not use this file except in compliance with
// the License. You may obtain a copy of the License at
//
//    path_to_url
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.
// Package sql contains SQL transform APIs, allowing SQL queries to be used
// in Beam Go pipelines.
//
// NOTE: This feature only works when an expansion service/handler is
// registered for SQL transform. The APIs are subject to change without
// backward compatibility guarantees.
package sql
import (
"reflect"
"github.com/apache/beam/sdks/v2/go/pkg/beam"
"github.com/apache/beam/sdks/v2/go/pkg/beam/core/graph"
"github.com/apache/beam/sdks/v2/go/pkg/beam/core/runtime/xlangx"
"github.com/apache/beam/sdks/v2/go/pkg/beam/core/typex"
"github.com/apache/beam/sdks/v2/go/pkg/beam/transforms/sql/sqlx"
)
// Option is the base type of all the SQL transform options.
type Option func(sqlx.Options)
// options contain all the options for a SQL transform.
type options struct {
dialect string
expansionAddr string
inputs map[string]beam.PCollection
outType beam.FullType
customs []sqlx.Option
}
func (o *options) Add(opt sqlx.Option) {
o.customs = append(o.customs, opt)
}
// Input adds a named PCollection input to the transform.
func Input(name string, in beam.PCollection) Option {
return func(o sqlx.Options) {
o.(*options).inputs[name] = in
}
}
// OutputType specifies the output PCollection type of the transform.
// It must match the SQL output schema.
//
// There is currently no default output type, so users must set this option.
// In the future, Row, once implemented, may become the default output type.
func OutputType(t reflect.Type, components ...typex.FullType) Option {
return func(o sqlx.Options) {
o.(*options).outType = typex.New(t, components...)
}
}
// Dialect specifies the SQL dialect, e.g. use 'zetasql' for ZetaSQL.
func Dialect(dialect string) Option {
return func(o sqlx.Options) {
o.(*options).dialect = dialect
}
}
// ExpansionAddr is the URL of the expansion service to use.
func ExpansionAddr(addr string) Option {
return func(o sqlx.Options) {
o.(*options).expansionAddr = addr
}
}
// Transform creates a SQL-based transform over zero or more PCollections
// and/or named data sources.
//
// PCollection inputs can be added using the sql.Input option. SQL queries can
// also refer to external tables that can be resolved by the expansion service.
//
// The output PCollection type must be specified by the sql.OutputType option.
//
// Example:
//
// in := beam.Create(s, 1, 2, 3)
// out := sql.Transform(s, "SELECT COUNT(*) FROM t",
// sql.Input("t", in),
// sql.OutputType(reflect.TypeOf(int64(0))))
// // `out` is a PCollection<int64> with a single element 3.
//
// If an expansion service address is not provided as an option, one will be
// automatically started for the transform.
func Transform(s beam.Scope, query string, opts ...Option) beam.PCollection {
o := &options{
inputs: make(map[string]beam.PCollection),
}
for _, opt := range opts {
opt(o)
}
if o.outType == nil {
panic("output type must be specified for sql.Transform")
}
payload := beam.CrossLanguagePayload(&sqlx.ExpansionPayload{
Query: query,
Dialect: o.dialect,
})
expansionAddr := sqlx.DefaultExpansionAddr
if o.expansionAddr != "" {
expansionAddr = xlangx.Require(o.expansionAddr)
}
out := beam.CrossLanguage(s, sqlx.Urn, payload, expansionAddr, o.inputs, beam.UnnamedOutput(o.outType))
return out[graph.UnnamedOutputTag]
}
```
|
```scala
package com.wavesplatform.lang
sealed trait ExecutionError {
def message: String
}
case class CommonError(details: String, cause: Option[ValidationError] = None) extends ExecutionError {
override def toString: String = s"CommonError($message)"
override def message: String = cause.map(_.toString).getOrElse(details)
}
case class ThrownError(message: String) extends ExecutionError
case class FailOrRejectError(message: String, skipInvokeComplexity: Boolean = true) extends ExecutionError with ValidationError
```
|
Philip Kapleau (August 20, 1912 – May 6, 2004) was an American teacher of Zen Buddhism in the Sanbo Kyodan tradition, which is rooted in Japanese Sōtō and incorporates Rinzai-school koan-study. He also strongly advocated for Buddhist vegetarianism.
Early life
Kapleau was born in New Haven, Connecticut. As a teenager he worked as a bookkeeper. He briefly studied law and later became an accomplished court reporter. In 1945 he served as chief Allied court reporter for the Trial of the Major War Criminals Before the International Military Tribunal, which judged the leaders of Nazi Germany. It was the first of the series commonly known as the Nuremberg Trials.
Kapleau later covered the International Military Tribunal for the Far East, commonly known as the Tokyo War Crimes Trials. While in Japan he became intrigued by Zen Buddhism. He became acquainted with Karlfried Graf Dürckheim, then a prisoner at Sugamo Prison, who recommended that Kapleau attend informal lectures given by D.T. Suzuki in Kita-Kamakura. After returning to America, Kapleau renewed his acquaintance with D.T. Suzuki who had left Kita-Kamakura to lecture on Zen at Columbia University. Disaffected with a primarily intellectual treatment of Zen, he moved to Japan in 1953 to seek its deeper truth.
Zen training
He trained initially with Soen Nakagawa, then rigorously with Daiun Harada at the temple Hosshin-ji. Later he became a disciple of Hakuun Yasutani, a dharma heir of Harada. After 13 years' training, Kapleau was ordained as a priest by Yasutani in 1965 "according to the rites prescribed by the Patriarch Eihei Dogen" as described by Yasutani in a certificate from the Sanbo "Three Treasures" Buddhist Religious Association, dated June 28, 1964, and given permission to teach. Kapleau ended his relationship with Yasutani formally in 1967 over disagreements about teaching and other personal issues. According to James Ishmael Ford, "Kapleau had completed about half of the Harada-Yasutani kōan curriculum, the koans in the Gateless Gate and the Blue Cliff Record," and was entitled to teach, but did not receive dharma transmission. According to Andrew Rawlinson, "Kapleau has created his own Zen lineage."
Work and teaching
During a book tour in 1965 he was invited to teach meditation at a gathering in Rochester, New York. In 1966 he left Japan to create the Rochester Zen Center.
For almost 40 years, Kapleau taught at the Center and in many other settings around the world, and provided his own dharma transmission to several disciples. He also introduced many modifications to the Japanese Zen tradition, such as chanting the Heart Sutra in the local language, English in the U.S., or Polish at the Center he founded in Katowice. He often emphasized that Zen Buddhism adapted so readily to new cultures because it was not dependent upon a dogmatic external form. At the same time he recognized that it was not always easy to discern the form from the essence, and one had to be careful not to "throw the baby out with the bathwater."
He suffered from Parkinson's disease for several years. While his physical mobility was reduced, he enjoyed lively and trenchant interactions with a steady stream of visitors throughout his life. On May 6, 2004, he died peacefully in the backyard of the Rochester Zen Center, surrounded by many of his closest disciples and friends.
Writings
Kapleau transcribed other Zen teachers' talks, interviewed lay students and monks, and recorded the practical details of Zen Buddhist practice. His book, The Three Pillars of Zen, published in 1965, has been translated into 12 languages, and is still in print. It was one of the first English-language books to present Zen Buddhism not as philosophy, but as a pragmatic and salutary way of training and living.
Kapleau was an articulate and passionate writer. His emphasis in writing and teaching was that insight and enlightenment are available to anyone, not just austere and isolated Zen monks. Also well known for his views on vegetarianism, peace and compassion, he remains widely read, and is a notable influence on Zen Buddhism as it is practiced in the West. Today, his dharma heirs and former students teach at Zen centers around the world.
Kapleau's book To Cherish All Life: A Buddhist Case for Becoming Vegetarian condemns meat-eating. He argued that Buddhism enjoins vegetarianism on the principle of nonharmfulness.
Grist for the mill
A favorite saying of Philip Kapleau was "Grist for the mill" which means that all of our troubles and trials can be useful or contain some profit to us. In this spirit, his gravestone is one of the millstones from Chapin Mill, the 135-acre (0.55 km2) Buddhist retreat center whose land was donated by a founding member of the Rochester Zen Center, Ralph Chapin.
Lineage
Kapleau appointed several successors, some of whom have subsequently appointed successors or authorized teachers:
Bishop, Mitra (12 Apr 1941-). Founder and head of the Mountain Gate monastic center, NM and the Hidden Valley Zen Center, CA.
Henry, Michael Danan (12 Nov 1939-). Also a teacher appointed by Robert Aitken. Founding teacher at the Zen Center of Denver (now teacher of Old Bones Sangha).
Kempe, Karin Roshi. Teacher at the Denver Zen Center. In 2008 the Head of Zendo at Zen Center of Denver.
Morgareidge, Ken Roshi. Teacher at the Denver Zen Center until 2021. Currently teaching independently. In 2005-2007 the Head of Zendo at Zen Center of Denver.
Sheehan, Peggy Roshi. Teacher at the Denver Zen Center. In 2001-2005 the Head of Zendo at Zen Center of Denver.
Martin, Rafe Roshi. Teacher at Endless Path Zendo, author.
Holmgren, Hoag. Teacher at Mountain Path Sangha, author.
Gifford, Dane Zenson (1949 - 2016). Former teacher at the Toronto Zen Centre.
Graef, Sunyana (1948-). Former teacher at the Toronto Zen Centre, head of the Vermont Center. Teacher at the Casa Zen in Costa Rica.
Henderson, Taigen Sensei (1949-) Since 2005 Dharma Heir of Sunyana Graef and the abbot of the TZC. Teacher at the Toronto Zen Centre.
Kjolhede, Peter Bodhin (1948-). Abbot at the Rochester Zen Center, Teacher at the Madison Zen Center, WI, US.
Odland, Kanja Sensei (1963-). Ordained as a priest in 1999. In 2001 she was authorised to teach by Kjolhede.
Ross, Lanny Sevan Keido Sei'an Sensei (7 Sep 1951-). Also holds the Dharma Transmission in the Jiyu Kennett and Robert Aitken lineages bestowed on him in 2007 by James Zeno Myoun Ford. Former teacher at the Chicago Zen Center in Evanston, IL, US.
Poromaa, Mikael Sante Sensei (1958-). Ordained as a Zen priest in 1991. Kjolhede gave him sanction to teach in 1998. Teacher at the Stockholm Zen Center, Sweden and its affiliate Helsinki Zen Center, Finland.
Wrightson, Charlotte Amala Sensei (1958-). Ordained as a Zen priest in 1999. Sanctioned to teach in 2004. Kjolhede gave her Dharma Transmission in Feb 2012. Teacher at the Auckland Zen Center, New Zealand.
Kjolhede, Sonja Sunya. Teacher at the Windhorse Zen Community, near Asheville, NC. Teacher at the Polish affiliate center (established by D. Gifford) of the Rochester Zen Center. Sister of Peter Kjolhede, wife of Lawson Sachter.
Low, Albert (1928-2016). Teacher at the Montreal Zen Center.
Sachter, Lawson David. Teacher at the Windhorse Zen Community, near Asheville, NC and spiritual director of the Clear Water Zen Center in Florida. Husband of Sonja Kjolhede.
Two students ended their formal affiliation with Philip Kapleau, establishing independent teaching-careers:
Packer, Toni (1927-2013) Teacher at Springwater Center (formerly named Genesee Valley Zen Center), Rochester.
Clarke, Richard (31 Jan 1933-) From 1967 to 1980 a student of Philip Kapleau, but neither ordained by P. Kapleau nor sanctioned by him to teach. Teacher at the Living Dharma Center, Amherst, MA and Coventry, CT
Bibliography
Awakening to Zen (New York: Scribner, 1997)
Straight to the Heart of Zen (Boston: Shambhala, 2001)
The Three Pillars of Zen (New York: Anchor Books, 2000)
The Wheel of Death (London: George Allen & Unwin LTD, 1972)
The Zen of Living and Dying: A Practical and Spiritual Guide (Boston: Shambhala, 1998)
To Cherish All Life: A Buddhist Case for Becoming Vegetarian (San Francisco: Harper & Row, 1982)
Zen: Dawn in the West (Garden City, N.Y.: Anchor Press, 1979)
Zen: Merging of East and West (New York: Anchor Books, 1989)
See also
Buddhism in the United States
Buddhism in the West
Timeline of Zen Buddhism in the United States
Notes
References
External links
Harada-Yasutani School
Kapleau's Teachers
1912 births
2004 deaths
American religious writers
American vegetarianism activists
American Zen Buddhists
People from Connecticut
Sanbo Kyodan Buddhists
Writers from New Haven, Connecticut
Zen Buddhism writers
Zen Buddhist spiritual teachers
|
Adamanterpeton (from Greek: Αδάμαντας adamantas, meaning 'diamond', and Greek: ἑρπετόν herpetón, meaning 'creeping thing') is a genus of edopoid temnospondyl within the family Cochleosauridae. The type species, A. ohioensis, was named in 1998 and is currently the only known species in the genus. Adamanterpeton is rare in the Linton vertebrate assemblage, where other amphibians such as Sauropleura, Ophiderpeton, and Colosteus are more common. Unlike other Linton vertebrates, Adamanterpeton may have been adapted to a terrestrial lifestyle.
History and Discovery
The species is known from only two specimens in the fossil record. The cochleosaurid specimens were discovered in the fossil-rich Allegheny Formation of Linton, Ohio, by John Strong Newberry, but were only formally described and figured by Edward Drinker Cope in 1875. Using Cope's (1875) work, Moodie (1916) attributed the specimens to Macrerpeton huxleyi; however, given that the holotype of M. huxleyi was well described and distinct from the two specimens, this attribution was not recognized by later authors. In 1930, Alfred Romer assigned the specimens to Anthracosaurus. In 1947, Romer revised his previous classification and designated them Leptophractus obsoletus. However, instead of using the type specimen of L. obsoletus, he used the two cochleosaurid specimens and erroneously classified L. obsoletus as a small temnospondyl belonging to Edopoidea. In 1963, Romer revised what constituted the Anthracosaurus material from Linton, Ohio, and in doing so removed the two cochleosaurid specimens from their previous classification, leaving them with no official systematic status. Hook and Baird (1986) identified them as Gaudrya latistoma, but Sequeira and Milner (1993) disproved this classification.
In their 1998 paper, Sequeira and Milner attempted to classify the cochleosaurid specimens and, finding no similarities to existing cochleosaurid genera, considered them a plesiomorphic stem cochleosaurid. They named the corresponding genus Adamanterpeton and assigned the species name ohioensis to the type specimen.
Paleoenvironment and Geology
The species is known only from the Diamond coal mines of Linton, Ohio. Linton fossils are preserved in a coal deposit known as the Upper Freeport coal, a thin deposit of cannel coal that lies beneath a locally thick layer of bituminous coal at the top of the Allegheny Group. The area was likely a meander of a larger river that was cut off, forming an oxbow lake. A reduction in sediment supply to the lake led to an increase in peat-forming mires, which allowed peat to accumulate directly over sapropel in abandoned channel segments; this eventually formed the coal deposits. The considerable depth of the lake is also thought to have contributed to the extended length of time that sapropelic conditions persisted in the area after the lake dried up.
The area is thought to have produced somewhere between 200 and 300 temnospondyl specimens, of which only two represent A. ohioensis, effectively making it the rarest temnospondyl and one of the rarer tetrapods in the assemblage. Milner and Sequeira (1998) attribute its scarcity in the Linton fauna to its terrestrial nature, suggesting it was not naturally found in this aquatic environment and was likely an accidental preservation. C. florensis was found in an assemblage in Florence, Nova Scotia, that is considered similar to the Linton assemblage in its depositional setting. Both C. florensis and A. ohioensis have relatively corrugated skulls measuring between 100 and 500 mm in length, a feature commonly associated with terrestrial temnospondyls, which led Milner and Sequeira to hypothesize that the cochleosaurid family included amphibians and small terrestrial carnivores.
Description
Both specimens consist only of a skull and lack any postcranial material. Neither skull is fully preserved; the type specimen's skull measures about 130 mm in length, and the other specimen's around 120 mm.
The honeycomb pattern produced from the pit-and-ridge system, that is characteristic of Temnospondyls, is also observed in Adamanterpeton. However, this ornament pattern is significantly reduced in the area between the anterior snout and the posterior skull table margin, and in the lacrimals and jugals, which is a pattern also observed in C. bohemicus and C. florensis. The specimens also display a pattern where ossified struts extend from either side of the midline of the premaxilla, which then extend to the midline of the nasals and the frontals, and finish along the parietals. The struts are a feature considered to be characteristic of Cochleosaurids.
Similar to Procochleosaurus, Adamanterpeton retains a pineal foramen, unlike the other members of the cochleosaurid family, in which the pineal foramen is either completely closed or open only in juveniles. It is unknown whether the pineal foramen ever closes in these genera; however, its size in both suggests that it does not, and that they instead retain the primitive condition found in tetrapods.
The skull roof of the specimens was considered representative of the primitive temnospondyl condition, either through direct observation of the features or through inference, which was necessary as regions of the squamosal and quadratojugal were missing from both specimens.
The palate is also considered representative of a primitive temnospondyl, as it retains primitive features such as smaller interpterygoid vacuities and paired vomers, ectopterygoids and pterygoids. The vomers are elongated, as are the prechoanal and interchoanal regions, features that can be associated with elongated choanae and that are all typical of cochleosaurids as well.
The parasphenoid plate of the cranium is wider than it is long in Adamanterpeton, similar to C. bohemicus. It also contains three shallow depressions on its ventral surface that may be homologous with the parasphenoid tubera present in other edopoids.
The mandible was relatively shallow and made up of the same bones typically observed in temnospondyls, a condition also seen in Cochleosaurus and Nigerpeton. This contrasts with Procochleosaurus, in which the mandible is deeper and more robust. They had simple, pointed teeth and fangs that were implanted into the bone, with the tooth socket remaining separate. The teeth were arranged in the typical temnospondyl condition, with three pairs of fangs on the palatal bones.
Classification
Adamanterpeton is classified as a stem cochleosaurid, along with Procochleosaurus; however, there is uncertainty about whether Adamanterpeton or Procochleosaurus is the more stemward, as despite Adamanterpeton having all the cochleosaurid character states, it lacks the apomorphies shared by more derived cochleosaurids. The Procochleosaurus material is quite scarce (less than 50% complete), so its phylogenetic position remains ambiguous and open to interpretation.
Despite the ambiguity in the order of the stem groups, there has been repeated support for a monophyletic family Cochleosauridae consisting of Procochleosaurus and Adamanterpeton as stem taxa to a crown group of Cochleosaurus, Nigerpeton and Chenoprosopus. The characters diagnosing this family (pineal foramen closure, anteriorly broad choanae, extreme anterior vomerine elongation, and jugal expansion on the palatal surface) are generally agreed upon by researchers, although there has been pushback against the inclusion of pineal foramen closure.
References
Cochleosauridae
Carboniferous temnospondyls of North America
Fossil taxa described in 1998
Prehistoric amphibian genera
Carboniferous amphibians
Peat
|
```go
// Code generated by smithy-go-codegen DO NOT EDIT.
package eventbridge
import (
"context"
"fmt"
awsmiddleware "github.com/aws/aws-sdk-go-v2/aws/middleware"
"github.com/aws/aws-sdk-go-v2/service/eventbridge/types"
"github.com/aws/smithy-go/middleware"
smithyhttp "github.com/aws/smithy-go/transport/http"
"time"
)
// Get the information about an existing global endpoint. For more information
// about global endpoints, see [Making applications Regional-fault tolerant with global endpoints and event replication]in the Amazon EventBridge User Guide .
//
// [Making applications Regional-fault tolerant with global endpoints and event replication]: path_to_url
func (c *Client) DescribeEndpoint(ctx context.Context, params *DescribeEndpointInput, optFns ...func(*Options)) (*DescribeEndpointOutput, error) {
if params == nil {
params = &DescribeEndpointInput{}
}
result, metadata, err := c.invokeOperation(ctx, "DescribeEndpoint", params, optFns, c.addOperationDescribeEndpointMiddlewares)
if err != nil {
return nil, err
}
out := result.(*DescribeEndpointOutput)
out.ResultMetadata = metadata
return out, nil
}
type DescribeEndpointInput struct {
// The name of the endpoint you want to get information about. For example,
// "Name":"us-east-2-custom_bus_A-endpoint" .
//
// This member is required.
Name *string
// The primary Region of the endpoint you want to get information about. For
// example "HomeRegion": "us-east-1" .
HomeRegion *string
noSmithyDocumentSerde
}
type DescribeEndpointOutput struct {
// The ARN of the endpoint you asked for information about.
Arn *string
// The time the endpoint you asked for information about was created.
CreationTime *time.Time
// The description of the endpoint you asked for information about.
Description *string
// The ID of the endpoint you asked for information about.
EndpointId *string
// The URL of the endpoint you asked for information about.
EndpointUrl *string
// The event buses being used by the endpoint you asked for information about.
EventBuses []types.EndpointEventBus
// The last time the endpoint you asked for information about was modified.
LastModifiedTime *time.Time
// The name of the endpoint you asked for information about.
Name *string
// Whether replication is enabled or disabled for the endpoint you asked for
// information about.
ReplicationConfig *types.ReplicationConfig
// The ARN of the role used by the endpoint you asked for information about.
RoleArn *string
// The routing configuration of the endpoint you asked for information about.
RoutingConfig *types.RoutingConfig
// The current state of the endpoint you asked for information about.
State types.EndpointState
// The reason the endpoint you asked for information about is in its current state.
StateReason *string
// Metadata pertaining to the operation's result.
ResultMetadata middleware.Metadata
noSmithyDocumentSerde
}
func (c *Client) addOperationDescribeEndpointMiddlewares(stack *middleware.Stack, options Options) (err error) {
if err := stack.Serialize.Add(&setOperationInputMiddleware{}, middleware.After); err != nil {
return err
}
err = stack.Serialize.Add(&awsAwsjson11_serializeOpDescribeEndpoint{}, middleware.After)
if err != nil {
return err
}
err = stack.Deserialize.Add(&awsAwsjson11_deserializeOpDescribeEndpoint{}, middleware.After)
if err != nil {
return err
}
if err := addProtocolFinalizerMiddlewares(stack, options, "DescribeEndpoint"); err != nil {
return fmt.Errorf("add protocol finalizers: %v", err)
}
if err = addlegacyEndpointContextSetter(stack, options); err != nil {
return err
}
if err = addSetLoggerMiddleware(stack, options); err != nil {
return err
}
if err = addClientRequestID(stack); err != nil {
return err
}
if err = addComputeContentLength(stack); err != nil {
return err
}
if err = addResolveEndpointMiddleware(stack, options); err != nil {
return err
}
if err = addComputePayloadSHA256(stack); err != nil {
return err
}
if err = addRetry(stack, options); err != nil {
return err
}
if err = addRawResponseToMetadata(stack); err != nil {
return err
}
if err = addRecordResponseTiming(stack); err != nil {
return err
}
if err = addClientUserAgent(stack, options); err != nil {
return err
}
if err = smithyhttp.AddErrorCloseResponseBodyMiddleware(stack); err != nil {
return err
}
if err = smithyhttp.AddCloseResponseBodyMiddleware(stack); err != nil {
return err
}
if err = addSetLegacyContextSigningOptionsMiddleware(stack); err != nil {
return err
}
if err = addTimeOffsetBuild(stack, c); err != nil {
return err
}
if err = addUserAgentRetryMode(stack, options); err != nil {
return err
}
if err = addOpDescribeEndpointValidationMiddleware(stack); err != nil {
return err
}
if err = stack.Initialize.Add(newServiceMetadataMiddleware_opDescribeEndpoint(options.Region), middleware.Before); err != nil {
return err
}
if err = addRecursionDetection(stack); err != nil {
return err
}
if err = addRequestIDRetrieverMiddleware(stack); err != nil {
return err
}
if err = addResponseErrorMiddleware(stack); err != nil {
return err
}
if err = addRequestResponseLogging(stack, options); err != nil {
return err
}
if err = addDisableHTTPSMiddleware(stack, options); err != nil {
return err
}
return nil
}
func newServiceMetadataMiddleware_opDescribeEndpoint(region string) *awsmiddleware.RegisterServiceMetadata {
return &awsmiddleware.RegisterServiceMetadata{
Region: region,
ServiceID: ServiceID,
OperationName: "DescribeEndpoint",
}
}
```
|
Saqal () was a Turgesh Qaghan. According to Yuri Zuev, he was a Manichaean, and his name was possibly derived from the Manichaean theonym Sakla, which means "Creator of the World". Other reconstructions of the name are Saqal and Soq.
Early reign
Suoge succeeded his father, Wuzhile, to the Turgesh throne. However, the Tang court did not acknowledge him as a Khagan, and instead, appointed him as Commander of the Walu Province (嗢鹿州都督). They gave him the title of Prince of Huaide (懷徳郡王), which made him a subordinate of Ashina Huaidao (who was Shixing Qaghan). Later, the deputy of Guo Yuanzhen, Jie Wan (解琬), was sent to bestow him with the title of Prince of Jinhe (金河郡王) in 708.
Conflict with Tang
Eventually, the relationship between Suoge and the Tang court deteriorated. Suoge's subordinate, Juechuo Zhongjie (闕啜忠節), rebelled against his command but was unable to prevail. At the suggestion of Guo Yuanzhen, in 708, Juechuo was to give up his military forces and return to Chang'an, the Tang capital. The Tang general Zhou Yiti (周以悌) instead persuaded Juechuo to bribe the chancellors Zong Chuke and Ji Chuna into launching an attack against Suoge, which he did successfully. After accepting the bribe, Zong proposed to Emperor Zhongzong an attack on Suoge in alliance with the Tibetan Empire, to which the emperor agreed despite Guo's opposition. The emperor appointed Ashina Xian (a son of Ashina Yuanqing) as Shixing Qaghan and sent him to capture Suyab. The attack did not succeed: Zhongjie was captured and Xian was defeated by Suoge.
Suoge's brother, Zhenu (遮努), gathered an army of 20,000 and successfully attacked several Tang outposts: Kucha, Bohuan (modern Aksu, Xinjiang), Yanqi and Yingzhan. As Guo was hesitant to send an army to Shule, Suoge grew more self-reliant and declared himself Khagan. Later, after wreaking havoc on Anxi, Suoge sent an envoy to Chang'an to demand that Zong be executed. In response, Guo was replaced by Zhou Yiti. Meanwhile, Suoge sent a letter claiming his innocence to Guo. Guo, in turn, reported the facts of the situation to Emperor Zhongzong, which led the Zongs to accuse him of treason. However, Emperor Zhongzong sided with Guo and sent an envoy to make peace with Suoge, making him Shisixing Qaghan () in 709. He also sent Zhou Yiti into exile in Baizhou.
End of reign
According to Takeshi Osawa, the mediator who sent the peace envoy to Suoge was the Kyrgyz Khaganate's ruler, Bars Bek, a khagan closely controlled by Qapaghan Qaghan and a brother-in-law of the future Bilge Qaghan. Bars Bek secretly plotted a triple alliance with the Tang and the Turgesh, but Tonyukuk learned of his plans with the help of Zhenu (遮努), Suoge's brother, who had rebelled against him and deserted to the Second Turkic Khaganate. Tonyukuk made a surprise night attack on the Kirghiz in 710; Bars Bek was killed, and Tonyukuk then moved against the Turgesh. However, Qapaghan's khatun died soon afterward, which caused the khagan to order a halt to the attack. Suoge used this opportunity to advance, but Tonyukuk disobeyed the order and ambushed him. After a disastrous defeat at Bolchu, Suoge was executed by the future Bilge Qaghan. His lands were given as an appanage to Inal, son of Qapaghan.
References
Further reading
Werner Sundermann, “MANICHEISM iii. THE MANICHEAN PANDAEMONIUM,” Encyclopædia Iranica, online edition, 2018, available at http://www.iranicaonline.org/articles/manicheism-pandaemonium (accessed on 12 April 2018).
Türgesh khagans
711 deaths
8th-century monarchs in Asia
8th-century murdered monarchs
|
In the Battle of al-Sannabra (1113), a Crusader army led by King Baldwin I of Jerusalem was defeated by a Muslim army sent by the Sultan of the Seljuk Turks and commanded by Mawdud ibn Altuntash of Mosul.
Background
Beginning in 1110, the Seljuk Sultan Muhammad I in Baghdad ordered invasions of the Crusader states for six successive years. "In 1110, 1112, and 1114 the city of Edessa was the objective; in 1113 Galilee was invaded, and in 1111 and 1115 the Latin possessions which lay east of the Orontes between Aleppo and Shaizar."
The attack on Edessa in 1110 failed to take the city. In 1111, Mawdud of Mosul led a host which fought Baldwin I's Frankish army to a draw in the Battle of Shaizar. Afterward, the Muslim leader's army dispersed because of its lack of success and plunder. In 1112 and 1114, the Muslim counterattack against Edessa was weak. In the other four years, the Crusader states - the Kingdom of Jerusalem, Principality of Antioch, County of Tripoli and County of Edessa - joined forces in defense.
Battle
In 1113, Mawdud joined Toghtekin of Damascus and their combined army aimed to cross the Jordan River south of the Sea of Galilee. Baldwin I offered battle near the bridge of al-Sannabra. Mawdud used the device of a feigned flight to entice Baldwin I into rashly ordering a charge. The Frankish army was surprised and beaten when it unexpectedly ran into the main Turkish army.
The surviving Crusaders kept their cohesion and fell back to a hill west of the inland sea where they fortified their camp. In this position they were reinforced from Tripoli and Antioch but remained inert. A number of Christian pilgrims also rallied to the army after al-Sannabra.
Unable to annihilate the Crusaders, Mawdud watched them with his main army while sending raiding columns to ravage the countryside and sack the town of Nablus. In this, Mawdud anticipated the strategy of Saladin in two later campaigns that were marked by the Battle of Belvoir Castle (1182) and the Battle of Al-Fule (1183). As in these campaigns, the Frankish field army could oppose the main Muslim army, but it could not stop raiding forces from doing great damage to crops and towns.
While the Turkish raiders roamed freely through Crusader lands, the local Muslim farmers entered into friendly relations with them. This deeply troubled the Frankish land magnates, who ultimately depended upon rents from cultivators of the soil.
Aftermath
Mawdud was unable to make any permanent conquests after his victory. Soon afterward, he was assassinated and Aq-Sunqur Bursuqi took command of the failed attempt against Edessa in 1114. Roger of Salerno routed the last Seljuk invading army at the Battle of Sarmin after a protracted campaign in 1115.
References
Smail, R. C. Crusading Warfare 1097-1193. New York: Barnes & Noble Books, (1956) 1995.
Battles involving the Seljuk Empire
1110s in the Kingdom of Jerusalem
Conflicts in 1113
1113 in Asia
Battles involving the Kingdom of Jerusalem
|
```typescript
export default class AccountsError extends Error {
public packageName?: string;
public functionName?: string;
public reason?: string;
constructor(packageName?: string, functionName?: string, reason?: string) {
// Build Error message from parameters
const message = reason
? `[ Accounts - ${packageName} ] ${functionName} : ${reason}`
: packageName;
// Build the underlying Error
super(message);
// Assign parameters for future use
this.packageName = packageName;
this.functionName = functionName;
this.reason = reason;
// Set the prototype to AccountsError so "instanceof AccountsError" returns true
Object.setPrototypeOf(this, AccountsError.prototype);
// Recapture the stack trace so that this constructor does not appear in it
if (typeof (Error as any).captureStackTrace === 'function') {
(Error as any).captureStackTrace(this, this.constructor);
} else {
this.stack = new Error(message).stack;
}
}
}
```
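A minimal usage sketch of the class above, showing how the message is assembled from the parameters and why the `Object.setPrototypeOf` call matters for `instanceof` checks. The class is re-declared in trimmed form so the snippet runs standalone, and the package and function names passed in are hypothetical, chosen only for illustration:

```typescript
// Trimmed re-declaration of AccountsError so this sketch is self-contained.
class AccountsError extends Error {
  constructor(
    public packageName?: string,
    public functionName?: string,
    public reason?: string,
  ) {
    // Same message-building logic as above: full template when a reason is
    // given, otherwise fall back to using packageName as the whole message.
    super(
      reason
        ? `[ Accounts - ${packageName} ] ${functionName} : ${reason}`
        : packageName,
    );
    // Without this, `err instanceof AccountsError` can be false when the
    // compiler targets ES5, because Error's constructor resets the prototype.
    Object.setPrototypeOf(this, AccountsError.prototype);
  }
}

// Hypothetical package/function names for illustration only.
const err = new AccountsError('password', 'loginWithPassword', 'User not found');
console.log(err.message); // "[ Accounts - password ] loginWithPassword : User not found"
console.log(err instanceof AccountsError); // true
console.log(err instanceof Error); // true
```

The `setPrototypeOf` call is the standard workaround for subclassing built-ins when compiling to ES5, where `Error`'s constructor replaces the prototype chain of the object being constructed.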
|
```antlr
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *     path_to_url
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */
grammar RQLStatement;
import BaseRule;
showShardingTableRules
: SHOW SHARDING TABLE (tableRule | RULES) (FROM databaseName)?
;
showShardingTableReferenceRules
: SHOW SHARDING TABLE REFERENCE (RULE ruleName | RULES) (FROM databaseName)?
;
showShardingAlgorithms
: SHOW SHARDING ALGORITHMS (FROM databaseName)?
;
showShardingAuditors
: SHOW SHARDING AUDITORS (FROM databaseName)?
;
showShardingTableNodes
: SHOW SHARDING TABLE NODES tableName? (FROM databaseName)?
;
showShardingKeyGenerators
: SHOW SHARDING KEY GENERATORS (FROM databaseName)?
;
showDefaultShardingStrategy
: SHOW DEFAULT SHARDING STRATEGY (FROM databaseName)?
;
showUnusedShardingAlgorithms
: SHOW UNUSED SHARDING ALGORITHMS (FROM databaseName)?
;
showUnusedShardingKeyGenerators
: SHOW UNUSED SHARDING KEY GENERATORS (FROM databaseName)?
;
showUnusedShardingAuditors
: SHOW UNUSED SHARDING AUDITORS (FROM databaseName)?
;
showShardingTableRulesUsedAlgorithm
: SHOW SHARDING TABLE RULES USED ALGORITHM shardingAlgorithmName (FROM databaseName)?
;
showShardingTableRulesUsedKeyGenerator
: SHOW SHARDING TABLE RULES USED KEY GENERATOR keyGeneratorName (FROM databaseName)?
;
showShardingTableRulesUsedAuditor
: SHOW SHARDING TABLE RULES USED AUDITOR auditorName (FROM databaseName)?
;
countShardingRule
: COUNT SHARDING RULE (FROM databaseName)?
;
tableRule
: RULE tableName
;
databaseName
: IDENTIFIER_
;
```
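For illustration, a few statements that the rules above accept (database and rule names are hypothetical):

```sql
SHOW SHARDING TABLE RULES FROM sharding_db;
SHOW SHARDING TABLE RULE t_order;
SHOW SHARDING ALGORITHMS;
SHOW UNUSED SHARDING KEY GENERATORS FROM sharding_db;
COUNT SHARDING RULE;
```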
|
Async and defer scripts
Form a `URL` from its parts
Vibration API
Page Visibility API
Fetch API
|
```javascript
'use strict';
const {isNodeMatches} = require('./utils/is-node-matches.js');
const {isMethodCall} = require('./ast/index.js');
const {removeMethodCall} = require('./fix/index.js');
const MESSAGE_ID = 'prefer-array-flat-map';
const messages = {
[MESSAGE_ID]: 'Prefer `.flatMap()` over `.map().flat()`.',
};
const ignored = ['React.Children', 'Children'];
/** @param {import('eslint').Rule.RuleContext} context */
const create = context => ({
CallExpression(callExpression) {
if (!(
isMethodCall(callExpression, {
method: 'flat',
optionalCall: false,
optionalMember: false,
})
&& (
callExpression.arguments.length === 0
|| (
callExpression.arguments.length === 1
&& callExpression.arguments[0].type === 'Literal'
&& callExpression.arguments[0].raw === '1'
)
)
&& isMethodCall(callExpression.callee.object, {
method: 'map',
optionalCall: false,
optionalMember: false,
})
)) {
return;
}
const flatCallExpression = callExpression;
const mapCallExpression = flatCallExpression.callee.object;
if (isNodeMatches(mapCallExpression.callee.object, ignored)) {
return;
}
const {sourceCode} = context;
const mapProperty = mapCallExpression.callee.property;
return {
node: flatCallExpression,
loc: {start: mapProperty.loc.start, end: flatCallExpression.loc.end},
messageId: MESSAGE_ID,
* fix(fixer) {
// Removes:
// map().flat();
// ^^^^^^^
// (map()).flat();
// ^^^^^^^
yield * removeMethodCall(fixer, flatCallExpression, sourceCode);
// Renames:
// map().flat();
// ^^^
// (map()).flat();
// ^^^
yield fixer.replaceText(mapProperty, 'flatMap');
},
};
},
});
/** @type {import('eslint').Rule.RuleModule} */
module.exports = {
create,
meta: {
type: 'suggestion',
docs: {
description: 'Prefer `.flatMap()` over `.map().flat()`.',
recommended: true,
},
fixable: 'code',
messages,
},
};
```
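The rewrite this rule performs is behavior-preserving for a flatten depth of 1, which is why only bare `.flat()` and `.flat(1)` calls are matched. A small sketch of the before/after code:

```javascript
const parts = ['a b', 'c'];

// What the rule reports: a `.map().flat()` chain
const before = parts.map(s => s.split(' ')).flat();

// What the fixer produces: a single `.flatMap()` pass
const after = parts.flatMap(s => s.split(' '));

console.log(before); // [ 'a', 'b', 'c' ]
console.log(after);  // [ 'a', 'b', 'c' ]
```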
|
Louis Verreydt (25 November 1950 – 13 August 1977) was a Belgian cyclist. He competed in the team time trial at the 1972 Summer Olympics. He died of a heart attack in 1977.
References
External links
1950 births
1977 deaths
Belgian male cyclists
Olympic cyclists for Belgium
Cyclists at the 1972 Summer Olympics
Cyclists from Antwerp
UCI Road World Champions (elite men)
|
```javascript
/**
* @license Apache-2.0
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* path_to_url
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
'use strict';
// MODULES //
var tape = require( 'tape' );
var pow = require( '@stdlib/math/base/special/pow' );
var ln = require( '@stdlib/math/base/special/ln' );
var SMALLEST_NORMAL = require( '@stdlib/constants/float64/smallest-normal' );
var FLOAT64_MIN_LN = require( './../lib' );
// TESTS //
tape( 'main export is a number', function test( t ) {
t.ok( true, __filename );
t.strictEqual( typeof FLOAT64_MIN_LN, 'number', 'main export is a number' );
t.end();
});
tape( 'export is a double-precision floating-point number equal to the natural logarithm of the smallest normalized double-precision floating-point number', function test( t ) {
t.equal( FLOAT64_MIN_LN, -ln( pow( 2, 1022 ) ), 'equals the logarithm of the smallest normalized double-precision floating-point number' );
t.equal( FLOAT64_MIN_LN, ln( SMALLEST_NORMAL ), 'equals the logarithm of the smallest normalized double-precision floating-point number' );
t.end();
});
```
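The identity being tested can be checked with plain `Math` builtins: the smallest normal double is 2^-1022, so its natural logarithm is -1022·ln(2) ≈ -708.396 (the constant names below mirror the stdlib packages, for illustration only):

```javascript
const SMALLEST_NORMAL = 2 ** -1022; // smallest positive normal double-precision number
const FLOAT64_MIN_LN = Math.log(SMALLEST_NORMAL);

// Computed independently as -1022 * ln(2)
const expected = -1022 * Math.LN2;
console.log(Math.abs(FLOAT64_MIN_LN - expected) < 1e-9); // true
```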
|
Anton Ausserdorfer (11 March 1836, in Anras – 16 September 1885, in Hall, Tirol) was an Austrian clergyman and botanical collector.
He served as a curate in Windisch-Matrei, and was a good friend of fellow clergyman/botanist Rupert Huter (1834–1919). He collected mainly in South Tyrol.
Plants with the specific epithet of ausserdorferi are named in his honor, an example being the grass species Avenastrum ausserdorferi.
Associated works
Katalog zum Herbar des Abiturienten Anton Außerdorfer : Alphabetisch geordnet und mit Einleitung über das Studium der Pflanzenkunde in Tirol bis zum Jahre 1855 versehen; Vinzenz Gasser, in: "Studien und Mitteilungen zur Geschichte des Benediktinerordens und seiner Zweige".
References
1836 births
1885 deaths
People from Lienz District
Austrian clergy
19th-century Austrian botanists
|
```javascript
import warning from 'warning';
import { isFunction } from './utils';
import prefixedDispatch from './prefixedDispatch';
export function run(subs, model, app, onError) {
const funcs = [];
const nonFuncs = [];
for (const key in subs) {
if (Object.prototype.hasOwnProperty.call(subs, key)) {
const sub = subs[key];
const unlistener = sub(
{
dispatch: prefixedDispatch(app._store.dispatch, model),
history: app._history,
},
onError,
);
if (isFunction(unlistener)) {
funcs.push(unlistener);
} else {
nonFuncs.push(key);
}
}
}
return { funcs, nonFuncs };
}
export function unlisten(unlisteners, namespace) {
if (!unlisteners[namespace]) return;
const { funcs, nonFuncs } = unlisteners[namespace];
warning(
nonFuncs.length === 0,
`[app.unmodel] subscription should return unlistener function, check these subscriptions ${nonFuncs.join(
', ',
)}`,
);
for (const unlistener of funcs) {
unlistener();
}
delete unlisteners[namespace];
}
```
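A self-contained sketch (with the `app` internals stubbed out, subscription names hypothetical) of the contract `run` and `unlisten` enforce: every subscription should return an unlistener function, and anything else is collected by key so it can be reported:

```javascript
const isFunction = v => typeof v === 'function';

// Simplified version of `run` above: the real code passes {dispatch, history} and onError
function run(subs) {
  const funcs = [];
  const nonFuncs = [];
  for (const key in subs) {
    const result = subs[key]();
    if (isFunction(result)) {
      funcs.push(result);
    } else {
      nonFuncs.push(key);
    }
  }
  return { funcs, nonFuncs };
}

let stopped = false;
const { funcs, nonFuncs } = run({
  good: () => () => { stopped = true; }, // returns an unlistener
  bad: () => 42,                         // forgets to return one
});

funcs.forEach(unlistener => unlistener());
console.log(stopped);  // true
console.log(nonFuncs); // [ 'bad' ]
```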
|
```yaml
apiVersion: v1
kind: Namespace
metadata:
labels:
istio-injection: enabled
name: ns-override
---
apiVersion: v1
kind: Namespace
metadata:
labels:
istio-injection: enabled
name: workload-override
---
# ProxyConfig for root namespace
apiVersion: networking.istio.io/v1beta1
kind: ProxyConfig
metadata:
name: valid-example-pc
namespace: istio-system
---
# ProxyConfig for ns-override namespace
apiVersion: networking.istio.io/v1beta1
kind: ProxyConfig
metadata:
name: valid-example-pc
namespace: ns-override
spec:
image:
imageType: distroless
---
# ProxyConfig for workload-override namespace
apiVersion: networking.istio.io/v1beta1
kind: ProxyConfig
metadata:
name: valid-example-pc
namespace: workload-override
spec:
selector:
matchLabels:
app: details
workload: details
image:
imageType: distroless
---
# Pod with ProxyConfig for ns-override namespace should not get a warning.
apiVersion: v1
kind: Pod
metadata:
labels:
app: details
name: details-v1-pod-ns-override
namespace: ns-override
spec:
containers:
- image: docker.io/istio/examples-bookinfo-details-v1:1.15.0
name: details
- image: docker.io/istio/proxyv2:1.3.1-distroless
name: istio-proxy
---
# Pod with ProxyConfig for workload-override namespace should not get a warning.
apiVersion: v1
kind: Pod
metadata:
labels:
app: details
workload: details
name: details-v1-pod-workload-override
namespace: workload-override
spec:
containers:
- image: docker.io/istio/examples-bookinfo-details-v1:1.15.0
name: details
- image: docker.io/istio/proxyv2:1.3.1-distroless
name: istio-proxy
---
apiVersion: v1
kind: Namespace
metadata:
labels:
istio-injection: enabled
name: annotation-override
---
# Pod with annotation override should not get a warning.
apiVersion: v1
kind: Pod
metadata:
annotations:
proxy.istio.io/config: |
image:
imageType: distroless
labels:
app: details
name: details-v1-pod-annotation-override
namespace: annotation-override
spec:
containers:
- image: docker.io/istio/examples-bookinfo-details-v1:1.15.0
name: details
- image: docker.io/istio/proxyv2:1.3.1-distroless
name: istio-proxy
```
|
The Guelph Memorial Gardens was an arena located in Guelph, Ontario. It was originally built in 1948 out of the remnants of a nineteenth-century building that had housed the Royal Winter Fair. The Gardens hosted various hockey teams over the years, including the Guelph Biltmore Mad Hatters, Guelph Platers and Guelph Storm. The arena had 3,999 seats and around 300 standing-room-only spots.
The last hockey game played at the arena was on March 24, 2000, and the building closed permanently in November 2001 with the opening of the Sleeman Centre (Guelph).
Demolition of the arena commenced in December 2005. The City of Guelph built a new city hall on the site, which is next door to the former city hall.
External links
The OHL Arena & Travel Guide - Guelph Memorial Gardens
Defunct indoor arenas in Canada
Defunct indoor ice hockey venues in Canada
Sports venues in Ontario
Ontario Hockey League arenas
Guelph Storm
Buildings and structures in Guelph
|
```yaml
### YamlMime:FAQ
metadata:
title: FAQ Azure VM sizes with no local temporary disk
description: This article provides answers to frequently asked questions (FAQ) about Microsoft Azure VM sizes that don't have a local temporary disk.
author: brbell
ms.service: azure-virtual-machines
ms.topic: faq
ms.subservice: sizes
ms.author: brbell
ms.reviewer: mimckitt
ms.date: 06/15/2020
title: Azure VM sizes with no local temporary disk
summary: |
> [!TIP]
> Try the **[Virtual Machine selector tool](path_to_url)** to find other sizes that best fit your workload.
This article provides answers to frequently asked questions (FAQ) about Azure VM sizes that don't have a local temporary disk (that is, no local temp disk).
sections:
- name: Ignored
questions:
- question: |
What does no local temp disk mean?
answer: |
Traditionally, we have had VM sizes (for example, Standard_D2s_v3, Standard_E48_v3) that include a small local disk (that is, a D: Drive). With the VM series such as [Dasv5](dasv5-dadsv5-series.md) and [Easv5](easv5-eadsv5-series.md) that small local disk no longer exists. However, you can still attach Standard HDD, Premium SSD or Ultra SSD to use as remote storage.
- question: |
What if I still want a local temp disk?
answer: |
If your workload requires a local temporary disk, we still offer sizes such as the [Dadsv5](dasv5-dadsv5-series.md).
> [!NOTE]
> Local temporary disk isn't persistent; to ensure your data is persistent, please use Standard HDD, Premium SSD or Ultra SSD options.
- question: |
Can I resize a VM size that has a local temp disk to a VM size with no local temp disk?
answer: |
No. The only combinations allowed for resizing are:
1. VM (with local temp disk) -> VM (with local temp disk); and
2. VM (with no local temp disk) -> VM (with no local temp disk).
If you are interested in a workaround, see the next question.
> [!NOTE]
> If an image depends on the resource disk, or a pagefile or swapfile exists on the local temp disk, the diskless images won't work; instead, use the with-disk alternative.
- question: |
How do I migrate from a VM size with local temp disk to a VM size with no local temp disk?
answer: |
You can migrate by following these steps:
1. Connect to your Virtual Machine that has a local temporary disk (for example, a D: Drive) as a local admin.
2. Follow the guidelines on the "Temporarily move pagefile.sys to C drive" section of [Use the D: drive as a data drive on a Windows VM](./windows/change-drive-letter.md) to move the page file from the local temporary disk (D: drive) to the C: drive.
> [!NOTE]
> Follow the guidelines on the "Temporarily move pagefile.sys to C drive" section of Use the D: drive as a data drive on a Windows VM to move the page file from the local temporary disk (D: drive) to the C: drive. **Deviation from the steps outlined will lead to the error message: "Unable to resize the VM since changing from resource disk to non-resource disk VM size and vice-versa is not allowed."**
3. Take a snapshot of the VM by following the steps outlined in [Create a snapshot using the portal or Azure CLI](./linux/snapshot-copy-managed-disk.md).
4. Use snapshot to create a new diskless VM (such as, Dv5, Dsv5, Dasv5, Ev5, Esv5, Easv5 series) by following the steps outlined in [Create a virtual machine from a snapshot with CLI](/previous-versions/azure/virtual-machines/scripts/virtual-machines-linux-cli-sample-create-vm-from-snapshot).
- question: |
Do these VM sizes support both Linux and Windows Operating Systems (OS)?
answer: |
Yes.
- question: |
Will this break my custom scripts, custom images or OS images that have scratch files or page files on a local temp disk?
answer: |
If the custom OS image points to the local temp disk, the image might not work correctly with this diskless size.
additionalContent: |
## Next steps
In this document, you learned more about the most frequent questions related to Azure VMs with no local temp disk. For more information about these VM sizes, see the following articles:
- [Specifications for Dv5 and Dsv5-series](dv5-dsv5-series.md)
- [Specifications for Dasv5 and Dadsv5-series](dasv5-dadsv5-series.md)
- [Specifications for Ev5 and Esv5-series](ev5-esv5-series.md)
- [Specifications for Easv5 and Eadsv5-series](easv5-eadsv5-series.md)
```
|
The 2018–19 Coppa Titano was the 61st edition of the football competition in San Marino. The cup was contested under a new format this season. The competition began on 23 October 2018 and ended on 19 April 2019.
S.P. La Fiorita were the defending cup champions after winning the previous tournament by defeating S. P. Tre Penne in the final by the score of 3–2 after extra time.
Format
The draw for the opening round was held on 30 August 2018. The final was contested over one leg, all other rounds were contested over two legs. When a winner was not determined in regular time, extra time and then penalties were used to determine a winner.
First round
The first legs of the first round were played on 23–24 October 2018, and the second legs were played on 6–7 November 2018. The draw for the first round was held on 30 August 2018.
Quarter–finals
The first legs of the quarter–finals were played on 4–5 December 2018, and the second legs were played on 16 December 2018.
Semi–finals
The first legs of the semi–finals were played on 6–7 April 2019, and the second legs were played on 13–14 April 2019.
Final
The final was played 19 April 2019.
Bracket
See also
2018–19 Campionato Sammarinese di Calcio
External links
official site (Italian)
uefa.com
References
Coppa Titano seasons
San Marino
Coppa Titano
|
This is a list of the breeds of ass or donkey considered in France to be of French origin.
Recognised breeds
Baudet du Poitou
Bourbonnais Donkey
Cotentin Donkey
Grand Noir du Berry
Norman donkey
Provence donkey
Pyrenean donkey
Minor and extinct breeds
Petit Gris du Berry
References
Donkey
|
```sqlpl
drop database if exists `seata`;
CREATE DATABASE `seata` CHARSET utf8;
use `seata`;
-- -------------------------------- The script used when storeMode is 'db' --------------------------------
-- the table to store GlobalSession data
CREATE TABLE IF NOT EXISTS `global_table`
(
`xid` VARCHAR(128) NOT NULL,
`transaction_id` BIGINT,
`status` TINYINT NOT NULL,
`application_id` VARCHAR(32),
`transaction_service_group` VARCHAR(32),
`transaction_name` VARCHAR(128),
`timeout` INT,
`begin_time` BIGINT,
`application_data` VARCHAR(2000),
`gmt_create` DATETIME,
`gmt_modified` DATETIME,
PRIMARY KEY (`xid`),
KEY `idx_gmt_modified_status` (`gmt_modified`, `status`),
KEY `idx_transaction_id` (`transaction_id`)
) ENGINE = InnoDB
DEFAULT CHARSET = utf8;
-- the table to store BranchSession data
CREATE TABLE IF NOT EXISTS `branch_table`
(
`branch_id` BIGINT NOT NULL,
`xid` VARCHAR(128) NOT NULL,
`transaction_id` BIGINT,
`resource_group_id` VARCHAR(32),
`resource_id` VARCHAR(256),
`branch_type` VARCHAR(8),
`status` TINYINT,
`client_id` VARCHAR(64),
`application_data` VARCHAR(2000),
`gmt_create` DATETIME(6),
`gmt_modified` DATETIME(6),
PRIMARY KEY (`branch_id`),
KEY `idx_xid` (`xid`)
) ENGINE = InnoDB
DEFAULT CHARSET = utf8;
-- the table to store lock data
CREATE TABLE IF NOT EXISTS `lock_table`
(
`row_key` VARCHAR(128) NOT NULL,
`xid` VARCHAR(96),
`transaction_id` BIGINT,
`branch_id` BIGINT NOT NULL,
`resource_id` VARCHAR(256),
`table_name` VARCHAR(32),
`pk` VARCHAR(36),
`gmt_create` DATETIME,
`gmt_modified` DATETIME,
PRIMARY KEY (`row_key`),
KEY `idx_branch_id` (`branch_id`)
) ENGINE = InnoDB
DEFAULT CHARSET = utf8;
```
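As an illustration of how the three tables relate, a recovery-style lookup (a hypothetical query, not part of the script) joins branch sessions and row locks back to their global session by `xid`:

```sql
SELECT g.xid,
       g.status   AS global_status,
       b.branch_id,
       b.status   AS branch_status,
       l.row_key
FROM global_table g
JOIN branch_table b ON b.xid = g.xid
LEFT JOIN lock_table l ON l.branch_id = b.branch_id;
```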
|
Dhayah Fort () is an 18th-century fortification in Ras Al Khaimah, United Arab Emirates (UAE). It is the highest hilltop fort in the UAE and was the site of a battle during the Persian Gulf campaign of 1819, when British troops captured the fort after a brief siege.
The fall of Dhayah was to pave the way for the signing of the General Maritime Treaty of 1820, the first of a number of treaties between the British government and the Sheikhs, or rulers, of what would become known as the Trucial Coast.
History
The fort was the last outpost of Al Qasimi to be captured by British forces in 1819, when a punitive expedition was dispatched from Bombay to quell the seafaring tribe, on supposed charges of piracy against British-flagged merchantmen.
Ras Al Khaimah fell to the British force on 9 December 1819. Following this, three ships were sent to blockade the nearby village of Rams to the north. They landed a force on 18 December, which fought its way inland through date plantations to the hilltop fort of Dhayah on the 19th. There, 398 men and another 400 women and children held out, without sanitation, water or effective cover from the sun, for three days under heavy fire from mortars and 12-pound cannon.
The two 24-pound cannons from HMS Liverpool, which had been used to bombard Ras Al Khaimah town from the landward side, were once again pressed into use and dragged across the plain from the coast at Rams, a journey of some four miles. Each of the guns weighed over 2 tonnes. After enduring two hours of sustained fire from the big guns, which breached the fort's walls, the last of the Al Qasimi surrendered at 10.30 on the morning of 22 December.
Many of the people in the fort were herders and farmers from the date groves of Dhayah who had fled there on the arrival of the British and of the 798 people who surrendered, only 177 were identified as fighting men. The British flag was briefly flown from the fort before it was demolished. British losses from the action at Dhayah included 1 officer and 3 men killed and 16 wounded.
The British expeditionary force then blew up the town of Ras Al Khaimah and established a garrison there of 800 sepoys and artillery, before visiting Jazirat Al Hamra, which was found to be deserted. They went on to destroy the fortifications and larger vessels of Umm Al Qawain, Ajman, Fasht, Sharjah, Abu Hail, and Dubai. Ten vessels that had taken shelter in Bahrain were also destroyed. The Royal Navy suffered no casualties during the action.
The Sheikh of Rams and Al Dhaya (named on the treaty as the 'Sheikh of Zyah'), Hassan bin Ali, was a co-signatory of the General Maritime Treaty of 1820, which established peace following the cessation of hostilities. That treaty between the British and the Sheikhs of what was formerly known as the Pirate Coast and what would become known as the Trucial Coast led to the establishment of what is known today as the United Arab Emirates (UAE).
Further fortifications
A larger fortification constructed from mud bricks sits at the foot of the hill. This 'Sur' was used as a retreat for local people. A third element of fortification at Dhayah are watchtowers in the palm groves. Between the three fortifications, the area was rendered secure against interference from other Arab states, although the aged fortifications proved no match for the heavy artillery of the Royal Navy.
The fort is unusual in that it is the highest hilltop fort (as opposed to tower or lookout post) in the UAE. It commands 360 degree views of the surrounding lush wadiscape and plains. Remains in the area to the base of the hill at Dhayah show the area was inhabited as far back as the Wadi Suq period.
Dhayah fort as it stands today was rebuilt after 1819 and restored in the 1990s. A relatively small fortification, it lacks a natural source of water. By the time of J. G. Lorimer's survey of 1906, the area of Dhayah was uninhabited.
See also
References
Forts in the United Arab Emirates
History of the United Arab Emirates
|
```typescript
import { ExpandedTaxInfoUpdateRequest } from "../../../../billing/models/request/expanded-tax-info-update.request";
export class ProviderSetupRequest {
name: string;
businessName: string;
billingEmail: string;
token: string;
key: string;
taxInfo: ExpandedTaxInfoUpdateRequest;
}
```
|
```php
<?php
/*
 * Licensed under the Apache License, Version 2.0 (the "License"); you may
 * not use this file except in compliance with the License. You may obtain
 * a copy of the License at
 *
 * path_to_url
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
 * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
 * License for the specific language governing permissions and limitations
 * under the License.
 */
namespace Google\Service\MapsPlaces;
class GoogleMapsPlacesV1PlaceAttribution extends \Google\Model
{
/**
* @var string
*/
public $provider;
/**
* @var string
*/
public $providerUri;
/**
* @param string
*/
public function setProvider($provider)
{
$this->provider = $provider;
}
/**
* @return string
*/
public function getProvider()
{
return $this->provider;
}
/**
* @param string
*/
public function setProviderUri($providerUri)
{
$this->providerUri = $providerUri;
}
/**
* @return string
*/
public function getProviderUri()
{
return $this->providerUri;
}
}
// Adding a class alias for backwards compatibility with the previous class name.
class_alias(GoogleMapsPlacesV1PlaceAttribution::class, 'Google_Service_MapsPlaces_GoogleMapsPlacesV1PlaceAttribution');
```
|
```c
/**
* @note TX and RX channels are indexed from 0 in the LL driver, i.e. tx_channel = [0,3], rx_channel = [0,3]
*/
#pragma once
#include <stdint.h>
#include <stdbool.h>
#include <stddef.h>
#include "hal/misc.h"
#include "hal/assert.h"
#include "soc/rmt_struct.h"
#include "soc/system_struct.h"
#include "hal/rmt_types.h"
#ifdef __cplusplus
extern "C" {
#endif
#define RMT_LL_EVENT_TX_DONE(channel) (1 << (channel))
#define RMT_LL_EVENT_TX_THRES(channel) (1 << ((channel) + 8))
#define RMT_LL_EVENT_TX_LOOP_END(channel) (1 << ((channel) + 12))
#define RMT_LL_EVENT_TX_ERROR(channel) (1 << ((channel) + 4))
#define RMT_LL_EVENT_RX_DONE(channel) (1 << ((channel) + 16))
#define RMT_LL_EVENT_RX_THRES(channel) (1 << ((channel) + 24))
#define RMT_LL_EVENT_RX_ERROR(channel) (1 << ((channel) + 20))
#define RMT_LL_EVENT_TX_MASK(channel) (RMT_LL_EVENT_TX_DONE(channel) | RMT_LL_EVENT_TX_THRES(channel) | RMT_LL_EVENT_TX_LOOP_END(channel))
#define RMT_LL_EVENT_RX_MASK(channel) (RMT_LL_EVENT_RX_DONE(channel) | RMT_LL_EVENT_RX_THRES(channel))
#define RMT_LL_MAX_LOOP_COUNT_PER_BATCH 1023
#define RMT_LL_MAX_FILTER_VALUE 255
#define RMT_LL_MAX_IDLE_VALUE 32767
typedef enum {
RMT_LL_MEM_OWNER_SW = 0,
RMT_LL_MEM_OWNER_HW = 1,
} rmt_ll_mem_owner_t;
/**
* @brief Enable the bus clock for RMT module
*
* @param group_id Group ID
* @param enable true to enable, false to disable
*/
static inline void rmt_ll_enable_bus_clock(int group_id, bool enable)
{
(void)group_id;
SYSTEM.perip_clk_en0.rmt_clk_en = enable;
}
/// use a macro to wrap the function, force the caller to use it in a critical section
/// the critical section needs to declare the __DECLARE_RCC_ATOMIC_ENV variable in advance
#define rmt_ll_enable_bus_clock(...) (void)__DECLARE_RCC_ATOMIC_ENV; rmt_ll_enable_bus_clock(__VA_ARGS__)
/**
* @brief Reset the RMT module
*
* @param group_id Group ID
*/
static inline void rmt_ll_reset_register(int group_id)
{
(void)group_id;
SYSTEM.perip_rst_en0.rmt_rst = 1;
SYSTEM.perip_rst_en0.rmt_rst = 0;
}
/// use a macro to wrap the function, force the caller to use it in a critical section
/// the critical section needs to declare the __DECLARE_RCC_ATOMIC_ENV variable in advance
#define rmt_ll_reset_register(...) (void)__DECLARE_RCC_ATOMIC_ENV; rmt_ll_reset_register(__VA_ARGS__)
/**
* @brief Enable clock gate for register and memory
*
* @param dev Peripheral instance address
* @param enable True to enable, False to disable
*/
static inline void rmt_ll_enable_periph_clock(rmt_dev_t *dev, bool enable)
{
dev->sys_conf.clk_en = enable; // register clock gating
dev->sys_conf.mem_clk_force_on = enable; // memory clock gating
}
/**
* @brief Force power on the RMT memory block, regardless of the outside PMU logic
*
* @param dev Peripheral instance address
*/
static inline void rmt_ll_mem_force_power_on(rmt_dev_t *dev)
{
dev->sys_conf.mem_force_pu = 1;
dev->sys_conf.mem_force_pd = 0;
}
/**
* @brief Force power off the RMT memory block, regardless of the outside PMU logic
*
* @param dev Peripheral instance address
*/
static inline void rmt_ll_mem_force_power_off(rmt_dev_t *dev)
{
dev->sys_conf.mem_force_pd = 1;
dev->sys_conf.mem_force_pu = 0;
}
/**
* @brief Power control the RMT memory block by the outside PMU logic
*
* @param dev Peripheral instance address
*/
static inline void rmt_ll_mem_power_by_pmu(rmt_dev_t *dev)
{
dev->sys_conf.mem_force_pd = 0;
dev->sys_conf.mem_force_pu = 0;
}
/**
* @brief Enable APB accessing RMT memory in nonfifo mode
*
* @param dev Peripheral instance address
* @param enable True to enable, False to disable
*/
static inline void rmt_ll_enable_mem_access_nonfifo(rmt_dev_t *dev, bool enable)
{
dev->sys_conf.apb_fifo_mask = enable;
}
/**
* @brief Set clock source and divider for RMT channel group
*
* @param dev Peripheral instance address
* @param channel not used as clock source is set for all channels
* @param src Clock source
* @param divider_integral Integral part of the divider
* @param divider_denominator Denominator part of the divider
* @param divider_numerator Numerator part of the divider
*/
static inline void rmt_ll_set_group_clock_src(rmt_dev_t *dev, uint32_t channel, rmt_clock_source_t src,
uint32_t divider_integral, uint32_t divider_denominator, uint32_t divider_numerator)
{
// Formula: rmt_sclk = module_clock_src / (1 + div_num + div_a / div_b)
(void)channel; // the source clock is set for all channels
HAL_ASSERT(divider_integral >= 1);
HAL_FORCE_MODIFY_U32_REG_FIELD(dev->sys_conf, sclk_div_num, divider_integral - 1);
dev->sys_conf.sclk_div_a = divider_numerator;
dev->sys_conf.sclk_div_b = divider_denominator;
switch (src) {
case RMT_CLK_SRC_APB:
dev->sys_conf.sclk_sel = 1;
break;
case RMT_CLK_SRC_RC_FAST:
dev->sys_conf.sclk_sel = 2;
break;
case RMT_CLK_SRC_XTAL:
dev->sys_conf.sclk_sel = 3;
break;
default:
HAL_ASSERT(false && "unsupported RMT clock source");
break;
}
}
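/*
 * Worked example (hypothetical numbers, not from the TRM): with an 80 MHz
 * source clock and divider_integral = 2, divider_numerator = 1,
 * divider_denominator = 2, the register fields become div_num = 1,
 * div_a = 1, div_b = 2, so the formula above gives
 * rmt_sclk = 80 MHz / (1 + 1 + 1/2) = 32 MHz.
 */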
/**
* @brief Enable RMT peripheral source clock
*
* @param dev Peripheral instance address
* @param en True to enable, False to disable
*/
static inline void rmt_ll_enable_group_clock(rmt_dev_t *dev, bool en)
{
dev->sys_conf.sclk_active = en;
}
////////////////////////////////////////TX Channel Specific/////////////////////////////////////////////////////////////
/**
* @brief Reset clock divider for TX channels by mask
*
* @param dev Peripheral instance address
* @param channel_mask Mask of TX channels
*/
static inline void rmt_ll_tx_reset_channels_clock_div(rmt_dev_t *dev, uint32_t channel_mask)
{
// write 1 to reset
dev->ref_cnt_rst.val |= channel_mask & 0x0F;
}
/**
* @brief Set TX channel clock divider
*
* @param dev Peripheral instance address
* @param channel RMT TX channel number
* @param div Division value
*/
static inline void rmt_ll_tx_set_channel_clock_div(rmt_dev_t *dev, uint32_t channel, uint32_t div)
{
HAL_ASSERT(div >= 1 && div <= 256 && "divider out of range");
// limit the maximum divider to 256
if (div >= 256) {
div = 0; // 0 means 256 division
}
HAL_FORCE_MODIFY_U32_REG_FIELD(dev->chnconf0[channel], div_cnt_chn, div);
}
/**
* @brief Reset RMT reading pointer for TX channel
*
* @param dev Peripheral instance address
* @param channel RMT TX channel number
*/
__attribute__((always_inline))
static inline void rmt_ll_tx_reset_pointer(rmt_dev_t *dev, uint32_t channel)
{
dev->chnconf0[channel].mem_rd_rst_chn = 1;
dev->chnconf0[channel].mem_rd_rst_chn = 0;
dev->chnconf0[channel].apb_mem_rst_chn = 1;
dev->chnconf0[channel].apb_mem_rst_chn = 0;
}
/**
* @brief Enable DMA access for TX channel
*
* @param dev Peripheral instance address
* @param channel RMT TX channel number
* @param enable True to enable, False to disable
*/
static inline void rmt_ll_tx_enable_dma(rmt_dev_t *dev, uint32_t channel, bool enable)
{
HAL_ASSERT(channel == 3 && "only TX channel 3 has DMA ability");
dev->chnconf0[channel].dma_access_en_chn = enable;
}
/**
* @brief Start transmitting for TX channel
*
* @param dev Peripheral instance address
* @param channel RMT TX channel number
*/
__attribute__((always_inline))
static inline void rmt_ll_tx_start(rmt_dev_t *dev, uint32_t channel)
{
// update other configuration registers before start transmitting
dev->chnconf0[channel].conf_update_chn = 1;
dev->chnconf0[channel].tx_start_chn = 1;
}
/**
* @brief Stop transmitting for TX channel
*
* @param dev Peripheral instance address
* @param channel RMT TX channel number
*/
__attribute__((always_inline))
static inline void rmt_ll_tx_stop(rmt_dev_t *dev, uint32_t channel)
{
dev->chnconf0[channel].tx_stop_chn = 1;
// stop won't take place until configurations updated
dev->chnconf0[channel].conf_update_chn = 1;
}
/**
* @brief Set memory block number for TX channel
*
* @param dev Peripheral instance address
* @param channel RMT TX channel number
* @param block_num memory block number
*/
static inline void rmt_ll_tx_set_mem_blocks(rmt_dev_t *dev, uint32_t channel, uint8_t block_num)
{
dev->chnconf0[channel].mem_size_chn = block_num;
}
/**
* @brief Enable TX wrap
*
* @param dev Peripheral instance address
* @param channel RMT TX channel number
* @param enable True to enable, False to disable
*/
static inline void rmt_ll_tx_enable_wrap(rmt_dev_t *dev, uint32_t channel, bool enable)
{
dev->chnconf0[channel].mem_tx_wrap_en_chn = enable;
}
/**
* @brief Enable transmitting in a loop
*
* @param dev Peripheral instance address
* @param channel RMT TX channel number
* @param enable True to enable, False to disable
*/
__attribute__((always_inline))
static inline void rmt_ll_tx_enable_loop(rmt_dev_t *dev, uint32_t channel, bool enable)
{
dev->chnconf0[channel].tx_conti_mode_chn = enable;
}
/**
* @brief Set loop count for TX channel
*
* @param dev Peripheral instance address
* @param channel RMT TX channel number
* @param count TX loop count
*/
__attribute__((always_inline))
static inline void rmt_ll_tx_set_loop_count(rmt_dev_t *dev, uint32_t channel, uint32_t count)
{
HAL_ASSERT(count <= RMT_LL_MAX_LOOP_COUNT_PER_BATCH && "loop count out of range");
dev->chn_tx_lim[channel].tx_loop_num_chn = count;
}
/**
* @brief Reset loop count for TX channel
*
* @param dev Peripheral instance address
* @param channel RMT TX channel number
*/
__attribute__((always_inline))
static inline void rmt_ll_tx_reset_loop_count(rmt_dev_t *dev, uint32_t channel)
{
dev->chn_tx_lim[channel].loop_count_reset_chn = 1;
dev->chn_tx_lim[channel].loop_count_reset_chn = 0;
}
/**
* @brief Enable loop count for TX channel
*
* @param dev Peripheral instance address
* @param channel RMT TX channel number
* @param enable True to enable, False to disable
*/
__attribute__((always_inline))
static inline void rmt_ll_tx_enable_loop_count(rmt_dev_t *dev, uint32_t channel, bool enable)
{
dev->chn_tx_lim[channel].tx_loop_cnt_en_chn = enable;
}
/**
* @brief Enable loop stop at count value automatically
*
* @param dev Peripheral instance address
* @param channel RMT TX channel number
* @param enable True to enable, False to disable
*/
__attribute__((always_inline))
static inline void rmt_ll_tx_enable_loop_autostop(rmt_dev_t *dev, uint32_t channel, bool enable)
{
dev->chn_tx_lim[channel].loop_stop_en_chn = enable;
}
/**
* @brief Enable transmit multiple channels synchronously
*
* @param dev Peripheral instance address
* @param enable True to enable, False to disable
*/
static inline void rmt_ll_tx_enable_sync(rmt_dev_t *dev, bool enable)
{
dev->tx_sim.tx_sim_en = enable;
}
/**
* @brief Clear the TX channels synchronous group
*
* @param dev Peripheral instance address
*/
static inline void rmt_ll_tx_clear_sync_group(rmt_dev_t *dev)
{
dev->tx_sim.val &= ~(0x0F);
}
/**
* @brief Add TX channels to the synchronous group
*
* @param dev Peripheral instance address
* @param channel_mask Mask of TX channels to be added to the synchronous group
*/
static inline void rmt_ll_tx_sync_group_add_channels(rmt_dev_t *dev, uint32_t channel_mask)
{
dev->tx_sim.val |= (channel_mask & 0x0F);
}
/**
* @brief Remove TX channels from the synchronous group
*
* @param dev Peripheral instance address
* @param channel_mask Mask of TX channels to be removed from the synchronous group
*/
static inline void rmt_ll_tx_sync_group_remove_channels(rmt_dev_t *dev, uint32_t channel_mask)
{
dev->tx_sim.val &= ~channel_mask;
}
/**
* @brief Fix the output level when TX channel is in IDLE state
*
* @param dev Peripheral instance address
* @param channel RMT TX channel number
* @param level IDLE level (1 => high, 0 => low)
* @param enable True to fix the IDLE level, otherwise the IDLE level is determined by EOF encoder
*/
__attribute__((always_inline))
static inline void rmt_ll_tx_fix_idle_level(rmt_dev_t *dev, uint32_t channel, uint8_t level, bool enable)
{
dev->chnconf0[channel].idle_out_en_chn = enable;
dev->chnconf0[channel].idle_out_lv_chn = level;
}
/**
* @brief Set the amount of RMT symbols that can trigger the limitation interrupt
*
* @param dev Peripheral instance address
* @param channel RMT TX channel number
* @param limit Specify the number of symbols
*/
static inline void rmt_ll_tx_set_limit(rmt_dev_t *dev, uint32_t channel, uint32_t limit)
{
dev->chn_tx_lim[channel].tx_lim_chn = limit;
}
/**
* @brief Set high and low duration of carrier signal
*
* @param dev Peripheral instance address
* @param channel RMT TX channel number
* @param high_ticks Duration of high level
* @param low_ticks Duration of low level
*/
static inline void rmt_ll_tx_set_carrier_high_low_ticks(rmt_dev_t *dev, uint32_t channel, uint32_t high_ticks, uint32_t low_ticks)
{
HAL_ASSERT(high_ticks >= 1 && high_ticks <= 65536 && low_ticks >= 1 && low_ticks <= 65536 && "out of range high/low ticks");
// ticks=0 means 65536 in hardware
if (high_ticks >= 65536) {
high_ticks = 0;
}
if (low_ticks >= 65536) {
low_ticks = 0;
}
HAL_FORCE_MODIFY_U32_REG_FIELD(dev->chncarrier_duty[channel], carrier_high_chn, high_ticks);
HAL_FORCE_MODIFY_U32_REG_FIELD(dev->chncarrier_duty[channel], carrier_low_chn, low_ticks);
}
/**
* @brief Enable modulating carrier signal to TX channel
*
* @param dev Peripheral instance address
* @param channel RMT TX channel number
* @param enable True to enable, False to disable
*/
static inline void rmt_ll_tx_enable_carrier_modulation(rmt_dev_t *dev, uint32_t channel, bool enable)
{
dev->chnconf0[channel].carrier_en_chn = enable;
}
/**
* @brief Set on high or low to modulate the carrier signal
*
* @param dev Peripheral instance address
* @param channel RMT TX channel number
* @param level Which level to modulate on (0=>low level, 1=>high level)
*/
static inline void rmt_ll_tx_set_carrier_level(rmt_dev_t *dev, uint32_t channel, uint8_t level)
{
dev->chnconf0[channel].carrier_out_lv_chn = level;
}
/**
* @brief Enable to always output carrier signal, regardless of a valid data transmission
*
* @param dev Peripheral instance address
* @param channel RMT TX channel number
 * @param enable True to output carrier signal in all RMT states, False to only output carrier signal for effective data
*/
static inline void rmt_ll_tx_enable_carrier_always_on(rmt_dev_t *dev, uint32_t channel, bool enable)
{
dev->chnconf0[channel].carrier_eff_en_chn = !enable;
}
////////////////////////////////////////RX Channel Specific/////////////////////////////////////////////////////////////
/**
* @brief Reset clock divider for RX channels by mask
*
* @param dev Peripheral instance address
* @param channel_mask Mask of RX channels
*/
static inline void rmt_ll_rx_reset_channels_clock_div(rmt_dev_t *dev, uint32_t channel_mask)
{
dev->ref_cnt_rst.val |= ((channel_mask & 0x0F) << 4);
}
/**
* @brief Set RX channel clock divider
*
* @param dev Peripheral instance address
* @param channel RMT RX channel number
* @param div Division value
*/
static inline void rmt_ll_rx_set_channel_clock_div(rmt_dev_t *dev, uint32_t channel, uint32_t div)
{
HAL_ASSERT(div >= 1 && div <= 256 && "divider out of range");
// limit the maximum divider to 256
if (div >= 256) {
div = 0; // 0 means 256 division
}
HAL_FORCE_MODIFY_U32_REG_FIELD(dev->chmconf[channel].conf0, div_cnt_chm, div);
}
/**
* @brief Reset RMT writing pointer for RX channel
*
* @param dev Peripheral instance address
* @param channel RMT RX channel number
*/
__attribute__((always_inline))
static inline void rmt_ll_rx_reset_pointer(rmt_dev_t *dev, uint32_t channel)
{
dev->chmconf[channel].conf1.mem_wr_rst_chm = 1;
dev->chmconf[channel].conf1.mem_wr_rst_chm = 0;
dev->chmconf[channel].conf1.apb_mem_rst_chm = 1;
dev->chmconf[channel].conf1.apb_mem_rst_chm = 0;
}
/**
* @brief Enable DMA access for RX channel
*
* @param dev Peripheral instance address
* @param channel RMT RX channel number
* @param enable True to enable, False to disable
*/
static inline void rmt_ll_rx_enable_dma(rmt_dev_t *dev, uint32_t channel, bool enable)
{
HAL_ASSERT(channel == 3 && "only RX channel 3 has DMA ability");
dev->chmconf[channel].conf0.dma_access_en_chm = enable;
}
/**
* @brief Enable receiving for RX channel
*
* @param dev Peripheral instance address
* @param channel RMT RX channel number
* @param enable True to enable, False to disable
*/
__attribute__((always_inline))
static inline void rmt_ll_rx_enable(rmt_dev_t *dev, uint32_t channel, bool enable)
{
dev->chmconf[channel].conf1.rx_en_chm = enable;
// rx won't be enabled until configurations updated
dev->chmconf[channel].conf1.conf_update_chm = 1;
}
/**
* @brief Set memory block number for RX channel
*
* @param dev Peripheral instance address
* @param channel RMT RX channel number
* @param block_num memory block number
*/
static inline void rmt_ll_rx_set_mem_blocks(rmt_dev_t *dev, uint32_t channel, uint8_t block_num)
{
dev->chmconf[channel].conf0.mem_size_chm = block_num;
}
/**
* @brief Set the time length for RX channel before going into IDLE state
*
* @param dev Peripheral instance address
* @param channel RMT RX channel number
* @param thres Time length threshold
*/
__attribute__((always_inline))
static inline void rmt_ll_rx_set_idle_thres(rmt_dev_t *dev, uint32_t channel, uint32_t thres)
{
dev->chmconf[channel].conf0.idle_thres_chm = thres;
}
/**
* @brief Set RMT memory owner for RX channel
*
* @param dev Peripheral instance address
* @param channel RMT RX channel number
* @param owner Memory owner
*/
__attribute__((always_inline))
static inline void rmt_ll_rx_set_mem_owner(rmt_dev_t *dev, uint32_t channel, rmt_ll_mem_owner_t owner)
{
dev->chmconf[channel].conf1.mem_owner_chm = owner;
}
/**
* @brief Enable filter for RX channel
*
* @param dev Peripheral instance address
 * @param channel RMT RX channel number
* @param enable True to enable, False to disable
*/
__attribute__((always_inline))
static inline void rmt_ll_rx_enable_filter(rmt_dev_t *dev, uint32_t channel, bool enable)
{
dev->chmconf[channel].conf1.rx_filter_en_chm = enable;
}
/**
* @brief Set RX channel filter threshold (i.e. the maximum width of one pulse signal that would be treated as a noise)
*
* @param dev Peripheral instance address
* @param channel RMT RX channel number
* @param thres Filter threshold
*/
__attribute__((always_inline))
static inline void rmt_ll_rx_set_filter_thres(rmt_dev_t *dev, uint32_t channel, uint32_t thres)
{
HAL_FORCE_MODIFY_U32_REG_FIELD(dev->chmconf[channel].conf1, rx_filter_thres_chm, thres);
}
/**
* @brief Get RMT memory write cursor offset
*
* @param dev Peripheral instance address
* @param channel RMT RX channel number
* @return writer offset
*/
__attribute__((always_inline))
static inline uint32_t rmt_ll_rx_get_memory_writer_offset(rmt_dev_t *dev, uint32_t channel)
{
return dev->chmstatus[channel].mem_waddr_ex_chm - (channel + 4) * 48;
}
/**
* @brief Set the amount of RMT symbols that can trigger the limitation interrupt
*
* @param dev Peripheral instance address
* @param channel RMT RX channel number
* @param limit Specify the number of symbols
*/
static inline void rmt_ll_rx_set_limit(rmt_dev_t *dev, uint32_t channel, uint32_t limit)
{
dev->chm_rx_lim[channel].rx_lim_chm = limit;
}
/**
* @brief Set high and low duration of carrier signal
*
 * @param dev Peripheral instance address
 * @param channel RMT RX channel number
* @param high_ticks Duration of high level
* @param low_ticks Duration of low level
*/
static inline void rmt_ll_rx_set_carrier_high_low_ticks(rmt_dev_t *dev, uint32_t channel, uint32_t high_ticks, uint32_t low_ticks)
{
HAL_ASSERT(high_ticks >= 1 && high_ticks <= 65536 && low_ticks >= 1 && low_ticks <= 65536 && "out of range high/low ticks");
HAL_FORCE_MODIFY_U32_REG_FIELD(dev->chm_rx_carrier_rm[channel], carrier_high_thres_chm, high_ticks - 1);
HAL_FORCE_MODIFY_U32_REG_FIELD(dev->chm_rx_carrier_rm[channel], carrier_low_thres_chm, low_ticks - 1);
}
/**
* @brief Enable demodulating the carrier on RX channel
*
* @param dev Peripheral instance address
* @param channel RMT RX channel number
* @param enable True to enable, False to disable
*/
static inline void rmt_ll_rx_enable_carrier_demodulation(rmt_dev_t *dev, uint32_t channel, bool enable)
{
dev->chmconf[channel].conf0.carrier_en_chm = enable;
}
/**
* @brief Set on high or low to demodulate the carrier signal
*
* @param dev Peripheral instance address
* @param channel RMT RX channel number
* @param level Which level to demodulate (0=>low level, 1=>high level)
*/
static inline void rmt_ll_rx_set_carrier_level(rmt_dev_t *dev, uint32_t channel, uint8_t level)
{
dev->chmconf[channel].conf0.carrier_out_lv_chm = level;
}
/**
* @brief Enable RX wrap
*
* @param dev Peripheral instance address
* @param channel RMT RX channel number
* @param enable True to enable, False to disable
*/
static inline void rmt_ll_rx_enable_wrap(rmt_dev_t *dev, uint32_t channel, bool enable)
{
dev->chmconf[channel].conf1.mem_rx_wrap_en_chm = enable;
}
//////////////////////////////////////////Interrupt Specific////////////////////////////////////////////////////////////
/**
* @brief Enable RMT interrupt for specific event mask
*
* @param dev Peripheral instance address
* @param mask Event mask
* @param enable True to enable, False to disable
*/
__attribute__((always_inline))
static inline void rmt_ll_enable_interrupt(rmt_dev_t *dev, uint32_t mask, bool enable)
{
if (enable) {
dev->int_ena.val |= mask;
} else {
dev->int_ena.val &= ~mask;
}
}
/**
* @brief Clear RMT interrupt status by mask
*
* @param dev Peripheral instance address
* @param mask Interrupt status mask
*/
__attribute__((always_inline))
static inline void rmt_ll_clear_interrupt_status(rmt_dev_t *dev, uint32_t mask)
{
dev->int_clr.val = mask;
}
/**
* @brief Get interrupt status register address
*
* @param dev Peripheral instance address
* @return Register address
*/
static inline volatile void *rmt_ll_get_interrupt_status_reg(rmt_dev_t *dev)
{
return &dev->int_st;
}
/**
* @brief Get interrupt status for TX channel
*
* @param dev Peripheral instance address
* @param channel RMT TX channel number
* @return Interrupt status
*/
__attribute__((always_inline))
static inline uint32_t rmt_ll_tx_get_interrupt_status(rmt_dev_t *dev, uint32_t channel)
{
return dev->int_st.val & RMT_LL_EVENT_TX_MASK(channel);
}
/**
* @brief Get interrupt raw status for TX channel
*
* @param dev Peripheral instance address
* @param channel RMT TX channel number
* @return Interrupt raw status
*/
static inline uint32_t rmt_ll_tx_get_interrupt_status_raw(rmt_dev_t *dev, uint32_t channel)
{
return dev->int_raw.val & (RMT_LL_EVENT_TX_MASK(channel) | RMT_LL_EVENT_TX_ERROR(channel));
}
/**
* @brief Get interrupt raw status for RX channel
*
* @param dev Peripheral instance address
* @param channel RMT RX channel number
* @return Interrupt raw status
*/
static inline uint32_t rmt_ll_rx_get_interrupt_status_raw(rmt_dev_t *dev, uint32_t channel)
{
return dev->int_raw.val & (RMT_LL_EVENT_RX_MASK(channel) | RMT_LL_EVENT_RX_ERROR(channel));
}
/**
* @brief Get interrupt status for RX channel
*
* @param dev Peripheral instance address
* @param channel RMT RX channel number
* @return Interrupt status
*/
__attribute__((always_inline))
static inline uint32_t rmt_ll_rx_get_interrupt_status(rmt_dev_t *dev, uint32_t channel)
{
return dev->int_st.val & RMT_LL_EVENT_RX_MASK(channel);
}
//////////////////////////////////////////Deprecated Functions//////////////////////////////////////////////////////////
/////////////////////////////The following functions are only used by the legacy driver/////////////////////////////////
/////////////////////////////They might be removed in the next major release (ESP-IDF 6.0)//////////////////////////////
////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
__attribute__((always_inline))
static inline uint32_t rmt_ll_tx_get_status_word(rmt_dev_t *dev, uint32_t channel)
{
return dev->chnstatus[channel].val;
}
__attribute__((always_inline))
static inline uint32_t rmt_ll_rx_get_status_word(rmt_dev_t *dev, uint32_t channel)
{
return dev->chmstatus[channel].val;
}
__attribute__((always_inline))
static inline uint32_t rmt_ll_tx_get_channel_clock_div(rmt_dev_t *dev, uint32_t channel)
{
uint32_t div = HAL_FORCE_READ_U32_REG_FIELD(dev->chnconf0[channel], div_cnt_chn);
return div == 0 ? 256 : div;
}
__attribute__((always_inline))
static inline uint32_t rmt_ll_rx_get_channel_clock_div(rmt_dev_t *dev, uint32_t channel)
{
uint32_t div = HAL_FORCE_READ_U32_REG_FIELD(dev->chmconf[channel].conf0, div_cnt_chm);
return div == 0 ? 256 : div;
}
__attribute__((always_inline))
static inline uint32_t rmt_ll_rx_get_idle_thres(rmt_dev_t *dev, uint32_t channel)
{
return dev->chmconf[channel].conf0.idle_thres_chm;
}
__attribute__((always_inline))
static inline uint32_t rmt_ll_tx_get_mem_blocks(rmt_dev_t *dev, uint32_t channel)
{
return dev->chnconf0[channel].mem_size_chn;
}
__attribute__((always_inline))
static inline uint32_t rmt_ll_rx_get_mem_blocks(rmt_dev_t *dev, uint32_t channel)
{
return dev->chmconf[channel].conf0.mem_size_chm;
}
__attribute__((always_inline))
static inline bool rmt_ll_tx_is_loop_enabled(rmt_dev_t *dev, uint32_t channel)
{
return dev->chnconf0[channel].tx_conti_mode_chn;
}
__attribute__((always_inline))
static inline rmt_clock_source_t rmt_ll_get_group_clock_src(rmt_dev_t *dev, uint32_t channel)
{
rmt_clock_source_t clk_src = RMT_CLK_SRC_APB;
switch (dev->sys_conf.sclk_sel) {
case 1:
clk_src = RMT_CLK_SRC_APB;
break;
case 2:
clk_src = RMT_CLK_SRC_RC_FAST;
break;
case 3:
clk_src = RMT_CLK_SRC_XTAL;
break;
}
return clk_src;
}
__attribute__((always_inline))
static inline bool rmt_ll_tx_is_idle_enabled(rmt_dev_t *dev, uint32_t channel)
{
return dev->chnconf0[channel].idle_out_en_chn;
}
__attribute__((always_inline))
static inline uint32_t rmt_ll_tx_get_idle_level(rmt_dev_t *dev, uint32_t channel)
{
return dev->chnconf0[channel].idle_out_lv_chn;
}
static inline bool rmt_ll_is_mem_force_powered_down(rmt_dev_t *dev)
{
return dev->sys_conf.mem_force_pd;
}
__attribute__((always_inline))
static inline uint32_t rmt_ll_rx_get_mem_owner(rmt_dev_t *dev, uint32_t channel)
{
return dev->chmconf[channel].conf1.mem_owner_chm;
}
__attribute__((always_inline))
static inline uint32_t rmt_ll_rx_get_limit(rmt_dev_t *dev, uint32_t channel)
{
return dev->chm_rx_lim[channel].rx_lim_chm;
}
__attribute__((always_inline))
static inline uint32_t rmt_ll_get_tx_end_interrupt_status(rmt_dev_t *dev)
{
return dev->int_st.val & 0x0F;
}
__attribute__((always_inline))
static inline uint32_t rmt_ll_get_rx_end_interrupt_status(rmt_dev_t *dev)
{
return (dev->int_st.val >> 16) & 0x0F;
}
__attribute__((always_inline))
static inline uint32_t rmt_ll_get_tx_err_interrupt_status(rmt_dev_t *dev)
{
return (dev->int_st.val >> 4) & 0x0F;
}
__attribute__((always_inline))
static inline uint32_t rmt_ll_get_rx_err_interrupt_status(rmt_dev_t *dev)
{
return (dev->int_st.val >> 20) & 0x0F;
}
__attribute__((always_inline))
static inline uint32_t rmt_ll_get_tx_thres_interrupt_status(rmt_dev_t *dev)
{
return (dev->int_st.val >> 8) & 0x0F;
}
__attribute__((always_inline))
static inline uint32_t rmt_ll_get_rx_thres_interrupt_status(rmt_dev_t *dev)
{
return (dev->int_st.val >> 24) & 0x0F;
}
__attribute__((always_inline))
static inline uint32_t rmt_ll_get_tx_loop_interrupt_status(rmt_dev_t *dev)
{
return (dev->int_st.val >> 12) & 0x0F;
}
#ifdef __cplusplus
}
#endif
```
|
Jean Haskell Hosmer (January 29, 1842 – January 29, 1890) was an American actress and tragedienne who reached the zenith of her career directly following the American Civil War. Through her career she is associated with the actor and Lincoln assassin John Wilkes Booth and his brother Edwin Booth.
Hosmer was a cousin of sculptor Harriet Hosmer and of poet William H. C. Hosmer.
Early life
Born in 1842 near Boston, Massachusetts, Hosmer moved with her family to the area near Buffalo, New York, when she was four years old. At the age of eight, her desire to act was kindled by witnessing a performance of Douglas Jerrold's play Black-Eyed Susan. Soon afterward Hosmer fell ill with an unknown disease, which was attributed to her intense excitement over the production. Upon her recovery she continued to visit the theatre despite her parents' objections, often dressed in boys' clothes. She attended the Genesee and Wyoming Seminary in Alexander, New York, for a short time but remained committed to a career on the stage. Her parents, influenced by her father's failures in business, eventually gave their consent on the condition that she perform under a stage name.
Career
Early career
Under the stage name "Miss Jennie Stanley," Hosmer began her career in Buffalo doing ballet at the age of 15. At the age of 16, under the tutelage of actor Barton Hill, she gained her first speaking role at the Metropolitan Theatre and was soon promoted to the theater's stock company. In 1860, by then a leading actress, the young Hosmer was noticed by McVicker's Theater in Chicago and was recruited for its company. She performed there under the name "Miss Jean Stanley" for two years, supporting the Booth brothers.
Rise to stardom
After leaving McVicker's, Hosmer moved on her own to Philadelphia, Pennsylvania, to star at the Chestnut Street Theatre under her own name. In 1863, she made her mark at the Chestnut playing strong, tragic female characters in plays such as Macbeth, Camille, and Romeo and Juliet. The death of one of her sisters that same year prompted Hosmer to take a break from acting, but she returned two years later for a starring tour. On May 29, 1865, she opened in New York at the Winter Garden Theatre on Broadway as Camille in Camille to glowing reviews, and stayed in New York until July 7. In announcing her farewell week there, the Spirit of the Times wrote:

The current is the sixth and positively last week of the brilliantly successful engagement of Miss Hosmer at this house. I say brilliant both in an artistic and financial point of view, for rarely, if ever, has an artiste achieved a more legitimate and well-merited artistic success or been rewarded with a more substantial one as soon as her unquestionable genius became known to the play-going public. I regard Miss Hosmer's engagement as one of the most extraordinary in every way that has been played in this city for many years. She came among us unheralded and unpuffed; content, it would seem, to rely on her own intrinsic merits, and stand or fall by the freely-expressed opinion of her auditors and the critical acumen of the press. To triumph over the former was an easy task; to win the good opinion of the latter, one of great difficulty.
Decline
Following the close of the New York show in 1865, Hosmer's career began to decline. In 1866, she returned to Philadelphia to play at the Arch Street Theatre with McKee Rankin, but left after a short while. Bad personal habits began to affect her work; she likely suffered from alcoholism, which became more noticeable in her acting as she aged. This was noted by English theater critic Clement Scott during an 1878 production of Lucrezia Borgia:

There were moments when I felt almost paralyzed by her acting. I never saw the spirit of such intense tragedy in a woman. But as the performance wore on she became indistinct, reckless and uncertain, and when the curtain went down she was thoroughly drunk.

During the 1870s and 1880s, Hosmer was considered semi-retired from the stage and taught acting and elocution. However, in order to sustain herself, she continued to act irregularly in productions with small-town and amateur companies across the country.
Death
In the winter of 1889–90, Hosmer took a job acting with the small Ramage's Standard Theatre company, playing in Midwestern towns. While at a production in a small town near Cleveland, Ohio, she suddenly began to hemorrhage severely. Hosmer was quickly sent to her sister's home in Buffalo, where she died in obscurity on her birthday, January 29. She was 48 years old.
Legacy
Association with the Booths
At McVicker's, Hosmer appeared in supporting roles to John Wilkes Booth and his brother Edwin. Of John, Hosmer later remarked:

I consider him a greater actor than his brother. He better represented the genius of his father, the first Junius Brutus Booth, and he played with such fire and vigor that he made us in his company actually fear him. But he did not have the refinement, grace, and crystal clearness of elocution possessed by Edwin.

While supporting Wilkes Booth, Hosmer kept a lock of his hair and held onto it following her departure from Chicago. Following his death, Hosmer sent the lock of hair to his mother, Mary Ann Holmes Booth.
References
1842 births
1890 deaths
19th-century American actresses
American stage actresses
Actresses from Boston
Actresses from Buffalo, New York
|
Lothar Kolditz (born September 30, 1929) is a German chemist and former politician. He was president of the National Front of the German Democratic Republic.
Early life and education
Kolditz was born in the municipality of Zschorlau on September 30, 1929, the son of the carpenter Paul Kolditz and his wife Ella, née Bauer. From 1941 to 1948 he attended high school in Aue.
Kolditz studied chemistry from 1948 to 1952 at the Humboldt University of Berlin, graduating with a degree in chemistry. His undergraduate thesis "Über Kaliumborfluoridtetraschwefeltrioxyd und die Darstellung von Trisulfurylfluorid" (About potassium borofluoride tetrasulfur trioxide and the preparation of trisulphuryl fluoride) was written under the guidance of Hans-Albert Lehmann.
In 1954 he received his doctorate with his dissertation "About Polyarsenatophosphate" under the supervision of Erich Thilo. Kolditz then habilitated in 1957 with a thesis "On compounds of pentavalent phosphorus, arsenic and antimony with fluorine and chlorine".
Kolditz has been married to Ruth, née Schramm, since 1951; the couple has two daughters and lives in Steinförde (a district of Fürstenberg).
Academic career
In 1957, Kolditz was appointed as a professor at the Technical University of Leuna-Merseburg, where he taught inorganic chemistry and radiochemistry. After Franz Hein retired, Kolditz was appointed to the Friedrich Schiller University in Jena in 1959. There he was the director of the Inorganic Chemistry Institute until 1962.
In 1962 Kolditz returned to the Humboldt University of Berlin, initially as a chair and director of the First Chemical Institute (1962–68). From 1965 to 1968 he was also Vice-Rector for Natural Sciences at the university. In 1969 he was elected as a corresponding member of the German Academy of Sciences at Berlin. After the reforms at the university in 1968, he became Humboldt University's director of chemistry from 1971 to 1979. In 1972 he was elected a full member of the Academy of Sciences of the German Democratic Republic (formerly the German Academy of Sciences at Berlin, which was renamed that year).
From 1980 to 1990 Kolditz worked at the academy as director of the Central Institute for Inorganic Chemistry (ZIAC) in Berlin. The ZIAC was created in 1971 by merging the Institute for Inorganic Chemistry, founded in 1952, and the Institute for Applied Silicate Research, founded in 1951. The Academy of Sciences of the German Democratic Republic and the ZIAC were dissolved on December 31, 1991, due to the reunification of Germany. As a result, Kolditz retired and has lived in Fürstenberg ever since.
Kolditz's publishing activities include around 350 original publications in academic journals, 37 patents and well over 200 colloquium lectures. He was a long-standing member of the editorial board of several journals. Kolditz has continued to contribute to scientific publications during his retirement.
Political career
In September 1971 he was appointed to the electoral commission of the German Democratic Republic in preparation for the 1971 East German general election. On October 30, 1981, Kolditz was elected President of the National Front of the German Democratic Republic, succeeding Erich Correns. In July 1982 Kolditz was elected as a member of the State Council of the GDR. From 1983 to 1990 he was a member of the presidium of the Society for German-Soviet Friendship (DSF). From 1986 to 1990 Kolditz was a member of the Presidential Council of the Cultural Association of the GDR. In the 1986 East German general election Kolditz was elected to the Volkskammer as a representative of the Cultural Association of the GDR.
Memberships, honors and awards
1969, Corresponding member of the German Academy of Sciences
1972, National Prize of the German Democratic Republic, 3rd Class
1972, Full member of the Academy of Sciences of the German Democratic Republic
1976, Clemens Winkler Medal of the Chemical Society of the German Democratic Republic
1983, Honorary Doctorate from the Freiberg University of Mining and Technology
1984, Patriotic Order of Merit, in Gold
1988, Foreign Member of the USSR Academy of Sciences
1989, Outstanding Scientist of the People
1992, Foreign Member of the Russian Academy of Sciences
1993, Member of the Leibniz Society of Sciences in Berlin
Publications
Textbooks/Books
Radiochemistry (Radiochemie). In: Gustav Hertz: Textbook of Nuclear Physics, Volume III, Applied Nuclear Physics (Lehrbuch der Kernphysik, Band III, Angewandte Kernphysik). 1962.
Editor: Włodzimierz Trzebiatowski: Textbook of Inorganic Chemistry (Lehrbuch der anorganischen Chemie). 1963.
Halides of Arsenic and Antimony. In: H. J. Emeleus: Advances in Inorganic Chemistry and Radiochemistry. Vol. 7. Academic Press, New York/London 1965.
Halides of Arsenic and Antimony. In: Viktor Gutmann: Halogen Chemistry. Vol. 2. Academic Press New York/London 1967.
Editor: Inorganic: Text and practical book of inorganic chemistry; with an introduction to physical chemistry (Anorganikum: Lehr- und Praktikumsbuch der anorganischen Chemie; mit einer Einführung in die physikalische Chemie). 1967.
Compounds of Nb and Ta (Verbindungen von Niob und Tantal). In: Niedenzu, Lexington, Kentucky: Houben-Weyl, Methodicum Chimicum. Academic Press, New York/London 1974.
with Günter Kauschka: Exchange Reactions (Austauschreaktionen). In: Recent Developments in Inorganic Chemistry (Neuere Entwicklungen der anorganischen Chemie). 1974.
Editor and co-author: Inorganic Chemistry (Textbook) (Anorganische Chemie). 1978.
with Jurij Alexandrovič Buslaev and Eleonora Alexandrovna Kravčenko: Nuclear quadrupole resonance in inorganic chemistry. 1987.
Perfluorohalogenoorgano Compounds of Main Group Elements. In: Gmelin Handbook of Inorganic and Organometallic Chemistry. 8th Edition. Springer 1994.
The Linden University 1945 to 1990 - reconstruction, consolidation and turbulence in chemistry, the Prorectorate for Natural Sciences (Die Lindenuniversität 1945 bis 1990 – Wiederaufbau, Konsolidierung und Turbulenzen in der Chemie, das Prorektorat für Naturwissenschaften). In: Wolfgang Girnus and Klaus Meier: Humboldt University under the Linden (Die Humboldt-Universität Unter den Linden 1945 bis 1990). Leipzig University Press 2010.
Preparation of Fluorine-Containing Molecular Halides and Heteropolar Salts with Elements of Group 15 and Niobium and Tantalum Halides; Fluoro and Fluorohydroxy Complexes of As, Sb and Sn. In: Herbert W. Roesky: Efficient Preparations of Fluorine Compounds. John Wiley & Sons, Inc. 2013.
References
1929 births
Members of the Russian Academy of Sciences
Members of the German Academy of Sciences at Berlin
Recipients of the Patriotic Order of Merit in gold
Members of the Volkskammer
Members of the State Council of East Germany
Academic staff of the Humboldt University of Berlin
Academic staff of the University of Jena
20th-century German chemists
Living people
|
```c++
//
// Redistribution and use in source and binary forms, with or without
// modification, are permitted provided that the following conditions
// are met:
// * Redistributions of source code must retain the above copyright
// notice, this list of conditions and the following disclaimer.
// * Redistributions in binary form must reproduce the above copyright
// notice, this list of conditions and the following disclaimer in the
// documentation and/or other materials provided with the distribution.
// * Neither the name of NVIDIA CORPORATION nor the names of its
// contributors may be used to endorse or promote products derived
// from this software without specific prior written permission.
//
// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY
// EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
// IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
// PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
// CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
// EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
// PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
// PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
// OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
//
// This file was generated by NvParameterized/scripts/GenParameterized.pl
#include "BufferU8x1.h"
#include <string.h>
#include <stdlib.h>
using namespace NvParameterized;
namespace nvidia
{
namespace apex
{
using namespace BufferU8x1NS;
const char* const BufferU8x1Factory::vptr =
NvParameterized::getVptr<BufferU8x1, BufferU8x1::ClassAlignment>();
const uint32_t NumParamDefs = 3;
static NvParameterized::DefinitionImpl* ParamDefTable; // now allocated in buildTree [NumParamDefs];
static const size_t ParamLookupChildrenTable[] =
{
1, 2,
};
#define TENUM(type) nvidia::##type
#define CHILDREN(index) &ParamLookupChildrenTable[index]
static const NvParameterized::ParamLookupNode ParamLookupTable[NumParamDefs] =
{
{ TYPE_STRUCT, false, 0, CHILDREN(0), 1 },
{ TYPE_ARRAY, true, (size_t)(&((ParametersStruct*)0)->data), CHILDREN(1), 1 }, // data
{ TYPE_U8, false, 1 * sizeof(uint8_t), NULL, 0 }, // data[]
};
bool BufferU8x1::mBuiltFlag = false;
NvParameterized::MutexType BufferU8x1::mBuiltFlagMutex;
BufferU8x1::BufferU8x1(NvParameterized::Traits* traits, void* buf, int32_t* refCount) :
NvParameters(traits, buf, refCount)
{
//mParameterizedTraits->registerFactory(className(), &BufferU8x1FactoryInst);
if (!buf) //Do not init data if it is inplace-deserialized
{
initDynamicArrays();
initStrings();
initReferences();
initDefaults();
}
}
BufferU8x1::~BufferU8x1()
{
freeStrings();
freeReferences();
freeDynamicArrays();
}
void BufferU8x1::destroy()
{
// We cache these fields here to avoid overwrite in destructor
bool doDeallocateSelf = mDoDeallocateSelf;
NvParameterized::Traits* traits = mParameterizedTraits;
int32_t* refCount = mRefCount;
void* buf = mBuffer;
this->~BufferU8x1();
NvParameters::destroy(this, traits, doDeallocateSelf, refCount, buf);
}
const NvParameterized::DefinitionImpl* BufferU8x1::getParameterDefinitionTree(void)
{
if (!mBuiltFlag) // Double-checked lock
{
NvParameterized::MutexType::ScopedLock lock(mBuiltFlagMutex);
if (!mBuiltFlag)
{
buildTree();
}
}
return(&ParamDefTable[0]);
}
const NvParameterized::DefinitionImpl* BufferU8x1::getParameterDefinitionTree(void) const
{
BufferU8x1* tmpParam = const_cast<BufferU8x1*>(this);
if (!mBuiltFlag) // Double-checked lock
{
NvParameterized::MutexType::ScopedLock lock(mBuiltFlagMutex);
if (!mBuiltFlag)
{
tmpParam->buildTree();
}
}
return(&ParamDefTable[0]);
}
NvParameterized::ErrorType BufferU8x1::getParameterHandle(const char* long_name, Handle& handle) const
{
ErrorType Ret = NvParameters::getParameterHandle(long_name, handle);
if (Ret != ERROR_NONE)
{
return(Ret);
}
size_t offset;
void* ptr;
getVarPtr(handle, ptr, offset);
if (ptr == NULL)
{
return(ERROR_INDEX_OUT_OF_RANGE);
}
return(ERROR_NONE);
}
NvParameterized::ErrorType BufferU8x1::getParameterHandle(const char* long_name, Handle& handle)
{
ErrorType Ret = NvParameters::getParameterHandle(long_name, handle);
if (Ret != ERROR_NONE)
{
return(Ret);
}
size_t offset;
void* ptr;
getVarPtr(handle, ptr, offset);
if (ptr == NULL)
{
return(ERROR_INDEX_OUT_OF_RANGE);
}
return(ERROR_NONE);
}
void BufferU8x1::getVarPtr(const Handle& handle, void*& ptr, size_t& offset) const
{
	ptr = getVarPtrHelper(&ParamLookupTable[0], const_cast<BufferU8x1::ParametersStruct*>(&parameters()), handle, offset);
}
/* Dynamic Handle Indices */
void BufferU8x1::freeParameterDefinitionTable(NvParameterized::Traits* traits)
{
if (!traits)
{
return;
}
if (!mBuiltFlag) // Double-checked lock
{
return;
}
NvParameterized::MutexType::ScopedLock lock(mBuiltFlagMutex);
if (!mBuiltFlag)
{
return;
}
for (uint32_t i = 0; i < NumParamDefs; ++i)
{
ParamDefTable[i].~DefinitionImpl();
}
traits->free(ParamDefTable);
mBuiltFlag = false;
}
#define PDEF_PTR(index) (&ParamDefTable[index])
void BufferU8x1::buildTree(void)
{
uint32_t allocSize = sizeof(NvParameterized::DefinitionImpl) * NumParamDefs;
ParamDefTable = (NvParameterized::DefinitionImpl*)(mParameterizedTraits->alloc(allocSize));
memset(ParamDefTable, 0, allocSize);
for (uint32_t i = 0; i < NumParamDefs; ++i)
{
NV_PARAM_PLACEMENT_NEW(ParamDefTable + i, NvParameterized::DefinitionImpl)(*mParameterizedTraits);
}
// Initialize DefinitionImpl node: nodeIndex=0, longName=""
{
NvParameterized::DefinitionImpl* ParamDef = &ParamDefTable[0];
ParamDef->init("", TYPE_STRUCT, "STRUCT", true);
}
// Initialize DefinitionImpl node: nodeIndex=1, longName="data"
{
NvParameterized::DefinitionImpl* ParamDef = &ParamDefTable[1];
ParamDef->init("data", TYPE_ARRAY, NULL, true);
#ifdef NV_PARAMETERIZED_HIDE_DESCRIPTIONS
#else
static HintImpl HintTable[1];
static Hint* HintPtrTable[1] = { &HintTable[0], };
HintTable[0].init("shortDescription", "Container for BYTE1 formats", true);
ParamDefTable[1].setHints((const NvParameterized::Hint**)HintPtrTable, 1);
#endif /* NV_PARAMETERIZED_HIDE_DESCRIPTIONS */
ParamDef->setArraySize(-1);
}
// Initialize DefinitionImpl node: nodeIndex=2, longName="data[]"
{
NvParameterized::DefinitionImpl* ParamDef = &ParamDefTable[2];
ParamDef->init("data", TYPE_U8, NULL, true);
#ifdef NV_PARAMETERIZED_HIDE_DESCRIPTIONS
#else
static HintImpl HintTable[1];
static Hint* HintPtrTable[1] = { &HintTable[0], };
HintTable[0].init("shortDescription", "Container for BYTE1 formats", true);
ParamDefTable[2].setHints((const NvParameterized::Hint**)HintPtrTable, 1);
#endif /* NV_PARAMETERIZED_HIDE_DESCRIPTIONS */
}
// SetChildren for: nodeIndex=0, longName=""
{
static Definition* Children[1];
Children[0] = PDEF_PTR(1);
ParamDefTable[0].setChildren(Children, 1);
}
// SetChildren for: nodeIndex=1, longName="data"
{
static Definition* Children[1];
Children[0] = PDEF_PTR(2);
ParamDefTable[1].setChildren(Children, 1);
}
mBuiltFlag = true;
}
void BufferU8x1::initStrings(void)
{
}
void BufferU8x1::initDynamicArrays(void)
{
data.buf = NULL;
data.isAllocated = true;
data.elementSize = sizeof(uint8_t);
data.arraySizes[0] = 0;
}
void BufferU8x1::initDefaults(void)
{
freeStrings();
freeReferences();
freeDynamicArrays();
initDynamicArrays();
initStrings();
initReferences();
}
void BufferU8x1::initReferences(void)
{
}
void BufferU8x1::freeDynamicArrays(void)
{
if (data.isAllocated && data.buf)
{
mParameterizedTraits->free(data.buf);
}
}
void BufferU8x1::freeStrings(void)
{
}
void BufferU8x1::freeReferences(void)
{
}
} // namespace apex
} // namespace nvidia
```
|
```javascript
/** Used as the `TypeError` message for "Functions" methods. */
var FUNC_ERROR_TEXT = 'Expected a function';
/* Native method references for those with the same name as other `lodash` methods. */
var nativeIsFinite = global.isFinite;
/**
* The opposite of `_.before`; this method creates a function that invokes
* `func` once it's called `n` or more times.
*
* @static
* @memberOf _
* @category Function
* @param {number} n The number of calls before `func` is invoked.
* @param {Function} func The function to restrict.
* @returns {Function} Returns the new restricted function.
* @example
*
* var saves = ['profile', 'settings'];
*
* var done = _.after(saves.length, function() {
* console.log('done saving!');
* });
*
* _.forEach(saves, function(type) {
* asyncSave({ 'type': type, 'complete': done });
* });
* // => logs 'done saving!' after the two async saves have completed
*/
function after(n, func) {
if (typeof func != 'function') {
if (typeof n == 'function') {
var temp = n;
n = func;
func = temp;
} else {
throw new TypeError(FUNC_ERROR_TEXT);
}
}
n = nativeIsFinite(n = +n) ? n : 0;
return function() {
if (--n < 1) {
return func.apply(this, arguments);
}
};
}
module.exports = after;
```
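To make the call-count behaviour described in the doc comment concrete, here is a minimal self-contained sketch; the function is re-implemented inline (without the lodash argument-swapping guard) so the snippet runs on its own:

```javascript
// Sketch of the `after` semantics documented above: the restricted
// function runs only once it has been called `n` or more times.
// (Re-implemented inline so the snippet is self-contained.)
function after(n, func) {
  n = isFinite(n = +n) ? n : 0;
  return function () {
    if (--n < 1) {
      return func.apply(this, arguments);
    }
  };
}

var calls = 0;
var done = after(3, function () { calls += 1; });

done(); // 1st call: nothing happens
done(); // 2nd call: nothing happens
done(); // 3rd call: fires
done(); // 4th call: fires again

console.log(calls); // 2
```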
|
Madeleine Zillhardt (June 10, 1863 in Saint-Quentin, France – April 16, 1950 in Neuilly-sur-Seine, France) was a French artist, writer, decorator and painter. Her life and career were closely linked to those of another artist, the German-Swiss painter Louise Catherine Breslau, to whom she was companion, muse and inspiration. They lived together for more than forty years, a life devoted to the arts. She was the sister of the painter Jenny Zillhardt.
Biography
Madeleine Zillhardt studied at the Académie Julian, an art school in Paris that was at the time the only institution of art education open to women in the city. Her sister Jenny Zillhardt also studied there. There she met other young artists: Anna Klumpke, Hermine David, Agnes Goodsir, Sarah Purser, Marie Bashkirtseff, and especially her "rival", Louise Catherine Breslau.
In 1884, Zillhardt asked Breslau to paint her portrait. The two never left each other and moved in together permanently in 1886. In 1887, Breslau completed 'Contre Jour', one of her masterpieces, depicting the couple she formed with Zillhardt in their intimacy. The painting was bought by the Swiss government in 1896 and is today held by the Museum of Fine Arts Bern. In 1908, Breslau painted 'La Vie pensive', another portrait of the couple, now displayed at the Cantonal Museum of Fine Arts in Lausanne.
Zillhardt became one of the most original decorators of her time, and Louise Breslau had great success in the world of painting. The two women became a fixture of the Parisian art scene and received their artist friends: Henri Fantin-Latour, Auguste Rodin, and Edgar Degas, of whom Zillhardt wrote a biography. During the First World War, Madeleine Zillhardt distinguished herself in the decorative arts with her "patriotic faience", in support of Clemenceau or denouncing the bombing of civilians, such as Fluctuat nec mergitur, Paris bombed, executed in 1918 and now part of the collections of the French national Museum of Air and Space. She also returned to painting alongside Breslau: they painted portraits of soldiers, nurses and doctors on their way to the front, to give to their families before they left.
After the war, the health of Breslau declined until her death on May 12, 1927 in Paris.
Zillhardt spent the rest of her life perpetuating the work of her companion, making donations to various museums. Thanks to her, Breslau's work was not overly dispersed and appears today in international collections.
The barge 'Louise-Catherine'
In 1928, Zillhardt bought the concrete barge 'Liège' in Paris in order to make it available to the Salvation Army. With the support of Winnaretta Singer, princesse de Polignac and heiress of the Singer sewing machine company, the barge was rehabilitated in 1929 by Le Corbusier and his student, the Japanese architect Kunio Maekawa.
In accordance with Madeleine Zillhardt's wishes, the boat was named Louise-Catherine in tribute to Breslau and became a refuge for the homeless in winter and a summer camp for children, moored in Paris on the banks of the Seine, at the Pont des Arts and at the Pont d'Austerlitz.
In 2006, the boat was taken over by the Fondation Le Corbusier. On February 10, 2018, it accidentally sank during the flood of the Seine in Paris. The Louise-Catherine barge is still located at the Port of Austerlitz, in the 13th arrondissement, awaiting a renovation project.
Legacy
A street in Paris is named 'place Louise Catherine Breslau & Madeleine Zillhardt', in the 6th arrondissement (Saint-Germain-des-Prés).
In Saint-Quentin (home town of Madeleine Zillhardt), the 'Musée Antoine Lécuyer' exhibits 'Sous la lampe. Portrait de Madeleine Zillhardt'.
Bibliography
Madeleine Zillhardt. Louise-Catherine Breslau et ses amis, Paris, Éditions des Portiques, 1932.
Madeleine Zillhardt. Monsieur Edgar Degas'', Paris, L'Echoppe, new edition 2015.
References
1863 births
1950 deaths
20th-century French non-fiction writers
Académie Julian alumni
19th-century French women artists
20th-century French women artists
19th-century French women writers
20th-century French women writers
French lesbian artists
French lesbian writers
French designers
French LGBT painters
French people of World War I
|
The Men's 100 metre individual medley competition of the 2016 FINA World Swimming Championships (25 m) was held on 8 and 9 December 2016.
Records
Prior to the competition, the existing world and championship records were as follows.
Results
Heats
The heats were held at 09:30.
Semifinals
The semifinals were held at 18:44.
Semifinal 1
Semifinal 2
Final
The final was held at 18:42.
References
Men's 100 metre individual medley
|
English rock duo Royal Blood have released four studio albums, two extended plays (EPs), eighteen singles and twenty-two music videos. Formed in Brighton in March 2011, Royal Blood consists of bassist and vocalist Mike Kerr and drummer Ben Thatcher. After signing with Warner Bros. Records, the duo released their debut single "Out of the Black" in October 2013, which debuted at number 29 on the UK Rock & Metal Singles Chart. In February 2014, "Little Monster" was issued as the band's second single, registering on the UK Singles Chart at number 95 and the UK Rock & Metal Singles Chart at number one. Both singles were later issued alongside their B-sides on the EP Out of the Black in March. "Come On Over" – initially featured as the B-side to "Out of the Black" – was released as a single in April, reaching number 68 on the UK Singles Chart. At the same time, "Little Monster" also returned to the charts, peaking at number 74 on the UK Singles Chart.
Royal Blood's self-titled debut album was released in August 2014, topping the UK Albums Chart, Irish Albums Chart and Scottish Albums Chart. The week before the album's release, "Out of the Black" registered on the UK Singles Chart at number 78, while the band's fourth single "Figure It Out" debuted at number 50 (it would later peak at number 43). Royal Blood was certified gold by the British Phonographic Industry (BPI) by September, and eventually received a platinum certification for sales in excess of 300,000 units. "Ten Tonne Skeleton" was released as the fifth and final single from Royal Blood in late 2014; it charted in Canada only, reaching number 45 on the Canadian Rock Songs chart. "Where Are You Now?", featured on the TV series Vinyl, also registered on the chart, peaking at number 43.
In April 2017, it was announced that Royal Blood's second album How Did We Get So Dark? would be released in June. "Lights Out" was issued as the first single from the album the same month, debuting at number 96 on the UK Singles Chart and number one on the UK Rock & Metal Singles Chart. How Did We Get So Dark? debuted atop the UK Albums Chart upon its release, while a total of ten tracks from the album reached the top 20 of the UK Rock & Metal Singles Chart.
The band officially announced their third studio album Typhoons on 21 January 2021, with a planned release date of 30 April 2021. The band released three singles from Typhoons preceding the album's release: "Trouble's Coming", "Typhoons", and "Limbo".
On 25 May 2023, Royal Blood released "Mountains at Midnight", the lead single from their fourth studio album, Back to the Water Below. The second single "Pull Me Through" was released on 13 July 2023. The album was released on 1 September 2023.
Studio albums
Extended plays
Singles
Featured singles
Other charted songs
Music videos
Footnotes
References
External links
Royal Blood official website
Royal Blood discography at AllMusic
Royal Blood discography at Discogs
Royal Blood discography at MusicBrainz
Royal Blood
Royal Blood
|
```c
#include <3ds.h>
#include <string.h>
#include "process_monitor.h"
#include "exheader_info_heap.h"
#include "termination.h"
#include "reslimit.h"
#include "manager.h"
#include "util.h"
#include "luma_shared_config.h"
static void cleanupProcess(ProcessData *process)
{
if (process->flags & PROCESSFLAG_DEPENDENCIES_LOADED) {
ExHeader_Info *exheaderInfo = ExHeaderInfoHeap_New();
if (exheaderInfo == NULL) {
panic(0);
}
listAndTerminateDependencies(process, exheaderInfo);
ExHeaderInfoHeap_Delete(exheaderInfo);
}
if (!(process->flags & PROCESSFLAG_KIP)) {
SRVPM_UnregisterProcess(process->pid);
FSREG_Unregister(process->pid);
LOADER_UnregisterProgram(process->programHandle);
}
ProcessList_Lock(&g_manager.processList);
if (g_manager.runningApplicationData != NULL && process->handle == g_manager.runningApplicationData->handle) {
if (IS_N3DS && OS_KernelConfig->app_memtype == 6) {
assertSuccess(resetAppMemLimit());
}
g_manager.runningApplicationData = NULL;
// We need to do this here to ensure that the ExHeader at init matches the ExHeader
// at termination at all times, otherwise the process refcounts of sysmodules
// get all messed up.
Luma_SharedConfig->hbldr_3dsx_tid = Luma_SharedConfig->selected_hbldr_3dsx_tid;
}
if (g_manager.debugData != NULL && process->handle == g_manager.debugData->handle) {
g_manager.debugData = NULL;
}
ProcessList_Unlock(&g_manager.processList);
if (process->flags & PROCESSFLAG_NOTIFY_TERMINATION) {
notifySubscribers(0x110 + process->terminatedNotificationVariation);
}
}
void processMonitor(void *p)
{
(void)p;
Handle handles[0x41] = { g_manager.newProcessEvent };
for (;;) {
u32 numProcesses = 0;
bool atLeastOneTerminating = false;
ProcessData *process;
ProcessData processBackup;
s32 id = -1;
ProcessList_Lock(&g_manager.processList);
FOREACH_PROCESS(&g_manager.processList, process) {
// Rebuild the handle array
if (process->terminationStatus != TERMSTATUS_TERMINATED) {
handles[1 + numProcesses++] = process->handle;
if (process->terminationStatus == TERMSTATUS_NOTIFICATION_SENT) {
atLeastOneTerminating = true;
}
}
}
ProcessList_Unlock(&g_manager.processList);
// If no more processes are terminating, signal the event
if (g_manager.waitingForTermination && !atLeastOneTerminating) {
assertSuccess(svcSignalEvent(g_manager.allNotifiedTerminationEvent));
}
// Note: lack of assertSuccess is intentional.
svcWaitSynchronizationN(&id, handles, 1 + numProcesses, false, -1LL);
if (id > 0) {
// Note: official PM conditionally erases the process from the list, cleans up, then conditionally frees the process data
// Bug in official PM (?): it unlocks the list before setting termstatus = TERMSTATUS_TERMINATED
ProcessList_Lock(&g_manager.processList);
process = ProcessList_FindProcessByHandle(&g_manager.processList, handles[id]);
if (process != NULL) {
process->terminationStatus = TERMSTATUS_TERMINATED;
if (process->flags & PROCESSFLAG_NOTIFY_TERMINATION) {
process->flags |= PROCESSFLAG_NOTIFY_TERMINATION_TERMINATED;
}
processBackup = *process; // <-- make sure no list access is done through this node
// Note: PROCESSFLAG_NOTIFY_TERMINATION_TERMINATED can be set by terminateProcessImpl
// APT is shit, why must an app call APT to ask to terminate itself?
if (!(process->flags & PROCESSFLAG_NOTIFY_TERMINATION_TERMINATED)) {
ProcessList_Delete(&g_manager.processList, process);
}
}
ProcessList_Unlock(&g_manager.processList);
if (process != NULL) {
cleanupProcess(&processBackup);
if (!(processBackup.flags & PROCESSFLAG_NOTIFY_TERMINATION_TERMINATED)) {
svcCloseHandle(processBackup.handle);
}
}
}
}
}
```
|
Las mujeres mandan ("The Women Rule") is a 1937 Mexican film. It was directed by Fernando de Fuentes.
Plot
Del Diestro plays Isidoro, a bored bank teller who decides to leave his family to follow a young dancer, Chayito, played by Tamayo. Once in Mexico City they become lovers, and she asks him to rob the bank he used to work for.
Cast
Alfredo del Diestro
Marina Tamayo
Sara Garcia
Joaquin Coss
Manuel Buendia
Carmen Conde
References
External links
1937 films
1930s Spanish-language films
Films directed by Fernando de Fuentes
Mexican black-and-white films
Mexican crime drama films
1937 crime drama films
1930s Mexican films
|
The Maiden Stone, also known as the Drumdurno Stone after the nearby farm, is a Pictish standing stone near Inverurie in Aberdeenshire in Scotland, probably dating to the 9th century AD.
Name
The name is derived from local legend, incorporating the most obvious mark of wear and tear on the stone: a triangular notch toward the top of the monument.
The legend states that the daughter of the Laird of Balquhain made a bet with a stranger that she could bake a bannock faster than he could build a road to the top of Bennachie. The prize would be the maiden's hand. However, the stranger was the Devil and finished the road and claimed the forfeit. The maiden ran from the Devil and prayed to be saved. The legend finishes by saying that God turned her to stone, but the notch is where the Devil grasped her shoulder as she ran.
Purpose
Based on the mixture of Pictish and Christian symbols on the stone it is most likely that the stone marks a preaching site during missionary trips to the Picts.
Description
The stone is red granite, standing 3.01 m high (one of the tallest of all Pictish monuments, even though several centimetres have been lost at the top owing to weathering). It is a Class II Pictish monument, combining Christian and pre-Christian Pictish motifs, dating from the late 8th or early 9th century AD.
The west side has a ringed cross below a human figure between two "fish-monsters". Below the cross there is a square panel with a disc containing a Celtic spiral motif at its centre, surrounded by a key-patterned ring, with knotwork patterns infilling the corners. On the reverse, there are four panels enclosing: a large centaur below three very weathered figures (possibly two smaller wrestling centaurs and a dog); a "notched rectangle and Z-rod" symbol; a Pictish Beast symbol, a mirror and a comb. There is a knotwork pattern on the narrow north edge and a keywork pattern on the south edge. A portion of the north edge is missing and the patterns are heavily eroded, particularly on the western face.
The human figure and "fish-monsters" may represent the Biblical story of Jonah and the Whale, with the whale doubled to make the design symmetrical.
The site is a scheduled ancient monument under the care of Historic Environment Scotland and is open at all times.
See also
Stones of Scotland
References
Historic Scotland properties in Aberdeenshire
Pictish stones
Christianity in medieval Scotland
Pictish stones in Aberdeenshire
Scheduled Ancient Monuments in Aberdeenshire
Inverurie
|
The 2023–24 East of Scotland Football League (known as the Central Taxis East of Scotland League for sponsorship reasons) is the 95th season of the East of Scotland Football League, and the 10th season with its top division as part of the sixth tier of the Scottish football pyramid system. Linlithgow Rose are the reigning champions but are unable to defend their title after gaining promotion to the Lowland Football League.
Teams
The following teams changed division after the 2022–23 season.
To East of Scotland Football League
Transferred from Lothian & Edinburgh Amateur League
Linton Hotspur
From East of Scotland Football League
Promoted to Lowland Football League
Linlithgow Rose
Folded
Syngenta
Premier Division
Relegated from 2022–23 Premier Division:
Blackburn United
Oakley United
Vale of Leithen
Promoted to 2023–24 Premier Division:
Dunbar United
Glenrothes
Luncarty
Kinnoull
Stadia and locations
Notes
All grounds are equipped with floodlights, except Humbug Park (Crossgates Primrose), Warout Stadium (Glenrothes) and Brownlands Park (Luncarty).
League table
Results
First Division
Relegated from 2022–23 First Division:
Burntisland Shipyard
Coldstream
Kennoway Star Hearts
Promoted to 2023–24 First Division:
Whitburn
St Andrews United
Heriot-Watt University
Arniston Rangers
Stadia and locations
Notes
League table
Results
Second Division
Relegated from 2022–23 Second Division:
Craigroyston
Hawick Royal Albert
Lochgelly Albert
Syngenta (folded)
Promoted to 2023–24 Second Division:
Bo'ness Athletic
Armadale Thistle
Edinburgh College
Stadia and locations
Notes
League table
Results
Third Division
Transferred from Lothian & Edinburgh Amateur League
Linton Hotspur
Stadia and locations
League table
Results
Notes
Club with an SFA licence eligible to participate in the Lowland League promotion play-off should they win the Premier Division.
References
External links
East of Scotland Football League
6
Sco6
|
```csharp
namespace Unosquare.FFME.Platform
{
using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;
/// <summary>
    /// Represents a proxy for the properties exposed by the given type.
/// </summary>
/// <typeparam name="T">The type that this proxy represents.</typeparam>
internal sealed class ClassProxy<T>
where T : class
{
/// <summary>
/// Initializes a new instance of the <see cref="ClassProxy{T}"/> class.
/// </summary>
public ClassProxy()
: this((p) => true)
{
// placeholder
}
/// <summary>Initializes a new instance of the <see cref="ClassProxy{T}"/> class.</summary>
/// <param name="matchClause">The match clause.</param>
public ClassProxy(Func<PropertyInfo, bool> matchClause)
{
var properties = RetrieveProperties().OrderBy(p => p.Name).ToArray();
var proxies = new Dictionary<string, IPropertyProxy>(properties.Length, StringComparer.Ordinal);
foreach (var property in properties)
{
if (!matchClause(property))
continue;
var proxy = CreatePropertyProxy(property);
proxies[property.Name] = proxy;
}
Properties = proxies;
PropertyNames = Properties.Keys.ToArray();
ReadOnlyPropertyNames = Properties
.Where(kvp => kvp.Value.CanRead && !kvp.Value.CanWrite)
.Select(kvp => kvp.Key).OrderBy(s => s).ToArray();
ReadWritePropertyNames = Properties
.Where(kvp => kvp.Value.CanRead && kvp.Value.CanWrite)
.Select(kvp => kvp.Key).OrderBy(s => s).ToArray();
}
/// <summary>
/// Gets the property proxies for this class.
/// </summary>
public IReadOnlyDictionary<string, IPropertyProxy> Properties { get; }
/// <summary>
/// Gets the registered property names.
/// </summary>
public IReadOnlyList<string> PropertyNames { get; }
/// <summary>
/// Gets the read only property names.
/// </summary>
public IReadOnlyList<string> ReadOnlyPropertyNames { get; }
/// <summary>
/// Gets the property names that are both, readable and writable.
/// </summary>
public IReadOnlyList<string> ReadWritePropertyNames { get; }
/// <summary>
/// Gets or sets the value of the specified property, with the specified instance.
/// </summary>
/// <value>
/// The value to set.
/// </value>
/// <param name="instance">The instance.</param>
/// <param name="propertyName">Name of the property.</param>
/// <returns>The value of the property.</returns>
public object this[T instance, string propertyName]
{
get => Properties[propertyName].GetValue(instance);
set => Properties[propertyName].SetValue(instance, value);
}
/// <summary>
/// Gets the <see cref="IPropertyProxy"/> for the specified property name.
/// </summary>
/// <value>
/// The <see cref="IPropertyProxy"/>.
/// </value>
/// <param name="propertyName">Name of the property.</param>
/// <returns>The property proxy.</returns>
public IPropertyProxy this[string propertyName] => Properties[propertyName];
/// <summary>
/// Retrieves the property information for the properties of the specified type.
/// </summary>
/// <returns>A collection of property information objects.</returns>
public static IReadOnlyList<PropertyInfo> RetrieveProperties()
{
var flags = BindingFlags.Instance | BindingFlags.Public;
var declaredOnly = typeof(T).IsInterface;
if (declaredOnly) flags |= BindingFlags.DeclaredOnly;
var result = new List<PropertyInfo>(64);
var propertyInfos = typeof(T).GetProperties(flags).ToArray();
foreach (var propertyInfo in propertyInfos)
result.Add(propertyInfo);
return result;
}
/// <summary>
/// Creates an instance of the property proxy without the need to specify type argument explicitly.
/// </summary>
/// <param name="propertyInfo">The property information.</param>
        /// <returns>The property proxy containing metadata and getter and setter delegates.</returns>
private static IPropertyProxy CreatePropertyProxy(PropertyInfo propertyInfo)
{
var genericType = typeof(PropertyProxy<,>)
.MakeGenericType(propertyInfo.DeclaringType, propertyInfo.PropertyType);
return Activator.CreateInstance(genericType, propertyInfo) as IPropertyProxy;
}
}
}
```
|
```html
<!DOCTYPE html>
<html lang="en-US">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width">
<title>Postcard example</title>
<style>
@font-face {
font-family: 'handwriting';
src: url('fonts/journal-webfont.woff2') format('woff2'),
url('fonts/journal-webfont.woff') format('woff');
font-weight: normal;
font-style: normal;
}
@font-face {
font-family: 'typewriter';
src: url('fonts/veteran_typewriter-webfont.woff2') format('woff2'),
url('fonts/veteran_typewriter-webfont.woff') format('woff');
font-weight: normal;
font-style: normal;
}
body {
font : 1.3rem sans-serif;
padding : 0.5em;
margin : 0;
background : #222;
}
form {
position : relative;
width : 740px;
height : 498px;
margin : 0 auto;
padding: 1em;
box-sizing: border-box;
background : #FFF url(background.jpg);
/* we create our grid */
display : grid;
gap : 20px;
grid-template-columns : repeat(2, 1fr);
grid-template-rows : 10em 1em 1em 1em;
}
h1 {
font : 1em "typewriter", monospace;
align-self : end;
}
#message {
grid-row: 1 / 5;
}
#from, #reply {
display: flex;
}
label {
font : .8em "typewriter", sans-serif;
}
input,
textarea {
font : 1.4em/1.5em "handwriting", cursive, sans-serif;
border : none;
padding : 0 10px;
margin : 0;
width : 80%;
background : none;
}
input:focus,
textarea:focus {
background : rgba(0,0,0,.1);
border-radius: 5px;
}
textarea {
display : block;
padding : 10px;
margin : 10px 0 0 -10px;
width : 100%;
height : 90%;
border-right: 1px solid;
/* resize : none; */
overflow: auto;
}
button {
padding : 5px;
font : bold .6em sans-serif;
border : 2px solid #333;
border-radius: 5px;
background : none;
cursor : pointer;
transform : rotate(-1.5deg);
}
button:after {
content : " >>>";
}
button:hover,
button:focus {
outline : none;
background : #000;
color : #FFF;
}
</style>
</head>
<body>
<form>
<h1>to: Mozilla</h1>
<div id="from">
<label for="name">from:</label>
<input type="text" id="name" name="user_name">
</div>
<div id="reply">
<label for="mail">reply:</label>
<input type="email" id="mail" name="user_email">
</div>
<div id="message">
<label for="msg">Your message:</label>
<textarea id="msg" name="user_message"></textarea>
</div>
<div class="button">
<button type="submit">Send your message</button>
</div>
</form>
</body>
</html>
```
|
Jean-Baptiste-Pierre Le Romain (dates of birth and death unknown) was a French engineer and contributor to the Encyclopédie during the eighteenth century and the Age of Enlightenment.
Living in Martinique, in 1734 Le Romain drew up a topographic map of the island for the French government. In 1740 he was sent as an assistant engineer to improve the fortifications of the island of Grenada, where he was appointed chief engineer in 1748.
Le Romain provided nearly seventy articles on the Caribbean between volumes III and XVI of the Encyclopédie of Diderot and D'Alembert. More than half of his articles are short descriptions of the produce, animals, minerals and geography of the Caribbean islands. He went into more detail on the production of regional products such as indigo, sugar and foods derived from cassava. Le Romain also wrote the entry for 'negroes', which reflects colonial prejudices against black people, referring to them as "barbarians".
A competent naturalist, Le Romain's contributions gave Diderot and d'Alembert first-hand information on the Caribbean and thus an expanded view of the world for their Encyclopédie.
References
External links
List of Le Romain's contributions to the Encyclopédie.
Year of birth unknown
Year of death unknown
Contributors to the Encyclopédie (1751–1772)
French engineers
|
The Dorsland tree, also known as the Dorsland Baobab and the Dorsland Grootboom (), is a baobab tree in Namibia which is found to the south of the Khaudum National Park. At around 2,100 years old, it is believed to be the oldest tree in the country. It is high, has a circumference of more than and keeps growing despite having fallen over. In 2006, two of its oldest relatives died. The previous oldest tree in the country, known as Grootboom, was high, but died in 2005.
The tree was where the Dorsland trekkers camped in 1883, and they carved the year of their visit, "1883", into the tree. In 1891 a detachment of German troops passed the tree and carved "1891" and the names of three of the group, "H. Gathemann", "E. Heller" and "D. Hannemann", into the tree.
References
Individual baobab trees
|
```csharp
using Microsoft.Bot.Builder.Integration.AspNet.Core;
using Microsoft.Bot.Connector.Authentication;
using Microsoft.Extensions.Logging;
namespace Microsoft.BotBuilderSamples
{
public class AdapterWithErrorHandler : CloudAdapter
{
public AdapterWithErrorHandler(BotFrameworkAuthentication auth, ILogger<IBotFrameworkHttpAdapter> logger)
: base(auth, logger)
{
OnTurnError = async (turnContext, exception) =>
{
// Log any leaked exception from the application.
// NOTE: In production environment, you should consider logging this to
// Azure Application Insights. Visit path_to_url to see how
// to add telemetry capture to your bot.
logger.LogError($"Exception caught : {exception.Message}");
// Send a catch-all apology to the user.
await turnContext.SendActivityAsync("Sorry, it looks like something went wrong.");
};
}
}
}
```
|
```cpp
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
#ifndef V8_COMPILER_S390_INSTRUCTION_CODES_S390_H_
#define V8_COMPILER_S390_INSTRUCTION_CODES_S390_H_
namespace v8 {
namespace internal {
namespace compiler {
// S390-specific opcodes that specify which assembly sequence to emit.
// Most opcodes specify a single instruction.
#define TARGET_ARCH_OPCODE_LIST(V) \
V(S390_And32) \
V(S390_And64) \
V(S390_Or32) \
V(S390_Or64) \
V(S390_Xor32) \
V(S390_Xor64) \
V(S390_ShiftLeft32) \
V(S390_ShiftLeft64) \
V(S390_ShiftLeftPair) \
V(S390_ShiftRight32) \
V(S390_ShiftRight64) \
V(S390_ShiftRightPair) \
V(S390_ShiftRightArith32) \
V(S390_ShiftRightArith64) \
V(S390_ShiftRightArithPair) \
V(S390_RotRight32) \
V(S390_RotRight64) \
V(S390_Not32) \
V(S390_Not64) \
V(S390_RotLeftAndMask32) \
V(S390_RotLeftAndClear64) \
V(S390_RotLeftAndClearLeft64) \
V(S390_RotLeftAndClearRight64) \
V(S390_Add32) \
V(S390_Add64) \
V(S390_AddPair) \
V(S390_AddFloat) \
V(S390_AddDouble) \
V(S390_Sub32) \
V(S390_Sub64) \
V(S390_SubFloat) \
V(S390_SubDouble) \
V(S390_SubPair) \
V(S390_MulPair) \
V(S390_Mul32) \
V(S390_Mul32WithHigh32) \
V(S390_Mul64) \
V(S390_MulHigh32) \
V(S390_MulHighU32) \
V(S390_MulFloat) \
V(S390_MulDouble) \
V(S390_Div32) \
V(S390_Div64) \
V(S390_DivU32) \
V(S390_DivU64) \
V(S390_DivFloat) \
V(S390_DivDouble) \
V(S390_Mod32) \
V(S390_Mod64) \
V(S390_ModU32) \
V(S390_ModU64) \
V(S390_ModDouble) \
V(S390_Neg32) \
V(S390_Neg64) \
V(S390_NegDouble) \
V(S390_NegFloat) \
V(S390_SqrtFloat) \
V(S390_FloorFloat) \
V(S390_CeilFloat) \
V(S390_TruncateFloat) \
V(S390_AbsFloat) \
V(S390_SqrtDouble) \
V(S390_FloorDouble) \
V(S390_CeilDouble) \
V(S390_TruncateDouble) \
V(S390_RoundDouble) \
V(S390_MaxFloat) \
V(S390_MaxDouble) \
V(S390_MinFloat) \
V(S390_MinDouble) \
V(S390_AbsDouble) \
V(S390_Cntlz32) \
V(S390_Cntlz64) \
V(S390_Popcnt32) \
V(S390_Popcnt64) \
V(S390_Cmp32) \
V(S390_Cmp64) \
V(S390_CmpFloat) \
V(S390_CmpDouble) \
V(S390_Tst32) \
V(S390_Tst64) \
V(S390_Push) \
V(S390_PushFrame) \
V(S390_StoreToStackSlot) \
V(S390_ExtendSignWord8) \
V(S390_ExtendSignWord16) \
V(S390_ExtendSignWord32) \
V(S390_Uint32ToUint64) \
V(S390_Int64ToInt32) \
V(S390_Int64ToFloat32) \
V(S390_Int64ToDouble) \
V(S390_Uint64ToFloat32) \
V(S390_Uint64ToDouble) \
V(S390_Int32ToFloat32) \
V(S390_Int32ToDouble) \
V(S390_Uint32ToFloat32) \
V(S390_Uint32ToDouble) \
V(S390_Float32ToInt64) \
V(S390_Float32ToUint64) \
V(S390_Float32ToInt32) \
V(S390_Float32ToUint32) \
V(S390_Float32ToDouble) \
V(S390_Float64SilenceNaN) \
V(S390_DoubleToInt32) \
V(S390_DoubleToUint32) \
V(S390_DoubleToInt64) \
V(S390_DoubleToUint64) \
V(S390_DoubleToFloat32) \
V(S390_DoubleExtractLowWord32) \
V(S390_DoubleExtractHighWord32) \
V(S390_DoubleInsertLowWord32) \
V(S390_DoubleInsertHighWord32) \
V(S390_DoubleConstruct) \
V(S390_BitcastInt32ToFloat32) \
V(S390_BitcastFloat32ToInt32) \
V(S390_BitcastInt64ToDouble) \
V(S390_BitcastDoubleToInt64) \
V(S390_LoadWordS8) \
V(S390_LoadWordU8) \
V(S390_LoadWordS16) \
V(S390_LoadWordU16) \
V(S390_LoadWordS32) \
V(S390_LoadWordU32) \
V(S390_LoadReverse16RR) \
V(S390_LoadReverse32RR) \
V(S390_LoadReverse64RR) \
V(S390_LoadReverse16) \
V(S390_LoadReverse32) \
V(S390_LoadReverse64) \
V(S390_LoadWord64) \
V(S390_LoadFloat32) \
V(S390_LoadDouble) \
V(S390_StoreWord8) \
V(S390_StoreWord16) \
V(S390_StoreWord32) \
V(S390_StoreWord64) \
V(S390_StoreReverse16) \
V(S390_StoreReverse32) \
V(S390_StoreReverse64) \
V(S390_StoreFloat32) \
V(S390_StoreDouble)
// Addressing modes represent the "shape" of inputs to an instruction.
// Many instructions support multiple addressing modes. Addressing modes
// are encoded into the InstructionCode of the instruction and tell the
// code generator after register allocation which assembler method to call.
//
// We use the following local notation for addressing modes:
//
// R = register
// O = register or stack slot
// D = double register
// I = immediate (handle, external, int32)
// MRI = [register + immediate]
// MRR = [register + register]
#define TARGET_ADDRESSING_MODE_LIST(V) \
V(MR) /* [%r0 ] */ \
V(MRI) /* [%r0 + K] */ \
V(MRR) /* [%r0 + %r1 ] */ \
V(MRRI) /* [%r0 + %r1 + K] */
} // namespace compiler
} // namespace internal
} // namespace v8
#endif // V8_COMPILER_S390_INSTRUCTION_CODES_S390_H_
```
|
Paradise Camp is a 1986 documentary film about Theresienstadt concentration camp in Czechoslovakia, written and directed by Australians Paul Rea and Frank Heimans, respectively. Czechoslovakian Jews were first told that Theresienstadt was a community established for their safety. They quickly recognized it as a ghetto and concentration camp.
In 1944, the Nazis cleaned up the camp, painting buildings, planting flowers and deporting inmates to reduce overcrowding, in order to fool visiting international Red Cross officials into believing the Jews were being well cared for. That year, the Germans also filmed a propaganda documentary at Theresienstadt to show how well they were supposedly caring for Jews.
The 1986 film includes excerpts from the propaganda film, in contrast with interviews of survivors, other material about the facts of the camp, and examples of art made by prisoners, including thousands of children's drawings hidden and preserved by their teacher.
Summary
"They had nice coats. They brought pictures", a witness remembers of prominent Czech Jews entering Theresienstadt at its opening. "They wanted to make their beautiful spa stay very nice, and, when they arrived in big barracks, they couldn’t understand what happened".
Paradise Camp reveals how Nazi deception fooled Jews into entering Theresienstadt willingly. Elderly Jews were told the camp would be their safe haven, World War I veterans thought that their service to Germany was being rewarded, and prominent Jews thought they were being given special treatment for their German nationalism, with fine accommodations and protection from the war. But they all were soon held in densely overcrowded barracks, eating meager portions of bread, and fearing for their lives. Paradise Camp features interviews with survivors, who experienced the hunger, filth and terror that the Nazi officials intermittently masked, and displays old photographs and archival film footage, to reconstruct the truth about Theresienstadt.
Originally a fortress town in Terezin, Czechoslovakia, Theresienstadt was built in the late 1700s and named after the Austrian Empress Maria Theresa. The Nazis realized the high stone walls that surrounded the city made it an ideal site in which to resettle Jews from Czechoslovakia and its neighboring Eastern European nations. In 1941, the Germans established it as a Jewish ghetto, and transported tens of thousands of Jews there, forcing them to do the work of housing and feeding the large population. More than 150,000 Jews passed through Theresienstadt; most were killed in death camps. In late 1942, the Germans began to deport Jews from Theresienstadt to extermination camps throughout Eastern Europe, including Auschwitz, Majdanek and Treblinka. A smaller fortress on the other side of the river was used for political prisoners and, later, some Allied prisoners of war.
In 1944, preparing for a visit from the Red Cross (the Danish government had expressed concern about its nationals, and the Germans were trying to maintain some Danish cooperation for forced labor), the Nazis had Jewish workers improve the complex, painting buildings, cleaning the streets and planting flower pots. The Germans deported thousands of inmates to concentration camps to reduce overcrowding, and brought in ample props. That year they directed a Jewish prisoner filmmaker to produce a propaganda film portraying life at Theresienstadt concentration camp as comfortable and enjoyable. In the film, older women knit, a small boy waters a garden from an oversized water barrel, and everyone wears a lazy smile and a healthy layer of fat.
The 1986 documentary has interviews with survivors, who recount what life was really like at Theresienstadt. One woman remembers eating crushed red brick and pretending it was paprika, while crushed bark served as an alternative spice. Another woman recalls that before the Red Cross inspection, the Nazis built children's rooms painted in bright colors and lined with small beds. But the Nazis never told the Red Cross inspectors that the tiny beds were used by 17-year-olds, because all the younger children had already been deported and murdered in extermination camps.
Together with academic classes, prisoners arranged for drawing classes for children. Their teacher hid 4,000 drawings, which were found a decade after the war. Adult artists were also among the prisoners, and their work expressed the horror of daily life. The documentary shows their work: sketches of prisoners with sunken eyes, ragged clothes, and bony fingers document starvation and need.
Filmmakers
Paul Rea is an Australian journalist who in 1985 produced Where Death Wears a Smile, a documentary about Allied prisoners of war being held at the smaller fortress of Theresienstadt during World War II. He alleged that dozens had been killed there. While the executions were refuted by an Australian survivor of the camp, the film attracted attention from the government. For years, New Zealand and Australia had denied that any of their POWs were held there. After Rea's film, in 1987 the Australian government conducted a formal investigation, finding that some Australian soldiers had been interned at Theresienstadt, under conditions that violated the Geneva Conventions. They paid the survivors $10,000 each in compensation.
Frank Heimans is a director and producer.
References
External links
Paradise Camp, reviewed by The Jewish Channel
Terezin (Theresienstadt) Revealed --Selected Holocaust Resource, I Survived
Fortress Details
Terezín Memorial
Australian documentary films
1986 films
Documentary films about the Holocaust
1986 documentary films
Theresienstadt Ghetto
1980s English-language films
|
- Default function parameters
- Handling modules
- Creating promises
- The `spread` operator
- Reflect API in ES6
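The ES6 features named above lend themselves to short illustrations. The snippets below are sketches written for this list, not excerpts from the underlying material:

```javascript
// Default function parameters
function greet( name = 'world' ) {
	return 'Hello, ' + name;
}

// Handling modules (ES6 syntax; requires a module context):
// import { greet } from './greet.js';

// Creating promises
const delayed = new Promise( ( resolve ) => {
	setTimeout( () => resolve( 'done' ), 10 );
} );

// The `spread` operator
const base = [ 1, 2 ];
const extended = [ ...base, 3 ]; // [ 1, 2, 3 ]

// Reflect API
const target = { a: 1 };
Reflect.set( target, 'b', 2 );

delayed.then( ( v ) => console.log( greet(), extended, target.b, v ) );
```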
|
- Navigating the browser history
- Permission API
- FileReader.readAsArrayBuffer()
- Fetch API
- MediaDevices.getUserMedia()
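The browser APIs listed above can be sketched briefly. These are illustrative fragments only; the URLs are placeholders, and most of the calls require a browser environment:

```javascript
// Navigating the browser history (browser only):
// history.pushState( { page: 1 }, '', '/page-1' );
// history.back();

// Permission API (browser only):
// navigator.permissions.query( { name: 'geolocation' } )
//     .then( ( status ) => console.log( status.state ) );

// Fetch API (also available in modern Node.js runtimes):
async function getJSON( url ) {
	const res = await fetch( url );
	if ( !res.ok ) {
		throw new Error( 'HTTP ' + res.status );
	}
	return res.json();
}

// FileReader.readAsArrayBuffer() (browser only), wrapped in a promise:
function readBuffer( file ) {
	return new Promise( ( resolve, reject ) => {
		const reader = new FileReader();
		reader.onload = () => resolve( reader.result );
		reader.onerror = () => reject( reader.error );
		reader.readAsArrayBuffer( file );
	} );
}
```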
|
William Touchet of Hainton (died 1322), Lord of Hainton, Barkwith and Southrey, was an English noble. He fought in the wars in Scotland and sided with the baronial opposition to King Edward II. He was executed in 1322.
Biography
William was the son of Nicholas Tuchet and Alice de Kyme. He joined the baronial opposition under Thomas, Earl of Lancaster, against King Edward II of England. William was married to Elena de Denardeston. He was executed by hanging in 1322. His brother Richard was his heir.
Notes
Citations
References
Moor, Charles. Knights of Edward I. United Kingdom: n.p., 1932.
Year of birth unknown
1322 deaths
13th-century English people
14th-century English people
|
TB 191 was a second-class torpedo boat constructed for the Colony of Tasmania and later operated by the Commonwealth Naval Forces and the Royal Australian Navy. She was sold in 1911.
Design and construction
TB 191 was ordered by the Australian colonial government of Tasmania in 1882 to protect the colony from possible Russian or French attack, and was built by John I. Thornycroft & Company. The torpedo boat was long, with a draught of , and a displacement of 12.5 tons, similar to the other torpedo boats ordered by the other Australian colonies.
Operational history
Built at a cost of £4,011, TB 191 arrived in Tasmania on board SS Abington on 1 May 1884. Operated by the Tasmanian Torpedo Corps, she appears not to have seen much use in the service of the Tasmanian colony. She was sold in 1905 to the Colony of South Australia and towed to Adelaide by HMCS Protector, before becoming part of the Commonwealth Naval Forces.
She was sold in 1911.
Notes
References
1883 ships
Ships built in Chiswick
Ships of the South Australian Naval Service
Torpedo boats of the Royal Australian Navy
Torpedo boats of the Tasmanian Torpedo Corps
Ships built by John I. Thornycroft & Company
|
```java
/*
* DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
*
* This code is free software; you can redistribute it and/or modify it
* published by the Free Software Foundation.
*
* This code is distributed in the hope that it will be useful, but WITHOUT
* ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
* version 2 for more details (a copy is included in the LICENSE file that
* accompanied this code).
*
* 2 along with this work; if not, write to the Free Software Foundation,
* Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA.
*
* Please contact Oracle, 500 Oracle Parkway, Redwood Shores, CA 94065 USA
* or visit www.oracle.com if you need additional information or have any
* questions.
*/
package org.graalvm.visualizer.shell.actions;
import java.awt.event.ActionEvent;
import java.util.Map;
import javax.swing.Action;
import org.graalvm.visualizer.filter.DiagramFilters;
import org.graalvm.visualizer.filter.Filter;
import org.graalvm.visualizer.filter.FilterProvider;
import org.graalvm.visualizer.shell.ShellSession;
import org.graalvm.visualizer.view.api.DiagramModel;
import org.netbeans.api.editor.EditorActionRegistration;
import org.openide.util.Lookup;
/**
*
* @author sdedic
*/
// FIXME: make a specific script Lookup provider that will inject
// some subtree into MIMELookup
public class ExecuteScriptAction2 extends AbstractGraphShellAction {
public ExecuteScriptAction2() {
this(null, null);
}
ExecuteScriptAction2(Map<String, ?> attrs, Lookup lkp) {
super(attrs, lkp);
}
@EditorActionRegistration(
name = "execute-script",
// register for all mime types
mimeType = "",
iconResource = "org/graalvm/visualizer/shell/resources/execute.png",
toolBarPosition = 100
)
public static ExecuteScriptAction2 createEditorAction(Map params) {
return new ExecuteScriptAction2(params, null);
}
@Override
protected boolean computeEnabled(DiagramModel model, FilterProvider filterSource) {
return filterSource.getFilter() != null && model != null;
}
@Override
public void actionPerformed(ActionEvent ae) {
DiagramModel mdl = findDiagramModel();
Filter filter = getFilterSource().createFilter(true);
if (mdl == null || filter == null) {
return;
}
DiagramFilters modelFilters = mdl.getLookup().lookup(DiagramFilters.class);
if (modelFilters == null) {
return;
}
// apply the filter onto the model
modelFilters.applyScriptFilter(filter, ShellSession.getCurrentSession().getSharedEnvironment(), false, null);
}
@Override
public Action createContextAwareInstance(Lookup actionContext) {
return new ExecuteScriptAction2(attributes, actionContext);
}
}
```
|
Marsha Sue Ivins (born April 15, 1951) is an American retired astronaut and a veteran of five Space Shuttle missions.
Career
Ivins, born April 15, 1951, in Baltimore, Maryland, graduated from Nether Providence High School in Wallingford, Pennsylvania in 1969, and earned a Bachelor of Science degree in Aerospace Engineering from the University of Colorado at Boulder in 1973. She is Jewish-American. She went to work for NASA's Johnson Space Center, and worked mainly on orbiter displays and controls, before being assigned as a flight engineer in 1980 and co-pilot on NASA administrative aircraft. In 1984, Ivins was selected as an astronaut candidate.
She has flown aboard five space missions: STS-32 (1990), STS-46 (1992), STS-62 (1994), STS-81 (1997), and STS-98 (2001). Ivins retired from NASA on December 31, 2010.
Spaceflight experience
STS-32 (January 9–20, 1990) launched from the Kennedy Space Center, Florida, on an eleven-day flight, during which crew members on board the Space Shuttle Columbia successfully deployed a Syncom satellite, and retrieved the 21,400-pound Long Duration Exposure Facility (LDEF). Mission duration was 261 hours, 1 minute, and 38 seconds. Following 173 orbits of the Earth and 4.5 million miles, Columbia returned with a night landing at Edwards Air Force Base, California.
STS-46 (July 31 – August 8, 1992) was an 8-day mission, during which crew members deployed the EURECA (European Retrievable Carrier) satellite, and conducted the first Tethered Satellite System (TSS) test flight. Mission duration was 191 hours, 16 minutes, and 7 seconds. Space Shuttle Atlantis and her crew launched and landed at the Kennedy Space Center, Florida, completing 126 orbits of the Earth in 3.35 million miles.
STS-62 (March 4–18, 1994) was a 14-day mission for the United States Microgravity Payload (USMP) 2 and Office of Aeronautics and Space Technology (OAST) 2 payloads. These payloads studied the effects of microgravity on materials sciences and other space flight technologies. Other experiments on board included demonstration of advanced teleoperator tasks using the remote manipulator system, protein crystal growth, and dynamic behavior of space structures. Mission duration was 312 hours, 23 minutes, and 16 seconds. Space Shuttle Columbia launched and landed at the Kennedy Space Center, Florida, completing 224 orbits in 5.82 million miles.
STS-81 Atlantis (January 12–22, 1997) was a 10-day mission, the fifth to dock with Russia's Space Station Mir, and the second to exchange U.S. astronauts. The mission also carried the Spacehab double module providing additional middeck locker space for secondary experiments. In five days of docked operations more than three tons of food, water, experiment equipment and samples were moved back and forth between the two spacecraft. Following 160 orbits of the Earth, the STS-81 mission concluded with a landing on Kennedy Space Center's Runway 33 ending a 3.9 million mile journey. Mission duration was 244 hours, 56 minutes.
STS-98 Atlantis (February 7–20, 2001) continued the task of building and enhancing the International Space Station by delivering the U.S. laboratory module Destiny. The Shuttle spent seven days docked to the station while Destiny was attached and three spacewalks were conducted to complete its assembly. The crew also relocated a docking port, and delivered supplies and equipment to the resident Expedition-1 crew. Space Shuttle Atlantis returned to land at Edwards Air Force Base, California traveling 5.3 million miles in 203 orbits. Mission duration was 12 days, 21 hours, 20 minutes.
References
Interviews
Budapest, December 6, 2012.
What It's Like to Spend 55 Days in Space, By Chris Mooney in Mother Jones, September 20, 2013, accessed May 7, 2014
An Astronaut Reveals What Life In Space Is Really Like, By Marsha Ivins as told to Caitlin Roper in WIRED Magazine, November 19, 2014, accessed November 23, 2014
External links
NASA Biography
1951 births
Living people
20th-century American Jews
Women astronauts
University of Colorado alumni
American women engineers
NASA civilian astronauts
People from Baltimore
21st-century women engineers
Space Shuttle program astronauts
Mir crew members
21st-century American Jews
20th-century American women
21st-century American women
|
```javascript
/**
* @license Apache-2.0
*
*
*
* path_to_url
*
* Unless required by applicable law or agreed to in writing, software
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
*/
'use strict';
// MODULES //
var tape = require( 'tape' );
var duration2ms = require( './../lib' );
// TESTS //
tape( 'main export is a function', function test( t ) {
t.ok( true, __filename );
t.strictEqual( typeof duration2ms, 'function', 'main export is a function' );
t.end();
});
tape( 'the function throws an error if not provided a duration string', function test( t ) {
var values;
var i;
values = [
5,
NaN,
true,
false,
null,
void 0,
[],
{},
function noop() {},
'2x',
'1w1d',
'1y3w12d'
];
for ( i = 0; i < values.length; i++ ) {
t.throws( badValue( values[i] ), TypeError, 'throws an error when provided '+values[i] );
}
t.end();
function badValue( value ) {
return function badValue() {
duration2ms( value );
};
}
});
tape( 'the function converts a duration string to milliseconds', function test( t ) {
var expected;
var values;
var ms;
var i;
values = [
'1ms',
'1s',
'1m',
'1h',
'1d',
'1d2h3m4s5ms'
];
expected = [ 1, 1000, 60000, 3600000, 86400000, 93784005 ];
for ( i = 0; i < values.length; i++ ) {
ms = duration2ms( values[i] );
t.equal( ms, expected[i], 'returns expected value when provided '+values[i] );
}
t.end();
});
tape( 'the function returns `0` if provided an empty string', function test( t ) {
var ms = duration2ms( '' );
t.equal( ms, 0, 'returns 0' );
t.end();
});
```
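The behavior exercised by these tests can be illustrated with a minimal duration-string parser. This is a sketch written against the test cases above, not the actual `duration2ms` implementation, and it assumes the units `d`, `h`, `m`, `s` and `ms`:

```javascript
// Millisecond multipliers for the assumed duration units:
var UNITS = {
	'd': 86400000,
	'h': 3600000,
	'm': 60000,
	's': 1000,
	'ms': 1
};

// Converts a duration string (e.g., '1d2h3m4s5ms') to milliseconds,
// throwing a TypeError for unrecognized input such as '2x' or '1w1d':
function toMilliseconds( str ) {
	var re = /(\d+)(ms|[dhms])/g;
	var out = 0;
	var len = 0;
	var m;
	while ( ( m = re.exec( str ) ) !== null ) {
		out += parseInt( m[ 1 ], 10 ) * UNITS[ m[ 2 ] ];
		len += m[ 0 ].length;
	}
	// If the matched segments do not cover the whole string, part of the
	// input was not a valid number/unit pair:
	if ( len !== str.length ) {
		throw new TypeError( 'unrecognized duration: ' + str );
	}
	return out;
}

console.log( toMilliseconds( '1d2h3m4s5ms' ) ); // => 93784005
```

Note the alternation `(ms|[dhms])` tries `ms` before the single-letter units, so `5ms` is parsed as milliseconds rather than `5m` followed by a stray `s`.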
|
```java
/*
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
*
* path_to_url
*
* Unless required by applicable law or agreed to in writing, software
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
*/
package org.apache.beam.runners.spark.structuredstreaming.translation;
import static org.apache.beam.sdk.Pipeline.PipelineVisitor.CompositeBehavior.DO_NOT_ENTER_TRANSFORM;
import static org.apache.beam.sdk.Pipeline.PipelineVisitor.CompositeBehavior.ENTER_TRANSFORM;
import static org.apache.beam.sdk.util.Preconditions.checkStateNotNull;
import static org.apache.beam.sdk.values.PCollection.IsBounded.UNBOUNDED;
import java.io.IOException;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.function.Supplier;
import javax.annotation.Nullable;
import org.apache.beam.runners.core.construction.SerializablePipelineOptions;
import org.apache.beam.runners.spark.SparkCommonPipelineOptions;
import org.apache.beam.runners.spark.structuredstreaming.translation.batch.functions.SideInputValues;
import org.apache.beam.runners.spark.structuredstreaming.translation.helpers.EncoderProvider;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.Pipeline.PipelineVisitor;
import org.apache.beam.sdk.annotations.Internal;
import org.apache.beam.sdk.coders.Coder;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.options.StreamingOptions;
import org.apache.beam.sdk.runners.AppliedPTransform;
import org.apache.beam.sdk.runners.TransformHierarchy.Node;
import org.apache.beam.sdk.transforms.PTransform;
import org.apache.beam.sdk.transforms.View;
import org.apache.beam.sdk.util.WindowedValue;
import org.apache.beam.sdk.util.construction.PTransformTranslation;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.PInput;
import org.apache.beam.sdk.values.POutput;
import org.apache.beam.sdk.values.PValue;
import org.apache.beam.sdk.values.TupleTag;
import org.apache.spark.broadcast.Broadcast;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoder;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.storage.StorageLevel;
import org.checkerframework.checker.nullness.qual.MonotonicNonNull;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import scala.reflect.ClassTag;
/**
* The pipeline translator translates a Beam {@link Pipeline} into a Spark correspondence, that can
* then be evaluated.
*
* <p>The translation involves traversing the hierarchy of a pipeline multiple times:
*
* <ol>
* <li>Detect if {@link StreamingOptions#setStreaming streaming} mode is required.
* <li>Identify datasets that are repeatedly used as input and should be cached.
* <li>And finally, translate each primitive or composite {@link PTransform} that is {@link
* #getTransformTranslator known} and {@link TransformTranslator#canTranslate supported} into
* its Spark correspondence. If a composite is not supported, it will be expanded further into
* its parts and translated then.
* </ol>
*/
@Internal
public abstract class PipelineTranslator {
private static final Logger LOG = LoggerFactory.getLogger(PipelineTranslator.class);
// Threshold to limit query plan complexity to avoid unnecessary planning overhead. Currently this
// is fairly low; Catalyst won't be able to optimize beyond ParDos anyway. Until there's
// dedicated support for schema transforms, there's little value in allowing more complex plans at
// this point.
private static final int PLAN_COMPLEXITY_THRESHOLD = 6;
public static void replaceTransforms(Pipeline pipeline, StreamingOptions options) {
pipeline.replaceAll(SparkTransformOverrides.getDefaultOverrides(options.isStreaming()));
}
/**
* Analyse the pipeline to determine if we have to switch to streaming mode for the pipeline
* translation and update {@link StreamingOptions} accordingly.
*/
public static void detectStreamingMode(Pipeline pipeline, StreamingOptions options) {
StreamingModeDetector detector = new StreamingModeDetector(options.isStreaming());
pipeline.traverseTopologically(detector);
options.setStreaming(detector.streaming);
}
/** Returns a {@link TransformTranslator} for the given {@link PTransform} if known. */
protected abstract @Nullable <
InT extends PInput, OutT extends POutput, TransformT extends PTransform<InT, OutT>>
TransformTranslator<InT, OutT, TransformT> getTransformTranslator(TransformT transform);
/**
* Translates a Beam pipeline into its Spark correspondence using the Spark SQL / Dataset API.
*
* <p>Note, in some cases this involves the early evaluation of some parts of the pipeline. For
* example, in order to use a side-input {@link org.apache.beam.sdk.values.PCollectionView
* PCollectionView} in a translation the corresponding Spark {@link
* org.apache.beam.runners.spark.translation.Dataset Dataset} might have to be collected and
* broadcasted to be able to continue with the translation.
*
* @return The result of the translation is an {@link EvaluationContext} that can trigger the
* evaluation of the Spark pipeline.
*/
public EvaluationContext translate(
Pipeline pipeline, SparkSession session, SparkCommonPipelineOptions options) {
LOG.debug("starting translation of the pipeline using {}", getClass().getName());
DependencyVisitor dependencies = new DependencyVisitor();
pipeline.traverseTopologically(dependencies);
TranslatingVisitor translator = new TranslatingVisitor(session, options, dependencies.results);
pipeline.traverseTopologically(translator);
return new EvaluationContext(translator.leaves, session);
}
/**
* The correspondence of a {@link PCollection} as result of translating a {@link PTransform}
* including additional metadata (such as name and dependents).
*/
private static final class TranslationResult<IntT, T>
implements EvaluationContext.NamedDataset<T> {
private final String name;
private final float complexityFactor;
private float planComplexity = 0;
private @MonotonicNonNull Dataset<WindowedValue<T>> dataset = null;
private @MonotonicNonNull Broadcast<SideInputValues<T>> sideInputBroadcast = null;
private @Nullable UnresolvedTranslation<IntT, T> unresolved = null;
// dependent downstream transforms (if empty this is a leaf)
private final Set<PTransform<?, ?>> dependentTransforms = new HashSet<>();
// upstream dependencies (required inputs)
private final List<TranslationResult<?, ?>> dependencies;
private TranslationResult(
PCollection<?> pCol, float complexityFactor, List<TranslationResult<?, ?>> dependencies) {
this.name = pCol.getName();
this.complexityFactor = complexityFactor;
this.dependencies = dependencies;
}
@Override
public String name() {
return name;
}
@Override
public @Nullable Dataset<WindowedValue<T>> dataset() {
return dataset;
}
private boolean isLeaf() {
return dependentTransforms.isEmpty();
}
private int usages() {
return dependentTransforms.size();
}
private void resetPlanComplexity() {
planComplexity = 1;
}
/** Estimate complexity of query plan by multiplying complexities of all dependencies. */
private float estimatePlanComplexity() {
if (planComplexity > 0) {
return planComplexity;
}
float complexity = 1 + complexityFactor;
for (TranslationResult<?, ?> result : dependencies) {
complexity *= result.estimatePlanComplexity();
}
return (planComplexity = complexity);
}
}
/**
* Unresolved translation, allowing to optimize the generated Spark DAG.
*
* <p>An unresolved translation can, in certain cases, be fused together with the following
* transforms. Currently this is only the case for ParDos with linear lineage.
*/
public interface UnresolvedTranslation<InT, T> {
PCollection<InT> getInput();
<T2> UnresolvedTranslation<InT, T2> fuse(UnresolvedTranslation<T, T2> next);
Dataset<WindowedValue<T>> resolve(
Supplier<PipelineOptions> options, Dataset<WindowedValue<InT>> input);
}
/** Shared, mutable state during the translation of a pipeline, discarded afterwards. */
public interface TranslationState extends EncoderProvider {
<T> Dataset<WindowedValue<T>> getDataset(PCollection<T> pCollection);
boolean isLeaf(PCollection<?> pCollection);
<InT, OutT> void putUnresolved(
PCollection<OutT> out, UnresolvedTranslation<InT, OutT> unresolved);
<T> void putDataset(
PCollection<T> pCollection, Dataset<WindowedValue<T>> dataset, boolean cache);
default <T> void putDataset(PCollection<T> pCollection, Dataset<WindowedValue<T>> dataset) {
putDataset(pCollection, dataset, true);
}
<T> Broadcast<SideInputValues<T>> getSideInputBroadcast(
PCollection<T> pCollection, SideInputValues.Loader<T> loader);
Supplier<PipelineOptions> getOptionsSupplier();
PipelineOptions getOptions();
SparkSession getSparkSession();
}
/**
* {@link PTransformVisitor} that translates supported {@link PTransform PTransforms} into their
* Spark correspondence.
*
* <p>Note, in some cases this involves the early evaluation of some parts of the pipeline. For
* example, in order to use a side-input {@link org.apache.beam.sdk.values.PCollectionView
* PCollectionView} in a translation the corresponding Spark {@link
* org.apache.beam.runners.spark.translation.Dataset Dataset} might have to be collected and
* broadcasted.
*/
private class TranslatingVisitor extends PTransformVisitor implements TranslationState {
private final Map<PCollection<?>, TranslationResult<?, ?>> translationResults;
private final Map<Coder<?>, Encoder<?>> encoders;
private final SparkSession sparkSession;
private final PipelineOptions options;
private final Supplier<PipelineOptions> optionsSupplier;
private final StorageLevel storageLevel;
private final Set<TranslationResult<?, ?>> leaves;
public TranslatingVisitor(
SparkSession sparkSession,
SparkCommonPipelineOptions options,
Map<PCollection<?>, TranslationResult<?, ?>> translationResults) {
this.sparkSession = sparkSession;
this.translationResults = translationResults;
this.options = options;
this.optionsSupplier = new BroadcastOptions(sparkSession, options);
this.storageLevel = StorageLevel.fromString(options.getStorageLevel());
this.encoders = new HashMap<>();
this.leaves = new HashSet<>();
}
@Override
<InT extends PInput, OutT extends POutput> void visit(
Node node,
PTransform<InT, OutT> transform,
TransformTranslator<InT, OutT, PTransform<InT, OutT>> translator) {
AppliedPTransform<InT, OutT, PTransform<InT, OutT>> appliedTransform =
(AppliedPTransform) node.toAppliedPTransform(getPipeline());
try {
LOG.info(
"Translating {}: {}",
node.isCompositeNode() ? "composite" : "primitive",
node.getFullName());
translator.translate(transform, appliedTransform, this);
} catch (IOException e) {
throw new RuntimeException(e);
}
}
@Override
public <T> Encoder<T> encoderOf(Coder<T> coder, Factory<T> factory) {
// computeIfAbsent fails with Java 11 on recursive factory
Encoder<T> enc = (Encoder<T>) encoders.get(coder);
if (enc == null) {
enc = factory.apply(coder);
encoders.put(coder, enc);
}
return enc;
}
private <IntT, T> TranslationResult<IntT, T> getResult(PCollection<T> pCollection) {
return (TranslationResult<IntT, T>) checkStateNotNull(translationResults.get(pCollection));
}
@Override
public <T> Dataset<WindowedValue<T>> getDataset(PCollection<T> pCollection) {
return getOrResolve(getResult(pCollection));
}
@Override
public <T> void putDataset(
PCollection<T> pCollection, Dataset<WindowedValue<T>> dataset, boolean cache) {
TranslationResult<?, T> result = getResult(pCollection);
result.dataset = dataset;
if (cache && result.usages() > 1) {
LOG.info("Dataset {} will be cached for reuse.", result.name);
dataset.persist(storageLevel); // use NONE to disable
}
if (result.estimatePlanComplexity() > PLAN_COMPLEXITY_THRESHOLD) {
// Break lineage of the dataset to limit planning overhead for complex query plans.
LOG.info("Breaking lineage of dataset {} to limit complexity of query plan.", result.name);
result.dataset = sparkSession.createDataset(dataset.rdd(), dataset.encoder());
result.resetPlanComplexity();
}
if (result.isLeaf()) {
leaves.add(result);
}
}
private <InT, T> Dataset<WindowedValue<T>> getOrResolve(TranslationResult<InT, T> result) {
UnresolvedTranslation<InT, T> unresolved = result.unresolved;
if (unresolved != null) {
result.dataset = unresolved.resolve(optionsSupplier, getDataset(unresolved.getInput()));
result.unresolved = null;
}
return checkStateNotNull(result.dataset);
}
@Override
public <InT, T> void putUnresolved(
PCollection<T> out, UnresolvedTranslation<InT, T> unresolved) {
// For simplicity, pretend InT is the same
TranslationResult<InT, InT> translIn = getResult(unresolved.getInput());
TranslationResult<InT, T> translOut = getResult(out);
// Fuse with previous unresolved translation if necessary
UnresolvedTranslation<InT, InT> unresolvedIn = translIn.unresolved;
translOut.unresolved = unresolvedIn != null ? unresolvedIn.fuse(unresolved) : unresolved;
translIn.unresolved = null;
// Resolve dataset immediately in case of leaf or when there are multiple downstreams
if (translOut.usages() != 1) {
putDataset(out, getOrResolve(translOut));
}
}
@Override
public boolean isLeaf(PCollection<?> pCollection) {
return getResult(pCollection).isLeaf();
}
@Override
public <T> Broadcast<SideInputValues<T>> getSideInputBroadcast(
PCollection<T> pCollection, SideInputValues.Loader<T> loader) {
TranslationResult<?, T> result = getResult(pCollection);
if (result.sideInputBroadcast == null) {
SideInputValues<T> sideInputValues = loader.apply(getOrResolve(result));
result.sideInputBroadcast = broadcast(sparkSession, sideInputValues);
}
return result.sideInputBroadcast;
}
@Override
public Supplier<PipelineOptions> getOptionsSupplier() {
return optionsSupplier;
}
@Override
public PipelineOptions getOptions() {
return options;
}
@Override
public SparkSession getSparkSession() {
return sparkSession;
}
}
/**
* Supplier wrapping broadcasted {@link PipelineOptions} to avoid repeatedly serializing those as
* part of the task closures.
*/
private static class BroadcastOptions implements Supplier<PipelineOptions>, Serializable {
private final Broadcast<SerializablePipelineOptions> broadcast;
private BroadcastOptions(SparkSession session, PipelineOptions options) {
this.broadcast = broadcast(session, new SerializablePipelineOptions(options));
}
@Override
public PipelineOptions get() {
return broadcast.value().get();
}
}
private static <T> Broadcast<T> broadcast(SparkSession session, T t) {
return session.sparkContext().broadcast(t, (ClassTag) ClassTag.AnyRef());
}
/**
* {@link PTransformVisitor} that analyses dependencies of supported {@link PTransform
* PTransforms} to help identify cache candidates.
*
* <p>The visitor may throw if a {@link PTransform} is observed that uses unsupported features.
*/
private class DependencyVisitor extends PTransformVisitor {
private final Map<PCollection<?>, TranslationResult<?, ?>> results = new HashMap<>();
@Override
<InT extends PInput, OutT extends POutput> void visit(
Node node,
PTransform<InT, OutT> transform,
TransformTranslator<InT, OutT, PTransform<InT, OutT>> translator) {
// Track `transform` as downstream dependency of every input and reversely
// every input is a dependency of each output of `transform`.
List<TranslationResult<?, ?>> dependencies = new ArrayList<>(node.getInputs().size());
for (Map.Entry<TupleTag<?>, PCollection<?>> entry : node.getInputs().entrySet()) {
TranslationResult<?, ?> input = checkStateNotNull(results.get(entry.getValue()));
dependencies.add(input);
input.dependentTransforms.add(transform);
}
// add new translation result for every output of `transform`
for (PCollection<?> pOut : node.getOutputs().values()) {
results.put(pOut, new TranslationResult<>(pOut, translator.complexityFactor, dependencies));
}
}
}
/**
* An abstract {@link PipelineVisitor} that visits all translatable {@link PTransform} pipeline
* nodes of a pipeline with the respective {@link TransformTranslator}.
*
* <p>The visitor may throw if a {@link PTransform} is observed that uses unsupported features.
*/
private abstract class PTransformVisitor extends PipelineVisitor.Defaults {
/** Visit the {@link PTransform} with its respective {@link TransformTranslator}. */
abstract <InT extends PInput, OutT extends POutput> void visit(
Node node,
PTransform<InT, OutT> transform,
TransformTranslator<InT, OutT, PTransform<InT, OutT>> translator);
@Override
public final CompositeBehavior enterCompositeTransform(Node node) {
PTransform<PInput, POutput> transform = (PTransform<PInput, POutput>) node.getTransform();
TransformTranslator<PInput, POutput, PTransform<PInput, POutput>> translator =
getSupportedTranslator(transform);
if (transform != null && translator != null) {
visit(node, transform, translator);
return DO_NOT_ENTER_TRANSFORM;
} else {
return ENTER_TRANSFORM;
}
}
@Override
public final void visitPrimitiveTransform(Node node) {
PTransform<PInput, POutput> transform = (PTransform<PInput, POutput>) node.getTransform();
if (transform == null || transform.getClass().equals(View.CreatePCollectionView.class)) {
return; // ignore, nothing to be translated here, views are handled on the consumer side
}
TransformTranslator<PInput, POutput, PTransform<PInput, POutput>> translator =
getSupportedTranslator(transform);
if (translator == null) {
String urn = PTransformTranslation.urnForTransform(transform);
throw new UnsupportedOperationException("Transform " + urn + " is not supported.");
}
visit(node, transform, translator);
}
/** {@link TransformTranslator} for {@link PTransform} if translation is known and supported. */
private @Nullable TransformTranslator<PInput, POutput, PTransform<PInput, POutput>>
getSupportedTranslator(@Nullable PTransform<PInput, POutput> transform) {
if (transform == null) {
return null;
}
TransformTranslator<PInput, POutput, PTransform<PInput, POutput>> translator =
getTransformTranslator(transform);
return translator != null && translator.canTranslate(transform) ? translator : null;
}
}
/**
* Traverse the pipeline to check for unbounded {@link PCollection PCollections} that would
* require streaming mode unless streaming mode is already enabled.
*/
private static class StreamingModeDetector extends PipelineVisitor.Defaults {
private boolean streaming;
StreamingModeDetector(boolean streaming) {
this.streaming = streaming;
}
@Override
public CompositeBehavior enterCompositeTransform(Node node) {
return streaming ? DO_NOT_ENTER_TRANSFORM : ENTER_TRANSFORM; // stop if in streaming mode
}
@Override
public void visitValue(PValue value, Node producer) {
if (value instanceof PCollection && ((PCollection) value).isBounded() == UNBOUNDED) {
LOG.info("Found unbounded PCollection {}, switching to streaming mode.", value.getName());
streaming = true;
}
}
}
}
```
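The comment in `encoderOf` above notes that `Map.computeIfAbsent` fails (on Java 11) when the factory recursively registers encoders for component coders, because the map is modified in the middle of the call. A minimal C++ sketch of the same lookup-then-insert memoization pattern (hypothetical names, not Beam code):

```cpp
#include <functional>
#include <map>
#include <string>

// Hypothetical memoizing registry mirroring encoderOf above: look up first,
// and only insert after the factory returns, so the factory may safely call
// encoderOf() again for nested component types without corrupting the map.
std::map<std::string, int> g_cache;

int encoderOf(const std::string& coder,
              const std::function<int(const std::string&)>& factory) {
  auto it = g_cache.find(coder);
  if (it != g_cache.end())
    return it->second;               // already memoized
  int encoder = factory(coder);      // may recurse into encoderOf()
  g_cache.emplace(coder, encoder);   // insert only after recursion finished
  return encoder;
}
```

An explicit get/put sequence tolerates reentrant factories at the cost of recomputing on a lost race; for the single-threaded translation pass above that trade-off is free.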
|
```c++
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
// Original code copyright 2014 Foxit Software Inc.
#include "public/fpdf_transformpage.h"
#include <memory>
#include <sstream>
#include <vector>
#include "constants/page_object.h"
#include "core/fpdfapi/page/cpdf_clippath.h"
#include "core/fpdfapi/page/cpdf_page.h"
#include "core/fpdfapi/page/cpdf_pageobject.h"
#include "core/fpdfapi/page/cpdf_path.h"
#include "core/fpdfapi/parser/cpdf_array.h"
#include "core/fpdfapi/parser/cpdf_dictionary.h"
#include "core/fpdfapi/parser/cpdf_document.h"
#include "core/fpdfapi/parser/cpdf_number.h"
#include "core/fpdfapi/parser/cpdf_reference.h"
#include "core/fpdfapi/parser/cpdf_stream.h"
#include "core/fxge/cfx_pathdata.h"
#include "fpdfsdk/cpdfsdk_helpers.h"
#include "third_party/base/ptr_util.h"
namespace {
void SetBoundingBox(CPDF_Page* page,
const ByteString& key,
const CFX_FloatRect& rect) {
if (!page)
return;
page->GetDict()->SetRectFor(key, rect);
page->UpdateDimensions();
}
bool GetBoundingBox(CPDF_Page* page,
const ByteString& key,
float* left,
float* bottom,
float* right,
float* top) {
if (!page || !left || !bottom || !right || !top)
return false;
CPDF_Array* pArray = page->GetDict()->GetArrayFor(key);
if (!pArray)
return false;
*left = pArray->GetFloatAt(0);
*bottom = pArray->GetFloatAt(1);
*right = pArray->GetFloatAt(2);
*top = pArray->GetFloatAt(3);
return true;
}
CPDF_Object* GetPageContent(CPDF_Dictionary* pPageDict) {
return pPageDict
? pPageDict->GetDirectObjectFor(pdfium::page_object::kContents)
: nullptr;
}
} // namespace
FPDF_EXPORT void FPDF_CALLCONV FPDFPage_SetMediaBox(FPDF_PAGE page,
float left,
float bottom,
float right,
float top) {
SetBoundingBox(CPDFPageFromFPDFPage(page), pdfium::page_object::kMediaBox,
CFX_FloatRect(left, bottom, right, top));
}
FPDF_EXPORT void FPDF_CALLCONV FPDFPage_SetCropBox(FPDF_PAGE page,
float left,
float bottom,
float right,
float top) {
SetBoundingBox(CPDFPageFromFPDFPage(page), pdfium::page_object::kCropBox,
CFX_FloatRect(left, bottom, right, top));
}
FPDF_EXPORT void FPDF_CALLCONV FPDFPage_SetBleedBox(FPDF_PAGE page,
float left,
float bottom,
float right,
float top) {
SetBoundingBox(CPDFPageFromFPDFPage(page), pdfium::page_object::kBleedBox,
CFX_FloatRect(left, bottom, right, top));
}
FPDF_EXPORT void FPDF_CALLCONV FPDFPage_SetTrimBox(FPDF_PAGE page,
float left,
float bottom,
float right,
float top) {
SetBoundingBox(CPDFPageFromFPDFPage(page), pdfium::page_object::kTrimBox,
CFX_FloatRect(left, bottom, right, top));
}
FPDF_EXPORT void FPDF_CALLCONV FPDFPage_SetArtBox(FPDF_PAGE page,
float left,
float bottom,
float right,
float top) {
SetBoundingBox(CPDFPageFromFPDFPage(page), pdfium::page_object::kArtBox,
CFX_FloatRect(left, bottom, right, top));
}
FPDF_EXPORT FPDF_BOOL FPDF_CALLCONV FPDFPage_GetMediaBox(FPDF_PAGE page,
float* left,
float* bottom,
float* right,
float* top) {
return GetBoundingBox(CPDFPageFromFPDFPage(page),
pdfium::page_object::kMediaBox, left, bottom, right,
top);
}
FPDF_EXPORT FPDF_BOOL FPDF_CALLCONV FPDFPage_GetCropBox(FPDF_PAGE page,
float* left,
float* bottom,
float* right,
float* top) {
return GetBoundingBox(CPDFPageFromFPDFPage(page),
pdfium::page_object::kCropBox, left, bottom, right,
top);
}
FPDF_EXPORT FPDF_BOOL FPDF_CALLCONV FPDFPage_GetBleedBox(FPDF_PAGE page,
float* left,
float* bottom,
float* right,
float* top) {
return GetBoundingBox(CPDFPageFromFPDFPage(page),
pdfium::page_object::kBleedBox, left, bottom, right,
top);
}
FPDF_EXPORT FPDF_BOOL FPDF_CALLCONV FPDFPage_GetTrimBox(FPDF_PAGE page,
float* left,
float* bottom,
float* right,
float* top) {
return GetBoundingBox(CPDFPageFromFPDFPage(page),
pdfium::page_object::kTrimBox, left, bottom, right,
top);
}
FPDF_EXPORT FPDF_BOOL FPDF_CALLCONV FPDFPage_GetArtBox(FPDF_PAGE page,
float* left,
float* bottom,
float* right,
float* top) {
return GetBoundingBox(CPDFPageFromFPDFPage(page),
pdfium::page_object::kArtBox, left, bottom, right, top);
}
FPDF_EXPORT FPDF_BOOL FPDF_CALLCONV
FPDFPage_TransFormWithClip(FPDF_PAGE page,
const FS_MATRIX* matrix,
const FS_RECTF* clipRect) {
if (!matrix && !clipRect)
return false;
CPDF_Page* pPage = CPDFPageFromFPDFPage(page);
if (!pPage)
return false;
std::ostringstream textBuf;
textBuf << "q ";
if (clipRect) {
CFX_FloatRect rect = CFXFloatRectFromFSRECTF(*clipRect);
rect.Normalize();
textBuf << ByteString::Format("%f %f %f %f re W* n ", rect.left,
rect.bottom, rect.Width(), rect.Height());
}
if (matrix) {
textBuf << ByteString::Format("%f %f %f %f %f %f cm ", matrix->a, matrix->b,
matrix->c, matrix->d, matrix->e, matrix->f);
}
CPDF_Dictionary* pPageDict = pPage->GetDict();
CPDF_Object* pContentObj = GetPageContent(pPageDict);
if (!pContentObj)
return false;
CPDF_Document* pDoc = pPage->GetDocument();
if (!pDoc)
return false;
CPDF_Stream* pStream =
pDoc->NewIndirect<CPDF_Stream>(nullptr, 0, pDoc->New<CPDF_Dictionary>());
pStream->SetDataFromStringstream(&textBuf);
CPDF_Stream* pEndStream =
pDoc->NewIndirect<CPDF_Stream>(nullptr, 0, pDoc->New<CPDF_Dictionary>());
pEndStream->SetData(ByteStringView(" Q").span());
if (CPDF_Array* pContentArray = ToArray(pContentObj)) {
pContentArray->InsertAt(0, pStream->MakeReference(pDoc));
pContentArray->Add(pEndStream->MakeReference(pDoc));
} else if (pContentObj->IsStream() && !pContentObj->IsInline()) {
pContentArray = pDoc->NewIndirect<CPDF_Array>();
pContentArray->Add(pStream->MakeReference(pDoc));
pContentArray->Add(pContentObj->MakeReference(pDoc));
pContentArray->Add(pEndStream->MakeReference(pDoc));
pPageDict->SetFor(pdfium::page_object::kContents,
pContentArray->MakeReference(pDoc));
}
// Need to transform the patterns as well.
CPDF_Dictionary* pRes =
pPageDict->GetDictFor(pdfium::page_object::kResources);
if (!pRes)
return true;
CPDF_Dictionary* pPatternDict = pRes->GetDictFor("Pattern");
if (!pPatternDict)
return true;
CPDF_DictionaryLocker locker(pPatternDict);
for (const auto& it : locker) {
CPDF_Object* pObj = it.second.get();
if (pObj->IsReference())
pObj = pObj->GetDirect();
CPDF_Dictionary* pDict = nullptr;
if (pObj->IsDictionary())
pDict = pObj->AsDictionary();
else if (CPDF_Stream* pObjStream = pObj->AsStream())
pDict = pObjStream->GetDict();
else
continue;
if (matrix) {
CFX_Matrix m = CFXMatrixFromFSMatrix(*matrix);
pDict->SetMatrixFor("Matrix", pDict->GetMatrixFor("Matrix") * m);
}
}
return true;
}
FPDF_EXPORT void FPDF_CALLCONV
FPDFPageObj_TransformClipPath(FPDF_PAGEOBJECT page_object,
double a,
double b,
double c,
double d,
double e,
double f) {
CPDF_PageObject* pPageObj = CPDFPageObjectFromFPDFPageObject(page_object);
if (!pPageObj)
return;
CFX_Matrix matrix((float)a, (float)b, (float)c, (float)d, (float)e, (float)f);
// Special treatment for shading objects, because the ClipPath for a shading
// object is already transformed.
if (!pPageObj->IsShading())
pPageObj->TransformClipPath(matrix);
pPageObj->TransformGeneralState(matrix);
}
FPDF_EXPORT FPDF_CLIPPATH FPDF_CALLCONV FPDF_CreateClipPath(float left,
float bottom,
float right,
float top) {
CPDF_Path Path;
Path.AppendRect(left, bottom, right, top);
auto pNewClipPath = pdfium::MakeUnique<CPDF_ClipPath>();
pNewClipPath->AppendPath(Path, FXFILL_ALTERNATE, false);
// Caller takes ownership.
return FPDFClipPathFromCPDFClipPath(pNewClipPath.release());
}
FPDF_EXPORT void FPDF_CALLCONV FPDF_DestroyClipPath(FPDF_CLIPPATH clipPath) {
// Take ownership back from caller and destroy.
std::unique_ptr<CPDF_ClipPath>(CPDFClipPathFromFPDFClipPath(clipPath));
}
void OutputPath(std::ostringstream& buf, CPDF_Path path) {
const CFX_PathData* pPathData = path.GetObject();
if (!pPathData)
return;
const std::vector<FX_PATHPOINT>& pPoints = pPathData->GetPoints();
if (path.IsRect()) {
CFX_PointF diff = pPoints[2].m_Point - pPoints[0].m_Point;
buf << pPoints[0].m_Point.x << " " << pPoints[0].m_Point.y << " " << diff.x
<< " " << diff.y << " re\n";
return;
}
for (size_t i = 0; i < pPoints.size(); i++) {
buf << pPoints[i].m_Point.x << " " << pPoints[i].m_Point.y;
FXPT_TYPE point_type = pPoints[i].m_Type;
if (point_type == FXPT_TYPE::MoveTo) {
buf << " m\n";
} else if (point_type == FXPT_TYPE::BezierTo) {
buf << " " << pPoints[i + 1].m_Point.x << " " << pPoints[i + 1].m_Point.y
<< " " << pPoints[i + 2].m_Point.x << " " << pPoints[i + 2].m_Point.y;
buf << " c";
if (pPoints[i + 2].m_CloseFigure)
buf << " h";
buf << "\n";
i += 2;
} else if (point_type == FXPT_TYPE::LineTo) {
buf << " l";
if (pPoints[i].m_CloseFigure)
buf << " h";
buf << "\n";
}
}
}
FPDF_EXPORT void FPDF_CALLCONV FPDFPage_InsertClipPath(FPDF_PAGE page,
FPDF_CLIPPATH clipPath) {
CPDF_Page* pPage = CPDFPageFromFPDFPage(page);
if (!pPage)
return;
CPDF_Dictionary* pPageDict = pPage->GetDict();
CPDF_Object* pContentObj = GetPageContent(pPageDict);
if (!pContentObj)
return;
std::ostringstream strClip;
CPDF_ClipPath* pClipPath = CPDFClipPathFromFPDFClipPath(clipPath);
for (size_t i = 0; i < pClipPath->GetPathCount(); ++i) {
CPDF_Path path = pClipPath->GetPath(i);
if (path.GetPoints().empty()) {
// Empty clipping (totally clipped out)
strClip << "0 0 m W n ";
} else {
OutputPath(strClip, path);
if (pClipPath->GetClipType(i) == FXFILL_WINDING)
strClip << "W n\n";
else
strClip << "W* n\n";
}
}
CPDF_Document* pDoc = pPage->GetDocument();
if (!pDoc)
return;
CPDF_Stream* pStream =
pDoc->NewIndirect<CPDF_Stream>(nullptr, 0, pDoc->New<CPDF_Dictionary>());
pStream->SetDataFromStringstream(&strClip);
if (CPDF_Array* pArray = ToArray(pContentObj)) {
pArray->InsertAt(0, pStream->MakeReference(pDoc));
} else if (pContentObj->IsStream() && !pContentObj->IsInline()) {
CPDF_Array* pContentArray = pDoc->NewIndirect<CPDF_Array>();
pContentArray->Add(pStream->MakeReference(pDoc));
pContentArray->Add(pContentObj->MakeReference(pDoc));
pPageDict->SetFor(pdfium::page_object::kContents,
pContentArray->MakeReference(pDoc));
}
}
```
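`FPDFPage_TransFormWithClip` above works by prepending a `q … re W* n … cm` prefix stream to the page's content and appending a matching ` Q` stream. A standalone sketch of that operator prefix (plain C++, no PDFium types; the `%f`-style formatting of the real `ByteString::Format` is simplified here to default stream formatting):

```cpp
#include <sstream>
#include <string>

// Builds the save-state / clip / transform prefix written before the original
// content stream: "q" saves graphics state, "re" takes x, y, width, height,
// "W* n" installs that rect as an even-odd clip path without painting it, and
// "cm" concatenates the matrix operands a b c d e f.
std::string TransformPrefix(float left, float bottom, float right, float top,
                            float a, float b, float c, float d,
                            float e, float f) {
  std::ostringstream buf;
  buf << "q ";
  buf << left << " " << bottom << " " << (right - left) << " "
      << (top - bottom) << " re W* n ";
  buf << a << " " << b << " " << c << " " << d << " " << e << " " << f
      << " cm ";
  return buf.str();
}
```

The ` Q` stream appended after the original content restores the saved graphics state, so the clip and matrix apply only to the page's own operators, not to anything drawn later.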
|
```php
<?php
/*
 * Licensed under the Apache License, Version 2.0 (the "License"); you may not
 * use this file except in compliance with the License. You may obtain a copy of
 * the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
 * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
 * License for the specific language governing permissions and limitations under
 * the License.
 */
namespace Google\Service\CloudDeploy;
class SkaffoldModules extends \Google\Collection
{
protected $collection_key = 'configs';
/**
* @var string[]
*/
public $configs;
protected $gitType = SkaffoldGitSource::class;
protected $gitDataType = '';
protected $googleCloudBuildRepoType = SkaffoldGCBRepoSource::class;
protected $googleCloudBuildRepoDataType = '';
protected $googleCloudStorageType = SkaffoldGCSSource::class;
protected $googleCloudStorageDataType = '';
/**
* @param string[]
*/
public function setConfigs($configs)
{
$this->configs = $configs;
}
/**
* @return string[]
*/
public function getConfigs()
{
return $this->configs;
}
/**
* @param SkaffoldGitSource
*/
public function setGit(SkaffoldGitSource $git)
{
$this->git = $git;
}
/**
* @return SkaffoldGitSource
*/
public function getGit()
{
return $this->git;
}
/**
* @param SkaffoldGCBRepoSource
*/
public function setGoogleCloudBuildRepo(SkaffoldGCBRepoSource $googleCloudBuildRepo)
{
$this->googleCloudBuildRepo = $googleCloudBuildRepo;
}
/**
* @return SkaffoldGCBRepoSource
*/
public function getGoogleCloudBuildRepo()
{
return $this->googleCloudBuildRepo;
}
/**
* @param SkaffoldGCSSource
*/
public function setGoogleCloudStorage(SkaffoldGCSSource $googleCloudStorage)
{
$this->googleCloudStorage = $googleCloudStorage;
}
/**
* @return SkaffoldGCSSource
*/
public function getGoogleCloudStorage()
{
return $this->googleCloudStorage;
}
}
// Adding a class alias for backwards compatibility with the previous class name.
class_alias(SkaffoldModules::class, 'Google_Service_CloudDeploy_SkaffoldModules');
```
|
```c++
#include <Common/thread_local_rng.h>
#include <Common/ThreadPool.h>
#include <Common/PoolId.h>
#include <Parsers/ParserCreateQuery.h>
#include <Parsers/ASTCreateQuery.h>
#include <Parsers/parseQuery.h>
#include <Interpreters/Context.h>
#include <Interpreters/DatabaseCatalog.h>
#include <Interpreters/InterpreterCreateQuery.h>
#include <Interpreters/InterpreterSystemQuery.h>
#include <Interpreters/executeQuery.h>
#include <Interpreters/loadMetadata.h>
#include <Databases/DatabaseOrdinary.h>
#include <Databases/TablesLoader.h>
#include <Storages/StorageMaterializedView.h>
#include <IO/ReadBufferFromFile.h>
#include <IO/ReadHelpers.h>
#include <Common/escapeForFileName.h>
#include <Common/typeid_cast.h>
#include <Common/logger_useful.h>
#include <Common/CurrentMetrics.h>
#include <Core/Settings.h>
#include <filesystem>
#define ORDINARY_TO_ATOMIC_PREFIX ".tmp_convert."
namespace fs = std::filesystem;
namespace DB
{
namespace ErrorCodes
{
extern const int NOT_IMPLEMENTED;
extern const int LOGICAL_ERROR;
}
namespace ActionLocks
{
extern const StorageActionBlockType PartsMerge;
extern const StorageActionBlockType PartsFetch;
extern const StorageActionBlockType PartsSend;
extern const StorageActionBlockType DistributedSend;
}
static void executeCreateQuery(
const String & query,
ContextMutablePtr context,
const String & database,
const String & file_name,
bool create,
bool has_force_restore_data_flag)
{
const Settings & settings = context->getSettingsRef();
ParserCreateQuery parser;
ASTPtr ast = parseQuery(
parser, query.data(), query.data() + query.size(), "in file " + file_name,
0, settings.max_parser_depth, settings.max_parser_backtracks);
auto & ast_create_query = ast->as<ASTCreateQuery &>();
ast_create_query.setDatabase(database);
InterpreterCreateQuery interpreter(ast, context);
interpreter.setInternal(true);
if (!create)
{
interpreter.setForceAttach(true);
interpreter.setForceRestoreData(has_force_restore_data_flag);
}
interpreter.setLoadDatabaseWithoutTables(true);
interpreter.execute();
}
static bool isSystemOrInformationSchema(const String & database_name)
{
return database_name == DatabaseCatalog::SYSTEM_DATABASE ||
database_name == DatabaseCatalog::INFORMATION_SCHEMA ||
database_name == DatabaseCatalog::INFORMATION_SCHEMA_UPPERCASE;
}
static void loadDatabase(
ContextMutablePtr context,
const String & database,
const String & database_path,
bool force_restore_data)
{
String database_attach_query;
String database_metadata_file = database_path + ".sql";
if (fs::exists(fs::path(database_metadata_file)))
{
/// There is .sql file with database creation statement.
ReadBufferFromFile in(database_metadata_file, 1024);
readStringUntilEOF(database_attach_query, in);
}
else
{
/// It's the first server run and we need to create the default and system databases.
/// .sql file with database engine will be written for CREATE query.
database_attach_query = "CREATE DATABASE " + backQuoteIfNeed(database);
}
try
{
executeCreateQuery(database_attach_query, context, database, database_metadata_file, /* create */ true, force_restore_data);
}
catch (Exception & e)
{
e.addMessage(fmt::format("while loading database {} from path {}", backQuote(database), database_path));
throw;
}
}
static void checkUnsupportedVersion(ContextMutablePtr context, const String & database_name)
{
/// Produce better exception message
String metadata_path = context->getPath() + "metadata/" + database_name;
if (fs::exists(fs::path(metadata_path)))
throw Exception(ErrorCodes::NOT_IMPLEMENTED, "Data directory for {} database exists, but metadata file does not. "
"Probably you are trying to upgrade from version older than 20.7. "
"If so, you should upgrade through intermediate version.", database_name);
}
static void checkIncompleteOrdinaryToAtomicConversion(ContextPtr context, const std::map<String, String> & databases)
{
if (context->getConfigRef().has("allow_reserved_database_name_tmp_convert"))
return;
auto convert_flag_path = fs::path(context->getFlagsPath()) / "convert_ordinary_to_atomic";
if (!fs::exists(convert_flag_path))
return;
/// Flag exists. Let's check if we had an unsuccessful conversion attempt previously
for (const auto & db : databases)
{
if (!db.first.starts_with(ORDINARY_TO_ATOMIC_PREFIX))
continue;
size_t last_dot = db.first.rfind('.');
if (last_dot <= strlen(ORDINARY_TO_ATOMIC_PREFIX))
continue;
String actual_name = db.first.substr(strlen(ORDINARY_TO_ATOMIC_PREFIX), last_dot - strlen(ORDINARY_TO_ATOMIC_PREFIX));
throw Exception(ErrorCodes::NOT_IMPLEMENTED, "Found a database with special name: {}. "
"Most likely it indicates that conversion of database {} from Ordinary to Atomic "
"was interrupted or failed in the middle. You can add <allow_reserved_database_name_tmp_convert> to config.xml "
"or remove convert_ordinary_to_atomic file from flags/ directory, so the server will start forcefully. "
"After starting the server, you can finish conversion manually by moving rest of the tables from {} to {} "
"(using RENAME TABLE) and executing DROP DATABASE {} and RENAME DATABASE {} TO {}",
backQuote(db.first), backQuote(actual_name), backQuote(actual_name), backQuote(db.first),
backQuote(actual_name), backQuote(db.first), backQuote(actual_name));
}
}
LoadTaskPtrs loadMetadata(ContextMutablePtr context, const String & default_database_name, bool async_load_databases)
{
LoggerPtr log = getLogger("loadMetadata");
String path = context->getPath() + "metadata";
/// There may exist 'force_restore_data' file, which means skip safety threshold
/// on difference of data parts while initializing tables.
/// This file is immediately deleted i.e. "one-shot".
auto force_restore_data_flag_file = fs::path(context->getFlagsPath()) / "force_restore_data";
bool has_force_restore_data_flag = fs::exists(force_restore_data_flag_file);
if (has_force_restore_data_flag)
{
try
{
fs::remove(force_restore_data_flag_file);
}
catch (...)
{
tryLogCurrentException("Load metadata", "Can't remove force restore file to enable data sanity checks");
}
}
/// Loop over databases.
std::map<String, String> databases;
fs::directory_iterator dir_end;
for (fs::directory_iterator it(path); it != dir_end; ++it)
{
if (it->is_symlink())
continue;
if (it->is_directory())
continue;
const auto current_file = it->path().filename().string();
/// TODO: DETACH DATABASE PERMANENTLY ?
if (fs::path(current_file).extension() == ".sql")
{
String db_name = fs::path(current_file).stem();
if (!isSystemOrInformationSchema(db_name))
databases.emplace(unescapeForFileName(db_name), fs::path(path) / db_name);
}
/// Temporary files may be left from previous server runs.
if (fs::path(current_file).extension() == ".tmp")
{
LOG_WARNING(log, "Removing temporary file {}", it->path().string());
try
{
fs::remove(it->path());
}
catch (...)
{
/// It does not prevent server to startup.
tryLogCurrentException(log);
}
}
}
checkIncompleteOrdinaryToAtomicConversion(context, databases);
/// clickhouse-local creates DatabaseMemory as default database by itself
/// For clickhouse-server we need to create the default database
bool create_default_db_if_not_exists = !default_database_name.empty();
bool metadata_dir_for_default_db_already_exists = databases.contains(default_database_name);
if (create_default_db_if_not_exists && !metadata_dir_for_default_db_already_exists)
{
checkUnsupportedVersion(context, default_database_name);
databases.emplace(default_database_name, std::filesystem::path(path) / escapeForFileName(default_database_name));
}
TablesLoader::Databases loaded_databases;
for (const auto & [name, db_path] : databases)
{
loadDatabase(context, name, db_path, has_force_restore_data_flag);
loaded_databases.insert({name, DatabaseCatalog::instance().getDatabase(name)});
}
auto mode = getLoadingStrictnessLevel(/* attach */ true, /* force_attach */ true, has_force_restore_data_flag, /*secondary*/ false);
TablesLoader loader{context, std::move(loaded_databases), mode};
auto load_tasks = loader.loadTablesAsync();
auto startup_tasks = loader.startupTablesAsync();
if (async_load_databases)
{
LOG_INFO(log, "Start asynchronous loading of databases");
// Schedule all the jobs.
// Note that to achieve behaviour similar to synchronous case (postponing of merges) we use priorities.
// All startup jobs are assigned to pool with lower priority than load jobs pool.
// So all tables will finish loading before the first table startup if there are no queries (or dependencies).
// Query waiting for a table boosts its priority by moving jobs into `TablesLoaderForegroundPoolId` pool
// to finish table startup faster than load of the other tables.
scheduleLoad(load_tasks);
scheduleLoad(startup_tasks);
// Do NOT wait, just return tasks for continuation or later wait.
return joinTasks(load_tasks, startup_tasks);
}
else
{
// NOTE: some tables can still be started up in the "loading" phase if they are required by dependencies during loading of other tables
LOG_INFO(log, "Start synchronous loading of databases");
waitLoad(TablesLoaderForegroundPoolId, load_tasks); // First prioritize, schedule and wait all the load table tasks
LOG_INFO(log, "Start synchronous startup of databases");
waitLoad(TablesLoaderForegroundPoolId, startup_tasks); // Only then prioritize, schedule and wait all the startup tasks
return {};
}
}
static void loadSystemDatabaseImpl(ContextMutablePtr context, const String & database_name, const String & default_engine)
{
String path = context->getPath() + "metadata/" + database_name;
String metadata_file = path + ".sql";
if (fs::exists(metadata_file + ".tmp"))
fs::remove(metadata_file + ".tmp");
if (fs::exists(fs::path(metadata_file)))
{
/// 'has_force_restore_data_flag' is true, to not fail on loading query_log table, if it is corrupted.
loadDatabase(context, database_name, path, true);
}
else
{
checkUnsupportedVersion(context, database_name);
/// Initialize system database manually
String database_create_query = "CREATE DATABASE ";
database_create_query += database_name;
database_create_query += " ENGINE=";
database_create_query += default_engine;
executeCreateQuery(database_create_query, context, database_name, "<no file>", true, true);
}
}
static void convertOrdinaryDatabaseToAtomic(LoggerPtr log, ContextMutablePtr context, const DatabasePtr & database,
const String & name, const String & tmp_name)
{
/// It's kind of C++ script that creates temporary database with Atomic engine,
/// moves all tables to it, drops old database and then renames new one to old name.
String name_quoted = backQuoteIfNeed(name);
String tmp_name_quoted = backQuoteIfNeed(tmp_name);
LOG_INFO(log, "Will convert database {} from Ordinary to Atomic", name_quoted);
String create_database_query = fmt::format("CREATE DATABASE IF NOT EXISTS {}", tmp_name_quoted);
auto res = executeQuery(create_database_query, context, QueryFlags{ .internal = true }).second;
executeTrivialBlockIO(res, context);
res = {};
auto tmp_database = DatabaseCatalog::instance().getDatabase(tmp_name);
assert(tmp_database->getEngineName() == "Atomic");
size_t num_tables = 0;
std::unordered_set<String> inner_mv_tables;
for (auto iterator = database->getTablesIterator(context); iterator->isValid(); iterator->next())
{
++num_tables;
auto id = iterator->table()->getStorageID();
id.database_name = tmp_name;
/// We need some uuid for checkTableCanBeRenamed
id.uuid = UUIDHelpers::generateV4();
iterator->table()->checkTableCanBeRenamed(id);
if (const auto * mv = dynamic_cast<const StorageMaterializedView *>(iterator->table().get()))
{
/// We should not rename inner tables of MVs, because MVs are responsible for renaming them...
if (mv->hasInnerTable())
inner_mv_tables.emplace(mv->getTargetTable()->getStorageID().table_name);
}
}
LOG_INFO(log, "Will move {} tables to {} (including {} inner tables of MVs)", num_tables, tmp_name_quoted, inner_mv_tables.size());
for (auto iterator = database->getTablesIterator(context); iterator->isValid(); iterator->next())
{
auto id = iterator->table()->getStorageID();
if (inner_mv_tables.contains(id.table_name))
{
LOG_DEBUG(log, "Do not rename {}, because it will be renamed together with MV", id.getNameForLogs());
continue;
}
String qualified_quoted_name = id.getFullTableName();
id.database_name = tmp_name;
String tmp_qualified_quoted_name = id.getFullTableName();
String move_table_query = fmt::format("RENAME TABLE {} TO {}", qualified_quoted_name, tmp_qualified_quoted_name);
res = executeQuery(move_table_query, context, QueryFlags{ .internal = true }).second;
executeTrivialBlockIO(res, context);
res = {};
}
LOG_INFO(log, "Moved all tables from {} to {}", name_quoted, tmp_name_quoted);
if (!database->empty())
throw Exception(ErrorCodes::LOGICAL_ERROR, "Database {} is not empty after moving tables", name_quoted);
String drop_query = fmt::format("DROP DATABASE {}", name_quoted);
context->setSetting("force_remove_data_recursively_on_drop", false);
res = executeQuery(drop_query, context, QueryFlags{ .internal = true }).second;
executeTrivialBlockIO(res, context);
res = {};
String rename_query = fmt::format("RENAME DATABASE {} TO {}", tmp_name_quoted, name_quoted);
res = executeQuery(rename_query, context, QueryFlags{ .internal = true }).second;
executeTrivialBlockIO(res, context);
LOG_INFO(log, "Finished database engine conversion of {}", name_quoted);
}
/// Converts database with Ordinary engine to Atomic. Does nothing if database is not Ordinary.
/// Can be called only during server startup when there are no queries from users.
static void maybeConvertOrdinaryDatabaseToAtomic(ContextMutablePtr context, const String & database_name, LoadTaskPtrs * startup_tasks = nullptr)
{
LoggerPtr log = getLogger("loadMetadata");
auto database = DatabaseCatalog::instance().getDatabase(database_name);
if (!database)
{
LOG_WARNING(log, "Database {} not found (while trying to convert it from Ordinary to Atomic)", database_name);
return;
}
if (database->getEngineName() != "Ordinary")
return;
const Strings permanently_detached_tables = database->getNamesOfPermanentlyDetachedTables();
if (!permanently_detached_tables.empty())
{
throw Exception(ErrorCodes::NOT_IMPLEMENTED, "Cannot automatically convert database {} from Ordinary to Atomic, "
"because it contains permanently detached tables ({}) that were not loaded during startup. "
"Attach these tables, so server will load and convert them",
database_name, fmt::join(permanently_detached_tables, ", "));
}
String tmp_name = fmt::format(ORDINARY_TO_ATOMIC_PREFIX"{}.{}", database_name, thread_local_rng());
try
{
if (startup_tasks) // NOTE: only for system database
{
/// It's not quite correct to run DDL queries while database is not started up.
waitLoad(TablesLoaderForegroundPoolId, *startup_tasks);
startup_tasks->clear();
}
auto local_context = Context::createCopy(context);
/// We have to stop background operations that may lock table for share to avoid DEADLOCK_AVOIDED error
/// on moving tables from Ordinary database. Server has not started to accept connections yet,
/// so there are no user queries, only background operations
LOG_INFO(log, "Will stop background operations to be able to rename tables in Ordinary database {}", database_name);
static const auto actions_to_stop = {
ActionLocks::PartsMerge, ActionLocks::PartsFetch, ActionLocks::PartsSend, ActionLocks::DistributedSend
};
for (const auto & action : actions_to_stop)
InterpreterSystemQuery::startStopActionInDatabase(action, /* start */ false, database_name, database, context, log);
local_context->setSetting("check_table_dependencies", false);
local_context->setSetting("check_referential_table_dependencies", false);
convertOrdinaryDatabaseToAtomic(log, local_context, database, database_name, tmp_name);
LOG_INFO(log, "Will start background operations after renaming tables in database {}", database_name);
for (const auto & action : actions_to_stop)
InterpreterSystemQuery::startStopActionInDatabase(action, /* start */ true, database_name, database, context, log);
auto new_database = DatabaseCatalog::instance().getDatabase(database_name);
UUID db_uuid = new_database->getUUID();
std::vector<UUID> tables_uuids;
for (auto iterator = new_database->getTablesIterator(context); iterator->isValid(); iterator->next())
tables_uuids.push_back(iterator->uuid());
/// Reload database just in case (and update logger name)
String detach_query = fmt::format("DETACH DATABASE {}", backQuoteIfNeed(database_name));
auto res = executeQuery(detach_query, context, QueryFlags{ .internal = true }).second;
executeTrivialBlockIO(res, context);
res = {};
/// Unlock UUID mapping, because it will be locked again on database reload.
/// It's safe to do during metadata loading, because cleanup task is not started yet.
DatabaseCatalog::instance().removeUUIDMappingFinally(db_uuid);
for (const auto & uuid : tables_uuids)
DatabaseCatalog::instance().removeUUIDMappingFinally(uuid);
String path = context->getPath() + "metadata/" + escapeForFileName(database_name);
/// force_restore_data is needed to re-create metadata symlinks
loadDatabase(context, database_name, path, /* force_restore_data */ true);
TablesLoader::Databases databases =
{
{database_name, DatabaseCatalog::instance().getDatabase(database_name)},
};
TablesLoader loader{context, databases, LoadingStrictnessLevel::FORCE_RESTORE};
waitLoad(TablesLoaderForegroundPoolId, loader.loadTablesAsync());
/// Startup tables if they were started before conversion and detach/attach
if (startup_tasks) // NOTE: only for system database
*startup_tasks = loader.startupTablesAsync(); // We have loaded old database(s), replace tasks to startup new database
else
// An old database was already loaded, so we should load new one as well
waitLoad(TablesLoaderForegroundPoolId, loader.startupTablesAsync());
}
catch (Exception & e)
{
e.addMessage("Exception while trying to convert database {} from Ordinary to Atomic. It may be in some intermediate state."
" You can finish conversion manually by moving the rest tables from {} to {} (using RENAME TABLE)"
" and executing DROP DATABASE {} and RENAME DATABASE {} TO {}.",
database_name, database_name, tmp_name, database_name, tmp_name, database_name);
throw;
}
}
void maybeConvertSystemDatabase(ContextMutablePtr context, LoadTaskPtrs & system_startup_tasks)
{
/// TODO remove this check, convert system database unconditionally
if (context->getSettingsRef().allow_deprecated_database_ordinary)
return;
maybeConvertOrdinaryDatabaseToAtomic(context, DatabaseCatalog::SYSTEM_DATABASE, &system_startup_tasks);
}
void convertDatabasesEnginesIfNeed(const LoadTaskPtrs & load_metadata, ContextMutablePtr context)
{
auto convert_flag_path = fs::path(context->getFlagsPath()) / "convert_ordinary_to_atomic";
if (!fs::exists(convert_flag_path))
return;
LOG_INFO(getLogger("loadMetadata"), "Found convert_ordinary_to_atomic file in flags directory, "
"will try to convert all Ordinary databases to Atomic");
// Wait for all table to be loaded and started
waitLoad(TablesLoaderForegroundPoolId, load_metadata);
for (const auto & [name, _] : DatabaseCatalog::instance().getDatabases())
if (name != DatabaseCatalog::SYSTEM_DATABASE)
maybeConvertOrdinaryDatabaseToAtomic(context, name);
LOG_INFO(getLogger("loadMetadata"), "Conversion finished, removing convert_ordinary_to_atomic flag");
fs::remove(convert_flag_path);
}
LoadTaskPtrs loadMetadataSystem(ContextMutablePtr context)
{
loadSystemDatabaseImpl(context, DatabaseCatalog::SYSTEM_DATABASE, "Atomic");
loadSystemDatabaseImpl(context, DatabaseCatalog::INFORMATION_SCHEMA, "Memory");
loadSystemDatabaseImpl(context, DatabaseCatalog::INFORMATION_SCHEMA_UPPERCASE, "Memory");
TablesLoader::Databases databases =
{
{DatabaseCatalog::SYSTEM_DATABASE, DatabaseCatalog::instance().getSystemDatabase()},
{DatabaseCatalog::INFORMATION_SCHEMA, DatabaseCatalog::instance().getDatabase(DatabaseCatalog::INFORMATION_SCHEMA)},
{DatabaseCatalog::INFORMATION_SCHEMA_UPPERCASE, DatabaseCatalog::instance().getDatabase(DatabaseCatalog::INFORMATION_SCHEMA_UPPERCASE)},
};
TablesLoader loader{context, databases, LoadingStrictnessLevel::FORCE_RESTORE};
auto tasks = loader.loadTablesAsync();
waitLoad(TablesLoaderForegroundPoolId, tasks);
/// Will startup tables in system database after all databases are loaded.
return loader.startupTablesAsync();
}
}
```
|
```yaml
intents:
- greet
- default
- goodbye
slots:
cuisine:
type: text
location:
type: text
entities:
- name
responses:
utter_greet:
- super: cool
utter_goodbye:
- text: goodbye :(
utter_default:
- text: default message
actions:
- utter_default
- utter_greet
- utter_goodbye
```
|
```shell
#!/bin/bash
# This Source Code Form is subject to the terms of the Mozilla Public
# License, v. 2.0. If a copy of the MPL was not distributed with this
# file, You can obtain one at path_to_url
# Purpose: This script is needed to start the client and service with
# one command. This is necessary as ctest - which is used to run the
# tests - isn't able to start two binaries for one testcase. Therefore
# the testcase simply executes this script. This script then runs client
# and service and checks that both exit successfully.
FAIL=0
# Parameter 1: the pid to check
# Parameter 2: number of TCP/UDP sockets the process should have open
check_tcp_udp_sockets_are_open ()
{
# Check that the passed pid/process does listen on at least one TCP/UDP socket
# awk is used to avoid the case when an inode number is the same as a PID. The awk
# program filters the netstat output down to the protocol (1st field) and
# the PID/Program name (last field) fields.
SERVICE_SOCKETS_LISTENING=$(netstat -tulpen 2> /dev/null | awk '{print $1 "\t" $NF}' | grep "$1" | wc -l)
if [ "$SERVICE_SOCKETS_LISTENING" -lt "$2" ]
then
((FAIL+=1))
fi
}
# Parameter 1: the pid to check
check_tcp_udp_sockets_are_closed ()
{
# Check that the passed pid/process does not listen on any TCP/UDP socket
# or has any active connection via a TCP/UDP socket
# awk is used to avoid the case when an inode number is the same as a PID. The awk
# program filters the netstat output down to the protocol (1st field) and
# the PID/Program name (last field) fields.
SERVICE_SOCKETS_LISTENING=$(netstat -tulpen 2> /dev/null | awk '{print $1 "\t" $NF}' | grep "$1" | wc -l)
if [ "$SERVICE_SOCKETS_LISTENING" -ne 0 ]
then
((FAIL+=1))
fi
SERVICE_SOCKETS_CONNECTED=$(netstat -tupen 2> /dev/null | awk '{print $1 "\t" $NF}' | grep "$1" | wc -l)
if [ "$SERVICE_SOCKETS_CONNECTED" -ne 0 ]
then
((FAIL+=1))
fi
}
# Start the service for payload test with UDP
export VSOMEIP_APPLICATION_NAME=external_local_payload_test_service
export VSOMEIP_CONFIGURATION=external_local_payload_test_service.json
./payload_test_service --udp &
SERVICE_PID=$!
# Display a message to show the user that they must now call the external client
# to finish the test successfully
if [ ! -z "$USE_LXC_TEST" ]; then
echo "starting external local payload on slave LXC"
ssh -tt -i $SANDBOX_ROOT_DIR/commonapi_main/lxc-config/.ssh/mgc_lxc/rsa_key_file.pub -o StrictHostKeyChecking=no root@$LXC_TEST_SLAVE_IP "bash -ci \"set -m; cd \\\$SANDBOX_TARGET_DIR/vsomeip_lib/test/network_tests; ./external_local_payload_test_client_external_start.sh\"" &
echo "remote ssh job id: $!"
elif [ ! -z "$USE_DOCKER" ]; then
docker exec $DOCKER_IMAGE sh -c "cd $DOCKER_TESTS && ./external_local_payload_test_client_external_start.sh" &
else
cat <<End-of-message
*******************************************************************************
*******************************************************************************
** Please now run:
** external_local_payload_test_client_external_start.sh
** from an external host to successfully complete this test.
**
** You probably will need to adapt the 'unicast' settings in
** external_local_payload_test_client_external.json and
** external_local_payload_test_service.json to your personal setup.
*******************************************************************************
*******************************************************************************
End-of-message
fi
# The service should listen on a TCP and a UDP socket now
sleep 1
check_tcp_udp_sockets_are_open $SERVICE_PID 2
# Wait until the service is finished.
# The client remotely shuts down the service once it has successfully
# transmitted all the packets with different payloads. Therefore we can assume
# that everything went well, even if we can only check the exit code of the
# service here. FAIL gets incremented if either client or service exits with a
# non-zero exit code.
wait $SERVICE_PID || ((FAIL+=1))
# Start the service for payload test with TCP
export VSOMEIP_APPLICATION_NAME=external_local_payload_test_service
export VSOMEIP_CONFIGURATION=external_local_payload_test_service.json
./payload_test_service --tcp &
SERVICE_PID=$!
# The service should listen on a TCP and a UDP socket now
sleep 1
check_tcp_udp_sockets_are_open $SERVICE_PID 2
# Wait until the service is finished.
# The client remotely shuts down the service once it has successfully
# transmitted all the packets with different payloads. Therefore we can assume
# that everything went well, even if we can only check the exit code of the
# service here. FAIL gets incremented if either client or service exits with a
# non-zero exit code.
wait $SERVICE_PID || ((FAIL+=1))
# Check if the server exited successfully
if [ $FAIL -eq 0 ]
then
exit 0
else
exit 1
fi
```
|
```c
/* GIO - GLib Input, Output and Streaming Library
*
*
* This program is free software: you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as published
* by the Free Software Foundation; either version 2 of the licence or (at
* your option) any later version.
*
* See the included COPYING file for more information.
*/
/**
* SECTION:gtcpconnection
* @title: GTcpConnection
* @short_description: A TCP GSocketConnection
* @include: gio/gio.h
* @see_also: #GSocketConnection.
*
* This is the subclass of #GSocketConnection that is created
* for TCP/IP sockets.
*
* Since: 2.22
*/
#include "config.h"
#include "gtcpconnection.h"
#include "gasyncresult.h"
#include "gtask.h"
#include "giostream.h"
#include "glibintl.h"
struct _GTcpConnectionPrivate
{
guint graceful_disconnect : 1;
};
G_DEFINE_TYPE_WITH_CODE (GTcpConnection, g_tcp_connection,
G_TYPE_SOCKET_CONNECTION,
G_ADD_PRIVATE (GTcpConnection)
g_socket_connection_factory_register_type (g_define_type_id,
G_SOCKET_FAMILY_IPV4,
G_SOCKET_TYPE_STREAM,
G_SOCKET_PROTOCOL_DEFAULT);
g_socket_connection_factory_register_type (g_define_type_id,
G_SOCKET_FAMILY_IPV6,
G_SOCKET_TYPE_STREAM,
G_SOCKET_PROTOCOL_DEFAULT);
g_socket_connection_factory_register_type (g_define_type_id,
G_SOCKET_FAMILY_IPV4,
G_SOCKET_TYPE_STREAM,
G_SOCKET_PROTOCOL_TCP);
g_socket_connection_factory_register_type (g_define_type_id,
G_SOCKET_FAMILY_IPV6,
G_SOCKET_TYPE_STREAM,
G_SOCKET_PROTOCOL_TCP);
);
static gboolean g_tcp_connection_close (GIOStream *stream,
GCancellable *cancellable,
GError **error);
static void g_tcp_connection_close_async (GIOStream *stream,
int io_priority,
GCancellable *cancellable,
GAsyncReadyCallback callback,
gpointer user_data);
enum
{
PROP_0,
PROP_GRACEFUL_DISCONNECT
};
static void
g_tcp_connection_init (GTcpConnection *connection)
{
connection->priv = g_tcp_connection_get_instance_private (connection);
connection->priv->graceful_disconnect = FALSE;
}
static void
g_tcp_connection_get_property (GObject *object,
guint prop_id,
GValue *value,
GParamSpec *pspec)
{
GTcpConnection *connection = G_TCP_CONNECTION (object);
switch (prop_id)
{
case PROP_GRACEFUL_DISCONNECT:
g_value_set_boolean (value, connection->priv->graceful_disconnect);
break;
default:
G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
}
}
static void
g_tcp_connection_set_property (GObject *object,
guint prop_id,
const GValue *value,
GParamSpec *pspec)
{
GTcpConnection *connection = G_TCP_CONNECTION (object);
switch (prop_id)
{
case PROP_GRACEFUL_DISCONNECT:
g_tcp_connection_set_graceful_disconnect (connection,
g_value_get_boolean (value));
break;
default:
G_OBJECT_WARN_INVALID_PROPERTY_ID (object, prop_id, pspec);
}
}
static void
g_tcp_connection_class_init (GTcpConnectionClass *class)
{
GObjectClass *gobject_class = G_OBJECT_CLASS (class);
GIOStreamClass *stream_class = G_IO_STREAM_CLASS (class);
gobject_class->set_property = g_tcp_connection_set_property;
gobject_class->get_property = g_tcp_connection_get_property;
stream_class->close_fn = g_tcp_connection_close;
stream_class->close_async = g_tcp_connection_close_async;
g_object_class_install_property (gobject_class, PROP_GRACEFUL_DISCONNECT,
g_param_spec_boolean ("graceful-disconnect",
P_("Graceful Disconnect"),
P_("Whether or not close does a graceful disconnect"),
FALSE,
G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS));
}
static gboolean
g_tcp_connection_close (GIOStream *stream,
GCancellable *cancellable,
GError **error)
{
GTcpConnection *connection = G_TCP_CONNECTION (stream);
GSocket *socket;
char buffer[1024];
gssize ret;
gboolean had_error;
socket = g_socket_connection_get_socket (G_SOCKET_CONNECTION (stream));
had_error = FALSE;
if (connection->priv->graceful_disconnect &&
!g_cancellable_is_cancelled (cancellable) /* Cancelled -> close fast */)
{
if (!g_socket_shutdown (socket, FALSE, TRUE, error))
{
error = NULL; /* Ignore further errors */
had_error = TRUE;
}
else
{
while (TRUE)
{
ret = g_socket_receive_with_blocking (socket, buffer, sizeof (buffer),
TRUE, cancellable, error);
if (ret < 0)
{
had_error = TRUE;
error = NULL;
break;
}
if (ret == 0)
break;
}
}
}
return G_IO_STREAM_CLASS (g_tcp_connection_parent_class)
->close_fn (stream, cancellable, error) && !had_error;
}
/* consumes @error */
static void
async_close_finish (GTask *task,
GError *error)
{
GIOStreamClass *parent = G_IO_STREAM_CLASS (g_tcp_connection_parent_class);
GIOStream *stream = g_task_get_source_object (task);
GCancellable *cancellable = g_task_get_cancellable (task);
/* Close underlying stream, ignoring further errors if we already
* have one.
*/
if (error)
parent->close_fn (stream, cancellable, NULL);
else
parent->close_fn (stream, cancellable, &error);
if (error)
g_task_return_error (task, error);
else
g_task_return_boolean (task, TRUE);
}
static gboolean
close_read_ready (GSocket *socket,
GIOCondition condition,
GTask *task)
{
GError *error = NULL;
char buffer[1024];
gssize ret;
ret = g_socket_receive_with_blocking (socket, buffer, sizeof (buffer),
FALSE, g_task_get_cancellable (task),
&error);
if (ret < 0)
{
if (g_error_matches (error, G_IO_ERROR, G_IO_ERROR_WOULD_BLOCK))
{
g_error_free (error);
return TRUE;
}
else
{
async_close_finish (task, error);
g_object_unref (task);
return FALSE;
}
}
if (ret == 0)
{
async_close_finish (task, NULL);
return FALSE;
}
return TRUE;
}
static void
g_tcp_connection_close_async (GIOStream *stream,
int io_priority,
GCancellable *cancellable,
GAsyncReadyCallback callback,
gpointer user_data)
{
GTcpConnection *connection = G_TCP_CONNECTION (stream);
GSocket *socket;
GSource *source;
GError *error;
GTask *task;
if (connection->priv->graceful_disconnect &&
!g_cancellable_is_cancelled (cancellable) /* Cancelled -> close fast */)
{
task = g_task_new (stream, cancellable, callback, user_data);
g_task_set_priority (task, io_priority);
socket = g_socket_connection_get_socket (G_SOCKET_CONNECTION (stream));
error = NULL;
if (!g_socket_shutdown (socket, FALSE, TRUE, &error))
{
g_task_return_error (task, error);
g_object_unref (task);
return;
}
source = g_socket_create_source (socket, G_IO_IN, cancellable);
g_task_attach_source (task, source, (GSourceFunc) close_read_ready);
g_source_unref (source);
return;
}
G_IO_STREAM_CLASS (g_tcp_connection_parent_class)
->close_async (stream, io_priority, cancellable, callback, user_data);
}
/**
* g_tcp_connection_set_graceful_disconnect:
* @connection: a #GTcpConnection
* @graceful_disconnect: Whether to do graceful disconnects or not
*
* This enables graceful disconnects on close. A graceful disconnect
* means that we signal the receiving end that the connection is terminated
* and wait for it to close the connection before closing the connection.
*
* A graceful disconnect means that we can be sure that we successfully sent
* all the outstanding data to the other end, or get an error reported.
* However, it also means we have to wait for all the data to reach the
* other side and for it to acknowledge this by closing the socket, which may
* take a while. For this reason it is disabled by default.
*
* Since: 2.22
*/
void
g_tcp_connection_set_graceful_disconnect (GTcpConnection *connection,
gboolean graceful_disconnect)
{
graceful_disconnect = !!graceful_disconnect;
if (graceful_disconnect != connection->priv->graceful_disconnect)
{
connection->priv->graceful_disconnect = graceful_disconnect;
g_object_notify (G_OBJECT (connection), "graceful-disconnect");
}
}
/**
* g_tcp_connection_get_graceful_disconnect:
* @connection: a #GTcpConnection
*
* Checks if graceful disconnects are used. See
* g_tcp_connection_set_graceful_disconnect().
*
* Returns: %TRUE if graceful disconnect is used on close, %FALSE otherwise
*
* Since: 2.22
*/
gboolean
g_tcp_connection_get_graceful_disconnect (GTcpConnection *connection)
{
return connection->priv->graceful_disconnect;
}
```
|
Beddoe is a surname of Welsh origin. It originates from Bettws or Betws, a Welsh name that is derived from the Anglo-Saxon Old English bed-hus, i.e. a bead-house: a house of prayer, or oratory.
At the time of the British Census of 1881, its frequency was highest in Pembrokeshire (over 65 times the national average), followed by Cambridgeshire, Shropshire, Glamorgan, Monmouthshire, Herefordshire, Carmarthenshire, Staffordshire, Cardiganshire and Warwickshire.
Notable people with the surname include:
Martin Beddoe (born 1955), British judge
Alan Beddoe (1893–1975), Canadian artist, consultant in heraldry, and founder of the Heraldry Society of Canada
Clive Beddoe (born 1947), founding shareholder and chairman of the board of directors of WestJet Airlines
Dan Beddoe (1863–1937), Welsh tenor
Don Beddoe (1903–1991), American character actor
John Beddoe (1826–1911), British ethnologist
Valerie Beddoe (born 1960), Australian diver
Fictional characters
Philo Beddoe, a character played by Clint Eastwood in the films Every Which Way But Loose and Any Which Way You Can
See also
David Rowe-Beddoe, Baron Rowe-Beddoe, British politician and member of the House of Lords
Beddoe Rees, British architect and politician
Beddoe–Rose Family Cemetery, New York State
Bedo (disambiguation)
Beddoes
References
Surnames of Welsh origin
Welsh-language surnames
|
```c++
//your_sha256_hash-----------//
//
// Distributed under the Boost Software License, Version 1.0.
// See accompanying file LICENSE_1_0.txt or copy at
// path_to_url
//
// See path_to_url for more information.
//your_sha256_hash-----------//
#include <algorithm>
#include <iostream>
#include <numeric>
#include <vector>
#include "perf.hpp"
int rand_int()
{
return static_cast<int>((rand() / double(RAND_MAX)) * 25.0);
}
int main(int argc, char *argv[])
{
perf_parse_args(argc, argv);
std::cout << "size: " << PERF_N << std::endl;
// create vector of random numbers on the host
std::vector<int> host_vector(PERF_N);
std::generate(host_vector.begin(), host_vector.end(), rand_int);
int pattern[] = {2, 6, 6, 7, 8, 4};
perf_timer t;
for(size_t trial = 0; trial < PERF_TRIALS; trial++){
t.start();
std::search(host_vector.begin(), host_vector.end(),
pattern, pattern + 6);
t.stop();
}
std::cout << "time: " << t.min_time() / 1e6 << " ms" << std::endl;
return 0;
}
```
|
Aleksandr Belov is a Soviet sprint canoeist who competed in the mid-1980s. He won a silver medal in the K-4 500 m event at the 1985 ICF Canoe Sprint World Championships in Mechelen.
References
Living people
Soviet male canoeists
Russian male canoeists
Year of birth missing (living people)
ICF Canoe Sprint World Championships medalists in kayak
|
```html
---
layout: default
---
{% include nav.html %}
<header class="article-alt-header bg-gray-light">
<div class="container-lg p-responsive mx-auto">
<h1 class="h1-mktg lh-condensed text-center my-6 py-md-6">{{ page.title }}</h1>
<p class="lead text-center text-gray col-md-8 mx-auto mb-4 position-relative">{{ page.description }}</p>
</div>
</header>
<article class="pb-md-6">
<div class="article-body mx-auto px-3 pt-4 pt-lg-6 pb-6 col-md-10 col-lg-8 col-xl-6">
{{ content }}
</div>
</article>
{% include footer.html %}
```
|
Andrea Pietra is an Argentine actress. She worked with Nancy Dupláa as one of the lead actresses of Socias. She suggested hiring the actress Paola Barrientos for the new telenovela Graduados. In 2012 she joined the telenovela Sos mi hombre.
Works
Film
Television
References
Actresses from Buenos Aires
1968 births
Living people
|
```python
from office365.intune.organizations.branding_properties import (
OrganizationalBrandingProperties,
)
class OrganizationalBranding(OrganizationalBrandingProperties):
"""
Contains details about the organization's default branding. Inherits from organizationalBrandingProperties.
Organizations can customize their Azure Active Directory (Azure AD) sign-in pages which appear when users sign
in to their organization's tenant-specific apps, or when Azure AD identifies the user's tenant from their username.
A developer can also read the company's branding information and customize their app experience to tailor
it specifically for the signed-in user using their company's branding.
You can't change your original configuration's language. However, companies can add different branding based on
locale. For language-specific branding, see the organizationalBrandingLocalization object.
"""
```
|
```python
from __future__ import division, absolute_import, print_function
# Code common to build tools
import sys
import warnings
import copy
import binascii
from numpy.distutils.misc_util import mingw32
#-------------------
# Versioning support
#-------------------
# How to change C_API_VERSION ?
# - increase C_API_VERSION value
# - record the hash for the new C API with the script cversions.py
# and add the hash to cversions.txt
# The hash values are used to remind developers when the C API number was not
# updated - generates a MismatchCAPIWarning warning which is turned into an
# exception for released version.
# Binary compatibility version number. This number is increased whenever the
# C-API is changed such that binary compatibility is broken, i.e. whenever a
# recompile of extension modules is needed.
C_ABI_VERSION = 0x01000009
# Minor API version. This number is increased whenever a change is made to the
# C-API -- whether it breaks binary compatibility or not. Some changes, such
# as adding a function pointer to the end of the function table, can be made
# without breaking binary compatibility. In this case, only the C_API_VERSION
# (*not* C_ABI_VERSION) would be increased. Whenever binary compatibility is
# broken, both C_API_VERSION and C_ABI_VERSION should be increased.
#
# 0x00000008 - 1.7.x
# 0x00000009 - 1.8.x
# 0x00000009 - 1.9.x
# 0x0000000a - 1.10.x
# 0x0000000a - 1.11.x
C_API_VERSION = 0x0000000a
class MismatchCAPIWarning(Warning):
pass
def is_released(config):
"""Return True if a released version of numpy is detected."""
from distutils.version import LooseVersion
v = config.get_version('../version.py')
if v is None:
raise ValueError("Could not get version")
pv = LooseVersion(vstring=v).version
if len(pv) > 3:
return False
return True
def get_api_versions(apiversion, codegen_dir):
"""
Return current C API checksum and the recorded checksum.
Return current C API checksum and the recorded checksum for the given
version of the C API version.
"""
# Compute the hash of the current API as defined in the .txt files in
# code_generators
sys.path.insert(0, codegen_dir)
try:
m = __import__('genapi')
numpy_api = __import__('numpy_api')
curapi_hash = m.fullapi_hash(numpy_api.full_api)
apis_hash = m.get_versions_hash()
finally:
del sys.path[0]
return curapi_hash, apis_hash[apiversion]
def check_api_version(apiversion, codegen_dir):
"""Emits a MismacthCAPIWarning if the C API version needs updating."""
curapi_hash, api_hash = get_api_versions(apiversion, codegen_dir)
# If different hash, it means that the api .txt files in
# codegen_dir have been updated without the API version being
# updated. Any modification in those .txt files should be reflected
# in the api and eventually abi versions.
# To compute the checksum of the current API, use
# code_generators/cversions.py script
if not curapi_hash == api_hash:
msg = ("API mismatch detected, the C API version "
"numbers have to be updated. Current C api version is %d, "
"with checksum %s, but recorded checksum for C API version %d in "
"codegen_dir/cversions.txt is %s. If functions were added in the "
"C API, you have to update C_API_VERSION in %s."
)
warnings.warn(msg % (apiversion, curapi_hash, apiversion, api_hash,
__file__),
MismatchCAPIWarning)
# Mandatory functions: if not found, fail the build
MANDATORY_FUNCS = ["sin", "cos", "tan", "sinh", "cosh", "tanh", "fabs",
"floor", "ceil", "sqrt", "log10", "log", "exp", "asin",
"acos", "atan", "fmod", 'modf', 'frexp', 'ldexp']
# Standard functions which may not be available and for which we have a
# replacement implementation. Note that some of these are C99 functions.
OPTIONAL_STDFUNCS = ["expm1", "log1p", "acosh", "asinh", "atanh",
"rint", "trunc", "exp2", "log2", "hypot", "atan2", "pow",
"copysign", "nextafter", "ftello", "fseeko",
"strtoll", "strtoull", "cbrt", "strtold_l", "fallocate"]
OPTIONAL_HEADERS = [
# sse headers only enabled automatically on amd64/x32 builds
"xmmintrin.h", # SSE
"emmintrin.h", # SSE2
"features.h", # for glibc version linux
]
# optional gcc compiler builtins and their call arguments and, optionally, a
# required header
# call arguments are required as the compiler will do strict signature checking
OPTIONAL_INTRINSICS = [("__builtin_isnan", '5.'),
("__builtin_isinf", '5.'),
("__builtin_isfinite", '5.'),
("__builtin_bswap32", '5u'),
("__builtin_bswap64", '5u'),
("__builtin_expect", '5, 0'),
("__builtin_mul_overflow", '5, 5, (int*)5'),
("_mm_load_ps", '(float*)0', "xmmintrin.h"), # SSE
("_mm_prefetch", '(float*)0, _MM_HINT_NTA',
"xmmintrin.h"), # SSE
("_mm_load_pd", '(double*)0', "emmintrin.h"), # SSE2
("__builtin_prefetch", "(float*)0, 0, 3"),
]
# function attributes
# tested via "int %s %s(void *);" % (attribute, name)
# function name will be converted to HAVE_<upper-case-name> preprocessor macro
OPTIONAL_FUNCTION_ATTRIBUTES = [('__attribute__((optimize("unroll-loops")))',
'attribute_optimize_unroll_loops'),
('__attribute__((optimize("O3")))',
'attribute_optimize_opt_3'),
('__attribute__((nonnull (1)))',
'attribute_nonnull'),
]
# variable attributes tested via "int %s a" % attribute
OPTIONAL_VARIABLE_ATTRIBUTES = ["__thread", "__declspec(thread)"]
# Subset of OPTIONAL_STDFUNCS which may already have HAVE_* defined by Python.h
OPTIONAL_STDFUNCS_MAYBE = [
"expm1", "log1p", "acosh", "atanh", "asinh", "hypot", "copysign",
"ftello", "fseeko"
]
# C99 functions: float and long double versions
C99_FUNCS = [
"sin", "cos", "tan", "sinh", "cosh", "tanh", "fabs", "floor", "ceil",
"rint", "trunc", "sqrt", "log10", "log", "log1p", "exp", "expm1",
"asin", "acos", "atan", "asinh", "acosh", "atanh", "hypot", "atan2",
"pow", "fmod", "modf", 'frexp', 'ldexp', "exp2", "log2", "copysign",
"nextafter", "cbrt"
]
C99_FUNCS_SINGLE = [f + 'f' for f in C99_FUNCS]
C99_FUNCS_EXTENDED = [f + 'l' for f in C99_FUNCS]
C99_COMPLEX_TYPES = [
'complex double', 'complex float', 'complex long double'
]
C99_COMPLEX_FUNCS = [
"cabs", "cacos", "cacosh", "carg", "casin", "casinh", "catan",
"catanh", "ccos", "ccosh", "cexp", "cimag", "clog", "conj", "cpow",
"cproj", "creal", "csin", "csinh", "csqrt", "ctan", "ctanh"
]
def fname2def(name):
return "HAVE_%s" % name.upper()
def sym2def(symbol):
define = symbol.replace(' ', '')
return define.upper()
def type2def(symbol):
define = symbol.replace(' ', '_')
return define.upper()
# Code to detect long double representation taken from MPFR m4 macro
def check_long_double_representation(cmd):
cmd._check_compiler()
body = LONG_DOUBLE_REPRESENTATION_SRC % {'type': 'long double'}
# Disable whole program optimization (the default on vs2015, with python 3.5+)
# which generates intermediary object files and prevents checking the
# float representation.
if sys.platform == "win32" and not mingw32():
try:
cmd.compiler.compile_options.remove("/GL")
except (AttributeError, ValueError):
pass
# We need to use _compile because we need the object filename
src, obj = cmd._compile(body, None, None, 'c')
try:
ltype = long_double_representation(pyod(obj))
return ltype
except ValueError:
# try linking to support CC="gcc -flto" or icc -ipo
# struct needs to be volatile so it isn't optimized away
body = body.replace('struct', 'volatile struct')
body += "int main(void) { return 0; }\n"
src, obj = cmd._compile(body, None, None, 'c')
cmd.temp_files.append("_configtest")
cmd.compiler.link_executable([obj], "_configtest")
ltype = long_double_representation(pyod("_configtest"))
return ltype
finally:
cmd._clean()
LONG_DOUBLE_REPRESENTATION_SRC = r"""
/* "before" is 16 bytes to ensure there's no padding between it and "x".
* We're not expecting any "long double" bigger than 16 bytes or with
* alignment requirements stricter than 16 bytes. */
typedef %(type)s test_type;
struct {
char before[16];
test_type x;
char after[8];
} foo = {
{ '\0', '\0', '\0', '\0', '\0', '\0', '\0', '\0',
'\001', '\043', '\105', '\147', '\211', '\253', '\315', '\357' },
-123456789.0,
{ '\376', '\334', '\272', '\230', '\166', '\124', '\062', '\020' }
};
"""
def pyod(filename):
"""Python implementation of the od UNIX utility (od -b, more exactly).
Parameters
----------
filename : str
name of the file to get the dump from.
Returns
-------
out : seq
list of lines of od output
Note
----
We only implement enough to get the necessary information for long double
representation; this is not intended as a compatible replacement for od.
"""
def _pyod2():
out = []
fid = open(filename, 'rb')
try:
yo = [int(oct(int(binascii.b2a_hex(o), 16))) for o in fid.read()]
for i in range(0, len(yo), 16):
line = ['%07d' % int(oct(i))]
line.extend(['%03d' % c for c in yo[i:i+16]])
out.append(" ".join(line))
return out
finally:
fid.close()
def _pyod3():
out = []
fid = open(filename, 'rb')
try:
yo2 = [oct(o)[2:] for o in fid.read()]
for i in range(0, len(yo2), 16):
line = ['%07d' % int(oct(i)[2:])]
line.extend(['%03d' % int(c) for c in yo2[i:i+16]])
out.append(" ".join(line))
return out
finally:
fid.close()
if sys.version_info[0] < 3:
return _pyod2()
else:
return _pyod3()
_BEFORE_SEQ = ['000', '000', '000', '000', '000', '000', '000', '000',
'001', '043', '105', '147', '211', '253', '315', '357']
_AFTER_SEQ = ['376', '334', '272', '230', '166', '124', '062', '020']
_IEEE_DOUBLE_BE = ['301', '235', '157', '064', '124', '000', '000', '000']
_IEEE_DOUBLE_LE = _IEEE_DOUBLE_BE[::-1]
_INTEL_EXTENDED_12B = ['000', '000', '000', '000', '240', '242', '171', '353',
'031', '300', '000', '000']
_INTEL_EXTENDED_16B = ['000', '000', '000', '000', '240', '242', '171', '353',
'031', '300', '000', '000', '000', '000', '000', '000']
_MOTOROLA_EXTENDED_12B = ['300', '031', '000', '000', '353', '171',
'242', '240', '000', '000', '000', '000']
_IEEE_QUAD_PREC_BE = ['300', '031', '326', '363', '105', '100', '000', '000',
'000', '000', '000', '000', '000', '000', '000', '000']
_IEEE_QUAD_PREC_LE = _IEEE_QUAD_PREC_BE[::-1]
_DOUBLE_DOUBLE_BE = (['301', '235', '157', '064', '124', '000', '000', '000'] +
['000'] * 8)
_DOUBLE_DOUBLE_LE = (['000', '000', '000', '124', '064', '157', '235', '301'] +
['000'] * 8)
def long_double_representation(lines):
"""Given a binary dump as given by GNU od -b, look for long double
representation."""
# Read contains a list of 32 items, each item is a byte (in octal
# representation, as a string). We 'slide' over the output until read is of
# the form before_seq + content + after_sequence, where content is the long double
# representation:
# - content is 12 bytes: 80 bits Intel representation
# - content is 16 bytes: 80 bits Intel representation (64 bits) or quad precision
# - content is 8 bytes: same as double (not implemented yet)
read = [''] * 32
saw = None
for line in lines:
# we skip the first word, as od -b output an index at the beginning of
# each line
for w in line.split()[1:]:
read.pop(0)
read.append(w)
# If the end of read is equal to the after_sequence, read contains
# the long double
if read[-8:] == _AFTER_SEQ:
saw = copy.copy(read)
if read[:12] == _BEFORE_SEQ[4:]:
if read[12:-8] == _INTEL_EXTENDED_12B:
return 'INTEL_EXTENDED_12_BYTES_LE'
if read[12:-8] == _MOTOROLA_EXTENDED_12B:
return 'MOTOROLA_EXTENDED_12_BYTES_BE'
elif read[:8] == _BEFORE_SEQ[8:]:
if read[8:-8] == _INTEL_EXTENDED_16B:
return 'INTEL_EXTENDED_16_BYTES_LE'
elif read[8:-8] == _IEEE_QUAD_PREC_BE:
return 'IEEE_QUAD_BE'
elif read[8:-8] == _IEEE_QUAD_PREC_LE:
return 'IEEE_QUAD_LE'
elif read[8:-8] == _DOUBLE_DOUBLE_BE:
return 'DOUBLE_DOUBLE_BE'
elif read[8:-8] == _DOUBLE_DOUBLE_LE:
return 'DOUBLE_DOUBLE_LE'
elif read[:16] == _BEFORE_SEQ:
if read[16:-8] == _IEEE_DOUBLE_LE:
return 'IEEE_DOUBLE_LE'
elif read[16:-8] == _IEEE_DOUBLE_BE:
return 'IEEE_DOUBLE_BE'
if saw is not None:
raise ValueError("Unrecognized format (%s)" % saw)
else:
# We never detected the after_sequence
raise ValueError("Could not lock sequences (%s)" % saw)
```
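The sliding-window scan in `long_double_representation` can be demonstrated in isolation. A minimal sketch (synthetic data, not an actual object-file dump) using the same octal "before"/"after" marker sequences as above:

```python
# Minimal sketch of the sliding-window scan above: the byte dump is a flat
# list of 3-digit octal strings, and we slide a 32-item window until its
# tail equals the known "after" marker, then inspect what precedes it.
AFTER_SEQ = ['376', '334', '272', '230', '166', '124', '062', '020']
BEFORE_SEQ = ['000'] * 8 + ['001', '043', '105', '147', '211', '253', '315', '357']

def find_payload(octal_bytes):
    """Return the 16 bytes between BEFORE_SEQ and AFTER_SEQ, or None."""
    window = [''] * 32
    for b in octal_bytes:
        window.pop(0)
        window.append(b)
        if window[-8:] == AFTER_SEQ:
            # 16-byte payload: window = BEFORE_SEQ[8:] + payload + AFTER_SEQ
            if window[:8] == BEFORE_SEQ[8:]:
                return window[8:-8]
    return None

# Synthetic dump: padding, the "before" marker, a fake 16-byte payload,
# then the "after" marker and trailing padding.
payload = ['111'] * 16
dump = ['000'] * 5 + BEFORE_SEQ + payload + AFTER_SEQ + ['000'] * 3
print(find_payload(dump))
```

The real function additionally dispatches on payload length (8, 12 or 16 bytes) to name the representation; this sketch only shows the window mechanics.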
|
```c
/*
===========================================================================
This file is part of Quake III Arena source code.
Quake III Arena source code is free software; you can redistribute it
and/or modify it under the terms of the GNU General Public License as
published by the Free Software Foundation; either version 2 of the License,
or (at your option) any later version.
Quake III Arena source code is distributed in the hope that it will be
useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public License
along with Quake III Arena source code; if not, write to the Free Software
Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA
===========================================================================
*/
//
/*****************************************************************************
* name: be_ai_char.h
*
* desc: bot characters
*
* $Archive: /source/code/botlib/be_ai_char.h $
*
*****************************************************************************/
//loads a bot character from a file
int BotLoadCharacter(char *charfile, float skill);
//frees a bot character
void BotFreeCharacter(int character);
//returns a float characteristic
float Characteristic_Float(int character, int index);
//returns a bounded float characteristic
float Characteristic_BFloat(int character, int index, float min, float max);
//returns an integer characteristic
int Characteristic_Integer(int character, int index);
//returns a bounded integer characteristic
int Characteristic_BInteger(int character, int index, int min, int max);
//returns a string characteristic
void Characteristic_String(int character, int index, char *buf, int size);
//free cached bot characters
void BotShutdownCharacters(void);
```
|
Frank Clayton may refer to:
People
Frank Clayton (Otago cricketer) (1866–1944), New Zealand cricketer
Frank Clayton (Canterbury cricketer) (1866–1941), New Zealand cricketer
Frank Clayton (American football), selected in the 1955 NFL Draft
Fictional characters
Frank Clayton (Emmerdale), fictional character from the British soap opera Emmerdale
Frank Clayton, character in Saratoga (film)
Frank Clayton, character in Pleasures of the Rich
See also
Francis Clayton (disambiguation)
|
```c++
#ifndef BOOST_SERIALIZATION_ITEM_VERSION_TYPE_HPP
#define BOOST_SERIALIZATION_ITEM_VERSION_TYPE_HPP
// Use, modification and distribution is subject to the Boost Software
// License, Version 1.0. (See accompanying file LICENSE_1_0.txt or copy at
// path_to_url)
#include <boost/cstdint.hpp> // uint_least8_t
#include <boost/integer_traits.hpp>
#include <boost/serialization/level.hpp>
#include <boost/serialization/is_bitwise_serializable.hpp>
// fixes broken example build on x86_64-linux-gnu-gcc-4.6.0
#include <boost/assert.hpp>
namespace boost {
namespace serialization {
#if defined(_MSC_VER)
#pragma warning( push )
#pragma warning( disable : 4244 4267 )
#endif
class item_version_type {
private:
    typedef uint_least8_t base_type;
base_type t;
public:
// should be private - but MPI fails if it's not!!!
item_version_type(): t(0) {}
explicit item_version_type(const unsigned int t_) : t(t_){
BOOST_ASSERT(t_ <= boost::integer_traits<base_type>::const_max);
}
item_version_type(const item_version_type & t_) :
t(t_.t)
{}
item_version_type & operator=(item_version_type rhs){
t = rhs.t;
return *this;
}
// used for text output
operator base_type () const {
return t;
}
// used for text input
operator base_type & () {
return t;
}
bool operator==(const item_version_type & rhs) const {
return t == rhs.t;
}
bool operator<(const item_version_type & rhs) const {
return t < rhs.t;
}
};
#if defined(_MSC_VER)
#pragma warning( pop )
#endif
} } // end namespace boost::serialization
BOOST_IS_BITWISE_SERIALIZABLE(boost::serialization::item_version_type)
BOOST_CLASS_IMPLEMENTATION(boost::serialization::item_version_type, primitive_type)
#endif //BOOST_SERIALIZATION_ITEM_VERSION_TYPE_HPP
```
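`item_version_type` is a "strong typedef": it range-checks on construction and converts implicitly to its base type for text I/O. The same idea can be sketched in Python (illustrative only; `MAX = 255` stands in for `integer_traits<uint_least8_t>::const_max`):

```python
# Sketch of the strong-typedef idea above: wrap an integer, range-check on
# construction (mirroring BOOST_ASSERT), and expose the value for "text
# output" via int() and comparison operators.
class ItemVersion:
    MAX = 255  # stands in for integer_traits<...>::const_max (assumption)

    def __init__(self, value=0):
        assert 0 <= value <= self.MAX, 'version out of range'
        self._value = value

    def __int__(self):
        return self._value        # "used for text output"

    def __eq__(self, other):
        return self._value == other._value

    def __lt__(self, other):
        return self._value < other._value

v = ItemVersion(3)
print(int(v), v < ItemVersion(4))
```

The point of the wrapper, in both languages, is that a version number cannot be confused with an arbitrary integer at API boundaries while still serializing as one.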
|
```html
{{ $product_link := "[Docker Hub](path_to_url" }}
{{ $domain_navigation := `Navigate to the domain settings page for your organization or company.
- Organization: Select **Organizations**, your organization, **Settings**, and then **Security**.
- Company: Select **Organizations**, your company, and then **Settings**.` }}
{{ if eq (.Get "product") "admin" }}
{{ $product_link = "the [Admin Console](path_to_url" }}
{{ $domain_navigation = "Select your organization or company in the left navigation drop-down menu, and then select **Domain management**." }}
{{ end }}
1. Sign in to {{ $product_link }}.
2. {{ $domain_navigation }}
3. Select **Add a domain**.
4. Continue with the on-screen instructions to get a verification code for
your domain as a **TXT Record Value**.
> [!NOTE]
>
> Format your domains without protocol or www information, for example,
> `yourcompany.example`. This should include all email domains and
> subdomains users will use to access Docker, for example
> `yourcompany.example` and `us.yourcompany.example`. Public domains such as
> `gmail.com`, `outlook.com`, etc. aren't permitted.

> [!TIP]
>
> Make sure that the TXT record name that you create on your DNS matches
> the domain you registered on Docker in Step 4. For example,
> if you registered the subdomain `us.yourcompany.example`,
> you need to create a TXT record within the same name/zone `us`.
> A root domain such as `yourcompany.example` needs a TXT record on the
> root zone, which is typically denoted with the `@` name for the record.
5. Once the TXT record has had time to propagate (this can take up to
   72 hours), select **Verify** next to the domain you've added,
   and follow the on-screen instructions.
```
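The domain-format rule in the note above (no protocol, no path, no public mail providers) can be captured in a small pre-flight check. A hypothetical helper, not part of Docker's tooling:

```python
# Hypothetical validator mirroring the note above: domains must be bare
# (no scheme or path) and must not be a public mail provider.
# The provider list here is illustrative, not exhaustive.
PUBLIC_PROVIDERS = {'gmail.com', 'outlook.com', 'yahoo.com'}

def is_valid_company_domain(domain):
    if '://' in domain or '/' in domain:
        return False          # no protocol or path information allowed
    if domain.lower() in PUBLIC_PROVIDERS:
        return False          # public domains aren't permitted
    return '.' in domain      # must at least look like a domain

print(is_valid_company_domain('us.yourcompany.example'))   # subdomains are fine
print(is_valid_company_domain('https://yourcompany.example'))
print(is_valid_company_domain('gmail.com'))
```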
|
```cpp
//===- SyncDependenceAnalysis.h - Divergent Branch Dependence -*- C++ -*-===//
//
// Part of the LLVM Project, under the Apache License v2.0 with LLVM Exceptions.
// See path_to_url for license information.
// SPDX-License-Identifier: Apache-2.0 WITH LLVM-exception
//
//===----------------------------------------------------------------------===//
//
// \file
// This file defines the SyncDependenceAnalysis class, which computes for
// every divergent branch the set of phi nodes that the branch will make
// divergent.
//
//===----------------------------------------------------------------------===//
#ifndef LLVM_ANALYSIS_SYNCDEPENDENCEANALYSIS_H
#define LLVM_ANALYSIS_SYNCDEPENDENCEANALYSIS_H
#include "llvm/ADT/SmallPtrSet.h"
#include <map>
#include <memory>
#include <unordered_map>
#include <vector>
namespace llvm {
class BasicBlock;
class DominatorTree;
class Instruction;
class LoopInfo;
class PostDominatorTree;
using ConstBlockSet = SmallPtrSet<const BasicBlock *, 4>;
struct ControlDivergenceDesc {
// Join points of divergent disjoint paths.
ConstBlockSet JoinDivBlocks;
// Divergent loop exits
ConstBlockSet LoopDivBlocks;
};
struct ModifiedPO {
std::vector<const BasicBlock *> LoopPO;
std::unordered_map<const BasicBlock *, unsigned> POIndex;
void appendBlock(const BasicBlock &BB) {
POIndex[&BB] = LoopPO.size();
LoopPO.push_back(&BB);
}
unsigned getIndexOf(const BasicBlock &BB) const {
return POIndex.find(&BB)->second;
}
unsigned size() const { return LoopPO.size(); }
const BasicBlock *getBlockAt(unsigned Idx) const { return LoopPO[Idx]; }
};
/// \brief Relates points of divergent control to join points in
/// reducible CFGs.
///
/// This analysis relates points of divergent control to points of converging
/// divergent control. The analysis requires all loops to be reducible.
class SyncDependenceAnalysis {
public:
~SyncDependenceAnalysis();
SyncDependenceAnalysis(const DominatorTree &DT, const PostDominatorTree &PDT,
const LoopInfo &LI);
/// \brief Computes divergent join points and loop exits caused by branch
/// divergence in \p Term.
///
/// The set of blocks which are reachable by disjoint paths from \p Term.
  /// The set also contains loop exits if there are two disjoint paths:
/// one from \p Term to the loop exit and another from \p Term to the loop
/// header. Those exit blocks are added to the returned set.
/// If L is the parent loop of \p Term and an exit of L is in the returned
/// set then L is a divergent loop.
const ControlDivergenceDesc &getJoinBlocks(const Instruction &Term);
private:
static ControlDivergenceDesc EmptyDivergenceDesc;
ModifiedPO LoopPO;
const DominatorTree &DT;
const PostDominatorTree &PDT;
const LoopInfo &LI;
std::map<const Instruction *, std::unique_ptr<ControlDivergenceDesc>>
CachedControlDivDescs;
};
} // namespace llvm
#endif // LLVM_ANALYSIS_SYNCDEPENDENCEANALYSIS_H
```
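`ModifiedPO` above is simply an append-only block ordering paired with a hash map for O(1) index lookups. The same pattern in Python, with strings standing in for basic blocks:

```python
# Sketch of the ModifiedPO pattern above: an append-only traversal order
# paired with a dictionary for O(1) index-of-block lookups.
class ModifiedPO:
    def __init__(self):
        self.order = []   # blocks in (loop-aware) post order
        self.index = {}   # block -> position in `order`

    def append_block(self, block):
        self.index[block] = len(self.order)
        self.order.append(block)

    def get_index_of(self, block):
        return self.index[block]

po = ModifiedPO()
for bb in ['entry', 'loop.header', 'loop.body', 'exit']:
    po.append_block(bb)
print(po.get_index_of('loop.body'), len(po.order))
```

Keeping the vector and the map in sync on every append is what lets the analysis both iterate blocks in order and compare two blocks' positions in constant time.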
|
```javascript
/**
* @license Apache-2.0
*
*
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
*    path_to_url
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
*/
'use strict';
// MODULES //
var tape = require( 'tape' );
var add = require( '@stdlib/math/base/ops/add' );
var zeros3d = require( '@stdlib/array/base/zeros3d' );
var binary3d = require( './../lib' );
// TESTS //
tape( 'main export is a function', function test( t ) {
t.ok( true, __filename );
t.strictEqual( typeof binary3d, 'function', 'main export is a function' );
t.end();
});
tape( 'the function applies a provided callback to nested input arrays and assigns results to a nested output array', function test( t ) {
var expected;
var shape;
var x;
var y;
var z;
shape = [ 2, 2, 2 ];
x = [
[
[ 1.0, 2.0 ],
[ 3.0, 4.0 ]
],
[
[ 1.0, 2.0 ],
[ 3.0, 4.0 ]
]
];
y = [
[
[ 5.0, 6.0 ],
[ 7.0, 8.0 ]
],
[
[ 5.0, 6.0 ],
[ 7.0, 8.0 ]
]
];
z = zeros3d( shape );
expected = [
[
[ 6.0, 8.0 ],
[ 10.0, 12.0 ]
],
[
[ 6.0, 8.0 ],
[ 10.0, 12.0 ]
]
];
binary3d( [ x, y, z ], shape, add );
t.deepEqual( z, expected, 'returns expected value' );
t.end();
});
tape( 'the function does not invoke a provided callback if provided a shape having a first element equal to zero', function test( t ) {
var expected;
var shape;
var x;
var y;
var z;
shape = [ 2, 2, 2 ];
x = [
[
[ 1.0, 2.0 ],
[ 3.0, 4.0 ]
],
[
[ 1.0, 2.0 ],
[ 3.0, 4.0 ]
]
];
y = [
[
[ 5.0, 6.0 ],
[ 7.0, 8.0 ]
],
[
[ 5.0, 6.0 ],
[ 7.0, 8.0 ]
]
];
z = zeros3d( shape );
expected = zeros3d( shape );
binary3d( [ x, y, z ], [ 0, 2, 2 ], clbk );
t.deepEqual( z, expected, 'returns expected value' );
t.end();
function clbk() {
t.ok( false, 'should not invoke callback' );
}
});
tape( 'the function does not invoke a provided callback if provided a shape having a second element equal to zero', function test( t ) {
var expected;
var shape;
var x;
var y;
var z;
shape = [ 2, 2, 2 ];
x = [
[
[ 1.0, 2.0 ],
[ 3.0, 4.0 ]
],
[
[ 1.0, 2.0 ],
[ 3.0, 4.0 ]
]
];
y = [
[
[ 5.0, 6.0 ],
[ 7.0, 8.0 ]
],
[
[ 5.0, 6.0 ],
[ 7.0, 8.0 ]
]
];
z = zeros3d( shape );
expected = zeros3d( shape );
binary3d( [ x, y, z ], [ 2, 0, 2 ], clbk );
t.deepEqual( z, expected, 'returns expected value' );
t.end();
function clbk() {
t.ok( false, 'should not invoke callback' );
}
});
tape( 'the function does not invoke a provided callback if provided a shape having a third element equal to zero', function test( t ) {
var expected;
var shape;
var x;
var y;
var z;
shape = [ 2, 2, 2 ];
x = [
[
[ 1.0, 2.0 ],
[ 3.0, 4.0 ]
],
[
[ 1.0, 2.0 ],
[ 3.0, 4.0 ]
]
];
y = [
[
[ 5.0, 6.0 ],
[ 7.0, 8.0 ]
],
[
[ 5.0, 6.0 ],
[ 7.0, 8.0 ]
]
];
z = zeros3d( shape );
expected = zeros3d( shape );
binary3d( [ x, y, z ], [ 2, 2, 0 ], clbk );
t.deepEqual( z, expected, 'returns expected value' );
t.end();
function clbk() {
t.ok( false, 'should not invoke callback' );
}
});
```
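The behaviour these tests exercise — applying a binary callback elementwise over nested 2×2×2 arrays into a preallocated output, and skipping all callbacks when any shape dimension is zero — can be sketched independently of the stdlib package (names here are illustrative, not the package's actual implementation):

```python
# Sketch of the pattern the tests above exercise: walk a 3-d shape and
# apply a binary callback elementwise, writing into a preallocated output.
def binary3d(arrays, shape, fcn):
    x, y, z = arrays
    d0, d1, d2 = shape
    for i in range(d0):        # a zero in any dimension skips all callbacks
        for j in range(d1):
            for k in range(d2):
                z[i][j][k] = fcn(x[i][j][k], y[i][j][k])

x = [[[1.0, 2.0], [3.0, 4.0]], [[1.0, 2.0], [3.0, 4.0]]]
y = [[[5.0, 6.0], [7.0, 8.0]], [[5.0, 6.0], [7.0, 8.0]]]
z = [[[0.0, 0.0], [0.0, 0.0]] for _ in range(2)]
binary3d([x, y, z], [2, 2, 2], lambda a, b: a + b)
print(z[0])
```

Because the loops are bounded by `shape` rather than by the arrays themselves, passing `[0, 2, 2]` (or a zero in any other position) leaves the output untouched, which is exactly what the zero-dimension tests assert.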
|
Tityus discrepans is a species of scorpion found in northern and north-eastern South America.
Description and behavior
Tityus discrepans can grow up to 71 mm (males) and 60 mm (females), and has a reddish-brown body and pedipalps with lighter legs; juveniles can be light yellow-brown, with black areas on the rest of the body. Like many scorpions it is nocturnal, feeding on spiders, earwigs, cockroaches and butterflies, and it has even been observed preying on other scorpions. It is a solitary animal, meeting other members of its species only during the mating season.
Range and habitat
Tityus discrepans is found in Brazil, Suriname, Venezuela, Guyana, and Trinidad and Tobago. It inhabits wooded areas, living under leaves, rocks and bark and among orchids, bromeliads and crevices. The species also occurs in human habitations, where it has been reported in sheets, clothing and shoes.
Reproduction
Reproduction is sexual and can occur at any time of year, several times a year. The male performs a courtship dance in which he grips the female and leads her to the spermatophore (sperm packet) he has deposited. The female can produce 15–30 young, which are born fully developed and ride on the mother's back.
Medical significance
Tityus discrepans is considered a serious public health problem in Venezuela and is dangerous at all ages, especially to children and the elderly. Its sting can cause piloerection, dyspnea, excessive salivation, cramps, fever and vomiting; in severe cases, heart failure, pulmonary edema and pancreatitis occur. The venom is composed of around 80 toxins, six of which are bactridines that inhibit sodium channels in the nervous system. Ten of the toxins are considered dangerous to humans: because of their low molecular weight they travel quickly through the bloodstream and attack the heart, lungs and pancreas.
References
discrepans
Scorpions of South America
|
The 1983–84 Soviet Championship League season was the 38th season of the Soviet Championship League, the top level of ice hockey in the Soviet Union. 12 teams participated in the league, and CSKA Moscow won the championship.
Standings
External links
Season on hockeystars.ru
1983–84 in Soviet ice hockey
Soviet League seasons
|
```java
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
* path_to_url
*
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
*/
package org.apache.shardingsphere.encrypt.checker.sql;
import org.apache.shardingsphere.encrypt.exception.syntax.UnsupportedEncryptSQLException;
import org.apache.shardingsphere.encrypt.rule.EncryptRule;
import org.apache.shardingsphere.encrypt.rule.column.EncryptColumn;
import org.apache.shardingsphere.encrypt.rule.table.EncryptTable;
import org.apache.shardingsphere.infra.binder.context.segment.select.orderby.OrderByItem;
import org.apache.shardingsphere.infra.binder.context.segment.table.TablesContext;
import org.apache.shardingsphere.infra.binder.context.statement.dml.SelectStatementContext;
import org.apache.shardingsphere.infra.database.core.DefaultDatabase;
import org.apache.shardingsphere.infra.database.core.metadata.database.enums.NullsOrderType;
import org.apache.shardingsphere.infra.database.core.type.DatabaseType;
import org.apache.shardingsphere.infra.metadata.database.schema.model.ShardingSphereSchema;
import org.apache.shardingsphere.infra.spi.type.typed.TypedSPILoader;
import org.apache.shardingsphere.sql.parser.statement.core.enums.OrderDirection;
import org.apache.shardingsphere.sql.parser.statement.core.segment.dml.column.ColumnSegment;
import org.apache.shardingsphere.sql.parser.statement.core.segment.dml.order.item.ColumnOrderByItemSegment;
import org.apache.shardingsphere.sql.parser.statement.core.segment.generic.AliasSegment;
import org.apache.shardingsphere.sql.parser.statement.core.segment.generic.OwnerSegment;
import org.apache.shardingsphere.sql.parser.statement.core.segment.generic.table.SimpleTableSegment;
import org.apache.shardingsphere.sql.parser.statement.core.segment.generic.table.TableNameSegment;
import org.apache.shardingsphere.sql.parser.statement.core.value.identifier.IdentifierValue;
import org.junit.jupiter.api.Test;
import java.util.Collections;
import java.util.Optional;
import static org.junit.jupiter.api.Assertions.assertThrows;
import static org.mockito.Mockito.RETURNS_DEEP_STUBS;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;
class EncryptOrderByItemSupportedCheckerTest {
private final DatabaseType databaseType = TypedSPILoader.getService(DatabaseType.class, "FIXTURE");
@Test
void assertCheck() {
assertThrows(UnsupportedEncryptSQLException.class, () -> new EncryptOrderByItemSupportedChecker().check(mockEncryptRule(), mock(ShardingSphereSchema.class), buildSelectStatementContext()));
}
private EncryptRule mockEncryptRule() {
EncryptRule result = mock(EncryptRule.class);
EncryptTable encryptTable = mock(EncryptTable.class);
when(encryptTable.isEncryptColumn("certificate_number")).thenReturn(true);
EncryptColumn encryptColumn = mock(EncryptColumn.class, RETURNS_DEEP_STUBS);
when(encryptColumn.getAssistedQuery()).thenReturn(Optional.empty());
when(encryptTable.getEncryptColumn("certificate_number")).thenReturn(encryptColumn);
when(result.findEncryptTable("t_encrypt")).thenReturn(Optional.of(encryptTable));
return result;
}
private SelectStatementContext buildSelectStatementContext() {
SimpleTableSegment simpleTableSegment = new SimpleTableSegment(new TableNameSegment(0, 0, new IdentifierValue("t_encrypt")));
simpleTableSegment.setAlias(new AliasSegment(0, 0, new IdentifierValue("a")));
ColumnSegment columnSegment = new ColumnSegment(0, 0, new IdentifierValue("certificate_number"));
columnSegment.setOwner(new OwnerSegment(0, 0, new IdentifierValue("a")));
SelectStatementContext result = mock(SelectStatementContext.class, RETURNS_DEEP_STUBS);
when(result.getDatabaseType()).thenReturn(databaseType);
ColumnOrderByItemSegment columnOrderByItemSegment = new ColumnOrderByItemSegment(columnSegment, OrderDirection.ASC, NullsOrderType.FIRST);
OrderByItem orderByItem = new OrderByItem(columnOrderByItemSegment);
when(result.getOrderByContext().getItems()).thenReturn(Collections.singleton(orderByItem));
when(result.getGroupByContext().getItems()).thenReturn(Collections.emptyList());
when(result.getSubqueryContexts().values()).thenReturn(Collections.emptyList());
when(result.getTablesContext()).thenReturn(new TablesContext(Collections.singleton(simpleTableSegment), databaseType, DefaultDatabase.LOGIC_NAME));
return result;
}
}
```
|
```java
/*
* All rights reserved.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions are met:
*
* 1. Redistributions of source code must retain the above copyright notice, this
* list of conditions and the following disclaimer.
* 2. Redistributions in binary form must reproduce the above copyright notice,
* this list of conditions and the following disclaimer in the documentation
* and/or other materials provided with the distribution.
*
* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
* ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
* WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
* DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR
* ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
* (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
* LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
* ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
* (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
* SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package net.runelite.client.plugins.cluescrolls.clues;
import com.google.common.collect.ImmutableMap;
import java.awt.Color;
import java.awt.Graphics2D;
import javax.annotation.Nonnull;
import javax.annotation.Nullable;
import lombok.AccessLevel;
import lombok.Getter;
import lombok.NonNull;
import net.runelite.api.Varbits;
import net.runelite.api.annotations.Varbit;
import net.runelite.api.coords.LocalPoint;
import net.runelite.api.coords.WorldPoint;
import net.runelite.client.plugins.cluescrolls.ClueScrollPlugin;
import static net.runelite.client.plugins.cluescrolls.clues.Enemy.ANCIENT_WIZARDS;
import static net.runelite.client.plugins.cluescrolls.clues.Enemy.ARMADYLEAN_GUARD;
import static net.runelite.client.plugins.cluescrolls.clues.Enemy.ARMADYLEAN_OR_BANDOSIAN_GUARD;
import static net.runelite.client.plugins.cluescrolls.clues.Enemy.BANDOSIAN_GUARD;
import static net.runelite.client.plugins.cluescrolls.clues.Enemy.BRASSICAN_MAGE;
import static net.runelite.client.plugins.cluescrolls.clues.Enemy.SARADOMIN_WIZARD;
import static net.runelite.client.plugins.cluescrolls.clues.Enemy.ZAMORAK_WIZARD;
import net.runelite.client.ui.overlay.OverlayUtil;
import net.runelite.client.ui.overlay.components.LineComponent;
import net.runelite.client.ui.overlay.components.PanelComponent;
import net.runelite.client.ui.overlay.components.TitleComponent;
@Getter
public class CoordinateClue extends ClueScroll implements LocationClueScroll
{
@Getter
private static class CoordinateClueInfo
{
private final String directions;
private final boolean lightRequired;
@Getter(onMethod_ = {@Varbit})
private final int lightSourceVarbitId;
private final Enemy enemy;
private CoordinateClueInfo(@NonNull String directions)
{
this(directions, null);
}
private CoordinateClueInfo(@NonNull String directions, Enemy enemy)
{
this.directions = directions;
this.enemy = enemy;
this.lightRequired = false;
this.lightSourceVarbitId = -1;
}
private CoordinateClueInfo(@Nonnull String directions, Enemy enemy, boolean lightRequired, @Varbit int lightSourceVarbitId)
{
this.directions = directions;
this.enemy = enemy;
this.lightRequired = lightRequired;
this.lightSourceVarbitId = lightSourceVarbitId;
}
}
static final ImmutableMap<WorldPoint, CoordinateClueInfo> CLUES = new ImmutableMap.Builder<WorldPoint, CoordinateClueInfo>()
// Medium
.put(new WorldPoint(2479, 3158, 0), new CoordinateClueInfo("South of fruit tree patch, west of Tree Gnome Village."))
.put(new WorldPoint(2887, 3154, 0), new CoordinateClueInfo("West of Banana plantation on Karamja."))
.put(new WorldPoint(2743, 3151, 0), new CoordinateClueInfo("Entrance of Brimhaven dungeon."))
.put(new WorldPoint(3184, 3150, 0), new CoordinateClueInfo("South of Lumbridge Swamp."))
.put(new WorldPoint(3217, 3177, 0), new CoordinateClueInfo("East of Lumbridge Swamp."))
.put(new WorldPoint(3007, 3144, 0), new CoordinateClueInfo("Near the entrance to the Asgarnian Ice Dungeon, south of Port Sarim (AIQ)."))
.put(new WorldPoint(2896, 3119, 0), new CoordinateClueInfo("Near Karambwan fishing spot (DKP)."))
.put(new WorldPoint(2697, 3207, 0), new CoordinateClueInfo("Centre of Moss Giant Island, west of Brimhaven."))
.put(new WorldPoint(2679, 3110, 0), new CoordinateClueInfo("North of Hazelmere's house (CLS)."))
.put(new WorldPoint(3510, 3074, 0), new CoordinateClueInfo("East of Uzer (DLQ)."))
.put(new WorldPoint(3160, 3251, 0), new CoordinateClueInfo("West of trapdoor leading to H.A.M Hideout."))
.put(new WorldPoint(2643, 3252, 0), new CoordinateClueInfo("South of Ardougne Zoo, North of Tower of Life (DJP)."))
.put(new WorldPoint(2322, 3061, 0), new CoordinateClueInfo("South-west of Castle wars (BKP)."))
.put(new WorldPoint(2875, 3046, 0), new CoordinateClueInfo("North of nature altar, north of Shilo Village (CKR)."))
.put(new WorldPoint(2849, 3033, 0), new CoordinateClueInfo("West of nature altar, north of Shilo Village (CKR)."))
.put(new WorldPoint(2848, 3296, 0), new CoordinateClueInfo("North of Crandor island."))
.put(new WorldPoint(2583, 2990, 0), new CoordinateClueInfo("Feldip Hills, south-east of Gu'Thanoth (AKS)."))
.put(new WorldPoint(3179, 3344, 0), new CoordinateClueInfo("In the cow pen north of the Lumbridge windmill."))
.put(new WorldPoint(2383, 3370, 0), new CoordinateClueInfo("West of the outpost"))
.put(new WorldPoint(3312, 3375, 0), new CoordinateClueInfo("North-west of Exam Centre, on the hill."))
.put(new WorldPoint(3121, 3384, 0), new CoordinateClueInfo("North-east of Draynor Manor, near River Lum."))
.put(new WorldPoint(3430, 3388, 0), new CoordinateClueInfo("West of Mort Myre Swamp (BKR)."))
.put(new WorldPoint(2920, 3403, 0), new CoordinateClueInfo("South-east of Taverley, near Lady of the Lake."))
.put(new WorldPoint(2594, 2899, 0), new CoordinateClueInfo("South-east of Feldip Hills, by the crimson swifts (AKS)."))
.put(new WorldPoint(2387, 3435, 0), new CoordinateClueInfo("West of Tree Gnome Stronghold, near the pen containing terrorbirds."))
.put(new WorldPoint(2512, 3467, 0), new CoordinateClueInfo("Baxtorian Falls (Bring rope)."))
.put(new WorldPoint(2381, 3468, 0), new CoordinateClueInfo("West of Tree Gnome Stronghold, north of the pen with terrorbirds."))
.put(new WorldPoint(3005, 3475, 0), new CoordinateClueInfo("Ice Mountain, west of Edgeville Monastery."))
.put(new WorldPoint(2585, 3505, 0), new CoordinateClueInfo("By the shore line north of the Coal Trucks."))
.put(new WorldPoint(3443, 3515, 0), new CoordinateClueInfo("South of Slayer Tower (CKS)."))
.put(new WorldPoint(2416, 3516, 0), new CoordinateClueInfo("Tree Gnome Stronghold, west of Grand Tree, near swamp."))
.put(new WorldPoint(3429, 3523, 0), new CoordinateClueInfo("South of Slayer Tower (CKS)."))
.put(new WorldPoint(2363, 3531, 0), new CoordinateClueInfo("North-east of Eagles' Peak (AKQ)."))
.put(new WorldPoint(2919, 3535, 0), new CoordinateClueInfo("East of Burthorpe pub."))
.put(new WorldPoint(3548, 3560, 0), new CoordinateClueInfo("Inside Fenkenstrain's Castle."))
.put(new WorldPoint(1476, 3566, 0), new CoordinateClueInfo("Graveyard of Heroes in west Shayzien."))
.put(new WorldPoint(2735, 3638, 0), new CoordinateClueInfo("East of Rellekka, north-west of Golden Apple Tree (AJR)."))
.put(new WorldPoint(2681, 3653, 0), new CoordinateClueInfo("Rellekka, in the garden of the south-east house."))
.put(new WorldPoint(2537, 3881, 0), new CoordinateClueInfo("Miscellania (CIP)."))
.put(new WorldPoint(2828, 3234, 0), new CoordinateClueInfo("Southern coast of Crandor."))
.put(new WorldPoint(1247, 3726, 0), new CoordinateClueInfo("Just inside the Farming Guild"))
.put(new WorldPoint(3770, 3898, 0), new CoordinateClueInfo("On the small island north-east of Fossil Island's mushroom forest."))
.put(new WorldPoint(1659, 3111, 0), new CoordinateClueInfo("Dig west of the Bazaar in Civitas illa Fortis."))
// Hard
.put(new WorldPoint(2209, 3161, 0), new CoordinateClueInfo("North-east of Tyras Camp (BJS if 76 Agility).", SARADOMIN_WIZARD))
.put(new WorldPoint(2181, 3206, 0), new CoordinateClueInfo("South of Iorwerth Camp.", SARADOMIN_WIZARD))
.put(new WorldPoint(3081, 3209, 0), new CoordinateClueInfo("Small Island (CLP).", SARADOMIN_WIZARD))
.put(new WorldPoint(3399, 3246, 0), new CoordinateClueInfo("Behind the PvP Arena."))
.put(new WorldPoint(2699, 3251, 0), new CoordinateClueInfo("Little island (AIR).", SARADOMIN_WIZARD))
.put(new WorldPoint(3546, 3251, 0), new CoordinateClueInfo("North-east of Burgh de Rott.", SARADOMIN_WIZARD))
.put(new WorldPoint(3544, 3256, 0), new CoordinateClueInfo("North-east of Burgh de Rott.", SARADOMIN_WIZARD))
.put(new WorldPoint(2841, 3267, 0), new CoordinateClueInfo("Crandor island.", SARADOMIN_WIZARD))
.put(new WorldPoint(3168, 3041, 0), new CoordinateClueInfo("Bedabin Camp.", SARADOMIN_WIZARD))
.put(new WorldPoint(2542, 3031, 0), new CoordinateClueInfo("Gu'Tanoth, may require 20gp.", SARADOMIN_WIZARD))
.put(new WorldPoint(2581, 3030, 0), new CoordinateClueInfo("Gu'Tanoth island, enter cave north-west of Feldip Hills (AKS).", SARADOMIN_WIZARD))
.put(new WorldPoint(2961, 3024, 0), new CoordinateClueInfo("Ship yard (DKP).", SARADOMIN_WIZARD))
.put(new WorldPoint(2339, 3311, 0), new CoordinateClueInfo("East of Prifddinas on Arandar mountain pass.", SARADOMIN_WIZARD))
.put(new WorldPoint(3440, 3341, 0), new CoordinateClueInfo("Nature Spirit's grotto (BIP).", SARADOMIN_WIZARD))
.put(new WorldPoint(2763, 2974, 0), new CoordinateClueInfo("Cairn Isle, west of Shilo Village (CKR).", SARADOMIN_WIZARD))
.put(new WorldPoint(3138, 2969, 0), new CoordinateClueInfo("West of Bandit Camp in Kharidian Desert.", SARADOMIN_WIZARD))
.put(new WorldPoint(2924, 2963, 0), new CoordinateClueInfo("On the southern part of eastern Karamja, west of the gnome glider.", SARADOMIN_WIZARD))
.put(new WorldPoint(2838, 2914, 0), new CoordinateClueInfo("Kharazi Jungle, near water pool (CKR).", SARADOMIN_WIZARD))
.put(new WorldPoint(3441, 3419, 0), new CoordinateClueInfo("Mort Myre Swamp (BKR).", SARADOMIN_WIZARD))
.put(new WorldPoint(2950, 2902, 0), new CoordinateClueInfo("South-east of Kharazi Jungle.", SARADOMIN_WIZARD))
.put(new WorldPoint(2775, 2891, 0), new CoordinateClueInfo("South-west of Kharazi Jungle.", SARADOMIN_WIZARD))
.put(new WorldPoint(3113, 3602, 0), new CoordinateClueInfo("Wilderness. South-west of Ferox Enclave (level 11).", ZAMORAK_WIZARD))
.put(new WorldPoint(2892, 3675, 0), new CoordinateClueInfo("On the summit of Trollheim.", SARADOMIN_WIZARD))
.put(new WorldPoint(3168, 3677, 0), new CoordinateClueInfo("Wilderness. Graveyard of Shadows.", ZAMORAK_WIZARD))
.put(new WorldPoint(2853, 3690, 0), new CoordinateClueInfo("Entrance to the troll Stronghold.", SARADOMIN_WIZARD))
.put(new WorldPoint(3305, 3692, 0), new CoordinateClueInfo("Wilderness. West of eastern green dragons.", ZAMORAK_WIZARD))
.put(new WorldPoint(3055, 3696, 0), new CoordinateClueInfo("Wilderness. Bandit Camp.", ZAMORAK_WIZARD))
.put(new WorldPoint(3302, 3696, 0), new CoordinateClueInfo("Wilderness. West of eastern green dragons.", ZAMORAK_WIZARD))
.put(new WorldPoint(1479, 3699, 0), new CoordinateClueInfo("Lizardman Canyon (DJR).", SARADOMIN_WIZARD))
.put(new WorldPoint(2712, 3732, 0), new CoordinateClueInfo("North-east of Rellekka (DKS).", SARADOMIN_WIZARD))
.put(new WorldPoint(2970, 3749, 0), new CoordinateClueInfo("Wilderness. Forgotten Cemetery.", ZAMORAK_WIZARD))
.put(new WorldPoint(3094, 3764, 0), new CoordinateClueInfo("Wilderness. Mining site north of Bandit Camp.", ZAMORAK_WIZARD))
.put(new WorldPoint(3311, 3769, 0), new CoordinateClueInfo("Wilderness. South of the Silk Chasm (Venenatis).", ZAMORAK_WIZARD))
.put(new WorldPoint(1460, 3782, 0), new CoordinateClueInfo("Lovakengj, near burning man.", SARADOMIN_WIZARD))
.put(new WorldPoint(3244, 3792, 0), new CoordinateClueInfo("Wilderness. South-east of Lava Dragon Isle by some Chaos Dwarves.", ZAMORAK_WIZARD))
.put(new WorldPoint(3140, 3804, 0), new CoordinateClueInfo("Wilderness. North of black chinchompa hunter area.", ZAMORAK_WIZARD))
.put(new WorldPoint(2946, 3819, 0), new CoordinateClueInfo("Wilderness. Chaos Temple (level 38).", ZAMORAK_WIZARD))
.put(new WorldPoint(3771, 3825, 0), new CoordinateClueInfo("Fossil Island. East of Museum Camp.", SARADOMIN_WIZARD))
.put(new WorldPoint(3013, 3846, 0), new CoordinateClueInfo("Wilderness. West of Lava Maze, before KBD's lair.", ZAMORAK_WIZARD))
.put(new WorldPoint(3058, 3884, 0), new CoordinateClueInfo("Wilderness. Near runite ore north of Lava Maze.", ZAMORAK_WIZARD))
.put(new WorldPoint(3290, 3889, 0), new CoordinateClueInfo("Wilderness. Demonic Ruins.", ZAMORAK_WIZARD))
.put(new WorldPoint(3770, 3897, 0), new CoordinateClueInfo("Small Island north of Fossil Island.", SARADOMIN_WIZARD))
.put(new WorldPoint(2505, 3899, 0), new CoordinateClueInfo("Small Island north-west of Miscellania (AJS).", SARADOMIN_WIZARD))
.put(new WorldPoint(3285, 3942, 0), new CoordinateClueInfo("Wilderness. Rogues' Castle.", ZAMORAK_WIZARD))
.put(new WorldPoint(3159, 3959, 0), new CoordinateClueInfo("Wilderness. North of Deserted Keep, west of Resource Area.", ZAMORAK_WIZARD))
.put(new WorldPoint(3039, 3960, 0), new CoordinateClueInfo("Wilderness. Pirates' Hideout.", ZAMORAK_WIZARD))
.put(new WorldPoint(2987, 3963, 0), new CoordinateClueInfo("Wilderness. West of Wilderness Agility Course.", ZAMORAK_WIZARD))
.put(new WorldPoint(3189, 3963, 0), new CoordinateClueInfo("Wilderness. North of Resource Area, near magic axe hut.", ZAMORAK_WIZARD))
.put(new WorldPoint(2341, 3697, 0), new CoordinateClueInfo("North-east of the Piscatoris Fishing Colony bank.", SARADOMIN_WIZARD))
.put(new WorldPoint(3143, 3774, 0), new CoordinateClueInfo("In level 32 Wilderness, by the black chinchompa hunting area.", ZAMORAK_WIZARD))
.put(new WorldPoint(2970, 3913, 0), new CoordinateClueInfo("Frozen Waste Plateau, south-west of Wilderness Agility Course.", ZAMORAK_WIZARD))
.put(new WorldPoint(1410, 3611, 0), new CoordinateClueInfo("Lake Molch dock west of Shayzien Encampment.", SARADOMIN_WIZARD))
.put(new WorldPoint(1409, 3483, 0), new CoordinateClueInfo("South of Shayziens' Wall.", SARADOMIN_WIZARD))
// Elite
.put(new WorldPoint(2357, 3151, 0), new CoordinateClueInfo("Lletya.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(3587, 3180, 0), new CoordinateClueInfo("Meiyerditch.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(2820, 3078, 0), new CoordinateClueInfo("Tai Bwo Wannai. Hardwood Grove. 100 Trading sticks or elite Karamja diary completion is needed to enter.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(3811, 3060, 0), new CoordinateClueInfo("Small island north-east of Mos Le'Harmless.", ARMADYLEAN_OR_BANDOSIAN_GUARD, true, Varbits.FIRE_PIT_MOS_LE_HARMLESS))
.put(new WorldPoint(2180, 3282, 0), new CoordinateClueInfo("North of Iorwerth Camp.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(2870, 2997, 0), new CoordinateClueInfo("North-east corner in Shilo Village.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(3302, 2988, 0), new CoordinateClueInfo("On top of a cliff to the west of Pollnivneach.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(2511, 2980, 0), new CoordinateClueInfo("Just south of Gu'Tanoth, west of gnome glider (AKS).", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(2732, 3372, 0), new CoordinateClueInfo("Legends' Guild.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(3573, 3425, 0), new CoordinateClueInfo("North of Dessous's tomb from Desert Treasure.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(3828, 2848, 0), new CoordinateClueInfo("East of Harmony Island.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(3225, 2838, 0), new CoordinateClueInfo("South of Desert Treasure pyramid.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(1773, 3510, 0), new CoordinateClueInfo("Ruins north of the Hosidius mine.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(3822, 3562, 0), new CoordinateClueInfo("North-east of Dragontooth Island. Bring a Ghostspeak Amulet and 25 Ecto-tokens to reach the island.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(3603, 3564, 0), new CoordinateClueInfo("North of the wrecked ship, outside of Port Phasmatys.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(2936, 2721, 0), new CoordinateClueInfo("Eastern shore of Crash Island.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(2697, 2705, 0), new CoordinateClueInfo("South-west of Ape Atoll.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(2778, 3678, 0), new CoordinateClueInfo("Mountain Camp.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(2827, 3740, 0), new CoordinateClueInfo("West of the entrance to the Ice Path, where the Troll child resides.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(2359, 3799, 0), new CoordinateClueInfo("Neitiznot.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(2194, 3807, 0), new CoordinateClueInfo("Pirates' Cove.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(2700, 3808, 0), new CoordinateClueInfo("Northwestern part of the Trollweiss and Rellekka Hunter area (DKS).", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(3215, 3835, 0), new CoordinateClueInfo("Wilderness. Lava Dragon Isle.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(3369, 3894, 0), new CoordinateClueInfo("Wilderness. Fountain of Rune.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(2065, 3923, 0), new CoordinateClueInfo("Outside the western wall on Lunar Isle.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(3188, 3933, 0), new CoordinateClueInfo("Wilderness. Resource Area. An entry fee of 7,500 coins is required, or less if Wilderness Diaries have been completed.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(3043, 3940, 0), new CoordinateClueInfo("Wilderness. South of Pirates' Hideout.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(3380, 3963, 0), new CoordinateClueInfo("Wilderness. North of Volcano.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(3051, 3736, 0), new CoordinateClueInfo("East of the Wilderness Obelisk in 28 Wilderness.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(2316, 3814, 0), new CoordinateClueInfo("West of Neitiznot, near the bridge.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(2872, 3937, 0), new CoordinateClueInfo("Weiss.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(2484, 4016, 0), new CoordinateClueInfo("Northeast corner of the Island of Stone.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
			.put(new WorldPoint(2222, 3331, 0), new CoordinateClueInfo("Prifddinas, west of the Tower of Voices.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(3560, 3987, 0), new CoordinateClueInfo("Lithkren. Digsite pendant teleport if unlocked, otherwise take rowboat from west of Mushroom Meadow Mushtree.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
.put(new WorldPoint(2318, 2954, 0), new CoordinateClueInfo("North-east corner of the Isle of Souls (BJP).", BANDOSIAN_GUARD))
.put(new WorldPoint(2094, 2889, 0), new CoordinateClueInfo("West side of the Isle of Souls.", ARMADYLEAN_GUARD))
.put(new WorldPoint(1451, 3509, 0), new CoordinateClueInfo("Ruins of Morra.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
			.put(new WorldPoint(3318, 2706, 0), new CoordinateClueInfo("Necropolis mine.", ARMADYLEAN_OR_BANDOSIAN_GUARD))
			.put(new WorldPoint(1557, 3183, 0), new CoordinateClueInfo("North of Ortus Farm.", ARMADYLEAN_GUARD))
// Master
.put(new WorldPoint(2178, 3209, 0), new CoordinateClueInfo("South of Iorwerth Camp.", BRASSICAN_MAGE))
.put(new WorldPoint(2155, 3100, 0), new CoordinateClueInfo("South of Port Tyras (BJS if 76 Agility).", BRASSICAN_MAGE))
.put(new WorldPoint(2217, 3092, 0), new CoordinateClueInfo("Poison Waste island (DLR).", BRASSICAN_MAGE))
.put(new WorldPoint(3830, 3060, 0), new CoordinateClueInfo("Small island located north-east of Mos Le'Harmless.", BRASSICAN_MAGE, true, Varbits.FIRE_PIT_MOS_LE_HARMLESS))
.put(new WorldPoint(2834, 3271, 0), new CoordinateClueInfo("Crandor island.", BRASSICAN_MAGE))
.put(new WorldPoint(2732, 3284, 0), new CoordinateClueInfo("Witchaven.", BRASSICAN_MAGE))
.put(new WorldPoint(3622, 3320, 0), new CoordinateClueInfo("Meiyerditch. Outside mine.", BRASSICAN_MAGE))
.put(new WorldPoint(2303, 3328, 0), new CoordinateClueInfo("East of Prifddinas.", BRASSICAN_MAGE))
.put(new WorldPoint(3570, 3405, 0), new CoordinateClueInfo("North of Dessous's tomb from Desert Treasure.", BRASSICAN_MAGE))
.put(new WorldPoint(2840, 3423, 0), new CoordinateClueInfo("Water Obelisk Island.", BRASSICAN_MAGE))
.put(new WorldPoint(3604, 3564, 0), new CoordinateClueInfo("North of the wrecked ship, outside of Port Phasmatys (ALQ).", BRASSICAN_MAGE))
.put(new WorldPoint(3085, 3569, 0), new CoordinateClueInfo("Wilderness. Obelisk of Air.", BRASSICAN_MAGE))
.put(new WorldPoint(2934, 2727, 0), new CoordinateClueInfo("Eastern shore of Crash Island.", BRASSICAN_MAGE))
.put(new WorldPoint(1451, 3695, 0), new CoordinateClueInfo("West side of Lizardman Canyon with Lizardman shaman.", ANCIENT_WIZARDS))
.put(new WorldPoint(2538, 3739, 0), new CoordinateClueInfo("Waterbirth Island. Bring a pet rock and rune thrownaxe OR have 85 agility.", BRASSICAN_MAGE))
.put(new WorldPoint(1248, 3751, 0), new CoordinateClueInfo("In the north wing of the Farming Guild.", BRASSICAN_MAGE))
.put(new WorldPoint(1698, 3792, 0), new CoordinateClueInfo("Arceuus church.", ANCIENT_WIZARDS))
.put(new WorldPoint(2951, 3820, 0), new CoordinateClueInfo("Wilderness. Chaos Temple (level 38).", ANCIENT_WIZARDS))
.put(new WorldPoint(2202, 3825, 0), new CoordinateClueInfo("Pirates' Cove, between Lunar Isle and Rellekka.", ANCIENT_WIZARDS))
.put(new WorldPoint(1761, 3853, 0), new CoordinateClueInfo("Arceuus essence mine (CIS).", BRASSICAN_MAGE))
.put(new WorldPoint(2090, 3863, 0), new CoordinateClueInfo("South of Lunar Isle, west of Astral altar.", ANCIENT_WIZARDS))
.put(new WorldPoint(1442, 3878, 0), new CoordinateClueInfo("Northern area of the Lovakengj Sulphur Mine. Facemask or Slayer Helmet recommended.", BRASSICAN_MAGE))
.put(new WorldPoint(3380, 3929, 0), new CoordinateClueInfo("Wilderness. Near Volcano.", ANCIENT_WIZARDS))
.put(new WorldPoint(3188, 3939, 0), new CoordinateClueInfo("Wilderness. Resource Area.", BRASSICAN_MAGE))
.put(new WorldPoint(3304, 3941, 0), new CoordinateClueInfo("Wilderness. East of Rogues' Castle.", ANCIENT_WIZARDS))
.put(new WorldPoint(3028, 3928, 0), new CoordinateClueInfo("Wilderness. South-east of Agility Training Area.", BRASSICAN_MAGE))
			.put(new WorldPoint(1769, 3418, 0), new CoordinateClueInfo("Crabclaw Isle.", ANCIENT_WIZARDS))
.build();
private final String text;
@Getter(AccessLevel.PRIVATE)
private final WorldPoint location;
/**
* For regions which are mirrored, the location of the clue in the mirrored region.
*/
@Nullable
private final WorldPoint mirrorLocation;
public CoordinateClue(String text, WorldPoint location, WorldPoint mirrorLocation)
{
this.text = text;
this.location = location;
this.mirrorLocation = mirrorLocation;
final CoordinateClueInfo clueInfo = CLUES.get(location);
if (clueInfo != null)
{
setFirePitVarbitId(clueInfo.getLightSourceVarbitId());
setRequiresLight(clueInfo.lightRequired);
setEnemy(clueInfo.getEnemy());
}
setRequiresSpade(true);
}
@Override
public WorldPoint getLocation(ClueScrollPlugin plugin)
{
return location;
}
@Override
public WorldPoint[] getLocations(ClueScrollPlugin plugin)
{
if (mirrorLocation != null)
{
return new WorldPoint[]{location, mirrorLocation};
}
else
{
return new WorldPoint[]{location};
}
}
@Override
public void makeOverlayHint(PanelComponent panelComponent, ClueScrollPlugin plugin)
{
panelComponent.getChildren().add(TitleComponent.builder().text("Coordinate Clue").build());
final CoordinateClueInfo solution = CLUES.get(location);
if (solution != null)
{
panelComponent.getChildren().add(LineComponent.builder()
.left(solution.getDirections())
.build());
panelComponent.getChildren().add(LineComponent.builder().build());
}
panelComponent.getChildren().add(LineComponent.builder()
.left("Click the clue scroll on your world map to see dig location.")
.build());
renderOverlayNote(panelComponent, plugin);
}
@Override
public void makeWorldOverlayHint(Graphics2D graphics, ClueScrollPlugin plugin)
{
for (WorldPoint worldPoint : getLocations(plugin))
{
LocalPoint localLocation = LocalPoint.fromWorld(plugin.getClient(), worldPoint);
if (localLocation != null)
{
OverlayUtil.renderTileOverlay(plugin.getClient(), graphics, localLocation, plugin.getSpadeImage(), Color.ORANGE);
}
}
}
@Override
public int[] getConfigKeys()
{
return new int[]{location.hashCode()};
}
}
```
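The pattern in the Java fragment above — an immutable coordinate-to-info table consulted at construction time, plus a `getLocations` helper that appends a mirror location for mirrored regions — can be sketched in Python. The two entries are copied from the table above, but the helper names here are hypothetical, not part of the RuneLite source:

```python
# Hypothetical sketch of CoordinateClue's lookup pattern: an immutable map
# from world coordinates (x, y, plane) to hint text, and a helper that
# returns every dig tile, including the mirror location when one exists.
from types import MappingProxyType

# Two sample entries copied from the table above.
CLUES = MappingProxyType({
    (2892, 3675, 0): "On the summit of Trollheim.",
    (3380, 3929, 0): "Wilderness. Near Volcano.",
})

def get_locations(location, mirror_location=None):
    """Mirror of CoordinateClue.getLocations: list every tile to highlight."""
    if mirror_location is not None:
        return [location, mirror_location]
    return [location]
```

As in the Java original, a lookup miss simply yields no directions (the overlay still renders the dig tile), and `MappingProxyType` plays the role of Guava's `ImmutableMap` in keeping the table read-only.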
|
The Zhengding Airport railway station is a station on the Beijing–Guangzhou–Shenzhen–Hong Kong high-speed railway. It is located in Xinchengpu Town in Zhengding County, Shijiazhuang, Hebei Province, near Shijiazhuang Zhengding International Airport.
History
The station was opened on 26 December 2012, together with the Beijing–Zhengzhou section of the Beijing–Guangzhou–Shenzhen–Hong Kong high-speed railway.
During the first three months of 2013, passengers flying to Shijiazhuang were provided with free train tickets to stations such as and . Passengers arriving at the airport by train and leaving by plane could have the cost of their train tickets reimbursed as well.
Over 150,000 passengers used the promotion in 2013.
Location
Although named after Shijiazhuang Zhengding International Airport, the station is not adjacent to the airport. It is located to the southeast of the airport and is connected to the airport's main passenger terminals by a dedicated expressway. Shuttle bus services connect the station with the airport, with a trip of about 3–5 minutes.
It takes under 15 minutes for a D- or G-series train to travel from the station to . The travelling time to Beijing is around 1 hour to 1 hour 30 minutes.
Station layout
The station has two side platforms and four tracks. Platform 1 (the northern platform) is used by trains towards , while Platform 2 (the southern platform) is used by trains towards . The station building, which houses ticket offices, waiting halls, restaurants and other facilities, is located to the north of the platforms.
References
Railway stations in Hebei
Airport railway stations in China
Railway stations in China opened in 2012
|
Rattan Mohan Sharma (born 14 June 1971) is an Indian classical vocalist belonging to the Mewati gharana. He performs classical forms such as khyal and tarana, light classical forms such as Haveli Sangeet, Tappa and Bhajan, and Rajasthani folk music. He is rated an "A" grade artist by All India Radio.
Early life and training
Sharma was born in Rajasthan to Padma and Mohan Lal Sharma. He is the nephew and a disciple of the classical vocalist Pandit Jasraj. His affinity for percussion instruments in his youth led him to practice tabla up to the age of 15. Over the years, he has trained under Pandit Jasraj.
Career
He belongs to the Mewati gharana and to a family of vocalists that includes Motiram, Maniram, Pratap Narayan, and Jasraj. He has performed in many concerts and festivals in India and abroad. As a playback singer, he performed in the mythological film Dashavatar (2009).
He performs regularly at the classical music festival Pandit Motiram Pandit Maniram Sangeet Samaroh organized by Jasraj.
Awards and titles
Shankar Rao Vyas Award
Pandit Jasraj rotating Trophy
Mewati Gharana Gaurav Puraskar
Acharya Varishtha (title)
Sur Ratna (title)
Sur Mañi (title)
IWAF (gharana award)
Rajasthan Gaurav Samman
Marwar Ratna
Badshah-e-Tarana (title by the people of Hyderabad)
Kala Sarathi (title by Sri Sri Ravi Shankar)
Personal life
Sharma is married to Ekta Sharma and has a son, Swar Sharma.
Discography
Devotional albums include
Mere Bhagwan - Shri Ramji
Mere Bhagwan - Mere Guru
Mere Bhagwan - Shri Hanumanji
Mere Bhagwan - Shri Saibaba
Mere Bhagwan - Shri Ganeshji
Mere Bhagwan - Shri Krishnaji
Mere Bhagwan - Shri Vishnuji
Mere Bhagwan - Gayatri Maa
Mere Bhagwan - Durga Maa
Jaago bhor bhayi
Dashavatar
Naman
Gayatri
Gayatri Aradhana
Hanuman Raksha Kavach
Navagraha Shakti
Bhaktamar Stotra
Navkar
Dharohar (Hindustani Classical)
Jasrangi (With Pta. Gargee Siddhant Dutta)
Mewati Gharana
Utsav (DVD with Shankar Mahadevan)
Hanuman I
Hanuman II
Hanuman Dhun
Shiv Dhun
Healing Mantras - Heart
Lalita Shahastra Naam
Gayatri Shahastra Naam
Jai Gangey
Sapta vaar katha
Healing Mantras- Respiration
The Holy Trinity
Nav Durga
Satyanarayan Katha
Surya
Vaastu
Temple Music of India (Haveli Sangeet)
Gopal
Krishna Leela
Sama (Gujarati Garba, in praise of The Aga Khan)
Maha Mrityunjay
Sampoorna Maha Mrityunjay
Shri Raam
Raam Dhun
Holi - The Great Radha Krishna Celebration
Spiritual Mantras Vol 1
Jago Matt Shubh Prabhat Aaya
References
External links
Hindustani singers
1971 births
Living people
People from Rajasthan
Mewati gharana
Performers of Hindu music
Indian male singers
Bhajan singers
|
```yaml
name: PyPI_Packaging
# Trigger this build whenever the dev or release branches are updated,
# or on pull requests that affect PyPI-related files.
#
# Ideally we'd run this pipeline for all pull requests, but doing so consumes
# our limited number of slots on Azure and almost always just duplicates the
# build done in the main pipeline.
trigger:
branches:
include:
- dev
- release
pr:
branches:
include:
- dev
- release
paths:
include:
- azure-pypi-pipeline.yml
- build_scripts/pypi/*.py
- build_scripts/pypi/package_files/*
# Azure does not support an empty string as a default argument, so we use a
# special 'none' string for post_release_tag.
parameters:
- name: post_release_tag
displayName: Post Release Version Tag
type: string
default: 'none'
variables:
- name: post_release_tag_arg
${{ if eq(parameters.post_release_tag, 'none') }}:
value: ''
${{ else }}:
value: ${{ format('--post-release-tag {0}', parameters.post_release_tag) }}
stages:
- stage: Build_and_Package
jobs:
- job: Linux
strategy:
matrix:
Python36:
PYTHON_INTERPRETER: /opt/python/cp36-cp36m/bin/python
PYTHON_TAG: cp36
Python37:
PYTHON_INTERPRETER: /opt/python/cp37-cp37m/bin/python
PYTHON_TAG: cp37
Python38:
PYTHON_INTERPRETER: /opt/python/cp38-cp38/bin/python
PYTHON_TAG: cp38
Python39:
PYTHON_INTERPRETER: /opt/python/cp39-cp39/bin/python
PYTHON_TAG: cp39
Python310:
PYTHON_INTERPRETER: /opt/python/cp310-cp310/bin/python
PYTHON_TAG: cp310
Python311:
PYTHON_INTERPRETER: /opt/python/cp311-cp311/bin/python
PYTHON_TAG: cp311
Python312:
PYTHON_INTERPRETER: /opt/python/cp312-cp312/bin/python
PYTHON_TAG: cp312
timeoutInMinutes: 90
pool:
vmImage: Ubuntu-20.04
steps:
- bash: |
docker build -t manylinuxwithcmake build_scripts/pypi/docker
docker run --name usdmanylinux --rm -id -v $(Build.SourcesDirectory):/opt/USD -v /home/vsts/dist:/opt/USD-dist manylinuxwithcmake
displayName: 'Creating docker build environment'
- bash: |
# Terrible, terrible hack. The manylinux Docker image used to build the
# Python wheel does not include the corresponding Python shared library
# to link against. path_to_url#libpythonx-y-so-1
# describes why this is so. However, the FindPython CMake module used
# by USD's build system requires that the library exists and will error
# out otherwise, even though we explicitly avoid linking against Python
# via the PXR_PY_UNDEFINED_DYNAMIC_LOOKUP flag.
#
# To work around this, we create a dummy file for the library using
# the same logic as build_usd.py to determine where the library should
# exist (see GetPythonInfo). FindPython will see that the library exists
# and allow the build to continue. The file is 100% bogus, but the
# PXR_PY_UNDEFINED_DYNAMIC_LOOKUP flag will ensure that we never try to
# link against this library anyway, so it doesn't matter.
docker exec usdmanylinux $(PYTHON_INTERPRETER) -c "import pathlib,sysconfig; pathlib.Path(sysconfig.get_config_var('LIBDIR'), sysconfig.get_config_var('LDLIBRARY')).touch()"
docker exec usdmanylinux $(PYTHON_INTERPRETER) build_scripts/build_usd.py --build-args USD,"-DPXR_PY_UNDEFINED_DYNAMIC_LOOKUP=ON -DPXR_BUILD_USD_TOOLS=OFF -DPXR_INSTALL_LOCATION=../pxr/pluginfo" --no-materialx --no-imaging --no-examples --no-tutorials --build /opt/USD/gen/build --src /opt/USD/gen/src /opt/USD/inst -v
displayName: 'Building USD'
- bash: |
docker exec usdmanylinux mkdir ./packaging
docker exec usdmanylinux cp -R /opt/USD/inst ./packaging
docker exec usdmanylinux sh -c 'cp build_scripts/pypi/package_files/* ./packaging'
docker exec usdmanylinux sh -c 'cp LICENSE.txt ./packaging'
displayName: "Creating packaging directory"
- bash: |
docker exec -w /opt/USD/packaging usdmanylinux $(PYTHON_INTERPRETER) setup.py $(post_release_tag_arg) bdist_wheel --python-tag ${PYTHON_TAG}
displayName: 'Running setup.py'
- bash: |
docker exec usdmanylinux /bin/bash -c 'PYTHONPATH=/opt/USD/packaging/pypi/lib/python LD_LIBRARY_PATH=/opt/USD/packaging/pypi/lib:$LD_LIBRARY_PATH auditwheel repair packaging/dist/*.whl'
displayName: 'Running auditwheel repair (moves .so files into package)'
- bash: |
WHEEL_PACKAGE_NAME=`docker exec usdmanylinux ls wheelhouse`
docker exec usdmanylinux $(PYTHON_INTERPRETER) build_scripts/pypi/updatePluginfos.py "wheelhouse/$WHEEL_PACKAGE_NAME" "/opt/USD-dist/$WHEEL_PACKAGE_NAME"
displayName: 'Updating pluginfo paths'
- bash: |
docker stop usdmanylinux
displayName: 'Stopping docker container'
- task: PublishPipelineArtifact@0
inputs:
artifactName: dist-linux-$(PYTHON_TAG)
targetPath: /home/vsts/dist
- job: Windows
strategy:
matrix:
Python36:
PYTHON_VERSION_SPEC: 3.6
PYTHON_TAG: cp36
Python37:
PYTHON_VERSION_SPEC: 3.7
PYTHON_TAG: cp37
Python38:
PYTHON_VERSION_SPEC: 3.8
PYTHON_TAG: cp38
Python39:
PYTHON_VERSION_SPEC: 3.9
PYTHON_TAG: cp39
Python310:
PYTHON_VERSION_SPEC: 3.10
PYTHON_TAG: cp310
Python311:
PYTHON_VERSION_SPEC: 3.11
PYTHON_TAG: cp311
Python312:
PYTHON_VERSION_SPEC: 3.12
PYTHON_TAG: cp312
timeoutInMinutes: 90
pool:
vmImage: 'windows-2019'
steps:
- task: UsePythonVersion@0
inputs:
versionSpec: $(PYTHON_VERSION_SPEC)
addToPath: true
- script: |
call C:\"Program Files (x86)"\"Microsoft Visual Studio"\2019\Enterprise\VC\Auxiliary\Build\vcvars64.bat
set BOOST_ROOT=
python --version
python build_scripts/build_usd.py --build-args USD,"-DPXR_ENABLE_PRECOMPILED_HEADERS=OFF -DPXR_PY_UNDEFINED_DYNAMIC_LOOKUP=ON -DPXR_BUILD_USD_TOOLS=OFF -DPXR_INSTALL_LOCATION=../pxr/pluginfo" --no-materialx --no-imaging --no-examples --no-tutorials --build %HOME%/USDgen/build --src %HOME%/USDgen/src %HOME%/USDinst -v
displayName: 'Building USD'
- script: |
dir
mkdir D:\packaging
xcopy /E /I %HOME%\USDinst D:\packaging\inst
copy build_scripts\pypi\package_files\* D:\packaging
copy LICENSE.txt D:\packaging
dir D:\packaging
dir D:\packaging\inst
displayName: "Creating packaging directory"
- script: |
python -m pip install wheel setuptools
displayName: 'Installing python packages'
- script: |
cd D:\packaging
dir
python setup.py $(post_release_tag_arg) bdist_wheel --python-tag $(PYTHON_TAG) --plat-name win_amd64
displayName: 'Running setup.py'
- task: PublishPipelineArtifact@0
inputs:
artifactName: dist-windows-$(PYTHON_TAG)
targetPath: D:\packaging\dist
- job: Mac
strategy:
matrix:
Python36:
PYTHON_VERSION_SPEC: 3.6
PYTHON_INTERPRETER: python3.6
PYTHON_TAG: cp36
Python37:
PYTHON_VERSION_SPEC: 3.7
PYTHON_INTERPRETER: python3.7
PYTHON_TAG: cp37
Python38:
PYTHON_VERSION_SPEC: 3.8
PYTHON_INTERPRETER: python3.8
PYTHON_TAG: cp38
Python39:
PYTHON_VERSION_SPEC: 3.9
PYTHON_INTERPRETER: python3.9
PYTHON_TAG: cp39
Python310:
PYTHON_VERSION_SPEC: 3.10
PYTHON_INTERPRETER: python3.10
PYTHON_TAG: cp310
Python311:
PYTHON_VERSION_SPEC: 3.11
PYTHON_INTERPRETER: python3.11
PYTHON_TAG: cp311
Python312:
PYTHON_VERSION_SPEC: 3.12
PYTHON_INTERPRETER: python3.12
PYTHON_TAG: cp312
timeoutInMinutes: 180
pool:
vmImage: 'macOS-12'
steps:
- task: UsePythonVersion@0
inputs:
versionSpec: $(PYTHON_VERSION_SPEC)
addToPath: true
- script: |
sudo xcode-select -s /Applications/Xcode_13.2.app/Contents/Developer
$(PYTHON_INTERPRETER) build_scripts/build_usd.py --build-args USD,"-DPXR_PY_UNDEFINED_DYNAMIC_LOOKUP=ON -DPXR_BUILD_USD_TOOLS=OFF -DPXR_INSTALL_LOCATION=../pluginfo" --no-materialx --no-imaging --no-examples --no-tutorials --generator Xcode --build-target universal --build $HOME/USDgen/build --src $HOME/USDgen/src $HOME/USDinst -v
displayName: 'Building USD'
- bash: |
$(PYTHON_INTERPRETER) -m pip install delocate~=0.10.2 wheel setuptools
displayName: 'Installing python packages'
- bash: |
mkdir ./packaging
mkdir ./packaging/inst
cp -R $HOME/USDinst/* ./packaging/inst
cp build_scripts/pypi/package_files/* ./packaging
cp LICENSE.txt ./packaging
displayName: "Creating packaging directory"
- bash: |
cd ./packaging
$(PYTHON_INTERPRETER) setup.py $(post_release_tag_arg) bdist_wheel --python-tag ${PYTHON_TAG} --plat-name macosx_10_9_universal2
displayName: 'Running setup.py'
- bash: |
delocate-wheel -v -w dist-delocated packaging/dist/*
displayName: 'Running delocate (moves shared lib files into package)'
- bash: |
WHEEL_PACKAGE_NAME=`ls ./packaging/dist`
mkdir -p ./dist
$(PYTHON_INTERPRETER) build_scripts/pypi/updatePluginfos.py "./dist-delocated/$WHEEL_PACKAGE_NAME" "./dist/$WHEEL_PACKAGE_NAME"
displayName: 'Updating pluginfo paths'
- task: PublishPipelineArtifact@0
inputs:
artifactName: dist-mac-$(PYTHON_TAG)
targetPath: ./dist
- stage: Deliver
dependsOn: Build_and_Package
jobs:
- job: Collect_Packages
timeoutInMinutes: 15
pool:
vmImage: Ubuntu-20.04
steps:
- task: DownloadPipelineArtifact@2
displayName: 'Downloading all artifacts'
inputs:
source: current
# Not specifying artifact names so we get all of them
downloadPath: '$(Pipeline.Workspace)'
- bash: |
cd $(Pipeline.Workspace)
mkdir dist-final
cp dist*/*.whl dist-final
displayName: 'Collecting whl packages'
- task: PublishPipelineArtifact@0
displayName: 'Publishing in a single download'
inputs:
artifactName: dist
targetPath: '$(Pipeline.Workspace)/dist-final'
# The matrix below is verbose. There is a way to do this with a more n x m
# syntax using a foreach loop kind of construct. For now, opting for the below
# because it seems easier to read. An example of the other approach can be
# found here:
# path_to_url
- stage: Test_Packages
dependsOn: Deliver
jobs:
- job: Test_Install
strategy:
matrix:
Linux_Python36:
PYTHON_VERSION_SPEC: 3.6
IMAGE: 'Ubuntu-20.04'
PYTHON_INTERPRETER: python3
Linux_Python37:
PYTHON_VERSION_SPEC: 3.7
IMAGE: 'Ubuntu-20.04'
PYTHON_INTERPRETER: python3
Linux_Python38:
PYTHON_VERSION_SPEC: 3.8
IMAGE: 'Ubuntu-20.04'
PYTHON_INTERPRETER: python3
Linux_Python39:
PYTHON_VERSION_SPEC: 3.9
IMAGE: 'Ubuntu-20.04'
PYTHON_INTERPRETER: python3
Linux_Python310:
PYTHON_VERSION_SPEC: 3.10
IMAGE: 'Ubuntu-20.04'
PYTHON_INTERPRETER: python3
Linux_Python311:
PYTHON_VERSION_SPEC: 3.11
IMAGE: 'Ubuntu-20.04'
PYTHON_INTERPRETER: python3
Linux_Python312:
PYTHON_VERSION_SPEC: 3.12
IMAGE: 'Ubuntu-20.04'
PYTHON_INTERPRETER: python3
Windows_Python36:
PYTHON_VERSION_SPEC: 3.6
IMAGE: 'windows-2019'
PYTHON_INTERPRETER: python
Windows_Python37:
PYTHON_VERSION_SPEC: 3.7
IMAGE: 'windows-2019'
PYTHON_INTERPRETER: python
Windows_Python38:
PYTHON_VERSION_SPEC: 3.8
IMAGE: 'windows-2019'
PYTHON_INTERPRETER: python
Windows_Python39:
PYTHON_VERSION_SPEC: 3.9
IMAGE: 'windows-2019'
PYTHON_INTERPRETER: python
Windows_Python310:
PYTHON_VERSION_SPEC: 3.10
IMAGE: 'windows-2019'
PYTHON_INTERPRETER: python
Windows_Python311:
PYTHON_VERSION_SPEC: 3.11
IMAGE: 'windows-2019'
PYTHON_INTERPRETER: python
Windows_Python312:
PYTHON_VERSION_SPEC: 3.12
IMAGE: 'windows-2019'
PYTHON_INTERPRETER: python
Mac_Python36:
PYTHON_VERSION_SPEC: 3.6
IMAGE: 'macOS-12'
PYTHON_INTERPRETER: python3
Mac_Python37:
PYTHON_VERSION_SPEC: 3.7
IMAGE: 'macOS-12'
PYTHON_INTERPRETER: python3
Mac_Python38:
PYTHON_VERSION_SPEC: 3.8
IMAGE: 'macOS-12'
PYTHON_INTERPRETER: python3
Mac_Python39:
PYTHON_VERSION_SPEC: 3.9
IMAGE: 'macOS-12'
PYTHON_INTERPRETER: python3
Mac_Python310:
PYTHON_VERSION_SPEC: 3.10
IMAGE: 'macOS-12'
PYTHON_INTERPRETER: python3
Mac_Python311:
PYTHON_VERSION_SPEC: 3.11
IMAGE: 'macOS-12'
PYTHON_INTERPRETER: python3
Mac_Python312:
PYTHON_VERSION_SPEC: 3.12
IMAGE: 'macOS-12'
PYTHON_INTERPRETER: python3
timeoutInMinutes: 10
pool:
vmImage: '$(IMAGE)'
steps:
- task: UsePythonVersion@0
inputs:
versionSpec: $(PYTHON_VERSION_SPEC)
addToPath: true
- task: DownloadPipelineArtifact@2
displayName: 'Downloading dist package'
inputs:
source: current
artifact: dist
downloadPath: '$(Pipeline.Workspace)'
- script: |
$(PYTHON_INTERPRETER) --version
$(PYTHON_INTERPRETER) -m pip install pytest
$(PYTHON_INTERPRETER) -m pip install --no-index --find-links=file://$(Pipeline.Workspace) usd-core
py.test --junitxml TEST-usdinstall.xml build_scripts/pypi/test.py
displayName: "Testing fresh pip install"
- task: PublishTestResults@2
condition: succeededOrFailed()
inputs:
testRunTitle: 'Test results for $(IMAGE) Python $(PYTHON_VERSION_SPEC)'
```
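The "dummy libpython" workaround described in the pipeline's build step can be sketched as a small standalone function. This is an illustrative reconstruction, not part of the pipeline; the function name is hypothetical, and the fallback to the current interpreter's `sysconfig` values mirrors the one-liner the pipeline runs via `docker exec`:

```python
# Sketch of the dummy-libpython workaround: create an empty placeholder
# file where CMake's FindPython expects the Python shared library, so the
# configure step succeeds even though PXR_PY_UNDEFINED_DYNAMIC_LOOKUP
# means the library is never actually linked.
import pathlib
import sysconfig

def touch_dummy_libpython(libdir=None, ldlibrary=None):
    # Default to the running interpreter's config vars, as the pipeline does.
    libdir = libdir or sysconfig.get_config_var("LIBDIR")
    ldlibrary = ldlibrary or sysconfig.get_config_var("LDLIBRARY")
    path = pathlib.Path(libdir, ldlibrary)
    path.parent.mkdir(parents=True, exist_ok=True)
    path.touch(exist_ok=True)  # zero-byte file; FindPython only checks existence
    return path
```

Because the file is empty, any attempt to actually link against it would fail — the workaround is safe only in combination with the undefined-dynamic-lookup flag the comments describe.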
|