The Resource Conservation and Recovery Act (RCRA), enacted in 1976, is the principal federal law in the United States governing the disposal of solid waste and hazardous waste.
History and goals
Congress enacted RCRA to address the increasing problems the nation faced from its growing volume of municipal and industrial waste. RCRA was an amendment of the Solid Waste Disposal Act of 1965. The act set national goals for:
Protecting human health and the natural environment from the potential hazards of waste disposal
Conserving energy and natural resources
Reducing the amount of waste generated through source reduction and recycling
Maintaining environmental health standards
Ensuring that waste is managed in an environmentally sound manner
The RCRA program is a joint federal and state endeavor, with the U.S. Environmental Protection Agency (EPA) providing basic requirements that states then adopt, adapt, and enforce. RCRA is now most widely known for the regulations promulgated under it that set standards for the treatment, storage and disposal of hazardous waste in the United States. However, it also plays an integral role in the management of municipal and industrial waste as well as underground storage tanks.
Implementation
EPA has published waste management regulations, which are codified in Title 40 of the Code of Federal Regulations at parts 239 through 282. Regulations regarding management of hazardous waste begin in part 260. States are authorized to operate their own hazardous waste programs, which must be at least as stringent as federal standards, and are tasked with creating state implementation plans for managing solid waste.
In California, the Department of Toxic Substances Control (DTSC) is the primary authority enforcing the RCRA requirements, as well as the California Hazardous Waste Control Law (HWCL) of 1972.
Provisions
Subtitle A: General Provisions
Congressional Findings; Objectives and National Policy
Definitions
Interstate Cooperation; Application of Act and Integration with Other Acts
Financial Disclosure; Solid Waste Management Information and Guidelines
Subtitle B: Office of Solid Waste; Authorities of the Administrator
Office of Solid Waste and Interagency Coordinating Committee
Authorities of EPA Administrator
Resource Recovery and Conservation Panels; Grants
Annual Report; Office of Ombudsman
Subtitle C: "Cradle to Grave" requirements for hazardous waste
Arguably the most notable provisions of the RCRA statute are included in Subtitle C, which directs EPA to establish controls on the management of hazardous wastes from their point of generation, through their transportation and treatment, storage and/or disposal. Because RCRA requires controls on hazardous waste generators (i.e., sites that generate hazardous waste), transporters, and treatment, storage and disposal facilities (i.e., facilities that ultimately treat/dispose of or recycle the hazardous waste), the overall regulatory framework has become known as the "cradle to grave" system. States are authorized to implement their own hazardous waste programs. The statute imposes stringent recordkeeping and reporting requirements on generators, transporters, and operators of treatment, storage and disposal facilities handling hazardous waste.
Subtitle D: Non-hazardous Solid Wastes
Subtitle D provides criteria for landfills and other solid waste disposal facilities, and prohibits open dumping of solid waste. EPA published its initial standards in 1979 for "sanitary" landfills that receive municipal solid waste. The "solid waste" definition includes garbage (e.g., food containers, coffee grounds), non-recycled household appliances, residue from incinerated automobile tires, refuse such as metal scrap, construction materials, and sludge from industrial and sewage treatment plants and drinking water treatment plants. Subtitle D also exempts certain hazardous wastes from the Subtitle C regulations, such as hazardous wastes from households and from conditionally exempt small quantity generators.
Special wastes
In 1980 Congress designated several kinds of industrial wastes as "special wastes," which are exempt from Subtitle C, including oil and gas exploration and production wastes (such as drill cuttings, produced water, and drilling fluids), coal combustion residuals generated by electric power plants and other industries, mining waste, and cement kiln dust. See Solid Waste Disposal Amendments of 1980.
Subtitle E: Department of Commerce responsibilities
Development of specifications for secondary materials; development of markets for recovered material
Technology promotion
Subtitle F: Federal responsibilities
Application of Federal, State and Local Law to Federal Facilities
Federal procurement
Cooperation with EPA; Applicability of solid waste disposal guidelines to executive agencies
Subtitle G: Miscellaneous provisions
Whistleblower protection. Employees in the United States who believe they were fired or suffered another adverse action related to enforcement of this law have 30 days to file a written complaint with the Occupational Safety and Health Administration.
Citizen Suits; Imminent Hazard suits
Petition for regulations; Public participation
Subtitle H: Research, Development, Demonstration and Information
Research, Demonstrations, Training; Special Studies
Coordination, collection, dissemination of information
Subtitle I: Underground Storage Tanks
Background
The operation of underground storage tanks (USTs) became subject to the RCRA regulatory program with enactment of the Hazardous and Solid Waste Amendments of 1984 (HSWA). At that time there were about 2.1 million tanks subject to federal regulation, and the EPA program led to closure and removal of most substandard tanks. As of 2009 there were approximately 600,000 active USTs at 223,000 sites subject to federal regulation.
Regulatory requirements
The federal UST regulations cover tanks storing petroleum or listed hazardous substances, and define the types of tanks permitted. EPA established a tank notification system to track UST status. UST regulatory programs are principally administered by state and U.S. territorial agencies.
The regulations set standards for:
Groundwater monitoring
Secondary containment
Release detection, prevention and correction
Spill prevention
Overfill prevention (for petroleum products)
Restrictions on land disposal of untreatable hazardous waste products
The Superfund Amendments and Reauthorization Act of 1986 (SARA) required owners and operators of USTs to complete corrective action, including repairing or removing a tank, where necessary to protect human health and the environment. The amendments also established a trust fund to pay for the cleanup of leaking UST sites where responsible parties cannot be identified.
It is also recommended that above-ground storage tanks be used whenever possible.
Subtitle J: Medical Waste (expired)
RCRA Subtitle J regulated medical waste in four states (New York, New Jersey, Connecticut, Rhode Island) and Puerto Rico, and expired on March 22, 1991. (See Medical Waste Tracking Act.) State environmental and health agencies regulate medical waste, rather than EPA. Other federal agencies have issued safety regulations governing the handling of medical waste, including the Centers for Disease Control and Prevention, Occupational Safety and Health Administration, and the Food and Drug Administration.
Amendments and related legislation
Solid Waste Disposal Amendments of 1980
Congress exempted several types of wastes from classification as hazardous under Subtitle C in its 1980 amendment to RCRA. The Solid Waste Disposal Amendments of 1980 designated the following categories as "special wastes" and not subject to the stricter permitting requirements of Subtitle C:
coal combustion residuals (CCR) generated by electric power plants and other industries, including fly ash, bottom ash, slag waste and flue-gas desulfurization wastes
mining waste from ore mines and mineral mines
cement kiln dust
drilling fluid, produced water, and other wastes from oil and gas wells.
These legislative exemptions, known as the "Bevill exclusion" and the "Bentsen exclusion", were intended to be temporary, pending studies conducted by EPA and subsequent determinations as to whether any of these waste categories should be classified as hazardous. In its reviews following the 1980 amendments, EPA determined that most of the exempted waste types would continue to be classified as non-hazardous.
Regulations
EPA published a CCR regulation in 2015 that restricts the continued use of unlined ash ponds (surface impoundments) by coal-fired power plants. The regulation, which was modified by the Trump administration in 2018, has been challenged in litigation and was remanded to EPA for further revision by the United States Court of Appeals for the District of Columbia Circuit. In response to the court decision, EPA published a proposed rule on December 2, 2019, that would establish an August 31, 2020 deadline for facilities to stop placing ash in unlined impoundments. The proposal would also give some facilities additional time, up to eight years, to find alternatives for managing ash wastes before closing surface impoundments.
Superfund
The Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), also known as "Superfund," was enacted in 1980 to address the problem of remediating abandoned hazardous waste sites by establishing legal liability, as well as a trust fund for cleanup activities. In general CERCLA applies to contaminated sites, while RCRA's focus is on controlling the ongoing generation and management of particular waste streams. Like CERCLA, however, RCRA includes provisions requiring the cleanup of contamination that occurred in the past.
Hazardous and Solid Waste Amendments of 1984
In 1984 Congress expanded the scope of RCRA with the enactment of the Hazardous and Solid Waste Amendments (HSWA). The amendments strengthened the law by covering small quantity generators of hazardous waste, establishing requirements for hazardous waste incinerators, and requiring the closure of substandard landfills.
Land Disposal Program Flexibility Act of 1996
The Land Disposal Program Flexibility Act of 1996 allowed some flexibility in the procedures for land disposal of certain wastes. For example, a waste is not subject to land disposal restrictions if it is sent to an industrial wastewater treatment facility or a municipal sewage treatment plant, or if it is treated in a "zero discharge" facility.
Treatment, storage, and disposal facility permits
Treatment, storage, and disposal facilities (TSDFs) manage hazardous waste under RCRA Subtitle C and generally must have a permit in order to operate. While most facilities have RCRA permits, some continue to operate under what is called "interim status." Interim status requirements appear in 40 CFR Part 265.
The permitting requirements for TSDFs appear in 40 CFR Parts 264 and 270. TSDFs manage (treat, store, or dispose) hazardous waste in units that may include: container storage areas, tanks, surface impoundments, waste piles, land treatment units, landfills, incinerators, containment buildings, and/or drip pads. The unit-specific permitting and operational requirements are described in further detail in 40 CFR Part 264, Subparts J through DD.
See also
Clean Water Act
Formerly Used Defense Sites
Hazardous waste in the United States
Solid waste policy in the United States
Uranium Mill Tailings Radiation Control Act
References
External links
RCRA Orientation Manual (EPA, 2014): A good textbook-length introduction to RCRA
RCRA Online: database of documents covering a wide range of RCRA issues and topics
"Hazardous Waste Permitting Process: A Citizens Guide" - EPA
Waste Management: A Half Century of Progress, a report by the EPA Alumni Association
Collected Papers of William Sanjour, a retired EPA employee and whistleblower
As codified in 42 U.S.C. chapter 82 of the United States Code from the LII
As codified in 42 U.S.C. chapter 82 of the United States Code from the US House of Representatives
Solid Waste Disposal Act aka Resource Conservation and Recovery Act (PDF/details) as amended in the GPO Statute Compilations collection
Summary of the RCRA from the EPA
1976 in American law
94th United States Congress
United States business law
Public administration
United States Environmental Protection Agency
United States federal environmental legislation
Waste legislation in the United States
Whistleblower protection legislation
1976 in the environment
|
The League of Ireland XI, more recently referred to as the Airtricity League XI for sponsorship reasons, is the representative team of the League of Ireland, the national association football league of the Republic of Ireland. For much of its history, the League of Ireland XI has effectively acted as a reserve or B team to the senior Republic of Ireland national team, providing international representative honours to home-based players; it has in fact played considerably more games than the actual Republic of Ireland B national football team. In addition to playing regular games against similar representative teams, such as the Irish League XI, the Scottish Football League XI and the Football League XI, the League of Ireland XI has also played prestige friendlies against the full national teams of both Argentina and Brazil. The League of Ireland XI also represented Ireland in the qualifying stages of the 1988 Olympic Football Tournament, and a League of Ireland U-23 XI has represented the Republic of Ireland in the International Challenge Trophy. Meanwhile, a senior team with no age or nationality restriction regularly plays visiting club sides, and most recently competed in the 2011 Dublin Super Cup.
History
1920s and 1930s
During the 1920s and 1930s, the four national associations that made up the International Football Association Board (IFAB) – The Football Association, the Scottish Football Association, the Football Association of Wales and Northern Ireland's Irish Football Association – refused to recognise the rights of the Football Association of Ireland (FAI) when it came to arranging full internationals. Consequently, the FAI could not arrange full internationals against its nearest neighbours. The IFAB did, however, permit inter-league games to be played. In the absence of full internationals against England, Scotland, Wales or Northern Ireland, these inter-league matches between the League of Ireland XI, the Irish League XI, the Welsh Football League XI and the Scottish Football League XI were highly regarded by the FAI and Irish football fans alike. Attendances of up to 30,000 at Dalymount Park meant these matches were treated almost as full internationals.
The League of Ireland XI made their official debut with a 3–3 draw against a Welsh Football League XI on 9 February 1924. Ernie MacKay scored the representative team's first ever goal, while Dave Roberts added the other two. The League of Ireland XI played the Irish League XI for the first time on 13 March 1926, with Charlie Dowdall scoring twice in a 3–1 win for the home team. On St. Patrick's Day 1937, a League of Ireland XI also played and defeated visiting Yugoslav side SK Jugoslavija 3–2. The League of Ireland XI played the Scottish League XI for the first time on St. Patrick's Day 1939. The Scottish team was billed as a team of all-stars and had a combined valuation estimated at £60,000. In front of a crowd of 35,000 at Dalymount Park, the League of Ireland XI defeated the Scottish League XI 2–1, with Johnny Johnstone and Paddy Bradshaw scoring the goals.
National Team
The League of Ireland XI has always enjoyed a close relationship with the senior Republic of Ireland national team. When Ireland competed at both the 1924 and 1948 Olympic Football Tournaments, they were represented by League of Ireland XIs made up of amateur players. On at least three further occasions before the Second World War, the FAI selected a full international team entirely made up of players playing in Ireland. On 21 March 1926, for the game against Italy, the Ireland team even featured Drumcondra’s Joe Grace from the Leinster Senior League. It was a League of Ireland XI that played Belgium on 12 February 1928 and then the Netherlands on 8 December 1935. Before the Second World War, League of Ireland players made up the nucleus of just about every FAI Ireland full international team.
Post-Second World War
For most of the Second World War era, the League of Ireland XI's only opponents were the Irish League XI. However, once the conflict ended, the fixture against the Scottish League XI was revived, and the team also began to play the Football League XI on a regular basis. With the majority of the leading Irish players now playing in the Football League, the League of Ireland XI found itself at a disadvantage, and most of its games against the Scottish League XI and the Football League XI ended in heavy defeat. There was, however, the occasional success story. On 2 October 1963 at Dalymount Park, the League of Ireland XI defeated the Football League XI 2–1, thanks to goals from Eddie Bailham and Ronnie Whelan. This Football League XI included four players – Ray Wilson, Bobby Moore, Roger Hunt and Martin Peters – who subsequently helped England win the 1966 FIFA World Cup. At the time Whelan was working for Unidaire, a Finglas-based electrical firm, and he subsequently received a warning from his boss at the company for taking time off to play in this game.
Prestige Friendlies
From the late 1970s onwards, the League of Ireland XI also began to play friendlies against national teams, including two prestige games against the full Argentina national team. On 19 April 1978, at the Estadio Alberto J. Armando, Argentina played the League of Ireland XI in a warm-up game as part of their preparations for hosting the 1978 FIFA World Cup. A team that included the former England international Bobby Tambling and several Republic of Ireland internationals such as Johnny Giles, Ray Treacy, Eamonn Gregg, Noel Synnott, Cathal Muckian, Jerome Clark and Synan Braddish lost 3–1 to a very strong Argentina side. The starting eleven for Argentina included ten players who later played in the 1978 FIFA World Cup Final, in which Argentina beat the Netherlands 3–1; a young Diego Maradona also came on as a substitute. Leopoldo Luque, Oscar Ortiz and Ricardo Villa scored for Argentina before Synan Braddish grabbed a consolation goal for the league select. On 29 May 1979, Argentina, then the reigning World Cup holders, visited Lansdowne Road and were held to a 0–0 draw by a Republic of Ireland XI in a UNICEF fundraiser; this team is sometimes incorrectly listed as a League of Ireland XI.
On 30 April 1980, the League of Ireland XI played Argentina for a second time, this time at the Estadio Monumental. On this occasion, a team that included Liam Buckley, Terry Eviston, Johnny Walsh and Tommy McConville lost 1–0 to a goal scored by Diego Maradona. A month later, Argentina beat the senior Republic of Ireland team 1–0 at Lansdowne Road.
In another notable game from this era, the League of Ireland XI also became the first representative team to play the Basque Country following the ending of the Francoist regime. This game was played on 16 August 1979 at the San Mamés Stadium. The Basque team was made up of Real Sociedad and Athletic Bilbao players and all eleven subsequently became full Spain internationals. In contrast the league select was under strength and was referred to in newspaper reports as a League of Ireland B team. The Basque Country team easily defeated this League of Ireland XI 4–1. In 1981, the League of Ireland XI returned to South America and this time played Brazil. A team managed by Jim McLaughlin lost 6–0 with the legendary Zico scoring four of Brazil's goals.
Olympic qualifiers
League of Ireland XIs made up of amateur players represented Ireland in qualifiers for the 1960, 1972, 1976 and 1980 Olympic Football Tournaments. For the 1988 Olympic Football Tournament qualifiers, a senior League of Ireland XI featuring professionals represented Ireland. They were drawn in a "group of death" that also included Hungary, Sweden, Spain and France – France had won the gold medal at the 1984 Olympic Football Tournament. This League of Ireland XI was again managed by Jim McLaughlin.
The team kicked off their Olympic campaign with a 2–1 defeat against Hungary at Glenmalure Park on 11 November 1986. Their next opponents were Spain at Tolka Park on 4 February 1987, where goals from Noel Larkin and Mick Byrne saw the League of Ireland XI draw 2–2. Their first away games came against Sweden and France: the League of Ireland XI lost 1–0 to Sweden after conceding a very late goal but managed to hold France to a 1–1 draw. On 26 August 1987, a crowd of fewer than 1,000 saw the League of Ireland XI lose 1–0 at Dalymount Park to a Sweden team that included Tomas Brolin. Next came the home match against France on 18 November 1987 at Dalymount Park, where a crowd of just 4,000 witnessed one of the League of Ireland XI's best results: two goals from Mick Bennett and one from Peter Eccles gave them a 3–0 win. Ireland finished the qualifying group with two away games. Dave Barry scored in Hungary but the League of Ireland XI lost 3–1, while goals from Barry Kehoe and Bennett earned them a 2–2 draw with Spain in Alicante. The League of Ireland XI finished fourth in the group. Sweden qualified for the finals, where they were knocked out in the quarter-finals.
Group C Final Table
1988 Marlboro Cup
In August 1988, the League of Ireland XI competed in the Marlboro Cup, a four team tournament, held at the Los Angeles Memorial Coliseum. They lost their first game 3–0 against Club Universidad de Guadalajara on 5 August, with Mick Neville conceding an own goal. They then lost 1–0 to El Salvador in a third place playoff two days later. The tournament was won by Guatemala, who beat Club Universidad 3–2 in the final.
Aviva Stadium
Manchester United
On 4 August 2010, the League of Ireland XI hosted the first soccer match to be played at the Aviva Stadium. A team managed by Damien Richardson lost 7–1 to Manchester United. The league select were 6–0 down after 70 minutes, with goals from Park Ji-sung (2), Michael Owen, Javier Hernández, Antonio Valencia and Jonny Evans. Park opened the scoring in the 13th minute in bizarre fashion: as he went to block a defender's clearance, the ball ricocheted off him and into the net. Owen doubled United's lead in the 25th minute with a chipped shot over the goalkeeper, before half-time substitute Hernández made it 3–0 two minutes after the break. Three goals in the space of nine minutes – from Valencia (60th minute), Park (63rd) and Evans (69th) – increased the lead to 6–0, before Dave Mulcahy scored a consolation goal for the League of Ireland XI in the 78th minute. Nevertheless, there was still time for Nani to get a seventh goal, converting a penalty after Hernández had been fouled in the penalty area.
Dublin Super Cup
Damien Richardson was again in charge of the League of Ireland XI when the Aviva Stadium hosted the 2011 Dublin Super Cup, a tournament which saw the representative team take on both Manchester City and Celtic. Shamrock Rovers players, however, were not available because of a clash with the 2011–12 UEFA Champions League qualifying phase and play-off rounds. As a result, Richardson had to field an understrength team, which lost its opening game 3–0 to Manchester City and then lost 5–0 to Celtic. The League of Ireland XI were the only team in the tournament that did not win a match or score a goal.
League of Ireland XI matches
Recent squad
The following players were called up for the 2011 Dublin Super Cup.
Non-Irish players
Throughout the history of the League of Ireland, the vast majority of the players have come from either the Republic of Ireland or Northern Ireland. Consequently, the League of Ireland XI has largely been made up of Irish players. However, there has always been a contingent of non-Irish players and, right from the beginning, they have been selected to play for the League of Ireland XI. Dave Roberts from England scored twice in the team's very first game. Another English-born player, Johnny Matthews, scored a penalty against Gordon Banks when the League of Ireland XI played the Football League XI in 1971 at Lansdowne Road. Like Roberts and Matthews, most of the non-Irish players have come from Great Britain but some have come from further afield.
References
XI
Representative teams of association football leagues
1924 establishments in Ireland
Sports organizations established in 1924
|
"Wine Colored Roses" is a song written by Dennis Knutson and A.L. "Doodle" Owens, recorded by American country music artist George Jones. It was released in September 1986 as the first single and title track from Jones' album Wine Colored Roses. The song peaked at number 10 on the Billboard Hot Country Singles chart. The song tells the story of a man who receives a letter from a caring ex-lover asking him if he still drinks. Unable to overcome his own shame to speak with her on the telephone, he sends her a dozen wine colored roses as a symbol that he still drinks. The song became part of Jones' live show at the time, and when he would sing the line, "She asked if I had quit drinkin'," audiences would often cheer, mindful that Jones himself had gotten sober in 1984. In his essay for the liner notes to the 1994 Sony compilation The Essential George Jones: The Spirit of Country, Rich Kienzle states, "If there were any doubters, 'Wine Colored Roses' proved Jones was a timeless superstar, even without stimulants."
Chart performance
References
1986 singles
George Jones songs
Epic Records singles
Songs written by A.L. "Doodle" Owens
Song recordings produced by Billy Sherrill
Songs written by Dennis Knutson
|
```c
/*
 * Licensed under the Apache License 2.0 (the "License").  You may not use
 * this file except in compliance with the License.  You can obtain a copy
 * in the file LICENSE in the source distribution or at
 * path_to_url
 */
#include <stdio.h>
#include <stdlib.h>
#include <string.h> /* for memcpy() in EVP_KDF_CTX_dup() */
#include "internal/cryptlib.h"
#include <openssl/evp.h>
#include <openssl/kdf.h>
#include <openssl/core.h>
#include <openssl/core_names.h>
#include "crypto/evp.h"
#include "internal/numbers.h"
#include "internal/provider.h"
#include "evp_local.h"
EVP_KDF_CTX *EVP_KDF_CTX_new(EVP_KDF *kdf)
{
EVP_KDF_CTX *ctx = NULL;
if (kdf == NULL)
return NULL;
ctx = OPENSSL_zalloc(sizeof(EVP_KDF_CTX));
if (ctx == NULL
|| (ctx->algctx = kdf->newctx(ossl_provider_ctx(kdf->prov))) == NULL
|| !EVP_KDF_up_ref(kdf)) {
ERR_raise(ERR_LIB_EVP, ERR_R_EVP_LIB);
if (ctx != NULL)
kdf->freectx(ctx->algctx);
OPENSSL_free(ctx);
ctx = NULL;
} else {
ctx->meth = kdf;
}
return ctx;
}
void EVP_KDF_CTX_free(EVP_KDF_CTX *ctx)
{
if (ctx == NULL)
return;
ctx->meth->freectx(ctx->algctx);
ctx->algctx = NULL;
EVP_KDF_free(ctx->meth);
OPENSSL_free(ctx);
}
EVP_KDF_CTX *EVP_KDF_CTX_dup(const EVP_KDF_CTX *src)
{
EVP_KDF_CTX *dst;
if (src == NULL || src->algctx == NULL || src->meth->dupctx == NULL)
return NULL;
dst = OPENSSL_malloc(sizeof(*dst));
if (dst == NULL)
return NULL;
memcpy(dst, src, sizeof(*dst));
if (!EVP_KDF_up_ref(dst->meth)) {
ERR_raise(ERR_LIB_EVP, ERR_R_EVP_LIB);
OPENSSL_free(dst);
return NULL;
}
dst->algctx = src->meth->dupctx(src->algctx);
if (dst->algctx == NULL) {
EVP_KDF_CTX_free(dst);
return NULL;
}
return dst;
}
int evp_kdf_get_number(const EVP_KDF *kdf)
{
return kdf->name_id;
}
const char *EVP_KDF_get0_name(const EVP_KDF *kdf)
{
return kdf->type_name;
}
const char *EVP_KDF_get0_description(const EVP_KDF *kdf)
{
return kdf->description;
}
int EVP_KDF_is_a(const EVP_KDF *kdf, const char *name)
{
return kdf != NULL && evp_is_a(kdf->prov, kdf->name_id, NULL, name);
}
const OSSL_PROVIDER *EVP_KDF_get0_provider(const EVP_KDF *kdf)
{
return kdf->prov;
}
const EVP_KDF *EVP_KDF_CTX_kdf(EVP_KDF_CTX *ctx)
{
return ctx->meth;
}
void EVP_KDF_CTX_reset(EVP_KDF_CTX *ctx)
{
if (ctx == NULL)
return;
if (ctx->meth->reset != NULL)
ctx->meth->reset(ctx->algctx);
}
size_t EVP_KDF_CTX_get_kdf_size(EVP_KDF_CTX *ctx)
{
OSSL_PARAM params[2] = { OSSL_PARAM_END, OSSL_PARAM_END };
size_t s = 0;
if (ctx == NULL)
return 0;
*params = OSSL_PARAM_construct_size_t(OSSL_KDF_PARAM_SIZE, &s);
if (ctx->meth->get_ctx_params != NULL
&& ctx->meth->get_ctx_params(ctx->algctx, params))
return s;
if (ctx->meth->get_params != NULL
&& ctx->meth->get_params(params))
return s;
return 0;
}
int EVP_KDF_derive(EVP_KDF_CTX *ctx, unsigned char *key, size_t keylen,
const OSSL_PARAM params[])
{
if (ctx == NULL)
return 0;
return ctx->meth->derive(ctx->algctx, key, keylen, params);
}
/*
* The {get,set}_params functions return 1 if there is no corresponding
* function in the implementation. This is the same as if there was one,
* but it didn't recognise any of the given params, i.e. nothing in the
* bag of parameters was useful.
*/
int EVP_KDF_get_params(EVP_KDF *kdf, OSSL_PARAM params[])
{
if (kdf->get_params != NULL)
return kdf->get_params(params);
return 1;
}
int EVP_KDF_CTX_get_params(EVP_KDF_CTX *ctx, OSSL_PARAM params[])
{
if (ctx->meth->get_ctx_params != NULL)
return ctx->meth->get_ctx_params(ctx->algctx, params);
return 1;
}
int EVP_KDF_CTX_set_params(EVP_KDF_CTX *ctx, const OSSL_PARAM params[])
{
if (ctx->meth->set_ctx_params != NULL)
return ctx->meth->set_ctx_params(ctx->algctx, params);
return 1;
}
int EVP_KDF_names_do_all(const EVP_KDF *kdf,
void (*fn)(const char *name, void *data),
void *data)
{
if (kdf->prov != NULL)
return evp_names_do_all(kdf->prov, kdf->name_id, fn, data);
return 1;
}
```
|
Macroscelesia is a genus of moths in the family Sesiidae.
Species
Macroscelesia japona (Hampson, 1919)
Macroscelesia longipes (Moore, 1877)
Macroscelesia longipes longipes (Moore, 1877)
Macroscelesia longipes yamatoensis Arita, 1992
Macroscelesia aritai Kallies & Garrevoet, 2001
Macroscelesia diaphana Gorbunov & Arita, 1995
Macroscelesia elaea (Hampson, 1919)
Macroscelesia formosana Arita & Gorbunov, 2002
Macroscelesia owadai Arita & Gorbunov, 2000
Macroscelesia sapaensis Kallies & Arita, 2004
Macroscelesia vietnamica Arita & Gorbunov, 2000
References
Sesiidae
|
The conflicts between the Göktürks and the Sassanid Empire include:
First Perso-Turkic War (588–589)
Second Perso-Turkic War (606–608)
Third Perso-Turkic War (627–629)
Wars involving the Sasanian Empire
|
Art on the Move is an annual summer arts program held in Detroit, Michigan.
The organization sponsors temporary art installations during the summer months. These temporary pieces are created by resident artists, who in turn mentor young artists as the public works are executed and erected.
Funding for Art on the Move has come from architectural firms in southeastern Michigan, the City of Detroit, and Detroit's Empowerment Zone Development Corporation.
The program has shifted from its original connection with the Detroit People Mover and now provides programs and exhibits for the Detroit Festival of the Arts.
External links
Detroit Festival of the Arts
Detroit Empowerment Zone - link to Art on the Move
Culture of Detroit
|
```javascript
/**
* @license Apache-2.0
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
*    path_to_url
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
'use strict';
// MODULES //
var tape = require( 'tape' );
var isnan = require( '@stdlib/math/base/assert/is-nan' );
var PINF = require( '@stdlib/constants/float64/pinf' );
var NINF = require( '@stdlib/constants/float64/ninf' );
var entropy = require( './../lib' );
// FIXTURES //
var data = require( './fixtures/julia/data.json' );
// TESTS //
tape( 'main export is a function', function test( t ) {
t.ok( true, __filename );
t.strictEqual( typeof entropy, 'function', 'main export is a function' );
t.end();
});
tape( 'if provided `NaN` for any parameter, the function returns `NaN`', function test( t ) {
var v = entropy( NaN, 0.5 );
t.equal( isnan( v ), true, 'returns NaN' );
v = entropy( 10, NaN );
t.equal( isnan( v ), true, 'returns NaN' );
v = entropy( NaN, NaN );
t.equal( isnan( v ), true, 'returns NaN' );
t.end();
});
tape( 'if provided a nonpositive `gamma`, the function always returns `NaN`', function test( t ) {
var y;
y = entropy( 0.0, 0.0 );
t.equal( isnan( y ), true, 'returns NaN' );
y = entropy( 0.0, -1.0 );
t.equal( isnan( y ), true, 'returns NaN' );
y = entropy( 0.0, NINF );
t.equal( isnan( y ), true, 'returns NaN' );
y = entropy( PINF, NINF );
t.equal( isnan( y ), true, 'returns NaN' );
t.end();
});
tape( 'the function returns the differential entropy of a Cauchy distribution', function test( t ) {
var expected;
var gamma;
var x0;
var y;
var i;
expected = data.expected;
x0 = data.x0;
gamma = data.gamma;
for ( i = 0; i < x0.length; i++ ) {
y = entropy( x0[i], gamma[i] );
t.equal( y, expected[i], 'x0: '+x0[i]+', gamma: '+gamma[i]+', y: '+y+', expected: '+expected[i] );
}
t.end();
});
```
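For reference, the closed-form quantity the fixtures above encode is the differential entropy of a Cauchy distribution, h(x0, γ) = ln(4πγ), which depends only on the scale parameter γ. A minimal standalone sketch using plain `Math` (not the stdlib implementation under test):

```javascript
// Differential entropy of a Cauchy( x0, gamma ) distribution: ln( 4*pi*gamma ).
// The location parameter x0 does not affect the entropy; it is validated only
// so that NaN inputs propagate, mirroring the behavior tested above.
function cauchyEntropy( x0, gamma ) {
	if ( x0 !== x0 || gamma !== gamma || gamma <= 0.0 ) {
		return NaN; // NaN input or nonpositive scale
	}
	return Math.log( 4.0 * Math.PI * gamma );
}

console.log( cauchyEntropy( 0.0, 1.0 ) );  // => ~2.5310
console.log( cauchyEntropy( 10.0, 1.0 ) ); // same value: location-invariant
```

Because the entropy is location-invariant, the `x0` fixture values only exercise input validation; the expected values vary with `gamma` alone.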
|
```cpp
//===- SemaTemplate.h - C++ Templates ---------------------------*- C++ -*-===//
//
// See path_to_url for license information.
//===----------------------------------------------------------------------===//
//
// This file provides types used in the semantic analysis of C++ templates.
//
//===----------------------------------------------------------------------===//
#ifndef LLVM_CLANG_SEMA_TEMPLATE_H
#define LLVM_CLANG_SEMA_TEMPLATE_H
#include "clang/AST/DeclTemplate.h"
#include "clang/AST/DeclVisitor.h"
#include "clang/AST/TemplateBase.h"
#include "clang/AST/Type.h"
#include "clang/Basic/LLVM.h"
#include "clang/Sema/Sema.h"
#include "llvm/ADT/ArrayRef.h"
#include "llvm/ADT/DenseMap.h"
#include "llvm/ADT/PointerUnion.h"
#include "llvm/ADT/SmallVector.h"
#include <cassert>
#include <optional>
#include <utility>
namespace clang {
class ASTContext;
class BindingDecl;
class CXXMethodDecl;
class Decl;
class DeclaratorDecl;
class DeclContext;
class EnumDecl;
class FunctionDecl;
class NamedDecl;
class ParmVarDecl;
class TagDecl;
class TypedefNameDecl;
class TypeSourceInfo;
class VarDecl;
/// The kind of template substitution being performed.
enum class TemplateSubstitutionKind : char {
/// We are substituting template parameters for template arguments in order
/// to form a template specialization.
Specialization,
/// We are substituting template parameters for (typically) other template
/// parameters in order to rewrite a declaration as a different declaration
/// (for example, when forming a deduction guide from a constructor).
Rewrite,
};
/// Data structure that captures multiple levels of template argument
/// lists for use in template instantiation.
///
/// Multiple levels of template arguments occur when instantiating the
/// definitions of member templates. For example:
///
/// \code
/// template<typename T>
/// struct X {
/// template<T Value>
/// struct Y {
/// void f();
/// };
/// };
/// \endcode
///
/// When instantiating X<int>::Y<17>::f, the multi-level template argument
/// list will contain a template argument list (int) at depth 0 and a
/// template argument list (17) at depth 1.
class MultiLevelTemplateArgumentList {
/// The template argument list at a certain template depth
using ArgList = ArrayRef<TemplateArgument>;
struct ArgumentListLevel {
llvm::PointerIntPair<Decl *, 1, bool> AssociatedDeclAndFinal;
ArgList Args;
};
using ContainerType = SmallVector<ArgumentListLevel, 4>;
using ArgListsIterator = ContainerType::iterator;
using ConstArgListsIterator = ContainerType::const_iterator;
/// The template argument lists, stored from the innermost template
/// argument list (first) to the outermost template argument list (last).
ContainerType TemplateArgumentLists;
/// The number of outer levels of template arguments that are not
/// being substituted.
unsigned NumRetainedOuterLevels = 0;
/// The kind of substitution described by this argument list.
TemplateSubstitutionKind Kind = TemplateSubstitutionKind::Specialization;
public:
/// Construct an empty set of template argument lists.
MultiLevelTemplateArgumentList() = default;
/// Construct a single-level template argument list.
MultiLevelTemplateArgumentList(Decl *D, ArgList Args, bool Final) {
addOuterTemplateArguments(D, Args, Final);
}
void setKind(TemplateSubstitutionKind K) { Kind = K; }
/// Determine the kind of template substitution being performed.
TemplateSubstitutionKind getKind() const { return Kind; }
/// Determine whether we are rewriting template parameters rather than
/// substituting for them. If so, we should not leave references to the
/// original template parameters behind.
bool isRewrite() const {
return Kind == TemplateSubstitutionKind::Rewrite;
}
/// Determine the number of levels in this template argument
/// list.
unsigned getNumLevels() const {
return TemplateArgumentLists.size() + NumRetainedOuterLevels;
}
/// Determine the number of substituted levels in this template
/// argument list.
unsigned getNumSubstitutedLevels() const {
return TemplateArgumentLists.size();
}
// Determine the number of substituted args at 'Depth'.
unsigned getNumSubstitutedArgs(unsigned Depth) const {
assert(NumRetainedOuterLevels <= Depth && Depth < getNumLevels());
return TemplateArgumentLists[getNumLevels() - Depth - 1].Args.size();
}
unsigned getNumRetainedOuterLevels() const {
return NumRetainedOuterLevels;
}
/// Determine how many of the \p OldDepth outermost template parameter
/// lists would be removed by substituting these arguments.
unsigned getNewDepth(unsigned OldDepth) const {
if (OldDepth < NumRetainedOuterLevels)
return OldDepth;
if (OldDepth < getNumLevels())
return NumRetainedOuterLevels;
return OldDepth - TemplateArgumentLists.size();
}
/// Retrieve the template argument at a given depth and index.
const TemplateArgument &operator()(unsigned Depth, unsigned Index) const {
assert(NumRetainedOuterLevels <= Depth && Depth < getNumLevels());
assert(Index <
TemplateArgumentLists[getNumLevels() - Depth - 1].Args.size());
return TemplateArgumentLists[getNumLevels() - Depth - 1].Args[Index];
}
/// A template-like entity which owns the whole pattern being substituted.
/// This will usually own a set of template parameters, or in some
/// cases might even be a template parameter itself.
std::pair<Decl *, bool> getAssociatedDecl(unsigned Depth) const {
assert(NumRetainedOuterLevels <= Depth && Depth < getNumLevels());
auto AD = TemplateArgumentLists[getNumLevels() - Depth - 1]
.AssociatedDeclAndFinal;
return {AD.getPointer(), AD.getInt()};
}
/// Determine whether there is a non-NULL template argument at the
/// given depth and index.
///
/// There must exist a template argument list at the given depth.
bool hasTemplateArgument(unsigned Depth, unsigned Index) const {
assert(Depth < getNumLevels());
if (Depth < NumRetainedOuterLevels)
return false;
if (Index >=
TemplateArgumentLists[getNumLevels() - Depth - 1].Args.size())
return false;
return !(*this)(Depth, Index).isNull();
}
bool isAnyArgInstantiationDependent() const {
for (ArgumentListLevel ListLevel : TemplateArgumentLists)
for (const TemplateArgument &TA : ListLevel.Args)
if (TA.isInstantiationDependent())
return true;
return false;
}
/// Set a specific template argument.
void setArgument(unsigned Depth, unsigned Index,
TemplateArgument Arg) {
assert(NumRetainedOuterLevels <= Depth && Depth < getNumLevels());
assert(Index <
TemplateArgumentLists[getNumLevels() - Depth - 1].Args.size());
const_cast<TemplateArgument &>(
TemplateArgumentLists[getNumLevels() - Depth - 1].Args[Index]) = Arg;
}
/// Add a new outermost level to the multi-level template argument
/// list.
/// A 'Final' substitution means that Subst* nodes won't be built
/// for the replacements.
void addOuterTemplateArguments(Decl *AssociatedDecl, ArgList Args,
bool Final) {
assert(!NumRetainedOuterLevels &&
"substituted args outside retained args?");
assert(getKind() == TemplateSubstitutionKind::Specialization);
TemplateArgumentLists.push_back(
{{AssociatedDecl->getCanonicalDecl(), Final}, Args});
}
void addOuterTemplateArguments(ArgList Args) {
assert(!NumRetainedOuterLevels &&
"substituted args outside retained args?");
assert(getKind() == TemplateSubstitutionKind::Rewrite);
TemplateArgumentLists.push_back({{}, Args});
}
void addOuterTemplateArguments(std::nullopt_t) {
assert(!NumRetainedOuterLevels &&
"substituted args outside retained args?");
TemplateArgumentLists.push_back({});
}
/// Replaces the current 'innermost' level with the provided argument list.
/// This is useful for type deduction cases where we need to get the entire
/// list from the AST, but then add the deduced innermost list.
void replaceInnermostTemplateArguments(ArgList Args) {
assert(TemplateArgumentLists.size() > 0 && "Replacing in an empty list?");
TemplateArgumentLists[0].Args = Args;
}
/// Add an outermost level that we are not substituting. We have no
/// arguments at this level, and do not remove it from the depth of inner
/// template parameters that we instantiate.
void addOuterRetainedLevel() {
++NumRetainedOuterLevels;
}
void addOuterRetainedLevels(unsigned Num) {
NumRetainedOuterLevels += Num;
}
/// Retrieve the innermost template argument list.
const ArgList &getInnermost() const {
return TemplateArgumentLists.front().Args;
}
/// Retrieve the outermost template argument list.
const ArgList &getOutermost() const {
return TemplateArgumentLists.back().Args;
}
ArgListsIterator begin() { return TemplateArgumentLists.begin(); }
ConstArgListsIterator begin() const {
return TemplateArgumentLists.begin();
}
ArgListsIterator end() { return TemplateArgumentLists.end(); }
ConstArgListsIterator end() const { return TemplateArgumentLists.end(); }
};
/// The context in which partial ordering of function templates occurs.
enum TPOC {
/// Partial ordering of function templates for a function call.
TPOC_Call,
/// Partial ordering of function templates for a call to a
/// conversion function.
TPOC_Conversion,
/// Partial ordering of function templates in other contexts, e.g.,
/// taking the address of a function template or matching a function
/// template specialization to a function template.
TPOC_Other
};
// This is lame but unavoidable in a world without forward
// declarations of enums. The alternatives are to either pollute
// Sema.h (by including this file) or sacrifice type safety (by
// making Sema.h declare things as enums).
class TemplatePartialOrderingContext {
TPOC Value;
public:
TemplatePartialOrderingContext(TPOC Value) : Value(Value) {}
operator TPOC() const { return Value; }
};
/// Captures a template argument whose value has been deduced
/// via c++ template argument deduction.
class DeducedTemplateArgument : public TemplateArgument {
/// For a non-type template argument, whether the value was
/// deduced from an array bound.
bool DeducedFromArrayBound = false;
public:
DeducedTemplateArgument() = default;
DeducedTemplateArgument(const TemplateArgument &Arg,
bool DeducedFromArrayBound = false)
: TemplateArgument(Arg), DeducedFromArrayBound(DeducedFromArrayBound) {}
/// Construct an integral non-type template argument that
/// has been deduced, possibly from an array bound.
DeducedTemplateArgument(ASTContext &Ctx,
const llvm::APSInt &Value,
QualType ValueType,
bool DeducedFromArrayBound)
: TemplateArgument(Ctx, Value, ValueType),
DeducedFromArrayBound(DeducedFromArrayBound) {}
/// For a non-type template argument, determine whether the
/// template argument was deduced from an array bound.
bool wasDeducedFromArrayBound() const { return DeducedFromArrayBound; }
/// Specify whether the given non-type template argument
/// was deduced from an array bound.
void setDeducedFromArrayBound(bool Deduced) {
DeducedFromArrayBound = Deduced;
}
};
/// A stack-allocated class that identifies which local
/// variable declaration instantiations are present in this scope.
///
/// A new instance of this class type will be created whenever we
/// instantiate a new function declaration, which will have its own
/// set of parameter declarations.
class LocalInstantiationScope {
public:
/// A set of declarations.
using DeclArgumentPack = SmallVector<VarDecl *, 4>;
private:
/// Reference to the semantic analysis that is performing
/// this template instantiation.
Sema &SemaRef;
using LocalDeclsMap =
llvm::SmallDenseMap<const Decl *,
llvm::PointerUnion<Decl *, DeclArgumentPack *>, 4>;
/// A mapping from local declarations that occur
/// within a template to their instantiations.
///
/// This mapping is used during instantiation to keep track of,
/// e.g., function parameter and variable declarations. For example,
/// given:
///
/// \code
/// template<typename T> T add(T x, T y) { return x + y; }
/// \endcode
///
/// when we instantiate add<int>, we will introduce a mapping from
/// the ParmVarDecl for 'x' that occurs in the template to the
/// instantiated ParmVarDecl for 'x'.
///
/// For a parameter pack, the local instantiation scope may contain a
/// set of instantiated parameters. This is stored as a DeclArgumentPack
/// pointer.
LocalDeclsMap LocalDecls;
/// The set of argument packs we've allocated.
SmallVector<DeclArgumentPack *, 1> ArgumentPacks;
/// The outer scope, which contains local variable
/// definitions from some other instantiation (that may not be
/// relevant to this particular scope).
LocalInstantiationScope *Outer;
/// Whether we have already exited this scope.
bool Exited = false;
/// Whether to combine this scope with the outer scope, such that
/// lookup will search our outer scope.
bool CombineWithOuterScope;
/// If non-NULL, the template parameter pack that has been
/// partially substituted per C++0x [temp.arg.explicit]p9.
NamedDecl *PartiallySubstitutedPack = nullptr;
/// If \c PartiallySubstitutedPack is non-null, the set of
/// explicitly-specified template arguments in that pack.
const TemplateArgument *ArgsInPartiallySubstitutedPack;
/// If \c PartiallySubstitutedPack, the number of
/// explicitly-specified template arguments in
/// ArgsInPartiallySubstitutedPack.
unsigned NumArgsInPartiallySubstitutedPack;
public:
LocalInstantiationScope(Sema &SemaRef, bool CombineWithOuterScope = false)
: SemaRef(SemaRef), Outer(SemaRef.CurrentInstantiationScope),
CombineWithOuterScope(CombineWithOuterScope) {
SemaRef.CurrentInstantiationScope = this;
}
LocalInstantiationScope(const LocalInstantiationScope &) = delete;
LocalInstantiationScope &
operator=(const LocalInstantiationScope &) = delete;
~LocalInstantiationScope() {
Exit();
}
const Sema &getSema() const { return SemaRef; }
/// Exit this local instantiation scope early.
void Exit() {
if (Exited)
return;
for (unsigned I = 0, N = ArgumentPacks.size(); I != N; ++I)
delete ArgumentPacks[I];
SemaRef.CurrentInstantiationScope = Outer;
Exited = true;
}
/// Clone this scope, and all outer scopes, down to the given
/// outermost scope.
LocalInstantiationScope *cloneScopes(LocalInstantiationScope *Outermost) {
if (this == Outermost) return this;
// Save the current scope from SemaRef since the LocalInstantiationScope
// will overwrite it on construction
LocalInstantiationScope *oldScope = SemaRef.CurrentInstantiationScope;
LocalInstantiationScope *newScope =
new LocalInstantiationScope(SemaRef, CombineWithOuterScope);
newScope->Outer = nullptr;
if (Outer)
newScope->Outer = Outer->cloneScopes(Outermost);
newScope->PartiallySubstitutedPack = PartiallySubstitutedPack;
newScope->ArgsInPartiallySubstitutedPack = ArgsInPartiallySubstitutedPack;
newScope->NumArgsInPartiallySubstitutedPack =
NumArgsInPartiallySubstitutedPack;
for (LocalDeclsMap::iterator I = LocalDecls.begin(), E = LocalDecls.end();
I != E; ++I) {
const Decl *D = I->first;
llvm::PointerUnion<Decl *, DeclArgumentPack *> &Stored =
newScope->LocalDecls[D];
if (I->second.is<Decl *>()) {
Stored = I->second.get<Decl *>();
} else {
DeclArgumentPack *OldPack = I->second.get<DeclArgumentPack *>();
DeclArgumentPack *NewPack = new DeclArgumentPack(*OldPack);
Stored = NewPack;
newScope->ArgumentPacks.push_back(NewPack);
}
}
// Restore the saved scope to SemaRef
SemaRef.CurrentInstantiationScope = oldScope;
return newScope;
}
/// deletes the given scope, and all outer scopes, down to the
/// given outermost scope.
static void deleteScopes(LocalInstantiationScope *Scope,
LocalInstantiationScope *Outermost) {
while (Scope && Scope != Outermost) {
LocalInstantiationScope *Out = Scope->Outer;
delete Scope;
Scope = Out;
}
}
/// Find the instantiation of the declaration D within the current
/// instantiation scope.
///
/// \param D The declaration whose instantiation we are searching for.
///
/// \returns A pointer to the declaration or argument pack of declarations
/// to which the declaration \c D is instantiated, if found. Otherwise,
/// returns NULL.
llvm::PointerUnion<Decl *, DeclArgumentPack *> *
findInstantiationOf(const Decl *D);
void InstantiatedLocal(const Decl *D, Decl *Inst);
void InstantiatedLocalPackArg(const Decl *D, VarDecl *Inst);
void MakeInstantiatedLocalArgPack(const Decl *D);
/// Note that the given parameter pack has been partially substituted
/// via explicit specification of template arguments
/// (C++0x [temp.arg.explicit]p9).
///
/// \param Pack The parameter pack, which will always be a template
/// parameter pack.
///
/// \param ExplicitArgs The explicitly-specified template arguments provided
/// for this parameter pack.
///
/// \param NumExplicitArgs The number of explicitly-specified template
/// arguments provided for this parameter pack.
void SetPartiallySubstitutedPack(NamedDecl *Pack,
const TemplateArgument *ExplicitArgs,
unsigned NumExplicitArgs);
/// Reset the partially-substituted pack when it is no longer of
/// interest.
void ResetPartiallySubstitutedPack() {
assert(PartiallySubstitutedPack && "No partially-substituted pack");
PartiallySubstitutedPack = nullptr;
ArgsInPartiallySubstitutedPack = nullptr;
NumArgsInPartiallySubstitutedPack = 0;
}
/// Retrieve the partially-substituted template parameter pack.
///
/// If there is no partially-substituted parameter pack, returns NULL.
NamedDecl *
getPartiallySubstitutedPack(const TemplateArgument **ExplicitArgs = nullptr,
unsigned *NumExplicitArgs = nullptr) const;
/// Determine whether D is a pack expansion created in this scope.
bool isLocalPackExpansion(const Decl *D);
};
class TemplateDeclInstantiator
: public DeclVisitor<TemplateDeclInstantiator, Decl *>
{
Sema &SemaRef;
Sema::ArgumentPackSubstitutionIndexRAII SubstIndex;
DeclContext *Owner;
const MultiLevelTemplateArgumentList &TemplateArgs;
Sema::LateInstantiatedAttrVec* LateAttrs = nullptr;
LocalInstantiationScope *StartingScope = nullptr;
bool EvaluateConstraints = true;
/// A list of out-of-line class template partial
/// specializations that will need to be instantiated after the
/// enclosing class's instantiation is complete.
SmallVector<std::pair<ClassTemplateDecl *,
ClassTemplatePartialSpecializationDecl *>, 4>
OutOfLinePartialSpecs;
/// A list of out-of-line variable template partial
/// specializations that will need to be instantiated after the
/// enclosing variable's instantiation is complete.
/// FIXME: Verify that this is needed.
SmallVector<
std::pair<VarTemplateDecl *, VarTemplatePartialSpecializationDecl *>, 4>
OutOfLineVarPartialSpecs;
public:
TemplateDeclInstantiator(Sema &SemaRef, DeclContext *Owner,
const MultiLevelTemplateArgumentList &TemplateArgs)
: SemaRef(SemaRef),
SubstIndex(SemaRef, SemaRef.ArgumentPackSubstitutionIndex),
Owner(Owner), TemplateArgs(TemplateArgs) {}
void setEvaluateConstraints(bool B) {
EvaluateConstraints = B;
}
bool getEvaluateConstraints() {
return EvaluateConstraints;
}
// Define all the decl visitors using DeclNodes.inc
#define DECL(DERIVED, BASE) \
Decl *Visit ## DERIVED ## Decl(DERIVED ## Decl *D);
#define ABSTRACT_DECL(DECL)
// Decls which never appear inside a class or function.
#define OBJCCONTAINER(DERIVED, BASE)
#define FILESCOPEASM(DERIVED, BASE)
#define TOPLEVELSTMT(DERIVED, BASE)
#define IMPORT(DERIVED, BASE)
#define EXPORT(DERIVED, BASE)
#define LINKAGESPEC(DERIVED, BASE)
#define OBJCCOMPATIBLEALIAS(DERIVED, BASE)
#define OBJCMETHOD(DERIVED, BASE)
#define OBJCTYPEPARAM(DERIVED, BASE)
#define OBJCIVAR(DERIVED, BASE)
#define OBJCPROPERTY(DERIVED, BASE)
#define OBJCPROPERTYIMPL(DERIVED, BASE)
#define EMPTY(DERIVED, BASE)
#define LIFETIMEEXTENDEDTEMPORARY(DERIVED, BASE)
// Decls which use special-case instantiation code.
#define BLOCK(DERIVED, BASE)
#define CAPTURED(DERIVED, BASE)
#define IMPLICITPARAM(DERIVED, BASE)
#include "clang/AST/DeclNodes.inc"
enum class RewriteKind { None, RewriteSpaceshipAsEqualEqual };
void adjustForRewrite(RewriteKind RK, FunctionDecl *Orig, QualType &T,
TypeSourceInfo *&TInfo,
DeclarationNameInfo &NameInfo);
// A few supplemental visitor functions.
Decl *VisitCXXMethodDecl(CXXMethodDecl *D,
TemplateParameterList *TemplateParams,
std::optional<const ASTTemplateArgumentListInfo *>
ClassScopeSpecializationArgs = std::nullopt,
RewriteKind RK = RewriteKind::None);
Decl *VisitFunctionDecl(FunctionDecl *D,
TemplateParameterList *TemplateParams,
RewriteKind RK = RewriteKind::None);
Decl *VisitDecl(Decl *D);
Decl *VisitVarDecl(VarDecl *D, bool InstantiatingVarTemplate,
ArrayRef<BindingDecl *> *Bindings = nullptr);
Decl *VisitBaseUsingDecls(BaseUsingDecl *D, BaseUsingDecl *Inst,
LookupResult *Lookup);
// Enable late instantiation of attributes. Late instantiated attributes
// will be stored in LA.
void enableLateAttributeInstantiation(Sema::LateInstantiatedAttrVec *LA) {
LateAttrs = LA;
StartingScope = SemaRef.CurrentInstantiationScope;
}
// Disable late instantiation of attributes.
void disableLateAttributeInstantiation() {
LateAttrs = nullptr;
StartingScope = nullptr;
}
LocalInstantiationScope *getStartingScope() const { return StartingScope; }
using delayed_partial_spec_iterator = SmallVectorImpl<std::pair<
ClassTemplateDecl *, ClassTemplatePartialSpecializationDecl *>>::iterator;
using delayed_var_partial_spec_iterator = SmallVectorImpl<std::pair<
VarTemplateDecl *, VarTemplatePartialSpecializationDecl *>>::iterator;
/// Return an iterator to the beginning of the set of
/// "delayed" partial specializations, which must be passed to
/// InstantiateClassTemplatePartialSpecialization once the class
/// definition has been completed.
delayed_partial_spec_iterator delayed_partial_spec_begin() {
return OutOfLinePartialSpecs.begin();
}
delayed_var_partial_spec_iterator delayed_var_partial_spec_begin() {
return OutOfLineVarPartialSpecs.begin();
}
/// Return an iterator to the end of the set of
/// "delayed" partial specializations, which must be passed to
/// InstantiateClassTemplatePartialSpecialization once the class
/// definition has been completed.
delayed_partial_spec_iterator delayed_partial_spec_end() {
return OutOfLinePartialSpecs.end();
}
delayed_var_partial_spec_iterator delayed_var_partial_spec_end() {
return OutOfLineVarPartialSpecs.end();
}
// Helper functions for instantiating methods.
TypeSourceInfo *SubstFunctionType(FunctionDecl *D,
SmallVectorImpl<ParmVarDecl *> &Params);
bool InitFunctionInstantiation(FunctionDecl *New, FunctionDecl *Tmpl);
bool InitMethodInstantiation(CXXMethodDecl *New, CXXMethodDecl *Tmpl);
bool SubstDefaultedFunction(FunctionDecl *New, FunctionDecl *Tmpl);
TemplateParameterList *
SubstTemplateParams(TemplateParameterList *List);
bool SubstQualifier(const DeclaratorDecl *OldDecl,
DeclaratorDecl *NewDecl);
bool SubstQualifier(const TagDecl *OldDecl,
TagDecl *NewDecl);
Decl *VisitVarTemplateSpecializationDecl(
VarTemplateDecl *VarTemplate, VarDecl *FromVar,
const TemplateArgumentListInfo &TemplateArgsInfo,
ArrayRef<TemplateArgument> Converted,
VarTemplateSpecializationDecl *PrevDecl = nullptr);
Decl *InstantiateTypedefNameDecl(TypedefNameDecl *D, bool IsTypeAlias);
ClassTemplatePartialSpecializationDecl *
InstantiateClassTemplatePartialSpecialization(
ClassTemplateDecl *ClassTemplate,
ClassTemplatePartialSpecializationDecl *PartialSpec);
VarTemplatePartialSpecializationDecl *
InstantiateVarTemplatePartialSpecialization(
VarTemplateDecl *VarTemplate,
VarTemplatePartialSpecializationDecl *PartialSpec);
void InstantiateEnumDefinition(EnumDecl *Enum, EnumDecl *Pattern);
private:
template<typename T>
Decl *instantiateUnresolvedUsingDecl(T *D,
bool InstantiatingPackElement = false);
};
} // namespace clang
#endif // LLVM_CLANG_SEMA_TEMPLATE_H
```
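The depth/index addressing documented above — argument lists stored innermost-first but looked up by absolute depth, with depth 0 as the outermost level — can be illustrated with a small toy model. This is a sketch of the indexing arithmetic only, in JavaScript for brevity, not the Clang API:

```javascript
// Toy model of multi-level template argument lists: lists are stored
// innermost-first, but looked up by absolute depth (0 = outermost),
// mirroring the `getNumLevels() - Depth - 1` indexing in the header above.
function makeMultiLevelArgs() {
	return {
		lists: [], // innermost argument list first, outermost last
		numLevels: function numLevels() {
			return this.lists.length;
		},
		get: function get( depth, index ) {
			return this.lists[ this.numLevels() - depth - 1 ][ index ];
		}
	};
}

// Instantiating X<int>::Y<17>::f, as in the doc comment:
var args = makeMultiLevelArgs();
args.lists.push( [ '17' ] );  // innermost: Y's argument list
args.lists.push( [ 'int' ] ); // outermost: X's argument list
console.log( args.get( 0, 0 ) ); // => 'int' (depth 0: the outer template)
console.log( args.get( 1, 0 ) ); // => '17'  (depth 1: the member template)
```

The reversal in `get` is why `operator()(Depth, Index)` and its assertions subtract the depth from the level count rather than indexing the container directly.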
|
The wildlife of Greece includes the diverse flora, fauna, and funga of Greece, a country in southern Europe. The country is mostly mountainous with a very long, convoluted coastline, consisting of peninsulas and many islands. The climate ranges from Mediterranean through temperate to alpine, and the habitats include mountains, hills, forests, rivers, lakes, coasts and cultivated land.
Geography
Greece is a country in the Balkan Peninsula of southern Europe, and lies to the south of Albania, North Macedonia and Bulgaria, and west of Turkey. It has a long coastline with the Mediterranean Sea and the Aegean Sea, and includes the island of Crete and many smaller islands. Mainland Greece covers about 80% of the total territory and is largely mountainous. The largest mountain group is the Pindus Range which forms the spine of the Greek mainland, with the highest peak rising to above sea level. The country's tallest mountain, Mount Olympus is further east, and rises to above sea level. The large Peloponnese peninsula, in the south of the country, is separated from the rest of the Greek mainland by the Corinthian and Saronic Gulfs, but joined by the Isthmus of Corinth.
Climate
Much of the country experiences a Mediterranean climate with warm or hot, dry summers and the rainfall falling in winter. The islands mostly have a Mediterranean climate, with the climate of Crete being particularly influenced by its maritime surroundings and proximity to Africa. Higher regions of the western and central parts of the country, and the mountainous parts of the Peloponnese, experience an Alpine climate. The climate is very varied across the country; snow may still be lying near the peaks in June while the lowlands are experiencing high temperatures.
Flora
Greece includes a great diversity of vascular plants among its flora. In 2013, there were 5,752 species and 1,893 subspecies of native and introduced plants, for a total of 6,620 taxa, including 1,278 endemic species and 452 endemic subspecies. By June 2018, the number of species had been revised upwards with 6,695 taxa being listed, consisting of 5,828 species and 1,982 subspecies, belonging to 1,073 genera and 185 families.
Much of Greece lies within the Aegean and Western Turkey sclerophyllous and mixed forests ecoregion. This is characterised by maquis shrubland which includes the evergreen oak and the Greek strawberry tree, as well as the kermes oak, the strawberry tree, the green olive, the bay laurel, the cedar, the Spanish broom and others. Intensive land use has reduced these forests to remnants. Of the deciduous species most common are the ash, the elm, the Montpellier maple, the Judas tree, the terebinth, the smoke tree and others. Greece was connected to western Turkey during the Pliocene epoch, and the two countries include many identical plants among their flora. The Cretan date palm has a very restricted range in southern Greece and Crete, with a few stands in Turkey.
Fauna
Larger, carnivorous mammals found in Greece include the European wildcat, the Balkan lynx, the red fox, the golden jackal, the grey wolf, the Eurasian brown bear, the American mink, the least weasel, the European polecat, the marbled polecat, the beech marten, the European pine marten, the European badger, the Eurasian otter and about twenty species of bat. The island of Gyaros is the breeding area for the largest population of the endangered Mediterranean monk seal, and about fifteen species of whales, dolphins and porpoises are reported in Greek waters.
Ungulates found in Greece include the wild boar, the red deer, the fallow deer, the roe deer, the chamois and the endangered Cretan ibex. Also present are the European rabbit and the European hare, the southern white-breasted hedgehog and the northern white-breasted hedgehog, the European mole, some ten species of shrew and around thirty species of rodent (squirrels, dormice, mice, rats and voles).
With its varied topography and habitats, Greece has a rich bird fauna. It is a meeting point for birds of three continents, the southern limit for some species and the northern limit for others. Beside the resident bird populations, many migratory species visit the country as they move seasonally between their breeding grounds and their overwintering areas. About 450 species of bird have been recorded in Greece. The Dadia Forest in the northeast is an important area for birds of prey, where four species of vulture are among the thirty-six diurnal species of raptor that have been recorded.
Birds commonly found in the maquis shrubland include the eastern subalpine and Rüppell's warblers, the cirl, rock and black-headed buntings, and the rock, red-legged and chukar partridges. Wetland birds are well catered for by a number of Ramsar sites such as Lake Kerkini, the Nestos Delta, and the Evros Delta and their freshwater marshes, lakes, brackish lagoons, saltmarshes and mudflats.
Greece's rivers also support abundant aquatic wildlife, with a diverse range of endemic freshwater fishes; around 160 species were listed in 2015. There are also several species of lamprey, notably three endemic to Greece: the Epirus brook lamprey, the Greek brook lamprey and the Almopaios brook lamprey. Lake inhabitants include the endemic Macedonian shad, a fish that was formerly fished commercially. Among the cyprinid fishes is an endemic barbel, the Evia barbel, found only on Evia Island; it is critically endangered, threatened by increasing droughts and by barriers to its movement.
References
Greece
Biota of Greece
|
The 1968 Philadelphia Phillies season was a season in Major League Baseball. The Phillies finished eighth in the National League with a record of 76 wins and 86 losses, 21 games behind the NL pennant-winning Cardinals.
Offseason
November 28, 1967: Doc Edwards was drafted by the Phillies from the Houston Astros in the 1967 minor league draft.
December 15, 1967: Jim Bunning was traded by the Phillies to the Pittsburgh Pirates for Woodie Fryman, Bill Laxton, Don Money and Harold Clem (minors).
January 26, 1968: Manny Trillo was signed by the Phillies as an amateur free agent.
Regular season
The Phillies were scheduled to open the 1968 season on April 9, 1968, in Los Angeles. However, the assassination of Martin Luther King Jr. on April 4 led to days of national unrest. President Johnson declared Monday, April 8, a national day of mourning, and the funeral was scheduled for April 9. The Dodgers initially refused to postpone the game, leading Phillies GM John Quinn and team president Bob Carpenter to announce that the Phillies would not play on April 9 even under threat of forfeit. On April 7, Quinn told reporters, "Under the rules, the game can be forfeited and we could be fined. But we have made our final decision. We will not play." In consultation with NL President Warren Giles, the Dodgers eventually agreed and postponed the game. The Phillies opened their season on April 10, 1968, with Chris Short's 2–0 shutout of the Dodgers.
On July 28, 1968, George Culver of the Cincinnati Reds pitched a 6–1 no-hitter against the Phillies in the second game of a doubleheader at Connie Mack Stadium.
Season standings
Record vs. opponents
Notable transactions
June 7, 1968: Buddy Schultz was drafted by the Phillies in the 4th round of the 1968 Major League Baseball Draft, but did not sign.
Game log
|- style="background:#bbb"
| – || April 9 || @ Dodgers || colspan=6 | Postponed (Funeral of Martin Luther King Jr.); Makeup: April 16
|- style="background:#bfb"
| 1 || April 10 || @ Dodgers || 2–0 || Chris Short (1–0) || Claude Osteen (0–1) || None || 28,138 || 1–0
|- style="background:#fbb"
| 2 || April 11 || @ Astros || 3–7 || Don Wilson (1–0) || Larry Jackson (0–1) || None || 11,972 || 1–1
|- style="background:#fbb"
| 3 || April 12 || @ Astros || 2–5 || Denny Lemaster (1–0) || Woodie Fryman (0–1) || John Buzhardt (1) || 16,415 || 1–2
|- style="background:#fbb"
| 4 || April 13 || @ Astros || 3–4 || Dave Giusti (1–0) || Grant Jackson (0–1) || None || 13,164 || 1–3
|- style="background:#fbb"
| 5 || April 14 (1) || @ Giants || 2–13 || Juan Marichal (1–0) || Rick Wise (0–1) || None || see 2nd game || 1–4
|- style="background:#fbb"
| 6 || April 14 (2) || @ Giants || 1–3 || Ray Sadecki (1–0) || Chris Short (1–1) || None || 18,314 || 1–5
|- style="background:#fbb"
| 7 || April 16 || @ Dodgers || 3–5 || Mike Kekich (1–0) || Larry Jackson (0–2) || Hank Aguirre (1) || 16,571 || 1–6
|- style="background:#bfb"
| 8 || April 17 || Dodgers || 3–2 || Woodie Fryman (1–1) || Don Drysdale (1–1) || Turk Farrell (1) || 15,817 || 2–6
|- style="background:#bfb"
| 9 || April 19 || Astros || 2–1 || Chris Short (2–1) || Dave Giusti (1–1) || None || 6,671 || 3–6
|- style="background:#bfb"
| 10 || April 20 || Astros || 7–1 || Larry Jackson (1–2) || Larry Dierker (1–2) || None || 3,738 || 4–6
|- style="background:#bfb"
| 11 || April 21 || Astros || 8–0 || Woodie Fryman (2–1) || Don Wilson (1–1) || None || 5,634 || 5–6
|- style="background:#bfb"
| 12 || April 22 || Giants || 2–1 (10) || Rick Wise (1–1) || Frank Linzy (1–2) || None || 4,231 || 6–6
|- style="background:#fbb"
| 13 || April 23 || Giants || 1–7 || Juan Marichal (3–0) || Chris Short (2–2) || None || 8,618 || 6–7
|- style="background:#bbb"
| – || April 24 || Giants || colspan=6 | Postponed (rain); Makeup: June 18 as a traditional double-header
|- style="background:#fbb"
| 14 || April 26 || @ Braves || 1–3 || Pat Jarvis (1–2) || Larry Jackson (1–3) || None || 10,614 || 6–8
|- style="background:#bfb"
| 15 || April 27 || @ Braves || 4–1 || Woodie Fryman (3–1) || Dick Kelley (1–2) || None || 14,207 || 7–8
|- style="background:#bfb"
| 16 || April 28 || @ Braves || 4–3 || Rick Wise (2–1) || Phil Niekro (2–2) || Turk Farrell (2) || 13,442 || 8–8
|- style="background:#fbb"
| 17 || April 30 || @ Mets || 0–1 || Don Cardwell (1–2) || Chris Short (2–3) || None || 3,771 || 8–9
|-
|- style="background:#bfb"
| 18 || May 1 || @ Mets || 7–2 (11) || Larry Jackson (2–3) || Ron Taylor (0–1) || Grant Jackson (1) || 11,450 || 9–9
|- style="background:#fbb"
| 19 || May 2 || @ Mets || 0–3 || Nolan Ryan (2–2) || Woodie Fryman (3–2) || Ron Taylor (2) || 9,795 || 9–10
|- style="background:#bfb"
| 20 || May 3 || Pirates || 3–2 || Turk Farrell (1–0) || Ron Kline (0–1) || None || 9,433 || 10–10
|- style="background:#bfb"
| 21 || May 4 || Pirates || 3–2 || Dick Hall (1–0) || Roy Face (0–1) || None || 15,834 || 11–10
|- style="background:#fbb"
| 22 || May 5 || Pirates || 2–5 || Dave Wickersham (1–0) || Larry Jackson (2–4) || Bob Moose (3) || 9,407 || 11–11
|- style="background:#fbb"
| 23 || May 6 || @ Reds || 1–10 || George Culver (1–1) || Woodie Fryman (3–3) || None || 3,991 || 11–12
|- style="background:#bfb"
| 24 || May 7 || @ Reds || 5–2 || Rick Wise (3–1) || Jim Maloney (2–2) || Turk Farrell (3) || 4,953 || 12–12
|- style="background:#bfb"
| 25 || May 8 || @ Reds || 6–2 || Dick Hall (2–0) || Bob Lee (2–2) || None || 3,535 || 13–12
|- style="background:#bfb"
| 26 || May 9 || @ Reds || 7–3 || Larry Jackson (3–4) || Milt Pappas (2–2) || Turk Farrell (4) || 3,735 || 14–12
|- style="background:#fbb"
| 27 || May 10 || @ Pirates || 1–2 || Bob Veale (1–3) || Woodie Fryman (3–4) || Roy Face (3) || 9,397 || 14–13
|- style="background:#bbb"
| – || May 11 || @ Pirates || colspan=6 | Postponed (rain); Makeup: July 11 as a traditional double-header
|- style="background:#fbb"
| 28 || May 12 || @ Pirates || 1–2 || Al McBean (5–2) || Jeff James (0–1) || None || 12,203 || 14–14
|- style="background:#fbb"
| 29 || May 13 || Braves || 2–4 || Phil Niekro (3–3) || Chris Short (2–4) || None || 3,126 || 14–15
|- style="background:#fbb"
| 30 || May 14 || Braves || 1–3 || Ron Reed (4–0) || Larry Jackson (3–5) || None || 4,531 || 14–16
|- style="background:#bbb"
| – || May 15 || Braves || colspan=6 | Postponed (rain); Makeup: July 26 as a traditional double-header
|- style="background:#bbb"
| – || May 16 || Braves || colspan=6 | Postponed (rain); Makeup: August 28 as a traditional double-header
|- style="background:#bfb"
| 31 || May 17 || Cardinals || 1–0 (10) || Woodie Fryman (4–4) || Bob Gibson (3–3) || None || 17,034 || 15–16
|- style="background:#bfb"
| 32 || May 18 || Cardinals || 3–2 || Larry Jackson (4–5) || Nelson Briles (5–3) || None || 12,941 || 16–16
|- style="background:#bfb"
| 33 || May 19 || Cardinals || 4–3 || Turk Farrell (2–0) || Joe Hoerner (2–1) || None || 27,725 || 17–16
|- style="background:#fbb"
| 34 || May 21 || @ Cubs || 5–6 || Rich Nye (3–4) || Turk Farrell (2–1) || None || 4,422 || 17–17
|- style="background:#bfb"
| 35 || May 22 || Mets || 8–0 || Woodie Fryman (5–4) || Don Cardwell (1–5) || None || 5,717 || 18–17
|- style="background:#bbb"
| – || May 23 || Mets || colspan=6 | Postponed (rain); Makeup: September 20 as a traditional double-header
|- style="background:#fbb"
| 36 || May 24 || @ Cardinals || 1–5 || Steve Carlton (5–1) || Chris Short (2–5) || None || 34,515 || 18–18
|- style="background:#bfb"
| 37 || May 25 || @ Cardinals || 1–0 || Larry Jackson (5–5) || Larry Jaster (2–2) || Turk Farrell (5) || 19,432 || 19–18
|- style="background:#bfb"
| 38 || May 26 || @ Cardinals || 9–3 || Woodie Fryman (6–4) || Hal Gilson (0–1) || None || 42,446 || 20–18
|- style="background:#bbb"
| – || May 28 || Cubs || colspan=6 | Postponed (rain); Makeup: July 17 as a traditional double-header
|- style="background:#fbb"
| 39 || May 29 (1) || Cubs || 2–9 || Ken Holtzman (4–3) || Chris Short (2–6) || None || see 2nd game || 20–19
|- style="background:#bfb"
| 40 || May 29 (2) || Cubs || 8–3 || Rick Wise (4–1) || Rich Nye (3–5) || None || 18,128 || 21–19
|- style="background:#bbb"
| – || May 30 || Cubs || colspan=6 | Postponed (rain); Makeup: September 13 as a traditional double-header
|- style="background:#fbb"
| 41 || May 31 || Reds || 4–5 || Gary Nolan (1–0) || Turk Farrell (2–2) || George Culver (2) || 9,112 || 21–20
|-
|- style="background:#bfb"
| 42 || June 1 || Reds || 12–0 || Woodie Fryman (7–4) || Milt Pappas (2–5) || None || 10,566 || 22–20
|- style="background:#fbb"
| 43 || June 2 || Reds || 3–5 || Jim Maloney (5–3) || Rick Wise (4–2) || None || 6,662 || 22–21
|- style="background:#bfb"
| 44 || June 3 || @ Giants || 1–0 || Chris Short (3–6) || Ray Sadecki (6–6) || None || 3,609 || 23–21
|- style="background:#bfb"
| 45 || June 4 || @ Giants || 5–1 || Larry Jackson (6–5) || Mike McCormick (4–7) || None || 4,870 || 24–21
|- style="background:#bfb"
| 46 || June 5 || @ Giants || 2–1 || Woodie Fryman (8–4) || Gaylord Perry (6–3) || None || 3,018 || 25–21
|- style="background:#fbb"
| 47 || June 6 || @ Giants || 2–7 || Juan Marichal (10–2) || Rick Wise (4–3) || None || 3,758 || 25–22
|- style="background:#fbb"
| 48 || June 7 || @ Dodgers || 0–2 || Claude Osteen (5–7) || Chris Short (3–7) || None || 18,249 || 25–23
|- style="background:#fbb"
| 49 || June 8 || @ Dodgers || 3–5 || Don Drysdale (8–3) || Larry Jackson (6–6) || Hank Aguirre (2) || 50,060 || 25–24
|- style="background:#fbb"
| 50 || June 9 || @ Dodgers || 3–4 || Jim Brewer (3–1) || Woodie Fryman (8–5) || None || 18,781 || 25–25
|- style="background:#fbb"
| 51 || June 11 || Astros || 1–5 || Larry Dierker (6–8) || Rick Wise (4–4) || None || 5,243 || 25–26
|- style="background:#bbb"
| – || June 12 || Astros || colspan=6 | Postponed (rain); Makeup: August 13 as a traditional double-header
|- style="background:#bfb"
| 52 || June 13 || Astros || 3–2 || Chris Short (4–7) || Dave Giusti (4–7) || Turk Farrell (6) || 4,542 || 26–26
|- style="background:#fbb"
| 53 || June 14 (1) || Dodgers || 0–6 || Bill Singer (6–5) || Jeff James (0–2) || None || see 2nd game || 26–27
|- style="background:#bfb"
| 54 || June 14 (2) || Dodgers || 2–1 || Woodie Fryman (9–5) || Jim Brewer (3–2) || None || 19,716 || 27–27
|- style="background:#bfb"
| 55 || June 15 || Dodgers || 6–5 || Turk Farrell (3–2) || Hank Aguirre (0–1) || None || 11,868 || 28–27
|- style="background:#fbb"
| 56 || June 16 || Dodgers || 1–2 || Claude Osteen (6–8) || Rick Wise (4–5) || Jim Brewer (3) || 29,084 || 28–28
|- style="background:#bbb"
| – || June 17 || Dodgers || colspan=6 | Postponed (rain); Makeup: September 2 as a traditional double-header
|- style="background:#bfb"
| 57 || June 18 (1) || Giants || 10–2 || Chris Short (5–7) || Gaylord Perry (6–4) || None || see 2nd game || 29–28
|- style="background:#bfb"
| 58 || June 18 (2) || Giants || 9–1 || Woodie Fryman (10–5) || Mike McCormick (5–9) || None || 22,184 || 30–28
|- style="background:#fbb"
| 59 || June 19 || Giants || 1–5 || Juan Marichal (13–2) || Larry Jackson (6–7) || None || 15,520 || 30–29
|- style="background:#bfb"
| 60 || June 20 || Giants || 2–1 || Rick Wise (5–5) || Ray Sadecki (7–9) || None || 12,656 || 31–29
|- style="background:#fbb"
| 61 || June 21 || @ Astros || 1–2 || Mike Cuellar (4–3) || Jeff James (0–3) || None || 19,274 || 31–30
|- style="background:#bfb"
| 62 || June 22 || @ Astros || 7–6 || Gary Wagner (1–0) || Wade Blasingame (1–2) || Turk Farrell (7) || 21,015 || 32–30
|- style="background:#fbb"
| 63 || June 23 || @ Astros || 4–7 || Denny Lemaster (7–6) || Woodie Fryman (10–6) || Fred Gladding (2) || 15,876 || 32–31
|- style="background:#fbb"
| 64 || June 25 || @ Braves || 1–6 || Ron Reed (8–3) || Larry Jackson (6–8) || None || 11,876 || 32–32
|- style="background:#bfb"
| 65 || June 26 || @ Braves || 3–2 (11) || John Boozer (1–0) || Jim Britton (3–2) || Turk Farrell (8) || 10,128 || 33–32
|- style="background:#fbb"
| 66 || June 27 || @ Braves || 3–4 || Pat Jarvis (8–5) || Chris Short (5–8) || Cecil Upshaw (4) || 12,347 || 33–33
|- style="background:#fbb"
| 67 || June 28 || Pirates || 1–10 || Jim Bunning (4–9) || Woodie Fryman (10–7) || None || 18,994 || 33–34
|- style="background:#fbb"
| 68 || June 29 || Pirates || 0–1 || Bob Moose (3–5) || Larry Jackson (6–9) || None || 17,052 || 33–35
|- style="background:#fbb"
| 69 || June 30 || Pirates || 2–5 || Bob Veale (6–7) || Turk Farrell (3–3) || None || 8,884 || 33–36
|-
|- style="background:#bfb"
| 70 || July 1 || @ Cubs || 6–4 || Chris Short (6–8) || Rich Nye (4–9) || John Boozer (1) || 9,614 || 34–36
|- style="background:#fbb"
| 71 || July 2 || @ Cubs || 3–5 || Ferguson Jenkins (7–9) || Woodie Fryman (10–8) || None || 10,932 || 34–37
|- style="background:#bfb"
| 72 || July 3 || @ Cubs || 3–2 || Larry Jackson (7–9) || Ken Holtzman (5–5) || Chris Short (1) || 9,179 || 35–37
|- style="background:#fbb"
| 73 || July 4 (1) || @ Cubs || 2–6 || Joe Niekro (7–6) || Grant Jackson (0–2) || Phil Regan (13) || see 2nd game || 35–38
|- style="background:#bfb"
| 74 || July 4 (2) || @ Cubs || 7–4 || Jeff James (1–3) || Darcy Fast (0–1) || John Boozer (2) || 21,516 || 36–38
|- style="background:#bfb"
| 75 || July 5 || Mets || 3–1 || Chris Short (7–8) || Tom Seaver (7–6) || None || 10,084 || 37–38
|- style="background:#fbb"
| 76 || July 6 || Mets || 6–11 || Al Jackson (2–3) || Woodie Fryman (10–9) || Cal Koonce (6) || 4,032 || 37–39
|- style="background:#bfb"
| 77 || July 7 (1) || Mets || 4–3 || Dick Hall (3–0) || Ron Taylor (1–2) || None || see 2nd game || 38–39
|- style="background:#fbb"
| 78 || July 7 (2) || Mets || 2–4 || Danny Frisella (2–3) || Larry Jackson (7–10) || Tom Seaver (1) || 14,478 || 38–40
|- style="background:#bbcaff;"
| – || July 9 ||colspan="7" |1968 Major League Baseball All-Star Game at the Houston Astrodome in Houston
|- style="background:#bfb"
| 79 || July 11 (1) || @ Pirates || 5–0 || Larry Jackson (8–10) || Bob Veale (7–9) || None || see 2nd game || 39–40
|- style="background:#bfb"
| 80 || July 11 (2) || @ Pirates || 4–1 || Chris Short (8–8) || Bob Moose (3–6) || John Boozer (3) || 15,371 || 40–40
|- style="background:#bfb"
| 81 || July 12 || @ Pirates || 3–2 || Jeff James (2–3) || Jim Bunning (4–11) || John Boozer (4) || 9,206 || 41–40
|- style="background:#bfb"
| 82 || July 13 || @ Pirates || 3–2 (16) || Chris Short (9–8) || Dock Ellis (1–1) || None || 6,869 || 42–40
|- style="background:#bfb"
| 83 || July 14 (1) || @ Mets || 5–3 || Rick Wise (6–5) || Al Jackson (2–4) || None || see 2nd game || 43–40
|- style="background:#bfb"
| 84 || July 14 (2) || @ Mets || 9–2 || Grant Jackson (1–2) || Danny Frisella (2–4) || None || 57,011 || 44–40
|- style="background:#bfb"
| 85 || July 15 || @ Mets || 5–3 || Larry Jackson (9–10) || Nolan Ryan (6–8) || John Boozer (5) || 20,628 || 45–40
|- style="background:#fbb"
| 86 || July 16 || Cubs || 3–4 (12) || Phil Regan (8–2) || Gary Wagner (1–1) || Joe Niekro (1) || 11,980 || 45–41
|- style="background:#fbb"
| 87 || July 17 (1) || Cubs || 4–8 || Bill Hands (9–5) || Woodie Fryman (10–10) || Phil Regan (14) || see 2nd game || 45–42
|- style="background:#bfb"
| 88 || July 17 (2) || Cubs || 8–0 || Jeff James (3–3) || Rich Nye (4–11) || None || 17,920 || 46–42
|- style="background:#fbb"
| 89 || July 19 || @ Reds || 2–9 || George Culver (7–9) || Rick Wise (6–6) || Clay Carroll (3) || 12,400 || 46–43
|- style="background:#fbb"
| 90 || July 20 || @ Reds || 3–9 || Gerry Arrigo (5–5) || Larry Jackson (9–11) || Ted Abernathy (10) || 13,256 || 46–44
|- style="background:#fbb"
| 91 || July 21 || @ Reds || 6–12 || Tony Cloninger (2–5) || Chris Short (9–9) || Ted Abernathy (11) || 10,885 || 46–45
|- style="background:#fbb"
| 92 || July 22 || @ Cardinals || 4–5 || Wayne Granger (4–0) || John Boozer (1–1) || None || 17,619 || 46–46
|- style="background:#fbb"
| 93 || July 23 || @ Cardinals || 5–11 || Larry Jaster (8–5) || Rick Wise (6–7) || Dick Hughes (3) || 26,199 || 46–47
|- style="background:#fbb"
| 94 || July 24 || @ Cardinals || 1–3 || Ray Washburn (9–3) || Larry Jackson (9–12) || Joe Hoerner (11) || 23,828 || 46–48
|- style="background:#fbb"
| 95 || July 25 || @ Cardinals || 0–5 || Bob Gibson (14–5) || Chris Short (9–10) || None || 28,147 || 46–49
|- style="background:#fbb"
| 96 || July 26 (1) || Braves || 4–5 || George Stone (1–1) || Grant Jackson (1–3) || Claude Raymond (7) || see 2nd game || 46–50
|- style="background:#fbb"
| 97 || July 26 (2) || Braves || 2–3 || Milt Pappas (6–7) || Jeff James (3–4) || Cecil Upshaw (7) || 16,334 || 46–51
|- style="background:#bfb"
| 98 || July 27 || Braves || 1–0 || Woodie Fryman (11–10) || Jim Britton (4–5) || None || 12,020 || 47–51
|- style="background:#bfb"
| 99 || July 28 || Braves || 3–0 || Larry Jackson (10–12) || Pat Jarvis (10–8) || None || 8,173 || 48–51
|- style="background:#fbb"
| 100 || July 29 (1) || Reds || 6–7 || Ted Abernathy (8–1) || Turk Farrell (3–4) || None || see 2nd game || 48–52
|- style="background:#fbb"
| 101 || July 29 (2) || Reds || 1–6 || George Culver (9–9) || Chris Short (9–11) || None || 14,083 || 48–53
|- style="background:#fbb"
| 102 || July 30 || Reds || 2–5 || Tony Cloninger (3–6) || Rick Wise (6–8) || Clay Carroll (6) || 7,213 || 48–54
|- style="background:#fbb"
| 103 || July 31 || Cardinals || 2–3 || Nelson Briles (13–7) || Woodie Fryman (11–11) || None || 14,811 || 48–55
|-
|- style="background:#fbb"
| 104 || August 1 || Cardinals || 1–2 (8) || Steve Carlton (11–5) || Larry Jackson (10–13) || None || 12,674 || 48–56
|- style="background:#fbb"
| 105 || August 2 || @ Astros || 3–4 || Pat House (1–0) || Grant Jackson (1–4) || Steve Shea (3) || 12,957 || 48–57
|- style="background:#bfb"
| 106 || August 3 || @ Astros || 2–1 || Chris Short (10–11) || Mike Cuellar (6–6) || None || 19,185 || 49–57
|- style="background:#bfb"
| 107 || August 4 || @ Astros || 3–2 || Rick Wise (7–8) || Steve Shea (1–2) || None || 15,003 || 50–57
|- style="background:#bfb"
| 108 || August 5 || @ Giants || 6–4 (10) || Dick Hall (4–0) || Mike McCormick (7–13) || None || 5,429 || 51–57
|- style="background:#fbb"
| 109 || August 6 || @ Giants || 1–4 || Gaylord Perry (10–10) || Larry Jackson (10–14) || None || 6,246 || 51–58
|- style="background:#fbb"
| 110 || August 7 || @ Giants || 3–4 || Frank Linzy (5–7) || Dick Hall (4–1) || None || 5,109 || 51–59
|- style="background:#bfb"
| 111 || August 8 || @ Dodgers || 1–0 || Rick Wise (8–8) || Bill Singer (9–11) || None || 14,198 || 52–59
|- style="background:#bfb"
| 112 || August 9 || @ Dodgers || 3–2 || Chris Short (11–11) || Claude Osteen (8–17) || Gary Wagner (1) || 15,150 || 53–59
|- style="background:#fbb"
| 113 || August 10 || @ Dodgers || 2–3 (14) || Hank Aguirre (1–2) || Grant Jackson (1–5) || None || 15,559 || 53–60
|- style="background:#fbb"
| 114 || August 11 || @ Dodgers || 0–1 || Don Drysdale (13–10) || Larry Jackson (10–15) || None || 13,365 || 53–61
|- style="background:#fbb"
| 115 || August 13 (1) || Astros || 0–5 || Don Wilson (9–12) || Rick Wise (8–9) || None || see 2nd game || 53–62
|- style="background:#bfb"
| 116 || August 13 (2) || Astros || 4–2 || Jeff James (4–4) || Mike Cuellar (6–8) || Gary Wagner (2) || 7,021 || 54–62
|- style="background:#bfb"
| 117 || August 14 || Astros || 4–3 || Chris Short (12–11) || Denny Lemaster (9–12) || Gary Wagner (3) || 4,040 || 55–62
|- style="background:#fbb"
| 118 || August 15 || Astros || 2–3 || Dave Giusti (7–12) || Turk Farrell (3–5) || Danny Coombs (1) || 3,217 || 55–63
|- style="background:#fbb"
| 119 || August 16 || Giants || 5–7 || Joe Gibbon (1–2) || Gary Wagner (1–2) || Bill Monbouquette (1) || 18,586 || 55–64
|- style="background:#fbb"
| 120 || August 17 || Giants || 4–6 || Juan Marichal (22–5) || Rick Wise (8–10) || Frank Linzy (8) || 9,526 || 55–65
|- style="background:#bfb"
| 121 || August 18 || Giants || 5–3 || Gary Wagner (2–2) || Bill Monbouquette (5–8) || None || 11,562 || 56–65
|- style="background:#fbb"
| 122 || August 19 || Cardinals || 0–2 || Bob Gibson (18–5) || Woodie Fryman (11–12) || None || 12,278 || 56–66
|- style="background:#bfb"
| 123 || August 20 || Cardinals || 8–2 || Larry Jackson (11–15) || Nelson Briles (16–8) || None || 9,379 || 57–66
|- style="background:#fbb"
| 124 || August 21 || Cardinals || 3–8 || Dick Hughes (2–2) || Jerry Johnson (0–1) || Joe Hoerner (13) || 9,500 || 57–67
|- style="background:#bfb"
| 125 || August 22 || Cardinals || 7–3 || Chris Short (13–11) || Larry Jaster (8–10) || None || 10,193 || 58–67
|- style="background:#fbb"
| 126 || August 23 || @ Braves || 0–6 || Pat Jarvis (13–9) || Rick Wise (8–11) || None || 23,408 || 58–68
|- style="background:#bfb"
| 127 || August 24 || @ Braves || 4–3 || Jerry Johnson (1–1) || George Stone (3–3) || Gary Wagner (4) || 9,053 || 59–68
|- style="background:#bfb"
| 128 || August 25 || @ Braves || 4–1 || Larry Jackson (12–15) || Phil Niekro (11–10) || None || 8,049 || 60–68
|- style="background:#fbb"
| 129 || August 26 || @ Reds || 5–6 || Clay Carroll (6–5) || Gary Wagner (2–3) || None || 6,221 || 60–69
|- style="background:#fbb"
| 130 || August 27 || @ Reds || 0–10 || Tony Cloninger (5–6) || Rick Wise (8–12) || None || 6,623 || 60–70
|- style="background:#fbb"
| 131 || August 28 (1) || Braves || 2–9 || George Stone (4–3) || Woodie Fryman (11–13) || None || see 2nd game || 60–71
|- style="background:#fbb"
| 132 || August 28 (2) || Braves || 1–2 || Pat Jarvis (14–9) || Jerry Johnson (1–2) || None || 6,713 || 60–72
|- style="background:#fbb"
| 133 || August 29 || Braves || 0–6 || Phil Niekro (11–11) || Larry Jackson (12–16) || None || 4,396 || 60–73
|- style="background:#bfb"
| 134 || August 30 || Reds || 7–4 || Chris Short (14–11) || Jim Maloney (11–9) || Turk Farrell (9) || 6,614 || 61–73
|- style="background:#bfb"
| 135 || August 31 || Reds || 3–2 || Turk Farrell (4–5) || Clay Carroll (6–6) || None || 6,629 || 62–73
|-
|- style="background:#bfb"
| 136 || September 1 || Reds || 4–3 || Jerry Johnson (2–2) || Ted Abernathy (9–3) || None || 4,381 || 63–73
|- style="background:#bfb"
| 137 || September 2 (1) || Dodgers || 5–4 || John Boozer (2–1) || John Purdin (2–3) || Gary Wagner (5) || see 2nd game || 64–73
|- style="background:#bfb"
| 138 || September 2 (2) || Dodgers || 7–5 || Woodie Fryman (12–13) || Mike Kekich (2–9) || Turk Farrell (10) || 5,240 || 65–73
|- style="background:#fbb"
| 139 || September 3 || Dodgers || 9–10 || Jim Brewer (7–3) || John Boozer (2–2) || None || 2,812 || 65–74
|- style="background:#fbb"
| 140 || September 4 || Dodgers || 0–3 || Don Sutton (7–14) || Larry Jackson (12–17) || None || 3,282 || 65–75
|- style="background:#bfb"
| 141 || September 6 || @ Cubs || 5–2 || Rick Wise (9–12) || Bill Hands (15–9) || None || 2,621 || 66–75
|- style="background:#bfb"
| 142 || September 7 || @ Cubs || 4–2 || Chris Short (15–11) || Ferguson Jenkins (17–13) || Gary Wagner (6) || 13,578 || 67–75
|- style="background:#fbb"
| 143 || September 8 || @ Cubs || 3–10 || Ken Holtzman (10–11) || Woodie Fryman (12–14) || None || 15,789 || 67–76
|- style="background:#bfb"
| 144 || September 9 || @ Pirates || 8–7 (15) || Chris Short (16–11) || Bruce Dal Canton (1–1) || None || 2,664 || 68–76
|- style="background:#bbb"
| – || September 10 || @ Pirates || colspan=6 | Postponed (rain); Makeup: September 11 as a traditional double-header
|- style="background:#bfb"
| 145 || September 11 || @ Pirates || 8–6 (12) || Gary Wagner (3–3) || Al McBean (9–12) || Turk Farrell (11) || see 2nd game || 69–76
|- style="background:#fbb"
| 146 || September 12 || @ Pirates || 4–6 || Steve Blass (15–5) || Rick Wise (9–13) || Luke Walker (3) || 2,789 || 69–77
|- style="background:#bfb"
| 147 || September 13 (1) || Cubs || 3–1 || Chris Short (17–11) || Ken Holtzman (10–12) || None || see 2nd game || 70–77
|- style="background:#fbb"
| 148 || September 13 (2) || Cubs || 1–9 || Rich Nye (6–12) || Jerry Johnson (2–3) || None || 5,253 || 70–78
|- style="background:#bfb"
| 149 || September 14 || Cubs || 4–1 || Larry Jackson (13–17) || Bill Hands (16–10) || None || 2,251 || 71–78
|- style="background:#fbb"
| 150 || September 15 || Cubs || 0–4 || Ferguson Jenkins (18–14) || Grant Jackson (1–6) || None || 4,015 || 71–79
|- style="background:#fbb"
| 151 || September 16 || Pirates || 1–6 || Dock Ellis (5–4) || Rick Wise (9–14) || None || 2,087 || 71–80
|- style="background:#fbb"
| 152 || September 17 || Pirates || 2–4 || Bob Moose (7–10) || Chris Short (17–12) || Bruce Dal Canton (1) || 2,576 || 71–81
|- style="background:#bfb"
| 153 || September 18 || Pirates || 2–1 || Jerry Johnson (3–3) || Bob Veale (13–14) || Gary Wagner (7) || 2,463 || 72–81
|- style="background:#fbb"
| 154 || September 20 (1) || Mets || 2–3 || Tom Seaver (15–11) || Gary Wagner (3–4) || None || see 2nd game || 72–82
|- style="background:#fbb"
| 155 || September 20 (2) || Mets || 4–5 || Cal Koonce (6–4) || Turk Farrell (4–6) || None || 4,443 || 72–83
|- style="background:#bfb"
| 156 || September 21 || Mets || 4–3 || Chris Short (18–12) || Dick Selma (9–10) || Gary Wagner (8) || 1,854 || 73–83
|- style="background:#fbb"
| 157 || September 22 || Mets || 2–5 || Jim McAndrew (4–7) || Rick Wise (9–15) || Don Cardwell (1) || 3,259 || 73–84
|- style="background:#bfb"
| 158 || September 24 || @ Cardinals || 2–1 || Jerry Johnson (4–3) || Ray Washburn (13–8) || None || 10,530 || 74–84
|- style="background:#fbb"
| 159 || September 25 || @ Cardinals || 4–5 || Nelson Briles (19–11) || Chris Short (18–13) || Joe Hoerner (16) || 10,992 || 74–85
|- style="background:#bfb"
| 160 || September 27 || @ Mets || 3–2 (11) || Gary Wagner (4–4) || Ron Taylor (1–5) || Turk Farrell (12) || 11,169 || 75–85
|- style="background:#fbb"
| 161 || September 28 || @ Mets || 1–3 || Jerry Koosman (19–12) || Jerry Johnson (4–4) || None || 9,140 || 75–86
|- style="background:#bfb"
| 162 || September 29 || @ Mets || 10–3 || Chris Short (19–13) || Tom Seaver (16–12) || None || 29,302 || 76–86
|-
Roster
Player stats
Batting
Starters by position
Note: Pos = Position; G = Games played; AB = At bats; H = Hits; Avg. = Batting average; HR = Home runs; RBI = Runs batted in
Other batters
Note: G = Games played; AB = At bats; H = Hits; Avg. = Batting average; HR = Home runs; RBI = Runs batted in
Pitching
Starting pitchers
Note: G = Games pitched; IP = Innings pitched; W = Wins; L = Losses; ERA = Earned run average; SO = Strikeouts
Other pitchers
Note: G = Games pitched; IP = Innings pitched; W = Wins; L = Losses; ERA = Earned run average; SO = Strikeouts
Relief pitchers
Note: G = Games pitched; W = Wins; L = Losses; SV = Saves; ERA = Earned run average; SO = Strikeouts
Farm system
LEAGUE CHAMPIONS: Reading
Notes
References
1968 Philadelphia Phillies season at Baseball Reference
Philadelphia Phillies seasons
Philadelphia Phillies season
|
```c++
// Licensed to the Apache Software Foundation (ASF) under one
// or more contributor license agreements.  See the NOTICE file
// distributed with this work for additional information
// regarding copyright ownership.  The ASF licenses this file
// to you under the Apache License, Version 2.0 (the
// "License"); you may not use this file except in compliance
// with the License.  You may obtain a copy of the License at
//
//   http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing,
// software distributed under the License is distributed on an
// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
// KIND, either express or implied.  See the License for the
// specific language governing permissions and limitations
// under the License.
#include "kudu/common/schema.h"
#include <cstddef>
#include <cstdint>
#include <string>
#include <tuple> // IWYU pragma: keep
#include <utility>
#include <vector>
#include <glog/logging.h> // IWYU pragma: keep
#include <gtest/gtest.h>
#include "kudu/common/common.pb.h"
#include "kudu/common/key_encoder.h"
#include "kudu/common/row.h"
#include "kudu/common/types.h"
#include "kudu/gutil/strings/stringpiece.h"
#include "kudu/gutil/strings/substitute.h"
#include "kudu/util/faststring.h"
#include "kudu/util/hexdump.h"
#include "kudu/util/int128.h"
#include "kudu/util/memory/arena.h"
#include "kudu/util/slice.h"
#include "kudu/util/status.h"
#include "kudu/util/stopwatch.h" // IWYU pragma: keep
#include "kudu/util/test_macros.h"
#include "kudu/util/test_util.h"
using std::string;
using std::vector;
using strings::Substitute;
namespace kudu {
// Return true if the schemas have exactly the same set of columns
// and respective types.
bool EqualSchemas(const Schema& lhs, const Schema& rhs) {
if (lhs != rhs) {
return false;
}
if (lhs.num_key_columns_ != rhs.num_key_columns_) {
return false;
}
if (lhs.num_columns() != rhs.num_columns()) {
return false;
}
for (size_t i = 0; i < rhs.num_columns(); ++i) {
if (!lhs.cols_[i].Equals(rhs.cols_[i])) {
return false;
}
}
if (lhs.has_column_ids() != rhs.has_column_ids()) {
return false;
}
if (lhs.has_column_ids()) {
if (lhs.col_ids_ != rhs.col_ids_) {
return false;
}
if (lhs.max_col_id() != rhs.max_col_id()) {
return false;
}
}
return true;
}
namespace tablet {
// Copy a row and its referenced data into the given Arena.
static Status CopyRowToArena(const Slice& row,
Arena* dst_arena,
ContiguousRow* copied) {
Slice row_data;
// Copy the direct row data to arena
if (!dst_arena->RelocateSlice(row, &row_data)) {
return Status::IOError("no space for row data in arena");
}
copied->Reset(row_data.mutable_data());
return RelocateIndirectDataToArena(copied, dst_arena);
}
class TestSchema : public KuduTest {};
// Test basic functionality of Schema definition
TEST_F(TestSchema, TestSchema) {
Schema empty_schema;
ASSERT_GT(empty_schema.memory_footprint_excluding_this(), 0);
ColumnSchema col1("key", STRING);
ColumnSchema col2("uint32val", UINT32, true);
ColumnSchema col3("int32val", INT32);
vector<ColumnSchema> cols = { col1, col2, col3 };
Schema schema(cols, 1);
ASSERT_EQ(sizeof(Slice) + sizeof(uint32_t) + sizeof(int32_t),
schema.byte_size());
ASSERT_EQ(3, schema.num_columns());
ASSERT_EQ(0, schema.column_offset(0));
ASSERT_EQ(sizeof(Slice), schema.column_offset(1));
ASSERT_GT(schema.memory_footprint_excluding_this(),
empty_schema.memory_footprint_excluding_this());
EXPECT_EQ("(\n"
" key STRING NOT NULL,\n"
" uint32val UINT32 NULLABLE,\n"
" int32val INT32 NOT NULL,\n"
" PRIMARY KEY (key)\n"
")",
schema.ToString());
EXPECT_EQ("key STRING NOT NULL", schema.column(0).ToString());
EXPECT_EQ("UINT32 NULLABLE", schema.column(1).TypeToString());
}
TEST_F(TestSchema, TestSchemaToStringMode) {
SchemaBuilder builder;
builder.AddKeyColumn("key", DataType::INT32);
const auto schema = builder.Build();
EXPECT_EQ(
Substitute("(\n"
" $0:key INT32 NOT NULL,\n"
" PRIMARY KEY (key)\n"
")",
schema.column_id(0)),
schema.ToString());
EXPECT_EQ("(\n"
" key INT32 NOT NULL,\n"
" PRIMARY KEY (key)\n"
")",
schema.ToString(Schema::ToStringMode::BASE_INFO));
}
enum IncludeColumnIds {
INCLUDE_COL_IDS,
NO_COL_IDS
};
class ParameterizedSchemaTest : public KuduTest,
public ::testing::WithParamInterface<IncludeColumnIds> {};
INSTANTIATE_TEST_SUITE_P(SchemaTypes, ParameterizedSchemaTest,
::testing::Values(INCLUDE_COL_IDS, NO_COL_IDS));
TEST_P(ParameterizedSchemaTest, TestCopyAndMove) {
auto check_schema = [](const Schema& schema) {
ASSERT_EQ(sizeof(Slice) + sizeof(uint32_t) + sizeof(int32_t),
schema.byte_size());
ASSERT_EQ(3, schema.num_columns());
ASSERT_EQ(0, schema.column_offset(0));
ASSERT_EQ(sizeof(Slice), schema.column_offset(1));
EXPECT_EQ(Substitute("(\n"
" $0key STRING NOT NULL,\n"
" $1uint32val UINT32 NULLABLE,\n"
" $2int32val INT32 NOT NULL,\n"
" PRIMARY KEY (key)\n"
")",
schema.has_column_ids() ? "0:" : "",
schema.has_column_ids() ? "1:" : "",
schema.has_column_ids() ? "2:" : ""),
schema.ToString());
EXPECT_EQ("key STRING NOT NULL", schema.column(0).ToString());
EXPECT_EQ("UINT32 NULLABLE", schema.column(1).TypeToString());
};
ColumnSchema col1("key", STRING);
ColumnSchema col2("uint32val", UINT32, true);
ColumnSchema col3("int32val", INT32);
vector<ColumnSchema> cols = { col1, col2, col3 };
vector<ColumnId> ids = { ColumnId(0), ColumnId(1), ColumnId(2) };
constexpr int kNumKeyCols = 1;
const auto& schema = GetParam() == INCLUDE_COL_IDS
? Schema(cols, ids, kNumKeyCols) : Schema(cols, kNumKeyCols);
NO_FATALS(check_schema(schema));
// Check copy- and move-assignment.
Schema moved_schema;
{
Schema copied_schema = schema;
NO_FATALS(check_schema(copied_schema));
ASSERT_TRUE(EqualSchemas(schema, copied_schema));
    // Move-assign to 'moved_schema' from 'copied_schema' and then let
    // 'copied_schema' go out of scope to make sure none of the 'moved_schema'
    // resources are incorrectly freed.
moved_schema = std::move(copied_schema);
// 'copied_schema' is moved from so it should still be valid to call
// ToString(), though we can't expect any particular result.
copied_schema.ToString(); // NOLINT(*)
}
NO_FATALS(check_schema(moved_schema));
ASSERT_TRUE(EqualSchemas(schema, moved_schema));
// Check copy- and move-construction.
{
Schema copied_schema(schema);
NO_FATALS(check_schema(copied_schema));
ASSERT_TRUE(EqualSchemas(schema, copied_schema));
Schema moved_schema(std::move(copied_schema));
copied_schema.ToString(); // NOLINT(*)
NO_FATALS(check_schema(moved_schema));
ASSERT_TRUE(EqualSchemas(schema, moved_schema));
}
}
// Test basic functionality of Schema definition with decimal columns
TEST_F(TestSchema, TestSchemaWithDecimal) {
ColumnSchema col1("key", STRING);
ColumnSchema col2("decimal32val", DECIMAL32, false, false, false,
nullptr, nullptr, ColumnStorageAttributes(),
ColumnTypeAttributes(9, 4));
ColumnSchema col3("decimal64val", DECIMAL64, true, false, false,
nullptr, nullptr, ColumnStorageAttributes(),
ColumnTypeAttributes(18, 10));
ColumnSchema col4("decimal128val", DECIMAL128, true, false, false,
nullptr, nullptr, ColumnStorageAttributes(),
ColumnTypeAttributes(38, 2));
vector<ColumnSchema> cols = { col1, col2, col3, col4 };
Schema schema(cols, 1);
ASSERT_EQ(sizeof(Slice) + sizeof(int32_t) +
sizeof(int64_t) + sizeof(int128_t),
schema.byte_size());
EXPECT_EQ("(\n"
" key STRING NOT NULL,\n"
" decimal32val DECIMAL(9, 4) NOT NULL,\n"
" decimal64val DECIMAL(18, 10) NULLABLE,\n"
" decimal128val DECIMAL(38, 2) NULLABLE,\n"
" PRIMARY KEY (key)\n"
")",
schema.ToString());
EXPECT_EQ("DECIMAL(9, 4) NOT NULL", schema.column(1).TypeToString());
EXPECT_EQ("DECIMAL(18, 10) NULLABLE", schema.column(2).TypeToString());
EXPECT_EQ("DECIMAL(38, 2) NULLABLE", schema.column(3).TypeToString());
}
// Test Schema::Equals respects decimal column attributes
TEST_F(TestSchema, TestSchemaEqualsWithDecimal) {
ColumnSchema col1("key", STRING);
ColumnSchema col_18_10("decimal64val", DECIMAL64, true, false, false,
nullptr, nullptr, ColumnStorageAttributes(),
ColumnTypeAttributes(18, 10));
ColumnSchema col_18_9("decimal64val", DECIMAL64, true, false, false,
nullptr, nullptr, ColumnStorageAttributes(),
ColumnTypeAttributes(18, 9));
ColumnSchema col_17_10("decimal64val", DECIMAL64, true, false, false,
nullptr, nullptr, ColumnStorageAttributes(),
ColumnTypeAttributes(17, 10));
ColumnSchema col_17_9("decimal64val", DECIMAL64, true, false, false,
nullptr, nullptr, ColumnStorageAttributes(),
ColumnTypeAttributes(17, 9));
Schema schema_18_10({ col1, col_18_10 }, 1);
Schema schema_18_9({ col1, col_18_9 }, 1);
Schema schema_17_10({ col1, col_17_10 }, 1);
Schema schema_17_9({ col1, col_17_9 }, 1);
EXPECT_EQ(schema_18_10, schema_18_10);
EXPECT_NE(schema_18_10, schema_18_9);
EXPECT_NE(schema_18_10, schema_17_10);
EXPECT_NE(schema_18_10, schema_17_9);
}
TEST_F(TestSchema, TestColumnSchemaEquals) {
Slice default_str("read-write default");
ColumnSchema col1("key", STRING);
ColumnSchema col2("key1", STRING);
ColumnSchema col3("key", STRING, true);
ColumnSchema col4("key", STRING, true, false, false, &default_str, &default_str);
ASSERT_TRUE(col1.Equals(col1));
ASSERT_FALSE(col1.Equals(col2, ColumnSchema::COMPARE_NAME));
ASSERT_TRUE(col1.Equals(col2, ColumnSchema::COMPARE_TYPE));
ASSERT_TRUE(col1.Equals(col3, ColumnSchema::COMPARE_NAME));
ASSERT_FALSE(col1.Equals(col3, ColumnSchema::COMPARE_TYPE));
ASSERT_TRUE(col1.Equals(col3, ColumnSchema::COMPARE_OTHER));
ASSERT_FALSE(col3.Equals(col4, ColumnSchema::COMPARE_OTHER));
ASSERT_TRUE(col4.Equals(col4, ColumnSchema::COMPARE_OTHER));
}
TEST_F(TestSchema, TestSchemaEquals) {
Schema schema1({ ColumnSchema("col1", STRING),
ColumnSchema("col2", STRING),
ColumnSchema("col3", UINT32) },
2);
Schema schema2({ ColumnSchema("newCol1", STRING),
ColumnSchema("newCol2", STRING),
ColumnSchema("newCol3", UINT32) },
2);
Schema schema3({ ColumnSchema("col1", STRING),
ColumnSchema("col2", UINT32),
ColumnSchema("col3", UINT32, true) },
2);
Schema schema4({ ColumnSchema("col1", STRING),
ColumnSchema("col2", UINT32),
ColumnSchema("col3", UINT32, false) },
2);
ASSERT_NE(schema1, schema2);
ASSERT_TRUE(schema1.KeyEquals(schema1));
ASSERT_TRUE(schema1.KeyEquals(schema2, ColumnSchema::COMPARE_TYPE));
ASSERT_FALSE(schema1.KeyEquals(schema2, ColumnSchema::COMPARE_NAME));
ASSERT_TRUE(schema1.KeyTypeEquals(schema2));
ASSERT_FALSE(schema2.KeyTypeEquals(schema3));
ASSERT_NE(schema3, schema4);
ASSERT_EQ(schema4, schema4);
ASSERT_TRUE(schema3.KeyEquals(schema4, ColumnSchema::COMPARE_NAME_AND_TYPE));
}
TEST_F(TestSchema, TestReset) {
Schema schema;
ASSERT_FALSE(schema.initialized());
ASSERT_OK(schema.Reset({ ColumnSchema("col3", UINT32),
ColumnSchema("col2", STRING) },
1));
ASSERT_TRUE(schema.initialized());
Schema schema1;
ASSERT_OK(schema1.Reset({ ColumnSchema("col3", UINT64),
ColumnSchema("col4", STRING),
ColumnSchema("col5", UINT32),
ColumnSchema("col6", STRING) }, 2));
ASSERT_OK(schema.Reset(schema1.columns(), 2));
ASSERT_TRUE(schema == schema1);
for (int i = 0; i < schema1.num_columns(); i++) {
ASSERT_EQ(schema.column_offset(i), schema1.column_offset(i));
}
ASSERT_EQ(schema.key_byte_size(), schema1.key_byte_size());
// Move an uninitialized schema into the initialized schema.
Schema schema2;
schema = std::move(schema2);
ASSERT_FALSE(schema.initialized());
}
// Test for KUDU-943, a bug where we suspected that Variant didn't behave
// correctly with empty strings.
TEST_F(TestSchema, TestEmptyVariant) {
Slice empty_val("");
Slice nonempty_val("test");
Variant v(STRING, &nonempty_val);
ASSERT_EQ("test", (static_cast<const Slice*>(v.value()))->ToString());
v.Reset(STRING, &empty_val);
ASSERT_EQ("", (static_cast<const Slice*>(v.value()))->ToString());
v.Reset(STRING, &nonempty_val);
ASSERT_EQ("test", (static_cast<const Slice*>(v.value()))->ToString());
}
TEST_F(TestSchema, TestProjectSubset) {
Schema schema1({ ColumnSchema("col1", STRING),
ColumnSchema("col2", STRING),
ColumnSchema("col3", UINT32) },
1);
Schema schema2({ ColumnSchema("col3", UINT32),
ColumnSchema("col2", STRING) },
0);
RowProjector row_projector(&schema1, &schema2);
ASSERT_OK(row_projector.Init());
// Verify the mapping
ASSERT_EQ(2, row_projector.base_cols_mapping().size());
ASSERT_EQ(0, row_projector.projection_defaults().size());
const vector<RowProjector::ProjectionIdxMapping>& mapping = row_projector.base_cols_mapping();
ASSERT_EQ(mapping[0].first, 0); // col3 schema2
ASSERT_EQ(mapping[0].second, 2); // col3 schema1
ASSERT_EQ(mapping[1].first, 1); // col2 schema2
ASSERT_EQ(mapping[1].second, 1); // col2 schema1
}
// Test projection when the type of the projected column
// doesn't match the original type.
TEST_F(TestSchema, TestProjectTypeMismatch) {
Schema schema1({ ColumnSchema("key", STRING),
ColumnSchema("val", UINT32) },
1);
Schema schema2({ ColumnSchema("val", STRING) }, 0);
RowProjector row_projector(&schema1, &schema2);
Status s = row_projector.Init();
ASSERT_TRUE(s.IsInvalidArgument()) << s.ToString();
ASSERT_STR_CONTAINS(s.message().ToString(), "must have type");
}
// Test projection when some columns in the projection
// are not present in the base schema.
TEST_F(TestSchema, TestProjectMissingColumn) {
Schema schema1({ ColumnSchema("key", STRING), ColumnSchema("val", UINT32) }, 1);
Schema schema2({ ColumnSchema("val", UINT32), ColumnSchema("non_present", STRING) }, 0);
Schema schema3({ ColumnSchema("val", UINT32), ColumnSchema("non_present", UINT32, true) }, 0);
uint32_t default_value = 15;
Schema schema4({ ColumnSchema("val", UINT32),
ColumnSchema("non_present", UINT32, false, false, false, &default_value) },
0);
RowProjector row_projector(&schema1, &schema2);
Status s = row_projector.Init();
ASSERT_TRUE(s.IsInvalidArgument()) << s.ToString();
ASSERT_STR_CONTAINS(s.message().ToString(),
"does not exist in the projection, and it does not have a default value or a nullable type");
// Verify a nullable column with no default value
ASSERT_OK(row_projector.Reset(&schema1, &schema3));
ASSERT_EQ(1, row_projector.base_cols_mapping().size());
ASSERT_EQ(1, row_projector.projection_defaults().size());
ASSERT_EQ(row_projector.base_cols_mapping()[0].first, 0); // val schema2
ASSERT_EQ(row_projector.base_cols_mapping()[0].second, 1); // val schema1
ASSERT_EQ(row_projector.projection_defaults()[0], 1); // non_present schema3
// Verify a non-nullable column with a default value
ASSERT_OK(row_projector.Reset(&schema1, &schema4));
ASSERT_EQ(1, row_projector.base_cols_mapping().size());
ASSERT_EQ(1, row_projector.projection_defaults().size());
ASSERT_EQ(row_projector.base_cols_mapping()[0].first, 0); // val schema4
ASSERT_EQ(row_projector.base_cols_mapping()[0].second, 1); // val schema1
ASSERT_EQ(row_projector.projection_defaults()[0], 1); // non_present schema4
}
// Test projection mapping using IDs.
// This simulates a column rename ('val' -> 'val_renamed')
// and a new column added ('non_present')
TEST_F(TestSchema, TestProjectRename) {
SchemaBuilder builder;
ASSERT_OK(builder.AddKeyColumn("key", STRING));
ASSERT_OK(builder.AddColumn("val", UINT32));
Schema schema1 = builder.Build();
builder.Reset(schema1);
ASSERT_OK(builder.AddNullableColumn("non_present", UINT32));
ASSERT_OK(builder.RenameColumn("val", "val_renamed"));
Schema schema2 = builder.Build();
RowProjector row_projector(&schema1, &schema2);
ASSERT_OK(row_projector.Init());
ASSERT_EQ(2, row_projector.base_cols_mapping().size());
ASSERT_EQ(1, row_projector.projection_defaults().size());
ASSERT_EQ(row_projector.base_cols_mapping()[0].first, 0); // key schema2
ASSERT_EQ(row_projector.base_cols_mapping()[0].second, 0); // key schema1
ASSERT_EQ(row_projector.base_cols_mapping()[1].first, 1); // val_renamed schema2
ASSERT_EQ(row_projector.base_cols_mapping()[1].second, 1); // val schema1
ASSERT_EQ(row_projector.projection_defaults()[0], 2); // non_present schema2
}
// Test that we can map a projection schema (no column ids) onto a tablet
// schema (column ids).
TEST_F(TestSchema, TestGetMappedReadProjection) {
Schema tablet_schema({ ColumnSchema("key", STRING),
ColumnSchema("val", INT32) },
{ ColumnId(0),
ColumnId(1) },
1);
const bool kReadDefault = false;
Schema projection({ ColumnSchema("key", STRING),
ColumnSchema("deleted", IS_DELETED,
/*is_nullable=*/false,
/*is_immutable=*/false,
/*is_auto_incrementing=*/false,
/*read_default=*/&kReadDefault) },
1);
Schema mapped;
ASSERT_OK(tablet_schema.GetMappedReadProjection(projection, &mapped));
ASSERT_EQ(1, mapped.num_key_columns());
ASSERT_EQ(2, mapped.num_columns());
ASSERT_TRUE(mapped.has_column_ids());
ASSERT_FALSE(EqualSchemas(mapped, projection));
// The column id for the 'key' column in the mapped projection should match
// the one from the tablet schema.
ASSERT_EQ("key", mapped.column(0).name());
ASSERT_EQ(0, mapped.column_id(0));
// Since 'deleted' is a virtual column and thus does not appear in the tablet
// schema, in the mapped schema it should have been assigned a higher column
// id than the highest column id in the tablet schema.
ASSERT_EQ("deleted", mapped.column(1).name());
ASSERT_GT(mapped.column_id(1), tablet_schema.column_id(1));
ASSERT_GT(mapped.max_col_id(), tablet_schema.max_col_id());
// Ensure that virtual columns that are nullable or that do not have read
// defaults are rejected.
Schema nullable_projection;
Status s = nullable_projection.Reset({ ColumnSchema("key", STRING),
ColumnSchema("deleted", IS_DELETED,
/*is_nullable=*/true,
/*is_immutable=*/false,
/*is_auto_incrementing=*/false,
/*read_default=*/&kReadDefault) },
1);
ASSERT_FALSE(s.ok());
ASSERT_STR_CONTAINS(s.ToString(), "must not be nullable");
Schema no_default_projection;
s = no_default_projection.Reset({ ColumnSchema("key", STRING),
ColumnSchema("deleted", IS_DELETED,
/*is_nullable=*/false,
/*is_immutable=*/false,
/*is_auto_incrementing=*/false,
/*read_default=*/nullptr) },
1);
ASSERT_FALSE(s.ok());
ASSERT_STR_CONTAINS(s.ToString(), "must have a default value for read");
}
// Test that the schema can be used to compare and stringify rows.
TEST_F(TestSchema, TestRowOperations) {
Schema schema({ ColumnSchema("col1", STRING),
ColumnSchema("col2", STRING),
ColumnSchema("col3", UINT32),
ColumnSchema("col4", INT32) },
1);
Arena arena(1024);
RowBuilder rb(&schema);
rb.AddString(string("row_a_1"));
rb.AddString(string("row_a_2"));
rb.AddUint32(3);
rb.AddInt32(-3);
ContiguousRow row_a(&schema);
ASSERT_OK(CopyRowToArena(rb.data(), &arena, &row_a));
rb.Reset();
rb.AddString(string("row_b_1"));
rb.AddString(string("row_b_2"));
rb.AddUint32(3);
rb.AddInt32(-3);
ContiguousRow row_b(&schema);
ASSERT_OK(CopyRowToArena(rb.data(), &arena, &row_b));
ASSERT_GT(schema.Compare(row_b, row_a), 0);
ASSERT_LT(schema.Compare(row_a, row_b), 0);
ASSERT_EQ(R"((string col1="row_a_1", string col2="row_a_2", uint32 col3=3, int32 col4=-3))",
schema.DebugRow(row_a));
}
TEST(TestKeyEncoder, TestKeyEncoder) {
faststring fs;
const KeyEncoder<faststring>& encoder = GetKeyEncoder<faststring>(GetTypeInfo(STRING));
typedef std::tuple<vector<Slice>, Slice> test_pair;
vector<test_pair> pairs;
// Simple key
pairs.push_back(test_pair({ Slice("foo", 3) }, Slice("foo", 3)));
// Simple compound key
pairs.push_back(test_pair({ Slice("foo", 3), Slice("bar", 3) },
Slice("foo" "\x00\x00" "bar", 8)));
// Compound key with a \x00 in it
pairs.push_back(test_pair({ Slice("xxx\x00yyy", 7), Slice("bar", 3) },
Slice("xxx" "\x00\x01" "yyy" "\x00\x00" "bar", 13)));
int i = 0;
for (const test_pair &t : pairs) {
const vector<Slice> &in = std::get<0>(t);
Slice expected = std::get<1>(t);
fs.clear();
for (int col = 0; col < in.size(); col++) {
encoder.Encode(&in[col], col == in.size() - 1, &fs);
}
ASSERT_EQ(expected, Slice(fs))
<< "Failed encoding example " << i << ".\n"
<< "Expected: " << HexDump(expected) << "\n"
<< "Got: " << HexDump(Slice(fs));
i++;
}
}
TEST_F(TestSchema, TestDecodeKeys_CompoundStringKey) {
Schema schema({ ColumnSchema("col1", STRING),
ColumnSchema("col2", STRING),
ColumnSchema("col3", STRING) },
2);
EXPECT_EQ(R"((string col1="foo", string col2="bar"))",
schema.DebugEncodedRowKey(Slice("foo\0\0bar", 8), Schema::START_KEY));
EXPECT_EQ(R"((string col1="fo\000o", string col2="bar"))",
schema.DebugEncodedRowKey(Slice("fo\x00\x01o\0\0""bar", 10), Schema::START_KEY));
EXPECT_EQ(R"((string col1="fo\000o", string col2="bar\000xy"))",
schema.DebugEncodedRowKey(Slice("fo\x00\x01o\0\0""bar\0xy", 13), Schema::START_KEY));
EXPECT_EQ("<start of table>",
schema.DebugEncodedRowKey("", Schema::START_KEY));
EXPECT_EQ("<end of table>",
schema.DebugEncodedRowKey("", Schema::END_KEY));
}
// Test that appropriate statuses are returned when trying to decode an invalid
// encoded key.
TEST_F(TestSchema, TestDecodeKeys_InvalidKeys) {
Schema schema({ ColumnSchema("col1", STRING),
ColumnSchema("col2", UINT32),
ColumnSchema("col3", STRING) },
2);
EXPECT_EQ("<invalid key: Invalid argument: Error decoding composite key component"
" 'col1': Missing separator after composite key string component: foo>",
schema.DebugEncodedRowKey(Slice("foo"), Schema::START_KEY));
EXPECT_EQ("<invalid key: Invalid argument: Error decoding composite key component 'col2': "
"key too short>",
schema.DebugEncodedRowKey(Slice("foo\x00\x00", 5), Schema::START_KEY));
EXPECT_EQ("<invalid key: Invalid argument: Error decoding composite key component 'col2': "
"key too short: \\xff\\xff>",
schema.DebugEncodedRowKey(Slice("foo\x00\x00\xff\xff", 7), Schema::START_KEY));
}
TEST_F(TestSchema, TestCreateProjection) {
Schema schema({ ColumnSchema("col1", STRING),
ColumnSchema("col2", STRING),
ColumnSchema("col3", STRING),
ColumnSchema("col4", STRING),
ColumnSchema("col5", STRING) },
2);
Schema schema_with_ids = SchemaBuilder(schema).Build();
Schema partial_schema;
// By names, without IDs
ASSERT_OK(schema.CreateProjectionByNames({ "col1", "col2", "col4" }, &partial_schema));
EXPECT_EQ("(\n"
" col1 STRING NOT NULL,\n"
" col2 STRING NOT NULL,\n"
" col4 STRING NOT NULL,\n"
" PRIMARY KEY ()\n"
")",
partial_schema.ToString());
// By names, with IDs
ASSERT_OK(schema_with_ids.CreateProjectionByNames({ "col1", "col2", "col4" }, &partial_schema));
EXPECT_EQ(Substitute("(\n"
" $0:col1 STRING NOT NULL,\n"
" $1:col2 STRING NOT NULL,\n"
" $2:col4 STRING NOT NULL,\n"
" PRIMARY KEY ()\n"
")",
schema_with_ids.column_id(0),
schema_with_ids.column_id(1),
schema_with_ids.column_id(3)),
partial_schema.ToString());
// By names, with missing names.
Status s = schema.CreateProjectionByNames({ "foobar" }, &partial_schema);
EXPECT_EQ("Not found: column not found: foobar", s.ToString());
// By IDs
ASSERT_OK(schema_with_ids.CreateProjectionByIdsIgnoreMissing({ schema_with_ids.column_id(0),
schema_with_ids.column_id(1),
ColumnId(1000), // missing column
schema_with_ids.column_id(3) },
&partial_schema));
EXPECT_EQ(Substitute("(\n"
" $0:col1 STRING NOT NULL,\n"
" $1:col2 STRING NOT NULL,\n"
" $2:col4 STRING NOT NULL,\n"
" PRIMARY KEY ()\n"
")",
schema_with_ids.column_id(0),
schema_with_ids.column_id(1),
schema_with_ids.column_id(3)),
partial_schema.ToString());
}
TEST_F(TestSchema, TestFindColumn) {
Schema schema({ ColumnSchema("col1", STRING),
ColumnSchema("col2", INT32) },
1);
int col_idx;
ASSERT_OK(schema.FindColumn("col1", &col_idx));
ASSERT_EQ(0, col_idx);
ASSERT_OK(schema.FindColumn("col2", &col_idx));
ASSERT_EQ(1, col_idx);
Status s = schema.FindColumn("col3", &col_idx);
ASSERT_TRUE(s.IsNotFound()) << s.ToString();
ASSERT_EQ(s.ToString(), "Not found: No such column: col3");
}
#ifdef NDEBUG
TEST(TestKeyEncoder, BenchmarkSimpleKey) {
faststring fs;
Schema schema({ ColumnSchema("col1", STRING) }, 1);
RowBuilder rb(&schema);
rb.AddString(Slice("hello world"));
ConstContiguousRow row(rb.schema(), rb.data());
LOG_TIMING(INFO, "Encoding") {
for (int i = 0; i < 10000000; i++) {
schema.EncodeComparableKey(row, &fs);
}
}
}
#endif
} // namespace tablet
} // namespace kudu
```
|
Swix is a Norwegian manufacturer of winter sports equipment, headquartered in Lillehammer. Its product range includes ski wax, ski poles, and sportswear. The company is owned by the investment company Ferd AS.
The company was founded in Sweden in 1946 by Börje Gabrielsson, based on experience with ski wax by Swedish skier Martin Matsbo. Swix was the first producer of modern ski wax, and had a scientific approach to the challenge. The wax was colour-coded according to type, a practice that has since been adopted by the entire ski wax industry. Production was in Skåne in Sweden and in Fjellhamar near Oslo, Norway. By 1964 all production had been moved to Norway. In 1974, Swix bought Liljedahls Skistavfabrikk in Lillehammer, bringing the brand under the Swix name and becoming a leading ski pole manufacturer. Swix was bought by Ferd in 1978. In 1989, Swix bought Norheim, which was once owned by Gresvig AS, and has since been expanding into the technical sportswear market.
Toko, founded by Jakob Tobler in 1916 in Altstätten, Switzerland, as "Tobler & Co.", was bought in 2010 and renamed Toko-Swix Sport AG.
In 2018, Swix Sport Group changed its name to BRAV. The new name would not be consumer-facing, but would serve internally to unite the company's multiple brands.
References
External links
Official website
Sporting goods manufacturers of Norway
Sportswear brands
Manufacturing companies established in 1946
Ski equipment manufacturers
Norwegian companies established in 1946
|
Cloud, Castle, Lake is a short story collection by Vladimir Nabokov. It contains five stories: "The Admiralty Spire," "Razor," "A Russian Beauty," "Cloud, Castle, Lake," and "Signs and Symbols."
References
Short story collections by Vladimir Nabokov
2005 short story collections
|
Christer Löfqvist (born 4 June 1944 in Visby, Gotland, Sweden; died 1 February 1978) was an international motorcycle speedway rider from Sweden.
Career
Löfqvist reached the final of the Speedway World Championship in 1974.
He competed in Great Britain for West Ham Hammers, Poole Pirates, and the Hackney Hawks.
Death
Löfqvist was diagnosed with a brain tumour and died in 1978 aged 33.
World Final Appearances
Individual World Championship
1972 - London, Wembley Stadium - 4th - 11pts
1974 - Göteborg, Ullevi - 9th - 8pts
World Team Cup
1972 - Olching, Olching Speedwaybahn (with Tommy Jansson / Jan Simensen / Anders Michanek / Göte Nordin) 4th - 18pts (6)
1974 - Chorzów, Silesian Stadium (with Anders Michanek / Sören Sjösten / Tommy Jansson) - 2nd - 31pts (5)
1976 - London, White City Stadium (with Anders Michanek / Bengt Jansson / Bernt Persson / Lars-Åke Andersson) - 3rd - 26pts (1)
References
1944 births
1978 deaths
Swedish speedway riders
Hackney Hawks riders
Poole Pirates riders
West Ham Hammers riders
Sportspeople from Gotland County
|
```javascript
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
var global = this;
;(function () {
var calledDelete = false;
var calledGet = false;
var calledHas = false;
var calledSet = false;
var target = {};
var assertEquals = global.assertEquals;
var assertTrue = global.assertTrue;
var proxy = new Proxy(target, {
has(target, property) {
calledHas = true;
return Reflect.has(target, property);
},
get(target, property, receiver) {
calledGet = true;
return Reflect.get(target, property, receiver);
},
set(target, property, value, receiver) {
calledSet = true;
return Reflect.set(target, property, value, receiver);
},
deleteProperty(target, property) {
calledDelete = true;
return Reflect.deleteProperty(target, property);
}
});
Object.setPrototypeOf(global, proxy);
getGlobal;
assertTrue(calledGet);
makeGlobal = 2;
assertTrue(calledSet);
"findGlobal" in global;
assertTrue(calledHas);
var deletedOwn = delete makeGlobal;
assertTrue(deletedOwn);
assertEquals(void 0, makeGlobal);
})();
```
|
```c++
#define SOL_ALL_SAFETIES_ON 1
#include <sol/sol.hpp>
#include <iostream>
int main() {
std::cout << "=== override-able member functions ==="
<< std::endl;
struct thingy {
sol::function paint;
thingy(sol::this_state L)
: paint(sol::make_reference<sol::function>(
L.lua_state(), &thingy::default_paint)) {
}
void default_paint() {
std::cout << "p" << std::endl;
}
};
sol::state lua;
lua.open_libraries(sol::lib::base);
lua.new_usertype<thingy>("thingy",
sol::constructors<thingy(sol::this_state)>(),
"paint",
&thingy::paint);
sol::string_view code = R"(
obj = thingy.new()
obj:paint()
obj.paint = function (self) print("g") end
obj:paint()
function obj:paint () print("s") end
obj:paint()
)";
lua.safe_script(code);
std::cout << std::endl;
return 0;
}
```
|
```javascript
'use strict'
const fs = require('fs')
const path = require('path')
const test = require('tap').test
const mr = require('npm-registry-mock')
const npm = require('../../lib/npm.js')
const common = require('../common-tap.js')
const testdir = common.pkg
const repo = path.join(testdir, 'repo')
const prefix = path.join(testdir, 'prefix')
const cache = common.cache
var Tacks = require('tacks')
var Dir = Tacks.Dir
var File = Tacks.File
let daemon
let daemonPID
let git
let mockRegistry
process.env.npm_config_prefix = prefix
const fixture = new Tacks(Dir({
repo: Dir({}),
prefix: Dir({}),
deps: Dir({
parent: Dir({
'package.json': File({
name: 'parent',
version: '1.2.3',
dependencies: {
'child': 'git://localhost:' + common.gitPort + '/child.git'
}
})
}),
child: Dir({
'package.json': File({
name: 'child',
version: '1.0.3',
main: 'dobuild.js',
scripts: {
'prepublish': 'exit 123',
'prepare': 'writer build-artifact'
},
devDependencies: {
writer: 'file:' + path.join(testdir, 'deps', 'writer')
}
})
}),
writer: Dir({
'package.json': File({
name: 'writer',
version: '1.0.0',
bin: 'writer.js'
}),
'writer.js': File(`#!/usr/bin/env node\n
require('fs').writeFileSync(process.argv[2], 'hello, world!')
`)
})
})
}))
test('setup', function (t) {
fixture.create(testdir)
setup(function (er, r) {
t.ifError(er, 'git started up successfully')
if (!er) {
daemon = r[r.length - 2]
daemonPID = r[r.length - 1]
}
mr({
port: common.port
}, function (er, server) {
t.ifError(er, 'started mock registry')
mockRegistry = server
t.end()
})
})
})
test('install from git repo with prepare script', function (t) {
const parent = path.join(testdir, 'deps', 'parent')
common.npm([
'install',
'--no-save',
'--registry', common.registry,
'--cache', cache,
'--loglevel', 'error'
], {
cwd: parent
}, function (err, code, stdout, stderr) {
if (err) { throw err }
t.equal(code, 0, 'exited successfully')
t.equal(stderr, '', 'no actual output on stderr')
const target = path.join(parent, 'node_modules', 'child', 'build-artifact')
fs.readFile(target, 'utf8', (err, data) => {
if (err) { throw err }
t.equal(data, 'hello, world!', 'build artifact written for git dep')
t.end()
})
})
})
test('clean', function (t) {
mockRegistry.close()
daemon.on('close', t.end)
process.kill(daemonPID)
})
function setup (cb) {
npm.load({
prefix: testdir,
loglevel: 'silent'
}, function () {
git = require('../../lib/utils/git.js')
function startDaemon (cb) {
// start git server
const d = git.spawn(
[
'daemon',
'--verbose',
'--listen=localhost',
'--export-all',
'--base-path=.',
'--reuseaddr',
'--port=' + common.gitPort
],
{
cwd: repo,
env: process.env
}
)
d.stderr.on('data', childFinder)
function childFinder (c) {
const cpid = c.toString().match(/^\[(\d+)\]/)
if (cpid[1]) {
this.removeListener('data', childFinder)
cb(null, [d, cpid[1]])
}
}
}
const childPath = path.join(testdir, 'deps', 'child')
common.makeGitRepo({
path: childPath,
commands: [
git.chainableExec([
'clone', '--bare', childPath, 'child.git'
], { cwd: repo, env: process.env }),
startDaemon
]
}, cb)
})
}
```
|
```ruby
class Symlinks < Formula
desc "Symbolic link maintenance utility"
homepage "path_to_url"
url "path_to_url"
sha256 your_sha256_hash
license "MIT"
bottle do
sha256 cellar: :any_skip_relocation, arm64_sonoma: your_sha256_hash
sha256 cellar: :any_skip_relocation, arm64_ventura: your_sha256_hash
sha256 cellar: :any_skip_relocation, arm64_monterey: your_sha256_hash
sha256 cellar: :any_skip_relocation, arm64_big_sur: your_sha256_hash
sha256 cellar: :any_skip_relocation, sonoma: your_sha256_hash
sha256 cellar: :any_skip_relocation, ventura: your_sha256_hash
sha256 cellar: :any_skip_relocation, monterey: your_sha256_hash
sha256 cellar: :any_skip_relocation, big_sur: your_sha256_hash
sha256 cellar: :any_skip_relocation, x86_64_linux: your_sha256_hash
end
def install
system "make", "install"
bin.install "symlinks"
man8.install "symlinks.8"
end
test do
assert_match "->", shell_output("#{bin}/symlinks -v #{__dir__}/../../Aliases")
end
end
```
|
Vikramaditya Shukla is an Indian former actor who has appeared in Tamil and Telugu language films.
Career
Vikramaditya made his debut in the Tamil slasher film Whistle (2003), with a critic noting he was "apt" for the role. He then portrayed two supporting roles, first in Sundar C's Chinna (2005) and then a negative role in Bambara Kannaley (2005), though both films were met with unfavourable responses from critics and at the box office. He was next selected to portray the leading male role in the Telugu film Manasu Palike Mouna Raagam (2006), in which Sneha played the leading female role. The film released to negative reviews.
Vikramaditya then portrayed a husband engaging in an illicit affair in Tholaipesi (2007), which opened to negative reviews and had a reviewer write that the actor needed "to work on his emotions". His other film in 2007, the triangular love story Nanbanin Kadhali, co-starring Shivani Singh and Kunal, received similar reviews. It had been shot in 2005 under the name Cleopatra and had a delayed release. His most recent theatrical release was Agathiyan's romantic drama Nenjathai Killadhe (2008), where he portrayed a leading role alongside Vikranth and Bharathi.
Filmography
All films are in Tamil, unless otherwise noted.
References
External links
Living people
Male actors in Tamil cinema
21st-century Indian male actors
1986 births
|
```c
/*====================================================================*
-
- Redistribution and use in source and binary forms, with or without
- modification, are permitted provided that the following conditions
- are met:
- 1. Redistributions of source code must retain the above copyright
- notice, this list of conditions and the following disclaimer.
- 2. Redistributions in binary form must reproduce the above
- copyright notice, this list of conditions and the following
- disclaimer in the documentation and/or other materials
- provided with the distribution.
-
- THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
- ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
- LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
- A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL ANY
- CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
- EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
- PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
- PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
- OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
- NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
- SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*====================================================================*/
/*
* recog_bootnum.c
*
* This does two things:
*
* (1) It makes bootnum1.pa and bootnum2.pa from stored labelled data.
*
* (2) Using these, as well as bootnum3.pa, it makes code for
* generating and compiling the pixas, which are used by the
* boot digit recognizer.
* The output of the code generator is files such as autogen_101.*.
* These files have been edited to combine the .c and .h files into
* a single .c file:
* autogen_101.* --> src/bootnumgen1.c
* autogen_102.* --> src/bootnumgen2.c
* autogen_103.* --> src/bootnumgen3.c
*
* To add another set of templates to bootnumgen1.c:
* (a) Add a new .pa file: prog/recog/digits/digit_setN.pa (N > 15)
* (b) Add code to MakeBootnum1() for this set, selecting with the
* string those templates you want to use.
* (c) Run recog_bootnum.
* * This makes a new /tmp/lept/recog/digits/bootnum1.pa.
* Replace prog/recog/digits/bootnum1.pa with this.
* * This makes new files: /tmp/lept/auto/autogen.101.{h,c}.
* The .h file is the only one we need to use.
* Replace the encoded string in src/bootnumgen1.c with the
* one in autogen.101.h, and recompile.
*/
#ifdef HAVE_CONFIG_H
#include <config_auto.h>
#endif /* HAVE_CONFIG_H */
#include "allheaders.h"
#include "bmfdata.h"
static PIXA *MakeBootnum1(void);
static PIXA *MakeBootnum2(void);
l_int32 main(int argc,
char **argv)
{
PIX *pix1;
PIXA *pixa1, *pixa2, *pixa3;
L_STRCODE *strc;
if (argc != 1) {
lept_stderr(" Syntax: recog_bootnum\n");
return 1;
}
setLeptDebugOK(1);
lept_mkdir("lept/recog/digits");
/* ----------------------- Bootnum 1 --------------------- */
/* Make the bootnum pixa from the images */
pixa1 = MakeBootnum1();
pixaWrite("/tmp/lept/recog/digits/bootnum1.pa", pixa1);
pix1 = pixaDisplayTiledWithText(pixa1, 1500, 1.0, 10, 2, 6, 0xff000000);
pixDisplay(pix1, 100, 0);
pixDestroy(&pix1);
pixaDestroy(&pixa1);
/* Generate the code to make the bootnum1 pixa.
* Note: the actual code we use is in bootnumgen1.c, and
* has already been compiled into the library. */
strc = strcodeCreate(101); /* arbitrary integer */
strcodeGenerate(strc, "/tmp/lept/recog/digits/bootnum1.pa", "PIXA");
strcodeFinalize(&strc, "/tmp/lept/auto");
lept_free(strc);
/* Generate the bootnum1 pixa from the generated code */
pixa1 = l_bootnum_gen1();
pix1 = pixaDisplayTiledWithText(pixa1, 1500, 1.0, 10, 2, 6, 0xff000000);
/* pix1 = pixaDisplayTiled(pixa1, 1500, 0, 30); */
pixDisplay(pix1, 100, 0);
pixDestroy(&pix1);
/* Extend the bootnum1 pixa by erosion */
pixa3 = pixaExtendByMorph(pixa1, L_MORPH_ERODE, 2, NULL, 1);
pix1 = pixaDisplayTiledWithText(pixa3, 1500, 1.0, 10, 2, 6, 0xff000000);
pixDisplay(pix1, 100, 0);
pixDestroy(&pix1);
pixaDestroy(&pixa1);
pixaDestroy(&pixa3);
/* ----------------------- Bootnum 2 --------------------- */
/* Read bootnum 2 */
pixa2 = pixaRead("recog/digits/bootnum2.pa");
pixaWrite("/tmp/lept/recog/digits/bootnum2.pa", pixa2);
pix1 = pixaDisplayTiledWithText(pixa2, 1500, 1.0, 10, 2, 6, 0xff000000);
pixDisplay(pix1, 100, 700);
pixDestroy(&pix1);
pixaDestroy(&pixa2);
/* Generate the code to make the bootnum2 pixa.
* Note: the actual code we use is in bootnumgen2.c. */
strc = strcodeCreate(102); /* another arbitrary integer */
strcodeGenerate(strc, "/tmp/lept/recog/digits/bootnum2.pa", "PIXA");
strcodeFinalize(&strc, "/tmp/lept/auto");
lept_free(strc);
/* Generate the bootnum2 pixa from the generated code */
pixa2 = l_bootnum_gen2();
/* pix1 = pixaDisplayTiled(pixa2, 1500, 0, 30); */
pix1 = pixaDisplayTiledWithText(pixa2, 1500, 1.0, 10, 2, 6, 0xff000000);
pixDisplay(pix1, 100, 700);
pixDestroy(&pix1);
pixaDestroy(&pixa2);
/* ----------------------- Bootnum 3 --------------------- */
/* Read bootnum 3 */
pixa1 = pixaRead("recog/digits/bootnum3.pa");
pix1 = pixaDisplayTiledWithText(pixa1, 1500, 1.0, 10, 2, 6, 0xff000000);
pixDisplay(pix1, 1000, 0);
pixDestroy(&pix1);
pixaDestroy(&pixa1);
/* Generate the code that, when deserialized, gives you bootnum3.pa.
* Note: the actual code we use is in bootnumgen3.c, and
* has already been compiled into the library. */
strc = strcodeCreate(103); /* arbitrary integer */
strcodeGenerate(strc, "recog/digits/bootnum3.pa", "PIXA");
strcodeFinalize(&strc, "/tmp/lept/auto");
lept_free(strc);
/* Generate the bootnum3 pixa from the generated code */
pixa1 = l_bootnum_gen3();
pix1 = pixaDisplayTiledWithText(pixa1, 1500, 1.0, 10, 2, 6, 0xff000000);
pixDisplay(pix1, 1000, 0);
pixDestroy(&pix1);
/* Extend the bootnum3 pixa twice by erosion */
pixa3 = pixaExtendByMorph(pixa1, L_MORPH_ERODE, 2, NULL, 1);
pix1 = pixaDisplayTiledWithText(pixa3, 1500, 1.0, 10, 2, 6, 0xff000000);
pixDisplay(pix1, 1000, 0);
pixDestroy(&pix1);
pixaDestroy(&pixa1);
pixaDestroy(&pixa3);
#if 0
pixa1 = l_bootnum_gen1();
/* pixa1 = pixaRead("recog/digits/bootnum1.pa"); */
pixaWrite("/tmp/lept/junk.pa", pixa1);
pixa2 = pixaRead("/tmp/lept/junk.pa");
pixaWrite("/tmp/lept/junk1.pa", pixa2);
pixa3 = pixaRead("/tmp/lept/junk1.pa");
n = pixaGetCount(pixa3);
for (i = 0; i < n; i++) {
pix = pixaGetPix(pixa3, i, L_CLONE);
lept_stderr("i = %d, text = %s\n", i, pixGetText(pix));
pixDestroy(&pix);
}
pixaDestroy(&pixa1);
pixaDestroy(&pixa2);
pixaDestroy(&pixa3);
#endif
return 0;
}
PIXA *MakeBootnum1(void)
{
const char *str;
PIXA *pixa1, *pixa2, *pixa3;
pixa1 = pixaRead("recog/digits/digit_set02.pa");
str = "10, 27, 35, 45, 48, 74, 79, 97, 119, 124, 148";
pixa3 = pixaSelectWithString(pixa1, str, NULL);
pixaDestroy(&pixa1);
pixa1 = pixaRead("recog/digits/digit_set03.pa");
str = "2, 15, 30, 50, 60, 75, 95, 105, 121, 135";
pixa2 = pixaSelectWithString(pixa1, str, NULL);
pixaJoin(pixa3, pixa2, 0, -1);
pixaDestroy(&pixa1);
pixaDestroy(&pixa2);
pixa1 = pixaRead("recog/digits/digit_set05.pa");
str = "0, 15, 30, 49, 60, 75, 90, 105, 120, 135";
pixa2 = pixaSelectWithString(pixa1, str, NULL);
pixaJoin(pixa3, pixa2, 0, -1);
pixaDestroy(&pixa1);
pixaDestroy(&pixa2);
pixa1 = pixaRead("recog/digits/digit_set06.pa");
str = "4, 15, 30, 48, 60, 78, 90, 105, 120, 135";
pixa2 = pixaSelectWithString(pixa1, str, NULL);
pixaJoin(pixa3, pixa2, 0, -1);
pixaDestroy(&pixa1);
pixaDestroy(&pixa2);
pixa1 = pixaRead("recog/digits/digit_set07.pa");
str = "3, 15, 30, 45, 60, 77, 78, 91, 105, 120, 149";
pixa2 = pixaSelectWithString(pixa1, str, NULL);
pixaJoin(pixa3, pixa2, 0, -1);
pixaDestroy(&pixa1);
pixaDestroy(&pixa2);
pixa1 = pixaRead("recog/digits/digit_set08.pa");
str = "0, 20, 30, 45, 60, 75, 90, 106, 121, 135";
pixa2 = pixaSelectWithString(pixa1, str, NULL);
pixaJoin(pixa3, pixa2, 0, -1);
pixaDestroy(&pixa1);
pixaDestroy(&pixa2);
pixa1 = pixaRead("recog/digits/digit_set09.pa");
str = "0, 20, 32, 47, 54, 63, 75, 91, 105, 125, 136";
pixa2 = pixaSelectWithString(pixa1, str, NULL);
pixaJoin(pixa3, pixa2, 0, -1);
pixaDestroy(&pixa1);
pixaDestroy(&pixa2);
pixa1 = pixaRead("recog/digits/digit_set11.pa");
str = "0, 15, 36, 46, 62, 63, 76, 91, 106, 123, 135";
pixa2 = pixaSelectWithString(pixa1, str, NULL);
pixaJoin(pixa3, pixa2, 0, -1);
pixaDestroy(&pixa1);
pixaDestroy(&pixa2);
pixa1 = pixaRead("recog/digits/digit_set12.pa");
str = "1, 20, 31, 45, 61, 75, 95, 107, 120, 135";
pixa2 = pixaSelectWithString(pixa1, str, NULL);
pixaJoin(pixa3, pixa2, 0, -1);
pixaDestroy(&pixa1);
pixaDestroy(&pixa2);
pixa1 = pixaRead("recog/digits/digit_set13.pa");
str = "1, 16, 31, 48, 63, 78, 98, 105, 123, 136";
pixa2 = pixaSelectWithString(pixa1, str, NULL);
pixaJoin(pixa3, pixa2, 0, -1);
pixaDestroy(&pixa1);
pixaDestroy(&pixa2);
pixa1 = pixaRead("recog/digits/digit_set14.pa");
str = "1, 14, 24, 37, 53, 62, 74, 83, 98, 114";
pixa2 = pixaSelectWithString(pixa1, str, NULL);
pixaJoin(pixa3, pixa2, 0, -1);
pixaDestroy(&pixa1);
pixaDestroy(&pixa2);
pixa1 = pixaRead("recog/digits/digit_set15.pa");
str = "0, 1, 3, 5, 7, 8, 13, 25, 35";
pixa2 = pixaSelectWithString(pixa1, str, NULL);
pixaJoin(pixa3, pixa2, 0, -1);
pixaDestroy(&pixa1);
pixaDestroy(&pixa2);
return pixa3;
}
PIXA *MakeBootnum2(void)
{
char *fname;
l_int32 i, n, w, h;
BOX *box;
PIX *pix;
PIXA *pixa;
L_RECOG *recog;
SARRAY *sa;
/* Phase 1: generate recog from the digit data */
recog = recogCreate(0, 40, 0, 128, 1);  /* scalew, scaleh, linew, threshold, maxyshift */
sa = getSortedPathnamesInDirectory("recog/bootnums", "png", 0, 0);
n = sarrayGetCount(sa);
for (i = 0; i < n; i++) {
/* Read each pix: grayscale, multi-character, labelled */
fname = sarrayGetString(sa, i, L_NOCOPY);
if ((pix = pixRead(fname)) == NULL) {
lept_stderr("Can't read %s\n", fname);
continue;
}
/* Convert to a set of 1 bpp, single character, labelled */
pixGetDimensions(pix, &w, &h, NULL);
box = boxCreate(0, 0, w, h);
recogTrainLabeled(recog, pix, box, NULL, 0);
pixDestroy(&pix);
boxDestroy(&box);
}
recogTrainingFinished(&recog, 1, -1, -1.0);
sarrayDestroy(&sa);
/* Phase 2: generate pixa consisting of 1 bpp, single character pix */
pixa = recogExtractPixa(recog);
pixaWrite("/tmp/lept/recog/digits/bootnum2.pa", pixa);
recogDestroy(&recog);
return pixa;
}
```
```rust
// #![cfg(feature = "derive")]
//
// use luminance::shader::Uniform;
// use luminance::UniformInterface;
//
// #[test]
// fn derive_uniform_interface() {
// #[derive(UniformInterface)]
// struct SimpleUniformInterface {
// _t: Uniform<f32>,
// }
// }
//
// #[test]
// fn derive_unbound_uniform_interface() {
// #[derive(UniformInterface)]
// struct SimpleUniformInterface {
// #[uniform(unbound)]
// _t: Uniform<f32>,
// }
// }
//
// #[test]
// fn derive_renamed_uniform_interface() {
// #[derive(UniformInterface)]
// struct SimpleUniformInterface {
// #[uniform(name = "time")]
// _t: Uniform<f32>,
// }
// }
//
// #[test]
// fn derive_unbound_renamed_uniform_interface() {
// #[derive(UniformInterface)]
// struct SimpleUniformInterface {
// #[uniform(name = "time", unbound)]
// _t: Uniform<f32>,
// #[uniform(unbound, name = "time")]
// _t2: Uniform<f32>,
// }
// }
```
```pascal
object SiteAdvancedDialog: TSiteAdvancedDialog
Left = 351
Top = 167
HelpType = htKeyword
HelpKeyword = 'ui_login_advanced'
BorderIcons = [biSystemMenu, biMinimize, biMaximize, biHelp]
BorderStyle = bsDialog
Caption = 'Advanced Site Settings'
ClientHeight = 432
ClientWidth = 561
Color = clBtnFace
ParentFont = True
OldCreateOrder = True
Position = poOwnerFormCenter
OnClose = FormClose
OnCloseQuery = FormCloseQuery
OnShow = FormShow
DesignSize = (
561
432)
PixelsPerInch = 96
TextHeight = 13
object MainPanel: TPanel
Left = 0
Top = 0
Width = 561
Height = 392
Align = alTop
Anchors = [akLeft, akTop, akRight, akBottom]
BevelOuter = bvNone
TabOrder = 0
object PageControl: TPageControl
Left = 152
Top = 0
Width = 409
Height = 392
HelpType = htKeyword
ActivePage = EnvironmentSheet
Align = alClient
MultiLine = True
Style = tsButtons
TabOrder = 1
TabStop = False
OnChange = PageControlChange
object EnvironmentSheet: TTabSheet
Tag = 1
HelpType = htKeyword
HelpKeyword = 'ui_login_environment'
Caption = 'Environment'
ImageIndex = 6
TabVisible = False
DesignSize = (
401
382)
object EnvironmentGroup: TGroupBox
Left = 0
Top = 6
Width = 393
Height = 140
Anchors = [akLeft, akTop, akRight]
Caption = 'Server environment'
TabOrder = 0
DesignSize = (
393
140)
object EOLTypeLabel: TLabel
Left = 12
Top = 20
Width = 241
Height = 13
Caption = 'End-of-line &characters (if not indicated by server):'
FocusControl = EOLTypeCombo
end
object UtfLabel: TLabel
Left = 12
Top = 44
Width = 144
Height = 13
Caption = '&UTF-8 encoding for filenames:'
FocusControl = UtfCombo
end
object TimeDifferenceLabel: TLabel
Left = 12
Top = 68
Width = 84
Height = 13
Caption = 'Time &zone offset:'
FocusControl = TimeDifferenceEdit
end
object TimeDifferenceHoursLabel: TLabel
Left = 210
Top = 68
Width = 27
Height = 13
Anchors = [akTop, akRight]
Caption = 'hours'
FocusControl = TimeDifferenceEdit
end
object TimeDifferenceMinutesLabel: TLabel
Left = 336
Top = 66
Width = 37
Height = 13
Anchors = [akTop, akRight]
Caption = 'minutes'
FocusControl = TimeDifferenceMinutesEdit
end
object EOLTypeCombo: TComboBox
Left = 320
Top = 15
Width = 61
Height = 21
Style = csDropDownList
Anchors = [akTop, akRight]
TabOrder = 0
Items.Strings = (
'LF'
'CR/LF')
end
object UtfCombo: TComboBox
Left = 320
Top = 39
Width = 61
Height = 21
Style = csDropDownList
Anchors = [akTop, akRight]
TabOrder = 1
end
object TimeDifferenceEdit: TUpDownEdit
Left = 135
Top = 63
Width = 69
Height = 21
Alignment = taRightJustify
MaxValue = 25.000000000000000000
MinValue = -25.000000000000000000
Value = -13.000000000000000000
Anchors = [akTop, akRight]
TabOrder = 2
OnChange = DataChange
end
object TimeDifferenceMinutesEdit: TUpDownEdit
Left = 261
Top = 63
Width = 69
Height = 21
Alignment = taRightJustify
Increment = 15.000000000000000000
MaxValue = 45.000000000000000000
MinValue = -45.000000000000000000
Value = -13.000000000000000000
Anchors = [akTop, akRight]
TabOrder = 3
OnChange = DataChange
end
object TimeDifferenceAutoCheck: TCheckBox
Left = 135
Top = 90
Width = 242
Height = 17
Caption = 'Detect &automatically'
TabOrder = 4
OnClick = DataChange
end
object TrimVMSVersionsCheck: TCheckBox
Left = 12
Top = 113
Width = 369
Height = 17
Caption = '&Trim VMS version numbers'
TabOrder = 5
OnClick = DataChange
end
end
object DSTModeGroup: TGroupBox
Left = 0
Top = 153
Width = 393
Height = 93
Anchors = [akLeft, akTop, akRight]
Caption = 'Daylight saving time'
TabOrder = 1
DesignSize = (
393
93)
object DSTModeUnixCheck: TRadioButton
Left = 12
Top = 19
Width = 369
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Adjust remote timestamp to local co&nventions'
TabOrder = 0
OnClick = DataChange
end
object DSTModeWinCheck: TRadioButton
Left = 12
Top = 42
Width = 369
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Adjust remote timestamp with &DST'
TabOrder = 1
OnClick = DataChange
end
object DSTModeKeepCheck: TRadioButton
Left = 12
Top = 65
Width = 369
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Preser&ve remote timestamp'
TabOrder = 2
OnClick = DataChange
end
end
object PuttyGroup: TGroupBox
Left = 0
Top = 252
Width = 393
Height = 98
Anchors = [akLeft, akTop, akRight]
Caption = 'PuTTY'
TabOrder = 2
DesignSize = (
393
98)
object PuttySettingsLabel: TLabel
Left = 12
Top = 18
Width = 116
Height = 13
Caption = '&PuTTY terminal settings:'
FocusControl = PuttySettingsEdit
end
object PuttySettingsButton: TButton
Left = 12
Top = 61
Width = 125
Height = 25
Anchors = [akTop, akRight]
Caption = '&Edit in PuTTY...'
TabOrder = 1
OnClick = PuttySettingsButtonClick
end
object PuttySettingsEdit: TEdit
Left = 12
Top = 34
Width = 370
Height = 21
MaxLength = 64
TabOrder = 0
Text = 'PuttySettingsEdit'
OnChange = DataChange
OnExit = EncryptKeyEditExit
end
end
end
object DirectoriesSheet: TTabSheet
Tag = 2
HelpType = htKeyword
HelpKeyword = 'ui_login_directories'
Caption = 'Directories'
ImageIndex = 11
TabVisible = False
DesignSize = (
401
382)
object DirectoriesGroup: TGroupBox
Left = 0
Top = 6
Width = 393
Height = 183
Anchors = [akLeft, akTop, akRight]
Caption = 'Directories'
TabOrder = 0
DesignSize = (
393
183)
object LocalDirectoryLabel: TLabel
Left = 12
Top = 111
Width = 74
Height = 13
Caption = '&Local directory:'
FocusControl = LocalDirectoryEdit
end
object RemoteDirectoryLabel: TLabel
Left = 12
Top = 66
Width = 87
Height = 13
Caption = '&Remote directory:'
FocusControl = RemoteDirectoryEdit
end
object LocalDirectoryDescLabel: TLabel
Left = 12
Top = 156
Width = 241
Height = 13
Caption = 'Local directory is not used with Explorer interface.'
ShowAccelChar = False
end
object LocalDirectoryEdit: TDirectoryEdit
Left = 12
Top = 128
Width = 371
Height = 21
AcceptFiles = True
DialogText = 'Select startup local directory.'
ClickKey = 16397
Anchors = [akLeft, akTop, akRight]
TabOrder = 3
Text = 'LocalDirectoryEdit'
OnChange = DataChange
end
object RemoteDirectoryEdit: TEdit
Left = 12
Top = 83
Width = 371
Height = 21
Anchors = [akLeft, akTop, akRight]
MaxLength = 1000
TabOrder = 2
Text = 'RemoteDirectoryEdit'
OnChange = DataChange
end
object UpdateDirectoriesCheck: TCheckBox
Left = 12
Top = 42
Width = 369
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Re&member last used directory'
TabOrder = 1
end
object SynchronizeBrowsingCheck: TCheckBox
Left = 12
Top = 19
Width = 369
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Syn&chronize browsing'
TabOrder = 0
end
end
object DirectoryOptionsGroup: TGroupBox
Left = 1
Top = 195
Width = 393
Height = 116
Anchors = [akLeft, akTop, akRight]
Caption = 'Directory reading options'
TabOrder = 1
DesignSize = (
393
116)
object CacheDirectoriesCheck: TCheckBox
Left = 12
Top = 19
Width = 369
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Cache &visited remote directories'
TabOrder = 0
OnClick = DataChange
end
object CacheDirectoryChangesCheck: TCheckBox
Left = 12
Top = 42
Width = 230
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Cache &directory changes'
TabOrder = 1
OnClick = DataChange
end
object ResolveSymlinksCheck: TCheckBox
Left = 12
Top = 65
Width = 369
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Resolve symbolic li&nks'
TabOrder = 3
end
object PreserveDirectoryChangesCheck: TCheckBox
Left = 251
Top = 42
Width = 139
Height = 17
Anchors = [akTop, akRight]
Caption = '&Permanent cache'
TabOrder = 2
end
object FollowDirectorySymlinksCheck: TCheckBox
Left = 12
Top = 88
Width = 369
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = '&Follow symbolic links to directories'
TabOrder = 4
end
end
end
object RecycleBinSheet: TTabSheet
Tag = 2
HelpType = htKeyword
HelpKeyword = 'ui_login_recycle_bin'
Caption = 'Recycle bin'
ImageIndex = 15
TabVisible = False
DesignSize = (
401
382)
object RecycleBinGroup: TGroupBox
Left = 0
Top = 6
Width = 393
Height = 116
Anchors = [akLeft, akTop, akRight]
Caption = 'Recycle bin'
TabOrder = 0
DesignSize = (
393
116)
object RecycleBinPathLabel: TLabel
Left = 12
Top = 66
Width = 95
Height = 13
Caption = '&Remote recycle bin:'
FocusControl = RecycleBinPathEdit
end
object DeleteToRecycleBinCheck: TCheckBox
Left = 12
Top = 19
Width = 370
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = '&Preserve deleted remote files to recycle bin'
TabOrder = 0
OnClick = DataChange
end
object OverwrittenToRecycleBinCheck: TCheckBox
Left = 12
Top = 42
Width = 370
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Preserve &overwritten remote files to recycle bin (SFTP only)'
TabOrder = 1
OnClick = DataChange
end
object RecycleBinPathEdit: TEdit
Left = 12
Top = 83
Width = 370
Height = 21
Anchors = [akLeft, akTop, akRight]
MaxLength = 1000
TabOrder = 2
Text = 'RecycleBinPathEdit'
OnChange = DataChange
end
end
end
object EncryptionSheet: TTabSheet
Tag = 2
HelpType = htKeyword
HelpKeyword = 'ui_login_encryption'
Caption = 'Encryption'
TabVisible = False
DesignSize = (
401
382)
object EncryptFilesCheck: TCheckBox
Left = 12
Top = 8
Width = 382
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = '&Encrypt files'
TabOrder = 0
OnClick = DataChange
end
object EncryptFilesGroup: TGroupBox
Left = 0
Top = 32
Width = 393
Height = 121
Anchors = [akLeft, akTop, akRight]
Caption = 'Encryption options'
TabOrder = 1
object Label13: TLabel
Left = 12
Top = 18
Width = 75
Height = 13
Caption = 'Encryption &key:'
FocusControl = EncryptKeyPasswordEdit
end
object EncryptKeyVisibleEdit: TEdit
Left = 12
Top = 34
Width = 370
Height = 21
MaxLength = 64
TabOrder = 1
Text = 'EncryptKeyVisibleEdit'
Visible = False
OnChange = DataChange
OnExit = EncryptKeyEditExit
end
object EncryptKeyPasswordEdit: TPasswordEdit
Left = 12
Top = 34
Width = 370
Height = 21
MaxLength = 64
TabOrder = 0
Text = 'EncryptKeyPasswordEdit'
OnChange = DataChange
OnExit = EncryptKeyEditExit
end
object ShowEncryptionKeyCheck: TCheckBox
Left = 12
Top = 61
Width = 117
Height = 17
Caption = '&Show key'
TabOrder = 2
OnClick = ShowEncryptionKeyCheckClick
end
object GenerateKeyButton: TButton
Left = 12
Top = 84
Width = 117
Height = 25
Caption = '&Generate Key'
TabOrder = 3
OnClick = GenerateKeyButtonClick
end
end
end
object SftpSheet: TTabSheet
Tag = 2
HelpType = htKeyword
HelpKeyword = 'ui_login_sftp'
Caption = 'SFTP'
ImageIndex = 12
TabVisible = False
DesignSize = (
401
382)
object SFTPBugsGroupBox: TGroupBox
Left = 0
Top = 153
Width = 393
Height = 70
Anchors = [akLeft, akTop, akRight]
Caption = 'Detection of known bugs in SFTP servers'
TabOrder = 1
DesignSize = (
393
70)
object Label10: TLabel
Left = 12
Top = 20
Width = 211
Height = 13
Caption = '&Reverses order of link command arguments:'
FocusControl = SFTPBugSymlinkCombo
end
object Label36: TLabel
Left = 12
Top = 44
Width = 205
Height = 13
Caption = '&Misinterprets file timestamps prior to 1970:'
FocusControl = SFTPBugSignedTSCombo
end
object SFTPBugSymlinkCombo: TComboBox
Left = 320
Top = 15
Width = 61
Height = 21
Style = csDropDownList
Anchors = [akTop, akRight]
TabOrder = 0
end
object SFTPBugSignedTSCombo: TComboBox
Left = 320
Top = 39
Width = 61
Height = 21
Style = csDropDownList
Anchors = [akTop, akRight]
TabOrder = 1
end
end
object SFTPProtocolGroup: TGroupBox
Left = 0
Top = 6
Width = 393
Height = 141
Anchors = [akLeft, akTop, akRight]
Caption = 'Protocol options'
TabOrder = 0
DesignSize = (
393
141)
object Label34: TLabel
Left = 12
Top = 44
Width = 157
Height = 13
Caption = '&Preferred SFTP protocol version:'
FocusControl = SFTPMaxVersionCombo
end
object Label23: TLabel
Left = 12
Top = 20
Width = 62
Height = 13
Caption = 'SFTP ser&ver:'
FocusControl = SftpServerEdit
end
object Label5: TLabel
Left = 12
Top = 68
Width = 157
Height = 13
Caption = '&Canonicalize paths on the server:'
FocusControl = SFTPRealPathCombo
end
object SFTPMaxVersionCombo: TComboBox
Left = 320
Top = 39
Width = 61
Height = 21
Style = csDropDownList
Anchors = [akTop, akRight]
TabOrder = 1
Items.Strings = (
'0'
'1'
'2'
'3'
'4'
'5'
'6')
end
object SftpServerEdit: TComboBox
Left = 149
Top = 15
Width = 232
Height = 21
AutoComplete = False
Anchors = [akLeft, akTop, akRight]
MaxLength = 255
TabOrder = 0
Text = 'SftpServerEdit'
OnChange = DataChange
Items.Strings = (
'Default'
'/bin/sftp-server'
'sudo su -c /bin/sftp-server')
end
object AllowScpFallbackCheck: TCheckBox
Left = 12
Top = 90
Width = 369
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Allow SCP &fallback'
TabOrder = 3
OnClick = DataChange
end
object SFTPRealPathCombo: TComboBox
Left = 320
Top = 63
Width = 61
Height = 21
Style = csDropDownList
Anchors = [akTop, akRight]
TabOrder = 2
end
object UsePosixRenameCheck: TCheckBox
Left = 12
Top = 113
Width = 369
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = '&Use POSIX rename'
TabOrder = 4
OnClick = DataChange
end
end
end
object ScpSheet: TTabSheet
Tag = 2
HelpType = htKeyword
HelpKeyword = 'ui_login_scp'
Caption = 'SCP/ShellX'
ImageIndex = 3
TabVisible = False
DesignSize = (
401
382)
object OtherShellOptionsGroup: TGroupBox
Left = 0
Top = 161
Width = 393
Height = 69
Anchors = [akLeft, akTop, akRight]
Caption = 'Other options'
TabOrder = 2
DesignSize = (
393
69)
object LookupUserGroupsCheck: TCheckBox
Left = 12
Top = 19
Width = 140
Height = 17
AllowGrayed = True
Caption = 'Lookup &user groups'
TabOrder = 0
OnClick = DataChange
end
object ClearAliasesCheck: TCheckBox
Left = 12
Top = 42
Width = 140
Height = 17
Caption = 'Clear a&liases'
TabOrder = 2
OnClick = DataChange
end
object UnsetNationalVarsCheck: TCheckBox
Left = 168
Top = 19
Width = 213
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Clear &national variables'
TabOrder = 1
OnClick = DataChange
end
object Scp1CompatibilityCheck: TCheckBox
Left = 168
Top = 42
Width = 213
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Use scp&2 with scp1 compatibility'
TabOrder = 3
OnClick = DataChange
end
end
object ShellGroup: TGroupBox
Left = 0
Top = 6
Width = 393
Height = 70
Anchors = [akLeft, akTop, akRight]
Caption = 'Shell'
TabOrder = 0
DesignSize = (
393
70)
object Label19: TLabel
Left = 12
Top = 20
Width = 26
Height = 13
Caption = 'S&hell:'
FocusControl = ShellEdit
end
object Label20: TLabel
Left = 12
Top = 44
Width = 104
Height = 13
Caption = '&Return code variable:'
FocusControl = ReturnVarEdit
end
object ShellEdit: TComboBox
Left = 168
Top = 15
Width = 213
Height = 21
AutoComplete = False
Anchors = [akLeft, akTop, akRight]
MaxLength = 50
TabOrder = 0
Text = 'ShellEdit'
Items.Strings = (
'Default'
'/bin/bash'
'/bin/ksh'
'/bin/sh'
'sudo su -')
end
object ReturnVarEdit: TComboBox
Left = 168
Top = 39
Width = 213
Height = 21
AutoComplete = False
Anchors = [akLeft, akTop, akRight]
MaxLength = 50
TabOrder = 1
Text = 'ReturnVarEdit'
Items.Strings = (
'Autodetect'
'?'
'status')
end
end
object ScpLsOptionsGroup: TGroupBox
Left = 0
Top = 84
Width = 393
Height = 69
Anchors = [akLeft, akTop, akRight]
Caption = 'Directory listing'
TabOrder = 1
DesignSize = (
393
69)
object Label9: TLabel
Left = 12
Top = 20
Width = 82
Height = 13
Caption = 'Listing &command:'
FocusControl = ListingCommandEdit
end
object IgnoreLsWarningsCheck: TCheckBox
Left = 12
Top = 42
Width = 140
Height = 17
Caption = 'Ignore LS &warnings'
TabOrder = 1
OnClick = DataChange
end
object SCPLsFullTimeAutoCheck: TCheckBox
Left = 168
Top = 42
Width = 217
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Try to get &full timestamp'
TabOrder = 2
OnClick = DataChange
end
object ListingCommandEdit: TComboBox
Left = 168
Top = 15
Width = 213
Height = 21
AutoComplete = False
Anchors = [akLeft, akTop, akRight]
MaxLength = 50
TabOrder = 0
Text = 'ListingCommandEdit'
Items.Strings = (
'ls -la'
'ls -gla')
end
end
end
object FtpSheet: TTabSheet
Tag = 2
HelpType = htKeyword
HelpKeyword = 'ui_login_ftp'
Caption = 'FTP'
ImageIndex = 16
TabVisible = False
DesignSize = (
401
382)
object FtpGroup: TGroupBox
Left = 0
Top = 6
Width = 393
Height = 251
Anchors = [akLeft, akTop, akRight]
Caption = 'Protocol options'
TabOrder = 0
DesignSize = (
393
251)
object Label25: TLabel
Left = 12
Top = 42
Width = 103
Height = 13
Caption = 'Post login &commands:'
FocusControl = PostLoginCommandsMemo
end
object FtpListAllLabel: TLabel
Left = 12
Top = 148
Width = 159
Height = 13
Caption = '&Support for listing of hidden files:'
FocusControl = FtpListAllCombo
end
object Label24: TLabel
Left = 12
Top = 124
Width = 188
Height = 13
Caption = 'Use &MLSD command for directory listing'
FocusControl = FtpUseMlsdCombo
end
object FtpForcePasvIpLabel: TLabel
Left = 12
Top = 172
Width = 230
Height = 13
Caption = '&Force IP address for passive mode connections:'
FocusControl = FtpForcePasvIpCombo
end
object FtpAccountLabel: TLabel
Left = 12
Top = 20
Width = 43
Height = 13
Caption = '&Account:'
FocusControl = FtpAccountEdit
end
object Label3: TLabel
Left = 12
Top = 196
Width = 232
Height = 13
Caption = 'Use &HOST command to select host on the server'
FocusControl = FtpHostCombo
end
object PostLoginCommandsMemo: TMemo
Left = 12
Top = 59
Width = 369
Height = 53
Anchors = [akLeft, akTop, akRight]
ScrollBars = ssVertical
TabOrder = 1
end
object FtpListAllCombo: TComboBox
Left = 320
Top = 143
Width = 61
Height = 21
Style = csDropDownList
Anchors = [akTop, akRight]
TabOrder = 3
OnChange = DataChange
end
object FtpForcePasvIpCombo: TComboBox
Left = 320
Top = 167
Width = 61
Height = 21
Style = csDropDownList
Anchors = [akTop, akRight]
TabOrder = 4
OnChange = DataChange
end
object FtpUseMlsdCombo: TComboBox
Left = 320
Top = 119
Width = 61
Height = 21
Style = csDropDownList
Anchors = [akTop, akRight]
TabOrder = 2
OnChange = DataChange
end
object FtpAccountEdit: TEdit
Left = 128
Top = 15
Width = 253
Height = 21
Anchors = [akLeft, akTop, akRight]
MaxLength = 100
TabOrder = 0
Text = 'FtpAccountEdit'
OnChange = DataChange
end
object FtpHostCombo: TComboBox
Left = 320
Top = 191
Width = 61
Height = 21
Style = csDropDownList
Anchors = [akTop, akRight]
TabOrder = 5
OnChange = DataChange
end
object VMSAllRevisionsCheck: TCheckBox
Left = 12
Top = 220
Width = 309
Height = 17
Caption = 'Display all file &revisions on VMS servers'
TabOrder = 6
end
end
end
object S3Sheet: TTabSheet
Tag = 2
HelpType = htKeyword
HelpKeyword = 'ui_login_s3'
Caption = 'S3'
ImageIndex = 16
TabVisible = False
DesignSize = (
401
382)
object S3Group: TGroupBox
Left = 0
Top = 6
Width = 393
Height = 97
Anchors = [akLeft, akTop, akRight]
Caption = 'Protocol options'
TabOrder = 0
DesignSize = (
393
97)
object Label27: TLabel
Left = 12
Top = 20
Width = 72
Height = 13
Caption = '&Default region:'
FocusControl = S3DefaultReqionCombo
end
object S3UrlStyleLabel: TLabel
Left = 12
Top = 44
Width = 49
Height = 13
Caption = '&URL style:'
FocusControl = S3UrlStyleCombo
end
object S3DefaultReqionCombo: TComboBox
Left = 168
Top = 15
Width = 213
Height = 21
Anchors = [akLeft, akTop, akRight]
DropDownCount = 16
MaxLength = 32
TabOrder = 0
Text = 'S3DefaultRegionCombo'
OnChange = DataChange
Items.Strings = (
'af-south-1'
'ap-east-1'
'ap-northeast-1'
'ap-northeast-2'
'ap-northeast-3'
'ap-south-1'
'ap-south-2'
'ap-southeast-1'
'ap-southeast-2'
'ap-southeast-3'
'ap-southeast-4'
'ca-central-1'
'ca-west-1'
'cn-north-1'
'cn-northwest-1'
'eu-central-1'
'eu-central-2'
'eu-north-1'
'eu-south-1'
'eu-south-2'
'eu-west-1'
'eu-west-2'
'eu-west-3'
'il-central-1'
'me-central-1'
'me-south-1'
'sa-east-1'
'us-east-1'
'us-east-2'
'us-gov-east-1'
'us-gov-west-1'
'us-west-1'
'us-west-2')
end
object S3UrlStyleCombo: TComboBox
Left = 168
Top = 39
Width = 213
Height = 21
AutoComplete = False
Style = csDropDownList
Anchors = [akLeft, akTop, akRight]
MaxLength = 50
TabOrder = 1
Items.Strings = (
'Virtual Host'
'Path')
end
object S3RequesterPaysCheck: TCheckBox
Left = 12
Top = 68
Width = 369
Height = 17
Caption = 'Requester &pays'
TabOrder = 2
end
end
object S3AuthenticationGroup: TGroupBox
Left = 1
Top = 109
Width = 393
Height = 143
Anchors = [akLeft, akTop, akRight]
Caption = 'Authentication'
TabOrder = 1
DesignSize = (
393
143)
object S3SessionTokenLabel: TLabel
Left = 12
Top = 20
Width = 70
Height = 13
Caption = '&Session token:'
FocusControl = S3SessionTokenMemo
end
object S3SessionTokenMemo: TMemo
Left = 11
Top = 36
Width = 371
Height = 93
Anchors = [akLeft, akTop, akRight, akBottom]
MaxLength = 10000
TabOrder = 0
OnChange = DataChange
OnKeyDown = NoteMemoKeyDown
end
end
end
object WebDavSheet: TTabSheet
Tag = 2
HelpType = htKeyword
HelpKeyword = 'ui_login_webdav'
Caption = 'WebDAV'
ImageIndex = 17
TabVisible = False
DesignSize = (
401
382)
object WebdavGroup: TGroupBox
Left = 0
Top = 6
Width = 393
Height = 46
Anchors = [akLeft, akTop, akRight]
Caption = 'Protocol options'
TabOrder = 0
DesignSize = (
393
46)
object WebDavLiberalEscapingCheck: TCheckBox
Left = 12
Top = 19
Width = 369
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = '&Tolerate non-encoded special characters in filenames'
TabOrder = 0
OnClick = DataChange
end
end
end
object ConnSheet: TTabSheet
Tag = 1
HelpType = htKeyword
HelpKeyword = 'ui_login_connection'
Caption = 'Connection'
ImageIndex = 7
TabVisible = False
DesignSize = (
401
382)
object FtpPingGroup: TGroupBox
Left = 0
Top = 132
Width = 393
Height = 117
Anchors = [akLeft, akTop, akRight]
Caption = 'Keepalives'
TabOrder = 3
DesignSize = (
393
117)
object FtpPingIntervalLabel: TLabel
Left = 12
Top = 90
Width = 142
Height = 13
Caption = 'Seconds &between keepalives:'
FocusControl = FtpPingIntervalSecEdit
end
object FtpPingIntervalSecEdit: TUpDownEdit
Left = 208
Top = 85
Width = 73
Height = 21
Alignment = taRightJustify
MaxValue = 3600.000000000000000000
MinValue = 1.000000000000000000
MaxLength = 4
TabOrder = 3
OnChange = DataChange
end
object FtpPingOffButton: TRadioButton
Left = 12
Top = 19
Width = 365
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = '&Off'
TabOrder = 0
OnClick = DataChange
end
object FtpPingDummyCommandButton: TRadioButton
Left = 12
Top = 42
Width = 365
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Executing &dummy protocol commands'
TabOrder = 1
OnClick = DataChange
end
object FtpPingDirectoryListingButton: TRadioButton
Left = 12
Top = 65
Width = 365
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = '&And additionally reading the current directory'
TabOrder = 2
OnClick = DataChange
end
end
object TimeoutGroup: TGroupBox
Left = 0
Top = 80
Width = 393
Height = 46
Anchors = [akLeft, akTop, akRight]
Caption = 'Timeouts'
TabOrder = 1
DesignSize = (
393
46)
object Label11: TLabel
Left = 12
Top = 19
Width = 122
Height = 13
Caption = 'Server &response timeout:'
FocusControl = TimeoutEdit
end
object Label12: TLabel
Left = 334
Top = 19
Width = 39
Height = 13
Anchors = [akTop, akRight]
Caption = 'seconds'
FocusControl = TimeoutEdit
end
object TimeoutEdit: TUpDownEdit
Left = 256
Top = 14
Width = 73
Height = 21
Alignment = taRightJustify
Increment = 5.000000000000000000
MaxValue = 6000.000000000000000000
MinValue = 5.000000000000000000
Anchors = [akTop, akRight]
MaxLength = 4
TabOrder = 0
OnChange = DataChange
end
end
object PingGroup: TGroupBox
Left = 0
Top = 132
Width = 393
Height = 117
Anchors = [akLeft, akTop, akRight]
Caption = 'Keepalives'
TabOrder = 2
DesignSize = (
393
117)
object PingIntervalLabel: TLabel
Left = 12
Top = 90
Width = 142
Height = 13
Caption = 'Seconds &between keepalives:'
FocusControl = PingIntervalSecEdit
end
object PingIntervalSecEdit: TUpDownEdit
Left = 256
Top = 85
Width = 73
Height = 21
Alignment = taRightJustify
MaxValue = 3600.000000000000000000
MinValue = 1.000000000000000000
Anchors = [akTop, akRight]
MaxLength = 4
TabOrder = 3
OnChange = DataChange
end
object PingOffButton: TRadioButton
Left = 12
Top = 19
Width = 365
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = '&Off'
TabOrder = 0
OnClick = DataChange
end
object PingNullPacketButton: TRadioButton
Left = 12
Top = 42
Width = 365
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Sending of &null SSH packets'
TabOrder = 1
OnClick = DataChange
end
object PingDummyCommandButton: TRadioButton
Left = 12
Top = 65
Width = 365
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Executing &dummy protocol commands'
TabOrder = 2
OnClick = DataChange
end
end
object IPvGroup: TGroupBox
Left = 0
Top = 255
Width = 393
Height = 46
Anchors = [akLeft, akTop, akRight]
Caption = 'Internet protocol version'
TabOrder = 4
object IPAutoButton: TRadioButton
Left = 12
Top = 19
Width = 101
Height = 17
Caption = 'A&uto'
TabOrder = 0
OnClick = DataChange
end
object IPv4Button: TRadioButton
Left = 124
Top = 19
Width = 101
Height = 17
Caption = 'IPv&4'
TabOrder = 1
OnClick = DataChange
end
object IPv6Button: TRadioButton
Left = 236
Top = 19
Width = 101
Height = 17
Caption = 'IPv&6'
TabOrder = 2
OnClick = DataChange
end
end
object ConnectionGroup: TGroupBox
Left = 0
Top = 6
Width = 393
Height = 69
Anchors = [akLeft, akTop, akRight]
Caption = 'Connection'
TabOrder = 0
DesignSize = (
393
69)
object FtpPasvModeCheck: TCheckBox
Left = 12
Top = 19
Width = 369
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = '&Passive mode'
TabOrder = 0
OnClick = DataChange
end
object BufferSizeCheck: TCheckBox
Left = 12
Top = 42
Width = 369
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Optimize connection &buffer size'
TabOrder = 1
OnClick = DataChange
end
end
end
object ProxySheet: TTabSheet
Tag = 2
HelpType = htKeyword
HelpKeyword = 'ui_login_proxy'
Caption = 'Proxy'
ImageIndex = 8
TabVisible = False
DesignSize = (
401
382)
object ProxyTypeGroup: TGroupBox
Left = 0
Top = 6
Width = 393
Height = 164
Anchors = [akLeft, akTop, akRight]
Caption = 'Proxy'
TabOrder = 0
DesignSize = (
393
164)
object ProxyMethodLabel: TLabel
Left = 12
Top = 20
Width = 57
Height = 13
Caption = 'Proxy &type:'
FocusControl = SshProxyMethodCombo
end
object ProxyHostLabel: TLabel
Left = 12
Top = 41
Width = 85
Height = 13
Caption = 'Pro&xy host name:'
FocusControl = ProxyHostEdit
end
object ProxyPortLabel: TLabel
Left = 284
Top = 41
Width = 63
Height = 13
Anchors = [akTop, akRight]
Caption = 'Po&rt number:'
FocusControl = ProxyPortEdit
end
object ProxyUsernameLabel: TLabel
Left = 12
Top = 85
Width = 55
Height = 13
Caption = '&User name:'
FocusControl = ProxyUsernameEdit
end
object ProxyPasswordLabel: TLabel
Left = 200
Top = 85
Width = 50
Height = 13
Caption = '&Password:'
FocusControl = ProxyPasswordEdit
end
object SshProxyMethodCombo: TComboBox
Left = 128
Top = 15
Width = 110
Height = 21
Style = csDropDownList
TabOrder = 0
OnChange = DataChange
Items.Strings = (
'None'
'SOCKS4'
'SOCKS5'
'HTTP'
'Telnet'
'Local')
end
object ProxyPortEdit: TUpDownEdit
Left = 284
Top = 58
Width = 98
Height = 21
Alignment = taRightJustify
MaxValue = 65535.000000000000000000
MinValue = 1.000000000000000000
Anchors = [akTop, akRight]
TabOrder = 4
OnChange = DataChange
end
object ProxyHostEdit: TEdit
Left = 12
Top = 58
Width = 266
Height = 21
Anchors = [akLeft, akTop, akRight]
MaxLength = 255
TabOrder = 3
Text = 'ProxyHostEdit'
OnChange = DataChange
end
object ProxyUsernameEdit: TEdit
Left = 12
Top = 102
Width = 182
Height = 21
MaxLength = 100
TabOrder = 5
Text = 'ProxyUsernameEdit'
OnChange = DataChange
end
object ProxyPasswordEdit: TPasswordEdit
Left = 200
Top = 102
Width = 182
Height = 21
Anchors = [akLeft, akTop, akRight]
MaxLength = 100
TabOrder = 6
Text = 'ProxyPasswordEdit'
OnChange = DataChange
end
object FtpProxyMethodCombo: TComboBox
Left = 128
Top = 15
Width = 254
Height = 21
Style = csDropDownList
Anchors = [akLeft, akTop, akRight]
DropDownCount = 12
TabOrder = 1
OnChange = DataChange
Items.Strings = (
'None'
'SOCKS4'
'SOCKS5'
'HTTP'
'SITE %host'
'USER %proxyuser, USER %user@%host'
'OPEN %host'
'USER %proxyuser, USER %user'
'USER %user@%host'
'USER %proxyuser@%host'
'USER %user@%host %proxyuser'
'USER %user@%proxyuser@%host')
end
object NeonProxyMethodCombo: TComboBox
Left = 128
Top = 15
Width = 110
Height = 21
Style = csDropDownList
TabOrder = 2
OnChange = DataChange
Items.Strings = (
'None'
'SOCKS4'
'SOCKS5'
'HTTP')
end
object ProxyAutodetectButton: TButton
Left = 12
Top = 129
Width = 100
Height = 25
Caption = '&Autodetect'
TabOrder = 7
OnClick = ProxyAutodetectButtonClick
end
end
object ProxySettingsGroup: TGroupBox
Left = 0
Top = 176
Width = 393
Height = 128
Anchors = [akLeft, akTop, akRight]
Caption = 'Proxy settings'
TabOrder = 1
DesignSize = (
393
128)
object ProxyTelnetCommandLabel: TLabel
Left = 12
Top = 18
Width = 82
Height = 13
Caption = 'Telnet co&mmand:'
FocusControl = ProxyTelnetCommandEdit
end
object Label17: TLabel
Left = 12
Top = 99
Width = 168
Height = 13
Caption = 'Do &DNS name lookup at proxy end:'
FocusControl = ProxyDNSCombo
end
object ProxyLocalCommandLabel: TLabel
Left = 12
Top = 18
Width = 107
Height = 13
Caption = 'Local proxy co&mmand:'
FocusControl = ProxyLocalCommandEdit
end
object ProxyTelnetCommandEdit: TEdit
Left = 12
Top = 35
Width = 370
Height = 21
Anchors = [akLeft, akTop, akRight]
MaxLength = 255
TabOrder = 0
Text = 'ProxyTelnetCommandEdit'
OnChange = DataChange
end
object ProxyLocalhostCheck: TCheckBox
Left = 12
Top = 77
Width = 370
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Co&nsider proxying local host connections'
TabOrder = 5
end
object ProxyDNSCombo: TComboBox
Left = 252
Top = 94
Width = 130
Height = 21
Style = csDropDownList
Anchors = [akLeft, akTop, akRight]
TabOrder = 6
Items.Strings = (
'Auto'
'No'
'Yes')
end
object ProxyLocalCommandEdit: TEdit
Left = 12
Top = 35
Width = 274
Height = 21
Anchors = [akLeft, akTop, akRight]
TabOrder = 2
Text = 'ProxyLocalCommandEdit'
OnChange = DataChange
end
object ProxyLocalCommandBrowseButton: TButton
Left = 300
Top = 33
Width = 82
Height = 25
Anchors = [akTop, akRight]
Caption = '&Browse...'
TabOrder = 3
OnClick = ProxyLocalCommandBrowseButtonClick
end
object ProxyTelnetCommandHintText: TStaticText
Left = 303
Top = 58
Width = 79
Height = 16
Alignment = taRightJustify
Anchors = [akTop, akRight]
AutoSize = False
Caption = 'patterns'
TabOrder = 1
TabStop = True
end
object ProxyLocalCommandHintText: TStaticText
Left = 207
Top = 58
Width = 79
Height = 16
Alignment = taRightJustify
Anchors = [akTop, akRight]
AutoSize = False
Caption = 'patterns'
TabOrder = 4
TabStop = True
end
end
end
object TunnelSheet: TTabSheet
Tag = 2
HelpType = htKeyword
HelpKeyword = 'ui_login_tunnel'
Caption = 'Tunnel'
ImageIndex = 14
TabVisible = False
DesignSize = (
401
382)
object TunnelSessionGroup: TGroupBox
Left = 0
Top = 32
Width = 393
Height = 118
Anchors = [akLeft, akTop, akRight]
Caption = 'Host to set up tunnel on'
TabOrder = 1
DesignSize = (
393
118)
object Label6: TLabel
Left = 12
Top = 18
Width = 55
Height = 13
Caption = '&Host name:'
FocusControl = TunnelHostNameEdit
end
object Label14: TLabel
Left = 284
Top = 18
Width = 63
Height = 13
Anchors = [akTop, akRight]
Caption = 'Po&rt number:'
FocusControl = TunnelPortNumberEdit
end
object Label15: TLabel
Left = 12
Top = 68
Width = 55
Height = 13
Caption = '&User name:'
FocusControl = TunnelUserNameEdit
end
object Label16: TLabel
Left = 200
Top = 68
Width = 50
Height = 13
Caption = '&Password:'
FocusControl = TunnelPasswordEdit
end
object TunnelHostNameEdit: TEdit
Left = 12
Top = 35
Width = 266
Height = 21
Anchors = [akLeft, akTop, akRight]
MaxLength = 255
TabOrder = 0
Text = 'TunnelHostNameEdit'
OnChange = DataChange
end
object TunnelUserNameEdit: TEdit
Left = 12
Top = 85
Width = 182
Height = 21
MaxLength = 100
TabOrder = 2
Text = 'TunnelUserNameEdit'
OnChange = DataChange
end
object TunnelPasswordEdit: TPasswordEdit
Left = 200
Top = 85
Width = 182
Height = 21
Anchors = [akLeft, akTop, akRight]
MaxLength = 100
TabOrder = 3
Text = 'TunnelPasswordEdit'
OnChange = DataChange
end
object TunnelPortNumberEdit: TUpDownEdit
Left = 284
Top = 35
Width = 98
Height = 21
Alignment = taRightJustify
MaxValue = 65535.000000000000000000
MinValue = 1.000000000000000000
Anchors = [akTop, akRight]
TabOrder = 1
OnChange = DataChange
end
end
object TunnelCheck: TCheckBox
Left = 12
Top = 8
Width = 382
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = '&Connect through SSH tunnel'
TabOrder = 0
OnClick = DataChange
end
object TunnelOptionsGroup: TGroupBox
Left = 0
Top = 156
Width = 393
Height = 47
Anchors = [akLeft, akTop, akRight]
Caption = 'Tunnel options'
TabOrder = 2
DesignSize = (
393
47)
object Label21: TLabel
Left = 12
Top = 20
Width = 84
Height = 13
Caption = '&Local tunnel port:'
FocusControl = TunnelLocalPortNumberEdit
end
object TunnelLocalPortNumberEdit: TComboBox
Left = 284
Top = 15
Width = 98
Height = 21
AutoComplete = False
Anchors = [akTop, akRight]
MaxLength = 50
TabOrder = 0
Text = 'TunnelLocalPortNumberEdit'
OnChange = DataChange
Items.Strings = (
'Autoselect')
end
end
object TunnelAuthenticationParamsGroup: TGroupBox
Left = 0
Top = 209
Width = 393
Height = 68
Anchors = [akLeft, akTop, akRight]
Caption = 'Tunnel authentication parameters'
TabOrder = 3
DesignSize = (
393
68)
object Label18: TLabel
Left = 12
Top = 18
Width = 75
Height = 13
Caption = 'Private &key file:'
FocusControl = TunnelPrivateKeyEdit3
end
object TunnelPrivateKeyEdit3: TFilenameEdit
Left = 12
Top = 35
Width = 370
Height = 21
AcceptFiles = True
OnBeforeDialog = PathEditBeforeDialog
OnAfterDialog = PrivateKeyEdit3AfterDialog
Filter =
'PuTTY Private Key Files (*.ppk)|*.ppk|All Private Key Files (*.p' +
'pk;*.pem;*.key;id_*)|*.ppk;*.pem;*.key;id_*|All Files (*.*)|*.*'
DialogOptions = [ofReadOnly, ofPathMustExist, ofFileMustExist]
DialogTitle = 'Select private key file'
ClickKey = 16397
Anchors = [akLeft, akTop, akRight]
TabOrder = 0
Text = 'TunnelPrivateKeyEdit3'
OnChange = DataChange
end
end
end
object SslSheet: TTabSheet
Tag = 2
HelpType = htKeyword
HelpKeyword = 'ui_login_tls'
Caption = 'TLS/SSL'
ImageIndex = 13
TabVisible = False
DesignSize = (
401
382)
object TlsGroup: TGroupBox
Left = 0
Top = 6
Width = 393
Height = 99
Anchors = [akLeft, akTop, akRight]
Caption = 'TLS options'
TabOrder = 0
DesignSize = (
393
99)
object MinTlsVersionLabel: TLabel
Left = 12
Top = 20
Width = 102
Height = 13
Caption = 'Mi&nimum TLS version:'
FocusControl = MinTlsVersionCombo
end
object MaxTlsVersionLabel: TLabel
Left = 12
Top = 44
Width = 106
Height = 13
Caption = 'Ma&ximum TLS version:'
FocusControl = MaxTlsVersionCombo
end
object MinTlsVersionCombo: TComboBox
Left = 304
Top = 15
Width = 77
Height = 21
Style = csDropDownList
Anchors = [akTop, akRight]
TabOrder = 0
OnChange = MinTlsVersionComboChange
Items.Strings = (
'TLS 1.0'
'TLS 1.1'
'TLS 1.2'
'TLS 1.3')
end
object MaxTlsVersionCombo: TComboBox
Left = 304
Top = 39
Width = 77
Height = 21
Style = csDropDownList
Anchors = [akTop, akRight]
TabOrder = 1
OnChange = MaxTlsVersionComboChange
Items.Strings = (
'TLS 1.0'
'TLS 1.1'
'TLS 1.2'
'TLS 1.3')
end
object SslSessionReuseCheck2: TCheckBox
Left = 12
Top = 68
Width = 365
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = '&Reuse TLS session ID for data connections'
TabOrder = 2
OnClick = DataChange
end
end
object TlsAuthenticationGroup: TGroupBox
Left = 0
Top = 113
Width = 393
Height = 72
Anchors = [akLeft, akTop, akRight]
Caption = 'Authentication parameters'
TabOrder = 1
DesignSize = (
393
72)
object Label4: TLabel
Left = 12
Top = 20
Width = 99
Height = 13
Caption = '&Client certificate file:'
FocusControl = TlsCertificateFileEdit
end
object TlsCertificateFileEdit: TFilenameEdit
Left = 12
Top = 37
Width = 372
Height = 21
AcceptFiles = True
OnBeforeDialog = PathEditBeforeDialog
OnAfterDialog = TlsCertificateFileEditAfterDialog
Filter =
'Certificates and private key files (*.pfx;*.p12;*.key;*.pem)|*.p' +
'fx;*.p12;*.key;*.pem|All Files (*.*)|*.*'
DialogOptions = [ofReadOnly, ofPathMustExist, ofFileMustExist]
DialogTitle = 'Select client certificate file'
ClickKey = 16397
Anchors = [akLeft, akTop, akRight]
TabOrder = 0
Text = 'TlsCertificateFileEdit'
OnChange = DataChange
end
end
end
object AdvancedSheet: TTabSheet
Tag = 1
HelpType = htKeyword
HelpKeyword = 'ui_login_ssh'
Caption = 'SSH'
ImageIndex = 2
TabVisible = False
DesignSize = (
401
382)
object ProtocolGroup: TGroupBox
Left = 0
Top = 6
Width = 393
Height = 46
Anchors = [akLeft, akTop, akRight]
Caption = 'Protocol options'
TabOrder = 0
DesignSize = (
393
46)
object CompressionCheck: TCheckBox
Left = 16
Top = 19
Width = 367
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Enable &compression'
TabOrder = 0
OnClick = DataChange
end
end
object EncryptionGroup: TGroupBox
Left = 0
Top = 58
Width = 393
Height = 171
Anchors = [akLeft, akTop, akRight]
Caption = 'Encryption options'
TabOrder = 1
DesignSize = (
393
171)
object Label8: TLabel
Left = 12
Top = 19
Width = 162
Height = 13
Caption = 'Encryption cipher selection &policy:'
FocusControl = CipherListBox
end
object CipherListBox: TListBox
Left = 12
Top = 36
Width = 285
Height = 99
Anchors = [akLeft, akTop, akRight]
DragMode = dmAutomatic
ItemHeight = 13
TabOrder = 0
OnClick = DataChange
OnDragDrop = AlgListBoxDragDrop
OnDragOver = AlgListBoxDragOver
OnStartDrag = AlgListBoxStartDrag
end
object Ssh2LegacyDESCheck: TCheckBox
Left = 16
Top = 142
Width = 367
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Enable legacy use of single-&DES'
TabOrder = 3
end
object CipherUpButton: TButton
Left = 303
Top = 36
Width = 80
Height = 25
Anchors = [akTop, akRight]
Caption = '&Up'
TabOrder = 1
OnClick = CipherButtonClick
end
object CipherDownButton: TButton
Left = 303
Top = 68
Width = 80
Height = 25
Anchors = [akTop, akRight]
Caption = '&Down'
TabOrder = 2
OnClick = CipherButtonClick
end
end
end
object KexSheet: TTabSheet
Tag = 2
HelpType = htKeyword
HelpKeyword = 'ui_login_kex'
Caption = 'Key exchange'
ImageIndex = 13
TabVisible = False
DesignSize = (
401
382)
object KexOptionsGroup: TGroupBox
Left = 0
Top = 6
Width = 393
Height = 222
Anchors = [akLeft, akTop, akRight]
Caption = 'Key exchange algorithm options'
TabOrder = 0
DesignSize = (
393
222)
object Label28: TLabel
Left = 12
Top = 19
Width = 124
Height = 13
Caption = 'Algorithm selection &policy:'
FocusControl = KexListBox
end
object KexListBox: TListBox
Left = 12
Top = 36
Width = 285
Height = 153
Anchors = [akLeft, akTop, akRight]
DragMode = dmAutomatic
ItemHeight = 13
TabOrder = 0
OnClick = DataChange
OnDragDrop = AlgListBoxDragDrop
OnDragOver = AlgListBoxDragOver
OnStartDrag = AlgListBoxStartDrag
end
object KexUpButton: TButton
Left = 303
Top = 36
Width = 80
Height = 25
Anchors = [akTop, akRight]
Caption = '&Up'
TabOrder = 1
OnClick = KexButtonClick
end
object KexDownButton: TButton
Left = 303
Top = 68
Width = 80
Height = 25
Anchors = [akTop, akRight]
Caption = '&Down'
TabOrder = 2
OnClick = KexButtonClick
end
object AuthGSSAPIKEXCheck: TCheckBox
Left = 12
Top = 195
Width = 285
Height = 17
Caption = 'Attempt &GSSAPI key exchange'
TabOrder = 3
OnClick = DataChange
end
end
object KexReexchangeGroup: TGroupBox
Left = 0
Top = 235
Width = 393
Height = 69
Anchors = [akLeft, akTop, akRight]
Caption = 'Options controlling key re-exchange'
TabOrder = 1
DesignSize = (
393
69)
object Label31: TLabel
Left = 12
Top = 20
Width = 199
Height = 13
Caption = 'Max &minutes before rekey (0 for no limit):'
Color = clBtnFace
FocusControl = RekeyTimeEdit
ParentColor = False
end
object Label32: TLabel
Left = 12
Top = 44
Width = 184
Height = 13
Caption = 'Ma&x data before rekey (0 for no limit):'
Color = clBtnFace
FocusControl = RekeyDataEdit
ParentColor = False
end
object RekeyTimeEdit: TUpDownEdit
Left = 303
Top = 12
Width = 80
Height = 21
Alignment = taRightJustify
MaxValue = 1440.000000000000000000
Anchors = [akTop, akRight]
MaxLength = 4
TabOrder = 0
OnChange = DataChange
end
object RekeyDataEdit: TEdit
Left = 303
Top = 39
Width = 80
Height = 21
Anchors = [akTop, akRight]
MaxLength = 10
TabOrder = 1
OnChange = DataChange
end
end
end
object AuthSheet: TTabSheet
Tag = 2
HelpType = htKeyword
HelpKeyword = 'ui_login_authentication'
Caption = 'Authentication'
ImageIndex = 10
TabVisible = False
DesignSize = (
401
382)
object SshNoUserAuthCheck: TCheckBox
Left = 12
Top = 8
Width = 382
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = '&Bypass authentication entirely'
TabOrder = 0
OnClick = DataChange
end
object AuthenticationGroup: TGroupBox
Left = 0
Top = 32
Width = 393
Height = 94
Anchors = [akLeft, akTop, akRight]
Caption = 'Authentication options'
TabOrder = 1
DesignSize = (
393
94)
object TryAgentCheck: TCheckBox
Left = 12
Top = 19
Width = 373
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Attempt authentication using &Pageant'
TabOrder = 0
OnClick = DataChange
end
object AuthKICheck: TCheckBox
Left = 12
Top = 42
Width = 373
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Attempt '#39'keyboard-&interactive'#39' authentication'
TabOrder = 1
OnClick = DataChange
end
object AuthKIPasswordCheck: TCheckBox
Left = 32
Top = 65
Width = 353
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Respond with a pass&word to the first prompt'
TabOrder = 2
OnClick = DataChange
end
end
object AuthenticationParamsGroup: TGroupBox
Left = 0
Top = 132
Width = 393
Height = 164
Anchors = [akLeft, akTop, akRight]
Caption = 'Authentication parameters'
TabOrder = 2
DesignSize = (
393
164)
object PrivateKeyLabel: TLabel
Left = 12
Top = 42
Width = 75
Height = 13
Caption = 'Private &key file:'
FocusControl = PrivateKeyEdit3
end
object DetachedCertificateLabel: TLabel
Left = 12
Top = 117
Width = 186
Height = 13
Caption = 'Certificate to &use with the private key:'
FocusControl = DetachedCertificateEdit
end
object AgentFwdCheck: TCheckBox
Left = 12
Top = 19
Width = 373
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Allow agent &forwarding'
TabOrder = 0
OnClick = DataChange
end
object PrivateKeyEdit3: TFilenameEdit
Left = 12
Top = 59
Width = 372
Height = 21
AcceptFiles = True
OnBeforeDialog = PathEditBeforeDialog
OnAfterDialog = PrivateKeyEdit3AfterDialog
Filter =
'PuTTY Private Key Files (*.ppk)|*.ppk|All Private Key Files (*.p' +
'pk;*.pem;*.key;id_*)|*.ppk;*.pem;*.key;id_*|All Files (*.*)|*.*'
DialogOptions = [ofReadOnly, ofPathMustExist, ofFileMustExist]
DialogTitle = 'Select private key file'
ClickKey = 16397
Anchors = [akLeft, akTop, akRight]
TabOrder = 1
Text = 'PrivateKeyEdit3'
OnChange = DataChange
end
object PrivateKeyToolsButton: TButton
Left = 201
Top = 86
Width = 101
Height = 25
Caption = '&Tools'
TabOrder = 3
OnClick = PrivateKeyToolsButtonClick
end
object PrivateKeyViewButton: TButton
Left = 12
Top = 86
Width = 183
Height = 25
Caption = '&Display Public Key'
TabOrder = 2
OnClick = PrivateKeyViewButtonClick
end
object DetachedCertificateEdit: TFilenameEdit
Left = 12
Top = 133
Width = 372
Height = 21
AcceptFiles = True
OnBeforeDialog = PathEditBeforeDialog
OnAfterDialog = PrivateKeyEdit3AfterDialog
Filter = 'Public key files (*.pub)|*.pub|All Files (*.*)|*.*'
DialogOptions = [ofReadOnly, ofPathMustExist, ofFileMustExist]
DialogTitle = 'Select certificate file'
ClickKey = 16397
Anchors = [akLeft, akTop, akRight]
TabOrder = 4
Text = 'DetachedCertificateEdit'
OnChange = DataChange
end
end
object GSSAPIGroup: TGroupBox
Left = 0
Top = 302
Width = 393
Height = 71
Anchors = [akLeft, akTop, akRight]
Caption = 'GSSAPI'
TabOrder = 3
DesignSize = (
393
71)
object AuthGSSAPICheck3: TCheckBox
Left = 12
Top = 19
Width = 373
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Attempt &GSSAPI authentication'
TabOrder = 0
OnClick = AuthGSSAPICheck3Click
end
object GSSAPIFwdTGTCheck: TCheckBox
Left = 32
Top = 42
Width = 353
Height = 17
Anchors = [akLeft, akTop, akRight]
Caption = 'Allow GSSAPI &credential delegation'
TabOrder = 1
OnClick = AuthGSSAPICheck3Click
end
end
end
object BugsSheet: TTabSheet
Tag = 2
HelpType = htKeyword
HelpKeyword = 'ui_login_bugs'
Caption = 'Bugs'
ImageIndex = 9
TabVisible = False
DesignSize = (
401
382)
object BugsGroupBox: TGroupBox
Left = 0
Top = 6
Width = 393
Height = 217
Anchors = [akLeft, akTop, akRight]
Caption = 'Detection of known bugs in SSH servers'
TabOrder = 0
DesignSize = (
393
217)
object BugHMAC2Label: TLabel
Left = 12
Top = 68
Width = 144
Height = 13
Caption = 'Miscomputes SSH H&MAC keys:'
FocusControl = BugHMAC2Combo
end
object BugDeriveKey2Label: TLabel
Left = 12
Top = 92
Width = 166
Height = 13
Caption = 'Miscomputes SSH &encryption keys:'
FocusControl = BugDeriveKey2Combo
end
object BugRSAPad2Label: TLabel
Left = 12
Top = 116
Width = 200
Height = 13
Caption = 'Requires &padding on SSH RSA signatures:'
FocusControl = BugRSAPad2Combo
end
object BugPKSessID2Label: TLabel
Left = 12
Top = 140
Width = 185
Height = 13
Caption = 'Misuses the sessio&n ID in SSH PK auth:'
FocusControl = BugPKSessID2Combo
end
object BugRekey2Label: TLabel
Left = 12
Top = 164
Width = 177
Height = 13
Caption = 'Handles SSH &key re-exchange badly:'
FocusControl = BugRekey2Combo
end
object BugMaxPkt2Label: TLabel
Left = 12
Top = 188
Width = 166
Height = 13
Caption = 'Ignores SSH ma&ximum packet size:'
FocusControl = BugMaxPkt2Combo
end
object BugIgnore2Label: TLabel
Left = 12
Top = 20
Width = 159
Height = 13
Caption = 'Chokes on SSH i&gnore messages:'
FocusControl = BugIgnore2Combo
end
object BugWinAdjLabel: TLabel
Left = 12
Top = 44
Width = 202
Height = 13
Caption = 'Chokes on WinSCP'#39's SSH '#39'&winadj'#39' requests:'
FocusControl = BugWinAdjCombo
end
object BugHMAC2Combo: TComboBox
Left = 320
Top = 63
Width = 61
Height = 21
Style = csDropDownList
Anchors = [akTop, akRight]
TabOrder = 2
OnChange = DataChange
end
object BugDeriveKey2Combo: TComboBox
Left = 320
Top = 87
Width = 61
Height = 21
Style = csDropDownList
Anchors = [akTop, akRight]
TabOrder = 3
OnChange = DataChange
end
object BugRSAPad2Combo: TComboBox
Left = 320
Top = 111
Width = 61
Height = 21
Style = csDropDownList
Anchors = [akTop, akRight]
TabOrder = 4
OnChange = DataChange
end
object BugPKSessID2Combo: TComboBox
Left = 320
Top = 135
Width = 61
Height = 21
Style = csDropDownList
Anchors = [akTop, akRight]
TabOrder = 5
OnChange = DataChange
end
object BugRekey2Combo: TComboBox
Left = 320
Top = 159
Width = 61
Height = 21
Style = csDropDownList
Anchors = [akTop, akRight]
TabOrder = 6
OnChange = DataChange
end
object BugMaxPkt2Combo: TComboBox
Left = 320
Top = 183
Width = 61
Height = 21
Style = csDropDownList
Anchors = [akTop, akRight]
TabOrder = 7
OnChange = DataChange
end
object BugIgnore2Combo: TComboBox
Left = 320
Top = 15
Width = 61
Height = 21
Style = csDropDownList
Anchors = [akTop, akRight]
TabOrder = 0
OnChange = DataChange
end
object BugWinAdjCombo: TComboBox
Left = 320
Top = 39
Width = 61
Height = 21
Style = csDropDownList
Anchors = [akTop, akRight]
TabOrder = 1
OnChange = DataChange
end
end
end
object NoteSheet: TTabSheet
HelpType = htKeyword
HelpKeyword = 'ui_login_note'
Caption = 'Note'
ImageIndex = 14
TabVisible = False
DesignSize = (
401
382)
object NoteGroup: TGroupBox
Left = 0
Top = 6
Width = 393
Height = 367
Anchors = [akLeft, akTop, akRight, akBottom]
Caption = 'Note'
TabOrder = 0
DesignSize = (
393
367)
object NoteMemo: TMemo
Left = 12
Top = 21
Width = 371
Height = 332
Anchors = [akLeft, akTop, akRight, akBottom]
MaxLength = 4000
TabOrder = 0
OnChange = DataChange
OnKeyDown = NoteMemoKeyDown
end
end
end
end
object LeftPanel: TPanel
Left = 0
Top = 0
Width = 152
Height = 392
Align = alLeft
BevelOuter = bvNone
TabOrder = 0
DesignSize = (
152
392)
object NavigationTree: TTreeView
Left = 8
Top = 9
Width = 136
Height = 379
Anchors = [akLeft, akTop, akRight, akBottom]
DoubleBuffered = True
HideSelection = False
HotTrack = True
Indent = 19
ParentDoubleBuffered = False
ReadOnly = True
ShowButtons = False
ShowRoot = False
TabOrder = 0
OnChange = NavigationTreeChange
OnCollapsing = NavigationTreeCollapsing
Items.NodeData = {
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
000000000000000000000001054E006F00740065005800}
end
end
end
object OKBtn: TButton
Left = 307
Top = 401
Width = 75
Height = 25
Anchors = [akRight, akBottom]
Caption = 'OK'
Default = True
ModalResult = 1
TabOrder = 1
end
object CancelBtn: TButton
Left = 392
Top = 401
Width = 75
Height = 25
Anchors = [akRight, akBottom]
Cancel = True
Caption = 'Cancel'
ModalResult = 2
TabOrder = 2
end
object HelpButton: TButton
Left = 475
Top = 401
Width = 75
Height = 25
Anchors = [akRight, akBottom]
Caption = '&Help'
TabOrder = 3
OnClick = HelpButtonClick
end
object ColorButton: TButton
Left = 8
Top = 401
Width = 82
Height = 25
Anchors = [akLeft, akBottom]
Caption = '&Color'
TabOrder = 4
OnClick = ColorButtonClick
end
object ColorImageList: TImageList
AllocBy = 1
Left = 36
Top = 289
Bitmap = {
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
000000000000}
end
object ColorImageList120: TImageList
AllocBy = 1
Height = 20
Width = 20
Left = 132
Top = 289
Bitmap = {
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
00000000000000000000000000000000000000000000}
end
object ColorImageList144: TImageList
AllocBy = 1
Height = 24
Width = 24
Left = 36
Top = 337
Bitmap = {
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
your_sha256_hash
000000000000}
end
object ColorImageList192: TImageList
AllocBy = 1
Height = 32
Width = 32
Left = 132
Top = 337
Bitmap = {
your_sha256_hash
000000000000}
end
object PrivateKeyMenu: TPopupMenu
Left = 128
Top = 384
object PrivateKeyGenerateItem: TMenuItem
Caption = '&Generate New Key Pair with PuTTYgen...'
OnClick = PrivateKeyGenerateItemClick
end
object PrivateKeyUploadItem: TMenuItem
Caption = '&Install Public Key into Server...'
OnClick = PrivateKeyUploadItemClick
end
end
end
```
|
```python
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
# 1. Redistributions of source code must retain the above copyright notice,
# this list of conditions and the following disclaimer.
#
# 2. Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution.
#
# 3. Neither the name of the copyright holder nor the names of its
# contributors may be used to endorse or promote products derived from
# this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
# THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
# PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR
# CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
# EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
# PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS;
# OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
# WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR
# OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF
# ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
from . import Pmod
from . import PMOD_GROVE_G3
from . import PMOD_GROVE_G4

__author__ = "Marco Rabozzi, Luca Cerina, Giuseppe Natale"

PMOD_GROVE_DLIGHT_PROGRAM = "pmod_grove_dlight.bin"
CONFIG_IOP_SWITCH = 0x1
GET_LIGHT_VALUE = 0x3
GET_LUX_VALUE = 0x5


class Grove_Dlight(object):
    """This class controls the Grove IIC digital light sensor.

    Hardware version: v1.3.

    Attributes
    ----------
    microblaze : Pmod
        Microblaze processor instance used by this module.

    """

    def __init__(self, mb_info, gr_pin):
        """Return a new instance of a Grove_Dlight object.

        Parameters
        ----------
        mb_info : dict
            A dictionary storing Microblaze information, such as the
            IP name and the reset name.
        gr_pin: list
            A group of pins on pmod-grove adapter.

        """
        if gr_pin not in [PMOD_GROVE_G3,
                          PMOD_GROVE_G4]:
            raise ValueError("Group number can only be G3 - G4.")

        self.microblaze = Pmod(mb_info, PMOD_GROVE_DLIGHT_PROGRAM)
        self.microblaze.write_mailbox(0, gr_pin)
        self.microblaze.write_blocking_command(CONFIG_IOP_SWITCH)

    def read_raw_light(self):
        """Read the visible and IR channel values.

        Read the values from the grove digital light peripheral.

        Returns
        -------
        tuple
            A tuple containing 2 integer values ch0 (visible) and ch1 (IR).

        """
        self.microblaze.write_blocking_command(GET_LIGHT_VALUE)
        ch0, ch1 = self.microblaze.read_mailbox(0, 2)
        return ch0, ch1

    def read_lux(self):
        """Read the computed lux value of the sensor.

        Returns
        -------
        int
            The lux value from the sensor

        """
        self.microblaze.write_blocking_command(GET_LUX_VALUE)
        lux = self.microblaze.read_mailbox(0x8)
        return lux
```
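`read_lux` above delegates the lux computation to the Microblaze firmware. For intuition, a host-side approximation from the two raw channels can be sketched; the piecewise coefficients below follow the common TSL2561 datasheet-style approximation and are an assumption for illustration, not values taken from `pmod_grove_dlight.bin`:

```python
def approximate_lux(ch0, ch1):
    """Approximate lux from ch0 (visible + IR) and ch1 (IR-only) counts.

    Piecewise fit in the style of the TSL2561 datasheet; the coefficients
    are illustrative assumptions, not the firmware's actual values.
    """
    if ch0 == 0:
        return 0.0
    ratio = ch1 / ch0
    if ratio <= 0.50:
        return 0.0304 * ch0 - 0.062 * ch0 * (ratio ** 1.4)
    elif ratio <= 0.61:
        return 0.0224 * ch0 - 0.031 * ch1
    elif ratio <= 0.80:
        return 0.0128 * ch0 - 0.0153 * ch1
    elif ratio <= 1.30:
        return 0.00146 * ch0 - 0.00112 * ch1
    # Readings dominated by IR are treated as dark.
    return 0.0


# A visible-dominated reading yields a positive lux estimate.
print(approximate_lux(1000, 300) > 0)
```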
|
Jean-Michel de Venture de Paradis (8 May 1739, Marseille – 16 May 1799, Acre, aged 60) was an 18th-century French orientalist and dragoman.
Biography
Born into a family of diplomats and soldiers (King's interpreters in the Levant), he studied at the École des jeunes de langues, housed in the premises of the collège Louis-le-Grand in Paris. After an internship at the Embassy of France in Constantinople, he held various posts as dragoman in Syria, Egypt, Morocco, Tunis and Algiers. He also took part, as an interpreter, in the inspection mission of the Levant entrusted to the baron de Tott, secretary and interpreter of the Embassy of France in Constantinople.
He returned to Paris in 1797 to occupy the chair of Turkish language at the Institut national des langues et civilisations orientales.
He was the oldest member of the Commission des sciences et des arts and was appointed first interpreter (military interpreter) of the Armée d'Orient. He became a member of the Institut d'Égypte 22 August 1798, in the art and literature section.
Jean-Joseph Marcel, who was his pupil, says he died of dysentery, while others speak of plague. Another hypothesis holds that he died on 19 April 1799, at Nazareth, of an illness following the Siege of Acre.
On 14 June 1774 he married Victoria Digeon in Cairo; they had two daughters. One, Jeanne Venture de Paradis, in 1810 married the clockmaker Antoine Louis Breguet, son of the famous Abraham-Louis Breguet, an ancestor of actress Clémentine Célarié; the other daughter married Joseph Sulkowski, a Polish aristocrat and favorite aide of Napoleon Bonaparte during the expedition of Egypt.
Bibliography
Tunis et Alger au XVIIIe siècle, Sindbad, 1999
Alger au XVIIIe siècle, Alger, Jourdan, 1898
References
External links
Jean-Michel Venture de Paradis (1739-1799) on data.bnf.fr
Venture de paradis ou un pionnier des études berbères
Alger au XVIIIe siècle, de Jean-Michel Venture de Paradis
Un savant à l’épreuve du terrain politique
French orientalists
1739 births
Writers from Marseille
1799 deaths
Infectious disease deaths in Israel
|
```javascript
/* eslint-disable import/no-extraneous-dependencies */
function eslintParser() {
return require('@babel/eslint-parser')
}
function pluginProposalClassProperties() {
return require('@babel/plugin-proposal-class-properties')
}
function pluginProposalExportNamespaceFrom() {
return require('@babel/plugin-proposal-export-namespace-from')
}
function pluginProposalNumericSeparator() {
return require('@babel/plugin-proposal-numeric-separator')
}
function pluginProposalObjectRestSpread() {
return require('@babel/plugin-proposal-object-rest-spread')
}
function pluginSyntaxBigint() {
return require('@babel/plugin-syntax-bigint')
}
function pluginSyntaxDynamicImport() {
return require('@babel/plugin-syntax-dynamic-import')
}
function pluginSyntaxImportAttributes() {
return require('@babel/plugin-syntax-import-attributes')
}
function pluginSyntaxJsx() {
return require('@babel/plugin-syntax-jsx')
}
function pluginTransformDefine() {
return require('babel-plugin-transform-define')
}
function pluginTransformModulesCommonjs() {
return require('@babel/plugin-transform-modules-commonjs')
}
function pluginTransformReactRemovePropTypes() {
return require('babel-plugin-transform-react-remove-prop-types')
}
function pluginTransformRuntime() {
return require('@babel/plugin-transform-runtime')
}
function presetEnv() {
return require('@babel/preset-env')
}
function presetReact() {
return require('@babel/preset-react')
}
function presetTypescript() {
return require('@babel/preset-typescript')
}
module.exports = {
eslintParser,
pluginProposalClassProperties,
pluginProposalExportNamespaceFrom,
pluginProposalNumericSeparator,
pluginProposalObjectRestSpread,
pluginSyntaxBigint,
pluginSyntaxDynamicImport,
pluginSyntaxImportAttributes,
pluginSyntaxJsx,
pluginTransformDefine,
pluginTransformModulesCommonjs,
pluginTransformReactRemovePropTypes,
pluginTransformRuntime,
presetEnv,
presetReact,
presetTypescript,
}
```
|
```c++
/*=============================================================================
file LICENSE_1_0.txt or copy at path_to_url
==============================================================================*/
#if !defined(FUSION_INCLUDE_MPL)
#define FUSION_INCLUDE_MPL
#include <boost/fusion/support/config.hpp>
#include <boost/fusion/adapted/mpl.hpp>
#include <boost/fusion/mpl.hpp>
#endif
```
|
```csharp
using System.Collections.Generic;
using System.Text.Json.Serialization;
namespace Audit.Core
{
public class AuditEventEnvironment : IAuditOutput
{
/// <summary>
/// Gets or sets the name of the user responsible for the change.
/// </summary>
public string UserName { get; set; }
/// <summary>
/// Gets or sets the name of the machine.
/// </summary>
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
public string MachineName { get; set; }
/// <summary>
/// Gets or sets the name of the domain.
/// </summary>
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
public string DomainName { get; set; }
/// <summary>
/// The name of the method that has the audited code
/// </summary>
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
public string CallingMethodName { get; set; }
/// <summary>
/// The full stack trace
/// </summary>
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
public string StackTrace { get; set; }
/// <summary>
/// The name of the assembly from where the audit scope was invoked
/// </summary>
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
public string AssemblyName { get; set; }
/// <summary>
/// The exception information (if any)
/// </summary>
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
public string Exception { get; set; }
/// <summary>
/// The locale name
/// </summary>
[JsonIgnore(Condition = JsonIgnoreCondition.WhenWritingNull)]
public string Culture { get; set; }
[JsonExtensionData]
public Dictionary<string, object> CustomFields { get; set; } = new Dictionary<string, object>();
/// <summary>
/// Serializes this Environment entity as a JSON string
/// </summary>
public string ToJson()
{
return Configuration.JsonAdapter.Serialize(this);
}
/// <summary>
/// Parses an Environment entity from its JSON string representation.
/// </summary>
/// <param name="json">JSON string with the Environment entity representation.</param>
public static AuditEventEnvironment FromJson(string json)
{
return Configuration.JsonAdapter.Deserialize<AuditEventEnvironment>(json);
}
}
}
```
|
Claire Scorpo is an Australian architect, founder and director of Agius Scorpo Architects. In 2014 she founded Claire Scorpo Architects. In 2018 Scorpo partnered with Nic Agius reorganizing the studio as Agius Scorpo Architects. As well as co-directing the practice, Scorpo has been leading design studios at RMIT School of Design since 2011, and is an active member of the Australian Institute of Architects and her work has been exhibited and awarded nationally.
Scorpo's work centres around offering generous high quality living spaces on a modest budget. Scorpo achieves this through thoughtful detailing, sensitive materiality selection, and the dialogue between the landscape and architecture.
Early life and family
The Scorpo family carries a long heritage of winemaking. Stretching across generations, the family has grown vineyards in Sicily and Sardinia and is currently located in Merricks North, Australia. Scorpo's parents, Paul and Caroline Scorpo, founded Scorpo Wines after purchasing an apple orchard in 1997. The property became the site of the Scorpos' family home, with their business opening in 2002. Paul worked as a landscape architect and ran his office from home, so Claire was exposed to the inner workings of a landscape architecture office at a young age.
Career and education
Initially rejecting the influence of her father's landscape architecture practice, Scorpo began her education pursuing a degree in the sciences. After enrolling in two architecture courses, she made the switch, retrospectively realizing how her father's practice influenced the way she approaches her current projects.
Scorpo studied at RMIT University in Melbourne, Australia, graduating with a Bachelor of Architectural Design and a Master of Architecture with Distinction in 2013. During her time at RMIT, Scorpo also did an exchange in Berlin, which shaped her interest in small-footprint housing that relies on public space and public amenities. In her early years at school, she took a well-known intensive studio led by Peter Corrigan. Through this connection, Scorpo was hired at Edmond and Corrigan, where she worked (2006–2008) while continuing her studies part-time. While at Edmond and Corrigan, Scorpo was mentored by Maggie Edmond, who was described by Neil Clerehan as "probably the nation's foremost female architect".
After finishing her thesis Scorpo worked as a Graduate Architect at NMBW Studio (2011–2013) while continuing to finish professional courses at RMIT. At this time Scorpo began teaching design studios in residential volume building at RMIT. Her studios explore real-world social topics faced in regional towns across Victoria such as the implementation of affordable housing, mental health considerations, engaging with indigenous culture and declining populations.
One of her first projects, started while she was still a student, was a major renovation of her parents' home. Soon after graduation, in 2014, Scorpo founded her own multidisciplinary practice, Claire Scorpo Architects. Focusing on residential renovations and additions, Claire Scorpo Architects was recognized within its first year of operation, winning the 2015 Architeam Award in the Residential Alterations and Additions category.
Research & Gender Equity
Since starting her practice in 2014, Scorpo has been dedicated to supporting women in the field of architecture. Scorpo benefited from the AIA's constructive mentoring program, having gained confidence as a woman in a male-dominated field through the mentorship of Edmond, a successful woman architect herself. Scorpo credits this experience as being "enormously beneficial to me in the first year of starting practice."
Scorpo uses her practice as a research centre for developing a repeatable communal housing model compatible with affordable housing. She uses each project her firm takes on to build progressively towards a pilot case which, once repeatable at low cost, she believes could provide stable housing for Australia.
In 2016, Scorpo was commissioned by the Australian Design Centre to create an installation for the Women in Design Symposium. In collaboration with Design Tasmania and Elliat Rich, they designed circular untreated brass components that were woven together by different groups of women at the symposium to create a suspended metallic weave. The fingerprints of each woman were left stained in the raw brass components. The project examined women's work and its worth through the importance of community and working collectively.
In 2023, Scorpo partnered with sheBuilt Developers to deliver an intricate seven-storey addition at the rear of an 1880 gothic revival brick bank in South Melbourne. This project became a milestone in the Australian construction industry having been entirely facilitated by an all-female team intending to promote, empower and encourage female leadership within the building and design industry.
Awards
2023 Emerging Architect Prize
2022 Architeam Award Winner - Residential Alts and Adds up to $500,000 (Hawthorn I)
2017 Dulux Study Tour Prize Winner
2016 Houses Awards Shortlist: Emerging Architecture Practice
2016 Architeam Award Commendation - Residential – Alterations (Shoreham III)
2015 Architeam Award Winner - Residential Award – Alterations and Additions (Thornbury)
2015 Architeam Award Commendation - Residential Award – Alterations and Additions (Fitzroy I)
2015 Architeam Award Commendation - Unbuilt Award (Fitzroy II)
References
External links
Living people
Year of birth missing (living people)
Place of birth missing (living people)
Australian women architects
RMIT University alumni
Australian people of Italian descent
|
```jsx
import {
useCart,
useCartLine,
CartLineQuantityAdjustButton,
CartLinePrice,
CartLineQuantity,
Image,
Link,
} from '@shopify/hydrogen';
import {Heading, IconRemove, Text} from '~/components';
export function CartLineItem() {
const {linesRemove} = useCart();
const {id: lineId, quantity, merchandise} = useCartLine();
return (
<li key={lineId} className="flex gap-4">
<div className="flex-shrink">
<Image
width={112}
height={112}
widths={[112]}
data={merchandise.image}
loaderOptions={{
scale: 2,
crop: 'center',
}}
className="object-cover object-center w-24 h-24 border rounded md:w-28 md:h-28"
/>
</div>
<div className="flex justify-between flex-grow">
<div className="grid gap-2">
<Heading as="h3" size="copy">
<Link to={`/products/${merchandise.product.handle}`}>
{merchandise.product.title}
</Link>
</Heading>
<div className="grid pb-2">
{(merchandise?.selectedOptions || []).map((option) => (
<Text color="subtle" key={option.name}>
{option.name}: {option.value}
</Text>
))}
</div>
<div className="flex items-center gap-2">
<div className="flex justify-start text-copy">
<CartLineQuantityAdjust lineId={lineId} quantity={quantity} />
</div>
<button
type="button"
onClick={() => linesRemove([lineId])}
className="flex items-center justify-center w-10 h-10 border rounded"
>
<span className="sr-only">Remove</span>
<IconRemove aria-hidden="true" />
</button>
</div>
</div>
<Text>
<CartLinePrice as="span" />
</Text>
</div>
</li>
);
}
function CartLineQuantityAdjust({lineId, quantity}) {
return (
<>
<label htmlFor={`quantity-${lineId}`} className="sr-only">
Quantity, {quantity}
</label>
<div className="flex items-center border rounded">
<CartLineQuantityAdjustButton
adjust="decrease"
aria-label="Decrease quantity"
className="w-10 h-10 transition text-primary/50 hover:text-primary disabled:cursor-wait"
>
−
</CartLineQuantityAdjustButton>
<CartLineQuantity as="div" className="px-2 text-center" />
<CartLineQuantityAdjustButton
adjust="increase"
aria-label="Increase quantity"
className="w-10 h-10 transition text-primary/50 hover:text-primary disabled:cursor-wait"
>
+
</CartLineQuantityAdjustButton>
</div>
</>
);
}
```
|
The brown madtom (Noturus phaeus) is a species of madtom catfish native to the southern United States.
Distribution
The distribution of the brown madtom includes disjunct tributaries of the Mississippi River from the Obion River in Tennessee and Kentucky south to southwestern Mississippi and central and northern Louisiana extending to extreme southern Arkansas. It occupies creeks and small rivers with sand-gravel riffles and runs with debris, rocks, and undercut banks.
References
Noturus
Madtom
Endemic fish of the United States
Freshwater fish of the Southeastern United States
Fish described in 1969
|
```go
// Use of this source code is governed by a BSD-style
// license that can be found in the LICENSE file.
package json
import (
"bytes"
"fmt"
"strconv"
)
// Kind represents a token kind expressible in the JSON format.
type Kind uint16
const (
Invalid Kind = (1 << iota) / 2
EOF
Null
Bool
Number
String
Name
ObjectOpen
ObjectClose
ArrayOpen
ArrayClose
// comma is only for parsing in between tokens and
// does not need to be exported.
comma
)
func (k Kind) String() string {
switch k {
case EOF:
return "eof"
case Null:
return "null"
case Bool:
return "bool"
case Number:
return "number"
case String:
return "string"
case ObjectOpen:
return "{"
case ObjectClose:
return "}"
case Name:
return "name"
case ArrayOpen:
return "["
case ArrayClose:
return "]"
case comma:
return ","
}
return "<invalid>"
}
// Token provides a parsed token kind and value.
//
// Values are provided by the different accessor methods. The accessor methods
// Name, Bool, and ParsedString will panic if called on the wrong kind. There
// are different accessor methods for the Number kind for converting to the
// appropriate Go numeric type and those methods have the ok return value.
type Token struct {
// Token kind.
kind Kind
// pos provides the position of the token in the original input.
pos int
// raw bytes of the serialized token.
// This is a subslice into the original input.
raw []byte
// boo is the parsed boolean value.
boo bool
// str is the parsed string value.
str string
}
// Kind returns the token kind.
func (t Token) Kind() Kind {
return t.kind
}
// RawString returns the raw bytes of the token as a string.
func (t Token) RawString() string {
return string(t.raw)
}
// Pos returns the token position from the input.
func (t Token) Pos() int {
return t.pos
}
// Name returns the object name if token is Name, else it panics.
func (t Token) Name() string {
if t.kind == Name {
return t.str
}
panic(fmt.Sprintf("Token is not a Name: %v", t.RawString()))
}
// Bool returns the bool value if token kind is Bool, else it panics.
func (t Token) Bool() bool {
if t.kind == Bool {
return t.boo
}
panic(fmt.Sprintf("Token is not a Bool: %v", t.RawString()))
}
// ParsedString returns the string value if the token kind is String, else it
// panics.
func (t Token) ParsedString() string {
if t.kind == String {
return t.str
}
panic(fmt.Sprintf("Token is not a String: %v", t.RawString()))
}
// Float returns the floating-point number if token kind is Number.
//
// The floating-point precision is specified by the bitSize parameter: 32 for
// float32 or 64 for float64. If bitSize=32, the result still has type float64,
// but it will be convertible to float32 without changing its value. It will
// return false if the number exceeds the floating point limits for given
// bitSize.
func (t Token) Float(bitSize int) (float64, bool) {
if t.kind != Number {
return 0, false
}
f, err := strconv.ParseFloat(t.RawString(), bitSize)
if err != nil {
return 0, false
}
return f, true
}
// Int returns the signed integer number if token is Number.
//
// The given bitSize specifies the integer type that the result must fit into.
// It returns false if the number is not an integer value or if the result
// exceeds the limits for given bitSize.
func (t Token) Int(bitSize int) (int64, bool) {
s, ok := t.getIntStr()
if !ok {
return 0, false
}
n, err := strconv.ParseInt(s, 10, bitSize)
if err != nil {
return 0, false
}
return n, true
}
// Uint returns the unsigned integer number if token is Number.
//
// The given bitSize specifies the unsigned integer type that the result must
// fit into. It returns false if the number is not an unsigned integer value
// or if the result exceeds the limits for given bitSize.
func (t Token) Uint(bitSize int) (uint64, bool) {
s, ok := t.getIntStr()
if !ok {
return 0, false
}
n, err := strconv.ParseUint(s, 10, bitSize)
if err != nil {
return 0, false
}
return n, true
}
func (t Token) getIntStr() (string, bool) {
if t.kind != Number {
return "", false
}
parts, ok := parseNumberParts(t.raw)
if !ok {
return "", false
}
return normalizeToIntString(parts)
}
// TokenEquals returns true if given Tokens are equal, else false.
func TokenEquals(x, y Token) bool {
return x.kind == y.kind &&
x.pos == y.pos &&
bytes.Equal(x.raw, y.raw) &&
x.boo == y.boo &&
x.str == y.str
}
```
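The bitSize contract on `Float` above (a 32-bit request still yields a `float64`, but one that must fit within float32 range) can be sketched outside Go. The helper below is illustrative only; `parse_float` and `_FLT_MAX` are names chosen here, not part of the package, and boundary rounding cases near the float32 maximum are deliberately ignored:

```python
import math

_FLT_MAX = 3.4028234663852886e+38  # largest finite IEEE 754 binary32 value

def parse_float(raw, bit_size):
    """Parse a JSON number string; reject values whose magnitude cannot be
    represented as a finite float of the requested width.
    Returns (value, ok), loosely mirroring the Token.Float contract above."""
    try:
        f = float(raw)
    except ValueError:
        return 0.0, False
    # A 32-bit request still returns a 64-bit float, but overflowing the
    # float32 range is reported as failure, like strconv's range error.
    if bit_size == 32 and math.isfinite(f) and abs(f) > _FLT_MAX:
        return 0.0, False
    return f, True
```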
|
Vishnudham Mandir is a temple on the border of the villages Bherwania (also spelled Bhervania) and Sadiha in Siwan district, Bihar, India, where a statue of Lord Vishnu was found by a carpenter underneath a tree in 1998. The temple was constructed in the South Indian architectural style by local villagers and authorities at the site of the statue.
It is the largest idol of Lord Vishnu in north India made of black granite, measuring 7.5 feet in length and 3.5 feet in width. The idol belongs to the Gupta period. It has four arms carrying a "sankh" (conch), a "chakra" (discus), a "gada" (club), and a "padma" (lotus). There are two figures, one masculine and one feminine, below the left and right arms of the idol. An idol of the goddess Laxmi was also recovered from the spot, but it was later stolen.
References
Hindu temples in Bihar
|
```swift
//
// AppDelegate.swift
// StellarDemo
//
// Created by AugustRush on 5/7/16.
//
import UIKit
@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {
var window: UIWindow?
func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
// Override point for customization after application launch.
return true
}
func applicationWillResignActive(_ application: UIApplication) {
// Sent when the application is about to move from active to inactive state. This can occur for certain types of temporary interruptions (such as an incoming phone call or SMS message) or when the user quits the application and it begins the transition to the background state.
// Use this method to pause ongoing tasks, disable timers, and throttle down OpenGL ES frame rates. Games should use this method to pause the game.
}
func applicationDidEnterBackground(_ application: UIApplication) {
// Use this method to release shared resources, save user data, invalidate timers, and store enough application state information to restore your application to its current state in case it is terminated later.
// If your application supports background execution, this method is called instead of applicationWillTerminate: when the user quits.
}
func applicationWillEnterForeground(_ application: UIApplication) {
// Called as part of the transition from the background to the inactive state; here you can undo many of the changes made on entering the background.
}
func applicationDidBecomeActive(_ application: UIApplication) {
// Restart any tasks that were paused (or not yet started) while the application was inactive. If the application was previously in the background, optionally refresh the user interface.
}
func applicationWillTerminate(_ application: UIApplication) {
// Called when the application is about to terminate. Save data if appropriate. See also applicationDidEnterBackground:.
}
}
```
|
Claudia Acerenza Maríez (born 15 January 1966) is a retired Uruguayan sprinter who competed in the 200 and 400 metres. She represented her country at the 1988 Summer Olympics as well as at two outdoor and two indoor World Championships. Her twin sister, Soledad Acerenza, was also a sprinter.
She still holds national records in several sprint distances.
International competitions
Personal bests
Outdoor
100 metres – 11.54 (Mexico City 1988) NR
200 metres – 23.78 (Mexico City 1988) NR
400 metres – 55.82 (Tokyo 1991)
Indoor
200 metres – 25.69 (Seville 1991)
400 metres – 56.57 (Seville 1991) NR
References
1966 births
Living people
Uruguayan female sprinters
Athletes (track and field) at the 1988 Summer Olympics
Athletes (track and field) at the 1987 Pan American Games
Athletes (track and field) at the 1991 Pan American Games
Olympic athletes for Uruguay
Pan American Games competitors for Uruguay
World Athletics Championships athletes for Uruguay
Uruguayan twins
South American Games gold medalists for Uruguay
South American Games silver medalists for Uruguay
South American Games medalists in athletics
Competitors at the 1986 South American Games
Competitors at the 1990 South American Games
Olympic female sprinters
|
```sqlpl
SET statement_timeout = 0;
SET lock_timeout = 0;
SET client_encoding = 'UTF8';
SET standard_conforming_strings = on;
SET check_function_bodies = false;
SET client_min_messages = warning;
SET row_security = off;
CREATE EXTENSION IF NOT EXISTS plpgsql WITH SCHEMA pg_catalog;
SET search_path = public, pg_catalog;
CREATE TYPE access_token_type AS ENUM (
'client',
'network'
);
CREATE FUNCTION b32enc_crockford(src bytea) RETURNS text
LANGUAGE plpgsql IMMUTABLE
AS $$
-- Adapted from the Go package encoding/base32.
-- See path_to_url
-- NOTE(kr): this function does not pad its output
DECLARE
-- alphabet is the base32 alphabet defined
-- by Douglas Crockford. It preserves lexical
-- order and avoids visually-similar symbols.
-- See path_to_url
alphabet text := '0123456789ABCDEFGHJKMNPQRSTVWXYZ';
dst text := '';
n integer;
b0 integer;
b1 integer;
b2 integer;
b3 integer;
b4 integer;
b5 integer;
b6 integer;
b7 integer;
BEGIN
FOR r IN 0..(length(src)-1) BY 5
LOOP
b0:=0; b1:=0; b2:=0; b3:=0; b4:=0; b5:=0; b6:=0; b7:=0;
-- Unpack 8x 5-bit source blocks into an 8 byte
-- destination quantum
n := length(src) - r;
IF n >= 5 THEN
b7 := get_byte(src, r+4) & 31;
b6 := get_byte(src, r+4) >> 5;
END IF;
IF n >= 4 THEN
b6 := b6 | (get_byte(src, r+3) << 3) & 31;
b5 := (get_byte(src, r+3) >> 2) & 31;
b4 := get_byte(src, r+3) >> 7;
END IF;
IF n >= 3 THEN
b4 := b4 | (get_byte(src, r+2) << 1) & 31;
b3 := (get_byte(src, r+2) >> 4) & 31;
END IF;
IF n >= 2 THEN
b3 := b3 | (get_byte(src, r+1) << 4) & 31;
b2 := (get_byte(src, r+1) >> 1) & 31;
b1 := (get_byte(src, r+1) >> 6) & 31;
END IF;
b1 := b1 | (get_byte(src, r) << 2) & 31;
b0 := get_byte(src, r) >> 3;
-- Encode 5-bit blocks using the base32 alphabet
dst := dst || substr(alphabet, b0+1, 1);
dst := dst || substr(alphabet, b1+1, 1);
IF n >= 2 THEN
dst := dst || substr(alphabet, b2+1, 1);
dst := dst || substr(alphabet, b3+1, 1);
END IF;
IF n >= 3 THEN
dst := dst || substr(alphabet, b4+1, 1);
END IF;
IF n >= 4 THEN
dst := dst || substr(alphabet, b5+1, 1);
dst := dst || substr(alphabet, b6+1, 1);
END IF;
IF n >= 5 THEN
dst := dst || substr(alphabet, b7+1, 1);
END IF;
END LOOP;
RETURN dst;
END;
$$;
CREATE FUNCTION next_chain_id(prefix text) RETURNS text
LANGUAGE plpgsql
AS $$
-- Adapted from the technique published by Instagram.
-- See path_to_url
DECLARE
our_epoch_ms bigint := 1433333333333; -- do not change
seq_id bigint;
now_ms bigint; -- from unix epoch, not ours
shard_id int := 4; -- must be different on each shard
n bigint;
BEGIN
SELECT nextval('chain_id_seq') % 1024 INTO seq_id;
SELECT FLOOR(EXTRACT(EPOCH FROM clock_timestamp()) * 1000) INTO now_ms;
n := (now_ms - our_epoch_ms) << 23;
n := n | (shard_id << 10);
n := n | (seq_id);
RETURN prefix || b32enc_crockford(int8send(n));
END;
$$;
SET default_tablespace = '';
SET default_with_oids = false;
CREATE TABLE access_tokens (
id text NOT NULL,
sort_id text DEFAULT next_chain_id('at'::text),
type access_token_type,
hashed_secret bytea NOT NULL,
created timestamp with time zone DEFAULT now() NOT NULL
);
CREATE SEQUENCE account_control_program_seq
START WITH 10001
INCREMENT BY 10000
NO MINVALUE
NO MAXVALUE
CACHE 1;
CREATE TABLE account_control_programs (
signer_id text NOT NULL,
key_index bigint NOT NULL,
control_program bytea NOT NULL,
change boolean NOT NULL,
expires_at timestamp with time zone
);
CREATE TABLE account_utxos (
asset_id bytea NOT NULL,
amount bigint NOT NULL,
account_id text NOT NULL,
control_program_index bigint NOT NULL,
control_program bytea NOT NULL,
confirmed_in bigint NOT NULL,
output_id bytea NOT NULL,
source_id bytea NOT NULL,
source_pos bigint NOT NULL,
ref_data_hash bytea NOT NULL,
change boolean NOT NULL
);
CREATE TABLE accounts (
account_id text NOT NULL,
tags jsonb,
alias text
);
CREATE TABLE annotated_accounts (
id text NOT NULL,
alias text NOT NULL,
keys jsonb NOT NULL,
quorum integer NOT NULL,
tags jsonb NOT NULL
);
CREATE TABLE annotated_assets (
id bytea NOT NULL,
sort_id text NOT NULL,
alias text NOT NULL,
issuance_program bytea NOT NULL,
keys jsonb NOT NULL,
quorum integer NOT NULL,
definition jsonb NOT NULL,
tags jsonb NOT NULL,
local boolean NOT NULL
);
CREATE TABLE annotated_inputs (
tx_hash bytea NOT NULL,
index integer NOT NULL,
type text NOT NULL,
asset_id bytea NOT NULL,
asset_alias text NOT NULL,
asset_definition jsonb NOT NULL,
asset_tags jsonb NOT NULL,
asset_local boolean NOT NULL,
amount bigint NOT NULL,
account_id text,
account_alias text,
account_tags jsonb,
issuance_program bytea NOT NULL,
reference_data jsonb NOT NULL,
local boolean NOT NULL,
spent_output_id bytea NOT NULL
);
CREATE TABLE annotated_outputs (
block_height bigint NOT NULL,
tx_pos integer NOT NULL,
output_index integer NOT NULL,
tx_hash bytea NOT NULL,
timespan int8range NOT NULL,
output_id bytea NOT NULL,
type text NOT NULL,
purpose text NOT NULL,
asset_id bytea NOT NULL,
asset_alias text NOT NULL,
asset_definition jsonb NOT NULL,
asset_tags jsonb NOT NULL,
asset_local boolean NOT NULL,
amount bigint NOT NULL,
account_id text,
account_alias text,
account_tags jsonb,
control_program bytea NOT NULL,
reference_data jsonb NOT NULL,
local boolean NOT NULL
);
CREATE TABLE annotated_txs (
block_height bigint NOT NULL,
tx_pos integer NOT NULL,
tx_hash bytea NOT NULL,
data jsonb NOT NULL,
"timestamp" timestamp with time zone NOT NULL,
block_id bytea NOT NULL,
local boolean NOT NULL,
reference_data jsonb NOT NULL,
block_tx_count integer
);
CREATE TABLE asset_tags (
asset_id bytea NOT NULL,
tags jsonb
);
CREATE TABLE assets (
id bytea NOT NULL,
created_at timestamp with time zone DEFAULT now() NOT NULL,
sort_id text DEFAULT next_chain_id('asset'::text) NOT NULL,
issuance_program bytea NOT NULL,
client_token text,
initial_block_hash bytea NOT NULL,
signer_id text,
definition bytea NOT NULL,
alias text,
first_block_height bigint,
vm_version bigint NOT NULL
);
CREATE SEQUENCE assets_key_index_seq
START WITH 1
INCREMENT BY 1
NO MINVALUE
NO MAXVALUE
CACHE 1;
CREATE TABLE block_processors (
name text NOT NULL,
height bigint DEFAULT 0 NOT NULL
);
CREATE TABLE blocks (
block_hash bytea NOT NULL,
height bigint NOT NULL,
data bytea NOT NULL,
header bytea NOT NULL
);
CREATE SEQUENCE chain_id_seq
START WITH 1
INCREMENT BY 1
NO MINVALUE
NO MAXVALUE
CACHE 1;
CREATE TABLE config (
singleton boolean DEFAULT true NOT NULL,
is_signer boolean,
is_generator boolean,
blockchain_id bytea NOT NULL,
configured_at timestamp with time zone NOT NULL,
generator_url text DEFAULT ''::text NOT NULL,
block_pub text DEFAULT ''::text NOT NULL,
remote_block_signers bytea DEFAULT '\x'::bytea NOT NULL,
generator_access_token text DEFAULT ''::text NOT NULL,
max_issuance_window_ms bigint,
id text NOT NULL,
block_hsm_url text DEFAULT ''::text,
block_hsm_access_token text DEFAULT ''::text,
CONSTRAINT config_singleton CHECK (singleton)
);
CREATE TABLE core_id (
singleton boolean DEFAULT true NOT NULL,
id text,
CONSTRAINT core_id_singleton CHECK (singleton)
);
CREATE TABLE generator_pending_block (
singleton boolean DEFAULT true NOT NULL,
data bytea NOT NULL,
height bigint,
CONSTRAINT generator_pending_block_singleton CHECK (singleton)
);
CREATE TABLE leader (
singleton boolean DEFAULT true NOT NULL,
leader_key text NOT NULL,
expiry timestamp with time zone DEFAULT '1970-01-01 00:00:00-08'::timestamp with time zone NOT NULL,
address text NOT NULL,
CONSTRAINT leader_singleton CHECK (singleton)
);
CREATE TABLE migrations (
filename text NOT NULL,
hash text NOT NULL,
applied_at timestamp with time zone DEFAULT now() NOT NULL
);
CREATE SEQUENCE mockhsm_sort_id_seq
START WITH 1
INCREMENT BY 1
NO MINVALUE
NO MAXVALUE
CACHE 1;
CREATE TABLE mockhsm (
pub bytea NOT NULL,
prv bytea NOT NULL,
alias text,
sort_id bigint DEFAULT nextval('mockhsm_sort_id_seq'::regclass) NOT NULL,
key_type text DEFAULT 'chain_kd'::text NOT NULL
);
CREATE TABLE query_blocks (
height bigint NOT NULL,
"timestamp" bigint NOT NULL
);
CREATE TABLE signed_blocks (
block_height bigint NOT NULL,
block_hash bytea NOT NULL
);
CREATE TABLE signers (
id text NOT NULL,
type text NOT NULL,
key_index bigint NOT NULL,
quorum integer NOT NULL,
client_token text,
xpubs bytea[] NOT NULL
);
CREATE SEQUENCE signers_key_index_seq
START WITH 1
INCREMENT BY 1
NO MINVALUE
NO MAXVALUE
CACHE 1;
ALTER SEQUENCE signers_key_index_seq OWNED BY signers.key_index;
CREATE TABLE snapshots (
height bigint NOT NULL,
data bytea NOT NULL,
created_at timestamp without time zone DEFAULT now()
);
CREATE TABLE submitted_txs (
tx_hash bytea NOT NULL,
height bigint NOT NULL,
submitted_at timestamp without time zone DEFAULT now() NOT NULL
);
CREATE TABLE txfeeds (
id text DEFAULT next_chain_id('cur'::text) NOT NULL,
alias text,
filter text,
after text,
client_token text
);
ALTER TABLE ONLY signers ALTER COLUMN key_index SET DEFAULT nextval('signers_key_index_seq'::regclass);
ALTER TABLE ONLY access_tokens
ADD CONSTRAINT access_tokens_pkey PRIMARY KEY (id);
ALTER TABLE ONLY account_control_programs
ADD CONSTRAINT account_control_programs_pkey PRIMARY KEY (control_program);
ALTER TABLE ONLY accounts
ADD CONSTRAINT account_tags_pkey PRIMARY KEY (account_id);
ALTER TABLE ONLY account_utxos
ADD CONSTRAINT account_utxos_pkey PRIMARY KEY (output_id);
ALTER TABLE ONLY accounts
ADD CONSTRAINT accounts_alias_key UNIQUE (alias);
ALTER TABLE ONLY annotated_accounts
ADD CONSTRAINT annotated_accounts_pkey PRIMARY KEY (id);
ALTER TABLE ONLY annotated_assets
ADD CONSTRAINT annotated_assets_pkey PRIMARY KEY (id);
ALTER TABLE ONLY annotated_inputs
ADD CONSTRAINT annotated_inputs_pkey PRIMARY KEY (tx_hash, index);
ALTER TABLE ONLY annotated_outputs
ADD CONSTRAINT annotated_outputs_output_id_key UNIQUE (output_id);
ALTER TABLE ONLY annotated_outputs
ADD CONSTRAINT annotated_outputs_pkey PRIMARY KEY (block_height, tx_pos, output_index);
ALTER TABLE ONLY annotated_txs
ADD CONSTRAINT annotated_txs_pkey PRIMARY KEY (block_height, tx_pos);
ALTER TABLE ONLY asset_tags
ADD CONSTRAINT asset_tags_asset_id_key UNIQUE (asset_id);
ALTER TABLE ONLY assets
ADD CONSTRAINT assets_alias_key UNIQUE (alias);
ALTER TABLE ONLY assets
ADD CONSTRAINT assets_client_token_key UNIQUE (client_token);
ALTER TABLE ONLY assets
ADD CONSTRAINT assets_pkey PRIMARY KEY (id);
ALTER TABLE ONLY block_processors
ADD CONSTRAINT block_processors_name_key UNIQUE (name);
ALTER TABLE ONLY blocks
ADD CONSTRAINT blocks_height_key UNIQUE (height);
ALTER TABLE ONLY blocks
ADD CONSTRAINT blocks_pkey PRIMARY KEY (block_hash);
ALTER TABLE ONLY config
ADD CONSTRAINT config_pkey PRIMARY KEY (singleton);
ALTER TABLE ONLY core_id
ADD CONSTRAINT core_id_pkey PRIMARY KEY (singleton);
ALTER TABLE ONLY generator_pending_block
ADD CONSTRAINT generator_pending_block_pkey PRIMARY KEY (singleton);
ALTER TABLE ONLY leader
ADD CONSTRAINT leader_singleton_key UNIQUE (singleton);
ALTER TABLE ONLY migrations
ADD CONSTRAINT migrations_pkey PRIMARY KEY (filename);
ALTER TABLE ONLY mockhsm
ADD CONSTRAINT mockhsm_alias_key UNIQUE (alias);
ALTER TABLE ONLY mockhsm
ADD CONSTRAINT mockhsm_pkey PRIMARY KEY (pub);
ALTER TABLE ONLY query_blocks
ADD CONSTRAINT query_blocks_pkey PRIMARY KEY (height);
ALTER TABLE ONLY signers
ADD CONSTRAINT signers_client_token_key UNIQUE (client_token);
ALTER TABLE ONLY signers
ADD CONSTRAINT signers_pkey PRIMARY KEY (id);
ALTER TABLE ONLY mockhsm
ADD CONSTRAINT sort_id_index UNIQUE (sort_id);
ALTER TABLE ONLY snapshots
ADD CONSTRAINT state_trees_pkey PRIMARY KEY (height);
ALTER TABLE ONLY submitted_txs
ADD CONSTRAINT submitted_txs_pkey PRIMARY KEY (tx_hash);
ALTER TABLE ONLY txfeeds
ADD CONSTRAINT txfeeds_alias_key UNIQUE (alias);
ALTER TABLE ONLY txfeeds
ADD CONSTRAINT txfeeds_client_token_key UNIQUE (client_token);
ALTER TABLE ONLY txfeeds
ADD CONSTRAINT txfeeds_pkey PRIMARY KEY (id);
CREATE INDEX account_utxos_asset_id_account_id_confirmed_in_idx ON account_utxos USING btree (asset_id, account_id, confirmed_in);
CREATE INDEX annotated_assets_sort_id ON annotated_assets USING btree (sort_id);
CREATE INDEX annotated_outputs_timespan_idx ON annotated_outputs USING gist (timespan);
CREATE INDEX annotated_txs_data_idx ON annotated_txs USING gin (data jsonb_path_ops);
CREATE INDEX query_blocks_timestamp_idx ON query_blocks USING btree ("timestamp");
CREATE UNIQUE INDEX signed_blocks_block_height_idx ON signed_blocks USING btree (block_height);
insert into migrations (filename, hash) values ('2017-02-03.0.core.schema-snapshot.sql', your_sha256_hash);
insert into migrations (filename, hash) values ('2017-02-07.0.query.non-null-alias.sql', your_sha256_hash);
insert into migrations (filename, hash) values ('2017-02-16.0.query.spent-output.sql', your_sha256_hash);
insert into migrations (filename, hash) values ('2017-02-28.0.core.remove-outpoints.sql', your_sha256_hash);
insert into migrations (filename, hash) values ('2017-03-02.0.core.add-output-source-info.sql', your_sha256_hash);
insert into migrations (filename, hash) values ('2017-03-09.0.core.account-utxos-change.sql', your_sha256_hash);
insert into migrations (filename, hash) values ('2017-04-13.0.query.block-transactions-count.sql', your_sha256_hash);
insert into migrations (filename, hash) values ('2017-04-17.0.core.null-token-type.sql', your_sha256_hash);
insert into migrations (filename, hash) values ('2017-04-27.0.generator.pending-block-height.sql', your_sha256_hash);
insert into migrations (filename, hash) values ('2017-05-08.0.core.drop-redundant-indexes.sql', your_sha256_hash);
insert into migrations (filename, hash) values ('2017-06-28.0.core.coreid.sql', your_sha256_hash);
```
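The bit layout built by `next_chain_id` above can be illustrated outside SQL. This Python sketch packs and unpacks the same 64-bit quantum (the helper names are chosen here for illustration and are not part of the schema): the millisecond offset from the custom epoch occupies the high bits from bit 23 up, the shard id sits in bits 10-22, and the sequence number (`nextval('chain_id_seq') % 1024`) fills bits 0-9:

```python
EPOCH_MS = 1433333333333  # must match our_epoch_ms in next_chain_id

def pack_chain_id(now_ms, shard_id, seq):
    """Pack (timestamp, shard, sequence) into one 64-bit integer:
    bits 23+ hold ms since the custom epoch, bits 10-22 the shard id,
    bits 0-9 the per-shard sequence number, taken mod 1024."""
    return ((now_ms - EPOCH_MS) << 23) | (shard_id << 10) | (seq % 1024)

def unpack_chain_id(n):
    """Recover (now_ms, shard_id, seq) from a packed id."""
    return (n >> 23) + EPOCH_MS, (n >> 10) & 0x1FFF, n & 0x3FF
```

Because the timestamp occupies the most significant bits, ids generated later sort after earlier ones, which the lexical-order-preserving Crockford base32 encoding then carries through to the text form.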
|
Judar (, also Romanized as Jūdar and Jowdar; also known as Jow Darreh and Chūdar) is a village in Gifan Rural District, Garmkhan District, Bojnord County, North Khorasan Province, Iran. At the 2006 census, its population was 236, in 62 families.
References
Populated places in Bojnord County
|
```python
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from __future__ import absolute_import
from __future__ import print_function
import os
import sys
import subprocess
import random
import string
import time
import logging
import threading
from atx import strutils
random.seed(time.time())
def id_generator(n=5):
return ''.join(random.choice(string.ascii_uppercase + string.digits) for _ in range(n))
def dirname(name):
if os.path.isabs(name):
return os.path.dirname(name)
return os.path.dirname(os.path.abspath(name))
def exec_cmd(*cmds, **kwargs):
'''
    @arguments env=None, timeout=120
may raise Error
'''
env = os.environ.copy()
env.update(kwargs.get('env', {}))
envcopy = {}
for key in env:
try:
envcopy[key] = str(env[key]).encode('utf-8') # fix encoding
        except Exception:
print('IGNORE BAD ENV KEY: ' + repr(key))
env = envcopy
timeout = kwargs.get('timeout', 120)
shell = kwargs.get('shell', False)
try:
import sh
# log.debug('RUN(timeout=%ds): %s'%(timeout, ' '.join(cmds)))
if shell:
cmds = list(cmds)
cmds[:0] = ['bash', '-c']
c = sh.Command(cmds[0])
try:
r = c(*cmds[1:], _err_to_out=True, _out=sys.stdout, _env=env, _timeout=timeout)
except:
# log.error('EXEC_CMD error, cmd: %s'%(' '.join(cmds)))
raise
except ImportError:
# log.debug('RUN(timeout=XX): %s'%(' '.join(cmds)))
if shell:
cmds = ' '.join(cmds)
r = subprocess.Popen(cmds, env=env, stdout=sys.stdout, stderr=sys.stderr, shell=shell)
return r.wait()
return 0
def random_name(name):
out = []
for c in name:
if c == 'X':
c = random.choice(string.ascii_lowercase)
out.append(c)
return ''.join(out)
def remove_force(name):
if not os.path.isfile(name):
return
try:
os.unlink(name)
except Exception as e:
print("Warning: tempfile {} not deleted, Error {}".format(name, e))
SYSTEM_ENCODING = 'gbk' if os.name == 'nt' else 'utf-8'
VALID_IMAGE_EXTS = ['.jpg', '.jpeg', '.png', '.bmp']
# def auto_decode(s, encoding='utf-8'):
# return s if isinstance(s, unicode) else unicode(s, encoding)
def list_images(path=['.']):
""" Return list of image files """
for image_dir in set(path):
if not os.path.isdir(image_dir):
continue
for filename in os.listdir(image_dir):
bname, ext = os.path.splitext(filename)
if ext.lower() not in VALID_IMAGE_EXTS:
continue
filepath = os.path.join(image_dir, filename)
yield strutils.decode(filepath)
def list_all_image(path, valid_exts=VALID_IMAGE_EXTS):
"""List all images under path
@return unicode list
"""
for filename in os.listdir(path):
bname, ext = os.path.splitext(filename)
        if ext.lower() not in valid_exts:
continue
filepath = os.path.join(path, filename)
yield strutils.decode(filepath)
def image_name_match(name, target):
if name == target:
return True
bn = os.path.normpath(name)
bt = os.path.basename(target)
if bn == bt:
return True
bn, ext = os.path.splitext(bn)
if ext != '':
return False
for ext in VALID_IMAGE_EXTS:
if bn+ext == bt or bn+ext.upper() == bt:
return True
if bt.find('@') != -1:
if bn == bt[:bt.find('@')]:
return True
return False
def search_image(name=None, path=['.']):
"""
    look for the image's real path; if name is None, return all images under path.
    @return system encoded path string
    FIXME(ssx): this code just looks weird.
"""
name = strutils.decode(name)
for image_dir in path:
if not os.path.isdir(image_dir):
continue
image_dir = strutils.decode(image_dir)
image_path = os.path.join(image_dir, name)
if os.path.isfile(image_path):
return strutils.encode(image_path)
for image_path in list_all_image(image_dir):
if not image_name_match(name, image_path):
continue
return strutils.encode(image_path)
return None
def clean_path(filepath):
return os.path.normpath(os.path.relpath(strutils.decode(filepath)))
def filename_match(fsearch, filename, width, height):
'''
<nickname>@({width}x{height}|auto).(png|jpg|jpeg)
'''
fsearch = clean_path(fsearch)
filename = clean_path(filename)
if fsearch == filename:
return True
if fsearch.find('@') == -1:
return False
basename, fileext = os.path.splitext(fsearch)
nickname, extinfo = basename.split('@', 1)
if extinfo == 'auto':
valid_names = {}.fromkeys([
nickname+'@{}x{}'.format(width, height)+fileext,
nickname+'@{}x{}'.format(height, width)+fileext,
nickname+'.{}x{}'.format(width, height)+fileext,
nickname+'.{}x{}'.format(height, width)+fileext,
])
if filename in valid_names:
return True
# if extinfo.find('x') != -1:
# cw, ch = extinfo.split('x', 1)
# if cw*width == ch*height or cw*height == ch*width:
# return True
return False
def lookup_image(fsearch, width=0, height=0):
dirname = os.path.dirname(fsearch) or "."
for file in os.listdir(dirname):
filepath = os.path.join(dirname, file)
if filename_match(fsearch, filepath, width, height):
return filepath
def nameddict(name, props):
"""
Point = nameddict('Point', ['x', 'y'])
pt = Point(x=1, y=2)
pt.y = 3
    print(pt)
"""
class NamedDict(object):
def __init__(self, *args, **kwargs):
self.__store = {}.fromkeys(props)
if args:
for i, k in enumerate(props[:len(args)]):
self[k] = args[i]
for k, v in kwargs.items():
self[k] = v
def __getattr__(self, key):
# print '+', key
if key.startswith('_NamedDict__'):
return self.__dict__[key]
else:
return self.__store[key]
def __setattr__(self, key, value):
# print '-', key
if key.startswith('_NamedDict__'):
object.__setattr__(self, key, value)
else:
self.__setitem__(key, value)
# self.__store[key] = value
def __getitem__(self, key):
return self.__store[key]
def __setitem__(self, key, value):
if key not in props:
                raise AttributeError("NamedDict(%s) has no attribute %s, available attributes are %s" % (
                    name, key, props))
self.__store[key] = value
def __dict__(self):
return self.__store
def __str__(self):
return 'NamedDict(%s: %s)' % (name, str(self.__store))
return NamedDict
if __name__ == '__main__':
# print search_image('.png')
# print search_image('oo')
# print search_image()
Point = nameddict('Point', ['x', 'y'])
print(Point(2, 3, 4))
```
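The `@auto` naming convention documented in `filename_match` above (a search for `icon@auto.png` resolves to `icon@{W}x{H}.png` for the current screen size, in either orientation, with `.` also accepted as the separator) can be sketched standalone; the helper below is illustrative and independent of the `atx` module:

```python
import os

def auto_candidates(fsearch, width, height):
    """Expand 'name@auto.ext' into the concrete filenames that the
    '@auto' convention accepts for a width x height screen; any other
    search string is returned unchanged as the sole candidate."""
    base, ext = os.path.splitext(fsearch)
    nickname, _, info = base.partition('@')
    if info != 'auto':
        return [fsearch]
    return [
        '{}@{}x{}{}'.format(nickname, width, height, ext),
        '{}@{}x{}{}'.format(nickname, height, width, ext),
        '{}.{}x{}{}'.format(nickname, width, height, ext),
        '{}.{}x{}{}'.format(nickname, height, width, ext),
    ]
```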
|
```kotlin
package com.x8bit.bitwarden.data.platform.repository.model
/**
* A data state that can be used as a template for data in the repository layer.
*/
sealed class DataState<out T> {
/**
* Data that is being wrapped by [DataState].
*/
abstract val data: T?
/**
     * Loading state in which no data is available.
*/
data object Loading : DataState<Nothing>() {
override val data: Nothing? get() = null
}
/**
* Loaded state that has data available.
*/
data class Loaded<T>(
override val data: T,
) : DataState<T>()
/**
* Pending state that has data available.
*/
data class Pending<T>(
override val data: T,
) : DataState<T>()
/**
* Error state that may have data available.
*/
data class Error<T>(
val error: Throwable,
override val data: T? = null,
) : DataState<T>()
/**
     * No-network state that may have data available.
*/
data class NoNetwork<T>(
override val data: T? = null,
) : DataState<T>()
}
```
|
```cpp
/*
 * This program is free software: you can redistribute it and/or modify
 * it under the terms of the GNU General Public License as published by
 * the Free Software Foundation, either version 2 or (at your option)
 * any later version.
 *
 * This program is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 * GNU General Public License for more details.
 *
 * You should have received a copy of the GNU General Public License
 * along with this program. If not, see <path_to_url
 */
#ifndef AGENTSETTINGSPAGE_H
#define AGENTSETTINGSPAGE_H
#include "gui/ApplicationSettingsWidget.h"
class AgentSettingsPage : public ISettingsPage
{
public:
AgentSettingsPage() = default;
~AgentSettingsPage() override = default;
QString name() override;
QIcon icon() override;
QWidget* createWidget() override;
void loadSettings(QWidget* widget) override;
void saveSettings(QWidget* widget) override;
private:
};
#endif // AGENTSETTINGSPAGE_H
```
|
Gusevsky () is an urban locality (urban-type settlement) under the administrative jurisdiction of the town of krai significance of Gus-Khrustalny of Vladimir Oblast, Russia. Population:
References
Notes
Sources
Urban-type settlements in Vladimir Oblast
|
Araiyakushi-mae Station is a railway station on the Seibu Shinjuku Line in Nakano, Tokyo, Japan, operated by the private railway operator Seibu Railway.
Lines
Araiyakushi-mae Station is served by the 47.5 km Seibu Shinjuku Line from in Tokyo to in Saitama Prefecture. Located between and , it is 5.2 km from the Seibu-Shinjuku terminus.
During the daytime off-peak, the station is served by six trains per hour in either direction.
Station layout
The station consists of two ground-level side platforms serving two tracks.
Platforms
History
The station opened on 16 April 1927.
Station numbering was introduced on all Seibu Railway lines during fiscal 2012, with Araiyakushi-mae Station becoming "SS05".
Future developments
In order to ease congestion and improve the safety of the railway in the local area, plans have been produced to divert the tracks between Nakai Station and Nogata Station underground. Consequently, the existing station complex is expected to be replaced by an underground station. Approval for the plan was granted in April 2013.
Passenger statistics
In fiscal 2013, the station was the 48th busiest on the Seibu network with an average of 22,645 passengers daily.
The passenger figures for previous years are as shown below.
Surrounding area
, commonly known as Araiyakushi, after which the station was named
Mejiro University Shinjuku campus
References
External links
Araiyakushi-mae Station
Railway stations in Tokyo
Nakano, Tokyo
Railway stations in Japan opened in 1927
|
```css
Use `text-transform` to avoid screen-reader pronunciation errors
Use `list-style-type` to change the marker type in lists
Using the `font-variant` property to transform text to small caps
Comma-separated lists
Page breaks for printing
```
|
```java
package org.bouncycastle.crypto.params;
import org.bouncycastle.crypto.CipherParameters;
public class KeyParameter
implements CipherParameters
{
private byte[] key;
public KeyParameter(
byte[] key)
{
this(key, 0, key.length);
}
public KeyParameter(
byte[] key,
int keyOff,
int keyLen)
{
this.key = new byte[keyLen];
System.arraycopy(key, keyOff, this.key, 0, keyLen);
}
public byte[] getKey()
{
return key;
}
}
```
|
Panagiotis Ballas (, born 6 September 1993) is a Greek professional footballer who plays as a defensive midfielder.
Club career
He joined Atromitos in 2008 from Akadimia Elpides Karditsas. He made his debut for the first team in a home game against Kavala, on the final matchday of the 2010–11 season.
On 8 September 2017, he signed a three-year contract with AEL, on a free transfer.
Career statistics
Club
References
External links
Atromitos F.C. profile
1993 births
Living people
Greek men's footballers
Greece men's under-21 international footballers
Greece men's youth international footballers
Super League Greece players
3. Liga players
Super League Greece 2 players
Atromitos F.C. players
SG Sonnenhof Großaspach players
Panionios F.C. players
Athlitiki Enosi Larissa F.C. players
Apollon Larissa F.C. players
Anagennisi Karditsa F.C. players
Greek expatriate men's footballers
Greek expatriate sportspeople in Germany
Expatriate men's footballers in Germany
Men's association football midfielders
Footballers from Karditsa
|
```php
<?php
/*
*
*
* path_to_url
*
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
 * WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
 * License for the specific language governing permissions and limitations under
 * the License.
*/
namespace Google\Service\CloudDeploy;
class Rollout extends \Google\Collection
{
protected $collection_key = 'rolledBackByRollouts';
/**
* @var string[]
*/
public $annotations;
/**
* @var string
*/
public $approvalState;
/**
* @var string
*/
public $approveTime;
/**
* @var string
*/
public $controllerRollout;
/**
* @var string
*/
public $createTime;
/**
* @var string
*/
public $deployEndTime;
/**
* @var string
*/
public $deployFailureCause;
/**
* @var string
*/
public $deployStartTime;
/**
* @var string
*/
public $deployingBuild;
/**
* @var string
*/
public $description;
/**
* @var string
*/
public $enqueueTime;
/**
* @var string
*/
public $etag;
/**
* @var string
*/
public $failureReason;
/**
* @var string[]
*/
public $labels;
protected $metadataType = Metadata::class;
protected $metadataDataType = '';
/**
* @var string
*/
public $name;
protected $phasesType = Phase::class;
protected $phasesDataType = 'array';
/**
* @var string
*/
public $rollbackOfRollout;
/**
* @var string[]
*/
public $rolledBackByRollouts;
/**
* @var string
*/
public $state;
/**
* @var string
*/
public $targetId;
/**
* @var string
*/
public $uid;
/**
* @param string[]
*/
public function setAnnotations($annotations)
{
$this->annotations = $annotations;
}
/**
* @return string[]
*/
public function getAnnotations()
{
return $this->annotations;
}
/**
* @param string
*/
public function setApprovalState($approvalState)
{
$this->approvalState = $approvalState;
}
/**
* @return string
*/
public function getApprovalState()
{
return $this->approvalState;
}
/**
* @param string
*/
public function setApproveTime($approveTime)
{
$this->approveTime = $approveTime;
}
/**
* @return string
*/
public function getApproveTime()
{
return $this->approveTime;
}
/**
* @param string
*/
public function setControllerRollout($controllerRollout)
{
$this->controllerRollout = $controllerRollout;
}
/**
* @return string
*/
public function getControllerRollout()
{
return $this->controllerRollout;
}
/**
* @param string
*/
public function setCreateTime($createTime)
{
$this->createTime = $createTime;
}
/**
* @return string
*/
public function getCreateTime()
{
return $this->createTime;
}
/**
* @param string
*/
public function setDeployEndTime($deployEndTime)
{
$this->deployEndTime = $deployEndTime;
}
/**
* @return string
*/
public function getDeployEndTime()
{
return $this->deployEndTime;
}
/**
* @param string
*/
public function setDeployFailureCause($deployFailureCause)
{
$this->deployFailureCause = $deployFailureCause;
}
/**
* @return string
*/
public function getDeployFailureCause()
{
return $this->deployFailureCause;
}
/**
* @param string
*/
public function setDeployStartTime($deployStartTime)
{
$this->deployStartTime = $deployStartTime;
}
/**
* @return string
*/
public function getDeployStartTime()
{
return $this->deployStartTime;
}
/**
* @param string
*/
public function setDeployingBuild($deployingBuild)
{
$this->deployingBuild = $deployingBuild;
}
/**
* @return string
*/
public function getDeployingBuild()
{
return $this->deployingBuild;
}
/**
* @param string
*/
public function setDescription($description)
{
$this->description = $description;
}
/**
* @return string
*/
public function getDescription()
{
return $this->description;
}
/**
* @param string
*/
public function setEnqueueTime($enqueueTime)
{
$this->enqueueTime = $enqueueTime;
}
/**
* @return string
*/
public function getEnqueueTime()
{
return $this->enqueueTime;
}
/**
* @param string
*/
public function setEtag($etag)
{
$this->etag = $etag;
}
/**
* @return string
*/
public function getEtag()
{
return $this->etag;
}
/**
* @param string
*/
public function setFailureReason($failureReason)
{
$this->failureReason = $failureReason;
}
/**
* @return string
*/
public function getFailureReason()
{
return $this->failureReason;
}
/**
* @param string[]
*/
public function setLabels($labels)
{
$this->labels = $labels;
}
/**
* @return string[]
*/
public function getLabels()
{
return $this->labels;
}
/**
* @param Metadata
*/
public function setMetadata(Metadata $metadata)
{
$this->metadata = $metadata;
}
/**
* @return Metadata
*/
public function getMetadata()
{
return $this->metadata;
}
/**
* @param string
*/
public function setName($name)
{
$this->name = $name;
}
/**
* @return string
*/
public function getName()
{
return $this->name;
}
/**
* @param Phase[]
*/
public function setPhases($phases)
{
$this->phases = $phases;
}
/**
* @return Phase[]
*/
public function getPhases()
{
return $this->phases;
}
/**
* @param string
*/
public function setRollbackOfRollout($rollbackOfRollout)
{
$this->rollbackOfRollout = $rollbackOfRollout;
}
/**
* @return string
*/
public function getRollbackOfRollout()
{
return $this->rollbackOfRollout;
}
/**
* @param string[]
*/
public function setRolledBackByRollouts($rolledBackByRollouts)
{
$this->rolledBackByRollouts = $rolledBackByRollouts;
}
/**
* @return string[]
*/
public function getRolledBackByRollouts()
{
return $this->rolledBackByRollouts;
}
/**
* @param string
*/
public function setState($state)
{
$this->state = $state;
}
/**
* @return string
*/
public function getState()
{
return $this->state;
}
/**
* @param string
*/
public function setTargetId($targetId)
{
$this->targetId = $targetId;
}
/**
* @return string
*/
public function getTargetId()
{
return $this->targetId;
}
/**
* @param string
*/
public function setUid($uid)
{
$this->uid = $uid;
}
/**
* @return string
*/
public function getUid()
{
return $this->uid;
}
}
// Adding a class alias for backwards compatibility with the previous class name.
class_alias(Rollout::class, 'Google_Service_CloudDeploy_Rollout');
```
|
```swift
//: [Table of Contents](00-ToC)
//: [Previous](@previous)
import SwifterSwift
import PlaygroundSupport
//: ## UIKit extensions
//: ### UIButton extensions
let button = UIButton(frame: CGRect(x: 0, y: 0, width: 100, height: 40))
// Set title, title color and image for all states at once!
button.setTitleForAllStates("Login")
button.setTitleColorForAllStates(UIColor.blue)
button.setImageForAllStates(UIImage(named: "login")!)
//: ### UIColor extensions
// Create new UIColor for RGB values
let color1 = UIColor(red: 121, green: 220, blue: 164)
// Create new UIColor for a hex string (including strings starting with #, 0x or in short css hex format)
let color2 = UIColor(hexString: "#00F")
// Create new UIColor for a hexadecimal value
let color3 = UIColor(hex: 0x45C91B)
// Blend two colors with ease
UIColor.blend(UIColor.red, intensity1: 0.5, with: UIColor.green, intensity2: 0.3)
// Return hexadecimal value string
UIColor.red.hexString
// Use Google Material design colors with ease
let indigo = UIColor.Material.indigo
// Use CSS colors with ease:
let beige = UIColor.CSS.beige
// Return brand colors from more than 30 social brands
let facebookColor = UIColor.Social.facebook
//: ### UIImage extensions
let image1 = UIImage(named: "logo")!
// Crop images
let croppedImage = image1.cropped(to: CGRect(x: 0, y: 0, width: 100, height: 100))
// scale to fit width or height
let scaledImage1 = image1.scaled(toHeight: 50)
let scaledImage2 = image1.scaled(toWidth: 50)
// Compress images
let compressedImage = image1.compressed(quality: 0.3)
// get image size
image1.kilobytesSize
//: ### UIImageView extensions
let imageView = UIImageView()
// Download an image from URL in background
PlaygroundPage.current.needsIndefiniteExecution = true
imageView.download(from: URL(string: "path_to_url")!,
contentMode: .scaleAspectFit,
placeholder: image1,
completionHandler: { downloadedImage in
downloadedImage
PlaygroundPage.current.needsIndefiniteExecution = false
imageView.sizeToFit()
// Blur image view
imageView.blur(withStyle: .light)
})
//: ### UINavigationBar extensions
let navbar = UINavigationBar(frame: CGRect(x: 0, y: 0, width: 100, height: 60))
let navItem = UINavigationItem(title: "Title")
navbar.pushItem(navItem, animated: false)
// Change navigation bar font and color
navbar.setTitleFont(UIFont.systemFont(ofSize: 10), color: UIColor.red)
//: ### UIView extensions
// Set borderColor, borderWidth, cornerRadius, shadowColor, and many other properties from code or storyboard
var view = UIView(frame: CGRect(x: 0, y: 0, width: 100, height: 100))
view.backgroundColor = UIColor.red
// Set some or all corners radiuses of view.
view.roundCorners([.bottomLeft, .topRight], radius: 30)
view.layerCornerRadius = 30
// Add shadow to view
view.addShadow(ofColor: .black, radius: 3, opacity: 0.5)
// Add gradient
view.addGradient(colors: [.red, .blue], direction: .rightToLeft)
//: [Next](@next)
```
|
The John Mayfield House is a historic house in Glasgow, Kentucky. It was built in the 1830s for John Mayfield and his family. It was later acquired by W. L. Steffey, and his family inherited the property after his death. It was designed in the Federal architectural style. It has been listed on the National Register of Historic Places since May 20, 1983.
References
National Register of Historic Places in Barren County, Kentucky
Federal architecture in Kentucky
Houses completed in 1830
1830s establishments in Kentucky
Houses in Barren County, Kentucky
Houses on the National Register of Historic Places in Kentucky
Glasgow, Kentucky
|
Acala ("The Immovable"), or Achala, also known as Acalanātha ("Immovable Lord") or Āryācalanātha ("Noble Immovable Lord"), is a wrathful deity and dharmapala (protector of the Dharma) prominent in Vajrayana Buddhism and East Asian Buddhism.
Originally a minor deity described as a messenger or acolyte of the buddha Vairocana, Acala later rose to prominence as an object of veneration in his own right as a remover of obstacles and destroyer of evil, eventually becoming seen as the wrathful manifestation of either Vairocana, the buddha Akṣobhya, or the bodhisattva Mañjuśrī. In later texts, he is also called Caṇḍaroṣaṇa ("Violent Wrathful One") or Caṇḍamahāroṣaṇa ("Violent One of Great Wrath"), the names by which he is more commonly known in countries like Nepal and Tibet.
In East Asian esoteric Buddhism, Acala is classed among the Wisdom Kings () and is preeminent among the five Wisdom Kings of the Womb Realm. Accordingly, he occupies an important hierarchical position in the Mandala of the Two Realms. In China, he is known as Bùdòng Míngwáng (不動明王, "Immovable Wisdom King", the Chinese translation of Sanskrit Acala(nātha) Vidyārāja), while in Japan, he is called Fudō Myōō, the on'yomi reading of his Chinese name. Acala (as Fudō) is one of the especially important and well-known divinities in Japanese Buddhism, being especially venerated in the Shingon, Tendai, Zen, and Nichiren sects, as well as in Shugendō.
Acala has been worshiped throughout the Middle Ages and into modern times in Nepal, Tibet, China and Japan, where sculptural and pictorial representations of him are most often found.
Origins and development
Acala first appears in the Amoghapāśakalparāja Sūtra (不空羂索神変真言經, pinyin: Bùkōng juànsuǒ shénbiàn zhēnyán jīng, translated by Bodhiruci circa 707-709 CE), where he is described as a servant or messenger of the buddha Vairocana:

The first from the west in the northern quadrant is the acolyte Acala (不動使者). In his left hand he grasps a noose and in his right hand he holds a sword. He is seated in the half-lotus position.
More well-known, however, is the following passage from the Mahāvairocana Tantra (also known as the Mahāvairocanābhisaṃbodhi Tantra or the Vairocana Sūtra) which refers to Acala as one of the deities of the Womb Realm Mandala:
The deity was apparently popular in India during the 8th-9th centuries, as evidenced by the fact that six of the Sanskrit texts translated by the esoteric master Amoghavajra into Chinese are devoted entirely to him. Indeed, Acala's rise to a more prominent position in the esoteric pantheon of East Asian Buddhism may be credited in part to the writings of Amoghavajra and his teacher Vajrabodhi.
While some scholars have put forward the theory that Acala originated from the Hindu god Shiva, particularly his attributes of destruction and reincarnation, Bernard Faure suggested the wrathful esoteric deity Trailokyavijaya (whose name is an epithet of Shiva), the Vedic fire god Agni, and the guardian deity Vajrapani to be other, more likely prototypes for Acala. He notes: "one could theoretically locate Acala's origins in a generic Śiva, but only in the sense that all Tantric deities can in one way or another be traced back to Śiva." Faure compares Acala to Vajrapani in that both were originally minor deities who eventually came to occupy important places in the Buddhist pantheon.
Acala is said to be a powerful deity who protects the faithful by burning away all impediments () and defilements (), thus aiding them towards enlightenment. In a commentary on the Mahāvairocana Tantra by Yi Xing, he is said to have manifested in the world following Vairocana's vow to save all beings, and that his primary function is to remove obstacles to enlightenment. Indeed, the tantra instructs the ritual practitioner to recite Acala's mantras or to visualize himself as Acala in order to remove obstacles.
From a humble acolyte, Acala evolved into a powerful demon-subduing deity. In later texts such as the Caṇḍamahāroṣaṇa Tantra, Acala - under the name Caṇḍaroṣaṇa ("Violent Wrathful One") or Caṇḍamahāroṣaṇa ("Violent One of Great Wrath") - is portrayed as the "frightener of gods, titans, and men, the destroyer of the strength of demons" who slays ghosts and evil spirits with his fierce anger. In the Sādhanamālā, the gods Vishnu, Shiva, Brahma and Kandarpa - described as "wicked" beings who subject humanity to endless rebirth - are said to be terrified of Acala because he carries a rope to bind them.
In Tibetan Buddhism, Acala or Miyowa (མི་གཡོ་བ་, Wylie: mi g.yo ba) is considered as belonging to the ("vajra family", Tibetan: དོ་རྗེའི་རིགས་, dorjé rik; Wylie: rdo rje'i rigs), one of the Five Buddha Families presided over by the buddha Akṣobhya and may even be regarded, along with the other deities of the kula, as an aspect or emanation of the latter. He is thus sometimes depicted in South Asian art wearing a crown with an effigy of Akṣobhya. In Nepal, Acala may also be identified as a manifestation of the bodhisattva Mañjuśrī. He has a consort named Viśvavajrī in both the Nepalese and Tibetan traditions, with whom he is at times depicted in yab-yum union.
By contrast, the sanrinjin (三輪身, "bodies of the three wheels") theory, based on Amoghavajra's writings and prevalent in Japanese esoteric Buddhism (Mikkyō), interprets Acala as an incarnation of Vairocana. In this system, the five chief vidyārājas or Wisdom Kings (明王, Myōō), of which Acala is one, are interpreted as the wrathful manifestations (教令輪身, kyōryōrin-shin, lit. "embodiments of the wheel of injunction") of the Five Great Buddhas, who appear both as gentle bodhisattvas to teach the Dharma and also as fierce wrathful deities to subdue and convert hardened nonbelievers. Under this conceptualization, vidyārājas are ranked superior to a different class of guardian deities. However, this interpretation, while common in Japan, is not necessarily universal: in Nichiren-shū, for instance, Acala and Rāgarāja (Aizen Myōō), the two vidyārājas who commonly feature in the mandalas inscribed by Nichiren, are seen as protective deities (外護神, gegoshin) who respectively embody the two tenets of hongaku ("original enlightenment") doctrine: "life and death (saṃsāra) are precisely nirvana" (生死即涅槃, shōji soku nehan) and "worldly passions (kleśa) are precisely enlightenment (bodhi)" (煩悩即菩提, bonnō soku bodai).
Iconography
The Caṇḍamahāroṣaṇa Tantra description of Acala is a good summary of the deity's depiction in South Asian Buddhist art.
In Nepalese and Tibetan art, Acala is usually shown either kneeling on his left knee or standing astride, bearing a noose or lasso (pāśa) and an upraised sword. Some depictions portray him trampling on the elephant-headed Vighnarāja (lit. "Ruler of Hindrances", a Buddhist equivalent to the Hindu god Ganesha, albeit interpreted negatively as one who causes obstacles), signifying his role as the destroyer of impediments to enlightenment. He may also be shown wearing a tiger skin, with snakes coiled around his arms and body.
By contrast, portrayals of Acala (Fudō) in Japan generally tend to conform to the description given in the Amoghapāśakalparāja Sūtra and the Mahāvairocana Tantra: holding a lasso and a sword while sitting or standing on a rock (盤石座, banjakuza) or a pile of hewn stones (瑟瑟座, shitsushitsuza), with his braided hair hanging from the left of his head. He may also be depicted with a lotus flower - a symbol of enlightenment - on his head (頂蓮, chōren). Unlike the South Asian Acala, whose striding posture conveys movement and dynamism, the Japanese Fudō sits or stands erect, suggesting motionlessness and rigidity. The sword he wields may or may not be flaming and is sometimes described generically as a or , which is descriptive of the fact that the sword's pommel is in the shape of the talon-like vajra (金剛杵, kongō-sho). It may also be referred to as a . In some cases, he is seen holding the "Kurikara sword" (倶利伽羅剣, Kurikara-ken), a sword with the dragon (nāga) king Kurikara (倶利伽羅; Sanskrit: Kulikāla-rāja or Kṛkāla-rāja) coiled around it. The flaming nimbus or halo behind Acala is commonly known in Japanese as the "Garuda flame" (迦楼羅炎, karura-en) after the mythical fire-breathing bird from Indian mythology.
There are two main variations in the iconography of Acala / Fudō in Japan. The first type (observable in the earliest extant Japanese images of the deity) shows him with wide open, glaring eyes, straight hair braided in rows and two fangs pointed in the same direction; a lotus flower rests above his head. The second type (which first appeared in the late 9th century and became increasingly common during the late Heian and Kamakura periods), by contrast, portrays Acala with curly hair, one eye wide open and/or looking upwards, with the other narrowed and/or looking downwards, an iconographic trait known as the tenchigan (天地眼), "heaven-and-earth eyes". Similarly, one of his fangs is now shown as pointing up, with the other pointing down. In place of the lotus flower, images of this type may sport seven topknots.
Although the squinting left eye and inverted fangs of the second type ultimately derive from the description of Acala given in the Mahāvairocana Tantra and Yi Xing's commentary on the text ("with his lower [right] tooth he bites the upper-right side of his lip, and with his left [-upper tooth he bites] his lower lip which sticks out"), these attributes were mostly absent in Chinese and earlier Japanese icons.
Acala's mismatched eyes and fangs were allegorically interpreted to signify both the duality and nonduality of his nature (and of all reality): the upward fang for instance was interpreted as symbolizing the process of elevation towards enlightenment, with the downward fang symbolizing the descent of enlightened beings into the world to teach sentient beings. The two fangs also symbolize the realms of buddhas and sentient beings, yin and yang, and male and female, with the nonduality of these two polar opposites being expressed by Acala's tightly closed lips.
Acala is commonly shown as having either black or blue skin (the Sādhanamālā describes his color as being "like that of the atasī (flax) flower," which may be either yellow or blue), though he may be at times portrayed in other colors. In Tibet, for instance, a variant of the kneeling Acala depiction shows him as being white in hue "like sunrise on a snow mountain reflecting many rays of light". In Japan, some images may depict Acala sporting a red (赤不動, Aka-Fudō) or yellow (黄不動, Ki-Fudō) complexion. The most famous example of the Aka-Fudō portrayal is a painting kept at Myōō-in on Mount Kōya (Wakayama Prefecture) traditionally attributed to the Heian period Tendai monk Enchin. Legend claims that Enchin, inspired by a vision of Acala, painted the image using his own blood (thus explaining its red color), though recent analysis suggests that the image may have been actually created much later, during the Kamakura period. The most well-known image of the Ki-Fudō type, meanwhile, is enshrined in Mii-dera (Onjō-ji) at the foot of Mount Hiei in Shiga Prefecture and is said to have been based on another vision that Enchin saw while practicing austerities in 838. The original Mii-dera Ki-Fudō is traditionally only shown to esoteric masters (ācārya; 阿闍梨, ajari) during initiation rites and is otherwise not shown to the public, though copies of it have been made. One such copy, made in the 12th century, is kept at Manshu-in in Kyoto.
The deity is usually depicted with one head and two arms, though a few portrayals show him with multiple heads, arms or legs. In Japan, a depiction of Acala with four arms is employed in subjugation rituals and earth-placating rituals (安鎮法, anchin-hō); this four-armed form is identified in one text as "the lord of the various categories [of gods]." An iconographic depiction known as the "Two-Headed Rāgarāja" (両頭愛染, Ryōzu Aizen or Ryōtō Aizen) shows Acala combined with the wisdom king Rāgarāja (Aizen).
Acolytes
Acala is sometimes described as having a retinue of acolytes, the number of which vary between sources, usually two or eight but sometimes thirty-six or even forty-eight. These represent the elemental, untamed forces of nature that the ritual practitioner seeks to harness.
The two boy servants or dōji (童子) most commonly depicted in Japanese iconographic portrayals are Kiṃkara and Ceṭaka, who also appear as the last two in the list of Acala's eight great dōji. Kiṃkara is depicted as white in color, with his hands joined in respect, while Ceṭaka is red-skinned and holds a vajra in his left hand and a vajra staff in his right hand. The two are said to symbolize Dharma-essence and ignorance, respectively, and are held to be in charge of good and evil.
Kiṃkara and Ceṭaka are also sometimes interpreted as transformations or emanations of Acala himself. In a sense, they reflect Acala's original characterization as an attendant of Vairocana; indeed, their servile nature is reflected in their names (Ceṭaka for instance means "slave") and their topknots, the mark of banished people and slaves. In other texts, they are also described as manifestations of Avalokiteśvara (Kannon) and Vajrapāṇi or as transformations of the dragon Kurikara, who is himself sometimes seen as one of Acala's various incarnations.
Two other notable dōji are Matijvala (恵光童子, Ekō-dōji) and Matisādhu (恵喜童子, Eki-dōji), the first two of Acala's eight great acolytes. Matijvala is depicted as white in color and holds a three-pronged vajra in his right hand and a lotus topped with a moon disk on his left, while Matisādhu is red and holds a trident in his right hand and a wish-fulfilling jewel (cintāmaṇi) on his left. The eight acolytes as a whole symbolize the eight directions, with Matijvala and Matisādhu representing east and south, respectively.
Texts
As noted above, Acala appears in the Amoghapāśakalparāja Sūtra and the Vairocanābhisaṃbodhi Sūtra. As Caṇḍaroṣaṇa or Caṇḍamahāroṣaṇa, he is the primary deity of the Caṇḍamahāroṣaṇa Tantra and is described in the Sādhanamālā.
The Japanese esoteric Buddhist tradition and Shugendō also make use of the following apocryphal sutras on Acala:
Sūtra of the Great Wrathful King Āryācala's Secret Dhāraṇī (聖無動尊大威怒王秘密陀羅尼経, Shō-Mudō-son daiifunnuō himitsu darani kyō)
A sūtra consisting of a discourse on Acala given by the bodhisattva Vajrasattva (identified here with Samantabhadra) to Mañjuśrī, set in "Vairocana's great assembly." The sutra describes Acala as being identical with the all-pervading dharmakāya, "[having] no fixed abode, but [dwelling] within the hearts of sentient beings" (無其所居、但住衆生心想之中).
Āryācala Sūtra (仏説聖不動経, Bussetsu Shō-Fudō kyō)
A summarized version of the above sutra. Translated into English, it runs as follows:
To this text are often appended two litanies of the names of Acala's young acolytes (童子, dōji): the 'thirty-six dōji' (三十六童子, sanjuroku dōji) and the 'eight great dōji' (八大童子, hachi daidōji).
Sūtra on Reverencing the Secret Dhāraṇī of Āryācala (稽首聖無動尊秘密陀羅尼経, Keishu Shō-Mudō-son himitsu darani kyō)
Bīja and mantra
The bīja or seed syllables used to represent Acala in Japanese Buddhism are hāṃ (हां / हाँ) and hāmmāṃ (हाम्मां / हाम्माँ), the latter being a combination of the two final bīja in his mantra: hāṃ māṃ (हां मां). Hāṃ is sometimes confounded with the similar-looking hūṃ (हूं), prompting some writers to mistakenly identify Acala with other deities. The syllables are written using the Siddham script and are conventionally read as kān (カーン) and kānmān (カーンマーン).
Three mantras of Acala are considered to be the standard in Japan. The most widely known one, derived from the Mahāvairocana Tantra and popularly known as the "Mantra of Compassionate Help" (慈救呪, jikushu or jikuju), goes as follows:
The "Short Mantra" (小呪, shōshu) of Acala - also found in the Mahāvairocana Tantra - is as follows:
The longest of the three is the "Great Mantra" of Acala, also known as the "Fire Realm Mantra" (火界呪, kakaishu / kakaiju):
Another mantra associated with the deity is Oṃ caṇḍa-mahāroṣaṇa hūṃ phaṭ, found in the Siddhaikavīra Tantra. The text describes it as the "king of mantras" that dispels all evil and grants "whatever the follower of Mantrayāna desires".
Worship
Japan
Fudō Myōō (Acala) was never popular in Indian, Tibetan, or even Chinese Buddhism, but in Japan he became the object of a flourishing cult with esoteric overtones.
The cult of Acala was first brought to Japan by the esoteric master Kūkai, the founder of the Shingon school, and his successors, where it developed as part of the growing popularity of rituals for the protection of the state. While Acala was at first simply regarded as the primus inter pares among the five wisdom kings, he gradually became a focus of worship in his own right, subsuming characteristics of the other four vidyarājas (who came to be perceived as emanating from him), and became installed as the main deity (honzon) at many temples and outdoor shrines.
Acala, as a powerful vanquisher of evil, was regarded both as a protector of the imperial court and the nation as a whole (in which capacity he was invoked during state-sponsored rituals) and the personal guardian of ritual practitioners. Many eminent Buddhist priests like Kūkai, Kakuban, Ennin, Enchin, and Sōō worshiped Acala as their patron deity, and stories of how he miraculously rescued his devotees in times of danger were widely circulated.
At temples dedicated to Acala, priests perform the , or ritual service to enlist the deity's power of purification to benefit the faithful. This rite routinely involves the use of the as a purification tool.
Lay persons or monks in yamabushi gear who go into rigorous training outdoors in the mountains often pray to small Acala statues or portable talismans that serve as his honzon. This element of yamabushi training, known as Shugendō, predates the introduction of Acala to Japan. At this time, figures such as Zaō Gongen, who appeared before the sect's founder, En no Gyōja, or Vairocana, were commonly worshiped. Once Acala was added to the list of deities typically enshrined by the yamabushi monks, his images were either portable or installed in hokora (outdoor shrines). These statues would often be placed near waterfalls (a common training ground), deep in the mountains, and in caves.
The daimyo Takeda Shingen is known to have taken Fudō Myōō as his patron (particularly when he became a lay monk in his later years), and commissioned a statue of Fudō that was supposedly modelled after his own face.
Acala also tops the list of Thirteen Buddhas. Thus Shingon Buddhist mourners assign Fudō to the first seven days of service. The first week is an important observance, but perhaps not as much as the observance of "seven times seven days" (i.e. 49 days) signifying the end of the "intermediate state" (bardo).
Literature on Shingon Buddhist ritual explains that Sanskrit "seed syllables", mantras, and mudras are attendant to each of the Buddhas for each observance period. But the scholarly consensus seems to be that invocation of the "Thirteen Buddhas" evolved later, around the 14th century, and became widespread by the following century, so it is doubtful that this practice was part of Kūkai's original teachings.
China
Worship of Bùdòng Míngwáng (Acala) was first introduced into China during the Tang dynasty, after the translation of esoteric tantras associated with him by monks such as Amoghavajra and Vajrabodhi. Iconography of Acala has been depicted infrequently in some temples and grottoes from the Tang dynasty through to modern times, usually as part of a set depicting the Eight Wisdom Kings or Ten Wisdom Kings. In modern times, he is revered as one of the eight Buddhist guardians of the Chinese zodiac and is specifically considered the protector of those born in the year of the Rooster. He is also frequently invoked during Chinese Buddhist repentance ceremonies, such as the Liberation Rite of Water and Land, along with the other Wisdom Kings, where they are given offerings and entreated to expel evil from the ritual platform.
In popular culture
Gary Snyder's 1969 poem Smokey the Bear Sutra portrays Smokey Bear (the mascot of the U.S. Forest Service) as an incarnation of Vairocana (the "Great Sun Buddha") in a similar vein as Acala. Indeed, Acala's Mantra of Compassionate Help is presented in the text as Smokey's "great mantra."
Sailor Mars from the Sailor Moon series invokes Acala through the Sanskrit chant of the Mantra of Compassionate Help during her "Fire Soul Bird" attack. Acala is flashed multiple times as a shadowed figure in flames, consistent with Japanese iconography, and in line with Sailor Mars's element of fire.
Gallery
See also
Wisdom King
Trailokyavijaya
Rāgarāja
Homa (ritual)
Narita-san
References
Bibliography
External links
Fudo Myo-O, 不動明王 O-Fudo-sama in Japan
Ellen Schattschneider "Fudo Myoo (Acala)" - In: immortal wishes (2003)
Shingon Buddhist International Institute
Tendai Buddhist Sangha in Denver Colorado
Dharmapalas
Shingon Buddhism
Wisdom Kings
Death gods
Wrathful deities
Herukas
Acala
|
Goeminne is a Belgian surname. Notable people with the surname include:
Christiane Goeminne, Belgian cyclist
Paul Goeminne (1888–?), Belgian ice hockey player
Surnames of Belgian origin
|
Duttaphrynus beddomii (common name: Beddome's toad) is a species of toad endemic to the Western Ghats of India. It is found in Kerala and Tamil Nadu states in the southern Western Ghats at elevations of asl.
Description
Duttaphrynus beddomii exhibits a crown that lacks bony ridges; its short, projecting snout has an angular canthus rostralis. Its interorbital space is somewhat broader than the upper eyelid. Its tympanum is very small and sometimes indistinct. The species' first finger does not extend beyond the second; its toes are nearly entirely webbed, with single subarticular tubercles, two small metatarsal tubercles, and no tarsal fold. The tarso-metatarsal articulation reaches to a point between the eye and the tip of the snout. Its upper parts are covered with rough tubercles; its parotoids, ovate and about twice as long as broad, are rather indistinct. The toad is dark brown as seen from above, with indistinct black spots; its limbs are marbled with carmine; its lower surfaces are marbled with brown.
D. beddomii measures from snout to vent.
References
External links
beddomii
Frogs of India
Endemic fauna of the Western Ghats
Amphibians described in 1876
Taxa named by Albert Günther
|
```html
#@layout()
#define css()
<link rel="stylesheet" href="#(CPATH)/static/components/jquery-file-upload/css/jquery.fileupload.css">
<style>
#uploader {
height: 230px;
}
.myPanel {
font-size: 25px;
color: #ccc;
text-align: center;
padding-top: 60px;
}
</style>
#end
#define script()
<script src="#(CPATH)/static/components/jquery-file-upload/js/vendor/jquery.ui.widget.js"></script>
<script src="#(CPATH)/static/components/jquery-file-upload/js/jquery.iframe-transport.js"></script>
<script src="#(CPATH)/static/components/jquery-file-upload/js/jquery.fileupload.js"></script>
<script>
$('#cfile').fileupload({
dropZone: $('#uploader'),
url: '#(CPATH)/admin/setting/tools/markdown/doMarkdownImport',
sequentialUploads: true,
done: function (e, data) {
if (data.result.state === "ok") {
toastr.success(" Markdown ")
} else {
toastr.error(data.result.message)
}
}
});
</script>
#end
#define content()
<section class="content-header">
<div class="container-fluid">
<div class="row">
<div class="col-sm-6">
<div class="row mb-2">
<div class="col-sm-12">
<h1>
Markdown
<small data-toggle="tooltip" title="" data-placement="right"
data-trigger="hover"><i class="nav-icon far fa-question-circle"></i></small>
<small> / / Markdown</small>
</h1>
</div>
</div>
</div>
</div>
</div><!-- /.container-fluid -->
</section>
<section class="content">
<div class="container-fluid">
<div class="card card-outline card-primary">
<div class="card-body">
<div id="uploader">
<span class="btn btn-block btn-primary fileinput-button" style="width: 220px">
<i class="fas fa-plus"></i>
<span>Markdown</span>
<input id="cfile" type="file" name="files[]" multiple>
</span>
<div class="myPanel">
Hexo/Jekyll ...
</div>
</div>
</div>
</div>
</div>
</section>
#end
```
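The `done` callback in the template above branches on `data.result.state` with a loose `==` comparison. Extracted as a plain function, the same pattern can be sketched as follows (the function name and the success message are hypothetical, for illustration only):

```javascript
// Sketch of the upload-result handling used in the fileupload `done` callback.
// `uploadResultMessage` and the success text are illustrative names, not part
// of the original template.
function uploadResultMessage(result) {
  // Strict equality avoids accidental type coercion.
  if (result.state === "ok") {
    return { level: "success", text: "Markdown import finished" };
  }
  // On failure the server is assumed to supply a `message` field,
  // mirroring `data.result.message` in the template.
  return { level: "error", text: result.message };
}
```

In the template, the two branches map to `toastr.success(...)` and `toastr.error(...)` respectively.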
|
Ronald Mulder (born 27 February 1986) is a Dutch speed skater. He won bronze in the men's 500 metres event at the 2014 Winter Olympics in Sochi, Russia, and finished sixth in the men's 500 metres event at the 2012 World Single Distance Championships. His twin brother, Michel Mulder, is also a speed skater. Both competed in The World Games 2017 in Wroclaw, Poland, representing the Netherlands in the 500 metres sprint and the 200 metres time trial. He competed in the 2018 Winter Olympics in the men's 500 metres event, finishing in 7th place.
Mulder is the current holder of the Dutch record on the 500 metres distance.
In 2023, he took part in the television show Het Onbekende.
Personal records
See also
List of Olympic medalist families
References
External links
1986 births
Dutch male speed skaters
Speed skaters at the 2010 Winter Olympics
Speed skaters at the 2014 Winter Olympics
Speed skaters at the 2018 Winter Olympics
Olympic speed skaters for the Netherlands
Medalists at the 2014 Winter Olympics
Olympic medalists in speed skating
Olympic bronze medalists for the Netherlands
Sportspeople from Zwolle
Dutch twins
Living people
World Single Distances Speed Skating Championships medalists
Competitors at the 2017 World Games
|
Murkongselek Railway Station is a main railway station in Dhemaji district, Assam. Its code is MZS. It serves Murkongselek town. The station consists of three platforms and has been upgraded to a standard Class II station. It is a railway station which connects Assam to Arunachal Pradesh.
Station details
Platforms
There are a total of 3 platforms and 5 tracks. The platforms are connected by a foot overbridge and are built to accommodate 24-coach express trains.
Murkongselek railway station has a separate platform for receiving and unloading freight (goods) trains.
Station layout
Major Trains
Kamakhya–Murkongselek Lachit Express
Rangapara North–Murkongselek Passenger
Dekargaon–Murkongselek Passenger
Murkongselek–Ledo DEMU Express Special
New rail lines under construction
The 227 km Murkongselek–Pasighat–Tezu–Rupai line is being undertaken as a strategic project.
Nearest airport
The nearest airports are Dibrugarh Airport, Lilabari Airport in Lakhimpur district, and Pasighat Airport in Arunachal Pradesh (nearly 37 km away).
See also
References
External links
Murkongselek Station Junction Map
Railway stations in Dhemaji district
Rangiya railway division
|
```python
Following PEP 8 styling guideline.
Get more with `collections`!
Enhance your `tuple`s
Get the most of `float`s
Looping techniques
```
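The section titles above read like the outline of a Python-tips article; as an illustrative sketch (not taken from the original article), here is one small example per topic:

```python
from collections import Counter, namedtuple

# collections: purpose-built containers beyond list/dict/tuple.
letter_counts = Counter("abracadabra")
print(letter_counts["a"])  # 5

# "Enhanced" tuples: namedtuple adds field names to a plain tuple.
Point = namedtuple("Point", ["x", "y"])
p = Point(3, 4)
print(p.x, p == (3, 4))  # 3 True

# floats: as_integer_ratio() exposes the exact stored binary value.
print((0.25).as_integer_ratio())  # (1, 4)

# Looping techniques: enumerate() replaces manual index bookkeeping.
for i, ch in enumerate("ab"):
    print(i, ch)
```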
|
```m4sugar
# ltsugar.m4 -- libtool m4 base layer. -*-Autoconf-*-
#
# Written by Gary V. Vaughan, 2004
#
# This file is free software; the Free Software Foundation gives
# unlimited permission to copy and/or distribute it, with or without
# modifications, as long as this notice is preserved.
# serial 6 ltsugar.m4
# This is to help aclocal find these macros, as it can't see m4_define.
AC_DEFUN([LTSUGAR_VERSION], [m4_if([0.1])])
# lt_join(SEP, ARG1, [ARG2...])
# -----------------------------
# Produce ARG1SEPARG2...SEPARGn, omitting [] arguments and their
# associated separator.
# Needed until we can rely on m4_join from Autoconf 2.62, since all earlier
# versions in m4sugar had bugs.
m4_define([lt_join],
[m4_if([$#], [1], [],
[$#], [2], [[$2]],
[m4_if([$2], [], [], [[$2]_])$0([$1], m4_shift(m4_shift($@)))])])
m4_define([_lt_join],
[m4_if([$#$2], [2], [],
[m4_if([$2], [], [], [[$1$2]])$0([$1], m4_shift(m4_shift($@)))])])
# lt_car(LIST)
# lt_cdr(LIST)
# ------------
# Manipulate m4 lists.
# These macros are necessary as long as we still need to support
# Autoconf-2.59 which quotes differently.
m4_define([lt_car], [[$1]])
m4_define([lt_cdr],
[m4_if([$#], 0, [m4_fatal([$0: cannot be called without arguments])],
[$#], 1, [],
[m4_dquote(m4_shift($@))])])
m4_define([lt_unquote], $1)
# lt_append(MACRO-NAME, STRING, [SEPARATOR])
# ------------------------------------------
# Redefine MACRO-NAME to hold its former content plus `SEPARATOR'`STRING'.
# Note that neither SEPARATOR nor STRING are expanded; they are appended
# to MACRO-NAME as is (leaving the expansion for when MACRO-NAME is invoked).
# No SEPARATOR is output if MACRO-NAME was previously undefined (different
# than defined and empty).
#
# This macro is needed until we can rely on Autoconf 2.62, since earlier
# versions of m4sugar mistakenly expanded SEPARATOR but not STRING.
m4_define([lt_append],
[m4_define([$1],
m4_ifdef([$1], [m4_defn([$1])[$3]])[$2])])
# lt_combine(SEP, PREFIX-LIST, INFIX, SUFFIX1, [SUFFIX2...])
# ----------------------------------------------------------
# Produce a SEP delimited list of all paired combinations of elements of
# PREFIX-LIST with SUFFIX1 through SUFFIXn. Each element of the list
# has the form PREFIXmINFIXSUFFIXn.
# Needed until we can rely on m4_combine added in Autoconf 2.62.
m4_define([lt_combine],
[m4_if(m4_eval([$# > 3]), [1],
[m4_pushdef([_Lt_sep], [m4_define([_Lt_sep], m4_defn([lt_car]))])]]dnl
[[m4_foreach([_Lt_prefix], [$2],
[m4_foreach([_Lt_suffix],
]m4_dquote(m4_dquote(m4_shift(m4_shift(m4_shift($@)))))[,
[_Lt_sep([$1])[]m4_defn([_Lt_prefix])[$3]m4_defn([_Lt_suffix])])])])])
# lt_if_append_uniq(MACRO-NAME, VARNAME, [SEPARATOR], [UNIQ], [NOT-UNIQ])
# -----------------------------------------------------------------------
# Iff MACRO-NAME does not yet contain VARNAME, then append it (delimited
# by SEPARATOR if supplied) and expand UNIQ, else NOT-UNIQ.
m4_define([lt_if_append_uniq],
[m4_ifdef([$1],
[m4_if(m4_index([$3]m4_defn([$1])[$3], [$3$2$3]), [-1],
[lt_append([$1], [$2], [$3])$4],
[$5])],
[lt_append([$1], [$2], [$3])$4])])
# lt_dict_add(DICT, KEY, VALUE)
# -----------------------------
m4_define([lt_dict_add],
[m4_define([$1($2)], [$3])])
# lt_dict_add_subkey(DICT, KEY, SUBKEY, VALUE)
# --------------------------------------------
m4_define([lt_dict_add_subkey],
[m4_define([$1($2:$3)], [$4])])
# lt_dict_fetch(DICT, KEY, [SUBKEY])
# ----------------------------------
m4_define([lt_dict_fetch],
[m4_ifval([$3],
m4_ifdef([$1($2:$3)], [m4_defn([$1($2:$3)])]),
m4_ifdef([$1($2)], [m4_defn([$1($2)])]))])
# lt_if_dict_fetch(DICT, KEY, [SUBKEY], VALUE, IF-TRUE, [IF-FALSE])
# -----------------------------------------------------------------
m4_define([lt_if_dict_fetch],
[m4_if(lt_dict_fetch([$1], [$2], [$3]), [$4],
[$5],
[$6])])
# lt_dict_filter(DICT, [SUBKEY], VALUE, [SEPARATOR], KEY, [...])
# --------------------------------------------------------------
m4_define([lt_dict_filter],
[m4_if([$5], [], [],
[lt_join(m4_quote(m4_default([$4], [[, ]])),
lt_unquote(m4_split(m4_normalize(m4_foreach(_Lt_key, lt_car([m4_shiftn(4, $@)]),
[lt_if_dict_fetch([$1], _Lt_key, [$2], [$3], [_Lt_key ])])))))])[]dnl
])
```
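As a small illustration (an assumed usage sketch, with m4sugar loaded as it would be in a configure script), the dict macros above act as a flat key/value store whose entries are macros named `DICT(KEY)` or `DICT(KEY:SUBKEY)`:

```m4sugar
lt_dict_add([mydict], [color], [blue])
lt_dict_add_subkey([mydict], [color], [shade], [navy])
lt_dict_fetch([mydict], [color])          # expands to `blue'
lt_dict_fetch([mydict], [color], [shade]) # expands to `navy'
```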
|
```rust
/*
*
* This software may be used and distributed according to the terms of the
*/
extern crate proc_macro;
use hgrc_parser::Instruction;
use indexmap::IndexMap;
use proc_macro::TokenStream;
use proc_macro::TokenTree;
/// Generate `StaticConfig` from a static string in rc format:
///
/// ```ignore
/// static_rc!(
/// r#"
/// [section]
/// name = value
/// "#
/// )
/// ```
#[proc_macro]
pub fn static_rc(tokens: TokenStream) -> TokenStream {
// Extract.
let content: String = match extract_string_literal(tokens.clone()) {
Some(content) => content,
None => panic!(
"static_rc requires a single string literal, got: {:?}",
tokens
),
};
// Parse hgrc.
let mut items: Vec<(&str, &str, Option<String>)> = Vec::new();
for inst in hgrc_parser::parse(&content).expect("parse static_rc!") {
match inst {
Instruction::SetConfig {
section,
name,
value,
..
} => {
items.push((section, name, Some(value.to_string())));
}
Instruction::UnsetConfig { section, name, .. } => {
items.push((section, name, None));
}
Instruction::Include { .. } => {
panic!("static_rc! does not support %include");
}
}
}
static_config_from_items(&items)
}
fn extract_string_literal(tokens: TokenStream) -> Option<String> {
let mut result: Option<String> = None;
for token in tokens {
if result.is_some() {
return None;
}
match token {
TokenTree::Literal(lit) => {
// Extract the string out. Note public APIs of "Literal" only provides a way to get
// the content with surrounding " " or r#" "#. Use a naive approach to strip out
// the " ".
let quoted = lit.to_string();
let content = quoted.splitn(2, '"').nth(1)?.rsplitn(2, '"').nth(1)?;
let content = if quoted.starts_with('r') {
content.to_string()
} else {
// Handle escapes naively.
content.replace(r"\n", "\n")
};
result = Some(content);
continue;
}
TokenTree::Group(group) => {
result = extract_string_literal(group.stream());
}
_ => {}
}
}
result
}
/// Generate `StaticConfig` from a static string in rc format:
///
/// ```ignore
/// static_items![
/// ("section1", "name1", "value1"),
/// ("section1", "name2", "value2"),
/// ]
/// ```
#[proc_macro]
pub fn static_items(tokens: TokenStream) -> TokenStream {
let mut items: Vec<(String, String, String)> = Vec::new();
for token in tokens {
if let TokenTree::Group(group) = token {
let tokens: Vec<_> = group.stream().into_iter().collect();
if let [section, _comma1, name, _comma2, value] = &tokens[..] {
let section = extract_string_literal(section.clone().into()).expect("section");
let name = extract_string_literal(name.clone().into()).expect("name");
let value = extract_string_literal(value.clone().into()).expect("value");
items.push((section, name, value));
}
}
}
let items: Vec<(&str, &str, Option<String>)> = items
.iter()
.map(|v| (v.0.as_str(), v.1.as_str(), Some(v.2.clone())))
.collect();
static_config_from_items(&items)
}
/// Generate code for `StaticConfig` for a list of `(section, name, value)`.
/// A `None` `value` means `%unset`. The order of the list is preserved in
/// APIs like `sections()` and `keys()`.
fn static_config_from_items(items: &[(&str, &str, Option<String>)]) -> TokenStream {
let mut sections: IndexMap<&str, IndexMap<&str, Option<String>>> = IndexMap::new();
for (section, name, value) in items {
sections
.entry(section)
.or_default()
.insert(name, value.clone());
}
// Generate code. Looks like:
//
// {
// use staticconfig::phf;
// // Workaround nested map. See path_to_url
// const SECTION1 = phf::phf_ordered_map! {
// "name1" => Some("value1"),
// "name2" => None, // %unset
// };
// const SECTION2 = phf::phf_ordered_map! {
// ...
// };
// ...
// const SECTIONS = phf::phf_ordered_map! {
// "section1" => SECTION1,
// "section2" => SECTION2,
// ...
// };
// staticconfig::StaticConfig {
// name: "StaticConfig",
// sections: SECTIONS,
// }
// }
let mut code = "{ use staticconfig::phf;\n".to_string();
for (i, (_section, items)) in sections.iter().enumerate() {
code += &format!(
"const SECTION{}: phf::OrderedMap<&'static str, Option<&'static str>> = phf::phf_ordered_map! {{\n",
i
);
for (name, value) in items.iter() {
code += &format!(" {:?} => {:?},\n", name, value);
}
code += "};\n";
}
code += "const SECTIONS: phf::OrderedMap<&'static str, phf::OrderedMap<&'static str, Option<&'static str>>> = phf::phf_ordered_map! {\n";
for (i, (section, _items)) in sections.iter().enumerate() {
code += &format!(" {:?} => SECTION{},\n", section, i);
}
code += "};\n";
code += r#"staticconfig::StaticConfig::from_macro_rules(SECTIONS) }"#;
code.parse().unwrap()
}
```
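The naive quote-stripping trick used in `extract_string_literal` above can be shown in isolation — a minimal standalone sketch, not part of the crate:

```rust
// Keep everything between the first and last '"', mirroring the
// splitn/rsplitn chain in the macro above. Works for both "..." and
// r#"..."# because the raw-literal delimiters sit outside the quotes.
fn strip_quotes(quoted: &str) -> Option<&str> {
    quoted.splitn(2, '"').nth(1)?.rsplitn(2, '"').nth(1)
}

fn main() {
    assert_eq!(strip_quotes("\"hello\""), Some("hello"));
    assert_eq!(strip_quotes("r#\"raw\"#"), Some("raw"));
    println!("ok");
}
```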
|
```python
#!/usr/bin/env python2
"""
VDF file reader
This program is free software; you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation; either version 2 of the License, or
(at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public License along
with this program; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
"""
import shlex
def parse_vdf(fileobj):
"""
Converts VDF file or file-like object into python dict
Throws ValueError if profile cannot be parsed.
"""
rv = {}
stack = [ rv ]
lexer = shlex.shlex(fileobj)
key = None
t = lexer.get_token()
while t:
if t == "{":
# Set value to dict and add it on top of stack
if key is None:
raise ValueError("Dict without key")
value = {}
if key in stack[-1]:
lst = ensure_list(stack[-1][key])
lst.append(value)
stack[-1][key] = lst
else:
stack[-1][key] = value
stack.append(value)
key = None
elif t == "}":
# Pop last dict from stack
if len(stack) < 2:
raise ValueError("'}' without '{'")
stack = stack[0:-1]
elif key is None:
key = t.strip('"').lower()
elif key in stack[-1]:
lst = ensure_list(stack[-1][key])
lst.append(t.strip('"'))
stack[-1][key] = lst
key = None
else:
stack[-1][key] = t.strip('"')
key = None
t = lexer.get_token()
if len(stack) > 1:
raise ValueError("'{' without '}'")
return rv
def ensure_list(value):
"""
If value is list, returns same value.
Otherwise, returns [ value ]
"""
return value if type(value) == list else [ value ]
if __name__ == "__main__":
print parse_vdf(file('app_generic.vdf', "r"))
```
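To see what `shlex` hands the parser above, here is a quick Python 3 sketch of the tokenization (the module itself targets Python 2): in non-POSIX mode the lexer keeps the surrounding quotes on string tokens, which is why the code calls `t.strip('"')`, and it returns `''` at end of input, which is what the `while t:` loop relies on.

```python
import io
import shlex

snippet = '"AppState" { "appid" "440" }'
lexer = shlex.shlex(io.StringIO(snippet))
# get_token() yields '' once the input is exhausted (non-POSIX mode).
tokens = list(iter(lexer.get_token, ""))
print(tokens)  # ['"AppState"', '{', '"appid"', '"440"', '}']
```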
|
```json5
{
"Comment": "AWS_SDK_DYNAMODB_PUT_DELETE_ITEM",
"StartAt": "PutItem",
"States": {
"PutItem": {
"Type": "Task",
"Resource": "arn:aws:states:::aws-sdk:dynamodb:putItem",
"Parameters": {
"TableName.$": "$.TableName",
"Item.$": "$.Item"
},
"ResultPath": "$.putItemOutput",
"Next": "DeleteItem"
},
"DeleteItem": {
"Type": "Task",
"Resource": "arn:aws:states:::aws-sdk:dynamodb:deleteItem",
"ResultPath": "$.deleteItemOutput",
"Parameters": {
"TableName.$": "$.TableName",
"Key.$": "$.Key"
},
"End": true
}
}
}
```
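The `.$` suffix in the Parameters blocks above tells Step Functions to resolve the value as a JSONPath against the state input rather than use it literally. A rough Python sketch of that substitution rule — simplified to top-level `$.field` paths only, and a hypothetical helper rather than AWS code:

```python
def resolve_parameters(parameters, state_input):
    """Apply the ASL '.$' convention: keys ending in '.$' take their
    value from the state input via a (here: top-level-only) JSONPath."""
    resolved = {}
    for key, value in parameters.items():
        if key.endswith(".$") and value.startswith("$."):
            # e.g. 'TableName.$': '$.TableName' -> state_input['TableName']
            resolved[key[:-2]] = state_input[value[2:]]
        else:
            resolved[key] = value
    return resolved

params = {"TableName.$": "$.TableName", "Item.$": "$.Item"}
state = {"TableName": "Music", "Item": {"Artist": {"S": "Acme Band"}}}
print(resolve_parameters(params, state))
# {'TableName': 'Music', 'Item': {'Artist': {'S': 'Acme Band'}}}
```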
|
```python
from cStringIO import StringIO
from json.tests import PyTest, CTest
class TestDump(object):
def test_dump(self):
sio = StringIO()
self.json.dump({}, sio)
self.assertEqual(sio.getvalue(), '{}')
def test_dumps(self):
self.assertEqual(self.dumps({}), '{}')
def test_encode_truefalse(self):
self.assertEqual(self.dumps(
{True: False, False: True}, sort_keys=True),
'{"false": true, "true": false}')
self.assertEqual(self.dumps(
{2: 3.0, 4.0: 5L, False: 1, 6L: True}, sort_keys=True),
'{"false": 1, "2": 3.0, "4.0": 5, "6": true}')
# Issue 16228: Crash on encoding resized list
def test_encode_mutated(self):
a = [object()] * 10
def crasher(obj):
del a[-1]
self.assertEqual(self.dumps(a, default=crasher),
'[null, null, null, null, null]')
class TestPyDump(TestDump, PyTest): pass
class TestCDump(TestDump, CTest): pass
```
|
Lorenzo Alcantuz (1741–1782) was a Colombian revolutionary. He was born in Sogamoso. Alongside José Antonio Galán, he was a key leader of the Revolución Comunera of 1781. This is considered to be the most important revolutionary movement in New Granada/Colombia prior to the achievement of national independence in the early 19th century.
The insurrection was triggered by violent riots in Simacota, Mogotes, Barichara and Curití in December 1780. The town of San Gil quickly joined the protests and it was there that Alcantuz carried out the symbolic revolutionary act of trampling on the royal coat of arms, which represented Spanish colonial power. The authorities took a dim view of the insurgency, and the main instigators Alcantuz, Galán, Isidro Molina and Juan Manuel José Ortiz were all hanged in Bogota on 1 February 1782. They were then decapitated and their dead bodies quartered and burned. Alcantuz's head was displayed in San Gil.
An indoor stadium in San Gil is now named after him.
References
Colombian revolutionaries
1741 births
1782 deaths
People from Sogamoso
|
Šarūnas Nakas (born August 2, 1962) is a Lithuanian composer, essayist, curator, filmmaker and broadcaster.
A student of Julius Juzeliūnas at the Lithuanian Academy of Music and Theatre, he graduated in 1986 before going to Poland for further study between 1988 and 1991; he has also worked at IRCAM in Paris (1998). Along the way he took classes with Witold Lutosławski, Louis Andriessen and Gérard Grisey.
In 1982, Nakas founded the Vilnius New Music Ensemble, with which he toured 15 European countries and Canada over eighteen consecutive years. Nakas was also artistic director of the Gaida contemporary music festival (1996) and the Musica Ficta festival (1997), both in Vilnius. In 2001 he wrote a textbook on contemporary music, the first of its kind in Lithuania. Between 2000 and 2007 he was a music editor of the cultural monthly Kultūros barai in Vilnius. Since 2002 he has hosted programs about contemporary music on Lithuanian radio. Since 2011, he has moderated the online magazine for contemporary music modus-radio.com.
Since 2005, he has worked on multi-channel video installations, using his own music as soundtrack. In 2012, Nakas founded the Šarūnas Nakas' Multimedia Quintet.
He was co-curator of the international art exhibition Dialogues of Colour and Sound. Works by M. K. Čiurlionis and his contemporaries in Vilnius (2009).
Nakas was awarded the Lithuanian National Arts and Culture Prize in 2007, and has received numerous other awards for his work.
Selected compositions
Merz-machine for virtual orchestra (1985) or six pianos (1997)
Vox-machine for virtual choir (1985)
Ricercars for seven amplified instruments (1985)
Motet for five voices (1985)
Arcanum for viola and organ (1987)
Sarmatia for organ (1990)
And this Tranquillity for voice, singing violinist, singing trombonist and singing pianist (1991)
Fluctus semigallorum for actress, soprano, bass, viola and trombone (1991)
Cenotaph for chamber ensemble (1995)
Crypt for percussion and double-bass (1996)
Chronon for clarinet, trumpet, piano, percussion, cello and double-bass (1997)
Ziqquratu for flute, clarinet, piano, percussion, violin and cello (1998)
Vilne for tenor, four horns, two pianos and percussion (1998)
Ziqquratu-2 for percussion, chamber ensemble and electronics (1999)
Fight and Escape for ensemble (1999)
Aporia for three chamber groups (2001)
The Cup of Grail for clarinet and two percussion (2001)
Drang nach Westen. A New Sermon to the Barbarians for large ensemble (2003)
Eyes Dazzled by the North for chamber ensemble (2004)
Nude for large orchestra (2004)
Crown for wind orchestra (2005)
Dreamlike Venice, a Flammable Hurricane of Love for violin, soprano, baritone, two actors and chamber orchestra (2006)
Reliquary for voice, electric zither and oboe (2008)
Icon of Fire for chamber ensemble or piano solo (2008)
Resistance for large orchestra (2009)
Kitman for piano (2012)
Tuba mirum for tuba and 10 brass (2012)
Machine désirante for cello and 17 strings (2015)
Target for seven cellos (2017)
Hymn for piano, harpsichord and organ (2017)
Merz-Machine-Postlude for any number of keyboards (2017)
Vilne for voices, piano, percussion and electronics, lyrics by Moshe Kulbak (2017)
A Woman Under the Influence session for a set of kanklės and winds (2017)
Silence and the Mirror for kannel and kanklės (2022)
Women's Gaze session for two orchestras (2022)
very simply for birbynė and kanklės (2023)
Multimedial works
At Heaven's Door performance (2000)
Falling Portraits, Broken Hearts, Eternal Victims performance (2002)
Sanguis Is Blood performance (2003)
I Was Shot performance (2003)
Painting on Sky installation (2005)
Aporia 3-channel video and music installation (2006)
The Gospel According to Blacksmith Ignotas 7-channel video and music installation (2007)
100 hi-STORIES: The National Museum of Musical Legends multimedia installation with series of short silent experimental movies (2008-ongoing)
Čiurlionis as Composer: Soundscapes educational multimedia installation (2009)
Ziqquratu-3 for performer, multi-percussionist, live- and pre-programmed electronics, and video (2012)
Delirium of Apollo for piano, kanklės (zither), prerecorded voice and transmitting objects (2015)
La fenêtre mystique for piano and transmitting objects (2015)
Vilne for voices, piano, percussion and electronics (2017)
Cipher for piano, percussion, musique concrète and radio noises (2017)
Vibrations Cosmiques for piano, synthesizer, percussion, electric guitar, contrabass clarinet and electronics (2018)
A Woman Under The Influence performance with installation for woman instrumentalist and male performer (2019)
How It Is for voice and electronics (2021)
Recordings
Nude (2007) chamber and orchestral music (CD)
Eyes Dazzled by the North (2012) chamber music (CD)
Ziqquratu (2012) electronic music (CD) and multimedia works (DVD)
At Heaven's Door (2016) chamber music and musique concrète
Cipher (2017) chamber and electroacoustic music (CD)
also in:
Experimental Music (1997) incl. Merz Machine (tape music version)
Transmission: pianocircus (2001) incl. Merz Machine (six pianos version)
Zoom in: new music from Lithuania (2002) incl. Chronon
Zoom in 2: new music from Lithuania (2003) incl. Aporia
Gaida (2005) incl. Aporia and Drang nach Westen. A New Sermon to the Barbarians
Zoom in 5: new music from Lithuania (2007) incl. Crown
Zoom in 8: New Music from Lithuania (2010) incl. Icon of Fire
References
Biography
External links
Šarūnas Nakas: A Maverick from the 'Peripheral Village'. The composer in conversation with Asta Pakarklytė, 2008
Essays by Šarūnas Nakas on modus-radio.com
Profile with music samples on mxl.lt
Pieces by Šarūnas Nakas on soundcloud.com
Final part from the Dreamlike Venice, a Flaming Hurricane of Love, 2006
Performance of Merz-machine for six pianos live in London, May 14, 2008
Ziqquratu-3 (with video), 2012
Ziqquratu-3: Secret Thirteen Mix 067, 2013
1962 births
Living people
Lithuanian classical composers
20th-century classical composers
21st-century classical composers
Lithuanian Academy of Music and Theatre alumni
Male classical composers
Contemporary classical composers
20th-century male musicians
21st-century male musicians
|
```scala
package com.lightbend.internal.broker
import akka.persistence.query.Offset
import akka.stream.scaladsl.Source
import com.lightbend.lagom.scaladsl.api.broker.Subscriber
import com.lightbend.lagom.scaladsl.api.broker.Topic
import com.lightbend.lagom.scaladsl.persistence.AggregateEvent
import com.lightbend.lagom.scaladsl.persistence.AggregateEventTag
import scala.collection.immutable
trait InternalTopic[Message] extends Topic[Message] {
final override def topicId: Topic.TopicId =
throw new UnsupportedOperationException("Topic#topicId is not permitted in the service's topic implementation")
final override def subscribe: Subscriber[Message] =
throw new UnsupportedOperationException("Topic#subscribe is not permitted in the service's topic implementation.")
}
final class TaggedOffsetTopicProducer[Message, Event <: AggregateEvent[Event]](
val tags: immutable.Seq[AggregateEventTag[Event]],
val readSideStream: (AggregateEventTag[Event], Offset) => Source[(Message, Offset), _]
) extends InternalTopic[Message]
```
|
```cpp
// Use of this source code is governed by a BSD-style license that can be
// found in the LICENSE file.
#ifndef V8_X87_CODE_STUBS_X87_H_
#define V8_X87_CODE_STUBS_X87_H_
namespace v8 {
namespace internal {
void ArrayNativeCode(MacroAssembler* masm,
bool construct_call,
Label* call_generic_code);
class StringHelper : public AllStatic {
public:
// Generate code for copying characters using the rep movs instruction.
// Copies ecx characters from esi to edi. Copying of overlapping regions is
// not supported.
static void GenerateCopyCharacters(MacroAssembler* masm,
Register dest,
Register src,
Register count,
Register scratch,
String::Encoding encoding);
// Compares two flat one byte strings and returns result in eax.
static void GenerateCompareFlatOneByteStrings(MacroAssembler* masm,
Register left, Register right,
Register scratch1,
Register scratch2,
Register scratch3);
// Compares two flat one byte strings for equality and returns result in eax.
static void GenerateFlatOneByteStringEquals(MacroAssembler* masm,
Register left, Register right,
Register scratch1,
Register scratch2);
private:
static void GenerateOneByteCharsCompareLoop(
MacroAssembler* masm, Register left, Register right, Register length,
Register scratch, Label* chars_not_equal,
Label::Distance chars_not_equal_near = Label::kFar);
DISALLOW_IMPLICIT_CONSTRUCTORS(StringHelper);
};
class NameDictionaryLookupStub: public PlatformCodeStub {
public:
enum LookupMode { POSITIVE_LOOKUP, NEGATIVE_LOOKUP };
NameDictionaryLookupStub(Isolate* isolate, Register dictionary,
Register result, Register index, LookupMode mode)
: PlatformCodeStub(isolate) {
minor_key_ = DictionaryBits::encode(dictionary.code()) |
ResultBits::encode(result.code()) |
IndexBits::encode(index.code()) | LookupModeBits::encode(mode);
}
static void GenerateNegativeLookup(MacroAssembler* masm,
Label* miss,
Label* done,
Register properties,
Handle<Name> name,
Register r0);
static void GeneratePositiveLookup(MacroAssembler* masm,
Label* miss,
Label* done,
Register elements,
Register name,
Register r0,
Register r1);
bool SometimesSetsUpAFrame() override { return false; }
private:
static const int kInlinedProbes = 4;
static const int kTotalProbes = 20;
static const int kCapacityOffset =
NameDictionary::kHeaderSize +
NameDictionary::kCapacityIndex * kPointerSize;
static const int kElementsStartOffset =
NameDictionary::kHeaderSize +
NameDictionary::kElementsStartIndex * kPointerSize;
Register dictionary() const {
return Register::from_code(DictionaryBits::decode(minor_key_));
}
Register result() const {
return Register::from_code(ResultBits::decode(minor_key_));
}
Register index() const {
return Register::from_code(IndexBits::decode(minor_key_));
}
LookupMode mode() const { return LookupModeBits::decode(minor_key_); }
class DictionaryBits: public BitField<int, 0, 3> {};
class ResultBits: public BitField<int, 3, 3> {};
class IndexBits: public BitField<int, 6, 3> {};
class LookupModeBits: public BitField<LookupMode, 9, 1> {};
DEFINE_NULL_CALL_INTERFACE_DESCRIPTOR();
DEFINE_PLATFORM_CODE_STUB(NameDictionaryLookup, PlatformCodeStub);
};
class RecordWriteStub: public PlatformCodeStub {
public:
RecordWriteStub(Isolate* isolate, Register object, Register value,
Register address, RememberedSetAction remembered_set_action,
SaveFPRegsMode fp_mode)
: PlatformCodeStub(isolate),
regs_(object, // An input reg.
address, // An input reg.
value) { // One scratch reg.
minor_key_ = ObjectBits::encode(object.code()) |
ValueBits::encode(value.code()) |
AddressBits::encode(address.code()) |
RememberedSetActionBits::encode(remembered_set_action) |
SaveFPRegsModeBits::encode(fp_mode);
}
RecordWriteStub(uint32_t key, Isolate* isolate)
: PlatformCodeStub(key, isolate), regs_(object(), address(), value()) {}
enum Mode {
STORE_BUFFER_ONLY,
INCREMENTAL,
INCREMENTAL_COMPACTION
};
bool SometimesSetsUpAFrame() override { return false; }
static const byte kTwoByteNopInstruction = 0x3c; // Cmpb al, #imm8.
static const byte kTwoByteJumpInstruction = 0xeb; // Jmp #imm8.
static const byte kFiveByteNopInstruction = 0x3d; // Cmpl eax, #imm32.
static const byte kFiveByteJumpInstruction = 0xe9; // Jmp #imm32.
static Mode GetMode(Code* stub) {
byte first_instruction = stub->instruction_start()[0];
byte second_instruction = stub->instruction_start()[2];
if (first_instruction == kTwoByteJumpInstruction) {
return INCREMENTAL;
}
DCHECK(first_instruction == kTwoByteNopInstruction);
if (second_instruction == kFiveByteJumpInstruction) {
return INCREMENTAL_COMPACTION;
}
DCHECK(second_instruction == kFiveByteNopInstruction);
return STORE_BUFFER_ONLY;
}
static void Patch(Code* stub, Mode mode) {
switch (mode) {
case STORE_BUFFER_ONLY:
DCHECK(GetMode(stub) == INCREMENTAL ||
GetMode(stub) == INCREMENTAL_COMPACTION);
stub->instruction_start()[0] = kTwoByteNopInstruction;
stub->instruction_start()[2] = kFiveByteNopInstruction;
break;
case INCREMENTAL:
DCHECK(GetMode(stub) == STORE_BUFFER_ONLY);
stub->instruction_start()[0] = kTwoByteJumpInstruction;
break;
case INCREMENTAL_COMPACTION:
DCHECK(GetMode(stub) == STORE_BUFFER_ONLY);
stub->instruction_start()[0] = kTwoByteNopInstruction;
stub->instruction_start()[2] = kFiveByteJumpInstruction;
break;
}
DCHECK(GetMode(stub) == mode);
CpuFeatures::FlushICache(stub->instruction_start(), 7);
}
DEFINE_NULL_CALL_INTERFACE_DESCRIPTOR();
private:
// This is a helper class for freeing up 3 scratch registers, where the third
// is always ecx (needed for shift operations). The input is two registers
// that must be preserved and one scratch register provided by the caller.
class RegisterAllocation {
public:
RegisterAllocation(Register object,
Register address,
Register scratch0)
: object_orig_(object),
address_orig_(address),
scratch0_orig_(scratch0),
object_(object),
address_(address),
scratch0_(scratch0) {
DCHECK(!AreAliased(scratch0, object, address, no_reg));
scratch1_ = GetRegThatIsNotEcxOr(object_, address_, scratch0_);
if (scratch0.is(ecx)) {
scratch0_ = GetRegThatIsNotEcxOr(object_, address_, scratch1_);
}
if (object.is(ecx)) {
object_ = GetRegThatIsNotEcxOr(address_, scratch0_, scratch1_);
}
if (address.is(ecx)) {
address_ = GetRegThatIsNotEcxOr(object_, scratch0_, scratch1_);
}
DCHECK(!AreAliased(scratch0_, object_, address_, ecx));
}
void Save(MacroAssembler* masm) {
DCHECK(!address_orig_.is(object_));
DCHECK(object_.is(object_orig_) || address_.is(address_orig_));
DCHECK(!AreAliased(object_, address_, scratch1_, scratch0_));
DCHECK(!AreAliased(object_orig_, address_, scratch1_, scratch0_));
DCHECK(!AreAliased(object_, address_orig_, scratch1_, scratch0_));
// We don't have to save scratch0_orig_ because it was given to us as
// a scratch register. But if we had to switch to a different reg then
// we should save the new scratch0_.
if (!scratch0_.is(scratch0_orig_)) masm->push(scratch0_);
if (!ecx.is(scratch0_orig_) &&
!ecx.is(object_orig_) &&
!ecx.is(address_orig_)) {
masm->push(ecx);
}
masm->push(scratch1_);
if (!address_.is(address_orig_)) {
masm->push(address_);
masm->mov(address_, address_orig_);
}
if (!object_.is(object_orig_)) {
masm->push(object_);
masm->mov(object_, object_orig_);
}
}
void Restore(MacroAssembler* masm) {
// These will have been preserved the entire time, so we just need to move
// them back. Only in one case is the orig_ reg different from the plain
// one, since only one of them can alias with ecx.
if (!object_.is(object_orig_)) {
masm->mov(object_orig_, object_);
masm->pop(object_);
}
if (!address_.is(address_orig_)) {
masm->mov(address_orig_, address_);
masm->pop(address_);
}
masm->pop(scratch1_);
if (!ecx.is(scratch0_orig_) &&
!ecx.is(object_orig_) &&
!ecx.is(address_orig_)) {
masm->pop(ecx);
}
if (!scratch0_.is(scratch0_orig_)) masm->pop(scratch0_);
}
// If we have to call into C then we need to save and restore all caller-
// saved registers that were not already preserved. The caller saved
// registers are eax, ecx and edx. The three scratch registers (incl. ecx)
// will be restored by other means so we don't bother pushing them here.
void SaveCallerSaveRegisters(MacroAssembler* masm, SaveFPRegsMode mode) {
if (!scratch0_.is(eax) && !scratch1_.is(eax)) masm->push(eax);
if (!scratch0_.is(edx) && !scratch1_.is(edx)) masm->push(edx);
if (mode == kSaveFPRegs) {
// Save FPU state in m108byte.
masm->sub(esp, Immediate(108));
masm->fnsave(Operand(esp, 0));
}
}
inline void RestoreCallerSaveRegisters(MacroAssembler* masm,
SaveFPRegsMode mode) {
if (mode == kSaveFPRegs) {
// Restore FPU state in m108byte.
masm->frstor(Operand(esp, 0));
masm->add(esp, Immediate(108));
}
if (!scratch0_.is(edx) && !scratch1_.is(edx)) masm->pop(edx);
if (!scratch0_.is(eax) && !scratch1_.is(eax)) masm->pop(eax);
}
inline Register object() { return object_; }
inline Register address() { return address_; }
inline Register scratch0() { return scratch0_; }
inline Register scratch1() { return scratch1_; }
private:
Register object_orig_;
Register address_orig_;
Register scratch0_orig_;
Register object_;
Register address_;
Register scratch0_;
Register scratch1_;
// Third scratch register is always ecx.
Register GetRegThatIsNotEcxOr(Register r1,
Register r2,
Register r3) {
for (int i = 0; i < Register::NumAllocatableRegisters(); i++) {
Register candidate = Register::FromAllocationIndex(i);
if (candidate.is(ecx)) continue;
if (candidate.is(r1)) continue;
if (candidate.is(r2)) continue;
if (candidate.is(r3)) continue;
return candidate;
}
UNREACHABLE();
return no_reg;
}
friend class RecordWriteStub;
};
enum OnNoNeedToInformIncrementalMarker {
kReturnOnNoNeedToInformIncrementalMarker,
kUpdateRememberedSetOnNoNeedToInformIncrementalMarker
};
inline Major MajorKey() const final { return RecordWrite; }
void Generate(MacroAssembler* masm) override;
void GenerateIncremental(MacroAssembler* masm, Mode mode);
void CheckNeedsToInformIncrementalMarker(
MacroAssembler* masm,
OnNoNeedToInformIncrementalMarker on_no_need,
Mode mode);
void InformIncrementalMarker(MacroAssembler* masm);
void Activate(Code* code) override {
code->GetHeap()->incremental_marking()->ActivateGeneratedStub(code);
}
Register object() const {
return Register::from_code(ObjectBits::decode(minor_key_));
}
Register value() const {
return Register::from_code(ValueBits::decode(minor_key_));
}
Register address() const {
return Register::from_code(AddressBits::decode(minor_key_));
}
RememberedSetAction remembered_set_action() const {
return RememberedSetActionBits::decode(minor_key_);
}
SaveFPRegsMode save_fp_regs_mode() const {
return SaveFPRegsModeBits::decode(minor_key_);
}
class ObjectBits: public BitField<int, 0, 3> {};
class ValueBits: public BitField<int, 3, 3> {};
class AddressBits: public BitField<int, 6, 3> {};
class RememberedSetActionBits: public BitField<RememberedSetAction, 9, 1> {};
class SaveFPRegsModeBits : public BitField<SaveFPRegsMode, 10, 1> {};
RegisterAllocation regs_;
DISALLOW_COPY_AND_ASSIGN(RecordWriteStub);
};
} } // namespace v8::internal
#endif // V8_X87_CODE_STUBS_X87_H_
```
|
```tex
%!TEX TS-program = xelatex
%!TEX encoding = UTF-8 Unicode
\documentclass[11pt,tikz,border=1]{standalone}
\usetikzlibrary{positioning,math}
\input{../plots}
\begin{document}
\manipulateSoftmaxBars{2.5}{-1}{3.2}{-2}
\end{document}
```
|
```objective-c
#import "AppDelegate.h"
#import "ShaderDesignerWindowController.h"
@interface AppDelegate ()
@property (weak) IBOutlet NSWindow *window;
// Hold a strong reference so the controller is not deallocated immediately.
@property (strong) ShaderDesignerWindowController *windowController;
@end
@implementation AppDelegate
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification
{
self.windowController = [[ShaderDesignerWindowController alloc] initWithWindowNibName:@"ShaderDesignerWindowController"];
[self.windowController showWindow:self];
}
@end
```
|
The Russo-Turkish War (1806–1812) between the Russian Empire and the Ottoman Empire was one of the Russo-Turkish Wars. Russia prevailed, but both sides wanted peace as they feared Napoleon's moves to the east.
Background
The war broke out against the background of the Napoleonic Wars. In 1806, Sultan Selim III, encouraged by the Russian defeat at Austerlitz and advised by the French Empire, deposed the pro-Russian Constantine Ypsilantis as Hospodar of the Principality of Wallachia and Alexander Mourousis as Hospodar of Moldavia, both Ottoman vassal states. Simultaneously, the French Empire occupied Dalmatia and threatened to penetrate the Danubian principalities at any time. In order to safeguard the Russian border against a possible French attack, a 40,000-strong Russian contingent advanced into Moldavia and Wallachia. The Sultan reacted by blocking the Dardanelles to Russian ships and declared war on Russia.
Early hostilities
Initially, Emperor Alexander I was reluctant to concentrate large forces against the Ottoman Empire while his relations with Napoleonic France were still uncertain and the main part of his army was occupied fighting against Napoleon in Prussia. A massive Ottoman offensive aimed at Russian-occupied Bucharest, the Wallachian capital, was promptly checked at Obilesti by as few as 4,500 soldiers commanded by Mikhail Miloradovich (June 2, 1807). In Armenia, the 7,000-strong contingent of Count Gudovich destroyed the Turkish force of 20,000 at Arpachai (June 18). In the meantime, the Russian Imperial Navy under Dmitry Senyavin blockaded the Dardanelles and defeated the Ottoman fleet in the Battle of the Dardanelles, after which Selim III was deposed. The Ottoman fleet was destroyed the following month in the Battle of Athos, thus establishing Russian supremacy at sea.
Campaigns of 1808–10
At this point the war might have ended, if it were not for the Peace of Tilsit. The Russian Emperor, constrained by Napoleon to sign an armistice with the Turks, used the time of peace to transfer more Russian soldiers from Prussia to Bessarabia. After the southern army was augmented to 80,000 and the hostilities were resumed, the 76-year-old commander-in-chief Prozorovsky made little progress in more than a year. In August 1809 he was eventually succeeded by Prince Bagration, who promptly crossed the Danube and overran Dobruja. Bagration proceeded to lay siege to Silistra but, on hearing that the 50,000-strong Turkish army approached the city, deemed it wise to evacuate Dobruja and retreat to Bessarabia.
In 1810, the hostilities were renewed by the brothers Nikolay and Sergei Kamensky, who defeated the Ottoman reinforcement heading for Silistra and ousted the Turks from Hacıoğlu Pazarcık (May 22). The position of Silistra now appeared hopeless, and the garrison surrendered on May 30. Ten days later, Kamensky laid siege to another strong fortress, Shumla (or Schumen). His assault on the citadel was repelled with great loss of life, and more bloodshed followed during the storming of the Danubian port of Rousse (or Rustchuk) on 22 July. The latter fortress did not fall to the Russians until 9 September, after Kamensky's army had surprised and routed a huge Turkish detachment at Batin on 26 August. On 26 October, Kamensky again defeated a 40,000-strong army of Osman Pasha at Vidin. The Russians lost only 1,500 men, compared with 10,000 for their opponents.
However, the young Nikolay Kamensky caught a serious illness on February 4, 1811 and died soon thereafter, leaving the army under the command of Louis Alexandre Andrault de Langeron. To this point, although the Russians had won many battles, they had failed to achieve a decisive victory that would force the Ottomans to end the war. Furthermore, relations between France and Russia were quickly becoming strained, pointing to an inevitable renewal of hostilities between the two countries. Russia needed to end the southern war quickly in order to concentrate on dealing with Napoleon. In this situation, Tsar Alexander appointed his disfavoured general Mikhail Kutuzov as the new commander of the Russian force.
Kutuzov's campaign (1811)
Kutuzov's first action upon taking command was to reduce the size of the garrisons in the fortresses along the Danube and retreat back into Wallachia. The Russian withdrawal induced the Turks to launch a counter-offensive to recapture lost territory. In the spring of 1811, 60,000 Turkish troops led by Grand Vizier Ahmed Pasha gathered at Šumnu, the strongest fortress in Ottoman Bulgaria, and set out on a campaign to confront Kutuzov's army. Kutuzov's army was also large, at 46,000 soldiers; however, he was responsible for protecting the entire 600-mile Danube River border between Wallachia and Ottoman Bulgaria.
On 22 June 1811, the two forces met in battle at Rusçuk on the Danube. After a long struggle, the Russians successfully repelled Ahmed Pasha's larger army. A few days later as the Turks were preparing to attack the Russians in the Rusçuk fortress, Kutuzov ordered his forces to cross the Danube and retreat back into Wallachia.
Believing that the Russians were trying to escape, Ahmed Pasha decided to launch an attack. On 28 August, 36,000 Turkish troops began to cross the Danube River to assault the Russians. The Turkish force established a fortified bridgehead on the left bank of the river near the small village of Slobozia where they were quickly surrounded by two divisions of Kutuzov's army. The remaining 20,000 men of Ahmed Pasha's army remained at the Turkish field camp on the right bank near Rusçuk where they guarded the munitions and supplies. On the night of 1 October 1811, however, a Russian detachment of 7,500 men secretly crossed the Danube. In the morning the Russians overwhelmed the Turkish troops in a surprise attack. The Turks panicked and scattered, suffering 2,000 casualties. Thereafter, the Russian forces completely enveloped the Turkish bridgehead on the left bank of the Danube and initiated an all-out artillery attack.
For approximately six weeks, the Russians besieged and bombarded the Turkish bridgehead. Surrounded, with their supply lines cut, the Turks suffered not only from persistent Russian bombardment but also from malnutrition and disease. A ceasefire was agreed on 25 October, and approximately three weeks later, on 14 November 1811, Ahmed Pasha agreed to a truce and formally surrendered to Kutuzov. The magnitude of the Turkish defeat, with 36,000 casualties, ended the war along the Danube and led to peace negotiations that ultimately resulted in the signing of the Treaty of Bucharest on 28 May 1812.
Caucasus front
Six years of war on the eastern front left the border unchanged. Fighting here was more serious than during the Russo-Turkish War of 1787–1792, but it was still a sideshow to the main action. Russia crossed the Caucasus and annexed Georgia, the western half of which had been nominally Turkish. It had also taken the Persian vassal khanates along the Caspian coast and east of Georgia. The area around modern Armenia (Erivan Khanate and Nakhichevan Khanate) was still Persian. Russia was also at war with Persia, but the Turks and Persians did not help each other. A large part of the Russian army was tied up because of Napoleon's threat in the west. The Russian viceroys were Ivan Gudovich (1806), Alexander Tormasov (1809), Filippo Paulucci (1811) and Nikolay Rtishchev (1812).
Fighting with Turkey began in 1807 with the swift seizure of Anapa by Admiral Pustoshkin. Gudovich led his main force toward Akhaltsikhe but lost 900 men while trying to storm Akhalkalaki and withdrew to Georgia. Secondary campaigns against Kars and Poti also failed. The Turks took the offensive, failed three times to take Gyumri, and were then completely defeated by Gudovich at the Battle of Arpachai. He was congratulated by the Shah, an interesting comment on the relations between the two Muslim empires. Gudovich was replaced by Count Tormasov, who arrived about April 1809. In 1810 Poti on the coast was captured. A Turkish invasion was blocked by General Paulucci under the walls of Akhalkalaki. In November 1810 a Russian attack on Akhaltsikhe failed due to an outbreak of plague. In 1811 Tormasov was recalled at his own request and replaced by Paulucci in Transcaucasia, with Rtishchev taking over the Northern Line. In 1811 more troops were withdrawn to deal with the expected threat from Napoleon. The Turks and Persians agreed on a joint attack toward Gyumri and met at 'Magasberd' {location?} on 30 August 1811. There a Kurd assassinated the Serasker of Erzurum, and the combined force broke up.
Paulucci sent Pyotr Kotlyarevsky against Akhalkalaki. He made a forced march over the snow-covered mountains, avoiding the main roads, attacked at night, and had storming parties on the walls before the Turks knew the Russians were there. By the morning of 10 December he held the fort with a loss of only 30 killed and wounded. For this he was promoted to major-general at the age of 29. On 21 February 1812, 5,000 Turks failed to retake Akhalkalaki. Three days later they were defeated at Parghita {location?}. Paulucci was sent west to command troops against Napoleon, and Rtishchev became commander of forces on both sides of the Caucasus mountains.
Russia decided to make peace, which was concluded in the Treaty of Bucharest (1812).
Aftermath
According to the treaty, the Ottoman Empire ceded the eastern half of Moldavia to Russia (which renamed the territory Bessarabia), although it had committed to protecting that region. Russia became a new power in the lower Danube area and gained a frontier that was economically, diplomatically, and militarily advantageous.
In Transcaucasia, Turkey regained nearly all it had lost in the east: Poti, Anapa and Akhalkalaki. Russia retained Sukhum-Kale on the Abkhazian coast. In return, the Sultan accepted the Russian annexation of the Kingdom of Imereti, carried out in 1810.
The treaty was approved by Alexander I of Russia on June 11, some 13 days before Napoleon's invasion of Russia began. The commanders were able to get many of the Russian soldiers in the Balkans back to the western areas before the expected attack of Napoleon.
See also
First Serbian Uprising
Citations
Bibliography
Conflicts in 1806
Conflicts in 1807
Conflicts in 1808
Conflicts in 1809
Conflicts in 1810
Conflicts in 1811
Conflicts in 1812
Russo-Turkish wars
Napoleonic Wars
19th century in Armenia
Military history of Georgia (country)
1800s in Romania
1810s in Romania
Ottoman Greece
1800s in the Ottoman Empire
1810s in the Ottoman Empire
19th century in Georgia (country)
1800s in the Russian Empire
1810s in the Russian Empire
1806 in the Russian Empire
1812 in the Russian Empire
1806 in the Ottoman Empire
1812 in the Ottoman Empire
|
```javascript
/**
* @license Apache-2.0
*
* path_to_url
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
*/
'use strict';
/**
* Element-wise addition of two strided arrays.
*
* @module @stdlib/math/strided/ops/add
*
* @example
* var Float64Array = require( '@stdlib/array/float64' );
* var add = require( '@stdlib/math/strided/ops/add' );
*
* var x = new Float64Array( [ -2.0, 1.0, 3.0, -5.0, 4.0 ] );
* var y = new Float64Array( [ 1.0, 2.0, 3.0, 4.0, 5.0 ] );
* var z = new Float64Array( [ 0.0, 0.0, 0.0, 0.0, 0.0 ] );
*
* add( x.length, 'float64', x, 1, 'float64', y, 1, 'float64', z, 1 );
* // z => <Float64Array>[ -1.0, 3.0, 6.0, -1.0, 9.0 ]
*
* @example
* var Float64Array = require( '@stdlib/array/float64' );
* var add = require( '@stdlib/math/strided/ops/add' );
*
* var x = new Float64Array( [ -2.0, 1.0, 3.0, -5.0, 4.0 ] );
* var y = new Float64Array( [ 1.0, 2.0, 3.0, 4.0, 5.0 ] );
* var z = new Float64Array( [ 0.0, 0.0, 0.0, 0.0, 0.0 ] );
*
* add.ndarray( x.length, 'float64', x, 1, 0, 'float64', y, 1, 0, 'float64', z, 1, 0 );
* // z => <Float64Array>[ -1.0, 3.0, 6.0, -1.0, 9.0 ]
*/
// MODULES //
var join = require( 'path' ).join;
var tryRequire = require( '@stdlib/utils/try-require' );
var javascript = require( './main.js' );
// MAIN //
var main;
var tmp = tryRequire( join( __dirname, './native.js' ) );
if ( tmp instanceof Error ) {
main = javascript;
} else {
main = tmp;
}
// EXPORTS //
module.exports = main;
// exports: { "ndarray": "main.ndarray" }
```
|
```csharp
// Licensed to the .NET Foundation under one or more agreements.
// The .NET Foundation licenses this file to you under the MIT license.
namespace Microsoft.TemplateEngine.Abstractions;
/// <summary>
/// Indicates parameter defined precedence.
/// </summary>
public enum PrecedenceDefinition
{
// If the enable condition is set, the parameter is conditionally disabled (regardless of whether the required condition is also set).
// A parameter is conditionally required if and only if only the required condition is set.
/// <summary>
/// Parameter value is unconditionally required.
/// </summary>
Required,
/// <summary>
/// Set if and only if only the IsRequiredCondition is set.
/// </summary>
ConditionalyRequired,
/// <summary>
/// Parameter value is not required from user.
/// </summary>
Optional,
/// <summary>
/// Parameter value is implicitly populated.
/// </summary>
Implicit,
/// <summary>
/// Parameter might become disabled - value would not be needed nor used in such case.
/// </summary>
ConditionalyDisabled,
/// <summary>
/// Parameter is disabled - its value is not required and will not be used.
/// </summary>
Disabled,
}
```
|
```javascript
/*
* The Original Code is Mozilla Universal charset detector code.
*
* The Initial Developer of the Original Code is
* Netscape Communications Corporation.
* the Initial Developer. All Rights Reserved.
*
* Contributor(s):
*   António Afonso (antonio.afonso gmail.com) - port to JavaScript
*   Mark Pilgrim - port to Python
*   Shy Shalom - original C code
*
* This library is free software; you can redistribute it and/or
* modify it under the terms of the GNU Lesser General Public
* License as published by the Free Software Foundation; either
* version 2.1 of the License, or (at your option) any later version.
*
* This library is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public
* License along with this library; if not, write to the Free Software
* Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA
* 02110-1301 USA
*/
!function(jschardet) {
// 255: Control characters that usually do not exist in any text
// 254: Carriage/Return
// 253: symbol (punctuation) that does not belong to word
// 252: 0 - 9
// The following result for Thai was collected from a limited sample (1M).
// Character Mapping Table:
jschardet.TIS620CharToOrderMap = [
255,255,255,255,255,255,255,255,255,255,254,255,255,254,255,255, // 00
255,255,255,255,255,255,255,255,255,255,255,255,255,255,255,255, // 10
253,253,253,253,253,253,253,253,253,253,253,253,253,253,253,253, // 20
252,252,252,252,252,252,252,252,252,252,253,253,253,253,253,253, // 30
253,182,106,107,100,183,184,185,101, 94,186,187,108,109,110,111, // 40
188,189,190, 89, 95,112,113,191,192,193,194,253,253,253,253,253, // 50
253, 64, 72, 73,114, 74,115,116,102, 81,201,117, 90,103, 78, 82, // 60
96,202, 91, 79, 84,104,105, 97, 98, 92,203,253,253,253,253,253, // 70
209,210,211,212,213, 88,214,215,216,217,218,219,220,118,221,222,
223,224, 99, 85, 83,225,226,227,228,229,230,231,232,233,234,235,
236, 5, 30,237, 24,238, 75, 8, 26, 52, 34, 51,119, 47, 58, 57,
49, 53, 55, 43, 20, 19, 44, 14, 48, 3, 17, 25, 39, 62, 31, 54,
45, 9, 16, 2, 61, 15,239, 12, 42, 46, 18, 21, 76, 4, 66, 63,
22, 10, 1, 36, 23, 13, 40, 27, 32, 35, 86,240,241,242,243,244,
11, 28, 41, 29, 33,245, 50, 37, 6, 7, 67, 77, 38, 93,246,247,
68, 56, 59, 65, 69, 60, 70, 80, 71, 87,248,249,250,251,252,253
];
// Model Table:
// total sequences: 100%
// first 512 sequences: 92.6386%
// first 1024 sequences: 7.3177%
// rest sequences: 1.0230%
// negative sequences: 0.0436%
jschardet.ThaiLangModel = [
0,1,3,3,3,3,0,0,3,3,0,3,3,0,3,3,3,3,3,3,3,3,0,0,3,3,3,0,3,3,3,3,
0,3,3,0,0,0,1,3,0,3,3,2,3,3,0,1,2,3,3,3,3,0,2,0,2,0,0,3,2,1,2,2,
3,0,3,3,2,3,0,0,3,3,0,3,3,0,3,3,3,3,3,3,3,3,3,0,3,2,3,0,2,2,2,3,
0,2,3,0,0,0,0,1,0,1,2,3,1,1,3,2,2,0,1,1,0,0,1,0,0,0,0,0,0,0,1,1,
3,3,3,2,3,3,3,3,3,3,3,3,3,3,3,2,2,2,2,2,2,2,3,3,2,3,2,3,3,2,2,2,
3,1,2,3,0,3,3,2,2,1,2,3,3,1,2,0,1,3,0,1,0,0,1,0,0,0,0,0,0,0,1,1,
3,3,2,2,3,3,3,3,1,2,3,3,3,3,3,2,2,2,2,3,3,2,2,3,3,2,2,3,2,3,2,2,
3,3,1,2,3,1,2,2,3,3,1,0,2,1,0,0,3,1,2,1,0,0,1,0,0,0,0,0,0,1,0,1,
3,3,3,3,3,3,2,2,3,3,3,3,2,3,2,2,3,3,2,2,3,2,2,2,2,1,1,3,1,2,1,1,
3,2,1,0,2,1,0,1,0,1,1,0,1,1,0,0,1,0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,
3,3,3,2,3,2,3,3,2,2,3,2,3,3,2,3,1,1,2,3,2,2,2,3,2,2,2,2,2,1,2,1,
2,2,1,1,3,3,2,1,0,1,2,2,0,1,3,0,0,0,1,1,0,0,0,0,0,2,3,0,0,2,1,1,
3,3,2,3,3,2,0,0,3,3,0,3,3,0,2,2,3,1,2,2,1,1,1,0,2,2,2,0,2,2,1,1,
0,2,1,0,2,0,0,2,0,1,0,0,1,0,0,0,1,1,1,1,0,0,0,0,0,0,0,0,0,0,1,0,
3,3,2,3,3,2,0,0,3,3,0,2,3,0,2,1,2,2,2,2,1,2,0,0,2,2,2,0,2,2,1,1,
0,2,1,0,2,0,0,2,0,1,1,0,1,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,
3,3,2,3,2,3,2,0,2,2,1,3,2,1,3,2,1,2,3,2,2,3,0,2,3,2,2,1,2,2,2,2,
1,2,2,0,0,0,0,2,0,1,2,0,1,1,1,0,1,0,3,1,1,0,0,0,0,0,0,0,0,0,1,0,
3,3,2,3,3,2,3,2,2,2,3,2,2,3,2,2,1,2,3,2,2,3,1,3,2,2,2,3,2,2,2,3,
3,2,1,3,0,1,1,1,0,2,1,1,1,1,1,0,1,0,1,1,0,0,0,0,0,0,0,0,0,2,0,0,
1,0,0,3,0,3,3,3,3,3,0,0,3,0,2,2,3,3,3,3,3,0,0,0,1,1,3,0,0,0,0,2,
0,0,1,0,0,0,0,0,0,0,2,3,0,0,0,3,0,2,0,0,0,0,0,3,0,0,0,0,0,0,0,0,
2,0,3,3,3,3,0,0,2,3,0,0,3,0,3,3,2,3,3,3,3,3,0,0,3,3,3,0,0,0,3,3,
0,0,3,0,0,0,0,2,0,0,2,1,1,3,0,0,1,0,0,2,3,0,1,0,0,0,0,0,0,0,1,0,
3,3,3,3,2,3,3,3,3,3,3,3,1,2,1,3,3,2,2,1,2,2,2,3,1,1,2,0,2,1,2,1,
2,2,1,0,0,0,1,1,0,1,0,1,1,0,0,0,0,0,1,1,0,0,1,0,0,0,0,0,0,0,0,0,
3,0,2,1,2,3,3,3,0,2,0,2,2,0,2,1,3,2,2,1,2,1,0,0,2,2,1,0,2,1,2,2,
0,1,1,0,0,0,0,1,0,1,1,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,3,2,1,3,3,1,1,3,0,2,3,1,1,3,2,1,1,2,0,2,2,3,2,1,1,1,1,1,2,
3,0,0,1,3,1,2,1,2,0,3,0,0,0,1,0,3,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,
3,3,1,1,3,2,3,3,3,1,3,2,1,3,2,1,3,2,2,2,2,1,3,3,1,2,1,3,1,2,3,0,
2,1,1,3,2,2,2,1,2,1,0,0,1,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,
3,3,2,3,2,3,3,2,3,2,3,2,3,3,2,1,0,3,2,2,2,1,2,2,2,1,2,2,1,2,1,1,
2,2,2,3,0,1,3,1,1,1,1,0,1,1,0,2,1,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,3,2,3,2,2,1,1,3,2,3,2,3,2,0,3,2,2,1,2,0,2,2,2,1,2,2,2,2,1,
3,2,1,2,2,1,0,2,0,1,0,0,1,1,0,0,0,0,0,1,1,0,1,0,0,0,0,0,0,0,0,1,
3,3,3,3,3,2,3,1,2,3,3,2,2,3,0,1,1,2,0,3,3,2,2,3,0,1,1,3,0,0,0,0,
3,1,0,3,3,0,2,0,2,1,0,0,3,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,2,3,2,3,3,0,1,3,1,1,2,1,2,1,1,3,1,1,0,2,3,1,1,1,1,1,1,1,1,
3,1,1,2,2,2,2,1,1,1,0,0,2,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,
3,2,2,1,1,2,1,3,3,2,3,2,2,3,2,2,3,1,2,2,1,2,0,3,2,1,2,2,2,2,2,1,
3,2,1,2,2,2,1,1,1,1,0,0,1,1,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,3,3,3,3,3,1,3,3,0,2,1,0,3,2,0,0,3,1,0,1,1,0,1,0,0,0,0,0,1,
1,0,0,1,0,3,2,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,0,2,2,2,3,0,0,1,3,0,3,2,0,3,2,2,3,3,3,3,3,1,0,2,2,2,0,2,2,1,2,
0,2,3,0,0,0,0,1,0,1,0,0,1,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,
3,0,2,3,1,3,3,2,3,3,0,3,3,0,3,2,2,3,2,3,3,3,0,0,2,2,3,0,1,1,1,3,
0,0,3,0,0,0,2,2,0,1,3,0,1,2,2,2,3,0,0,0,0,0,1,0,0,0,0,0,0,0,0,1,
3,2,3,3,2,0,3,3,2,2,3,1,3,2,1,3,2,0,1,2,2,0,2,3,2,1,0,3,0,0,0,0,
3,0,0,2,3,1,3,0,0,3,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,1,3,2,2,2,1,2,0,1,3,1,1,3,1,3,0,0,2,1,1,1,1,2,1,1,1,0,2,1,0,1,
1,2,0,0,0,3,1,1,0,0,0,0,1,0,1,0,0,1,0,1,0,0,0,0,0,3,1,0,0,0,1,0,
3,3,3,3,2,2,2,2,2,1,3,1,1,1,2,0,1,1,2,1,2,1,3,2,0,0,3,1,1,1,1,1,
3,1,0,2,3,0,0,0,3,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,2,3,0,3,3,0,2,0,0,0,0,0,0,0,3,0,0,1,0,0,0,0,0,0,0,0,0,0,0,
0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,2,3,1,3,0,0,1,2,0,0,2,0,3,3,2,3,3,3,2,3,0,0,2,2,2,0,0,0,2,2,
0,0,1,0,0,0,0,3,0,0,0,0,2,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,
0,0,0,3,0,2,0,0,0,0,0,0,0,0,0,0,1,2,3,1,3,3,0,0,1,0,3,0,0,0,0,0,
0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,1,2,3,1,2,3,1,0,3,0,2,2,1,0,2,1,1,2,0,1,0,0,1,1,1,1,0,1,0,0,
1,0,0,0,0,1,1,0,3,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,3,3,2,1,0,1,1,1,3,1,2,2,2,2,2,2,1,1,1,1,0,3,1,0,1,3,1,1,1,1,
1,1,0,2,0,1,3,1,1,0,0,1,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,2,0,1,
3,0,2,2,1,3,3,2,3,3,0,1,1,0,2,2,1,2,1,3,3,1,0,0,3,2,0,0,0,0,2,1,
0,1,0,0,0,0,1,2,0,1,1,3,1,1,2,2,1,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,
0,0,3,0,0,1,0,0,0,3,0,0,3,0,3,1,0,1,1,1,3,2,0,0,0,3,0,0,0,0,2,0,
0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,2,0,0,0,0,0,0,0,0,0,
3,3,1,3,2,1,3,3,1,2,2,0,1,2,1,0,1,2,0,0,0,0,0,3,0,0,0,3,0,0,0,0,
3,0,0,1,1,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,0,1,2,0,3,3,3,2,2,0,1,1,0,1,3,0,0,0,2,2,0,0,0,0,3,1,0,1,0,0,0,
0,0,0,0,0,0,0,0,0,1,0,1,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,0,2,3,1,2,0,0,2,1,0,3,1,0,1,2,0,1,1,1,1,3,0,0,3,1,1,0,2,2,1,1,
0,2,0,0,0,0,0,1,0,1,0,0,1,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,0,0,3,1,2,0,0,2,2,0,1,2,0,1,0,1,3,1,2,1,0,0,0,2,0,3,0,0,0,1,0,
0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,0,1,1,2,2,0,0,0,2,0,2,1,0,1,1,0,1,1,1,2,1,0,0,1,1,1,0,2,1,1,1,
0,1,1,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,1,0,1,
0,0,0,2,0,1,3,1,1,1,1,0,0,0,0,3,2,0,1,0,0,0,1,2,0,0,0,1,0,0,0,0,
0,0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,3,3,3,3,1,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,0,2,3,2,2,0,0,0,1,0,0,0,0,2,3,2,1,2,2,3,0,0,0,2,3,1,0,0,0,1,1,
0,0,1,0,0,0,0,0,0,0,1,0,0,1,0,0,0,0,0,1,1,0,1,0,0,0,0,0,0,0,0,0,
3,3,2,2,0,1,0,0,0,0,2,0,2,0,1,0,0,0,1,1,0,0,0,2,1,0,1,0,1,1,0,0,
0,1,0,2,0,0,1,0,3,0,1,0,0,0,2,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,1,0,0,1,0,0,0,0,0,1,1,2,0,0,0,0,1,0,0,1,3,1,0,0,0,0,1,1,0,0,
0,1,0,0,0,0,3,0,0,0,0,0,0,3,0,0,0,0,0,0,0,3,0,0,0,0,0,0,0,0,0,0,
3,3,1,1,1,1,2,3,0,0,2,1,1,1,1,1,0,2,1,1,0,0,0,2,1,0,1,2,1,1,0,1,
2,1,0,3,0,0,0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,3,1,0,0,0,0,0,0,0,3,0,0,0,3,0,0,0,0,0,0,0,0,1,1,0,0,0,0,0,0,1,
0,0,0,2,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,3,2,0,0,0,0,0,0,1,2,1,0,1,1,0,2,0,0,1,0,0,2,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,2,0,0,0,1,3,0,1,0,0,0,2,0,0,0,0,0,0,0,1,2,0,0,0,0,0,
3,3,0,0,1,1,2,0,0,1,2,1,0,1,1,1,0,1,1,0,0,2,1,1,0,1,0,0,1,1,1,0,
0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,3,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,2,2,1,0,0,0,0,1,0,0,0,0,3,0,0,0,0,0,0,0,0,0,3,0,0,0,0,0,0,0,0,
2,0,0,0,0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,3,0,0,1,1,0,0,0,2,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
1,1,0,1,2,0,1,2,0,0,1,1,0,2,0,1,0,0,1,0,0,0,0,1,0,0,0,2,0,0,0,0,
1,0,0,1,0,1,1,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,1,0,0,0,0,0,0,0,1,1,0,1,1,0,2,1,3,0,0,0,0,1,1,0,0,0,0,0,0,0,3,
1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,0,1,0,1,0,0,2,0,0,2,0,0,1,1,2,0,0,1,1,0,0,0,1,0,0,0,1,1,0,0,0,
1,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,
1,0,0,3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,1,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
3,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,1,1,0,0,0,
2,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,3,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,0,0,0,0,2,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,1,0,1,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,3,0,0,0,
2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,1,0,0,0,0,
1,0,0,0,0,0,0,0,0,1,0,0,0,0,2,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,1,1,0,0,2,1,0,0,1,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
];
jschardet.TIS620ThaiModel = {
"charToOrderMap" : jschardet.TIS620CharToOrderMap,
"precedenceMatrix" : jschardet.ThaiLangModel,
"mTypicalPositiveRatio" : 0.926386,
"keepEnglishLetter" : false,
"charsetName" : "TIS-620"
};
}(require('./init'));
```
|
```xml
<Project Sdk="Microsoft.NET.Sdk">
<Import Project="..\..\..\buildtools\common.props" />
<PropertyGroup>
<TargetFrameworks>netstandard2.0;net5.0;net6.0;net8.0</TargetFrameworks>
<VersionPrefix>1.10.0</VersionPrefix>
<Description>Provides a bootstrap and Lambda Runtime API Client to help you to develop custom .NET Core Lambda Runtimes.</Description>
<AssemblyTitle>Amazon.Lambda.RuntimeSupport</AssemblyTitle>
<AssemblyName>Amazon.Lambda.RuntimeSupport</AssemblyName>
<PackageId>Amazon.Lambda.RuntimeSupport</PackageId>
<PackageTags>AWS;Amazon;Lambda</PackageTags>
<PackageReadmeFile>README.md</PackageReadmeFile>
<GenerateAssemblyVersionAttribute>true</GenerateAssemblyVersionAttribute>
<GenerateAssemblyFileVersionAttribute>true</GenerateAssemblyFileVersionAttribute>
</PropertyGroup>
<PropertyGroup Condition=" '$(ExecutableOutputType)'=='true' ">
<OutputType>Exe</OutputType>
</PropertyGroup>
<PropertyGroup Condition="'$(TargetFramework)' == 'net8.0'">
<WarningsAsErrors>IL2026,IL2067,IL2075</WarningsAsErrors>
<IsTrimmable>true</IsTrimmable>
<EnableTrimAnalyzer>true</EnableTrimAnalyzer>
</PropertyGroup>
<ItemGroup>
<None Include="README.md" Pack="true" PackagePath="\" />
</ItemGroup>
<ItemGroup Condition="'$(TargetFramework)' == 'netstandard2.0'">
<PackageReference Include="System.Runtime.Loader" Version="4.3.0" />
<PackageReference Include="System.Text.Json" Version="6.0.0" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\Amazon.Lambda.Core\Amazon.Lambda.Core.csproj" />
</ItemGroup>
<ItemGroup>
<None Update="bootstrap.sh">
<CopyToOutputDirectory>Always</CopyToOutputDirectory>
</None>
<None Update="bootstrap-al2023.sh">
<CopyToOutputDirectory>Always</CopyToOutputDirectory>
</None>
</ItemGroup>
</Project>
```
|
```c++
/*
* This file is part of Scylla.
*
* Scylla is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* Scylla is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU Affero General Public License for more details.
*
* You should have received a copy of the GNU Affero General Public License
* along with Scylla. If not, see <path_to_url
*/
#include <boost/range/adaptor/transformed.hpp>
#include <boost/range/algorithm/copy.hpp>
#include <boost/range/algorithm_ext/push_back.hpp>
#include <boost/range/size.hpp>
#include <seastar/core/thread.hh>
#include <seastar/util/defer.hh>
#include "partition_version.hh"
#include "partition_snapshot_row_cursor.hh"
#include "partition_snapshot_reader.hh"
#include "tests/test-utils.hh"
#include "tests/mutation_assertions.hh"
#include "tests/simple_schema.hh"
#include "tests/mutation_source_test.hh"
#include "tests/failure_injecting_allocation_strategy.hh"
#include "tests/range_tombstone_list_assertions.hh"
#include "real_dirty_memory_accounter.hh"
using namespace std::chrono_literals;
// Verifies that tombstones in "list" are monotonic, overlap with the requested range,
// and have information equivalent with "expected" in that range.
static
void check_tombstone_slice(const schema& s, std::vector<range_tombstone> list,
const query::clustering_range& range,
std::initializer_list<range_tombstone> expected)
{
range_tombstone_list actual(s);
position_in_partition::less_compare less(s);
position_in_partition prev_pos = position_in_partition::before_all_clustered_rows();
for (auto&& rt : list) {
if (!less(rt.position(), position_in_partition::for_range_end(range))) {
BOOST_FAIL(sprint("Range tombstone out of range: %s, range: %s", rt, range));
}
if (!less(position_in_partition::for_range_start(range), rt.end_position())) {
BOOST_FAIL(sprint("Range tombstone out of range: %s, range: %s", rt, range));
}
if (!less(prev_pos, rt.position())) {
BOOST_FAIL(sprint("Range tombstone breaks position monotonicity: %s, list: %s", rt, list));
}
prev_pos = position_in_partition(rt.position());
actual.apply(s, rt);
}
actual.trim(s, query::clustering_row_ranges{range});
range_tombstone_list expected_list(s);
for (auto&& rt : expected) {
expected_list.apply(s, rt);
}
expected_list.trim(s, query::clustering_row_ranges{range});
assert_that(s, actual).is_equal_to(expected_list);
}
SEASTAR_TEST_CASE(test_range_tombstone_slicing) {
return seastar::async([] {
logalloc::region r;
mutation_cleaner cleaner(r, no_cache_tracker);
simple_schema table;
auto s = table.schema();
with_allocator(r.allocator(), [&] {
logalloc::reclaim_lock l(r);
auto rt1 = table.make_range_tombstone(table.make_ckey_range(1, 2));
auto rt2 = table.make_range_tombstone(table.make_ckey_range(4, 7));
auto rt3 = table.make_range_tombstone(table.make_ckey_range(6, 9));
mutation_partition m1(s);
m1.apply_delete(*s, rt1);
m1.apply_delete(*s, rt2);
m1.apply_delete(*s, rt3);
partition_entry e(mutation_partition(*s, m1));
auto snap = e.read(r, cleaner, s, no_cache_tracker);
auto check_range = [&s] (partition_snapshot& snap, const query::clustering_range& range,
std::initializer_list<range_tombstone> expected) {
auto tombstones = snap.range_tombstones(
position_in_partition::for_range_start(range),
position_in_partition::for_range_end(range));
check_tombstone_slice(*s, tombstones, range, expected);
};
check_range(*snap, table.make_ckey_range(0, 0), {});
check_range(*snap, table.make_ckey_range(1, 1), {rt1});
check_range(*snap, table.make_ckey_range(3, 4), {rt2});
check_range(*snap, table.make_ckey_range(3, 5), {rt2});
check_range(*snap, table.make_ckey_range(3, 6), {rt2, rt3});
check_range(*snap, table.make_ckey_range(6, 6), {rt2, rt3});
check_range(*snap, table.make_ckey_range(7, 10), {rt2, rt3});
check_range(*snap, table.make_ckey_range(8, 10), {rt3});
check_range(*snap, table.make_ckey_range(10, 10), {});
check_range(*snap, table.make_ckey_range(0, 10), {rt1, rt2, rt3});
auto rt4 = table.make_range_tombstone(table.make_ckey_range(1, 2));
auto rt5 = table.make_range_tombstone(table.make_ckey_range(5, 8));
mutation_partition m2(s);
m2.apply_delete(*s, rt4);
m2.apply_delete(*s, rt5);
auto&& v2 = e.add_version(*s, no_cache_tracker);
v2.partition().apply_weak(*s, m2, *s);
auto snap2 = e.read(r, cleaner, s, no_cache_tracker);
check_range(*snap2, table.make_ckey_range(0, 0), {});
check_range(*snap2, table.make_ckey_range(1, 1), {rt4});
check_range(*snap2, table.make_ckey_range(3, 4), {rt2});
check_range(*snap2, table.make_ckey_range(3, 5), {rt2, rt5});
check_range(*snap2, table.make_ckey_range(3, 6), {rt2, rt3, rt5});
check_range(*snap2, table.make_ckey_range(4, 4), {rt2});
check_range(*snap2, table.make_ckey_range(5, 5), {rt2, rt5});
check_range(*snap2, table.make_ckey_range(6, 6), {rt2, rt3, rt5});
check_range(*snap2, table.make_ckey_range(7, 10), {rt2, rt3, rt5});
check_range(*snap2, table.make_ckey_range(8, 8), {rt3, rt5});
check_range(*snap2, table.make_ckey_range(9, 9), {rt3});
check_range(*snap2, table.make_ckey_range(8, 10), {rt3, rt5});
check_range(*snap2, table.make_ckey_range(10, 10), {});
check_range(*snap2, table.make_ckey_range(0, 10), {rt4, rt2, rt3, rt5});
});
});
}
class mvcc_partition;
// Together with mvcc_partition, abstracts away the memory-management details of working with MVCC.
class mvcc_container {
schema_ptr _schema;
std::optional<cache_tracker> _tracker;
std::optional<logalloc::region> _region_holder;
std::optional<mutation_cleaner> _cleaner_holder;
partition_snapshot::phase_type _phase = partition_snapshot::min_phase;
dirty_memory_manager _mgr;
std::optional<real_dirty_memory_accounter> _acc;
logalloc::region* _region;
mutation_cleaner* _cleaner;
public:
struct no_tracker {};
mvcc_container(schema_ptr s)
: _schema(s)
, _tracker(std::make_optional<cache_tracker>())
, _acc(std::make_optional<real_dirty_memory_accounter>(_mgr, *_tracker, 0))
, _region(&_tracker->region())
, _cleaner(&_tracker->cleaner())
{ }
mvcc_container(schema_ptr s, no_tracker)
: _schema(s)
, _region_holder(std::make_optional<logalloc::region>())
, _cleaner_holder(std::make_optional<mutation_cleaner>(*_region_holder, nullptr))
, _region(&*_region_holder)
, _cleaner(&*_cleaner_holder)
{ }
mvcc_container(mvcc_container&&) = delete;
// Call only when this container was constructed with a tracker
mvcc_partition make_evictable(const mutation_partition& mp);
// Call only when this container was constructed without a tracker
mvcc_partition make_not_evictable(const mutation_partition& mp);
logalloc::region& region() { return *_region; }
cache_tracker* tracker() { return _tracker ? &*_tracker : nullptr; }
mutation_cleaner& cleaner() { return *_cleaner; }
partition_snapshot::phase_type next_phase() { return ++_phase; }
partition_snapshot::phase_type phase() const { return _phase; }
real_dirty_memory_accounter& accounter() { return *_acc; }
mutation_partition squashed(partition_snapshot_ptr& snp) {
logalloc::allocating_section as;
return as(region(), [&] {
return snp->squashed();
});
}
// Merges other into this
void merge(mvcc_container& other) {
_region->merge(*other._region);
_cleaner->merge(*other._cleaner);
}
};
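// A minimal usage sketch of mvcc_container/mvcc_partition (comments only;
// mirrors the patterns used by the tests below, e.g. test_apply_to_incomplete):
//
//   simple_schema table;
//   mvcc_container ms(table.schema());   // tracker-backed: entries are evictable
//   auto p = ms.make_evictable(mutation_partition(table.schema()));
//   auto snap = p.read();                // snapshot pinned at the current phase
//   p += m;                              // 'm' is some mutation; applied via apply_to_incomplete()
//   auto squashed = p.squashed();        // all versions merged into one partition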
class mvcc_partition {
schema_ptr _s;
partition_entry _e;
mvcc_container& _container;
bool _evictable;
private:
void apply_to_evictable(partition_entry&& src, schema_ptr src_schema);
void apply(const mutation_partition& mp, schema_ptr mp_schema);
public:
mvcc_partition(schema_ptr s, partition_entry&& e, mvcc_container& container, bool evictable)
: _s(s), _e(std::move(e)), _container(container), _evictable(evictable) {
}
mvcc_partition(mvcc_partition&&) = default;
~mvcc_partition() {
with_allocator(region().allocator(), [&] {
_e = {};
});
}
partition_entry& entry() { return _e; }
schema_ptr schema() const { return _s; }
logalloc::region& region() const { return _container.region(); }
mvcc_partition& operator+=(const mutation&);
mvcc_partition& operator+=(mvcc_partition&&);
mutation_partition squashed() {
logalloc::allocating_section as;
return as(region(), [&] {
return _e.squashed(*_s);
});
}
void upgrade(schema_ptr new_schema) {
logalloc::allocating_section as;
with_allocator(region().allocator(), [&] {
as(region(), [&] {
_e.upgrade(_s, new_schema, _container.cleaner(), _container.tracker());
_s = new_schema;
});
});
}
partition_snapshot_ptr read() {
logalloc::allocating_section as;
return as(region(), [&] {
return _e.read(region(), _container.cleaner(), schema(), _container.tracker(), _container.phase());
});
}
void evict() {
with_allocator(region().allocator(), [&] {
_e.evict(_container.cleaner());
});
}
};
void mvcc_partition::apply_to_evictable(partition_entry&& src, schema_ptr src_schema) {
with_allocator(region().allocator(), [&] {
logalloc::allocating_section as;
mutation_cleaner src_cleaner(region(), no_cache_tracker);
auto c = as(region(), [&] {
return _e.apply_to_incomplete(*schema(), std::move(src), *src_schema, src_cleaner, as, region(),
*_container.tracker(), _container.next_phase(), _container.accounter());
});
repeat([&] {
return c.run();
}).get();
});
}
mvcc_partition& mvcc_partition::operator+=(mvcc_partition&& src) {
assert(_evictable);
apply_to_evictable(std::move(src.entry()), src.schema());
return *this;
}
mvcc_partition& mvcc_partition::operator+=(const mutation& m) {
with_allocator(region().allocator(), [&] {
apply(m.partition(), m.schema());
});
return *this;
}
void mvcc_partition::apply(const mutation_partition& mp, schema_ptr mp_s) {
with_allocator(region().allocator(), [&] {
if (_evictable) {
apply_to_evictable(partition_entry(mutation_partition(*mp_s, mp)), mp_s);
} else {
logalloc::allocating_section as;
as(region(), [&] {
_e.apply(*_s, mp, *mp_s);
});
}
});
}
mvcc_partition mvcc_container::make_evictable(const mutation_partition& mp) {
return with_allocator(region().allocator(), [&] {
logalloc::allocating_section as;
return as(region(), [&] {
return mvcc_partition(_schema, partition_entry::make_evictable(*_schema, mp), *this, true);
});
});
}
mvcc_partition mvcc_container::make_not_evictable(const mutation_partition& mp) {
return with_allocator(region().allocator(), [&] {
logalloc::allocating_section as;
return as(region(), [&] {
return mvcc_partition(_schema, partition_entry(mutation_partition(*_schema, mp)), *this, false);
});
});
}
SEASTAR_TEST_CASE(test_apply_to_incomplete) {
return seastar::async([] {
simple_schema table;
mvcc_container ms(table.schema());
auto&& s = *table.schema();
auto new_mutation = [&] {
return mutation(table.schema(), table.make_pkey(0));
};
auto mutation_with_row = [&] (clustering_key ck) {
auto m = new_mutation();
table.add_row(m, ck, "v");
return m;
};
auto ck1 = table.make_ckey(1);
auto ck2 = table.make_ckey(2);
BOOST_TEST_MESSAGE("Check that insert falling into discontinuous range is dropped");
{
auto e = ms.make_evictable(mutation_partition::make_incomplete(s));
auto m = new_mutation();
table.add_row(m, ck1, "v");
e += m;
assert_that(table.schema(), e.squashed()).is_equal_to(mutation_partition::make_incomplete(s));
}
BOOST_TEST_MESSAGE("Check that continuity is a union");
{
auto m1 = mutation_with_row(ck2);
auto e = ms.make_evictable(m1.partition());
auto snap1 = e.read();
auto m2 = mutation_with_row(ck2);
e += m2;
partition_version* latest = &*e.entry().version();
for (rows_entry& row : latest->partition().clustered_rows()) {
row.set_continuous(is_continuous::no);
}
auto m3 = mutation_with_row(ck1);
e += m3;
assert_that(table.schema(), e.squashed()).is_equal_to((m2 + m3).partition());
// Check that snapshot data is not stolen when its entry is applied
auto e2 = ms.make_evictable(mutation_partition(table.schema()));
e2 += std::move(e);
assert_that(table.schema(), ms.squashed(snap1)).is_equal_to(m1.partition());
assert_that(table.schema(), e2.squashed()).is_equal_to((m2 + m3).partition());
}
});
}
SEASTAR_TEST_CASE(test_schema_upgrade_preserves_continuity) {
return seastar::async([] {
simple_schema table;
mvcc_container ms(table.schema());
auto new_mutation = [&] {
return mutation(table.schema(), table.make_pkey(0));
};
auto mutation_with_row = [&] (clustering_key ck) {
auto m = new_mutation();
table.add_row(m, ck, "v");
return m;
};
// FIXME: There is no assert_that() for mutation_partition
auto assert_entry_equal = [&] (mvcc_partition& e, mutation m) {
auto key = table.make_pkey(0);
assert_that(mutation(e.schema(), key, e.squashed()))
.is_equal_to(m);
};
auto m1 = mutation_with_row(table.make_ckey(1));
m1.partition().clustered_rows().begin()->set_continuous(is_continuous::no);
m1.partition().set_static_row_continuous(false);
m1.partition().ensure_last_dummy(*m1.schema());
auto e = ms.make_evictable(m1.partition());
auto rd1 = e.read();
auto m2 = mutation_with_row(table.make_ckey(3));
m2.partition().ensure_last_dummy(*m2.schema());
e += m2;
auto new_schema = schema_builder(table.schema()).with_column("__new_column", utf8_type).build();
auto cont_before = e.squashed().get_continuity(*table.schema());
e.upgrade(new_schema);
auto cont_after = e.squashed().get_continuity(*new_schema);
rd1 = {};
auto expected = m1 + m2;
expected.partition().set_static_row_continuous(false); // apply_to_incomplete()
assert_entry_equal(e, expected);
BOOST_REQUIRE(cont_after.equals(*new_schema, cont_before));
auto m3 = mutation_with_row(table.make_ckey(2));
e += m3;
auto m4 = mutation_with_row(table.make_ckey(0));
table.add_static_row(m4, "s_val");
e += m4;
expected += m3;
expected.partition().set_static_row_continuous(false); // apply_to_incomplete()
assert_entry_equal(e, expected);
});
}
SEASTAR_TEST_CASE(test_eviction_with_active_reader) {
return seastar::async([] {
{
simple_schema table;
mvcc_container ms(table.schema());
auto&& s = *table.schema();
auto pk = table.make_pkey();
auto ck1 = table.make_ckey(1);
auto ck2 = table.make_ckey(2);
auto e = ms.make_evictable(mutation_partition(table.schema()));
mutation m1(table.schema(), pk);
m1.partition().clustered_row(s, ck2);
e += m1;
auto snap1 = e.read();
mutation m2(table.schema(), pk);
m2.partition().clustered_row(s, ck1);
e += m2;
auto snap2 = e.read();
partition_snapshot_row_cursor cursor(s, *snap2);
cursor.advance_to(position_in_partition_view::before_all_clustered_rows());
BOOST_REQUIRE(cursor.continuous());
BOOST_REQUIRE(cursor.key().equal(s, ck1));
e.evict();
{
logalloc::reclaim_lock rl(ms.region());
cursor.maybe_refresh();
auto mp = cursor.read_partition();
assert_that(table.schema(), mp).is_equal_to(s, (m1 + m2).partition());
}
}
});
}
SEASTAR_TEST_CASE(test_apply_to_incomplete_respects_continuity) {
// Test that apply_to_incomplete() drops entries from source which fall outside continuity
// and that continuity is not affected.
return seastar::async([] {
{
random_mutation_generator gen(random_mutation_generator::generate_counters::no);
auto s = gen.schema();
mvcc_container ms(s);
mutation m1 = gen();
mutation m2 = gen();
mutation m3 = gen();
mutation to_apply = gen();
to_apply.partition().make_fully_continuous();
// Runs the scenario once without and once with an active reader.
auto test = [&] (bool with_active_reader) {
auto e = ms.make_evictable(m3.partition());
auto snap1 = e.read();
m2.partition().make_fully_continuous();
e += m2;
auto snap2 = e.read();
m1.partition().make_fully_continuous();
e += m1;
partition_snapshot_ptr snap;
if (with_active_reader) {
snap = e.read();
}
auto before = e.squashed();
auto e_continuity = before.get_continuity(*s);
auto expected_to_apply_slice = mutation_partition(*s, to_apply.partition());
if (!before.static_row_continuous()) {
expected_to_apply_slice.static_row() = {};
}
auto expected = mutation_partition(*s, before);
expected.apply_weak(*s, std::move(expected_to_apply_slice));
e += to_apply;
assert_that(s, e.squashed())
.is_equal_to(expected, e_continuity.to_clustering_row_ranges())
.has_same_continuity(before);
};
test(false);
test(true);
}
});
}
// Call with region locked.
static mutation_partition read_using_cursor(partition_snapshot& snap) {
partition_snapshot_row_cursor cur(*snap.schema(), snap);
cur.maybe_refresh();
auto mp = cur.read_partition();
for (auto&& rt : snap.range_tombstones()) {
mp.apply_delete(*snap.schema(), rt);
}
mp.apply(*snap.schema(), static_row(snap.static_row(false)));
mp.set_static_row_continuous(snap.static_row_continuous());
mp.apply(snap.partition_tombstone());
return mp;
}
SEASTAR_TEST_CASE(test_snapshot_cursor_is_consistent_with_merging) {
// Tests that reading many versions using a cursor gives the logical mutation back.
return seastar::async([] {
{
random_mutation_generator gen(random_mutation_generator::generate_counters::no);
auto s = gen.schema();
mvcc_container ms(s);
mutation m1 = gen();
mutation m2 = gen();
mutation m3 = gen();
m2.partition().make_fully_continuous();
m3.partition().make_fully_continuous();
{
auto e = ms.make_evictable(m1.partition());
auto snap1 = e.read();
e += m2;
auto snap2 = e.read();
e += m3;
auto expected = e.squashed();
auto snap = e.read();
auto actual = read_using_cursor(*snap);
assert_that(s, actual).has_same_continuity(expected);
// Drop empty rows
can_gc_fn never_gc = [] (tombstone) { return false; };
actual.compact_for_compaction(*s, never_gc, gc_clock::now());
expected.compact_for_compaction(*s, never_gc, gc_clock::now());
assert_that(s, actual).is_equal_to(expected);
}
}
});
}
SEASTAR_TEST_CASE(test_snapshot_cursor_is_consistent_with_merging_for_nonevictable) {
// Tests that reading many versions using a cursor gives the logical mutation back,
// for a non-evictable entry.
return seastar::async([] {
logalloc::region r;
mutation_cleaner cleaner(r, no_cache_tracker);
with_allocator(r.allocator(), [&] {
random_mutation_generator gen(random_mutation_generator::generate_counters::no);
auto s = gen.schema();
mutation m1 = gen();
mutation m2 = gen();
mutation m3 = gen();
m1.partition().make_fully_continuous();
m2.partition().make_fully_continuous();
m3.partition().make_fully_continuous();
{
logalloc::reclaim_lock rl(r);
auto e = partition_entry(mutation_partition(*s, m3.partition()));
auto snap1 = e.read(r, cleaner, s, no_cache_tracker);
e.apply(*s, m2.partition(), *s);
auto snap2 = e.read(r, cleaner, s, no_cache_tracker);
e.apply(*s, m1.partition(), *s);
auto expected = e.squashed(*s);
auto snap = e.read(r, cleaner, s, no_cache_tracker);
auto actual = read_using_cursor(*snap);
BOOST_REQUIRE(expected.is_fully_continuous());
BOOST_REQUIRE(actual.is_fully_continuous());
assert_that(s, actual)
.is_equal_to(expected);
}
});
});
}
SEASTAR_TEST_CASE(test_continuity_merging_in_evictable) {
// Tests that continuity is merged correctly across versions of an evictable entry.
return seastar::async([] {
cache_tracker tracker;
auto& r = tracker.region();
with_allocator(r.allocator(), [&] {
simple_schema ss;
auto s = ss.schema();
auto base_m = mutation(s, ss.make_pkey(0));
auto m1 = base_m; // continuous in [-inf, 0]
m1.partition().clustered_row(*s, ss.make_ckey(0), is_dummy::no, is_continuous::no);
m1.partition().clustered_row(*s, position_in_partition::after_all_clustered_rows(), is_dummy::no, is_continuous::no);
{
logalloc::reclaim_lock rl(r);
auto e = partition_entry::make_evictable(*s, m1.partition());
auto snap1 = e.read(r, tracker.cleaner(), s, &tracker);
e.add_version(*s, &tracker).partition()
.clustered_row(*s, ss.make_ckey(1), is_dummy::no, is_continuous::no);
e.add_version(*s, &tracker).partition()
.clustered_row(*s, ss.make_ckey(2), is_dummy::no, is_continuous::no);
auto expected = mutation_partition(*s, m1.partition());
expected.clustered_row(*s, ss.make_ckey(1), is_dummy::no, is_continuous::no);
expected.clustered_row(*s, ss.make_ckey(2), is_dummy::no, is_continuous::no);
auto snap = e.read(r, tracker.cleaner(), s, &tracker);
auto actual = read_using_cursor(*snap);
auto actual2 = e.squashed(*s);
assert_that(s, actual)
.has_same_continuity(expected)
.is_equal_to(expected);
assert_that(s, actual2)
.has_same_continuity(expected)
.is_equal_to(expected);
}
});
});
}
SEASTAR_TEST_CASE(test_partition_snapshot_row_cursor) {
return seastar::async([] {
cache_tracker tracker;
auto& r = tracker.region();
with_allocator(r.allocator(), [&] {
simple_schema table;
auto&& s = *table.schema();
auto e = partition_entry::make_evictable(s, mutation_partition(table.schema()));
auto snap1 = e.read(r, tracker.cleaner(), table.schema(), &tracker);
{
auto&& p1 = snap1->version()->partition();
p1.clustered_row(s, table.make_ckey(0), is_dummy::no, is_continuous::no);
p1.clustered_row(s, table.make_ckey(1), is_dummy::no, is_continuous::no);
p1.clustered_row(s, table.make_ckey(2), is_dummy::no, is_continuous::no);
p1.clustered_row(s, table.make_ckey(3), is_dummy::no, is_continuous::no);
p1.clustered_row(s, table.make_ckey(6), is_dummy::no, is_continuous::no);
p1.ensure_last_dummy(s);
}
auto snap2 = e.read(r, tracker.cleaner(), table.schema(), &tracker, 1);
partition_snapshot_row_cursor cur(s, *snap2);
position_in_partition::equal_compare eq(s);
{
logalloc::reclaim_lock rl(r);
BOOST_REQUIRE(cur.advance_to(table.make_ckey(0)));
BOOST_REQUIRE(eq(cur.position(), table.make_ckey(0)));
BOOST_REQUIRE(!cur.continuous());
}
r.full_compaction();
{
logalloc::reclaim_lock rl(r);
BOOST_REQUIRE(cur.maybe_refresh());
BOOST_REQUIRE(eq(cur.position(), table.make_ckey(0)));
BOOST_REQUIRE(!cur.continuous());
BOOST_REQUIRE(cur.next());
BOOST_REQUIRE(eq(cur.position(), table.make_ckey(1)));
BOOST_REQUIRE(!cur.continuous());
BOOST_REQUIRE(cur.next());
BOOST_REQUIRE(eq(cur.position(), table.make_ckey(2)));
BOOST_REQUIRE(!cur.continuous());
}
{
logalloc::reclaim_lock rl(r);
BOOST_REQUIRE(cur.maybe_refresh());
BOOST_REQUIRE(eq(cur.position(), table.make_ckey(2)));
BOOST_REQUIRE(!cur.continuous());
}
{
auto&& p2 = snap2->version()->partition();
p2.clustered_row(s, table.make_ckey(2), is_dummy::no, is_continuous::yes);
}
{
logalloc::reclaim_lock rl(r);
BOOST_REQUIRE(cur.maybe_refresh());
BOOST_REQUIRE(eq(cur.position(), table.make_ckey(2)));
BOOST_REQUIRE(cur.next());
BOOST_REQUIRE(eq(cur.position(), table.make_ckey(3)));
BOOST_REQUIRE(!cur.continuous());
}
{
auto&& p2 = snap2->version()->partition();
p2.clustered_row(s, table.make_ckey(4), is_dummy::no, is_continuous::yes);
}
{
logalloc::reclaim_lock rl(r);
BOOST_REQUIRE(cur.maybe_refresh());
BOOST_REQUIRE(eq(cur.position(), table.make_ckey(3)));
BOOST_REQUIRE(cur.next());
BOOST_REQUIRE(eq(cur.position(), table.make_ckey(4)));
BOOST_REQUIRE(cur.continuous());
BOOST_REQUIRE(cur.next());
BOOST_REQUIRE(eq(cur.position(), table.make_ckey(6)));
BOOST_REQUIRE(!cur.continuous());
BOOST_REQUIRE(cur.next());
BOOST_REQUIRE(eq(cur.position(), position_in_partition::after_all_clustered_rows()));
BOOST_REQUIRE(cur.continuous());
BOOST_REQUIRE(!cur.next());
}
{
logalloc::reclaim_lock rl(r);
BOOST_REQUIRE(cur.advance_to(table.make_ckey(4)));
BOOST_REQUIRE(cur.continuous());
}
{
logalloc::reclaim_lock rl(r);
BOOST_REQUIRE(cur.maybe_refresh());
BOOST_REQUIRE(eq(cur.position(), table.make_ckey(4)));
BOOST_REQUIRE(cur.continuous());
}
{
auto&& p2 = snap2->version()->partition();
p2.clustered_row(s, table.make_ckey(5), is_dummy::no, is_continuous::yes);
}
{
logalloc::reclaim_lock rl(r);
BOOST_REQUIRE(cur.maybe_refresh());
BOOST_REQUIRE(eq(cur.position(), table.make_ckey(4)));
BOOST_REQUIRE(cur.continuous());
BOOST_REQUIRE(cur.next());
BOOST_REQUIRE(eq(cur.position(), table.make_ckey(5)));
BOOST_REQUIRE(cur.continuous());
BOOST_REQUIRE(cur.next());
BOOST_REQUIRE(eq(cur.position(), table.make_ckey(6)));
BOOST_REQUIRE(!cur.continuous());
}
{
logalloc::reclaim_lock rl(r);
BOOST_REQUIRE(cur.advance_to(table.make_ckey(4)));
BOOST_REQUIRE(cur.continuous());
}
e.evict(tracker.cleaner());
{
auto&& p2 = snap2->version()->partition();
p2.clustered_row(s, table.make_ckey(5), is_dummy::no, is_continuous::yes);
}
{
logalloc::reclaim_lock rl(r);
BOOST_REQUIRE(cur.maybe_refresh());
BOOST_REQUIRE(eq(cur.position(), table.make_ckey(4)));
BOOST_REQUIRE(cur.continuous());
}
{
logalloc::reclaim_lock rl(r);
BOOST_REQUIRE(cur.advance_to(table.make_ckey(4)));
BOOST_REQUIRE(eq(cur.position(), table.make_ckey(4)));
BOOST_REQUIRE(cur.continuous());
BOOST_REQUIRE(cur.next());
}
{
logalloc::reclaim_lock rl(r);
BOOST_REQUIRE(cur.maybe_refresh());
BOOST_REQUIRE(eq(cur.position(), table.make_ckey(5)));
BOOST_REQUIRE(cur.continuous());
}
});
});
}
SEASTAR_TEST_CASE(test_apply_is_atomic) {
auto do_test = [](auto&& gen) {
failure_injecting_allocation_strategy alloc(standard_allocator());
with_allocator(alloc, [&] {
auto target = gen();
auto second = gen();
target.partition().make_fully_continuous();
second.partition().make_fully_continuous();
auto expected = target + second;
size_t fail_offset = 0;
while (true) {
mutation_partition m2 = mutation_partition(*second.schema(), second.partition());
auto e = partition_entry(mutation_partition(*target.schema(), target.partition()));
//auto snap1 = e.read(r, gen.schema());
alloc.fail_after(fail_offset++);
try {
e.apply(*target.schema(), std::move(m2), *second.schema());
alloc.stop_failing();
break;
} catch (const std::bad_alloc&) {
assert_that(mutation(target.schema(), target.decorated_key(), e.squashed(*target.schema())))
.is_equal_to(target)
.has_same_continuity(target);
e.apply(*target.schema(), std::move(m2), *second.schema());
assert_that(mutation(target.schema(), target.decorated_key(), e.squashed(*target.schema())))
.is_equal_to(expected)
.has_same_continuity(expected);
}
assert_that(mutation(target.schema(), target.decorated_key(), e.squashed(*target.schema())))
.is_equal_to(expected)
.has_same_continuity(expected);
}
});
};
do_test(random_mutation_generator(random_mutation_generator::generate_counters::no));
do_test(random_mutation_generator(random_mutation_generator::generate_counters::yes));
return make_ready_future<>();
}
SEASTAR_TEST_CASE(test_versions_are_merged_when_snapshots_go_away) {
return seastar::async([] {
logalloc::region r;
mutation_cleaner cleaner(r, nullptr);
with_allocator(r.allocator(), [&] {
random_mutation_generator gen(random_mutation_generator::generate_counters::no);
auto s = gen.schema();
mutation m1 = gen();
mutation m2 = gen();
mutation m3 = gen();
m1.partition().make_fully_continuous();
m2.partition().make_fully_continuous();
m3.partition().make_fully_continuous();
{
auto e = partition_entry(mutation_partition(*s, m1.partition()));
auto snap1 = e.read(r, cleaner, s, nullptr);
{
logalloc::reclaim_lock rl(r);
e.apply(*s, m2.partition(), *s);
}
auto snap2 = e.read(r, cleaner, s, nullptr);
snap1 = {};
snap2 = {};
cleaner.drain().get();
BOOST_REQUIRE_EQUAL(1, boost::size(e.versions()));
assert_that(s, e.squashed(*s)).is_equal_to((m1 + m2).partition());
}
{
auto e = partition_entry(mutation_partition(*s, m1.partition()));
auto snap1 = e.read(r, cleaner, s, nullptr);
{
logalloc::reclaim_lock rl(r);
e.apply(*s, m2.partition(), *s);
}
auto snap2 = e.read(r, cleaner, s, nullptr);
snap2 = {};
snap1 = {};
cleaner.drain().get();
BOOST_REQUIRE_EQUAL(1, boost::size(e.versions()));
assert_that(s, e.squashed(*s)).is_equal_to((m1 + m2).partition());
}
});
});
}
// Reproducer of #4030
SEASTAR_TEST_CASE(test_snapshot_merging_after_container_is_destroyed) {
return seastar::async([] {
random_mutation_generator gen(random_mutation_generator::generate_counters::no);
auto s = gen.schema();
mutation m1 = gen();
m1.partition().make_fully_continuous();
mutation m2 = gen();
m2.partition().make_fully_continuous();
auto c1 = std::make_unique<mvcc_container>(s, mvcc_container::no_tracker{});
auto c2 = std::make_unique<mvcc_container>(s, mvcc_container::no_tracker{});
auto e = std::make_unique<mvcc_partition>(c1->make_not_evictable(m1.partition()));
auto snap1 = e->read();
*e += m2;
auto snap2 = e->read();
while (!need_preempt()) {} // Spin until need_preempt() so that snapshot destruction defers
snap1 = {};
c2->merge(*c1);
snap2 = {};
e.reset();
c1 = {};
c2->cleaner().drain().get();
});
}
```
|
Meys is a commune in the Rhône department in eastern France.
See also
Communes of the Rhône department
References
Communes of Rhône (department)
|
```go
package errors
import (
"encoding/json"
"errors"
"fmt"
"log/slog"
"net/http"
"github.com/kataras/iris/v12/context"
"github.com/kataras/iris/v12/x/client"
)
// LogErrorFunc is an alias of a function type which accepts the Iris request context and an error,
// and is fired whenever an error should be logged.
//
// See "OnErrorLog" variable to change the way an error is logged,
// by default the error is logged using the Application's Logger's Error method.
type LogErrorFunc = func(ctx *context.Context, err error)
// LogError can be modified to customize the way an error is logged to the server (most common: internal server errors, database errors, etc.).
// Can be used to customize the error logging, e.g. using Sentry (cloud-based error console).
var LogError LogErrorFunc = func(ctx *context.Context, err error) {
if ctx == nil {
slog.Error(err.Error())
return
}
ctx.Application().Logger().Error(err)
}
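// For example, LogError can be replaced to forward errors to an external
// error-tracking service (the "reporter" client below is hypothetical;
// a Sentry-like client would be wired up the same way):
//
//	errors.LogError = func(ctx *context.Context, err error) {
//		reporter.Capture(err)                 // send to the tracking service
//		ctx.Application().Logger().Error(err) // and still log locally
//	}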
// SkipCanceled is a package-level setting which by default
// skips the logging of a canceled response or operation.
// See the "Context.IsCanceled()" method and "iris.IsCanceled()" function
// that decide if the error is caused by a canceled operation.
//
// Change of this setting MUST be done on initialization of the program.
var SkipCanceled = true
type (
// ErrorCodeName is a custom string type that represents canonical error names.
//
// It contains functionality for safe and easy error populating.
// See its "Message", "Details", "Data" and "Log" methods.
ErrorCodeName string
// ErrorCode represents the JSON form of an Error's code.
ErrorCode struct {
CanonicalName ErrorCodeName `json:"canonical_name" yaml:"CanonicalName"`
Status int `json:"status" yaml:"Status"`
}
)
// A read-only map of valid HTTP error codes.
var errorCodeMap = make(map[ErrorCodeName]ErrorCode)
// Deprecated: Use Register instead.
var E = Register
// Register registers a custom HTTP Error and returns its canonical name for future use.
// The name "New" is reserved for compatibility
// with the standard errors package, therefore "Register" was chosen instead.
// The short alias "E" (see above) is kept because the "e" keystroke is quick to reach
// while typing the "errors" package name.
//
// See "RegisterErrorCode" and "RegisterErrorCodeMap" for alternatives.
//
// Example:
//
// var (
// NotFound = errors.Register("NOT_FOUND", http.StatusNotFound)
// )
// ...
// NotFound.Details(ctx, "resource not found", "user with id: %q was not found", userID)
//
// This method MUST be called on initialization, before HTTP server starts as
// the internal map is not protected by mutex.
func Register(httpErrorCanonicalName string, httpStatusCode int) ErrorCodeName {
canonicalName := ErrorCodeName(httpErrorCanonicalName)
RegisterErrorCode(canonicalName, httpStatusCode)
return canonicalName
}
// RegisterErrorCode registers a custom HTTP Error.
//
// This method MUST be called on initialization, before HTTP server starts as
// the internal map is not protected by mutex.
func RegisterErrorCode(canonicalName ErrorCodeName, httpStatusCode int) {
errorCodeMap[canonicalName] = ErrorCode{
CanonicalName: canonicalName,
Status: httpStatusCode,
}
}
// RegisterErrorCodeMap registers one or more custom HTTP Errors.
//
// This method MUST be called on initialization, before HTTP server starts as
// the internal map is not protected by mutex.
func RegisterErrorCodeMap(errorMap map[ErrorCodeName]int) {
if len(errorMap) == 0 {
return
}
for canonicalName, httpStatusCode := range errorMap {
RegisterErrorCode(canonicalName, httpStatusCode)
}
}
// List of default error codes a server should follow and send back to the client.
var (
Cancelled ErrorCodeName = Register("CANCELLED", context.StatusTokenRequired)
Unknown ErrorCodeName = Register("UNKNOWN", http.StatusInternalServerError)
InvalidArgument ErrorCodeName = Register("INVALID_ARGUMENT", http.StatusBadRequest)
DeadlineExceeded ErrorCodeName = Register("DEADLINE_EXCEEDED", http.StatusGatewayTimeout)
NotFound ErrorCodeName = Register("NOT_FOUND", http.StatusNotFound)
AlreadyExists ErrorCodeName = Register("ALREADY_EXISTS", http.StatusConflict)
PermissionDenied ErrorCodeName = Register("PERMISSION_DENIED", http.StatusForbidden)
Unauthenticated ErrorCodeName = Register("UNAUTHENTICATED", http.StatusUnauthorized)
ResourceExhausted ErrorCodeName = Register("RESOURCE_EXHAUSTED", http.StatusTooManyRequests)
FailedPrecondition ErrorCodeName = Register("FAILED_PRECONDITION", http.StatusBadRequest)
Aborted ErrorCodeName = Register("ABORTED", http.StatusConflict)
OutOfRange ErrorCodeName = Register("OUT_OF_RANGE", http.StatusBadRequest)
Unimplemented ErrorCodeName = Register("UNIMPLEMENTED", http.StatusNotImplemented)
Internal ErrorCodeName = Register("INTERNAL", http.StatusInternalServerError)
Unavailable ErrorCodeName = Register("UNAVAILABLE", http.StatusServiceUnavailable)
DataLoss ErrorCodeName = Register("DATA_LOSS", http.StatusInternalServerError)
)
// errorFuncCodeMap is a read-only map of error code names and their error functions.
// See HandleError package-level function.
var errorFuncCodeMap = make(map[ErrorCodeName][]func(error) error)
// HandleError handles an error by sending it to the client
// based on the registered error code names and their error functions.
// Returns true if the error was handled, otherwise false.
// If the given "err" is nil then it returns false.
// If the given "err" is a type of validation error then it sends it to the client
// using the "Validation" method.
// If the given "err" is a type of client.APIError then it sends it to the client
// using the "HandleAPIError" function.
//
// See ErrorCodeName.MapErrorFunc and MapErrors methods too.
func HandleError(ctx *context.Context, err error) bool {
if err == nil {
return false
}
if ctx.IsStopped() {
return false
}
for errorCodeName, errorFuncs := range errorFuncCodeMap {
for _, errorFunc := range errorFuncs {
if errToSend := errorFunc(err); errToSend != nil {
errorCodeName.Err(ctx, errToSend)
return true
}
}
}
// Unwrap and collect the joined errors so the result sent to the client does not contain
// the ErrorCodeName type itself, then fire the error status code and title based on that error code name.
var asErrCode ErrorCodeName
if As(err, &asErrCode) {
if unwrapJoined, ok := err.(joinedErrors); ok {
errs := unwrapJoined.Unwrap()
errsToKeep := make([]error, 0, len(errs)-1)
for _, src := range errs {
if _, isErrorCodeName := src.(ErrorCodeName); !isErrorCodeName {
errsToKeep = append(errsToKeep, src)
}
}
if len(errsToKeep) > 0 {
err = errors.Join(errsToKeep...)
}
}
asErrCode.Err(ctx, err)
return true
}
if handleJSONError(ctx, err) {
return true
}
if vErr, ok := err.(ValidationError); ok {
if vErr == nil {
return false // treat as a non-error in any case; this should never happen.
}
InvalidArgument.Validation(ctx, vErr)
return true
}
if vErrs, ok := err.(ValidationErrors); ok {
if len(vErrs) == 0 {
return false // treat as a non-error in any case; this should never happen.
}
InvalidArgument.Validation(ctx, vErrs...)
return true
}
if apiErr, ok := client.GetError(err); ok {
handleAPIError(ctx, apiErr)
return true
}
Internal.LogErr(ctx, err)
return true
}
// Error returns an empty string, it is only declared as a method of ErrorCodeName type in order
// to be a compatible error to be joined within other errors:
//
// err = fmt.Errorf("%w%w", errors.InvalidArgument, err) OR
// err = errors.InvalidArgument.Wrap(err)
func (e ErrorCodeName) Error() string {
return ""
}
type joinedErrors interface{ Unwrap() []error }
// Wrap wraps the given error with this ErrorCodeName.
// It calls the standard errors.Join package-level function.
// See HandleError function for more.
func (e ErrorCodeName) Wrap(err error) error {
return errors.Join(e, err)
}
// MapErrorFunc registers a function which validates the incoming error and
// returns the same error, or an overridden one, to be sent to the client wrapped by this ErrorCodeName "e".
//
// This method MUST be called on initialization, before HTTP server starts as
// the internal map is not protected by mutex.
//
// Example Code:
//
// errors.InvalidArgument.MapErrorFunc(func(err error) error {
// stripeErr, ok := err.(*stripe.Error)
// if !ok {
// return nil
// }
//
// return &errors.Error{
// Message: stripeErr.Msg,
// Details: stripeErr.DocURL,
// }
// })
func (e ErrorCodeName) MapErrorFunc(fn func(error) error) {
errorFuncCodeMap[e] = append(errorFuncCodeMap[e], fn)
}
// MapErrors registers one or more target errors which will be sent to the client wrapped by this "e" ErrorCodeName
// when the incoming error matches at least one of the given "targets".
//
// This method MUST be called on initialization, before HTTP server starts as
// the internal map is not protected by mutex.
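//
// Example (using the standard library's database/sql sentinel):
//
//	// Map database/sql's "no rows" sentinel to the NOT_FOUND error code:
//	errors.NotFound.MapErrors(sql.ErrNoRows)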
func (e ErrorCodeName) MapErrors(targets ...error) {
e.MapErrorFunc(func(err error) error {
for _, target := range targets {
if Is(err, target) {
return err
}
}
return nil
})
}
// Message sends an error with a simple message to the client.
func (e ErrorCodeName) Message(ctx *context.Context, format string, args ...interface{}) {
fail(ctx, e, sprintf(format, args...), "", nil, nil)
}
// Details sends an error with a message and details to the client.
func (e ErrorCodeName) Details(ctx *context.Context, msg, details string, detailsArgs ...interface{}) {
fail(ctx, e, msg, sprintf(details, detailsArgs...), nil, nil)
}
// Data sends an error with a message and json data to the client.
func (e ErrorCodeName) Data(ctx *context.Context, msg string, data interface{}) {
fail(ctx, e, msg, "", nil, data)
}
// DataWithDetails sends an error with a message, details and json data to the client.
func (e ErrorCodeName) DataWithDetails(ctx *context.Context, msg, details string, data interface{}) {
fail(ctx, e, msg, details, nil, data)
}
// Validation sends an error which renders the invalid fields to the client.
func (e ErrorCodeName) Validation(ctx *context.Context, validationErrors ...ValidationError) {
e.validation(ctx, validationErrors)
}
func (e ErrorCodeName) validation(ctx *context.Context, validationErrors interface{}) {
fail(ctx, e, "validation failure", "fields were invalid", validationErrors, nil)
}
// Err sends the error's text as a message to the client.
// As an exception, if the given "err" is a validation error
// then the Validation method is called instead.
func (e ErrorCodeName) Err(ctx *context.Context, err error) {
if err == nil {
return
}
if vErr, ok := err.(ValidationError); ok {
if vErr == nil {
return // not an error; this should never happen.
}
e.Validation(ctx, vErr)
return // without this return, e.Message below would write a second response.
}
if vErrs, ok := err.(ValidationErrors); ok {
if len(vErrs) == 0 {
return // not an error; this should never happen.
}
e.Validation(ctx, vErrs...)
return
}
// If it's already an Error type then send it directly.
if httpErr, ok := err.(*Error); ok {
if errorCode, ok := errorCodeMap[e]; ok {
httpErr.ErrorCode = errorCode
ctx.StopWithJSON(errorCode.Status, httpErr) // here we override the fail function and send the error as it is.
return
}
}
e.Message(ctx, err.Error())
}
// Log sends an error of "format" and optional "args" to the client and prints that
// error using the "LogError" package-level function, which can be customized.
//
// See "LogErr" too.
func (e ErrorCodeName) Log(ctx *context.Context, format string, args ...interface{}) {
if SkipCanceled {
if ctx.IsCanceled() {
return
}
for _, arg := range args {
if err, ok := arg.(error); ok {
if context.IsErrCanceled(err) {
return
}
}
}
}
err := fmt.Errorf(format, args...)
e.LogErr(ctx, err)
}
// LogErr sends the given "err" as a message to the client and prints that
// error using the "LogError" package-level function, which can be customized.
func (e ErrorCodeName) LogErr(ctx *context.Context, err error) {
if SkipCanceled && (ctx.IsCanceled() || context.IsErrCanceled(err)) {
return
}
LogError(ctx, err)
e.Message(ctx, "server error")
}
// HandleAPIError handles remote server errors.
// Optionally, use it when you write your server's HTTP clients using the /x/client package.
// When the HTTP Client sends data to a remote server but that remote server
// failed to accept the request as expected, then the error will be proxied
// to this server's end-client.
//
// When the given "err" is not a type of client.APIError then
// the error will be sent using the "Internal.LogErr" method which sends
// HTTP internal server error to the end-client and
// prints the "err" using the "LogError" package-level function.
func HandleAPIError(ctx *context.Context, err error) {
// Error expected and came from the external server,
// save its body so we can forward it to the end-client.
if apiErr, ok := client.GetError(err); ok {
handleAPIError(ctx, apiErr)
return
}
Internal.LogErr(ctx, err)
}
func handleAPIError(ctx *context.Context, apiErr client.APIError) {
// Error expected and came from the external server,
// save its body so we can forward it to the end-client.
statusCode := apiErr.Response.StatusCode
if statusCode >= 400 && statusCode < 500 {
InvalidArgument.DataWithDetails(ctx, "remote server error", "invalid client request", apiErr.Body)
} else {
Internal.Data(ctx, "remote server error", apiErr.Body)
}
// Unavailable.DataWithDetails(ctx, "remote server error", "unavailable", apiErr.Body)
}
func handleJSONError(ctx *context.Context, err error) bool {
var syntaxErr *json.SyntaxError
if errors.As(err, &syntaxErr) {
InvalidArgument.Details(ctx, "unable to parse body", "syntax error at byte offset %d", syntaxErr.Offset)
return true
}
var unmarshalErr *json.UnmarshalTypeError
if errors.As(err, &unmarshalErr) {
InvalidArgument.Details(ctx, "unable to parse body", "unmarshal error for field %q", unmarshalErr.Field)
return true
}
return false
// else {
// InvalidArgument.Details(ctx, "unable to parse body", err.Error())
// }
}
var (
// ErrUnexpected is the HTTP error which is sent to the client
// when the server fails to send an error; it's a fallback error.
// The server fails to send an error in two cases:
// 1. when the provided error code name is not registered (the error value is the ErrUnexpectedErrorCode)
// 2. when the error contains data but cannot be encoded to json (the value of the error is the result error of json.Marshal).
ErrUnexpected = Register("UNEXPECTED_ERROR", http.StatusInternalServerError)
// ErrUnexpectedErrorCode is the error which is logged
// when the given error code name is not registered.
ErrUnexpectedErrorCode = New("unexpected error code name")
)
// Error represents the JSON form of "http wire errors".
//
// Examples can be found at:
//
// path_to_url
type Error struct {
ErrorCode ErrorCode `json:"http_error_code" yaml:"HTTPErrorCode"`
Message string `json:"message,omitempty" yaml:"Message"`
Details string `json:"details,omitempty" yaml:"Details"`
Validation interface{} `json:"validation,omitempty" yaml:"Validation,omitempty"`
Data json.RawMessage `json:"data,omitempty" yaml:"Data,omitempty"` // any other custom json data.
}
// Error method completes the error interface. It just returns the canonical name, status code, message and details.
func (err *Error) Error() string {
if err.Message == "" {
err.Message = "<empty>"
}
if err.Details == "" {
err.Details = "<empty>"
}
if err.ErrorCode.CanonicalName == "" {
err.ErrorCode.CanonicalName = ErrUnexpected
}
if err.ErrorCode.Status <= 0 {
err.ErrorCode.Status = http.StatusInternalServerError
}
return sprintf("iris http wire error: canonical name: %s, http status code: %d, message: %s, details: %s", err.ErrorCode.CanonicalName, err.ErrorCode.Status, err.Message, err.Details)
}
func fail(ctx *context.Context, codeName ErrorCodeName, msg, details string, validationErrors interface{}, dataValue interface{}) {
errorCode, ok := errorCodeMap[codeName]
if !ok {
// This SHOULD NEVER happen; all ErrorCodeNames MUST be registered.
LogError(ctx, ErrUnexpectedErrorCode)
fail(ctx, ErrUnexpected, msg, details, validationErrors, dataValue)
return
}
var data json.RawMessage
if dataValue != nil {
switch v := dataValue.(type) {
case json.RawMessage:
data = v
case []byte:
data = v
case error:
if msg == "" {
msg = v.Error()
} else if details == "" {
details = v.Error()
} else {
data = json.RawMessage(v.Error())
}
default:
b, err := json.Marshal(v)
if err != nil {
LogError(ctx, err)
fail(ctx, ErrUnexpected, err.Error(), "", nil, nil)
return
}
data = b
}
}
err := Error{
ErrorCode: errorCode,
Message: msg,
Details: details,
Data: data,
Validation: validationErrors,
}
// ctx.SetErr(&err)
ctx.StopWithJSON(errorCode.Status, err)
}
```
|
```javascript
import data from '../../../shared/testing/data';
import constants from '../../../api/migrations/seed/default/constants';
const publicChannel = data.channels.find(
c => c.id === constants.SPECTRUM_GENERAL_CHANNEL_ID
);
const privateChannel = data.channels.find(
c => c.id === constants.SPECTRUM_PRIVATE_CHANNEL_ID
);
const publicCommunity = data.communities.find(
c => c.id === constants.SPECTRUM_COMMUNITY_ID
);
const publicThread = data.threads.find(
t => t.communityId === publicCommunity.id && t.channelId === publicChannel.id
);
const privateThread = data.threads.find(
t => t.communityId === publicCommunity.id && t.channelId === privateChannel.id
);
const publicThreadAuthor = data.users.find(
u => u.id === publicThread.creatorId
);
const nonMemberUser = data.users.find(u => u.id === constants.QUIET_USER_ID);
const memberInChannelUser = data.users.find(u => u.id === constants.BRYN_ID);
const triggerThreadDelete = () => {
cy.get('[data-cy="thread-dropdown-delete"]')
.first()
.click();
cy.get('[data-cy="delete-button"]').should('be.visible');
cy.get('div.ReactModal__Overlay')
.should('be.visible')
.click('topLeft');
};
const openSettingsDropdown = () => {
cy.get('[data-cy="thread-actions-dropdown-trigger"]')
.last()
.should('be.visible')
.click({ force: true });
};
describe('action bar renders', () => {
describe('non authed', () => {
beforeEach(() => {
cy.visit(`/thread/${publicThread.id}`);
});
it('should render', () => {
cy.get('[data-cy="thread-view"]').should('be.visible');
cy.get('[data-cy="thread-actions-dropdown-trigger"]').should(
'not.be.visible'
);
});
});
describe('authed non member', () => {
beforeEach(() => {
cy.auth(nonMemberUser.id).then(() =>
cy.visit(`/thread/${publicThread.id}`)
);
});
it('should render', () => {
cy.get('[data-cy="thread-view"]').should('be.visible');
});
});
describe('authed member', () => {
beforeEach(() => {
cy.auth(memberInChannelUser.id).then(() =>
cy.visit(`/thread/${publicThread.id}`)
);
});
it('should render', () => {
cy.get('[data-cy="thread-view"]').should('be.visible');
cy.get('[data-cy="thread-actions-dropdown-trigger"]').should(
'not.be.visible'
);
});
});
describe('authed private channel member', () => {
beforeEach(() => {
cy.auth(memberInChannelUser.id).then(() =>
cy.visit(`/thread/${privateThread.id}`)
);
});
it('should render', () => {
cy.get('[data-cy="thread-view"]').should('be.visible');
cy.get('[data-cy="thread-actions-dropdown-trigger"]').should(
'not.be.visible'
);
});
});
describe('thread author', () => {
beforeEach(() => {
cy.auth(publicThreadAuthor.id).then(() =>
cy.visit(`/thread/${publicThread.id}`)
);
});
it('should render', () => {
cy.get('[data-cy="thread-view"]').should('be.visible');
openSettingsDropdown();
cy.get('[data-cy="thread-actions-dropdown"]').should('be.visible');
// dropdown controls
cy.get('[data-cy="thread-dropdown-delete"]').should('be.visible');
});
it('should lock the thread', () => {
cy.auth(publicThreadAuthor.id);
openSettingsDropdown();
});
it('should trigger delete thread', () => {
cy.auth(publicThreadAuthor.id);
openSettingsDropdown();
triggerThreadDelete();
});
});
describe('channel moderator', () => {
beforeEach(() => {
cy.auth(constants.CHANNEL_MODERATOR_USER_ID).then(() =>
cy.visit(`/thread/${publicThread.id}`)
);
});
it('should render', () => {
cy.get('[data-cy="thread-view"]').should('be.visible');
openSettingsDropdown();
cy.get('[data-cy="thread-actions-dropdown"]').should('be.visible');
cy.get('[data-cy="thread-dropdown-delete"]').should('be.visible');
});
it('should trigger delete thread', () => {
cy.auth(constants.CHANNEL_MODERATOR_USER_ID);
openSettingsDropdown();
triggerThreadDelete();
});
});
describe('channel owner', () => {
beforeEach(() => {
cy.auth(constants.CHANNEL_MODERATOR_USER_ID).then(() =>
cy.visit(`/thread/${publicThread.id}`)
);
});
it('should render', () => {
cy.get('[data-cy="thread-view"]').should('be.visible');
openSettingsDropdown();
cy.get('[data-cy="thread-actions-dropdown"]').should('be.visible');
cy.get('[data-cy="thread-dropdown-delete"]').should('be.visible');
});
it('should trigger delete thread', () => {
cy.auth(constants.CHANNEL_MODERATOR_USER_ID);
openSettingsDropdown();
triggerThreadDelete();
});
});
describe('community moderator', () => {
beforeEach(() => {
cy.auth(constants.COMMUNITY_MODERATOR_USER_ID).then(() =>
cy.visit(`/thread/${publicThread.id}`)
);
});
it('should render', () => {
cy.get('[data-cy="thread-view"]').should('be.visible');
openSettingsDropdown();
cy.get('[data-cy="thread-actions-dropdown"]').should('be.visible');
cy.get('[data-cy="thread-dropdown-delete"]').should('be.visible');
});
it('should trigger delete thread', () => {
cy.auth(constants.COMMUNITY_MODERATOR_USER_ID);
openSettingsDropdown();
triggerThreadDelete();
});
});
describe('community owner', () => {
beforeEach(() => {
cy.auth(constants.MAX_ID).then(() =>
cy.visit(`/thread/${publicThread.id}`)
);
});
it('should render', () => {
cy.get('[data-cy="thread-view"]').should('be.visible');
openSettingsDropdown();
cy.get('[data-cy="thread-actions-dropdown"]').should('be.visible');
cy.get('[data-cy="thread-dropdown-delete"]').should('be.visible');
});
it('should trigger delete thread', () => {
cy.auth(constants.MAX_ID);
openSettingsDropdown();
triggerThreadDelete();
});
});
});
```
|
Görkem or Gorkem is a Turkish-language masculine and, to a lesser extent, feminine given name meaning "glory, splendor".
Notable people with the name include:
Ahmet Görkem Görk (born 1983), Turkish footballer
Görkem Sağlam (born 1998), German footballer of Turkish descent
Görkem Sala (born 1992), Turkish singer, DJ, and hip hop artist
Görkem Sevindik (born 1986), Turkish actor
Görkem Yeltan (born 1977), Turkish actress
Turkish masculine given names
Masculine given names
Turkish feminine given names
Feminine given names
|
The poetry of the Ottoman Empire, or Ottoman Divan poetry, is little known outside modern Turkey, which forms the heartland of what was once the Ottoman Empire. It is, however, a rich and ancient poetic tradition that lasted for nearly 700 years, and one whose influence can still be felt in the modern Turkish poetic tradition.
Even in modern Turkey, however, Ottoman Divan poetry is a highly specialist subject. Much of this has to do with the fact that Divan poetry is written in Ottoman Turkish, which was written using a variant of the Arabic script and made extensive use of Arabic and Persian words, making the language vastly different from modern Turkish. In its own time, knowledge of this form of literary Turkish was largely limited to the educated classes.
History
The Ottoman Divan poetry tradition embraced the influence of the Persian and, to a lesser extent, Arabic literatures. As far back as the pre-Ottoman Seljuk period in the late 11th to early 14th centuries CE, this influence was already being felt: the Seljuks conducted their official business in the Persian language, rather than in Turkish, and the poetry of the Seljuk court was highly inflected with Persian.
When the Ottoman Empire arose in northwestern Anatolia, it continued this tradition. The most common poetic forms of the Ottoman court, for instance, were derived either directly from the Persian literary tradition (the gazel; the mesnevî), or indirectly through Persian from the Arabic (the kasîde). However, the decision to adopt these poetic forms wholesale led to two important further consequences:
the poetic meters (Persian: beher (Arabic: بَحْر); Turkish: aruz (Arabic: عَرُوض)) of Persian poetry were adopted.
Persian- and Arabic-based words were brought into the Turkish language in great numbers, as Turkish words rarely worked well within the system of the Persian poetic meter.
Out of this confluence of choices, the Ottoman Turkish language—which was always highly distinct from standard Turkish—was effectively born. This style of writing under Persian and Arabic influence came to be known as "Divan literature" (Turkish divân edebiyatı), as divân was the Ottoman Turkish word referring to the collected works of a poet.
Beginning with the Tanzimat reform period (1839–1876) of Ottoman history and continuing until the dissolution of the empire in the early 20th century, the Divan poetic tradition steadily dwindled, and more and more influence from both Turkish folk literature and European literature began to make itself felt.
Divan
Mesnevi
Mesnevi (also masnavi or mathnavi), in literary terms "rhyming couplets of profound spiritual meaning", is a style developed in Persian poetry, of which Nizami Ganjavi and Jami are the most famous poets. In Turkic literature, the first mesnevi was Yusuf Has Hajib's Kutadgu Bilig. The form covers social themes, as in Ferdowsi's Shahnameh and Fuzûlî's Leyla ile Mecnun; military events; educational subjects, such as the work of Yusuf Nabi; and religion or philosophy, as in Mevlana's (Rumi's) Masnavi.
A peculiarity of the mesnevi of the Ottoman period is that it almost always possesses, beneath the literal meaning, a subtle spiritual signification. Many poems, such as the Mesnevi of Mevlana and the Divan of Aşık Paşha, are confessedly religious, moral, or mystic, but a much larger number are allegorical. To this latter class belong almost all the long romantic mesnevis of the Persian and mid-Ottoman poets; in the stories of the loves of Leyla and Mecnun, Yusuf and Zuleykha, Husrev and Shirin, Suleyman and Ebsal, and a hundred others of like kind, we can see pictured, if we look beneath the surface, the love of the soul of man for God, or the yearning of the human heart for heavenly light and wisdom. There is not a character introduced into those romances but represents some passion, not an incident but has some spiritual meaning. In the history of Iskender, or Alexander, we watch the noble human soul in its struggles against the powers of this world and, when aided by God and guided by the heavenly wisdom of righteous teachers, its ultimate victory over every earthly passion and its attainment of that point of divine serenity whence it can look calmly down on all sublunary things.
Kaside
Kaside is generally about God, religious or government leaders, and their virtues. Its most famous poets are Ahmed Paşa, Necati, Bâkî, Nedîm, and, most importantly, Nef'i.
Terminology:
Tevhid: About the Unity of God.
Münacaat: Prayer to God.
Naat: About religious leaders and the prophet.
Methiye: About the sultan and government leaders.
Nesip or teşbib: Nature and environment descriptions.
Girizgah: Prelude to the topic.
Fahriye: Praise of the poet himself.
Dua: Prayer and well-wishing for the subject of the poem.
See also
Kashifi
Gazel
Persian metres
Şemi
Notes
Bibliography
Gibb, E.J.W. Ottoman Literature: The Poets and Poetry of Turkey.
Tanpınar, Ahmet Hamdi. 19'uncu Asır Türk Edebiyatı Tarihi. İstanbul: Çağlayan Kitabevi, 1988.
External links
Divan-Full Text-Republic of Turkey Ministry of Culture and Tourism
Masnavi-Full Text-Republic of Turkey Ministry of Culture and Tourism
Poetry by country
|
```swift
//
// Autocomplete+TextReplacementDictionary.swift
// KeyboardKit
//
// Created by Daniel Saidi on 2024-06-05.
//
import Foundation
public extension Autocomplete {
/// This type can be used to define text replacements,
/// e.g. to define custom autocorrections.
struct TextReplacementDictionary {
public init(
_ initialValue: Dictionary = .init()
) {
self.dictionary = initialValue
}
public typealias Dictionary = KeyboardLocale.Dictionary<[String: String]>
private var dictionary: Dictionary = .init()
}
}
public extension Autocomplete.TextReplacementDictionary {
/// This predefined dictionary can be used as a starting
/// point when defining a custom list of autocorrections.
static var additionalAutocorrections: Self {
.init(.init(
[
KeyboardLocale.english: [
"i": "I",
"ill": "I'll",
"Ill": "I'll"
]
]
))
}
}
public extension Autocomplete.TextReplacementDictionary {
/// Insert a text replacement for a certain locale.
mutating func addTextReplacement(
for text: String,
with replacement: String,
locale: KeyboardLocaleInfo
) {
addTextReplacements([text: replacement], for: locale)
}
/// Insert a text replacement for a certain locale.
mutating func addTextReplacements(
_ dict: [String: String],
for locale: KeyboardLocaleInfo
) {
var val = dictionary.value(for: locale) ?? [:]
dict.forEach {
val[$0.key] = $0.value
}
setTextReplacements(val, for: locale)
}
/// Set the text replacements for a certain locale.
mutating func setTextReplacements(
_ dict: [String: String],
for locale: KeyboardLocaleInfo
) {
dictionary.set(dict, for: locale)
}
/// Get a text replacement for a certain text and locale.
func textReplacement(
for text: String,
locale: KeyboardLocaleInfo
) -> String? {
textReplacements(for: locale)?[text]
}
/// Get all text replacements for a certain locale.
func textReplacements(
for locale: KeyboardLocaleInfo
) -> [String: String]? {
dictionary.value(for: locale)
}
}
```
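Stripped of the KeyboardKit-specific wrappers, the type above is essentially a per-locale map of replacement pairs. A rough Go sketch of the same add/merge/lookup flow (locale keys reduced to plain language-code strings for illustration; the real type wraps a `KeyboardLocale.Dictionary`):

```go
package main

import "fmt"

// TextReplacementDictionary sketches the Swift type above:
// locale code -> (text -> replacement).
type TextReplacementDictionary map[string]map[string]string

// AddTextReplacements merges new pairs into a locale's table,
// like the addTextReplacements(_:for:) method above.
func (d TextReplacementDictionary) AddTextReplacements(locale string, pairs map[string]string) {
	table := d[locale]
	if table == nil {
		table = map[string]string{}
		d[locale] = table
	}
	for text, replacement := range pairs {
		table[text] = replacement
	}
}

// TextReplacement looks up a replacement for a text and locale.
func (d TextReplacementDictionary) TextReplacement(locale, text string) (string, bool) {
	replacement, ok := d[locale][text]
	return replacement, ok
}

func main() {
	dict := TextReplacementDictionary{}
	// Seed it like the predefined additionalAutocorrections value.
	dict.AddTextReplacements("en", map[string]string{"i": "I", "ill": "I'll"})
	dict.AddTextReplacements("en", map[string]string{"Ill": "I'll"}) // merges, not replaces
	if r, ok := dict.TextReplacement("en", "ill"); ok {
		fmt.Println(r) // I'll
	}
}
```

Note how adding replacements merges into the existing table rather than overwriting it, matching the Swift `addTextReplacements` semantics.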
|
Nidularium mangaratibense is a plant species in the genus Nidularium. This species is endemic to Brazil.
References
mangaratibense
Flora of Brazil
|
The Rumford Municipal Building is located on Congress Street in the central business district of Rumford, Maine. Built in 1915 to a design by Lewiston architect Harry S. Coombs, it continues to house the town's municipal offices today. It is a fine example of Colonial Revival architecture, representing the town's growth in the early decades of the 20th century, and was listed on the National Register of Historic Places in 1980.
Description and history
The town of Rumford, Maine experienced rapid growth with the arrival in the 1880s of industrialist Hugh J. Chisholm and the railroad in the 1890s. Chisholm in 1893 established the paper processing industry that still dominates the town, and its population and central business district grew rapidly in the following decades. This growth prompted the need for a new town hall, and in 1915 the town retained Harry S. Coombs, a Maine architect based in Lewiston, to design a new municipal building, with a projected cost of $25,000. The building Coombs designed ended up costing $95,000 to complete.
The building is a 2-1/2 story Colonial Revival structure, faced in brick with granite trim. The main block is five bays wide, set on a raised foundation, with a hip roof topped by a clocktower with a coppered cupola. The main entrance, centered on the front facade, is accessed via a small flight of stairs, and is sheltered by a portico supported by Doric columns. The portico has an entablature topped by an extended dentillated cornice. First-floor windows are rectangular sash windows with a decorative granite keystone. The second-floor window above the entry is a Palladian window, its sections separated by engaged columns, with a swag decoration above. The remaining second-floor windows are rectangular, with a recessed round-arch section above. There is a cornice line separating the second floor from the attic level, which has groups of smaller windows just below the roof.
See also
National Register of Historic Places listings in Oxford County, Maine
References
City and town halls on the National Register of Historic Places in Maine
Colonial Revival architecture in Maine
Buildings and structures completed in 1916
Buildings and structures in Oxford County, Maine
Rumford, Maine
Town halls in Maine
National Register of Historic Places in Oxford County, Maine
|
```objective-c
#pragma once
#ifndef T_RASTERCM_INCLUDED
#define T_RASTERCM_INCLUDED
#include "traster.h"
#include "tpixelcm.h"
#undef DVAPI
#undef DVVAR
#ifdef TRASTER_EXPORTS
#define DVAPI DV_EXPORT_API
#define DVVAR DV_EXPORT_VAR
#else
#define DVAPI DV_IMPORT_API
#define DVVAR DV_IMPORT_VAR
#endif
typedef TRasterT<TPixelCM32> TRasterCM32;
#ifdef _WIN32
template class DVAPI TSmartPointerT<TRasterT<TPixelCM32>>;
template class DVAPI TRasterPT<TPixelCM32>;
#endif
typedef TRasterPT<TPixelCM32> TRasterCM32P;
#endif
```
|
```javascript
'use strict';
Object.defineProperty(exports, "__esModule", {
value: true
});
var _extends = Object.assign || function (target) { for (var i = 1; i < arguments.length; i++) { var source = arguments[i]; for (var key in source) { if (Object.prototype.hasOwnProperty.call(source, key)) { target[key] = source[key]; } } } return target; };
exports.default = CancelButton;
var _react = require('react');
var _react2 = _interopRequireDefault(_react);
var _button = require('./button');
var _button2 = _interopRequireDefault(_button);
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }
function _objectWithoutProperties(obj, keys) { var target = {}; for (var i in obj) { if (keys.indexOf(i) >= 0) continue; if (!Object.prototype.hasOwnProperty.call(obj, i)) continue; target[i] = obj[i]; } return target; }
var STYLE = {
borderColor: "#adadad",
backgroundColor: "#e6e6e6"
};
var STYLE_HOVER = {
backgroundColor: "#d4d4d4",
borderColor: "#8c8c8c"
};
function CancelButton(_ref) {
var children = _ref.children,
rest = _objectWithoutProperties(_ref, ['children']);
return _react2.default.createElement(
_button2.default,
_extends({ style: STYLE, styleHover: STYLE_HOVER }, rest),
children
);
}
```
|
Baidoo is a surname. Notable people with the surname include:
Charlotte Lily Baidoo, Ghanaian banker
Ishmael Baidoo (born 1998), Ghanaian footballer
Michael Baidoo (born 1999), Ghanaian footballer
Shabazz Baidoo (born 1988), English footballer
Stephen Baidoo (born 1976), Ghanaian footballer
|
```c
#ifndef NO_LABEL_VALUES
x(){if(&&e-&&b<0)x();b:goto*&&b;e:;}
#else
int x;
#endif
```
|
```java
/*
*
* contributor license agreements. See the NOTICE file distributed with
* this work for additional information regarding copyright ownership.
*
* path_to_url
*
* Unless required by applicable law or agreed to in writing, software
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
*/
package com.dianping.zebra.group.config.datasource.transform;
import com.dianping.zebra.group.config.datasource.IEntity;
import com.dianping.zebra.group.config.datasource.IVisitor;
import com.dianping.zebra.group.config.datasource.entity.Any;
import com.dianping.zebra.group.config.datasource.entity.DataSourceConfig;
import com.dianping.zebra.group.config.datasource.entity.GroupDataSourceConfig;
import java.util.Iterator;
import java.util.Stack;
public class DefaultMerger implements IVisitor {
private Stack<Object> m_objs = new Stack<Object>();
private GroupDataSourceConfig m_groupDataSourceConfig;
public DefaultMerger() {
}
public DefaultMerger(GroupDataSourceConfig groupDataSourceConfig) {
this.m_groupDataSourceConfig = groupDataSourceConfig;
this.m_objs.push(groupDataSourceConfig);
}
public GroupDataSourceConfig getGroupDataSourceConfig() {
return this.m_groupDataSourceConfig;
}
protected Stack<Object> getObjects() {
return this.m_objs;
}
public <T> void merge(IEntity<T> to, IEntity<T> from) {
this.m_objs.push(to);
from.accept(this);
this.m_objs.pop();
}
protected void mergeDataSourceConfig(DataSourceConfig to, DataSourceConfig from) {
to.mergeAttributes(from);
to.getProperties().addAll(from.getProperties());
to.setTimeWindow(from.getTimeWindow());
to.setPunishLimit(from.getPunishLimit());
to.setJdbcUrl(from.getJdbcUrl());
to.setUsername(from.getUsername());
to.setDriverClass(from.getDriverClass());
to.setPassword(from.getPassword());
to.setWarmupTime(from.getWarmupTime());
}
protected void mergeGroupDataSourceConfig(GroupDataSourceConfig to, GroupDataSourceConfig from) {
to.mergeAttributes(from);
}
public void visitAny(Any any) {
}
public void visitDataSourceConfig(DataSourceConfig from) {
DataSourceConfig to = (DataSourceConfig) this.m_objs.peek();
this.mergeDataSourceConfig(to, from);
this.visitDataSourceConfigChildren(to, from);
}
protected void visitDataSourceConfigChildren(DataSourceConfig to, DataSourceConfig from) {
to.getProperties().addAll(from.getProperties());
}
public void visitGroupDataSourceConfig(GroupDataSourceConfig from) {
GroupDataSourceConfig to = (GroupDataSourceConfig) this.m_objs.peek();
this.mergeGroupDataSourceConfig(to, from);
this.visitGroupDataSourceConfigChildren(to, from);
}
protected void visitGroupDataSourceConfigChildren(GroupDataSourceConfig to, GroupDataSourceConfig from) {
Iterator var3 = from.getDataSourceConfigs().values().iterator();
while (var3.hasNext()) {
DataSourceConfig source = (DataSourceConfig) var3.next();
DataSourceConfig target = to.findDataSourceConfig(source.getId());
if (target == null) {
target = new DataSourceConfig(source.getId());
to.addDataSourceConfig(target);
}
this.m_objs.push(target);
source.accept(this);
this.m_objs.pop();
}
}
}
```
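The merger pairs each source `DataSourceConfig` with a target of the same id, creating missing targets before descending. A condensed Go sketch of that merge-by-id walk (field set reduced to one scalar for illustration; the real class merges many more fields through the visitor stack):

```go
package main

import "fmt"

// DataSourceConfig is a reduced stand-in for the Java entity.
type DataSourceConfig struct {
	ID      string
	JdbcURL string
}

// GroupDataSourceConfig holds configs keyed by id.
type GroupDataSourceConfig struct {
	DataSourceConfigs map[string]*DataSourceConfig
}

// Merge copies every config from "from" into "to", creating
// targets that do not exist yet -- the same walk the Java
// visitGroupDataSourceConfigChildren method performs.
func Merge(to, from *GroupDataSourceConfig) {
	for id, source := range from.DataSourceConfigs {
		target, ok := to.DataSourceConfigs[id]
		if !ok {
			target = &DataSourceConfig{ID: id}
			to.DataSourceConfigs[id] = target
		}
		// mergeDataSourceConfig: overwrite scalar fields.
		target.JdbcURL = source.JdbcURL
	}
}

func main() {
	to := &GroupDataSourceConfig{DataSourceConfigs: map[string]*DataSourceConfig{
		"db1": {ID: "db1", JdbcURL: "jdbc:old"},
	}}
	from := &GroupDataSourceConfig{DataSourceConfigs: map[string]*DataSourceConfig{
		"db1": {ID: "db1", JdbcURL: "jdbc:new"},
		"db2": {ID: "db2", JdbcURL: "jdbc:extra"},
	}}
	Merge(to, from)
	fmt.Println(to.DataSourceConfigs["db1"].JdbcURL, len(to.DataSourceConfigs)) // jdbc:new 2
}
```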
|
```objective-c
//
//
// path_to_url
//
// Unless required by applicable law or agreed to in writing, software
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#pragma once
#include <string>
#include <unordered_map>
#include <unordered_set>
#include <vector>
#include "paddle/fluid/framework/ir/fuse_pass_base.h"
namespace paddle {
namespace framework {
namespace ir {
class QuantDequantMkldnnPass : public FusePassBase {
public:
QuantDequantMkldnnPass() = default;
virtual ~QuantDequantMkldnnPass() {}
protected:
void ApplyImpl(ir::Graph* graph) const override;
private:
void MarkSkipQuantizedOps(
ir::Graph* graph, const std::unordered_set<std::string>& skip_ops) const;
void MarkSkipQuantizedPool2d(ir::Graph* graph) const;
void CollectInfoFromFake(
ir::Graph* graph,
Scope* scope,
const std::unordered_set<std::string>& fake_dequantize_types,
std::unordered_map<std::string, std::vector<float>>* weight_thresholds)
const;
///
/// \brief collect scale info for weight from onnx_format dequantize_linear op
/// onnx_format_dequantize_types: the onnx_format dequantize op type
/// weight_thresholds: scale info for weight
/// var_quant_scales: scale info for act
/// onnx_format_quantize_model: recorder if the quantize model is a
/// onnx_format quantize model
///
void CollectWeightScalesInfoFromONNXFormatDequantize(
ir::Graph* graph,
Scope* scope,
std::unordered_map<std::string, std::vector<float>>* weight_thresholds,
std::unordered_map<std::string, std::vector<float>>* var_quant_scales,
bool* onnx_format_quantize_model) const;
void CollectInputScalesFromQuantize(
ir::Graph* graph,
Scope* scope,
const std::unordered_set<std::string>& fake_quantize_types,
std::unordered_map<std::string, std::vector<float>>* var_quant_scales)
const;
void ConvertFromINT8ToFP32(const std::vector<float>& scales,
phi::DenseTensor* weight_tensor,
int8_t* int8_weight_data,
float* fp32_weight_data,
const std::string& weight_var_name) const;
void CollectOutputScalesFromAttr(
ir::Graph* graph,
std::unordered_map<std::string, std::vector<float>>* var_quant_scales)
const;
void CollectFakeQuantizeOps(ir::Graph* graph,
Node* op_node,
std::unordered_set<const Node*>* nodes2rm) const;
void CollectFakeDequantizeOps(
ir::Graph* graph,
Node* op_node,
std::unordered_set<const Node*>* nodes2rm) const;
///
/// \brief collect all the onnx_format quantize related ops to remove
/// nodes2rm: record all quantize related ops to remove
///
void CollectQuantizeDequantizeOpsFromONNXFormat(
ir::Graph* graph,
Node* op_node,
std::unordered_set<const Node*>* nodes2rm) const;
void RemoveFakeOps(
ir::Graph* graph,
const std::unordered_set<std::string>& fake_quantize_types,
const std::unordered_set<std::string>& fake_dequantize_types,
const std::unordered_set<std::string>& fake_quantize_dequantize_types,
const std::unordered_set<std::string>&
onnx_format_quantize_dequantize_types) const;
bool IsInt8Weight(Node* op_node,
Scope* scope,
const std::string& weight_name) const;
void TransposeWeight(phi::DenseTensor* input) const;
void DequantizeOpWeights(
Node* op_node,
Scope* scope,
const std::string& weight_name,
const std::string& output_name,
const std::unordered_map<std::string, std::vector<float>>&
weight_thresholds) const;
///
/// \brief Dequantize weight in conv or matmul
/// weight_thresholds: recorded scale info for weight
///
void DequantizeOpWeightsFromONNXFormat(
Node* op_node,
Scope* scope,
const std::string& weight_name,
const std::unordered_map<std::string, std::vector<float>>&
weight_thresholds,
std::vector<std::string>* dequantized_weights_names) const;
void DequantizeWeights(
ir::Graph* graph,
Scope* scope,
const std::unordered_map<std::string, std::vector<float>>&
weight_thresholds,
const bool& onnx_format_quantize_model) const;
void UpdateActivations(ir::Graph* graph) const;
void RemoveCtrlVars(ir::Graph* graph) const;
};
} // namespace ir
} // namespace framework
} // namespace paddle
```
|
The Blue Cross & Blue Shield of Rhode Island Headquarters is a LEED Certified high-rise in downtown Providence, Rhode Island.
Location
The building is sited immediately north of the two Waterplace Towers, at the intersection of American Express Plaza and Exchange Street. It is currently the seventh-tallest building in the city and state.
Purpose
The health insurer announced on April 19, 2007, its intention to consolidate its operations, comprising 1,100 employees who had been scattered throughout the city, into one 12-story building. The $114 million tower was to be built on top of a parking garage then under construction for Intercontinental's neighboring Waterplace condominiums, with completion estimated for early 2010. The economic feasibility of the move was made possible through $25 million worth of tax breaks negotiated with Intercontinental.
Opinion
Mayor David Cicilline was optimistic at the prospect of retaining the 1,100 employees in the city, adding to the "economy and add to the life and vitality of our city". The state's governor, Donald Carcieri, worried about the impact that occupying such expensive real estate would have on premiums for the insurer's customers, arguing that the first priority of Blue Cross & Blue Shield should be affordable coverage for citizens.
Gallery
References
Skyscraper office buildings in Providence, Rhode Island
Economy of Providence, Rhode Island
Office buildings completed in 2010
|
```c
/* $OpenBSD: installboot.c,v 1.20 2020/03/11 09:59:31 otto Exp $ */
/* $NetBSD: installboot.c,v 1.2 1997/04/06 08:41:12 cgd Exp $ */
/*
* All rights reserved.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions
* are met:
* 1. Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
* 2. Redistributions in binary form must reproduce the above copyright
* notice, this list of conditions and the following disclaimer in the
* documentation and/or other materials provided with the distribution.
* 3. All advertising materials mentioning features or use of this software
* must display the following acknowledgement:
* This product includes software developed by Paul Kranenburg.
* 4. The name of the author may not be used to endorse or promote products
* derived from this software without specific prior written permission
*
* THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS OR
* IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES
* OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.
* IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT,
* INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT
* NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
* DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
* THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
* (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF
* THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
#include <sys/param.h>
#include <sys/mount.h>
#include <sys/ioctl.h>
#include <sys/time.h>
#include <sys/stat.h>
#include <sys/sysctl.h>
#include <ufs/ufs/dinode.h>
#include <ufs/ufs/dir.h>
#include <ufs/ffs/fs.h>
#include <sys/disklabel.h>
#include <sys/dkio.h>
#include <err.h>
#include <errno.h>
#include <fcntl.h>
#include <stdlib.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <util.h>
#include "bbinfo.h"
#ifndef ISO_DEFAULT_BLOCK_SIZE
#define ISO_DEFAULT_BLOCK_SIZE 2048
#endif
int verbose, nowrite, hflag;
char *boot, *proto, *dev;
struct bbinfoloc *bbinfolocp;
struct bbinfo *bbinfop;
int max_block_count;
char *loadprotoblocks(char *, long *);
int loadblocknums(char *, int, unsigned long);
static void devread(int, void *, daddr_t, size_t, char *);
static void usage(void);
static int sbchk(struct fs *, daddr_t);
static void sbread(int, daddr_t, struct fs **, char *);
int main(int, char *[]);
int isofsblk = 0;
int isofseblk = 0;
static const daddr_t sbtry[] = SBLOCKSEARCH;
static void
usage(void)
{
(void)fprintf(stderr,
"usage: installboot [-n] [-v] [-s isofsblk -e isofseblk] "
"<boot> <proto> <device>\n");
exit(1);
}
int
main(int argc, char *argv[])
{
int c, devfd;
char *protostore;
long protosize;
struct stat disksb, bootsb;
struct disklabel dl;
daddr_t partoffset;
#define BBPAD 0x1e0
struct bb {
char bb_pad[BBPAD]; /* disklabel lives in here, actually */
long bb_secsize; /* size of secondary boot block */
long bb_secstart; /* start of secondary boot block */
long bb_flags; /* unknown; always zero */
long bb_cksum; /* checksum of the boot block, as longs. */
} bb;
long *lp, *ep;
while ((c = getopt(argc, argv, "vns:e:")) != -1) {
switch (c) {
case 'n':
/* Do not actually write the bootblock to disk */
nowrite = 1;
break;
case 'v':
/* Chat */
verbose = 1;
break;
case 's':
isofsblk = atoi(optarg);
break;
case 'e':
isofseblk = atoi(optarg);
break;
default:
usage();
}
}
if (argc - optind < 3)
usage();
boot = argv[optind];
proto = argv[optind + 1];
dev = argv[optind + 2];
if (verbose) {
(void)printf("boot: %s\n", boot);
(void)printf("proto: %s\n", proto);
(void)printf("device: %s\n", dev);
}
/* Load proto blocks into core */
if ((protostore = loadprotoblocks(proto, &protosize)) == NULL)
exit(1);
/* Open and check raw disk device */
if ((devfd = opendev(dev, O_RDONLY, OPENDEV_PART, &dev)) < 0)
err(1, "open: %s", dev);
if (fstat(devfd, &disksb) == -1)
err(1, "fstat: %s", dev);
if (!S_ISCHR(disksb.st_mode))
errx(1, "%s must be a character device node", dev);
if ((minor(disksb.st_rdev) % getmaxpartitions()) != getrawpartition())
errx(1, "%s must be the raw partition", dev);
/* Extract and load block numbers */
if (stat(boot, &bootsb) == -1)
err(1, "stat: %s", boot);
if (!S_ISREG(bootsb.st_mode))
errx(1, "%s must be a regular file", boot);
if ((minor(disksb.st_rdev) / getmaxpartitions()) !=
(minor(bootsb.st_dev) / getmaxpartitions()))
errx(1, "%s must be somewhere on %s", boot, dev);
/*
* Find the offset of the secondary boot block's partition
* into the disk. If disklabels not supported, assume zero.
*/
if (ioctl(devfd, DIOCGDINFO, &dl) != -1) {
partoffset = DL_GETPOFFSET(&dl.d_partitions[minor(bootsb.st_dev) %
getmaxpartitions()]);
} else {
if (errno != ENOTTY)
err(1, "read disklabel: %s", dev);
warnx("couldn't read label from %s, using part offset of 0",
dev);
partoffset = 0;
}
if (verbose)
(void)printf("%s partition offset = 0x%llx\n", boot, partoffset);
/* Sync filesystems (make sure boot's block numbers are stable) */
sync();
sleep(2);
sync();
sleep(2);
if (loadblocknums(boot, devfd, DL_SECTOBLK(&dl, partoffset)) != 0)
exit(1);
(void)close(devfd);
if (nowrite)
return 0;
#if 0
/* Write patched proto bootblocks into the superblock */
if (protosize > SBSIZE - DEV_BSIZE)
errx(1, "proto bootblocks too big");
#endif
if ((devfd = opendev(dev, O_RDWR, OPENDEV_PART, &dev)) < 0)
err(1, "open: %s", dev);
if (lseek(devfd, DEV_BSIZE, SEEK_SET) != DEV_BSIZE)
err(1, "lseek bootstrap");
if (write(devfd, protostore, protosize) != protosize)
err(1, "write bootstrap");
if (lseek(devfd, 0, SEEK_SET) != 0)
err(1, "lseek label");
if (read(devfd, &bb, sizeof (bb)) != sizeof (bb))
err(1, "read label");
bb.bb_secsize = 15;
bb.bb_secstart = 1;
bb.bb_flags = 0;
bb.bb_cksum = 0;
for (lp = (long *)&bb, ep = &bb.bb_cksum; lp < ep; lp++)
bb.bb_cksum += *lp;
if (lseek(devfd, 0, SEEK_SET) != 0)
err(1, "lseek label 2");
if (write(devfd, &bb, sizeof bb) != sizeof bb)
err(1, "write label ");
(void)close(devfd);
return 0;
}
char *
loadprotoblocks(char *fname, long *size)
{
int fd, sz;
char *bp;
struct stat statbuf;
u_int64_t *matchp;
/*
* Read the prototype boot block into memory.
*/
if ((fd = open(fname, O_RDONLY)) < 0) {
warn("open: %s", fname);
return NULL;
}
if (fstat(fd, &statbuf) != 0) {
warn("fstat: %s", fname);
close(fd);
return NULL;
}
sz = roundup(statbuf.st_size, DEV_BSIZE);
if ((bp = calloc(sz, 1)) == NULL) {
warnx("malloc: %s: no memory", fname);
close(fd);
return NULL;
}
if (read(fd, bp, statbuf.st_size) != statbuf.st_size) {
warn("read: %s", fname);
free(bp);
close(fd);
return NULL;
}
close(fd);
/*
* Find the magic area of the program, and figure out where
* the 'blocks' struct is, from that.
*/
bbinfolocp = NULL;
for (matchp = (u_int64_t *)bp; (char *)matchp < bp + sz; matchp++) {
if (*matchp != 0xbabefacedeadbeef)
continue;
bbinfolocp = (struct bbinfoloc *)matchp;
if (bbinfolocp->magic1 == 0xbabefacedeadbeef &&
bbinfolocp->magic2 == 0xdeadbeeffacebabe)
break;
bbinfolocp = NULL;
}
if (bbinfolocp == NULL) {
warnx("%s: not a valid boot block?", fname);
return NULL;
}
bbinfop = (struct bbinfo *)(bp + bbinfolocp->end - bbinfolocp->start);
memset(bbinfop, 0, sz - (bbinfolocp->end - bbinfolocp->start));
max_block_count =
((char *)bbinfop->blocks - bp) / sizeof (bbinfop->blocks[0]);
if (verbose) {
(void)printf("boot block info locator at offset 0x%x\n",
(char *)bbinfolocp - bp);
(void)printf("boot block info at offset 0x%x\n",
(char *)bbinfop - bp);
(void)printf("max number of blocks: %d\n", max_block_count);
}
*size = sz;
return (bp);
}
static void
devread(int fd, void *buf, daddr_t blk, size_t size, char *msg)
{
if (pread(fd, buf, size, dbtob((off_t)blk)) != (ssize_t)size)
err(1, "%s: devread: pread", msg);
}
static char sblock[SBSIZE];
int
loadblocknums(char *boot, int devfd, unsigned long partoffset)
{
int i, fd, ndb;
struct stat statbuf;
struct statfs statfsbuf;
struct fs *fs;
char *buf;
daddr32_t *ap1;
daddr_t blk, *ap2;
struct ufs1_dinode *ip1;
struct ufs2_dinode *ip2;
int32_t cksum;
/*
* Open 2nd-level boot program and record the block numbers
* it occupies on the filesystem represented by `devfd'.
*/
if ((fd = open(boot, O_RDONLY)) < 0)
err(1, "open: %s", boot);
if (fstatfs(fd, &statfsbuf) != 0)
err(1, "statfs: %s", boot);
if (isofsblk) {
bbinfop->bsize = ISO_DEFAULT_BLOCK_SIZE;
bbinfop->nblocks = isofseblk - isofsblk + 1;
if (bbinfop->nblocks > max_block_count)
errx(1, "%s: Too many blocks", boot);
if (verbose)
(void)printf("%s: starting block %d (%d total):\n\t",
boot, isofsblk, bbinfop->nblocks);
for (i = 0; i < bbinfop->nblocks; i++) {
blk = (isofsblk + i) * (bbinfop->bsize / DEV_BSIZE);
bbinfop->blocks[i] = blk;
if (verbose)
(void)printf("%d ", blk);
}
if (verbose)
(void)printf("\n");
cksum = 0;
for (i = 0; i < bbinfop->nblocks +
(sizeof(*bbinfop) / sizeof(bbinfop->blocks[0])) - 1; i++)
cksum += ((int32_t *)bbinfop)[i];
bbinfop->cksum = -cksum;
return 0;
}
if (strncmp(statfsbuf.f_fstypename, MOUNT_FFS, MFSNAMELEN))
errx(1, "%s: must be on a FFS filesystem", boot);
if (fsync(fd) != 0)
err(1, "fsync: %s", boot);
if (fstat(fd, &statbuf) != 0)
err(1, "fstat: %s", boot);
close(fd);
/* Read superblock */
sbread(devfd, partoffset, &fs, sblock);
/* Read inode */
if ((buf = malloc(fs->fs_bsize)) == NULL)
errx(1, "No memory for filesystem block");
blk = fsbtodb(fs, ino_to_fsba(fs, statbuf.st_ino));
devread(devfd, buf, blk + partoffset, fs->fs_bsize, "inode");
if (fs->fs_magic == FS_UFS1_MAGIC) {
ip1 = (struct ufs1_dinode *)(buf) + ino_to_fsbo(fs,
statbuf.st_ino);
ndb = howmany(ip1->di_size, fs->fs_bsize);
} else {
ip2 = (struct ufs2_dinode *)(buf) + ino_to_fsbo(fs,
statbuf.st_ino);
ndb = howmany(ip2->di_size, fs->fs_bsize);
}
/*
* Check the block numbers; we don't handle fragments
*/
if (ndb > max_block_count)
errx(1, "%s: Too many blocks", boot);
/*
* Register filesystem block size.
*/
bbinfop->bsize = fs->fs_bsize;
/*
* Register block count.
*/
bbinfop->nblocks = ndb;
if (verbose)
(void)printf("%s: block numbers: ", boot);
if (fs->fs_magic == FS_UFS1_MAGIC) {
ap1 = ip1->di_db;
for (i = 0; i < NDADDR && *ap1 && ndb; i++, ap1++, ndb--) {
blk = fsbtodb(fs, *ap1);
bbinfop->blocks[i] = blk + partoffset;
if (verbose)
(void)printf("%d ", bbinfop->blocks[i]);
}
} else {
ap2 = ip2->di_db;
for (i = 0; i < NDADDR && *ap2 && ndb; i++, ap2++, ndb--) {
blk = fsbtodb(fs, *ap2);
bbinfop->blocks[i] = blk + partoffset;
if (verbose)
(void)printf("%d ", bbinfop->blocks[i]);
}
}
if (verbose)
(void)printf("\n");
if (ndb == 0)
goto checksum;
/*
* Just one level of indirections; there isn't much room
* for more in the 1st-level bootblocks anyway.
*/
if (verbose)
(void)printf("%s: block numbers (indirect): ", boot);
if (fs->fs_magic == FS_UFS1_MAGIC) {
blk = ip1->di_ib[0];
devread(devfd, buf, blk + partoffset, fs->fs_bsize,
"indirect block");
ap1 = (daddr32_t *)buf;
for (; i < NINDIR(fs) && *ap1 && ndb; i++, ap1++, ndb--) {
blk = fsbtodb(fs, *ap1);
bbinfop->blocks[i] = blk + partoffset;
if (verbose)
(void)printf("%d ", bbinfop->blocks[i]);
}
} else {
blk = ip2->di_ib[0];
devread(devfd, buf, blk + partoffset, fs->fs_bsize,
"indirect block");
ap2 = (daddr_t *)buf;
for (; i < NINDIR(fs) && *ap2 && ndb; i++, ap2++, ndb--) {
blk = fsbtodb(fs, *ap2);
bbinfop->blocks[i] = blk + partoffset;
if (verbose)
(void)printf("%d ", bbinfop->blocks[i]);
}
}
if (verbose)
(void)printf("\n");
if (ndb)
errx(1, "%s: Too many blocks", boot);
checksum:
cksum = 0;
for (i = 0; i < bbinfop->nblocks +
(sizeof (*bbinfop) / sizeof (bbinfop->blocks[0])) - 1; i++)
cksum += ((int32_t *)bbinfop)[i];
bbinfop->cksum = -cksum;
return 0;
}
static int
sbchk(struct fs *fs, daddr_t sbloc)
{
if (verbose)
fprintf(stderr, "looking for superblock at %lld\n", sbloc);
if (fs->fs_magic != FS_UFS2_MAGIC && fs->fs_magic != FS_UFS1_MAGIC) {
if (verbose)
fprintf(stderr, "bad superblock magic 0x%x\n",
fs->fs_magic);
return (0);
}
/*
* Looking for an FFS1 file system at SBLOCK_UFS2 will find the
* wrong superblock for file systems with 64k block size.
*/
if (fs->fs_magic == FS_UFS1_MAGIC && sbloc == SBLOCK_UFS2) {
if (verbose)
fprintf(stderr, "skipping ffs1 superblock at %lld\n",
sbloc);
return (0);
}
if (fs->fs_bsize <= 0 || fs->fs_bsize < sizeof(struct fs) ||
fs->fs_bsize > MAXBSIZE) {
if (verbose)
fprintf(stderr, "invalid superblock block size %d\n",
fs->fs_bsize);
return (0);
}
if (fs->fs_sbsize <= 0 || fs->fs_sbsize > SBSIZE) {
if (verbose)
fprintf(stderr, "invalid superblock size %d\n",
fs->fs_sbsize);
return (0);
}
if (fs->fs_inopb <= 0) {
if (verbose)
fprintf(stderr, "invalid superblock inodes/block %d\n",
fs->fs_inopb);
return (0);
}
if (verbose)
fprintf(stderr, "found valid %s superblock\n",
fs->fs_magic == FS_UFS2_MAGIC ? "ffs2" : "ffs1");
return (1);
}
static void
sbread(int fd, daddr_t poffset, struct fs **fs, char *sblock)
{
int i;
daddr_t sboff;
for (i = 0; sbtry[i] != -1; i++) {
sboff = sbtry[i] / DEV_BSIZE;
devread(fd, sblock, poffset + sboff, SBSIZE, "superblock");
*fs = (struct fs *)sblock;
if (sbchk(*fs, sbtry[i]))
break;
}
if (sbtry[i] == -1)
errx(1, "couldn't find ffs superblock");
}
```
|
My Hustler is a 1965 American film by Andy Warhol and Paul Morrissey. The film is propelled by the sonorous, magnetic acting of 30-year-old Ed Hood interacting with the blonde hustler, Paul America.
Joe Campbell ("Sugar Plum Fairy"), Genevieve Charbin and Dorothy Dean also compete for the attentions of the Hustler and provide foils for the interaction of the main characters. The erudite and very funny Hood, a perpetual graduate student in English at Harvard and "live parody of southern gentility", was recruited by Chuck Wein. Hood's magnetic performance was driven by his deep, mellifluous voice, trained by elocution lessons as a privileged child in Alabama, and lubricated copiously by alcohol. Among his many peculiarities was his habit of drinking beer from the bottle, not by placing the bottle to his lips, but into his mouth, sucking on it, as seen in the film.
Production
The film is a collaboration between Warhol, Chuck Wein and Paul Morrissey, with Morrissey as camera and audio operator and Wein credited as director. It was filmed over Labor Day weekend, 1965, on Fire Island, New York, using a 16mm Auricon news camera. My Hustler is the first Warhol film worked on by Paul Morrissey, who with this film introduced camera movement and audible sound to Warhol's cinematography.
Release
Distribution
The first advertisement for a screening of My Hustler appeared in the 6 January 1966 issue of the Village Voice, for screenings on 12, 13 and 14 January at 8 and 10 pm at the Filmmakers' Cinematheque, and the film was mentioned in the New York Times on January 30. On January 20 a "special notice" in The Village Voice informed readers that the film would play "every midnight indefinitely" due to public demand. It ended its first run at the Filmmakers' Cinematheque in the middle of April 1966. Opening at the more mainstream Hudson Theater in July 1967, the film was shown near-continuously in New York City through the end of 1968. The film first showed in Los Angeles and Chicago in early July 1966 and then ran near-continuously in both cities through 1969. The film was also shown in Tucson, San Bernardino, Albuquerque, Akron and Indianapolis in 1966–67.
Revivals
Several revivals of My Hustler have occurred in New York City since 1969, although the provenance of some of these showings is uncertain due to Andy Warhol withdrawing his films from distribution between 1970 and 1984. In 1984, with Warhol's approval, the Museum of Modern Art and the Whitney Museum embarked on a joint archival research project to catalog Warhol's extensive film collection, investigate its history, and preserve and re-release all of the films along with scholarly research and publication.
Since 1988, My Hustler has been screened multiple times at both museums as part of the project. However, like other Warhol films, it has not been released on any commercial medium, such as DVD. The preservation effort has nevertheless given scholars and film enthusiasts valuable access to these important works of American avant-garde cinema, allowing a deeper appreciation of Warhol's contributions to film and preserving the works for future generations.
Reception
Box office
My Hustler was the first Warhol film to make any money, grossing $4,000 from its initial run at the Filmmakers' Cinematheque. At the Hudson Theater in New York, it grossed $52,400 in its first three weeks.
Critical reception
A common reaction at the time of the film's initial release was one of revulsion. New York Times critic Bosley Crowther wrote:
"It is sordid, vicious and contemptuous. The only thing engaging about it is a certain quality and tone of degradation that is almost too candid and ruthless to be believed."
A few years later the revolutionary nature of My Hustler was being recognised; Vincent Canby, in somewhat backhanded and grudging praise, complained that distributors were taking his critical remarks out of context and using them as advertising come-ons:
"Warhol, of course, is responsible for one of the toughest dilemmas facing critics today. Largely as a result of his pioneering in the making of movies like Chelsea Girls and My Hustler it's impossible to accurately describe many new movies without automatically writing phrases that can't be picked up and used as instant come-ons."
By 1995 the critical perspective on Warhol's most influential films, including My Hustler, had shifted to an appreciation of their unique, semi-documentary quality, as Stephen Holden wrote:
"The esthetic running through Warhol's films is an icy voyeurism. As witty or sexy or photogenic as Warhol's superstars may have been, their largely unstructured, crudely edited play-acting in front of his camera could also be cruelly revealing. Again and again, one has the feeling of confronting people with limited internal resources, desperate to be noticed at any cost."
Censorship
My Hustler was the subject of plainclothes police surveillance in the audience during its initial theatrical release in 1966, and on April 12 the owners of the Filmmakers' Cinematheque were served a summons to a hearing to show cause why the theater's license should not be revoked for showing films of "sexual immorality, lewdness, perversion and homosexuality." On November 16, after a defence by the New York Civil Liberties Union, the charges were thrown out.
See also
List of American films of 1965
Andy Warhol filmography
References
Bibliography
Escoffier, Jeffrey. Bigger Than Life: The History of Gay Porn Cinema from Beefcake to Hardcore. Philadelphia: Running Press, 2009.
Victor Bockris, Warhol: The Biography. Da Capo Press, 1989.
Watson, Steven, Factory Made. Pantheon Books, 2003.
External links
My Hustler at IMDb
My Hustler at warholstars.org
1965 films
Films directed by Andy Warhol
American independent films
1960s American films
|
```java
/*
* one or more contributor license agreements. See the NOTICE file distributed
* with this work for additional information regarding copyright ownership.
*/
package io.camunda.zeebe.broker.system.partitions.impl.steps;
import io.atomix.raft.RaftServer.Role;
import io.camunda.zeebe.broker.system.partitions.PartitionTransitionContext;
import io.camunda.zeebe.broker.system.partitions.PartitionTransitionStep;
import io.camunda.zeebe.engine.state.QueryService;
import io.camunda.zeebe.engine.state.query.StateQueryService;
import io.camunda.zeebe.scheduler.future.ActorFuture;
import io.camunda.zeebe.scheduler.future.CompletableActorFuture;
import java.time.InstantSource;
import org.agrona.CloseHelper;
public final class QueryServicePartitionTransitionStep implements PartitionTransitionStep {
@Override
public ActorFuture<Void> prepareTransition(
final PartitionTransitionContext context, final long term, final Role targetRole) {
final var currentRole = context.getCurrentRole();
final QueryService queryService = context.getQueryService();
if (queryService != null && (currentRole == Role.LEADER || targetRole == Role.INACTIVE)) {
try {
CloseHelper.close(queryService);
context.setQueryService(null);
return CompletableActorFuture.completed(null);
} catch (final Exception e) {
return CompletableActorFuture.completedExceptionally(e);
}
}
return CompletableActorFuture.completed(null);
}
@Override
public ActorFuture<Void> transitionTo(
final PartitionTransitionContext context, final long term, final Role targetRole) {
final var currentRole = context.getCurrentRole();
if (targetRole != Role.INACTIVE
&& (currentRole == Role.LEADER || context.getQueryService() == null)) {
try {
final var service = new StateQueryService(context.getZeebeDb(), InstantSource.system());
context.setQueryService(service);
return CompletableActorFuture.completed(null);
} catch (final Exception e) {
return CompletableActorFuture.completedExceptionally(e);
}
}
return CompletableActorFuture.completed(null);
}
@Override
public String getName() {
return "QueryService";
}
}
```
|
Danila Comastri Montanari (4 November 1948 – 28 July 2023) was an Italian historical mystery fiction writer. She wrote the Publius Aurelius Statius series.
Biography
A graduate in pedagogy and political science, she taught for twenty years while continuing to travel regularly.
In 1990 she wrote her first novel, Mors tua, and then devoted herself full-time to fiction, favouring the genre of historical mystery, which allowed her to reconcile her main interests: the study of the past (in particular ancient civilizations) and her love of mystery plots.
Starting in 1990, she wrote historical detective stories centred on the figure of Publius Aurelius Statius, a senator in the Rome of Claudius. Nineteen novels in the series were published.
In addition to the Statius series, Comastri Montanari penned other novels and short stories set in different historical periods. In February 2007 she published with Hobby & Work the essay Giallo antico (How to Write a Historical Detective Story); its appendix contains the stories Pirates of the Chersonese, Assassination at the Temple of Vesta and Il giallo del serpente.
Danila Comastri Montanari died on 28 July 2023, at the age of 74.
Bibliography
Publius Aurelius series
1990 – Mors tua
1991 – In corpore sano
1993 – Cave canem
1994 – Morituri te salutant
1996 – Parce sepulto
1997 – Cui prodest?
1999 – Spes ultima dea
2000 – Scelera
2001 – Gallia est
2002 – Saturnalia
2003 – Ars moriendi – Un'indagine a Pompei
2004 – Olympia – Un'indagine ai giochi ellenici
2005 – Tenebrae
2007 – Nemesis
2009 – Dura Lex
2011 – Tabula Rasa
2013 – Pallida Mors
2015 - Saxa rubra
2017 - Ludus in fabula
Standalone novels
1995 – Ricette per un delitto
1996 – La campana dell'arciprete
1997 – Il panno di Mastro Gervaso
1999 – Una strada giallo sangue
2003 – Istigazione a delinquere
2008 – Terrore
References
1948 births
2023 deaths
Italian women writers
Italian historical novelists
Writers of historical mysteries
Writers of historical fiction set in antiquity
University of Bologna alumni
Writers from Bologna
|
```c
//===========================================================================
//
// Name: _files.c
// Function:
// Programmer: Mr Elusive
// Last update: 1999-12-02
// Tab Size: 4
//===========================================================================
/*
aas_areamerging.c //AAS area merging
aas_cfg.c //AAS configuration for different games
aas_create.c //AAS creating
aas_edgemelting.c //AAS edge melting
aas_facemerging.c //AAS face merging
aas_file.c //AAS file writing
aas_gsubdiv.c //AAS gravitational and ladder subdivision
aas_map.c //AAS map brush creation
aas_prunenodes.c //AAS node pruning
aas_store.c //AAS file storing
map.c //map file loading and writing
map_hl.c //Half-Life map loading
map_q1.c //Quake1 map loading
map_q2.c //Quake2 map loading
map_q3.c //Quake3 map loading
map_sin.c //Sin map loading
tree.c //BSP tree management + node pruning (*)
brushbsp.c //brush bsp creation (*)
portals.c //BSP portal creation and leaf filling (*)
csg.c //Constructive Solid Geometry brush chopping (*)
leakfile.c //leak file writing (*)
textures.c //Quake2 BSP textures (*)
l_bsp_ent.c //BSP entity parsing
l_bsp_hl.c //Half-Life BSP loading and writing
l_bsp_q1.c //Quake1 BSP loading and writing
l_bsp_q2.c //Quake2 BSP loading and writing
l_bsp_q3.c //Quake3 BSP loading and writing
l_bsp_sin.c //Sin BSP loading and writing
l_cmd.c //cmd library
l_log.c //log file library
l_math.c //math library
l_mem.c //memory management library
l_poly.c //polygon (winding) library
l_script.c //script file parsing library
l_threads.c //multi-threading library
l_utils.c //utility library
l_qfiles.c //loading of quake files
gldraw.c //GL drawing (*)
glfile.c //GL file writing (*)
nodraw.c //no draw module (*)
bspc.c //BSPC Win32 console version
winbspc.c //WinBSPC Win32 GUI version
win32_terminal.c //Win32 terminal output
win32_qfiles.c //Win32 game file management (also .pak .sin)
win32_font.c //Win32 fonts
win32_folder.c //Win32 folder dialogs
*/
```
|
Robert Edgeworth-Johnstone (4 February 1900 – 3 December 1994) was a British chemical engineer and inventor. Born in Dublin, he spent 33 years in industry as a chemical engineer and consultant, working overseas in the oil industry. A keen musician, Edgeworth-Johnstone developed the Johnstone flute, a simple version of the instrument made from the aluminium brass tubing used in oil refineries. The instrument's design was admired by renowned flautist James Galway, but Edgeworth-Johnstone did little with the invention until he published details in a 1993 book.
He later became Lady Trent Professor of Chemical Engineering at the University of Nottingham and was involved in reforming the courses there to be more applicable to industry. Edgeworth-Johnstone retired in 1967 but continued to work to advance engineering education, authoring a 1969 report on the subject for the Institution of Chemical Engineers. In later life he lived in Brighton, where he represented the county of Sussex in pistol shooting, before moving to France, where he died at Parcé-sur-Sarthe.
Early life
Robert Edgeworth-Johnstone was born in Dublin on 4 February 1900. His family were of Anglo-Irish background; his father Sir Walter Edgeworth-Johnstone was a Chief Commissioner of the Dublin Metropolitan Police from 1915 to 1923.
Edgeworth-Johnstone was educated at Wellington College and the Royal Military Academy, Woolwich before he secured a job with the Magadi Soda Company. He worked in factories in Kenya for the next three years and there developed a fascination with chemical engineering. Upon his return to the United Kingdom he pursued an engineering education. Edgeworth-Johnstone was awarded a degree from the Manchester College of Technology and, in 1932, a doctorate from University College London. In 1957 he co-authored Pilot plants, models, and scale-up methods in chemical engineering with Meredith Thring. Edgeworth-Johnstone spent 33 years working in industry as an engineer and consultant.
Johnstone flute
In 1933, whilst working on oil projects in Trinidad, Edgeworth-Johnstone – a keen musician who could play the guitar, mandolin and clavichord – developed a keyless flute. He intended this to be a link between the simple and cheap recorder and the more complicated and expensive Boehm-keyed flute. He made the body of his prototype out of aluminium brass tubing, a cheap material readily available in the oil refineries in which he worked. The innovative mouthpiece was made from a piece of wood (his preference was for West Indian purpleheart) several inches long that protruded into the tube. The flute had ten holes, arranged in an s-curve, which were covered using all of the player's fingers and thumbs. One reviewer described this as a disadvantage for the instrument's intended use as a stepping stone, as it differed from the fingering used for the six-hole recorder and flute. The simple construction and suitability for home manufacture were praised. The renowned flautist James Galway was impressed by the design, but Edgeworth-Johnstone did little with it until 1993, when he published the details in The Johnstone Flute, a book describing the development and use of the flute.
University of Nottingham
Edgeworth-Johnstone was appointed the first Lady Trent Professor of Chemical Engineering by the University of Nottingham in 1960, despite having no previous university work experience. He took the opportunity to reform the way the subject was taught at the university, seeking feedback from industry and putting greater emphasis on economics, management and administration skills. In this regard he anticipated by 20 years the Finniston Report, which prompted widespread reform of engineering teaching. Edgeworth-Johnstone retired in 1967 but continued to work for improvements in engineering education, authoring a 1969 report for the Institution of Chemical Engineers which became a framework for future development in the field.
Personal life
Edgeworth-Johnstone married Jessie Greig in 1932; they had two sons and a daughter together. His wife died in 1981. At the age of 80 he represented the county of Sussex in competitive pistol shooting. In later years Edgeworth-Johnstone moved from his home in Brighton to France, where he died at Parcé-sur-Sarthe on 3 December 1994. The University of Nottingham library has held a collection of his correspondence and papers in its department of manuscripts and special collections since 1982.
References
1900 births
1994 deaths
British chemical engineers
20th-century British inventors
People educated at Wellington College, Berkshire
Graduates of the Royal Military Academy, Woolwich
Alumni of the University of Manchester
Alumni of University College London
Academics of the University of Nottingham
Engineers from Dublin (city)
|
The 1920 Lehigh Brown and White football team was an American football team that represented Lehigh University as an independent during the 1920 college football season. In its ninth season under head coach Tom Keady, the team compiled a 5–2–2 record and outscored opponents by a total of 172 to 54. The team played its home games at Taylor Stadium in Bethlehem, Pennsylvania.
Schedule
References
Lehigh
Lehigh Mountain Hawks football seasons
Lehigh football
|
Murray Albert Dowey (January 3, 1926 – May 26, 2021), was a Canadian ice hockey goaltender. Nicknamed "Fast Hands", he was a member of the Ottawa RCAF Flyers, which won the gold medal in ice hockey representing Canada at the 1948 Winter Olympics in St. Moritz.
Early life
Dowey was born in eastern Toronto on January 3, 1926. His father, Albert, was Irish Canadian; his mother, Winifred, was of English descent. Dowey served in the Canadian Army for two years, and played for Barkers Hockey Club of the Toronto Mercantile League. He was later presented with the opportunity to be the practice goaltender for the Toronto Maple Leafs, but did not sign after being unable to agree to terms with the owner of the Toronto Marlies (the Leafs' minor league affiliate). He worked as a clerk and typist for the Toronto Transit Commission (TTC) at the time he was recruited to the national hockey team for the 1948 Winter Olympics.
1948 Winter Olympics
The Canadian Amateur Hockey Association declined to send a team to the 1948 Olympics, in protest of the International Olympic Committee regulation that only amateur players could participate in the Games (thus ruling out those who received remuneration for playing hockey). However, Sandy Watson, a squadron leader and medical officer for the Royal Canadian Air Force (RCAF), decided to assemble a team nonetheless under the Ottawa RCAF Flyers banner. Dowey became the final player added to the Olympic roster. The original starting goalie, Dick Ball, failed a medical exam just two days before the team was to depart for the tournament. Two other players who made the team – Wally Halder and George Mara – had been teammates with Dowey in the Mercantile League. They consequently recommended Dowey to the Flyers officials.
Dowey recorded five shutouts in the eight games he played, to go along with a 0.62 goals-against average. Both numbers remained Olympic records at the time of his death. He conceded only one goal apiece against Sweden and Italy, with the three goals by the United States being the most he allowed in one game during the tournament.
Later life
After the Olympic Games and several exhibition games, Dowey went back to his job with the TTC. He retired from the TTC in 1986, having worked there for 44 years. He was honoured by the Canadian Forces in 2001, when it was announced that the 1948 RCAF Flyers were selected as Canada's greatest military athletes of the 20th century. He was selected as one of the torchbearers for the Toronto stretch of the 2010 Winter Olympics torch relay in December 2009. He was subsequently inducted into the Etobicoke Sports Hall of Fame the following year.
Dowey was interviewed in-depth for a documentary on the Flyers in 2015, as well as a book in the autumn of 2020, both called Against All Odds. He spent his final years at a retirement home in Etobicoke. Dowey died on May 26, 2021, in Toronto, Ontario. He was 95, and was the last surviving member of the Flyers' 1948 Olympic team.
Career statistics
International
Bolded numbers indicate tournament leader
Sources:
References
External links
Murray Dowey's profile at databaseOlympics.com
1926 births
2021 deaths
Canadian ice hockey goaltenders
Ice hockey people from Toronto
Ice hockey players at the 1948 Winter Olympics
Medalists at the 1948 Winter Olympics
Olympic ice hockey players for Canada
Olympic gold medalists for Canada
Olympic medalists in ice hockey
|
The Black Reichswehr (German: Schwarze Reichswehr) was the name for the extra-legal paramilitary formations promoted by the German Reichswehr army during the time of the Weimar Republic; it was raised despite restrictions imposed by the Versailles Treaty. The secret organisation was dissolved in 1923 following the failed Küstrin Putsch.
History
Restrictions on German military forces after the First World War
The Versailles Treaty restricted the numbers of the German army to seven divisions of infantry and three of cavalry, for a total of 100,000 men, and no more than 4,000 officers. Conscription was prohibited, and civilian employees engaged in forest protection, customs inspection, and other official duties could not receive military training. The military was to be exclusively devoted to the maintenance of order within German territory and control of the frontiers. The Treaty further prohibited the construction of aircraft, heavy artillery, submarines, capital ships, and tanks, and the production of materials for chemical warfare.
Naval forces were limited to 15,000 men. The Treaty also specified the navy could number no more than 6 battleships of no more than 10,000 tons displacement, 6 cruisers (6,000 tons displacement), 6 destroyers (800 tons displacement), and twelve torpedo boats (200 tons displacement), and these ships could only be replaced after twenty years for the first 2 classes of ships, and after fifteen years, for the remaining classes of ships. Article 191 specifically prohibited the production or acquisition of submarines. The Treaty further prohibited the manufacture, import, and export of weapons and poison gas.
To maintain these restrictions, the Treaty provided for an Allied military commission, the Inter-Allied Military Control Commission, which monitored German military activity on behalf of the Governments of the Principal Allied and Associated Powers.
Circumventing the Versailles Treaty military restrictions
The Reichswehr military organisation, as it was reorganised under General Hans von Seeckt and Defence Minister Otto Gessler, evaded these prohibitions through a variety of measures. Safeguarding its secrecy, Gessler and Reichswehr officials denied the organisation's existence to the Reichstag and other institutions. After the Third Silesian Uprising, the military, with von Seeckt's knowledge, provided arms to Freikorps members and other paramilitary groups, who, after the end of the hostilities, stashed their weapons away and reorganised as labour battalions under the command of Major Fedor von Bock, comprising about 2,000 service members and a further 18,000 reservists, concentrated around the garrison town of Küstrin in Brandenburg. Black Reichswehr paramilitary forces included the SA troops of the Nazi Party, the Der Stahlhelm organisation, and numerous Freikorps such as the Marinebrigade Ehrhardt, its successor Organisation Consul, and Bund Oberland.
Though their existence was constantly denied by the Reichswehr supreme command and the Ministry of Defence, Black Reichswehr forces carried out sabotage and assaults during the French Occupation of the Ruhr and were responsible for several Feme murders.
Küstrin Putsch
Groups from the Black Reichswehr called labour commandos (German: Arbeitskommandos) led by Bruno Ernst Buchrucker wanted to bring down the Reich government of Chancellor Gustav Stresemann and replace the parliamentary democratic republic with a national dictatorship. The putsch was prompted when on 26 September 1923 the government ended passive resistance to the occupation of the Ruhr by French and Belgian troops that had been imposed in January 1923 after Germany defaulted on the war reparations payments required by the Treaty of Versailles.
When the commandos were getting out of their truck, the Reichswehr opened fire on them with a machine gun, killing one and wounding seven. They were the putsch's only casualties. 381 Black Reichswehr labour commandos were arrested but released a short time later. All of the officers involved remained under arrest.
In popular culture
The Black Reichswehr is featured in Babylon Berlin, the German neo-noir television series based on the 2008 novel Der nasse Fisch (The Wet Fish) by Volker Kutscher. In the first two seasons, the organisation is depicted as plotting a coup of the Weimar Republic to restore the German Empire, returning Wilhelm II to the throne and installing Erich Ludendorff as Chancellor. The character of Generalmajor Seegers, the fictional ringleader of the plot, is an amalgamation of several historical Reichswehr officers, including Hans von Seeckt and Kurt von Hammerstein-Equord.
Gallery
See also
Kapp Putsch
Beer Hall Putsch
References
Sources
1919 establishments in Germany
1923 disestablishments in Germany
Military of the Weimar Republic
Paramilitary organisations of the Weimar Republic
Anti-communist organizations in Germany
|
Chu Suanzi (褚蒜子; 324 – 5 July 384), formally Empress Kangxian (康獻皇后, literally "the joyful and wise empress"), at times known as Empress Dowager Chongde (崇德太后), was an empress of the Jin Dynasty (266–420). Her husband was Emperor Kang, and, outliving him by 40 years, she was an empress dowager during the reigns of five emperors, serving as regent for three of them: her son Emperor Mu (344–357), her nephew Emperor Ai (364–366), and her cousin Emperor Xiaowu (373–376). Despite the power she held, she appeared largely to yield to the judgement of the high-level officials who advised her and rarely made decisions on her own.
Background and life as empress consort
Chu Suanzi's father Chu Pou was a mid-level official during the reign of Emperor Kang's brother Emperor Cheng, successively on the staffs of Sima Yang (司馬羕) the Prince of Xiyang and then of Sima Yue the Prince of Wu—the future Emperor Kang. He was respected for the appropriateness of his speech.
As Chu Suanzi herself grew older, she was known for her intelligence and foresight, and she was married to Sima Yue, who had then become the Prince of Langye, as his princess. When Emperor Cheng chose to pass the throne to his brother rather than his sons at his death in 342, Sima Yue became emperor, and he created her empress on 10 February 343, when he was 21 and she was 18. At the same time, Lady Chu's mother Lady Xie was also created "Xiangjun of Xunyang" (寻阳乡君). That year, she also bore him his only son, Sima Dan. Emperor Kang died just one year later, however, and the infant Sima Dan, whom he had created crown prince just before his death, became emperor (as Emperor Mu).
As empress dowager
During Emperor Mu's reign
After Lady Chu became empress dowager, she received an official petition stating that since her mother Lady Xie had already received a title, the deceased wives of her father, Ladies Xun and Bian, should be granted titles posthumously as well. Empress Dowager Chu dismissed the petition.
Due to Emperor Mu's young age, Empress Dowager Chu was quickly called upon to serve as regent. He Chong was her key advisor; he initially wanted to share that authority with her father Chu Pou, but Chu Pou believed that as the empress dowager's father, it was inappropriate for him to serve in that capacity, and therefore remained a provincial governor. Eventually, in 345, that role went to Emperor Mu's granduncle Sima Yu the Prince of Kuaiji. After He Chong's death in 346, the authority was shared between Sima Yu and Cai Mo. After the general Huan Wen conquered Cheng Han in 347, the imperial government effectively lost authority over the western provinces, with Huan being only nominally submissive. In response, Sima Yu invited the equally renowned Yin Hao to join as a key advisor to the empress dowager.
In 348, Chu Pou led a failed campaign against the collapsing Later Zhao with major losses, and he died in humiliation on 1 January 350. During the next few years, however, without major campaigns on Jin's part, many of Later Zhao's southern provinces switched their allegiance to Jin, but not firmly so. Huan repeatedly requested permission to march north, but was constantly rebuffed by Sima Yu and Yin, who were concerned that he would be even harder to control if he recovered central and northern China for Jin.
In 350, when Empress Dowager Chu was bestowing a greater honor on Cai, Cai repeatedly declined—so much so that, as the emperor, the empress dowager, and the officials waited at the palace for the messengers that the empress dowager sent to his home to return with him, they waited from morning to evening, exhausting the seven-year-old emperor. Yin used this opportunity to accuse Cai of disrespect and had him excluded from government and reduced to commoner status.
In 352 and 353, Yin made two failed attempts at northern expeditions against Former Yan and Former Qin. The second failure came about because Yin's arrogance toward Yao Xiang, a former Later Zhao general who had surrendered to Jin, drove Yao to rebel in fear and anger. Huan then submitted a petition accusing Yin of crimes, and with popular sentiment against Yin, Sima Yu was forced to exile him. From that point on, the imperial government rarely went against Huan's wishes. In 354, Huan attacked Former Qin and enjoyed some initial successes, moving within miles of the Former Qin capital Chang'an, but eventually hesitated when he was close, and Former Qin fought back and forced him to retreat. In 356, Huan marched north again and was able to force Yao out of the Luoyang region, which he had occupied, and this allowed Jin to regain control of the territory south of the Yellow River (except for modern Shandong, where the warlord Duan Kan (段龕) had been a nominal Jin vassal since 351 but was conquered by Former Yan's emperor Murong Jun in 356).
In 357, as Emperor Mu turned 14 and went through his rite of passage (jiaguanli (加冠禮)), Empress Dowager Chu officially stripped herself of her role as regent, and moved to Chongde Palace (崇德宮), which would be her residence for the rest of her life.
In 361, Emperor Mu died at the age of 18. As he had no sons, Empress Dowager Chu ordered that his cousin Sima Pi the Prince of Langye (Emperor Cheng's oldest son) be made emperor, and he took the throne as Emperor Ai.
During Emperor Ai's reign
As Emperor Ai was two years older than Emperor Mu and already 21 at the time of his ascension, Empress Dowager Chu did not serve as regent initially. However, in 364, Emperor Ai was poisoned by pills that magicians had given him in his quest for immortality; he fell ill and could not handle matters of state. Empress Dowager Chu again served as regent at that point. After he died sonless in 365, she ordered that his younger brother Sima Yi the Prince of Langye succeed him (as Emperor Fei).
During Emperor Fei's reign
Similarly, because Emperor Fei was also already an adult at the time he ascended the throne, Empress Dowager Chu did not serve as regent. However, after Huan Wen's major attack on Former Yan in 369 was repelled, at much loss of life, by joint forces of Former Yan and Former Qin, he pondered ways to demonstrate that he was still in control, particularly because he was interested in usurping the Jin throne. He spread false rumors that Emperor Fei was impotent and that all his sons were actually sons of his close associates. In 371, he drafted a proposed edict for Empress Dowager Chu and submitted it to her while she was at a Buddhist shrine in her palace. She read his submission and commented that she suspected the same thing, and she signed the edict, although she added several sentences showing her grief:
This widow has suffered more than a hundred kinds of grief. I consider those who have died and those who still live, and my heart is like being cut by knives.
Huan was initially apprehensive that Empress Dowager Chu might not submit to his plan, so he was happy that she agreed despite her tone. Huan then removed Emperor Fei and replaced him with his granduncle Sima Yu the Prince of Kuaiji (as Emperor Jianwen). Empress Dowager Chu initially created the former emperor the Prince of Donghai—a title he had held previous to becoming Prince of Langye and then emperor—but Huan pressured her into demoting him further to Duke of Haixi, a title that he would be known by in many historical accounts.
During Emperor Jianwen's reign
As the niece of Emperor Jianwen, Empress Dowager Chu had few roles during his reign—particularly because he himself had to negotiate a treacherous path that Huan had laid for him, as Huan himself intended to take the throne. Emperor Jianwen honored her as Empress Dowager Chongde, based on the name of her palace. After he died in 372, he was succeeded by his son and crown prince Sima Yao (as Emperor Xiaowu), and while her first instinct was that Huan should be named regent, Wang Tanzhi and Xie An, officials loyal to the imperial clan who were trying to prevent a Huan usurpation, persuaded her to become regent again. She agreed, but did not initially assume the formal capacity of regent.
During Emperor Xiaowu's reign
In 373, Huan Wen visited the capital Jiankang, and the rumor at the time was that he was going to execute Wang Tanzhi and Xie An, and then take over the throne. However, after Wang and Xie met him, he apparently changed his mind and returned to his defense post at Gushu (姑孰, in modern Ma'anshan, Anhui). He continued to pressure the imperial government to grant him the nine bestowments, but Wang and Xie dragged the matter out, and after Huan died later in 373, his territories were divided among his brothers Huan Huo and Huan Chong and his nephew Huan Shixiu (桓石秀). As Huan Chong was loyal to the imperial government, the threat of a Huan usurpation dissipated. After Huan Wen's death, Empress Dowager Chu formally took over as regent. In 376, after Emperor Xiaowu turned 14 and had his rite of passage, Empress Dowager Chu again gave up her authority as regent and was again referred to as Empress Dowager Chongde. She died in 384 and was buried with honors due an empress, with her husband Emperor Kang.
References
324 births
384 deaths
Jin dynasty (266–420) empresses
Jin dynasty (266–420) empresses dowager
Jin dynasty (266–420) regents
Jin dynasty (266–420) Buddhists
4th-century women regents
4th-century empresses consort
4th-century Chinese women
|
```csharp
/*
 *
 * This software may be modified and distributed under the terms
 * of the MIT license. See the LICENSE file for details.
 *
 * path_to_url
 *
 */

using Microsoft.AspNetCore.Antiforgery;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Options;

namespace Piranha.Manager.Controllers;

[Area("Manager")]
[Route("manager/login/auth")]
[Authorize(Policy = Permission.Admin)]
[ApiController]
public sealed class AuthController : Controller
{
    private readonly IAntiforgery _antiForgery;
    private readonly ManagerOptions _options;

    /// <summary>
    /// Default constructor.
    /// </summary>
    /// <param name="antiforgery">The antiforgery service</param>
    /// <param name="options">The manager options</param>
    public AuthController(IAntiforgery antiforgery, IOptions<ManagerOptions> options)
    {
        _antiForgery = antiforgery;
        _options = options.Value;
    }

    [Route("{returnUrl?}")]
    [HttpGet]
    public IActionResult SetAuthCookie([FromQuery] string returnUrl = null)
    {
        // Generate a fresh antiforgery token pair and expose the request
        // token in a client-readable (non-HttpOnly) cookie.
        var tokens = _antiForgery.GetAndStoreTokens(HttpContext);

        Response.Cookies.Append(_options.XsrfCookieName, tokens.RequestToken, new CookieOptions
        {
            HttpOnly = false,
            IsEssential = true
        });

        // Redirect to the requested local URL, falling back to the manager root.
        if (!string.IsNullOrEmpty(returnUrl))
        {
            return LocalRedirect(returnUrl);
        }
        return LocalRedirect("~/manager");
    }
}
```
|
Mordellistena annuliventris is a beetle in the genus Mordellistena of the family Mordellidae. It was described in 1886 by Quedenfeldt.
References
annuliventris
Beetles described in 1886
|
During World War II, 4,058 ethnic Germans along with several hundred other Axis-nationals living in Latin America were deported to the United States and their home countries, often at the behest of the US government. Although the arrest, internment and/or deportation of belligerent country nationals was common practice in both Axis and Allied countries (and their colonies) during both World War I and World War II, subsequent US Congressional investigations and reparations during the 1980s and 1990s, especially for Japanese Americans interned, have raised awareness of the injustice of such practices. Unlike Allied civilians held in Nazi concentration camps or those interned by the Japanese, Axis nationals interned in Allied countries did not suffer from systematic starvation and widespread mistreatment by their captors.
Although conducted ostensibly to curb Axis subterfuge, like the Internment of Japanese Canadians, the Internment of Japanese Americans, and the Internment of German Americans, many of the deportees were not supporters of the Nazi regime. Persons deported even included Jewish refugees who had fled Nazi Germany prior to the German declaration of war against the United States. Of the 4,656 deportees sent to the US in 1942, 2,242 were exchanged with Axis powers for citizens of Allied countries arrested and interned by the Axis powers and 2,414 remained in the US until the end of the war.
The deportation of Germans was preceded by the immigration of tens of thousands of German immigrants to Latin America during the 19th and early 20th centuries. While the vast majority of these immigrants integrated into Latin American societies, some still held German citizenship at the time of Germany's declaration of war in 1941. Prior to World War II both the German and U.S. governments were actively competing for political and economic influence across Latin America and with the outbreak of war the U.S. government was fearful that nationals of the belligerent countries could pose a threat. Subsequently several thousand German, Japanese and Italian nationals were arrested by Latin American governments and many were deported to internment camps in the U.S. for the duration of the conflict. A minority were even deported to Nazi Germany. Following the war most were repatriated to their home countries.
Background
German Influence in Latin America
Germany's involvement in the region had existed since the 19th century. Waves of German immigrants had been generally welcomed into the region, partially as a result of the popularity of racist ideologies among Latin America's political and economic elites, who often believed in the myth of the Protestant work ethic and the superiority of Northern European Protestant immigrants over Southern European Catholics. The Nazi German government primarily viewed Latin America as a source of raw materials and attempted to deepen commercial relations with the region during the inter-war period. Germany offered military training and sold weapons to several Latin American governments in an attempt to undermine U.S. influence in the region. During the 1930s, sympathetic far-right and fascist political movements, most notably Brazil's Integralism movement and Argentina's Nacionalismo movement, attempted to gain control in their respective states. Following World War II, thousands of Nazis would escape to Latin America and avoid capture by the Allies thanks to Nazi sympathizers in Latin America, some of whom held military and political positions in Latin America's governments.
American influence in Latin America
Since the establishment of the Monroe Doctrine, the United States government had sought and gained an influential position in the Western Hemisphere. During the 1930s, American intervention in the region, which began in 1898 (see the Spanish–American War and the Banana Wars) on behalf of local elites and U.S. corporations, was replaced by the Good Neighbor policy, made official at the Montevideo Convention in 1933. American influence in the region remained significant during the pre-war period and included U.S. control over the recently built Panama Canal.
U.S. - Germany Relations in Latin America immediately prior to 1941
By 1941, the US was supplying war materials to democracies fighting against the Axis powers under the Lend-Lease Act of 1941. The US Navy also assisted the Royal Navy against German submarines in the Atlantic, and hostility toward Germany spilled over to ethnic Germans in the United States and in Latin America. Ethnic Germans in Latin America were placed under surveillance at the behest of the US. Ethnic Germans in Latin America deemed “dangerous” were placed on the Proclaimed List of Certain Blocked Nationals, created by the US in June 1941. Those on the list were subjected to economic sanctions and blocked from doing business with American companies.
Rationale for deportation
Inadequate American intelligence
American intelligence on Latin America was very poor and inaccurate, and thus overstated the threat of ethnic Germans in Latin America, motivating the deportation and internment. Pre-war American intelligence gathering in Latin America depended on embassy cables, reports by G-2 (Army intelligence) and ONI (Naval Intelligence), and civilian volunteers. Latin America was shunned by talented officers as a backwater and as “prejudicial to their promotions”; intelligence staff posted to Latin America were therefore often from the bottom of the barrel. For example, Colonel Carl Strong, the military attaché in Bogota, warned of a German attack on Colombia “via Dakar and Nepal,” demonstrating his ignorance of Latin American geography.
Intelligence work in Latin America became a priority as Nazi Germany advanced across Western Europe. In June 1940, the U.S. Federal Bureau of Investigation (FBI) was charged by President Franklin Roosevelt with monitoring Latin America. The roughly 700 FBI agents who poured into Latin America were not much more competent. Agent Donald Charles Bird of the FBI was given two weeks of Spanish lessons before being sent to Brazil, a Portuguese-speaking country. These intelligence officers often overstated the level of German influence in Latin America and grossly exaggerated the threat of ethnic Germans as a potential fifth column. One such example was the FBI’s portrayal of the 12,000 ethnic Germans in Bolivia as an imminent threat, ignoring the fact that 8,500 of them were Jews escaping from German-occupied Europe.
British disinformation
To further skew American assessment of the situation in Latin America, British Security Co-ordination (BSC), an arm of the British Secret Intelligence Service (MI6), fabricated “evidence” of Nazi aggression and infiltration in Latin America to induce the US to join the war. These hoaxes and fabrications were in many cases readily accepted by American intelligence as truth. For example, in June 1940, the BSC forged a letter to implicate Major Elias Belmonte, former Bolivian military attaché to Berlin, in a German-sponsored coup plot in Bolivia. The letter and the alleged coup attempt became proof of German subversion in Latin America and were circulated by the FBI. On 27 October 1941, President Roosevelt dramatically announced,
"I have in my possession a secret map, made in Germany by Hitler's government - by planners of the new world order. [...] It is a map of South America and a part of Central America as Hitler proposes to reorganize it."
This map, printed in German, indicates plans to conquer South America and partition it into five satellite states. This map was also a fabrication by the BSC.
Overstated threat
The combination of inadequate American intelligence and abundant British disinformation convinced American policy-makers that the ethnic German population in Latin American constituted a threat to Latin America, and consequently, the United States. Lieutenant Jules Dubois, chief of the Intelligence Branch of the US Army in Panama declared,
“With their sights trained on Latin America, the Axis Powers began to groom puppets and sympathetic groups in every republic to seize the reigns [sic] of their governments’ machinery […] There were approximately three million Axis nationals residing in Latin American then, each of whom could have been made available to form part of a militant striking force capable of implementing the plans of the Axis at the appropriate time”
After the US entered the war, neutralizing the perceived threat against Latin America and securing the strategically important Panama Canal was seen by the US government as vital.
Deportation
U.S. Government requests the arrest and deportation of “all dangerous aliens”
The United States elected to use internment and deportation to neutralise the perceived threat posed by ethnic Germans following American entry into war. Because of the perceived incompetence and possible German infiltration of Latin American governments, local internment was deemed insufficient as a solution.
Immediately after the attack on Pearl Harbor, the Government of Panama carried out arrests of Japanese, German, and Italian nationals. Following the arrests, the US Ambassador in Panama requested that the Panamanian government send the internees to the U.S., citing the logistical difficulties of housing and feeding the internees in Panama. On 20 January 1942, the US State Department instructed its embassies in Cuba, Guatemala, Nicaragua, Costa Rica, Honduras, El Salvador, the Dominican Republic, and Haiti to obtain agreements to send “all dangerous aliens” to the US for internment. In neutral Colombia, US Ambassador Braden urged the expulsion of Germans even before US entry into the war.
Latin American response to U.S. demand
The Latin American countries were generally receptive to American demands. Their motivations varied among American influence, promises of military and economic aid, domestic anti-German sentiment, and the opportunity to seize the land and property of the Germans. Panama, which was tightly controlled by the US, agreed to send the “more dangerous” internees to the US on 13 January 1942. Little opposition arose in the Latin American countries that had declared war on Germany. Guatemala, Nicaragua, Costa Rica, Honduras, El Salvador, the Dominican Republic, and Haiti all agreed to deport “dangerous enemy aliens” by mid-February 1942, while Cuba compromised by interning selected Germans on the isolated Isle of Pines (now Isla de la Juventud). Colombia initially refused American demands, citing national sovereignty and constitutional rights, but relented in November 1943 after it was promised military aid under Lend-Lease from the US. Colombia, together with Ecuador, Peru, and Bolivia, agreed to send selected Germans to the US on the promise that the deportees would be repatriated to Germany rather than interned in the US.
Selection of deportees
The selection of deportees was arbitrary and inaccurate in picking out potentially dangerous Germans. It was conducted both by local governments and under American instruction. In total, 4,058 Germans were deported to the US. The Proclaimed List of Certain Blocked Nationals was used as the basis for deportation in many Latin American countries. Many more ethnic Germans in Latin America were also selected for deportation. Outspoken Nazi Party members, such as Otto Krogmann, leader of the Nazi Party in Costa Rica and brother of Carl Vincent Krogmann, Nazi politician and mayor of Hamburg, were quickly arrested and deported.
However, many other ethnic Germans were detained and deported on scant evidence. Carl Specht, a labour organiser for Indian rubber trappers in Colombia, was deported without evidence. Post-war investigators noted that he “incurred the enmity of some of the American rubber interest”. After deportation to the US, he volunteered to join the US Army. Wilhelm Wiedermann, a tractor driver and naturalised Costa Rican citizen, was also deported after being reported by the US military attaché Lt. Col. E. Andino. Andino was later dubbed “one of the most unreliable intelligence officers in the employ of the United States Government” by a post-war investigator. In Panama, of the more than 1,000 ethnic Germans interned, the 150 deemed “most dangerous”, along with their 97 family members, were deported to the US. Thirty of them were Jewish refugees, five of whom had spent time in concentration camps before moving to Panama, while 37 members of the local Nazi Party were allowed to stay. The Justice Department concluded in 1943 that the selection of internees and deportations had been conducted “without inquiry as to the loyalty or the danger of the particular alien.”
Internment
Legal basis of internment
Before their forced arrival to the United States, the deportees were deliberately not issued any visas. Once they arrived, they were arrested by the US Immigration and Naturalization Service (INS) as illegal immigrants, which formed the legal basis for the US government to intern or expel them.
Conditions of the internment camps
Conditions in the internment camps were initially poor, as it was believed that the internees would be quickly repatriated. Once it became clear that many internees would stay in the camps for extended periods, the Justice Department, which operated the camps, began improving conditions. Internees of German, Japanese, and Italian descent from both the US and Latin America were interned together. Crystal City, a purpose-built internment camp for large families, had notably good conditions. Japanese internees beautified the camp with gardening and landscaping; education was provided in English, German, Japanese, and Spanish; a swimming pool was built for the children; and large families had their own houses with kitchens, bathrooms, running water, and ice boxes. Karl-Albrecht Engel, one of the camp's internees, reported in a letter to the German government:
"We grew tan and swelled up like doughnuts from the good meal. Three hot meals a day, starting with eggs and sausages in the morning and ending with oysters or meat and potatoes for dinner. The canteen sold three kinds of beer."
Repatriation
Ecuador
As the war came to a close, many of the interned Ecuadorians in American detention camps began requesting permission to return home to Ecuador (Becker 317). Camilo Ponce, the Ecuadorian Minister for Foreign Relations, agreed to most of the requests, stating that "the majority of them, if not all, are individuals who have lived most of their lives in this country and did not leave of their own free will". In 1945, the Ecuadorian ambassador, Galo Plaza, petitioned the US Secretary of State for the return of Axis-country nationals still in internment camps, noting that the justification for deporting these residents had been "to prevent them from engaging in subversive activities against the security of the American republics", a threat that had now passed. In 1946, the US State Department asked Ecuador if it would like those remaining in internment camps to return. Ecuador took charge of its residents in April 1946, when an agreement was finalized and Ecuadorian residents began to return home.
Cuba, Peru, and Guatemala
For the Cuban and Peruvian residents interned in the United States, the prospects of returning home were slim. Beginning in 1940, the American government blacklisted companies and individuals with German ties in these, and most other, Latin American countries. The stated rationale was to deny funding to local Nazi-sympathizing factions; in practice, however, these policies made it increasingly difficult for any German detainee to return to their family and home. From Washington's perspective, the blacklists had a convenient side effect: in many cases these German companies were America's greatest competitors in the region, so the blacklists effectively created American monopolies in certain industries.
By the end of the war, those held in internment camps had either been sent to the Axis powers in exchange for American nationals or been given leave to return home. Of the returnees, 15 were eventually interviewed about their experiences in the American internment camps. Repatriation was particularly difficult for deportees who had lost everything through internment, as in the case of Hugo Droege, whose farm in Guatemala had been seized after he was forcibly taken to America. Party ties prior to deportation also complicated repatriation: for Droege and some others, affiliations with the Nazi Party, which Droege called "not real membership", haunted them beyond the war.
Post-war
Many of the deported German Latin Americans who returned to South America found it very difficult to resume their former lives. Many countries had adopted strict anti-Nazi and anti-German policies, and governments across Latin America had seized German holdings. Repatriation took much more than simply returning home; it was a process of becoming at home in Latin America again, one that took longer than many had hoped.
Aftermath and legacy
Aftermath
In the aftermath of the mass deportations, many companies owned by German deportees in Latin America were confiscated and expropriated, despite US intelligence recognizing that the seizures would cause grave economic harm. Additionally, upon learning of the mass deportations of Germans from Latin America, the Nazi regime retaliated against the nations cooperating with the United States by scouring German-occupied territory for their citizens and forcibly interning them.
The deportees' fates after the war varied greatly. Many had already been sent to Germany over the course of the war, but of those who remained in the US, some (depending on the country they had been deported from) were given the opportunity to return to Latin America. Their return was not necessarily the end of their ordeal: many came back to find that their property and belongings had been confiscated. These returning Germans also found themselves marked and excluded from societies they had once thought of as home, as anti-immigrant and anti-German sentiment persisted in many Latin American countries as a result of the deportations.
US Congressional investigation and reparations
In 1980, Congress approved a commission as a "fact finding study" to examine the deportation of civilians to US internment camps. The commission published its report in 1983, bringing the US government's actions in this period to light, though its discussion of the treatment of German civilians from Latin America was confined to an appendix. In 1988, Congress passed the Civil Liberties Act of 1988, which granted a formal apology and reparations to Japanese Americans interned during World War II, with a US$20,000 payment to each surviving internee. However, these reparations and the accompanying apology did not extend to deportees from Latin America who were neither US citizens nor legal permanent residents.
Because the law was restricted to American citizens and legal permanent residents, ethnic Japanese who had been taken from their homes in Latin America (mostly from Peru) were not covered by the reparations, regardless of whether they had remained in the United States, returned to Latin America, or been deported to Japan after the war. In 1996, Carmen Mochizuki filed a class-action lawsuit and, from what was left of the Civil Liberties Act funds, won a settlement of around $5,000 per person for those who were eligible. One hundred and forty-five of those affected received the $5,000 settlement before the funds ran out. In 1999, funds were approved for the US Attorney General to pay compensation to the remaining claimants.
To this day, no formal apology has ever been issued to Germans deported from Latin America.
References