Baal is a hamlet in the Dutch province of Gelderland. It is located in the municipality of Lingewaard, about 1 km northeast of Bemmel and 1 km north of Haalderen, and is therefore considered part of Haalderen.
It was first mentioned in 850 as Barla, a name meaning "bare forest". Baal is not a statistical entity, and the postal authorities have placed it under Haalderen. The hamlet used to have its own place name signs, which looked like signs for a built-up area; however, the municipality removed them in 2017. The villagers have since put up new signs. In 1840, Baal was home to 46 people. Nowadays it consists of about 15 houses; there used to be 16. In 1996, a house was destroyed by a Lockheed C-130 Hercules, killing 34 people on board; the house was empty at the time.
References
Populated places in Gelderland
Lingewaard
Elizabeth Stout (born June 16, 1990) is an American retired professional soccer player who played as a goalkeeper. She is currently a coach with the Southern Illinois Salukis.
Early career
Prior to college, she played for DuPont Manual High School and Mockingbird Valley Soccer Club (which later became Chicago Fire Juniors). Stout also played semi-professionally in the Women's Premier Soccer League following her college career.
Stout played collegiately from 2008 to 2011 at Western Kentucky University, where she set school records for wins, shutouts, goals against average, and save percentage, and recorded 39 shutouts, second in National Collegiate Athletic Association history.
Professional career
Stout played at FF Yzeure Allier Auvergne of Division 1 Féminine in 2012–13 and at BV Cloppenburg of the Frauen-Bundesliga in 2013–14.
Stout played for the Liverpool Ladies during both the 2014 and 2015 seasons. In 2014, she played 13 games, keeping seven clean sheets, to help the club win the 2014 FA WSL title.
On November 23, 2015, the Boston Breakers announced that they had signed Stout. She started 10 matches in 2016, but the Breakers drafted goalkeeper Sammy Jo Prudhomme in the 2017 NWSL College Draft and named Abby Smith the starting keeper for the 2017 season; Stout did not appear in 2017 before the team waived her in May. However, after an injury to Smith in June, the Breakers re-signed Stout as a goalkeeper replacement on June 21, 2017. Later in 2017, Stout signed for Apollon Limassol of the Cypriot First Division, which was also competing in the UEFA Women's Champions League.
After one season with Apollon Limassol, Stout announced her retirement. However, on July 19, 2018, she was signed by the Orlando Pride as a National Team Replacement Player, as Ashlyn Harris was away on international duty.
International career
Stout was called up to the United States under-23 training camp in 2013, although she did not appear in a match.
Coaching
In 2018, Stout joined the coaching staff of the Southern Illinois Salukis. In 2021, Stout joined Racing Louisville FC as its academy's goalkeeping director. Since 2022, she has been an assistant coach for Racing Louisville FC's amateur team playing in the USL W League. On May 12, 2022, she also appeared in a match as a player for the team, recording four saves in a shutout draw against Detroit City FC's women's side.
Personal
Stout married Cortney Jodoin on February 15, 2020, in a private California wedding. The couple currently resides in Southern Illinois.
Honors
Liverpool
FA Women's Super League (1): 2014
Liverpool Women's Players' Player of the Year (2014)
References
External links
Libby Stout – French domestic football stats at footofeminin.fr
Living people
1990 births
Soccer players from Louisville, Kentucky
Western Kentucky Lady Toppers soccer players
American women's soccer players
American expatriate soccer players in Germany
American expatriate sportspeople in France
American expatriate sportspeople in England
Expatriate women's footballers in France
Expatriate women's footballers in England
Women's Super League players
Liverpool F.C. Women players
American expatriate women's soccer players
Women's association football goalkeepers
National Women's Soccer League players
Boston Breakers players
Apollon Ladies F.C. players
Expatriate women's footballers in Cyprus
American expatriate sportspeople in Cyprus
Southern Illinois Salukis women's soccer coaches
Orlando Pride players
Racing Louisville FC (USL W League) players
USL W League players
```c
/*
 * This file is part of FFmpeg.
 *
 * FFmpeg is free software; you can redistribute it and/or
 * modify it under the terms of the GNU Lesser General Public
 * License as published by the Free Software Foundation; either
 * version 2.1 of the License, or (at your option) any later version.
 *
 * FFmpeg is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
 * Lesser General Public License for more details.
 *
 * You should have received a copy of the GNU Lesser General Public
 * License along with FFmpeg; if not, write to the Free Software
 * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
 */
#include "config.h"
#if HAVE_UNISTD_H
#include <unistd.h> /* getopt */
#endif
#include "libavutil/pixdesc.h"
#include "libavcodec/avcodec.h"
#include "libavutil/common.h"
#include "libavcodec/raw.h"
#undef printf
#undef fprintf
#if !HAVE_GETOPT
#include "compat/getopt.c"
#endif
static void usage(void)
{
    printf("Show the relationships between rawvideo pixel formats and FourCC tags.\n");
    printf("usage: fourcc2pixfmt [OPTIONS]\n");
    printf("\n"
           "Options:\n"
           "-l          list the pixel format for each fourcc\n"
           "-L          list the fourccs for each pixel format\n"
           "-p PIX_FMT  given a pixel format, print the list of associated fourccs (one per line)\n"
           "-h          print this help\n");
}

static void print_pix_fmt_fourccs(enum AVPixelFormat pix_fmt, const PixelFormatTag *pix_fmt_tags, char sep)
{
    int i;

    for (i = 0; pix_fmt_tags[i].pix_fmt != AV_PIX_FMT_NONE; i++) {
        if (pix_fmt_tags[i].pix_fmt == pix_fmt) {
            char buf[32];
            av_get_codec_tag_string(buf, sizeof(buf), pix_fmt_tags[i].fourcc);
            printf("%s%c", buf, sep);
        }
    }
}

int main(int argc, char **argv)
{
    int i, list_fourcc_pix_fmt = 0, list_pix_fmt_fourccs = 0;
    const PixelFormatTag *pix_fmt_tags = avpriv_get_raw_pix_fmt_tags();
    const char *pix_fmt_name = NULL;
    int c; /* int, not char: getopt() returns -1, which a plain char may not hold */

    if (argc == 1) {
        usage();
        return 0;
    }

    while ((c = getopt(argc, argv, "hp:lL")) != -1) {
        switch (c) {
        case 'h':
            usage();
            return 0;
        case 'l':
            list_fourcc_pix_fmt = 1;
            break;
        case 'L':
            list_pix_fmt_fourccs = 1;
            break;
        case 'p':
            pix_fmt_name = optarg;
            break;
        case '?':
            usage();
            return 1;
        }
    }

    if (list_fourcc_pix_fmt) {
        for (i = 0; pix_fmt_tags[i].pix_fmt != AV_PIX_FMT_NONE; i++) {
            char buf[32];
            av_get_codec_tag_string(buf, sizeof(buf), pix_fmt_tags[i].fourcc);
            printf("%s: %s\n", buf, av_get_pix_fmt_name(pix_fmt_tags[i].pix_fmt));
        }
    }

    if (list_pix_fmt_fourccs) {
        for (i = 0; av_pix_fmt_desc_get(i); i++) {
            const AVPixFmtDescriptor *pix_desc = av_pix_fmt_desc_get(i);
            if (!pix_desc->name || pix_desc->flags & AV_PIX_FMT_FLAG_HWACCEL)
                continue;
            printf("%s: ", pix_desc->name);
            print_pix_fmt_fourccs(i, pix_fmt_tags, ' ');
            printf("\n");
        }
    }

    if (pix_fmt_name) {
        enum AVPixelFormat pix_fmt = av_get_pix_fmt(pix_fmt_name);
        if (pix_fmt == AV_PIX_FMT_NONE) {
            fprintf(stderr, "Invalid pixel format selected '%s'\n", pix_fmt_name);
            return 1;
        }
        print_pix_fmt_fourccs(pix_fmt, pix_fmt_tags, '\n');
    }

    return 0;
}
```
Rewan Police Horse Breeding Station is a heritage-listed former horse stud farm at Rewan Road, Rewan, Central Highlands Region, Queensland, Australia. It was built in 1911 by Queensland Police Service. It is also known as Rewan Police Horse Stud, Rewan Remount Breeding Station, and Rewan Station. It was added to the Queensland Heritage Register on 31 May 2019.
History
The former Rewan Police Horse Breeding Station, located southwest of the town of Rolleston in the Springsure district of Central Queensland, was established as a police horse stud in 1909. Comprising 108,000 acres (43,706 ha) at its largest, the reserve was home to up to 771 horses and up to 2850 cattle until its closure in 1934. It was also a sanctuary for native birds and animals. It retains the stables (1911), the commissioner's cottage (1911) and a meat house (1918) from its stud era. Rewan Police Horse Breeding Station is the only surviving site of a police horse stud in Queensland. It is important in demonstrating the Queensland Government's policy of breeding horses, which were essential as transport for the Queensland Police Force between 1904 and 1934. It has a special association with the Queensland Police Force, an important organisation in Queensland history.
A police presence in the geographical area that became the Colony of Queensland began in 1843, not long after free settlement commenced at the former Moreton Bay penal settlement (now the city of Brisbane). The Police Act 1863 created a centralised system under a Commissioner of Police, who controlled the Queensland Police Force (QPF) and the Native Police (disbanded by the early 1900s). The commissioner headed a hierarchy of inspectors and sub-inspectors who were in charge of sergeants and constables working in police districts throughout Queensland. However, the Commissioner of Police did not control the water police.
Until the mid-20th century, horses were essential equipment for the QPF. From 1864 the care of police horses was regulated, and any town with a police station also had a police horse paddock. The QPF also supplied horses for ceremonial occasions, such as the 1901 proclamation of the Commonwealth, the visit of Imperial troops in 1901, and the arrival of the Duke and Duchess of York (later King George VI and Queen Elizabeth The Queen Mother) in 1927, as well as for the annual Royal National Association show in Brisbane. Horses were purchased for the QPF by the commissioner or by inspectors for their own districts. The care and replacement of police mounts was a significant part of the police budget each year.
Several factors provided the impetus for the Queensland Police Department to begin a horse breeding program. By 1904, Commissioner of Police, William Edward Parry-Okeden (Commissioner 1895–1905), was urging that the QPF breed its own horses to overcome the high cost of purchasing suitable ones. A good horse cost between £7 and £10. These horses were scarce due to stock losses during the Federation Drought (1898-1902); demand for horses during the Boer War (1899-1902); and because agents of the Indian, Japanese, Dutch and American armies were competing in the Australian market for remounts. Meanwhile, the QPF's need for horses remained high, as a significant proportion of its horses were old and needed replacing. The QPF in 1904 numbered 888 men including trackers, and over 1000 horses.
In 1904, the QPF commenced breeding remounts for its mounted police at Woodford, north of Brisbane, using a Reserve for a Police Paddock (formerly part of Durundur Station). Stallions were hired for stud work at Woodford between 1904 and 1908 for between £20 and £30 per stallion per year. Costs were kept down by growing feed on-site.
In January 1909, a report on the supply of police remounts in the other Australian states found that only the Northern Territory, which was forming five small horse-breeding stations, was engaged in breeding police remounts. Victoria reportedly had discontinued the breeding of remounts 40 years previously, as it had proved costly and unsuccessful; however, there is evidence that a stud operated at the Dandenong Police Paddocks from 1853 to 1930, and then at Bundoora from 1930. No studs in the Northern Territory are known to remain.
When the Queensland Department of Public Lands wished to dispose of the Woodford reserve as Agricultural Farms and Perpetual Lease Selections, the QPF, under Commissioner William Geoffrey Cahill (Commissioner 1905–16), decided it would continue to breed its own remounts. Efforts to secure another, larger site near Brisbane were unsuccessful, but a suitable location in Central Queensland was found. In 1908, the Commissioner of Police announced he had obtained "a very fine reserve at Carnarvon (named 'Rewan') of 78,000 acres, resumed from the Consuelo leasehold for the purpose of breeding police remounts". Rewan Station, situated north of Roma and south of Springsure, the nearest railway station, was proclaimed a "reserve for stud farm for breeding police horses" in April 1909. Additionally, Rewan was to operate as a police station, conducting patrols in the district. The land was regarded as suitable for breeding horses because of its variety of grasses and good natural water supply from Carnarvon Creek, Johnny Woods Creek, Charcoal Creek, and Boggarawalla Creek, all with permanent water holes, as well as several permanent lagoons and swamps accessible to stock. Rewan was also declared a Reserve for the Protection and Preservation of Native Birds, under the Native Birds Protection Acts 1877–1884. Acting Sergeant John Joseph Campbell and Constable J Ruthenberg were appointed as rangers of the reserve. The reserve later came under new legislation, which protected native animals and birds.
The nearest town to Rewan was Rolleston, established in 1862 at the crossing point on the Brown River (a tributary of the Comet River) that teamsters used when travelling between inland districts and Rockhampton. Charles Frederick Gregory, brother of the Queensland Surveyor General, Augustus Charles Gregory, surveyed the Rolleston town site in 1865. It was later named after Christopher Rolleston (1817–88), a local landholder, former Commissioner of Crown Lands for the Darling Downs, and brother-in-law of the Leslie brothers, who led the settlement of the Darling Downs from 1840. From 1860, Rolleston, with Louis Hope and Alfred Denison, acquired extensive runs in the Leichhardt district centred upon Springsure Station near Rolleston. A branch railway line to Springsure opened from Emerald (on the Central Western railway line) in August 1887, improving transport to the district. In the first two decades of the 20th century, the large pastoral leases in the Springsure district (within the Bauhinia Shire from 1902) were resumed for closer settlement.
Rewan Police Horse Breeding Station commenced in mid-1909 when the Woodford horses, together with all available mares owned by the QPF in the various police districts, were transferred there. These totalled 76 mares, 37 geldings and 24 foals. Three stallions were purchased: "Mack" (£215), "Libertine" (£157/10) and "Bonny Boy" (£63). Another 35 mares were purchased from Jimbour Station and Salisbury Plains Station, and breeding operations commenced in the 1909 breeding season.
At the time of the QPF's takeover of Rewan in mid-1909, some improvements were already on site. These included a 5-room, iron-bark slab stockman's hut, suffering from white ant attack, and other out-buildings, which were "rough and not considered of any value and should be pulled down". A good stockyard was sited on a sand ridge behind the stockman's hut which, after repair, would meet the stud's requirements for some years. Fencing consisted of the outer boundaries and two horse paddocks, all in a poor state of repair.
Requirements to make it habitable and functional were: accommodation for staff, animals and equipment; a telephone connection for the property to function as a police station; and boundary fencing. The work was undertaken by the police constables, who cut the trees, pit-sawed the timber, and erected the buildings. The only materials imported were the window sashes and iron. By the end of 1909, mustering yards had been erected and fencing had commenced. By December 1910, the existing slab stockman's hut had been repaired for constables' quarters, and a house with detached kitchen was erected for use by Acting Sergeant John Joseph Campbell, who was in charge of the stud and the remount breeding program. The sergeant's quarters and constables' quarters are no longer on site.
Stables were erected behind the sergeant's quarters. The building had a skillion roof of iron, with guttering to its front. The frame was sawn timber, with dressed split slabs for its walls and partitions. The stables were divided into two stalls for stallions, one at each end of the building; a forage room; a buggy shed; a dray shed; and three further stalls. The buggy shed was enclosed by two hardwood doors. Immediately in front of the two stallion stalls were two substantial yards.
A windmill was erected near Carnarvon Creek, where there was a good quantity of water. Water was pumped into a galvanised iron tank erected on a stand behind the sergeant's quarters, and pipes from this tank led to the sergeant's quarters and both stallion stalls. A meat house, for meat processing, was also erected as an essential component of the station.
The accommodation for the two to four Aboriginal Queensland Police Force trackers employed on Rewan between 1909 and 1934 was inferior to that provided for other staff. A period image of one residence shows a dirt-floored, slab-walled and bark-roofed hut, with boughs and rocks used to hold the roof in place. Their accommodation in 1920 was described as "big, comfortable and waterproof huts". The trackers on Rewan are known to have undertaken horse- and cattle-related duties, and fence- and yard-building. Elsewhere in Queensland, trackers performed kitchen duties, cared for horses, checked fences and looked for trespassing stock. By early 1911, a second dwelling had been constructed southeast of the sergeant's quarters, to accommodate the Commissioner of Police and other government officials when they inspected the property. Initially it was called the commissioner's cottage, but was later referred to as the officers' quarters. This timber-framed and -clad building, with verandahs to its north and west elevations, had a corrugated metal roof with acroteria, and post-and-rail verandah balustrades. As it was intended for periodic short-term use by official visitors, it comprised only two bedrooms and a bathroom. By the end of April 1911, the sergeant's quarters and the commissioner's cottage, along with well-tended gardens to the front, had been fenced in with painted picket fencing, including two wicket gates and two large double gates at the front and the back of the buildings, as well as split paling fencing.
Periodic visits by officers of the QPF and the Home Department, and by dignitaries, took place. The first occurred in April 1911, when Chief Inspector Frederic Urquhart reported excellent progress had been made in building, fencing, clearing and ring-barking on Rewan. As intended, Rewan also served as a police station and patrols were being made. Constable P Lee was appointed ranger of the reserve for native birds and native animals at the stud farm at Rewan after the transfer of Constable Ruthenberg to another station. Prickly pear of the tree variety was present on the property, but was confined to an area of brigalow scrub some distance from the station. Urquhart recommended doubling the herd of cows and adding 50 mares and another stallion before August 1911, and further fencing. He was convinced the place would be profitable and of immense service in providing police remounts.
Additions were subsequently made to the stud. Horse numbers rose to 331 by 31 July 1911, then to 400 by 31 July 1912. In 1913, the thoroughbred stallion "Turkish Lad" was purchased at the National Exhibition in Brisbane and sent with 12 mares to Rewan, and two further mares were purchased from Eulolo and Roxborough Downs Stations. By 1914, the number of stud mares had reached 215.
Rewan-bred horses were first supplied to the mounted police in 1912, when 69 trained remounts were provided. The Commissioner of Police reported that in the 1912-13 financial year the cost of purchasing horses for the department had decreased by £500, and he expected Rewan would produce well-bred horses in fair numbers in the future. In 1913 there was praise for the 20 Woodford- and Rewan-bred horses sent to Brisbane for use in escort work in the metropolitan area. In June 1913, 17 horses from Rewan were also despatched to Longreach.
Farming activities commenced on Rewan in 1913. In June, a new timber shed stored its crop of pumpkins. Farm implements were sent there to enable ploughing of land for lucerne and other crops, including oats. By May 1918, Rewan was growing lucerne and had already stored lucerne hay for feed.
However, during 1915 and 1916, environmental factors hampered Rewan's development. Drought halted agriculture and reduced foal numbers, while dingoes killed calves. In this period only 20 foals were born from 124 brood mares. Flooding late in 1916, which destroyed three tons of panicum hay at Rewan, ended the drought.
Afterwards, the stud resumed its breeding success. During 1916, 56 remounts were allocated for duty, valued at £840, and there were also 5 bulls and 497 head of cattle on the property, valued at £2948. In the 1916–17 financial year, there were 10 bulls and 566 head of cattle, valued at £4320, and cattle sold for good prices: 100 fat bullocks at £15 per head and 4 fat cows at £13 per head. In the same year, 70 remounts were sent out for duty to various districts (value £1050), and there were 669 horses on Rewan Station.
In July 1916, the Home Secretary, the Hon. John Saunders Huxham; James Christian Peterson, MLA; and the Under Secretary of the Home Office, William J Gall, inspected Rewan Station and requested a report on improvements required to make the property viable, and their approximate cost. Subsequently, the addition of cattle and a stallion to Rewan's stock was approved, resulting in the purchase of the stallion "Lord Elderslie", for £156/10, in May 1917. There were 576 cattle on the property after the sale of 100 bullocks to the Central Queensland Meat Export Company at £15 per head. Stud stallions were: "Libertine", "Turkish Lad", "Bonny Boy", "Brisbane" and "Lord Elderslie", while one, 'Archie'/'Archer', was to be sold.
Improvements to Rewan buildings, approved in September 1916, were carried out by the Department of Public Works (DPW) during 1918, using timber sourced from Meteor Downs. The buildings included a new meat house, erected at a cost of £150. It was constructed with its door in a location that differed from its DPW plan.
The 1920s were filled with praise for Rewan's horses. During his Queensland visit in 1920, Edward, Prince of Wales requested to see the world-famous Queensland troopers and their mounts. He congratulated the Commissioner of Police (Frederic Charles Urquhart, Commissioner 1917–21), Inspector Carroll, and Sub-Inspector Campbell, the officer-in-charge of the Rewan police stud farm, saying he much admired the horses. Earlier, for his coronation in 1911, King George V had been presented by the Queensland Government with the Rewan-bred horse "Brisbane".
Official inspections continued in May 1921 when the Home Secretary, William McCormack, accompanied by J C Peterson, MLA for Normanby, and W H Gall, the Under Secretary of the Home Department, visited the Central district and spent two days inspecting Rewan. At this time there were about 600 horses and 1300 Shorthorn beef cattle on the property. In October of the same year, the new Commissioner of Police, Patrick Short (Commissioner 1921–25), also inspected Rewan Station. During 1921, 112 horses were sent from Rewan to various districts as remounts, the highest number provided in one year during its lifetime.
Improvements to the stud in the 1920s included reroofing the hay shed and additional stables with second-hand, galvanised iron (1922) and the addition of 30,000 acres of leasehold land from Consuelo Holding, north of Rewan (1922).
While Rewan concentrated on providing remounts for the mounted police, the motorisation of Queensland and the wider world was progressing rapidly. The QPF was failing to keep up, largely due to budget constraints. Successive commissioners had tried to provide more modern transport. In 1896 Commissioner Parry-Okeden introduced bicycles for patrols, which were slowly distributed around the colony. However, at a cost of about £13 per bike, they were more expensive than horses. Bicycle numbers reached 96 in 1914, but many were old and worn out and funds were not available to replace them or increase their numbers. More bicycles were purchased in the 1920s.
Between 1917 and 1924, Commissioner Urquhart advocated the introduction of motor cycles to supplement bicycles in country districts and requested 50, but realised the expense prohibited their purchase. Finally, in 1925 three motor cycles with side cars and 21 new bicycles were acquired, mainly for use in the metropolitan area. Motor cars for the police headquarters were hired as needed. Increasingly, police used their private vehicles at work.
By 30 June 1926, police horse numbers had fallen to 758 and the Commissioner of Police stated: "It is obvious that motor vehicles are filling calls that were formerly met by horses...". The QPF purchased two motor vans to transport prisoners in 1926 and by 30 June 1929, 12 motor cycles were in use. However, it was not until after Cecil James Carroll was appointed Commissioner of Police (1934–49), that motorisation of the QPF proceeded rapidly.
Environmental threats made the latter years of the 1920s difficult for Rewan Station. Successive droughts between 1926 and 1932 affected the breeding, breaking and handling of horses, and delayed the distribution of remounts in 1927. In November 1929, fire threatened Rewan from the north and west for over three weeks and required about 18 men to subdue it. Prickly pear had spread in Bauhinia Shire during the early years of the century and continued to require control measures on Rewan Station until its eradication after 1930 through biological control.
New stock was added to Rewan's stud in the late 1920s and praise continued for its horses, although QPF horse numbers continued to fall, reaching 616 on 1 July 1929. Notably, in August 1929, the well-known racehorse and sire "Had-I-Wist", grandson of the 1890 Melbourne Cup winner "Carbine", was presented to the Queensland Government by the Commonwealth authorities and sent to Rewan Station as a stud horse. Mundoolun-bred cattle were added to the Rewan stock in January 1930.
Praise for Rewan horses continued to be received in the late 1920s and early 1930s. In 1927, the Duke and Duchess of York expressed very high appreciation of the police horses which formed part of their escorts, and members of their party also commended the remounts, stressing their "symmetry and tractability". The Governor-General of Australia and the Governor of Queensland, Sir John Goodwin, also "praised the class of remounts forming the escorts". In 1929, the Governor inspected "Had-I-Wist" and about 16 police horses bred at Rewan at the Petrie Terrace police depot and expressed his satisfaction with the quality and appearance of the horses. In 1932, Governor Sir Leslie Wilson and his daughter were enthusiastic about 13 Rewan-bred police horses paraded at the Petrie Terrace depot, including Fretsaw, Hero and Grumpy, three prize-winners at the previous Brisbane Exhibition, and two or three young remounts in training.
Despite such praise, from 1929, members of the Queensland Parliament, the Queensland Police Union and Queensland newspapers criticised Rewan Station and the police horse breeding program. In the Queensland Parliament in 1929, the Home Secretary, J C Peterson (1929–32), was questioned by Thomas Alberto Dunlop (Independent, Rockhampton) about the high cost of salaries, allowances and forage for horses at Rewan Station. The Minister replied that salaries varied from time to time and that forage was supplied for the Rewan stallions. He also reported that 251 cattle had been sold in the previous financial year, realising £2069 5s. In 1930, Arthur Jones (Labor, Burke) questioned the Home Secretary about the annual cost of running Rewan Station and its total cost since establishment. The total amount spent by the government on Rewan Station since 1909 was £54,080, and the upkeep of Rewan in the 1929–30 financial year was £3470. The value of remounts bred at Rewan and sent to police stations that year was £420, being 26 horses (14 to Townsville district and 12 to Cloncurry district), as drought conditions that year limited the sending out of horses to stations.
From 1930 to 1933, the Queensland Police Union conducted a concerted campaign against the Rewan stud and in support of motorisation of the QPF. Queensland newspapers quoted Queensland Police Union Journal articles stating that police thought Rewan Station was a "white elephant": the land was unsuitable for horse breeding, being too far south for horses acclimatised to northern and western service, and the Rewan horses were unsuitable for the work they were meant to do. The union also complained that Rewan needed a sub-inspector, two constables, and a number of trackers to manage it, meaning the salaries alone were over £2000 a year. The Telegraph advised that "The Government [should] view this matter seriously and have it dealt with purely from a business, utility and economic viewpoint". The Police Union continued its campaign to close the police horse stud into the first half of 1933 and appeared to exaggerate Rewan Station's costs. The Evening News quoted the Police Union's June meeting as hoping the authorities "would have no hesitation selling the Rewan horse-breeding establishment, which was costing upwards of £50,000 a year [actually about £3470], a wasteful expenditure in a futile effort to prop up antiquated methods and ideas". In June 1933, the Home Secretary (Edward M Hanlon) was expected to confer with the Police Commissioner (William H Ryan, 1925–34) about the closure of Rewan Station. The site was considered for an Aboriginal settlement, but as the land was pastoral, not agricultural, that idea lapsed.
A galvanising factor in Rewan's fate was the conviction of its officer-in-charge, Sub-Inspector Campbell, for tax fraud and his dismissal from the police force in August 1933. Thomas Jones, superintendent of the farm home for boys at Westbrook, was sent to Rewan to take charge. His report on Rewan Station outlined the history of the breeding program and provided advice about future strategies. He found no misconduct in the management of the government's cattle and horses. He advised that the station needed fencing and building repairs, and ringbarking. He concluded that: "Rewan should be ringbarked and given a chance. The policy laid down when the place was first started was an admirable one and if carried out properly Rewan would be today ... doing well what was intended it should do. I ... have no hesitation in saying that Rewan has not had a chance of carrying out its programme. ... If it is decided to carry on it would not be difficult nor expensive to place the property under practical working conditions. ... Had proper inspections been carried out the place would have been improved in every way instead of being allowed to drift." However, in the economic and political climate of 1933, Jones' advice was not taken and the stud's closure was announced in October 1933. The state was experiencing the worst worldwide economic depression since the 1890s, and state-owned enterprises in general were out of favour after the failure of the Labor government's State Enterprises initiatives (1915–1925); the combined loss sustained by the State stations and the State-acquired Chillagoe railway, mines and smelter was £2 million. Although Rewan Police Horse Breeding Station was not part of that scheme, there was a popular view that the Government should not run business enterprises. According to Home Secretary Hanlon, Rewan Station had cost £60,000 over its lifetime for a return of £33,000.
Hanlon believed there was no excuse for the failure of Rewan, as the property comprised some of the best land in Central Queensland and was well-watered. In 1932 there had been 140 mares on the holding but only two foals were reared. He believed the mares were totally unsuitable for breeding police remounts: by sending away the best mares for police work and keeping the rejects for breeding, the stock had systematically deteriorated. The reasons cited for the police horse stud's closure were that its location was too inaccessible for regular inspection and control by officers of the department, and that it had not produced horses suitable for North Queensland work.
In 1934, Rewan Station was divided into two pastoral properties and allocated by ballot. The value of improvements on Rewan Station at the time of its sale was £1570/16. The building improvements were:
the commissioner's cottage comprising two rooms, bathroom and front verandah
Sub-Inspector Campbell's quarters comprising eight rooms, with front and side verandahs
single constables' quarters comprising two rooms, front verandah, small kitchen and an improvised room on the back verandah
a meat house
three huts for native trackers' accommodation
stabling accommodation for the stallions
a number of sheds and yards
The stud horses, cattle and plant were disposed of by September 1934. Of the horses: 39 went to Springsure as Police remounts; 34 were forwarded to Woorabinda Aboriginal Settlement; 12 were sold locally; and 263 were herded over the Carnarvon Range for auction at Charleville on 12 September 1934. During the journey, 25 died or were destroyed including the valuable stallions, "Had-I-Wist" and "Ercanil", and 11 foals; and 5 went missing (strayed or died). Of the 1464 cattle on Rewan Station: 121 were sold to Lakes Creek Meatworks; 13 bulls were sold to John Rewan Campbell of Rolleston; 1300 were transferred to Woorabinda Aboriginal Settlement; and 30 died. The proceeds of the livestock sales were £1240.6.10. Plant and loose tools were transferred to Woorabinda Aboriginal Settlement and to the Springsure Police Station to be used in police stations. Additionally, there were 187 horses missing since 1931, presumed dead due to drought. The stud horses, in poor condition due to drought, were herded to Charleville, during a three-week period, at a fast pace and through stretches of country with little grass or water.
During its lifetime, Rewan Police Horse Breeding Station bred, trained and supplied 1029 remounts to the QPF.
By the end of the 1930s, the QPF's horse breeding program was remembered more positively. The Wingham Chronicle & Manning River Observer newspaper stated in 1938: "At Rewan some of the finest horses used in the Queensland Police Force were bred". In his 1941 obituary, Patrick Short, former Commissioner of Police (1921–25), was credited with making great improvements to Queensland police horses through the Rewan stud.
Since its sale in 1934, Rewan Station has operated as a 17,500ha cattle property. As of 2019, it retains three buildings from its former role as a police horse stud: the former stables (1911), the former commissioner's cottage (1911), and the meat house (1918). These are the only built remains of a former police horse stud in Queensland and are important in demonstrating the Queensland Government's policy, between 1904 and 1934, of breeding horses for the Queensland Police Force, an important organisation in Queensland history.
Description
The former Rewan Police Horse Breeding Station is located approximately southwest of the Central Queensland town of Rolleston. Positioned on a rise, on the north bank of a creek bend, the station complex is accessed via Rewan Road to the east. It retains three timber-framed buildings that are associated with the occupation and operation of the early 20th century police station and horse stud.
Standing in their original locations, the buildings are oriented facing northeast and are roughly aligned along a northeast to southwest axis: the commissioner's cottage, facing and visible from Rewan Road; the former meat house approximately to the rear of the cottage; and the former stables approximately further to the southwest.
Commissioner's cottage (1911)
The cottage is a lowset, timber-framed and -clad building comprising an original (1911) gable-roofed core. It has a later (post-1932) skillion-roofed extension to the rear (southwest).
The core is rectangular in plan and has an L-shaped verandah wrapping the front (northeast) and side (northwest, part former bathroom). The skillion verandah roof is set below and at a shallower pitch to the gable. The core contains one room, but retains evidence of its original two-bedroom layout in the form of the top-rail of the (former) dividing partition and separate doorways accessing the front verandah. A window and doorway opening are also retained in the rear (southwest) wall.
A garden occupies the area at the front of the cottage; it retains garden beds with stone edging and is fenced on part of its original alignment (parallel with the cottage front and a return at the northwest end).
Former meat house (1918)
The meat house is a lowset, timber-framed and -clad building with a pyramid roof. Square in plan, it comprises a one-room core (former butchering room), surrounded by overhanging eaves supported on rough-hewn, round timber posts.
The core has a thin concrete slab floor on a raised platform of earth and undressed stone. It is accessed by a single door and has high-level screened openings for ventilation on all sides.
Former stables (1911)
The former stables is a long, narrow, lowset, timber-framed building with a skillion roof that slopes down to the long-sided front (northeast). It is one 13 ft (4m) stall in width and enclosed on the southeast, southwest and northwest sides. Two enclosed rooms (former stallion stall and feed room) are at the southeast end of the building; an open-plan area (former open-fronted dray shed and three partitioned stalls with post-and-rail fronts) is at the centre; and two partially enclosed rooms (former enclosed buggy shed and stallion stall) are at the northwest end.
The intactness of the building reflects its continued use as part of a working pastoral property, with sections of original timber slab and post-and-rail construction deteriorated, demolished or replaced. Original and early fabric is concentrated at the southeast and northwest ends, and includes horizontal timber slab walls and framing, and earth and timber (former feed and buggy rooms) flooring.
Heritage listing
Rewan Police Horse Breeding Station was listed on the Queensland Heritage Register on 31 May 2019 having satisfied the following criteria.
The place is important in demonstrating the evolution or pattern of Queensland's history.
Rewan Police Horse Breeding Station (former) (1909–34), is important in demonstrating the Queensland Government's policy of breeding horses, which were essential as transport, for the Queensland Police Force between 1904 and 1934. The place retains important surviving evidence of this rare police horse stud, which was constructed and maintained primarily by members of the Queensland Police Force, including: the former stables (1911); the former commissioner's cottage (1911); and a meat house (1918).
The place demonstrates rare, uncommon or endangered aspects of Queensland's cultural heritage.
As one of only two police horse studs established in Queensland, and the only one with surviving structures, the Rewan Police Horse Breeding Station (former) is a rare example of a function that has always been uncommon.
The place has a special association with the life or work of a particular person, group or organisation of importance in Queensland's history.
Rewan Police Horse Breeding Station (former) has a special association with the Queensland Police Force, an important organisation in Queensland history since its establishment in 1863. The Queensland Police Force established the stud in 1909, built most of its structures, and conducted the stud for 25 years to breed, train and supply horses to the Queensland Police Force for police work throughout the state, at a time when horses were vital to policing.
References
Sources
W. Ross Johnston, The Long Blue Line: A History of the Qld Police, Boolarong Publication, Brisbane, 1992
Attribution
External links
Queensland Heritage Register
Central Highlands Region
Horse breeding and studs
Articles incorporating text from the Queensland Heritage Register
Queensland Police Service
Horses in Australia
Buildings and structures in Central Queensland
|
Football at the 2007 Island Games may refer to:
Football at the 2007 Island Games – Men's tournament
Football at the 2007 Island Games – Women's tournament
2007 Island Games
2007 in association football
2007
|
Hyttbakken is a village in the municipality of Selbu in Trøndelag county, Norway. It is located along the Nea River, about east of the municipal center of Mebonden and about northwest of the village of Flora.
References
Villages in Trøndelag
Selbu
|
```batchfile
dotnet build localization-1
dotnet build localization-2
dotnet build localization-3
dotnet build localization-4
dotnet build localization-5
dotnet build localization-6
```
|
The Trio No. 1 in B-flat major for piano, violin, and cello, D. 898, was written by Franz Schubert in 1827. The composer finished the work in 1828, in the last year of his life. It was published in 1836 as Opus 99, eight years after the composer's death. Like the E-flat major trio, it is an unusually large-scale work for piano trio, taking around 40 minutes in total to perform.
Structure
The piano trio contains four movements:
Discography
Alfred Cortot, piano; Jacques Thibaud, violin; Pablo Casals, cello (Kingsway Hall, London, July 5 and 6, 1926; originally released in October 1926 as HMV DB947/50, with US issue as Victor set M 11)
New York Trio (Clarence Adler, piano; Louis Edlin, violin; Cornelius van Vliet, cello) (May 24, 1928; released late 1928 as Edison Diamond Discs 80898/901; deleted December 31, 1929)
Eugene Istomin, piano; Isaac Stern, violin; Leonard Rose, cello (1964).
Trio Dali (Amandine Savary, piano; Jack Liebeck, violin; Christian-Pierre La Marca, cello), (2011).
Busch Trio (Mathieu van Bellen, violin; Ori Epstein, cello; Omri Epstein, piano), (2022).
Notes
References
Gramophone Magazine, "Classics reconsidered: Schubert’s B flat Trio from Thibaud, Casals and Cortot," https://www.gramophone.co.uk/features/article/classics-reconsidered-schubert-s-b-flat-trio-from-thibaud-casals-and-cortot
Fluff on the Needle, "Losing the Plot," June 16, 2012, https://fluffontheneedle.blogspot.com/2012/06/loosing-plot.html
External links
Performance of Piano Trio No. 1 by the Eroica Trio from the Isabella Stewart Gardner Museum in MP3 format
Chamber music by Franz Schubert
Schubert 01
1827 compositions
Compositions by Franz Schubert published posthumously
Compositions in B-flat major
|
```smalltalk
Extension { #name : 'Slot' }
{ #category : '*Shout' }
Slot >> styleNameIn: aRBVariableNode [
^ #instVar
]
```
|
Mazza Gallerie was an upscale shopping mall in the Friendship Heights neighborhood of northwest Washington, D.C. Opened in 1977, it had of retail space on three levels, a parking garage, and a direct connection to the Friendship Heights station of the Washington Metro. The last retail business closed in December 2022. The building is to be converted to residential apartments with retail on the ground floor.
The mall was named after Louise Mazza, whose daughter Olga inherited the land before it was developed. When the family sold the property, they attached a covenant requiring any future development to be called Mazza and to display a picture of Louise Mazza.
History
It was an early project of Herbert S. Miller's Western Development Corporation (now Mills Corporation), which later developed Potomac Mills, Georgetown Park, and Washington Harbour. Miller assembled a deal with property owner Olga Mazza and Neiman Marcus owner Stanley Marcus to build the development; Olga wanted an office building named after her mother and Miller wanted a residential component, but neither were approved. The developers envisioned the 60-store mall as the anchor for a new upscale shopping district: Washington's version of Fifth Avenue or Rodeo Drive. A retail corridor had already been established with the opening of a freestanding Woodward & Lothrop department store in 1950, followed by Lord & Taylor (1959) and Saks Fifth Avenue (1964) nearby, and a stop on the Metro Red Line had been approved in 1973.
Construction difficulties and labor disputes delayed construction of the $25 million project. The mall finally opened in November 1977, almost four years behind schedule. The delays had contributed to vacancies, as merchants could not plan for a firm opening date, and many retailers chose to open locations instead in White Flint Mall north of Bethesda. Traffic caused by the also-delayed construction of the Metro discouraged patrons from visiting the area. The poor early performance of the center contributed to a reputation as a "troubled" shopping center.
In June 1997, a group headed by Daniel McCaffery, who owned the Friendship Centre development across Wisconsin Avenue, acquired the mall for $28 million and opened up the marble block exterior with new windows, better lighting, and additional entrances. The project was financed by Security Capital Group, which was acquired by General Electric in 2001.
Among the added venues were a General Cinemas theatre and a restaurant, The R Room, owned by General Cinemas but operated by the restaurant division of Neiman Marcus. It closed in December 2000. Another restaurant, Rock Creek, operated in the restaurant space from 2007 to 2009.
In June 2004, Teachers Insurance and Annuity Association of America (TIAA) acquired the mall from General Electric for $77 million.
Ashkenazy Acquisition Corporation, led by Ben Ashkenazy, acquired the mall from TIAA for $78 million in January 2017. By then, the Friendship Heights shopping district had been declining overall, as merchants closed or moved to the CityCenterDC district in downtown Washington, which opened in 2014, and as part of the general trend dubbed the "retail apocalypse." The COVID-19 pandemic accelerated these effects. In August 2020, the closure of the Neiman Marcus store was announced and Annaly Capital Management acquired the property via foreclosure. In February 2021, the closing of the AMC Theatres was announced.
Tishman Speyer acquired the property in May 2021 for $52 million with plans to redevelop it into 350 apartments and of retail space.
The last retail business in the mall, TJ Maxx, and the mall itself permanently closed on December 24, 2022.
References
Shopping malls in Washington, D.C.
Defunct shopping malls in the United States
Shopping malls established in 1977
1977 establishments in Washington, D.C.
Shopping malls disestablished in 2022
2022 disestablishments in Washington, D.C.
|
```csharp
using System.Data.Common;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore.Query;
using Microsoft.EntityFrameworkCore.TestUtilities;
using MySqlConnector;
using Xunit;
using Xunit.Abstractions;
namespace Pomelo.EntityFrameworkCore.MySql.FunctionalTests.Query;
public class NorthwindSqlQueryMySqlTest : NorthwindSqlQueryTestBase<NorthwindQueryMySqlFixture<NoopModelCustomizer>>
{
public NorthwindSqlQueryMySqlTest(NorthwindQueryMySqlFixture<NoopModelCustomizer> fixture, ITestOutputHelper testOutputHelper)
: base(fixture)
{
Fixture.TestSqlLoggerFactory.SetTestOutputHelper(testOutputHelper);
}
[ConditionalFact]
public virtual void Check_all_tests_overridden()
=> TestHelpers.AssertAllMethodsOverridden(GetType());
public override async Task SqlQueryRaw_over_int(bool async)
{
await base.SqlQueryRaw_over_int(async);
AssertSql(
"""
SELECT `ProductID` FROM `Products`
""");
}
public override async Task SqlQuery_composed_Contains(bool async)
{
await base.SqlQuery_composed_Contains(async);
AssertSql(
"""
SELECT `o`.`OrderID`, `o`.`CustomerID`, `o`.`EmployeeID`, `o`.`OrderDate`
FROM `Orders` AS `o`
WHERE `o`.`OrderID` IN (
SELECT `s`.`Value`
FROM (
SELECT `ProductID` AS `Value` FROM `Products`
) AS `s`
)
""");
}
public override async Task SqlQuery_composed_Join(bool async)
{
await base.SqlQuery_composed_Join(async);
AssertSql(
"""
SELECT `o`.`OrderID`, `o`.`CustomerID`, `o`.`EmployeeID`, `o`.`OrderDate`, CAST(`s`.`Value` AS signed) AS `p`
FROM `Orders` AS `o`
INNER JOIN (
SELECT `ProductID` AS `Value` FROM `Products`
) AS `s` ON `o`.`OrderID` = CAST(`s`.`Value` AS signed)
""");
}
public override async Task SqlQuery_over_int_with_parameter(bool async)
{
await base.SqlQuery_over_int_with_parameter(async);
AssertSql(
"""
p0='10'
SELECT `ProductID` FROM `Products` WHERE `ProductID` = @p0
""");
}
protected override DbParameter CreateDbParameter(string name, object value)
=> new MySqlParameter { ParameterName = name, Value = value };
private void AssertSql(params string[] expected)
=> Fixture.TestSqlLoggerFactory.AssertBaseline(expected);
}
```
|
Steve Thomas may refer to:
Steve Thomas (artist) (born 1944), English designer and visual artist
Steve Thomas (television) (born 1952), former host of This Old House on PBS, host of Renovation Nation on Planet Green
Steve Thomas (Royal Navy Fleet Air Arm aviator) (born 1961), Royal Navy Fleet Air Arm aviator
Steve Thomas (ice hockey) (born 1963), National Hockey League ice hockey player
Steve Thomas (politician) (born 1967), Western Australian MLA
Steve Thomas (footballer) (born 1979), Welsh footballer
Steve Thomas (rugby) (born 1979), Welsh rugby league footballer
Stevie Thomas
Stevie Thomas (born 1967), Arena football player
See also
Steven Thomas (disambiguation)
Stephen Thomas (disambiguation)
|
The Halqa is a Moroccan concept that refers to people's theatre, an audience circle in the middle of which is the Helayqi (the artist who presents the show). This term has always been associated with the Moroccan intangible cultural richness of music, dance, singing, and storytelling.
Over time, this form of gathering passed into Moroccan popular culture, where it took on its own special character and became an intangible cultural heritage. For Moroccan society, the Halqa is a practice rooted in Moroccan culture, linked and bound to a geographical location. From this came the anthropological interest in this tradition associated with the place, or the "place of memory," as Pierre Nora called it.
Etymology
The term Halqa (plural Helaki) means "circle" in Arabic and refers to the circular formation of spectators or listeners around a person or group of persons giving a speech or presenting a show. Hence, the word can be applied to anything that takes a circular or semi-circular shape. In the Arab world, lessons held in schools (Medersa) traditionally took a circular shape, with students sitting in a semi-circle around their teacher. A similar form of gathering was also prominent in ancient Greek society, in the agora.
History
Some historical writings indicate that the art of the Halqa in Morocco originated in the 19th century AD in the city of Marrakech in the Jemaa el-Fna square, before it spread to the rest of the cities. Especially in its weekly markets Souq, which were visited by people from different areas surrounding the city.
The presentation of the Halqa is supervised by specialists in the art of storytelling, mime and acrobatic games, as they present their performances in the markets and squares of major cities such as Bab al-Saqma, Bab al-Futuh, Bab 'Ajisah in Fez, or al-Hedim square, Bab Mansour al-'Alaj in Meknes, and Jemaa el-Fna Square in Marrakech to create a mixture of comedy, drama, poetry, singing, dance and storytelling, sometimes even the audience participates in it, and often personifies amazigh and Arab myths and legends inspired by the diverse Moroccan culture.
As for the Besaat, i.e. jokes or teasing, it was at first just traditional imitation, meant to entertain the sultans and statesmen on the one hand and to amuse the people on the other, such as acting out the story of the Balarj to the rhythms of the bendir and the melodies of the flute.
The first performances of the Besaat theater in Morocco were presented before Sultan Muhammad bin Abdullah and were used for preaching and guidance in an entertaining way. Over the years, and thanks to the encouragement of the sultans, this art developed from these short shows.
These Halayqiya artists would come to Rabat on the occasion of Ashura and proceed to the royal palace in an atmosphere of humor and fun, amid the curiosity of adults and children; upon reaching the royal palace, they began to perform their shows.
While each group had its own distinguishing characteristics, the people of Fez specialized in imitation. It is said that the themes of the sketches they presented became social and patriotic in addition to their comedic character. These sketches dealt with social issues presented in an amusing form before the king, who, made aware of the hidden dramas behind them, would look into these issues and restore people's rights to them.
In 2003, UNESCO considered the Jemaa el-Fna square as a cultural space and a form of cultural expression, and a masterpiece of oral tradition. Out of 32 files, 19 were selected, including the Jemaa el-Fna square file, as a place that shines in various types of intangible heritage.
This art has known periods of ups and downs and has begun to gradually disappear in certain Moroccan cities. Apart from a few spaces, the Jemaa el-Fna square in Marrakech, inscribed by UNESCO as intangible cultural heritage in 2008, remains the vital cradle that embraces the pioneers of Halqa art, with wonderful stories of local oral heritage or acrobatic feats performed by young men who never attended a sports institute but inherited a traditional gestural heritage specific to certain tribes.
The traditional storytelling
Definition
The traditional storytelling is defined as a narrative heritage that is part of Moroccan folk oral literature. It is transmitted orally from generation to generation, and is usually presented in the form of a story, a riddle or a fable whose purpose is to draw a lesson at the end. It is one of the most important means of transmitting culture and moral values to future generations.
Traditional storytelling in the Marrakesh community
The traditional storytelling is part of the identity of the Marrakchi community. The art of storytelling circulated among people, in a family form, before the Halqa of the storyteller of Jemaa El-Fna square became one of the most important constituent elements of the activities of this square. As a result, storytelling has become an important part of intangible heritage.
Mohammed Bariz
Mohammed Bariz was born in 1959 in Marrakech. He was influenced by his mother's tales and by the most famous storyteller of his time, Moulay Muhammad al-Jabri. He entered the realm of storytelling through its wide doors in 1969, mastering the art of the tale and the way of holding the Halqa together through his storytelling and oral theatrical embodiment. His repertoire remains filled to this day with treasures he never ceases to tell. Over three decades his storytelling legacy has endured, and he and others like him have preserved it in Jemaa El-Fna Square and passed it on to generations who now carry the torch of its enhancement and preservation.
Moroccan women and the traditional storytelling
Women have always been a source of inspiration for storytellers, their presence adding to the intrigue and suspense of storytelling. Not only was a woman often one of the tale's heroines, she was also frequently the narrator of its events. As a result, women have had two fundamental roles in the transmission of oral tales throughout history: sometimes as a character helping to shape the course of events, and sometimes as the narrator of the tale herself.
Women's leadership in conveying the traditional tales within Moroccan society
In a former era, women used to hold gatherings for storytelling in a closed, warm family atmosphere, as part of traditional rituals that are now on the verge of extinction, in which they celebrated the art of storytelling in the company of the whole family.
The storytelling gathering usually consisted of women and children, and every woman poured into it the stories she carried within her. The woman narrator was the bridge through which the child crossed, in his imagination, the events of the story before settling into slumber, much like the lullabies sung to an infant before he surrenders to sleep.
She played, and still plays, a role in the socialization of the child, starting with the lullabies of the cradle and moving on to the riddles and stories of childhood, which inculcate in children the values of goodness and the principles of identity. The narrator transports them into the depths of imagination, so that she and her listeners share a collective imagination that serves as an outlet and a refuge from the emptiness of everyday reality.
Oral narration is a function that grandmothers and mothers took care of in the past. However, the common collective arena was not devoid of female storytellers and women narrators who practiced the art of storytelling by entering the Halqa, whether in the Jemaa El Fna square or in the hospitality of the great Riads of Marrakech. They were artists carrying the torch of this heritage. One notable example is the honorable storyteller known as Lalla Ruqia, who lived some 111 years ago in the suite reserved for women at the Zaouia of Sidi Bel Abbes, where unmarried, poor, honorable women were gathered.
Jmi'a and Zahra also learned the art of storytelling as children from an old slave-woman in the Makhzen House, who was the storyteller of Sultan Mawla al-Hasan at the time. Sultan Hassan I (19th century AD) also ordered special evenings of tales to be held in the Agdal Palace in Marrakesh, under the supervision of the best storytellers of the city of Marrakech.
While the honorable Lalla Ruqia, Lalla Jmi'a, and Lalla Zahra are preserved in the Moroccan collective memory, they are no exception. There were certainly other female storytellers besides them, inadvertently or perhaps deliberately omitted from the Marrakeshi documentation record, of whom no account has reached us.
Popular theater
The popular theater played an important role in the Halqa performances. It rests on the capabilities of the artist, who resurrects the oral heritage in its old forms through imitation and improvisation, qualities that most captivated audiences. The Halqa is tied to the symbolic; as an expression of the past, it can touch the nostalgia for the heritage of the ancestors. There are many types of popular theater: Rma, Rehhala, Mejdoub, Baqsheesh, Lamsiyeh, keepers of pigeons, snakes, monkeys and donkeys, "knowers" of the secrets of the horoscope, astrologers, magicians and singers. Each has its own way of performing, but all are based on drawing upon elements of popular memory.
In this art, the theatrical play unfolds in front of everyone: each spectator watches from his own angle, and all are visible to all.
Within the circle of the Ḥalqa is a special time, a time cut out of real time. It is a time of enjoyment and of collectively retrieving memory, origin, the village, the desert, and the distant ravine.
Halqa of comedy
It consists of an individual or a group of two to three people performing a short scene (sketch) that satirizes certain aspects of daily, political or cultural life. Some use magic tricks to attract and entertain the spectators. The great comedians of the second half of the twentieth century enjoyed a high reputation, such as Baqsheesh, Flifla, moul lahmar and Saroukh, followed by Lamsiyyeh in the twenty-first century. Each generation has its own way, and each has its own comic and expressive style in conveying the jokes and short scenes it has inherited, inspired by the customs and traditions of Moroccan society.
Halqa of acrobatic shows
These shows are presented by young people from Sous under the patronage of their saint, Sidi Hmad ou Moussa. They wear red and green suits. From an early age, acrobats live by very strict rules. They learn from their elders to flex the muscles of their bodies and subject them to difficult exercises, such as forming a pyramid made up of all the members of the group. The secret of this type of Halqa's success is the teamwork that such physical performance requires.
Halqa of animal tamers
The Halqa of animal tamers, that of the monkey trainers for example, delights audiences because of the animals' innate ability to mimic people's facial expressions. So does that of the snake charmers, who carry a very special aura, always arousing in spectators a respect mixed with fear. They have been immunized since birth against snake venom, and they also have the ability to draw out the deadly poison directly, which enables them, at the same time, to save human lives. This is a grant and a gift, which they owe to their sheikh, Sidi Ahmed Ben Aissa, whose zawiya is located in the city of Meknes.
During the Haḍra ceremonies, organized each year on the day of the Prophet's birthday, the Issawi display the diversity of their wonders. In Jemaa el-Fna, however, they content themselves with submitting the snakes to their will, uniting with them in enchantment to the sound of ghaita music. Yet behind the desire to entertain passers-by in the Halqa, the Issawi always remain deeply respectful of their snakes.
Halqa of traditional music
Traditional music is one of the components of the Halqa that energizes the Jemaa El-Fna square theater. Al-Malhoun, Gnaoua, Al-Ruwais, Al-Ghaita, Awlad Hmar, Hadawa, Al-Aita Al-Houzia (Al-Hawzi), Hamadcha, Awlad Sidi Rahal, all of them are independent musical variations that represent the diverse musical heritage of Morocco.
Traditional music consists of poems called Zajal, inspired by popular literature and sung in Moroccan dialectal Arabic, in the form of musical and lyrical sequences, accompanied by an instrument: the Cambri, the flute, the violin, the Taarija or the Tara (tambourine), to mark the rhythm. In this way a lyrical and musical show is formed, inspired by the lifestyle of passers-by or derived from the traditions and stories of one of the various regions of Morocco. It may also be an improvisation of the moment, which gives the Halqa its character of pleasure and familiarity.
Valorization of the art of Halqa as Moroccan popular oral heritage in our time
The presence of today's Helayqi does justice to his Helayqi ancestors, marking appreciation for and continuity in the preservation of this oral heritage. Indeed, Halqa art in general, and Helayqi artists in particular, have always formed the core of a form of spontaneous theatrical expression of social issues such as heritage, customs, traditions, and the norms and laws of life in society. They have long been seen as the faithful guardians and defenders of an oral culture that is hundreds of years old.
See also
List of intangible cultural heritage in Morocco
Jemaa el-Fnaa
Gnawa
Storytelling
References
Moroccan culture
Intangible Cultural Heritage of Humanity
Street art
Storytelling
|
```tsx
import { HardDriveIcon, LayersIcon } from 'lucide-react';
import { EditEdgeStackForm } from '@/react/edge/edge-stacks/ItemView/EditEdgeStackForm/EditEdgeStackForm';
import { useParamState } from '@/react/hooks/useParamState';
import { useIdParam } from '@/react/hooks/useIdParam';
import { NavTabs } from '@@/NavTabs';
import { PageHeader } from '@@/PageHeader';
import { Widget } from '@@/Widget';
import { useEdgeStack } from '../queries/useEdgeStack';
import { EnvironmentsDatatable } from './EnvironmentsDatatable';
export function ItemView() {
const idParam = useIdParam('stackId');
const edgeStackQuery = useEdgeStack(idParam);
const [tab = 'stack', setTab] = useParamState<'stack' | 'environments'>(
'tab'
);
if (!edgeStackQuery.data) {
return null;
}
const stack = edgeStackQuery.data;
return (
<>
<PageHeader
title="Edit Edge stack"
breadcrumbs={[
{ label: 'Edge Stacks', link: 'edge.stacks' },
stack.Name,
]}
reload
/>
<div className="row">
<div className="col-sm-12">
<Widget>
<Widget.Body className="!p-0">
<NavTabs<'stack' | 'environments'>
justified
type="pills"
options={[
{
id: 'stack',
label: 'Stack',
icon: LayersIcon,
children: (
<div className="p-5 pb-10">
<EditEdgeStackForm edgeStack={stack} />
</div>
),
},
{
id: 'environments',
icon: HardDriveIcon,
label: 'Environments',
children: <EnvironmentsDatatable />,
},
]}
selectedId={tab}
onSelect={setTab}
/>
</Widget.Body>
</Widget>
</div>
</div>
</>
);
}
```
|
```yaml
### YamlMime:FAQ
metadata:
title: Windows Enterprise multi-session FAQ - Azure
description: Frequently asked questions and best practices for using Windows Enterprise multi-session for Azure Virtual Desktop.
author: dknappettmsft
ms.topic: faq
ms.date: 08/02/2024
ms.author: daknappe
ms.custom: docs_inherited
title: Windows Enterprise multi-session FAQ
summary: This article answers frequently asked questions and explains best practices for Windows 10 Enterprise multi-session and Windows 11 Enterprise multi-session.
sections:
- name: Ignored
questions:
- question: What is Windows Enterprise multi-session?
answer: |
- question: How many users can simultaneously have an interactive session on Windows Enterprise multi-session?
answer: |
How many interactive sessions can be active at the same time depends on your system's hardware resources (vCPU, memory, disk, and vGPU), how your users use their apps while signed in to a session, and how heavy your system's workload is. We suggest you validate your system's performance to understand how many users you can have on Windows Enterprise multi-session. To learn more, see [Azure Virtual Desktop pricing](path_to_url
- question: Why does my application report Windows Enterprise multi-session as a Server operating system?
answer: |
Windows Enterprise multi-session is a virtual edition of Windows Enterprise. One of the differences is that this operating system (OS) reports the [ProductType](/windows/win32/cimwin32prov/win32-operatingsystem) as having a value of 3, the same value as Windows Server. This property keeps the OS compatible with existing RDSH management tooling, RDSH multi-session-aware applications, and mostly low-level system performance optimizations for RDSH environments. Some application installers can block installation on Windows multi-session depending on whether they detect the ProductType is set to Client. If your app won't install, contact your application vendor for an updated version.
- question: Can I run Windows Enterprise multi-session outside of the Azure Virtual Desktop service?
answer: We don't allow customers to run Windows Enterprise multi-session in production environments outside of the Azure Virtual Desktop service. Only Microsoft or the Azure Virtual Desktop Approved Providers, Citrix and VMware, can provide access to the Azure Virtual Desktop service. It's against the licensing agreement to run Windows multi-session outside of the Azure Virtual Desktop service for production purposes. Windows multi-session also won't activate against on-premises Key Management Services (KMS).
- question: Can I upgrade a Windows VM to Windows Enterprise multi-session?
answer: No. It's not currently possible to upgrade an existing virtual machine (VM) that's running Windows Professional or Enterprise to Windows Enterprise multi-session. Also, if you deploy a Windows Enterprise multi-session VM and then update the product key to another edition, you won't be able to switch the VM back to Windows Enterprise multi-session and will need to redeploy the VM. Changing your Azure Virtual Desktop VM SKU to another edition is not supported.
- question: Does Windows Enterprise multi-session support Remote Desktop IP Virtualization?
answer: No. Azure Virtual Desktop [supported virtual machine OS images](prerequisites.md#operating-systems-and-licenses) do not support Remote Desktop IP Virtualization.
- question: How do I customize the Windows Enterprise multi-session image for my organization?
answer: |
You can start a VM in Azure with Windows Enterprise multi-session, customize it by installing line-of-business (LOB) applications, run sysprep to generalize it, and then create an image using the Azure portal.
To get started, create a VM in Azure with Windows Enterprise multi-session. Instead of starting the VM in Azure, you can download the VHD directly. After that, you'll be able to use the VHD you downloaded to create a new Generation 1 VM on a Windows PC with Hyper-V enabled.
Customize the image to your needs by installing LOB applications and sysprep the image. When you're done customizing, upload the image to Azure with the VHD inside. After that, get Azure Virtual Desktop from the Azure Marketplace and use it to deploy a new host pool with the customized image.
- question: How do I manage Windows Enterprise multi-session after deployment?
answer: You can use any supported configuration tool, but we recommend Configuration Manager version 1906 because it supports Windows Enterprise multi-session or [Microsoft Intune](management.md) for Microsoft Entra joined or Microsoft Entra hybrid joined session hosts.
- question: Can Windows Enterprise multi-session be Microsoft Entra joined?
answer: |
Windows Enterprise multi-session can be Microsoft Entra joined. To get started, follow the steps to [Deploy Microsoft Entra joined virtual machines](deploy-azure-ad-joined-vm.md).
- question: Where can I find the Windows Enterprise multi-session image?
answer: |
Windows Enterprise multi-session can be conveniently selected in the Azure Virtual Desktop management interface while managing your environment. When needed, you can navigate to [**Azure Marketplace**](path_to_url search for the Windows 10 or Windows 11 offering, and select **Windows Enterprise multi-session plan**. For an image integrated with Microsoft 365 Apps for Enterprise, search with keyword **multi-session** to get to this offering. The marketplace images are updated monthly after the security patch release schedule of Windows Servicing & Delivery. The images with Microsoft 365 apps pre-installed are made available in the marketplace around the middle of the 3rd week of the month:
- [Windows 10 and 11 updates](path_to_url
- [Microsoft 365 Apps security updates](path_to_url and [feature updates](path_to_url
- Windows 365 gallery images include the latest Monthly Enterprise Channel release with the latest security updates.
- [Microsoft Teams updates](path_to_url
- question: Which Windows Enterprise multi-session versions are supported?
answer: Windows Enterprise multi-session, versions 1909 and later are supported and are available in the Azure gallery. These releases follow the same support lifecycle policy as Windows Enterprise, which means the March release is supported for 18 months and the September release for 30 months.
- question: Which profile management solution should I use for Windows Enterprise multi-session?
answer: |
We recommend you use FSLogix profile containers when you configure Windows Enterprise in non-persistent environments or other scenarios that need a centrally stored profile. FSLogix ensures the user profile is available and up-to-date for every user session. We also recommend you use your FSLogix profile container to store a user profile in any SMB share with appropriate permissions, but you can store user profiles in Azure page blob storage if necessary. Azure Virtual Desktop users can use FSLogix at no additional cost. FSLogix comes pre-installed on all Windows Enterprise multi-session images, but the IT admin is still responsible for configuring the FSLogix profile container.
For more information about how to configure an FSLogix profile container, see [Configure the FSLogix profile container](create-host-pools-user-profile.md#configure-the-fslogix-profile-container).
- question: Which license do I need to access Windows Enterprise multi-session?
answer: |
For a full list of applicable licenses, see [Azure Virtual Desktop pricing](path_to_url
- question: Why do my apps disappear after I sign out?
answer: This happens because you're using Windows Enterprise multi-session with a profile management solution like FSLogix. Your admin or profile solution configured your system to delete user profiles when users sign out. This configuration means that when your system deletes your user profile after you sign out, it also removes any apps you installed during your session. If you want to keep the apps you installed, you'll need to ask your admin to provision these apps for all users in your Azure Virtual Desktop environment.
- question: How do I make sure apps don't disappear when users sign out?
answer: |
Most virtualized environments are configured by default to prevent users from installing additional apps to their profiles. If you want to make sure an app doesn't disappear when your user signs out of Azure Virtual Desktop, you have to provision that app for all user profiles in your environment. For more information about provisioning apps, check out these resources:
- [Publish built-in apps in Azure Virtual Desktop](publish-apps.md)
- [DISM app package servicing command-line options](/windows-hardware/manufacture/desktop/your_sha256_hashions)
- [Add-AppxProvisionedPackage](/powershell/module/dism/add-appxprovisionedpackage)
- question: How do I make sure users don't download and install apps from the Microsoft Store?
answer: |
You can disable the Microsoft Store app to make sure users don't download extra apps beyond the apps you've already provisioned for them.
To disable the Store app:
1. Create and edit a new Group Policy Object.
2. Select **Computer Configuration** > **Policies** > **Administrative Templates** > **Windows Components** > **Store**.
3. Open the **Turn off the Store Application** setting.
4. Select the **Enabled** option.
5. Click the **Apply** button.
6. Click the **OK** button.
- question: Can Windows 10 Enterprise multi-session and Windows 11 Enterprise multi-session receive feature updates through Windows Server Update Services (WSUS)?
answer: |
Yes. You can update Windows 10 Enterprise multi-session and Windows 11 Enterprise multi-session with the appropriate feature updates published to WSUS.
additionalContent: |
## Next steps
To learn more about Azure Virtual Desktop and Windows Enterprise multi-session:
- Read our [Azure Virtual Desktop documentation](overview.md)
- Visit our [Azure Virtual Desktop TechCommunity](path_to_url
- Set up your Azure Virtual Desktop deployment with the [Azure Virtual Desktop tutorials](./virtual-desktop-fall-2019/tenant-setup-azure-active-directory.md)
```
|
```typescript
import { GeneratorTypes, PasswordTypes } from "../data/generator-types";
/** The kind of credential being generated. */
export type GeneratorType = (typeof GeneratorTypes)[number];
/** The kinds of passwords that can be generated. */
export type PasswordType = (typeof PasswordTypes)[number];
```
|
The New Zealand cricket team toured Australia from 18 November – 13 December 2011. The tour consisted of two Tests played for the Trans-Tasman Trophy.
The series was drawn 1–1, so the trophy was retained by Australia. New Zealand's win in the second Test in Hobart was its first Test win in Australia since 1985, and its first Test match victory against Australia since 1993.
Squads
Tour matches
New South Wales XI v New Zealanders
Australia A v New Zealanders
Test Series (Trans-Tasman Trophy)
1st Test
Three Australia players, opening batsman David Warner and fast bowlers James Pattinson and Mitchell Starc, made their Test debuts in this match, after injuries to five players (batsmen Shane Watson and Shaun Marsh, and fast bowlers Ryan Harris, Mitchell Johnson and Pat Cummins), all of whom played in Australia's 1–1 series tie against South Africa earlier that month.
Day 1
New Zealand won the toss and chose to bat. Australia had the upper hand in the morning session, reducing New Zealand to 4/94 at lunch, with only Brendon McCullum (34) making more than twenty. A fifth wicket fell just after lunch, before Dean Brownlie and Daniel Vettori stabilised the innings. The pair added eighty runs without loss before play was called off due to rain and bad light before tea. New Zealand was 5/176 at stumps.
Day 2
Brownlie (77*) and Vettori (96) continued at the start of Day 2, extending their sixth wicket partnership to 158 runs before Vettori was run out. New Zealand collapsed from 5/254 to 8/259, before being dismissed for 295 just before lunch. Nathan Lyon cleaned up the tail, and finished with four wickets for Australia.
New Zealand dismissed the Australian openers cheaply to reduce Australia to 2/25 early in the afternoon session, but Australia recovered to reach 3/154 when stumps were drawn on Day 2, again early due to bad light.
Day 3
Australia added 100 runs for the loss of two wickets in the morning session, including Ricky Ponting (78). The new ball was taken almost immediately after the lunch break, and Michael Clarke (139) and Brad Haddin (80) survived to add 108 runs for the sixth wicket before Clarke's dismissal. Haddin then batted with the tail, and Australia added another 82 runs for the last four wickets, the most notable contribution coming from Mitchell Starc (32*). Australia was dismissed for 427, a lead of 132, with seven overs remaining in the day. Chris Martin was the top wicket-taker with three.
New Zealand lost McCullum (1) in the final over of the day's play, and finished at 1/10.
Day 4
In the first hour of Day 4, debutant fast bowler James Pattinson (5/27) dismissed the entire New Zealand top order to reduce the Black Caps to 5/28. Pattinson took the first five wickets of the innings, including McCullum (1) on Day 3, then Martin Guptill (12), Kane Williamson (0), Ross Taylor (0) and nightwatchman Doug Bracewell (2). Australia went on to dismiss New Zealand for 150 midway through the afternoon session; Dean Brownlie (42) was the top scorer.
Needing only 19 runs for victory, Australia made the runs inside three overs for the loss of one wicket. Debutant David Warner scored the winning runs.
James Pattinson (1/64 & 5/27) was voted Man of the Match.
2nd Test
Uncapped fast bowler Trent Boult was brought into the New Zealand team to replace Daniel Vettori, who pulled out late with an injury. The Australian team was unchanged from the first Test.
Day 1
After winning the toss on a very green pitch, Australian captain Michael Clarke elected to field on Day 1. The Australian bowlers dominated play on Day 1, dismissing New Zealand for 150 at tea, with only Dean Brownlie (56) managing more than twenty runs. Australia had reduced New Zealand to 6/60 in the morning session, but Brownlie's partnerships with Doug Bracewell (45 runs for the seventh wicket) and Tim Southee (41 runs for the eighth wicket) helped New Zealand to a score of 150. James Pattinson (5/51) was the top wicket taker, and took five wickets for the second consecutive innings; Peter Siddle took three wickets.
Australia faced only 4.2 overs, reaching 1/12, before rain stopped play.
Day 2
With the pitch still green and assisting the fast bowlers, New Zealand dominated Day 2 just as Australia had Day 1. Chris Martin (3/46) dismissed the Australian top three, and Bracewell (3/20) and Trent Boult (3/29) also took three wickets apiece, as Australia was reduced to 7/75 before being dismissed for 136 late in the afternoon session. A 56-run partnership between tail-enders Siddle (36) and Pattinson (17) provided the only resistance for Australia. This gave New Zealand a 14-run lead on the first innings.
New Zealand reached tea without loss, before losing both openers in quick succession after tea to fall to 2/36. In conditions which still suited the bowlers, New Zealand added 103 runs for the loss of only one wicket for the rest of the day, finishing at 3/139 at stumps.
Day 3
After dismissing Kane Williamson (34) in the first over of the day, Australia took New Zealand's seven remaining wickets for only 87 runs in the morning session. Nathan Lyon (3/26), James Pattinson (3/54) and Peter Siddle (3/66) took three wickets each. Other than Williamson and top-scorer Ross Taylor (56), there were few strong contributions from New Zealand's batsmen. Debutant Trent Boult (21 from 13 balls) helped to add 23 runs for the tenth wicket to extend New Zealand's score to 226, and set Australia a target of 241 for victory.
In a rain-interrupted afternoon and evening, Australia faced 19 overs, and openers Phillip Hughes and David Warner reached stumps without loss, with the score 0/72.
Day 4
Play started with Australia requiring 169 runs to win with ten wickets in hand. Hughes (20) was dismissed in the first over, but Warner and Usman Khawaja (23) added another fifty runs for the second wicket, before Khawaja was dismissed with the score 2/122. Australia took the score to 2/159, before Doug Bracewell took three wickets for no score to reduce Australia to 5/159. Australia went to lunch at 5/173, requiring 68 runs with five wickets in hand.
Shortly after lunch, Warner made his maiden Test century, and he and Brad Haddin (15) took the score to 5/192, before two wickets apiece to Tim Southee (2/77) and Doug Bracewell in consecutive overs reduced Australia to 9/199, requiring 42 runs from the last wicket for victory. Opener David Warner and number eleven Nathan Lyon batted together for 8.4 overs, with Lyon twice surviving DRS referrals for lbw appeals. Finally, after a 34-run partnership for the tenth wicket, Lyon (9) was bowled by Bracewell (6/40). Australia was all out for 233, and New Zealand won the match by seven runs. It was New Zealand's first victory against Australia in a Test since 1992–93, and its first in Australia since 1985–86. David Warner (123*) carried his bat, one of very few batsmen to achieve the feat in a fourth innings.
David Warner won the Man of the Match award for his 123* in the fourth innings. On a difficult batting pitch, his score was more than double the next highest (56, by Brownlie and Taylor).
Player of the Match controversy
For this series, Cricket Australia chose to allow the Player of the Match to be voted for by users within Australia of a phone app promoted on the Nine Network coverage by Vodafone, the telecommunications company sponsoring the series. This proved controversial in the second Test when Australian batsman David Warner (123*) easily won the award ahead of New Zealand bowler Doug Bracewell (3/20 & 6/40), whose bowling was a big key to the New Zealand victory; votes for Warner outnumbered votes for Bracewell by more than two to one, and home-country favouritism among voters was blamed. Cricket Australia abandoned the voting system after the series, and reverted to the traditional method of having the Player of the Match chosen by a panel of experts.
References
2011-12
2011–12 Australian cricket season
2011–12 New Zealand cricket season
International cricket competitions in 2011–12
|
In mathematics, a sample-continuous process is a stochastic process whose sample paths are almost surely continuous functions.
Definition
Let (Ω, Σ, P) be a probability space. Let X : I × Ω → S be a stochastic process, where the index set I and state space S are both topological spaces. Then the process X is called sample-continuous (or almost surely continuous, or simply continuous) if the map X(ω) : I → S is continuous as a function of topological spaces for P-almost all ω in Ω.
In many examples, the index set I is an interval of time, [0, T] or [0, +∞), and the state space S is the real line or n-dimensional Euclidean space Rn.
Examples
Brownian motion (the Wiener process) on Euclidean space is sample-continuous.
For "nice" parameters of the equations, solutions to stochastic differential equations are sample-continuous. See the existence and uniqueness theorem in the stochastic differential equations article for some sufficient conditions to ensure sample continuity.
The process X : [0, +∞) × Ω → R that makes equiprobable jumps up or down at each unit time is not sample-continuous; in fact, it is surely discontinuous.
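The contrast between the two examples can be checked numerically. The sketch below (plain Python, illustrative only; the function names and parameters are our own) approximates a Wiener path by summing Gaussian increments, whose largest step shrinks as the time mesh is refined, while the unit-time jump process always moves by exactly 1, a discontinuity that no refinement can smooth away.

```python
import math
import random

def wiener_path(T=1.0, n=1000, seed=0):
    # Approximate a Wiener process on [0, T]: sum independent
    # Gaussian increments with variance T/n (standard deviation sqrt(T/n)).
    rng = random.Random(seed)
    dt = T / n
    w = [0.0]
    for _ in range(n):
        w.append(w[-1] + rng.gauss(0.0, math.sqrt(dt)))
    return w

def jump_path(steps=10, seed=0):
    # The jump process: moves up or down by 1 at each unit time,
    # each direction with probability 1/2.
    rng = random.Random(seed)
    x = [0]
    for _ in range(steps):
        x.append(x[-1] + rng.choice([-1, 1]))
    return x

def max_step(path):
    # Largest increment between consecutive sample points.
    return max(abs(b - a) for a, b in zip(path, path[1:]))

# Wiener increments are O(sqrt(dt)): refining the mesh shrinks the
# largest step, consistent with (almost surely) continuous paths.
assert max_step(wiener_path(n=100)) > max_step(wiener_path(n=10000))

# Every move of the jump process has size exactly 1: a genuine jump.
assert max_step(jump_path()) == 1
```

No mesh refinement changes the unit jumps, which is why the process is surely (not merely almost surely) discontinuous.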
Properties
For sample-continuous processes, the finite-dimensional distributions determine the law, and vice versa.
See also
Continuous stochastic process
References
Stochastic processes
|
```c
/**
* @license Apache-2.0
*
*
*
* path_to_url
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
*/
#include <stdlib.h>
#include <stdio.h>
#include <math.h>
#include <time.h>
#include <sys/time.h>
#define NAME "erfc"
#define ITERATIONS 1000000
#define REPEATS 3
/**
* Define prototypes for external functions.
*/
extern double erfc( double x );
/**
* Prints the TAP version.
*/
static void print_version( void ) {
printf( "TAP version 13\n" );
}
/**
* Prints the TAP summary.
*
* @param total total number of tests
* @param passing total number of passing tests
*/
static void print_summary( int total, int passing ) {
printf( "#\n" );
printf( "1..%d\n", total ); // TAP plan
printf( "# total %d\n", total );
printf( "# pass %d\n", passing );
printf( "#\n" );
printf( "# ok\n" );
}
/**
* Prints benchmarks results.
*
* @param elapsed elapsed time in seconds
*/
static void print_results( double elapsed ) {
double rate = (double)ITERATIONS / elapsed;
printf( " ---\n" );
printf( " iterations: %d\n", ITERATIONS );
printf( " elapsed: %0.9f\n", elapsed );
printf( " rate: %0.9f\n", rate );
printf( " ...\n" );
}
/**
* Returns a clock time.
*
* @return clock time
*/
static double tic( void ) {
struct timeval now;
gettimeofday( &now, NULL );
return (double)now.tv_sec + (double)now.tv_usec/1.0e6;
}
/**
* Generates a random number on the interval [0,1).
*
* @return random number
*/
static double rand_double( void ) {
int r = rand();
return (double)r / ( (double)RAND_MAX + 1.0 );
}
/**
* Runs a benchmark.
*
* @return elapsed time in seconds
*/
static double benchmark( void ) {
double elapsed;
double x;
double y;
double t;
int i;
t = tic();
for ( i = 0; i < ITERATIONS; i++ ) {
x = ( 2.0*rand_double() ) - 1.0;
y = erfc( x );
if ( y != y ) {
printf( "should not return NaN\n" );
break;
}
}
elapsed = tic() - t;
if ( y != y ) {
printf( "should not return NaN\n" );
}
return elapsed;
}
/**
* Main execution sequence.
*/
int main( void ) {
double elapsed;
int i;
// Use the current time to seed the random number generator:
srand( time( NULL ) );
print_version();
for ( i = 0; i < REPEATS; i++ ) {
printf( "# c::cephes::%s\n", NAME );
elapsed = benchmark();
print_results( elapsed );
printf( "ok %d benchmark finished\n", i+1 );
}
print_summary( REPEATS, REPEATS );
return 0;
}
```
|
```cpp
//
//
// path_to_url
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#pragma once
#include <functional>
#include "paddle/cinn/operator_fusion/pattern_node.h"
#include "paddle/cinn/operator_fusion/pir_graph_analyzing/dim_relation.h"
#include "paddle/cinn/operator_fusion/pir_graph_analyzing/shardable_axes_base.h"
#include "paddle/cinn/operator_fusion/policy/policy_base.h"
#include "paddle/cinn/operator_fusion/utils.h"
#include "paddle/common/enforce.h"
namespace cinn::fusion {
class RelativeJudgePolicy final : public PolicyBase {
public:
static constexpr PolicyKind Kind = PolicyKind::RelativeJudge;
RelativeJudgePolicy(const std::vector<pir::Operation*>& ops,
pir::ShapeConstraintIRAnalysis* shape_analysis)
: axes_info_(ops, shape_analysis) {
VLOG(4) << "[relative_judge_policy] Start AnalysisIndexExprRelation.";
index_expr_map_ = AnalysisIndexExprRelation(ops);
VLOG(4) << "[relative_judge_policy] End AnalysisIndexExprRelation.";
}
bool CanFuse(const PatternNodePtr& upstream,
const PatternNodePtr& downstream);
ShardableAxesInfoManager& GetAxesInfoManager() { return axes_info_; }
std::string Name() { return "RelativeJudgePolicy"; }
std::vector<size_t> GetFakeReduceIterIdx(const PatternNodePtr& upstream,
const PatternNodePtr& downstream);
bool IsRelated(DimUsage in, DimUsage out) {
return index_expr_map_[in].count(out) == 1;
}
private:
DimUsageRelation index_expr_map_;
ShardableAxesInfoManager axes_info_;
bool ReduceTreeGrownCanMerge(const PatternNodePtr&, const PatternNodePtr&);
bool ReducePlusTrivialCanMerge(const PatternNodePtr&, const PatternNodePtr&);
std::pair<std::vector<DimUsage>, std::vector<DimUsage>>
SplitFirstIfRelatedBySecond(const std::vector<DimUsage>& targets,
const std::vector<DimUsage>& related_with);
std::optional<ReducePattern> GetDownstreamFromCandidate(
const ReducePattern& upstream,
const std::vector<ReducePattern>& candidates);
bool IsDownstreamStmtDependReduceOp(pir::Operation* reduce,
const StmtPattern& downstream);
};
} // namespace cinn::fusion
```
|
```ruby
cask "invalid-generic-artifact-no-target" do
version "1.2.3"
sha256 your_sha256_hash
url "file://#{TEST_FIXTURE_DIR}/cask/caffeine.zip"
homepage "path_to_url"
artifact "Caffeine.app"
end
```
|
```objective-c
/**
* All rights reserved.
*
* This source code is licensed under the BSD-style license found in the
* LICENSE file in the root directory of this source tree. An additional grant
* of patent rights can be found in the PATENTS file in the same directory.
*/
#import <Foundation/Foundation.h>
typedef NS_ENUM (NSInteger, IGListBatchUpdateState) {
IGListBatchUpdateStateIdle,
IGListBatchUpdateStateQueuedBatchUpdate,
IGListBatchUpdateStateExecutingBatchUpdateBlock,
IGListBatchUpdateStateExecutedBatchUpdateBlock,
};
```
|
```javascript
//your_sha256_hash---------------------------------------
//your_sha256_hash---------------------------------------
var output = "";
function echo(o) {
try {
document.write(o + "<br/>");
} catch (ex) {
try {
WScript.Echo("" + o);
} catch (ex2) {
print("" + o);
}
}
}
echo("--- 1 ---");
try { echo(eval('/a\0b/').toString().length); } catch (e) { echo("EXCEPTION"); }
try { echo(eval("/\n/")); } catch (e) { echo("EXCEPTION"); }
try { echo(eval("/\r/")); } catch (e) { echo("EXCEPTION"); }
echo("--- 2 ---");
try { echo(eval("/\\\0/").toString().length); } catch (e) { echo("EXCEPTION"); }
try { echo(eval("/\\\n/")); } catch (e) { echo("EXCEPTION"); }
try { echo(eval("/\\\r/")); } catch (e) { echo("EXCEPTION"); }
try { echo(eval("/\\\u2028/")); } catch (e) { echo("EXCEPTION"); }
try { echo(eval("/\\\u2029/")); } catch (e) { echo("EXCEPTION"); }
try { echo(eval("/\\")); } catch (e) { echo("EXCEPTION"); }
try { echo(eval("/\\\0/").toString().length); } catch (e) { echo("EXCEPTION"); }
try { echo(eval("/\\\0a/").toString().length); } catch (e) { echo("EXCEPTION"); }
echo("--- 3 ---");
try { echo(eval("/[\n]/")); } catch (e) { echo("EXCEPTION"); }
try { echo(eval("/[\r]/")); } catch (e) { echo("EXCEPTION"); }
try { echo(eval("/[\u2028]/")); } catch (e) { echo("EXCEPTION"); }
try { echo(eval("/[\u2029]/")); } catch (e) { echo("EXCEPTION"); }
try { echo(eval("/[\0]/").toString().length); } catch (e) { echo("EXCEPTION"); }
try { echo(eval("/[").toString().length); } catch (e) { echo("EXCEPTION"); }
try { echo(eval("/a)/")); } catch (e) { echo("EXCEPTION"); }
echo("--- 4 ---");
try { echo(eval('/\u2028*/').toString().length) } catch (e) { echo("EXCEPTION"); }
try { echo(eval('/\u2029*/').toString().length) } catch (e) { echo("EXCEPTION"); }
try { echo(eval('/\r*/').toString().length) } catch (e) { echo("EXCEPTION"); }
try { echo(eval('/\n*/').toString().length) } catch (e) { echo("EXCEPTION"); }
try { echo(eval('\0*').toString().length) } catch (e) { echo("EXCEPTION"); }
try { echo(eval('\0a*').toString().length) } catch (e) { echo("EXCEPTION"); }
echo("--- 5 ---");
try { echo(eval('\\\0*').toString().length) } catch (e) { echo("EXCEPTION"); }
try { echo(eval('/\\\r*/').toString().length) } catch (e) { echo("EXCEPTION"); }
try { echo(eval('/\\\n*/').toString().length) } catch (e) { echo("EXCEPTION"); }
try { echo(eval('/[\r]/').toString().length) } catch (e) { echo("EXCEPTION"); }
try { echo(eval('/[\n]/').toString().length) } catch (e) { echo("EXCEPTION"); }
try { echo(eval('[\0]').toString().length) } catch (e) { echo("EXCEPTION"); }
```
|
The Waghäusel Saalbach–Graben-Neudorf railway is a 7.94 km long railway north of Karlsruhe, Germany. It connects the Mannheim–Stuttgart high-speed railway with the Rhine Railway and is used by scheduled trains between Mannheim and Karlsruhe.
Route
The route runs from Waghäusel Saalbach junction (kilometre 31.7 of the new line) via Philippsburg Molzau junction (km 34.6 of the Rhine Railway) to Graben-Neudorf station. Both junctions are grade separated. Between Molzau and the southern exit from Graben-Neudorf to the Rhine Railway, the two parallel lines occupy three tracks.
History
Planning
The link was already envisaged in the early planning in 1973.
During the planning and construction, the track was divided for planning approval purposes into sections 4b (northern section, Philippsburg area) and 4c (southern section, Graben-Neudorf area). The new Mannheim–Stuttgart railway in the section from the entrance and exit to Graben-Neudorf was included in section 4a (Waghäusel). In the mid 1970s, the entrance and exit tracks from the new line were designated as Überholbahnhof Oberhausen (Oberhausen overtaking station) and the plans provided for overtaking movements. Such overtaking was no longer envisaged in the planning in 1979.
The approval procedure for section 4b was initiated on 21 August 1975. Three objections were discussed on 5 April 1978. Following the expression of an opinion by the regional council on 21 November 1978, the proposal was adopted on 10 January 1979 and, being unopposed, was made final on 10 April 1979.
The approval procedure for section 4c was also initiated on 21 August 1975. At the public hearing on 12 June 1978, ten objections from citizens and the community were considered. The opinion of the regional council was presented on 10 August 1979 and the proposal was adopted on 25 September 1979, subject to a lawsuit. It was given legal endorsement on 29 April 1980. According to another source the planning approval for this section was given on 16 October 1979, and since there were no complaints, it had immediate legal effect.
The approval procedure for section 4a was initiated on 4 February 1976. At the public hearing on 8 June 1978, 22 objections were considered. The regional council issued its opinion by 12 February 1979. Six lawsuits were filed against the zoning decision issued on 10 May 1979. The decision gained legal force on 15 June 1981.
Originally, a turnout speed of 130 km/h was planned for Saalbach junction.
Construction
Construction work on the 4.5 km-long connecting line began in early September 1980.
Operations
Saalbach junction has been in use since May 1988 with a design speed of 200 km/h.
High-speed switches
"Basket arch" turnouts (Korbbogenweichen) were built in 1988 at Saalbach junction so that trains could branch at 200 km/h. They were initially the most elaborate turnouts in Germany. As the continuation of the high-speed line to Stuttgart was then under construction, the points were initially fixed in the branching position. Their length is 154 m and their weight (including the concrete sleepers) is 210 t. The point blades are 40 m long, and the moveable point frogs are 15 m long. The radius of the branching track is initially 7,000 m and reduces to 6,000 m at the frog. The two sets of point blades are operated by two turnout motors with eight points of activation and closed with eight clamps. Separate test contacts are used to monitor the position of the points. The moveable point frogs are activated by a motor and secured by three clip fasteners. In the points area, two points signals are installed at the approaches and a third in the centre of the turnout. Another innovation was the use of slide chairs that require little lubrication. The two turnouts are still among the fastest in operating speed in Germany.
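As a quick sanity check on the quoted geometry (our own back-of-the-envelope arithmetic, not from the source), the uncompensated lateral acceleration on a flat curve is a = v²/R:

```python
def lateral_acceleration(v_kmh: float, radius_m: float) -> float:
    """Uncompensated lateral acceleration a = v^2 / R (in m/s^2) on a flat curve."""
    v_ms = v_kmh / 3.6  # convert km/h to m/s
    return v_ms ** 2 / radius_m

# Geometry quoted above: 200 km/h branch speed, radius easing from
# 7,000 m at the turnout entry to 6,000 m at the movable frog.
a_entry = lateral_acceleration(200.0, 7000.0)
a_frog = lateral_acceleration(200.0, 6000.0)
print(f"entry: {a_entry:.2f} m/s^2, frog: {a_frog:.2f} m/s^2")
# prints: entry: 0.44 m/s^2, frog: 0.51 m/s^2
```

The resulting 0.44 to 0.51 m/s² is modest by passenger-comfort standards, which is consistent with the 200 km/h diverging speed being feasible on such large radii.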
References
External links
Representations of the infrastructure and permitted speeds at OpenRailwayMap.
Railway lines in Baden-Württemberg
Railway lines opened in 1988
|
```java
package io.jpress.module.form.service.provider;
import io.jboot.aop.annotation.Bean;
import io.jboot.db.model.Columns;
import io.jpress.module.form.service.FormDatasourceItemService;
import io.jpress.module.form.model.FormDatasourceItem;
import io.jpress.commons.service.JPressServiceBase;
import javax.validation.constraints.NotNull;
@Bean
public class FormDatasourceItemServiceProvider extends JPressServiceBase<FormDatasourceItem> implements FormDatasourceItemService {
/**
 * Deletes all datasource items that belong to the given dictionary.
 *
 * @param dictId id of the dictionary whose items should be removed
 * @return true if the delete succeeded
 */
@Override
public boolean deleteByDictId(@NotNull Long dictId) {
return DAO.deleteByColumns(Columns.create().eq("dict_id",dictId));
}
}
```
|
```matlab
function softmax_multiclass_grad_hw()
% This file is associated with the book
% "Machine Learning Refined", Cambridge University Press, 2016.
% by Jeremy Watt, Reza Borhani, and Aggelos Katsaggelos.
% load in data
[X,y] = load_data();
% initializations
N = size(X,2);
C = length(unique(y));
X = [ones(size(X,1),1), X]';
W0 = randn(N+1,C);
alpha = 0.1;
% find separators via multiclass softmax classifier
W = softmax_multiclass_grad(X,y,W0,alpha);
% plot the separators as well as final classification
plot_separators(W, X, y)
%%%%%%%%%%%%%%%%% subfunctions %%%%%%%%%%%%%%%%%%%
function W = softmax_multiclass_grad(X,y,W0,alpha)
% initialize
max_its = 10000;
[N,P] = size(X);
C = length(unique(y));
W = W0;
k = 1;
%%% main %%%
while k <= max_its
% gradient of the multiclass softmax cost: grad = X*(S - Y)',
% where S holds the column-wise softmax of W'*X and Y is the
% one-hot encoding of the labels
S = exp(W'*X);                      % C x P class scores
S = S./repmat(sum(S,1),C,1);        % softmax probabilities per point
Y = zeros(C,P);
Y(sub2ind([C,P],y',1:P)) = 1;       % one-hot labels
grad = X*(S - Y)';                  % N x C, same shape as W
W = W - alpha*grad;
% update counter
k = k + 1;
end
end
function [X,y] = load_data()
data = csvread('4class_data.csv');
X = data(:,1:end - 1);
y = data(:,end);
end
function plot_separators(W,X,y)
red = [ 1 0 .4];
blue = [ 0 .4 1];
green = [0 1 0.5];
cyan = [1 0.7 0.5];
grey = [.7 .6 .5];
colors = [red;blue;green;cyan;grey];
% plot data
subplot(1,3,1)
plot_data(X,y)
subplot(1,3,2)
plot_data(X,y)
subplot(1,3,3)
plot_data(X,y)
%%% plot all linear separators %%%
subplot(1,3,2)
num_classes = length(unique(y));
x = [0:0.01:1];
for j = 1:num_classes
hold on
w = W(:,j);
plot (x,(-w(1)-w(2)*x)/w(3),'Color',colors(j,:),'linewidth',2);
end
%%% generate max-separator surface %%%
s = [0:0.005:1];
[s1,s2] = meshgrid(s,s);
s1 = reshape(s1,numel(s1),1);
s2 = reshape(s2,numel(s2),1);
% compute criteria for each point in the range [0,1] x [0,1]
square = [s1(:), s2(:)];
p = [ones(size(s1(:),1),1),s1(:), s2(:)];
f = W'*p';
[f,z] = max(f,[],1);
% fill in appropriate regions with class colors
subplot(1,3,3)
ind = find(z == 1);
k = boundary(square(ind,:));
v = [square(ind(k),:), ones(length(k),1)];
f = 1:length(k);
patch('Faces',f,'Vertices',v,'FaceColor',red,'FaceAlpha',0.3)
ind = find(z == 2);
k = boundary(square(ind,:));
v = [square(ind(k),:), 2*ones(length(k),1)];
f = 1:length(k);
patch('Faces',f,'Vertices',v,'FaceColor',blue,'FaceAlpha',0.3)
ind = find(z == 3);
k = boundary(square(ind,:));
v = [square(ind(k),:), 3*ones(length(k),1)];
f = 1:length(k);
patch('Faces',f,'Vertices',v,'FaceColor',green,'FaceAlpha',0.3)
ind = find(z == 4);
k = boundary(square(ind,:));
v = [square(ind(k),:), 4*ones(length(k),1)];
f = 1:length(k);
patch('Faces',f,'Vertices',v,'FaceColor',cyan,'FaceAlpha',0.3)
ind = find(z == 5);
k = boundary(square(ind,:));
v = [square(ind(k),:), 5*ones(length(k),1)];
f = 1:length(k);
patch('Faces',f,'Vertices',v,'FaceColor',grey,'FaceAlpha',0.3)
% produce decision boundary
s1 = reshape(s1,[length(s),length(s)]);
s2 = reshape(s2,[length(s),length(s)]);
z = reshape(z,[length(s),length(s)]);
num_classes = length(unique(z));
subplot(1,3,3)
for i = 1:num_classes - 1
hold on
contour(s1,s2,z,[i + 0.5,i + 0.5],'Color','k','LineWidth',2)
end
% make plot real nice lookin'
for i = 1:3
subplot(1,3,i)
axis([0 1 0 1])
axis square
xlabel('x_1','FontName','cmmi9','Fontsize',18)
ylabel('x_2','FontName','cmmi9','Fontsize',18)
set(get(gca,'YLabel'),'Rotation',0)
zlabel('y','FontName','cmmi9','Fontsize',18)
set(get(gca,'ZLabel'),'Rotation',0)
set(gca,'XTick',[0,1])
set(gca,'YTick',[0,1])
set(gca,'ZTick',[0:1:num_classes])
set(gcf,'color','w');
end
end
function plot_data(X,y)
red = [ 1 0 .4];
blue = [ 0 .4 1];
green = [0 1 0.5];
cyan = [1 0.7 0.5];
grey = [.7 .6 .5];
colors = [red;blue;green;cyan;grey];
% how many classes in the data? maximum 5 here.
class_labels = unique(y); % class labels
num_classes = length(class_labels);
% plot data
for i = 1:num_classes
class = class_labels(i);
ind = find(y == class);
hold on
scatter3(X(2,ind),X(3,ind),class*ones(length(ind),1),'Linewidth',2,'Markeredgecolor',colors(i,:),'markerFacecolor','none');
end
axis([0 1 0 1])
axis square
box on
end
end
```
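The update used in `softmax_multiclass_grad` relies on the standard multiclass softmax (cross-entropy) gradient, grad = X*(S − Y)', with S the column-wise softmax probabilities and Y the one-hot label matrix. A small pure-Python sanity check (toy made-up data, no external libraries) verifies one gradient entry against a finite difference:

```python
import math

# Toy check of the multiclass softmax gradient grad = X (S - Y)^T.
# X: rows = bias + 2 features, columns = 3 data points (made-up values).
X = [[1.0, 1.0, 1.0],      # bias row
     [0.2, 0.9, 0.5],      # feature x1
     [0.7, 0.1, 0.4]]      # feature x2
y = [0, 1, 2]              # class label of each point (0-based)
W = [[0.1, -0.2, 0.05],    # weights: 3 rows (features) x 3 classes
     [0.3, 0.1, -0.1],
     [-0.2, 0.2, 0.4]]

def cost(W):
    # negative log-likelihood of the softmax model
    total = 0.0
    for p in range(len(y)):
        scores = [sum(W[i][c] * X[i][p] for i in range(3)) for c in range(3)]
        m = max(scores)
        logz = m + math.log(sum(math.exp(s - m) for s in scores))
        total += logz - scores[y[p]]
    return total

def grad(W):
    G = [[0.0] * 3 for _ in range(3)]
    for p in range(len(y)):
        scores = [sum(W[i][c] * X[i][p] for i in range(3)) for c in range(3)]
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        Z = sum(exps)
        for c in range(3):
            s_cp = exps[c] / Z            # softmax probability
            ind = 1.0 if y[p] == c else 0.0
            for i in range(3):
                G[i][c] += X[i][p] * (s_cp - ind)
    return G

# finite-difference check of one gradient entry
eps = 1e-6
W2 = [row[:] for row in W]
W2[1][0] += eps
fd = (cost(W2) - cost(W)) / eps
print(abs(fd - grad(W)[1][0]) < 1e-4)
```

The same formula, vectorized, is what the MATLAB loop computes at each iteration.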
|
Koppány, also called Cupan, was a Hungarian lord in the late 10th century and a leader of the pagans opposing the Christianization of Hungary. As the duke of Somogy, he laid claim to the throne based on the traditional idea of seniority, but was defeated and executed by Stephen (born with the pagan name Vajk), son of the previous grand prince, Géza.
According to modern scholars' consensus view, he was a member of the royal Árpád dynasty. Koppány was the lord of the southern region of Transdanubia during the reign of Géza, who ruled between the early 970s and 997. After the death of Géza, Koppány laid claim to the throne against Géza's devout Christian son, Stephen. His claim was mainly supported by pagan Hungarians, but the royal army routed his army near Veszprém in 997 or 998. Koppány was killed either in the battle or in his duchy, to which he had fled from the battlefield. His corpse was cut in four pieces to be displayed on the walls of four major strongholds of Hungary, Győr, Veszprém, Esztergom and Gyulafehérvár (now Alba Iulia, Romania).
Family
He was the son of Zerind the Bald, according to the 14th-century Illuminated Chronicle. Although no primary source mentions that Koppány was descended from Álmos or Árpád, the first grand princes of the Hungarians, his attempt to seize the throne shows that he was a member of the Árpád dynasty. Historians debate which of the four or five sons of Árpád was Koppány's ancestor. Historians Gyula Kristó, László Szegfű and György Szabados say that Koppány was probably descended from Árpád's oldest son, Tarkatzus, but Kornél Bakay (who identified Zerind the Bald with Ladislas the Bald) writes that Árpád's youngest son, Zoltán, was Koppány's forefather. The exact date of Koppány's birth cannot be determined. He was allegedly born between around 950 and 965, because his claim to the throne in 997 shows that he was the oldest member of the Árpád dynasty at that time.
Duke of Somogy
The 14th-century Illuminated Chronicle recorded that "Duke Cupan ... held sway over a duchy" (ducatum tenebat, in Latin) during the reign of Géza, Grand Prince of the Hungarians. Géza, who ascended the throne around 972, was described as a cruel monarch in late 11th-century legends. His fame, along with the fact that only a few late-10th-century members of the royal family are known, suggests that Géza murdered most of his kinsmen, according to historian Pál Engel.
Even if Géza carried out a purge among his relatives, Koppány survived it. According to the Illuminated Chronicle, he was "Duke of Symigium" (or Somogy). Two later sources, the 15th-century Osvát Laskai and an unknown 16th-century Carthusian monk, mentioned that Koppány had also been the lord of Zala. Based on the sources, modern historians agree that Koppány administered the southwestern region of Transdanubia, most likely between Lake Balaton and the river Dráva. Szabados says that Koppány's father had already dominated Somogy and Zala; in contrast, László Kontler writes that Koppány received his duchy from Géza as compensation after Géza made his own son, Stephen, his heir.
Rebellion and death
Géza died in 997. Either in the same or the next year, Koppány revolted against Géza's successor, Stephen, claiming the throne and Géza's widow, Sarolt, for himself. His claim to the throne shows that he considered himself the lawful heir to Géza in accordance with the traditional principle of seniority, in contrast with the Christian law of primogeniture which supported Stephen's right to succeed his father. Koppány's effort to marry Géza's widow was also in line with the pagan custom of levirate marriage, but Christians regarded it as an incestuous attempt. Both of Koppány's claims suggest that he was a pagan, or at least inclined to paganism even if he had been baptised.
In the nearly contemporaneous deed of foundation of the Pannonhalma Archabbey, Stephen mentioned that "a certain county named Somogy" attempted to dethrone him after his father's death. The late 11th-century Lesser Legend of King St Stephen declared that "certain noblemen whose hearts were inclined to idle banquets" turned against Stephen after his ascension to the throne. Both sources suggest that it was not Stephen who started the war, but that Koppány rebelled against him.
Koppány started to "destroy the castles of Stephen, plunder his properties [and] murder his servants", according to the Lesser Legend. The same source also wrote that Koppány laid siege to Veszprém, but Stephen collected his army, marched to the fortress and annihilated Koppány's troops. The German knights who had settled in Hungary after Stephen married Gisela of Bavaria in 996, played a preeminent role in the victory of the royal army. The commander of the royal army, Vecelin, was one of the German immigrants. The deed of foundation of the Pannonhalma monastery even referred to the civil war as a fight between "the Germans and the Hungarians".
Koppány was killed by Vecelin in the battle near Veszprém, according to Chapter 64 of the Illuminated Chronicle. On the other hand, Chapter 40 of the same source says that Vecelin killed Koppány in Somogy. If the latter report is valid, Koppány fled from the battlefield after his defeat at Veszprém, but the royal army chased and murdered him in his duchy. On Stephen's order, Koppány's body was quartered and its parts were hung over the walls of Esztergom, Veszprém, Győr and Gyulafehérvár (present-day Alba Iulia in Romania).
References
Sources
Primary sources
The Hungarian Illuminated Chronicle: Chronica de Gestis Hungarorum (edited by Dezső Dercsényi) (1970). Corvina, Taplinger Publishing.
Secondary sources
Further reading
990s deaths
Hungarian nobility
House of Árpád
10th-century Hungarian people
History of Somogy
Year of birth unknown
|
```html
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "path_to_url">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=US-ASCII">
<title>Struct template impl</title>
<link rel="stylesheet" href="../../../../../doc/src/boostbook.css" type="text/css">
<meta name="generator" content="DocBook XSL Stylesheets V1.79.1">
<link rel="home" href="../../../index.html" title="The Boost C++ Libraries BoostBook Documentation Subset">
<link rel="up" href="../or_.html#id-1.3.33.5.34.7.4" title="Description">
<link rel="prev" href="../or_.html" title="Struct template or_">
<link rel="next" href="../and_.html" title="Struct template and_">
</head>
<body bgcolor="white" text="black" link="#0000FF" vlink="#840084" alink="#0000FF">
<table cellpadding="2" width="100%"><tr>
<td valign="top"><img alt="Boost C++ Libraries" width="277" height="86" src="../../../../../boost.png"></td>
<td align="center"><a href="../../../../../index.html">Home</a></td>
<td align="center"><a href="../../../../../libs/libraries.htm">Libraries</a></td>
<td align="center"><a href="path_to_url">People</a></td>
<td align="center"><a href="path_to_url">FAQ</a></td>
<td align="center"><a href="../../../../../more/index.htm">More</a></td>
</tr></table>
<hr>
<div class="spirit-nav">
<a accesskey="p" href="../or_.html"><img src="../../../../../doc/src/images/prev.png" alt="Prev"></a><a accesskey="u" href="../or_.html#id-1.3.33.5.34.7.4"><img src="../../../../../doc/src/images/up.png" alt="Up"></a><a accesskey="h" href="../../../index.html"><img src="../../../../../doc/src/images/home.png" alt="Home"></a><a accesskey="n" href="../and_.html"><img src="../../../../../doc/src/images/next.png" alt="Next"></a>
</div>
<div class="refentry">
<a name="boost.proto.or_.impl"></a><div class="titlepage"></div>
<div class="refnamediv">
<h2><span class="refentrytitle">Struct template impl</span></h2>
<p>boost::proto::or_::impl</p>
</div>
<h2 xmlns:rev="path_to_url~gregod/boost/tools/doc/revision" class="refsynopsisdiv-title">Synopsis</h2>
<div xmlns:rev="path_to_url~gregod/boost/tools/doc/revision" class="refsynopsisdiv"><pre class="synopsis"><span class="comment">// In header: <<a class="link" href="../../../proto/reference.html#header.boost.proto.matches_hpp" title="Header <boost/proto/matches.hpp>">boost/proto/matches.hpp</a>>
</span>
<span class="keyword">template</span><span class="special"><</span><span class="keyword">typename</span> <a class="link" href="../../../Expr.html" title="Concept Expr">Expr</a><span class="special">,</span> <span class="keyword">typename</span> State<span class="special">,</span> <span class="keyword">typename</span> Data<span class="special">></span>
<span class="keyword">struct</span> <a class="link" href="impl.html" title="Struct template impl">impl</a> <span class="special">:</span> <span class="keyword"></span> <a class="link" href="../transform_impl.html" title="Struct template transform_impl">proto::transform_impl</a>< Expr, State, Data > <span class="special">{</span>
<span class="comment">// types</span>
<span class="keyword">typedef</span> <em class="replaceable"><code><span class="identifier">unspecified</span></code></em> <a name="boost.proto.or_.impl.result_type"></a><span class="identifier">result_type</span><span class="special">;</span>
<span class="comment">// <a class="link" href="impl.html#id-1_3_33_5_32_2_1_4_5_4-bb">public member functions</a></span>
<span class="identifier">result_type</span> <a class="link" href="impl.html#id-1_3_33_5_32_2_1_4_5_4_1-bb"><span class="keyword">operator</span><span class="special">(</span><span class="special">)</span></a><span class="special">(</span><span class="keyword">typename</span> <span class="identifier">impl</span><span class="special">::</span><span class="identifier">expr_param</span><span class="special">,</span>
<span class="keyword">typename</span> <span class="identifier">impl</span><span class="special">::</span><span class="identifier">state_param</span><span class="special">,</span>
<span class="keyword">typename</span> <span class="identifier">impl</span><span class="special">::</span><span class="identifier">data_param</span><span class="special">)</span> <span class="keyword">const</span><span class="special">;</span>
<span class="special">}</span><span class="special">;</span></pre></div>
<div class="refsect1">
<a name="id-1.3.33.5.34.7.4.5.4"></a><h2>Description</h2>
<div class="refsect2">
<a name="id-1.3.33.5.34.7.4.5.4.2"></a><h3>
<a name="id-1_3_33_5_32_2_1_4_5_4-bb"></a><code class="computeroutput">impl</code> public member functions</h3>
<div class="orderedlist"><ol class="orderedlist" type="1"><li class="listitem">
<pre class="literallayout"><span class="identifier">result_type</span> <a name="id-1_3_33_5_32_2_1_4_5_4_1-bb"></a><span class="keyword">operator</span><span class="special">(</span><span class="special">)</span><span class="special">(</span><span class="keyword">typename</span> <span class="identifier">impl</span><span class="special">::</span><span class="identifier">expr_param</span> expr<span class="special">,</span>
<span class="keyword">typename</span> <span class="identifier">impl</span><span class="special">::</span><span class="identifier">state_param</span> state<span class="special">,</span>
<span class="keyword">typename</span> <span class="identifier">impl</span><span class="special">::</span><span class="identifier">data_param</span> data<span class="special">)</span> <span class="keyword">const</span><span class="special">;</span></pre>
<div class="variablelist"><table border="0" class="variablelist compact">
<colgroup>
<col align="left" valign="top">
<col>
</colgroup>
<tbody>
<tr>
<td><p><span class="term">Parameters:</span></p></td>
<td><div class="variablelist"><table border="0" class="variablelist compact">
<colgroup>
<col align="left" valign="top">
<col>
</colgroup>
<tbody>
<tr>
<td><p><span class="term"><code class="computeroutput">data</code></span></p></td>
<td><p>A data of arbitrary type </p></td>
</tr>
<tr>
<td><p><span class="term"><code class="computeroutput">expr</code></span></p></td>
<td><p>An expression </p></td>
</tr>
<tr>
<td><p><span class="term"><code class="computeroutput">state</code></span></p></td>
<td><p>The current state </p></td>
</tr>
</tbody>
</table></div></td>
</tr>
<tr>
<td><p><span class="term">Returns:</span></p></td>
<td><p>
<code class="computeroutput">
G<sub>x</sub>()(expr, state, data)
</code>, where
<code class="computeroutput">x</code> is the lowest number such that
<code class="computeroutput">
<a class="link" href="../matches.html" title="Struct template matches">proto::matches</a><Expr, G<sub>x</sub>>::value
</code>
is <code class="computeroutput">true</code>.
</p></td>
</tr>
</tbody>
</table></div>
</li></ol></div>
</div>
</div>
</div>
<table xmlns:rev="path_to_url~gregod/boost/tools/doc/revision" width="100%"><tr>
<td align="left"></td>
<td align="right"><div class="copyright-footer"><p>Distributed under the Boost Software License, Version 1.0. (See accompanying
        file LICENSE_1_0.txt or copy at <a href="path_to_url" target="_top">path_to_url</a>)
</p>
</div></td>
</tr></table>
<hr>
<div class="spirit-nav">
<a accesskey="p" href="../or_.html"><img src="../../../../../doc/src/images/prev.png" alt="Prev"></a><a accesskey="u" href="../or_.html#id-1.3.33.5.34.7.4"><img src="../../../../../doc/src/images/up.png" alt="Up"></a><a accesskey="h" href="../../../index.html"><img src="../../../../../doc/src/images/home.png" alt="Home"></a><a accesskey="n" href="../and_.html"><img src="../../../../../doc/src/images/next.png" alt="Next"></a>
</div>
</body>
</html>
```
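The "Returns" clause above says that `proto::or_` applies the transform of the first alternate grammar G<sub>x</sub> that matches the expression. That first-match dispatch can be illustrated with a small language-neutral sketch. This is Python pseudocode with invented names, not the Boost API; in Proto the selection happens at compile time via template metaprogramming rather than a runtime loop.

```python
# First-match dispatch in the spirit of proto::or_<G0, G1, ...>:
# walk the alternates in order and apply the transform of the first
# "grammar" whose predicate matches. All names here are illustrative.

def make_or(*alternates):
    """alternates: (matches, transform) pairs, tried in order."""
    def apply(expr, state, data):
        for matches, transform in alternates:
            if matches(expr):
                return transform(expr, state, data)
        raise TypeError("no alternate matches expression")
    return apply

# toy "grammars": match by the expression's type
double_ints = (lambda e: isinstance(e, int), lambda e, s, d: e * 2)
upper_strs  = (lambda e: isinstance(e, str), lambda e, s, d: e.upper())

or_ = make_or(double_ints, upper_strs)
print(or_(21, None, None))    # -> 42
print(or_("ok", None, None))  # -> OK
```

As in Proto, only the first matching alternate's transform runs; later alternates are never consulted.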
|
Dwayne Washington (born April 24, 1994) is an American football running back for the Denver Broncos of the National Football League (NFL). He played college football at Washington. He was drafted by the Detroit Lions in the seventh round of the 2016 NFL Draft.
Early years
Washington attended and played high school football at Gahr High School in Cerritos, California.
College career
Washington played for the University of Washington from 2012 to 2015. Washington redshirted in his first year with the program. In the 2013 season, he had 47 rushing attempts for 332 yards and four touchdowns. In addition, he made one five-yard touchdown reception. In the 2014 season, he had 132 rushing attempts for 697 yards and nine touchdowns. In addition, he had 15 receptions for 91 yards. He played in nine games for the Washington Huskies during the 2015 season, rushing for 282 yards on 47 carries and catching 25 passes for another 315 yards and three touchdowns. He missed the final four games of the season, including the team's bowl game, due to a knee injury.
Collegiate statistics
Professional career
Detroit Lions
Washington was drafted in the seventh round with the 236th overall pick by the Detroit Lions in the 2016 NFL Draft. Overall, in his rookie season, he finished with 90 carries for 265 rushing yards and a rushing touchdown. In the 2017 season, he recorded 20 carries for 44 yards and had a larger role on special teams.
On September 1, 2018, Washington was waived by the Lions.
New Orleans Saints
On September 2, 2018, Washington was signed to the New Orleans Saints' practice squad. He was promoted to the active roster on September 28, 2018. He recorded 27 carries for 154 rushing yards in 13 games in the 2018 season, a majority of which came against the Carolina Panthers in Week 17. In addition, he was a contributor on special teams. In the 2019 season, he recorded eight carries for 60 rushing yards to go along with a role on special teams.
Washington re-signed with the Saints on April 16, 2020. He was placed on the reserve/COVID-19 list by the team on August 30, 2020. He was activated on September 17, 2020. He was placed back on the COVID-19 list on January 2, 2021, and activated on January 6. In the 2020 season, he appeared in 11 games and mainly played on special teams.
On March 10, 2021, Washington signed a one-year contract extension with the Saints. He appeared in 14 games in the 2021 season and mainly had a special teams role.
On April 18, 2022, Washington signed another one-year contract with the Saints. Washington played in 12 games and mainly played in a special teams role in the 2022 season.
Denver Broncos
On August 16, 2023, Washington signed with the Denver Broncos. He was released on August 29, 2023 and re-signed to the practice squad. He was promoted to the active roster on October 4.
References
External links
Denver Broncos bio
Washington Huskies bio
1994 births
Living people
Players of American football from Lakewood, California
American football running backs
Washington Huskies football players
Detroit Lions players
New Orleans Saints players
Denver Broncos players
Gahr High School alumni
|
```go
package git
import (
"bytes"
"fmt"
"path/filepath"
)
// Status represents the current status of a Worktree.
// The key of the map is the path of the file.
type Status map[string]*FileStatus
// File returns the FileStatus for a given path. If the FileStatus doesn't
// exist, a new one is added to the map using the path as the key.
func (s Status) File(path string) *FileStatus {
if _, ok := (s)[path]; !ok {
s[path] = &FileStatus{Worktree: Untracked, Staging: Untracked}
}
return s[path]
}
// IsUntracked checks whether the file at the given path is 'Untracked'.
func (s Status) IsUntracked(path string) bool {
stat, ok := (s)[filepath.ToSlash(path)]
return ok && stat.Worktree == Untracked
}
// IsClean returns true if all the files are in Unmodified status.
func (s Status) IsClean() bool {
for _, status := range s {
if status.Worktree != Unmodified || status.Staging != Unmodified {
return false
}
}
return true
}
func (s Status) String() string {
buf := bytes.NewBuffer(nil)
for path, status := range s {
if status.Staging == Unmodified && status.Worktree == Unmodified {
continue
}
if status.Staging == Renamed {
path = fmt.Sprintf("%s -> %s", path, status.Extra)
}
fmt.Fprintf(buf, "%c%c %s\n", status.Staging, status.Worktree, path)
}
return buf.String()
}
// FileStatus contains the status of a file in the worktree
type FileStatus struct {
// Staging is the status of a file in the staging area
Staging StatusCode
// Worktree is the status of a file in the worktree
Worktree StatusCode
// Extra contains extra information, such as the previous name in a rename
Extra string
}
// StatusCode status code of a file in the Worktree
type StatusCode byte
const (
Unmodified StatusCode = ' '
Untracked StatusCode = '?'
Modified StatusCode = 'M'
Added StatusCode = 'A'
Deleted StatusCode = 'D'
Renamed StatusCode = 'R'
Copied StatusCode = 'C'
UpdatedButUnmerged StatusCode = 'U'
)
```
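The `String` method above renders the familiar two-letter short format: staging code, worktree code, then the path, skipping fully unmodified entries and rewriting renames as `old -> new`. A quick mimic of that rendering logic in Python (illustrative only; the real type is the Go `Status` map above):

```python
# Mimic of Status.String(): one line per changed file, formatted as
# "<staging><worktree> <path>". Clean entries are omitted; renames are
# shown as "old -> new". This is a port of the Go logic for illustration.

UNMODIFIED, UNTRACKED, MODIFIED, ADDED, RENAMED = ' ', '?', 'M', 'A', 'R'

def render(status):
    """status: {path: (staging, worktree, extra)}"""
    lines = []
    for path, (staging, worktree, extra) in status.items():
        if staging == UNMODIFIED and worktree == UNMODIFIED:
            continue  # fully clean file: nothing to print
        if staging == RENAMED:
            path = f"{path} -> {extra}"
        lines.append(f"{staging}{worktree} {path}")
    return "\n".join(lines)

demo = {
    "a.txt":   (ADDED, UNMODIFIED, ""),
    "b.txt":   (UNMODIFIED, MODIFIED, ""),
    "c.txt":   (UNMODIFIED, UNMODIFIED, ""),   # clean: omitted
    "old.txt": (RENAMED, UNMODIFIED, "new.txt"),
}
print(render(demo))
```

The output matches the shape of `git status --porcelain`, which is exactly what the Go `fmt.Fprintf(buf, "%c%c %s\n", ...)` call produces.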
|
Joseph Warren Phinney (born 1848, Nantucket – d. 1934) was an American printer, type designer, and business executive. Phinney began his career at the Dickinson Type Foundry in Boston where he designed type and worked in management, eventually becoming owner. He was a key player in arranging the merger of twenty-six large foundries to form the American Type Founders Company in 1892, becoming both manager of the Boston branch and head of the design department, where he oversaw the consolidation of type faces following the merger. Though his own designs were largely derivative, Phinney took a great interest in type and its history and throughout his tenure at A.T.F. he sought to preserve and protect that company's legacy, as for instance, when he oversaw the re-introduction of Binny & Ronaldson's 1796 type design, Roman No. 1, as Oxford in 1892, or when he purchased Frederick W. Goudy's first type design, Camelot, in 1896. He stayed with A.T.F. for the rest of his career, passing the role of design head to Morris Fuller Benton and becoming senior vice-president. Phinney retired shortly before the company fell upon hard times during the Great Depression and died in 1934.
Typefaces designed by Joseph W. Phinney
In addition to many faces cut for the Dickinson Type Foundry, Phinney also cut these faces cast by American Type Founders.
Jenson Series
Jenson Oldstyle + italic (1893), based on William Morris's "Golden Type", matrices cut by John F. Cumming from drawings by Phinney.
Jenson Heavyface (1899)
Jenson Condensed + Bold Condensed (1901)
Bradley Text (1895), developed from Will H. Bradley's lettering on the Christmas cover of Inland Printer Magazine by either Phinney or (more probably) Herman Ihlenberg.
Satanick (1896), based on William Morris's Troy and Chaucer, matrices cut by John F. Cumming from drawings by Phinney.
Taylor Gothic (1897), capitals only, lower-case based on Central Type Foundry of St. Louis' Quentell. Later re-worked by either Morris Fuller Benton or Goudy as Globe Gothic.
Devens Script (1898)
Touraine Oldstyle Italic (1898)
Lower-case letters for Goudy's Camelot (1900)
Abbott Oldstyle (1901)
Engravers Old English (1901), usually credited to Benton.
Flemish Black (1902) An adaptation of Priory Text, an 1870s version of William Caslon’s Caslon Text of 1734.
Cheltenham Oldstyle + italic (1902), developed from Bertram Goodhue's architectural lettering by either Phinney or Benton.
Cloister Black (1904), lower-case identical to Flemish Black, capitals usually credited to Benton.
References
MacGrew, Mac, American Metal Typefaces of the Twentieth Century, Oak Knoll Books, New Castle Delaware, 1993, .
American typographers and type designers
1848 births
1934 deaths
American printers
People from Nantucket, Massachusetts
|
```c
/*
/ _____) _ | |
( (____ _____ ____ _| |_ _____ ____| |__
\____ \| ___ | (_ _) ___ |/ ___) _ \
_____) ) ____| | | || |_| ____( (___| | | |
(______/|_____)_|_|_| \__)_____)\____)_| |_|
(C)2013 Semtech
Description: SX1276 LoRa modem registers
Maintainer: Michael Coracin
*/
#ifndef _LORAGW_SX1276_REGS_LORA_H
#define _LORAGW_SX1276_REGS_LORA_H
/*!
* ============================================================================
* SX1276 Internal registers Address
* ============================================================================
*/
#define SX1276_REG_LR_FIFO 0x00
// Common settings
#define SX1276_REG_LR_OPMODE 0x01
#define SX1276_REG_LR_FRFMSB 0x06
#define SX1276_REG_LR_FRFMID 0x07
#define SX1276_REG_LR_FRFLSB 0x08
// Tx settings
#define SX1276_REG_LR_PACONFIG 0x09
#define SX1276_REG_LR_PARAMP 0x0A
#define SX1276_REG_LR_OCP 0x0B
// Rx settings
#define SX1276_REG_LR_LNA 0x0C
// LoRa registers
#define SX1276_REG_LR_FIFOADDRPTR 0x0D
#define SX1276_REG_LR_FIFOTXBASEADDR 0x0E
#define SX1276_REG_LR_FIFORXBASEADDR 0x0F
#define SX1276_REG_LR_FIFORXCURRENTADDR 0x10
#define SX1276_REG_LR_IRQFLAGSMASK 0x11
#define SX1276_REG_LR_IRQFLAGS 0x12
#define SX1276_REG_LR_RXNBBYTES 0x13
#define SX1276_REG_LR_RXHEADERCNTVALUEMSB 0x14
#define SX1276_REG_LR_RXHEADERCNTVALUELSB 0x15
#define SX1276_REG_LR_RXPACKETCNTVALUEMSB 0x16
#define SX1276_REG_LR_RXPACKETCNTVALUELSB 0x17
#define SX1276_REG_LR_MODEMSTAT 0x18
#define SX1276_REG_LR_PKTSNRVALUE 0x19
#define SX1276_REG_LR_PKTRSSIVALUE 0x1A
#define SX1276_REG_LR_RSSIVALUE 0x1B
#define SX1276_REG_LR_HOPCHANNEL 0x1C
#define SX1276_REG_LR_MODEMCONFIG1 0x1D
#define SX1276_REG_LR_MODEMCONFIG2 0x1E
#define SX1276_REG_LR_SYMBTIMEOUTLSB 0x1F
#define SX1276_REG_LR_PREAMBLEMSB 0x20
#define SX1276_REG_LR_PREAMBLELSB 0x21
#define SX1276_REG_LR_PAYLOADLENGTH 0x22
#define SX1276_REG_LR_PAYLOADMAXLENGTH 0x23
#define SX1276_REG_LR_HOPPERIOD 0x24
#define SX1276_REG_LR_FIFORXBYTEADDR 0x25
#define SX1276_REG_LR_MODEMCONFIG3 0x26
#define SX1276_REG_LR_FEIMSB 0x28
#define SX1276_REG_LR_FEIMID 0x29
#define SX1276_REG_LR_FEILSB 0x2A
#define SX1276_REG_LR_RSSIWIDEBAND 0x2C
#define SX1276_REG_LR_TEST2F 0x2F
#define SX1276_REG_LR_TEST30 0x30
#define SX1276_REG_LR_DETECTOPTIMIZE 0x31
#define SX1276_REG_LR_INVERTIQ 0x33
#define SX1276_REG_LR_TEST36 0x36
#define SX1276_REG_LR_DETECTIONTHRESHOLD 0x37
#define SX1276_REG_LR_SYNCWORD 0x39
#define SX1276_REG_LR_TEST3A 0x3A
#define SX1276_REG_LR_INVERTIQ2 0x3B
// end of documented register in datasheet
// I/O settings
#define SX1276_REG_LR_DIOMAPPING1 0x40
#define SX1276_REG_LR_DIOMAPPING2 0x41
// Version
#define SX1276_REG_LR_VERSION 0x42
// Additional settings
#define SX1276_REG_LR_PLLHOP 0x44
#define SX1276_REG_LR_TCXO 0x4B
#define SX1276_REG_LR_PADAC 0x4D
#define SX1276_REG_LR_FORMERTEMP 0x5B
#define SX1276_REG_LR_BITRATEFRAC 0x5D
#define SX1276_REG_LR_AGCREF 0x61
#define SX1276_REG_LR_AGCTHRESH1 0x62
#define SX1276_REG_LR_AGCTHRESH2 0x63
#define SX1276_REG_LR_AGCTHRESH3 0x64
#define SX1276_REG_LR_PLL 0x70
#endif // _LORAGW_SX1276_REGS_LORA_H
```
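The three FRF registers (0x06–0x08) hold the 24-bit carrier-frequency word. Per the SX1276 datasheet, the step size is Fstep = Fxosc / 2^19 with Fxosc = 32 MHz, so the register word is FRF = f_RF / Fstep. A quick sketch of splitting a target frequency into the MSB/MID/LSB bytes:

```python
# Compute the SX1276 FRF register bytes for a target carrier frequency.
# Datasheet formula: f_RF = FRF * Fstep, where Fstep = Fxosc / 2^19.

FXOSC = 32_000_000          # crystal frequency in Hz
FSTEP = FXOSC / (1 << 19)   # ~61.035 Hz per LSB

def frf_bytes(freq_hz: float):
    """Return (MSB, MID, LSB) for SX1276_REG_LR_FRF{MSB,MID,LSB}."""
    frf = int(freq_hz / FSTEP)
    return (frf >> 16) & 0xFF, (frf >> 8) & 0xFF, frf & 0xFF

# 868 MHz divides evenly: FRF = 868e6 * 2^19 / 32e6 = 0xD90000
print([hex(b) for b in frf_bytes(868_000_000)])  # ['0xd9', '0x0', '0x0']
```

A driver would then write the three bytes to `SX1276_REG_LR_FRFMSB`, `SX1276_REG_LR_FRFMID` and `SX1276_REG_LR_FRFLSB` in turn.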
|
```go
package controlapi
import (
"context"
"fmt"
"io"
"testing"
"github.com/moby/swarmkit/v2/api"
cautils "github.com/moby/swarmkit/v2/ca/testutils"
"github.com/moby/swarmkit/v2/identity"
"github.com/moby/swarmkit/v2/log"
raftutils "github.com/moby/swarmkit/v2/manager/state/raft/testutils"
"github.com/moby/swarmkit/v2/manager/state/store"
"github.com/moby/swarmkit/v2/testutils"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"google.golang.org/grpc/codes"
"google.golang.org/grpc/grpclog"
)
func createNode(t *testing.T, ts *testServer, id string, role api.NodeRole, membership api.NodeSpec_Membership, state api.NodeStatus_State) *api.Node {
node := &api.Node{
ID: id,
Spec: api.NodeSpec{
Membership: membership,
},
Status: api.NodeStatus{
State: state,
},
Role: role,
}
err := ts.Store.Update(func(tx store.Tx) error {
return store.CreateNode(tx, node)
})
assert.NoError(t, err)
return node
}
func TestGetNode(t *testing.T) {
ts := newTestServer(t)
defer ts.Stop()
_, err := ts.Client.GetNode(context.Background(), &api.GetNodeRequest{})
assert.Error(t, err)
assert.Equal(t, codes.InvalidArgument, testutils.ErrorCode(err))
_, err = ts.Client.GetNode(context.Background(), &api.GetNodeRequest{NodeID: "invalid"})
assert.Error(t, err)
assert.Equal(t, codes.NotFound, testutils.ErrorCode(err))
node := createNode(t, ts, "id", api.NodeRoleManager, api.NodeMembershipAccepted, api.NodeStatus_READY)
r, err := ts.Client.GetNode(context.Background(), &api.GetNodeRequest{NodeID: node.ID})
assert.NoError(t, err)
assert.Equal(t, node.ID, r.Node.ID)
}
func TestListNodes(t *testing.T) {
ts := newTestServer(t)
defer ts.Stop()
r, err := ts.Client.ListNodes(context.Background(), &api.ListNodesRequest{})
assert.NoError(t, err)
assert.Empty(t, r.Nodes)
createNode(t, ts, "id1", api.NodeRoleManager, api.NodeMembershipAccepted, api.NodeStatus_READY)
r, err = ts.Client.ListNodes(context.Background(), &api.ListNodesRequest{})
assert.NoError(t, err)
assert.Equal(t, 1, len(r.Nodes))
createNode(t, ts, "id2", api.NodeRoleWorker, api.NodeMembershipAccepted, api.NodeStatus_READY)
createNode(t, ts, "id3", api.NodeRoleWorker, api.NodeMembershipPending, api.NodeStatus_READY)
r, err = ts.Client.ListNodes(context.Background(), &api.ListNodesRequest{})
assert.NoError(t, err)
assert.Equal(t, 3, len(r.Nodes))
// List by role.
r, err = ts.Client.ListNodes(context.Background(),
&api.ListNodesRequest{
Filters: &api.ListNodesRequest_Filters{
Roles: []api.NodeRole{api.NodeRoleManager},
},
},
)
assert.NoError(t, err)
assert.Equal(t, 1, len(r.Nodes))
r, err = ts.Client.ListNodes(context.Background(),
&api.ListNodesRequest{
Filters: &api.ListNodesRequest_Filters{
Roles: []api.NodeRole{api.NodeRoleWorker},
},
},
)
assert.NoError(t, err)
assert.Equal(t, 2, len(r.Nodes))
r, err = ts.Client.ListNodes(context.Background(),
&api.ListNodesRequest{
Filters: &api.ListNodesRequest_Filters{
Roles: []api.NodeRole{api.NodeRoleManager, api.NodeRoleWorker},
},
},
)
assert.NoError(t, err)
assert.Equal(t, 3, len(r.Nodes))
// List by membership.
r, err = ts.Client.ListNodes(context.Background(),
&api.ListNodesRequest{
Filters: &api.ListNodesRequest_Filters{
Memberships: []api.NodeSpec_Membership{api.NodeMembershipAccepted},
},
},
)
assert.NoError(t, err)
assert.Equal(t, 2, len(r.Nodes))
r, err = ts.Client.ListNodes(context.Background(),
&api.ListNodesRequest{
Filters: &api.ListNodesRequest_Filters{
Memberships: []api.NodeSpec_Membership{api.NodeMembershipPending},
},
},
)
assert.NoError(t, err)
assert.Equal(t, 1, len(r.Nodes))
r, err = ts.Client.ListNodes(context.Background(),
&api.ListNodesRequest{
Filters: &api.ListNodesRequest_Filters{
Memberships: []api.NodeSpec_Membership{api.NodeMembershipAccepted, api.NodeMembershipPending},
},
},
)
assert.NoError(t, err)
assert.Equal(t, 3, len(r.Nodes))
r, err = ts.Client.ListNodes(context.Background(),
&api.ListNodesRequest{
Filters: &api.ListNodesRequest_Filters{
Roles: []api.NodeRole{api.NodeRoleWorker},
Memberships: []api.NodeSpec_Membership{api.NodeMembershipPending},
},
},
)
assert.NoError(t, err)
assert.Equal(t, 1, len(r.Nodes))
}
func TestListNodesWithLabelFilter(t *testing.T) {
ts := newTestServer(t)
defer ts.Stop()
	// satisfy these test cases:
// Filtering on engine labels
// - returns all nodes with matching engine labels
// - does not return nodes with matching node labels
// - does not return nodes with non-matching engine labels
// Filtering on nodes:
// - returns all nodes with matching node labels
// - does not return nodes with matching engine labels
// - does not return nodes with non-matching node labels
// we'll need 3 nodes for this test.
nodes := make([]*api.Node, 3)
nodes[0] = &api.Node{
ID: "node0",
Spec: api.NodeSpec{
Annotations: api.Annotations{
Labels: map[string]string{
"allcommon": "node",
"nodelabel1": "shouldmatch",
"nodelabel2": "unique1",
},
},
},
Description: &api.NodeDescription{
Engine: &api.EngineDescription{
Labels: map[string]string{
"allcommon": "engine",
"enginelabel1": "shouldmatch",
"enginelabel2": "unique1",
},
},
},
}
nodes[1] = &api.Node{
ID: "node1",
Spec: api.NodeSpec{
Annotations: api.Annotations{
Labels: map[string]string{
"allcommon": "node",
"nodelabel1": "shouldmatch",
"nodelabel2": "unique2",
},
},
},
Description: &api.NodeDescription{
Engine: &api.EngineDescription{
Labels: map[string]string{
"allcommon": "engine",
"enginelabel1": "shouldmatch",
"enginelabel2": "unique2",
},
},
},
}
nodes[2] = &api.Node{
ID: "node2",
Spec: api.NodeSpec{
Annotations: api.Annotations{
Labels: map[string]string{
"allcommon": "node",
"nodelabel1": "shouldnevermatch",
"nodelabel2": "unique1",
},
},
},
Description: &api.NodeDescription{
Engine: &api.EngineDescription{
Labels: map[string]string{
"allcommon": "engine",
"enginelabel1": "shouldnevermatch",
"enginelabel2": "unique1",
},
},
},
}
	// createNode gives us a bunch of fields we don't care about, so do a
	// store update directly instead
err := ts.Store.Update(func(tx store.Tx) error {
for _, node := range nodes {
if err := store.CreateNode(tx, node); err != nil {
return err
}
}
return nil
})
require.NoError(t, err, "error creating nodes")
// now try listing nodes
// listing with an empty set of labels should return all nodes
t.Log("list nodes with no filters")
r, err := ts.Client.ListNodes(context.Background(), &api.ListNodesRequest{
Filters: &api.ListNodesRequest_Filters{},
})
assert.NoError(t, err)
assert.Len(t, r.Nodes, 3)
t.Log("list nodes with allcommon=engine engine label filter")
r, err = ts.Client.ListNodes(context.Background(), &api.ListNodesRequest{
Filters: &api.ListNodesRequest_Filters{
Labels: map[string]string{"allcommon": "engine"},
},
})
assert.NoError(t, err)
assert.Len(t, r.Nodes, 3)
t.Log("list nodes with allcommon=node engine label filter")
r, err = ts.Client.ListNodes(context.Background(), &api.ListNodesRequest{
Filters: &api.ListNodesRequest_Filters{
Labels: map[string]string{"allcommon": "node"},
},
})
	// nothing should be returned; the Labels filter matches engine labels,
	// which all have allcommon=engine, not allcommon=node
assert.NoError(t, err)
assert.Len(t, r.Nodes, 0)
t.Log("list nodes with allcommon=node node filter")
r, err = ts.Client.ListNodes(context.Background(), &api.ListNodesRequest{
Filters: &api.ListNodesRequest_Filters{
NodeLabels: map[string]string{"allcommon": "node"},
},
})
assert.NoError(t, err)
assert.Len(t, r.Nodes, 3)
t.Log("list nodes with allcommon=engine node filter")
r, err = ts.Client.ListNodes(context.Background(), &api.ListNodesRequest{
Filters: &api.ListNodesRequest_Filters{
NodeLabels: map[string]string{"allcommon": "engine"},
},
})
assert.NoError(t, err)
assert.Len(t, r.Nodes, 0)
t.Log("list nodes with nodelabel1=shouldmatch node filter")
r, err = ts.Client.ListNodes(context.Background(), &api.ListNodesRequest{
Filters: &api.ListNodesRequest_Filters{
NodeLabels: map[string]string{"nodelabel1": "shouldmatch"},
},
})
// should only return the first 2 nodes
assert.NoError(t, err)
assert.Len(t, r.Nodes, 2)
assert.Contains(t, r.Nodes, nodes[0])
assert.Contains(t, r.Nodes, nodes[1])
t.Log("list nodes with enginelabel1=shouldmatch engine filter")
r, err = ts.Client.ListNodes(context.Background(), &api.ListNodesRequest{
Filters: &api.ListNodesRequest_Filters{
Labels: map[string]string{"enginelabel1": "shouldmatch"},
},
})
// should only return the first 2 nodes
assert.NoError(t, err)
assert.Len(t, r.Nodes, 2)
assert.Contains(t, r.Nodes, nodes[0])
assert.Contains(t, r.Nodes, nodes[1])
	t.Log("list nodes with two engine label filters")
r, err = ts.Client.ListNodes(context.Background(), &api.ListNodesRequest{
Filters: &api.ListNodesRequest_Filters{
Labels: map[string]string{
"enginelabel1": "shouldmatch",
"enginelabel2": "unique1",
},
},
})
// should only return the first node
assert.NoError(t, err)
assert.Len(t, r.Nodes, 1)
assert.Contains(t, r.Nodes, nodes[0])
	t.Log("list nodes with two node label filters")
r, err = ts.Client.ListNodes(context.Background(), &api.ListNodesRequest{
Filters: &api.ListNodesRequest_Filters{
NodeLabels: map[string]string{
"nodelabel1": "shouldmatch",
"nodelabel2": "unique1",
},
},
})
// should only return the first node
assert.NoError(t, err)
assert.Len(t, r.Nodes, 1)
assert.Contains(t, r.Nodes, nodes[0])
t.Log("list nodes with both engine and node filters")
r, err = ts.Client.ListNodes(context.Background(), &api.ListNodesRequest{
Filters: &api.ListNodesRequest_Filters{
// all nodes pass this filter
Labels: map[string]string{
"enginelabel1": "",
},
// only 0 and 2 pass this filter
NodeLabels: map[string]string{
"nodelabel2": "unique1",
},
},
})
assert.NoError(t, err)
assert.Len(t, r.Nodes, 2)
assert.Contains(t, r.Nodes, nodes[0])
assert.Contains(t, r.Nodes, nodes[2])
}
func TestRemoveNodes(t *testing.T) {
ts := newTestServer(t)
defer ts.Stop()
ts.Store.Update(func(tx store.Tx) error {
store.CreateCluster(tx, &api.Cluster{
ID: identity.NewID(),
Spec: api.ClusterSpec{
Annotations: api.Annotations{
Name: store.DefaultClusterName,
},
},
})
return nil
})
r, err := ts.Client.ListNodes(context.Background(), &api.ListNodesRequest{})
assert.NoError(t, err)
assert.Empty(t, r.Nodes)
createNode(t, ts, "id1", api.NodeRoleManager, api.NodeMembershipAccepted, api.NodeStatus_READY)
r, err = ts.Client.ListNodes(context.Background(), &api.ListNodesRequest{})
assert.NoError(t, err)
assert.Len(t, r.Nodes, 1)
createNode(t, ts, "id2", api.NodeRoleWorker, api.NodeMembershipAccepted, api.NodeStatus_READY)
createNode(t, ts, "id3", api.NodeRoleWorker, api.NodeMembershipPending, api.NodeStatus_UNKNOWN)
r, err = ts.Client.ListNodes(context.Background(), &api.ListNodesRequest{})
assert.NoError(t, err)
assert.Len(t, r.Nodes, 3)
// Attempt to remove a ready node without force
_, err = ts.Client.RemoveNode(context.Background(),
&api.RemoveNodeRequest{
NodeID: "id2",
Force: false,
},
)
assert.Error(t, err)
r, err = ts.Client.ListNodes(context.Background(),
&api.ListNodesRequest{
Filters: &api.ListNodesRequest_Filters{
Roles: []api.NodeRole{api.NodeRoleManager, api.NodeRoleWorker},
},
},
)
assert.NoError(t, err)
assert.Len(t, r.Nodes, 3)
// Attempt to remove a ready node with force
_, err = ts.Client.RemoveNode(context.Background(),
&api.RemoveNodeRequest{
NodeID: "id2",
Force: true,
},
)
assert.NoError(t, err)
r, err = ts.Client.ListNodes(context.Background(),
&api.ListNodesRequest{
Filters: &api.ListNodesRequest_Filters{
Roles: []api.NodeRole{api.NodeRoleManager, api.NodeRoleWorker},
},
},
)
assert.NoError(t, err)
assert.Len(t, r.Nodes, 2)
clusterResp, err := ts.Client.ListClusters(context.Background(), &api.ListClustersRequest{})
assert.NoError(t, err)
require.Len(t, clusterResp.Clusters, 1)
require.Len(t, clusterResp.Clusters[0].BlacklistedCertificates, 1)
_, ok := clusterResp.Clusters[0].BlacklistedCertificates["id2"]
assert.True(t, ok)
// Attempt to remove a non-ready node without force
_, err = ts.Client.RemoveNode(context.Background(),
&api.RemoveNodeRequest{
NodeID: "id3",
Force: false,
},
)
assert.NoError(t, err)
r, err = ts.Client.ListNodes(context.Background(),
&api.ListNodesRequest{
Filters: &api.ListNodesRequest_Filters{
Roles: []api.NodeRole{api.NodeRoleManager, api.NodeRoleWorker},
},
},
)
assert.NoError(t, err)
assert.Len(t, r.Nodes, 1)
}
func init() {
grpclog.SetLoggerV2(grpclog.NewLoggerV2(io.Discard, io.Discard, io.Discard))
log.L.Logger.SetOutput(io.Discard)
}
func getMap(t *testing.T, nodes []*api.Node) map[uint64]*api.ManagerStatus {
m := make(map[uint64]*api.ManagerStatus)
for _, n := range nodes {
if n.ManagerStatus != nil {
m[n.ManagerStatus.RaftID] = n.ManagerStatus
}
}
return m
}
func TestListManagerNodes(t *testing.T) {
t.Parallel()
tc := cautils.NewTestCA(t)
defer tc.Stop()
ts := newTestServer(t)
defer ts.Stop()
nodes, clockSource := raftutils.NewRaftCluster(t, tc)
defer raftutils.TeardownCluster(nodes)
// Create a node object for each of the managers
assert.NoError(t, nodes[1].MemoryStore().Update(func(tx store.Tx) error {
assert.NoError(t, store.CreateNode(tx, &api.Node{ID: nodes[1].SecurityConfig.ClientTLSCreds.NodeID()}))
assert.NoError(t, store.CreateNode(tx, &api.Node{ID: nodes[2].SecurityConfig.ClientTLSCreds.NodeID()}))
assert.NoError(t, store.CreateNode(tx, &api.Node{ID: nodes[3].SecurityConfig.ClientTLSCreds.NodeID()}))
return nil
}))
	// Assign one of the raft nodes to the test server
ts.Server.raft = nodes[1].Node
ts.Server.store = nodes[1].MemoryStore()
// There should be 3 reachable managers listed
r, err := ts.Client.ListNodes(context.Background(), &api.ListNodesRequest{})
assert.NoError(t, err)
assert.NotNil(t, r)
managers := getMap(t, r.Nodes)
assert.Len(t, ts.Server.raft.GetMemberlist(), 3)
assert.Len(t, r.Nodes, 3)
// Node 1 should be the leader
for i := 1; i <= 3; i++ {
if i == 1 {
assert.True(t, managers[nodes[uint64(i)].Config.ID].Leader)
continue
}
assert.False(t, managers[nodes[uint64(i)].Config.ID].Leader)
}
// All nodes should be reachable
for i := 1; i <= 3; i++ {
assert.Equal(t, api.RaftMemberStatus_REACHABLE, managers[nodes[uint64(i)].Config.ID].Reachability)
}
// Add two more nodes to the cluster
raftutils.AddRaftNode(t, clockSource, nodes, tc)
raftutils.AddRaftNode(t, clockSource, nodes, tc)
raftutils.WaitForCluster(t, clockSource, nodes)
// Add node entries for these
assert.NoError(t, nodes[1].MemoryStore().Update(func(tx store.Tx) error {
assert.NoError(t, store.CreateNode(tx, &api.Node{ID: nodes[4].SecurityConfig.ClientTLSCreds.NodeID()}))
assert.NoError(t, store.CreateNode(tx, &api.Node{ID: nodes[5].SecurityConfig.ClientTLSCreds.NodeID()}))
return nil
}))
// There should be 5 reachable managers listed
r, err = ts.Client.ListNodes(context.Background(), &api.ListNodesRequest{})
assert.NoError(t, err)
assert.NotNil(t, r)
managers = getMap(t, r.Nodes)
assert.Len(t, ts.Server.raft.GetMemberlist(), 5)
assert.Len(t, r.Nodes, 5)
for i := 1; i <= 5; i++ {
assert.Equal(t, api.RaftMemberStatus_REACHABLE, managers[nodes[uint64(i)].Config.ID].Reachability)
}
	// Stop 2 nodes
nodes[4].Server.Stop()
nodes[4].ShutdownRaft()
nodes[5].Server.Stop()
nodes[5].ShutdownRaft()
// Node 4 and Node 5 should be listed as Unreachable
assert.NoError(t, testutils.PollFunc(clockSource, func() error {
r, err = ts.Client.ListNodes(context.Background(), &api.ListNodesRequest{})
if err != nil {
return err
}
managers = getMap(t, r.Nodes)
if len(r.Nodes) != 5 {
return fmt.Errorf("expected 5 nodes, got %d", len(r.Nodes))
}
if managers[nodes[4].Config.ID].Reachability == api.RaftMemberStatus_REACHABLE {
return fmt.Errorf("expected node 4 to be unreachable")
}
if managers[nodes[5].Config.ID].Reachability == api.RaftMemberStatus_REACHABLE {
return fmt.Errorf("expected node 5 to be unreachable")
}
return nil
}))
// Restart the 2 nodes
nodes[4] = raftutils.RestartNode(t, clockSource, nodes[4], false)
nodes[5] = raftutils.RestartNode(t, clockSource, nodes[5], false)
raftutils.WaitForCluster(t, clockSource, nodes)
assert.Len(t, ts.Server.raft.GetMemberlist(), 5)
// All the nodes should be reachable again
assert.NoError(t, testutils.PollFunc(clockSource, func() error {
r, err = ts.Client.ListNodes(context.Background(), &api.ListNodesRequest{})
if err != nil {
return err
}
managers = getMap(t, r.Nodes)
for i := 1; i <= 5; i++ {
if managers[nodes[uint64(i)].Config.ID].Reachability != api.RaftMemberStatus_REACHABLE {
return fmt.Errorf("node %x is unreachable", nodes[uint64(i)].Config.ID)
}
}
return nil
}))
// Stop node 1 (leader)
nodes[1].Server.Stop()
nodes[1].ShutdownRaft()
newCluster := map[uint64]*raftutils.TestNode{
2: nodes[2],
3: nodes[3],
4: nodes[4],
5: nodes[5],
}
// Wait for the re-election to occur
raftutils.WaitForCluster(t, clockSource, newCluster)
var leaderNode *raftutils.TestNode
for _, node := range newCluster {
if node.IsLeader() {
leaderNode = node
}
}
// Switch the raft node used by the server
ts.Server.raft = leaderNode.Node
// Node 1 should not be the leader anymore
assert.NoError(t, testutils.PollFunc(clockSource, func() error {
r, err = ts.Client.ListNodes(context.Background(), &api.ListNodesRequest{})
if err != nil {
return err
}
managers = getMap(t, r.Nodes)
if managers[nodes[1].Config.ID].Leader {
return fmt.Errorf("expected node 1 not to be the leader")
}
if managers[nodes[1].Config.ID].Reachability == api.RaftMemberStatus_REACHABLE {
return fmt.Errorf("expected node 1 to be unreachable")
}
return nil
}))
// Restart node 1
nodes[1].ShutdownRaft()
nodes[1] = raftutils.RestartNode(t, clockSource, nodes[1], false)
raftutils.WaitForCluster(t, clockSource, nodes)
// Ensure that node 1 is not the leader
assert.False(t, managers[nodes[uint64(1)].Config.ID].Leader)
// Check that another node got the leader status
var leader uint64
leaderCount := 0
for i := 1; i <= 5; i++ {
if managers[nodes[uint64(i)].Config.ID].Leader {
leader = nodes[uint64(i)].Config.ID
leaderCount++
}
}
// There should be only one leader after node 1 recovery and it
// should be different than node 1
assert.Equal(t, 1, leaderCount)
assert.NotEqual(t, leader, nodes[1].Config.ID)
}
func TestUpdateNode(t *testing.T) {
tc := cautils.NewTestCA(t)
defer tc.Stop()
ts := newTestServer(t)
defer ts.Stop()
nodes := make(map[uint64]*raftutils.TestNode)
nodes[1], _ = raftutils.NewInitNode(t, tc, nil)
defer raftutils.TeardownCluster(nodes)
nodeID := nodes[1].SecurityConfig.ClientTLSCreds.NodeID()
	// Assign one of the raft nodes to the test server
ts.Server.raft = nodes[1].Node
ts.Server.store = nodes[1].MemoryStore()
_, err := ts.Client.UpdateNode(context.Background(), &api.UpdateNodeRequest{
NodeID: nodeID,
Spec: &api.NodeSpec{
Availability: api.NodeAvailabilityDrain,
},
NodeVersion: &api.Version{},
})
assert.Error(t, err)
assert.Equal(t, codes.NotFound, testutils.ErrorCode(err))
// Create a node object for the manager
assert.NoError(t, nodes[1].MemoryStore().Update(func(tx store.Tx) error {
assert.NoError(t, store.CreateNode(tx, &api.Node{
ID: nodes[1].SecurityConfig.ClientTLSCreds.NodeID(),
Spec: api.NodeSpec{
Membership: api.NodeMembershipAccepted,
},
Role: api.NodeRoleManager,
}))
return nil
}))
_, err = ts.Client.UpdateNode(context.Background(), &api.UpdateNodeRequest{})
assert.Error(t, err)
assert.Equal(t, codes.InvalidArgument, testutils.ErrorCode(err))
_, err = ts.Client.UpdateNode(context.Background(), &api.UpdateNodeRequest{NodeID: "invalid", Spec: &api.NodeSpec{}, NodeVersion: &api.Version{}})
assert.Error(t, err)
assert.Equal(t, codes.NotFound, testutils.ErrorCode(err))
r, err := ts.Client.GetNode(context.Background(), &api.GetNodeRequest{NodeID: nodeID})
assert.NoError(t, err)
if !assert.NotNil(t, r) {
assert.FailNow(t, "got unexpected nil response from GetNode")
}
assert.NotNil(t, r.Node)
_, err = ts.Client.UpdateNode(context.Background(), &api.UpdateNodeRequest{NodeID: nodeID})
assert.Error(t, err)
assert.Equal(t, codes.InvalidArgument, testutils.ErrorCode(err))
spec := r.Node.Spec.Copy()
spec.Availability = api.NodeAvailabilityDrain
_, err = ts.Client.UpdateNode(context.Background(), &api.UpdateNodeRequest{
NodeID: nodeID,
Spec: spec,
})
assert.Error(t, err)
assert.Equal(t, codes.InvalidArgument, testutils.ErrorCode(err))
_, err = ts.Client.UpdateNode(context.Background(), &api.UpdateNodeRequest{
NodeID: nodeID,
Spec: spec,
NodeVersion: &r.Node.Meta.Version,
})
assert.NoError(t, err)
r, err = ts.Client.GetNode(context.Background(), &api.GetNodeRequest{NodeID: nodeID})
assert.NoError(t, err)
if !assert.NotNil(t, r) {
assert.FailNow(t, "got unexpected nil response from GetNode")
}
assert.NotNil(t, r.Node)
assert.NotNil(t, r.Node.Spec)
assert.Equal(t, api.NodeAvailabilityDrain, r.Node.Spec.Availability)
version := &r.Node.Meta.Version
_, err = ts.Client.UpdateNode(context.Background(), &api.UpdateNodeRequest{NodeID: nodeID, Spec: &r.Node.Spec, NodeVersion: version})
assert.NoError(t, err)
// Perform an update with the "old" version.
_, err = ts.Client.UpdateNode(context.Background(), &api.UpdateNodeRequest{NodeID: nodeID, Spec: &r.Node.Spec, NodeVersion: version})
assert.Error(t, err)
}
func testUpdateNodeDemote(t *testing.T) {
tc := cautils.NewTestCA(t)
defer tc.Stop()
ts := newTestServer(t)
defer ts.Stop()
nodes, clockSource := raftutils.NewRaftCluster(t, tc)
defer raftutils.TeardownCluster(nodes)
	// Assign one of the raft nodes to the test server
ts.Server.raft = nodes[1].Node
ts.Server.store = nodes[1].MemoryStore()
// Create a node object for each of the managers
assert.NoError(t, nodes[1].MemoryStore().Update(func(tx store.Tx) error {
assert.NoError(t, store.CreateNode(tx, &api.Node{
ID: nodes[1].SecurityConfig.ClientTLSCreds.NodeID(),
Spec: api.NodeSpec{
DesiredRole: api.NodeRoleManager,
Membership: api.NodeMembershipAccepted,
},
Role: api.NodeRoleManager,
}))
assert.NoError(t, store.CreateNode(tx, &api.Node{
ID: nodes[2].SecurityConfig.ClientTLSCreds.NodeID(),
Spec: api.NodeSpec{
DesiredRole: api.NodeRoleManager,
Membership: api.NodeMembershipAccepted,
},
Role: api.NodeRoleManager,
}))
assert.NoError(t, store.CreateNode(tx, &api.Node{
ID: nodes[3].SecurityConfig.ClientTLSCreds.NodeID(),
Spec: api.NodeSpec{
DesiredRole: api.NodeRoleManager,
Membership: api.NodeMembershipAccepted,
},
Role: api.NodeRoleManager,
}))
return nil
}))
// Stop Node 3 (1 node out of 3)
nodes[3].Server.Stop()
nodes[3].ShutdownRaft()
// Node 3 should be listed as Unreachable
assert.NoError(t, testutils.PollFunc(clockSource, func() error {
members := nodes[1].GetMemberlist()
if len(members) != 3 {
return fmt.Errorf("expected 3 nodes, got %d", len(members))
}
if members[nodes[3].Config.ID].Status.Reachability == api.RaftMemberStatus_REACHABLE {
return fmt.Errorf("expected node 3 to be unreachable")
}
return nil
}))
// Try to demote Node 2, this should fail because of the quorum safeguard
r, err := ts.Client.GetNode(context.Background(), &api.GetNodeRequest{NodeID: nodes[2].SecurityConfig.ClientTLSCreds.NodeID()})
assert.NoError(t, err)
spec := r.Node.Spec.Copy()
spec.DesiredRole = api.NodeRoleWorker
version := &r.Node.Meta.Version
_, err = ts.Client.UpdateNode(context.Background(), &api.UpdateNodeRequest{
NodeID: nodes[2].SecurityConfig.ClientTLSCreds.NodeID(),
Spec: spec,
NodeVersion: version,
})
assert.Error(t, err)
assert.Equal(t, codes.FailedPrecondition, testutils.ErrorCode(err))
// Restart Node 3
nodes[3] = raftutils.RestartNode(t, clockSource, nodes[3], false)
raftutils.WaitForCluster(t, clockSource, nodes)
// Node 3 should be listed as Reachable
assert.NoError(t, testutils.PollFunc(clockSource, func() error {
members := nodes[1].GetMemberlist()
if len(members) != 3 {
return fmt.Errorf("expected 3 nodes, got %d", len(members))
}
if members[nodes[3].Config.ID].Status.Reachability == api.RaftMemberStatus_UNREACHABLE {
return fmt.Errorf("expected node 3 to be reachable")
}
return nil
}))
raftMember := ts.Server.raft.GetMemberByNodeID(nodes[3].SecurityConfig.ClientTLSCreds.NodeID())
assert.NotNil(t, raftMember)
// Try to demote Node 3, this should succeed
r, err = ts.Client.GetNode(context.Background(), &api.GetNodeRequest{NodeID: nodes[3].SecurityConfig.ClientTLSCreds.NodeID()})
assert.NoError(t, err)
spec = r.Node.Spec.Copy()
spec.DesiredRole = api.NodeRoleWorker
version = &r.Node.Meta.Version
_, err = ts.Client.UpdateNode(context.Background(), &api.UpdateNodeRequest{
NodeID: nodes[3].SecurityConfig.ClientTLSCreds.NodeID(),
Spec: spec,
NodeVersion: version,
})
assert.NoError(t, err)
newCluster := map[uint64]*raftutils.TestNode{
1: nodes[1],
2: nodes[2],
}
ts.Server.raft.RemoveMember(context.Background(), raftMember.RaftID)
raftutils.WaitForCluster(t, clockSource, newCluster)
// Server should list 2 members
assert.NoError(t, testutils.PollFunc(clockSource, func() error {
members := nodes[1].GetMemberlist()
if len(members) != 2 {
return fmt.Errorf("expected 2 nodes, got %d", len(members))
}
return nil
}))
demoteNode := nodes[2]
lastNode := nodes[1]
raftMember = ts.Server.raft.GetMemberByNodeID(demoteNode.SecurityConfig.ClientTLSCreds.NodeID())
assert.NotNil(t, raftMember)
// Try to demote a Node and scale down to 1
r, err = ts.Client.GetNode(context.Background(), &api.GetNodeRequest{NodeID: demoteNode.SecurityConfig.ClientTLSCreds.NodeID()})
assert.NoError(t, err)
spec = r.Node.Spec.Copy()
spec.DesiredRole = api.NodeRoleWorker
version = &r.Node.Meta.Version
_, err = ts.Client.UpdateNode(context.Background(), &api.UpdateNodeRequest{
NodeID: demoteNode.SecurityConfig.ClientTLSCreds.NodeID(),
Spec: spec,
NodeVersion: version,
})
assert.NoError(t, err)
ts.Server.raft.RemoveMember(context.Background(), raftMember.RaftID)
// Update the server
ts.Server.raft = lastNode.Node
ts.Server.store = lastNode.MemoryStore()
newCluster = map[uint64]*raftutils.TestNode{
1: lastNode,
}
raftutils.WaitForCluster(t, clockSource, newCluster)
assert.NoError(t, testutils.PollFunc(clockSource, func() error {
members := lastNode.GetMemberlist()
if len(members) != 1 {
return fmt.Errorf("expected 1 node, got %d", len(members))
}
return nil
}))
// Make sure we can't demote the last manager.
r, err = ts.Client.GetNode(context.Background(), &api.GetNodeRequest{NodeID: lastNode.SecurityConfig.ClientTLSCreds.NodeID()})
assert.NoError(t, err)
spec = r.Node.Spec.Copy()
spec.DesiredRole = api.NodeRoleWorker
version = &r.Node.Meta.Version
_, err = ts.Client.UpdateNode(context.Background(), &api.UpdateNodeRequest{
NodeID: lastNode.SecurityConfig.ClientTLSCreds.NodeID(),
Spec: spec,
NodeVersion: version,
})
assert.Error(t, err)
assert.Equal(t, codes.FailedPrecondition, testutils.ErrorCode(err))
// Propose a change in the spec and check if the remaining node can still process updates
r, err = ts.Client.GetNode(context.Background(), &api.GetNodeRequest{NodeID: lastNode.SecurityConfig.ClientTLSCreds.NodeID()})
assert.NoError(t, err)
spec = r.Node.Spec.Copy()
spec.Availability = api.NodeAvailabilityDrain
version = &r.Node.Meta.Version
_, err = ts.Client.UpdateNode(context.Background(), &api.UpdateNodeRequest{
NodeID: lastNode.SecurityConfig.ClientTLSCreds.NodeID(),
Spec: spec,
NodeVersion: version,
})
assert.NoError(t, err)
// Get node information and check that the availability is set to drain
r, err = ts.Client.GetNode(context.Background(), &api.GetNodeRequest{NodeID: lastNode.SecurityConfig.ClientTLSCreds.NodeID()})
assert.NoError(t, err)
assert.Equal(t, r.Node.Spec.Availability, api.NodeAvailabilityDrain)
}
func TestUpdateNodeDemote(t *testing.T) {
t.Parallel()
testUpdateNodeDemote(t)
}
// TestOrphanNodeTasks tests the unexported orphanNodeTasks function
func TestOrphanNodeTasks(t *testing.T) {
// first, set up a store and all that
ts := newTestServer(t)
defer ts.Stop()
ts.Store.Update(func(tx store.Tx) error {
store.CreateCluster(tx, &api.Cluster{
ID: identity.NewID(),
Spec: api.ClusterSpec{
Annotations: api.Annotations{
Name: store.DefaultClusterName,
},
},
})
return nil
})
// make sure before we start that our server is in a good (empty) state
r, err := ts.Client.ListNodes(context.Background(), &api.ListNodesRequest{})
assert.NoError(t, err)
assert.Empty(t, r.Nodes)
// create a manager
createNode(t, ts, "id1", api.NodeRoleManager, api.NodeMembershipAccepted, api.NodeStatus_READY)
r, err = ts.Client.ListNodes(context.Background(), &api.ListNodesRequest{})
assert.NoError(t, err)
assert.Len(t, r.Nodes, 1)
	// create a worker. put it in the DOWN state, which is the state it
	// will need to be in before it can be removed anyway
createNode(t, ts, "id2", api.NodeRoleWorker, api.NodeMembershipAccepted, api.NodeStatus_DOWN)
r, err = ts.Client.ListNodes(context.Background(), &api.ListNodesRequest{})
assert.NoError(t, err)
assert.Len(t, r.Nodes, 2)
// create a network we can "attach" to
err = ts.Store.Update(func(tx store.Tx) error {
n := &api.Network{
ID: "net1id",
Spec: api.NetworkSpec{
Annotations: api.Annotations{
Name: "net1name",
},
Attachable: true,
},
}
return store.CreateNetwork(tx, n)
})
require.NoError(t, err)
// create some tasks:
err = ts.Store.Update(func(tx store.Tx) error {
// 1.) A network attachment on the node we're gonna remove
task1 := &api.Task{
ID: "task1",
NodeID: "id2",
DesiredState: api.TaskStateRunning,
Status: api.TaskStatus{
State: api.TaskStateRunning,
},
Spec: api.TaskSpec{
Runtime: &api.TaskSpec_Attachment{
Attachment: &api.NetworkAttachmentSpec{
ContainerID: "container1",
},
},
Networks: []*api.NetworkAttachmentConfig{
{
Target: "net1id",
Addresses: []string{}, // just leave this empty, we don't need it
},
},
},
// we probably don't care about the rest of the fields.
}
if err := store.CreateTask(tx, task1); err != nil {
return err
}
// 2.) A network attachment on the node we're not going to remove
task2 := &api.Task{
ID: "task2",
NodeID: "id1",
DesiredState: api.TaskStateRunning,
Status: api.TaskStatus{
State: api.TaskStateRunning,
},
Spec: api.TaskSpec{
Runtime: &api.TaskSpec_Attachment{
Attachment: &api.NetworkAttachmentSpec{
ContainerID: "container2",
},
},
Networks: []*api.NetworkAttachmentConfig{
{
Target: "net1id",
Addresses: []string{}, // just leave this empty, we don't need it
},
},
},
// we probably don't care about the rest of the fields.
}
if err := store.CreateTask(tx, task2); err != nil {
return err
}
// 3.) A regular task on the node we're going to remove
task3 := &api.Task{
ID: "task3",
NodeID: "id2",
DesiredState: api.TaskStateRunning,
Status: api.TaskStatus{
State: api.TaskStateRunning,
},
Spec: api.TaskSpec{
Runtime: &api.TaskSpec_Container{
Container: &api.ContainerSpec{},
},
},
}
if err := store.CreateTask(tx, task3); err != nil {
return err
}
// 4.) A regular task on the node we're not going to remove
task4 := &api.Task{
ID: "task4",
NodeID: "id1",
DesiredState: api.TaskStateRunning,
Status: api.TaskStatus{
State: api.TaskStateRunning,
},
Spec: api.TaskSpec{
Runtime: &api.TaskSpec_Container{
Container: &api.ContainerSpec{},
},
},
}
if err := store.CreateTask(tx, task4); err != nil {
return err
}
// 5.) A regular task that's already in a terminal state on the node,
// which does not need to be updated.
task5 := &api.Task{
ID: "task5",
NodeID: "id2",
DesiredState: api.TaskStateRunning,
Status: api.TaskStatus{
// use TaskStateCompleted, as this is the earliest terminal
// state (this ensures we don't actually use <= instead of <)
State: api.TaskStateCompleted,
},
Spec: api.TaskSpec{
Runtime: &api.TaskSpec_Container{
Container: &api.ContainerSpec{},
},
},
}
return store.CreateTask(tx, task5)
})
require.NoError(t, err)
// Now, call the function with our nodeID. make sure it returns no error
err = ts.Store.Update(func(tx store.Tx) error {
return orphanNodeTasks(tx, "id2")
})
require.NoError(t, err)
	// Now, make sure only tasks 1 and 3, the tasks on the node we're
	// removing, are marked orphaned
ts.Store.View(func(tx store.ReadTx) {
tasks, err := store.FindTasks(tx, store.All)
require.NoError(t, err)
require.Len(t, tasks, 5)
		// tasks 1 and 3 should be orphaned; the rest should be untouched
for _, task := range tasks {
require.NotNil(t, task)
if task.ID == "task1" || task.ID == "task3" {
require.Equal(t, task.Status.State, api.TaskStateOrphaned)
} else {
require.NotEqual(t, task.Status.State, api.TaskStateOrphaned)
}
}
})
}
```
Sarkis Katchadourian (, August 9, 1886 – March 4, 1947) was an Armenian artist.
Biography
Sarkis Katchadourian was born on August 9, 1886, in Malatya, to Sarkis and Varduhi. He received his primary education at the Armenian Evangelical College of Malatya. From 1902 to 1908 he studied at Sanasarian College in Karin (Erzurum), and in 1908 he left for Constantinople. From 1908 to 1911 he studied at the Accademia di Belle Arti di Roma, graduating with a gold medal. He returned to Constantinople and taught painting at Sanasarian College in Karin. From 1912 to 1914 he studied at the National Higher School of Decorative Arts in Paris, receiving a first-class diploma. He then went to Geneva to study pedagogy and continued his training in Munich and Vienna. From 1914 to 1918 Katchadourian was in the Caucasus (Batumi, Yessentuki, Tiflis, Yerevan, Dilijan, Ijevan, Alexandrapol, Gharakilisa and Echmiadzin).
In 1915 he left for Persia and then Armenia. In 1916 he took part in founding the Armenian Artists' Association in Tiflis. In 1920 Katchadourian married Vardanoush Sarian, and in 1921 he became member-secretary of the Armenian Artists' Union in Tiflis. That same year the government of the Armenian SSR commissioned the artist to draw sketches for stamps printed in Constantinople at Yesayan's personal publishing house.
From 1937 to 1941 he worked in India, making copies of temple frescoes. In 1941 Katchadourian settled in New York City.
Katchadourian's works are held in museums in Paris, London, Vienna, Brussels, New York and elsewhere, as well as in collections of the former USSR, including the Georgian State Museum and the National Gallery of Armenia. In 1971 his works were exhibited in India, and his widow, the painter Vava Katchadourian, presented his copies of Ceylonese frescoes to the State Gallery of Armenia.
Death
Sarkis Katchadourian died in Paris in 1947. His urn was buried in the Artists' Pantheon of Yerevan on December 28, 1977.
Gallery
See also
List of Armenian artists
List of Armenians
Culture of Armenia
References
External sources
Indian murals and Sarkis Katchadourian
About Sarkis Katchadourian
1886 births
1947 deaths
People from Malatya
Armenian artists
Artists from the Ottoman Empire
20th-century artists from the Ottoman Empire
Armenians from the Ottoman Empire
20th-century Armenian people
Emigrants from the Ottoman Empire to France
```go
package snowflake
import (
"bytes"
"context"
"database/sql"
"io"
"net/http"
"net/http/httptest"
"strings"
"testing"
"time"
"github.com/gofrs/uuid"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"github.com/redpanda-data/benthos/v4/public/service"
)
const (
dummyUUID = "12345678-90ab-cdef-1234-567890abcdef"
)
type MockDB struct {
Queries []string
QueriesCount int
}
func (db *MockDB) ExecContext(ctx context.Context, query string, _ ...any) (sql.Result, error) {
db.Queries = append(db.Queries, query)
db.QueriesCount++
return nil, nil
}
func (db *MockDB) Close() error { return nil }
func (db *MockDB) hasQuery(query string) bool {
for _, q := range db.Queries {
if q == query {
return true
}
}
return false
}
type MockUUIDGenerator struct{}
func (MockUUIDGenerator) NewV4() (uuid.UUID, error) {
return uuid.Must(uuid.FromString(dummyUUID)), nil
}
type MockHTTPClient struct {
SnowpipeHost string
Queries []string
QueriesCount int
Payloads []string
JWTs []string
}
func (c *MockHTTPClient) Do(req *http.Request) (*http.Response, error) {
req.URL.Host = c.SnowpipeHost
req.URL.Scheme = "http"
query := req.URL.Path
query += "?" + req.URL.RawQuery
c.Queries = append(c.Queries, query)
c.QueriesCount++
// Read request body and recreate it
bodyBytes, err := io.ReadAll(req.Body)
if err != nil {
return nil, err
}
req.Body.Close()
req.Body = io.NopCloser(bytes.NewBuffer(bodyBytes))
c.Payloads = append(c.Payloads, strings.TrimSpace(string(bodyBytes)))
c.JWTs = append(c.JWTs, req.Header.Get("Authorization"))
return http.DefaultClient.Do(req)
}
func (c *MockHTTPClient) hasQuery(query string) bool {
for _, q := range c.Queries {
if q == query {
return true
}
}
return false
}
func (c *MockHTTPClient) hasPayload(payload string) bool {
for _, p := range c.Payloads {
if p == payload {
return true
}
}
return false
}
func TestSnowflakeOutput(t *testing.T) {
type testCase struct {
name string
privateKeyPath string
privateKeyPassphrase string
stage string
fileName string
fileExtension string
requestID string
snowpipe string
compression string
snowflakeHTTPResponseCode int
snowflakeResponseCode string
wantPUTQuery string
wantPUTQueriesCount int
wantSnowpipeQuery string
wantSnowpipeQueriesCount int
wantSnowpipePayload string
wantSnowpipeJWT string
errConfigContains string
errContains string
}
getSnowflakeWriter := func(t *testing.T, tc testCase) (*snowflakeWriter, error) {
t.Helper()
outputConfig := `
account: benthos
region: east-us-2
cloud: azure
user: foobar
private_key_file: ` + tc.privateKeyPath + `
private_key_pass: ` + tc.privateKeyPassphrase + `
role: test_role
database: test_db
warehouse: test_warehouse
schema: test_schema
path: foo/bar/baz
stage: '` + tc.stage + `'
file_name: '` + tc.fileName + `'
file_extension: '` + tc.fileExtension + `'
upload_parallel_threads: 42
compression: ` + tc.compression + `
request_id: '` + tc.requestID + `'
snowpipe: '` + tc.snowpipe + `'
`
spec := snowflakePutOutputConfig()
env := service.NewEnvironment()
conf, err := spec.ParseYAML(outputConfig, env)
require.NoError(t, err)
return newSnowflakeWriterFromConfig(conf, service.MockResources())
}
tests := []testCase{
{
name: "executes snowflake query with plaintext SSH key",
privateKeyPath: "resources/ssh_keys/snowflake_rsa_key.pem",
stage: "@test_stage",
compression: "NONE",
wantPUTQuery: "PUT file://foo/bar/baz/" + dummyUUID + ".json @test_stage/foo/bar/baz AUTO_COMPRESS = FALSE SOURCE_COMPRESSION = NONE PARALLEL=42",
},
{
name: "executes snowflake query with encrypted SSH key",
privateKeyPath: "resources/ssh_keys/snowflake_rsa_key.p8",
privateKeyPassphrase: "test123",
stage: "@test_stage",
compression: "NONE",
wantPUTQuery: "PUT file://foo/bar/baz/" + dummyUUID + ".json @test_stage/foo/bar/baz AUTO_COMPRESS = FALSE SOURCE_COMPRESSION = NONE PARALLEL=42",
},
{
name: "fails to read missing SSH key",
privateKeyPath: "resources/ssh_keys/missing_key.pem",
stage: "@test_stage",
compression: "NONE",
errConfigContains: "failed to read private key resources/ssh_keys/missing_key.pem: open resources/ssh_keys/missing_key.pem: no such file or directory",
},
{
name: "fails to read encrypted SSH key without passphrase",
privateKeyPath: "resources/ssh_keys/snowflake_rsa_key.p8",
stage: "@test_stage",
compression: "NONE",
errConfigContains: "failed to read private key: private key requires a passphrase, but private_key_passphrase was not supplied",
},
{
name: "executes snowflake query without compression",
privateKeyPath: "resources/ssh_keys/snowflake_rsa_key.pem",
stage: "@test_stage",
compression: "NONE",
wantPUTQuery: "PUT file://foo/bar/baz/" + dummyUUID + ".json @test_stage/foo/bar/baz AUTO_COMPRESS = FALSE SOURCE_COMPRESSION = NONE PARALLEL=42",
},
{
name: "executes snowflake query with automatic compression",
privateKeyPath: "resources/ssh_keys/snowflake_rsa_key.pem",
stage: "@test_stage",
compression: "AUTO",
wantPUTQuery: "PUT file://foo/bar/baz/" + dummyUUID + ".gz @test_stage/foo/bar/baz AUTO_COMPRESS = TRUE SOURCE_COMPRESSION = AUTO_DETECT PARALLEL=42",
},
{
name: "executes snowflake query with gzip compression",
privateKeyPath: "resources/ssh_keys/snowflake_rsa_key.pem",
stage: "@test_stage",
compression: "GZIP",
wantPUTQuery: "PUT file://foo/bar/baz/" + dummyUUID + ".gz @test_stage/foo/bar/baz AUTO_COMPRESS = FALSE SOURCE_COMPRESSION = GZIP PARALLEL=42",
},
{
name: "executes snowflake query with DEFLATE compression",
privateKeyPath: "resources/ssh_keys/snowflake_rsa_key.pem",
stage: "@test_stage",
compression: "DEFLATE",
wantPUTQuery: "PUT file://foo/bar/baz/" + dummyUUID + ".deflate @test_stage/foo/bar/baz AUTO_COMPRESS = FALSE SOURCE_COMPRESSION = DEFLATE PARALLEL=42",
},
{
name: "executes snowflake query with RAW_DEFLATE compression",
privateKeyPath: "resources/ssh_keys/snowflake_rsa_key.pem",
stage: "@test_stage",
compression: "RAW_DEFLATE",
wantPUTQuery: "PUT file://foo/bar/baz/" + dummyUUID + ".raw_deflate @test_stage/foo/bar/baz AUTO_COMPRESS = FALSE SOURCE_COMPRESSION = RAW_DEFLATE PARALLEL=42",
},
{
name: "handles file name and file extension interpolation",
privateKeyPath: "resources/ssh_keys/snowflake_rsa_key.pem",
stage: "@test_stage",
fileName: `${! "deadbeef" }`,
fileExtension: `${! "parquet" }`,
compression: "NONE",
wantPUTQuery: "PUT file://foo/bar/baz/deadbeef.parquet @test_stage/foo/bar/baz AUTO_COMPRESS = FALSE SOURCE_COMPRESSION = NONE PARALLEL=42",
},
{
name: "executes snowflake query and calls Snowpipe",
privateKeyPath: "resources/ssh_keys/snowflake_rsa_key.pem",
stage: "@test_stage",
snowpipe: "test_pipe",
compression: "NONE",
snowflakeHTTPResponseCode: http.StatusOK,
snowflakeResponseCode: "SUCCESS",
wantPUTQuery: "PUT file://foo/bar/baz/" + dummyUUID + ".json @test_stage/foo/bar/baz AUTO_COMPRESS = FALSE SOURCE_COMPRESSION = NONE PARALLEL=42",
wantPUTQueriesCount: 1,
wantSnowpipeQuery: "/v1/data/pipes/test_db.test_schema.test_pipe/insertFiles?requestId=" + dummyUUID,
wantSnowpipeQueriesCount: 1,
wantSnowpipePayload: `{"files":[{"path":"foo/bar/baz/` + dummyUUID + `.json"}]}`,
wantSnowpipeJWT: "Bearer eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCJ9.your_sha256_hashyour_sha256_hashQ0U2S1Irc0J4VEVoenBFPSIsInN1YiI6IkJFTlRIT1MuRk9PQkFSIn0.your_sha256_hashyour_sha256_hashyour_sha256_hashyour_sha256_hashyour_sha256_hashFZ_Xk6tnaZfdIrGKRZ5dsA",
},
{
name: "gets error code from Snowpipe",
privateKeyPath: "resources/ssh_keys/snowflake_rsa_key.pem",
stage: "@test_stage",
snowpipe: "test_pipe",
compression: "NONE",
snowflakeHTTPResponseCode: http.StatusOK,
snowflakeResponseCode: "FAILURE",
errContains: "received unexpected Snowpipe response code: FAILURE",
},
{
name: "gets http error from Snowpipe",
privateKeyPath: "resources/ssh_keys/snowflake_rsa_key.pem",
stage: "@test_stage",
snowpipe: "test_pipe",
compression: "NONE",
snowflakeHTTPResponseCode: http.StatusTeapot,
errContains: "received unexpected Snowpipe response status: 418",
},
{
name: "handles stage interpolation and runs a query for each sub-batch",
privateKeyPath: "resources/ssh_keys/snowflake_rsa_key.pem",
stage: `@test_stage_${! json("id") }`,
compression: "NONE",
wantPUTQueriesCount: 2,
wantPUTQuery: "PUT file://foo/bar/baz/" + dummyUUID + ".json @test_stage_bar/foo/bar/baz AUTO_COMPRESS = FALSE SOURCE_COMPRESSION = NONE PARALLEL=42",
},
{
name: "handles Snowpipe interpolation and runs a query for each sub-batch",
privateKeyPath: "resources/ssh_keys/snowflake_rsa_key.pem",
stage: "@test_stage",
snowpipe: `test_pipe_${! json("id") }`,
compression: "NONE",
snowflakeHTTPResponseCode: http.StatusOK,
snowflakeResponseCode: "SUCCESS",
wantPUTQuery: "PUT file://foo/bar/baz/" + dummyUUID + ".json @test_stage/foo/bar/baz AUTO_COMPRESS = FALSE SOURCE_COMPRESSION = NONE PARALLEL=42",
wantPUTQueriesCount: 2,
wantSnowpipeQuery: "/v1/data/pipes/test_db.test_schema.test_pipe_bar/insertFiles?requestId=" + dummyUUID,
wantSnowpipeQueriesCount: 2,
wantSnowpipePayload: `{"files":[{"path":"foo/bar/baz/` + dummyUUID + `.json"}]}`,
wantSnowpipeJWT: "Bearer eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCJ9.your_sha256_hashyour_sha256_hashQ0U2S1Irc0J4VEVoenBFPSIsInN1YiI6IkJFTlRIT1MuRk9PQkFSIn0.your_sha256_hashyour_sha256_hashyour_sha256_hashyour_sha256_hashyour_sha256_hashFZ_Xk6tnaZfdIrGKRZ5dsA",
},
{
name: "handles request_id interpolation and runs a query and makes a single Snowpipe call for the entire batch",
privateKeyPath: "resources/ssh_keys/snowflake_rsa_key.pem",
stage: `@test_stage`,
snowpipe: `test_pipe`,
requestID: `${! "deadbeef" }`,
compression: "NONE",
snowflakeHTTPResponseCode: http.StatusOK,
snowflakeResponseCode: "SUCCESS",
wantPUTQuery: "PUT file://foo/bar/baz/deadbeef.json @test_stage/foo/bar/baz AUTO_COMPRESS = FALSE SOURCE_COMPRESSION = NONE PARALLEL=42",
wantPUTQueriesCount: 1,
wantSnowpipeQuery: "/v1/data/pipes/test_db.test_schema.test_pipe/insertFiles?requestId=deadbeef",
wantSnowpipeQueriesCount: 1,
wantSnowpipePayload: `{"files":[{"path":"foo/bar/baz/deadbeef.json"}]}`,
wantSnowpipeJWT: "Bearer eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCJ9.your_sha256_hashyour_sha256_hashQ0U2S1Irc0J4VEVoenBFPSIsInN1YiI6IkJFTlRIT1MuRk9PQkFSIn0.your_sha256_hashyour_sha256_hashyour_sha256_hashyour_sha256_hashyour_sha256_hashFZ_Xk6tnaZfdIrGKRZ5dsA",
},
// TODO:
// - Snowflake PUT query payload tests
}
for _, test := range tests {
t.Run(test.name, func(t *testing.T) {
s, err := getSnowflakeWriter(t, test)
if test.errConfigContains == "" {
require.NoError(t, err)
} else {
require.Error(t, err)
require.Contains(t, err.Error(), test.errConfigContains)
return
}
s.uuidGenerator = MockUUIDGenerator{}
snowpipeTestServer := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
w.WriteHeader(test.snowflakeHTTPResponseCode)
_, _ = w.Write([]byte(`{"ResponseCode": "` + test.snowflakeResponseCode + `"}`))
}))
t.Cleanup(snowpipeTestServer.Close)
mockHTTPClient := MockHTTPClient{
SnowpipeHost: snowpipeTestServer.Listener.Addr().String(),
}
s.httpClient = &mockHTTPClient
mockDB := MockDB{}
s.db = &mockDB
s.nowFn = func() time.Time { return time.Time{} }
err = s.WriteBatch(context.Background(), service.MessageBatch{
service.NewMessage([]byte(`{"id":"foo","content":"foo stuff"}`)),
service.NewMessage([]byte(`{"id":"bar","content":"bar stuff"}`)),
})
if test.errContains == "" {
require.NoError(t, err)
} else {
require.Error(t, err)
require.Contains(t, err.Error(), test.errContains)
return
}
if test.wantPUTQueriesCount > 0 {
assert.Equal(t, test.wantPUTQueriesCount, mockDB.QueriesCount)
}
if test.wantPUTQuery != "" {
assert.True(t, mockDB.hasQuery(test.wantPUTQuery))
}
if test.wantSnowpipeQueriesCount > 0 {
assert.Equal(t, test.wantSnowpipeQueriesCount, mockHTTPClient.QueriesCount)
assert.Len(t, mockHTTPClient.JWTs, test.wantSnowpipeQueriesCount)
for _, jwt := range mockHTTPClient.JWTs {
assert.Equal(t, test.wantSnowpipeJWT, jwt)
}
}
if test.wantSnowpipeQuery != "" {
assert.True(t, mockHTTPClient.hasQuery(test.wantSnowpipeQuery))
}
if test.wantSnowpipePayload != "" {
assert.True(t, mockHTTPClient.hasPayload(test.wantSnowpipePayload))
}
})
}
}
```
|
The Eastern Sea Frontier (EASTSEAFRON) was a United States Navy operational command during World War II that was responsible for the coastal waters from Canada to Jacksonville, Florida, extending out for a nominal distance of two hundred miles. The commander was designated Commander, Eastern Sea Frontier (COMEASTSEAFRON), and was allocated vessels for convoy escort or other uses the commander determined. In addition to providing escorts for convoys within its frontier, the frontier was responsible for sea-air rescue, harbor defense, shipping lane patrol, minesweeping, and air operations.
History
The Code of Federal Regulations indicates that the Eastern Sea Frontier's commander also served as commander of the Atlantic Reserve Fleet as of 1937–38.
In December 1940 the Navy Basic War Plan, RAINBOW No. 3 (WPL-44), was promulgated. This plan and WPL-42 stipulated the preparation of operations plans.
On 14 January 1941, orders were issued to transfer Rear Admiral Adolphus Andrews from his post as Commander, Scouting Force, United States Fleet to a new command as Commandant, Third Naval District. These orders were modified as of 1 March 1941, giving Admiral Andrews additional duty as Commander, North Atlantic Naval Coastal Frontier. When he relieved Rear Admiral Clark H. Woodward on 10 March 1941, the Commandant, Third Naval District was for the first time also designated Commander, North Atlantic Naval Coastal Frontier.
On 16 March 1941 the first Operation Plan: North Atlantic Naval Coastal Frontier Plan 0–4 (RAINBOW 3), with the short title NA-NCF-44, was issued. This plan set up the proposed staff of the Commander, North Atlantic Naval Coastal Frontier, which consisted of the following: Chief of Staff, Operations Officer, Shipping Control Officer, Air Officer, First Army Liaison Officer, Intelligence Officer and Communication Officer. It also provided for the command relations and the plans for coordination with the Army commander. Nevertheless, at this time, no officers were immediately available to fill these commands.
On 3 April 1941, a second plan was issued to modify NA-NCF-44 to make it applicable to the concept of war outlined in "RAINBOW No. 1." This modification was entitled, North Atlantic Naval Coastal Frontier Plan O-4 (RAINBOW No. 1), with the short title, NA-NCF-42. On 22 April 1941, a third plan was issued: the original Operation Plan for the Forces of the North Atlantic Naval Coastal Frontier. This plan was designated, "Operation Plan, NA-NCF-1-41." However, at this time, task forces were not created. When the Navy Basic War Plan, RAINBOW No. 5 (WPL-46), was issued in May 1941, it included important directives on the eventual organization of task forces and command relations in the naval coastal frontiers. Nevertheless, these directives merely outlined an organizational structure which would not be created until a later order was issued.
WPL-46 had incorporated the structure of task forces as they had been ordered by General Order No. 143, issued 3 February 1941. On 1 July 1941, the Chief of Naval Operations formally ordered the establishment of naval coastal frontiers, thus transforming them from their theoretical status; but added in the same dispatch, "For the present, Naval Coastal Frontier Forces, as prescribed in General Order No. 143, will not be formed."
On 6 February 1942, the Secretary of the Navy formally changed the names of the coastal frontiers to sea frontiers; thereafter, the North Atlantic Naval Coastal Frontier was designated the Eastern Sea Frontier.
Eastern Sea Frontier's headquarters were located at 90 Church Street in Lower Manhattan. The commander of the Eastern Sea Frontier, until the closing months of 1943, was then-Rear Admiral Adolphus Andrews.
COMEASTSEAFRON had control of and responsibility for convoys within its defined area; convoys from adjacent sea frontiers would continue across sea frontier boundaries. The Eastern Sea Frontier coordinated with the Royal Canadian Navy for convoys crossing into the Canadian Coastal Zone, and as the "parent" of the contiguous sea frontiers to the south, COMEASTSEAFRON's authority extended beyond its own frontier. COMEASTSEAFRON operational orders could only be appealed to Admiral Ernest King.
Aircraft
COMEASTSEAFRON resources included a blimp airship group at Naval Air Engineering Station Lakehurst, a special convoy air escort group at Naval Air Station Quonset Point, and the Northern, Narragansett, New York, Delaware, Chesapeake, and Southern Air Groups operating from sixteen airfields from Bar Harbor NAAF to Naval Air Station Jacksonville. COMEASTSEAFRON worked closely with the U.S. Army Air Force in the defense of the frontier. Usually, the U.S. Navy and U.S. Army Air Force officers assigned to the frontier had their offices side by side, creating effective two-way communication and expediting reaction to reports of enemy presence.
Participating units
VS-33
Commanders
Rear Admiral Clark H. Woodward: ? - February 6, 1942
Vice Admiral Adolphus Andrews: February 6, 1942 - November 1, 1943
Vice Admiral Herbert F. Leary: November 1, 1943 - January 16, 1946
Vice Admiral Thomas C. Kinkaid: January 1946 - April 1950
Vice Admiral Oscar C. Badger: May 1950 - April 1952
Rear Admiral John E. Wilkes (acting): March 1951 - June 1951
Vice Admiral Walter S. DeLany: April 1952 - January 30, 1953
Vice Admiral Laurance T. DuBose: February 1, 1953 - June 1, 1955
Vice Admiral Arthur D. Struble: June 1, 1955 - July 1, 1956
Vice Admiral Frederick W. McMahon: January 1957 - December 1958
Vice Admiral Thomas S. Combs: December 1958 - April 1960
Vice Admiral Charles Wellborn Jr.: March 1960 - January 1963
Vice Admiral Harold T. Deutermann: 1963-1965
Vice Admiral John S. McCain Jr.: 1965-1967
Vice Admiral Andrew McBurney Jackson: 1967-1969
Vice Admiral John M. Lee: 1969-1970
Notes
References
Admiral Ernest King, First Report to the Secretary of the Navy: Covering our Peacetime Navy and our Wartime Navy and including combat operations up to 1 March 1944. April 1944, pp. 75–88.
Samuel Eliot Morison, History of United States Naval Operations in World War II, The Battle of the Atlantic, 1939–1943.
External links
U-boat Archive – Eastern Sea Frontier
U-boat Archive – Eastern Sea Frontier – War Diary March 1942
Glossary of U.S. Naval Abbreviations (OPNAV 29-P1000)
Naval Operations in the Atlantic and Mediterranean to March 1944
The Battle of the Atlantic
American Theater of World War II
Submarine warfare in World War II
Eastern Sea Frontier
|
```c
/* Automatically generated by version.sh, do not manually edit! */
#ifndef AVUTIL_FFVERSION_H
#define AVUTIL_FFVERSION_H
#define FFMPEG_VERSION "git-2016-01-14-19b4974"
#endif /* AVUTIL_FFVERSION_H */
```
|
```shell
#!/bin/bash -eu
# path_to_url
################################################################################
pip3 install .
# Build fuzzers in $OUT.
for fuzzer in $(find $SRC -name 'fuzz_*.py'); do
# Add relevant data and two hidden modules
compile_python_fuzzer $fuzzer \
--add-data ./connexion/resources/schemas/:connexion/resources/schemas/ \
--add-data $SRC/jsonschema_specifications/jsonschema_specifications/schemas:jsonschema_specifications/schemas \
--hidden-import=asgiref \
--hidden-import=flask
done
```
|
Charlotta "Lotta" Kristina Johansdotter Edholm (born 8 February 1965) is a Swedish politician for the Liberals. Since 18 October 2022 she has been the Minister for Schools in the Ulf Kristersson cabinet.
Biography
Edholm was born in Västerås, and has a BA in political science from Stockholm University. She served as a member of the Riksdag () from 1992 to 1994. She was municipal commissioner () for schools in Stockholm from 2006 to 2014 and again from 2018 to 2020, and served as an opposition commissioner from 2014 to 2018. She was previously married to former party leader Lars Leijonborg, with whom she has one son.
References
1965 births
Living people
Members of the Riksdag from the Liberals (Sweden)
Women government ministers of Sweden
Swedish Ministers for Schools
Women members of the Riksdag
|
A tau robe is a very simple black or white gown cut to resemble the Greek letter, "tau," or "T".
Description
The arms usually reach from fingertip to fingertip, and the bottom hem is usually floor length, as with a ceremonial tabard. Both the arms and the body of the gown flare, so that the sleeves, being bell sleeves, are wider at the fingers than at the shoulder, and the body is wider at the bottom than at the chest. This loose fit helps with maneuverability while wearing the robe.
Purpose
The robe is one of the vestments worn in ceremonial magic. Although not essential, Donald Kraig describes the purpose of wearing the robe as "to physically show both your conscious and your unconscious that you are no longer in your daily dress." Kraig goes on to say that wearing the robe indicates a magical and spiritual intent, such as engaging in ritual, and that it should be kept exclusively for that purpose.
Footnotes
References
Ceremonial clothing
Robes and cloaks
|
The Canon PowerShot S is a series of digital cameras released by Canon as part of the wider PowerShot range. The S-series was originally a line of compact point-and-shoot cameras, slowly evolving into a prosumer line slotting right beneath the G-series cameras. The line later branched off into Canon's line of super-zoom cameras. The PowerShot ELPH line is a branch of the S-series, as reflected in its model number designations in the United States (the S- and SD- prefixes) and the similarities between the PowerShot ELPH S100 and the PowerShot S10.
G-series in a compact body
From the PowerShot S90 onwards, the S-series continues a line of Canon compact digital cameras that commenced with the Ixus 900Ti, featuring the Digic image processors and larger-than-average sensors fitted to the advanced PowerShot G-series cameras. The Ixus / S-series and the equivalent G-series models are listed below:
Ixus 900Ti (SD900)* / PowerShot G7 / Digic III / 10MP 3648 × 2736 1/1.8″ CCD.
Ixus 960IS (SD950IS)* / PowerShot G9 / Digic III / 12.1MP 4000 × 3000 1/1.7″ CCD.
Ixus 980IS (SD990IS) / PowerShot G10 / Digic 4 / 14.7MP 4416 × 3312 1/1.7″ CCD.
PowerShot S90, S95, S200 / PowerShot G11, G12 / Digic 4 / 10MP 3648 × 2736 1/1.7″ CCD (S200 features Digic 5).
PowerShot S100, S110 / PowerShot G15 / Digic 5 / 12.1MP 4000×3000 1/1.7" CMOS.
Powershot S120 / PowerShot G16 / Digic 6 / 12.1MP 4000×3000 1/1.7" CMOS.
(* The Ixus 900Ti and 960IS feature a titanium body.)
Models
Compact S series
The Sxx series is made up of two sub-series. The S10 and S20 were compact point-and-shoot cameras, while the S30 onwards were prosumer digital cameras serving as the lower-cost, reduced-feature alternative to the equivalent G-series camera of the time.
Super Zoom S/SX series
The S1 to SX70 series consists of ultra-zoom cameras, having longer zoom ranges and a more extensive list of features. The SX100 and later SX models are a more compact, affordable spin-off. The "SX" stands for "Super Zoom." All S and SX models feature image stabilization, and most have full manual controls.
See also
Canon PowerShot
Canon PowerShot A
Canon PowerShot G
Canon PowerShot SD or Digital Elph
References
External links
S
Superzoom cameras
Year of introduction missing
|
```csharp
using AspectCore.DynamicProxy;
namespace AspectCore.Extensions.DataValidation
{
[NonAspect]
public interface IDataStateFactory
{
IDataState CreateDataState(DataValidationContext dataValidationContext);
}
}
```
|
The Anatomical Therapeutic Chemical (ATC) Classification System is a drug classification system that classifies the active ingredients of drugs according to the organ or system on which they act and their therapeutic, pharmacological and chemical properties. Its purpose is to serve as a tool for monitoring drug use and for research aimed at improving the quality of medication use. It does not imply drug recommendation or efficacy. It is controlled by the World Health Organization Collaborating Centre for Drug Statistics Methodology (WHOCC), and was first published in 1976.
Coding system
This pharmaceutical coding system divides drugs into different groups according to the organ or system on which they act, their therapeutic intent or nature, and the drug's chemical characteristics. Different brands share the same code if they have the same active substance and indications. Each bottom-level ATC code stands for a pharmaceutically used substance, or a combination of substances, in a single indication (or use). This means that one drug can have more than one code: acetylsalicylic acid (aspirin), for example, has one code as a drug for local oral treatment, one as a platelet inhibitor, and one as an analgesic and antipyretic. Conversely, one code can represent more than one active ingredient: for example, the combination of perindopril with amlodipine, two active ingredients that each have their own code when prescribed alone.
The ATC classification system is a strict hierarchy, meaning that each code necessarily has one and only one parent code, except for the 14 codes at the topmost level, which have no parents. The codes are semantic identifiers, meaning they carry information in themselves beyond serving as arbitrary identifiers: each code encodes its complete lineage of parent codes. As of 7 May 2020, there are 6,331 codes in ATC; the table below gives the count per level.
History
The ATC system is based on the earlier Anatomical Classification System, which is intended as a tool for the pharmaceutical industry to classify pharmaceutical products (as opposed to their active ingredients). This system, confusingly also called ATC, was initiated in 1971 by the European Pharmaceutical Market Research Association (EphMRA) and is being maintained by the EphMRA and Intellus. Its codes are organised into four levels. The WHO's system, having five levels, is an extension and modification of the EphMRA's. It was first published in 1976.
Classification
In this system, drugs are classified into groups at five different levels:
First level
The first level of the code indicates the anatomical main group and consists of one letter. There are 14 main groups:
Example: C Cardiovascular system
Second level
The second level of the code indicates the therapeutic subgroup and consists of two digits.
Example: C03 Diuretics
Third level
The third level of the code indicates the therapeutic/pharmacological subgroup and consists of one letter.
Example: C03C High-ceiling diuretics
Fourth level
The fourth level of the code indicates the chemical/therapeutic/pharmacological subgroup and consists of one letter.
Example: C03CA Sulfonamides
Fifth level
The fifth level of the code indicates the chemical substance and consists of two digits.
Example: C03CA01 furosemide
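Because the levels nest by prefix, a complete seven-character code can be decomposed into its five ancestors by simple slicing. A minimal sketch in Python — the helper name is ours, not part of any official tooling:

```python
# Decompose a 7-character ATC code into its five hierarchical levels.
# The slice widths follow the level structure described above:
# 1 letter, 2 digits, 1 letter, 1 letter, 2 digits.
def atc_levels(code: str) -> dict:
    if len(code) != 7:
        raise ValueError("a complete ATC code has exactly 7 characters")
    return {
        "anatomical_main_group": code[:1],     # e.g. "C"       Cardiovascular system
        "therapeutic_subgroup": code[:3],      # e.g. "C03"     Diuretics
        "pharmacological_subgroup": code[:4],  # e.g. "C03C"    High-ceiling diuretics
        "chemical_subgroup": code[:5],         # e.g. "C03CA"   Sulfonamides
        "chemical_substance": code,            # e.g. "C03CA01" furosemide
    }

print(atc_levels("C03CA01"))
```

Slicing works here precisely because the codes are semantic identifiers: every prefix of a valid code is itself a valid code one level up the hierarchy.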
Other ATC classification systems
ATCvet
The Anatomical Therapeutic Chemical Classification System for veterinary medicinal products (ATCvet) is used to classify veterinary drugs. ATCvet codes can be created by placing the letter Q in front of the ATC code of most human medications. For example, furosemide for veterinary use has the code QC03CA01.
Some codes are used exclusively for veterinary drugs, such as QI Immunologicals, QJ51 Antibacterials for intramammary use or QN05AX90 amperozide.
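For drugs that mirror a human medication, the mapping described above is mechanical — prefix the human ATC code with Q. A small sketch (the function name is ours); note that it cannot produce the veterinary-only groups, which are assigned directly:

```python
# Derive an ATCvet code from a human ATC code by prefixing "Q".
# Veterinary-only codes (e.g. QI Immunologicals) have no human
# counterpart and cannot be derived this way.
def atcvet_from_atc(atc_code: str) -> str:
    return "Q" + atc_code

print(atcvet_from_atc("C03CA01"))  # furosemide for veterinary use
```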
Herbal ATC (HATC)
The Herbal ATC system (HATC) is an ATC classification of herbal substances; it differs from the regular ATC system by using 4 digits instead of 2 at the 5th level group.
The herbal classification is not adopted by WHO. The Uppsala Monitoring Centre is responsible for the Herbal ATC classification, and it is part of the WHODrug Global portfolio available by subscription.
Defined daily dose
The ATC system also includes defined daily doses (DDDs) for many drugs. This is a measurement of drug consumption based on the usual daily dose for a given drug. According to the definition, "[t]he DDD is the assumed average maintenance dose per day for a drug used for its main indication in adults."
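The DDD is what makes consumption figures comparable across drugs and populations; a common utilisation metric built on it is DDDs per 1,000 inhabitants per day. A sketch under assumed figures — the 40 mg oral DDD for furosemide matches the WHO assignment as we understand it, but the current ATC/DDD Index should be checked:

```python
# DDDs per 1000 inhabitants per day: a standard drug-utilisation metric.
def ddd_per_1000_per_day(total_mg: float, ddd_mg: float,
                         population: int, days: int) -> float:
    ddds_consumed = total_mg / ddd_mg
    return ddds_consumed / (population * days) * 1000

# Example: 4 kg of furosemide (oral DDD assumed to be 40 mg) dispensed
# to a town of 50,000 inhabitants over 30 days.
print(ddd_per_1000_per_day(4_000_000, 40, 50_000, 30))
```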
Adaptations and updates
National issues of the ATC classification, such as the German Anatomisch-therapeutisch-chemische Klassifikation mit Tagesdosen, may include additional codes and DDDs not present in the WHO version.
ATC follows guidelines in creating new codes for newly approved drugs. An application is submitted to the WHO for ATC classification and DDD assignment. A preliminary or temporary code is assigned and published on the website and in WHO Drug Information for comment or objection. New ATC/DDD codes are discussed at the semi-annual Working Group meeting. If accepted, the code becomes a final decision, is published semi-annually on the website and in WHO Drug Information, and is implemented in the annual print/online ATC/DDD Index on January 1.
Changes to existing ATC/DDD codes follow a similar process: they become temporary codes and, if accepted, final ATC/DDD alterations. ATC and DDD alterations are only valid and implemented in the coming annual update; the original codes must continue to be used until the end of the year.
An updated version of the complete on-line/print ATC index with DDDs is published annually on January 1.
See also
Classification of Pharmaco-Therapeutic Referrals (CPR)
ICD-10 International Classification of Diseases
International Classification of Primary Care (ICPC-2) / ICPC-2 PLUS
Medical classification
Pharmaceutical care
Pharmacotherapy
RxNorm
References
External links
Quarterly journal providing an overview of topics relating to medicines development and regulation.
EphMRA Anatomical Classification (ATC and NFC)
atcd. R script to scrape the ATC data from the WHOCC website; contains link to download entire ATC tree.
Drugs
Pharmacological classification systems
World Health Organization
|
```tsx
import * as React from 'react';
import cx from 'classnames';
import { createSvgIcon } from '../utils/createSvgIcon';
import { iconClassNames } from '../utils/iconClassNames';
export const ItalicIcon = createSvgIcon({
svg: ({ classes }) => (
<svg role="presentation" focusable="false" viewBox="2 2 16 16" className={classes.svg}>
<path
className={cx(iconClassNames.outline, classes.outlinePart)}
d="M16 3C16.2761 3 16.5 3.22386 16.5 3.5C16.5 3.77614 16.2761 4 16 4H12.843L8.227 16H11.5C11.7761 16 12 16.2239 12 16.5C12 16.7761 11.7761 17 11.5 17H4C3.72386 17 3.5 16.7761 3.5 16.5C3.5 16.2239 3.72386 16 4 16H7.156L11.771 4H8.5C8.22386 4 8 3.77614 8 3.5C8 3.22386 8.22386 3 8.5 3H16Z"
/>
<path
className={cx(iconClassNames.filled, classes.filledPart)}
d="M8 3.25C8 2.83579 8.33579 2.5 8.75 2.5H16.25C16.6642 2.5 17 2.83579 17 3.25C17 3.66421 16.6642 4 16.25 4H13.0151L8.59202 15.5H11.25C11.6642 15.5 12 15.8358 12 16.25C12 16.6642 11.6642 17 11.25 17H3.75C3.33579 17 3 16.6642 3 16.25C3 15.8358 3.33579 15.5 3.75 15.5H6.9849L11.408 4H8.75C8.33579 4 8 3.66421 8 3.25Z"
/>
</svg>
),
displayName: 'ItalicIcon',
});
```
|
```swift
//
// Lexer.swift
// Parsing
//
// Created by Nick Lockwood on 03/09/2018.
//
import Foundation
// MARK: interface
public enum Token: Equatable {
case assign // = operator
case plus // + operator
case identifier(String) // letter followed by one or more alphanumeric chars
case number(Double) // any valid floating point number
case string(String) // a string literal surrounded by ""
case `let` // let keyword
case print // print keyword
}
public enum LexerError: Error, Equatable {
case unrecognizedInput(String)
}
public func tokenize(_ input: String) throws -> [Token] {
let whitespace = try NSRegularExpression(pattern: "(\\s|\\n)+")
let assign = try NSRegularExpression(pattern: "=")
let plus = try NSRegularExpression(pattern: "\\+")
let identifier = try NSRegularExpression(pattern: "[a-z][a-z0-9]*", options: .caseInsensitive)
let number = try NSRegularExpression(pattern: "[0-9.]+")
    let string = try NSRegularExpression(pattern: "\"(\\\\\"|\\\\\\\\|[^\"\\\\])*\"") // string literal, allowing \" and \\ escapes
// this part is nasty because NSRange indices don't map directly to String indices
var range = NSRange(location: 0, length: input.utf16.count)
func readToken(_ regex: NSRegularExpression) -> String? {
guard let match = regex.firstMatch(in: input, options: .anchored, range: range) else {
return nil
}
range.location += match.range.length
range.length -= match.range.length
return (input as NSString).substring(with: match.range)
}
func readToken() -> Token? {
_ = readToken(whitespace) // skip whitespace
if readToken(assign) != nil {
return .assign
}
if readToken(plus) != nil {
return .plus
}
if let name = readToken(identifier) {
switch name {
case "let":
return .let
case "print":
return .print
default:
return .identifier(name)
}
}
let start = range
if let digits = readToken(number), let double = Double(digits) {
return .number(double)
} else {
range = start
}
if let string = readToken(string) {
let unescapedString = String(string.dropFirst().dropLast())
.replacingOccurrences(of: "\\\"", with: "\"")
.replacingOccurrences(of: "\\\\", with: "\\")
return .string(unescapedString)
}
return nil
}
var tokens: [Token] = []
while let token = readToken() {
tokens.append(token)
}
if range.length != 0 {
throw LexerError.unrecognizedInput((input as NSString).substring(with: range))
}
return tokens
}
```
|
```css
th {
color: #337AB7;
}
```
|
```css
/*!
* # Semantic UI 2.2.7 - Search
* path_to_url
*
*
* Released under the MIT license
* path_to_url
*
*/.ui.search{position:relative}.ui.search>.prompt{margin:0;outline:0;-webkit-appearance:none;-webkit-tap-highlight-color:rgba(255,255,255,0);text-shadow:none;font-style:normal;font-weight:400;line-height:1.2142em;padding:.67861429em 1em;font-size:1em;background:#fff;border:1px solid rgba(34,36,38,.15);color:rgba(0,0,0,.87);box-shadow:0 0 0 0 transparent inset;-webkit-transition:background-color .1s ease,color .1s ease,box-shadow .1s ease,border-color .1s ease;transition:background-color .1s ease,color .1s ease,box-shadow .1s ease,border-color .1s ease}.ui.search .prompt{border-radius:500rem}.ui.search .prompt~.search.icon{cursor:pointer}.ui.search>.results{display:none;position:absolute;top:100%;left:0;-webkit-transform-origin:center top;transform-origin:center top;white-space:normal;background:#fff;margin-top:.5em;width:18em;border-radius:.28571429rem;box-shadow:0 2px 4px 0 rgba(34,36,38,.12),0 2px 10px 0 rgba(34,36,38,.15);border:1px solid #d4d4d5;z-index:998}.ui.search>.results>:first-child{border-radius:.28571429rem .28571429rem 0 0}.ui.search>.results>:last-child{border-radius:0 0 .28571429rem .28571429rem}.ui.search>.results .result{cursor:pointer;display:block;overflow:hidden;font-size:1em;padding:.85714286em 1.14285714em;color:rgba(0,0,0,.87);line-height:1.33;border-bottom:1px solid rgba(34,36,38,.1)}.ui.search>.results .result:last-child{border-bottom:none!important}.ui.search>.results .result .image{float:right;overflow:hidden;background:0 0;width:5em;height:3em;border-radius:.25em}.ui.search>.results .result .image img{display:block;width:auto;height:100%}.ui.search>.results .result .image+.content{margin:0 6em 0 0}.ui.search>.results .result .title{margin:-.14285em 0 0;font-family:Lato,'Helvetica Neue',Arial,Helvetica,sans-serif;font-weight:700;font-size:1em;color:rgba(0,0,0,.85)}.ui.search>.results .result .description{margin-top:0;font-size:.92857143em;color:rgba(0,0,0,.4)}.ui.search>.results .result 
.price{float:right;color:#21ba45}.ui.search>.results>.message{padding:1em 1em}.ui.search>.results>.message .header{font-family:Lato,'Helvetica Neue',Arial,Helvetica,sans-serif;font-size:1rem;font-weight:700;color:rgba(0,0,0,.87)}.ui.search>.results>.message .description{margin-top:.25rem;font-size:1em;color:rgba(0,0,0,.87)}.ui.search>.results>.action{display:block;border-top:none;background:#f3f4f5;padding:.92857143em 1em;color:rgba(0,0,0,.87);font-weight:700;text-align:center}.ui.search>.prompt:focus{border-color:rgba(34,36,38,.35);background:#fff;color:rgba(0,0,0,.95)}.ui.loading.search .input>i.icon:before{position:absolute;content:'';top:50%;left:50%;margin:-.64285714em 0 0 -.64285714em;width:1.28571429em;height:1.28571429em;border-radius:500rem;border:.2em solid rgba(0,0,0,.1)}.ui.loading.search .input>i.icon:after{position:absolute;content:'';top:50%;left:50%;margin:-.64285714em 0 0 -.64285714em;width:1.28571429em;height:1.28571429em;-webkit-animation:button-spin .6s linear;animation:button-spin .6s linear;-webkit-animation-iteration-count:infinite;animation-iteration-count:infinite;border-radius:500rem;border-color:#767676 transparent transparent;border-style:solid;border-width:.2em;box-shadow:0 0 0 1px transparent}.ui.category.search>.results .category .result:hover,.ui.search>.results .result:hover{background:#f9fafb}.ui.search .action:hover{background:#e0e0e0}.ui.category.search>.results .category.active{background:#f3f4f5}.ui.category.search>.results .category.active>.name{color:rgba(0,0,0,.87)}.ui.category.search>.results .category .result.active,.ui.search>.results .result.active{position:relative;border-left-color:rgba(34,36,38,.1);background:#f3f4f5;box-shadow:none}.ui.search>.results .result.active .title{color:rgba(0,0,0,.85)}.ui.search>.results .result.active .description{color:rgba(0,0,0,.85)}.ui.search.selection 
.prompt{border-radius:.28571429rem}.ui.search.selection>.icon.input>.remove.icon{pointer-events:none;position:absolute;left:auto;opacity:0;color:'';top:0;right:0;-webkit-transition:color .1s ease,opacity .1s ease;transition:color .1s ease,opacity .1s ease}.ui.search.selection>.icon.input>.active.remove.icon{cursor:pointer;opacity:.8;pointer-events:auto}.ui.search.selection>.icon.input:not([class*="left icon"])>.icon~.remove.icon{right:1.85714em}.ui.search.selection>.icon.input>.remove.icon:hover{opacity:1;color:#db2828}.ui.category.search .results{width:28em}.ui.category.search>.results .category{background:#f3f4f5;box-shadow:none;border-bottom:1px solid rgba(34,36,38,.1);-webkit-transition:background .1s ease,border-color .1s ease;transition:background .1s ease,border-color .1s ease}.ui.category.search>.results .category:last-child{border-bottom:none}.ui.category.search>.results .category:first-child .name+.result{border-radius:0 .28571429rem 0 0}.ui.category.search>.results .category:last-child .result:last-child{border-radius:0 0 .28571429rem 0}.ui.category.search>.results .category .result{background:#fff;margin-left:100px;border-left:1px solid rgba(34,36,38,.15);border-bottom:1px solid rgba(34,36,38,.1);-webkit-transition:background .1s ease,border-color .1s ease;transition:background .1s ease,border-color .1s ease;padding:.85714286em 1.14285714em}.ui.category.search>.results .category:last-child .result:last-child{border-bottom:none}.ui.category.search>.results .category>.name{width:100px;background:0 0;font-family:Lato,'Helvetica Neue',Arial,Helvetica,sans-serif;font-size:1em;float:1em;float:left;padding:.4em 1em;font-weight:700;color:rgba(0,0,0,.4)}.ui[class*="left aligned"].search>.results{right:auto;left:0}.ui[class*="right aligned"].search>.results{right:0;left:auto}.ui.fluid.search 
.results{width:100%}.ui.mini.search{font-size:.78571429em}.ui.small.search{font-size:.92857143em}.ui.search{font-size:1em}.ui.large.search{font-size:1.14285714em}.ui.big.search{font-size:1.28571429em}.ui.huge.search{font-size:1.42857143em}.ui.massive.search{font-size:1.71428571em}
```
|
In chemistry, molality is a measure of the amount of solute in a solution relative to a given mass of solvent. This contrasts with the definition of molarity which is based on a given volume of solution.
A commonly used unit for molality is moles per kilogram (mol/kg). A solution of concentration 1 mol/kg is also sometimes denoted as 1 molal. The unit mol/kg requires that molar mass be expressed in kg/mol, instead of the usual g/mol or kg/kmol.
Definition
The molality, b, of a solution is defined as the amount of substance (in moles) of solute, n_solute, divided by the mass (in kg) of the solvent, m_solvent:
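In symbols, this is the standard defining relation:

```latex
b = \frac{n_\mathrm{solute}}{m_\mathrm{solvent}}
```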
In the case of solutions with more than one solvent, molality can be defined for the mixed solvent considered as a pure pseudo-solvent. Instead of mole solute per kilogram solvent as in the binary case, units are defined as mole solute per kilogram mixed solvent.
Origin
The term molality is formed in analogy to molarity which is the molar concentration of a solution. The earliest known use of the intensive property molality and of its adjectival unit, the now-deprecated molal, appears to have been published by G. N. Lewis and M. Randall in the 1923 publication of Thermodynamics and the Free Energies of Chemical Substances. Though the two terms are subject to being confused with one another, the molality and molarity of a dilute aqueous solution are nearly the same, as one kilogram of water (solvent) occupies the volume of 1 liter at room temperature and a small amount of solute has little effect on the volume.
Unit
The SI unit for molality is moles per kilogram of solvent.
A solution with a molality of 3 mol/kg is often described as "3 molal" or "3 m". However, following the SI system of units, the National Institute of Standards and Technology, the United States authority on measurement, considers the term "molal" and the unit symbol "m" to be obsolete, and suggests mol/kg or a related SI unit.
Usage considerations
Advantages
The primary advantage of using molality as a measure of concentration is that molality only depends on the masses of solute and solvent, which are unaffected by variations in temperature and pressure. In contrast, solutions prepared volumetrically (e.g. molar concentration or mass concentration) are likely to change as temperature and pressure change. In many applications, this is a significant advantage because the mass, or the amount, of a substance is often more important than its volume (e.g. in a limiting reagent problem).
Another advantage of molality is the fact that the molality of one solute in a solution is independent of the presence or absence of other solutes.
Problem areas
Unlike all the other compositional properties listed in the "Relation to other compositional quantities" section (below), molality depends on the choice of the substance to be called the "solvent" in an arbitrary mixture. If there is only one pure liquid substance in a mixture, the choice is clear, but not all solutions are this clear-cut: in an alcohol–water solution, either one could be called the solvent; in an alloy, or solid solution, there is no clear choice and all constituents may be treated alike. In such situations, mass or mole fraction is the preferred compositional specification.
Relation to other compositional quantities
In what follows, the solvent may be given the same treatment as the other constituents of the solution, such that the molality of the solvent of an n-solute solution, say b0, is found to be nothing more than the reciprocal of its molar mass, M0 (expressed as kg/mol):
For the solutes the expression of molalities is similar:
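Written out (standard relations, with n_0 and M_0 the amount and molar mass of the solvent, so that m_solvent = n_0 M_0):

```latex
b_0 = \frac{n_0}{n_0 M_0} = \frac{1}{M_0}, \qquad b_i = \frac{n_i}{n_0 M_0}
```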
The expressions linking molalities to mass fractions and mass concentrations contain the molar masses of the solutes Mi:
Similarly the equalities below are obtained from the definitions of the molalities and of the other compositional quantities.
The mole fraction of the solvent can be obtained from the definition by dividing the numerator and the denominator by the amount of solvent, n0:
Then the sum of ratios of the other mole amounts to the amount of solvent is substituted with expressions from below containing molalities:
giving the result
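Since n_i/n_0 = b_i M_0, the resulting mole fraction of the solvent is:

```latex
x_0 = \frac{1}{1 + M_0 \sum_{i=1}^{n} b_i}
```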
Mass fraction
The conversions to and from the mass fraction, w1, of the solute in a single-solute solution are
where b1 is the molality and M1 is the molar mass of the solute.
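For the single-solute case these standard conversions read (M_1 in kg/mol):

```latex
w_1 = \frac{b_1 M_1}{1 + b_1 M_1}, \qquad b_1 = \frac{w_1}{(1 - w_1)\, M_1}
```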
More generally, for an n-solute/one-solvent solution, letting bi and wi be, respectively, the molality and mass fraction of the i-th solute,
where Mi is the molar mass of the ith solute, and w0 is the mass fraction of the solvent, which is expressible both as a function of the molalities as well as a function of the other mass fractions,
Substitution gives:
Mole fraction
The conversions to and from the mole fraction, x1 mole fraction of the solute in a single-solute solution are
where M0 is the molar mass of the solvent.
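With M_0 in kg/mol, the standard single-solute conversions are:

```latex
x_1 = \frac{b_1 M_0}{1 + b_1 M_0}, \qquad b_1 = \frac{x_1}{M_0\,(1 - x_1)}
```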
More generally, for an n-solute/one-solvent solution, letting xi be the mole fraction of the ith solute,
where x0 is the mole fraction of the solvent, expressible both as a function of the molalities as well as a function of the other mole fractions:
Substitution gives:
Molar concentration (molarity)
The conversions to and from the molar concentration, c1, for one-solute solutions are
where ρ is the mass density of the solution, b1 is the molality, and M1 is the molar mass (in kg/mol) of the solute.
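These single-solute conversions take the standard form:

```latex
c_1 = \frac{\rho\, b_1}{1 + b_1 M_1}, \qquad b_1 = \frac{c_1}{\rho - c_1 M_1}
```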
For solutions with n solutes, the conversions are
where the molar concentration of the solvent c0 is expressible both as a function of the molalities as well as a function of the other molarities:
Substitution gives:
Mass concentration
The conversions to and from the mass concentration, ρsolute, of a single-solute solution are
or
where ρ is the mass density of the solution, b1 is the molality, and M1 is the molar mass of the solute.
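The standard single-solute forms are:

```latex
\rho_1 = \frac{\rho\, b_1 M_1}{1 + b_1 M_1}, \qquad b_1 = \frac{\rho_1}{(\rho - \rho_1)\, M_1}
```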
For the general n-solute solution, the mass concentration of the ith solute, ρi, is related to its molality, bi, as follows:
where the mass concentration of the solvent, ρ0, is expressible both as a function of the molalities as well as a function of the other mass concentrations:
Substitution gives:
Equal ratios
Alternatively, one may use just the last two equations given for the compositional property of the solvent in each of the preceding sections, together with the relationships given below, to derive the remainder of properties in that set:
where i and j are subscripts representing all the constituents, the n solutes plus the solvent.
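Because each of these quantities, within one and the same solution, is proportional to the corresponding amount of substance n_i, all of the ratios are equal:

```latex
\frac{b_i}{b_j} = \frac{x_i}{x_j} = \frac{c_i}{c_j} = \frac{w_i / M_i}{w_j / M_j} = \frac{\rho_i / M_i}{\rho_j / M_j} = \frac{n_i}{n_j}
```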
Example of conversion
An acid mixture consists of 0.76, 0.04, and 0.20 mass fractions of 70% HNO3, 49% HF, and H2O, where the percentages refer to mass fractions of the bottled acids carrying a balance of H2O. The first step is determining the mass fractions of the constituents:
The approximate molar masses in kg/mol are
First derive the molality of the solvent, in mol/kg,
and use that to derive all the others by use of the equal ratios:
Actually, b_H2O cancels out, because it is not needed. In this case, there is a more direct equation, which we use to derive the molality of HF:
The mole fractions may be derived from this result:
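The arithmetic of this worked example can be sketched in Python. The "more direct equation" mentioned above is taken here to be b_i = w_i / (M_i · w_H2O), the single-step form that follows from the definitions; the numeric values are the ones stated in the example:

```python
# Bottled acids in the mixture: (mass fraction of bottle, acid strength of bottle)
bottles = {
    'HNO3': (0.76, 0.70),
    'HF':   (0.04, 0.49),
}
# Approximate molar masses in kg/mol
M = {'HNO3': 0.063, 'HF': 0.020, 'H2O': 0.018}

# Mass fraction of each pure acid in the final mixture
w = {acid: frac * strength for acid, (frac, strength) in bottles.items()}
# Water: the 0.20 added directly, plus the balance of each bottled acid
w['H2O'] = 0.20 + sum(frac * (1 - strength) for frac, strength in bottles.values())

# Molality of each solute: moles of solute per kilogram of water
b = {acid: w[acid] / (M[acid] * w['H2O']) for acid in ('HNO3', 'HF')}

print(b)  # roughly 18.8 mol/kg HNO3 and 2.19 mol/kg HF
```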
Osmolality
Osmolality is a variation of molality that takes into account only solutes that contribute to a solution's osmotic pressure. It is measured in osmoles of the solute per kilogram of water. This unit is frequently used in medical laboratory results in place of osmolarity, because it can be measured simply by depression of the freezing point of a solution, or cryoscopy (see also: osmostat and colligative properties).
Relation to apparent (molar) properties
Molality appears in the expression of the apparent (molar) volume of a solute as a function of the molality b of that solute (and density of the solution and solvent):
For multicomponent systems the relation is slightly modified by the sum of molalities of solutes. A total molality, a mean apparent molar volume, and a mean molar mass of the solutes can also be defined, treating the solutes together as if they were a single solute. In this case the first equality from above is modified with the mean molar mass M of the pseudosolute instead of the molar mass of the single solute:
Here y_i,j are ratios involving the molalities of solutes i, j and the total molality b_T.
The sum of the products of the molalities and the apparent molar volumes of the solutes in their binary solutions equals the product of the sum of the molalities of the solutes and the apparent molar volume in the ternary or multicomponent solution.
Relation to apparent molar properties and activity coefficients
For concentrated ionic solutions the activity coefficient of the electrolyte is split into electric and statistical components.
The statistical part includes the molality b, the hydration index number h, the number of ions from the dissociation, and the ratio r_a between the apparent molar volume of the electrolyte and the molar volume of water.
For a concentrated solution, the statistical part of the activity coefficient is:
Molalities of a ternary or multicomponent solution
The molalities of solutes b_1, b_2 in a ternary solution obtained by mixing two binary aqueous solutions with different solutes (say a sugar and a salt, or two different salts) differ from the initial molalities b_ii of the solutes in their binary solutions.
The solvent content of each solution, as mass fractions w_01 and w_02 of the masses m_s1 and m_s2 to be mixed, is first calculated as a function of the initial molalities. Then the amount (in moles) of solute from each binary solution is divided by the sum of the masses of water after mixing:
The mass fractions of each solute in the initial solutions, w_11 and w_22, are expressed as functions of the initial molalities b_11 and b_22. These expressions of the mass fractions are then substituted into the final molalities.
The results for a ternary solution can be extended to a multicomponent solution (with more than two solutes).
From the molalities of the binary solutions
The molalities of the solutes in a ternary solution can be expressed also from molalities in the binary solutions and their masses:
The binary solution molalities are:
The masses of the solutes determined from the molalities of the solutes and the masses of water can be substituted in the expressions of the masses of solutions:
Similarly for the mass of the second solution:
From here one can obtain the masses of water to be summed in the denominator of the molalities of the solutes in the ternary solution.
Thus the ternary molalities are:
See also
Molarity
References
Chemical properties
Mass-specific quantities
|
Gordon David Henry (August 17, 1926 – October 3, 1972) was a Canadian ice hockey goaltender for the Boston Bruins.
Career
Henry played three regular season games and five playoff games in the National Hockey League with the Boston Bruins between 1949 and 1951. The rest of his career, which lasted from 1943 to 1956, was mainly spent in the Eastern Amateur Hockey League and American Hockey League.
Career statistics
Regular season and playoffs
External links
1926 births
1972 deaths
Baltimore Clippers (1945–49) players
Boston Bruins players
Boston Olympics players
Canadian expatriate ice hockey players in the United States
Canadian ice hockey goaltenders
Charlotte Checkers (EHL) players
Hershey Bears players
Ice hockey people from Ontario
New York Rovers players
Ontario Hockey Association Senior A League (1890–1979) players
Philadelphia Falcons players
Philadelphia Ramblers players
Sportspeople from Owen Sound
|
```python
"""
Tutorial - Multiple methods.
This tutorial shows you how to link to other methods of your request
handler.
"""
import os.path
import cherrypy
class HelloWorld:
"""Hello world app."""
@cherrypy.expose
def index(self):
"""Produce HTTP response body of hello world app index URI."""
# Let's link to another method here.
return 'We have an <a href="show_msg">important message</a> for you!'
@cherrypy.expose
def show_msg(self):
"""Render a "Hello world!" message on ``/show_msg`` URI."""
return 'Hello world!'
tutconf = os.path.join(os.path.dirname(__file__), 'tutorial.conf')
if __name__ == '__main__':
# CherryPy always starts with app.root when trying to map request URIs
# to objects, so we need to mount a request handler root. A request
# to '/' will be mapped to HelloWorld().index().
cherrypy.quickstart(HelloWorld(), config=tutconf)
```
|
Godville is a mobile and desktop browser zero-player role-playing video game developed by Mikhail Platov and Dmitry Kosinov. It was released as a Russian website in 2007 and as a mobile game in English on July 18, 2010. In the game, the player controls a character known as the god, who interacts with a character called the hero. The hero progresses in the video game without interaction with the player's god character. Reception to the game was positive, with the focus on its gameplay.
Gameplay
Godville is a zero-player game, which means it does not require interaction from the player for the game to progress, though it does require some setup. In the game, there is the hero-character, who is a non-player character, and there is the god-character, who is played by the player. The hero is a religious fanatic who uses a diary to communicate with the god, and occasionally needs a sign of the god's existence; the player uses the god-character to influence the hero positively or negatively using rewards and punishments, and sometimes direct communication. The hero can have a pet companion. After a period of time playing the game, the game enables the player to review the most-important events the hero has participated in since the last time the player checked the game.
The game is also a role-playing game, meaning the hero will wander his world, defeat monsters, find and use treasure and items, and sometimes lose to monsters and unfriendly non-player characters. The player names the hero. Over time, the hero levels up and learns special skills, and has his own personality as a result of his adventuring, as well as his interaction with the god. Occasionally, the hero will be philosophical. The game provides some items with enhanced abilities which the hero can use only with the god's involvement; the hero will sell these items even if they do "have some marvelous effect". Limited player-vs-player interaction is provided as the god can have the hero duel other heroes. The hero who wins takes some coins from the losing hero as well as some of losing hero's items. The god can somewhat influence these duels, but sometimes the god's attempt aids the opposing god's hero instead of his own.
The game was free-to-play at release and connected to the Internet. The game was not supported by ads at release and the developers were supported by their full-time jobs. Once their characters reach level 10, players can suggest updates to the game, which are then voted upon by the community of players for subsequent inclusion. Most of the phrases in the game are made and selected by the community.
Development and release
The game was developed by Dmitry Kosinov and Mikhail Platov, where Kosinov focused on technical issues and Platov the gameplay issues. They attribute their community of players as co-designers, from which the designers took feedback on direction and feature set.
The designers found Progress Quest in 2003, which they say they "immediately loved" and which Edge called a "clear progeny". Besides Progress Quest, the game was inspired by Terry Pratchett's Small Gods. It is named Godville because the name sounded good in both Russian and English and because it was appropriate for the game. Farmville did not exist then. From 2003 to 2007, they worked on the ideas for the game in their spare time, including a prototype version in English. The web browser version eventually released in 2007 was in Russian because it was their native language, which would allow them to write better jokes. The developers self-describe that version as "barely-playable", but they continued developing the game another three years. Knowledge of the Russian version spread by word of mouth, eventually having thousands of players. This version had no graphics or sound in 2010.
In early 2010 the developers started work on the English version, due to demand from Russian-speakers who wanted to show it to their English-speaking friends and because the developers wanted to work on an English version again. Russian-speakers helped with the initial translation. The English version of the website was not fully-featured in 2010, but new content was added daily. The developers released the game for iPhone and iPod in July 2010, and by August it had over 20,000 downloads. Based on the user response, the developers released an iPad version with the 2nd version of the iPhone app in September. The developers subsequently released an Android version in March 2011, expanded Web browser access to the English version in April 2011, released a Windows Phone version in July 2013, and an Apple Watch version in 2015.
Reception
On its English release, Eli Hodapp, editor-in-chief of TouchArcade, said that the game "sounds a little stupid" but that "it's surprisingly amusing without needing to actually do anything at all". Jim Sterling, writing for Destructoid, called Godville "a fun, funny, incredibly clever little game"; she later added in GamesRadar that it was "one of the most compelling, engaging, and addictive little bits of software out there". In 2012, Edge Online called the game "darkly rewarding in its meaningless levelling and incessant battles even before you take into account the smart writing", and was similarly addicted to "the promises of numbers that get larger and larger over time". In 2014, The New York Times said the game "has a wickedly funny side, and it will light up your imagination." Multiple reviewers identified the game as satirizing religion, the role-playing game genre, the massively-multiplayer online game genre, and video games in general; the developers also included "internet memes and ordinary day to day things".
Hodapp said that the original adventures the character went on were repetitive, but that the developers had implemented a number of excellent community suggestions within the first month. The lack of control over the hero was appealing. He originally thought the game was novel and didn't expect it to last, but was surprised that he was still interested in the game years later, and likens checking the game to checking his email or Twitter feed. GamePro identified typos in the content, likely due to the developers not speaking English natively, but that community suggestions continued to help improve the game. Bogost was unimpressed with the monetization and the use of god powers to influence the game, and was disappointed that the god could interact at all, but suggested that some interaction was necessary for Godville to be a game rather than a reading work.
NDTV compared the game favorably to Godus, another video game in the genre. Due to the idle game genre's "ease of access", Godville is cited as one game providing for "a variety of player preferences". Reviewers approved of the game on mobile platforms, including iPod Touch, iPhone, Apple Watch, and Android. It was also included in Mashable's "11 Facebook Games You're Embarrassed to Admit You Play" list.
References
Further reading
External links
2010 video games
Android (operating system) games
God games
IOS games
Browser games
Video games developed in Russia
Windows Phone games
|
Esterina is a 1959 Italian drama film directed by Carlo Lizzani. It was entered into the main competition at the 20th Venice International Film Festival, in which Carla Gravina received a special mention for her performance.
Cast
Carla Gravina: Esterina
Geoffrey Horne: Gino
Domenico Modugno: Piero
Anna Maria Aveta: Piero's Wife
Silvana Jachino: Landlady
Laura Nucci: Hooker
Raimondo Van Riel: Old man
References
External links
1959 films
Films directed by Carlo Lizzani
Films scored by Carlo Rustichelli
Italian drama films
1950s Italian films
|
Christopher Blake (born 13 June 1953) is an Australian archer. He competed at the 1984 Summer Olympics and the 1988 Summer Olympics.
References
1953 births
Living people
Australian male archers
Olympic archers for Australia
Archers at the 1984 Summer Olympics
Archers at the 1988 Summer Olympics
Place of birth missing (living people)
|
An igloo (Inuit languages: , Inuktitut syllabics (plural: )), also known as a snow house or snow hut, is a type of shelter built of suitable snow.
Although igloos are often associated with all Inuit, they were traditionally used only by the people of Canada's Central Arctic and the Qaanaaq area of Greenland. Other Inuit tended to use snow to insulate their houses, which were constructed from whalebone and hides.
Snow is used because the air pockets trapped in it make it an insulator. On the outside, temperatures may be as low as , but on the inside, the temperature may range from when warmed by body heat alone.
Nomenclature
In the Inuit languages the word (plural ) can be used for a house or home built of any material. The word is not restricted exclusively to snowhouses (called specifically , plural ), but includes traditional tents, sod houses, homes constructed of driftwood and modern buildings. Outside Inuit culture, however, igloo refers exclusively to shelters constructed from blocks of compacted snow, generally in the form of a dome.
Several Inuit language dialects throughout the Canadian Arctic (Siglitun, Inuinnaqtun, Natsilingmiutut, Kivalliq, North Baffin) use for all buildings, including snowhouses, and it is the term used by the Government of Nunavut. An exception to this is the dialect used in the Igloolik region of Nunavut. is used for other buildings, while , (plural , Inuktitut syllabics: ) is specifically used for a snowhouse.
Types
There are three traditional types of igloos. Each has a different size from small to large and is used for a different purpose.
The smallest-sized igloos are constructed as temporary shelters. They are usually used for one or two nights, and therefore, are easier to build. On rare occasions these small types of igloos are used during hunting trips which are often on open sea ice.
Construction
Snow igloos are not spherical, but are built in a catenary, a curved shape more closely resembling a paraboloid. Using this shape, the stresses of snow as it ages and compresses are less likely to cause it to buckle because in an inverted paraboloid or catenoid the pressures are nearer to being exclusively compressive.
The individual snow bricks are cut from the ground with saws and machete-like blades. They are initially cut in a four-sided shape, but the bricks are often then trimmed into five- or six-sided shapes to increase structural interlocking, similar to the stones used in the Inca Empire.
Igloos gradually become shorter with time due to the compressive creep of the snow.
Building methods
The snow used to build an igloo must have enough structural strength to be cut and stacked appropriately. The best snow to use for this purpose is snow which has been blown by wind, which can serve to compact and interlock the ice crystals. Snow that has settled gently to the ground in still weather is not useful. The hole left in the snow, where the blocks are cut, is usually used as the lower half of the shelter.
Snow's insulating properties enable the inside of the igloo to remain relatively warm. In some cases, a single block of clear freshwater ice is inserted to allow light into the igloo. Igloos used as winter shelters had beds made of loose snow, skins, and caribou furs. Sometimes, a short tunnel is constructed at the entrance, to reduce wind and heat loss when the door is opened. Animal skins or a snow block can be used as a door.
The igloo is architecturally unique in that it is a dome that can be raised out of independent blocks leaning on each other and polished to fit without an additional supporting structure during construction. An igloo that is built correctly will support the weight of a person standing on the roof.
Traditionally, an igloo might be deliberately consolidated immediately after construction by making a large flame with a (, stone lamp), briefly making the interior very hot, which causes the walls to melt slightly and settle. Body heat is also adequate, although slower. This melting and refreezing builds up a layer of ice that contributes to the strength of the igloo.
The sleeping platform is a raised area. Because warmer air rises and cooler air settles, the entrance area acts as a cold trap whereas the sleeping area will hold whatever heat is generated by a stove, lamp, body heat, or other device. The Central Inuit, especially those around the Davis Strait, lined the living area with skin, which could increase the temperature within from around to .
See also
Glacier cave – a natural hollow space within a glacier
Quinzhee – a shelter made by hollowing out a pile of settled snow
Snow cave – a shelter constructed in snow
Snow fort – a usually open-topped temporary structure made of snow walls that is usually used for recreational purposes
Vernacular architecture – a category of architecture based on local needs, construction materials and reflecting local traditions
References
Further reading
Richard Guy Condon, Julia Ogina and the Holman Elders, The Northern Copper Inuit ()
Igloo – the Traditional Arctic Snow Dome
An article on igloos from The Canadian Encyclopedia
Watch How to Build an Igloo (National Film Board of Canada)
Field Manual for the U.S. Antarctic Program, Chapter 11: "Snow Shelters", pp. 140-145
Traditional Dwellings: Igloos (1) (Interview; Library and Archives Canada)
(a Norwegian observer's account of the building of a family's winter igloo, not a short-term hunting one, by Atikleura and Nalungia, Netsilik Inuit)
External links
How to Build an Igloo (wikiHow)
Buildings and structures made of snow or ice
Indigenous architecture
House types
Snow
Inuit culture
Greenlandic culture
Igloo
Native American architecture
Huts
Traditional Native American dwellings
|
WASP-44b is a closely orbiting Jupiter-sized planet found in the orbit of the sunlike star WASP-44 by the SuperWASP program, which searches for transiting planets that cross in front of their host stars as seen from Earth. The planet was confirmed after follow-up radial velocity observations, and another telescope at the same observatory later detected WASP-44b transiting its star. The planet completes an orbit around its star every two and a half days, at roughly 0.03 AU from its host star. WASP-44b's discovery was reported by the Royal Astronomical Society in May 2011.
Discovery
Using the WASP-South station at the South African Astronomical Observatory, the SuperWASP project searched the night sky for potential planets that transited, or crossed in front of, their host stars at a roughly periodic rate. WASP-44 was among the candidates identified as a possible host to a transit event. WASP-44's reclassification as a potential planetary host came about after WASP-South scanned the Cetus constellation between July and November 2009. In combination with later observations using both WASP-South and SuperWASP-North in the Canary Islands, a total of 15,755 photometric measurements were collected. A later set of observations between August and November 2010 produced a 6,000-point photometric data set, but that light curve was prepared late and was not considered in the discovery paper. The star was observed at the same time as the stars WASP-45 and WASP-46.
In 2010, a European team of astronomers used the CORALIE spectrograph on the 1.2 m Leonhard Euler Telescope at Chile's La Silla Observatory to collect radial velocity measurements of WASP-44. The planet WASP-44b was confirmed after analysis of the results ruled out spectroscopic binary stars, leaving a transiting planet as the most likely cause of the radial velocity variations.
The Euler telescope was also used to observe WASP-44b as it transited its host star. For 4.2 hours on September 14, 2010, Euler monitored WASP-44 in search of the slight dimming in brightness caused by a transit, yielding a more precise light curve. Accounting for all the data collected up to that point, analysis yielded the planet's characteristics.
The discovery of WASP-44b, along with those of WASP-45b and WASP-46b, was reported on May 16, 2011 by the Royal Astronomical Society. The scientists who worked on the paper discussed the role of orbital eccentricity (how elliptical an orbit is) and how poorly constrained it tends to be among Hot Jupiters, for which a circular orbit is usually assumed. They used the three newly discovered planets as case studies for choosing between a non-eccentric, circular model of a planet's orbit (the most likely solution) and an eccentric, elliptical model (the solution that, according to the discovery team, required fewer assumptions).
Host star
WASP-44 is a sunlike G-type star in the Cetus constellation. WASP-44 has a mass of 0.951 solar masses and a radius of 0.927 solar radii, which means that WASP-44 is 95% the mass of and 92% the size of the Sun. With an effective temperature of 5410 K, WASP-44 is cooler than the Sun, although it is richer in iron, with a measured metallicity of [Fe/H] = 0.06 (1.15 times the amount of iron found in the Sun). The star is an estimated 900 million years old, although this age is uncertain, as error bars are large. Based on its spectrum, WASP-44 is not active in its chromosphere (outer layer). The star was also not found to demonstrate a high rate of rotation.
With an apparent magnitude of 12.9, WASP-44 cannot be seen with the unaided eye from Earth.
Characteristics
WASP-44b is a Hot Jupiter with a mass of 0.889 times Jupiter's mass and a radius of 1.002 times that of Jupiter. Although less massive than Jupiter, the planet is bloated to a greater size because its proximity to its host star heats it, a common effect in such closely orbiting gas giants. WASP-44b orbits at a mean distance of 0.03473 AU, which is about 3% of the distance between the Earth and Sun. An orbit is completed every 2.4238039 days (58.171 hours).
WASP-44b has an orbital inclination of 86.02°, which is almost edge-on as seen from Earth.
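The stellar and orbital figures quoted above are mutually consistent, which can be checked with Kepler's third law. A quick sketch (constants rounded; this is an illustration, not part of the discovery analysis):

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
AU = 1.496e11      # astronomical unit, m
M_SUN = 1.989e30   # solar mass, kg

a = 0.03473 * AU          # semi-major axis of WASP-44b, m
T = 2.4238039 * 86400.0   # orbital period of WASP-44b, s

# Kepler's third law (planet mass neglected): M_star = 4*pi^2*a^3 / (G*T^2)
M_star = 4 * math.pi ** 2 * a ** 3 / (G * T ** 2)
print(M_star / M_SUN)     # ~0.95, matching the quoted 0.951 solar masses

# Metallicity is logarithmic: [Fe/H] = 0.06 dex corresponds to 10**0.06
print(10 ** 0.06)         # ~1.15 times the Sun's iron abundance
```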
References
Hot Jupiters
Cetus
Transiting exoplanets
Exoplanets discovered in 2011
Exoplanets discovered by WASP
|
```java
/*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
*/
package org.activiti.engine.impl;
import java.util.HashMap;
import java.util.Map;
/**
* Typesafe-enum-style sort direction ("asc"/"desc"), with lookup by name.
*
* @author Tom Baeyens
*/
public class Direction {
private static final Map<String, Direction> directions = new HashMap<>();
public static final Direction ASCENDING = new Direction("asc");
public static final Direction DESCENDING = new Direction("desc");
private String name;
public Direction(String name) {
this.name = name;
directions.put(name, this);
}
public String getName() {
return name;
}
public static Direction findByName(String directionName) {
return directions.get(directionName);
}
}
```
|
Nepal A.P.F. Club (), commonly known as APF Club, is a professional sports club based in Kathmandu, Bagmati Province, Nepal. The club is the sports-wing of the Armed Police Force.
History
The APF Club was established on 24 October 2001, following the tradition of forming a sports club within an official body, as the police and army had already done. Initially, the club was named Gyanendra APF Club, in tribute to the reigning king at the time.
Later, the government made the decision to remove all references to royalty from the names of government clubs, so the team changed their name to APF Club.
Cricket
APF is one of the three departmental teams to play in National League Cricket and the Prime Minister One Day Cup. Nine regional teams, along with the Nepal Army Club, compete with APF in the league. APF formed a cricket team in September 2010, initially signing national team captain Paras Khadka and 15 other players, while former national team player Raju Basnyat was appointed as coach. APF is a two-time champion in the one-day format of the league, winning in 2011 and 2012. In the 2012 tournament, the APF cricket team defended its title by defeating Region-4 Bhairawa by six wickets in the final, held on 21 December 2012. In 2012, APF also won the Twenty20 format of the league for the first time.
Record
{| class="wikitable" style="text-align:center"
! rowspan="2" |Season
! colspan="2" |One-Day
! colspan="2" |Twenty20
|-
!Teams
!Position
!Teams
!Position
|-
|2011
|10
|style="background:gold;"|Winner
|10
|style="background:silver;"|Runners-up
|-
|2012
|9
|style="background:gold;"|Winner
|9
|style="background:gold;"|Winner
|-
|2013
| colspan="2" |Not held
|10
|style="background:silver;"|Runners-up
|-
|2014
|9
|style="background:silver;"|Runners-up
| colspan="2" rowspan="5" |Not held
|-
|2015
|10
|style="background:silver;"|Runners-up
|-
|2017
|8
|bgcolor=#deb678|Semi-Finals
|-
|2018
|10
|style="background:gold;"|Winner
|-
|2019
|10
|bgcolor=#deb678|Semi-Finals
|-
|2021
|10
|style="background:silver;"|Runners-up
|TBD
|TBD
|}
Squad
Football
At its first appearance in the Martyr's Memorial A-Division League of season 2005–06, the club finished the season in fifth position, behind MMC, Three Star, Tribhuvan Army Club and the Nepal Police Club.
The club remained in fifth position in the 2006–07 season.
At the end of the 2009–10 season, the APF Club barely escaped relegation, staying in the league only by defeating Machchindra Club 8–0 in the last match of the season. At the end of all 22 matches, APF finished in 10th position.
Record by season
Current squad (last updated 28 October 2021)
References
External links
Match History of Armed Police Force Club Nepal
Armed Police Force Club website
The Recreational Sport Soccer Statistics Foundation
SoccerAge Nepal
Soccer in Nepal on BlogSpot
Football clubs in Nepal
2001 establishments in Nepal
Cricket teams in Nepal
|
```yaml
---
fixes:
- |
Fix collection of host tags pulled from GCP project (``project:`` and ``numeric_project_id:`` tags)
and GCP instance attributes.
```
|
```java
package easymvp.compiler.generator.decorator;
import com.squareup.javapoet.ClassName;
import com.squareup.javapoet.MethodSpec;
import easymvp.compiler.generator.DelegateClassGenerator;
import static easymvp.compiler.generator.AndroidLoaderUtils.getLoader;
import static easymvp.compiler.generator.AndroidLoaderUtils.getLoaderCallbacks;
import static easymvp.compiler.generator.AndroidLoaderUtils.getLoaderManager;
import static easymvp.compiler.generator.AndroidLoaderUtils.getPresenterLoader;
/**
* @author Saeed Masoumi (s-masoumi@live.com)
*/
public class ConductorControllerDecorator extends BaseDecorator {
public ConductorControllerDecorator(DelegateClassGenerator delegateClassGenerator) {
super(delegateClassGenerator);
}
@Override
public MethodSpec getLoaderManagerMethod(MethodSpec.Builder methodSignature) {
return methodSignature.addStatement("return view.getActivity().getLoaderManager()")
.returns(getLoaderManager())
.build();
}
@Override
public String createContextField(String viewField) {
return "final $T context = " + viewField + ".getActivity().getApplicationContext()";
}
@Override
protected void implementInitializer(MethodSpec.Builder method) {
// Conductor controllers require no additional initialization code.
}
@Override
protected String addStatementInOnDestroyMethod() {
return "if (view.getActivity() == null) return;\n"
+ "getLoaderManager(view).destroyLoader(loaderId)";
}
@Override
protected ClassName getPresenterLoaderClass() {
return getPresenterLoader();
}
@Override
protected ClassName getLoaderCallbacksClass() {
return getLoaderCallbacks();
}
@Override
protected ClassName getLoaderClass() {
return getLoader();
}
}
```
|
```typescript
import i18next from 'i18next';
import {
TRANSLATION_BG_BG,
TRANSLATION_DE_DE,
TRANSLATION_EN_US,
TRANSLATION_ES_ES,
TRANSLATION_FR_FR,
TRANSLATION_HU_HU,
TRANSLATION_IT_IT,
TRANSLATION_JA_JP,
TRANSLATION_KO_KR,
TRANSLATION_NL_NL,
TRANSLATION_PL_PL,
TRANSLATION_PT_BR,
TRANSLATION_RU_RU,
TRANSLATION_SK_SK,
TRANSLATION_ZH_CN,
TRANSLATION_ZH_TW,
TRANSLATION_KA_GE
} from '../../locales';
class I18nEngine {
private static instance: I18nEngine;
private constructor() {}
public static getInstance() {
if (!I18nEngine.instance) {
I18nEngine.instance = new I18nEngine();
}
return I18nEngine.instance;
}
private availablesLanguages = {
'bg-BG': 'bg-BG',
'de-DE': 'de-DE',
'en-US': 'en-US',
'es-ES': 'es-ES',
'fr-FR': 'fr-FR',
'hu-HU': 'hu-HU',
'it-IT': 'it-IT',
'ja-JP': 'ja-JP',
'ka-GE': 'ka-GE',
'ko-KR': 'ko-KR',
'nl-NL': 'nl-NL',
'pl-PL': 'pl-PL',
'pt-BR': 'pt-BR',
'ru-RU': 'ru-RU',
'sk-SK': 'sk-SK',
'zh-CN': 'zh-CN',
'zh-TW': 'zh-TW'
};
public fallbackLanguage = 'en-US';
public init(language: string) {
i18next.init({
lng: language,
fallbackLng: this.fallbackLanguage,
interpolation: {
skipOnVariables: false
}
});
i18next.addResources('bg-BG', 'translation', TRANSLATION_BG_BG); // was imported but never registered
i18next.addResources('de-DE', 'translation', TRANSLATION_DE_DE);
i18next.addResources('en-US', 'translation', TRANSLATION_EN_US);
i18next.addResources('es-ES', 'translation', TRANSLATION_ES_ES);
i18next.addResources('fr-FR', 'translation', TRANSLATION_FR_FR);
i18next.addResources('hu-HU', 'translation', TRANSLATION_HU_HU);
i18next.addResources('it-IT', 'translation', TRANSLATION_IT_IT);
i18next.addResources('ja-JP', 'translation', TRANSLATION_JA_JP);
i18next.addResources('ka-GE', 'translation', TRANSLATION_KA_GE);
i18next.addResources('ko-KR', 'translation', TRANSLATION_KO_KR);
i18next.addResources('nl-NL', 'translation', TRANSLATION_NL_NL);
i18next.addResources('pl-PL', 'translation', TRANSLATION_PL_PL);
i18next.addResources('pt-BR', 'translation', TRANSLATION_PT_BR);
i18next.addResources('ru-RU', 'translation', TRANSLATION_RU_RU);
i18next.addResources('sk-SK', 'translation', TRANSLATION_SK_SK);
i18next.addResources('zh-CN', 'translation', TRANSLATION_ZH_CN);
i18next.addResources('zh-TW', 'translation', TRANSLATION_ZH_TW);
}
public translate(key: string): string {
return i18next.t(key);
}
public exists(key: string): boolean {
return i18next.exists(key);
}
public supportLanguage(language: string): boolean {
return typeof this.availablesLanguages[language] !== 'undefined';
}
}
export default I18nEngine.getInstance();
```
|
Alamjeet Kaur Chauhan is an Indian lawyer and former model. She was crowned Femina Miss India in 1978.
Early life and career
She was born in Punjab in 1955. In 1978, she entered the Femina Miss India contest and won the title. She was also declared the winner of the Miss Beautiful Smile sub-award at the pageant. She represented India at the Miss Universe 1978 pageant, where she won the Best National Costume Award.
After completing her one-year tenure with Femina Miss India she returned to pursue her career as a lawyer.
References
1955 births
Living people
Femina Miss India winners
Indian beauty pageant winners
Miss Universe 1978 contestants
Female models from Punjab, India
20th-century Indian lawyers
20th-century Indian women lawyers
21st-century Indian lawyers
21st-century Indian women lawyers
|
Pain stimulus is a technique used by medical personnel for assessing the consciousness level of a person who is not responding to normal interaction, voice commands or gentle physical stimuli (such as shaking of the shoulders). It forms one part of a number of neurological assessments, including the first aid based AVPU scale and the more medically based Glasgow Coma Scale.
The objective of pain stimulus is to assess the level of consciousness of the patient by inducing vocalisation in an acceptable, consistent and replicable manner, and to this end, there are a limited number of techniques which are normally considered acceptable.
The pain stimulus can be applied centrally and/or peripherally, and there are benefits and drawbacks to each type of stimulus, depending on the type of patient and the response being assessed.
Central stimuli
A central stimulus is one that can only elicit a response if the brain is involved in processing the pain (as opposed to peripheral stimuli, which can produce a response purely through spinal reflex). The four commonly used central pain stimuli are:
the trapezius squeeze - which involves gripping and twisting a portion of the trapezius muscle in the patient's shoulder
mandibular pressure - this is the manual stimulation of the mandibular nerve, located within the angle of the jaw
supraorbital pressure - this is the manual stimulation of the supraorbital nerve by pressing a thumb into the indentation above the eye, near the nose.
sternal rub - this involves creating a turning pressure (akin to a grinding motion with a pestle and mortar) on the patient's sternum
Ambulance trusts within the UK now recommend against the mandibular pressure, supraorbital pressure and sternal rub methods, as these can be very painful and leave marks on the body. Furthermore, the sternal rub can also be construed as assault.
Central stimuli should always be used when attempting to assess whether the patient is localising to pain (i.e. moving their arms to the site where the pain is being applied). However, it has been suggested that central stimuli are less suitable than peripheral stimuli for the assessment of eye opening, as they can cause grimacing. There is also a statistical argument against central pain stimuli, particularly with regard to the GCS: depending on the patient's eye response, the total score, and thus the assessed severity of the patient's condition, can be altered, with varying prognostic accuracy.
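To make the statistical point concrete: the Glasgow Coma Scale sums three components (eye opening 1-4, verbal 1-5, motor 1-6, total 3-15), so a single point lost on the eye response because of grimacing can shift a patient across a severity band. A minimal sketch using the commonly published bands (the patient values are hypothetical):

```python
def gcs_total(eye: int, verbal: int, motor: int) -> int:
    """Sum the three Glasgow Coma Scale components (E 1-4, V 1-5, M 1-6)."""
    if not (1 <= eye <= 4 and 1 <= verbal <= 5 and 1 <= motor <= 6):
        raise ValueError("component out of range")
    return eye + verbal + motor

def severity(total: int) -> str:
    """Commonly used GCS severity bands: 3-8 severe, 9-12 moderate, 13-15 mild."""
    if total <= 8:
        return "severe"
    if total <= 12:
        return "moderate"
    return "mild"

# Hypothetical patient whose grimacing keeps the eyes shut during a central stimulus
print(severity(gcs_total(eye=1, verbal=2, motor=5)))   # severe (total 8)
# Same patient if eye opening to pain is successfully elicited
print(severity(gcs_total(eye=2, verbal=2, motor=5)))   # moderate (total 9)
```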
If the patient reacts to the central pain stimulus normally, then a peripheral stimulus is unlikely to be required, unless there is suspicion of localised paresthesia or paralysis in a particular limb.
Central stimuli are likely to have to be applied for at least 15 and potentially up to 30 seconds in order for the clinician to accurately assess their efficacy.
The various acceptable central stimuli have been criticised or deemed suboptimal for various reasons. For instance, the sternal rub may leave bruising (especially on fair skinned patients) and for this reason is discouraged by some.
It has been claimed that supraorbital pressure and trapezius squeeze are more effective than the sternal rub or peripheral stimulation, but sternal rub remains the most common.
Supraorbital and mandibular pressure may not be suitable for patients with head injuries, or those with periorbital swelling.
Peripheral stimuli
Peripheral stimuli are generally applied to the limbs, and a common technique is squeezing the lunula area of the finger or toe nail, often with an adjunct such as a pen. Like the sternal rub, though, this can cause bruising, and is recommended against, in favour of squeezing the side of the finger.
References
External links
Glasgow Coma Scale
Emergency medicine
Intensive care medicine
Neuropsychological tests
|
The Gemma Factor is a BBC Three sitcom starring Anna Gilthorpe, Claire King and Gwyneth Powell. The series premiered on Tuesday 9 March 2010, and has six episodes.
Overview
The series is set in a town called Lumb in West Yorkshire, with some scenes set in Halifax, and follows Gemma Collinge, who "wants to be famous by the time she turns 21". It was filmed around Hebden Bridge and Heptonstall.
References
External links
BBC television sitcoms
BBC high definition shows
2010 British television series debuts
2010 British television series endings
2010s British sitcoms
English-language television shows
|
```csharp
/****************************************************************************
*
* path_to_url
* path_to_url
* path_to_url
* path_to_url
****************************************************************************/
#if UNITY_EDITOR
using System.Collections.Generic;
using System.IO;
using System.Text.RegularExpressions;
using UnityEditor;
using UnityEngine;
using UnityEngine.Networking;
namespace QFramework
{
public class MDHandlerImages
{
public string CurrentPath;
Texture mPlaceholder = null;
List<ImageRequest> mActiveRequests = new List<ImageRequest>();
Dictionary<string, Texture> mTextureCache = new Dictionary<string, Texture>();
List<AnimatedTexture> mAnimatedTextures = new List<AnimatedTexture>();
class AnimatedTexture
{
public string URL = string.Empty;
public int CurrentFrame = 0;
public double FrameTime = 0.0f;
public List<Texture2D> Textures = new List<Texture2D>();
public List<float> Times = new List<float>();
public AnimatedTexture(string url)
{
URL = url;
FrameTime = EditorApplication.timeSinceStartup;
}
public void Add(Texture2D tex, float delay)
{
Textures.Add(tex);
Times.Add(delay);
}
public bool Update()
{
var span = EditorApplication.timeSinceStartup - FrameTime;
if (span < Times[CurrentFrame])
{
return false;
}
FrameTime = EditorApplication.timeSinceStartup;
CurrentFrame = (CurrentFrame + 1) % Textures.Count;
return true;
}
}
class ImageRequest
{
public string URL; // original url
public UnityWebRequest Request;
public bool IsGif;
public ImageRequest(string url)
{
URL = url;
if (url.EndsWith(".gif", System.StringComparison.OrdinalIgnoreCase))
{
IsGif = true;
Request = UnityWebRequest.Get(url);
}
else
{
IsGif = false;
Request = UnityWebRequestTexture.GetTexture(url);
}
Request.SendWebRequest();
}
public AnimatedTexture GetAnimatedTexture()
{
var decoder = new Decoder(Request.downloadHandler.data);
var img = decoder.NextImage();
var anim = new AnimatedTexture(URL);
while (img != null)
{
anim.Add(img.CreateTexture(), img.Delay / 1000.0f);
img = decoder.NextImage();
}
return anim;
}
public Texture GetTexture()
{
var handler = Request.downloadHandler as DownloadHandlerTexture;
return handler?.texture;
}
}
// Remaps relative markdown image URLs to absolute file:// URLs.
private string RemapURL(string url)
{
if (Regex.IsMatch(url, @"^\w+:", RegexOptions.Singleline))
{
return url;
}
var projectDir = Path.GetDirectoryName(Application.dataPath);
if (url.StartsWith("/"))
{
return $"file:///{projectDir}{url}";
}
var assetDir = Path.GetDirectoryName(CurrentPath);
return "file:///" + MDUtils.PathNormalise(string.Format("{0}/{1}/{2}", projectDir, assetDir, url));
}
// Returns a cached texture for the URL, or a placeholder while the download is in flight.
public Texture FetchImage(string url)
{
url = RemapURL(url);
Texture tex;
if (mTextureCache.TryGetValue(url, out tex))
{
return tex;
}
if (mPlaceholder == null)
{
var style = GUI.skin.GetStyle("btnPlaceholder");
mPlaceholder = style != null ? style.normal.background : null;
}
mActiveRequests.Add(new ImageRequest(url));
mTextureCache[url] = mPlaceholder;
return mPlaceholder;
}
// Processes any completed download, caching the resulting texture (or null on error).
public bool UpdateRequests()
{
var req = mActiveRequests.Find(r => r.Request.isDone);
if (req == null)
{
return false;
}
#if UNITY_2020_2_OR_NEWER
if( req.Request.result == UnityWebRequest.Result.ProtocolError )
#else
if (req.Request.isHttpError)
#endif
{
Debug.LogError(string.Format("HTTP Error: {0} - {1} {2}", req.URL, req.Request.responseCode,
req.Request.error));
mTextureCache[req.URL] = null;
}
#if UNITY_2020_2_OR_NEWER
else if( req.Request.result == UnityWebRequest.Result.ConnectionError )
#else
else if (req.Request.isNetworkError)
#endif
{
Debug.LogError(string.Format("Network Error: {0} - {1}", req.URL, req.Request.error));
mTextureCache[req.URL] = null;
}
else if (req.IsGif)
{
var anim = req.GetAnimatedTexture();
if (anim != null && anim.Textures.Count > 0)
{
mTextureCache[req.URL] = anim.Textures[0];
if (anim.Textures.Count > 1)
{
mAnimatedTextures.Add(anim);
}
}
}
else
{
mTextureCache[req.URL] = req.GetTexture();
}
mActiveRequests.Remove(req);
return true;
}
// Advances animated GIF frames; returns true if any cached texture changed.
public bool UpdateAnimations()
{
var update = false;
foreach (var anim in mAnimatedTextures)
{
if (anim.Update())
{
mTextureCache[anim.URL] = anim.Textures[anim.CurrentFrame];
update = true;
}
}
return update;
}
// Per-frame tick: returns true if the editor view should repaint.
public bool Update()
{
return UpdateRequests() || UpdateAnimations();
}
}
}
#endif
```
|
```java
/*
This file is part of the iText (R) project.
Authors: Apryse Software.
This program is offered under a commercial and under the AGPL license.
For commercial licensing, contact us at path_to_url For AGPL licensing, see below.
AGPL licensing:
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU Affero General Public License as
published by the Free Software Foundation, either version 3 of the
License, or (at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU Affero General Public License for more details.
You should have received a copy of the GNU Affero General Public License
along with this program. If not, see <path_to_url
*/
package com.itextpdf.kernel.events;
import com.itextpdf.kernel.pdf.PdfDocument;
import com.itextpdf.kernel.pdf.PdfPage;
/**
* Event dispatched by PdfDocument.
*/
public class PdfDocumentEvent extends Event {
/**
* Dispatched after page is created.
*/
public static final String START_PAGE = "StartPdfPage";
/**
* Dispatched after page is inserted/added into a document.
*/
public static final String INSERT_PAGE = "InsertPdfPage";
/**
* Dispatched after page is removed from a document.
*/
public static final String REMOVE_PAGE = "RemovePdfPage";
/**
* Dispatched before page is flushed to a document.
* This event isn't necessarily dispatched when a successive page has been created.
* Keep this in mind when using the high-level iText API.
*/
public static final String END_PAGE = "EndPdfPage";
/**
* The PdfPage associated with this event.
*/
protected PdfPage page;
/**
* The PdfDocument associated with this event.
*/
private PdfDocument document;
/**
* Creates a PdfDocumentEvent.
*
* @param type type of the event that fired this event
* @param document document that fired this event
*/
public PdfDocumentEvent(String type, PdfDocument document) {
super(type);
this.document = document;
}
/**
* Creates a PdfDocumentEvent.
*
* @param type type of the event that fired this event
* @param page page that fired this event
*/
public PdfDocumentEvent(String type, PdfPage page) {
super(type);
this.page = page;
this.document = page.getDocument();
}
/**
* Returns the PdfDocument associated with this event.
*
* @return the PdfDocument associated with this event
*/
public PdfDocument getDocument() {
return document;
}
/**
* Returns the PdfPage associated with this event. Warning: this can be null.
*
* @return the PdfPage associated with this event
*/
public PdfPage getPage() {
return page;
}
}
```
|
```text
Asynchronous Code
Working with Promises
Mocha Hooks
Pending and Disabling Tests
Timeouts
```
|
In the Making is a Canadian television documentary series, which premiered on CBC Television on September 21, 2018. Co-created and hosted by Sean O'Neill, the series explores the creative process by profiling notable Canadian artists as they meet pivotal moments in their lives and work.
Artists profiled in the first season of the series included musicians Lido Pimienta and Chilly Gonzales, choreographers Dana Michel and Crystal Pite, and visual artists Adrian Stimson, Shelley Niro, Divya Mehra and Curtis Talwst Santiago.
The series received three Canadian Screen Award nominations at the 7th Canadian Screen Awards in 2019, for Best Photography in a Documentary Factual Series (Maya Bankovic), Best Direction in a Documentary or Factual Series (Chelsea McMullan) and Best Music in a Non-Fiction Program (Kieran Adams).
References
2018 Canadian television series debuts
CBC Television original programming
2010s Canadian documentary television series
|
The 2020–21 Norwich City F.C. season was the club's 119th season in existence and the first season back in the second tier of English football. In addition to the domestic league, Norwich City participated in this season's editions of the FA Cup and the EFL Cup.
Players
First-team squad
Transfers
Transfers in
Loans in
Loans out
Transfers out
Notes
Pre-season and friendlies
Norwich City confirmed they would play Milton Keynes Dons, SC Verl, Dynamo Dresden and Darmstadt 98 during pre-season.
Competitions
Overview
EFL Championship
League table
Results summary
Results by matchday
Matches
The league fixtures were announced on 21 August 2020.
FA Cup
The third round draw was made on 30 November, with Premier League and EFL Championship clubs all entering the competition. The fourth and fifth rounds draws were made consecutively on 11 January.
EFL Cup
The first round draw was made on 18 August, live on Sky Sports, by Paul Merson.
Statistics
Appearances, goals and cards
Goalscorers
Notes
References
External links
Norwich City F.C. seasons
Norwich City
|
Diogenes Laërtius was a biographer of the Greek philosophers. Nothing is definitively known about his life, but his surviving Lives and Opinions of Eminent Philosophers is a principal source for the history of ancient Greek philosophy. His reputation is controversial among scholars because he often repeats information from his sources without critically evaluating it. He also frequently focuses on trivial or insignificant details of his subjects' lives while ignoring important details of their philosophical teachings, and he sometimes fails to distinguish between earlier and later teachings of specific philosophical schools. However, unlike many other ancient secondary sources, Diogenes Laërtius generally reports philosophical teachings without attempting to reinterpret or expand on them, which means his accounts are often closer to the primary sources. Due to the loss of so many of the primary sources on which Diogenes relied, his work has become the foremost surviving source on the history of Greek philosophy.
Life
Laërtius must have lived after Sextus Empiricus (c. 200), whom he mentions, and before Stephanus of Byzantium and Sopater of Apamea (c. 500), who quote him. His work makes no mention of Neoplatonism, even though it is addressed to a woman who was "an enthusiastic Platonist". Hence he is assumed to have flourished in the first half of the 3rd century, during the reign of Alexander Severus (222–235) and his successors.
The precise form of his name is uncertain. The ancient manuscripts invariably refer to a "Laertius Diogenes", and this form of the name is repeated by Sopater and the Suda. The modern form "Diogenes Laertius" is much rarer, used by Stephanus of Byzantium, and in a lemma to the Greek Anthology. He is also referred to as "Laertes" or simply "Diogenes".
The origin of the name "Laertius" is also uncertain. Stephanus of Byzantium refers to him as "Διογένης ὁ Λαερτιεύς" (Diogenes ho Laertieus), implying that he was the native of some town, perhaps the Laerte in Caria (or another Laerte in Cilicia). Another suggestion is that one of his ancestors had for a patron a member of the Roman family of the Laërtii. The prevailing modern theory is that "Laertius" is a nickname (derived from the Homeric epithet Diogenes Laertiade, used in addressing Odysseus) used to distinguish him from the many other people called Diogenes in the ancient world.
His home town is unknown (at best uncertain, even according to a hypothesis that Laertius refers to his origin). A disputed passage in his writings has been used to suggest that it was Nicaea in Bithynia.
It has been suggested that Diogenes was an Epicurean or a Pyrrhonist. He passionately defends Epicurus in Book 10, which is of high quality and contains three long letters attributed to Epicurus explaining Epicurean doctrines. He is impartial to all schools, in the manner of the Pyrrhonists, and he carries the succession of Pyrrhonism further than that of the other schools. At one point, he even seems to refer to the Pyrrhonists as "our school." On the other hand, most of these points can be explained by the way he uncritically copies from his sources. It is by no means certain that he adhered to any school, and he is usually more attentive to biographical details.
In addition to the Lives, Diogenes refers to another work that he had written in verse on famous men, in various metres, which he called Epigrammata or Pammetros (Πάμμετρος).
Lives and Opinions of Eminent Philosophers
The work by which he is known, Lives and Opinions of Eminent Philosophers, was written in Greek and professes to give an account of the lives and sayings of the Greek philosophers.
Although it is at best an uncritical and unphilosophical compilation, its value, as giving us an insight into the private lives of the Greek sages, led Montaigne to write that he wished that instead of one Laërtius there had been a dozen. On the other hand, modern scholars have advised that we treat Diogenes' testimonia with care, especially when he fails to cite his sources: "Diogenes has acquired an importance out of all proportion to his merits because the loss of many primary sources and of the earlier secondary compilations has accidentally left him the chief continuous source for the history of Greek philosophy".
Organization of the work
Diogenes divides his subjects into two "schools" which he describes as the Ionian/Ionic and the Italian/Italic; the division is somewhat dubious and appears to be drawn from the lost doxography of Sotion. The biographies of the "Ionian school" begin with Anaximander and end with Clitomachus, Theophrastus and Chrysippus; the "Italian" begins with Pythagoras and ends with Epicurus. The Socratic school, with its various branches, is classed with the Ionic; while the Eleatics and Pyrrhonists are treated under the Italic. He also includes his own poetic verse, albeit pedestrian, about the philosophers he discusses.
The work contains incidental remarks on many other philosophers, and there are useful accounts concerning Hegesias, Anniceris, and Theodorus (Cyrenaics); Persaeus (Stoic); and Metrodorus and Hermarchus (Epicureans). Book VII is incomplete and breaks off during the life of Chrysippus. From a table of contents in one of the manuscripts (manuscript P), this book is known to have continued with Zeno of Tarsus, Diogenes, Apollodorus, Boethus, Mnesarchus, Mnasagoras, Nestor, Basilides, Dardanus, Antipater, Heraclides, Sosigenes, Panaetius, Hecato, Posidonius, Athenodorus, another Athenodorus, Antipater, Arius, and Cornutus. The whole of Book X is devoted to Epicurus, and contains three long letters written by Epicurus, which explain Epicurean doctrines.
His chief authorities were Favorinus and Diocles of Magnesia, but his work also draws (either directly or indirectly) on books by Antisthenes of Rhodes, Alexander Polyhistor, and Demetrius of Magnesia, as well as works by Hippobotus, Aristippus, Panaetius, Apollodorus of Athens, Sosicrates, Satyrus, Sotion, Neanthes, Hermippus, Antigonus, Heraclides, Hieronymus, and Pamphila.
Oldest extant manuscripts
There are many extant manuscripts of the Lives, although none of them are especially old, and they all descend from a common ancestor, because they all lack the end of Book VII. The three most useful manuscripts are known as B, P, and F. Manuscript B (Codex Borbonicus) dates from the 12th century, and is in the National Library of Naples. Manuscript P (Paris) is dated to the 11th/12th century, and is in the Bibliothèque nationale de France. Manuscript F (Florence) is dated to the 13th century, and is in the Laurentian Library. The titles for the individual biographies used in modern editions are absent from these earliest manuscripts; however, they can be found inserted into the blank spaces and margins of manuscript P by a later hand.
There seem to have been some early Latin translations, but they no longer survive. A 10th-century work entitled Tractatus de dictis philosophorum shows some knowledge of Diogenes. Henry Aristippus, in the 12th century, is known to have translated at least some of the work into Latin, and in the 14th century an unknown author made use of a Latin translation for his De vita et moribus philosophorum (attributed erroneously to Walter Burley).
Printed editions
The first printed editions were Latin translations. The first, Laertii Diogenis Vitae et sententiae eorum qui in philosophia probati fuerunt (Romae: Giorgo Lauer, 1472), printed the translation of Ambrogio Traversari (whose manuscript presentation copy to Cosimo de' Medici was dated February 8, 1433) and was edited by Elio Francesco Marchese. The Greek text of the lives of Aristotle and Theophrastus appeared in the third volume of the Aldine Aristotle in 1497. The first edition of the whole Greek text was that published by Hieronymus Froben in 1533. The Greek/Latin edition of 1692 by Marcus Meibomius divided each of the ten books into paragraphs of equal length, and progressively numbered them, providing the system still in use today.
The first critical edition of the entire text, by H.S. Long in the Oxford Classical Texts, was not produced until 1964; this edition was superseded by Miroslav Marcovich's Teubner edition, published between 1999 and 2002. A new edition, by Tiziano Dorandi, was published by Cambridge University Press in 2013.
English translations
Thomas Stanley's 1656 History of Philosophy adapts the format and content of Laertius' work into English, but Stanley compiled his book from a number of classical biographies of philosophers. The first complete English translation, made in the late 17th century, was the work of ten different translators. A better translation was made by Charles Duke Yonge (1853); although more literal, it still contained many inaccuracies. The next translation was by Robert Drew Hicks (1925) for the Loeb Classical Library, although it is slightly bowdlerized. A new translation by Pamela Mensch was published by Oxford University Press in 2018. Another, by Stephen White, was published by Cambridge University Press in 2020.
Legacy and assessment
Henricus Aristippus, the archdeacon of Catania, produced a Latin translation of Diogenes Laertius's book in southern Italy in the late 1150s, which has since been lost or destroyed. Geremia da Montagnone used this translation as a source for his Compendium moralium notabilium, and an anonymous Italian author used it as a source for a work entitled Liber de vita et moribus philosophorum (written 1317–1320), which reached international popularity in the Late Middle Ages. The monk Ambrogio Traversari (1386–1439) produced another Latin translation in Florence between 1424 and 1433, for which far better records have survived. The Italian Renaissance scholar, painter, philosopher, and architect Leon Battista Alberti (1404–1472) borrowed from Traversari's translation of the Lives and Opinions of Eminent Philosophers in Book 2 of his Libri della famiglia and modeled his own autobiography on Diogenes Laërtius's Life of Thales.
Diogenes Laërtius's work has had a complicated reception in modern times. The value of his Lives and Opinions of Eminent Philosophers as an insight into the private lives of the Greek sages led the French Renaissance philosopher Michel de Montaigne (1533–1592) to exclaim that he wished that, instead of one Laërtius, there had been a dozen. Georg Wilhelm Friedrich Hegel (1770–1831) criticized Diogenes Laërtius for his lack of philosophical talent and categorized his work as nothing more than a compilation of previous writers' opinions. Nonetheless, he admitted that Diogenes Laërtius's compilation was an important one given the information that it contained. Hermann Usener (1834–1905) deplored Diogenes Laërtius as a "complete ass" (asinus germanus) in his Epicurea (1887). Werner Jaeger (1888–1961) damned him as "that great ignoramus". In the late twentieth and early twenty-first centuries, however, scholars have managed to partially redeem Diogenes Laertius's reputation as a writer by reading his book in a Hellenistic literary context.
Nonetheless, modern scholars treat Diogenes's testimonia with caution, especially when he fails to cite his sources. Herbert S. Long warns: "Diogenes has acquired an importance out of all proportion to his merits because the loss of many primary sources and of the earlier secondary compilations has accidentally left him the chief continuous source for the history of Greek philosophy." Robert M. Strozier offers a somewhat more positive assessment of Diogenes Laertius's reliability, noting that many other ancient writers attempt to reinterpret and expand on the philosophical teachings they describe, something which Diogenes Laërtius rarely does. Strozier concludes, "Diogenes Laertius is, when he does not conflate hundreds of years of distinctions, reliable simply because he is a less competent thinker than those on whom he writes, is less liable to re-formulate statements and arguments, and especially in the case of Epicurus, less liable to interfere with the texts he quotes. He does, however, simplify."
Despite his importance to the history of western philosophy and the controversy surrounding him, according to Gian Mario Cao, Diogenes Laërtius has still not received adequate philological attention. Both modern critical editions of his book, by H. S. Long (1964) and by M. Marcovich (1999), have received extensive criticism from scholars.
He is criticized primarily for being overly concerned with superficial details of the philosophers' lives and for lacking the intellectual capacity to explore their actual philosophical works with any penetration. However, according to the 14th-century De vita et moribus philosophorum (traditionally attributed to the monk Walter Burley), the text of Diogenes seems to have been much fuller than that which we now possess.
Reliability
Although Diogenes strove for objectivity and fact-checking, his works are today seen as generally unreliable from a historical perspective. He is neither consistent nor reliable in some of his reports, and some of the details he cites contain obvious errors. Some of these were probably introduced by copyists in the transmission of the text from antiquity, but others are undoubtedly due to Diogenes himself. The reliability of Diogenes' sources has also been questioned, since he uses comic poets among them. Professor Brian Gregor suggests that readers will benefit from modern scholarly assistance when reading Diogenes' biographies, since they are "notoriously unreliable". Some scholars (e.g. Delfim Leão) hold that Diogenes' unreliability is not entirely his own responsibility and blame his sources instead.
Editions and translations
Diogenis Laertii Vitae philosophorum, edited by Miroslav Marcovich, Stuttgart–Leipzig: Teubner, 1999–2002 (Bibliotheca scriptorum Graecorum et Romanorum Teubneriana). Vol. 1: Books I–X; vol. 2: Excerpta Byzantina; vol. 3: Indices by Hans Gärtner.
Lives of Eminent Philosophers, edited by Tiziano Dorandi, Cambridge: Cambridge University Press, 2013 (Cambridge Classical Texts and Commentaries, vol. 50; a new, radically improved critical edition).
Translation by R.D. Hicks:
Translations based on the critical edition by Tiziano Dorandi:
See also
Mochus
Notes
References
Further reading
Barnes, Jonathan. 1992. "Diogenes Laertius IX 61–116: The Philosophy of Pyrrhonism." In Aufstieg und Niedergang der römischen Welt: Geschichte und Kultur Roms im Spiegel der neueren Forschung. Vol. 2: 36.5–6. Edited by Wolfgang Haase, 4241–4301. Berlin: W. de Gruyter.
Barnes, Jonathan. 1986. "Nietzsche and Diogenes Laertius." Nietzsche-Studien 15:16–40.
Dorandi, Tiziano. 2009. Laertiana: Capitoli sulla tradizione manoscritta e sulla storia del testo delle Vite dei filosofi di Diogene Laerzio. Berlin; New York: Walter de Gruyter.
Eshleman, Kendra Joy. 2007. "Affection and Affiliation: Social Networks and Conversion to Philosophy." The Classical Journal 103.2: 129–140.
Grau, Sergi. 2010. "How to Kill a Philosopher: The Narrating of Ancient Greek Philosophers' Deaths in Relation to the Living." Ancient Philosophy 30.2: 347–381.
Hägg, Tomas. 2012. The Art of Biography in Antiquity. Cambridge, UK: Cambridge Univ. Press.
Kindstrand, Jan Frederik. 1986. "Diogenes Laertius and the Chreia Tradition." Elenchos 7:217–234.
Long, Anthony A. 2006. "Diogenes Laertius, Life of Arcesilaus." In From Epicurus to Epictetus: Studies in Hellenistic and Roman Philosophy. Edited by Anthony A. Long, 96–114. Oxford: Oxford Univ. Press.
Mansfeld, Jaap. 1986. "Diogenes Laertius on Stoic Philosophy." Elenchos 7: 295–382.
Mejer, Jørgen. 1978. Diogenes Laertius and his Hellenistic Background. Wiesbaden: Steiner.
Mejer, Jørgen. 1992. "Diogenes Laertius and the Transmission of Greek Philosophy." In Aufstieg und Niedergang der römischen Welt: Geschichte und Kultur Roms im Spiegel der neueren Forschung. Vol. 2: 36.5–6. Edited by Wolfgang Haase, 3556–3602. Berlin: W. de Gruyter.
Morgan, Teresa J. 2013. "Encyclopaedias of Virtue?: Collections of Sayings and Stories About Wise Men in Greek." In Encyclopaedism from Antiquity to the Renaissance. Edited by Jason König and Greg Woolf, 108–128. Cambridge; New York: Cambridge University Press.
Sassi, Maria Michela. 2011. "Ionian Philosophy and Italic Philosophy: From Diogenes Laertius to Diels." In The Presocratics from the Latin Middle Ages to Hermann Diels. Edited by Oliver Primavesi and Katharina Luchner, 19–44. Stuttgart: Steiner.
Sollenberger, Michael. 1992. "The Lives of the Peripatetics: An Analysis of the Content and Structure of Diogenes Laertius' “Vitae philosophorum” Book 5." In Aufstieg und Niedergang der römischen Welt: Geschichte und Kultur Roms im Spiegel der neueren Forschung. Vol. 2: 36.5–6. Edited by Wolfgang Haase, 3793–3879. Berlin: W. de Gruyter.
Vogt, Katja Maria, ed. 2015. Pyrrhonian Skepticism in Diogenes Laertius. Tübingen, Germany: Mohr Siebeck.
Warren, James. 2007. "Diogenes Laertius, Biographer of Philosophy." In Ordering Knowledge in the Roman Empire. Edited by Jason König and Tim Whitmarsh, 133–149. Cambridge; New York: Cambridge University Press.
External links
Works by Diogenes Laertius at Perseus Digital Library
Ancient Greek text of Diogenes's Lives
Article on the Manuscript versions at the Tertullian Project
A bibliography of the Lives and Opinions of Eminent Philosophers
Libro de la vita de philosophi et delle loro elegantissime sentencie. Venice, Joannes Rubeus Vercellensis, 20 May 1489. From the Rare Book and Special Collections Division at the Library of Congress
Digitized Manuscript of Diogenes Laertius' Vitae Philosophorum (Arundel MS 531) at the British Library website
3rd-century Romans
3rd-century Greek philosophers
3rd-century writers
Ancient Greek biographers
Ancient Greek writers
Year of birth unknown
Year of death unknown
Quotation collectors
3rd-century books
Books about philosophers
|
- Using `.test()` with RegExp
- Filtering items out of an array
- Detect online connection
- `console.*` in JavaScript
- Round numbers to `N` decimals
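As a quick illustration of the first two items (a minimal sketch; these snippets are not from the original list):

```javascript
// Using `.test()` with RegExp: returns a plain boolean when you only need
// to know whether a pattern matches, not the match details.
const hasDigit = /\d/.test("room 42");
console.log(hasDigit); // true

// Filtering items out of an array: `filter` returns a new array containing
// only the elements for which the predicate is truthy.
const evens = [1, 2, 3, 4, 5].filter((n) => n % 2 === 0);
console.log(evens); // [2, 4]
```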
|
```swift
// MARK: - Edge
public class Edge: Equatable {
public var neighbor: Node
public init(neighbor: Node) {
self.neighbor = neighbor
}
}
public func == (lhs: Edge, rhs: Edge) -> Bool {
return lhs.neighbor == rhs.neighbor
}
// MARK: - Node
public class Node: CustomStringConvertible, Equatable {
public var neighbors: [Edge]
public private(set) var label: String
public var distance: Int?
public var visited: Bool
public init(label: String) {
self.label = label
neighbors = []
visited = false
}
public var description: String {
if let distance = distance {
return "Node(label: \(label), distance: \(distance))"
}
return "Node(label: \(label), distance: infinity)"
}
public var hasDistance: Bool {
return distance != nil
}
public func remove(_ edge: Edge) {
    // Remove the edge by identity; do nothing if it is not present.
    // (`index(where:)` was renamed `firstIndex(where:)` in Swift 4.2.)
    if let index = neighbors.firstIndex(where: { $0 === edge }) {
      neighbors.remove(at: index)
    }
  }
}
public func == (lhs: Node, rhs: Node) -> Bool {
return lhs.label == rhs.label && lhs.neighbors == rhs.neighbors
}
// MARK: - Graph
public class Graph: CustomStringConvertible, Equatable {
public private(set) var nodes: [Node]
public init() {
self.nodes = []
}
public func addNode(_ label: String) -> Node {
let node = Node(label: label)
nodes.append(node)
return node
}
public func addEdge(_ source: Node, neighbor: Node) {
let edge = Edge(neighbor: neighbor)
source.neighbors.append(edge)
}
public var description: String {
var description = ""
for node in nodes {
if !node.neighbors.isEmpty {
description += "[node: \(node.label) edges: \(node.neighbors.map { $0.neighbor.label})]"
}
}
return description
}
public func findNodeWithLabel(_ label: String) -> Node {
    // Precondition: a node with this label exists in the graph; crashes otherwise.
    return nodes.first(where: { $0.label == label })!
  }
public func duplicate() -> Graph {
let duplicated = Graph()
for node in nodes {
_ = duplicated.addNode(node.label)
}
for node in nodes {
for edge in node.neighbors {
let source = duplicated.findNodeWithLabel(node.label)
let neighbour = duplicated.findNodeWithLabel(edge.neighbor.label)
duplicated.addEdge(source, neighbor: neighbour)
}
}
return duplicated
}
}
public func == (lhs: Graph, rhs: Graph) -> Bool {
return lhs.nodes == rhs.nodes
}
```
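A brief usage sketch of the types above (node labels are illustrative; this assumes the `Graph`, `Node`, and `Edge` definitions are in scope):

```swift
// Build a two-node graph with a single directed edge a -> b.
let graph = Graph()
let a = graph.addNode("a")
let b = graph.addNode("b")
graph.addEdge(a, neighbor: b)

// `duplicate()` deep-copies the structure; the copy compares equal because
// node equality is defined by label plus neighbor list.
let copy = graph.duplicate()
assert(copy == graph)
assert(copy.findNodeWithLabel("a").neighbors.first?.neighbor.label == "b")
```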
|
The 2013 election for Mayor of Buffalo, New York, took place on November 5, 2013. Two-term incumbent Democrat Byron Brown won reelection, defeating Republican Sergio Rodriguez.
Background
The 2013 Buffalo mayoral race is notable as the first mayoral election in Buffalo's history to not feature any white, non-Hispanic candidates in either the primary or the general elections. A Buffalo News editorial noted that despite the historic racial aspect of the election, Buffalo voters that year tended to be more concerned with traditional issues such as crime and education, in sharp contrast to the deep ethnic divisions that had normally characterized city politics.
Candidates
Democratic Party
Brown was challenged in the Democratic primary by Bernard Tolbert, the former Special Agent-in-Charge of Buffalo's FBI field office, the former Vice-President of Security for the National Basketball Association, and a former executive with the Coca-Cola Company.
Republican Party
Rodriguez, a 32-year-old, Dominican-born Marine Corps veteran, small business owner, and Medaille College administrator, ran unopposed as Buffalo's first Republican mayoral candidate since 2005.
Other potential candidates
Early in 2013, City Comptroller Mark Schroeder was rumored to be considering a campaign for mayor. However, he declined to run and later endorsed Brown.
Local businessman Matthew Ricchiazzi was reported in June 2013 to be circulating petitions for a possible mayoral campaign. If a Ricchiazzi campaign had actually materialized, it would have touched off a rare Republican primary between him and Rodriguez.
Debates
Leading up to the primary elections on September 10, Brown, Tolbert and Rodriguez participated in three televised debates.
The first mayoral debate took place on August 14, 2013 at the Buffalo News auditorium, sponsored by the Buffalo Association of Black Journalists and moderated by Al Vaughters of WIVB-TV. The dominant issue at the first debate was crime, with Rodriguez disputing Brown's assertion that crime had decreased in the city since the previous election; Brown rebutted by calling Rodriguez's statistics "absolutely false" and "nonsense". In addition, both Tolbert and Rodriguez claimed that Buffalo's police department was understaffed and inefficient, with Brown disputing Tolbert's specific claims that there had been a net loss in officers employed by the Buffalo Police since Brown's mayoral tenure began. Other issues debated included unemployment, with Brown forced to defend his administration against statistics cited by Rodriguez showing that the unemployment rate in Buffalo had risen from 6.3% to 10% during his two terms as mayor. Low graduation rates in the Buffalo Public Schools were also discussed: Tolbert called for more involvement by the mayor's office in the school system, and Rodriguez went further, favoring complete mayoral control of the school board as in New York City and Yonkers, while Brown reaffirmed his commitment to cooperation with the Board of Education but denied he had the authority to participate directly in school-district policymaking.
The second debate was held on August 22, 2013 at St. Mary's School for the Deaf, sponsored by the Parkside Community Association and moderated by Buffalo News reporter Brian Meyer, with questions fielded directly from the audience. The beleaguered state of the Buffalo Public Schools was the dominant topic in the second debate, with Tolbert elaborating on his earlier position regarding mayoral involvement in the school board by proposing that three of the nine school board members, as well as the school superintendent, be appointed directly by the mayor, while Rodriguez repeated his earlier call for direct mayoral control of the schools. Brown, for his part, touted his success in restoring music programs to Buffalo city schools as well as the hiring of more school security officers. In addition, the Brown administration's record on economic development was attacked by both challengers, with Brown's boasts of $1.7 billion in development projects currently underway in the city countered by Tolbert's criticism that Brown allowed the Buffalo Economic Renaissance Corporation to fold under his watch, as well as his characterization of economic development in Buffalo as concentrated in downtown at the expense of the neighborhoods. As well, Rodriguez cited statistics showing that Buffalo's employment base had been reduced by 10,000 people since Brown took office in 2006.
The third and final mayoral debate was held on August 27, 2013 at WNED-TV studios. The candidates reiterated the points covered in the first two debates on police department staffing issues, unemployment, economic development, and problems in the Buffalo Public Schools. On the subject of crime, both Rodriguez and Tolbert derided Brown's use of gun buyback programs as a remedy for crime, with Tolbert characterizing the buybacks as "stunts". Also in the third debate, Tolbert was put on the defensive when he was asked about a sexual harassment lawsuit that had been filed against him while he was an executive at the National Basketball Association, while Brown was compelled to deny reports that his administration was rewarding campaign donors with patronage jobs as a quid pro quo and described a lawsuit filed against the city by a housing development company alleging a pay-to-play policy for awarding contracts as "baseless" and "absolute nonsense".
Democratic primary
Campaigning
Though praised for his experience in crimefighting and community service and his "straight-shooting vibe", Tolbert was criticized for waiting until relatively late in the election to announce his candidacy, spending many months courting potential backers and conspicuously attending community events as speculation grew, but not establishing a campaign finance committee until April 2 nor officially entering the race until May 11.
In the run-up to the primary, a Siena College poll commissioned by the Buffalo News and WGRZ-TV and conducted between August 11 and 13, 2013 favored Brown to win the Democratic primary election against Tolbert by a margin of 61–32. A Buffalo News editorial published on August 17 observed that although Brown's fundraising prowess, his name recognition and the other benefits of incumbency, and the great progress in waterfront development during his administration would have made any primary challenge difficult, his especially formidable lead was first and foremost a product of the shortcomings of his opponent's campaign: an anemic one by a political novice whose muddled message failed to resonate with voters, particularly with the African-American community whose loyalty to Brown Tolbert had hoped to challenge. For his part, Tolbert accused Brown of pressuring city workers to donate to his campaign as a condition of continued employment, as well as of engaging in intimidation tactics against volunteers on opposing campaigns; the latter charge was echoed by Rodriguez.
Despite an impressive performance by Tolbert in the three mayoral debates that frequently put Brown on the defensive, a subsequent Siena College poll taken three weeks after the aforementioned one showed that Brown had widened his lead over Tolbert by four points.
Election returns
The Democratic primary election was held on September 10, 2013. Brown won the election, earning 14,022 votes (68.1%) to Tolbert's 6,577 (31.9%).
Commenting on Brown's official nomination the day after the election, a Buffalo News editorial reiterated the points made in the August 17 editorial regarding Brown's name recognition and the capital investments on the waterfront, also opining that the incumbent benefited from an electorate that largely believed Buffalo is heading in the right direction despite the problems facing the city school system, on which Tolbert tried and failed to capitalize. The editorial also mentioned the ample airtime on local television the Brown campaign purchased with its massive "fundraising war chest" for its commercial, which prominently featured President Barack Obama's praise of the sitting mayor during a visit to the University at Buffalo on August 22, 2013.
Contrary to predictions, turnout in the primary was historically low, with roughly 20% of registered Democrats voting. This apathy was attributed variously to Brown's perceived invincibility given his overwhelming lead in the polls, or to a failure by both Brown and Tolbert to inspire enthusiasm among city voters.
Conservative primary
Campaigning
In July 2013, under the terms of New York State's electoral fusion law, Rodriguez announced plans to mount a write-in candidacy as a Conservative to challenge that party's official endorsement of Brown, much as 2005 Republican candidate Kevin Helfer did, and also to circulate petitions for a possible independent run. Rodriguez was indeed able to collect the required number of valid signatures to run as a Conservative, but the Erie County Conservative Party, which had earlier endorsed Brown, vigorously sought to fend off Rodriguez's challenge by distributing a mailer to Conservative-registered city voters assailing Rodriguez for his support of the NY SAFE Act and his past unpaid student loans, a move Rodriguez described as "ugly, dirty, slanderous and baseless politics that people have grown to detest" employed in order "to deny voters a choice". The party also legally challenged all of the petitions collected by Rodriguez based on a little-known stipulation that the notaries public who distribute Opportunity To Ballot petitions must swear in the signers; this effort was dismissed by Erie County Supreme Court Justice John Michalek, setting the stage for a Conservative primary.
Election returns
The Conservative primary was held on September 10, 2013. Brown captured 70 of the 125 votes cast, while Rodriguez captured 44 and Tolbert 11.
Republican campaign
Rodriguez officially announced his candidacy on February 6, 2013. His robust campaign featured an active social media presence and an emphasis on grassroots campaigning as exemplified by his "30 Day Every Neighborhood Tour" in September and October, which saw the candidate go door-to-door interacting with and fielding questions from residents of all parts of the city. Nonetheless, it was remarked in the Buffalo News that although Rodriguez had generated some buzz among Buffalo Hispanics, he had as yet failed to unite the city's Latino community in support of his long-shot candidacy.
In addition to his pursuit of the Conservative line, Rodriguez's petitions for an independent run also bore fruit with his announcement on September 25 that he would be running on a new ballot line, dubbed the Progressive Party. The party's name and platform make direct reference to the Progressive or "Bull Moose" Party founded by Theodore Roosevelt in 1912, with a stated goal of reforming city government by transcending partisan divisions and "making governmental changes parallel with the changing needs of its people... with a dignity defined by moral strength and urgency".
Endorsements
Brown received the endorsement of the Erie County Democratic Party, and also of the Erie County Conservative, Working Families, and Independence parties. He was also endorsed by, among others, New York Governor Andrew Cuomo, U.S. Senator Charles Schumer, U.S. Congressman Brian Higgins, New York State Senator Tim Kennedy, Erie County Executive Mark Poloncarz, New York State Comptroller Thomas DiNapoli, New York State Assemblypeople Crystal Peoples-Stokes, Sean Ryan, and Michael Kearns (the latter of whom had been his primary challenger in the previous election), the Buffalo Niagara Partnership, and the Buffalo Niagara Association of Realtors.
Tolbert, whose criticism of Brown in the months leading up to the Democratic primary largely centered on the incumbent mayor's crime-fighting efforts, received the endorsement of the Buffalo Police Benevolent Association in July 2013. Tolbert also received the endorsement of Challenger Community News, one of Buffalo's two African-American community newspapers.
Rodriguez received the endorsement of the City of Buffalo Republican Committee. However, the Erie County Republican Party declined to endorse Rodriguez, amid accusations from Rodriguez's camp that county Republican officials pressured him to drop out of the race for fear that increased voter turnout in heavily Democratic Buffalo could be a detriment to Republican candidates for countywide office. Rodriguez also received the endorsement of "The Griffin", the Canisius College student paper.
General election results
References
External links
Sergio Rodriguez for Mayor of Buffalo - Official campaign website of Sergio Rodriguez
Tolbert 2013 - Official campaign website of Bernard Tolbert
Re-Elect Mayor Byron Brown - Official campaign website of Byron Brown
2013 New York (state) elections
Buffalo
Mayoral elections in Buffalo, New York
|
```raw token data
Index Name Enable IPver DestIPAddress/DestPrefix DestMask Interface Gateway/NextHop
1 5473a28e-8 0 IPv4 177.41.30.228 255.255.255.0 ADSL
2 d6e181ce-a 0 IPv4 121.160.211.188 255.255.0.0 ADSL 66.135.158.175
3 b1a49337-a 0 IPv4 43.188.51.10 255.255.0.0 WWAN 221.29.200.232
4 f9ed53d7-5 1 IPv4 101.168.102.179 255.255.255.0 ADSL 222.173.23.26
5 b7feec47-b 1 IPv4 72.37.37.55 255.0.0.0 VDSL
6 facf6dd5-a 0 IPv4 59.38.164.29 255.0.0.0 VDSL
7 3ab0c611-b 0 IPv6 6d78:60da:1b22::/47 ADSL 6185:f0a7:164e:7fe3:8549:dfb3:dda5:f330
8 f5f679b4-c 0 IPv6 aff9:c5d3:877d:8276:c5d9:dbc1:8000:0/97 WWAN
9 723130e9-a 1 IPv6 51f2:794e:7305:2a0f:1a80::/73 WWAN e583:cfd7:c872:94b4:e9af:a5f8:b095:a058
10 bd7e5ad3-9 0 IPv6 7ec0::/10 ADSL 1aa8:fc25:8d13:67ac:4740:fdd1:61fd:1f81
Command Successful.
```
|
```go
// Code generated by smithy-go-codegen DO NOT EDIT.
package ec2
import (
"context"
"fmt"
awsmiddleware "github.com/aws/aws-sdk-go-v2/aws/middleware"
"github.com/aws/aws-sdk-go-v2/service/ec2/types"
"github.com/aws/smithy-go/middleware"
smithyhttp "github.com/aws/smithy-go/transport/http"
)
// Modifies the specified Amazon Web Services Verified Access group policy.
func (c *Client) ModifyVerifiedAccessGroupPolicy(ctx context.Context, params *ModifyVerifiedAccessGroupPolicyInput, optFns ...func(*Options)) (*ModifyVerifiedAccessGroupPolicyOutput, error) {
if params == nil {
params = &ModifyVerifiedAccessGroupPolicyInput{}
}
result, metadata, err := c.invokeOperation(ctx, "ModifyVerifiedAccessGroupPolicy", params, optFns, c.addOperationModifyVerifiedAccessGroupPolicyMiddlewares)
if err != nil {
return nil, err
}
out := result.(*ModifyVerifiedAccessGroupPolicyOutput)
out.ResultMetadata = metadata
return out, nil
}
type ModifyVerifiedAccessGroupPolicyInput struct {
// The ID of the Verified Access group.
//
// This member is required.
VerifiedAccessGroupId *string
// A unique, case-sensitive token that you provide to ensure idempotency of your
// modification request. For more information, see [Ensuring idempotency].
//
// [Ensuring idempotency]: path_to_url
ClientToken *string
// Checks whether you have the required permissions for the action, without
// actually making the request, and provides an error response. If you have the
// required permissions, the error response is DryRunOperation . Otherwise, it is
// UnauthorizedOperation .
DryRun *bool
// The Verified Access policy document.
PolicyDocument *string
// The status of the Verified Access policy.
PolicyEnabled *bool
// The options for server side encryption.
SseSpecification *types.VerifiedAccessSseSpecificationRequest
noSmithyDocumentSerde
}
type ModifyVerifiedAccessGroupPolicyOutput struct {
// The Verified Access policy document.
PolicyDocument *string
// The status of the Verified Access policy.
PolicyEnabled *bool
// The options in use for server side encryption.
SseSpecification *types.VerifiedAccessSseSpecificationResponse
// Metadata pertaining to the operation's result.
ResultMetadata middleware.Metadata
noSmithyDocumentSerde
}
func (c *Client) addOperationModifyVerifiedAccessGroupPolicyMiddlewares(stack *middleware.Stack, options Options) (err error) {
if err := stack.Serialize.Add(&setOperationInputMiddleware{}, middleware.After); err != nil {
return err
}
err = stack.Serialize.Add(&awsEc2query_serializeOpModifyVerifiedAccessGroupPolicy{}, middleware.After)
if err != nil {
return err
}
err = stack.Deserialize.Add(&awsEc2query_deserializeOpModifyVerifiedAccessGroupPolicy{}, middleware.After)
if err != nil {
return err
}
if err := addProtocolFinalizerMiddlewares(stack, options, "ModifyVerifiedAccessGroupPolicy"); err != nil {
return fmt.Errorf("add protocol finalizers: %v", err)
}
if err = addlegacyEndpointContextSetter(stack, options); err != nil {
return err
}
if err = addSetLoggerMiddleware(stack, options); err != nil {
return err
}
if err = addClientRequestID(stack); err != nil {
return err
}
if err = addComputeContentLength(stack); err != nil {
return err
}
if err = addResolveEndpointMiddleware(stack, options); err != nil {
return err
}
if err = addComputePayloadSHA256(stack); err != nil {
return err
}
if err = addRetry(stack, options); err != nil {
return err
}
if err = addRawResponseToMetadata(stack); err != nil {
return err
}
if err = addRecordResponseTiming(stack); err != nil {
return err
}
if err = addClientUserAgent(stack, options); err != nil {
return err
}
if err = smithyhttp.AddErrorCloseResponseBodyMiddleware(stack); err != nil {
return err
}
if err = smithyhttp.AddCloseResponseBodyMiddleware(stack); err != nil {
return err
}
if err = addSetLegacyContextSigningOptionsMiddleware(stack); err != nil {
return err
}
if err = addTimeOffsetBuild(stack, c); err != nil {
return err
}
if err = addUserAgentRetryMode(stack, options); err != nil {
return err
}
if err = addIdempotencyToken_opModifyVerifiedAccessGroupPolicyMiddleware(stack, options); err != nil {
return err
}
if err = addOpModifyVerifiedAccessGroupPolicyValidationMiddleware(stack); err != nil {
return err
}
if err = stack.Initialize.Add(newServiceMetadataMiddleware_opModifyVerifiedAccessGroupPolicy(options.Region), middleware.Before); err != nil {
return err
}
if err = addRecursionDetection(stack); err != nil {
return err
}
if err = addRequestIDRetrieverMiddleware(stack); err != nil {
return err
}
if err = addResponseErrorMiddleware(stack); err != nil {
return err
}
if err = addRequestResponseLogging(stack, options); err != nil {
return err
}
if err = addDisableHTTPSMiddleware(stack, options); err != nil {
return err
}
return nil
}
type idempotencyToken_initializeOpModifyVerifiedAccessGroupPolicy struct {
tokenProvider IdempotencyTokenProvider
}
func (*idempotencyToken_initializeOpModifyVerifiedAccessGroupPolicy) ID() string {
return "OperationIdempotencyTokenAutoFill"
}
func (m *idempotencyToken_initializeOpModifyVerifiedAccessGroupPolicy) HandleInitialize(ctx context.Context, in middleware.InitializeInput, next middleware.InitializeHandler) (
out middleware.InitializeOutput, metadata middleware.Metadata, err error,
) {
if m.tokenProvider == nil {
return next.HandleInitialize(ctx, in)
}
input, ok := in.Parameters.(*ModifyVerifiedAccessGroupPolicyInput)
if !ok {
return out, metadata, fmt.Errorf("expected middleware input to be of type *ModifyVerifiedAccessGroupPolicyInput ")
}
if input.ClientToken == nil {
t, err := m.tokenProvider.GetIdempotencyToken()
if err != nil {
return out, metadata, err
}
input.ClientToken = &t
}
return next.HandleInitialize(ctx, in)
}
func addIdempotencyToken_opModifyVerifiedAccessGroupPolicyMiddleware(stack *middleware.Stack, cfg Options) error {
return stack.Initialize.Add(&idempotencyToken_initializeOpModifyVerifiedAccessGroupPolicy{tokenProvider: cfg.IdempotencyTokenProvider}, middleware.Before)
}
func newServiceMetadataMiddleware_opModifyVerifiedAccessGroupPolicy(region string) *awsmiddleware.RegisterServiceMetadata {
return &awsmiddleware.RegisterServiceMetadata{
Region: region,
ServiceID: ServiceID,
OperationName: "ModifyVerifiedAccessGroupPolicy",
}
}
```
|
"Kant Nobody" is a song by American rapper Lil Wayne, released on February 23, 2023. The song, which features sampled vocals from late fellow American rapper DMX, was produced by Swizz Beatz and Avenue Beatz. It was included on Lil Wayne's first greatest hits album I Am Music.
Background
In an interview with Zane Lowe on Apple Music 1, Lil Wayne hinted he made the song in reaction to his inclusion on Billboard's list of the "Top 50 Greatest Rappers of All Time" (at number 7).
Composition
"Kant Nobody" begins with a sample of DMX speaking in an interview, before a sample of "Niggaz Done Started Something" from DMX's album It's Dark and Hell Is Hot is played throughout the track. In the chorus, Lil Wayne raps about his lyrical skill and legacy while referencing the process of composing the song: "Best rapper, ex-trapper, dress dapper / Swizzy gave me a head-banger, a neck-snapper / X-factor, make a rapper an example / All I need is a beat with a DMX sample".
Critical reception
Aron A. of HotNewHipHop gave the song a "Very Hottttt" rating and wrote, "...Lil Wayne delivers one of his best performances in a minute. While Wayne delivers one of his best verses in a minute, the song, unfortunately, limits DMX's contributions to a sample instead of a full 16 bars. Still, Lil Wayne and Swizz continue to shine as collaborators following the release of 'Uproar.'"
Charts
References
2023 singles
2023 songs
Lil Wayne songs
DMX songs
Song recordings produced by Swizz Beatz
Songs written by Lil Wayne
Songs written by DMX
Songs written by Swizz Beatz
Songs written by Marvin Gaye
Young Money Entertainment singles
Republic Records singles
|
Elections to Welwyn Hatfield Borough Council took place on 2 May 2019. This was on the same day as other local elections across the United Kingdom.
Results
No Independent (−4.6%) or Abolish the Town Council (−0.7%) candidates stood, unlike at the previous election.
Ward results
Brookmans Park and Little Heath
Haldens
Handside
Hatfield Central
Hatfield East
Hatfield South West
Hatfield Villages
Hollybush
Howlands
Northaw and Cuffley
Panshanger
Peartree
Sherrards
Welham Green and Hatfield South
Welwyn East
Welwyn West
References
Welwyn Hatfield Borough Council elections
|
```c
/*
NitroHax -- Cheat tool for the Nintendo DS
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program. If not, see <path_to_url>.
*/
#include <string.h>
#include <nds/ndstypes.h>
#include <nds/arm9/exceptions.h>
#include <nds/arm9/cache.h>
#include <nds/bios.h>
#include <nds/system.h>
#include <nds/dma.h>
#include <nds/interrupts.h>
#include <nds/ipc.h>
#include <nds/timers.h>
#include <nds/fifomessages.h>
#include <nds/memory.h> // tNDSHeader
#include "module_params.h"
#include "ndma.h"
#include "tonccpy.h"
#include "hex.h"
#include "igm_text.h"
#include "nds_header.h"
#include "cardengine.h"
#include "locations.h"
#include "cardengine_header_arm9.h"
#include "unpatched_funcs.h"
#define ROMinRAM BIT(1)
#define isSdk5 BIT(5)
#define cacheFlushFlag BIT(7)
#define cardReadFix BIT(8)
#define cacheDisabled BIT(9)
//#ifdef DLDI
#include "my_fat.h"
#include "card.h"
//#endif
extern cardengineArm9* volatile ce9;
extern vu32* volatile sharedAddr;
extern tNDSHeader* ndsHeader;
extern aFile* romFile;
extern u32 cacheDescriptor[];
extern int cacheCounter[];
extern int accessCounter;
extern int romMapLines;
extern u32 romMap[4][3];
extern void callEndReadDmaThumb(void);
extern void disableIrqMask(u32 mask);
bool isDma = false;
bool dmaOn = true;
bool dmaDirectRead = false;
#ifndef TWLSDK
static bool dataSplit = false;
void endCardReadDma() {
if (dmaDirectRead && dmaOn && (ndmaBusy(0) || (dataSplit && ndmaBusy(1)))) {
IPC_SendSync(0x3);
return;
}
if (ce9->patches->cardEndReadDmaRef) {
VoidFn cardEndReadDmaRef = (VoidFn)ce9->patches->cardEndReadDmaRef;
(*cardEndReadDmaRef)();
} else if (ce9->thumbPatches->cardEndReadDmaRef) {
callEndReadDmaThumb();
}
}
#endif
extern bool IPC_SYNC_hooked;
extern void hookIPC_SYNC(void);
extern void enableIPC_SYNC(void);
#ifndef DLDI
#ifdef ASYNCPF
static u32 asyncSector = 0;
//static u32 asyncQueue[5];
//static int aQHead = 0;
//static int aQTail = 0;
//static int aQSize = 0;
#endif
bool dmaReadOnArm7 = false;
bool dmaReadOnArm9 = false;
extern int allocateCacheSlot(void);
extern int getSlotForSector(u32 sector);
//extern int getSlotForSectorManual(int i, u32 sector);
extern vu8* getCacheAddress(int slot);
extern void updateDescriptor(int slot, u32 sector);
/*#ifdef ASYNCPF
void addToAsyncQueue(sector) {
asyncQueue[aQHead] = sector;
aQHead++;
aQSize++;
if(aQHead>4) {
aQHead=0;
}
if(aQSize>5) {
aQSize=5;
aQTail++;
if(aQTail>4) aQTail=0;
}
}
u32 popFromAsyncQueueHead() {
if(aQSize>0) {
aQHead--;
if(aQHead == -1) aQHead = 4;
aQSize--;
return asyncQueue[aQHead];
} else return 0;
}
#endif*/
#ifdef ASYNCPF
static void waitForArm7(bool ipc) {
extern void sleepMs(int ms);
if (!ipc) {
IPC_SendSync(0x4);
}
while (sharedAddr[3] != (vu32)0) {
if (ipc) {
IPC_SendSync(0x4);
sleepMs(1);
}
}
}
void triggerAsyncPrefetch(u32 sector) {
if(asyncSector == 0) {
int slot = getSlotForSector(sector);
// read max 32k via the WRAM cache
// do it only if there is no async command ongoing
if(slot==-1) {
//addToAsyncQueue(sector);
// send a command to the arm7 to fill the main RAM cache
u32 commandRead = (isDma ? 0x020FF80A : 0x020FF808);
slot = allocateCacheSlot();
vu8* buffer = getCacheAddress(slot);
cacheDescriptor[slot] = sector;
cacheCounter[slot] = 0x0FFFFFFF; // async marker
asyncSector = sector;
// write the command
sharedAddr[0] = (vu32)buffer;
sharedAddr[1] = ce9->cacheBlockSize;
sharedAddr[2] = sector;
sharedAddr[3] = commandRead;
IPC_SendSync(0x4);
// do it asynchronously
/*waitForArm7();*/
}
}
}
void processAsyncCommand() {
if(asyncSector != 0) {
int slot = getSlotForSector(asyncSector);
if(slot!=-1 && cacheCounter[slot] == 0x0FFFFFFF) {
// get back the data from arm7
if(sharedAddr[3] == (vu32)0) {
updateDescriptor(slot, asyncSector);
asyncSector = 0;
}
}
}
}
void getAsyncSector() {
if(asyncSector != 0) {
int slot = getSlotForSector(asyncSector);
if(slot!=-1 && cacheCounter[slot] == 0x0FFFFFFF) {
// get back the data from arm7
waitForArm7(true);
updateDescriptor(slot, asyncSector);
asyncSector = 0;
}
}
}
#endif
static inline bool checkArm7(void) {
IPC_SendSync(0x4);
return (sharedAddr[3] == (vu32)0);
}
#ifndef TWLSDK
static u32 * dmaParams = NULL;
static int currentLen = 0;
//static int currentSlot = 0;
void continueCardReadDmaArm9() {
if(dmaReadOnArm9) {
if (ndmaBusy(0)) {
IPC_SendSync(0x3);
return;
}
dmaReadOnArm9 = false;
vu32* volatile cardStruct = (vu32*)ce9->cardStruct0;
//u32 dma = cardStruct[3]; // dma channel
u32 commandRead=0x025FFB0A;
u32 src = ((ce9->valueBits & isSdk5) ? dmaParams[3] : cardStruct[0]);
u8* dst = ((ce9->valueBits & isSdk5) ? (u8*)(dmaParams[4]) : (u8*)(cardStruct[1]));
u32 len = ((ce9->valueBits & isSdk5) ? dmaParams[5] : cardStruct[2]);
// Update cardi common
if (ce9->valueBits & isSdk5) {
dmaParams[3] = src + currentLen;
dmaParams[4] = (vu32)(dst + currentLen);
dmaParams[5] = len - currentLen;
} else {
cardStruct[0] = src + currentLen;
cardStruct[1] = (vu32)(dst + currentLen);
cardStruct[2] = len - currentLen;
}
src = ((ce9->valueBits & isSdk5) ? dmaParams[3] : cardStruct[0]);
dst = ((ce9->valueBits & isSdk5) ? (u8*)(dmaParams[4]) : (u8*)(cardStruct[1]));
len = ((ce9->valueBits & isSdk5) ? dmaParams[5] : cardStruct[2]);
u32 sector = (src/ce9->cacheBlockSize)*ce9->cacheBlockSize;
#ifdef ASYNCPF
processAsyncCommand();
#endif
if (len > 0) {
accessCounter++;
// Read via the main RAM cache
//int slot = getSlotForSectorManual(currentSlot+1, sector);
int slot = getSlotForSector(sector);
vu8* buffer = getCacheAddress(slot);
#ifdef ASYNCPF
u32 nextSector = sector+ce9->cacheBlockSize;
#endif
// Read max CACHE_READ_SIZE via the main RAM cache
if (slot == -1) {
#ifdef ASYNCPF
getAsyncSector();
#endif
// Send a command to the ARM7 to fill the RAM cache
slot = allocateCacheSlot();
buffer = getCacheAddress(slot);
//fileRead((char*)buffer, *romFile, sector, ce9->cacheBlockSize);
/*u32 len2 = (src - sector) + len;
u16 readLen = ce9->cacheBlockSize;
if (len2 > ce9->cacheBlockSize*3 && slot+3 < ce9->cacheSlots) {
readLen = ce9->cacheBlockSize*4;
} else if (len2 > ce9->cacheBlockSize*2 && slot+2 < ce9->cacheSlots) {
readLen = ce9->cacheBlockSize*3;
} else if (len2 > ce9->cacheBlockSize && slot+1 < ce9->cacheSlots) {
readLen = ce9->cacheBlockSize*2;
}*/
// Write the command
sharedAddr[0] = (vu32)buffer;
sharedAddr[1] = ce9->cacheBlockSize;
sharedAddr[2] = sector;
sharedAddr[3] = commandRead;
dmaReadOnArm7 = true;
IPC_SendSync(0x4);
updateDescriptor(slot, sector);
/*if (readLen >= ce9->cacheBlockSize*2) {
updateDescriptor(slot+1, sector+ce9->cacheBlockSize);
}
if (readLen >= ce9->cacheBlockSize*3) {
updateDescriptor(slot+2, sector+(ce9->cacheBlockSize*2));
}
if (readLen >= ce9->cacheBlockSize*4) {
updateDescriptor(slot+3, sector+(ce9->cacheBlockSize*3));
}
currentSlot = slot;*/
return;
}
#ifdef ASYNCPF
if(cacheCounter[slot] == 0x0FFFFFFF) {
// prefetch successful
getAsyncSector();
triggerAsyncPrefetch(nextSector);
} else {
int i;
for(i=0; i<5; i++) {
if(asyncQueue[i]==sector) {
// prefetch successful
triggerAsyncPrefetch(nextSector);
break;
}
}
}
#endif
updateDescriptor(slot, sector);
u32 len2 = len;
if ((src - sector) + len2 > ce9->cacheBlockSize) {
len2 = sector - src + ce9->cacheBlockSize;
}
/*if (len2 > 512) {
len2 -= src % 4;
len2 -= len2 % 32;
}*/
// Copy via dma
ndmaCopyWordsAsynch(0, (u8*)buffer+(src-sector), dst, len2);
dmaReadOnArm9 = true;
currentLen = len2;
//currentSlot = slot;
IPC_SendSync(0x3);
} else {
//disableIrqMask(IRQ_DMA0 << dma);
//resetRequestIrqMask(IRQ_DMA0 << dma);
//disableDMA(dma);
isDma = false;
endCardReadDma();
}
}
}
void continueCardReadDmaArm7() {
if(dmaReadOnArm7) {
if(!checkArm7()) return;
dmaReadOnArm7 = false;
vu32* volatile cardStruct = (vu32*)ce9->cardStruct0;
u32 src = ((ce9->valueBits & isSdk5) ? dmaParams[3] : cardStruct[0]);
u8* dst = ((ce9->valueBits & isSdk5) ? (u8*)(dmaParams[4]) : (u8*)(cardStruct[1]));
u32 len = ((ce9->valueBits & isSdk5) ? dmaParams[5] : cardStruct[2]);
/* if ((ce9->valueBits & cacheDisabled) && (u32)dst >= 0x02000000 && (u32)dst < 0x03000000) {
endCardReadDma();
} else { */
u32 sector = (src/ce9->cacheBlockSize)*ce9->cacheBlockSize;
u32 len2 = len;
if ((src - sector) + len2 > ce9->cacheBlockSize) {
len2 = sector - src + ce9->cacheBlockSize;
}
/*if (len2 > 512) {
len2 -= src % 4;
len2 -= len2 % 32;
}*/
//vu8* buffer = getCacheAddress(currentSlot);
vu8* buffer = getCacheAddress(getSlotForSector(sector));
// TODO Copy via dma
ndmaCopyWordsAsynch(0, (u8*)buffer+(src-sector), dst, len2);
dmaReadOnArm9 = true;
currentLen = len2;
IPC_SendSync(0x3);
// }
}
}
#endif
void cardSetDma(u32 * params) {
#ifdef TWLSDK
/* u32 src = params[3];
u8* dst = (u8*)params[4];
u32 len = params[5];
disableIrqMask(IRQ_CARD);
disableIrqMask(IRQ_CARD_LINE);
cardRead(NULL, dst, src, len);
endCardReadDma(); */
#else
isDma = true;
dmaDirectRead = false;
vu32* cardStruct = (vu32*)ce9->cardStruct0;
if (ce9->valueBits & isSdk5) {
dmaParams = params;
}
u32 src = ((ce9->valueBits & isSdk5) ? dmaParams[3] : cardStruct[0]);
u8* dst = ((ce9->valueBits & isSdk5) ? (u8*)(dmaParams[4]) : (u8*)(cardStruct[1]));
u32 len = ((ce9->valueBits & isSdk5) ? dmaParams[5] : cardStruct[2]);
// Simulate ROM mirroring
while (src >= ce9->romPaddingSize) {
src -= ce9->romPaddingSize;
}
if (ce9->valueBits & isSdk5) {
dmaParams[3] = src;
} else {
cardStruct[0] = src;
}
#ifndef TWLSDK
dataSplit = false;
#endif
bool romPart = false;
//int romPartNo = 0;
if (!(ce9->valueBits & ROMinRAM)) {
/*for (int i = 0; i < 2; i++) {
if (ce9->romPartSize[i] == 0) {
break;
}
romPart = (src >= ce9->romPartSrc[i] && src < ce9->romPartSrc[i]+ce9->romPartSize[i]);
if (romPart) {
romPartNo = i;
break;
}
}*/
romPart = (ce9->romPartSize > 0 && src >= ce9->romPartSrc && src < ce9->romPartSrc+ce9->romPartSize);
}
if (dmaOn && ((ce9->valueBits & ROMinRAM) || romPart)) {
dmaDirectRead = true;
disableIrqMask(IRQ_CARD);
disableIrqMask(IRQ_CARD_LINE);
enableIPC_SYNC();
// Copy via dma
// ndmaCopyWordsAsynch(0, (u8*)ce9->romLocation/*[romPartNo]*/+src, dst, len);
u32 len2 = 0;
for (int i = 0; i < ce9->romMapLines; i++) {
if (!(src >= ce9->romMap[i][0] && (i == ce9->romMapLines-1 || src < ce9->romMap[i+1][0])))
continue;
u32 newSrc = (ce9->romMap[i][1]-ce9->romMap[i][0])+src;
if (newSrc+len > ce9->romMap[i][2]) {
do {
len--;
len2++;
} while (newSrc+len != ce9->romMap[i][2]);
ndmaCopyWordsAsynch(1, (u8*)newSrc, dst, len);
src += len;
dst += len;
#ifndef TWLSDK
dataSplit = true;
#endif
} else {
ndmaCopyWordsAsynch(0, (u8*)newSrc, dst, len2==0 ? len : len2);
break;
}
}
IPC_SendSync(0x3);
return;
} else if (!dmaOn || ce9->patches->sleepRef || ce9->thumbPatches->sleepRef) {
cardRead(NULL, dst, src, len);
endCardReadDma();
return;
}
#endif
#ifndef TWLSDK
disableIrqMask(IRQ_CARD);
disableIrqMask(IRQ_CARD_LINE);
enableIPC_SYNC();
u32 commandRead=0x025FFB0A;
u32 sector = (src/ce9->cacheBlockSize)*ce9->cacheBlockSize;
//u32 page = (src / 512) * 512;
accessCounter++;
#ifdef ASYNCPF
processAsyncCommand();
#endif
/* if ((ce9->valueBits & cacheDisabled) && (u32)dst >= 0x02000000 && (u32)dst < 0x03000000) {
// Write the command
sharedAddr[0] = (vu32)dst;
sharedAddr[1] = len;
sharedAddr[2] = src;
sharedAddr[3] = commandRead;
dmaReadOnArm7 = true;
IPC_SendSync(0x4);
} else { */
// Read via the main RAM cache
int slot = getSlotForSector(sector);
vu8* buffer = getCacheAddress(slot);
#ifdef ASYNCPF
u32 nextSector = sector+ce9->cacheBlockSize;
#endif
// Read max CACHE_READ_SIZE via the main RAM cache
if (slot == -1) {
#ifdef ASYNCPF
getAsyncSector();
#endif
// Send a command to the ARM7 to fill the RAM cache
slot = allocateCacheSlot();
buffer = getCacheAddress(slot);
//fileRead((char*)buffer, *romFile, sector, ce9->cacheBlockSize);
/*u32 len2 = (src - sector) + len;
u16 readLen = ce9->cacheBlockSize;
if (len2 > ce9->cacheBlockSize*3 && slot+3 < ce9->cacheSlots) {
readLen = ce9->cacheBlockSize*4;
} else if (len2 > ce9->cacheBlockSize*2 && slot+2 < ce9->cacheSlots) {
readLen = ce9->cacheBlockSize*3;
} else if (len2 > ce9->cacheBlockSize && slot+1 < ce9->cacheSlots) {
readLen = ce9->cacheBlockSize*2;
}*/
// Write the command
sharedAddr[0] = (vu32)buffer;
sharedAddr[1] = ce9->cacheBlockSize;
sharedAddr[2] = sector;
sharedAddr[3] = commandRead;
dmaReadOnArm7 = true;
IPC_SendSync(0x4);
updateDescriptor(slot, sector);
/*if (readLen >= ce9->cacheBlockSize*2) {
updateDescriptor(slot+1, sector+ce9->cacheBlockSize);
}
if (readLen >= ce9->cacheBlockSize*3) {
updateDescriptor(slot+2, sector+(ce9->cacheBlockSize*2));
}
if (readLen >= ce9->cacheBlockSize*4) {
updateDescriptor(slot+3, sector+(ce9->cacheBlockSize*3));
}
currentSlot = slot;*/
return;
}
#ifdef ASYNCPF
if(cacheCounter[slot] == 0x0FFFFFFF) {
// prefetch successful
getAsyncSector();
triggerAsyncPrefetch(nextSector);
} else {
int i;
for(i=0; i<5; i++) {
if(asyncQueue[i]==sector) {
// prefetch successful
triggerAsyncPrefetch(nextSector);
break;
}
}
}
#endif
updateDescriptor(slot, sector);
u32 len2 = len;
if ((src - sector) + len2 > ce9->cacheBlockSize) {
len2 = sector - src + ce9->cacheBlockSize;
}
/*if (len2 > 512) {
len2 -= src % 4;
len2 -= len2 % 32;
}*/
// Copy via dma
ndmaCopyWordsAsynch(0, (u8*)buffer+(src-sector), dst, len2);
dmaReadOnArm9 = true;
currentLen = len2;
//currentSlot = slot;
//fixme: why is this needed to make the function work
//there seems to be some timing issue
swiDelay(1);
IPC_SendSync(0x3);
// }
#endif
}
#else
void cardSetDma(u32 * params) {
#ifdef TWLSDK
/* u32 src = params[3];
u8* dst = (u8*)params[4];
u32 len = params[5]; */
#else
isDma = true;
dmaDirectRead = false;
vu32* volatile cardStruct = (vu32*)ce9->cardStruct0;
u32 src = ((ce9->valueBits & isSdk5) ? params[3] : cardStruct[0]);
u8* dst = ((ce9->valueBits & isSdk5) ? (u8*)(params[4]) : (u8*)(cardStruct[1]));
u32 len = ((ce9->valueBits & isSdk5) ? params[5] : cardStruct[2]);
disableIrqMask(IRQ_CARD);
disableIrqMask(IRQ_CARD_LINE);
cardRead(NULL, dst, src, len);
endCardReadDma();
#endif
}
#endif
extern bool isNotTcm(u32 address, u32 len);
u32 cardReadDma(u32 dma0, u8* dst0, u32 src0, u32 len0) {
#ifndef TWLSDK
vu32* volatile cardStruct = (vu32*)ce9->cardStruct0;
u32 src = ((ce9->valueBits & isSdk5) ? src0 : cardStruct[0]);
u8* dst = ((ce9->valueBits & isSdk5) ? dst0 : (u8*)(cardStruct[1]));
u32 len = ((ce9->valueBits & isSdk5) ? len0 : cardStruct[2]);
u32 dma = ((ce9->valueBits & isSdk5) ? dma0 : cardStruct[3]); // dma channel
if(dma >= 0
&& dma <= 3
//&& func != NULL
&& len > 0
&& !(((u32)dst) & ((ce9->valueBits & isSdk5) ? 31 : 3))
&& isNotTcm((u32)dst, len)
// check 512 bytes page alignement
&& !(len & 511)
&& !(src & 511)
) {
isDma = true;
if (ce9->patches->cardEndReadDmaRef || ce9->thumbPatches->cardEndReadDmaRef) {
// new dma method
if (!(ce9->valueBits & isSdk5)) {
cacheFlush();
cardSetDma(NULL);
}
return true;
}
} /*else {
dma=4;
clearIcache();
}*/
#endif
return false;
}
```
|
Charles Calvin Ziegler (1854–1930) was a German-American poet from Rebersburg, Pennsylvania. His native language was Pennsylvania Dutch, and although he learned English in school, he wrote his poetry in the Pennsylvania Dutch language. He is said to have been the most accomplished poet to write in that language, and may have written the only Pennsylvania Dutch sonnet on record.
Background
Ziegler was born on a farm to a 39-year-old mother and an even older father. He had at least one older brother, Samuel, who moved to Iowa when Calvin was still young. When he was sixteen, Calvin went to join Samuel, eventually graduating from the University of Iowa in 1878. He returned to Rebersburg and pursued teaching, but left again in October 1881 to study Greek and Latin at Harvard. In 1884 he graduated magna cum laude and returned briefly to Rebersburg. He left for good in 1885, settling in St. Louis.
Ziegler published a volume of poems, Drauss un Deheem, with a Leipzig publisher in 1891, helping spur a revival of Pennsylvania Dutch literature. Included in the volume is a set of nineteen poems devoted to his mother, who died weeks before his Harvard graduation. The volume was republished in Pennsylvania in 1936, with some additional material.
References
1854 births
1930 deaths
Poets from Pennsylvania
University of Iowa alumni
Harvard College alumni
People from Centre County, Pennsylvania
American people of German descent
|
```html
<html lang="en">
<head>
<title>Convenience Vars - Debugging with GDB</title>
<meta http-equiv="Content-Type" content="text/html">
<meta name="description" content="Debugging with GDB">
<meta name="generator" content="makeinfo 4.11">
<link title="Top" rel="start" href="index.html#Top">
<link rel="up" href="Data.html#Data" title="Data">
<link rel="prev" href="Value-History.html#Value-History" title="Value History">
<link rel="next" href="Convenience-Funs.html#Convenience-Funs" title="Convenience Funs">
<link href="path_to_url" rel="generator-home" title="Texinfo Homepage">
<!--
Permission is granted to copy, distribute and/or modify this document
under the terms of the GNU Free Documentation License, Version 1.3 or
any later version published by the Free Software Foundation; with the
Invariant Sections being ``Free Software'' and ``Free Software Needs
Free Documentation'', with the Front-Cover Texts being ``A GNU Manual,''
and with the Back-Cover Texts as in (a) below.
(a) The FSF's Back-Cover Text is: ``You are free to copy and modify
this GNU Manual. Buying copies from GNU Press supports the FSF in
developing GNU and promoting software freedom.''
-->
<meta http-equiv="Content-Style-Type" content="text/css">
<style type="text/css"><!--
pre.display { font-family:inherit }
pre.format { font-family:inherit }
pre.smalldisplay { font-family:inherit; font-size:smaller }
pre.smallformat { font-family:inherit; font-size:smaller }
pre.smallexample { font-size:smaller }
pre.smalllisp { font-size:smaller }
span.sc { font-variant:small-caps }
span.roman { font-family:serif; font-weight:normal; }
span.sansserif { font-family:sans-serif; font-weight:normal; }
--></style>
</head>
<body>
<div class="node">
<p>
<a name="Convenience-Vars"></a>
Next: <a rel="next" accesskey="n" href="Convenience-Funs.html#Convenience-Funs">Convenience Funs</a>,
Previous: <a rel="previous" accesskey="p" href="Value-History.html#Value-History">Value History</a>,
Up: <a rel="up" accesskey="u" href="Data.html#Data">Data</a>
<hr>
</div>
<h3 class="section">10.11 Convenience Variables</h3>
<p><a name="index-convenience-variables-678"></a><a name="index-user_002ddefined-variables-679"></a><span class="sc">gdb</span> provides <dfn>convenience variables</dfn> that you can use within
<span class="sc">gdb</span> to hold on to a value and refer to it later. These variables
exist entirely within <span class="sc">gdb</span>; they are not part of your program, and
setting a convenience variable has no direct effect on further execution
of your program. That is why you can use them freely.
<p>Convenience variables are prefixed with ‘<samp><span class="samp">$</span></samp>’. Any name preceded by
‘<samp><span class="samp">$</span></samp>’ can be used for a convenience variable, unless it is one of
the predefined machine-specific register names (see <a href="Registers.html#Registers">Registers</a>).
(Value history references, in contrast, are <em>numbers</em> preceded
by ‘<samp><span class="samp">$</span></samp>’. See <a href="Value-History.html#Value-History">Value History</a>.)
<p>You can save a value in a convenience variable with an assignment
expression, just as you would set a variable in your program.
For example:
<pre class="smallexample"> set $foo = *object_ptr
</pre>
<p class="noindent">would save in <code>$foo</code> the value contained in the object pointed to by
<code>object_ptr</code>.
<p>Using a convenience variable for the first time creates it, but its
value is <code>void</code> until you assign a new value. You can alter the
value with another assignment at any time.
<p>Convenience variables have no fixed types. You can assign a convenience
variable any type of value, including structures and arrays, even if
that variable already has a value of a different type. The convenience
variable, when used as an expression, has the type of its current value.
<a name="index-show-convenience-680"></a>
<a name="index-show-all-user-variables-and-functions-681"></a>
<dl><dt><code>show convenience</code><dd>Print a list of convenience variables used so far, and their values,
as well as a list of the convenience functions.
Abbreviated <code>show conv</code>.
<p><a name="index-init_002dif_002dundefined-682"></a><a name="index-convenience-variables_002c-initializing-683"></a><br><dt><code>init-if-undefined $</code><var>variable</var><code> = </code><var>expression</var><dd>Set a convenience variable if it has not already been set. This is useful
for user-defined commands that keep some state. It is similar, in concept,
to using local static variables with initializers in C (except that
convenience variables are global). It can also be used to allow users to
override default values used in a command script.
<p>If the variable is already defined then the expression is not evaluated so
any side-effects do not occur.
</dl>
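<p>For instance, a user-defined command might rely on <code>init-if-undefined</code> to initialize its state only the first time it runs (a hypothetical sketch; the variable name <code>$my_count</code> is purely illustrative):
<pre class="smallexample">     init-if-undefined $my_count = 0
     set $my_count = $my_count + 1
     print $my_count
</pre>
<p class="noindent">The first line takes effect only on the first invocation; subsequent invocations leave <code>$my_count</code> untouched and simply increment it.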
<p>One of the ways to use a convenience variable is as a counter to be
incremented or a pointer to be advanced. For example, to print
a field from successive elements of an array of structures:
<pre class="smallexample"> set $i = 0
print bar[$i++]->contents
</pre>
<p class="noindent">Repeat that command by typing &lt;RET&gt;.
<p>Some convenience variables are created automatically by <span class="sc">gdb</span> and given
values likely to be useful.
<a name="your_sha256_hash4"></a>
<dl><dt><code>$_</code><dd>The variable <code>$_</code> is automatically set by the <code>x</code> command to
the last address examined (see <a href="Memory.html#Memory">Examining Memory</a>). Other
commands which provide a default address for <code>x</code> to examine also
set <code>$_</code> to that address; these commands include <code>info line</code>
and <code>info breakpoint</code>. The type of <code>$_</code> is <code>void *</code>
except when set by the <code>x</code> command, in which case it is a pointer
to the type of <code>$__</code>.
<p><a name="your_sha256_hash7d-685"></a><br><dt><code>$__</code><dd>The variable <code>$__</code> is automatically set by the <code>x</code> command
to the value found in the last address examined. Its type is chosen
to match the format in which the data was printed.
<br><dt><code>$_exitcode</code><dd><a name="your_sha256_hash_007d-686"></a>When the program being debugged terminates normally, <span class="sc">gdb</span>
automatically sets this variable to the exit code of the program, and
resets <code>$_exitsignal</code> to <code>void</code>.
<br><dt><code>$_exitsignal</code><dd><a name="your_sha256_hashle_007d-687"></a>When the program being debugged dies due to an uncaught signal,
<span class="sc">gdb</span> automatically sets this variable to that signal's number,
and resets <code>$_exitcode</code> to <code>void</code>.
<p>To distinguish between whether the program being debugged has exited
(i.e., <code>$_exitcode</code> is not <code>void</code>) or signalled (i.e.,
<code>$_exitsignal</code> is not <code>void</code>), the convenience function
<code>$_isvoid</code> can be used (see <a href="Convenience-Funs.html#Convenience-Funs">Convenience Functions</a>). For example, considering the following source code:
<pre class="smallexample"> #include &lt;signal.h&gt;
int
main (int argc, char *argv[])
{
raise (SIGALRM);
return 0;
}
</pre>
<p>A valid way of telling whether the program being debugged has exited
or signalled would be:
<pre class="smallexample"> (gdb) define has_exited_or_signalled
Type commands for definition of ``has_exited_or_signalled''.
End with a line saying just ``end''.
>if $_isvoid ($_exitsignal)
>echo The program has exited\n
>else
>echo The program has signalled\n
>end
>end
(gdb) run
Starting program:
Program terminated with signal SIGALRM, Alarm clock.
The program no longer exists.
(gdb) has_exited_or_signalled
The program has signalled
</pre>
<p>As can be seen, <span class="sc">gdb</span> correctly informs that the program being
debugged has signalled, since it calls <code>raise</code> and raises a
<code>SIGALRM</code> signal. If the program being debugged had not called
<code>raise</code>, then <span class="sc">gdb</span> would report a normal exit:
<pre class="smallexample"> (gdb) has_exited_or_signalled
The program has exited
</pre>
<br><dt><code>$_exception</code><dd>The variable <code>$_exception</code> is set to the exception object being
thrown at an exception-related catchpoint. See <a href="Set-Catchpoints.html#Set-Catchpoints">Set Catchpoints</a>.
<br><dt><code>$_probe_argc</code><dt><code>$_probe_arg0...$_probe_arg11</code><dd>Arguments to a static probe. See <a href="Static-Probe-Points.html#Static-Probe-Points">Static Probe Points</a>.
<br><dt><code>$_sdata</code><dd><a name="your_sha256_hashe-variable_007d-688"></a>The variable <code>$_sdata</code> contains extra collected static tracepoint
data. See <a href="Tracepoint-Actions.html#Tracepoint-Actions">Tracepoint Action Lists</a>. Note that
<code>$_sdata</code> could be empty, if not inspecting a trace buffer, or
if extra static tracepoint data has not been collected.
<br><dt><code>$_siginfo</code><dd><a name="your_sha256_hash007d-689"></a>The variable <code>$_siginfo</code> contains extra signal information
(see <a href="extra-signal-information.html#extra-signal-information">extra signal information</a>). Note that <code>$_siginfo</code>
could be empty, if the application has not yet received any signals.
For example, it will be empty before you execute the <code>run</code> command.
<br><dt><code>$_tlb</code><dd><a name="your_sha256_hash-690"></a>The variable <code>$_tlb</code> is automatically set when debugging
applications running on MS-Windows in native mode or connected to
gdbserver that supports the <code>qGetTIBAddr</code> request.
See <a href="General-Query-Packets.html#General-Query-Packets">General Query Packets</a>.
This variable contains the address of the thread information block.
</dl>
<p>On HP-UX systems, if you refer to a function or variable name that
begins with a dollar sign, <span class="sc">gdb</span> searches for a user or system
name first, before it searches for a convenience variable.
</body></html>
```
|
A data stream management system (DSMS) is a computer software system to manage continuous data streams. It is similar to a database management system (DBMS), which is, however, designed for static data in conventional databases. A DBMS also offers flexible query processing so that the information needed can be expressed using queries. However, in contrast to a DBMS, a DSMS executes a continuous query that is not only performed once, but is permanently installed. Therefore, the query is continuously executed until it is explicitly uninstalled. Since most DSMS are data-driven, a continuous query produces new results as long as new data arrive at the system. This basic concept is similar to complex event processing, and the two technologies are partially converging.
Functional principle
One important feature of a DSMS is the possibility to handle potentially infinite and rapidly changing data streams by offering flexible processing at the same time, although there are only limited resources such as main memory. The following table provides various principles of DSMS and compares them to traditional DBMS.
Processing and streaming models
One of the biggest challenges for a DSMS is to handle potentially infinite data streams using a fixed amount of memory and no random access to the data. There are different approaches to limiting the amount of data in one pass, which can be divided into two classes. On the one hand, there are compression techniques that try to summarize the data; on the other hand, there are window techniques that partition the data into (finite) parts.
Synopses
The idea behind compression techniques is to maintain only a synopsis of the data, but not all (raw) data points of the data stream. The algorithms range from selecting random data points, called sampling, to summarization using histograms, wavelets or sketching. One simple example of a compression is the continuous calculation of an average. Instead of storing each data point, the synopsis holds only the sum and the number of items. The average can be calculated by dividing the sum by the count. However, it should be mentioned that a synopsis cannot reflect the data accurately. Thus, processing based on synopses may produce inaccurate results.
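The running-average synopsis described above can be sketched in a few lines of Python (the class name is illustrative, not taken from any particular DSMS):

```python
class AverageSynopsis:
    """Constant-memory synopsis: keeps only a running sum and a count,
    not the individual data points of the stream."""

    def __init__(self):
        self.total = 0.0
        self.count = 0

    def add(self, value):
        # Each arriving stream element updates the synopsis in O(1) space.
        self.total += value
        self.count += 1

    def average(self):
        return self.total / self.count if self.count else None
```

The synopsis answers the average query exactly here, but for aggregates such as medians or quantiles a synopsis of fixed size can only approximate the true answer.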
Windows
Instead of using synopses to compress the characteristics of the whole data stream, window techniques only look at a portion of the data. This approach is motivated by the idea that only the most recent data are relevant. Therefore, a window continuously cuts out a part of the data stream, e.g. the last ten data stream elements, and only considers these elements during processing. There are different kinds of such windows, like sliding windows, which are similar to FIFO lists, or tumbling windows, which cut out disjoint parts. Furthermore, windows can be differentiated into element-based windows, e.g. considering the last ten elements, and time-based windows, e.g. considering the last ten seconds of data. There are also different approaches to implementing windows: for example, approaches that use timestamps or time intervals for system-wide windows, or buffer-based windows for each single processing step. Sliding-window query processing is also suitable for implementation on parallel processors by exploiting parallelism between different windows and/or within each window extent.
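An element-based sliding window can be sketched with a bounded deque; the window size and the aggregate computed over it are illustrative choices:

```python
from collections import deque

class SlidingWindowAverage:
    """Element-based sliding window: only the last `size` stream
    elements are kept; older elements are evicted automatically."""

    def __init__(self, size):
        # deque with maxlen behaves like a FIFO list, matching a sliding window
        self.window = deque(maxlen=size)

    def add(self, value):
        self.window.append(value)  # evicts the oldest element when full

    def average(self):
        return sum(self.window) / len(self.window) if self.window else None
```

A tumbling window would instead clear the buffer every `size` elements, so consecutive windows never share elements.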
Query processing
Since there are a lot of prototypes, there is no standardized architecture. However, most DSMSs are based on the query processing of DBMSs, using declarative languages to express queries, which are translated into a plan of operators. These plans can be optimized and executed. Query processing often consists of the following steps.
Formulation of continuous queries
The formulation of queries is mostly done using declarative languages like SQL in DBMS. Since there are no standardized query languages to express continuous queries, there are a lot of languages and variations. However, most of them are based on SQL, such as the Continuous Query Language (CQL), StreamSQL and ESP. There are also graphical approaches where each processing step is a box and the processing flow is expressed by arrows between the boxes.
The language strongly depends on the processing model. For example, if windows are used for the processing, the definition of a window has to be expressed. In StreamSQL, a query with a sliding window over the last 10 elements looks as follows:
SELECT AVG(price) FROM examplestream [SIZE 10 ADVANCE 1 TUPLES] WHERE value > 100.0
This query continuously calculates the average value of "price" over the last 10 tuples, but only considers those tuples whose "value" is greater than 100.0.
In the next step, the declarative query is translated into a logical query plan. A query plan is a directed graph where the nodes are operators and the edges describe the processing flow. Each operator in the query plan encapsulates the semantics of a specific operation, such as filtering or aggregation. In DSMSs that process relational data streams, the operators are equal or similar to the operators of relational algebra, so there are operators for selection, projection, join, and set operations. This operator concept makes the processing of a DSMS very flexible and versatile.
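A logical query plan can be represented as a small directed graph of operator nodes. The sketch below (operator names are illustrative) models the StreamSQL query shown earlier as a source feeding a selection, a window, and an aggregation:

```python
class Operator:
    """A node in the logical query plan; edges are the `inputs` list."""
    def __init__(self, name, inputs=()):
        self.name = name
        self.inputs = list(inputs)

# Plan for: SELECT AVG(price) FROM examplestream [SIZE 10 ...] WHERE value > 100.0
source = Operator("source:examplestream")
select = Operator("select:value>100.0", [source])
window = Operator("window:last-10-tuples", [select])
aggregate = Operator("aggregate:avg(price)", [window])

def plan_order(root):
    """List operators bottom-up, i.e. in the order data flows through them."""
    order = []
    def visit(op):
        for child in op.inputs:
            visit(child)
        order.append(op.name)
    visit(root)
    return order
```

Because the plan is just a graph of abstract operators, an optimizer can rewrite it (e.g. reorder nodes) before any algorithm is chosen.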
Optimization of queries
The logical query plan can be optimized, which strongly depends on the streaming model. The basic concepts for optimizing continuous queries are the same as those in database systems. If there are relational data streams and the logical query plan is based on relational operators from relational algebra, a query optimizer can use algebraic equivalences to optimize the plan. One example is to push selection operators down toward the sources, because they are not as computationally intensive as join operators.
Furthermore, there are also cost-based optimization techniques as in a DBMS, where the query plan with the lowest cost is chosen from among equivalent query plans. One example is choosing the order of two successive join operators. In a DBMS, this decision is mostly based on statistics of the involved databases. But since the data of a data stream is unknown in advance, there are no such statistics in a DSMS. However, it is possible to observe a data stream for a certain time to obtain some statistics, and using these statistics, the query can also be optimized later. So, in contrast to a DBMS, some DSMSs allow a query to be optimized even at runtime. For this, a DSMS needs plan migration strategies to replace a running query plan with a new one.
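Why pushing a selection below a join pays off can be seen by counting the tuple pairs each plan inspects. The sketch below compares the two plans on the same input; the data and predicate are purely illustrative:

```python
def join_then_filter(left, right, pred):
    """Naive plan: join first, filter afterwards. Inspects |L| * |R| pairs."""
    pairs = [(l, r) for l in left for r in right if l["k"] == r["k"]]
    return [p for p in pairs if pred(p[0])], len(left) * len(right)

def filter_then_join(left, right, pred):
    """Optimized plan: selection pushed below the join, so the join
    only inspects pairs involving tuples that survive the filter."""
    filtered = [l for l in left if pred(l)]
    pairs = [(l, r) for l in filtered for r in right if l["k"] == r["k"]]
    return pairs, len(filtered) * len(right)
```

Both plans return the same result, which is exactly what the algebraic equivalence guarantees; only the amount of work differs.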
Transformation of queries
Since a logical operator is only responsible for the semantics of an operation but does not specify an algorithm, the logical query plan must be transformed into an executable counterpart, called a physical query plan. The distinction between a logical and a physical operator plan allows more than one implementation for the same logical operator. The join, for example, is logically the same, although it can be implemented by different algorithms like a nested loop join or a sort-merge join. Note that these algorithms also strongly depend on the stream and processing model used.
Finally, the query is available as a physical query plan.
Execution of queries
Since the physical query plan consists of executable algorithms, it can be directly executed. For this, the physical query plan is installed into the system. The bottom of the graph (of the query plan) is connected to the incoming sources, which can be anything, such as connectors to sensors. The top of the graph is connected to the outgoing sinks, which may be, for example, a visualization. Since most DSMSs are data-driven, a query is executed by pushing the incoming data elements from the source through the query plan to the sink. Each time a data element passes an operator, the operator performs its specific operation on the data element and forwards the result to all successive operators.
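The push-based execution described above can be sketched as physical operators that forward each processed element to their successors; the operator set and wiring are illustrative:

```python
class Filter:
    """Physical operator: forwards an element to all successors
    only if it satisfies the predicate."""
    def __init__(self, pred, successors):
        self.pred = pred
        self.successors = successors

    def push(self, elem):
        if self.pred(elem):
            for op in self.successors:
                op.push(elem)

class Sink:
    """Terminal operator, e.g. a visualization; here it just collects."""
    def __init__(self):
        self.received = []

    def push(self, elem):
        self.received.append(elem)

# Wire a tiny plan: source -> filter(> 100.0) -> sink, then push data in.
sink = Sink()
plan = Filter(lambda x: x > 100.0, [sink])
for elem in [50.0, 150.0, 99.0, 200.0]:  # the incoming stream
    plan.push(elem)
```

Note that the data drive the computation: no operator ever polls its input; results appear at the sink as a side effect of elements arriving at the source.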
Examples
AURORA, StreamBase Systems, Inc.
Hortonworks DataFlow
IBM Streams
NIAGARA Query Engine
NiagaraST: A Research Data Stream Management System at Portland State University
Odysseus, an open source Java-based framework for Data Stream Management Systems
Pipeline DB
PIPES, webMethods Business Events
QStream
SAS Event Stream Processing
SQLstream
STREAM
StreamGlobe
StreamInsight
TelegraphCQ
WSO2 Stream Processor
See also
Complex Event Processing
Event stream processing
Relational data stream management system
References
External links
Processing Flows of Information: From Data Stream to Complex Event Processing - Survey article on Data Stream and Complex Event Processing Systems
Stream processing with SQL - Introduction to streaming data management with SQL
Big data
Data management
|
```javascript
const x256 = require('x256');
// Translate an [r, g, b] array into an xterm-256 color code; values that
// are already color codes are passed through unchanged.
exports.getColorCode = function (color) {
  if (Array.isArray(color) && color.length === 3) {
    return x256(color[0], color[1], color[2]);
  } else {
    return color;
  }
}
// Return the element of `array` for which `iteratee(element)` is largest,
// ignoring elements whose computed value is null or undefined.
exports.arrayMax = function (array, iteratee) {
  let index = -1;
  const length = array.length;
  let computed, result;
  while (++index < length) {
    const value = array[index];
    const current = iteratee(value);
    // `current === current` filters out NaN on the first accepted value.
    if (current != null && (computed === undefined ? current === current : current > computed)) {
      computed = current;
      result = value;
    }
  }
  return result;
}
```
|
```go
/*
path_to_url
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
*/
// This file was automatically generated by lister-gen
package v1beta1
import (
"k8s.io/apimachinery/pkg/api/errors"
"k8s.io/apimachinery/pkg/labels"
v1beta1 "k8s.io/client-go/pkg/apis/extensions/v1beta1"
"k8s.io/client-go/tools/cache"
)
// ReplicaSetLister helps list ReplicaSets.
type ReplicaSetLister interface {
// List lists all ReplicaSets in the indexer.
List(selector labels.Selector) (ret []*v1beta1.ReplicaSet, err error)
// ReplicaSets returns an object that can list and get ReplicaSets.
ReplicaSets(namespace string) ReplicaSetNamespaceLister
ReplicaSetListerExpansion
}
// replicaSetLister implements the ReplicaSetLister interface.
type replicaSetLister struct {
indexer cache.Indexer
}
// NewReplicaSetLister returns a new ReplicaSetLister.
func NewReplicaSetLister(indexer cache.Indexer) ReplicaSetLister {
return &replicaSetLister{indexer: indexer}
}
// List lists all ReplicaSets in the indexer.
func (s *replicaSetLister) List(selector labels.Selector) (ret []*v1beta1.ReplicaSet, err error) {
err = cache.ListAll(s.indexer, selector, func(m interface{}) {
ret = append(ret, m.(*v1beta1.ReplicaSet))
})
return ret, err
}
// ReplicaSets returns an object that can list and get ReplicaSets.
func (s *replicaSetLister) ReplicaSets(namespace string) ReplicaSetNamespaceLister {
return replicaSetNamespaceLister{indexer: s.indexer, namespace: namespace}
}
// ReplicaSetNamespaceLister helps list and get ReplicaSets.
type ReplicaSetNamespaceLister interface {
// List lists all ReplicaSets in the indexer for a given namespace.
List(selector labels.Selector) (ret []*v1beta1.ReplicaSet, err error)
// Get retrieves the ReplicaSet from the indexer for a given namespace and name.
Get(name string) (*v1beta1.ReplicaSet, error)
ReplicaSetNamespaceListerExpansion
}
// replicaSetNamespaceLister implements the ReplicaSetNamespaceLister
// interface.
type replicaSetNamespaceLister struct {
indexer cache.Indexer
namespace string
}
// List lists all ReplicaSets in the indexer for a given namespace.
func (s replicaSetNamespaceLister) List(selector labels.Selector) (ret []*v1beta1.ReplicaSet, err error) {
err = cache.ListAllByNamespace(s.indexer, s.namespace, selector, func(m interface{}) {
ret = append(ret, m.(*v1beta1.ReplicaSet))
})
return ret, err
}
// Get retrieves the ReplicaSet from the indexer for a given namespace and name.
func (s replicaSetNamespaceLister) Get(name string) (*v1beta1.ReplicaSet, error) {
obj, exists, err := s.indexer.GetByKey(s.namespace + "/" + name)
if err != nil {
return nil, err
}
if !exists {
return nil, errors.NewNotFound(v1beta1.Resource("replicaset"), name)
}
return obj.(*v1beta1.ReplicaSet), nil
}
```
|
```typescript
/* This setup is just to install the source-map-support, so we'll get TypeScript stack traces
for the tests */
import * as fs from "fs";
// source-map-support has a bug that crashes some of our tests. We fix it: path_to_url
const moduleLocation = require.resolve("source-map-support/source-map-support.js");
const originalCode = "column -= 62;";
const replacementCode = "if (column > 63) { column -= 62; }";
const contents = fs.readFileSync(moduleLocation, "utf8");
const fixedContents = contents.replace(originalCode, replacementCode);
fs.writeFileSync(moduleLocation, fixedContents);
// Then we load the module
import * as sourceMapSupport from "source-map-support";
sourceMapSupport.install(); // Enable stack traces translation to typescript
```
|
The Sumatran slender gecko (Hemiphyllodactylus margarethae) is a species of gecko. It is endemic to Sumatra.
References
Hemiphyllodactylus
Endemic fauna of Indonesia
Reptiles of Indonesia
Fauna of Sumatra
Reptiles described in 1931
Taxa named by Leo Brongersma
|
The International Islamic Unity Conference is an international conference which is held every year during the Islamic Unity week in Tehran, Iran, and is organized by "the World Forum for Proximity of Islamic Schools of Thought". The main aim of the conference is to devise ways of achieving unity in the Islamic world, and to foster consultation among scholars and scientists in order to bring their viewpoints closer together in the intended fields. The subjects of its meetings are determined by the various issues and problems of the Islamic world. The meeting opens with the attendance of many scholars, religious thinkers and prominent figures from diverse countries.
Background
The First International Islamic Unity Conference was inaugurated in 1987 in Tehran, Iran. After the 4th Islamic unity conference, which was organized by the Islamic Promotions Organization of Iran, Ayatollah Khamenei, as the supreme leader of Iran, ordered the establishment of "the World Forum for Proximity of Islamic Schools of Thought"; since then, the conference has been organized by that forum.
Members/Guests
A number of well-known figures from all over the globe take part in the event. On the whole, the participants (as members or guests) include Islamic scholars, ministers of Islamic countries, clerics, intellectuals, representatives of scientific and cultural organizations, and other significant individuals from different countries of the world. For instance, at the 29th Islamic unity conference there were over 600 prominent figures from Iran (as the host) and 70 countries from around the globe who participated in the meeting.
The 30th conference
The 30th Islamic unity conference was opened with a speech by President Rouhani under the banner of "Unity, Necessity of Countering Takfiri Movements" in Tehran on 15 December 2016, with participants from different countries, such as China, Iraq, the UK, Malaysia, Indonesia, Lebanon, Thailand, Australia and Russia. Domestic and foreign guests, several scholars, and 220 famous figures from different countries of the world took part in the conference. Additionally, the 31st International Islamic Unity Conference was held on 4–6 December 2017 in Tehran.
See also
Outline of Islam
Glossary of Islam
Index of Islam-related articles
References
Islam and politics
Shia–Sunni relations
|
The grey-hooded attila (Attila rufus) is a species of bird in the family Tyrannidae, the tyrant flycatchers. It is endemic to Brazil.
The grey-hooded attila occurs in a coastal strip along Brazil's southeast Atlantic coast. Its natural habitats are subtropical or tropical moist lowland forest and subtropical or tropical moist montane forest.
References
External links
Grey-hooded attila videos on the Internet Bird Collection
"Gray-hooded attila" photo gallery VIREO Photo-High Res
Attila (genus)
Birds of the Atlantic Forest
Endemic birds of Brazil
Birds described in 1819
Taxa named by Louis Pierre Vieillot
Taxonomy articles created by Polbot
|
```css
/*********************
* list cell and row *
*********************/
/*
Contributor notes:
Please use two space indentions.
Stack all related and child selectors and selector states into a logical hierarchy to a readable degree.
Make sure that all changes made here are part of the list cell and row.
*/
cell:not(check):not(radio),
row:not(check):not(radio) {
  border: solid transparent;
  border-width: 0px 0px 1px 0px; }
list > row label,
list > row image {
padding-left: 4px;
padding-right: 3px; }
cell:selected,
cell:selected:focus,
row:selected,
row:selected:hover,
row:selected:focus {
background-color: @theme_selected_bg_color;
outline-width: 1px;
outline-offset: 0px; }
/*list > separator {
border: 0px;
min-height: 0px;
min-width: 0px;
}*/
/* Exception for gnome-tweak-tool */
list.tweak-group {
padding: 8px; }
```
|
```typescript
import { FontSizes, FontWeights, DefaultPalette } from 'office-ui-fabric-react/lib/Styling';
import { CommunicationColors } from '@uifabric/fluent-theme/lib/fluent/FluentColors';
import {
IStackStyles,
IStackTokens,
IStackItemStyles,
ITextFieldStyles,
ITextFieldSubComponentStyles,
IModalStyles,
IDatePickerStyles,
ITextFieldProps,
IStyle,
IButtonStyles,
IDropdownStyles,
} from 'office-ui-fabric-react';
// Styles definition
export const stackStyles: IStackStyles = {
root: {
alignItems: 'center',
marginTop: 10
}
};
export const stackItemStyles: IStackItemStyles = {
root: {
padding: 5,
display: 'flex',
width: 172,
height: 32,
fontWeight: FontWeights.regular,
}
};
export const stackTokens: IStackTokens = {
childrenGap: 10,
};
export const textFielStartDateDatePickerStyles: ITextFieldProps = {
styles: {
field: { backgroundColor: `${DefaultPalette.neutralLighter} !important` },
root: {},
wrapper: { },
subComponentStyles: undefined
}
};
export const textFielDueDateDatePickerStyles: ITextFieldProps = {
styles: {
field: { backgroundColor: `${DefaultPalette.neutralLighter} !important` },
root: {},
fieldGroup: { },
wrapper: {},
subComponentStyles: { } as ITextFieldSubComponentStyles
}
};
export const textFieldDescriptionStyles: ITextFieldStyles = {
field: { backgroundColor: DefaultPalette.neutralLighter },
root: {},
description: {},
errorMessage: {},
fieldGroup: {},
icon: {},
prefix: {},
suffix: {},
wrapper: {},
subComponentStyles: undefined
};
export const textFieldCheckListItem: ITextFieldStyles = {
field: { selectors:{ [':hover']: { backgroundColor: DefaultPalette.neutralLighter}}},
root: { width:550,},
description: {},
errorMessage: {},
fieldGroup: {},
icon: {},
prefix: { backgroundColor: 'white'},
suffix: { backgroundColor: 'white'},
wrapper: {},
subComponentStyles: undefined,
};
export const textFieldStylesTaskName: ITextFieldStyles = {
field: { backgroundColor: `${DefaultPalette.neutralLighter} !important`},
root: {},
description: {},
errorMessage: {},
fieldGroup: {},
icon: {},
prefix: {},
suffix: {},
wrapper: {selectors:{ [':hover']: { borderWidth: 1,borderStyle:'solid', borderColor: DefaultPalette.themePrimary}}},
subComponentStyles: undefined
};
export const modalStyles: IModalStyles = {
main: { minWidth: 400 ,maxWidth: 450, },
root: {},
keyboardMoveIcon: {},
keyboardMoveIconContainer: {},
layer: {},
scrollableContent: {}
};
export const datePickerStartDateStyles: IDatePickerStyles = {
callout: {},
icon: {},
root: { marginTop:0, backgroundColor: '#f4f4f4'},
textField: { backgroundColor: '#f4f4f4', borderWidth:0}
};
export const textFieldStylesdatePicker: ITextFieldProps = {
style: { display: 'flex', justifyContent: 'flex-start', marginLeft: 15 , backgroundColor: '#f4f4f4'},
iconProps: { style: { left: 0 } }
};
export const peoplePicker: IStyle = {
backgroundColor: DefaultPalette.neutralLighter
};
export const addMemberButton: IButtonStyles = {
root: { marginLeft: 0, paddingLeft: 0, marginTop: 0, fontSize: FontSizes.medium,width:26 },
textContainer: {
fontSize: FontSizes.medium,
fontWeight: 'normal',
color: '#666666',
marginLeft: 5
}
};
export const dropDownBucketStyles: IDropdownStyles = {
root: { margin: 0 } ,
title: {backgroundColor: '#f4f4f4', borderWidth:0},
callout: {},
caretDown: {},
caretDownWrapper: {},
dropdown:{},
dropdownDivider: {},
dropdownItem: {},
dropdownItemDisabled: {},
dropdownItemHeader:{},
dropdownItemHidden: {},
dropdownItemSelected:{},
dropdownItemSelectedAndDisabled:{},
dropdownItems:{},
dropdownItemsWrapper:{},
dropdownOptionText:{},
errorMessage:{},
label:{},
panel:{},
subComponentStyles: undefined,
};
export const dropDownProgressStyles: IDropdownStyles = {
root: { margin: 0 } ,
title: {backgroundColor: '#f4f4f4', borderWidth:0},
callout: {},
caretDown: {},
caretDownWrapper: {},
dropdown:{},
dropdownDivider: {},
dropdownItem: {},
dropdownItemDisabled: {},
dropdownItemHeader:{},
dropdownItemHidden: {},
dropdownItemSelected:{},
dropdownItemSelectedAndDisabled:{},
dropdownItems:{},
dropdownItemsWrapper:{},
dropdownOptionText:{},
errorMessage:{},
label:{},
panel:{},
subComponentStyles: undefined,
};
```
|
Makrem Jerou (born 14 May 1976) is a Tunisian handball player. He competed in the men's tournament at the 2000 Summer Olympics.
References
1976 births
Living people
Tunisian male handball players
Olympic handball players for Tunisia
Handball players at the 2000 Summer Olympics
Place of birth missing (living people)
|
Richard Earl Adkins (March 3, 1920 – September 12, 1955) was a professional baseball player. He was a shortstop for one season (1942) with the Philadelphia Athletics. For his career, he compiled a .143 batting average in 7 at-bats.
He was born in Electra, Texas, and died there at the age of 35 after a short battle with cancer.
References
External links
1920 births
1955 deaths
Philadelphia Athletics players
Major League Baseball shortstops
Baseball players from Texas
Wilmington Blue Rocks (1940–1952) players
Birmingham Barons players
Natchez Indians players
People from Electra, Texas
Sportspeople from Wichita County, Texas
Deaths from cancer in Texas
Clovis Pioneers players
Newport News Pilots players
Minor league baseball managers
|
```html
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=US-ASCII">
<title>Class template impl_template</title>
<link rel="stylesheet" href="../../../../../../../../doc/src/boostbook.css" type="text/css">
<meta name="generator" content="DocBook XSL Stylesheets V1.79.1">
<link rel="home" href="../../../../index.html" title="Chapter 1. Boost.Log v2">
<link rel="up" href="../function.html#idp90250880" title="Description">
<link rel="prev" href="impl.html" title="Class impl">
<link rel="next" href="../make_function.html" title="Function template make_function">
</head>
<body bgcolor="white" text="black" link="#0000FF" vlink="#840084" alink="#0000FF">
<table cellpadding="2" width="100%"><tr><td valign="top"><img alt="Boost C++ Libraries" width="277" height="86" src="../../../../../../../../boost.png"></td></tr></table>
<hr>
<div class="spirit-nav">
<a accesskey="p" href="impl.html"><img src="../../../../../../../../doc/src/images/prev.png" alt="Prev"></a><a accesskey="u" href="../function.html#idp90250880"><img src="../../../../../../../../doc/src/images/up.png" alt="Up"></a><a accesskey="h" href="../../../../index.html"><img src="../../../../../../../../doc/src/images/home.png" alt="Home"></a><a accesskey="n" href="../make_function.html"><img src="../../../../../../../../doc/src/images/next.png" alt="Next"></a>
</div>
<div class="refentry">
<a name="boost.log.attributes.function.impl_template"></a><div class="titlepage"></div>
<div class="refnamediv">
<h2><span class="refentrytitle">Class template impl_template</span></h2>
<p>boost::log::attributes::function::impl_template — Factory implementation. </p>
</div>
<h2 xmlns:rev="path_to_url~gregod/boost/tools/doc/revision" class="refsynopsisdiv-title">Synopsis</h2>
<div xmlns:rev="path_to_url~gregod/boost/tools/doc/revision" class="refsynopsisdiv"><pre class="synopsis"><span class="comment">// In header: <<a class="link" href="../../../../attributes.html#header.boost.log.attributes.function_hpp" title="Header <boost/log/attributes/function.hpp>">boost/log/attributes/function.hpp</a>>
</span>
<span class="comment">// Factory implementation.</span>
<span class="keyword">template</span><span class="special"><</span><span class="keyword">typename</span> T<span class="special">></span>
<span class="keyword">class</span> <a class="link" href="impl_template.html" title="Class template impl_template">impl_template</a> <span class="special">:</span> <span class="keyword">public</span> <span class="identifier">function</span><span class="special"><</span> <span class="identifier">R</span> <span class="special">></span><span class="special">::</span><span class="identifier">impl</span> <span class="special">{</span>
<span class="keyword">public</span><span class="special">:</span>
<span class="comment">// <a class="link" href="impl_template.html#boost.log.attributes.function.impl_templateconstruct-copy-destruct">construct/copy/destruct</a></span>
<span class="keyword">explicit</span> <a class="link" href="impl_template.html#idp35738288-bb"><span class="identifier">impl_template</span></a><span class="special">(</span><span class="identifier">T</span> <span class="keyword">const</span> <span class="special">&</span><span class="special">)</span><span class="special">;</span>
<span class="comment">// <a class="link" href="impl_template.html#idp35735392-bb">public member functions</a></span>
<span class="keyword">virtual</span> <a class="link" href="../../attribute_value.html" title="Class attribute_value">attribute_value</a> <a class="link" href="impl_template.html#idp35735952-bb"><span class="identifier">get_value</span></a><span class="special">(</span><span class="special">)</span><span class="special">;</span>
<span class="special">}</span><span class="special">;</span></pre></div>
<div class="refsect1">
<a name="idp90311744"></a><h2>Description</h2>
<div class="refsect2">
<a name="idp90312160"></a><h3>
<a name="boost.log.attributes.function.impl_templateconstruct-copy-destruct"></a><code class="computeroutput">impl_template</code>
public
construct/copy/destruct</h3>
<div class="orderedlist"><ol class="orderedlist" type="1"><li class="listitem">
<pre class="literallayout"><span class="keyword">explicit</span> <a name="idp35738288-bb"></a><span class="identifier">impl_template</span><span class="special">(</span><span class="identifier">T</span> <span class="keyword">const</span> <span class="special">&</span> fun<span class="special">)</span><span class="special">;</span></pre>
<p>Constructor with the stored delegate initialization </p>
</li></ol></div>
</div>
<div class="refsect2">
<a name="idp90321136"></a><h3>
<a name="idp35735392-bb"></a><code class="computeroutput">impl_template</code> public member functions</h3>
<div class="orderedlist"><ol class="orderedlist" type="1"><li class="listitem">
<pre class="literallayout"><span class="keyword">virtual</span> <a class="link" href="../../attribute_value.html" title="Class attribute_value">attribute_value</a> <a name="idp35735952-bb"></a><span class="identifier">get_value</span><span class="special">(</span><span class="special">)</span><span class="special">;</span></pre>
<p>
</p>
<div class="variablelist"><table border="0" class="variablelist compact">
<colgroup>
<col align="left" valign="top">
<col>
</colgroup>
<tbody><tr>
<td><p><span class="term">Returns:</span></p></td>
<td><p>The actual attribute value. It shall not return empty values (exceptions shall be used to indicate errors). </p></td>
</tr></tbody>
</table></div>
</li></ol></div>
</div>
</div>
</div>
<table xmlns:rev="path_to_url~gregod/boost/tools/doc/revision" width="100%"><tr>
<td align="left"></td>
<td align="right"><div class="copyright-footer"><p>Distributed under the Boost Software License, Version 1.0. (See accompanying
file LICENSE_1_0.txt or copy at <a href="path_to_url" target="_top">path_to_url</a>)
</p>
</div></td>
</tr></table>
<hr>
<div class="spirit-nav">
<a accesskey="p" href="impl.html"><img src="../../../../../../../../doc/src/images/prev.png" alt="Prev"></a><a accesskey="u" href="../function.html#idp90250880"><img src="../../../../../../../../doc/src/images/up.png" alt="Up"></a><a accesskey="h" href="../../../../index.html"><img src="../../../../../../../../doc/src/images/home.png" alt="Home"></a><a accesskey="n" href="../make_function.html"><img src="../../../../../../../../doc/src/images/next.png" alt="Next"></a>
</div>
</body>
</html>
```
|
is the ninth studio album by Japanese singer-songwriter Yumi Matsutoya, released in June 1980.
Pictures featured on the back and front sleeve art were taken at the Brown's Hotel in London. No songs were released as a single except "Tamerai", which first appeared as a B-side of the non-album track "Daydream", released in May 1980.
Like the other studio albums released after she remained with EMI in 1977, when Alfa finally started operating as a record label, Toki no Nai Hotel was first released on CD in 1985. The album was reissued in 1999, digitally remastered by Bernie Grundman.
Track listing
All songs written and composed by Yumi Matsutoya, arranged by Masataka Matsutoya.
""– 5:26
"" – 4:15
"Miss Lonely" – 4:28
"Ame ni Kieta Jogger " – 5:00
"" – 3:46
"" – 4:30
"" – 6:22
"" – 7:18
"" – 4:02
Footnotes
Recorded versions
"Mizu no Kage" was originally written for the female folk duo called Simons in 1978.
"Tamerai" was originally written for Midori Hagio. The song was later covered by Yoshiko Miyazaki, Keiko Saito, and Pink Lady member Keiko Masuda.
Queen's Fellows, a 2002 tribute album for Matsutoya, includes two cover versions of songs featured on Toki no Nai Hotel: "Cecile no Shumatsu" performed by Aiko, and the title track by Takao Tajima.
Personnel
Masataka Matsutoya - keyboards
Tatsuo Hayashi - drums
Jun Aoyama - drums
Yuichi Tokashiki - drums
Nobu Saito - percussion
Kenji Takamizu - bass guitar
Tsugutoshi Goto - bass guitar
Masaki Matsubara - electric guitar
Shigeru Suzuki - electric guitar
Tsuyoshi Kon - electric guitar
Chuei Yoshikawa - acoustic guitar, mandolin
Hiromi Yasuda - acoustic guitar
Masamichi Sugi - backing vocals
Kiyoshi Saito - saxophone
Jake H Conception - saxophone
Shunzo Sunabara - saxophone
Yukio Eto - flute
Eiju Yamada - horn
Yasuhiro Okita - horn
Tomato Strings Ensemble - strings
Junichi Hiiro - violin
Leona - backing vocals
Clara - backing vocals
Lilika - backing vocals
Hideki Matsutake - synthesizer programming
Chart position
References
1980 albums
Yumi Matsutoya albums
EMI Music Japan albums
|
Bae Yoo-ram (; born August 22, 1986), is a South Korean actor. He is best known for his roles in Taxi Driver (2021–2023).
Filmography
Film
Television series
Web series
Awards and nominations
References
Living people
1986 births
South Korean male television actors
South Korean male film actors
Konkuk University alumni
21st-century South Korean male actors
|
```csharp
// The .NET Foundation licenses this file to you under the MIT license.
// See the LICENSE file in the project root for more information.
namespace System.Runtime.CompilerServices
{
/// <summary>
/// Used to indicate to the compiler that the <c>.locals init</c> flag should not be set in method headers.
/// </summary>
/// <remarks>Internal copy from the BCL attribute.</remarks>
[AttributeUsage(
AttributeTargets.Module |
AttributeTargets.Class |
AttributeTargets.Struct |
AttributeTargets.Interface |
AttributeTargets.Constructor |
AttributeTargets.Method |
AttributeTargets.Property |
AttributeTargets.Event,
Inherited = false)]
internal sealed class SkipLocalsInitAttribute : Attribute
{
}
}
```
|
```javascript
console.log('Hello, World!');
```
|
```typescript
export default class MemberErrors {
id: string;
login: string;
avatar_url: string;
isEntityValid: boolean;
public constructor() {
this.id = "";
this.login = "";
this.avatar_url = "";
this.isEntityValid = true; // assumed default; the original left this field uninitialized
}
}
```
|
Hertenstein may refer to:
Hertenstein, Lucerne, a village part of the municipality of Weggis, Canton of Lucerne, Switzerland
Hertensteiner Programm
Hertenstein, Aargau, a village in the municipality of Obersiggenthal, Canton of Aargau, Switzerland
Ruine Hertenstein, the ruin of a castle at Sigmaringen, Germany
Hertenstein Castle, the ruin of a castle near Blaufelden, Germany
People with the surname
Wilhelm Hertenstein (1825-1888), member of the Swiss Federal Council (1879-1888)
See also
Hartenstein (disambiguation)
|
```haskell
{-# LANGUAGE LambdaCase #-}
{-# LANGUAGE OverloadedStrings #-}
-- | Renders the debugger in the terminal.
module Debugger.TUI.Draw where
import Debugger.TUI.Types
import PlutusPrelude (render)
import Brick.AttrMap qualified as B
import Brick.Focus qualified as B
import Brick.Types qualified as B
import Brick.Widgets.Border qualified as BB
import Brick.Widgets.Center qualified as BC
import Brick.Widgets.Core qualified as B
import Brick.Widgets.Edit qualified as BE
import Data.Bifunctor
import Data.Maybe
import Data.Text (Text)
import Data.Text qualified as Text
import Lens.Micro
import Prettyprinter hiding (line)
drawDebugger ::
DebuggerState ->
[B.Widget ResourceName]
drawDebugger st =
withKeyBindingsOverlay
(st ^. dsKeyBindingsMode)
[header "Plutus Core Debugger" $ BC.center ui]
where
focusRing = st ^. dsFocusRing
(curLine, curCol) = BE.getCursorPosition (st ^. dsUplcEditor)
cursorPos = "Ln " <> show (curLine + 1) <> ", Col " <> show (curCol + 1)
uplcEditor =
BB.borderWithLabel (B.txt "UPLC program") $
B.vBox
[ B.withFocusRing
focusRing
(BE.renderEditor (drawDocumentWithHighlight (st ^. dsUplcHighlight)))
(st ^. dsUplcEditor)
, B.padTop (B.Pad 1) . BC.hCenter $ B.str cursorPos
]
sourceEditor = maybe B.emptyWidget
(BB.borderWithLabel (B.txt "Source program") .
B.withFocusRing
focusRing
(BE.renderEditor (drawDocumentWithHighlight (st ^. dsSourceHighlight)))
) (st ^. dsSourceEditor)
returnValueEditor =
BB.borderWithLabel (B.txt "UPLC value being returned") $
B.withFocusRing
focusRing
(BE.renderEditor (B.txt . Text.unlines))
(st ^. dsReturnValueEditor)
logsEditor =
BB.borderWithLabel (B.txt "Logs") $
B.withFocusRing
focusRing
(BE.renderEditor (B.txt . Text.unlines))
(st ^. dsLogsEditor)
budgetTxt = B.hBox
[ prettyTxt "Spent:" (st ^. dsBudgetData . budgetSpent)
-- do not show Remaining in absence of `--budget`
, maybe B.emptyWidget (prettyTxt "Remaining:") (st ^. dsBudgetData . budgetRemaining)
]
prettyTxt title = B.txt . render . group . (title <>) . pretty
ui =
B.vBox
[ BC.center uplcEditor B.<+> B.hLimit (st ^. dsHLimitRightEditors) sourceEditor
, B.vLimit (st ^. dsVLimitBottomEditors) $
BC.center returnValueEditor B.<+>
B.hLimit (st ^. dsHLimitRightEditors) logsEditor
, budgetTxt
, footer
]
-- | Draw a document, a consecutive part of which may be highlighted.
drawDocumentWithHighlight ::
forall n.
-- | The part of the document to be highlighted.
Maybe HighlightSpan ->
-- | The document to draw, one `Text` per line.
[Text] ->
B.Widget n
drawDocumentWithHighlight = \case
Nothing -> B.txt . Text.unlines
Just (HighlightSpan (B.Location (sLine, sCol)) eLoc) ->
let toWidgets :: [Text] -> [B.Widget n]
toWidgets lns =
let alg :: (Text, Int) -> B.Widget n
alg (line, lineNo)
| lineNo < sLine || lineNo > eLine = B.txt line
-- The current line (either entirely or part of it) needs to be highlighted.
| otherwise =
let s =
if lineNo > sLine
then 0
else sCol - 1
e =
if lineNo < eLine
then Text.length line
else eCol
in highlightLine line s e
where
B.Location (eLine, eCol) =
fromMaybe
(B.Location (sLine, Text.length line))
eLoc
in fmap alg (zip lns [1 ..])
in B.vBox
. toWidgets
-- This is needed for empty lines to be rendered correctly.
-- `Brick.Widgets.Core.txt` does the same via `fixEmpty`.
. fmap (\t -> if Text.null t then " " else t)
-- | Draw a line and highlight from the start column to the end column.
highlightLine ::
forall n.
Text ->
-- | Start column
Int ->
-- | End column
Int ->
B.Widget n
highlightLine line s e =
B.hBox $
[B.txt left | not (Text.null left)]
++ [ B.withAttr highlightAttr $ B.txt middle
| not (Text.null middle)
]
++ [B.txt right | not (Text.null right)]
where
-- left -> no highlight
-- middle -> highlight
-- right -> no highlight
((left, middle), right) = first (Text.splitAt s) (Text.splitAt e line)
-- | Show key bindings upon request.
withKeyBindingsOverlay ::
KeyBindingsMode ->
[B.Widget ResourceName] ->
[B.Widget ResourceName]
withKeyBindingsOverlay = \case
KeyBindingsShown -> (BC.centerLayer debuggerKeyBindings :)
KeyBindingsHidden -> id
debuggerKeyBindings :: forall n. B.Widget n
debuggerKeyBindings =
B.vBox
[ B.withAttr menuAttr . B.vBox $
[ B.txt "Step (s)"
, B.txt "Move cursor (Arrow)"
, B.txt "Switch window (Tab)"
, B.txt "Resize windows (^Up/^Down/^Left/^Right)"
, B.txt "Quit (Esc)"
]
, B.txt "Press any key to exit"
]
menuAttr :: B.AttrName
menuAttr = B.attrName "menu"
highlightAttr :: B.AttrName
highlightAttr = B.attrName "highlight"
header :: Text -> B.Widget a -> B.Widget a
header title =
(B.vLimit 1 (B.withAttr menuAttr . BC.hCenter $ B.txt title) B.<=>)
footer :: forall a. B.Widget a
footer =
B.padTop (B.Pad 1) . BC.hCenter . B.withAttr menuAttr $
B.txt "(?): Key Bindings"
```
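The core of `highlightLine` above is a two-stage split: `first (Text.splitAt s) (Text.splitAt e line)` partitions the line into an unhighlighted prefix `[0, s)`, a highlighted middle `[s, e)`, and an unhighlighted suffix `[e, end)`. The same partition can be sketched language-neutrally (TypeScript here; the function name and sample indices are illustrative):

```typescript
// Split `line` into [0, s) / [s, e) / [e, end): the prefix and suffix are
// rendered plainly, while the middle receives the highlight attribute.
function splitForHighlight(
  line: string,
  s: number,
  e: number
): [string, string, string] {
  const left = line.slice(0, s);    // before the highlight
  const middle = line.slice(s, e);  // the highlighted span
  const right = line.slice(e);      // after the highlight
  return [left, middle, right];
}

const [left, middle, right] = splitForHighlight("let x = 1 in x", 4, 9);
console.log([left, middle, right]); // ["let ", "x = 1", " in x"]
```

Note that, as in the Haskell original, empty pieces (e.g. a highlight starting at column 0) simply yield empty strings, which the renderer filters out.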
|
Secrets of a Hollywood Super Madam is an autobiography written by Jody Gibson.
Plot summary
In the late 1980s and 1990s, Gibson ran an exclusive "escort service" based in Hollywood, California, under the name "Sasha from the Valley", while leading a double life on radio and television as a minor actress and recording artist. The book describes Gibson's life during this period. She was subsequently tried and convicted of pimping and conspiracy in a highly publicized trial and sent to prison. Gibson claimed in the book that public figures used her business. Included in the text is court data from her "Black Book", which was introduced as evidence at the trial.
References
2007 non-fiction books
American autobiographies
Non-fiction books about American prostitution
|
```glsl
#version 450
#extension GL_QCOM_image_processing : require
layout(binding = 4) uniform sampler2D tex_samp;
uniform sampler2D SPIRV_Cross_Combinedtex2D_src1samp;
layout(location = 0) out vec4 fragColor;
layout(location = 0) in vec4 v_texcoord;
void main()
{
vec2 boxSize = vec2(2.5, 4.5);
vec4 _31 = textureBoxFilterQCOM(SPIRV_Cross_Combinedtex2D_src1samp, v_texcoord.xy, boxSize);
fragColor = _31;
vec4 _38 = textureBoxFilterQCOM(tex_samp, v_texcoord.xy, boxSize);
fragColor = _38;
}
```
|