During Recep Tayyip Erdoğan's campaign for the presidency in 2014 and throughout his presidency, there have been numerous claims that he did not graduate from a university and is therefore ineligible to be President of Turkey. Similar claims have also been made about Erdoğan's high school diploma. These claims refer to Article 101 of the Turkish constitution, which requires presidential candidates to have completed their tertiary education. The article, however, does not narrow tertiary education down to universities only, allowing other options such as trade schools and colleges.
Erdoğan's degree is publicly accessible and viewable at Marmara University's online diploma department.
Accusations
In June 2016, Ömer Faruk Eminağaoğlu, former chairman of YARSAV and leader of the communist People's Liberation Party, requested that the Supreme Election Council (YSK) carry out an investigation into the "use of forged official documents". He also called for the Ankara Chief Public Prosecutor to begin an investigation if the YSK did not. The YSK unanimously rejected the case upon review, deciding that since Erdoğan had presented a notarized university degree during the presidential election process, any examination beyond this did not fall within the scope of its mandate. Erdoğan responded to the accusations by saying "The school I registered at, was educated at and graduated from is clear, my classmates are clear. Also, the university administration made a formal explanation. Despite all this, some people still insistently continue to bring up this issue", and asked Mehmet Emin Arat, rector of Marmara University, to release his diploma information. Marmara University released a statement saying that the claims of a fake diploma are without evidence. ÜNİVDER (University Faculty Members Association) criticized the rectorate of Marmara University for denying the fake-diploma allegations without sharing any documents, when it was expected to release a copy of Erdoğan's diploma, and stated that there is no evidence of Erdoğan's graduation from any university.
Witnesses
Aydın Ayaydın, a Member of Parliament from the opposition party CHP, said that Erdoğan took part in the four-year degree programme of the Aksaray Academy of Economic and Commercial Sciences, where Ayaydın himself was one of Erdoğan's teachers. Stating that he remembers Erdoğan and his classmates very well, Ayaydın said "one of his classmates is Mehmet Emin Arat, who currently is a professor at Marmara University." Arat himself called the claims that Erdoğan does not have a degree "unfair" and "baseless", stating that "the claims do not have any legal, official or historical basis".
Israeli journalist Rafael Sadi, another classmate of Erdoğan, said that he was irritated by "people that are telling baseless lies just to slander the man for the sake of opposition" and shared the names of the professors whose courses he and Erdoğan attended.
See also
List of honorary doctorates awarded to Recep Tayyip Erdoğan
References
Political scandals in Turkey
Recep Tayyip Erdoğan controversies
2014 controversies
Education controversies in Turkey
|
```python
import pytest

from conftest import assert_complete, partialize


@pytest.mark.bashcomp(pre_cmds=("HOME=$PWD",))
class TestXhost:
    @pytest.mark.parametrize("prefix", ["+", "-", ""])
    def test_hosts(self, bash, hosts, prefix):
        completion = assert_complete(bash, "xhost %s" % prefix)
        assert completion == [f"{prefix}{x}" for x in hosts]

    @pytest.mark.parametrize("prefix", ["+", "-", ""])
    def test_partial_hosts(self, bash, hosts, prefix):
        first_char, partial_hosts = partialize(bash, hosts)
        completion = assert_complete(bash, f"xhost {prefix}{first_char}")
        if len(completion) == 1:
            assert completion == partial_hosts[0][1:]
        else:
            assert completion == sorted(f"{prefix}{x}" for x in partial_hosts)
```
|
```csharp
namespace ApiVersioning.Examples.Models;

using System.ComponentModel.DataAnnotations;

public class Order
{
    public int Id { get; set; }

    public DateTimeOffset CreatedDate { get; set; } = DateTimeOffset.Now;

    public DateTimeOffset EffectiveDate { get; set; } = DateTimeOffset.Now;

    [Required]
    public string Customer { get; set; }
}
```
|
Jay Lazar Garfield (born 13 November 1955) is an American professor of philosophy who specializes in Tibetan Buddhism. He also specializes in the philosophy of mind, cognitive science, epistemology, metaphysics, philosophy of language, ethics, and hermeneutics. He is currently Doris Silbert Professor in the Humanities at Smith College, professor of philosophy at the University of Melbourne, visiting professor of philosophy and Buddhist studies at Harvard Divinity School, and adjunct professor of philosophy at the Central University of Tibetan Studies.
Academic career
Garfield received an A.B. from Oberlin College in 1975, and a Ph.D. from the University of Pittsburgh in 1986, where he worked with Wilfrid Sellars and Annette Baier. At the Central University of Tibetan Studies in India, he studied Nagarjuna with Geshe Yeshe Thabkhas.
He taught from 1980 to 1995 at Hampshire College, from 1996 to 1998 at the University of Tasmania, and since 1999 at Smith College.
He is editor-in-chief of the journal Sophia, and is on the editorial boards of Philosophical Psychology, Journal of Indian Philosophy and Religion, Australasian Philosophical Review, Philosophy East and West, American Institute of Buddhist Studies/Columbia Center for Buddhist Studies/Tibet House, Stanford Encyclopedia of Philosophy, and the Journal of Buddhist Philosophy.
Garfield was the inaugural Kwan Im Thong Hood Cho Temple Professor of Humanities and Head of Studies, Philosophy, at Yale-NUS from 2013 to 2016. He said, "This Professorship has given me the opportunity of a lifetime – working with motivated, creative and talented students and colleagues and working in a community committed to building something entirely new, an Asian liberal arts college with a truly global curriculum." During his professorship at Yale-NUS, Garfield was one of six scholars who participated in a conference with the 14th Dalai Lama on "Mapping the Mind: A Dialogue between Modern Science and Buddhist Science."
Controversy over "If Philosophy Won't Diversify"
Garfield has long been a critic of what he sees as the narrow approach of Western philosophers. He has noted that "people in our profession are still happy to treat Western philosophy as the 'core' of the discipline, and as the unmarked case. So, for instance, a course that addresses only classical Greek philosophy can be comfortably titled 'Ancient Philosophy,' not 'Ancient Western Philosophy,' and a course in metaphysics can be counted on to ignore all non-Western metaphysics. A course in Indian philosophy is not another course in the history of philosophy, but is part of the non-Western curriculum." Because of his knowledge of Buddhism and commitment to encouraging the study of Asian philosophy, Garfield was invited to be the keynote speaker at a conference on non-Western philosophical traditions organized by graduate students in philosophy at the University of Pennsylvania in 2016. However, he was "outraged" that only "one or two" members of the regular faculty in the department attended the event, because he felt that this showed a lack of support for their own students' interest in non-Western philosophy.
Garfield discussed this issue with another speaker at the conference, Bryan W. Van Norden, and they wrote an editorial that appeared in The Stone column of The New York Times in May of that year, entitled "If Philosophy Won't Diversify, Let's Call It What It Really Is." In this editorial, they state: "we have urged our colleagues to look beyond the European canon in their own research and teaching." However, "progress has been minimal." Consequently, so long as "the profession as a whole remains resolutely Eurocentric," Garfield and Van Norden "ask those who sincerely believe that it does make sense to organize our discipline entirely around European and American figures and texts to pursue this agenda with honesty and openness. We therefore suggest that any department that regularly offers courses only on Western philosophy should rename itself 'Department of European and American Philosophy.'"
The article received 797 comments in just 12 hours. (None of the other Stone columns that month had over 500 comments.) Garfield later explained, "I woke up to all this email in my inbox [with] people asking, 'Are you okay?' 'Do you need to talk?'" Garfield soon realized that his colleagues were expressing concern for his well-being because so many of the comments on the article expressed "vitriolic racism and xenophobia. And some of it was clearly by philosophers and students of philosophy." One typical comment was that Western philosophy deserves precedence because "there is a particular school of thought that caught fire, broke cultural boundaries, and laid the foundation of modern science (Does anyone want to fly in a plane built with non-western math?) and our least oppressive governmental systems." On the other hand, there were also many supportive comments: "Hear! Hear! Inclusion is the order of the day. ... More wisdom from more perspectives — what could be better? We have so much to learn from each other, if only we listen."
Garfield and Van Norden's article was almost immediately translated into Chinese, and over twenty blogs in the English-speaking world have commented or hosted discussions, including Reddit. Garfield and Van Norden's piece has continued to provoke strong reactions. Some have applauded their call for greater diversity in the US philosophical canon. In addition, their piece has been featured in several recent essays arguing for greater diversity in philosophy.
However, there has also been extensive criticism of the Garfield and Van Norden article. Articles in Aeon and Weekly Standard argued that "philosophy" is, by definition, the tradition that grows out of Plato and Aristotle, so nothing outside that tradition could count as philosophy. Professor Amy Olberding of the University of Oklahoma wrote a detailed reply to critics of Garfield and Van Norden, arguing that criticisms fall into a stereotypical pattern that betrays a fundamental misunderstanding of the issues.
Publications
Books
Losing Ourselves: Learning to Live without a Self (Princeton University Press 2022)
Buddhist Ethics: A Philosophical Exploration (Buddhist Philosophy for Philosophers) (Oxford University Press 2022)
Dignāga’s Investigation of the Percept: A Philosophical Legacy in India and Tibet (with Douglas Duckworth, David Eckel, John Powers, Yeshes Thabkhas and Sonam Thakchöe, Oxford University Press 2016)
Moonpaths: Ethics in the Context of Conventional Truth (with the Cowherds, Oxford University Press 2015)
Engaging Buddhism: Why Does Buddhism Matter to Philosophy? (Oxford University Press 2015)
Sweet Reason: A Field Guide to Modern Logic, 2nd Edition (with James Henle and Thomas Tymoczko), Wiley, 2011.
Western Idealism and its Critics. Central University of Tibetan Studies Press, Sarnath, India, 2011, English only edition, Hobart: Pyrrho Press 1998.
Moonshadows: Conventional Truth in Buddhist Philosophy (with the Cowherds), Oxford University Press, 2010.
An Ocean of Reasoning: Tsong kha pa’s Great Commentary on Nāgārjuna’s Mūlamadhyamakakārika (with Geshe Ngawang Samten), Oxford University Press, 2006.
Empty Words: Buddhist Philosophy and Cross-Cultural Interpretation. Oxford University Press, New York, 2002.
Translator and commentator, Fundamental Wisdom of the Middle Way: Nāgārjuna's Mūlamadhyamakakārikā. Oxford University Press, New York, 1995.
Cognitive Science: An Introduction (with N. Stillings, M. Feinstein, E. Rissland, D. Rosenbaum, S. Weisler, and L. Baker-Ward). Bradford Books/MIT Press, 1987; 2nd edition (with N. Stillings, M. Feinstein, E. Rissland, D. Rosenbaum, S. Weisler, and L. Baker-Ward), Bradford Books/MIT Press, 1995.
Belief in Psychology: A Study in the Ontology of Mind. Bradford Books/MIT Press, 1988.
Edited collections
Madhyamaka and Yogācāra: Allies or Rivals? (ed., with J Westerhoff), Oxford University Press, 2015.
The Moon Points Back: Buddhism, Logic and Analytic Philosophy (ed., with Y. Deguchi, G. Priest and K. Tanaka), Oxford University Press, 2015.
Contrary Thinking: Selected Papers of Daya Krishna (with N Bhushan and D Raveh), Oxford University Press (2011).
Indian Philosophy in English: Renaissance to Independence (with N Bhushan), Oxford University Press (2011).
Oxford Handbook of World Philosophy (with W Edelglass), Oxford University Press (2010).
Pointing at the Moon: Buddhism, Logic Analysis (with T Tillemans and M D’Amato), 2009, Oxford University Press.
TransBuddhism: Translation, Transmission and Transformation (with N Bhushan and A Zablocki) 2009, the University of Massachusetts Press.
Buddhist Philosophy: Essential Readings (with William Edelglass) 2009, Oxford University Press.
Foundations of Cognitive Science: The Essential Readings. Paragon House, New York, 1990.
Meaning and Truth: Essential Readings in Modern Semantics (with Murray Kiteley). Paragon House, New York, 1990.
Modularity in Knowledge Representation and Natural Language Understanding. Bradford Books/MIT Press, 1987.
Abortion: Moral and Legal Perspectives (with Patricia Hennessey). University of Massachusetts Press, 1984.
References
External links
Garfield's personal website
1955 births
American philosophy academics
Living people
|
```csharp
//
// SymbolWriterImpl.cs
//
// Author:
// Lluis Sanchez Gual (lluis@novell.com)
//
// (C) 2005 Novell, Inc.
//
//
// Permission is hereby granted, free of charge, to any person obtaining
// a copy of this software and associated documentation files (the
// "Software"), to deal in the Software without restriction, including
// without limitation the rights to use, copy, modify, merge, publish,
// distribute, sublicense, and/or sell copies of the Software, and to
// permit persons to whom the Software is furnished to do so, subject to
// the following conditions:
//
// The above copyright notice and this permission notice shall be
// included in all copies or substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
// EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
// MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
// NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
// LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
// OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
// WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
//
#if !NET_CORE
using System;
using System.Reflection;
using System.Reflection.Emit;
using System.Runtime.CompilerServices;
using System.Collections;
using System.IO;
using System.Diagnostics.SymbolStore;
namespace ILRuntime.Mono.CompilerServices.SymbolWriter
{
public class SymbolWriterImpl: ISymbolWriter
{
MonoSymbolWriter msw;
int nextLocalIndex;
int currentToken;
string methodName;
Stack namespaceStack = new Stack ();
bool methodOpened;
Hashtable documents = new Hashtable ();
#if !CECIL
ModuleBuilder mb;
delegate Guid GetGuidFunc (ModuleBuilder mb);
GetGuidFunc get_guid_func;
public SymbolWriterImpl (ModuleBuilder mb)
{
this.mb = mb;
}
public void Close ()
{
MethodInfo mi = typeof (ModuleBuilder).GetMethod (
"Mono_GetGuid",
BindingFlags.Static | BindingFlags.NonPublic);
if (mi == null)
return;
get_guid_func = (GetGuidFunc) System.Delegate.CreateDelegate (
typeof (GetGuidFunc), mi);
msw.WriteSymbolFile (get_guid_func (mb));
}
#else
Guid guid;
public SymbolWriterImpl (Guid guid)
{
this.guid = guid;
}
public void Close ()
{
msw.WriteSymbolFile (guid);
}
#endif
public void CloseMethod ()
{
if (methodOpened) {
methodOpened = false;
nextLocalIndex = 0;
msw.CloseMethod ();
}
}
public void CloseNamespace ()
{
namespaceStack.Pop ();
msw.CloseNamespace ();
}
public void CloseScope (int endOffset)
{
msw.CloseScope (endOffset);
}
public ISymbolDocumentWriter DefineDocument (
string url,
Guid language,
Guid languageVendor,
Guid documentType)
{
SymbolDocumentWriterImpl doc = (SymbolDocumentWriterImpl) documents [url];
if (doc == null) {
SourceFileEntry entry = msw.DefineDocument (url);
CompileUnitEntry comp_unit = msw.DefineCompilationUnit (entry);
doc = new SymbolDocumentWriterImpl (comp_unit);
documents [url] = doc;
}
return doc;
}
public void DefineField (
SymbolToken parent,
string name,
FieldAttributes attributes,
byte[] signature,
SymAddressKind addrKind,
int addr1,
int addr2,
int addr3)
{
}
public void DefineGlobalVariable (
string name,
FieldAttributes attributes,
byte[] signature,
SymAddressKind addrKind,
int addr1,
int addr2,
int addr3)
{
}
public void DefineLocalVariable (
string name,
FieldAttributes attributes,
byte[] signature,
SymAddressKind addrKind,
int addr1,
int addr2,
int addr3,
int startOffset,
int endOffset)
{
msw.DefineLocalVariable (nextLocalIndex++, name);
}
public void DefineParameter (
string name,
ParameterAttributes attributes,
int sequence,
SymAddressKind addrKind,
int addr1,
int addr2,
int addr3)
{
}
public void DefineSequencePoints (
ISymbolDocumentWriter document,
int[] offsets,
int[] lines,
int[] columns,
int[] endLines,
int[] endColumns)
{
SymbolDocumentWriterImpl doc = (SymbolDocumentWriterImpl) document;
SourceFileEntry file = doc != null ? doc.Entry.SourceFile : null;
for (int n=0; n<offsets.Length; n++) {
if (n > 0 && offsets[n] == offsets[n-1] && lines[n] == lines[n-1] && columns[n] == columns[n-1])
continue;
msw.MarkSequencePoint (offsets[n], file, lines[n], columns[n], false);
}
}
public void Initialize (IntPtr emitter, string filename, bool fFullBuild)
{
msw = new MonoSymbolWriter (filename);
}
public void OpenMethod (SymbolToken method)
{
currentToken = method.GetToken ();
}
public void OpenNamespace (string name)
{
NamespaceInfo n = new NamespaceInfo ();
n.NamespaceID = -1;
n.Name = name;
namespaceStack.Push (n);
}
public int OpenScope (int startOffset)
{
return msw.OpenScope (startOffset);
}
public void SetMethodSourceRange (
ISymbolDocumentWriter startDoc,
int startLine,
int startColumn,
ISymbolDocumentWriter endDoc,
int endLine,
int endColumn)
{
int nsId = GetCurrentNamespace (startDoc);
SourceMethodImpl sm = new SourceMethodImpl (methodName, currentToken, nsId);
msw.OpenMethod (((ICompileUnit)startDoc).Entry, nsId, sm);
methodOpened = true;
}
public void SetScopeRange (int scopeID, int startOffset, int endOffset)
{
}
public void SetSymAttribute (SymbolToken parent, string name, byte[] data)
{
// This is a hack! but MonoSymbolWriter needs the method name
// and ISymbolWriter does not have any method for providing it
if (name == "__name")
methodName = System.Text.Encoding.UTF8.GetString (data);
}
public void SetUnderlyingWriter (IntPtr underlyingWriter)
{
}
public void SetUserEntryPoint (SymbolToken entryMethod)
{
}
public void UsingNamespace (string fullName)
{
if (namespaceStack.Count == 0) {
OpenNamespace ("");
}
NamespaceInfo ni = (NamespaceInfo) namespaceStack.Peek ();
if (ni.NamespaceID != -1) {
NamespaceInfo old = ni;
CloseNamespace ();
OpenNamespace (old.Name);
ni = (NamespaceInfo) namespaceStack.Peek ();
ni.UsingClauses = old.UsingClauses;
}
ni.UsingClauses.Add (fullName);
}
int GetCurrentNamespace (ISymbolDocumentWriter doc)
{
if (namespaceStack.Count == 0) {
OpenNamespace ("");
}
NamespaceInfo ni = (NamespaceInfo) namespaceStack.Peek ();
if (ni.NamespaceID == -1)
{
string[] usings = (string[]) ni.UsingClauses.ToArray (typeof(string));
int parentId = 0;
if (namespaceStack.Count > 1) {
namespaceStack.Pop ();
parentId = ((NamespaceInfo) namespaceStack.Peek ()).NamespaceID;
namespaceStack.Push (ni);
}
ni.NamespaceID = msw.DefineNamespace (ni.Name, ((ICompileUnit)doc).Entry, usings, parentId);
}
return ni.NamespaceID;
}
}
class SymbolDocumentWriterImpl: ISymbolDocumentWriter, ISourceFile, ICompileUnit
{
CompileUnitEntry comp_unit;
public SymbolDocumentWriterImpl (CompileUnitEntry comp_unit)
{
this.comp_unit = comp_unit;
}
public void SetCheckSum (Guid algorithmId, byte[] checkSum)
{
}
public void SetSource (byte[] source)
{
}
SourceFileEntry ISourceFile.Entry {
get { return comp_unit.SourceFile; }
}
public CompileUnitEntry Entry {
get { return comp_unit; }
}
}
class SourceMethodImpl: IMethodDef
{
string name;
int token;
int namespaceID;
public SourceMethodImpl (string name, int token, int namespaceID)
{
this.name = name;
this.token = token;
this.namespaceID = namespaceID;
}
public string Name {
get { return name; }
}
public int NamespaceID {
get { return namespaceID; }
}
public int Token {
get { return token; }
}
}
class NamespaceInfo
{
public string Name;
public int NamespaceID;
public ArrayList UsingClauses = new ArrayList ();
}
}
#endif
```
|
St. John's College is a state-integrated Catholic day school for boys, located in Hastings, a provincial city in Hawke's Bay, New Zealand.
Founded in 1941 by the Marist Fathers, St. John's College has a non-selective enrolment policy (although it does give preference to students from Catholic families) and currently caters for approximately 450 students from Year 9 (3rd Form) to Year 13 (7th Form).
In 2006 its ethnic composition was Pākehā 73%, Māori 23%, and Pacific 4%. Academically, the school offers the senior years the National Certificate of Educational Achievement (NCEA) assessment system.
St. John's College is the oldest private/state-integrated secondary school for boys in New Zealand outside the traditional main centres of Auckland, Wellington, Christchurch, and Dunedin (Te Aute College, also in Hawke's Bay, previously held the title but became co-educational during the 1990s).
History
St John's College was established in 1941 on Frederick St, Hastings (the current site of St Mary's Primary School). It was founded by the Marist Fathers in response to the lack of Catholic education for young men in Hawke's Bay, and was also intended as a brother school to the already established all-girls Sacred Heart College, Napier, some 20 km north.
Enrolment proved so popular that the school needed to expand, so in 1956, with an allotment of donated land, St John's College moved to its present site on Jervois St, Mayfair. Old boys recall that on the day of the move they carried the school furniture back and forth to the new premises over 3 km away, and still had to attend afternoon class. The roll grew more slowly after that. Part of the problem was transporting students from around the Hawke's Bay region, as many students from Napier found it difficult to reach the school before school bus lines were established. There were even calls to make both Sacred Heart and St John's co-educational, to prevent Napier boys travelling to Hastings and Hastings girls travelling to Napier. Today this issue is non-existent, although around 40% of St John's College students still come from the Napier area.
In 1975, St John's College was integrated into the state system under the Private Schools Conditional Integration Act 1975 "on a basis which will preserve and safeguard the special character of the education provided by them."
Over time St John's quietly expanded with the addition of new buildings and land. The 1990s saw drastic changes with the completion of 'The Centre' (the school gymnasium), the music suite and geography room, and the purchase of the old Firth industrial land to expand the playing fields. This gave St John's an additional rugby field, a new area for cricket nets, and another driveway towards Karamu Road with additional car parking. Since 2000 several remodels have been undertaken, including the construction of the new technology wing.
St John's College celebrated its 50th jubilee in 1991, the college's biggest event to date. Many old students returned for the weekend, which included several speakers and functions as well as a variety of activities and inter-house competitions for the students.
Campus
St John's College is situated on Jervois St, Mayfair, in Hastings' northern suburbs. The site layout places all academic buildings close to the main road; they are named after former rectors of the school, for example the Dowling Block, which contains the library and the humanities subjects. Behind the buildings are the school playing fields, which separate the swimming pool and the tennis courts from the academic areas.
Facilities
Current facilities of St John's include:
'The Centre', a gymnasium used for sports, games and physical education (PE) classes, and also the venue for weekly Friday assemblies and other official school occasions such as college masses and the Year 13 Leavers' Mass.
The Kenneth Guthrie Pool, located at the rear of the school property, next to the tennis courts.
Playing Fields, which consist of a variety of interconnected fields containing two rugby fields, a cricket pitch and soccer field, with field hockey students practicing at Park Island Sports Ground in Napier.
St John's tradition
Crest
The school crest incorporates four symbols and draws on the major elements of Archbishop Redwood's crest: the star, the AM, and the "Redwood Cross", i.e. a cross on top of a pile of rocks. The first symbol, the five-pointed star on the upper left side, represents the Virgin Mary; originally it was drawn with six points but was quickly corrected. The second is the cross on the right side of the crest: the Calvary Cross, which represents the place where Jesus was crucified while also representing Christianity. The third, directly below the star, is an A imposed over an M, a common symbol for Ave Maria, Latin for Hail Mary. The fourth is the eagle across the top section of the shield, which represents St John. In the shield's compartment is the college's motto.
House system
The current house system was brought into the college in 1999, with houses named after early Catholic missionaries who came to New Zealand. All the names except Redwood are French, and staff and students pronounce them in the traditional way. The names and colours of the St John's College houses are:
Colin – green
Forest – yellow
Redwood – red
Reignier – blue
Curriculum
Academic results
The number of students achieving national qualifications is well above the national mean for similar decile schools at all levels of NCEA. The percentage of students obtaining NCEA Level 1 increased from 61% in 2003 to 66% in 2004. Levels of attainment in the literacy requirement have recently improved to the current level of over 91%. Achievement in university entrance results has steadily improved over the past three years and is above the national average for schools at this decile level.
The percentage of Year 12 Māori students leaving school with qualifications is well above the national rates for schools in this decile. Retention of Māori students to complete their Year 12 studies is high.
Historical abuse
At least two priests who taught at St. John's College have been accused or convicted of committing sexual offences against children.
Father Alan Woodcock SM abused children at St John's College in Hastings, St Patrick's College in Upper Hutt, Highden in the Manawatu, and Futuna in Wellington. The abuse at St Patrick's (Silverstream) continued even after it was reported to school rector Father Michael "Vince" Curtain and Marist order head Father Fred Bliss; Bliss moved Woodcock to another Catholic institution in Palmerston North, where he continued to abuse children. After leaving the Marist priesthood, Woodcock left New Zealand and took up residence in England in the late 1980s. He was tracked down abroad with the assistance of the Sisters of St Joseph of Nazareth, arrested in 2002 and extradited back to New Zealand, where in 2004 he was convicted of 21 sex offences committed against 11 children between 1978 and 1987 and sentenced to 7 years in prison.
Father Patrick F Minto SM, BA, was mentioned in the New Zealand Royal Commission of Inquiry into Abuse in Care hearing in November 2020. In November 2021 SNAP New Zealand (Survivors Network of those Abused by Priests) published on its Facebook page: "SNAP has reports from our members of Fr Pat Minto SM's offending at St. John's College in Hastings."
Principals
Paul Melloy (2014-2019)
Rob Ferriera (2020-April 2022)
George Rogers (2022–present)
Notable alumni
Greg Cooper – former All Black.
Matt Cooper – former All Black.
Paddy Donovan (1936–2018) – amateur boxer and rugby union player.
Liam Dudding (born 1994) – cricketer.
Chris Eaton (born 1984) – professional rugby union footballer.
Greg Foran – Air New Zealand CEO, former president and CEO of Walmart USA.
Peter Hayden (born 1948/49) – actor, and television series writer, producer and presenter.
Jonah Lowe (born 1996) – professional rugby union player.
Paul Martin (born 1967) – Roman Catholic Bishop of Christchurch (2018–2021); Coadjutor-Archbishop of Wellington (2021–present).
Elijah Niko (born 1990) – professional rugby football player.
Dean Parker – Arts Foundation Laureate.
Brian Roche – business executive.
John Scott – architect.
Gerard Van Bohemen – Justice of the High Court of New Zealand, former Permanent Representative of New Zealand to the United Nations.
Michael Wintringham – former State Services Commissioner.
Eric Young – lead news anchor, Prime News, New Zealand.
References
External links
Official site
Boys' schools in New Zealand
Educational institutions established in 1941
Schools in Hastings, New Zealand
Catholic secondary schools in New Zealand
1941 establishments in New Zealand
|
Abutia-Kpota is a farming community located to the south-west of Ho, the capital town of the Volta Region, Ghana. Abutia Kpota is one of the towns within the Ho West parliamentary constituency. The town is near the Kalakpa Resource Reserve.
History
People
Abutia Kpota is one of the numerous settler towns in the Abutia Traditional Area. The people of Abutia belong to the Eʋedome group of the Eʋes. Abutia has three main traditional towns: Abutia Kloe, Abutia Agove and Abutia Teti. The Paramount Chief of Abutia is from Abutia Teti, with divisional chiefs from Abutia Kloe and Abutia Agove. All chiefs in the Abutia paramountcy pay allegiance to the Paramount Chief, Togbega Abutia Kodzo Gidi IV.
The majority of the people are Christians, whilst a few are traditionalists. In the past there were some Muslims too, but they have since been absorbed into the Eʋe community.
Education
There is an Evangelical Presbyterian school comprising kindergarten, primary and junior high school (JHS). The Basic Education Certificate Examination (BECE) is staged annually at the nearby town of Abutia-Kloe. Performance of the outgoing students had not been encouraging over the years until recently, when one graduating class recorded a 100% overall pass rate in the BECE (i.e. the whole cohort qualified for admission into senior high school).
Agriculture
The community produces cassava, maize, rice and vegetables such as okra. The Abutia Kpota farms, an initiative of the National Service Scheme, deal mainly in maize cultivation.
References
Populated places in the Volta Region
|
```shell
Practical `du` command
Extracting `tar` files to a specific directory
Deleting files in a secure manner
Find the unknown process preventing deleting of files
Delete commands aliases
```
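Two of the topics listed above can be sketched with a small, self-contained example. This is an illustrative sketch only; every path under `/tmp/tar_demo` is a placeholder chosen for the demo.

```shell
# Create a throwaway source tree and a destination directory
mkdir -p /tmp/tar_demo/src /tmp/tar_demo/dest
echo "hello" > /tmp/tar_demo/src/file.txt

# Pack it: -C switches into src first, so the archive stores a bare file.txt
tar -cf /tmp/tar_demo/archive.tar -C /tmp/tar_demo/src file.txt

# Extracting tar files to a specific directory: -C points tar at the target
tar -xf /tmp/tar_demo/archive.tar -C /tmp/tar_demo/dest

# Practical du: human-readable size of the extracted directory
du -sh /tmp/tar_demo/dest
```

The key point is that `-C` changes tar's working directory before it reads or writes file names, both when creating and when extracting.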
|
```python
from localstack.packages import Package, package
@package(name="opensearch")
def opensearch_package() -> Package:
from localstack.services.opensearch.packages import opensearch_package
return opensearch_package
```
|
```shell
Find any Unix / Linux command
Random password generator
Adding directories to your `$PATH`
Conditional command execution (`&&` operator)
Sequential execution using the `;` statement separator
```
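The difference between the `&&` operator and the `;` separator listed above can be shown in two lines (the echoed strings are illustrative):

```shell
# && runs the second command only if the first succeeds (exit status 0)
mkdir -p /tmp/cond_demo && echo "directory ready"

# ; runs the second command regardless of the first command's exit status
false ; echo "this prints even though 'false' failed"
```

In short, `&&` short-circuits on failure, while `;` is purely sequential.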
|
```python
#!/usr/bin/env python
#
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
# 1. Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# 2. Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution.
# 3. Neither the name of the copyright holder nor the
# names of its contributors may be used to endorse or promote products
# derived from this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
# ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE
# LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
# CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
# SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
# INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
# CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
# POSSIBILITY OF SUCH DAMAGE.
#
import unittest
from autothreadharness.harness_case import HarnessCase
class Router_9_2_12(HarnessCase):
    role = HarnessCase.ROLE_ROUTER
    case = '9 2 12'
    golden_devices_required = 3

    def on_dialog(self, dialog, title):
        pass


if __name__ == '__main__':
    unittest.main()
```
|
```shell
List installed packages
Installing a `.deb` package from the terminal
Prevent updating a specific package in Debian systems
Get `apt` to use a mirror / faster mirror
Using `PPAs`
```
|
The 1970 Michigan gubernatorial election was held on November 3, 1970. Republican William Milliken won the election, defeating Democratic nominee Sander Levin.
Primaries
The primary elections occurred on August 4, 1970.
Republican primary
Democratic primary
American Independent primary
General election
Major party candidates
William G. Milliken, Republican
Sander M. Levin, Democratic
Major party running mates
James H. Brickley, Republican
Edward H. McNamara, Democratic
Other candidates
James L. McCormick, American Independent
George Bouse, Socialist Workers
James Horvath, Socialist Labor
Other running mates
Robert E. Cauley, American Independent
Evelyn Kirsch, Socialist Workers
W. Clifford Bentley, Socialist Labor
Results overview
Results by county
Notes
References
Michigan
Michigan gubernatorial elections
November 1970 events in the United States
1970 Michigan elections
|
Hakametsä is a smaller district of Tampere, Finland, located about four kilometers from its city center. The neighboring parts of Hakametsä are Huikas, Ristinarkku, Messukylä, Vuohenoja, Kalevanrinne, Kaleva, Kissanmaa and Uusikylä.
Hakametsä was once the pasture of the Messukylä's parsonage, and it was settled in the late 19th century. The first inhabitants were "gardener Juho Fritzkopf, carpenter August Heino, mixed worker Nestor Rajala, butcher Kalle Lindevall, gardener Aksel Gauffin, baker Kustaa Eklund and baker Lahtinen". The first town plans of Hakametsä and Ristinarkku were confirmed in the 1950s. Finland's first ice rink, Tampere Ice Stadium (also known as Hakametsä Arena), was completed in Hakametsä for the 1965 Ice Hockey World Championships.
Further reading
References
Districts of Tampere
|
```javascript
import React from 'react';
import SvgIcon from '../../SvgIcon';
const CommunicationImportContacts = (props) => (
<SvgIcon {...props}>
<path d="M21 5c-1.11-.35-2.33-.5-3.5-.5-1.95 0-4.05.4-5.5 1.5-1.45-1.1-3.55-1.5-5.5-1.5S2.45 4.9 1 6v14.65c0 .25.25.5.5.5.1 0 .15-.05.25-.05C3.1 20.45 5.05 20 6.5 20c1.95 0 4.05.4 5.5 1.5 1.35-.85 3.8-1.5 5.5-1.5 1.65 0 3.35.3 4.75 1.05.1.05.15.05.25.05.25 0 .5-.25.5-.5V6c-.6-.45-1.25-.75-2-1zm0 13.5c-1.1-.35-2.3-.5-3.5-.5-1.7 0-4.15.65-5.5 1.5V8c1.35-.85 3.8-1.5 5.5-1.5 1.2 0 2.4.15 3.5.5v11.5z"/>
</SvgIcon>
);
CommunicationImportContacts.displayName = 'CommunicationImportContacts';
CommunicationImportContacts.muiName = 'SvgIcon';
export default CommunicationImportContacts;
```
|
```java
/*
*
*
* path_to_url
*
* Unless required by applicable law or agreed to in writing, software
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
*/
package spec.cuba.web.navigation.entityinference.testscreens.withtype.byinterface;
import com.haulmont.cuba.gui.screen.EditorScreen;
import com.haulmont.cuba.gui.screen.Screen;
import com.haulmont.cuba.security.entity.User;
public class BaseEditorScreen extends Screen implements EditorScreen<User> {

    @Override
    public void setEntityToEdit(User entity) {
    }

    @Override
    public User getEditedEntity() {
        return null;
    }

    @Override
    public boolean isLocked() {
        return false;
    }

    @Override
    public boolean hasUnsavedChanges() {
        return false;
    }
}
```
|
Agnes Odhiambo may refer to:
Agnes Odhiambo (accountant), Kenyan accountant who serves as the Controller of the Budget of Kenya
Agnes Odhiambo (activist), Kenyan human rights activist who works at Human Rights Watch
|
```java
/*
 *
* path_to_url
*
* Unless required by applicable law or agreed to in writing, software
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
*/
package org.activiti.engine.impl;
import java.io.Serializable;
import org.flowable.engine.delegate.DelegateExecution;
/**
* @author Tom Baeyens
*/
public interface Condition extends Serializable {

    boolean evaluate(String sequenceFlowId, DelegateExecution execution);
}
```
|
```javascript
(...[a, b]) => {}
```
|
Rosička is a municipality and village in Žďár nad Sázavou District in the Vysočina Region of the Czech Republic. It has about 50 inhabitants.
Rosička lies approximately south-west of Žďár nad Sázavou, north-east of Jihlava, and south-east of Prague.
References
Villages in Žďár nad Sázavou District
|
Ilfracombe Cemetery (properly the Marlborough Road Cemetery) is the burial ground for the town of Ilfracombe in Devon in the United Kingdom. The cemetery is owned and maintained by North Devon Council.
Located on the town's Marlborough Road, the cemetery is 8.88 acres in size and came into operation in 1926, with the first burial taking place on 22 April 1926. The cemetery features a variety of grave types in a mature landscaped setting, including areas for the burial or scattering of ashes and the burial of children. The cemetery has a small chapel where funeral services can be held. The cemetery has 19 war graves from World War II, all of which are maintained by the Commonwealth War Graves Commission.
The cemetery is not to be confused with the older and now abandoned and overgrown Score Woods Cemetery in Ilfracombe.
Gallery
References
External links
Marlborough Road Cemetery - Gravestone Photographic Resource(GPR)
Cemeteries in Devon
Ilfracombe
Buildings and structures in Ilfracombe
Commonwealth War Graves Commission cemeteries in England
|
Gumbranch, alternatively spelled Gum Branch, is a city in Liberty County, Georgia, United States. It is a part of the Hinesville-Fort Stewart metropolitan statistical area. The population was 264 at the 2010 census.
History
Gumbranch was incorporated on January 1, 1979. While officially incorporated as "Gumbranch", the alternative spelling "Gum Branch" is often used, such as with the nearby Gum Branch Park and Gum Branch Baptist Church.
Geography
Gumbranch is located in western Liberty County at (31.838765, -81.684384). Georgia State Route 196 passes through the community, leading east to Hinesville, the county seat, and northwest to Glennville.
According to the United States Census Bureau, Gumbranch has a total area of , all of it recorded as land.
Demographics
As of the 2010 United States Census, there were 264 people living in the city. The racial makeup of the city was 91.7% White, 4.5% Black, 0.8% Native American, 0.4% Pacific Islander and 0.8% from two or more races. 1.9% were Hispanic or Latino of any race.
As of the census of 2000, there were 273 people, 100 households, and 76 families living in the city. The population density was . There were 129 housing units at an average density of . The racial makeup of the city was 91.58% White, 5.86% African American, 2.20% Asian, 0.37% from other races. Hispanic or Latino of any race were 1.10% of the population.
There were 100 households, out of which 34.0% had children under the age of 18 living with them, 60.0% were married couples living together, 9.0% had a female householder with no husband present, and 24.0% were non-families. 22.0% of all households were made up of individuals, and 7.0% had someone living alone who was 65 years of age or older. The average household size was 2.73 and the average family size was 3.17.
In the city, the population was spread out, with 28.9% under the age of 18, 11.7% from 18 to 24, 28.9% from 25 to 44, 24.5% from 45 to 64, and 5.9% who were 65 years of age or older. The median age was 30 years. For every 100 females, there were 105.3 males. For every 100 females age 18 and over, there were 98.0 males.
The median income for a household in the city was $35,625, and the median income for a family was $40,938. Males had a median income of $30,625 versus $17,917 for females. The per capita income for the city was $13,158. About 19.7% of families and 24.6% of the population were below the poverty line, including 28.6% of those under the age of eighteen and 46.2% of those 65 or over.
Government and infrastructure
Liberty County Fire Services operates Station 15 Gumbranch.
Education
The Liberty County School District operates public schools that serve Gumbranch.
References
External links
Gumbranch - State of Georgia
Cities in Georgia (U.S. state)
Cities in Liberty County, Georgia
Hinesville metropolitan area
Populated places established in 1979
|
```c++
// your_sha256_hash----------------------------------
//
// Permission is hereby granted, free of charge, to any person obtaining a copy of this software and
// associated documentation files (the "Software"), to deal in the Software without restriction,
// including without limitation the rights to use, copy, modify, merge, publish, distribute,
// sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is
// furnished to do so, subject to the following conditions:
//
// The above copyright notice and this permission notice shall be included in all copies or
// substantial portions of the Software.
//
// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT
// NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
// NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
// DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
// OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
// your_sha256_hash----------------------------------
// Local:
#include "VideoFrameWriter.h"
#include "Logger.h"
#if WIN32
#include "WindowsFrameWriter.h"
#else
#include "PosixFrameWriter.h"
#endif
// STL:
#include <exception>
#include <sstream>
#define LOG_COMPONENT Logger::LOG_VIDEO
namespace malmo
{
VideoFrameWriter::VideoFrameWriter(std::string path, std::string frame_info_filename, short width, short height, int frames_per_second, int channels, bool drop_input_frames)
: path(path)
, width(width)
, height(height)
, frames_per_second(frames_per_second)
, drop_input_frames(drop_input_frames)
, channels(channels)
, is_open(false)
, frame_duration(boost::posix_time::milliseconds(1000) / frames_per_second)
{
boost::filesystem::path fs_path(path);
if (boost::filesystem::is_directory(fs_path)) {
this->frame_info_path = fs_path / frame_info_filename;
}
else {
this->frame_info_path = fs_path.parent_path() / frame_info_filename;
}
}
VideoFrameWriter::~VideoFrameWriter()
{
this->close();
}
void VideoFrameWriter::open()
{
this->close();
// Create helpful script:
boost::filesystem::path fs_path(this->path);
std::string ffmpeg_helpfile = (fs_path.parent_path() / (fs_path.stem().string() + "_to_pngs.sh")).string();
std::ofstream helpfile(ffmpeg_helpfile);
helpfile << "#! To extract individual frames from the mp4\n";
helpfile << "mkdir " << fs_path.stem().string() << "_frames\n";
helpfile << "ffmpeg -i " << fs_path.filename() << " " << fs_path.stem().string() << "_frames/frame_%06d.png\n";
this->frame_info_stream.open(this->frame_info_path.string());
this->frame_info_stream << "width=" << this->width << std::endl;
this->frame_info_stream << "height=" << this->height << std::endl;
this->is_open = true;
this->start_time = boost::posix_time::microsec_clock::universal_time();
this->last_timestamp = this->start_time - this->frame_duration;
this->frame_index = 0;
this->frames_available = false;
this->frame_writer_thread = boost::thread(&VideoFrameWriter::writeFrames, this);
}
bool VideoFrameWriter::isOpen() const
{
return this->is_open;
}
void VideoFrameWriter::close()
{
LOGSECTION(LOG_FINE, "In VideoFrameWriter::close()...");
if (this->is_open) {
this->frame_info_stream.close();
this->is_open = false;
LOGFINE(LT("Set is_open to false"));
{
boost::lock_guard<boost::mutex> frames_available_guard(this->frames_available_mutex);
this->frames_available = true;
}
LOGFINE(LT("Notifying worker thread that frames are available, in order to close."));
this->frames_available_cond.notify_one();
LOGFINE(LT("Waiting for worker thread to join."));
this->frame_writer_thread.join();
LOGFINE(LT("Worker thread joined."));
LOGFINE(LT("Frames received for writing: "), this->frame_index);
LOGFINE(LT("Frames actually written: "), this->frames_actually_written);
}
}
void VideoFrameWriter::writeFrames()
{
this->frames_actually_written = 0;
while (this->is_open) {
{
boost::unique_lock<boost::mutex> lock(this->frames_available_mutex);
while (!this->frames_available) {
this->frames_available_cond.wait(lock);
}
}
while (true) {
TimestampedVideoFrame frame;
{
boost::lock_guard<boost::mutex> buffer_guard(this->frame_buffer_mutex);
if (this->frame_buffer.size() > 0) {
frame = this->frame_buffer.front();
this->frame_buffer.pop();
}
}
if (frame.width == 0) {
boost::lock_guard<boost::mutex> frames_available_guard(this->frames_available_mutex);
this->frames_available = false;
break;
}
try
{
writeSingleFrame(frame, this->frames_actually_written);
this->frames_actually_written++;
}
catch (std::exception& e)
{
LOGERROR(LT("Failed to write frame: "), e.what());
}
}
}
}
void VideoFrameWriter::writeSingleFrame(const TimestampedVideoFrame& frame, int count)
{
LOGTRACE(LT("Writing frame "), count + 1, LT(", "), frame.width, LT("x"), frame.height, LT("x"), frame.channels);
if (frame.channels == 4)
{
if (frame.frametype == TimestampedVideoFrame::DEPTH_MAP)
{
// For making videos out of 32bpp depth maps, what exactly should we display?
// We could reduce to greyscale, but that way we loose a lot of precision.
// Instead, convert to an HSV colour cone, which hopefully gives a greater range
// of colour values to map to.
const float* fPixels = reinterpret_cast<const float*>(&(frame.pixels[0]));
char *out_pixels = new char[frame.width * frame.height * 3];
for (int i = 0; i < frame.width*frame.height; i++)
{
float f = fPixels[i];
float h = 60.0f * f;
while (h >= 360.0)
h -= 360.0;
float s = 1.0;
float v = 1.0f - (f / 200.0f);
if (v < 0)
v = 0;
if (v > 1.0)
v = 1.0;
h = h / 60.0f;
float fract = h - floor(h);
v *= 255.0;
float p = v*(1.0f - s);
float q = v*(1.0f - s*fract);
float t = v*(1.0f - s*(1.0f - fract));
unsigned int out;
if (0. <= h && h < 1.)
out = int(v) + (int(t) << 8) + (int(p) << 16);
else if (1. <= h && h < 2.)
out = int(q) + (int(v) << 8) + (int(p) << 16);
else if (2. <= h && h < 3.)
out = int(p) + (int(v) << 8) + (int(t) << 16);
else if (3. <= h && h < 4.)
out = int(p) + (int(q) << 8) + (int(v) << 16);
else if (4. <= h && h < 5.)
out = int(t) + (int(p) << 8) + (int(v) << 16);
else if (5. <= h && h < 6.)
out = int(v) + (int(p) << 8) + (int(q) << 16);
else
out = 0;
out_pixels[3 * i] = out & 0xff;
out_pixels[3 * i + 1] = (out >> 8) & 0xff;
out_pixels[3 * i + 2] = (out >> 16) & 0xff;
}
this->doWrite(out_pixels, frame.width, frame.height, count);
delete[] out_pixels;
}
else
{
// extract DDD from RGBD
char *out_pixels = new char[frame.width * frame.height * 3];
for (int i = 0; i < frame.width*frame.height; i++)
{
out_pixels[i * 3] = out_pixels[i * 3 + 1] = out_pixels[i * 3 + 2] = frame.pixels[i * 4 + 3];
}
this->doWrite(out_pixels, frame.width, frame.height, count);
delete[] out_pixels;
}
}
else if (frame.channels == 3 || frame.channels == 1)
{
// write the pixel data directly
this->doWrite((char*)&frame.pixels[0], frame.width, frame.height, count);
}
else throw std::runtime_error("Unsupported number of channels");
}
bool VideoFrameWriter::write(TimestampedVideoFrame frame)
{
boost::lock_guard<boost::mutex> write_guard(this->write_mutex);
if (!this->drop_input_frames || frame.timestamp - this->last_timestamp >= this->frame_duration) {
this->last_timestamp = frame.timestamp;
std::stringstream name;
name << "frame_" << std::setfill('0') << std::setw(6) << this->frame_index + 1;
std::stringstream posdata;
posdata << "xyzyp: " << frame.xPos << " " << frame.yPos << " " << frame.zPos << " " << frame.yaw << " " << frame.pitch;
this->frame_info_stream << boost::posix_time::to_iso_string(frame.timestamp) << " " << name.str() << " " << posdata.str() << std::endl;
this->frame_index++;
{
boost::lock_guard<boost::mutex> buffer_guard(this->frame_buffer_mutex);
LOGTRACE(LT("Pushing frame "), this->frame_index, LT(", "), frame.width, LT("x"), frame.height, LT("x"), frame.channels, LT(" to write buffer."));
this->frame_buffer.push(frame);
}
{
boost::lock_guard<boost::mutex> frames_available_guard(this->frames_available_mutex);
this->frames_available = true;
}
this->frames_available_cond.notify_one();
return true;
}
return false;
}
std::unique_ptr<VideoFrameWriter> VideoFrameWriter::create(std::string path, std::string info_filename, short width, short height, int frames_per_second, int64_t bit_rate, int channels, bool drop_input_frames)
{
#if WIN32
std::unique_ptr<VideoFrameWriter> instance( new WindowsFrameWriter(path, info_filename, width, height, frames_per_second, bit_rate, channels, drop_input_frames) );
#else
std::unique_ptr<VideoFrameWriter> instance( new PosixFrameWriter(path, info_filename, width, height, frames_per_second, bit_rate, channels, drop_input_frames) );
#endif
return instance;
}
}
#undef LOG_COMPONENT
```
|
In graph theory, a maximal independent set (MIS) or maximal stable set is an independent set that is not a subset of any other independent set. In other words, no vertex outside the set can be added to it without destroying independence: the set is maximal with respect to the independent-set property.
For example, in a path graph with three vertices a, b, and c and two edges ab and bc, the sets {b} and {a, c} are both maximally independent. The set {a} is independent, but is not maximal independent, because it is a subset of the larger independent set {a, c}. In this same graph, the maximal cliques are the sets {a, b} and {b, c}.
A MIS is also a dominating set in the graph, and every dominating set that is independent must be maximal independent, so MISs are also called independent dominating sets.
A graph may have many MISs of widely varying sizes; the largest MIS (or possibly several equally large ones) is called a maximum independent set. The graphs in which all maximal independent sets have the same size are called well-covered graphs.
The phrase "maximal independent set" is also used to describe maximal subsets of independent elements in mathematical structures other than graphs, and in particular in vector spaces and matroids.
Two algorithmic problems are associated with MISs: finding a single MIS in a given graph and listing all MISs in a given graph.
Definition
For a graph G = (V, E), an independent set S is a maximal independent set if for every vertex v ∈ V, one of the following is true:
v ∈ S, or
N(v) ∩ S ≠ ∅, where N(v) denotes the neighbors of v.
The above can be restated as: every vertex either belongs to the independent set or has at least one neighbor that belongs to the independent set. As a result, every edge of the graph has at least one endpoint not in S. However, it is not true that every edge of the graph has at least one endpoint in S.
Notice that a neighbor of a vertex in the independent set cannot itself be in the set, since adjacent vertices are excluded by the definition of independence.
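The two conditions above translate directly into a membership test. The following is a minimal Python sketch (the function names and the dictionary-of-sets adjacency representation are illustrative choices, not from the source):

```python
from itertools import combinations

def is_independent(adj, s):
    """No two vertices of s share an edge."""
    return all(v not in adj[u] for u, v in combinations(s, 2))

def is_maximal_independent(adj, s):
    """s is independent, and every vertex is either in s or has a neighbour in s."""
    s = set(s)
    return is_independent(adj, s) and all(v in s or adj[v] & s for v in adj)

# Three-vertex path a - b - c:
adj = {'a': {'b'}, 'b': {'a', 'c'}, 'c': {'b'}}
print(is_maximal_independent(adj, {'b'}))       # True
print(is_maximal_independent(adj, {'a', 'c'}))  # True
print(is_maximal_independent(adj, {'a'}))       # False: {a} extends to {a, c}
```

The last call fails precisely because vertex c is neither in {a} nor adjacent to it.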
Related vertex sets
If S is a maximal independent set in some graph, it is a maximal clique or maximal complete subgraph in the complementary graph. A maximal clique is a set of vertices that induces a complete subgraph, and that is not a subset of the vertices of any larger complete subgraph. That is, it is a set S such that every pair of vertices in S is connected by an edge and every vertex not in S is missing an edge to at least one vertex in S. A graph may have many maximal cliques, of varying sizes; finding the largest of these is the maximum clique problem.
Some authors include maximality as part of the definition of a clique, and refer to maximal cliques simply as cliques.
The complement of a maximal independent set, that is, the set of vertices not belonging to the independent set, forms a minimal vertex cover. That is, the complement is a vertex cover, a set of vertices that includes at least one endpoint of each edge, and is minimal in the sense that none of its vertices can be removed while preserving the property that it is a cover. Minimal vertex covers have been studied in statistical mechanics in connection with the hard-sphere lattice gas model, a mathematical abstraction of fluid-solid state transitions.
Every maximal independent set is a dominating set, a set of vertices such that every vertex in the graph either belongs to the set or is adjacent to the set. A set of vertices is a maximal independent set if and only if it is an independent dominating set.
Graph family characterizations
Certain graph families have also been characterized in terms of their maximal cliques or maximal independent sets. Examples include the maximal-clique irreducible and hereditary maximal-clique irreducible graphs. A graph is said to be maximal-clique irreducible if every maximal clique has an edge that belongs to no other maximal clique, and hereditary maximal-clique irreducible if the same property is true for every induced subgraph. Hereditary maximal-clique irreducible graphs include triangle-free graphs, bipartite graphs, and interval graphs.
Cographs can be characterized as graphs in which every maximal clique intersects every maximal independent set, and in which the same property is true in all induced subgraphs.
Bounding the number of sets
Moon and Moser (1965) showed that any graph with n vertices has at most 3^(n/3) maximal cliques. Complementarily, any graph with n vertices also has at most 3^(n/3) maximal independent sets. A graph with exactly 3^(n/3) maximal independent sets is easy to construct: simply take the disjoint union of n/3 triangle graphs. Any maximal independent set in this graph is formed by choosing one vertex from each triangle. The complementary graph, with exactly 3^(n/3) maximal cliques, is a special type of Turán graph; because of their connection with Moon and Moser's bound, these graphs are also sometimes called Moon-Moser graphs. Tighter bounds are possible if one limits the size of the maximal independent sets: the number of maximal independent sets of size k in any n-vertex graph is at most ⌊n/k⌋^(k − (n mod k)) · (⌊n/k⌋ + 1)^(n mod k).
The graphs achieving this bound are again Turán graphs.
Certain families of graphs may, however, have much more restrictive bounds on the numbers of maximal independent sets or maximal cliques. If all n-vertex graphs in a family of graphs have O(n) edges, and if every subgraph of a graph in the family also belongs to the family, then each graph in the family can have at most O(n) maximal cliques, all of which have size O(1). For instance, these conditions are true for the planar graphs: every n-vertex planar graph has at most 3n − 6 edges, and a subgraph of a planar graph is always planar, from which it follows that each planar graph has O(n) maximal cliques (of size at most four). Interval graphs and chordal graphs also have at most n maximal cliques, even though they are not always sparse graphs.
The number of maximal independent sets in n-vertex cycle graphs is given by the Perrin numbers, and the number of maximal independent sets in n-vertex path graphs is given by the Padovan sequence. Therefore, both numbers are proportional to powers of 1.324718, the plastic number.
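These counts are small enough to check by brute force for short paths and cycles. A sketch (exponential-time enumeration, for illustration only; the helper names are hypothetical):

```python
from itertools import combinations

def count_mis(n_vertices, edges):
    """Brute-force count of maximal independent sets of a small graph."""
    adj = {v: set() for v in range(n_vertices)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)

    def independent(s):
        return all(b not in adj[a] for a, b in combinations(s, 2))

    def maximal(s):
        return all(v in s or adj[v] & s for v in adj)

    count = 0
    for r in range(1, n_vertices + 1):
        for s in combinations(range(n_vertices), r):
            s = set(s)
            if independent(s) and maximal(s):
                count += 1
    return count

path = lambda n: [(i, i + 1) for i in range(n - 1)]
cycle = lambda n: path(n) + [(n - 1, 0)]

print([count_mis(n, path(n)) for n in range(1, 8)])   # 1, 2, 2, 3, 4, 5, 7
print([count_mis(n, cycle(n)) for n in range(3, 8)])  # 3, 2, 5, 5, 7
```

The path counts follow the Padovan recurrence p(n) = p(n − 2) + p(n − 3), and the cycle counts match the Perrin numbers, as the text states.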
Finding a single maximal independent set
Sequential algorithm
Given a graph G = (V, E), it is easy to find a single MIS using the following algorithm:
Initialize I to an empty set.
While V is not empty:
Choose a node v∈V;
Add v to the set I;
Remove from V the node v and all its neighbours.
Return I.
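The steps above can be sketched in a few lines of Python (adjacency as a dict of neighbour sets; the names are illustrative):

```python
def sequential_mis(adj):
    """Greedy MIS: repeatedly pick a vertex, add it to I, discard its neighbours."""
    remaining = set(adj)
    mis = set()
    while remaining:
        v = remaining.pop()        # any choice rule works; pop() is arbitrary
        mis.add(v)
        remaining -= adj[v]        # neighbours of v can no longer join I
    return mis

adj = {'a': {'b'}, 'b': {'a', 'c'}, 'c': {'b'}}
print(sequential_mis(adj) in ({'b'}, {'a', 'c'}))  # True
```

Whichever vertex is popped first, the result is one of the two MISs of the three-vertex path.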
Random-selection parallel algorithm [Luby's Algorithm]
The following algorithm finds a MIS in O(log n) expected time.
Initialize I to an empty set.
While V is not empty:
Choose a random set of vertices S ⊆ V, by selecting each vertex v independently with probability 1/(2d(v)), where d is the degree of v (the number of neighbours of v).
For every edge in E, if both its endpoints are in the random set S, then remove from S the endpoint whose degree is lower (i.e. has fewer neighbours). Break ties arbitrarily, e.g. using a lexicographic order on the vertex names.
Add the set S to I.
Remove from V the set S and all the neighbours of nodes in S.
Return I.
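A sequential simulation of the rounds above might look like this in Python (a sketch, not a true parallel implementation; the handling of isolated vertices, for which 1/(2d(v)) is undefined, is an added assumption):

```python
import random

def luby_mis(adj):
    """Sketch of Luby's random-selection MIS, rounds simulated sequentially."""
    adj = {v: set(ns) for v, ns in adj.items()}
    mis = set()
    while adj:
        # Step 1: each vertex joins the candidate set independently with
        # probability 1/(2 d(v)); isolated vertices join outright (assumption).
        picked = {v for v, ns in adj.items()
                  if not ns or random.random() < 1 / (2 * len(ns))}
        s = set(picked)
        # Step 2: for every edge with both endpoints picked, drop the
        # lower-degree endpoint, breaking ties by vertex name.
        for v in picked:
            for w in adj[v] & picked:
                s.discard(min((len(adj[v]), v), (len(adj[w]), w))[1])
        mis |= s
        # Step 3: remove S and all neighbours of S from the graph.
        removed = s | {w for v in s for w in adj[v]}
        adj = {v: ns - removed for v, ns in adj.items() if v not in removed}
    return mis
```

Each surviving candidate is independent of the other survivors, so the rounds only ever add valid vertices, and the loop ends once every vertex is in the set or adjacent to it.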
ANALYSIS: For each node v, divide its neighbours to lower neighbours (whose degree is lower than the degree of v) and higher neighbours (whose degree is higher than the degree of v), breaking ties as in the algorithm.
Call a node v bad if more than 2/3 of its neighbors are higher neighbours. Call an edge bad if both its endpoints are bad; otherwise the edge is good.
At least 1/2 of all edges are always good. PROOF: Build a directed version of G by directing each edge to the node with the higher degree (breaking ties arbitrarily). Then for every bad node, the number of outgoing edges is more than 2 times the number of incoming edges. So every bad edge that enters a node v can be matched to a distinct pair of edges that exit v. Hence the total number of edges is at least 2 times the number of bad edges.
For every good node u, the probability that a neighbour of u is selected to S is at least a certain positive constant. PROOF: The probability that NO neighbour of u is selected to S is at most the probability that none of u's lower neighbours is selected. For each lower neighbour v, the probability that it is not selected is 1 − 1/(2d(v)), which is at most 1 − 1/(2d(u)) (since d(v) ≤ d(u)). The number of such neighbours is at least d(u)/3, since u is good. Hence the probability that no lower neighbour is selected is at most (1 − 1/(2d(u)))^(d(u)/3) ≤ exp(−1/6), so some neighbour of u is selected with probability at least 1 − exp(−1/6).
For every node u that is selected to S, the probability that u will be removed from S is at most 1/2. PROOF: This probability is at most the probability that a higher neighbour of u is also selected to S. For each higher neighbour v, the probability that it is selected is at most 1/(2d(v)), which is at most 1/(2d(u)) (since d(v) ≥ d(u)). By the union bound, the probability that some higher neighbour is selected is at most d(u) · 1/(2d(u)) = 1/2.
Hence, for every good node u, the probability that a neighbour of u is selected to S and remains in S is a certain positive constant. Hence, the probability that u is removed, in each step, is at least a positive constant.
Hence, for every good edge e, the probability that e is removed, in each step, is at least a positive constant. So the number of good edges drops by at least a constant factor each step.
Since at least half the edges are good, the total number of edges also drops by a constant factor each step.
Hence, the number of steps is O(log m) in expectation, where m is the number of edges. This is bounded by O(log n), since m is at most n²/2.
A worst-case graph, in which the expected number of steps is Θ(log n), is a graph made of n/2 connected components, each with 2 nodes. The degree of all nodes is 1, so each node is selected with probability 1/2, and with probability 1/4 both nodes in a component are not chosen. Hence, the number of nodes drops by a factor of 4 each step in expectation, and the expected number of steps is Θ(log n).
Random-priority parallel algorithm
The following algorithm is better than the previous one in that at least one new node is always added in each connected component:
Initialize I to an empty set.
While V is not empty, each node v does the following:
Selects a random number r(v) in [0,1] and sends it to its neighbours;
If r(v) is smaller than the numbers of all neighbours of v, then v inserts itself into I, removes itself from V and tells its neighbours about this;
If v heard that one of its neighbours got into I, then v removes itself from V.
Return I.
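The rounds above can be simulated sequentially (a Python sketch of the parallel protocol; names are illustrative):

```python
import random

def random_priority_mis(adj):
    """Random-priority MIS: local minima of fresh random draws join I each round."""
    adj = {v: set(ns) for v, ns in adj.items()}
    mis = set()
    while adj:
        r = {v: random.random() for v in adj}   # step 1: draw and "send" r(v)
        # Step 2: a vertex whose draw beats all remaining neighbours enters I.
        winners = {v for v in adj if all(r[v] < r[w] for w in adj[v])}
        mis |= winners
        # Step 3: winners and their neighbours leave V.
        removed = winners | {w for v in winners for w in adj[v]}
        adj = {v: ns - removed for v, ns in adj.items() if v not in removed}
    return mis
```

The vertex with the globally smallest draw always wins its round, so every iteration removes at least one vertex per connected component, mirroring the progress guarantee in the text.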
Note that in every step, the node with the smallest number in each connected component always enters I, so there is always some progress. In particular, in the worst-case of the previous algorithm (n/2 connected components with 2 nodes each), a MIS will be found in a single step.
ANALYSIS:
Each of the events defined below occurs with probability at least 1/(d(v) + d(w)). PROOF: For each edge connecting a pair of nodes (v, w), replace it with two directed edges, one from (v, w) and the other (w, v). Notice that |E| is now twice as large. For every pair of directed edges, define two events: (v → w) and (w → v), meaning v pre-emptively removes w, and w pre-emptively removes v, respectively. The event (v → w) occurs when r(v) < r(w) and r(v) < r(x) for every x that is a neighbor of v or a neighbor of w. Recall that each node is given a random number in the same [0, 1] range. In a simple example with two disjoint nodes, each has probability 1/2 of being smallest. If there are three disjoint nodes, each has probability 1/3 of being smallest. In the case of v, it has probability at least 1/(d(v) + d(w)) of being smallest, because it is possible that a neighbor of v is also a neighbor of w, so a node may be double counted. Using the same logic, the event (w → v) also has probability at least 1/(d(v) + d(w)) of occurring.
When the events (v → w) and (w → v) occur, they remove d(w) and d(v) directed outgoing edges, respectively. PROOF: In the event (v → w), when w is removed, all edges incident to w are also removed. The number of outgoing directed edges removed from w is d(w). With the same logic, (w → v) removes the d(v) directed outgoing edges of v.
In each iteration of step 2, in expectation, half the edges are removed. PROOF: If the event (v → w) happens then all d(w) outgoing edges of w are removed; hence the expected number of edges removed due to this event is at least d(w)/(d(v) + d(w)). The same is true for the reverse event (w → v): the expected number of edges removed is at least d(v)/(d(v) + d(w)). Hence, for every undirected edge (v, w), the expected number of directed edges removed due to one of its endpoints having the smallest value is d(w)/(d(v) + d(w)) + d(v)/(d(v) + d(w)) = 1. Summing over all |E| undirected edges gives an expected |E| directed edges removed every step; since the directed graph has 2|E| edges, half the edges are removed in expectation every step.
Hence, the expected run time of the algorithm is O(log m), which is O(log n).
Random-permutation parallel algorithm [Blelloch's Algorithm]
Instead of randomizing in each step, it is possible to randomize once, at the beginning of the algorithm, by fixing a random ordering on the nodes. Given this fixed ordering, the following parallel algorithm achieves exactly the same MIS as the sequential algorithm above (i.e. the result is deterministic):
Initialize I to an empty set.
While V is not empty:
Let W be the set of vertices in V with no earlier neighbours (based on the fixed ordering);
Add W to I;
Remove from V the nodes in the set W and all their neighbours.
Return I.
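The determinism claim can be illustrated by running both versions on the same fixed order (a sequential Python simulation; function names are illustrative):

```python
import random

def permutation_mis(adj, order):
    """Parallel rounds over a fixed order: vertices with no earlier remaining neighbour join."""
    adj = {v: set(ns) for v, ns in adj.items()}
    rank = {v: i for i, v in enumerate(order)}
    mis = set()
    while adj:
        w = {v for v in adj if all(rank[v] < rank[u] for u in adj[v])}
        mis |= w
        removed = w | {u for v in w for u in adj[v]}
        adj = {v: ns - removed for v, ns in adj.items() if v not in removed}
    return mis

def sequential_greedy(adj, order):
    """Reference: the sequential algorithm scanning vertices in the same order."""
    mis, dead = set(), set()
    for v in order:
        if v not in dead:
            mis.add(v)
            dead |= {v} | adj[v]
    return mis

random.seed(2)
n = 20
adj = {v: set() for v in range(n)}
for u in range(n):
    for v in range(u + 1, n):
        if random.random() < 0.2:
            adj[u].add(v)
            adj[v].add(u)
order = list(range(n))
random.shuffle(order)
print(permutation_mis(adj, order) == sequential_greedy(adj, order))  # True
```

A vertex enters the parallel MIS exactly when every earlier neighbour has been blocked, which is the same condition the greedy scan applies one vertex at a time.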
Between the totally sequential and the totally parallel algorithms, there is a continuum of algorithms that are partly sequential and partly parallel. Given a fixed ordering on the nodes and a factor δ∈(0,1], the following algorithm returns the same MIS:
Initialize I to an empty set.
While V is not empty:
Select a factor δ∈(0,1].
Let P be the set of δn nodes that are first in the fixed ordering.
Let W be a MIS on P using the totally parallel algorithm.
Add W to I;
Remove from V all the nodes in the prefix P, and all the neighbours of nodes in the set W.
Return I.
Setting δ=1/n gives the totally sequential algorithm; setting δ=1 gives the totally parallel algorithm.
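The partly-sequential, partly-parallel continuum above can be sketched as follows. This is an assumption-laden illustration: the prefix MIS is computed here with a sequential greedy pass, which yields the same set the fully parallel subroutine would, and the function name is hypothetical.

```python
import math

def prefix_mis(adj, order, delta):
    """Sketch of the partly-parallel algorithm: repeatedly take the
    first ceil(delta * |remaining|) nodes of the fixed ordering as a
    prefix P, compute an MIS W of P, then remove P and all neighbours
    of nodes in W."""
    rank = {v: i for i, v in enumerate(order)}
    live = {v: set(nbrs) for v, nbrs in adj.items()}
    mis = set()
    while live:
        remaining = sorted(live, key=rank.__getitem__)
        prefix = remaining[:max(1, math.ceil(delta * len(remaining)))]
        # Greedy on the prefix equals the parallel subroutine's MIS.
        w = set()
        for v in prefix:
            if not (adj[v] & w):
                w.add(v)
        mis |= w
        removed = set(prefix) | {u for v in w for u in live[v]}
        for v in removed:
            live.pop(v, None)
        for v in live:
            live[v] -= removed
    return mis
```

Any δ in (0, 1] yields the same MIS as the fully sequential greedy pass over the whole ordering, as the text states for the endpoints δ=1/n and δ=1.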
ANALYSIS: With a proper selection of the parameter δ in the partially parallel algorithm, it is possible to guarantee that it finishes after at most log(n) calls to the fully parallel algorithm, and the number of steps in each call is at most log(n). Hence the total run-time of the partially parallel algorithm is O(log^2(n)). Hence the run-time of the fully parallel algorithm is also at most O(log^2(n)). The main proof steps are:
If, in step i, we select δ_i = 2^i/D, where D is the maximum degree of a node in the graph, then WHP all nodes remaining after step i have degree at most D/2^i. Thus, after log(D) steps, all remaining nodes have degree 0 (since D < n), and can be removed in a single step.
If, in any step, the degree of each node is at most d, and we select δ = C/d (for any constant C), then WHP the longest path in the directed graph determined by the fixed ordering has length O(log n). Hence the fully parallel algorithm takes at most O(log n) steps (since the longest path is a worst-case bound on the number of steps in that algorithm).
Combining these two facts gives that, if we select δ_i = 2^i/D, then WHP the run-time of the partially parallel algorithm is O(log^2(n)).
Listing all maximal independent sets
An algorithm for listing all maximal independent sets or maximal cliques in a graph can be used as a subroutine for solving many NP-complete graph problems. Most obviously, the solutions to the maximum independent set problem, the maximum clique problem, and the minimum independent dominating problem must all be maximal independent sets or maximal cliques, and can be found by an algorithm that lists all maximal independent sets or maximal cliques and retains the ones with the largest or smallest size. Similarly, the minimum vertex cover can be found as the complement of one of the maximal independent sets. It has also been observed that listing maximal independent sets can be used to find 3-colorings of graphs: a graph can be 3-colored if and only if the complement of one of its maximal independent sets is bipartite. This approach was used not only for 3-coloring but as part of a more general graph coloring algorithm, and similar approaches to graph coloring have been refined by other authors since. Other more complex problems can also be modeled as finding a clique or independent set of a specific type. This motivates the algorithmic problem of listing all maximal independent sets (or equivalently, all maximal cliques) efficiently.
It is straightforward to turn a proof of Moon and Moser's 3^{n/3} bound on the number of maximal independent sets into an algorithm that lists all such sets in time O(3^{n/3}). For graphs that have the largest possible number of maximal independent sets, this algorithm takes constant time per output set. However, an algorithm with this time bound can be highly inefficient for graphs with more limited numbers of independent sets. For this reason, many researchers have studied algorithms that list all maximal independent sets in polynomial time per output set. The time per maximal independent set is proportional to that for matrix multiplication in dense graphs, or faster in various classes of sparse graphs.
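A simple branching enumerator in the spirit of the Moon–Moser bound might look like the sketch below. It relies on the fact that every maximal independent set must intersect the closed neighbourhood N[v] of any vertex v, so it branches on each choice of a vertex from N[v]. Duplicates are filtered with a set of frozensets rather than avoided structurally, which a production algorithm would do; the function name is hypothetical.

```python
def all_maximal_independent_sets(adj):
    """Enumerate all maximal independent sets of a graph given as a
    dict mapping each node to the set of its neighbours."""
    out = set()

    def rec(chosen, free):
        if not free:
            out.add(frozenset(chosen))
            return
        v = next(iter(free))
        # Every maximal set of the remaining graph must contain some
        # vertex u of the closed neighbourhood N[v] that is still free.
        for u in ({v} | adj[v]) & free:
            rec(chosen | {u}, free - ({u} | adj[u]))

    rec(frozenset(), frozenset(adj))
    return out
```

On the triangle K3 this yields three singleton sets, matching the 3^{n/3} extremal count for n = 3.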
Parallelization of finding maximum independent sets
History
The maximal independent set problem was originally thought to be non-trivial to parallelize because the lexicographic maximal independent set was proven to be P-complete; however, it has been shown that a deterministic parallel solution could be given by an NC reduction from either the maximum set packing or the maximal matching problem, or by an NC reduction from the 2-satisfiability problem. Typically, the structure of the algorithm given follows other parallel graph algorithms: they subdivide the graph into smaller local problems that are solvable in parallel by running an identical algorithm.
Initial research into the maximal independent set problem started on the PRAM model and has since expanded to produce results for distributed algorithms on computer clusters. The many challenges of designing distributed parallel algorithms apply equally to the maximal independent set problem, in particular finding an algorithm that exhibits efficient runtime and is optimal in data communication for subdividing the graph and merging the independent set.
Complexity class
It was shown in 1984 by Karp et al. that a deterministic parallel solution on PRAM to the maximal independent set belonged in the Nick's Class complexity zoo of NC^4. That is to say, their algorithm finds a maximal independent set in O(log^4 n) time using O((n/log n)^3) processors, where n is the vertex set size. In the same paper, a randomized parallel solution was also provided with a runtime of O(log^4 n) using O(n^2) processors. Shortly after, Luby and Alon et al. independently improved on this result, bringing the maximal independent set problem into the realm of NC^2 with an O(log^2 n) runtime using O(mn^2) processors, where m is the number of edges in the graph. In order to show that their algorithm is in NC^2, they initially presented a randomized algorithm that uses O(m) processors but could be derandomized with an additional O(n^2) processors. Today, it remains an open question as to whether the maximal independent set problem is in NC^1.
Communication and data exchange
Distributed maximal independent set algorithms are strongly influenced by algorithms on the PRAM model. The original work by Luby and Alon et al. has led to several distributed algorithms. In terms of exchange of bits, these algorithms had a message size lower bound per round of O(log n) bits and would require additional characteristics of the graph. For example, the size of the graph would need to be known, or the maximum degree of neighboring vertices for a given vertex could be queried. In 2010, Métivier et al. reduced the required message size per round to O(1), which is optimal, and removed the need for any additional graph knowledge.
Footnotes
Notes
References
.
.
.
.
.
.
.
.
.
.
.
.
.
.
. .
.
.
.
.
.
.
Graph theory objects
Computational problems in graph theory
de:Stabile Menge#Maximale stabile Menge
|
TeST TST-14 may refer to:
TeST TST-14 Bonus, piston-powered motor glider
TeST TST-14J BonusJet, jet-powered motor glider
|
"Nightcrawlers" is the third and final segment of the fourth episode of the first season (1985–86) of the television series The Twilight Zone. It is adapted from a short story of the same name by Robert R. McCammon, first published in the 1984 collection Masques.
Plot
State trooper Dennis Wells takes shelter from a downpour at a roadside diner. He describes to Bob the cook and the server a massacre that he is investigating that occurred at a local motel. After almost getting into a collision outside, a Vietnam veteran named Price enters the diner. Concerned by Price's reckless driving, Wells interrogates him.
Price is compelled to describe how he fled and abandoned his unit during the war, leaving all of them dead in the jungle, and how he has a recurring nightmare in which his unit, "the Nightcrawlers", hunts him down to exact revenge. Wells instructs Price to sleep the night off at a local motel, but Price says he cannot stay at a motel because he becomes a danger to everyone around him when he sleeps. Price explains that he and four other soldiers were sprayed with a chemical that gave them the power of mind over matter, which he demonstrates by materializing a T-bone steak on the grill. He says such manifestations quickly fade while he is awake, but are more dangerous when he dreams. Wells, realizing Price caused the massacre he is investigating, tries to arrest him, but Price melts his gun with his mind. Enraged, Wells knocks Price unconscious with a ketchup bottle, unintentionally bringing Price's nightmare into the world. Soldiers materialize, destroying everything in the diner with machine gun fire and explosions. Trying to escape, Wells is shot and killed. Bob tries to kill Price with a pan, but is shot, though not fatally. Price regains consciousness as the soldiers force their way in. A spotlight is cast upon him, and he calls out, "Charlie's in the light!", prompting the soldiers to shoot and kill him. With the diner in flaming ruins, Bob is taken away in an ambulance. He reminds the others that Price said there are four more soldiers out there with the same ability.
Production
"Nightcrawlers" was based on a short story by Robert R. McCammon. Executive producer Philip DeGuere wrote the teleplay, which he said was fairly easy, since the short story was so visual. It was scored by Merl Saunders and the Grateful Dead, featuring Huey Lewis on harmonica. Exene Cervenka of the punk group X plays a waitress.
"Nightcrawlers" was one of the most expensive segments to film in the entire series, chiefly due to its pyrotechnic finale, since all of the explosions and destroyed vehicles were real and not miniatures. The exterior of the diner was filmed on location by the side of a real highway, while the interior was a set built on the CBS Radford lot. The interior lights were rigged so that they would flicker whenever there was supposed to be a lightning strike.
Cinematographer Bradford May said director William Friedkin was the most challenging director he had ever worked with because he demanded utmost intensity from every shot. Wanting more intensity from Scott Paulin, the actor who played Price, Friedkin put his face right up to Paulin's and shook him, to the shock of the cast and crew.
In retrospect, Friedkin said: "'Nightcrawlers' I did because it was a great story. It was a metaphor for how Vietnam continues to haunt us. We had a five-day shoot, and I approached it and shot it the same way I'd approach a film. It's one of the most watched things I've ever done, and so it restored my confidence. But I'd always dabbled in TV, so this wasn't an attempt to get away from film. To me it was no different from film, really."
References
Zicree, Marc Scott: The Twilight Zone Companion. Sillman-James Press, 1992 (second edition)
External links
1985 American television episodes
The Twilight Zone (1985 TV series season 1) episodes
Television episodes about Vietnam War
Television shows based on short fiction
fr:La Lumière des ténèbres
|
```c++
// This Source Code Form is subject to the terms of the Mozilla Public
// License, v. 2.0. If a copy of the MPL was not distributed with this
// file, You can obtain one at path_to_url
#ifndef VSOMEIP_V3_E2E_PROFILE05_PROTECTOR_HPP
#define VSOMEIP_V3_E2E_PROFILE05_PROTECTOR_HPP
#include <map>
#include <mutex>
#include "../profile05/profile_05.hpp"
#include "../profile_interface/protector.hpp"
namespace vsomeip_v3 {
namespace e2e {
namespace profile05 {
class protector final : public e2e::profile_interface::protector {
public:
protector(void) = delete;
explicit protector(const profile_config &_config)
: config_(_config) {}
void protect(e2e_buffer &_buffer, instance_t _instance) override final;
private:
bool verify_inputs(e2e_buffer &_buffer);
uint8_t get_counter(instance_t _instance) const;
void increment_counter(instance_t _instance);
void write_counter(e2e_buffer &_buffer, uint8_t _data, size_t _index);
void write_crc(e2e_buffer &_buffer, uint16_t _data, size_t _index);
private:
profile_config config_;
std::map<instance_t, uint8_t> counter_;
std::mutex protect_mutex_;
};
} // namespace profile05
} // namespace e2e
} // namespace vsomeip_v3
#endif // VSOMEIP_V3_E2E_PROFILE05_PROTECTOR_HPP
```
|
```cpp
/* Various declarations for language-independent pretty-print subroutines.
   Contributed by Gabriel Dos Reis <gdr@integrable-solutions.net>

This file is part of GCC.

GCC is free software; you can redistribute it and/or modify it under
the terms of the GNU General Public License as published by the Free
Software Foundation; either version 3, or (at your option) any later
version.

GCC is distributed in the hope that it will be useful, but WITHOUT ANY
WARRANTY; without even the implied warranty of MERCHANTABILITY or
FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License
for more details.

You should have received a copy of the GNU General Public License
along with GCC; see the file COPYING3.  If not see
<path_to_url */
#ifndef GCC_PRETTY_PRINT_H
#define GCC_PRETTY_PRINT_H
#include "obstack.h"
#include "wide-int-print.h"
/* Maximum number of format string arguments. */
#define PP_NL_ARGMAX 30
/* The type of a text to be formatted according a format specification
along with a list of things. */
struct text_info
{
const char *format_spec;
va_list *args_ptr;
int err_no; /* for %m */
void **x_data;
rich_location *m_richloc;
void set_location (unsigned int idx, location_t loc, bool caret_p);
location_t get_location (unsigned int index_of_location) const;
};
/* How often diagnostics are prefixed by their locations:
o DIAGNOSTICS_SHOW_PREFIX_NEVER: never - not yet supported;
o DIAGNOSTICS_SHOW_PREFIX_ONCE: emit only once;
o DIAGNOSTICS_SHOW_PREFIX_EVERY_LINE: emit each time a physical
line is started. */
enum diagnostic_prefixing_rule_t
{
DIAGNOSTICS_SHOW_PREFIX_ONCE = 0x0,
DIAGNOSTICS_SHOW_PREFIX_NEVER = 0x1,
DIAGNOSTICS_SHOW_PREFIX_EVERY_LINE = 0x2
};
/* The chunk_info data structure forms a stack of the results from the
first phase of formatting (pp_format) which have not yet been
output (pp_output_formatted_text). A stack is necessary because
the diagnostic starter may decide to generate its own output by way
of the formatter. */
struct chunk_info
{
/* Pointer to previous chunk on the stack. */
struct chunk_info *prev;
/* Array of chunks to output. Each chunk is a NUL-terminated string.
In the first phase of formatting, even-numbered chunks are
to be output verbatim, odd-numbered chunks are format specifiers.
The second phase replaces all odd-numbered chunks with formatted
text, and the third phase simply emits all the chunks in sequence
with appropriate line-wrapping. */
const char *args[PP_NL_ARGMAX * 2];
};
/* The output buffer datatype. This is best seen as an abstract datatype
whose fields should not be accessed directly by clients. */
struct output_buffer
{
output_buffer ();
~output_buffer ();
/* Obstack where the text is built up. */
struct obstack formatted_obstack;
/* Obstack containing a chunked representation of the format
specification plus arguments. */
struct obstack chunk_obstack;
/* Currently active obstack: one of the above two. This is used so
that the text formatters don't need to know which phase we're in. */
struct obstack *obstack;
/* Stack of chunk arrays. These come from the chunk_obstack. */
struct chunk_info *cur_chunk_array;
/* Where to output formatted text. */
FILE *stream;
/* The amount of characters output so far. */
int line_length;
/* This must be large enough to hold any printed integer or
floating-point value. */
char digit_buffer[128];
/* Nonzero means that text should be flushed when
appropriate. Otherwise, text is buffered until either
pp_really_flush or pp_clear_output_area are called. */
bool flush_p;
};
/* Finishes constructing a NULL-terminated character string representing
the buffered text. */
static inline const char *
output_buffer_formatted_text (output_buffer *buff)
{
obstack_1grow (buff->obstack, '\0');
return (const char *) obstack_base (buff->obstack);
}
/* Append to the output buffer a string specified by its
STARTing character and LENGTH. */
static inline void
output_buffer_append_r (output_buffer *buff, const char *start, int length)
{
gcc_checking_assert (start);
obstack_grow (buff->obstack, start, length);
for (int i = 0; i < length; i++)
if (start[i] == '\n')
buff->line_length = 0;
else
buff->line_length++;
}
/* Return a pointer to the last character emitted in the
output_buffer. A NULL pointer means no character available. */
static inline const char *
output_buffer_last_position_in_text (const output_buffer *buff)
{
const char *p = NULL;
struct obstack *text = buff->obstack;
if (obstack_base (text) != obstack_next_free (text))
p = ((const char *) obstack_next_free (text)) - 1;
return p;
}
/* The type of pretty-printer flags passed to clients. */
typedef unsigned int pp_flags;
enum pp_padding
{
pp_none, pp_before, pp_after
};
/* Structure for switching in and out of verbatim mode in a convenient
manner. */
struct pp_wrapping_mode_t
{
/* Current prefixing rule. */
diagnostic_prefixing_rule_t rule;
/* The ideal upper bound of number of characters per line, as suggested
by front-end. */
int line_cutoff;
};
/* Maximum characters per line in automatic line wrapping mode.
Zero means don't wrap lines. */
#define pp_line_cutoff(PP) (PP)->wrapping.line_cutoff
/* Prefixing rule used in formatting a diagnostic message. */
#define pp_prefixing_rule(PP) (PP)->wrapping.rule
/* Get or set the wrapping mode as a single entity. */
#define pp_wrapping_mode(PP) (PP)->wrapping
/* The type of a hook that formats client-specific data onto a pretty_printer.
A client-supplied formatter returns true if everything goes well,
otherwise it returns false. */
typedef bool (*printer_fn) (pretty_printer *, text_info *, const char *,
int, bool, bool, bool);
/* Client supplied function used to decode formats. */
#define pp_format_decoder(PP) (PP)->format_decoder
/* TRUE if a newline character needs to be added before further
formatting. */
#define pp_needs_newline(PP) (PP)->need_newline
/* True if PRETTY-PRINTER is in line-wrapping mode. */
#define pp_is_wrapping_line(PP) (pp_line_cutoff (PP) > 0)
/* The amount of whitespace to be emitted when starting a new line. */
#define pp_indentation(PP) (PP)->indent_skip
/* True if identifiers are translated to the locale character set on
output. */
#define pp_translate_identifiers(PP) (PP)->translate_identifiers
/* True if colors should be shown. */
#define pp_show_color(PP) (PP)->show_color
/* The data structure that contains the bare minimum required to do
proper pretty-printing. Clients may derive from this structure
and add additional fields they need. */
struct pretty_printer
{
// Default construct a pretty printer with specified prefix
// and a maximum line length cut off limit.
explicit pretty_printer (const char* = NULL, int = 0);
virtual ~pretty_printer ();
/* Where we print external representation of ENTITY. */
output_buffer *buffer;
/* The prefix for each new line. */
const char *prefix;
/* Where to put whitespace around the entity being formatted. */
pp_padding padding;
/* The real upper bound of number of characters per line, taking into
account the case of a very very looong prefix. */
int maximum_length;
/* Indentation count. */
int indent_skip;
/* Current wrapping mode. */
pp_wrapping_mode_t wrapping;
/* If non-NULL, this function formats a TEXT into the BUFFER. When called,
TEXT->format_spec points to a format code. FORMAT_DECODER should call
pp_string (and related functions) to add data to the BUFFER.
FORMAT_DECODER can read arguments from *TEXT->args_ptr using VA_ARG.
If the BUFFER needs additional characters from the format string, it
should advance the TEXT->format_spec as it goes. When FORMAT_DECODER
returns, TEXT->format_spec should point to the last character processed.
*/
printer_fn format_decoder;
/* Nonzero if current PREFIX was emitted at least once. */
bool emitted_prefix;
/* Nonzero means one should emit a newline before outputting anything. */
bool need_newline;
/* Nonzero means identifiers are translated to the locale character
set on output. */
bool translate_identifiers;
/* Nonzero means that text should be colorized. */
bool show_color;
};
static inline const char *
pp_get_prefix (const pretty_printer *pp) { return pp->prefix; }
#define pp_space(PP) pp_character (PP, ' ')
#define pp_left_paren(PP) pp_character (PP, '(')
#define pp_right_paren(PP) pp_character (PP, ')')
#define pp_left_bracket(PP) pp_character (PP, '[')
#define pp_right_bracket(PP) pp_character (PP, ']')
#define pp_left_brace(PP) pp_character (PP, '{')
#define pp_right_brace(PP) pp_character (PP, '}')
#define pp_semicolon(PP) pp_character (PP, ';')
#define pp_comma(PP) pp_character (PP, ',')
#define pp_dot(PP) pp_character (PP, '.')
#define pp_colon(PP) pp_character (PP, ':')
#define pp_colon_colon(PP) pp_string (PP, "::")
#define pp_arrow(PP) pp_string (PP, "->")
#define pp_equal(PP) pp_character (PP, '=')
#define pp_question(PP) pp_character (PP, '?')
#define pp_bar(PP) pp_character (PP, '|')
#define pp_bar_bar(PP) pp_string (PP, "||")
#define pp_carret(PP) pp_character (PP, '^')
#define pp_ampersand(PP) pp_character (PP, '&')
#define pp_ampersand_ampersand(PP) pp_string (PP, "&&")
#define pp_less(PP) pp_character (PP, '<')
#define pp_less_equal(PP) pp_string (PP, "<=")
#define pp_greater(PP) pp_character (PP, '>')
#define pp_greater_equal(PP) pp_string (PP, ">=")
#define pp_plus(PP) pp_character (PP, '+')
#define pp_minus(PP) pp_character (PP, '-')
#define pp_star(PP) pp_character (PP, '*')
#define pp_slash(PP) pp_character (PP, '/')
#define pp_modulo(PP) pp_character (PP, '%')
#define pp_exclamation(PP) pp_character (PP, '!')
#define pp_complement(PP) pp_character (PP, '~')
#define pp_quote(PP) pp_character (PP, '\'')
#define pp_backquote(PP) pp_character (PP, '`')
#define pp_doublequote(PP) pp_character (PP, '"')
#define pp_underscore(PP) pp_character (PP, '_')
#define pp_maybe_newline_and_indent(PP, N) \
if (pp_needs_newline (PP)) pp_newline_and_indent (PP, N)
#define pp_scalar(PP, FORMAT, SCALAR) \
do \
{ \
sprintf (pp_buffer (PP)->digit_buffer, FORMAT, SCALAR); \
pp_string (PP, pp_buffer (PP)->digit_buffer); \
} \
while (0)
#define pp_decimal_int(PP, I) pp_scalar (PP, "%d", I)
#define pp_unsigned_wide_integer(PP, I) \
pp_scalar (PP, HOST_WIDE_INT_PRINT_UNSIGNED, (unsigned HOST_WIDE_INT) I)
#define pp_wide_int(PP, W, SGN) \
do \
{ \
print_dec (W, pp_buffer (PP)->digit_buffer, SGN); \
pp_string (PP, pp_buffer (PP)->digit_buffer); \
} \
while (0)
#define pp_wide_integer(PP, I) \
pp_scalar (PP, HOST_WIDE_INT_PRINT_DEC, (HOST_WIDE_INT) I)
#define pp_pointer(PP, P) pp_scalar (PP, "%p", P)
#define pp_identifier(PP, ID) pp_string (PP, (pp_translate_identifiers (PP) \
? identifier_to_locale (ID) \
: (ID)))
#define pp_buffer(PP) (PP)->buffer
extern void pp_set_line_maximum_length (pretty_printer *, int);
extern void pp_set_prefix (pretty_printer *, const char *);
extern void pp_destroy_prefix (pretty_printer *);
extern int pp_remaining_character_count_for_line (pretty_printer *);
extern void pp_clear_output_area (pretty_printer *);
extern const char *pp_formatted_text (pretty_printer *);
extern const char *pp_last_position_in_text (const pretty_printer *);
extern void pp_emit_prefix (pretty_printer *);
extern void pp_append_text (pretty_printer *, const char *, const char *);
extern void pp_newline_and_flush (pretty_printer *);
extern void pp_newline_and_indent (pretty_printer *, int);
extern void pp_separate_with (pretty_printer *, char);
/* If we haven't already defined a front-end-specific diagnostics
style, use the generic one. */
#ifdef GCC_DIAG_STYLE
#define GCC_PPDIAG_STYLE GCC_DIAG_STYLE
#else
#define GCC_PPDIAG_STYLE __gcc_diag__
#endif
/* This header may be included before diagnostics-core.h, hence the duplicate
definitions to allow for GCC-specific formats. */
#if GCC_VERSION >= 3005
#define ATTRIBUTE_GCC_PPDIAG(m, n) __attribute__ ((__format__ (GCC_PPDIAG_STYLE, m ,n))) ATTRIBUTE_NONNULL(m)
#else
#define ATTRIBUTE_GCC_PPDIAG(m, n) ATTRIBUTE_NONNULL(m)
#endif
extern void pp_printf (pretty_printer *, const char *, ...)
ATTRIBUTE_GCC_PPDIAG(2,3);
extern void pp_verbatim (pretty_printer *, const char *, ...)
ATTRIBUTE_GCC_PPDIAG(2,3);
extern void pp_flush (pretty_printer *);
extern void pp_really_flush (pretty_printer *);
extern void pp_format (pretty_printer *, text_info *);
extern void pp_output_formatted_text (pretty_printer *);
extern void pp_format_verbatim (pretty_printer *, text_info *);
extern void pp_indent (pretty_printer *);
extern void pp_newline (pretty_printer *);
extern void pp_character (pretty_printer *, int);
extern void pp_string (pretty_printer *, const char *);
extern void pp_write_text_to_stream (pretty_printer *);
extern void pp_write_text_as_dot_label_to_stream (pretty_printer *, bool);
extern void pp_maybe_space (pretty_printer *);
/* Switch into verbatim mode and return the old mode. */
static inline pp_wrapping_mode_t
pp_set_verbatim_wrapping_ (pretty_printer *pp)
{
pp_wrapping_mode_t oldmode = pp_wrapping_mode (pp);
pp_line_cutoff (pp) = 0;
pp_prefixing_rule (pp) = DIAGNOSTICS_SHOW_PREFIX_NEVER;
return oldmode;
}
#define pp_set_verbatim_wrapping(PP) pp_set_verbatim_wrapping_ (PP)
extern const char *identifier_to_locale (const char *);
extern void *(*identifier_to_locale_alloc) (size_t);
extern void (*identifier_to_locale_free) (void *);
#endif /* GCC_PRETTY_PRINT_H */
```
|
```php
<?php
namespace MathPHP\Probability\Distribution\Multivariate;
use MathPHP\Functions\Map;
use MathPHP\Functions\Special;
use MathPHP\Functions\Support;
use MathPHP\Exception;
/**
* Dirichlet distribution
* path_to_url
*/
class Dirichlet
{
/**
 * Distribution parameter bounds limits
 * α ∈ (0,∞)
 * @var array{"α": string}
 */
public const PARAMETER_LIMITS = [
    'α' => '(0,∞)',
];
/**
 * Distribution support bounds limits
 * x ∈ (0,1)
 * @var array{x: string}
 */
public const SUPPORT_LIMITS = [
    'x' => '(0,1)',
];
/** @var float[] $αs */
protected $αs;
/**
 * Constructor
 *
 * @param float[] $αs
 */
public function __construct(array $αs)
{
    $n = \count($αs);
    for ($i = 0; $i < $n; $i++) {
        Support::checkLimits(self::PARAMETER_LIMITS, ['α' => $αs[$i]]);
    }
    $this->αs = $αs;
}
/**
 * Probability density function
 *
 *         1    K    αᵢ - 1
 * pdf = ----   ∏  xᵢ
 *       B(α)  i=1
 *
 * where B(α) is the multivariate Beta function
 *
 * @param float[] $xs
 *
 * @return float
 *
 * @throws Exception\BadDataException if xs and αs don't have the same number of elements
 */
public function pdf(array $xs): float
{
    if (\count($xs) !== \count($this->αs)) {
        throw new Exception\BadDataException('xs and αs must have the same number of elements');
    }
    $n = \count($xs);
    for ($i = 0; $i < $n; $i++) {
        Support::checkLimits(self::SUPPORT_LIMITS, ['x' => $xs[$i]]);
    }
    /*
     *  K    αᵢ - 1
     *  ∏  xᵢ
     * i=1
     */
    $product = \array_product(
        \array_map(
            function ($x, $α) {
                return $x ** ($α - 1);
            },
            $xs,
            $this->αs
        )
    );
    $B = Special::multivariateBeta($this->αs);
    return $product / $B;
}
}
```
|
My Sad Captains are a British five-piece rock and folk band originally from London, currently signed to Bella Union. They consist of Ed Wallis (vocals, guitar), Leon Dufficy (guitar), Ben Walker (drums, vocals), Steve Blackwell (bass) and Henry Thomas (keyboards, vocals). The band have released four albums to date and have toured both Europe and North America. Their most recent record is Sun Bridge, released in October 2017.
History
The band was formed in 2004 by frontman Ed Wallis after his previous outfit, which included Nick Goss, dissolved once the two had moved down to London to attend university. Wallis decided to "record a load of songs myself and call it My Sad Captains". He realised that the tracks would work better with a full band set-up and reconnected with Goss, while recruiting Jack Swayne on bass and Ed's brother Jim Wallis on drums. After the band had formed it wasn't 'until a year or so later that we were all living in London and taking it seriously'. The band became a five-piece when Cathy Lucas joined on violin, vocals and keyboards. The band decided to concentrate on their material and avoided releasing anything 'for the sake of it'.
In 2007 they made their debut with a 7" single "Bad Decisions" b/w "Here and Elsewhere" for the Fortuna Pop! label. It was followed by "All Hat and No Plans" b/w "Great Expectations" released at the end of March 2008 through White Heat. Over the course of 2008, the band began recording their debut album with Paul Jones. While the sessions were self-financed, Jones, the co-owner of Stolen Recordings, made them 'an offer we couldn't refuse'. In June 2009, the band released Here & Elsewhere. It was followed in 2011 by Fight Less, Win More on the same label.
They released their third album, Best of Times, on 17 March 2014 through Bella Union. They spent much of the year on tour in support, while videos for the songs "Goodbye" and "Hardly There" appeared.
Their fourth record, Sun Bridge, was released on 6 October 2017, also on Bella Union. It was described by Uncut as "A sumptuous synthesis of epiphanic pop and Krautrock–inflected drift and diffusion… Few British bands since Spiritualized in their ‘90s imperial phase have been as proficient at inducing a beatific state of drift". The band opened for The Sea and Cake on a UK tour in June 2018; their last live show so far was in October that year.
Name
The band's name comes from a poem by the Anglo-American poet Thom Gunn. Wallis explained that he decided 'to steal someone else's idea rather than think of my own'. He simply liked the poem, saying there was 'no special significance to it'. Wallis discovered the book 'sitting on a shelf unattended, when I was looking for a name, and it seemed to capture some element of the music that I heard in my head'. The phrase itself originates from Shakespeare's Antony and Cleopatra, Act III, Scene 13:
"Come,
Let's have one other gaudy night: call to me
All my sad captains; fill our bowls once more;
Let's mock the midnight bell."
Discography
Albums
Here & Elsewhere (Stolen Recordings, 2009)
Fight Less, Win More (Stolen Recordings, 2011)
Best of Times (Bella Union, 2014)
Sun Bridge (Bella Union, 2017)
References
External links
mysadcaptains.co.uk - Official Page - no longer online
British indie rock groups
Bella Union artists
|
The 2009 Canadian Olympic Curling Trials were held December 6–13, 2009 at Rexall Place in Edmonton. The event is also known and advertised as Roar of the Rings. The winner of the men's and women's events represented Canada at the 2010 Winter Olympics. Canada was guaranteed a team in each event as hosts.
Canadian Olympic qualification process
For both men's and women's categories, a pool of sixteen teams is designated as eligible to be Canada's representative at the 2010 Olympics. From the pool of sixteen, four teams are selected to qualify directly for the 2009 Canadian Curling Trials, "The 2009 Roar of the Rings". The remaining twelve teams compete in a pre-trials tournament, which is a triple-knockout bonspiel, with four teams advancing to the eight-team trials. The winner of the trials represents Canada at the 2010 Olympics.
Pool of sixteen
For each of the three curling seasons from 2006–07 to 2008–09, four teams are named to the pool of sixteen, resulting in a total of twelve teams in the pool by the end of the 2008–09 season. The four teams are the following:
winner of the Canadian Men's/Women's Curling Championships
winner of the Canada Cup tournament
winner of the Players' Championships
leader in the Canadian Team Ranking System for that season
If a team qualifies under more than one criterion (for example, a team wins both the Canada Cup and is the leader in the CTRS standings) or has already qualified in a previous season, then the four spots for that season are rounded out by selecting the highest ranked teams in the season's CTRS standings that have not already qualified.
To select the remaining four teams for the pool of sixteen, after the 2008–2009 season, from the teams that have not already qualified, the highest ranked teams are chosen, based on three-season, two-season, and one-season rankings. The rankings are determined by adding up the CTRS points earned by each team in their best events in each season.
If a team's membership changes from one season to another, the CTRS points earned by the team are divided amongst the individual players and allocated to their new teams.
Direct qualifiers to the Olympic Trials
From the pool of sixteen, the first four teams who meet any one of the following criteria (in order of priority) will be qualified directly for the Canadian Olympic Trials.
team leads the CTRS standings in two of the three curling seasons from 2006–07 to 2008–09
team wins three of the following events in the seasons from 2006–07 to 2008–09:
Canada Cup tournament
Players' Championships
Canadian Men's/Women's Curling Championships
World Curling Championships
team not yet qualified for the Olympic Trials with the highest CTRS point total from 2006–07 to 2008–09, using the same formula used to qualify teams to the pool of sixteen
team not yet qualified for the Olympic Trials with the highest CTRS point total from 2007–08 to 2008–09, using the same formula used to qualify teams to the pool of sixteen
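The priority ordering above amounts to a simple greedy selection: walk the criteria in order and take the first four distinct teams. A minimal sketch, with hypothetical team names standing in for the real candidates:

```go
package main

import "fmt"

// pickDirectQualifiers walks the criteria in priority order and returns
// the first four distinct teams found, mirroring the rule above.
func pickDirectQualifiers(criteria [][]string) []string {
	qualified := []string{}
	seen := map[string]bool{}
	for _, teams := range criteria { // criteria are ordered by priority
		for _, team := range teams {
			if !seen[team] && len(qualified) < 4 {
				seen[team] = true
				qualified = append(qualified, team)
			}
		}
	}
	return qualified
}

func main() {
	criteria := [][]string{
		{"Team A"},           // led CTRS in two of three seasons
		{"Team B", "Team A"}, // won three listed events (Team A already in)
		{"Team C"},           // highest three-season CTRS total not yet in
		{"Team D"},           // highest two-season CTRS total not yet in
	}
	fmt.Println(pickDirectQualifiers(criteria)) // [Team A Team B Team C Team D]
}
```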
Pre-trials qualifier
The pre-trials tournament was held on November 10–15, 2009 at the CN Centre in Prince George, British Columbia. The twelve teams from the pool of sixteen that did not qualify directly for the Olympic trials participated in a triple-knockout competition that selected four additional teams to compete in the Olympic Trials.
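The core mechanic of a triple-knockout bonspiel is that each loss drops a team down one bracket: from the A event to the B event, from B to C, and a third loss eliminates it. The sketch below models only that drop-down; how the four qualifying spots were divided among the A, B and C brackets is not modelled.

```go
package main

import "fmt"

// recordLoss models triple-knockout elimination: every team starts in
// the A event, a loss drops it one bracket, and a third loss ends its run.
func recordLoss(losses map[string]int, team string) string {
	losses[team]++
	switch losses[team] {
	case 1:
		return "drops to B event"
	case 2:
		return "drops to C event"
	default:
		return "eliminated"
	}
}

func main() {
	losses := map[string]int{}
	fmt.Println(recordLoss(losses, "Team X")) // drops to B event
	fmt.Println(recordLoss(losses, "Team X")) // drops to C event
	fmt.Println(recordLoss(losses, "Team X")) // eliminated
}
```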
Qualified teams
Men's
Women's
Men's Tournament Brackets
A Event
B Event
C Event
Women's Tournament Brackets
A Event
B Event
C Event
Olympic Trials
Qualified teams
Men's
Women's
Men's tournament
Standings
Draw 1
December 6, 6:00pm
Draw 2
December 7, 1:00pm
Draw 3
December 8, 8:30am
Draw 4
December 8, 6:00pm
Draw 5
December 9, 1:00pm
Draw 6
December 10, 8:30am
Draw 7
December 10, 6:00pm
Semifinal
December 12, 1:00pm
Final
December 13, 1:00pm
Women's tournament
Standings
Draw 1
December 6, 1:00pm
Draw 2
December 7, 8:30am
Draw 3
December 7, 6:00pm
Draw 4
December 8, 1:00pm
Draw 5
December 9, 8:30am
Draw 6
December 9, 7:30pm
Draw 7
December 10, 1:00pm
Tiebreaker 1
December 11, 8:30am
Tiebreaker 2
December 11, 1:00pm
Semifinal
December 11, 6:00pm
Final
December 12, 6:00pm
References
External links
Draw Schedule
See also
2010 Winter Olympics
Qualification for the 2010 Winter Olympics
Curling at the 2010 Winter Olympics
Olympic Curling Trials, 2009
Canadian Olympic Curling Trials
Curling competitions in Edmonton
2009 in Alberta
December 2009 sports events in Canada
|
```go
//
// Last.Backend LLC CONFIDENTIAL
// __________________
//
// [2014] - [2019] Last.Backend LLC
// All Rights Reserved.
//
// NOTICE: All information contained herein is, and remains
// the property of Last.Backend LLC and its suppliers,
// if any. The intellectual and technical concepts contained
// herein are proprietary to Last.Backend LLC
// and its suppliers and may be covered by Russian Federation and Foreign Patents,
// patents in process, and are protected by trade secret or copyright law.
// Dissemination of this information or reproduction of this material
// is strictly forbidden unless prior written permission is obtained
// from Last.Backend LLC.
//
package config
import (
"github.com/lastbackend/lastbackend/pkg/util/http"
"github.com/lastbackend/lastbackend/pkg/util/http/middleware"
)
var Routes = []http.Route{
// Route handlers
{Path: "/namespace/{namespace}/config", Method: http.MethodPost, Middleware: []http.Middleware{middleware.Authenticate}, Handler: ConfigCreateH},
{Path: "/namespace/{namespace}/config", Method: http.MethodGet, Middleware: []http.Middleware{middleware.Authenticate}, Handler: ConfigListH},
{Path: "/namespace/{namespace}/config/{config}", Method: http.MethodGet, Middleware: []http.Middleware{middleware.Authenticate}, Handler: ConfigGetH},
{Path: "/namespace/{namespace}/config/{config}", Method: http.MethodPut, Middleware: []http.Middleware{middleware.Authenticate}, Handler: ConfigUpdateH},
{Path: "/namespace/{namespace}/config/{config}", Method: http.MethodDelete, Middleware: []http.Middleware{middleware.Authenticate}, Handler: ConfigRemoveH},
}
```
|
Several cases of sexual abuse in the Archdiocese of St. John's have been reported, beginning in 1988. They form an important chapter in the series of clerical abuse scandals that occurred in the dioceses of Canada.
James Hickey affair
In September 1988, Fr. James Hickey pleaded guilty to 20 charges of sexual assault, gross indecency and indecent assault involving teenage boys while he was a parish priest on the Burin Peninsula and in the St. John's area. He served five years in prison, at Her Majesty's Penitentiary, St. John's, and Dorchester Penitentiary, NB. Despite Hickey's criminal conviction, archdiocese leaders fought the victims' lawsuits for damages for over 20 years.
Hickey, the first priest convicted in a sexual-abuse scandal, died in 1992.
In February 2009, the Supreme Court of Newfoundland and Labrador ruled that the Archdiocese of St. John's was "vicariously liable" for the sexual abuse of eight former altar boys by Hickey.
Mount Cashel orphanage scandal
In 1988, a scandal erupted over allegations of widespread abuse of children at Mount Cashel Orphanage in Newfoundland. From 1989 to 1993, nine Christian Brothers were charged and prosecuted for various criminal offences including sex offences against the boys of Mount Cashel orphanage. The religious order that ran the orphanage filed for bankruptcy in the face of numerous lawsuits. Since the Mount Cashel scandal erupted, a number of priests across the country have been accused of sexual abuse.
In July 2020, the Court of Appeal for Newfoundland and Labrador unanimously reversed a 2018 ruling of the Supreme Court of Newfoundland and Labrador and held that the Archdiocese of St. John's was liable for the sexual abuse committed at the Mount Cashel Orphanage in the 1950s and 1960s.
In July 2021, the Archdiocese of St. John's announced plans to sell off assets in order to compensate victims of the Mount Cashel sex abuse scandal.
Hughes Inquiry
The Hughes Inquiry was a Canadian royal commission which concluded that officials had transferred offenders and covered up the sexual abuse at Mount Cashel. It recommended that victims be compensated. The commission began inquiry investigations on 1 June 1989 and published its report in April 1992.
Resignation of bishop Penney
The Winter Commission was appointed in 1989 by Archbishop Alphonsus Penney. Its final report, submitted in 1990, was entitled The Report of the Archdiocesan Commission of Enquiry into the Sexual Abuse of Children by Members of the Clergy.
Archbishop Penney resigned on February 2, 1991, following the release of the commission's report, which placed some of the blame for cover-ups of the abuse on him.
Allegations against bishop Lahey
In 1989, Fr. Kevin Molloy went to former St. John's archbishop Alphonsus Liguori Penney to report that a child had seen pornography at the home of the priest Raymond Lahey. These allegations resurfaced in 2009, when Bishop Lahey was arrested on separate allegations involving illicit pornography (see: sexual abuse scandal in Antigonish diocese).
1992 guidelines from the CCCB
In 1992, the Canadian Catholic bishops responded by unveiling guidelines, calling for fairness and openness to all allegations, stressing the need to "respect" the jurisdiction of outside authorities, and recommending counselling and compassion for the victims. However, some assert that, the bishops' guidelines notwithstanding, the sexual abuse problems have not been adequately addressed.
2004 Supreme Court decision
In 2004, the Supreme Court of Canada in Doe v Bennett, upheld the lower court's decision that the ecclesiastical corporation, Roman Catholic Episcopal Corporation of St. George's in Western Newfoundland, was vicariously liable (as well as directly liable) for sexual abuse by Father Kevin Bennett.
Rev. Peter Power controversy
In July 2020, Rev. Peter Power, who was originally from the Archdiocese of Toronto, was charged with sexual touching, sexual assault and committing an indecent act involving two teenaged boys, aged 16 and 18, at a residence in a small Newfoundland community earlier that year. Though officially retired, Power was still occasionally active in Catholic ministry when he relocated to Newfoundland.
British Columbia Link to the St. John's archdiocese
In February 2021, a British Columbia man alleged that he was sexually abused by one of the Christian Brothers, who had confessed to the Royal Newfoundland Constabulary in 1975 to molesting children at the Mount Cashel Orphanage.
In August 2022, a British Columbia man, known only as 'John B. Doe,' filed a class action lawsuit in British Columbia, alleging that he was physically and sexually abused while attending Vancouver College, a Catholic preparatory school for boys located in the Shaughnessy neighbourhood of Vancouver, British Columbia. The lawsuit alleges that six Christian Brothers working as teachers at the school were known to have committed crimes against children in NL (in some cases having admitted to them) before being transferred to Vancouver to teach at Vancouver College.
In September 2022, police in Burnaby, BC, acknowledged that they had an active investigation into a complaint against a former NL Christian Brother who, following allegations of child molestation, had been transferred from the Mount Cashel Orphanage to St. Thomas More Collegiate, a private school run by the Congregation of Christian Brothers. The complainant, John A. Doe, accuses former Christian Brother Edward English of abuse during his time at the school, and questions how Brother English was allowed to transfer quietly from NL to BC, without charges, after admitting to the Royal Newfoundland Constabulary in 1975 that he had molested children.
See also
Catholic Church sexual abuse cases in Canada
Child sexual abuse
Religious abuse
Roman Catholic Archdiocese of St. John's, Newfoundland
Sexual abuse cases in the Congregation of Christian Brothers
References
Catholic Church sexual abuse scandals in Canada
Roman Catholic Ecclesiastical Province of St. John's, Newfoundland
Violence against men in North America
Child sexual abuse in Canada
|
Daviesia spinosissima is a species of flowering plant in the family Fabaceae and is endemic to the south of Western Australia. It is a shrub with crowded, rigid, sharply-pointed, narrowly triangular phyllodes, and yellow and red flowers.
Description
Daviesia spinosissima is a rigid, glabrous shrub that typically grows to a height of . Its phyllodes are crowded, rigid, vertically compressed and narrowly triangular, long, wide and sharply pointed. The flowers are arranged singly in leaf axils on a pedicel long with bracts about long attached. The sepals are long and joined at the base with lobes about long. The standard petal is broadly egg-shaped with a notched tip, long, wide and yellow. The wings are long and red, the keel long and red. Flowering occurs from October to March and the fruit is a triangular, sharply-pointed, inflated pod long.
Taxonomy
Daviesia spinosissima was first formally described in 1844 by Carl Meissner in Lehmann's Plantae Preissianae. The specific epithet (spinosissima) means "very spiny".
Distribution and habitat
This daviesia grows in heath in near-coastal areas of southern Western Australia between Narrikup, Denmark and near Mount Manypeaks in the Esperance Plains, Jarrah Forest and Warren biogeographic regions of south-western Western Australia.
Conservation status
Daviesia spinosissima is listed as "not threatened" by the Government of Western Australia Department of Biodiversity, Conservation and Attractions.
References
spinosissima
Taxa named by Carl Meissner
Plants described in 1844
Flora of Western Australia
|
```java
//
//
// path_to_url
//
// Unless required by applicable law or agreed to in writing, software
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
//
////////////////////////////////////////////////////////////////////////////////
import com.code_intelligence.jazzer.api.FuzzedDataProvider;
import redis.clients.jedis.Jedis;
import java.net.URI;
import java.net.URISyntaxException;
public class JedisURIFuzzer {
    // Jazzer entry point: feed arbitrary fuzzer-generated strings to the
    // Jedis URI constructor, swallowing only the exception types expected
    // for malformed input so that genuine crashes still surface.
    public static void fuzzerTestOneInput(FuzzedDataProvider data) {
        try {
            new Jedis(new URI(data.consumeRemainingAsString()));
        } catch (URISyntaxException e) {
        } catch (java.lang.NumberFormatException e) {
        } catch (redis.clients.jedis.exceptions.InvalidURIException e) {
        }
    }
}
```
|
James "Jim" Collins (29 May 1896 – 13 October 1990) was an Australian rules footballer who played 30 games with Essendon in the Victorian Football League (VFL).
Notes
External links
1896 births
1990 deaths
Australian rules footballers from Melbourne
Essendon Football Club players
Yarraville Football Club players
People from Footscray, Victoria
|
```toml
[package]
org = "test_globalcycle"
name = "viaRecordFieldDefault"
version = "1.0.0"
```
|
```php
<?php
/*
*
*
* path_to_url
*
* Unless required by applicable law or agreed to in writing, software
* WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
*/
namespace Google\Service\CloudFunctions\Resource;
use Google\Service\CloudFunctions\ListOperationsResponse;
use Google\Service\CloudFunctions\Operation;
/**
* The "operations" collection of methods.
* Typical usage is:
* <code>
* $cloudfunctionsService = new Google\Service\CloudFunctions(...);
* $operations = $cloudfunctionsService->operations;
* </code>
*/
class Operations extends \Google\Service\Resource
{
/**
* Gets the latest state of a long-running operation. Clients can use this
* method to poll the operation result at intervals as recommended by the API
* service. (operations.get)
*
* @param string $name The name of the operation resource.
* @param array $optParams Optional parameters.
* @return Operation
*/
public function get($name, $optParams = [])
{
$params = ['name' => $name];
$params = array_merge($params, $optParams);
return $this->call('get', [$params], Operation::class);
}
/**
* Lists operations that match the specified filter in the request. If the
* server doesn't support this method, it returns `UNIMPLEMENTED`. NOTE: the
* `name` binding allows API services to override the binding to use different
* resource name schemes, such as `users/operations`. To override the binding,
* API services can add a binding such as `"/v1/{name=users}/operations"` to
* their service configuration. For backwards compatibility, the default name
* includes the operations collection id, however overriding users must ensure
* the name binding is the parent resource, without the operations collection
* id. (operations.listOperations)
*
* @param array $optParams Optional parameters.
*
* @opt_param string filter Required. A filter for matching the requested
* operations. The supported formats of *filter* are: To query for a specific
* function: project:*,location:*,function:* To query for all of the latest
* operations for a project: project:*,latest:true
* @opt_param string name Must not be set.
* @opt_param int pageSize The maximum number of records that should be
* returned. Requested page size cannot exceed 100. If not set, the default page
* size is 100. Pagination is only supported when querying for a specific
* function.
* @opt_param string pageToken Token identifying which result to start with,
* which is returned by a previous list call. Pagination is only supported when
* querying for a specific function.
* @return ListOperationsResponse
*/
public function listOperations($optParams = [])
{
$params = [];
$params = array_merge($params, $optParams);
return $this->call('list', [$params], ListOperationsResponse::class);
}
}
// Adding a class alias for backwards compatibility with the previous class name.
class_alias(Operations::class, 'Google_Service_CloudFunctions_Resource_Operations');
```
|
```go
package pipeline_test
import (
"context"
"testing"
"time"
"github.com/ovh/cds/engine/api/action"
"github.com/ovh/cds/engine/api/ascode"
"github.com/ovh/cds/engine/api/pipeline"
"github.com/ovh/cds/engine/api/test"
"github.com/ovh/cds/engine/api/test/assets"
"github.com/ovh/cds/sdk"
"github.com/ovh/cds/sdk/exportentities"
"github.com/stretchr/testify/assert"
"github.com/stretchr/testify/require"
"gopkg.in/yaml.v2"
)
func TestParseAndImport(t *testing.T) {
db, cache := test.SetupPG(t)
u, _ := assets.InsertAdminUser(t, db)
key := sdk.RandomString(10)
pipName := sdk.RandomString(10)
proj := assets.InsertTestProject(t, db, cache, key, key)
pip1 := sdk.Pipeline{
Name: pipName,
FromRepository: "foo",
ProjectID: proj.ID,
ProjectKey: proj.Key,
}
require.NoError(t, pipeline.InsertPipeline(db, &pip1))
var epip = new(exportentities.PipelineV1)
body := []byte(`
version: v1.0
name: ` + pipName + `
`)
errenv := yaml.Unmarshal(body, epip)
require.NoError(t, errenv)
_, _, globalError := pipeline.ParseAndImport(context.TODO(), db, cache, *proj, *epip, u, pipeline.ImportOptions{Force: false})
require.Error(t, globalError)
_, _, globalError2 := pipeline.ParseAndImport(context.TODO(), db, cache, *proj, *epip, u, pipeline.ImportOptions{Force: true, FromRepository: "bar"})
require.Error(t, globalError2)
_, _, globalError3 := pipeline.ParseAndImport(context.TODO(), db, cache, *proj, *epip, u, pipeline.ImportOptions{Force: true})
require.NoError(t, globalError3)
}
func TestParseAndImportCleanAsCode(t *testing.T) {
db, cache := test.SetupPG(t)
u, _ := assets.InsertAdminUser(t, db)
key := sdk.RandomString(10)
pipName := sdk.RandomString(10)
proj := assets.InsertTestProject(t, db, cache, key, key)
pip1 := sdk.Pipeline{
Name: pipName,
FromRepository: "myfoorepoenv",
ProjectID: proj.ID,
ProjectKey: proj.Key,
}
require.NoError(t, pipeline.InsertPipeline(db, &pip1))
var epip = new(exportentities.PipelineV1)
body := []byte(`
version: v1.0
name: ` + pipName + `
`)
errenv := yaml.Unmarshal(body, epip)
require.NoError(t, errenv)
require.NoError(t, action.CreateBuiltinActions(db))
wf := assets.InsertTestWorkflow(t, db, cache, proj, "workflow1")
// Add some events to resync
asCodeEvent := sdk.AsCodeEvent{
WorkflowID: wf.ID,
Username: u.GetUsername(),
CreateDate: time.Now(),
FromRepo: "myfoorepoenv",
Data: sdk.AsCodeEventData{
Pipelines: map[int64]string{
pip1.ID: pip1.Name,
},
},
}
assert.NoError(t, ascode.UpsertEvent(db, &asCodeEvent))
events, err := ascode.LoadEventsByWorkflowID(context.TODO(), db, wf.ID)
assert.NoError(t, err)
assert.Equal(t, 1, len(events))
// try to import with force, without a repo, it's ok
_, _, globalError3 := pipeline.ParseAndImport(context.TODO(), db, cache, *proj, *epip, u, pipeline.ImportOptions{Force: true})
require.NoError(t, globalError3)
events, err = ascode.LoadEventsByWorkflowID(context.TODO(), db, wf.ID)
assert.NoError(t, err)
assert.Equal(t, 0, len(events))
}
```
|
Mike Berry (born Michael Hubert Bourne, 24 September 1942) is a British singer and actor. He is known for his top ten hits "Don't You Think It's Time" (1963) and "The Sunshine of Your Smile" (1980) in a singing career spanning nearly 60 years. He became an actor in the 1970s, and was best known for his appearances as Mr. Spooner on the British sitcom Are You Being Served? in the early 1980s.
Early life
Berry was born in Northampton. His parents grew up in Rhodesia but met in England and his mother was an amateur actress and singer. Six months after his birth his mother moved with him to North Wales for two years. The family then moved to Stoke Newington where he attended William Patten Primary School and passed his eleven plus exam, winning a scholarship to Hackney Downs Grocers' School. He left the school aged 16 without qualifications to become an apprentice compositor.
Career
Music
Berry was a fan of skiffle and rock and roll music as a teenager. He formed his own skiffle group, the Rebels, which took up electric guitars and performed as "Kenny Lord and the Statesmen". Joe Meek became their recording manager and producer, and signed up a group called the Stormers as Berry's new backing band, naming the new act "Mike Berry and the Outlaws".
He had three hits in the 1960s in the UK Singles Chart, his most successful being "Don't You Think It's Time", reaching No. 6 in January 1963. His "Tribute To Buddy Holly" is also noted for having been banned by the BBC for being "morbid." The hit singles were all produced by Joe Meek.
In the mid-1970s he returned to the charts in the Netherlands and Belgium, as pirate radio station Radio Mi Amigo and Radio Veronica played his new record material, released on Dutch record label Pink Elephant Records. "Don't Be Cruel" made No. 14 in the Dutch Nationale Hitparade in May 1975. His next record, a remake of his 1960 debut song "Tribute to Buddy Holly", hit No. 2 in October of that same year. In 1977, "I'm A Rocker," released on Flemish record label Scramble Records (owned by Radio Mi Amigo DJ Norbert), failed to chart.
In 1980, he had a chart success in the UK with "The Sunshine of Your Smile", a cover, produced by Chas Hodges, of a romantic song originally written before the First World War and recorded by Jessie Broughton in about 1915. In 1985, his song "Everyone's a Wally" was included as the B-side of the cassette release of the Mikro-Gen video game of the same name. His most recent CD was About Time Too!, recorded in Nashville, Tennessee with The Crickets and released on the UK Rollercoaster Records label, Berry's label of choice since their reissues of Joe Meek productions and new material in the 1990s.
In 1988, Berry co-wrote "This is the Kiss" with Mel Simpson which was chosen to be among the final eight songs in "A Song for Europe" (the UK selection vehicle for the Eurovision Song Contest) performed by Two-Che. The song placed second with 73,785 televotes.
In 2016, Berry auditioned for the fifth series of The Voice but was not successful.
In 2017, Berry went on a UK tour with The Solid Gold Rock'n'Roll Show, which also featured Eden Kane, Marty Wilde, Mark Wynter and the Wildcats. In 2019, he toured again with The Solid Gold Rock'n'Roll Show, alongside Marty Wilde, Charlie Gracie, Nancy Ann Lee (Little Miss Sixties) and the Wildcats.
Acting
In the 1970s, Berry developed a career as an actor and he appeared in many television commercials. In 1979, he was cast as the father (Mr. Peters) of the two children in the TV version of the Worzel Gummidge books, along with Jon Pertwee and Una Stubbs. In 1981, he replaced Trevor Bannister's character (Mr. Lucas) in the British sitcom Are You Being Served? and stayed until the end of the show's run in 1985. Since the death of Nicholas Smith in December 2015, he has been the lone surviving actor from the show who played a major recurring character. Berry also starred in a series of commercials for Blue Riband in the 1980s.
His most recent film work was acting in Julie and the Cadillacs (1999).
Family
His brother is the actor, performer and activist Bette Bourne.
Discography
Singles
"Will You Love Me Tomorrow" / "My Baby Doll" (with The Outlaws – Decca 11314 – 1961)
"Tribute to Buddy Holly" / "What's the Matter" (with The Outlaws – HMV 912 – 1961) – UK No. 24
"It's Just a Matter of Time" / "Little Boy Blue" (with The Admirals (Outlaws) – HMV 979 – 1962)
"Every Little Kiss" / "How Many Times" (HMV 1042 – 1962)
"Don't You Think it's Time" / "Loneliness" (with The Outlaws – HMV 1105 – 1962) – UK No. 6
"My Little Baby" / "You'll Do It You'll Fall in Love" (with The Outlaws – HMV 1142 – 1963) – UK No. 34
"It Really Doesn't Matter" / "Try a Little Bit Harder" (HMV 1194 – 1963)
"Intro" / "Brown Eyed Handsome Man" (Graham Dean / The Innocents – Columbia 1536 – 1963)
"My Little Baby" / "More Than I Can Say" (with The Innocents – Columbia 1536 – 1963)
"La Bamba" / "Don't You Think it's Time" (with The Innocents – Columbia 1536 – 1963)
"On My Mind" / "This Little Girl" (with The Innocents – HMV 1257 – 1964)
"Lovesick" / "Letters of Love" (with The Innocents – HMV 1284 – 1964)
"Who Will It Be" / "Talk" (with The Innocents – HMV 1314 – 1964)
"Two Lovers" / "Don't Try to Stand in My Way" (HMV 1362 – 1964)
"That's All I Ever Want from You" / "She Didn't Care" (HMV 1449 – 1965)
"It Comes and Goes" / "Gonna Fall in Love" (HMV 1484 – 1965)
"Warm Baby" / "Just Thought I´d Phone" (HMV 1530 – 1966)
"Raining in My Heart" / "Eyes" (Polydor 56182 – 1967)
"Can't You Hear My Heartbeat" / "Alice" (D-Metronome – 1967)
"Don't Be Cruel" / "It's All Over" (Pink Elephant Records – 1975)
"Tribute to Buddy Holly" (remake) / "Dial My Up" (Pink Elephant Records – 1975)
"I'm a Rocker" / "It's a Hard Hard Hard World" (Scramble Records – 1977)
"Don't Ever Change" (Polydor – 1978)
"The Sunshine of Your Smile" (Polydor – 1980) – UK No. 9
"If I Could Only Make You Care" (Polydor – 1980) – UK No. 37
"Memories" (Polydor – 1981) – UK No. 55
"Diana" (Polydor – 1981)
"What'll I Do" (Polydor – 1982)
"Everyone's a Wally" (B-side to cassette release of computer game – 1985)
"It's Time For Mike Berry – Vinyl EP" – (Rollercoaster Records – 1990)
"Sounds of the Sixties" (Rollercoaster Records – 1992)
"Rock'n'Roll Daze" (Rollercoaster Records – 1998)
"Keep Your Hands To Yourself – Live in Sweden" (Rollercoaster Records – 2001)
"About Time Too! – with The Crickets, Recorded in Nashville" (Rollercoaster Records -2005)
"Before I Grow Too Old – CD EP" – (Rollercoaster Records – 2006)
"Hi There Darlin'! Merry Christmas" – Mr. Bert Spooner with instrumental accompaniment by Mike Berry & The Outlaws – (Rollercoaster Records – 2007)
"Sunshine of Your Smile – Hits and Memories from the 1980s" – (Rollercoaster Records – 2016)
"Drift Away" – (Rollercoaster Records – 2019)
References
External links
Mike Berry biography at AllMusic
Mike Berry discography
Interview with Mike Berry, The Spectrum, Accessed July 6, 2017
1942 births
Living people
Actors from Northampton
British male comedy actors
English male singers
English male television actors
English pop singers
Musicians from Northampton
People educated at Hackney Downs School
20th-century English male actors
20th-century British male singers
20th-century English singers
|
Fantail Creek is a stream in the U.S. state of South Dakota.
Fantail Creek was named after the fantail deer observed near it.
See also
List of rivers of South Dakota
References
Rivers of Lawrence County, South Dakota
Rivers of South Dakota
|
```c++
//
// Redistribution and use in source and binary forms, with or without
// modification, are permitted provided that the following conditions
// are met:
// * Redistributions of source code must retain the above copyright
// notice, this list of conditions and the following disclaimer.
// * Redistributions in binary form must reproduce the above copyright
// notice, this list of conditions and the following disclaimer in the
// documentation and/or other materials provided with the distribution.
// * Neither the name of NVIDIA CORPORATION nor the names of its
// contributors may be used to endorse or promote products derived
// from this software without specific prior written permission.
//
// THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS ``AS IS'' AND ANY
// EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
// IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
// PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
// CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
// EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
// PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
// PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
// OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
// (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
// OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
//
#include <RendererConfig.h>
#if defined(RENDERER_ENABLE_DIRECT3D11)
#include "D3D11RendererIndexBuffer.h"
#include <RendererIndexBufferDesc.h>
#if PX_WINDOWS
#include <task/PxTask.h>
#endif
using namespace SampleRenderer;
static DXGI_FORMAT getD3D11Format(RendererIndexBuffer::Format format)
{
DXGI_FORMAT dxgiFormat = DXGI_FORMAT_UNKNOWN;
switch (format)
{
case RendererIndexBuffer::FORMAT_UINT16:
dxgiFormat = DXGI_FORMAT_R16_UINT;
break;
case RendererIndexBuffer::FORMAT_UINT32:
dxgiFormat = DXGI_FORMAT_R32_UINT;
break;
}
RENDERER_ASSERT(dxgiFormat != DXGI_FORMAT_UNKNOWN, "Unable to convert to DXGI_FORMAT.");
return dxgiFormat;
}
D3D11RendererIndexBuffer::D3D11RendererIndexBuffer(ID3D11Device& d3dDevice, ID3D11DeviceContext& d3dDeviceContext, const RendererIndexBufferDesc& desc, bool bUseMapForLocking) :
RendererIndexBuffer(desc),
m_d3dDevice(d3dDevice),
m_d3dDeviceContext(d3dDeviceContext),
m_d3dIndexBuffer(NULL),
m_bUseMapForLocking(bUseMapForLocking && (!desc.registerInCUDA)),
m_buffer(NULL)
{
memset(&m_d3dBufferDesc, 0, sizeof(D3D11_BUFFER_DESC));
m_d3dBufferDesc.BindFlags = D3D11_BIND_INDEX_BUFFER;
m_d3dBufferDesc.ByteWidth = (UINT)(getFormatByteSize(desc.format) * desc.maxIndices);
m_d3dBufferFormat = getD3D11Format(desc.format);
if (m_bUseMapForLocking)
{
m_d3dBufferDesc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;
m_d3dBufferDesc.Usage = D3D11_USAGE_DYNAMIC;
}
else
{
m_d3dBufferDesc.CPUAccessFlags = 0;
m_d3dBufferDesc.Usage = D3D11_USAGE_DEFAULT;
m_buffer = new PxU8[m_d3dBufferDesc.ByteWidth];
memset(m_buffer, 0, sizeof(PxU8)*m_d3dBufferDesc.ByteWidth);
}
onDeviceReset();
if (m_d3dIndexBuffer)
{
m_maxIndices = desc.maxIndices;
}
}
D3D11RendererIndexBuffer::~D3D11RendererIndexBuffer(void)
{
if (m_d3dIndexBuffer)
{
#if PX_WINDOWS && PX_SUPPORT_GPU_PHYSX
if (m_interopContext && m_registeredInCUDA)
{
m_registeredInCUDA = !m_interopContext->unregisterResourceInCuda(m_InteropHandle);
}
#endif
m_d3dIndexBuffer->Release();
m_d3dIndexBuffer = NULL;
}
delete [] m_buffer;
}
void D3D11RendererIndexBuffer::onDeviceLost(void)
{
m_registeredInCUDA = false;
if (m_d3dIndexBuffer)
{
#if PX_WINDOWS && PX_SUPPORT_GPU_PHYSX
if (m_interopContext && m_registeredInCUDA)
{
m_registeredInCUDA = !m_interopContext->unregisterResourceInCuda(m_InteropHandle);
}
#endif
m_d3dIndexBuffer->Release();
m_d3dIndexBuffer = 0;
}
}
void D3D11RendererIndexBuffer::onDeviceReset(void)
{
if (!m_d3dIndexBuffer)
{
m_d3dDevice.CreateBuffer(&m_d3dBufferDesc, NULL, &m_d3dIndexBuffer);
RENDERER_ASSERT(m_d3dIndexBuffer, "Failed to create DIRECT3D11 Index Buffer.");
#if PX_WINDOWS && PX_SUPPORT_GPU_PHYSX
if (m_interopContext && m_d3dIndexBuffer && m_mustBeRegisteredInCUDA)
{
m_registeredInCUDA = m_interopContext->registerResourceInCudaD3D(m_InteropHandle, m_d3dIndexBuffer);
}
#endif
}
}
void* D3D11RendererIndexBuffer::lock(void)
{
// For now NO_OVERWRITE is the only mapping that functions properly
return internalLock(getHint() == HINT_STATIC ? /* D3D11_MAP_WRITE_DISCARD */ D3D11_MAP_WRITE_NO_OVERWRITE : D3D11_MAP_WRITE_NO_OVERWRITE);
}
void* D3D11RendererIndexBuffer::internalLock(D3D11_MAP MapType)
{
void* buffer = 0;
if (m_d3dIndexBuffer)
{
if (m_bUseMapForLocking)
{
D3D11_MAPPED_SUBRESOURCE mappedRead;
m_d3dDeviceContext.Map(m_d3dIndexBuffer, 0, MapType, 0, &mappedRead);
RENDERER_ASSERT(mappedRead.pData, "Failed to lock DIRECT3D11 Index Buffer.");
buffer = mappedRead.pData;
}
else
{
buffer = m_buffer;
}
}
return buffer;
}
void D3D11RendererIndexBuffer::unlock(void)
{
if (m_d3dIndexBuffer)
{
if (m_bUseMapForLocking)
{
m_d3dDeviceContext.Unmap(m_d3dIndexBuffer, 0);
}
else
{
m_d3dDeviceContext.UpdateSubresource(m_d3dIndexBuffer, 0, NULL, m_buffer, m_d3dBufferDesc.ByteWidth, 0);
}
}
}
void D3D11RendererIndexBuffer::bind(void) const
{
m_d3dDeviceContext.IASetIndexBuffer(m_d3dIndexBuffer, m_d3dBufferFormat, 0);
}
void D3D11RendererIndexBuffer::unbind(void) const
{
m_d3dDeviceContext.IASetIndexBuffer(NULL, DXGI_FORMAT_UNKNOWN, 0);
}
#endif // #if defined(RENDERER_ENABLE_DIRECT3D11)
```
|
Al Israel (April 16, 1935 – March 16, 2011) was an American film and TV actor who is best known for his role as the chainsaw-wielding Colombian drug dealer "Hector the Toad" in the 1983 film Scarface. He also appeared alongside Al Pacino in Carlito's Way a decade later.
He was one of three original cast members to reprise their roles in the 2006 video game based on the film, Scarface: The World Is Yours, which sold more than two million units in less than two years.
Al Israel died on March 16, 2011, at age 75.
Filmography
References
External links
Al Israel (Aveleyman)
American male film actors
American male television actors
Jewish American male actors
2011 deaths
1930s births
21st-century American Jews
|
Getaround is an online car sharing or peer-to-peer carsharing service that connects drivers who need to reserve cars with car owners who share their cars in exchange for payment.
As of 2019, the company was reported to have five million users and approximately 20,000 connected cars worldwide.
Getaround launched to the public on May 24, 2011, at the TechCrunch Disrupt conference. The company operates in Boston, Chicago, San Francisco Bay Area, New Jersey, Portland, Seattle, Philadelphia, Miami, Orlando, Atlanta, San Diego, Los Angeles, Denver, and Washington D.C.
History
Getaround was founded in 2009 by Sam Zaid, Jessica Scorpio, and Elliot Kroo. In May 2011, Getaround won the TechCrunch Disrupt New York competition. In 2012, Getaround began serving Portland, Oregon with the aid of a $1.725 million grant from the Federal Highway Administration.
In November 2016, Getaround reached an agreement with City CarShare to take over its fleet, parking spaces and member base.
In August 2018, Getaround raised $300 million in funding from SoftBank.
In April 2019, Getaround acquired the carsharing platform Drivy for $300 million; Drivy was rebranded as Getaround six months later.
In May 2022, Getaround announced an agreement to go public through a merger with a special purpose acquisition company (SPAC). The deal would list the company's shares on the New York Stock Exchange under the ticker 'GETR', at a combined company equity value of $1.2 billion.
Financial difficulties
In January 2020, The Information reported the company planned to lay off approximately 150 staff members or about twenty-five percent of the workforce.
Bloomberg reported in March 2020 that demand had dropped due to the COVID-19 pandemic, and that the company was short on cash and looking for a buyer.
Criminal use
Criminals have used Getaround, along with other peer-to-peer car rental services such as Turo, for illegal activities. In February 2020, the Washington Post reported that thieves were finding available cars using the Getaround mobile app, which displayed the exact locations of vehicles for rent. Victims have reported that thieves could break into a car, destroy the Getaround Connect device that is intended to immobilize the car and report its position, and take the keys that had been locked inside the vehicle. Of the 787 cars stolen in the District of Columbia between October 1, 2019, and February 4, 2020, the Metropolitan Police Department of the District of Columbia estimated that 49 of the thefts involved car rental apps such as Getaround. In July 2021, the Attorney General for the District of Columbia announced a settlement with Getaround that required Getaround to pay the district $950,000, to pay restitution to users whose vehicles had been damaged or stolen, and to make other changes to its platform.
In February 2020, NBC News interviewed eight Getaround users whose cars had been stolen, damaged, seized by police as evidence, or otherwise misused. Many of the owners were not fully compensated by Getaround's insurance for their losses. A former Getaround employee told NBC News that the company has known since 2017 that its GPS tracking devices were not tamper-proof.
References
Carsharing
Companies based in San Francisco
Peer-to-peer
Online marketplaces of the United States
2009 establishments in California
Softbank portfolio companies
|
Ram Lallu Vaishya is an Indian politician from the Bharatiya Janata Party. He was elected as the Member of Legislative Assembly of Madhya Pradesh (MLA) from the Singrauli constituency in the 2018 polls, defeating his immediate rival Renu Shah of the Indian National Congress by approximately 4,000 votes.
References
Madhya Pradesh MLAs 2018–2023
People from Singrauli district
Living people
Bharatiya Janata Party politicians from Madhya Pradesh
Year of birth missing (living people)
|
```python
from bs4 import BeautifulSoup
EN = (
{
'variable': 'congressperson_name',
'name': 'Congressperson Name',
'desc': """Name used by the congressperson during his or her term
in office. Usually it is composed of two elements: a given name and
a family name; two given names; or two forenames, unless the head
of the Chamber of Deputies explicitly alters this rule in order to avoid
confusion."""
},
{
'variable': 'congressperson_id',
'name': 'Unique Identifier of Congressperson',
'desc': """Unique identifier number of a congressperson at the
Chamber of Deputies."""
},
{
'variable': 'congressperson_document',
'name': 'Congressperson Document Number',
'desc': """Document used to identify the congressperson at the
Chamber of Deputies. May change from one term to another."""
},
{
'variable': 'term',
'name': 'Legislative Period Number',
'desc': """Legislative period: 4 years period, the same period
of the term of congresspeople. In the context of this allowance,
it represents the initial year of the legislature. It is also used
as part of the Congressperson Document Number since it changes in
between legislatures."""
},
{
'variable': 'state',
'name': 'State',
'desc': """In the context of this allowance it represents the
state or federative unit that elected the congressperson; it is
also used to define the value of the allowance to the
congressperson."""
},
{
'variable': 'party',
'name': 'Party',
'desc': """It represents the abbreviation of a party. Definition
of party: it is an organization built by people with interests or
ideologies in common. They form an association with the purpose of
achieving power to implement a government program. They are legal
entities, free and autonomous when it comes to their creation and
self-organization, since they respect the constitutional
commandments."""
},
{
'variable': 'term_id',
'name': 'Legislative Period Code',
'desc': """Legislative period: 4 years period, the same period
of the term of congresspeople. In the context of this allowance it
represents the identifying code of the legislature, an ordinal
number incremented by one each new legislature (e.g. the
2011 legislature is the 54th legislature)."""
},
{
'variable': 'subquota_number',
'name': 'Subquota Number',
'desc': """In the context of this allowance this is the code of
the category group referring to the nature of the expense claimed
by the congressperson's receipt, the receipt of what was debited
from the congressperson's account."""
},
{
'variable': 'subquota_description',
'name': 'Subquota Description',
'desc': """The description of the category group referring to
the nature of the expense."""
},
{
'variable': 'subquota_group_id',
'name': 'Subquota Specification Number',
'desc': """In the context of this allowance there are expenses
under certain category groups that require further specifications
(e.g. fuel). This variable represents the code of these detailed
specifications."""
},
{
'variable': 'subquota_group_description',
'name': 'Subquota Specification Description',
'desc': """Description of the detailed specification required by
certain category groups."""
},
{
'variable': 'supplier',
'name': 'Supplier',
'desc': """Name of the supplier of the product or service
specified by the receipt."""
},
{
'variable': 'cnpj_cpf',
'name': 'CNPJ/CPF',
'desc': """CNPJ or CPF are identification numbers issued for,
respectively, companies and people by Federal Revenue of Brazil.
CNPJ are 14 digits long and CPF are 11 digits long. This field is
the identification number (CNPJ or CPF) of the legal entity issuing
the receipt. The receipt is a proof of the expense and is a valid
document used to claim for a reimbursement."""
},
{
'variable': 'document_number',
'name': 'Document Number',
'desc': """This field is the identifying number issued in the
receipt, in the proof of expense declared by the congressperson in
this allowance."""
},
{
'variable': 'document_type',
'name': 'Fiscal Document Type',
'desc': """Type of receipt: 0 (zero) for bill of sale; 1 (one)
for simple receipt; and 2 (two) for expenses made abroad."""
},
{
'variable': 'issue_date',
'name': 'Issue Date',
'desc': """Issuing date of the receipt."""
},
{
'variable': 'document_value',
'name': 'Document Value',
'desc': """Value of the expense in the receipt. If it refers to
flight tickets this value can be negative, meaning that it is a
credit related to other flight tickets issued but not used by the
congressperson (the same is valid for `net_value`)."""
},
{
'variable': 'remark_value',
'name': 'Remark Value',
'desc': """Remarked value of the expense concerning the value of
the receipt, or remarked value of the expense."""
},
{
'variable': 'net_value',
'name': 'Net Value',
'desc': """Net value of the receipt calculated from the value of
the receipt and the remarked value. This is the value that is going
to be debited from the congressperson's account. If the category
group is Telephone and the value is zero, it means the expense was
franchised out."""
},
{
'variable': 'month',
'name': 'Month',
'desc': """Month of the receipt. It is used together with the
year to determine in which month the debt will be considered in the
context of this allowance."""
},
{
'variable': 'year',
'name': 'Year',
'desc': """Year of the receipt. It is used together with the
month to determine in which month the debt will be considered in
the context of this allowance."""
},
{
'variable': 'installment',
'name': 'Installment Number',
'desc': """The number of the installment of the receipt. Used
when the receipt has to be reimbursed in installments."""
},
{
'variable': 'passenger',
'name': 'Passenger',
'desc': """Name of the passenger when the receipt refers to a
flight ticket."""
},
{
'variable': 'leg_of_the_trip',
'name': 'Leg of the Trip',
'desc': """Leg of the trip when the receipt refers to a flight
ticket."""
},
{
'variable': 'batch_number',
'name': 'Batch Number',
'desc': """In the context of this allowance the batch number
refers to the cover number of a batch grouping receipts handed in
to the Chamber of Deputies to be reimbursed. This data together with the
reimbursement number helps in finding the receipt in the Lower
House Archive."""
},
{
'variable': 'reimbursement_number',
'name': 'Reimbursement Number',
'desc': """In the context of this allowance the reimbursement
number points to the document issued in the reimbursement process.
This data together with the batch number helps in finding
the receipt in the Chamber of Deputies Archive."""
},
{
'variable': 'reimbursement_value',
'name': 'Reimbursement Value',
'desc': 'Reimbursement value referring to the document value.'
},
{
'variable': 'applicant_id',
'name': 'Applicant Identifier',
'desc': """Identifying number of a congressperson or the Chamber of Deputies
leadership for the sake of transparency and accountability within
this allowance."""
}
)
def get_portuguese():
"""
Returns a generator of dictionaries with variable, name and description in
pt-BR (based on data/2016-08-08-datasets-format.html)
"""
with open('data/2016-08-08-datasets-format.html', 'rb') as file_handler:
parsed = BeautifulSoup(file_handler.read(), 'lxml')
for row in parsed.select('.tabela-2 tr'):
cells = row.select('td')
if cells:
var, name, desc = map(lambda x: x.text.strip(), cells)
yield {
'variable': var,
'name': name,
'desc': desc
}
def clean_up(s):
"""Remove new lines and indentation from a string."""
return ' '.join(s.split())
def variable_block(count, pt, en):
"""
Get the count (int) the pt version (dict) and en version (dict) and outputs
a generator with markdown contents with all the variable info in both
languages. The dict is expected to have three keys: variable, name & desc.
"""
return (
'',
'## {}. {} (`{}`)'.format(count, en['name'], en['variable']),
'',
'| | |',
'|:------:|:------:|',
'| **{}** | **{}** |'.format(pt['name'], en['name']),
'| `{}` | `{}` |'.format(pt['variable'], en['variable']),
'| {} | {} |'.format(pt['desc'], clean_up(en['desc'])),
''
)
def markdown():
yield from (
'# Quota for Exercising Parliamentary Activity (CEAP)',
'',
'> This file is auto-generated by `src/translation_table.py`.',
'',
'The following files are covered by this description:',
'',
'```',
'2016-08-08-current-year.xz', '2016-08-08-last-year.xz', '2016-08-08-previous-years.xz',
'```',
'',
'The Quota for Exercising Parliamentary Activity (aka CEAP) is a monthly quota available exclusively for covering costs of deputies with the exercise of parliamentary activity. The [Bureau Act 43 of 2009](path_to_url) describes the guidelines for its use.',
)
for index, contents in enumerate(zip(get_portuguese(), EN)):
yield from variable_block(index + 1, *contents)
with open('data/2016-08-08-ceap-datasets.md', 'w') as file_handler:
file_handler.write('\n'.join(markdown()))
```
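As a quick sanity check of the markdown rows that `variable_block` emits, here is a minimal self-contained sketch. It re-states the two small helpers from the script so it runs on its own; the `pt`/`en` entries are made up for illustration and are not part of the real dataset.

```python
def clean_up(s):
    """Remove new lines and indentation from a string."""
    return ' '.join(s.split())

def variable_block(count, pt, en):
    # Same shape as the helper in the script: a tuple of markdown lines
    # forming a heading plus a two-column pt-BR / English table.
    return (
        '',
        '## {}. {} (`{}`)'.format(count, en['name'], en['variable']),
        '',
        '| | |',
        '|:------:|:------:|',
        '| **{}** | **{}** |'.format(pt['name'], en['name']),
        '| `{}` | `{}` |'.format(pt['variable'], en['variable']),
        '| {} | {} |'.format(pt['desc'], clean_up(en['desc'])),
        ''
    )

# Hypothetical entries for one variable:
pt = {'variable': 'ano', 'name': 'Ano', 'desc': 'Ano do recibo.'}
en = {'variable': 'year', 'name': 'Year',
      'desc': """Year of the
      receipt."""}

block = variable_block(1, pt, en)
print(block[1])  # → ## 1. Year (`year`)
print(block[7])  # → | Ano do recibo. | Year of the receipt. |
```

Note that only the English description goes through `clean_up`, because the Portuguese text comes from `BeautifulSoup`'s `.text.strip()` and is already a single line.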
|
```objective-c
#ifdef __OBJC__
#import <UIKit/UIKit.h>
#else
#ifndef FOUNDATION_EXPORT
#if defined(__cplusplus)
#define FOUNDATION_EXPORT extern "C"
#else
#define FOUNDATION_EXPORT extern
#endif
#endif
#endif
#import "NSData+ImageContentType.h"
#import "SDImageCache.h"
#import "SDWebImageCompat.h"
#import "SDWebImageDecoder.h"
#import "SDWebImageDownloader.h"
#import "SDWebImageDownloaderOperation.h"
#import "SDWebImageManager.h"
#import "SDWebImageOperation.h"
#import "SDWebImagePrefetcher.h"
#import "UIButton+WebCache.h"
#import "UIImage+GIF.h"
#import "UIImage+MultiFormat.h"
#import "UIImageView+HighlightedWebCache.h"
#import "UIImageView+WebCache.h"
#import "UIView+WebCacheOperation.h"
FOUNDATION_EXPORT double SDWebImageVersionNumber;
FOUNDATION_EXPORT const unsigned char SDWebImageVersionString[];
```
|
Agrimonia (from the Greek), commonly known as agrimony, is a genus of 12–15 species of perennial herbaceous flowering plants in the family Rosaceae, native to the temperate regions of the Northern Hemisphere, with one species also in Africa. The species grow to between tall, with interrupted pinnate leaves, and tiny yellow flowers borne on a single (usually unbranched) spike.
Agrimonia species are used as food plants by the larvae of some Lepidoptera species including grizzled skipper (recorded on A. eupatoria) and large grizzled skipper.
Species
Agrimonia eupatoria – Common agrimony (Europe, Asia, Africa)
Agrimonia gryposepala – Common agrimony, tall hairy agrimony (North America)
Agrimonia incisa – Incised agrimony (North America)
Agrimonia coreana – Korean agrimony (eastern Asia)
Agrimonia microcarpa – Smallfruit agrimony (North America)
Agrimonia nipponica – Japanese agrimony (eastern Asia)
Agrimonia parviflora – Harvestlice agrimony (North America)
Agrimonia pilosa – Hairy agrimony (eastern Europe, Asia)
Agrimonia procera – Fragrant agrimony (Europe)
Agrimonia pubescens – Soft or downy agrimony (North America)
Agrimonia repens – Short agrimony (southwest Asia)
Agrimonia rostellata – Beaked agrimony (North America)
Agrimonia striata – Roadside agrimony (North America)
Uses
In ancient times, it was used for foot baths and tired feet. Agrimony has a long history of medicinal use. The English poet Michael Drayton once hailed it as an "all-heal" and through the ages it was considered a panacea. The ancient Greeks used agrimony to treat eye ailments, and it was made into brews for diarrhea and disorders of the gallbladder, liver, and kidneys. The Anglo-Saxons boiled agrimony in milk and used it to improve erectile performance.
They also made a solution from the leaves and seeds for healing wounds; this use continued through the Middle Ages and afterward, in a preparation called eau d'arquebusade, or "musket-shot water". It has been added to tea as a spring tonic. According to the German Federal Commission E (Phytotherapy) monograph "Agrimony", published in 1990, the internal applications are "mild, nonspecific, acute diarrhea" and "inflammation of oral and pharyngeal mucosa", and the external application is "mild, superficial inflammation of the skin" (German Federal Commission E Monographs (Phytotherapy): Monograph Agrimony (Agrimoniae herba). Bundesanzeiger, March 13, 1990 – www.heilpflanzen-welt.de).
Folklore
Traditional British folklore states that if a sprig of Agrimonia eupatoria was placed under a person's head, they would sleep until it was removed.
See also
Aremonia agrimonioides (Bastard-agrimony, of the related genus Aremonia)
Eupatorium cannabinum (Hemp-agrimony)
References
External links
Herb Forum Agrimony Thread
Medicinal plants
Agrimoniinae
Rosaceae genera
Taxa named by Joseph Pitton de Tournefort
|
13 Minutes is a 2015 German drama film directed by Oliver Hirschbiegel that tells the true story of Georg Elser's failed attempt to assassinate Adolf Hitler in November 1939. The title of the film is drawn from the fact that Elser's bomb detonated in a venue that Hitler had left just 13 minutes before.
It was screened out of competition at the 65th Berlin International Film Festival. It was one of eight films shortlisted by Germany to be their submission for the Academy Award for Best Foreign Language Film at the 88th Academy Awards, but it lost out to Labyrinth of Lies.
Plot
In November 1939, after planting a home-made bomb inside a column of a Munich Bierkeller, Georg Elser (Christian Friedel) attempts to cross into neutral Switzerland but is caught at the border. His bomb detonates but misses killing German leader Adolf Hitler by just 13 minutes.
The German security services find incriminating evidence on Elser and link him to the assassination attempt. They believe Elser must have been working with a group of conspirators and proceed to torture Elser. They also round up members of his family from his home village, including Else Härlen (Katharina Schüttler), a married woman Elser has been seeing.
When Else is brought before Elser, he fears for her life and tells Kripo police chief Arthur Nebe (Burghart Klaußner) and Gestapo head Heinrich Müller (Johann von Bülow) that he acted alone, procuring detonators from a steel factory and stealing dynamite from a nearby quarry. He outlines the two clockwork mechanisms he built to time the explosion and hopefully kill Hitler as he made a speech. Still not believed to have attempted the assassination alone, Elser is once more tortured using drugs (Pervitin), but with the same result as before—he insists that he acted alone.
Through flashbacks it is learned that Elser came to despise the Nazis and saw that Hitler needed to be removed to save Germany. Following his arrest, Elser was kept in concentration camps for five years and was shot a few days before American forces liberated Dachau concentration camp, a few weeks before the war ended. In his last days he hears that Arthur Nebe has been killed for his part in the July assassination plot.
Elser is now regarded as a German resistance hero of the Second World War.
Cast
Christian Friedel as Georg Elser
Katharina Schüttler as Else Härlen
Burghart Klaußner as Arthur Nebe
as Heinrich Müller
as Eberle
as Josef Schurr
as Erich
as SS Obergruppenführer
as Maria Elser
Martin Maria Abram as Ludwig Elser
as Franz Xaver Lechner
Critical reception
The film has been received generally positively by critics, holding a approval rating on Rotten Tomatoes. The site's critical consensus reads, "13 Minutes explores an oft-neglected corner of World War II history with just enough craft and narrative momentum to offset a disappointing lack of subtlety." The review in The Guardian newspaper noted the film as "...a heartfelt study of a man who tried to kill Hitler". The newspaper was also very complimentary about Christian Friedel's performance as Elser.
However, the entertainment magazine Variety was less impressed, saying "... the absence of subtlety combined with predictable dollops of sentimentalism once again trivialize events in the name of making them understandable". In The Daily Telegraph's review, the reviewer noted the film as having an "...overbearing sentimentalism and lacquered, Oscar-hungry sheen".
References
External links
2015 films
2010s historical films
2010s German-language films
German historical films
Films about the German Resistance
Films about assassinations
Films directed by Oliver Hirschbiegel
Films scored by David Holmes (musician)
2010s German films
|
```java
package com.yahoo.vespa.hosted.provision.applications;
import com.yahoo.config.provision.CloudAccount;
import com.yahoo.config.provision.ClusterInfo;
import com.yahoo.config.provision.IntRange;
import com.yahoo.config.provision.Capacity;
import com.yahoo.config.provision.ClusterResources;
import com.yahoo.config.provision.ClusterSpec;
import com.yahoo.vespa.hosted.provision.autoscale.Autoscaler;
import com.yahoo.vespa.hosted.provision.autoscale.Autoscaling;
import com.yahoo.vespa.hosted.provision.autoscale.ClusterModel;
import java.time.Duration;
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;
import java.util.Objects;
import java.util.Optional;
/**
* The node repo's view of a cluster in an application deployment.
*
* This is immutable, and must be locked with the application lock on read-modify-write.
*
* @author bratseth
*/
public class Cluster {
public static final int maxScalingEvents = 15;
private final ClusterSpec.Id id;
private final boolean exclusive;
private final ClusterResources min, max;
private final IntRange groupSize;
private final boolean required;
private final Optional<CloudAccount> cloudAccount;
private final List<Autoscaling> suggestions;
private final Autoscaling target;
private final ClusterInfo clusterInfo;
private final BcpGroupInfo bcpGroupInfo;
/** The maxScalingEvents last scaling events of this, sorted by increasing time (newest last) */
private final List<ScalingEvent> scalingEvents;
public Cluster(ClusterSpec.Id id,
boolean exclusive,
ClusterResources minResources,
ClusterResources maxResources,
IntRange groupSize,
boolean required,
Optional<CloudAccount> cloudAccount,
List<Autoscaling> suggestions,
Autoscaling target,
ClusterInfo clusterInfo,
BcpGroupInfo bcpGroupInfo,
List<ScalingEvent> scalingEvents) {
this.id = Objects.requireNonNull(id);
this.exclusive = exclusive;
this.min = Objects.requireNonNull(minResources);
this.max = Objects.requireNonNull(maxResources);
this.groupSize = Objects.requireNonNull(groupSize);
this.required = required;
this.cloudAccount = Objects.requireNonNull(cloudAccount);
this.suggestions = Objects.requireNonNull(suggestions);
Objects.requireNonNull(target);
if (target.resources().isPresent() && ! target.resources().get().isWithin(minResources, maxResources))
this.target = target.withResources(Optional.empty()); // Delete illegal target
else
this.target = target;
this.clusterInfo = clusterInfo;
this.bcpGroupInfo = Objects.requireNonNull(bcpGroupInfo);
this.scalingEvents = List.copyOf(scalingEvents);
}
public ClusterSpec.Id id() { return id; }
/** Returns whether the nodes allocated to this cluster must be on hosts exclusively dedicated to this application */
public boolean exclusive() { return exclusive; }
/** Returns the configured minimal resources in this cluster */
public ClusterResources minResources() { return min; }
/** Returns the configured maximal resources in this cluster */
public ClusterResources maxResources() { return max; }
/** Returns the configured group size range in this cluster */
public IntRange groupSize() { return groupSize; }
/**
* Returns whether the resources of this cluster are required to be within the specified min and max.
* Otherwise, they may be adjusted by capacity policies.
*/
public boolean required() { return required; }
/** Returns the enclave cloud account of this cluster, or empty if not enclave. */
public Optional<CloudAccount> cloudAccount() { return cloudAccount; }
/**
* Returns the computed resources (between min and max, inclusive) this cluster should
* have allocated at the moment (whether or not it actually has it),
* or empty if the system currently has no target.
*/
public Autoscaling target() { return target; }
/**
* The list of suggested resources, which may or may not be within the min and max limits,
* or empty if there is currently no recorded suggestion.
* List is sorted by preference
*/
public List<Autoscaling> suggestions() { return suggestions; }
/** Returns true if there is a current suggestion and we should actually make this suggestion to users. */
public boolean shouldSuggestResources(ClusterResources currentResources) {
if (suggestions.isEmpty()) return false;
return suggestions.stream().noneMatch(suggestion ->
suggestion.resources().isEmpty()
|| suggestion.resources().get().isWithin(min, max)
|| ! Autoscaler.worthRescaling(currentResources, suggestion.resources().get())
);
}
public ClusterInfo clusterInfo() { return clusterInfo; }
/** Returns info about the BCP group of clusters this belongs to. */
public BcpGroupInfo bcpGroupInfo() { return bcpGroupInfo; }
/** Returns the recent scaling events in this cluster */
public List<ScalingEvent> scalingEvents() { return scalingEvents; }
public Optional<ScalingEvent> lastScalingEvent() {
if (scalingEvents.isEmpty()) return Optional.empty();
return Optional.of(scalingEvents.get(scalingEvents.size() - 1));
}
/** Returns whether the last scaling event in this has yet to complete. */
public boolean scalingInProgress() {
return lastScalingEvent().isPresent() && lastScalingEvent().get().completion().isEmpty();
}
public Cluster withConfiguration(boolean exclusive, Capacity capacity) {
return new Cluster(id, exclusive,
capacity.minResources(), capacity.maxResources(), capacity.groupSize(), capacity.isRequired(),
capacity.cloudAccount(), suggestions, target, capacity.clusterInfo(), bcpGroupInfo, scalingEvents);
}
public Cluster withSuggestions(List<Autoscaling> suggestions) {
return new Cluster(id, exclusive, min, max, groupSize, required, cloudAccount, suggestions, target, clusterInfo, bcpGroupInfo, scalingEvents);
}
public Cluster withTarget(Autoscaling target) {
return new Cluster(id, exclusive, min, max, groupSize, required, cloudAccount, suggestions, target, clusterInfo, bcpGroupInfo, scalingEvents);
}
public Cluster with(BcpGroupInfo bcpGroupInfo) {
return new Cluster(id, exclusive, min, max, groupSize, required, cloudAccount, suggestions, target, clusterInfo, bcpGroupInfo, scalingEvents);
}
/** Add or update (based on "at" time) a scaling event */
public Cluster with(ScalingEvent scalingEvent) {
List<ScalingEvent> scalingEvents = new ArrayList<>(this.scalingEvents);
int existingIndex = eventIndexAt(scalingEvent.at());
if (existingIndex >= 0)
scalingEvents.set(existingIndex, scalingEvent);
else
scalingEvents.add(scalingEvent);
prune(scalingEvents);
return new Cluster(id, exclusive, min, max, groupSize, required, cloudAccount, suggestions, target, clusterInfo, bcpGroupInfo, scalingEvents);
}
@Override
public int hashCode() { return id.hashCode(); }
@Override
public boolean equals(Object other) {
if (other == this) return true;
if ( ! (other instanceof Cluster)) return false;
return ((Cluster)other).id().equals(this.id);
}
@Override
public String toString() { return id.toString(); }
private void prune(List<ScalingEvent> scalingEvents) {
while (scalingEvents.size() > maxScalingEvents)
scalingEvents.remove(0);
}
private int eventIndexAt(Instant at) {
for (int i = 0; i < scalingEvents.size(); i++) {
if (scalingEvents.get(i).at().equals(at))
return i;
}
return -1;
}
public static Cluster create(ClusterSpec.Id id, boolean exclusive, Capacity requested) {
return new Cluster(id, exclusive,
requested.minResources(), requested.maxResources(), requested.groupSize(), requested.isRequired(),
requested.cloudAccount(), List.of(), Autoscaling.empty(), requested.clusterInfo(), BcpGroupInfo.empty(), List.of());
}
/** The predicted time it will take to rescale this cluster. */
public Duration scalingDuration() {
int completedEventCount = 0;
Duration totalDuration = Duration.ZERO;
for (ScalingEvent event : scalingEvents()) {
if (event.duration().isEmpty()) continue;
// Assume we have missed timely recording completion if it is longer than 4 days, so ignore
if ( ! event.duration().get().minus(Duration.ofDays(4)).isNegative()) continue;
completedEventCount++;
totalDuration = totalDuration.plus(event.duration().get());
}
if (completedEventCount == 0) return ClusterModel.minScalingDuration();
return minimum(ClusterModel.minScalingDuration(), totalDuration.dividedBy(completedEventCount));
}
/** The predicted time this cluster will stay in each resource configuration (including the scaling duration). */
public Duration allocationDuration(ClusterSpec clusterSpec) {
if (scalingEvents.size() < 2) return Duration.ofHours(12); // Default
long totalDurationMs = 0;
for (int i = 1; i < scalingEvents().size(); i++)
totalDurationMs += scalingEvents().get(i).at().toEpochMilli() - scalingEvents().get(i - 1).at().toEpochMilli();
return Duration.ofMillis(totalDurationMs / (scalingEvents.size() - 1));
}
private static Duration minimum(Duration smallestAllowed, Duration duration) {
if (duration.minus(smallestAllowed).isNegative())
return smallestAllowed;
return duration;
}
}
```
|
The Blue Mountains Dams are a series of six dams in the Blue Mountains which supply water to the Blue Mountains and Sydney, Australia. The dams are managed by WaterNSW. Water in this scheme may be supplemented from the Fish River Scheme.
Cascade Dams
There are three dams built on Cascade Creek, near Katoomba, known as Cascade numbers 1, 2 and 3. The Middle Dam was the first, completed in 1908. It is tall; long; and holds . The Lower Cascade Dam is an earthfill embankment dam with a central concrete core which was completed in 1926. It is high; long; and holds . The Upper Cascade Dam is another earthfill embankment dam, built in 1938. It is high; long; and it holds .
Middle Cascade Dam is a "Darley-Wade" dam, named after the series of dams designed by L.A.B. Wade, a prominent dam engineer of the era, and C.W. Darley, who designed the first constant-radius thin-arch dam in Australia (Lithgow 1). It was originally constructed in 1908 as a buttress section before being raised to a constant-radius arch dam to satisfy the increasing water demand of the area. Stage 1, known originally as "Katoomba Dam", was a curved buttress dam built to a height of 7.6 m, with every intent of future raising. The dam was raised by filling in the spaces between the buttresses with concrete. The right abutment consists of a gravity section, both to increase the effectiveness of the arch action and because the foundations were unsuitable for an arch abutment.
Lake Medlow and Greaves Creek Dams
The heritage-listed Lake Medlow and Greaves Creek Dams were built on Adams Creek and Greaves Creek respectively. Lake Medlow Dam was the first concrete thin-arch, high-stress dam built in New South Wales, and is one of the thinnest dams in the world. It is high; long; and holds . Greaves Creek Dam is also a concrete arch dam; it was completed in 1942. It is high; long; and holds . Sydney Water decommissioned the Greaves Creek Treatment Plant.
Woodford Creek Dam
Woodford Creek Dam is a concrete arch dam which was built on the junction of Woodford Creek and Bulls Creek and completed in 1928. It was subsequently raised several times. It is high and long. Water is no longer drawn from Woodford Dam. In late 2009 Woodford Dam surrounds were opened up to walkers and mountain bikers. Access to the dam wall and lake is still prohibited.
Originally envisaged to supply steam trains, Woodford Dam is one of six dams constructed to supply water to the Blue Mountains area. Two raisings have occurred, the latest in 1947. Due to the closure of the Linden water filtration plant in about 2000, the storage is no longer used for water supply and the raw-water outlet pipework has been disconnected and capped.
The first stage was constructed by NSW Railways between 1928 and 1929. Stage two raised the storage by 2.44m in 1935. The final third stage increased the storage by an additional 3.05m in 1947.
Although rare, the storage has been used for rural firefighting by helicopters.
The dam wall is an arch/gravity structure with keyed vertical contraction joints and copper waterstops. The foundations are excavated into the sandstone in a series of steps.
References
Coordinates
- Lake Medlow Dam
- Greaves Creek Dam
- Upper Cascade Dam
- Middle Cascade Dam
- Lower Cascade Dam
- Woodford Dam
External links
Blue Mountains Dams at WaterNSW.
Geography of Sydney
Blue Mountains (New South Wales)
Arch dams
Dams in New South Wales
Embankment dams
Earth-filled dams
|
```html
<div *nzModalTitle> {{ 'mxk.text.select' | i18n }} </div>
<form nz-form [nzLayout]="'inline'" (ngSubmit)="onSearch()" class="search__form" style="margin-bottom: 10px">
<div nz-row [nzGutter]="{ xs: 8, sm: 8, md: 8, lg: 24, xl: 48, xxl: 48 }">
<div nz-col nzMd="14" nzSm="24">
<nz-form-item>
<nz-form-label nzFor="groupName">{{ 'mxk.groups.name' | i18n }}</nz-form-label>
<nz-form-control>
<input
nz-input
[(ngModel)]="query.params.groupName"
[ngModelOptions]="{ standalone: true }"
name="groupName"
placeholder=""
id="groupName"
/>
</nz-form-control>
</nz-form-item>
</div>
<div nz-col [nzSpan]="query.expandForm ? 24 : 10" [class.text-right]="query.expandForm">
<button nz-button type="submit" [nzType]="'primary'" [nzLoading]="query.submitLoading">{{ 'mxk.text.query' | i18n }}</button>
<button nz-button type="reset" (click)="onReset()" class="mx-sm" style="display: none">{{ 'mxk.text.reset' | i18n }}</button>
<button nz-button (click)="query.expandForm = !query.expandForm" class="mx-sm" style="display: none">
{{ query.expandForm ? ('mxk.text.collapse' | i18n) : ('mxk.text.expand' | i18n) }}</button
>
<button nz-button nzType="primary" (click)="onSubmit($event)">{{ 'mxk.text.confirm' | i18n }}</button>
</div>
</div>
</form>
<nz-table
#dynamicTable
nzTableLayout="auto"
nzSize="small"
nzBordered
nzShowSizeChanger
[nzData]="query.results.rows"
[nzFrontPagination]="false"
[nzTotal]="query.results.records"
[nzPageSizeOptions]="query.params.pageSizeOptions"
[nzPageSize]="query.params.pageSize"
[nzPageIndex]="query.params.pageNumber"
[nzLoading]="query.tableLoading"
(nzQueryParams)="onQueryParamsChange($event)"
nzWidth="100%"
>
<thead>
<tr>
<th></th>
<th nzAlign="center" style="display: none">Id</th>
<th nzAlign="center">{{ 'mxk.groups.name' | i18n }}</th>
<th nzAlign="center">{{ 'mxk.groups.category' | i18n }}</th>
</tr>
</thead>
<tbody>
<tr *ngFor="let data of query.results.rows">
<td
[nzChecked]="query.tableCheckedId.has(data.id)"
[nzDisabled]="data.disabled"
(nzCheckedChange)="onTableItemChecked(data.id, $event)"
></td>
<td nzAlign="left" style="display: none">
<span>{{ data.id }}</span>
</td>
<td nzAlign="left"> {{ data.groupName }}</td>
<td nzAlign="center" *ngIf="data.category == 'dynamic'"> {{ 'mxk.groups.category.dynamic' | i18n }}</td>
<td nzAlign="center" *ngIf="data.category == 'static'"> {{ 'mxk.groups.category.static' | i18n }}</td>
</tr>
</tbody>
</nz-table>
<div *nzModalFooter style="display: none">
<button nz-button nzType="default" (click)="onClose($event)">{{ 'mxk.text.close' | i18n }}</button>
<button nz-button nzType="primary" (click)="onSubmit($event)">{{ 'mxk.text.submit' | i18n }}</button>
</div>
```
|
FC Saturn-1991 Saint Petersburg () was a Russian football team from Saint Petersburg. It played professionally from 1992 to 1995, including three seasons (1993–1995) in the Russian First Division, the second-highest tier. In 1996 it merged with FC Lokomotiv Saint Petersburg. Before 1995 it was called FC Smena-Saturn Saint Petersburg.
External links
Team history at KLISF
Association football clubs established in 1991
Association football clubs disestablished in 1996
Defunct football clubs in Saint Petersburg
1991 establishments in Russia
1996 disestablishments in Russia
|
```html
<div style="background-color: CadetBlue; padding: 20px;">
<h3>Emails Custom View - Microsoft Graph Toolkit</h3>
<p>Isolated mode: ⚠ mandatory. Find more samples on <a href="path_to_url" target="_blank">MGT Playground</a>. MGT components require API permissions, see the <a href="path_to_url">Microsoft docs</a> for more info.</p>
<style>
.email {
box-shadow: 0 3px 7px rgba(0, 0, 0, 0.3);
padding: 10px;
margin: 8px 4px;
font-family: Segoe UI, Frutiger, Frutiger Linotype, Dejavu Sans, Helvetica Neue, Arial, sans-serif;
}
.email:hover {
box-shadow: 0 3px 14px rgba(0, 0, 0, 0.3);
padding: 10px;
margin: 8px 4px;
}
.email h3 {
font-size: 12px;
margin-bottom: 4px;
}
.email h4 {
font-size: 10px;
margin-top: 0px;
margin-bottom: 4px;
}
.email mgt-person {
--font-size: 10px;
--avatar-size-s: 12px;
}
.email .preview {
font-size: 13px;
text-overflow: ellipsis;
word-wrap: break-word;
overflow: hidden;
max-height: 2.8em;
line-height: 1.4em;
}
</style>
<script>
function setProvider() {mgt.Providers.globalProvider = new mgt.SharePointProvider(props.context);}
</script>
<script src="path_to_url" type="text/javascript"></script>
<script src="path_to_url" type="text/javascript" onload="setProvider()"></script>
<mgt-person person-query="me" view="twoLines"></mgt-person>
<mgt-get resource="/me/messages" version="beta" scopes="mail.read" max-pages="1">
<template>
<div class="email" data-for="email in value">
<h3>{{ email.subject }}</h3>
<h4>
<mgt-person person-query="{{email.sender.emailAddress.address}}" view="oneline" person-card="hover">
</mgt-person>
</h4>
<div data-if="email.bodyPreview" class="preview" innerHtml>{{email.bodyPreview}}</div>
<div data-else class="preview">
email body is empty
</div>
</div>
</template>
<template data-type="loading">
loading
</template>
<template data-type="error">
{{ this }}
</template>
</mgt-get>
</div>
```
|
```less
.guide-anchor {
display: inline-block;
position: relative;
}
.lite-popover.is-guide-anchor {
max-width: 200px;
}
.is-guide-anchor .guide-content {
padding: 10px 20px;
}
```
|
```c++
#include <sstream>
#include "baldr/json.h"
#include "skadi/sample.h"
#include "tyr/serializers.h"
using namespace valhalla;
using namespace valhalla::midgard;
using namespace valhalla::baldr;
namespace {
json::ArrayPtr serialize_range_height(const std::vector<double>& ranges,
const std::vector<double>& heights,
const uint32_t precision,
const double no_data_value) {
auto array = json::array({});
// for each posting
auto range = ranges.cbegin();
for (const auto height : heights) {
auto element = json::array({json::fixed_t{*range, 0}});
if (height == no_data_value) {
element->push_back(nullptr);
} else {
element->push_back({json::fixed_t{height, precision}});
}
array->push_back(element);
++range;
}
return array;
}
json::ArrayPtr serialize_height(const std::vector<double>& heights,
const uint32_t precision,
const double no_data_value) {
auto array = json::array({});
for (const auto height : heights) {
// add each height to the array
if (height == no_data_value) {
array->push_back(nullptr);
} else {
array->push_back({json::fixed_t{height, precision}});
}
}
return array;
}
json::ArrayPtr serialize_shape(const google::protobuf::RepeatedPtrField<valhalla::Location>& shape) {
auto array = json::array({});
for (const auto& p : shape) {
array->emplace_back(json::map(
{{"lon", json::fixed_t{p.ll().lng(), 6}}, {"lat", json::fixed_t{p.ll().lat(), 6}}}));
}
return array;
}
} // namespace
namespace valhalla {
namespace tyr {
/* example height with range response:
{
"shape": [ {"lat": 40.712433, "lon": -76.504913}, {"lat": 40.712276, "lon": -76.605263} ],
"range_height": [ [0,303], [8467,275], [25380,198] ]
}
*/
std::string serializeHeight(const Api& request,
const std::vector<double>& heights,
const std::vector<double>& ranges) {
auto json = json::map({});
// get the precision to use for returned heights
uint32_t precision = request.options().height_precision();
// get the distances between the postings
if (ranges.size()) {
json = json::map({{"range_height", serialize_range_height(ranges, heights, precision,
skadi::get_no_data_value())}});
} // just the postings
else {
json = json::map({{"height", serialize_height(heights, precision, skadi::get_no_data_value())}});
}
// send back the shape as well
if (request.options().has_encoded_polyline_case()) {
json->emplace("encoded_polyline", request.options().encoded_polyline());
} else {
json->emplace("shape", serialize_shape(request.options().shape()));
}
if (request.options().has_id_case()) {
json->emplace("id", request.options().id());
}
// add warnings to json response
if (request.info().warnings_size() >= 1) {
json->emplace("warnings", serializeWarnings(request));
}
std::stringstream ss;
ss << *json;
return ss.str();
}
} // namespace tyr
} // namespace valhalla
```
|
```csharp
using System.ComponentModel;
using AppKit;
namespace Xamarin.Forms.Platform.MacOS
{
public class TextCellRenderer : CellRenderer
{
readonly Color s_defaultDetailColor = ColorExtensions.SecondaryLabelColor.ToColor(NSColorSpace.DeviceRGBColorSpace);
readonly Color s_defaultTextColor = ColorExtensions.TextColor.ToColor(NSColorSpace.DeviceRGBColorSpace);
public override NSView GetCell(Cell item, NSView reusableView, NSTableView tv)
{
var textCell = (TextCell)item;
var tvc = reusableView as CellNSView ?? new CellNSView(NSTableViewCellStyle.Subtitle);
if (tvc.Cell != null)
tvc.Cell.PropertyChanged -= tvc.HandlePropertyChanged;
tvc.Cell = textCell;
textCell.PropertyChanged += tvc.HandlePropertyChanged;
tvc.PropertyChanged = HandlePropertyChanged;
tvc.TextLabel.StringValue = textCell.Text ?? "";
tvc.DetailTextLabel.StringValue = textCell.Detail ?? "";
tvc.TextLabel.TextColor = textCell.TextColor.ToNSColor(s_defaultTextColor);
tvc.DetailTextLabel.TextColor = textCell.DetailColor.ToNSColor(s_defaultDetailColor);
WireUpForceUpdateSizeRequested(item, tvc, tv);
UpdateIsEnabled(tvc, textCell);
UpdateBackground(tvc, item);
SetAccessibility(tvc, item);
return tvc;
}
protected virtual void HandlePropertyChanged(object sender, PropertyChangedEventArgs args)
{
var tvc = (CellNSView)sender;
var textCell = (TextCell)tvc.Cell;
if (args.PropertyName == TextCell.TextProperty.PropertyName)
{
tvc.TextLabel.StringValue = textCell.Text ?? "";
tvc.TextLabel.SizeToFit();
}
else if (args.PropertyName == TextCell.DetailProperty.PropertyName)
{
tvc.DetailTextLabel.StringValue = textCell.Detail ?? "";
tvc.DetailTextLabel.SizeToFit();
}
else if (args.PropertyName == TextCell.TextColorProperty.PropertyName)
tvc.TextLabel.TextColor = textCell.TextColor.ToNSColor(s_defaultTextColor);
else if (args.PropertyName == TextCell.DetailColorProperty.PropertyName)
tvc.DetailTextLabel.TextColor = textCell.DetailColor.ToNSColor(s_defaultDetailColor);
else if (args.PropertyName == Cell.IsEnabledProperty.PropertyName)
UpdateIsEnabled(tvc, textCell);
}
static void UpdateIsEnabled(CellNSView cell, TextCell entryCell)
{
cell.TextLabel.Enabled = entryCell.IsEnabled;
cell.DetailTextLabel.Enabled = entryCell.IsEnabled;
}
}
}
```
|
Lyutvi Ahmed Mestan (, ) (born 24 December 1960) is a Bulgarian politician of Turkish-Bulgarian origin. He was chairman of the Movement for Rights and Freedoms (DPS) from January 2013 to 24 December 2015, and has served as the Member of Parliament for Kardzhali. He was removed as party leader by the DPS central council and expelled from the party for what it considered a stance excessively favourable to the Turkish government following the downing of a Russian bomber jet by the Turkish Air Force. He subsequently founded a new political force, DOST, an acronym of Democrats for Responsibility, Solidarity and Tolerance in Bulgarian and a double entendre, as the word also means "friend" in Turkish. The party is based on pro-EU, pro-NATO and liberal positions.
References
External links
1960 births
Living people
People from Kardzhali Province
Members of the National Assembly (Bulgaria)
Bulgarian people of Turkish descent
Movement for Rights and Freedoms politicians
|
A major north–south highway extending almost the entire length of the Florida peninsula, State Road 45 (SR 45) is the unsigned Florida Department of Transportation designation of most of the current U.S. Route 41 in Florida. The southern terminus of SR 45 is an intersection with SR 90 in downtown Naples; the northern terminus is an intersection with US 441 (SR 25) in High Springs. South of Causeway Boulevard (SR 676) near Tampa, SR 45 is also known as the Tamiami Trail.
South and east of Naples, US 41 turns eastward as SR 90 as the Tamiami Trail crosses the Everglades on its way to Miami; north of High Springs, US 41 overlaps US 441 (SR 25) until their split in Lake City (from there US 41 continues to the Georgia border with the hidden SR 25 designation).
Separations of US 41 and SR 45 between SR 45 termini
SR 45 away from US 41
Business US 41 - Venice
Business US 41 - Bradenton to Memphis
Business US 41 (historic US 541) - Rockport to Ybor City
SR 60 - Tampa to Ybor City
SR 45 (not overlapped between Adamo Drive/SR 60 and Hillsborough Avenue/US 41-92-SR 600) - Tampa
US 41 away from SR 45
SR 45A - Venice
US 301 - Bradenton, concurrent with SR 55
SR 55 - Bradenton to Memphis
SR 599 - Rockport to Tampa
US 92 (SR 600) - Tampa
Additional concurrencies with SR 45
SR 45-55 - Memphis
US 41/SR 45-60 - Ybor City
US 41/SR 45-700 - Brooksville
US 41/SR 44-45 - Inverness
Alt US 27-US 41/SR 45-500 - Williston
US 27-US 41/SR 45 - Williston to High Springs
Major intersections
State Road 45A
State Road 45A (SR 45A) is the Venice Bypass, a segment along U.S. Route 41 (US 41) east of the Tamiami Trail in Venice. It was originally part of US 41 until 1965, when that segment was redesignated US 41 Bus after the Intracoastal Waterway (ICW) was dredged through Venice by the U.S. Army Corps of Engineers in 1964. The route begins near Shamrock Boulevard in Venice Gardens and terminates at Venetia Bay Boulevard in the Eastgate section of Venice.
References
External links
Florida Route Log (SR 45)
045
045
045
045
045
045
045
045
045
045
045
045
045
045
U.S. Route 41
|
```c++
/*
*
* Use of this source code is governed by a BSD-style license that can be
* found in the LICENSE file.
*/
#include "SkTypes.h"
#if defined SK_BUILD_CONDENSED
#include "SkMemberInfo.h"
#if SK_USE_CONDENSED_INFO == 1
#error "SK_USE_CONDENSED_INFO must be zero to build condensed info"
#endif
#if !defined SK_BUILD_FOR_WIN32
#error "SK_BUILD_FOR_WIN32 must be defined to build condensed info"
#endif
#include "SkDisplayType.h"
#include "SkIntArray.h"
#include <stdio.h>
SkTDMemberInfoArray gInfos;
SkTDIntArray gInfosCounts;
SkTDDisplayTypesArray gInfosTypeIDs;
SkTDMemberInfoArray gUnknowns;
SkTDIntArray gUnknownsCounts;
static void AddInfo(SkDisplayTypes type, const SkMemberInfo* info, int infoCount) {
SkASSERT(gInfos[type] == NULL);
gInfos[type] = info;
gInfosCounts[type] = infoCount;
*gInfosTypeIDs.append() = type;
size_t allStrs = 0;
for (int inner = 0; inner < infoCount; inner++) {
SkASSERT(info[inner].fCount < 256);
int offset = (int) info[inner].fOffset;
SkASSERT(offset < 128 && offset > -129);
SkASSERT(allStrs < 256);
if (info[inner].fType == SkType_BaseClassInfo) {
const SkMemberInfo* innerInfo = (const SkMemberInfo*) info[inner].fName;
if (gUnknowns.find(innerInfo) == -1) {
*gUnknowns.append() = innerInfo;
*gUnknownsCounts.append() = info[inner].fCount;
}
}
if (info[inner].fType != SkType_BaseClassInfo && info[inner].fName)
allStrs += strlen(info[inner].fName);
allStrs += 1;
SkASSERT(info[inner].fType < 256);
}
}
static void WriteInfo(FILE* condensed, const SkMemberInfo* info, int infoCount,
const char* typeName, bool draw, bool display) {
fprintf(condensed, "static const char g%sStrings[] = \n", typeName);
int inner;
// write strings
for (inner = 0; inner < infoCount; inner++) {
const char* name = (info[inner].fType != SkType_BaseClassInfo && info[inner].fName) ?
info[inner].fName : "";
const char* zero = inner < infoCount - 1 ? "\\0" : "";
fprintf(condensed, "\t\"%s%s\"\n", name, zero);
}
fprintf(condensed, ";\n\nstatic const SkMemberInfo g%s", draw ? "Draw" : display ? "Display" : "");
fprintf(condensed, "%sInfo[] = {", typeName);
size_t nameOffset = 0;
// write info tables
for (inner = 0; inner < infoCount; inner++) {
size_t offset = info[inner].fOffset;
if (info[inner].fType == SkType_BaseClassInfo) {
offset = (size_t) gInfos.find((const SkMemberInfo* ) info[inner].fName);
SkASSERT((int) offset >= 0);
offset = gInfosTypeIDs.find((SkDisplayTypes) offset);
SkASSERT((int) offset >= 0);
}
fprintf(condensed, "\n\t{%d, %d, %d, %d}", nameOffset, offset,
info[inner].fType, info[inner].fCount);
if (inner < infoCount - 1)
putc(',', condensed);
if (info[inner].fType != SkType_BaseClassInfo && info[inner].fName)
nameOffset += strlen(info[inner].fName);
nameOffset += 1;
}
fprintf(condensed, "\n};\n\n");
}
static void Get3DName(char* scratch, const char* name) {
if (strncmp("skia3d:", name, sizeof("skia3d:") - 1) == 0) {
strcpy(scratch, "3D_");
scratch[3]= name[7] & ~0x20;
strcpy(&scratch[4], &name[8]);
} else {
scratch[0] = name[0] & ~0x20;
strcpy(&scratch[1], &name[1]);
}
}
int type_compare(const void* a, const void* b) {
SkDisplayTypes first = *(SkDisplayTypes*) a;
SkDisplayTypes second = *(SkDisplayTypes*) b;
return first < second ? -1 : first == second ? 0 : 1;
}
void SkDisplayType::BuildCondensedInfo(SkAnimateMaker* maker) {
gInfos.setCount(kNumberOfTypes);
memset(gInfos.begin(), 0, sizeof(gInfos[0]) * kNumberOfTypes);
gInfosCounts.setCount(kNumberOfTypes);
memset(gInfosCounts.begin(), -1, sizeof(gInfosCounts[0]) * kNumberOfTypes);
// check to see if it is condensable
int index, infoCount;
for (index = 0; index < kTypeNamesSize; index++) {
const SkMemberInfo* info = GetMembers(maker, gTypeNames[index].fType, &infoCount);
if (info == NULL)
continue;
AddInfo(gTypeNames[index].fType, info, infoCount);
}
const SkMemberInfo* extraInfo =
SkDisplayType::GetMembers(maker, SkType_3D_Point, &infoCount);
AddInfo(SkType_Point, extraInfo, infoCount);
AddInfo(SkType_3D_Point, extraInfo, infoCount);
// int baseInfos = gInfos.count();
do {
SkTDMemberInfoArray oldRefs = gUnknowns;
SkTDIntArray oldRefCounts = gUnknownsCounts;
gUnknowns.reset();
gUnknownsCounts.reset();
for (index = 0; index < oldRefs.count(); index++) {
const SkMemberInfo* info = oldRefs[index];
if (gInfos.find(info) == -1) {
int typeIndex = 0;
for (; typeIndex < kNumberOfTypes; typeIndex++) {
const SkMemberInfo* temp = SkDisplayType::GetMembers(
maker, (SkDisplayTypes) typeIndex, NULL);
if (temp == info)
break;
}
SkASSERT(typeIndex < kNumberOfTypes);
AddInfo((SkDisplayTypes) typeIndex, info, oldRefCounts[index]);
}
}
} while (gUnknowns.count() > 0);
qsort(gInfosTypeIDs.begin(), gInfosTypeIDs.count(), sizeof(gInfosTypeIDs[0]), &type_compare);
#ifdef SK_DEBUG
FILE* condensed = fopen("../../src/animator/SkCondensedDebug.cpp", "w+");
fprintf(condensed, "#include \"SkTypes.h\"\n");
fprintf(condensed, "#ifdef SK_DEBUG\n");
#else
FILE* condensed = fopen("../../src/animator/SkCondensedRelease.cpp", "w+");
fprintf(condensed, "#include \"SkTypes.h\"\n");
fprintf(condensed, "#ifdef SK_RELEASE\n");
#endif
// write header
fprintf(condensed, "// This file was automatically generated.\n");
fprintf(condensed, "// To change it, edit the file with the matching debug info.\n");
fprintf(condensed, "// Then execute SkDisplayType::BuildCondensedInfo() to "
"regenerate this file.\n\n");
// write name of memberInfo
int typeNameIndex = 0;
int unknown = 1;
for (index = 0; index < gInfos.count(); index++) {
const SkMemberInfo* info = gInfos[index];
if (info == NULL)
continue;
char scratch[64];
bool drawPrefix, displayPrefix;
while (gTypeNames[typeNameIndex].fType < index)
typeNameIndex++;
if (gTypeNames[typeNameIndex].fType == index) {
Get3DName(scratch, gTypeNames[typeNameIndex].fName);
drawPrefix = gTypeNames[typeNameIndex].fDrawPrefix;
displayPrefix = gTypeNames[typeNameIndex].fDisplayPrefix;
} else {
sprintf(scratch, "Unknown%d", unknown++);
drawPrefix = displayPrefix = false;
}
WriteInfo(condensed, info, gInfosCounts[index], scratch, drawPrefix, displayPrefix);
}
// write array of table pointers
// start here;
fprintf(condensed, "static const SkMemberInfo* const gInfoTables[] = {");
typeNameIndex = 0;
unknown = 1;
for (index = 0; index < gInfos.count(); index++) {
const SkMemberInfo* info = gInfos[index];
if (info == NULL)
continue;
char scratch[64];
bool drawPrefix, displayPrefix;
while (gTypeNames[typeNameIndex].fType < index)
typeNameIndex++;
if (gTypeNames[typeNameIndex].fType == index) {
Get3DName(scratch, gTypeNames[typeNameIndex].fName);
drawPrefix = gTypeNames[typeNameIndex].fDrawPrefix;
displayPrefix = gTypeNames[typeNameIndex].fDisplayPrefix;
} else {
sprintf(scratch, "Unknown%d", unknown++);
drawPrefix = displayPrefix = false;
}
fprintf(condensed, "\n\tg");
if (drawPrefix)
fprintf(condensed, "Draw");
if (displayPrefix)
fprintf(condensed, "Display");
fprintf(condensed, "%sInfo", scratch);
if (index < gInfos.count() - 1)
putc(',', condensed);
}
fprintf(condensed, "\n};\n\n");
// write the array of number of entries in the info table
fprintf(condensed, "static const unsigned char gInfoCounts[] = {\n\t");
int written = 0;
for (index = 0; index < gInfosCounts.count(); index++) {
int count = gInfosCounts[index];
if (count < 0)
continue;
if (written > 0)
putc(',', condensed);
if (written % 20 == 19)
fprintf(condensed, "\n\t");
fprintf(condensed, "%d",count);
written++;
}
fprintf(condensed, "\n};\n\n");
// write array of type ids table entries correspond to
fprintf(condensed, "static const unsigned char gTypeIDs[] = {\n\t");
int typeIDCount = 0;
typeNameIndex = 0;
unknown = 1;
for (index = 0; index < gInfosCounts.count(); index++) {
const SkMemberInfo* info = gInfos[index];
if (info == NULL)
continue;
typeIDCount++;
char scratch[64];
while (gTypeNames[typeNameIndex].fType < index)
typeNameIndex++;
if (gTypeNames[typeNameIndex].fType == index) {
Get3DName(scratch, gTypeNames[typeNameIndex].fName);
} else
sprintf(scratch, "Unknown%d", unknown++);
fprintf(condensed, "%d%c // %s\n\t", index,
index < gInfosCounts.count() ? ',' : ' ', scratch);
}
fprintf(condensed, "\n};\n\n");
fprintf(condensed, "static const int kTypeIDs = %d;\n\n", typeIDCount);
// write the array of string pointers
fprintf(condensed, "static const char* const gInfoNames[] = {");
typeNameIndex = 0;
unknown = 1;
written = 0;
for (index = 0; index < gInfosCounts.count(); index++) {
const SkMemberInfo* info = gInfos[index];
if (info == NULL)
continue;
if (written > 0)
putc(',', condensed);
written++;
fprintf(condensed, "\n\tg");
char scratch[64];
while (gTypeNames[typeNameIndex].fType < index)
typeNameIndex++;
if (gTypeNames[typeNameIndex].fType == index) {
Get3DName(scratch, gTypeNames[typeNameIndex].fName);
} else
sprintf(scratch, "Unknown%d", unknown++);
fprintf(condensed, "%sStrings", scratch);
}
fprintf(condensed, "\n};\n\n");
fprintf(condensed, "#endif\n");
fclose(condensed);
gInfos.reset();
gInfosCounts.reset();
gInfosTypeIDs.reset();
gUnknowns.reset();
gUnknownsCounts.reset();
}
#elif defined SK_DEBUG
#include "SkDisplayType.h"
void SkDisplayType::BuildCondensedInfo(SkAnimateMaker* ) {}
#endif
```
|
```xml
<?xml version="1.0" encoding="utf-8"?>
<Project DefaultTargets="Build" ToolsVersion="15.0" xmlns="path_to_url">
<ItemGroup Label="ProjectConfigurations">
<ProjectConfiguration Include="debug|Win32">
<Configuration>debug</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="release|Win32">
<Configuration>release</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="profile|Win32">
<Configuration>profile</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
<ProjectConfiguration Include="checked|Win32">
<Configuration>checked</Configuration>
<Platform>Win32</Platform>
</ProjectConfiguration>
</ItemGroup>
<PropertyGroup Label="Globals">
<ProjectGuid>{961D3695-F024-7FB6-D236-2180AF48561A}</ProjectGuid>
<RootNamespace>SampleClothingHelloWorld</RootNamespace>
</PropertyGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.Default.props" />
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='debug|Win32'" Label="Configuration">
<ConfigurationType>Application</ConfigurationType>
<PlatformToolset>v141</PlatformToolset>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='release|Win32'" Label="Configuration">
<ConfigurationType>Application</ConfigurationType>
<PlatformToolset>v141</PlatformToolset>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='profile|Win32'" Label="Configuration">
<ConfigurationType>Application</ConfigurationType>
<PlatformToolset>v141</PlatformToolset>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='checked|Win32'" Label="Configuration">
<ConfigurationType>Application</ConfigurationType>
<PlatformToolset>v141</PlatformToolset>
</PropertyGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.props" />
<ImportGroup Label="ExtensionSettings">
</ImportGroup>
<ImportGroup Label="PropertySheets" Condition="'$(Configuration)|$(Platform)'=='debug|Win32'">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
<Import Project="../../../compiler/paths.vsprops" />
</ImportGroup>
<ImportGroup Label="PropertySheets" Condition="'$(Configuration)|$(Platform)'=='release|Win32'">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
<Import Project="../../../compiler/paths.vsprops" />
</ImportGroup>
<ImportGroup Label="PropertySheets" Condition="'$(Configuration)|$(Platform)'=='profile|Win32'">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
<Import Project="../../../compiler/paths.vsprops" />
</ImportGroup>
<ImportGroup Label="PropertySheets" Condition="'$(Configuration)|$(Platform)'=='checked|Win32'">
<Import Project="$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props" Condition="exists('$(UserRootDir)\Microsoft.Cpp.$(Platform).user.props')" Label="LocalAppDataPlatform" />
<Import Project="../../../compiler/paths.vsprops" />
</ImportGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='debug|Win32'">
<OutDir>./../../../bin/vc15win32-PhysX_3.4\</OutDir>
<IntDir>./build/Win32/SampleClothingHelloWorld/debug\</IntDir>
<TargetExt>.exe</TargetExt>
<TargetName>$(ProjectName)DEBUG</TargetName>
<CodeAnalysisRuleSet>AllRules.ruleset</CodeAnalysisRuleSet>
<CodeAnalysisRules />
<CodeAnalysisRuleAssemblies />
</PropertyGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='debug|Win32'">
<ClCompile>
<FloatingPointModel>Precise</FloatingPointModel>
<AdditionalOptions>/wd4201 /wd4324 /Wall /wd4514 /wd4820 /wd4127 /wd4710 /wd4711 /wd4061 /wd4668 /wd4626 /wd4266 /wd4263 /wd4264 /wd4640 /wd4625 /wd4574 /wd4191 /wd4987 /wd4986 /wd4946 /wd4836 /wd4571 /wd4826 /wd4577 /wd4458 /wd4456 /wd4457 /wd4548 /wd5026 /wd5027 /wd4464 /wd5038 /wd5039 /wd4596 /wd4365 /wd4774 /wd4996 /wd5045 /GR- /GF /WX /fp:fast /arch:SSE2 /MP /Od /RTCsu /fp:fast /WX- /Zi /Oi /Oy- /Gm- /EHsc /GS /Gd /nologo /wd4005 /wd4244 /d2Zi+</AdditionalOptions>
<Optimization>Disabled</Optimization>
<AdditionalIncludeDirectories>../../../../PxShared/include;../../../../PxShared/include/filebuf;../../../../PxShared/include/foundation;../../../../PxShared/include/task;../../../../PxShared/include/cudamanager;../../../../PxShared/include/pvd;../../../../PxShared/src/foundation/include;../../../../PxShared/src/filebuf/include;../../../../PxShared/src/fastxml/include;../../../../PxShared/src/pvd/include;./../../../include;./../../../include/PhysX3;./../../../include/basicios;./../../../include/clothing;./../../../include/destructible;./../../../include/emitter;./../../../include/particles;./../../../include/iofx;./../../../include/pxparticleios;../../../../PhysX_3.4/Include;./../../../shared/external/include;./../../../shared/general/shared;./../../../shared/general/RenderDebug/public;./../../../include;./../../../include/PhysX3;$(WindowsSDK_IncludePath);./../../../externals/extensions/externals/include/directxtex;./../../../externals/extensions/externals/include/dxut/Core;./../../../externals/extensions/externals/include/dxut/Optional;./../../../externals/extensions/externals/include/effects11;./../../../externals/extensions/externals/include/simpleopt;./../../../externals/extensions/include/nvidiautils;./../../../externals/extensions/include/nvsimplemesh;./../../../externals/extensions/externals/include;./../../../externals/extensions/externals/include/anttweakbar;./../../../externals/extensions/externals/include/assimp;./../../SampleBase;%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>
<PreprocessorDefinitions>WIN32;_CRT_SECURE_NO_DEPRECATE;_CRT_NONSTDC_NO_DEPRECATE;DISABLE_CUDA_PHYSX;_ALLOW_ITERATOR_DEBUG_LEVEL_MISMATCH;_ALLOW_RUNTIME_LIBRARY_MISMATCH;_DEBUG;PX_DEBUG;PX_CHECKED;PHYSX_PROFILE_SDK;PX_SUPPORT_VISUAL_DEBUGGER;PX_PROFILE;PX_NVTX=1;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<ExceptionHandling>Sync</ExceptionHandling>
<WarningLevel>Level3</WarningLevel>
<RuntimeLibrary>MultiThreadedDebug</RuntimeLibrary>
<PrecompiledHeader>NotUsing</PrecompiledHeader>
<PrecompiledHeaderFile></PrecompiledHeaderFile>
<DebugInformationFormat>ProgramDatabase</DebugInformationFormat>
</ClCompile>
<Link>
<AdditionalOptions>/INCREMENTAL:NO PhysX3CommonDEBUG_x86.lib PhysX3DEBUG_x86.lib PhysX3CookingDEBUG_x86.lib PhysX3ExtensionsDEBUG.lib PxPvdSDKDEBUG_x86.lib PxTaskDEBUG_x86.lib PxFoundationDEBUG_x86.lib PsFastXmlDEBUG_x86.lib ApexFrameworkDEBUG_x86.lib Apex_LegacyDEBUG_x86.lib Apex_DestructibleDEBUG_x86.lib /SUBSYSTEM:WINDOWS /LARGEADDRESSAWARE /NOLOGO /OPT:REF /OPT:ICF /INCREMENTAL:NO</AdditionalOptions>
<AdditionalDependencies>xinput.lib;d3dcompiler.lib;d3d11.lib;dxguid.lib;dxgi.lib;winmm.lib;comctl32.lib;kernel32.lib;user32.lib;gdi32.lib;winspool.lib;comdlg32.lib;advapi32.lib;shell32.lib;ole32.lib;oleaut32.lib;uuid.lib;odbc32.lib;odbccp32.lib;shlwapi.lib;directxtexDEBUG.lib;DXUTDEBUG.lib;Effects11DEBUG.lib;nvidiautilsDEBUG.lib;nvsimplemeshDEBUG.lib;assimp.lib;%(AdditionalDependencies)</AdditionalDependencies>
<OutputFile>$(OutDir)$(ProjectName)DEBUG.exe</OutputFile>
<AdditionalLibraryDirectories>../../../../PxShared/lib/vc15WIN32;$(WindowsSDK_LibraryPath_x86);./../../../externals/extensions/externals/lib/WIN32;./../../../externals/extensions/lib/WIN32;../../../../PxShared/lib/vc15win32;./../../../lib/vc15WIN32-PhysX_3.4;../../../../PhysX_3.4/Lib/vc15WIN32;%(AdditionalLibraryDirectories)</AdditionalLibraryDirectories>
<ProgramDatabaseFile>$(OutDir)/$(ProjectName)DEBUG.exe.pdb</ProgramDatabaseFile>
<SubSystem>Console</SubSystem>
<ImportLibrary>$(OutDir)$(TargetName).lib</ImportLibrary>
<GenerateDebugInformation>true</GenerateDebugInformation>
<TargetMachine>MachineX86</TargetMachine>
</Link>
<ResourceCompile>
</ResourceCompile>
<ProjectReference>
</ProjectReference>
</ItemDefinitionGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='release|Win32'">
<OutDir>./../../../bin/vc15win32-PhysX_3.4\</OutDir>
<IntDir>./build/Win32/SampleClothingHelloWorld/release\</IntDir>
<TargetExt>.exe</TargetExt>
<TargetName>$(ProjectName)</TargetName>
<CodeAnalysisRuleSet>AllRules.ruleset</CodeAnalysisRuleSet>
<CodeAnalysisRules />
<CodeAnalysisRuleAssemblies />
</PropertyGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='release|Win32'">
<ClCompile>
<FloatingPointModel>Precise</FloatingPointModel>
<AdditionalOptions>/wd4201 /wd4324 /Wall /wd4514 /wd4820 /wd4127 /wd4710 /wd4711 /wd4061 /wd4668 /wd4626 /wd4266 /wd4263 /wd4264 /wd4640 /wd4625 /wd4574 /wd4191 /wd4987 /wd4986 /wd4946 /wd4836 /wd4571 /wd4826 /wd4577 /wd4458 /wd4456 /wd4457 /wd4548 /wd5026 /wd5027 /wd4464 /wd5038 /wd5039 /wd4596 /wd4365 /wd4774 /wd4996 /wd5045 /GR- /GF /WX /fp:fast /arch:SSE2 /MP /Ox /fp:fast /WX- /Zi /Oi /Oy- /Gm- /EHsc /GS /Gd /nologo /wd4005 /wd4244 /d2Zi+</AdditionalOptions>
<Optimization>Disabled</Optimization>
<AdditionalIncludeDirectories>../../../../PxShared/include;../../../../PxShared/include/filebuf;../../../../PxShared/include/foundation;../../../../PxShared/include/task;../../../../PxShared/include/cudamanager;../../../../PxShared/include/pvd;../../../../PxShared/src/foundation/include;../../../../PxShared/src/filebuf/include;../../../../PxShared/src/fastxml/include;../../../../PxShared/src/pvd/include;./../../../include;./../../../include/PhysX3;./../../../include/basicios;./../../../include/clothing;./../../../include/destructible;./../../../include/emitter;./../../../include/particles;./../../../include/iofx;./../../../include/pxparticleios;../../../../PhysX_3.4/Include;./../../../shared/external/include;./../../../shared/general/shared;./../../../shared/general/RenderDebug/public;./../../../include;./../../../include/PhysX3;$(WindowsSDK_IncludePath);./../../../externals/extensions/externals/include/directxtex;./../../../externals/extensions/externals/include/dxut/Core;./../../../externals/extensions/externals/include/dxut/Optional;./../../../externals/extensions/externals/include/effects11;./../../../externals/extensions/externals/include/simpleopt;./../../../externals/extensions/include/nvidiautils;./../../../externals/extensions/include/nvsimplemesh;./../../../externals/extensions/externals/include;./../../../externals/extensions/externals/include/anttweakbar;./../../../externals/extensions/externals/include/assimp;./../../SampleBase;%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>
<PreprocessorDefinitions>WIN32;_CRT_SECURE_NO_DEPRECATE;_CRT_NONSTDC_NO_DEPRECATE;DISABLE_CUDA_PHYSX;_ALLOW_ITERATOR_DEBUG_LEVEL_MISMATCH;_ALLOW_RUNTIME_LIBRARY_MISMATCH;NDEBUG;APEX_SHIPPING;_SECURE_SCL=0;_ITERATOR_DEBUG_LEVEL=0;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<ExceptionHandling>Sync</ExceptionHandling>
<WarningLevel>Level3</WarningLevel>
<RuntimeLibrary>MultiThreaded</RuntimeLibrary>
<PrecompiledHeader>NotUsing</PrecompiledHeader>
<PrecompiledHeaderFile></PrecompiledHeaderFile>
<DebugInformationFormat>ProgramDatabase</DebugInformationFormat>
</ClCompile>
<Link>
<AdditionalOptions>/INCREMENTAL:NO PhysX3Common_x86.lib PhysX3_x86.lib PhysX3Cooking_x86.lib PhysX3Extensions.lib PxPvdSDK_x86.lib PxTask_x86.lib PxFoundation_x86.lib PsFastXml_x86.lib ApexFramework_x86.lib Apex_Legacy_x86.lib Apex_Destructible_x86.lib /SUBSYSTEM:WINDOWS /LARGEADDRESSAWARE /NOLOGO /OPT:REF /OPT:ICF /INCREMENTAL:NO</AdditionalOptions>
<AdditionalDependencies>xinput.lib;d3dcompiler.lib;d3d11.lib;dxguid.lib;dxgi.lib;winmm.lib;comctl32.lib;kernel32.lib;user32.lib;gdi32.lib;winspool.lib;comdlg32.lib;advapi32.lib;shell32.lib;ole32.lib;oleaut32.lib;uuid.lib;odbc32.lib;odbccp32.lib;shlwapi.lib;directxtex.lib;DXUT.lib;Effects11.lib;nvidiautils.lib;nvsimplemesh.lib;assimp.lib;%(AdditionalDependencies)</AdditionalDependencies>
<OutputFile>$(OutDir)$(ProjectName).exe</OutputFile>
<AdditionalLibraryDirectories>../../../../PxShared/lib/vc15WIN32;$(WindowsSDK_LibraryPath_x86);./../../../externals/extensions/externals/lib/WIN32;./../../../externals/extensions/lib/WIN32;../../../../PxShared/lib/vc15win32;./../../../lib/vc15WIN32-PhysX_3.4;../../../../PhysX_3.4/Lib/vc15WIN32;%(AdditionalLibraryDirectories)</AdditionalLibraryDirectories>
<ProgramDatabaseFile>$(OutDir)/$(ProjectName).exe.pdb</ProgramDatabaseFile>
<SubSystem>Console</SubSystem>
<ImportLibrary>$(OutDir)$(TargetName).lib</ImportLibrary>
<GenerateDebugInformation>true</GenerateDebugInformation>
<TargetMachine>MachineX86</TargetMachine>
</Link>
<ResourceCompile>
</ResourceCompile>
<ProjectReference>
</ProjectReference>
</ItemDefinitionGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='profile|Win32'">
<OutDir>./../../../bin/vc15win32-PhysX_3.4\</OutDir>
<IntDir>./build/Win32/SampleClothingHelloWorld/profile\</IntDir>
<TargetExt>.exe</TargetExt>
<TargetName>$(ProjectName)PROFILE</TargetName>
<CodeAnalysisRuleSet>AllRules.ruleset</CodeAnalysisRuleSet>
<CodeAnalysisRules />
<CodeAnalysisRuleAssemblies />
</PropertyGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='profile|Win32'">
<ClCompile>
<FloatingPointModel>Precise</FloatingPointModel>
<AdditionalOptions>/wd4201 /wd4324 /Wall /wd4514 /wd4820 /wd4127 /wd4710 /wd4711 /wd4061 /wd4668 /wd4626 /wd4266 /wd4263 /wd4264 /wd4640 /wd4625 /wd4574 /wd4191 /wd4987 /wd4986 /wd4946 /wd4836 /wd4571 /wd4826 /wd4577 /wd4458 /wd4456 /wd4457 /wd4548 /wd5026 /wd5027 /wd4464 /wd5038 /wd5039 /wd4596 /wd4365 /wd4774 /wd4996 /wd5045 /GR- /GF /WX /fp:fast /arch:SSE2 /MP /Ox /fp:fast /WX- /Zi /Oi /Oy- /Gm- /EHsc /GS /Gd /nologo /wd4005 /wd4244 /d2Zi+</AdditionalOptions>
<Optimization>Disabled</Optimization>
<AdditionalIncludeDirectories>../../../../PxShared/include;../../../../PxShared/include/filebuf;../../../../PxShared/include/foundation;../../../../PxShared/include/task;../../../../PxShared/include/cudamanager;../../../../PxShared/include/pvd;../../../../PxShared/src/foundation/include;../../../../PxShared/src/filebuf/include;../../../../PxShared/src/fastxml/include;../../../../PxShared/src/pvd/include;./../../../include;./../../../include/PhysX3;./../../../include/basicios;./../../../include/clothing;./../../../include/destructible;./../../../include/emitter;./../../../include/particles;./../../../include/iofx;./../../../include/pxparticleios;../../../../PhysX_3.4/Include;./../../../shared/external/include;./../../../shared/general/shared;./../../../shared/general/RenderDebug/public;./../../../include;./../../../include/PhysX3;$(WindowsSDK_IncludePath);./../../../externals/extensions/externals/include/directxtex;./../../../externals/extensions/externals/include/dxut/Core;./../../../externals/extensions/externals/include/dxut/Optional;./../../../externals/extensions/externals/include/effects11;./../../../externals/extensions/externals/include/simpleopt;./../../../externals/extensions/include/nvidiautils;./../../../externals/extensions/include/nvsimplemesh;./../../../externals/extensions/externals/include;./../../../externals/extensions/externals/include/anttweakbar;./../../../externals/extensions/externals/include/assimp;./../../SampleBase;%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>
<PreprocessorDefinitions>WIN32;_CRT_SECURE_NO_DEPRECATE;_CRT_NONSTDC_NO_DEPRECATE;DISABLE_CUDA_PHYSX;_ALLOW_ITERATOR_DEBUG_LEVEL_MISMATCH;_ALLOW_RUNTIME_LIBRARY_MISMATCH;NDEBUG;PHYSX_PROFILE_SDK;PX_SUPPORT_VISUAL_DEBUGGER;PX_PROFILE;PX_NVTX=1;_SECURE_SCL=0;_ITERATOR_DEBUG_LEVEL=0;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<ExceptionHandling>Sync</ExceptionHandling>
<WarningLevel>Level3</WarningLevel>
<RuntimeLibrary>MultiThreaded</RuntimeLibrary>
<PrecompiledHeader>NotUsing</PrecompiledHeader>
<PrecompiledHeaderFile></PrecompiledHeaderFile>
<DebugInformationFormat>ProgramDatabase</DebugInformationFormat>
</ClCompile>
<Link>
<AdditionalOptions>/INCREMENTAL:NO PhysX3CommonPROFILE_x86.lib PhysX3PROFILE_x86.lib PhysX3CookingPROFILE_x86.lib PhysX3ExtensionsPROFILE.lib PxPvdSDKPROFILE_x86.lib PxTaskPROFILE_x86.lib PxFoundationPROFILE_x86.lib PsFastXmlPROFILE_x86.lib ApexFrameworkPROFILE_x86.lib Apex_LegacyPROFILE_x86.lib Apex_DestructiblePROFILE_x86.lib /SUBSYSTEM:WINDOWS /LARGEADDRESSAWARE /NOLOGO /OPT:REF /OPT:ICF /INCREMENTAL:NO</AdditionalOptions>
<AdditionalDependencies>xinput.lib;d3dcompiler.lib;d3d11.lib;dxguid.lib;dxgi.lib;winmm.lib;comctl32.lib;kernel32.lib;user32.lib;gdi32.lib;winspool.lib;comdlg32.lib;advapi32.lib;shell32.lib;ole32.lib;oleaut32.lib;uuid.lib;odbc32.lib;odbccp32.lib;shlwapi.lib;directxtex.lib;DXUT.lib;Effects11.lib;nvidiautils.lib;nvsimplemesh.lib;assimp.lib;%(AdditionalDependencies)</AdditionalDependencies>
<OutputFile>$(OutDir)$(ProjectName)PROFILE.exe</OutputFile>
<AdditionalLibraryDirectories>../../../../PxShared/lib/vc15WIN32;$(WindowsSDK_LibraryPath_x86);./../../../externals/extensions/externals/lib/WIN32;./../../../externals/extensions/lib/WIN32;../../../../PxShared/lib/vc15win32;./../../../lib/vc15WIN32-PhysX_3.4;../../../../PhysX_3.4/Lib/vc15WIN32;%(AdditionalLibraryDirectories)</AdditionalLibraryDirectories>
<ProgramDatabaseFile>$(OutDir)/$(ProjectName)PROFILE.exe.pdb</ProgramDatabaseFile>
<SubSystem>Console</SubSystem>
<ImportLibrary>$(OutDir)$(TargetName).lib</ImportLibrary>
<GenerateDebugInformation>true</GenerateDebugInformation>
<TargetMachine>MachineX86</TargetMachine>
</Link>
<ResourceCompile>
</ResourceCompile>
<ProjectReference>
</ProjectReference>
</ItemDefinitionGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='checked|Win32'">
<OutDir>./../../../bin/vc15win32-PhysX_3.4\</OutDir>
<IntDir>./build/Win32/SampleClothingHelloWorld/checked\</IntDir>
<TargetExt>.exe</TargetExt>
<TargetName>$(ProjectName)CHECKED</TargetName>
<CodeAnalysisRuleSet>AllRules.ruleset</CodeAnalysisRuleSet>
<CodeAnalysisRules />
<CodeAnalysisRuleAssemblies />
</PropertyGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='checked|Win32'">
<ClCompile>
<FloatingPointModel>Precise</FloatingPointModel>
<AdditionalOptions>/wd4201 /wd4324 /Wall /wd4514 /wd4820 /wd4127 /wd4710 /wd4711 /wd4061 /wd4668 /wd4626 /wd4266 /wd4263 /wd4264 /wd4640 /wd4625 /wd4574 /wd4191 /wd4987 /wd4986 /wd4946 /wd4836 /wd4571 /wd4826 /wd4577 /wd4458 /wd4456 /wd4457 /wd4548 /wd5026 /wd5027 /wd4464 /wd5038 /wd5039 /wd4596 /wd4365 /wd4774 /wd4996 /wd5045 /GR- /GF /WX /fp:fast /arch:SSE2 /MP /Ox /fp:fast /WX- /Zi /Oi /Oy- /Gm- /EHsc /GS /Gd /nologo /wd4005 /wd4244 /d2Zi+</AdditionalOptions>
<Optimization>Disabled</Optimization>
<AdditionalIncludeDirectories>../../../../PxShared/include;../../../../PxShared/include/filebuf;../../../../PxShared/include/foundation;../../../../PxShared/include/task;../../../../PxShared/include/cudamanager;../../../../PxShared/include/pvd;../../../../PxShared/src/foundation/include;../../../../PxShared/src/filebuf/include;../../../../PxShared/src/fastxml/include;../../../../PxShared/src/pvd/include;./../../../include;./../../../include/PhysX3;./../../../include/basicios;./../../../include/clothing;./../../../include/destructible;./../../../include/emitter;./../../../include/particles;./../../../include/iofx;./../../../include/pxparticleios;../../../../PhysX_3.4/Include;./../../../shared/external/include;./../../../shared/general/shared;./../../../shared/general/RenderDebug/public;./../../../include;./../../../include/PhysX3;$(WindowsSDK_IncludePath);./../../../externals/extensions/externals/include/directxtex;./../../../externals/extensions/externals/include/dxut/Core;./../../../externals/extensions/externals/include/dxut/Optional;./../../../externals/extensions/externals/include/effects11;./../../../externals/extensions/externals/include/simpleopt;./../../../externals/extensions/include/nvidiautils;./../../../externals/extensions/include/nvsimplemesh;./../../../externals/extensions/externals/include;./../../../externals/extensions/externals/include/anttweakbar;./../../../externals/extensions/externals/include/assimp;./../../SampleBase;%(AdditionalIncludeDirectories)</AdditionalIncludeDirectories>
<PreprocessorDefinitions>WIN32;_CRT_SECURE_NO_DEPRECATE;_CRT_NONSTDC_NO_DEPRECATE;DISABLE_CUDA_PHYSX;_ALLOW_ITERATOR_DEBUG_LEVEL_MISMATCH;_ALLOW_RUNTIME_LIBRARY_MISMATCH;NDEBUG;PX_CHECKED;PHYSX_PROFILE_SDK;PX_SUPPORT_VISUAL_DEBUGGER;PX_ENABLE_CHECKED_ASSERTS;PX_NVTX=1;_SECURE_SCL=0;_ITERATOR_DEBUG_LEVEL=0;%(PreprocessorDefinitions)</PreprocessorDefinitions>
<ExceptionHandling>Sync</ExceptionHandling>
<WarningLevel>Level3</WarningLevel>
<RuntimeLibrary>MultiThreaded</RuntimeLibrary>
<PrecompiledHeader>NotUsing</PrecompiledHeader>
<PrecompiledHeaderFile></PrecompiledHeaderFile>
<DebugInformationFormat>ProgramDatabase</DebugInformationFormat>
</ClCompile>
<Link>
<AdditionalOptions>/INCREMENTAL:NO PhysX3CommonCHECKED_x86.lib PhysX3CHECKED_x86.lib PhysX3CookingCHECKED_x86.lib PhysX3ExtensionsCHECKED.lib PxPvdSDKCHECKED_x86.lib PxTaskCHECKED_x86.lib PxFoundationCHECKED_x86.lib PsFastXmlCHECKED_x86.lib ApexFrameworkCHECKED_x86.lib Apex_LegacyCHECKED_x86.lib Apex_DestructibleCHECKED_x86.lib /SUBSYSTEM:WINDOWS /LARGEADDRESSAWARE /NOLOGO /OPT:REF /OPT:ICF /INCREMENTAL:NO</AdditionalOptions>
<AdditionalDependencies>xinput.lib;d3dcompiler.lib;d3d11.lib;dxguid.lib;dxgi.lib;winmm.lib;comctl32.lib;kernel32.lib;user32.lib;gdi32.lib;winspool.lib;comdlg32.lib;advapi32.lib;shell32.lib;ole32.lib;oleaut32.lib;uuid.lib;odbc32.lib;odbccp32.lib;shlwapi.lib;directxtex.lib;DXUT.lib;Effects11.lib;nvidiautils.lib;nvsimplemesh.lib;assimp.lib;%(AdditionalDependencies)</AdditionalDependencies>
<OutputFile>$(OutDir)$(ProjectName)CHECKED.exe</OutputFile>
<AdditionalLibraryDirectories>../../../../PxShared/lib/vc15WIN32;$(WindowsSDK_LibraryPath_x86);./../../../externals/extensions/externals/lib/WIN32;./../../../externals/extensions/lib/WIN32;../../../../PxShared/lib/vc15win32;./../../../lib/vc15WIN32-PhysX_3.4;../../../../PhysX_3.4/Lib/vc15WIN32;%(AdditionalLibraryDirectories)</AdditionalLibraryDirectories>
<ProgramDatabaseFile>$(OutDir)/$(ProjectName)CHECKED.exe.pdb</ProgramDatabaseFile>
<SubSystem>Console</SubSystem>
<ImportLibrary>$(OutDir)$(TargetName).lib</ImportLibrary>
<GenerateDebugInformation>true</GenerateDebugInformation>
<TargetMachine>MachineX86</TargetMachine>
</Link>
<ResourceCompile>
</ResourceCompile>
<ProjectReference>
</ProjectReference>
</ItemDefinitionGroup>
<ItemGroup>
<ClCompile Include="..\..\SampleClothingHelloWorld\Main.cpp">
</ClCompile>
<ClCompile Include="..\..\SampleClothingHelloWorld\SampleSceneController.cpp">
</ClCompile>
<ClCompile Include="..\..\SampleClothingHelloWorld\SampleUIController.cpp">
</ClCompile>
<ClInclude Include="..\..\SampleClothingHelloWorld\SampleSceneController.h">
</ClInclude>
<ClInclude Include="..\..\SampleClothingHelloWorld\SampleUIController.h">
</ClInclude>
</ItemGroup>
<ItemGroup>
<ProjectReference Include="./../../../externals/extensions/externals/build/vs2017WIN32/DirectXTex.vcxproj">
<ReferenceOutputAssembly>false</ReferenceOutputAssembly>
</ProjectReference>
</ItemGroup>
<ItemGroup>
<ProjectReference Include="./../../../externals/extensions/externals/build/vs2017WIN32/DXUT.vcxproj">
<ReferenceOutputAssembly>false</ReferenceOutputAssembly>
</ProjectReference>
</ItemGroup>
<ItemGroup>
<ProjectReference Include="./../../../externals/extensions/externals/build/vs2017WIN32/Effects11.vcxproj">
<ReferenceOutputAssembly>false</ReferenceOutputAssembly>
</ProjectReference>
</ItemGroup>
<ItemGroup>
<ProjectReference Include="./../../../externals/extensions/build/vs2017WIN32/nvsimplemesh.vcxproj">
<ReferenceOutputAssembly>false</ReferenceOutputAssembly>
</ProjectReference>
</ItemGroup>
<ItemGroup>
<ProjectReference Include="./../../../externals/extensions/build/vs2017WIN32/nvidiautils.vcxproj">
<ReferenceOutputAssembly>false</ReferenceOutputAssembly>
</ProjectReference>
</ItemGroup>
<ItemGroup>
<ProjectReference Include="./SampleBase.vcxproj">
<ReferenceOutputAssembly>false</ReferenceOutputAssembly>
</ProjectReference>
</ItemGroup>
<Import Project="$(VCTargetsPath)\Microsoft.Cpp.targets" />
<ImportGroup Label="ExtensionTargets"></ImportGroup>
</Project>
```
|
Gebhard I (died 27 March 1023), known as Gebhard of Swabia, was the Bishop of Regensburg from 994 until his death.
Following the death of Bishop Wolfgang, the cathedral canons elected Tagino to replace him, with the support of Henry II, Duke of Bavaria. Otto III, however, ignored the election and appointed his own royal chaplain, Gebhard, instead; he then took Tagino into his royal chapel.
During his episcopate, he founded Prüll Abbey and tried to reverse the separation between the property of the diocese and that of St. Emmeram's Abbey, which his predecessor had effected. This gave rise to much dispute. In 996, Otto heard Abbot Ramwold's complaint and summoned Gebhard, whom he made promise not to confiscate further property from St. Emmeram's; Otto also placed the monastery under royal protection. Gebhard nevertheless remained in conflict over financial matters into the reign of the Emperor Henry II.
Gebhard also gained the right of coinage from Otto III. On his death, he was succeeded by Gebhard II.
Sources
Bernhardt, John W. (1993). Itinerant Kingship and Royal Monasteries in Early Medieval Germany, c. 936–1075. Cambridge: Cambridge University Press.
References
11th-century Roman Catholic bishops in Bavaria
1023 deaths
Roman Catholic bishops of Regensburg
10th-century births
|
```csharp
/*
*
*
* path_to_url
*
* Unless required by applicable law or agreed to in writing, software
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
*/
/*
* These authors would like to acknowledge the Spanish Ministry of Industry,
* Tourism and Trade, for the support in the project TSI020301-2008-2
* "PIRAmIDE: Personalizable Interactions with Resources on AmI-enabled
* Mobile Dynamic Environments", led by Treelogic
* ( path_to_url ):
*
* path_to_url
*/
using System;
using NUnit.Framework;
using ZXing.Common;
using ZXing.OneD.RSS.Expanded.Test;
namespace ZXing.OneD.RSS.Expanded.Decoders.Test
{
/// <summary>
/// <author>Pablo Orduña, University of Deusto (pablo.orduna@deusto.es)</author>
/// </summary>
[TestFixture]
public abstract class AbstractDecoderTest
{
protected static String numeric_10 = "..X..XX";
protected static String numeric_12 = "..X.X.X";
protected static String numeric_1FNC1 = "..XXX.X";
//protected static String numeric_FNC11 = "XXX.XXX";
protected static String numeric2alpha = "....";
protected static String alpha_A = "X.....";
protected static String alpha_FNC1 = ".XXXX";
protected static String alpha2numeric = "...";
protected static String alpha2isoiec646 = "..X..";
protected static String i646_B = "X.....X";
protected static String i646_C = "X....X.";
protected static String i646_FNC1 = ".XXXX";
protected static String isoiec646_2alpha = "..X..";
protected static String compressedGtin_900123456798908 = ".........X..XXX.X.X.X...XX.XXXXX.XXXX.X.";
protected static String compressedGtin_900000000000008 = "........................................";
protected static String compressed15bitWeight_1750 = "....XX.XX.X.XX.";
protected static String compressed15bitWeight_11750 = ".X.XX.XXXX..XX.";
protected static String compressed15bitWeight_0 = "...............";
protected static String compressed20bitWeight_1750 = ".........XX.XX.X.XX.";
protected static String compressedDate_March_12th_2010 = "....XXXX.X..XX..";
protected static String compressedDate_End = "X..X.XX.........";
protected static void assertCorrectBinaryString(String binaryString, String expectedNumber)
{
BitArray binary = BinaryUtil.buildBitArrayFromStringWithoutSpaces(binaryString);
AbstractExpandedDecoder decoder = AbstractExpandedDecoder.createDecoder(binary);
String result = decoder.parseInformation();
Assert.AreEqual(expectedNumber, result);
}
}
}
```
|
An HVDC converter station (or simply converter station) is a specialised type of substation which forms the terminal equipment for a high-voltage direct current (HVDC) transmission line. It converts direct current to alternating current or the reverse. In addition to the converter, the station usually contains:
three-phase alternating current switch gear
transformers
capacitors or synchronous condensers for reactive power
filters for harmonic suppression, and
direct current switch gear.
Components
Converter
The converter is usually installed in a building called the valve hall. Early HVDC systems used mercury-arc valves, but since the mid-1970s, solid state devices such as thyristors have been used. Converters using thyristors or mercury-arc valves are known as line commutated converters. In thyristor-based converters, many thyristors are connected in series to form a thyristor valve, and each converter normally consists of six or twelve thyristor valves. The thyristor valves are usually grouped in pairs or groups of four and can stand on insulators on the floor or hang from insulators from the ceiling.
Line commutated converters require voltage from the AC network for commutation, but since the late 1990s, voltage sourced converters have started to be used for HVDC. Voltage sourced converters use insulated-gate bipolar transistors instead of thyristors, and these can provide power to a deenergized AC system.
Almost all converters used for HVDC are intrinsically able to operate with power conversion in either direction. Power conversion from AC to DC is called rectification and conversion from DC to AC is called inversion.
DC equipment
The direct current equipment often includes a coil (called a reactor) that adds inductance in series with the DC line to help smooth the direct current. The inductance typically amounts to between 0.1 H and 1 H. The smoothing reactor can have either an air-core or an iron-core. Iron-core coils look like oil-filled high voltage transformers. Air-core smoothing coils resemble, but are considerably larger than, carrier frequency choke coils in high voltage transmission lines and are supported by insulators. Air coils have the advantage of generating less acoustical noise than iron-core coils, they eliminate the potential environmental hazard of spilled oil, and they do not saturate under transient high current fault conditions. This part of the plant will also contain instruments for measurement of direct current and voltage.
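To see why a reactor in this range smooths the current, compare its impedance at the dominant ripple frequency with what it presents to direct current (essentially just the small winding resistance). The sketch below is illustrative only: the 600 Hz ripple frequency assumes a 12-pulse converter on a 50 Hz network, and the inductance values are simply the 0.1 H to 1 H range quoted above.

```python
import math

def reactor_impedance(inductance_h, ripple_freq_hz):
    """Magnitude of the inductive reactance X_L = 2*pi*f*L, in ohms."""
    return 2 * math.pi * ripple_freq_hz * inductance_h

# Dominant DC-side ripple of a 12-pulse converter on a 50 Hz network
ripple_hz = 12 * 50  # 600 Hz

for L in (0.1, 1.0):  # the 0.1 H .. 1 H range quoted above
    x_l = reactor_impedance(L, ripple_hz)
    print(f"L = {L} H -> X_L = {x_l:.0f} ohm at {ripple_hz} Hz")
```

The reactor thus opposes the ripple with hundreds to thousands of ohms of reactance while passing the DC load current almost unimpeded.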
Special direct current filters are used to eliminate high frequency interference. Such filters are required if the transmission line will use power-line communication techniques for communication and control, or if the overhead line will run through populated areas. These filters can be passive LC filters or active filters, consisting of an amplifier coupled through transformers and protection capacitors, which gives a signal out of phase to the interference signal on the line, thereby cancelling it. Such a system was used on the Baltic Cable HVDC project.
Converter transformer
The converter transformers step up the voltage of the AC supply network. Using a star-to-delta or "wye-delta" connection of the transformer windings, the converter can operate with 12 pulses for each cycle in the AC supply, which eliminates numerous harmonic current components. The insulation of the transformer windings must be specially designed to withstand a large DC potential to earth. Converter transformers can be built as large as 300 megavolt-amperes (MVA) as a single unit. It is impractical to transport larger transformers, so when larger ratings are required, several individual transformers are connected together. Either two three-phase units or three single-phase units can be used. With the latter variant only one type of transformer is used, making the supply of a spare transformer more economical.
Converter transformers operate with high flux steps several times per cycle, and so produce more acoustic noise than normal three-phase power transformers. This effect should be considered in the siting of an HVDC converter station. Noise-reducing enclosures may be applied.
Reactive power
When line commutated converters are used, the converter station will require between 40% and 60% of its power rating as reactive power. This can be provided by banks of switched capacitors or by synchronous condensers, or, if a suitable power generating station is located close to the static inverter plant, by the generators in that station. The demand for reactive power can be reduced if the converter transformers have on-load tap changers with a sufficient range of taps for AC voltage control. Some of the reactive power requirement can be supplied by the harmonic filter components.
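The 40–60% figure translates directly into the size of the compensation plant. A back-of-the-envelope sketch (the 600 MW rating here is an assumed example, not a value from the text):

```python
def reactive_power_demand(rated_mw, fraction_low=0.4, fraction_high=0.6):
    """Return the (low, high) Mvar range a line commutated converter
    of the given MW rating typically needs as reactive power."""
    return rated_mw * fraction_low, rated_mw * fraction_high

low, high = reactive_power_demand(600)
print(f"A 600 MW LCC terminal needs roughly {low:.0f}-{high:.0f} Mvar of compensation")
```

This is why capacitor banks and filter banks dominate the switchyard of a line commutated converter station, and why voltage sourced converter stations, which need no such compensation, have a much smaller footprint.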
Voltage sourced converters can generate or absorb reactive as well as real power, and additional reactive power equipment is generally not needed.
Harmonic filters
Harmonic filters are necessary for the elimination of harmonic waves and for the production of reactive power at line commutated converter stations. At plants with six-pulse line commutated converters, complex harmonic filters are necessary because harmonics of order 6n ± 1 are produced on the AC side and even harmonics of order 6n on the DC side. At 12-pulse converter stations, only harmonic voltages or currents of order 12n ± 1 (on the AC side) or 12n (on the DC side) result. Filters are tuned to the expected harmonic frequencies and consist of series combinations of capacitors and inductors.
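The characteristic harmonic orders of a p-pulse line commutated converter follow a simple pattern — pn ± 1 on the AC side and pn on the DC side, a standard textbook result sketched here rather than taken from any specific installation:

```python
def ac_side_harmonics(pulse_number, n_max=3):
    """Characteristic AC-side current harmonics of a p-pulse converter: p*n +/- 1."""
    orders = []
    for n in range(1, n_max + 1):
        orders += [pulse_number * n - 1, pulse_number * n + 1]
    return orders

def dc_side_harmonics(pulse_number, n_max=3):
    """Characteristic DC-side voltage harmonics of a p-pulse converter: p*n."""
    return [pulse_number * n for n in range(1, n_max + 1)]

print(ac_side_harmonics(6))   # 6-pulse:  [5, 7, 11, 13, 17, 19]
print(ac_side_harmonics(12))  # 12-pulse: [11, 13, 23, 25, 35, 37]
print(dc_side_harmonics(12))  # 12-pulse: [12, 24, 36]
```

Comparing the two AC-side lists shows why twelve-pulse operation simplifies filtering: the low-order 5th and 7th harmonics, which would require the largest filters, cancel between the wye- and delta-connected transformer groups.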
Voltage sourced converters generally produce lower intensity harmonics than line commutated converters. As a result, harmonic filters are generally smaller or may be omitted altogether.
Beside the harmonic filters, equipment is also provided to eliminate spurious signals in the frequency range of power-line carrier equipment in the range of 30 kHz to 500 kHz. These filters are usually near the alternating current terminal of the static inverter transformer. They consist of a coil which passes the load current, with a parallel capacitor to form a resonant circuit.
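A parallel LC trap of this kind presents a high impedance at its resonant frequency f0 = 1/(2π√(LC)), blocking carrier-band interference while passing the power-frequency load current through the coil. The component values below are assumptions chosen only to land inside the 30 kHz to 500 kHz carrier band, not values from a real installation:

```python
import math

def resonant_frequency_hz(inductance_h, capacitance_f):
    """Resonant frequency of an LC circuit: f0 = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance_h * capacitance_f))

# Assumed component values for illustration
L = 25.33e-6   # 25.33 uH line trap coil
C = 100e-9     # 100 nF tuning capacitor

f0 = resonant_frequency_hz(L, C)
print(f"f0 = {f0 / 1e3:.1f} kHz")  # lands near 100 kHz, inside the PLC band
```

At 50 Hz the same coil contributes only milliohms of reactance, which is why it can sit in series with the line without disturbing power transmission.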
In special cases, it may be possible to use rotating machines exclusively for generating the reactive power. This was realized at the Volgograd terminal of the HVDC Volgograd–Donbass scheme, which is situated at the Volga Hydroelectric Station.
AC switchgear
The three-phase alternating current switch gear of a converter station is similar to that of an AC substation. It will contain circuit breakers for overcurrent protection of the converter transformers, isolating switches, grounding switches, and instrument transformers for control, measurement and protection. The station will also have lightning arresters for protection of the AC equipment from lightning surges on the AC system.
Others
Required area
The area required for a converter station is much larger than that of a conventional AC substation; for example, a site with a transmission rating of 600 megawatts and a transmission voltage of 400 kV is approximately 300 × 300 metres (1000 × 1000 feet). Lower-voltage plants may require somewhat less ground area, since less air space clearance would be required around outdoor high-voltage equipment.
Location factors
Converter stations produce acoustic noise and can generate serious levels of radio-frequency interference, so they include design features to control these emissions. Walls may provide noise protection. As with all AC substations, oil from equipment must be prevented from contaminating ground water in case of a spill. A substantial area may be required for overhead transmission lines, but this can be reduced if underground cable is used.
See also
List of HVDC projects
Rotary converter plant
High-voltage transformer fire barriers
References
Converter station
Electric power conversion
|
In music pantonality may refer to:
Twelve-tone music, seen as an extension of tonality to all keys (rather than to no key)
Nonfunctional tonality or pandiatonicism
See also
Bitonality
|
Acetobacter malorum is a bacterium. Its type strain is LMG 1746T (= DSM 14337T).
References
Further reading
Sagarzazu, Noelia Isabel, et al. "Optimization of denaturing high performance liquid chromatography technique for rapid detection and identification of acetic acid bacteria of interest in vinegar production." Acetic Acid Bacteria 2.1s (2013): e5.
External links
LPSN
Type strain of Acetobacter malorum at BacDive - the Bacterial Diversity Metadatabase
Rhodospirillales
Bacteria described in 2002
|
is a Japanese manga series written and illustrated by Kagiji Kumanomata. It has been serialized in Shogakukan's Weekly Shōnen Sunday magazine since May 2016, with its chapters collected in twenty-five tankōbon volumes as of July 2023. An anime television series adaptation produced by Doga Kobo aired from October to December 2020.
Plot
The story follows Princess Syalis, a young princess who was kidnapped by the Demon King, and her quest to sleep well while imprisoned. Despite the Demon King's best attempts, the princess turns her kidnapping into a vacation: having grown tired of the stress of royal life, she is indifferent to her situation.
Characters
Princess Syalis is a princess who was kidnapped by the demon king and causes chaos in the demon castle while trying to get a good night's sleep. She has been living at the demon king's castle for four years now.
He kidnapped Syalis in her sleep, so he has hardly ever seen her awake and has never had a proper conversation with her. He is always thinking of ways to get the hero to reach him and worries about whether he will lose; he even leaves weapons and tools around to make the hero's quest easier. Apart from that, he is a good leader for the demons of the castle, but even he cannot stand up to Princess Syalis, and is shown to care for her to a certain extent.
Demon Cleric might seem like a gentle young man, but he is actually the demon in charge of the Demon Temple inside the Demon Castle. The Demon Cleric is usually the one called upon to use magic to resurrect anyone who happens to die in the Demon Castle.
Dawner is the hero entrusted by the Human Confederation of Goodereste with the task of saving Princess Syalis from the demon castle. Although Dawner and Princess Syalis are childhood friends and also engaged, she does not remember who he is; whenever she mentions Dawner to the denizens of the demon castle, she calls him "D-Whatsit". Because Dawner was always outgoing and friendly towards Princess Syalis, he constantly disturbed her slumber, which is why she has nightmares whenever she dreams about him. He is also known for his poor sense of direction, which explains why Princess Syalis has remained captive for four years.
Loyal retainer to the Demon Lord, in charge of enforcing rules. The princess drives him batty with her blatant disregard for the fact that she is supposed to be a hostage. Later in the series, he notes that the princess is costing them a fortune with all her misadventures, adding to the woe everyone endures.
He and his subordinates are the principal victims of Princess Syalis's antics, especially when she needs cloth to make things that help her sleep. The princess's repeated killing of the Ghost Shrouds is a running joke in the series.
A demon who has dominion over water. Because Poseidon never wears a shirt, the princess labels him a nudist.
A gentle plant demon who has motherly concern for the princess. While willing to overlook all the deaths the Cleric is forced to undo, Neo Alraune became "miffed" upon seeing her tree-trunk brother killed as part of one of the princess's usual ridiculous sleep plans.
Spirit of a forbidden grimoire the princess freed by sheer dumb luck. Alazif tries encouraging Syalis to use his powers to defeat the demons and leave the castle; however, he is left severely disappointed with her due to her preoccupation with sleeping.
A demon with an arm made of scissors. Syalis gave him her crown to keep his arm together in exchange for his lost scissors; unfortunately leading to her routine of killing the ghost shrouds.
A harpy who believes that the princess considers her a friend, owing to a misunderstanding when Syalis wished to nap on Harpy's feathers.
Twilight's older brother. Unlike his clumsy brother, Hades is the classic embodiment of a Demon King, always looking for a chance to usurp the throne should Twilight mess up badly enough.
A sleep demon; he must sleep 20 hours a day or risk death. Hypnos was called to examine Syalis's nightmare at one point, revealing to the demons just how much the princess loathes Dawner (the source of the nightmare).
A succubus who looks eerily like the princess, except that she has red hair and gold eyes. Cubey wished to learn why the princess was so popular despite their identical looks, but got suckered into being trained as her double.
Queen of Goodereste, a kingdom of humankind, and mother of Princess Syalis.
Media
Manga
Sleepy Princess in the Demon Castle is written and illustrated by Kagiji Kumanomata. The series started in Shogakukan's Weekly Shōnen Sunday on May 11, 2016. Shogakukan has collected its chapters into individual tankōbon volumes. The first volume was released on September 16, 2016. As of July 18, 2023, twenty-five volumes have been released.
Viz Media announced in September 2017 that they had licensed the series in North America. The first volume was released on June 12, 2018. On May 9, 2023, Viz Media launched their Viz Manga digital manga service, with the series' chapters receiving simultaneous English publication in North America as they are released in Japan.
An official fanbook of the series was released on October 16, 2020.
Volumes
Anime
An anime television series adaptation was announced in Weekly Shōnen Sunday in September 2019. The series was animated by Doga Kobo and directed by Mitsue Yamazaki, with Yoshiko Nakamura handling series composition and Ai Kikuchi designing the characters. Yukari Hashimoto composed the series' music. The series ran for 12 episodes from October 6 to December 22, 2020, on TV Tokyo, AT-X, and BS TV Tokyo. Inori Minase performed the opening theme , while ORESAMA performed the ending theme "Gimmme!".
Funimation acquired the series and streamed it on its website in North America and the British Isles. On February 21, 2021, Funimation announced the series would receive an English dub, with the first episode premiering the next day. Following Sony's acquisition of Crunchyroll, the series was moved to Crunchyroll. In Southeast Asia and South Asia, the anime is licensed by Muse Communication, and released on Bilibili in Southeast Asia. The series premiered on Animax Asia on May 12, 2021.
Episodes
Reception
In 2017, the series was ranked 16th at the third Next Manga Awards in the print category.
Notes
References
External links
Adventure anime and manga
Anime series based on manga
Comedy anime and manga
Crunchyroll anime
Demons in anime and manga
Doga Kobo
Fantasy anime and manga
Muse Communication
Shogakukan manga
Shōnen manga
Slice of life anime and manga
TV Tokyo original programming
Viz Media manga
Works set in castles
|
```vue
<template>
<div>
<divider>{{ $t('Simple card with header and content') }}</divider>
<card :header="{title: $t('My wallet')}">
<div slot="content" class="card-demo-flex card-demo-content01">
<div class="vux-1px-r">
<span>1130</span>
<br/>
{{ $t('Point') }}
</div>
<div class="vux-1px-r">
<span>15</span>
<br/>
{{ $t('Coupon') }}
</div>
<div class="vux-1px-r">
<span>0</span>
<br/>
{{ $t('Gift card') }}
</div>
<div>
<span>88</span>
<br/>
{{ $t('Cash') }}
</div>
</div>
</card>
<br>
<divider>{{ $t('With footer') }}</divider>
<card :header="{title: $t('Product details') }" :footer="{title: $t('More'),link:'/component/panel'}">
<p slot="content" class="card-padding">{{ $t('Custom content') }}</p>
</card>
<br>
<divider>{{ $t('Use header slot and content slot') }}</divider>
<card>
<img slot="header" src="path_to_url" style="width:100%;display:block;">
<div slot="content" class="card-padding">
<p style="color:#999;font-size:12px;">Posted on January 21, 2015</p>
<p style="font-size:14px;line-height:1.2;">Quisque eget vestibulum nulla. Quisque quis dui quis ex ultricies efficitur vitae non felis. Phasellus quis nibh hendrerit..</p>
</div>
</card>
</div>
</template>
<i18n>
Simple card with header and content:
zh-CN:
My wallet:
zh-CN:
Point:
zh-CN:
Coupon:
zh-CN:
Gift card:
zh-CN:
Cash:
zh-CN:
With footer:
zh-CN: footer
Product details:
zh-CN:
More:
zh-CN:
Custom content:
zh-CN:
Use header slot and content slot:
zh-CN: header slot content slot
</i18n>
<script>
import { Divider, Card } from 'vux'
export default {
components: {
Card,
Divider
}
}
</script>
<style scoped lang="less">
@import '~vux/src/styles/1px.less';
.card-demo-flex {
display: flex;
}
.card-demo-content01 {
padding: 10px 0;
}
.card-padding {
padding: 15px;
}
.card-demo-flex > div {
flex: 1;
text-align: center;
font-size: 12px;
}
.card-demo-flex span {
color: #f74c31;
}
</style>
```
|
```javascript
import { newRxError } from "../../rx-error.js";
import { ensureNotFalsy } from "../utils/index.js";
export function ensureSchemaSupportsAttachments(doc) {
var schemaJson = doc.collection.schema.jsonSchema;
if (!schemaJson.attachments) {
throw newRxError('AT1', {
link: 'path_to_url'
});
}
}
export function assignMethodsToAttachment(attachment) {
Object.entries(attachment.doc.collection.attachments).forEach(([funName, fun]) => {
Object.defineProperty(attachment, funName, {
get: () => fun.bind(attachment)
});
});
}
/**
* Fill up the missing attachment.data of the newDocument
* so that the new document can be sent to somewhere else
* which could then receive all required attachments data
* that it did not have before.
*/
export async function fillWriteDataForAttachmentsChange(primaryPath, storageInstance, newDocument, originalDocument) {
if (!newDocument._attachments || originalDocument && !originalDocument._attachments) {
throw new Error('_attachments missing');
}
var docId = newDocument[primaryPath];
var originalAttachmentsIds = new Set(originalDocument && originalDocument._attachments ? Object.keys(originalDocument._attachments) : []);
await Promise.all(Object.entries(newDocument._attachments).map(async ([key, value]) => {
if ((!originalAttachmentsIds.has(key) || originalDocument && ensureNotFalsy(originalDocument._attachments)[key].digest !== value.digest) && !value.data) {
var attachmentDataString = await storageInstance.getAttachmentData(docId, key, value.digest);
value.data = attachmentDataString;
}
}));
return newDocument;
}
//# sourceMappingURL=attachments-utils.js.map
```
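The digest-comparison rule in `fillWriteDataForAttachmentsChange` above — fetch binary data only for attachments that are new or whose digest changed, and only when inline `data` is missing — can be sketched language-neutrally. The following Python sketch illustrates just that selection logic; the function name and dict shapes are hypothetical, not RxDB's API:

```python
def attachments_needing_data(new_attachments, original_attachments):
    """Return the attachment keys whose binary data must be (re)fetched.

    An attachment needs data when it is absent from the original document
    or its digest changed, and no inline `data` is already present.
    """
    needed = []
    for key, meta in new_attachments.items():
        is_new = key not in original_attachments
        changed = (not is_new and
                   original_attachments[key]["digest"] != meta["digest"])
        if (is_new or changed) and not meta.get("data"):
            needed.append(key)
    return needed

new = {
    "a.txt": {"digest": "d1"},               # unchanged, data missing -> skip
    "b.txt": {"digest": "d9"},               # digest changed -> fetch
    "c.txt": {"digest": "d3", "data": "x"},  # new but data present -> skip
    "d.txt": {"digest": "d4"},               # new, data missing -> fetch
}
orig = {"a.txt": {"digest": "d1"}, "b.txt": {"digest": "d2"}}
print(attachments_needing_data(new, orig))  # ['b.txt', 'd.txt']
```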
|
```c
/**
* @brief Macros for power gating memory banks specific for ACE 1.0
*/
#ifndef __ZEPHYR_ACE_LIB_ASM_MEMORY_MANAGEMENT_H__
#define __ZEPHYR_ACE_LIB_ASM_MEMORY_MANAGEMENT_H__
#ifdef _ASMLANGUAGE
/* These definitions should be placed elsewhere, but I can't find a good place for them. */
#define LSPGCTL 0x71D80
#define LSPGCTL_HIGH ((LSPGCTL >> 4) & 0xff00)
#define LSPGCTL_LOW ((LSPGCTL >> 4) & 0xff)
#define MAX_MEMORY_SEGMENTS 1
#define EBB_SEGMENT_SIZE 32
#define PLATFORM_HPSRAM_EBB_COUNT 22
.macro m_ace_hpsram_power_change segment_index, mask, ax, ay, az, au, aw
.if \segment_index == 0
.if EBB_SEGMENT_SIZE > PLATFORM_HPSRAM_EBB_COUNT
.set i_end, PLATFORM_HPSRAM_EBB_COUNT
.else
.set i_end, EBB_SEGMENT_SIZE
.endif
.elseif PLATFORM_HPSRAM_EBB_COUNT >= EBB_SEGMENT_SIZE
.set i_end, PLATFORM_HPSRAM_EBB_COUNT - EBB_SEGMENT_SIZE
.else
.err
.endif
rsr.sar \aw /* store old sar value */
/* SHIM_HSPGCTL(ebb_index): 0x17a800 >> 11 == 0x2f5 */
movi \az, 0x2f5
slli \az, \az, 0xb
/* 8 * (\segment_index << 5) == (\segment_index << 5) << 3 == \segment_index << 8 */
addmi \az, \az, \segment_index << 8
movi \au, i_end - 1 /* au = banks count in segment */
2:
/* au = current bank in segment */
mov \ax, \mask /* ax = mask */
ssr \au
srl \ax, \ax /* ax >>= current bank */
extui \ax, \ax, 0, 1 /* ax &= BIT(0) */
s8i \ax, \az, 0 /* HSxPGCTL.l2lmpge = ax */
memw
1:
l8ui \ay, \az, 4 /* ay = HSxPGISTS.l2lmpgis */
bne \ax, \ay, 1b /* wait till status==request */
addi \az, \az, 8
addi \au, \au, -1
bnez \au, 2b
wsr.sar \aw
.endm
.macro m_ace_lpsram_power_down_entire ax, ay, az, au
movi \au, 8 /* LPSRAM_EBB_QUANTITY */
movi \az, LSPGCTL_LOW
addmi \az, \az, LSPGCTL_HIGH
slli \az, \az, 4
movi \ay, 1
2:
s8i \ay, \az, 0
memw
1:
l8ui \ax, \az, 4
bne \ax, \ay, 1b
addi \az, \az, 8
addi \au, \au, -1
bnez \au, 2b
.endm
#endif /* _ASMLANGUAGE */
#endif /* __ZEPHYR_ACE_LIB_ASM_MEMORY_MANAGEMENT_H__ */
```
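The `LSPGCTL_HIGH`/`LSPGCTL_LOW` split in the header above exists because Xtensa immediates are limited: `movi` loads a small constant, `addmi` adds a 256-aligned constant, and the final `slli` by 4 restores the full register address. A quick Python check of that arithmetic, using the constants from the macros above:

```python
LSPGCTL = 0x71D80
LSPGCTL_HIGH = (LSPGCTL >> 4) & 0xFF00  # addmi-compatible (multiple of 256)
LSPGCTL_LOW = (LSPGCTL >> 4) & 0xFF     # fits a small movi immediate

# Mirrors: movi az, LSPGCTL_LOW; addmi az, az, LSPGCTL_HIGH; slli az, az, 4
reconstructed = (LSPGCTL_LOW + LSPGCTL_HIGH) << 4
assert reconstructed == LSPGCTL
print(hex(reconstructed))  # 0x71d80
```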
|
Alpine climate is the typical weather (climate) for elevations above the tree line, where trees fail to grow due to cold. This climate is also referred to as a mountain climate or highland climate.
Definition
There are multiple definitions of alpine climate.
In the Köppen climate classification, the alpine and mountain climates are part of group E, along with the polar climate, where no month has a mean temperature higher than 10 °C (50 °F).
According to the Holdridge life zone system, there are two mountain climates which prevent tree growth:
a) the alpine climate,
which occurs when the mean biotemperature of a location is between 1.5 °C and 3 °C. The alpine climate in the Holdridge system is roughly equivalent to the warmest tundra climates (ET) in the Köppen system.
b) the alvar climate, the coldest mountain climate since the biotemperature is between 0 °C and 1.5 °C (biotemperature can never be below 0 °C). It corresponds more or less to the coldest tundra climates and to the ice cap climates (EF) as well.
Holdridge reasoned that plants' net primary productivity ceases with plants becoming dormant at temperatures below 0 °C and above 30 °C. Therefore, he defined biotemperature as the mean of all temperatures, but with all temperatures below freezing and above 30 °C adjusted to 0 °C; that is, the sum of the adjusted temperatures is divided by the number of all temperatures (including both adjusted and non-adjusted ones).
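Holdridge's biotemperature rule is easy to state in code: clamp every reading below 0 °C or above 30 °C to 0 °C, then divide by the total number of readings. A minimal sketch, with hypothetical monthly means for a high-mountain site:

```python
def biotemperature(temps_c):
    """Holdridge biotemperature: mean of temperatures with values below
    0 C or above 30 C replaced by 0 C; the divisor is the full count."""
    adjusted = [t if 0.0 <= t <= 30.0 else 0.0 for t in temps_c]
    return sum(adjusted) / len(adjusted)

# Hypothetical monthly mean temperatures (deg C) for illustration only
monthly = [-8, -6, -2, 1, 4, 7, 9, 8, 5, 1, -3, -7]
bt = biotemperature(monthly)
print(round(bt, 2))  # 2.92 -> alpine belt (biotemperature 1.5-3 C)
```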
The variability of the alpine climate throughout the year depends on the latitude of the location. For tropical oceanic locations, such as the summit of Mauna Loa, the temperature is roughly constant throughout the year. For mid-latitude locations, such as Mount Washington in New Hampshire, the temperature varies seasonally, but never gets very warm.
Cause
The temperature profile of the atmosphere is a result of an interaction between radiation and convection. Sunlight in the visible spectrum hits the ground and heats it. The ground then heats the air at the surface. If radiation were the only way to transfer heat from the ground to space, the greenhouse effect of gases in the atmosphere would keep the ground at roughly , and the temperature would decay exponentially with height.
However, when air is hot, it tends to expand, which lowers its density. Thus, hot air tends to rise and transfer heat upward. This is the process of convection. Convection comes to equilibrium when a parcel of air at a given altitude has the same density as its surroundings. Air is a poor conductor of heat, so a parcel of air will rise and fall without exchanging heat. This is known as an adiabatic process, which has a characteristic pressure-temperature curve. As the pressure gets lower, the temperature decreases. The rate of decrease of temperature with elevation is known as the adiabatic lapse rate, which is approximately 9.8 °C per kilometer (or 5.4 °F per 1000 feet) of altitude.
The presence of water in the atmosphere complicates the process of convection. Water vapor contains latent heat of vaporization. As air rises and cools, it eventually becomes saturated and cannot hold its quantity of water vapor. The water vapor condenses (forming clouds) and releases heat, which changes the lapse rate from the dry adiabatic lapse rate to the moist adiabatic lapse rate (5.5 °C per kilometre or 3 °F per 1000 feet). The actual lapse rate, called the environmental lapse rate, is not constant (it can fluctuate throughout the day or seasonally and also regionally), but a normal lapse rate is 5.5 °C per 1,000 m (3.57 °F per 1,000 ft). Therefore, moving up on a mountain is roughly equivalent to moving 80 kilometres (50 miles or 0.75° of latitude) towards the pole. This relationship is only approximate, however, since local factors, such as proximity to oceans, can drastically modify the climate. As the altitude increases, the main form of precipitation becomes snow and the winds increase. The temperature continues to drop until the tropopause, at about 11,000 m (36,000 ft), where it does not decrease further. This is higher than the highest summit.
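The lapse-rate arithmetic above can be made concrete with a rough estimate. This sketch uses the normal environmental lapse rate quoted in the text (5.5 °C per 1,000 m) and a hypothetical sea-level temperature; it is a first-order approximation, not a weather model:

```python
LAPSE_C_PER_KM = 5.5  # normal environmental lapse rate from the text

def temp_at_altitude(sea_level_temp_c, altitude_m):
    """First-order temperature estimate: linear cooling with altitude."""
    return sea_level_temp_c - LAPSE_C_PER_KM * altitude_m / 1000.0

# Hypothetical 20 C at sea level:
print(temp_at_altitude(20.0, 0))     # 20.0
print(temp_at_altitude(20.0, 3000))  # 3.5  (20 - 3 * 5.5)

# Equivalent poleward shift: roughly 80 km of latitude per 1,000 m climbed
print(3000 / 1000 * 80)              # 240.0 (km)
```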
Distribution
Although this climate classification only covers a small portion of the Earth's surface, alpine climates are widely distributed. They are present in the Himalayas, the Tibetan Plateau, Gansu and Qinghai in Asia, the Alps, the Pyrenees, the Cantabrian Mountains and the Sierra Nevada in Europe, the Andes in South America, the Sierra Nevada, the Cascade Mountains, the Rocky Mountains, the northern Appalachian Mountains (Adirondacks and White Mountains), and the Trans-Mexican volcanic belt in North America, the Southern Alps in New Zealand, the Snowy Mountains in Australia, high elevations in the Atlas Mountains, Ethiopian Highlands, and the Eastern Highlands of Africa, and the central parts of Borneo and New Guinea and the summits of Mount Pico in the Atlantic and Mauna Loa in the Pacific.
The lowest altitude of alpine climate varies dramatically by latitude. If alpine climate is defined by the tree line, then it occurs as low as at 68°N in Sweden, while on Mount Kilimanjaro in Tanzania, the tree line is at .
See also
Alpine plant
Climate of the Alps
List of alpine climate locations
References
A
Köppen climate types
Mountain meteorology
Montane ecology
Climate by mountain range
Climate of the Alps
|
```yaml
apiVersion: release-notes/v2
kind: feature
area: installation
docs:
- 'path_to_url'
releaseNotes:
- |
**Improved** Usage on OpenShift clusters is simplified by removing the need of manually creating a `NetworkAttachmentDefinition` resource on every application namespace.
```
|
List of works by or about Kit Reed, American writer.
Novels
Mother Isn't Dead She's Only Sleeping (1961)
At War As Children (1964)
The Better Part (1967)
Armed Camps (1969)
Cry of the Daughter (1971)
Tiger Rag (1973)
Captain Grownup (1976)
The Ballad of T. Rantula (1979)
Magic Time (1980)
Fort Privilege (1985)
The Revenge of the Senior Citizens (1986)
Blood Fever (1986) [as by Shelley Hyde]
Catholic Girls (1987)
Gone (1992) [as by Kit Craig]
Twice Burned (1993) [as by Kit Craig]
Little Sisters of the Apocalypse (1994)
Strait (1995) [as by Kit Craig]
J. Eden (1996)
Closer (1997) [as by Kit Craig]
Some Safe Place (1998) [as by Kit Craig]
Short Fuse (1999) [as by Kit Craig]
@expectations (2000)
Thinner Than Thou (2004)
Bronze (2005)
The Baby Merchant (2006)
The Night Children (2008)
Enclave (2009)
Son of Destruction (2012)
Where (2015)
Mormama (2017)
Short fiction
Collections
Mister Da V. and Other Stories (1967)
The Killer Mice (1976)
Other Stories and...The Attack of the Giant Baby (1981)
Thief of Lives (1992)
Weird Women, Wired Women (1998)
Seven for the Apocalypse (1999)
Dogs of Truth: New and Uncollected Stories (2005)
What Wolves Know (2011)
The Story Until Now: A Great Big Book of Stories (2013)
Anthologies
Fat (1974)
Short stories
See also her bibliographic entry in the Internet Speculative Fiction Database and also in the Laboratory of Fantastic.
Book reviews
Doctors by Erich Segal
Cordelia Underwood, Or the Marvelous Beginnings of the Moosepath League, by Van Reid
Reservation Road, by John Burnham Schwartz
The Better Man, by Anita Nair
Critical studies and reviews of Reed's work
The story until now
Notes
Bibliographies by writer
Bibliographies of American writers
Science fiction bibliographies
|
Dizaj (, also Romanized as Dīzaji; also known as Dīzaj Amīr, Dīzaj-e Amīr Madār, and Dizeh) is a village in Bavil Rural District, in the Central District of Osku County, East Azerbaijan Province, Iran. At the 2006 census, its population was 2,990, in 818 families.
References
Populated places in Osku County
|
```c
/*
 * Licensed to the Apache Software Foundation (ASF) under one
 * or more contributor license agreements.  See the NOTICE file
 * distributed with this work for additional information
 * regarding copyright ownership.  The ASF licenses this file
 * to you under the Apache License, Version 2.0 (the
 * "License"); you may not use this file except in compliance
 * with the License.  You may obtain a copy of the License at
 *
 *   path_to_url
 *
 * Unless required by applicable law or agreed to in writing,
 * software distributed under the License is distributed on an
 * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
 * KIND, either express or implied.  See the License for the
 * specific language governing permissions and limitations
 * under the License.
 */
#pragma once
#include <glib-object.h>
G_BEGIN_DECLS
/**
* GArrowMetadataVersion:
* @GARROW_METADATA_VERSION_V1: Version 1.
* @GARROW_METADATA_VERSION_V2: Version 2.
* @GARROW_METADATA_VERSION_V3: Version 3.
*
* They are corresponding to `arrow::ipc::MetadataVersion::type`
* values.
*/
typedef enum {
GARROW_METADATA_VERSION_V1,
GARROW_METADATA_VERSION_V2,
GARROW_METADATA_VERSION_V3
} GArrowMetadataVersion;
G_END_DECLS
```
|
This list of cities, towns, unincorporated communities, counties, and other recognized places in the U.S. state of Alaska also includes information on the number and names of counties in which the place lies, and its lower and upper zip code bounds, if applicable.
F
|
```css
Matching images to a website's color scheme
Hexadecimal color system
Highlight input forms using `:focus` pseudo-class
Use `SVG` for icons
Multiple borders with pseudo elements
```
|
```javascript
/**
* @license Apache-2.0
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
*    path_to_url
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
'use strict';
/**
* Simultaneously sort two single-precision floating-point strided arrays based on the sort order of the first array using heapsort.
*
* @module @stdlib/blas/ext/base/ssort2hp
*
* @example
* var Float32Array = require( '@stdlib/array/float32' );
* var ssort2hp = require( '@stdlib/blas/ext/base/ssort2hp' );
*
* var x = new Float32Array( [ 1.0, -2.0, 3.0, -4.0 ] );
* var y = new Float32Array( [ 0.0, 1.0, 2.0, 3.0 ] );
*
* ssort2hp( x.length, 1.0, x, 1, y, 1 );
*
* console.log( x );
* // => <Float32Array>[ -4.0, -2.0, 1.0, 3.0 ]
*
* console.log( y );
* // => <Float32Array>[ 3.0, 1.0, 0.0, 2.0 ]
*
* @example
* var Float32Array = require( '@stdlib/array/float32' );
* var ssort2hp = require( '@stdlib/blas/ext/base/ssort2hp' );
*
* var x = new Float32Array( [ 1.0, -2.0, 3.0, -4.0 ] );
* var y = new Float32Array( [ 0.0, 1.0, 2.0, 3.0 ] );
*
* ssort2hp.ndarray( x.length, 1.0, x, 1, 0, y, 1, 0 );
*
* console.log( x );
* // => <Float32Array>[ -4.0, -2.0, 1.0, 3.0 ]
*
* console.log( y );
* // => <Float32Array>[ 3.0, 1.0, 0.0, 2.0 ]
*/
// MODULES //
var join = require( 'path' ).join;
var tryRequire = require( '@stdlib/utils/try-require' );
var isError = require( '@stdlib/assert/is-error' );
var main = require( './main.js' );
// MAIN //
var ssort2hp;
var tmp = tryRequire( join( __dirname, './native.js' ) );
if ( isError( tmp ) ) {
ssort2hp = main;
} else {
ssort2hp = tmp;
}
// EXPORTS //
module.exports = ssort2hp;
// exports: { "ndarray": "ssort2hp.ndarray" }
```
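What `ssort2hp` computes — sorting `y` along with `x` according to `x`'s sort order — can be mimicked in plain Python for intuition. This sketch ignores strides/offsets and does not use heapsort internally; it only reproduces the observable result from the JSDoc examples above:

```python
def sort2(x, y, order=1.0):
    """Sort x in place (ascending if order > 0, descending if order < 0)
    and apply the same permutation to y."""
    idx = sorted(range(len(x)), key=lambda i: x[i], reverse=order < 0)
    x[:] = [x[i] for i in idx]
    y[:] = [y[i] for i in idx]

x = [1.0, -2.0, 3.0, -4.0]
y = [0.0, 1.0, 2.0, 3.0]
sort2(x, y)
print(x)  # [-4.0, -2.0, 1.0, 3.0]
print(y)  # [3.0, 1.0, 0.0, 2.0]
```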
|
```c++
#include <IO/MySQLPacketPayloadReadBuffer.h>
namespace DB
{
namespace ErrorCodes
{
extern const int UNKNOWN_PACKET_FROM_CLIENT;
}
const size_t MAX_PACKET_LENGTH = (1 << 24) - 1; // 16 mb
MySQLPacketPayloadReadBuffer::MySQLPacketPayloadReadBuffer(ReadBuffer & in_, uint8_t & sequence_id_)
: ReadBuffer(in_.position(), 0), in(in_), sequence_id(sequence_id_) // not in.buffer().begin(), because working buffer may include previous packet
{
}
bool MySQLPacketPayloadReadBuffer::nextImpl()
{
if (!has_read_header || (payload_length == MAX_PACKET_LENGTH && offset == payload_length))
{
has_read_header = true;
working_buffer.resize(0);
offset = 0;
payload_length = 0;
in.readStrict(reinterpret_cast<char *>(&payload_length), 3);
if (payload_length > MAX_PACKET_LENGTH)
throw Exception(ErrorCodes::UNKNOWN_PACKET_FROM_CLIENT,
"Received packet with payload larger than max_packet_size: {}", payload_length);
size_t packet_sequence_id = 0;
in.readStrict(reinterpret_cast<char &>(packet_sequence_id));
if (packet_sequence_id != sequence_id)
throw Exception(ErrorCodes::UNKNOWN_PACKET_FROM_CLIENT,
"Received packet with wrong sequence-id: {}. Expected: {}.", packet_sequence_id, static_cast<unsigned int>(sequence_id));
sequence_id++;
if (payload_length == 0)
return false;
}
else if (offset == payload_length)
{
return false;
}
in.nextIfAtEnd();
/// Don't return a buffer when no bytes available
if (!in.hasPendingData())
return false;
working_buffer = ReadBuffer::Buffer(in.position(), in.buffer().end());
size_t count = std::min(in.available(), payload_length - offset);
working_buffer.resize(count);
in.ignore(count);
offset += count;
return true;
}
}
```
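The framing this buffer decodes — a 3-byte little-endian payload length followed by a 1-byte sequence id, with a payload of exactly `MAX_PACKET_LENGTH` signalling continuation in the next packet — can be sketched in Python. This is an illustration of the wire format only, not ClickHouse's API:

```python
import io

MAX_PACKET_LENGTH = (1 << 24) - 1  # 0xFFFFFF

def read_packet(stream, expected_seq):
    """Read one MySQL protocol packet: 3-byte LE length + 1-byte sequence id."""
    header = stream.read(4)
    length = int.from_bytes(header[:3], "little")
    seq = header[3]
    if seq != expected_seq:
        raise ValueError(f"wrong sequence-id: {seq}, expected {expected_seq}")
    payload = stream.read(length)
    # A payload of exactly MAX_PACKET_LENGTH means it continues in the next packet.
    return payload, length == MAX_PACKET_LENGTH

buf = io.BytesIO(b"\x05\x00\x00\x00hello")
payload, more = read_packet(buf, 0)
print(payload, more)  # b'hello' False
```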
|
```java
Be as specific as possible when catching exceptions
The distinction between checked and unchecked exceptions
Converting stack trace to a string
Most common reason behind **stack overflow** error
Throwing an `exception`
```
|
Robert Bell (1821 – February 25, 1873) was a surveyor, journalist and political figure in Canada West.
He was born in Ireland in 1821 and arrived in New York state with his parents. In 1823, they moved to a farm near Kemptville, Upper Canada. He qualified as a land surveyor for the province in 1843. He moved to Bytown, where he began work as a surveyor. He surveyed the Muskoka, Haliburton and Nipissing areas as part of the government's plan to open up these areas for settlement.
In 1849, he purchased the Bytown Packet newspaper (renamed the Ottawa Citizen in 1851) from Henry J. Friel and John George Bell. In it Bell expounded his ideas for promoting the settlement of the waste lands between Bytown and Lake Huron. He helped found the Bytown and Prescott Railway and later became president. He served on the town council for Bytown and later represented Russell in the 7th and 8th Parliaments for the Province of Canada.
He died in Hull, Quebec in 1873.
References
External links
1821 births
1873 deaths
Members of the Legislative Assembly of the Province of Canada from Canada West
British surveyors
Canadian surveyors
Journalists from Ontario
People from Leeds and Grenville United Counties
Canadian people of Ulster-Scottish descent
Ottawa Citizen people
19th-century Canadian journalists
Canadian male journalists
Ontario municipal councillors
19th-century Canadian male writers
Irish emigrants to pre-Confederation Ontario
Immigrants to Upper Canada
|
```typescript
import RepoEntity from '../api/repoEntity';
export default (state: Array<RepoEntity> = [], action) => {
switch (action.type){
case 'REPOS_ASSIGN':
return [...action.repos];
default:
return state;
}
}
```
|
```typescript
import type { Conditional } from 'storybook/internal/types';
// TODO ?
export interface JsDocParam {
name: string;
description?: string;
}
export interface JsDocParamDeprecated {
deprecated?: string;
}
export interface JsDocReturns {
description?: string;
}
export interface JsDocTags {
params?: JsDocParam[];
deprecated?: JsDocParamDeprecated;
returns?: JsDocReturns;
}
export interface PropSummaryValue {
summary: string;
detail?: string;
required?: boolean;
}
export type PropType = PropSummaryValue;
export type PropDefaultValue = PropSummaryValue;
export interface TableAnnotation {
type: PropType;
jsDocTags?: JsDocTags;
defaultValue?: PropDefaultValue;
category?: string;
}
export interface ArgType {
name?: string;
description?: string;
defaultValue?: any;
if?: Conditional;
table?: {
category?: string;
disable?: boolean;
subcategory?: string;
defaultValue?: {
summary?: string;
detail?: string;
};
type?: {
summary?: string;
detail?: string;
};
readonly?: boolean;
[key: string]: any;
};
[key: string]: any;
}
export interface ArgTypes {
[key: string]: ArgType;
}
export interface Args {
[key: string]: any;
}
export type Globals = { [name: string]: any };
```
|
Seven Men from Now (also billed as 7 Men from Now) is a 1956 American Western film directed by Budd Boetticher and starring Randolph Scott, Gail Russell and Lee Marvin. The film was written by Burt Kennedy and produced by John Wayne's Batjac Productions.
Plot
Ben Stride walks into a desert cave encampment during a nighttime rainstorm. He encounters two men taking shelter next to a fire and asks to join them. Stride tells the men he is from the town of Silver Springs, which provokes a mysterious reaction from the two men. They discuss a robbery and murder that recently occurred there. When Stride realizes who the men are, he guns them down.
While tracking through the Arizona wilderness, Stride comes upon a wagon stuck in the mud. Stride helps pull the wagon clear, and the wagon's owners, John and Annie Greer, are grateful. Travelers from Kansas City, they admit they are inexperienced at frontier life and ask Stride to ride with them as they head south to the border town of Flora Vista on their way west to California. Greer says he hopes to find a sales job there, but has been taking odd jobs along the way. The mention of Flora Vista arouses Stride's curiosity and he agrees to take them to the border. As the trio travels, Annie shows a growing attraction to Stride. At one point they are stopped by a US Army detail, whose commanding officer tells them to go back, as Chiricahua Apache have been spotted in the area and he cannot guarantee their safety.
Stride and the Greers travel on to a stagecoach relay station where they encounter Bill Masters and Clete, two former nemeses of Stride's. As they all spend the night at the station, Masters tells the Greers that Stride was once the sheriff of Silver Springs, and his wife was killed during the robbery of the Wells Fargo freight office. Stride has been tracking and killing the seven men who performed the robbery, and Masters intends to abscond with the $20,000 in gold they stole once Stride has accomplished his task. Annie feels sympathy for Stride, who confesses that he feels guilty about his wife's death because at the time he was no longer sheriff and too proud to take the deputy job, so she took one as a clerk at the freight office. Before the wagon heads out of the station, with Masters and Clete tagging along opportunistically, they are met by Chiricahua warriors. The Apache leave when Stride gives up one of the horses to the hungry tribesmen.
The group encounters one of the Wells Fargo robbers, who is being chased by Indians. Unaware of the man's part in the robbery, Stride saves him from the Apache. The man, however, recognizes Stride and nearly kills him, but Stride is saved when Masters shoots the man in the back.
Masters and Clete reach Flora Vista ahead of the wagon, and they meet with the Wells Fargo bandits waiting for delivery of their gold. Masters tells their leader, Payte Bodeen, that Stride is heading in their direction to kill all of them and avenge his wife's death. Bodeen dispatches two of the bandits to meet Stride before he can reach Flora Vista. Meanwhile, Stride leaves Greer and Annie, telling them to continue on without him. Stride rides ahead into a canyon alone and is ambushed by the two bank robbers but kills them both. Wounded in the leg, Stride is knocked unconscious while trying to ride away with one of the bandits' horses.
The Greers find the still-unconscious Stride and tend to him. Stride regains consciousness and overhears Greer admitting to his wife that he was paid $500 to deliver the Wells Fargo box containing the gold to Flora Vista. Stride takes the gold away from Greer to draw the rest of the bandits out from town, and Greer and Annie head into Flora Vista to notify the local sheriff.
Greer arrives in town without the gold, telling Bodeen that Stride has it, and as he walks down the street toward the sheriff's office, Bodeen guns him down. The last two bandits, Bodeen and Clint, ride out to confront Stride. Stride shoots Clint but Bodeen is killed by Masters and Clete. Masters, blinded by greed, then kills Clete and walks out into the clearing where Stride has placed the box of gold. They face off, and Stride kills Masters before he can pull his guns.
Stride returns the gold to Wells Fargo and tells Annie that he is going to take a job as a deputy sheriff in Silver Springs. He puts her on a stagecoach bound for California, then rides away. Annie, however, tells the stage driver she is not going.
Cast
Randolph Scott as Ben Stride
Gail Russell as Annie Greer
Lee Marvin as Bill Masters
Walter Reed as John Greer
John Larch as Payte Bodeen (1 of the seven men)
Don 'Red' Barry as Clete (Bill Masters' man)
Fred Graham as Henchman (1 of the seven men)
John Beradino as Clint (1 of the seven men)
John Phillips as Jed (1 of the seven men)
Chuck Roberson as Mason (1 of the seven men)
Stuart Whitman as Cavalry Lt. Collins
Pamela Duncan as Señorita Nellie
Steve Mitchell as Fowler (1 of the seven men)
Cliff Lyons as Henchman (1 of the seven men)
Chet Brandenburg as Townsman (uncredited)
Chick Hannan as Townsman (uncredited)
Cap Somers as Townsman (uncredited)
George Sowards as Stage Driver (uncredited)
Fred Sherman as The Prospector
Production
John Wayne and Robert Fellows's production company Batjac purchased the Burt Kennedy screenplay with the intention of having Wayne star as Stride. It was Kennedy's first film script. However, Wayne was locked into doing The Searchers for John Ford. According to Kennedy, Wayne wasn't particularly interested in the script until he became aware that Robert Mitchum's representatives were interested in it, at which point Wayne perked up and suggested going to Warners and casting Randolph Scott instead. Scott insisted on Budd Boetticher as the director. As with their previous collaboration, Bullfighter and the Lady, Wayne, in his role as producer, recut Seven Men From Now without Boetticher's approval, although it has since been restored to the director's original vision.
Seven Men from Now was the first in a seven-film collaboration between Scott, Boetticher, and producer Harry Joe Brown, with five of the films written by Kennedy.
The movie was shot in the Alabama Hills and other locations near Lone Pine, California in the last months of 1955. Gail Russell was cast as the female lead due to her previous work with Wayne in Angel and the Badman and Wake of the Red Witch, in which Wayne's wife at the time accused them of having an affair (denied by both and rejected by the court during Wayne's divorce proceedings). She had not worked on a movie for nearly five years prior to Seven Men from Now, due to her struggles with stage-fright-induced alcoholism. Boetticher worked very hard to keep her from drinking during the filming.
See also
List of American films of 1956
References
External links
1956 Western (genre) films
1956 films
American Western (genre) films
American films about revenge
American survival films
Batjac Productions films
Films directed by Budd Boetticher
Films produced by John Wayne
Films shot in Lone Pine, California
Warner Bros. films
1950s English-language films
1950s American films
English-language Western (genre) films
|
Empire Archer was a 7,031-ton cargo ship built in 1942. She was renamed Baron Murray in 1946. In 1959 she was renamed Cathay, serving until 1963, when she was scrapped.
History
Wartime
Empire Archer was built by Caledon Shipbuilding & Engineering Company, Dundee as yard number 395. She was launched on 29 June 1942 and completed in August 1942. Empire Archer was built for the Ministry of War Transport and managed by Raeburn & Verel Ltd.
Empire Archer served in a number of convoys during the Second World War.
JW51B
Convoy JW 51B sailed from Liverpool on 22 December 1942 and arrived at the Kola Inlet on 4 January 1943. Empire Archer was carrying the convoy's Commodore, Captain R A Melhuish. Her cargo consisted of 21 fighter aircraft, 4,376 tons of general cargo, 18 tanks and 141 vehicles. This convoy was unsuccessfully attacked on 31 December by German heavy warships in the Battle of the Barents Sea.
RA 53
Convoy RA 53 sailed from the Kola Inlet on 1 March 1943 and arrived at Loch Ewe on 14 March.
KMS 19
Convoy KMS 19 sailed from the Clyde on 25 June 1943 and passed Gibraltar on 6 July. The convoy was operating in support of Operation Husky. Empire Archer was bound for Malta, from where she sailed on 23 July as part of Convoy KMS 19T, arriving in Tripoli on 24 July.
RA 56
Convoy RA 56 sailed from the Kola Inlet on 3 February 1944 and arrived at Loch Ewe on 11 February.
OS 89
Empire Archer was listed as a member of Convoy OS 89, sailing from Liverpool on 15 September 1944 bound for Freetown, Sierra Leone. Empire Archer was due to sail from Aultbea, but did not sail in this convoy.
The reason that Empire Archer was not in the convoy was that on 13 September she had become stranded on Rathlin Island on her way from Sunderland to join the convoy prior to sailing to the United States. She was refloated and then beached off Bangor. Later she was towed to Belfast and then Glasgow for repairs, arriving on 25 September 1944.
JW 63
Convoy JW 63 sailed from Loch Ewe on 30 December 1944 and arrived at the Kola Inlet on 8 January 1945. Amongst Empire Archer's cargo were Spitfire LF IXs MJ188, MJ336, MJ337, MJ400, SM542, SM572, SM588, SM595, SM617, SM619, SM630, SM637 and SM663.
RA 64
Convoy RA 64 sailed from the Kola Inlet on 17 February 1945 and arrived at Loch Ewe on 28 February. Empire Archer was carrying the convoy's Vice Commodore.
Postwar
In 1946, Empire Archer was sold to H Hogarth & Sons, Glasgow and renamed Baron Murray. She served with them until 1959 when she was sold to the Cathay Shipping Corporation, Panama and renamed Cathay. In 1963, she was sent to Yokosuka for scrapping, arriving on 24 July.
Official number and code letters
Official Numbers were a forerunner to IMO Numbers.
Empire Archer and Baron Murray had the UK Official Number 166215. Empire Archer used the Code Letters BDZV.
References
External links
Photo of SS Baron Murray .
1942 ships
Ships built in Dundee
Steamships of the United Kingdom
Empire ships
Ministry of War Transport ships
Steamships of Panama
Merchant ships of Panama
|
```python
import asyncio
import logging
from copy import copy
from typing import Any, List
from unittest import mock

import pytest

from aio_pika.tools import CallbackCollection, ensure_awaitable

log = logging.getLogger(__name__)


# noinspection PyTypeChecker
class TestCase:
    @pytest.fixture
    def instance(self) -> mock.MagicMock:
        return mock.MagicMock()

    @pytest.fixture
    def collection(self, instance):
        return CallbackCollection(instance)

    def test_basic(self, collection):
        def func(sender, *args, **kwargs):
            pass

        collection.add(func)
        assert func in collection

        with pytest.raises(ValueError):
            collection.add(None)

        collection.remove(func)

        with pytest.raises(LookupError):
            collection.remove(func)

        for _ in range(10):
            collection.add(func)

        assert len(collection) == 1

        collection.freeze()

        with pytest.raises(RuntimeError):
            collection.freeze()

        assert len(collection) == 1

        with pytest.raises(RuntimeError):
            collection.add(func)

        with pytest.raises(RuntimeError):
            collection.remove(func)

        with pytest.raises(RuntimeError):
            collection.clear()

        collection2 = copy(collection)
        collection.unfreeze()

        assert not copy(collection).is_frozen
        assert collection.is_frozen != collection2.is_frozen

        with pytest.raises(RuntimeError):
            collection.unfreeze()

        collection.clear()
        assert collection2
        assert not collection

    def test_callback_call(self, collection):
        l1: List[Any] = list()
        l2: List[Any] = list()

        assert l1 == l2

        collection.add(lambda sender, x: l1.append(x))
        collection.add(lambda sender, x: l2.append(x))

        collection(1)
        collection(2)

        assert l1 == l2
        assert l1 == [1, 2]

    async def test_blank_awaitable_callback(self, collection):
        await collection()

    async def test_awaitable_callback(
        self, event_loop, collection, instance,
    ):
        future = event_loop.create_future()

        shared = []

        async def coro(arg):
            nonlocal shared
            shared.append(arg)

        def task_maker(arg):
            return event_loop.create_task(coro(arg))

        collection.add(future.set_result)
        collection.add(coro)
        collection.add(task_maker)

        await collection()

        assert shared == [instance, instance]
        assert await future == instance

    async def test_collection_create_tasks(
        self, event_loop, collection, instance,
    ):
        future = event_loop.create_future()

        async def coro(arg):
            await asyncio.sleep(0.5)
            future.set_result(arg)

        collection.add(coro)

        # noinspection PyAsyncCall
        collection()
        assert await future == instance

    async def test_collection_run_tasks_parallel(self, collection):
        class Callable:
            def __init__(self):
                self.counter = 0

            async def __call__(self, *args, **kwargs):
                await asyncio.sleep(1)
                self.counter += 1

        callables = [Callable() for _ in range(100)]

        for callable in callables:
            collection.add(callable)

        await asyncio.wait_for(collection(), timeout=2)
        assert [c.counter for c in callables] == [1] * 100
class TestEnsureAwaitable:
async def test_non_coroutine(self):
with pytest.deprecated_call(match="You probably registering the"):
func = ensure_awaitable(lambda x: x * x)
with pytest.deprecated_call(match="Function"):
assert await func(2) == 4
with pytest.deprecated_call(match="Function"):
assert await func(4) == 16
async def test_coroutine(self):
async def square(x):
return x * x
func = ensure_awaitable(square)
assert await func(2) == 4
assert await func(4) == 16
async def test_something_awaitable_returned(self):
def non_coro(x):
async def coro(x):
return x * x
return coro(x)
with pytest.deprecated_call(match="You probably registering the"):
func = ensure_awaitable(non_coro)
assert await func(2) == 4
async def test_something_non_awaitable_returned(self):
def non_coro(x):
def coro(x):
return x * x
return coro(x)
with pytest.deprecated_call(match="You probably registering the"):
func = ensure_awaitable(non_coro)
with pytest.deprecated_call(match="Function"):
assert await func(2) == 4
```
|
```javascript
/**
* @license Apache-2.0
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
*    path_to_url
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
'use strict';
// MODULES //
var logger = require( 'debug' );
// MAIN //
var debug = logger( 'inspect-stream-sink' );
// EXPORTS //
module.exports = debug;
```
|
```typescript
import { IsTabletOrMobileScreen } from '@/Application/UseCase/IsTabletOrMobileScreen'
import { useApplication } from '@/Components/ApplicationProvider'
import { debounce } from '@/Utils'
import { useEffect, useMemo, useState } from 'react'
export default function useIsTabletOrMobileScreen() {
const [_windowSize, setWindowSize] = useState(0)
const application = useApplication()
const usecase = useMemo(() => new IsTabletOrMobileScreen(application.environment), [application])
useEffect(() => {
const handleResize = debounce(() => {
setWindowSize(window.innerWidth)
}, 100)
window.addEventListener('resize', handleResize)
handleResize()
return () => {
window.removeEventListener('resize', handleResize)
}
}, [])
const isTabletOrMobileScreen = usecase.execute().getValue()
return isTabletOrMobileScreen
}
```
|
```c
/* This program is free software; you can redistribute it and/or modify
   it under the terms of the GNU General Public License as published by
   the Free Software Foundation; either version 2 of the License, or
   (at your option) any later version.

   This program is distributed in the hope that it will be useful,
   but WITHOUT ANY WARRANTY; without even the implied warranty of
   MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
   GNU General Public License for more details.

   You should have received a copy of the GNU General Public License along
   with this program; if not, write to the Free Software Foundation, Inc.,
   51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA. */
// Animation names
#define DEVIL_ANIM_DEFAULT_ANIMATION 0
#define DEVIL_ANIM_FROMSTANDTOATTACK01POSITION 1
#define DEVIL_ANIM_ATTACK01LOOPMINIGUN 2
#define DEVIL_ANIM_FROMATTACK01TOSTANDPOSITION 3
#define DEVIL_ANIM_ATTACK02LOOPCLAWS 4
#define DEVIL_ANIM_ATTACK03 5
#define DEVIL_ANIM_DEATH 6
#define DEVIL_ANIM_DEATHREST 7
#define DEVIL_ANIM_FROMSTANDTODEFENDPOSITION 8
#define DEVIL_ANIM_DEFENDLOOP 9
#define DEVIL_ANIM_FROMDEFENDTOSTANDPOSITION 10
#define DEVIL_ANIM_RUN 11
#define DEVIL_ANIM_STANDLOOP 12
#define DEVIL_ANIM_WALK 13
#define DEVIL_ANIM_WOUND01SLIGHTFRONT 14
#define DEVIL_ANIM_WOUND02SLIGHTBACK 15
#define DEVIL_ANIM_WOUND03CRITICALFRONT 16
// Color names
// Patch names
// Names of collision boxes
#define DEVIL_COLLISION_BOX_DEAFULT 0
#define DEVIL_COLLISION_BOX_DEATH 1
// Attaching position names
#define DEVIL_ATTACHMENT_MINIGUN 0
#define DEVIL_ATTACHMENT_STICK 1
#define DEVIL_ATTACHMENT_SHIELD 2
// Sound names
```
|
The exquisite wrasse (Cirrhilabrus exquisitus) is a species of ray-finned fish from the family Labridae, the wrasses, which is native to reefs in the Indo-West Pacific region. It can be found in the aquarium trade.
Description
The exquisite wrasse has a dominant colour of greenish through to reddish and exhibits complex patterning of colours. The adult males are olive-green dorsally, fading to white, pale blue or pink on their underparts, and have an oval-shaped dark spot on the caudal peduncle which has its bottom margin touching the lateral line. A blue line, which is frequently interrupted, runs from underneath the pectoral fin to the spot on the tail base, and another blue line runs from the corner of the mouth to above the eye before running along the base of the dorsal fin; a second line on the head runs from the posterior edge of the eye until it breaks up above the pectoral fin, and a third line runs from the rear of the mouth to just above the pectoral-fin base. The base of the pectoral fin is marked with a black bar, edged with blue, while the margin of that fin is red. All of the fins show variable amounts of red in their middle portions. The juveniles and smaller females are reddish with a blue-margined black oval-shaped spot on the caudal peduncle and a white spot at the tip of the snout. The colouration shown by the exquisite wrasse varies geographically. A male can attain a standard length of .
Distribution
The exquisite wrasse is found from the eastern coast of Africa as far south as Sodwana Bay in South Africa, east through the Indian Ocean to Australia and into the Pacific Ocean as far east as the Tuamotus, French Polynesia. It reaches north as far as the Ryukyu Islands and south to the northern Great Barrier Reef.
Habitat and biology
The exquisite wrasse occurs where there is rubble or low patches of reef with a strong current; it is also found on reef edges and around exposed outcrops of reef within areas of rubble. It can occur in reasonably large, mixed-sex groups when feeding on zooplankton high above the seabed. The males often display to each other. It is considered that there may be some association with the mushroom coral Heliofungia actiniformis. They are protogynous hermaphrodites, the males developing a larger size, longer and more pointed fins, and a more colourful body pattern as they transform from females to males.
Naming and taxonomy
Cirrhilabrus exquisitus was formally described by the South African ichthyologist James L.B. Smith in 1957 with the type locality given as Pinda in Mozambique. This is the most widespread member of the genus Cirrhilabrus and may prove to represent a species complex.
Human uses
This species is collected for the aquarium trade, but it has not yet been bred in the aquarium.
References
External links
Cirrhilabrus
Fish of Thailand
Taxa named by J. L. B. Smith
Fish described in 1957
|
Ocyale lanca is a species of spider of the genus Ocyale. It is endemic to Sri Lanka.
See also
List of Lycosidae species
References
Spiders described in 1879
Spiders of Asia
Endemic fauna of Sri Lanka
Lycosidae
|
Frederic Edward Clements (September 16, 1874 – July 26, 1945) was an American plant ecologist and pioneer in the study of plant ecology and vegetation succession.
Biography
Born in Lincoln, Nebraska, he studied botany at the University of Nebraska, graduating in 1894 and obtaining a doctorate in 1898. One of his teachers was botanist Charles Bessey, who inspired Clements to research topics such as microscopy, plant physiology, and laboratory experimentation. He was also classmate of Willa Cather and Roscoe Pound. While at the University of Nebraska, he met Edith Gertrude Schwartz (1874–1971), also a botanist and ecologist, and they were married in 1899.
In 1905 he was appointed full professor at the University of Nebraska, but left in 1907 to head the botany department at the University of Minnesota in Minneapolis. From 1917 to 1941 he was employed as an ecologist at the Carnegie Institution of Washington in Washington, D.C., where he was able to carry out dedicated ecological research. While employed at the Carnegie Institution of Washington, Clements faced criticism for his experiments conducted with the purpose of creating new plant species. Due to these criticisms, as well as personal conflicts with his co-workers, in the 1920s the title of director of research in experimental taxonomy was given to Harvey Monroe Hall.
During winter he worked at research stations in Tucson, Arizona, and Santa Barbara, California, while in the summer he performed fieldwork at the Carnegie Institution's Alpine Laboratory, a research station in Englemann Canyon on the slopes of Pikes Peak, Colorado. During this time he worked alongside staff of the U.S. Soil Conservation Service. In addition to his field investigations, he carried out experimental work in the laboratory and greenhouse, both at the Pikes Peak station and at Santa Barbara.
Theory of vegetation change to climax community
From his observations of the vegetation of Nebraska and the western United States, Clements developed one of the most influential theories of vegetation development. Vegetation composition does not represent a permanent condition but gradually changes with time. Clements suggested that the development of vegetation can be understood as a unidirectional sequence of stages resembling the development of an individual organism. After a complete or partial disturbance, vegetation grows back (under ideal conditions) towards a stable "climax state", which describes the vegetation best suited to the local conditions. Though any actual instance of vegetation might not follow the ideal sequence towards stability, it can be interpreted in relation to that sequence, as a deviation from it due to non-ideal conditions.
In these studies, he and Roscoe Pound (who subsequently moved from ecology to legal scholarship) developed the widely-used method of sampling using quadrats around 1898.
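The quadrat method is simple enough to sketch numerically: individuals are counted inside several frames of known area and the counts are averaged into a density estimate. A minimal illustration in Python (the counts and frame size below are invented for the example, not survey data):

```python
def estimate_density(counts, quadrat_area_m2):
    """Mean plant density (individuals per square metre) across quadrats."""
    return sum(counts) / (len(counts) * quadrat_area_m2)

# Hypothetical survey: individuals counted in eight 1 m x 1 m quadrats
counts = [3, 5, 2, 4, 6, 3, 5, 4]
print(estimate_density(counts, quadrat_area_m2=1.0))  # 4.0
```

Multiplying such a density by the total study area gives the usual abundance estimate the method was designed for.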
Clements's climax theory of vegetation dominated plant ecology during the first decades of the twentieth century, though it was criticized significantly by ecologists William Skinner Cooper, Henry Gleason and Arthur Tansley early on, and by Robert Whittaker mid-century, and largely fell out of favor.
Community-unit view of vegetation types or plant communities
In his 1916 publication, Plant Succession, and his 1920 Plant Indicators, Clements metaphorically equated units of vegetation, (now called vegetation types or plant communities) with individual organisms. He observed that some groups of species, which he called "formations", were repeatedly associated together. He is frequently said to have believed that some species were dependent on the group, and the group on that species in an obligatory relationship. However, this interpretation has been challenged by the argument that Clements did not assume mutual dependence as an organizing principle of formations or plant communities.
Clements observed little overlap in kinds of species from type to type, with many species confined to just a single type. Some plants were widespread over vegetation types, but the areas of geographical overlap (ecotones) were narrow. His view of a community as a distinct unit was challenged in 1926 by Henry Gleason, who viewed vegetation as a continuum, not a unit, with associations being merely coincidental, and that any support by observations or data of clusters of species as predicted by Clements's view was either an artifact of the observer's perception or a result of defective data analysis.
Lamarckism
Clements was an advocate of neo-Lamarckian evolution. Ecologist Arthur Tansley wrote that because of his support for Lamarckism, Clements "never seemed to give proper weight to the results of modern genetical research."
Science historian Ronald C. Tobey has commented that:
[Clements] believed that plants and animals could acquire a wide variety and range of characteristics in their struggle to survive and adapt to their environment, and that these features were heritable. In the 1920s, he conducted experiments to transform plant species native to one ecological zone into a species adapted to another, higher, zone. Clements was quite convinced of the validity of his experiments, but this experimental Lamarckism fell to experimental disproof in the 1930s.
Clements spent much time trying to demonstrate the inheritance of acquired traits in plants. By the late 1930s scientists had provided Darwinian explanations for the results of his transplant experiments.
Honors
In 1903, the flower Clementsia rhodantha (now Rhodiola rhodantha; "Clements's rose flower"), a stonecrop, was named in honor of Frederic Clements.
Writings
Among his works are:
The Phytogeography of Nebraska (1898; second edition, 1900)
Research Methods in Ecology (1905)
Plant Physiology and Ecology (1907)
Plant Succession. An Analysis of the Development of Vegetation (1916)
Plant Indicators. The Relation of Plant Communities to Process and Practice (1920)
The Phylogenetic Method in Taxonomy: The North American Species of Artemisia, Chrysothamnus, and Atriplex (1923, with Harvey Monroe Hall)
Plant Succession and Indicators. A definitive edition of Plant succession and Plant indicators (1928, reprinted 1973)
Flower Families and Ancestors (1928, with Edith Clements)
Plant Competition. An Analysis of Community Functions (1929, with J.E. Weaver & H.C. Hanson. Washington: Carnegie Institution of Washington)
The Genera of Fungi (1931, repr. 1965, with C. L. Shear)
Nature and structure of the climax (1936). The Journal of Ecology, 24(1), 252–284.
See also
:Category:Taxa named by Frederic Clements
Suzanne Simard
References
External links
Edith S. and Frederic E. Clements Papers, 1876–1969. University of Wyoming – American Heritage Center
AHC Digital collection of Edith S. and Frederic E. Clements
Clements Papers Document the History of Ecology AHC blog
1874 births
1945 deaths
Academics from Nebraska
American ecologists
Ecological succession
Lamarckism
People from Lincoln, Nebraska
Plant ecologists
University of Nebraska–Lincoln alumni
|
Kosmos 1341 ( meaning Cosmos 1341) was a Soviet US-K missile early warning satellite which was launched in 1982 as part of the Soviet military's Oko programme. The satellite was designed to identify missile launches using optical telescopes and infrared sensors.
Kosmos 1341 was launched from Site 16/2 at Plesetsk Cosmodrome in the Russian SFSR. A Molniya-M carrier rocket with a 2BL upper stage was used to perform the launch, which took place at 05:44 UTC on 3 March 1982. The launch successfully placed the satellite into a molniya orbit. It subsequently received its Kosmos designation, and the international designator 1982-016A. The United States Space Command assigned it the Satellite Catalog Number 13080.
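The period of a molniya orbit follows directly from Kepler's third law, T = 2π√(a³/μ). A small Python sketch (the semi-major axis of roughly 26,600 km is a typical Molniya-class figure assumed for illustration, not a value stated in this article):

```python
import math

MU_EARTH = 3.986004418e14  # Earth's standard gravitational parameter, m^3/s^2

def orbital_period_hours(semi_major_axis_m):
    """Kepler's third law: T = 2*pi*sqrt(a^3 / mu), converted to hours."""
    return 2 * math.pi * math.sqrt(semi_major_axis_m ** 3 / MU_EARTH) / 3600

# A semi-synchronous molniya-type orbit completes two revolutions per day
print(round(orbital_period_hours(26_600_000), 1))  # 12.0
```

The roughly half-day period is what lets such satellites dwell over high northern latitudes on every other revolution.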
See also
1982 in spaceflight
List of Kosmos satellites (1251–1500)
List of Oko satellites
List of R-7 launches (1980-1984)
References
Kosmos satellites
Oko
Spacecraft launched in 1982
Spacecraft launched by Molniya-M rockets
|
Bieńczyce is one of 18 districts of Kraków, located in the northern part of the city. The name Bieńczyce comes from a village of the same name that is now a part of the district.
According to the Central Statistical Office data, the district's area is and 42 633 people inhabit Bieńczyce.
History
The name Bieńczyce comes from the owner of the village named Bień (Benedykt). It is mentioned for the first time in documents in 1224. The village belonged first to the Church of St Michael the Archangel and St Stanislaus Bishop, and from 1317 to the Church of St. Florian. In 1391, the first mill on Dłubnia River was built in the village. The other mill from 1449 was being powered by the existing millrace to the 21st century. In the second half of the 15th century, a manor house with a farm was built in the village (today, at Kaczeńcowa Street, a manor house and an outbuilding from the beginning of the 20th century have been preserved in the neighborhood of the pond). Later, the village of Bieńczyce constituted the property of the canons of the Wawel Cathedral. After the Second Partition of Poland, the village was confiscated by the partitioning Austrian powers.
In the 19th century, the Austrians built a road and, subsequently, a railway line, constructed from 1899 until 1900, that ran through Bieńczyce to Kocmyrzów. The existence of the railway line is still evident from the preserved remains of tracks, the remains of two railway bridges over the Dłubnia River and its millrace, and the preserved building of the Bieńczyce railway station. The village belonged to the parish in Raciborowice. There was a small chapel in Bieńczyce, a branch of the parish church.
One of the recognizable inhabitants of the village was Franciszek Ptak, a rich peasant and innkeeper, politician of the Polish People's Party and member of the Diet of Galicia and Lodomeria, who co-financed the construction of "Falcon" Polish Gymnastic Society House and the public school in Bieńczyce in the first decade of the 20th century.
In 1949, the construction of the city of Nowa Huta began in the eastern part of the village. In 1951, the village was attached together with the new city to Kraków as one of the Nowa Huta estates. In the years 1967–1977 a modern temple of Our Lady Queen of Poland was established, called the Ark of the Lord. It was the first church built within one of the housing estates of Nowa Huta, not on its outskirts – mainly thanks to the efforts of Bishop Karol Wojtyła and residents. In the 1980s, the area around the Ark of the Lord was one of the most important places of demonstration for Nowa Huta Solidarity trade union. During one of the demonstrations, on 13 October 1982, the officer of SB Capt. Andrzej Augustyn fatally wounded a 20-year-old worker, Bogdan Włosik.
In the years 1976–1993, the Ludwig Rydygier Krakow Specialist Hospital was built in the district of Bieńczyce. The Nowa Huta Lagoon and the "Wanda" Sports Club are also located in the district.
Population
Subdivisions of Bieńczyce
Bieńczyce is divided into smaller subdivisions (osiedles). Here's a list of them.
Bieńczyce
Osiedle Albertyńskie
Osiedle Jagiellońskie
Osiedle Kalinowe
Osiedle Kazimierzowskie
Osiedle Kościuszkowskie
Osiedle Na Lotnisku
Osiedle Niepodległości
Osiedle Przy Arce
Osiedle Strusia
Osiedle Wysokie
Osiedle Złotej Jesieni
References
External links
Official website of Bieńczyce
Biuletyn Informacji Publicznej
Districts of Kraków
|
Fotiou () is a Greek surname. It is the surname of:
Damien Fotiou, Australian actor.
Elli Fotiou (born 1939), Greek actress.
Panagiotis Fotiou (born 1964), Greek long-distance runner.
Theano Fotiou (born 1946), Greek architect and government minister.
Harikleia Fotiou (1918–1984), Greek painter and engraver.
Greek-language surnames
Surnames
Patronymic surnames
|
Grant Maloy Smith (born August 28, 1957) is an American singer, songwriter, musician, and former businessman from Jacksonville, Florida.
Early life
Smith was born in Jacksonville, Florida, and started playing songs by The Beatles on the guitar at an early age. He attended the Rhode Island School of Design but did not complete his studies there, opting to focus on music.
Early music career
Smith's first band was called Britannia (1981–1984). In 2008, he and his band opened for national acts such as Elvin Bishop, Steppenwolf, and The Guess Who. Britannia played all original music, written by Smith; one of Smith's songs, "I'm A Loaded Gun", was included on the 1981 album "Southern New England's Best Rock From JB 105".
Smith was married in 1985 and he continued writing music. The family moved to California in 1991, and Smith joined the Songwriters Guild of America, attending song pitching meetings at their Hollywood offices.
Filmography
Film scoring
After returning to Rhode Island in 1995, Smith began scoring indie films, including "Code Of Ethics" starring Melissa Leo, an Academy Award-winning actress. He also scored "Pray for Power," starring Lisa Boyle. He worked frequently with directors Christian de Rezendes and Dawn Radican Natalia.
Acting
In 2019, Smith appeared in the feature film "Oildale", playing the character Brady Cooper, a musician, and performed one of his original songs, "I Come From America", in the film.
Full-length movies
1997 Night of the Beast
1998 Boxed Man
1998 Code of Ethics
2001 Serial Intentions
2003 Extra Credit
2008 Solitaire
2010 The Rich and the Poor Are Naked
2011 Pledging Allegiance
Short films
2008 PC Noir
2010 Thinking Through the Drink
2010 Duet
2012 Nijinsky's Room
2012 Cat Scratch
Video feature-length films
2001 - Pray for Power
2002 - Hope High
As an actor
2003 - Extra Credit (feature film) as Jake Lawrence
2019 - Oildale (feature film) as Brady Cooper
Television
2009 Mythbusters (Season 7, ep 1) as himself
Pop/rock album period
From 2008 to 2012, Smith wrote and self-produced with his own label, Small Dog Records, several albums of pop/rock music. The first was Already August (2008), which blended elements of folk and Americana music with pop and rock ballads.
In 2010 Smith released Big Bowl of Courage, with songs that were generally more rock and roll than the previous album.
The next album was American Merman (2011), where Smith experimented with reggae structures in several tracks.
His final pop/rock album came in 2012, Mister Sparklepants.
Americana music period
In 2012 Smith transitioned to Americana, or American roots music, a subgenre of country music. He wrote and produced the album "Yellow Trailer", originally released on Smith's own Chinese Sock Puppet Records in 2013 and remastered and re-released in 2015 on Suburban Cowboy Records. That album was entered into the Grammy Awards that year.
In 2014 Smith was asked by producer Art Greenhaw to sing on several tracks of a roots gospel album. He contributed lead vocals and one original song, "Where Main Street Ends," a gospel version of a song that he had written. This album was entered into the roots gospel category of the Grammys in 2014.
At the end of 2014, Smith was invited by New York producer Perry Margouleff to travel to England and assist him in several shows that singer Paul Rodgers (Bad Company, Free) was doing at the Royal Albert Hall. Smith worked behind the scenes on the entire tour.
In 2015, Smith was asked to narrate a song on a spoken word album that was being produced by Hawaiian-based DJ Cindy Paulos, called Arise Above Abuse: Artists Speak Out for Women. He co-narrated the track "One in Five" with Hawaiian Congresswoman Tulsi Gabbard. He also provided the music for this track and contributed an original song about the kidnapping of women and girls in Africa called "She Would Not Bow Her Head." (from Smith's 2012 Album "Mister Sparklepants").
His next album, "Dust Bowl - American Stories," was released on Suburban Cowboy Records in 2017 and features bassist William Wittman (Cyndi Lauper) and drummer Skoota Warner, as well as keyboardist Tommy Mandel, formerly of Bryan Adams and Dire Straits, who performed on the basic tracks of the album. Production then moved to Nashville, where additional tracks were recorded by IBMA award-winning dobro player Rob Ickes, fiddle player Steve Stokes (Alabama), cellist Tim Lorsch (Keith Urban, Percy Sledge), accordion player Jeff Taylor (The Time Jumpers), percussionist and drummer Matt Burgess (Willie Nelson, Lynyrd Skynyrd, Jewel), pedal steel player Mike Johnson (Alison Krauss, Dolly Parton), violinist Lorenza Ponce (Bon Jovi, Adele, Sam Smith), and violinist Rocio Marron (The Voice), under the supervision of co-producer Jeff Silverman.
Because "Dust Bowl - American Stories" is a theme album related to the Dust Bowl of the 1930s, Smith was invited to perform the album at the Kern County Museum, which he did on January 14, 2018. He also performed the entire album for the Bakersfield High School on January 17, 2018, and at the Centennial Rodeo Opry in Oklahoma City in August 2017.
Smith toured the United States, Europe and Mexico. Although primarily a headliner, he sometimes opened for other artists during 2015-2016, including Rita Coolidge, Jon Pousette-Dart, and John Ford Coley. He has performed at The Bitter End in New York, The Clive Davis Theatre at The Grammy Museum, the Troubadour in Hollywood, The National Sylvan Theatre in DC, and The Bluebird Cafe in Nashville. In April 2018, Smith performed on Song Of The Mountains, which is recorded before a live audience and also syndicated on PBS television throughout North America. In August, he performed on Woodsongs Old Time Radio Hour, which is carried by more than 500 radio stations and also broadcast on PBS television.
In 2018, Smith's original song "Man Of Steel" was named the official theme song of the National Veterans Foundation.
In the same year, Smith appeared as one of the performers at The Indie Collaborative's debut public performance at Carnegie Hall.
In 2022, Smith performed a cover of Michael Stephenson's "My Prison" to raise funds, all of which were donated to the National Veterans Foundation.
Later in 2022, Smith again performed at Carnegie Hall for "Celebrating Earth Day in Song", presented by The Indie Collaborative. The event featured Emmy and Grammy award-winning and Billboard top-10 musical artists.
As an author
In September 2018 Smith released a Christmas single and children's book that he wrote, called "Fly Possum Fly." He enlisted country prodigy EmiSunshine to be the featured vocalist on the song.
Before his music career
Before focusing on his musical career, Smith worked in the scientific measuring equipment industry in various positions. He eventually started his own company, Dewetron America, Inc, which he sold to Dewetron GMBH of Austria, leaving completely in 2017. Smith and his company provided numerous systems to NASA for the Constellation Program. The company won the NASA Tech Briefs Product of the Year Award four times under his leadership, in 2006, 2009, 2012 and 2015. In 2016 he was requested by MEDICAL DESIGN BRIEFS to write an article outlining his perspective on the future of measuring instruments.
Achievements
2017: named the Best Male Americana Artist at the Indie Music Channel Awards, and performed during the awards ceremony at the Troubadour In Los Angeles.
2017: won two Grammy participation certificates for his work as co-producer on the Grammy-award-winning album "Presidential Suite: Eight Variations on Freedom," by jazz artist Ted Nash.
2016: named the Best Folk Artist, and won for the Best Americana Roots Song (Old Black Roller) at the Indie Music Channel Awards, and performed during the awards at The Clive Davis Theatre at The Grammy Museum
2015: winner of the Singer Universe "Best Vocalist of the Month" competition.
Discography
Albums
References
External links
1957 births
20th-century American singer-songwriters
American male singer-songwriters
Singer-songwriters from Florida
Musicians from Jacksonville, Florida
Living people
20th-century American male singers
|
```c
/*-
* All rights reserved.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions
* are met:
* 1. Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
* 2. Redistributions in binary form must reproduce the above copyright
* notice, this list of conditions and the following disclaimer in the
* documentation and/or other materials provided with the distribution.
*
* THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND
* ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS
* OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION)
* HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
* LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY
* OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF
* SUCH DAMAGE.
*/
#include <assert.h> /* assert() is used below in vi_softc_linkup() */
#include <stdio.h>
#include <stdint.h>
#include <pthread.h>
#include <sys/param.h>
#include <sys/uio.h>
#include <xhyve/support/misc.h>
#include <xhyve/xhyve.h>
#include <xhyve/pci_emul.h>
#include <xhyve/virtio.h>
/*
* Functions for dealing with generalized "virtual devices" as
* defined by <path_to_url#output=search&q=virtio+spec>
*/
/*
* In case we decide to relax the "virtio softc comes at the
* front of virtio-based device softc" constraint, let's use
* this to convert.
*/
#define DEV_SOFTC(vs) ((void *)(vs))
/*
* Link a virtio_softc to its constants, the device softc, and
* the PCI emulation.
*/
void
vi_softc_linkup(struct virtio_softc *vs, struct virtio_consts *vc,
void *dev_softc, struct pci_devinst *pi,
struct vqueue_info *queues)
{
int i;
/* vs and dev_softc addresses must match */
assert((void *)vs == dev_softc);
vs->vs_vc = vc;
vs->vs_pi = pi;
pi->pi_arg = vs;
vs->vs_queues = queues;
for (i = 0; i < vc->vc_nvq; i++) {
queues[i].vq_vs = vs;
queues[i].vq_num = (uint16_t) i;
}
}
/*
* Reset device (device-wide). This erases all queues, i.e.,
* all the queues become invalid (though we don't wipe out the
* internal pointers, we just clear the VQ_ALLOC flag).
*
* It resets negotiated features to "none".
*
* If MSI-X is enabled, this also resets all the vectors to NO_VECTOR.
*/
void
vi_reset_dev(struct virtio_softc *vs)
{
struct vqueue_info *vq;
int i, nvq;
nvq = vs->vs_vc->vc_nvq;
for (vq = vs->vs_queues, i = 0; i < nvq; vq++, i++) {
vq->vq_flags = 0;
vq->vq_last_avail = 0;
vq->vq_save_used = 0;
vq->vq_pfn = 0;
vq->vq_msix_idx = VIRTIO_MSI_NO_VECTOR;
}
vs->vs_negotiated_caps = 0;
vs->vs_curq = 0;
/* vs->vs_status = 0; -- redundant */
if (vs->vs_isr)
pci_lintr_deassert(vs->vs_pi);
vs->vs_isr = 0;
vs->vs_msix_cfg_idx = VIRTIO_MSI_NO_VECTOR;
}
/*
* Set I/O BAR (usually 0) to map PCI config registers.
*/
void
vi_set_io_bar(struct virtio_softc *vs, int barnum)
{
size_t size;
/*
* ??? should we use CFG0 if MSI-X is disabled?
* Existing code did not...
*/
size = VTCFG_R_CFG1 + vs->vs_vc->vc_cfgsize;
pci_emul_alloc_bar(vs->vs_pi, barnum, PCIBAR_IO, size);
}
/*
* Initialize MSI-X vector capabilities if we're to use MSI-X,
* or MSI capabilities if not.
*
* We assume we want one MSI-X vector per queue, here, plus one
* for the config vec.
*/
int
vi_intr_init(struct virtio_softc *vs, int barnum, int use_msix)
{
int nvec;
if (use_msix) {
vs->vs_flags |= VIRTIO_USE_MSIX;
VS_LOCK(vs);
vi_reset_dev(vs); /* set all vectors to NO_VECTOR */
VS_UNLOCK(vs);
nvec = vs->vs_vc->vc_nvq + 1;
if (pci_emul_add_msixcap(vs->vs_pi, nvec, barnum))
return (1);
} else
vs->vs_flags &= ~VIRTIO_USE_MSIX;
/* Only 1 MSI vector for bhyve */
pci_emul_add_msicap(vs->vs_pi, 1);
/* Legacy interrupts are mandatory for virtio devices */
pci_lintr_request(vs->vs_pi);
return (0);
}
/*
* Initialize the currently-selected virtio queue (vs->vs_curq).
* The guest just gave us a page frame number, from which we can
* calculate the addresses of the queue.
*/
static void
vi_vq_init(struct virtio_softc *vs, uint32_t pfn)
{
struct vqueue_info *vq;
uint64_t phys;
size_t size;
char *base;
vq = &vs->vs_queues[vs->vs_curq];
vq->vq_pfn = pfn;
phys = (uint64_t)pfn << VRING_PFN;
size = vring_size(vq->vq_qsize);
base = paddr_guest2host(phys, size);
/* First page(s) are descriptors... */
vq->vq_desc = (struct virtio_desc *)base;
base += vq->vq_qsize * sizeof(struct virtio_desc);
/* ... immediately followed by "avail" ring (entirely uint16_t's) */
vq->vq_avail = (struct vring_avail *)base;
base += (2 + vq->vq_qsize + 1) * sizeof(uint16_t);
/* Then it's rounded up to the next page... */
base = (char *) roundup2(((uintptr_t) base), ((uintptr_t) VRING_ALIGN));
/* ... and the last page(s) are the used ring. */
vq->vq_used = (struct vring_used *)base;
/* Mark queue as allocated, and start at 0 when we use it. */
vq->vq_flags = VQ_ALLOC;
vq->vq_last_avail = 0;
vq->vq_save_used = 0;
}
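/*
 * Editor's note: a standalone sketch of the layout arithmetic above.
 * sketch_used_offset() is a hypothetical helper (not part of this
 * file) that recomputes where vi_vq_init() places the used ring:
 * one 16-byte descriptor per slot, one two-byte avail entry per slot
 * plus three extra uint16_t words (flags, idx, used_event), then
 * rounding up to the next alignment boundary (assumed 4096 here,
 * matching VRING_ALIGN in this file).
 */
#include <assert.h>
#include <stddef.h>

#define SKETCH_VRING_ALIGN ((size_t)4096)

static inline size_t
sketch_used_offset(size_t qsize)
{
	size_t off;

	off = qsize * 16;               /* descriptor table */
	off += (2 + qsize + 1) * 2;     /* flags + idx + ring + used_event */
	/* round up to the next power-of-two alignment boundary */
	return ((off + SKETCH_VRING_ALIGN - 1) & ~(SKETCH_VRING_ALIGN - 1));
}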
/*
* Helper inline for vq_getchain(): record the i'th "real"
* descriptor.
*/
static inline void
_vq_record(int i, volatile struct virtio_desc *vd, struct iovec *iov, int n_iov,
uint16_t *flags)
{
if (i >= n_iov)
return;
iov[i].iov_base = paddr_guest2host(vd->vd_addr, vd->vd_len);
iov[i].iov_len = vd->vd_len;
if (flags != NULL)
flags[i] = vd->vd_flags;
}
#define VQ_MAX_DESCRIPTORS 512 /* see below */
/*
* Examine the chain of descriptors starting at the "next one" to
* make sure that they describe a sensible request. If so, return
* the number of "real" descriptors that would be needed/used in
* acting on this request. This may be smaller than the number of
* available descriptors, e.g., if there are two available but
* they are two separate requests, this just returns 1. Or, it
* may be larger: if there are indirect descriptors involved,
* there may only be one descriptor available but it may be an
* indirect pointing to eight more. We return 8 in this case,
* i.e., we do not count the indirect descriptors, only the "real"
* ones.
*
* Basically, this vets the vd_flags and vd_next field of each
* descriptor and tells you how many are involved. Since some may
* be indirect, this also needs the vmctx (in the pci_devinst
* at vs->vs_pi) so that it can find indirect descriptors.
*
* As we process each descriptor, we copy and adjust it (guest to
* host address wise, also using the vmctx) into the given iov[]
* array (of the given size). If the array overflows, we stop
* placing values into the array but keep processing descriptors,
* up to VQ_MAX_DESCRIPTORS, before giving up and returning -1.
* So you, the caller, must not assume that iov[] is as big as the
* return value (you can process the same thing twice to allocate
* a larger iov array if needed, or supply a zero length to find
* out how much space is needed).
*
* If you want to verify the WRITE flag on each descriptor, pass a
* non-NULL "flags" pointer to an array of "uint16_t" of the same size
* as n_iov and we'll copy each vd_flags field after unwinding any
* indirects.
*
* If some descriptor(s) are invalid, this prints a diagnostic message
* and returns -1. If no descriptors are ready now it simply returns 0.
*
* You are assumed to have done a vq_ring_ready() if needed (note
* that vq_has_descs() does one).
*/
int
vq_getchain(struct vqueue_info *vq, uint16_t *pidx, struct iovec *iov,
int n_iov, uint16_t *flags)
{
int i;
u_int ndesc, n_indir;
u_int idx, next;
volatile struct virtio_desc *vdir, *vindir, *vp;
struct virtio_softc *vs;
const char *name;
vs = vq->vq_vs;
name = vs->vs_vc->vc_name;
/*
* Note: it's the responsibility of the guest not to
* update vq->vq_avail->va_idx until all of the descriptors
* the guest has written are valid (including all their
* vd_next fields and vd_flags).
*
* Compute (va_idx - last_avail) in integers mod 2**16. This is
* the number of descriptors the device has made available
* since the last time we updated vq->vq_last_avail.
*
* We just need to do the subtraction as an unsigned int,
* then trim off excess bits.
*/
idx = vq->vq_last_avail;
ndesc = (uint16_t)((u_int)vq->vq_avail->va_idx - idx);
if (ndesc == 0)
return (0);
if (ndesc > vq->vq_qsize) {
/* XXX need better way to diagnose issues */
fprintf(stderr,
"%s: ndesc (%u) out of range, driver confused?\r\n",
name, (u_int)ndesc);
return (-1);
}
/*
* Now count/parse "involved" descriptors starting from
* the head of the chain.
*
* To prevent loops, we could be more complicated and
* check whether we're re-visiting a previously visited
* index, but we just abort if the count gets excessive.
*/
*pidx = next = vq->vq_avail->va_ring[idx & (vq->vq_qsize - 1)];
vq->vq_last_avail++;
for (i = 0; i < VQ_MAX_DESCRIPTORS; next = vdir->vd_next) {
if (next >= vq->vq_qsize) {
fprintf(stderr,
"%s: descriptor index %u out of range, "
"driver confused?\r\n",
name, next);
return (-1);
}
vdir = &vq->vq_desc[next];
if ((vdir->vd_flags & VRING_DESC_F_INDIRECT) == 0) {
_vq_record(i, vdir, iov, n_iov, flags);
i++;
} else if ((vs->vs_vc->vc_hv_caps &
VIRTIO_RING_F_INDIRECT_DESC) == 0) {
fprintf(stderr,
"%s: descriptor has forbidden INDIRECT flag, "
"driver confused?\r\n",
name);
return (-1);
} else {
n_indir = vdir->vd_len / 16;
if ((vdir->vd_len & 0xf) || n_indir == 0) {
fprintf(stderr,
"%s: invalid indir len 0x%x, "
"driver confused?\r\n",
name, (u_int)vdir->vd_len);
return (-1);
}
vindir = paddr_guest2host(vdir->vd_addr, vdir->vd_len);
/*
* Indirects start at the 0th, then follow
* their own embedded "next"s until those run
* out. Each one's indirect flag must be off
* (we don't really have to check, could just
* ignore errors...).
*/
next = 0;
for (;;) {
vp = &vindir[next];
if (vp->vd_flags & VRING_DESC_F_INDIRECT) {
fprintf(stderr,
"%s: indirect desc has INDIR flag,"
" driver confused?\r\n",
name);
return (-1);
}
_vq_record(i, vp, iov, n_iov, flags);
if (++i > VQ_MAX_DESCRIPTORS)
goto loopy;
if ((vp->vd_flags & VRING_DESC_F_NEXT) == 0)
break;
next = vp->vd_next;
if (next >= n_indir) {
fprintf(stderr,
"%s: invalid next %u > %u, "
"driver confused?\r\n",
name, (u_int)next, n_indir);
return (-1);
}
}
}
if ((vdir->vd_flags & VRING_DESC_F_NEXT) == 0)
return (i);
}
loopy:
fprintf(stderr,
"%s: descriptor loop? count > %d - driver confused?\r\n",
name, i);
return (-1);
}
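/*
 * Editor's note: a minimal sketch of the mod-2**16 counting idiom used
 * by vq_getchain() above. sketch_avail_count() is a hypothetical
 * helper: subtracting free-running 16-bit indices as unsigned values
 * and truncating to uint16_t yields the correct descriptor count even
 * after va_idx wraps past 65535.
 */
#include <assert.h>
#include <stdint.h>

static inline uint16_t
sketch_avail_count(uint16_t va_idx, uint16_t last_avail)
{
	/* unsigned subtraction, trimmed to 16 bits */
	return (uint16_t)(va_idx - last_avail);
}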
/*
* Return the currently-first request chain back to the available queue.
*
* (This chain is the one you handled when you called vq_getchain()
* and used its positive return value.)
*/
void
vq_retchain(struct vqueue_info *vq)
{
vq->vq_last_avail--;
}
/*
* Return specified request chain to the guest, setting its I/O length
* to the provided value.
*
* (This chain is the one you handled when you called vq_getchain()
* and used its positive return value.)
*/
void
vq_relchain(struct vqueue_info *vq, uint16_t idx, uint32_t iolen)
{
uint16_t uidx, mask;
volatile struct vring_used *vuh;
volatile struct virtio_used *vue;
/*
* Notes:
* - mask is N-1 where N is a power of 2 so computes x % N
* - vuh points to the "used" data shared with guest
* - vue points to the "used" ring entry we want to update
* - head is the same value we compute in vq_getchain().
*
* (I apologize for the two fields named vu_idx; the
* virtio spec calls the one that vue points to, "id"...)
*/
mask = vq->vq_qsize - 1;
vuh = vq->vq_used;
uidx = vuh->vu_idx;
vue = &vuh->vu_ring[uidx++ & mask];
vue->vu_idx = idx;
vue->vu_tlen = iolen;
vuh->vu_idx = uidx;
}
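/*
 * Editor's note: vq_relchain() above relies on vq_qsize being a power
 * of two, so masking with (qsize - 1) is equivalent to taking the
 * index modulo qsize. sketch_ring_slot() is a hypothetical helper
 * showing that the free-running counter maps to the right slot even
 * as it wraps.
 */
#include <assert.h>
#include <stdint.h>

static inline uint16_t
sketch_ring_slot(uint16_t free_running, uint16_t qsize_pow2)
{
	return (uint16_t)(free_running & (qsize_pow2 - 1));
}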
/*
* The driver has finished processing "available" chains and calling
* vq_relchain() on each one. If the driver used all the available
* chains, used_all_avail should be set.
*
* If the "used" index moved we may need to inform the guest, i.e.,
* deliver an interrupt. Even if the used index did NOT move we
* may need to deliver an interrupt, if the avail ring is empty and
* we are supposed to interrupt on empty.
*
* Note that used_all_avail is provided by the caller because it's
* a snapshot of the ring state when the caller decided to finish interrupt
* processing -- it's possible that descriptors became available after
* that point. (It's also typically a constant 1/True as well.)
*/
void
vq_endchains(struct vqueue_info *vq, int used_all_avail)
{
struct virtio_softc *vs;
uint16_t event_idx, new_idx, old_idx;
int intr;
/*
* Interrupt generation: if we're using EVENT_IDX,
* interrupt if we've crossed the event threshold.
* Otherwise interrupt is generated if we added "used" entries,
* but suppressed by VRING_AVAIL_F_NO_INTERRUPT.
*
* In any case, though, if NOTIFY_ON_EMPTY is set and the
* entire avail was processed, we need to interrupt always.
*/
vs = vq->vq_vs;
old_idx = vq->vq_save_used;
vq->vq_save_used = new_idx = vq->vq_used->vu_idx;
if (used_all_avail &&
(vs->vs_negotiated_caps & VIRTIO_F_NOTIFY_ON_EMPTY))
intr = 1;
else if (vs->vs_negotiated_caps & VIRTIO_RING_F_EVENT_IDX) {
event_idx = VQ_USED_EVENT_IDX(vq);
/*
* This calculation is per docs and the kernel
* (see src/sys/dev/virtio/virtio_ring.h).
*/
intr = (uint16_t)(new_idx - event_idx - 1) <
(uint16_t)(new_idx - old_idx);
} else {
intr = new_idx != old_idx &&
!(vq->vq_avail->va_flags & VRING_AVAIL_F_NO_INTERRUPT);
}
if (intr)
vq_interrupt(vs, vq);
}
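/*
 * Editor's note: the EVENT_IDX branch above implements the standard
 * vring_need_event() test: interrupt only when the used index has
 * passed the guest-supplied event index, i.e. event_idx lies in the
 * half-open window (old_idx, new_idx] in mod-2**16 arithmetic.
 * sketch_need_event() is a hypothetical helper reproducing it.
 */
#include <assert.h>
#include <stdint.h>

static inline int
sketch_need_event(uint16_t event_idx, uint16_t new_idx, uint16_t old_idx)
{
	return (uint16_t)(new_idx - event_idx - 1) <
	    (uint16_t)(new_idx - old_idx);
}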
/* Note: these are in sorted order to make for a fast search */
static struct config_reg {
uint16_t cr_offset; /* register offset */
uint8_t cr_size; /* size (bytes) */
uint8_t cr_ro; /* true => reg is read only */
const char *cr_name; /* name of reg */
} config_regs[] = {
{ VTCFG_R_HOSTCAP, 4, 1, "HOSTCAP" },
{ VTCFG_R_GUESTCAP, 4, 0, "GUESTCAP" },
{ VTCFG_R_PFN, 4, 0, "PFN" },
{ VTCFG_R_QNUM, 2, 1, "QNUM" },
{ VTCFG_R_QSEL, 2, 0, "QSEL" },
{ VTCFG_R_QNOTIFY, 2, 0, "QNOTIFY" },
{ VTCFG_R_STATUS, 1, 0, "STATUS" },
{ VTCFG_R_ISR, 1, 0, "ISR" },
{ VTCFG_R_CFGVEC, 2, 0, "CFGVEC" },
{ VTCFG_R_QVEC, 2, 0, "QVEC" },
};
static inline struct config_reg *
vi_find_cr(int offset) {
u_int hi, lo, mid;
struct config_reg *cr;
lo = 0;
hi = sizeof(config_regs) / sizeof(*config_regs) - 1;
while (hi >= lo) {
mid = (hi + lo) >> 1;
cr = &config_regs[mid];
if (cr->cr_offset == offset)
return (cr);
if (cr->cr_offset < offset)
lo = mid + 1;
else
hi = mid - 1;
}
return (NULL);
}
/*
* Handle pci config space reads.
* If it's to the MSI-X info, do that.
* If it's part of the virtio standard stuff, do that.
* Otherwise dispatch to the actual driver.
*/
uint64_t
vi_pci_read(UNUSED int vcpu, struct pci_devinst *pi, int baridx,
uint64_t offset, int size)
{
struct virtio_softc *vs = pi->pi_arg;
struct virtio_consts *vc;
struct config_reg *cr;
uint64_t virtio_config_size, max;
const char *name;
uint32_t newoff;
uint32_t value;
int error;
if (vs->vs_flags & VIRTIO_USE_MSIX) {
if (baridx == pci_msix_table_bar(pi) ||
baridx == pci_msix_pba_bar(pi)) {
return (pci_emul_msix_tread(pi, offset, size));
}
}
/* XXX probably should do something better than just assert() */
assert(baridx == 0);
if (vs->vs_mtx)
pthread_mutex_lock(vs->vs_mtx);
vc = vs->vs_vc;
name = vc->vc_name;
value = size == 1 ? 0xff : size == 2 ? 0xffff : 0xffffffff;
if (size != 1 && size != 2 && size != 4)
goto bad;
if (pci_msix_enabled(pi))
virtio_config_size = VTCFG_R_CFG1;
else
virtio_config_size = VTCFG_R_CFG0;
if (offset >= virtio_config_size) {
/*
* Subtract off the standard size (including MSI-X
* registers if enabled) and dispatch to underlying driver.
* If that fails, fall into general code.
*/
newoff = (uint32_t) (offset - virtio_config_size);
max = vc->vc_cfgsize ? vc->vc_cfgsize : 0x100000000;
if ((newoff + ((unsigned) size)) > max)
goto bad;
if (vc->vc_cfgread != NULL)
error = (*vc->vc_cfgread)(DEV_SOFTC(vs), ((int) newoff), size, &value);
else
error = 0;
if (!error)
goto done;
}
bad:
cr = vi_find_cr((int) offset);
if (cr == NULL || cr->cr_size != size) {
if (cr != NULL) {
/* offset must be OK, so size must be bad */
fprintf(stderr,
"%s: read from %s: bad size %d\r\n",
name, cr->cr_name, size);
} else {
fprintf(stderr,
"%s: read from bad offset/size %jd/%d\r\n",
name, (uintmax_t)offset, size);
}
goto done;
}
switch (offset) {
case VTCFG_R_HOSTCAP:
value = (uint32_t) vc->vc_hv_caps;
break;
case VTCFG_R_GUESTCAP:
value = vs->vs_negotiated_caps;
break;
case VTCFG_R_PFN:
if (vs->vs_curq < vc->vc_nvq)
value = vs->vs_queues[vs->vs_curq].vq_pfn;
break;
case VTCFG_R_QNUM:
value = vs->vs_curq < vc->vc_nvq ?
vs->vs_queues[vs->vs_curq].vq_qsize : 0;
break;
case VTCFG_R_QSEL:
value = (uint32_t) (vs->vs_curq);
break;
case VTCFG_R_QNOTIFY:
value = 0; /* XXX */
break;
case VTCFG_R_STATUS:
value = vs->vs_status;
break;
case VTCFG_R_ISR:
value = vs->vs_isr;
vs->vs_isr = 0; /* a read clears this flag */
if (value)
pci_lintr_deassert(pi);
break;
case VTCFG_R_CFGVEC:
value = vs->vs_msix_cfg_idx;
break;
case VTCFG_R_QVEC:
value = vs->vs_curq < vc->vc_nvq ?
vs->vs_queues[vs->vs_curq].vq_msix_idx :
VIRTIO_MSI_NO_VECTOR;
break;
}
done:
if (vs->vs_mtx)
pthread_mutex_unlock(vs->vs_mtx);
return (value);
}
/*
* Handle pci config space writes.
* If it's to the MSI-X info, do that.
* If it's part of the virtio standard stuff, do that.
* Otherwise dispatch to the actual driver.
*/
void
vi_pci_write(UNUSED int vcpu, struct pci_devinst *pi, int baridx,
uint64_t offset, int size, uint64_t value)
{
struct virtio_softc *vs = pi->pi_arg;
struct vqueue_info *vq;
struct virtio_consts *vc;
struct config_reg *cr;
uint64_t virtio_config_size, max;
const char *name;
uint32_t newoff;
int error;
if (vs->vs_flags & VIRTIO_USE_MSIX) {
if (baridx == pci_msix_table_bar(pi) ||
baridx == pci_msix_pba_bar(pi)) {
pci_emul_msix_twrite(pi, offset, size, value);
return;
}
}
/* XXX probably should do something better than just assert() */
assert(baridx == 0);
if (vs->vs_mtx)
pthread_mutex_lock(vs->vs_mtx);
vc = vs->vs_vc;
name = vc->vc_name;
if (size != 1 && size != 2 && size != 4)
goto bad;
if (pci_msix_enabled(pi))
virtio_config_size = VTCFG_R_CFG1;
else
virtio_config_size = VTCFG_R_CFG0;
if (offset >= virtio_config_size) {
/*
* Subtract off the standard size (including MSI-X
* registers if enabled) and dispatch to underlying driver.
*/
newoff = (uint32_t) (offset - virtio_config_size);
max = vc->vc_cfgsize ? vc->vc_cfgsize : 0x100000000;
if ((newoff + ((unsigned) size)) > max)
goto bad;
if (vc->vc_cfgwrite != NULL)
error = (*vc->vc_cfgwrite)(DEV_SOFTC(vs), ((int) newoff), size,
((uint32_t) value));
else
error = 0;
if (!error)
goto done;
}
bad:
cr = vi_find_cr((int) offset);
if (cr == NULL || cr->cr_size != size || cr->cr_ro) {
if (cr != NULL) {
/* offset must be OK, wrong size and/or reg is R/O */
if (cr->cr_size != size)
fprintf(stderr,
"%s: write to %s: bad size %d\r\n",
name, cr->cr_name, size);
if (cr->cr_ro)
fprintf(stderr,
"%s: write to read-only reg %s\r\n",
name, cr->cr_name);
} else {
fprintf(stderr,
"%s: write to bad offset/size %jd/%d\r\n",
name, (uintmax_t)offset, size);
}
goto done;
}
switch (offset) {
case VTCFG_R_GUESTCAP:
vs->vs_negotiated_caps = (uint32_t) (value & vc->vc_hv_caps);
if (vc->vc_apply_features)
(*vc->vc_apply_features)(DEV_SOFTC(vs),
vs->vs_negotiated_caps);
break;
case VTCFG_R_PFN:
if (vs->vs_curq >= vc->vc_nvq)
goto bad_qindex;
vi_vq_init(vs, ((uint32_t) value));
break;
case VTCFG_R_QSEL:
/*
* Note that the guest is allowed to select an
* invalid queue; we just need to return a QNUM
* of 0 while the bad queue is selected.
*/
vs->vs_curq = (int) value;
break;
case VTCFG_R_QNOTIFY:
if (value >= ((uint64_t) vc->vc_nvq)) {
fprintf(stderr, "%s: queue %d notify out of range\r\n",
name, (int)value);
goto done;
}
vq = &vs->vs_queues[value];
if (vq->vq_notify)
(*vq->vq_notify)(DEV_SOFTC(vs), vq);
else if (vc->vc_qnotify)
(*vc->vc_qnotify)(DEV_SOFTC(vs), vq);
else
fprintf(stderr,
"%s: qnotify queue %d: missing vq/vc notify\r\n",
name, (int)value);
break;
case VTCFG_R_STATUS:
vs->vs_status = (uint8_t) value;
if (value == 0)
(*vc->vc_reset)(DEV_SOFTC(vs));
break;
case VTCFG_R_CFGVEC:
vs->vs_msix_cfg_idx = (uint16_t) value;
break;
case VTCFG_R_QVEC:
if (vs->vs_curq >= vc->vc_nvq)
goto bad_qindex;
vq = &vs->vs_queues[vs->vs_curq];
vq->vq_msix_idx = (uint16_t) value;
break;
}
goto done;
bad_qindex:
fprintf(stderr,
"%s: write config reg %s: curq %d >= max %d\r\n",
name, cr->cr_name, vs->vs_curq, vc->vc_nvq);
done:
if (vs->vs_mtx)
pthread_mutex_unlock(vs->vs_mtx);
}
```
The 1976–77 Scottish Inter-District Championship was a rugby union competition for Scotland's district teams.
It was the 24th edition of the championship.
South won the competition with three wins.
1976–77 League Table
Results
Round 1
South:
Glasgow District:
Round 2
North and Midlands:
Edinburgh District:
References
1976–77 in Scottish rugby union
Scottish Inter-District Championship seasons
```java
/*
* path_to_url
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.wso2.ballerinalang.compiler.semantics.analyzer;
import io.ballerina.tools.diagnostics.DiagnosticCode;
import io.ballerina.tools.diagnostics.Location;
import org.ballerinalang.model.TreeBuilder;
import org.ballerinalang.model.elements.AttachPoint;
import org.ballerinalang.model.elements.Flag;
import org.ballerinalang.model.elements.PackageID;
import org.ballerinalang.model.symbols.SymbolKind;
import org.ballerinalang.model.tree.IdentifierNode;
import org.ballerinalang.model.tree.NodeKind;
import org.ballerinalang.model.tree.OperatorKind;
import org.ballerinalang.model.types.TypeKind;
import org.ballerinalang.util.diagnostic.DiagnosticErrorCode;
import org.wso2.ballerinalang.compiler.diagnostic.BLangDiagnosticLog;
import org.wso2.ballerinalang.compiler.parser.BLangAnonymousModelHelper;
import org.wso2.ballerinalang.compiler.parser.BLangMissingNodesHelper;
import org.wso2.ballerinalang.compiler.semantics.model.Scope;
import org.wso2.ballerinalang.compiler.semantics.model.Scope.ScopeEntry;
import org.wso2.ballerinalang.compiler.semantics.model.SymbolEnv;
import org.wso2.ballerinalang.compiler.semantics.model.SymbolTable;
import org.wso2.ballerinalang.compiler.semantics.model.symbols.BAnnotationAttachmentSymbol;
import org.wso2.ballerinalang.compiler.semantics.model.symbols.BAnnotationSymbol;
import org.wso2.ballerinalang.compiler.semantics.model.symbols.BConstantSymbol;
import org.wso2.ballerinalang.compiler.semantics.model.symbols.BErrorTypeSymbol;
import org.wso2.ballerinalang.compiler.semantics.model.symbols.BInvokableTypeSymbol;
import org.wso2.ballerinalang.compiler.semantics.model.symbols.BObjectTypeSymbol;
import org.wso2.ballerinalang.compiler.semantics.model.symbols.BOperatorSymbol;
import org.wso2.ballerinalang.compiler.semantics.model.symbols.BPackageSymbol;
import org.wso2.ballerinalang.compiler.semantics.model.symbols.BRecordTypeSymbol;
import org.wso2.ballerinalang.compiler.semantics.model.symbols.BSymbol;
import org.wso2.ballerinalang.compiler.semantics.model.symbols.BTypeDefinitionSymbol;
import org.wso2.ballerinalang.compiler.semantics.model.symbols.BTypeSymbol;
import org.wso2.ballerinalang.compiler.semantics.model.symbols.BVarSymbol;
import org.wso2.ballerinalang.compiler.semantics.model.symbols.BXMLNSSymbol;
import org.wso2.ballerinalang.compiler.semantics.model.symbols.SymTag;
import org.wso2.ballerinalang.compiler.semantics.model.symbols.Symbols;
import org.wso2.ballerinalang.compiler.semantics.model.types.BAnydataType;
import org.wso2.ballerinalang.compiler.semantics.model.types.BArrayType;
import org.wso2.ballerinalang.compiler.semantics.model.types.BErrorType;
import org.wso2.ballerinalang.compiler.semantics.model.types.BField;
import org.wso2.ballerinalang.compiler.semantics.model.types.BFiniteType;
import org.wso2.ballerinalang.compiler.semantics.model.types.BFutureType;
import org.wso2.ballerinalang.compiler.semantics.model.types.BIntersectionType;
import org.wso2.ballerinalang.compiler.semantics.model.types.BInvokableType;
import org.wso2.ballerinalang.compiler.semantics.model.types.BJSONType;
import org.wso2.ballerinalang.compiler.semantics.model.types.BMapType;
import org.wso2.ballerinalang.compiler.semantics.model.types.BObjectType;
import org.wso2.ballerinalang.compiler.semantics.model.types.BParameterizedType;
import org.wso2.ballerinalang.compiler.semantics.model.types.BRecordType;
import org.wso2.ballerinalang.compiler.semantics.model.types.BStreamType;
import org.wso2.ballerinalang.compiler.semantics.model.types.BTableType;
import org.wso2.ballerinalang.compiler.semantics.model.types.BTupleMember;
import org.wso2.ballerinalang.compiler.semantics.model.types.BTupleType;
import org.wso2.ballerinalang.compiler.semantics.model.types.BType;
import org.wso2.ballerinalang.compiler.semantics.model.types.BTypeIdSet;
import org.wso2.ballerinalang.compiler.semantics.model.types.BTypedescType;
import org.wso2.ballerinalang.compiler.semantics.model.types.BUnionType;
import org.wso2.ballerinalang.compiler.semantics.model.types.BXMLType;
import org.wso2.ballerinalang.compiler.tree.BLangAnnotationAttachment;
import org.wso2.ballerinalang.compiler.tree.BLangConstantValue;
import org.wso2.ballerinalang.compiler.tree.BLangFunction;
import org.wso2.ballerinalang.compiler.tree.BLangIdentifier;
import org.wso2.ballerinalang.compiler.tree.BLangNode;
import org.wso2.ballerinalang.compiler.tree.BLangNodeTransformer;
import org.wso2.ballerinalang.compiler.tree.BLangSimpleVariable;
import org.wso2.ballerinalang.compiler.tree.BLangTableKeySpecifier;
import org.wso2.ballerinalang.compiler.tree.BLangTypeDefinition;
import org.wso2.ballerinalang.compiler.tree.BLangVariable;
import org.wso2.ballerinalang.compiler.tree.expressions.BLangBinaryExpr;
import org.wso2.ballerinalang.compiler.tree.expressions.BLangExpression;
import org.wso2.ballerinalang.compiler.tree.expressions.BLangLiteral;
import org.wso2.ballerinalang.compiler.tree.expressions.BLangReAtomQuantifier;
import org.wso2.ballerinalang.compiler.tree.expressions.BLangReCapturingGroups;
import org.wso2.ballerinalang.compiler.tree.expressions.BLangReSequence;
import org.wso2.ballerinalang.compiler.tree.expressions.BLangReTerm;
import org.wso2.ballerinalang.compiler.tree.expressions.BLangRecordLiteral;
import org.wso2.ballerinalang.compiler.tree.expressions.BLangSimpleVarRef;
import org.wso2.ballerinalang.compiler.tree.expressions.BLangTypedescExpr;
import org.wso2.ballerinalang.compiler.tree.types.BLangArrayType;
import org.wso2.ballerinalang.compiler.tree.types.BLangBuiltInRefTypeNode;
import org.wso2.ballerinalang.compiler.tree.types.BLangConstrainedType;
import org.wso2.ballerinalang.compiler.tree.types.BLangErrorType;
import org.wso2.ballerinalang.compiler.tree.types.BLangFiniteTypeNode;
import org.wso2.ballerinalang.compiler.tree.types.BLangFunctionTypeNode;
import org.wso2.ballerinalang.compiler.tree.types.BLangIntersectionTypeNode;
import org.wso2.ballerinalang.compiler.tree.types.BLangObjectTypeNode;
import org.wso2.ballerinalang.compiler.tree.types.BLangRecordTypeNode;
import org.wso2.ballerinalang.compiler.tree.types.BLangStreamType;
import org.wso2.ballerinalang.compiler.tree.types.BLangTableTypeNode;
import org.wso2.ballerinalang.compiler.tree.types.BLangTupleTypeNode;
import org.wso2.ballerinalang.compiler.tree.types.BLangType;
import org.wso2.ballerinalang.compiler.tree.types.BLangUnionTypeNode;
import org.wso2.ballerinalang.compiler.tree.types.BLangUserDefinedType;
import org.wso2.ballerinalang.compiler.tree.types.BLangValueType;
import org.wso2.ballerinalang.compiler.util.BArrayState;
import org.wso2.ballerinalang.compiler.util.CompilerContext;
import org.wso2.ballerinalang.compiler.util.ImmutableTypeCloner;
import org.wso2.ballerinalang.compiler.util.Name;
import org.wso2.ballerinalang.compiler.util.Names;
import org.wso2.ballerinalang.compiler.util.TypeDefBuilderHelper;
import org.wso2.ballerinalang.compiler.util.TypeTags;
import org.wso2.ballerinalang.compiler.util.Unifier;
import org.wso2.ballerinalang.util.Flags;
import org.wso2.ballerinalang.util.Lists;
import java.util.ArrayList;
import java.util.Collection;
import java.util.EnumSet;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Map;
import java.util.Optional;
import java.util.Set;
import java.util.Stack;
import static java.lang.String.format;
import static org.ballerinalang.model.symbols.SymbolOrigin.BUILTIN;
import static org.ballerinalang.model.symbols.SymbolOrigin.SOURCE;
import static org.ballerinalang.model.symbols.SymbolOrigin.VIRTUAL;
import static org.wso2.ballerinalang.compiler.semantics.model.Scope.NOT_FOUND_ENTRY;
import static org.wso2.ballerinalang.compiler.util.Constants.INFERRED_ARRAY_INDICATOR;
import static org.wso2.ballerinalang.compiler.util.Constants.OPEN_ARRAY_INDICATOR;
/**
* @since 0.94
*/
public class SymbolResolver extends BLangNodeTransformer<SymbolResolver.AnalyzerData, BType> {
public static final int MAX_ARRAY_SIZE = Integer.MAX_VALUE - 10; // -10 was added due to the JVM limitations
private static final CompilerContext.Key<SymbolResolver> SYMBOL_RESOLVER_KEY =
new CompilerContext.Key<>();
private final SymbolTable symTable;
private final Names names;
private final BLangDiagnosticLog dlog;
private final Types types;
private final SymbolEnter symbolEnter;
private final TypeResolver typeResolver;
private final BLangAnonymousModelHelper anonymousModelHelper;
private final BLangMissingNodesHelper missingNodesHelper;
private final Unifier unifier;
private final SemanticAnalyzer semanticAnalyzer;
private final Stack<String> anonTypeNameSuffixes;
public static SymbolResolver getInstance(CompilerContext context) {
SymbolResolver symbolResolver = context.get(SYMBOL_RESOLVER_KEY);
if (symbolResolver == null) {
symbolResolver = new SymbolResolver(context);
}
return symbolResolver;
}
public SymbolResolver(CompilerContext context) {
context.put(SYMBOL_RESOLVER_KEY, this);
this.symTable = SymbolTable.getInstance(context);
this.names = Names.getInstance(context);
this.dlog = BLangDiagnosticLog.getInstance(context);
this.types = Types.getInstance(context);
this.symbolEnter = SymbolEnter.getInstance(context);
this.anonymousModelHelper = BLangAnonymousModelHelper.getInstance(context);
this.typeResolver = TypeResolver.getInstance(context);
this.missingNodesHelper = BLangMissingNodesHelper.getInstance(context);
this.semanticAnalyzer = SemanticAnalyzer.getInstance(context);
this.unifier = new Unifier();
this.anonTypeNameSuffixes = new Stack<>();
}
@Override
public BType transformNode(BLangNode node, AnalyzerData props) {
// Should not reach here
return symTable.neverType;
}
public boolean checkForUniqueSymbol(Location pos, SymbolEnv env, BSymbol symbol) {
//lookup symbol
BSymbol foundSym = symTable.notFoundSymbol;
long expSymTag = symbol.tag;
if ((expSymTag & SymTag.IMPORT) == SymTag.IMPORT) {
foundSym = lookupSymbolInPrefixSpace(env, symbol.name);
} else if ((expSymTag & SymTag.ANNOTATION) == SymTag.ANNOTATION) {
foundSym = lookupSymbolInAnnotationSpace(env, symbol.name);
} else if ((expSymTag & SymTag.CONSTRUCTOR) == SymTag.CONSTRUCTOR) {
foundSym = lookupSymbolInConstructorSpace(env, symbol.name);
} else if ((expSymTag & SymTag.MAIN) == SymTag.MAIN) {
// Using this method for looking up in the main symbol space since record field symbol lookups have
// different semantics depending on whether it's looking up a referenced symbol or looking up to see if
// the symbol is unique within the scope.
foundSym = lookupSymbolForDecl(env, symbol.name, SymTag.MAIN);
}
if (foundSym == symTable.notFoundSymbol && expSymTag == SymTag.FUNCTION) {
int dotPosition = symbol.name.value.indexOf('.');
if (dotPosition > 0 && dotPosition != symbol.name.value.length()) {
String funcName = symbol.name.value.substring(dotPosition + 1);
foundSym = lookupSymbolForDecl(env, Names.fromString(funcName), SymTag.MAIN);
}
}
//if symbol is not found then it is unique for the current scope
if (foundSym == symTable.notFoundSymbol) {
return true;
}
if (symbol.tag == SymTag.XMLNS && isDistinctXMLNSSymbol((BXMLNSSymbol) symbol, (BXMLNSSymbol) foundSym)) {
return true;
}
// if a symbol is found, then check whether it is unique
if (!isDistinctSymbol(pos, symbol, foundSym)) {
return false;
}
if (isRedeclaredSymbol(symbol, foundSym)) {
Name name = symbol.name;
if (Symbols.isRemote(symbol) ^ Symbols.isRemote(foundSym)) {
dlog.error(pos, DiagnosticErrorCode.UNSUPPORTED_REMOTE_METHOD_NAME_IN_SCOPE, name);
return false;
}
if (symbol.kind != SymbolKind.CONSTANT) {
dlog.error(pos, DiagnosticErrorCode.REDECLARED_SYMBOL, name);
}
return false;
}
// In order to remove duplicate errors.
return (foundSym.tag & SymTag.SERVICE) != SymTag.SERVICE;
}
private boolean isRedeclaredSymbol(BSymbol symbol, BSymbol foundSym) {
return hasSameOwner(symbol, foundSym) || isSymbolRedeclaredInTestPackage(symbol, foundSym) ||
isRedeclaredTypeDefinitionSymbol(symbol, foundSym);
}
public boolean checkForUniqueSymbol(SymbolEnv env, BSymbol symbol) {
BSymbol foundSym = lookupSymbolInMainSpace(env, symbol.name);
if (foundSym == symTable.notFoundSymbol) {
return true;
}
if (symbol.tag == SymTag.CONSTRUCTOR && foundSym.tag == SymTag.ERROR) {
return false;
}
return !hasSameOwner(symbol, foundSym);
}
/**
* This method will check whether the given symbol that is being defined is unique by only checking its current
* environment scope.
*
* @param pos symbol pos for diagnostic purpose.
* @param env symbol environment to lookup.
* @param symbol the symbol that is being defined.
* @param expSymTag expected tag of the symbol for.
* @return true if the symbol is unique, false otherwise.
*/
public boolean checkForUniqueSymbolInCurrentScope(Location pos, SymbolEnv env, BSymbol symbol,
long expSymTag) {
//lookup in current scope
BSymbol foundSym = lookupSymbolInGivenScope(env, symbol.name, expSymTag);
//if symbol is not found then it is unique for the current scope
if (foundSym == symTable.notFoundSymbol) {
return true;
}
//if a symbol is found, then check whether it is unique
return isDistinctSymbol(pos, symbol, foundSym);
}
/**
* This method will check whether the symbol being defined is unique comparing it with the found symbol
* from the scope.
*
* @param pos symbol pos for diagnostic purpose.
* @param symbol symbol that is being defined.
* @param foundSym symbol that is found from the scope.
* @return true if the symbol is unique, false otherwise.
*/
private boolean isDistinctSymbol(Location pos, BSymbol symbol, BSymbol foundSym) {
// It is allowed to have a error constructor symbol with the same name as a type def.
if (symbol.tag == SymTag.CONSTRUCTOR && foundSym.tag == SymTag.ERROR) {
return false;
}
if (isSymbolDefinedInRootPkgLvl(foundSym)) {
dlog.error(pos, DiagnosticErrorCode.REDECLARED_BUILTIN_SYMBOL, symbol.name);
return false;
}
return true;
}
private boolean hasSameOwner(BSymbol symbol, BSymbol foundSym) {
// check whether the given symbol owner is same as found symbol's owner
if (foundSym.owner == symbol.owner) {
return true;
} else if (Symbols.isFlagOn(symbol.owner.flags, Flags.LAMBDA) &&
((foundSym.owner.tag & SymTag.INVOKABLE) == SymTag.INVOKABLE)) {
// If the symbol being defined is inside a lambda and the existing symbol is defined inside a function, both
// symbols are in the same block scope.
return true;
} else if (((symbol.owner.tag & SymTag.LET) == SymTag.LET) &&
((foundSym.owner.tag & SymTag.INVOKABLE) == SymTag.INVOKABLE)) {
// If the symbol being defined is inside a let expression and the existing symbol is defined inside a
// function, both symbols are in the same scope.
return true;
} else if (((symbol.owner.tag & SymTag.FUNCTION_TYPE) == SymTag.FUNCTION_TYPE) &&
((foundSym.owner.tag & SymTag.INVOKABLE) == SymTag.INVOKABLE)) {
// If the symbol being defined is inside a function type and the existing symbol is defined inside a
// function, both symbols are in the same scope.
return true;
} else if (Symbols.isFlagOn(symbol.owner.flags, Flags.OBJECT_CTOR) &&
((foundSym.owner.tag & SymTag.INVOKABLE) == SymTag.INVOKABLE)) {
// An object constructor may use a symbol inside a function, masking the symbol in the function scope.
// This prevents outer-scope variable names from clashing with method names inside the object constructor.
if (Symbols.isFlagOn(symbol.flags, Flags.ATTACHED) || Symbols.isFlagOn(foundSym.flags, Flags.ATTACHED)) {
return false;
}
// This prevents `self` symbol clashes between multilevel object constructors.
if (foundSym.name.value.equals(Names.SELF.value) || symbol.name.value.equals(Names.SELF.value)) {
return false;
}
return true;
}
return false;
}
private boolean isRedeclaredTypeDefinitionSymbol(BSymbol symbol, BSymbol foundSym) {
if (symbol.kind != SymbolKind.TYPE_DEF && foundSym.kind != SymbolKind.TYPE_DEF) {
return false;
}
if (symbol.kind == SymbolKind.TYPE_DEF) {
return hasSameOwner(symbol.type.tsymbol, foundSym);
} else {
return hasSameOwner(symbol, foundSym.type.tsymbol);
}
}
private boolean isSymbolRedeclaredInTestPackage(BSymbol symbol, BSymbol foundSym) {
if (Symbols.isFlagOn(symbol.owner.flags, Flags.TESTABLE) &&
!Symbols.isFlagOn(foundSym.owner.flags, Flags.TESTABLE)) {
return true;
}
return false;
}
private boolean isSymbolDefinedInRootPkgLvl(BSymbol foundSym) {
long foundSymTag = foundSym.tag;
return symTable.rootPkgSymbol.pkgID.equals(foundSym.pkgID) &&
(foundSymTag & SymTag.VARIABLE_NAME) == SymTag.VARIABLE_NAME;
}
/**
* Lookup the symbol using given name in the given environment scope only.
*
* @param env environment to lookup the symbol.
* @param name name of the symbol to lookup.
* @param expSymTag expected tag of the symbol.
* @return the symbol if found; otherwise the not-found symbol.
*/
public BSymbol lookupSymbolInGivenScope(SymbolEnv env, Name name, long expSymTag) {
ScopeEntry entry = env.scope.lookup(name);
while (entry != NOT_FOUND_ENTRY) {
if (symTable.rootPkgSymbol.pkgID.equals(entry.symbol.pkgID) &&
(entry.symbol.tag & SymTag.VARIABLE_NAME) == SymTag.VARIABLE_NAME) {
return entry.symbol;
}
if ((entry.symbol.tag & expSymTag) == expSymTag && !isFieldRefFromWithinARecord(entry.symbol, env)) {
return entry.symbol;
}
entry = entry.next;
}
return symTable.notFoundSymbol;
}
public boolean checkForUniqueMemberSymbol(Location pos, SymbolEnv env, BSymbol symbol) {
BSymbol foundSym = lookupMemberSymbol(pos, env.scope, env, symbol.name, symbol.tag);
if (foundSym != symTable.notFoundSymbol) {
dlog.error(pos, DiagnosticErrorCode.REDECLARED_SYMBOL, symbol.name);
return false;
}
return true;
}
public BSymbol resolveBinaryOperator(OperatorKind opKind,
BType lhsType,
BType rhsType) {
return resolveOperator(Names.fromString(opKind.value()), Lists.of(lhsType, rhsType));
}
private BSymbol createEqualityOperator(OperatorKind opKind, BType lhsType, BType rhsType) {
List<BType> paramTypes = Lists.of(lhsType, rhsType);
BType retType = symTable.booleanType;
BInvokableType opType = new BInvokableType(paramTypes, retType, null);
return new BOperatorSymbol(Names.fromString(opKind.value()), null, opType, null, symTable.builtinPos, VIRTUAL);
}
public BSymbol resolveUnaryOperator(OperatorKind opKind,
BType type) {
return resolveOperator(Names.fromString(opKind.value()), Lists.of(type));
}
public BSymbol resolveOperator(Name name, List<BType> types) {
ScopeEntry entry = symTable.rootScope.lookup(name);
return resolveOperator(entry, types);
}
private BSymbol createBinaryComparisonOperator(OperatorKind opKind, BType lhsType, BType rhsType) {
List<BType> paramTypes = Lists.of(lhsType, rhsType);
BInvokableType opType = new BInvokableType(paramTypes, symTable.booleanType, null);
return new BOperatorSymbol(Names.fromString(opKind.value()), null, opType, null, symTable.builtinPos, VIRTUAL);
}
private BSymbol createBinaryOperator(OperatorKind opKind, BType lhsType, BType rhsType, BType retType) {
List<BType> paramTypes = Lists.of(lhsType, rhsType);
BInvokableType opType = new BInvokableType(paramTypes, retType, null);
return new BOperatorSymbol(Names.fromString(opKind.value()), null, opType, null, symTable.builtinPos, VIRTUAL);
}
BSymbol createUnaryOperator(OperatorKind kind, BType type, BType retType) {
List<BType> paramTypes = Lists.of(type);
BInvokableType opType = new BInvokableType(paramTypes, retType, null);
return new BOperatorSymbol(Names.fromString(kind.value()), null, opType, null, symTable.builtinPos, VIRTUAL);
}
public BSymbol resolvePkgSymbol(Location pos, SymbolEnv env, Name pkgAlias) {
if (pkgAlias == Names.EMPTY) {
// Return the current package symbol
return env.enclPkg.symbol;
}
// Look up an imported package.
BSymbol pkgSymbol = lookupSymbolInPrefixSpace(env, pkgAlias);
if (pkgSymbol == symTable.notFoundSymbol) {
dlog.error(pos, DiagnosticErrorCode.UNDEFINED_MODULE, pkgAlias.value);
}
return pkgSymbol;
}
public BSymbol resolvePrefixSymbol(SymbolEnv env, Name pkgAlias, Name compUnit) {
if (pkgAlias == Names.EMPTY) {
// Return the current package symbol
return env.enclPkg.symbol;
}
// Look up an imported package.
ScopeEntry entry = env.scope.lookup(pkgAlias);
while (entry != NOT_FOUND_ENTRY) {
BSymbol symbol = entry.symbol;
long tag = symbol.tag;
if (isDistinctXMLNSSymbol(symbol, compUnit)) {
return symbol;
}
if (!((tag & SymTag.XMLNS) == SymTag.XMLNS) && (tag & SymTag.IMPORT) == SymTag.IMPORT &&
((BPackageSymbol) symbol).compUnit.equals(compUnit)) {
((BPackageSymbol) symbol).isUsed = true;
return symbol;
}
entry = entry.next;
}
if (env.enclEnv != null) {
return resolvePrefixSymbol(env.enclEnv, pkgAlias, compUnit);
}
return symTable.notFoundSymbol;
}
private boolean isDistinctXMLNSSymbol(BSymbol symbol, Name compUnit) {
if (symbol instanceof BXMLNSSymbol bxmlnsSymbol) {
Name xmlnsCompUnit = bxmlnsSymbol.compUnit;
return xmlnsCompUnit == null || xmlnsCompUnit.equals(compUnit);
}
return false;
}
public BSymbol resolveAnnotation(Location pos, SymbolEnv env, Name pkgAlias, Name annotationName) {
return this.lookupAnnotationSpaceSymbolInPackage(pos, env, pkgAlias, annotationName);
}
public BSymbol resolveStructField(Location location, SymbolEnv env, Name fieldName,
BTypeSymbol structSymbol) {
return lookupMemberSymbol(location, structSymbol.scope, env, fieldName, SymTag.VARIABLE);
}
public BSymbol resolveObjectField(Location location, SymbolEnv env, Name fieldName,
BTypeSymbol objectSymbol) {
return lookupMemberSymbol(location, objectSymbol.scope, env, fieldName, SymTag.VARIABLE);
}
public BSymbol resolveObjectMethod(Location pos, SymbolEnv env, Name fieldName,
BObjectTypeSymbol objectSymbol) {
return lookupMemberSymbol(pos, objectSymbol.scope, env, fieldName, SymTag.VARIABLE);
}
public BSymbol resolveInvocableObjectField(Location pos, SymbolEnv env, Name fieldName,
BObjectTypeSymbol objectTypeSymbol) {
return lookupMemberSymbol(pos, objectTypeSymbol.scope, env, fieldName, SymTag.VARIABLE);
}
public BType resolveTypeNode(BLangType typeNode, SymbolEnv env) {
AnalyzerData data = new AnalyzerData(env);
return resolveTypeNode(typeNode, data, env, DiagnosticErrorCode.UNKNOWN_TYPE);
}
public BType resolveTypeNode(BLangType typeNode, SymbolEnv env, DiagnosticCode diagCode) {
AnalyzerData data = new AnalyzerData(env);
return resolveTypeNode(typeNode, data, env, diagCode);
}
private BType resolveTypeNode(BLangType typeNode, AnalyzerData data, SymbolEnv env) {
return resolveTypeNode(typeNode, data, env, DiagnosticErrorCode.UNKNOWN_TYPE);
}
private BType resolveTypeNode(BLangType typeNode, AnalyzerData data, SymbolEnv env, DiagnosticCode diagCode) {
if (typeNode == null) {
return symTable.neverType;
}
SymbolEnv prevEnv = data.env;
DiagnosticCode preDiagCode = data.diagCode;
data.env = env; // TODO: can remove?
data.diagCode = diagCode;
BType resultType = typeNode.apply(this, data);
data.env = prevEnv;
data.diagCode = preDiagCode;
BType refType = Types.getImpliedType(resultType);
if (refType != symTable.noType) {
// If typeNode.nullable is true, convert the resultType to a union type,
// unless it is already a union type, the JSON type, or the any type.
if (typeNode.nullable && resultType.tag == TypeTags.UNION) {
BUnionType unionType = (BUnionType) refType;
unionType.add(symTable.nilType);
} else if (typeNode.nullable && resultType.tag != TypeTags.JSON && resultType.tag != TypeTags.ANY) {
resultType = BUnionType.create(null, resultType, symTable.nilType);
} else if (typeNode.nullable && refType.tag != TypeTags.JSON && refType.tag != TypeTags.ANY) {
resultType = BUnionType.create(null, resultType, symTable.nilType);
}
}
validateDistinctType(typeNode, resultType);
typeNode.setBType(resultType);
return resultType;
}
public void validateDistinctType(BLangType typeNode, BType type) {
if (typeNode.flagSet.contains(Flag.DISTINCT) && !isDistinctAllowedOnType(type)) {
dlog.error(typeNode.pos, DiagnosticErrorCode.DISTINCT_TYPING_ONLY_SUPPORT_OBJECTS_AND_ERRORS);
}
}
private boolean isDistinctAllowedOnType(BType type) {
if (type.tag == TypeTags.TYPEREFDESC) {
return isDistinctAllowedOnType(Types.getImpliedType(type));
}
if (type.tag == TypeTags.INTERSECTION) {
for (BType constituentType : ((BIntersectionType) type).getConstituentTypes()) {
if (!isDistinctAllowedOnType(constituentType)) {
return false;
}
}
return true;
}
if (type.tag == TypeTags.UNION) {
for (BType memberType : ((BUnionType) type).getMemberTypes()) {
if (!isDistinctAllowedOnType(memberType)) {
return false;
}
}
return true;
}
return type.tag == TypeTags.ERROR
|| type.tag == TypeTags.OBJECT
|| type.tag == TypeTags.NONE
|| type.tag == TypeTags.SEMANTIC_ERROR;
}
/**
* Return the symbol associated with the given name in the current package. This method first searches for the
* symbol in the current scope and proceeds to the enclosing scope if it is not found there. This process
* continues until the symbol is found or the root scope is reached. This method is mainly meant for checking
* whether a given symbol is already defined in the scope hierarchy.
*
* @param env current symbol environment
* @param name symbol name
* @param expSymTag expected symbol type/tag
* @return resolved symbol
*/
private BSymbol lookupSymbolForDecl(SymbolEnv env, Name name, long expSymTag) {
ScopeEntry entry = env.scope.lookup(name);
while (entry != NOT_FOUND_ENTRY) {
if ((entry.symbol.tag & expSymTag) == expSymTag) {
return entry.symbol;
}
entry = entry.next;
}
if (env.enclEnv != null) {
return lookupSymbol(env.enclEnv, name, expSymTag);
}
return symTable.notFoundSymbol;
}
/**
* Return the symbol associated with the given name in the current package. This method first searches for the
* symbol in the current scope and proceeds to the enclosing scope if it is not found there. This process
* continues until the symbol is found or the root scope is reached. This method is meant for looking up a symbol
* when it is referenced. If looking up a symbol from within a record type definition, this method ignores record
* fields. This is done so that default value expressions cannot refer to other record fields.
*
* @param env current symbol environment
* @param name symbol name
* @param expSymTag expected symbol type/tag
* @return resolved symbol
*/
private BSymbol lookupSymbol(SymbolEnv env, Name name, long expSymTag) {
ScopeEntry entry = env.scope.lookup(name);
while (entry != NOT_FOUND_ENTRY) {
if ((entry.symbol.tag & expSymTag) == expSymTag && !isFieldRefFromWithinARecord(entry.symbol, env)) {
return entry.symbol;
}
entry = entry.next;
}
if (env.enclEnv != null) {
return lookupSymbol(env.enclEnv, name, expSymTag);
}
return symTable.notFoundSymbol;
}
/**
* Checks whether the specified symbol is a symbol of a record field and whether that field is referred to from
* within a record type definition (not necessarily the owner of the field).
*
* @param symbol symbol to be tested
* @param env the environment in which the symbol was found
* @return `true` if the above-described condition holds
*/
private boolean isFieldRefFromWithinARecord(BSymbol symbol, SymbolEnv env) {
return (symbol.owner.tag & SymTag.RECORD) == SymTag.RECORD &&
env.enclType != null && env.enclType.getKind() == NodeKind.RECORD_TYPE;
}
public BSymbol lookupSymbolInMainSpace(SymbolEnv env, Name name) {
return lookupSymbol(env, name, SymTag.MAIN);
}
public BSymbol lookupSymbolInAnnotationSpace(SymbolEnv env, Name name) {
return lookupSymbol(env, name, SymTag.ANNOTATION);
}
public BSymbol lookupSymbolInPrefixSpace(SymbolEnv env, Name name) {
return lookupSymbol(env, name, SymTag.IMPORT);
}
public BSymbol lookupSymbolInConstructorSpace(SymbolEnv env, Name name) {
return lookupSymbol(env, name, SymTag.CONSTRUCTOR);
}
public BSymbol lookupLangLibMethod(BType type, Name name, SymbolEnv env) {
BType referredType = Types.getImpliedType(type);
if (symTable.langAnnotationModuleSymbol == null) {
return symTable.notFoundSymbol;
}
BSymbol bSymbol;
switch (referredType.tag) {
case TypeTags.ARRAY:
case TypeTags.TUPLE:
bSymbol = lookupMethodInModule(symTable.langArrayModuleSymbol, name, env);
break;
case TypeTags.DECIMAL:
bSymbol = lookupMethodInModule(symTable.langDecimalModuleSymbol, name, env);
break;
case TypeTags.ERROR:
bSymbol = lookupMethodInModule(symTable.langErrorModuleSymbol, name, env);
break;
case TypeTags.FLOAT:
bSymbol = lookupMethodInModule(symTable.langFloatModuleSymbol, name, env);
break;
case TypeTags.FUTURE:
bSymbol = lookupMethodInModule(symTable.langFutureModuleSymbol, name, env);
break;
case TypeTags.INT:
case TypeTags.SIGNED32_INT:
case TypeTags.SIGNED16_INT:
case TypeTags.SIGNED8_INT:
case TypeTags.UNSIGNED32_INT:
case TypeTags.UNSIGNED16_INT:
case TypeTags.UNSIGNED8_INT:
case TypeTags.BYTE:
bSymbol = lookupMethodInModule(symTable.langIntModuleSymbol, name, env);
break;
case TypeTags.MAP:
case TypeTags.RECORD:
bSymbol = lookupMethodInModule(symTable.langMapModuleSymbol, name, env);
break;
case TypeTags.OBJECT:
bSymbol = lookupMethodInModule(symTable.langObjectModuleSymbol, name, env);
break;
case TypeTags.STREAM:
bSymbol = lookupMethodInModule(symTable.langStreamModuleSymbol, name, env);
break;
case TypeTags.TABLE:
bSymbol = lookupMethodInModule(symTable.langTableModuleSymbol, name, env);
break;
case TypeTags.STRING:
case TypeTags.CHAR_STRING:
bSymbol = lookupMethodInModule(symTable.langStringModuleSymbol, name, env);
break;
case TypeTags.TYPEDESC:
bSymbol = lookupMethodInModule(symTable.langTypedescModuleSymbol, name, env);
break;
case TypeTags.XML:
case TypeTags.XML_ELEMENT:
case TypeTags.XML_COMMENT:
case TypeTags.XML_PI:
bSymbol = lookupMethodInModule(symTable.langXmlModuleSymbol, name, env);
break;
case TypeTags.XML_TEXT:
bSymbol = lookupMethodInModule(symTable.langXmlModuleSymbol, name, env);
if (bSymbol == symTable.notFoundSymbol) {
bSymbol = lookupMethodInModule(symTable.langStringModuleSymbol, name, env);
}
break;
case TypeTags.BOOLEAN:
bSymbol = lookupMethodInModule(symTable.langBooleanModuleSymbol, name, env);
break;
case TypeTags.UNION:
Iterator<BType> itr = ((BUnionType) referredType).getMemberTypes().iterator();
if (!itr.hasNext()) {
throw new IllegalArgumentException(
format("Union type '%s' does not have member types", type));
}
BType member = Types.getImpliedType(itr.next());
if (TypeTags.isIntegerTypeTag(member.tag) || member.tag == TypeTags.BYTE) {
member = symTable.intType;
} else if (TypeTags.isStringTypeTag(member.tag)) {
member = symTable.stringType;
}
if (types.isSubTypeOfBaseType(type, member.tag)) {
bSymbol = lookupLangLibMethod(member, name, env);
} else {
bSymbol = symTable.notFoundSymbol;
}
break;
case TypeTags.FINITE:
if (types.isAssignable(type, symTable.intType)) {
return lookupLangLibMethod(symTable.intType, name, env);
}
if (types.isAssignable(type, symTable.stringType)) {
return lookupLangLibMethod(symTable.stringType, name, env);
}
if (types.isAssignable(type, symTable.decimalType)) {
return lookupLangLibMethod(symTable.decimalType, name, env);
}
if (types.isAssignable(type, symTable.floatType)) {
return lookupLangLibMethod(symTable.floatType, name, env);
}
if (types.isAssignable(type, symTable.booleanType)) {
return lookupLangLibMethod(symTable.booleanType, name, env);
}
bSymbol = symTable.notFoundSymbol;
break;
case TypeTags.REGEXP:
bSymbol = lookupMethodInModule(symTable.langRegexpModuleSymbol, name, env);
break;
default:
bSymbol = symTable.notFoundSymbol;
}
if (bSymbol == symTable.notFoundSymbol && referredType.tag != TypeTags.OBJECT) {
bSymbol = lookupMethodInModule(symTable.langValueModuleSymbol, name, env);
}
if (bSymbol == symTable.notFoundSymbol) {
bSymbol = lookupMethodInModule(symTable.langInternalModuleSymbol, name, env);
}
return bSymbol;
}
/**
* Recursively analyse the symbol env to find the closure variable symbol that is being resolved.
*
* @param env symbol env to analyse and find the closure variable.
* @param symbol symbol to look up.
* @return the resolved closure variable symbol, or the not-found symbol if it cannot be resolved.
*/
public BSymbol lookupClosureVarSymbol(SymbolEnv env, BSymbol symbol) {
ScopeEntry entry = env.scope.lookup(symbol.name);
while (entry != NOT_FOUND_ENTRY) {
if (entry.symbol == symbol) {
return entry.symbol;
}
entry = entry.next;
}
if (env.enclEnv == null || env.enclEnv.node == null) {
return symTable.notFoundSymbol;
}
return lookupClosureVarSymbol(env.enclEnv, symbol);
}
public BSymbol lookupMainSpaceSymbolInPackage(Location pos,
SymbolEnv env,
Name pkgAlias,
Name name) {
// 1) Look up the current package if the package alias is empty.
if (pkgAlias == Names.EMPTY) {
return lookupSymbolInMainSpace(env, name);
}
// 2) Retrieve the package symbol first
BSymbol pkgSymbol =
resolvePrefixSymbol(env, pkgAlias, Names.fromString(pos.lineRange().fileName()));
if (pkgSymbol == symTable.notFoundSymbol) {
dlog.error(pos, DiagnosticErrorCode.UNDEFINED_MODULE, pkgAlias.value);
return pkgSymbol;
}
// 3) Look up the package scope.
return lookupMemberSymbol(pos, pkgSymbol.scope, env, name, SymTag.MAIN);
}
public BSymbol lookupMainSpaceSymbolInPackage(BSymbol pkgSymbol, Location pos, SymbolEnv env, Name name) {
return lookupMemberSymbol(pos, pkgSymbol.scope, env, name, SymTag.MAIN);
}
public BSymbol lookupPrefixSpaceSymbolInPackage(Location pos,
SymbolEnv env,
Name pkgAlias,
Name name) {
// 1) Look up the current package if the package alias is empty.
if (pkgAlias == Names.EMPTY) {
return lookupSymbolInPrefixSpace(env, name);
}
// 2) Retrieve the package symbol first
BSymbol pkgSymbol =
resolvePrefixSymbol(env, pkgAlias, Names.fromString(pos.lineRange().fileName()));
if (pkgSymbol == symTable.notFoundSymbol) {
dlog.error(pos, DiagnosticErrorCode.UNDEFINED_MODULE, pkgAlias.value);
return pkgSymbol;
}
// 3) Look up the package scope.
return lookupMemberSymbol(pos, pkgSymbol.scope, env, name, SymTag.IMPORT);
}
public BSymbol lookupAnnotationSpaceSymbolInPackage(Location pos,
SymbolEnv env,
Name pkgAlias,
Name name) {
// 1) Look up the current package if the package alias is empty.
if (pkgAlias == Names.EMPTY) {
return lookupSymbolInAnnotationSpace(env, name);
}
// 2) Retrieve the package symbol first
BSymbol pkgSymbol =
resolvePrefixSymbol(env, pkgAlias, Names.fromString(pos.lineRange().fileName()));
if (pkgSymbol == symTable.notFoundSymbol) {
dlog.error(pos, DiagnosticErrorCode.UNDEFINED_MODULE, pkgAlias.value);
return pkgSymbol;
}
// 3) Look up the package scope.
return lookupMemberSymbol(pos, pkgSymbol.scope, env, name, SymTag.ANNOTATION);
}
public BSymbol lookupConstructorSpaceSymbolInPackage(Location pos,
SymbolEnv env,
Name pkgAlias,
Name name) {
// 1) Look up the current package if the package alias is empty.
if (pkgAlias == Names.EMPTY) {
return lookupSymbolInConstructorSpace(env, name);
}
// 2) Retrieve the package symbol first
BSymbol pkgSymbol =
resolvePrefixSymbol(env, pkgAlias, Names.fromString(pos.lineRange().fileName()));
if (pkgSymbol == symTable.notFoundSymbol) {
dlog.error(pos, DiagnosticErrorCode.UNDEFINED_MODULE, pkgAlias.value);
return pkgSymbol;
}
// 3) Look up the package scope.
return lookupMemberSymbol(pos, pkgSymbol.scope, env, name, SymTag.CONSTRUCTOR);
}
public BSymbol lookupMethodInModule(BPackageSymbol moduleSymbol, Name name, SymbolEnv env) {
// Look up the function symbol by name in the module's scope.
ScopeEntry entry = moduleSymbol.scope.lookup(name);
while (entry != NOT_FOUND_ENTRY) {
if ((entry.symbol.tag & SymTag.FUNCTION) != SymTag.FUNCTION) {
entry = entry.next;
continue;
}
if (isMemberAccessAllowed(env, entry.symbol)) {
return entry.symbol;
}
return symTable.notFoundSymbol;
}
return symTable.notFoundSymbol;
}
/**
* Return the symbol with the given name.
* This method only looks at the symbol defined in the given scope.
*
* @param pos diagnostic position
* @param scope current scope
* @param env symbol environment
* @param name symbol name
* @param expSymTag expected symbol type/tag
* @return resolved symbol
*/
public BSymbol lookupMemberSymbol(Location pos,
Scope scope,
SymbolEnv env,
Name name,
long expSymTag) {
ScopeEntry entry = scope.lookup(name);
while (entry != NOT_FOUND_ENTRY) {
if ((entry.symbol.tag & expSymTag) != expSymTag) {
entry = entry.next;
continue;
}
if (isMemberAccessAllowed(env, entry.symbol)) {
return entry.symbol;
} else {
dlog.error(pos, DiagnosticErrorCode.ATTEMPT_REFER_NON_ACCESSIBLE_SYMBOL, entry.symbol.name);
return symTable.notFoundSymbol;
}
}
return symTable.notFoundSymbol;
}
/**
* Resolve and return the namespaces visible to the given environment, as a map.
*
* @param env Environment to get the visible namespaces
* @return Map of namespace symbols visible to the given environment
*/
public Map<Name, BXMLNSSymbol> resolveAllNamespaces(SymbolEnv env) {
Map<Name, BXMLNSSymbol> namespaces = new LinkedHashMap<>();
addNamespacesInScope(namespaces, env);
return namespaces;
}
public void bootstrapErrorType() {
ScopeEntry entry = symTable.rootPkgSymbol.scope.lookup(Names.ERROR);
while (entry != NOT_FOUND_ENTRY) {
if ((entry.symbol.tag & SymTag.TYPE) != SymTag.TYPE) {
entry = entry.next;
continue;
}
symTable.errorType = (BErrorType) Types.getImpliedType(entry.symbol.type);
symTable.detailType = (BMapType) symTable.errorType.detailType;
return;
}
throw new IllegalStateException("built-in error type not found");
}
public void defineOperators() {
symTable.defineOperators();
}
public void bootstrapAnydataType() {
ScopeEntry entry = symTable.langAnnotationModuleSymbol.scope.lookup(Names.ANYDATA);
while (entry != NOT_FOUND_ENTRY) {
if ((entry.symbol.tag & SymTag.TYPE) != SymTag.TYPE) {
entry = entry.next;
continue;
}
BUnionType type = (BUnionType) Types.getImpliedType(entry.symbol.type);
symTable.anydataType = new BAnydataType(type);
Optional<BIntersectionType> immutableType = Types.getImmutableType(symTable, PackageID.ANNOTATIONS, type);
if (immutableType.isPresent()) {
Types.addImmutableType(symTable, PackageID.ANNOTATIONS, symTable.anydataType, immutableType.get());
}
symTable.anydataOrReadonly = BUnionType.create(null, symTable.anydataType, symTable.readonlyType);
entry.symbol.type = symTable.anydataType;
entry.symbol.origin = BUILTIN;
symTable.anydataType.tsymbol = new BTypeSymbol(SymTag.TYPE, Flags.PUBLIC, Names.ANYDATA,
PackageID.ANNOTATIONS, symTable.anydataType, symTable.rootPkgSymbol, symTable.builtinPos, BUILTIN);
return;
}
throw new IllegalStateException("built-in 'anydata' type not found");
}
public void bootstrapJsonType() {
ScopeEntry entry = symTable.langAnnotationModuleSymbol.scope.lookup(Names.JSON);
while (entry != NOT_FOUND_ENTRY) {
if ((entry.symbol.tag & SymTag.TYPE) != SymTag.TYPE) {
entry = entry.next;
continue;
}
BUnionType type = (BUnionType) Types.getImpliedType(entry.symbol.type);
symTable.jsonType = new BJSONType(type);
Optional<BIntersectionType> immutableType = Types.getImmutableType(symTable, PackageID.ANNOTATIONS,
type);
if (immutableType.isPresent()) {
Types.addImmutableType(symTable, PackageID.ANNOTATIONS, symTable.jsonType, immutableType.get());
}
symTable.jsonType.tsymbol = new BTypeSymbol(SymTag.TYPE, Flags.PUBLIC, Names.JSON, PackageID.ANNOTATIONS,
symTable.jsonType, symTable.langAnnotationModuleSymbol, symTable.builtinPos, BUILTIN);
entry.symbol.type = symTable.jsonType;
entry.symbol.origin = BUILTIN;
return;
}
throw new IllegalStateException("built-in 'json' type not found");
}
public void bootstrapCloneableType() {
if (symTable.langValueModuleSymbol != null) {
ScopeEntry entry = symTable.langValueModuleSymbol.scope.lookup(Names.CLONEABLE);
while (entry != NOT_FOUND_ENTRY) {
if ((entry.symbol.tag & SymTag.TYPE) != SymTag.TYPE) {
entry = entry.next;
continue;
}
symTable.cloneableType = (BUnionType) Types.getImpliedType(entry.symbol.type);
symTable.cloneableType.tsymbol =
new BTypeSymbol(SymTag.TYPE, Flags.PUBLIC, Names.CLONEABLE,
PackageID.VALUE, symTable.cloneableType, symTable.langValueModuleSymbol,
symTable.builtinPos, BUILTIN);
symTable.detailType = new BMapType(TypeTags.MAP, symTable.cloneableType, null);
symTable.errorType = new BErrorType(null, symTable.detailType);
symTable.errorType.tsymbol = new BErrorTypeSymbol(SymTag.ERROR, Flags.PUBLIC, Names.ERROR,
symTable.rootPkgSymbol.pkgID, symTable.errorType, symTable.rootPkgSymbol, symTable.builtinPos
, BUILTIN);
symTable.errorOrNilType = BUnionType.create(null, symTable.errorType, symTable.nilType);
symTable.anyOrErrorType = BUnionType.create(null, symTable.anyType, symTable.errorType);
symTable.mapAllType = new BMapType(TypeTags.MAP, symTable.anyOrErrorType, null);
symTable.arrayAllType = new BArrayType(symTable.anyOrErrorType);
symTable.typeDesc.constraint = symTable.anyOrErrorType;
symTable.futureType.constraint = symTable.anyOrErrorType;
symTable.pureType = BUnionType.create(null, symTable.anydataType, symTable.errorType);
return;
}
throw new IllegalStateException("built-in 'lang.value:Cloneable' type not found");
}
ScopeEntry entry = symTable.rootPkgSymbol.scope.lookup(Names.CLONEABLE_INTERNAL);
while (entry != NOT_FOUND_ENTRY) {
if ((entry.symbol.tag & SymTag.TYPE) != SymTag.TYPE) {
entry = entry.next;
continue;
}
entry.symbol.type = symTable.cloneableType;
break;
}
}
public void bootstrapIntRangeType() {
ScopeEntry entry = symTable.langInternalModuleSymbol.scope.lookup(Names.CREATE_INT_RANGE);
while (entry != NOT_FOUND_ENTRY) {
if ((entry.symbol.tag & SymTag.INVOKABLE) != SymTag.INVOKABLE) {
entry = entry.next;
continue;
}
symTable.intRangeType = (BObjectType) Types
.getImpliedType(((BInvokableType) entry.symbol.type).retType);
symTable.defineIntRangeOperations();
return;
}
throw new IllegalStateException("built-in integer range type not found");
}
public void bootstrapIterableType() {
ScopeEntry entry = symTable.langObjectModuleSymbol.scope.lookup(Names.OBJECT_ITERABLE);
while (entry != NOT_FOUND_ENTRY) {
if ((entry.symbol.tag & SymTag.TYPE) != SymTag.TYPE) {
entry = entry.next;
continue;
}
symTable.iterableType = (BObjectType) Types.getImpliedType(entry.symbol.type);
return;
}
throw new IllegalStateException("built-in distinct Iterable type not found");
}
public void loadRawTemplateType() {
ScopeEntry entry = symTable.langObjectModuleSymbol.scope.lookup(Names.RAW_TEMPLATE);
while (entry != NOT_FOUND_ENTRY) {
if ((entry.symbol.tag & SymTag.TYPE) != SymTag.TYPE) {
entry = entry.next;
continue;
}
symTable.rawTemplateType = (BObjectType) entry.symbol.type;
return;
}
throw new IllegalStateException("'lang.object:RawTemplate' type not found");
}
// visit type nodes
@Override
public BType transform(BLangValueType valueTypeNode, AnalyzerData data) {
return visitBuiltInTypeNode(valueTypeNode, data, valueTypeNode.typeKind);
}
@Override
public BType transform(BLangBuiltInRefTypeNode builtInRefType, AnalyzerData data) {
return visitBuiltInTypeNode(builtInRefType, data, builtInRefType.typeKind);
}
@Override
public BType transform(BLangArrayType arrayTypeNode, AnalyzerData data) {
// The value of the dimensions field should always be >= 1.
// If sizes is empty, the array is open (unsealed).
BType resultType = resolveTypeNode(arrayTypeNode.elemtype, data, data.env, data.diagCode);
if (resultType == symTable.noType) {
return resultType;
}
boolean isError = false;
for (int i = 0; i < arrayTypeNode.dimensions; i++) {
BTypeSymbol arrayTypeSymbol = Symbols.createTypeSymbol(SymTag.ARRAY_TYPE, Flags.PUBLIC, Names.EMPTY,
data.env.enclPkg.symbol.pkgID, null, data.env.scope.owner, arrayTypeNode.pos, SOURCE);
BArrayType arrType;
if (arrayTypeNode.sizes.isEmpty()) {
arrType = new BArrayType(resultType, arrayTypeSymbol);
} else {
BLangExpression size = arrayTypeNode.sizes.get(i);
if (size.getKind() == NodeKind.LITERAL || size.getKind() == NodeKind.NUMERIC_LITERAL) {
Integer sizeIndicator = (Integer) (((BLangLiteral) size).getValue());
BArrayState arrayState;
if (sizeIndicator == OPEN_ARRAY_INDICATOR) {
arrayState = BArrayState.OPEN;
} else if (sizeIndicator == INFERRED_ARRAY_INDICATOR) {
arrayState = BArrayState.INFERRED;
} else {
arrayState = BArrayState.CLOSED;
}
arrType = new BArrayType(resultType, arrayTypeSymbol, sizeIndicator, arrayState);
} else {
if (size.getKind() != NodeKind.SIMPLE_VARIABLE_REF) {
dlog.error(size.pos, DiagnosticErrorCode.INCOMPATIBLE_TYPES, symTable.intType,
((BLangTypedescExpr) size).getTypeNode());
isError = true;
continue;
}
BLangSimpleVarRef sizeReference = (BLangSimpleVarRef) size;
Name pkgAlias = names.fromIdNode(sizeReference.pkgAlias);
Name typeName = names.fromIdNode(sizeReference.variableName);
BSymbol sizeSymbol = lookupMainSpaceSymbolInPackage(size.pos, data.env, pkgAlias, typeName);
sizeReference.symbol = sizeSymbol;
if (symTable.notFoundSymbol == sizeSymbol) {
dlog.error(arrayTypeNode.pos, DiagnosticErrorCode.UNDEFINED_SYMBOL, size);
isError = true;
continue;
}
if (sizeSymbol.tag != SymTag.CONSTANT) {
dlog.error(size.pos, DiagnosticErrorCode.INVALID_ARRAY_SIZE_REFERENCE, sizeSymbol);
isError = true;
continue;
}
BConstantSymbol sizeConstSymbol = (BConstantSymbol) sizeSymbol;
BType lengthLiteralType = Types.getImpliedType(sizeConstSymbol.literalType);
if (lengthLiteralType.tag != TypeTags.INT) {
dlog.error(size.pos, DiagnosticErrorCode.INCOMPATIBLE_TYPES, symTable.intType,
sizeConstSymbol.literalType);
isError = true;
continue;
}
int length;
long lengthCheck = Long.parseLong(sizeConstSymbol.type.toString());
if (lengthCheck > MAX_ARRAY_SIZE) {
length = 0;
dlog.error(size.pos,
DiagnosticErrorCode.ARRAY_LENGTH_GREATER_THAT_2147483637_NOT_YET_SUPPORTED);
} else if (lengthCheck < 0) {
length = 0;
dlog.error(size.pos, DiagnosticErrorCode.INVALID_ARRAY_LENGTH);
} else {
length = (int) lengthCheck;
}
arrType = new BArrayType(resultType, arrayTypeSymbol, length, BArrayState.CLOSED);
}
}
arrayTypeSymbol.type = arrType;
resultType = arrayTypeSymbol.type;
markParameterizedType(arrType, arrType.eType);
}
if (isError) {
resultType = symTable.semanticError;
}
return resultType;
}
@Override
public BType transform(BLangUnionTypeNode unionTypeNode, AnalyzerData data) {
LinkedHashSet<BType> memberTypes = new LinkedHashSet<>();
for (BLangType langType : unionTypeNode.memberTypeNodes) {
memberTypes.add(resolveTypeNode(langType, data.env));
}
if (memberTypes.contains(symTable.noType)) {
return symTable.noType;
}
BTypeSymbol unionTypeSymbol = Symbols.createTypeSymbol(SymTag.UNION_TYPE, Flags.asMask(EnumSet.of(Flag.PUBLIC)),
Names.EMPTY, data.env.enclPkg.symbol.pkgID, null,
data.env.scope.owner, unionTypeNode.pos, SOURCE);
BUnionType unionType = BUnionType.create(unionTypeSymbol, memberTypes);
unionTypeSymbol.type = unionType;
markParameterizedType(unionType, memberTypes);
return unionType;
}
@Override
public BType transform(BLangIntersectionTypeNode intersectionTypeNode, AnalyzerData data) {
return computeIntersectionType(intersectionTypeNode, data);
}
@Override
public BType transform(BLangObjectTypeNode objectTypeNode, AnalyzerData data) {
EnumSet<Flag> flags = EnumSet.copyOf(objectTypeNode.flagSet);
if (objectTypeNode.isAnonymous) {
flags.add(Flag.PUBLIC);
}
int typeFlags = 0;
if (flags.contains(Flag.READONLY)) {
typeFlags |= Flags.READONLY;
}
if (flags.contains(Flag.ISOLATED)) {
typeFlags |= Flags.ISOLATED;
}
if (flags.contains(Flag.SERVICE)) {
typeFlags |= Flags.SERVICE;
}
BTypeSymbol objectSymbol = Symbols.createObjectSymbol(Flags.asMask(flags), Names.EMPTY,
data.env.enclPkg.symbol.pkgID, null, data.env.scope.owner, objectTypeNode.pos, SOURCE);
BObjectType objectType = new BObjectType(objectSymbol, typeFlags);
objectSymbol.type = objectType;
objectTypeNode.symbol = objectSymbol;
return objectType;
}
@Override
public BType transform(BLangRecordTypeNode recordTypeNode, AnalyzerData data) {
// If we cannot resolve a type of a type definition, we create a dummy symbol for it. If the type node is
// a record, a symbol will be created for it when we define the dummy symbol (from here). When we define the
// node later, this method will be called again. In such cases, we don't need to create a new symbol here.
if (recordTypeNode.symbol == null) {
EnumSet<Flag> flags = recordTypeNode.isAnonymous ? EnumSet.of(Flag.PUBLIC, Flag.ANONYMOUS)
: EnumSet.noneOf(Flag.class);
BRecordTypeSymbol recordSymbol = Symbols.createRecordSymbol(Flags.asMask(flags), Names.EMPTY,
data.env.enclPkg.symbol.pkgID, null,
data.env.scope.owner, recordTypeNode.pos,
recordTypeNode.isAnonymous ? VIRTUAL : SOURCE);
BRecordType recordType = new BRecordType(recordSymbol);
recordSymbol.type = recordType;
recordTypeNode.symbol = recordSymbol;
if (data.env.node.getKind() != NodeKind.PACKAGE) {
recordSymbol.name = Names.fromString(
anonymousModelHelper.getNextAnonymousTypeKey(data.env.enclPkg.packageID));
symbolEnter.defineSymbol(recordTypeNode.pos, recordTypeNode.symbol, data.env);
symbolEnter.defineNode(recordTypeNode, data.env);
}
return recordType;
}
return recordTypeNode.symbol.type;
}
@Override
public BType transform(BLangStreamType streamTypeNode, AnalyzerData data) {
BType type = resolveTypeNode(streamTypeNode.type, data, data.env);
BType constraintType = resolveTypeNode(streamTypeNode.constraint, data, data.env);
BType error = streamTypeNode.error != null ?
resolveTypeNode(streamTypeNode.error, data, data.env) : symTable.nilType;
// If the constrained type is undefined, return noType as the type.
if (constraintType == symTable.noType) {
return symTable.noType;
}
BType streamType = new BStreamType(TypeTags.STREAM, constraintType, error, null);
BTypeSymbol typeSymbol = type.tsymbol;
streamType.tsymbol = Symbols.createTypeSymbol(typeSymbol.tag, typeSymbol.flags, typeSymbol.name,
typeSymbol.originalName, typeSymbol.pkgID, streamType,
data.env.scope.owner, streamTypeNode.pos, SOURCE);
markParameterizedType(streamType, constraintType);
if (error != null) {
markParameterizedType(streamType, error);
}
return streamType;
}
@Override
public BType transform(BLangTableTypeNode tableTypeNode, AnalyzerData data) {
BType type = resolveTypeNode(tableTypeNode.type, data, data.env);
BType constraintType = resolveTypeNode(tableTypeNode.constraint, data, data.env);
// If the constrained type is undefined, return noType as the type.
if (constraintType == symTable.noType) {
return symTable.noType;
}
BTableType tableType = new BTableType(TypeTags.TABLE, constraintType, null);
BTypeSymbol typeSymbol = type.tsymbol;
tableType.tsymbol = Symbols.createTypeSymbol(SymTag.TYPE, Flags.asMask(EnumSet.noneOf(Flag.class)),
typeSymbol.name, typeSymbol.originalName, typeSymbol.pkgID,
tableType, data.env.scope.owner, tableTypeNode.pos, SOURCE);
tableType.tsymbol.flags = typeSymbol.flags;
tableType.constraintPos = tableTypeNode.constraint.pos;
tableType.isTypeInlineDefined = tableTypeNode.isTypeInlineDefined;
if (tableTypeNode.tableKeyTypeConstraint != null) {
tableType.keyTypeConstraint = resolveTypeNode(tableTypeNode.tableKeyTypeConstraint.keyType, data, data.env);
tableType.keyPos = tableTypeNode.tableKeyTypeConstraint.pos;
} else if (tableTypeNode.tableKeySpecifier != null) {
BLangTableKeySpecifier tableKeySpecifier = tableTypeNode.tableKeySpecifier;
List<String> fieldNameList = new ArrayList<>();
for (IdentifierNode identifier : tableKeySpecifier.fieldNameIdentifierList) {
fieldNameList.add(((BLangIdentifier) identifier).value);
}
tableType.fieldNameList = fieldNameList;
tableType.keyPos = tableKeySpecifier.pos;
}
if (Types.getImpliedType(constraintType).tag == TypeTags.MAP &&
(!tableType.fieldNameList.isEmpty() || tableType.keyTypeConstraint != null) &&
!tableType.tsymbol.owner.getFlags().contains(Flag.LANG_LIB)) {
dlog.error(tableType.keyPos,
DiagnosticErrorCode.KEY_CONSTRAINT_NOT_SUPPORTED_FOR_TABLE_WITH_MAP_CONSTRAINT);
return symTable.semanticError;
}
markParameterizedType(tableType, constraintType);
tableTypeNode.tableType = tableType;
return tableType;
}
@Override
public BType transform(BLangFiniteTypeNode finiteTypeNode, AnalyzerData data) {
BTypeSymbol finiteTypeSymbol = Symbols.createTypeSymbol(SymTag.FINITE_TYPE,
Flags.asMask(EnumSet.of(Flag.PUBLIC)), Names.EMPTY,
data.env.enclPkg.symbol.pkgID, null, data.env.scope.owner,
finiteTypeNode.pos, SOURCE);
// In case we encounter unary expressions in the finite type, we replace them with numeric literals.
// Note: calling semanticAnalyzer from symbolResolver is a temporary fix.
semanticAnalyzer.analyzeNode(finiteTypeNode, data.env);
BFiniteType finiteType = new BFiniteType(finiteTypeSymbol);
for (BLangExpression expressionOrLiteral : finiteTypeNode.valueSpace) {
finiteType.addValue(expressionOrLiteral);
}
finiteTypeSymbol.type = finiteType;
return finiteType;
}
@Override
public BType transform(BLangTupleTypeNode tupleTypeNode, AnalyzerData data) {
List<BLangSimpleVariable> members = new ArrayList<>(tupleTypeNode.members.size());
BTypeSymbol tupleTypeSymbol = Symbols.createTypeSymbol(SymTag.TUPLE_TYPE,
Flags.asMask(EnumSet.of(Flag.PUBLIC)), Names.EMPTY, data.env.enclPkg.symbol.pkgID, null,
data.env.scope.owner, tupleTypeNode.pos, SOURCE);
SymbolEnv tupleEnv = SymbolEnv.createTypeEnv(tupleTypeNode, new Scope(tupleTypeSymbol), data.env);
boolean hasUndefinedMember = false;
for (BLangSimpleVariable member : tupleTypeNode.members) {
BType bType = member.getBType();
if (bType == null) {
symbolEnter.defineNode(member, tupleEnv);
} else if (bType == symTable.noType) {
member.setBType(null);
symbolEnter.defineNode(member, tupleEnv);
}
if (member.getBType() == symTable.noType) {
hasUndefinedMember = true;
}
members.add(member);
}
// If at least one member is undefined, return noType as the type.
if (hasUndefinedMember) {
return symTable.noType;
}
List<BTupleMember> tupleMembers = new ArrayList<>();
members.forEach(member -> tupleMembers.add(new BTupleMember(member.getBType(),
new BVarSymbol(member.getBType().flags, member.symbol.name, member.symbol.pkgID, member.getBType(),
member.symbol.owner, member.pos, SOURCE))));
BTupleType tupleType = new BTupleType(tupleTypeSymbol, tupleMembers);
tupleTypeSymbol.type = tupleType;
if (tupleTypeNode.restParamType != null) {
BType tupleRestType = resolveTypeNode(tupleTypeNode.restParamType, data, data.env);
if (tupleRestType == symTable.noType) {
return symTable.noType;
}
tupleType.restType = tupleRestType;
markParameterizedType(tupleType, tupleType.restType);
}
markParameterizedType(tupleType, tupleType.getTupleTypes());
return tupleType;
}
@Override
public BType transform(BLangErrorType errorTypeNode, AnalyzerData data) {
BType detailType = Optional.ofNullable(errorTypeNode.detailType)
.map(bLangType -> resolveTypeNode(bLangType, data, data.env)).orElse(symTable.detailType);
if (errorTypeNode.isAnonymous) {
errorTypeNode.flagSet.add(Flag.PUBLIC);
errorTypeNode.flagSet.add(Flag.ANONYMOUS);
}
// The builtin error type
BErrorType bErrorType = symTable.errorType;
boolean distinctErrorDef = errorTypeNode.flagSet.contains(Flag.DISTINCT);
if (detailType == symTable.detailType && !distinctErrorDef &&
!data.env.enclPkg.packageID.equals(PackageID.ANNOTATIONS)) {
return bErrorType;
}
// Define the user-defined error type.
BErrorTypeSymbol errorTypeSymbol = Symbols.createErrorSymbol(Flags.asMask(errorTypeNode.flagSet),
Names.EMPTY, data.env.enclPkg.packageID, null, data.env.scope.owner, errorTypeNode.pos, SOURCE);
PackageID packageID = data.env.enclPkg.packageID;
if (data.env.node.getKind() != NodeKind.PACKAGE) {
errorTypeSymbol.name = Names.fromString(
anonymousModelHelper.getNextAnonymousTypeKey(packageID));
symbolEnter.defineSymbol(errorTypeNode.pos, errorTypeSymbol, data.env);
}
BErrorType errorType = new BErrorType(errorTypeSymbol, detailType);
errorType.flags |= errorTypeSymbol.flags;
errorTypeSymbol.type = errorType;
markParameterizedType(errorType, detailType);
errorType.typeIdSet = BTypeIdSet.emptySet();
if (errorTypeNode.isAnonymous && errorTypeNode.flagSet.contains(Flag.DISTINCT)) {
errorType.typeIdSet.add(
BTypeIdSet.from(packageID, anonymousModelHelper.getNextAnonymousTypeId(packageID), true));
}
return errorType;
}
@Override
public BType transform(BLangConstrainedType constrainedTypeNode, AnalyzerData data) {
BType type = resolveTypeNode(constrainedTypeNode.type, data, data.env);
BType constraintType = resolveTypeNode(constrainedTypeNode.constraint, data, data.env);
// If the constrained type is undefined, return noType as the type.
if (constraintType == symTable.noType) {
return symTable.noType;
}
BType constrainedType;
if (type.tag == TypeTags.FUTURE) {
constrainedType = new BFutureType(TypeTags.FUTURE, constraintType, null);
} else if (type.tag == TypeTags.MAP) {
constrainedType = new BMapType(TypeTags.MAP, constraintType, null);
} else if (type.tag == TypeTags.TYPEDESC) {
constrainedType = new BTypedescType(constraintType, null);
} else if (type.tag == TypeTags.XML) {
if (Types.getImpliedType(constraintType).tag == TypeTags.PARAMETERIZED_TYPE) {
BType typedescType = ((BParameterizedType) constraintType).paramSymbol.type;
BType typedescConstraint = ((BTypedescType) typedescType).constraint;
validateXMLConstraintType(typedescConstraint, constrainedTypeNode.pos);
} else {
validateXMLConstraintType(constraintType, constrainedTypeNode.pos);
}
constrainedType = new BXMLType(constraintType, null);
} else {
return symTable.neverType;
}
BTypeSymbol typeSymbol = type.tsymbol;
constrainedType.tsymbol = Symbols.createTypeSymbol(typeSymbol.tag, typeSymbol.flags, typeSymbol.name,
typeSymbol.originalName, typeSymbol.pkgID, constrainedType, typeSymbol.owner,
constrainedTypeNode.pos, SOURCE);
markParameterizedType(constrainedType, constraintType);
return constrainedType;
}
public void validateXMLConstraintType(BType type, Location pos) {
BType constraintType = Types.getImpliedType(type);
int constrainedTag = constraintType.tag;
if (constrainedTag == TypeTags.UNION) {
checkUnionTypeForXMLSubTypes((BUnionType) constraintType, pos);
return;
}
if (!TypeTags.isXMLTypeTag(constrainedTag) && constrainedTag != TypeTags.NEVER) {
dlog.error(pos, DiagnosticErrorCode.INCOMPATIBLE_TYPE_CONSTRAINT, symTable.xmlType, type);
}
}
private void checkUnionTypeForXMLSubTypes(BUnionType constraintUnionType, Location pos) {
for (BType memberType : constraintUnionType.getMemberTypes()) {
memberType = Types.getImpliedType(memberType);
if (memberType.tag == TypeTags.UNION) {
checkUnionTypeForXMLSubTypes((BUnionType) memberType, pos);
// A nested union is validated recursively above; skip the XML tag check for
// the union type itself, since a union tag is never an XML type tag.
continue;
}
if (!TypeTags.isXMLTypeTag(memberType.tag)) {
dlog.error(pos, DiagnosticErrorCode.INCOMPATIBLE_TYPE_CONSTRAINT, symTable.xmlType,
constraintUnionType);
}
}
}
@Override
public BType transform(BLangUserDefinedType userDefinedTypeNode, AnalyzerData data) {
String name = userDefinedTypeNode.typeName.value;
BType type;
// 1) Resolve the package scope using the package alias.
// If the package alias is not empty or null, find that package's scope;
// otherwise use the current package scope.
// 2) Look up the type name in the package scope returned from step 1.
// 3) If the symbol is not found, look it up in the root scope, e.g. for types such as 'error'.
BLangIdentifier pkgAliasIdentifier = userDefinedTypeNode.pkgAlias;
Name pkgAlias = names.fromIdNode(pkgAliasIdentifier);
BLangIdentifier typeNameIdentifier = userDefinedTypeNode.typeName;
Name typeName = names.fromIdNode(typeNameIdentifier);
BSymbol symbol = symTable.notFoundSymbol;
SymbolEnv env = data.env;
// 1) Resolve the ANNOTATION type if and only if the current scope is inside an ANNOTATION definition.
// Only value types and the ANNOTATION type are allowed.
if (env.scope.owner.tag == SymTag.ANNOTATION) {
symbol = lookupAnnotationSpaceSymbolInPackage(userDefinedTypeNode.pos, env, pkgAlias, typeName);
}
// 2) Resolve the package scope using the package alias.
// If the package alias is not empty or null, find that package's scope.
if (symbol == symTable.notFoundSymbol) {
BSymbol tempSymbol = lookupMainSpaceSymbolInPackage(userDefinedTypeNode.pos, env, pkgAlias, typeName);
BSymbol refSymbol = tempSymbol.tag == SymTag.TYPE_DEF ?
Types.getImpliedType(tempSymbol.type).tsymbol : tempSymbol;
// Tsymbol of the effective type can be null for invalid intersections and `xml & readonly` intersections
if (refSymbol == null) {
refSymbol = tempSymbol;
}
NodeKind envNodeKind = data.env.node.getKind();
if ((refSymbol.tag & SymTag.TYPE) == SymTag.TYPE) {
symbol = tempSymbol;
} else if (Symbols.isTagOn(refSymbol, SymTag.VARIABLE) &&
(envNodeKind == NodeKind.FUNCTION || envNodeKind == NodeKind.RESOURCE_FUNC)) {
BLangFunction func = (BLangFunction) data.env.node;
boolean errored = false;
if (func.returnTypeNode == null ||
(func.hasBody() && func.body.getKind() != NodeKind.EXTERN_FUNCTION_BODY)) {
dlog.error(userDefinedTypeNode.pos,
DiagnosticErrorCode.INVALID_NON_EXTERNAL_DEPENDENTLY_TYPED_FUNCTION);
errored = true;
}
if (tempSymbol.type != null &&
Types.getImpliedType(tempSymbol.type).tag != TypeTags.TYPEDESC) {
dlog.error(userDefinedTypeNode.pos, DiagnosticErrorCode.INVALID_PARAM_TYPE_FOR_RETURN_TYPE,
tempSymbol.type);
errored = true;
}
if (errored) {
return symTable.semanticError;
}
ParameterizedTypeInfo parameterizedTypeInfo =
getTypedescParamValueType(func.requiredParams, data, refSymbol);
BType paramValType = parameterizedTypeInfo == null ? null : parameterizedTypeInfo.paramValueType;
if (paramValType == symTable.semanticError) {
return symTable.semanticError;
}
if (paramValType != null) {
BTypeSymbol tSymbol = new BTypeSymbol(SymTag.TYPE, Flags.PARAMETERIZED | tempSymbol.flags,
tempSymbol.name, tempSymbol.originalName, tempSymbol.pkgID,
null, func.symbol, tempSymbol.pos, VIRTUAL);
tSymbol.type = new BParameterizedType(paramValType, (BVarSymbol) tempSymbol,
tSymbol, tempSymbol.name, parameterizedTypeInfo.index);
tSymbol.type.flags |= Flags.PARAMETERIZED;
userDefinedTypeNode.symbol = tSymbol;
return tSymbol.type;
}
} else if (Symbols.isTagOn(tempSymbol, SymTag.VARIABLE) && (env.node.getKind() == NodeKind.FUNCTION_TYPE
|| env.node.getKind() == NodeKind.TUPLE_TYPE_NODE)) {
SymbolEnv symbolEnv = env;
BLangFunction func = null;
BLangFunctionTypeNode funcTypeNode = null;
ParameterizedTypeInfo parameterizedTypeInfo = null;
while ((symbolEnv.node.getKind() == NodeKind.FUNCTION_TYPE ||
symbolEnv.node.getKind() == NodeKind.FUNCTION ||
symbolEnv.node.getKind() == NodeKind.TUPLE_TYPE_NODE) && parameterizedTypeInfo == null) {
if (symbolEnv.node.getKind() == NodeKind.FUNCTION_TYPE) {
funcTypeNode = (BLangFunctionTypeNode) symbolEnv.node;
parameterizedTypeInfo = getTypedescParamValueType(funcTypeNode.params, data, tempSymbol);
} else if (symbolEnv.node.getKind() == NodeKind.FUNCTION) {
func = (BLangFunction) symbolEnv.node;
parameterizedTypeInfo = getTypedescParamValueType(func.requiredParams, data, tempSymbol);
}
symbolEnv = symbolEnv.enclEnv;
}
BType paramValType = parameterizedTypeInfo == null ? null : parameterizedTypeInfo.paramValueType;
if (paramValType == symTable.semanticError) {
return symTable.semanticError;
}
BSymbol bSymbol = null;
if (func != null) {
bSymbol = func.symbol;
} else if (funcTypeNode != null) {
bSymbol = funcTypeNode.getBType().tsymbol;
}
if (paramValType != null) {
BTypeSymbol tSymbol = new BTypeSymbol(SymTag.TYPE, Flags.PARAMETERIZED | tempSymbol.flags,
tempSymbol.name, tempSymbol.pkgID, null, bSymbol,
tempSymbol.pos, VIRTUAL);
tSymbol.type = new BParameterizedType(paramValType, (BVarSymbol) tempSymbol,
tSymbol, tempSymbol.name, parameterizedTypeInfo.index);
tSymbol.type.flags |= Flags.PARAMETERIZED;
userDefinedTypeNode.symbol = tSymbol;
return tSymbol.type;
}
}
}
if (symbol == symTable.notFoundSymbol) {
// 3) Lookup the root scope for types such as 'error'
symbol = lookupMemberSymbol(userDefinedTypeNode.pos, symTable.rootScope, env, typeName,
SymTag.VARIABLE_NAME);
}
if (env.logErrors && symbol == symTable.notFoundSymbol) {
if (!missingNodesHelper.isMissingNode(pkgAlias) && !missingNodesHelper.isMissingNode(typeName) &&
!symbolEnter.isUnknownTypeRef(userDefinedTypeNode)
&& typeResolver.isNotUnknownTypeRef(userDefinedTypeNode)) {
dlog.error(userDefinedTypeNode.pos, data.diagCode, typeName);
}
return symTable.semanticError;
}
userDefinedTypeNode.symbol = symbol;
type = symbol.type;
boolean isCloneableTypeDef = type.tag == TypeTags.UNION && types.isCloneableType((BUnionType) type);
if (symbol.kind == SymbolKind.TYPE_DEF && !Symbols.isFlagOn(symbol.flags, Flags.ANONYMOUS)
&& !isCloneableTypeDef) {
BType referenceType = ((BTypeDefinitionSymbol) symbol).referenceType;
referenceType.flags |= symbol.type.flags;
referenceType.tsymbol.flags |= symbol.type.flags;
return referenceType;
}
if (type.getKind() != TypeKind.OTHER) {
return type;
}
return typeResolver.validateModuleLevelDef(name, pkgAlias, typeName, userDefinedTypeNode);
}
public ParameterizedTypeInfo getTypedescParamValueType(List<BLangSimpleVariable> params,
AnalyzerData data, BSymbol varSym) {
for (int i = 0; i < params.size(); i++) {
BLangSimpleVariable param = params.get(i);
if (param.name.value.equals(varSym.name.value)) {
if (param.expr == null || param.expr.getKind() == NodeKind.INFER_TYPEDESC_EXPR) {
return new ParameterizedTypeInfo(
((BTypedescType) Types.getImpliedType(varSym.type)).constraint, i);
}
NodeKind defaultValueExprKind = param.expr.getKind();
if (defaultValueExprKind == NodeKind.TYPEDESC_EXPRESSION) {
return new ParameterizedTypeInfo(
resolveTypeNode(((BLangTypedescExpr) param.expr).typeNode, data, data.env), i);
}
if (defaultValueExprKind == NodeKind.SIMPLE_VARIABLE_REF) {
Name varName = names.fromIdNode(((BLangSimpleVarRef) param.expr).variableName);
BSymbol typeRefSym = lookupSymbolInMainSpace(data.env, varName);
if (typeRefSym != symTable.notFoundSymbol) {
return new ParameterizedTypeInfo(typeRefSym.type, i);
}
return new ParameterizedTypeInfo(symTable.semanticError);
}
dlog.error(param.pos, DiagnosticErrorCode.INVALID_TYPEDESC_PARAM);
return new ParameterizedTypeInfo(symTable.semanticError);
}
}
return null;
}
@Override
public BType transform(BLangFunctionTypeNode functionTypeNode, AnalyzerData data) {
SymbolEnv env = data.env;
if (functionTypeNode.getBType() == null) {
BInvokableTypeSymbol invokableTypeSymbol;
BInvokableType invokableType;
if (functionTypeNode.flagSet.contains(Flag.ANY_FUNCTION)) {
invokableType = new BInvokableType(null, null, null, null);
invokableType.flags = Flags.asMask(functionTypeNode.flagSet);
invokableTypeSymbol = Symbols.createInvokableTypeSymbol(SymTag.FUNCTION_TYPE,
Flags.asMask(functionTypeNode.flagSet),
env.enclPkg.symbol.pkgID, invokableType,
env.scope.owner, functionTypeNode.pos, VIRTUAL);
invokableTypeSymbol.params = null;
invokableTypeSymbol.restParam = null;
invokableTypeSymbol.returnType = null;
invokableType.tsymbol = invokableTypeSymbol;
} else {
invokableTypeSymbol = Symbols.createInvokableTypeSymbol(SymTag.FUNCTION_TYPE,
Flags.asMask(functionTypeNode.flagSet),
env.enclPkg.symbol.pkgID, functionTypeNode.getBType(),
env.scope.owner, functionTypeNode.pos, VIRTUAL);
invokableType = new BInvokableType(invokableTypeSymbol);
invokableTypeSymbol.type = invokableType;
invokableTypeSymbol.name =
Names.fromString(anonymousModelHelper.getNextAnonymousTypeKey(env.enclPkg.packageID));
symbolEnter.defineSymbol(functionTypeNode.pos, invokableTypeSymbol, env);
if (env.node.getKind() != NodeKind.PACKAGE || !functionTypeNode.inTypeDefinitionContext) {
functionTypeNode.setBType(invokableType);
symbolEnter.defineNode(functionTypeNode, env);
}
}
List<BLangSimpleVariable> params = functionTypeNode.getParams();
Location pos = functionTypeNode.pos;
BLangType returnTypeNode = functionTypeNode.returnTypeNode;
validateInferTypedescParams(pos, params, returnTypeNode == null ? null : invokableType);
return invokableType;
} else {
return functionTypeNode.getBType();
}
}
/**
* Lookup all the visible in-scope symbols for a given environment scope.
*
* @param env Symbol environment
* @return all the visible symbols
*/
public Map<Name, List<ScopeEntry>> getAllVisibleInScopeSymbols(SymbolEnv env) {
Map<Name, List<ScopeEntry>> visibleEntries = new HashMap<>();
env.scope.entries.forEach((key, value) -> {
ArrayList<ScopeEntry> entryList = new ArrayList<>();
entryList.add(value);
visibleEntries.put(key, entryList);
});
if (env.enclEnv != null) {
getAllVisibleInScopeSymbols(env.enclEnv).forEach((name, entryList) -> {
if (!visibleEntries.containsKey(name)) {
visibleEntries.put(name, entryList);
} else {
List<ScopeEntry> scopeEntries = visibleEntries.get(name);
entryList.forEach(scopeEntry -> {
if (!scopeEntries.contains(scopeEntry) && !isModuleLevelVar(scopeEntry.symbol)) {
scopeEntries.add(scopeEntry);
}
});
}
});
}
return visibleEntries;
}
public BSymbol getBinaryEqualityForTypeSets(OperatorKind opKind, BType lhsType, BType rhsType,
BLangBinaryExpr binaryExpr, SymbolEnv env) {
boolean validEqualityIntersectionExists;
switch (opKind) {
case EQUAL:
case NOT_EQUAL:
validEqualityIntersectionExists = types.validEqualityIntersectionExists(lhsType, rhsType);
break;
case REF_EQUAL:
case REF_NOT_EQUAL:
validEqualityIntersectionExists =
types.getTypeIntersection(Types.IntersectionContext.compilerInternalIntersectionTestContext(),
lhsType, rhsType, env) != symTable.semanticError;
break;
default:
return symTable.notFoundSymbol;
}
if (validEqualityIntersectionExists) {
if ((!types.isValueType(lhsType) && !types.isValueType(rhsType)) ||
(types.isValueType(lhsType) && types.isValueType(rhsType))) {
return createEqualityOperator(opKind, lhsType, rhsType);
} else {
types.setImplicitCastExpr(binaryExpr.rhsExpr, rhsType, symTable.anyType);
types.setImplicitCastExpr(binaryExpr.lhsExpr, lhsType, symTable.anyType);
switch (opKind) {
case REF_EQUAL:
// if one is a value type, consider === the same as ==
return createEqualityOperator(OperatorKind.EQUAL, symTable.anyType,
symTable.anyType);
case REF_NOT_EQUAL:
// if one is a value type, consider !== the same as !=
return createEqualityOperator(OperatorKind.NOT_EQUAL, symTable.anyType,
symTable.anyType);
default:
return createEqualityOperator(opKind, symTable.anyType, symTable.anyType);
}
}
}
return symTable.notFoundSymbol;
}
public BSymbol getBitwiseShiftOpsForTypeSets(OperatorKind opKind, BType lhsType, BType rhsType) {
boolean validIntTypesExists;
switch (opKind) {
case BITWISE_LEFT_SHIFT:
case BITWISE_RIGHT_SHIFT:
case BITWISE_UNSIGNED_RIGHT_SHIFT:
validIntTypesExists = types.validIntegerTypeExists(lhsType) && types.validIntegerTypeExists(rhsType);
break;
default:
return symTable.notFoundSymbol;
}
if (validIntTypesExists) {
switch (opKind) {
case BITWISE_LEFT_SHIFT:
return createShiftOperator(opKind, lhsType, rhsType);
case BITWISE_RIGHT_SHIFT:
case BITWISE_UNSIGNED_RIGHT_SHIFT:
switch (Types.getImpliedType(lhsType).tag) {
case TypeTags.UNSIGNED32_INT:
case TypeTags.UNSIGNED16_INT:
case TypeTags.UNSIGNED8_INT:
case TypeTags.BYTE:
return createBinaryOperator(opKind, lhsType, rhsType, lhsType);
default:
return createShiftOperator(opKind, lhsType, rhsType);
}
}
}
return symTable.notFoundSymbol;
}
private BSymbol createShiftOperator(OperatorKind opKind, BType lhsType, BType rhsType) {
if (lhsType.isNullable() || rhsType.isNullable()) {
BType intOptional = BUnionType.create(null, symTable.intType, symTable.nilType);
return createBinaryOperator(opKind, lhsType, rhsType, intOptional);
}
return createBinaryOperator(opKind, lhsType, rhsType, symTable.intType);
}
public BSymbol getArithmeticOpsForTypeSets(OperatorKind opKind, BType lhsType, BType rhsType) {
boolean validNumericOrStringTypeExists;
switch (opKind) {
case ADD:
validNumericOrStringTypeExists = (types.validNumericTypeExists(lhsType) &&
types.validNumericTypeExists(rhsType)) ||
(types.validStringOrXmlTypeExists(lhsType) &&
types.validStringOrXmlTypeExists(rhsType));
break;
case SUB:
case DIV:
case MUL:
case MOD:
validNumericOrStringTypeExists = types.validNumericTypeExists(lhsType) &&
types.validNumericTypeExists(rhsType);
break;
default:
return symTable.notFoundSymbol;
}
if (validNumericOrStringTypeExists) {
BType compatibleType1;
BType compatibleType2;
if (lhsType.isNullable()) {
compatibleType1 = types.findCompatibleType(types.getSafeType(lhsType, true, false));
} else {
compatibleType1 = types.findCompatibleType(lhsType);
}
if (rhsType.isNullable()) {
compatibleType2 = types.findCompatibleType(types.getSafeType(rhsType, true, false));
} else {
compatibleType2 = types.findCompatibleType(rhsType);
}
if (compatibleType1 != compatibleType2 && types.isBasicNumericType(compatibleType1) &&
!isIntFloatingPointMultiplication(opKind, compatibleType1, compatibleType2)) {
return symTable.notFoundSymbol;
}
BType returnType = compatibleType1.tag < compatibleType2.tag ? compatibleType2 : compatibleType1;
if (lhsType.isNullable() || rhsType.isNullable()) {
returnType = BUnionType.create(null, returnType, symTable.nilType);
}
return createBinaryOperator(opKind, lhsType, rhsType, returnType);
}
return symTable.notFoundSymbol;
}
private boolean isIntFloatingPointMultiplication(OperatorKind opKind, BType lhsCompatibleType,
BType rhsCompatibleType) {
switch (opKind) {
case MUL:
return lhsCompatibleType.tag == TypeTags.INT && isFloatingPointType(rhsCompatibleType) ||
rhsCompatibleType.tag == TypeTags.INT && isFloatingPointType(lhsCompatibleType);
case DIV:
case MOD:
return isFloatingPointType(lhsCompatibleType) && rhsCompatibleType.tag == TypeTags.INT;
default:
return false;
}
}
private boolean isFloatingPointType(BType type) {
return type.tag == TypeTags.DECIMAL || type.tag == TypeTags.FLOAT;
}
public BSymbol getUnaryOpsForTypeSets(OperatorKind opKind, BType type) {
boolean validNumericTypeExists;
switch (opKind) {
case ADD:
case SUB:
validNumericTypeExists = types.validNumericTypeExists(type);
break;
default:
return symTable.notFoundSymbol;
}
if (!validNumericTypeExists) {
return symTable.notFoundSymbol;
}
if (opKind == OperatorKind.ADD) {
return createUnaryOperator(opKind, type, type);
}
if (type.isNullable()) {
BType compatibleType = types.findCompatibleType(types.getSafeType(type, true, false));
return createUnaryOperator(opKind, type, BUnionType.create(null, compatibleType, symTable.nilType));
}
return createUnaryOperator(opKind, type, types.findCompatibleType(type));
}
public BSymbol getBinaryBitwiseOpsForTypeSets(OperatorKind opKind, BType lhsType, BType rhsType) {
BType referredLhsType = Types.getImpliedType(lhsType);
BType referredRhsType = Types.getImpliedType(rhsType);
boolean validIntTypesExists;
switch (opKind) {
case BITWISE_AND:
case BITWISE_OR:
case BITWISE_XOR:
validIntTypesExists = types.validIntegerTypeExists(lhsType) && types.validIntegerTypeExists(rhsType);
break;
default:
return symTable.notFoundSymbol;
}
if (validIntTypesExists) {
switch (opKind) {
case BITWISE_AND:
switch (referredLhsType.tag) {
case TypeTags.UNSIGNED8_INT:
case TypeTags.BYTE:
case TypeTags.UNSIGNED16_INT:
case TypeTags.UNSIGNED32_INT:
return createBinaryOperator(opKind, lhsType, rhsType, lhsType);
}
switch (referredRhsType.tag) {
case TypeTags.UNSIGNED8_INT:
case TypeTags.BYTE:
case TypeTags.UNSIGNED16_INT:
case TypeTags.UNSIGNED32_INT:
return createBinaryOperator(opKind, lhsType, rhsType, rhsType);
}
if (referredLhsType.isNullable() || referredRhsType.isNullable()) {
BType intOptional = BUnionType.create(null, symTable.intType, symTable.nilType);
return createBinaryOperator(opKind, lhsType, rhsType, intOptional);
}
return createBinaryOperator(opKind, lhsType, rhsType, symTable.intType);
case BITWISE_OR:
case BITWISE_XOR:
if (referredLhsType.isNullable() || referredRhsType.isNullable()) {
BType intOptional = BUnionType.create(null, symTable.intType, symTable.nilType);
return createBinaryOperator(opKind, lhsType, rhsType, intOptional);
}
return createBinaryOperator(opKind, lhsType, rhsType, symTable.intType);
}
}
return symTable.notFoundSymbol;
}
/**
* Define binary comparison operator for valid ordered types.
*
* @param opKind Binary operator kind
* @param lhsType Type of the left-hand side value
* @param rhsType Type of the right-hand side value
* @return {@literal <, <=, >, or >=} symbol
*/
public BSymbol getBinaryComparisonOpForTypeSets(OperatorKind opKind, BType lhsType, BType rhsType) {
boolean validOrderedTypesExist;
switch (opKind) {
case LESS_THAN:
case LESS_EQUAL:
case GREATER_THAN:
case GREATER_EQUAL:
validOrderedTypesExist = types.isOrderedType(lhsType, false) &&
types.isOrderedType(rhsType, false) && types.isSameOrderedType(lhsType, rhsType);
break;
default:
return symTable.notFoundSymbol;
}
if (validOrderedTypesExist) {
switch (opKind) {
case LESS_THAN:
return createBinaryComparisonOperator(OperatorKind.LESS_THAN, lhsType, rhsType);
case LESS_EQUAL:
return createBinaryComparisonOperator(OperatorKind.LESS_EQUAL, lhsType, rhsType);
case GREATER_THAN:
return createBinaryComparisonOperator(OperatorKind.GREATER_THAN, lhsType, rhsType);
default:
return createBinaryComparisonOperator(OperatorKind.GREATER_EQUAL, lhsType, rhsType);
}
}
return symTable.notFoundSymbol;
}
/**
* Defines {@code ...} or {@code ..<} operator for int subtypes.
*
* @param opKind Binary operator kind
* @param lhsType Type of the left-hand side value
* @param rhsType Type of the right-hand side value
* @return Defined symbol
*/
public BSymbol getRangeOpsForTypeSets(OperatorKind opKind, BType lhsType, BType rhsType) {
if (opKind != OperatorKind.CLOSED_RANGE && opKind != OperatorKind.HALF_OPEN_RANGE) {
return symTable.notFoundSymbol;
}
boolean validIntTypesExists = types.validIntegerTypeExists(lhsType) && types.validIntegerTypeExists(rhsType);
if (!validIntTypesExists) {
return symTable.notFoundSymbol;
}
return createBinaryOperator(opKind, lhsType, rhsType, symTable.intRangeType);
}
public boolean isBinaryShiftOperator(OperatorKind binaryOpKind) {
return binaryOpKind == OperatorKind.BITWISE_LEFT_SHIFT ||
binaryOpKind == OperatorKind.BITWISE_RIGHT_SHIFT ||
binaryOpKind == OperatorKind.BITWISE_UNSIGNED_RIGHT_SHIFT;
}
public boolean isArithmeticOperator(OperatorKind binaryOpKind) {
return binaryOpKind == OperatorKind.ADD || binaryOpKind == OperatorKind.SUB ||
binaryOpKind == OperatorKind.DIV || binaryOpKind == OperatorKind.MUL ||
binaryOpKind == OperatorKind.MOD;
}
public boolean isBinaryComparisonOperator(OperatorKind binaryOpKind) {
return binaryOpKind == OperatorKind.LESS_THAN ||
binaryOpKind == OperatorKind.LESS_EQUAL || binaryOpKind == OperatorKind.GREATER_THAN ||
binaryOpKind == OperatorKind.GREATER_EQUAL;
}
public boolean markParameterizedType(BType type, BType constituentType) {
if (Symbols.isFlagOn(constituentType.flags, Flags.PARAMETERIZED)) {
type.tsymbol.flags |= Flags.PARAMETERIZED;
type.flags |= Flags.PARAMETERIZED;
return true;
}
return false;
}
public void markParameterizedType(BType enclosingType, Collection<BType> constituentTypes) {
if (Symbols.isFlagOn(enclosingType.flags, Flags.PARAMETERIZED)) {
return;
}
for (BType type : constituentTypes) {
if (type == null) {
// This avoids repeating the null check at each call site, since some constituent types
// are expected to be null at times, e.g., the rest param.
continue;
}
if (markParameterizedType(enclosingType, type)) {
break;
}
}
}
    // private methods

    private BSymbol resolveOperator(ScopeEntry entry, List<BType> typeList) {
        BSymbol foundSymbol = symTable.notFoundSymbol;
        while (entry != NOT_FOUND_ENTRY) {
            BInvokableType opType = (BInvokableType) entry.symbol.type;
            if (typeList.size() == opType.paramTypes.size()) {
                boolean match = true;
                for (int i = 0; i < typeList.size(); i++) {
                    BType t = Types.getImpliedType(typeList.get(i));
                    if ((t.getKind() == TypeKind.UNION) && (opType.paramTypes.get(i).getKind() == TypeKind.UNION)) {
                        if (!this.types.isSameType(t, opType.paramTypes.get(i))) {
                            match = false;
                        }
                    } else if (t.tag != opType.paramTypes.get(i).tag) {
                        match = false;
                        break;
                    }
                }
                if (match) {
                    foundSymbol = entry.symbol;
                    break;
                }
            }
            entry = entry.next;
        }
        return foundSymbol;
    }

    public BType visitBuiltInTypeNode(BLangType typeNode, AnalyzerData data, TypeKind typeKind) {
        Name typeName = names.fromTypeKind(typeKind);
        BSymbol typeSymbol = lookupMemberSymbol(typeNode.pos, symTable.rootScope, data.env, typeName, SymTag.TYPE);
        if (typeSymbol == symTable.notFoundSymbol) {
            dlog.error(typeNode.pos, data.diagCode, typeName);
        }
        typeNode.setBType(typeSymbol.type);
        return typeSymbol.type;
    }
    private void addNamespacesInScope(Map<Name, BXMLNSSymbol> namespaces, SymbolEnv env) {
        if (env == null) {
            return;
        }
        env.scope.entries.forEach((name, scopeEntry) -> {
            if (scopeEntry.symbol.kind == SymbolKind.XMLNS) {
                BXMLNSSymbol nsSymbol = (BXMLNSSymbol) scopeEntry.symbol;
                // Skip if the namespace is already added by a child scope. That means it has been overridden.
                if (!namespaces.containsKey(name)) {
                    namespaces.put(name, nsSymbol);
                }
            }
        });
        addNamespacesInScope(namespaces, env.enclEnv);
    }

    private boolean isMemberAccessAllowed(SymbolEnv env, BSymbol symbol) {
        if (Symbols.isPublic(symbol)) {
            return true;
        }
        if (!Symbols.isPrivate(symbol)) {
            return env.enclPkg.symbol.pkgID.equals(symbol.pkgID);
        }
        if (env.enclType != null) {
            return env.enclType.getBType().tsymbol == symbol.owner;
        }
        return isMemberAllowed(env, symbol);
    }

    private boolean isMemberAllowed(SymbolEnv env, BSymbol symbol) {
        return env != null && (env.enclInvokable != null
                && env.enclInvokable.symbol.receiverSymbol != null
                && env.enclInvokable.symbol.receiverSymbol.type.tsymbol == symbol.owner
                || isMemberAllowed(env.enclEnv, symbol));
    }
    private BType computeIntersectionType(BLangIntersectionTypeNode intersectionTypeNode, AnalyzerData data) {
        List<BLangType> constituentTypeNodes = intersectionTypeNode.constituentTypeNodes;
        Map<BType, BLangType> typeBLangTypeMap = new HashMap<>();

        boolean validIntersection = true;
        boolean isErrorIntersection = false;
        boolean isAlreadyExistingType = false;

        BLangType bLangTypeOne = constituentTypeNodes.get(0);
        BType typeOne = resolveTypeNode(bLangTypeOne, data, data.env);
        if (typeOne == symTable.noType) {
            return symTable.noType;
        }
        typeBLangTypeMap.put(typeOne, bLangTypeOne);

        BLangType bLangTypeTwo = constituentTypeNodes.get(1);
        BType typeTwo = resolveTypeNode(bLangTypeTwo, data, data.env);
        if (typeTwo == symTable.noType) {
            return symTable.noType;
        }

        BType typeOneReference = Types.getImpliedType(typeOne);
        BType typeTwoReference = Types.getImpliedType(typeTwo);
        typeBLangTypeMap.put(typeTwo, bLangTypeTwo);

        boolean hasReadOnlyType = typeOneReference == symTable.readonlyType
                || typeTwoReference == symTable.readonlyType;

        if (typeOneReference.tag == TypeTags.ERROR || typeTwoReference.tag == TypeTags.ERROR) {
            isErrorIntersection = true;
        }

        if (!(hasReadOnlyType || isErrorIntersection)) {
            dlog.error(intersectionTypeNode.pos, DiagnosticErrorCode.UNSUPPORTED_TYPE_INTERSECTION);
            for (int i = 2; i < constituentTypeNodes.size(); i++) {
                resolveTypeNode(constituentTypeNodes.get(i), data, data.env);
            }
            return symTable.semanticError;
        }

        BType potentialIntersectionType = getPotentialIntersection(
                Types.IntersectionContext.from(dlog, bLangTypeOne.pos, bLangTypeTwo.pos),
                typeOne, typeTwo, data.env);
        if (typeOne == potentialIntersectionType || typeTwo == potentialIntersectionType) {
            isAlreadyExistingType = true;
        }

        LinkedHashSet<BType> constituentBTypes = new LinkedHashSet<>();
        constituentBTypes.add(typeOne);
        constituentBTypes.add(typeTwo);

        if (potentialIntersectionType == symTable.semanticError) {
            validIntersection = false;
        } else {
            for (int i = 2; i < constituentTypeNodes.size(); i++) {
                BLangType bLangType = constituentTypeNodes.get(i);
                BType type = resolveTypeNode(bLangType, data, data.env);

                if (Types.getImpliedType(type).tag == TypeTags.ERROR) {
                    isErrorIntersection = true;
                }

                typeBLangTypeMap.put(type, bLangType);

                if (!hasReadOnlyType) {
                    hasReadOnlyType = type == symTable.readonlyType;
                }

                if (type == symTable.noType) {
                    return symTable.noType;
                }

                BType tempIntersectionType = getPotentialIntersection(
                        Types.IntersectionContext.from(dlog, bLangTypeOne.pos, bLangTypeTwo.pos),
                        potentialIntersectionType, type, data.env);
                if (tempIntersectionType == symTable.semanticError) {
                    validIntersection = false;
                    break;
                }

                if (type == tempIntersectionType) {
                    potentialIntersectionType = type;
                    isAlreadyExistingType = true;
                } else if (potentialIntersectionType != tempIntersectionType) {
                    potentialIntersectionType = tempIntersectionType;
                    isAlreadyExistingType = false;
                }
                constituentBTypes.add(type);
            }
        }

        if (!validIntersection) {
            dlog.error(intersectionTypeNode.pos, DiagnosticErrorCode.INVALID_INTERSECTION_TYPE, intersectionTypeNode);
            return symTable.semanticError;
        }

        if (isErrorIntersection && !isAlreadyExistingType) {
            BType detailType = ((BErrorType) potentialIntersectionType).detailType;
            boolean existingErrorDetailType = false;
            if (detailType.tsymbol != null) {
                BSymbol detailTypeSymbol = lookupSymbolInMainSpace(data.env, detailType.tsymbol.name);
                if (detailTypeSymbol != symTable.notFoundSymbol) {
                    existingErrorDetailType = true;
                }
            }
            return createIntersectionErrorType((BErrorType) potentialIntersectionType, intersectionTypeNode.pos,
                    constituentBTypes, existingErrorDetailType, data.env);
        }

        if (types.isInherentlyImmutableType(potentialIntersectionType) ||
                (Symbols.isFlagOn(potentialIntersectionType.flags, Flags.READONLY) &&
                        // For objects, a new type has to be created.
                        !types.isSubTypeOfBaseType(potentialIntersectionType, TypeTags.OBJECT))) {
            BTypeSymbol effectiveTypeTSymbol = potentialIntersectionType.tsymbol;
            return createIntersectionType(potentialIntersectionType, constituentBTypes, effectiveTypeTSymbol.pkgID,
                    effectiveTypeTSymbol.owner, intersectionTypeNode.pos);
        }

        PackageID packageID = data.env.enclPkg.packageID;
        if (!types.isSelectivelyImmutableType(potentialIntersectionType, false, packageID)) {
            if (types.isSelectivelyImmutableType(potentialIntersectionType, packageID)) {
                // This intersection would have been valid if not for `readonly object`s.
                dlog.error(intersectionTypeNode.pos, DiagnosticErrorCode.INVALID_READONLY_OBJECT_INTERSECTION_TYPE);
            } else {
                dlog.error(intersectionTypeNode.pos, DiagnosticErrorCode.INVALID_READONLY_INTERSECTION_TYPE,
                        potentialIntersectionType);
            }
            return symTable.semanticError;
        }

        BLangType typeNode = typeBLangTypeMap.get(potentialIntersectionType);
        Set<Flag> flagSet;
        if (typeNode != null &&
                (typeNode.getKind() == NodeKind.OBJECT_TYPE || typeNode.getKind() == NodeKind.USER_DEFINED_TYPE)) {
            flagSet = typeNode.flagSet;
        } else {
            flagSet = new HashSet<>();
        }
        return ImmutableTypeCloner.getImmutableIntersectionType(intersectionTypeNode.pos, types,
                potentialIntersectionType, data.env, symTable, anonymousModelHelper, names, flagSet);
    }
    public BIntersectionType createIntersectionErrorType(BErrorType intersectionErrorType, Location pos,
                                                         LinkedHashSet<BType> constituentBTypes,
                                                         boolean isAlreadyDefinedDetailType, SymbolEnv env) {
        BSymbol owner = intersectionErrorType.tsymbol.owner;
        PackageID pkgId = intersectionErrorType.tsymbol.pkgID;
        SymbolEnv pkgEnv = symTable.pkgEnvMap.get(env.enclPkg.symbol);

        BType detailType = Types.getImpliedType(intersectionErrorType.detailType);
        if (!isAlreadyDefinedDetailType && detailType.tag == TypeTags.RECORD) {
            defineErrorDetailRecord((BRecordType) detailType, pos, pkgEnv);
        }
        return createIntersectionType(intersectionErrorType, constituentBTypes, pkgId, owner, pos);
    }

    private void defineErrorDetailRecord(BRecordType detailRecord, Location pos, SymbolEnv env) {
        BRecordTypeSymbol detailRecordSymbol = (BRecordTypeSymbol) detailRecord.tsymbol;
        for (BField field : detailRecord.fields.values()) {
            BVarSymbol fieldSymbol = field.symbol;
            detailRecordSymbol.scope.define(fieldSymbol.name, fieldSymbol);
        }

        BLangRecordTypeNode detailRecordTypeNode = TypeDefBuilderHelper.createRecordTypeNode(new ArrayList<>(),
                detailRecord, pos);
        BLangTypeDefinition detailRecordTypeDefinition = TypeDefBuilderHelper.createTypeDefinitionForTSymbol(
                detailRecord, detailRecordSymbol, detailRecordTypeNode, env);
        detailRecordTypeDefinition.pos = pos;
    }

    private BIntersectionType createIntersectionType(BType effectiveType, LinkedHashSet<BType> constituentBTypes,
                                                     PackageID pkgId, BSymbol owner, Location pos) {
        BTypeSymbol intersectionTypeSymbol =
                Symbols.createTypeSymbol(SymTag.INTERSECTION_TYPE, 0, Names.EMPTY, pkgId, null, owner, pos, VIRTUAL);
        BIntersectionType intersectionType =
                new BIntersectionType(intersectionTypeSymbol, constituentBTypes, effectiveType, effectiveType.flags);
        intersectionTypeSymbol.type = intersectionType;
        return intersectionType;
    }

    public BType getPotentialIntersection(Types.IntersectionContext intersectionContext,
                                          BType lhsType, BType rhsType, SymbolEnv env) {
        if (Types.getImpliedType(lhsType) == symTable.readonlyType) {
            return rhsType;
        }
        if (Types.getImpliedType(rhsType) == symTable.readonlyType) {
            return lhsType;
        }
        return types.getTypeIntersection(intersectionContext, lhsType, rhsType, env);
    }
    boolean validateInferTypedescParams(Location pos, List<? extends BLangVariable> parameters, BType retType) {
        int inferTypedescParamCount = 0;
        BVarSymbol paramWithInferredTypedescDefault = null;
        Location inferDefaultLocation = null;

        for (BLangVariable parameter : parameters) {
            BType type = Types.getImpliedType(parameter.getBType());
            BLangExpression expr = parameter.expr;
            if (type != null && type.tag == TypeTags.TYPEDESC && expr != null &&
                    expr.getKind() == NodeKind.INFER_TYPEDESC_EXPR) {
                paramWithInferredTypedescDefault = parameter.symbol;
                inferDefaultLocation = expr.pos;
                inferTypedescParamCount++;
            }
        }

        if (inferTypedescParamCount > 1) {
            dlog.error(pos, DiagnosticErrorCode.MULTIPLE_INFER_TYPEDESC_PARAMS);
            return false;
        }

        if (paramWithInferredTypedescDefault == null) {
            return true;
        }

        if (retType == null) {
            dlog.error(inferDefaultLocation,
                    DiagnosticErrorCode.CANNOT_USE_INFERRED_TYPEDESC_DEFAULT_WITH_UNREFERENCED_PARAM);
            return false;
        }

        if (unifier.refersInferableParamName(paramWithInferredTypedescDefault.name.value, retType)) {
            return true;
        }

        dlog.error(inferDefaultLocation,
                DiagnosticErrorCode.CANNOT_USE_INFERRED_TYPEDESC_DEFAULT_WITH_UNREFERENCED_PARAM);
        return false;
    }

    private boolean isModuleLevelVar(BSymbol symbol) {
        return symbol.getKind() == SymbolKind.VARIABLE && symbol.owner.getKind() == SymbolKind.PACKAGE;
    }
    public void populateAnnotationAttachmentSymbol(BLangAnnotationAttachment annotationAttachment, SymbolEnv env,
                                                   ConstantValueResolver constantValueResolver) {
        populateAnnotationAttachmentSymbol(annotationAttachment, env, constantValueResolver, new Stack<>());
    }

    public void populateAnnotationAttachmentSymbol(BLangAnnotationAttachment annotationAttachment, SymbolEnv env,
                                                   ConstantValueResolver constantValueResolver,
                                                   Stack<String> anonTypeNameSuffixes) {
        BAnnotationSymbol annotationSymbol = annotationAttachment.annotationSymbol;
        if (annotationSymbol == null) {
            return;
        }

        if (!Symbols.isFlagOn(annotationSymbol.flags, Flags.CONSTANT)) {
            annotationAttachment.annotationAttachmentSymbol =
                    new BAnnotationAttachmentSymbol(annotationSymbol, env.enclPkg.packageID, env.scope.owner,
                            annotationAttachment.pos, SOURCE, annotationSymbol.attachedType);
            return;
        }

        BLangExpression expr = annotationAttachment.expr;
        BType attachedType = annotationAttachment.annotationSymbol.attachedType;
        if (attachedType == null) {
            attachedType = symTable.trueType;
        } else {
            attachedType = Types.getImpliedType(attachedType);
            attachedType = attachedType.tag == TypeTags.ARRAY ? ((BArrayType) attachedType).eType : attachedType;
        }

        boolean isSourceOnlyAnon = isSourceAnonOnly(annotationSymbol.points);
        BConstantSymbol constantSymbol = new BConstantSymbol(0, Names.EMPTY, Names.EMPTY, env.enclPkg.packageID,
                attachedType, attachedType, env.scope.owner, annotationAttachment.pos, VIRTUAL);
        BLangConstantValue constAnnotationValue;
        if (expr == null) {
            if (types.isAssignable(attachedType, symTable.trueType)) {
                // path_to_url
                constAnnotationValue = new BLangConstantValue(true, symTable.booleanType);
                constantSymbol.value = constAnnotationValue;
            } else {
                // TODO: 2022-03-06 Need to handle defaults coming from records.
                BLangRecordLiteral mappingConstructor = (BLangRecordLiteral) TreeBuilder.createRecordLiteralNode();
                mappingConstructor.setBType(attachedType);
                mappingConstructor.typeChecked = true;
                constAnnotationValue = constantValueResolver.constructBLangConstantValueWithExactType(
                        mappingConstructor, constantSymbol, env);
            }
        } else {
            constAnnotationValue = constantValueResolver.constructBLangConstantValueWithExactType(expr, constantSymbol,
                    env, anonTypeNameSuffixes, isSourceOnlyAnon);
        }

        constantSymbol.type = constAnnotationValue.type;
        annotationAttachment.annotationAttachmentSymbol =
                new BAnnotationAttachmentSymbol.BConstAnnotationAttachmentSymbol(annotationSymbol,
                        env.enclPkg.packageID, env.scope.owner, annotationAttachment.pos, SOURCE, constantSymbol);
    }

    private boolean isSourceAnonOnly(Set<AttachPoint> attachPoints) {
        for (AttachPoint attachPoint : attachPoints) {
            if (!attachPoint.source) {
                return false;
            }
        }
        return true;
    }
    public Set<BVarSymbol> getConfigVarSymbolsIncludingImportedModules(BPackageSymbol packageSymbol) {
        Set<BVarSymbol> configVars = new HashSet<>();
        populateConfigurableVars(packageSymbol, configVars);
        if (!packageSymbol.imports.isEmpty()) {
            for (BPackageSymbol importSymbol : packageSymbol.imports) {
                populateConfigurableVars(importSymbol, configVars);
            }
        }
        return configVars;
    }

    private void populateConfigurableVars(BPackageSymbol pkgSymbol, Set<BVarSymbol> configVars) {
        for (Scope.ScopeEntry entry : pkgSymbol.scope.entries.values()) {
            BSymbol symbol = entry.symbol;
            if (symbol != null) {
                if (symbol.tag == SymTag.TYPE_DEF) {
                    symbol = symbol.type.tsymbol;
                }
                if (symbol.tag == SymTag.VARIABLE && Symbols.isFlagOn(symbol.flags, Flags.CONFIGURABLE)) {
                    configVars.add((BVarSymbol) symbol);
                }
            }
        }
    }

    public BAnnotationSymbol getStrandAnnotationSymbol() {
        return (BAnnotationSymbol) lookupSymbolInAnnotationSpace(
                symTable.pkgEnvMap.get(symTable.rootPkgSymbol), Names.fromString("strand"));
    }

    public BPackageSymbol getModuleForPackageId(PackageID packageID) {
        return symTable.pkgEnvMap.keySet().stream()
                .filter(moduleSymbol -> packageID.equals(moduleSymbol.pkgID))
                .findFirst()
                .get();
    }
    public List<BLangExpression> getListOfInterpolations(List<BLangExpression> sequenceList) {
        List<BLangExpression> interpolationsList = new ArrayList<>();
        for (BLangExpression seq : sequenceList) {
            if (seq.getKind() != NodeKind.REG_EXP_SEQUENCE) {
                continue;
            }
            BLangReSequence sequence = (BLangReSequence) seq;
            for (BLangReTerm term : sequence.termList) {
                if (term.getKind() != NodeKind.REG_EXP_ATOM_QUANTIFIER) {
                    continue;
                }
                BLangExpression atom = ((BLangReAtomQuantifier) term).atom;
                NodeKind kind = atom.getKind();
                if (!isReAtomNode(kind)) {
                    interpolationsList.add(atom);
                    continue;
                }
                if (kind == NodeKind.REG_EXP_CAPTURING_GROUP) {
                    interpolationsList.addAll(
                            getListOfInterpolations(((BLangReCapturingGroups) atom).disjunction.sequenceList));
                }
            }
        }
        return interpolationsList;
    }

    public boolean isReAtomNode(NodeKind kind) {
        switch (kind) {
            case REG_EXP_ATOM_CHAR_ESCAPE:
            case REG_EXP_CHARACTER_CLASS:
            case REG_EXP_CAPTURING_GROUP:
                return true;
            default:
                return false;
        }
    }
    private boolean isDistinctXMLNSSymbol(BXMLNSSymbol symbol, BXMLNSSymbol foundSym) {
        Name foundSymCompUnit = foundSym.compUnit;
        Name symbolCompUnit = symbol.compUnit;
        boolean isFoundSymModuleXmlns = foundSymCompUnit != null;
        boolean isSymModuleXmlns = symbolCompUnit != null;
        if (isFoundSymModuleXmlns && isSymModuleXmlns) {
            return !foundSymCompUnit.value.equals(symbolCompUnit.value);
        }
        // If only one of the symbols has a compUnit, then it is a module-level xmlns and the symbols are distinct.
        // If neither has a compUnit, then it is a redeclared prefix.
        return isFoundSymModuleXmlns || isSymModuleXmlns;
    }

    /**
     * @since 2.0.0
     */
    public static class ParameterizedTypeInfo {
        BType paramValueType;
        int index = -1;

        private ParameterizedTypeInfo(BType paramValueType) {
            this.paramValueType = paramValueType;
        }

        private ParameterizedTypeInfo(BType paramValueType, int index) {
            this.paramValueType = paramValueType;
            this.index = index;
        }
    }

    /**
     * @since 2.0.0
     */
    public static class AnalyzerData {
        SymbolEnv env;
        DiagnosticCode diagCode;

        public AnalyzerData(SymbolEnv env) {
            this.env = env;
        }
    }
}
```
|
```c++
//
// file LICENSE or copy at path_to_url
#include <cds/urcu/signal_buffered.h>
#ifdef CDS_URCU_SIGNAL_HANDLING_ENABLED
#include "test_intrusive_michael_michael_rcu.h"
namespace {
    typedef cds::urcu::signal_buffered<> rcu_implementation;
    typedef cds::urcu::signal_buffered_stripped rcu_implementation_stripped;
} // namespace
INSTANTIATE_TYPED_TEST_CASE_P( RCU_SHB, IntrusiveMichaelSet, rcu_implementation );
INSTANTIATE_TYPED_TEST_CASE_P( RCU_SHB_stripped, IntrusiveMichaelSet, rcu_implementation_stripped );
#endif // #ifdef CDS_URCU_SIGNAL_HANDLING_ENABLED
```
|