Whangaruru is a rural community and harbour on the east coast of Northland, New Zealand. Mokau, Helena Bay, Whakapara, Hikurangi and Whangarei are to the south, and the Bay of Islands is to the northwest. The Whangaruru area includes the villages of Punaruku west of the harbour, Whangaruru north of the harbour, and Whangaruru North east of the harbour.

The area was reputedly named by Puhimoanariki, who was sailing up the coast; after searching for a long time, he found shelter from bad weather there. The name "Whangaruru" is a Māori-language word meaning "sheltered harbour".

Marae

The Ngātiwai hapū of Te Uri o Hikihiki are the indigenous people of Whangaruru. The hapū has several traditional meeting grounds in the Whangaruru and Punaruku area, including the Ngātiwai Marae and Ngāti Wai Soldiers' Memorial Hall, the Ōtetao Reti Marae and Hoori Reti meeting house, and the Tuparehuia Marae. In October 2020, the Government committed $444,239 from the Provincial Growth Fund to upgrade Ōtetao Reti Marae, creating 33 jobs. It also committed $295,095 to upgrade Ngātiwai Marae, creating 5 jobs.

Demographics

Statistics New Zealand describes Ōakura-Whangaruru South as a rural settlement. The settlement covers and had an estimated population of as of with a population density of people per km2. The settlement is part of the larger Whangaruru statistical area.

Ōakura-Whangaruru South had a population of 150 at the 2018 New Zealand census, an increase of 39 people (35.1%) since the 2013 census, and a decrease of 9 people (−5.7%) since the 2006 census. There were 66 households, comprising 81 males and 69 females, giving a sex ratio of 1.17 males per female. The median age was 60.4 years (compared with 37.4 years nationally), with 15 people (10.0%) aged under 15 years, 18 (12.0%) aged 15 to 29, 60 (40.0%) aged 30 to 64, and 57 (38.0%) aged 65 or older.

Ethnicities were 76.0% European/Pākehā, 30.0% Māori, 8.0% Pacific peoples, and 2.0% Asian. People may identify with more than one ethnicity. Although some people chose not to answer the census's question about religious affiliation, 34.0% had no religion, 44.0% were Christian and 4.0% had other religions.

Of those at least 15 years old, 27 (20.0%) people had a bachelor's or higher degree, and 33 (24.4%) people had no formal qualifications. The median income was $21,100, compared with $31,800 nationally. 9 people (6.7%) earned over $70,000, compared to 17.2% nationally. The employment status of those at least 15 was that 33 (24.4%) people were employed full-time, 12 (8.9%) were part-time, and 3 (2.2%) were unemployed.

Whangaruru statistical area

The statistical area of Whangaruru covers and had an estimated population of as of with a population density of people per km2.

Whangaruru statistical area had a population of 2,520 at the 2018 New Zealand census, an increase of 420 people (20.0%) since the 2013 census, and an increase of 453 people (21.9%) since the 2006 census. There were 858 households, comprising 1,323 males and 1,197 females, giving a sex ratio of 1.11 males per female. The median age was 44.1 years (compared with 37.4 years nationally), with 525 people (20.8%) aged under 15 years, 396 (15.7%) aged 15 to 29, 1,170 (46.4%) aged 30 to 64, and 432 (17.1%) aged 65 or older.

Ethnicities were 77.1% European/Pākehā, 35.7% Māori, 3.2% Pacific peoples, 1.1% Asian, and 1.5% other ethnicities. People may identify with more than one ethnicity. The percentage of people born overseas was 10.5%, compared with 27.1% nationally. Although some people chose not to answer the census's question about religious affiliation, 54.6% had no religion, 33.6% were Christian, 1.3% had Māori religious beliefs, 0.1% were Hindu, 0.1% were Muslim, 0.1% were Buddhist and 1.7% had other religions.

Of those at least 15 years old, 267 (13.4%) people had a bachelor's or higher degree, and 426 (21.4%) people had no formal qualifications. The median income was $24,300, compared with $31,800 nationally. 210 people (10.5%) earned over $70,000, compared to 17.2% nationally. The employment status of those at least 15 was that 867 (43.5%) people were employed full-time, 306 (15.3%) were part-time, and 132 (6.6%) were unemployed.

Education

Whangaruru School is a coeducational full primary (years 1-8) school with a roll of students as of. The school was founded in 2005 to replace Punaruku, Ngaiotonga Valley and Helena Bay Schools, and is on the site of the old Punaruku School. Te Kura Hourua ki Whangaruru was a secondary (years 9-13) partnership school, opened in 2014 and closed in 2016.

External links

Whangarei District Council
```go
package client

/*
   path_to_url

   Unless required by applicable law or agreed to in writing, software
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
*/

import (
	"fmt"
	"net/url"

	"github.com/apache/trafficcontrol/v8/lib/go-tc"
	"github.com/apache/trafficcontrol/v8/traffic_ops/toclientlib"
)

const (
	// apiProfiles is the full path to the /profiles API endpoint.
	apiProfiles = "/profiles"
	// apiProfilesNameParameters is the full path to the
	// /profiles/name/{{name}}/parameters API endpoint.
	apiProfilesNameParameters = apiProfiles + "/name/%s/parameters"
)

// CreateProfile creates the passed Profile.
func (to *Session) CreateProfile(pl tc.ProfileV5, opts RequestOptions) (tc.Alerts, toclientlib.ReqInf, error) {
	if pl.CDNID == 0 && pl.CDNName != "" {
		cdnOpts := NewRequestOptions()
		cdnOpts.QueryParameters.Set("name", pl.CDNName)
		cdns, _, err := to.GetCDNs(cdnOpts)
		if err != nil {
			return tc.Alerts{}, toclientlib.ReqInf{}, fmt.Errorf("resolving Profile's CDN Name '%s' to an ID: %w", pl.CDNName, err)
		}
		if len(cdns.Response) == 0 {
			return tc.Alerts{
				Alerts: []tc.Alert{
					{
						Text:  fmt.Sprintf("no CDN with name %s", pl.CDNName),
						Level: "error",
					},
				},
			}, toclientlib.ReqInf{}, fmt.Errorf("no CDN with name %s", pl.CDNName)
		}
		pl.CDNID = cdns.Response[0].ID
	}

	var alerts tc.Alerts
	reqInf, err := to.post(apiProfiles, opts, pl, &alerts)
	return alerts, reqInf, err
}

// UpdateProfile replaces the Profile identified by ID with the one provided.
func (to *Session) UpdateProfile(id int, pl tc.ProfileV5, opts RequestOptions) (tc.Alerts, toclientlib.ReqInf, error) {
	route := fmt.Sprintf("%s/%d", apiProfiles, id)
	var alerts tc.Alerts
	reqInf, err := to.put(route, opts, pl, &alerts)
	return alerts, reqInf, err
}

// GetParametersByProfileName returns all of the Parameters that are assigned
// to the Profile with the given Name.
func (to *Session) GetParametersByProfileName(profileName string, opts RequestOptions) (tc.ParametersResponseV5, toclientlib.ReqInf, error) {
	route := fmt.Sprintf(apiProfilesNameParameters, profileName)
	var data tc.ParametersResponseV5
	reqInf, err := to.get(route, opts, &data)
	return data, reqInf, err
}

// GetProfiles returns all Profiles stored in Traffic Ops.
func (to *Session) GetProfiles(opts RequestOptions) (tc.ProfilesResponseV5, toclientlib.ReqInf, error) {
	var data tc.ProfilesResponseV5
	reqInf, err := to.get(apiProfiles, opts, &data)
	return data, reqInf, err
}

// DeleteProfile deletes the Profile with the given ID.
func (to *Session) DeleteProfile(id int, opts RequestOptions) (tc.Alerts, toclientlib.ReqInf, error) {
	URI := fmt.Sprintf("%s/%d", apiProfiles, id)
	var alerts tc.Alerts
	reqInf, err := to.del(URI, opts, &alerts)
	return alerts, reqInf, err
}

// ExportProfile returns an exported Profile.
func (to *Session) ExportProfile(id int, opts RequestOptions) (tc.ProfileExportResponse, toclientlib.ReqInf, error) {
	route := fmt.Sprintf("%s/%d/export", apiProfiles, id)
	var data tc.ProfileExportResponse
	reqInf, err := to.get(route, opts, &data)
	return data, reqInf, err
}

// ImportProfile imports an exported Profile.
func (to *Session) ImportProfile(importRequest tc.ProfileImportRequest, opts RequestOptions) (tc.ProfileImportResponse, toclientlib.ReqInf, error) {
	route := fmt.Sprintf("%s/import", apiProfiles)
	var data tc.ProfileImportResponse
	reqInf, err := to.post(route, opts, importRequest, &data)
	return data, reqInf, err
}

// CopyProfile creates a new profile from an existing profile.
func (to *Session) CopyProfile(p tc.ProfileCopy, opts RequestOptions) (tc.ProfileCopyResponse, toclientlib.ReqInf, error) {
	path := fmt.Sprintf("%s/name/%s/copy/%s", apiProfiles, url.PathEscape(p.Name), url.PathEscape(p.ExistingName))
	var resp tc.ProfileCopyResponse
	reqInf, err := to.post(path, opts, p, &resp)
	return resp, reqInf, err
}
```
The Rieti Valley or Rieti Plain ( or Conca Reatina) is a small plain in central Italy, in which the city of Rieti, Lazio, lies. It is also known as the Sacred Valley or Holy Valley (), since Saint Francis of Assisi lived here for many years and erected four shrines, which have become a destination for pilgrims. It is the centre of the Sabine region and an important part of the province of Rieti. Formed by the draining of the ancient Lake Velino, it is crossed by the Velino river and bordered by the Monti Reatini and Monti Sabini.

Origin

In prehistory, the Rieti Valley was entirely occupied by a large lake which the ancient Romans called Lake Velinus, after its tributary, the Velino river. The lake formed during the Quaternary, when limestone carried by the river's water was deposited in the narrow canyon through which it flowed, shortly before its confluence with the Nera river near the present-day village of Marmore. As a result, the riverbed was blocked and the Rieti Plain flooded, becoming a lake.

The water level rose and fell several times over the centuries, favouring the formation of wide marshy zones around the lake where malaria made living unhealthy. For this reason, in 271 BC (after Rome had defeated the Sabines and taken control of the area), the consul Manius Curius Dentatus decided to drain the lake by digging an artificial canal through the limestone rock at Marmore. This imposing engineering achievement created the Cascata delle Marmore, a tall waterfall which allowed the Velino river to flow into the Nera, and turned the lakebed into a large and fertile valley to be farmed. Of the original great lake only some minor lakes remain, the largest being Lago di Piediluco. After the fall of the Roman Empire, lack of maintenance caused the canal to become obstructed again, and in the Middle Ages the lake partially reformed.

New drainage works were ordered in 1545 by Pope Paul III; Antonio da Sangallo the Younger was charged with digging a new canal, but died of malaria in 1546 before the works were completed. Only in 1596 did Pope Clement VIII order new interventions; Giovanni Fontana completed the new canal, ultimately draining the valley. Even after the lake was drained, recurring floods of the Velino still caused problems for farmers, damaging their fields. The problem was solved in the Fascist era, when two large dams were built along the two main tributaries of the Velino (the Salto and Turano rivers) to control their flow. As a result, the large artificial lakes Salto and Turano were formed (around 20 km southeast of the Rieti plain).

Geography

The Rieti Plain has a semi-circular shape and covers around , ranging from 370 to 380 metres above sea level; it is 14 km long and averages 7 km in width. It is bordered all round by mountains: the Monti Sabini to the west and south and the Monti Reatini to the east (the highest peak being Monte Terminillo, a popular skiing resort, high). Two minor lakes on the plain, Lago Lungo and Lago di Ripasottile, are the remains of the ancient Lake Velinus. These small wetlands have preserved conditions similar to those before the draining of the valley, and are a resting area for many migrating bird species; for this reason the area is now a nature reserve.

Agriculture

The Rieti Valley has always been known for its fertility, and was sometimes nicknamed "the granary of Rome". Virgil wrote that if a stick was planted in a field, it could no longer be seen the day after, owing to the grass that had grown around it.

In the 19th century, wheat native to the Rieti Valley was famous throughout Italy for being highly productive and disease-resistant; the agronomist Nazareno Strampelli used it as the starting point for his experiments, which led to wheat varieties that became popular all over the world in the mid-20th century. Other crops in the past included woad and sugar beet (refined at the Rieti sugar mill). Today the most important crops are corn, sunflowers and vegetables.

Tourism and pilgrimages

In the course of his life, Saint Francis of Assisi repeatedly visited the Rieti Valley: probably for the first time in 1209, then for a long stay in 1223, and again from the autumn of 1225 to April 1226. While in the valley, Francis staged the first nativity scene, wrote the final version of the Franciscan Rule and probably also the Canticle of the Sun; most importantly, he founded the four shrines that stand at the four edges of the plain: the Sanctuary of Greccio, La Foresta, Poggio Bustone and Fonte Colombo. Saint Francis's stays coincided with a period in which Rieti enjoyed economic prosperity and often served as a papal seat, from Innocent III in 1198 to Boniface VIII in 1298. Today, the Franciscan sanctuaries have become objects of pilgrimage; tourists and pilgrims walk a path known as the Cammino di Francesco, which links the shrines and other landmarks such as Rieti's mediaeval city centre, the Abbey of Saint Pastor and the Lungo and Ripasottile lakes nature reserve.
```go
// Use of this source code is governed by a BSD-style
// license that can be found in the LICENSE file.

package ssh

import (
	"io"
	"io/ioutil"
	"sync"
	"testing"
)

func muxPair() (*mux, *mux) {
	a, b := memPipe()

	s := newMux(a)
	c := newMux(b)

	return s, c
}

// Returns both ends of a channel, and the mux for the 2nd channel.
func channelPair(t *testing.T) (*channel, *channel, *mux) {
	c, s := muxPair()

	res := make(chan *channel, 1)
	go func() {
		newCh, ok := <-s.incomingChannels
		if !ok {
			t.Fatalf("No incoming channel")
		}
		if newCh.ChannelType() != "chan" {
			t.Fatalf("got type %q want chan", newCh.ChannelType())
		}
		ch, _, err := newCh.Accept()
		if err != nil {
			t.Fatalf("Accept %v", err)
		}
		res <- ch.(*channel)
	}()

	ch, err := c.openChannel("chan", nil)
	if err != nil {
		t.Fatalf("OpenChannel: %v", err)
	}

	return <-res, ch, c
}

// Test that stderr and stdout can be addressed from different
// goroutines. This is intended for use with the race detector.
func TestMuxChannelExtendedThreadSafety(t *testing.T) {
	writer, reader, mux := channelPair(t)
	defer writer.Close()
	defer reader.Close()
	defer mux.Close()

	var wr, rd sync.WaitGroup
	magic := "hello world"

	wr.Add(2)
	go func() {
		io.WriteString(writer, magic)
		wr.Done()
	}()
	go func() {
		io.WriteString(writer.Stderr(), magic)
		wr.Done()
	}()

	rd.Add(2)
	go func() {
		c, err := ioutil.ReadAll(reader)
		if string(c) != magic {
			t.Fatalf("stdout read got %q, want %q (error %s)", c, magic, err)
		}
		rd.Done()
	}()
	go func() {
		c, err := ioutil.ReadAll(reader.Stderr())
		if string(c) != magic {
			t.Fatalf("stderr read got %q, want %q (error %s)", c, magic, err)
		}
		rd.Done()
	}()

	wr.Wait()
	writer.CloseWrite()
	rd.Wait()
}

func TestMuxReadWrite(t *testing.T) {
	s, c, mux := channelPair(t)
	defer s.Close()
	defer c.Close()
	defer mux.Close()

	magic := "hello world"
	magicExt := "hello stderr"
	go func() {
		_, err := s.Write([]byte(magic))
		if err != nil {
			t.Fatalf("Write: %v", err)
		}
		_, err = s.Extended(1).Write([]byte(magicExt))
		if err != nil {
			t.Fatalf("Write: %v", err)
		}
		err = s.Close()
		if err != nil {
			t.Fatalf("Close: %v", err)
		}
	}()

	var buf [1024]byte
	n, err := c.Read(buf[:])
	if err != nil {
		t.Fatalf("server Read: %v", err)
	}
	got := string(buf[:n])
	if got != magic {
		t.Fatalf("server: got %q want %q", got, magic)
	}

	n, err = c.Extended(1).Read(buf[:])
	if err != nil {
		t.Fatalf("server Read: %v", err)
	}

	got = string(buf[:n])
	if got != magicExt {
		t.Fatalf("server: got %q want %q", got, magicExt)
	}
}

func TestMuxChannelOverflow(t *testing.T) {
	reader, writer, mux := channelPair(t)
	defer reader.Close()
	defer writer.Close()
	defer mux.Close()

	wDone := make(chan int, 1)
	go func() {
		if _, err := writer.Write(make([]byte, channelWindowSize)); err != nil {
			t.Errorf("could not fill window: %v", err)
		}
		writer.Write(make([]byte, 1))
		wDone <- 1
	}()
	writer.remoteWin.waitWriterBlocked()

	// Send 1 byte.
	packet := make([]byte, 1+4+4+1)
	packet[0] = msgChannelData
	marshalUint32(packet[1:], writer.remoteId)
	marshalUint32(packet[5:], uint32(1))
	packet[9] = 42

	if err := writer.mux.conn.writePacket(packet); err != nil {
		t.Errorf("could not send packet")
	}
	if _, err := reader.SendRequest("hello", true, nil); err == nil {
		t.Errorf("SendRequest succeeded.")
	}
	<-wDone
}

func TestMuxChannelCloseWriteUnblock(t *testing.T) {
	reader, writer, mux := channelPair(t)
	defer reader.Close()
	defer writer.Close()
	defer mux.Close()

	wDone := make(chan int, 1)
	go func() {
		if _, err := writer.Write(make([]byte, channelWindowSize)); err != nil {
			t.Errorf("could not fill window: %v", err)
		}
		if _, err := writer.Write(make([]byte, 1)); err != io.EOF {
			t.Errorf("got %v, want EOF for unblock write", err)
		}
		wDone <- 1
	}()

	writer.remoteWin.waitWriterBlocked()
	reader.Close()
	<-wDone
}

func TestMuxConnectionCloseWriteUnblock(t *testing.T) {
	reader, writer, mux := channelPair(t)
	defer reader.Close()
	defer writer.Close()
	defer mux.Close()

	wDone := make(chan int, 1)
	go func() {
		if _, err := writer.Write(make([]byte, channelWindowSize)); err != nil {
			t.Errorf("could not fill window: %v", err)
		}
		if _, err := writer.Write(make([]byte, 1)); err != io.EOF {
			t.Errorf("got %v, want EOF for unblock write", err)
		}
		wDone <- 1
	}()

	writer.remoteWin.waitWriterBlocked()
	mux.Close()
	<-wDone
}

func TestMuxReject(t *testing.T) {
	client, server := muxPair()
	defer server.Close()
	defer client.Close()

	go func() {
		ch, ok := <-server.incomingChannels
		if !ok {
			t.Fatalf("Accept")
		}
		if ch.ChannelType() != "ch" || string(ch.ExtraData()) != "extra" {
			t.Fatalf("unexpected channel: %q, %q", ch.ChannelType(), ch.ExtraData())
		}
		ch.Reject(RejectionReason(42), "message")
	}()

	ch, err := client.openChannel("ch", []byte("extra"))
	if ch != nil {
		t.Fatal("openChannel not rejected")
	}

	ocf, ok := err.(*OpenChannelError)
	if !ok {
		t.Errorf("got %#v want *OpenChannelError", err)
	} else if ocf.Reason != 42 || ocf.Message != "message" {
		t.Errorf("got %#v, want {Reason: 42, Message: %q}", ocf, "message")
	}

	want := "ssh: rejected: unknown reason 42 (message)"
	if err.Error() != want {
		t.Errorf("got %q, want %q", err.Error(), want)
	}
}

func TestMuxChannelRequest(t *testing.T) {
	client, server, mux := channelPair(t)
	defer server.Close()
	defer client.Close()
	defer mux.Close()

	var received int
	var wg sync.WaitGroup
	wg.Add(1)
	go func() {
		for r := range server.incomingRequests {
			received++
			r.Reply(r.Type == "yes", nil)
		}
		wg.Done()
	}()

	_, err := client.SendRequest("yes", false, nil)
	if err != nil {
		t.Fatalf("SendRequest: %v", err)
	}
	ok, err := client.SendRequest("yes", true, nil)
	if err != nil {
		t.Fatalf("SendRequest: %v", err)
	}
	if !ok {
		t.Errorf("SendRequest(yes): %v", ok)
	}

	ok, err = client.SendRequest("no", true, nil)
	if err != nil {
		t.Fatalf("SendRequest: %v", err)
	}
	if ok {
		t.Errorf("SendRequest(no): %v", ok)
	}

	client.Close()
	wg.Wait()

	if received != 3 {
		t.Errorf("got %d requests, want %d", received, 3)
	}
}

func TestMuxGlobalRequest(t *testing.T) {
	clientMux, serverMux := muxPair()
	defer serverMux.Close()
	defer clientMux.Close()

	var seen bool
	go func() {
		for r := range serverMux.incomingRequests {
			seen = seen || r.Type == "peek"
			if r.WantReply {
				err := r.Reply(r.Type == "yes", append([]byte(r.Type), r.Payload...))
				if err != nil {
					t.Errorf("AckRequest: %v", err)
				}
			}
		}
	}()

	_, _, err := clientMux.SendRequest("peek", false, nil)
	if err != nil {
		t.Errorf("SendRequest: %v", err)
	}

	ok, data, err := clientMux.SendRequest("yes", true, []byte("a"))
	if !ok || string(data) != "yesa" || err != nil {
		t.Errorf("SendRequest(\"yes\", true, \"a\"): %v %v %v", ok, data, err)
	}
	if ok, data, err := clientMux.SendRequest("yes", true, []byte("a")); !ok || string(data) != "yesa" || err != nil {
		t.Errorf("SendRequest(\"yes\", true, \"a\"): %v %v %v", ok, data, err)
	}
	if ok, data, err := clientMux.SendRequest("no", true, []byte("a")); ok || string(data) != "noa" || err != nil {
		t.Errorf("SendRequest(\"no\", true, \"a\"): %v %v %v", ok, data, err)
	}

	clientMux.Disconnect(0, "")
	if !seen {
		t.Errorf("never saw 'peek' request")
	}
}

func TestMuxGlobalRequestUnblock(t *testing.T) {
	clientMux, serverMux := muxPair()
	defer serverMux.Close()
	defer clientMux.Close()

	result := make(chan error, 1)
	go func() {
		_, _, err := clientMux.SendRequest("hello", true, nil)
		result <- err
	}()

	<-serverMux.incomingRequests
	serverMux.conn.Close()
	err := <-result

	if err != io.EOF {
		t.Errorf("want EOF, got %v", err)
	}
}

func TestMuxChannelRequestUnblock(t *testing.T) {
	a, b, connB := channelPair(t)
	defer a.Close()
	defer b.Close()
	defer connB.Close()

	result := make(chan error, 1)
	go func() {
		_, err := a.SendRequest("hello", true, nil)
		result <- err
	}()

	<-b.incomingRequests
	connB.conn.Close()
	err := <-result

	if err != io.EOF {
		t.Errorf("want EOF, got %v", err)
	}
}

func TestMuxDisconnect(t *testing.T) {
	a, b := muxPair()
	defer a.Close()
	defer b.Close()

	go func() {
		for r := range b.incomingRequests {
			r.Reply(true, nil)
		}
	}()

	a.Disconnect(42, "whatever")
	ok, _, err := a.SendRequest("hello", true, nil)
	if ok || err == nil {
		t.Errorf("got reply after disconnecting")
	}
	err = b.Wait()
	if d, ok := err.(*disconnectMsg); !ok || d.Reason != 42 {
		t.Errorf("got %#v, want disconnectMsg{Reason:42}", err)
	}
}

func TestMuxCloseChannel(t *testing.T) {
	r, w, mux := channelPair(t)
	defer mux.Close()
	defer r.Close()
	defer w.Close()

	result := make(chan error, 1)
	go func() {
		var b [1024]byte
		_, err := r.Read(b[:])
		result <- err
	}()
	if err := w.Close(); err != nil {
		t.Errorf("w.Close: %v", err)
	}

	if _, err := w.Write([]byte("hello")); err != io.EOF {
		t.Errorf("got err %v, want io.EOF after Close", err)
	}

	if err := <-result; err != io.EOF {
		t.Errorf("got %v (%T), want io.EOF", err, err)
	}
}

func TestMuxCloseWriteChannel(t *testing.T) {
	r, w, mux := channelPair(t)
	defer mux.Close()

	result := make(chan error, 1)
	go func() {
		var b [1024]byte
		_, err := r.Read(b[:])
		result <- err
	}()
	if err := w.CloseWrite(); err != nil {
		t.Errorf("w.CloseWrite: %v", err)
	}

	if _, err := w.Write([]byte("hello")); err != io.EOF {
		t.Errorf("got err %v, want io.EOF after CloseWrite", err)
	}

	if err := <-result; err != io.EOF {
		t.Errorf("got %v (%T), want io.EOF", err, err)
	}
}

func TestMuxInvalidRecord(t *testing.T) {
	a, b := muxPair()
	defer a.Close()
	defer b.Close()

	packet := make([]byte, 1+4+4+1)
	packet[0] = msgChannelData
	marshalUint32(packet[1:], 29348723 /* invalid channel id */)
	marshalUint32(packet[5:], 1)
	packet[9] = 42

	a.conn.writePacket(packet)
	go a.SendRequest("hello", false, nil)

	// 'a' wrote an invalid packet, so 'b' has exited.
	req, ok := <-b.incomingRequests
	if ok {
		t.Errorf("got request %#v after receiving invalid packet", req)
	}
}

func TestZeroWindowAdjust(t *testing.T) {
	a, b, mux := channelPair(t)
	defer a.Close()
	defer b.Close()
	defer mux.Close()

	go func() {
		io.WriteString(a, "hello")
		// bogus adjust.
		a.sendMessage(windowAdjustMsg{})
		io.WriteString(a, "world")
		a.Close()
	}()

	want := "helloworld"
	c, _ := ioutil.ReadAll(b)
	if string(c) != want {
		t.Errorf("got %q want %q", c, want)
	}
}

func TestMuxMaxPacketSize(t *testing.T) {
	a, b, mux := channelPair(t)
	defer a.Close()
	defer b.Close()
	defer mux.Close()

	large := make([]byte, a.maxRemotePayload+1)
	packet := make([]byte, 1+4+4+1+len(large))
	packet[0] = msgChannelData
	marshalUint32(packet[1:], a.remoteId)
	marshalUint32(packet[5:], uint32(len(large)))
	packet[9] = 42

	if err := a.mux.conn.writePacket(packet); err != nil {
		t.Errorf("could not send packet")
	}

	go a.SendRequest("hello", false, nil)

	_, ok := <-b.incomingRequests
	if ok {
		t.Errorf("connection still alive after receiving large packet.")
	}
}

// Don't ship code with debug=true.
func TestDebug(t *testing.T) {
	if debugMux {
		t.Error("mux debug switched on")
	}
	if debugHandshake {
		t.Error("handshake debug switched on")
	}
}
```
```c++
/*<-
(See accompanying file LICENSE.md or copy at path_to_url)
->*/

//[ is_cv_member
#include <type_traits>
#include <boost/callable_traits/is_cv_member.hpp>

namespace ct = boost::callable_traits;

struct foo;

static_assert(ct::is_cv_member<int(foo::*)() const volatile>::value, "");
static_assert(!ct::is_cv_member<int(foo::*)()>::value, "");
static_assert(!ct::is_cv_member<int(foo::*)() const>::value, "");
static_assert(!ct::is_cv_member<int(foo::*)() volatile>::value, "");

int main() {}
//]
```
```javascript // @flow import fetch from './fetchClassic'; import pipelineStore from './PipelineStore'; import pipelineMetadataService from './PipelineMetadataService'; import type { PipelineInfo, StageInfo, StepInfo } from './PipelineStore'; import { convertInternalModelToJson } from './PipelineSyntaxConverter'; import { isObservableArray } from 'mobx'; import debounce from 'lodash.debounce'; const validationTimeout = 500; export function isValidEnvironmentKey(key: string): boolean { if (!key) { return false; } if (/^[_$a-zA-Z\xA0-\uFFFF][_$a-zA-Z0-9\xA0-\uFFFF]*$/.test(key)) { return true; } return false; } function _isArray(o) { return o instanceof Array || typeof o === 'array' || isObservableArray(o); } function _hasValidationErrors(node) { return node && node.validationErrors && node.validationErrors.length; } function _appendValidationError(node, message) { if (_hasValidationErrors(node)) { node.validationErrors.push(message); } else { node.validationErrors = [message]; } } function _validateAgentEntries(metadata, agent) { if (!agent || agent.type === 'none' || agent.type === 'any') { return; } let meta; for (const m of metadata.agentMetadata) { if (agent.type === m.symbol) { meta = m; break; } } if (!meta) { _appendValidationError(meta, 'Unknown agent type: ' + agent.type); return; } meta.parameters.map(param => { const arg = agent.arguments.filter(arg => arg.key === param.name)[0]; if (param.isRequired && (!arg || !arg.value || !arg.value.value)) { _appendValidationError(agent, param.name + ' is required'); } }); } function _validateEnvironmentEntries(metadata, entries) { for (const entry of entries) { if (!_hasValidationErrors(entry) && !isValidEnvironmentKey(entry.key)) { _appendValidationError(entry, 'Environment Name is not valid. 
Please ensure it is a valid Pipeline identifier.'); } } } function _validateStepValues(metadata, steps) { for (const step of steps) { const meta = metadata.stepMetadata.find(step); if (meta && meta.isRequired && !step.validationErrors && !step.value.value) { _appendValidationError(entry, 'Required step value'); } if (step.children) { _validateStepValues(metadata, step.children); } } } function _addClientSideErrors(metadata, node) { const parent = pipelineStore.findParentStage(node); if (node.agent) { _validateAgentEntries(metadata, node.agent); } if (node.environment) { _validateEnvironmentEntries(metadata, node.environment); } if (node.steps) { _validateStepValues(metadata, node.steps); } if (!node.children || !node.children.length) { if (parent && (!node.steps || !node.steps.length)) { // For this one particular error, just replace it //node.validationErrors = []; //node.steps.validationErrors = [ 'At least one step is required' ]; node.validationErrors = ['At least one step is required']; } if (node === pipelineStore.pipeline) { // override default message node.validationErrors = ['A stage is required']; } } else { node.children.map(child => _addClientSideErrors(metadata, child)); } } export class PipelineValidator { lastPipelineValidated: string; validatePipeline(pipeline: PipelineInfo, handler: ValidationResult) { const json = convertInternalModelToJson(pipeline); fetch( '/pipeline-model-converter/validateJson', 'json=' + encodeURIComponent(JSON.stringify(json)), data => { if (!data.result && data.errors) { if (window.isDevelopmentMode) console.error(data); } handler(data); }, { disableLoadingIndicator: true } ); } /** * Indicates this node or any child node has a validation error */ hasValidationErrors(node: Object, visited: any[] = []): boolean { if (visited.indexOf(node) >= 0) { return false; } visited.push(node); if (_hasValidationErrors(node)) { return true; } // if this is a parallel, check the parent stage for errors const parent = 
pipelineStore.findParentStage(node); if (parent && pipelineStore.pipeline !== parent && parent.validationErrors) { return true; } for (const key of Object.keys(node)) { const val = node[key]; if (val instanceof Object) { if (this.hasValidationErrors(val, visited)) { return true; } } } return false; } /** * Gets the validation errors for the specific node */ getNodeValidationErrors(node: Object, visited: any[] = []): Object[] { const validationErrors = node.validationErrors ? [...node.validationErrors] : []; // if this is a parallel, check the parent stage for errors const parent = pipelineStore.findParentStage(node); if (parent && pipelineStore.pipeline !== parent && parent.validationErrors) { validationErrors.push.apply(validationErrors, parent.validationErrors); } return validationErrors.length ? validationErrors : null; } /** * Gets all validation errors for the node and all child nodes */ getAllValidationErrors(node: Object, visited: any[] = []): Object[] { if (visited.indexOf(node) >= 0) { return null; } visited.push(node); const validationErrors = []; if (_hasValidationErrors(node)) { validationErrors.push.apply(validationErrors, node.validationErrors); } if (node instanceof Array || typeof node === 'array') { for (const v of node) { const childErrors = this.getAllValidationErrors(v, visited); if (childErrors) { validationErrors.push.apply(validationErrors, childErrors); } } } else if (node instanceof Object) { for (const key of Object.keys(node)) { const childErrors = this.getAllValidationErrors(node[key], visited); if (childErrors) { validationErrors.push.apply(validationErrors, childErrors); } } } return validationErrors.length ? 
validationErrors : null; } findNodeFromPath(pipeline: PipelineInfo, path: string[]): any { // like: "pipeline"/"stages"/"0"/"branches"/"0"/"steps" let node = pipeline; for (let i = 0; i < path.length; i++) { const part = path[i]; switch (part) { case 'pipeline': { break; } case 'stages': { const idx = parseInt(path[++i]); if (parseInt(idx) === idx) { node = node.children[idx]; } break; } case 'branches': { const idx = parseInt(path[++i]); // check if the 'default' single node path vs. parallel if (!node.children || node.children.length == 0) { // This probably in a parallel block, and is referencing the default branch, which is this node } else { node = node.children[idx]; } break; } case 'parallel': { const idx = parseInt(path[++i]); if (!isNaN(idx)) { node = node.children[idx]; } break; } case 'steps': { const idx = parseInt(path[++i]); if (!isNaN(idx)) { // it is actually the steps array, so just target the node node = node.steps[idx]; } break; } case 'arguments': { const idx = parseInt(path[++i]); if (!isNaN(idx)) { // it is actually the arguments array, so just target the node // FIXME ehh, arguments are stored in 'data' node = node.data; } break; } default: { // if we have reached a key/value, just apply the error here if (node && 'key' in node && 'value' in node) { return node; } // some error with some unknown section, try to find it // so we can at least display the error node = node[part]; break; } } } if (!node) { if (window.isDevelopmentMode) console.error('unable to find node for', path, 'in', pipeline); return pipeline; } return node; } applyValidationMarkers(metadata, pipeline: PipelineInfo, validation: Object): void { // just make sure nothing is hanging around this.clearValidationMarkers(pipeline); if (validation.result == 'failure') { for (const error of validation.errors) { if (error.location) { const node = this.findNodeFromPath(pipeline, error.location); if (node) { // ignore errors for nodes that are 'pristine' if (!node.pristine) { 
_appendValidationError(node, error.error); } error.applied = true; } } else if (error.jenkinsfileErrors) { for (const globalError of error.jenkinsfileErrors.errors) { if (_isArray(globalError.error)) { for (const errorText of globalError.error) { _appendValidationError(pipeline, errorText); error.applied = true; } } else { _appendValidationError(pipeline, globalError.error); error.applied = true; } } } } for (const error of validation.errors) { if (!error.applied) { if (window.isDevelopmentMode) console.error(error); // surface in the UI _appendValidationError(pipeline, error.error); } } } _addClientSideErrors(metadata, pipeline); } clearValidationMarkers(node: Object, visited: any[] = []): void { if (visited.indexOf(node) >= 0) { return; } visited.push(node); if (node.validationErrors) { delete node.validationErrors; } if (_isArray(node)) { for (const v of node) { this.clearValidationMarkers(v, visited); } } else if (node instanceof Object) { for (const key of Object.keys(node)) { const val = node[key]; if (val instanceof Object) { this.clearValidationMarkers(val, visited); } } } } hasPristineEdits(node: Object, visited: any[] = []) { if (visited.indexOf(node) >= 0) { return false; } visited.push(node); if (node.pristine) { return true; } for (const key of Object.keys(node)) { const val = node[key]; if (val instanceof Object) { if (this.hasPristineEdits(val, visited)) { return true; } } } return false; } validateNow(onComplete) { const pipeline = pipelineStore.pipeline; const json = JSON.stringify(convertInternalModelToJson(pipeline)); this.lastPipelineValidated = json + (this.hasPristineEdits(pipeline) ? '.'
: ''); pipelineMetadataService.getStepListing(stepMetadata => { pipelineMetadataService.getAgentListing(agentMetadata => { this.validatePipeline(pipeline, validationResult => { this.applyValidationMarkers({ stepMetadata, agentMetadata }, pipeline, validationResult); pipelineStore.setPipeline(pipeline); // notify listeners to re-render if (onComplete) onComplete(); }); }); }); } delayedValidate = debounce(() => { this.validateNow(); }, validationTimeout); validate(onComplete) { const json = JSON.stringify(convertInternalModelToJson(pipelineStore.pipeline)); if (this.lastPipelineValidated === json) { if (onComplete) onComplete(); return; } if (!this.lastPipelineValidated) { this.validateNow(onComplete); } else { this.delayedValidate(); } } } const pipelineValidator = new PipelineValidator(); export default pipelineValidator; ```
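The `validate` method above avoids redundant server round-trips by caching the last serialized pipeline JSON and debouncing repeat requests. A minimal sketch of that skip-if-unchanged pattern (the `serialize` and `runValidation` parameters here are hypothetical stand-ins, not Blue Ocean APIs):

```javascript
// Sketch of the "skip validation when nothing changed" caching pattern used above.
// `serialize` and `runValidation` are illustrative stand-ins, not Blue Ocean APIs.
function makeValidator(serialize, runValidation) {
  let lastValidated = null; // serialized form of the last model we validated
  let runs = 0;
  return {
    validate(model) {
      const json = serialize(model);
      if (lastValidated === json) {
        return false; // model unchanged since the last run: skip the round-trip
      }
      lastValidated = json;
      runs += 1;
      runValidation(model);
      return true;
    },
    runCount() {
      return runs;
    },
  };
}
```

Because structurally identical models serialize to the same string, only the first of a run of identical calls actually triggers validation; the real class layers a debounce on top for edits that arrive in bursts.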
```html+erb <%= decidim_modal id: "conference-registration-confirm-#{model.id}", class: "conference__registration-modal" do %> <div class="flex items-center gap-2"> <%= icon "ticket-line", class: "w-6 h-6 text-gray fill-current flex-none" %> <div class="font-semibold text-black text-2xl" id="dialog-title-conference-registration-confirm-<%= model.id %>"><%= I18n.t("registration", scope: "decidim.conferences.conference.show") %></div> </div> <div id="dialog-desc-conference-registration-confirm-<%= model.id %>" class="text-gray-2 text-xl mt-9"> <%= decidim_sanitize_editor translated_attribute(conference.registration_terms) %> </div> <div class="flex justify-between mt-16"> <button class="button button__lg button__transparent-secondary" data-dialog-close="conference-registration-confirm-<%= model.id %>"><%= t("cancel", scope: "decidim.conferences.conference.registration_confirm") %></button> <%= button_to t("confirm", scope: "decidim.conferences.conference.registration_confirm"), conference_registration_type_conference_registration_path(conference, model), class: "button button__lg button__secondary" %> </div> <% end %> ```
```java /* * Licensed to the Apache Software Foundation (ASF) under one or more * contributor license agreements. See the NOTICE file distributed with * this work for additional information regarding copyright ownership. * The ASF licenses this file to You under the Apache License, Version 2.0 * (the "License"); you may not use this file except in compliance with * the License. You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 * * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * See the License for the specific language governing permissions and * limitations under the License. */ package org.apache.rocketmq.store; import org.apache.rocketmq.common.UtilAll; import org.apache.rocketmq.common.message.Message; import org.apache.rocketmq.common.message.MessageDecoder; import org.apache.rocketmq.common.message.MessageExtBatch; import org.junit.After; import java.io.File; import java.net.InetAddress; import java.net.InetSocketAddress; import java.net.SocketAddress; import java.net.UnknownHostException; import java.util.*; import java.util.concurrent.atomic.AtomicInteger; public class StoreTestBase { private int QUEUE_TOTAL = 100; private AtomicInteger QueueId = new AtomicInteger(0); private SocketAddress BornHost = new InetSocketAddress("127.0.0.1", 8123); private SocketAddress StoreHost = BornHost; private byte[] MessageBody = new byte[1024]; protected Set<String> baseDirs = new HashSet<>(); private static AtomicInteger port = new AtomicInteger(30000); public static synchronized int nextPort() { return port.addAndGet(5); } protected MessageExtBatch buildBatchMessage(int size) { MessageExtBatch messageExtBatch = new MessageExtBatch(); messageExtBatch.setTopic("StoreTest"); messageExtBatch.setTags("TAG1"); messageExtBatch.setKeys("Hello"); messageExtBatch.setQueueId(Math.abs(QueueId.getAndIncrement()) % QUEUE_TOTAL); messageExtBatch.setSysFlag(0); messageExtBatch.setBornTimestamp(System.currentTimeMillis()); messageExtBatch.setBornHost(BornHost); messageExtBatch.setStoreHost(StoreHost); List<Message> messageList = new ArrayList<>(size); for (int i = 0; i < size; i++) { messageList.add(buildMessage()); } messageExtBatch.setBody(MessageDecoder.encodeMessages(messageList)); return messageExtBatch; } protected MessageExtBrokerInner
buildMessage() { MessageExtBrokerInner msg = new MessageExtBrokerInner(); msg.setTopic("StoreTest"); msg.setTags("TAG1"); msg.setKeys("Hello"); msg.setBody(MessageBody); msg.setKeys(String.valueOf(System.currentTimeMillis())); msg.setQueueId(Math.abs(QueueId.getAndIncrement()) % QUEUE_TOTAL); msg.setSysFlag(0); msg.setBornTimestamp(System.currentTimeMillis()); msg.setStoreHost(StoreHost); msg.setBornHost(BornHost); return msg; } protected MessageExtBatch buildIPv6HostBatchMessage(int size) { MessageExtBatch messageExtBatch = new MessageExtBatch(); messageExtBatch.setTopic("StoreTest"); messageExtBatch.setTags("TAG1"); messageExtBatch.setKeys("Hello"); messageExtBatch.setBody(MessageBody); messageExtBatch.setMsgId("24084004018081003FAA1DDE2B3F898A00002A9F0000000000000CA0"); messageExtBatch.setKeys(String.valueOf(System.currentTimeMillis())); messageExtBatch.setQueueId(Math.abs(QueueId.getAndIncrement()) % QUEUE_TOTAL); messageExtBatch.setSysFlag(0); messageExtBatch.setBornHostV6Flag(); messageExtBatch.setStoreHostAddressV6Flag(); messageExtBatch.setBornTimestamp(System.currentTimeMillis()); try { messageExtBatch.setBornHost(new InetSocketAddress(InetAddress.getByName("1050:0000:0000:0000:0005:0600:300c:326b"), 8123)); } catch (UnknownHostException e) { e.printStackTrace(); } try { messageExtBatch.setStoreHost(new InetSocketAddress(InetAddress.getByName("::1"), 8123)); } catch (UnknownHostException e) { e.printStackTrace(); } List<Message> messageList = new ArrayList<>(size); for (int i = 0; i < size; i++) { messageList.add(buildIPv6HostMessage()); } messageExtBatch.setBody(MessageDecoder.encodeMessages(messageList)); return messageExtBatch; } protected MessageExtBrokerInner buildIPv6HostMessage() { MessageExtBrokerInner msg = new MessageExtBrokerInner(); msg.setTopic("StoreTest"); msg.setTags("TAG1"); msg.setKeys("Hello"); msg.setBody(MessageBody); msg.setMsgId("24084004018081003FAA1DDE2B3F898A00002A9F0000000000000CA0"); 
msg.setKeys(String.valueOf(System.currentTimeMillis())); msg.setQueueId(Math.abs(QueueId.getAndIncrement()) % QUEUE_TOTAL); msg.setSysFlag(0); msg.setBornHostV6Flag(); msg.setStoreHostAddressV6Flag(); msg.setBornTimestamp(System.currentTimeMillis()); try { msg.setBornHost(new InetSocketAddress(InetAddress.getByName("1050:0000:0000:0000:0005:0600:300c:326b"), 8123)); } catch (UnknownHostException e) { e.printStackTrace(); } try { msg.setStoreHost(new InetSocketAddress(InetAddress.getByName("::1"), 8123)); } catch (UnknownHostException e) { e.printStackTrace(); } return msg; } public static String createBaseDir() { String baseDir = System.getProperty("user.home") + File.separator + "unitteststore" + File.separator + UUID.randomUUID(); final File file = new File(baseDir); if (file.exists()) { System.exit(1); } return baseDir; } public static boolean makeSureFileExists(String fileName) throws Exception { File file = new File(fileName); MappedFile.ensureDirOK(file.getParent()); return file.createNewFile(); } public static void deleteFile(String fileName) { deleteFile(new File(fileName)); } public static void deleteFile(File file) { UtilAll.deleteFile(file); } @After public void clear() { for (String baseDir : baseDirs) { deleteFile(baseDir); } } } ```
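The tests above derive a queue id with `Math.abs(QueueId.getAndIncrement()) % QUEUE_TOTAL`. One caveat worth knowing: in Java, `Math.abs(Integer.MIN_VALUE)` is still negative, so once the counter wraps around, the abs-then-mod expression can yield a negative queue id. A floor-modulus sketch that stays non-negative for any input (the behaviour of Java's `Math.floorMod`; illustrative only, not part of the RocketMQ code):

```javascript
// Floor modulus: result is always in [0, n) for n > 0, even when x is negative.
// The plain remainder operator (`%` in both Java and JavaScript) keeps x's sign,
// which is why abs-then-mod can still go negative after integer wraparound in Java.
function floorMod(x, n) {
  return ((x % n) + n) % n;
}
```

Using `floorMod(counter, QUEUE_TOTAL)` (or `Math.floorMod` in Java) removes the wraparound edge case without the `Math.abs` call.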
```haskell {-# LANGUAGE DeriveGeneric #-} {-# LANGUAGE LambdaCase #-} {-# LANGUAGE ViewPatterns #-} module Cardano.Wallet.Types.UtxoStatistics ( -- * Types UtxoStatistics , BoundType , UtxoStatisticsError(..) -- * Constructing 'UtxoStatistics' , computeUtxoStatistics -- * Constructing 'BoundType' , log10 ) where import Universum import Control.Lens (at, (?~)) import Data.Aeson (FromJSON (..), Object, ToJSON (..), Value (..), genericParseJSON, genericToJSON, object, withObject, (.:), (.=)) import Data.Aeson.Types (Parser) import Data.Swagger (NamedSchema (..), Referenced (..), SwaggerType (..), ToSchema (..), declareSchemaRef, genericDeclareNamedSchema, minimum_, properties, required, type_) import Data.Word (Word64) import Formatting (bprint, build, (%)) import Serokell.Util (listJson) import Test.QuickCheck (Arbitrary (..), arbitrary, choose, elements, infiniteListOf, shuffle) import Cardano.Wallet.API.V1.Swagger.Example (Example) import Cardano.Wallet.Util (eitherToParser) import Pos.Chain.Txp (TxOut (..), TxOutAux (..), Utxo) import Pos.Core.Common (Coin (..)) import Pos.Infra.Util.LogSafe (BuildableSafeGen (..), deriveSafeBuildable) import qualified Control.Foldl as L import qualified Data.Aeson as Aeson import qualified Data.HashMap.Strict as HMS import qualified Data.List.NonEmpty as NL import qualified Data.Map.Strict as Map import qualified Data.Swagger as Swagger import qualified Formatting.Buildable -- -- TYPES -- data UtxoStatistics = UtxoStatistics { theHistogram :: ![HistogramBar] , theAllStakes :: !Word64 } deriving (Show, Generic, Ord) data UtxoStatisticsError = ErrEmptyHistogram | ErrInvalidBounds !Text | ErrInvalidTotalStakes !Text deriving (Eq, Show, Read, Generic) -- Buckets boundaries can be constructed in different ways data BoundType = Log10 deriving (Eq, Show, Read, Generic) instance ToJSON BoundType where toJSON = genericToJSON aesonEnumOpts instance FromJSON BoundType where parseJSON = genericParseJSON aesonEnumOpts instance ToSchema 
BoundType where declareNamedSchema = genericDeclareNamedSchema Swagger.defaultSchemaOptions instance Buildable UtxoStatisticsError where build = \case ErrEmptyHistogram -> bprint "Utxo statistics histogram cannot be empty." ErrInvalidBounds err -> bprint ("Utxo statistics have invalid bounds: "%build%".") err ErrInvalidTotalStakes err -> bprint ("Utxo statistics have invalid total stakes: "%build%".") err instance Eq UtxoStatistics where (UtxoStatistics h s) == (UtxoStatistics h' s') = s == s' && sorted h == sorted h' where sorted :: [HistogramBar] -> [HistogramBar] sorted = sortOn (\(HistogramBarCount key _) -> key) instance ToJSON UtxoStatistics where toJSON (UtxoStatistics bars allStakes) = let histogramObject = Object . HMS.fromList . map extractBarKey extractBarKey (HistogramBarCount bound stake) = show bound .= stake in object [ "histogram" .= histogramObject bars , "allStakes" .= allStakes , "boundType" .= log10 ] instance FromJSON UtxoStatistics where parseJSON = withObject "UtxoStatistics" parseUtxoStatistics where parseUtxoStatistics :: Object -> Parser UtxoStatistics parseUtxoStatistics o = eitherToParser =<< mkUtxoStatistics <$> (o .: "boundType") <*> (o .: "histogram") <*> (o .: "allStakes") instance Arbitrary UtxoStatistics where arbitrary = do upperBounds <- shuffle (NL.toList $ generateBounds Log10) counts <- infiniteListOf arbitrary let histogram = zip upperBounds counts let histoBars = map (uncurry HistogramBarCount) histogram allStakes <- choose (getPossibleBounds $ Map.fromList histogram) return $ UtxoStatistics histoBars allStakes instance BuildableSafeGen UtxoStatistics where buildSafeGen _ UtxoStatistics{..} = bprint ("{" %" histogram="%build %" allStakes="%build %" }") theHistogram theAllStakes instance Example UtxoStatistics instance ToSchema UtxoStatistics where declareNamedSchema _ = do wordRef <- declareSchemaRef (Proxy :: Proxy Word64) btypeRef <- declareSchemaRef (Proxy :: Proxy BoundType) pure $ NamedSchema (Just "UtxoStatistics") $ 
mempty & type_ ?~ SwaggerObject & required .~ ["histogram", "allStakes"] & properties .~ (mempty & at "boundType" ?~ btypeRef & at "allStakes" ?~ (Inline $ mempty & type_ ?~ SwaggerNumber & minimum_ .~ Just 0 ) & at "histogram" ?~ Inline (mempty & type_ ?~ SwaggerObject & properties .~ (mempty & at "10" ?~ wordRef & at "100" ?~ wordRef & at "1000" ?~ wordRef & at "10000" ?~ wordRef & at "100000" ?~ wordRef & at "1000000" ?~ wordRef & at "10000000" ?~ wordRef & at "100000000" ?~ wordRef & at "1000000000" ?~ wordRef & at "10000000000" ?~ wordRef & at "100000000000" ?~ wordRef & at "1000000000000" ?~ wordRef & at "10000000000000" ?~ wordRef & at "100000000000000" ?~ wordRef & at "1000000000000000" ?~ wordRef & at "10000000000000000" ?~ wordRef & at "45000000000000000" ?~ wordRef ) ) ) -- -- CONSTRUCTING -- -- | Smart-constructor to create bounds using a log-10 scale log10 :: BoundType log10 = Log10 {-# INLINE log10 #-} -- | Compute UtxoStatistics from a bunch of UTXOs computeUtxoStatistics :: BoundType -> [Utxo] -> UtxoStatistics computeUtxoStatistics btype = L.fold foldStatistics . concatMap getCoins where getCoins :: Utxo -> [Word64] getCoins = map (getCoin . txOutValue . toaOut) . Map.elems foldStatistics :: L.Fold Word64 UtxoStatistics foldStatistics = UtxoStatistics <$> foldBuckets (generateBounds btype) <*> L.sum foldBuckets :: NonEmpty Word64 -> L.Fold Word64 [HistogramBar] foldBuckets bounds = let step :: Map Word64 Word64 -> Word64 -> Map Word64 Word64 step x a = case Map.lookupGE a x of Just (k, v) -> Map.insert k (v+1) x Nothing -> Map.adjust (+1) (head bounds) x initial :: Map Word64 Word64 initial = Map.fromList $ zip (NL.toList bounds) (repeat 0) extract :: Map Word64 Word64 -> [HistogramBar] extract = map (uncurry HistogramBarCount) . Map.toList in L.Fold step initial extract -- -- INTERNALS -- -- Utxo statistics for the wallet. -- Histogram is composed of bars that represent the bucket. The bucket is tagged by upper bound of a given bucket. 
-- The bar value corresponds to the number of stakes -- In the future the bar value could be different things: -- (a) sum of stakes in a bucket -- (b) avg or std of stake in a bucket -- (c) topN buckets -- to name a few data HistogramBar = HistogramBarCount { bucketUpperBound :: !Word64 , bucketCount :: !Word64 } deriving (Show, Eq, Ord, Generic) instance Example HistogramBar instance Arbitrary HistogramBar where arbitrary = do upperBound <- elements (NL.toList $ generateBounds log10) count <- arbitrary pure (HistogramBarCount upperBound count) instance Buildable [HistogramBar] where build = bprint listJson instance BuildableSafeGen HistogramBar where buildSafeGen _ HistogramBarCount{..} = bprint ("{" %" upperBound="%build %" count="%build %" }") bucketUpperBound bucketCount mkUtxoStatistics :: BoundType -> Map Word64 Word64 -> Word64 -> Either UtxoStatisticsError UtxoStatistics mkUtxoStatistics btype histogram allStakes = do let (histoKeys, histoElems) = (Map.keys histogram, Map.elems histogram) let acceptedKeys = NL.toList $ generateBounds btype let (minPossibleValue, maxPossibleValue) = getPossibleBounds histogram let constructHistogram = uncurry HistogramBarCount let histoBars = map constructHistogram $ Map.toList histogram when (length histoKeys <= 0) $ Left ErrEmptyHistogram when (any (`notElem` acceptedKeys) histoKeys) $ Left $ ErrInvalidBounds $ "given bounds are incompatible with bound type (" <> show btype <> ")" when (any (< 0) histoElems) $ Left $ ErrInvalidBounds "encountered negative bound" when (allStakes < 0) $ Left $ ErrInvalidTotalStakes "total stakes is negative" when (allStakes < minPossibleValue || allStakes > maxPossibleValue) $ Left $ ErrInvalidTotalStakes "inconsistent total stakes & histogram" pure UtxoStatistics { theHistogram = histoBars , theAllStakes = allStakes } generateBounds :: BoundType -> NonEmpty Word64 generateBounds bType = let (^!) :: Word64 -> Word64 -> Word64 (^!)
= (^) in case bType of Log10 -> NL.fromList $ map (\toPower -> 10 ^! toPower) [1..16] ++ [45 * (10 ^! 15)] getPossibleBounds :: Map Word64 Word64 -> (Word64, Word64) getPossibleBounds histogram = (calculatePossibleBound fst, calculatePossibleBound snd) where createBracketPairs :: Num a => [a] -> [(a,a)] createBracketPairs (reverse -> (x:xs)) = zip (map (+1) $ reverse (xs ++ [0])) (reverse (x:xs)) createBracketPairs _ = [] matching fromPair (key,value) = map ( (*value) . fromPair ) . filter (\(_,upper) -> key == upper) acceptedKeys = NL.toList $ generateBounds log10 calculatePossibleBound fromPair = sum . concatMap (\pair -> matching fromPair pair $ createBracketPairs acceptedKeys) $ Map.toList histogram aesonEnumOpts :: Aeson.Options aesonEnumOpts = Aeson.defaultOptions { Aeson.tagSingleConstructors = True } -- | TH at the end because it needs mostly everything to be declared first deriveSafeBuildable ''UtxoStatistics deriveSafeBuildable ''HistogramBar ```
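`computeUtxoStatistics` above folds coin values into log10 buckets: each value is counted against the smallest upper bound that is greater than or equal to it (the `Map.lookupGE` step), and anything above every bound is clamped into the top bucket. A JavaScript sketch of the same bucketing, for illustration only (not the wallet's API):

```javascript
// Log10 bucket bounds mirroring generateBounds: 10^1 .. 10^16, plus 45 * 10^15.
function log10Bounds() {
  const bounds = [];
  for (let p = 1; p <= 16; p++) bounds.push(10 ** p);
  bounds.push(45 * 10 ** 15);
  return bounds.sort((a, b) => a - b); // ascending, so find() sees small bounds first
}

// Count each value against the smallest bound >= value (like Map.lookupGE);
// values above every bound fall into the last (largest) bucket.
function utxoHistogram(values) {
  const bounds = log10Bounds();
  const counts = new Map(bounds.map(b => [b, 0]));
  for (const v of values) {
    const bound = bounds.find(b => b >= v) ?? bounds[bounds.length - 1];
    counts.set(bound, counts.get(bound) + 1);
  }
  return counts;
}
```

Note that the real Haskell code works on `Word64` coin amounts; this sketch uses plain JavaScript numbers, which are only exact up to 2^53.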
```vue <example src="./examples/PositionDirection.vue" /> <example src="./examples/AnimationTypes.vue" /> <example src="./examples/EventTriggers.vue" /> <example src="./examples/MorphingIcon.vue" /> <template> <page-container centered :title="$t('pages.speedDial.title')"> <div class="page-container-section"> <p>Floating Action Buttons can show related actions upon hovering or pressing. The button should remain on screen after the menu is invoked.</p> <p>The Speed Dial component is pretty flexible and has many options to make it easy to suit all your needs. You can apply different positions, work with a couple of events to trigger the content, and even get an awesome morph effect on your main action.</p> <p>The component is divided into 3 parts: <code>md-speed-dial</code>, the container that controls all children; <code>md-speed-dial-content</code>, the content to be displayed (a.k.a. the buttons); and <code>md-speed-dial-trigger</code>, which is responsible for triggering the content exhibition. Take a look at the following example:</p> </div> <div class="page-container-section"> <h2 id="speedpositions">Speed Dial positions</h2> <p>You can specify any position that you want for your Speed Dial component. It can be placed at top or bottom, combined with left, center or right:</p> <code-example title="Positions and directions" :component="examples['position-direction']" /> <note-block tip>Prefer the FAB on the bottom left position for your main action on scrollable contents. Always use a <code>md-direction</code> equal to <code>bottom</code> when using a top position.</note-block> </div> <div class="page-container-section"> <h2 id="effects">Effects</h2> <p>The component can display different animations for each scenario that you might want:</p> <code-example title="Animations types" :component="examples['animation-types']" /> </div> <div class="page-container-section"> <h2 id="triggers">Triggers</h2> <p>You can trigger the speed dial content using hover or click.
Using this allows you to have an open/close feature or to hold a main action:</p> <code-example title="Event triggers" :component="examples['event-triggers']" /> <note-block tip>For desktop environments it's better to have a hover effect. On mobile you can toggle the property to use click instead.</note-block> <api-item title="API - md-speed-dial"> <api-table :headings="props.headings" :props="props.props" slot="props" /> <api-table :headings="classes.headings" :props="classes.props" slot="classes" /> </api-item> </div> <div class="page-container-section"> <h2 id="iconMorph">Icon Morph</h2> <p>Sometimes you want the speed dial to show a cross icon, representing a close action, after the content is displayed. This can be easily achieved with morphing icons.</p> <p>To do that, create two <code>md-icon</code> components inside the trigger and add <code>md-morph-initial</code> to the one that should be the initial state (or open state) and <code>md-morph-final</code> to the close state:</p> <code-example title="Morphing Icons" :component="examples['morphing-icon']" /> </div> <div class="page-container-section"> <h3>Components</h3> <api-item title="API - md-speed-dial-content"> <p>This component does not have any extra option.</p> </api-item> <api-item title="API - md-speed-dial-trigger"> <p>This component is just an alias of <code>md-button</code> with the <code>md-fab</code> class.
So every option of <router-link to="/components/button">Buttons</router-link> can be applied here, even the Vue Router options...</p> </api-item> </div> </page-container> </template> <script> import examples from 'docs-mixins/docsExample' export default { name: 'DocSpeedDial', mixins: [examples], data: () => ({ props: { headings: ['Name', 'Description', 'Default'], props: [ { name: 'md-direction', type: 'String', description: 'Applies the style to show the content below or above the trigger', defaults: 'top' }, { offset: true, name: 'md-direction="top"', type: 'String', description: 'Sets the direction of the animation effect to top. This is the default value of md-direction. You don\'t have to pass it unless you want to reset its default value', defaults: '-' }, { offset: true, name: 'md-direction="bottom"', type: 'String', description: 'Sets the direction of the animation effect to bottom.', defaults: '-' }, { name: 'md-effect', type: 'String', description: 'Applies the animation effect used to reveal the content.', defaults: 'fling' }, { offset: true, name: 'md-effect="fling"', type: 'String', description: 'Applies a reveal effect combining both opacity and scale.
This is the default behaviour in most applications using a FAB.', defaults: '-' }, { offset: true, name: 'md-effect="scale"', type: 'String', description: 'Applies a reveal effect using scale only.', defaults: '-' }, { offset: true, name: 'md-effect="opacity"', type: 'String', description: 'Applies a reveal effect using opacity only.', defaults: '-' }, { name: 'md-event', type: 'String', description: 'Specifies the event that triggers the content', defaults: 'hover' }, { offset: true, name: 'md-event="hover"', type: 'String', description: 'Opens the content on hover.', defaults: '-' }, { offset: true, name: 'md-event="click"', type: 'String', description: 'Opens the content on click.', defaults: '-' } ] }, classes: { headings: ['Name', 'Description'], props: [ { name: 'md-top-right', description: 'Positions the Speed Dial on the top right of the nearest relative parent' }, { name: 'md-top-center', description: 'Positions the Speed Dial on the top center of the nearest relative parent' }, { name: 'md-top-left', description: 'Positions the Speed Dial on the top left of the nearest relative parent' }, { name: 'md-bottom-right', description: 'Positions the Speed Dial on the bottom right of the nearest relative parent' }, { name: 'md-bottom-center', description: 'Positions the Speed Dial on the bottom center of the nearest relative parent' }, { name: 'md-bottom-left', description: 'Positions the Speed Dial on the bottom left of the nearest relative parent' }, { name: 'md-fixed', description: 'Applies css "position: fixed" to the Speed Dial. Best used with the position classes above' } ] } }) } </script> ```
The Treasure of Puteaux is a collection of Gallic coins discovered by chance in 1950 at Puteaux, Hauts-de-Seine, France. Most of the coins are from the Parisii tribe. The coins were found in November 1950, during construction work to widen a street in Puteaux, when workers uncovered a container holding a large number of gold coins weighing about 800 grams in total. The exact circumstances of the discovery remain unclear, according to France's Department of Coins and Medals. The collection consists of 120 coins, most of which depict the head of Apollo. The find is the largest number of Parisii coins discovered at one time, and sheds light on the coinage of the Parisii. The coins were eventually auctioned, fetching record prices for Gallic coins. Sources Monique Mainjonet, Le Trésor de Puteaux. Revue Numismatique, VIe série, tome IV, 1962. Jean-Baptiste Colbert de Beaulieu, Les Monnaies gauloises des Parisii, Imprimerie nationale, 1970. Henri de La Tour, Atlas de monnaies gauloises, préparé par la Commission de topographie des Gaules, Plon, Paris, 1892. Treasure troves in France 1950 archaeological discoveries
```objective-c /* * * This file is part of System Informer. * * Authors: * * jxy-s 2022-2023 * */ #pragma once #define WPP_CONTROL_GUIDS \ WPP_DEFINE_CONTROL_GUID( \ KSystemInformer, (f64b58a2, 8214, 4037, 8c7d, b96ce6098f3d), \ WPP_DEFINE_BIT(GENERAL) /* bit 0 = 0x00000001 */ \ WPP_DEFINE_BIT(UTIL) /* bit 1 = 0x00000002 */ \ WPP_DEFINE_BIT(COMMS) /* bit 2 = 0x00000004 */ \ WPP_DEFINE_BIT(INFORMER) /* bit 3 = 0x00000008 */ \ WPP_DEFINE_BIT(VERIFY) /* bit 4 = 0x00000010 */ \ WPP_DEFINE_BIT(HASH) /* bit 5 = 0x00000020 */ \ WPP_DEFINE_BIT(TRACKING) /* bit 6 = 0x00000040 */ \ WPP_DEFINE_BIT(PROTECTION) /* bit 7 = 0x00000080 */ \ WPP_DEFINE_BIT(SOCKET) /* bit 8 = 0x00000100 */ \ ) #define WPP_LEVEL_EVENT_LOGGER(level,event) WPP_LEVEL_LOGGER(event) #define WPP_LEVEL_EVENT_ENABLED(level,event) \ (WPP_LEVEL_ENABLED(event) && \ WPP_CONTROL(WPP_BIT_ ## event).Level >= level) #define TMH_STRINGIFYX(x) #x #define TMH_STRINGIFY(x) TMH_STRINGIFYX(x) #ifdef TMH_FILE #include TMH_STRINGIFY(TMH_FILE) #endif ```
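Each `WPP_DEFINE_BIT` entry in the control GUID above occupies the next bit position, so the flag declared at position n carries the mask 1 << n: GENERAL is 0x1, UTIL is 0x2, and so on up to SOCKET at 0x100. A quick sketch of deriving the masks from declaration order (illustrative only; the real masks are generated by the WPP preprocessor):

```javascript
// Map each trace flag to its bit mask by declaration order: flag n -> 1 << n.
const flagNames = ['GENERAL', 'UTIL', 'COMMS', 'INFORMER', 'VERIFY',
                   'HASH', 'TRACKING', 'PROTECTION', 'SOCKET'];
const flagMasks = Object.fromEntries(flagNames.map((name, n) => [name, 1 << n]));
```

A combined enable mask is just the bitwise OR of the individual flags, e.g. `flagMasks.GENERAL | flagMasks.COMMS`.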
Clifton Hyde (born November 27, 1976) is an American multi-instrumentalist, composer, and producer currently working from and residing in Nashville. He has composed multiple film scores and is a frequent collaborator of filmmaker Miles Doleac, scoring The Hollow (2016), Demons (2017), "Two Birds", "Handsome", "Hallowed Ground", and "Demigod". In his feature film debut, Sona Jain's "For Real", Hyde performed as pianist for composer and tabla player Zakir Hussain. His piano and steel guitar can be heard on the feature film "Sun Dogs". He has performed with Michael Stipe (R.E.M.), Patti Smith, Philip Glass, Debbie Harry (Blondie), Lou Reed, Ben Harper, and the Kinks' Dave Davies at Carnegie Hall. He also performed at Carnegie Hall on French horn with the South Korean Symphony, playing Mozart's "Mass in C Major" and Haydn's "Great Mass", and on baritone acoustic guitar with Iceland's Sigur Rós. Hyde's music has accompanied the sculptures of French artist Alain Kirili and the paintings of artist Lou Rizzolo for the World Peace Art Initiative in Stavanger, Norway. He has composed modern classical music for modern dance choreographer Janis Brenner and has provided music for the Czech-American Marionette Theatre. He has performed in the pit for numerous Broadway and off-Broadway shows such as Jesus Christ Superstar, Guys and Dolls, Hair, City of Angels, Return to the Forbidden Planet, Blood Brothers, The Who's Tommy, The Threepenny Opera, Little Shop of Horrors, and Children of Eden, as well as for many regional and touring companies. He has worked with the Blue Man Group (zither, Chapman Stick, lead guitar, bass), Tinpan (guitar, voice, French horn, music director, producer), and Gato Loco (guitar, baritone guitar, French horn, alto horn, producer), as well as producing and playing with numerous other projects and musicians, including members of the Metropolitan Opera and New York Philharmonic.
External links Clifton Hyde interview Clifton Hyde & Janis Brenner Dance Video Clifton Hyde @ All About Jazz Joe McPhee, Alain Kirili, & Clifton Hyde concert video Slide guitarists American mandolinists American male composers 21st-century American composers Steel guitarists Musicians from Hattiesburg, Mississippi 1976 births Living people Guitarists from Mississippi American male guitarists 21st-century American guitarists 21st-century American male musicians
```yaml {{- /* */}} apiVersion: v1 kind: Service metadata: name: {{ template "grafana-loki.gossip-ring.fullname" . }} namespace: {{ .Release.Namespace | quote }} labels: {{- include "common.labels.standard" ( dict "customLabels" .Values.commonLabels "context" $ ) | nindent 4 }} app.kubernetes.io/part-of: grafana-loki app.kubernetes.io/component: loki {{- if or .Values.commonAnnotations .Values.loki.gossipRing.service.annotations }} {{- $annotations := include "common.tplvalues.merge" ( dict "values" ( list .Values.loki.gossipRing.service.annotations .Values.commonAnnotations ) "context" . ) }} annotations: {{- include "common.tplvalues.render" ( dict "value" $annotations "context" $) | nindent 4 }} {{- end }} spec: type: ClusterIP publishNotReadyAddresses: true clusterIP: None ports: - name: http port: {{ .Values.loki.gossipRing.service.ports.http }} targetPort: http-memberlist protocol: TCP selector: {{- include "common.labels.matchLabels" ( dict "customLabels" .Values.commonLabels "context" $ ) | nindent 4 }} loki-gossip-member: "true" ```
Vaaname Ellai () is a 1992 Indian Tamil-language drama film written and directed by K. Balachander, starring Anand Babu, Ramya Krishnan, Madhoo, Vishali Kannadasan, Rajesh and Babloo Prithiveeraj. The story follows five characters from different backgrounds who, vexed with their lives, jointly decide to end them, but first embark on a 100-day journey of living together happily before dying. The story of Vaaname Ellai was conceived after Balachander, upset on learning of numerous suicides inspired by his own film Ek Duuje Ke Liye (1981), became convinced he could make a film speaking against suicide. The film was released on 22 May 1992, and Balachander won the Filmfare Award for Best Director – Tamil. Plot Five young people decide that life is not worth living anymore, for various reasons. Deepak, the son of the judge K. Manjunath, is an idealist and never suspects that the many gifts showered upon him by his father were bought with bribes. One day, dressed as a robot, he shoots a video song about corruption in society. Seeing this, a friend remarks that Deepak's own father is very corrupt. The angered Deepak beats up his friend and declares that if that were true, he would commit suicide out of shame. Back at home, he sees his father taking a large amount of money as a bribe for a lawsuit. The shocked Deepak argues with his father over his corrupt practices. His mother starts justifying the corruption, which has brought the family luxuries like the bungalow, the car, and Deepak's Yamaha bike; besides, a large amount of money is needed for the dowries of his two sisters. Deepak is unable to bear this, immediately sets fire to his new Yamaha bike, and leaves home. Gautham is the motherless only son of a rich businessman, M. R. T. He is happy-go-lucky, and in love with Suguna, a computer operator working in his father's office.
One day his father learns of their love and despises it, as he has big plans to marry his son to the daughter of a rich potential business partner. He threatens Gautham not to marry Suguna, but the much-pampered Gautham is adamant about marrying her. Meanwhile, Akhila, Suguna's widowed mother, also objects to their love, fearing problems arising from the difference in their social status. M. R. T. comes up with a plan to stop the couple from marrying: he convinces Akhila to marry him, making Gautham and Suguna step-siblings. The couple is heartbroken. Gautham wants to commit suicide, but Suguna chooses to continue living as the rich businessman's new step-daughter. She soon exacts her own brand of revenge through heavy partying, boozing and one-night stands; whenever arrested, she proudly proclaims herself the rich man's daughter. Karpagam was forced into marriage with a very rich but old man, and Subathra is a gang-rape victim. The fifth, Pasupathy, is a poor, unemployed youth of high caste who cannot find employment, despite scoring very high marks in college, because of caste-based reservation. They all meet at the suicide point and decide to live a happy life for 100 days and then end it all. In those 100 days, they have all sorts of fun; they even sing mourning songs for their own deaths. But one of them, the unemployed Pasupathy, secretly tries to turn his friends' minds away from suicide. They tell him their minds are made up and that he can leave if he chooses to. Instead, he moves his own suicide earlier, informing them that he did so to make them realise that death is no joke, and that if his friends changed their minds, his death would not be in vain. They soon start having doubts about going ahead with their suicides. Meanwhile, they find a baby at their doorstep and have no choice but to take care of the child, to whom they grow emotionally close. Finally the dead friend's father arrives and meets them, having traced his son's letters with great difficulty.
When informed of his son's death, he mourns and accuses the remaining youths of being the cause. The four decide to die immediately upon hearing this. As they go to the suicide point, they meet their dead friend there: he had faked his own death, and had arranged for both the child and his father's visit, to dissuade them. His father takes the child to an orphanage, where the group meets people with various physical deformities leading fruitful and cheerful lives. Inspired by the achievements of the disabled people they meet, and by the advice those people give them, the five youths decide to live long and brave lives. Cast Anand Babu as Deepak Babloo Prithiveeraj as Gautam Madhoo as Karpagam Ramya Krishnan as Subathra Gowtham Sundararajan as Pasupathy Charle as Sabu Kavithalaya Krishnan as Babu Dhamu as Chevvaa Madhan Bob as Paa. Dhi. Pandiyan Vishali Kannadasan as Suguna Bhanupriya as herself Cochin Haneefa as M.R.T., Gautam's father Rajesh as Samyvelu, Pasupathy's father H. Ramakrishnan as Gandhiraman Chithra as Sundari, Gandhiraman's sister Sethu Vinayagam as K. Manjunath, Deepak's father Poovilangu Mohan as Deepak's friend Y. Vijaya as Akhila, Suguna's mother S. N. Lakshmi as Sabu and Babu's mother Production After the release of Ek Duuje Ke Liye (1981), Balachander was upset to learn of numerous people wanting to commit suicide the way it happened in that film. Lakshmi Vijayakumar, a psychiatrist, suggested he make a film showing "the thought of suicide in a positive light — to go through struggle, but overcome it and realise that life is worth living". This laid the foundation for Vaaname Ellai. The film marked the acting debuts of Madhan Bob and Dhamu, comedians who went on to become popular; Bob had been recommended to Balachander by his father's friend. Soundtrack Tamil (original) version The music was composed by Maragadha Mani, with lyrics by Vairamuthu.
He later reused the tune of "Kambangadu" as "Gundu Soodi" for the Telugu film Chatrapathi (2005). The tune of "Jana Gana Mana" was later adapted by Maragadha Mani as "Yedavaku" for the Telugu film S. P. Parasuram. Telugu (dubbed) version The film was dubbed into Telugu as October 2 and all lyrics were written by Rajasri. Release and reception Vaaname Ellai was released on 22 May 1992. On the same day, N. Krishnaswamy of The Indian Express wrote, "Given the varied nature of characters at hand, the treatment at first has got to be episodic but Balachander's vibrant style of telling a tale keeps reels crackling". K. Vijiyan of New Straits Times called it a "must-see picture for youth". Kutty Krishnan of Kalki praised Balachander for delivering a message against suicide. The film won the Tamil Nadu State Film Award Special Prize for Best Film, and Balachander won the Best Story Writer award at the same ceremony. Balachander also won the Filmfare Award for Best Director – Tamil. References External links 1990s Tamil-language films 1992 drama films 1992 films Films about suicide Films directed by K. Balachander Films scored by M. M. Keeravani Films with screenplays by K. Balachander Indian drama films
```shell #! /usr/bin/env bats load '/bats-support/load.bash' load '/bats-assert/load.bash' load '/getssl/test/test_helper.bash' # This is run for every test teardown() { [ -n "$BATS_TEST_COMPLETED" ] || touch $BATS_RUN_TMPDIR/failed.skip } setup() { [ ! -f $BATS_RUN_TMPDIR/failed.skip ] || skip "skipping tests after first failure" export CURL_CA_BUNDLE=/root/pebble-ca-bundle.crt } @test "Create dual certificates using HTTP-01 verification" { if [ -n "$STAGING" ]; then skip "Using staging server, skipping internal test" fi check_nginx if [ "$OLD_NGINX" = "false" ]; then CONFIG_FILE="getssl-http01-dual-rsa-ecdsa.cfg" else CONFIG_FILE="getssl-http01-dual-rsa-ecdsa-old-nginx.cfg" fi setup_environment init_getssl create_certificate assert_success check_output_for_errors check_certificates assert [ -e "${INSTALL_DIR}/.getssl/${GETSSL_CMD_HOST}/chain.ec.crt" ] assert [ -e "${INSTALL_DIR}/.getssl/${GETSSL_CMD_HOST}/fullchain.ec.crt" ] assert [ -e "${INSTALL_DIR}/.getssl/${GETSSL_CMD_HOST}/${GETSSL_CMD_HOST}.ec.crt" ] } @test "Check renewal test works for dual certificates using HTTP-01" { if [ -n "$STAGING" ]; then skip "Using staging server, skipping internal test" fi check_nginx run ${CODE_DIR}/getssl -U -d $GETSSL_HOST if [ "$OLD_NGINX" = "false" ]; then assert_line --partial "certificate on server is same as the local cert" else assert_line --partial "certificate is valid for more than 30 days" fi assert_success } @test "Force renewal of dual certificates using HTTP-01" { if [ -n "$STAGING" ]; then skip "Using staging server, skipping internal test" fi run ${CODE_DIR}/getssl -U -f $GETSSL_HOST assert_success check_output_for_errors } @test "Create dual certificates using DNS-01 verification" { if [ -n "$STAGING" ]; then skip "Using staging server, skipping internal test" fi check_nginx if [ "$OLD_NGINX" = "false" ]; then CONFIG_FILE="getssl-dns01-dual-rsa-ecdsa.cfg" else CONFIG_FILE="getssl-dns01-dual-rsa-ecdsa-old-nginx.cfg" fi setup_environment init_getssl 
create_certificate assert_success check_output_for_errors check_certificates assert [ -e "${INSTALL_DIR}/.getssl/${GETSSL_CMD_HOST}/chain.ec.crt" ] assert [ -e "${INSTALL_DIR}/.getssl/${GETSSL_CMD_HOST}/fullchain.ec.crt" ] assert [ -e "${INSTALL_DIR}/.getssl/${GETSSL_CMD_HOST}/${GETSSL_CMD_HOST}.ec.crt" ] } @test "Force renewal of dual certificates using DNS-01" { if [ -n "$STAGING" ]; then skip "Using staging server, skipping internal test" fi run ${CODE_DIR}/getssl -U -f $GETSSL_HOST assert_success check_output_for_errors cleanup_environment } ```
```c /* * * All rights reserved. * * Redistribution and use in source and binary forms, with or without modification, are * permitted provided that the following conditions are met: * * 1. Redistributions of source code must retain the above copyright notice, this list of * conditions and the following disclaimer. * * 2. Redistributions in binary form must reproduce the above copyright notice, this list of * conditions and the following disclaimer in the documentation and/or other materials provided * with the distribution. * * 3. Neither the name of the copyright holder nor the names of its contributors may be used to * endorse or promote products derived from this software without specific prior written * permission. * * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS * OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF * MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE * COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, * EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE * GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED * AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING * NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED * OF THE POSSIBILITY OF SUCH DAMAGE. */ int add(int first, int second) { first = first + 2; return first + second; } int main() { return add(1, add(3, add(4, 5))); } ```
Afte is a river of North Rhine-Westphalia, Germany. It is a tributary of the river Alme, into which it flows in Büren. See also List of rivers of North Rhine-Westphalia References Rivers of North Rhine-Westphalia Rivers of Germany
```cpp
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
//     path_to_url
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

#pragma once

#include "paddle/phi/core/dense_tensor.h"

namespace phi {

template <typename T, typename Context>
void CummaxGradKernel(const Context& dev_ctx,
                      const DenseTensor& x,
                      const DenseTensor& indices,
                      const DenseTensor& out_grad,
                      int axis,
                      DataType dtype,
                      DenseTensor* x_grad);

template <typename T, typename Context>
void CumminGradKernel(const Context& dev_ctx,
                      const DenseTensor& x,
                      const DenseTensor& indices,
                      const DenseTensor& out_grad,
                      int axis,
                      DataType dtype,
                      DenseTensor* x_grad);

}  // namespace phi
```
```html <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "path_to_url"> <html xmlns="path_to_url"> <head> <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" /> <title>MEGA 2.0</title> <meta http-equiv="X-UA-Compatible" content="IE=Edge" /> <link rel="stylesheet" type="text/css" href="css/style.css" /> <script type="text/javascript" src="js/jquery-1.8.1.js"></script> <script type="text/javascript" src="js/jquery.jscrollpane.js"></script> <script type="text/javascript" src="js/jquery.mousewheel.js"></script> <script type="text/javascript"> $(document).ready(function() { function initMainScroll() { $('.main-scroll-block').jScrollPane({enableKeyboardNavigation:false,showArrows:true, arrowSize:5,animateScroll: true}); } initMainScroll(); $(window).bind('resize', function () { initMainScroll(); }); $('.login-register-input input').unbind('focus'); $('.login-register-input input').bind('focus',function(e) { $(this).parent().addClass('focused'); }); $('.login-register-input input').unbind('blur'); $('.login-register-input input').bind('blur',function(e) { $(this).parent().removeClass('focused'); }); }); </script> </head> <body id="bodyel" class="bottom-pages"> <div id="startholder" class="fmholder" style="display: block;"> <div class="widget-block hidden"> <div class="widget-circle percents-0"> <div class="widget-arrows"> <div class="widget-tooltip"> <div class="widget-icon uploading hidden"> <span class="widget-txt">Uploading</span> <span class="widget-speed-block ulspeed"> KB/s </span> </div> <div class="widget-icon downloading hidden"> <span class="widget-txt">Downloading</span> <span class="widget-speed-block dlspeed"> KB/s </span> </div> </div> <div class="widget-arrow"></div> </div> </div> </div> <div class="main-scroll-block"> <div class="main-pad-block"> <div class="top-head"> <a class="logo"></a> <a class="top-menu-icon">Menu</a> <a class="create-account-button hidden">
Create Account </a> <a class="top-login-button hidden"> Login </a> <a class="fm-avatar" style="display: block;"><img alt="" src="blob:path_to_url"></a> <div class="activity-status-block" style="display: none;"> <div class="activity-status online" style="display: none;"></div> </div> <div class="top-search-bl"> <div class="top-search-clear"> <div class="top-clear-button"></div> <input type="text" value="Search" class="top-search-input"> </div> </div> </div> <!-- Recover-page step2 !--> <div class="main-mid-pad backup-recover"> <div class="main-left-block"> <h3 class="main-italic-header">Account recovery</h3> <div class="register-st2-txt-block"> <p> Your password is also your encryption key, so when you lose it you risk losing access to your data. </p> <h5 class="main-italic-header"> <span class="red">Download to your computer</span> </h5> <!-- Add class "uploading" to show percents !--> <div class="recover-upload-block"> <div class="restore-upload-percents"> 100% </div> <div class="backup-file-info"> <!-- Please hide these 2 spans if needed. Please add "success" or "fail" class names according to uploading status. !--> <span class="restore-uploading-status-icon success"> </span> <span class="tranfer-filetype-txt">README.txt</span> </div> <div class="backup-download-button"> Upload </div> </div> <!-- Please add "fail" class to add fail icon !--> <div class="login-register-input"> <div class="backup-input-button">Verify</div> <input type="text" name="key-input2" id="key-input2" value="Your Key"> </div> </div> </div> <!-- Remove "hidden" class if upload is successful !--> <div class="main-right-block recover-block"> <h3 class="main-italic-header"><span class="green">Congratulations!</span></h3> <div class="register-st2-txt-block"> <p> You were able to regain access to your account.
Please choose a new password below: </p> <div class="login-register-input password first"> <div class="top-login-input-tooltip"> <div class="top-login-tooltip-arrow"> <div class="top-loginp-tooltip-txt password"> Invalid password. <div class="white-txt password">Please strengthen your password.</div> </div> </div> </div> <div class="password-status-icon"> <div class="password-status-warning hidden"> </div> </div> <div class="register-loading-icon"><img alt="" src="path_to_url"></div> <input type="text" name="login-password" id="register-password" value="Password"> </div> <div class="new-registration good2"> <div class="register-pass-status-line1"></div> <div class="register-pass-status-line2"></div> <div class="register-pass-status-line3"></div> <div class="register-pass-status-line4"></div> <div class="register-pass-status-line5"></div> <div class="password-stutus-txt hidden"> <div class="new-reg-status-pad"> <strong>Strength:</strong> Password status </div> <div class="new-reg-status-description"> Your password is easily guessed. Try making your password longer. Combine uppercase & lowercase letters. Add special characters. Do not use names or dictionary words. </div> </div> </div> <div class="login-register-input password confirm"> <div class="top-login-input-tooltip"> <div class="top-login-tooltip-arrow"> <div class="top-loginp-tooltip-txt"> Passwords do not match. <div class="white-txt">Please try again</div> </div> </div> </div> <input type="text" name="login-password2" id="register-password2" value="Retype Password"> </div> <div class="restore-verify-button active"> Validate password </div> <div class="clear"></div> </div> </div> <!-- Remove "hidden" class if user clicked "No" !--> <div class="main-right-block recover-block hidden"> <h3 class="main-italic-header"><span class="red">Oops.</span></h3> <div class="register-st2-txt-block"> <p> Something went wrong. Either this key is not associated with your account, or is a wrong file. 
</p> <p> The file where your key is stored should be in plain text format, with the name Document_name.txt. </p> <p> If you're sure that you uploaded the correct file, please <a href="">contact support</a>. </p> </div> </div> <div class="clear"></div> </div> <!-- end !--> <div class="nw-bottom-block en"> <div class="nw-bottom-pad"> <div class="nw-bottom-column"> <div class="nw-bottom-header">MEGA</div> <a href="#privacycompany" class="nw-bottom-link">The Privacy Company</a> <a href="#about" class="nw-bottom-link">About us</a> <a href="#credits" class="nw-bottom-link">Credits</a> <a href="#blog" class="nw-bottom-link">Blog</a> </div> <div class="nw-bottom-column"> <div class="nw-bottom-header">Apps</div> <a href="#mobile" class="nw-bottom-link">Mobile Apps</a> <a href="#sync" class="nw-bottom-link">Sync Client</a> <a href="#chrome" class="nw-bottom-link">Chrome App</a> <a href="#firefox" class="nw-bottom-link">Firefox App</a> </div> <div class="nw-bottom-column"> <div class="nw-bottom-header">Support</div> <a href="#help" class="nw-bottom-link">Help Centre</a> <a href="#contact" class="nw-bottom-link">Contact Us</a> <a href="#resellers" class="nw-bottom-link">Resellers</a> </div> <div class="nw-bottom-column"> <div class="nw-bottom-header">Developers</div> <a href="#dev" class="nw-bottom-link">SDK</a> <a href="#doc" class="nw-bottom-link">Documentation</a> <a href="#affiliates" class="nw-bottom-link">Affiliate Program</a> </div> <div class="nw-bottom-column"> <div class="nw-bottom-header">Legal &amp; policies</div> <div class="nw-start-nz"></div> <div class="clear"></div> </div> <div class="nw-bottom-pad-with-border"> <div class="nw-bottom-copyrights"> Mega Limited 2014 <span>All rights reserved</span> </div> <a href="path_to_url" target="_blank" class="nw-bottom-social facebook"></a> <a href="path_to_url" target="_blank" class="nw-bottom-social twitter"></a> <div class="clear"></div> </div> </div> </div> </div> </div> </div> <div class="fm-dialog-overlay 
hidden"></div> <div class="fm-dialog restore-success hidden"> <div class="reg-success-icon"></div> <div class="reg-success-header">Awesome!</div> <div class="reg-success-txt"> You successfully changed your password.</div> <div class="restore-close-button"> <div class="fm-dialog-button browsers-button active">Close</div> </div> </div> </body> </html> ```
```cpp
/*
 *  Use of this source code is governed by a BSD-style license
 *  that can be found in the LICENSE file in the root of the source
 *  tree. An additional intellectual property rights grant can be found
 *  in the file PATENTS. All contributing project authors may
 *  be found in the AUTHORS file in the root of the source tree.
 */

#ifndef RTC_BASE_PROTOBUF_UTILS_H_
#define RTC_BASE_PROTOBUF_UTILS_H_

#include <string>

#if WEBRTC_ENABLE_PROTOBUF

#include "third_party/protobuf/src/google/protobuf/message_lite.h"
#include "third_party/protobuf/src/google/protobuf/repeated_field.h"

namespace webrtc {

using google::protobuf::MessageLite;
using google::protobuf::RepeatedPtrField;

}  // namespace webrtc

#endif  // WEBRTC_ENABLE_PROTOBUF

#endif  // RTC_BASE_PROTOBUF_UTILS_H_
```
Ross McCrorie (born 18 March 1998) is a Scottish professional footballer who plays as a centre-back or defensive midfielder for EFL Championship club Bristol City. McCrorie has previously played for Rangers, Ayr United, Dumbarton, Portsmouth and Aberdeen. He has also captained the Scotland under-21 team. Club career Early careers and loans out McCrorie, alongside his twin brother Robby (a goalkeeper), worked his way through the Rangers youth system, becoming captain of their U20s side. McCrorie joined Scottish League One side Ayr United on loan in February 2016, playing fifteen times and scoring twice as Ayr won promotion. After signing a new contract in December 2016, at the same time as his brother, he joined his former coach at Rangers, Ian Durrant, at Scottish Championship side Dumbarton on a loan deal until the end of the season. Rangers first breakthrough McCrorie made his debut for the Rangers senior team in September 2017, when he replaced Bruno Alves in a League Cup quarter-final against Partick Thistle at Firhill Stadium, won 3–1 after extra time. Following the match, manager Pedro Caixinha praised McCrorie, stating that he was "going to be the future of this country, not only this club, as a centre-half. We are very glad to have him with us". Later in the same week he made his first start – also his Scottish Premiership debut and maiden experience of the Old Firm derby – against Celtic at Ibrox. Rangers lost the match 2–0, but McCrorie's performance was reported favourably by the press and both Rangers and Celtic fans. McCrorie scored his first senior goal for Rangers with a header in the first half against Partick Thistle on 4 November 2017 at Ibrox; Rangers won the match 3–0. On 3 December he was deployed as a defensive midfielder in a fixture away to Aberdeen, which his team won to overtake their opponents into second place in the table. On 28 December he signed a new four-year contract. 
After he scored an important goal in the second visit to Aberdeen in May 2018, McCrorie received praise from both the Pittodrie boss Derek McInnes and the Rangers caretaker manager Jimmy Nicholl for his overall performance and influence in bringing his team back into the match; the player himself dedicated the strike to his dying grandmother. Loan to Portsmouth In July 2019, McCrorie agreed a deal to move to English club Portsmouth on loan for the 2019–20 season; it was reported that Pompey had an option to buy him outright, but this was refuted by Rangers manager Steven Gerrard a few days later. He made his competitive debut for Pompey on 3 August, a 1–0 defeat away at Shrewsbury Town, during which he was sent off in the 81st minute for a late challenge on Shrewsbury's Donald Love. Aberdeen On 17 August 2020, McCrorie joined Aberdeen, initially on a one-year loan due to short-term cash flow issues at his new club, with an obligation to buy for a reported fee of around £350,000. A clause in the deal meant he would not be eligible to play against his 'parent club' during the 2020–21 Scottish Premiership campaign, although he would not return to Rangers before becoming a permanent Aberdeen signing. The deal was made permanent on 1 February 2021, ahead of schedule, as Scott Wright moved in the opposite direction. Bristol City On 6 June 2023, McCrorie joined Bristol City on a three-year deal with the option of a further year, for an undisclosed fee. International career With his brother Robby, McCrorie won the Victory Shield with Scotland U16 in 2013–14, and the twins were selected for Scotland U17 at the 2015 UEFA European Under-17 Championship. McCrorie was named in the Scotland U19 squad for the elite round of the European Championships in March 2017, alongside Dumbarton teammate Daniel Harvie, having captained the side in the qualifying phase of the tournament. He has also captained the Scotland U21 team on several occasions, and was selected for the under-20 squad at the 2017 Toulon Tournament.
The team went on to claim the bronze medal, the nation's first-ever medal at the competition. McCrorie received his first call-up to the senior Scotland squad in October 2020. He was called up again in June 2023 for games against Norway and Georgia. Career statistics Honours Ayr United Scottish Championship Play-offs: 2015–16 Scotland U16 Victory Shield: 2013–14 References External links 1998 births Living people Scottish men's footballers Scottish twins Footballers from South Ayrshire Men's association football defenders Men's association football midfielders Scotland men's youth international footballers Dumbarton F.C. players Rangers F.C. players Portsmouth F.C. players Scottish Professional Football League players Ayr United F.C. players Scotland men's under-21 international footballers Aberdeen F.C. players People from Dailly
```javascript
'use strict';

const nestedValue = require('../helpers/nestedValue');

/**
 * Filters the collection, keeping only items whose value at `key` falls
 * outside the inclusive range [values[0], values[values.length - 1]].
 */
module.exports = function whereNotBetween(key, values) {
  return this.filter(item => (
    nestedValue(item, key) < values[0] || nestedValue(item, key) > values[values.length - 1]
  ));
};
```
Denis Valerievich Bogdan (; born 13 October 1996) is a Belarusian-born Russian volleyball player and a member of the Russian men's national volleyball team. Sporting achievements Clubs CEV Challenge Cup 2015/2016 – with Fakel Novy Urengoy 2016/2017 – with Fakel Novy Urengoy National championships 2021/2022 Russian Championship, with Dynamo Moscow 2021/2022 Russian Super Cup, with Dynamo Moscow References 1996 births Living people Russian men's volleyball players Olympic volleyball players for Russia Volleyball players at the 2020 Summer Olympics Sportspeople from Grodno Universiade medalists in volleyball Universiade bronze medalists for Russia Medalists at the 2019 Summer Universiade Medalists at the 2020 Summer Olympics Olympic silver medalists for the Russian Olympic Committee athletes Olympic medalists in volleyball
Using tags for version control
How to set your username and email
Remote repositories: viewing, editing and deleting
Remote repositories: fetching and pushing
Dates in git
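The topics above lend themselves to a short worked example. The commands below are a minimal sketch run inside a throwaway repository; the name, email, and remote URL are placeholders, not recommendations:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Work inside a throwaway repository so nothing outside it is touched.
repo="$(mktemp -d)"
git init --quiet "$repo"
cd "$repo"

# How to set your username and email (scoped to this repository;
# add --global to apply to every repository for the current user).
git config user.name "Example User"
git config user.email "user@example.com"

# Make an initial commit so there is something to tag.
echo "hello" > README.md
git add README.md
git commit --quiet -m "Initial commit"

# Using tags for version control: an annotated tag records a message
# and tagger, which suits release markers.
git tag -a v1.0 -m "First release"
git tag

# Remote repositories: viewing, editing and deleting (fetching and
# pushing would additionally require a reachable remote).
git remote add origin https://example.com/demo.git  # placeholder URL
git remote -v
git remote remove origin

# Dates in git: show the author date of the most recent commit.
git log -1 --format=%ad
```

Fetch and push follow the same pattern once a real remote is configured, e.g. `git fetch origin` and `git push origin v1.0`.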
Niz () is a rural locality (a village) in Semizerye Rural Settlement, Kaduysky District, Vologda Oblast, Russia. The population was 6 as of 2002. Geography Niz is located 14 km southwest of Kaduy (the district's administrative centre) by road. Korninskaya is the nearest rural locality. References Rural localities in Kaduysky District
Native Tongue (stylized in all uppercase) is the eleventh studio album by American alternative rock band Switchfoot. It was released on January 18, 2019, through Fantasy Records. Native Tongue peaked at No. 41 in its opening week on the Billboard 200 chart and No. 2 on the Billboard Christian albums chart. At the 50th Dove Awards, Native Tongue won Rock/Contemporary Album of the Year. Promotion Singles "Native Tongue" was released on October 19, 2018, as the album's first single, and was written by Jon Foreman, Tim Foreman, and Brent Kutzle of OneRepublic. The video for the track premiered on Paste magazine's website the same day. "Voices" was released on November 16, 2018, as the second single and the music video was released on the same day. Switchfoot promoted the album with a North American Native Tongue Tour, with supporting acts Colony House and Tyson Motsenbocker. "All I Need" was released on December 14, 2018, across all streaming platforms, followed by "Let it Happen" on January 4, 2019. Accolades Track listing History Native Tongue is Switchfoot's first studio album since Learning to Breathe (in 2000) not to crack the top 20 on the Billboard 200. It is also Switchfoot's first studio album since Hello Hurricane, in 2009, not to peak at No. 1 on the Billboard Top Christian Albums chart. It is Switchfoot's longest studio album by track listing, and the second-longest by duration, beaten only by Vice Verses. 
Personnel Switchfoot Jon Foreman – lead vocals, guitar Tim Foreman – bass guitar, background vocals, lead vocals Chad Butler – drums, percussion Jerome Fontamillas – keyboards, guitar Drew Shirley – guitar Technical The Foreman Brothers – producer Brent Kutzle – producer Tyler Chester – producer Tanner Sparks – engineer, mixing Tyler Spry – producer, engineer Bear Rinehart – additional production Adam Hawkins – mixing Bryan Cook – mixing John Nathaniel – mixing Joe Laporta – mastering Paul Blakemore – mastering Brandon Collins – string arrangement Additional instrumentation Kaela Sinclair – vocals Tanner Sparks – guitar Tyler Chester – keyboards, acoustic guitar Brent Kutzle – guitar, keyboards Tyler Spry – background vocals, programming, keyboards, guitar Bear Rinehart – keyboards Steve Wilmot – programming John Painter – marching drums, horns, strings David Davidson – violin David Angell – violin Monisa Angell – viola Betsy Lamb – viola Craig Nelson – contrabass Paul Nelson – cello Keith Tutt II – cello Charts References 2019 albums Fantasy Records albums Switchfoot albums
Vampires Suck is a 2010 American parody film written and directed by Jason Friedberg and Aaron Seltzer. It stars Jenn Proske, Matt Lanter, Christopher N. Riggi, Ken Jeong, Anneliese van der Pol, and Arielle Kebbel. The film is a parody of The Twilight Saga franchise (mainly the original film and its sequel, New Moon). Like the previous Friedberg and Seltzer movies, the film was panned by critics for its humor and plot. 20th Century Fox theatrically released the film on August 18, 2010. Plot Edward Sullen strips off his clothes during the saint Salvatore festival while Becca Crane rushes to stop him. While running through a fountain, she accidentally splashes water around, causing people to party in the fountain and thus stopping Becca. In a flashback, Becca Crane moves to the Pacific Northwestern town of Sporks to live with her clueless father, Sheriff Frank, after her mother starts an affair with Tiger Woods. Meanwhile, a killing spree is attributed to the Canadians, but the real perpetrators are a group of vampires usually mistaken for the Black Eyed Peas. Becca quickly befriends many students at her new high school, including Jennifer. She is also intrigued by the mysterious and aloof Edward Sullen, whose odd behavior perplexes her during their biology class. Later, when Becca is nearly struck by a van in the school parking lot, Edward stops the vehicle by thrusting another student in the van's way. Becca later "dreams" Edward was in her room; in truth he is, but is repulsed by things she does while sleeping. After much research and thought, Becca confronts Edward and tells him she thinks that he is one of the Jonas Brothers. Edward corrects her, saying he is a vampire but that he only consumes animal blood. Despite the danger of being around a vampire, Becca agrees to go to prom with Edward. Later, Becca and Edward kiss passionately in her room. Becca attempts to seduce Edward into having sex, but he abstains. 
On Becca's birthday, Edward introduces her to his vampire family. While unwrapping a gift, Becca gets a paper cut. Edward's brother Jeremiah attempts to bite her, but is knocked away. To keep his family away from Becca, Edward distracts them and takes her out to the woods. He then breaks up with Becca, who throws a tantrum after he leaves. Soon afterwards, Becca is attacked by three nomadic vampires, but Edward intervenes and saves her. Edward's departure leaves Becca heartbroken for months, but she is comforted by her deepening friendship with her childhood friend Jacob White. When Becca is accosted in the woods by the nomadic vampires again, Jacob transforms into a Chihuahua as his werewolf pack arrives to save her. Meanwhile, Edward has moved to Rio and is now dating Lady Gaga to get over losing Becca. He is later informed by his sister Iris, via her gift of prophecy, that Becca has killed herself. Edward becomes depressed and decides to provoke the "Zolturi", a powerful and narcissistic vampire coven, into killing him by stepping into the sunlight in front of humans, thus revealing the existence of vampires. Iris has another vision, of Becca's survival, as he leaves, but she is unable to warn him. Iris arrives at Becca's house and tells her that she has to save Edward by showing him that she is still alive. To Becca's shock, the Zolturi are currently partying at the prom due to its St. Salvatore theme. Jacob appears and demands that Becca choose between him and Edward, but just before she announces her decision, he is distracted by a cat and runs off to chase it. Upon arriving at the prom, Becca is caught between the warring factions of Edward fangirls and Jacob fangirls. She is unable to reach Edward before he exposes himself, figuratively and literally. However, twilight occurs, followed by a new moon and an eclipse, concealing Edward's vampiric nature as Becca gets him to safety.
Frank arrives to check on Becca, making her hopeful; however, he thinks supernatural creatures are the prom theme and leaves. After a fight between him and the Zolturi leader, Daro, Edward is forced to turn Becca into a vampire, at risk of being killed horribly. He agrees to do so only on the condition that she marry him, which she accepts. In the mid-credits scene, Edward is struck on the head by the leader of Jacob's fangirls. Edward survives the blow, and the girl is attacked by the newly vampirized Becca as the movie concludes. Cast Jenn Proske as Becca Crane Matt Lanter as Edward Sullen Christopher N. Riggi as Jacob White Ken Jeong as Daro Anneliese van der Pol as Jennifer Diedrich Bader as Sheriff Frank Crane Arielle Kebbel as Rachel B. J. Britt as Antoine Charlie Weber as Jack Crista Flanagan as Eden Kelsey Legin (credited as Kelsey Ford) as Iris Jun Hee Lee as Derric David DeLuise as Fisherman Scully Ike Barinholtz as Bobby White (uncredited) Dave Foley as Principal Smith Randal Reeder as Biker Dude Nick Eversman as Jeremiah Sullen Zane Holtz as Alex Krystal Mayo as Buffy the Vampire Slayer Parodies Films and TV shows The Twilight Saga (main parody) (2008–2010) Buffy the Vampire Slayer (1997–2003) True Blood (2008–2014) Alice in Wonderland (2010) Jersey Shore (2009–2012) The Vampire Diaries (2007–2009) Real-life people Black Eyed Peas Lady Gaga Tiger Woods Chris Brown Release Vampires Suck was released on August 18, 2010, in the United States, Canada and Russia, on August 26 in Australia and on October 15 in the United Kingdom. 20th Century Fox did not provide advance screenings of the film for critics. Critical response On Rotten Tomatoes the film has an approval rating of 4% based on reviews from 91 critics. The consensus reads: "Witlessly broad and utterly devoid of laughs, Vampires Suck represents a slight step forward for the Friedberg-Seltzer team." 
Metacritic gave it a weighted average score of 18 out of 100 based on reviews from 17 critics, indicating "overwhelming dislike", the worst score for a wide release in 2010. Audiences polled by CinemaScore gave the film an average grade of "C+" on an A+ to F scale. Spill.com, whose video reviews are usually around five minutes long and censored, had a twenty-second review which consisted of Korey Coleman staring blankly into the camera before uttering, "Fuck you" (the lowest rating the website gives) uncensored. In the audio commentary from the site, Coleman stated, "The films that these two directors make are so blatant at being nothing more than a juvenile finger pointing at an image or mention of a popular trend that, to me, they seem exploitive of a young culture raised to have an ever-decreasing attention span, thanks to the Internet and channel surfing and, this may sound a little crazy, but, I think it shows a slight de-evolution in what people will accept as entertainment." Peter Travers of Rolling Stone gave the film zero out of four stars and wrote a four-word review, which simply stated: "This movie sucks more." Film critic Mark Kermode reviewed the film on his Radio 5 show, prefacing the review by saying "It's no surprise to know that it's all terrible, witless, boring, terror". He criticized the film for what he perceived as stale subject matter, saying that the Twilight franchise had left the public consciousness and was no longer fit for parody: "It's not just that the ship has sailed; it's that the ship has sailed, gone across the Atlantic, hit an iceberg, sunk, been dragged up by at least one company, been turned into the biggest movie hit ever, and is now currently being retrofitted for 3D for an anniversary re-release." Another review, from Collider's Jake Horowitz, said that "not a single thing in its dreadful 82 minutes running time is even remotely worth watching, or considered even slightly entertaining."
He then wrote that "no amount of review can make up for what I witnessed while watching Vampires Suck. How wrong I was to assume that it was even watchable, because with each over-done hit on the head or kick in the balls I cringed at the thought of this movie making nearly $80 million worldwide; and imagining people actually laughing in a theatre somewhere." Despite the overwhelming negativity, Jenn Proske's mimicry of Kristen Stewart received some praise; Steve Persall of the Tampa Bay Times stated, "One thing the movie roasts to perfection is Kristen Stewart's overly pensive Bella... A newcomer named Jenn Proske has the mumbling, hair-twisting, lip-biting tics down pat, and her expressions of repressed sexuality are almost as funny as Stewart's." Entertainment Weekly said, "The exception is newcomer Jenn Proske, who spoofs Twilight star Kristen Stewart's flustered, hair-tugging angst with hilarious precision." Vampires Suck received four Golden Raspberry Award nominations: Worst Picture, Worst Director, Worst Screenplay and Worst Prequel, Remake, Rip-Off or Sequel.

Box office

In the United States, the film opened at number one on August 18 with $4,016,858. On August 19 it dropped to #3, behind The Expendables and Eat Pray Love, with $2,347,044. By the weekend, Vampires Suck landed at #2 behind The Expendables, about $200,000 ahead of Eat Pray Love. In its second weekend, the film dropped more than 50% from its opening weekend, finishing at #6; across the full second week it fell to #11, grossing no more than $500 per theater. As of July 12, 2012, the film had grossed $80,547,866 worldwide.
See also

Vampire film

References

External links
```javascript
import * as lib from './lib';

console.log(lib.cat);
```
```chuck
/*++

version 3. Alternative licensing terms are available. Contact
info@minocacorp.com for details. See the LICENSE file at the root of this
project for complete licensing information.

Module Name:

    Broadcom 2709 PWM Audio

Abstract:

    This module implements Broadcom 2709 PWM Audio support.

Author:

    Chris Stevens 2-May-2017

Environment:

    Kernel

--*/

from menv import driver;

function build() {
    var drv;
    var dynlibs;
    var entries;
    var name = "bc27pwma";
    var sources;

    sources = [
        "pwma.c",
    ];

    dynlibs = [
        "drivers/sound/core:sound"
    ];

    drv = {
        "label": name,
        "inputs": sources + dynlibs,
    };

    entries = driver(drv);
    return entries;
}
```
```cpp
// [AsmJit]
// Complete JIT Assembler for C++ Language.
//
// Zlib - See COPYING file in this package.

// [Guard]
#ifndef _ASMJIT_CORE_ASSEMBLER_H
#define _ASMJIT_CORE_ASSEMBLER_H

// [Dependencies - AsmJit]
#include "../core/buffer.h"
#include "../core/context.h"
#include "../core/defs.h"
#include "../core/logger.h"
#include "../core/podvector.h"
#include "../core/zonememory.h"

// [Api-Begin]
#include "../core/apibegin.h"

namespace AsmJit {

//! @addtogroup AsmJit_Core
//! @{

// ============================================================================
// [AsmJit::Assembler]
// ============================================================================

//! @brief Base class for @ref Assembler.
//!
//! This class implements the core serialization API only. The platform-specific
//! methods and intrinsics are implemented by derived classes.
//!
//! @sa @c Assembler.
struct Assembler
{
  // --------------------------------------------------------------------------
  // [Construction / Destruction]
  // --------------------------------------------------------------------------

  //! @brief Creates Assembler instance.
  ASMJIT_API Assembler(Context* context);
  //! @brief Destroys Assembler instance.
  ASMJIT_API virtual ~Assembler();

  // --------------------------------------------------------------------------
  // [LabelLink]
  // --------------------------------------------------------------------------

  //! @brief Data structure used to link linked-labels.
  struct LabelLink
  {
    //! @brief Previous link.
    LabelLink* prev;
    //! @brief Offset.
    sysint_t offset;
    //! @brief Inlined displacement.
    sysint_t displacement;
    //! @brief RelocId if link must be absolute when relocated.
    sysint_t relocId;
  };

  // --------------------------------------------------------------------------
  // [LabelData]
  // --------------------------------------------------------------------------

  //! @brief Label data.
  struct LabelData
  {
    //! @brief Label offset.
    sysint_t offset;
    //! @brief Label links chain.
    LabelLink* links;
  };

  // --------------------------------------------------------------------------
  // [RelocData]
  // --------------------------------------------------------------------------

  // X86 architecture uses 32-bit absolute addressing model by memory operands,
  // but 64-bit mode uses relative addressing model (RIP + displacement). In
  // code we are always using relative addressing model for referencing labels
  // and embedded data. In 32-bit mode we must patch all references to absolute
  // address before we can call generated function. We are patching only memory
  // operands.

  //! @brief Code relocation data (relative vs absolute addresses).
  struct RelocData
  {
    //! @brief Type of relocation.
    uint32_t type;
    //! @brief Size of relocation (4 or 8 bytes).
    uint32_t size;
    //! @brief Offset from code begin address.
    sysint_t offset;
    //! @brief Relative displacement or absolute address.
    union
    {
      //! @brief Relative displacement from code begin address (not to @c offset).
      sysint_t destination;
      //! @brief Absolute address where to jump.
      void* address;
    };
  };

  // --------------------------------------------------------------------------
  // [Context]
  // --------------------------------------------------------------------------

  //! @brief Get code generator.
  inline Context* getContext() const { return _context; }

  // --------------------------------------------------------------------------
  // [Memory Management]
  // --------------------------------------------------------------------------

  //! @brief Get zone memory manager.
  inline ZoneMemory* getZoneMemory() const { return const_cast<ZoneMemory*>(&_zoneMemory); }

  // --------------------------------------------------------------------------
  // [Logging]
  // --------------------------------------------------------------------------

  //! @brief Get logger.
  inline Logger* getLogger() const { return _logger; }
  //! @brief Set logger to @a logger.
  ASMJIT_API virtual void setLogger(Logger* logger);

  // --------------------------------------------------------------------------
  // [Error Handling]
  // --------------------------------------------------------------------------

  //! @brief Get error code.
  inline uint32_t getError() const { return _error; }
  //! @brief Set error code.
  //!
  //! This method is virtual, because higher classes can use it to catch all
  //! errors.
  ASMJIT_API virtual void setError(uint32_t error);

  // --------------------------------------------------------------------------
  // [Properties]
  // --------------------------------------------------------------------------

  //! @brief Get assembler property.
  ASMJIT_API virtual uint32_t getProperty(uint32_t propertyId) const;
  //! @brief Set assembler property.
  ASMJIT_API virtual void setProperty(uint32_t propertyId, uint32_t value);

  // --------------------------------------------------------------------------
  // [Capacity]
  // --------------------------------------------------------------------------

  //! @brief Get capacity of internal code buffer.
  inline size_t getCapacity() const { return _buffer.getCapacity(); }

  // --------------------------------------------------------------------------
  // [Offset]
  // --------------------------------------------------------------------------

  //! @brief Return current offset in buffer.
  inline size_t getOffset() const { return _buffer.getOffset(); }
  //! @brief Set offset to @a o and returns previous offset.
  //!
  //! This method can be used to truncate code (previous offset is not
  //! recorded) or to overwrite instruction stream at position @a o.
  //!
  //! @return Previous offset value that can be used to set offset back later.
  inline size_t toOffset(size_t o) { return _buffer.toOffset(o); }

  // --------------------------------------------------------------------------
  // [GetCode / GetCodeSize]
  // --------------------------------------------------------------------------

  //! @brief Return start of assembler code buffer.
  //!
  //! Note that buffer address can change if you emit instruction or something
  //! else. Use this pointer only when you finished or make sure you do not
  //! use returned pointer after emitting.
  inline uint8_t* getCode() const { return _buffer.getData(); }
  //! @brief Return current offset in buffer (same as <code>getOffset() + getTrampolineSize()</code>).
  inline size_t getCodeSize() const { return _buffer.getOffset() + getTrampolineSize(); }

  // --------------------------------------------------------------------------
  // [TakeCode]
  // --------------------------------------------------------------------------

  //! @brief Take internal code buffer and NULL all pointers (you take the ownership).
  ASMJIT_API uint8_t* takeCode();

  // --------------------------------------------------------------------------
  // [Clear / Reset]
  // --------------------------------------------------------------------------

  //! @brief Clear everything, but not deallocate buffers.
  ASMJIT_API void clear();
  //! @brief Reset everything (means also to free all buffers).
  ASMJIT_API void reset();
  //! @brief Called by clear() and reset() to clear all data related to derived
  //! class implementation.
  ASMJIT_API virtual void _purge();

  // --------------------------------------------------------------------------
  // [EnsureSpace]
  // --------------------------------------------------------------------------

  //! @brief Ensure space for next instruction.
  //!
  //! Note that this method can return false. It's rare and probably you never
  //! get this, but in some situations it's still possible.
  inline bool ensureSpace() { return _buffer.ensureSpace(); }

  // --------------------------------------------------------------------------
  // [GetTrampolineSize]
  // --------------------------------------------------------------------------

  //! @brief Get size of all possible trampolines needed to successfully
  //! generate relative jumps to absolute addresses. This value is only
  //! non-zero if jmp or call instructions were used with immediate operand
  //! (this means jump or call absolute address directly).
  //!
  //! Currently only _emitJmpOrCallReloc() method can increase trampoline size
  //! value.
  inline size_t getTrampolineSize() const { return _trampolineSize; }

  // --------------------------------------------------------------------------
  // [Buffer - Getters]
  // --------------------------------------------------------------------------

  //! @brief Get byte at position @a pos.
  inline uint8_t getByteAt(size_t pos) const { return _buffer.getByteAt(pos); }
  //! @brief Get word at position @a pos.
  inline uint16_t getWordAt(size_t pos) const { return _buffer.getWordAt(pos); }
  //! @brief Get dword at position @a pos.
  inline uint32_t getDWordAt(size_t pos) const { return _buffer.getDWordAt(pos); }
  //! @brief Get qword at position @a pos.
  inline uint64_t getQWordAt(size_t pos) const { return _buffer.getQWordAt(pos); }
  //! @brief Get int32_t at position @a pos.
  inline int32_t getInt32At(size_t pos) const { return (int32_t)_buffer.getDWordAt(pos); }
  //! @brief Get int64_t at position @a pos.
  inline int64_t getInt64At(size_t pos) const { return (int64_t)_buffer.getQWordAt(pos); }
  //! @brief Get intptr_t at position @a pos.
  inline intptr_t getIntPtrTAt(size_t pos) const { return _buffer.getIntPtrTAt(pos); }
  //! @brief Get uintptr_t at position @a pos.
  inline uintptr_t getUIntPtrTAt(size_t pos) const { return _buffer.getUIntPtrTAt(pos); }
  //! @brief Get size_t at position @a pos.
  inline size_t getSizeTAt(size_t pos) const { return _buffer.getSizeTAt(pos); }

  // --------------------------------------------------------------------------
  // [Buffer - Setters]
  // --------------------------------------------------------------------------

  //! @brief Set byte at position @a pos.
  inline void setByteAt(size_t pos, uint8_t x) { _buffer.setByteAt(pos, x); }
  //! @brief Set word at position @a pos.
  inline void setWordAt(size_t pos, uint16_t x) { _buffer.setWordAt(pos, x); }
  //! @brief Set dword at position @a pos.
  inline void setDWordAt(size_t pos, uint32_t x) { _buffer.setDWordAt(pos, x); }
  //! @brief Set qword at position @a pos.
  inline void setQWordAt(size_t pos, uint64_t x) { _buffer.setQWordAt(pos, x); }
  //! @brief Set int32_t at position @a pos.
  inline void setInt32At(size_t pos, int32_t x) { _buffer.setDWordAt(pos, (uint32_t)x); }
  //! @brief Set int64_t at position @a pos.
  inline void setInt64At(size_t pos, int64_t x) { _buffer.setQWordAt(pos, (uint64_t)x); }
  //! @brief Set intptr_t at position @a pos.
  inline void setIntPtrTAt(size_t pos, intptr_t x) { _buffer.setIntPtrTAt(pos, x); }
  //! @brief Set uintptr_t at position @a pos.
  inline void setUIntPtrTAt(size_t pos, uintptr_t x) { _buffer.setUIntPtrTAt(pos, x); }
  //! @brief Set size_t at position @a pos.
  inline void setSizeTAt(size_t pos, size_t x) { _buffer.setSizeTAt(pos, x); }

  // --------------------------------------------------------------------------
  // [CanEmit]
  // --------------------------------------------------------------------------

  //! @brief Get whether the instruction can be emitted.
  //!
  //! This function behaves like @c ensureSpace(), but it also checks if the
  //! assembler is in an error state and in that case it returns @c false.
  //! Assembler internally always uses this function before a new instruction
  //! is emitted.
  //!
  //! It's implemented like:
  //! <code>return ensureSpace() && !getError();</code>
  inline bool canEmit()
  {
    // If there is an error, we can't emit another instruction until the last
    // error is cleared by calling @c setError(kErrorOk). If something caused
    // the error while generating code it's probably fatal in all cases. You
    // can't use generated code anymore, because you are not sure about the
    // status.
    if (_error) return false;

    // The ensureSpace() method returns true on success and false on failure.
    // We are catching the return value and setting the error code here.
    if (ensureSpace()) return true;

    // If we are here, there is a memory allocation error. Note that this is a
    // HEAP allocation error; a virtual allocation error can be caused only by
    // the AsmJit::VirtualMemory class!
    setError(kErrorNoHeapMemory);
    return false;
  }

  // --------------------------------------------------------------------------
  // [Emit]
  //
  // These functions are not protected against buffer overrun. Each place of
  // code which calls these functions ensures that there is some space using
  // the canEmit() method. Emitters are internally protected in AsmJit::Buffer,
  // but only in debug builds.
  // --------------------------------------------------------------------------

  //! @brief Emit Byte to internal buffer.
  inline void _emitByte(uint8_t x) { _buffer.emitByte(x); }
  //! @brief Emit word (2 bytes) to internal buffer.
  inline void _emitWord(uint16_t x) { _buffer.emitWord(x); }
  //! @brief Emit dword (4 bytes) to internal buffer.
  inline void _emitDWord(uint32_t x) { _buffer.emitDWord(x); }
  //! @brief Emit qword (8 bytes) to internal buffer.
  inline void _emitQWord(uint64_t x) { _buffer.emitQWord(x); }
  //! @brief Emit Int32 (4 bytes) to internal buffer.
  inline void _emitInt32(int32_t x) { _buffer.emitDWord((uint32_t)x); }
  //! @brief Emit Int64 (8 bytes) to internal buffer.
  inline void _emitInt64(int64_t x) { _buffer.emitQWord((uint64_t)x); }
  //! @brief Emit intptr_t (4 or 8 bytes) to internal buffer.
  inline void _emitIntPtrT(intptr_t x) { _buffer.emitIntPtrT(x); }
  //! @brief Emit uintptr_t (4 or 8 bytes) to internal buffer.
  inline void _emitUIntPtrT(uintptr_t x) { _buffer.emitUIntPtrT(x); }
  //! @brief Emit size_t (4 or 8 bytes) to internal buffer.
  inline void _emitSizeT(size_t x) { _buffer.emitSizeT(x); }
  //! @brief Embed data into instruction stream.
  ASMJIT_API void embed(const void* data, size_t len);

  // --------------------------------------------------------------------------
  // [Reloc]
  // --------------------------------------------------------------------------

  //! @brief Relocate code to a given address @a dst.
  //!
  //! @param dst Where the relocated code should be stored. The pointer can be
  //! an address returned by a virtual memory allocator or your own address if
  //! you want only to store the code for later reuse (or load, etc...).
  //! @param addressBase Base address used for relocation. When using JIT code
  //! generation, this will be the same as @a dst, only casted to a system
  //! integer type. But when generating code for a remote process the value
  //! can be different.
  //!
  //! @retval The bytes used. The code generator can create trampolines which
  //! are used when calling other functions inside the JIT code. However, these
  //! trampolines can be unused, so relocCode() returns the exact size needed
  //! for the function.
  //!
  //! A given buffer will be overwritten; to get the number of bytes required
  //! use @c getCodeSize().
  virtual size_t relocCode(void* dst, sysuint_t addressBase) const = 0;

  //! @brief Simplified version of @c relocCode() method designed for JIT.
  //!
  //! @overload
  inline size_t relocCode(void* dst) const { return relocCode(dst, (uintptr_t)dst); }

  // --------------------------------------------------------------------------
  // [Make]
  // --------------------------------------------------------------------------

  //! @brief Make is a convenience method to assemble the currently serialized
  //! code and return a pointer to the generated function.
  //!
  //! What you need is only to cast this pointer to your function type and call
  //! it. Note that if there was an error and calling the @c getError() method
  //! does not return @c kErrorOk (zero) then this function always returns
  //! @c NULL and the error value remains the same.
  virtual void* make() = 0;

  // --------------------------------------------------------------------------
  // [Helpers]
  // --------------------------------------------------------------------------

  ASMJIT_API LabelLink* _newLabelLink();

  // --------------------------------------------------------------------------
  // [Members]
  // --------------------------------------------------------------------------

  //! @brief ZoneMemory management.
  ZoneMemory _zoneMemory;
  //! @brief Binary code buffer.
  Buffer _buffer;
  //! @brief Context (for example @ref JitContext).
  Context* _context;
  //! @brief Logger.
  Logger* _logger;

  //! @brief Error code.
  uint32_t _error;
  //! @brief Properties.
  uint32_t _properties;
  //! @brief Emit flags for next instruction (cleared after emit).
  uint32_t _emitOptions;
  //! @brief Size of possible trampolines.
  uint32_t _trampolineSize;

  //! @brief Inline comment that will be logged by the next instruction and
  //! set to NULL.
  const char* _inlineComment;
  //! @brief Linked list of unused links (@c LabelLink* structures).
  LabelLink* _unusedLinks;

  //! @brief Labels data.
  PodVector<LabelData> _labels;
  //! @brief Relocations data.
  PodVector<RelocData> _relocData;
};

//! @}

} // AsmJit namespace

// [Api-End]
#include "../core/apiend.h"

// [Guard]
#endif // _ASMJIT_CORE_ASSEMBLER_H
```
Metatacha is a monotypic moth genus of the family Noctuidae erected by George Hampson in 1913. Its only species, Metatacha excavata, was first described by George Thomas Bethune-Baker in 1909. It is found in the Democratic Republic of the Congo and Uganda.

References
```swift
//
//  URL+ExtendedAttributes.swift
//  XCGLogger: path_to_url
//
//  Created by Dave Wood on 2017-04-04.
//  Some rights reserved: path_to_url
//
//  Based on code by Martin-R here: path_to_url

import Foundation

extension URL {
    /// Get extended attribute.
    func extendedAttribute(forName name: String) throws -> Data? {
        let data: Data? = try self.withUnsafeFileSystemRepresentation { (fileSystemPath: (UnsafePointer<Int8>?)) -> Data? in
            // Determine attribute size
            let length: Int = getxattr(fileSystemPath, name, nil, 0, 0, 0)
            guard length >= 0 else { return nil }

            // Create buffer with required size
            var data: Data = Data(count: length)

            // Retrieve attribute
            let result: Int = data.withUnsafeMutableBytes { [count = data.count] (body: UnsafeMutableRawBufferPointer) -> Int in
                getxattr(fileSystemPath, name, body.baseAddress, count, 0, 0)
            }

            guard result >= 0 else { throw URL.posixError(errno) }
            return data
        }

        return data
    }

    /// Set extended attribute.
    func setExtendedAttribute(data: Data, forName name: String) throws {
        try self.withUnsafeFileSystemRepresentation { fileSystemPath in
            let result: Int32 = data.withUnsafeBytes { [count = data.count] (body: UnsafeRawBufferPointer) -> Int32 in
                setxattr(fileSystemPath, name, body.baseAddress, count, 0, 0)
            }

            guard result >= 0 else { throw URL.posixError(errno) }
        }
    }

    /// Remove extended attribute.
    func removeExtendedAttribute(forName name: String) throws {
        try self.withUnsafeFileSystemRepresentation { fileSystemPath in
            let result: Int32 = removexattr(fileSystemPath, name, 0)
            guard result >= 0 else { throw URL.posixError(errno) }
        }
    }

    /// Get list of all extended attributes.
    func listExtendedAttributes() throws -> [String] {
        let list: [String] = try self.withUnsafeFileSystemRepresentation { (fileSystemPath: (UnsafePointer<Int8>?)) -> [String] in
            let length: Int = listxattr(fileSystemPath, nil, 0, 0)
            guard length >= 0 else { throw URL.posixError(errno) }

            // Create buffer with required size
            var data: Data = Data(count: length)

            // Retrieve attribute list
            let result: Int = data.withUnsafeMutableBytes { [count = data.count] (body: UnsafeMutableRawBufferPointer) -> Int in
                return listxattr(fileSystemPath, UnsafeMutablePointer<Int8>(OpaquePointer(body.baseAddress)), count, 0)
            }

            guard result >= 0 else { throw URL.posixError(errno) }

            // Extract attribute names
            let list: [String] = data.split(separator: 0).compactMap { String(data: Data($0), encoding: .utf8) }
            return list
        }

        return list
    }

    /// Helper function to create an NSError from a Unix errno.
    private static func posixError(_ err: Int32) -> NSError {
        return NSError(domain: NSPOSIXErrorDomain, code: Int(err),
                       userInfo: [NSLocalizedDescriptionKey: String(cString: strerror(err))])
    }
}
```
```javascript
"use strict";
function __export(m) {
    for (var p in m) if (!exports.hasOwnProperty(p)) exports[p] = m[p];
}
Object.defineProperty(exports, "__esModule", { value: true });
__export(require("rxjs-compat/operators/sampleTime"));
//# sourceMappingURL=sampleTime.js.map
```
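The `__export` helper above implements a simple re-export: it copies every enumerable property of the required module onto the local `exports` object, skipping any name `exports` already owns. A minimal, self-contained sketch of that copy loop (the `localExports` and `compatModule` objects below are hypothetical stand-ins, not part of rxjs):

```javascript
// Sketch of the re-export pattern: copy each enumerable property of a source
// module object onto a target "exports" object, unless the target already
// owns a property with that name. All names here are illustrative only.
function reExport(target, m) {
  for (var p in m) {
    if (!Object.prototype.hasOwnProperty.call(target, p)) target[p] = m[p];
  }
  return target;
}

var localExports = { local: 1 };
var compatModule = {
  sampleTime: function (ms) { return 'sampleTime(' + ms + ')'; },
  local: 2
};

reExport(localExports, compatModule);
console.log(localExports.sampleTime(100)); // → "sampleTime(100)"
console.log(localExports.local);           // → 1 (an existing export is not overwritten)
```

This is why the shim checks `exports.hasOwnProperty(p)`: anything the wrapper module defines locally takes precedence over the re-exported compat module.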
Beverly Hills Cop II: The Motion Picture Soundtrack Album is the soundtrack to Tony Scott's 1987 action comedy film Beverly Hills Cop II. It was released in 1987 through MCA Records. The album comprises eleven songs; production was handled by André Cymone, Giorgio Moroder, Keith Forsey, George Michael, Harold Faltermeyer, Howie Rice, Michael Verdick, Narada Michael Walden, Ready for the World and Stephen Bray. The song "Hold On" as sung by Keta Bill plays during the scene wherein Axel, Rosewood and Taggart confront Dent at the Playboy Mansion. However, the film's soundtrack CD released by MCA Records includes only a different song entitled "Hold On", sung by Corey Hart. This song has different music and slightly altered lyrics. The film introduced George Michael's controversial song "I Want Your Sex", a number 2 hit on the Billboard Hot 100. It also includes "Cross My Broken Heart" by The Jets (a Top 10 hit on the Billboard Hot 100) and "Shakedown" by Bob Seger (which became a number 1 hit on that same chart), as well as "Better Way" performed by James Ingram. The Pointer Sisters scored a moderate hit with "Be There" (number 42 on the Hot 100), their single from the soundtrack. It was the second time the sisters had contributed to the Beverly Hills Cop franchise; they'd notched a top 10 single with "Neutron Dance" from the Beverly Hills Cop soundtrack. Harold Faltermeyer's 1988 album, Harold F, includes a song called "Bad Guys", which is used as part of the film's score—an instrumental section of the song plays during the opening jewelry store robbery scene, and also during several other scenes throughout the film. The soundtrack debuted at number 8 on the Billboard 200 albums chart and spent 26 weeks on the chart, a far cry from the 49 weeks spent by the first film's soundtrack. Despite this, one song from the album, "Shakedown", was nominated for an Academy Award and the Golden Globe Award for Best Original Song.
However, another song from the album, "I Want Your Sex", won the Razzie Award for Worst Song, even though it went on to achieve a platinum certification for sales by the Recording Industry Association of America.

Track listing

Official score album

In 2016, La-La Land Records issued a limited edition album featuring the score from the movie composed by Harold Faltermeyer as well as the notable songs from the movie.

"Adrianos" (2:52)
"Bogomil Oil Well Jog / Bogomil Gets Shot" (2:28)
"Axel Gets the News" (1:10)
"Warehouse" (0:34)
"Hospital Visit" (1:05)
"Mansion" (1:07)
"Loyalty / Drive to Shooting Club" (1:51)
"Boys Car Talk" (1:11)
"Shoot Screens / Meet Dent and Cain" (2:53)
"I'll Be Sure to Duck" (0:54)
"Drive to Bogomil's" (0:58)
"Axel Shoes / Boys at Mansion" (1:24)
"Splash / Drive to 385" (0:41)
"Shootout" (0:53)
"Boys at Rosewood's" (0:42)
"Axel Calls Jeffrey" (1:01)
"Fingerprint" (0:27)
"Sneak to Shooting Club" (2:33)
"Jeffrey Calls Todd / Lutz Calls Jeffrey" (1:29)
"City Deposit" (4:12)
"The Tread to Hef's / Drive to Bernstein's" (theme suite) (1:44)
"Racetrack" (5:04)
"Drive to Oil / Hit Vic" (2:25)
"Sneak to Shack / Alarm" (1:44)
"Oil Field Shootout / Kill Dent and Karla" (4:11)
"Wrap Up" (0:56)
"Goodbye" (1:11)
"Loyalty" (alternate) (0:12)
"Bad Guys" - Keith Forsey (4:35)
"Shakedown" - Bob Seger (4:02)
"I Want Your Sex" - George Michael (4:45)
"The Heat is On" - Glenn Frey (3:45)
"Be There" - The Pointer Sisters (4:12)
"All Revved Up" - Jermaine Jackson (4:00)
"Better Way" - James Ingram (4:09)
"In Deep" - Charlie Sexton (3:32)

Charts

Certifications

References

External links
```javascript
!function(e){if("object"==typeof exports&&"undefined"!=typeof module)module.exports=e();else if("function"==typeof define&&define.amd)define([],e);else{var f;"undefined"!=typeof window?f=window:"undefined"!=typeof global?f=global:"undefined"!=typeof self&&(f=self),f.Slideout=e()}}(function(){var define,module,exports;return (function e(t,n,r){function s(o,u){if(!n[o]){if(!t[o]){var a=typeof require=="function"&&require;if(!u&&a)return a(o,!0);if(i)return i(o,!0);var f=new Error("Cannot find module '"+o+"'");throw f.code="MODULE_NOT_FOUND",f}var l=n[o]={exports:{}};t[o][0].call(l.exports,function(e){var n=t[o][1][e];return s(n?n:e)},l,l.exports,e,t,n,r)}return n[o].exports}var i=typeof require=="function"&&require;for(var o=0;o<r.length;o++)s(r[o]);return s})({1:[function(require,module,exports){
'use strict';

/**
 * Module dependencies
 */
var decouple = require('decouple');
var Emitter = require('emitter');

/**
 * Privates
 */
var scrollTimeout;
var scrolling = false;
var doc = window.document;
var html = doc.documentElement;
var msPointerSupported = window.navigator.msPointerEnabled;
var touch = {
  'start': msPointerSupported ? 'MSPointerDown' : 'touchstart',
  'move': msPointerSupported ? 'MSPointerMove' : 'touchmove',
  'end': msPointerSupported ? 'MSPointerUp' : 'touchend'
};
var prefix = (function prefix() {
  var regex = /^(Webkit|Khtml|Moz|ms|O)(?=[A-Z])/;
  var styleDeclaration = doc.getElementsByTagName('script')[0].style;
  for (var prop in styleDeclaration) {
    if (regex.test(prop)) {
      return '-' + prop.match(regex)[0].toLowerCase() + '-';
    }
  }
  // Nothing found so far? Webkit does not enumerate over the CSS properties of the style object.
  // However (prop in style) returns the correct value, so we'll have to test
  // for the presence of a specific property
  if ('WebkitOpacity' in styleDeclaration) { return '-webkit-'; }
  if ('KhtmlOpacity' in styleDeclaration) { return '-khtml-'; }
  return '';
}());

function extend(destination, from) {
  for (var prop in from) {
    if (from[prop]) {
      destination[prop] = from[prop];
    }
  }
  return destination;
}

function inherits(child, uber) {
  child.prototype = extend(child.prototype || {}, uber.prototype);
}

function hasIgnoredElements(el) {
  while (el.parentNode) {
    if (el.getAttribute('data-slideout-ignore') !== null) {
      return el;
    }
    el = el.parentNode;
  }
  return null;
}

/**
 * Slideout constructor
 */
function Slideout(options) {
  options = options || {};

  // Sets default values
  this._startOffsetX = 0;
  this._currentOffsetX = 0;
  this._opening = false;
  this._moved = false;
  this._opened = false;
  this._preventOpen = false;
  this._touch = options.touch === undefined ? true : options.touch && true;
  this._side = options.side || 'left';

  // Sets panel
  this.panel = options.panel;
  this.menu = options.menu;

  // Sets classnames
  if (!this.panel.classList.contains('slideout-panel')) { this.panel.classList.add('slideout-panel'); }
  if (!this.panel.classList.contains('slideout-panel-' + this._side)) { this.panel.classList.add('slideout-panel-' + this._side); }
  if (!this.menu.classList.contains('slideout-menu')) { this.menu.classList.add('slideout-menu'); }
  if (!this.menu.classList.contains('slideout-menu-' + this._side)) { this.menu.classList.add('slideout-menu-' + this._side); }

  // Sets options
  this._fx = options.fx || 'ease';
  this._duration = parseInt(options.duration, 10) || 300;
  this._tolerance = parseInt(options.tolerance, 10) || 70;
  this._padding = this._translateTo = parseInt(options.padding, 10) || 256;
  this._orientation = this._side === 'right' ? -1 : 1;
  this._translateTo *= this._orientation;

  // Init touch events
  if (this._touch) {
    this._initTouchEvents();
  }
}

/**
 * Inherits from Emitter
 */
inherits(Slideout, Emitter);

/**
 * Opens the slideout menu.
 */
Slideout.prototype.open = function() {
  var self = this;
  this.emit('beforeopen');
  if (!html.classList.contains('slideout-open')) {
    html.classList.add('slideout-open');
  }
  this._setTransition();
  this._translateXTo(this._translateTo);
  this._opened = true;
  setTimeout(function() {
    self.panel.style.transition = self.panel.style['-webkit-transition'] = '';
    self.emit('open');
  }, this._duration + 50);
  return this;
};

/**
 * Closes slideout menu.
 */
Slideout.prototype.close = function() {
  var self = this;
  if (!this.isOpen() && !this._opening) {
    return this;
  }
  this.emit('beforeclose');
  this._setTransition();
  this._translateXTo(0);
  this._opened = false;
  setTimeout(function() {
    html.classList.remove('slideout-open');
    self.panel.style.transition = self.panel.style['-webkit-transition'] = self.panel.style[prefix + 'transform'] = self.panel.style.transform = '';
    self.emit('close');
  }, this._duration + 50);
  return this;
};

/**
 * Toggles (open/close) slideout menu.
 */
Slideout.prototype.toggle = function() {
  return this.isOpen() ? this.close() : this.open();
};

/**
 * Returns true if the slideout is currently open, and false if it is closed.
 */
Slideout.prototype.isOpen = function() {
  return this._opened;
};

/**
 * Translates panel and updates currentOffset with a given X point
 */
Slideout.prototype._translateXTo = function(translateX) {
  this._currentOffsetX = translateX;
  this.panel.style[prefix + 'transform'] = this.panel.style.transform = 'translateX(' + translateX + 'px)';
  return this;
};

/**
 * Set transition properties
 */
Slideout.prototype._setTransition = function() {
  this.panel.style[prefix + 'transition'] = this.panel.style.transition = prefix + 'transform ' + this._duration + 'ms ' + this._fx;
  return this;
};

/**
 * Initializes touch event
 */
Slideout.prototype._initTouchEvents = function() {
  var self = this;

  /**
   * Decouple scroll event
   */
  this._onScrollFn = decouple(doc, 'scroll', function() {
    if (!self._moved) {
      clearTimeout(scrollTimeout);
      scrolling = true;
      scrollTimeout = setTimeout(function() {
        scrolling = false;
      }, 250);
    }
  });

  /**
   * Prevents touchmove event if slideout is moving
   */
  this._preventMove = function(eve) {
    if (self._moved) {
      eve.preventDefault();
    }
  };
  doc.addEventListener(touch.move, this._preventMove);

  /**
   * Resets values on touchstart
   */
  this._resetTouchFn = function(eve) {
    if (typeof eve.touches === 'undefined') {
      return;
    }
    self._moved = false;
    self._opening = false;
    self._startOffsetX = eve.touches[0].pageX;
    self._preventOpen = (!self._touch || (!self.isOpen() && self.menu.clientWidth !== 0));
  };
  this.panel.addEventListener(touch.start, this._resetTouchFn);

  /**
   * Resets values on touchcancel
   */
  this._onTouchCancelFn = function() {
    self._moved = false;
    self._opening = false;
  };
  this.panel.addEventListener('touchcancel', this._onTouchCancelFn);

  /**
   * Toggles slideout on touchend
   */
  this._onTouchEndFn = function() {
    if (self._moved) {
      self.emit('translateend');
      (self._opening && Math.abs(self._currentOffsetX) > self._tolerance) ? self.open() : self.close();
    }
    self._moved = false;
  };
  this.panel.addEventListener(touch.end, this._onTouchEndFn);

  /**
   * Translates panel on touchmove
   */
  this._onTouchMoveFn = function(eve) {
    if (scrolling || self._preventOpen || typeof eve.touches === 'undefined' || hasIgnoredElements(eve.target)) {
      return;
    }
    var dif_x = eve.touches[0].clientX - self._startOffsetX;
    var translateX = self._currentOffsetX = dif_x;
    if (Math.abs(translateX) > self._padding) {
      return;
    }
    if (Math.abs(dif_x) > 20) {
      self._opening = true;
      var oriented_dif_x = dif_x * self._orientation;
      if (self._opened && oriented_dif_x > 0 || !self._opened && oriented_dif_x < 0) {
        return;
      }
      if (!self._moved) {
        self.emit('translatestart');
      }
      if (oriented_dif_x <= 0) {
        translateX = dif_x + self._padding * self._orientation;
        self._opening = false;
      }
      if (!(self._moved && html.classList.contains('slideout-open'))) {
        html.classList.add('slideout-open');
      }
      self.panel.style[prefix + 'transform'] = self.panel.style.transform = 'translateX(' + translateX + 'px)';
      self.emit('translate', translateX);
      self._moved = true;
    }
  };
  this.panel.addEventListener(touch.move, this._onTouchMoveFn);

  return this;
};

/**
 * Enable opening the slideout via touch events.
 */
Slideout.prototype.enableTouch = function() {
  this._touch = true;
  return this;
};

/**
 * Disable opening the slideout via touch events.
 */
Slideout.prototype.disableTouch = function() {
  this._touch = false;
  return this;
};

/**
 * Destroy an instance of slideout.
 */
Slideout.prototype.destroy = function() {
  // Close before clean
  this.close();

  // Remove event listeners
  doc.removeEventListener(touch.move, this._preventMove);
  this.panel.removeEventListener(touch.start, this._resetTouchFn);
  this.panel.removeEventListener('touchcancel', this._onTouchCancelFn);
  this.panel.removeEventListener(touch.end, this._onTouchEndFn);
  this.panel.removeEventListener(touch.move, this._onTouchMoveFn);
  doc.removeEventListener('scroll', this._onScrollFn);

  // Remove methods
  this.open = this.close = function() {};

  // Return the instance so it can be easily dereferenced
  return this;
};

/**
 * Expose Slideout
 */
module.exports = Slideout;

},{"decouple":2,"emitter":3}],2:[function(require,module,exports){
'use strict';

var requestAnimFrame = (function() {
  return window.requestAnimationFrame ||
    window.webkitRequestAnimationFrame ||
    function (callback) { window.setTimeout(callback, 1000 / 60); };
}());

function decouple(node, event, fn) {
  var eve,
      tracking = false;

  function captureEvent(e) {
    eve = e;
    track();
  }

  function track() {
    if (!tracking) {
      requestAnimFrame(update);
      tracking = true;
    }
  }

  function update() {
    fn.call(node, eve);
    tracking = false;
  }

  node.addEventListener(event, captureEvent, false);

  return captureEvent;
}

/**
 * Expose decouple
 */
module.exports = decouple;

},{}],3:[function(require,module,exports){
"use strict";

var _classCallCheck = function (instance, Constructor) { if (!(instance instanceof Constructor)) { throw new TypeError("Cannot call a class as a function"); } };

exports.__esModule = true;

/**
 * Creates a new instance of Emitter.
 * @class
 * @returns {Object} Returns a new instance of Emitter.
 * @example
 * // Creates a new instance of Emitter.
 * var Emitter = require('emitter');
 *
 * var emitter = new Emitter();
 */
var Emitter = (function () {
  function Emitter() {
    _classCallCheck(this, Emitter);
  }

  /**
   * Adds a listener to the collection for the specified event.
   * @memberof! Emitter.prototype
   * @function
   * @param {String} event - The event name.
   * @param {Function} listener - A listener function to add.
   * @returns {Object} Returns an instance of Emitter.
   * @example
   * // Add an event listener to "foo" event.
   * emitter.on('foo', listener);
   */
  Emitter.prototype.on = function on(event, listener) {
    // Use the current collection or create it.
    this._eventCollection = this._eventCollection || {};

    // Use the current collection of an event or create it.
    this._eventCollection[event] = this._eventCollection[event] || [];

    // Appends the listener into the collection of the given event
    this._eventCollection[event].push(listener);

    return this;
  };

  /**
   * Adds a listener to the collection for the specified event that will be called only once.
   * @memberof! Emitter.prototype
   * @function
   * @param {String} event - The event name.
   * @param {Function} listener - A listener function to add.
   * @returns {Object} Returns an instance of Emitter.
   * @example
   * // Will add an event handler to "foo" event once.
   * emitter.once('foo', listener);
   */
  Emitter.prototype.once = function once(event, listener) {
    var self = this;

    function fn() {
      self.off(event, fn);
      listener.apply(this, arguments);
    }

    fn.listener = listener;

    this.on(event, fn);

    return this;
  };

  /**
   * Removes a listener from the collection for the specified event.
   * @memberof! Emitter.prototype
   * @function
   * @param {String} event - The event name.
   * @param {Function} listener - A listener function to remove.
   * @returns {Object} Returns an instance of Emitter.
   * @example
   * // Remove a given listener.
   * emitter.off('foo', listener);
   */
  Emitter.prototype.off = function off(event, listener) {
    var listeners = undefined;

    // Defines listeners value.
    if (!this._eventCollection || !(listeners = this._eventCollection[event])) {
      return this;
    }

    listeners.forEach(function (fn, i) {
      if (fn === listener || fn.listener === listener) {
        // Removes the given listener.
listeners.splice(i, 1); } }); // Removes an empty event collection. if (listeners.length === 0) { delete this._eventCollection[event]; } return this; }; /** * Execute each item in the listener collection in order with the specified data. * @memberof! Emitter.prototype * @function * @param {String} event - The name of the event you want to emit. * @param {...Object} data - Data to pass to the listeners. * @returns {Object} Returns an instance of Emitter. * @example * // Emits the "foo" event with 'param1' and 'param2' as arguments. * emitter.emit('foo', 'param1', 'param2'); */ Emitter.prototype.emit = function emit(event) { var _this = this; for (var _len = arguments.length, args = Array(_len > 1 ? _len - 1 : 0), _key = 1; _key < _len; _key++) { args[_key - 1] = arguments[_key]; } var listeners = undefined; // Defines listeners value. if (!this._eventCollection || !(listeners = this._eventCollection[event])) { return this; } // Clone listeners listeners = listeners.slice(0); listeners.forEach(function (fn) { return fn.apply(_this, args); }); return this; }; return Emitter; })(); /** * Exports Emitter */ exports["default"] = Emitter; module.exports = exports["default"]; },{}]},{},[1])(1) }); //# 
```
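For reference, the event contract that Slideout's `translatestart`/`translate`/`translateend` emissions rely on can be exercised on its own. The following is a stand-alone sketch that mirrors the bundled Emitter's `on`/`once`/`off`/`emit` semantics (an illustrative reimplementation for study, not the library source):

```javascript
// Minimal stand-alone sketch of the Emitter semantics used by Slideout.
function Emitter() {}

Emitter.prototype.on = function (event, listener) {
  // Lazily create the per-event listener collections.
  this._eventCollection = this._eventCollection || {};
  (this._eventCollection[event] = this._eventCollection[event] || []).push(listener);
  return this;
};

Emitter.prototype.once = function (event, listener) {
  var self = this;
  function fn() {
    self.off(event, fn);            // remove the wrapper before firing
    listener.apply(this, arguments);
  }
  fn.listener = listener;           // lets off() match by the original listener
  return this.on(event, fn);
};

Emitter.prototype.off = function (event, listener) {
  var listeners = this._eventCollection && this._eventCollection[event];
  if (!listeners) { return this; }
  listeners.forEach(function (fn, i) {
    if (fn === listener || fn.listener === listener) { listeners.splice(i, 1); }
  });
  if (listeners.length === 0) { delete this._eventCollection[event]; }
  return this;
};

Emitter.prototype.emit = function (event) {
  var args = Array.prototype.slice.call(arguments, 1);
  var listeners = this._eventCollection && this._eventCollection[event];
  if (!listeners) { return this; }
  // Clone before iterating so listeners removed mid-dispatch don't shift indices.
  listeners.slice(0).forEach(function (fn) { fn.apply(this, args); }, this);
  return this;
};

// Usage: a persistent and a one-shot listener on the same event.
var e = new Emitter();
var hits = [];
e.on('translate', function (x) { hits.push('on:' + x); });
e.once('translate', function (x) { hits.push('once:' + x); });
e.emit('translate', 10);
e.emit('translate', 20);
console.log(hits.join(',')); // on:10,once:10,on:20
```

Note that `emit` clones the listener array before iterating, so a `once` listener removing itself mid-dispatch cannot cause its neighbours to be skipped.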
```go
//go:build ee

/*
   Version 1.0 ("KERO-1.0")

   1. You may only view, read and display for studying purposes the source
      code of the software licensed under this license, and, to the extent
      explicitly provided under this license, the binary code.
   2. Any use of the software which exceeds the foregoing right, including,
      without limitation, its execution, compilation, copying, modification
      and distribution, is expressly prohibited.
   3. THE SOFTWARE IS PROVIDED AS IS, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
      IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
      FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
      THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR
      OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE,
      ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
      OTHER DEALINGS IN THE SOFTWARE.

   END OF TERMS AND CONDITIONS
*/

package seedcontroller

import (
	"context"
	"testing"

	kubermaticv1 "k8c.io/kubermatic/v2/pkg/apis/kubermatic/v1"
	kubermaticlog "k8c.io/kubermatic/v2/pkg/log"
	"k8c.io/kubermatic/v2/pkg/test/diff"
	"k8c.io/kubermatic/v2/pkg/test/fake"

	"k8s.io/apimachinery/pkg/api/resource"
	"k8s.io/apimachinery/pkg/types"
	"k8s.io/client-go/tools/record"
	ctrlruntimeclient "sigs.k8s.io/controller-runtime/pkg/client"
	"sigs.k8s.io/controller-runtime/pkg/reconcile"
)

const (
	rqName    = "resourceQuota"
	projectId = "project1"
)

func TestReconcile(t *testing.T) {
	testCases := []struct {
		name          string
		requestName   string
		resourceQuota *kubermaticv1.ResourceQuota
		seedClient    ctrlruntimeclient.Client
		expectedUsage kubermaticv1.ResourceDetails
	}{
		{
			name:          "scenario 1: calculate rq local usage",
			requestName:   rqName,
			resourceQuota: genResourceQuota(rqName),
			seedClient: fake.NewClientBuilder().
				WithObjects(
					genResourceQuota(rqName),
					genCluster("c1", projectId, "2", "5G", "10G"),
					genCluster("c2", projectId, "5", "2G", "8G"),
					genCluster("notSameProjectCluster", "impostor", "3", "3G", "3G"),
				).
				Build(),
			expectedUsage: *genResourceDetails("7", "7G", "18G"),
		},
	}

	for _, tc := range testCases {
		t.Run(tc.name, func(t *testing.T) {
			ctx := context.Background()
			r := &reconciler{
				log:        kubermaticlog.Logger,
				recorder:   &record.FakeRecorder{},
				seedClient: tc.seedClient,
			}

			request := reconcile.Request{NamespacedName: types.NamespacedName{Name: tc.requestName}}
			if _, err := r.Reconcile(ctx, request); err != nil {
				t.Fatalf("reconciling failed: %v", err)
			}

			rq := &kubermaticv1.ResourceQuota{}
			if err := tc.seedClient.Get(ctx, request.NamespacedName, rq); err != nil {
				t.Fatalf("failed to get resource quota: %v", err)
			}

			if !diff.SemanticallyEqual(tc.expectedUsage, rq.Status.LocalUsage) {
				t.Fatalf("Objects differ:\n%v", diff.ObjectDiff(tc.expectedUsage, rq.Status.LocalUsage))
			}
		})
	}
}

func genResourceQuota(name string) *kubermaticv1.ResourceQuota {
	rq := &kubermaticv1.ResourceQuota{}
	rq.Name = name
	rq.Spec = kubermaticv1.ResourceQuotaSpec{
		Subject: kubermaticv1.Subject{
			Name: projectId,
			Kind: kubermaticv1.ProjectSubjectKind,
		},
	}

	return rq
}

func genResourceDetails(cpu, mem, storage string) *kubermaticv1.ResourceDetails {
	return kubermaticv1.NewResourceDetails(resource.MustParse(cpu), resource.MustParse(mem), resource.MustParse(storage))
}

func genCluster(name, projectId, cpu, mem, storage string) *kubermaticv1.Cluster {
	cluster := &kubermaticv1.Cluster{}
	cluster.Name = name
	cluster.Labels = map[string]string{kubermaticv1.ProjectIDLabelKey: projectId}
	cluster.Status.ResourceUsage = genResourceDetails(cpu, mem, storage)

	return cluster
}
```
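The `reconciler` implementation under test is not included in this excerpt. As a rough sketch of what the test expects it to do, the quota aggregation can be illustrated with plain structs and integers standing in for the `kubermaticv1` types and `resource.Quantity` (the names below are illustrative, not the controller's actual code):

```go
package main

import "fmt"

// cluster mirrors only the fields relevant to quota aggregation.
type cluster struct {
	project string
	cpu     int // cores
	memGB   int
	stGB    int
}

// localUsage sums the resources of all clusters belonging to one project,
// as the seed controller does when filling ResourceQuota.Status.LocalUsage.
func localUsage(project string, clusters []cluster) (cpu, mem, st int) {
	for _, c := range clusters {
		if c.project != project {
			continue // clusters labeled with other projects don't count
		}
		cpu += c.cpu
		mem += c.memGB
		st += c.stGB
	}
	return
}

func main() {
	// Same fixture as the test: two project1 clusters plus an "impostor".
	clusters := []cluster{
		{"project1", 2, 5, 10},
		{"project1", 5, 2, 8},
		{"impostor", 3, 3, 3},
	}
	cpu, mem, st := localUsage("project1", clusters)
	fmt.Printf("%d cores, %dG memory, %dG storage\n", cpu, mem, st)
	// prints: 7 cores, 7G memory, 18G storage
}
```

This matches the test's expected usage of "7", "7G", "18G": the third cluster is excluded because its project label differs.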
Malcolm A. E. "Mal" Spence (2 January 1936 – 30 October 2017) was a Jamaican athlete who mainly competed in the 400 metres. His twin brother Melville also competed in track and field; Malcolm died five years and two days after his brother.

Spence competed for the British West Indies in the 1960 Summer Olympics held in Rome, Italy, where he won the bronze medal in the men's 4x400 metres relay with his teammates James Wedderburn, Keith Gardner and George Kerr. Coincidentally, two athletes named Malcolm Spence ran the 400 metres at both the 1956 and 1960 Olympics, and both won bronze medals in 1960: Malcolm Spence of South Africa took the bronze in the individual 400 metres, while the South African relay team finished fourth, one second behind Mal Spence's British West Indies team. Both twins returned to run the 4x400 relay in 1964 as members of the first independent Jamaican team. Living in Florida, he served as a torchbearer for the 1996 Olympics in nearby Atlanta.

Both Mal and his twin brother were recruited to run for Arizona State University during the civil rights movement of the late 1950s, and were among the first international athletes to come to the US for athletics. He is the author of The Lives and Times of Mal and Mel: Three Times Jamaican Olympians, published in 2011.
References 1936 births 2017 deaths Jamaican male sprinters Olympic bronze medalists for the British West Indies Athletes (track and field) at the 1956 Summer Olympics Athletes (track and field) at the 1958 British Empire and Commonwealth Games Athletes (track and field) at the 1959 Pan American Games Athletes (track and field) at the 1960 Summer Olympics Athletes (track and field) at the 1962 British Empire and Commonwealth Games Athletes (track and field) at the 1963 Pan American Games Athletes (track and field) at the 1964 Summer Olympics Athletes (track and field) at the 1966 British Empire and Commonwealth Games Olympic athletes for Jamaica Olympic athletes for the British West Indies Commonwealth Games medallists in athletics Pan American Games gold medalists for the British West Indies Pan American Games bronze medalists for the British West Indies Pan American Games medalists in athletics (track and field) Jamaican twins Medalists at the 1960 Summer Olympics Olympic bronze medalists in athletics (track and field) Commonwealth Games gold medallists for Jamaica Commonwealth Games bronze medallists for Jamaica Central American and Caribbean Games bronze medalists for Jamaica Competitors at the 1962 Central American and Caribbean Games Central American and Caribbean Games gold medalists for Jamaica Central American and Caribbean Games medalists in athletics Medalists at the 1959 Pan American Games Medalists at the 1963 Pan American Games Medallists at the 1958 British Empire and Commonwealth Games Medallists at the 1962 British Empire and Commonwealth Games Athletes from Kingston, Jamaica
Sipalolasma ellioti is a species of spider of the genus Sipalolasma. It is endemic to Sri Lanka. The female is 11 mm in length. See also List of Barychelidae species References Endemic fauna of Sri Lanka Barychelidae Spiders of Asia Spiders described in 1892
```scss $tree-elbow-width: 24px !default; $tree-icon-width: 18px !default; ```
Mamamoo (; Japanese: ; commonly stylized in all caps as MAMAMOO) is a South Korean girl group formed and managed by Rainbow Bridge World. The group is composed of four members: Solar, Moonbyul, Wheein, and Hwasa. Known for their strong live vocals, harmonies, and for writing much of their own material, they are recognized as one of the premier girl groups in K-pop. They are highly regarded in the industry for their talent and their ability to cross multiple genres, from the retro, jazz, and R&B concepts of their early years to more contemporary hip-hop, as well as emotional ballads. Since their debut in 2014, they have been noted for challenging conventional beauty standards, breaking gender stereotypes, and conducting themselves in ways that most typical K-pop stars do not. Mamamoo officially debuted on June 18, 2014, with their first extended play (EP) Hello, which featured the lead single "Mr. Ambiguous". Critics considered it one of the best K-pop debuts of 2014. Mamamoo rose to domestic fame in 2015 with their single "Um Oh Ah Yeh", which became a sleeper hit and peaked at number three on South Korea's Gaon Digital Chart. Combined with their many appearances and victories on the popular music competition program Immortal Songs 2, this launched them into mainstream recognition with the public. Their following release, "You're the Best" (2016) from their debut studio album Melting, proved a commercial success and topped at least eight major music sites upon release, further cementing their popularity and growing momentum. In 2017, they made a splash on the international scene with their fifth EP, Purple; led by the single "Yes I Am", it earned them their first number one on the Billboard World Albums chart.
The quartet went on to achieve consecutive hits with the high-charting singles "Starry Night" (2018), "Egotistic" (2018) and "Gogobebe" (2019) from their "Four Seasons, Four Colors" project, in which every member took turns becoming the focus of each album. This culminated in their biggest international success to date: "HIP", from their second studio album Reality in Black, became a number-one hit on the Billboard World Digital Song Sales chart. In November 2020, the song was certified platinum by the Korea Music & Content Association (KMCA) for surpassing 100 million streams, the third time they had achieved this feat. They also earned a gold certification from the Recording Industry Association of Japan (RIAJ), marking their breakthrough in the Japanese music scene. Mamamoo has received numerous accolades, most notably the Golden Disc Award for Best Group, the Mnet Asian Music Award for Best Vocal Group, multiple honors at the Seoul Music Awards, as well as the Gaon Chart Music Award for New Artist of the Year in their rookie year.

History

2014–2015: Formation, debut and rising popularity

The group's name "Mamamoo" is meant to evoke a baby babbling for the first time: they aim to make music as universally familiar as the word "mama", which babies commonly learn and speak first, with the added sense of approaching their listeners instinctively. Prior to their official debut, Mamamoo collaborated with several artists. Their first collaboration, titled "Don't Be Happy" with Bumkey, was released on January 8, 2014. A second collaboration with K.Will titled "Peppermint Chocolate", featuring Wheesung, was released on February 11, 2014; it debuted at number 11 on the Gaon Digital Chart in its first week. On May 30, 2014, Mamamoo released a collaboration single called "HeeHeeHaHeHo" with rap duo Geeks.
The group made their official debut on June 18, 2014, with the lead single "Mr. Ambiguous" from their first extended play (EP) Hello. The music video for "Mr. Ambiguous" contained cameo appearances from many well-known K-pop industry figures such as CNBLUE's Jonghyun, Baek Ji-young, Wheesung, Jung Joon-young, Bumkey, K.Will, and Rhymer of Brand New Music. The album contained three previously released collaborations and four new songs. The group made their first live appearance on the June 19 episode of M Countdown. In July 2014, Mamamoo released their first original soundtrack contribution, titled "Love Lane", for the Korean drama Marriage, Not Dating.

On November 21, 2014, Mamamoo released their second EP Piano Man with the title track of the same name. The title song peaked at number 41 on the Gaon Digital Chart. By the end of 2014, Mamamoo ranked tenth highest among idol girl groups for digital sales, 19th in album sales, and 11th in overall sales according to Gaon's year-end rankings.

On January 10, 2015, Mamamoo performed a rendition of Joo Hyun-mi's "Wait a Minute" on the singing show Immortal Songs 2, reaching the final round before losing to Kim Kyung-ho. On April 2, 2015, Mamamoo released "Ahh Oop!", the first single of their third EP, Pink Funky. "Ahh Oop!" marked the group's second collaboration with label mate Esna, after she was featured in "Gentleman" on their second EP Piano Man. On June 13, 2015, the group traveled to Ulaanbaatar, Mongolia, to perform at an event sponsored by the South Korean Embassy with Crayon Pop and K-Much. The event was a commemorative concert held in honor of the 25th anniversary of diplomatic ties between South Korea and Mongolia. On June 19, 2015, Mamamoo released their third EP Pink Funky and lead single "Um Oh Ah Yeh". The song was a commercial success, peaking at number three on the Gaon Chart and becoming their first top-three single.
On August 23, 2015, after promotions concluded, Mamamoo held their first fan meeting, entitled "1st Moo Party", for a total of 1,200 fans at the Olympic Park in Seoul. Tickets for the fan meeting sold out within one minute, so the group added an additional meeting for 1,200 more fans the same night. They also held another "Moo Party" in Los Angeles, which took place on October 4, 2015. Mamamoo also collaborated and performed with label mate Basick on the Korean rap survival show Show Me the Money. On August 29, 2015, Mamamoo returned to Immortal Songs 2 with a rendition of Jo Young-nam's "Delilah". On October 31, 2015, they returned to the show again, singing a rendition of Korean trot singer Bae Ho's song "Backwood's Mountain" (두메산골). Their performance earned their first overall win on Immortal Songs 2, with 404 points.

2016–2017: Breakthrough and subsequent EPs

On January 10, 2016, RBW announced Mamamoo's first solo concert since their debut in 2014. The concert, titled 2016 Mamamoo Concert-Moosical, was held on August 13–14, 2016, at the Olympic Hall in Seoul. The 7,000 tickets for the concert sold out in one minute. On January 26, 2016, Mamamoo pre-released an R&B ballad, "I Miss You", from their first full-length album Melting. On February 12, 2016, another track, "1cm/Taller than You", was pre-released with a music video. The full album was released on February 26, 2016, debuting at number 3 on the Gaon Chart. The title track "You're the Best (넌 is 뭔들)" also debuted at number three but peaked at number one the following week, becoming their first number-one single. On March 6, 2016, they received their first music show win with the song "You're the Best" on Inkigayo, followed by wins on Music Bank, M Countdown, and other music shows. They received eight wins in total for the single. On March 16, 2016, Mamamoo performed in Austin, Texas at South By Southwest's K-Pop Night Out.
On August 31, 2016, Mamamoo released the singles "Angel" and "Dab Dab" as subgroups consisting of the vocalists (Solar and Wheein) and the rappers (Moonbyul and Hwasa) respectively. On September 21, 2016, Mamamoo released their follow-up digital single "New York" and an accompanying music video. After promotions for "New York" wrapped up, Mamamoo's agency announced that the group would make their comeback on November 7 with their fourth EP, Memory, led by the single "Décalcomanie". Shortly afterwards, Mamamoo participated in several year-end award shows, while also featuring on the OST track "Love" for the hit TV show Goblin.

On January 19, 2017, Mamamoo announced their second solo concert, titled 2017 Mamamoo Concert Moosical: Curtain Call, which was held on March 3–5, 2017 in Seoul and on August 19–20, 2017 at KBS Busan Hall in Busan. Following the first show of their March 2017 concert in Seoul, Mamamoo received criticism for performing in blackface when, as part of the concert, they played a video containing the members impersonating Bruno Mars while wearing darker makeup, meant to recreate a snippet of the music video for Mars' "Uptown Funk" (2014). The clip was cut from following concert dates and multiple apologies were promptly issued, including one directly from the members, stating that they were "extremely ignorant of blackface and did not understand the implications of our actions. We will be taking time to understand more about our international fans to ensure this never happens again."

The group released their fifth EP Purple with the lead single "Yes I Am" on June 22, 2017. The single quickly climbed to the number-one spot on the Melon real-time chart, and within one day "Yes I Am" set the girl-group record for the highest number of unique listeners in 24 hours on Melon. On June 27, 2017, the song received its first music show win on The Show, followed by wins on Show Champion, M Countdown, and Show! Music Core.
Purple also peaked at number one on the Billboard World Albums chart.

2018–2020: Widespread recognition, Japanese debut and member solo debuts

On January 4, 2018, Mamamoo released a pre-release single called "Paint Me" as a prelude to their upcoming project series "Four Seasons, Four Colors". The goal of the series was to release four mini albums, each combining one color and a matching member's characteristic for each season. The group stated that they wished to show their depth as artists and present a more mature style with this project. Mamamoo started off the "Four Seasons, Four Colors" project with the release of their sixth EP, Yellow Flower, on March 7. Yellow Flower debuted at number one on the Gaon Album Chart, becoming their first number-one album since their 2014 debut. The Latin-inspired dance-pop song "Starry Night" was released alongside the album, with an accompanying music video. "Starry Night" debuted and peaked at number two on the Gaon Digital Chart, earning 44.7 million Gaon Index points in its debut week. It also ranked at number six on the mid-year and number 13 on the year-end editions of the Gaon Digital Chart, making it the fourth-highest-charting song by a girl group on the latter. The song earned the group their first music recording certification since the Gaon Music Chart and Korea Music & Content Association (KMCA) introduced certifications in April 2018; in November 2018, it was certified platinum for 100 million streams, and in July 2019, it was certified platinum for 2.5 million paid digital downloads. On July 1, Mamamoo released "Rainy Season" as the pre-release single from their upcoming album. "Rainy Season" peaked at number two on the Gaon Digital Chart, behind only Blackpink's "Ddu-Du Ddu-Du", and was nominated for Best Female Vocal Performance at the 2018 Genie Music Awards.
The group released their seventh EP, Red Moon, the second installment in the "Four Seasons, Four Colors" series, on July 16. The EP debuted and peaked at number three on the Gaon Album Chart, with 38,000 copies sold in July 2018. It debuted at number four on the Billboard World Albums chart with 1,000 copies sold, marking their best U.S. sales week, and gave the group their first-ever appearance on the Billboard Heatseekers Albums chart, entering at number 25. The Latin-pop lead single "Egotistic", released alongside the EP, peaked at number four on both the Gaon Digital Chart and the Billboard World Digital Song Sales chart, earning the group their fifth top-ten entry on the latter chart.

On August 18–19, Mamamoo held their third headlining concert, titled "2018 Mamamoo Concert 4Seasons S/S", at the SK Olympic Handball Gymnasium in Seoul. Tickets for the concert sold out within two minutes of going on sale. Mamamoo released their debut Japanese single, a re-recorded Japanese version of their 2016 single "Décalcomanie", on October 3 through Victor Entertainment. The CD and digital releases of the single also include the b-side "You Don't Know Me", a pop track that serves as their first original Japanese song. "Décalcomanie" peaked at number 11 on the weekly Oricon Singles Chart, with over 9,000 copies sold. On November 29, the group returned to the Korean-language market with their eighth EP Blue;S, the third installment of the "Four Seasons, Four Colors" series. Blue;S, which focuses on member Solar, peaked at number seven on the Gaon Album Chart, and its lead single "Wind Flower" peaked at number nine on the Gaon Digital Chart. On February 6, 2019, the group released the Japanese version of "Wind Flower", alongside its b-side "Sleep Talk", as their second Japanese-language single. "Wind Flower" peaked at number 16 on the Oricon Singles Chart, selling 7,000 copies.
On March 14, 2019, Mamamoo released their ninth EP and the fourth and final installment of the "Four Seasons, Four Colors" project, White Wind, with the lead single "Gogobebe". White Wind debuted at number one on the Gaon Album Chart, selling nearly 60,000 physical copies by April 2019. "Gogobebe" also performed well commercially, peaking at number five on the Gaon Digital Chart and placing at number 72 on the yearly chart. The song also peaked at number two on both the Billboard Korea K-Pop Hot 100 and the Billboard World Digital Song Sales chart.

On March 27, the group announced their fourth headlining concert, titled "2019 Mamamoo Concert 4Seasons F/W". The concert was held at Jangchung Gymnasium in Seoul on April 19–21 and served as a grand finale for Mamamoo's "Four Seasons, Four Colors" project, launched the previous year in order to re-create the group's identity. On July 24, Mamamoo released a new promotional single titled "Gleam" in collaboration with the brand Davich Eyeglasses. The dance-pop single saw minor success, peaking at number 89 on the Gaon Digital Chart and at number 15 on the Billboard World Digital Song Sales chart.

In August 2019, Mamamoo was announced as one of six teams joining the Mnet reality-competition show Queendom, a "comeback battle" between six trending girl group acts intended to "determine the real number one" when all six release their singles at the same time. Queendom premiered on August 26 and ran for ten episodes, ending on October 31. During the show's finale, the group was crowned the winner, earning the prize of a full-length comeback show to be broadcast on Mnet. As part of the show, Mamamoo released various songs, including a cover of AOA's 2016 hit "Good Luck" and their original finale song "Destiny". Following their Queendom victory, Mamamoo released their second full album, Reality in Black, on November 14, 2019, with the lead single "Hip".
Reality in Black debuted and peaked atop the Gaon Album Chart, while "Hip" peaked at number four on the Digital Chart. "Hip" also debuted at number five on the World Digital Songs Sales chart and peaked atop the chart in its second week, giving the group their very first number-one hit on the chart. On February 19, 2020, Mamamoo released their third Japanese song, "Shampoo", as a digital single. The song, along with the Japanese version of "Hip", is included in the Japanese edition of Reality in Black. The Japanese edition of the album was released on March 11 and peaked at number 27 on the Oricon Albums Chart with 2,500 copies sold. 2020: Focus on members' solo activities and Travel In January 2020, an OSEN interview with Kim Do-hoon, the CEO and executive producer of Mamamoo's record label RBW, was published, in which Kim detailed the trajectory of Mamamoo's career and activities in 2020. He reflected on the successes of Mamamoo's participation on Queendom and their second Korean studio album Reality in Black and confirmed that the members would begin to focus on solo projects, with each member hoping to showcase the individuality they had shown in the "Four Seasons, Four Colors" project series. Moonbyul became the first member to release a solo project in 2020, releasing her debut EP Dark Side of the Moon on February 14. On April 23, Solar released her debut single "Spit It Out" and its accompanying single album. Following the single's debut at number 12 (and subsequent peak at number six), Mamamoo became the second Korean group in which every member had a solo song chart on the Billboard World Digital Songs Sales chart. Mamamoo released a promotional single, "Wanna Be Myself," on September 10. The "nostalgic" and "minimalist" music video was released in conjunction with the song itself and gained 3.8 million views in its first 24 hours. 
The single peaked at number 93 on the Gaon Digital Chart and at number 11 on the Billboard World Digital Songs Sales chart. In October 2020, Mamamoo announced their tenth EP, due out in November. "Dingga" (딩가딩가), the first single from the EP, was pre-released ahead of the album on October 20. The disco-influenced pop song, characterized by themes of isolation and longing in the COVID-19 pandemic, was described by Teen Vogue as a "fun, chill" dance track accompanied by a "striking, retro-inspired music video to match." Led by a "simple melody line and funky vibes," the single earned top ten positions in South Korea and on the Billboard World Digital Songs Sales chart, and further charted in Singapore and Japan. The full EP, titled Travel, and its second single, "Aya", were released on November 3. Travel received mixed reviews for its lack of cohesion, and it peaked at number two on the Gaon Album Chart. The EP broke the group's personal sales records, selling a reported 100,000 copies in its first day and over 128,000 copies in its first week, and additionally topped iTunes charts in 29 countries. Described as a "180-degree flip" from the retro-pop style of "Dingga," the second single from Travel, "Aya," was praised by critics as "alluring" and "captivating" and peaked within the top 40 of the Gaon Digital Chart. 2021–present: Member contract renewals, WAW, Mamamoo+ unit, Mic On On January 22, 2021, RBW announced that Solar and Moonbyul had renewed their contracts with their agency, while Wheein and Hwasa were still discussing contract renewals. On March 30, Hwasa officially renewed her contract with RBW, with Wheein's contract remaining under discussion. RBW later also confirmed that Mamamoo would stay together and would not disband. On May 1, Mamamoo held a global virtual concert on live-streaming platform LIVENow, becoming the first South Korean music act to hold a concert on the platform. 
Their eleventh EP, WAW, was released on June 2, along with the lead single "Where Are We Now". WAW topped iTunes albums charts in 21 countries worldwide, and it sold over 40,000 domestic copies in its first day of release, according to the Hanteo chart. On June 11, RBW announced that Wheein would be leaving the company after she chose not to renew her solo contract with the agency. However, she signed a second contract extension to remain as a member of Mamamoo until at least December 2023. Thus, RBW remains responsible for her activities in the group but is no longer involved in her solo endeavors. On August 31, it was revealed that Wheein had signed an exclusive solo contract with the agency THEL1VE. A compilation album, titled I Say Mamamoo: The Best, was released alongside the lead single "Mumumumuch" on September 15. The album peaked at number eight on the Gaon Album Chart. On March 23, 2022, Mamamoo released the Japanese edition of The Best. In August 2022, it was announced that members Solar and Moonbyul would be forming the group's first official subunit, Mamamoo+, with an expected album release at the end of the month. The group released their 12th EP titled Mic On and the lead single "Illella" on October 11, 2022. Members Adapted from their RBW website profile: Solar, Moonbyul, Wheein, Hwasa Discography Korean albums: Melting (2016), Reality in Black (2019) Japanese albums: 4colors (2019), Reality in Black "Japanese edition" (2020) Concerts Filmography Reality shows Television series Awards and nominations Since their 2014 debut, Mamamoo has received five Mnet Asian Music Awards and five Asia Artist Awards, as well as four each of the Golden Disc Awards, Seoul Music Awards and The Fact Music Awards, for a total of 44 awards from 161 nominations. 
See also List of best-selling girl groups References External links 2014 establishments in South Korea Musical groups established in 2014 K-pop music groups South Korean girl groups Musical groups from Seoul South Korean dance music groups Vocal quartets Melon Music Award winners Victor Entertainment artists South Korean pop music groups South Korean contemporary R&B musical groups South Korean musical quartets MAMA Award winners
```kotlin
package mega.privacy.android.domain.usecase.backup

import com.google.common.truth.Truth
import kotlinx.coroutines.ExperimentalCoroutinesApi
import kotlinx.coroutines.test.runTest
import mega.privacy.android.domain.repository.BackupRepository
import org.junit.jupiter.api.BeforeAll
import org.junit.jupiter.api.BeforeEach
import org.junit.jupiter.api.TestInstance
import org.junit.jupiter.params.ParameterizedTest
import org.junit.jupiter.params.provider.NullAndEmptySource
import org.junit.jupiter.params.provider.ValueSource
import org.mockito.kotlin.any
import org.mockito.kotlin.mock
import org.mockito.kotlin.reset
import org.mockito.kotlin.whenever

/**
 * Test class for [GetDeviceNameUseCase]
 */
@ExperimentalCoroutinesApi
@TestInstance(TestInstance.Lifecycle.PER_CLASS)
internal class GetDeviceNameUseCaseTest {

    private lateinit var underTest: GetDeviceNameUseCase

    private val backupRepository = mock<BackupRepository>()

    @BeforeAll
    fun setUp() {
        underTest = GetDeviceNameUseCase(backupRepository)
    }

    @BeforeEach
    fun resetMocks() {
        reset(backupRepository)
    }

    @ParameterizedTest(name = "returns {0}")
    @NullAndEmptySource
    @ValueSource(strings = ["Samsung S23 Ultra"])
    fun `test that when invoked device name is returned`(deviceName: String?) = runTest {
        whenever(backupRepository.getDeviceName(any())).thenReturn(deviceName)

        val actual = underTest("abcd")
        Truth.assertThat(actual).isEqualTo(deviceName)
    }
}
```
```scss
.table-scroll {
  width: 100%;
  overflow: auto;
}

table {
  min-width: 100%;
  margin: 0;

  &,
  thead,
  tbody,
  tr,
  th,
  td {
    border-collapse: collapse;
    border: 0;
    border-spacing: 2px;
  }

  th,
  td {
    vertical-align: baseline;
    padding: .5rem 0 .5em 1em;
    text-align: left;

    p {
      &:first-child {
        margin-top: 0;
      }

      &:last-child {
        margin-bottom: 0;
      }
    }
  }

  tr {
    background-color: mix($theme-color, $theme-bg, 0%);

    &:nth-child(even) {
      background-color: mix($theme-color, $theme-bg, 5%);
    }
  }

  thead {
    th,
    td {
      border-bottom: 2px solid $theme-fancy-rev;
    }
  }
}
```
Ingavi is a province in the La Paz Department in Bolivia. This is where the Battle of Ingavi occurred on November 18, 1841, and where the World Heritage Site of Tiwanaku is situated. During the presidency of Eliodoro Villazón the province was founded on December 16, 1909, with Viacha as its capital. Geography Ingavi lies on the southern shore of Lake Titicaca. The Chilla-Kimsa Chata mountain range traverses the province. Some of the highest mountains of the province are listed below: Subdivision Ingavi Province is divided into seven municipalities which are partly further subdivided into cantons. Population The people are predominantly indigenous citizens of Aymara descent. Tourist attractions Some of the tourist attractions of the municipalities are: In Viacha Municipality: The town of Viacha Viliroco lagoon, a small man-made lake in Viacha Canton Qalachaka bridge in Viacha Canton "Virgen de Letanías" Sanctuary in Viacha Canton "Pan de Azúcar" mountain in Viacha Canton Fields of the Battle of Ingavi in Viacha Canton In Guaqui Municipality: Guaqui festivity from July 23 to July 25 celebrated in honour of Apostle James Apostle James church of Guaqui built between 1625 and 1784 Guaqui port In Tiwanaku Municipality: archaeological site of Tiwanaku in Tiwanaku Canton Saint Peter church in Tiwanaku Willkakuti, the Andean-Amazonic New Year, celebrated on June 21 of every year at the viewpoint of Kimsa Chata mountain in Tiwanaku Canton Tiwanaku festivity (Señor de la Exaltación) celebrated in Tiwanaku Canton in September In Desaguadero Municipality: The international fair of Desaguadero on the Peruvian border Desaguadero River with its avifauna and native Aymara and Uru communities along its banks. The river runs along the entire Altiplano. 
In San Andrés de Machaca Municipality: Afiani lagoon in San Andrés de Machaca Canton The chullpa of Kañoma in San Andrés de Machaca Canton The church of San Andrés de Machaca, built between 1806 and 1836 In Jesús de Machaca Municipality: The archaeological site of Qhunqhu Wankani in Jesús de Machaca Canton Jesús de Machaca, an indigenous community Yakayuni saltflats in Jesús de Machaca Canton where salt exploitation is possible Uru Iruwit'u community in Jesús de Machaca Canton and the Uru Iruwit'u museum in Jesús de Machaca In Taraco Municipality: Taraco Peninsula in Taraco Canton with its archaeological sites: Chiripa and its museum Qala Uyuni, Pumani and Achacachi Coacollo in Coacollo Iwawi (Kolata Quenacache, Ojje Puku, Awichu and Ch'uxña Qala) in Higuagui Grande, Taraco Canton Sikuya Island in Taraco Canton The church of Taraco dating from 1767 Taraco museum in Taraco Canton See also Awallamaya Lake Ch'alla Jawira Jach'a Jawira Qullpa Jawira Thujsa Jawira Tiwanaku River References www.ine.gov.bo / census 2001 www.ine.gov.bo / estimation 2005 Provinces of La Paz Department (Bolivia)
Johnny Dixon is a fictional American boy featured in a series of twelve children's gothic horror novels, published from 1983 to 1999 and written by John Bellairs or his successor Brad Strickland. In each book, 12-year-old Johnny and his group of friends face and overcome evil forces usually bent on ending the world. The name Johnny Dixon also refers to the book series itself ("Johnny Dixon and the Professor" in ISFDB). The series is set in the early 1950s. Johnny lives with his paternal grandparents in fictional Duston Heights, Massachusetts. His mother has died of cancer some time prior to the beginning of the series, and his father is a fighter pilot in the United States Air Force during the Korean War. Johnny's best friend, history professor Roderick Childermass, lives across the street. In The Mummy, the Will and the Crypt, Johnny meets a boy his own age, Byron Q. "Fergie" Ferguson, at a Boy Scouts camp. Thenceforth, Johnny, Fergie, and Professor Childermass (who is typically referred to as simply "the professor") are the three principal characters of the series. Series bibliography One Johnny Dixon book was outlined by Bellairs and completed by Strickland. See also Lewis Barnavelt (series) Anthony Monday (series) References Johnny Dixon Johnny Dixon Literary characters introduced in 1983 Child characters in literature Characters in children's literature Characters in American novels of the 20th century
```javascript
// Load modules

var Code = require('code');
var Cryptiles = require('..');
var Lab = require('lab');


// Declare internals

var internals = {};


// Test shortcuts

var lab = exports.lab = Lab.script();
var describe = lab.describe;
var it = lab.it;
var expect = Code.expect;


describe('randomString()', function () {

    it('should generate the right length string', function (done) {

        for (var i = 1; i <= 1000; ++i) {
            expect(Cryptiles.randomString(i).length).to.equal(i);
        }

        done();
    });

    it('returns an error on invalid bits size', function (done) {

        expect(Cryptiles.randomString(99999999999999999999).message).to.match(/Failed generating random bits/);
        done();
    });
});

describe('randomBits()', function () {

    it('returns an error on invalid input', function (done) {

        expect(Cryptiles.randomBits(0).message).to.equal('Invalid random bits count');
        done();
    });
});

describe('fixedTimeComparison()', function () {

    var a = Cryptiles.randomString(50000);
    var b = Cryptiles.randomString(150000);

    it('should take the same amount of time comparing different string sizes', function (done) {

        var now = Date.now();
        Cryptiles.fixedTimeComparison(b, a);
        var t1 = Date.now() - now;

        now = Date.now();
        Cryptiles.fixedTimeComparison(b, b);
        var t2 = Date.now() - now;

        expect(t2 - t1).to.be.within(-20, 20);
        done();
    });

    it('should return true for equal strings', function (done) {

        expect(Cryptiles.fixedTimeComparison(a, a)).to.equal(true);
        done();
    });

    it('should return false for different strings (size, a < b)', function (done) {

        expect(Cryptiles.fixedTimeComparison(a, a + 'x')).to.equal(false);
        done();
    });

    it('should return false for different strings (size, a > b)', function (done) {

        expect(Cryptiles.fixedTimeComparison(a + 'x', a)).to.equal(false);
        done();
    });

    it('should return false for different strings (size, a = b)', function (done) {

        expect(Cryptiles.fixedTimeComparison(a + 'x', a + 'y')).to.equal(false);
        done();
    });

    it('should return false when not a string', function (done) {

        expect(Cryptiles.fixedTimeComparison('x', null)).to.equal(false);
        done();
    });

    it('should return false when not a string (left)', function (done) {

        expect(Cryptiles.fixedTimeComparison(null, 'x')).to.equal(false);
        done();
    });
});
```
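The fixed-time comparison exercised by these tests can be illustrated with a small standalone sketch of the underlying technique. Note that `fixedTimeEqual` below is a hypothetical helper written for illustration, not cryptiles' actual implementation: the point is that the loop always walks every character of the first argument, so the time taken does not depend on where the first mismatch occurs.

```javascript
// Conceptual sketch of a constant-time string comparison.
// Hypothetical helper for illustration; the real library differs in detail.
function fixedTimeEqual(a, b) {
    if (typeof a !== 'string' || typeof b !== 'string') {
        return false;
    }

    // When lengths differ, compare `a` against itself so the loop still runs
    // over the same number of characters; the mismatch flag is pre-set to 1.
    var sameLength = (a.length === b.length);
    var other = sameLength ? b : a;
    var mismatch = sameLength ? 0 : 1;

    // Accumulate character differences with XOR instead of returning early.
    for (var i = 0; i < a.length; ++i) {
        mismatch |= (a.charCodeAt(i) ^ other.charCodeAt(i));
    }

    return mismatch === 0;
}
```

This mirrors the behavior the tests assert: equal strings return `true`, any difference in content or length returns `false`, and non-string inputs are rejected without comparison.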
Yeşilalan () is a neighbourhood of the municipality and district of Savur, Mardin Province, Turkey. Its population is 753 (2022). Before the 2013 reorganisation, it was a town (belde). It is populated by Kurds of the Surgucu tribe. References Neighbourhoods in Savur District Kurdish settlements in Mardin Province
Jazz journalism was a term applied to American sensational newspapers in the 1920s. Focused on entertainment, celebrities, sports, scandal and crime, the style was a New York phenomenon, practiced primarily by three new tabloid-size daily newspapers in a fight for circulation. Convenient for readers on subways, the small-format papers were designed to display large page one photographs and headlines for newsstand sales. The tabloids' popularity was controversial, but also influenced the city's and nation's more traditional media, especially when columnist Walter Winchell became popular both in print and on the air. Jazz journalism was a sensationalist style of news that matched its era, the rebellious Roaring Twenties, a dramatic change from the somber news of World War I. The new tabloid newspapers featured provocative headlines and photographs, and stories about entertainment celebrities, sex scandals, and murder trials. History The beginning of jazz journalism was Joseph Medill Patterson's New York Daily News in 1919. It was followed in 1924 by William Randolph Hearst's New York Daily Mirror and Bernarr Macfadden's New York Evening Graphic. As Hearst and the late Joseph Pulitzer had done with broadsheet yellow journalism a quarter-century earlier, the new tabloid journalism battled for circulation with increasingly dramatic page one images and bold headlines. All three New York tabloids emphasized celebrity, scandal, the entertainment world, crime and violence. Broadway and Hollywood entertainment celebrities and the nightclub, music and crime cultures of prohibition were a bigger focus in the tabloids than civic affairs or international news. Many of the visual storytelling innovations were controversial. 
The Evening Graphic invented the "composograph," a composite news photo illustration created in a studio and art department to illustrate stories for which a real photograph could not be taken, such as actor Rudolph Valentino on a hospital operating table, or a private Broadway party featuring a nude chorus girl in a champagne-filled bathtub. When Valentino died in 1926, a composite was made of him in Heaven standing next to opera singer Enrico Caruso, who had died in 1921. In one of the most famous photographs of the era, with a camera strapped to his ankle, reporter Tom Howard of the Daily News secretly took a photograph of Ruth Snyder in the execution chamber as she was being electrocuted at Sing Sing prison in 1928 after being convicted of murder. Among the most notable jazz journalism reporters of the time were Walter Winchell, who began as Broadway columnist in the Evening Graphic, his successor Ed Sullivan, who previously had been the paper's sports editor, and entertainment gossip columnists Louella Parsons and Hedda Hopper. The tabloids were edited for entertainment, and links between the two worlds went beyond the headlines: Winchell was a vaudeville performer before he turned to journalism, Parsons was a movie script writer and Hopper had been an actress. Parsons, who wrote for the Los Angeles Examiner and the New York American, was known for discovering the secrets of celebrities. Winchell moved on from the Graphic to the New York Mirror and to radio. Sullivan went on to the New York Daily News and, eventually, to television. Business Advertisements, which were important for these newspapers, were often for soaps, creams, ointments, and tonics. Advertising revenue for newspapers from 1915–1929 roughly tripled, from $275 million to $800 million, because of the commercial impact the papers had on readers. References Further reading Simon Michael Bessie, Jazz Journalism: The Story of the Tabloid Newspapers, New York: E. P. Dutton, 1938. Jazz culture Music journalism
```cpp
#pragma once

#include "envoy/config/core/v3/base.pb.h"
#include "envoy/http/header_map.h"

#include "absl/container/flat_hash_set.h"
#include "absl/strings/str_format.h"
#include "gmock/gmock.h"

namespace Envoy {
namespace Extensions {
namespace HttpFilters {
namespace ExternalProcessing {

class ExtProcTestUtility {
public:
  // Compare a reference header map to a proto
  static bool headerProtosEqualIgnoreOrder(const ::Envoy::Http::HeaderMap& expected,
                                           const envoy::config::core::v3::HeaderMap& actual);

private:
  // These headers are present in the actual, but cannot be specified in the expected.
  // ignoredHeaders should not be used for equal comparison.
  static const absl::flat_hash_set<std::string> ignoredHeaders();
};

MATCHER_P(HeaderProtosEqual, expected, "HTTP header protos match") {
  return ExtProcTestUtility::headerProtosEqualIgnoreOrder(expected, arg);
}

MATCHER_P(HasNoHeader, key, absl::StrFormat("Headers have no value for \"%s\"", key)) {
  return arg.get(::Envoy::Http::LowerCaseString(std::string(key))).empty();
}

MATCHER_P2(SingleHeaderValueIs, key, value,
           absl::StrFormat("Header \"%s\" equals \"%s\"", key, value)) {
  const auto hdr = arg.get(::Envoy::Http::LowerCaseString(std::string(key)));
  if (hdr.size() != 1) {
    return false;
  }
  return hdr[0]->value() == value;
}

MATCHER_P2(SingleProtoHeaderValueIs, key, value,
           absl::StrFormat("Header \"%s\" equals \"%s\"", key, value)) {
  for (const auto& hdr : arg.headers()) {
    if (key == hdr.key()) {
      return value == hdr.value();
    }
  }
  return false;
}

envoy::config::core::v3::HeaderValue makeHeaderValue(const std::string& key,
                                                     const std::string& value);

} // namespace ExternalProcessing
} // namespace HttpFilters
} // namespace Extensions
} // namespace Envoy
```
```fsharp
module Fable.Tests.Convert

open System
open System.Globalization

open Util.Testing
open Fable.Tests.Util

//-------------------------------------
// Parse and TryParse
//-------------------------------------

let tryParse f initial (value: string) =
    let res = ref initial
#if FABLE_COMPILER
    let success = f(value, res)
#else
    let success = f(value, NumberStyles.Number, CultureInfo("en-US"), res)
#endif
    (success, !res)

let parse f (a: string) =
#if FABLE_COMPILER
    f(a)
#else
    f(a, CultureInfo("en-US"))
#endif

let tests =
  testList "Convert" [
    testCase "System.Double.Parse works" <| fun () ->
        parse Double.Parse "1.5" |> equal 1.5

    testCase "System.Double.TryParse works" <| fun () ->
        tryParse Double.TryParse 0.0 "1" |> equal (true, 1.0)
        tryParse Double.TryParse 0.0 " 1 " |> equal (true, 1.0)
        tryParse Double.TryParse 0.0 "1.5" |> equal (true, 1.5)
        tryParse Double.TryParse 0.0 " 1.5 " |> equal (true, 1.5)
        tryParse Double.TryParse 0.0 "foo" |> equal (false, 0.0)
        tryParse Double.TryParse 0.0 "" |> equal (false, 0.0)
        tryParse Double.TryParse 0.0 " " |> equal (false, 0.0)
        tryParse Double.TryParse 0.0 "9X" |> equal (false, 0.0)
        tryParse Double.TryParse 0.0 "X9" |> equal (false, 0.0)
        tryParse Double.TryParse 0.0 "X9TRE34" |> equal (false, 0.0)
        tryParse Double.TryParse 0.0 "9SayWhat12Huh" |> equal (false, 0.0)
        tryParse Double.TryParse 0.0 "-1.5" |> equal (true, -1.5)

    testCase "System.Decimal.Parse works" <| fun () ->
        parse Decimal.Parse "1.5" |> equal 1.5M

    testCase "System.Decimal.TryParse works" <| fun () ->
        let equal expected (success, actual) =
            match expected with
            | Some expected ->
                equal true success
                equal expected actual
            | None ->
                equal false success
        tryParse Decimal.TryParse 0.0M "1" |> equal (Some 1.0M)
        tryParse Decimal.TryParse 0.0M " 1 " |> equal (Some 1.0M)
        tryParse Decimal.TryParse 0.0M "1.5" |> equal (Some 1.5M)
        tryParse Decimal.TryParse 0.0M " 1.5 " |> equal (Some 1.5M)
        tryParse Decimal.TryParse 0.0M "foo" |> equal None
        tryParse Decimal.TryParse 0.0M "9X" |> equal None
        tryParse Decimal.TryParse 0.0M "X9" |> equal None
        tryParse Decimal.TryParse 0.0M "X9TRE34" |> equal None
        tryParse Decimal.TryParse 0.0M "9SayWhat12Huh" |> equal None
        tryParse Decimal.TryParse 0.0M "-1.5" |> equal (Some -1.5M)

    testCase "System.Single.Parse works" <| fun () ->
        parse Single.Parse "1.5" |> equal 1.5f

    testCase "System.Single.TryParse works" <| fun () ->
        tryParse Single.TryParse 0.0f "1" |> equal (true, 1.0f)
        tryParse Single.TryParse 0.0f " 1 " |> equal (true, 1.0f)
        tryParse Single.TryParse 0.0f "1.5" |> equal (true, 1.5f)
        tryParse Single.TryParse 0.0f " 1.5 " |> equal (true, 1.5f)
        tryParse Single.TryParse 0.0f "foo" |> equal (false, 0.0f)
        tryParse Single.TryParse 0.0f "9X" |> equal (false, 0.0f)
        tryParse Single.TryParse 0.0f "X9" |> equal (false, 0.0f)
        tryParse Single.TryParse 0.0f "X9TRE34" |> equal (false, 0.0f)
        tryParse Single.TryParse 0.0f "9SayWhat12Huh" |> equal (false, 0.0f)
        tryParse Single.TryParse 0.0f "-1.5" |> equal (true, -1.5f)

    testCase "System.Boolean.Parse works" <| fun () ->
        Boolean.Parse "true" |> equal true
        Boolean.Parse "True" |> equal true
        Boolean.Parse " true " |> equal true
        Boolean.Parse "false" |> equal false
        Boolean.Parse "False" |> equal false
        Boolean.Parse " false " |> equal false
        throwsAnyError (fun () -> Boolean.Parse "tru")
        throwsAnyError (fun () -> Boolean.Parse "falsee")

    testCase "System.Boolean.TryParse works" <| fun () ->
        // expect parse success
        Boolean.TryParse "true" |> equal (true, true)
        Boolean.TryParse "True" |> equal (true, true)
        Boolean.TryParse " true " |> equal (true, true)
        Boolean.TryParse "false" |> equal (true, false)
        Boolean.TryParse "False" |> equal (true, false)
        Boolean.TryParse " false " |> equal (true, false)
        // expect parse failure
        Boolean.TryParse "tru" |> equal (false, false)
        Boolean.TryParse "falsee" |> equal (false, false)
        Boolean.TryParse "0" |> equal (false, false)
        Boolean.TryParse "" |> equal (false, false)
        Boolean.TryParse "1" |> equal (false, false)
        Boolean.TryParse null |> equal (false, false)

    testCase "System.SByte.Parse works" <| fun () ->
        SByte.Parse("5") |> equal 5y
        SByte.Parse("-5") |> equal -5y
        SByte.Parse("-128") |> equal -128y
        (fun () -> SByte.Parse("128")) |> throwsError ""
        (fun () -> SByte.Parse("5f")) |> throwsError ""
        (fun () -> SByte.Parse("F")) |> throwsError ""
        (fun () -> SByte.Parse("5o")) |> throwsError ""

    testCase "System.SByte.Parse with hex works" <| fun () ->
        SByte.Parse("55", System.Globalization.NumberStyles.HexNumber) |> equal 85y
        SByte.Parse("5f", System.Globalization.NumberStyles.HexNumber) |> equal 95y
        SByte.Parse("FF", System.Globalization.NumberStyles.HexNumber) |> equal -1y
        (fun () -> SByte.Parse("1FF", System.Globalization.NumberStyles.HexNumber)) |> throwsError ""
        (fun () -> SByte.Parse("5o", System.Globalization.NumberStyles.HexNumber)) |> throwsError ""
        (fun () -> SByte.Parse("o5", System.Globalization.NumberStyles.HexNumber)) |> throwsError ""

    testCase "System.Int16.Parse works" <| fun () ->
        Int16.Parse("5") |> equal 5s
        Int16.Parse("-5") |> equal -5s
        Int16.Parse("-32768") |> equal -32768s
        (fun () -> Int16.Parse("32768")) |> throwsError ""
        (fun () -> Int16.Parse("5f")) |> throwsError ""
        (fun () -> Int16.Parse("FFF")) |> throwsError ""
        (fun () -> Int16.Parse("5fo0")) |> throwsError ""

    testCase "System.Int16.Parse with hex works" <| fun () ->
        Int16.Parse("5555", System.Globalization.NumberStyles.HexNumber) |> equal 21845s
        Int16.Parse("5f", System.Globalization.NumberStyles.HexNumber) |> equal 95s
        Int16.Parse("FFFF", System.Globalization.NumberStyles.HexNumber) |> equal -1s
        (fun () -> Int16.Parse("1FFFF", System.Globalization.NumberStyles.HexNumber)) |> throwsError ""
        (fun () -> Int16.Parse("5foo", System.Globalization.NumberStyles.HexNumber)) |> throwsError ""
        (fun () -> Int16.Parse("foo5", System.Globalization.NumberStyles.HexNumber)) |> throwsError ""

    testCase "System.Int32.Parse works" <| fun () ->
        Int32.Parse("5") |> equal 5
        Int32.Parse("-5") |> equal -5
        Int32.Parse("-2147483648") |> equal -2147483648
        (fun () -> Int32.Parse("2147483648")) |> throwsError ""
        (fun () -> Int32.Parse("5f")) |> throwsError ""
        (fun () -> Int32.Parse("f5")) |> throwsError ""
        (fun () -> Int32.Parse("foo")) |> throwsError ""

    testCase "System.Int32.Parse with hex works" <| fun () ->
        Int32.Parse("555555", System.Globalization.NumberStyles.HexNumber) |> equal 5592405
        Int32.Parse("5f", System.Globalization.NumberStyles.HexNumber) |> equal 95
        Int32.Parse("FFFFFFFF", System.Globalization.NumberStyles.HexNumber) |> equal -1
        (fun () -> Int32.Parse("1FFFFFFFF", System.Globalization.NumberStyles.HexNumber)) |> throwsError ""
        (fun () -> Int32.Parse("5foo", System.Globalization.NumberStyles.HexNumber)) |> throwsError ""
        (fun () -> Int32.Parse("foo5", System.Globalization.NumberStyles.HexNumber)) |> throwsError ""

    testCase "System.Int64.Parse works" <| fun () ->
        Int64.Parse("5") |> equal 5L
        Int64.Parse("-5") |> equal -5L
        Int64.Parse("-9223372036854775808") |> equal -9223372036854775808L
        (fun () -> Int64.Parse("9223372036854775808")) |> throwsError ""
        (fun () -> Int64.Parse("5f")) |> throwsError ""
        (fun () -> Int64.Parse("f5")) |> throwsError ""
        (fun () -> Int64.Parse("foo")) |> throwsError ""

    testCase "System.Int64.Parse with hex works" <| fun () ->
        Int64.Parse("555555", System.Globalization.NumberStyles.HexNumber) |> equal 5592405L
        Int64.Parse("5f", System.Globalization.NumberStyles.HexNumber) |> equal 95L
        Int64.Parse("FFFFFFFFFFFFFFFF", System.Globalization.NumberStyles.HexNumber) |> equal -1L
        (fun () -> Int64.Parse("1FFFFFFFFFFFFFFFF", System.Globalization.NumberStyles.HexNumber)) |> throwsError ""
        (fun () -> Int64.Parse("5foo", System.Globalization.NumberStyles.HexNumber)) |> throwsError ""
        (fun () -> Int64.Parse("foo5", System.Globalization.NumberStyles.HexNumber)) |> throwsError ""

    testCase "System.Int64.TryParse works" <| fun () ->
        tryParse Int64.TryParse 0L "99" |> equal (true, 99L)
        tryParse Int64.TryParse 0L "foo" |> equal (false, 0L)

    testCase "System.UInt32.TryParse works" <| fun () ->
        tryParse UInt32.TryParse 0u "99" |> equal (true, 99u)
        tryParse UInt32.TryParse 0u "foo" |> equal (false, 0u)

    testCase "System.UInt64.TryParse works" <| fun () ->
        tryParse UInt64.TryParse 0UL "99" |> equal (true, 99UL)
        tryParse UInt64.TryParse 0UL "foo" |> equal (false, 0UL)

    testCase "Parsing integers with different radices works" <| fun () ->
        equal 11 (int "11")
        equal 17 (int "0x11")
        equal 9 (int "0o11")
        equal 3 (int "0b11")

    testCase "System.Int32.TryParse works" <| fun () ->
        tryParse Int32.TryParse 0 "1" |> equal (true, 1)
        tryParse Int32.TryParse 0 " 1 " |> equal (true, 1)
        tryParse Int32.TryParse 0 "1.5" |> equal (false, 0)
        tryParse Int32.TryParse 0 " 1.5 " |> equal (false, 0)
        tryParse Int32.TryParse 0 "foo" |> equal (false, 0)
        tryParse Int32.TryParse 0 "9X" |> equal (false, 0)
        tryParse Int32.TryParse 0 "X9" |> equal (false, 0)
        tryParse Int32.TryParse 0 "X9TRE34" |> equal (false, 0)
        tryParse Int32.TryParse 0 "9SayWhat12Huh" |> equal (false, 0)
        tryParse Int32.TryParse 0 "-1" |> equal (true, -1)

    testCase "BigInt.TryParse works" <| fun () ->
        tryParse bigint.TryParse 0I "4234523548923954" |> equal (true, 4234523548923954I)
        tryParse bigint.TryParse 0I "9SayWhat12Huh" |> equal (false, 0I)

    testCase "System.Int32.ToString works" <| fun () ->
        (5592405).ToString() |> equal "5592405"

    testCase "System.Int32.ToString 'd' works" <| fun () ->
        (5592405).ToString("d") |> equal "5592405"
        (5592405).ToString("d10") |> equal "0005592405"

    testCase "System.Int32.ToString 'x' works" <| fun () ->
        (5592405).ToString("x") |> equal "555555"
        (5592405).ToString("x10") |> equal "0000555555"

    testCase "System.Int64.ToString works" <| fun () ->
        (5592405L).ToString() |> equal "5592405"

    testCase "System.Int64.ToString 'd' works" <| fun () ->
        (5592405L).ToString("d") |> equal "5592405"
        (5592405L).ToString("d10") |> equal "0005592405"

    testCase "System.Int64.ToString 'x' works" <| fun () ->
        (5592405L).ToString("x") |> equal "555555"
        (5592405L).ToString("x10") |> equal "0000555555"

    testCase "System.BigInt.ToString works" <| fun () ->
        (5592405I).ToString() |> equal "5592405"

    testCase "System.Decimal.ToString works" <| fun () ->
        (5592405M).ToString() |> equal "5592405"

    //-------------------------------------
    // System.Convert
    //-------------------------------------

    testCase "System.Convert.ToSByte works" <| fun () ->
        let x = 1y
        sbyte(1y) |> equal x
        sbyte(1uy) |> equal x
        sbyte(257s) |> equal x
        sbyte(257) |> equal x
        sbyte(257L) |> equal x
        sbyte(257u) |> equal x
        sbyte(257us) |> equal x
        sbyte(257ul) |> equal x
        sbyte(257uL) |> equal x
        sbyte(1.f) |> equal x
        sbyte(1.) |> equal x
        sbyte(1.4) |> equal x
        sbyte(1.5) |> equal x
        sbyte(1.6) |> equal x
        sbyte(1.m) |> equal x
        sbyte("1") |> equal x
        (fun () -> sbyte("1.4")) |> throwsError ""
        (fun () -> sbyte("foo")) |> throwsError ""
        Convert.ToSByte(1y) |> equal x
        Convert.ToSByte(1uy) |> equal x
        Convert.ToSByte(1s) |> equal x
        Convert.ToSByte(1) |> equal x
        Convert.ToSByte(1L) |> equal x
        Convert.ToSByte(1u) |> equal x
        Convert.ToSByte(1us) |> equal x
        Convert.ToSByte(1ul) |> equal x
        Convert.ToSByte(1uL) |> equal x
        Convert.ToSByte(1.f) |> equal x
        Convert.ToSByte(1.) |> equal x
        Convert.ToSByte(1.m) |> equal x
        Convert.ToSByte(1.4) |> equal x
        Convert.ToSByte(1.5) |> equal (x+x)
        Convert.ToSByte(1.6) |> equal (x+x)
        Convert.ToSByte(2.5) |> equal (x+x)
        Convert.ToSByte(2.6) |> equal (x+x+x)
        Convert.ToSByte(3.5) |> equal (x+x+x+x)
        Convert.ToSByte("1") |> equal x
        Convert.ToSByte('a') |> equal 97y
        (fun () -> Convert.ToSByte("1.4")) |> throwsError ""
        (fun () -> Convert.ToSByte("foo")) |> throwsError ""

    testCase "System.Convert.ToInt16 works" <| fun () ->
        let x = 1s
        int16(1y) |> equal x
        int16(1uy) |> equal x
        int16(1s) |> equal x
        int16(65537) |> equal x
        int16(65537L) |> equal x
        int16(65537u) |> equal x
        int16(1us) |> equal x
        int16(65537ul) |> equal x
        int16(65537uL) |> equal x
        int16(1.f) |> equal x
        int16(1.) |> equal x
        int16(1.4) |> equal x
        int16(1.5) |> equal x
        int16(1.6) |> equal x
        int16(1.m) |> equal x
        int16("1") |> equal x
        (fun () -> int16("1.4")) |> throwsError ""
        (fun () -> int16("foo")) |> throwsError ""
        Convert.ToInt16(1y) |> equal x
        Convert.ToInt16(1uy) |> equal x
        Convert.ToInt16(1s) |> equal x
        Convert.ToInt16(1) |> equal x
        Convert.ToInt16(1L) |> equal x
        Convert.ToInt16(1u) |> equal x
        Convert.ToInt16(1us) |> equal x
        Convert.ToInt16(1ul) |> equal x
        Convert.ToInt16(1uL) |> equal x
        Convert.ToInt16(1.f) |> equal x
        Convert.ToInt16(1.) |> equal x
        Convert.ToInt16(1.m) |> equal x
        Convert.ToInt16(1.4) |> equal x
        Convert.ToInt16(1.5) |> equal (x+x)
        Convert.ToInt16(1.6) |> equal (x+x)
        Convert.ToInt16(2.5) |> equal (x+x)
        Convert.ToInt16(2.6) |> equal (x+x+x)
        Convert.ToInt16(3.5) |> equal (x+x+x+x)
        Convert.ToInt16("1") |> equal x
        Convert.ToInt16('a') |> equal 97s
        (fun () -> Convert.ToInt16("1.4")) |> throwsError ""
        (fun () -> Convert.ToInt16("foo")) |> throwsError ""

    testCase "System.Convert.ToInt32 works" <| fun () ->
        let x = 1
        int(1y) |> equal x
        int(1uy) |> equal x
        int(1s) |> equal x
        int(1) |> equal x
        int(1L) |> equal x
        int(1u) |> equal x
        int(1us) |> equal x
        int(1ul) |> equal x
        int(1uL) |> equal x
        int(1.f) |> equal x
        int(1.) |> equal x
        int(1.4) |> equal x
        int(1.5) |> equal x
        int(1.6) |> equal x
        int(1.m) |> equal x
        int("1") |> equal x
        (fun () -> int("1.4")) |> throwsError ""
        (fun () -> int("foo")) |> throwsError ""
        Convert.ToInt32(1y) |> equal x
        Convert.ToInt32(1uy) |> equal x
        Convert.ToInt32(1s) |> equal x
        Convert.ToInt32(1) |> equal x
        Convert.ToInt32(1L) |> equal x
        Convert.ToInt32(1u) |> equal x
        Convert.ToInt32(1us) |> equal x
        Convert.ToInt32(1ul) |> equal x
        Convert.ToInt32(1uL) |> equal x
        Convert.ToInt32(1.f) |> equal x
        Convert.ToInt32(1.) |> equal x
        Convert.ToInt32(1.m) |> equal x
        Convert.ToInt32(1.4) |> equal x
        Convert.ToInt32(1.5) |> equal (x+x)
        Convert.ToInt32(1.6) |> equal (x+x)
        Convert.ToInt32(2.5) |> equal (x+x)
        Convert.ToInt32(2.6) |> equal (x+x+x)
        Convert.ToInt32(3.5) |> equal (x+x+x+x)
        Convert.ToInt32("1") |> equal x
        Convert.ToInt32('a') |> equal 97
        (fun () -> Convert.ToInt32("1.4")) |> throwsError ""
        (fun () -> Convert.ToInt32("foo")) |> throwsError ""

    testCase "Special cases conversion to/from Int64 work" <| fun () ->
        let xn = -1L
        let xnu = 0xFFFFFFFFFFFFFFFFuL
        -1 |> uint64 |> equal xnu
        0xFFFFFFFFu |> int64 |> equal 0xFFFFFFFFL
        xn |> uint64 |> equal xnu
        xnu |> int64 |> equal -1L
        xn |> int32 |> equal -1
        xn |> uint32 |> equal 0xFFFFFFFFu
        xnu |> int32 |> equal -1
        xnu |> uint32 |> equal 0xFFFFFFFFu

    testCase "Special cases conversion to UInt64 work" <| fun () -> // See #1880
        uint64 "0x9fffffffffffffff" |> equal 11529215046068469759UL
        uint64 "0xafffffffffffffff" |> equal 12682136550675316735UL
        uint64 "0xAFFFFFFFFFFFFFFF" |> equal 12682136550675316735UL
        uint64 "0x9fffffff_ffffffff" |> equal 11529215046068469759UL
        uint64 "0x9fff_ffff_ffff_ffff" |> equal 11529215046068469759UL

    testCase "System.Convert.ToInt64 works" <| fun () ->
        let x = 1L
        int64(1y) |> equal x
        int64(1uy) |> equal x
        int64(1s) |> equal x
        int64(1) |> equal x
        int64(1L) |> equal x
        int64(1u) |> equal x
        int64(1us) |> equal x
        int64(1ul) |> equal x
        int64(1uL) |> equal x
        int64(1.f) |> equal x
        int64(1.) |> equal x
        int64(1.4) |> equal x
        int64(1.5) |> equal x
        int64(1.6) |> equal x
        int64(1.m) |> equal x
        int64("1") |> equal x
        (fun () -> int64("1.4")) |> throwsError ""
        (fun () -> int64("foo")) |> throwsError ""
        Convert.ToInt64(1y) |> equal x
        Convert.ToInt64(1uy) |> equal x
        Convert.ToInt64(1s) |> equal x
        Convert.ToInt64(1) |> equal x
        Convert.ToInt64(1L) |> equal x
        Convert.ToInt64(1u) |> equal x
        Convert.ToInt64(1us) |> equal x
        Convert.ToInt64(1ul) |> equal x
        Convert.ToInt64(1uL) |> equal x
        Convert.ToInt64(1.f) |> equal x
        Convert.ToInt64(1.)
|> equal x Convert.ToInt64(1.m) |> equal x Convert.ToInt64(1.4) |> equal x Convert.ToInt64(1.5) |> equal (x+x) Convert.ToInt64(1.6) |> equal (x+x) Convert.ToInt64(2.5) |> equal (x+x) Convert.ToInt64(2.6) |> equal (x+x+x) Convert.ToInt64(3.5) |> equal (x+x+x+x) Convert.ToInt64("1") |> equal x Convert.ToInt64('a') |> equal 97L (fun () -> Convert.ToInt64("1.4")) |> throwsError "" (fun () -> Convert.ToInt64("foo")) |> throwsError "" testCase "System.Convert.ToByte works" <| fun () -> let x = 1uy byte(1y) |> equal x byte(1uy) |> equal x byte(257s) |> equal x byte(257) |> equal x byte(257L) |> equal x byte(257u) |> equal x byte(257us) |> equal x byte(257ul) |> equal x byte(257uL) |> equal x byte(1.f) |> equal x byte(1.) |> equal x byte(1.4) |> equal x byte(1.5) |> equal x byte(1.6) |> equal x byte(1.m) |> equal x byte("1") |> equal x (fun () -> byte("1.4")) |> throwsError "" (fun () -> byte("foo")) |> throwsError "" Convert.ToByte(1y) |> equal x Convert.ToByte(1uy) |> equal x Convert.ToByte(1s) |> equal x Convert.ToByte(1) |> equal x Convert.ToByte(1L) |> equal x Convert.ToByte(1u) |> equal x Convert.ToByte(1us) |> equal x Convert.ToByte(1ul) |> equal x Convert.ToByte(1uL) |> equal x Convert.ToByte(1.f) |> equal x Convert.ToByte(1.) |> equal x Convert.ToByte(1.m) |> equal x Convert.ToByte(1.4) |> equal x Convert.ToByte(1.5) |> equal (x+x) Convert.ToByte(1.6) |> equal (x+x) Convert.ToByte(2.5) |> equal (x+x) Convert.ToByte(2.6) |> equal (x+x+x) Convert.ToByte(3.5) |> equal (x+x+x+x) Convert.ToByte("1") |> equal x Convert.ToByte('a') |> equal 97uy (fun () -> Convert.ToByte("1.4")) |> throwsError "" (fun () -> Convert.ToByte("foo")) |> throwsError "" testCase "System.Convert.ToUInt16 works" <| fun () -> let x = 1us uint16(1y) |> equal x uint16(1uy) |> equal x uint16(1s) |> equal x uint16(65537) |> equal x uint16(65537L) |> equal x uint16(65537u) |> equal x uint16(1us) |> equal x uint16(65537ul) |> equal x uint16(65537uL) |> equal x uint16(1.f) |> equal x uint16(1.) 
|> equal x uint16(1.4) |> equal x uint16(1.5) |> equal x uint16(1.6) |> equal x uint16(1.m) |> equal x uint16("1") |> equal x (fun () -> uint16("1.4")) |> throwsError "" (fun () -> uint16("foo")) |> throwsError "" Convert.ToUInt16(1y) |> equal x Convert.ToUInt16(1uy) |> equal x Convert.ToUInt16(1s) |> equal x Convert.ToUInt16(1) |> equal x Convert.ToUInt16(1L) |> equal x Convert.ToUInt16(1u) |> equal x Convert.ToUInt16(1us) |> equal x Convert.ToUInt16(1ul) |> equal x Convert.ToUInt16(1uL) |> equal x Convert.ToUInt16(1.f) |> equal x Convert.ToUInt16(1.) |> equal x Convert.ToUInt16(1.m) |> equal x Convert.ToUInt16(1.4) |> equal x Convert.ToUInt16(1.5) |> equal (x+x) Convert.ToUInt16(1.6) |> equal (x+x) Convert.ToUInt16(2.5) |> equal (x+x) Convert.ToUInt16(2.6) |> equal (x+x+x) Convert.ToUInt16(3.5) |> equal (x+x+x+x) Convert.ToUInt16("1") |> equal x Convert.ToUInt16('a') |> equal 97us (fun () -> Convert.ToUInt16("1.4")) |> throwsError "" (fun () -> Convert.ToUInt16("foo")) |> throwsError "" testCase "System.Convert.ToUInt32 works" <| fun () -> let x = 1u uint32(1y) |> equal x uint32(1uy) |> equal x uint32(1s) |> equal x uint32(1) |> equal x uint32(1L) |> equal x uint32(1u) |> equal x uint32(1us) |> equal x uint32(1ul) |> equal x uint32(1uL) |> equal x uint32(1.f) |> equal x uint32(1.) |> equal x uint32(1.4) |> equal x uint32(1.5) |> equal x uint32(1.6) |> equal x uint32(1.m) |> equal x uint32("1") |> equal x (fun () -> uint32("1.4")) |> throwsError "" (fun () -> uint32("foo")) |> throwsError "" Convert.ToUInt32(1y) |> equal x Convert.ToUInt32(1uy) |> equal x Convert.ToUInt32(1s) |> equal x Convert.ToUInt32(1) |> equal x Convert.ToUInt32(1L) |> equal x Convert.ToUInt32(1u) |> equal x Convert.ToUInt32(1us) |> equal x Convert.ToUInt32(1ul) |> equal x Convert.ToUInt32(1uL) |> equal x Convert.ToUInt32(1.f) |> equal x Convert.ToUInt32(1.) 
|> equal x Convert.ToUInt32(1.m) |> equal x Convert.ToUInt32(1.4) |> equal x Convert.ToUInt32(1.5) |> equal (x+x) Convert.ToUInt32(1.6) |> equal (x+x) Convert.ToUInt32(2.5) |> equal (x+x) Convert.ToUInt32(2.6) |> equal (x+x+x) Convert.ToUInt32(3.5) |> equal (x+x+x+x) Convert.ToUInt32("1") |> equal x Convert.ToUInt32('a') |> equal 97u (fun () -> Convert.ToUInt32("1.4")) |> throwsError "" (fun () -> Convert.ToUInt32("foo")) |> throwsError "" testCase "System.Convert.ToUInt64 works" <| fun () -> let x = 1uL uint64(1y) |> equal x uint64(1uy) |> equal x uint64(1s) |> equal x uint64(1) |> equal x uint64(1L) |> equal x uint64(1u) |> equal x uint64(1us) |> equal x uint64(1ul) |> equal x uint64(1uL) |> equal x uint64(1.f) |> equal x uint64(1.) |> equal x uint64(1.4) |> equal x uint64(1.5) |> equal x uint64(1.6) |> equal x uint64(1.m) |> equal x uint64("1") |> equal x (fun () -> uint64("1.4")) |> throwsError "" (fun () -> uint64("foo")) |> throwsError "" Convert.ToUInt64(1y) |> equal x Convert.ToUInt64(1uy) |> equal x Convert.ToUInt64(1s) |> equal x Convert.ToUInt64(1) |> equal x Convert.ToUInt64(1L) |> equal x Convert.ToUInt64(1u) |> equal x Convert.ToUInt64(1us) |> equal x Convert.ToUInt64(1ul) |> equal x Convert.ToUInt64(1uL) |> equal x Convert.ToUInt64(1.f) |> equal x Convert.ToUInt64(1.) 
|> equal x Convert.ToUInt64(1.m) |> equal x Convert.ToUInt64(1.4) |> equal x Convert.ToUInt64(1.5) |> equal (x+x) Convert.ToUInt64(1.6) |> equal (x+x) Convert.ToUInt64(2.5) |> equal (x+x) Convert.ToUInt64(2.6) |> equal (x+x+x) Convert.ToUInt64(3.5) |> equal (x+x+x+x) Convert.ToUInt64("1") |> equal x Convert.ToUInt64('a') |> equal 97UL (fun () -> Convert.ToUInt64("1.4")) |> throwsError "" (fun () -> Convert.ToUInt64("foo")) |> throwsError "" testCase "Convert between (un)signed long" <| fun () -> // See #1485 int64 System.UInt64.MaxValue |> equal -1L uint64 -1L |> equal System.UInt64.MaxValue testCase "int64 can parse signed longs" <| fun () -> // See #1586 let a = int64 "5" let b = int64 "-5" let c = int64 "+5" equal 5L a equal -5L b a = b |> equal false a = c |> equal true testCase "System.Convert.ToSingle works" <| fun () -> let x = 1.f float32(1y) |> equal x float32(1uy) |> equal x float32(1s) |> equal x float32(1) |> equal x float32(1L) |> equal x float32(1u) |> equal x float32(1us) |> equal x float32(1ul) |> equal x float32(1uL) |> equal x float32(1.f) |> equal x float32(1.) |> equal x float32(1.m) |> equal x float32("1.") |> equal x (fun () -> float32("foo")) |> throwsError "" Convert.ToSingle(1y) |> equal x Convert.ToSingle(1uy) |> equal x Convert.ToSingle(1s) |> equal x Convert.ToSingle(1) |> equal x Convert.ToSingle(1L) |> equal x Convert.ToSingle(1u) |> equal x Convert.ToSingle(1us) |> equal x Convert.ToSingle(1ul) |> equal x Convert.ToSingle(1uL) |> equal x Convert.ToSingle(1.f) |> equal x Convert.ToSingle(1.) |> equal x Convert.ToSingle(1.m) |> equal x parse Convert.ToSingle "1." |> equal x (fun () -> Convert.ToSingle("foo")) |> throwsError "" testCase "System.Convert.ToDouble works" <| fun () -> let x = 1. float(1y) |> equal x float(1uy) |> equal x float(1s) |> equal x float(1) |> equal x float(1L) |> equal x float(1u) |> equal x float(1us) |> equal x float(1ul) |> equal x float(1uL) |> equal x float(1.f) |> equal x float(1.) 
|> equal x float(1.m) |> equal x float("1.") |> equal x (fun () -> float("foo")) |> throwsError "" Convert.ToDouble(1y) |> equal x Convert.ToDouble(1uy) |> equal x Convert.ToDouble(1s) |> equal x Convert.ToDouble(1) |> equal x Convert.ToDouble(1L) |> equal x Convert.ToDouble(1u) |> equal x Convert.ToDouble(1us) |> equal x Convert.ToDouble(1ul) |> equal x Convert.ToDouble(1uL) |> equal x Convert.ToDouble(1.f) |> equal x Convert.ToDouble(1.) |> equal x Convert.ToDouble(1.m) |> equal x parse Convert.ToDouble "1." |> equal x (fun () -> Convert.ToDouble("foo")) |> throwsError "" testCase "System.Convert.ToDecimal works" <| fun () -> let x = 1.m decimal(1y) |> equal x decimal(1uy) |> equal x decimal(1s) |> equal x decimal(1) |> equal x decimal(1L) |> equal x decimal(1u) |> equal x decimal(1us) |> equal x decimal(1ul) |> equal x decimal(1.f) |> equal x decimal(1.) |> equal x decimal(1.m) |> equal x decimal("1.") |> equal x (fun () -> decimal("foo")) |> throwsError "" Convert.ToDecimal(1y) |> equal x Convert.ToDecimal(1uy) |> equal x Convert.ToDecimal(1s) |> equal x Convert.ToDecimal(1) |> equal x Convert.ToDecimal(1L) |> equal x Convert.ToDecimal(1u) |> equal x Convert.ToDecimal(1us) |> equal x Convert.ToDecimal(1ul) |> equal x Convert.ToDecimal(1uL) |> equal x Convert.ToDecimal(1.f) |> equal x Convert.ToDecimal(1.) |> equal x Convert.ToDecimal(1.m) |> equal x parse Convert.ToDecimal "1." |> equal x (fun () -> Convert.ToDecimal("foo")) |> throwsError "" // String to number convertions (with base) testCase "System.Convert.ToSByte with base works" <| fun () -> let x = "101" Convert.ToSByte(x) |> equal 101y Convert.ToSByte(x, 2) |> equal 5y Convert.ToSByte(x, 8) |> equal 65y Convert.ToSByte(x, 10) |> equal 101y #if FABLE_COMPILER (fun () -> Convert.ToSByte("255", 2)) |> throwsError "" #else (fun () -> Convert.ToSByte("255", 2)) |> throwsError "Could not find any recognizable digits." 
#endif testCase "System.Convert.ToInt16 with base works" <| fun () -> let x = "101" Convert.ToInt16(x) |> equal 101s Convert.ToInt16(x, 2) |> equal 5s Convert.ToInt16(x, 8) |> equal 65s Convert.ToInt16(x, 10) |> equal 101s Convert.ToInt16(x, 16) |> equal 257s #if FABLE_COMPILER (fun () -> Convert.ToInt16("255", 2)) |> throwsError "" #else (fun () -> Convert.ToInt16("255", 2)) |> throwsError "Could not find any recognizable digits." #endif testCase "System.Convert.ToInt32 with base works" <| fun () -> let x = "101" Convert.ToInt32(x) |> equal 101 Convert.ToInt32(x, 2) |> equal 5 Convert.ToInt32(x, 8) |> equal 65 Convert.ToInt32(x, 10) |> equal 101 Convert.ToInt32(x, 16) |> equal 257 #if FABLE_COMPILER (fun () -> Convert.ToInt32("255", 2)) |> throwsError "" #else (fun () -> Convert.ToInt32("255", 2)) |> throwsError "Could not find any recognizable digits." #endif testCase "System.Convert.ToInt64 with base works" <| fun () -> let x = "101" Convert.ToInt64(x) |> equal 101L Convert.ToInt64(x, 2) |> equal 5L Convert.ToInt64(x, 8) |> equal 65L Convert.ToInt64(x, 10) |> equal 101L Convert.ToInt64(x, 16) |> equal 257L #if FABLE_COMPILER (fun () -> Convert.ToInt64("255", 2)) |> throwsError "" #else (fun () -> Convert.ToInt64("255", 2)) |> throwsError "Could not find any recognizable digits." #endif testCase "System.Convert.ToByte with base works" <| fun () -> let x = "101" Convert.ToByte(x) |> equal 101uy Convert.ToByte(x, 2) |> equal 5uy Convert.ToByte(x, 8) |> equal 65uy Convert.ToByte(x, 10) |> equal 101uy #if FABLE_COMPILER (fun () -> Convert.ToByte("255", 2)) |> throwsError "" #else (fun () -> Convert.ToByte("255", 2)) |> throwsError "Could not find any recognizable digits." 
#endif testCase "System.Convert.ToUInt16 with base works" <| fun () -> let x = "101" Convert.ToUInt16(x) |> equal 101us Convert.ToUInt16(x, 2) |> equal 5us Convert.ToUInt16(x, 8) |> equal 65us Convert.ToUInt16(x, 10) |> equal 101us Convert.ToUInt16(x, 16) |> equal 257us #if FABLE_COMPILER (fun () -> Convert.ToUInt16("255", 2)) |> throwsError "" #else (fun () -> Convert.ToUInt16("255", 2)) |> throwsError "Could not find any recognizable digits." #endif testCase "System.Convert.ToUInt32 with base works" <| fun () -> let x = "101" Convert.ToUInt32(x) |> equal 101u Convert.ToUInt32(x, 2) |> equal 5u Convert.ToUInt32(x, 8) |> equal 65u Convert.ToUInt32(x, 10) |> equal 101u Convert.ToUInt32(x, 16) |> equal 257u #if FABLE_COMPILER (fun () -> Convert.ToUInt32("255", 2)) |> throwsError "" #else (fun () -> Convert.ToUInt32("255", 2)) |> throwsError "Could not find any recognizable digits." #endif testCase "System.Convert.ToUInt64 with base works" <| fun () -> let x = "101" Convert.ToUInt64(x) |> equal 101uL Convert.ToUInt64(x, 2) |> equal 5uL Convert.ToUInt64(x, 8) |> equal 65uL Convert.ToUInt64(x, 10) |> equal 101uL Convert.ToUInt64(x, 16) |> equal 257uL #if FABLE_COMPILER (fun () -> Convert.ToUInt64("255", 2)) |> throwsError "" #else (fun () -> Convert.ToUInt64("255", 2)) |> throwsError "Could not find any recognizable digits." 
#endif // Number to string convertions (with base) testCase "System.Convert.ToString with base works" <| fun () -> Convert.ToString(Byte.MaxValue,2) |> equal "11111111" Convert.ToString(Int16.MaxValue,2) |> equal "111111111111111" Convert.ToString(Int32.MaxValue,2) |> equal "1111111111111111111111111111111" Convert.ToString(Int64.MaxValue,2) |> equal "111111111111111111111111111111111111111111111111111111111111111" testCase "System.Convert.ToString SByte works" <| fun () -> let x = "101" Convert.ToString(101y) |> equal x testCase "System.Convert.ToString Int16 works" <| fun () -> let x = "101" Convert.ToString(101s) |> equal x Convert.ToString(5s, 2) |> equal x Convert.ToString(65s, 8) |> equal x Convert.ToString(101s, 10) |> equal x Convert.ToString(257s, 16) |> equal x Convert.ToString(-5s, 16) |> equal "fffb" testCase "System.Convert.ToString Int32 works" <| fun () -> let x = "101" Convert.ToString(101) |> equal x Convert.ToString(5, 2) |> equal x Convert.ToString(65, 8) |> equal x Convert.ToString(101, 10) |> equal x Convert.ToString(257, 16) |> equal x Convert.ToString(-5, 16) |> equal "fffffffb" testCase "System.Convert.ToString Int64 works" <| fun () -> let x = "101" Convert.ToString(101L) |> equal x Convert.ToString(5L, 2) |> equal x Convert.ToString(65L, 8) |> equal x Convert.ToString(101L, 10) |> equal x Convert.ToString(257L, 16) |> equal x // TODO long.js lib always use negative sign to convert negative longs to strings // Convert.ToString(-5L, 16) |> equal "fffffffffffffffb" testCase "System.Convert.ToString Byte works" <| fun () -> let x = "101" Convert.ToString(101uy) |> equal x Convert.ToString(5uy, 2) |> equal x Convert.ToString(65uy, 8) |> equal x Convert.ToString(101uy, 10) |> equal x testCase "System.Convert.ToString UInt16 works" <| fun () -> let x = "101" Convert.ToString(101us) |> equal x testCase "System.Convert.ToString UInt32 works" <| fun () -> let x = "101" Convert.ToString(101u) |> equal x testCase "System.Convert.ToString UInt64 works" 
<| fun () -> let x = "101" Convert.ToString(101uL) |> equal x testCase "System.Convert.ToString Single works" <| fun () -> let x = "101" Convert.ToString(101.f) |> equal x testCase "System.Convert.ToString Double works" <| fun () -> let x = "101" Convert.ToString(101.) |> equal x testCase "System.Convert.ToString Decimal works" <| fun () -> let x = "101" Convert.ToString(101.m) |> equal x testCase "FSharp.Core type converters can combined via the >> operator" <| fun () -> "1" |> (sbyte >> Ok) |> equal (Ok 1y) "1" |> (int16 >> Ok) |> equal (Ok 1s) "1" |> (int >> Ok) |> equal (Ok 1) "1" |> (int64 >> Ok) |> equal (Ok 1L) "1" |> (byte >> Ok) |> equal (Ok 1uy) "1" |> (uint16 >> Ok) |> equal (Ok 1us) "1" |> (uint32 >> Ok) |> equal (Ok 1u) "1" |> (uint64 >> Ok) |> equal (Ok 1uL) "1" |> (float32 >> Ok) |> equal (Ok 1.f) "1" |> (float >> Ok) |> equal (Ok 1.) "1" |> (double >> Ok) |> equal (Ok 1.) "1" |> (decimal >> Ok) |> equal (Ok 1.m) //------------------------------------- // System.BitConverter //------------------------------------- testCase "BitConverter.IsLittleEndian works" <| fun () -> BitConverter.IsLittleEndian |> equal true testCase "BitConverter.GetBytes Boolean works" <| fun () -> let value = true let bytes = BitConverter.GetBytes(value) bytes |> equal [| 1uy |] testCase "BitConverter.GetBytes Char works" <| fun () -> let value = 'A' let bytes = BitConverter.GetBytes(value) bytes |> equal [| 65uy; 0uy |] testCase "BitConverter.GetBytes Int16 works" <| fun () -> let value = 0x0102s let bytes = BitConverter.GetBytes(value) bytes |> equal [| 2uy; 1uy |] testCase "BitConverter.GetBytes Int32 works" <| fun () -> let value = 0x01020304 let bytes = BitConverter.GetBytes(value) bytes |> equal [| 4uy; 3uy; 2uy; 1uy |] testCase "BitConverter.GetBytes Int64 works" <| fun () -> let value = 0x0102030405060708L let bytes = BitConverter.GetBytes(value) bytes |> equal [| 8uy; 7uy; 6uy; 5uy; 4uy; 3uy; 2uy; 1uy |] testCase "BitConverter.GetBytes UInt16 works" <| fun () -> let 
value = 0xFF02us let bytes = BitConverter.GetBytes(value) bytes |> equal [| 2uy; 255uy |] testCase "BitConverter.GetBytes UInt32 works" <| fun () -> let value = 0xFF020304u let bytes = BitConverter.GetBytes(value) bytes |> equal [| 4uy; 3uy; 2uy; 255uy |] testCase "BitConverter.GetBytes UInt64 works" <| fun () -> let value = 0xFF02030405060708UL let bytes = BitConverter.GetBytes(value) bytes |> equal [| 8uy; 7uy; 6uy; 5uy; 4uy; 3uy; 2uy; 255uy |] testCase "BitConverter.GetBytes Single works" <| fun () -> let value = 1.0f let bytes = BitConverter.GetBytes(value) bytes |> equal [| 0uy; 0uy; 128uy; 63uy |] testCase "BitConverter.GetBytes Double works" <| fun () -> let value = 1.0 let bytes = BitConverter.GetBytes(value) bytes |> equal [| 0uy; 0uy; 0uy; 0uy; 0uy; 0uy; 240uy; 63uy |] testCase "BitConverter.Int64BitsToDouble works" <| fun () -> let f = BitConverter.Int64BitsToDouble(1L) f |> equal 4.9406564584124654E-324 testCase "BitConverter.DoubleToInt64Bits works" <| fun () -> let i = BitConverter.DoubleToInt64Bits(1.0) i |> equal 4607182418800017408L testCase "BitConverter.ToBoolean works" <| fun () -> let value = true let bytes = BitConverter.GetBytes(value) BitConverter.ToBoolean(bytes, 0) |> equal value testCase "BitConverter.ToChar works" <| fun () -> let value = 'A' let bytes = BitConverter.GetBytes(value) BitConverter.ToChar(bytes, 0) |> equal value testCase "BitConverter.ToInt16 works" <| fun () -> let value = 0x0102s let bytes = BitConverter.GetBytes(value) BitConverter.ToInt16(bytes, 0) |> equal value testCase "BitConverter.ToInt32 works" <| fun () -> let value = 0x01020304 let bytes = BitConverter.GetBytes(value) BitConverter.ToInt32(bytes, 0) |> equal value testCase "BitConverter.ToInt64 works" <| fun () -> let value = 0x0102030405060708L let bytes = BitConverter.GetBytes(value) BitConverter.ToInt64(bytes, 0) |> equal value testCase "BitConverter.ToUInt16 works" <| fun () -> let value = 0xFF02us let bytes = BitConverter.GetBytes(value) 
BitConverter.ToUInt16(bytes, 0) |> equal value testCase "BitConverter.ToUInt32 works" <| fun () -> let value = 0xFF020304u let bytes = BitConverter.GetBytes(value) BitConverter.ToUInt32(bytes, 0) |> equal value testCase "BitConverter.ToUInt64 works" <| fun () -> let value = 0xFF02030405060708UL let bytes = BitConverter.GetBytes(value) BitConverter.ToUInt64(bytes, 0) |> equal value testCase "BitConverter.ToSingle works" <| fun () -> let value = 1.0f let bytes = BitConverter.GetBytes(value) BitConverter.ToSingle(bytes, 0) |> equal value testCase "BitConverter.ToDouble works" <| fun () -> let value = 1.0 let bytes = BitConverter.GetBytes(value) BitConverter.ToDouble(bytes, 0) |> equal value testCase "BitConverter.ToString works" <| fun () -> let value = 0x01020304 let bytes = BitConverter.GetBytes(value) BitConverter.ToString(bytes) |> equal "04-03-02-01" testCase "BitConverter.ToString 2 works" <| fun () -> let value = 0x01020304 let bytes = BitConverter.GetBytes(value) BitConverter.ToString(bytes, 1) |> equal "03-02-01" testCase "BitConverter.ToString 3 works" <| fun () -> let value = 0x01020304 let bytes = BitConverter.GetBytes(value) BitConverter.ToString(bytes, 1, 2) |> equal "03-02" //------------------------------------- // System.Numerics.BigInteger //------------------------------------- testCase "BigInt from uint32 works" <| fun () -> bigint System.UInt32.MaxValue |> equal 4294967295I testCase "BigInt ToSByte works" <| fun () -> let value = 0x02y sbyte (bigint (int32 value)) |> equal value testCase "BigInt ToInt16 works" <| fun () -> let value = 0x0102s int16 (bigint (int32 value)) |> equal value testCase "BigInt ToInt32 works" <| fun () -> let value = 0x01020304 int32 (bigint value) |> equal value testCase "BigInt ToInt64 works" <| fun () -> let value = 0x0102030405060708L int64 (bigint value) |> equal value testCase "BigInt ToByte works" <| fun () -> let value = 0x02uy byte (bigint (uint32 value)) |> equal value testCase "BigInt ToUInt16 works" <| fun () 
-> let value = 0xFF02us uint16 (bigint (uint32 value)) |> equal value testCase "BigInt ToUInt32 works" <| fun () -> //let value = 0xFF020304u //TODO: BigInt.FromUInt32 not implemented yet, so this will fail let value = 0x1F020304u uint32 (bigint value) |> equal value testCase "BigInt ToUInt64 works" <| fun () -> let value = 0xFF02030405060708UL uint64 (bigint value) |> equal value testCase "BigInt ToSingle works" <| fun () -> let value = 1.0f single (bigint value) |> equal value testCase "BigInt ToDouble works" <| fun () -> let value = -1.0 double (bigint value) |> equal value testCase "BigInt ToDecimal works" <| fun () -> let value = 1.0m decimal (bigint value) |> equal value testCase "BigInt ToDecimal with Decimal.MinValue works" <| fun () -> let value = Decimal.MinValue decimal (bigint value) |> equal value testCase "BigInt ToDecimal with Decimal.MaxValue works" <| fun () -> let value = Decimal.MaxValue decimal (bigint value) |> equal value testCase "BigInt ToString works" <| fun () -> let value = 1234567890 string (bigint value) |> equal "1234567890" testCase "Convert.ToBase64String works" <| fun () -> let bytes = [| 2uy; 4uy; 6uy; 8uy; 10uy; 12uy; 14uy; 16uy; 18uy; 20uy |] Convert.ToBase64String(bytes) |> equal "AgQGCAoMDhASFA==" testCase "Convert.FromBase64String works" <| fun () -> Convert.FromBase64String("AgQGCAoMDhASFA==") |> equal [| 2uy; 4uy; 6uy; 8uy; 10uy; 12uy; 14uy; 16uy; 18uy; 20uy |] // id is prefixed for guid creation as we check at compile time (if able) to create a string const testCase "Guid.Parse works" <| fun () -> let guids = [ Guid.Parse("96258006-c4ba-4a7f-80c4-de7f2b2898c5") Guid.Parse(id "96258006-c4ba-4a7f-80c4-de7f2b2898c5") Guid.Parse("96258006c4ba4a7f80c4de7f2b2898c5") Guid.Parse(id "96258006c4ba4a7f80c4de7f2b2898c5") Guid.Parse("{96258006-c4ba-4a7f-80c4-de7f2b2898c5}") Guid.Parse(id "{96258006-c4ba-4a7f-80c4-de7f2b2898c5}") Guid.Parse("(96258006-c4ba-4a7f-80c4-de7f2b2898c5)") Guid.Parse(id "(96258006-c4ba-4a7f-80c4-de7f2b2898c5)") 
Guid.Parse("{0x96258006,0xc4ba,0x4a7f,{0x80,0xc4,0xde,0x7f,0x2b,0x28,0x98,0xc5}}") Guid.Parse(id "{0x96258006,0xc4ba,0x4a7f,{0x80,0xc4,0xde,0x7f,0x2b,0x28,0x98,0xc5}}") Guid("96258006-c4ba-4a7f-80c4-de7f2b2898c5") Guid(id "96258006-c4ba-4a7f-80c4-de7f2b2898c5") Guid("96258006c4ba4a7f80c4de7f2b2898c5") Guid(id "96258006c4ba4a7f80c4de7f2b2898c5") Guid("{96258006-c4ba-4a7f-80c4-de7f2b2898c5}") Guid(id "{96258006-c4ba-4a7f-80c4-de7f2b2898c5}") Guid("(96258006-c4ba-4a7f-80c4-de7f2b2898c5)") Guid(id "(96258006-c4ba-4a7f-80c4-de7f2b2898c5)") Guid("{0x96258006,0xc4ba,0x4a7f,{0x80,0xc4,0xde,0x7f,0x2b,0x28,0x98,0xc5}}") Guid(id "{0x96258006,0xc4ba,0x4a7f,{0x80,0xc4,0xde,0x7f,0x2b,0x28,0x98,0xc5}}") ] guids |> List.iter (fun g -> g.ToString() |> equal "96258006-c4ba-4a7f-80c4-de7f2b2898c5") // testCase "Guid.Parse fails if string is not well formed" <| fun () -> // let success = // try // let g1 = Guid.Parse(id "foo") // true // with _ -> false // equal false success testCase "Guid.TryParse works" <| fun () -> let successGuids = [ Guid.TryParse("96258006-c4ba-4a7f-80c4-de7f2b2898c5") Guid.TryParse(id "96258006-c4ba-4a7f-80c4-de7f2b2898c5") Guid.TryParse("96258006c4ba4a7f80c4de7f2b2898c5") Guid.TryParse(id "96258006c4ba4a7f80c4de7f2b2898c5") Guid.TryParse("{96258006-c4ba-4a7f-80c4-de7f2b2898c5}") Guid.TryParse(id "{96258006-c4ba-4a7f-80c4-de7f2b2898c5}") Guid.TryParse("(96258006-c4ba-4a7f-80c4-de7f2b2898c5)") Guid.TryParse(id "(96258006-c4ba-4a7f-80c4-de7f2b2898c5)") Guid.TryParse("{0x96258006,0xc4ba,0x4a7f,{0x80,0xc4,0xde,0x7f,0x2b,0x28,0x98,0xc5}}") Guid.TryParse(id "{0x96258006,0xc4ba,0x4a7f,{0x80,0xc4,0xde,0x7f,0x2b,0x28,0x98,0xc5}}") ] let failGuids = [ Guid.TryParse("96258006-c4ba-4a7f-80c4") Guid.TryParse(id "96258006-c4ba-4a7f-80c4") Guid.TryParse("96258007f80c4de7f2b2898c5") Guid.TryParse(id "96258007f80c4de7f2b2898c5") Guid.TryParse("{96258006-c4ba-4a7f-80c4}") Guid.TryParse(id "{96258006-c4ba-4a7f-80c4}") Guid.TryParse("(96258006-c4ba-80c4-de7f2b2898c5)") 
Guid.TryParse(id "(96258006-c4ba-80c4-de7f2b2898c5)") Guid.TryParse("{0x96258006,0xc4ba,{0x80,0xc4,0xde,0x7f,0x28,0x98,0xc5}}") Guid.TryParse(id "{0x96258006,0xc4ba,{0x80,0xc4,0xde,0x7f,0x28,0x98,0xc5}}") ] successGuids |> List.iter (fst >> (equal true)) failGuids |> List.iter (fst >> (equal false)) testCase "Parsed guids with different case are considered the same" <| fun () -> // See #1718 let aGuid = Guid.NewGuid() let lower = aGuid.ToString().ToLower() let upper = aGuid.ToString().ToUpper() lower = upper |> equal false let lowerGuid = Guid.Parse lower let upperGuid = Guid.Parse upper lowerGuid = upperGuid |> equal true testCase "Convert Guid to byte array works" <| fun () -> let g = Guid.Parse("96258006-c4ba-4a7f-80c4-de7f2b2898c5") let g2 = Guid.Parse(id "96258006-c4ba-4a7f-80c4-de7f2b2898c5") g.ToByteArray() |> equal [|6uy; 128uy; 37uy; 150uy; 186uy; 196uy; 127uy; 74uy; 128uy; 196uy; 222uy; 127uy; 43uy; 40uy; 152uy; 197uy|] g2.ToByteArray() |> equal [|6uy; 128uy; 37uy; 150uy; 186uy; 196uy; 127uy; 74uy; 128uy; 196uy; 222uy; 127uy; 43uy; 40uy; 152uy; 197uy|] testCase "Convert byte array to Guid works" <| fun () -> let g = Guid [|6uy; 128uy; 37uy; 150uy; 186uy; 196uy; 127uy; 74uy; 128uy; 196uy; 222uy; 127uy; 43uy; 40uy; 152uy; 197uy|] g.ToString() |> equal "96258006-c4ba-4a7f-80c4-de7f2b2898c5" testCase "Guid.ToString works with formats" <| fun () -> let g = Guid.Parse("96258006-c4ba-4a7f-80c4-de7f2b2898c5") let g2 = Guid.Parse(id "96258006-c4ba-4a7f-80c4-de7f2b2898c5") let testGuid (g: Guid) = g.ToString() |> equal "96258006-c4ba-4a7f-80c4-de7f2b2898c5" g.ToString("N") |> equal "96258006c4ba4a7f80c4de7f2b2898c5" g.ToString("D") |> equal "96258006-c4ba-4a7f-80c4-de7f2b2898c5" g.ToString("B") |> equal "{96258006-c4ba-4a7f-80c4-de7f2b2898c5}" g.ToString("P") |> equal "(96258006-c4ba-4a7f-80c4-de7f2b2898c5)" g.ToString("X") |> equal "{0x96258006,0xc4ba,0x4a7f,{0x80,0xc4,0xde,0x7f,0x2b,0x28,0x98,0xc5}}" testGuid g testGuid g2 //------------------------------------- 
// System.Text.Encoding //------------------------------------- testCase "Encoding.Unicode.GetBytes works" <| fun () -> System.Text.Encoding.Unicode.GetBytes("za\u0306\u01FD\u03B2\uD8FF\uDCFF") |> equal [| 0x7Auy; 0x00uy; 0x61uy; 0x00uy; 0x06uy; 0x03uy; 0xFDuy; 0x01uy; 0xB2uy; 0x03uy; 0xFFuy; 0xD8uy; 0xFFuy; 0xDCuy |] testCase "Encoding.Unicode.GetBytes for range works" <| fun () -> System.Text.Encoding.Unicode.GetBytes("za\u0306\u01FD\u03B2\uD8FF\uDCFF".ToCharArray(), 4, 3) |> equal [| 0xB2uy; 0x03uy; 0xFFuy; 0xD8uy; 0xFFuy; 0xDCuy |] testCase "Encoding.Unicode.GetString works" <| fun () -> let bytes = [| 0x7Auy; 0x00uy; 0x61uy; 0x00uy; 0x06uy; 0x03uy; 0xFDuy; 0x01uy; 0xB2uy; 0x03uy; 0xFFuy; 0xD8uy; 0xFFuy; 0xDCuy |] System.Text.Encoding.Unicode.GetString(bytes) |> equal "za\u0306\u01FD\u03B2\uD8FF\uDCFF" testCase "Encoding.Unicode.GetString for range works" <| fun () -> let bytes = [| 0x7Auy; 0x00uy; 0x61uy; 0x00uy; 0x06uy; 0x03uy; 0xFDuy; 0x01uy; 0xB2uy; 0x03uy; 0xFFuy; 0xD8uy; 0xFFuy; 0xDCuy |] System.Text.Encoding.Unicode.GetString(bytes, 8, 6) |> equal "\u03B2\uD8FF\uDCFF" testCase "Encoding.UTF8.GetBytes works" <| fun () -> System.Text.Encoding.UTF8.GetBytes("za\u0306\u01FD\u03B2\uD8FF\uDCFF") |> equal [| 0x7Auy; 0x61uy; 0xCCuy; 0x86uy; 0xC7uy; 0xBDuy; 0xCEuy; 0xB2uy; 0xF1uy; 0x8Fuy; 0xB3uy; 0xBFuy |] testCase "Encoding.UTF8.GetBytes for range works" <| fun () -> System.Text.Encoding.UTF8.GetBytes("za\u0306\u01FD\u03B2\uD8FF\uDCFF".ToCharArray(), 4, 3) |> equal [| 0xCEuy; 0xB2uy; 0xF1uy; 0x8Fuy; 0xB3uy; 0xBFuy |] testCase "Encoding.UTF8.GetString works" <| fun () -> let bytes = [| 0x7Auy; 0x61uy; 0xCCuy; 0x86uy; 0xC7uy; 0xBDuy; 0xCEuy; 0xB2uy; 0xF1uy; 0x8Fuy; 0xB3uy; 0xBFuy |] System.Text.Encoding.UTF8.GetString(bytes) |> equal "za\u0306\u01FD\u03B2\uD8FF\uDCFF" testCase "Encoding.UTF8.GetString for range works" <| fun () -> let bytes = [| 0x7Auy; 0x61uy; 0xCCuy; 0x86uy; 0xC7uy; 0xBDuy; 0xCEuy; 0xB2uy; 0xF1uy; 0x8Fuy; 0xB3uy; 0xBFuy |] 
System.Text.Encoding.UTF8.GetString(bytes, 6, 6) |> equal "\u03B2\uD8FF\uDCFF" ] ```
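A pattern worth noting in the `Convert.To*` tests above: .NET's `System.Convert` methods use round-half-to-even ("banker's") rounding for fractional inputs (`Convert.ToInt32(1.5) = 2` but `Convert.ToInt32(2.5) = 2`), while the F# cast operators such as `int()` truncate toward zero (`int 1.5 = 1`). As a sketch (not part of the test suite), Python's built-in `round` happens to follow the same midpoint rule as `Convert`, and `int()` the same truncation as the cast, so the expected values can be reproduced like this:

```python
# Sketch: .NET Convert.ToInt32 rounds half to even ("banker's" rounding),
# while F#'s int() cast truncates toward zero. Python's round() and int()
# happen to implement the same two rules, so we can mirror the expectations.

def convert_to_int32(x: float) -> int:
    """Round half to even, mirroring System.Convert.ToInt32(float)."""
    return round(x)

def fsharp_int_cast(x: float) -> int:
    """Truncate toward zero, mirroring F#'s int() cast."""
    return int(x)

# Midpoints go to the nearest even integer, matching the tests above:
assert convert_to_int32(1.5) == 2
assert convert_to_int32(2.5) == 2   # not 3
assert convert_to_int32(3.5) == 4
assert convert_to_int32(2.6) == 3   # not a midpoint; ordinary rounding

# The cast simply drops the fractional part:
assert fsharp_int_cast(1.5) == 1
assert fsharp_int_cast(1.6) == 1
```

This is why the tests assert `Convert.ToInt32(2.5) |> equal (x+x)` (i.e. 2) rather than 3: midpoint values are never rounded away from an even result.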
```shell
\S[300]\s[80]OUT=$(smenu -n 18 t0025.in)
\S[300]\s[200]~abc\rsssssSSSSs\r
\S[300]\s[80]echo ":$\s[80]OUT:"
exit 0
```
Mortuary is a 1982 American slasher film directed by Howard Avedis and starring Mary Beth McDonough, Bill Paxton, David Wallace, Lynda Day George, and Christopher George, in his final film role before his death. It follows a young woman who, while investigating the death of her father, exposes disturbing secrets surrounding a local mortuary. Filmed in Southern California, Mortuary was first released regionally in 1982 before expanding to a wider release in the fall of 1983, after it was acquired by Film Ventures International. Mortuary was a box-office success, ranking among the top twenty highest-grossing films at the time of its release, and went on to earn over $4 million internationally. The film received some favorable reviews from critics, who praised its cinematography, performances, and direction.

Plot
Wealthy psychiatrist Dr. Parson is bludgeoned and drowned in his pool, an incident his daughter Christie believes was murder, but which her mother Eve insists was an accident. Several weeks later, Josh, a former employee of a local mortuary, sneaks into the mortuary warehouse with Christie's boyfriend Greg, planning to steal tires as compensation for unpaid wages from his boss, Hank Andrews. Inside, the two men observe an occult sabbath, which does not faze Josh, who had been fired for witnessing one before. While retrieving the tires, Josh is stabbed to death with a trocar by a cloaked assailant. Shortly after, Greg sees Josh's van speed away. Later that night, Greg and Christie search for Josh at the local roller skating rink, but cannot find him. The following day, Christie is pursued by a car en route to her family's secluded coastal mansion. After an argument with her mother, Christie is accosted by a hooded figure at the pool. She flees into the house, and Eve assures her it was only a dream. The next day, Greg confesses to Christie that he saw Eve in attendance at the sabbath he witnessed at the mortuary.
Christie suspects that her mother and Hank, whom Eve began dating only weeks after her husband's death, may have murdered her father and are orchestrating a plot to drive her insane. Meanwhile, Paul, Hank's son and an embalmer at the mortuary, vies for Christie's attention. She and Greg blame his eccentricities on his mentally ill mother's recent suicide. That night, Greg and Christie spend time alone in her home, but are subjected to various electronic interruptions, such as lights turning on and off and the stereo playing by itself. The next day, Greg and Christie follow Eve to the mortuary, where they observe her engaging in a séance, attempting to contact her late husband. That night, Christie is attacked by a cloaked figure resembling Paul, and smashes a glass window, startling her mother. Eve assumes it to have been a nightmare, but asks Christie whether the alleged attacker could have been Paul; Eve explains that Paul was a patient of Christie's father, and that he had been obsessed with her. After Christie and Eve return to their bedrooms, a cloaked figure viciously stabs Eve to death while she lies in bed. The assailant, revealed to be Paul wearing a white latex mask, chases Christie through the house. He attempts to stab her, but she unmasks him before he renders her unconscious. He brings her to the mortuary, where he begins to embalm her alive, but is stopped when Hank arrives. Paul explains that he had to "punish" Eve for telling Christie about his psychiatric condition, and that he had murdered Dr. Parson for previously having him institutionalized. Paul stabs his father in a rage, killing him, before being confronted by Greg, who has come searching for Christie. Paul manages to lock Greg in the embalming room. He then takes Christie and the corpses of Eve and his father to the warehouse, where he has arranged a makeshift wedding ceremony for himself and Christie.
Surrounded by the preserved bodies of his victims, Paul pretends to conduct a Mozart symphony; among them is the body of his mother, whose death he faked and whom he has kept in an induced coma. As he attempts to cut Christie's throat with a scalpel, Paul is attacked by Greg, who has broken free and armed himself with an axe. In the melee, Christie begins to sleepwalk, takes the axe, and drives it into Paul's back, killing him. Greg and Christie embrace, before Mrs. Andrews suddenly awakens from her coma, lunging at the couple with a knife.

Cast

Production

Release
Hickmar Productions first released Mortuary regionally, screening it in Tucson, Arizona, beginning May 7, 1982. It later opened in Newport News, Virginia, on July 8, 1983, before premiering in Los Angeles on September 2, 1983. A promotional trailer was shot exclusively for the film's release, featuring actor Michael Berryman, who does not appear in the film itself.

Box office
The film grossed $763,184 during its fall 1983 release in 146 theaters, and went on to a worldwide box-office gross of $4,319,001. As of January 1984, the film ranked among the top twenty highest-grossing films at the U.S. box office.

Critical response
Variety described the film as superior to the average "stab-and-slab" horror film, while the Arizona Daily Star's Jacqi Tully praised Mortuary as "cool and sleek looking... Avedis has written in a romantic element with more psychological dread than physical horror." Linda Gross of the Los Angeles Times also gave it a favorable review, describing it as a "sick, scary slasher movie," though she conceded that the screenplay "goes overboard on the gore." Gross also praised Gary Graver's cinematography as "classy, and at times, very beautiful." Henry Edgar of the Daily Press praised the film as well-crafted, writing: "Mortuary is rare among horror films.
Even though the script leaves a few holes, the dialogue is mostly natural and the director keeps the motion moving at a quick pace." By contrast, George Williams of The Sacramento Bee gave the film an unfavorable review in 1983, writing that it "is so embarrassingly bad its creators decided to put it on the shelf as an act of mercy." Robert C. Trussell of The Kansas City Star also panned the film for its dialogue and "unconvincing" violence, writing: "There is absolutely nothing to recommend [about] Mortuary, a wretched 1981 film featuring one of the last performances of the late Christopher Day George."

Home media
Hokushin Audio Visual released the film on VHS and Betamax in the United Kingdom in December 1983. In the United States, Mortuary received a VHS release through Vestron Video in 1984. On May 15, 2012, Scorpion Releasing, in conjunction with Camelot Entertainment, released the film on DVD with a 16×9 (1.78:1) HD master from the original inter-negative. Special features include the option to play the film with or without the "Nightmare Theater" experience, an on-camera interview with composer John Cacavas, and the original trailer. On October 7, 2014, Scorpion Releasing released the film on Blu-ray in a limited edition run of 1,200 copies. MVD Visual released a new Blu-ray edition on July 6, 2021.

Notes

References

External links

1982 films 1982 horror films 1982 independent films American slasher films American mystery films American teen horror films American serial killer films American independent films Funeral homes in fiction 1980s American films 1980s English-language films 1980s mystery films 1980s serial killer films 1980s slasher films 1980s teen horror films 1980s horror thriller films
The men's 4 × 200 metre freestyle relay event at the 2018 Commonwealth Games was held on 8 April at the Gold Coast Aquatic Centre. Records Prior to this competition, the existing world and Commonwealth Games records were as follows. The following records were established during the competition: Results The final was held at 22:01. References Men's 4 x 200 metre freestyle relay Commonwealth Games
```java /** * Classes to support storing and retrieving image display settings. * <p> * These are intended to be simple and JSON-friendly. */ package qupath.lib.display.settings; ```
```go
//
//
// path_to_url
//
// Unless required by applicable law or agreed to in writing, software
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.

package operator

import (
	"testing"
)

type ImageSpec struct {
	SpecImage string
	Image     string
	Version   string
	Tag       string
	SHA       string
}

func TestBuildImagePath(t *testing.T) {
	defaultImageSpec := &ImageSpec{
		Image:   "foo.com/bar",
		Version: "0.0.1",
	}

	// imageWithoutVersion := "myrepo/myimage:123"
	// imageWithVersion := "myhost:9090/myrepo/myimage:0.2"
	// imageWithTag := "myhost:9090/myrepo/myimage:latest"
	// imageWithSHA := "foo/bar@sha256:12345"

	cases := []struct {
		spec     *ImageSpec
		expected string
	}{
		{
			spec:     &ImageSpec{},
			expected: "",
		},
		{
			spec:     defaultImageSpec,
			expected: defaultImageSpec.Image + ":" + defaultImageSpec.Version,
		},
		{
			spec:     &ImageSpec{"", "myrepo.com/foo", "1.0", "", ""},
			expected: "myrepo.com/foo:1.0",
		},
		{
			spec:     &ImageSpec{"", "myrepo.com/foo", "1.0", "latest", ""},
			expected: "myrepo.com/foo:latest",
		},
		{
			spec:     &ImageSpec{"", "myrepo.com/foo", "1.0", "latest", "abcd1234"},
			expected: "myrepo.com/foo@sha256:abcd1234",
		},
		{
			spec:     &ImageSpec{"myspecrepo.com/myimage", "myrepo.com/foo", "1.0", "latest", "abcd1234"},
			expected: "myspecrepo.com/myimage",
		},
	}

	for i, c := range cases {
		result, _ := BuildImagePath(c.spec.SpecImage, c.spec.Image, c.spec.Version, c.spec.Tag, c.spec.SHA)
		if c.expected != result {
			t.Errorf("expected test case %d to be %q but got %q", i, c.expected, result)
		}
	}
}
```
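The expectations in the test above imply a clear precedence order for building an image reference: an explicit spec-level image wins outright; otherwise a SHA digest beats a tag, which beats a plain version. A minimal sketch of a function satisfying those cases might look as follows (a hypothetical reconstruction, with the invented name `buildImagePathSketch`, not the package's actual `BuildImagePath`):

```go
package main

import "fmt"

// buildImagePathSketch applies the precedence the test cases imply:
// SpecImage > SHA > Tag > Version. This is an illustrative sketch only.
func buildImagePathSketch(specImage, image, version, tag, sha string) (string, error) {
	if specImage != "" {
		return specImage, nil // fully specified image: use verbatim
	}
	if image == "" {
		return "", nil // nothing to build a reference from
	}
	switch {
	case sha != "":
		return image + "@sha256:" + sha, nil // digest pins the exact content
	case tag != "":
		return image + ":" + tag, nil
	case version != "":
		return image + ":" + version, nil
	}
	return image, nil
}

func main() {
	p, _ := buildImagePathSketch("", "myrepo.com/foo", "1.0", "latest", "abcd1234")
	fmt.Println(p) // the SHA wins: myrepo.com/foo@sha256:abcd1234
}
```

Ranking the digest above the tag mirrors how deployment tooling generally prefers immutable references over mutable ones.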
Trechus muguensis is a species of ground beetle in the subfamily Trechinae. It was described by Schmidt in 2009. References muguensis Beetles described in 2009
Padma Shri Award, India's fourth highest civilian honours – Winners, 2000 – 2009 Recipients Explanatory notes Non-citizen recipients Posthumous recipients References External links Recipients of the Padma Shri Lists of Indian award winners 2000s in India 2000s-related lists
```typescript
import * as path from 'path';
import * as sourceMapModule from 'source-map';
import * as ts from 'typescript';
import { generate } from '../compiler/generator/generator';
import { ASTNode } from '../compiler/interfaces/AST';
import { ITransformerResult } from '../compiler/interfaces/ITranformerResult';
import { ImportType } from '../compiler/interfaces/ImportType';
import { parseJavascript, parseTypeScript, ICodeParser } from '../compiler/parser';
import { ISerializableTransformationContext } from '../compiler/transformer';
import { EXECUTABLE_EXTENSIONS, JS_EXTENSIONS, STYLESHEET_EXTENSIONS, TS_EXTENSIONS } from '../config/extensions';
import { Context } from '../core/context';
import { env } from '../env';
import { ITypescriptTarget } from '../interfaces/TypescriptInterfaces';
import { IModuleTree } from '../production/module/ModuleTree';
import { IStylesheetModuleResponse } from '../stylesheet/interfaces';
import { getFileModificationTime, makePublicPath, readFile } from '../utils/utils';
import { IRelativeResolve } from './asyncModuleResolver';
import { PackageType, IPackage } from './package';

export function Module() {}

export interface IModule {
  absPath?: string;
  ast?: ASTNode;
  breakDependantsCache?: boolean;
  captured?: boolean;
  contents?: string;
  css?: IStylesheetModuleResponse;
  ctx?: Context;
  dependencies?: Array<number>;
  errored?: boolean;
  extension?: string;
  id?: number;
  ignore?: boolean;
  isCSSModule?: boolean;
  isCSSSourceMapRequired?: boolean;
  isCSSText?: boolean;
  isCached?: boolean;
  isCommonsEligible?: boolean;
  isEntry?: boolean;
  isExecutable?: boolean;
  isJavaScript?: boolean;
  isSourceMapRequired?: boolean;
  isSplit?: boolean;
  isStylesheet?: boolean;
  isTypeScript?: boolean;
  moduleSourceRefs?: Record<string, IModule>;
  moduleTree?: IModuleTree;
  pending?: Array<Promise<any>>;
  pkg?: IPackage;
  publicPath?: string;
  sourceMap?: string;
  storage?: Record<string, any>;
  props?: {
    fuseBoxPath?: string;
  };
  generate?: () => void;
  getMeta?: () => any;
  getTransformationContext?: () => ISerializableTransformationContext;
  init?: () => void;
  initFromCache?: (meta: IModuleMeta, data: { contents: string; sourceMap: string }) => void;
  parse?: () => ASTNode;
  read?: () => string;
  resolve?: (props: { importType?: ImportType; statement: string }) => Promise<IRelativeResolve>;
  transpile?: () => ITransformerResult;
  transpileDown?(buildTarget: ITypescriptTarget): void;
}

export interface IModuleMeta {
  absPath: string;
  breakDependantsCache?: boolean;
  dependencies: Array<number>;
  id: number;
  mtime: number;
  packageId?: string;
  publicPath: string;
  // a list of dependencies for verification (essential dependency)
  // if those dependencies changed this one should be affected too.
  v?: Array<number>;
}

export function createModule(props: { absPath?: string; ctx?: Context; pkg?: IPackage }): IModule {
  const self: IModule = {
    ctx: props.ctx,
    dependencies: [],
    pkg: props.pkg,
    // legacy props
    props: {},
    storage: {},
    // generate the code
    generate: () => {
      if (!self.ast) throw new Error('Cannot generate code without AST');
      const genOptions: any = {
        ecmaVersion: 7,
      };
      if (self.isSourceMapRequired) {
        const sourceMap = new sourceMapModule.SourceMapGenerator({
          file: self.publicPath,
        });
        genOptions.sourceMap = sourceMap;
      }
      if (self.ctx.config.isProduction) {
        genOptions.indent = '';
        genOptions.lineEnd = '';
      }
      const code = generate(self.ast, genOptions);
      if (self.isSourceMapRequired) {
        const jsonSourceMaps = genOptions.sourceMap.toJSON();
        if (!jsonSourceMaps.sourcesContent) {
          delete jsonSourceMaps.file;
          jsonSourceMaps.sources = [self.publicPath];
          jsonSourceMaps.sourcesContent = [self.contents];
        }
        self.sourceMap = JSON.stringify(jsonSourceMaps);
      }
      self.contents = code;
      return code;
    },
    getMeta: (): IModuleMeta => {
      const meta: IModuleMeta = {
        absPath: self.absPath,
        dependencies: self.dependencies,
        id: self.id,
        mtime: getFileModificationTime(self.absPath),
        packageId: props.pkg !== undefined ? props.pkg.publicName : undefined,
        publicPath: self.publicPath,
      };
      if (self.breakDependantsCache) meta.breakDependantsCache = true;
      return meta;
    },
    getTransformationContext: () => {
      return {
        compilerOptions: self.ctx.compilerOptions,
        config: {
          electron: {
            nodeIntegration: self.ctx.config.electron.nodeIntegration,
          },
        },
        module: {
          absPath: self.absPath,
          extension: self.extension,
          isSourceMapRequired: self.isSourceMapRequired,
          publicPath: self.publicPath,
        },
        pkg: { type: self.pkg.type },
        //userTransformers: self.ctx.userTransformers,
      };
    },
    init: () => {
      const ext = path.extname(props.absPath);
      self.extension = path.extname(props.absPath);
      self.isJavaScript = JS_EXTENSIONS.includes(ext);
      self.isTypeScript = TS_EXTENSIONS.includes(ext);
      self.isStylesheet = STYLESHEET_EXTENSIONS.includes(ext);
      self.isExecutable = EXECUTABLE_EXTENSIONS.includes(ext);
      self.absPath = props.absPath;
      self.isCommonsEligible = false;
      self.pending = [];
      self.moduleSourceRefs = {};
      self.isEntry = false;
      self.isSplit = false;
      const config = props.ctx.config;
      if (self.isStylesheet) {
        let isCSSSourceMapRequired = true;
        if (config.sourceMap.css === false) {
          isCSSSourceMapRequired = false;
        }
        if (props.pkg && props.pkg.type === PackageType.EXTERNAL_PACKAGE && !config.sourceMap.vendor) {
          isCSSSourceMapRequired = false;
        }
        self.isCSSSourceMapRequired = isCSSSourceMapRequired;
      }
      self.props.fuseBoxPath = makePublicPath(self.absPath);
      self.publicPath = self.props.fuseBoxPath;
      self.isSourceMapRequired = true;
      if (self.pkg && self.pkg.type === PackageType.USER_PACKAGE) {
        if (!config.sourceMap.project) self.isSourceMapRequired = false;
      } else {
        if (!config.sourceMap.vendor) self.isSourceMapRequired = false;
      }
    },
    initFromCache: (meta: IModuleMeta, data: { contents: string; sourceMap: string }) => {
      self.id = meta.id;
      self.absPath = meta.absPath;
      self.extension = path.extname(self.absPath);
      self.isJavaScript = JS_EXTENSIONS.includes(self.extension);
      self.isTypeScript = TS_EXTENSIONS.includes(self.extension);
      self.isStylesheet = STYLESHEET_EXTENSIONS.includes(self.extension);
      self.isExecutable = EXECUTABLE_EXTENSIONS.includes(self.extension);
      self.contents = data.contents;
      self.sourceMap = data.sourceMap;
      self.dependencies = meta.dependencies;
      self.publicPath = meta.publicPath;
      self.breakDependantsCache = meta.breakDependantsCache;
      self.isCached = true;
      if (self.sourceMap) self.isSourceMapRequired = true;
    },
    // parse using javascript or typescript
    parse: () => {
      if (!self.contents) {
        props.ctx.log.warn(`One of your dependencies contains an empty module:\n\t ${self.publicPath}`);
        self.ast = {
          body: [
            {
              declaration: {
                type: 'Literal',
                value: '',
              },
              type: 'ExportDefaultDeclaration',
            },
          ],
          sourceType: 'module',
          type: 'Program',
        };
        return self.ast;
      }
      let parser: ICodeParser;
      if (self.isTypeScript) parser = parseTypeScript;
      else {
        parser = parseJavascript;
        const parserOptions = self.ctx.compilerOptions.jsParser;
        const isExternal = self.pkg.type === PackageType.EXTERNAL_PACKAGE;
        if (isExternal) {
          if (parserOptions.nodeModules === 'ts') parser = parseTypeScript;
        } else if (parserOptions.project === 'ts') parser = parseTypeScript;
      }
      const jsxRequired = self.extension !== '.ts';
      try {
        // @todo: fix jsx properly
        self.ast = parser(self.contents, {
          jsx: jsxRequired,
          locations: self.isSourceMapRequired,
        });
        self.errored = false;
      } catch (e) {
        self.errored = true;
        const message = `Error while parsing module ${self.absPath}\n\t ${e.stack || e.message}`;
        props.ctx.log.error(message);
        self.ast = parseJavascript(``);
      }
      return self.ast;
    },
    // read the contents
    read: () => {
      if (self.contents !== undefined) return self.contents;
      try {
        self.contents = readFile(self.absPath);
      } catch (e) {
        if (self.absPath.includes('node_modules')) {
          props.ctx.log.warn(`Did you forget to run 'npm install'?`);
        }
        props.ctx.log.error(`Module not found at\n\t${self.publicPath}`, e.message);
        throw e;
      }
      return self.contents;
    },
    transpileDown: (buildTarget: ITypescriptTarget) => {
      // we can't support sourcemaps on downtranspiling
      self.isSourceMapRequired = false;
      const config = ts.convertCompilerOptionsFromJson(
        { importHelpers: false, noEmitHelpers: true, target: buildTarget },
        env.SCRIPT_PATH,
      );
      const data = ts.transpileModule(self.contents, {
        compilerOptions: config.options,
        fileName: self.absPath,
      });
      self.contents = data.outputText;
    },
  };
  return self;
}
```
```html
<!DOCTYPE html>
<html>
{{> head}}
<body>
{{> sidebar}}
{{#entries}}
  <div class="entry">
    {{{html}}}
  </div>
  <div class="light">{{#date}} <a href="{{{url}}}" title="{{title}}">{{date}}</a> {{/date}}{{#tags}} <a title="Everything tagged {{name}}" href="/tagged/{{slug}}">{{name}}</a> {{/tags}}</div>
  <hr class="full"/>
{{/entries}}
<p class="entry" style="text-align:center">View <a href="/archives">the archives</a></p>
{{> footer}}
</body>
</html>
```
```go
//
// path_to_url
//
// Unless required by applicable law or agreed to in writing, software
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.

// Package cobra is a commander providing a simple interface to create powerful modern CLI interfaces.
// In addition to providing an interface, Cobra simultaneously provides a controller to organize your application code.
package cobra

import (
	"bytes"
	"context"
	"fmt"
	"io"
	"os"
	"path/filepath"
	"sort"
	"strings"

	flag "github.com/spf13/pflag"
)

// FParseErrWhitelist configures Flag parse errors to be ignored
type FParseErrWhitelist flag.ParseErrorsWhitelist

// Command is just that, a command for your application.
// E.g. 'go run ...' - 'run' is the command. Cobra requires
// you to define the usage and description as part of your command
// definition to ensure usability.
type Command struct {
	// Use is the one-line usage message.
	Use string

	// Aliases is an array of aliases that can be used instead of the first word in Use.
	Aliases []string

	// SuggestFor is an array of command names for which this command will be suggested -
	// similar to aliases but only suggests.
	SuggestFor []string

	// Short is the short description shown in the 'help' output.
	Short string

	// Long is the long message shown in the 'help <this-command>' output.
	Long string

	// Example is examples of how to use the command.
	Example string

	// ValidArgs is list of all valid non-flag arguments that are accepted in bash completions
	ValidArgs []string
	// ValidArgsFunction is an optional function that provides valid non-flag arguments for bash completion.
	// It is a dynamic version of using ValidArgs.
	// Only one of ValidArgs and ValidArgsFunction can be used for a command.
	ValidArgsFunction func(cmd *Command, args []string, toComplete string) ([]string, ShellCompDirective)

	// Expected arguments
	Args PositionalArgs

	// ArgAliases is List of aliases for ValidArgs.
	// These are not suggested to the user in the bash completion,
	// but accepted if entered manually.
	ArgAliases []string

	// BashCompletionFunction is custom functions used by the bash autocompletion generator.
	BashCompletionFunction string

	// Deprecated defines, if this command is deprecated and should print this string when used.
	Deprecated string

	// Hidden defines, if this command is hidden and should NOT show up in the list of available commands.
	Hidden bool

	// Annotations are key/value pairs that can be used by applications to identify or
	// group commands.
	Annotations map[string]string

	// Version defines the version for this command. If this value is non-empty and the command does not
	// define a "version" flag, a "version" boolean flag will be added to the command and, if specified,
	// will print content of the "Version" variable. A shorthand "v" flag will also be added if the
	// command does not define one.
	Version string

	// The *Run functions are executed in the following order:
	//   * PersistentPreRun()
	//   * PreRun()
	//   * Run()
	//   * PostRun()
	//   * PersistentPostRun()
	// All functions get the same args, the arguments after the command name.
	//
	// PersistentPreRun: children of this command will inherit and execute.
	PersistentPreRun func(cmd *Command, args []string)
	// PersistentPreRunE: PersistentPreRun but returns an error.
	PersistentPreRunE func(cmd *Command, args []string) error
	// PreRun: children of this command will not inherit.
	PreRun func(cmd *Command, args []string)
	// PreRunE: PreRun but returns an error.
	PreRunE func(cmd *Command, args []string) error
	// Run: Typically the actual work function. Most commands will only implement this.
	Run func(cmd *Command, args []string)
	// RunE: Run but returns an error.
	RunE func(cmd *Command, args []string) error
	// PostRun: run after the Run command.
	PostRun func(cmd *Command, args []string)
	// PostRunE: PostRun but returns an error.
	PostRunE func(cmd *Command, args []string) error
	// PersistentPostRun: children of this command will inherit and execute after PostRun.
	PersistentPostRun func(cmd *Command, args []string)
	// PersistentPostRunE: PersistentPostRun but returns an error.
	PersistentPostRunE func(cmd *Command, args []string) error

	// SilenceErrors is an option to quiet errors down stream.
	SilenceErrors bool

	// SilenceUsage is an option to silence usage when an error occurs.
	SilenceUsage bool

	// DisableFlagParsing disables the flag parsing.
	// If this is true all flags will be passed to the command as arguments.
	DisableFlagParsing bool

	// DisableAutoGenTag defines, if gen tag ("Auto generated by spf13/cobra...")
	// will be printed by generating docs for this command.
	DisableAutoGenTag bool

	// DisableFlagsInUseLine will disable the addition of [flags] to the usage
	// line of a command when printing help or generating docs
	DisableFlagsInUseLine bool

	// DisableSuggestions disables the suggestions based on Levenshtein distance
	// that go along with 'unknown command' messages.
	DisableSuggestions bool
	// SuggestionsMinimumDistance defines minimum levenshtein distance to display suggestions.
	// Must be > 0.
	SuggestionsMinimumDistance int

	// TraverseChildren parses flags on all parents before executing child command.
	TraverseChildren bool

	// FParseErrWhitelist flag parse errors to be ignored
	FParseErrWhitelist FParseErrWhitelist

	ctx context.Context

	// commands is the list of commands supported by this program.
	commands []*Command
	// parent is a parent command for this command.
	parent *Command
	// Max lengths of commands' string lengths for use in padding.
	commandsMaxUseLen         int
	commandsMaxCommandPathLen int
	commandsMaxNameLen        int
	// commandsAreSorted defines, if command slice are sorted or not.
	commandsAreSorted bool
	// commandCalledAs is the name or alias value used to call this command.
	commandCalledAs struct {
		name   string
		called bool
	}

	// args is actual args parsed from flags.
	args []string
	// flagErrorBuf contains all error messages from pflag.
	flagErrorBuf *bytes.Buffer
	// flags is full set of flags.
	flags *flag.FlagSet
	// pflags contains persistent flags.
	pflags *flag.FlagSet
	// lflags contains local flags.
	lflags *flag.FlagSet
	// iflags contains inherited flags.
	iflags *flag.FlagSet
	// parentsPflags is all persistent flags of cmd's parents.
	parentsPflags *flag.FlagSet
	// globNormFunc is the global normalization function
	// that we can use on every pflag set and children commands
	globNormFunc func(f *flag.FlagSet, name string) flag.NormalizedName

	// usageFunc is usage func defined by user.
	usageFunc func(*Command) error
	// usageTemplate is usage template defined by user.
	usageTemplate string
	// flagErrorFunc is func defined by user and it's called when the parsing of
	// flags returns an error.
	flagErrorFunc func(*Command, error) error
	// helpTemplate is help template defined by user.
	helpTemplate string
	// helpFunc is help func defined by user.
	helpFunc func(*Command, []string)
	// helpCommand is command with usage 'help'. If it's not defined by user,
	// cobra uses default help command.
	helpCommand *Command
	// versionTemplate is the version template defined by user.
	versionTemplate string

	// inReader is a reader defined by the user that replaces stdin
	inReader io.Reader
	// outWriter is a writer defined by the user that replaces stdout
	outWriter io.Writer
	// errWriter is a writer defined by the user that replaces stderr
	errWriter io.Writer
}

// Context returns underlying command context. If command wasn't
// executed with ExecuteContext Context returns Background context.
func (c *Command) Context() context.Context {
	return c.ctx
}

// SetArgs sets arguments for the command. It is set to os.Args[1:] by default, if desired, can be overridden
// particularly useful when testing.
func (c *Command) SetArgs(a []string) {
	c.args = a
}

// SetOutput sets the destination for usage and error messages.
// If output is nil, os.Stderr is used.
// Deprecated: Use SetOut and/or SetErr instead
func (c *Command) SetOutput(output io.Writer) {
	c.outWriter = output
	c.errWriter = output
}

// SetOut sets the destination for usage messages.
// If newOut is nil, os.Stdout is used.
func (c *Command) SetOut(newOut io.Writer) {
	c.outWriter = newOut
}

// SetErr sets the destination for error messages.
// If newErr is nil, os.Stderr is used.
func (c *Command) SetErr(newErr io.Writer) {
	c.errWriter = newErr
}

// SetIn sets the source for input data
// If newIn is nil, os.Stdin is used.
func (c *Command) SetIn(newIn io.Reader) {
	c.inReader = newIn
}

// SetUsageFunc sets usage function. Usage can be defined by application.
func (c *Command) SetUsageFunc(f func(*Command) error) {
	c.usageFunc = f
}

// SetUsageTemplate sets usage template. Can be defined by Application.
func (c *Command) SetUsageTemplate(s string) {
	c.usageTemplate = s
}

// SetFlagErrorFunc sets a function to generate an error when flag parsing
// fails.
func (c *Command) SetFlagErrorFunc(f func(*Command, error) error) {
	c.flagErrorFunc = f
}

// SetHelpFunc sets help function. Can be defined by Application.
func (c *Command) SetHelpFunc(f func(*Command, []string)) {
	c.helpFunc = f
}

// SetHelpCommand sets help command.
func (c *Command) SetHelpCommand(cmd *Command) {
	c.helpCommand = cmd
}

// SetHelpTemplate sets help template to be used. Application can use it to set custom template.
func (c *Command) SetHelpTemplate(s string) {
	c.helpTemplate = s
}

// SetVersionTemplate sets version template to be used. Application can use it to set custom template.
func (c *Command) SetVersionTemplate(s string) {
	c.versionTemplate = s
}

// SetGlobalNormalizationFunc sets a normalization function to all flag sets and also to child commands.
// The user should not have a cyclic dependency on commands.
func (c *Command) SetGlobalNormalizationFunc(n func(f *flag.FlagSet, name string) flag.NormalizedName) {
	c.Flags().SetNormalizeFunc(n)
	c.PersistentFlags().SetNormalizeFunc(n)
	c.globNormFunc = n

	for _, command := range c.commands {
		command.SetGlobalNormalizationFunc(n)
	}
}

// OutOrStdout returns output to stdout.
func (c *Command) OutOrStdout() io.Writer {
	return c.getOut(os.Stdout)
}

// OutOrStderr returns output to stderr
func (c *Command) OutOrStderr() io.Writer {
	return c.getOut(os.Stderr)
}

// ErrOrStderr returns output to stderr
func (c *Command) ErrOrStderr() io.Writer {
	return c.getErr(os.Stderr)
}

// InOrStdin returns input to stdin
func (c *Command) InOrStdin() io.Reader {
	return c.getIn(os.Stdin)
}

func (c *Command) getOut(def io.Writer) io.Writer {
	if c.outWriter != nil {
		return c.outWriter
	}
	if c.HasParent() {
		return c.parent.getOut(def)
	}
	return def
}

func (c *Command) getErr(def io.Writer) io.Writer {
	if c.errWriter != nil {
		return c.errWriter
	}
	if c.HasParent() {
		return c.parent.getErr(def)
	}
	return def
}

func (c *Command) getIn(def io.Reader) io.Reader {
	if c.inReader != nil {
		return c.inReader
	}
	if c.HasParent() {
		return c.parent.getIn(def)
	}
	return def
}

// UsageFunc returns either the function set by SetUsageFunc for this command
// or a parent, or it returns a default usage function.
func (c *Command) UsageFunc() (f func(*Command) error) {
	if c.usageFunc != nil {
		return c.usageFunc
	}
	if c.HasParent() {
		return c.Parent().UsageFunc()
	}
	return func(c *Command) error {
		c.mergePersistentFlags()
		err := tmpl(c.OutOrStderr(), c.UsageTemplate(), c)
		if err != nil {
			c.Println(err)
		}
		return err
	}
}

// Usage puts out the usage for the command.
// Used when a user provides invalid input.
// Can be defined by user by overriding UsageFunc.
func (c *Command) Usage() error {
	return c.UsageFunc()(c)
}

// HelpFunc returns either the function set by SetHelpFunc for this command
// or a parent, or it returns a function with default help behavior.
func (c *Command) HelpFunc() func(*Command, []string) {
	if c.helpFunc != nil {
		return c.helpFunc
	}
	if c.HasParent() {
		return c.Parent().HelpFunc()
	}
	return func(c *Command, a []string) {
		c.mergePersistentFlags()
		// The help should be sent to stdout
		// See path_to_url
		err := tmpl(c.OutOrStdout(), c.HelpTemplate(), c)
		if err != nil {
			c.Println(err)
		}
	}
}

// Help puts out the help for the command.
// Used when a user calls help [command].
// Can be defined by user by overriding HelpFunc.
func (c *Command) Help() error {
	c.HelpFunc()(c, []string{})
	return nil
}

// UsageString returns usage string.
func (c *Command) UsageString() string {
	// Storing normal writers
	tmpOutput := c.outWriter
	tmpErr := c.errWriter

	bb := new(bytes.Buffer)
	c.outWriter = bb
	c.errWriter = bb

	c.Usage()

	// Setting things back to normal
	c.outWriter = tmpOutput
	c.errWriter = tmpErr

	return bb.String()
}

// FlagErrorFunc returns either the function set by SetFlagErrorFunc for this
// command or a parent, or it returns a function which returns the original
// error.
func (c *Command) FlagErrorFunc() (f func(*Command, error) error) {
	if c.flagErrorFunc != nil {
		return c.flagErrorFunc
	}

	if c.HasParent() {
		return c.parent.FlagErrorFunc()
	}
	return func(c *Command, err error) error {
		return err
	}
}

var minUsagePadding = 25

// UsagePadding return padding for the usage.
func (c *Command) UsagePadding() int {
	if c.parent == nil || minUsagePadding > c.parent.commandsMaxUseLen {
		return minUsagePadding
	}
	return c.parent.commandsMaxUseLen
}

var minCommandPathPadding = 11

// CommandPathPadding return padding for the command path.
func (c *Command) CommandPathPadding() int {
	if c.parent == nil || minCommandPathPadding > c.parent.commandsMaxCommandPathLen {
		return minCommandPathPadding
	}
	return c.parent.commandsMaxCommandPathLen
}

var minNamePadding = 11

// NamePadding returns padding for the name.
func (c *Command) NamePadding() int {
	if c.parent == nil || minNamePadding > c.parent.commandsMaxNameLen {
		return minNamePadding
	}
	return c.parent.commandsMaxNameLen
}

// UsageTemplate returns usage template for the command.
func (c *Command) UsageTemplate() string {
	if c.usageTemplate != "" {
		return c.usageTemplate
	}

	if c.HasParent() {
		return c.parent.UsageTemplate()
	}
	return `Usage:{{if .Runnable}}
  {{.UseLine}}{{end}}{{if .HasAvailableSubCommands}}
  {{.CommandPath}} [command]{{end}}{{if gt (len .Aliases) 0}}

Aliases:
  {{.NameAndAliases}}{{end}}{{if .HasExample}}

Examples:
{{.Example}}{{end}}{{if .HasAvailableSubCommands}}

Available Commands:{{range .Commands}}{{if (or .IsAvailableCommand (eq .Name "help"))}}
  {{rpad .Name .NamePadding }} {{.Short}}{{end}}{{end}}{{end}}{{if .HasAvailableLocalFlags}}

Flags:
{{.LocalFlags.FlagUsages | trimTrailingWhitespaces}}{{end}}{{if .HasAvailableInheritedFlags}}

Global Flags:
{{.InheritedFlags.FlagUsages | trimTrailingWhitespaces}}{{end}}{{if .HasHelpSubCommands}}

Additional help topics:{{range .Commands}}{{if .IsAdditionalHelpTopicCommand}}
  {{rpad .CommandPath .CommandPathPadding}} {{.Short}}{{end}}{{end}}{{end}}{{if .HasAvailableSubCommands}}

Use "{{.CommandPath}} [command] --help" for more information about a command.{{end}}
`
}

// HelpTemplate return help template for the command.
func (c *Command) HelpTemplate() string {
	if c.helpTemplate != "" {
		return c.helpTemplate
	}

	if c.HasParent() {
		return c.parent.HelpTemplate()
	}
	return `{{with (or .Long .Short)}}{{. | trimTrailingWhitespaces}}

{{end}}{{if or .Runnable .HasSubCommands}}{{.UsageString}}{{end}}`
}

// VersionTemplate return version template for the command.
func (c *Command) VersionTemplate() string {
	if c.versionTemplate != "" {
		return c.versionTemplate
	}

	if c.HasParent() {
		return c.parent.VersionTemplate()
	}
	return `{{with .Name}}{{printf "%s " .}}{{end}}{{printf "version %s" .Version}}
`
}

func hasNoOptDefVal(name string, fs *flag.FlagSet) bool {
	flag := fs.Lookup(name)
	if flag == nil {
		return false
	}
	return flag.NoOptDefVal != ""
}

func shortHasNoOptDefVal(name string, fs *flag.FlagSet) bool {
	if len(name) == 0 {
		return false
	}

	flag := fs.ShorthandLookup(name[:1])
	if flag == nil {
		return false
	}
	return flag.NoOptDefVal != ""
}

func stripFlags(args []string, c *Command) []string {
	if len(args) == 0 {
		return args
	}
	c.mergePersistentFlags()

	commands := []string{}
	flags := c.Flags()

Loop:
	for len(args) > 0 {
		s := args[0]
		args = args[1:]
		switch {
		case s == "--":
			// "--" terminates the flags
			break Loop
		case strings.HasPrefix(s, "--") && !strings.Contains(s, "=") && !hasNoOptDefVal(s[2:], flags):
			// If '--flag arg' then
			// delete arg from args.
			fallthrough // (do the same as below)
		case strings.HasPrefix(s, "-") && !strings.Contains(s, "=") && len(s) == 2 && !shortHasNoOptDefVal(s[1:], flags):
			// If '-f arg' then
			// delete 'arg' from args or break the loop if len(args) <= 1.
			if len(args) <= 1 {
				break Loop
			} else {
				args = args[1:]
				continue
			}
		case s != "" && !strings.HasPrefix(s, "-"):
			commands = append(commands, s)
		}
	}

	return commands
}

// argsMinusFirstX removes only the first x from args. Otherwise, commands that look like
// openshift admin policy add-role-to-user admin my-user, lose the admin argument (arg[4]).
func argsMinusFirstX(args []string, x string) []string {
	for i, y := range args {
		if x == y {
			ret := []string{}
			ret = append(ret, args[:i]...)
			ret = append(ret, args[i+1:]...)
return ret } } return args } func isFlagArg(arg string) bool { return ((len(arg) >= 3 && arg[1] == '-') || (len(arg) >= 2 && arg[0] == '-' && arg[1] != '-')) } // Find the target command given the args and command tree // Meant to be run on the highest node. Only searches down. func (c *Command) Find(args []string) (*Command, []string, error) { var innerfind func(*Command, []string) (*Command, []string) innerfind = func(c *Command, innerArgs []string) (*Command, []string) { argsWOflags := stripFlags(innerArgs, c) if len(argsWOflags) == 0 { return c, innerArgs } nextSubCmd := argsWOflags[0] cmd := c.findNext(nextSubCmd) if cmd != nil { return innerfind(cmd, argsMinusFirstX(innerArgs, nextSubCmd)) } return c, innerArgs } commandFound, a := innerfind(c, args) if commandFound.Args == nil { return commandFound, a, legacyArgs(commandFound, stripFlags(a, commandFound)) } return commandFound, a, nil } func (c *Command) findSuggestions(arg string) string { if c.DisableSuggestions { return "" } if c.SuggestionsMinimumDistance <= 0 { c.SuggestionsMinimumDistance = 2 } suggestionsString := "" if suggestions := c.SuggestionsFor(arg); len(suggestions) > 0 { suggestionsString += "\n\nDid you mean this?\n" for _, s := range suggestions { suggestionsString += fmt.Sprintf("\t%v\n", s) } } return suggestionsString } func (c *Command) findNext(next string) *Command { matches := make([]*Command, 0) for _, cmd := range c.commands { if cmd.Name() == next || cmd.HasAlias(next) { cmd.commandCalledAs.name = next return cmd } if EnablePrefixMatching && cmd.hasNameOrAliasPrefix(next) { matches = append(matches, cmd) } } if len(matches) == 1 { return matches[0] } return nil } // Traverse the command tree to find the command, and parse args for // each parent. 
func (c *Command) Traverse(args []string) (*Command, []string, error) { flags := []string{} inFlag := false for i, arg := range args { switch { // A long flag with a space separated value case strings.HasPrefix(arg, "--") && !strings.Contains(arg, "="): // TODO: this isn't quite right, we should really check ahead for 'true' or 'false' inFlag = !hasNoOptDefVal(arg[2:], c.Flags()) flags = append(flags, arg) continue // A short flag with a space separated value case strings.HasPrefix(arg, "-") && !strings.Contains(arg, "=") && len(arg) == 2 && !shortHasNoOptDefVal(arg[1:], c.Flags()): inFlag = true flags = append(flags, arg) continue // The value for a flag case inFlag: inFlag = false flags = append(flags, arg) continue // A flag without a value, or with an `=` separated value case isFlagArg(arg): flags = append(flags, arg) continue } cmd := c.findNext(arg) if cmd == nil { return c, args, nil } if err := c.ParseFlags(flags); err != nil { return nil, args, err } return cmd.Traverse(args[i+1:]) } return c, args, nil } // SuggestionsFor provides suggestions for the typedName. func (c *Command) SuggestionsFor(typedName string) []string { suggestions := []string{} for _, cmd := range c.commands { if cmd.IsAvailableCommand() { levenshteinDistance := ld(typedName, cmd.Name(), true) suggestByLevenshtein := levenshteinDistance <= c.SuggestionsMinimumDistance suggestByPrefix := strings.HasPrefix(strings.ToLower(cmd.Name()), strings.ToLower(typedName)) if suggestByLevenshtein || suggestByPrefix { suggestions = append(suggestions, cmd.Name()) } for _, explicitSuggestion := range cmd.SuggestFor { if strings.EqualFold(typedName, explicitSuggestion) { suggestions = append(suggestions, cmd.Name()) } } } } return suggestions } // VisitParents visits all parents of the command and invokes fn on each parent. func (c *Command) VisitParents(fn func(*Command)) { if c.HasParent() { fn(c.Parent()) c.Parent().VisitParents(fn) } } // Root finds root command. 
func (c *Command) Root() *Command { if c.HasParent() { return c.Parent().Root() } return c } // ArgsLenAtDash will return the length of c.Flags().Args at the moment // when a -- was found during args parsing. func (c *Command) ArgsLenAtDash() int { return c.Flags().ArgsLenAtDash() } func (c *Command) execute(a []string) (err error) { if c == nil { return fmt.Errorf("Called Execute() on a nil Command") } if len(c.Deprecated) > 0 { c.Printf("Command %q is deprecated, %s\n", c.Name(), c.Deprecated) } // initialize help and version flag at the last point possible to allow for user // overriding c.InitDefaultHelpFlag() c.InitDefaultVersionFlag() err = c.ParseFlags(a) if err != nil { return c.FlagErrorFunc()(c, err) } // If help is called, regardless of other flags, return we want help. // Also say we need help if the command isn't runnable. helpVal, err := c.Flags().GetBool("help") if err != nil { // should be impossible to get here as we always declare a help // flag in InitDefaultHelpFlag() c.Println("\"help\" flag declared as non-bool. Please correct your code") return err } if helpVal { return flag.ErrHelp } // for back-compat, only add version flag behavior if version is defined if c.Version != "" { versionVal, err := c.Flags().GetBool("version") if err != nil { c.Println("\"version\" flag declared as non-bool. Please correct your code") return err }
if versionVal { err := tmpl(c.OutOrStdout(), c.VersionTemplate(), c) if err != nil { c.Println(err) } return err } } if !c.Runnable() { return flag.ErrHelp } c.preRun() argWoFlags := c.Flags().Args() if c.DisableFlagParsing { argWoFlags = a } if err := c.ValidateArgs(argWoFlags); err != nil { return err } for p := c; p != nil; p = p.Parent() { if p.PersistentPreRunE != nil { if err := p.PersistentPreRunE(c, argWoFlags); err != nil { return err } break } else if p.PersistentPreRun != nil { p.PersistentPreRun(c, argWoFlags) break } } if c.PreRunE != nil { if err := c.PreRunE(c, argWoFlags); err != nil { return err } } else if c.PreRun != nil { c.PreRun(c, argWoFlags) } if err := c.validateRequiredFlags(); err != nil { return err } if c.RunE != nil { if err := c.RunE(c, argWoFlags); err != nil { return err } } else { c.Run(c, argWoFlags) } if c.PostRunE != nil { if err := c.PostRunE(c, argWoFlags); err != nil { return err } } else if c.PostRun != nil { c.PostRun(c, argWoFlags) } for p := c; p != nil; p = p.Parent() { if p.PersistentPostRunE != nil { if err := p.PersistentPostRunE(c, argWoFlags); err != nil { return err } break } else if p.PersistentPostRun != nil { p.PersistentPostRun(c, argWoFlags) break } } return nil } func (c *Command) preRun() { for _, x := range initializers { x() } } // ExecuteContext is the same as Execute(), but sets the ctx on the command. // Retrieve ctx by calling cmd.Context() inside your *Run lifecycle functions. func (c *Command) ExecuteContext(ctx context.Context) error { c.ctx = ctx return c.Execute() } // Execute uses the args (os.Args[1:] by default) // and runs through the command tree finding appropriate matches // for commands and then corresponding flags. func (c *Command) Execute() error { _, err := c.ExecuteC() return err } // ExecuteC executes the command.
func (c *Command) ExecuteC() (cmd *Command, err error) { if c.ctx == nil { c.ctx = context.Background() } // Regardless of what command execute is called on, run on Root only if c.HasParent() { return c.Root().ExecuteC() } // windows hook if preExecHookFn != nil { preExecHookFn(c) } // initialize help as the last point possible to allow for user // overriding c.InitDefaultHelpCmd() args := c.args // Workaround FAIL with "go test -v" or "cobra.test -test.v", see #155 if c.args == nil && filepath.Base(os.Args[0]) != "cobra.test" { args = os.Args[1:] } // initialize the hidden command to be used for bash completion c.initCompleteCmd(args) var flags []string if c.TraverseChildren { cmd, flags, err = c.Traverse(args) } else { cmd, flags, err = c.Find(args) } if err != nil { // If found parse to a subcommand and then failed, talk about the subcommand if cmd != nil { c = cmd } if !c.SilenceErrors { c.Println("Error:", err.Error()) c.Printf("Run '%v --help' for usage.\n", c.CommandPath()) } return c, err } cmd.commandCalledAs.called = true if cmd.commandCalledAs.name == "" { cmd.commandCalledAs.name = cmd.Name() } // We have to pass global context to children command // if context is present on the parent command. 
if cmd.ctx == nil { cmd.ctx = c.ctx } err = cmd.execute(flags) if err != nil { // Always show help if requested, even if SilenceErrors is in // effect if err == flag.ErrHelp { cmd.HelpFunc()(cmd, args) return cmd, nil } // If root command has SilentErrors flagged, // all subcommands should respect it if !cmd.SilenceErrors && !c.SilenceErrors { c.Println("Error:", err.Error()) } // If root command has SilentUsage flagged, // all subcommands should respect it if !cmd.SilenceUsage && !c.SilenceUsage { c.Println(cmd.UsageString()) } } return cmd, err } func (c *Command) ValidateArgs(args []string) error { if c.Args == nil { return nil } return c.Args(c, args) } func (c *Command) validateRequiredFlags() error { flags := c.Flags() missingFlagNames := []string{} flags.VisitAll(func(pflag *flag.Flag) { requiredAnnotation, found := pflag.Annotations[BashCompOneRequiredFlag] if !found { return } if (requiredAnnotation[0] == "true") && !pflag.Changed { missingFlagNames = append(missingFlagNames, pflag.Name) } }) if len(missingFlagNames) > 0 { return fmt.Errorf(`required flag(s) "%s" not set`, strings.Join(missingFlagNames, `", "`)) } return nil } // InitDefaultHelpFlag adds default help flag to c. // It is called automatically by executing the c or by calling help and usage. // If c already has help flag, it will do nothing. func (c *Command) InitDefaultHelpFlag() { c.mergePersistentFlags() if c.Flags().Lookup("help") == nil { usage := "help for " if c.Name() == "" { usage += "this command" } else { usage += c.Name() } c.Flags().BoolP("help", "h", false, usage) } } // InitDefaultVersionFlag adds default version flag to c. // It is called automatically by executing the c. // If c already has a version flag, it will do nothing. // If c.Version is empty, it will do nothing. 
func (c *Command) InitDefaultVersionFlag() { if c.Version == "" { return } c.mergePersistentFlags() if c.Flags().Lookup("version") == nil { usage := "version for " if c.Name() == "" { usage += "this command" } else { usage += c.Name() } if c.Flags().ShorthandLookup("v") == nil { c.Flags().BoolP("version", "v", false, usage) } else { c.Flags().Bool("version", false, usage) } } } // InitDefaultHelpCmd adds default help command to c. // It is called automatically by executing the c or by calling help and usage. // If c already has help command or c has no subcommands, it will do nothing. func (c *Command) InitDefaultHelpCmd() { if !c.HasSubCommands() { return } if c.helpCommand == nil { c.helpCommand = &Command{ Use: "help [command]", Short: "Help about any command", Long: `Help provides help for any command in the application. Simply type ` + c.Name() + ` help [path to command] for full details.`, Run: func(c *Command, args []string) { cmd, _, e := c.Root().Find(args) if cmd == nil || e != nil { c.Printf("Unknown help topic %#q\n", args) c.Root().Usage() } else { cmd.InitDefaultHelpFlag() // make possible 'help' flag to be shown cmd.Help() } }, } } c.RemoveCommand(c.helpCommand) c.AddCommand(c.helpCommand) } // ResetCommands delete parent, subcommand and help command from c. func (c *Command) ResetCommands() { c.parent = nil c.commands = nil c.helpCommand = nil c.parentsPflags = nil } // Sorts commands by their names. type commandSorterByName []*Command func (c commandSorterByName) Len() int { return len(c) } func (c commandSorterByName) Swap(i, j int) { c[i], c[j] = c[j], c[i] } func (c commandSorterByName) Less(i, j int) bool { return c[i].Name() < c[j].Name() } // Commands returns a sorted slice of child commands. 
func (c *Command) Commands() []*Command { // do not sort commands if it already sorted or sorting was disabled if EnableCommandSorting && !c.commandsAreSorted { sort.Sort(commandSorterByName(c.commands)) c.commandsAreSorted = true } return c.commands } // AddCommand adds one or more commands to this parent command. func (c *Command) AddCommand(cmds ...*Command) { for i, x := range cmds { if cmds[i] == c { panic("Command can't be a child of itself") } cmds[i].parent = c // update max lengths usageLen := len(x.Use) if usageLen > c.commandsMaxUseLen { c.commandsMaxUseLen = usageLen } commandPathLen := len(x.CommandPath()) if commandPathLen > c.commandsMaxCommandPathLen { c.commandsMaxCommandPathLen = commandPathLen } nameLen := len(x.Name()) if nameLen > c.commandsMaxNameLen { c.commandsMaxNameLen = nameLen } // If global normalization function exists, update all children if c.globNormFunc != nil { x.SetGlobalNormalizationFunc(c.globNormFunc) } c.commands = append(c.commands, x) c.commandsAreSorted = false } } // RemoveCommand removes one or more commands from a parent command. func (c *Command) RemoveCommand(cmds ...*Command) { commands := []*Command{} main: for _, command := range c.commands { for _, cmd := range cmds { if command == cmd { command.parent = nil continue main } } commands = append(commands, command) } c.commands = commands // recompute all lengths c.commandsMaxUseLen = 0 c.commandsMaxCommandPathLen = 0 c.commandsMaxNameLen = 0 for _, command := range c.commands { usageLen := len(command.Use) if usageLen > c.commandsMaxUseLen { c.commandsMaxUseLen = usageLen } commandPathLen := len(command.CommandPath()) if commandPathLen > c.commandsMaxCommandPathLen { c.commandsMaxCommandPathLen = commandPathLen } nameLen := len(command.Name()) if nameLen > c.commandsMaxNameLen { c.commandsMaxNameLen = nameLen } } } // Print is a convenience method to Print to the defined output, fallback to Stderr if not set. 
func (c *Command) Print(i ...interface{}) { fmt.Fprint(c.OutOrStderr(), i...) } // Println is a convenience method to Println to the defined output, fallback to Stderr if not set. func (c *Command) Println(i ...interface{}) { c.Print(fmt.Sprintln(i...)) } // Printf is a convenience method to Printf to the defined output, fallback to Stderr if not set. func (c *Command) Printf(format string, i ...interface{}) { c.Print(fmt.Sprintf(format, i...)) } // PrintErr is a convenience method to Print to the defined Err output, fallback to Stderr if not set. func (c *Command) PrintErr(i ...interface{}) { fmt.Fprint(c.ErrOrStderr(), i...) } // PrintErrln is a convenience method to Println to the defined Err output, fallback to Stderr if not set. func (c *Command) PrintErrln(i ...interface{}) { c.Print(fmt.Sprintln(i...)) } // PrintErrf is a convenience method to Printf to the defined Err output, fallback to Stderr if not set. func (c *Command) PrintErrf(format string, i ...interface{}) { c.Print(fmt.Sprintf(format, i...)) } // CommandPath returns the full path to this command. func (c *Command) CommandPath() string { if c.HasParent() { return c.Parent().CommandPath() + " " + c.Name() } return c.Name() } // UseLine puts out the full usage for a given command (including parents). func (c *Command) UseLine() string { var useline string if c.HasParent() { useline = c.parent.CommandPath() + " " + c.Use } else { useline = c.Use } if c.DisableFlagsInUseLine { return useline } if c.HasAvailableFlags() && !strings.Contains(useline, "[flags]") { useline += " [flags]" } return useline } // DebugFlags used to determine which flags have been assigned to which commands // and which persist. 
func (c *Command) DebugFlags() { c.Println("DebugFlags called on", c.Name()) var debugflags func(*Command) debugflags = func(x *Command) { if x.HasFlags() || x.HasPersistentFlags() { c.Println(x.Name()) } if x.HasFlags() { x.flags.VisitAll(func(f *flag.Flag) { if x.HasPersistentFlags() && x.persistentFlag(f.Name) != nil { c.Println(" -"+f.Shorthand+",", "--"+f.Name, "["+f.DefValue+"]", "", f.Value, " [LP]") } else { c.Println(" -"+f.Shorthand+",", "--"+f.Name, "["+f.DefValue+"]", "", f.Value, " [L]") } }) } if x.HasPersistentFlags() { x.pflags.VisitAll(func(f *flag.Flag) { if x.HasFlags() { if x.flags.Lookup(f.Name) == nil { c.Println(" -"+f.Shorthand+",", "--"+f.Name, "["+f.DefValue+"]", "", f.Value, " [P]") } } else { c.Println(" -"+f.Shorthand+",", "--"+f.Name, "["+f.DefValue+"]", "", f.Value, " [P]") } }) } c.Println(x.flagErrorBuf) if x.HasSubCommands() { for _, y := range x.commands { debugflags(y) } } } debugflags(c) } // Name returns the command's name: the first word in the use line. func (c *Command) Name() string { name := c.Use i := strings.Index(name, " ") if i >= 0 { name = name[:i] } return name } // HasAlias determines if a given string is an alias of the command. func (c *Command) HasAlias(s string) bool { for _, a := range c.Aliases { if a == s { return true } } return false } // CalledAs returns the command name or alias that was used to invoke // this command or an empty string if the command has not been called. 
func (c *Command) CalledAs() string { if c.commandCalledAs.called { return c.commandCalledAs.name } return "" } // hasNameOrAliasPrefix returns true if the Name or any of aliases start // with prefix func (c *Command) hasNameOrAliasPrefix(prefix string) bool { if strings.HasPrefix(c.Name(), prefix) { c.commandCalledAs.name = c.Name() return true } for _, alias := range c.Aliases { if strings.HasPrefix(alias, prefix) { c.commandCalledAs.name = alias return true } } return false } // NameAndAliases returns a list of the command name and all aliases func (c *Command) NameAndAliases() string { return strings.Join(append([]string{c.Name()}, c.Aliases...), ", ") } // HasExample determines if the command has example. func (c *Command) HasExample() bool { return len(c.Example) > 0 } // Runnable determines if the command is itself runnable. func (c *Command) Runnable() bool { return c.Run != nil || c.RunE != nil } // HasSubCommands determines if the command has children commands. func (c *Command) HasSubCommands() bool { return len(c.commands) > 0 } // IsAvailableCommand determines if a command is available as a non-help command // (this includes all non deprecated/hidden commands). func (c *Command) IsAvailableCommand() bool { if len(c.Deprecated) != 0 || c.Hidden { return false } if c.HasParent() && c.Parent().helpCommand == c { return false } if c.Runnable() || c.HasAvailableSubCommands() { return true } return false } // IsAdditionalHelpTopicCommand determines if a command is an additional // help topic command; additional help topic command is determined by the // fact that it is NOT runnable/hidden/deprecated, and has no sub commands that // are runnable/hidden/deprecated. // Concrete example: path_to_url#issuecomment-282741924. 
func (c *Command) IsAdditionalHelpTopicCommand() bool { // if a command is runnable, deprecated, or hidden it is not a 'help' command if c.Runnable() || len(c.Deprecated) != 0 || c.Hidden { return false } // if any non-help sub commands are found, the command is not a 'help' command for _, sub := range c.commands { if !sub.IsAdditionalHelpTopicCommand() { return false } } // the command either has no sub commands, or no non-help sub commands return true } // HasHelpSubCommands determines if a command has any available 'help' sub commands // that need to be shown in the usage/help default template under 'additional help // topics'. func (c *Command) HasHelpSubCommands() bool { // return true on the first found available 'help' sub command for _, sub := range c.commands { if sub.IsAdditionalHelpTopicCommand() { return true } } // the command either has no sub commands, or no available 'help' sub commands return false } // HasAvailableSubCommands determines if a command has available sub commands that // need to be shown in the usage/help default template under 'available commands'. func (c *Command) HasAvailableSubCommands() bool { // return true on the first found available (non deprecated/help/hidden) // sub command for _, sub := range c.commands { if sub.IsAvailableCommand() { return true } } // the command either has no sub commands, or no available (non deprecated/help/hidden) // sub commands return false } // HasParent determines if the command is a child command. func (c *Command) HasParent() bool { return c.parent != nil } // GlobalNormalizationFunc returns the global normalization function or nil if it doesn't exist. func (c *Command) GlobalNormalizationFunc() func(f *flag.FlagSet, name string) flag.NormalizedName { return c.globNormFunc } // Flags returns the complete FlagSet that applies // to this command (local and persistent declared here and by all parents). 
func (c *Command) Flags() *flag.FlagSet { if c.flags == nil { c.flags = flag.NewFlagSet(c.Name(), flag.ContinueOnError) if c.flagErrorBuf == nil { c.flagErrorBuf = new(bytes.Buffer) } c.flags.SetOutput(c.flagErrorBuf) } return c.flags } // LocalNonPersistentFlags are flags specific to this command which will NOT persist to subcommands. func (c *Command) LocalNonPersistentFlags() *flag.FlagSet { persistentFlags := c.PersistentFlags() out := flag.NewFlagSet(c.Name(), flag.ContinueOnError) c.LocalFlags().VisitAll(func(f *flag.Flag) { if persistentFlags.Lookup(f.Name) == nil { out.AddFlag(f) } }) return out } // LocalFlags returns the local FlagSet specifically set in the current command. func (c *Command) LocalFlags() *flag.FlagSet { c.mergePersistentFlags() if c.lflags == nil { c.lflags = flag.NewFlagSet(c.Name(), flag.ContinueOnError) if c.flagErrorBuf == nil { c.flagErrorBuf = new(bytes.Buffer) } c.lflags.SetOutput(c.flagErrorBuf) } c.lflags.SortFlags = c.Flags().SortFlags if c.globNormFunc != nil { c.lflags.SetNormalizeFunc(c.globNormFunc) } addToLocal := func(f *flag.Flag) { if c.lflags.Lookup(f.Name) == nil && c.parentsPflags.Lookup(f.Name) == nil { c.lflags.AddFlag(f) } } c.Flags().VisitAll(addToLocal) c.PersistentFlags().VisitAll(addToLocal) return c.lflags } // InheritedFlags returns all flags which were inherited from parent commands. func (c *Command) InheritedFlags() *flag.FlagSet { c.mergePersistentFlags() if c.iflags == nil { c.iflags = flag.NewFlagSet(c.Name(), flag.ContinueOnError) if c.flagErrorBuf == nil { c.flagErrorBuf = new(bytes.Buffer) } c.iflags.SetOutput(c.flagErrorBuf) } local := c.LocalFlags() if c.globNormFunc != nil { c.iflags.SetNormalizeFunc(c.globNormFunc) } c.parentsPflags.VisitAll(func(f *flag.Flag) { if c.iflags.Lookup(f.Name) == nil && local.Lookup(f.Name) == nil { c.iflags.AddFlag(f) } }) return c.iflags } // NonInheritedFlags returns all flags which were not inherited from parent commands. 
func (c *Command) NonInheritedFlags() *flag.FlagSet { return c.LocalFlags() } // PersistentFlags returns the persistent FlagSet specifically set in the current command. func (c *Command) PersistentFlags() *flag.FlagSet { if c.pflags == nil { c.pflags = flag.NewFlagSet(c.Name(), flag.ContinueOnError) if c.flagErrorBuf == nil { c.flagErrorBuf = new(bytes.Buffer) } c.pflags.SetOutput(c.flagErrorBuf) } return c.pflags } // ResetFlags deletes all flags from command. func (c *Command) ResetFlags() { c.flagErrorBuf = new(bytes.Buffer) c.flagErrorBuf.Reset() c.flags = flag.NewFlagSet(c.Name(), flag.ContinueOnError) c.flags.SetOutput(c.flagErrorBuf) c.pflags = flag.NewFlagSet(c.Name(), flag.ContinueOnError) c.pflags.SetOutput(c.flagErrorBuf) c.lflags = nil c.iflags = nil c.parentsPflags = nil } // HasFlags checks if the command contains any flags (local plus persistent from the entire structure). func (c *Command) HasFlags() bool { return c.Flags().HasFlags() } // HasPersistentFlags checks if the command contains persistent flags. func (c *Command) HasPersistentFlags() bool { return c.PersistentFlags().HasFlags() } // HasLocalFlags checks if the command has flags specifically declared locally. func (c *Command) HasLocalFlags() bool { return c.LocalFlags().HasFlags() } // HasInheritedFlags checks if the command has flags inherited from its parent command. func (c *Command) HasInheritedFlags() bool { return c.InheritedFlags().HasFlags() } // HasAvailableFlags checks if the command contains any flags (local plus persistent from the entire // structure) which are not hidden or deprecated. func (c *Command) HasAvailableFlags() bool { return c.Flags().HasAvailableFlags() } // HasAvailablePersistentFlags checks if the command contains persistent flags which are not hidden or deprecated. 
func (c *Command) HasAvailablePersistentFlags() bool { return c.PersistentFlags().HasAvailableFlags() } // HasAvailableLocalFlags checks if the command has flags specifically declared locally which are not hidden // or deprecated. func (c *Command) HasAvailableLocalFlags() bool { return c.LocalFlags().HasAvailableFlags() } // HasAvailableInheritedFlags checks if the command has flags inherited from its parent command which are // not hidden or deprecated. func (c *Command) HasAvailableInheritedFlags() bool { return c.InheritedFlags().HasAvailableFlags() } // Flag climbs up the command tree looking for matching flag. func (c *Command) Flag(name string) (flag *flag.Flag) { flag = c.Flags().Lookup(name) if flag == nil { flag = c.persistentFlag(name) } return } // Recursively find matching persistent flag. func (c *Command) persistentFlag(name string) (flag *flag.Flag) { if c.HasPersistentFlags() { flag = c.PersistentFlags().Lookup(name) } if flag == nil { c.updateParentsPflags() flag = c.parentsPflags.Lookup(name) } return } // ParseFlags parses persistent flag tree and local flags. func (c *Command) ParseFlags(args []string) error { if c.DisableFlagParsing { return nil } if c.flagErrorBuf == nil { c.flagErrorBuf = new(bytes.Buffer) } beforeErrorBufLen := c.flagErrorBuf.Len() c.mergePersistentFlags() // do it here after merging all flags and just before parse c.Flags().ParseErrorsWhitelist = flag.ParseErrorsWhitelist(c.FParseErrWhitelist) err := c.Flags().Parse(args) // Print warnings if they occurred (e.g. deprecated flag messages). if c.flagErrorBuf.Len()-beforeErrorBufLen > 0 && err == nil { c.Print(c.flagErrorBuf.String()) } return err } // Parent returns a commands parent command. func (c *Command) Parent() *Command { return c.parent } // mergePersistentFlags merges c.PersistentFlags() to c.Flags() // and adds missing persistent flags of all parents. 
func (c *Command) mergePersistentFlags() { c.updateParentsPflags() c.Flags().AddFlagSet(c.PersistentFlags()) c.Flags().AddFlagSet(c.parentsPflags) } // updateParentsPflags updates c.parentsPflags by adding // new persistent flags of all parents. // If c.parentsPflags == nil, it makes new. func (c *Command) updateParentsPflags() { if c.parentsPflags == nil { c.parentsPflags = flag.NewFlagSet(c.Name(), flag.ContinueOnError) c.parentsPflags.SetOutput(c.flagErrorBuf) c.parentsPflags.SortFlags = false } if c.globNormFunc != nil { c.parentsPflags.SetNormalizeFunc(c.globNormFunc) } c.Root().PersistentFlags().AddFlagSet(flag.CommandLine) c.VisitParents(func(parent *Command) { c.parentsPflags.AddFlagSet(parent.PersistentFlags()) }) } ```
Zlatolist () is a village in southern Bulgaria, located in the Krumovgrad municipality of the Kardzhali Province. It is situated in the Eastern Rhodopes on the banks of the Krumovitsa River. The majority of its population consists of ethnic Turks. Zlatolist Hill on Trinity Peninsula in Antarctica is named after the village.
```ts
/**
 * @file App entry
 * @module app/main
 * @author Surmon <path_to_url
 */

import helmet from 'helmet'
import passport from 'passport'
import bodyParser from 'body-parser'
import cookieParser from 'cookie-parser'
import compression from 'compression'
import { NestFactory } from '@nestjs/core'
import { AppModule } from '@app/app.module'
import { HttpExceptionFilter } from '@app/filters/error.filter'
import { TransformInterceptor } from '@app/interceptors/transform.interceptor'
import { LoggingInterceptor } from '@app/interceptors/logging.interceptor'
import { ErrorInterceptor } from '@app/interceptors/error.interceptor'
import { environment, isProdEnv } from '@app/app.environment'
import logger from '@app/utils/logger'
import * as APP_CONFIG from '@app/app.config'

async function bootstrap() {
  // MARK: keep logger enabled on dev env
  const app = await NestFactory.create(AppModule, isProdEnv ? { logger: false } : {})
  app.use(helmet())
  app.use(compression())
  app.use(cookieParser())
  app.use(bodyParser.json({ limit: '1mb' }))
  app.use(bodyParser.urlencoded({ extended: true }))
  // MARK: Beware of upgrades!
  // path_to_url#changed
  app.use(passport.initialize())
  app.useGlobalFilters(new HttpExceptionFilter())
  app.useGlobalInterceptors(new TransformInterceptor(), new ErrorInterceptor(), new LoggingInterceptor())
  // path_to_url#issuecomment-403212561
  // path_to_url
  // MARK: can't be used!
  // useContainer(app.select(AppModule), { fallbackOnErrors: true, fallback: true })
  return await app.listen(APP_CONFIG.APP.PORT)
}

bootstrap().then(() => {
  logger.success(
    `${APP_CONFIG.APP.NAME} app is running!`,
    `| env: ${environment}`,
    `| port: ${APP_CONFIG.APP.PORT}`,
    `| ${new Date().toLocaleString()}`
  )
})
```
```typescript
import { useMemo } from 'react'
import { Theme } from '@nivo/core'
import { useTheme } from 'styled-components'

export const useAxisTheme = (): Theme => {
  const theme = useTheme()

  const nivoTheme: Theme = useMemo(() => {
    return {
      ...theme.nivo,
      axis: {
        ...theme.nivo.axis,
        domain: {
          ...theme.nivo.axis!.domain,
          line: {
            ...theme.nivo.axis!.domain!.line,
            strokeWidth: 1,
          },
        },
        legend: {
          ...theme.nivo.axis!.legend,
          text: {
            ...theme.nivo.axis!.legend!.text,
            fill: theme.colors.accent,
          },
        },
      },
    }
  }, [theme])

  return nivoTheme
}
```
The Piazza della Loggia bombing was a terrorist bombing that took place on the morning of 28 May 1974 in Brescia, Italy, during an anti-fascist protest. The attack killed eight people and wounded 102. The bomb had been placed inside a rubbish bin at the east end of the square. In 2015, the Court of Appeal in Milan issued a final life sentence to Ordine Nuovo members Carlo Maria Maggi and Maurizio Tramonte for ordering the bombing, closing one of the longest-running terrorism cases of Italy's Years of Lead. Overview The first judicial investigation led to the conviction in 1979 of a member of the Brescian far right, Ermanno Buzzi. This sentence was annulled in 1982, and in 1983 the Court of Cassation ordered that the appeal be retried at the Court of Appeal in Venice. The suspects were acquitted in 1985 by the Venice court, and in 1987 by the Supreme Court of Cassation. Buzzi, killed in Novara's prison by neo-fascists Pierluigi Concutelli and Mario Tuti in 1981, was judged "a corpse to be carried out". A second investigation led to the indictment of other far-right activists: Cesare Ferri, Sergio Latini and Alessandro Stepanoff. They were acquitted for lack of evidence in 1987; in 1989 Ferri and Latini were acquitted of having committed the crime, and Stepanoff for lack of evidence. A third investigation led to a trial of Francesco Delfino (a Carabiniere), Carlo Maria Maggi, Pino Rauti, Maurizio Tramonte and Delfo Zorzi (members of the neo-fascist group Ordine Nuovo). Tramonte's presence at the scene at the time of the blast was confirmed in 2008 by the results of a forensic anthropological analysis of a photograph taken that day. On 16 November 2010, the Court of Brescia acquitted the defendants (the prosecutor had requested life imprisonment for Delfino, Maggi, Tramonte and Zorzi, and acquittal for lack of evidence for Pino Rauti).
The four defendants were acquitted again in 2012, but in 2014 the Supreme Court of Cassation ordered that the appeal be retried at the Court of Appeal in Milan for Maggi and Tramonte; Delfino and Zorzi were definitively acquitted. On 22 July 2015, the Court of Appeal sentenced Maggi and Tramonte to life imprisonment for ordering and coordinating the massacre. On 21 June 2017, Tramonte was arrested in Portugal by the Portuguese police at the Sanctuary of Fátima, while allegedly praying for forgiveness. Victims Giulietta Banzi Bazoli (34, teacher) Livia Bottardi Milani (32, teacher) Euplo Natali (69, retired, former partisan) Luigi Pinto (25, teacher) Bartolomeo Talenti (56, labourer) Alberto Trebeschi (37, teacher) Clementina Calzari Trebeschi (31, teacher) Vittorio Zambarda (60, labourer) 2000 report alleging US intelligence involvement A 2000 parliamentary report by the Olive Tree coalition claimed "that US intelligence agents were informed in advance about several right-wing terrorist bombings, including the December 1969 Piazza Fontana bombing in Milan and the Piazza della Loggia bombing in Brescia five years later, but did nothing to alert the Italian authorities or to prevent the attacks from taking place". It also alleged that Pino Rauti [current leader of the Social Idea Movement], a journalist and founder of the far-right Ordine Nuovo subversive organisation, received regular funding from a press officer at the US embassy in Rome: "So even before the 'stabilising' plans that Atlantic circles had prepared for Italy became operational through the bombings, one of the leading members of the subversive right was literally in the pay of the American embassy in Rome," the report said.
See also Terrorism in the European Union La notte della Repubblica (TV programme) Anni di piombo History of Italy (1970s-1980s) Notes Years of Lead (Italy) Mass murder in 1974 1974 crimes in Italy Improvised explosive device bombings in Italy Neo-fascist attacks in Italy Brescia 1974 disasters in Italy Terrorist incidents in Europe in 1974 Terrorist incidents in Italy in the 1970s
Pilot Officer Geoffrey Lloyd Wells Memorial Seat is a heritage-listed memorial at 103 Stanley Terrace, Taringa, City of Brisbane, Queensland, Australia. It was designed and built by the Wells family some time after August 1941 in honour of their son, Pilot Officer Geoffrey Lloyd Wells. It was added to the Queensland Heritage Register on 21 October 1992. History Geoffrey Lloyd Wells was born on 26 June 1915 at Wallingford, Taringa, the son of Henry Charles Wells and his wife Elsie Irene (née Lloyd). During World War II he enlisted in Brisbane in the Royal Australian Air Force. He was shot down in a Wellington X9700 over Roggel, the Netherlands, on 17 August 1941 while on active service. He was buried in the Jonkerbos War Cemetery near the town of Nijmegen, in a woodland area known as Jonkers Bosch. This stone seat, set into a stone fence along Stanley Terrace, was constructed by his family as a memorial to him. The seat was erected in the corner of their property, as a memorial and for public use. Description The seat is of rustic design, constructed of slabs of Brisbane tuff. It is set into the front garden wall, also of tuff, fronting Stanley Terrace. The back of the seat is butterfly-shaped and has a centrally placed metal plaque. In addition to the details of Wells' death, the plaque bears the quote from Winston Churchill: "never has so much been owed by so many to so few". Timber slats forming the seat have been reconstructed. The style of the fence above the surrounding wall has changed since the seat's construction. However, together with the adjacent shady jacaranda tree, the fence and seat combine to create a picturesque resting spot for passing pedestrians. Heritage listing Pilot Officer Geoffrey Lloyd Wells Memorial Seat was listed on the Queensland Heritage Register on 21 October 1992, having satisfied the following criteria.
The place is important in demonstrating the evolution or pattern of Queensland's history. It is illustrative of the sustained tradition of memorialising the war dead, which had been established in Queensland at the turn of the 20th century with the South African War of 1899–1902, and popularised during and immediately following the First World War (1914–1918). The place demonstrates rare, uncommon or endangered aspects of Queensland's cultural heritage. The Pilot Officer Geoffrey Lloyd Wells Memorial Seat at Taringa is a rare private memorial erected for public benefit. The place is important in demonstrating the principal characteristics of a particular class of cultural places. It is illustrative of the sustained tradition of memorialising the war dead, which had been established in Queensland at the turn of the 20th century with the South African War of 1899–1902, and popularised during and immediately following the First World War (1914–1918). The place is important because of its aesthetic significance. It makes an original and attractive contribution to the streetscape, and is significant as an unusual form of war memorial. References Attribution External links Queensland Heritage Register Taringa, Queensland Articles incorporating text from the Queensland Heritage Register World War II memorials in Queensland
```kotlin package mega.privacy.android.app.constants class IntentConstants { // Intent's Extras companion object { const val EXTRA_ACCOUNT_TYPE = "EXTRA_ACCOUNT_TYPE" const val EXTRA_ASK_PERMISSIONS = "EXTRA_ASK_PERMISSIONS" const val EXTRA_FIRST_LOGIN = "EXTRA_FIRST_LOGIN" const val EXTRA_NEW_ACCOUNT = "EXTRA_NEW_ACCOUNT" const val EXTRA_UPGRADE_ACCOUNT = "EXTRA_UPGRADE_ACCOUNT" const val EXTRA_MASTER_KEY = "EXTRA_MASTER_KEY" const val ACTION_OPEN_ACHIEVEMENTS = "ACTION_OPEN_ACHIEVEMENTS" } } ```
Aksaklar can refer to: Aksaklar, Göynük Aksaklar, Yığılca
```python import demistomock as demisto # noqa: F401 from CommonServerPython import * # noqa: F401 def add_comment(args: Dict[str, Any]) -> CommandResults: demisto.debug("adding comment") tags = argToList(args.get('tags', 'FROM XSOAR')) comment_body = args.get('comment', '') return CommandResults( readable_output=comment_body, mark_as_note=True, tags=tags ) def main(): # pragma: no cover try: demisto.debug('SplunkAddComment is being called') res = add_comment(demisto.args()) return_results(res) except Exception as ex: return_error(f'Failed to execute SplunkAddComment. Error: {str(ex)}') if __name__ in ["__builtin__", "builtins", '__main__']: main() ```
Daniel Blue may refer to: Daniel Blue (church administrator) (1796–1884), founder of the Saint Andrews African Methodist Episcopal Church in Sacramento, California Daniel T. Blue, Jr. (born 1949), Democratic Representative for North Carolina Danny Blue (comedian) (born 1949), British comedian Danny Blue (Hustle), fictional character from TV series Hustle
The Restitution of Decayed Intelligence is a 10" vinyl by Coil which heavily references samples from The Remote Viewer. It is named in reference to A Restitution of Decayed Intelligence, a book written in 1605 by Richard Verstegan. Release history The vinyl was released in four versions as limited editions: 450 numbered copies were pressed on black vinyl. 50 numbered copies were pressed on black/white marbled vinyl for series subscribers. 100 'artist copies' were pressed on black/white marbled vinyl. 6 copies were released in a wooden box with a test pressing along with both black and marbled vinyl. The CD (with bonus tracks) was released by Threshold Archives. In addition to John Balance and Peter Christopherson, Thighpaulsandra is credited in this recording. "The Restitution of Decayed Intelligence II (Extract)" appears on the compilation CD The Lactamase 10" Sampler with a track length of 5:01. An mp3 bearing the same name was available for download on Beta-lactam Ring Records website for a limited time, with a track length of 9:02. The song "Bad Message" from the same session can be found on the compilation Lactamase Bonus Compilation. Track listing Side A: "The Restitution of Decayed Intelligence I" – 16:06 Side B: "The Restitution of Decayed Intelligence II" – 12:01 References External links The Restitution of Decayed Intelligence at Brainwashed Coil (band) EPs Coil (band) songs 2003 EPs 2003 songs
Błękitni Stargard is a Polish association football club from Stargard. The men's team currently plays in the fourth-tier III liga, following their 2020–21 relegation from the II liga; the reserve men's team plays in the fifth division and the women's team in the fourth division. There are also 11 different youth teams. It was formerly a multi-sports club. History On 18 May 1945, on the initiative of the athlete Tadeusz Świniarski, a participant in the 1946 European Athletics Championships, Błękitni was founded as the first Polish sports club in Western Pomerania. The football, boxing and athletics sections were all officially launched in 1945. Two years later, a volleyball section was added. Table tennis and swimming sections followed in 1948 and 1949, respectively. League history In 1980–81, Błękitni finished second behind Gryf Słupsk at the third tier and won promotion to the II liga. During the 1981–82 season, the team played in the second division, where they finished 15th out of 16 teams and were relegated back to the third tier. The club played the next 16 seasons in the III liga, managing to finish second on four separate occasions; however, none of these finishes earned them promotion. In 1998, the club was relegated from the third tier, having finished 10th. Błękitni spent the next two seasons at the fourth tier: in 1998–99 they were denied promotion by Kotwica Kołobrzeg, but they won their group the next season. However, they did not manage to keep their spot at the third tier and returned to the fourth level one year later. After two promotions in succession, in 2002–03, the club returned to the second division but was withdrawn at the halfway stage of the season, their results annulled. After being withdrawn from the second tier, Błękitni joined the IV liga, the fourth tier of Polish association football, in 2004 and played for the next nine seasons at that level.
In 2013, the club won promotion to the third division and has played at the third tier since. In the Polish Cup The club had an unprecedented cup run during the 2014–15 season, reaching the semifinals of the Polish Cup while playing in the third division. In the first round, Błękitni beat lower-tier Małapanew Ozimek 6–1. In the second round, the club eliminated its first higher-level opponent, beating Pogoń Siedlce 3–1, and went on to beat Chojniczanka 1–0 the following round; both of these opponents were playing in the second division at the time. In the round of 32, Błękitni eliminated a fourth-division team, Gryf Wejherowo, 2–1. The following round, Błękitni beat another second-tier team, GKS Tychy, 3–2 and advanced to the quarterfinals, which were held in the spring of 2015. Then, having achieved two surprise 2–0 wins, home and away, against Ekstraklasa side Cracovia in the quarterfinals, the team progressed to the semifinals, where they faced Lech Poznań. While Błękitni did win 3–1 against Lech at home, they were eventually knocked out in the return leg after losing 1–3 in regular time; in extra time, Lech scored two more goals to win the tie 6–4 on aggregate. Achievements Association football (men's) 15th place in Second Division: 1981–82 Semifinal of the Polish Cup: 2014–15 Table tennis Second division: 1957, 1990–2000 Current squad Notes References External links Official Club website Unofficial website 90minut.pl profile Association football clubs established in 1945 1945 establishments in Poland Football clubs in West Pomeranian Voivodeship
The thermal history of Earth involves the study of the cooling history of Earth's interior. It is a sub-field of geophysics. (Thermal histories are also computed for the internal cooling of other planetary and stellar bodies.) The study of the thermal evolution of Earth's interior is uncertain and controversial in all aspects, from the interpretation of petrologic observations used to infer the temperature of the interior, to the fluid dynamics responsible for heat loss, to the material properties that determine the efficiency of heat transport. Overview Observations that can be used to infer the temperature of Earth's interior range from the oldest rocks on Earth to modern seismic images of the inner core size. Ancient volcanic rocks can be associated with a depth and temperature of melting through their geochemical composition. Using this technique, and some geological inferences about the conditions under which the rock is preserved, the temperature of the mantle can be inferred. The mantle itself is fully convective, so that the temperature in the mantle is essentially constant with depth outside the top and bottom thermal boundary layers. This is not quite true, because the temperature in any convective body under pressure must increase along an adiabat, but the adiabatic temperature gradient is usually much smaller than the temperature jumps at the boundaries. Therefore, the mantle is usually characterized by a single potential temperature: the mid-mantle temperature extrapolated along the adiabat to the surface. The potential temperature of the mantle is estimated to be about 1350 °C today. There is an analogous potential temperature for the core, but since there are no samples from the core, its present-day temperature relies on extrapolating the temperature along an adiabat from the inner core boundary, where the iron solidus is somewhat constrained.
Thermodynamics The simplest mathematical formulation of the thermal history of Earth's interior involves the time evolution of the mid-mantle and mid-core temperatures, T_m and T_c. To derive these equations one must first write the energy balance for the mantle and the core separately. They are, for the mantle,

Q_surf = Q_sec,m + Q_rad + Q_cmb,

and for the core,

Q_cmb = Q_sec,c + Q_L + Q_G.

Q_surf is the surface heat flow [W] at the surface of the Earth (and mantle), Q_sec,m = -M_m c_m (dT_m/dt) is the secular cooling heat from the mantle, and M_m, c_m, and T_m are the mass, specific heat, and temperature of the mantle. Q_rad is the radiogenic heat production in the mantle and Q_cmb is the heat flow from the core-mantle boundary. Q_sec,c = -M_c c_c (dT_c/dt) is the secular cooling heat from the core, and Q_L and Q_G are the latent and gravitational heat flow from the inner core boundary due to the solidification of iron. Solving for dT_m/dt and dT_c/dt gives

dT_m/dt = (Q_rad + Q_cmb - Q_surf) / (M_m c_m),

and

dT_c/dt = (Q_L + Q_G - Q_cmb) / (M_c c_c).

Thermal Catastrophe In 1862, Lord Kelvin calculated the age of the Earth at between 20 million and 400 million years by assuming that Earth had formed as a completely molten object and determining the amount of time it would take for the near-surface to cool to its present temperature. Since uniformitarianism required a much older Earth, there was a contradiction. Eventually, additional heat sources within the Earth were discovered, allowing for a much older age. This section is about a similar paradox in current geology, called the thermal catastrophe. The thermal catastrophe of the Earth can be demonstrated by solving the above equations for the evolution of the mantle. The catastrophe is defined as the point when the mean mantle temperature exceeds the mantle solidus, so that the entire mantle melts. Using the geochemically preferred Urey ratio and the geodynamically preferred cooling exponent, the mantle temperature reaches the mantle solidus (i.e. a catastrophe) in 1–2 Ga. This result is clearly unacceptable, because geologic evidence for a solid mantle exists as far back as 4 Ga (and possibly further).
Hence, the thermal catastrophe problem is the foremost paradox in the thermal history of the Earth. New Core Paradox The "New Core Paradox" posits that new upward revisions to the empirically measured thermal conductivity of iron at the pressure and temperature conditions of Earth's core imply that the dynamo is thermally stratified at present, driven solely by compositional convection associated with the solidification of the inner core. However, widespread paleomagnetic evidence for a geodynamo older than the likely age of the inner core (~1 Gyr) creates a paradox as to what powered the geodynamo prior to inner core nucleation. Recently it has been proposed that a higher core cooling rate and a lower mantle cooling rate can resolve the paradox in part; however, the paradox remains unresolved. Two additional constraints have recently been proposed. Numerical simulations of the material properties of iron at high pressure and temperature claim an upper limit of 105 W/m/K on the thermal conductivity. This downward revision to the conductivity partially alleviates the issues of the New Core Paradox by lowering the adiabatic core heat flow required to keep the core thermally convective. However, this study was later retracted by its authors, who stated that their calculations were in error by a factor of two due to neglecting spin degeneracy. That would halve the electron-electron resistivity, supporting earlier estimates of high iron conductivity. Also, recent geochemical experiments have led to the proposal that radiogenic heat in the core is larger than previously thought. This revision, if true, would also alleviate issues with the core heat budget by providing an additional energy source back in time. See also Earth's inner core Earth's magnetic field Earth's structure Geologic temperature record List of periods and events in climate history Paleothermometer Radiative forcing Timeline of glaciation References Further reading Geophysics Heat transfer
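The coupled mantle/core energy balances from the Thermodynamics section above can be sketched numerically. The snippet below is a minimal forward-Euler integration; all parameter values and the simple parameterizations of Q_surf and Q_cmb are illustrative assumptions for demonstration, not a calibrated Earth model.

```python
# Forward-Euler sketch of the coupled mantle/core energy balances:
#   dT_m/dt = (Q_rad + Q_cmb - Q_surf) / (M_m c_m)
#   dT_c/dt = (Q_L + Q_G - Q_cmb) / (M_c c_c)
# All numbers and heat-flow parameterizations are illustrative assumptions.

SECONDS_PER_GYR = 3.15e16

M_M, C_M = 4.0e24, 1250.0  # mantle mass [kg] and specific heat [J/kg/K]
M_C, C_C = 1.9e24, 800.0   # core mass [kg] and specific heat [J/kg/K]


def q_surf(t_m):
    """Surface heat flow [W], crudely scaled with mantle temperature."""
    return 36e12 * (t_m / 1600.0)


def q_cmb(t_m, t_c):
    """Core-mantle boundary heat flow [W], driven by the T_c - T_m contrast."""
    return 10e12 * (t_c - t_m) / 2500.0


def evolve(t_m=1600.0, t_c=4100.0, q_rad=13e12, q_lg=3e12,
           dt_gyr=0.01, t_end_gyr=1.0):
    """Integrate dT_m/dt and dT_c/dt forward in time; temperatures in K."""
    dt = dt_gyr * SECONDS_PER_GYR
    for _ in range(round(t_end_gyr / dt_gyr)):
        qs, qc = q_surf(t_m), q_cmb(t_m, t_c)
        t_m += dt * (q_rad + qc - qs) / (M_M * C_M)  # mantle balance
        t_c += dt * (q_lg - qc) / (M_C * C_C)        # core balance
    return t_m, t_c


t_m, t_c = evolve()
print(f"after 1 Gyr: T_m = {t_m:.0f} K, T_c = {t_c:.0f} K")
```

With these (assumed) values the surface heat flow exceeds radiogenic plus core heating, so both temperatures decline; running the same balances backward in time is how thermal-catastrophe arguments of the kind discussed above are constructed.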
This is a list of tunnels documented by the Historic American Engineering Record in the U.S. state of West Virginia. Tunnels See also List of bridges documented by the Historic American Engineering Record in West Virginia References Tunnels Tunnels West Virginia
```batchfile "C:\Program Files\Unity\Editor\Unity.exe" -batchmode -quit -nographics -executeMethod "Batchmode.AndroidBuildTaiGuoWithGM" -logFile "d:\buildLog.txt" -projectPath "E:\HL_1.0.10Thailand" ```
Jorja is a feminine given name related to Georgia which may refer to: Jorja Chalmers, Australian saxophone and keyboard player Jorja Douglas, winner of Got What It Takes? (series 2), a British children's talent show and member of FLO Jorja Fleezanis, American violinist Jorja Fox (born 1968), American actress and producer Jorja Leap, American anthropologist, author, and social worker Jorja Monatella, a character in the Dean Koontz novel Strangers Jorja Smith (born 1997), English singer-songwriter See also Jorgjia Filçe-Truja (1907–1994), Albanian soprano Feminine given names
Port Albert was the original terminus station on the South Gippsland railway line, the railway opening to that station on 13 January 1892. Early survey documents regard the station site as "Palmerston" station ground. The last advertised trolley service ran on 14 February 1949. References Disused railway stations in Victoria (state) Transport in Gippsland (region) Shire of Wellington
```sqlpl
update ACT_GE_PROPERTY set VALUE_ = '6.7.2.0' where NAME_ = 'common.schema.version';
update ACT_GE_PROPERTY set VALUE_ = '6.7.2.0' where NAME_ = 'entitylink.schema.version';
update ACT_GE_PROPERTY set VALUE_ = '6.7.2.0' where NAME_ = 'identitylink.schema.version';
update ACT_GE_PROPERTY set VALUE_ = '6.7.2.0' where NAME_ = 'job.schema.version';
update ACT_GE_PROPERTY set VALUE_ = '6.7.2.0' where NAME_ = 'batch.schema.version';
update ACT_GE_PROPERTY set VALUE_ = '6.7.2.0' where NAME_ = 'task.schema.version';
update ACT_GE_PROPERTY set VALUE_ = '6.7.2.0' where NAME_ = 'variable.schema.version';
update ACT_GE_PROPERTY set VALUE_ = '6.7.2.0' where NAME_ = 'eventsubscription.schema.version';
update ACT_GE_PROPERTY set VALUE_ = '6.7.2.0' where NAME_ = 'schema.version';
update ACT_ID_PROPERTY set VALUE_ = '6.7.2.0' where NAME_ = 'schema.version';

UPDATE ACT_APP_DATABASECHANGELOGLOCK SET `LOCKED` = 1, LOCKEDBY = '192.168.68.111 (192.168.68.111)', LOCKGRANTED = '2021-12-28 10:30:20.524' WHERE ID = 1 AND `LOCKED` = 0;
UPDATE ACT_APP_DATABASECHANGELOGLOCK SET `LOCKED` = 0, LOCKEDBY = NULL, LOCKGRANTED = NULL WHERE ID = 1;
UPDATE ACT_CMMN_DATABASECHANGELOGLOCK SET `LOCKED` = 1, LOCKEDBY = '192.168.68.111 (192.168.68.111)', LOCKGRANTED = '2021-12-28 10:30:21.066' WHERE ID = 1 AND `LOCKED` = 0;
UPDATE ACT_CMMN_DATABASECHANGELOGLOCK SET `LOCKED` = 0, LOCKEDBY = NULL, LOCKGRANTED = NULL WHERE ID = 1;
UPDATE FLW_EV_DATABASECHANGELOGLOCK SET `LOCKED` = 1, LOCKEDBY = '192.168.68.111 (192.168.68.111)', LOCKGRANTED = '2021-12-28 10:30:21.326' WHERE ID = 1 AND `LOCKED` = 0;

ALTER TABLE FLW_CHANNEL_DEFINITION ADD TYPE_ VARCHAR(255) NULL;
ALTER TABLE FLW_CHANNEL_DEFINITION ADD IMPLEMENTATION_ VARCHAR(255) NULL;

INSERT INTO FLW_EV_DATABASECHANGELOG (ID, AUTHOR, FILENAME, DATEEXECUTED, ORDEREXECUTED, MD5SUM, `DESCRIPTION`, COMMENTS, EXECTYPE, CONTEXTS, LABELS, LIQUIBASE, DEPLOYMENT_ID)
VALUES ('2', 'flowable', 'org/flowable/eventregistry/db/liquibase/flowable-eventregistry-db-changelog.xml', NOW(), 2, '8:0ea825feb8e470558f0b5754352b9cda', 'addColumn tableName=FLW_CHANNEL_DEFINITION; addColumn tableName=FLW_CHANNEL_DEFINITION', '', 'EXECUTED', NULL, NULL, '4.3.5', '0683821439');
INSERT INTO FLW_EV_DATABASECHANGELOG (ID, AUTHOR, FILENAME, DATEEXECUTED, ORDEREXECUTED, MD5SUM, `DESCRIPTION`, COMMENTS, EXECTYPE, CONTEXTS, LABELS, LIQUIBASE, DEPLOYMENT_ID)
VALUES ('3', 'flowable', 'org/flowable/eventregistry/db/liquibase/flowable-eventregistry-db-changelog.xml', NOW(), 3, '8:3c2bb293350b5cbe6504331980c9dcee', 'customChange', '', 'EXECUTED', NULL, NULL, '4.3.5', '0683821439');
UPDATE FLW_EV_DATABASECHANGELOGLOCK SET `LOCKED` = 0, LOCKEDBY = NULL, LOCKGRANTED = NULL WHERE ID = 1;

UPDATE ACT_DMN_DATABASECHANGELOGLOCK SET `LOCKED` = 1, LOCKEDBY = '192.168.68.111 (192.168.68.111)', LOCKGRANTED = '2021-12-28 10:30:21.631' WHERE ID = 1 AND `LOCKED` = 0;
UPDATE ACT_DMN_DATABASECHANGELOGLOCK SET `LOCKED` = 0, LOCKEDBY = NULL, LOCKGRANTED = NULL WHERE ID = 1;
UPDATE ACT_FO_DATABASECHANGELOGLOCK SET `LOCKED` = 1, LOCKEDBY = '192.168.68.111 (192.168.68.111)', LOCKGRANTED = '2021-12-28 10:30:21.854' WHERE ID = 1 AND `LOCKED` = 0;
UPDATE ACT_FO_DATABASECHANGELOGLOCK SET `LOCKED` = 0, LOCKEDBY = NULL, LOCKGRANTED = NULL WHERE ID = 1;
UPDATE ACT_CO_DATABASECHANGELOGLOCK SET `LOCKED` = 1, LOCKEDBY = '192.168.68.111 (192.168.68.111)', LOCKGRANTED = '2021-12-28 10:30:22.045' WHERE ID = 1 AND `LOCKED` = 0;
UPDATE ACT_CO_DATABASECHANGELOGLOCK SET `LOCKED` = 0, LOCKEDBY = NULL, LOCKGRANTED = NULL WHERE ID = 1;
```
Srayikkadu is a coastal village in Kollam district, Kerala, India. Srayikkadu is part of Alappad panchayat, which belongs to Karunagappally Taluk, and sits on a narrow strip of land sandwiched between the Arabian Sea and the T S Canal. The village is connected to the mainland by bridges as well as by country boat ferries. The nearby towns of Karunagappally, Kayamkulam and Ochira are connected by road. By water, it is connected to various places such as Alappuzha and Kollam. Mata Amritanandamayi Math is 1 km away from this small village as well. It is a green land with backwaters and beaches. References GLPSchool, srayikkad Villages in Kollam district
St. Joseph's College, Nainital is a day boarding and residential school in Nainital, Kumaon, India, providing private school education. Overview St Joseph's College was established in 1888. The site was previously the location of a seminary run by the Italian Capuchin Fathers, and the school is still referred to as "SEM" (for Seminary). In 1892 four Christian Brothers took formal charge of St Joseph's College, and thus began the involvement of the Christian Brothers in the running of the school. The school is one of 20 educational institutions in India conducted by the Congregation of Christian Brothers, a pontifical institute founded in Ireland in 1802 by Edmund Ignatius Rice, a wealthy Catholic layman who was beatified in 1996. Student life The pupils are boys aged 6 to 18. The roll numbers about 1100, with 360 boarders. New admissions are taken only in classes one and eleven; boarders are admitted from class 3 onwards. The school is affiliated to the Council for the Indian School Certificate Examinations and prepares students for the ICSE examinations. It has a plus-two section for the ISC examination. The school has seven playing fields, a gymnasium, lawn tennis courts, two squash courts, a basketball court, billiard tables, and a swimming pool. The school takes part in inter-college events. The college has a distinguished football team in tournaments of Nainital and Edmund Rice schools. The school hosts inter-school events such as the Edmund Rice Meet and the Nirip Deep Memorial Soccer Tournament. Events such as quizzes, debates, declamations and elocution contests are held. Notable alumni Politics K. C. Pant – Union minister with Cabinet rank and Vice Chairman of the Planning Commission Rajendra Singh – head of Rashtriya Swayamsevak Sangh K. C.
Singh Baba – Member of Parliament, Nainital, and sportsman Kirti Vardhan Singh – Member of Parliament Sajeeb Wazed – son of Sheikh Hasina Sumit Hridayesh – MLA, Haldwani, Uttarakhand Armed forces Zameer Uddin Shah Rajesh Singh Adhikari Rustom K. S. Ghandhi – governor of Himachal Pradesh Business, commerce and industry Peter de Noronha – philanthropist and civil servant Neeraj Roy – entrepreneur Education, fine arts and media Sorab K. Ghandhi – scientist Naseeruddin Shah – film actor Neelesh Misra – songwriter, journalist and author Paritosh Painter – theatre actor, director and film and TV writer Suhel Seth – actor Manoj Joshi – journalist Amit Abraham – psychologist Sports Richard James Allen – field hockey player Ronald Riley – field hockey player Trevor Vanderputt – field hockey player Media The school's buildings have featured in Indian films, the best known of which are Masoom (winner of 4 Filmfare Awards) and Koi... Mil Gaya (winner of 5 Filmfare Awards). The school buildings have also featured in TV series such as Breathe: Into the Shadows. See also List of Christian Brothers schools References 1888 establishments in British India Boarding schools in Uttarakhand British colonial architecture in India Boys' schools in India Catholic boarding schools in India Catholic secondary schools in India Christian schools in Uttarakhand Congregation of Christian Brothers schools in India Education in Nainital Educational institutions established in 1888 High schools and secondary schools in Uttarakhand Primary schools in India Schools in Colonial India
Erich Jagsch (born 31 August 1955) is a former Austrian racing cyclist. He rode in the 1980 Tour de France. References External links 1955 births Living people Austrian male cyclists People from Klosterneuburg Sportspeople from Lower Austria 20th-century Austrian people
```python # for complete details. from __future__ import absolute_import, division, print_function import abc import binascii import inspect import sys import warnings # We use a UserWarning subclass, instead of DeprecationWarning, because CPython # decided deprecation warnings should be invisble by default. class CryptographyDeprecationWarning(UserWarning): pass # Several APIs were deprecated with no specific end-of-life date because of the # ubiquity of their use. They should not be removed until we agree on when that # cycle ends. PersistentlyDeprecated2017 = CryptographyDeprecationWarning PersistentlyDeprecated2018 = CryptographyDeprecationWarning PersistentlyDeprecated2019 = CryptographyDeprecationWarning DeprecatedIn27 = CryptographyDeprecationWarning def _check_bytes(name, value): if not isinstance(value, bytes): raise TypeError("{} must be bytes".format(name)) def _check_byteslike(name, value): try: memoryview(value) except TypeError: raise TypeError("{} must be bytes-like".format(name)) def read_only_property(name): return property(lambda self: getattr(self, name)) def register_interface(iface): def register_decorator(klass): verify_interface(iface, klass) iface.register(klass) return klass return register_decorator def register_interface_if(predicate, iface): def register_decorator(klass): if predicate: verify_interface(iface, klass) iface.register(klass) return klass return register_decorator if hasattr(int, "from_bytes"): int_from_bytes = int.from_bytes else: def int_from_bytes(data, byteorder, signed=False): assert byteorder == 'big' assert not signed return int(binascii.hexlify(data), 16) if hasattr(int, "to_bytes"): def int_to_bytes(integer, length=None): return integer.to_bytes( length or (integer.bit_length() + 7) // 8 or 1, 'big' ) else: def int_to_bytes(integer, length=None): hex_string = '%x' % integer if length is None: n = len(hex_string) else: n = length * 2 return binascii.unhexlify(hex_string.zfill(n + (n & 1))) class 
```python
import abc
import inspect
import sys
import warnings


class InterfaceNotImplemented(Exception):
    pass


if hasattr(inspect, "signature"):
    signature = inspect.signature
else:
    signature = inspect.getargspec


def verify_interface(iface, klass):
    for method in iface.__abstractmethods__:
        if not hasattr(klass, method):
            raise InterfaceNotImplemented(
                "{} is missing a {!r} method".format(klass, method)
            )
        if isinstance(getattr(iface, method), abc.abstractproperty):
            # Can't properly verify these yet.
            continue
        sig = signature(getattr(iface, method))
        actual = signature(getattr(klass, method))
        if sig != actual:
            raise InterfaceNotImplemented(
                "{}.{}'s signature differs from the expected. Expected: "
                "{!r}. Received: {!r}".format(
                    klass, method, sig, actual
                )
            )


# No longer needed as of 2.2, but retained because we have external consumers
# who use it.
def bit_length(x):
    return x.bit_length()


class _DeprecatedValue(object):
    def __init__(self, value, message, warning_class):
        self.value = value
        self.message = message
        self.warning_class = warning_class


class _ModuleWithDeprecations(object):
    def __init__(self, module):
        self.__dict__["_module"] = module

    def __getattr__(self, attr):
        obj = getattr(self._module, attr)
        if isinstance(obj, _DeprecatedValue):
            warnings.warn(obj.message, obj.warning_class, stacklevel=2)
            obj = obj.value
        return obj

    def __setattr__(self, attr, value):
        setattr(self._module, attr, value)

    def __delattr__(self, attr):
        obj = getattr(self._module, attr)
        if isinstance(obj, _DeprecatedValue):
            warnings.warn(obj.message, obj.warning_class, stacklevel=2)
        delattr(self._module, attr)

    def __dir__(self):
        return ["_module"] + dir(self._module)


def deprecated(value, module_name, message, warning_class):
    module = sys.modules[module_name]
    if not isinstance(module, _ModuleWithDeprecations):
        sys.modules[module_name] = _ModuleWithDeprecations(module)
    return _DeprecatedValue(value, message, warning_class)


def cached_property(func):
    cached_name = "_cached_{}".format(func)
    sentinel = object()

    def inner(instance):
        cache = getattr(instance, cached_name, sentinel)
        if cache is not sentinel:
            return cache
        result = func(instance)
        setattr(instance, cached_name, result)
        return result

    return property(inner)
```
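A minimal usage sketch of the `cached_property` helper: the first attribute access computes the value, later accesses hit the cached copy stored on the instance. The `Circle` class is a hypothetical example (not part of the module), and the helper is repeated here so the snippet runs on its own.

```python
def cached_property(func):
    # Same helper as above, repeated so this sketch is self-contained.
    cached_name = "_cached_{}".format(func)
    sentinel = object()

    def inner(instance):
        cache = getattr(instance, cached_name, sentinel)
        if cache is not sentinel:
            return cache
        result = func(instance)
        setattr(instance, cached_name, result)
        return result

    return property(inner)


class Circle(object):
    # Hypothetical example class, not from the library.
    def __init__(self, radius):
        self.radius = radius
        self.computations = 0

    @cached_property
    def area(self):
        # The expensive computation runs only on the first access.
        self.computations += 1
        return 3.14159 * self.radius ** 2


c = Circle(2)
print(c.area)          # computed on first access
print(c.area)          # served from the per-instance cache
print(c.computations)  # 1
```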
Fort Townsend State Park (formerly Old Fort Townsend State Park) is a public recreation area located two miles south of Port Townsend in Jefferson County, Washington. The state park occupies a third of the site of the original Fort Townsend built in 1856. The park includes of shoreline on Port Townsend Bay, picnicking and camping areas, of hiking trails, and facilities for boating, fishing, and crabbing. History Fort Townsend was built in 1856 by the U.S. Army to protect settlers. The entire garrison was transferred to the American Camp or Camp Pickett on San Juan Island in 1859 during the border dispute called the Pig War. Reactivated in 1874, the fort continued in use until fire destroyed the barracks in late 1894; it was abandoned in 1895. The site was retained on the Army rolls until World War II, when it was used as a munitions defusing station. Washington State Parks took custody in 1953, and it became a state park. References External links Fort Townsend Historical State Park Washington State Parks and Recreation Commission Fort Townsend Historical State Park Map Washington State Parks and Recreation Commission Closed installations of the United States Army Townsend Parks in Jefferson County, Washington State parks of Washington (state)
```typescript
import { spawnSync } from 'child_process';

jest.mock('child_process', () => ({ spawnSync: jest.fn() }));

describe('get-is-python-installed', () => {
  let getIsPythonInstalled: any;

  beforeEach(() => {
    // Reset the module registry before re-requiring, so each test
    // gets a fresh copy of the module under test.
    jest.resetModules();
    ({ getIsPythonInstalled } = require('../../src/utils/get-is-python-installed'));
  });

  it('correctly returns the Python version if installed', () => {
    (spawnSync as any).mockReturnValue({ output: '\nPython 3.8.1\n' });
    expect(getIsPythonInstalled()).toBe('Python 3.8.1');
  });

  it('returns null if the version check throws', () => {
    (spawnSync as any).mockImplementation(() => { throw new Error('No!'); });
    expect(getIsPythonInstalled()).toBeNull();
  });
});
```
Kamionka is a village in the administrative district of Gmina Lubasz, within Czarnków-Trzcianka County, Greater Poland Voivodeship, in west-central Poland. It lies approximately south of Lubasz, south of Czarnków, and north-west of the regional capital Poznań. References Villages in Czarnków-Trzcianka County
```c
/*
===========================================================================
This file is part of Quake III Arena source code.

Quake III Arena source code is free software; you can redistribute it
and/or modify it under the terms of the GNU General Public License as
published by the Free Software Foundation; either version 2 of the License,
or (at your option) any later version.

Quake III Arena source code is distributed in the hope that it will be
useful, but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
General Public License for more details.

You should have received a copy of the GNU General Public License
along with Quake III Arena source code; if not, write to the Free Software
Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA
===========================================================================
*/
//
// g_mem.c
//

#include "g_local.h"

#define POOLSIZE (256 * 1024)

static char memoryPool[POOLSIZE];
static int allocPoint;

void *G_Alloc( int size ) {
	char *p;

	if ( g_debugAlloc.integer ) {
		G_Printf( "G_Alloc of %i bytes (%i left)\n", size, POOLSIZE - allocPoint - ( ( size + 31 ) & ~31 ) );
	}

	if ( allocPoint + size > POOLSIZE ) {
		G_Error( "G_Alloc: failed on allocation of %i bytes", size );
		return NULL;
	}

	p = &memoryPool[allocPoint];
	allocPoint += ( size + 31 ) & ~31;

	return p;
}

void G_InitMemory( void ) {
	allocPoint = 0;
}

void Svcmd_GameMem_f( void ) {
	G_Printf( "Game memory status: %i out of %i bytes allocated\n", allocPoint, POOLSIZE );
}
```
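`G_Alloc` is a bump allocator: it hands out the current offset into a fixed pool and rounds every allocation size up to a 32-byte boundary with `( size + 31 ) & ~31`. A small Python sketch of that bookkeeping, purely for illustration (not part of the Quake III source):

```python
POOLSIZE = 256 * 1024
alloc_point = 0


def align32(size):
    # Round up to the next multiple of 32 bytes, mirroring ( size + 31 ) & ~31.
    return (size + 31) & ~31


def g_alloc(size):
    # Bump allocation: return the current offset, then advance the
    # pool pointer by the 32-byte-aligned size.
    global alloc_point
    if alloc_point + size > POOLSIZE:
        raise MemoryError("G_Alloc: failed on allocation of %d bytes" % size)
    offset = alloc_point
    alloc_point += align32(size)
    return offset


print(align32(1), align32(32), align32(33))  # 32 32 64
print(g_alloc(10), g_alloc(10))              # 0 32
```

Because `allocPoint` only ever moves forward, there is no per-allocation free; `G_InitMemory` resets the whole pool at once, which suits the game's level-load lifetime.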
The Western Australia Post Office Directory, also known as Wise Directories or Wise Street Directories, was published in Perth from 1893 to 1949. It was published by H. Pierssené and later by H. Wise & Co. It listed household, business, society, and Government contacts in Perth, Fremantle, Kalgoorlie, Boulder and Coolgardie, including some rural areas of Western Australia. Publishers The Western Australian Directory was published by H. Pierssené between 1893 and 1895. Herbert Pierssené was a merchant and importer of English, continental and Ceylonese goods. He was an agent for McCulluch Carrying Company and a bottler of West Australian wines. The Western Australia Post Office Directory was published by Wise & Co. between 1895 and 1949 with the exception of 1943 and 1948. Wise Directories The directories provide information by locality, individual surname, government service, and by trade or profession. The addresses of householders and businesses throughout Western Australia are included. Maps were sometimes published with an edition of the directory. The towns section of the directories normally contained separate street directories of Perth and suburbs, Fremantle and suburbs, Kalgoorlie, Boulder and Coolgardie. Known colloquially to users and book collectors as Wise Directories or Wise Street Directories, the red-covered directories were published between 1893 and 1949. Because they changed annually, the directories are valuable historical documents for Western Australian history. They are scarce in the Australian rare book market. The directories have been invaluable reference points for such projects as the Dictionary of Western Australians and others, as the street lists in the directory provide details of inhabitants and houses in some streets in the more built-up residential areas. Country towns in the directory have name lists only. 
They have been available in microfilm form in the J S Battye Library, and more recently have become available online through one of the J S Battye Library digitisation projects. See also Australia Post Australian Dictionary of Biography Cyclopedia of Western Australia Dictionary of Australian Biography State Records Office of Western Australia References External links http://www.slwa.wa.gov.au/find/wa_resources/post_office_directories Books about Western Australia Western Australia Post Office Directory Australian directories Gazetteers Postal system of Australia
```c++ #include <Columns/ColumnConst.h> #include <DataTypes/DataTypeDate.h> #include <DataTypes/DataTypeDateTime.h> #include <DataTypes/ObjectUtils.h> #include <Disks/createVolume.h> #include <IO/HashingWriteBuffer.h> #include <IO/WriteHelpers.h> #include <Interpreters/AggregationCommon.h> #include <Interpreters/Context.h> #include <Interpreters/MergeTreeTransaction.h> #include <Processors/TTL/ITTLAlgorithm.h> #include <Storages/MergeTree/DataPartStorageOnDiskFull.h> #include <Storages/MergeTree/MergeTreeDataWriter.h> #include <Storages/MergeTree/MergedBlockOutputStream.h> #include <Storages/MergeTree/MergeTreeSettings.h> #include <Storages/MergeTree/RowOrderOptimizer.h> #include <Common/ElapsedTimeProfileEventIncrement.h> #include <Common/Exception.h> #include <Common/HashTable/HashMap.h> #include <Common/OpenTelemetryTraceContext.h> #include <Common/typeid_cast.h> #include <Core/Settings.h> #include <Parsers/queryToString.h> #include <Processors/Merges/Algorithms/ReplacingSortedAlgorithm.h> #include <Processors/Merges/Algorithms/MergingSortedAlgorithm.h> #include <Processors/Merges/Algorithms/CollapsingSortedAlgorithm.h> #include <Processors/Merges/Algorithms/SummingSortedAlgorithm.h> #include <Processors/Merges/Algorithms/AggregatingSortedAlgorithm.h> #include <Processors/Merges/Algorithms/VersionedCollapsingAlgorithm.h> #include <Processors/Merges/Algorithms/GraphiteRollupSortedAlgorithm.h> #include <Processors/Sources/SourceFromSingleChunk.h> namespace ProfileEvents { extern const Event MergeTreeDataWriterBlocks; extern const Event MergeTreeDataWriterBlocksAlreadySorted; extern const Event MergeTreeDataWriterRows; extern const Event MergeTreeDataWriterUncompressedBytes; extern const Event MergeTreeDataWriterCompressedBytes; extern const Event MergeTreeDataWriterSortingBlocksMicroseconds; extern const Event MergeTreeDataWriterMergingBlocksMicroseconds; extern const Event MergeTreeDataWriterProjectionsCalculationMicroseconds; extern const Event 
MergeTreeDataProjectionWriterBlocks; extern const Event MergeTreeDataProjectionWriterBlocksAlreadySorted; extern const Event MergeTreeDataProjectionWriterRows; extern const Event MergeTreeDataProjectionWriterUncompressedBytes; extern const Event MergeTreeDataProjectionWriterCompressedBytes; extern const Event MergeTreeDataProjectionWriterSortingBlocksMicroseconds; extern const Event MergeTreeDataProjectionWriterMergingBlocksMicroseconds; extern const Event RejectedInserts; } namespace DB { namespace ErrorCodes { extern const int ABORTED; extern const int LOGICAL_ERROR; extern const int TOO_MANY_PARTS; } namespace { void buildScatterSelector( const ColumnRawPtrs & columns, PODArray<size_t> & partition_num_to_first_row, IColumn::Selector & selector, size_t max_parts, ContextPtr context) { /// Use generic hashed variant since partitioning is unlikely to be a bottleneck. using Data = HashMap<UInt128, size_t, UInt128TrivialHash>; Data partitions_map; size_t num_rows = columns[0]->size(); size_t partitions_count = 0; size_t throw_on_limit = context->getSettingsRef().throw_on_max_partitions_per_insert_block; for (size_t i = 0; i < num_rows; ++i) { Data::key_type key = hash128(i, columns.size(), columns); typename Data::LookupResult it; bool inserted; partitions_map.emplace(key, it, inserted); if (inserted) { if (max_parts && partitions_count >= max_parts && throw_on_limit) { ProfileEvents::increment(ProfileEvents::RejectedInserts); throw Exception(ErrorCodes::TOO_MANY_PARTS, "Too many partitions for single INSERT block (more than {}). " "The limit is controlled by 'max_partitions_per_insert_block' setting. " "Large number of partitions is a common misconception. " "It will lead to severe negative performance impact, including slow server startup, " "slow INSERT queries and slow SELECT queries. Recommended total number of partitions " "for a table is under 1000..10000. 
Please note, that partitioning is not intended " "to speed up SELECT queries (ORDER BY key is sufficient to make range queries fast). " "Partitions are intended for data manipulation (DROP PARTITION, etc).", max_parts); } partition_num_to_first_row.push_back(i); it->getMapped() = partitions_count; ++partitions_count; /// Optimization for common case when there is only one partition - defer selector initialization. if (partitions_count == 2) { selector = IColumn::Selector(num_rows); std::fill(selector.begin(), selector.begin() + i, 0); } } if (partitions_count > 1) selector[i] = it->getMapped(); } // Checking partitions per insert block again here outside the loop above // so we can log the total number of partitions that would have parts created if (max_parts && partitions_count >= max_parts && !throw_on_limit) { const auto & client_info = context->getClientInfo(); LoggerPtr log = getLogger("MergeTreeDataWriter"); LOG_WARNING(log, "INSERT query from initial_user {} (query ID: {}) inserted a block " "that created parts in {} partitions. 
This is being logged " "rather than throwing an exception as throw_on_max_partitions_per_insert_block=false.", client_info.initial_user, client_info.initial_query_id, partitions_count); } } /// Computes ttls and updates ttl infos void updateTTL( const ContextPtr context, const TTLDescription & ttl_entry, IMergeTreeDataPart::TTLInfos & ttl_infos, DB::MergeTreeDataPartTTLInfo & ttl_info, const Block & block, bool update_part_min_max_ttls) { auto expr_and_set = ttl_entry.buildExpression(context); for (auto & subquery : expr_and_set.sets->getSubqueries()) subquery->buildSetInplace(context); auto ttl_column = ITTLAlgorithm::executeExpressionAndGetColumn(expr_and_set.expression, block, ttl_entry.result_column); if (const ColumnUInt16 * column_date = typeid_cast<const ColumnUInt16 *>(ttl_column.get())) { const auto & date_lut = DateLUT::serverTimezoneInstance(); for (const auto & val : column_date->getData()) ttl_info.update(date_lut.fromDayNum(DayNum(val))); } else if (const ColumnUInt32 * column_date_time = typeid_cast<const ColumnUInt32 *>(ttl_column.get())) { for (const auto & val : column_date_time->getData()) ttl_info.update(val); } else if (const ColumnConst * column_const = typeid_cast<const ColumnConst *>(ttl_column.get())) { if (typeid_cast<const ColumnUInt16 *>(&column_const->getDataColumn())) { const auto & date_lut = DateLUT::serverTimezoneInstance(); ttl_info.update(date_lut.fromDayNum(DayNum(column_const->getValue<UInt16>()))); } else if (typeid_cast<const ColumnUInt32 *>(&column_const->getDataColumn())) { ttl_info.update(column_const->getValue<UInt32>()); } else throw Exception(ErrorCodes::LOGICAL_ERROR, "Unexpected type of result TTL column"); } else throw Exception(ErrorCodes::LOGICAL_ERROR, "Unexpected type of result TTL column"); if (update_part_min_max_ttls) ttl_infos.updatePartMinMaxTTL(ttl_info.min, ttl_info.max); } } void MergeTreeDataWriter::TemporaryPart::cancel() { try { /// An exception context is needed to proper delete write buffers without 
finalization throw Exception(ErrorCodes::ABORTED, "Cancel temporary part."); } catch (...) { *this = TemporaryPart{}; } } void MergeTreeDataWriter::TemporaryPart::finalize() { for (auto & stream : streams) stream.finalizer.finish(); part->getDataPartStorage().precommitTransaction(); for (const auto & [_, projection] : part->getProjectionParts()) projection->getDataPartStorage().precommitTransaction(); } std::vector<AsyncInsertInfoPtr> scatterAsyncInsertInfoBySelector(AsyncInsertInfoPtr async_insert_info, const IColumn::Selector & selector, size_t partition_num) { if (nullptr == async_insert_info) { return {}; } if (selector.empty()) { return {async_insert_info}; } std::vector<AsyncInsertInfoPtr> result(partition_num); std::vector<Int64> last_row_for_partition(partition_num, -1); size_t offset_idx = 0; for (size_t i = 0; i < selector.size(); ++i) { ++last_row_for_partition[selector[i]]; if (i + 1 == async_insert_info->offsets[offset_idx]) { for (size_t part_id = 0; part_id < last_row_for_partition.size(); ++part_id) { Int64 last_row = last_row_for_partition[part_id]; if (-1 == last_row) continue; size_t offset = static_cast<size_t>(last_row + 1); if (result[part_id] == nullptr) result[part_id] = std::make_shared<AsyncInsertInfo>(); if (result[part_id]->offsets.empty() || offset > *result[part_id]->offsets.rbegin()) { result[part_id]->offsets.push_back(offset); result[part_id]->tokens.push_back(async_insert_info->tokens[offset_idx]); } } ++offset_idx; } } if (offset_idx != async_insert_info->offsets.size()) { LOG_ERROR( getLogger("MergeTreeDataWriter"), "ChunkInfo of async insert offsets doesn't match the selector size {}. 
Offsets content is ({})", selector.size(), fmt::join(async_insert_info->offsets.begin(), async_insert_info->offsets.end(), ",")); throw Exception(ErrorCodes::LOGICAL_ERROR, "Unexpected error for async deduplicated insert, please check error logs"); } return result; } BlocksWithPartition MergeTreeDataWriter::splitBlockIntoParts( Block && block, size_t max_parts, const StorageMetadataPtr & metadata_snapshot, ContextPtr context, AsyncInsertInfoPtr async_insert_info) { BlocksWithPartition result; if (!block || !block.rows()) return result; metadata_snapshot->check(block, true); if (!metadata_snapshot->hasPartitionKey()) /// Table is not partitioned. { result.emplace_back(Block(block), Row{}); if (async_insert_info != nullptr) { result[0].offsets = std::move(async_insert_info->offsets); result[0].tokens = std::move(async_insert_info->tokens); } return result; } Block block_copy = block; /// After expression execution partition key columns will be added to block_copy with names regarding partition function. 
auto partition_key_names_and_types = MergeTreePartition::executePartitionByExpression(metadata_snapshot, block_copy, context); ColumnRawPtrs partition_columns; partition_columns.reserve(partition_key_names_and_types.size()); for (const auto & element : partition_key_names_and_types) partition_columns.emplace_back(block_copy.getByName(element.name).column.get()); PODArray<size_t> partition_num_to_first_row; IColumn::Selector selector; buildScatterSelector(partition_columns, partition_num_to_first_row, selector, max_parts, context); auto async_insert_info_with_partition = scatterAsyncInsertInfoBySelector(async_insert_info, selector, partition_num_to_first_row.size()); size_t partitions_count = partition_num_to_first_row.size(); result.reserve(partitions_count); auto get_partition = [&](size_t num) { Row partition(partition_columns.size()); for (size_t i = 0; i < partition_columns.size(); ++i) partition[i] = (*partition_columns[i])[partition_num_to_first_row[num]]; return partition; }; if (partitions_count == 1) { /// A typical case is when there is one partition (you do not need to split anything). /// NOTE: returning a copy of the original block so that calculated partition key columns /// do not interfere with possible calculated primary key columns of the same name. 
result.emplace_back(Block(block), get_partition(0)); if (!async_insert_info_with_partition.empty()) { result[0].offsets = std::move(async_insert_info_with_partition[0]->offsets); result[0].tokens = std::move(async_insert_info_with_partition[0]->tokens); } return result; } for (size_t i = 0; i < partitions_count; ++i) result.emplace_back(block.cloneEmpty(), get_partition(i)); for (size_t col = 0; col < block.columns(); ++col) { MutableColumns scattered = block.getByPosition(col).column->scatter(partitions_count, selector); for (size_t i = 0; i < partitions_count; ++i) result[i].block.getByPosition(col).column = std::move(scattered[i]); } for (size_t i = 0; i < async_insert_info_with_partition.size(); ++i) { if (async_insert_info_with_partition[i] == nullptr) { LOG_ERROR( getLogger("MergeTreeDataWriter"), "The {}th element in async_insert_info_with_partition is nullptr. There are totally {} partitions in the insert. Selector content is ({}). Offsets content is ({})", i, partitions_count, fmt::join(selector.begin(), selector.end(), ","), fmt::join(async_insert_info->offsets.begin(), async_insert_info->offsets.end(), ",")); throw Exception(ErrorCodes::LOGICAL_ERROR, "Unexpected error for async deduplicated insert, please check error logs"); } result[i].offsets = std::move(async_insert_info_with_partition[i]->offsets); result[i].tokens = std::move(async_insert_info_with_partition[i]->tokens); } return result; } Block MergeTreeDataWriter::mergeBlock( Block && block, SortDescription sort_description, const Names & partition_key_columns, IColumn::Permutation *& permutation, const MergeTreeData::MergingParams & merging_params) { OpenTelemetry::SpanHolder span("MergeTreeDataWriter::mergeBlock"); size_t block_size = block.rows(); span.addAttribute("clickhouse.rows", block_size); span.addAttribute("clickhouse.columns", block.columns()); auto get_merging_algorithm = [&]() -> std::shared_ptr<IMergingAlgorithm> { switch (merging_params.mode) { /// There is nothing to merge in 
single block in ordinary MergeTree case MergeTreeData::MergingParams::Ordinary: return nullptr; case MergeTreeData::MergingParams::Replacing: return std::make_shared<ReplacingSortedAlgorithm>( block, 1, sort_description, merging_params.is_deleted_column, merging_params.version_column, block_size + 1, /*block_size_bytes=*/0); case MergeTreeData::MergingParams::Collapsing: return std::make_shared<CollapsingSortedAlgorithm>( block, 1, sort_description, merging_params.sign_column, false, block_size + 1, /*block_size_bytes=*/0, getLogger("MergeTreeDataWriter")); case MergeTreeData::MergingParams::Summing: return std::make_shared<SummingSortedAlgorithm>( block, 1, sort_description, merging_params.columns_to_sum, partition_key_columns, block_size + 1, /*block_size_bytes=*/0); case MergeTreeData::MergingParams::Aggregating: return std::make_shared<AggregatingSortedAlgorithm>(block, 1, sort_description, block_size + 1, /*block_size_bytes=*/0); case MergeTreeData::MergingParams::VersionedCollapsing: return std::make_shared<VersionedCollapsingAlgorithm>( block, 1, sort_description, merging_params.sign_column, block_size + 1, /*block_size_bytes=*/0); case MergeTreeData::MergingParams::Graphite: return std::make_shared<GraphiteRollupSortedAlgorithm>( block, 1, sort_description, block_size + 1, /*block_size_bytes=*/0, merging_params.graphite_params, time(nullptr)); } }; auto merging_algorithm = get_merging_algorithm(); if (!merging_algorithm) return block; span.addAttribute("clickhouse.merging_algorithm", merging_algorithm->getName()); Chunk chunk(block.getColumns(), block_size); IMergingAlgorithm::Input input; input.set(std::move(chunk)); input.permutation = permutation; IMergingAlgorithm::Inputs inputs; inputs.push_back(std::move(input)); merging_algorithm->initialize(std::move(inputs)); IMergingAlgorithm::Status status = merging_algorithm->merge(); /// Check that after first merge merging_algorithm is waiting for data from input 0. 
if (status.required_source != 0) throw Exception(ErrorCodes::LOGICAL_ERROR, "Required source after the first merge is not 0. Chunk rows: {}, is_finished: {}, required_source: {}, algorithm: {}", status.chunk.getNumRows(), status.is_finished, status.required_source, merging_algorithm->getName()); status = merging_algorithm->merge(); /// Check that merge is finished. if (!status.is_finished) throw Exception(ErrorCodes::LOGICAL_ERROR, "Merge is not finished after the second merge."); /// Merged Block is sorted and we don't need to use permutation anymore permutation = nullptr; return block.cloneWithColumns(status.chunk.getColumns()); } MergeTreeDataWriter::TemporaryPart MergeTreeDataWriter::writeTempPart(BlockWithPartition & block, const StorageMetadataPtr & metadata_snapshot, ContextPtr context) { return writeTempPartImpl(block, metadata_snapshot, context, data.insert_increment.get(), /*need_tmp_prefix = */true); } MergeTreeDataWriter::TemporaryPart MergeTreeDataWriter::writeTempPartWithoutPrefix(BlockWithPartition & block, const StorageMetadataPtr & metadata_snapshot, int64_t block_number, ContextPtr context) { return writeTempPartImpl(block, metadata_snapshot, context, block_number, /*need_tmp_prefix = */false); } MergeTreeDataWriter::TemporaryPart MergeTreeDataWriter::writeTempPartImpl( BlockWithPartition & block_with_partition, const StorageMetadataPtr & metadata_snapshot, ContextPtr context, int64_t block_number, bool need_tmp_prefix) { TemporaryPart temp_part; Block & block = block_with_partition.block; auto columns = metadata_snapshot->getColumns().getAllPhysical().filter(block.getNames()); for (auto & column : columns) if (column.type->hasDynamicSubcolumnsDeprecated()) column.type = block.getByName(column.name).type; auto minmax_idx = std::make_shared<IMergeTreeDataPart::MinMaxIndex>(); minmax_idx->update(block, MergeTreeData::getMinMaxColumnsNames(metadata_snapshot->getPartitionKey())); MergeTreePartition partition(block_with_partition.partition); 
MergeTreePartInfo new_part_info(partition.getID(metadata_snapshot->getPartitionKey().sample_block), block_number, block_number, 0); String part_name; if (data.format_version < MERGE_TREE_DATA_MIN_FORMAT_VERSION_WITH_CUSTOM_PARTITIONING) { DayNum min_date(minmax_idx->hyperrectangle[data.minmax_idx_date_column_pos].left.safeGet<UInt64>()); DayNum max_date(minmax_idx->hyperrectangle[data.minmax_idx_date_column_pos].right.safeGet<UInt64>()); const auto & date_lut = DateLUT::serverTimezoneInstance(); auto min_month = date_lut.toNumYYYYMM(min_date); auto max_month = date_lut.toNumYYYYMM(max_date); if (min_month != max_month) throw Exception(ErrorCodes::LOGICAL_ERROR, "Part spans more than one month."); part_name = new_part_info.getPartNameV0(min_date, max_date); } else part_name = new_part_info.getPartNameV1(); std::string part_dir; if (need_tmp_prefix) { std::string temp_prefix = "tmp_insert_"; const auto & temp_postfix = data.getPostfixForTempInsertName(); if (!temp_postfix.empty()) temp_prefix += temp_postfix + "_"; part_dir = temp_prefix + part_name; } else { part_dir = part_name; } temp_part.temporary_directory_lock = data.getTemporaryPartDirectoryHolder(part_dir); MergeTreeIndices indices; if (context->getSettingsRef().materialize_skip_indexes_on_insert) indices = MergeTreeIndexFactory::instance().getMany(metadata_snapshot->getSecondaryIndices()); ColumnsStatistics statistics; if (context->getSettingsRef().materialize_statistics_on_insert) statistics = MergeTreeStatisticsFactory::instance().getMany(metadata_snapshot->getColumns()); /// If we need to calculate some columns to sort. 
if (metadata_snapshot->hasSortingKey() || metadata_snapshot->hasSecondaryIndices()) data.getSortingKeyAndSkipIndicesExpression(metadata_snapshot, indices)->execute(block); Names sort_columns = metadata_snapshot->getSortingKeyColumns(); SortDescription sort_description; size_t sort_columns_size = sort_columns.size(); sort_description.reserve(sort_columns_size); for (size_t i = 0; i < sort_columns_size; ++i) sort_description.emplace_back(sort_columns[i], 1, 1); ProfileEvents::increment(ProfileEvents::MergeTreeDataWriterBlocks); /// Sort IColumn::Permutation * perm_ptr = nullptr; IColumn::Permutation perm; if (!sort_description.empty()) { ProfileEventTimeIncrement<Microseconds> watch(ProfileEvents::MergeTreeDataWriterSortingBlocksMicroseconds); if (!isAlreadySorted(block, sort_description)) { stableGetPermutation(block, sort_description, perm); perm_ptr = &perm; } else ProfileEvents::increment(ProfileEvents::MergeTreeDataWriterBlocksAlreadySorted); } if (data.getSettings()->optimize_row_order && data.merging_params.mode == MergeTreeData::MergingParams::Mode::Ordinary) /// Nobody knows if this optimization messes up specialized MergeTree engines. { RowOrderOptimizer::optimize(block, sort_description, perm); perm_ptr = &perm; } Names partition_key_columns = metadata_snapshot->getPartitionKey().column_names; if (context->getSettingsRef().optimize_on_insert) { ProfileEventTimeIncrement<Microseconds> watch(ProfileEvents::MergeTreeDataWriterMergingBlocksMicroseconds); block = mergeBlock(std::move(block), sort_description, partition_key_columns, perm_ptr, data.merging_params); } /// Size of part would not be greater than block.bytes() + epsilon size_t expected_size = block.bytes(); /// If optimize_on_insert is true, block may become empty after merge. There /// is no need to create empty part. Since expected_size could be zero when /// part only contains empty tuples. As a result, check rows instead. 
if (block.rows() == 0) return temp_part; DB::IMergeTreeDataPart::TTLInfos move_ttl_infos; const auto & move_ttl_entries = metadata_snapshot->getMoveTTLs(); for (const auto & ttl_entry : move_ttl_entries) updateTTL(context, ttl_entry, move_ttl_infos, move_ttl_infos.moves_ttl[ttl_entry.result_column], block, false); ReservationPtr reservation = data.reserveSpacePreferringTTLRules(metadata_snapshot, expected_size, move_ttl_infos, time(nullptr), 0, true); VolumePtr volume = data.getStoragePolicy()->getVolume(0); VolumePtr data_part_volume = createVolumeFromReservation(reservation, volume); auto new_data_part = data.getDataPartBuilder(part_name, data_part_volume, part_dir) .withPartFormat(data.choosePartFormat(expected_size, block.rows())) .withPartInfo(new_part_info) .build(); auto data_part_storage = new_data_part->getDataPartStoragePtr(); data_part_storage->beginTransaction(); if (data.storage_settings.get()->assign_part_uuids) new_data_part->uuid = UUIDHelpers::generateV4(); const auto & data_settings = data.getSettings(); SerializationInfo::Settings settings{data_settings->ratio_of_defaults_for_sparse_serialization, true}; SerializationInfoByName infos(columns, settings); infos.add(block); new_data_part->setColumns(columns, infos, metadata_snapshot->getMetadataVersion()); new_data_part->rows_count = block.rows(); new_data_part->existing_rows_count = block.rows(); new_data_part->partition = std::move(partition); new_data_part->minmax_idx = std::move(minmax_idx); new_data_part->is_temp = true; /// In case of replicated merge tree with zero copy replication /// Here Clickhouse claims that this new part can be deleted in temporary state without unlocking the blobs /// The blobs have to be removed along with the part, this temporary part owns them and does not share them yet. 
new_data_part->remove_tmp_policy = IMergeTreeDataPart::BlobsRemovalPolicyForTemporaryParts::REMOVE_BLOBS; SyncGuardPtr sync_guard; if (new_data_part->isStoredOnDisk()) { /// The name could be non-unique in case of stale files from previous runs. String full_path = new_data_part->getDataPartStorage().getFullPath(); if (new_data_part->getDataPartStorage().exists()) { LOG_WARNING(log, "Removing old temporary directory {}", full_path); data_part_storage->removeRecursive(); } data_part_storage->createDirectories(); if (data.getSettings()->fsync_part_directory) { const auto disk = data_part_volume->getDisk(); sync_guard = disk->getDirectorySyncGuard(full_path); } } if (metadata_snapshot->hasRowsTTL()) updateTTL(context, metadata_snapshot->getRowsTTL(), new_data_part->ttl_infos, new_data_part->ttl_infos.table_ttl, block, true); for (const auto & ttl_entry : metadata_snapshot->getGroupByTTLs()) updateTTL(context, ttl_entry, new_data_part->ttl_infos, new_data_part->ttl_infos.group_by_ttl[ttl_entry.result_column], block, true); for (const auto & ttl_entry : metadata_snapshot->getRowsWhereTTLs()) updateTTL(context, ttl_entry, new_data_part->ttl_infos, new_data_part->ttl_infos.rows_where_ttl[ttl_entry.result_column], block, true); for (const auto & [name, ttl_entry] : metadata_snapshot->getColumnTTLs()) updateTTL(context, ttl_entry, new_data_part->ttl_infos, new_data_part->ttl_infos.columns_ttl[name], block, true); const auto & recompression_ttl_entries = metadata_snapshot->getRecompressionTTLs(); for (const auto & ttl_entry : recompression_ttl_entries) updateTTL(context, ttl_entry, new_data_part->ttl_infos, new_data_part->ttl_infos.recompression_ttl[ttl_entry.result_column], block, false); new_data_part->ttl_infos.update(move_ttl_infos); /// This effectively chooses minimal compression method: /// either default lz4 or compression method with zero thresholds on absolute and relative part size. 
auto compression_codec = data.getContext()->chooseCompressionCodec(0, 0); auto out = std::make_unique<MergedBlockOutputStream>( new_data_part, metadata_snapshot, columns, indices, statistics, compression_codec, context->getCurrentTransaction() ? context->getCurrentTransaction()->tid : Tx::PrehistoricTID, false, false, context->getWriteSettings()); out->writeWithPermutation(block, perm_ptr); for (const auto & projection : metadata_snapshot->getProjections()) { Block projection_block; { ProfileEventTimeIncrement<Microseconds> watch(ProfileEvents::MergeTreeDataWriterProjectionsCalculationMicroseconds); projection_block = projection.calculate(block, context); LOG_DEBUG(log, "Spent {} ms calculating projection {} for the part {}", watch.elapsed() / 1000, projection.name, new_data_part->name); } if (projection_block.rows()) { auto proj_temp_part = writeProjectionPart(data, log, projection_block, projection, new_data_part.get(), /*merge_is_needed=*/false); new_data_part->addProjectionPart(projection.name, std::move(proj_temp_part.part)); for (auto & stream : proj_temp_part.streams) temp_part.streams.emplace_back(std::move(stream)); } } auto finalizer = out->finalizePartAsync( new_data_part, data_settings->fsync_after_insert, nullptr, nullptr); temp_part.part = new_data_part; temp_part.streams.emplace_back(TemporaryPart::Stream{.stream = std::move(out), .finalizer = std::move(finalizer)}); ProfileEvents::increment(ProfileEvents::MergeTreeDataWriterRows, block.rows()); ProfileEvents::increment(ProfileEvents::MergeTreeDataWriterUncompressedBytes, block.bytes()); ProfileEvents::increment(ProfileEvents::MergeTreeDataWriterCompressedBytes, new_data_part->getBytesOnDisk()); return temp_part; } MergeTreeDataWriter::TemporaryPart MergeTreeDataWriter::writeProjectionPartImpl( const String & part_name, bool is_temp, IMergeTreeDataPart * parent_part, const MergeTreeData & data, LoggerPtr log, Block block, const ProjectionDescription & projection, bool merge_is_needed) { TemporaryPart 
temp_part; const auto & metadata_snapshot = projection.metadata; MergeTreeDataPartType part_type; /// Size of part would not be greater than block.bytes() + epsilon size_t expected_size = block.bytes(); // just check if there is enough space on parent volume MergeTreeData::reserveSpace(expected_size, parent_part->getDataPartStorage()); part_type = data.choosePartFormatOnDisk(expected_size, block.rows()).part_type; auto new_data_part = parent_part->getProjectionPartBuilder(part_name, is_temp).withPartType(part_type).build(); auto projection_part_storage = new_data_part->getDataPartStoragePtr(); if (is_temp) projection_part_storage->beginTransaction(); new_data_part->is_temp = is_temp; NamesAndTypesList columns = metadata_snapshot->getColumns().getAllPhysical().filter(block.getNames()); SerializationInfo::Settings settings{data.getSettings()->ratio_of_defaults_for_sparse_serialization, true}; SerializationInfoByName infos(columns, settings); infos.add(block); new_data_part->setColumns(columns, infos, metadata_snapshot->getMetadataVersion()); if (new_data_part->isStoredOnDisk()) { /// The name could be non-unique in case of stale files from previous runs. if (projection_part_storage->exists()) { LOG_WARNING(log, "Removing old temporary directory {}", projection_part_storage->getFullPath()); projection_part_storage->removeRecursive(); } projection_part_storage->createDirectories(); } /// If we need to calculate some columns to sort. 
if (metadata_snapshot->hasSortingKey() || metadata_snapshot->hasSecondaryIndices()) data.getSortingKeyAndSkipIndicesExpression(metadata_snapshot, {})->execute(block); Names sort_columns = metadata_snapshot->getSortingKeyColumns(); SortDescription sort_description; size_t sort_columns_size = sort_columns.size(); sort_description.reserve(sort_columns_size); for (size_t i = 0; i < sort_columns_size; ++i) sort_description.emplace_back(sort_columns[i], 1, 1); ProfileEvents::increment(ProfileEvents::MergeTreeDataProjectionWriterBlocks); /// Sort IColumn::Permutation * perm_ptr = nullptr; IColumn::Permutation perm; if (!sort_description.empty()) { ProfileEventTimeIncrement<Microseconds> watch(ProfileEvents::MergeTreeDataProjectionWriterSortingBlocksMicroseconds); if (!isAlreadySorted(block, sort_description)) { stableGetPermutation(block, sort_description, perm); perm_ptr = &perm; } else ProfileEvents::increment(ProfileEvents::MergeTreeDataProjectionWriterBlocksAlreadySorted); } if (data.getSettings()->optimize_row_order && data.merging_params.mode == MergeTreeData::MergingParams::Mode::Ordinary) /// Nobody knows if this optimization messes up specialized MergeTree engines. { RowOrderOptimizer::optimize(block, sort_description, perm); perm_ptr = &perm; } if (projection.type == ProjectionDescription::Type::Aggregate && merge_is_needed) { ProfileEventTimeIncrement<Microseconds> watch(ProfileEvents::MergeTreeDataProjectionWriterMergingBlocksMicroseconds); MergeTreeData::MergingParams projection_merging_params; projection_merging_params.mode = MergeTreeData::MergingParams::Aggregating; block = mergeBlock(std::move(block), sort_description, {}, perm_ptr, projection_merging_params); } /// This effectively chooses minimal compression method: /// either default lz4 or compression method with zero thresholds on absolute and relative part size. 
auto compression_codec = data.getContext()->chooseCompressionCodec(0, 0); auto out = std::make_unique<MergedBlockOutputStream>( new_data_part, metadata_snapshot, columns, MergeTreeIndices{}, /// TODO(hanfei): It should be helpful to write statistics for projection result. ColumnsStatistics{}, compression_codec, Tx::PrehistoricTID, false, false, data.getContext()->getWriteSettings()); out->writeWithPermutation(block, perm_ptr); auto finalizer = out->finalizePartAsync(new_data_part, false); temp_part.part = new_data_part; temp_part.streams.emplace_back(TemporaryPart::Stream{.stream = std::move(out), .finalizer = std::move(finalizer)}); ProfileEvents::increment(ProfileEvents::MergeTreeDataProjectionWriterRows, block.rows()); ProfileEvents::increment(ProfileEvents::MergeTreeDataProjectionWriterUncompressedBytes, block.bytes()); ProfileEvents::increment(ProfileEvents::MergeTreeDataProjectionWriterCompressedBytes, new_data_part->getBytesOnDisk()); return temp_part; } MergeTreeDataWriter::TemporaryPart MergeTreeDataWriter::writeProjectionPart( const MergeTreeData & data, LoggerPtr log, Block block, const ProjectionDescription & projection, IMergeTreeDataPart * parent_part, bool merge_is_needed) { return writeProjectionPartImpl( projection.name, false /* is_temp */, parent_part, data, log, std::move(block), projection, merge_is_needed); } /// This is used for projection materialization process which may contain multiple stages of /// projection part merges. MergeTreeDataWriter::TemporaryPart MergeTreeDataWriter::writeTempProjectionPart( const MergeTreeData & data, LoggerPtr log, Block block, const ProjectionDescription & projection, IMergeTreeDataPart * parent_part, size_t block_num) { auto part_name = fmt::format("{}_{}", projection.name, block_num); return writeProjectionPartImpl( part_name, true /* is_temp */, parent_part, data, log, std::move(block), projection, /*merge_is_needed=*/true); } } ```
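The writer above computes a stable sort permutation only when the incoming block is not already sorted, then streams every column through that permutation. A minimal Python sketch of that idea (plain dict-of-lists columns standing in for the ClickHouse `Block`/`IColumn` types, not the real API):

```python
# Sketch of the sort-permutation fast path used above: compute a stable
# permutation only when the block is not already sorted, then apply it
# to every column. Plain dict-of-lists columns stand in for the
# ClickHouse Block/IColumn types.

def is_already_sorted(columns, sort_keys):
    """True if rows are already ordered by the sort key columns."""
    key_rows = list(zip(*(columns[k] for k in sort_keys)))
    return all(a <= b for a, b in zip(key_rows, key_rows[1:]))

def stable_permutation(columns, sort_keys):
    """Row indices that stably sort the block by sort_keys."""
    n = len(columns[sort_keys[0]])
    # sorted() is stable, so rows with equal keys keep their order
    return sorted(range(n), key=lambda r: tuple(columns[k][r] for k in sort_keys))

def write_with_permutation(columns, perm):
    """Materialize columns through perm; perm=None is the already-sorted fast path."""
    if perm is None:
        return {name: list(col) for name, col in columns.items()}
    return {name: [col[r] for r in perm] for name, col in columns.items()}

block = {"k": [2, 1, 2, 1], "v": ["a", "b", "c", "d"]}
perm = None if is_already_sorted(block, ["k"]) else stable_permutation(block, ["k"])
out = write_with_permutation(block, perm)
```

Skipping the permutation for already-sorted input is the same reason the real writer counts `MergeTreeDataProjectionWriterBlocksAlreadySorted` separately: the common case pays nothing.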
```text
# Human Health Max 0 CodeFreak/PS3UserCheat
6 017C55B0 00000588
A1 00000000 0004
6 017C55B0 00000584
A2 00000000 0000
# Helicopter Health Max 0 CodeFreak/PS3UserCheat
6 017BC1BC 000010F4
6 00000000 00000588
A1 00000000 0004
6 017BC1BC 000010F4
6 00000000 00000584
A2 00000000 0000
# Cash 999,999,999 0 CodeFreak/PS3UserCheat
6 01538934 00000004
6 00000000 00000028
0 00000000 3B9AC9FF
# Ammo Max 0 CodeFreak/PS3UserCheat
6 0156B730 00000000
6 00000000 00000000
6 00000000 00000014
6 00000000 00000000
6 00000000 00000008
0 00000000 0000270F
6 0156B730 00000000
6 00000000 00000000
6 00000000 00000014
6 00000000 00000000
6 00000000 0000000C
0 00000000 0000270F
6 0156B730 00000000
6 00000000 00000004
6 00000000 00000014
6 00000000 00000000
6 00000000 00000008
0 00000000 0000270F
6 0156B730 00000000
6 00000000 00000004
6 00000000 00000014
6 00000000 00000000
6 00000000 0000000C
0 00000000 0000270F
6 0156B730 00000000
6 00000000 00000008
6 00000000 00000014
6 00000000 00000000
6 00000000 00000008
0 00000000 0000270F
6 0156B730 00000000
6 00000000 00000008
6 00000000 00000014
6 00000000 00000000
6 00000000 0000000C
0 00000000 0000270F
6 0156B730 00000000
6 00000000 0000000C
6 00000000 00000014
6 00000000 00000000
6 00000000 00000008
0 00000000 0000270F
6 0156B730 00000000
6 00000000 0000000C
6 00000000 00000014
6 00000000 00000000
6 00000000 0000000C
0 00000000 0000270F
# Wanzer Health Max 0 CodeFreak/PS3UserCheat
6 017C55B0 000012DC
0 00000000 447A0000
6 017C55B0 000012D8
0 00000000 447A0000
6 017C55B0 000013E0
0 00000000 447A0000
6 017C55B0 000013DC
0 00000000 447A0000
6 017C55B0 000014E4
0 00000000 447A0000
6 017C55B0 000014E0
0 00000000 447A0000
6 017C55B0 000015E8
0 00000000 447A0000
6 017C55B0 000015E4
0 00000000 447A0000
# Infinite Wanzer Fuel Usage 0 CodeFreak/PS3UserCheat
6 017C55B0 00001A2C
0 00000000 42C80000
# Infinite Wanzer E.D.G.E. Usage 0 CodeFreak/PS3UserCheat
6 017C55B0 00001A60
0 00000000 42C80000
#
```
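Each cheat line above is a type code plus two hex fields (e.g. `6 017C55B0 00000588`). A hedged tokenizer sketch in Python; treating type `6` as a pointer-chain step and the final field of a write as the stored value is an assumption about the PS3UserCheat layout, not a documented spec:

```python
# Hedged tokenizer for cheat lines such as "6 017C55B0 00000588":
# each line is a type code plus two hex fields. The semantics assumed
# here (6 = pointer-chain step, 0 = 32-bit write) are an interpretation
# of the PS3UserCheat layout, not an authoritative spec.

def parse_cheat(lines):
    """Split each line into (type, address, value) with hex fields decoded."""
    codes = []
    for line in lines:
        type_code, addr, value = line.split()
        codes.append((type_code.upper(), int(addr, 16), int(value, 16)))
    return codes

# The "Cash 999,999,999" entry from above: two pointer steps, one write
codes = parse_cheat([
    "6 01538934 00000004",
    "6 00000000 00000028",
    "0 00000000 3B9AC9FF",
])
```

The final write value `0x3B9AC9FF` decodes to 999,999,999, which matches the cheat's name.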
```go /* path_to_url Unless required by applicable law or agreed to in writing, software WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. */ package proxy import ( "testing" api "k8s.io/api/core/v1" networking "k8s.io/api/networking/v1" meta_v1 "k8s.io/apimachinery/pkg/apis/meta/v1" "k8s.io/ingress-nginx/internal/ingress/annotations/parser" "k8s.io/ingress-nginx/internal/ingress/defaults" "k8s.io/ingress-nginx/internal/ingress/resolver" ) const ( off = "off" proxyHTTPVersion = "1.0" proxyMaxTempFileSize = "128k" ) func buildIngress() *networking.Ingress { defaultBackend := networking.IngressBackend{ Service: &networking.IngressServiceBackend{ Name: "default-backend", Port: networking.ServiceBackendPort{ Number: 80, }, }, } return &networking.Ingress{ ObjectMeta: meta_v1.ObjectMeta{ Name: "foo", Namespace: api.NamespaceDefault, }, Spec: networking.IngressSpec{ DefaultBackend: &networking.IngressBackend{ Service: &networking.IngressServiceBackend{ Name: "default-backend", Port: networking.ServiceBackendPort{ Number: 80, }, }, }, Rules: []networking.IngressRule{ { Host: "foo.bar.com", IngressRuleValue: networking.IngressRuleValue{ HTTP: &networking.HTTPIngressRuleValue{ Paths: []networking.HTTPIngressPath{ { Path: "/foo", Backend: defaultBackend, }, }, }, }, }, }, }, } } type mockBackend struct { resolver.Mock } func (m mockBackend) GetDefaultBackend() defaults.Backend { return defaults.Backend{ ProxyConnectTimeout: 10, ProxySendTimeout: 15, ProxyReadTimeout: 20, ProxyBuffersNumber: 4, ProxyBufferSize: "10k", ProxyBodySize: "3k", ProxyNextUpstream: "error", ProxyNextUpstreamTimeout: 0, ProxyNextUpstreamTries: 3, ProxyRequestBuffering: "on", ProxyBuffering: off, ProxyHTTPVersion: "1.1", ProxyMaxTempFileSize: "1024m", } } func TestProxy(t *testing.T) { ing := buildIngress() data := map[string]string{} data[parser.GetAnnotationWithPrefix("proxy-connect-timeout")] = "1" data[parser.GetAnnotationWithPrefix("proxy-send-timeout")] = "2" 
data[parser.GetAnnotationWithPrefix("proxy-read-timeout")] = "3" data[parser.GetAnnotationWithPrefix("proxy-buffers-number")] = "8" data[parser.GetAnnotationWithPrefix("proxy-buffer-size")] = "1k" data[parser.GetAnnotationWithPrefix("proxy-body-size")] = "2k" data[parser.GetAnnotationWithPrefix("proxy-next-upstream")] = off data[parser.GetAnnotationWithPrefix("proxy-next-upstream-timeout")] = "5" data[parser.GetAnnotationWithPrefix("proxy-next-upstream-tries")] = "3" data[parser.GetAnnotationWithPrefix("proxy-request-buffering")] = off data[parser.GetAnnotationWithPrefix("proxy-buffering")] = "on" data[parser.GetAnnotationWithPrefix("proxy-http-version")] = proxyHTTPVersion data[parser.GetAnnotationWithPrefix("proxy-max-temp-file-size")] = proxyMaxTempFileSize ing.SetAnnotations(data) i, err := NewParser(mockBackend{}).Parse(ing) if err != nil { t.Fatalf("unexpected error parsing a valid annotation: %v", err) } p, ok := i.(*Config) if !ok { t.Fatalf("expected a Config type") } if p.ConnectTimeout != 1 { t.Errorf("expected 1 as connect-timeout but returned %v", p.ConnectTimeout) } if p.SendTimeout != 2 { t.Errorf("expected 2 as send-timeout but returned %v", p.SendTimeout) } if p.ReadTimeout != 3 { t.Errorf("expected 3 as read-timeout but returned %v", p.ReadTimeout) } if p.BuffersNumber != 8 { t.Errorf("expected 8 as proxy-buffers-number but returned %v", p.BuffersNumber) } if p.BufferSize != "1k" { t.Errorf("expected 1k as buffer-size but returned %v", p.BufferSize) } if p.BodySize != "2k" { t.Errorf("expected 2k as body-size but returned %v", p.BodySize) } if p.NextUpstream != off { t.Errorf("expected off as next-upstream but returned %v", p.NextUpstream) } if p.NextUpstreamTimeout != 5 { t.Errorf("expected 5 as next-upstream-timeout but returned %v", p.NextUpstreamTimeout) } if p.NextUpstreamTries != 3 { t.Errorf("expected 3 as next-upstream-tries but returned %v", p.NextUpstreamTries) } if p.RequestBuffering != off { t.Errorf("expected off as request-buffering but returned %v", 
p.RequestBuffering) } if p.ProxyBuffering != "on" { t.Errorf("expected on as proxy-buffering but returned %v", p.ProxyBuffering) } if p.ProxyHTTPVersion != proxyHTTPVersion { t.Errorf("expected 1.0 as proxy-http-version but returned %v", p.ProxyHTTPVersion) } if p.ProxyMaxTempFileSize != proxyMaxTempFileSize { t.Errorf("expected 128k as proxy-max-temp-file-size but returned %v", p.ProxyMaxTempFileSize) } } func TestProxyComplex(t *testing.T) { ing := buildIngress() data := map[string]string{} data[parser.GetAnnotationWithPrefix("proxy-connect-timeout")] = "1" data[parser.GetAnnotationWithPrefix("proxy-send-timeout")] = "2" data[parser.GetAnnotationWithPrefix("proxy-read-timeout")] = "3" data[parser.GetAnnotationWithPrefix("proxy-buffers-number")] = "8" data[parser.GetAnnotationWithPrefix("proxy-buffer-size")] = "1k" data[parser.GetAnnotationWithPrefix("proxy-body-size")] = "2k" data[parser.GetAnnotationWithPrefix("proxy-next-upstream")] = "error http_502" data[parser.GetAnnotationWithPrefix("proxy-next-upstream-timeout")] = "5" data[parser.GetAnnotationWithPrefix("proxy-next-upstream-tries")] = "3" data[parser.GetAnnotationWithPrefix("proxy-request-buffering")] = "off" data[parser.GetAnnotationWithPrefix("proxy-buffering")] = "on" data[parser.GetAnnotationWithPrefix("proxy-http-version")] = proxyHTTPVersion data[parser.GetAnnotationWithPrefix("proxy-max-temp-file-size")] = proxyMaxTempFileSize ing.SetAnnotations(data) i, err := NewParser(mockBackend{}).Parse(ing) if err != nil { t.Fatalf("unexpected error parsing a valid annotation: %v", err) } p, ok := i.(*Config) if !ok { t.Fatalf("expected a Config type") } if p.ConnectTimeout != 1 { t.Errorf("expected 1 as connect-timeout but returned %v", p.ConnectTimeout) } if p.SendTimeout != 2 { t.Errorf("expected 2 as send-timeout but returned %v", p.SendTimeout) } if p.ReadTimeout != 3 { t.Errorf("expected 3 as read-timeout but returned %v", p.ReadTimeout) } if p.BuffersNumber != 8 { t.Errorf("expected 8 as proxy-buffers-number but returned 
%v", p.BuffersNumber) } if p.BufferSize != "1k" { t.Errorf("expected 1k as buffer-size but returned %v", p.BufferSize) } if p.BodySize != "2k" { t.Errorf("expected 2k as body-size but returned %v", p.BodySize) } if p.NextUpstream != "error http_502" { t.Errorf("expected error http_502 as next-upstream but returned %v", p.NextUpstream) } if p.NextUpstreamTimeout != 5 { t.Errorf("expected 5 as next-upstream-timeout but returned %v", p.NextUpstreamTimeout) } if p.NextUpstreamTries != 3 { t.Errorf("expected 3 as next-upstream-tries but returned %v", p.NextUpstreamTries) } if p.RequestBuffering != "off" { t.Errorf("expected off as request-buffering but returned %v", p.RequestBuffering) } if p.ProxyBuffering != "on" { t.Errorf("expected on as proxy-buffering but returned %v", p.ProxyBuffering) } if p.ProxyHTTPVersion != proxyHTTPVersion { t.Errorf("expected 1.0 as proxy-http-version but returned %v", p.ProxyHTTPVersion) } if p.ProxyMaxTempFileSize != proxyMaxTempFileSize { t.Errorf("expected 128k as proxy-max-temp-file-size but returned %v", p.ProxyMaxTempFileSize) } } func TestProxyWithNoAnnotation(t *testing.T) { ing := buildIngress() data := map[string]string{} ing.SetAnnotations(data) i, err := NewParser(mockBackend{}).Parse(ing) if err != nil { t.Fatalf("unexpected error parsing a valid annotation: %v", err) } p, ok := i.(*Config) if !ok { t.Fatalf("expected a Config type") } if p.ConnectTimeout != 10 { t.Errorf("expected 10 as connect-timeout but returned %v", p.ConnectTimeout) } if p.SendTimeout != 15 { t.Errorf("expected 15 as send-timeout but returned %v", p.SendTimeout) } if p.ReadTimeout != 20 { t.Errorf("expected 20 as read-timeout but returned %v", p.ReadTimeout) } if p.BuffersNumber != 4 { t.Errorf("expected 4 as buffer-number but returned %v", p.BuffersNumber) } if p.BufferSize != "10k" { t.Errorf("expected 10k as buffer-size but returned %v", p.BufferSize) } if p.BodySize != "3k" { t.Errorf("expected 3k as body-size but returned %v", p.BodySize) } if p.NextUpstream != "error" { 
t.Errorf("expected error as next-upstream but returned %v", p.NextUpstream) } if p.NextUpstreamTimeout != 0 { t.Errorf("expected 0 as next-upstream-timeout but returned %v", p.NextUpstreamTimeout) } if p.NextUpstreamTries != 3 { t.Errorf("expected 3 as next-upstream-tries but returned %v", p.NextUpstreamTries) } if p.RequestBuffering != "on" { t.Errorf("expected on as request-buffering but returned %v", p.RequestBuffering) } if p.ProxyHTTPVersion != "1.1" { t.Errorf("expected 1.1 as proxy-http-version but returned %v", p.ProxyHTTPVersion) } if p.ProxyMaxTempFileSize != "1024m" { t.Errorf("expected 1024m as proxy-max-temp-file-size but returned %v", p.ProxyMaxTempFileSize) } } ```
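The three tests above repeat the same field-by-field `expected X as Y but returned Z` checks; this shape is often collapsed into a table-driven comparison. A Python sketch of the pattern (the `check_config` helper and the config dict are hypothetical stand-ins for the Go `Config` struct; only the pattern is the point):

```python
# Table-driven version of the repeated "expected X as Y but returned Z"
# checks above, sketched in Python. check_config and the config dict
# are hypothetical stand-ins for the Go Config struct.

def check_config(config, expected):
    """Compare each expected field and collect mismatch messages."""
    errors = []
    for field, want in expected.items():
        got = config.get(field)
        if got != want:
            errors.append(f"expected {want} as {field} but returned {got}")
    return errors

# Defaults mirroring the mockBackend values used in the tests above
config = {"connect-timeout": 10, "send-timeout": 15, "buffer-size": "10k"}

errors = check_config(config, {"connect-timeout": 10, "send-timeout": 15, "buffer-size": "10k"})
bad = check_config(config, {"connect-timeout": 1})
```

One expectation table per test case keeps the assertions and their messages in sync, which is exactly the property the hand-written `t.Errorf` strings above can lose.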
```csharp using Microsoft.Xna.Framework.Graphics; using Microsoft.Xna.Framework; namespace Nez { public class WaterReflectionEffect : ReflectionEffect { /// <summary> /// defaults to 0.015. Waves are calculated by sampling the normal map twice. Any values generated that are sparkleIntensity greater /// than the actual uv value at the place of sampling will be colored sparkleColor. /// </summary> /// <value>The sparkle intensity.</value> public float SparkleIntensity { set => _sparkleIntensityParam.SetValue(value); } /// <summary> /// the color for the sparkly wave peaks /// </summary> /// <value>The color of the sparkle.</value> public Vector3 SparkleColor { set => _sparkleColorParam.SetValue(value); } /// <summary> /// position in screen space of the top of the water plane /// </summary> /// <value>The screen space vertical offset.</value> public float ScreenSpaceVerticalOffset { set => _screenSpaceVerticalOffsetParam.SetValue(Mathf.Map(value, 0, 1, -1, 1)); } /// <summary> /// defaults to 0.3. intensity of the perspective correction /// </summary> /// <value>The perspective correction intensity.</value> public float PerspectiveCorrectionIntensity { set => _perspectiveCorrectionIntensityParam.SetValue(value); } /// <summary> /// defaults to 6. speed that the first displacement/normal uv is scrolled /// </summary> /// <value>The first displacement speed.</value> public float FirstDisplacementSpeed { set => _firstDisplacementSpeedParam.SetValue(value / 100); } /// <summary> /// defaults to 2. speed that the second displacement/normal uv is scrolled /// </summary> /// <value>The second displacement speed.</value> public float SecondDisplacementSpeed { set => _secondDisplacementSpeedParam.SetValue(value / 100); } /// <summary> /// defaults to 3. the normal map is sampled twice then combined. The 2nd sampling is scaled by this value. 
/// </summary> /// <value>The second displacement scale.</value> public float SecondDisplacementScale { set => _secondDisplacementScaleParam.SetValue(value); } const float _sparkleIntensity = 0.015f; const float _perspectiveCorrectionIntensity = 0.3f; const float _reflectionIntensity = 0.85f; const float _normalMagnitude = 0.03f; const float _firstDisplacementSpeed = 6f; const float _secondDisplacementSpeed = 2f; const float _secondDisplacementScale = 3f; EffectParameter _timeParam; EffectParameter _sparkleIntensityParam; EffectParameter _sparkleColorParam; EffectParameter _screenSpaceVerticalOffsetParam; EffectParameter _perspectiveCorrectionIntensityParam; EffectParameter _firstDisplacementSpeedParam; EffectParameter _secondDisplacementSpeedParam; EffectParameter _secondDisplacementScaleParam; public WaterReflectionEffect() : base() { CurrentTechnique = Techniques["WaterReflectionTechnique"]; _timeParam = Parameters["_time"]; _sparkleIntensityParam = Parameters["_sparkleIntensity"]; _sparkleColorParam = Parameters["_sparkleColor"]; _screenSpaceVerticalOffsetParam = Parameters["_screenSpaceVerticalOffset"]; _perspectiveCorrectionIntensityParam = Parameters["_perspectiveCorrectionIntensity"]; _firstDisplacementSpeedParam = Parameters["_firstDisplacementSpeed"]; _secondDisplacementSpeedParam = Parameters["_secondDisplacementSpeed"]; _secondDisplacementScaleParam = Parameters["_secondDisplacementScale"]; _sparkleIntensityParam.SetValue(_sparkleIntensity); _sparkleColorParam.SetValue(Vector3.One); _perspectiveCorrectionIntensityParam.SetValue(_perspectiveCorrectionIntensity); FirstDisplacementSpeed = _firstDisplacementSpeed; SecondDisplacementSpeed = _secondDisplacementSpeed; _secondDisplacementScaleParam.SetValue(_secondDisplacementScale); // override some defaults from the ReflectionEffect ReflectionIntensity = _reflectionIntensity; NormalMagnitude = _normalMagnitude; } protected override void OnApply() { _timeParam.SetValue(Time.TotalTime); } } } ```
```cpp /* * * Redistribution and use in source and binary forms, with or without * modification, are permitted provided that the following conditions are * met: * * * Redistributions of source code must retain the above copyright * notice, this list of conditions and the following disclaimer. * * Redistributions in binary form must reproduce the above * copyright notice, this list of conditions and the following disclaimer * in the documentation and/or other materials provided with the * distribution. * * Neither the name of Google Inc. nor the names of its * contributors may be used to endorse or promote products derived from * this software without specific prior written permission. * * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS * "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT * LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR * A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT * OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, * SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT * LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, * DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY * THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
*/ #ifndef PictureSnapshot_h #define PictureSnapshot_h #include "platform/JSONValues.h" #include "platform/PlatformExport.h" #include "platform/graphics/GraphicsContext.h" #include "third_party/skia/include/core/SkPicture.h" #include "third_party/skia/include/core/SkPictureRecorder.h" #include "wtf/RefCounted.h" namespace blink { class FloatRect; class PLATFORM_EXPORT PictureSnapshot : public RefCounted<PictureSnapshot> { WTF_MAKE_NONCOPYABLE(PictureSnapshot); public: typedef Vector<Vector<double>> Timings; struct TilePictureStream : RefCounted<TilePictureStream> { FloatPoint layerOffset; Vector<char> data; }; static PassRefPtr<PictureSnapshot> load(const Vector<RefPtr<TilePictureStream>>&); PictureSnapshot(PassRefPtr<const SkPicture>); PassOwnPtr<Vector<char>> replay(unsigned fromStep = 0, unsigned toStep = 0, double scale = 1.0) const; PassOwnPtr<Timings> profile(unsigned minIterations, double minDuration, const FloatRect* clipRect) const; PassRefPtr<JSONArray> snapshotCommandLog() const; bool isEmpty() const; private: PassOwnPtr<SkBitmap> createBitmap() const; RefPtr<const SkPicture> m_picture; }; } // namespace blink #endif // PictureSnapshot_h ```
```java * * path_to_url * * Unless required by applicable law or agreed to in writing, software * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. */ package org.flowable.cmmn.converter.export; import javax.xml.stream.XMLStreamWriter; import org.apache.commons.lang3.StringUtils; import org.flowable.cmmn.converter.CmmnXmlConstants; import org.flowable.cmmn.model.CmmnModel; import org.flowable.cmmn.model.ReactivateEventListener; import org.flowable.cmmn.model.ReactivationRule; /** * Exports a reactivation event listener and all of its attributes and elements. * * @author Micha Kiener */ public class ReactivationEventListenerExport extends AbstractPlanItemDefinitionExport<ReactivateEventListener> { @Override protected Class<? extends ReactivateEventListener> getExportablePlanItemDefinitionClass() { return ReactivateEventListener.class; } @Override protected String getPlanItemDefinitionXmlElementValue(ReactivateEventListener reactivationEventListener) { return ELEMENT_GENERIC_EVENT_LISTENER; } @Override protected void writePlanItemDefinitionSpecificAttributes(ReactivateEventListener reactivationEventListener, XMLStreamWriter xtw) throws Exception { super.writePlanItemDefinitionSpecificAttributes(reactivationEventListener, xtw); xtw.writeAttribute(FLOWABLE_EXTENSIONS_NAMESPACE, CmmnXmlConstants.ATTRIBUTE_EVENT_LISTENER_TYPE, "reactivate"); if (StringUtils.isNotEmpty(reactivationEventListener.getAvailableConditionExpression())) { xtw.writeAttribute(FLOWABLE_EXTENSIONS_NAMESPACE, CmmnXmlConstants.ATTRIBUTE_EVENT_LISTENER_AVAILABLE_CONDITION, reactivationEventListener.getAvailableConditionExpression()); } } @Override protected boolean writePlanItemDefinitionExtensionElements(CmmnModel model, ReactivateEventListener reactivationEventListener, boolean didWriteExtensionElement, XMLStreamWriter xtw) throws Exception { boolean extensionElementsWritten = super.writePlanItemDefinitionExtensionElements(model, reactivationEventListener, 
didWriteExtensionElement, xtw); ReactivationRule reactivationRule = reactivationEventListener.getDefaultReactivationRule(); if (reactivationRule != null) { if (!extensionElementsWritten) { xtw.writeStartElement(CmmnXmlConstants.ELEMENT_EXTENSION_ELEMENTS); extensionElementsWritten = true; } xtw.writeStartElement(FLOWABLE_EXTENSIONS_PREFIX, ELEMENT_DEFAULT_REACTIVATION_RULE, FLOWABLE_EXTENSIONS_NAMESPACE); PlanItemControlExport.writeReactivationRuleAttributes(reactivationRule, xtw); xtw.writeEndElement(); } return extensionElementsWritten; } } ```
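`writePlanItemDefinitionExtensionElements` opens the `<extensionElements>` wrapper lazily: the start tag is written only once a first child element actually needs it, and the flag is returned so the caller knows whether to write the matching end tag. A sketch of that pattern in Python, with a toy string writer standing in for `XMLStreamWriter`:

```python
# Sketch of the lazy "open the wrapper only when a child is written"
# pattern used above. A toy string writer stands in for the Java
# XMLStreamWriter; tag names are illustrative.

class Writer:
    def __init__(self):
        self.parts = []

    def start(self, tag):
        self.parts.append(f"<{tag}>")

    def end(self, tag):
        self.parts.append(f"</{tag}>")

def write_children(writer, children, wrapper_open=False):
    """Write child elements, opening the wrapper on the first one only."""
    for child in children:
        if not wrapper_open:
            writer.start("extensionElements")  # opened lazily, once
            wrapper_open = True
        writer.start(child)
        writer.end(child)
    return wrapper_open  # caller closes the wrapper iff this is True

w = Writer()
opened = write_children(w, ["defaultReactivationRule"])
if opened:
    w.end("extensionElements")
xml = "".join(w.parts)
```

With no children the wrapper is never opened and nothing is emitted, which is why the Java code threads `didWriteExtensionElement` through every writer in the chain.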
Allsvenskan 1992, part of the 1992 Swedish football season, was the 68th Allsvenskan season played. IFK Norrköping won the league ahead of runners-up Östers IF, and advanced to Mästerskapsserien 1992 along with the teams placed 3 to 6, while the teams placed 7 to 10 advanced to Kvalsvenskan 1992. Spring 1992 League table Results Autumn 1992 Mästerskapsserien 1992 Results Kvalsvenskan 1992 Allsvenskan qualification play-offs 1992 1st round 2nd round Promotions, relegations and qualifications Season statistics Top scorers
```javascript define("ace/mode/doc_comment_highlight_rules",["require","exports","module","ace/lib/oop","ace/mode/text_highlight_rules"], function(require, exports, module) { "use strict"; var oop = require("../lib/oop"); var TextHighlightRules = require("./text_highlight_rules").TextHighlightRules; var DocCommentHighlightRules = function() { this.$rules = { "start" : [ { token : "comment.doc.tag", regex : "@[\\w\\d_]+" // TODO: fix email addresses }, DocCommentHighlightRules.getTagRule(), { defaultToken : "comment.doc", caseInsensitive: true }] }; }; oop.inherits(DocCommentHighlightRules, TextHighlightRules); DocCommentHighlightRules.getTagRule = function(start) { return { token : "comment.doc.tag.storage.type", regex : "\\b(?:TODO|FIXME|XXX|HACK)\\b" }; } DocCommentHighlightRules.getStartRule = function(start) { return { token : "comment.doc", // doc comment regex : "\\/\\*(?=\\*)", next : start }; }; DocCommentHighlightRules.getEndRule = function (start) { return { token : "comment.doc", // closing comment regex : "\\*\\/", next : start }; }; exports.DocCommentHighlightRules = DocCommentHighlightRules; }); define("ace/mode/d_highlight_rules",["require","exports","module","ace/lib/oop","ace/mode/doc_comment_highlight_rules","ace/mode/text_highlight_rules"], function(require, exports, module) { "use strict"; var oop = require("../lib/oop"); var DocCommentHighlightRules = require("./doc_comment_highlight_rules").DocCommentHighlightRules; var TextHighlightRules = require("./text_highlight_rules").TextHighlightRules; var DHighlightRules = function() { var keywords = ( "this|super|import|module|body|mixin|__traits|invariant|alias|asm|delete|"+ "typeof|typeid|sizeof|cast|new|in|is|typedef|__vector|__parameters" ); var keywordControls = ( "break|case|continue|default|do|else|for|foreach|foreach_reverse|goto|if|" + "return|switch|while|catch|try|throw|finally|version|assert|unittest|with" ); var types = ( "auto|bool|char|dchar|wchar|byte|ubyte|float|double|real|" + 
"cfloat|creal|cdouble|cent|ifloat|ireal|idouble|" + "int|long|short|void|uint|ulong|ushort|ucent|" + "function|delegate|string|wstring|dstring|size_t|ptrdiff_t|hash_t|Object" ); var modifiers = ( "abstract|align|debug|deprecated|export|extern|const|final|in|inout|out|" + "ref|immutable|lazy|nothrow|override|package|pragma|private|protected|" + "public|pure|scope|shared|__gshared|synchronized|static|volatile" ); var storages = ( "class|struct|union|template|interface|enum|macro" ); var stringEscapesSeq = { token: "constant.language.escape", regex: "\\\\(?:(?:x[0-9A-F]{2})|(?:[0-7]{1,3})|(?:['\"\\?0abfnrtv\\\\])|" + "(?:u[0-9a-fA-F]{4})|(?:U[0-9a-fA-F]{8}))" }; var builtinConstants = ( "null|true|false|"+ "__DATE__|__EOF__|__TIME__|__TIMESTAMP__|__VENDOR__|__VERSION__|"+ "__FILE__|__MODULE__|__LINE__|__FUNCTION__|__PRETTY_FUNCTION__" ); var operators = ( "/|/\\=|&|&\\=|&&|\\|\\|\\=|\\|\\||\\-|\\-\\=|\\-\\-|\\+|" + "\\+\\=|\\+\\+|\\<|\\<\\=|\\<\\<|\\<\\<\\=|\\<\\>|\\<\\>\\=|\\>|\\>\\=|\\>\\>\\=|" + "\\>\\>\\>\\=|\\>\\>|\\>\\>\\>|\\!|\\!\\=|\\!\\<\\>|\\!\\<\\>\\=|\\!\\<|\\!\\<\\=|" + "\\!\\>|\\!\\>\\=|\\?|\\$|\\=|\\=\\=|\\*|\\*\\=|%|%\\=|" + "\\^|\\^\\=|\\^\\^|\\^\\^\\=|~|~\\=|\\=\\>|#" ); var keywordMapper = this.$keywords = this.createKeywordMapper({ "keyword.modifier" : modifiers, "keyword.control" : keywordControls, "keyword.type" : types, "keyword": keywords, "keyword.storage": storages, "punctation": "\\.|\\,|;|\\.\\.|\\.\\.\\.", "keyword.operator" : operators, "constant.language": builtinConstants }, "identifier"); var identifierRe = "[a-zA-Z_\u00a1-\uffff][a-zA-Z\\d_\u00a1-\uffff]*\\b"; this.$rules = { "start" : [ { //-------------------------------------------------------- COMMENTS token : "comment", regex : "\\/\\/.*$" }, DocCommentHighlightRules.getStartRule("doc-start"), { token : "comment", // multi line comment regex : "\\/\\*", next : "star-comment" }, { token: "comment.shebang", regex: "^\\s*#!.*" }, { token : "comment", regex : "\\/\\+", next: 
"plus-comment" }, { //-------------------------------------------------------- STRINGS onMatch: function(value, currentState, state) { state.unshift(this.next, value.substr(2)); return "string"; }, regex: 'q"(?:[\\[\\(\\{\\<]+)', next: 'operator-heredoc-string' }, { onMatch: function(value, currentState, state) { state.unshift(this.next, value.substr(2)); return "string"; }, regex: 'q"(?:[a-zA-Z_]+)$', next: 'identifier-heredoc-string' }, { token : "string", // multi line string start regex : '[xr]?"', next : "quote-string" }, { token : "string", // multi line string start regex : '[xr]?`', next : "backtick-string" }, { token : "string", // single line regex : "[xr]?['](?:(?:\\\\.)|(?:[^'\\\\]))*?['][cdw]?" }, { //-------------------------------------------------------- RULES token: ["keyword", "text", "paren.lparen"], regex: /(asm)(\s*)({)/, next: "d-asm" }, { token: ["keyword", "text", "paren.lparen", "constant.language"], regex: "(__traits)(\\s*)(\\()("+identifierRe+")" }, { // import|module abc token: ["keyword", "text", "variable.module"], regex: "(import|module)(\\s+)((?:"+identifierRe+"\\.?)*)" }, { // storage Name token: ["keyword.storage", "text", "entity.name.type"], regex: "("+storages+")(\\s*)("+identifierRe+")" }, { // alias|typedef foo bar; token: ["keyword", "text", "variable.storage", "text"], regex: "(alias|typedef)(\\s*)("+identifierRe+")(\\s*)" }, { //-------------------------------------------------------- OTHERS token : "constant.numeric", // hex regex : "0[xX][0-9a-fA-F_]+(l|ul|u|f|F|L|U|UL)?\\b" }, { token : "constant.numeric", // float regex : "[+-]?\\d[\\d_]*(?:(?:\\.[\\d_]*)?(?:[eE][+-]?[\\d_]+)?)?(l|ul|u|f|F|L|U|UL)?\\b" }, { token: "entity.other.attribute-name", regex: "@"+identifierRe }, { token : keywordMapper, regex : "[a-zA-Z_][a-zA-Z0-9_]*\\b" }, { token : "keyword.operator", regex : operators }, { token : "punctuation.operator", regex : "\\?|\\:|\\,|\\;|\\.|\\:" }, { token : "paren.lparen", regex : "[[({]" }, { token : 
"paren.rparen", regex : "[\\])}]" }, { token : "text", regex : "\\s+" } ], "star-comment" : [ { token : "comment", // closing comment regex : "\\*\\/", next : "start" }, { defaultToken: 'comment' } ], "plus-comment" : [ { token : "comment", // closing comment regex : "\\+\\/", next : "start" }, { defaultToken: 'comment' } ], "quote-string" : [ stringEscapesSeq, { token : "string", regex : '"[cdw]?', next : "start" }, { defaultToken: 'string' } ], "backtick-string" : [ stringEscapesSeq, { token : "string", regex : '`[cdw]?', next : "start" }, { defaultToken: 'string' } ], "operator-heredoc-string": [ { onMatch: function(value, currentState, state) { value = value.substring(value.length-2, value.length-1); var map = {'>':'<',']':'[',')':'(','}':'{'}; if(Object.keys(map).indexOf(value) != -1) value = map[value]; if(value != state[1]) return "string"; state.shift(); state.shift(); return "string"; }, regex: '(?:[\\]\\)}>]+)"', next: 'start' }, { token: 'string', regex: '[^\\]\\)}>]+' } ], "identifier-heredoc-string": [ { onMatch: function(value, currentState, state) { value = value.substring(0, value.length-1); if(value != state[1]) return "string"; state.shift(); state.shift(); return "string"; }, regex: '^(?:[A-Za-z_][a-zA-Z0-9]+)"', next: 'start' }, { token: 'string', regex: '[^\\]\\)}>]+' } ], "d-asm": [ { token: "paren.rparen", regex: "\\}", next: "start" }, { token: 'keyword.instruction', regex: '[a-zA-Z]+', next: 'd-asm-instruction' }, { token: "text", regex: "\\s+" } ], 'd-asm-instruction': [ { token: 'constant.language', regex: /AL|AH|AX|EAX|BL|BH|BX|EBX|CL|CH|CX|ECX|DL|DH|DX|EDX|BP|EBP|SP|ESP|DI|EDI|SI|ESI/i }, { token: 'identifier', regex: '[a-zA-Z]+' }, { token: 'string', regex: '".*"' }, { token: 'comment', regex: '//.*$' }, { token: 'constant.numeric', regex: '[0-9.xA-F]+' }, { token: 'punctuation.operator', regex: '\\,' }, { token: 'punctuation.operator', regex: ';', next: 'd-asm' }, { token: 'text', regex: '\\s+' } ] }; 
this.embedRules(DocCommentHighlightRules, "doc-", [ DocCommentHighlightRules.getEndRule("start") ]); }; DHighlightRules.metaData = { comment: 'D language', fileTypes: [ 'd', 'di' ], firstLineMatch: '^#!.*\\b[glr]?dmd\\b.', foldingStartMarker: '(?x)/\\*\\*(?!\\*)|^(?![^{]*?//|[^{]*?/\\*(?!.*?\\*/.*?\\{)).*?\\{\\s*($|//|/\\*(?!.*?\\*/.*\\S))', foldingStopMarker: '(?<!\\*)\\*\\*/|^\\s*\\}', keyEquivalent: '^~D', name: 'D', scopeName: 'source.d' }; oop.inherits(DHighlightRules, TextHighlightRules); exports.DHighlightRules = DHighlightRules; }); define("ace/mode/folding/cstyle",["require","exports","module","ace/lib/oop","ace/range","ace/mode/folding/fold_mode"], function(require, exports, module) { "use strict"; var oop = require("../../lib/oop"); var Range = require("../../range").Range; var BaseFoldMode = require("./fold_mode").FoldMode; var FoldMode = exports.FoldMode = function(commentRegex) { if (commentRegex) { this.foldingStartMarker = new RegExp( this.foldingStartMarker.source.replace(/\|[^|]*?$/, "|" + commentRegex.start) ); this.foldingStopMarker = new RegExp( this.foldingStopMarker.source.replace(/\|[^|]*?$/, "|" + commentRegex.end) ); } }; oop.inherits(FoldMode, BaseFoldMode); (function() { this.foldingStartMarker = /(\{|\[)[^\}\]]*$|^\s*(\/\*)/; this.foldingStopMarker = /^[^\[\{]*(\}|\])|^[\s\*]*(\*\/)/; this.singleLineBlockCommentRe= /^\s*(\/\*).*\*\/\s*$/; this.tripleStarBlockCommentRe = /^\s*(\/\*\*\*).*\*\/\s*$/; this.startRegionRe = /^\s*(\/\*|\/\/)#?region\b/; this._getFoldWidgetBase = this.getFoldWidget; this.getFoldWidget = function(session, foldStyle, row) { var line = session.getLine(row); if (this.singleLineBlockCommentRe.test(line)) { if (!this.startRegionRe.test(line) && !this.tripleStarBlockCommentRe.test(line)) return ""; } var fw = this._getFoldWidgetBase(session, foldStyle, row); if (!fw && this.startRegionRe.test(line)) return "start"; // lineCommentRegionStart return fw; }; this.getFoldWidgetRange = function(session, foldStyle, row, 
forceMultiline) { var line = session.getLine(row); if (this.startRegionRe.test(line)) return this.getCommentRegionBlock(session, line, row); var match = line.match(this.foldingStartMarker); if (match) { var i = match.index; if (match[1]) return this.openingBracketBlock(session, match[1], row, i); var range = session.getCommentFoldRange(row, i + match[0].length, 1); if (range && !range.isMultiLine()) { if (forceMultiline) { range = this.getSectionRange(session, row); } else if (foldStyle != "all") range = null; } return range; } if (foldStyle === "markbegin") return; var match = line.match(this.foldingStopMarker); if (match) { var i = match.index + match[0].length; if (match[1]) return this.closingBracketBlock(session, match[1], row, i); return session.getCommentFoldRange(row, i, -1); } }; this.getSectionRange = function(session, row) { var line = session.getLine(row); var startIndent = line.search(/\S/); var startRow = row; var startColumn = line.length; row = row + 1; var endRow = row; var maxRow = session.getLength(); while (++row < maxRow) { line = session.getLine(row); var indent = line.search(/\S/); if (indent === -1) continue; if (startIndent > indent) break; var subRange = this.getFoldWidgetRange(session, "all", row); if (subRange) { if (subRange.start.row <= startRow) { break; } else if (subRange.isMultiLine()) { row = subRange.end.row; } else if (startIndent == indent) { break; } } endRow = row; } return new Range(startRow, startColumn, endRow, session.getLine(endRow).length); }; this.getCommentRegionBlock = function(session, line, row) { var startColumn = line.search(/\s*$/); var maxRow = session.getLength(); var startRow = row; var re = /^\s*(?:\/\*|\/\/|--)#?(end)?region\b/; var depth = 1; while (++row < maxRow) { line = session.getLine(row); var m = re.exec(line); if (!m) continue; if (m[1]) depth--; else depth++; if (!depth) break; } var endRow = row; if (endRow > startRow) { return new Range(startRow, startColumn, endRow, line.length); } }; 
}).call(FoldMode.prototype); }); define("ace/mode/d",["require","exports","module","ace/lib/oop","ace/mode/text","ace/mode/d_highlight_rules","ace/mode/folding/cstyle"], function(require, exports, module) { "use strict"; var oop = require("../lib/oop"); var TextMode = require("./text").Mode; var DHighlightRules = require("./d_highlight_rules").DHighlightRules; var FoldMode = require("./folding/cstyle").FoldMode; var Mode = function() { this.HighlightRules = DHighlightRules; this.foldingRules = new FoldMode(); this.$behaviour = this.$defaultBehaviour; }; oop.inherits(Mode, TextMode); (function() { this.lineCommentStart = "//"; this.blockComment = {start: "/*", end: "*/"}; this.$id = "ace/mode/d"; }).call(Mode.prototype); exports.Mode = Mode; }); ```
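The `operator-heredoc-string` rule above closes a D delimited string such as `q"{ ... }"` by mapping the closing bracket back to its opening counterpart and comparing it with the opener saved on the tokenizer's state stack. A minimal standalone sketch of just that check (the function name `matchesOpener` is hypothetical, not part of Ace):

```javascript
// Sketch of the delimiter check in the "operator-heredoc-string" rule:
// D delimited strings such as q"{ ... }" or q"< ... >" must end with the
// bracket matching the opener, so the closing character is mapped back to
// its opening counterpart before being compared with the opener that was
// pushed onto the state stack when the string started.
function matchesOpener(closingChar, savedOpener) {
  var map = { '>': '<', ']': '[', ')': '(', '}': '{' };
  var mapped = Object.prototype.hasOwnProperty.call(map, closingChar)
    ? map[closingChar]
    : closingChar; // identifier-delimited strings compare verbatim
  return mapped === savedOpener;
}
```

For example, `matchesOpener('}', '{')` holds while `matchesOpener(']', '(')` does not, which is the same decision the `onMatch` handler makes before popping the heredoc state.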
```c /* $OpenBSD: fpu_sqrt.c,v 1.7 2024/03/29 21:02:11 miod Exp $ */ /* * The Regents of the University of California. All rights reserved. * * This software was developed by the Computer Systems Engineering group * at Lawrence Berkeley Laboratory under DARPA contract BG 91-66 and * contributed to Berkeley. * * All advertising materials mentioning features or use of this software * must display the following acknowledgement: * This product includes software developed by the University of * California, Lawrence Berkeley Laboratory. * * Redistribution and use in source and binary forms, with or without * modification, are permitted provided that the following conditions * are met: * 1. Redistributions of source code must retain the above copyright * notice, this list of conditions and the following disclaimer. * 2. Redistributions in binary form must reproduce the above copyright * notice, this list of conditions and the following disclaimer in the * documentation and/or other materials provided with the distribution. * 3. All advertising materials mentioning features or use of this software * must display the following acknowledgement: * This product includes software developed by the University of * California, Berkeley and its contributors. * 4. Neither the name of the University nor the names of its contributors * may be used to endorse or promote products derived from this software * without specific prior written permission. * * THIS SOFTWARE IS PROVIDED BY THE REGENTS AND CONTRIBUTORS ``AS IS'' AND * ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE * IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE * ARE DISCLAIMED. 
IN NO EVENT SHALL THE REGENTS OR CONTRIBUTORS BE LIABLE * FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL * DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS * OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) * HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT * LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY * OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF * SUCH DAMAGE. * * @(#)fpu_sqrt.c 8.1 (Berkeley) 6/11/93 * $NetBSD: fpu_sqrt.c,v 1.2 1994/11/20 20:52:46 deraadt Exp $ */ /* * Perform an FPU square root (return sqrt(x)). */ #include <sys/types.h> #include "fpu_arith.h" #include "fpu_emu.h" #include "fpu_extern.h" /* * Our task is to calculate the square root of a floating point number x0. * This number x normally has the form: * * exp * x = mant * 2 (where 1 <= mant < 2 and exp is an integer) * * This can be left as it stands, or the mantissa can be doubled and the * exponent decremented: * * exp-1 * x = (2 * mant) * 2 (where 2 <= 2 * mant < 4) * * If the exponent `exp' is even, the square root of the number is best * handled using the first form, and is by definition equal to: * * exp/2 * sqrt(x) = sqrt(mant) * 2 * * If exp is odd, on the other hand, it is convenient to use the second * form, giving: * * (exp-1)/2 * sqrt(x) = sqrt(2 * mant) * 2 * * In the first case, we have * * 1 <= mant < 2 * * and therefore * * sqrt(1) <= sqrt(mant) < sqrt(2) * * while in the second case we have * * 2 <= 2*mant < 4 * * and therefore * * sqrt(2) <= sqrt(2*mant) < sqrt(4) * * so that in any case, we are sure that * * sqrt(1) <= sqrt(n * mant) < sqrt(4), n = 1 or 2 * * or * * 1 <= sqrt(n * mant) < 2, n = 1 or 2. * * This root is therefore a properly formed mantissa for a floating * point number. The exponent of sqrt(x) is either exp/2 or (exp-1)/2 * as above. 
This leaves us with the problem of finding the square root * of a fixed-point number in the range [1..4). * * Though it may not be instantly obvious, the following square root * algorithm works for any integer x of an even number of bits, provided * that no overflows occur: * * let q = 0 * for k = NBITS-1 to 0 step -1 do -- for each digit in the answer... * x *= 2 -- multiply by radix, for next digit * if x >= 2q + 2^k then -- if adding 2^k does not * x -= 2q + 2^k -- exceed the correct root, * q += 2^k -- add 2^k and adjust x * fi * done * sqrt = q / 2^(NBITS/2) -- (and any remainder is in x) * * If NBITS is odd (so that k is initially even), we can just add another * zero bit at the top of x. Doing so means that q is not going to acquire * a 1 bit in the first trip around the loop (since x0 < 2^NBITS). If the * final value in x is not needed, or can be off by a factor of 2, this is * equivalent to moving the `x *= 2' step to the bottom of the loop: * * for k = NBITS-1 to 0 step -1 do if ... fi; x *= 2; done * * and the result q will then be sqrt(x0) * 2^floor(NBITS / 2). * (Since the algorithm is destructive on x, we will call x's initial * value, for which q is some power of two times its square root, x0.) * * If we insert a loop invariant y = 2q, we can then rewrite this using * C notation as: * * q = y = 0; x = x0; * for (k = NBITS; --k >= 0;) { * #if (NBITS is even) * x *= 2; * #endif * t = y + (1 << k); * if (x >= t) { * x -= t; * q += 1 << k; * y += 1 << (k + 1); * } * #if (NBITS is odd) * x *= 2; * #endif * } * * If x0 is fixed point, rather than an integer, we can simply alter the * scale factor between q and sqrt(x0). As it happens, we can easily arrange * for the scale factor to be 2**0 or 1, so that sqrt(x0) == q. * * In our case, however, x0 (and therefore x, y, q, and t) are multiword * integers, which adds some complication. 
But note that q is built one * bit at a time, from the top down, and is not used itself in the loop * (we use 2q as held in y instead). This means we can build our answer * in an integer, one word at a time, which saves a bit of work. Also, * since 1 << k is always a `new' bit in q, 1 << k and 1 << (k+1) are * `new' bits in y and we can set them with an `or' operation rather than * a full-blown multiword add. * * We are almost done, except for one snag. We must prove that none of our * intermediate calculations can overflow. We know that x0 is in [1..4) * and therefore the square root in q will be in [1..2), but what about x, * y, and t? * * We know that y = 2q at the beginning of each loop. (The relation only * fails temporarily while y and q are being updated.) Since q < 2, y < 4. * The sum in t can, in our case, be as much as y+(1<<1) = y+2 < 6, and. * Furthermore, we can prove with a bit of work that x never exceeds y by * more than 2, so that even after doubling, 0 <= x < 8. (This is left as * an exercise to the reader, mostly because I have become tired of working * on this comment.) * * If our floating point mantissas (which are of the form 1.frac) occupy * B+1 bits, our largest intermediary needs at most B+3 bits, or two extra. * In fact, we want even one more bit (for a carry, to avoid compares), or * three extra. There is a comment in fpu_emu.h reminding maintainers of * this, so we have some justification in assuming it. */ struct fpn * __fpu_sqrt(fe) struct fpemu *fe; { struct fpn *x = &fe->fe_f1; u_int bit, q, tt; u_int x0, x1, x2, x3; u_int y0, y1, y2, y3; u_int d0, d1, d2, d3; int e; /* * Take care of special cases first. In order: * * sqrt(NaN) = NaN * sqrt(+0) = +0 * sqrt(-0) = -0 * sqrt(x < 0) = NaN (including sqrt(-Inf)) * sqrt(+Inf) = +Inf * * Then all that remains are numbers with mantissas in [1..2). */ if (ISNAN(x) || ISZERO(x)) return (x); if (x->fp_sign) return (__fpu_newnan(fe)); if (ISINF(x)) return (x); /* * Calculate result exponent. 
As noted above, this may involve * doubling the mantissa. We will also need to double x each * time around the loop, so we define a macro for this here, and * we break out the multiword mantissa. */ #ifdef FPU_SHL1_BY_ADD #define DOUBLE_X { \ FPU_ADDS(x3, x3, x3); FPU_ADDCS(x2, x2, x2); \ FPU_ADDCS(x1, x1, x1); FPU_ADDC(x0, x0, x0); \ } #else #define DOUBLE_X { \ x0 = (x0 << 1) | (x1 >> 31); x1 = (x1 << 1) | (x2 >> 31); \ x2 = (x2 << 1) | (x3 >> 31); x3 <<= 1; \ } #endif #if (FP_NMANT & 1) != 0 # define ODD_DOUBLE DOUBLE_X # define EVEN_DOUBLE /* nothing */ #else # define ODD_DOUBLE /* nothing */ # define EVEN_DOUBLE DOUBLE_X #endif x0 = x->fp_mant[0]; x1 = x->fp_mant[1]; x2 = x->fp_mant[2]; x3 = x->fp_mant[3]; e = x->fp_exp; if (e & 1) /* exponent is odd; use sqrt(2mant) */ DOUBLE_X; /* THE FOLLOWING ASSUMES THAT RIGHT SHIFT DOES SIGN EXTENSION */ x->fp_exp = e >> 1; /* calculates (e&1 ? (e-1)/2 : e/2 */ /* * Now calculate the mantissa root. Since x is now in [1..4), * we know that the first trip around the loop will definitely * set the top bit in q, so we can do that manually and start * the loop at the next bit down instead. We must be sure to * double x correctly while doing the `known q=1.0'. * * We do this one mantissa-word at a time, as noted above, to * save work. To avoid `(1U << 31) << 1', we also do the top bit * outside of each per-word loop. * * The calculation `t = y + bit' breaks down into `t0 = y0, ..., * t3 = y3, t? |= bit' for the appropriate word. Since the bit * is always a `new' one, this means that three of the `t?'s are * just the corresponding `y?'; we use `#define's here for this. * The variable `tt' holds the actual `t?' variable. 
*/ /* calculate q0 */ #define t0 tt bit = FP_1; EVEN_DOUBLE; /* if (x >= (t0 = y0 | bit)) { */ /* always true */ q = bit; x0 -= bit; y0 = bit << 1; /* } */ ODD_DOUBLE; while ((bit >>= 1) != 0) { /* for remaining bits in q0 */ EVEN_DOUBLE; t0 = y0 | bit; /* t = y + bit */ if (x0 >= t0) { /* if x >= t then */ x0 -= t0; /* x -= t */ q |= bit; /* q += bit */ y0 |= bit << 1; /* y += bit << 1 */ } ODD_DOUBLE; } x->fp_mant[0] = q; #undef t0 /* calculate q1. note (y0&1)==0. */ #define t0 y0 #define t1 tt q = 0; y1 = 0; bit = 1U << 31; EVEN_DOUBLE; t1 = bit; FPU_SUBS(d1, x1, t1); FPU_SUBC(d0, x0, t0); /* d = x - t */ if ((int)d0 >= 0) { /* if d >= 0 (i.e., x >= t) then */ x0 = d0, x1 = d1; /* x -= t */ q = bit; /* q += bit */ y0 |= 1; /* y += bit << 1 */ } ODD_DOUBLE; while ((bit >>= 1) != 0) { /* for remaining bits in q1 */ EVEN_DOUBLE; /* as before */ t1 = y1 | bit; FPU_SUBS(d1, x1, t1); FPU_SUBC(d0, x0, t0); if ((int)d0 >= 0) { x0 = d0, x1 = d1; q |= bit; y1 |= bit << 1; } ODD_DOUBLE; } x->fp_mant[1] = q; #undef t1 /* calculate q2. note (y1&1)==0; y0 (aka t0) is fixed. */ #define t1 y1 #define t2 tt q = 0; y2 = 0; bit = 1U << 31; EVEN_DOUBLE; t2 = bit; FPU_SUBS(d2, x2, t2); FPU_SUBCS(d1, x1, t1); FPU_SUBC(d0, x0, t0); if ((int)d0 >= 0) { x0 = d0, x1 = d1, x2 = d2; q |= bit; y1 |= 1; /* now t1, y1 are set in concrete */ } ODD_DOUBLE; while ((bit >>= 1) != 0) { EVEN_DOUBLE; t2 = y2 | bit; FPU_SUBS(d2, x2, t2); FPU_SUBCS(d1, x1, t1); FPU_SUBC(d0, x0, t0); if ((int)d0 >= 0) { x0 = d0, x1 = d1, x2 = d2; q |= bit; y2 |= bit << 1; } ODD_DOUBLE; } x->fp_mant[2] = q; #undef t2 /* calculate q3. y0, t0, y1, t1 all fixed; y2, t2, almost done. 
*/ #define t2 y2 #define t3 tt q = 0; y3 = 0; bit = 1U << 31; EVEN_DOUBLE; t3 = bit; FPU_SUBS(d3, x3, t3); FPU_SUBCS(d2, x2, t2); FPU_SUBCS(d1, x1, t1); FPU_SUBC(d0, x0, t0); if ((int)d0 >= 0) { x0 = d0, x1 = d1, x2 = d2, x3 = d3; q |= bit; y2 |= 1; } ODD_DOUBLE; while ((bit >>= 1) != 0) { EVEN_DOUBLE; t3 = y3 | bit; FPU_SUBS(d3, x3, t3); FPU_SUBCS(d2, x2, t2); FPU_SUBCS(d1, x1, t1); FPU_SUBC(d0, x0, t0); if ((int)d0 >= 0) { x0 = d0, x1 = d1, x2 = d2, x3 = d3; q |= bit; y3 |= bit << 1; } ODD_DOUBLE; } x->fp_mant[3] = q; /* * The result, which includes guard and round bits, is exact iff * x is now zero; any nonzero bits in x represent sticky bits. */ x->fp_sticky = x0 | x1 | x2 | x3; return (x); } ```
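The bit-at-a-time loop described in the long comment can be exercised on ordinary single-word integers before tackling the multiword version. The following is a minimal sketch under that reading (`fixsqrt16` is an illustrative name, not part of the kernel source): it runs the `x *= 2; t = y + (1 << k)` loop for NBITS = 16, so it returns the root scaled by 2^(NBITS/2), with any remainder left behind in `x` just as the kernel uses it for sticky bits.

```c
#include <assert.h>
#include <stdint.h>

/* Single-word sketch of the square root loop described in the comment:
 * q accumulates the root from the top bit down, y holds the invariant
 * y == 2*q, and x carries the running remainder.  With NBITS == 16 the
 * result is floor(sqrt(x0) * 2^8) -- the same scheme the kernel code
 * applies, one word at a time, to its four-word mantissa. */
static uint32_t fixsqrt16(uint32_t x0)
{
	uint32_t x = x0, q = 0, y = 0, t;
	int k;

	for (k = 16; --k >= 0;) {
		x *= 2;                  /* multiply by radix (NBITS even) */
		t = y + (1u << k);       /* t = 2q + 2^k; the bit is "new" */
		if (x >= t) {
			x -= t;          /* x -= t */
			q += 1u << k;    /* q += 2^k */
			y += 1u << (k + 1); /* keep y == 2q */
		}
	}
	/* any nonzero value left in x is the (scaled) remainder */
	return q;
}
```

For a perfect square such as 9 the result is exactly 3 << 8; for 2 it is floor(sqrt(2) * 256) = 362, with the inexactness visible in the nonzero remainder.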
.dss is a filename extension. It most likely refers to a Digital Speech Standard file. The existence of .DSS files on storage media may also be the result of automatic lossy file-name conversion of .DS_Store files under, for example, the old FAT 8.3-character file-name restrictions.
Albin August Heinrich Emil Swoboda (13 November 1836 – 5 August 1901) was an Austrian operatic tenor, actor, and opera director of German birth. One of the most famous opera singers of the "Golden Age" of Viennese operetta, he was a leading tenor and dramatic stage actor at the Theater an der Wien from 1859 to 1878. He notably appeared in the world premieres of operettas by composers Jacques Offenbach, Johann Strauss II, and Franz von Suppé. He also appeared in musical comedies and plays in Vienna, and performed as a guest artist in stage productions of all kinds in theatres in Germany, Hungary, Poland, and Russia. His son Albin Swoboda Jr. also had a successful career as an opera singer. In 1955 a street in the Hietzing district of Vienna was named after him. Early life Born in Neustrelitz, Swoboda was the son of tenor and opera director Joseph Wilhelm Swoboda and dramatic soprano Angelika Peréchon-Swoboda. His brother Karl Swoboda was also an opera singer. During his early years he moved often with his parents, living in Leipzig (1837–1838), Halle, Saxony-Anhalt (1838–1839), and Düsseldorf (1840–1841). In 1841 both of his parents were engaged at the opera house in Frankfurt am Main, and the family lived there for the next seven years. His mother died in 1846 at the age of 30. In 1848 his father was offered a position at the Carltheater in Vienna, and the family accordingly moved to that city. In 1849 his father joined the Vienna Hofoper where he worked as a leading and comprimario tenor for the next 16 years and then served as the theatre's director from 1865 to 1875. Career Swoboda began his career in 1852 as a member of the opera chorus at the Theater in der Josefstadt in Vienna, a decision which his father initially opposed. He left there a few years later to join the opera house in Kraków where he sang his first leading roles and appeared in plays. Engagements in operas and plays soon followed in Linz, Bad Ischl, and Salzburg.
In 1857 he returned to Vienna when Johann Nestroy hired him as a leading tenor at the Carltheater, where he appeared mostly in musical comedies and farces. In 1859 Swoboda left the Carltheater to become a leading tenor and dramatic actor at the Theater an der Wien. He remained committed to that theatre for the next 14 years, and afterwards appeared as a guest artist. He drew acclaim in the operetta repertoire, and notably sang in numerous world premieres at that house, including Karl in Suppé's Das Pensionat (1860), Janio in Strauss's Indigo und die vierzig Räuber, Lambrequin in Offenbach's Der schwarze Korsar (1872), Arthur Bryk in Strauss's Der Karneval in Rom (1873), and Helmut Forst in Strauss's Blindekuh (1878). Other roles in his repertoire included Pâris in La belle Hélène, Piquillo in La Périchole, and the title role in Orpheus in the Underworld. As a dramatic actor he appeared to critical success in the world premieres of several plays by Ludwig Anzengruber among other stage parts. In 1873 Swoboda was appointed the first artistic director at the newly built Ringtheater, which presented its first performance in January 1874. He left there in 1875 to work in the same capacity at the new Deutsches Theater Budapest under his father, who was general director. He notably sang the role of Eisenstein in Die Fledermaus for the theatre's inaugural performance. In January 1878 he succeeded his father as general director of the theatre, but left that position the following year for financial reasons. In 1880 he performed in operettas and musical comedies in Russia and Poland, including performances in St. Petersburg, Moscow, Riga, and Łódź among others. In 1881 he was engaged at the Semperoper, after which he abandoned his singing career and performed only as a dramatic actor. Other than appearances in popular plays in Vienna in 1883 and 1898, his later career never matched the success he enjoyed earlier. Swoboda died in 1901 at the age of 64.
He was married twice during his life. His first marriage to soprano Friederike Fischer (1844–1898), whom he starred opposite on several occasions at the Theater an der Wien, ended in divorce. Their son Albin Swoboda Jr. had a successful career as an operatic bass-baritone. His second marriage was to the dramatic actress Gretchen Swoboda (1872–1921).
```c++ // Licensed to the Apache Software Foundation (ASF) under one // or more contributor license agreements. See the NOTICE file // distributed with this work for additional information // regarding copyright ownership. The ASF licenses this file // to you under the Apache License, Version 2.0 (the // "License"); you may not use this file except in compliance // with the License. You may obtain a copy of the License at // // path_to_url // // Unless required by applicable law or agreed to in writing, // software distributed under the License is distributed on an // "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY // KIND, either express or implied. See the License for the // specific language governing permissions and limitations // under the License. #include "runtime/bufferpool/system-allocator.h" #include <sys/mman.h> #include <gperftools/malloc_extension.h> #include "gutil/strings/substitute.h" #include "util/bit-util.h" #include "common/names.h" // TODO: IMPALA-5073: this should eventually become the default once we are confident // that it is superior to allocating via TCMalloc. DEFINE_bool(mmap_buffers, false, "(Experimental) If true, allocate buffers directly from the operating system " "instead of with TCMalloc."); DEFINE_bool(madvise_huge_pages, true, "(Advanced) If true, advise operating system to back large memory buffers with huge " "pages"); namespace impala { /// These are the page sizes on x86-64. We could parse /proc/meminfo to programmatically /// get this, but it is unlikely to change unless we port to a different architecture. static int64_t SMALL_PAGE_SIZE = 4LL * 1024; static int64_t HUGE_PAGE_SIZE = 2LL * 1024 * 1024; SystemAllocator::SystemAllocator(int64_t min_buffer_len) : min_buffer_len_(min_buffer_len) { DCHECK(BitUtil::IsPowerOf2(min_buffer_len)); #if !defined(ADDRESS_SANITIZER) && !defined(THREAD_SANITIZER) // Free() assumes that aggressive decommit is enabled for TCMalloc.
size_t aggressive_decommit_enabled; MallocExtension::instance()->GetNumericProperty( "tcmalloc.aggressive_memory_decommit", &aggressive_decommit_enabled); CHECK_EQ(true, aggressive_decommit_enabled); #endif } Status SystemAllocator::Allocate(int64_t len, BufferPool::BufferHandle* buffer) { DCHECK_GE(len, min_buffer_len_); DCHECK_LE(len, BufferPool::MAX_BUFFER_BYTES); DCHECK(BitUtil::IsPowerOf2(len)) << len; uint8_t* buffer_mem; if (FLAGS_mmap_buffers) { RETURN_IF_ERROR(AllocateViaMMap(len, &buffer_mem)); } else { RETURN_IF_ERROR(AllocateViaMalloc(len, &buffer_mem)); } buffer->Open(buffer_mem, len, CpuInfo::GetCurrentCore()); return Status::OK(); } Status SystemAllocator::AllocateViaMMap(int64_t len, uint8_t** buffer_mem) { int64_t map_len = len; bool use_huge_pages = len % HUGE_PAGE_SIZE == 0 && FLAGS_madvise_huge_pages; if (use_huge_pages) { // Map an extra huge page so we can fix up the alignment if needed. map_len += HUGE_PAGE_SIZE; } uint8_t* mem = reinterpret_cast<uint8_t*>( mmap(nullptr, map_len, PROT_READ | PROT_WRITE, MAP_ANONYMOUS | MAP_PRIVATE, -1, 0)); if (mem == MAP_FAILED) { const char* error = strerror(errno); return Status(TErrorCode::BUFFER_ALLOCATION_FAILED, len, error); } if (use_huge_pages) { // mmap() may return memory that is not aligned to the huge page size. For the // subsequent madvise() call to work well, we need to align it ourselves and // unmap the memory on either side of the buffer that we don't need. uintptr_t misalignment = reinterpret_cast<uintptr_t>(mem) % HUGE_PAGE_SIZE; if (misalignment != 0) { uintptr_t fixup = HUGE_PAGE_SIZE - misalignment; munmap(mem, fixup); mem += fixup; map_len -= fixup; } munmap(mem + len, map_len - len); DCHECK_EQ(reinterpret_cast<uintptr_t>(mem) % HUGE_PAGE_SIZE, 0) << std::hex << reinterpret_cast<uintptr_t>(mem); // Mark the buffer as a candidate for promotion to huge pages. The Linux Transparent // Huge Pages implementation will try to back the memory with a huge page if it is // enabled. 
MADV_HUGEPAGE was introduced in 2.6.38, so we similarly need to skip this // code if we are compiling against an older kernel. #ifdef MADV_HUGEPAGE int rc; // According to madvise() docs it may return EAGAIN to signal that we should retry. do { rc = madvise(mem, len, MADV_HUGEPAGE); } while (rc == -1 && errno == EAGAIN); DCHECK(rc == 0) << "madvise(MADV_HUGEPAGE) shouldn't fail" << errno; #endif } *buffer_mem = mem; return Status::OK(); } Status SystemAllocator::AllocateViaMalloc(int64_t len, uint8_t** buffer_mem) { bool use_huge_pages = len % HUGE_PAGE_SIZE == 0 && FLAGS_madvise_huge_pages; // Allocate, aligned to the page size that we expect to back the memory range. // This ensures that it can be backed by whole pages, rather than parts of pages. size_t alignment = use_huge_pages ? HUGE_PAGE_SIZE : SMALL_PAGE_SIZE; int rc = posix_memalign(reinterpret_cast<void**>(buffer_mem), alignment, len); #ifdef THREAD_SANITIZER // Workaround TSAN bug where posix_memalign returns 0 even when allocation fails. It // should instead return ENOMEM. See path_to_url if (rc == 0 && *buffer_mem == nullptr && len != 0) rc = ENOMEM; #endif if (rc != 0) { return Status(TErrorCode::BUFFER_ALLOCATION_FAILED, len, Substitute("posix_memalign() failed to allocate buffer: $0", GetStrErrMsg())); } if (use_huge_pages) { #ifdef MADV_HUGEPAGE // According to madvise() docs it may return EAGAIN to signal that we should retry.
do { rc = madvise(*buffer_mem, len, MADV_HUGEPAGE); } while (rc == -1 && errno == EAGAIN); DCHECK(rc == 0) << "madvise(MADV_HUGEPAGE) shouldn't fail" << errno; #endif } return Status::OK(); } void SystemAllocator::Free(BufferPool::BufferHandle&& buffer) { if (FLAGS_mmap_buffers) { int rc = munmap(buffer.data(), buffer.len()); DCHECK_EQ(rc, 0) << "Unexpected munmap() error: " << errno; } else { bool use_huge_pages = buffer.len() % HUGE_PAGE_SIZE == 0 && FLAGS_madvise_huge_pages; if (use_huge_pages) { // Undo the madvise so that it isn't a candidate to be newly backed by huge pages. // We depend on TCMalloc's "aggressive decommit" mode decommitting the physical // huge pages with madvise(DONTNEED) when we call free(). Otherwise, this huge // page region may be divvied up and subsequently decommitted in smaller chunks, // which may not actually release the physical memory, causing Impala physical // memory usage to exceed the process limit. #ifdef MADV_NOHUGEPAGE // According to madvise() docs it may return EAGAIN to signal that we should retry. int rc; do { rc = madvise(buffer.data(), buffer.len(), MADV_NOHUGEPAGE); } while (rc == -1 && errno == EAGAIN); DCHECK(rc == 0) << "madvise(MADV_NOHUGEPAGE) shouldn't fail" << errno; #endif } free(buffer.data()); } buffer.Reset(); // Avoid DCHECK in ~BufferHandle(). } } ```
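The huge-page path in `AllocateViaMMap()` hinges on a small piece of pointer arithmetic: computing how many leading bytes to trim so the buffer starts on a 2MB boundary before the excess on either side is unmapped. A minimal sketch of just that arithmetic, with addresses modeled as plain integers so it can be checked without `mmap()` (the helper names are hypothetical, not Impala APIs):

```c
#include <assert.h>
#include <stdint.h>

/* mmap() may return an address that is not aligned to the 2MB huge page
 * size; the allocator computes the misalignment, advances the pointer to
 * the next huge-page boundary, and munmap()s the trimmed bytes. */
#define HUGE_PAGE_BYTES (2u * 1024 * 1024)

/* Bytes to trim from the front so the buffer starts on a boundary
 * (the `fixup` value in the code above). */
static uintptr_t huge_page_fixup(uintptr_t addr)
{
	uintptr_t misalignment = addr % HUGE_PAGE_BYTES;
	return misalignment == 0 ? 0 : HUGE_PAGE_BYTES - misalignment;
}

/* The aligned buffer start (`mem += fixup` in the code above). */
static uintptr_t huge_page_align_up(uintptr_t addr)
{
	return addr + huge_page_fixup(addr);
}
```

Mapping an extra huge page up front, as the code does, guarantees that after this fix-up the requested `len` bytes still fit inside the mapping.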
```swift ////////////////////////////////////////////////////////////////////////////////// // // B L I N K // // // This file is part of Blink. // // Blink is free software: you can redistribute it and/or modify // (at your option) any later version. // // Blink is distributed in the hope that it will be useful, // but WITHOUT ANY WARRANTY; without even the implied warranty of // MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the // // along with Blink. If not, see <path_to_url // // In addition, Blink is also subject to certain additional terms under // GNU GPL version 3 section 7. // // You should have received a copy of these additional terms immediately // which accompanied the Blink Source Code. If not, see // <path_to_url // //////////////////////////////////////////////////////////////////////////////// let p12Cert = """ your_sha256_hash your_sha256_hash POoycwQCAggAgIIFCJvn75yn4mouP9jsN77nnmDVdSnnitVkWc/Ky5nWiZTZzEUU jWUQ9ZQfPTE9tL89e2g/8zKQeclNmW4fxN/uWYz8zCk7bWw4ylJUs4SS2Si0mdje rnjuiqvl2jwgbmhw6MeSbeYnx/MBhIr+YVEs5YjWiXN8CL4ODX3rYOREYzJ2GcCd IuDBacgXn+5mL77cdlXP8TXRepB5zagls6JZ1VwYU0BAybllZsZH/ieEIcWlACG7 9hWYoNSOiWsP2Ywg0pXh6V+JKIvf26ep9Gzj7jAoZ/Z7Ef4LjU/vpFqM/sH/yww6 HyhlCF/RWMnchI+U6ssfm1jhG0ebWHtVyn05ZCHNRJ1KX0vUcEhx7sydGEEPrBbj GhZBdmS0TZb7+oahpSxBNutUrHUo3SEVjB+1qy2a9q+bCUOmutMwmwfVNEDzFegW OrCAHY9SyfQE1HQrtNUPZHjbxGb+v0hOnmlBXoalJll2JRglEizvzDqS22CXxa4H NlcTVBupBUdnJlpMDv+XIEajPZsGAdiSmZqLiGTqQhisjG612RjWWB3VGz/kSGDL 4E+yEBYJQa/hi+XWiveCvnimIghdlSxjGs0XCeayIiygrsaI5U4j2sBCRdp+Utt5 XzsDdSKo6TBo/wSA0Qja5UOFv3D2t0PPTMEeFlxphTKfLfveszKxe8Ip8YON3vsm WeXNHvfS7TTfm37ARfGIF+gZSYZ7m2X6hL9xiwzCyPEJFSZwVWcz44YDdHgmnbzm B6zoPVhCyh2TjYkePNzMWsqVj906ODUTcG2rh83jIsoFJzwT1u49Y+pwxZN4zRtP roTl8jmybsnz10/VneBRYgQUwD096apbM1oRTkbUTtofOqTfwprh/gW0hp1L/eMs 9AER7mwINNwJbIeHNWg4HbXfq/vVKKxgvGSyYZXC08Lr7BXzRmhlZZ9kzeVvmMSK 3tBK/eEwTC6pFJUyrHBrqyP1FfeIKu5A1p9JuJGjvEmgH+qB3EZ4inOIf1RsWlZE /5zhBEMD7yZ/ONLZ+ggPi4BBW1Sb3oqgGyHq06Yt16Q6iYqh0uKFv+2d4GW7KvB1 
Golborne High School is a coeducational foundation secondary school located in Golborne, Metropolitan Borough of Wigan, England. History The oldest part of the school was opened in January 1954. This comprises the main corridor and its classrooms, and the gymnasium. The school initially admitted only girls and was called the Girls' Secondary Modern School. In 1967 boys were admitted for the first time (having previously been educated at what later became Golborne County Junior School). In 1969 Golborne became a comprehensive school. New additions to the building included the Sports Hall (originally intended to be a swimming pool) and the RoSLA Block, where the sixth form students were educated. The school badge comprises a green shield, quartered by the red cross of St George of England. In each quarter of the shield are symbols representing industry in the area: the black diamond representing coal mining, the crossed shuttles of the weaving industry, and a sheaf of wheat symbolising local agriculture. The fourth symbol is the red rose of Lancashire, although technically the school has lain in Greater Manchester since local government re-organisation in 1974. Since that time there have been three major changes to the status of the school: in 1989 the sixth form closed, with its final year taking their exams in 1990, and the school lost its comprehensive status and became Golborne High; in 2007 it became a College of Visual Arts; and in 2010 it became a trust school under the umbrella of the newly formed Golborne and Lowton Co-operative Learning Partnership. A proposal to merge with Lowton High School to form a new school of 1,200 pupils, eventually moving to a new site in Lowton in 2012, was cancelled by the coalition government in July 2010. Both of the old schools would have been demolished, which generated some local opposition. The school took second place in a Salters' Festival of Chemistry contest. 
In July 2007 Golborne sent representatives to a Youth Summit to press for adult understanding of youth issues. In September 2007, the school was selected to host a display of lunar rock samples, which the students were allowed to handle. Two students reached the national final of a robot-building competition in Milton Keynes, having won the heats of the contest at Salford University. Other students had their artwork exhibited on the Saatchi Gallery's online facility. In December 2008 Sir Ian McKellen, who was brought up in Wigan, visited the school to lend his support to the school's anti-bullying campaign. Golborne was recognised as one of the 100 schools with the largest increase in performance from 2002 to 2004, based on participation in GCSE exams. The school has continued to improve, and in 2009 achieved its best ever results, with 51% of students achieving five good GCSE passes including English and Maths. As far back as 2004, the school had been pursuing arts school status, which would provide for a more focused curriculum enhanced with state-of-the-art equipment. In September 2007, the school became a College of Visual Arts. The recently opened Digital Photography Suite is intended not only for school use but also for community groups. The school is to be equipped with its own art gallery. Houses The school is divided into five houses, whose names are derived from famous artists in several fields: Fonteyn – after Margot Fonteyn, the ballerina; Lennon – after John Lennon, the Liverpool songwriter and member of The Beatles; Picasso – after Pablo Picasso, the artist; Shakespeare – after William Shakespeare, the playwright; Westwood – after Vivienne Westwood, the fashion designer. References External links Golborne High School 1954 establishments in England Secondary schools in the Metropolitan Borough of Wigan Educational institutions established in 1954 Foundation schools in the Metropolitan Borough of Wigan