Genre fiction. In the book-trade, genre fiction, also known as formula fiction,[1] or commercial fiction,[2] encompasses fictional works written with the intent of fitting into a specific literary genre in order to appeal to readers and fans already familiar with that genre.[3] These labels commonly imply that this type of fiction places more value on plot and entertainment than on character development, philosophical themes, or artistic depth.[2] This distinguishes genre fiction from literary fiction. The main genres are crime, fantasy, romance, science fiction and horror—as well as perhaps Western, inspirational and historical fiction. Slipstream genre is sometimes thought to be in between genre and non-genre fiction.[4] In the publishing industry the term category fiction is often used as a synonym for genre fiction,[citation needed] with the categories serving as the familiar shelf headings within the fiction section of a bookstore, such as Western or mystery. Some authors classified instead as literary fiction have written genre novels under pseudonyms, while others are argued to have employed genre elements in literary fiction.[5][6][7][better source needed]
|
Little, Brown Book Group. Little, Brown Book Group is a UK publishing company created in 1988, with multiple predecessors. Since 2006 Little, Brown Book Group has been owned by Hachette UK, a subsidiary of Hachette Livre. It was acquired in 2006 from Time Warner of New York City, which then owned LBBG via the American publisher Little, Brown and Company.[1] Little, Brown has won the Publisher of the Year Award four times – in 1994, 2004, 2010 and 2014.[citation needed] Little and Brown was established in Boston, Massachusetts, United States, by Charles Little and James Brown in 1837; as Little, Brown and Company it was acquired by Time Inc in 1968. Little, Brown became part of the Time Warner Book Group when Time merged with Warner Communications in 1989. Still based in Boston, the Time Warner subsidiary Little, Brown purchased the British publisher Macdonald from Maxwell Communication Corporation in 1992.[2] The firm was renamed Little, Brown Book Group (Little, Brown's offices moved to New York City in 2001). In 2014, Little, Brown acquired the independent publisher Constable and Robinson, and soon merged Piatkus with the Constable and Robinson imprints to form Piatkus Constable Robinson (PCR).[3] Another Constable and Robinson imprint, Corsair, publishes literary fiction and non-fiction separately from PCR.[4] In 2015, Ursula Doyle (formerly Associate Publisher of Virago) announced a new imprint, Fleet. Fleet's launch titles in 2016 included Charlotte Rogan's Now and Again, Melissa Fleming's A Hope More Powerful than the Sea, and the paperback edition of Virginia Baily's Early One Morning. The Fleet imprint's releases include Material Girls: Why Reality Matters for Feminism (2021) by Kathleen Stock,[5][6] and Did Ye Hear Mammy Died? (2021) by Séamas O'Reilly.[7]
|
Holding company. A holding company is a company whose primary business is holding a controlling interest in the securities of other companies.[1] A holding company usually does not produce goods or services itself. Its purpose is to own stock of other companies to create a corporate group. Some holding companies, however, also conduct trade and other business activities themselves. Holding companies reduce risk for the shareholders, and can permit the ownership and control of a number of different companies. They can be subsidiaries in a tiered structure. Holding companies are also created to hold assets such as intellectual property or trade secrets that are protected from the operating company. That creates a smaller risk when it comes to litigation. In the United States, 80% of stock, in voting and value, must be owned before tax consolidation benefits such as tax-free dividends can be claimed.[2] That is, if Company A owns 80% or more of the stock of Company B, Company A will not pay taxes on dividends paid by Company B to its stockholders, as the payment of dividends from B to A is essentially transferring cash within a single enterprise. Any other shareholders of Company B will pay the usual taxes on dividends, as they are legitimate and ordinary dividends to these shareholders.
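The 80% threshold described above is a simple conditional rule, which can be sketched as follows (an illustrative sketch only, not tax advice; the function name and return strings are hypothetical and chosen for this example):

```python
def dividend_tax_treatment(ownership_fraction: float) -> str:
    """Classify a dividend under the U.S. 80% tax-consolidation rule sketched above.

    ownership_fraction: the recipient company's share of the payer's stock
    (by vote and value), expressed as a fraction between 0.0 and 1.0.
    """
    if not 0.0 <= ownership_fraction <= 1.0:
        raise ValueError("ownership_fraction must be between 0 and 1")
    # At or above 80% ownership, the dividend is treated as an intra-group
    # cash transfer and is not taxed; below it, it is an ordinary dividend.
    if ownership_fraction >= 0.80:
        return "tax-free intercompany dividend"
    return "ordinary taxable dividend"


# Company A owns 85% of Company B: dividends from B to A are untaxed.
print(dividend_tax_treatment(0.85))  # tax-free intercompany dividend
# A 5% minority shareholder of B pays the usual taxes on the same dividend.
print(dividend_tax_treatment(0.05))  # ordinary taxable dividend
```

Note the threshold is inclusive: exactly 80% ownership already qualifies, matching the "80% or more" wording above.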
|
Genovese. Genovese is an Italian surname meaning, properly, someone from Genoa. Its Italian plural form Genovesi has also developed into a surname.
|
Eiji Tsuburaya. Eiji Tsuburaya (Japanese: 円谷 英二, Hepburn: Tsuburaya Eiji; July 7, 1901[b] – January 25, 1970) was a Japanese special effects director, filmmaker, and cinematographer. A co-creator of the Godzilla and Ultraman franchises, he is considered one of the most important and influential figures in the history of cinema. Tsuburaya is known as the Father of Tokusatsu,[6][d] having pioneered Japan's special effects industry and introduced several technological developments in film production. In a career spanning five decades, Tsuburaya worked on approximately 250 films—including globally renowned features directed by Ishirō Honda, Hiroshi Inagaki, and Akira Kurosawa—and earned six Japan Technical Awards. Following a brief stint as an inventor, Tsuburaya was employed by Japanese cinema pioneer Yoshirō Edamasa in 1919 and began his career as an assistant cinematographer on Edamasa's A Tune of Pity. Thereafter, he worked as an assistant cinematographer on several films, including Teinosuke Kinugasa's A Page of Madness (1926). At the age of thirty-two, Tsuburaya watched King Kong, which greatly influenced him to work in special effects. Tsuburaya completed the first iron shooting crane in October 1934, and an adaptation of the crane is still in use across the globe today. After filming his directorial debut aboard the cruiser Asama in the Pacific Ocean, he worked on Princess Kaguya (1935), one of Japan's first major films to incorporate special effects. His first major success in effects work, The Daughter of the Samurai (1937), featured the first full-scale rear projection. In 1937, Tsuburaya was employed by Toho and established the company's effects department. Tsuburaya directed the effects for The War at Sea from Hawaii to Malaya in 1942, which became the highest-grossing Japanese film in history upon its release.
His elaborate effects were believed to be behind the film's major success, and he won an award for his work from the Japan Motion Picture Cinematographers Association. In 1948, however, Tsuburaya was purged from Toho by the Supreme Commander for the Allied Powers because of his involvement in propaganda films during World War II. He then founded Tsuburaya Special Technology Laboratory with his eldest son Hajime and worked without credit at major Japanese studios outside Toho, creating effects for films such as Daiei's The Invisible Man Appears (1949), widely regarded as the first Japanese science fiction film. In 1950, Tsuburaya returned to Toho alongside his effects crew from Tsuburaya Special Technology Laboratory. At age fifty-three, he gained international recognition and won his first Japan Technical Award for Special Skill for directing the effects in Ishirō Honda's kaiju film Godzilla (1954). He served as the effects director for Toho's string of financially successful tokusatsu films that followed, including Rodan (1956), The Mysterians (1957), The Three Treasures (1959), Mothra and The Last War (both 1961), and King Kong vs. Godzilla (1962). In April 1963, Tsuburaya founded Tsuburaya Special Effects Productions; his company would go on to produce the television shows Ultra Q, Ultraman (both 1966), Ultraseven (1967–1968), and Mighty Jack (1968). Ultra Q and Ultraman were extremely successful upon their 1966 broadcast, with Ultra Q making him a household name in Japan and gaining him more attention from the media, who dubbed him the God of Tokusatsu. While he spent his late years working on several Toho films and operating his company, Tsuburaya's health began to decline, and he died in 1970.
|
Russell Square. Russell Square is a large garden square in Bloomsbury, in the London Borough of Camden, built predominantly by the firm of James Burton. It is near the University of London's main buildings and the British Museum. Almost exactly square, it has Woburn Place to the north and Southampton Row to the south-east. Russell Square tube station sits to the north-east.[1] It is named after the surname of the Earls and Dukes of Bedford; the freehold remains with the latter's conservation trusts, which have agreed public access and management by Camden Council. The gardens are in the mainstream, initial category (of Grade II listing) on the Register of Historic Parks and Gardens.[2] Following the demolition of Bedford House, Russell Square and Bedford Square were laid out in 1804.[3] The Russell family, the Earls and Dukes of Bedford, developed the family's London landholdings in the 17th and 18th centuries.[3] Between 1805 and 1830, Thomas Lawrence had a studio at number 65.[4] Other past residents include the famous 19th-century architectural father-and-son partnership of Philip and Philip Charles Hardwick, who lived at number 60 in the 1850s.[5] On the eastern side the Hotel Russell, built in 1898 to a design by Charles Fitzroy Doll, dominates (its builders were connected with the company which created RMS Titanic),[6] alongside the Imperial Hotel, which was also designed by Charles Fitzroy Doll and built from 1905 to 1911. The old Imperial building was demolished in 1967.[7]
|
Archibald Constable. Archibald David Constable (24 February 1774 – 21 July 1827) was a Scottish publisher, bookseller and stationer. Constable was born at Carnbee, Fife, son of the land steward to the Earl of Kellie.[1] In 1788 Archibald was apprenticed to Peter Hill, an Edinburgh bookseller, based on the High Street south of the Mercat Cross. In 1795 Constable started in business for himself as a dealer in rare books, taking a unit immediately opposite Peter Hill, on the north side of the Mercat Cross. He was then living in a house in Calton village on the edge of Calton Hill.[2] He bought the rights to publish the Scots Magazine in 1801, and John Leyden, the orientalist, became its editor. In 1800 Constable began the Farmer's Magazine, and in November 1802 he issued the first number of the Edinburgh Review, under the nominal editorship of Sydney Smith; Lord Jeffrey was, however, the guiding spirit of the review, having as his associates Lord Brougham, Sir Walter Scott, Henry Hallam, John Playfair and afterwards Lord Macaulay.[1]
|
Geneva (disambiguation). Geneva is the second-most-populous city in Switzerland. Geneva may also refer to:
|
Commonwealth realm. A Commonwealth realm is a sovereign state in the Commonwealth of Nations that has the same constitutional monarch and head of state as the other realms. The current monarch is King Charles III.[1][2][3] Except for the United Kingdom, in each of the realms the monarch is represented by a governor-general. The phrase "Commonwealth realm" is an informal description not used in any law. As of 2025[update], there are 15 Commonwealth realms: Antigua and Barbuda, Australia, The Bahamas, Belize, Canada, Grenada, Jamaica, New Zealand, Papua New Guinea, Saint Kitts and Nevis, Saint Lucia, Saint Vincent and the Grenadines, Solomon Islands, Tuvalu, and the United Kingdom. While the Commonwealth of Nations has 56 independent member states, only these 15 have Charles III as head of state. He is also Head of the Commonwealth, a non-constitutional role. The notion of these states sharing the same person as their monarch traces back to 1867 when Canada became the first dominion, a largely self-governing nation in the British Empire; others, such as Australia (1901) and New Zealand (1907), followed. With the growing independence of the dominions in the 1920s, the Balfour Declaration of 1926 established the Commonwealth of Nations and declared that the nations were considered "equal in status ... though united by a common allegiance to the Crown".[1] The Statute of Westminster 1931 further set the relationship between the realms and the Crown, including a convention that any alteration to the line of succession in any one country must be voluntarily approved by all the others. The modern Commonwealth of Nations was then formally constituted by the London Declaration in 1949 when India wanted to become a republic without leaving the Commonwealth; this left seven independent nations sharing the Crown: Australia, Canada, Ceylon (now Sri Lanka), New Zealand, Pakistan, South Africa, and the United Kingdom.
Since then, new realms have been created through the independence of former colonies and dependencies; Saint Kitts and Nevis is the youngest extant realm, becoming one in 1983. Some realms became republics; Barbados changed from being a realm to a republic in 2021.[4] There are currently 15 Commonwealth realms scattered across three continents (nine in North America, five in Oceania, and one in Europe), with a combined area of 18.7 million km2 (7.2 million sq mi)[a] (excluding the Antarctic claims which would raise the figure to 26.8 million km2 (10.3 million sq mi)) and a population of more than 150 million.[5]
|
Canton of Geneva. The Canton of Geneva, officially the Republic and Canton of Geneva,[4][5] is one of the 26 cantons of the Swiss Confederation. It is composed of forty-five municipalities, and the seat of the government and parliament is in the city of Geneva. Geneva is Switzerland's westernmost canton and is French-speaking. It lies at the western end of Lake Geneva and on both sides of the Rhone, its main river. Within the country, the canton borders Vaud to the east, the only adjacent canton. However, most of Geneva's border is with France, specifically the region of Auvergne-Rhône-Alpes. As is the case in several other Swiss cantons (Ticino, Neuchâtel, and Jura), Geneva is referred to as a republic within the Swiss Confederation. One of the most populated cantons, Geneva is considered one of the most cosmopolitan regions of the country. As a center of the Calvinist Reformation, the city of Geneva has had a great influence on the canton, which essentially consists of the city and its suburbs. Notable institutions of international importance based in the canton are the United Nations, the International Committee of the Red Cross and CERN. The Canton of Geneva, whose official name is the Republic and Canton of Geneva, is the successor of the Republic of Geneva.[6]
|
Sherlock Holmes. Sherlock Holmes (/ˈʃɜːrlɒk ˈhoʊmz/) is a fictional detective created by British author Arthur Conan Doyle. Referring to himself as a "consulting detective" in his stories, Holmes is known for his proficiency with observation, deduction, forensic science and logical reasoning that borders on the fantastic, which he employs when investigating cases for a wide variety of clients, including Scotland Yard. The character Sherlock Holmes first appeared in print in 1887's A Study in Scarlet. His popularity became widespread with the first series of short stories in The Strand Magazine, beginning with "A Scandal in Bohemia" in 1891; additional tales appeared from then until 1927, eventually totalling four novels and 56 short stories. All but one[a] are set in the Victorian or Edwardian eras between 1880 and 1914. Most are narrated by the character of Holmes's friend and biographer, Dr. John H. Watson, who usually accompanies Holmes during his investigations and often shares quarters with him at the address of 221B Baker Street, London, where many of the stories begin. Though not the first fictional detective, Sherlock Holmes is arguably the best known.[1] By the 1990s, over 25,000 stage adaptations, films, television productions, and publications had featured the detective,[2] and Guinness World Records lists him as the most portrayed human literary character in film and television history.[3] Holmes's popularity and fame are such that many have believed him to be not a fictional character but an actual person;[4][5][6] many literary and fan societies have been founded on this pretence.
Avid readers of the Holmes stories helped create the modern practice of fandom, with the Sherlock Holmes fandom being one of the first cohesive fan communities in the world.[7] The character and stories have had a profound and lasting effect on mystery writing and popular culture as a whole, with the original tales, as well as thousands written by authors other than Conan Doyle, being adapted into stage and radio plays, television, films, video games, and other media for over one hundred years. Edgar Allan Poe's C. Auguste Dupin is generally acknowledged as the forerunner of the modern detective story in English fiction and served as the prototype for many later characters, including Holmes.[8] Conan Doyle once wrote, "Each [of Poe's detective stories] is a root from which a whole literature has developed ... Where was the detective story until Poe breathed the breath of life into it?"[9] Similarly, the stories of Émile Gaboriau's Monsieur Lecoq were extremely popular at the time Conan Doyle began writing Holmes, and Holmes's speech and behaviour sometimes follow those of Lecoq.[10][11] Doyle has his main characters discuss these literary antecedents near the beginning of A Study in Scarlet, which is set soon after Watson is first introduced to Holmes. Watson attempts to compliment Holmes by comparing him to Dupin, to which Holmes replies that he found Dupin to be "a very inferior fellow" and Lecoq to be "a miserable bungler".[12]
|
Crime Story (disambiguation). Crime fiction stories are narratives that centre on criminal acts and especially on the investigation of a crime. Crime Story may also refer to:
|
Chicago (disambiguation). Chicago, Illinois, is the third-most populous city in the United States. Chicago may also refer to: Any of several disciplines, some associated with the University of Chicago, including
|
Municipalities of Switzerland. Municipalities (German: Gemeinden, Einwohnergemeinden or politische Gemeinden; French: communes; Italian: comuni; Romansh: vischnancas) are the lowest level of administrative division in Switzerland. Each municipality is part of one of the Swiss cantons, which form the Swiss Confederation. In most cantons, municipalities are also part of districts or other sub-cantonal administrative divisions. There are 2,121 municipalities as of January 2025[update].[1] Their populations range between several hundred thousand (Zürich) and a few dozen people (Kammersrohr, Bister), and their territory between 0.32 km² (Rivaz) and 439 km² (Scuol). The beginnings of the modern municipality system date back to the Helvetic Republic. Under the Old Swiss Confederacy, citizenship was granted by each town and village only to residents. These citizens enjoyed access to community property and in some cases additional protection under the law. Additionally, the urban towns and the rural villages had differing rights and laws. The creation of a uniform Swiss citizenship, which applied equally for citizens of the old towns and their tenants and servants, led to conflict. The wealthier villagers and urban citizens held rights to forests, common land and other municipal property which they did not want to share with the new citizens, who were generally poor. The compromise solution, which was written into the municipal laws of the Helvetic Republic, is still valid today. Two politically separate but often geographically similar organizations were created. The first, the so-called municipality, was a political community formed by election, whose voting body consists of all resident citizens. However, the community land and property remained with the former local citizens, who were gathered together into the Bürgergemeinde/bourgeoisie.
During the Mediation era (1803–1814), and especially during the Restoration era (1814–1830), many of the gains toward uniform citizenship were lost. Many political municipalities were abolished and limits were placed on the exercise of political rights for everyone except the members of the Bürgergemeinde. In the Regeneration era (1830–1848), the liberal revolutions of the common people helped to restore some rights again in a few cantons. In other cantons, the Bürgergemeinden were able to maintain power as political communities. In the city of Zürich it was not until the Municipal Act of 1866 that the political municipality came back into existence.[2] The relationship between the political municipality and the Bürgergemeinde was often dominated by the latter's ownership of community property. Often the administration and profit from the property were totally held by the Bürgergemeinden, leaving the political municipality dependent on the Bürgergemeinde for money and use of the property. It was not until the political municipality acquired rights over property that served the public (such as schools and fire stations) and the power to levy taxes that it obtained full independence. For example, in the city of Bern, it was not until after the property division of 1852 that the political municipality had the right to levy taxes.[2]
|
Crime Stories (disambiguation). Crime Stories are narratives that centre on criminal acts and especially on the investigation of a crime. Crime Stories may also refer to:
|
Murder mystery (disambiguation). A murder mystery is a work of crime fiction. Murder mystery may also refer to:
|
Fable (disambiguation). A fable is a story intended to illustrate a moral. Fable(s), The Fable(s), or A Fable may also refer to:
|
Chicago River. 41°53′11″N 87°38′15″W / 41.88639°N 87.63750°W / 41.88639; -87.63750 The Chicago River is a system of rivers and canals with a combined length of 156 miles (251 km)[1] that runs through the city of Chicago, including its center (the Chicago Loop).[2] The river is one of the reasons for Chicagos geographic importance: the related Chicago Portage is a link between the Great Lakes and the Mississippi River Basin, and ultimately the Gulf of Mexico. In 1887, the Illinois General Assembly decided to reverse the flow of the Chicago River through civil engineering by taking water from Lake Michigan and discharging it into the Mississippi River watershed, partly in response to concerns created by an extreme weather event in 1885 that threatened the citys water supply.[3][n 1] In 1889, the state created the Chicago Sanitary District (now the Metropolitan Water Reclamation District) to replace the Illinois and Michigan Canal with the Chicago Sanitary and Ship Canal, a much larger waterway, because the former had become inadequate to serve the citys increasing sewage and commercial navigation needs.[4] Completed by 1900,[5] the project reversed the flow of the main stem and South Branch and altered the flow of the North Branch by using a series of canal locks and pumping stations, increasing the flow from Lake Michigan into the river, causing the river to empty into the new canal instead. In 1999, the system was named a Civil Engineering Monument of the Millennium by the American Society of Civil Engineers (ASCE).[6] The river is represented on the municipal flag of Chicago by two horizontal blue stripes.[7] Its three branches serve as the inspiration for the municipal device,[8][9][10] a three-branched, Y-shaped symbol that is found on many buildings and other structures throughout Chicago. 
When it followed its natural course, the North and South Branches of the Chicago River converged at Wolf Point to form the main stem, which jogged southward from the present course of the river to avoid a baymouth bar, entering Lake Michigan at about the level of present-day Madison Street.[11] Today, the main stem of the Chicago River flows west from Lake Michigan to Wolf Point, where it converges with the North Branch to flow into the South Branch, where the river's course goes south and west to empty in the Chicago Sanitary and Ship Canal.
|
French language. French (français [fʁɑ̃sɛ] or langue française [lɑ̃ɡ fʁɑ̃sɛːz]) is a Romance language of the Indo-European family. Like all other Romance languages, it descended from the Vulgar Latin of the Roman Empire. French evolved from Northern Old Gallo-Romance, a descendant of the Latin spoken in Northern Gaul. Its closest relatives are the other langues d'oïl—languages historically spoken in northern France and in southern Belgium, which French (Francien) largely supplanted. It was also influenced by native Celtic languages of Northern Roman Gaul and by the Germanic Frankish language of the post-Roman Frankish invaders. As a result of French and Belgian colonialism from the 16th century onward, it was introduced to new territories in the Americas, Africa, and Asia, and numerous French-based creole languages, most notably Haitian Creole, were developed. A French-speaking person or nation may be referred to as Francophone in both English and French. French is an official language in 26 countries, as well as one of the most geographically widespread languages in the world, with speakers in about 50 countries.[4] Most of these countries are members of the Organisation internationale de la Francophonie (OIF), the community of 54 member states which share the use or teaching of French.
It is estimated to have about 310 million speakers, of which about 74 million are native speakers;[5] it is spoken as a first language (in descending order of the number of speakers) in France, Canada (Quebec), Belgium (Wallonia and the Brussels-Capital Region), western Switzerland (Romandy region), parts of Luxembourg, and Monaco.[6] Meanwhile, in Francophone Africa it is spoken mainly as a second language or lingua franca, though it has also become a native language in a small number of urban areas; in some North African countries like Algeria, despite not having official status, it is also a first language among some upper classes of the population alongside the indigenous ones, but only a second one among the general population.[7] In 2015, approximately 40% of the Francophone population (including L2 and partial speakers) lived in Europe, 36% in sub-Saharan Africa and the Indian Ocean, 15% in North Africa and the Middle East, 8% in the Americas, and 1% in Asia and Oceania.[8] French is the second most widely spoken mother tongue in the European Union.[9] Of Europeans who speak other languages natively, approximately one-fifth are able to speak French as a second language.[10] Many institutions of the EU use French as a working language along with English, German and Italian; in some institutions, French is the sole working language (e.g.
at the Court of Justice of the European Union).[11] French is also the 22nd most natively spoken language in the world,[12] the sixth most spoken language by total number of speakers, and is among the top five most studied languages worldwide, with about 120 million learners as of 2017.[13][14] French has a long history as an international language of literature and scientific standards and is a primary or second language of many international organisations including the United Nations, the European Union, the North Atlantic Treaty Organization, the World Trade Organization, the International Olympic Committee, the General Conference on Weights and Measures, and the International Committee of the Red Cross.
|
Saying. A saying is any concise expression that is especially memorable because of its meaning or style. A saying often expresses wisdom or a cultural standard, carrying a meaning beyond the literal words themselves.[1] Sayings are categorized as follows:
|
Narrative. A narrative, story, or tale is any account of a series of related events or experiences,[1][2] whether non-fictional (memoir, biography, news report, documentary, travelogue, etc.) or fictional (fairy tale, fable, legend, thriller, novel, etc.).[3][4][5] Narratives can be presented through a sequence of written or spoken words, through still or moving images, or through any combination of these. Narrative is expressed in all mediums of human creativity, art, and entertainment, including speech, literature, theatre, dance, music and song, comics, journalism, animation, video (including film and television), video games, radio, structured and unstructured recreation, and potentially even purely visual arts like painting, sculpture, drawing, and photography, as long as a sequence of events is presented. The social and cultural activity of humans sharing narratives is called storytelling, the vast majority of which has taken the form of oral storytelling.[6] Since the rise of literate societies, however, many narratives have been additionally recorded, created, or otherwise passed down in written form. The formal and literary process of constructing a narrative—narration—is one of the four traditional rhetorical modes of discourse, along with argumentation, description, and exposition. This is a somewhat distinct usage from narration in the narrower sense of a commentary used to convey a story, alongside various additional narrative techniques used to build and enhance any given story. The noun narration and adjective narrative entered English from French in the 15th century; narrative became usable as a noun in the following century.[7] These words ultimately derive from the Latin verb narrare ("to tell"), itself derived from the adjective gnarus ("knowing" or "skilled").[8][9] A narrative is the telling of some actual or fictitious sequence of connected events to an audience, by a narrator in some cases (and in all cases of written narratives).
A personal narrative is any narrative in prose in which the speaker or writer presents, usually informally and in a spontaneous moment, their own personal experiences, such as in casual face-to-face conversation or in text messaging. Narratives are to be distinguished from simple descriptions of qualities, states, or situations without any particular individuals involved. Narratives range all the way from the shortest accounts of events (for example, the simple sentence "the cat sat on the mat" or a brief news item) to the most extended works, in the form of long and complex series that contain multiple books, films, television episodes, etc.
|
Charles III. Charles III (Charles Philip Arthur George; born 14 November 1948) is King of the United Kingdom and the 14 other Commonwealth realms.[b] Charles was born during the reign of his maternal grandfather, King George VI, and became heir apparent when his mother, Queen Elizabeth II, acceded to the throne in 1952. He was created Prince of Wales in 1958 and his investiture was held in 1969. He was educated at Cheam School and Gordonstoun, and later spent six months at the Timbertop campus of Geelong Grammar School in Victoria, Australia. After completing a history degree from the University of Cambridge, Charles served in the Royal Air Force and the Royal Navy from 1971 to 1976. After his 1981 wedding to Lady Diana Spencer, they had two sons, William and Harry. After years of estrangement and well-publicised extramarital affairs, Charles and Diana divorced in 1996. Diana died as a result of injuries sustained in a car crash the following year. In 2005 Charles married his long-term partner, Camilla Parker Bowles. As heir apparent, Charles undertook official duties and engagements on behalf of his mother and represented the United Kingdom on visits abroad. He founded The Prince's Trust[e] in 1976, sponsored the Prince's Charities and became patron or president of more than 800 other charities and organisations. He advocated for the conservation of historic buildings and the importance of traditional architecture in society. In that vein, he created the experimental new town of Poundbury. An environmentalist, Charles supported organic farming and action to prevent climate change during his time as the manager of the Duchy of Cornwall estates, earning him awards and recognition as well as criticism. He is also a prominent critic of the adoption of genetically modified food, while his support for alternative medicine has been criticised. He has authored or co-authored 17 books. Charles became king upon his mother's death in 2022.
At the age of 73 he was the oldest person to accede to the British throne, after having been the longest-serving heir apparent and Prince of Wales in British history. Significant events in his reign have included his coronation in 2023 and his cancer diagnosis the following year, the latter of which temporarily suspended planned public engagements.
|
Folklore (disambiguation). Folklore is a body of expressive culture shared by a particular group of people. Folklore may also refer to:
|
Anthropomorphism. Anthropomorphism (from the Greek words ánthrōpos (ἄνθρωπος), meaning human, and morphē (μορφή), meaning form or shape) is the attribution of human form, character, or attributes to non-human entities.[1] It is considered to be an innate tendency of human psychology.[2] Personification is the related attribution of human form and characteristics to abstract concepts such as nations, emotions, and natural forces, such as seasons and weather. Both have ancient roots as storytelling and artistic devices, and most cultures have traditional fables with anthropomorphized animals as characters. People have also routinely attributed human emotions and behavioral traits to wild as well as domesticated animals.[3] Anthropomorphism and anthropomorphization derive from the verb form anthropomorphize,[a] itself derived from the Greek ánthrōpos (ἄνθρωπος, lit. human) and morphē (μορφή, form). It is first attested in 1753, originally in reference to the heresy of applying a human form to the Christian God.[b][1] From the beginnings of human behavioral modernity in the Upper Paleolithic, about 40,000 years ago, examples of zoomorphic (animal-shaped) works of art occur that may represent the earliest known evidence of anthropomorphism. One of the oldest known is the Löwenmensch figurine from Germany, an ivory sculpture of a human-shaped figure with the head of a lioness or lion, determined to be about 32,000 years old.[5][6] It is not possible to say what these prehistoric artworks represent. A more recent example is The Sorcerer, an enigmatic cave painting from the Trois-Frères Cave, Ariège, France: the figure's significance is unknown, but it is usually interpreted as some kind of great spirit or master of the animals. In either case there is an element of anthropomorphism.
|
People. The term the people refers to the public or common mass of people of a polity.[1] As such it is a concept of human rights law, international law as well as constitutional law, particularly used for claims of popular sovereignty. In contrast, a people is any plurality of persons considered as a whole. Used in politics and law, the term a people refers to the collective or community of an ethnic group or nation.[1] Chapter One, Article One of the Charter of the United Nations states that peoples have the right to self-determination.[2] The mere status as peoples and the right to self-determination, as for example in the case of Indigenous peoples (peoples, as in all groups of indigenous people, not merely all indigenous persons as in indigenous people),[clarification needed] do not, however, automatically provide for independent sovereignty and therefore secession.[3][4] Indeed, judge Ivor Jennings identified the inherent problems in the right of peoples to self-determination, as it requires pre-defining the said people.[5] Both the Roman Republic and the Roman Empire used the Latin term Senatus Populusque Romanus (the Senate and People of Rome). This term, abbreviated SPQR, was fixed to Roman legionary standards, and even after the Roman Emperors achieved a state of total personal autocracy, they continued to wield their power in the name of the Senate and People of Rome. The term People's Republic, used since late modernity, is a name used by states that particularly identify constitutionally with a form of socialism. In criminal law, in certain jurisdictions, criminal prosecutions are brought in the name of the People. Several U.S. 
states, including California, Illinois, and New York, use this style.[6] Citations outside the jurisdictions in question usually substitute the name of the state for the words the People in the case captions.[7] Four states — Massachusetts, Virginia, Pennsylvania, and Kentucky — refer to themselves as the Commonwealth in case captions and legal process. Other states, such as Indiana, typically refer to themselves as the State in case captions and legal process. Outside the United States, criminal trials in Ireland and the Philippines are prosecuted in the name of the people of their respective states.
|
Hansel and Gretel. Hansel and Gretel (/ˈhænsəl, ˈhɛn- ... ˈɡrɛtəl/; German: Hänsel und Gretel [ˈhɛnzl̩ ʔʊnt ˈɡʁeːtl̩])[a] is a German fairy tale collected by the Brothers Grimm and published in 1812 as part of Grimm's Fairy Tales (KHM 15).[1][2] Hansel and Gretel are siblings who are abandoned in a forest and fall into the hands of a witch who lives in a house made of bread,[3] cake, and sugar. The witch, who has cannibalistic intentions, plans to fatten Hansel before eventually eating him. However, Gretel saves her brother by pushing the witch into her own oven, killing her. The children then escape with the witch's treasure.[4] Set in medieval Germany, Hansel and Gretel has been adapted into various media, including the opera Hänsel und Gretel by Engelbert Humperdinck, which was first performed in 1893.[5][6] Although Jacob and Wilhelm Grimm credited various tales from Hesse (the region where they lived) as their source, scholars have argued that the brothers heard the story in 1809 from the family of Wilhelm's friend and future wife, Dortchen Wild, and partly from other sources.[7] A handwritten note in the Grimms' personal copy of the first edition reveals that in 1813 Wild contributed the children's verse answer to the witch, The wind, the wind,/ The heavenly child, which rhymes in German: Der Wind, der Wind,/ Das himmlische Kind.[2] According to folklorist Jack Zipes, the tale emerged in Germany in the Late Middle Ages (1250–1500). Shortly after this period, close written variants like Martin Montanus' Garten Gesellschaft (1590) began to appear.[4] Scholar Christine Goldberg argues that the episode of the paths marked with stones and crumbs, already found in the French Finette Cendron and Hop-o'-My-Thumb (1697), represents an elaboration of the motif of the thread that Ariadne gives Theseus to use to get out of the Minoan labyrinth.[8] A house made of confectionery is also found in a 14th-century manuscript about the Land of Cockayne.[5]
|
Arthur Rackham. Arthur Rackham RWS (19 September 1867 – 6 September 1939) was an English book illustrator. He is recognised as one of the leading figures of the Golden Age of British book illustration. His work is noted for its robust pen and ink drawings, which were combined with the use of watercolour, a technique he developed owing to his background as a journalistic illustrator. Rackham's 51 colour pieces for the early American tale Rip Van Winkle became a turning point in the production of books since – through colour-separated printing – it featured the accurate reproduction of colour artwork.[1] His best-known works also include the illustrations for Peter Pan in Kensington Gardens and Fairy Tales of the Brothers Grimm. Rackham was born at 210 South Lambeth Road, Vauxhall, London, as one of 12 children. In 1884, at the age of 17, he was sent on an ocean voyage to Australia to improve his fragile health, accompanied by two aunts.[2] At the age of 18, he worked as an insurance clerk at the Westminster Fire Office and began studying part-time at the Lambeth School of Art.[3] In 1892, he left his job and started working for the Westminster Budget as a reporter and illustrator. His first book of illustrations was published in 1893 in To the Other Side by Thomas Rhodes, but his first serious commission was in 1894 for The Dolly Dialogues, the collected sketches of Anthony Hope, who later went on to write The Prisoner of Zenda. Book illustrating then became Rackham's career for the rest of his life.
|
Common Era. Common Era (CE) and Before the Common Era (BCE) are year notations for the Gregorian or Julian calendar, and are exactly equivalent to the Anno Domini (AD) and Before Christ (BC) notations. 2025 CE and AD 2025 each describe the current year; 400 BCE and 400 BC are the same year.[1][2] BCE/CE are primarily used to avoid religious connotations[3] by not referring to Jesus as Our Lord.[4][5][a] Nevertheless, the year numbers remain the same as Anno Domini. The expression can be traced back to 1615, when it first appears in a book by Johannes Kepler as the Latin: annus aerae nostrae vulgaris (year of our common era),[7][8] and to 1635 in English as Vulgar Era.[b] The term Common Era can be found in English as early as 1708,[9] and became more widely used in the mid-19th century by Jewish religious scholars. Around the year 525, the Christian monk Dionysius Exiguus devised the principle of taking the moment that he believed to be the date of the incarnation of Jesus to be the point from which years are numbered (the epoch) of the Christian ecclesiastical calendar.[10][11][12] Dionysius labeled the column of the table in which he introduced the new era as Anni Domini Nostri Jesu Christi (the years of our Lord Jesus Christ).[10]: 52 He did this to replace the Era of the Martyrs system (then used for some Easter tables) because he did not wish to continue the memory of a tyrant who persecuted Christians.[10]: 50 This way of numbering years became more widespread in Europe, with its use by Bede in England in 731. Bede also introduced the practice of dating years before 1 backwards, without a year zero.[c] The term Common Era is traced back in English to its appearance as Vulgar Era to distinguish years of the Anno Domini era, which was in popular use, from dates of the regnal year (the year of the reign of a sovereign) typically used in national law.[14] (The word vulgar originally meant of the ordinary people, with no derogatory associations.[15])
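The arithmetic behind these equivalences has one wrinkle: because the traditional era counts backwards from 1 with no year zero, converting BC/BCE years to the astronomers' signed year numbering requires an offset of one. The sketch below illustrates this; the function name and era labels are illustrative choices, not part of any standard library.

```python
def to_astronomical(year, era):
    """Convert a year labelled CE/AD or BCE/BC to astronomical year
    numbering, in which 1 BCE is year 0, 2 BCE is year -1, and so on,
    because the traditional count has no year zero."""
    if era in ("CE", "AD"):
        return year
    if era in ("BCE", "BC"):
        return 1 - year
    raise ValueError(f"unknown era label: {era}")

# 400 BCE and 400 BC name the same year, as do 2025 CE and AD 2025.
assert to_astronomical(400, "BCE") == to_astronomical(400, "BC") == -399

# Elapsed years from 400 BCE to 2025 CE: 2424, not 2425,
# precisely because there is no year zero in between.
elapsed = to_astronomical(2025, "CE") - to_astronomical(400, "BCE")
```

The offset of one is exactly the discrepancy Bede's backwards numbering introduces between historical and astronomical year counts.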
|
King James Version. The King James Version (KJV), also the King James Bible (KJB) and the Authorized Version (AV), is an Early Modern English translation of the Christian Bible for the Church of England, which was commissioned in 1604 and published in 1611, by sponsorship of King James VI and I.[d] The 80 books of the King James Version[4] include 39 books of the Old Testament, 14 books of Apocrypha, and the 27 books of the New Testament. Noted for its majesty of style, the King James Version has been described as one of the most important books in English culture and a driving force in the shaping of the English-speaking world.[5][6] The King James Version remains the preferred translation of many Protestant Christians, and is considered the only valid one by some Evangelicals. It is considered one of the important literary accomplishments of early modern England. The KJV 1611 is a 17th-century translation and thus contains a large number of archaisms and false friends—words that contemporary readers may think they understand but that actually carry obsolete or unfamiliar meanings—making understanding the text difficult for modern readers, even pastors and preachers trained in formal theological institutes.[7] The KJV was the third translation into English approved by the Church of England. The first had been the Great Bible in 1535, and the second had been the Bishops' Bible in 1568.[8] Meanwhile, in Switzerland, the first generation of Protestant Reformers had produced the Geneva Bible,[9] published in 1560,[10] which proved more popular among the laity. However, its footnotes represented a Calvinistic Puritanism that was too radical for James.
James convened the Hampton Court Conference in January 1604, where a new English version was conceived in response to the problems of the earlier translations perceived by the Puritans,[11] a faction of the Church of England.[12] James gave translators instructions intended to ensure the new version would conform to the ecclesiology, and reflect the episcopal structure, of the Church of England and its belief in an ordained clergy.[13][14] In common with most other translations of the period, the New Testament was translated from Greek, the Old Testament from Hebrew and Aramaic, and the Apocrypha from Greek and Latin.[15]
|
Chicago Loop. The Loop is Chicago's central business district and one of the city's 77 municipally recognized community areas. Located at the center of downtown Chicago[3] on the shores of Lake Michigan, it is the second-largest business district in North America, after Midtown Manhattan in New York City. The world headquarters and regional offices of several global and national businesses, retail establishments, restaurants, hotels, museums, theaters, and libraries—as well as many of Chicago's most famous attractions—are located in the Loop.[4] The district also hosts Chicago's City Hall, the seat of Cook County, offices of the state of Illinois, United States federal offices, as well as several foreign consulates. The intersection of State Street and Madison Street in the Loop is the origin point for the address system on Chicago's street grid, a grid system that has been adopted by numerous cities worldwide. The Loop's definition and perceived boundaries have evolved over time. Since the 1920s, the area bounded by the Chicago River to the west and north, Lake Michigan to the east, and Roosevelt Road to the south has been called the Loop. It took its name from a somewhat smaller area, the 35 city blocks bounded on the north by Lake Street, on the west by Wells Street, on the south by Van Buren Street, and on the east by Wabash Avenue—the Union Loop formed by the L in the late 1800s.[5] Similarly, the South Loop and the West Loop historically referred to areas within the Loop proper, but in the 21st century began to refer to the entire Near South and much of the Near West Sides of the city, respectively.[6][7] In 1803, the United States Army built Fort Dearborn in what is now the Loop; although earlier settlement was present, this was the first settlement in the area sponsored by the United States federal government. When Chicago and Cook County were incorporated in the 1830s, the area was selected as the site of their respective seats.
Originally mixed-use, the neighborhood became increasingly commercial in the 1870s. This process accelerated in the aftermath of the 1871 Great Chicago Fire, which destroyed most of the neighborhood's buildings. Some of the world's earliest skyscrapers were constructed in the Loop, giving rise to the Chicago School of architecture. By the late 19th century, cable car turnarounds and the Union Loop encircled the area, giving the neighborhood its name. Near the lake, Grant Park, known as Chicago's front yard, is Chicago's oldest park; it was significantly expanded in the late 19th and early 20th centuries and houses a number of features and museums. Starting in the 1920s, highways were constructed to and into the Loop, perhaps most famously U.S. Route 66 (US 66), which was commissioned in 1926. While the Loop remains dominated by offices and public buildings, its residential population boomed during the latter 20th century and first decades of the 21st, partly due to the development of former rail yards (at one time, the area had six major interurban railroad terminals, and land was also needed for extensive rail cargo storage and transfer), industrial building conversions, and additional high-rise residences. Since 1950, the Loop's resident population has increased in percentage terms the most out of all of Chicago's community areas.
|
Chicago L. The Chicago L (short for elevated)[4] is the rapid transit system serving the city of Chicago and some of its surrounding suburbs in the U.S. state of Illinois. Operated by the Chicago Transit Authority (CTA), it is the fourth-largest rapid transit system in the United States in terms of total route length, at 102.8 miles (165.4 km) long as of 2014,[1][note 1] and the third-busiest rapid transit system in the United States, after the New York City Subway and the Washington Metro.[5] As of January 2024, the L had 1,480 rail cars operating across eight different routes on 224.1 miles of track. CTA trains make about 1,888 trips each day, serving 146 train stations.[6] In 2024, the system had 127,463,400 rides, or about 422,200 per weekday in the second quarter of 2025.[7] The L provides 24-hour service on the Red and Blue Lines, making Chicago, New York City, and Copenhagen the only three cities in the world to offer 24-hour train service on some of their lines throughout their respective city limits.[note 2] The oldest sections of the Chicago L started operations in 1892,[8] making it the second-oldest rapid transit system in the Americas, after New York City's elevated lines.
The L gained its name from el, an abbreviation of elevated, because large parts of the system run on elevated track.[9][10] Portions of the network are in subway tunnels, at grade level, or in open cuts.[1] The L has been credited with fostering the growth of Chicago's dense city core, one of the city's distinguishing features.[11] According to urban engineer Christof Spieler, the system stands out in the United States because it continued to invest in services even through the post-World War II growth of the expressway; its use of alleyways instead of streets throughout its history, and of expressway medians after the war, knit the system into the city in pioneering ways.[12] It consists of eight rapid transit lines laid out in a spoke–hub distribution paradigm focusing transit towards the Loop. In a 2005 poll, Chicago Tribune readers voted it one of the seven wonders of Chicago, behind the lakefront and Wrigley Field, and ahead of Willis Tower (formerly the Sears Tower), the Water Tower, the University of Chicago, and the Museum of Science and Industry.[13]
|
Hoodening. Hoodening (/ʊd.ɛnɪŋ/), also spelled hodening and oodening, is a folk custom found in Kent, a county in South East England. The tradition entails the use of a wooden hobby horse known as a hooden horse that is mounted on a pole and carried by a person hidden under a sackcloth. Originally, the tradition was restricted to the area of East Kent, although in the twentieth century it spread into neighbouring West Kent. It represents a regional variation of a hooded animal tradition that appears in various forms throughout Britain and Ireland. As recorded from the eighteenth to the early twentieth centuries, hoodening was a tradition performed at Christmas time by groups of farm labourers. They would form into teams to accompany the hooden horse on its travels around the local area, and although the makeup of such groups varied, they typically included someone to carry the horse, a leader, a man in female clothing known as a Mollie, and several musicians. The team would then carry the hooden horse to local houses and shops, where they would expect payment for their appearance. Although this practice is extinct, in the present the hooden horse is incorporated into various Kentish mummers plays and Morris dances that take place at different times of the year. The origins of the hoodening tradition, and the original derivation of the term hooden, remain subject to academic debate. An early suggestion was that hooden was related to the Anglo-Saxon pre-Christian god Woden (Odin), and that the tradition therefore originated with pre-Christian religious practices in the early medieval Kingdom of Kent. This idea has not found support from historians or folklorists studying the tradition. A more widely accepted explanation among scholars is that the term hooden relates to hooded, a reference to the sackcloth worn by the person carrying the horse. 
The absence of late medieval references to such practices and the geographic dispersal of the various British hooded animal traditions—among them the Mari Lwyd of south Wales, the Broad of the Cotswolds, and the Old Ball, Old Tup, and Old Horse of northern England—have led to suggestions that they derive from the regionalised popularisation of the sixteenth- and seventeenth-century fashion for hobby horses among the social elite. The earliest textual reference to the hoodening tradition comes from the first half of the eighteenth century. Scattered references to it appeared over the next century and a half, many of which considered it to be a declining tradition that had died out in many parts of Kent. Aware of this decline, in the early twentieth century the folklorist and historian Percy Maylam documented what survived of the tradition and traced its appearances in historical documents, publishing his findings as The Hooden Horse in 1909. Although deemed extinct at the time of the First World War, the custom was revived in an altered form during the mid-twentieth century, when the use of the hooden horse was incorporated into some modern Kentish folk traditions.
|
Culture. Culture (/ˈkʌltʃər/ KUL-chər) is a concept that encompasses the social behavior, institutions, and norms found in human societies, as well as the knowledge, beliefs, arts, laws, customs, capabilities, attitudes, and habits of the individuals in these groups.[1] Culture often originates from or is attributed to a specific region or location. Humans acquire culture through the learning processes of enculturation and socialization, which is shown by the diversity of cultures across societies. A cultural norm codifies acceptable conduct in society; it serves as a guideline for behavior, dress, language, and demeanor in a situation, and as a template for expectations in a social group. Accepting only a monoculture in a social group can bear risks, just as a single species can wither in the face of environmental change, for lack of functional responses to the change.[2] Thus in military culture, valor is counted as a typical behavior for an individual, and duty, honor, and loyalty to the social group are counted as virtues or functional responses in the continuum of conflict. In religion, analogous attributes can be identified in a social group. Cultural change, or repositioning, is the reconstruction of a cultural concept of a society. Cultures are internally affected by both forces encouraging change and forces resisting change. Cultures are externally affected via contact between societies.
|
Trick-or-treating. Trick-or-treating is a traditional Halloween custom for children and adults in some countries. During the evening of Halloween, on October 31, people in costumes travel from house to house, asking for treats with the phrase trick or treat. The treat is some form of confectionery, usually candy/sweets, although in some cultures money is given instead. The trick refers to a threat, usually idle, to perform mischief on the resident(s) or their property if no treat is given. Some people signal that they are willing to hand out treats by putting up Halloween decorations outside their doors; houses may also leave their porch lights on as a universal indicator that they have candy; some simply leave treats available on their porches for the children to take freely, on the honor system. The history of trick-or-treating traces back to Scotland and Ireland, where the tradition of guising, going house to house at Halloween and putting on a small performance to be rewarded with food or treats, goes back at least as far as the 16th century, as does the tradition of people wearing costumes at Halloween. There are many accounts from 19th-century Scotland and Ireland of people going house to house in costume at Halloween, reciting verses in exchange for food, and sometimes warning of misfortune if they were not welcomed.[1][2][3] In North America, the earliest known occurrence of guising is from 1898, when children were recorded as having done this in the province of British Columbia, Canada.[4] The interjection trick or treat! 
was then first recorded in the Canadian province of Ontario in 1917.[5] While going house to house in costume has long been popular among the Scots and Irish, it is only in the 2000s that saying trick or treat has become common in Scotland and Ireland.[2] Prior to this, children in Ireland would commonly say help the Halloween party at the doors of homeowners.[2] The activity is prevalent in the Anglospheric countries of the United Kingdom, Ireland, the United States and Canada. It also has extended into Mexico. In northwestern and central Mexico, the practice is called calaverita (Spanish diminutive for calavera, skull in English), and instead of trick or treat, the children ask, ¿Me da mi calaverita? ([Can you] give me my little skull?), where a calaverita is a small skull made of sugar or chocolate. Traditions similar to the modern custom of trick-or-treating extend all the way back to classical antiquity, although it is extremely unlikely that any of them are directly related to the modern custom. 
The ancient Greek writer Athenaeus of Naucratis records in his book The Deipnosophists that, in ancient times, the Greek island of Rhodes had a custom in which children would go from door to door dressed as swallows, singing a song that demanded food from the owners of the house and threatened to cause mischief if the owners refused.[6][7][8] This tradition was claimed to have been started by the Rhodian lawgiver Cleobulus.[9] From at least the 15th century, there had been a custom among Christians of sharing soul-cakes at Allhallowtide (October 31 through November 2).[11][12] People would visit houses and take soul-cakes, either as representatives of the dead or in return for praying for their souls.[13] Later, people went from parish to parish at Halloween, begging soul-cakes by singing under the windows some such verse as this: Soul, souls, for a soul-cake; Pray you good mistress, a soul-cake![14] They typically asked for mercy on all Christian souls for a soul-cake.[15] The practice, known as souling, was recorded in parts of Britain, Flanders, southern Germany, and Austria.[16] Shakespeare mentions it in his comedy The Two Gentlemen of Verona (1593), when Speed accuses his master of puling [whimpering or whining] like a beggar at Hallowmas.[17] In western England, mostly in the counties bordering Wales, souling was common.[12] According to one 19th-century English writer, parties of children, dressed up in fantastic costume […] went round to the farm houses and cottages, singing a song, and begging for cakes (spoken of as soal-cakes), apples, money, or anything that the goodwives would give them.[18] In England, souling remained an important part of Allhallowtide observances until the 19th century, in both Protestant and Catholic areas.[19][20]
|
Mummers play. Mummers plays are folk plays performed by troupes of amateur actors, traditionally all male, known as mummers or guisers (also by local names such as rhymers, pace-eggers, soulers, tipteerers, wrenboys, and galoshins). Historically, mummers plays were performed by informal groups of costumed community members that visited from house to house on various holidays.[1][2][3] The modern term refers especially to a play in which a number of characters are called on stage, two of whom engage in a combat, the loser being revived by a doctor character. This play is sometimes found associated with a sword dance, though both also exist in Britain independently. Plays may be performed in the street or during visits to houses and pubs. They are generally performed seasonally, often at Christmas, Easter or on Plough Monday, more rarely on Halloween or All Souls Day, and often with a collection of money. The practice may be compared with other customs such as those of Halloween, Bonfire Night, wassailing, pace egging and first-footing at new year.[4] Although the term mummer has been in use since the Middle Ages, no scripts or details survive from that era, and the term may have been used loosely to describe performers of several different kinds. The earliest evidence of mummers plays as they are known today is from the mid- to late 18th century. Mummers plays should not be confused with the earlier mystery plays. Mumming spread from the British Isles to a number of former British colonies. Ireland has its own history of mummers plays, and adopted the English-language term for the tradition.[5]
|
List of municipalities in Illinois. Illinois is a state located in the Midwestern United States. According to the 2020 United States census, Illinois is the 6th-most populous state, with 12,812,508 inhabitants, but only the 24th-largest by land area, spanning 55,499.0 square miles (143,742 km2) of land.[1] Illinois is divided into 102 counties and, as of 2020, contained 1,300 municipalities consisting of cities, towns, and villages. The most populous city is Chicago, with 2,746,388 residents, while the least populous is Valley City, with 14 residents.[2] The largest municipality by land area is Chicago, which spans 227.73 sq mi (589.8 km2), while the smallest is Irwin, at 0.045 sq mi (0.12 km2).[2]
|
Family law (disambiguation). Family law is an area of the law that deals with family matters and domestic relations. Family Law may also refer to:
|
The Decemberists. The Decemberists are an American indie rock band from Portland, Oregon, formed in 2000. The band consists of Colin Meloy (lead vocals, guitar), Chris Funk (guitar, multi-instrumentalist), Jenny Conlee (piano, keyboards, accordion, backing vocals), Nate Query (bass), and John Moen (drums). As of 2024[update], the band has released nine studio albums, with lyrics often focusing on historical incidents and folklore. Audience participation is a part of their live performances, typically during encores. The band stages whimsical reenactments of sea battles and other centuries-old events, typically of regional interest, or acts out songs with members of the crowd. In 2011, the track Down by the Water from their album The King Is Dead was nominated for Best Rock Song at the 54th Grammy Awards. The Decemberists formed in 2000 when Colin Meloy left his band Tarkio in Montana and moved to Portland, Oregon. There he met Nate Query, who introduced Meloy to Jenny Conlee (the two had played together in the band Calobo), and the three scored a silent film together. At a solo show played before meeting Query, Meloy met Chris Funk. Funk was a fan of Tarkio and played pedal steel on the first two Decemberists releases, not officially becoming a member until the third. The band's first drummer, Ezra Holbrook, was replaced after Castaways and Cutouts by Rachel Blumberg, who in turn was replaced by John Moen after Picaresque. The band's name refers to the Decembrist revolt, an 1825 insurrection in Imperial Russia. Meloy has stated that the name is also meant to invoke the drama and melancholy of the month of December.[1] 5 Songs, the band's debut extended play, was self-released in 2001. To raise the money needed to record in the studio, the members at that time played for several hours in a McMenamins hotel the night before.[2] The EP originally served as a demo tape, and the five songs on it (minus Apology Song) were recorded in under two hours.
After releasing its first full record, Castaways and Cutouts, on Hush Records, the group moved on to the Kill Rock Stars label. After the re-release of Castaways, Her Majesty the Decemberists was released in 2003. In 2004, the band released The Tain, an eighteen-and-a-half-minute single track based on the Irish mythological epic Táin Bó Cúailnge. The band's final album with Kill Rock Stars was Picaresque, which was recorded in a former church.
|
Soul cake. A soul cake, also known as a soulmass-cake, is a small round cake with sweet spices, which resembles a shortbread biscuit. It is traditionally made for Halloween, All Saints' Day, and All Souls' Day to commemorate the dead in many Christian traditions.[1][2] The cakes, often simply referred to as souls, are given out to soulers (mainly consisting of children and the poor) who go from door to door during the days of Allhallowtide, singing and saying prayers for the souls of the givers and their friends,[1] especially the souls of deceased relatives, thought to be in the intermediate state between Earth and Heaven.[3] In England, the practice dates to the medieval period,[4] and it continued there until the 1930s among both Protestant and Catholic Christians.[5][6][1] In Sheffield and Cheshire, the custom has continued into modern times. In Lancashire and in the north-east of England, soul cakes were known as Harcakes, a kind of thin parkin.[7] The practice of giving and eating soul cakes continues in some countries today, such as Portugal (where it is known as Pão-por-Deus and occurs on All Saints' Day and All Souls' Day), as well as the Philippines (where it is known as Pangangaluwa and occurs on All Hallows' Eve).[8][9] In other countries, souling is seen as the origin of the practice of trick-or-treating.[10] In the United States, some churches, during Allhallowtide, have invited people to come receive sweets from them and have offered to pray for the souls of their friends, relatives, or even pets as they do so.[11] Among Catholics and Lutherans, some parishioners have their soul cakes blessed by a priest before they are distributed; in exchange, the children promise to pray for the souls of the deceased relatives of the giver during the month of November, which is dedicated especially to praying for the Holy Souls. Any leftover soul cakes are shared among the distributing family or given to the poor.
The tradition of giving soul cakes was celebrated in Britain and Ireland during the Middle Ages,[12] although similar practices for the souls of the dead were found as far south as Italy.[13] The cakes are usually filled with allspice, nutmeg, cinnamon, ginger or other sweet spices, raisins or currants, and before baking are topped with the mark of a cross to signify that these were alms. They were traditionally set out with glasses of wine, an offering for the dead as in early Christian tradition,[14] and either on All Hallows' Eve (Halloween),[15] All Saints' Day or All Souls' Day, children would go souling,[16] or ritually begging for cakes door to door.[citation needed]
|
Winter solstice. The winter solstice, or hibernal solstice, occurs when either of Earth's poles reaches its maximum tilt away from the Sun. This happens twice yearly, once in each hemisphere (Northern and Southern). For that hemisphere, the winter solstice is the day with the shortest period of daylight and longest night of the year, and when the Sun is at its lowest daily maximum elevation in the sky.[7] Each polar region experiences continuous darkness or twilight around its winter solstice. The opposite event is the summer solstice. The winter solstice occurs during the hemisphere's winter. In the Northern Hemisphere, this is the December solstice (December 21 or 22) and in the Southern Hemisphere, this is the June solstice (June 20 or 21). Although the winter solstice itself lasts only a moment, the term also refers to the day on which it occurs. Traditionally, in many temperate regions, the winter solstice is seen as the middle of winter; midwinter is another name for the winter solstice, although it carries other meanings as well. Other names are the extreme of winter, or the shortest day. Since prehistory, the winter solstice has been a significant time of year in many cultures and has been marked by festivals and rites.[8] This is because it is the point when the shortening of daylight hours is reversed and the daytime begins to lengthen again. In parts of Europe it was seen as the symbolic death and rebirth of the Sun. Some ancient monuments such as Newgrange, Stonehenge, and Cahokia Woodhenge are aligned with the sunrise or sunset on the winter solstice. There is evidence that the winter solstice was deemed an important moment of the annual cycle for some cultures as far back as the Neolithic (New Stone Age). Astronomical events were often used to guide farming activities, such as the mating of animals, the sowing of crops and the monitoring of winter reserves of food. 
Livestock were slaughtered so they would not have to be fed during the winter, so it was almost the only time of year when a plentiful supply of fresh meat was available.[9]
|
Album. An album is a collection of audio recordings (e.g., music) issued on a medium such as compact disc (CD), vinyl (record), audio tape (like 8-track or cassette), or digital. Albums of recorded sound were developed in the early 20th century as individual 78 rpm records (78s) collected in a bound book resembling a photo album; this format evolved after 1948 into single vinyl long-playing (LP) records played at 33+1⁄3 rpm. The album was the dominant form of recorded music expression and consumption from the mid-1960s to the early 21st century, a period known as the album era.[1] Vinyl LPs are still issued, though album sales in the 21st century have mostly focused on CD and MP3 formats. The 8-track tape was the first tape format widely used alongside vinyl from 1965 until being phased out by 1983, being gradually supplanted by the cassette tape throughout the 1970s and early 1980s; the popularity of the cassette reached its peak during the late 1980s before sharply declining during the 1990s. The cassette had largely disappeared by the first decade of the 2000s. Most albums are recorded in a studio,[2] making them studio albums, although they may also be recorded in a concert venue, at home, in the field, or a mix of places. The time frame for completely recording an album varies from a few hours to several years. This process usually requires several takes, with different parts recorded separately and then brought or mixed together. Recordings that are done in one take without overdubbing are termed live, even when done in a studio. Studios are built to absorb sound, eliminating reverberation, to assist in mixing different takes; other locations, such as concert venues and some live rooms, have reverberation, which creates a live sound.[3] Recordings, including live, may contain editing, sound effects, voice adjustments, etc. 
With modern recording technology, artists can be recorded in separate rooms or at separate times while listening to the other parts using headphones, with each part recorded as a separate track. An album is generally considered to contain more tracks than an extended play (EP). Album covers and liner notes are used, and sometimes additional information is provided, such as analysis of the recording, and lyrics or librettos.[4][5] Historically, the term album was applied to a collection of various items housed in a book format. In musical usage, the word was used for collections of short pieces of printed music from the early nineteenth century.[6] Later, collections of related 78s were bundled in book-like albums[7] (one side of a 78 rpm record could hold only about 3.5 minutes of sound). When LP records were introduced, a collection of pieces or songs on a single record was called an album; the word was extended to other recording media such as compact disc, MiniDisc, compact audio cassette, 8-track tape and digital albums as they were introduced.[8]
|
Types of marriages. The type, functions, and characteristics of marriage vary from culture to culture, and can change over time. In general there are two types: civil marriage and religious marriage, and typically marriages employ a combination of both (religious marriages must often be licensed and recognized by the state, and conversely civil marriages, while not sanctioned under religious law, are nevertheless respected). Marriages between people of differing religions are called interfaith marriages, while marital conversion, a more controversial concept than interfaith marriage, refers to the religious conversion of one partner to the other's religion for the sake of satisfying a religious requirement. In the Americas and Europe, in the 21st century, legally recognized marriages are formally presumed to be monogamous (although some pockets of society accept polygamy socially, if not legally, and some couples choose to enter into open marriages). In these countries, divorce is relatively simple and socially accepted. In the West, the prevailing view toward marriage today is that it is based on a legal covenant recognizing emotional attachment between the partners and entered into voluntarily. In the West, marriage has evolved from a lifetime covenant that can only be broken by fault or death to a contract that can be broken by either party at will. Western marriage has undergone other shifts since World War I as well. Outside the West, same-race marriage was illegal in Paraguay before becoming legal.
|
Legitimacy (criminal law). In law, legitimacy is distinguished from legality (see also color of law): an action can be legal but not legitimate, or, vice versa, legitimate but not legal. Thomas Hilbink suggests that the power to compel obedience to the law is derived from the power to sway public opinion, from the belief that the law and its agents are legitimate and deserving of this obedience.[1] Tyler, for his part, says that legitimacy is "a psychological property of an authority, institution, or social arrangement that leads those connected to it to believe that it is appropriate, proper, and just" (Tyler, 2006b: 375). Thus viewed, legal legitimacy is the belief that the law and agents of the law are rightful holders of authority; that they have the right to dictate appropriate behaviour and are entitled to be obeyed; and that laws should be obeyed simply because that is the right thing to do (Tyler, 2006a; Tyler, 2006b; cf. Easton, 1965).[2] Peter Kropotkin suggested that acceptance of the rule of law developed in response to the rampant abuse of authority by the nobility; after the advent of the middle class following the French Revolution, strict adherence to the law was conceived as the ultimate equalizer within society. “Whatever this law might be,” Kropotkin writes, “it promised to affect lord and peasant alike; it proclaimed the equality of rich and poor before the judge”.[1] A government action can be legal while not being legitimate; e.g., the Gulf of Tonkin Resolution, which allowed the United States to wage war against Vietnam without a formal declaration of war. It is also possible for a government action to be legitimate without being legal; e.g., a pre-emptive war, or a military junta. An example of such matters arises when legitimate institutions clash in a constitutional crisis.
|
Music genre. A music genre is a conventional category that identifies some pieces of music as belonging to a shared tradition or set of conventions.[1] Genre is to be distinguished from musical form and musical style, although in practice these terms are sometimes used interchangeably.[2] Music can be divided into genres in numerous ways, sometimes broadly and with polarity, e.g., popular music as opposed to art music or folk music, or, as another example, religious music and secular music. Often, however, classification draws on the proliferation of derivative subgenres, fusion genres, and microgenres that has started to accrue, e.g., screamo, country pop, and mumble rap, respectively. The artistic nature of music means that these classifications are often subjective and controversial, and some may overlap. As genres evolve, novel music is sometimes lumped into existing categories. Douglass M. Green distinguishes between genre and form in his book Form in Tonal Music. He lists madrigal, motet, canzona, ricercar, and dance as examples of genres from the Renaissance period. To further clarify the meaning of genre, Green writes about Beethoven's Op. 61 and Mendelssohn's Op. 64. He explains that both are identical in genre and are violin concertos that have different forms. However, Mozart's Rondo for Piano, K. 511, and the Agnus Dei from his Mass, K. 317, are quite different in genre but happen to be similar in form.[3] In 1982, Franco Fabbri proposed a definition of the musical genre that is now considered to be normative:[4] musical genre is a set of musical events (real or possible) whose course is governed by a definite set of socially accepted rules, where a musical event can be defined as any type of activity performed around any type of event involving sound.[5]
|
Common-law marriage. Common-law marriage, also known as non-ceremonial marriage,[1][2] sui iuris marriage, informal marriage, de facto marriage, more uxorio or marriage by habit and repute, is a marriage that results from the parties' agreement to consider themselves married, followed by cohabitation, rather than through a statutorily defined process. Not all jurisdictions permit common-law marriage, but they will typically respect the validity of such a marriage lawfully entered into in another state or country. The original concept of a common-law marriage is one considered valid by both partners, but not formally recorded with a state or religious registry, nor celebrated in a formal civil or religious service. In effect, the act of the couple representing themselves to others as being married and organizing their relation as if they were married, means they are married. The term common-law marriage (or similar) has wider informal use, often to denote relations that are not legally recognized as marriages. It is often used colloquially or by the media to refer to cohabiting couples, regardless of any legal rights or religious implications involved. This can create confusion in regard to the term and to the legal rights of unmarried partners (in addition to the actual status of the couple referred to).[3] Common-law marriage is a marriage that takes legal effect without the prerequisites of a marriage license or participation in a marriage ceremony. The marriage occurs when two people who are legally capable of being married, and who intend to be married, live together as a married couple and hold themselves out to the world as a married couple.[4] The term common-law marriage is often used incorrectly to describe various types of couple relationships, such as cohabitation (whether or not registered) or other legally formalized relations. 
Although these interpersonal relationships are often called common-law marriage, they differ from its original meaning in that they are not legally recognized as marriages, but may be a parallel interpersonal status such as a domestic partnership, registered partnership, common-law partnership, conjugal union, or civil union. Non-marital relationship contracts are not necessarily recognized from one jurisdiction to another.
|
Indie rock. Indie rock is a subgenre of rock music that originated in the United Kingdom, United States and New Zealand in the early to mid-1980s. Although the term was originally used to describe rock music released through independent record labels, by the 1990s it became more widely associated with the music such bands produced. The sound of indie rock has its origins in the UK DIY music of the Buzzcocks, Desperate Bicycles[1] and Television Personalities,[2][3] the New Zealand Dunedin sound of the Chills, Tall Dwarfs,[4] the Clean[5] and the Verlaines, alongside Australia's The Go-Betweens[6] and early 1980s college rock radio stations who would frequently play jangle pop bands like the Smiths and R.E.M. The genre solidified itself during the mid-1980s with NME's C86 cassette in the United Kingdom and the underground success of Sonic Youth, Dinosaur Jr. and Unrest in the United States. During the 1990s, indie rock bands like Sonic Youth, the Pixies and Radiohead all released albums on major labels, and subgenres like slowcore, Midwest emo, slacker rock and space rock began. By this time, indie had evolved to refer to bands whose music was released on independent record labels, in addition to the record labels themselves. As the decade progressed, many individual local scenes developed their own distinct takes on the genre: baggy in Manchester; grebo in Stourbridge and Leicester; and shoegaze in London and the Thames Valley. During the 1990s, the mainstream success of grunge and Britpop, two movements influenced by indie rock, brought increased attention to the genre and saw record labels use their independent status as a marketing tactic. This led to a split within indie rock: one side conforming to mainstream radio; the other becoming increasingly experimental. By this point, indie rock referred to the musical style rather than ties to the independent music scene. 
In the 2000s, indie rock reentered the mainstream through the garage rock and post-punk revival and the influence of the Strokes and the Libertines. This success was reinforced in the middle of the decade by Bloc Party, Arctic Monkeys and the Killers, while indie rock further proliferated into the 2000s blog rock era and the British landfill indie movement. Lately we've been hearing the tag 90s indie rock used to describe bands ranging from Waxahatchee to Speedy Ortiz to Yuck, and while our brain immediately turns a switch that associates the phrase with sounds like Pavement, 90s indie rock was really as eclectic and undefinable as, well, contemporary indie rock.
|
English and Welsh bastardy laws. In the law of England and Wales, a bastard (also historically called whoreson, although both of these terms have largely dropped from common usage) is an illegitimate child, one whose parents were not married at the time of their birth. Until 1926, there was no possibility of post factum legitimisation of a bastard. The word bastard is from the Old French bastard, which in turn was from Medieval Latin bastardus. In the modern French bâtard, the circumflex (â) merely represents the loss of the s over time. According to some sources, bastardus may have come from the word bastum, which means pack saddle,[1] the connection possibly being the idea that a bastard might be the child of a passing traveller (who would have a pack saddle). In support of this is the Old French phrase fils de bast, loosely meaning child of the saddle, which had a similar meaning.[1] A more defined possibility is that such a traveller was a member of the corps de bast, referring to the division of an army who arrived in town with their pack saddles the night before the troops, and left the day after, so that they could deal with all of the provisions of an army, and even do advanced scouting. This meant that for two days, they had unfettered access to all of the women in town, and were therefore the ones most likely to be the cause of the town's illegitimate offspring. (This explanation is apocryphal, but no attempt at dispute seems to have been proffered.) Bastardy was not a status, like villeinage, but the fact of being a bastard had a number of legal effects on an individual. One exception to the general principle that a bastard could not inherit occurred when the eldest son (who would otherwise be heir) was born a bastard but the second son was born after the parents were married. The Provisions of Merton 1235 (20 Hen. 3 c. IX), otherwise known as the Special Bastardy Act 1235, provided that except in the case of real actions the fact of bastardy could be proved by trial by jury, rather than necessitating a bishop's certificate.
|
Concubinage. Concubinage is an interpersonal and sexual relationship between two people in which the couple does not want to, or cannot, enter into a full marriage.[1] Concubinage and marriage are often regarded as similar, but mutually exclusive.[2] During the early stages of European colonialism, administrators often encouraged European men to practice concubinage to discourage them from paying prostitutes for sex (which could spread venereal disease) and from homosexuality. Colonial administrators also believed that having an intimate relationship with a native woman would enhance white men's understanding of native culture and would provide them with essential domestic labor. The latter was critical, as it meant white men did not require wives from the metropole, hence did not require a family wage. Colonial administrators eventually discouraged the practice when these liaisons resulted in offspring who threatened colonial rule by producing a mixed-race class. This political threat eventually prompted colonial administrators to encourage white women to travel to the colonies, where they contributed to the colonial project, while at the same time contributing to domesticity and the separation of public and private spheres.[3] In China, until the 20th century, concubinage was a formal and institutionalized practice that upheld concubines' rights and obligations.[4] A concubine could be freeborn or of slave origin, and her experience could vary tremendously according to her master's whim.[4] During the Mongol conquests, both foreign royals[5] and captured women were taken as concubines.[6] Concubinage was also common in Meiji Japan as a status symbol.[7] Many Middle Eastern societies used concubinage for reproduction.[8] The practice of a barren wife giving her husband a slave as a concubine is recorded in the Code of Hammurabi.[8] The children of such relationships would be regarded as legitimate.[8] Such concubinage was also widely practiced in the premodern Muslim world, and many of the rulers of the Abbasid Caliphate and the Ottoman Empire were born out of such relationships.[9] Throughout Africa, from Egypt to South Africa, slave concubinage resulted in racially mixed populations.[10] The practice declined as a result of the abolition of slavery.[9]
|
Spanish Netherlands. The Spanish Netherlands (Spanish: Países Bajos Españoles; Dutch: Spaanse Nederlanden; French: Pays-Bas espagnols; German: Spanische Niederlande; historically in Spanish: Flandes, the name Flanders was used as a pars pro toto)[4] were a collection of States of the Holy Roman Empire in the Low Countries, held in personal union by the Spanish Habsburgs, but not annexed to the Spanish Crown, thus encompassing the second period in the history of the Habsburg Netherlands, which lasted from 1556 to 1714. This region comprised most of the modern states of Belgium and Luxembourg, as well as parts of northern France, the southern Netherlands, and western Germany, with the capital being Brussels. The Army of Flanders was given the task of defending the territory. The Imperial fiefs in the former Burgundian Netherlands had been inherited by the House of Habsburg from the extinct House of Valois-Burgundy upon the death of Mary of Burgundy in 1482. The Seventeen Provinces formed the core of the Habsburg Netherlands, which passed to the Spanish Habsburgs upon the abdication of Emperor Charles V in 1556. Spanish hegemony in the Netherlands was solidified following the Spanish victory in the Fall of Antwerp during the Eighty Years' War. When part of the Netherlands separated to form the autonomous Dutch Republic in 1581, the remainder of the area stayed under Spanish rule until the War of the Spanish Succession. 
A common administration of fiefs in the Low Countries, centered in the Duchy of Brabant, already existed under the rule of the Burgundian Duke Philip the Good with the implementation of a stadtholder and the first convocation of the States General of the Netherlands in 1464.[5] His granddaughter Mary had confirmed a number of privileges to the States by the Great Privilege signed in 1477.[6] After the government takeover by her husband Archduke Maximilian I of Austria, the States insisted on their privileges, culminating in a Hook rebellion in Holland and Flemish revolts. Maximilian prevailed with the support of Duke Albert III of Saxony, and his son Philip the Handsome, husband of Joanna of Castile, was able to assume rule over the Habsburg Netherlands in 1493.[7][8] Philip as well as his son and successor Charles V retained the title of Duke of Burgundy, referring to their Burgundian inheritance, despite not having the Duchy of Burgundy in their possession, since it had been taken by the French in 1477. Only the Free County of Burgundy, in the Holy Roman Empire, remained under Habsburg rule after 1493. The Habsburgs often used the term Burgundy to refer to their hereditary lands both in historical Burgundy and the Low Countries (e.g. in the name of the Imperial Burgundian Circle established in 1512), in fact until 1795, when the Austrian Netherlands were lost to the French Republic. The Governor-general of the Netherlands was responsible for the administration of the Habsburg lands in the Low Countries. Charles V was born and raised in the Low Countries and often stayed at the Palace of Coudenberg in Brussels. By the Pragmatic Sanction of 1549, Charles V declared the Seventeen Provinces a united and indivisible Habsburg dominion. 
Between 1555 and 1556, the House of Habsburg split into an Austrian and a Spanish branch as a consequence of Charles's abdications: the Netherlands were left to his son Philip II of Spain, while his brother King Ferdinand I succeeded him as Holy Roman Emperor. The Seventeen Provinces, de jure still fiefs of the Holy Roman Empire, from that time on were de facto ruled by the Spanish branch of the Habsburgs.
|
Political legitimacy. In political science, legitimacy is a concept concerning the right of an authority, usually a governing law or a regime, to rule the actions of a society.[1][2] In political systems where this is not the case, unpopular regimes survive because they are considered legitimate by a small, influential elite.[3] In Chinese political philosophy, since the historical period of the Zhou dynasty (1046–256 BC), the political legitimacy of a ruler and government was derived from the Mandate of Heaven, and unjust rulers who lost said mandate therefore lost the right to rule the people. In moral philosophy, the term legitimacy is often positively interpreted as the normative status conferred by a governed people upon their governors' institutions, offices, and actions, based upon the belief that their government's actions are appropriate uses of power by a legally constituted government.[4] The Enlightenment-era British social philosopher John Locke (1632–1704) said that political legitimacy derives from popular explicit and implicit consent of the governed: The argument of the [Second] Treatise is that the government is not legitimate unless it is carried on with the consent of the governed.[5] The German political philosopher Dolf Sternberger said that [l]egitimacy is the foundation of such governmental power as is exercised, both with a consciousness on the government's part that it has a right to govern, and with some recognition by the governed of that right.[6] The American political sociologist Seymour Martin Lipset said that legitimacy also involves the capacity of a political system to engender and maintain the belief that existing political institutions are the most appropriate and proper ones for the society.[7] The American political scientist Robert A. 
Dahl explained legitimacy as a reservoir: so long as the water is at a given level, political stability is maintained; if it falls below the required level, political legitimacy is endangered.[3] Legitimacy is a value whereby something or someone is recognized and accepted as right and proper.[8] In political science, legitimacy has traditionally been understood as the popular acceptance and recognition by the public of the authority of a governing régime, whereby authority has political power through consent and mutual understandings, not coercion. The three types of political legitimacy described by German sociologist Max Weber, in Politics as Vocation, are traditional, charismatic, and rational-legal. More recent scholarship distinguishes between multiple other types of legitimacy in an effort to draw distinctions between various approaches to the construct. These include empirical legitimacy versus normative legitimacy, instrumental versus substantive legitimacy, popular legitimacy, regulative legitimacy, and procedural legitimacy.[10][11][12] Types of legitimacy draw distinctions that account for different sources of legitimacy, different frameworks for evaluating legitimacy, or different objects of legitimacy.[13][12]
|
Novella. A novella is a narrative prose fiction whose length is shorter than most novels, but longer than most novelettes and short stories. The English word novella derives from the Italian novella, meaning a short story related to true (or apparently so) facts. The Italian term is a feminine of novello, which means new, similarly to the English word news.[1] Merriam-Webster defines a novella as a work of fiction intermediate in length and complexity between a short story and a novel.[1] There is disagreement regarding the number of pages or words necessary for a story to be considered a novella, a short story or a novel.[2] The Science Fiction and Fantasy Writers Association defines a novella's word count to be between 17,500 and 40,000 words;[3][4] at 250 words per page, this equates to 70 to 160 pages. See below for definitions used by other organisations. The novella as a literary genre began developing in the Italian literature of the early Renaissance, principally by Giovanni Boccaccio, author of The Decameron (1353).[5] The Decameron featured 100 tales (named novellas) told by ten people (seven women and three men) fleeing the Black Death, by escaping from Florence to the Fiesole hills in 1348. This structure was then imitated by subsequent authors, notably the French queen Marguerite de Navarre, whose Heptaméron (1559) included 72 original French tales and was modeled after the structure of The Decameron. The Italian genre novella grew out of a rich tradition of medieval short narrative forms. It took its first major form in the anonymous late 13th century Libro di novelle et di bel parlar gentile, known as Il Novellino, and reached its culmination with The Decameron. Followers of Boccaccio such as Giovanni Fiorentino, Franco Sacchetti, Giovanni Sercambi and Simone de Prodenzani continued the tradition into the early 15th century. The Italian novella influenced many later writers, including Shakespeare.[6]
|
Hardcover. A hardcover, hard cover, or hardback (also known as hardbound, and sometimes as casebound[1]) book is one bound with rigid protective covers (typically of binder's board or heavy paperboard covered with buckram or other cloth, heavy paper, or occasionally leather).[1] It has a flexible, sewn spine which allows the book to lie flat on a surface when opened.[1] Modern hardcovers may have the pages glued onto the spine in much the same way as paperbacks.[1] Following the ISBN sequence numbers, books of this type may be identified by the abbreviation Hbk. Hardcover books are often printed on acid-free paper, and they are much more durable than paperbacks, which have flexible, easily damaged paper covers. Hardcover books are marginally more costly to manufacture. Hardcovers are frequently protected by artistic dust jackets, but a jacketless alternative has increased in popularity: these paper-over-board or jacketless hardcover bindings forgo the dust jacket in favor of printing the cover design directly onto the board binding.[2][3] If brisk sales are anticipated, a hardcover edition of a book is typically released first, followed by a trade paperback edition (same format as hardcover) the next year. Some publishers publish paperback originals if slow hardback sales are anticipated. For very popular books these sales cycles may be extended, and followed by a mass market paperback edition typeset in a more compact size and printed on thinner, less hardy paper. This is intended, in part, to prolong the life of the immediate buying boom that occurs for some best sellers: after the attention to the book has subsided, a lower-cost paperback version is released to sell further copies. 
In the past, the paperback edition was released one year after the hardback, but by the early 21st century some publishers were releasing paperbacks six months after the hardback.[4] It is very unusual for a book that was first published in paperback to be followed by a hardback. One example is the novel The Judgment of Paris by Gore Vidal, which had its revised edition of 1961 first published in paperback, and later in hardcover.[5] Hardcover books are usually sold at higher prices than comparable paperbacks. Books for the general public are usually printed in hardback only for authors who are expected to be successful, or as a precursor to the paperback to predict sale levels; however, many academic books are often only published in hardcover editions. Hardcovers usually consist of a page block, two boards, and a cloth or heavy paper covering.[1] The pages are sewn together and glued onto a flexible spine between the boards, and it too is covered by the cloth.[1] A paper wrapper, or dust jacket, is usually put over the binding, folding over each horizontal end of the boards. Dust jackets serve to protect the underlying cover from wear. On the folded part, or flap, over the front cover is generally a blurb, or a summary of the book. The back flap is where the biography of the author can be found. Reviews are often placed on the back of the jacket. Many modern bestselling hardcover books use a partial cloth cover, with a cloth-covered board on the spine only, and only boards covering the rest of the book.
|
Paperback. A paperback (softcover, softback) book is one with a thick paper or paperboard cover, also known as wrappers, and often held together with glue rather than stitches or staples. In contrast, hardback (hardcover) books are bound with cardboard covered with cloth, leather, paper, or plastic. Inexpensive books bound in paper have existed since at least the 19th century in such forms as pamphlets, yellowbacks and dime novels.[a] Modern paperbacks can be differentiated from one another by size. In the United States, there are mass-market paperbacks and larger, more durable trade paperbacks. In the United Kingdom, there are A-format, B-format, and the largest C-format sizes.[1] Paperback editions of books are issued when a publisher decides to release a book in a low-cost format. Lower-quality paper, glued (rather than stapled or sewn) bindings, and the lack of a hard cover may contribute to the lower cost of paperbacks. In the early days of modern paperbacks, the 1930s and 1940s, they were sold as a cheaper, less permanent, and more convenient alternative to traditional hardcover books, as the name of the first American paperback publisher, Pocket Books, indicates. In addition, the Pocket Books edition of Wuthering Heights, one of the first ten books it published in 1939, emphasized the impermanence of paperbacks by telling readers: if you enjoyed it so much you may wish to own it in a more permanent edition, they could return the 25 cent book to Pocket Books with an additional 70 cents and it would send them a copy of the 95 cent Modern Library edition substantially bound in durable cloth.[2] Since the mid-20th century, paperbacks can also be the preferred medium when a book is not expected to be a major seller and the publisher wishes to release the book without a large investment. Examples include many novels and newer editions or reprintings of older books.
|
Cohabitation. Cohabitation is an arrangement where people who are not legally married live together as a couple. They are often involved in a romantic or sexually intimate relationship on a long-term or permanent basis. Such arrangements have become increasingly common in Western countries since the late 20th century, led by changing social views, especially regarding marriage. The term dates from the 16th century, being used with this meaning as early as 1530.[1] Cohabitation is a common pattern among people in the Western world. In Europe, the Scandinavian countries began this trend, although many countries have since followed.[3] Mediterranean Europe has traditionally been very conservative, with religion playing a strong role. Until the mid-1990s, cohabitation levels remained low in this region, but have since increased;[4] for example, in Portugal the majority of children have been born to unwed parents since 2015, constituting 60% of the total in 2021.[5] In the United States, over the past few decades there has been an increase in unmarried couples cohabiting.[6] Historically, Western countries have been influenced by Christian doctrine on sex, which opposes unmarried cohabitation. As social norms have changed, such beliefs have become less widely held, and some Christian denominations now view cohabitation as a precursor to marriage.[7] Pope Francis has performed the marriages of cohabiting couples who had children,[8] while former archbishop of Canterbury Rowan Williams[9] and the archbishop of York John Sentamu have expressed tolerance of cohabitation.[10]
|
Picaresque novel. The picaresque novel (Spanish: picaresca, from pícaro, for "rogue" or "rascal") is a genre of prose fiction. It depicts the adventures of a roguish but appealing hero, usually of low social class, who lives by his wits in a corrupt society.[1] Picaresque novels typically adopt the form of an episodic prose narrative[2] with a realistic style. There are often elements of comedy and satire. The picaresque genre began with the Spanish novel Lazarillo de Tormes[3] (1554), which was published anonymously during the Spanish Golden Age because of its anticlerical content. Literary works from Imperial Rome published during the 1st–2nd century AD, such as Satyricon[3] by Petronius and The Golden Ass by Apuleius, had a notable influence on the picaresque genre and are considered its predecessors. Other notable early Spanish contributors to the genre included Mateo Alemán's Guzmán de Alfarache (1599–1604) and Francisco de Quevedo's El Buscón (1626). Other ancient influences on the picaresque genre include Roman playwrights such as Plautus and Terence. The Golden Ass by Apuleius nevertheless remains, according to various scholars such as F. W. Chandler, A. Marasso, T. Somerville and T. Bodenmüller, the primary antecedent influence on the picaresque genre.[4] Subsequently, following the example of Spanish writers, the genre flourished throughout Europe for more than 200 years, and it continues to influence modern literature and fiction. According to the traditional view of Thrall and Hibbard (first published in 1936), seven qualities distinguish the picaresque novel or narrative form, all or some of which an author may employ for effect.[5] In the English-speaking world, the term picaresque is often used loosely to refer to novels that contain some elements of this genre; e.g. 
an episodic recounting of adventures on the road.[6] The term is also sometimes used to describe works that contain only some of the genre's elements, such as Miguel de Cervantes' Don Quixote (1605 and 1615) or Charles Dickens's The Pickwick Papers (1836–1837). The word pícaro first appears in Spain with its current meaning in 1545, though at the time it had no association with literature.[7] The word pícaro does not appear in Lazarillo de Tormes (1554), the novella credited by modern scholars with founding the genre. The expression picaresque novel was coined in 1810.[8][9] Whether it has any validity at all as a generic label in the Spanish sixteenth and seventeenth centuries (Cervantes certainly used picaresque with a different meaning than it has today) has been called into question. There is unresolved debate within Hispanic studies about what the term means, or meant, and which works were, or should be, so called. The only work clearly called picaresque by its contemporaries was Mateo Alemán's Guzmán de Alfarache (1599–1604), which they considered El libro del pícaro (English: The Book of the Pícaro).[10]
|
Moorabbin, Victoria. Moorabbin is a suburb in Melbourne, Victoria, Australia, 15 km south-east of Melbourne's Central Business District,[2] located within the City of Kingston local government area. Moorabbin recorded a population of 6,287 at the 2021 census.[3] Most of the eastern side of Moorabbin has been an industrial area since its first development in the mid-1960s. Major businesses with a presence in the area include Coca-Cola. Moorabbin is also well known locally for its residential area built after World War II.[4] The word Moorabbin is believed to have come from the Aboriginal word moorooboon, meaning "mother's milk", as it was purportedly a place where women and children stayed and rested while the men hunted further afield.[citation needed] In 1846, the first European settlers arrived, brothers John and Richard King, who are thought to have come from the Western Port area.[citation needed]
|
Nomarch. A nomarch (Ancient Greek: νομάρχης,[1] Ancient Egyptian: ḥrj tp ꜥꜣ Great Chief) was a provincial governor in ancient Egypt; the country was divided into 42 provinces, called nomes (singular spꜣ.t, plural spꜣ.wt). A nomarch was the government official responsible for a nome.[2] More recent studies are more cautious about using this term, as it is a Greek word that does not exactly match Ancient Egyptian administrative titles,[3] and modern scholars often prefer other, more neutral words for describing the heads of the provinces, such as governor.[4] The term nome is derived from Ancient Greek: νομός nomós province, district. Nomarch is derived from νομάρχης nomárkhēs: province + -άρχης ruler. The division of the Egyptian kingdom into nomes can be documented as far back as the reign of Djoser of the 3rd Dynasty in the early Old Kingdom, c. 2670 BCE, and potentially dates even further back to the Predynastic kingdoms of the Nile valley. The earliest topographical lists of the nomes of Upper and Lower Egypt date back to the reign of Nyuserre Ini, of the mid-5th Dynasty, from which time the nomarchs no longer lived at the royal capital but stayed in their nomes.[5] The power of the nomarchs grew with the reforms of Nyuserre's second successor, Djedkare Isesi, which effectively decentralized the Egyptian state. The post of nomarch then quickly became hereditary, thereby creating a virtual feudal system in which local allegiances slowly superseded obedience to the pharaoh. Fewer than 200 years after Djedkare's reign, the nomarchs had become the all-powerful heads of the provinces. At the dawn of the First Intermediate Period, the power of the pharaohs of the 8th Dynasty had diminished to the extent that they owed their position to the most powerful nomarchs, upon whom they could only bestow titles and honours. The power of the nomarchs remained important during the later royal revival under the impulse of the 11th Dynasty, originally a family of Theban nomarchs. 
Their power diminished during the subsequent 12th Dynasty, setting the stage for the apex of royal power during the Middle Kingdom.
|
Monarch (disambiguation). A monarch is the head of state of a monarchy, who holds the office for life or until abdication. Monarch or Monarchy may also refer to:
|
Order of Australia. The Order of Australia is an Australian honour that recognises Australian citizens and other persons for outstanding achievement and service.[2] It was established on 14 February 1975 by Elizabeth II, Queen of Australia, on the advice of then prime minister Gough Whitlam. Before the establishment of the order, Australians could receive British honours, which continued to be issued in parallel until 1992. Appointments to the order are made by the governor-general, with the approval of the Sovereign,[1][a] according to recommendations made by the Council for the Order of Australia.[4] Members of the government are not involved in the recommendation of appointments, other than for military and honorary awards. The King of Australia is the sovereign head of the order,[2][5] and the governor-general is the principal companion and chancellor of the order. The governor-general's official secretary, Gerard Martin (appointed 1 July 2024), is secretary of the order. The order is divided into a general and a military division. The five levels of appointment to the order in descending order of seniority are:
|
Royal prerogative. The royal prerogative is a body of customary authority, privilege, and immunity recognised in common law (and sometimes in civil law jurisdictions possessing a monarchy) as belonging to the sovereign, and which has become widely vested in the government.[note 1] It is the means by which some of the executive powers of government, possessed by and vested in a monarch with regard to the process of governance of the state, are carried out. In most constitutional monarchies, prerogatives can be abolished by Parliament under its legislative authority.[citation needed] In the Commonwealth realms, this draws on the constitutional statutes at the time of the Glorious Revolution, when William III and Mary II were invited to take the throne.[citation needed] In the United Kingdom, the remaining powers of the royal prerogative are devolved to the head of the government, which, for more than two centuries, has been the Prime Minister; the benefits, equally, such as ratification of treaties and mineral rights in all gold and silver ores, vest in (belong to) the government.[1][citation needed] In Britain, prerogative powers were originally exercised by the monarch acting without an observed requirement for parliamentary consent (after its empowerment in certain matters following Magna Carta). Since the accession of the House of Hanover, these powers have been exercised, with minor exceptions in economically unimportant sectors, on the advice of the prime minister or the Cabinet, who have been accountable to Parliament (and exclusively so, except in matters of the Royal Family) since at least the time of William IV.[citation needed]
|
East Slavic name. East Slavic naming customs are the traditional way of identifying a person's given name, patronymic name, and family name in East Slavic cultures in Russia and some countries formerly part of the Russian Empire and the Soviet Union. They are used commonly in Russia, Ukraine, Belarus, Moldova, Kazakhstan, Turkmenistan, Uzbekistan, and to a lesser extent in Kyrgyzstan, Tajikistan, Azerbaijan, Armenia and Georgia. East Slavic parents select a given name for a newborn child. Most first names in East Slavic languages originate from two sources. Almost all first names are single. Doubled first names (as in, for example, French, like Jean-Luc) are very rare and reflect foreign influence. Most doubled first names are written with a hyphen: Mariya-Tereza.
|
Illegitimate (film). Illegitimate (Romanian: Ilegitim) is a 2016 Romanian drama film directed by Adrian Sitaru. The film premiered at the 2016 Berlin Film Festival, where it received the C.I.C.A.E. Award.[1][2] The film also won the Golden Arena for Best Film (Pula, Croatia), the Namur Award for Best Screenplay and Best Actor (Adrian Titieni), and the Prix Sauvage - Special Mention for Best Actress (Alina Grigore) at L'Europe autour de l'Europe, Paris.[1] The film tells the story of two siblings, a brother, Romeo Anghelescu (Robi Urs), and a sister, Sasha Anghelescu (Alina Grigore), who are involved in an incestuous relationship.
|
Musical theatre. Musical theatre is a form of theatrical performance that combines songs, spoken dialogue, acting and dance. The story and emotional content of a musical – humor, pathos, love, anger – are communicated through words, music, movement and technical aspects of the entertainment as an integrated whole. Although musical theatre overlaps with other theatrical forms like opera and dance, it may be distinguished by the equal importance given to the music as compared with the dialogue, movement and other elements. Since the early 20th century, musical theatre stage works have generally been called, simply, musicals. Although music has been a part of dramatic presentations since ancient times, modern Western musical theatre emerged during the 19th century, with many structural elements established by the light opera works of Jacques Offenbach in France, Gilbert and Sullivan in Britain and the works of Harrigan and Hart in America. These were followed by Edwardian musical comedies, which emerged in Britain, and the musical theatre works of American creators like George M. Cohan at the turn of the 20th century. The Princess Theatre musicals (1915–1918) were artistic steps forward beyond the revues and other frothy entertainments of the early 20th century and led to such groundbreaking works as Show Boat (1927), Of Thee I Sing (1931) and Oklahoma! (1943). Some of the best-known musicals through the decades that followed include My Fair Lady (1956), The Fantasticks (1960), Hair (1967), A Chorus Line (1975), Les Misérables (1985), The Phantom of the Opera (1986), Rent (1996), Wicked (2003) and Hamilton (2015). Musicals are performed around the world. They may be presented in large venues, such as big-budget Broadway or West End productions in New York City or London. Alternatively, musicals may be staged in smaller venues, such as off-Broadway, off-off-Broadway, regional theatre, fringe theatre, or community theatre productions, or on tour. 
Musicals are often presented by amateur and school groups in churches, schools and other performance spaces. In addition to the United States and Britain, there are vibrant musical theatre scenes in continental Europe, Asia, Australasia, Canada and Latin America. Since the 20th century, the book musical has been defined as a musical play where songs and dances are fully integrated into a well-made story with serious dramatic goals and which is able to evoke genuine emotions other than laughter.[2][3] The three main components of a book musical are its music, lyrics and book. The book or script of a musical refers to the story, character development and dramatic structure, including the spoken dialogue and stage directions, but it can also refer to the dialogue and lyrics together, which are sometimes referred to as the libretto (Italian for "little book"). The music and lyrics together form the score of a musical and include songs, incidental music and musical scenes, which are "theatrical sequence[s] set to music, often combining song with spoken dialogue".[4] The interpretation of a musical is the responsibility of its creative team, which includes a director, a musical director, usually a choreographer and sometimes an orchestrator. A musical's production is also creatively characterized by technical aspects, such as set design, costumes, stage properties (props), lighting and sound. The creative team, designs and interpretations generally change from the original production to succeeding productions. Some production elements, however, may be retained from the original production, for example, Bob Fosse's choreography in Chicago.
|
Abdication system. The abdicational system (Chinese: 禪讓制; pinyin: Shàn ràng zhì) was a historical Chinese political system.[1] According to Chinese mythology, it was the system used by the Three Sovereigns and Five Emperors before the switch to hereditary rule in the Xia dynasty.[1] Emperor Yao abdicated and chose Emperor Shun as his successor.[2] Chinese archaeologist Feng Shi (冯時; 馮時) argues that Qi of Xia violently seized power and established a hereditary system after the death of his father Yu the Great, citing traces of violence discovered from that period.[3] The idea was most influential in the 4th century BC and declined in later periods.[4] According to Chinese mythology, following the rule of the Yellow Emperor, chieftains of different tribes in the Yellow River basin, including Yao, Shun, and Yu, came together to create a tribal alliance. Rather than engaging in warfare to establish dominance, these tribes opted for a more peaceful approach by selecting their leaders via an electoral process. This method drew its inspiration from the time-honored military democratic custom. In this system, the head of the tribal coalition was chosen through a democratic procedure involving representatives from each tribe.[5] A well-known instance of this mechanism at work is the resignation of Emperor Yao. As he aged, Yao sought to identify a successor who could carry on his legacy. He called upon the chieftains of different tribes and inquired, "Who can assume my role?" A minister recommended the skilled and righteous Shun, who was subsequently assessed and promoted to the rank of Emperor Shun.[5] This method facilitated a nonviolent transition of power, with the leader's role being transferred through dialogue and a democratic approach instead of through conflict and aggression. 
This system was widespread in ancient China, particularly during the advanced phases of primitive society.[5] A resurgence of interest in the abdication system has been sparked by the discovery of four brief texts unearthed in recent Chinese archaeological excavations.[4] Although the primary texts were lost in the burning of books and burying of scholars that took place between 213 and 212 BCE, the concept continued to be a part of political discussions throughout history.[1]
|
Monash University. Monash University (/ˈmɒnæʃ/) is a public research university based in Melbourne, Victoria, Australia. Named after World War I general Sir John Monash, it was founded in 1958 and is the second oldest university in the state. The university has a number of campuses, four of which are in Victoria (Clayton, Caulfield, Peninsula, and Parkville), one in Malaysia and another in Indonesia. Monash also owns land (3.6 hectares) in Notting Hill, opposite its Clayton campus.[15] Monash has a research and teaching centre in Prato, Italy, a graduate research school in Mumbai, India and graduate schools in Suzhou, China and Tangerang, Indonesia. Courses are also delivered at other locations, including South Africa. Monash is home to major research facilities, including the Monash Law School, the Australian Synchrotron, the Monash Science Technology Research and Innovation Precinct (STRIP), the Australian Stem Cell Centre, the Victorian College of Pharmacy, and 100 research centres[16] and 17 co-operative research centres. In 2019, its total revenue was over $2.72 billion (AUD), with external research income around $462 million.[17] In 2019, Monash enrolled over 55,000 undergraduate and over 25,000 postgraduate students.[18] It has more applicants than any other university in the state of Victoria.[19] Monash is a member of Australia's Group of Eight research universities, a member of the ASAIHL, and is the only Australian member of the M8 Alliance of Academic Health Centers, Universities and National Academies. 
Monash is one of the Australian universities ranked in the École des Mines de Paris (Mines ParisTech) ranking on the basis of the number of alumni listed among CEOs in the 500 largest worldwide companies.[20] Established by an Act of Parliament in 1958, the original campus was in the suburb of Clayton, where the university was granted an expansive site of 100 hectares of open land.[21] The 100 hectares of land consisted of farmland and included the former Talbot Epileptic Colony.[22] The Tudor-style farmhouse built by the O'Shea family became the original Vice-Chancellor's House, now University House.[23][24][25][26]
|
Absolute monarchy. Absolute monarchy[1][2] is a form of monarchy in which the sovereign is the sole source of political power, unconstrained by constitutions, legislatures or other checks on their authority.[3] Throughout history, there have been many examples of absolute monarchs, with some famous examples including Louis XIV of France and Frederick the Great.[4][5] Absolute monarchies include Brunei, Eswatini,[6] Oman,[7] Saudi Arabia,[8] Vatican City,[9] and the individual emirates composing the United Arab Emirates, which itself is a federation of such monarchies, a federal monarchy.[10][11] Though absolute monarchies are sometimes supported by legal documents (such as the King's Law of Denmark-Norway), they are distinct from constitutional monarchies, in which the authority of the monarch is restricted (e.g. by legislature or unwritten customs) or balanced by that of other officials, such as a prime minister, as in the case of the United Kingdom or the Nordic countries.[3] Absolute monarchies are similar to but should not be confused with hereditary dictatorships such as North Korea or Baathist Syria. In the Ottoman Empire, the Sultan wielded absolute power over the state and was considered a Padishah, meaning "Great King", by his people. Many sultans wielded absolute power through heavenly mandates reflected in their titles, such as "Shadow of God on Earth". In ancient Mesopotamia, many rulers of Assyria, Babylonia and Sumer were absolute monarchs as well. Throughout Imperial China, many emperors and one empress (Wu Zetian) wielded absolute power through the Mandate of Heaven. In pre-Columbian America, the Inca Empire was ruled by a Sapa Inca, who was considered the son of Inti, the sun god, and absolute ruler over the people and nation. Korea under the Joseon dynasty[12] and the short-lived empire was also an absolute monarchy.
|
Dame. Dame is a traditionally British honorific title given to women who have been admitted to certain orders of chivalry. It is the female equivalent of Sir, the title used by knights.[1] Baronetesses in their own right also use the title Dame.[citation needed] A woman appointed to the grades of Dame Commander or Dame Grand Cross of the Order of Saint John,[2] the Order of the Holy Sepulchre,[3] the Order of the Bath, the Order of Saint Michael and Saint George, the Royal Victorian Order, or the Order of the British Empire becomes a dame.[4] A Central European order in which female members receive the rank of Dame is the Order of Saint George.[5] Since there is no female equivalent to a Knight Bachelor, women are always appointed to an order of chivalry.[6] Women who are appointed to the Order of the Garter or the Order of the Thistle are given the title of Lady rather than Dame.[7] Women receive all their honours in the same fashion as men receiving decorations or medals, even when receiving a damehood, so there is no female equivalent of the word "knighted". The Order of the Ermine, founded in France by John V, Duke of Brittany, in 1381, was the first order of chivalry to accept women. However, female knights existed for centuries in many places in the world prior to this.[8] Like their male counterparts, they were distinguished by the flying of coloured banners and generally bore a coat of arms. One woman who participated in tournaments was Joane Agnes Hotot (born 1378), but she was not the only one.[9][10] Additionally, women adopted certain forms of regalia which became closely associated with the status of knighthood.[11]
|
Big Bad Wolf. The Big Bad Wolf is a fictional wolf appearing in several cautionary tales, including some of Grimms' Fairy Tales. Versions of this character have appeared in numerous works, and it has become a generic archetype of a menacing predatory antagonist. Little Red Riding Hood, The Three Little Pigs, The Wolf and the Seven Young Kids, The Boy Who Cried Wolf and the Russian tale Peter and the Wolf reflect the theme of the ravening wolf and of the creature released unharmed from its belly, but the general theme of restoration is very old. The dialogue between the wolf and Little Red Riding Hood has its analogies in the Norse Þrymskviða from the Elder Edda; the giant Þrymr had stolen Mjölner, Thor's hammer, and demanded Freyja as his bride for its return. Instead, the gods dressed Thor as a bride and sent him. When the giants note Thor's unladylike eyes, eating, and drinking, Loki explains them as Freyja not having slept, or eaten, or drunk, out of longing for the wedding.[1] 19th-century folklorists and cultural anthropologists such as P. Saintyves and Edward Burnett Tylor saw Little Red Riding Hood in terms of solar myths and other naturally occurring cycles, stating that the wolf represents the night swallowing the sun, and the variations in which Little Red Riding Hood is cut out of the wolf's belly represent the dawn.[2] In this interpretation, there is a connection between the wolf of this tale and Skoll or Fenrir, the wolf in Norse mythology that will swallow the sun at Ragnarök.[3] Ethologist Dr. Valerius Geist of the University of Calgary, Alberta, wrote that the fable was likely based on the genuine risk of wolf attacks at the time. He argues that wolves are in fact dangerous predators, and fables served as a valid warning not to enter forests where wolves were known to live, and to be on the lookout for them. Both wolves and wilderness were treated as enemies of humanity in that region and time.[4]
|
Little Red Riding Hood. Little Red Riding Hood (French: Le Petit Chaperon Rouge) is a fairy tale by Charles Perrault about a young girl and a Big Bad Wolf.[4][5] Its origins can be traced back to several pre-17th-century European folk tales. It was later retold in the 19th century by the Brothers Grimm. The story has varied considerably in different versions and translations over the centuries, and has been the subject of numerous modern adaptations. Other names for the story are "Little Red Cap" or simply "Red Riding Hood". It is number 333 in the Aarne–Thompson classification system for folktales.[6] The story revolves around a girl named Little Red Riding Hood, named after the red hooded cape that she wears. The girl walks through the woods to deliver food to her sickly grandmother (wine and cake depending on the translation). A stalking wolf wants to eat the girl and the food in the basket. After he inquires as to where she is going, he suggests that she pick some flowers as a present for her grandmother. While she goes in search of flowers, he goes to the grandmother's house and gains entry by pretending to be Riding Hood. He swallows the grandmother whole, climbs into her bed, and waits for the girl, disguised as the grandmother.
|
Japan black. Japan black (also called black japan and bicycle paint[1]) is a lacquer or varnish suitable for many substrates but known especially for its use on iron and steel. It can also be called japan lacquer and Brunswick black. Its name comes from the association between the finish and Japanese products in the West.[2] Used as a verb, "japan" means to finish in japan black. Thus "japanning" and "japanned" are terms describing the process and its products. Its high bitumen content provides a protective finish that is durable and dries quickly. This allowed japan black to be used extensively in the production of automobiles in the early 20th century in the United States. Japan black consists mostly of an asphaltic base dissolved in naphtha or turpentine, sometimes with other varnish ingredients, such as linseed oil. It is applied directly to metal parts, and then baked at about 200°C (400°F) for up to an hour.[3] Japan black's popularity was due in part to its durability as an automotive finish; however, it was the ability of japan black to dry quickly that made it a favorite of early mass-produced automobiles such as Henry Ford's Model T.[4] While other colors were available for automotive finishes, early colored variants of automotive lacquers could take up to 14 days to cure, whereas japan black would cure in 48 hours or less. Thus, variously colored pre-1925 car bodies were usually consigned to special orders, or custom-bodied luxury automobiles.
|
Patronymic. A patronymic, or patronym, is a component of a personal name based on the given name of one's father, grandfather (more specifically an avonymic),[1][2] or an earlier male ancestor. It is the male equivalent of a matronymic. Patronymics are used, by custom or official policy, in many countries worldwide, although elsewhere their use has been replaced by or transformed into patronymic surnames. Examples of such transformations include common English surnames such as Johnson (son of John). The usual noun and adjective in English is patronymic, but as a noun this exists in free variation alongside patronym.[a] The first part of the word patronym comes from Greek πατήρ patēr "father" (GEN πατρός patros, whence the combining form πατρο- patro-);[3] the second part comes from Greek ὄνυμα onyma, a variant form of ὄνομα onoma "name".[4] In the form patronymic, this stands with the addition of the suffix -ικός (-ikos), which was originally used to form adjectives with the sense "pertaining to" (thus "pertaining to the father's name"). These forms are attested in Hellenistic Greek as πατρώνυμος (patrōnymos) and πατρωνυμικός (patrōnymikos).[5] The form patronym, first attested in English in 1834, was borrowed into English from French patronyme, which had previously borrowed the word directly from Greek. Patronymic, first attested in English in 1612, has a more complex history. Both Greek words had entered Latin, and, from Latin, French. The English form patronymic was borrowed through the mutual influence of French and Latin on English.[6] In many areas around the world, patronyms predate the use of family names. Family names in many Celtic, Germanic, Iberian, Georgian, Armenian and Slavic languages originate from patronyms, e.g. 
Wilson (son of William), FitzGerald (son of Gerald), Powell (from ap Hywel), Fernández (son of Fernando), Rodríguez (son of Rodrigo), Andersson or Andersen (son of Anders, Scandinavian form of Andrew), Carlsen (son of Carl), Ilyin (of Ilya), Petrov (of Peter), Grigorovich (son of Grigory, Russian form of Gregory), Stefanović (son of Stefan, little Stefan), MacAllister (from mac Alistair, meaning son of Alistair, anglicized Scottish form of Alexander) and O'Conor (from Ó Conchobhair, meaning grandson/descendant of Conchobhar). Other cultures which formerly used patronyms have switched to the more widespread style of passing the father's last name to the children (and wife) as their own. In Iceland, family names are unusual; Icelandic law favours the use of patronyms (and more recently, matronyms) over family names. Traditionally Muslim and non-Arabic-speaking African peoples, such as the Hausa and Fulani, usually (with some exceptions) follow the Arab naming pattern.[7] The word or phrase meaning "son of" is, however, omitted. As such, Mohamed son of Ibrahim son of Ahmed is Mohamed Ibrahim Ahmed, and Mohamed Ibrahim Ahmed's son Ali is Ali Mohamed Ibrahim.
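The chaining rule described above, where a full name is the bearer's given name followed by the given names of the father and grandfather with "son of" omitted, can be sketched in a few lines of Python. The `Person` class and `full_name` helper are illustrative inventions, not part of any real naming system or library; the names come from the article's own example.

```python
class Person:
    """A person with a given name and an optional link to their father."""

    def __init__(self, given, father=None):
        self.given = given
        self.father = father

    def full_name(self, generations=2):
        # Start with the bearer's own given name, then walk up the
        # paternal line, appending each ancestor's given name,
        # up to the requested number of generations.
        parts = [self.given]
        ancestor = self.father
        while ancestor is not None and len(parts) < generations + 1:
            parts.append(ancestor.given)
            ancestor = ancestor.father
        return " ".join(parts)


# The article's example: Mohamed son of Ibrahim son of Ahmed.
ahmed = Person("Ahmed")
ibrahim = Person("Ibrahim", father=ahmed)
mohamed = Person("Mohamed", father=ibrahim)
ali = Person("Ali", father=mohamed)

print(mohamed.full_name())  # Mohamed Ibrahim Ahmed
print(ali.full_name())      # Ali Mohamed Ibrahim
```

Note how the name shifts with each generation: Ali's name drops "Ahmed" because only two ancestral generations are carried, exactly as in the article's example.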
|
Vagrancy (biology). Vagrancy is a phenomenon in biology whereby an individual animal (usually a bird) appears well outside its normal range;[1] such individuals are known as vagrants. The term accidental is sometimes also used. There are a number of poorly understood factors which might cause an animal to become a vagrant, including internal causes such as navigation errors (endogenous vagrancy) and external causes such as severe weather (exogenous vagrancy).[2] Vagrancy events may lead to colonisation and eventually to speciation.[3] In the Northern Hemisphere, adult birds (possibly inexperienced younger adults) of many species are known to continue past their normal breeding range during their spring migration and end up in areas further north (such birds are termed spring overshoots).[4] In autumn, some young birds, instead of heading to their usual wintering grounds, take incorrect courses and migrate through areas which are not on their normal migration path. For example, Siberian passerines which normally winter in Southeast Asia are commonly found in Northwest Europe, e.g. Arctic warblers in Britain.[5] This is reverse migration, where the birds migrate in the opposite direction to that expected (say, flying north-west instead of south-east). The causes of this are unknown, but a genetic mutation or other anomaly relating to the bird's magnetic sense is suspected.[6] Other birds are sent off course by storms and high winds, such as some North American birds blown across the Atlantic Ocean to Europe. Birds can also be blown out to sea, become physically exhausted, land on a ship and end up being carried to the ship's destination. While many vagrant birds do not survive, if sufficient numbers wander to a new area they can establish new populations. Many isolated oceanic islands are home to species that are descended from landbirds blown out to sea, Hawaiian honeycreepers and Darwin's finches being prominent examples.
|
Fairy tale (disambiguation). A fairy tale is a story featuring folkloric characters. Fairy Tale(s), Faerie Tale(s), Faery Tale(s), or Fairytale(s) may also refer to:
|
Japan, Missouri. Japan (pronounced /ˈdʒeɪˌpæn/ JAY-pan or /ˈdʒeɪpən/ JAY-pun[1]) is an unincorporated community in southwest Franklin County, in the U.S. state of Missouri.[2] The community is located on Missouri Route AE, 7.5 miles west-northwest of Sullivan.[3] A post office called Japan was established in 1860 and remained in operation until 1908.[4] The community was named after a local Roman Catholic church, the Church of the Holy Martyrs of Japan.[1] The name was almost changed in the aftermath of the attack on Pearl Harbor due to anti-Japanese sentiment in the United States.[5] Coordinates: 38°14′21″N 91°18′22″W[1]
|
Japan, Montenegro. Japan (Cyrillic: Јапан) is a hamlet in the municipality of Andrijevica, Montenegro. The village is roughly 6 km from the border with Albania. The origin of the village's name is disputed; one source bases it on a legend from the time of the Kingdom of Montenegro, in which a village wise man and flag-bearer (barjaktar) by the name of Samilo Fatić was a prominent fighter in the war against the Austro-Hungarian Empire during World War I. Afterwards, at a ceremony where he was awarded a medal by King Nikola I, the King asked him about his aspirations, and he responded that his greatest wish would be to bring water to the village from the source of Biruljak. When the King asked where this village was, he responded, "far away in the unseen, far away as Japan".[1] Due to the increasingly difficult conditions during the winter, some of the inhabitants have chosen to leave the village and settle elsewhere; the village lies at the foothills of Mount Komovi, which has been a protected national park since 2018. The village came to the attention of the Japanese public and was visited by Japanese journalists who, beyond its unusual name, were struck by the natural beauty Montenegro has at its disposal.[2]
|
Carl Larsson. Carl Olof Larsson (Swedish pronunciation: [ˈkɑːɭ ˈlɑ̌ːʂɔn]; 28 May 1853 – 22 January 1919) was a Swedish painter representative of the Arts and Crafts movement. His many paintings include oils, watercolors, and frescoes. He is principally known for his watercolors of idyllic family life. He considered his finest work to be Midvinterblot (Midwinter Sacrifice), a large painting now displayed inside the Swedish National Museum of Fine Arts.[2][3] Larsson was born on 28 May 1853, in the Gamla stan neighborhood of Stockholm, Sweden.[1] His parents were extremely poor, and his childhood was not happy. Renate Puvogel, in her book Carl Larsson (Cologne: Taschen; 1994), gives detailed information about Larsson's life: his mother was thrown out of the house, together with Carl and his brother Johan; after enduring a series of temporary dwellings, the family moved into Grev Magnigränd No. 7 (later No. 5) in what was then Ladugårdsplan, present-day Östermalm.[4] "As a rule, each room was home to three families; penury, filth and vice thrived there, leisurely seethed and smouldered, eaten-away and rotten bodies and souls. Such an environment is the natural breeding ground for cholera," he wrote in his autobiographical novel Jag.[5] Larsson's father worked as a casual laborer, sailed as a stoker on a ship headed for Scandinavia, and lost the lease to a nearby mill, only to work there later as a mere grain carrier. Larsson portrays him as a loveless man lacking self-control; he drank, ranted and raved, and incurred the lifelong anger of his son after an outburst in which he declared, "I curse the day you were born". In contrast, Carl's mother worked long hours as a laundress to provide for her family.[4]
|
Vagrant (disambiguation). A vagrant is a person who lives without a home or regular employment and wanders from place to place. Vagrant or vagrancy may also refer to:
|
Japanese lacquerware. Japanese lacquerware (日本漆器, shikki) is a Japanese craft spanning a wide range of fine and decorative arts, as lacquer has been used in urushi-e, prints, and on a wide variety of objects from Buddha statues to bento boxes for food. A distinctive characteristic of Japanese lacquerware is the diversity of its decoration, notably the technique called maki-e (蒔絵), in which metal powder is sprinkled onto the lacquer so that it adheres. The invention of various maki-e techniques over the course of Japanese history expanded artistic expression, and tools and works of art such as inro are highly decorative.[2] A number of terms are used in Japanese to refer to lacquerware. Shikki (漆器) means "lacquer ware" in the most literal sense, while nurimono (塗物) means "coated things", and urushi-nuri (漆塗) means "lacquer coating".[3] Terms related to lacquer or lacquerware, such as "japanning", "urushiol" and maque (which means lacquer in Mexican Spanish), are derived from Japanese lacquerware.[4][5] Radiocarbon dating of a lacquer tree found at the Torihama shell mound has confirmed that the lacquer tree existed in Japan as early as 12,600 years ago, in the incipient Jōmon period; as of 2011, it is the oldest lacquer tree in the world yet found.[7] Lacquer was used in Japan as early as 7000 BCE, during the Jōmon period. Evidence for the earliest lacquerware was discovered at the Kakinoshima B excavation site in Hokkaido: ornaments woven with lacquered red thread were found in a pit grave dating from the first half of the Initial Jōmon period. At the Kakinoshima A excavation site, earthenware with a spout painted with vermilion lacquer, made 3,200 years ago, was found almost completely intact.[8][9][7]
|
Order of the British Empire. The Most Excellent Order of the British Empire is a British order of chivalry, rewarding valuable service in a wide range of useful activities.[2] It comprises five classes of awards across both civil and military divisions, the most senior two of which make the recipient either a knight if male or a dame if female.[3] There is also the related British Empire Medal, whose recipients are affiliated with the order, but are not members of it. The order was established on 4 June 1917 by King George V, who created the order to recognise "such persons, male or female, as may have rendered or shall hereafter render important services to Our Empire".[3] Equal recognition was to be given for services rendered in the UK and overseas.[4] Today, the majority of recipients are UK citizens, though a number of Commonwealth realms outside the UK continue to make appointments to the order.[5] Honorary awards may be made to citizens of other nations of which the order's sovereign is not the head of state. The five classes of appointment to the Order are, from highest grade to lowest grade: Knight or Dame Grand Cross (GBE), Knight or Dame Commander (KBE or DBE), Commander (CBE), Officer (OBE) and Member (MBE). The senior two ranks of Knight or Dame Grand Cross and Knight or Dame Commander entitle their members to use the titles Sir for men and Dame for women before their forenames, except with honorary awards.[6]
|
Vagabond (disambiguation). A vagabond is a person who wanders from place to place without a permanent home or regular work. (The) Vagabond or Vagabonds may also refer to:
|
Melbourne. Melbourne (/ˈmɛlbərn/ MEL-bərn,[note 1] locally [ˈmæɫbən]; Boonwurrung/Woiwurrung: Narrm or Naarm[9][10]) is the capital and most populous city of the Australian state of Victoria, and the second most-populous city in Australia, after Sydney.[1] The city's name generally refers to a 9,993 km2 (3,858 sq mi) area,[2] comprising an urban agglomeration of 31 local government areas.[11] The name is also used to specifically refer to the local government area named City of Melbourne, whose area is centred on the Melbourne central business district and some immediate surrounds. The city occupies much of the northern and eastern coastlines of Port Phillip Bay. As of 2024, the population of the city was 5.35 million, or 19% of the population of Australia;[1] inhabitants are referred to as Melburnians. The area of Melbourne has been home to Aboriginal Victorians for over 40,000 years and serves as an important meeting place for local Kulin nation clans.[12][13] Of the five peoples of the Kulin nation, the traditional custodians of the land encompassing Melbourne are the Boonwurrung, Woiwurrung and the Wurundjeri peoples. In 1803, a short-lived British penal settlement was established at Port Phillip, then part of the Colony of New South Wales.
Melbourne was founded in 1835 with the arrival of free settlers from Van Diemen's Land (modern-day Tasmania).[12] It was incorporated as a Crown settlement in 1837, and named after the then-Prime Minister of the United Kingdom, William Lamb, 2nd Viscount Melbourne.[12] Declared a city by Queen Victoria in 1847, it became the capital of the newly separated Colony of Victoria in 1851.[14] During the 1850s Victorian gold rush, the city entered a lengthy boom period that, by the late 1880s, had transformed it into Australia's largest metropolis, and one of the world's largest and wealthiest.[15][16] After the federation of Australia in 1901, Melbourne served as the interim seat of government of the new nation until Canberra became the permanent capital in 1927.[17] Today, Melbourne is culturally diverse and, among world cities, has the fourth-largest foreign-born population. It is a leading financial centre in the Asia-Pacific region, ranking 28th globally in the 2024 Global Financial Centres Index.[18] The city's eclectic architecture blends Victorian-era structures, such as the World Heritage-listed Royal Exhibition Building, with one of the world's tallest skylines. Additional landmarks include the Melbourne Cricket Ground and the National Gallery of Victoria. Noted for its cultural heritage, the city gave rise to Australian rules football, Australian impressionism and Australian cinema, and is known for its street art, live music and theatre scenes. It hosts major annual sporting events, such as the Australian Grand Prix and the Australian Open, and also hosted the 1956 Summer Olympics. Melbourne ranked as the world's most livable city for much of the 2010s.[19]
|
The Blind Girl. The Blind Girl is an oil-on-canvas painting by John Everett Millais, from 1856. It depicts two itinerant beggars, presumed to be sisters, one of whom is a blind musician, her concertina on her lap. They are resting by the roadside after a rainstorm, before travelling to the town of Winchelsea, visible in the background.[1] The painting has been interpreted as an allegory of the senses, contrasting the experiences of the blind and sighted sisters.[2] The former feels the warmth of the sun on her face and fondles a blade of grass, while the latter shields her eyes from the sun or rain and looks at a double rainbow that has just appeared. Some critics have interpreted the rainbow in Biblical terms, as the sign of God's covenant described in Genesis 9:16.[3] When the painting was first exhibited in 1856, it was pointed out to Millais that in double rainbows the secondary rainbow inverts the order of the colours. Millais had originally painted the colours in the same order in both rainbows; he altered it for scientific accuracy.[4] A tortoiseshell butterfly rests on the blind girl's shawl, implying that she is holding herself extremely still. The sign around her neck reads "Pity the Blind".
|
John Everett Millais. Sir John Everett Millais, 1st Baronet PRA (UK: /ˈmɪleɪ/ MIL-ay, US: /mɪˈleɪ/ mil-AY;[1][2] 8 June 1829 – 13 August 1896) was an English painter and illustrator who was one of the founders of the Pre-Raphaelite Brotherhood.[3] He was a child prodigy who, aged eleven, became the youngest student to enter the Royal Academy Schools. The Pre-Raphaelite Brotherhood was founded at his family home in London, at 83 Gower Street (now number 7). Millais became the most famous exponent of the style, his painting Christ in the House of His Parents (1849–50) generating considerable controversy, and he produced a picture that could serve as the embodiment of the historical and naturalist focus of the group, Ophelia, in 1851–52. By the mid-1850s, Millais was moving away from the Pre-Raphaelite style to develop a new form of realism in his art. His later works were enormously successful, making Millais one of the wealthiest artists of his day, but some former admirers, including William Morris, saw this as a sell-out (Millais notoriously allowed one of his paintings to be used for a sentimental soap advertisement). While these and early 20th-century critics, reading art through the lens of Modernism, viewed much of his later production as wanting, this perspective has changed in recent decades, as his later works have come to be seen in the context of wider changes and advanced tendencies in the broader late nineteenth-century art world, and can now be seen as predictive of the art world of the present. Millais's personal life has also played a significant role in his reputation. His wife Effie was formerly married to the critic John Ruskin, who had supported Millais's early work. The annulment of the Ruskin marriage and Effie's subsequent marriage to Millais have sometimes been linked to his change of style, but she became a powerful promoter of his work and they worked in concert to secure commissions and expand their social and intellectual circles.
Millais was born in Southampton, England, in 1829, of a prominent Jersey-based family. His parents were John William Millais and Emily Mary Millais (née Evermy). Most of his early childhood was spent in Jersey, to which he retained a strong devotion throughout his life. The author Thackeray once asked him when England conquered Jersey. Millais replied, "Never! Jersey conquered England."[4] The family moved to Dinan in Brittany for a few years in his childhood.
|
Folk music (disambiguation). Folk music is a genre of music. Folk music may also refer to:
|
Empire of Japan. The Empire of Japan,[c] also known as the Japanese Empire or Imperial Japan, was the Japanese nation state[d] that existed from the Meiji Restoration on January 3, 1868, until the Constitution of Japan took effect on May 3, 1947.[8] From 1910 to 1945, it included the Japanese archipelago, the Kurils, Karafuto, Korea, and Taiwan. The South Seas Mandate and concessions such as the Kwantung Leased Territory were de jure not internal parts of the empire but dependent territories. In the closing stages of World War II, with Japan defeated alongside the rest of the Axis powers, the formalized surrender was issued on September 2, 1945, in compliance with the Potsdam Declaration of the Allies, and the empire's territory subsequently shrank to cover only the Japanese archipelago, resembling modern Japan. Under the slogans of "Enrich the Country, Strengthen the Armed Forces"[e] and "Promote Industry",[f] which followed the Boshin War and the restoration of power to the emperor from the shogun, Japan underwent a period of large-scale industrialization and militarization, often regarded as the fastest modernization of any country to date. All of these aspects contributed to Japan's emergence as a great power following the First Sino-Japanese War, the Boxer Rebellion, the Russo-Japanese War, and World War I. Economic and political turmoil in the 1920s, including the Great Depression, led to the rise of militarism, nationalism, statism and authoritarianism, during which Japan joined the Axis alliance with Nazi Germany and Fascist Italy, conquering a large part of the Asia–Pacific;[15] during this period, the Imperial Japanese Army (IJA) and the Imperial Japanese Navy (IJN) committed numerous atrocities and war crimes, including the Nanjing Massacre.[16][17][18][19][20] There has been debate over defining the political system of Japan as a dictatorship, a label refuted by some due to the absence of a dictator,[21] and over calling it fascist.
The other suggested terms were para-fascism,[22] militarism, corporatism, totalitarianism,[23] and police state.[24] The Imperial Japanese Armed Forces initially achieved large-scale military successes during the Second Sino-Japanese War and the Pacific War. However, from 1942 onwards, and particularly after decisive Allied advances at Midway Atoll and Guadalcanal, Japan was forced to adopt a defensive stance against the United States. The American-led island-hopping campaign led to the eventual loss of many of Japan's Oceanian island possessions in the following three years. Eventually, the American military captured Iwo Jima and Okinawa Island, leaving the Japanese mainland unprotected and without a significant naval defense force. By August 1945, plans had been made for an Allied invasion of mainland Japan, but they were shelved after Japan surrendered in the face of a major breakthrough by the Western Allies and the Soviet Union, with the atomic bombings of Hiroshima and Nagasaki and the Soviet invasion of Manchuria. The Pacific War officially came to an end on September 2, 1945, leading to the beginning of the Allied occupation of Japan, during which United States military leader Douglas MacArthur administered the country. In 1947, through Allied efforts, a new constitution of Japan was enacted, officially ending the Japanese Empire and forming modern Japan. During this time, the Imperial Japanese Armed Forces were dissolved; they were replaced by the current Japan Self-Defense Forces in 1954. Reconstruction under the Allied occupation continued until 1952, consolidating the modern Japanese constitutional monarchy. In total, the Empire of Japan had three emperors: Meiji, Taishō, and Shōwa. The Imperial era came to an end partway through Shōwa's reign, and he remained emperor until 1989.
|
Traditional music (disambiguation). Traditional music refers to any music reproduced and shared through musical traditions and may equally refer to:
|
Folksinger (disambiguation). A folk singer or folksinger is a person who sings traditional or contemporary folk music. Folksinger may also refer to:
|
Parkville, Victoria. Parkville is an inner-city suburb in Melbourne, Victoria, Australia, 3 km (1.9 mi) north of Melbourne's central business district, located within the Cities of Melbourne and Merri-bek local government areas.[2] Parkville recorded a population of 7,074 at the 2021 census.[1] Parkville is bordered by North Melbourne to the south-west, Carlton and Carlton North to the south and east, Brunswick to the north (where a part of Parkville lies within the City of Merri-bek), and Flemington to the west. The suburb includes the postcodes 3052 and 3010 (University). The suburb encompasses Royal Park, an expansive parkland which is notable as home to the Royal Melbourne Zoological Gardens. Parkville was also the location of the athletes' village for the 2006 Commonwealth Games. Parkville is a major education, research and healthcare precinct and home to the University of Melbourne, Monash University's Pharmacy faculty, the Royal Melbourne Hospital, the Royal Women's Hospital, the Royal Children's Hospital, the Victorian Comprehensive Cancer Centre and CSL.
|
Béla Bartók. Béla Viktor János Bartók (/ˈbeɪlə ˈbɑːrtɒk/; Hungarian: [ˈbɒrtoːk ˈbeːlɒ]; 25 March 1881 – 26 September 1945) was a Hungarian composer, pianist and ethnomusicologist. He is considered one of the most important composers of the 20th century; he and Franz Liszt are regarded as Hungary's greatest composers.[1] Among his notable works are the opera Bluebeard's Castle, the ballet The Miraculous Mandarin, Music for Strings, Percussion and Celesta, the Concerto for Orchestra and six string quartets. Through his collection and analytical study of folk music, he was one of the founders of comparative musicology, which later became known as ethnomusicology. Per Anthony Tommasini, Bartók "has empowered generations of subsequent composers to incorporate folk music and classical traditions from whatever culture into their works" and was "a formidable modernist who in the face of Schoenberg's breathtaking formulations showed another way, forging a language that was an amalgam of tonality, unorthodox scales and atonal wanderings".[2] Bartók was born in the Banatian town of Nagyszentmiklós in the Kingdom of Hungary (present-day Sânnicolau Mare, Romania) on 25 March 1881.[3] On his father's side, the Bartók family was a Hungarian lower noble family, originating from Borsodszirák, Borsod.[4] His paternal grandmother was a Catholic of Bunjevci origin, but considered herself Hungarian.[5] Bartók's father (1855–1888) was also named Béla. Bartók's mother, Paula (née Voit) (1857–1939), spoke[6] Hungarian fluently.[7] A native of Turócszentmárton (present-day Martin, Slovakia),[8] she had German, Hungarian and Slovak or Polish ancestry. Béla displayed notable musical talent very early in life. According to his mother, he could distinguish between different dance rhythms that she played on the piano before he learned to speak in complete sentences.[9] By the age of four he was able to play 40 pieces on the piano, and his mother began formally teaching him the next year.
In 1888, when he was seven, his father, the director of an agricultural school, died suddenly. His mother then took Béla and his sister, Erzsébet, to live in Nagyszőlős (present-day Vynohradiv, Ukraine) and then in Pressburg (present-day Bratislava, Slovakia). Béla gave his first public recital aged 11 in Nagyszőlős, to positive critical reception.[10][page needed] Among the pieces he played was his own first composition, written two years previously: a short piece called The Course of the Danube.[11] Shortly thereafter, László Erkel accepted him as a pupil.[12]
|
Carlton, Victoria. Carlton is an inner-city suburb in Melbourne, Victoria, Australia, three kilometres north of the Melbourne central business district, within the City of Melbourne local government area. Carlton recorded a population of 16,055 at the 2021 census.[2] Immediately adjoining the CBD, Carlton is known nationwide for its Little Italy precinct centred on Lygon Street, for its preponderance of 19th-century Victorian architecture and for its garden squares, including the Carlton Gardens, the location of the Royal Exhibition Building, one of Australia's few man-made sites with World Heritage status. Due to its proximity to the University of Melbourne, the CBD campus of RMIT University and the Fitzroy campus of Australian Catholic University, Carlton is also home to one of the highest concentrations of university students in Australia. Carlton was founded in 1851, at the beginning of the Victorian gold rush, with the Carlton Post Office opening on 19 October 1865.[3] The suburb was named after Carlton House, the Westminster residence of King George IV.[4]
|
Tianming (disambiguation). Tianming may refer to:
|
Jack Charles (disambiguation). Jack Charles (1943–2022) was an Australian actor and activist. Jack Charles may also refer to:
|
Nindethana Theatre. Nindethana Theatre was Australia's first Aboriginal theatre company, founded in Melbourne in 1971, with its last performance in Adelaide in 1974. The theatre company was formed after the Australia Council for the Arts asked Jack Charles to form a group of Aboriginal actors. The initial cohort consisted of seven young people from Aboriginal hostels in Melbourne, four of whom had never acted before.[1] Nindethana was established by Charles and Bob Maza at the Pram Factory in Melbourne in 1971,[2][3][4][5] with help from New Zealand-born playwright, theatre director, and actor John Smythe and others.[6][7] Its stated objective was "the performance, encouragement and promotion of Aboriginal drama, music, art, literature, film production and other such cultural activities in the community".[2] It was the first Aboriginal theatre group in the country.[1] The first production planned was Rocket Range, by Jim Crawford, but it did not get staged due to production difficulties and lack of trained staff.[8] The company's earliest staged production was The Cherry Pickers, in August 1971,[9] written by Kevin Gilbert and recognised as the first Aboriginal play.[10] In 1972, the theatre staged a performance called Jack Charles Is Up and Fighting.[11]
|
Kingdom of Hungary. The Kingdom of Hungary was a monarchy in Central Europe that existed for nearly a millennium, from 1000 to 1946 and was a key part of the Habsburg monarchy from 1526-1918. The Catholic kingdom emerged as a continuation of the Grand Principality of Hungary upon the coronation of the first king Stephen I at Esztergom around the year 1000;[8] his family (the Árpád dynasty) led the monarchy for 300 years. By the 12th century, the kingdom had become a European power.[8] Due to the Ottoman occupation of the central and southern territories of Hungary in the 16th century, the country was partitioned into three parts: the Habsburg Royal Hungary, Ottoman Hungary, and the semi-independent Principality of Transylvania.[8] The House of Habsburg held the Hungarian throne after the Battle of Mohács in 1526 continuously until 1918 and also played a key role in the wars against the Ottoman Empire and the eventual expulsion of the Turks during and after the Great Turkish War. The Hungarians fought many wars of independence against the Habsburgs, including in 1604–06, 1664–71, 1680–85, 1703–11, and 1848–49. From 1867, territories connected to the Hungarian crown were incorporated into Austria-Hungary under the name of Lands of the Crown of Saint Stephen. The monarchy ended with the deposition of the last king Charles IV in 1918, after which Hungary became a republic. The kingdom was nominally restored during the Regency of 1920–46, ending under the Soviet occupation in 1946.[8] The Kingdom of Hungary was a multiethnic[9] state from its inception[10] until the Treaty of Trianon and it covered what is today Hungary, Slovakia, Transylvania and other parts of Romania, Carpathian Ruthenia (now part of Ukraine), Vojvodina (now part of Serbia), the territory of Burgenland (now part of Austria), Međimurje (now part of Croatia), Prekmurje (now part of Slovenia) and a few villages which are now part of Poland. 
From 1102, it also included the Kingdom of Croatia, being in personal union with it, united under the King of Hungary.
|
Australian Aboriginal elder. An Australian Aboriginal elder, in the context of Aboriginal and Torres Strait Islander culture, is defined as someone who has gained recognition as a custodian of knowledge and lore, and who has permission to disclose knowledge and beliefs.[1][2] They may be male or female, and of any age, but must be trusted and respected by their community[3] for their wisdom, cultural knowledge and community service.[4] Elders provide support for their communities in the form of guidance, counselling and knowledge, which help tackle problems of health, education, unemployment and racism,[5][6] particularly for younger people. They may be distinguished as one of two types: community elders and traditional elders.[3] Elders play an important role in the maintenance of culture, songs, oral histories, sacred stories, Aboriginal Australian languages[7] and dance, and are also educators who demonstrate leadership and skills in resolving conflicts. Elders also preside over ceremonies and other spiritual practices, and attend to the health and well-being of young people.[6] Elders are sometimes addressed by other Aboriginal people as "Uncle" or "Aunty" as a mark of respect. The honorific may be used by non-Aboriginal people, but generally only when permission is given to do so.[3][4][1] The self-determination advocacy organisation the Aboriginal Provisional Government was initially headed by a Council of Elders, in accordance with the traditions and beliefs of Aboriginal groups nationwide.[8] The Dreaming Path, a book written by the first Aboriginal CEO of an Australian TAFE, Paul Callaghan, in collaboration with Ngemba elder Paul Gordon, describes the important role played by elders in Aboriginal society. Some organisations have created formal elder-in-residence programs, such as the University of South Australia's Elders on Campus project, which helps to support Indigenous students.[6][9]
|
Family (biology). Family (Latin: familia, pl.: familiae) is one of the eight major hierarchical taxonomic ranks in Linnaean taxonomy. It is classified between order and genus.[1] A family may be divided into subfamilies, which are intermediate ranks between the ranks of family and genus. The official family names are Latin in origin; however, popular names are often used: for example, walnut trees and hickory trees belong to the family Juglandaceae, but that family is commonly referred to as the walnut family. The delineation of what constitutes a family, or whether a described family should be acknowledged, is established and decided upon by active taxonomists. There are no strict regulations for outlining or acknowledging a family, yet in the realm of plants, these classifications often rely on both the vegetative and reproductive characteristics of plant species. Taxonomists frequently hold varying perspectives on these descriptions, leading to a lack of widespread consensus within the scientific community for extended periods. The naming of families is codified by various international bodies using standardized suffixes, such as -idae in zoology and -aceae in botany. Name changes at the family level are regulated by the codes of nomenclature. For botanical families, some traditional names like Palmae (Arecaceae), Cruciferae (Brassicaceae), and Leguminosae (Fabaceae) are conserved alongside their standardized -aceae forms due to their historical significance and widespread use in the literature. Family names are typically formed from the stem of a type genus within the family. In zoology, when a valid family name is based on a genus that is later found to be a junior synonym, the family name may be maintained for stability if it was established before 1960. In botany, some family names that were found to be junior synonyms have been conserved due to their widespread use in the scientific literature.[5]
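The formation rule described above, a type-genus stem plus a code-mandated suffix, can be sketched as follows. This is a deliberate simplification: real stems follow Latin declension rules, so the stems below are supplied by hand rather than derived from the genus name.

```python
# Suffixes mandated by the zoological and botanical codes of nomenclature.
SUFFIXES = {"zoology": "idae", "botany": "aceae"}

def family_name(genus_stem, discipline):
    """Append the discipline's suffix to a hand-supplied type-genus stem."""
    return genus_stem.capitalize() + SUFFIXES[discipline]

family_name("jugland", "botany")   # "Juglandaceae", from the genus Juglans
family_name("fel", "zoology")      # "Felidae", from the genus Felis
```

The hand-supplied stem is the crux: deriving "Jugland-" from Juglans requires knowing the Latin genitive (Juglandis), which is why the codes define family names in terms of the genitive stem rather than the nominative genus name.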
|
Monarch (disambiguation). A monarch is the head of state of a monarchy, who holds the office for life or until abdication. Monarch or Monarchy may also refer to:
|
Family Life. Family Life may also refer to:
|
Family (disambiguation). A family is a domestic or social group. Family or The Family may also refer to:
|
Standard Chinese. Standard Chinese (simplified Chinese: 现代标准汉语; traditional Chinese: 現代標準漢語; pinyin: Xiàndài biāozhǔn hànyǔ; lit. "modern standard Han speech") is a modern standard form of Mandarin Chinese that was first codified during the republican era (1912–1949). It is designated as the official language of mainland China and a major language in the United Nations, Singapore, and Taiwan. It is largely based on the Beijing dialect. Standard Chinese is a pluricentric language with local standards in mainland China, Taiwan and Singapore that mainly differ in their lexicon.[8] Hong Kong written Chinese, used for formal written communication in Hong Kong and Macau, is a form of Standard Chinese that is read aloud with the Cantonese reading of characters. Like other Sinitic languages, Standard Chinese is a tonal language with topic-prominent organization and subject–verb–object (SVO) word order. Compared with southern varieties, the language has fewer vowels, final consonants and tones, but more initial consonants. It is an analytic language, albeit with many compound words. Among linguists, Standard Chinese has been referred to as Standard Northern Mandarin[9][10][11] or Standard Beijing Mandarin,[12][13] and it is colloquially referred to as simply Mandarin,[14] more specifically qualified as Standard Mandarin, Modern Standard Mandarin, or Standard Mandarin Chinese; the term Mandarin may also refer to the Mandarin dialect group as a whole, or to the late imperial form used as a lingua franca.[15][16][17][14] Mandarin is a translation of Guanhua (官話; 官话; "bureaucrat speech"),[18] which referred to the late imperial lingua franca.[19] The term Modern Standard Mandarin is used to distinguish it from older forms.[18][20]
|