Southwark. Southwark (/ˈsʌðərk/ SUDH-ərk)[1] is a district of Central London situated on the south bank of the River Thames, forming the north-western part of the wider modern London Borough of Southwark. The district, which is the oldest part of South London, developed because of its position at the southern end of the early versions of London Bridge, for centuries the only dry crossing on the river. Around 43 AD, engineers of the Roman Empire found the geographic features of the south bank here suitable for the placement and construction of the first bridge.[2] London's historic core, the City of London, lay north of the bridge, and for centuries the area of Southwark just south of the bridge was partially governed by the City, while other areas of the district were more loosely governed. The section known as the Liberty of the Clink became a place of entertainment. By the 12th century Southwark had been incorporated as an ancient borough, and this historic status is reflected in the area's alternative name, the Borough. The ancient borough of Southwark's river frontage extended from the modern borough boundary, just to the west of the Oxo Tower, to St Saviour's Dock (originally the mouth of the River Neckinger) in the east. In the 16th century, parts of Southwark near London Bridge became a formal City ward, Bridge Without. The urban area expanded over the years, and Southwark was completely separated administratively from the now small City in 1900, although some remnants of City administration remain in the Guildable Manor. Like other parts of London, it now falls under the Mayor of London and the London Assembly in addition to its borough authority. Local points of interest include Southwark Cathedral, Borough Market, Shakespeare's Globe theatre, The Shard, Tower Bridge, Butler's Wharf and the Tate Modern museum.
The name Suthriganaweorc[3] or Suthringa geweorche[4] is recorded for the area in the 10th-century Anglo-Saxon document known as the Burghal Hidage[4] and means 'fort of the men of Surrey'[3] or 'the defensive work of the men of Surrey'.[4] Southwark is recorded in the 1086 Domesday Book as Sudweca. The name means 'southern defensive work' and is formed from the Old English sūþ ('south') and weorc ('work'). The southern location is in reference to the City of London to the north, Southwark being at the southern end of London Bridge. In Old English, Surrey means 'southern district' (or 'the men of the southern district'),[5] so the change from 'southern district work' to the later 'southern work' may be an evolution based on the elision of the single-syllable ge element, meaning 'district'.
|
Ruby character. Ruby characters or rubi characters (Japanese: ルビ; rōmaji: rubi; Korean: 루비; romaja: rubi) are small, annotative glosses that are usually placed above or to the right of logographic characters of languages in the East Asian cultural sphere, such as Chinese hanzi, Japanese kanji, and Korean hanja, to show the logographs' pronunciation; these were formerly also used for Vietnamese chữ Hán and chữ Nôm, and may still occasionally be seen in that context when reading archaic texts. Typically called just ruby or rubi, such annotations are most commonly used as pronunciation guides for characters that are likely to be unfamiliar to the reader. In Japanese, ruby characters are called furigana; for example, the furigana reading of Tokyo (東京) is とうきょう. Most furigana are written with the hiragana syllabary, but katakana and rōmaji are also occasionally used. Alternatively, sometimes foreign words (usually English) are printed with furigana to provide the meaning, and vice versa. Textbooks sometimes render on-readings with katakana and kun-readings with hiragana. Ruby characters for Chinese, such as those for Beijing (北京), may be written in Zhuyin (a.k.a. Bopomofo), Xiaoerjing, or Pinyin. In Taiwan, the main syllabary used for Chinese ruby characters is Zhuyin fuhao (also known as Bopomofo); in mainland China pinyin is mainly used. Typically, zhuyin is used with vertical traditional writing and is written on the right side of the characters. In mainland China, horizontal script is used and ruby characters (pinyin) are written above the Chinese characters.
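On the web, ruby glosses of this kind are marked up with the HTML <ruby> element, with the gloss in <rt> and optional fallback parentheses in <rp>. A minimal sketch in Python (the helper name here is ours, not a standard API):

```python
def ruby_html(base: str, gloss: str) -> str:
    """Wrap base text and its ruby gloss in HTML <ruby> markup.

    Browsers render the gloss in small type above the base characters
    in horizontal text, or to their right in vertical text; the <rp>
    fallback parentheses only appear where <ruby> is unsupported."""
    return f"<ruby>{base}<rp>(</rp><rt>{gloss}</rt><rp>)</rp></ruby>"

# Furigana for Tokyo (東京):
print(ruby_html("東京", "とうきょう"))
# → <ruby>東京<rp>(</rp><rt>とうきょう</rt><rp>)</rp></ruby>
```

The same markup works for zhuyin or pinyin glosses on Chinese text; only the gloss string changes.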
|
Jurisprudence. Jurisprudence, also known as theory of law or philosophy of law, is the examination in a general perspective of what law is and what it ought to be. It investigates issues such as the definition of law; legal validity; legal norms and values; and the relationship between law and other fields of study, including economics, ethics, history, sociology, and political philosophy. Modern jurisprudence began in the 18th century and was based on the first principles of natural law, civil law, and the law of nations. Contemporary philosophy of law addresses problems internal to law and legal systems and problems of law as a social institution that relates to the larger political and social context in which it exists. Jurisprudence can be divided into categories both by the type of question scholars seek to answer and by the theories of jurisprudence, or schools of thought, regarding how those questions are best answered. The terms philosophy of law and jurisprudence are often used interchangeably, though jurisprudence sometimes encompasses forms of reasoning that fit into economics or sociology. Whereas lawyers are interested in what the law is on a specific issue in a specific jurisdiction, analytical philosophers of law are interested in identifying the features of law shared across cultures, times, and places. Taken together, these foundational features of law offer the kind of universal definition philosophers are after. The general approach allows philosophers to ask questions about, for example, what separates law from morality, politics, or practical reason.[1] While the field has traditionally focused on giving an account of law's nature, some scholars have begun to examine the nature of domains within law, e.g. tort law, contract law, or criminal law. These scholars focus on what makes certain domains of law distinctive and how one domain differs from another.
A particularly fecund area of research has been the distinction between tort law and criminal law, which more generally bears on the difference between civil and criminal law.[2]
|
Inuboh Station. Inuboh Station (犬吠駅, Inubō-eki) is a railway station on the privately operated Chōshi Electric Railway Line in Chōshi, Chiba, Japan. Inuboh Station is served by the 6.4 km (4.0 mi) Chōshi Electric Railway Line from Chōshi to Tokawa. It is located between Kimigahama and Tokawa stations, at a distance of 5.5 km from Chōshi Station.[1] The station is staffed during the daytime, and consists of one side platform serving a single track.[1] Nure senbei (moist senbei rice crackers) are made and sold inside the large Portuguese-style station building.[2] A former Chōshi Electric Railway DeHa 501 EMU car was sectioned and grounded in front of the station, together with a former Sagami Railway 2000 series EMU car, MoNi 2022. These were used as shop and restaurant facilities until they were cut up on-site in July 2012 due to their increasingly poor structural condition.[3][4]
|
Inubōsaki Lighthouse. Inubōsaki Lighthouse (犬吠埼燈台, Inubōsaki tōdai) is a lighthouse on Cape Inubō, in the city of Chōshi, Chiba Prefecture, Japan. It is notable as one of the few lighthouses whose original lens was a first-order Fresnel lens, the strongest type of Fresnel lens. It is a Registered Tangible Cultural Property of Japan. The lighthouse is located within the borders of the Suigo-Tsukuba Quasi-National Park. Although not one of the eight lighthouses to be built in Meiji-period Japan under the provisions of the Anglo-Japanese Treaty of Amity and Commerce of 1858, signed by the Bakumatsu-period Tokugawa Shogunate, the need for a lighthouse at Cape Inubō for the safety of vessels on the northeastern approaches to Tokyo was recognized soon after Japan was opened to the West. The wreck of the Tokugawa navy warship Mikaho in a typhoon on the rocks of Cape Inubō, with the loss of 13 lives, on October 6, 1868 further emphasized the need for a lighthouse. The lighthouse was designed and constructed by the British engineer Richard Henry Brunton, born 1841 in Kincardineshire, Scotland, who was under contract to the new Meiji government. Brunton constructed another 25 lighthouses from far northern Hokkaidō to southern Kyūshū during his career in Japan. Work began at the start of 1872.[4] The Inubōsaki Lighthouse was lit on November 15, 1874. The structure consisted of a cylindrical tower made from the first domestically produced red bricks in Japan. Brunton supervised the construction of a brick factory in Tomioka Village, in what is now part of Narita City, which produced 193,000 bricks for the project.[5] However, Brunton was uncertain of the mechanical strength of the Japanese bricks, so he constructed the tower with double-thickness walls. The tower, at 31.5 metres, is the second-tallest brick lighthouse in Japan, surpassed only by the Shiriyazaki Lighthouse (also built by Brunton) in Higashidōri, Aomori Prefecture.
Repairs for historical preservation and improvements in earthquake safety were made in 1977.[1] The Inubōsaki Lighthouse is currently open to the public, who may visit a small museum at its base and climb to the top for a panoramic view over the Pacific Ocean. It is registered with the International Association of Lighthouse Authorities as one of the "One Hundred Most Important Lighthouses in the World". The lighthouse is currently maintained by the Japan Coast Guard. In 2020, it was designated an Important Cultural Property of Japan.[6]
|
Byōbugaura. Byōbugaura (屏風ヶ浦, Byōbugaura) is an inlet on the northeast coast of Chiba Prefecture that ranges from Cape Inubō in Chōshi to Cape Gyōbumi in Asahi. Byōbugaura is an important part of the coastal area in Chiba Prefecture, as it connects the northern point of the Pacific Ocean coast at Chōshi to Kujūkuri Beach, which covers a large portion of the eastern side of the prefecture.[1][2] The name of the inlet is formed from the word byōbu, the Japanese-style folding screen, and ura, meaning an inlet. Byōbugaura resembles the White Cliffs of Dover on the English Channel. For this reason the inlet is sometimes called Tōyō no Dōbā (東洋のドーバー), or the Dover of the East.[3] Byōbugaura is known for its long history of marine erosion. Byōbugaura spans approximately 10 km (6.2 mi), with precipitous cliffs that reach an altitude of 60 m (197 ft). Several strata are clearly visible on the cliffs of the inlet: the reddish Kantō Loam Stratum at the top, the Katori Stratum in the middle, the chalky-white Iioka Stratum prominent on the cliff face, and the Naarai Stratum at the base. The Kantō Loam Stratum is red and is composed of material from volcanic eruptions of Mount Fuji and Mount Hakone, and the white portions of the cliff face consist of easily eroded clay.[2] Byōbugaura is noted in Japan for its historical and ongoing marine erosion, due to the geological character of the inlet and the near-constant violent waves of the Pacific Ocean in the area.[4] About 6 kilometres (3.7 mi) of land has been lost to erosion in the past 700 years.[5] In modern times, in the 63 years between 1888 and 1951, 30 metres (98 ft) of land were lost.[6] Tetrapods have been extensively installed at Byōbugaura to establish breakwaters, and have slowed the retreat of the land.[1] The area of Byōbugaura on the border of Chōshi features marine caves, but most have been lost to marine erosion.
Byōbugaura was used as a defensive position overlooking the Pacific Ocean as early as the Kamakura period (1185–1333). Tsuneharu Kataoka (fl. 12th century), a grandson of the regional leader Taira no Tadatsune, used an area 100 metres (330 ft) in front of the present-day location of Iioka Lighthouse on Cape Gyōbumi to build Sanuki Castle. Kataoka plotted against the first Kamakura shōgun, Minamoto no Yoritomo, and Kataoka, his family, and his local supporters were defeated and killed at Sanuki Castle by members of the Chiba clan. The remains of the castle have since been lost to the sea due to marine erosion.[5][7] Byōbugaura makes up the southernmost part of Suigō-Tsukuba Quasi-National Park,[4] but because of the danger of violent waves off the inlet there are few recreational facilities in the area. Byōbugaura, due to its dramatic coastal scenery and location near Tokyo, has been extensively used as a filming location for television dramas, commercials, and films. The precinct of the Tokai Shrine, above the inlet in Chōshi, has an important old-growth forest featuring the tabu species of laurel, castanopsis, and camellia, and is a Designated Natural Area of Chiba Prefecture.[1][8][9] The areas above the inlet are also used for agriculture: the headland around Cape Inubō is cultivated for cabbage production.
|
Guildford Guildhall. The Guildford Guildhall is a guildhall located on the High Street of the town of Guildford, Surrey. It is a Grade I listed building.[1] The Guildhall, which initially accommodated a market hall on the ground floor and a courtroom on the first floor, was built around 1550.[1] It was substantially remodelled in 1683, when a new facade was added and a new council chamber was installed on the first floor.[2][3] The external design involved three doors on the ground floor; three mullioned windows flanked by Ionic order pilasters, augmented by a balcony with iron railings, on the first floor; and an ornamental cupola on the roof.[1] The projecting clock, erected at that time, was presented to the council by a London clockmaker, John Aylward, in return for being allowed to trade in the borough.[2][4] The interior design involved a courtroom on the ground floor and a council chamber on the first floor.[3] The panelling in the council chamber was taken from Stoughton Manor House shortly before it was demolished in the late 17th century.[2] The ornamental cupola was replaced in 1882.[1] During much of the 20th century the guildhall served as the meeting place of the Municipal Borough of Guildford, but it ceased to be the local seat of government in 1974, when the municipal borough was amalgamated with Guildford Rural District to form Guildford Borough Council; the amalgamated borough council decided to hold its meetings at Millmead House.[5]
|
Guildford (disambiguation). Guildford is a town in Surrey, England. It gives its name to the Borough of Guildford, the Diocese of Guildford and the Parliamentary constituency of Guildford. Guildford, Guilford, or Gildford may also refer to:
|
Orthography. An orthography is a set of conventions for writing a language, including norms of spelling, punctuation, word boundaries, capitalization, hyphenation, and emphasis. Most national and international languages have an established writing system that has undergone substantial standardization, thus exhibiting less dialect variation than the spoken language.[1][2] These processes can fossilize pronunciation patterns that are no longer routinely observed in speech (e.g. would and should); they can also reflect deliberate efforts to introduce variability for the sake of national identity, as seen in Noah Webster's efforts to introduce easily noticeable differences between American and British spelling (e.g. honor and honour). Orthographic norms develop through social and political influence at various levels, such as encounters with print in education, the workplace, and the state. Some nations have established language academies in an attempt to regulate aspects of the national language, including its orthography, such as the Académie Française in France and the Royal Spanish Academy in Spain. No such authority exists for most languages, including English. Some non-state organizations, such as newspapers of record and academic journals, choose greater orthographic homogeneity by enforcing a particular style guide or spelling standard such as Oxford spelling. The English word orthography is first attested in the 15th century, ultimately from Ancient Greek: ὀρθός (orthós, 'correct') and γράφειν (gráphein, 'to write').[3]
|
Horizontal and vertical writing in East Asian scripts. Many East Asian scripts can be written horizontally or vertically. Chinese characters, Korean hangul, and Japanese kana may be oriented along either axis, as they consist mainly of disconnected logographic or syllabic units, each occupying a square block of space, thus allowing for flexibility in the direction in which texts can be written, be it horizontally from left to right, horizontally from right to left, vertically from top to bottom, or even vertically from bottom to top. Traditionally, written Chinese, Vietnamese, Korean, and Japanese are written vertically in columns going from top to bottom and ordered from right to left, with each new column starting to the left of the preceding one. The stroke order and stroke direction of Chinese characters, Vietnamese chữ Nôm, Korean hangul, and kana all facilitate writing in this manner. In addition, writing in vertical columns from right to left facilitated writing with a brush in the right hand while continually unrolling the sheet of paper or scroll with the left. Since the nineteenth century, it has become increasingly common for these languages to be written horizontally, from left to right, with successive rows going from top to bottom, under the influence of European languages such as English, although vertical writing is still frequently used in Hong Kong, Japan, Korea, Macau, and Taiwan. There are some small differences in orthography between the two orientations. In horizontal writing it is more common to use Arabic numerals, whereas Chinese numerals are more common in vertical text. In these scripts, the positions of punctuation marks, for example the relative position of commas and full stops (periods), differ between horizontal and vertical writing.
Punctuation such as the parentheses, quotation marks, book title marks (Chinese), ellipsis mark, dash, wavy dash (Japanese), proper noun mark (Chinese), wavy book title mark (Chinese), emphasis mark, and chōon mark (Japanese) are all rotated 90 degrees when switching between horizontal and vertical text.
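The traditional column order described above can be sketched as a small transformation: break the running text into fixed-height columns, then place the first column on the right. This is only a toy illustration, not a text-layout engine, which would also have to rotate punctuation as noted above:

```python
def to_vertical(text: str, column_height: int) -> str:
    """Lay out text in columns read top-to-bottom, with the columns
    ordered right-to-left, as in traditional CJK page layout."""
    # Break the running text into successive columns.
    cols = [text[i:i + column_height] for i in range(0, len(text), column_height)]
    # Pad the last column with full-width spaces so all rows line up.
    cols = [c.ljust(column_height, "\u3000") for c in cols]
    # The first column belongs on the RIGHT edge of the page, so each
    # display row is read across the reversed column list.
    return "\n".join("".join(c[r] for c in reversed(cols))
                     for r in range(column_height))

print(to_vertical("縦書きの例文です", 4))
# prints:
# 例縦
# 文書
# でき
# すの
```

Reading the printed block column by column, starting at the top right, recovers the original sentence.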
|
The Shard. The Shard,[a] also referred to as the Shard London Bridge[13] and formerly London Bridge Tower,[14] is a 72-storey, mixed-use, supertall pyramid-shaped skyscraper, designed by the Italian architect Renzo Piano, in Southwark, London, that forms part of The Shard Quarter development. Standing 309.6 metres (1,016 feet) high, The Shard is the tallest building in the United Kingdom, the seventh-tallest building in Europe, and the second-tallest in Europe outside Russia, behind the Varso Tower in Warsaw, which beats The Shard by less than half a metre.[15] The Shard replaced Southwark Towers, a 24-storey office block built on the site in 1975. The Shard's construction began in March 2009; it was topped out on 30 March 2012 and inaugurated on 5 July 2012. Practical completion was achieved in November 2012. The tower's privately operated observation deck, The View from The Shard, was opened to the public on 1 February 2013. The glass-clad pyramidal tower has 72 habitable floors, with a viewing gallery and open-air observation deck on the 72nd floor, at a height of 244 metres (801 ft). The Shard was developed by Sellar Property Group on behalf of LBQ Ltd and is jointly owned by Sellar Property (5%) and the State of Qatar (95%). In 1998, the London-based entrepreneur Irvine Sellar and his partners decided to redevelop the 1970s-era Southwark Towers following a UK government white paper encouraging the development of tall buildings at major transport hubs. Sellar flew to Berlin in the spring of 2000 to meet the Italian architect Renzo Piano for lunch.
According to Sellar, Piano spoke of his contempt for conventional tall buildings during the meal, before flipping over the restaurant's menu and sketching a spire-like sculpture emerging from the River Thames.[16] In July 2002, the Deputy Prime Minister, John Prescott, ordered a planning inquiry after the development plans for the Shard were opposed by the Commission for Architecture and the Built Environment and several heritage bodies, including the Royal Parks Foundation and English Heritage.[17][18] The inquiry took place in April and May 2003,[14][19] and on 19 November 2003, the Office of the Deputy Prime Minister announced that planning consent had been approved.[20]
|
Guildford railway station. Guildford railway station is at one of the three main railway junctions on the Portsmouth Direct Line and serves the town of Guildford, in Surrey, England. It is 30 miles 27 chains (30.34 mi; 48.8 km) down the line from London Waterloo via Woking.[1] It provides an interchange with two other railway lines: the North Downs Line, running northwards towards Reading, with a connection to Aldershot, and eastwards to Redhill; and the New Guildford Line, the alternative route to London Waterloo, via Cobham or Epsom. Guildford station is the larger and more frequently and diversely served of the two stations in Guildford town centre, the other being London Road (Guildford) on the New Guildford Line. The station was opened by the London and South Western Railway (LSWR) on 5 May 1845,[2] but was substantially enlarged and rebuilt in 1880.
|
Logogram. In a written language, a logogram (from Ancient Greek logos, 'word', and gramma, 'that which is drawn or written'), also logograph or lexigraph, is a written character that represents a semantic component of a language, such as a word or morpheme. Chinese characters, as used in Chinese as well as other languages, are logograms, as are Egyptian hieroglyphs and characters in cuneiform script. A writing system that primarily uses logograms is called a logography. Non-logographic writing systems, such as alphabets and syllabaries, are phonemic: their individual symbols represent sounds directly and lack any inherent meaning. However, all known logographies have some phonetic component, generally based on the rebus principle, and the addition of a phonetic component to pure ideographs is considered to be a key innovation in enabling the writing system to adequately encode human language. Some of the earliest recorded writing systems are logographic; the first historical civilizations of Mesopotamia, Egypt, China and Mesoamerica all used some form of logographic writing.[1][2] All logographic scripts ever used for natural languages rely on the rebus principle to extend a relatively limited set of logograms: a subset of characters is used for their phonetic values, either consonantal or syllabic. The term logosyllabary is used to emphasize the partially phonetic nature of these scripts when the phonetic domain is the syllable. In Ancient Egyptian hieroglyphs, Cholti, and Chinese, there has been the additional development of determinatives, which are combined with logograms to narrow down their possible meaning. In Chinese, they are fused with logographic elements used phonetically; such radical-and-phonetic characters make up the bulk of the script. Ancient Egyptian and Chinese relegated the active use of rebus to the spelling of foreign and dialectal words.
Logoconsonantal scripts have graphemes that may be extended phonetically according to the consonants of the words they represent, ignoring the vowels. For example, the same Egyptian sign was used to write both sȝ 'duck' and sȝ 'son', though it is likely that these words were not pronounced the same except for their consonants. The primary examples of logoconsonantal scripts are the Egyptian hieroglyphic, hieratic, and demotic scripts, all used to write Ancient Egyptian.
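The rebus principle itself is simple enough to sketch: signs that normally stand for whole words are reread purely for their sound. The classic English picture-rebus reads the signs for 'eye', 'bee', and 'leaf' as the phrase "I believe"; the sign inventory below is invented for illustration and is not drawn from any real script:

```python
# Toy rebus: each sign normally denotes a whole word (a logogram),
# but the signs are read for their sound alone to spell a new phrase.
logograms = {"eye-sign": "eye", "bee-sign": "bee", "leaf-sign": "leaf"}

# Reading the word-signs purely phonetically: "eye bee leaf" ≈ "I believe".
signs = ["eye-sign", "bee-sign", "leaf-sign"]
phrase = " ".join(logograms[s] for s in signs)
print(phrase)  # → eye bee leaf
```

A real logosyllabary works the same way at scale: a limited stock of word-signs doubles as a syllable inventory for spelling words the script has no sign for.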
|
Tower Bridge. Tower Bridge is a Grade I listed combined bascule, suspension, and, until 1960, cantilever bridge[1] in London, built between 1886 and 1894, designed by Horace Jones and engineered by John Wolfe Barry with the help of Henry Marc Brunel.[2] It crosses the River Thames close to the Tower of London and is one of five London bridges owned and maintained by the City Bridge Foundation, a charitable trust founded in 1282. The bridge was constructed to connect the 39 per cent of London's population that lived east of London Bridge, equivalent to the populations of Manchester on the one side and Liverpool on the other,[3] while allowing shipping to access the Pool of London between the Tower of London and London Bridge. The bridge was opened by Edward, Prince of Wales, and Alexandra, Princess of Wales, on 30 June 1894. The bridge is 940 feet (290 m) in length including the abutments[4] and consists of two 213-foot (65 m) bridge towers connected at the upper level by two horizontal walkways, and a central pair of bascules that can open to allow shipping. Originally hydraulically powered, the operating mechanism was converted to an electro-hydraulic system in 1972. The bridge is part of the London Inner Ring Road and thus the boundary of the London congestion charge zone, and remains an important traffic route with 40,000 crossings every day. The bridge deck is freely accessible to both vehicles and pedestrians, whereas the bridge's twin towers, high-level walkways, and Victorian engine rooms form part of the Tower Bridge Exhibition.
|
River Thames. The River Thames (/tɛmz/ TEMZ), known alternatively in parts as the River Isis, is a river that flows through southern England, including London. At 215 miles (346 km), it is the longest river entirely in England and the second-longest in the United Kingdom, after the River Severn. The river rises at Thames Head in Gloucestershire and flows into the North Sea near Tilbury, Essex and Gravesend, Kent, via the Thames Estuary. From the west, it flows through Oxford (where it is sometimes called the Isis), Reading, Henley-on-Thames and Windsor. The Thames also drains the whole of Greater London.[1] The lower reaches of the river are called the Tideway, derived from its long tidal reach up to Teddington Lock. Its tidal section includes most of its London stretch and has a rise and fall of 23 ft (7 m). From Oxford to the estuary, the Thames drops by 55 metres (180 ft). Running through some of the drier parts of mainland Britain and heavily abstracted for drinking water, the Thames' discharge is low considering its length and breadth: the Severn has a discharge almost twice as large on average despite having a smaller drainage basin. In Scotland, the Tay achieves more than double the Thames' average discharge from a drainage basin that is 60% smaller. Along its course are 45 navigation locks with accompanying weirs. Its catchment area covers a large part of south-eastern and a small part of western England; the river is fed by at least 50 named tributaries. The river contains over 80 islands. With its waters varying from freshwater to almost seawater, the Thames supports a variety of wildlife and has a number of adjoining Sites of Special Scientific Interest, the largest being in the North Kent Marshes and covering 20.4 sq mi (5,289 ha).[2]
|
Video game producer. A video game producer is the top person in charge of overseeing development of a video game.[1][2] The earliest documented use of the term "producer" in games was by Trip Hawkins, who established the position when he founded Electronic Arts in 1982: "Producers basically manage the relationship with the artist. They find the talent, work out product deals, get contracts signed, manage them, and bring them to their conclusion. The producers do most of the things that a product manager does. They don't do the marketing, which in some cases product managers do. They don't make decisions about packaging and merchandising, but they do get involved ... they're a little like book editors, a little bit like film producers, and a lot like product managers."[3] Sierra On-Line's 1982 computer game Time Zone may be the first to list credits for Producer and Executive Producer.[4] As of late 1983 Electronic Arts had five producers: a product marketer and two others from Hawkins's former employer Apple ("good at working with engineering people"), one former IBM salesman and executive recruiter, and one product marketer from Automated Simulations;[3] the company popularized the use of the title in the industry.[4] Hawkins's vision, influenced by his relationship with Jerry Moss, was that producers would manage artists and repertoire in the same way as in the music business, and Hawkins brought in record producers from A&M Records to help train those first producers. Activision made Brad Fregger their first producer in April 1983. Although the term is an industry standard today, it was dismissed as imitation Hollywood by many game executives and press members at the time. Over its entire history, the role of the video game producer has been defined in a wide range of ways by different companies and different teams, and there are a variety of positions within the industry referred to as producer.
|
Town. A town is a type of human settlement, generally larger than a village but smaller than a city.[1] The criteria for distinguishing a town vary globally, often depending on factors such as population size, economic character, administrative status, or historical significance. In some regions, towns are formally defined by legal charters or government designations, while in others, the term is used informally. Towns typically feature centralized services, infrastructure, and governance, such as municipal authorities, and serve as hubs for commerce, education, and cultural activities within their regions. The concept of a town varies culturally and legally. For example, in the United Kingdom, a town may historically derive its status from a market town designation or royal charter, while in the United States, the term is often loosely applied to incorporated municipalities. In some countries, such as Australia and Canada, distinctions between towns, cities, and rural areas are based on population thresholds. Globally, towns play diverse roles, ranging from agricultural service centers to suburban communities within metropolitan areas. The word town shares an origin with the German word Zaun (fence), the Dutch word tuin (garden, yard; fence, enclosure), and the Old Norse tún (enclosure, as for a homestead).[2] The original Proto-Germanic word, *tūną, is thought to be an early borrowing from *dūnom (cf. Old Irish dún, Welsh din).[3]
|
Video game design. Video game design is the process of designing the rules and content of video games in the pre-production stage[1] and designing the gameplay, environment, storyline and characters in the production stage. Some common video game design subdisciplines are world design, level design, system design, content design, and user interface design. Within the video game industry, video game design is usually referred to simply as game design, which is a more general term elsewhere. The video game designer is like the director of a film; the designer is the visionary of the game and controls the artistic and technical elements of the game in fulfillment of their vision.[2] However, with complex games, such as MMORPGs or big-budget action or sports titles, designers may number in the dozens. In these cases, there are generally one or two principal designers and multiple junior designers who specify subsets or subsystems of the game. As the industry has aged and embraced alternative production methodologies such as agile, the role of a principal game designer has begun to diverge: some studios emphasize the auteur model, while others emphasize a more team-oriented model. In larger companies like Electronic Arts, each aspect of the game (control, level design) may have a separate producer, lead designer and several general designers. Video game design requires artistic and technical competence as well as, sometimes, writing skills.[3] Historically, video game programmers have sometimes comprised the entire design team. This was the case for such noted designers as Sid Meier, John Romero, Chris Sawyer and Will Wright. A notable exception to this pattern was Coleco, which from its very start separated the functions of design and programming. As video games became more complex and computers and consoles became more powerful, the job of the game designer became separate from that of the lead programmer.
Soon, game complexity demanded team members focused on game design. A number of early veterans chose the game design path, eschewing programming and delegating those tasks to others. Video game design starts with an idea,[4][5][6][7] often a variation or modification on an existing concept.[4][8] The game idea will fall within one or several genres, and designers will often experiment with mixing genres.[9][10] The game designer usually produces an initial game proposal document containing the concept, gameplay, feature list, setting and story, target audience, requirements and schedule, staff and budget estimates.[11] Multiple design decisions are made during the course of a game's development; it is the responsibility of the designer to decide which elements should be implemented, based on, for example, consistency with the game's vision, the budget, or hardware limitations.[12] Design changes will have a significant impact on required resources.[13]
|
Game art design. Game art design is a subset of game development involving the process of creating the artistic aspects of video games. Video game art design begins in the pre-production phase of creating a video game. Video game artists are visual artists involved from the conception of the game who make rough sketches of the characters, setting, objects, etc.[1][2][3][4] These starting concept designs can also be created by the game designers before the game is moved into actualization. Sometimes, these concept designs are called programmer art.[5] After the rough sketches are completed and the game is ready to be moved forward, those artists or more artists are brought in to develop graphic designs based on the sketches. The art design of a game can involve anywhere from two people and up. Small gaming companies tend not to have many artists on the team, meaning that their artists must be skilled in several types of art development, whereas in larger companies, although an artist may still be skilled in several types of development, the role each artist plays becomes more specialized.[6] A game's artwork included in media, such as demos and screenshots, has a significant impact on customers, because artwork can be judged from previews, while gameplay cannot.[1][7] Artists work closely with designers on what is needed for the game.[8] Tools used for art design and production are known as art tools. These can range from pen and paper to full software packages for both 2D and 3D digital art.[9] A developer may employ a tools team responsible for art production applications. This includes using existing software packages and creating custom exporters and plug-ins for them.[10]
|
Privately held company. A privately held company (or simply a private company) is a company whose shares and related rights or obligations are not offered for public subscription or publicly negotiated in their respective listed markets. Instead, the company's stock is offered, owned, traded or exchanged privately, also known as over-the-counter. Related terms are unlisted organisation, unquoted company and private equity. Private companies are often less well-known than their publicly traded counterparts but still have major importance in the world's economy. For example, in 2008, the 441 largest private companies in the United States accounted for $1.8 trillion in revenues and employed 6.2 million people, according to Forbes.[1] In general, all companies that are not owned by the government are classified as private enterprises. This definition encompasses both publicly traded and privately held companies, as their investors are individuals. Private ownership of productive assets differs from state ownership or collective ownership (as in worker-owned companies). This usage is often found in former Eastern Bloc countries to differentiate from former state-owned enterprises,[citation needed] but it may be used anywhere in contrast to a state-owned or a collectively owned company. In the United States, a privately held company refers to a business entity owned by private stakeholders, investors, or company founders, and its shares are not available for public purchase on stock exchanges. That contrasts with public companies, whose shares are publicly traded, which allows investing by the general public.
|
Philippine Sea. The Philippine Sea is a marginal sea of the Western Pacific Ocean east of the Philippine Archipelago (hence the name) and the largest sea in the world, occupying an estimated surface area of 5 million square kilometers (2×10^6 sq mi).[1] The Philippine Sea Plate forms the floor of the sea.[2] Its western border is the first island chain to the west, comprising the Ryukyu Islands in the northwest and Taiwan in the west. Its southwestern border comprises the Philippine islands of Luzon, Catanduanes, Samar, Leyte, and Mindanao. Its northern border comprises the Japanese islands of Honshu, Shikoku and Kyūshū. Its eastern border is the second island chain to the east, comprising the Bonin Islands and Iwo Jima in the northeast, the Mariana Islands (including Guam, Saipan, and Tinian) due east, and Halmahera, Palau, Yap and Ulithi (of the Caroline Islands) in the southeast. Its southern border is Indonesia's Morotai Island.[3] The sea has a complex and diverse undersea relief.[4] The floor is formed into a structural basin by a series of geologic faults and fracture zones. Island arcs, which are actually extended ridges protruding above the ocean surface due to plate tectonic activity in the area, enclose the Philippine Sea to the north, east and south; the Philippine archipelago, the Ryukyu Islands, and the Marianas are examples. Another prominent feature of the Philippine Sea is the presence of deep sea trenches, among them the Philippine Trench and the Mariana Trench, containing the deepest point on the planet. The Philippine Sea has the Philippines and Taiwan to the west, Japan to the north, the Marianas to the east and Palau to the south. Adjacent seas include the Celebes Sea, which is separated from it by Mindanao and smaller islands to the south; the South China Sea, which is separated from it by the Philippines; and the East China Sea, which is separated from it by the Ryukyu Islands.
The International Hydrographic Organization defines the Philippine Sea as that area of the North Pacific Ocean off the Eastern coasts of the Philippine Islands, bounded as follows:[5][failed verification]
|
Taiwanese yen. The Taiwanese yen (Japanese: 圓, Hepburn: en) was the currency of Japanese Taiwan from 1895 to 1946. It was on a par with and circulated alongside the Japanese yen. The yen was subdivided into 100 sen (錢). It was replaced by the Old Taiwan dollar in 1946, which in turn was replaced by the New Taiwan dollar in 1949. In 1895, as a result of the First Sino-Japanese War, Qing China ceded Taiwan to Japan in the Treaty of Shimonoseki. The Japanese yen then became the currency of Taiwan, with distinct banknotes denominated in yen issued by the Bank of Taiwan from 1898. Only banknotes and stamp currency were issued. In 1945, after Japan was defeated in World War II, the Republic of China assumed the administration of Taiwan, took over the Bank of Taiwan within a year, and introduced the Old Taiwan dollar, which replaced the yen at par. In 1899, the Bank of Taiwan introduced 1 and 5 yen notes, followed by 50 yen notes in 1900 and 10 yen in 1901. 100 yen notes were introduced in 1937 and 1000 yen in 1945. The last notes issued were dated 1945. In 1917, stamp currency was issued in denominations of 5, 10, 20, and 50 sen. 1, 3 and 5 sen stamp currency was issued in 1918. These issues consisted of postage stamps of the appropriate denomination fixed to forms called tokubetsu yubin kitte daishi (special postage stamp cards).
|
Video game development. Video game development (sometimes shortened to gamedev) is the process of creating a video game. It is a multidisciplinary practice, involving programming, design, art, audio, user interface, and writing. Each of those may be made up of more specialized skills; art includes 3D modeling of objects, character modeling, animation, visual effects, and so on. Development is supported by project management, production, and quality assurance. Teams can be many hundreds of people, a small group, or even a single person. Development of commercial video games is normally funded by a publisher and can take two to five years to reach completion. Game creation by small, self-funded teams is called independent development. The technology in a game may be written from scratch or use proprietary software specific to one company. As development has become more complex, it has become common for companies and independent developers alike to use off-the-shelf engines such as Unity, Unreal Engine or Godot.[1][2] Commercial game development began in the 1970s with the advent of arcade video games, first-generation video game consoles like the Atari 2600, and home computers like the Apple II. Into the 1980s, a lone programmer could develop a complete game such as Pitfall!. With the second and third generations of video game consoles in the late 1980s, the growing popularity of 3D graphics on personal computers, and higher expectations for visuals and quality, it became difficult for a single person to produce a mainstream video game. The average cost of producing a high-end (often called AAA) game slowly rose from US$1–4 million in 2000, to over $200 million and up by 2023. At the same time, independent game development has flourished. The best-selling video game of all time, Minecraft, was initially written by one person, then supported by a small team, before the company was acquired by Microsoft and greatly expanded.
Mainstream commercial video games are generally developed in phases. A concept is developed, which then moves to pre-production, where prototypes are written and the plan for the entire game is created. This is followed by full-scale development or production, then sometimes a post-production period where the game is polished. It has become common for many developers, especially smaller developers, to publicly release games in an early access form, where iterative development takes place in tandem with feedback from actual players. Games are produced through the software development process.[3] Games are developed as a creative outlet[4] and to generate profit.[5] Game making is considered both art and science.[6][7] Development is normally funded by a publisher.[8] Well-made games bring profit more readily.[6] However, it is important to estimate a game's financial requirements,[9] such as the development costs of individual features.[10] Failing to set clear expectations for a game may result in exceeding the allocated budget.[9] In fact, the majority of commercial games do not produce profit.[11][12][13] Most developers cannot afford to change their development schedule midway, and must estimate their capabilities with available resources before production.[14]
|
Japanese military currency (1937–1945). Japanese military currency (Chinese and Japanese: 日本軍用手票, also 日本軍票 in short) was money issued to the soldiers of the Imperial Japanese Armed Forces as a salary.[citation needed] The military yen reached its peak during the Pacific War period, when the Japanese government excessively[clarification needed] issued it to all of its occupied territories. In Hong Kong, the military yen was forced upon the local population as the sole official currency of the territory.[1] Since the military yen was not backed by gold, and did not have a specific place of issuance, the military yen could not be exchanged for the Japanese yen. Forcing local populations to use the military yen officially was one of the ways the Japanese government could dominate the local economies. The territories controlled or occupied by Japan had many different currencies. Taiwan maintained its own banking system and bank notes after it came under Japanese sovereignty in 1895. The same is true for Korea post 1910. Between 1931 and 1945, large parts of China and South East Asia were occupied by Japan. Several types of currencies were put into circulation there during the occupation. In China, several puppet governments were created (e.g. Manchukuo), each issuing their own currency. In South East Asia, the Japanese military arranged for bank notes to be issued, denominated in the various currencies (rupees, pesos, dollars, etc.) that had been circulating there prior to the occupation. These latter are referred to as Japanese invasion money. In addition to these currencies, the Japanese military issued their own bank notes, denominated in yen – this is the Japanese military yen. The military yen became the official currency in some occupied areas, e.g. Hong Kong. In the late 1930s there was an issue of military yen which was similar to the standard yen in terms of design, but with minor modifications. 
Generally, thick red lines were overprinted to cancel the name Bank of Japan (日本銀行) and any text promising to pay the bearer in gold or silver. Large red text instead indicated that the note was military currency (軍用手票) so as not to be confused with regular Japanese yen. Later series were less crude. In the early 1940s, the Japanese government issued military yen notes with a design prepared specifically for the military yen. These designs were not based on existing Japanese yen notes, but featured original designs such as Onagadori cocks and dragons. All later series featured the following text on the reverse of the note:[citation needed]
|
Lists of ISO 639 codes. ISO 639 is a set of standards by the International Organization for Standardization that is concerned with representation of names for languages and language groups. Lists of ISO 639 codes are:
|
Aichi Prefecture. Aichi Prefecture (愛知県, Aichi-ken; Japanese pronunciation: [aꜜi.tɕi, ai.tɕi̥ꜜ.keɴ][2]) is a prefecture of Japan located in the Chūbu region of Honshū.[3]: 11, 126 Aichi Prefecture has a population of 7,461,111 (as of 1 January 2025) and a geographic area of 5,172.92 square kilometres (1,997.28 sq mi), with a population density of 1,442 inhabitants per square kilometre (3,730/sq mi). Aichi Prefecture borders Mie Prefecture to the west, Gifu Prefecture and Nagano Prefecture to the north, and Shizuoka Prefecture to the east. Nagoya, the capital and largest city of the prefecture, is the fourth-largest city in Japan; other major cities include Toyota, Okazaki, and Ichinomiya. Aichi Prefecture and Nagoya form the core of the Chūkyō metropolitan area, the third-largest metropolitan area in Japan and one of the largest metropolitan areas in the world.[3]: 685 Aichi Prefecture is located on Japan's Pacific Ocean coast and forms part of the Tōkai region, a subregion of the Chūbu and Kansai regions. Aichi Prefecture is home to the Toyota Motor Corporation, as well as attractions such as the Higashiyama Zoo and Botanical Gardens, Chubu Centrair International Airport, and the Legoland Japan Resort. Located near the centre of the Japanese main island of Honshu, Aichi Prefecture faces the Ise and Mikawa Bays to the south and borders Shizuoka Prefecture to the east, Nagano Prefecture to the northeast, Gifu Prefecture to the north, and Mie Prefecture to the west. It measures 106 km (66 mi) east to west and 94 km (58 mi) south to north and forms a major portion of the Nōbi Plain. With an area of 5,172.48 square kilometres (1,997.11 sq mi), it accounts for approximately 1.36% of the total surface area of Japan. The highest spot is Chausuyama, at 1,415 m (4,642 ft) above sea level.
The western part of the prefecture is dominated by Nagoya, Japan's third-largest city, and its suburbs, while the eastern part is less densely populated but still contains several major industrial centres. Due to its robust economy, for the period from October 2005 to October 2006, Aichi was the fastest-growing prefecture in terms of population, at 7.4%, ahead of Tokyo and Saitama Prefecture.
|
Korean yen. The yen was the currency of Korea, then part of the Empire of Japan, between 1910 and 1945. It was equivalent to the Japanese yen and consisted of Japanese currency and banknotes issued specifically for Korea. The yen was subdivided into 100 sen. It replaced the Korean won at par and was replaced by the South Korean won and the North Korean won at par. From 1902 to 1910, banknotes were issued by Dai-Ichi Bank.[1] Denominations included 10 sen, 20 sen, 50 sen, 1 yen, 5 yen, and 10 yen. The sen notes were vertical and resembled the Japanese sen notes of 1872 and the Japanese military yen at the turn of the century. These notes were redeemable in Japanese currency at any of the bank's branches in Korea. In 1909, the Bank of Korea (韓國銀行) was founded in Seoul as a central bank and began issuing currency of modern type. Following its establishment, the Bank of Korea immediately began to issue its own banknotes; these new banknotes were redeemable in gold or Nippon Ginko notes.[2] Most of the reserves held by the Bank of Korea at the time were banknotes issued by the Bank of Japan and commercial paper.[2] The banknotes issued by the Bank of Korea were only very slightly modified from the earlier Dai-Ichi Bank banknotes that had circulated in Korea; this was done to reduce any possible confusion during the transition period.[2] The name of the Bank of Korea was inserted, the royal plum crest of Korea replaced Dai-Ichi Bank's 10-pointed star emblem, and the reverse sides of the 1 yen banknotes changed colour, but overall the changes were minute.[2]
|
Fukui Prefecture. Fukui Prefecture (福井県, Fukui-ken; Japanese pronunciation: [ɸɯ̥.kɯ(ꜜ)(.)i, -kɯ.iꜜ.keɴ, -kɯꜜi.keɴ][2]) is a prefecture of Japan located in the Chūbu region of Honshū.[3] Fukui Prefecture has a population of 737,229 (1 January 2025) and a geographic area of 4,190 km2 (1,617 sq mi). Fukui Prefecture borders Ishikawa Prefecture to the north, Gifu Prefecture to the east, Shiga Prefecture to the south, and Kyoto Prefecture to the southwest. Fukui is the capital and largest city of Fukui Prefecture, with other major cities including Sakai, Echizen, and Sabae.[4] Fukui Prefecture is located on the Sea of Japan coast and is part of the historic Hokuriku region of Japan. The Matsudaira clan, a powerful samurai clan during the Edo period that became a component of the Japanese nobility after the Meiji Restoration, was headquartered at Fukui Castle on the site of the modern prefectural offices. Fukui Prefecture is home to the Kitadani Formation, the Ichijōdani Asakura Family Historic Ruins, and the Tōjinbō cliffs. The Kitadani Dinosaur Quarry, on the Sugiyama River within the city limits of Katsuyama, has yielded animals such as Fukuiraptor, Fukuisaurus, Nipponosaurus, Koshisaurus, Fukuivenator, Fukuititan, and Tambatitanis, as well as an unnamed dromaeosaurid. Fukui originally consisted of the old provinces of Wakasa and Echizen, before the prefecture was formed in 1871.[5] During the Edo period, the daimyō of the region was surnamed Matsudaira, and was a descendant of Tokugawa Ieyasu.
|
Yen and yuan sign. The yen and yuan sign (¥) is a currency sign used for the Japanese yen and the Chinese yuan currencies when writing in Latin scripts. This character resembles a capital letter Y with a single or double horizontal stroke. The symbol is usually placed before the value it represents, for example: ¥50, or JP¥50 and CN¥50 when disambiguation is needed.[a] When writing in Japanese and Chinese, the Japanese kanji or Chinese character is written following the amount, for example 50円 in Japan, and 50元 or 50圆 in China. After the institution of Japan's New Currency Act, from 1871 through the early 20th century, the yen was either referred to (in documents printed in Latin script) by its full name yen, or abbreviated with a capital Y.[citation needed] One of the earliest uses of ¥ can be found in J. Twizell Wawn's Japanese Municipal Government With an Account of the Administration of the City of Kobe,[1] published in 1899. Usage of the sign increased in the early 20th century, primarily in Western English-speaking countries, but it has become commonly used in Japan as well. The Unicode code point is U+00A5 ¥ YEN SIGN (¥). Additionally, there is a fullwidth character, ¥, at code point U+FFE5 ¥ FULLWIDTH YEN SIGN[b] for use with wide fonts, especially East Asian fonts. There was no code point for any ¥ symbol in the original (7-bit) US-ASCII, and consequently many early systems reassigned 5C (allocated to the backslash (\) in ASCII) to the yen sign. With the arrival of 8-bit encoding, the ISO/IEC 8859-1 (ISO Latin 1) character set assigned code point A5 to ¥ in 1985; Unicode continues this encoding. JIS X 0201, of which Shift JIS is an extension, assigns code point 0x5C to the Latin-script yen sign; as noted above, this is the code used for the backslash in ASCII and subsequently in Unicode. The JIS X 0201 standard was widely adopted in Japan.
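The code points and encoding behaviour described above can be checked directly; the following is a small illustrative sketch in Python (the assertions restate facts from the text, not any standard's own test suite):

```python
# Code points of the two yen signs discussed above
assert ord("¥") == 0x00A5    # U+00A5 YEN SIGN
assert ord("￥") == 0xFFE5   # U+FFE5 FULLWIDTH YEN SIGN, for wide East Asian fonts

# ISO/IEC 8859-1 (ISO Latin 1) assigns the yen sign to code point A5
assert "¥".encode("latin-1") == b"\xa5"

# Byte 0x5C, which JIS X 0201 reassigns to the yen sign, is the
# backslash in ASCII (and subsequently in Unicode)
assert b"\x5c".decode("ascii") == "\\"
```

Note that a byte stream containing 0x5C is therefore ambiguous between the two standards, which is why Japanese systems using JIS X 0201-derived encodings often display file paths with ¥ where Western systems show a backslash.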
|
Video game industry. The video game industry is the tertiary and quaternary sectors of the entertainment industry that specialize in the development, marketing, distribution, monetization, and consumer feedback of video games. The industry encompasses dozens of job disciplines and thousands of jobs worldwide.[1] The video game industry has grown from niche to mainstream.[2] As of July 2018, video games generated US$134.9 billion annually in global sales.[3] In the US, the industry earned about $9.5 billion in 2007, $11.7 billion in 2008, and US$25.1 billion in 2010,[4] according to the ESA annual report. Research from Ampere Analysis indicated that the sector had grown consistently since at least 2015, expanding 26% from 2019 to 2021 to a record $191 billion, but forecast the global games and services market to shrink 1.2% to $188 billion in 2022.[5] The industry has influenced the technological advancement of personal computers through sound cards, graphics cards and 3D graphic accelerators, CPUs, and co-processors like PhysX.[citation needed] Sound cards, for example, were originally developed for games and then improved for adoption by the music industry.[6] In 2017 in the United States, which represented about a third of the global video game market, the Entertainment Software Association estimated that there were over 2,300 development companies and over 525 publishing companies, including in hardware and software manufacturing, service providers, and distributors. These companies in total have nearly 66,000 direct employees. When including indirect employment, such as a developer using the services of a graphics design package from a different firm, the total number of employees involved in the video game industry rises above 220,000.[7]
|
Yan (surname). Yan is a surname in several languages and the pinyin romanization for several Chinese surnames, including 严 (嚴), 晏 (晏), 偃 (偃), 颜 (顏), 言 (言), 燕 (燕), 阎 (閻), 闫 (閆), 鄢 (鄢) in simplified (traditional) form. These characters are romanised as Yen in the Wade–Giles romanization system, which was commonly used before the early 1980s. As such, individuals and institutions who had to romanize their Chinese names prior to that time, such as when having their books translated or publishing manuscripts outside of China, used Yen instead of Yan. Such examples include Yenching University and the Harvard-Yenching Institute. The Yan surname in Taiwan is mostly spelled Yen, since only recently has the government approved the use of pinyin romanization of names. The Cantonese romanization of these surnames is Yim. As such, most people from Hong Kong and the Chinese diaspora who emigrated from Guangdong prior to 1949 use the name Yim. On many occasions, the surname 甄 is also romanized as Yan in Cantonese. This name in Mandarin is romanized as Zhēn; see Zhen (surname). Yan is also an alternative spelling of the Breton name Yann. Yan (simplified Chinese: 闫; traditional Chinese: 閆), pinyin Yán, originated as a variant of the surname 閻.
|
Kangxi radicals. The Kangxi radicals (Chinese: 康熙部首; pinyin: Kāngxī bùshǒu), also known as Zihui radicals, are a set of 214 radicals that were collated in the 18th-century Kangxi Dictionary to aid categorization of Chinese characters. They are primarily sorted by stroke count. They are the most popular system of radicals for dictionaries that order characters by radical and stroke count. They are encoded in Unicode alongside other CJK characters, under the block Kangxi radicals, while graphical variants are included in the block CJK Radicals Supplement. Originally introduced in the Zihui dictionary of 1615, they are more commonly referred to in relation to the 1716 Kangxi Dictionary—Kangxi being the commissioning emperors era name. The 1915 encyclopedic word dictionary Ciyuan also uses this system. In modern times, many dictionaries that list Traditional Chinese head characters continue to use this system, for example the Wang Li Character Dictionary of Ancient Chinese (2000). The system of 214 Kangxi radicals is based on the older system of 540 radicals used in the Han-era Shuowen Jiezi. Since 2009, the Chinese government has promoted a 201-radical system (Table of Han Character Radicals) called the Table of Indexing Chinese Character Components, as a national standard for use with simplified characters. The Kangxi dictionary lists a total of 47,035 characters divided among the 214 radicals, for an average of 220 characters per radical; however, the distribution is unequal, with the median number of characters per radical being 64, the maximum number being 1,902 (for radical 140 艸), and the minimum being 5 (for radical 138 艮). The radicals have between one and 17 strokes, with a median of 5 strokes and an average of slightly below 5.7 strokes.
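The Unicode encoding and the average quoted above can be verified in a short Python sketch using the standard unicodedata module (the block start points U+2F00 for Kangxi Radicals and U+2E80 for the CJK Radicals Supplement are taken from the published Unicode code charts):

```python
import unicodedata

# The Kangxi Radicals block starts at U+2F00 with radical 1 (一, "one")
assert unicodedata.name(chr(0x2F00)) == "KANGXI RADICAL ONE"

# Graphical variants live in the CJK Radicals Supplement block from U+2E80
assert unicodedata.name(chr(0x2E80)).startswith("CJK RADICAL")

# The average quoted in the text: 47,035 characters over 214 radicals
assert round(47035 / 214) == 220
```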
|
Chinese character radicals. A radical (Chinese: 部首; pinyin: bùshǒu; lit. section header), or indexing component, is a visually prominent component of a Chinese character under which the character is traditionally listed in a Chinese dictionary. The radical for a character is typically a semantic component, but it can also be another structural component or an artificially extracted portion of the character. In some cases, the original semantic or phonological connection has become obscure, owing to changes in the meaning or pronunciation of the character over time. The use of the English term radical is based on an analogy between the structure of Chinese characters and the inflection of words in European languages.[a] Radicals are also sometimes called classifiers, but this name is more commonly applied to the grammatical measure words in Chinese.[2] In the earliest Chinese dictionaries, such as the Erya (3rd century BC), characters were grouped together in broad semantic categories. Because the vast majority of characters are phono-semantic compounds, combining a semantic component with a phonetic component, each semantic component tended to recur within a particular section of the dictionary. In the 2nd century AD, the Han dynasty scholar Xu Shen organized his etymological dictionary Shuowen Jiezi by selecting 540 recurring graphic elements he called bù (部; categories).[3] Most were common semantic components, but they also included shared graphic elements such as a dot or horizontal stroke. Some were even artificially extracted groups of strokes, termed glyphs by Serruys,[4] which never had an independent existence other than being listed in Shuowen. Each character was listed under only one element, which is then referred to as the radical for that character. For example, characters containing 女; nǚ; female or 木; mù; tree, wood are often grouped together in the sections for those radicals. Mei Yingzuo's 1615 dictionary Zihui made two further innovations.
He reduced the list of radicals to 214 and arranged characters under each radical in increasing order of the number of additional strokes—the radical-and-stroke method still used in the vast majority of present-day Chinese dictionaries. These innovations were also adopted by the more famous Kangxi Dictionary of 1716. Thus the standard 214 radicals introduced in the Zihui are usually known as the Kangxi radicals. These were first called bùshǒu (部首; section header) in the Kangxi Dictionary.[3] Although there is some variation in such lists – depending primarily on what secondary radicals are also indexed – these canonical 214 radicals of the Kangxi Dictionary still serve as the basis for most modern Chinese dictionaries. Some of the graphically similar radicals are combined in many dictionaries, such as 月; yuè; moon and the 月 form (⺼) of 肉; ròu; meat, flesh.
|
Latent image. A latent image is an invisible image produced by the exposure to light of a photosensitive material such as photographic film. When photographic film is developed, the area that was exposed darkens and forms a visible image. In the early days of photography, the nature of the invisible change in the silver halide crystals of the film's emulsion coating was unknown, so the image was said to be latent until the film was treated with photographic developer. In more physical terms, a latent image is a small cluster of metallic silver atoms formed in or on a silver halide crystal due to reduction of interstitial silver ions by photoelectrons (a photolytic silver cluster). If intense exposure continues, such photolytic silver clusters grow to visible sizes. This is called printing out the image. On the other hand, the formation of a visible image by the action of photographic developer is called developing out the image. The size of a silver cluster in the latent image can be as small as a few silver atoms. However, in order to act as an effective latent image center, at least four silver atoms are necessary. On the other hand, a developed silver grain can have billions of silver atoms. Therefore, photographic developer acting on the latent image is a chemical amplifier with a gain factor up to several billion. The development system was the most important technology that increased the photographic sensitivity in the history of photography. The action of the light on the silver halide grains within the emulsion forms sites of metallic silver in the grains. The basic mechanism by which this happens was first proposed by R. W. Gurney and N. F. Mott in 1938. The incoming photon liberates an electron, called a photoelectron, from a silver halide crystal. Photoelectrons migrate to a shallow electron trap site (a sensitivity site), where the electrons reduce silver ions to form a metallic silver speck.
A positive hole must also be generated, but it is largely ignored. Subsequent work has slightly modified this picture, so that hole trapping is also considered (Mitchell, 1957). Since then, understanding of the mechanism of sensitivity and latent image formation has been greatly improved. A latent image is formed when light changes the charge of atoms in the molecule. Taking bromine as the halide for this example, when light hits a silver halide molecule, the halide changes from a negative charge to a neutral one, releasing an electron that then changes the charge of the silver from positive to neutral.[1]
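The two charge-transfer steps just described, with bromide as the halide, can be written as a pair of reactions (standard Gurney–Mott notation, not taken verbatim from the source):

```latex
% A photon neutralizes the bromide ion, freeing a photoelectron:
\mathrm{Br^-} + h\nu \longrightarrow \mathrm{Br} + e^-
% The photoelectron reduces an interstitial silver ion to metallic silver:
\mathrm{Ag^+} + e^- \longrightarrow \mathrm{Ag}
```

Repeated at the same trap site, the second step builds up the few-atom silver cluster that constitutes the latent image center.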
|
Marketing. Marketing is the act of acquiring, satisfying and retaining customers.[3][4] It is one of the primary components of business management and commerce.[5] Marketing is usually conducted by the seller, typically a retailer or manufacturer. Products can be marketed to other businesses (B2B) or directly to consumers (B2C).[6] Sometimes tasks are contracted to dedicated marketing firms, like a media, market research, or advertising agency. Sometimes, a trade association or government agency (such as the Agricultural Marketing Service) advertises on behalf of an entire industry or locality, often a specific type of food (e.g. Got Milk?), food from a specific area, or a city or region as a tourism destination. Market orientations are philosophies concerning the factors that should go into market planning.[7] The marketing mix, which outlines the specifics of the product and how it will be sold, including the channels that will be used to advertise the product,[8][9] is affected by the environment surrounding the product,[10] the results of marketing research and market research,[11][12] and the characteristics of the product's target market.[13] Once these factors are determined, marketers must then decide what methods to use to promote the product,[6] including the use of coupons and other price inducements.[14]
|
Intaglio (printmaking). Intaglio (/ɪnˈtæli.oʊ, -ˈtɑːli-/ in-TAL-ee-oh, -TAH-lee-;[1] Italian: [inˈtaʎʎo]) is the family of printing and printmaking techniques in which the image is incised into a surface and the incised line or sunken area holds the ink.[2] It is the direct opposite of a relief print, where the parts of the matrix that make the image stand above the main surface. Normally copper, or in recent times zinc, sheets called plates are used as a surface or matrix, and the incisions are created by etching, engraving, drypoint, aquatint or mezzotint, often in combination.[3] Collagraphs may also be printed as intaglio plates.[4] After the decline of the main relief technique of woodcut around 1550, the intaglio techniques dominated both artistic printmaking and most types of illustration and popular prints until the mid-19th century. The word intaglio describes prints created from plates where the ink-bearing regions are recessed beneath the plate's surface. Though brass, zinc, and other materials are occasionally utilized, copper is the most common material for the plates.[5] In intaglio printing, the lines to be printed are cut into a metal (e.g. copper) plate by means either of a cutting tool called a burin, held in the hand—in which case the process is called engraving; or through the corrosive action of acid—in which case the process is known as etching.[6][7]
|
Bimetallism. Bimetallism,[a] also known as the bimetallic standard, is a monetary standard in which the value of the monetary unit is defined as equivalent to certain quantities of two metals, creating a fixed rate of exchange between them.[3] In all known historical cases, the metals are gold and silver. For scholarly purposes, proper bimetallism is sometimes distinguished as requiring that both gold and silver money be legal tender in unlimited amounts and that gold and silver may be taken to the government mints for coining in unlimited quantities.[4] This distinguishes it from limping standard bimetallism, where both gold and silver are legal tender but only one is freely coined (e.g. the monies of France, Germany, and the United States after 1873), and from trade bimetallism, where both metals are freely coined but only one is legal tender and the other is used as trade money (e.g. most monies in western Europe from the 13th to 18th centuries). Economists also distinguish legal bimetallism, where the law guarantees these conditions, and de facto bimetallism, where gold and silver coins circulate at a fixed rate. During the 19th century there was a great deal of scholarly debate and political controversy regarding the use of bimetallism in place of a gold standard or silver standard (monometallism).[5][6] Bimetallism was intended to increase the supply of money, stabilize prices, and facilitate setting exchange rates.[7] Some scholars argued that bimetallism was inherently unstable owing to Gresham's law, and that its replacement by a monometallic standard was inevitable. Other scholars claimed that in practice bimetallism had a stabilizing effect on economies. The controversy became largely moot after technological progress and the South African and Klondike Gold Rushes increased the supply of gold in circulation at the end of the century, ending most of the political pressure for greater use of silver.
It became completely academic after the 1971 Nixon shock; since then, all of the world's currencies have operated as more or less freely floating fiat money, unconnected to the value of silver or gold. Nonetheless, academics continue to debate, inconclusively, the relative use of the metallic standards.[b] From the 7th century BCE, Asia Minor, especially in the areas of Lydia and Ionia, is known to have created a coinage based on electrum, a naturally occurring material that is a variable mix of gold and silver (with about 54% gold and 44% silver). Before Croesus, his father Alyattes had already started to mint various types of non-standardized electrum coins. They were in use in Lydia and surrounding areas for about 80 years.[1] The unpredictability of its composition implied that it had a variable value which was very hard to determine, which greatly hampered its development.[1]
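Gresham's law, mentioned above as the source of bimetallism's alleged instability, can be sketched as a toy decision rule: whichever metal the mint undervalues relative to the bullion market tends to be hoarded or exported, leaving the other to dominate circulation. A minimal illustration in Python (France's 15.5:1 legal ratio is historical; the market ratios are hypothetical figures for the example):

```python
def circulating_metal(legal_ratio, market_ratio):
    """Toy Gresham's-law rule for a bimetallic standard.

    Ratios are expressed as ounces of silver per ounce of gold.
    The metal the mint undervalues relative to the market is worth
    more as bullion than as coin, so it leaves circulation.
    """
    if market_ratio > legal_ratio:
        return "silver"  # gold buys more silver abroad, so gold coin is exported
    if market_ratio < legal_ratio:
        return "gold"    # silver is worth more as metal than as coin
    return "both"        # ratios agree; both metals circulate
```

For example, with a legal ratio of 15.5 and a market ratio of 16.0, the rule predicts silver circulates while gold is hoarded; at 15.0 the prediction reverses.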
|
Film. A film, also known as a movie or motion picture,[a] is a work of visual art that simulates experiences and otherwise communicates ideas, stories, perceptions, emotions, or atmosphere through the use of moving images that are generally, since the 1930s, synchronized with sound and sometimes with other sensory stimulations.[1] Films are produced by recording actual people and objects with cameras or by creating them using animation techniques and special effects. They comprise a series of individual frames, but when these images are shown rapidly in succession, the illusion of motion is given to the viewer. Flickering between frames is not seen due to an effect known as persistence of vision, whereby the eye retains a visual image for a fraction of a second after the source has been removed. Also of relevance is what causes the perception of motion; a psychological effect identified as beta movement. Films are considered by many to be an important art form; films entertain, educate, enlighten and inspire audiences. The visual elements of cinema need no translation, giving the motion picture a universal power of communication. Any film can become a worldwide attraction, especially with the addition of dubbing or subtitles that translate the dialogue. Films are also artifacts created by specific cultures, which reflect those cultures, and, in turn, affect them.
|
Fantasia (musical form). A fantasia (Italian: [fantaˈziːa]; also English: fantasy, fancy, fantazy, phantasy, German: Fantasie, Phantasie, French: fantaisie) is a musical composition with roots in improvisation. The fantasia, like the impromptu, seldom follows the textbook rules of any strict musical form. The term was first applied to music during the 16th century, at first to refer to the imaginative musical idea rather than to a particular compositional genre. Its earliest use as a title was in German keyboard manuscripts from before 1520, and by 1536 it is found in printed tablatures from Spain, Italy, Germany, and France. From the outset, the fantasia had the sense of the play of imaginative invention, particularly in lute or vihuela composers such as Francesco Canova da Milano and Luis de Milán. Its form and style consequently range from the freely improvisatory to the strictly contrapuntal, and also encompass more or less standard sectional forms.[1] One of the most important composers in the development of the fantasia was Jan Pieterszoon Sweelinck. His greatest work in this style is the fantasia cromatica (a specific form called chromatic fantasia), which in many ways forms a link between the Renaissance and the Baroque. According to the Oxford Dictionary of Music, in the 16th century the instrumental fantasia was a strict imitation of the vocal motet.[2] Polyphonic solo fantasias were widely composed for the lute and early keyboard instruments. Composers such as William Byrd and Orlando Gibbons wrote many surviving keyboard fantasias, while also expanding the genre with outstanding examples for recorders and viols. In addition to Byrd and Gibbons, composers John Coprario, Alfonso Ferrabosco, Thomas Lupo, John Ward, and William White continued to expand the genre for viol consort, while examples by William Lawes, John Jenkins, William Cranford, Matthew Locke, and Henry Purcell from the late 17th century are regarded as highly exceptional.
The form expanded in scope during the Baroque period; J. S. Bach's Chromatic Fantasia and Fugue, BWV 903, for harpsichord, the Great Fantasia and Fugue in G minor, BWV 542, for organ, and the Fantasia and Fugue in C minor, BWV 537, for organ, are examples. Georg Philipp Telemann published Twelve Fantasias for Solo Flute in 1733, and Twelve Fantasias for Solo Violin and Twelve Fantasias for Viola da Gamba solo in 1735.
|
International Securities Identification Number. An International Securities Identification Number (ISIN) is a code that uniquely identifies a security globally for the purposes of facilitating clearing, reporting and settlement of trades. Its structure is defined in ISO 6166. The ISIN code is a 12-character alphanumeric code that serves for uniform identification of a security through normalization of the assigned National Number, where one exists, at trading and settlement. ISINs were first used in 1981, but did not reach wide acceptance until 1989, when the G30 countries recommended adoption.[1] The ISIN was endorsed a year later by ISO with the ISO 6166 standard. Initially information was distributed via CD-ROMs and this was later replaced by distribution over the internet.[citation needed] ISINs slowly gained traction worldwide and became the most popular global securities identifier. Trading, clearing and settlement systems in many countries adopted ISINs as a secondary measure of identifying securities. Some countries, mainly in Europe, moved to using the ISIN as their primary means of identifying securities. European regulations such as Solvency II Directive 2009 increasingly require the ISIN to be reported.[2]
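The 12th character of an ISIN is a check digit: letters are first expanded to two-digit numbers (A = 10 … Z = 35) and the resulting digit string must pass the Luhn checksum. A minimal validation sketch in Python (a simplified illustration of the ISO 6166 scheme, not a production validator):

```python
def isin_is_valid(isin: str) -> bool:
    """Validate a 12-character ISIN via its Luhn check digit."""
    if len(isin) != 12 or not isin[:2].isalpha() or not isin.isalnum():
        return False
    # Expand each character to its base-36 value: '0'-'9' -> 0-9, 'A' -> 10, ... 'Z' -> 35.
    digits = "".join(str(int(c, 36)) for c in isin.upper())
    # Luhn checksum: double every second digit from the right,
    # subtracting 9 when the doubled digit exceeds 9.
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0
```

For instance, `isin_is_valid("US0378331005")` (Apple's ISIN) returns True, while changing any character makes the checksum fail.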
|
Numeral prefix. Numeral or number prefixes are prefixes derived from numerals or occasionally other numbers. In English and many other languages, they are used to coin numerous series of words. For example: In many European languages there are two principal systems, taken from Latin and Greek, each with several subsystems; in addition, Sanskrit occupies a marginal position.[B] There is also an international set of metric prefixes, which are used in the world's standard measurement system. In the following prefixes, a final vowel is normally dropped before a root that begins with a vowel, with the exception of bi-, which is extended to bis- before a vowel; the other monosyllables du-, di-, dvi-, and tri- never vary. Words in the cardinal category are cardinal numbers, such as the English one, two, three, which name the count of items in a sequence. The multiple category consists of adverbial numbers, like the English once, twice, thrice, that specify the number of events or instances of otherwise identical or similar items. Enumeration with the distributive category originally was meant to specify one each, two each or one by one, two by two, etc., giving how many items of each type are desired or had been found, although distinct word forms for that meaning are now mostly lost. The ordinal category is based on ordinal numbers such as the English first, second, third, which specify position of items in a sequence. In Latin and Greek, the ordinal forms are also used for fractions for amounts higher than 2; only the fraction 1/2 has special forms. The same suffix may be used with more than one category of number, as for example the ordinal numbers secondary and tertiary and the distributive numbers binary and ternary.
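The vowel-elision rule described above can be sketched as a toy word-joining function. This is a deliberate simplification of the rule as stated in the text; real word formation has many more exceptions than the ones encoded here:

```python
VOWELS = set("aeiou")

def join_prefix(prefix: str, root: str) -> str:
    """Join a numeral prefix to a root, applying the elision rule:
    a prefix's final vowel is normally dropped before a vowel-initial
    root; 'bi-' instead becomes 'bis-'; and the monosyllables du-, di-,
    dvi-, and tri- never vary. (Simplified illustration only.)"""
    if prefix == "bi" and root[0] in VOWELS:
        return "bis" + root
    if prefix in ("du", "di", "dvi", "tri"):
        return prefix + root
    if prefix[-1] in VOWELS and root[0] in VOWELS:
        return prefix[:-1] + root
    return prefix + root
```

So penta- + -ose yields "pentose" and mono- + -oxide yields "monoxide", while tri- + -angle keeps its vowel and yields "triangle".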
|
Alloy. An alloy is a mixture of chemical elements of which in most cases at least one is a metallic element, although the term is also sometimes used for other mixtures of elements; herein only metallic alloys are described. Metallic alloys often have properties that differ from those of the pure elements from which they are made. The vast majority of metals used for commercial purposes are alloyed to improve their properties or behavior, such as increased strength, hardness or corrosion resistance. Metals may also be alloyed to reduce their overall cost, for instance alloys of gold and copper. A typical example of an alloy is 304 grade stainless steel, which is commonly used for kitchen utensils, pans, knives and forks. Sometimes also known as 18/8, it is an alloy consisting broadly of 74% iron, 18% chromium and 8% nickel. The chromium and nickel alloying elements add strength and hardness to the majority iron element, but their main function is to make it resistant to rust/corrosion. In an alloy, the atoms are joined by metallic bonding rather than by the covalent bonds typically found in chemical compounds.[1] The alloy constituents are usually measured by mass percentage for practical applications, and in atomic fraction for basic science studies. Alloys are usually classified as substitutional or interstitial alloys, depending on the atomic arrangement that forms the alloy. They can be further classified as homogeneous (consisting of a single phase), heterogeneous (consisting of two or more phases), or intermetallic. An alloy may be a solid solution of metal elements (a single phase, where all metallic grains (crystals) are of the same composition) or a mixture of metallic phases (two or more solutions, forming a microstructure of different crystals within the metal).
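The mass-percentage versus atomic-fraction distinction above is a straightforward conversion: divide each mass fraction by the element's molar mass, then renormalize. A sketch using the 18/8 stainless steel figures from the text (standard atomic weights in g/mol are assumed values for the illustration):

```python
def mass_to_atomic_pct(mass_pct, molar_mass):
    """Convert mass percentages to atomic percentages:
    at%_i = (w_i / M_i) / sum_j (w_j / M_j) * 100
    """
    moles = {el: mass_pct[el] / molar_mass[el] for el in mass_pct}
    total = sum(moles.values())
    return {el: 100 * n / total for el, n in moles.items()}

# 304 (18/8) stainless steel by mass, with standard atomic weights.
composition = {"Fe": 74.0, "Cr": 18.0, "Ni": 8.0}
molar = {"Fe": 55.845, "Cr": 51.996, "Ni": 58.693}
atomic = mass_to_atomic_pct(composition, molar)
```

Because chromium is lighter than iron and nickel, its atomic percentage (about 19%) comes out slightly higher than its mass percentage.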
|
Watermark. A watermark is an identifying image or pattern in paper that appears as various shades of lightness/darkness when viewed by transmitted light (or when viewed by reflected light, atop a dark background), caused by thickness or density variations in the paper.[1] Watermarks have been used on postage stamps, currency, and other government documents to discourage counterfeiting. There are two main ways of producing watermarks in paper: the dandy roll process, and the more complex cylinder mould process. Watermarks vary greatly in their visibility; while some are obvious on casual inspection, others require some study to pick out. Various aids have been developed, such as watermark fluid that wets the paper without damaging it. A watermark is very useful in the examination of paper because it can be used for dating documents and artworks, identifying sizes, mill trademarks and locations, and determining the quality of a sheet of paper. The word is also used for digital practices that share similarities with physical watermarks. In one case, overprint on computer-printed output may be used to identify output from an unlicensed trial version of a program. In another instance, identifying codes can be encoded as a digital watermark for a music, video, picture, or other file. An artist may also add an identifying digital signature, graphic, or logo to their digital artworks as an identifier or anti-counterfeit measure. Watermarks were first introduced in Fabriano, Italy, in 1282.[2] At the time, watermarks were created by changing the thickness of paper during a stage in the manufacturing process when it was still wet. Traditionally, a watermark was made by impressing a water-coated metal stamp onto the paper during manufacturing. The invention of the dandy roll in 1826 by John Marshall revolutionised the watermark process and made it easier for producers to watermark their paper.
The dandy roll is a light roller covered by material similar to window screen that is embossed with a pattern. Faint lines are made by laid wires that run parallel to the axis of the dandy roll, and the bold lines are made by chain wires that run around the circumference to secure the laid wires to the roll from the outside. Because the chain wires are located on the outside of the laid wires, they have a greater influence on the impression in the pulp, hence their bolder appearance than the laid wire lines.
|
EURion constellation. The EURion constellation (also known as Omron rings or doughnuts[1]) is a pattern of symbols incorporated into a number of secure documents such as banknotes, cheques, and ownership title certificate designs worldwide since about 1996. It is added to help imaging software detect the presence of such a document in a digital image. Such software can then block the user from reproducing such documents to prevent counterfeiting using colour photocopiers. The name EURion constellation was coined by security researcher Markus Kuhn, who uncovered the pattern on the 10-euro banknote in early 2002 while experimenting with a Xerox colour photocopier that refused to reproduce banknotes.[2] The pattern has never been mentioned officially; Kuhn named it the EURion constellation as it resembled the astronomical Orion constellation, and EUR is the ISO 4217 designation of the euro currency.[3] The EURion constellation first described by Kuhn consists of a pattern of five small yellow, green or orange circles, which is repeated across areas of the banknote at different orientations. The mere presence of five of these circles on a page is sufficient for some colour photocopiers to refuse processing. Some banks integrate the constellation tightly with the remaining design of the note. On 50 DM German banknotes, the EURion circles formed the innermost circles in a background pattern of fine concentric circles. On the front of former Bank of England Elgar £20 notes, they appear as green heads of musical notes; however, on the Smith £20 notes of 2007 the circles merely cluster around the £20 text. On some U.S. bills, they appear as the digit zero in small, yellow numbers matching the value of the note. On Japanese yen, these circles sometimes appear as flowers.
|
Aomori Prefecture. Aomori Prefecture (青森県, Aomori-ken; Japanese pronunciation: [a.oꜜ.mo.ɾʲi, a.o.mo.ɾʲiꜜ.keɴ]) (hiragana: あおもりけん)[3][4] is a prefecture of Japan in the Tōhoku region. The prefecture's capital, largest city, and namesake is the city of Aomori. Aomori is the northernmost prefecture on Japan's main island, Honshu, and is bordered by the Pacific Ocean to the east, Iwate Prefecture to the southeast, Akita Prefecture to the southwest, the Sea of Japan to the west, and Hokkaido across the Tsugaru Strait to the north. Aomori Prefecture is the 8th-largest prefecture, with an area of 9,645.64 km2 (3,724.20 sq mi), and the 31st-most populous prefecture, with more than 1.18 million people. Mount Iwaki, an active stratovolcano, is the prefecture's highest point, at 1,624.7 m (5,330 ft). Aomori is the third-most populous prefecture in the Tōhoku region. Humans have inhabited the prefecture for at least 15,000 years, and the oldest evidence of pottery in Japan was discovered at the Jōmon period Odai Yamamoto I site. After centuries of rule by the Nanbu and Tsugaru clans, the prefecture was formed out of the northern part of Mutsu Province during the Meiji Restoration. The entire Tōhoku region, including Aomori Prefecture, experienced significant growth in population and economy until the late 20th century, when a significant population decline began. Though the prefecture remains dominated by primary sector industries, especially apple orchards, it also serves as a transportation and logistics hub due to its strategic location at the northern end of Honshu. It is also the largest producer in Japan of the superfood black garlic. Tourism is also a significant part of the prefecture's economy because of its natural beauty and historical sites, especially the Jōmon Prehistoric Sites in Northern Japan and Shirakami-Sanchi World Heritage Sites.
Aomori Prefecture has a distinctive subculture influenced by its location relative to the central government of Japan at the northern edge of Honshu and the region's long, snowy winters. It is the birthplace of the traditional Tsugaru-jamisen, a virtuosic style of playing shamisen. Embroidery, pottery, lacquerware, cabinetry, and iron working are also significant crafts in the prefecture. Various porridges and soups are distinctive to the area. Several festivals are held in Aomori Prefecture, the most noted of which is the Aomori Nebuta Matsuri. The prefecture's most significant writer is novelist Osamu Dazai. Tadamori Ōshima is a politician from the prefecture who held several high-level positions in the national government. Aomori Prefecture is home to several association football, baseball, basketball, and ice hockey teams. The prefecture's religious beliefs are heavily rooted in Shinto and Buddhism, with distinctive features including its blind mediums and an alleged tomb of Jesus. Aomori literally means blue forest, although it could possibly be translated as green forest. The name most likely refers to a small forest on a hill that existed near Aomori City, which was often used by fishermen as a landmark.[5][6][7][8] The oldest evidence of pottery in Japan was found at the Odai Yamamoto I site in the town of Sotogahama in the northwestern part of the prefecture. The relics found there suggest that the Jōmon period began about 15,000 years ago.[9] By 7,000 BCE, fishing cultures had developed along the shores of the prefecture, which were then three metres higher than the present-day shoreline.[10] Around 3,900 BCE, the settlement at the Sannai-Maruyama Site in the present-day city of Aomori began.[11] The settlement shows evidence of the wide interaction between the site's inhabitants and people from across Jōmon period Japan, including Hokkaido and Kyushu.[9] The settlement of Sannai-Maruyama ended around 2300 BCE for unknown reasons.
Its abandonment was likely due to the population's subsistence economy being unable to sustain growth, with its end spurred on by the reduced availability of natural resources during the neoglaciation.[12] The Jōmon period continued up to 300 BCE in present-day Aomori Prefecture at the Kamegaoka site in the city of Tsugaru, where the Shakōkidogū was found.[9]
|
TOPIX. The Tokyo Stock Price Index (東証株価指数, Tōshō Kabuka shisū), commonly known as the TOPIX, is an important stock market index for the Tokyo Stock Exchange (TSE) in Japan, along with the Nikkei 225. The TOPIX tracks the entire market of domestic companies and covers most stocks in the Prime market and some stocks in the Standard market. It is calculated and published by the TSE. As of January 2025, the index is planned to comprise 1,716 companies, after about 400 stocks with low liquidity were phased out following the TSE reform in 2022. The index transitioned from a system where a company's weighting is based on the total number of shares outstanding to a weighting based on the number of shares available for trading (called the free float). This transition started in October 2005 and was completed in June 2006. Although the change is a technicality, it had a significant effect on the weighting of many companies in the index, because many companies in Japan hold a significant number of shares of their business partners as a part of intricate business alliances, and such shares are no longer included in calculating the weight of companies in the index. The TOPIX index is traded as a future on the Osaka Exchange under the ticker symbol JTPX.[1] The CQG contract specifications for the TOPIX Index are listed below. TSE currently calculates and distributes TOPIX every second and further plans to launch a new High-Speed Index dissemination service provided at the millisecond level starting from February 28, 2011.[needs update]
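The effect of the free-float transition described above can be illustrated with a toy calculation (all figures below are hypothetical, not real TOPIX data): excluding cross-held shares shrinks the index weight of companies whose shares are largely held by business partners.

```python
# Hypothetical two-company universe: (share price, shares outstanding,
# free-float ratio, i.e. the fraction of shares available for trading).
companies = {
    "A": (1000, 1_000_000, 0.9),  # few cross-held shares
    "B": (1000, 1_000_000, 0.5),  # half the shares held by business partners
}

def weights(universe, use_free_float=True):
    """Index weight = (adjusted) market cap / total (adjusted) market cap."""
    caps = {}
    for name, (price, shares, ff) in universe.items():
        caps[name] = price * shares * (ff if use_free_float else 1.0)
    total = sum(caps.values())
    return {name: cap / total for name, cap in caps.items()}

full_cap = weights(companies, use_free_float=False)  # both companies at 50%
free_float = weights(companies)                      # A's weight rises, B's falls
```

Under full market-cap weighting both companies carry 50%; after the free-float adjustment, company A's weight rises to about 64% while B's falls to about 36%, mirroring the reweighting the TSE carried out in 2005–2006.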
|
Fashion. Fashion is a term used interchangeably to describe the creation of clothing, footwear, accessories, cosmetics, and jewellery of different cultural aesthetics and their mix and match into outfits that depict distinctive ways of dressing (styles and trends) as signifiers of social status, self-expression, and group belonging. As a multifaceted term, fashion describes an industry, designs, aesthetics, and trends. The term fashion originates from the Latin word facere, which means to make, and describes the manufacturing, mixing, and wearing of outfits adorned with specific cultural aesthetics, patterns, motifs, shapes, and cuts, allowing people to showcase their group belonging, values, meanings, beliefs, and ways of life. Given the rise in mass production of commodities and clothing at lower prices and global reach, reducing fashion's environmental impact and improving sustainability has become an urgent issue among politicians, brands, and consumers.[1][2] The French word mode, meaning fashion, dates as far back as 1482, while the English word denoting something in style dates only to the 16th century. Other words exist related to concepts of style and appeal that precede mode. In 12th- and 13th-century Old French the concept of elegance begins to appear in the context of aristocratic preferences to enhance beauty and display refinement, and cointerie, the idea of making oneself more attractive to others by style or artifice in grooming and dress, appears in a 13th-century poem by Guillaume de Lorris advising men that handsome clothes and handsome accessories improve a man a great deal.[3] Fashion scholar Susan B. Kaiser states that everyone is forced to appear, unmediated before others.[4] Everyone is evaluated by their attire, and evaluation includes the consideration of colors, materials, silhouette, and how garments appear on the body.
Garments identical in style and material also appear different depending on the wearers body shape, or whether the garment has been washed, folded, mended, or is new.
|
Holography. Holography is a technique that allows a wavefront to be recorded and later reconstructed. It is best known as a method of generating three-dimensional images, and has a wide range of other uses, including data storage, microscopy, and interferometry. In principle, it is possible to make a hologram for any type of wave. A hologram is a recording of an interference pattern that can reproduce a 3D light field using diffraction. In general usage, a hologram is a recording of any type of wavefront in the form of an interference pattern. It can be created by capturing light from a real scene, or it can be generated by a computer, in which case it is known as a computer-generated hologram, which can show virtual objects or scenes. Optical holography needs a laser light to record the light field. The reproduced light field can generate an image that has the depth and parallax of the original scene.[1] A hologram is usually unintelligible when viewed under diffuse ambient light. When suitably lit, the interference pattern diffracts the light into an accurate reproduction of the original light field, and the objects that were in it exhibit visual depth cues such as parallax and perspective that change realistically with the different angles of viewing. That is, the view of the image from different angles shows the subject viewed from similar angles. A hologram is traditionally generated by overlaying a second wavefront, known as the reference beam, onto a wavefront of interest. This generates an interference pattern, which is then captured on a physical medium. When the recorded interference pattern is later illuminated by the second wavefront, it is diffracted to recreate the original wavefront.[2] The 3D image from a hologram can often be viewed with non-laser light. However, in common practice, major image quality compromises are made to remove the need for laser illumination to view the hologram. 
A computer-generated hologram is created by digitally modeling and combining two wavefronts to generate an interference pattern image. This image can then be printed onto a mask or film and illuminated with an appropriate light source to reconstruct the desired wavefront.[2] Alternatively, the interference pattern image can be directly displayed on a dynamic holographic display.[3]
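The recording step described above, adding a reference wavefront to an object wavefront and storing the resulting intensity, can be sketched numerically. A minimal one-dimensional illustration with a plane reference wave and a point-source object wave (the wavelength and geometry are arbitrary illustrative choices, not a model of any particular holographic setup):

```python
import cmath
import math

WAVELENGTH = 633e-9            # a common red laser line, chosen for illustration
K = 2 * math.pi / WAVELENGTH   # wavenumber

def intensity_at(x, source_x=0.0, source_z=0.05):
    """Interference intensity of a plane reference wave and a spherical
    object wave, evaluated at position x on the recording plane (z = 0)."""
    reference = cmath.exp(1j * 0.0)          # plane wave at normal incidence
    r = math.hypot(x - source_x, source_z)   # distance from the point object
    obj = cmath.exp(1j * K * r)              # spherical-wave phase, unit amplitude
    # The hologram records |reference + object|^2 -- the interference
    # pattern, not the object wave itself; re-illuminating the pattern
    # with the reference wave diffracts it back into the object wave.
    return abs(reference + obj) ** 2

# Sample the fringe pattern across 200 microns of the recording plane.
pattern = [intensity_at(i * 1e-6) for i in range(200)]
```

The sampled intensities oscillate between constructive and destructive interference; it is this fringe structure, far finer than any feature of the object, that the recording medium must resolve.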
|
Fantasie (Widmann). Fantasie for Solo Clarinet is a solo instrumental work by Jörg Widmann and was composed in 1993. It has a Harlequin spirit.[2] The Three Pieces for Solo Clarinet and Dialogue de l'ombre double were used as the basis.[3] The piece was premiered by the composer on 1 March 1994 at Bayerischer Rundfunk in Munich.[1] The work is an expression of virtuoso flourishes and youthful exuberance.[4] The composer cites The Rite of Spring in Fantasie.[5] Sections:[6]
|
Fantasie in C (Schumann). The Fantasie in C, Op. 17, was written by Robert Schumann in 1836. It was revised prior to publication in 1839, when it was dedicated to Franz Liszt. It is generally described as one of Schumann's greatest works for solo piano, and is one of the central works of the early Romantic period. It is often called by the Italian version, fantasia; the word Fantasie is the German spelling. The Fantasie is in loose sonata form. Its three movements are headed: The first movement is rhapsodic and passionate; the middle movement is a grandiose rondo based on a majestic march, with episodes that recall the emotion of the first movement; and the finale is slow and meditative. The piece has its origin in early 1836, when Schumann composed a piece entitled Ruines expressing his distress at being parted from his beloved Clara Wieck (later to become his wife). This later became the first movement of the Fantasie.[1] Later that year, he wrote two more movements to create a work intended as a contribution to the appeal for funds to erect a monument to Beethoven in his birthplace, Bonn. Schumann offered the work to the publisher Kistner, suggesting that 100 presentation copies could be sold to raise money for the monument. Other contributions to the Beethoven monument fund included Mendelssohn's Variations sérieuses.[2] The original title of Schumann's work was Obolen auf Beethovens Monument: Ruinen, Trophaen, Palmen, Grosse Sonate f.d. Pianof. Für Beethovens Denkmal. Kistner refused, and Schumann tried offering the piece to Haslinger in January 1837. When Haslinger also refused, he offered it to Breitkopf & Härtel in May 1837. The movements' subtitles (Ruins, Trophies, Palms) became Ruins, Triumphal Arch, and Constellation, and were then removed altogether before Breitkopf & Härtel eventually issued the Fantasie in May 1839.[3]
|
Coin. A coin is a small object, usually round and flat, used primarily as a medium of exchange or legal tender. They are standardized in weight and produced in large quantities at a mint in order to facilitate trade. They are most often issued by a government. Coins often have images, numerals, or text on them. The faces of coins or medals are sometimes called the obverse and the reverse, referring to the front and back sides, respectively. The obverse of a coin is commonly called heads, because it often depicts the head of a prominent person, and the reverse is known as tails. The first metal coins – invented in the ancient Greek world and disseminated during the Hellenistic period – were precious metal–based, and were invented in order to simplify and regularize the task of measuring and weighing bullion (bulk metal) carried around for the purpose of transactions. They carried their value within the coins themselves, but the stampings also induced manipulations, such as the clipping of coins to remove some of the precious metal.[1] Most modern coinage metals are base metal, and their value comes from their status as fiat money — the value of the coin is established by law. In the last hundred years, the face value of circulated coins has occasionally been lower than the value of the metal they contain, primarily due to inflation. If the difference becomes significant, the issuing authority may decide to withdraw these coins from circulation, possibly issuing new equivalents with a different composition, or the public may decide to melt the coins down or hoard them (see Gresham's law). Currently coins are used as money in everyday transactions, circulating alongside banknotes. Usually, the highest value coin in circulation (excluding bullion coins) is worth less than the lowest-value note.
Coins are usually more efficient than banknotes because they last longer: banknotes last only about four years, compared with 30 years for a coin.[2][3] Exceptions to the rule of face value being higher than content value currently occur for bullion coins made of copper, silver, or gold (and rarely other metals, such as platinum or palladium), intended for collectors or investors in precious metals. Examples of modern gold collector/investor coins include the British sovereign minted by the United Kingdom, the American Gold Eagle minted by the United States, the Canadian Gold Maple Leaf minted by Canada, and the Krugerrand, minted by South Africa. While the Eagle and Sovereign coins have nominal (purely symbolic) face values, the Krugerrand does not. Commemorative coins usually serve as collectors' items only, although some countries also issue commemorative coins for regular circulation, such as the €2 commemorative coins and U.S. America the Beautiful quarters. Early metal coinage came into use about the time of the Axial Age in West Asia, in the Greek world, in northern India, and in China.[4]
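The face-value versus metal-value comparison above is simple arithmetic: a coin's melt value is its weight times its fineness times the metal's spot price. A sketch with hypothetical figures (the coin specification and spot price below are assumptions for illustration, not market data):

```python
def melt_value(weight_troy_oz, fineness, spot_price_per_oz):
    """Value of a coin's metal content: weight x purity x spot price."""
    return weight_troy_oz * fineness * spot_price_per_oz

# A hypothetical bullion-style gold coin: 1 troy oz of 91.67%-fine gold,
# at an assumed spot price of $2,000/oz, carrying a symbolic $50 face value.
content = melt_value(1.0, 0.9167, 2000.0)
face = 50.0
trades_on_metal_value = content > face
```

With these assumed numbers the metal content is worth roughly $1,833, dwarfing the symbolic face value, which is why such coins trade on their bullion content rather than their denomination.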
|
Fantasy (group). Fantasy is an urban pop vocal group based in New York City who scored several hits on the Hot Dance Music/Club Play chart, including You're Too Late, which hit number one in 1981.[1] Group members included Ken Roberson, Tamm E Hunt, Rufus Jackson and Carolyn Edwards. The group's producer, Tony Valor, continued to use the name in 1985 when they released an Italo disco-influenced single called He's My Number One. You're Too Late was a number-one dance hit in the United States, with a five-week reign at the top of the Billboard Hot Dance Club Play chart in early 1981.[1] It also reached the top 30 on the Soul Singles chart.[2] In 1982, the band released a pop-soul number entitled Hold On Tight, which peaked at number 35 on the Dance Club chart and at number one in the 1982 Billboard year-end rankings, followed by Live the Life I Love, a boogie song that reached number 41 on the same chart in 1983.
|
Metal. A metal (from Ancient Greek μέταλλον (métallon) mine, quarry, metal) is a material that, when polished or fractured, shows a lustrous appearance, and conducts electricity and heat relatively well. These properties are all associated with having electrons available at the Fermi level, as against nonmetallic materials, which do not.[1]: Chpt 8 & 19 [2]: Chpt 7 & 8 Metals are typically ductile (can be drawn into a wire) and malleable (can be shaped via hammering or pressing).[3] A metal may be a chemical element such as iron; an alloy such as stainless steel; or a molecular compound such as polymeric sulfur nitride.[4] The general science of metals is called metallurgy, a subtopic of materials science; aspects of the electronic and thermal properties also fall within the scope of condensed matter physics and solid-state chemistry, making it a multidisciplinary topic. In colloquial use, materials such as steel alloys are referred to as metals, while others such as polymers, wood or ceramics are nonmetallic materials. A metal conducts electricity at a temperature of absolute zero,[5] which is a consequence of delocalized states at the Fermi energy.[1][2] Many elements and compounds become metallic under high pressures; for example, iodine gradually becomes a metal at a pressure of between 40 and 170 thousand times atmospheric pressure. When discussing the periodic table and some chemical properties, the term metal is often used to denote those elements which in pure form and at standard conditions are metals in the sense of electrical conduction mentioned above. The related term metallic may also be used for types of dopant atoms or alloying elements.
|
United Principalities of Moldavia and Wallachia. The United Principalities of Moldavia and Wallachia (Romanian: Principatele Unite ale Moldovei și Țării Românești),[2] commonly called United Principalities or Wallachia and Moldavia, was the personal union of the Principality of Moldavia and the Principality of Wallachia. The union was formed on 5 February [O.S. 24 January] 1859 when Alexandru Ioan Cuza was elected as the Domnitor (Ruling Prince) of both principalities. Their separate autonomous vassalage in the Ottoman Empire continued with the unification of both principalities. On 3 February [O.S. 22 January] 1862, Moldavia and Wallachia formally united to create the Romanian United Principalities, the core of the Romanian nation state.[3][4] In February 1866, Prince Cuza was forced to abdicate and go into exile by a political coalition led by the Liberals; the German prince Karl of Hohenzollern-Sigmaringen was offered the throne and, on 22 May [O.S. 10 May] 1866, he entered Bucharest for the first time. In July the same year, a new constitution came into effect, giving the country the name of Romania; internationally, this name was used only after 1877, since at the time it shared a common foreign policy with the Ottoman Empire. Nominally, the new state remained a vassal of the Ottoman Empire. However, by this time the suzerainty of the Sublime Porte had become a legal fiction. Romania had its own flag and anthem; after 1867, it had its own currency as well. On 21 May [O.S. 9 May] 1877, Romania proclaimed itself fully independent; the proclamation was sanctioned by the Domnitor the following day. Four years later, on 22 May [O.S. 10 May] 1881, the 1866 constitution was modified, Romania became a kingdom, and Domnitor Carol I was crowned as the first king of Romania. After the First World War, Transylvania and other territories were also included.
For its triple symbolic meaning, the date of May 10 was celebrated as Romania's National Day until 1948, when the Communist regime installed the republic. As a historical term designating the pre-Union Principalities of Moldavia and Wallachia, sometimes including the Principality of Transylvania, the term Romanian Principalities dates back to the beginnings of modern Romanian history in the mid-19th century.[5] It was subsequently used by Romanian historians as an alternative to the much older term Romanian Lands. English use of Romanian Principalities is documented from the second half of the 19th century.
|
Ioan Slavici. Ioan Slavici (Romanian pronunciation: [iˈo̯an ˈslavitʃʲ]; 18 January 1848 – 17 August 1925) was a Romanian writer and journalist from Austria-Hungary, later Romania. He made his debut in Convorbiri literare (Literary Conversations) (1871), with the comedy Fata de birău (The Mayor's Daughter). Alongside Mihai Eminescu he founded the Young Romania Social and Literary Academic Society and organized, in 1871, the Putna Celebration of the Romanian Students from Romania and from abroad. At the end of 1874, he settled in Bucharest, where he became secretary of the Hurmuzachi Collection Committee, then a professor, and then an editor of the newspaper Timpul (The Time). Alongside Ion Luca Caragiale and George Coșbuc, he edited the Vatra (The Hearth) magazine. During World War I, he contributed to the newspapers Ziua (The Day) and Gazeta Bucureștilor (The Bucharest Gazette). He was awarded the Romanian Academy Award (1903). Slavici was born in the village of Világos (today Șiria, Romania), near Arad, in 1848 to Sava and Elena Slavici. Slavici studied at the local Orthodox school in Șiria and at various other institutions in Transylvania, which taught in either Hungarian or German, and became a member of the Romanian Lecture Society.[1] After finishing his studies, Slavici enrolled to study law first in Budapest in 1868, and then in Vienna in 1869. Shortly after, due to financial difficulties, he was forced to return to Cumlaus and take a job as a notary public.[2] Throughout his employment, Slavici saved the money that would help him continue his studies in Vienna.
|
Convorbiri Literare. Convorbiri Literare (lit. Literary Talks) is a Romanian literary magazine published in Romania. It is among the most important journals of nineteenth-century Romania.[1] Convorbiri Literare was founded by Titu Maiorescu in 1867.[2][3][4] The magazine was the organ of the Junimea group, a literary society which was established in 1864.[4][5] The group consisted of aristocratic Moldovans, with the exception of Titu Maiorescu.[4] The magazine was first headquartered in Iaşi and later moved to Bucharest.[5] Convorbiri Literare is published monthly[4] by the Convorbiri Literare publishing house.[6] The magazine carried art reviews[7] and translations of literary works.[3] From 1906 the magazine also featured articles on plastic arts.[7] Contributors included Alexandru Tzigara-Samurcaș and Apcar Baltazar, among others.[7] Other significant contributors were Mihai Eminescu, Ion Creangă and Ion Luca Caragiale.[8] Convorbiri Literare has a conservative stance,[5] and its literary rival was the socialist Contemporanul during the communist regime in Romania.[4]
|
Fantasy Records. Fantasy Records is an American independent record label company founded by brothers Max and Sol Stanley Weiss in 1949. The early years of the company were dedicated to issuing recordings by jazz pianist Dave Brubeck, who was also one of its investors, but in more recent years the label has been known for its recordings of comedian Lenny Bruce, jazz pianist Vince Guaraldi, the last recordings made on the Wurlitzer organ in the San Francisco Fox Theatre before the theatre was demolished, organist Korla Pandit, the 1960s rock band Creedence Clearwater Revival,[1] bandleader Woody Herman, and disco and R&B singer Sylvester. In 1949, Jack Sheedy, owner of a San Francisco-based record label called Coronet (not to be confused with either the Australian Coronet Records or the New York City-based Coronet Records of the late 1950s), was talked into making the first recording of an octet and a trio featuring Dave Brubeck. Sheedy's Coronet Records had recorded area Dixieland bands. But he was unable to pay his bills, and in 1949 he turned his masters over to a pressing company, the Circle Record Company, which was owned by Max and Sol Weiss.[2] The Weiss brothers changed the name of their business to Fantasy Records and met an increasing demand for Brubeck's music by recording and issuing new records. The company was soon shipping 40,000 to 50,000 copies of Brubeck records per quarter.[3] When Brubeck signed with Fantasy, he believed he had a 50 percent interest in the company. He worked as an unofficial artists and repertoire (A&R) assistant, encouraging the Weiss brothers to sign Gerry Mulligan, Chet Baker, and Red Norvo. When he discovered that all he owned was 50 percent of his own recordings, he signed with Columbia Records.[4] Fantasy was known for its unique colored-vinyl pressings. Monaural records were pressed in red vinyl, while stereo records were pressed in blue. Later stereo pressings were red vinyl with a blue label.
Eventually the company switched to black vinyl for all pressings, and the label design went through several revisions as well. In 1955, Saul Zaentz joined the company. Jazz musician Charles Mingus gave Debut Records to Zaentz as a wedding gift; at the time, Zaentz was marrying Mingus's ex-wife, Celia, who had helped found Debut with Mingus and musician Max Roach. After an unsuccessful attempt by Audio Fidelity Records to buy Fantasy,[5] Zaentz became president in 1967. He and a group of investors bought Fantasy from the Weiss brothers that year.[6] He then acquired Prestige Records (1971), Riverside (1972), and Milestone (1972).[1][7]
|
Moros intrepidus. Moros is a genus of small tyrannosauroid theropod dinosaur that lived during the Late Cretaceous period in what is now Utah. It contains a single species, M. intrepidus.[1] Moros represents some of the earliest known diagnostic tyrannosauroid material from North America.[1] Moros was first discovered at the Stormy Theropod site located in Emery County in the U.S. state of Utah. Palaeontologists had been researching the area for ten years when, in 2013, limb bones were seen jutting out of a hillside, prompting the excavation.[2] The bones were described as belonging to a new species in February 2019.[3] The type species, Moros intrepidus, was named and described by Lindsay E. Zanno, Ryan T. Tucker, Aurore Canoville, Haviv M. Avrahami, Terry A. Gates, and Peter J. Makovicky. The generic name is derived from the Greek term Moros (an embodiment of impending doom), in reference to the establishment of the tyrannosauroid lineage in North America that would come to dominate the continent by the end of the Cretaceous. The specific name is the Latin word intrepidus (intrepid), referring to the hypothesized dispersal of tyrannosauroids from Asia throughout North America following the arrival of Moros.[1] The holotype specimen, NCSM 33392, was found in the lower Mussentuchit Member of the Cedar Mountain Formation, dating from the Cenomanian age. The layer has a maximum age of 96.4 million years. The holotype consists of a right leg, specifically the thighbone, shinbone, second and fourth metatarsals, and the third and fourth phalanges of the fourth toe. Lines of arrested growth, or LAGs, indicate that it represents a subadult individual six or seven years old, nearing its maximum size.
Additionally, two premaxillary teeth were referred to the species, specimens NCSM 33393 and NCSM 33276.[1] Moros was a small-bodied, cursorial tyrannosauroid with an estimated leg length of 1.2 m (3.9 ft) and a body mass of 78 kg (172 lb).[1] The foot bones of Moros were extremely slender, with metatarsal proportions found to be more similar to ornithomimids than to other Late Cretaceous tyrannosauroids.[1]
|
Morus (plant). Morus, a genus of flowering plants in the family Moraceae, consists of 19 species of deciduous trees commonly known as mulberries, growing wild and under cultivation in many temperate world regions.[1][2][3][4] Generally, the genus has 64 subordinate taxa,[5] though the three most common are referred to as white, red, and black, originating from the color of their dormant buds and not necessarily the fruit color (Morus alba, M. rubra, and M. nigra, respectively), with numerous cultivars and some taxa currently unchecked and awaiting taxonomic scrutiny.[6][5] M. alba is native to South Asia, but is widely distributed across Europe, Southern Africa, South America, and North America.[2] M. alba is also the species most preferred by the silkworm. It is regarded as an invasive species in Brazil, the United States and some states of Australia.[2][7] The closely related genus Broussonetia is also commonly known as mulberry, notably the paper mulberry (Broussonetia papyrifera).[8]
|
Gannet (disambiguation). The gannet is a seabird. It may also refer to:
|
Early Miocene. The Early Miocene (also known as Lower Miocene) is a sub-epoch of the Miocene Epoch made up of two stages: the Aquitanian and Burdigalian stages.[2][3] The sub-epoch lasted from 23.03 ± 0.05 Ma to 15.97 ± 0.05 Ma (million years ago). It was preceded by the Oligocene epoch. As the climate cooled, the landscape started to change. New mammals evolved to replace the extinct animals of the Oligocene epoch. The first members of the hyena and weasel families started to evolve to replace the extinct Hyaenodon, entelodonts and bear-dogs. The chalicotheres survived the Oligocene epoch. A new genus of entelodont called Daeodon evolved to adapt to the new habitats and hunt the new prey animals of the Early Miocene epoch; it quickly became the top predator of North America. But it became extinct due to competition from Amphicyon, a newcomer from Eurasia. Amphicyon bested Daeodon because the bear-dog's larger brain, sharper teeth and longer legs built for longer chases helped it to overcome its prey.
|
Hiroshima Prefecture. Hiroshima Prefecture (広島県, Hiroshima-ken; Japanese pronunciation: [çi.ɾo.ɕi.ma, -maꜜ.keɴ][2]) is a prefecture of Japan located in the Chūgoku region of Honshu.[3] Hiroshima Prefecture has a population of 2,811,410 (1 June 2019) and has a geographic area of 8,479 km2 (3,274 sq mi). Hiroshima Prefecture borders Okayama Prefecture to the east, Tottori Prefecture to the northeast, Shimane Prefecture to the north, and Yamaguchi Prefecture to the southwest. Hiroshima Prefecture also borders Ehime Prefecture for 74 metres (243 ft) on Hyōtanjima.[4] Hiroshima is the capital and largest city of Hiroshima Prefecture, and the largest city in the Chūgoku region, with other major cities including Fukuyama, Kure, and Higashihiroshima.[5] Hiroshima Prefecture is located on the Seto Inland Sea across from the island of Shikoku, and is bounded to the north by the Chūgoku Mountains. The area around Hiroshima was formerly divided into Bingo Province and Aki Province.[6] This location has been a center of trade and culture since the beginning of Japan's recorded history. Hiroshima is a traditional center of the Chūgoku region and was the seat of the Mōri clan until the Battle of Sekigahara. Together with Nara and Tokyo, Hiroshima is one of the three prefectures of Japan with more than one UNESCO World Heritage Site; two such sites lie in Hiroshima Prefecture. Hiroshima Prefecture lies in the middle of Japan. Most of the prefecture consists of mountains leading towards Shimane Prefecture, and rivers produce rich plains near the coast.
|
500 yen note. The 500 yen note (五百円紙幣) is a discontinued denomination of Japanese yen issued from 1951 to 1994 in paper form. Crudely made notes were first issued in an unsuccessful attempt to curb inflation at the time, and the series as a whole is broken down into three different types of note. Only the last two have a known design, which features Iwakura Tomomi on the obverse and Mount Fuji on the reverse. Starting in 1982, new 500 yen coins began to be minted, which eventually replaced their paper counterparts.[1] While the production of 500 yen notes continued until 1984, all of the notes issued were officially withdrawn from circulation in 1994. Five hundred yen notes were allowed to retain their legal tender status, but they are now worth more on the collectors' market in numismatic value than they are at face value.[2][3][4] The first series of 500 yen notes (called series B) was released on April 2, 1951 with improved security features such as watermarks. These new notes appear to have been more successful, as they were issued for almost 20 years until finally being withdrawn on January 4, 1971.[2][5] The final 500 yen notes are referred to as series C notes, and were issued starting on November 1, 1969 with new watermarks to enhance security. The issue came to an end on April 1, 1994, when 500 yen notes were withdrawn from circulation.
|
Moros. In Greek mythology, Moros /ˈmɔːrɒs/ or Morus /ˈmɔːrəs/ (Ancient Greek: Μόρος means doom, fate[1]) is the personified spirit of impending doom,[2] who drives mortals to their deadly fate. It was also said that Moros gave people the ability to foresee their death. His Roman equivalent was Fatum. Moros is the offspring of Nyx, the primordial goddess of the night. Roman authors suggested that Moros was the son of Erebus, primordial god of darkness.[3] However, in Hesiod's Theogony it is suggested that Nyx bore him by herself, along with several of her other children. Regardless of the presence or absence of Moros's father, this would make him the brother of the Moirai, or the Fates. Among his other siblings are Thanatos and the Keres, death spirits who represented the physical aspects of death—Keres being the bringers of violent death and terminal sickness, while Thanatos represents a more peaceful passing. In Prometheus Bound, the titular Titan suggests that he gave humanity the spirit Elpis, the personification of hope, in order to help them ignore the inevitability of Moros.[4] He is also referred to as the all-destroying god, who, even in the realm of Death, does not set his victim free,[5] further supporting his image as representative of the inevitability of death and suffering. Aeschylus, Fragment 199 (from Plutarch, Life and Poetry of Homer 157) (trans. Weir Smyth):
|
Chūgoku region. The Chūgoku region (Japanese: 中国地方, Hepburn: Chūgoku-chihō; [tɕɯꜜː.ɡo.kɯ, -ŋo.kɯ, tɕɯː.ɡo.kɯ̥ tɕiꜜ.hoː, -ŋo.kɯ̥-][3][a]), also known as the San'in-San'yō (山陰山陽地方, San'in-San'yō-chihō) region, is the westernmost region of Honshū, the largest island of Japan. It consists of the prefectures of Hiroshima, Okayama, Shimane, Tottori and Yamaguchi.[4] As of the 2020 census, it has a population of 7,328,339. Chūgoku literally means middle country, but the origin of the name is unclear. Historically, Japan was divided into a number of provinces called koku, which were in turn classified according to both their power and their distance from the administrative center in Kansai. Under the latter classification, most provinces were divided into near countries (近国, kingoku), middle countries (中国, chūgoku), and far countries (遠国, ongoku). Therefore, one explanation is that Chūgoku was originally used to refer to the collection of middle countries to the west of the capital. However, only five (fewer than half) of the provinces normally considered part of the Chūgoku region were in fact classified as middle countries, and the term never applied to the many middle countries to the east of Kansai. Therefore, an alternative explanation is that Chūgoku referred to the provinces between Kansai and Kyūshū, which was historically important as the link between Japan and mainland Asia. Historically, Chūgoku referred to the 16 provinces of San'indō (山陰道) and San'yōdō (山陽道), which led to the region's alternative name described below. However, because some of the easternmost provinces were later subsumed into prefectures based primarily in Kansai, those areas are, strictly speaking, not part of the Chūgoku region in modern usage. In Japanese, the characters 中国 and the reading Chūgoku are also used to mean China. The same characters are used in Chinese to refer to China, but pronounced Zhōngguó in Mandarin, lit. Middle Kingdom or Middle Country (Wade–Giles: Chung1-kuo2).
It is similar to the use of the West Country in English for a region of England. However, before the end of the Second World War, China was more commonly called Shina (支那/シナ; which shares the same etymology as the word China in English) in order to avoid confusion with the Chūgoku region. Due to the extensive use of this word during the Sino-Japanese War, the term Shina became an offensive word and was abandoned thereafter, and Chūgoku has since been used instead. In modern times, primarily in the tourism industry, for the same purpose, the Chūgoku region is also called the San'in-San'yō region. San'in (yin of the mountains) is the northern part facing the Sea of Japan. San'yō (yang of the mountains) is the southern part facing the Seto Inland Sea. These names were created using the yin and yang-based place-naming scheme. The city of Hiroshima, the capital of the Chūgoku region, was rebuilt after being destroyed by an atomic bomb in 1945, and is now an industrial metropolis of more than one million people.
|
Yamaguchi Prefecture. Yamaguchi Prefecture (山口県, Yamaguchi-ken[a]) is a prefecture of Japan located in the Chūgoku region of Honshu.[3] Yamaguchi Prefecture has a population of 1,377,631 (1 February 2018) and has a geographic area of 6,112 km2 (2,359 sq mi). Yamaguchi Prefecture borders Shimane Prefecture to the north and Hiroshima Prefecture to the northeast. Yamaguchi is the capital and Shimonoseki is the largest city of Yamaguchi Prefecture, with other major cities including Ube, Shūnan, and Iwakuni.[4] Yamaguchi Prefecture is located at the western tip of Honshu with coastlines on the Sea of Japan and Seto Inland Sea, and is separated from the island of Kyushu by the Kanmon Straits. Yamaguchi Prefecture was created by the merger of the provinces of Suō and Nagato.[5] With the rise of the samurai class during the Heian and Kamakura periods (794–1333), the Ouchi family of Suō Province and the Koto family of Nagato Province gained influence as powerful warrior clans. In the Muromachi period (1336–1573), Ouchi Hiroyo, the 24th ruler of the Ouchi family, conquered both areas of Yamaguchi Prefecture. The Ouchi clan imitated the city planning of Kyoto. They gained great wealth through cultural imports from the continent and trade with Korea and Ming Dynasty China. As a result, Yamaguchi came to be known as the Kyoto of the West, and Ouchi culture flourished. Sue Harutaka defeated the 31st ruler of the Ouchi clan. The Sue clan was then defeated by Mōri Motonari, and the Mōri family gained control of the Chūgoku region. Yamaguchi was ruled as part of the Mōri clan domain during the Sengoku period. Mōri Terumoto was then defeated by Tokugawa Ieyasu in the Battle of Sekigahara in 1600. He was forced to give up all his land except for the Suō and Nagato areas (current-day Yamaguchi Prefecture), where he built his castle in Hagi. Mōri sought to strengthen the economic base of the region and increase local production with his Three Whites campaign (salt, rice, and paper).
After Commodore Matthew Perry's opening of Japan, clans from Nagato (also called Chōshū) played a key role in the fall of the Tokugawa shogunate and the establishment of the new imperial government. Four years after the Edo Shogunate was overthrown and the Meiji government formed in 1868, the present Yamaguchi Prefecture was established. The Meiji government brought in many new systems and modern policies, and promoted the introduction of modern industry, though the prefecture was still centered on agriculture during this period. In the Taishō period, from 1912 to 1926, shipbuilding, chemical, machinery, and metal working plants were built in Yamaguchi's harbors in the Seto Inland Sea area. During the post-World War II Shōwa Period, Yamaguchi developed into one of the most industrialized prefectures in the country due to the establishment of petrochemical complexes.[6] As of April 1, 2012, 7% of the total land area of the prefecture was designated as Natural Parks, namely the Setonaikai National Park; Akiyoshidai, Kita-Nagato Kaigan, and Nishi-Chūgoku Sanchi Quasi-National Parks; and Chōmonkyō, Iwakiyama, Rakanzan, and Toyota Prefectural Natural Parks.[7]
|
500 yen coin (commemorative). The 500 yen coin (五百円硬貨, Gohyaku-en kōka) is a denomination of the Japanese yen. In addition to being used as circulating currency, this denomination has also been used to make commemorative coins struck by the Japan Mint. These coins are intended for collectors only and were never issued for circulation. Throughout the coin's history, the Japan Mint has issued 500 yen coins commemorating various subjects in Japan's history. Early commemorative coins minted under the Shōwa era have their dates of reign written in Kanji script. This practice was later replaced by adding Arabic numerals to reflect the current Emperor's year of reign.
|
Romanian language. Romanian (obsolete spelling: Roumanian; endonym: limba română [ˈlimba roˈmɨnə] ⓘ, or românește [romɨˈneʃte], lit. in Romanian) is the official and main language of Romania and Moldova. Romanian is part of the Eastern Romance sub-branch of Romance languages, a linguistic group that evolved from several dialects of Vulgar Latin which separated from the Western Romance languages in the course of the period from the 5th to the 8th centuries.[12] To distinguish it within the Eastern Romance languages, in comparative linguistics it is called Daco-Romanian as opposed to its closest relatives, Aromanian, Megleno-Romanian, and Istro-Romanian. It is also spoken as a minority language by stable communities in the countries surrounding Romania (Bulgaria, Hungary, Serbia and Ukraine), and by the large Romanian diaspora. In total, it is spoken by 25 million people as a first language.[1] Romanian was also known as Moldovan in Moldova, although the Constitutional Court of Moldova ruled in 2013 that the official language of Moldova is Romanian.[c] On 16 March 2023, the Moldovan Parliament approved a law on referring to the national language as Romanian in all legislative texts and the constitution. 
On 22 March, the president of Moldova, Maia Sandu, promulgated the law.[13] The history of the Romanian language started in the Roman provinces north of the Jireček Line in Classical antiquity, but there are three main hypotheses about its exact territory: the autochthony thesis (it developed in left-Danube Dacia only), the discontinuation thesis (it developed in right-Danube provinces only), and the as-well-as thesis that supports the language's development on both sides of the Danube.[14] Between the 6th and 8th century, following the accumulated tendencies inherited from the vernacular spoken in this large area and, to a much smaller degree, the influences from native dialects, and in the context of a lessened power of the Roman central authority, the language evolved into Common Romanian. This proto-language then came into close contact with the Slavic languages and subsequently divided into Aromanian, Megleno-Romanian, Istro-Romanian, and Daco-Romanian.[15][16] Due to limited attestation between the 6th and 16th century, entire stages of its history are reconstructed by researchers, often with proposed relative chronologies and loose limits.[17] From the 12th or 13th century, official documents and religious texts were written in Old Church Slavonic, a language that had a similar role to Medieval Latin in Western Europe. The oldest dated text in Romanian is a letter written in 1521 with Cyrillic letters, and until the late 18th century, including during the development of printing, the same alphabet was used. The period after 1780, starting with the writing of its first grammar books, represents the modern age of the language, during which time the Latin alphabet became official, the literary language was standardized, and a large number of words from Modern Latin and other Romance languages entered the lexis.
|
Ginza. Ginza (/ˈɡɪnzə/ GHIN-zə; Japanese: 銀座 [ɡindza]) is a district of Chūō, Tokyo, located south of Yaesu and Kyōbashi, west of Tsukiji, east of Yūrakuchō and Uchisaiwaichō, and north of Shinbashi. It is a popular upscale shopping area of Tokyo, with numerous internationally renowned department stores, boutiques, restaurants and coffeehouses located in its vicinity. Ginza was a part of the old Kyobashi ward of Tokyo City, which, together with Nihonbashi and Kanda, formed the core of Shitamachi,[1] the original downtown center of Edo (Tokyo). Ginza was built upon a former swamp that was filled in during the 16th century. The name Ginza derives from the silver-coin mint established there in 1612, during the Edo period.[2] After a devastating fire in 1872 burned down most of the area,[2] the Meiji government designated the Ginza area as a model of modernization. The government planned the construction of fireproof brick buildings and larger, better streets connecting Shimbashi Station all the way to the foreign concession in Tsukiji. Soon after the fire, redevelopment schemes were prepared by Colin Alexander McVean,[3] a chief surveyor of the Public Works under the direction of Yamao Yozo, but execution designs were provided by the Irish-born engineer Thomas Waters;[2] the Bureau of Construction of the Ministry of Finance was in charge of construction. The following year, a Western-style shopping promenade on the street from the Shinbashi bridge to the Kyōbashi bridge in the southwestern part of Chūō, with two- and three-story Georgian brick buildings, was completed.
|
John Raymond science fiction magazines. Between 1952 and 1954, John Raymond published four digest-size science fiction and fantasy magazines. Raymond was an American publisher of men's magazines who knew little about science fiction, but the field's rapid growth and a distributor's recommendation prompted him to pursue the genre. Raymond consulted and then hired Lester del Rey to edit the first magazine, Space Science Fiction, which appeared in May 1952. Following a second distributor's suggestion that year, Raymond launched Science Fiction Adventures, which del Rey again edited, but under an alias. In 1953, Raymond gave del Rey two more magazines to edit: Rocket Stories, which targeted a younger audience, and Fantasy Magazine, which published fantasy rather than science fiction. All four magazines were profitable, but Raymond did not reinvest the profits in improving the magazines and was late in paying contributors. Del Rey persuaded Raymond to invest some of the profits back into the magazines, but nothing came of this and, when del Rey discovered that Raymond was planning to cut rates instead, he resigned. Two of the magazines continued for a short time with Harry Harrison as editor, but by the end of 1954 all four magazines had ceased publication. The magazines are well regarded by science fiction historians. They carried fiction by many names well known in the field or who later became famous, including Isaac Asimov, Philip K. Dick, Robert E. Howard, and John Jakes. American science fiction magazines first appeared in the 1920s with the launch of Amazing Stories, a pulp magazine published by Hugo Gernsback.
World War II and its attendant paper shortages interrupted the expanding market for the genre, but by the late 1940s the market began to recover.[1] In October 1950, the first issue of Galaxy Science Fiction appeared; it reached a circulation of 100,000 within a year, and its success encouraged other publishers to enter the field.[2] John Raymond, at that time primarily a publisher of men's magazines, was told by his distributor that science fiction was a growing field; Raymond knew nothing about science fiction, so he asked Lester del Rey for advice, and then offered del Rey the job of editor on the new magazine. Del Rey was initially hesitant, but eventually agreed to become the editor of Space Science Fiction; the first issue was dated May 1952. When another distributor approached Raymond to ask if he would be interested in publishing a science fiction title, Raymond suggested to del Rey that this second magazine should focus on action stories. The result was Science Fiction Adventures, which appeared in November that year. Raymond decided to expand further, launching Fantasy Magazine in March 1953, and Rocket Stories, which like Science Fiction Adventures was aimed at a juvenile readership, the following month. Ziff-Davis had launched Fantastic, a rival fantasy magazine, in 1952, and once Fantasy Magazine appeared, they threatened to sue Raymond because of the similarity of the titles, so Raymond renamed the magazine Fantasy Fiction from the second issue onwards.[3]
|
Chuo University. Chuo University (中央大学, Chūō Daigaku), commonly referred to as Chuo (中央) or Chu-Dai (中大), is a private research university in Hachioji, Tokyo, Japan. The university finds its roots in a school called Igirisu Hōritsu Gakkō [ja] (English Law School), which was founded in 1885 and became a university in 1920.[2] The university operates four campuses in Tokyo: the largest in Hachiōji (Tama campus), one in Bunkyō (Korakuen campus), and two others in Shinjuku (Ichigaya and Ichigaya-Tamachi campuses). Chuo is organized into six faculties, ten graduate schools, and nine research institutes. There are also four affiliated high schools and two affiliated junior high schools. When written in Chinese characters, Chuo University shares the same name with National Central University in Taiwan and Chung-Ang University in South Korea. Chuo was founded as the English Law School (英吉利法律学校, Igirisu Hōritsu Gakkō) in 1885 at Kanda in Tokyo by Rokuichiro Masujima together with a group of 18 young lawyers.[3][4] Before 1889, the school moved and was renamed Tokyo College of Law (Tōkyō Hōgakuin). The curriculum was changed to reflect the government's reform of Japanese law and the creation of a new civil code.[4] Opposition to the implementation of the new civil code resulted in the government's shuttering of the campus journal and the subsequent creation of the Chuo Law Review (Hōgaku Shinpō), which has been published regularly since then.[4] The university burned down in the Great Kanda Fire of 1892, but was able to hold temporary classes. Before 1903, the school was promoted to Tokyo University of Law (Tokyo Hōgakuin Daigaku), and in 1905 the school added a department of economics and renamed itself Chuo University. The origin of the name Chuo is not certain.
However, many founders of the university had been students of the Middle Temple in London before they completed their training and qualified as barristers. This is one of the reasons why the university was renamed Chuo, which literally means middle, center or central.
|
Nara (city). Nara (奈良市, Nara-shi; Japanese pronunciation: [naꜜ.ɾa, na.ɾaꜜ.ɕi] ⓘ[2]) is the capital city of Nara Prefecture, Japan. As of 2022[update], Nara has an estimated population of 367,353 according to World Population Review, making it the largest city in Nara Prefecture and the sixth-largest in the Kansai region of Honshu. Nara is a core city located in the northern part of Nara Prefecture, bordering Kyoto Prefecture. Nara was the capital of Japan during the Nara period from 710 to 784, as the seat of the Emperor, before the capital was moved to Nagaoka-kyō, except for the years 740 to 745, when the capital was placed at Kuni-kyō, Naniwa-kyō and Shigaraki Palace. Nara is home to eight major historic temples, shrines, and heritage sites: Tōdai-ji, Saidai-ji, Kōfuku-ji, Kasuga Shrine, Gangō-ji, Yakushi-ji, Tōshōdai-ji, and the Heijō Palace. Together with Kasugayama Primeval Forest, these collectively form the Historic Monuments of Ancient Nara, a UNESCO World Heritage Site. By the Heian period, a variety of different characters had been used to represent the name Nara: 乃楽, 乃羅, 平, 平城, 名良, 奈良, 奈羅, 常, 那良, 那楽, 那羅, 楢, 諾良, 諾楽, 寧, 寧楽 and 儺羅. A number of theories for the origin of the name Nara have been proposed, and some of the better-known ones are listed here. The second theory in the list, from the notable folklorist Kunio Yanagita (1875–1962), is most widely accepted at present.
|
SIX Group. SIX is a key financial market infrastructure company in Switzerland. The company provides services relating to securities transactions, the processing of financial information and payment transactions, and is building a digital infrastructure. The company name SIX is an abbreviation and stands for Swiss Infrastructure and Exchange. SIX is globally active, with its headquarters in Zurich.[3] Ticker AG in Zurich was founded with the purpose of transmitting stock market prices.[4] It was the predecessor of Telekurs AG, which later merged with other companies to form SIX.[5][6] With the opening of the new exchange, the ticker system also began to broadcast.[4] This stock exchange ticker, one of the first on the European continent and a special application of the local telegraph, transmitted the Zurich stock exchange prices and, in italics, the closing prices of other Swiss and important foreign stock exchanges on a narrow strip of paper to any number of recipients.[4] Ticker AG introduced the first-ever stock exchange television and aroused media interest with this novelty: the prices of up to ninety stocks could be followed on screen.[5] The Neue Zürcher Zeitung described stock exchange television as a world first and devoted a technology supplement to this technical development.[7] Ticker AG later changed its name to Telekurs AG.[8]
|
Waseda University. Waseda University (Japanese: 早稲田大学), abbreviated as Waseda (早稲田) or Sōdai (早大), is a private research university in Shinjuku, Tokyo. Founded in 1882 as the Tōkyō Professional School [ja] by Ōkuma Shigenobu, the eighth and eleventh prime minister of Japan, the school was formally renamed Waseda University in 1902.[6] Waseda is organized into 36 departments: 13 undergraduate schools and 23 graduate schools. As of 2023, there are 38,776 undergraduate students and 8,490 graduate students. In addition to a central campus in Shinjuku (Waseda Campus and Nishiwaseda Campus), the university operates campuses in Chūō, Nishitōkyō, Tokorozawa, Honjō, and Kitakyūshū. Waseda also operates 21 research institutes at its main Shinjuku campus.[7] The university's faculty and alumni include eight prime ministers of Japan;[8] three prime ministers of Korea;[9][10][11] founders of leading Japanese and Korean companies such as Sony, UNIQLO, Samsung, Lotte, and POSCO; and a number of important figures in Japanese literature, including Haruki Murakami, Yoko Ogawa, and Yoko Tawada. Waseda was founded as Tōkyō Professional School (東京專門學校, Tōkyō Senmon Gakkō) on 21 October 1882 by samurai scholar and Meiji-era politician and former prime minister Ōkuma Shigenobu. Before the name Waseda was selected, it was known variously as Waseda Gakkō (早稲田学校) or Totsuka Gakkō (戸塚学校), after the location of the founder's villa in Waseda Village and the school's location in Totsuka Village, respectively. It was renamed Waseda University (早稲田大学, Waseda-daigaku) on 2 September 1902, upon acquiring university status. It started as a college with three departments under the old Japanese system of higher education.
|
Mike Ashley (writer). Michael Raymond Donald Ashley (born 1948) is a British bibliographer, author and editor of science fiction, mystery, and fantasy. Ashley has published over 100 nonfiction books and anthologies. He edits the long-running Mammoth Book series of short story anthologies, each arranged around a particular theme in mystery, fantasy, or science fiction. He has a special interest in fiction magazines and has published a multi-volume History of the Science Fiction Magazine and a study of British fiction magazines, The Age of the Storytellers. Ashley won the Edgar Allan Poe Award for Best Critical/Biographical Work for The Mammoth Encyclopedia of Modern Crime Fiction in 2003 and the Bram Stoker Award for Best Non-Fiction for The Supernatural Index in 1995. He received the Pilgrim Award for lifetime achievement in science fiction scholarship from the Science Fiction Research Association in 2002. He was nominated for the Hugo Award for Best Related Work for Transformations: The Story of the Science Fiction Magazines from 1950 to 1970 in 2006. In addition to the books listed below, Ashley edited and prepared for publication the novel The Enchantresses (1997) by Vera Chapman. He has contributed to The Encyclopedia of Fantasy (as contributing editor), The Encyclopedia of Science Fiction (as contributing editor of the third edition), and other reference works. He wrote the books to accompany British Library exhibitions Taking Liberties in 2008 and Out of This World: Science Fiction But Not As You Know It in 2011.
|
Irvin S. Cobb. Irvin Shrewsbury Cobb (June 23, 1876 – March 11, 1944) was an American author, humorist, editor and columnist from Paducah, Kentucky, who relocated to New York in 1904, living there for the remainder of his life. He wrote for the New York World, Joseph Pulitzer's newspaper, as the highest-paid staff reporter in the United States. Cobb also wrote more than 60 books and 300 short stories. Some of his works were adapted for silent movies. Several of his Judge Priest short stories were adapted in the 1930s for two feature films directed by John Ford. Cobb was the second of four children born to Kentucky natives in Paducah, Kentucky. His maternal grandfather, Reuben Saunders, M.D., is credited with discovering in 1873 that injections of morphine-atropine were useful in treating cholera. Cobb was raised in Paducah, and the events and people of his childhood became the basis for much of his later work.[1] Later in life, Cobb was nicknamed the Duke of Paducah.[2] Cobb was educated in public and private elementary schools, and then entered William A. Cade's Academy intending to pursue a law career. When Cobb was 16, after the death of his grandfather, his father became an alcoholic. Forced to quit school and find work, Cobb began his writing career. Cobb started in journalism with the Paducah Daily News at age seventeen, and became the nation's youngest managing news editor at age nineteen. He later worked at the Louisville Evening Post for a year and a half.
|
Argosy (magazine). Argosy was an American magazine, founded in 1882 as The Golden Argosy, a children's weekly, edited by Frank Munsey and published by E. G. Rideout. Munsey took over as publisher when Rideout went bankrupt in 1883, and after many struggles made the magazine profitable. He shortened the title to The Argosy in 1888 and targeted an audience of men and boys with adventure stories. In 1894 he switched it to a monthly schedule and in 1896 he eliminated all non-fiction and started using cheap pulp paper, making it the first pulp magazine. Circulation had reached half a million by 1907, and remained strong until the 1930s. The name was changed to Argosy All-Story Weekly in 1920 after the magazine merged with All-Story Weekly, another Munsey pulp, and from 1929 it became just Argosy. In 1925 Munsey died, and the publisher, the Frank A. Munsey Company, was purchased by William Dewart, who had worked for Munsey. By 1942 circulation had fallen to no more than 50,000, and after a failed effort to revive the magazine by including sensational non-fiction, it was sold that year to Popular Publications, another pulp magazine publisher. Popular converted it from pulp to slick format, and initially attempted to make it a fiction-only magazine, but gave up on this within a year. Instead it became a men's magazine, carrying fiction and feature articles aimed at men. Circulation soared and by the early 1950s was well over one million. Early contributors included Horatio Alger, Oliver Optic, and G. A. Henty. During the pulp era, many famous writers appeared in Argosy, including O. Henry, James Branch Cabell, Albert Payson Terhune, Edgar Rice Burroughs, Erle Stanley Gardner, Zane Grey, Robert E. Howard, and Max Brand. Argosy was regarded as one of the most prestigious publications in the pulp market, along with Blue Book, Adventure and Short Stories.
After the transition to slick format it continued to publish fiction, including science fiction by Robert Heinlein, Arthur C. Clarke, and Ray Bradbury. From 1948 to 1958 it published a series by Gardner called The Court of Last Resort, which examined the cases of dozens of convicts who maintained their innocence and succeeded in overturning many of the convictions. NBC adapted the series for television in 1957. Popular sold Argosy to David Geller in 1972, and in 1978 Geller sold it to the Filipacchi Group, which closed it at the end of the year. The magazine has been revived several times, most recently in 2016.
|
Copper. Copper is a chemical element; it has symbol Cu (from Latin cuprum) and atomic number 29. It is a soft, malleable, and ductile metal with very high thermal and electrical conductivity. A freshly exposed surface of pure copper has a pinkish-orange color. Copper is used as a conductor of heat and electricity, as a building material, and as a constituent of various metal alloys, such as sterling silver used in jewelry, cupronickel used to make marine hardware and coins, and constantan used in strain gauges and thermocouples for temperature measurement. Copper is one of the few metals that can occur in nature in a directly usable, unalloyed metallic form, meaning it is a native metal; this led to very early human use in several regions, from c. 8000 BC. Thousands of years later, it was the first metal to be smelted from sulfide ores, c. 5000 BC; the first metal to be cast into a shape in a mold, c. 4000 BC; and the first metal to be purposely alloyed with another metal, tin, to create bronze, c. 3500 BC.[11] Commonly encountered compounds are copper(II) salts, which often impart blue or green colors to such minerals as azurite, malachite, and turquoise, and have been used widely and historically as pigments. Copper used in buildings, usually for roofing, oxidizes to form a green patina of compounds called verdigris. Copper is sometimes used in decorative art, both in its elemental metal form and in compounds as pigments. Copper compounds are used as bacteriostatic agents, fungicides, and wood preservatives.
|
Swiss Association for Standardization. The Swiss Association for Standardization (SNV, German: Schweizerische Normen-Vereinigung, French: Association Suisse de Normalisation) is in charge of Switzerland's international cooperation and acceptance in the field of standardization. It is a founding member of both ISO and CEN. The Swiss Association for Standardization liaises between experts of standardization and users of standards.
|
Higashiyamanashi District, Yamanashi. Higashiyamanashi (東山梨郡, Higashiyamanashi-gun) was a district located in Yamanashi Prefecture, Japan. As of 2004, the district had an estimated population of 10,701 persons with a density of 135 persons per km2. The total area was 79.27 km2. Prior to its dissolution, the district consisted of three towns:
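The district entries above each quote a population, an area, and a density; the density is simply population divided by area. A minimal sketch of that check, using the Higashiyamanashi figures quoted in the text:

```python
# Population density (persons per km²) = population / area.
# Figures for Higashiyamanashi District as quoted in the text.
population = 10_701
area_km2 = 79.27

density = population / area_km2
print(round(density))  # → 135, matching the quoted density
```

The same arithmetic reproduces the densities quoted for the other districts in this list (e.g. Kawabe: 18,264 / 445.57 ≈ 40.99).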
|
International Organization for Standardization. The International Organization for Standardization (ISO /ˈaɪsoʊ/;[3] French: Organisation internationale de normalisation; Russian: Международная организация по стандартизации) is an independent, non-governmental, international standard development organization composed of representatives from the national standards organizations of member countries.[4][5] Membership requirements are given in Article 3 of the ISO Statutes.[6] ISO was founded on 23 February 1947, and (as of July 2024[update]) it has published over 25,000 international standards covering almost all aspects of technology and manufacturing. It has over 800 technical committees (TCs) and subcommittees (SCs) to take care of standards development.[7] The organization develops and publishes international standards in technical and nontechnical fields, including everything from manufactured products and technology to food safety, transport, IT, agriculture, and healthcare.[7][8][9][10] More specialized topics like electrical and electronic engineering are instead handled by the International Electrotechnical Commission.[11] It is headquartered in Geneva, Switzerland.[7] The three official languages of ISO are English, French, and Russian.[2]
|
Zinc. Zinc is a chemical element; it has symbol Zn and atomic number 30. It is a slightly brittle metal at room temperature and has a shiny-greyish appearance when oxidation is removed. It is the first element in group 12 (IIB) of the periodic table. In some respects, zinc is chemically similar to magnesium: both elements exhibit only one normal oxidation state (+2), and the Zn2+ and Mg2+ ions are of similar size.[b] Zinc is the 24th most abundant element in Earth's crust and has five stable isotopes. The most common zinc ore is sphalerite (zinc blende), a zinc sulfide mineral. The largest workable lodes are in Australia, Asia, and the United States. Zinc is refined by froth flotation of the ore, roasting, and final extraction using electricity (electrowinning). Zinc is an essential trace element for humans,[8][9][10] animals,[11] plants[12] and microorganisms,[13] and is necessary for prenatal and postnatal development.[14] It is the second most abundant trace metal in humans after iron, an important cofactor for many enzymes, and the only metal which appears in all enzyme classes.[12][10] Zinc is also an essential nutrient element for coral growth.[15] Zinc deficiency affects about two billion people in the developing world and is associated with many diseases.[16] In children, deficiency causes growth retardation, delayed sexual maturation, infection susceptibility, and diarrhea.[14] Enzymes with a zinc atom in the reactive center are widespread in biochemistry, such as alcohol dehydrogenase in humans.[17] Consumption of excess zinc may cause ataxia, lethargy, and copper deficiency.
In marine biomes, notably in polar regions, zinc deficiency can compromise the vitality of primary algal communities, potentially destabilizing marine trophic structures and consequently impacting biodiversity.[18] Brass, an alloy of copper and zinc in various proportions, was used as early as the third millennium BC in the Aegean area and the region which currently includes Iraq, the United Arab Emirates, Kalmykia, Turkmenistan and Georgia. In the second millennium BC it was used in the regions currently including West India, Uzbekistan, Iran, Syria, Iraq, and Israel.[19][20][21] Zinc metal was not produced on a large scale until the 12th century in India, though it was known to the ancient Romans and Greeks.[22] The mines of Rajasthan have given definite evidence of zinc production going back to the 6th century BC.[23] The oldest evidence of pure zinc comes from Zawar, in Rajasthan, as early as the 9th century AD, when a distillation process was employed to make pure zinc.[24] Alchemists burned zinc in air to form what they called philosopher's wool or white snow.
|
Sensationalism. In journalism and mass media, sensationalism is a type of editorial tactic. Events and topics in news stories are selected and worded to excite the greatest number of readers and viewers. This style of news reporting encourages biased or emotionally loaded impressions of events rather than neutrality, and may distort the truth of a story.[1][better source needed] Sensationalism may rely on reports about generally insignificant matters that portray them as a major influence on society, or on biased presentations of newsworthy topics in a trivial or tabloid manner, contrary to general assumptions of professional journalistic standards.[2][3] Some tactics include being deliberately obtuse,[4] appealing to emotions,[5][better source needed] being controversial, intentionally omitting facts and information,[6][better source needed] being loud and self-centered, and acting to obtain attention.[5][better source needed] Trivial information and events are sometimes misrepresented and exaggerated as important or significant, and often include stories about the actions of individuals and small groups of people,[1][better source needed] the content of which is often insignificant and irrelevant to the macro-level day-to-day events occurring globally. In A History of News, Mitchell Stephens notes sensationalism can be found in the ancient Roman gazette Acta Diurna, where official notices and announcements were presented daily on public message boards, the perceived content of which spread with enthusiasm in illiterate societies.[2] Sensationalism was used in books of the 16th and 17th centuries to teach moral lessons. According to Stephens, sensationalism brought the news to a new audience when it became aimed at the lower classes, who had less need to accurately understand politics and the economy, and it served to occupy them in other matters.
Through sensationalism, he claims, the audience was further educated and encouraged to take more interest in the news.[2] The more modern forms of sensationalism developed in the course of the nineteenth century in parallel with the expansion of print culture in industrialized nations. A genre of British literature, sensation novels, became in the 1860s an example of how the publishing industry could capitalize on surprising narratives to market serialized fiction in periodicals.[citation needed] The attention-grasping rhetorical techniques found in sensation fiction were also employed in articles on science, modern technology, finance, and in historical accounts of contemporary events.[7] Sensationalism in the nineteenth century could be found in popular culture, literature, performance, art history, theory, pre-cinema, and early cinema.[8] In the Soviet Union, strong censorship resulted in only positive occurrences being reported on, with the news looking significantly different than in the West.[9][additional citation(s) needed]
|
Kawabe District, Akita. Kawabe District (河辺郡, Kawabe-gun) was a former rural district located in southern Akita, Japan. On October 1, 2005, its remaining components, the towns of Kawabe and Yūwa, merged into the city of Akita, upon which Kawabe District was dissolved and ceased to exist as an administrative unit. As of 2003 (before the merger), the district had an estimated population of 18,264 and a population density of 40.99 persons per km2. The total area was 445.57 km2. The area of Kawabe District was formerly part of Dewa Province, which was divided into Ugo Province and Uzen Province following the Meiji Restoration on January 19, 1869, with the area of Kawabe becoming part of Ugo Province. At the time, the area consisted of 59 villages, all of which were formerly under the control of Kubota Domain. Akita Prefecture was founded on December 13, 1871. With the establishment of the municipality system on April 1, 1889, Kawabe District was established with 14 villages. Ushijima and Araya were raised to town status in 1896, but Ushijima was annexed by the city of Akita in 1924. Wada was raised to town status in 1935, but Araya was absorbed into the city of Akita in 1941. On April 1, 1948, Kawabe District acquired the village of Taishodera (from Yuri District). Kawabe was raised to town status on March 31, 1955, followed by Yūwa on April 1, 1972. On January 11, 2005, the towns of Kawabe and Yūwa were merged into the expanded city of Akita. Kawabe District was dissolved as a result of this merger.
|
Nickel. Nickel is a chemical element; it has symbol Ni and atomic number 28. It is a silvery-white lustrous metal with a slight golden tinge. Nickel is a hard and ductile transition metal. Pure nickel is chemically reactive, but large pieces are slow to react with air under standard conditions because a passivation layer of nickel oxide forms on the surface and prevents further corrosion. Even so, pure native nickel is found in Earth's crust only in tiny amounts, usually in ultramafic rocks,[10][11] and in the interiors of larger nickel–iron meteorites that were not exposed to oxygen when outside Earth's atmosphere. Meteoric nickel is found in combination with iron, a reflection of the origin of those elements as major end products of supernova nucleosynthesis. An iron–nickel mixture is thought to compose Earth's outer and inner cores.[12] Use of nickel (as natural meteoric nickel–iron alloy) has been traced as far back as 3500 BCE. Nickel was first isolated and classified as an element in 1751 by Axel Fredrik Cronstedt, who initially mistook the ore for a copper mineral, in the cobalt mines of Los, Hälsingland, Sweden. The element's name comes from a mischievous sprite of German miner mythology, Nickel (similar to Old Nick). Nickel minerals can be green, like copper ores, and were known as kupfernickel – Nickel's copper – because they produced no copper. Although most nickel in Earth's crust exists as oxides, economically more important nickel ores are sulfides, especially pentlandite. Major production sites include Sulawesi, Indonesia, the Sudbury region, Canada (which is thought to be of meteoric origin), New Caledonia in the Pacific, Western Australia, and Norilsk, Russia.[13]
|
Meiji-mura. Meiji-mura (博物館明治村, Hakubutsukan Meiji-mura; Meiji Village Museum) is an open-air architectural museum/theme park in Inuyama, near Nagoya in Aichi prefecture, Japan. It was opened on March 18, 1965. The museum preserves historic buildings from Japan's Meiji (1867–1912), Taishō (1912–1926), and early Shōwa (1926–1945) periods. Over 60 historical buildings have been moved and reconstructed onto 1 square kilometre (250 acres) of rolling hills alongside Lake Iruka. The most noteworthy building there is the reconstructed main entrance and lobby of Frank Lloyd Wright's landmark Imperial Hotel, which originally stood in Tokyo from 1923 to 1967, when the main structure was demolished to make way for a new, larger version of the hotel.[1] The Meiji era was a period of rapid change in Japan. After centuries of isolation, Japan began to incorporate ideas from the West, including building styles and construction techniques. Meiji-mura was started by Yoshirō Taniguchi (谷口 吉郎 Taniguchi Yoshirō 1904–79), an architect, and Motoo Tsuchikawa (土川元夫 Tsuchikawa Moto-o, 1903–74), then vice president and later president of Nagoya Railroad (Meitetsu). While riding the Yamanote line in Tokyo, Taniguchi lamented the sight of the demolition of the Rokumeikan, a symbol of Meiji-era architecture. He appealed to his college classmate Tsuchikawa to join him in working to preserve Western-style Meiji-era buildings of cultural or historical importance. On July 16, 1962 they formed a foundation for this purpose, with Nagoya Railroad providing the funding. Meiji-mura was opened on March 18, 1965 on the banks of the Lake Iruka reservoir, operated under Nagoya Railroad with Taniguchi as museum director, with 15 buildings. Meiji-mura's goal is to preserve these historic early examples of Western architecture mixed with Japanese construction techniques and materials.
Incidentally, many of the buildings were saved from demolition during the post-World War II period, another time of transition and rapid progress in Japanese history. Though it is still operated by Nagoya Railroad, a subsidiary company was created in 2003 to oversee it and nearby Little World. Due to recent financial declines at Nagoya Railroad, the future of the park is in question. While renovations had been put on hold for a time, work on moving the Shibakawa Yashiki from Nishinomiya, Hyōgo began in January 2005.
|
Bank. A bank is a financial institution that accepts deposits from the public and creates a demand deposit while simultaneously making loans.[1] Lending activities can be directly performed by the bank or indirectly through capital markets.[2] As banks play an important role in financial stability and the economy of a country, most jurisdictions exercise a high degree of regulation over banks. Most countries have institutionalized a system known as fractional-reserve banking, under which banks hold liquid assets equal to only a portion of their current liabilities.[3] In addition to other regulations intended to ensure liquidity, banks are generally subject to minimum capital requirements based on an international set of capital standards, the Basel Accords.[4] Banking in its modern sense evolved in the fourteenth century in the prosperous cities of Renaissance Italy but, in many ways, functioned as a continuation of ideas and concepts of credit and lending that had their roots in the ancient world. In the history of banking, a number of banking dynasties – notably, the Medicis, the Pazzi, the Fuggers, the Welsers, the Berenbergs, and the Rothschilds – have played a central role over many centuries. The oldest existing retail bank is Banca Monte dei Paschi di Siena (founded in 1472), while the oldest existing merchant bank is Berenberg Bank (founded in 1590). Banking as an archaic activity (or quasi-banking[5][6]) is thought to have begun as early as the end of the 4th millennium BCE,[7] or in the 3rd millennium BCE.[8][9]
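Under fractional-reserve banking, a bank keeps only a fraction of each deposit as reserves and lends out the rest; as loans are redeposited, an initial deposit can support total deposits of up to initial/ratio. A minimal sketch of that geometric-series arithmetic (the 10% reserve ratio here is an illustrative assumption, not a figure from the text):

```python
# Fractional-reserve deposit expansion: each round, a fraction
# `reserve_ratio` of the deposit is held back and the remainder is
# lent out and redeposited. The total approaches initial / reserve_ratio.
def total_deposits(initial, reserve_ratio, rounds=1000):
    total, deposit = 0.0, float(initial)
    for _ in range(rounds):
        total += deposit
        deposit *= (1 - reserve_ratio)  # portion lent out and redeposited
    return total

print(round(total_deposits(100, 0.10)))  # → 1000, i.e. 100 / 0.10
```

In practice expansion falls short of this ceiling, since banks hold excess reserves and borrowers hold some cash; the sketch only illustrates the limiting arithmetic.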
|
Okayama (disambiguation). Okayama is the capital city of Okayama Prefecture in the Chūgoku region of Japan. Okayama may also refer to:
|
Shōō. Shōō (勝央町, Shōō-chō) is a town located in Katsuta District, Okayama Prefecture, Japan. As of 1 September 2022[update], the town had an estimated population of 10,900 in 4,713 households and a population density of approximately 202 persons per km2.[1] The total area of the town is 54.05 square kilometres (20.87 sq mi). Shōō is said to be the place where Kintarō died.[citation needed] Shōō is located in the northeastern part of Okayama Prefecture. Located on the southern side of the Chugoku Mountains, it is mostly hills and forests. Shōō has a humid subtropical climate (Köppen Cfa) characterized by warm summers and cool winters with moderate snowfall. The average annual temperature in Shōō is 14.0 °C. The average annual rainfall is 1501 mm, with September as the wettest month. Temperatures are highest on average in August, at around 25.9 °C, and lowest in January, at around 2.4 °C.[2] Per Japanese census data,[3] the population of Shōō has been as follows.
|
Katsuta, Okayama. Katsuta (勝田町, Katsuta-chō) was a town located in Katsuta District, Okayama Prefecture, Japan. As of 2003, the town had an estimated population of 3,660 and a density of 41.93 persons per km2. The total area was 87.29 km2. On March 31, 2005, Katsuta, along with the towns of Mimasaka (former), Aida, Ōhara and Sakutō, and the village of Higashiawakura (all from Aida District), was merged to create the city of Mimasaka.[1][2]
|
Fukuoka (disambiguation). Fukuoka is the capital city of Fukuoka Prefecture. Fukuoka may also refer to:
|
Shōboku, Okayama. Shōboku (勝北町, Shōboku-chō) was a town located in Katsuta District, Okayama Prefecture, Japan. As of 2003, the town had an estimated population of 7,494 and a density of 166.90 persons per km2. The total area was 44.90 km2. On February 28, 2005, Shōboku, along with the town of Kamo, the village of Aba (both from Tomata District), and the town of Kume (from Kume District), was merged into the expanded city of Tsuyama and no longer exists as an independent municipality.[1]
|
List of villages in Japan. A village (村, mura, son)[a] is a local administrative unit in Japan.[1] It is a local public body along with the prefecture (県, ken; or other equivalents), the city (市, shi), and the town (町, chō, machi). Geographically, a village's extent is contained within a prefecture. Villages are larger than a local settlement; each is a subdivision of a rural district (郡, gun), and the districts are subdivided into towns and villages with no overlap and no uncovered area. As a result of mergers and elevations to higher statuses, the number of villages in Japan is decreasing. As of 2006, 13 prefectures no longer have any villages: Tochigi (since March 20, 2006), Fukui (since March 3, 2006), Ishikawa (since March 1, 2005), Shizuoka (since July 1, 2005), Hyōgo (since April 1, 1999), Mie (since November 1, 2005), Shiga (since January 1, 2005), Hiroshima (since November 5, 2004), Yamaguchi (since March 20, 2006), Ehime (since January 16, 2005), Kagawa (since April 1, 1999), Nagasaki (since October 1, 2005), and Saga (since March 20, 2006). 村 can be read as mura or son, but with the exception of Tottori, Okayama, Hiroshima, Yamaguchi, Tokushima, Miyazaki and Okinawa, most prefectures use the mura reading.
|
Kibi dango (Okayama). A Kibi dango (吉備団子, きびだんご; Kibi Province dumpling) is a type of wagashi sweet or snack with an eponymous reference to Kibi-no-kuni, an old province roughly coincident with today's Okayama Prefecture. It is made by forming gyūhi, a sort of soft mochi, into flat round cakes.[1][2][3] Glutinous rice, starch, syrup and sugar are the basic ingredients.[1] It is manufactured by some fifteen confectioners based in Okayama City.[4] While perhaps originally made from kibi (proso millet),[5] the modern recipe uses little or no millet,[a] and differs substantively from the kibi dango (黍団子, millet dumpling) of yore, famous from the Japanese heroic folk tale of Momotarō, or Peach Boy; nevertheless, Kibi dango continues to be represented as being the same as the folk hero Peach Boy's dumpling.[6] The simple and widely disseminated account of its invention is that it was developed in the early Ansei era (c. 1856) by the confectioner Kōeidō,[2][3] but a local historian has traced a more elaborate multi-phased history, in which the founding of this wagashi shop and the development of the modern recipe are pushed to a number of years later. Some hypotheses trace its pre-history to the dumpling (or some other food item) served at the Kibitsu Shrine in Okayama. The resident deity of this shrine, Kibitsuhiko, is a legendary ogre-slayer, claimed, especially by Okayama locals, to be the true identity of Momotarō. The theory originated in the 1930s, and since then there has been a concerted effort in the region to promote the folk hero Momotarō as a local of Kibi Province, and his dumplings as Kibi dango by default.[7] There are irreconcilably differing accounts of the dates and sequences of events surrounding the invention.
The standard terse explanation is that this specialty dessert was first invented by the confectioner Kōeidō (廣榮堂) during the early Ansei era (1854–).[3] This purveyor later split into two brands, Kōeidō Honten (廣榮堂本店) and Kōeidō Takeda (広栄堂武田),[8] which remain to this day; Takeda is the family name of the original business. The current proprietors give a more complex account of the origins, but local historians have uncovered an even more convoluted history. The official version, endorsed by Kōeidō Honten, is that the family ran a ceramics business named Hirose-ya (廣瀬屋) for seven generations running, until it switched to confectionery in 1856, changing the shop name to Kōeidō. According to this scenario, around 1856,[b] Takeda Hanzō (半蔵), the retired predecessor of the family ceramics shop, was one of three Okayama townsmen who collectively devised the new recipe. It was an improvement over the steamed millet dumpling, a common staple wherever the crop was harvested: shaped rectangularly like kakimochi and eaten with red bean paste or with sauce poured on top, the older dumpling did not keep well.[9]
|
Administrative divisions of Japan.
|
Okayama Castle. Okayama Castle (岡山城, Okayama-jō) is a Japanese castle in the city of Okayama in Okayama Prefecture, Japan. The main tower was completed in 1597,[1] destroyed in 1945 and replicated in concrete in 1966. Two of the watch towers survived the bombing of 1945 and are now listed by the national Agency for Cultural Affairs as Important Cultural Properties. In stark contrast to the White Egret Castle of neighboring Himeji, Okayama Castle has a black exterior, earning it the nickname Crow Castle (烏城, U-jō), or castle of the black bird. (The black castle of Matsumoto in Nagano is also known as Crow Castle, but it is karasu-jō in Japanese.) Today, only a few parts of Okayama Castle's roof (including the fish-shaped gargoyles) are gilded, but prior to the Battle of Sekigahara the main keep also featured gilded roof tiles, earning it the nickname Golden Crow Castle (金烏城, Kin U-jō). In 1570, Ukita Naoie killed castle lord Kanemitsu Munetaka and began remodeling the castle,[2] a project completed by his son Hideie in 1597. Three years later, Hideie sided with the ill-fated Toyotomi Clan at the Battle of Sekigahara, was captured by the Tokugawa Clan and exiled to the island prison of Hachijō. The castle and surrounding fiefdoms were given to Kobayakawa Hideaki as spoils of war. Kobayakawa died just two years later without leaving an heir, and the castle (and fiefdom) passed to the Ikeda Clan, who later added Kōraku-en as a private garden. In 1869 the castle became the property of the Meiji government's Hyōbu-shō (Ministry of War), which saw the samurai-era castles as archaic and unnecessary. As at many other castles throughout Japan, the outer moats were filled in, most castle buildings were dismantled and the old castle walls gradually disappeared beneath the city. On June 29, 1945, Allied bombers burnt the main keep and an adjacent gate to the ground, leaving only two turrets and some of the stone walls.
Reconstruction work on the keep and gate began in 1964 and was completed in 1966. In 1996 the rooftop gargoyles were gilded as part of the 400th anniversary celebrations.
|
Animation studio. An animation studio is a company producing animated media. The broadest such companies conceive of products to produce, own the physical equipment for production, employ operators for that equipment, and hold a major stake in the sales or rentals of the media produced. They also own merchandising and creative rights for characters created or held by the company, much like authors holding copyrights. In some early cases, they also held patent rights over methods of animation used in certain studios to boost productivity. Overall, they are business concerns and can function as such in legal terms. The idea of a studio dedicated to animating cartoons was spearheaded by Raoul Barré and his studio, Barré Studio, co-founded with Bill Nolan, beating out the studio created by J.R. Bray, Bray Productions, for the honor of being the first studio dedicated to animation.[1] Though beaten to the post of being first, Bray's studio employee Earl Hurd came up with patents designed for mass-producing the studio's output. As Hurd did not file these patents under his own name but handed them to Bray, the two went on to form the Bray-Hurd Patent Company, which sold these techniques for royalties to other animation studios of the time.[2] The biggest name in animation studios during this early period was Disney Brothers Animation Studio (now known as Walt Disney Animation Studios), co-founded by Walt and Roy O. Disney. Started on October 16, 1923, the studio went on to make its first animated short, Steamboat Willie, in 1928, to much critical success,[3] though the real breakthrough came in 1937, when the studio produced its first full-length animated feature film,
Snow White and the Seven Dwarfs, which laid the foundation for other studios to try to make full-length movies.[4] In 1932, Flowers and Trees, a production by Walt Disney Productions and United Artists, won the first Academy Award for Best Animated Short Film.[5] This period, from the 1920s to the 1950s (or, by some accounts, from 1911 to the death of Walt Disney in 1966), is commonly known as the Golden Age of American Animation, as it included the growth of Disney as well as the rise of Warner Bros. Cartoons and the Metro-Goldwyn-Mayer cartoon studio as prominent animation studios.[6] Disney continued to lead in technical prowess among studios for a long time afterwards, as can be seen in its achievements. In 1941, Otto Messmer created the first animated television commercials, for Botany Tie ads and weather reports; they were shown on NBC-TV in New York until 1949.[2] This marked the first foray of animation designed for the smaller screen, and was followed by the first animated series made specifically for television, Crusader Rabbit, in 1948.[7][better source needed] Its creator, Alex Anderson, had to create the studio Television Arts Productions specifically to produce this series, as his old studio, Terrytoons, refused to make a series for television. Since Crusader Rabbit, however, many studios have seen television as a profitable enterprise and have entered the made-for-television market, with Joseph Barbera and William Hanna refining the production process for television animation on their show Ruff and Reddy. In 1958, The Huckleberry Hound Show claimed the title of the first all-new half-hour cartoon show. This, along with their previous success with the series Tom and Jerry, elevated their animation studio, H.B.
Enterprises (later Hanna-Barbera Productions), to dominate the North American television animation market during the latter half of the 20th century.[8] In 2002, Shrek, produced by DreamWorks and Pacific Data Images, won the first Academy Award for Best Animated Feature.[9] Since then, Disney/Pixar have produced the most films either to win or be nominated for the award.[10]
|
Kōraku-en. Kōraku-en (後楽園, Kōrakuen) is a Japanese garden located in Okayama, Okayama Prefecture. It is one of the Three Great Gardens of Japan, along with Kenroku-en and Kairaku-en. Kōraku-en was built for Ikeda Tsunamasa, lord of Okayama, and the garden reached its modern form in 1863.[1] Zhu Zhiyu, one of the greatest scholars of Confucianism in the Ming dynasty and Edo Japan, helped to redesign the garden.[2] In 1687, the daimyō Ikeda Tsunamasa ordered Tsuda Nagatada to begin construction of the garden. It was completed in 1700 and has retained its original appearance to the present day, except for a few changes made by various daimyōs. The garden was originally called Kōen (後園, later garden) because it was built after Okayama Castle. However, since the garden was built in the spirit of senyūkōraku (先憂後楽, grieve earlier than others, enjoy later than others), the name was changed to Kōraku-en (後楽園) in 1871. Kōraku-en is one of the few daimyō gardens in the provinces whose historical changes can still be traced, thanks to the many Edo-period paintings and the Ikeda family records and documents left behind. The garden was used as a place for entertaining important guests and as a spa of sorts for the daimyōs, although ordinary people could visit on certain days.
|
Ōhori Park. Ōhori Park (大濠公園, Ōhori-kōen) is a park in Chūō-ku, Fukuoka, Japan, and a registered Place of Scenic Beauty.[1] The name Ōhori means a large moat; it derives from the fact that Kuroda Nagamasa, the old lord of Fukuoka, reclaimed the northern half of a cove called Kusagae, which faced Hakata Bay, and made it a moat for Fukuoka Castle. At the same time the Hii (Tajima) River, which had flowed into the cove, was diverted to the west.[citation needed] The present park was reconstructed by Fukuoka City, modeled on the West Lake of China, and opened in 1929. A fireworks festival is held here every August.[citation needed] The Fukuoka Art Museum and the United States Consulate are nearby.[citation needed] Heiwadai Stadium also used to stand near Ohori Park. Opened in 1949, closed in 1997 and demolished the following year, this baseball stadium was home to three NPB franchises. The Nishi Nippon Pirates stayed at Heiwadai only for their single year of operation in 1950. The Nishitetsu Clippers/Lions (now Saitama Seibu Lions) played at Heiwadai for their entire tenure in Fukuoka, from 1950 to 1978, before moving to Saitama. The last team to call Heiwadai home was the Fukuoka Daiei Hawks (now Fukuoka SoftBank Hawks), for their first years there, from 1989 to 1992. The stadium was eventually replaced by the Fukuoka PayPay Dome. During a renovation to Heiwadai in 1987, ruins of an ancient facility were found underneath the bleachers of the stadium. When the stadium was demolished, the outfield bleachers were left standing as archeological work continued, until they too were demolished in 2008 over safety concerns. Ruins were found under the bleachers because the stadium had been built over the ruins of Fukuoka Castle.[citation needed]
|
Hakozaki Shrine. Hakozaki Shrine (筥崎宮, Hakozaki-gū) is a Shintō shrine in Fukuoka.[1] It was founded in 923 with the transfer of the spirit[citation needed] of the kami Hachiman from Daibu Hachiman Shrine in Honami District, Chikuzen Province, in Kyūshū. During the first Mongol invasion, on November 19, 1274 (Bunei 11, 20th day of the 10th month), the Japanese defenders were pushed back from several landing sites.[3] In the ensuing skirmishes, the shrine was burned to the ground.[4] When the shrine was reconstructed, a calligraphy reading Tekikoku kōfuku (敵国降伏; surrender of the enemy nation) was placed on the tower gate. The calligraphy was written by Emperor Daigo and dedicated by the retired Emperor Kameyama as a supplication to Hachiman to defeat the invaders. The shrine is highly ranked among the many shrines in Japan. It was listed in the Engishiki-jinmyōchō (延喜式神名帳), edited in 927. In the 11th or 12th century, the shrine was ranked as the Ichinomiya (一宮; first shrine) of Chikuzen Province. From 1871 to 1946, Hakozaki was officially designated a Kanpei-taisha (官幣大社), the first rank of government-supported shrines. Other similar Hachiman shrines are Iwashimizu Hachimangū in Yawata, Kyoto Prefecture, and Usa Shrine in Usa, Ōita Prefecture.[5]
|