https://en.wikipedia.org/wiki/Layers%20%28digital%20image%20editing%29
Layers are used in digital image editing to separate different elements of an image. A layer can be compared to a transparency on which imaging effects or images are applied and placed over or under an image. Today they are an integral feature of image editors. Layers were first commercially available in Fauve Matisse (later Macromedia xRes) and then in Adobe Photoshop 3.0 in 1994, but today a wide range of other programs, such as Photo-Paint, Paint Shop Pro, GIMP, Paint.NET, StylePix, and even batch processing tools, also include this feature. In vector image editors that support animation, layers are used to further enable manipulation along a common timeline for the animation; in SVG images, the equivalent to layers are "groups". Layer types There are different kinds of layers, and not all of them exist in all programs. They represent a part of a picture, either as pixels or as modification instructions. They are stacked on top of each other and, depending on the order, determine the appearance of the final picture. In graphics software, layers are the different levels at which one can place an object or image file. In the program, layers can be stacked, merged, or defined when creating a digital image. Layers can be partially obscured, allowing portions of images within a layer to be hidden or shown in a translucent manner within another image. Layers can also be used to combine two or more images into a single digital image. For the purpose of editing, working with layers allows changes to be applied to just one specific layer. Layer (basic) The standard layer available in most programs consists of a rectangular, semitransparent picture which may be superimposed over other layers. Some programs require that layers cover the same area as the final canvas, while others allow layers of multiple sizes. Each layer may bear individual settings, such as opacity, blending modes, dynamic filters, and potentially hundreds of other properties. Layer
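The layer settings described above come down to compositing arithmetic. As a minimal sketch (not any particular editor's implementation), the standard "over" operator with a per-layer opacity setting can be written as:

```python
def composite_over(top, bottom, opacity=1.0):
    """Composite one layer's pixel over another's using the 'over'
    operator. Each pixel is (r, g, b, a) with components in 0..1;
    `opacity` scales the top layer's alpha, like a layer opacity
    slider would."""
    tr, tg, tb, ta = top
    br, bg, bb, ba = bottom
    ta *= opacity                      # apply layer-level opacity
    out_a = ta + ba * (1 - ta)         # resulting alpha
    if out_a == 0:
        return (0.0, 0.0, 0.0, 0.0)
    blend = lambda t, b: (t * ta + b * ba * (1 - ta)) / out_a
    return (blend(tr, br), blend(tg, bg), blend(tb, bb), out_a)

# A 50%-opaque red layer over an opaque blue background:
print(composite_over((1, 0, 0, 1), (0, 0, 1, 1), opacity=0.5))
# → (0.5, 0.0, 0.5, 1.0)
```

Stacking more layers just repeats this operation from the bottom of the stack upward.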
https://en.wikipedia.org/wiki/Serostatus
Serostatus refers to the presence or absence of a serological marker in the blood. The presence of detectable levels of a specific marker within the serum is considered seropositivity, while the absence of such levels is considered seronegativity. HIV/AIDS The term serostatus is commonly used in HIV/AIDS prevention efforts. In the late 20th and early 21st centuries, social advocacy has emphasized the importance of learning one's HIV/AIDS serostatus in an effort to curtail the spread of the disease. Autoimmune disease Researchers have investigated the effects of autoantibody serostatus on autoimmune disease presentation. Study of seronegative patient populations has led to the identification of additional autoantibodies that could potentially help with diagnosis. See also Seroconversion Correlates of immunity
https://en.wikipedia.org/wiki/1%3A32%20scale
1:32 scale is a traditional scale for models and miniatures, in which one unit (such as an inch or a centimeter) on the model represents 32 units on the actual object. It is also known as "three-eighths scale", since 3/8 inch represents a foot. A tall person is modeled as tall in 1:32 scale. 1:32 was once so common a scale for toy trains, autos, and soldiers that it was known as "standard size" in the industry (not to be confused with Lionel's "Standard Gauge"). 1:32 is the scale for Gauge 1 toy and model trains. It was the scale of some of the earliest plastic model car kits. It is a common scale for aircraft models and for figure modeling, where it is called 54 mm scale, from the height of the human figure. 1:32 was used for equipment to match 54 mm toy soldiers for miniature wargaming and was common in scale military modeling, such as tanks and armored cars, until it was largely replaced by 1:35 scale. 1:32 is now considered the normal scale for agricultural models such as those by Britains or Siku. 1:32 is also the preferred scale for modeling aircraft, as this "large scale" gives the builder the opportunity to better detail a kit or scratch-built aircraft project. 1:32 aircraft models also have their own contest category in modeling competitions under IPMS rules, and the 1:32 scale category is considered the top tier in aircraft modeling contest categories. 1:32 is a useful scale for scratch modelling of railways. As well as standard gauge models using gauge 1 (45 mm) track, narrow gauge modellers use 0 gauge (32 mm) track for 42", 1 m and 36" prototype gauges. Also H0/00 track at 16.5 mm is used to represent models of gauge railways. Today, 1:32 is associated with slot car scale. A standard for tabletop rail-racing in the mid-1950s, it was adopted by the original slot car manufacturers, Victory Industries and Scalextric. Fewer 1:32 car model kits are manufactured today, making scratch building slot cars quite a bit more difficult than it used to be. See also List
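The scale arithmetic above is plain division. A small illustrative helper (hypothetical, not from any modeling standard):

```python
def model_size(real_size, scale=32):
    """Length on a 1:scale model for a given real-world length
    (any unit, since the ratio is unitless)."""
    return real_size / scale

# A 12-inch (one-foot) length becomes 12/32 = 0.375 in, i.e. 3/8 inch,
# which is where the name "three-eighths scale" comes from:
print(model_size(12))   # → 0.375
# A 72-inch (six-foot) person stands 2.25 inches tall at 1:32:
print(model_size(72))   # → 2.25
```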
https://en.wikipedia.org/wiki/Dorian%20M.%20Goldfeld
Dorian Morris Goldfeld (born January 21, 1947) is an American mathematician working in analytic number theory and automorphic forms at Columbia University. Professional career Goldfeld received his B.S. degree in 1967 from Columbia University. His doctoral dissertation, entitled "Some Methods of Averaging in the Analytical Theory of Numbers", was completed under the supervision of Patrick X. Gallagher in 1969, also at Columbia. He has held positions at the University of California at Berkeley (Miller Fellow, 1969–1971), Hebrew University (1971–1972), Tel Aviv University (1972–1973), the Institute for Advanced Study (1973–1974), in Italy (1974–1976), at MIT (1976–1982), the University of Texas at Austin (1983–1985) and Harvard (1982–1985). Since 1985, he has been a professor at Columbia University. He is a member of the editorial boards of Acta Arithmetica and The Ramanujan Journal. On January 1, 2018 he became the Editor-in-Chief of the Journal of Number Theory. He is a co-founder and board member of Veridify Security, formerly SecureRF, a corporation that has developed the world's first linear-based security solutions. Goldfeld advised several doctoral students, including M. Ram Murty. In 1986, he brought Shou-Wu Zhang to the United States to study at Columbia. Research interests Goldfeld's research interests include various topics in number theory. In his thesis, he proved a version of Artin's conjecture on primitive roots on the average without the use of the Riemann Hypothesis. In 1976, Goldfeld provided an ingredient for the effective solution of Gauss's class number problem for imaginary quadratic fields. Specifically, he proved an effective lower bound for the class number of an imaginary quadratic field, assuming the existence of an elliptic curve whose L-function has a zero of order at least 3 at s = 1. (Such a curve was found soon after by Gross and Zagier.) This effective lower bound then allows the determination of all imaginary quadratic fields with a given class number.
https://en.wikipedia.org/wiki/NCR%20315
The NCR 315 Data Processing System, released in January 1962 by NCR, is a second-generation computer. All printed circuit boards use resistor–transistor logic (RTL) to create the various logic elements. It uses a 12-bit "slab" memory structure built from magnetic-core memory. The instructions can use a memory slab as either two 6-bit alphanumeric characters or as three 4-bit BCD digits. Basic memory is 5000 "slabs" (10,000 characters or 15,000 decimal digits) of handmade core memory, which is expandable to a maximum of 40,000 slabs (80,000 characters or 120,000 decimal digits) in four refrigerator-size cabinets. The main processor includes three cabinets and a console section that houses the power supply, keyboard, output writer (an IBM electric typewriter), and a panel with lights that indicate the current status of the program counter, registers, arithmetic accumulator, and system errors. Input/output is by direct parallel connections to each type of peripheral through a two-cable bundle with 1-inch-thick cables. Some devices, like magnetic tape and the CRAM, are daisy-chained to allow multiple drives to be connected. The central processor (315 Data Processor) weighed about . Later models in this series include the 315-100 and the 315-RMC (Rod Memory Computer). Memory organization The addressable unit of memory on the NCR 315 series is a "slab", short for "syllable", consisting of 12 data bits and a parity bit. Its size falls between a byte and a typical word (hence the name "syllable"). A slab may contain three digits (with the at sign, comma, space, ampersand, point, and minus treated as digits) or two alphabetic characters of six bits each. A slab may contain a decimal value from −99 to +999. A numeric value contains up to eight slabs. If the value is negative, the minus sign is the leftmost digit of this row. There are instructions to transform digits to or from alphanumeric characters. These commands use the accumulator, which has a maximum length of eight slabs.
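The two interpretations of a slab can be sketched as bit packing. This is an illustrative model only; the actual NCR character and digit codes would follow NCR's documentation:

```python
def pack_digits(d1, d2, d3):
    """Pack three 4-bit BCD digits into one 12-bit slab
    (parity bit not modeled)."""
    assert all(0 <= d <= 15 for d in (d1, d2, d3))
    return (d1 << 8) | (d2 << 4) | d3

def pack_chars(c1, c2):
    """Pack two 6-bit alphanumeric character codes into one 12-bit slab."""
    assert 0 <= c1 < 64 and 0 <= c2 < 64
    return (c1 << 6) | c2

slab = pack_digits(9, 4, 2)        # the decimal value 942
print(format(slab, '012b'))        # → 100101000010  (1001|0100|0010)
print(pack_chars(0b101010, 0b010101) == 0b101010010101)  # → True
```

The same 12 bits are reinterpreted depending on which instruction touches them, which is exactly the character/digit duality described above.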
https://en.wikipedia.org/wiki/Microbial%20population%20biology
Microbial population biology is the application of the principles of population biology to microorganisms. Distinguishing from other biological disciplines Microbial population biology, in practice, is the application of population ecology and population genetics toward understanding the ecology and evolution of bacteria, archaebacteria, microscopic fungi (such as yeasts), additional microscopic eukaryotes (e.g., "protozoa" and algae), and viruses. Microbial population biology also encompasses the evolution and ecology of community interactions (community ecology) between microorganisms, including microbial coevolution and predator-prey interactions. In addition, microbial population biology considers microbial interactions with more macroscopic organisms (e.g., host-parasite interactions), though strictly this should be more from the perspective of the microscopic rather than the macroscopic organism. A good deal of microbial population biology may be described also as microbial evolutionary ecology. On the other hand, typically microbial population biologists (unlike microbial ecologists) are less concerned with questions of the role of microorganisms in ecosystem ecology, which is the study of nutrient cycling and energy movement between biotic as well as abiotic components of ecosystems. Microbial population biology can include aspects of molecular evolution or phylogenetics. Strictly, however, these emphases should be employed toward understanding issues of microbial evolution and ecology rather than as a means of understanding more universal truths applicable to both microscopic and macroscopic organisms. The microorganisms in such endeavors consequently should be recognized as organisms rather than simply as molecular or evolutionary reductionist model systems. Thus, the study of RNA in vitro evolution is not microbial population biology and nor is the in silico generation of phylogenies of otherwise non-microbial sequences, even if aspects of either may
https://en.wikipedia.org/wiki/Western%20equine%20encephalitis%20virus
The Western equine encephalomyelitis virus is the causative agent of the relatively uncommon viral disease Western equine encephalomyelitis (WEE). An alphavirus of the family Togaviridae, the WEE virus is an arbovirus (arthropod-borne virus) transmitted by mosquitoes of the genera Culex and Culiseta. WEE is a recombinant virus between two other alphaviruses: an ancestral Sindbis virus-like virus and an ancestral Eastern equine encephalitis virus-like virus. There have been under 700 confirmed cases in the U.S. since 1964. This virus contains an envelope that is made up of glycoproteins and nucleic acids. The virus is transmitted to people and horses by bites from infected mosquitoes (Culex tarsalis and Aedes taeniorhynchus) and birds during wet, summer months. According to the CDC, geographic occurrence for this virus is worldwide, and it tends to be more prevalent in and around swampy areas where human populations tend to be limited. In North America, WEE is seen primarily in U.S. states and Canadian provinces west of the Mississippi River. The disease is also seen in countries of South America. WEE is commonly a subclinical infection; symptomatic infections are uncommon. However, the disease can cause serious sequelae in infants and children. Unlike Eastern equine encephalitis, the overall mortality of WEE is low (approximately 4%) and is associated mostly with infection in the elderly. Approximately 15–20% of horses that acquire the virus will die or be put down. There is no human vaccine for WEE and there are no licensed therapeutic drugs in the U.S. for this infection. The virus affects the brain and spinal cord of the infected host. History WEE was discovered in 1930 when a number of horses in the San Joaquin Valley of California, USA died of a mysterious encephalitis. Karl Friedrich Meyer investigated but was not able to isolate the pathogen from necropsies of horses that had been dead for some time and needed samples from an animal in the earlier
https://en.wikipedia.org/wiki/Mission%20specialist
Mission specialist (MS) was a specific position held by certain NASA astronauts who were tasked with conducting a range of scientific, medical, or engineering experiments during a spaceflight mission. These specialists were usually assigned a specific field of expertise related to the goals of their particular mission. Mission specialists were highly trained individuals who underwent extensive training in preparation for their missions. They were required to have a broad range of skills, including knowledge of science and engineering, as well as experience in operating complex equipment in a zero-gravity environment. During a mission, mission specialists were responsible for conducting experiments, operating equipment, and performing spacewalks to repair or maintain equipment outside the spacecraft. They also played a critical role in ensuring the safety of the crew by monitoring the spacecraft's systems and responding to emergencies as needed. The role of mission specialist was an important one in the Space Shuttle program, as mission specialists were instrumental in the success of the program's many scientific and engineering missions. Many of the advances in science and technology made during this period were made possible by the hard work and dedication of the mission specialists who pushed the boundaries of what was possible in space.
https://en.wikipedia.org/wiki/Modal%20analysis
Modal analysis is the study of the dynamic properties of systems in the frequency domain. It consists of mechanically exciting a studied component in such a way as to target the mode shapes of the structure, and recording the vibration data with a network of sensors. Examples would include measuring the vibration of a car's body when it is attached to a shaker, or the noise pattern in a room when excited by a loudspeaker. Modern experimental modal analysis systems are composed of 1) sensors such as transducers (typically accelerometers and load cells) or non-contact sensors (a laser vibrometer or stereophotogrammetric cameras), 2) a data acquisition system with an analog-to-digital converter front end (to digitize analog instrumentation signals), and 3) a host PC (personal computer) to view and analyze the data. Classically this was done with a SIMO (single-input, multiple-output) approach, that is, one excitation point, with the response measured at many other points. In the past a hammer survey, using a fixed accelerometer and a roving hammer as excitation, gave a MISO (multiple-input, single-output) analysis, which is mathematically identical to SIMO, due to the principle of reciprocity. In recent years MIMO (multiple-input, multiple-output) testing has become more practical, where partial coherence analysis identifies which part of the response comes from which excitation source. Using multiple shakers leads to a uniform distribution of the energy over the entire structure and a better coherence in the measurement; a single shaker may not effectively excite all the modes of a structure. Typical excitation signals can be classed as impulse, broadband, swept sine, chirp, and possibly others. Each has its own advantages and disadvantages. The analysis of the signals typically relies on Fourier analysis. The resulting transfer function will show one or more resonances, whose characteristic mass, frequency, and damping ratio can be estimated from the measurements. The animate
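As a rough sketch of what a modal-analysis curve fit recovers, consider the frequency response function of a single-degree-of-freedom oscillator: its resonance peak and damping ratio follow from the mass, damping, and stiffness. The m, c, k values below are made up for illustration; real modal analysis estimates these quantities from measured FRF data rather than computing them analytically:

```python
import math

def frf(omega, m, c, k):
    """Receptance FRF of a single-degree-of-freedom system:
    H(w) = X/F = 1 / (k - m*w^2 + i*c*w)."""
    return 1.0 / complex(k - m * omega**2, c * omega)

m, c, k = 1.0, 0.4, 100.0            # illustrative mass, damping, stiffness
wn = math.sqrt(k / m)                # undamped natural frequency (10 rad/s)
zeta = c / (2 * math.sqrt(k * m))    # damping ratio (0.02)

# Sweep the FRF and locate the resonance peak, as a curve fit
# would from measured data:
ws = [i * 0.01 for i in range(1, 2001)]
peak_w = max(ws, key=lambda w: abs(frf(w, m, c, k)))
print(round(peak_w, 2), round(zeta, 3))   # → 10.0 0.02
```

The sharpness of the peak encodes the damping ratio, which is why lightly damped modes stand out clearly in measured transfer functions.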
https://en.wikipedia.org/wiki/Fractional%20cascading
In computer science, fractional cascading is a technique to speed up a sequence of binary searches for the same value in a sequence of related data structures. The first binary search in the sequence takes a logarithmic amount of time, as is standard for binary searches, but successive searches in the sequence are faster. The original version of fractional cascading, introduced in two papers by Chazelle and Guibas in 1986, combined the idea of cascading, originating in range searching data structures, with the idea of fractional sampling. Later authors introduced more complex forms of fractional cascading that allow the data structure to be maintained as the data changes by a sequence of discrete insertion and deletion events. Example As a simple example of fractional cascading, consider the following problem. We are given as input a collection of k ordered lists of numbers, such that the total length of all lists is n, and must process them so that we can perform binary searches for a query value q in each of the lists. For instance, with k = 4 and n = 17: L1 = 24, 64, 65, 80, 93; L2 = 23, 25, 26; L3 = 13, 44, 62, 66; L4 = 11, 35, 46, 79, 81. The simplest solution to this searching problem is just to store each list separately. If we do so, the space requirement is O(n), but the time to perform a query is O(k log n), as we must perform a separate binary search in each of the k lists. The worst case for querying this structure occurs when each of the k lists has equal size n/k, so each of the k binary searches involved in a query takes time O(log(n/k)). A second solution allows faster queries at the expense of more space: we may merge all the lists into a single big list M, and associate with each item x of M a list of the results of searching for x in each of the smaller lists Li. If we describe an element of this merged list as [x, a1, a2, a3, a4], where x is the numerical value and a1, a2, a3, and a4 are the positions (the first number has position 0) of the next element at least as large as x in each of the original lists.
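The second solution can be sketched directly with the example lists above. This is a toy illustration of the merged-list idea (one binary search, then constant-time lookups), not the full fractional cascading structure:

```python
import bisect

# The four lists from the example:
lists = [
    [24, 64, 65, 80, 93],
    [23, 25, 26],
    [13, 44, 62, 66],
    [11, 35, 46, 79, 81],
]

# One merged list M; with each element store, for every input list,
# the position of the first element >= that value.
merged = sorted(set(v for L in lists for v in L))
pointers = {v: [bisect.bisect_left(L, v) for L in lists] for v in merged}

def search_all(q):
    """One binary search in the merged list, then O(1) per input list."""
    i = bisect.bisect_left(merged, q)
    if i == len(merged):                 # q is larger than every element
        return [len(L) for L in lists]
    return pointers[merged[i]]

print(search_all(50))   # → [1, 3, 2, 3]
```

This gives fast queries but can use quadratic space in the worst case; fractional cascading keeps the fast queries while storing only a sampled fraction of each list in its neighbor.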
https://en.wikipedia.org/wiki/Industrial%20PC
An industrial PC is a computer intended for industrial purposes (production of goods and services), with a form factor between a nettop and a server rack. Industrial PCs have higher dependability and precision standards, and are generally more expensive than consumer electronics. They often use complex instruction sets, such as x86, where reduced instruction sets such as ARM would otherwise be used. History IBM released the 5531 Industrial Computer in 1984, arguably the first "industrial PC". The IBM 7531, an industrial version of the IBM AT PC, was released May 21, 1985. Industrial Computer Source first offered the 6531 Industrial Computer in 1985. This was a proprietary 4U rackmount industrial computer based on a clone IBM PC motherboard. Applications Industrial PCs are primarily used for process control and/or data acquisition. In some cases, an industrial PC is simply used as a front-end to another control computer in a distributed processing environment. Software can be custom written for a particular application, or an off-the-shelf package such as TwinCAT, Wonderware, Labtech Notebook or LabVIEW can be used to provide a base level of programming. Analog Devices obtained exclusive sales rights for the European OEM industrial market and offered the MACSYM 120, which combined the IBM 5531 with MACBASIC, a multitasking BASIC running on Concurrent CP/M from Digital Research. Analog and digital I/O cards plugged inside the PC and/or an extension rack made the MACSYM 120 one of the most powerful and easiest-to-use controllers for plant applications at the time. An application may simply require the I/O such as the serial port offered by the motherboard. In other cases, expansion cards are installed to provide analog and digital I/O, specific machine interfaces, expanded communications ports, and so forth, as required by the application. Industrial PCs offer different features than consumer PCs in terms of reliability, compatibility, expansion options and long-term supply. Industrial PCs are typically characterized
https://en.wikipedia.org/wiki/Askania-Nova
Askania-Nova is a Ukrainian nature reserve located in Kherson Oblast, Ukraine, within the dry Taurida steppe near Oleshky Sands, and an active member of the UNESCO Man and the Biosphere Programme. It is also a research institute of the Ukrainian Academy of Agricultural Sciences. The reserve consists of a zoological park, a botanical (dendrological) garden, and an open territory of virgin steppes. History The nature reserve was established in 1898 by Friedrich-Jacob Eduardovych Falz-Fein (1863–1920) around the German colony of Askania-Nova, which only in 1890 had become an organized settlement (khutir). In March 1919, Askania-Nova was confiscated from the Falz-Fein family by the Red Army as part of the state nationalization programme. The last owner refused to evacuate to Germany. She was Sofia-Louise Bohdanivna (Gottlieb) Knauff (1835–1919), the mother of Friedrich Falz-Fein. Her refusal resulted in her summary execution by two Red Army guardsmen, who shot her for failing to surrender her estate in Khorly (today a port in Kherson Oblast). On April 1, 1919, Askania-Nova was declared a People's Sanctuary Park by a decree of the Council of People's Commissars of the Ukrainian SSR; on February 8, 1921, it was reorganized into a State Steppe Reserve of the Ukrainian SSR. The main purposes of the reserve were to preserve and study the environment of the virgin steppe, as well as possibly to acclimatize and study a larger number of animal and plant types. Askania-Nova became a scientific-steppe station, a zoo-technical station with breeding farms, a phyto-technical station, and included various other scientific institutions. Notably, the zoo and botanical garden were greatly expanded. Part of the reserve included portions of steppe reserve, an acclimatization zoo, and an arboretum. From 1932 to 1956, the reserve was transformed into the All-Union scientific-research institute for the hybridization and acclimatization of animals of M. Ivanov. It consisted of 12 departm
https://en.wikipedia.org/wiki/Titahi%20Bay%20Transmitter
The Titahi Bay Transmitter, which until 16 February 2016 was New Zealand's second-tallest structure, transmitted AM radio signals from a 220-metre-tall radio mast, insulated from ground, at Titahi Bay in New Zealand. The station, which previously had three masts, now consists of only one mast with a height of 137 metres. A third – smaller – mast with a height of 53 metres was toppled on 10 November 2015. The tower's surrounding buildings were opened in 1937. Five radio programmes broadcast on four frequencies from the tower: Radio New Zealand National on 567 kHz, Star and AM Network on 657 kHz, Newstalk ZB on 1035 kHz, Te Upoko O Te Ika on 1161 kHz. In 2004 the tower was refurbished: badly corroded parts were removed and replaced, the whole tower was sand-blasted and repainted, and an array of LED warning lights was added at the behest of the NZCAA. According to workers refurbishing it, scaling the tower takes 45 minutes. From the top there are views of the entire Kapiti coast region. The site formerly transmitted Radio New Zealand's shortwave service; these broadcasts used a series of shorter free-standing masts supporting curtain arrays. Under the right conditions, the AM signal for National Radio can be received as far north as Norfolk Island and as far south as Dunedin. The Department of Conservation owns the land surrounding the tower, which is leased to Radio NZ for the transmitting towers, to the local Titahi Golf Club, and as farm land. The site is located within Whitirea Park, and is planned to come under the control of the Greater Wellington Regional Council. Only in recent years has the station's emergency power generator been replaced. The previous one, supplied by the American military after the Second World War, formed part of the driving machinery of a submarine which was no longer required. The site was never used for overseas telephone links, which (before the advent of undersea cables and satellites) were provided by two New Zealand Post
https://en.wikipedia.org/wiki/Broadcast%20signal%20intrusion
A broadcast signal intrusion is the hijacking of broadcast signals of radio, television stations, cable television broadcast feeds or satellite signals without permission or licence. Hijacking incidents have involved local TV and radio stations as well as cable and national networks. Although television, cable, and satellite broadcast signal intrusions tend to receive more media coverage, radio station intrusions are more frequent, as many stations simply rebroadcast a signal received from another radio station. All that is required is an FM transmitter that can overpower the frequency of the station being rebroadcast. Other methods that have been used in North America to intrude on legal broadcasts include breaking into the transmitter area and splicing audio directly into the feed. As a cable television operator connects itself in the signal path between individual stations and the system's subscribers, broadcasters have fallen victim to signal tampering on cable systems on multiple occasions. Notable incidents Southern Television On November 26, 1977, an audio message, purporting to come from outer space and conveyed by an individual named 'Vrillon' of the 'Ashtar Galactic Command', was broadcast during an ITN news bulletin on Southern Television in the United Kingdom. The intrusion did not entirely affect the video signal but replaced the program audio with a six-minute speech about the destiny of the human race and a disaster to affect "your world and the beings on other worlds around you". The IBA confirmed that it was the first time such a transmission had been made. Captain Midnight At 12:32 a.m. Eastern Time on April 27, 1986, HBO (Home Box Office) had its satellite signal feed from its operations center on Long Island in Hauppauge, New York interrupted by a man calling himself "Captain Midnight". The interruption occurred during a presentation of The Falcon and the Snowman. The intrusion lasted between 4 and 5 minutes and was seen by viewers alon
https://en.wikipedia.org/wiki/Address%20decoder
In digital electronics, an address decoder is a binary decoder that has two or more inputs for address bits and one or more outputs for device selection signals. When the address for a particular device appears on the address inputs, the decoder asserts the selection output for that device. A dedicated, single-output address decoder may be incorporated into each device on an address bus, or a single address decoder may serve multiple devices. A single address decoder with n address input bits can serve up to 2^n devices. Several members of the 7400 series of integrated circuits can be used as address decoders. For example, when used as an address decoder, the 74154 provides four address inputs and sixteen (i.e., 2^4) device selector outputs. An address decoder is a particular use of a binary decoder circuit known as a "demultiplexer" or "demux" (the 74154 is commonly called a "4-to-16 demultiplexer"), which has many other uses besides address decoding. Address decoders are fundamental building blocks for systems that use buses. They are represented in all integrated circuit families and processes and in all standard FPGA and ASIC libraries, and they are discussed in introductory textbooks on digital logic design. Address decoder selects the storage cell in a memory An address decoder is a commonly used component in microelectronics that selects memory cells in randomly addressable memory devices. Such a memory cell consists of a fixed number of memory elements or bits. The address decoder is connected to an address bus and reads the address placed there. Using switching logic, it determines from this address which memory cell is to be accessed, and then activates that cell via a special control line, also known as the select line. In dynamic memories (DRAM), there are row and column select lines on the memory matrix, which are controlled by address decoders integrated in the chip. Depending on the type of decoder, the
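The behaviour of such a decoder is easy to model: for an n-bit address, exactly one of the 2^n select outputs is asserted. A minimal sketch (note that the real 74154's outputs are active-low; active-high is used here for clarity):

```python
def address_decoder(address, n_bits=4):
    """Model an n-bit address decoder (e.g. the 74154 4-to-16
    demultiplexer): assert exactly one of 2**n select outputs."""
    assert 0 <= address < 2 ** n_bits
    return [1 if i == address else 0 for i in range(2 ** n_bits)]

outputs = address_decoder(0b1010)      # address 10 on the address inputs
print(outputs.index(1), sum(outputs))  # → 10 1  (only select line 10 is asserted)
```

The "one-hot" output pattern is what lets a single decoder route a chip-select signal to exactly one device or memory row.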
https://en.wikipedia.org/wiki/NCR%20Century%20100
The NCR Century 100 was NCR's first all-integrated-circuit computer, built in 1968. All logic functions were created by wire-wrapping NAND gates together to form flip-flops and other complex circuits. The console of the system had only 18 lights and switches and allowed entry of a boot routine, or changes to loaded programs or data in memory. A typewriter console was also available. Peripherals The 615-100 Series integrated a complete data processing system that had 16 KB or 32 KB of short-rod memory, an 80-column punched card reader or paper tape reader, two 5 MB removable disk drives, and a 600-line-per-minute line printer. The system could be provided with a punched paper tape reader, or an external card reader/punch, and also allowed for the attachment of multiple 9-track, 1/2-inch, reel-to-reel magnetic tape drives. Two more disk drives could be attached to the system. The Century series used an instruction set with two instruction lengths: 4 bytes (32 bits) and 8 bytes (64 bits). Rod memory The memory of the Century Series computers used machine-made, short, iron-oxide-coated ceramic rods— long and approximately the diameter of a human hair— as their random access memories, instead of the hand-labor-intensive core memories that were used by other computers of the time. The economy of machine assembly was augmented by selling rod memory without paying patent royalties on core memory to NCR's competitor, IBM. Each 16K memory module consisted of two stacks, each stack containing sixteen planes of 4608 rods. Disk drives The Model 655 disk drive used a removable disk pack. It was the first by NCR to employ floating or flying heads, with 12 read/write heads per surface. This reduced track-to-track movement and thus access times. However, it meant that there were 12 times more heads per drive, increasing the likelihood of head crashes. These flying heads were moved using a 16-position magnetic actuator. The actuator used four different magnets to create the 16 positions.
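The stated rod counts can be cross-checked arithmetically. Assuming each rod stores one bit, the figures are consistent with 16 K bytes of nine bits each, i.e. eight data bits plus one extra bit per byte; reading that ninth bit as parity is an assumption here, not something the text states:

```python
# Geometry stated in the text for one 16K memory module:
stacks, planes, rods_per_plane = 2, 16, 4608
total_rods = stacks * planes * rods_per_plane
print(total_rods)                    # → 147456 rods (one bit each, assumed)

# 147456 bits / 16384 bytes = 9 bits per byte, consistent with
# 8 data bits plus one extra (presumably parity) bit:
print(total_rods // (16 * 1024))     # → 9
```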
https://en.wikipedia.org/wiki/Continuous%20spatial%20automaton
In automata theory (a subfield of computer science), continuous spatial automata, unlike cellular automata, have a continuum of locations, while the state of a location still is one of a finite number of real numbers. Time can also be continuous, and in this case the state evolves according to differential equations. One important example is reaction–diffusion textures, differential equations proposed by Alan Turing to explain how chemical reactions could create the stripes on zebras and the spots on leopards. When these are approximated by cellular automata, such CAs often yield similar patterns. Another important example is neural fields, the continuum limit of neural networks, in which average firing rates evolve according to integro-differential equations. Such models demonstrate spatiotemporal pattern formation, localized states, and travelling waves. They have been used as models for cortical memory states and visual hallucinations. MacLennan considers continuous spatial automata as a model of computation and has demonstrated that they can achieve Turing universality. See also Analog computer Coupled map lattice
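A reaction–diffusion system of the kind Turing proposed can be sketched by discretizing a one-dimensional equation into CA-like cells. This uses the Fisher–KPP equation as a stand-in (the parameters are illustrative, not from any particular model in the text), and shows the travelling-wave behaviour mentioned above:

```python
# Discretize u_t = D*u_xx + u*(1 - u) (Fisher-KPP, a simple
# reaction-diffusion equation) on a 1-D line of cells. The grid is a
# CA-style approximation of the continuum.
N, D, dt, dx, steps = 100, 1.0, 0.1, 1.0, 200
u = [1.0 if i < 10 else 0.0 for i in range(N)]   # seed the left edge

for _ in range(steps):
    # Discrete Laplacian with clamped (no-flux-like) boundaries:
    lap = [u[max(i - 1, 0)] - 2 * u[i] + u[min(i + 1, N - 1)]
           for i in range(N)]
    u = [u[i] + dt * (D * lap[i] / dx**2 + u[i] * (1 - u[i]))
         for i in range(N)]

# A travelling front: cells near the seed have switched to ~1,
# while distant cells are still ~0.
print(round(u[5], 2), round(u[95], 2))
```

Approximating the continuum this coarsely is exactly the CA approximation the text refers to: the discrete model reproduces the front even though locations are no longer continuous.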
https://en.wikipedia.org/wiki/Plant%20anatomy
Plant anatomy or Phytotomy is the general term for the study of the internal structure of plants. Originally it included plant morphology, the description of the physical form and external structure of plants, but since the mid-20th century plant anatomy has been considered a separate field referring only to internal plant structure. Plant anatomy is now frequently investigated at the cellular level, and often involves the sectioning of tissues and microscopy. Structural divisions Some studies of plant anatomy use a systems approach, organized on the basis of the plant's activities, such as nutrient transport, flowering, pollination, embryogenesis or seed development. Others are more classically divided into the following structural categories: Flower anatomy, including study of the Calyx, Corolla, Androecium, and Gynoecium Leaf anatomy, including study of the Epidermis, stomata and Palisade cells Stem anatomy, including Stem structure and vascular tissues, buds and shoot apex Fruit/Seed anatomy, including structure of the Ovule, Seed, Pericarp and Accessory fruit Wood anatomy, including structure of the Bark, Cork, Xylem, Phloem, Vascular cambium, Heartwood and sapwood and branch collar Root anatomy, including structure of the Root, root tip, endodermis History About 300 BC Theophrastus wrote a number of plant treatises, only two of which survive, Enquiry into Plants (Περὶ φυτῶν ἱστορία), and On the Causes of Plants (Περὶ φυτῶν αἰτιῶν). He developed concepts of plant morphology and classification, which did not withstand the scientific scrutiny of the Renaissance. A Swiss physician and botanist, Gaspard Bauhin, introduced binomial nomenclature into plant taxonomy. He published Pinax theatri botanici in 1596, which was the first to use this convention for naming of species. His criteria for classification included natural relationships, or 'affinities', which in many cases were structural. It was in the late 1600s that plant anatomy became refined int
https://en.wikipedia.org/wiki/Plant%20morphology
Phytomorphology is the study of the physical form and external structure of plants. This is usually considered distinct from plant anatomy, which is the study of the internal structure of plants, especially at the microscopic level. Plant morphology is useful in the visual identification of plants. Recent studies in molecular biology started to investigate the molecular processes involved in determining the conservation and diversification of plant morphologies. In these studies transcriptome conservation patterns were found to mark crucial ontogenetic transitions during the plant life cycle which may result in evolutionary constraints limiting diversification. Scope Plant morphology "represents a study of the development, form, and structure of plants, and, by implication, an attempt to interpret these on the basis of similarity of plan and origin". There are four major areas of investigation in plant morphology, and each overlaps with another field of the biological sciences. First of all, morphology is comparative, meaning that the morphologist examines structures in many different plants of the same or different species, then draws comparisons and formulates ideas about similarities. When structures in different species are believed to exist and develop as a result of common, inherited genetic pathways, those structures are termed homologous. For example, the leaves of pine, oak, and cabbage all look very different, but share certain basic structures and arrangement of parts. The homology of leaves is an easy conclusion to make. The plant morphologist goes further, and discovers that the spines of cactus also share the same basic structure and development as leaves in other plants, and therefore cactus spines are homologous to leaves as well. This aspect of plant morphology overlaps with the study of plant evolution and paleobotany. Secondly, plant morphology observes both the vegetative (somatic) structures of plants, as well as the reproductive str
https://en.wikipedia.org/wiki/Deluge%20%28software%29
Deluge BitTorrent Client is a free and open-source, cross-platform BitTorrent client written in Python. Deluge uses a front and back end architecture where libtorrent, a software library written in C++ which provides the application's networking logic, is connected to one of various front ends including a text console, the web interface and a graphical desktop interface using GTK through the project's own Python bindings. Deluge is released under the terms of the GPL-3.0-or-later license. Features Deluge aims to be a lightweight, secure, and feature-rich client. To help achieve this, most of its features are part of plugin modules which were written by various developers. Starting with version 1.0, Deluge separated its core from its interface, running it instead in a daemon (server/service), allowing users to remotely manage the application over the web. Deluge has supported magnet links since version 1.1.0 released in January 2009. History Deluge was started by two members of ubuntuforums.org, Zach Tibbitts and Alon Zakai, who previously hosted and maintained the project at Google Code, but who subsequently moved it to its own website. In its first stages, Deluge was originally titled gTorrent, to reflect that it was targeted for the GNOME desktop environment. When the first version was released on September 25, 2006, it was renamed to Deluge due to an existing project named gtorrent on SourceForge, in addition to the fact that it was finally coded to work not only on GNOME but on any platform which could support GTK. The 0.5.x release marked a complete rewrite from the 0.4.x code branch. The 0.5.x branch added support for encryption, peer exchange, binary prefix, and UPnP. Nearing the time of the 0.5.1 release, the two original developers effectively left the project, leaving Rory Mobley and Andrew "andar" Resch to continue Deluge's development. Version 0.5.4.1 saw support for both Mac OS X (via MacPorts) and Windows being introduced. Around this tim
https://en.wikipedia.org/wiki/Hacker%27s%20Delight
Hacker's Delight is a software algorithm book by Henry S. Warren, Jr., first published in 2002. It presents fast bit-level and low-level arithmetic algorithms for common tasks such as counting bits or improving the speed of division by using multiplication. Background The author, an IBM researcher working on systems ranging from the IBM 704 to the PowerPC, collected what he called "programming tricks" over the course of his career. These tricks concern efficient low-level manipulation of bit strings and numbers. According to the book's foreword by Guy L. Steele, the target audience includes compiler writers and people writing high-performance code. Summary Programming examples are written in C and assembler for a RISC architecture similar, but not identical, to the PowerPC. Algorithms are given as formulas for any number of bits, the examples usually for 32 bits. Apart from the introduction, chapters are independent of each other, each focusing on a particular subject. Many algorithms in the book depend on two's complement integer numbers. The subject matter of the second edition of the book includes Basic algorithms for manipulating individual bits, formulas for identities, inequalities, overflow detection for arithmetic operations and shifts Rounding up and down to a multiple of a known power of 2, the next power of 2 and for detecting if an operation crossed a power-of-2 boundary Checking bounds Counting total, leading and trailing zeros Searching for bit strings Permutations of bits and bytes in a word Software algorithms for multiplication Integer division Efficient integer division and calculating of the remainder when the divisor is known Integer square and cube roots Unusual number systems, including base −2 Transfer of values between floating point and integer Cyclic redundancy checks, error-correcting codes and Gray codes Hilbert curves including a discussion of applications Style The style is that of an informal mathematical te
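One representative trick of the kind the book collects is the branch-free divide-and-conquer population count (counting the set bits in a 32-bit word). The sketch below is a standard rendition of the technique, written in Python with explicit 32-bit masking, not a verbatim excerpt from the book:

```python
def popcount32(x: int) -> int:
    """Count set bits in a 32-bit word using divide-and-conquer masking."""
    x = x & 0xFFFFFFFF
    x = (x & 0x55555555) + ((x >> 1) & 0x55555555)   # sums within bit pairs
    x = (x & 0x33333333) + ((x >> 2) & 0x33333333)   # sums within nibbles
    x = (x + (x >> 4)) & 0x0F0F0F0F                  # sums within bytes
    return (x * 0x01010101 & 0xFFFFFFFF) >> 24       # add the four byte counts

print(popcount32(0xDEADBEEF))  # 24
```

The multiply-and-shift in the last line is the "sideways addition" step: multiplying by 0x01010101 accumulates all four byte sums into the top byte.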
https://en.wikipedia.org/wiki/Circaseptan
A circaseptan rhythm is a cycle consisting of approximately 7 days in which many biological processes of life, such as cellular immune system activity, resolve. See also Circadian rhythm Chronobiology
https://en.wikipedia.org/wiki/Islamic%20toilet%20etiquette
Islamic toilet etiquette is a set of personal hygiene rules in Islam that concern going to the toilet. This code of Islamic hygienical jurisprudence is called Qaḍāʾ al-Ḥāǧa. Personal hygiene is mentioned in a single verse of the Quran, in the context of ritual purification from a minor source of impurity, known as the Wuḍūʾ verse; its interpretation is contested between the different legal schools and sects of Islam. Further requirements with regard to personal hygiene are derived from ahadith, and these requirements also differ between sects. Rules A Muslim must first find an acceptable place away from standing water, people's pathways, or shade. It is advised that it is better to enter the area with the left foot, facing away from the Qibla (the direction of prayer, towards Mecca). It is reported in the hadith collection Sahih al-Bukhari that just before entering the toilet, Muhammad would recite a supplication. Following his example, Muslims are advised to say this dua before entering the toilet. While on the toilet, one must remain silent. Talking and initiating or answering greetings are strongly discouraged. When defecating together, two men cannot converse, nor look at each other's genitals. Eating any food while on the toilet is forbidden. After defecating, the anus must be washed with water using the left hand, or, if water is unavailable, with an odd number of smooth stones or pebbles called jamrah or hijaarah (Sahih al-Bukhari 161, Book 4, Hadith 27). It is now more common to wipe with tissues and water. Similarly, the penis or vulva must be washed with water using the left hand after urinating, a procedure called istinja. It is commonly done using a vessel known as an aftabeh, lota, or bodna. When leaving the toilet, one is advised to exit with the right foot and say the dua for leaving the toilet: "الحمد لله الذي أذهب عني الأذى وعافاني" (Alhamdulillahil-ladhi adhhaba 'annil-adha wa 'afani), "Praise be to Allah who relieved me of the filth and gave me relief."
https://en.wikipedia.org/wiki/Luis%20Santal%C3%B3
Luís Antoni Santaló Sors (October 9, 1911 – November 22, 2001) was a Spanish mathematician. He graduated from the University of Madrid and he studied at the University of Hamburg, where he received his Ph.D. in 1936. His advisor was Wilhelm Blaschke. Because of the Spanish Civil War, he moved to Argentina as a professor in the National University of the Littoral, National University of La Plata and University of Buenos Aires. His work with Blaschke on convex sets is now cited in its connection with Mahler volume. Blaschke and Santaló also collaborated on integral geometry. Santaló wrote textbooks in Spanish on non-Euclidean geometry, projective geometry, and tensors. Works Luis Santaló published in both English and Spanish: Introduction to Integral Geometry (1953) Chapter I. Metric integral geometry of the plane including densities and the isoperimetric inequality. Ch. II. Integral geometry on surfaces including Blaschke's formula and the isoperimetric inequality on surfaces of constant curvature. Ch. III. General integral geometry: Lie groups on the plane: central-affine, unimodular affine, projective groups. Geometrias no Euclidianas (1961) I. The Elements of Euclid II. Non-Euclidean geometries III., IV. Projective geometry and conics V,VI,VII. Hyperbolic geometry: graphic properties, angles and distances, areas and curves. (This text develops the Klein model, the earliest instance of a model.) VIII. Other models of non-Euclidean geometry Geometria proyectiva (1966) A curious feature of this book on projective geometry is the opening on abstract algebra including laws of composition, group theory, ring theory, fields, finite fields, vector spaces and linear mapping. These seven introductory sections on algebraic structures provide an enhanced vocabulary for the treatment of 15 classical topics of projective geometry. Furthermore, sections (14) projectivities with non-commutative fields, (22) quadrics over non-commutative fields, and (26) finite geometr
https://en.wikipedia.org/wiki/AmigaOS%204
AmigaOS 4 (abbreviated as OS4 or AOS4) is a line of Amiga operating systems which runs on PowerPC microprocessors. It is mainly based on the AmigaOS 3.1 source code developed by Commodore, and partially on version 3.9, developed by Haage & Partner. "The Final Update" (for OS version 4.0) was released on 24 December 2006 (originally released in April 2004) after five years of development by the Belgian company Hyperion Entertainment, under license from Amiga, Inc., for AmigaOne registered users. History During the five years of development, purchasers of AmigaOne machines could download pre-release versions of AmigaOS 4.0 from Hyperion's repository for as long as these were made available. On 20 December 2006, Amiga, Inc. terminated its contract with Hyperion Entertainment to produce or sell AmigaOS 4. Nevertheless, AmigaOS 4.0 was released commercially for Amigas with PowerUP accelerator cards in November 2007 (having been available only to developers and beta-testers until then). The Italian computer company ACube Systems has announced the Sam440ep and Sam440ep-flex motherboards, which are AmigaOS 4 compatible. Also, a third-party bootloader, known as "Moana", was released by ACube on torrent sites; it allows installation of the Sam440ep version of OS4 on Mac Mini G4s. However, this remains unofficial, unsupported, and very incomplete, especially regarding drivers. During the judicial procedure (between Hyperion and Amiga, Inc.), OS4 was still being developed and distributed. On 30 September 2009, Hyperion Entertainment and Amiga, Inc. reached a settlement agreement under which Hyperion is granted an exclusive right to use AmigaOS 3.1 in order to develop and market AmigaOS 4 and subsequent versions of AmigaOS (including AmigaOS 5, without limitation). Hyperion has assured the Amiga community that it will continue the development and distribution of AmigaOS 4.x (and beyond), as it has done since November 2001. Description AmigaOS 4 can be divided into two parts: the Workbench and the
https://en.wikipedia.org/wiki/Hospital%20emergency%20codes
Hospital emergency codes are coded messages often announced over a public address system of a hospital to alert staff to various classes of on-site emergencies. The use of codes is intended to convey essential information quickly and with minimal misunderstanding to staff while preventing stress and panic among visitors to the hospital. Such codes are sometimes posted on placards throughout the hospital or are printed on employee identification badges for ready reference. Hospital emergency codes have varied widely by location, even between hospitals in the same community. Confusion over these codes has led to the proposal for and sometimes adoption of standardized codes. In many American, Canadian, New Zealand and Australian hospitals, for example "code blue" indicates a patient has entered cardiac arrest, while "code red" indicates that a fire has broken out somewhere in the hospital facility. In order for a code call to be useful in activating the response of specific hospital personnel to a given situation, it is usually accompanied by a specific location description (e.g., "Code red, second floor, corridor three, room two-twelve"). Other codes, however, only signal hospital staff generally to prepare for the consequences of some external event such as a natural disaster. Standardised color codes Australia Australian hospitals and other buildings are covered by Australian Standard 4083 (1997) Code Black: Personal threat Code Blue: medical emergency Code Brown: external emergency (disaster, mass casualties etc.) Code Orange: evacuation Code Purple: bomb threat Code Red: fire Code Yellow: internal emergency Canada Alberta Codes in Alberta are prescribed by Alberta Health Services. Code black: bomb threat/suspicious package Code blue: cardiac arrest/medical emergency Code brown: chemical spill/hazardous material Code green: evacuation Code grey: shelter in place/air exclusion Code orange: mass casualty incident Code purple: hostage situation
https://en.wikipedia.org/wiki/Mean%20sojourn%20time
The mean sojourn time (or sometimes mean waiting time) for an object in a system is the amount of time the object is expected to spend in the system before leaving it for good. Calculation Imagine standing in line to buy a ticket at a counter. If, after one minute, you count the number of customers behind you, that count can be taken as a (rough) estimate of the number of customers entering the system (here, the waiting line) per unit time (here, per minute). If you then divide the number of customers in front of you by this "flow" of customers, you have estimated your expected waiting time, i.e. the time it will take you to reach the counter; it is, indeed, a rough estimate. To formalize this somewhat, consider the waiting line as a system S into which there is a flow of particles (customers) and where the process "buy ticket" means that a particle leaves the system. The waiting time considered above is commonly referred to as the transit time, and the theorem applied here is occasionally called Little's theorem, which can be formulated as: the expected steady-state number of particles in the system S equals the flow of particles into S times the mean transit time. Similar theorems have been discovered in other fields; in physiology it was earlier known as one of the Stewart–Hamilton equations (used, e.g., for estimating the blood volume of organs). This principle (or theorem) can be generalized. Consider a system S in the form of a closed domain of finite volume in Euclidean space, and a stream of "equivalent" particles into S (a number of particles per unit time), where each particle retains its identity while in S and eventually, after a finite time, leaves the system irreversibly (i.e. for these particles the system is "open"). The figure depicts the imagined motion history of a single such particle, which thus moves in and out of the subsystem s
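Little's theorem (N = λT: mean number in the system equals arrival rate times mean transit time) can be checked with a small simulation. The sketch below is our own illustration with arbitrary parameters: Poisson arrivals into an infinite-server system, where each customer stays an independent exponentially distributed sojourn time:

```python
import random

# Check Little's law, N = lambda * T, by simulation.
random.seed(1)
rate = 2.0           # lambda: arrivals per unit time
mean_sojourn = 3.0   # T: mean time spent in the system
horizon = 2000.0     # length of the observation window

intervals = []       # (arrival, departure) per customer
t = 0.0
while True:
    t += random.expovariate(rate)
    if t >= horizon:
        break
    intervals.append((t, t + random.expovariate(1.0 / mean_sojourn)))

# Time-average number in the system = total customer-time / window length.
total_customer_time = sum(min(d, horizon) - a for a, d in intervals)
avg_in_system = total_customer_time / horizon

print(avg_in_system)   # close to rate * mean_sojourn = 6
```

Over a long window the time-averaged occupancy comes out close to λT = 6, as the theorem predicts, regardless of the shape of the sojourn-time distribution.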
https://en.wikipedia.org/wiki/Steinmetz%20solid
In geometry, a Steinmetz solid is the solid body obtained as the intersection of two or three cylinders of equal radius at right angles. Each of the curves of the intersection of two cylinders is an ellipse. The intersection of two cylinders is called a bicylinder. Topologically, it is equivalent to a square hosohedron. The intersection of three cylinders is called a tricylinder. A bisected bicylinder is called a vault, and a cloister vault in architecture has this shape. Steinmetz solids are named after the mathematician Charles Proteus Steinmetz, who solved the problem of determining the volume of the intersection. However, the same problem had been solved earlier, by Archimedes in the ancient Greek world, Zu Chongzhi in ancient China, and Piero della Francesca in the early Italian Renaissance. They appear prominently in the sculptures of Frank Smullin. Bicylinder A bicylinder generated by two cylinders with radius r has the volume V = (16/3)r³ and the surface area S = 16r². The upper half of a bicylinder is the square case of a domical vault, a dome-shaped solid based on any convex polygon whose cross-sections are similar copies of the polygon, and analogous formulas calculating the volume and surface area of a domical vault as a rational multiple of the volume and surface area of its enclosing prism hold more generally. In China, the bicylinder is known as Mou he fang gai, literally "two square umbrella"; it was described by the third-century mathematician Liu Hui. Proof of the volume formula For deriving the volume formula it is convenient to use the common idea for calculating the volume of a sphere: collecting thin cylindric slices. In this case the thin slices are square cuboids of side length 2√(r² − x²) at height x (see diagram). This leads to V = ∫ from −r to r of (2√(r² − x²))² dx = (16/3)r³. It is well known that the relations of the volumes of a right circular cone, one half of a sphere and a right circular cylinder with the same radii and heights are 1 : 2 : 3. For one half of a bicylinder a similar statement is true: The relations of the volumes of the ins
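The slicing argument is easy to check numerically: each horizontal slice of the bicylinder is a square of side 2√(r² − x²), so summing slice volumes should reproduce 16r³/3. A small sketch (ours, not from the text):

```python
# Numerically integrate the square cross-sections of a bicylinder of
# radius r: side length 2*sqrt(r^2 - x^2), so slice area 4*(r^2 - x^2).
r = 1.0
n = 100_000
dx = 2 * r / n

volume = 0.0
for k in range(n):
    x = -r + (k + 0.5) * dx          # midpoint rule over [-r, r]
    volume += 4 * (r * r - x * x) * dx

print(volume)          # ~5.3333, i.e. 16*r**3/3
```

With 100,000 slices the midpoint rule agrees with the closed form 16r³/3 to well under one part in a million.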
https://en.wikipedia.org/wiki/Gaugino%20condensation
In quantum field theory, gaugino condensation is the nonzero vacuum expectation value (VEV), in some models, of a bilinear expression constructed, in theories with supersymmetry, from the superpartner of a gauge boson, called the gaugino. The gaugino, the bosonic gauge field and the D-term are all components of a supersymmetric vector superfield in the Wess–Zumino gauge. The condensate takes the form ⟨λ^{a,α} λ^a_α⟩ = Λ³ ≠ 0, where λ represents the gaugino field (a spinor), Λ is an energy scale, a represents a Lie algebra index and α represents a van der Waerden (two-component spinor) index. The mechanism is somewhat analogous to chiral symmetry breaking and is an example of a fermionic condensate. In the superfield notation, W_α is the gauge field strength and W_α W^α is a chiral superfield; λ^α λ_α is the lowest component of this superfield, and we see that what acquires a nonzero VEV is not the F-term of this chiral superfield. Because of this, gaugino condensation in and of itself does not lead to supersymmetry breaking; if supersymmetry breaking also occurs, it is caused by something other than the gaugino condensate. However, a gaugino condensate definitely breaks U(1)_R symmetry, as λλ has an R-charge of 2. See also Tachyon condensation
https://en.wikipedia.org/wiki/Logarithmic%20conformal%20field%20theory
In theoretical physics, a logarithmic conformal field theory is a conformal field theory in which the correlators of the basic fields are allowed to be logarithmic at short distance, instead of being powers of the fields' distance. Equivalently, the dilation operator is not diagonalizable. Examples of logarithmic conformal field theories include critical percolation. In two dimensions Just like conformal field theory in general, logarithmic conformal field theory has been particularly well studied in two dimensions. Some two-dimensional logarithmic CFTs have been solved: The Gaberdiel–Kausch CFT at central charge c = −2, which is rational with respect to its extended symmetry algebra, namely the triplet algebra. The GL(1|1) Wess–Zumino–Witten model, based on the simplest non-trivial supergroup. The triplet model at c = 0, which is also rational with respect to the triplet algebra.
https://en.wikipedia.org/wiki/FMRIB%20Software%20Library
The FMRIB Software Library, abbreviated FSL, is a software library containing image analysis and statistical tools for functional, structural and diffusion MRI brain imaging data. FSL is available as both precompiled binaries and source code for Apple and PC (Linux) computers. It is freely available for non-commercial use. History and development FSL is written mainly by members of the FMRIB (Functional Magnetic Resonance Imaging of the Brain) Analysis Group, Oxford University, UK. The first release of FSL was in 2000; there has been approximately one major new release each year to date. The FMRIB Analysis Group is primarily funded by the Wellcome Trust and the UK EPSRC and MRC Research Councils. See also AFNI FreeSurfer SPM Neuroimaging External links FSL website FMRIB Analysis Group
https://en.wikipedia.org/wiki/List%20of%20unrefined%20sweeteners
This list of unrefined sweeteners includes all natural, unrefined, or low-processed sweeteners. Sweeteners are usually made from the fruit or sap of plants, but can also be made from any other part of the plant, or all of it. Some sweeteners are made from starch, with the use of enzymes. Sweeteners made by animals, especially insects, are put in their own section, as they can come from more than one part of plants. From sap The sap of some species is concentrated to make sweeteners, usually through drying or boiling. Cane juice, syrup, molasses, and raw sugar, which has many regional and commercial names including demerara, jaggery, muscovado, panela, piloncillo, turbinado sugar, and Sucanat, are all made from sugarcane (Saccharum spp.). Sweet sorghum syrup is made from the sugary juice extracted from the stalks of Sorghum spp., especially S. bicolor. Mexican or maize sugar can be made by boiling down the juice of green maize stalks. Agave nectar is made from the sap of Agave spp., including the tequila agave (Agave tequilana). Birch syrup is made from the sap of birch trees (Betula spp.). Maple syrup, taffy and sugar are made from the sap of tapped maple trees (Acer spp.). Palm sugar is made by tapping the flower stalk of various palm trees to collect the sap. The most important species for this is the Indian date palm (Phoenix sylvestris), but other species used include the palmyra (Borassus flabelliformis), coconut (Cocos nucifera), toddy (Caryota urens), gomuti (Arenga saccharifera), and nipa (Nypa fruticans) palms. The sweet resin of the sugar pine (Pinus lambertiana) was considered by John Muir to be better than maple sugar. Manna, a sugary extract from the manna ash, contains the sugar mannose and the sugar alcohol mannitol. From roots The juice extracted from the tuberous roots of certain plants is, much like sap, concentrated to make sweeteners, usually through drying or boiling. Sugar beet syrup (Zuckerrübensirup in German) is made from the tuberous roots of the sug
https://en.wikipedia.org/wiki/The%20Discoverers
The Discoverers is a non-fiction historical work by Daniel Boorstin, published in 1983, and is the first in the Knowledge Trilogy, which also includes The Creators and The Seekers. The book, subtitled A History of Man's Search to Know His World and Himself, is a history of human discovery. Discovery in many forms is described: exploration, science, medicine, mathematics, and more-theoretical ones, such as time, evolution, plate tectonics, and relativity. Boorstin praises the inventive, human mind and its eternal quest to discover the universe and humanity's place in it. In "A Personal Note to the Reader", Boorstin writes "My hero is Man, the Discoverer. The world we now view from the literate West ... had to be opened by countless Columbuses. In the deep recesses of the past, they remain anonymous." The structure of the book is topical and chronological, beginning in the prehistoric era in Babylon and Egypt. Themes The Discoverers (as well as The Creators and The Seekers) resonates with tales of individuals, their lives, beliefs and accomplishments. They form the building blocks of his tale and from them flow descriptions and commentary on historical events. In this respect he is like other historians (David McCullough, Paul Johnson, Louis Hartz and Richard Hofstadter, to name a few) who give prominence to the individual and the incremental approach to history. Thus, in the chapter "In Search of the Missing Link", he features Edward Tyson and his contributions in comparative anatomy. Tycho Brahe, the Danish astronomer, is the guiding light in "The Witness of the Naked Eye" and Isaac Newton merits an entire chapter ("God said, Let Newton Be!") devoted to his life and accomplishments. The role of religion and culture is another recurring theme. Boorstin, a reform Jew, has been described as a "secular, skeptical moderate Northeastern liberal of the New Deal rather than the New Left school." The purpose of religion (and God) was not personal salvation but establish
https://en.wikipedia.org/wiki/Mizrah
Mizrah (mīzrāḥ) is the Hebrew word for "east" and the direction that Jews in the Diaspora west of Israel face during prayer. Practically speaking, Jews face the city of Jerusalem when praying, and those north, east, or south of Jerusalem face south, west, and north respectively. In European and Mediterranean communities west of the Holy Land, the word "mizrach" also refers to the wall of the synagogue that faces east, where seats are reserved for the rabbi and other dignitaries. In addition, "mizrach" refers to an ornamental wall plaque used to indicate the direction of prayer in Jewish homes. Jewish law The Talmud states that a Jew praying in the Diaspora shall direct himself toward the Land of Israel; in Israel, toward Jerusalem; in Jerusalem, toward the Temple; and in the Temple, toward the Holy of Holies. The same rule is found in the Mishnah; however, there it is prescribed for individual prayers only, rather than for congregational prayers at a synagogue. Thus, if a man is east of the Temple, he should turn westward; if in the west, eastward; in the south, northward; and if in the north, southward. The custom is based on the prayer of Solomon (I Kings; II Chron.). Another passage supporting this rule is found in the Book of Daniel, which relates that in the upper chamber of the house where Daniel prayed three times a day, the windows were opened toward Jerusalem (Dan.). The Tosefta demands that the entrance to the synagogue should be on the eastern side, with the congregation facing west. The requirement is probably based on the orientation of the tent of meeting, which had its gates on the eastern side, or of Solomon's Temple, the portals of which were to the east (Ezek.). Maimonides attempted to reconcile the Tosefta's provision with the requirement to pray toward Jerusalem by stating that the doors of the synagogue should face east, while the Ark should be placed "in the direction in which people pray in that city", i.e., toward Jerusalem. The Shul
https://en.wikipedia.org/wiki/PipeRench
The PipeRench Reconfigurable Computing Project is a project from the Carnegie Mellon University intended to improve reconfigurable computing systems. It aims to allow hardware virtualization through high-speed reconfiguration, in order to minimize resource constraints in FPGAs and similar systems. The project has already succeeded in manufacturing a chip and testing it. PipeRench has been licensed by a start-up—Rapport and is the basis of their Kilocore chip. External links PipeRench official site Reconfigurable computing
https://en.wikipedia.org/wiki/Band%20III
Band III is the name of the range of radio frequencies within the very high frequency (VHF) part of the electromagnetic spectrum from 174 to 240 megahertz (MHz). It is primarily used for radio and television broadcasting. It is also called high-band VHF, in contrast to Bands I and II. Broadcast Television North America The band is subdivided into seven channels for television broadcasting, each occupying 6 MHz. Europe European Band III allocations vary from country to country, with channel widths of 7 or 8 MHz. The standard channel allocations for European countries that use System B with 7 MHz channel spacing are as follows: The Irish (8 MHz) system is shown below. Oceania Australia has allocated 8 channels in Band III for digital television, each with 7 MHz bandwidth. Russia and other former members of OIRT Russian analog television is transmitted using System D with 8 MHz channel bandwidth. Radio The band came into use for radio broadcasting at the turn of the 21st century and is used for Digital Audio Broadcasting. It is subdivided into a number of frequency blocks: Worldwide usage Europe In the UK and part of Ireland, Band III was originally used for monochrome 405-line television; however, this was discontinued by the mid-1980s. Other European countries (including Ireland) continued to use Band III for analogue 625-line colour television. Digital television in the DVB-T standard can be used in conjunction with VHF Band III and is used as such in some places. The use of sub-bands 2 and 3 for Digital Audio Broadcasting is now widely adopted. Sub-band 1 is used for MPT-1327 trunked PMR radio, remote wireless microphones and PMSE links. North America In North America, use of the band for television broadcasts is still widespread. Favorable propagation characteristics and reasonable power limits (up to 65 kW for full-power digital television, versus 20 kW or less on VHF Band I) have meant that many US broadcasters elected to
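The North American channelization is simple arithmetic: seven channels of 6 MHz exactly fill the 174–216 MHz portion of the band. The sketch below labels them with the standard US channel numbers 7 through 13, which is common background knowledge rather than something stated in the text:

```python
# North American VHF Band III TV channels: 7 channels x 6 MHz starting
# at 174 MHz (channels 7 through 13 in US numbering).
band_start_mhz = 174
channel_width_mhz = 6

channels = {
    number: (band_start_mhz + i * channel_width_mhz,
             band_start_mhz + (i + 1) * channel_width_mhz)
    for i, number in enumerate(range(7, 14))
}

for number, (lo, hi) in channels.items():
    print(f"channel {number}: {lo}-{hi} MHz")
# channel 7: 174-180 MHz ... channel 13: 210-216 MHz
```

Note that 7 × 6 MHz = 42 MHz ends at 216 MHz, leaving the remainder of the 174–240 MHz band for other services.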
https://en.wikipedia.org/wiki/Band%20II
Band II is the range of radio frequencies within the very high frequency (VHF) part of the electromagnetic spectrum from 87.5 to 108.0 megahertz (MHz).

Radio
Band II is primarily used worldwide for FM radio broadcasting.

Broadcast television

Usage in Russia and in other former members of OIRT
In the former Soviet Union and other OIRT member countries, frequencies from 76 MHz to 100 MHz were designated for broadcast television. Given the 8 MHz channel bandwidth used by the Russian analog television system (System D), the following television channels had been defined: Broadcast television channels 1 and 2 are assigned to VHF Band I, and channels 6 to 12 are assigned to VHF Band III. Starting in the early 1990s, frequencies previously allotted to television channels 4 and 5 were re-allocated to FM radio, harmonizing the band with the Western allocation for FM radio service.
https://en.wikipedia.org/wiki/Interval%20scheduling
Interval scheduling is a class of problems in computer science, particularly in the area of algorithm design. The problems consider a set of tasks. Each task is represented by an interval describing the time in which it needs to be processed by some machine (or, equivalently, scheduled on some resource). For instance, task A might run from 2:00 to 5:00, task B might run from 4:00 to 10:00, and task C might run from 9:00 to 11:00. A subset of intervals is compatible if no two intervals overlap on the machine/resource. For example, the subset {A,C} is compatible, as is the subset {B}; but neither {A,B} nor {B,C} is a compatible subset, because the corresponding intervals within each subset overlap. The interval scheduling maximization problem (ISMP) is to find a largest compatible set, i.e., a set of non-overlapping intervals of maximum size. The goal here is to execute as many tasks as possible, that is, to maximize the throughput. It is equivalent to finding a maximum independent set in an interval graph. A generalization of the problem considers multiple machines/resources. Here the goal is to find compatible subsets whose union is the largest.

In an upgraded version of the problem, the intervals are partitioned into groups. A subset of intervals is compatible if no two intervals overlap and, moreover, no two intervals belong to the same group (i.e., the subset contains at most a single representative of each group). Each group of intervals corresponds to a single task, and represents several alternative intervals in which it can be executed. The group interval scheduling decision problem (GISDP) is to decide whether there exists a compatible set in which all groups are represented. The goal here is to execute a single representative task from each group. GISDPk is a restricted version of GISDP in which the number of intervals in each group is at most k. The group interval scheduling maximization problem (GISMP) is to find a largest compatible set - a set of non-overlapping
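The ISMP described above is solved optimally by the classic greedy rule: repeatedly take the compatible interval with the earliest finishing time. A minimal Python sketch (the function name and the start/finish tuple representation are my own):

```python
def max_compatible(intervals):
    """Greedy earliest-finish-time selection, optimal for the
    interval scheduling maximization problem (ISMP)."""
    chosen = []
    last_finish = float("-inf")
    # Consider intervals in order of finishing time.
    for start, finish in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_finish:  # compatible with everything chosen so far
            chosen.append((start, finish))
            last_finish = finish
    return chosen

# The tasks from the text: A = 2:00-5:00, B = 4:00-10:00, C = 9:00-11:00.
tasks = {"A": (2, 5), "B": (4, 10), "C": (9, 11)}
print(max_compatible(tasks.values()))  # -> [(2, 5), (9, 11)], i.e. {A, C}
```

Sorting dominates the running time, so the algorithm runs in O(n log n).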
https://en.wikipedia.org/wiki/Band%20I
Band I is a range of radio frequencies within the very high frequency (VHF) part of the electromagnetic spectrum. It was first defined, "for simplicity", in Annex 1 of the Final Acts of the European Broadcasting Conference in the VHF and UHF bands (Stockholm, 1961). Band I ranges from 47 to 68 MHz for the European Broadcasting Area, and from 54 to 88 MHz for the Americas, and it is primarily used for television broadcasting in compliance with the ITU Radio Regulations (article 1.38). With the transition to digital TV, most Band I transmitters have already been switched off.

Television broadcasting usage
Channel spacings vary from country to country, with spacings of 6, 7 and 8 MHz being common. In the UK, Band I was originally used by the BBC for monochrome 405-line television; likewise, the former French 455-line (1937-1939) and then 441-line (1943-1956) transmitter on the Eiffel Tower in Paris, and some stations of the French monochrome 819-line system, used Band I. Both the 405-line and 819-line systems were discontinued in the mid-1980s. Other European countries used Band I for 625-line analogue television, first in monochrome and later in colour. This was gradually phased out with the introduction of digital television in the DVB-T standard, which is not defined for VHF Band I, though some older receivers and some modulators do support it. In the United States, the band was used for analog NTSC (ended June 12, 2009) and is used for digital ATSC (current). Digital television has problems with impulse-noise interference, particularly in this band.

Europe
In European countries that used System B for television broadcasting, the band was subdivided into three channels, each 7 MHz wide. Italy also used an out-of-band "channel C" (video: 82.25 MHz; audio: 87.75 MHz). It was used by the first transmitter brought into service by the RAI in Torino in the Fifties, which had previously been used in WW2 by the US to broadcast NTSC TV on channel A6 for military purposes, later
https://en.wikipedia.org/wiki/Myogenic%20mechanism
The myogenic mechanism is how arteries and arterioles react to an increase or decrease of blood pressure to keep the blood flow constant within the blood vessel. Myogenic response refers to a contraction initiated by the myocyte itself instead of an outside occurrence or stimulus such as nerve innervation. Most often observed in (although not necessarily restricted to) smaller resistance arteries, this 'basal' myogenic tone may be useful in the regulation of organ blood flow and peripheral resistance, as it positions a vessel in a preconstricted state that allows other factors to induce additional constriction or dilation to increase or decrease blood flow. The smooth muscle of the blood vessels reacts to the stretching of the muscle by opening ion channels, which cause the muscle to depolarize, leading to muscle contraction. This significantly reduces the volume of blood able to pass through the lumen, which reduces blood flow through the blood vessel. Alternatively when the smooth muscle in the blood vessel relaxes, the ion channels close, resulting in vasodilation of the blood vessel; this increases the rate of flow through the lumen. This system is especially significant in the kidneys, where the glomerular filtration rate (the rate of blood filtration by the nephron) is particularly sensitive to changes in blood pressure. However, with the aid of the myogenic mechanism, the glomerular filtration rate remains very insensitive to changes in human blood pressure. Myogenic mechanisms in the kidney are part of the autoregulation mechanism which maintains a constant renal blood flow at varying arterial pressure. Concomitant autoregulation of glomerular pressure and filtration indicates regulation of preglomerular resistance. Model and experimental studies were performed to evaluate two mechanisms in the kidney, myogenic response and tubuloglomerular feedback. A mathematical model showed good autoregulation through a myogenic response, aimed at maintaining a consta
https://en.wikipedia.org/wiki/Oort%20constants
The Oort constants (discovered by Jan Oort), A and B, are empirically derived parameters that characterize the local rotational properties of our galaxy, the Milky Way, in the following manner:

A = (1/2)(V0/R0 − (dv/dr)|R0),  B = −(1/2)(V0/R0 + (dv/dr)|R0),

where V0 and R0 are the rotational velocity and distance to the Galactic Center, respectively, measured at the position of the Sun, and v and r are the velocities and distances at other positions in our part of the galaxy. As derived below, A and B depend only on the motions and positions of stars in the solar neighborhood. As of 2018, the most accurate values of these constants are A = 15.3 ± 0.4 km s−1 kpc−1 and B = −11.9 ± 0.4 km s−1 kpc−1. From the Oort constants, it is possible to determine the orbital properties of the Sun, such as the orbital velocity and period, and to infer local properties of the Galactic disk, such as the mass density and how the rotational velocity changes as a function of radius from the Galactic Center.

Historical significance and background
By the 1920s, a large fraction of the astronomical community had recognized that some of the diffuse, cloud-like objects, or nebulae, seen in the night sky were collections of stars located beyond our own, local collection of star clusters. These galaxies had diverse morphologies, ranging from ellipsoids to disks. The concentrated band of starlight that is the visible signature of the Milky Way was indicative of a disk structure for our galaxy; however, our location within our galaxy made structural determinations from observations difficult. Classical mechanics predicted that a collection of stars could be supported against gravitational collapse either by the random velocities of the stars or by rotation about its center of mass. For a disk-shaped collection, the support should be mainly rotational. Depending on the mass density, or distribution of the mass in the disk, the rotation velocity may be different at each radius from the center of the disk to the outer edge. A plot of these rotational velocities against the radii at which they are measured is called a rotation curve.
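The difference of the Oort constants gives the local angular velocity of Galactic rotation, Ω0 = A − B, from which the Sun's orbital speed and period follow once a Galactocentric distance is assumed. A small sketch (both the constants and R0 below are assumed approximate inputs, not definitive values):

```python
import math

A = 15.3    # Oort constant A, km/s/kpc (approximate)
B = -11.9   # Oort constant B, km/s/kpc (approximate)
R0 = 8.1    # kpc, assumed distance from the Sun to the Galactic Center
KM_PER_KPC = 3.0857e16

omega0 = A - B                 # local angular velocity, km/s per kpc
v0 = omega0 * R0               # solar rotational velocity, km/s
period_s = 2 * math.pi * R0 * KM_PER_KPC / v0
period_yr = period_s / (3600 * 24 * 365.25)

print(f"V0 ~ {v0:.0f} km/s, orbital period ~ {period_yr / 1e6:.0f} Myr")
```

With these inputs the sketch gives a solar orbital velocity of roughly 220 km/s and a period of roughly 225 million years, consistent with commonly quoted figures.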
https://en.wikipedia.org/wiki/Orbit%20portrait
In mathematics, an orbit portrait is a combinatorial tool used in complex dynamics for understanding the behavior of one-complex-dimensional quadratic maps. In simple terms, it is:
a list of external angles for which rays land on the points of a given orbit
a graph showing the above list

Definition
Given a quadratic map f_c(z) = z^2 + c from the complex plane to itself and a repelling or parabolic periodic orbit O = {z_1, ..., z_n} of f_c, so that f_c(z_j) = z_{j+1} (where subscripts are taken modulo n), let A_j be the set of angles whose corresponding external rays land at z_j. Then the set P = {A_1, ..., A_n} is called the orbit portrait of the periodic orbit O. All of the sets A_j must have the same number of elements, which is called the valence of the portrait.

Examples
Parabolic or repelling orbit portraits

Valence 2

Valence 3
The valence is 3, so 3 rays land on each orbit point. For the complex quadratic polynomial with c = -0.03111 + 0.79111i, the portrait of the parabolic period-3 orbit is: Rays for the above angles land on the points of that orbit. The parameter c is a center of a period-9 hyperbolic component of the Mandelbrot set. For the parabolic Julia set with c = -1.125 + 0.21650635094611i, which is a root point between the period-2 and period-6 components of the Mandelbrot set, the orbit portrait of the period-2 orbit with valence 3 is:

Valence 4

Formal orbit portraits
Every orbit portrait has the following properties:
Each A_j is a finite subset of the circle R/Z.
The doubling map on the circle gives a bijection from A_j to A_{j+1} and preserves the cyclic order of the angles.
All of the angles in all of the sets A_1, ..., A_n are periodic under the doubling map of the circle, and all of the angles have the same exact period. This period must be a multiple of n, so the period is of the form rn, where r is called the recurrent ray period.
The sets A_1, ..., A_n are pairwise unlinked, which is to say that given any pair of them, there are two disjoint intervals of the circle where each interval contains one of the sets.
Any collection of subsets of the circle which satisfies these four properties is called a formal orbit portrait.
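The doubling-map properties listed above can be checked directly with exact rational arithmetic. The set {1/7, 2/7, 4/7} is a standard example of a set of angles permuted cyclically by doubling, each with exact period 3 (a sketch for illustration, not tied to a specific portrait in the text):

```python
from fractions import Fraction

def double(angle):
    """The angle-doubling map t -> 2t (mod 1) on the circle R/Z."""
    return (2 * angle) % 1

def orbit_period(angle, max_steps=1000):
    """Exact period of an angle under doubling (None if none found)."""
    t = double(angle)
    for n in range(1, max_steps + 1):
        if t == angle:
            return n
        t = double(t)
    return None

# Doubling permutes the set {1/7, 2/7, 4/7}, and each angle has period 3.
angles = {Fraction(1, 7), Fraction(2, 7), Fraction(4, 7)}
print({double(t) for t in angles} == angles)           # True
print([orbit_period(t) for t in sorted(angles)])       # [3, 3, 3]
```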
https://en.wikipedia.org/wiki/Keycap
A keycap is a small cover of plastic, metal, or other material placed over the keyswitch of a computer keyboard. Keycaps are often illustrated to indicate the key function or alphanumeric character they correspond to. Early keyboards were manufactured with the keyswitch and keycap integrated in one unit; keycaps separate from the switch were introduced to facilitate the production of different keyboard layouts.

History
Typical keycaps in the 1970s and 1980s were produced using two-shot molding, with the markings molded into each keycap in a different color of plastic. This eventually fell out of favor, as it was more expensive (particularly in tooling costs) and tended to produce keycaps more durable than the equipment on which they were mounted. Modern keycaps are usually labelled by stamping or laser engraving. However, two-shot molding ("doubleshot") keycaps are still available today, known for their feel and general durability.

Modern keycaps
Keycaps can be bought in replacement sets for a keyboard. Notably, replacement sets are frequently sold for keyboards that use Cherry MX-style stems. Custom sets are bought and sold within the enthusiast communities, and artisan keycaps can be purchased individually. Some artisan keycaps are cast into unique shapes such as LEGO. Keycaps are sold in printed and unprinted varieties. The unprinted variety, known as "blank keycaps", is said to promote touch typing and help build muscle memory because the user is forced to rely on motion rather than visuals. Many designs are available, including anime-themed, bi-color, and game-based designs, as well as fully custom keycaps. However, within the modern mechanical keyboard community, unprinted caps are typically chosen for their visual appeal. The most common plastics used are ABS, PBT and POM (see the materials section). The top of most keycaps may be described as cylinder-shaped (curving to the sides as if a fat cylinder was resting on it), flat or spherical
https://en.wikipedia.org/wiki/Version%20space%20learning
Version space learning is a logical approach to machine learning, specifically binary classification. Version space learning algorithms search a predefined space of hypotheses, viewed as a set of logical sentences. Formally, the hypothesis space is a disjunction (i.e., either hypothesis 1 is true, or hypothesis 2, or any subset of the hypotheses 1 through n). A version space learning algorithm is presented with examples, which it will use to restrict its hypothesis space; for each example x, the hypotheses that are inconsistent with x are removed from the space. This iterative refining of the hypothesis space is called the candidate elimination algorithm; the hypothesis space maintained inside the algorithm is its version space.

The version space algorithm
In settings where there is a generality-ordering on hypotheses, it is possible to represent the version space by two sets of hypotheses: (1) the most specific consistent hypotheses, and (2) the most general consistent hypotheses, where "consistent" indicates agreement with observed data. The most specific hypotheses (i.e., the specific boundary SB) cover the observed positive training examples, and as little of the remaining feature space as possible. These hypotheses, if reduced any further, exclude a positive training example, and hence become inconsistent. These minimal hypotheses essentially constitute a (pessimistic) claim that the true concept is defined just by the positive data already observed: thus, if a novel (never-before-seen) data point is observed, it should be assumed to be negative. (I.e., if data has not previously been ruled in, then it's ruled out.) The most general hypotheses (i.e., the general boundary GB) cover the observed positive training examples, but also cover as much of the remaining feature space as possible without including any negative training examples. These, if enlarged any further, include a negative training example, and hence become inconsistent. These maximal hypotheses essentially
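The candidate elimination idea above can be sketched by brute force over a tiny explicit hypothesis space (the attribute names and values here are invented for illustration; practical implementations keep only the S and G boundary sets rather than enumerating every hypothesis):

```python
from itertools import product

# Hypotheses over two attributes; '?' is a wildcard matching any value.
SKY = ["sunny", "rainy", "?"]
TEMP = ["warm", "cold", "?"]

def matches(h, x):
    """A hypothesis classifies x as positive iff every slot matches."""
    return all(hv in ("?", xv) for hv, xv in zip(h, x))

def eliminate(version_space, example, label):
    """One candidate-elimination step: drop hypotheses that disagree."""
    return {h for h in version_space if matches(h, example) == label}

vs = set(product(SKY, TEMP))                  # the full hypothesis space
vs = eliminate(vs, ("sunny", "warm"), True)   # a positive example
vs = eliminate(vs, ("rainy", "cold"), False)  # a negative example
print(sorted(vs))  # [('?', 'warm'), ('sunny', '?'), ('sunny', 'warm')]
```

After these two examples, the surviving version space is bounded below by the most specific hypothesis ('sunny', 'warm') and above by the most general consistent ones, ('sunny', '?') and ('?', 'warm').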
https://en.wikipedia.org/wiki/FreeBSD
FreeBSD is a free and open-source Unix-like operating system descended from the Berkeley Software Distribution (BSD). The first version of FreeBSD was released in 1993. In 2005, FreeBSD was the most popular open-source BSD operating system, accounting for more than three-quarters of all installed and permissively licensed BSD systems. FreeBSD has similarities with Linux, with two major differences in scope and licensing: FreeBSD maintains a complete system, i.e. the project delivers a kernel, device drivers, userland utilities, and documentation, as opposed to Linux only delivering a kernel and drivers, and relying on third-parties for system software; FreeBSD source code is generally released under a permissive BSD license, as opposed to the copyleft GPL used by Linux. The FreeBSD project includes a security team overseeing all software shipped in the base distribution. A wide range of additional third-party applications may be installed from binary packages using the pkg package management system or from source via FreeBSD Ports, or by manually compiling source code. Much of FreeBSD's codebase has become an integral part of other operating systems such as Darwin (the basis for macOS, iOS, iPadOS, watchOS, and tvOS), TrueNAS (an open-source NAS/SAN operating system), and the system software for the PlayStation 3 and PlayStation 4 game consoles. The other BSD systems (OpenBSD, NetBSD, and DragonFly BSD) also contain a large amount of FreeBSD code, and vice-versa. History Background In 1974, Professor Bob Fabry of the University of California, Berkeley, acquired a Unix source license from AT&T. Supported by funding from DARPA, the Computer Systems Research Group started to modify and improve AT&T Research Unix. They called this modified version "Berkeley Unix" or "Berkeley Software Distribution" (BSD), implementing features such as TCP/IP, virtual memory, and the Berkeley Fast File System. The BSD project was founded in 1976 by Bill Joy. But since BSD contained
https://en.wikipedia.org/wiki/Beckman%E2%80%93Quarles%20theorem
In geometry, the Beckman–Quarles theorem states that if a transformation of the Euclidean plane or a higher-dimensional Euclidean space preserves unit distances, then it preserves all Euclidean distances. Equivalently, every homomorphism from the unit distance graph of the plane to itself must be an isometry of the plane. The theorem is named after Frank S. Beckman and Donald A. Quarles Jr., who published this result in 1953; it was later rediscovered by other authors and re-proved in multiple ways. Analogous theorems for rational subsets of Euclidean spaces, or for non-Euclidean geometry, are also known.

Statement and proof idea
Formally, the result is as follows. Let f be a function or multivalued function from a d-dimensional Euclidean space to itself, and suppose that, for every pair of points p and q that are at unit distance from each other, every pair of images f(p) and f(q) are also at unit distance from each other. Then f must be an isometry: it is a one-to-one function that preserves distances between all pairs of points. One way of rephrasing the Beckman–Quarles theorem involves graph homomorphisms, mappings between undirected graphs that take vertices to vertices and edges to edges. For the unit distance graph whose vertices are all of the points in the plane, with an edge between any two points at unit distance, a homomorphism from this graph to itself is the same thing as a unit-distance-preserving transformation of the plane. Thus, the Beckman–Quarles theorem states that the only homomorphisms from this graph to itself are the obvious ones coming from isometries of the plane. For this graph, all homomorphisms are symmetries of the graph, the defining property of a class of graphs called cores. As well as the original proofs of Beckman and Quarles, and the proofs in later papers rediscovering the result, several alternative proofs have been published. If D is the set of distances preserved by a mapping f, then it follows from the triangle inequality that certain comparisons of other distances
https://en.wikipedia.org/wiki/Lernmatrix
The Lernmatrix is an associative-memory-like architecture of an artificial neural network, invented around 1960 by Karl Steinbuch.

External links
A new theoretical framework for the Steinbuch's Lernmatrix
Pattern recognition and classification using weightless neural networks (WNN) and Steinbuch Lernmatrix
DARPA project will study neural network processes
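A minimal software sketch of the Lernmatrix idea, as a binary hetero-associative memory with OR-style Hebbian learning and winner-take-all recall (an illustrative reconstruction, not Steinbuch's exact circuit):

```python
class Lernmatrix:
    """Binary associative memory: a 0/1 weight matrix linking inputs to outputs."""

    def __init__(self, n_in, n_out):
        self.w = [[0] * n_in for _ in range(n_out)]

    def store(self, x, y):
        # Hebbian OR update: connect every active input to every active output.
        for i, yi in enumerate(y):
            if yi:
                for j, xj in enumerate(x):
                    if xj:
                        self.w[i][j] = 1

    def recall(self, x):
        # Winner-take-all: output units with maximal input overlap fire.
        sums = [sum(wij * xj for wij, xj in zip(row, x)) for row in self.w]
        top = max(sums)
        return [1 if (s == top and top > 0) else 0 for s in sums]

m = Lernmatrix(4, 3)
m.store([1, 0, 1, 0], [1, 0, 0])
m.store([0, 1, 0, 1], [0, 0, 1])
print(m.recall([1, 0, 1, 0]))  # -> [1, 0, 0]
```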
https://en.wikipedia.org/wiki/Tortuosity
Tortuosity is widely used as a critical parameter to predict transport properties of porous media, such as rocks and soils. But unlike other standard microstructural properties, the concept of tortuosity is vague, with multiple definitions and various evaluation methods introduced in different contexts. Hydraulic, electrical, diffusional, and thermal tortuosities are defined to describe different transport processes in porous media, while geometrical tortuosity is introduced to characterize the morphological properties of porous microstructures.

Tortuosity in 2-D
Subjective estimation (sometimes aided by optometric grading scales) is often used. The simplest mathematical method to estimate tortuosity is the arc-chord ratio: the ratio of the length of the curve (C) to the distance between its ends (L), i.e., C / L. The arc-chord ratio equals 1 for a straight line and is infinite for a circle. Another method, proposed in 1999, is to estimate the tortuosity as the integral of the square (or modulus) of the curvature. Dividing the result by the length of the curve or chord has also been tried. In 2002 several Italian scientists proposed one more method. First, the curve is divided into several (N) parts with constant sign of curvature (using hysteresis to decrease sensitivity to noise). Then the arc-chord ratio for each part is found and the tortuosity is estimated by combining the arc-chord ratios of the N parts. In this case the tortuosity of both a straight line and a circle is estimated to be 0. In 1993 Swiss mathematician Martin Mächler proposed an analogy: it is relatively easy to drive a bicycle or a car along a trajectory with constant curvature (an arc of a circle), but it is much harder to drive where the curvature changes. This would imply that roughness (or tortuosity) could be measured by the relative change of curvature. In this case the proposed "local" measure was the derivative of the logarithm of the curvature. However, in this case the tortuosity of a straight line is left undefined. In 2005 it was proposed to measure tortuosity by an integral of
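The arc-chord ratio described above is straightforward to compute for a sampled (polyline) curve; a short sketch:

```python
import math

def arc_chord_tortuosity(points):
    """Tortuosity of a polyline as the arc-chord ratio C / L: the curve
    length divided by the straight-line distance between its endpoints."""
    arc = sum(math.dist(p, q) for p, q in zip(points, points[1:]))
    chord = math.dist(points[0], points[-1])
    return arc / chord

line = [(0, 0), (1, 0), (2, 0)]   # straight line: tortuosity exactly 1
bend = [(0, 0), (1, 0), (1, 1)]   # right-angle bend: 2 / sqrt(2) ~ 1.414
print(arc_chord_tortuosity(line))
print(arc_chord_tortuosity(bend))
```

As the chord length shrinks toward zero for a fixed arc length (e.g. a nearly closed loop), the ratio diverges, matching the "infinite for a circle" remark above.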
https://en.wikipedia.org/wiki/Coastal%20strand
Coastal strand is a plant community of flowering plants that form along the shore in loose sand just above the high tide line. Many plants that grow in this area are endemic to the strand. The community has low species diversity because so few plants can tolerate the harsh conditions of high winds, battering salt spray, and extreme high temperatures in the summer. Plants must also be adapted to sandy saline soils, with extremely low nutrient loads and low water-holding capacity. Plants that grow along the coast are very tolerant of the winds and of salt- and sand-loaded ocean spray. Many species are succulent, storing salty water in their leaves. The leaves are often light colored or grey-green to reflect sunlight and reduce desiccation. Hairy leaves may reduce evapotranspiration, may help gather moisture from the air, and may reflect a small portion of incoming solar radiation, thereby reducing the plant's internal temperature. They are often very low in height with prostrate stems, spread by rooting at the nodes, and may have deep tap roots; both rooting systems help to anchor the shifting sands as the plants colonize the beach above the high tide line.

Pacific coastal strand plants

Asteraceae (sunflower family):
Ambrosia chamissonis, greene beach-bur
Solidago spathulata subsp. spathulata, coast goldenrod
Tanacetum camphoratum, dune tansy
Gnaphalium bicolor (everlasting)
Erigeron glaucus, seaside daisy

Brassicaceae (mustard family):
Erysimum insulare

Caryophyllaceae (pink family):
Cardionema ramosissimum

Crassulaceae (stonecrop family):
Dudleya farinosa, bluff lettuce

Fabaceae (legume family):
Lupinus arboreus, yellow bush lupine
Lupinus chamissonis, Chamisso bush lupine

Lamiaceae (mint family):
Monardella crispa, crisp monardella

Nyctaginaceae (four o'clock family):
Abronia latifolia (sand verbena)
Abronia maritima (sand verbena)
Abronia umbellata (sand verbena)

Onagraceae (evening primrose family):
Camissonia cheiranthifolia subsp. suffruticosa, beach evening primrose
https://en.wikipedia.org/wiki/Occupancy%20sensor
An occupancy sensor is an indoor device used to detect the presence of a person. Applications include automatic adjustment of lights, temperature, or ventilation systems in response to the quantity of people present. The sensors typically use infrared, ultrasonic, microwave, or other technology. The term encompasses devices as different as PIR sensors, hotel room keycard locks, and smart meters. Occupancy sensors are typically used to save energy, provide automatic control, and comply with building codes.

Vacancy sensor
A vacancy sensor works like an occupancy sensor; however, lights must be manually turned on, and will automatically turn off when motion is no longer detected.

Sensor types
Occupancy sensor types include:
PIR sensors, which work on heat-difference detection, measuring infrared radiation. Inside the device is a pyroelectric sensor which can detect the sudden presence of objects (such as humans) that radiate a temperature different from the temperature of the background, such as the room temperature of a wall.
Environmental sensors, such as temperature, humidity, and CO2 sensors, which detect the change in the environment due to the presence of a human.
Ultrasonic sensors, similar to radar, which work on the Doppler-shift principle. An ultrasonic sensor sends high-frequency sound waves into an area and checks for their reflected patterns. If the reflected pattern is changing continuously, the sensor assumes that there is occupancy and the connected lighting load is turned on. If the reflected pattern is the same for a preset time, the sensor assumes there is no occupancy and the load is switched off.
Microwave sensors. Similar to the ultrasonic sensor, a microwave sensor also works on the Doppler-shift principle. A microwave sensor sends high-frequency microwaves into an area and checks for their reflected patterns. If the reflected pattern is changing continuously, it assumes that there is occupancy and the connected lighting load is turned on
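The on/off behavior described for occupancy and vacancy sensors reduces to a simple retriggerable-timeout rule, sketched below (class and method names are invented; timestamps are passed in explicitly so the logic is testable):

```python
class OccupancyController:
    """Any motion event turns (or keeps) the load on; the load switches
    off once no motion has been seen for `timeout` seconds."""

    def __init__(self, timeout=300.0):
        self.timeout = timeout
        self.last_motion = None
        self.load_on = False

    def motion_detected(self, now):
        self.last_motion = now
        self.load_on = True

    def tick(self, now):
        # Called periodically; drops the load after a quiet period.
        if self.load_on and now - self.last_motion >= self.timeout:
            self.load_on = False
        return self.load_on

ctrl = OccupancyController(timeout=300)
ctrl.motion_detected(0)
print(ctrl.tick(60))   # True  -- still within the hold time
print(ctrl.tick(400))  # False -- quiet longer than the timeout
```

A vacancy sensor would keep the same automatic turn-off in `tick`, but require the initial turn-on to come from a manual switch rather than from `motion_detected`.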
https://en.wikipedia.org/wiki/Valis%20II
Valis II is a 1989 action-platform video game originally developed by Laser Soft and published by Telenet Japan and NEC for the PC Engine CD-ROM²/TurboGrafx-CD. Home computer versions were released for the PC-8801, MSX2, PC-9801 and X68000. A super-deformed-style remake was also released in 1992 for the Sega Mega Drive/Genesis. It is the second entry in the eponymous series. It stars Yuko Asou, a Japanese teenage schoolgirl chosen to become the Valis warrior by wielding the titular mystical sword, after defeating the demon lord Rogles. The dream world Vecanti has fallen under the rule of the emperor Megas, whose hatred of his brother Rogles and bloodthirsty tendencies drive him to wipe out all traces of the former tyrant, including his supporters. Gameplay varies between versions, but all share similar elements: the player explores and searches for items and power-ups while fighting enemies and defeating bosses.

Work on Valis II did not start for a period of two and a half years, as the team became understaffed when several members left after Valis: The Fantasm Soldier. By the time development moved forward, Telenet had begun shifting its focus in order to establish itself in the LaserDisc market. Valis II was the first title created by Laser Soft, an internal gaming division of Telenet formed specifically to explore games for the CD-ROM format; Laser Soft also cooperated with Renovation Game (Reno), which handled the home computer versions. The staff hired animators for the project, as people within the anime industry were becoming interested in the video game industry. The TurboGrafx-CD and computer versions were made simultaneously, but each under a different development line. The Genesis remake was slated for a European release by UbiSoft as part of a multi-game licensing deal with Telenet's North American subsidiary Renovation Products, but it was never officially released in the region. Each version of the game has since been re-released through download services for other platforms and
https://en.wikipedia.org/wiki/Absolute%20configuration
Absolute configuration refers to the spatial arrangement of atoms within a chiral molecular entity (or group) and its resultant stereochemical description. Absolute configuration is typically relevant in organic molecules where carbon is bonded to four different substituents. This type of construction creates two possible enantiomers. Absolute configuration uses a set of rules to describe the relative positions of each bond around the chiral center atom. The most common labeling method uses the descriptors R or S and is based on the Cahn–Ingold–Prelog priority rules. R and S refer to rectus and sinister, Latin for right and left, respectively. Enantiomers can differ in their chemical behavior in chiral environments but are otherwise nearly identical in their physical properties, which can make distinguishing them challenging. Absolute configurations for a chiral molecule (in pure form) are most often obtained by X-ray crystallography, although with some important limitations. All enantiomerically pure chiral molecules crystallise in one of the 65 Sohncke groups (chiral space groups). Alternative techniques include optical rotatory dispersion, vibrational circular dichroism, ultraviolet-visible spectroscopy, the use of chiral shift reagents in proton NMR, and Coulomb explosion imaging.

History
Until 1951, it was not possible to obtain the absolute configuration of chiral compounds. It was arbitrarily decided that (+)-glyceraldehyde was the D-enantiomer. The configuration of other chiral compounds was then related to that of (+)-glyceraldehyde by sequences of chemical reactions. For example, oxidation of (+)-glyceraldehyde (1) with mercury oxide gives (−)-glyceric acid (2), a reaction that does not alter the stereocenter. Thus the absolute configuration of (−)-glyceric acid must be the same as that of (+)-glyceraldehyde. Nitric acid oxidation of (+)-isoserine (3) gives (−)-glyceric acid, establishing that (+)-isoserine also has the same absolute configuration. (+)-Isoserine can be converted
https://en.wikipedia.org/wiki/Fizz%20buzz
Fizz buzz is a group word game for children to teach them about division. Players take turns to count incrementally, replacing any number divisible by three with the word "fizz", any number divisible by five with the word "buzz", and any number divisible by both three and five with "fizz buzz".

Play
Players generally sit in a circle. The player designated to go first says the number "1", and the players then count upwards in turn. However, any number divisible by three is replaced by the word fizz and any number divisible by five by the word buzz. Numbers divisible by both three and five (i.e. divisible by 15) become fizz buzz. A player who hesitates or makes a mistake is eliminated. For example, a typical round of fizz buzz would start as follows:

Other variations
In some versions of the game, other divisibility rules, such as 7, can be used instead. Another rule that may be used to complicate the game is one where numbers containing a certain digit also trigger the corresponding rule (for instance, 52 would use the same rule as a number divisible by 5).

Programming
Fizz buzz (often spelled FizzBuzz in this context) has been used as an interview screening device for computer programmers. Writing a program to output the first 100 FizzBuzz numbers is a relatively trivial problem, requiring little more than a loop and conditional statements. Its value in coding interviews, however, is in revealing fundamental coding habits that may be indicative of overall coding ability.
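The programming exercise above is usually written as a single loop with divisibility tests, checking 15 first so that "fizzbuzz" takes precedence over the other two rules:

```python
def fizzbuzz(n):
    """Return the fizz buzz sequence for the numbers 1..n as strings."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:        # divisible by both 3 and 5
            out.append("fizzbuzz")
        elif i % 3 == 0:
            out.append("fizz")
        elif i % 5 == 0:
            out.append("buzz")
        else:
            out.append(str(i))
    return out

print(" ".join(fizzbuzz(15)))
# 1 2 fizz 4 buzz fizz 7 8 fizz buzz 11 fizz 13 14 fizzbuzz
```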
https://en.wikipedia.org/wiki/History%20of%20phycology
The history of phycology is the history of the scientific study of algae. Human interest in plants as food goes back to the origins of the species, and knowledge of algae can be traced back more than two thousand years. However, only in the last three hundred years has that knowledge evolved into a rapidly developing science.

Early days
The study of botany goes back into pre-history, as plants have been eaten since the beginning of the human race. The first attempts at plant cultivation are believed to have been made shortly before 10,000 BC in Western Asia (Morton, 1981), and the first references to algae are to be found in early Chinese literature. Records as far back as 3000 BC indicate that algae were used by the emperor of China as food (Huisman, 2000 p. 13). The use of Porphyra in China dates back to at least AD 533–544 (Mumford and Miura, 1988); there are also references in Roman and Greek literature. The Greek word for algae was phycos, whilst in Latin the name became fucus. There are early references to the use of algae for manure. The first coralline algae to be recognized as living organisms were probably Corallina, by Pliny the Elder in the 1st century AD (Irvine and Chamberlain, 1994 p. 11). The classification of plants has undergone many changes since Theophrastus (372–287 BC) and Aristotle (384–322 BC) grouped them as "trees", "shrubs" and "herbs" (Smith, 1955 p. 1). Little is known of botany during the Middle Ages — it was the Dark Ages of botany. The development of the study of phycology runs in a pattern comparable with, and parallel to, other biological fields, but at a different rate. After the invention of the printing press in the 15th century, printed books enabled more people to read and knowledge to spread.

Exploration of the world and the advance of knowledge
Written accounts of the algae of South Africa were made by the Portuguese explorers of the 15th and 16th centuries; however, it is not clear to which species they refer. (Huisman, 2000 p. 7) 17
https://en.wikipedia.org/wiki/Batman/Aliens
Batman/Aliens is a crossover between the Batman and Aliens comic book franchises. It was published in 1997. A sequel was released in 2003. Batman/Aliens Batman parachutes into the jungle near the border between Guatemala and Mexico, investigating the disappearance of a Wayne Enterprises geologist. He encounters an American Special Ops team hunting a target, and both are set upon by the Aliens. Several members of the team are killed, but along the way Batman becomes familiar with the Aliens' life cycle, and collects two facehuggers in specimen jars. The team leader sacrifices himself to blow up a nest of the Aliens, leaving only Batman and two members of the team alive, making their way to the team's evacuation point. One of the survivors, an intensely ambitious woman named Hyatt, leaves her teammate to be killed by one of the last Aliens and ambushes Batman, holding him at gunpoint while she relieves him of the lost geologist's voice recorder and one of the specimen jars. She says that the Aliens are an incredibly potent weapon if properly used, and that bringing the information about them back to the U.S. government will make her career. She is so fixated on Batman that she fails to notice a gargantuan Alien hybrid - the result of an Alien embryo being implanted into a crocodile - rise behind her. The hybrid kills Hyatt, but Batman kills the creature by tying its legs and tipping it into the mouth of an active volcano. Alone, he escapes from the jungle. In the Batcave, Bruce Wayne listens to the geologist's last message to his family, cut off as the man is attacked by an Alien. Bruce decides to drop the specimen jars containing the facehuggers into the cave's depths and tell no one about them. The Aliens are much too dangerous, he believes, "not because of what they are, but because of what we are". This story was spun off from a two-part short story featured in Dark Horse Presents #101-102 entitled Aliens: Incubation. The events that Batman discovers on the geologist's
https://en.wikipedia.org/wiki/Background%20debug%20mode%20interface
Background debug mode (BDM) interface is an electronic interface that allows debugging of embedded systems. Specifically, it provides in-circuit debugging functionality in microcontrollers. It requires a single wire and specialized electronics in the system being debugged. It appears in many Freescale Semiconductor products. The interface allows a host to manage and query a target. Specialized hardware is required in the target device. No special hardware is required in the host; a simple bidirectional I/O pin is sufficient. I/O signals The signals used by BDM to communicate data to and from the target are initiated by the host processor. The host negates the transmission line, and then either asserts the line sooner (to output a 1), asserts the line later (to output a 0), or tri-states its output, allowing the target to drive the line so that the host can sense a 1 or 0 as an input value. At the start of the next bit time, the host negates the transmission line, and the process repeats. Each bit is communicated in this manner. More generally, the increasing complexity of today's software and hardware designs is leading to some fresh approaches to debugging. Silicon manufacturers offer more and more on-chip debugging features for emulation of new processors. This capability, implemented in various processors under such names as background debug mode (BDM), JTAG and on-chip in-circuit emulation, puts basic debugging functions on the chip itself. With a BDM (1-wire interface) or JTAG (standard JTAG) debug port, the developer controls and monitors the microcontroller solely through the stable on-chip debugging services. This debugging mode runs even when the target system crashes and enables developers to continue investigating the cause of the crash. Microcontroller application development A good development tool environment is important to reduce total development time and cost. Users want to debug their application program under conditions that imitate the actual setup of the
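The timed assert/negate signaling described above can be sketched as a small simulation. This is an illustrative model only, not Freescale's actual electrical timing: the 16-clock bit cell, the early/late thresholds, and all function names are assumptions made for the demo.

```python
# Toy model of BDM-style bit signaling: the host drives the line low
# ("negates") at the start of each bit cell, then releases ("asserts") it
# early for a logic 1 or late for a logic 0; the target samples mid-cell.
# Cell width and thresholds are invented for illustration.

CELL = 16  # assumed target clock cycles per bit cell

def host_drive_bit(bit):
    """Return the line level at each clock of one bit cell."""
    low_clocks = 4 if bit else 12   # release early for 1, late for 0 (assumed)
    return [0] * low_clocks + [1] * (CELL - low_clocks)

def target_sample_bit(levels):
    """The target samples the line in the middle of the cell."""
    return levels[CELL // 2]

def send_byte(value):
    """Serialize a byte MSB-first as eight bit cells and recover it target-side."""
    bits = [(value >> i) & 1 for i in range(7, -1, -1)]
    return [target_sample_bit(host_drive_bit(b)) for b in bits]
```

Because only the timing of one edge distinguishes a 1 from a 0, a single bidirectional pin suffices, which is the property the article highlights.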
https://en.wikipedia.org/wiki/Dispersion%20point
In topology, a dispersion point or explosion point is a point in a topological space the removal of which leaves the space highly disconnected. More specifically, if X is a connected topological space containing the point p and at least two other points, p is a dispersion point for X if and only if X \ {p} is totally disconnected (every subspace is disconnected, or, equivalently, every connected component is a single point). If X is connected and X \ {p} is totally separated (for each two points x and y there exists a clopen set containing x and not containing y) then p is an explosion point. A space can have at most one dispersion point or explosion point. Every totally separated space is totally disconnected, so every explosion point is a dispersion point. The Knaster–Kuratowski fan has a dispersion point; any space with the particular point topology has an explosion point. If p is an explosion point for a space X, then the totally separated space X \ {p} is said to be pulverized.
https://en.wikipedia.org/wiki/Xbox%20360%20HD%20DVD%20Player
The Xbox 360 HD DVD Player is a discontinued accessory for the Xbox 360 console that enables the playback of movies on HD DVD discs. Microsoft offered the drive for sale between November 2006 and February 2008. It was initially sold for $199. Bill Gates announced during his keynote speech at CES 2006 that an external HD DVD drive would be released for the Xbox 360 during 2006. At E3 2006, Microsoft officially presented the external HD DVD drive. According to Japan's chief of Xbox operations, Yoshihiro Maruyama, Microsoft would not release Xbox 360 games in the new disc formats. On February 23, 2008, the Xbox 360 HD DVD player was abandoned by Microsoft. This decision came just days after Toshiba's announcement to discontinue all HD DVD players and effectively end the format war between Blu-ray and HD DVD. Two days later, the price of the HD DVD Player was reduced to a clearance price of $49.99. Peter Moore had stated that if HD DVD loses the format war, Microsoft may also release an external Blu-ray drive. This was later denied by Microsoft. Special black versions of the drive, along with black media remotes, were given to members of the Xbox 360 HD DVD development team. Unlike other black accessories which were created alongside the black Elite console, the black HD DVD drive was never made available to the general public. Technology The HD DVD player connects to the Xbox 360 using a mini USB connection. All of the audio and video processing and output come from Xbox 360 itself. The unit can also function as a USB hub, with 2 ports on the rear. It also includes a clip for attaching the wireless network adapter to it, much like what Xbox 360 consoles of the time had. The device also has an integrated 256 MB memory unit which is used for storage of HD DVD data and is accessible to the user for saving other data such as saved games. The drive plays standard DVDs in addition to HD DVD titles; however it does not read Xbox or Xbox 360 game discs, Audio CDs or mixe
https://en.wikipedia.org/wiki/Material%20selection
Material selection is a step in the process of designing any physical object. In the context of product design, the main goal of material selection is to minimize cost while meeting product performance goals. Systematic selection of the best material for a given application begins with properties and costs of candidate materials. Material selection is often aided by the use of a material index or performance index relevant to the desired material properties. For example, a thermal blanket must have low thermal conductivity in order to minimize heat transfer for a given temperature difference. It is essential that a designer should have a thorough knowledge of the properties of the materials and their behavior under working conditions. Some of the important characteristics of materials are: strength, durability, flexibility, weight, resistance to heat and corrosion, ability to be cast, welded, or hardened, machinability, electrical conductivity, etc. Systematic selection for applications requiring multiple criteria is more complex. For example, when the material should be both stiff and light, for a rod a combination of high Young's modulus and low density indicates the best material, whereas for a plate the cube root of stiffness divided by density is the best indicator, since a plate's bending stiffness scales as the cube of its thickness. Similarly, again considering both stiffness and lightness, for a rod that will be pulled in tension the specific modulus, or modulus divided by density, should be considered, whereas for a beam that will be subject to bending, the square root of stiffness divided by density is the best indicator. Reality often presents limitations, and the utilitarian factor must be taken into consideration. The cost of the ideal material, depending on shape, size and composition, may be prohibitive, and the demand, the commonality of frequently utilized and known items, its characteristics and even the region of the market dictate its availability. Ashby plots An Ashby plot
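The way the loading geometry changes the best material can be made concrete with a small ranking script. The property values below are rough illustrative figures, not authoritative data, and the index functions follow the standard Ashby-style forms for a tie, a beam, and a plate in stiffness-limited, weight-minimizing design.

```python
# Rank candidate materials by different Ashby-style performance indices.
# Property values (Young's modulus E in GPa, density rho in Mg/m^3) are
# rough illustrative figures only.

materials = {
    "steel":     (210.0, 7.8),
    "aluminium": ( 70.0, 2.7),
    "CFRP":      (150.0, 1.6),
    "wood":      ( 10.0, 0.6),
}

def index_tie(E, rho):    # stiff, light rod in tension: E / rho
    return E / rho

def index_beam(E, rho):   # stiff, light beam in bending: E^(1/2) / rho
    return E ** 0.5 / rho

def index_plate(E, rho):  # stiff, light plate in bending: E^(1/3) / rho
    return E ** (1.0 / 3.0) / rho

def rank(index):
    """Return material names, best first, for the given index."""
    return sorted(materials, key=lambda m: index(*materials[m]), reverse=True)
```

With these figures, CFRP wins for a tie or beam, but wood edges ahead for a plate, illustrating how the exponent on stiffness shifts the ranking toward low-density materials.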
https://en.wikipedia.org/wiki/Charles%20Robertson%20Maier
Charles Robertson Maier, C StJ, CD, FRSA, FHSC (born 1945), is the current Priory Historian for St John Ambulance, The Priory of Canada. Maier was born in St Louis, Missouri in 1945. He received a Bachelor of Arts in 1969 from the University of British Columbia and a Master of Arts in 1970 from King's College London, University of London. He was an archivist for the Yukon Territory until the foundation of the Canadian Heraldic Authority in 1988, when he was commissioned Athabaska Herald. He was made a fellow of the Royal Heraldry Society of Canada in 1991. He was promoted Commander of the Order of St John in 2005.
https://en.wikipedia.org/wiki/Isopeptide%20bond
An isopeptide bond is a type of amide bond formed between a carboxyl group of one amino acid and an amino group of another. An isopeptide bond is the linkage between the side chain amino or carboxyl group of one amino acid to the α-carboxyl, α-amino group, or the side chain of another amino acid. In a typical peptide bond, also known as eupeptide bond, the amide bond always forms between the α-carboxyl group of one amino acid and the α-amino group of the second amino acid. Isopeptide bonds are rarer than regular peptide bonds. Isopeptide bonds lead to branching in the primary sequence of a protein. Proteins formed from normal peptide bonds typically have a linear primary sequence. Amide bonds, and thus isopeptide bonds, are stabilized by resonance (electron delocalization) between the carbonyl oxygen, the carbonyl carbon, and the nitrogen atom. The bond strength of an isopeptide bond is similar to that of a peptide due to the similar bonding type. The bond strength of a peptide bond is 2.3-3.6 kcal/mol. Amino acids such as lysine, glutamic acid, glutamine, aspartic acid, and asparagine can form isopeptide bonds because they all contain an amino or carboxyl group on their side chain. For example, the formation of an isopeptide bond between the sidechains of lysine and glutamine is as follows: Gln−(C=O)NH2 + Lys-NH3+ → Gln−(C=O)NH−Lys + NH4+ The ε-amino group of lysine can also react with the α-carboxyl group of any other amino acid as in the following reaction: Ile-(C=O)O- + Lys-NH3+ → Ile-(C=O)NH-Lys + H2O Isopeptide bond formation can be enzyme-catalyzed or occur spontaneously. The reaction between lysine and glutamine, as shown above, is catalyzed by a transglutaminase. Another example of enzyme-catalyzed isopeptide bond formation is the formation of the glutathione molecule. Glutathione, a tripeptide, contains a normal peptide bond (between cysteine and glycine) and an isopeptide bond (between glutamate and cysteine). The formation of the isopeptide bond
https://en.wikipedia.org/wiki/Introduction%20to%20entropy
In thermodynamics, entropy is a numerical quantity that shows that many physical processes can go in only one direction in time. For example, cream and coffee can be mixed together, but cannot be "unmixed"; a piece of wood can be burned, but cannot be "unburned". The word 'entropy' has entered popular usage to refer to a lack of order or predictability, or a gradual decline into disorder. A more physical interpretation of thermodynamic entropy refers to spread of energy or matter, or to extent and diversity of microscopic motion. If a movie that shows coffee being mixed or wood being burned were played in reverse, it would depict processes impossible in reality. Mixing coffee and burning wood are "irreversible". Irreversibility is described by a law of nature known as the second law of thermodynamics, which states that in an isolated system (a system not connected to any other system) which is undergoing change, entropy increases over time. Entropy does not increase indefinitely. A body of matter and radiation eventually will reach an unchanging state, with no detectable flows, and is then said to be in a state of thermodynamic equilibrium. Thermodynamic entropy has a definite value for such a body and is at its maximum value. When bodies of matter or radiation, initially in their own states of internal thermodynamic equilibrium, are brought together so as to intimately interact and reach a new joint equilibrium, then their total entropy increases. For example, a glass of warm water with an ice cube in it will have a lower entropy than that same system some time later when the ice has melted leaving a glass of cool water. Such processes are irreversible: An ice cube in a glass of warm water will not spontaneously form from a glass of cool water. Some processes in nature are almost reversible. For example, the orbiting of the planets around the Sun may be thought of as practically reversible: A movie of the planets orbiting the Sun which is run in reverse would not a
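The claim that heat flowing from warm to cool raises total entropy can be checked with a one-line calculation, using dS = Q/T for a small heat transfer between two bodies at roughly constant temperatures. The numbers below are illustrative, not taken from any specific experiment.

```python
# Total entropy change when heat q (in joules) leaves a body at t_hot (K)
# and enters a body at t_cold (K), treating each temperature as constant.
# The hot body loses entropy q/t_hot; the cold body gains q/t_cold.

def entropy_change(q, t_hot, t_cold):
    return -q / t_hot + q / t_cold

# Illustrative values: 100 J flowing from warm water (300 K) to melting ice (273 K).
ds = entropy_change(100.0, 300.0, 273.0)
```

Because t_cold < t_hot, the gain q/t_cold always exceeds the loss q/t_hot, so ds is positive; the reversed flow (swapping the temperatures) would give a negative total, which is exactly the impossible "movie played in reverse" of the text.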
https://en.wikipedia.org/wiki/Optical%20sine%20theorem
In optics, the optical sine theorem states that the products of the index, height, and sine of the slope angle of a ray in object space and its corresponding ray in image space are equal. That is, n y sin θ = n′ y′ sin θ′, where n, y, and θ are the refractive index, ray height, and slope angle in object space, and n′, y′, and θ′ are the corresponding quantities in image space. External links http://physics.tamuk.edu/~suson/html/4323/aberatn.html#Optical%20Sine Sine theorem Physics theorems
https://en.wikipedia.org/wiki/Witness-indistinguishable%20proof
A witness-indistinguishable proof (WIP) is a variant of a zero-knowledge proof for languages in NP. In a typical zero-knowledge proof of a statement, the prover will use a witness for the statement as input to the protocol, and the verifier will learn nothing other than the truth of the statement. In a WIP, this zero-knowledge condition is weakened, and the only guarantee is that the verifier will not be able to distinguish between provers that use different witnesses. In particular, the protocol may leak information about the set of all witnesses, or even leak the witness that was used when there is only one possible witness. Witness-indistinguishable proof systems were first introduced by Feige and Shamir. Unlike zero-knowledge proofs, they remain secure when multiple proofs are being performed concurrently. See also Ring signature
https://en.wikipedia.org/wiki/OKATO
Russian Classification on Objects of Administrative Division (), or OKATO (), also called All-Russian classification on units of administrative and territorial distribution in English, is one of several Russian national registers. OKATO's purpose is to organize information about the structure of the administrative divisions of the federal subjects of Russia. The document assigns numeric codes to each administrative division of the country, which are hierarchically structured from the federal subject level down to selsoviet level; an expanded version also includes listings of individual inhabited localities within each administrative division. OKATO is used for statistical and tax purposes. It was adopted on July 31, 1995, replacing SOATO (Designation System of Objects of Administrative Division of the Union of SSR and the Union Republics, as well as Inhabited Localities). It went into effect on January 1, 1997 and as of 2014 had undergone 243 revisions. The compilation and maintenance of the OKATO data are the responsibility of the Federal State Statistics Service of Russia (Rosstat). See also Administrative division codes of the People's Republic of China (:zh:中华人民共和国行政区划代码), a somewhat similar system used in the PRC (only down to the county level). OKTMO, Russian Classification on Territories of Municipal Division
https://en.wikipedia.org/wiki/Iran%20Bioinformatics%20Center
Iran Bioinformatics Center (IBC) is the only academic center in Iran working on bioinformatics. Although some independent research groups, such as the Bioinformatics and Biomathematics Unit at Mazandaran University of Medical Sciences, also work on bioinformatics, IBC is the only such academic center; it is part of the Institute of Biochemistry and Biophysics (IBB) at Tehran University. IBC offers a Ph.D. program in bioinformatics. See also Institute of Biochemistry and Biophysics Tehran University Bioinformatics and Biomathematics Unit Bioinformatics External links Iran Bioinformatics Center Institute of Biophysics and Biochemistry Bioinformatics Center of Institute of Bio-IT University of Tehran Bioinformatics organizations
https://en.wikipedia.org/wiki/Provider%20Backbone%20Bridge%20Traffic%20Engineering
Provider Backbone Bridge Traffic Engineering (PBB-TE) is a computer networking technology specified in IEEE 802.1Qay, an amendment to the IEEE 802.1Q standard. PBB-TE adapts Ethernet to carrier class transport networks. It is based on the layered VLAN tags and MAC-in-MAC encapsulation defined in IEEE 802.1ah (Provider Backbone Bridges (PBB)), but it differs from PBB in eliminating flooding, dynamically created forwarding tables, and spanning tree protocols. Compared to PBB and its predecessors, PBB-TE behaves more predictably and its behavior can be more easily controlled by the network operator, at the expense of requiring up-front connection configuration at each bridge along a forwarding path. PBB-TE Operations, Administration, and Management (OAM) is usually based on IEEE 802.1ag. It was initially based on Nortel's Provider Backbone Transport (PBT). PBB-TE's connection-oriented features and behaviors, as well as its OAM approach, are inspired by SDH/SONET. PBB-TE can also provide path protection levels similar to the UPSR (Unidirectional Path Switched Ring) protection in SDH/SONET networks. Principle of operation The IEEE 802.1Qay PBB-TE standard extends the functionality of IEEE 802.1ah Provider Backbone Bridges, adding a connection-oriented mode using point-to-point trunks that deliver resiliency and configurable performance levels. A service is identified by an I-SID (Backbone Service Instance Identifier) and each service is associated with a PBB-TE trunk. Each PBB-TE trunk is identified by a triplet of B-SA, B-DA and B-VID. The B-SA and B-DA identify the source and destination bridges, respectively, that are the endpoints of the trunk. The B-VID is a backbone VLAN identifier that is used to distinguish different trunks to the same destination. The management system configures the PBB-TE trunks on all the edge and core bridges by creating static forwarding database entries; the management system is responsible for ensuring that there are no forwarding lo
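The trunk-identification scheme described above can be sketched as a small data model. This is an assumption-laden illustration of the description in the text (a trunk identified by the B-SA/B-DA/B-VID triplet, static forwarding entries, no learning or flooding), not an implementation of IEEE 802.1Qay; all class and field names are invented.

```python
# Toy model of how a management system might represent PBB-TE trunks and the
# static forwarding database entries it installs on each bridge along a path.

from dataclasses import dataclass

@dataclass(frozen=True)
class Trunk:
    b_sa: str   # backbone source MAC (edge bridge where the trunk starts)
    b_da: str   # backbone destination MAC (edge bridge where it ends)
    b_vid: int  # backbone VLAN ID, distinguishing trunks to the same B-DA

class Bridge:
    """A bridge with a purely static forwarding database: no learning, no flooding."""
    def __init__(self, name):
        self.name = name
        self.fdb = {}   # (B-DA, B-VID) -> egress port

    def install(self, trunk, egress_port):
        """Management-plane configuration of a static entry for one trunk."""
        self.fdb[(trunk.b_da, trunk.b_vid)] = egress_port

    def forward(self, b_da, b_vid):
        """Frames to unknown (B-DA, B-VID) pairs are dropped, never flooded."""
        return self.fdb.get((b_da, b_vid))
```

The key behavioral contrast with ordinary bridging shows up in `forward`: an unmatched lookup returns nothing instead of triggering flooding, which is what makes PBB-TE's forwarding predictable at the cost of up-front configuration.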
https://en.wikipedia.org/wiki/List%20of%20concept-%20and%20mind-mapping%20software
Concept mapping and mind mapping software is used to create diagrams of relationships between concepts, ideas, or other pieces of information. It has been suggested that the mind mapping technique can improve learning and study efficiency up to 15% over conventional note-taking. Many software packages and websites allow creating or otherwise supporting mind maps. File format Using a standard file format allows interchange of files between various programs. Many programs listed below support the OPML file format and the XML file format used by FreeMind. Free and open-source The following tools comply with the Free Software Foundation's (FSF) definition of free software. As such, they are also open-source software. Freeware The following is a list of notable concept mapping and mind mapping applications which are freeware and available at no cost. Some are open source and others are proprietary software. Proprietary software The table below lists pieces of proprietary commercial software that allow creating mind and concept maps. See also Brainstorming List of Unified Modeling Language tools Outliner Study software
https://en.wikipedia.org/wiki/International%20Neuroinformatics%20Coordinating%20Facility
The International Neuroinformatics Coordinating Facility (INCF) is an international non-profit organization with the mission to develop, evaluate, and endorse standards and best practices that embrace the principles of Open, FAIR, and Citable neuroscience. INCF also provides training on how standards and best practices facilitate reproducibility and enable the publishing of the entirety of research output, including data and code. INCF was established in 2005 by recommendations of the Global Science Forum working group of the OECD. The INCF is hosted by the Karolinska Institutet in Stockholm, Sweden. The INCF network comprises institutions, organizations, companies, and individuals active in neuroinformatics, neuroscience, data science, technology, and science policy and publishing. The Network is organized into governing bodies and working groups which coordinate various categories of global neuroinformatics activities, guide and oversee the development and endorsement of standards and best practices, and provide training on how standards and best practices facilitate reproducibility and enable the publishing of the entirety of research output, including data and code. The current Directors are Mathew Abrams and Helena Ledmyr, and the Governing Board Chair is Maryann Martone. The INCF network aims to promote the application of computational approaches to understanding the brain and to develop the infrastructure required to use computational methods to integrate and analyze diverse data across scales, techniques, and species to understand the brain and positively impact the health and well-being of society. Background A key element to successfully understanding the nervous system is the integration of neuroscience with information sciences. The field that studies the nervous system, neuroscience, has responded to the fantastic challenge of understanding how our brain works with the use of the most sophisticated technologies, from studies on the genome
https://en.wikipedia.org/wiki/IEEE%20802.1ah
IEEE 802.1ah is an amendment to the IEEE 802.1Q networking standard which adds support for Provider Backbone Bridges. It includes an architecture and a set of protocols for routing over a provider's network, allowing interconnection of multiple provider bridge networks without losing each customer's individually defined VLANs. It was initially created by Nortel before being submitted to the IEEE 802.1 committee for standardization. The final version was approved by the IEEE in June 2008 and has been integrated into IEEE 802.1Q-2011. History The now-ubiquitous Ethernet was initially defined as a local area network (LAN) technology to interconnect the computers within a small organization in which these host computers were very close in proximity to each other. Over the years, Ethernet has become such a popular technology that it became the default Data Link Layer (OSI Layer 2) mechanism for data transport. This created a need for extending the Ethernet from a customer LAN bridging domain to service provider MAN, also known as the Provider bridging domain. For this, a 4 byte S-Tag or Service Tag, a type of Virtual LAN tag, was added to the header of the Ethernet frame in IEEE 802.1ad standard. In the service provider domain, switching was based on S-Tag and destination MAC address, and C-tag was used to create virtual LAN within the customer domain. This technology is also known as QinQ or Q-tunneling. QinQ does not offer true separation of customer and provider domains but is merely a way to overcome the limitations on the VLAN identifier space. It can also help in separation of the customer and provider control domains when used with other features like control protocol tunneling or Per-VLAN Spanning Tree etc. There is still the problem of having too little control on the MAC addresses, since QinQ forwarding is still based on the customer destination addresses. Thus, better mechanisms are needed. Description The idea of PBB is to offer complete separation of c
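The 802.1ad (QinQ) tag stacking that 802.1ah builds on can be shown concretely as a byte layout. The sketch below constructs a double-tagged Ethernet header with Python's standard struct module; the TPID values 0x88A8 (S-tag) and 0x8100 (C-tag) are the standardized EtherTypes, while the function name and the choice to zero the priority/drop-eligible bits are assumptions of this demo.

```python
# Build the header of an IEEE 802.1ad double-tagged frame:
# dst MAC | src MAC | S-tag (TPID 0x88A8 + TCI) | C-tag (TPID 0x8100 + TCI) | EtherType

import struct

def dot1ad_header(dst, src, s_vid, c_vid, ethertype=0x0800):
    """dst/src are 6-byte MACs; s_vid/c_vid are 12-bit VLAN IDs (PCP/DEI left 0)."""
    s_tci = s_vid & 0x0FFF   # low 12 bits of the Tag Control Information
    c_tci = c_vid & 0x0FFF
    return (dst + src
            + struct.pack("!HH", 0x88A8, s_tci)   # outer S-tag (provider)
            + struct.pack("!HH", 0x8100, c_tci)   # inner C-tag (customer)
            + struct.pack("!H", ethertype))       # payload EtherType (e.g. IPv4)
```

As the article notes, provider switches forward on the S-tag plus the (customer) destination MAC, which is why QinQ alone does not hide customer MAC addresses from the provider's forwarding tables; 802.1ah adds the MAC-in-MAC backbone encapsulation on top of this layout.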
https://en.wikipedia.org/wiki/S-63%20%28encryption%20standard%29
S-63 is an International Hydrographic Organization (IHO) standard for encrypting, securing and compressing electronic navigational chart (ENC) data. The Data Protection Scheme was prepared by the IHO Data Protection Scheme Advisory Group, and was based on the protection scheme developed and operated by Primar as part of providing their protected ENC service. ECC (Electronic Chart Centre) and United Kingdom Hydrographic Office were the original contributing organizations. The UKHO has since left this arrangement and Primar is now operated exclusively by ECC. The standard was adopted as the official IHO standard by the IHO member states in December 2002. The S-63 standard secures data by encrypting the basic transfer database using the Blowfish algorithm, SHA-1-hashing the data based on a random key and adding a CRC32 check. The standard also defines the systems to develop permit files that are delivered to end-users of ENC data enabling them to decrypt the data and use it for navigation. It also defines the use of DSA format signatures to authenticate the data originator, however because of poor implementation of the standard by ECDIS hardware manufacturers, virtually all signing is performed centrally by the IHO which acts as the scheme administrator. Exceptions to this are a few smaller resellers such as AUSRenc operated by AHS. Compression is achieved by applying the standard ZIP (file format) algorithm to the base and update ENC files, before encryption. The other files are not compressed.
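The general compress-then-protect pipeline the standard describes can be sketched with standard-library pieces. Python's standard library has no Blowfish, so a toy XOR step stands in for the cipher here purely to show the ordering (deflate compression, then encryption, then SHA-1 and CRC32 check values); a real S-63 implementation would use actual Blowfish from a cryptography library, and none of the function names below come from the standard.

```python
# Illustrative compress -> encrypt -> checksum pipeline. The XOR "cipher" is
# NOT Blowfish and provides no security; it only marks where the real cipher
# sits in the sequence.

import zlib, hashlib, binascii

def protect(enc_bytes, key):
    compressed = zlib.compress(enc_bytes)              # 1. compress first
    cipher = bytes(b ^ key[i % len(key)]               # 2. toy stand-in cipher
                   for i, b in enumerate(compressed))
    digest = hashlib.sha1(cipher).hexdigest()          # 3. SHA-1 check value
    crc = binascii.crc32(cipher) & 0xFFFFFFFF          # 4. CRC32 check value
    return cipher, digest, crc

def unprotect(cipher, key):
    """Reverse the toy cipher, then decompress (what a permit holder would do)."""
    compressed = bytes(b ^ key[i % len(key)] for i, b in enumerate(cipher))
    return zlib.decompress(compressed)
```

Compressing before encrypting matters: encrypted data looks random and compresses poorly, so the standard applies ZIP to the base and update files first, exactly as modeled here.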
https://en.wikipedia.org/wiki/Lazy%20loading
Lazy loading (also known as asynchronous loading) is a technique commonly used in computer programming and mostly in web design and web development to defer initialization of an object until the point at which it is needed. It can contribute to efficiency in the program's operation if properly and appropriately used. This makes it ideal in use cases where network content is accessed and initialization times are to be kept at a minimum, such as in the case of web pages. For example, deferring loading of images on a web page until they are needed can make the initial display of the web page faster. The opposite of lazy loading is eager loading. Examples With web frameworks Prior to being established as a web standard, web frameworks were generally used to implement lazy loading. One of these is AngularJS. Since lazy loading decreases bandwidth and subsequently server resources, it is a strong contender to implement in a website, especially in order to improve user retention by having less delay when loading the page, which may also improve Search Engine Optimization. Below is an example of lazy loading being used in Angular, written in TypeScript, from Farata Systems:

@NgModule({
  imports: [
    BrowserModule,
    RouterModule.forRoot([
        {path: '', component: HomeComponent},
        {path: 'product', component: ProductDetailComponent},
        {path: 'luxury',
         loadChildren: () => import('./luxury.module').then(m => m.LuxuryModule),
         data: {preloadme: true}}
      ]
      // , {preloadingStrategy: CustomPreloadingStrategy}
    )
  ],
  declarations: [AppComponent, HomeComponent, ProductDetailComponent],
  providers: [{provide: LocationStrategy, useClass: HashLocationStrategy},
              CustomPreloadingStrategy],
  bootstrap: [AppComponent]
})

As a web standard Since 2020, major web browsers have enabled native handling of lazy loading by default. This allows lazy loading to be incorporated into a webpage by adding HTML attributes. The loading attribute support
https://en.wikipedia.org/wiki/Bluetooth%20stack
A Bluetooth stack is software that is an implementation of the Bluetooth protocol stack. Bluetooth stacks can be roughly divided into two distinct categories: General-purpose implementations that are written with emphasis on feature-richness and flexibility, usually for desktop computers. Support for additional Bluetooth profiles can typically be added through drivers. Embedded system implementations intended for use in devices where resources are limited and demands are lower, such as Bluetooth peripheral devices. General-purpose implementations BSD FreeBSD The FreeBSD Bluetooth stack is implemented using the Netgraph framework. A broad variety of Bluetooth USB dongles are supported by the ng_ubt driver. The implementation was committed in 2002, and first released with FreeBSD 5.0. NetBSD NetBSD has its own Bluetooth implementation, committed in 2006, and first released with . OpenBSD OpenBSD has had the implementation from NetBSD for some time, but it was removed in 2014 due to lack of maintainership and code rot. DragonFly BSD DragonFly BSD has had NetBSD's Bluetooth implementation since 1.11 (2008), first released with . A netgraph-based implementation from FreeBSD has also been available in the tree since 2008, dating to an import of Netgraph from the FreeBSD 7 timeframe into DragonFly, but was possibly disabled until 2014-11-15, and may still require more work. Linux BlueALSA BlueALSA is a Bluetooth audio ALSA backend that allows the use of Bluetooth-connected audio devices without the use of PulseAudio or PipeWire. BlueZ BlueZ, initially developed by Qualcomm, is a Bluetooth stack, included with the official Linux kernel distributions, for the Linux kernel-based family of operating systems. Its goal is to provide an implementation of the Bluetooth wireless standards specifications for Linux. As of 2006, the BlueZ stack supports all core Bluetooth protocols and layers. It is available for Linux kernel versions 2
https://en.wikipedia.org/wiki/Counter-machine%20model
There are many variants of the counter machine, among them those of Hermes, Ershov, Péter, Minsky, Lambek, Shepherdson and Sturgis, and Schönhage. These are explained below. The models in more detail 1954: Hermes' model Shepherdson and Sturgis observe that "the proof of this universality [of digital computers to Turing machines] ... seems to have been first written down by Hermes, who showed in [7--their reference number] how an idealized computer could be programmed to duplicate the behavior of any Turing machine" (Shepherdson and Sturgis, p. 219). 1959: Kaphengst's model Shepherdson and Sturgis observe that: "Kaphengst's approach is interesting in that it gives a direct proof of the universality of present-day digital computers, at least when idealized to the extent of admitting an infinity of storage registers each capable of storing arbitrarily long words" (Shepherdson and Sturgis, p. 219) The only two arithmetic instructions are the successor operation and testing two numbers for equality. The rest of the operations are transfers from register-to-accumulator or accumulator-to-register or test-jumps. Kaphengst's paper is written in German; Shepherdson and Sturgis's translation uses terms such as "mill" and "orders". The machine contains "a mill" (accumulator). Kaphengst designates his mill/accumulator with the "infinity" symbol but we will use "A" in the following description. It also contains an "order register" ("order" as in "instruction", not as in "sequence"). (This usage came from the Burks–Goldstine–von Neumann (1946) report's description of "...an Electronic Computing Instrument".) The order/instruction register is register "0". And, although not clear from Shepherdson and Sturgis's exposition, the model contains an "extension register" designated by Kaphengst "infinity-prime"; we will use "E". The instructions are stored in the registers: "...so the machine, like an actual computer, is capable of doing arithmetic operations on its own program" (p. 244). Thus this model is act
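A machine restricted to a successor operation, an equality test, and register/accumulator transfers can still compute: the sketch below is a tiny interpreter in that spirit, with all instruction names and the encoding invented for illustration (they are not Kaphengst's actual orders).

```python
# Minimal register machine: accumulator "A", successor, equality-test jump,
# and transfers between named registers and the accumulator.

def run(program, registers, fuel=100_000):
    """Execute until the program counter runs off the end (or fuel runs out)."""
    pc = 0
    while pc < len(program) and fuel > 0:
        fuel -= 1
        op, *args = program[pc]
        if op == "SUCC":                       # A := A + 1
            registers["A"] += 1
        elif op == "LOAD":                     # A := R[name]
            registers["A"] = registers[args[0]]
        elif op == "STORE":                    # R[name] := A
            registers[args[0]] = registers["A"]
        elif op == "JEQ":                      # if A == R[name]: goto target
            if registers["A"] == registers[args[0]]:
                pc = args[1]
                continue
        pc += 1
    return registers

# Compute R2 := R1 + R2 using counter register C (initially 0).
ADD_PROGRAM = [
    ("LOAD", "C"),
    ("JEQ", "R1", 8),    # done once the counter reaches R1
    ("SUCC",),
    ("STORE", "C"),      # C := C + 1
    ("LOAD", "R2"),
    ("SUCC",),
    ("STORE", "R2"),     # R2 := R2 + 1
    ("JEQ", "R2", 0),    # A == R2 here, so this jump is always taken
]
```

Note the trick in the last instruction: with only a conditional equality jump available, an unconditional branch is synthesized by testing the accumulator against the register it was just stored into.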
https://en.wikipedia.org/wiki/Why%E2%80%93because%20analysis
Why–because analysis (WBA) is a method for accident analysis. It is independent of application domain and has been used to analyse, among others, aviation-, railway-, marine-, and computer-related accidents and incidents. It is mainly used as an after-the-fact (or a posteriori) analysis method. WBA strives to ensure objectivity, falsifiability and reproducibility of results. The result of a WBA is a why–because graph (WBG), a type of causal notation used to represent interdependencies within a system. The WBG depicts causal relations between factors of an accident. It is a directed acyclic graph where the nodes of the graph are factors. Directed edges denote cause–effect relations between the factors. WBA in detail WBA starts with the question "What is the accident or accidents in question?". In most cases this is easy to define. Next comes an iterative process to determine causes. When causes for the accident have been identified, formal tests are applied to all potential cause–effect relations. This process can be iterated for the newfound causes, and so on, until a satisfactory result has been achieved. At each node (factor), each contributing cause (related factor) must have been necessary to cause the accident, and the totality of causes must have been sufficient to do so. The formal tests The counterfactual test (CT) – The CT traces back to David Lewis' formal notion of causality and counterfactuals. The CT asks the following question: "If the cause had not been, could the effect have happened?". The CT proves or disproves that a cause is a necessary causal factor for an effect. Only if the cause in question is necessary is it clearly a contributing factor to the effect. The causal sufficiency test (CST) – The CST asks the question: "Will an effect always happen if all attributed causes happen?". The CST aims at deciding whether a set of causes is sufficient for an effect to happen. Missing causes can thus be identified. Only if for all causal
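The two formal tests can be illustrated with a toy model. The sketch below assumes a deliberately simplified boolean world in which an effect occurs exactly when all of its attributed causes occur; real WBA applies Lewis-style counterfactual reasoning, and the factor names here are purely hypothetical:

```python
# Toy illustration of the WBA formal tests under a simplified boolean
# model: an effect occurs iff all of its attributed causes occur.
# (Real WBA uses Lewis-style counterfactual semantics; this only
# illustrates the shape of the two tests.)

def effect_occurs(causes, present):
    return all(c in present for c in causes)

def counterfactual_test(causes, cause):
    """Necessity: with `cause` removed, the effect must not occur."""
    remaining = set(causes) - {cause}
    return not effect_occurs(causes, remaining)

def causal_sufficiency_test(causes):
    """Sufficiency: with all attributed causes present, the effect occurs."""
    return effect_occurs(causes, set(causes))

# Hypothetical factors for an illustrative grounding accident.
causes_of_grounding = {"navigation error", "no radar watch", "shallow bank"}

print(all(counterfactual_test(causes_of_grounding, c)
          for c in causes_of_grounding))          # True: each cause necessary
print(causal_sufficiency_test(causes_of_grounding))  # True: set is sufficient
```

In a real analysis the tests are applied to every edge of the why–because graph, and a failed sufficiency test signals that a cause is still missing from the graph.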
https://en.wikipedia.org/wiki/Devices%20Profile%20for%20Web%20Services
The Devices Profile for Web Services (DPWS) defines a minimal set of implementation constraints to enable secure web service messaging, discovery, description, and eventing on resource-constrained devices. Its objectives are similar to those of Universal Plug and Play (UPnP) but, in addition, DPWS is fully aligned with Web Services technology and includes numerous extension points allowing for seamless integration of device-provided services in enterprise-wide application scenarios. DPWS standardization The DPWS specification was initially published in May 2004 and was submitted for standardization to OASIS in July 2008. DPWS 1.1 was approved as OASIS Standard together with WS-Discovery 1.1 and SOAP-over-UDP 1.1 on June 30, 2009. DPWS defines an architecture in which devices run two types of services: hosting services and hosted services. Hosting services are directly associated with a device, and play an important part in the device discovery process. Hosted services are mostly functional and depend on their hosting device for discovery. In addition to these hosted services, DPWS specifies a set of built-in services: Discovery services: used by a device connected to a network to advertise itself and to discover other devices. Support of discovery has led some to dub DPWS as "the USB for Ethernet." Metadata exchange services: provide dynamic access to a device's hosted services and to their metadata. Publish/subscribe eventing services: allowing other devices to subscribe to asynchronous event messages produced by a given service. DPWS builds on the following core Web Services standards: WSDL 1.1, XML Schema, SOAP 1.2, WS-Addressing, and further comprises WS-MetadataExchange, WS-Transfer, WS-Policy, WS-Security, WS-Discovery and WS-Eventing. Microsoft's Windows Vista and Windows Embedded CE6R2 platforms natively integrate DPWS with a stack called WSDAPI, included as part of the Windows Rally technologies. Support for OSGi is on the way. Use cases Because DPWS
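To make the discovery service concrete, the sketch below builds a WS-Discovery Probe message of the kind DPWS devices listen for on UDP multicast. The namespace URIs are those of the 2005 draft specifications used by DPWS 1.0 and should be treated as illustrative rather than normative:

```python
# Hedged sketch: constructing a WS-Discovery Probe SOAP envelope.
# DPWS devices listen for such probes on UDP multicast 239.255.255.250
# port 3702 (SOAP-over-UDP). Namespace URIs are from the 2005 draft
# specs and are illustrative, not checked against a live stack.
import uuid

WSA = "http://schemas.xmlsoap.org/ws/2004/08/addressing"
WSD = "http://schemas.xmlsoap.org/ws/2005/04/discovery"

def probe_message():
    msg_id = f"urn:uuid:{uuid.uuid4()}"
    return f"""<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope"
    xmlns:wsa="{WSA}" xmlns:wsd="{WSD}">
  <soap:Header>
    <wsa:Action>{WSD}/Probe</wsa:Action>
    <wsa:MessageID>{msg_id}</wsa:MessageID>
    <wsa:To>urn:schemas-xmlsoap-org:ws:2005:04:discovery</wsa:To>
  </soap:Header>
  <soap:Body>
    <wsd:Probe/>  <!-- an empty Probe matches every device -->
  </soap:Body>
</soap:Envelope>"""

print(probe_message()[:40])
```

A device receiving such a probe answers with a ProbeMatch carrying its endpoint reference, after which metadata exchange can begin.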
https://en.wikipedia.org/wiki/Gunnera%20tinctoria
Gunnera tinctoria, known as giant rhubarb, Chilean rhubarb, or nalca, is a flowering plant species native to southern Chile and neighboring zones in Argentina. It is unrelated to rhubarb, as the two plants belong to different orders, but looks similar from a distance and has similar culinary uses. It is a large-leaved perennial plant that grows to more than two metres tall. It has been introduced to many parts of the world as an ornamental plant. In some countries, such as New Zealand, the United Kingdom and Ireland, it has spread from gardens and is becoming an introduced species of concern. It is known under the synonyms Gunnera chilensis Lam. and Gunnera scabra Ruiz & Pav. Taxonomy It was first described in 1782 by Juan Ignacio Molina as Panke tinctoria, and was transferred to the genus Gunnera in 1805 by Charles-François Brisseau de Mirbel. Description Gunnera tinctoria is a giant, clump-forming herbaceous perennial. The leaves can grow up to 2.5 m across, cordate and palmate with up to 9-lobed margins. The stems are covered in numerous spikes. It has erect spikes of cone-shaped inflorescences (to 1 m) from spring to early summer, with small flowers. The fruit is orange. The number of seeds is estimated at 80,000 per seedhead to 250,000 per plant. Habitat Stream- and roadsides. Uses In its native Chile, where it is called nalca or pangue, it is used in a similar way to European rhubarb: after peeling, the stalks are eaten fresh or cooked into jam or cordial. The leaves are used in the preparation of the traditional Chilean dish curanto. As an invasive species In parts of New Zealand the Chilean rhubarb has become a recognized pest plant. For instance in Taranaki, on the west coast of the North Island, it has spread to riverbeds, coastal cliffs and forest margins. G. tinctoria is on the National Pest Plant Accord. Under Sections 52 and 53 of the Biosecurity Act, it is an offence to knowingly propagate, distribute, spread, sell, or offer it for sale. In Great Br
https://en.wikipedia.org/wiki/Fermi%20resonance
A Fermi resonance is the shifting of the energies and intensities of absorption bands in an infrared or Raman spectrum. It is a consequence of quantum-mechanical wavefunction mixing. The phenomenon was explained by the Italian physicist Enrico Fermi. Selection rules and occurrence Two conditions must be satisfied for the occurrence of Fermi resonance: The two vibrational modes of a molecule transform according to the same irreducible representation in their molecular point group. In other words, the two vibrations must have the same symmetries (Mulliken symbols). The transitions coincidentally have very similar energies. Fermi resonance most often occurs between fundamental and overtone excitations, if they are nearly coincident in energy. Fermi resonance leads to two effects. First, the high-energy mode shifts to still higher energy, and the low-energy mode shifts to still lower energy. Second, the weaker mode gains intensity (becomes more allowed), and the more intense band decreases in intensity. The two transitions are describable as a linear combination of the parent modes. Fermi resonance does not lead to additional bands in the spectrum, but rather shifts in bands that would otherwise exist. Examples Ketones High-resolution IR spectra of most ketones reveal that the "carbonyl band" is split into a doublet. The peak separation is usually only a few cm−1. This splitting arises from the mixing of νCO and the overtone of HCH bending modes. CO2 In CO2, the bending vibration ν2 (667 cm−1) has symmetry Πu. The first excited state of ν2 is denoted 0110 (no excitation in the ν1 mode (symmetric stretch), one quantum of excitation in the ν2 bending mode with angular momentum about the molecular axis equal to ±1, no excitation in the ν3 mode (asymmetric stretch)) and clearly transforms according to the irreducible representation Πu. Putting two quanta into the ν2 mode leads to a state with components of symmetry (Πu × Πu)+ = Σg+ + Δg. These are called 0200 and 0220 r
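The level-repulsion and intensity-sharing effects follow from diagonalizing a two-level mixing Hamiltonian. The sketch below computes the perturbed band positions for two zero-order levels E1, E2 coupled by a matrix element W; the numerical values are illustrative, not a fit to any particular molecule:

```python
# Two-level ("Fermi resonance") mixing sketch: zero-order levels E1 and
# E2 with coupling W repel each other symmetrically about their mean.
# Values are illustrative wavenumbers, not data for a real molecule.
import math

def fermi_mix(E1, E2, W):
    mean = (E1 + E2) / 2
    half_gap = math.hypot((E1 - E2) / 2, W)   # sqrt(delta^2 + W^2)
    return mean - half_gap, mean + half_gap   # perturbed band positions

lo, hi = fermi_mix(1330.0, 1340.0, 25.0)      # cm^-1, illustrative
print(round(lo, 1), round(hi, 1))
```

With W = 0 the original positions are recovered; any nonzero coupling pushes the upper band up and the lower band down, exactly the first effect described above.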
https://en.wikipedia.org/wiki/Fray%20in%20Magical%20Adventure
Fray in Magical Adventure, also known as simply Fray (フレイ) and Fray-Xak Epilogue (Gai-den), is a 1990 spin-off "gaiden" (side story) game in the role-playing video game series Xak, developed and published by the Japanese software developer Micro Cabin. Even though it is directly connected to the more serious Xak storyline, Fray takes a light-hearted, comedic approach to telling its story. It was originally released for the MSX2 and was later ported to several different systems, among them the MSX turbo R, PC-9801, PC Engine (as Fray CD), and Game Gear. Gameplay Fray is a simple action RPG. The game proceeds by the player's character Fray fighting through a preset overhead-view map, shooting opposing monsters, jumping over obstacles, and locating powerups and Gold, the game's currency, along the way. At the end of each stage the player fights a boss and enters a town or safe haven, where the player can purchase new equipment and hit points, and has the option to save their progress. Fray advances in power through the items that she can equip, such as different rods and shields. Battles take place in real time, with Fray and the monster characters moving around an automatically scrolling vertical game map. She has an attack and defense rating, and can switch between different projectile weapon styles as well as use special attacks and healing items. Plot Fray features a high-fantasy setting where a great war was fought between the benevolent but weakening ancient gods and a demon race, which led to the collapse and eventual mortality of the gods. After this 'War of Sealing', the gods divided the world into three parts: Xak, the world of humans, Oceanity, the world of faeries, and Zekisis, the world of demons. The demon world of Zekisis was tightly sealed from the other two worlds to prevent reentry of the warmongering demon race. Some demons were left behind in Xak, however, and others managed to discover a separate means to enter Xak from Zekisis anyway. (This ancient h
https://en.wikipedia.org/wiki/Allan%20M.%20Campbell
Allan McCulloch Campbell (April 27, 1929 – April 19, 2018) was an American microbiologist and geneticist and the Barbara Kimball Browning Professor Emeritus in the Department of Biology at Stanford University. His pioneering work on Lambda phage helped to advance molecular biology in the late 20th century. An important collaborator and member of his laboratory at Stanford University was biochemist Alice del Campillo Campbell, his wife. Education Campbell earned his bachelor's degree at the University of California, Berkeley (1950) and master's (1951) and doctoral (1953) degrees from the University of Illinois, where he worked with Sol Spiegelman. Career From 1953 to 1957, Campbell was on the faculty of the University of Michigan. During the summers he spent time with Gio ("Joe") Bertani at Caltech and the University of Southern California, at Cold Spring Harbor Laboratory, and at the Institut Pasteur with François Jacob. In 1958 he married Alice del Campillo, a Ph.D. student in biochemistry at the University of Michigan. They spent their honeymoon year working in Paris. The two worked closely together throughout their careers, investigating research questions such as the encoding of heat-sensitive endolysin and the biosynthesis and regulation of biotin. Campbell spent the next nine years on the faculty of the University of Rochester, where he made significant discoveries about lambda phage. In 1968 Campbell joined the Department of Biology at Stanford University, where he led his own laboratory. He was appointed to the Barbara Kimball Browning endowed chair in 1992. Campbell was the editor of the Annual Review of Genetics from 1985 to 2012. Research Campbell's research concentrated on the genetics of bacteria and their viruses, especially the integration of viral DNA into host chromosomes. His most prominent contribution was the proposal of the “Campbell model” of virus insertion, in which viral DNA is inserted into the host chromosome, becoming covalently bon
https://en.wikipedia.org/wiki/Beale%20number
In mechanical engineering, the Beale number is a parameter that characterizes the performance of Stirling engines. It is often used to estimate the power output of a Stirling engine design. For engines operating with a high temperature differential, typical values for the Beale number are in the range 0.11−0.15, where a larger number indicates higher performance. Definition The Beale number can be defined in terms of a Stirling engine's operating parameters: Bn = Wo / (P V F) where: Bn is the Beale number Wo is the power output of the engine (watts) P is the mean gas pressure (Pa, or MPa if the volume is in cm3) V is the swept volume of the power piston (m3, or cm3 if the pressure is in MPa) F is the engine cycle frequency (Hz) Estimating Stirling power To estimate the power output of an engine, nominal values are assumed for the Beale number, pressure, swept volume and frequency, then the power is calculated as the product of these parameters: Wo = Bn × P × V × F See also West number External links Stirling Engine Performance Calculator Beale number calculator Dimensionless numbers Piston engines Mechanical engineering
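The power estimate is a single multiplication once nominal values are chosen. The sketch below uses illustrative numbers, not a real engine datasheet:

```python
# Beale-number power estimate for a Stirling engine: Wo = Bn * P * V * F.
# All parameter values below are illustrative defaults, not data for a
# specific engine.

def stirling_power(beale=0.13, pressure_pa=2.0e6, swept_m3=1.0e-4, freq_hz=25.0):
    """Estimated power output in watts (SI units throughout)."""
    return beale * pressure_pa * swept_m3 * freq_hz

print(stirling_power())  # 0.13 * 2e6 Pa * 1e-4 m^3 * 25 Hz = 650.0 W
```

Keeping all quantities in SI units (Pa, m3, Hz) avoids the Pa/MPa and m3/cm3 bookkeeping noted in the definition.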
https://en.wikipedia.org/wiki/Mantle%20convection
Mantle convection is the very slow creeping motion of Earth's solid silicate mantle as convection currents carry heat from the interior to the planet's surface. The Earth's surface lithosphere rides atop the asthenosphere, and the two form the components of the upper mantle. The lithosphere is divided into a number of tectonic plates that are continuously being created or consumed at plate boundaries. Accretion occurs as mantle is added to the growing edges of a plate, associated with seafloor spreading. Upwelling beneath the spreading centers is a shallow, rising component of mantle convection and in most cases not directly linked to the global mantle upwelling. The hot material added at spreading centers cools down by conduction and convection of heat as it moves away from the spreading centers. At the consumption edges of the plate, the material has thermally contracted to become dense, and it sinks under its own weight in the process of subduction, usually at an ocean trench. Subduction is the descending component of mantle convection. This subducted material sinks through the Earth's interior. Some subducted material appears to reach the lower mantle, while in other regions this material is impeded from sinking further, possibly due to a phase transition from spinel to silicate perovskite and magnesiowüstite, an endothermic reaction. The subducted oceanic crust triggers volcanism, although the basic mechanisms are varied. Volcanism may occur due to processes that add buoyancy to partially melted mantle, causing upward flow of the partial melt as its density decreases. Secondary convection may cause surface volcanism as a consequence of intraplate extension and mantle plumes. In 1993 it was suggested that inhomogeneities in the D″ layer have some impact on mantle convection. Mantle convection causes tectonic plates to move around the Earth's surface. Types of convection During the late 20th century, there was significant debate within the geo
https://en.wikipedia.org/wiki/Efimov%20state
The Efimov effect is an effect in the quantum mechanics of few-body systems predicted by the Russian theoretical physicist V. N. Efimov in 1970. The Efimov effect arises when three identical bosons interact: an infinite series of excited three-body energy levels is predicted when a two-body state is exactly at the dissociation threshold. One corollary is that there exist bound states (called Efimov states) of three bosons even if the two-particle attraction is too weak to allow two bosons to form a pair. A (three-particle) Efimov state, where the (two-body) sub-systems are unbound, is often depicted symbolically by the Borromean rings. This means that if one of the particles is removed, the remaining two fall apart. In this case, the Efimov state is also called a Borromean state. Theory Efimov predicted that, as the pair interactions among three identical bosons approach resonance—that is, as the binding energy of some two-body bound state approaches zero or the scattering length of such a state becomes infinite—the three-body spectrum exhibits an infinite sequence of bound states whose scattering lengths and binding energies each form a geometric progression with a universal common ratio: the scattering lengths scale by e^(π/s₀) ≈ 22.7 (OEIS ), with s₀ ≈ 1.00624. Here s₀ is the order of the imaginary-order modified Bessel function of the second kind that describes the radial dependence of the wavefunction. By virtue of the resonance-determined boundary conditions, it is the unique positive value of s₀ satisfying a transcendental equation. Experimental results In 2005, the research group of Rudolf Grimm and Hanns-Christoph Nägerl from the Institute for Experimental Physics at the University of Innsbruck experimentally confirmed such a state for the first time, in an ultracold gas of caesium atoms. In 2006, they published their findings in the scientific journal Nature. Further experimental proof for the existence of the Efimov state has been given recently by independent groups. Almost 40 years after
https://en.wikipedia.org/wiki/Ontogeny%20and%20Phylogeny
Ontogeny and Phylogeny is a 1977 book on evolution by Stephen Jay Gould, in which the author explores the relationship between embryonic development (ontogeny) and biological evolution (phylogeny). Unlike his many popular books of essays, it was a technical book, and over the following decades it was influential in stimulating research into heterochrony (changes in the timing of embryonic development), which had been neglected since Ernst Haeckel's theory that ontogeny recapitulates phylogeny had been largely discredited. This helped to create the field of evolutionary developmental biology. Context Ontogeny and Phylogeny is Stephen Jay Gould's first technical book. He wrote that Ernst Mayr had suggested in passing that he write a book on development. Gould stated he "only began it as a practice run to learn the style of lengthy exposition before embarking on my magnum opus about macroevolution." This later work was published in 2002 as The Structure of Evolutionary Theory. Book Publication The book was published in 1977 by Belknap Press of Harvard University Press. It was reprinted seventeen times by the same publisher between 1977 and 2003. Summary The first half of the book explores Ernst Haeckel's biogenetic law (recapitulation)—the discredited idea that embryonic developmental stages replay the evolutionary transitions of the adult forms of an organism's ancestors—and how this idea influenced thinking in biology, theology, and psychology. Gould begins with the ancient Greek philosopher Anaximander, showing that the ideas formed a tradition leading to the French naturalist Charles Bonnet. Gould describes the recapitulationists in the 19th century, from the German Lorenz Oken and Johann Friedrich Meckel to the French Étienne Serres. The book examines the criticism of the theory by the Baltic German Karl Ernst von Baer and the Swiss-American Louis Agassiz, and relates 19th century phylogeny to Charles Darwin's 1859 theory of evolution, Haeckel's approac
https://en.wikipedia.org/wiki/Aliens%3A%20The%20Computer%20Game%20%28UK%20Version%29
Aliens: The Computer Game is a 1986 video game developed by Software Studios and published by Electric Dreams Software initially for Amstrad CPC, Commodore 64 and ZX Spectrum. It is based on the film of the same title. Ports for the Commodore 16 and MSX were developed by Mr. Micro and published in 1987. Gameplay Aliens: The Computer Game is played from a first-person perspective, and is set inside an atmosphere processing plant, a maze complex consisting of 255 rooms. The player encounters Alien enemies throughout the game. Upon killing an Alien, the body leaves a deadly pool of acid blood that must be avoided. The player also faces the threat of bio-mechanical growth, which, if left uncontained, results in new Alien eggs and facehugger enemies. The player's ultimate goal is to reach the room that houses the Queen Alien and her nests, both of which must be destroyed. The player can directly control a team of Marine soldiers, or can issue orders to the team from the Mobile Tactical Operations Bay (MTOB). When playing from the MTOB, the player has a view of the team's surroundings via video cameras attached to each soldier's helmet. Reception According to Retro Gamer, "the game was praised by the computing press - Zzap!, Amstrad Action, and Sinclair User awarded it 81%, 90% and 5/5 respectively." Crash gave it a score of 84%, with one reviewer declaring it "the best game-of-the-film to date," and the review by Zzap!64 also opined it was "the best tie-in game to date, and a good game to boot." Computer Gamer gave this "excellent game of a superb film" an overall score of 80%. Commodore User gave it 8 out of 10 and Your Sinclair gave it 9 out of 10. In 1993, Commodore Force ranked the game at number 59 on its list of the top 100 Commodore 64 games. In a Retro Gamer retrospective, Darren Jones opined that "despite being incredible basic to look at, Aliens dripped with atmosphere and was quite unlike any movie conversion of the time, and not just because it was so bloo
https://en.wikipedia.org/wiki/Herbal%20distillate
Herbal distillates, also known as floral waters, hydrosols, hydrolates, herbal waters, and essential waters, are aqueous products of hydrodistillation. They are colloidal suspensions of essential oils as well as water-soluble components obtained by steam distillation or hydrodistillation (a variant of steam distillation) from plants and herbs. These herbal distillates have uses as flavorings and cosmetics. Common herbal distillates for skincare include rose water, orange flower water, and witch hazel. Rosemary, oregano, and thyme hydrosols may be used in food manufacturing. Production Herbal distillates are produced in the same or a similar manner as essential oils. However, the essential oils float to the top of the distillate, where they are removed, leaving behind the watery distillate. For this reason the term essential water is an apt description. In the past, these essential waters were often considered a byproduct of distillation, but they are now considered an important co-product. The resulting herbal waters are essentially diluted essential oils at less than 1% concentration (typically 0.02% to 0.05%). Several factors, such as temperature and an herb's growth cycle, affect the characteristics of a distillate, and therefore influence the timing of the distillation. Rosemary, for example, should be distilled at the peak of summer before it flowers. Usage Distillates are used as flavorings, cosmetics and herbal treatments. Herbal distillates are less concentrated than essential oils, possibly making them more suitable for some topical uses. Science The science of distillation is based on the fact that different substances vaporise at different temperatures. Unlike other extraction techniques based on the solubility of a compound in either water or oil, distillation will separate components regardless of their solubility. The distillate will contain compounds that vaporize at or below the temperature of distillation. The actual chemical components of these
https://en.wikipedia.org/wiki/Malate%E2%80%93aspartate%20shuttle
The malate–aspartate shuttle (sometimes simply the malate shuttle) is a biochemical system for translocating electrons produced during glycolysis across the semipermeable inner membrane of the mitochondrion for oxidative phosphorylation in eukaryotes. These electrons enter the electron transport chain of the mitochondria via reduction equivalents to generate ATP. The shuttle system is required because the mitochondrial inner membrane is impermeable to NADH, the primary reducing equivalent of the electron transport chain. To circumvent this, malate carries the reducing equivalents across the membrane. Components The shuttle consists of four protein parts: malate dehydrogenase in the mitochondrial matrix and intermembrane space. aspartate aminotransferase in the mitochondrial matrix and intermembrane space. malate-alpha-ketoglutarate antiporter in the inner membrane. glutamate-aspartate antiporter in the inner membrane. Mechanism The primary enzyme in the malate–aspartate shuttle is malate dehydrogenase. Malate dehydrogenase is present in two forms in the shuttle system: mitochondrial malate dehydrogenase and cytosolic malate dehydrogenase. The two malate dehydrogenases are differentiated by their location and structure, and catalyze their reactions in opposite directions in this process. First, in the cytosol, malate dehydrogenase catalyses the reaction of oxaloacetate and NADH to produce malate and NAD+. In this process, two electrons generated from NADH, and an accompanying H+, are attached to oxaloacetate to form malate. Once malate is formed, the first antiporter (malate-alpha-ketoglutarate) imports the malate from the cytosol into the mitochondrial matrix and also exports alpha-ketoglutarate from the matrix into the cytosol simultaneously. After malate reaches the mitochondrial matrix, it is converted by mitochondrial malate dehydrogenase into oxaloacetate, during which NAD+ is reduced with two electrons to form NADH. Oxaloacetate is then transformed into a
https://en.wikipedia.org/wiki/Master%20of%20Quantitative%20Finance
A master's degree in quantitative finance concerns the application of mathematical methods to the solution of problems in financial economics. There are several like-titled degrees which may further focus on financial engineering, computational finance, mathematical finance, and/or financial risk management. In general, these degrees aim to prepare students for roles as "quants" (quantitative analysts), including analysis, structuring, trading, and investing; in particular, these degrees emphasize derivatives and fixed income, and the hedging and management of the resultant market and credit risk. Formal master's-level training in quantitative finance has existed since 1990. Structure The program is usually one to one and a half years in duration, and may include a thesis component. Entrance requirements are generally multivariable calculus, linear algebra, differential equations and some exposure to computer programming (usually C++); programs emphasizing financial mathematics may require some background in measure theory. Initially, the curriculum builds quantitative skills, and simultaneously develops the underlying finance theory: The quantitative component draws on applied mathematics, computer science and statistical modelling, and emphasizes stochastic calculus, numerical methods and simulation techniques; see . Some programs also focus on econometrics / time series analysis. The theory component usually includes a formal study of financial economics, addressing asset pricing and financial markets; some programs may also include general coverage of economics, accounting, corporate finance and portfolio management. The components are then integrated, addressing the modelling, valuation and hedging of equity derivatives, commodity derivatives, foreign exchange derivatives, and fixed income instruments and their related credit- and interest rate derivatives; see . Programs often include dedicated modules in market risk and credit risk, with some degree
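The "numerical methods and simulation techniques" component typically begins with exercises like the one sketched below: Monte Carlo valuation of a European call under geometric Brownian motion (the Black–Scholes dynamics). All parameter values are illustrative:

```python
# Typical curriculum exercise sketch: Monte Carlo pricing of a European
# call option under geometric Brownian motion (Black-Scholes dynamics).
# Parameters are illustrative, not market data.
import math
import random

def mc_call_price(S0, K, r, sigma, T, n_paths=100_000, seed=1):
    rng = random.Random(seed)                 # fixed seed for reproducibility
    drift = (r - 0.5 * sigma**2) * T          # risk-neutral drift term
    vol = sigma * math.sqrt(T)
    payoff_sum = 0.0
    for _ in range(n_paths):
        ST = S0 * math.exp(drift + vol * rng.gauss(0.0, 1.0))
        payoff_sum += max(ST - K, 0.0)        # call payoff at maturity
    return math.exp(-r * T) * payoff_sum / n_paths   # discounted average

price = mc_call_price(S0=100, K=100, r=0.05, sigma=0.2, T=1.0)
print(round(price, 2))  # close to the Black-Scholes value of about 10.45
```

Later modules replace the closed-form dynamics with models that have no analytic price, where simulation (or PDE and tree methods) is the only route to a value.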
https://en.wikipedia.org/wiki/Aliens%3A%20The%20Computer%20Game%20%28US%20Version%29
Aliens: The Computer Game is a 1986 video game developed and published by Activision for the Commodore 64 and Apple II, based on the film of the same title. As Activision's UK subsidiary Electric Dreams Software had independently released their own version of the game with the same title, the game was renamed for European release. Initially planned to be released as Aliens: The Second Part, it was finally published under the title Aliens: US Version, with ports for the Amstrad CPC and ZX Spectrum produced by Mr Micro. Gameplay Aliens is a series of six minigames strung together via graphical interactive sequences, akin to an adventure game, though the only interaction possible is advancing the dialog, displayed in speech balloons. The minigames are mostly action sequences that involve piloting a ship from the Sulaco to the planet's surface, recognizing equipment, and fighting aliens. Reception At the time of its release, the game received mixed reviews, including scores of 85% from Commodore Format, 8/10 (averaged) from Computer and Video Games, 45% from Crash, 5/10 from Sinclair User, 9/10 from Your Sinclair, and 60% from Zzap!64. Info gave the Commodore 64 version four stars out of five: "The aliens are appropriately creepy, and each sequence is well done & plays quite differently from the others". Retrospective VentureBeat's Stephen Kleckner commented in a 2014 feature that "as with a lot of compilation-designed titles, Aliens falls into that trap of being a collection of mediocre experiences instead of a game with a singular focus. […] Hardcore fans who own a Commodore 64 should load this one up. Everyone else isn’t missing much that a Let's Play video won't provide." On the other hand, Chris Cummins from Topless Robot wrote in 2010 that "the now-crude graphics aside, it's still arguably the best game based on any of the films in the Alien saga." Reviews Isaac Asimov's Science Fiction Magazine v11 n8 (1987 08) See also List of Alien, Predator and Alien vs. Pr
https://en.wikipedia.org/wiki/Hall%20algebra
In mathematics, the Hall algebra is an associative algebra with a basis corresponding to isomorphism classes of finite abelian p-groups. It was first discussed by but forgotten until it was rediscovered by , both of whom published no more than brief summaries of their work. The Hall polynomials are the structure constants of the Hall algebra. The Hall algebra plays an important role in the theory of Masaki Kashiwara and George Lusztig regarding canonical bases in quantum groups. generalized Hall algebras to more general categories, such as the category of representations of a quiver. Construction A finite abelian p-group M is a direct sum of cyclic p-power components Z/(p^{λ_i}), where λ = (λ₁ ≥ λ₂ ≥ ...) is a partition, called the type of M. Let g^λ_{μν}(p) be the number of subgroups N of M such that N has type ν and the quotient M/N has type μ. Hall proved that the functions g^λ_{μν} are polynomial functions of p with integer coefficients. Thus we may replace p with an indeterminate q, which results in the Hall polynomials g^λ_{μν}(q). Hall next constructs an associative ring H over Z[q], now called the Hall algebra. This ring has a basis consisting of symbols u_λ, one for each partition λ, and the structure constants of the multiplication in this basis are given by the Hall polynomials: u_μ u_ν = Σ_λ g^λ_{μν}(q) u_λ. It turns out that H is a commutative ring, freely generated by the elements u_{(1^n)} corresponding to the elementary p-groups. The linear map from H to the algebra of symmetric functions defined on these generators by the formula u_{(1^n)} ↦ q^{−n(n−1)/2} e_n (where e_n is the nth elementary symmetric function) uniquely extends to a ring homomorphism, and the images of the basis elements u_λ may be interpreted via the Hall–Littlewood symmetric functions. Specializing q to 0, these symmetric functions become Schur functions, which are thus closely connected with the theory of Hall polynomials.
https://en.wikipedia.org/wiki/Md5deep
md5deep is a software package used in the computer security, system administration and computer forensics communities to run large numbers of files through any of several different cryptographic digests. It was originally authored by Jesse Kornblum, at the time a special agent of the Air Force Office of Special Investigations. , he still maintains it. The name md5deep is misleading. Since version 2.0, the md5deep package contains several different programs able to perform MD5, SHA-1, SHA-256, Tiger192 and Whirlpool digests, each of them named by the digest type followed by the word "deep". Thus, the name may confuse some people into thinking it only provides the MD5 algorithm when the package supports many more. md5deep can be invoked in several different ways. Typically users operate it recursively, where md5deep walks through one directory at a time giving digests of each file found, and recursing into any subdirectories within. Its recursive behavior is approximately a depth-first search, which has the benefit of presenting files in lexicographical order. On Unix-like systems, similar functionality can be often obtained by combining find with hashing utilities such as md5sum, sha256sum or tthsum. md5deep exists for Windows and most Unix-based systems, including OS X. It is present in OS X's Fink, Homebrew and MacPorts projects. Binary packages exist for most free Unix systems. Many vendors initially resist including md5deep as they mistakenly believe its functions can be reproduced with one line of shell scripting. The matching function of the program, however, cannot be done easily in shell. Because md5deep was written by an employee of the U.S. government, on government time, it is in the public domain. Other software surrounding it, such as graphical front-ends, may be copyrighted. See also Hash functions MD5, SHA-1, and SHA-2 (which includes SHA-224, SHA-256, SHA-384, SHA-512)
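The recursive behavior described above can be approximated in a few lines. The sketch below is illustrative only, not the actual md5deep implementation (and it omits md5deep's matching mode, the part the text notes cannot easily be done in shell):

```python
# A rough Python analogue of md5deep's recursive mode (illustrative,
# not the actual md5deep implementation): walk a directory tree
# depth-first and yield one digest per file, in lexicographic order.
import hashlib
import os

def digests(root, algorithm="md5"):
    for dirpath, dirnames, filenames in os.walk(root):
        dirnames.sort()                    # recurse into subdirs in sorted order
        for name in sorted(filenames):
            path = os.path.join(dirpath, name)
            h = hashlib.new(algorithm)     # "sha1", "sha256", ... also work
            with open(path, "rb") as f:
                # read in 64 KiB chunks so large files are not held in memory
                for chunk in iter(lambda: f.read(1 << 16), b""):
                    h.update(chunk)
            yield h.hexdigest(), path

# Example: for d, p in digests("/some/dir", "sha256"): print(d, p)
```

Swapping the `algorithm` argument mirrors the way the package's sha1deep, sha256deep and similar binaries differ only in the digest they compute.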
https://en.wikipedia.org/wiki/AMD%20APU
AMD Accelerated Processing Unit (APU), formerly known as Fusion, is a series of 64-bit microprocessors from Advanced Micro Devices (AMD), combining a general-purpose AMD64 central processing unit (CPU) and 3D integrated graphics processing unit (IGPU) on a single die.

AMD announced the first generation APUs, Llano for high-performance and Brazos for low-power devices, in January 2011. The second generation, Trinity for high-performance and Brazos-2 for low-power devices, was announced in June 2012. The third generation Kaveri for high-performance devices was launched in January 2014, while Kabini and Temash for low-power devices were announced in the summer of 2013. Since the launch of the Zen microarchitecture, Ryzen and Athlon APUs have been released to the global market as Raven Ridge on the DDR4 platform, following Bristol Ridge a year earlier. AMD has also supplied semi-custom APUs for consoles, starting with the Sony PlayStation 4 and Microsoft Xbox One eighth-generation video game consoles.

History

The AMD Fusion project started in 2006 with the aim of developing a system on a chip that combined a CPU with a GPU on a single die. This effort was moved forward by AMD's acquisition of the graphics chipset manufacturer ATI in 2006. The project reportedly required three internal iterations of the Fusion concept to create a product deemed worthy of release. Reasons contributing to the delay of the project include the technical difficulty of combining a CPU and GPU on the same die at a 45 nm process, and conflicting views on what the roles of the CPU and GPU should be within the project.

The first generation desktop and laptop APU, codenamed Llano, was announced on 4 January 2011 at the 2011 Consumer Electronics Show in Las Vegas and released shortly thereafter. It featured K10 CPU cores and a Radeon HD 6000 series GPU on the same die on the FM1 socket. An APU for low-power devices was announced as the Brazos platform, based on the Bobcat microarchitecture and a Rad
https://en.wikipedia.org/wiki/SDEP
The SDEP (Street events Data Exchange Protocol) comprises an XML data schema and a web service WSDL for exchanging information about streetworks, roadworks, and street events between systems. Elgin was funded by the UK NeSDS Government e-Standards Programme to conduct a consultation and convene meetings to define the requirements of a common data exchange protocol for streetworks registers and other systems handling street-event data. SDEP was developed to allow the open exchange of such data between the back-office systems used by local authorities to manage their highway networks, in order to enable e-Government and streetworks co-ordination.

The SDEP consultation group comprised ELGIN (Chair), Mayrise Ltd., Symology Ltd., Pitney Bowes Inc., Exor Corporation (Bentley Systems), the Office of the Deputy Prime Minister and Transport for London, with the National Traffic Control Centre in an observing capacity.

See also

Office of the Deputy Prime Minister
Transport for London
National Traffic Control Centre

External links

SDEP technical documentation including an XML schema and WSDL
https://en.wikipedia.org/wiki/Decussation
Decussation is used in biological contexts to describe a crossing, named after the Roman numeral for ten (Latin decussis), an uppercase 'X'. In Latin anatomical terms the form decussatio is used, e.g. decussatio pyramidum. Similarly, the anatomical term chiasma is named after the Greek uppercase 'Χ' (chi). Whereas a decussation refers to a crossing within the central nervous system, various kinds of crossings in the peripheral nervous system are called chiasma. Examples include:

In the brain, where nerve fibers obliquely cross from one lateral side of the brain to the other, that is to say they cross at a level other than their origin. See for examples the decussation of pyramids and the sensory decussation. In neuroanatomy, the term chiasma is reserved for crossings of or within nerves, such as in the optic chiasm.

In botanical leaf taxology, the word decussate describes an opposite pattern of leaves which has successive pairs at right angles to each other (i.e. rotated 90 degrees along the stem when viewed from above). In effect, successive pairs of leaves cross each other. Basil is a classic example of a decussate leaf pattern.

In tooth enamel, where bundles of rods cross each other as they travel from the enamel–dentine junction to the outer enamel surface, or near to it.

In taxonomic descriptions where decussate markings or structures occur, names in part containing "decuss..." are common, especially in the specific epithet.

Evolutionary significance

The origin of the contralateral organization, the optic chiasm and the major decussations in the nervous system of vertebrates has been a long-standing puzzle to scientists. The visual map theory of Ramón y Cajal has long been popular but has been criticized for its logical inconsistency. More recently, it has been proposed that the decussations are caused by an axial twist by which the anterior head, along with the forebrain, is turned by 180° with respect to the rest of the body.

See also

Definition of types of cro
https://en.wikipedia.org/wiki/Faron%20Moller
Faron George Moller (born February 25, 1962 in Trail, British Columbia) is a Canadian-born British computer scientist and expert on theoretical computer science, particularly infinite-state automata theory and temporal logic. His work has focussed on structural decomposition techniques for analysing abstract models of computing systems. He is founding Director of the Swansea Railway Verification Group, Director of Technocamps, and Head of the Institute of Coding in Wales. In 2023, he was elected General Secretary of the Learned Society of Wales.

Biography

Moller studied mathematics and computer science as an undergraduate at the University of British Columbia, and then as a Masters student at the University of Waterloo, before going on to do a PhD supervised by Robin Milner in the Laboratory for Foundations of Computer Science at the University of Edinburgh. He held posts at the universities of Strathclyde and Edinburgh, the Swedish Institute of Computer Science, the Royal Institute of Technology in Stockholm, and Uppsala University before moving to Wales as Professor of Computer Science at Swansea University in 2000.

Appointments and honours

Moller is a Fellow of the Learned Society of Wales, a Fellow of the British Computer Society and a Fellow of the Institute of Mathematics and its Applications, and served as President of the British Colloquium for Theoretical Computer Science for 15 years (2004–2019). He is a Chartered Mathematician, a Chartered Scientist, and a Chartered IT Professional. His full nomenclature with post-nominal letters is Professor Faron Moller BSc, MMath, PhD, CITP, CMath, CSci, FLSW, FBCS, FIMA.

He is also Director of Technocamps, a pan-Wales schools outreach programme aimed at introducing and reinforcing Computer Science and Digital Competency within all Welsh schools and inspiring young people to study computing-based topics, and Head of the Institute of Coding in Wales.

See also

List of University of Waterloo people
https://en.wikipedia.org/wiki/Onomancy
Onomancy (or nomancy) is divination based on a subject's name. Onomancy gained popularity in Europe during the Late Middle Ages, but is said to have originated with the Pythagoreans in antiquity. Several methods of analyzing a name are possible, some of which are based on arithmancy or gematria.

An early example of onomancy is found in the Secretum Secretorum. The system given there involves adding up the numerical values of the letters in the names of two antagonists, dividing the total for each person by 9, and comparing the remainders with a table which predicts the victor.

In China, Taiwan, and Japan, onomancy is known as 姓名判断 (Chinese: xingming panduan; Japanese: seimei handan). It can take several forms, but the most popular is based on the character strokes in the subject's written name: the resulting number is reduced modulo 81, and the remainders 0, 1, 3, 5, 6, 7, 8, 11, 13, 15, 16, 17, 18, 21, 23, 24, 25, 29, 31, 32, 33, 35, 37, 38, 39, 41, 45, 47, 48, 52, 57, 58, 61, 63, 65, 67, 68 are considered "lucky" numbers.

Notes
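The Secretum Secretorum procedure — sum the letter values of a name, reduce modulo 9, then compare the remainders — can be sketched in a few lines of Python. The a=1 … z=26 value scheme below is a modern stand-in chosen purely for illustration, not the historical letter-value table, and the victory table itself is not reproduced:

```python
def name_number(name: str, modulus: int = 9) -> int:
    """Sum per-letter values (a=1 ... z=26 -- an illustrative scheme,
    not the medieval table) and reduce modulo ``modulus``."""
    total = sum(ord(c) - ord("a") + 1 for c in name.lower() if c.isalpha())
    return total % modulus

# In the Secretum Secretorum scheme, the two antagonists' remainders
# would next be looked up in a table predicting the victor.
print(name_number("hector"), name_number("achilles"))
```

The same skeleton accommodates the East Asian stroke-count variant by swapping the letter-value sum for a stroke-count sum and using modulus 81.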
https://en.wikipedia.org/wiki/Starvation%20response
Starvation response in animals (including humans) is a set of adaptive biochemical and physiological changes, triggered by lack of food or extreme weight loss, in which the body seeks to conserve energy by reducing the amount of food energy it expends. Equivalent or closely related terms include famine response, starvation mode, famine mode, starvation resistance, starvation tolerance, adapted starvation, adaptive thermogenesis, fat adaptation, and metabolic adaptation.

In humans

Ordinarily, the body responds to reduced energy intake by burning fat reserves and consuming muscle and other tissues. Specifically, the body burns fat after first exhausting the contents of the digestive tract along with the glycogen reserves stored in liver cells, and after significant protein loss. After prolonged periods of starvation, the body uses the proteins within muscle tissue as a fuel source, which results in muscle mass loss.

Magnitude and composition

The magnitude and composition of the starvation response (i.e. metabolic adaptation) were estimated in a study of 8 individuals living in isolation in Biosphere 2 for two years. During their isolation, they gradually lost an average of 15% (range: 9–24%) of their body weight due to harsh conditions. On emerging from isolation, the eight isolated individuals were compared with a 152-person control group that initially had similar physical characteristics. On average, the starvation response of the individuals after isolation was a reduction in daily total energy expenditure. Part of the starvation response was explained by a reduction in fat-free mass and fat mass; an additional portion was explained by a reduction in fidgeting, and the remainder was statistically insignificant.

General

The energetic requirements of a body are composed of the basal metabolic rate (BMR) and the physical activity level (ERAT, exercise-related activity thermogenesis). This caloric requirement can be met with protein, fat, carbohydrates, or a mixture of those.
https://en.wikipedia.org/wiki/Primon%20gas
In mathematical physics, the primon gas or Riemann gas discovered by Bernard Julia is a model illustrating correspondences between number theory and methods in quantum field theory, statistical mechanics and dynamical systems such as the Lee–Yang theorem. It is a quantum field theory of a set of non-interacting particles, the primons; it is called a gas or a free model because the particles are non-interacting. The idea of the primon gas was independently discovered by Donald Spector. Later works by Ioannis Bakas and Mark Bowick, and by Spector, explored the connection of such systems to string theory.

The model

State space

Consider a Hilbert space H with an orthonormal basis of states $|p\rangle$ labelled by the prime numbers p. Second quantization gives a new Hilbert space K, the bosonic Fock space on H, where states describe collections of primes — which we can call primons if we think of them as analogous to particles in quantum field theory. This Fock space has an orthonormal basis given by finite multisets of primes. In other words, to specify one of these basis elements we can list the number $k_p$ of primons for each prime $p$: $|k_2, k_3, k_5, k_7, \ldots\rangle$, where the total $\sum_p k_p$ is finite. Since any positive natural number $n$ has a unique factorization into primes, $n = 2^{k_2} 3^{k_3} 5^{k_5} \cdots$, we can also denote the basis elements of the Fock space as simply $|n\rangle$, where $n = \prod_p p^{k_p}$. In short, the Fock space for primons has an orthonormal basis given by the positive natural numbers, but we think of each such number as a collection of primons: its prime factors, counted with multiplicity.

Identifying the Hamiltonian via the Koopman operator

Given the state $|n\rangle$, we may use the Koopman operator to lift dynamics from the space of states to the space of observables: where is an algorithm for integer factorisation, analogous to the discrete logarithm, and is the successor function. Thus, we have: A precise motivation for defining the Koopman operator is that it represents a global linearisation of , which views linear combinations of eigenstates as
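The standard payoff of this model (not reached in this excerpt) is that, with single-primon energies proportional to log p, the Fock state $|n\rangle$ carries Boltzmann weight $n^{-s}$, so the canonical partition function is the Riemann zeta function $\zeta(s)$. The short Python sketch below checks the equivalent sum-over-states and Euler-product forms numerically; the function names and truncation cutoffs are illustrative choices:

```python
from math import pi

def zeta_states(s, n_max=100_000):
    """Partition function as a sum over Fock states |n>: each state
    carries Boltzmann weight n**-s (energy proportional to log n)."""
    return sum(n ** -s for n in range(1, n_max + 1))

def zeta_primes(s, p_max=1000):
    """Same quantity as an Euler product over single-primon modes:
    each prime p contributes a bosonic-oscillator factor 1/(1 - p**-s)."""
    primes = [p for p in range(2, p_max + 1)
              if all(p % d for d in range(2, int(p ** 0.5) + 1))]
    prod = 1.0
    for p in primes:
        prod /= 1 - p ** -s
    return prod

# Both truncations approach the Riemann zeta function; zeta(2) = pi**2 / 6.
print(zeta_states(2), zeta_primes(2), pi ** 2 / 6)
```

The agreement of the two truncations is exactly the unique-factorization correspondence the State space section describes: summing over occupation numbers mode by mode reproduces the sum over all positive integers.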
https://en.wikipedia.org/wiki/The%20Tower%20of%20Cabin
The Tower of Cabin is an unusual spinoff of the fantasy role-playing video game series Xak by the Japanese developer MicroCabin. It was released just before MicroCabin decided to discontinue the development of Xak III for the MSX, and was also ported to the NEC PC-9801.

In this game, the player plays as a new MicroCabin employee who wanders the MicroCabin headquarters building in Japan with the goal of working with the various lead programmers and developers to make the company more successful. This is very similar to the Japanese game Segagaga (Dreamcast), in which the player runs the entire Sega company as the president of Sega Corporation. The game features many characters from the Xak series, such as Latok, Fray and Pixie. There were two unlockable mini-games in The Tower of Cabin: a short fighting game between Pixie and Fray, and a text adventure called Crusader that took place within the Xak universe.

Setting and story

The game is set in the MicroCabin office building in Japan. The player descends and ascends to various floors throughout the building while carrying out various business-related tasks.

Characters

Besides the main character, the player, there are several miscellaneous NPCs that the player can talk with in order to proceed through the game. In addition, a few characters from other MicroCabin games make cameo appearances:

Fray, the blue-haired sorceress of the Xak series and Fray in Magical Adventure, makes a cameo appearance.
Pixie, the green-haired faerie of the Xak series, makes a cameo appearance.
Latok, the main hero of the Xak series, makes a cameo appearance.

Gameplay

Most of the game is played by traveling the office building to various departments, relaying information between the employees and completing various business-related tasks in order to ensure the success of the MicroCabin company.

External links

XyZ: A Tribute to the Xak, Ys and Zelda Series Flame B
https://en.wikipedia.org/wiki/Soy%20protein
Soy protein is a protein that is isolated from the soybean. It is made from soybean meal that has been dehulled and defatted. Dehulled and defatted soybeans are processed into three kinds of high-protein commercial products: soy flour, concentrates, and isolates. Soy protein isolate has been used since 1959 in foods for its functional properties.

Soy protein is generally regarded as being concentrated in protein bodies, which are estimated to contain at least 60–70% of the total soybean protein. Upon germination of the soybean, the protein will be digested, and the released amino acids will be transported to locations of seedling growth. Soybeans contain a small but significant 2S albumin storage protein. Legume proteins, such as soy and pulses, belong to the globulin family of seed storage proteins called legumins and vicilins, or in the case of soybeans, glycinin and beta-conglycinin. Soybeans also contain biologically active or metabolic proteins, such as enzymes, trypsin inhibitors, hemagglutinins, and cysteine proteases very similar to papain. The soy cotyledon storage proteins, important for human nutrition, can be extracted most efficiently by water, water plus dilute alkali (pH 7–9), or aqueous solutions of sodium chloride (0.5–2 M ≈ 30–120 g/L) from dehulled and defatted soybeans that have undergone only a minimal heat treatment, so that the protein is close to being native or undenatured.

History

Soy protein has been available since 1936 for its functional properties. In that year, organic chemist Percy Lavon Julian designed the world's first plant for the isolation of industrial-grade soy protein, called alpha protein. The largest use of industrial-grade protein was, and still is, for paper coatings, in which it serves as a pigment binder. However, Julian's plant must have also been the source of the "soy protein isolate" which Ford's Robert Boyer and Frank Calvert spun into an artificial silk that was then tailored into that now famous "silk is soy" su