https://en.wikipedia.org/wiki/Garlic%20Is%20as%20Good%20as%20Ten%20Mothers
Garlic Is as Good as Ten Mothers is a 1980 documentary film about garlic directed by Les Blank. Its official premiere was at the 1980 Berlin Film Festival. Production It was filmed at the Gilroy Garlic Festival in Gilroy, California, as well as in other locations in Northern California. The director recommends that, when the film is shown, a toaster oven containing several heads of garlic be turned on in the rear of the theater, unbeknownst to the audience, with the intended result that approximately halfway through the showing the entire theater will be filled with the smell of garlic. The title is a shortened form of the saying "Garlic is as good as ten mothers... for keeping the girls away." Reception and legacy In 2004, the film was selected for preservation in the United States' National Film Registry by the Library of Congress as being "culturally, historically, or aesthetically significant." The Academy Film Archive preserved Garlic Is as Good as Ten Mothers in 1999. In Blank's 1982 film Burden of Dreams, a documentary chronicling the filming of Fitzcarraldo, director Werner Herzog and other crew members can be seen wearing "Garlic Is as Good as Ten Mothers" T-shirts. See also All in This Tea
https://en.wikipedia.org/wiki/Gauge%20principle
In physics, a gauge principle specifies a procedure for obtaining an interaction term from a free Lagrangian that is symmetric with respect to a continuous symmetry: localizing (or gauging) the global symmetry group must be accompanied by the inclusion of additional fields (such as the electromagnetic field), with appropriate kinetic and interaction terms in the action, in such a way that the extended Lagrangian is covariant with respect to a new extended group of local transformations. See also Gauge theory Gauge covariant derivative Gauge fixing Gauge gravitation theory Kaluza–Klein theory Lie algebra Lie group Lorenz gauge Quantum chromodynamics Quantum electrodynamics Quantum field theory Quantum gauge theory Standard Model Standard Model (mathematical formulation) Symmetry breaking Symmetry in physics Yang–Mills theory Yang–Mills existence and mass gap 1964 PRL symmetry breaking papers
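As a concrete illustration (the standard textbook U(1) example, not spelled out in the article above), gauging the global phase symmetry of the free Dirac Lagrangian yields quantum electrodynamics:

```latex
% Free Dirac Lagrangian, invariant under the global phase rotation
% \psi \to e^{i\alpha}\psi for constant \alpha:
\mathcal{L}_0 = \bar\psi \left( i\gamma^\mu \partial_\mu - m \right) \psi
% Promoting \alpha \to \alpha(x) breaks the invariance; it is restored by
% introducing a gauge field transforming as
% A_\mu \to A_\mu - \tfrac{1}{e}\,\partial_\mu \alpha(x)
% and replacing the derivative with the covariant derivative:
D_\mu = \partial_\mu + i e A_\mu
% Adding a kinetic term for the new field gives the extended Lagrangian,
% covariant under the local transformations:
\mathcal{L} = \bar\psi \left( i\gamma^\mu D_\mu - m \right) \psi
  - \tfrac{1}{4} F_{\mu\nu} F^{\mu\nu},
  \qquad F_{\mu\nu} = \partial_\mu A_\nu - \partial_\nu A_\mu
% The gauge principle thus forces the interaction term
% -e\,\bar\psi \gamma^\mu \psi\, A_\mu, i.e. the QED coupling.
```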
https://en.wikipedia.org/wiki/IBM%20JX
The IBM JX (or JXPC) was a personal computer released in 1984 into the Japanese, Australian and New Zealand markets. Designed in Japan, it was based on the technology of the IBM PCjr and was designated the IBM 5511. It was targeted in the Australasian market towards the public education sector rather than at consumers, and was sold in three levels: JX (64 KiB), JX2 (128 KiB) and JX3 (256 KiB). Upgrades were available to both 384 KiB and 512 KiB. The JX was the first IBM PC to use 3.5" floppy drives. IBM Japan expected to sell 200,000 units of the JX, but only 40,000 units were produced. The JX was discontinued in 1987, and IBM Japan gave 15,000 units of the JX to its employees in honor of the company's 50th anniversary. General The IBM JX's main difference from the PCjr was a professional keyboard (rather than the PCjr's disparaged chiclet keyboard), dual 3.5" floppy drives, as well as options for a 5.25" floppy drive and a hard drive, both of which sat atop the main unit. The JX did not support PCjr-like "sidecar" add-ons for hardware expansion. In common with the PCjr, however, it had no DMA controller. It also supported the otherwise unique-in-the-IBM-PC-world ECGA (Enhanced Color Graphics Adapter—16 simultaneous colors, but only at 320×200 resolution) and the PCjr's 4-channel sound. Support for these two features was utilised by only a handful of software developers—Sierra On-line being the most well-known. Configuration It had several innovative features:
Single or twin 3.5" 720 KB (initially only 360 KB) diskette drives
Wireless infra-red keyboard
16-color video output
Stackable expansion
Joystick ports
Cartridge slots
In Japan, both white and dark gray units were available, but elsewhere all IBM JXs were dark gray—very unusual in the days of the standard color of IBM "beige boxes". All models sold in Japan have a Japanese font stored on 128 KB of ROM, but the basic system only has the capability to display 40×11 Japanese text. The Extended Display Cartridg
https://en.wikipedia.org/wiki/Bone%20marrow
Bone marrow is a semi-solid tissue found within the spongy (also known as cancellous) portions of bones. In birds and mammals, bone marrow is the primary site of new blood cell production (or haematopoiesis). It is composed of hematopoietic cells, marrow adipose tissue, and supportive stromal cells. In adult humans, bone marrow is primarily located in the ribs, vertebrae, sternum, and bones of the pelvis. Bone marrow comprises approximately 5% of total body mass in healthy adult humans, such that a man weighing 73 kg (161 lbs) will have around 3.7 kg (8 lbs) of bone marrow. Human marrow produces approximately 500 billion blood cells per day, which join the systemic circulation via permeable vasculature sinusoids within the medullary cavity. All types of hematopoietic cells, including both myeloid and lymphoid lineages, are created in bone marrow; however, lymphoid cells must migrate to other lymphoid organs (e.g. thymus) in order to complete maturation. Bone marrow transplants can be conducted to treat severe diseases of the bone marrow, including certain forms of cancer such as leukemia. Several types of stem cells are related to bone marrow. Hematopoietic stem cells in the bone marrow can give rise to hematopoietic lineage cells, and mesenchymal stem cells, which can be isolated from the primary culture of bone marrow stroma, can give rise to bone, adipose, and cartilage tissue. Structure The composition of marrow is dynamic, as the mixture of cellular and non-cellular components (connective tissue) shifts with age and in response to systemic factors. In humans, marrow is colloquially characterized as "red" or "yellow" marrow depending on the prevalence of hematopoietic cells vs fat cells. While the precise mechanisms underlying marrow regulation are not understood, compositional changes occur according to stereotypical patterns. For example, a newborn baby's bones exclusively contain hematopoietically active "red" marrow, and there is a pro
https://en.wikipedia.org/wiki/Meet-in-the-middle%20attack
The meet-in-the-middle attack (MITM), a known plaintext attack, is a generic space–time tradeoff cryptographic attack against encryption schemes that rely on performing multiple encryption operations in sequence. The MITM attack is the primary reason why Double DES is not used and why a Triple DES key (168-bit) can be brute-forced by an attacker with 2^56 space and 2^112 operations. Description When trying to improve the security of a block cipher, a tempting idea is to encrypt the data several times using multiple keys. One might think this doubles or even n-tuples the security of the multiple-encryption scheme, depending on the number of times the data is encrypted, because an exhaustive search on all possible combinations of keys (simple brute-force) would take 2^(n·k) attempts if the data is encrypted with k-bit keys n times. The MITM is a generic attack which weakens the security benefits of using multiple encryptions by storing intermediate values from the encryptions or decryptions and using those to improve the time required to brute force the decryption keys. This makes a meet-in-the-middle attack (MITM) a generic space–time tradeoff cryptographic attack. The MITM attack attempts to find the keys by using both the range (ciphertext) and domain (plaintext) of the composition of several functions (or block ciphers) such that the forward mapping through the first functions is the same as the backward mapping (inverse image) through the last functions, quite literally meeting in the middle of the composed function. For example, although Double DES encrypts the data with two different 56-bit keys, Double DES can be broken with 2^57 encryption and decryption operations. The multidimensional MITM (MD-MITM) uses a combination of several simultaneous MITM attacks as described above, where the meeting happens in multiple positions in the composed function. History Diffie and Hellman first proposed the meet-in-the-middle attack on a hypothetical expansion of a bloc
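The attack is easy to demonstrate on a toy cipher. The sketch below is ours: the 16-bit-block, 12-bit-key keyed permutation is invented for illustration and has nothing to do with DES. It recovers a double-encryption key pair in about 2·2^12 cipher operations instead of the 2^24 a naive brute force would need:

```python
# Toy meet-in-the-middle attack on double encryption. The "cipher" is a
# made-up 12-bit-key, 16-bit-block keyed permutation, for illustration only.
M = 1 << 16                    # block space
KEYS = 1 << 12                 # key space (12-bit keys)
INV3 = pow(3, -1, M)           # multiplicative inverse of 3 mod 2^16

def enc(k, x):
    """Toy keyed permutation: XOR the key in, then multiply by 3 mod 2^16."""
    return ((x ^ k) * 3) % M

def dec(k, y):
    """Inverse of enc."""
    return ((y * INV3) % M) ^ k

def double_enc(k1, k2, x):
    return enc(k2, enc(k1, x))

def mitm(p1, c1, p2, c2):
    """Recover (k1, k2) from two known plaintext/ciphertext pairs.

    Naive brute force costs KEYS**2 = 2^24 trial encryptions; meeting in
    the middle costs about 2*KEYS cipher operations plus KEYS stored table
    entries -- the space-time tradeoff described in the article."""
    forward = {enc(k1, p1): k1 for k1 in range(KEYS)}   # store forward half
    matches = []
    for k2 in range(KEYS):
        k1 = forward.get(dec(k2, c1))                   # meet in the middle
        if k1 is not None and double_enc(k1, k2, p2) == c2:
            matches.append((k1, k2))                    # filter on 2nd pair
    return matches

k1, k2 = 0xABC, 0x123
p1, p2 = 0x5555, 0x0F0F
print(mitm(p1, double_enc(k1, k2, p1), p2, double_enc(k1, k2, p2)))
# -> [(2748, 291)], i.e. (0xABC, 0x123); typically the only survivor
```

The second known pair is used to discard accidental middle collisions, mirroring how real MITM attacks verify candidate key pairs against additional plaintext/ciphertext material.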
https://en.wikipedia.org/wiki/Visual%20neuroscience
Visual neuroscience is a branch of neuroscience that focuses on the visual system of the human body, mainly located in the brain's visual cortex. The main goal of visual neuroscience is to understand how neural activity results in visual perception, as well as behaviors dependent on vision. In the past, visual neuroscience has focused primarily on how the brain (and in particular the visual cortex) responds to light rays projected from static images onto the retina. While this provides a reasonable explanation for the visual perception of a static image, it does not provide an accurate explanation for how we perceive the world as it really is: an ever-changing, ever-moving 3-D environment. The topics summarized below are representative of this area, but far from exhaustive. For a broader, less topic-specific treatment of the computational link between neural activity and visual perception and behavior, see the textbook Understanding Vision: Theory, Models, and Data (Oxford University Press, 2014). Face processing A recent study using event-related potentials (ERPs) linked increased neural activity in the occipito-temporal region of the brain to the visual categorization of facial expressions. Results focus on a negative peak in the ERP that occurs 170 milliseconds after stimulus onset. This component, called the N170, was measured using electrodes over the occipito-temporal region, an area already known to be modulated by face stimuli. EEG and ERP methods allow for an extremely high temporal resolution of 4 milliseconds, which makes these kinds of experiments extremely well suited for accurately estimating and comparing the time it takes the brain to perform a certain function. Scientists used classification image techniques to determine what parts of complex visual stimuli (such as a face) are relied on when patients are asked to assign them to a category or emotion. They computed the important feature
https://en.wikipedia.org/wiki/International%20Framework%20for%20Nuclear%20Energy%20Cooperation
The International Framework for Nuclear Energy Cooperation (IFNEC) is a forum of states and organizations that share a common vision of the safe and secure development of nuclear energy for worldwide purposes. Formerly the Global Nuclear Energy Partnership (GNEP), IFNEC began as a U.S. proposal, announced by United States Secretary of Energy Samuel Bodman on February 6, 2006, to form an international partnership to promote the use of nuclear power and close the nuclear fuel cycle in a way that reduces nuclear waste and the risk of nuclear proliferation. This proposal would divide the world into "fuel supplier nations," which supply enriched uranium fuel and take back spent fuel, and "user nations," which operate nuclear power plants. As GNEP, the proposal proved controversial in the United States and internationally. The U.S. Congress provided far less funding for GNEP than President George W. Bush requested. U.S. arms control organizations criticized the proposal to resume reprocessing as costly and increasing proliferation risks. Some countries and analysts criticized the GNEP proposal for discriminating between countries as nuclear fuel cycle "haves" and "have-nots." In April 2009 the U.S. Department of Energy announced the cancellation of the U.S. domestic component of GNEP. In 2010, GNEP was renamed the International Framework for Nuclear Energy Cooperation. IFNEC is now an international partnership with 34 participant and 31 observer countries, and three international organization observers: the International Atomic Energy Agency, the Generation IV International Forum, and the European Atomic Energy Community. Since 2015, the Nuclear Energy Agency has provided Technical Secretariat support. IFNEC operates by consensus among its partners based on an agreed GNEP Statement of Mission. GNEP in the United States The GNEP proposal began as part of the Advanced Energy Initiative announced by President Bush in his 200
https://en.wikipedia.org/wiki/ICTP%20Prize
The International Centre for Theoretical Physics Prize (ICTP Prize) was created in 1982. Worth 3,000 euros, it is awarded to a young (under 40) physicist or mathematician from a developing country to promote theoretical mathematics and physics research in the developing world. Awardees include notable scientists from many continents. Winners
2022 Shant Baghram and Mohammad Hossein Namjoo
2021 Rondrotiana Barimalala and Narendra Ojha
2020 Dibyendu Roy and Mehdi Kargarian
2019 Basudeb Dasgupta and Suvrat Raju
2018 Luis E. F. Foa Torres and Hongjun Xiang
2017 Emilio Kropff
2016 Aninda Sinha
2015 Aijun Ding and Vijayakumar S. Nair
2014 Pablo Cornaglia
2013 Yasaman Farzan and Patchanita Thamyongkit
2012 Pablo Mininni
2011 Ado Jorio
2010 Shiraz Minwalla
2009 Marcelo Barreiro
2008 Abhishek Dhar and Zhong Fang
2007 M.M. Sheikh-Jabbari
2005 Xiaohua Zhu
2004 B. Gabriel Mindlin
2003 Manindra Agrawal
2002 Mohit Randeria
2001 Soo-Jong Rey
2000 Sheng-Li Tan and T. N. Venkataramana
1999 Daniel Domínguez
1998 Anamaría Font and Fernando Quevedo
1997 Nitin Nitsure
1996 A. M. Jayannavar
1995 Spenta R. Wadia
1994 Chao-Jiang Xu
1993 Deepak Dhar
1992 Elcio Abdalla
1991 Hong Van Le
1990 José L. Morán-López
1989 Ashoke Sen
1988 José Onuchic
1987 Abdullah Sadiq
1986 Li Jia Ming
1985 Chike Obi
1984 Ricardo Galvão
1983 Ganapathy Baskaran
Source: ICTP Prize Official website See also Abdus Salam International Centre for Theoretical Physics (ICTP)
https://en.wikipedia.org/wiki/Tacet
Tacet is Latin which translates literally into English as "(it) is silent". It is a musical term to indicate that an instrument or voice does not sound, also known as a rest. In vocal polyphony and in orchestral scores, it usually indicates a long period of time, typically an entire movement. In more modern music such as jazz, tacet tends to mark considerably shorter breaks. Multirests, or multiple-measure rests, are rests which last multiple measures (or multiple rests, each of which lasts an entire measure). It was common for early symphonies to leave out the brass or percussion in certain movements, especially in slow (second) movements, and this is the instruction given in the parts for the player to wait until the end of the movement. It is also commonly used in accompaniment music to indicate that the instrument does not play on a certain run through a portion of the music, e.g. "Tacet 1st time." The phrase tacet al fine is used to indicate that the performer should remain silent for the remainder of the piece (or portion thereof), and need not, for example, count rests. Tacet may be appropriate when a particular instrument/voice/section "is to rest for an entire section, movement, or composition." "Partial rests, of course, in every case must be written in. Even though it means 'silent,' the term tacet...is not a wise substitution for a lengthy rest within a movement...The term tacet, therefore, should be used only to indicate that a player rests throughout an entire movement or composition." "N.C." ("no chord") is often used in guitar tablature or chord charts to indicate tacets, rests, or caesuras in the accompaniment. Uses of tacet The earliest known usage of the term is 1724. A unique usage of this term is in John Cage's 1952 composition 4′33″. Tacet is indicated for all three movements, for all instruments. The piece's first performance lasted a total of 4 minutes and 33 seconds, without a note being played. See also Latin influence in English
https://en.wikipedia.org/wiki/CACNG3
Voltage-dependent calcium channel gamma-3 subunit is a protein that in humans is encoded by the CACNG3 gene. L-type calcium channels are composed of five subunits. The protein encoded by this gene represents one of these subunits, gamma, and is one of several gamma subunit proteins. It is an integral membrane protein that is thought to stabilize the calcium channel in an inactive (closed) state. This protein is similar to the mouse stargazin protein, mutations in which have been associated with absence seizures, also known as petit-mal or spike-wave seizures. This gene is a member of the neuronal calcium channel gamma subunit gene subfamily of the PMP-22/EMP/MP20 family. This gene is a candidate gene for a familial infantile convulsive disorder with paroxysmal choreoathetosis. See also Voltage-dependent calcium channel
https://en.wikipedia.org/wiki/Mauthner%20cell
The Mauthner cells are a pair of big and easily identifiable neurons (one for each half of the body) located in the rhombomere 4 of the hindbrain in fish and amphibians that are responsible for a very fast escape reflex (in the majority of animals – a so-called C-start response). The cells are also notable for their unusual use of both chemical and electrical synapses. Evolutionary history Mauthner cells first appear in lampreys (being absent in hagfish and lancelets), and are present in virtually all teleost fish, as well as in amphibians (including postmetamorphic frogs and toads). Some fish, such as lumpsuckers, seem to have lost the Mauthner cells however. Role in behavior The C-start A C-start is a type of a very quick startle or escape reflex that is employed by fish and amphibians (including larval frogs and toads). There are two sequential stages in the C-start: first, the head rotates about the center of mass towards the direction of future escape, and the body of the animal exhibits a curvature that resembles a letter C; then, at the second stage, the animal is propelled forward. The duration of these stages varies from species to species from about 10 to 20 ms for the first stage, and from 20 to 30 ms for the second. In fish this forward propulsion does not require contraction of the antagonistic muscle, but results from the body stiffness and the hydrodynamic resistance of the tail. When an antagonistic muscular contraction does occur during stage 2, the fish rotates in the opposite direction, producing a counter-turn, and a directional change. The role of the Mauthner cell in the C-start behavior In cases when an abrupt acoustic, tactile or visual stimulus elicits a single action potential in one M-cell, it always correlates with a contralateral C-start escape. An extremely quick mutual feedback inhibitory circuit then assures that only one M-cell reaches spiking threshold—as the C-start has to be unilateral by definition—and that only one action
https://en.wikipedia.org/wiki/Analysis%20of%20similarities
Analysis of similarities (ANOSIM) is a non-parametric statistical test widely used in the field of ecology. The test was first suggested by K. R. Clarke as an ANOVA-like test which, instead of operating on raw data, operates on a ranked dissimilarity matrix. Given a matrix of rank dissimilarities between a set of samples, each belonging to a single site (e.g. a single treatment group), the ANOSIM tests whether we can reject the null hypothesis that the similarity between sites is greater than or equal to the similarity within each site. The test statistic R is calculated in the following way: R = (B − W) / (M/2), where B is the average of rank dissimilarities of pairs of samples (or replicates) originating from different sites, W is the average of rank dissimilarities of pairs among replicates within sites, and M = n(n − 1)/2 where n is the number of samples. The test statistic R is constrained between the values −1 and 1, where positive numbers suggest more similarity within sites and values close to zero represent no difference between sites and within sites similarities. Negative R values suggest more similarity between sites than within sites and may raise the possibility of wrong assignment of samples to sites. For the purpose of hypothesis testing, where the null hypothesis is that the similarities within sites are smaller or equal to the similarities between sites, the R statistic is usually compared to a set of R′ values that are achieved by means of randomly shuffling site labels between the samples and calculating the resulting R′, repeated many times. The fraction of permutation-derived R′ values that are greater than or equal to the actual R is the p-value of the R statistic. Ranking of dissimilarity in ANOSIM and NMDS (non-metric multidimensional scaling) go hand in hand. Combining both methods complements visualisation of group differences along with significance testing. ANOSIM is implemented in several statistical software packages including PRIMER, the R vegan package and PAST.
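The R statistic and the permutation test are compact enough to implement directly. Below is a minimal sketch in Python/NumPy; the function and variable names are ours, ties are handled crudely, and in practice one would use an existing implementation such as those in PRIMER, the vegan package, or PAST mentioned above:

```python
# Minimal ANOSIM sketch: R = (B - W) / (M/2) plus a label-permutation test.
import numpy as np

def anosim(dist, labels, n_perm=999, seed=0):
    """dist: (n, n) symmetric dissimilarity matrix; labels: (n,) site ids.
    Returns (R, p-value). Ties get arbitrary ranks here; real
    implementations use midranks."""
    labels = np.asarray(labels)
    n = len(labels)
    iu = np.triu_indices(n, k=1)                    # all n(n-1)/2 pairs
    ranks = np.argsort(np.argsort(dist[iu])) + 1.0  # rank dissimilarities
    M = n * (n - 1) / 2

    def r_stat(lab):
        within = lab[iu[0]] == lab[iu[1]]           # pair from same site?
        W = ranks[within].mean()                    # mean rank within sites
        B = ranks[~within].mean()                   # mean rank between sites
        return (B - W) / (M / 2)

    rng = np.random.default_rng(seed)
    R = r_stat(labels)
    perms = np.array([r_stat(rng.permutation(labels)) for _ in range(n_perm)])
    p = (np.sum(perms >= R) + 1) / (n_perm + 1)     # one-sided p-value
    return R, p

if __name__ == "__main__":
    # toy usage: two clearly separated groups of 4 samples each
    rng = np.random.default_rng(1)
    pts = np.vstack([rng.normal(0, 1, (4, 2)), rng.normal(3, 1, (4, 2))])
    d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    print(anosim(d, np.array([0, 0, 0, 0, 1, 1, 1, 1])))
```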
https://en.wikipedia.org/wiki/Federal%20Networking%20Council
Informally established in the early 1990s, the Federal Networking Council (FNC) was later chartered by the US National Science and Technology Council's Committee on Computing, Information and Communications (CCIC). It continued to act as a forum for networking collaborations among US federal agencies, helping them meet their research, education, and operational mission goals and bridging the gap between the advanced networking technologies being developed by FNC research agencies and the ultimate acquisition of mature versions of these technologies from the commercial sector. The FNC consisted of a group made up of representatives from the United States Department of Defense (DoD), the National Science Foundation, the Department of Energy, and the National Aeronautics and Space Administration (NASA), among others. By October 1997, the FNC advisory committee was de-chartered and many of the FNC activities were transferred to the Large Scale Networking group of the Computing, Information, and Communications (CIC) R&D subcommittee of the Networking and Information Technology Research and Development program, or the Applications Council. On October 24, 1995, the Federal Networking Council passed a resolution defining the term Internet: Resolution: The Federal Networking Council (FNC) agrees that the following language reflects our definition of the term "Internet". "Internet" refers to the global information system that (i) is logically linked together by a globally unique address space based on the Internet Protocol (IP) or its subsequent extensions/follow-ons; (ii) is able to support communications using the Transmission Control Protocol/Internet Protocol (TCP/IP) suite or its subsequent extensions/follow-ons, and/or other IP-compatible protocols; and (iii) provides, uses or makes accessible, either publicly or privately, high level services layered on the communications and related infrastructure described herein. Some notable members of the council advisory committee i
https://en.wikipedia.org/wiki/Paleoflora%20of%20the%20Eocene%20Okanagan%20Highlands
The paleoflora of the Eocene Okanagan Highlands includes all plant and fungi fossils preserved in the Eocene Okanagan Highlands Lagerstätten. The highlands are a series of Early Eocene geological formations which span a transect of British Columbia, Canada, and Washington state, United States, and are known for the diverse and detailed plant fossils which represent an upland temperate ecosystem immediately after the Paleocene–Eocene thermal maximum, and before the increased cooling of the middle and late Eocene to Oligocene. The fossiliferous deposits of the region were noted as early as 1873, with small amounts of systematic work happening in the 1880s and 1890s on British Columbian sites, and in the 1920s and 1930s for Washington sites. A renewed focus and more detailed descriptive work on the Okanagan Highlands sites began in the 1970s. The noted richness of agricultural plant families in the Republic and Princeton floras resulted in the term "Eocene orchards" being used for the paleofloras. Paleoflora The Eocene Okanagan Highlands hosted a diverse mix of temperate and tropical paleobiotic elements, with the forests having the first significant proportions of temperate plants in North America. The paleobotanical community was a mixture of plants found in subtropical evergreen and temperate deciduous forests. Included in the forest were a number of important modern temperate flowering plant families such as Betulaceae, Rosaceae, and Sapindaceae, plus the conifer family Pinaceae. Study of the deciduous plants from the highlands has documented the occurrence of heteromorphic leaves derived from sun versus shade conditions and long shoot or short shoot buds. The paleobotanical community of the Republic area has been noted as the most diverse floral community of the Okanagan Highlands, with some estimates ranging to over 68 families and 134 genera being present. The noted richness of Rosaceae fossils along with other important agricultural plant families found in the Republic and P
https://en.wikipedia.org/wiki/TRPM1
Transient receptor potential cation channel subfamily M member 1 is a protein that in humans is encoded by the TRPM1 gene. Function The protein encoded by this gene is a member of the transient receptor potential (TRP) family of non-selective cation channels. It is expressed in the retina, in a subset of bipolar cells termed ON bipolar cells. These cells form synapses with either rods or cones, collecting signals from them. In the dark, the signal arrives in the form of the neurotransmitter glutamate, which is detected by a G protein-coupled receptor (GPCR) signal transduction cascade. Detection of glutamate by the GPCR Metabotropic glutamate receptor 6 results in closing of the TRPM1 channel. At the onset of light, glutamate release is halted and mGluR6 is deactivated; this results in opening of the TRPM1 channel, influx of sodium and calcium, and depolarization of the bipolar cell. In addition to the retina, TRPM1 is also expressed in melanocytes, which are melanin-producing cells in the skin. The expression of TRPM1 is inversely correlated with melanoma aggressiveness, suggesting that it might suppress melanoma metastasis. However, subsequent work showed that a microRNA located in an intron of the TRPM1 gene, rather than the TRPM1 protein itself, is responsible for the tumor suppressor function. The expression of both TRPM1 and the microRNA are regulated by the Microphthalmia-associated transcription factor. Clinical significance Mutations in TRPM1 are associated with congenital stationary night blindness in humans and coat spotting patterns in Appaloosa horses. See also TRPM
https://en.wikipedia.org/wiki/Sonifi%20Solutions
SONIFI Solutions, Inc. is a provider of guest-facing entertainment technology in hotels and healthcare settings. The company's corporate headquarters are in Sioux Falls, South Dakota, with offices in Los Angeles, California, Mexico and Canada. SONIFI has over 650 employees. SONIFI's primary customer base is in the United States, but they also deliver services in Canada, Mexico and over 30 other countries through relationships with local licensees. History SONIFI Solutions was founded in 1980 as Satellite Movie Company. The company was renamed LodgeNet Entertainment Corporation in 1991 and became a publicly traded corporation in 1993. LodgeNet purchased The Hotel Networks, On Command, and Stay Online corporations in 2006 and 2007, and changed its name to LodgeNet Interactive Corporation in 2008. In late 1993, LodgeNet launched its on-demand hospitality service, including worldwide delivery of Super NES games to hotel guests via its proprietary building-wide networks. Through its partnership with Nintendo, LodgeNet reported the system being installed in 200,000 hotel guest rooms by April 1996, and 530,000 guest rooms by mid-1999. On June 16, 1998, Nintendo and LodgeNet entered a 10-year licensing agreement for an "aggressive" upgrade to add Nintendo 64 support to their existing 500,000 Super NES equipped guest room installations. LodgeNet reported that in the five years to date, the system had "caused Nintendo to become the most successful new product rollout in the history of the hotel pay-per-view industry". LodgeNet reported that within the middle of 1998 alone, 35 million hotel guests encountered the Nintendo name as an integral amenity, and it reported sales of more than 54 million minutes of Nintendo-based gameplay. On June 10, 1999, LodgeNet and Nintendo began expanding and upgrading their existing Super NES buildout to include Nin
https://en.wikipedia.org/wiki/Ombient
Ombient is the moniker under which Mike Hunter performs his completely improvised ambient/drone music. Being live and improvisational, the music reflects the feeling of the moment in which it is performed and the subtle feedback between the audience and the performer. It features amplified guitar that is processed and layered using digital looping equipment, enabling a single guitar to produce symphonic levels of density. In the last three years, Ombient has delved deeply into the world of analog synthesis and, more recently, analog modular synthesis. Hunter also plays in the progressive/world/ambient/space rock band Brainstatik, and is a fractal artist. He also hosts the long-running FM radio program "Music With Space" on WPRB 103.3 FM in Princeton.
https://en.wikipedia.org/wiki/Eric%20Friedlander
Eric Mark Friedlander (born January 7, 1944, in Santurce, Puerto Rico) is an American mathematician working in algebraic topology, algebraic geometry, algebraic K-theory and representation theory. Friedlander graduated from Swarthmore College with a bachelor's degree in 1965, and in 1970 received a Ph.D. from the Massachusetts Institute of Technology under the supervision of Michael Artin, with the thesis Fibrations in Étale Homotopy Theory. He was a postdoctoral instructor at Princeton University, becoming a lecturer in 1971 and an assistant professor in 1972. From 1973 to 1974, he was, through a US exchange program, in France, in particular at the Institut des Hautes Études Scientifiques. In 1975, he became an associate professor and in 1980 a professor at Northwestern University, where he was chairman of the mathematics department from 1987 to 1990 and from 1999 to 2003. In 1999, he became Henry S. Noyes Professor of Mathematics. As of 2008, he is Dean's Professor at the University of Southern California. In 1981 and from 1985 to 1986, he was at the Institute for Advanced Study in Princeton, New Jersey. He received the Humboldt Research Award while at the University of Heidelberg from 1996 to 1998. He was also a visiting scholar and visiting professor at ETH Zurich, at the Max Planck Institute for Mathematics in Bonn, at the Mathematical Sciences Research Institute, in Oxford, Cambridge and Paris, at Brown University, the Hebrew University, and at the Institut Henri Poincaré. Since 2000, he has been on the Board of Trustees of the American Mathematical Society. Friedlander is a co-editor of the Journal of Pure and Applied Algebra. In 1998, he was an invited speaker at the International Congress of Mathematicians in Berlin (Geometry of infinitesimal group schemes). In 2012 he became a fellow of the American Mathematical Society. Friedlander is married to another mathematician, Susan Friedlander. Among his students is David A. Cox. Works With Andrei Suslin and Vladimir Voevods
https://en.wikipedia.org/wiki/Transversality%20%28mathematics%29
In mathematics, transversality is a notion that describes how spaces can intersect; transversality can be seen as the "opposite" of tangency, and plays a role in general position. It formalizes the idea of a generic intersection in differential topology. It is defined by considering the linearizations of the intersecting spaces at the points of intersection. Definition Two submanifolds of a given finite-dimensional smooth manifold are said to intersect transversally if at every point of intersection, their separate tangent spaces at that point together generate the tangent space of the ambient manifold at that point. Manifolds that do not intersect are vacuously transverse. If the manifolds are of complementary dimension (i.e., their dimensions add up to the dimension of the ambient space), the condition means that the tangent space to the ambient manifold is the direct sum of the two smaller tangent spaces. If an intersection is transverse, then the intersection will be a submanifold whose codimension is equal to the sum of the codimensions of the two manifolds. In the absence of the transversality condition the intersection may fail to be a submanifold, having some sort of singular point. In particular, this means that transverse submanifolds of complementary dimension intersect in isolated points (i.e., a 0-manifold). If both submanifolds and the ambient manifold are oriented, their intersection is oriented. When the intersection is zero-dimensional, the orientation is simply a plus or minus for each point. One notation for the transverse intersection of two submanifolds A and B of a given manifold M is A ⋔ B. This notation can be read in two ways: either as "A and B intersect transversally" or as an alternative notation for the set-theoretic intersection A ∩ B of A and B when that intersection is transverse. In this notation, the definition of transversality reads: A ⋔ B if and only if, for every point p ∈ A ∩ B, the tangent spaces satisfy T_pA + T_pB = T_pM. Transversality of maps The notion of transversality of a pair of submanifolds is easily extended to tran
https://en.wikipedia.org/wiki/Scoring%20rule
In decision theory, a scoring rule provides a summary measure for the evaluation of probabilistic predictions or forecasts. It is applicable to tasks in which predictions assign probabilities to events, i.e. one issues a probability distribution as prediction. This includes probabilistic classification of a set of mutually exclusive outcomes or classes. On the other side, a scoring function provides a summary measure for the evaluation of point predictions, i.e. one predicts a property or functional T(F), like the expectation or the median. Scoring rules and scoring functions can be thought of as "cost functions" or "loss functions". They are evaluated as the empirical mean over a given sample, simply called the score. Scores of different predictions or models can then be compared to conclude which model is best. If a cost is levied in proportion to a proper scoring rule, the minimal expected cost corresponds to reporting the true set of probabilities. Proper scoring rules are used in meteorology, finance, and pattern classification where a forecaster or algorithm will attempt to minimize the average score to yield refined, calibrated probabilities (i.e. accurate probabilities). Motivation Since the metrics in Evaluation of binary classifiers do not evaluate calibration, scoring rules which can do so are needed. These scoring rules can be used as loss functions in empirical risk minimization. Definition Consider a sample space Ω, a σ-algebra A of subsets of Ω, and a convex class F of probability measures on (Ω, A). A function defined on Ω and taking values in the extended real line [−∞, ∞] is F-quasi-integrable if it is measurable with respect to A and is quasi-integrable with respect to all F ∈ F. Probabilistic forecast A probabilistic forecast is any probability measure F ∈ F. Scoring rule A scoring rule is any extended real-valued function S : F × Ω → [−∞, ∞] such that S(F, ·) is F-quasi-integrable for all F ∈ F. S(F, y) represents the loss or penalty when the forecast F is issued and the observation y materializes. Point fore
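To make propriety concrete, here is a minimal sketch of ours (function names invented for illustration) using the Brier score, a standard strictly proper scoring rule for a binary event: the expected score is minimized exactly when the reported probability equals the true event probability.

```python
# The Brier score as a strictly proper scoring rule for a binary event.
import numpy as np

def brier(forecast_p, outcome):
    """Quadratic loss for a probability forecast of a 0/1 outcome."""
    return (forecast_p - outcome) ** 2

def expected_brier(forecast_p, true_p):
    """Expected loss when the event truly occurs with probability true_p."""
    return true_p * brier(forecast_p, 1) + (1 - true_p) * brier(forecast_p, 0)

true_p = 0.3
grid = np.linspace(0, 1, 101)
best = grid[np.argmin([expected_brier(q, true_p) for q in grid])]
print(best)  # 0.3 -- honest reporting minimizes the expected score
```

Analytically, expected_brier(q, p) = p(1 − q)² + (1 − p)q² is minimized at q = p, which is exactly the "minimal expected cost corresponds to reporting the true probabilities" property described above.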
https://en.wikipedia.org/wiki/Kingsoft
Kingsoft Corporation is a Chinese software company based in Beijing. Kingsoft operates four subsidiaries: Seasun for video game development, Cheetah Mobile for mobile internet apps, Kingsoft Cloud for cloud storage platforms, and WPS for office software, including WPS Office. It also produced security software known as Kingsoft Security. The most popular game developed by Kingsoft is JX Online 3, launched in 2009. Kingsoft owns data centers in mainland China, Hong Kong, Russia, Southeast Asia, and North America. The company is listed on the Hong Kong Stock Exchange. History The company was founded in 1988 by Qiu Bojun. In 2011, Bojun sold his 15.68% stake in Kingsoft to Tencent.
https://en.wikipedia.org/wiki/Honey%20encryption
Honey encryption is a type of data encryption that "produces a ciphertext, which, when decrypted with an incorrect key as guessed by the attacker, presents a plausible-looking yet incorrect plaintext password or encryption key." Creators Ari Juels and Thomas Ristenpart of the University of Wisconsin, the developers of the encryption system, presented a paper on honey encryption at the 2014 Eurocrypt cryptography conference. Method of protection A brute-force attack involves repeated decryption with random keys; this is equivalent to picking random plaintexts from the space of all possible plaintexts with a uniform distribution. This is effective because even though the attacker is equally likely to see any given plaintext, most plaintexts are extremely unlikely to be legitimate i.e. the distribution of legitimate plaintexts is non-uniform. Honey encryption defeats such attacks by first transforming the plaintext into a space such that the distribution of legitimate plaintexts is uniform. Thus an attacker guessing keys will see legitimate-looking plaintexts frequently and random-looking plaintexts infrequently. This makes it difficult to determine when the correct key has been guessed. In effect, honey encryption "[serves] up fake data in response to every incorrect guess of the password or encryption key." The security of honey encryption relies on the fact that the probability of an attacker judging a plaintext to be legitimate can be calculated (by the encrypting party) at the time of encryption. This makes honey encryption difficult to apply in certain applications e.g. where the space of plaintexts is very large or the distribution of plaintexts is unknown. It also means that honey encryption can be vulnerable to brute-force attacks if this probability is miscalculated. For example, it is vulnerable to known-plaintext attacks: if the attacker has a crib that a plaintext must match to be legitimate, they will be able to brute-force even Honey Encrypted data i
https://en.wikipedia.org/wiki/Server-Gated%20Cryptography
Server-Gated Cryptography (SGC), also known as International Step-Up by Netscape, is a defunct mechanism that was used to step up from 40-bit or 56-bit to 128-bit cipher suites with SSL. It was created in response to United States federal legislation on the export of strong cryptography in the 1990s. The legislation had limited encryption to weak algorithms and shorter key lengths in software exported outside of the United States of America. When the legislation added an exception for financial transactions, SGC was created as an extension to SSL with the certificates being restricted to financial organisations. In 1999, this list was expanded to include online merchants, healthcare organizations, and insurance companies. This legislation changed in January 2000, resulting in vendors no longer shipping export-grade browsers and SGC certificates becoming available without restriction. Internet Explorer supported SGC starting with patched versions of Internet Explorer 3. SGC became obsolete when Internet Explorer 5.01 SP1 and Internet Explorer 5.5 started supporting strong encryption without the need for a separate high encryption pack (except on Windows 2000, which needs its own high encryption pack that was included in Service Pack 2 and later). "Export-grade" browsers are unusable on the modern Web due to many servers disabling export cipher suites. Additionally, these browsers are incapable of using SHA-2 family signature hash algorithms like SHA-256. Certification authorities are trying to phase out the new issuance of certificates with the older SHA-1 signature hash algorithm. The continuing use of SGC facilitates the use of obsolete, insecure Web browsers with HTTPS. However, while certificates that use the SHA-1 signature hash algorithm remain available, some certificate authorities continue to issue SGC certificates (often charging a premium for them) although they are obsolete. The reason certificate authorities can charge a premium for SGC certificates
https://en.wikipedia.org/wiki/Elevation%20%28ballistics%29
In ballistics, the elevation is the angle between the horizontal plane and the axial direction of the barrel of a gun, mortar or heavy artillery. Originally, elevation was a linear measure of how high the gunners had to physically lift the muzzle of a gun up from the gun carriage to compensate for projectile drop and hit targets at a certain distance. Until WWI Though early 20th-century firearms were relatively easy to fire, artillery was not. Before and during World War I, the only way to fire artillery effectively was to plot points on a plane. Artillery units seldom employed their guns in small numbers; instead of pin-point artillery firing, they used the older method of massed "fire for effect", a tactic employed successfully by past armies. By World War I, reasonably accurate artillery fire was possible even at long range, requiring significant elevation. However, artillery tactics from previous wars were carried over, and still had similar success where great accuracy was not required. Large warships such as battleships carried large-caliber guns that needed to be elevated above the direct point of aim to fire accurately at small targets at long range. From WWII As time passed, more accurate artillery guns were developed in a range of sizes. Some small artillery pieces were used at high elevations as mortars, medium-sized guns were used on tanks as well as in fixed positions, and the largest guns became long-range land batteries and battleship armaments. With the introduction of better tanks in World War II, elevation had to be taken into account by tank gunners, who had to aim through the Gunner's Auxiliary Sight (GAS) or even through iron sights. At shorter ranges, the high velocity of tank and other munitions made elevation less of an issue. During World War II, artillery fire-control systems (FCS) were introduced, improving the effectiveness of artillery fire. With advances in the 21st century, it has become easy t
https://en.wikipedia.org/wiki/Min%C3%A4%20Per%C3%A4smies
Minä Peräsmies is a 1998 Finnish CD-ROM for Windows that consists of comics, games and other content based on the superhero character Peräsmies, who is able to fly by farting "with the power of a thousand hurricanes". The CD-ROM was created by a team at the media company Mediakeisari Oy including Timo Kokkila (the artist), Petri Tuomola and Reima Mäkinen, and was published and sold by Plan1 Oy. The character became known from the Finnish comic and humor magazine Pahkasika, in which strips were released from 1983 to 2000. Content The CD-ROM includes 8 comics, 5 games (one of which is a printable board game), the Food Circle of Peräsmies and the Museum Center Fiasco (a multimedia Fart Museum and a collection of images). The packaging also contains comic strips and other "stuff". Reception The CD-ROM received a mixed reception. Most of the published reviews conclude that it is essentially worth a single quick viewing, with some decent artwork and momentarily interesting technical work, including the wide array of fart sounds. Making-of documentary A making-of documentary has been released on YouTube in two parts. The documentary "Näin tehtiin Minä Peräsmies" ("This is how Minä Peräsmies was made") consists of interviews and archive footage from the time of the CD-ROM's making and was made by Reima Mäkinen.
https://en.wikipedia.org/wiki/The%20Symmetries%20of%20Things
The Symmetries of Things is a book on mathematical symmetry and the symmetries of geometric objects, aimed at audiences of multiple levels. It was written over the course of many years by John Horton Conway, Heidi Burgiel, and Chaim Goodman-Strauss, and published in 2008 by A K Peters. Its critical reception was mixed, with some reviewers praising it for its accessible and thorough approach to its material and for its many inspiring illustrations, and others complaining about its inconsistent level of difficulty, overuse of neologisms, failure to adequately cite prior work, and technical errors. Topics The Symmetries of Things has three major sections, subdivided into 26 chapters. The first of the sections discusses the symmetries of geometric objects. It includes both the symmetries of finite objects in two and three dimensions, and two-dimensional infinite structures such as frieze patterns and tessellations, and develops a new notation for these symmetries based on work of Alexander Murray MacBeath that, as proven by the authors using a simplified form of the Riemann–Hurwitz formula, covers all possibilities. Other topics include Euler's polyhedral formula and the classification of two-dimensional surfaces. It is heavily illustrated with both artworks and objects depicting these symmetries, such as the art of M. C. Escher and Bathsheba Grossman, as well as new illustrations created by the authors using custom software. The second section of the book considers symmetries more abstractly and combinatorially, considering both the color-preserving symmetries of colored objects, the symmetries of topological spaces described in terms of orbifolds, and abstract forms of symmetry described by group theory and presentations of groups. This section culminates with a classification of all of the finite groups with up to 2009 elements. The third section of the book provides a classification of the three-dimensional space groups and examples of honeycombs such as the Weai
https://en.wikipedia.org/wiki/Thalassogen
In astronomy, a thalassogen denotes a substance capable of forming a planetary ocean. Thalassogens are not necessarily life sustaining, although most interest has been in the context of extraterrestrial life. The term was coined by Isaac Asimov in his essay "The Thalassogens", later published in his 1972 collection The Left Hand of the Electron, from the Ancient Greek prefix thalasso- ("sea") and the suffix -gen ("producer"). The elements making up a thalassogen have to be relatively abundant, and the substance must be chemically stable in its environment and must remain liquid under the conditions found on some planets. Freitas tabulates candidate thalassogens, noting that the liquid range typically increases with increasing pressure. The critical temperature and pressure represent the point where the distinction between gas and liquid vanishes, a possible upper limit for life (though life in supercritical fluids has been discussed both in science and fiction, such as in Close to Critical by Hal Clement). Later authors have also suggested sulfuric acid, ethane, and water/ammonia mixtures as possible thalassogens. The discovery of possible subsurface oceans on moons such as Europa (and, less obviously, Ganymede and Callisto) also extends the range of possible environments. See also Extraterrestrial liquid water Hypothetical types of biochemistry
https://en.wikipedia.org/wiki/Phono%20input
Phono input is a set of input jacks, usually mini jacks or RCA connectors, located on the rear panel of a preamp, mixer or amplifier, especially on early radio sets, to which a phonograph or turntable is attached. Modern phono cartridges give a very low-level output signal on the order of a few millivolts, which the circuitry amplifies and equalizes. Phonograph recordings are made with the high frequencies boosted and the low frequencies attenuated; during playback the frequency response changes are reversed. This reduces background noise, including clicks or pops, and also conserves the amount of physical space needed for each groove, by reducing the size of the larger low-frequency undulations. This is accomplished in the amplifier with a phono input that incorporates standardized RIAA equalization circuitry. Through at least the 1980s, the phono input was widely available on consumer stereo equipment—even some larger boomboxes had them. By the 2000s only very sophisticated and expensive stereo receivers retained the phono input, since most users were expected to use digital music formats such as CD or satellite radio. Some newer low-cost turntables include built-in amplifiers to produce line-level (one volt) outputs; devices are available that perform this conversion for use with computers; or older amplifiers or radio receivers can be used. Nearly all DJ mixers have two or more phono inputs, together with two or more one-volt line inputs that also use RCA connectors. This "phono input", designed for the millivolt signal from an unamplified turntable, should not be confused with the modern standard one-volt line input and output that also uses RCA connectors and is found on video cameras, recorders and similar modern equipment.
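For reference, the playback de-emphasis described above is standardized by the RIAA curve; a common way to write it (standard audio-engineering material, not taken from the article) is as a transfer function with three time constants:

```latex
% RIAA playback equalization, with s = j\,2\pi f:
G(s) \;\propto\; \frac{1 + s\,\tau_2}{(1 + s\,\tau_1)(1 + s\,\tau_3)}
% time constants (approximate corner frequencies in parentheses):
\tau_1 = 3180\,\mu\mathrm{s}\ (\approx 50\,\mathrm{Hz}),\quad
\tau_2 = 318\,\mu\mathrm{s}\ (\approx 500\,\mathrm{Hz}),\quad
\tau_3 = 75\,\mu\mathrm{s}\ (\approx 2122\,\mathrm{Hz})
% The recording pre-emphasis is the inverse of G(s), so the cascade of
% the two is flat; a phono input implements G(s) together with
% substantial gain (typically around 40 dB for moving-magnet cartridges).
```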
https://en.wikipedia.org/wiki/Protein%20catabolism
In molecular biology, protein catabolism is the breakdown of proteins into smaller peptides and ultimately into amino acids. Protein catabolism is a key function of the digestion process. Protein catabolism often begins with pepsin, which converts proteins into polypeptides. These polypeptides are then further degraded. In humans, the pancreatic proteases include trypsin, chymotrypsin, and other enzymes. In the intestine, the small peptides are broken down into amino acids that can be absorbed into the bloodstream. These absorbed amino acids can then undergo amino acid catabolism, where they are utilized as an energy source or as precursors to new proteins. The amino acids produced by catabolism may be directly recycled to form new proteins, converted into different amino acids, or can undergo amino acid catabolism to be converted to other compounds via the Krebs cycle. Interface with other metabolic and salvage pathways Protein catabolism produces amino acids that are used to form bacterial proteins or oxidized to meet the energy needs of the cell. The amino acids that are produced by protein catabolism can then be further catabolized in amino acid catabolism. Among the several degradative processes for amino acids are deamination (removal of an amino group), transamination (transfer of an amino group), decarboxylation (removal of a carboxyl group), and dehydrogenation (removal of hydrogen). Degradation of amino acids can function as part of a salvage pathway, whereby parts of degraded amino acids are used to create new amino acids, or as part of a metabolic pathway whereby the amino acid is broken down to release or recapture chemical energy. For example, the chemical energy that is released by oxidation in a dehydrogenation reaction can be used to reduce NAD+ to NADH, which can then be fed directly into the Krebs/citric acid (TCA) cycle. Protein degradation Protein degradation differs from protein catabolism. Proteins are produced and destroyed routinely as par
https://en.wikipedia.org/wiki/Asymmetric%20carbon
In stereochemistry, an asymmetric carbon is a carbon atom that is bonded to four different types of atoms or groups of atoms. The four atoms and/or groups attached to the carbon atom can be arranged in space in two different ways that are mirror images of each other, and which lead to so-called left-handed and right-handed versions (stereoisomers) of the same molecule. Molecules that cannot be superimposed on their own mirror image are said to be chiral; as the asymmetric carbon is the center of this chirality, it is also known as a chiral carbon. As an example, malic acid (HOOC–CH2–CHOH–COOH) has 4 carbon atoms but just one of them is asymmetric. The asymmetric carbon atom is the one in the CHOH group, attached to two carbon atoms, an oxygen atom, and a hydrogen atom. One may initially be inclined to think this atom is not asymmetric because it is attached to two carbon atoms, but because those two carbon atoms are not attached to exactly the same things, there are two different groups of atoms that the carbon atom in question is attached to, therefore making it an asymmetric carbon atom. Knowing the number of asymmetric carbon atoms, one can calculate the maximum possible number of stereoisomers for any given molecule as follows: if n is the number of asymmetric carbon atoms, then the maximum number of stereoisomers is 2^n (the Le Bel–van't Hoff rule). This is a corollary of Le Bel and van't Hoff's simultaneously announced conclusions, in 1874, that the most probable orientation of the bonds of a carbon atom linked to four groups or atoms is toward the apexes of a tetrahedron, and that this accounted for all then-known phenomena of molecular asymmetry (which involved a carbon atom bearing four different atoms or groups). A tetrose with 2 asymmetric carbon atoms has 2^2 = 4 stereoisomers; an aldopentose with 3 asymmetric carbon atoms has 2^3 = 8 stereoisomers; an aldohexose with 4 asymmetric carbon atoms has 2^4 = 16 stereoisomers.
https://en.wikipedia.org/wiki/Kuramoto%E2%80%93Sivashinsky%20equation
In mathematics, the Kuramoto–Sivashinsky equation (also called the KS equation or flame equation) is a fourth-order nonlinear partial differential equation. It is named after Yoshiki Kuramoto and Gregory Sivashinsky, who derived the equation in the late 1970s to model the diffusive–thermal instabilities in a laminar flame front. The equation was independently derived by G. M. Homsy and A. A. Nepomnyashchii in 1974, in connection with the stability of liquid film on an inclined plane, and by R. E. LaQuey et al. in 1975 in connection with trapped-ion instability. The Kuramoto–Sivashinsky equation is known for its chaotic behavior. Definition The 1d version of the Kuramoto–Sivashinsky equation is u_t + u_xxxx + u_xx + (1/2)(u_x)^2 = 0. An alternate form, v_t + v_xxxx + v_xx + v v_x = 0, is obtained by differentiating with respect to x and substituting v = u_x. This is the form used in fluid dynamics applications. The Kuramoto–Sivashinsky equation can also be generalized to higher dimensions. In spatially periodic domains, one possibility is u_t + Δ²u + Δu + (1/2)|∇u|² = 0, where Δ is the Laplace operator and Δ² is the biharmonic operator. Properties The Cauchy problem for the 1d Kuramoto–Sivashinsky equation is well-posed in the sense of Hadamard—that is, for given initial data u(x, 0), there exists a unique solution u(x, t) that depends continuously on the initial data. The 1d Kuramoto–Sivashinsky equation (in the form for v) possesses Galilean invariance—that is, if v(x, t) is a solution, then so is v(x − ct, t) + c, where c is an arbitrary constant. Physically, since v = u_x is a velocity, this change of variable describes a transformation into a frame that is moving with constant relative velocity c. On a periodic domain, the equation also has a reflection symmetry: if v(x, t) is a solution, then −v(−x, t) is also a solution. Solutions Solutions of the Kuramoto–Sivashinsky equation possess rich dynamical characteristics. Considered on a periodic domain 0 ≤ x ≤ L, the dynamics undergoes a series of bifurcations as the domain size L is increased, culminating in the onset of chaotic behavior. Depending on the value of L, solutions may include equilibria, relative equ
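Because the stiff fourth-order linear term is diagonal in Fourier space, the equation is commonly integrated pseudo-spectrally. Below is a minimal sketch of ours in Python/NumPy for the v-form above; the parameters, the semi-implicit Euler scheme, and the omission of dealiasing are simplifications (serious work would use a higher-order exponential integrator such as ETDRK4):

```python
# Minimal pseudo-spectral integrator for the 1d Kuramoto-Sivashinsky
# equation v_t + v v_x + v_xx + v_xxxx = 0 on a periodic domain [0, L].
import numpy as np

L, N, dt, steps = 32 * np.pi, 256, 0.01, 20000
x = L * np.arange(N) / N
k = 2 * np.pi * np.fft.rfftfreq(N, d=L / N)   # angular wavenumbers
lin = k**2 - k**4                             # from -v_xx - v_xxxx

v = np.cos(x / 16) * (1 + np.sin(x / 16))     # common demo initial condition
vh = np.fft.rfft(v)
for _ in range(steps):
    # nonlinear term: -v v_x = -(1/2)(v^2)_x, evaluated in physical space
    nl = -0.5j * k * np.fft.rfft(np.fft.irfft(vh, n=N) ** 2)
    # semi-implicit Euler: linear part implicit, nonlinear part explicit
    vh = (vh + dt * nl) / (1 - dt * lin)
v = np.fft.irfft(vh, n=N)
print(np.sqrt(np.mean(v**2)))  # O(1) RMS: sustained spatiotemporal chaos
```

For this domain size (L = 32π) the solution sits well inside the chaotic regime described above; shrinking L walks the dynamics back down through the bifurcation sequence toward simple equilibria.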
https://en.wikipedia.org/wiki/Regulator%20gene
A regulator gene, regulator, or regulatory gene is a gene involved in controlling the expression of one or more other genes. The regulatory sequences through which such control is exerted are often located at the five prime (5') end of the transcription start site of the gene they regulate. In addition, these sequences can also be found at the three prime (3') end of the transcription start site. In both cases, whether the regulatory sequence occurs before (5') or after (3') the gene it regulates, the sequence is often many kilobases away from the transcription start site. A regulator gene may encode a protein, or it may work at the level of RNA, as in the case of genes encoding microRNAs. An example of a regulator gene is a gene that codes for a repressor protein that inhibits the activity of an operator (a segment of DNA to which repressor proteins bind, thus inhibiting transcription of the downstream genes by RNA polymerase). In prokaryotes, regulator genes often code for repressor proteins. Repressor proteins bind to operators or promoters, preventing RNA polymerase from transcribing RNA. They are usually constantly expressed so the cell always has a supply of repressor molecules on hand. Inducers cause repressor proteins to change shape or otherwise become unable to bind DNA, allowing RNA polymerase to continue transcription. Regulator genes can be located within an operon, adjacent to it, or far away from it. Other regulatory genes code for activator proteins. An activator binds to a site on the DNA molecule and causes an increase in transcription of a nearby gene. In prokaryotes, a well-known activator protein is the catabolite activator protein (CAP), involved in positive control of the lac operon. In the regulation of gene expression, studied in evolutionary developmental biology (evo-devo), both activators and repressors play important roles. Regulatory genes can also be described as positive or negative regulators, based on the environmental conditions that surround the ce
https://en.wikipedia.org/wiki/Remnant%20natural%20area
A remnant natural area, also known as remnant habitat, is an ecological community containing native flora and fauna that has not been significantly disturbed by destructive activities such as agriculture, logging, pollution, development, fire suppression, or non-native species invasion. The more disturbed an area has been, the less characteristic it becomes of remnant habitat. Remnant areas are also described as "biologically intact" or "ecologically intact." Remnant natural areas are often used as reference ecosystems in ecological restoration projects. Ecology A remnant natural area can be described in terms of its natural quality or biological integrity, which is the extent to which it has the internal biodiversity and abiotic elements to replicate itself over time. Another definition of biological integrity is "the capability of supporting and maintaining a balanced, integrated, adaptive community of organisms having a species composition, diversity, and functional organization comparable to that of the natural habitat of the region." Abiotic elements determining the quality of a natural area may include factors such as hydrologic connectivity or fire. In areas that have been dredged, drained, or dammed, the altered hydrology can destroy a remnant natural area. Similarly, too much or too little fire can degrade or destroy a remnant natural area. Remnant natural areas are characterized by the presence of "conservative" plants and animals—organisms that are restricted to or highly characteristic of areas that have not been disturbed by humans. Tools to measure aspects of natural areas quality in remnant areas include Floristic Quality Assessment and the Macroinvertebrate Community Index. Examples In the upper Midwestern United States, remnant natural areas date prior to European settlement, going back to the end of the Wisconsinian Glaciation approximately 15,000 years ago. Diverse remnant plant community examples in that region include tallgrass prairie, beec
https://en.wikipedia.org/wiki/XS%20%28server%29
XS is the studio production server of the Belgian company EVS Broadcast Equipment. It was inspired by the XT3 server but can be controlled by dedicated EVS controllers for the studio environment (Xsense, IPDirector, Xscreen, Insio) or by non-EVS controllers such as automation systems, linear or hybrid editors, switchers and controllers, through an API or standard protocols. Designed to replace VTRs, the server allows incoming feeds to be recorded, quickly enriched with metadata, and played out or instantly streamed or transferred to post-production. The server benefits from loop recording and allows users to record, control and play media. It has from 2 to 6 SD/HD channels, and a 6-channel 3D/1080p (3G or dual-link) mode offers the same features in a 3D environment. It supports several formats and codecs, with specific codecs for news environments. It is widely used in news environments and sometimes in sports.
https://en.wikipedia.org/wiki/Waveguide
A waveguide is a structure that guides waves by restricting the transmission of energy to one direction. Common types of waveguides include acoustic waveguides which direct sound, optical waveguides which direct light, and radio-frequency waveguides which direct electromagnetic waves other than light like radio waves. Without the physical constraint of a waveguide, waves would expand into three-dimensional space and their intensities would decrease according to the inverse square law. There are different types of waveguides for different types of waves. The original and most common meaning is a hollow conductive metal pipe used to carry high frequency radio waves, particularly microwaves. Dielectric waveguides are used at higher radio frequencies, and transparent dielectric waveguides and optical fibers serve as waveguides for light. In acoustics, air ducts and horns are used as waveguides for sound in musical instruments and loudspeakers, and specially-shaped metal rods conduct ultrasonic waves in ultrasonic machining. The geometry of a waveguide reflects its function; in addition to more common types that channel the wave in one dimension, there are two-dimensional slab waveguides which confine waves to two dimensions. The frequency of the transmitted wave also dictates the size of a waveguide: each waveguide has a cutoff wavelength determined by its size and will not conduct waves of greater wavelength; an optical fiber that guides light will not transmit microwaves which have a much larger wavelength. Some naturally occurring structures can also act as waveguides. The SOFAR channel layer in the ocean can guide the sound of whale song across enormous distances. Any shape of cross section of waveguide can support EM waves. Irregular shapes are difficult to analyse. Commonly used waveguides are rectangular and circular in shape. Uses The uses of waveguides for transmitting signals were known even before the term was coined. The phenomenon of sound waves g
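As a concrete illustration of the cutoff idea (a sketch, not from the article): for a hollow rectangular waveguide the dominant TE10 mode has cutoff wavelength twice the broad interior dimension a, so its cutoff frequency is f_c = c/(2a), and waves below that frequency will not propagate.

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def te10_cutoff_hz(broad_wall_m: float) -> float:
    """Cutoff frequency of the dominant TE10 mode of a hollow
    rectangular waveguide with broad interior dimension a."""
    return C / (2.0 * broad_wall_m)

# WR-90 (X-band) guide, a = 22.86 mm: cutoff near 6.56 GHz, so
# 10 GHz microwaves propagate but 5 GHz waves are cut off.
print(te10_cutoff_hz(22.86e-3) / 1e9)  # ~6.557
```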
https://en.wikipedia.org/wiki/Craterellus%20calicornucopioides
Craterellus calicornucopioides is an edible fungus in the family Cantharellaceae. Described by David Arora and Jonathan L. Frank in 2015, it is the North American counterpart of the similar European species Craterellus cornucopioides. Molecular phylogenetics has shown, however, that they are distinct species.
https://en.wikipedia.org/wiki/Ardipithecus
Ardipithecus is an extinct genus of hominine that lived during the Late Miocene and Early Pliocene epochs in the Afar Depression, Ethiopia. Originally described as one of the earliest ancestors of humans after they diverged from chimpanzees, the relation of this genus to human ancestors, and whether it is a hominin, is now a matter of debate. Two fossil species are described in the literature: A. ramidus, which lived about 4.4 million years ago during the early Pliocene, and A. kadabba, dated to approximately 5.6 million years ago (late Miocene). Initial behavioral analysis indicated that Ardipithecus could be very similar to chimpanzees; however, more recent analysis based on canine size and the lack of canine sexual dimorphism indicates that Ardipithecus was characterised by reduced aggression and that they more closely resembled bonobos. Some analyses describe Australopithecus as being sister to Ardipithecus ramidus specifically. This means that Australopithecus is more closely related to Ardipithecus ramidus than to Ardipithecus kadabba. Cladistically, then, Australopithecus (and eventually Homo sapiens) indeed emerged within the Ardipithecus lineage, and this lineage is not literally extinct. Ardipithecus ramidus A. ramidus was named in September 1994. The first fossil found was dated to 4.4 million years ago on the basis of its stratigraphic position between two volcanic strata: the basal Gaala Tuff Complex (G.A.T.C.) and the Daam Aatu Basaltic Tuff (D.A.B.T.). The name Ardipithecus ramidus stems mostly from the Afar language, in which Ardi means "ground/floor" and ramid means "root". The pithecus portion of the name is from the Greek word for "ape". Like most hominids, but unlike all previously recognized hominins, it had a grasping hallux or big toe adapted for locomotion in the trees. It is not confirmed how many other features of its skeleton reflect adaptation to bipedalism on the ground as well. Like later hominins, Ardipithecus had reduced canine
https://en.wikipedia.org/wiki/Chicken%20of%20the%20Sea
Chicken of the Sea is a packager and provider of seafood, owned by the Thai Union Group in Samut Sakhon, Thailand. The brand is attached to tuna, salmon, clams, crab, shrimp, mackerel, oysters, kippers and sardines in cans, pouches and cups, as are its sister brands, Genova and Ace of Diamonds. History The company was founded in 1914 when Frank Van Camp and his son bought the California Tuna Canning Company and changed its name to the Van Camp Seafood Company. The phrase "Chicken of the Sea", first devised as a way to describe the taste, was so successful that soon it also became the company name. In 1963, Van Camp Seafood Company was purchased by Ralston Purina. In 1988, Ralston sold its Van Camp division to an Indonesian corporation, PT Mantrust (headed by Teguh Sutantyo), which had financial problems, and the primary creditor, Prudential Life Insurance Company, became the majority owner. In 1997 the company was purchased by the investment group Tri-Union Seafoods LLC, made up of three partners: Thai Union International Inc., a Thai conglomerate based in Bangkok and the then-largest tuna packer in Asia and second largest in the world Edmund A. Gann, American owner of Caribbean Marine Service, Co., Inc., a tuna-fishing fleet Tri-Marine International, Inc., a global trading company formed in Singapore in 1972 dealing in tuna and tuna products headed by Renato Curto, president and majority shareholder. The new owners changed the name of Van Camp Seafood Company to Chicken of the Sea International. In 2000, Tri-Marine International Inc and Edmund A. Gann sold their 50 percent interest in Chicken of the Sea to Thai Union International, Inc., leaving Thai Union the sole owner of the company. Chicken of the Sea International and Tri-Union International LLC merged into one company, still called Chicken of the Sea International. With the 2003 acquisition of Empress International, an importer of frozen shrimp and other shellfish, Chicken of the Sea's total annual
https://en.wikipedia.org/wiki/Bruhat%20order
In mathematics, the Bruhat order (also called the strong order, strong Bruhat order, Chevalley order, Bruhat–Chevalley order, or Chevalley–Bruhat order) is a partial order on the elements of a Coxeter group that corresponds to the inclusion order on Schubert varieties. History The Bruhat order on the Schubert varieties of a flag manifold or a Grassmannian was first studied by Ehresmann (1934), and the analogue for more general semisimple algebraic groups was studied by Chevalley (1958). Verma (1968) started the combinatorial study of the Bruhat order on the Weyl group, and introduced the name "Bruhat order" because of the relation to the Bruhat decomposition introduced by François Bruhat. The left and right weak Bruhat orderings were studied by Björner (1984). Definition If (W, S) is a Coxeter system with generators S, then the Bruhat order is a partial order on the group W. Recall that a reduced word for an element w of W is a minimal length expression of w as a product of elements of S, and the length ℓ(w) of w is the length of a reduced word. The (strong) Bruhat order is defined by u ≤ v if some substring of some (or every) reduced word for v is a reduced word for u. (Note that here a substring is not necessarily a consecutive substring.) The weak left (Bruhat) order is defined by u ≤L v if some final substring of some reduced word for v is a reduced word for u. The weak right (Bruhat) order is defined by u ≤R v if some initial substring of some reduced word for v is a reduced word for u. For more on the weak orders, see the article weak order of permutations. Bruhat graph The Bruhat graph is a directed graph related to the (strong) Bruhat order. The vertex set is the set of elements of the Coxeter group and the edge set consists of directed edges (u, v) whenever u = tv for some reflection t and ℓ(u) < ℓ(v). One may view the graph as an edge-labeled directed graph with edge labels coming from the set of reflections. (One could also define the Bruhat graph using multiplication on the right; as graphs,
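For the symmetric group, the subword definition can be tested with the equivalent "tableau criterion": u ≤ v if and only if, for every k, the increasing rearrangement of u(1), ..., u(k) is entrywise at most that of v(1), ..., v(k). The sketch below (an illustration, not part of the article) applies that criterion to permutations given in one-line notation.

```python
def bruhat_leq(u, v):
    """Tableau criterion for the Bruhat order on the symmetric group:
    u <= v iff for every k, sorted(u[:k]) is entrywise <= sorted(v[:k]).
    u and v are permutations of 1..n in one-line notation."""
    n = len(u)
    for k in range(1, n):
        for a, b in zip(sorted(u[:k]), sorted(v[:k])):
            if a > b:
                return False
    return True

# Example: 1324 <= 3142 in one-line notation (lengths 1 and 3).
print(bruhat_leq((1, 3, 2, 4), (3, 1, 4, 2)))  # True
print(bruhat_leq((3, 1, 4, 2), (1, 3, 2, 4)))  # False
```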
https://en.wikipedia.org/wiki/WDR%20paper%20computer
The WDR paper computer or Know-how Computer is an educational model of a computer consisting only of a pen, a sheet of paper, and individual matches in the simplest case. This allows anyone interested to learn how to program without having an electronic computer at their disposal. The paper computer was created in the early 1980s, when computer access was not yet widespread in Germany, to allow people to familiarize themselves with basic computer operation and assembly-like programming languages. It was distributed in over 400,000 copies and was at its time among the computers with the widest circulation. The Know-how Computer was developed by Wolfgang Back and Ulrich Rohde and was first presented in the television program WDR Computerclub in 1983. It was also published in the German computer magazines mc and . The original printed version of the paper computer has up to 21 lines of code on the left and eight registers on the right, which are represented as boxes that contain as many matches as the value in the corresponding register. A pen is used to indicate the line of code which is about to be executed. The user steps through the program, adding and subtracting matches from the appropriate registers and following program flow until the stop instruction is encountered. The instruction set of five commands is small but Turing complete and therefore enough to represent all computable functions: inc *register*: Add 1 to the register. dec *register*: Subtract 1 from the register. jmp *line*: Jumps to the specified line. isz *register*: Checks if the register is zero. If so, skips a line. If not, continues normally. stp: Stops the program. In the original newspaper article about this computer, it was written slightly differently (translation): + = Add 1 to the contents of data register XX and increase (program step) by 1 - = Subtract 1 from the contents of data register XX and increase (program step) by 1 (J) = (Jump) to (line) XX 0 = Check if the content of the data re
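Because the instruction set is so small, the paper computer is easy to simulate. The sketch below is illustrative (not from the article): it interprets the five instructions with 0-based line numbers, where `isz` skips one line when the register is zero, and runs a hypothetical program that adds register 1 into register 0.

```python
def run(program, registers):
    """Interpret a Know-how Computer program.
    program: list of (op, arg) pairs; registers: list of ints (the matchboxes)."""
    pc = 0
    while True:
        op, arg = program[pc]
        if op == "inc":
            registers[arg] += 1
        elif op == "dec":
            registers[arg] -= 1
        elif op == "jmp":
            pc = arg
            continue
        elif op == "isz" and registers[arg] == 0:
            pc += 2              # register is zero: skip the next line
            continue
        elif op == "stp":
            return registers
        pc += 1

# Add register 1 into register 0 (destroys register 1):
prog = [
    ("isz", 1),    # 0: if r1 == 0, skip the next line
    ("jmp", 3),    # 1: otherwise continue at line 3
    ("stp", None), # 2: done
    ("dec", 1),    # 3: move one match from r1 ...
    ("inc", 0),    # 4: ... to r0
    ("jmp", 0),    # 5: loop
]
print(run(prog, [2, 3]))  # [5, 0]
```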
https://en.wikipedia.org/wiki/State%20quality%20mark%20of%20the%20USSR
The State quality mark of the USSR (Государственный знак качества СССР, transliterated Gosudarstvennyy znak kachestva SSSR) was the official Soviet mark for the certification of quality, established in 1967. Symbol The sign was a pentagonal shield with a rotated letter K (from the Russian word качество – quality) stylized as scales below the Cyrillic abbreviation for USSR (СССР). History It was used to mark consumer, production, and technical goods to certify that they met quality standards and, in general, to increase the effectiveness of the production system in the USSR. The goods themselves or their packaging were marked, as was the accompanying documentation, labels or tags. Rules of its use were defined by GOST, an acronym for "state standard" (государственный стандарт), section 1.9-67 (April 7, 1967). The right to use the sign was leased to enterprises for 2–3 years based on the examination of the goods by the State Attestation Commission, which was to certify that the goods were of the "higher quality category". That is: their quality "meets or exceeds the quality of the best international analogs", parameters of quality are stable, the goods fully satisfy Soviet state standards, the goods are compatible with international standards, production of the goods is economically effective, and they satisfy the demands of the state economy and the population. Obtaining the sign allowed enterprises to increase the state-controlled price of the goods by ten percent. When the sign was introduced it indeed suggested high quality of the goods, but after some time many Soviet-made goods were certified for the sign while their quality often remained below the expectations of customers. After the dissolution of the Soviet Union, the Russian government introduced its own sign for certification of quality, known as the Rostest mark (or R mark). See also Certification mark State Emblem of the Soviet Union Rostest – organization responsible for the newer Rostest mark
https://en.wikipedia.org/wiki/Torservers.net
torservers.net is an independent network of non-profit organisations that provide nodes for the Tor anonymity network. The network started in June 2010 and transfers up to 7.4 GB/s (~59.2 Gb/s) of exit node traffic as of May 2022. Torservers.net is known for operating servers with high network bandwidth and running them as exit nodes in the Tor network, which helps increase its speed and capacity. The group additionally helps provide lawyers for relay operators along with arranging operator meetups. Funding While Tor is free software that anyone can run, successful operation of Tor nodes may require technical expertise and access to high bandwidth, and can involve legal complications in some jurisdictions. The Torservers.net network accepts financial donations as a way to sponsor additional nodes. Bavarian Raid On June 20, 2018, Bavarian police raided the home of the board members of the German non-profit Zwiebelfreunde, "Friends of the Onion" (part of torservers.net). Zwiebelfreunde helps collect donations from Europe for various non-commercial providers such as Riseup.net. The police claimed the raid was prompted by a blog post from an unrelated activist that promised violence against an upcoming Alternative for Germany convention in Augsburg. The blog post was published on a website that used a Riseup.net e-mail address. Riseup Collective is based in Seattle in the United States, and reported publicly that Zwiebelfreunde does not run its service. On August 23 the German court at Landgericht München ruled that the raid and seizures were illegal. The hardware and documentation seized had been kept under seal, and purportedly were neither analyzed nor evaluated by the Bavarian police. Members Associated Whistleblowing Press (Belgium) Access Now (USA) Calyx Institute (USA) Zwiebelfreunde e.V. (Germany) Hart Voor Internetvrijheid (Netherlands) Library Freedom Project (USA) Noisebridge (USA) External links Official website
https://en.wikipedia.org/wiki/SLAC%20National%20Accelerator%20Laboratory
SLAC National Accelerator Laboratory, originally named the Stanford Linear Accelerator Center, is a federally funded research and development center in Menlo Park, California, United States. Founded in 1962, the laboratory is now sponsored by the United States Department of Energy and administered by Stanford University. It is the site of the Stanford Linear Accelerator, a 3.2-kilometer (2-mile) linear accelerator constructed in 1966 that could accelerate electrons to energies of 50 GeV. Today SLAC research centers on a broad program in atomic and solid-state physics, chemistry, biology, and medicine using X-rays from synchrotron radiation and a free-electron laser, as well as experimental and theoretical research in elementary particle physics, astroparticle physics, and cosmology. The laboratory is under the programmatic direction of the United States Department of Energy Office of Science. History Founded in 1962 as the Stanford Linear Accelerator Center, the facility is located on Stanford University-owned land on Sand Hill Road in Menlo Park, California—just west of the university's main campus. The main accelerator is 3.2 km (2 mi) long—the longest linear accelerator in the world—and has been operational since 1966. Research at SLAC has produced three Nobel Prizes in Physics: 1976: The charm quark—see J/ψ meson 1990: Quark structure inside protons and neutrons 1995: The tau lepton SLAC's meeting facilities also provided a venue for the Homebrew Computer Club and other pioneers of the home computer revolution of the late 1970s and early 1980s. In 1984 the laboratory was named an ASME National Historic Engineering Landmark and an IEEE Milestone. SLAC developed and, in December 1991, began hosting the first World Wide Web server outside of Europe. In the early-to-mid 1990s, the Stanford Linear Collider (SLC) investigated the properties of the Z boson using the Stanford Large Detector. As of 2005, SLAC employed over 1,000 people, some 150 of whom were physicists
https://en.wikipedia.org/wiki/Beurling%20zeta%20function
In mathematics, a Beurling zeta function is an analogue of the Riemann zeta function where the ordinary primes are replaced by a set of Beurling generalized primes: any sequence of real numbers greater than 1 that tends to infinity. These were introduced by Arne Beurling in 1937. A Beurling generalized integer is a number that can be written as a product of Beurling generalized primes. Beurling generalized the usual prime number theorem to Beurling generalized primes. He showed that if the number N(x) of Beurling generalized integers less than x is of the form $N(x) = Ax + O(x\log^{-\gamma}x)$ with $\gamma > 3/2$, then the number of Beurling generalized primes less than x is asymptotic to $x/\log x$, just as for ordinary primes, but if $\gamma = 3/2$ then this conclusion need not hold. See also Abstract analytic number theory
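To make the notion concrete, here is a small sketch (not from the article) that enumerates the Beurling generalized integers up to a bound for a given ascending sequence of generalized primes, listing each multiset of prime factors once; with the ordinary primes 2, 3, 5 it reproduces the 5-smooth numbers.

```python
def beurling_integers(primes, limit):
    """All Beurling generalized integers <= limit: finite products of the
    generalized primes (reals > 1, sorted ascending), one per factor multiset."""
    out = [1.0]                      # the empty product
    def extend(value, start):
        for i in range(start, len(primes)):
            nxt = value * primes[i]
            if nxt > limit:
                break                # later primes are larger still
            out.append(nxt)
            extend(nxt, i)           # allow repeated prime factors
    extend(1.0, 0)
    return sorted(out)

# With the ordinary primes 2, 3, 5 these are the 5-smooth numbers:
print(len(beurling_integers([2.0, 3.0, 5.0], 30)))  # 18 of them up to 30
```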
https://en.wikipedia.org/wiki/U.S.%20National%20Vegetation%20Classification
The U.S. National Vegetation Classification (NVC or USNVC) is a scheme for classifying the natural and cultural vegetation communities of the United States. The purpose of this standardized vegetation classification system is to facilitate communication between land managers, scientists, and the public when managing, researching, and protecting plant communities. The non-profit group NatureServe maintains the NVC for the U.S. government. See also British National Vegetation Classification Vegetation classification External links The U.S. National Vegetation Classification website "National Vegetation Classification Standard, Version 2" FGDC-STD-005-2008, Vegetation Subcommittee, Federal Geographic Data Committee, February 2008 U.S. Geological Survey page about the Vegetation Characterization Program Federal Geographic Data Committee page about the NVC Environment of the United States Flora of the United States NatureServe Biological classification
https://en.wikipedia.org/wiki/Fedor%20Nazarov
Fedor (Fedya) L'vovich Nazarov (Фёдор Львович Назаров; born 1967) is a Russian mathematician working in the United States. He has done research in mathematical analysis and its applications, in particular in functional analysis and classical analysis (including harmonic analysis, Fourier analysis, and complex analytic functions). Biography Fedor Nazarov received his Ph.D. from St Petersburg University in 1993, with Victor Petrovich Havin as advisor. Before his Ph.D. studies, Nazarov received a Gold Medal and Special Prize at the International Mathematical Olympiad in 1984. Nazarov worked at Michigan State University in East Lansing from 1995 to 2007 and at the University of Wisconsin–Madison from 2007 to 2011. Since 2011, he has been a full professor of mathematics at Kent State University. Awards Nazarov was awarded the Salem Prize in 1999 "for his work in harmonic analysis, in particular, the uncertainty principle, and his contribution to the development of Bellman function methods". He gave an invited talk at the International Congress of Mathematicians in 2010, on the topic of "Analysis". See also
https://en.wikipedia.org/wiki/Topological%20quantum%20computer
A topological quantum computer is a theoretical quantum computer proposed by Russian-American physicist Alexei Kitaev in 1997. It employs quasiparticles in two-dimensional systems, called anyons, whose world lines pass around one another to form braids in a three-dimensional spacetime (i.e., one temporal plus two spatial dimensions). These braids form the logic gates that make up the computer. The advantage of a quantum computer based on quantum braids over using trapped quantum particles is that the former is much more stable. Small, cumulative perturbations can cause quantum states to decohere and introduce errors in the computation, but such small perturbations do not change the braids' topological properties. This is like the effort required to cut a string and reattach the ends to form a different braid, as opposed to a ball (representing an ordinary quantum particle in four-dimensional spacetime) bumping into a wall. While the elements of a topological quantum computer originate in a purely mathematical realm, experiments in fractional quantum Hall systems indicate these elements may be created in the real world using semiconductors made of gallium arsenide at a temperature of near absolute zero and subjected to strong magnetic fields. Introduction Anyons are quasiparticles in a two-dimensional space. Anyons are neither fermions nor bosons, but like fermions, they cannot occupy the same state. Thus, the world lines of two anyons cannot intersect or merge, which allows their paths to form stable braids in space-time. Anyons can form from excitations in a cold, two-dimensional electron gas in a very strong magnetic field, and carry fractional units of magnetic flux. This phenomenon is called the fractional quantum Hall effect. In typical laboratory systems, the electron gas occupies a thin semiconducting layer sandwiched between layers of aluminium gallium arsenide. When anyons are braided, the transformation of the quantum state of the system depends only o
https://en.wikipedia.org/wiki/Object%20Process%20Methodology
Object process methodology (OPM) is a conceptual modeling language and methodology for capturing knowledge and designing systems, specified as ISO/PAS 19450. Based on a minimal universal ontology of stateful objects and processes that transform them, OPM can be used to formally specify the function, structure, and behavior of artificial and natural systems in a large variety of domains. OPM was conceived and developed by Dov Dori. The ideas underlying OPM were published for the first time in 1995. Since then, OPM has evolved and developed. In 2002, the first book on OPM was published, and on December 15, 2015, after six years of work by ISO TC184/SC5, ISO adopted OPM as ISO/PAS 19450. A second book on OPM was published in 2016. Since 2019, OPM has become a foundation for a Professional Certificate program in Model-Based Systems Engineering (MBSE) at edX. Lectures are available as web videos on YouTube. Overview Catering to human cognitive abilities, an OPM model represents the system under design or study bimodally in both graphics and text for improved representation, understanding, communication, and learning. In OPM, an object is anything that does or does not exist. Objects are stateful—they may have states, such that at each point in time, the object is at one of its states or in transition between states. A process is a thing that transforms an object by creating or consuming it, or by changing its state. OPM is bimodal; it is expressed both visually/graphically in object-process diagrams (OPD) and verbally/textually in Object-Process Language (OPL), a set of automatically generated sentences in a su
https://en.wikipedia.org/wiki/Heaviside%E2%80%93Lorentz%20units
Heaviside–Lorentz units (or Lorentz–Heaviside units) constitute a system of units and quantities that extends the CGS with a particular set of equations that defines electromagnetic quantities, named for Oliver Heaviside and Hendrik Antoon Lorentz. They share with the CGS-Gaussian system the property that the electric constant ε0 and magnetic constant μ0 do not appear in the defining equations for electromagnetism, having been incorporated implicitly into the electromagnetic quantities. Heaviside–Lorentz units may be thought of as normalizing ε0 and μ0 to 1, while at the same time revising Maxwell's equations to use the speed of light c instead. The Heaviside–Lorentz unit system, like the International System of Quantities upon which the SI system is based, but unlike the CGS-Gaussian system, is rationalized, with the result that there are no factors of 4π appearing explicitly in Maxwell's equations. That this system is rationalized partly explains its appeal in quantum field theory: the Lagrangian underlying the theory does not have any factors of 4π when this system is used. Consequently, electromagnetic quantities in the Heaviside–Lorentz system differ by factors of √(4π) in the definitions of the electric and magnetic fields and of electric charge. Heaviside–Lorentz units are often used in relativistic calculations and in particle physics. They are particularly convenient when performing calculations in spatial dimensions greater than three, such as in string theory. Motivation In the mid-to-late 19th century, electromagnetic measurements were frequently made in either the so-called electrostatic (ESU) or electromagnetic (EMU) systems of units. These were based respectively on Coulomb's law and Ampère's law. Use of these systems, as with the subsequently developed Gaussian CGS units, resulted in many factors of 4π appearing in formulas for electromagnetic results, including those without circular or spherical symmetry. For example, in the CGS-Gaussian system, the capacitance of a sphere of radius r is r, while that
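The √(4π) rescaling can be checked numerically. In Gaussian units Coulomb's law reads F = q1 q2 / r², while in Heaviside–Lorentz units it reads F = q1 q2 / (4π r²); multiplying each charge by √(4π) therefore leaves the force unchanged, as this sketch (not from the article) illustrates.

```python
import math

def coulomb_gaussian(q1, q2, r):
    """Coulomb force in CGS-Gaussian units: F = q1*q2 / r^2."""
    return q1 * q2 / r**2

def coulomb_heaviside_lorentz(q1, q2, r):
    """Coulomb force in Heaviside-Lorentz units: F = q1*q2 / (4*pi*r^2)."""
    return q1 * q2 / (4 * math.pi * r**2)

# Rescaling each charge by sqrt(4*pi) maps Gaussian to Heaviside-Lorentz:
s = math.sqrt(4 * math.pi)
print(coulomb_gaussian(1.0, 2.0, 3.0))               # 0.2222...
print(coulomb_heaviside_lorentz(1.0 * s, 2.0 * s, 3.0))  # same force
```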
https://en.wikipedia.org/wiki/History%20of%20mathematical%20notation
The history of mathematical notation includes the commencement, progress, and cultural diffusion of mathematical symbols and the conflict of the methods of notation confronted in a notation's move to popularity or inconspicuousness. Mathematical notation comprises the symbols used to write mathematical equations and formulas. Notation generally implies a set of well-defined representations of quantities and symbolic operators. The history includes Hindu–Arabic numerals, letters from the Roman, Greek, Hebrew, and German alphabets, and a host of symbols invented by mathematicians over the past several centuries. The development of mathematical notation can be divided into stages: The "rhetorical" stage is where calculations are performed by words and no symbols are used. The "syncopated" stage is where frequently used operations and quantities are represented by symbolic syntactical abbreviations. From ancient times through the post-classical age, bursts of mathematical creativity were often followed by centuries of stagnation. As the early modern age opened and the worldwide spread of knowledge began, written examples of mathematical developments came to light. The "symbolic" stage is where comprehensive systems of notation supersede rhetoric. Beginning in Italy in the 16th century, new mathematical developments, interacting with new scientific discoveries, were made at an increasing pace that continues through the present day. This symbolic system was in use by medieval Indian mathematicians and in Europe since the middle of the 17th century, and has continued to develop in the contemporary era. The area of study known as the history of mathematics is primarily an investigation into the origin of discoveries in mathematics and, the focus here, the investigation into the mathematical methods and notation of the past. Rhetorical stage Although the history commences with that of the Ionian schools, there is no doubt that those Ancient Greeks who paid attention to i
https://en.wikipedia.org/wiki/Fauna%20of%20Ireland
The fauna of Ireland comprises all the animal species inhabiting the island of Ireland and its surrounding waters. Summary This table uses figures supplied by the National Parks and Wildlife Service. Vertebrates by class Mammals Only 26 land mammal species (including bats, but not including marine mammals) are native to Ireland, because it has been isolated from the European mainland (by rising sea levels after the Midlandian Ice Age), since about 14,000 BC. Some species, such as the red fox, European hedgehog, stoat, otter, pygmy shrew, and badger are common, whereas others, like the Irish hare, red deer, and pine marten are less common and generally seen only in certain national parks and nature reserves around the island. Some introduced species have become thoroughly naturalised, e.g. the European rabbit, grey squirrel, bank vole, and brown rat. In addition, ten species of bat are found in Ireland. Megafaunal extinctions In the Ice Age (which included warm spells), mammals such as the woolly mammoth, muskox, wild horse, giant deer, brown bear, spotted hyena, cave lion, Arctic lemming, Arctic fox, wolf, Eurasian lynx, and reindeer flourished or migrated depending on the degree of coldness. The Irish brown bear was a genetically distinct (clade 2) brown bear from a lineage that had significant polar bear mtDNA. The closest surviving brown bear is Ursus arctos middendorffi in Alaska. Excavations of Barbary macaque remains indicate the species was artificially brought to Ireland at some point in the past. Reptiles Only one land reptile is native to the country, the viviparous lizard. It is common in national parks, particularly in the Wicklow Mountains. Slowworms are common in parts of The Burren area in County Clare, but they are not a native species and were probably introduced in the 1970s. Five marine turtle species appear regularly off the west coast, the leatherback, green, hawksbill, loggerhead, and Kemp's ridley, but they very rarely come ashore. L
https://en.wikipedia.org/wiki/Sup%27R%27Mod
The Sup 'R' Mod II is an RF modulator which was sold by M&R Enterprises in the late 1970s and early 1980s. It connects computers and other devices with composite video outputs to a television. History Apple Computer wanted to provide Apple II computers with color output on a television, but had trouble getting FCC approval because the RF modulation solution was too noisy. Apple made an arrangement with a small nearby company, M&R Enterprises, to manufacture and sell the devices. Apple could not sell the modulator and computer as a package, but retail computer dealers could sell both devices to the end user. Marty Spergel, who ran M&R Enterprises, was told by Steve Jobs that it might sell up to 50 units a month. Spergel later estimated that he had sold about 400,000 units. The Sup 'R' Mod II began selling in April 1978. Technical features The Sup 'R' Mod II kit has a small printed circuit board, an antenna switch, and a coaxial cable with a ferrite core and RCA connectors. Composite video is received by the circuit board through a short cable terminating in a Molex connector, which plugs into a header on the Apple II motherboard. Input can also be provided through an RCA connector. The output of the RF modulator goes out through a coaxial cable to the antenna switch. The antenna switch allows the user to select between television broadcasts and computer output. The television antenna connects to inputs on the switch, and the switch output connects to the back of the television. The connections use screw terminals with spade lugs. Moving the switch from "TV" to "GAME PLAY" selects the computer output. The modulator presents a color signal on UHF channel 33.
https://en.wikipedia.org/wiki/D-Link
D-Link Systems, Inc. (formerly Datex Systems, Inc.) is a Taiwanese multinational networking equipment corporation founded in 1986 and headquartered in Taipei, Taiwan. History Datex Systems was founded in 1986 in Taipei, Taiwan. In 1992 the company changed its name to D-Link. D-Link went public and became the first networking company on the Taiwan Stock Exchange in 1994. It is now publicly traded on the TSEC and NSE stock exchanges. In 1988, D-Link released the industry's first peer-to-peer LANSmart Network Operating System, able to run concurrently with early networking systems such as Novell's NetWare and TCP/IP, which most small network operating systems could not do at the time. In 2007, it was the leading networking company in the small to medium business (SMB) segment worldwide, with a 21.9% market share. In March 2008, it became the market leader in Wi-Fi product shipments worldwide, with 33% of the total market. In 2007, the company was featured in the "Info Tech 100", a listing of the world's best IT companies. It was also ranked as the 9th best IT company in the world for shareholder returns by BusinessWeek. In the same year, D-Link released one of the first Wi-Fi Certified 802.11n draft 2.0 routers (DIR-655), which subsequently became one of the most successful draft 802.11n routers. In May 2013, D-Link released its flagship draft 802.11ac Wireless AC1750 Dual-Band Router (DIR-868L), which at that point had attained the fastest-ever wireless throughput as tested by blogger Tim Higgins. In April 2019, D-Link was named "Gartner Peer Insights Customers' Choice for Wired and Wireless LAN Access Infrastructure". In June 2020, D-Link joined the Taiwan Steel Group (TSG). In 2021, D-Link announced that it would become the Taiwan agent for the international information security brand Cyberbit, and launched the new EAGLE PRO AI series, transforming home Wi-Fi experiences. In 2022, D-Link obtained the TRUSTe Privacy seal, certification of ISO/IEC 27001:2013 an
https://en.wikipedia.org/wiki/Gradual%20typing
Gradual typing is a type system in which some variables and expressions may be given types and the correctness of the typing is checked at compile time (which is static typing) and some expressions may be left untyped and eventual type errors are reported at runtime (which is dynamic typing). Gradual typing allows software developers to choose either type paradigm as appropriate, from within a single language. In many cases gradual typing is added to an existing dynamic language, creating a derived language allowing but not requiring static typing to be used. In some cases a language uses gradual typing from the start. History The term was coined by Jeremy Siek, who developed gradual typing in 2006 with Walid Taha. Implementation In gradual typing, a special type named dynamic is used to represent statically-unknown types. The notion of type equality is replaced by a new relation called consistency that relates the dynamic type to every other type. The consistency relation is reflexive and symmetric but not transitive. Prior attempts at integrating static and dynamic typing tried to make the dynamic type be both the top and bottom of the subtype hierarchy. However, because subtyping is transitive, that results in every type becoming related to every other type, and so subtyping would no longer rule out any static type errors. The addition of a second phase of plausibility checking to the type system did not completely solve this problem. Gradual typing can easily be integrated into the type system of an object-oriented language that already uses the subsumption rule to allow implicit upcasts with respect to subtyping. The main idea is that consistency and subtyping are orthogonal ideas that compose nicely. To add subtyping to a gradually-typed language, simply add the subsumption rule and add a subtyping rule that makes the dynamic type a subtype of itself, because subtyping is supposed to be reflexive. (But do not make the top of the subtyping order dynamic!)
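Python's optional typing is one concrete realization of these ideas: `Any` plays the role of the dynamic type, and a static checker such as mypy accepts the sketch below (not from the article) because `Any` is consistent with `int`, while the type error in the last call surfaces only at run time. Note that consistency is not transitive: `Any` is consistent with both `int` and `str`, yet `int` and `str` remain incompatible.

```python
from typing import Any

def double(x: int) -> int:       # statically typed: checked by the type checker
    return x * 2

def late(y: Any) -> int:         # Any is the dynamic type
    return double(y)             # accepted: Any is consistent with int

print(late(21))                  # 42
try:
    late(None)                   # passes the static checker ...
except TypeError as e:
    print("runtime type error:", e)   # ... but fails dynamically
```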
https://en.wikipedia.org/wiki/Socialist%20millionaire%20problem
In cryptography, the socialist millionaire problem is one in which two millionaires want to determine if their wealth is equal without disclosing any information about their riches to each other. It is a variant of the Millionaires' Problem, whereby two millionaires wish to compare their riches to determine who has the most wealth without disclosing any information about their riches to each other. It is often used as a cryptographic protocol that allows two parties to verify the identity of the remote party through the use of a shared secret, avoiding a man-in-the-middle attack without the inconvenience of manually comparing public key fingerprints through an outside channel. In effect, a relatively weak password/passphrase in natural language can be used. Motivation Alice and Bob have secret values x and y, respectively. Alice and Bob wish to learn whether x = y without allowing either party to learn anything else about the other's secret value. A passive attacker simply spying on the messages Alice and Bob exchange learns nothing about x and y, not even whether x = y. Even if one of the parties is dishonest and deviates from the protocol, that person cannot learn anything more than whether x = y. An active attacker capable of arbitrarily interfering with Alice and Bob's communication (a man-in-the-middle) cannot learn more than a passive attacker and cannot affect the outcome of the protocol other than to make it fail. Therefore, the protocol can be used to authenticate whether two parties have the same secret information. The popular instant message cryptography package Off-the-Record Messaging uses the Socialist Millionaire protocol for authentication, in which the secrets x and y contain information about both parties' long-term authentication public keys as well as information entered by the users themselves. Off-the-Record Messaging protocol The protocol is based on group theory. A group of prime order q and a generator g are agreed upon a priori, and in practice are generally fixed i
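The heart of the protocol can be sketched in a few lines. The demo below uses toy parameters and simulates both honest parties in one function, omitting the zero-knowledge proofs and secret hashing of the full OTR protocol; it computes (Qa/Qb)^(a3*b3) and compares it with Pa/Pb, which agree exactly when x = y mod q. It is an illustration of the idea, not a deployable implementation.

```python
import random

# Toy parameters for illustration only: p = 2q + 1 with q prime,
# g generating the subgroup of prime order q (real deployments use
# large groups, hashed secrets, and zero-knowledge proofs).
p, q, g = 23, 11, 4

def smp_core(x: int, y: int) -> bool:
    """Core exponent dance of the Socialist Millionaire protocol,
    both parties simulated honestly. True exactly when x == y (mod q)."""
    a2, a3 = random.randrange(1, q), random.randrange(1, q)   # Alice's randoms
    b2, b3 = random.randrange(1, q), random.randrange(1, q)   # Bob's randoms
    g2 = pow(g, a2 * b2, p)   # shared via two Diffie-Hellman exchanges
    g3 = pow(g, a3 * b3, p)
    r, s = random.randrange(1, q), random.randrange(1, q)
    Pa, Qa = pow(g3, r, p), pow(g, r, p) * pow(g2, x, p) % p  # Alice sends
    Pb, Qb = pow(g3, s, p), pow(g, s, p) * pow(g2, y, p) % p  # Bob sends
    # Each side raises Qa/Qb to its own exponent; combined exponent is a3*b3.
    Rab = pow(Qa * pow(Qb, -1, p) % p, a3 * b3, p)            # needs Python 3.8+
    # Qa/Qb = g^(r-s) * g2^(x-y), so Rab == Pa/Pb iff g2^((x-y)*a3*b3) == 1.
    return Rab == Pa * pow(Pb, -1, p) % p

print(smp_core(7, 7))  # True
print(smp_core(7, 8))  # False
```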
https://en.wikipedia.org/wiki/Printed%20electronic%20circuit
A printed electronic circuit (PEC) was an ancestor of the hybrid integrated circuit (IC). PECs were common in tube (valve) equipment from the 1940s through the 1970s. Brands Couplate was the Centralab trademark, whilst Sprague called them BulPlates. Aerovox used the generic PEC. Difference from hybrid integrated circuits PECs contained only resistors and capacitors arranged in circuits to simplify construction of tube equipment. Also, their voltage ratings were suitable for tubes. Later, hybrid ICs contained transistors, and often monolithic integrated circuits. Their voltage ratings were suitable for the transistors they contained.
https://en.wikipedia.org/wiki/Erd%C5%91s%E2%80%93Fuchs%20theorem
In mathematics, in the area of additive number theory, the Erdős–Fuchs theorem is a statement about the number of ways that numbers can be represented as a sum of elements of a given additive basis, stating that the average order of this number cannot be too close to being a linear function. The theorem is named after Paul Erdős and Wolfgang Heinrich Johannes Fuchs, who published it in 1956. Statement Let $\mathcal{A}$ be an infinite subset of the natural numbers and $r_{\mathcal{A},h}(n)$ its representation function, which denotes the number of ways that a natural number $n$ can be expressed as the sum of $h$ elements of $\mathcal{A}$ (taking order into account). We then consider the accumulated representation function $s_{\mathcal{A},h}(x) = \sum_{n \le x} r_{\mathcal{A},h}(n)$, which counts (also taking order into account) the number of solutions to $a_1 + \cdots + a_h \le x$, where $a_1, \ldots, a_h \in \mathcal{A}$. The theorem then states that, for any given $c > 0$, the relation $s_{\mathcal{A},h}(x) = cx + o\big(x^{1/4}\log^{-1/2}x\big)$ cannot be satisfied; that is, there is no $\mathcal{A}$ satisfying the above estimate. Theorems of Erdős–Fuchs type The Erdős–Fuchs theorem has an interesting history of precedents and generalizations. In 1915, it was already known by G. H. Hardy that in the case of the sequence of perfect squares the corresponding error term is not $o\big(x^{1/4}\log^{1/4}x\big)$. This estimate is a little better than that described by Erdős–Fuchs, but at the cost of a slight loss of precision, P. Erdős and W. H. J. Fuchs achieved complete generality in their result (at least for the case $h = 2$). Another reason this result is so celebrated may be due to the fact that, in 1941, P. Erdős and P. Turán conjectured that, subject to the same hypotheses as in the theorem stated, the relation $s_{\mathcal{A},2}(x) = cx + o\big(x^{1/4}\big)$ could not hold. This fact remained unproven until 1956, when Erdős and Fuchs obtained their theorem, which is even stronger than the previously conjectured estimate. Improved versions for h = 2 This theorem has been extended in a number of different directions. In 1980, A. Sárközy considered two sequences which are "near" in some sense. He proved the following: Theorem (Sárközy, 1980). If $\mathcal{A}$ and $\mathcal{B}$ are two infinite subsets of natural numbers with , then cannot hold f
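For experimentation, the representation function and its accumulation are straightforward to compute by repeated convolution with the indicator of the set. The sketch below (not from the article) counts ordered representations; taking the positive squares as the set, r(50) = 3 because 50 = 1+49 = 49+1 = 25+25.

```python
def representation_counts(A, x, h=2):
    """r_{A,h}(n) for 0 <= n <= x: ordered ways to write n as a sum of
    h elements of the set A (A must be sorted ascending)."""
    r = [0] * (x + 1)
    r[0] = 1                         # the empty sum
    for _ in range(h):               # convolve h times with the indicator of A
        nxt = [0] * (x + 1)
        for n, ways in enumerate(r):
            if ways:
                for a in A:
                    if n + a > x:
                        break
                    nxt[n + a] += ways
        r = nxt
    return r

squares = [k * k for k in range(1, 11)]   # 1, 4, ..., 100 (positive squares)
r = representation_counts(squares, 100)
s = sum(r)                                # accumulated function s(100)
print(r[50], s)                           # r[50] = 3
```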
https://en.wikipedia.org/wiki/HKDF
HKDF is a simple key derivation function (KDF) based on the HMAC message authentication code. It was initially proposed by its authors as a building block in various protocols and applications, as well as to discourage the proliferation of multiple KDF mechanisms. The main approach HKDF follows is the "extract-then-expand" paradigm, where the KDF logically consists of two modules: the first stage takes the input keying material and "extracts" from it a fixed-length pseudorandom key, and then the second stage "expands" this key into several additional pseudorandom keys (the output of the KDF). It can be used, for example, to convert shared secrets exchanged via Diffie–Hellman into key material suitable for use in encryption, integrity checking or authentication. It is formally described in RFC 5869. One of its authors also described the algorithm in a companion paper in 2010. NIST SP800-56Cr2 specifies a parameterizable extract-then-expand scheme, noting that RFC 5869 HKDF is a version of it and citing its paper for the rationale for the recommendations' extract-and-expand mechanisms. There are implementations of HKDF for C#, Go, Java, JavaScript, Perl, PHP, Python, Ruby, Rust, and other programming languages. Mechanism HKDF is the composition of two functions, HKDF-Extract and HKDF-Expand: HKDF(salt, IKM, info, length) = HKDF-Expand(HKDF-Extract(salt, IKM), info, length) HKDF-Extract HKDF-Extract takes "input key material" (IKM) such as a shared secret generated using Diffie-Hellman, and an optional salt, and generates a cryptographic key called the PRK ("pseudorandom key"). This acts as a "randomness extractor", taking a potentially non-uniform value of high min-entropy and generating a value indistinguishable from a uniform random value. HKDF-Extract is the output of HMAC with the "salt" as the key and the "IKM" as the message. HKDF-Expand HKDF-Expand takes the PRK, some "info", and a length, and generates output of the desired length. HKDF-Expand acts a
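The two stages map directly onto HMAC calls. Below is a compact HKDF-SHA256 sketch following RFC 5869: extract is a single HMAC, and expand chains T(i) = HMAC(PRK, T(i-1) || info || i) with a one-byte counter starting at 1, concatenating blocks until the requested length is reached. The final print should reproduce the RFC's first test vector.

```python
import hashlib
import hmac

HASH, HASH_LEN = hashlib.sha256, 32

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    # An absent/empty salt defaults to HashLen zero bytes (RFC 5869, s. 2.2).
    return hmac.new(salt or b"\x00" * HASH_LEN, ikm, HASH).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int) -> bytes:
    okm, t = b"", b""
    for i in range(1, -(-length // HASH_LEN) + 1):   # ceil(length/HashLen) blocks
        t = hmac.new(prk, t + info + bytes([i]), HASH).digest()
        okm += t
    return okm[:length]

def hkdf(salt: bytes, ikm: bytes, info: bytes, length: int) -> bytes:
    return hkdf_expand(hkdf_extract(salt, ikm), info, length)

# RFC 5869 test case 1 (SHA-256); expected OKM:
# 3cb25f25faacd57a90434f64d0362f2a2d2d0a90cf1a5a4c5db02d56ecc4c5bf34007208d5b887185865
okm = hkdf(bytes.fromhex("000102030405060708090a0b0c"), b"\x0b" * 22,
           bytes.fromhex("f0f1f2f3f4f5f6f7f8f9"), 42)
print(okm.hex())
```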
https://en.wikipedia.org/wiki/Murder%20of%20Lynette%20White
Lynette Deborah White (5 July 1967 – 14 February 1988) was murdered on 14 February 1988 in Cardiff, Wales. South Wales Police issued a photofit image of a bloodstained, white male seen in the vicinity at the time of the murder but were unable to trace the man. In November 1988, the police charged five men with White's murder, although none of the scientific evidence discovered at the crime scene could be linked to them. In November 1990, following what was then the longest murder trial in British history, three of the men were found guilty and sentenced to life imprisonment. In December 1992, the convictions were ruled unsafe and quashed by the Court of Appeal after it was decided that the police investigating the murder had acted improperly. The wrongful conviction of the three men has been called one of the most egregious miscarriages of justice in recent times. The police insisted that the men had been released purely on a legal technicality, that they would be seeking no other suspects, and resisted calls for the case to be reopened. In January 2002, new DNA technology enabled forensic scientists led by Angela Gallop to obtain a reliable crime scene DNA profile. The extracted profile led police to the real killer, Jeffrey Gafoor, who confessed to White's murder and was sentenced to life imprisonment. Gafoor received a shorter minimum tariff (the length of time before a prisoner may be considered for parole) than had been given to the wrongfully convicted men, due to the reduction for a guilty plea, highlighting a controversial feature of the sentencing guidelines. In 2004, the Independent Police Complaints Commission (IPCC) began a review of the conduct of the police during the original inquiry. Over the next 12 months around 30 people were arrested in connection with the investigation, 19 of whom were serving or retired police officers. In 2007, three of the prosecution witnesses who gave evidence at the original murder trial were convicted of perjury and sen
https://en.wikipedia.org/wiki/DataVault
The DataVault was Thinking Machines' mass storage system, storing five gigabytes of data, expandable to ten gigabytes, with transfer rates of 40 megabytes per second. Eight DataVaults could be operated in parallel for a combined data transfer rate of 320 megabytes per second for up to 80 gigabytes of data. Each DataVault unit stored its data in an array of 39 individual disk drives, with data spread across the drives. Each 64-bit data chunk received from the I/O bus was split into two 32-bit words. After verifying parity, the DataVault controller added 7 bits of error-correcting code (ECC) and stored the resulting 39 bits on 39 individual drives. Subsequent failure of any one of the 39 drives would not impair reading of the data, since the ECC code allows any single-bit error to be detected and corrected. Although operation was possible with a single failed drive, three spare drives were available to replace failed units until they were repaired. The ECC codes permit 100% recovery of the data on any one failed disk, allowing a new copy of this data to be reconstructed and written onto the replacement disk. Once this recovery is complete, the database is considered to be healed. In today's terminology this would be labeled a RAID-2 subsystem. However, these units shipped before the term RAID was coined. The DataVault was an example of unusual industrial design. Instead of the usual rectilinear box, the cabinet had a gentle curve that made it look like an information desk or a bartender's station.
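The 32-data-bit/7-check-bit arithmetic is classic Hamming SEC-DED: six check bits locate any single flipped bit among 38 positions, and a seventh overall parity bit permits double errors to be detected. The sketch below uses an illustrative bit layout (not the DataVault's actual one): it encodes a word, flips one bit as if a drive had failed, and recovers the data.

```python
from functools import reduce
from operator import xor

PARITY_POS = (1, 2, 4, 8, 16, 32)   # Hamming check-bit positions (1-based)

def encode(data: int) -> list:
    """39-bit SEC-DED codeword: 32 data bits, 6 Hamming checks, 1 overall parity."""
    code = [0] * 39                  # code[0]: overall parity; 1..38: Hamming code
    j = 0
    for pos in range(1, 39):
        if pos not in PARITY_POS:    # non-power-of-two positions hold data bits
            code[pos] = (data >> j) & 1
            j += 1
    for p in PARITY_POS:             # check p covers every position with bit p set
        code[p] = reduce(xor, (code[i] for i in range(1, 39) if i & p and i != p))
    code[0] = reduce(xor, code)      # overall parity, for double-error detection
    return code

def decode(code: list) -> int:
    """Correct any single flipped bit, then extract the 32 data bits."""
    syndrome = reduce(xor, (i for i in range(1, 39) if code[i]), 0)
    if syndrome:                     # nonzero syndrome names the flipped position
        code[syndrome] ^= 1
    bits = [code[i] for i in range(1, 39) if i not in PARITY_POS]
    return sum(b << j for j, b in enumerate(bits))

word = 0xDEADBEEF
cw = encode(word)
cw[17] ^= 1                          # simulate one failed drive's bit
assert decode(cw) == word
print(hex(decode(encode(word))))     # 0xdeadbeef
```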
https://en.wikipedia.org/wiki/Layered%20Service%20Provider
Layered Service Provider (LSP) is a deprecated feature of the Microsoft Windows Winsock 2 Service Provider Interface (SPI). A Layered Service Provider is a DLL that uses Winsock APIs to attempt to insert itself into the TCP/IP protocol stack. Once in the stack, a Layered Service Provider can intercept and modify inbound and outbound Internet traffic. It allows processing of all the TCP/IP traffic taking place between the Internet and the applications that are accessing the Internet (such as a web browser, the email client, etc.). For example, it could be used by malware to redirect web browsers to rogue websites, or to block access to sites like Windows Update. Alternatively, a computer security program could scan network traffic for viruses or other threats. The Winsock Service Provider Interface (SPI) API provides a mechanism for layering providers on top of each other. Winsock LSPs are available for a range of useful purposes, including parental controls and Web content filtering. The parental controls web filter in Windows Vista is an LSP. The layering order of all providers is kept in the Winsock Catalog. Details Unlike the well-known Winsock 2 API, which is covered by numerous books, documentation, and samples, the Winsock 2 SPI is relatively unexplored. The Winsock 2 SPI is implemented by network transport service providers and namespace resolution service providers. The Winsock 2 SPI can be used to extend an existing transport service provider by implementing a Layered Service Provider. For example, quality of service (QoS) on Windows 98 and Windows 2000 is implemented as an LSP over the TCP/IP protocol stack. Another use for LSPs would be to develop specialized URL filtering software to prevent Web browsers from accessing certain sites, regardless of the browser installed on a desktop. The Winsock 2 SPI allows software developers to create two different types of service providers—transport and namespace. Transport providers (commonly referred to as proto
https://en.wikipedia.org/wiki/Chromosome%20conformation%20capture
Chromosome conformation capture techniques (often abbreviated to 3C technologies or 3C-based methods) are a set of molecular biology methods used to analyze the spatial organization of chromatin in a cell. These methods quantify the number of interactions between genomic loci that are nearby in 3-D space, but may be separated by many nucleotides in the linear genome. Such interactions may result from biological functions, such as promoter-enhancer interactions, or from random polymer looping, where undirected physical motion of chromatin causes loci to collide. Interaction frequencies may be analyzed directly, or they may be converted to distances and used to reconstruct 3-D structures. The chief difference between 3C-based methods is their scope. For example, when using PCR to detect interaction in a 3C experiment, the interactions between two specific fragments are quantified. In contrast, Hi-C quantifies interactions between all possible pairs of fragments simultaneously. Deep sequencing of material produced by 3C also produces genome-wide interaction maps. History Historically, microscopy was the primary method of investigating nuclear organization, which can be dated back to 1590. In 1879, Walther Flemming coined the term chromatin. In 1883, August Weismann connected chromatin with heredity. In 1884, Albrecht Kossel discovered histones. In 1888, Sutton and Boveri proposed the theory of continuity of chromatin during the cell cycle. In 1889, Wilhelm von Waldeyer created the term "chromosome". In 1928, Emil Heitz coined the terms heterochromatin and euchromatin. In 1942, Conrad Waddington postulated the epigenetic landscapes. In 1948, Rollin Hotchkiss discovered DNA methylation. In 1953, Watson and Crick reported the double helix structure of DNA based on Rosalind Franklin's X-ray diffraction images. In 1961, Mary Lyon postulated the principle of X-inactivation. In 1973/1974, chromatin fiber was discovered. In 1975, Pierre Chambon coined the te
https://en.wikipedia.org/wiki/Trilaminar%20embryonic%20disc
A trilaminar embryonic disc (or trilaminary blastoderm, or trilaminar germ disk) is an early stage in the development of triploblastic organisms, which include humans and many other animals. It is the next stage from the earlier bilaminar embryonic disc. It is an embryo which exists as three different germ layers – the ectoderm, the mesoderm and the endoderm. These layers are arranged on top of each other, giving rise to the name trilaminar, or "three-layered". The mesoderm is segmented further into the paraxial, intermediate and the lateral plate mesoderm. These three layers arise early in the third week (during gastrulation) from the epiblast (a portion of the mammalian inner cell mass).
https://en.wikipedia.org/wiki/Levator%20labii%20superioris
The levator labii superioris (pl. levatores labii superioris, also called quadratus labii superioris, pl. quadrati labii superioris) is a muscle of the human body used in facial expression. It is a broad sheet, the origin of which extends from the side of the nose to the zygomatic bone. Structure Its medial fibers form the angular head (also known as the levator labii superioris alaeque nasi muscle), which arises by a pointed extremity from the upper part of the frontal process of the maxilla and, passing obliquely downward and lateralward, divides into two slips. One of these is inserted into the greater alar cartilage and skin of the nose; the other is prolonged into the lateral part of the upper lip, blending with the infraorbital head and with the orbicularis oris. The intermediate portion or infraorbital head arises from the lower margin of the orbit immediately above the infraorbital foramen, some of its fibers being attached to the maxilla, others to the zygomatic bone. Its fibers converge, to be inserted into the muscular substance of the upper lip between the angular head and the levator anguli oris. The lateral fibers, forming the zygomatic head (also known as the zygomaticus minor muscle), arise from the malar surface of the zygomatic bone immediately behind the zygomaticomaxillary suture and pass downward and medialward to the upper lip. Function Its main function is to elevate the upper lip. See also Levator labii superioris alaeque nasi Additional images
https://en.wikipedia.org/wiki/Google%20Wave
Google Wave, later known as Apache Wave, is a discontinued software framework for real-time collaborative online editing. Originally developed by Google and announced on May 28, 2009, it was renamed to Apache Wave when the project was adopted by the Apache Software Foundation as an incubator project in 2010. Wave is a web-based computing platform and communications protocol designed to merge key features of communications media, such as email, instant messaging, wikis, and social networking. Communications using the system can be synchronous or asynchronous. Software extensions provide contextual spelling and grammar checking, automated language translation and other features. Initially released only to developers, a preview release of Google Wave was extended to 100,000 users in September 2009, each allowed to invite additional users. Google accepted most requests submitted starting November 29, 2009, soon after the September extended release of the technical preview. On May 19, 2010, it was released to the general public. On August 4, 2010, Google announced the suspension of stand-alone Wave development and the intent of maintaining the web site at least for the remainder of the year, and on November 22, 2011, announced that existing Waves would become read-only in January 2012, and all Waves would be deleted in April 2012. Development was handed over to the Apache Software Foundation which started to develop a server-based product called Wave in a Box. Apache Wave never reached a full release and was discontinued on January 15, 2018. History Origin of name The science fiction television series Firefly provided the inspiration for the project's name. In the series, a wave is an electronic communication, often consisting of a video call or video message. During the developer preview, a number of references were made to the series, such as Lars Rasmussen replying to a message with "shiny", a word used in the series to mean cool or good, and the crash message o
https://en.wikipedia.org/wiki/Merimepodib
Merimepodib (VX-497) is a drug which acts as an inhibitor of the enzyme inosine monophosphate dehydrogenase, which is required for the synthesis of nucleotide bases containing guanine. This consequently inhibits synthesis of DNA and RNA, and results in antiviral and immunosuppressive effects. It progressed as far as Phase 2b human clinical trials against Hepatitis C but showed only modest benefits in comparison to existing treatments; however, it continues to be researched, and it also shows activity against other viral diseases such as Zika virus and foot-and-mouth disease virus. Merimepodib was investigated in combination with remdesivir in a phase 2 clinical trial in the U.S. as a potential treatment of COVID-19 by ViralClear Pharmaceuticals. The trial was stopped in October 2020, and the company announced in a news release that it was "unlikely that the trial would meet its primary safety endpoints", and that it "does not intend to further develop merimepodib".
https://en.wikipedia.org/wiki/Vital%20heat
Vital heat, also called innate or natural heat, or calidum innatum, is a term in Ancient Greek medicine and philosophy that has generally referred to the heat produced within the body, usually the heat produced by the heart and the circulatory system. Vital heat was a somewhat controversial subject because it was formerly believed that heat was acquired from an outside source such as the element of fire. Origin of concept It was previously accepted that heat was absorbed from external sources; the concept of vital heat, however, arose more or less incidentally from the physiological observation associating cold with the dead and heat with the living. "For the concept of vital heat we may – somewhat arbitrarily – take our starting point in Parmenides. His correlation of dead with cold, alive with warm, may not have been primarily intended as a contribution to physiology, yet the physiological significance of this thought was perceived by his successors; witness Empedocles, who taught 'sleep comes about when the heat of the blood is cooled to the proper degree, death when it becomes altogether cold'". Aristotle would eventually modify this doctrine, stating that "sleep is a temporary overpowering of the inner heat by other factors in the body, death its final extinction." Vital heat and ancient medicine According to Ancient Greek physicians, vital heat was produced by the heart, maintained by the pneuma (air, breath, spirit or soul), and circulated throughout the body by blood vessels, which were thought to be intact tubes using blood to transmit heat. Aristotle supported this argument by showing that when the heart is made cold compared to other organs, the individual dies. He believed that the heat produced in the heart caused blood to react in a similar way to boiling, expanding out through the blood vessels with every beat. This extreme heat, according to him, can lead to a self-consuming flame if it is not cooled by air from the lungs. Galen wrot
https://en.wikipedia.org/wiki/Linear%20parameter-varying%20control
Linear parameter-varying control (LPV control) deals with the control of linear parameter-varying systems, a class of nonlinear systems which can be modelled as parametrized linear systems whose parameters change with their state. Gain scheduling In designing feedback controllers for dynamical systems, a variety of modern multivariable controllers are used. In general, these controllers are often designed at various operating points using linearized models of the system dynamics and are scheduled as a function of a parameter or parameters for operation at intermediate conditions. Gain scheduling is an approach for the control of non-linear systems that uses a family of linear controllers, each of which provides satisfactory control for a different operating point of the system. One or more observable variables, called the scheduling variables, are used to determine the current operating region of the system and to enable the appropriate linear controller. For example, in the case of aircraft control, a set of controllers is designed at different gridded locations of corresponding parameters such as angle of attack (AoA), Mach number, dynamic pressure, CG, etc. In brief, gain scheduling is a control design approach that constructs a nonlinear controller for a nonlinear plant by patching together a collection of linear controllers. These linear controllers are blended in real time via switching or interpolation. Scheduling multivariable controllers can be a very tedious and time-consuming task. A newer paradigm is that of linear parameter-varying (LPV) techniques, which synthesize an automatically scheduled multivariable controller. Drawbacks of classical gain scheduling An important drawback of the classical gain scheduling approach is that adequate performance, and in some cases even stability, is not guaranteed at operating conditions other than the design points. Scheduling multivariable controllers is often a tedious and time-consuming task, and this holds true especially in the field of aerospace control where the p
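A minimal numeric sketch of the classical gain-scheduling idea described above: linear controller gains designed at a few operating points are blended by interpolation at run time. The scheduling variable, design points, and PI gains below are hypothetical, chosen only for illustration.

```python
import numpy as np

# Hypothetical PI gains designed at three operating points of a plant,
# scheduled on dynamic pressure q_bar (kPa) -- illustrative values only.
q_design  = np.array([5.0, 20.0, 50.0])
Kp_design = np.array([2.0,  1.2,  0.6])
Ki_design = np.array([0.8,  0.5,  0.2])

def scheduled_gains(q_bar):
    """Blend the point designs by linear interpolation (one common choice)."""
    Kp = float(np.interp(q_bar, q_design, Kp_design))
    Ki = float(np.interp(q_bar, q_design, Ki_design))
    return Kp, Ki

# Between design points the controller runs with interpolated gains:
print(scheduled_gains(12.0))   # approximately (1.627, 0.660)
```

Note that this sketch shows only the scheduling mechanism; it says nothing about performance or stability between the design points, which is exactly the drawback of classical gain scheduling discussed above.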
https://en.wikipedia.org/wiki/Cell%20microprocessor%20implementations
Cell microprocessors are multi-core processors that use cellular architecture for high performance distributed computing. The first commercial Cell microprocessor, the Cell BE, was designed for the Sony PlayStation 3. IBM designed the PowerXCell 8i for use in the Roadrunner supercomputer. Implementation First edition Cell on 90 nm CMOS IBM has published information concerning two different versions of Cell in this process, an early engineering sample designated DD1, and an enhanced version designated DD2 intended for production. The main enhancement in DD2 was a small lengthening of the die to accommodate a larger PPE core, which is reported to "contain more SIMD/vector execution resources". Some preliminary information released by IBM references the DD1 variant. As a result, some early journalistic accounts of the Cell's capabilities now differ from production hardware. Cell floorplan PowerPoint material accompanying an STI presentation given by Dr Peter Hofstee includes a photograph of the DD2 Cell die overdrawn with functional unit boundaries, captioned by name, revealing the breakdown of silicon area by functional unit as follows: SPE floorplan Additional details concerning the internal SPE implementation have been disclosed by IBM engineers, including Peter Hofstee, IBM's chief architect of the synergistic processing element, in a scholarly IEEE publication. This document includes a photograph of the 2.54 mm × 5.81 mm SPE, as implemented in 90-nm SOI. In this technology, the SPE contains 21 million transistors, of which 14 million are contained in arrays (a term presumably designating register files and the local store) and 7 million are logic. This photograph is overdrawn with functional unit boundaries, captioned by name, revealing the breakdown of silicon area by functional unit as follows: Understanding the dispatch pipes is important for writing efficient code. In the SPU architecture, two instructions c
https://en.wikipedia.org/wiki/Mikro%2767
Mikro'67 is a Bulgarian manufacturing company based in Razgrad that produces and commercialises basketball backboards, outdoor recreation goods, toys, and scale model vehicles (mainly cars and trucks). In 1990 there were 96 toy manufacturers in Bulgaria, organised as producers' co-operative societies and factories. Thirteen of them formed the State Economic Group "ДСО МЛАДОСТ" ("DSO Mladost"). Nowadays only one toy manufacturer is left, renamed from "МИР" ("MIR") in the communist era to "МИКРО" ("MIKRO"), a.k.a. "МИКРО'67" ("MIKRO'67"), in the post-communist era. Nowadays both the Mikro and Mikro'67 names are used interchangeably. The factory In 1952, in one of the workshops of the TPK (producers' co-operative society) Metalik, a team of workers was set apart for toy manufacturing. With great effort and ambition the first toy was made by hand - a wind-up chain tractor. The production list grew - trucks, dumpers, diggers, a cannon, a car, a rocket and more. The volume of production and the number of workers were growing; these were factors in the creation of a new independent factory. On January 1, 1967, an independent producers' co-operative society for toy manufacturing was established, with Petar Petrov as chairman. A question then arose - how was the new manufacturer to be named? Of the many opinions and suggestions, the most fitting was "Peace", because peace means joy and a carefree childhood for children all over the world. In 1971, the workers moved into a new building. The introduction of new technologies then began. Typical for that time were battery-operated toys - machine guns, telephones, SAU tanks and more. The range of toys produced widened. On July 1, 1971, the name was changed from "TPK Mir" to "Mir factory", and the enterprise became a part of DSO Mladost, part of the Committee for Youth and Sport and later part of the Ministry of Light Industry. The adoption of a new series of Tonka (USA) toys began. The production list grew to 40 different toys. The unifi
https://en.wikipedia.org/wiki/Speech%20Pathology%20Australia
Speech Pathology Australia (SPA) is the national peak body for the speech pathology profession in Australia. History of the association Established in 1949, SPA began as the Australian College of Speech Therapists, set up to regulate and maintain the qualifications and standards of the profession. The Australian Branch of the British Medical Association "granted the Australian College of Speech Therapists full professional recognition as the examining, qualifying and representative body for speech therapy within the Commonwealth". The new organisation combined the Victorian Council of Speech Therapy, the Australian Association of Speech Therapists (New South Wales), the South Australian Council of Speech Science and Speech Therapy, and the Council for Speech Therapy (Western Australia), and granted members the right to practice in the United Kingdom. In 1974–75, the organisation became the Australian Association of Speech & Hearing. The association no longer conducted examinations nor granted licentiates to graduates, a responsibility that was taken over by tertiary institutions. In 1996 the organisation became the Speech Pathology Association of Australia Ltd and adopted the public name of Speech Pathology Australia. Speech therapy in Australia and Elinor Wray Sydney-born Elinor Wray undertook speech therapy training at the Central School of Speech and Drama and St Thomas' Hospital in London, as well as observing speech therapy practice at St Bartholomew's Hospital and King's College Hospital. She then spent three months observing at the London County Council Stammering Centres before returning to Sydney in 1929 to establish the first Australian speech therapy service. Appointed in an honorary capacity at the Royal Alexandra Hospital for Children at Camperdown, Sydney (now located at Westmead) on the recommendation of surgeon (and later president of the hospital) Sir Robert Wade in 1931, Wray voluntarily conducted three clinics weekly for the next seve
https://en.wikipedia.org/wiki/Evansia%20%28journal%29
Evansia is a quarterly, peer-reviewed scientific journal, publishing research on issues in biology and environmental preservation related to lichenology and bryology, primarily in North America. It is published quarterly by the American Bryological and Lichenological Society (ABLS) and serves as the information bulletin of the ABLS. Articles are frequently popular or semi-technical rather than technical and intended for both amateurs and professionals. There are reports on local flora and presentations of techniques for studying and curating lichens, bryophytes, and hepatics. The ABLS named the journal in honor of Alexander William Evans.
https://en.wikipedia.org/wiki/Android-x86
Android-x86 was an open source project that made an unofficial port of the Android mobile operating system developed by the Open Handset Alliance to run on devices powered by x86 processors, rather than RISC-based ARM chips. Developers Chih-Wei Huang and Yi Sun originated the project in 2009. The project began as a series of patches to the Android source code to enable Android to run on various netbooks, tablets and ultra-mobile PCs. Overview The OS is based on the Android Open Source Project (AOSP) with some modifications and improvements. Some components are developed by the project to allow it to run on PC architecture. For instance, some low-level components are replaced to better suit the platform, such as the kernel and HALs. The OS enables OpenGL ES hardware acceleration via Mesa if supported GPUs are detected, including Intel GMA, AMD's Radeon, Nvidia's chipsets (Nouveau), VMware and QEMU. Without a supported GPU, the OS can run in non-accelerated mode via software rendering. Since release 7.1, the software renderer has been implemented via the SwiftShader project. Like a normal Linux distribution, the project releases pre-built ISO images which can run under live mode or be installed to a hard disk on the target system. Since release 4.4-r2, the project has also released an efi_img, which can be used to create a live USB that boots on UEFI systems. As of release 4.4-r4, UEFI support was merged into the ISO images and efi_img was marked as deprecated. Apart from AOSP, the following (incomplete) list of components were developed from scratch or derived from other open source projects to form the entire Android-x86 codebase: Kernel Installer drm_gralloc and gbm_gralloc Mesa SwiftShader Audio Camera GPS Lights Radio Interface Layer Sensors Further components may be added in updated versions. As of August 2022, the Android-x86 (Q) and (R) branches are source-code-only releases. Related projects Project Celadon A related project, Cela
https://en.wikipedia.org/wiki/Euler%20characteristic
In mathematics, and more specifically in algebraic topology and polyhedral combinatorics, the Euler characteristic (or Euler number, or Euler–Poincaré characteristic) is a topological invariant, a number that describes a topological space's shape or structure regardless of the way it is bent. It is commonly denoted by χ (Greek lower-case letter chi). The Euler characteristic was originally defined for polyhedra and used to prove various theorems about them, including the classification of the Platonic solids. It was stated for Platonic solids in 1537 in an unpublished manuscript by Francesco Maurolico. Leonhard Euler, for whom the concept is named, introduced it for convex polyhedra more generally but failed to rigorously prove that it is an invariant. In modern mathematics, the Euler characteristic arises from homology and, more abstractly, homological algebra. Polyhedra The Euler characteristic χ was classically defined for the surfaces of polyhedra, according to the formula χ = V − E + F, where V, E, and F are respectively the numbers of vertices (corners), edges and faces in the given polyhedron. Any convex polyhedron's surface has Euler characteristic χ = V − E + F = 2. This equation, stated by Euler in 1758, is known as Euler's polyhedron formula. It corresponds to the Euler characteristic of the sphere (i.e. χ = 2), and applies identically to spherical polyhedra. An illustration of the formula on all Platonic polyhedra is given below. The surfaces of nonconvex polyhedra can have various Euler characteristics. For regular polyhedra, Arthur Cayley derived a modified form of Euler's formula using the density D, vertex figure density d_v and face density d_f: d_v V − E + d_f F = 2D. This version holds both for convex polyhedra (where the densities are all 1) and the non-convex Kepler–Poinsot polyhedra. Projective polyhedra all have Euler characteristic 1, like the real projective plane, while the surfaces of toroidal polyhedra all have Euler characteristic 0, like the torus. Plane graphs The Euler characteristic can be define
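A quick numerical check of Euler's polyhedron formula on the five Platonic solids (the vertex, edge, and face counts below are the standard ones):

```python
# chi = V - E + F = 2 for the surface of each convex polyhedron.
solids = {
    "tetrahedron":  (4, 6, 4),
    "cube":         (8, 12, 6),
    "octahedron":   (6, 12, 8),
    "dodecahedron": (20, 30, 12),
    "icosahedron":  (12, 30, 20),
}
for name, (V, E, F) in solids.items():
    chi = V - E + F
    assert chi == 2, name
    print(f"{name}: {V} - {E} + {F} = {chi}")
```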
https://en.wikipedia.org/wiki/Radial%20tuberosity
Beneath the neck of the radius, on the medial side, is an eminence, the radial tuberosity; its surface is divided into: a posterior, rough portion, for the insertion of the tendon of the biceps brachii. an anterior, smooth portion, on which a bursa is interposed between the tendon and the bone. Ligaments that support the elbow joint also attach to the radial tuberosity.
https://en.wikipedia.org/wiki/Ramanujam%E2%80%93Samuel%20theorem
In algebraic geometry, the Ramanujam–Samuel theorem gives conditions for a divisor of a local ring to be principal. It was introduced independently by Samuel, in answer to a question of Grothendieck, and by C. P. Ramanujam, in an appendix to a paper by Seshadri; it was later generalized by Grothendieck. Statement Grothendieck's version of the Ramanujam–Samuel theorem is as follows. Suppose that A is a local Noetherian ring with maximal ideal m, whose completion is integral and integrally closed, and ρ is a local homomorphism from A to a local Noetherian ring B of larger dimension such that B is formally smooth over A and the residue field of B is finite over that of A. Then a cycle of codimension 1 in Spec(B) that is principal at the point mB is principal.
https://en.wikipedia.org/wiki/Mass%20amateurization
Mass amateurization refers to the capabilities that new forms of media have given to non-professionals and the ways in which those non-professionals have applied those capabilities to solve problems (e.g. create and distribute content) that compete with the solutions offered by larger, professional institutions. Mass amateurization is most often associated with Web 2.0 technologies. These technologies include the rise of blogs and citizen journalism, photo and video-sharing services such as Flickr and YouTube, user-generated wikis like Wikipedia, and distributed accommodation services such as Airbnb. While the social web is not the only technology responsible for the rise of mass amateurization, Clay Shirky claims Web 2.0 has allowed amateurs to undertake increasingly complex tasks, resulting in accomplishments that would seem daunting within the traditional institutional model. In addition to whole websites and applications, Web 2.0 has also birthed a variety of digital tools that facilitate organization and problem solving on a large scale. These tools include tags, trackbacks, and hashtags. These new forms of media became widely available during the first decade of the 21st century due in part to the fall in the transaction costs of creating and distributing media. Mass amateurization is a social, cumulative and collaborative activity, in which ideas flow back up the pipeline from consumers, who also share them among themselves. There is no institutional hierarchy in mass amateurization. There is only an informal group of collaborators working to solve a problem. Due to mass amateurization, amateurs are able to collaborate without interference from the inherent obstacles associated with institutions. These obstacles include the costs that an institution incurs while educating, training, directing, coaching, advising, and organizing its members. Background Mass amateurization was first popularized by Clay Shirky in his 2008 book, Here Comes Everybody: The Power of Organizing Without Organizations.
https://en.wikipedia.org/wiki/Race%20condition
A race condition or race hazard is the condition of an electronics, software, or other system where the system's substantive behavior is dependent on the sequence or timing of other uncontrollable events. It becomes a bug when one or more of the possible behaviors is undesirable. The term race condition was already in use by 1954, for example in David A. Huffman's doctoral thesis "The synthesis of sequential switching circuits". Race conditions can occur especially in logic circuits, multithreaded, or distributed software programs. In electronics A typical example of a race condition may occur when a logic gate combines signals that have traveled along different paths from the same source. The inputs to the gate can change at slightly different times in response to a change in the source signal. The output may, for a brief period, change to an unwanted state before settling back to the designed state. Certain systems can tolerate such glitches but if this output functions as a clock signal for further systems that contain memory, for example, the system can rapidly depart from its designed behaviour (in effect, the temporary glitch becomes a permanent glitch). Consider, for example, a two-input AND gate fed with the following logic: a logic signal A on one input and its negation, ¬A (the ¬ is a Boolean negation), on another input in theory never output a true value: A ∧ ¬A = 0. If, however, changes in the value of A take longer to propagate to the second input than to the first when A changes from false to true, then a brief period will ensue during which both inputs are true, and so the gate's output will also be true. A practical example of a race condition can occur when logic circuitry is used to detect certain outputs of a counter. If all the bits of the counter do not change exactly simultaneously, there will be intermediate patterns that can trigger false matches. Critical and non-critical forms A critical race condition occurs when the order in which internal variables
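A minimal sketch of a software race of the kind described above: two threads perform a non-atomic read-modify-write on a shared counter, so the final value depends on how the threads happen to be scheduled (the exact number of lost updates varies from run to run):

```python
import threading

counter = 0

def worker(n):
    global counter
    for _ in range(n):
        tmp = counter        # read...
        counter = tmp + 1    # ...then write: another thread may run in between

threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Typically prints less than 200000 because some increments were lost.
# Guarding the read-modify-write with a threading.Lock removes the race.
print(counter)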
https://en.wikipedia.org/wiki/Shlaer%E2%80%93Mellor%20method
The Shlaer–Mellor method, also known as object-oriented systems analysis (OOSA) or object-oriented analysis (OOA) is an object-oriented software development methodology introduced by Sally Shlaer and Stephen Mellor in 1988. The method makes the documented analysis so precise that it is possible to implement the analysis model directly by translation to the target architecture, rather than by elaborating model changes through a series of more platform-specific models. In the new millennium the Shlaer–Mellor method has migrated to the UML notation, becoming Executable UML. Overview The Shlaer–Mellor method is one of a number of software development methodologies which arrived in the late 1980s. Most familiar were object-oriented analysis and design (OOAD) by Grady Booch, object modeling technique (OMT) by James Rumbaugh, object-oriented software engineering by Ivar Jacobson and object-oriented analysis (OOA) by Shlaer and Mellor. These methods had adopted a new object-oriented paradigm to overcome the established weaknesses in the existing structured analysis and structured design (SASD) methods of the 1960s and 1970s. Of these well-known problems, Shlaer and Mellor chose to address: The complexity of designs generated through the use of structured analysis and structured design (SASD) methods. The problem of maintaining analysis and design documentation over time. Before publication of their second book in 1991 Shlaer and Mellor had stopped naming their method "Object-Oriented Systems Analysis" in favor of just "Object-Oriented Analysis". The method started focusing on the concept of Recursive Design (RD), which enabled the automated translation aspect of the method. What makes Shlaer–Mellor unique among the object-oriented methods is: the degree to which object-oriented semantic decomposition is taken, the precision of the Shlaer–Mellor Notation used to express the analysis, and the defined behavior of that analysis model at run-time. The general solut
https://en.wikipedia.org/wiki/Mattel%20Aquarius
Aquarius is a home computer designed by Radofin and released by Mattel Electronics in 1983. Based on the Zilog Z80 microprocessor, the system has a rubber chiclet keyboard, 4K of RAM, and a subset of Microsoft BASIC in ROM. It connects to a television set for audiovisual output, and uses a cassette tape recorder for secondary data storage. A limited number of peripherals, such as a 40-column thermal printer, a 4-color printer/plotter, and a 300 baud modem, were released. The Aquarius was discontinued in October 1983, only a few months after it was launched. Development Looking to compete in the home computer market, Mattel Electronics turned to Radofin, the Hong Kong based manufacturer of their Intellivision consoles. Radofin had designed two computer systems. Internally they were known as "Checkers" and the more sophisticated "Chess". Mattel contracted for these to become the Aquarius and Aquarius II, respectively. Aquarius was announced in 1982 and finally released in June 1983, at a price of $160. Production ceased four months later because of poor sales. Mattel paid Radofin to take back the marketing rights. Four other companies (CEZAR Industries, CRIMAC, New Era Incentives, and Bentley Industries) also marketed the unit and accessories. The Aquarius was often bundled with the Mini-Expander peripheral, which added game pads, an additional cartridge port for memory expansion, and the General Instrument AY-3-8910 sound chip. Other peripherals were the data recorder, the 40-column thermal printer, and 4K and 16K RAM carts. Less common first party peripherals include a 300 baud cartridge modem, a 32K RAM cart, a 4-color plotter, and the Quick Disk drive. Reception Although less expensive than the TI-99/4A and VIC-20, the Aquarius had comparatively weak graphics and limited memory. Internally, Mattel programmers adopted Bob Del Principe's mock slogan, "Aquarius - a system for the seventies". Of the 32 software titles Mattel announced for the unit, only 21 were released, most of
https://en.wikipedia.org/wiki/Cosmic%20background%20radiation
Cosmic background radiation is electromagnetic radiation that fills all space. The origin of this radiation depends on the region of the spectrum that is observed. One component is the cosmic microwave background. This component consists of redshifted photons that have freely streamed from an epoch when the Universe became transparent for the first time to radiation. Its discovery and detailed observations of its properties are considered one of the major confirmations of the Big Bang. The discovery (by chance in 1965) of the cosmic background radiation suggests that the early universe was dominated by a radiation field, a field of extremely high temperature and pressure. The Sunyaev–Zel'dovich effect shows the phenomenon of cosmic background radiation interacting with electron clouds, distorting the spectrum of the radiation. There is also background radiation in the infrared, x-rays, etc., with different causes; these can sometimes be resolved into individual sources. See cosmic infrared background and X-ray background. See also cosmic neutrino background and extragalactic background light. Timeline of significant events 1896: Charles Édouard Guillaume estimates the "radiation of the stars" to be 5.6 K. 1926: Sir Arthur Eddington estimates the non-thermal radiation of starlight in the galaxy has an effective temperature of 3.2 K. 1930s: Erich Regener calculates that the non-thermal spectrum of cosmic rays in the galaxy has an effective temperature of 2.8 K. 1931: The term microwave first appears in print: "When trials with wavelengths as low as 18 cm were made known, there was undisguised surprise that the problem of the micro-wave had been solved so soon." (Telegraph & Telephone Journal XVII, 179/1) 1938: Walther Nernst re-estimates the cosmic ray temperature as 0.75 K. 1946: The term "microwave" is first used in print in an astronomical context in an article "Microwave Radiation from the Sun and Moon" by Robert Dicke and Robert Beringer. 1946: Ro
https://en.wikipedia.org/wiki/Hypercharge
In particle physics, the hypercharge (a portmanteau of hyperonic and charge) Y of a particle is a quantum number conserved under the strong interaction. The concept of hypercharge provides a single charge operator that accounts for properties of isospin, electric charge, and flavour. The hypercharge is useful to classify hadrons; the similarly named weak hypercharge has an analogous role in the electroweak interaction. Definition Hypercharge is one of two quantum numbers of the SU(3) model of hadrons, alongside isospin I. The isospin alone was sufficient for two quark flavours — namely u and d — whereas presently 6 flavours of quarks are known. SU(3) weight diagrams (see below) are 2 dimensional, with the coordinates referring to two quantum numbers: I3 (also known as Iz), which is the z component of isospin, and Y, which is the hypercharge (defined by strangeness S, charm C, bottomness B′, topness T, and baryon number B). Mathematically, hypercharge is Y = B + S + C + B′ + T. Strong interactions conserve hypercharge (and weak hypercharge), but weak interactions do not. Relation with electric charge and isospin The Gell-Mann–Nishijima formula relates isospin and electric charge: Q = I3 + Y/2, where I3 is the third component of isospin and Q is the particle's charge. Isospin creates multiplets of particles whose average charge is related to the hypercharge by Q̄ = Y/2, since the hypercharge is the same for all members of a multiplet, and the average of the I3 values is 0. These definitions in their original form hold only for the three lightest quarks. SU(3) model in relation to hypercharge The SU(2) model has multiplets characterized by a quantum number J, which is the total angular momentum. Each multiplet consists of substates with equally-spaced values of Jz, forming a symmetric arrangement seen in atomic spectra and isospin. This formalizes the observation that certain strong baryon decays were not observed, leading to the prediction of the mass, strangeness and charge of the Ω− baryon. The SU(3) has supermultipl
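As a worked example of the definitions above, here is a small check of Y = B + S + C + B′ + T and the Gell-Mann–Nishijima relation Q = I3 + Y/2 on the three lightest quarks (the regime where, as noted, the original definitions hold):

```python
# Quantum numbers of the three lightest quarks: baryon number B,
# strangeness S, isospin projection I3, and electric charge Q.
quarks = {
    "up":      dict(B=1/3, S=0,  I3=+1/2, Q=+2/3),
    "down":    dict(B=1/3, S=0,  I3=-1/2, Q=-1/3),
    "strange": dict(B=1/3, S=-1, I3=0,    Q=-1/3),
}
for name, q in quarks.items():
    Y = q["B"] + q["S"]      # C = B' = T = 0 for these flavours
    assert abs(q["I3"] + Y / 2 - q["Q"]) < 1e-12   # Gell-Mann-Nishijima
    print(f"{name}: Y = {Y:+.4f}, Q = I3 + Y/2 = {q['I3'] + Y / 2:+.4f}")
```

For instance, the up quark has Y = 1/3, so Q = 1/2 + 1/6 = 2/3, as expected.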
https://en.wikipedia.org/wiki/Ambio%204
Introduction Ambio 4 was a quadraphonic sound technology commercialised in the early 1970s that could reproduce the ambience or sound information of a room as well as play stereo. Ambiophony was an extension of stereo reproduction intended to enhance the sense of realism, and it could be used with nearly all stereo programme material. The technology was included in receivers, amplifiers and music centres from manufacturers including Philips, Ferguson Electronics and Bang & Olufsen, alongside mono and stereo playback. The electronics behind Ambiophony were based on, or similar to, the Hafler circuit.
https://en.wikipedia.org/wiki/Philippe%20Charlier
Philippe Charlier is a French coroner, forensic pathologist and paleopathologist. Biography Charlier was born in Meaux on 25 June 1977. His father is a doctor, his mother a pharmacist. He made his first dig at the age of 10, when he found a human skull. He studied archaeology and art history at the Michelet Institute and was part of the forensic department at Raymond Poincaré University Hospital. Charlier's work has focused on the remains of Richard the Lionheart, Agnès Sorel, Fulk III, Count of Anjou, Diane de Poitiers, relics of Louis IX scattered in France, false relics of Joan of Arc, and the presumed head of Henry IV. In 2017, he reconfirmed the authenticity of Adolf Hitler's dental remains, the only remains of the Nazi dictator confirmed to have been found.
https://en.wikipedia.org/wiki/Superquadrics
In mathematics, the superquadrics or super-quadrics (also superquadratics) are a family of geometric shapes defined by formulas that resemble those of ellipsoids and other quadrics, except that the squaring operations are replaced by arbitrary powers. They can be seen as the three-dimensional relatives of the superellipses. The term may refer to the solid object or to its surface, depending on the context. The equations below specify the surface; the solid is specified by replacing the equality signs by less-than-or-equal signs. The superquadrics include many shapes that resemble cubes, octahedra, cylinders, lozenges and spindles, with rounded or sharp corners. Because of their flexibility and relative simplicity, they are popular geometric modeling tools, especially in computer graphics. They have also become important geometric primitives widely used in computer vision, robotics, and physical simulation. Some authors, such as Alan Barr, define "superquadrics" as including both the superellipsoids and the supertoroids. In the modern computer vision literature, superquadrics and superellipsoids are used interchangeably, since superellipsoids are the most representative and widely utilized shapes among all the superquadrics. Comprehensive coverage of the geometrical properties of superquadrics and methods of their recovery from range images and point clouds can be found in several computer vision works, and useful tools and algorithms for superquadric visualization, sampling, and recovery have been open-sourced. Formulas Implicit equation The surface of the basic superquadric is given by |x|^r + |y|^s + |z|^t = 1, where r, s, and t are positive real numbers that determine the main features of the superquadric. Namely, when the exponents (taking r = s = t) are: less than 1: a pointy octahedron modified to have concave faces and sharp edges. exactly 1: a regular octahedron. between 1 and 2: an octahedron modified to have convex faces, blunt edges and blunt corners. exactly 2: a sphere. greater than 2: a cube modified to have rounded edges an
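A minimal sketch of the implicit equation above used as an inside-outside test, which is a standard application of the implicit form (the parameter values below are arbitrary examples):

```python
def superquadric(x, y, z, r, s, t):
    """Evaluate |x|^r + |y|^s + |z|^t: < 1 inside, == 1 on the surface, > 1 outside."""
    return abs(x) ** r + abs(y) ** s + abs(z) ** t

# r = s = t = 2 gives the unit sphere: a point at distance 1 lies on the surface.
p = 3 ** -0.5
print(superquadric(p, p, p, 2, 2, 2))                    # 1.0 (on the sphere)

# Large equal exponents approach a cube with rounded edges.
print(superquadric(0.85, 0.85, 0.85, 10, 10, 10) < 1)    # True: inside
```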
https://en.wikipedia.org/wiki/Skew%20%28antenna%29
Skew is a term used in antenna engineering. It is a technique to improve the horizontal radiation pattern of a high-power transmitter station. In a high-power VHF or UHF station, the antenna system is usually constructed to broadcast in four directions, each separated by 90°, so the directivity of the antenna system resembles a four-leaf clover. While settlements within the main lobes receive enough energy, the energy received by the settlements between the main lobes may be 6 dB less. One popular method to solve the problem is to skew the antenna panels symmetrically around the central axis of the mast. Usually a skew of λ/4 gives the desired almost-uniform horizontal radiation pattern. But in cases where more than one RF signal is applied to the antenna system (via a combiner), the improvement in the horizontal radiation pattern may be inadequate for some signals.
https://en.wikipedia.org/wiki/Performance%20Application%20Programming%20Interface
In computer science, Performance Application Programming Interface (PAPI) is a portable interface (in the form of a library) to hardware performance counters on modern microprocessors. It is widely used to collect low-level performance metrics (e.g. instruction counts, clock cycles, cache misses) of computer systems running UNIX/Linux operating systems. PAPI provides both predefined high-level hardware events generalized across popular processors and direct access to the low-level native events of a particular processor. Counter multiplexing and overflow handling are also supported. Operating system support for accessing hardware counters is needed to use PAPI. For example, prior to 2010, a Linux/x86 kernel had to be patched with a performance monitoring counters driver (perfctr) to support PAPI. Since Linux version 2.6.32 and the 2010 PAPI releases, PAPI can leverage the kernel's existing perf subsystem, and thus no longer needs any out-of-tree driver to be functional. Supported operating systems and requirements are listed in the official repository's documentation (INSTALL.txt). See also Performance analysis Further reading A Portable Programming Interface for Performance Evaluation on Modern Processors / International Journal of High Performance Computing Applications, Volume 14, Issue 3, August 2000, Pages 189–204, doi:10.1177/109434200001400303 Dongarra, Jack, et al. "Using PAPI for hardware performance monitoring on Linux systems" // Conference on Linux Clusters: The HPC Revolution. Vol. 5. Linux Clusters Institute, 2001. External links Official site Philip Mucci, Performance Monitoring with PAPI / Dr. Dobb's, June 01, 2005 Development of a PAPI Backend for the Sun Niagara 2 Processor, 2009
https://en.wikipedia.org/wiki/Development%2C%20Growth%20%26%20Differentiation
Development, Growth & Differentiation is a peer-reviewed scientific journal published by Wiley on behalf of the Japanese Society of Developmental Biologists. It was established in 1950 as Embryologia, obtaining its current title in 1969. The editor-in-chief is Masanori Taira (Chuo University). According to the Journal Citation Reports, the journal has a 2021 impact factor of 3.063. Awards Since 2017, the journal has given three annual awards. The "Editor-in-Chief Prize" is awarded to the most cited article of the last three years, while the "Wiley Prize" is awarded to the most downloaded article of the previous year. Since 2011, the "Young Investigator Paper Award" has been awarded to a paper whose first author is a post-doc or graduate student. DGD Awards Editors-in-chief The following persons are or have been editor-in-chief:
https://en.wikipedia.org/wiki/Prodiamine
Prodiamine is a preemergent herbicide of the dinitroaniline class. Prodiamine is used with crops such as soybeans, alfalfa, cotton, and ornamental crops. Prodiamine inhibits the formation of microtubules. Prodiamine was developed by Sandoz AG and marketed beginning in 1987. Prodiamine can be obtained starting from 2,4-dichlorobenzotrifluoride.
https://en.wikipedia.org/wiki/History%20of%20personal%20computers
The history of the personal computer as a mass-market consumer electronic device began with the microcomputer revolution of the 1970s. A personal computer is one intended for interactive individual use, as opposed to a mainframe computer where the end user's requests are filtered through operating staff, or a time-sharing system in which one large processor is shared by many individuals. After the development of the microprocessor, individual personal computers were low enough in cost that they eventually became affordable consumer goods. Early personal computers – generally called microcomputers – were often sold in electronic kit form and in limited numbers, and were of interest mostly to hobbyists and technicians. Etymology There are several competing claims as to the origins of the term "personal computer". Yale Law School librarian Fred Shapiro notes an early published use of the phrase in a 1968 Hewlett-Packard advertisement for a programmable calculator, which they called "The new Hewlett-Packard 9100A personal computer." Other claims include computer pioneer Alan Kay's purported use of the term in a 1972 paper, Whole Earth Catalog publisher Stewart Brand's usage in a 1974 book, MITS co-founder Ed Roberts' usage in 1975, and Byte magazine's May 1976 usage of "[in] the personal computing field" in its first edition. In 1975 Creative Computing defined the personal computer as a "non-(time)shared system containing sufficient processing power and storage capabilities to satisfy the needs of an individual user." Overview The history of the personal computer as a mass-market consumer electronic device effectively began in 1977 with the introduction of microcomputers, although some mainframe and minicomputers had been applied as single-user systems much earlier. A personal computer is one intended for interactive individual use, as opposed to a mainframe computer where the end user's requests are filtered through operating staff, or a time-sharing system in which
https://en.wikipedia.org/wiki/European%20Genetics%20Foundation
The European Genetics Foundation (EGF) is a non-profit organization dedicated to the training of young geneticists active in medicine, to continuing education in genetics/genomics and to the promotion of public understanding of genetics. Its main office is located in Bologna, Italy. Background In 1988 Prof. Giovanni Romeo, President of the European Genetics Foundation (EGF) and professor of Medical Genetics at the University of Bologna, and Prof. Victor A. McKusick together founded the European School of Genetic Medicine (ESGM). Since that time ESGM has taught genetics to postgraduate students (young MDs and PhDs) from some 70 different countries. Most of the courses are presented at ESGM's Main Training Center (MTC) in Bertinoro di Romagna (Italy), and are also available via webcast at authorized Remote Training Centers (RTC) in various countries in Europe and the Mediterranean area (Hybrid Courses). In the Netherlands and Switzerland, medical geneticists must attend at least one ESGM course before admission to their Board examinations. For these reasons, the School has been able to expand and to obtain funding from the European Commission and from other international organizations. Presentation of the Ronzano Project The European School of Genetic Medicine was founded in 1988 and saw rapid success, which necessitated that an administrative body be formed. To this end, the European Genetics Foundation was born in Genoa on 20 November 1995, with the following aims: to run the ESGM, promoting the advanced scientific and professional training of young European geneticists, with particular attention to applications in the field of preventive medicine; to promote public education about genetics discoveries; to organize conferences, courses, international prizes and initiatives aimed at bringing together the scientific and humanistic disciplines. The ESGM began receiving funding from the European Union and from other international organizations including the Eu
https://en.wikipedia.org/wiki/Pyrrolobenzodiazepine
Pyrrolobenzodiazepines (PBDs) are a class of compounds that may have antibiotic or anti-tumor properties. Some dimeric pyrrolobenzodiazepines are used as the cytotoxic drug payloads in antibody-drug conjugates, for example in loncastuximab tesirine. History Anthramycin, the first PBD monomer, was first synthesized in the 1960s.
https://en.wikipedia.org/wiki/Rice%20Institute%20Computer
The Rice Institute Computer, also known as the Rice Computer or R1, was a 54-bit tagged architecture digital computer built during 1958–1961 (partially operational beginning in 1959) on the campus of Rice University, Houston, Texas, United States. Operating as Rice's primary computer until the middle 1960s, the Rice Institute Computer was decommissioned in 1971. The system initially used vacuum tubes and semiconductor diodes for its logic circuits; some later peripherals were built in solid-state emitter-coupled logic. It was designed by Martin H. Graham. A copy of the machine called OSAGE was built and operated at the University of Oklahoma. Memory Memory was implemented using a variety of technologies over the lifetime of the R1: originally a cathode-ray tube ("Williams tube") array, then RCA core memory introduced in 1966, followed by Ampex core memory in 1967. Following those two upgrades, the R1 had reached its full 32K-word capacity, although the original electrostatic memory was soon decommissioned due to falling reliability in its old age. Architecture The R1 had seven memory-mapped general-purpose processor registers, each 54 bits in size, in addition to a constant zero register. For memory addressing, seven 16-bit "B-Registers" were used. The program counter was also held in a writable "B-Register". See the table below for conventions and hardware-enforced usage of these registers. See also List of vacuum-tube computers
https://en.wikipedia.org/wiki/Chamicero%20de%20Perij%C3%A1
Chamicero de Perijá is a small nature reserve in Colombia established in 2014. It is centered on the Colombian side of the Serranía de Perijá mountain range, part of the East Andes. It holds incredible biodiversity and is home to numerous endangered and endemic bird species. Chamicero de Perijá protects an area of cloud forest habitat which is home to several rare bird taxa, including the Perijá thistletail, the Perijá metaltail, the Perijá brush-finch, an endemic subspecies of the rufous antpitta, and the Perijá tapaculo. Biodiversity The reserve is home to the Perijá antpitta, Perijá thistletail, Perijá metaltail, Perijá brush-finch and Perijá hemispingus, all rare bird species endemic to the region. There are other endemic bird species in the region that are more commonly encountered, including the rufous spinetail and the native yellow-breasted brush-finch. Other birds found in the region include the crested quetzal, golden-headed quetzal, barred fruiteater, Andean condor, black-chested buzzard-eagle, plushcap, buff-breasted mountain-tanager, lacrimose mountain-tanager, hook-billed kite, white-rumped hawk and the páramo seedeater. There are many more species in the region that have not yet been documented. Geography The Chamicero de Perijá reserve is located at the northern end of South America's Andes mountain range, as part of the smaller Serranía de Perijá range. The region is covered mainly by high Andean forest and Andean páramo and subpáramo grasslands. The region also has a number of rivers running through it, mainly feeders of the Manaure River. Politics and Founding For decades, the region was inhabited by guerrilla fighters and narcos. In 2006, the Colombian government redoubled efforts to establish control in the area. Then in 2014, ProAves and the Rainforest Trust raised over $180,000 to buy the 1,850 acres to establish the reserve. This is helping to protect it from deforestation, which had picked up pace in th
https://en.wikipedia.org/wiki/Two-way%20finite%20automaton
In computer science, in particular in automata theory, a two-way finite automaton is a finite automaton that is allowed to re-read its input. Two-way deterministic finite automaton A two-way deterministic finite automaton (2DFA) is an abstract machine, a generalized version of the deterministic finite automaton (DFA) which can revisit characters already processed. As in a DFA, there are a finite number of states with transitions between them based on the current character, but each transition is also labelled with a value indicating whether the machine will move its position in the input to the left, right, or stay at the same position. Equivalently, 2DFAs can be seen as read-only Turing machines with no work tape, only a read-only input tape. 2DFAs were introduced in a seminal 1959 paper by Rabin and Scott, who proved them to have equivalent power to one-way DFAs. That is, any formal language which can be recognized by a 2DFA can be recognized by a DFA which only examines and consumes each character in order. Since DFAs are obviously a special case of 2DFAs, this implies that both kinds of machines recognize precisely the class of regular languages. However, the equivalent DFA for a 2DFA may require exponentially many states, making 2DFAs a much more practical representation for algorithms for some common problems. 2DFAs are also equivalent to read-only Turing machines that use only a constant amount of space on their work tape, since any constant amount of information can be incorporated into the finite control state via a product construction (a state for each combination of work tape state and control state). Formal description Formally, a two-way deterministic finite automaton can be described by the following 8-tuple: M = (Q, Σ, ⊢, ⊣, δ, s, t, r), where Q is the finite, non-empty set of states; Σ is the finite, non-empty set of input symbols; ⊢ is the left endmarker; ⊣ is the right endmarker; δ: Q × (Σ ∪ {⊢, ⊣}) → Q × {left, right, stay} is the transition function; s is the start state; t is the end state; and r is the reject state. In addition, the following tw
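The following sketch renders the 8-tuple above as a table-driven simulator. The example machine, made up for illustration, accepts strings over {a, b} whose first and last symbols agree: it remembers the first symbol in its state, sweeps right to the right endmarker, and then steps back to re-read the final symbol, something a one-way DFA would instead have to track entirely in its state.

```python
L_END, R_END = "<", ">"   # endmarkers (written ⊢ and ⊣ in the formal description)

def run_2dfa(delta, start, accept, reject, word):
    tape = [L_END] + list(word) + [R_END]
    state, pos, seen = start, 0, set()
    while state not in (accept, reject):
        if (state, pos) in seen:     # repeated configuration: the 2DFA loops
            return False
        seen.add((state, pos))
        state, move = delta[(state, tape[pos])]
        pos += {"L": -1, "R": +1, "S": 0}[move]
    return state == accept

# Accept words whose first symbol equals their last symbol.
delta = {
    ("q0", L_END): ("scan", "R"),
    ("scan", R_END): ("no", "S"),          # empty word: reject
    ("scan", "a"): ("ra", "R"), ("scan", "b"): ("rb", "R"),
    ("ra", "a"): ("ra", "R"),   ("ra", "b"): ("ra", "R"),
    ("rb", "a"): ("rb", "R"),   ("rb", "b"): ("rb", "R"),
    ("ra", R_END): ("la", "L"), ("rb", R_END): ("lb", "L"),
    ("la", "a"): ("yes", "S"),  ("la", "b"): ("no", "S"),
    ("lb", "b"): ("yes", "S"),  ("lb", "a"): ("no", "S"),
}

print(run_2dfa(delta, "q0", "yes", "no", "abba"))  # True
print(run_2dfa(delta, "q0", "yes", "no", "ab"))    # False
```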
https://en.wikipedia.org/wiki/FM%20broadcasting
FM broadcasting is a method of radio broadcasting that uses frequency modulation (FM) of the radio broadcast carrier wave. Invented in 1933 by American engineer Edwin Armstrong, wide-band FM is used worldwide to transmit high-fidelity sound over broadcast radio. FM broadcasting offers higher fidelity—more accurate reproduction of the original program sound—than other broadcasting techniques, such as AM broadcasting. It is also less susceptible to common forms of interference, having less static and popping sounds than are often heard on AM. Therefore, FM is used for most broadcasts of music and general audio (in the audio spectrum). FM radio stations use the very high frequency range of radio frequencies. Broadcast bands Throughout the world, the FM broadcast band falls within the VHF part of the radio spectrum. Usually 87.5 to 108.0 MHz is used, or some portion of it, with a few exceptions: In the former Soviet republics, and some former Eastern Bloc countries, the older 65.8–74 MHz band is also used. Assigned frequencies are at intervals of 30 kHz. This band, sometimes referred to as the OIRT band, is slowly being phased out. Where the OIRT band is used, the 87.5–108.0 MHz band is referred to as the CCIR band. In Japan, the band 76–95 MHz is used. In Brazil, until the late 2010s, FM broadcast stations only used the 88–108 MHz band, but with the phasing out of analog television, the 76–88 MHz band (old VHF television channels 5 and 6) was allocated to old local MW stations that have moved to FM, in agreement with ANATEL. The frequency of an FM broadcast station (more strictly its assigned nominal center frequency) is usually a multiple of 100 kHz. In most of South Korea, the Americas, the Philippines, and the Caribbean, only odd multiples are used. Some other countries follow this plan because of the import of vehicles, principally from the United States, with radios that can only tune to these frequencies. In some parts of Europe, Greenland, and Africa, only
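A toy check of the channel-spacing convention just described: nominal center frequencies are multiples of 100 kHz, and in regions following the American plan only odd multiples are used (band-edge limits and other regional variations are deliberately ignored here):

```python
def valid_center_frequency(freq_mhz, odd_multiples_only=True):
    """True if freq_mhz sits on the 100 kHz grid (and, optionally, an odd multiple)."""
    tenths = round(freq_mhz * 10)                 # number of 100 kHz steps
    if abs(freq_mhz * 10 - tenths) > 1e-9:        # not on the 100 kHz grid
        return False
    return tenths % 2 == 1 if odd_multiples_only else True

print(valid_center_frequency(99.5))                             # True: odd multiple
print(valid_center_frequency(99.4))                             # False under the odd-multiple plan
print(valid_center_frequency(99.4, odd_multiples_only=False))   # True elsewhere
```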
https://en.wikipedia.org/wiki/Souders%E2%80%93Brown%20equation
The Souders–Brown equation (named after Mott Souders and George Granger Brown) has been a tool for obtaining the maximum allowable vapor velocity in vapor–liquid separation vessels (variously called flash drums, knockout drums, knockout pots, compressor suction drums and compressor inlet drums). It has also been used for the same purpose in designing trayed fractionating columns, trayed absorption columns and other vapor–liquid-contacting columns. A vapor–liquid separator drum is a vertical vessel into which a liquid and vapor mixture (or a flashing liquid) is fed and wherein the liquid is separated by gravity, falls to the bottom of the vessel, and is withdrawn. The vapor travels upward at a design velocity which minimizes the entrainment of any liquid droplets in the vapor as it exits the top of the vessel. Use The diameter of a vapor–liquid separator drum is dictated by the expected volumetric flow rate of vapor and liquid from the drum. The following sizing methodology is based on the assumption that those flow rates are known. Use a vertical pressure vessel with a length–diameter ratio of about 3 to 4, and size the vessel to provide about 5 minutes of liquid inventory between the normal liquid level and the bottom of the vessel (with the normal liquid level being somewhat below the feed inlet). Calculate the maximum allowable vapor velocity in the vessel by using the Souders–Brown equation: v = k √((ρL − ρV) / ρV), where v is the maximum allowable vapor velocity, ρL is the liquid density, ρV is the vapor density, and k is an empirically determined vapor velocity factor. Then the cross-sectional area of the drum can be found from: A = Q̇V / v, where Q̇V is the volumetric vapor flow rate. And the drum diameter is: D = √(4A / π). The drum should have a vapor outlet at the top, liquid outlet at the bottom, and feed inlet at about the half-full level. At the vapor outlet, provide a de-entraining mesh pad within the drum such that the vapor must pass through that mesh before it can leave the drum. Depending upon how much liquid flow is expected, the liquid outlet line should probably have a liquid level control valve. As for the mechanical design of the drum (materials of construction, wall thickness, corrosio
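A minimal sizing sketch following the steps above. The densities and flow rate are arbitrary example inputs, and the k factor is the key engineering judgment; 0.107 m/s is a commonly quoted starting value for vertical drums with a mesh pad, treated here purely as an assumption:

```python
import math

k     = 0.107    # Souders-Brown factor, m/s (assumed; varies with service)
rho_L = 800.0    # liquid density, kg/m^3 (example value)
rho_V = 5.0      # vapor density, kg/m^3 (example value)
Q_V   = 0.7      # volumetric vapor flow, m^3/s (example value)

v_max = k * math.sqrt((rho_L - rho_V) / rho_V)  # max allowable vapor velocity
A     = Q_V / v_max                             # required cross-sectional area
D     = math.sqrt(4.0 * A / math.pi)            # drum diameter

print(f"v_max = {v_max:.2f} m/s, A = {A:.3f} m^2, D = {D:.2f} m")
# v_max = 1.35 m/s, A = 0.519 m^2, D = 0.81 m
```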
https://en.wikipedia.org/wiki/Phrase%20name
In Australian botany, a phrase name is an informal name given to a plant taxon that has not yet been given a formal scientific name. The term was adopted in 1992 by the Australian Herbarium Information Systems Committee. The species phrase name consists of four components — the generic name, "sp." (to indicate it is a species), an identifier (geographical or morphological) and a collector's name and number representing a herbarium specimen vouchering the concept of the new species. For example, the phrase name "Derris sp. Wenlock River (B.Hyland 21082V)" refers to an unnamed species of Derris, the specimen of which was collected by Bernard Hyland on the Wenlock River and whose herbarium number is 21082V. Likewise the phrase name "Dryandra sp. 1 (A.S.George 16647)" refers to an unnamed Dryandra species to which belongs the herbarium specimen numbered 16647 collected by Alex George. The second example is not typical — the identifier is usually something more informative, such as a location or distinctive morphological characteristic. A phrase name is used for an unnamed taxon, which may or may not have already been formally described, but only for as long as no scientific name has been chosen for it; afterwards it may be listed as a synonym,
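Since the phrase-name format is fixed (generic name, "sp.", identifier, and the vouchering collector and number in parentheses), it can be split mechanically; the regular expression below is an illustrative sketch, not an official parser:

```python
import re

# Four components: genus, the literal "sp.", an identifier, and the voucher.
PHRASE_NAME = re.compile(
    r"^(?P<genus>[A-Z][a-z]+) sp\.\s+(?P<identifier>.+?)\s+\((?P<voucher>[^)]+)\)$"
)

m = PHRASE_NAME.match("Derris sp. Wenlock River (B.Hyland 21082V)")
print(m.group("genus"), "|", m.group("identifier"), "|", m.group("voucher"))
# Derris | Wenlock River | B.Hyland 21082V
```

The same pattern also splits the second example above, "Dryandra sp. 1 (A.S.George 16647)", into Dryandra, 1, and A.S.George 16647.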
https://en.wikipedia.org/wiki/Tony%20Hey
Professor Anthony John Grenville Hey (born 17 August 1946) was vice-president of Microsoft Research Connections, a division of Microsoft Research, until his departure in 2014. Education Hey was educated at King Edward's School, Birmingham and the University of Oxford. He graduated with a Bachelor of Arts degree in physics in 1967, and a Doctor of Philosophy in theoretical physics in 1970, supervised by P. K. Kabir. He was a student of Worcester College, Oxford and St John's College, Oxford. Career and research From 1970 through 1972 Hey was a postdoctoral fellow at the California Institute of Technology (Caltech). Moving to Pasadena, California, he worked with Richard Feynman and Murray Gell-Mann, both winners of the Nobel Prize in Physics. He then moved to Geneva, Switzerland and worked as a fellow at CERN (the European organisation for nuclear research) for two years. Hey worked about thirty years as an academic at the University of Southampton, starting in 1974 as a particle physicist. He spent 1978 as a visiting fellow at the Massachusetts Institute of Technology. In 1981 he returned to Caltech as a visiting research professor. There he learned of Carver Mead's work on very-large-scale integration and became interested in applying parallel computing techniques to large-scale scientific simulations. Hey worked with British semiconductor company Inmos on the Transputer project in the 1980s. He switched to computer science in 1985, and in 1986 became professor of computation in the Department of Electronics and Computer Science at Southampton. While there, he was promoted to Head of the School of Electronics and Computer Science in 1994 and Dean of Engineering and Applied Science in 1999. Among his work was "doing research on Unix with tools like LaTeX." In 1990 he was a visiting fellow at the Thomas J. Watson Research Center of IBM Research. He then worked with Jack Dongarra, Rolf Hempel and David Walker to define the Message Passing Interface (MPI), which became a de
https://en.wikipedia.org/wiki/Ipfirewall
ipfirewall or ipfw is a FreeBSD IP stateful firewall, packet filter and traffic accounting facility. Its ruleset logic is similar to that of many other packet filters, except IPFilter. ipfw is authored and maintained by FreeBSD volunteer staff members. Its syntax enables use of sophisticated filtering capabilities and thus enables users to satisfy advanced requirements. It can either be used as a loadable kernel module or incorporated into the kernel; use as a loadable kernel module where possible is highly recommended. ipfw was the built-in firewall of Mac OS X until Mac OS X 10.7 Lion in 2011, when it was replaced with the OpenBSD project's PF. Like FreeBSD, ipfw is open source. It is used in many FreeBSD-based firewall products, including m0n0wall and FreeNAS. A port of an early version of ipfw, used from Linux 1.1, was the first firewall implementation available for Linux, until it was replaced by ipchains. A modern port of ipfw and the dummynet traffic shaper is available for Linux (including a prebuilt package for OpenWrt) and Microsoft Windows. wipfw is a Windows port of an old (2001) version of ipfw. Alternative user interfaces for ipfw See also netfilter/iptables, a Linux-based descendant of ipchains NPF, a NetBSD packet filter PF, another widely deployed BSD firewall solution