https://en.wikipedia.org/wiki/Hindley%E2%80%93Milner%20type%20system
A Hindley–Milner (HM) type system is a classical type system for the lambda calculus with parametric polymorphism. It is also known as Damas–Milner or Damas–Hindley–Milner. It was first described by J. Roger Hindley and later rediscovered by Robin Milner. Luis Damas contributed a close formal analysis and proof of the method in his PhD thesis. Among HM's more notable properties are its completeness and its ability to infer the most general type of a given program without programmer-supplied type annotations or other hints. Algorithm W is an efficient type inference method in practice and has been successfully applied on large code bases, although it has a high theoretical complexity. HM is used chiefly in functional languages. It was first implemented as part of the type system of the programming language ML. Since then, HM has been extended in various ways, most notably with type class constraints like those in Haskell.

Introduction
As a type inference method, Hindley–Milner is able to deduce the types of variables, expressions and functions from programs written in an entirely untyped style. Being scope sensitive, it is not limited to deriving the types from a small portion of source code, but can instead process complete programs or modules. Being able to cope with parametric types, too, it is core to the type systems of many functional programming languages. It was first applied in this manner in the ML programming language. The origin is the type inference algorithm for the simply typed lambda calculus that was devised by Haskell Curry and Robert Feys in 1958. In 1969, J. Roger Hindley extended this work and proved that their algorithm always inferred the most general type. In 1978, Robin Milner, independently of Hindley's work, provided an equivalent algorithm, Algorithm W. In 1982, Luis Damas finally proved that Milner's algorithm is complete and extended it to support systems with polymorphic references.

Monomorphism vs. polymorphism
In the simply t
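The flavor of such inference can be sketched in a few lines. The following toy, Algorithm-W-style checker is our own simplification — no let-generalization, no occurs check — that infers most-general arrow types for a miniature lambda calculus:

```python
# Toy Hindley–Milner-style inference for a tiny lambda calculus.
# Expressions: ('var', name), ('lam', name, body), ('app', fn, arg).
# Types: a string is a type variable; ('->', arg, res) is a function type.
# Simplified on purpose: no let-polymorphism and no occurs check.
import itertools

fresh = (f"t{i}" for i in itertools.count())   # fresh type-variable supply

def prune(t, subst):
    """Follow substitution links until an unbound variable or a type."""
    while isinstance(t, str) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst):
    a, b = prune(a, subst), prune(b, subst)
    if isinstance(a, str):            # a is an unbound type variable
        if a != b:
            subst[a] = b
    elif isinstance(b, str):          # b is an unbound type variable
        subst[b] = a
    else:                             # both are arrow types
        unify(a[1], b[1], subst)
        unify(a[2], b[2], subst)

def infer(expr, env, subst):
    kind = expr[0]
    if kind == 'var':
        return env[expr[1]]
    if kind == 'lam':                 # ('lam', x, body): x gets a fresh var
        tv = next(fresh)
        body = infer(expr[2], {**env, expr[1]: tv}, subst)
        return ('->', tv, body)
    if kind == 'app':                 # ('app', f, arg): unify f with arg -> tv
        f = infer(expr[1], env, subst)
        arg = infer(expr[2], env, subst)
        tv = next(fresh)
        unify(f, ('->', arg, tv), subst)
        return prune(tv, subst)
```

For instance, the identity `λx.x` is inferred as an arrow whose argument and result are the same type variable — the most general type, with no annotations supplied.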
https://en.wikipedia.org/wiki/Dawon%20Kahng
Dawon Kahng (; May 4, 1931 – May 13, 1992) was a Korean-American electrical engineer and inventor, known for his work in solid-state electronics. He is best known for inventing the MOSFET (metal–oxide–semiconductor field-effect transistor, or MOS transistor), along with his colleague Mohamed Atalla, in 1959. Kahng and Atalla developed both the PMOS and NMOS processes for MOSFET semiconductor device fabrication. The MOSFET is the most widely used type of transistor, and the basic element in most modern electronic equipment. Kahng and Atalla later proposed the concept of the MOS integrated circuit, and they did pioneering work on Schottky diodes and nanolayer-base transistors in the early 1960s. Kahng then invented the floating-gate MOSFET (FGMOS) with Simon Min Sze in 1967. Kahng and Sze proposed that FGMOS could be used as floating-gate memory cells for non-volatile memory (NVM) and reprogrammable read-only memory (ROM), which became the basis for EPROM (erasable programmable ROM), EEPROM (electrically erasable programmable ROM) and flash memory technologies. Kahng was inducted into the National Inventors Hall of Fame in 2009. Biography Dawon Kahng was born on May 4, 1931, in Keijō, Chōsen (today Seoul, South Korea). He studied physics at Seoul National University in South Korea, and immigrated to the United States in 1955 to attend Ohio State University, where he received a doctorate in electrical engineering in 1959. He was a researcher at Bell Telephone Laboratories in Murray Hill, New Jersey, and he invented MOSFET (metal–oxide–semiconductor field-effect transistor), which is the basic element in most of today's electronic equipment, with Mohamed Atalla in 1959. They fabricated both PMOS and NMOS devices with a 20µm process. Extending their work on MOS technology, Kahng and Atalla next did pioneering work on hot carrier devices, which used what would later be called a Schottky barrier. The Schottky diode, also known as the Schottky-barrier diode, was theori
https://en.wikipedia.org/wiki/Handel-C
Handel-C is a high-level programming language which targets low-level hardware, most commonly used in the programming of FPGAs. It is a rich subset of C, with non-standard extensions to control hardware instantiation with an emphasis on parallelism. Handel-C is to hardware design what the first high-level programming languages were to programming CPUs. Unlike many other design languages that target a specific architecture, Handel-C can be compiled to a number of design languages and then synthesised to the corresponding hardware. This frees developers to concentrate on the programming task at hand rather than the idiosyncrasies of a specific design language and architecture.

Additional features
The subset of C includes all common C language features necessary to describe complex algorithms. As in many embedded C compilers, floating-point data types were omitted; floating-point arithmetic is instead supported through efficient external libraries.

Parallel programs
To describe parallel behavior, some of the CSP keywords are used, along with the general file structure of Occam. For example:

    par {
        ++c;
        a = d + e;
        b = d + e;
    }

Channels
Channels provide a mechanism for message passing between parallel threads. Channels can be defined as asynchronous or synchronous (with or without an inferred storage element, respectively). A thread writing to a synchronous channel is immediately blocked until the corresponding listening thread is ready to receive the message. Likewise, the receiving thread will block on a read statement until the sending thread executes the next send. Thus they may be used as a means of synchronizing threads.

    par {
        chan int a;    // declare a synchronous channel
        int x;
        // begin sending thread
        seq (i = 0; i < 10; i++) {
            a ! i;     // send the values 0 to 9 sequentially into the channel
        }
        // begin receiving thread
        seq (j = 0; j < 10; j++) {
            a ? x;     // pe
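The synchronous channel described above blocks both sender and receiver until the rendezvous completes. The same semantics can be mimicked in software; the sketch below is our own construction in Python threads, not part of Handel-C:

```python
# An unbuffered rendezvous channel approximating Handel-C's `a ! v` / `a ? x`:
# send() blocks until a receiver has taken the value, and recv() blocks
# until a sender has supplied one.
import threading

class SyncChannel:
    def __init__(self):
        self._item = None
        self._send_ready = threading.Semaphore(0)  # signals "value available"
        self._recv_done = threading.Semaphore(0)   # signals "value consumed"
        self._lock = threading.Lock()              # one sender at a time

    def send(self, value):            # like `a ! value`
        with self._lock:
            self._item = value
            self._send_ready.release()
            self._recv_done.acquire()  # block until the reader took it

    def recv(self):                    # like `a ? x`
        self._send_ready.acquire()     # block until a sender arrives
        value = self._item
        self._recv_done.release()
        return value

# Demo mirroring the Handel-C example: send 0..9 from one thread,
# receive them in another.
received = []
ch = SyncChannel()
sender = threading.Thread(target=lambda: [ch.send(i) for i in range(10)])
sender.start()
for _ in range(10):
    received.append(ch.recv())
sender.join()
```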
https://en.wikipedia.org/wiki/Humanistic%20informatics
Humanistic informatics is one of several names chosen for the study of the relationship between human culture and technology. The term is fairly common in Europe, but is little known in the English-speaking world, though digital humanities (also known as humanities computing) is in many cases roughly equivalent. Humanistic informatics departments were generally started in the 1990s, when universities rarely taught humanities-based approaches to the rapidly developing computerized society. For this reason, the field was quite broadly defined, and included courses in humanities computing, basic introductions to how computers work, historical developments of technology, technology and learning, digital art and literature, and digital culture. Today several departments have declared more specialized areas of research, such as digital arts and culture at the University of Bergen, and socio-cultural communication with and without technology at the University of Aalborg. Digital humanities is a primary topic, and there are several universities in the US and the UK that have digital arts and humanities research and development centers. One aspect of digital humanities expected to grow is the intersection of new digital media and the humanities, particularly in the gaming industry, which has developed both casual and serious games and game design strategies to foster learning in the humanities and other academic disciplines. A key principle in all digital interactive media or games is the storyline: the narrative, quest or goal of the game is primary to both literary works and games. Characters and players go on the quest, and playing the game becomes the narrative. Game design principles, also relevant in literature and the fine arts, include visual literacy and empowering players/learners to align with great artists and writers who believe in the creative process.
https://en.wikipedia.org/wiki/Alain%20A.%20Lewis
Alain A. Lewis (born 1947) is an American mathematician. A student of the mathematical economist Kenneth Arrow, Lewis is credited by the historian of economics Philip Mirowski with making Arrow aware of computational limits to economic agency.

Life
Lewis gained his BA in philosophy, economics and statistics from George Washington University in 1969, and a PhD in applied mathematics from Harvard University in 1979. He was based at Lawrence Livermore Labs from 1978 to 1979, RAND from 1979 to 1982, the University of Singapore from 1981 to 1983, Cornell University from 1983 to 1987 and the University of California, Irvine from 1987.

Works
- 'A Nonstandard Theory of Games. Part I: On the Existence of the Quasi-Kernel and Related Solution Concepts for *Finite Cooperative Games', Harvard University Center on Decision and Conflict in Complex Situations Technical Report no. TR-6, June 1979
- 'A Nonstandard Theory of Games. Part II. On Non-Atomic Representations', Harvard University, Technical Report no. TR-7, June 1979
- 'A Nonstandard Theory of Games. Part III. Noncooperative *Finite Games', Harvard University, Technical Report no. TR-8, June 1979
- 'A Nonstandard Theory of Games. Part IV. Equilibrium Points for Finite Games', Technical Report no. TR-9, June 1979
- 'Arrow's theorem and group decision making on public policy', RAND Papers, 1979
- 'Aspects of fair division', RAND Papers, 1980
- (with Perry Thorndyke and others) 'Improving Training and Performance of Navy Teams: A Design for a Research Program', RAND Reports, 1980
- 'A note on the Lagrangean expression of Nash equilibria', RAND Papers, 1980
- 'On the formal character of plausible reasoning', RAND Papers, 1980
- 'The Use of Utility in Multiattribute Utility Analysis', RAND Papers, 1980
- 'Notes on *finite cooperative games', RAND Papers, 1981
- 'Hyperfinite Von Neumann games', Mathematical Social Sciences, Vol. 9, No. 2 (1985), pp. 189–194
- 'Loeb-measurable solutions to *finite games', Mathematical Social Sciences, Vol. 9, No. 3 (1985), pp. 197–247
- 'On e
https://en.wikipedia.org/wiki/Multisample%20anti-aliasing
Multisample anti-aliasing (MSAA) is a type of spatial anti-aliasing, a technique used in computer graphics to remove jaggies.

Definition
The term generally refers to a special case of supersampling. Initial implementations of full-scene anti-aliasing (FSAA) worked conceptually by simply rendering a scene at a higher resolution, and then downsampling to a lower-resolution output. Most modern GPUs are capable of this form of anti-aliasing, but it greatly taxes resources such as texture bandwidth and fillrate. (If a program is highly TCL-bound or CPU-bound, supersampling can be used without much performance hit.) According to the OpenGL GL_ARB_multisample specification, "multisampling" refers to a specific optimization of supersampling. The specification dictates that the renderer evaluate the fragment program once per pixel, and only "truly" supersample the depth and stencil values. (This is not the same as supersampling, but by the OpenGL 1.5 specification the definition had been updated to include fully supersampling implementations as well.) In graphics literature in general, "multisampling" refers to any special case of supersampling where some components of the final image are not fully supersampled. The lists below refer specifically to the ARB_multisample definition.

Description
In supersample anti-aliasing, multiple locations are sampled within every pixel, and each of those samples is fully rendered and combined with the others to produce the pixel that is ultimately displayed. This is computationally expensive, because the entire rendering process must be repeated for each sample location. It is also inefficient, as aliasing is typically only noticed in some parts of the image, such as the edges, whereas supersampling is performed for every single pixel. In multisample anti-aliasing, if any of the multisample locations in a pixel is covered by the triangle being rendered, a shading computation must be performed for that triangle. However this calc
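The cost difference between the two schemes can be illustrated by counting shading invocations. The toy model below is our own: a half-plane stands in for a triangle edge, and a fixed 4-sample grid stands in for the hardware's sample pattern. Both schemes test every coverage sample, but MSAA runs the (expensive) shading computation once per covered pixel rather than once per covered sample:

```python
# Count shading invocations for supersampling (SSAA) vs multisampling (MSAA)
# over a toy half-plane "triangle". Sample positions and geometry are
# illustrative choices, not any real GPU's pattern.
SAMPLES = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]  # 4x grid

def covered(x, y):
    """Half-plane standing in for a triangle edge."""
    return x + y < 8.0

def count_shades(width, height, msaa):
    shades = 0
    for py in range(height):
        for px in range(width):
            hits = sum(covered(px + sx, py + sy) for sx, sy in SAMPLES)
            if hits == 0:
                continue
            # SSAA shades every covered sample; MSAA shades once per pixel
            # and replicates the result into the covered samples.
            shades += 1 if msaa else hits
    return shades
```

On an 8×8 pixel grid, MSAA shades each of the 36 partially or fully covered pixels once, while SSAA performs 120 shading invocations — the same coverage information at a fraction of the shading cost.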
https://en.wikipedia.org/wiki/Li%E2%80%93Fraumeni%20syndrome
Li–Fraumeni syndrome is a rare, autosomal dominant, hereditary disorder that predisposes carriers to cancer development. It was named after two American physicians, Frederick Pei Li and Joseph F. Fraumeni, Jr., who first recognized the syndrome after reviewing the medical records and death certificates of 648 childhood rhabdomyosarcoma patients. This syndrome is also known as the sarcoma, breast, leukaemia and adrenal gland (SBLA) syndrome. The syndrome is linked to germline mutations of the p53 tumor suppressor gene, which encodes a transcription factor (p53) that normally regulates the cell cycle and prevents genomic mutations. The mutations can be inherited, or can arise early in embryogenesis or in one of the parents' germ cells.

Presentation
Li–Fraumeni syndrome is characterized by early onset of cancer, a wide variety of cancer types, and development of multiple cancers throughout one's life.

Pathology
LFS1: Mutations in TP53
Normal conditions: TP53 is a tumor suppressor gene on chromosome 17 that normally assists in the control of cell division and growth through action on the normal cell cycle. TP53 typically becomes expressed in response to cellular stressors, such as DNA damage, and can halt the cell cycle to assist with either the repair of repairable DNA damage, or can induce apoptosis of a cell with irreparable damage. The repair of "bad" DNA, or the apoptosis of a cell, prevents the proliferation of damaged cells.
Mutant conditions: Mutations of TP53 can inhibit its normal function, and allow cells with damaged DNA to continue to divide. If these DNA mutations are left unchecked, some cells can divide uncontrollably, forming tumors (cancers). Further mutations in the DNA could lead to malignant cells that can travel to, and develop cancer in, different areas of the body. Many individuals with Li–Fraumeni syndrome have been shown to be heterozygous for a TP53 mutation. Recent studies have shown that 60% to 80% of classic LFS families h
https://en.wikipedia.org/wiki/Tropical%20and%20subtropical%20moist%20broadleaf%20forests
Tropical and subtropical moist broadleaf forests (TSMF), also known as tropical moist forest, is a subtropical and tropical forest habitat type defined by the World Wide Fund for Nature.

Description
TSMF is generally found in large, discontinuous patches centered on the equatorial belt and between the Tropic of Cancer and Tropic of Capricorn. TSMF are characterized by low variability in annual temperature and high levels of annual rainfall. Forest composition is dominated by evergreen and semi-deciduous tree species. These forests are home to more species than any other terrestrial ecosystem on Earth: half of the world's species may live in these forests, where a square kilometer may be home to more than 1,000 tree species. These forests are found around the world, particularly in the Indo-Malayan Archipelago, the Amazon Basin, and the African Congo Basin. The perpetually warm, wet climate makes these environments more productive than any other terrestrial environment on Earth and promotes explosive plant growth. A tree here may gain considerable height in just 5 years. From above, the forest appears as an unending sea of green, broken only by occasional, taller "emergent" trees. These towering emergents are the realm of hornbills, toucans, and the harpy eagle. In general, biodiversity is highest in the forest canopy. The canopy can be divided into five layers: overstory canopy with emergent crowns, a medium layer of canopy, lower canopy, shrub level, and finally understory. The canopy is home to many of the forest's animals, including apes and monkeys. Below the canopy, a lower understory hosts snakes and big cats. The forest floor, relatively clear of undergrowth due to the thick canopy above, is prowled by other animals such as gorillas and deer. All levels of these forests contain an unparalleled diversity of invertebrate species, including New Guinea's stick insects and exceptionally large butterflies. Many forests are being cl
https://en.wikipedia.org/wiki/Altova
Altova is a commercial software development company with headquarters in Beverly, MA, United States and Vienna, Austria, that produces integrated XML, JSON, database, UML, and data management software development tools.

Company
Altova was founded in 1992 as an XML development software company. Its software is used by more than 4 million users and more than 100,000 companies globally. The first product was XMLSpy, and around the year 2000, Altova began to develop new tools to augment XMLSpy and expand into new areas of software development. The CEO and president of Altova is Alexander Falk, who has explained that the development of Altova software has occurred through the inclusion of features most requested by the users of previous program incarnations. Falk is also the inventor behind Altova's patents. Altova software attempts to increase the efficiency of program use in order to reduce the amount of time needed for users to learn database software and other tasks such as query execution. Examples of Altova software include the XML editor XMLSpy and MapForce, a data mapping tool. Altova has also added XBRL-capable programs to its XML software line, including development tools. In addition, it has added Web Services Description Language, project management and Unified Modeling Language capabilities to its software. Most recently, the company has introduced a mobile development environment called MobileTogether for developing cross-platform enterprise mobile solutions. At the beginning of 2014, the company claimed to have more than 4.6 million users of its software.

Programs
- XMLSpy—XML editor for modeling, editing, transforming, and debugging XML technologies
- MapForce—any-to-any graphical data mapping, conversion, and integration tool
- MapForce FlexText—graphical utility for parsing flat files
- StyleVision—multipurpose visual XSLT stylesheet design, multi-channel publishing, and report building tool
- UModel—UML modeling tool
- DatabaseSpy—multi-database
https://en.wikipedia.org/wiki/Cardiovascular%20physiology
Cardiovascular physiology is the study of the cardiovascular system, specifically addressing the physiology of the heart ("cardio") and blood vessels ("vascular"). These subjects are sometimes addressed separately, under the names cardiac physiology and circulatory physiology. Although the different aspects of cardiovascular physiology are closely interrelated, the subject is still usually divided into several subtopics.

Heart
- Cardiac output (= heart rate × stroke volume; can also be calculated with the Fick principle or the palpating method)
- Stroke volume (= end-diastolic volume − end-systolic volume)
- Ejection fraction (= stroke volume / end-diastolic volume)
- Cardiac output is mathematically related to systole
- Inotropic, chronotropic, and dromotropic states
- Cardiac input (= heart rate × suction volume; can be calculated by inverting terms in the Fick principle)
- Suction volume (= end-systolic volume + end-diastolic volume)
- Injection fraction (= suction volume / end-systolic volume)
- Cardiac input is mathematically related to diastole
- Electrical conduction system of the heart
- Electrocardiogram
- Cardiac marker
- Cardiac action potential
- Frank–Starling law of the heart
- Wiggers diagram
- Pressure volume diagram

Regulation of blood pressure
- Baroreceptor
- Baroreflex
- Renin–angiotensin system
- Renin
- Angiotensin
- Juxtaglomerular apparatus
- Aortic body and carotid body
- Autoregulation
- Cerebral autoregulation

Hemodynamics
Under most circumstances, the body attempts to maintain a steady mean arterial pressure. When there is a major and immediate decrease (such as that due to hemorrhage or standing up), the body can increase the following:
- Heart rate
- Total peripheral resistance (primarily due to vasoconstriction of arteries)
- Inotropic state
In turn, this can have a significant impact upon several other variables:
- Stroke volume
- Cardiac output
- Pressure
- Pulse pressure (systolic pressure − diastolic pressure)
- Mean arterial pressure (usually approximated with diastolic pressure +
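The basic quantities above translate directly into arithmetic. A minimal sketch — the example values, units, and function names are ours, and the MAP formula is the common DBP-plus-one-third-of-pulse-pressure approximation:

```python
# Direct transcription of the standard cardiovascular formulas.
def stroke_volume(edv_ml, esv_ml):
    return edv_ml - esv_ml                          # SV = EDV - ESV

def ejection_fraction(edv_ml, esv_ml):
    return stroke_volume(edv_ml, esv_ml) / edv_ml   # EF = SV / EDV

def cardiac_output(hr_bpm, edv_ml, esv_ml):
    return hr_bpm * stroke_volume(edv_ml, esv_ml)   # CO in mL/min

def pulse_pressure(sys_mmhg, dia_mmhg):
    return sys_mmhg - dia_mmhg                      # PP = SBP - DBP

def mean_arterial_pressure(sys_mmhg, dia_mmhg):
    # Common approximation: MAP ≈ DBP + PP / 3
    return dia_mmhg + pulse_pressure(sys_mmhg, dia_mmhg) / 3
```

For illustrative resting values (EDV 120 mL, ESV 50 mL, HR 70 bpm), SV is 70 mL, EF about 58%, and CO 4.9 L/min.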
https://en.wikipedia.org/wiki/Scrubbing%20%28audio%29
In digital audio editing, scrubbing is an interaction in which a user drags a cursor or playhead across a segment of a waveform to hear it. Scrubbing is a convenient way to quickly navigate an audio file, and is a common feature of modern digital audio workstations and other audio editing software. The term comes from the early days of the recording industry and refers to the process of physically moving tape reels to locate a specific point in the audio track; this gave the engineer the impression that the tape was being scrubbed, or cleaned.

Implementations
Common scrubbing feedback techniques include:
- Resampling: allows playback at arbitrary rates, which also pitch-shifts the audio, approximating the effect of playing audio from an analog source like tape or vinyl with a similarly varying motion.
- Cut-and-paste: the original signal is segmented into frames of constant width, and playback is obtained by either discarding (time compression) or repeating (time expansion) some frames.
- Timeline stretching: processes the audio to allow playback at arbitrary rates without changing the pitch (audio time stretching); common approaches include the phase vocoder and time-domain harmonic scaling.

See also
Rocking and rolling
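The cut-and-paste technique can be sketched in a few lines: split the signal into fixed-width frames, then drop or repeat whole frames to change playback speed without resampling. The frame width and data below are arbitrary illustrations:

```python
# Cut-and-paste scrubbing sketch: rate > 1 compresses time by discarding
# frames; rate < 1 expands time by repeating frames. No pitch correction
# is applied (each frame plays at its original rate).
def cut_and_paste_scrub(samples, rate, frame=512):
    out = []
    pos = 0.0
    n_frames = len(samples) // frame
    while pos < n_frames:
        i = int(pos)                              # current frame index
        out.extend(samples[i * frame:(i + 1) * frame])
        pos += rate                               # skips or revisits frames
    return out
```

At rate 2.0 every other frame is discarded (time compression); at rate 0.5 every frame is played twice (time expansion), which is why this method produces audible frame-boundary artifacts that the phase vocoder avoids.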
https://en.wikipedia.org/wiki/Olfactory%20nerve
The olfactory nerve, also known as the first cranial nerve, cranial nerve I, or simply CN I, is a cranial nerve that contains sensory nerve fibers relating to the sense of smell. The afferent nerve fibers of the olfactory receptor neurons transmit nerve impulses about odors to the central nervous system (olfaction). Derived from the embryonic nasal placode, the olfactory nerve is somewhat unusual among cranial nerves because it is capable of some regeneration if damaged. The olfactory nerve is sensory in nature and originates on the olfactory mucosa in the upper part of the nasal cavity. From the olfactory mucosa, the nerve (actually many small nerve fascicles) travels up through the cribriform plate of the ethmoid bone to reach the surface of the brain. Here the fascicles enter the olfactory bulb and synapse there; from the bulbs (one on each side) the olfactory information is transmitted into the brain via the olfactory tract. The fascicles of the olfactory nerve are not visible on a cadaver brain because they are severed upon removal. Structure The specialized olfactory receptor neurons of the olfactory nerve are located in the olfactory mucosa of the upper parts of the nasal cavity. The olfactory nerves consist of a collection of many sensory nerve fibers that extend from the olfactory epithelium to the olfactory bulb, passing through the many openings of the cribriform plate, a sieve-like structure of the ethmoid bone. The sense of smell arises from the stimulation of receptors by small molecules in inspired air of varying spatial, chemical, and electrical properties that reach the nasal epithelium in the nasal cavity during inhalation. These stimulants are transduced into electrical activity in the olfactory neurons, which then transmit these impulses to the olfactory bulb and from there they reach the olfactory areas of the brain via the olfactory tract. The olfactory nerve is the shortest of the twelve cranial nerves and, similar to the optic nerve, doe
https://en.wikipedia.org/wiki/High%20flux%20reactor
A High Flux Reactor is a type of nuclear research reactor. Examples include:
- High Flux Isotope Reactor (HFIR), in Oak Ridge, Tennessee, United States of America
- High Flux Australian Reactor (HIFAR), Australia's first nuclear reactor
- High-Flux Advanced Neutron Application Reactor (HANARO), in South Korea
- The High Flux Reactor at Institut Laue–Langevin in France
- High Flux Reactor (HFR) at Petten in the Netherlands
https://en.wikipedia.org/wiki/Mohammad-Ali%20Najafi
Mohammad-Ali Najafi (; born 13 January 1952) is an Iranian mathematician and reformist politician who was the Mayor of Tehran, serving in the post for eight months, until April 2018. He held cabinet portfolios during the 1980s, 1990s and 2010s. He is also a retired professor of mathematics at Sharif University of Technology. Early life and education Najafi was born in Tehran on 13 January 1952. He ranked first in Iranian national university entrance exam and enrolled in Sharif University of Technology (then known as Aryamehr University of Technology). He earned a Bachelor of Science degree in mathematics from the Sharif University of Technology. Following his bachelors, he enrolled in the graduate program at the Massachusetts Institute of Technology. He received his Master of Science degree in mathematics with the final grade of A+ in 1976 but dropped out of PhD program in 1978 during the Iranian revolution to return to Iran. Career Following the Iranian revolution of 1979, Najafi returned to Iran and became a faculty member at Isfahan University of Technology in 1979 and he was the chair of the university from 1980 to 1981. He was a faculty member at department of mathematical sciences in Sharif University of Technology from 1984 to 1988, when he moved to government. At the end of the reformist government of Mohammad Khatami and following Mahmoud Ahmadinejad's election Najafi moved back to university and has been faculty in the department of mathematics at Sharif University of Technology working on representation theory. He served as an advisor to Mostafa Chamran. He was the minister of higher education from 1981 to 1984 in the cabinet of then Prime Minister Mir-Hossein Mousavi. In 1989, he became the minister of education under then President Hashemi Rafsanjani and served until 1997. In 1997, he was appointed vice president and head of the Planning and Budget Organization by President Mohammad Khatami, but after a merge of the organization with another he was
https://en.wikipedia.org/wiki/Phase%20perturbation
Phase perturbation is the shifting, from whatever cause, in the phase of an electronic signal. The shifting is often quite rapid, and may appear to be random or cyclic. The phase departure in phase perturbation usually is larger, but less rapid, than in phase jitter. Phase perturbation may be expressed in degrees, with any cyclic component expressed in hertz.
https://en.wikipedia.org/wiki/Darwin%20Core
Darwin Core (often abbreviated to DwC) is an extension of Dublin Core for biodiversity informatics. It is meant to provide a stable standard reference for sharing information on biological diversity (biodiversity). The terms described in this standard are a part of a larger set of vocabularies and technical specifications under development and maintained by Biodiversity Information Standards (TDWG) (formerly the Taxonomic Databases Working Group). Description The Darwin Core is a body of standards intended to facilitate the sharing of information about biological diversity. The DwC includes a glossary of terms, and documentation providing reference definitions, examples, and commentary. An overview of the currently adopted terms and concepts can be found in the Darwin Core quick reference guide maintained by TDWG. The DwC operational unit is primarily based on taxa, their occurrence in nature as documented by observations, specimens, and samples, and related information. Included in the standard are documents describing how these terms are managed, how the set of terms can be extended for new purposes, and how the terms can be used. Each DwC term includes a definition and discussions meant to promote the consistent use of the terms across applications and disciplines. In other contexts, such terms might be called properties, elements, fields, columns, attributes, or concepts. Though the data types and constraints are not provided in the term definitions, recommendations are made about how to restrict the values where appropriate, for instance by suggesting the use of controlled vocabularies. DwC standards are versioned and are constantly evolving, and working groups frequently add to the documentation practical examples that discuss, refine, and expand the normative definitions of each term. This approach to documentation allows the standard to adapt to new purposes without disrupting existing applications. In practice, Darwin Core decouples the definition and
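A DwC occurrence record is, in practice, a flat set of term/value pairs. The sketch below uses a handful of real Darwin Core term names; all values are invented for illustration, and the small controlled vocabulary shown is the commonly recommended set for `basisOfRecord`, not the full normative list:

```python
# A hypothetical Darwin Core occurrence record (term names are real DwC
# terms; every value here is invented for illustration).
BASIS_OF_RECORD = {  # recommended controlled vocabulary for one DwC term
    "HumanObservation", "MachineObservation", "PreservedSpecimen",
    "FossilSpecimen", "LivingSpecimen", "MaterialSample",
}

record = {
    "occurrenceID": "urn:example:obs:0001",   # invented identifier
    "basisOfRecord": "HumanObservation",
    "scientificName": "Apis mellifera",
    "eventDate": "2021-06-01",                # ISO 8601, as DwC recommends
    "decimalLatitude": 60.17,                 # decimal degrees
    "decimalLongitude": 24.94,
    "countryCode": "FI",
}

# A consumer might enforce the controlled vocabulary like so; DwC itself
# does not impose data types, only recommendations such as this.
assert record["basisOfRecord"] in BASIS_OF_RECORD
```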
https://en.wikipedia.org/wiki/Pappus%20chain
In geometry, the Pappus chain is a ring of circles between two tangent circles investigated by Pappus of Alexandria in the 3rd century AD.

Construction
The arbelos is defined by two circles, CU and CV, which are tangent at the point A and where CU is enclosed by CV. Let the radii of these two circles be denoted as rU and rV, respectively, and let their respective centers be the points U and V. The Pappus chain consists of the circles in the shaded grey region, which are externally tangent to CU (the inner circle) and internally tangent to CV (the outer circle). Let the radius, diameter and center point of the nth circle of the Pappus chain be denoted as rn, dn and Pn, respectively.

Properties
Centers of the circles

Ellipse
All the centers of the circles in the Pappus chain are located on a common ellipse, for the following reason. The sum of the distances from the nth circle of the Pappus chain to the two centers U and V of the arbelos circles equals a constant:

    |PnU| + |PnV| = (rU + rn) + (rV − rn) = rU + rV

Thus, the foci of this ellipse are U and V, the centers of the two circles that define the arbelos; these points correspond to the midpoints of the line segments AC and AB, respectively.

Coordinates
If r = AC/AB and AB is taken as the unit length, then the center of the nth circle in the chain is:

    (xn, yn) = ( r(1 + r) / (2[n²(1 − r)² + r]) , nr(1 − r) / (n²(1 − r)² + r) )

Radii of the circles
Under the same conventions, the radius of the nth circle in the chain is:

    rn = (1 − r)r / (2[n²(1 − r)² + r])

Circle inversion
The height hn of the center of the nth circle above the base diameter ACB equals n times dn. This may be shown by inverting in a circle centered on the tangent point A. The circle of inversion is chosen to intersect the nth circle perpendicularly, so that the nth circle is transformed into itself. The two arbelos circles, CU and CV, are transformed into parallel lines tangent to and sandwiching the nth circle; hence, the other circles of the Pappus chain are transformed into similarly sandwiched circles of the same diameter. The initial circle C0 and the final circle Cn each contribute ½dn to the height hn, whereas the circles C1–Cn−
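The standard closed forms for the chain (with r = AC/AB and AB taken as 1, so that CU has center (r/2, 0) with radius r/2 and CV has center (1/2, 0) with radius 1/2) can be checked numerically. The sketch below verifies the two tangency conditions and the height relation hn = n·dn; the coordinate conventions are our own choice of normalization:

```python
# Numerical check of the Pappus-chain closed forms, with A at the origin
# and the outer diameter AB = 1.
from math import hypot

def pappus_circle(n, r):
    """Center (x, y) and radius of the nth chain circle, for r = AC/AB."""
    d = n * n * (1 - r) ** 2 + r
    rn = (1 - r) * r / (2 * d)          # radius of the nth circle
    x = r * (1 + r) / (2 * d)           # center coordinates
    y = n * r * (1 - r) / d
    return x, y, rn

r = 0.6
for n in range(6):
    x, y, rn = pappus_circle(n, r)
    # externally tangent to inner circle C_U (center r/2, radius r/2)
    assert abs(hypot(x - r / 2, y) - (r / 2 + rn)) < 1e-12
    # internally tangent to outer circle C_V (center 1/2, radius 1/2)
    assert abs(hypot(x - 0.5, y) - (0.5 - rn)) < 1e-12
    # height of the nth center is n times its diameter: h_n = n * d_n
    assert abs(y - n * 2 * rn) < 1e-12
```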
https://en.wikipedia.org/wiki/Hecke%20operator
In mathematics, in particular in the theory of modular forms, a Hecke operator, studied by Erich Hecke, is a certain kind of "averaging" operator that plays a significant role in the structure of vector spaces of modular forms and more general automorphic representations.

History
Mordell used Hecke operators on modular forms in a paper on the special cusp form of Ramanujan, ahead of the general theory given by Hecke. Mordell proved that the Ramanujan tau function, expressing the coefficients of the Ramanujan form, is a multiplicative function: τ(mn) = τ(m)τ(n) whenever m and n are coprime. The idea goes back to earlier work of Adolf Hurwitz, who treated algebraic correspondences between modular curves which realise some individual Hecke operators.

Mathematical description
Hecke operators can be realized in a number of contexts. The simplest meaning is combinatorial, namely as taking for a given integer n some function f defined on the lattices Λ of fixed rank to

    (Tn f)(Λ) = Σ f(Λ′),

with the sum taken over all the Λ′ that are subgroups of Λ of index n. For example, with n = 2 and two dimensions, there are three such Λ′. Modular forms are particular kinds of functions of a lattice, subject to conditions making them analytic functions and homogeneous with respect to homotheties, as well as moderate growth at infinity; these conditions are preserved by the summation, and so Hecke operators preserve the space of modular forms of a given weight. Another way to express Hecke operators is by means of double cosets in the modular group. In the contemporary adelic approach, this translates to double cosets with respect to some compact subgroups.

Explicit formula
Let Mm be the set of 2×2 integral matrices with determinant m and Γ = SL(2, Z) be the full modular group. Given a modular form f(z) of weight k, the mth Hecke operator acts by the formula

    (Tm f)(z) = m^(k−1) Σ_{ad = m, 0 ≤ b < d} d^(−k) f((az + b)/d),

where z is in the upper half-plane and the normalization constant m^(k−1) assures that the image of a form with integer Fourier coefficients has integer Fourier coefficients. This can be rewritten in the form which leads to the formula for
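Mordell's multiplicativity result can be checked empirically from the product definition of the discriminant form, Δ = q·∏(1 − qⁿ)²⁴, whose q-expansion coefficients are the τ values. The brute-force expansion below is our own illustration (helper name and truncation bound included):

```python
# Compute tau(1..N) by expanding q * prod_{n>=1} (1 - q^n)^24 as a
# truncated power series, multiplying in one factor (1 - q^n) at a time.
def tau(N):
    f = [0] * (N + 1)
    f[0] = 1                          # start from the constant series 1
    for n in range(1, N + 1):
        for _ in range(24):           # multiply by (1 - q^n), 24 times
            for i in range(N, n - 1, -1):
                f[i] -= f[i - n]      # descending i so each f is used once
    return [0] + f[:N]                # shift by one: tau(m) = coeff of q^m
```

With `t = tau(6)`, the coefficients come out as τ(1) = 1, τ(2) = −24, τ(3) = 252, and τ(6) = −6048 = τ(2)·τ(3), in line with multiplicativity for coprime arguments.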
https://en.wikipedia.org/wiki/Composite%20field
In quantum field theory, a composite field is a field defined in terms of other more "elementary" fields. It might describe a composite particle (bound state) or it might not. It might be local, or it might be nonlocal. Noether fields are often composite fields and they are local. In the generalized LSZ formalism, composite fields, which are usually nonlocal, are used to model asymptotic bound states. See also Fermionic field Bosonic field Auxiliary field Quantum field theory
https://en.wikipedia.org/wiki/Deep%20facial%20vein
The anterior facial vein receives a branch of considerable size, the deep facial vein, from the pterygoid venous plexus.
https://en.wikipedia.org/wiki/Early%20Algebra
Early Algebra is an approach to early mathematics teaching and learning. It is about teaching traditional topics in more profound ways. It is also an area of research in mathematics education. Traditionally, algebra instruction has been postponed until adolescence. However, research by early algebra scholars shows ways to teach algebraic thinking much earlier. The National Council of Teachers of Mathematics (NCTM) integrates algebra into its Principles and Standards starting from kindergarten. One of the major goals of early algebra is generalizing number and set ideas. It moves from particular numbers to patterns in numbers. This includes generalizing arithmetic operations as functions, as well as engaging children in noticing and beginning to formalize properties of numbers and operations such as the commutative property, identities, and inverses. Students historically have had a very difficult time adjusting to algebra for a number of reasons. Researchers have found that by working with students on such ideas as developing rules for the use of letters to stand in for numbers and the true meaning of the equals symbol (it indicates a balance between two expressions, and does not mean "put the answer next"), children are much better prepared for formal algebra instruction. Teacher professional development in this area consists of presenting common student misconceptions and then developing lessons to move students out of faulty ways of thinking and into correct generalizations. The use of true, false, and open number sentences can go a long way toward getting students thinking about the properties of numbers and operations and the meaning of the equals sign. Research areas in early algebra include the use of representations, such as symbols, graphs and tables; the cognitive development of students; and viewing arithmetic as a part of algebraic conceptual fields.
https://en.wikipedia.org/wiki/List%20of%20enterprise%20portal%20vendors
This is a list of notable vendors of enterprise portals. An enterprise portal is a framework for integrating information, people and processes across organizational boundaries.
https://en.wikipedia.org/wiki/Bearberry
Bearberries are three species of dwarf shrubs in the genus Arctostaphylos. Unlike the other species of Arctostaphylos (see manzanita), they are adapted to Arctic and Subarctic climates, and have a circumpolar distribution in northern North America, Asia and Europe. Description Bearberries grow as low-lying, rounded bushes. They are capable of surviving on soils predominantly composed of sand. In Canada, they are found in the northern-latitude forests, and they can also be found growing on gravel surfaces. Species The name "bearberry" for the plant derives from the edible fruit, which is a favorite food of bears. The fruit are edible and are sometimes gathered as food for humans. The leaves of the plant are used in herbal medicine. Alpine bearberry: Arctostaphylos alpina (L.) Spreng (syn. Arctous alpinus (L.) Niedenzu). This is a procumbent shrub. Leaves not wintergreen, but dead leaves persist on stems for several years. Berries dark purple to black. Distribution: circumpolar, at high latitudes, from Scotland east across Scandinavia, Russia, Alaska, Canada and Greenland; southern limits in Europe in the Pyrenees and the Alps, in Asia to the Altay Mountains, and in North America to British Columbia in the west, and Maine and New Hampshire in the United States in the east. Red bearberry: Arctostaphylos rubra (Rehd. & Wilson) Fernald (syn. Arctous rubra (Rehder and E.H. Wilson) Nakai; Arctous alpinus var. ruber Rehd. and Wilson). This is a procumbent shrub. Leaves deciduous, falling in autumn to leave bare stems. Berries red. Distribution: in the mountains of Sichuan, southwestern China north and east to eastern Siberia, Alaska and northern Canada east to northern Quebec. Common bearberry: Arctostaphylos uva-ursi (L.) Spreng. Uses The berries ripen late in the year, and can be eaten raw. The plant contains diverse phytochemicals, including ursolic acid, tannic acid, gallic acid, so
https://en.wikipedia.org/wiki/Topologically%20associating%20domain
A topologically associating domain (TAD) is a self-interacting genomic region, meaning that DNA sequences within a TAD physically interact with each other more frequently than with sequences outside the TAD. The median size of a TAD in mouse cells is 880 kb, and they have similar sizes in non-mammalian species. Boundaries at both sides of these domains are conserved between different mammalian cell types and even across species, and are highly enriched with CCCTC-binding factor (CTCF) and cohesin. In addition, some types of genes (such as transfer RNA genes and housekeeping genes) appear near TAD boundaries more often than would be expected by chance. The functions of TADs are not fully understood and are still a matter of debate. Most studies indicate that TADs regulate gene expression by limiting enhancer-promoter interactions to within each TAD; however, a recent study uncoupled TAD organization from gene expression. Disruption of TAD boundaries is found to be associated with a wide range of diseases such as cancer, a variety of limb malformations such as synpolydactyly, Cooks syndrome, and F-syndrome, and a number of brain disorders such as hypoplastic corpus callosum and adult-onset demyelinating leukodystrophy. The mechanisms underlying TAD formation are also complex and not yet fully elucidated, though a number of protein complexes and DNA elements are associated with TAD boundaries. The handcuff model and the loop extrusion model describe TAD formation with the aid of CTCF and cohesin proteins. Furthermore, it has been proposed that the stiffness of TAD boundaries itself could cause the domain insulation and TAD formation. Discovery and diversity TADs are defined as regions whose DNA sequences preferentially contact each other. They were discovered in 2012 using chromosome conformation capture techniques including Hi-C. They have been shown to be present in multiple species, including fruit flies (Drosophila), mouse, plants, fungi and human genomes. In b
https://en.wikipedia.org/wiki/Catherine%20Yan
Catherine Huafei Yan () is a professor of mathematics at Texas A&M University interested in algebraic combinatorics. Education and career Yan earned a bachelor's degree from Peking University in 1993. She was a student of Gian-Carlo Rota at the Massachusetts Institute of Technology, where she earned her Ph.D. in 1997 with a dissertation on The Theory of Commuting Boolean Algebras. After working for two years as a Courant Instructor at New York University, she joined Texas A&M in 1999, with a three-year hiatus as Chern Professor at the Center of Combinatorics, Nankai University, from 2005 to 2008. Book With her advisor and Joseph Kung, she is an author of Combinatorics: The Rota Way (Cambridge University Press, 2009). The book provides an exposition of the areas of combinatorics of interest to Rota, unified through an algebraic framework, and lists many open research problems in this area. Recognition Yan won a Sloan Research Fellowship in 2001. She was elected to the 2018 class of fellows of the American Mathematical Society "for contributions to combinatorics and discrete geometry".
https://en.wikipedia.org/wiki/KCNS1
Potassium voltage-gated channel subfamily S member 1 is a protein that in humans is encoded by the KCNS1 gene. The protein encoded by this gene is a voltage-gated potassium channel subunit.
https://en.wikipedia.org/wiki/INCA%20Internet
INCA Internet Corporation (), also known as nProtect, is a corporation which sells computer software. INCA Internet was founded by Young Heum Joo, the current CEO and President of INCA Internet, in 2000. It offers anti-virus, anti-spyware, game security, and unified corporate security products. Headquartered in Seoul, Republic of Korea, INCA Internet was selected as one of the Deloitte Technology Fast 50 Korea 2007 and Deloitte Technology Fast 500 Asia Pacific 2007. Company Overview INCA Internet is an information security company based in the Republic of Korea, and develops the 'nProtect' line of computer security products. Young Heum Joo founded the company on January 31, 2000, and is currently the CEO and President. The company currently holds more than 70% of the market share of information security for Korean financial institutions and more than 90% of game portal security. It is a public company limited by shares; Young Heum Joo is the largest stockholder, followed by JAIC, Japan's largest independent venture capital firm, and MeesPierson from the Netherlands. Other major investors include KDB (Korea Development Bank). The main business areas of INCA Internet include online PC security services for financial institutions, internet business corporations, and online game corporations, among others; online game security solutions; a unified PC security solution for corporate internal security; and B2C business such as an online anti-virus service for ordinary internet users. INCA Internet was one of the first Application Service Provider (ASP) companies in the online PC security industry. Its products are widely used by Korean and Japanese financial institutions, public institutions, and worldwide online game companies. INCA Internet was awarded the IR52 Jang Yeong-sil award, which is regarded as the highest and most reputable award in the Korean industrial technology field. It acquired the ISO 9000 Certificate by TUViT, an IT certif
https://en.wikipedia.org/wiki/Intrusive%20research
Intrusive research is the gathering of data from individuals through interviewing, observation, or surveying, when consent is legally required yet the test subjects lack the capacity to give such consent due to mental illness or developmental disability. It is a legal issue addressed by the United Kingdom Mental Capacity Act 2005. Intrusive research and the UK's Mental Capacity Act The UK's Mental Capacity Act 2005 criminalizes intrusive research carried out without securing the consent of a person involved who has the mental capacity to make the decision. It is also unlawful to involve a person lacking mental capacity without the approval of an "appropriate body". The appropriate body is the authority appointed by the Secretary of State in England or by the National Assembly for Wales (e.g. a Research Ethics Committee). The Act also applies to non-interventional research such as observational research. The law provides the statutory framework and provisions for the participation of people without the capacity to give consent in intrusive research. The Act also applies to clinical trials of treatments and procedures, but does not apply to trials of medicinal products, for which there is a separate regulation (The Medicines for Human Use Regulations, 2004). The Mental Capacity Act code of practice, 2007, gives examples of intrusive research: Clinical research into new types of treatments (except clinical trials of medicines that are covered by separate regulations) Health or social care services research to evaluate the effectiveness of a policy intervention or service innovation. Research in other fields (e.g. criminal justice, psychological studies, lifestyle or socioeconomic surveys) Research on tissue samples (i.e. tissue removed during surgical or diagnostic procedures) Research on health and other personal data collected from records. Observations, photography or videoing of humans. Other usage A broader conceptualization defin
https://en.wikipedia.org/wiki/Thomas%20Bayes
Thomas Bayes (c. 1701 – 7 April 1761) was an English statistician, philosopher and Presbyterian minister who is known for formulating a specific case of the theorem that bears his name: Bayes' theorem. Bayes never published what would become his most famous accomplishment; his notes were edited and published posthumously by Richard Price. Biography Thomas Bayes was the son of London Presbyterian minister Joshua Bayes, and was possibly born in Hertfordshire. He came from a prominent nonconformist family from Sheffield. In 1719, he enrolled at the University of Edinburgh to study logic and theology. On his return around 1722, he assisted his father at the latter's chapel in London before moving to Tunbridge Wells, Kent, around 1734. There he was minister of the Mount Sion Chapel, until 1752. He is known to have published two works in his lifetime, one theological and one mathematical: Divine Benevolence, or an Attempt to Prove That the Principal End of the Divine Providence and Government is the Happiness of His Creatures (1731) An Introduction to the Doctrine of Fluxions, and a Defence of the Mathematicians Against the Objections of the Author of The Analyst (published anonymously in 1736), in which he defended the logical foundation of Isaac Newton's calculus ("fluxions") against the criticism by George Berkeley, a bishop and noted philosopher, the author of The Analyst Bayes was elected as a Fellow of the Royal Society in 1742. His nomination letter was signed by Philip Stanhope, Martin Folkes, James Burrow, Cromwell Mortimer, and John Eames. It is speculated that he was accepted by the society on the strength of the Introduction to the Doctrine of Fluxions, as he is not known to have published any other mathematical work during his lifetime. In his later years he took a deep interest in probability. Historian Stephen Stigler thinks that Bayes became interested in the subject while reviewing a work written in 1755 by Thomas Simpson, but George Alfred Barnard thin
https://en.wikipedia.org/wiki/The%20Winnower
The Winnower was a publishing platform and journal that offered traditional scholarly publishing tools (Digital Object Identifiers (DOIs), permanent archival, Altmetrics, PDF creation, etc.) to enable rigorous scholastic discussion of topics across all areas of intellectual inquiry, whether in the sciences, humanities, public policy, or otherwise. Between 2014 and 2016, The Winnower published and archived the following: Student Essays Conference Proceedings Peer Reviews Theses Grants Book Reviews Journal Clubs How-to's Lab notes Scholarly reddit AMAs Foldscope Images Blog posts Original research Open Letters. History The Winnower was founded by Dr. Joshua Nicholson. It went live on May 27, 2014, with a primary focus of publishing scientific research, but expanded its scope to include a diverse set of topics spanning the humanities, social sciences, science policy, and professional commentaries, to name just a few. As of April 2016 it had over 1,000 publications from 4,500+ authors around the world. In November 2016, it was announced that the publishing platform Authorea had bought The Winnower. New submissions were then stopped, with the site directing authors to Authorea. The site has been largely inactive but archived since 2016. Authorea was in turn purchased by one of the 'big 5' academic publishers, Wiley, in 2018. Post-publication Peer Review The Winnower offered post-publication peer review. After submission, the paper was immediately made visible online, and was open for public, non-anonymous reviews by registered members of The Winnower community. Articles could be revised indefinitely until the author chose to "freeze" a final version and purchase a digital object identifier. See also Authorea Scholarly peer review F1000Research Journal club Conference Proceedings
https://en.wikipedia.org/wiki/In%20silico%20medicine
In silico medicine (also known as "computational medicine") is the application of in silico research to problems involving health and medicine. It is the direct use of computer simulation in the diagnosis, treatment, or prevention of a disease. More specifically, in silico medicine is characterized by modeling, simulation, and visualization of biological and medical processes in computers, with the goal of simulating real biological processes in a virtual environment. History The term in silico was first used in 1989 at the workshop "Cellular Automata: Theory and Applications" by a mathematician from the National Autonomous University of Mexico (UNAM). The term in silico radiation oncology, a precursor of generic in silico medicine, was coined and first introduced by G. Stamatakos in Proceedings of the IEEE in 2002. The same researcher coined and introduced the more generic term in silico oncology. In silico medicine is considered an extension of previous work using mathematical models of biological systems. It became apparent that the techniques used to model biological systems have utility in explaining and predicting dynamics in the medical field. The first fields in medicine to use in silico modeling were genetics, physiology and biochemistry. The field saw a dramatic influx of data when the human genome was sequenced in the 1980s and 1990s. Concurrently, the increase in available computational power allowed for modeling of complex systems that were previously impractical. Rationale There are numerous reasons why in silico medicine is used. For example, in silico medical modeling can allow for early prediction of a compound's success for a medicinal purpose and can elucidate potential adverse effects early in the drug discovery process. In silico modeling can also provide a humane alternative to animal testing. It has been claimed by a company in the field that computer-aided models will make the use of testing on living organisms obsolete. Examples The term in sili
https://en.wikipedia.org/wiki/Heinrich%20August%20Rothe
Heinrich August Rothe (1773–1842) was a German mathematician, a professor of mathematics at Erlangen. He was a student of Carl Hindenburg and a member of Hindenburg's school of combinatorics. Biography Rothe was born in 1773 in Dresden, and in 1793 became a docent at the University of Leipzig. He became an extraordinary professor at Leipzig in 1796, and in 1804 he moved to Erlangen as a full professor, taking over the chair formerly held by Karl Christian von Langsdorf. He died in 1842, and his position at Erlangen was in turn taken by Johann Wilhelm Pfaff, the brother of the more famous mathematician Johann Friedrich Pfaff. Research The Rothe–Hagen identity, a summation formula for binomial coefficients, appeared in Rothe's 1793 thesis. It is named for him and for the later work of Johann Georg Hagen. The same thesis also included a formula for computing the Taylor series of an inverse function from the Taylor series for the function itself, related to the Lagrange inversion theorem. In the study of permutations, Rothe was the first to define the inverse of a permutation, in 1800. He developed a technique for visualizing permutations now known as a Rothe diagram, a square table that has a dot in each cell (i,j) for which the permutation maps position i to position j and a cross in each cell (i,j) for which there is a dot later in row i and another dot later in column j. Using Rothe diagrams, he showed that the number of inversions in a permutation is the same as in its inverse, for the inverse permutation has as its diagram the transpose of the original diagram, and the inversions of both permutations are marked by the crosses. Rothe used this fact to show that the determinant of a matrix is the same as the determinant of the transpose: if one expands a determinant as a polynomial, each term corresponds to a permutation, and the sign of the term is determined by the parity of its number of inversions. Since each term of the determinant of the transpose correspon
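Rothe's cross rule, and his observation that a permutation and its inverse have the same number of inversions, can be illustrated with a short script. The helper name `rothe_diagram` and the example permutation are illustrative, and the permutation is written 0-indexed in one-line notation:

```python
def rothe_diagram(perm):
    # perm maps position i to position perm[i].  Cell (i, j) gets a cross when
    # the dot of row i lies further right (perm[i] > j) and the dot of column j
    # lies further down (the preimage of j is some row i' > i).
    n = len(perm)
    inverse = [0] * n
    for i, j in enumerate(perm):
        inverse[j] = i
    return [(i, j) for i in range(n) for j in range(n)
            if perm[i] > j and inverse[j] > i]

perm = [2, 0, 3, 1]   # example permutation
inv = [1, 3, 0, 2]    # its inverse
# Crosses count inversions, so the two counts agree, as Rothe showed.
print(len(rothe_diagram(perm)), len(rothe_diagram(inv)))  # 3 3
```

The number of crosses equals the number of inversions, and transposing the diagram swaps the roles of the permutation and its inverse, which is exactly the argument sketched in the text above.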
https://en.wikipedia.org/wiki/Molecular%20risk%20assessment
Molecular risk assessment is a procedure in which biomarkers (for example, biological molecules or changes in tumor cell DNA) are used to estimate a person's risk for developing cancer. Specific biomarkers may be linked to particular types of cancer. Sources External links Molecular risk assessment entry in the public domain NCI Dictionary of Cancer Terms
https://en.wikipedia.org/wiki/Phragmoplastophyta
The Phragmoplastophyta (Lecointre & Guyader 2006) are a proposed sister clade of the Klebsormidiaceae in the streptophyte/charophyte clade. The Phragmoplastophyta consist of the Charophyceae and another unnamed clade which contains the Coleochaetophyceae, Zygnematophyceae, Mesotaeniaceae, and Embryophytes (land plants). It is an important step in the emergence of land plants within the green algae. It is equivalent to the ZCC clade/grade, cladistically including the Embryophyta. The mitosis of Phragmoplastophyta takes place via a phragmoplast. Another synapomorphy of this clade is the synthesis of cellulose microfibrils by a complex of octameric cellulose synthases. This complex crosses the plasma membrane and polymerizes molecules from the cytoplasm into cellulose microfibrils, which together form the fibrils necessary in the formation of the cell wall. The phragmoplastophyte wall also contains phenolic compounds. It is within Phragmoplastophyta that we find the three clades of charophyte/streptophyte algae with true multicellular organization and differentiated cell types: Charophyceae, Coleochaetophyceae and land plants. The other charophyte algae are either unicellular, colonial, sarcinoid (three-dimensional packets of cells) or unbranched filamentous. Below is a consensus reconstruction of green algal relationships, mainly based on molecular data.
https://en.wikipedia.org/wiki/John%20McCarthy%20%28computer%20scientist%29
John McCarthy (September 4, 1927 – October 24, 2011) was an American computer scientist and cognitive scientist. He was one of the founders of the discipline of artificial intelligence. He co-authored the document that coined the term "artificial intelligence" (AI), developed the programming language family Lisp, significantly influenced the design of the language ALGOL, popularized time-sharing, and invented garbage collection. McCarthy spent most of his career at Stanford University. He received many accolades and honors, such as the 1971 Turing Award for his contributions to the topic of AI, the United States National Medal of Science, and the Kyoto Prize. Early life and education John McCarthy was born in Boston, Massachusetts, on September 4, 1927, to an Irish immigrant father and a Lithuanian Jewish immigrant mother, John Patrick and Ida (Glatt) McCarthy. The family was obliged to relocate frequently during the Great Depression, until McCarthy's father found work as an organizer for the Amalgamated Clothing Workers in Los Angeles, California. His father came from Cromane, a small fishing village in County Kerry, Ireland. His mother died in 1957. Both parents were active members of the Communist Party during the 1930s, and they encouraged learning and critical thinking. Before he attended high school, he became interested in science by reading a translation of a Russian popular science book for children called 100,000 Whys. McCarthy was fluent in Russian and made friends with Russian scientists during multiple trips to the Soviet Union, but he distanced himself after visits to the Soviet Bloc, which led to his becoming a conservative Republican. McCarthy showed an early aptitude for mathematics; during his teens he taught himself college mathematics by studying the textbooks used at the nearby California Institute of Technology (Caltech). He graduated from Belmont High School two years early and was accepted into Caltech in 1944.
https://en.wikipedia.org/wiki/Evolution%20and%20the%20Theory%20of%20Games
Evolution and the Theory of Games is a book by the British evolutionary biologist John Maynard Smith on evolutionary game theory. The book was initially published in December 1982 by Cambridge University Press. Overview In the book, John Maynard Smith summarises work on evolutionary game theory that had developed in the 1970s, to which he made several important contributions. The book is also noted for being well written and not overly mathematically challenging. The book's main contribution is the introduction of the evolutionarily stable strategy (ESS) concept, which states that for a set of behaviours to be conserved over evolutionary time, they must be the most profitable avenue of action when common, so that no alternative behaviour can invade. So, for instance, suppose that in a population of frogs, males fight to the death over breeding ponds. This would be an ESS if any one cowardly frog that does not fight to the death always fares worse (in fitness terms, of course). A more likely scenario is one where fighting to the death is not an ESS because a frog might arise that will stop fighting if it realises that it is going to lose. This frog would then reap the benefits of fighting, but not the ultimate cost. Hence, fighting to the death would easily be invaded by a mutation that causes this sort of "informed fighting." Much complexity can be built from this, and Maynard Smith explains it in clear prose and with simple mathematics. Reception See also Evolutionary biology
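The frog example is essentially a Hawk-Dove game. As an illustrative sketch (the payoff values and strategy names are hypothetical, not taken from the book), Maynard Smith's two ESS conditions can be checked directly against a payoff matrix:

```python
def is_ess(payoff, s, strategies):
    # Maynard Smith's ESS conditions for resident s against each mutant t:
    # either E(s,s) > E(t,s), or E(s,s) == E(t,s) and E(s,t) > E(t,t).
    for t in strategies:
        if t == s:
            continue
        if payoff[s][s] < payoff[t][s]:
            return False
        if payoff[s][s] == payoff[t][s] and payoff[s][t] <= payoff[t][t]:
            return False
    return True

# Hypothetical Hawk-Dove payoffs with resource V = 2 and fight cost C = 6:
# Hawk vs Hawk: (V - C) / 2 = -2, Hawk vs Dove: V = 2,
# Dove vs Hawk: 0, Dove vs Dove: V / 2 = 1.
payoff = {"hawk": {"hawk": -2.0, "dove": 2.0},
          "dove": {"hawk": 0.0, "dove": 1.0}}
for s in payoff:
    print(s, is_ess(payoff, s, payoff))
# Neither pure strategy is an ESS when C > V: "fight to the death" (hawk)
# is invaded by dove, just as the informed-fighting frog invades in the text.
```

With a cheap fight (C < V), hawk would satisfy the first ESS condition and be uninvadable, mirroring the first frog scenario.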
https://en.wikipedia.org/wiki/Women%20in%20Cell%20Biology
Women in Cell Biology (WICB) is a subcommittee of the American Society for Cell Biology (ASCB) created to promote women in cell biology and present awards. History A group of women were unhappy with the lack of recognition in ASCB. In 1971, Virginia Walbot gathered a group of women to meet at the annual ASCB meetings, and WICB began. The goal was to provide a space for women to talk and network with other women in the field, learn about job opportunities, and promote women in academia. Newsletters were distributed containing job listings and news of prominent women in biology. Originally, WICB was not accepted by ASCB; the newsletter was not funded and was discontinued in the 1970s. WICB was established as a committee within ASCB in 1994. Activities Currently, WICB meets annually at ASCB meetings and has a column in the ASCB newsletter. The goals of WICB are to nominate and give awards and to communicate through the newsletter. Awards WICB awards the following annually: WICB Junior Award for Excellence in Research WICB Mid-Career Award for Excellence in Research Sandra K. Masur Senior Leadership Award
https://en.wikipedia.org/wiki/Suslin%27s%20problem
In mathematics, Suslin's problem is a question about totally ordered sets posed by Mikhail Yakovlevich Suslin and published posthumously. It has been shown to be independent of the standard axiomatic system of set theory known as ZFC: the statement can neither be proven nor disproven from those axioms, assuming ZF is consistent. (Suslin is also sometimes written with the French transliteration as Souslin, from the Cyrillic Суслин.) Formulation Suslin's problem asks: Given a non-empty totally ordered set R with the four properties R does not have a least nor a greatest element; the order on R is dense (between any two distinct elements there is another); the order on R is complete, in the sense that every non-empty bounded subset has a supremum and an infimum; and every collection of mutually disjoint non-empty open intervals in R is countable (this is the countable chain condition for the order topology of R), is R necessarily order-isomorphic to the real line R? If the requirement for the countable chain condition is replaced with the requirement that R contains a countable dense subset (i.e., R is a separable space), then the answer is indeed yes: any such set R is necessarily order-isomorphic to R (as proved by Cantor). The condition for a topological space that every collection of non-empty disjoint open sets is at most countable is called the Suslin property. Implications Any totally ordered set that is not isomorphic to R but satisfies properties 1–4 is known as a Suslin line. The Suslin hypothesis says that there are no Suslin lines: that every countable-chain-condition dense complete linear order without endpoints is isomorphic to the real line. An equivalent statement is that every tree of height ω1 either has a branch of length ω1 or an antichain of cardinality ℵ1. The generalized Suslin hypothesis says that for every infinite regular cardinal κ every tree of height κ either has a branch of length κ or an antichain of cardinality κ. The existence of Suslin lines is equivalent
https://en.wikipedia.org/wiki/Griesmer%20bound
In the mathematics of coding theory, the Griesmer bound, named after James Hugo Griesmer, is a bound on the length of linear binary codes of dimension k and minimum distance d. There is also a very similar version for non-binary codes. Statement of the bound For a binary linear code of dimension $k$ and minimum distance $d$, the Griesmer bound states that the length $n$ satisfies $$n \ge \sum_{i=0}^{k-1} \left\lceil \frac{d}{2^i} \right\rceil.$$ Proof Let $N(k,d)$ denote the minimum length of a binary code of dimension $k$ and distance $d$. Let $C$ be such a code. We want to show that $$N(k,d) \ge \lceil d/2 \rceil + N(k-1, \lceil d/2 \rceil).$$ Let $G$ be a generator matrix of $C$. We can always suppose that the first row of $G$ is of the form $r = (1, \dots, 1, 0, \dots, 0)$ with weight $d$. The matrix obtained from $G$ by deleting the first row and the first $d$ columns generates a code $C'$, which is called the residual code of $C$. $C'$ obviously has dimension $k-1$ and length $N(k,d) - d$. $C'$ has a distance $d'$, but we don't know it yet. Let $u \in C'$ be such that $w(u) = d'$. There exists a vector $v \in \mathbb{F}_2^d$ such that the concatenation $(v \mid u) \in C$. Then $w(v) + w(u) = w(v \mid u) \ge d$. On the other hand, also $(v \mid u) + r \in C$, since $r \in C$ and $C$ is linear: $w((v \mid u) + r) \ge d$. But $w((v \mid u) + r) = d - w(v) + w(u)$, so this becomes $d - w(v) + d' \ge d$. By summing this with $w(v) + d' \ge d$ we obtain $2d' \ge d$. As $d'$ is integral, we get $d' \ge \lceil d/2 \rceil$. This implies $N(k-1, \lceil d/2 \rceil) \le N(k-1, d') \le N(k,d) - d$, so that $N(k,d) \ge d + N(k-1, \lceil d/2 \rceil)$. By induction over $k$ we will eventually get $$N(k,d) \ge \sum_{i=0}^{k-1} \left\lceil \frac{d}{2^i} \right\rceil.$$ Note that at any step the dimension decreases by 1 and the distance is halved, and we use the identity $\left\lceil \frac{\lceil a/2^{k-1} \rceil}{2} \right\rceil = \lceil a/2^k \rceil$ for any integer $a$ and positive integer $k$. The bound for the general case For a linear code over $\mathbb{F}_q$, the Griesmer bound becomes: $$n \ge \sum_{i=0}^{k-1} \left\lceil \frac{d}{q^i} \right\rceil.$$ The proof is similar to the binary case and so it is omitted. See also Singleton bound Hamming bound Gilbert–Varshamov bound Johnson bound Plotkin bound Elias–Bassalygo bound
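The right-hand side of the bound is straightforward to evaluate. A minimal sketch (the function name is illustrative), checked against two binary codes that are known to attain the bound with equality, the Hamming [7,4,3] code and the simplex [7,3,4] code:

```python
from math import ceil

def griesmer_bound(k, d, q=2):
    # Lower bound on the length n of any [n, k, d] linear code over GF(q):
    # n >= sum_{i=0}^{k-1} ceil(d / q^i).
    return sum(ceil(d / q ** i) for i in range(k))

# The binary Hamming [7, 4, 3] code meets the bound with equality:
print(griesmer_bound(4, 3))   # 7
# So does the binary simplex [7, 3, 4] code:
print(griesmer_bound(3, 4))   # 7
```

For a repetition code ([d, 1, d]) the bound reduces to n >= d, as expected, and passing q > 2 evaluates the non-binary version of the bound.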
https://en.wikipedia.org/wiki/Isozyme
In biochemistry, isozymes (also known as isoenzymes or more generally as multiple forms of enzymes) are enzymes that differ in amino acid sequence but catalyze the same chemical reaction. Isozymes usually have different kinetic parameters (e.g. different KM values), or are regulated differently. They permit the fine-tuning of metabolism to meet the particular needs of a given tissue or developmental stage. In many cases, isozymes are encoded by homologous genes that have diverged over time. Strictly speaking, enzymes with different amino acid sequences that catalyse the same reaction are isozymes if encoded by different genes, or allozymes if encoded by different alleles of the same gene; the two terms are often used interchangeably. Introduction Isozymes were first described by R. L. Hunter and Clement Markert (1957) who defined them as different variants of the same enzyme having identical functions and present in the same individual. This definition encompasses (1) enzyme variants that are the product of different genes and thus represent different loci (described as isozymes) and (2) enzymes that are the product of different alleles of the same gene (described as allozymes). Isozymes are usually the result of gene duplication, but can also arise from polyploidisation or nucleic acid hybridization. Over evolutionary time, if the function of the new variant remains identical to the original, then it is likely that one or the other will be lost as mutations accumulate, resulting in a pseudogene. However, if the mutations do not immediately prevent the enzyme from functioning, but instead modify either its function, or its pattern of expression, then the two variants may both be favoured by natural selection and become specialised to different functions. For example, they may be expressed at different stages of development or in different tissues. Allozymes may result from point mutations or from insertion-deletion (indel) events that affect the coding seque
https://en.wikipedia.org/wiki/Marginal%20value%20theorem
The marginal value theorem (MVT) is an optimality model that usually describes the behavior of an optimally foraging individual in a system where resources (often food) are located in discrete patches separated by areas with no resources. Due to the resource-free space, animals must spend time traveling between patches. The MVT can also be applied to other situations in which organisms face diminishing returns. The MVT was first proposed by Eric Charnov in 1976. In his original formulation: "The predator should leave the patch it is presently in when the marginal capture rate in the patch drops to the average capture rate for the habitat." Definition All animals must forage for food in order to meet their energetic needs, but doing so is energetically costly. It is assumed that evolution by natural selection results in animals utilizing the most economic and efficient strategy to balance energy gain and consumption. The Marginal Value Theorem is an optimality model that describes the strategy that maximizes gain per unit time in systems where resources, and thus rate of returns, decrease with time. The model weighs benefits and costs and is used to predict giving up time and giving up density. Giving up time (GUT) is the interval of time between when the animal last feeds and when it leaves the patch. Giving up density (GUD) is the food density within a patch when the animal will choose to move on to other food patches. When an animal is foraging in a system where food sources are patchily distributed, the MVT can be used to predict how much time an individual will spend searching for a particular patch before moving on to a new one. In general, individuals will stay longer if (1) patches are farther apart or (2) current patches are poor in resources. Both situations increase the ratio of travel cost to foraging benefit. Modeling As animals forage in patchy systems, they balance resource intake, traveling time, and foraging time. Resource intake within a patch
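The patch-leaving rule can be sketched numerically. Assume a diminishing-returns gain function g(t) = Gmax·(1 − e^(−t/τ)) (a common textbook choice, not part of Charnov's original formulation) and a fixed travel time T between patches; the MVT says the forager should leave when the instantaneous gain rate g′(t) falls to the long-term average rate g(t)/(T + t):

```python
import math

G_MAX, TAU = 10.0, 2.0   # asymptotic gain and depletion time constant (illustrative)
TRAVEL = 3.0             # travel time between patches (illustrative)

def gain(t):            # cumulative energy gained after t time units in a patch
    return G_MAX * (1 - math.exp(-t / TAU))

def marginal(t):        # instantaneous gain rate g'(t)
    return (G_MAX / TAU) * math.exp(-t / TAU)

def average_rate(t):    # long-term gain rate including the travel cost
    return gain(t) / (TRAVEL + t)

# Bisection on f(t) = g'(t) - g(t)/(T+t), which crosses zero at the optimum:
# positive at t ~ 0 (fresh patch), negative once the patch is depleted.
lo, hi = 1e-9, 100.0
for _ in range(200):
    mid = (lo + hi) / 2
    if marginal(mid) - average_rate(mid) > 0:
        lo = mid
    else:
        hi = mid
t_star = (lo + hi) / 2
print(f"optimal residence time ≈ {t_star:.3f}")
```

Increasing TRAVEL in this sketch raises t_star, matching the prediction above that individuals stay longer when patches are farther apart.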
https://en.wikipedia.org/wiki/Dioptrique
"La dioptrique" (in English "Dioptrique", "Optics", or "Dioptrics") is a short treatise published in 1637, included as one of the essays accompanying the Discourse on the Method by René Descartes. In this essay Descartes uses various models to understand the properties of light. This essay is known as Descartes' greatest contribution to optics, as it is the first publication of the Law of Refraction. First Discourse: On Light The first discourse captures Descartes' theories on the nature of light. In the first model, he compares light to a stick that allows a blind person to discern his environment through touch. Descartes says: You have only to consider that the differences which a blind man notes among trees, rocks, water, and similar things through the medium of his stick do not seem less to him than those among red, yellow, green, and all the other colors seem to us; and that nevertheless these differences are nothing other, in all these bodies, than the diverse ways of moving, or of resisting the movements of, this stick. Descartes' second model on light uses his theory of the elements to demonstrate the rectilinear transmission of light as well as the movement of light through solid objects. He uses a metaphor of wine flowing through a vat of grapes, then exiting through a hole at the bottom of the vat. Now consider that, since there is no vacuum in Nature as almost all the Philosophers affirm, and since there are nevertheless many pores in all the bodies that we perceive around us, as experiment can show quite clearly, it is necessary that these pores be filled with some very subtle and very fluid material, extending without interruption from the stars and planets to us. Thus, this subtle material being compared with the wine in that vat, and the less fluid or heavier parts, of the air as well as of other transparent bodies, being compared with the bunches of grapes which are mixed in, you will easily understand the following: Just as the parts of this wine..
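The Law of Refraction that the essay is known for is stated in modern notation as n₁ sin θ₁ = n₂ sin θ₂ (Descartes expressed it as a constant ratio of sines, without the modern index-of-refraction symbols). A minimal numeric check, with illustrative indices for air and crown glass:

```python
import math

def refraction_angle(theta1_deg, n1, n2):
    """Angle of refraction from Snell's law: n1*sin(t1) = n2*sin(t2)."""
    s = n1 * math.sin(math.radians(theta1_deg)) / n2
    if abs(s) > 1:
        return None  # total internal reflection: no refracted ray exists
    return math.degrees(math.asin(s))

# Illustrative indices: air ~1.00, crown glass ~1.52.
theta2 = refraction_angle(30.0, 1.00, 1.52)
print(f"30 degrees in air refracts to about {theta2:.1f} degrees in glass")
```

The ray bends toward the normal on entering the denser medium, as the law predicts.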
https://en.wikipedia.org/wiki/Cuteness
Cuteness is a type of attractiveness commonly associated with youth and appearance, as well as a scientific concept and analytical model in ethology, first introduced by Austrian ethologist Konrad Lorenz. Lorenz proposed the concept of baby schema (Kindchenschema), a set of facial and body features that make a creature appear "cute" and activate ("release") in others the motivation to care for it. Cuteness may be ascribed to people as well as things that are regarded as attractive or charming. Juvenile traits Doug Jones, a visiting scholar in anthropology at Cornell University, said that the proportions of facial features change with age due to changes in hard tissue and soft tissue, and Jones said that these "age-related changes" cause juvenile animals to have the "characteristic 'cute' appearance" of proportionately smaller snouts, higher foreheads and larger eyes than their adult counterparts. In terms of hard tissue, Jones said that the neurocranium grows a lot in juveniles while the bones for the nose and the parts of the skull involved in chewing food only reach maximum growth later. In terms of soft tissue, Jones said that the cartilaginous tissues of the ears and nose continue to grow throughout a person's lifetime; that, starting at age twenty-five, the eyebrows descend from a position above the supraorbital rim to a position below it; that the "lateral aspect of the eyebrows" sags with age, making the eyes appear smaller; and that the red part of the lips gets thinner with age due to loss of connective tissue. A study found that the faces of "attractive" Northern Italian Caucasian children have "characteristics of babyness" such as a "larger forehead", a smaller jaw, "a proportionately larger and more prominent maxilla", a wider face, a flatter face and larger "anteroposterior" facial dimensions than the Northern Italian Caucasian children used as a reference. Biological function Konrad Lorenz argued in 1949 that infantile features triggered
https://en.wikipedia.org/wiki/St%20Margaret%27s%20Bay%20Windmill
St Margaret's Bay Windmill is a Grade II listed Smock mill on South Foreland, the southeasternmost point of England. It was built in 1929 to generate electricity for the attached house, high on the White Cliffs of Dover. History The mill was built for Sir William Bearswell by Holman's, the Canterbury millwrights. It was built to generate electricity and started generating in June 1929. The mill ceased to generate electricity in 1939, when the dynamo was removed. During the Second World War, the mill was occupied by a special branch of the WRNS. Repairs were done to the mill in 1969 by millwrights Vincent Pargeter and Philip Lennard. These included a new fantail and repairs to the sails. Description St Margaret's Bay Windmill is a three-storey smock mill on a single-storey brick base. It has four patent sails and is winded by a fantail. The mill generated electricity via a dynamo and is now used as residential accommodation, a use it has always had. See also South Foreland lighthouse is a few hundred metres away
https://en.wikipedia.org/wiki/Posterior%20nasal%20spine
The posterior nasal spine is part of the horizontal plate of the palatine bone of the skull. It is found at the medial end of its posterior border. It is paired with the corresponding palatine bone to form a solid spine. It is the attachment of the uvula muscle. Structure The posterior nasal spine is found at the medial end of the posterior border of the horizontal plate of the palatine bone of the skull. Function The posterior nasal spine is the attachment of the uvula muscle. Clinical applications The posterior nasal spine is an important cephalometric landmark. Additional images See also anterior nasal spine
https://en.wikipedia.org/wiki/Scale%20%28social%20sciences%29
In the social sciences, scaling is the process of measuring or ordering entities with respect to quantitative attributes or traits. For example, a scaling technique might involve estimating individuals' levels of extraversion, or the perceived quality of products. Certain methods of scaling permit estimation of magnitudes on a continuum, while other methods provide only for relative ordering of the entities. The level of measurement is the type of data that is measured. The word scale, including in academic literature, is sometimes used to refer to another composite measure, that of an index. Those concepts are however different. Scale construction decisions What level (level of measurement) of data is involved (nominal, ordinal, interval, or ratio)? What will the results be used for? What should be used - a scale, index, or typology? What types of statistical analysis would be useful? Choose to use a comparative scale or a noncomparative scale. How many scale divisions or categories should be used (1 to 10; 1 to 7; −3 to +3)? Should there be an odd or even number of divisions? (Odd gives neutral center value; even forces respondents to take a non-neutral position.) What should the nature and descriptiveness of the scale labels be? What should the physical form or layout of the scale be? (graphic, simple linear, vertical, horizontal) Should a response be forced or be left optional? Scale construction method It is possible that something similar to one's scale will already exist, so including those scale(s) and possible dependent variables in one's survey may increase validity of one's scale. Begin by generating at least ten items to represent each of the scales. Administer the survey; the more representative and larger the sample, the more confidence one will have in the scales. Review the means and standard deviations for the items, dropping any items with skewed means or very low variance. Run a principal components analysis with oblique rotation on one's
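The item-review step described above (inspect means and standard deviations, drop skewed or low-variance items) can be sketched as follows. The response matrix and the cutoff values are illustrative only; a real analysis would continue with principal components or factor analysis on the retained items:

```python
from statistics import mean, stdev

# Hypothetical 5-point Likert responses: rows = respondents, columns = items.
responses = [
    [4, 5, 2, 5, 3],
    [3, 5, 1, 4, 4],
    [4, 5, 2, 5, 2],
    [2, 5, 1, 3, 5],
    [3, 5, 2, 4, 1],
]

n_items = len(responses[0])
kept = []
for j in range(n_items):
    col = [row[j] for row in responses]
    m, sd = mean(col), stdev(col)
    # Illustrative cutoffs: drop items piled up at a scale endpoint or with
    # almost no variance, since they cannot discriminate between respondents.
    if sd < 0.5 or m < 1.5 or m > 4.5:
        print(f"item {j}: mean={m:.2f} sd={sd:.2f} -> dropped")
    else:
        kept.append(j)
        print(f"item {j}: mean={m:.2f} sd={sd:.2f} -> kept")
```

Here item 1, which every respondent answered identically, is dropped as uninformative; the surviving items would go forward into the component analysis.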
https://en.wikipedia.org/wiki/Continental%20Biogeographic%20Region
The Continental Biogeographic Region is a biogeographic region of Europe that extends in a broad band from east to west through the center of the continent. Extent The Continental Region extends from central France to the Ural Mountains. The southern part of this region is almost completely separated from the larger northern part by the Alps and Carpathians of the Alpine region and the plains of the Pannonian region. More than 25% of the European Union is in the Continental region. Luxembourg is completely within the region. Large parts of France, Germany, Italy, Poland, Czech Republic and Bulgaria are in the region, as are significant parts of Denmark, Belgium, Austria, Slovenia and Romania. Just 3% of Sweden is in the region. Environment The climate is generally hot in summer and cold in winter, with less variation of temperature in the west, where the Atlantic has a moderating influence. The land is generally flat in the north and hillier further south, apart from the wide floodplains of the Danube and Po rivers. The region was covered by wetlands and deciduous beech forests after the glaciers of the last ice age retreated. The forests have mostly been cleared to make way for farming and the rivers have been canalized, greatly reducing wetland habitats. Pomerania in Poland and eastern Germany is still thinly populated and holds many lakes, fens and mires. The southern part of the Continental Region has much vegetation in common with the lower levels of the Alpine region, and with the Mediterranean region.
https://en.wikipedia.org/wiki/Embedded%20hypervisor
An embedded hypervisor is a hypervisor that supports the requirements of embedded systems. The requirements for an embedded hypervisor are distinct from hypervisors targeting server and desktop applications. An embedded hypervisor is designed into the embedded device from the outset, rather than loaded subsequent to device deployment. While desktop and enterprise environments use hypervisors to consolidate hardware and isolate computing environments from one another, in an embedded system, the various components typically function collectively to provide the device's functionality. Mobile virtualization overlaps with embedded system virtualization, and shares some use cases. Typical attributes of embedded virtualization include efficiency, security, communication, isolation and real-time capabilities. Background Software virtualization has been a major topic in the enterprise space since the late 1960s, but only since the early 2000s has its use appeared in embedded systems. The use of virtualization and its implementation in the form of a hypervisor in embedded systems are very different from enterprise applications. An effective implementation of an embedded hypervisor must deal with a number of issues specific to such applications. These issues include the highly integrated nature of embedded systems, the requirement for isolated functional blocks within the system to communicate rapidly, the need for real-time/deterministic performance, the resource-constrained target environment and the wide range of security and reliability requirements. Hypervisor A hypervisor provides one or more software virtualization environments in which other software, including operating systems, can run with the appearance of full access to the underlying system hardware, where in fact such access is under the complete control of the hypervisor. These virtual environments are called virtual machines (VM)s, and a hypervisor will typically support multiple VMs managed simultane
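Spatial isolation, one of the attributes listed above, rests on the hypervisor granting each VM a disjoint set of physical resources fixed at build time. The toy sketch below (plain Python, hypothetical names and addresses, no relation to any real hypervisor's configuration format) validates a static partition table by checking that no two guests' memory windows overlap:

```python
# Static partition table for a hypothetical embedded hypervisor: each VM gets
# a fixed physical memory window decided at design time, not at run time.
# All names and addresses are made up for illustration.
partitions = {
    "rtos_guest":  (0x8000_0000, 0x8040_0000),  # [start, end) of RAM window
    "linux_guest": (0x8040_0000, 0x8800_0000),
    "monitor":     (0x8800_0000, 0x8810_0000),
}

def overlapping_pairs(parts):
    """Return address-adjacent pairs of VMs whose [start, end) windows intersect."""
    items = sorted(parts.items(), key=lambda kv: kv[1][0])
    bad = []
    for (name_a, (a0, a1)), (name_b, (b0, b1)) in zip(items, items[1:]):
        if b0 < a1:  # next window starts before the previous one ends
            bad.append((name_a, name_b))
    return bad

conflicts = overlapping_pairs(partitions)
print("partition table OK" if not conflicts else f"overlaps: {conflicts}")
```

A real embedded hypervisor performs this kind of static validation for memory, interrupts and devices, so that isolation violations are caught at integration time rather than in the field.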
https://en.wikipedia.org/wiki/Environment%20of%20Australia
The Australian environment ranges from virtually pristine Antarctic territory and rainforests to degraded industrial areas of major cities. Forty distinct ecoregions have been identified across the Australian mainland and islands. Central Australia has a very dry climate. The interior has a number of deserts while most of the coastal areas are populated. Northern Australia experiences tropical cyclones while much of the country is prone to periodic drought. This dry and warm environment, together with exposure to cyclones, makes Australia particularly vulnerable to climate change, with some areas already experiencing increases in wildfires and pressure on fragile ecosystems. The island ecology of Australia has led to a number of unique endemic plant and animal species, notably marsupials like the kangaroo and koala. Agriculture and mining are predominant land uses which cause negative impacts on many different ecosystems. The management of the impact on the environment from the mining industry, the protection of the Great Barrier Reef, forests and native animals are recurring issues of conservation. The protected areas in Australia are important sources of ecotourism, with sites like the Great Barrier Reef and World Heritage sites like the Tasmanian Wilderness World Heritage Area or the Uluṟu-Kata Tjuṯa National Park drawing both national and international tourists. Clean Up Australia Day was an initiative developed in 1989 to collaboratively clean up local areas and is held on the first Sunday of autumn (in March). Protected areas Protected areas cover 895,288 km2 of Australia's land area, or about 11.5% of the total land area. Of these, two-thirds are considered strictly protected (IUCN categories I to IV), and the rest is mostly managed resources protected area (IUCN category VI). There are also 200 marine protected areas, which cover a further 64.8 million hectares. Indigenous Protected Areas have been established since the 1990s, the largest of which covers part of the Tanami Dese
https://en.wikipedia.org/wiki/Synchronization%20in%20telecommunications
Many services running on modern digital telecommunications networks require accurate synchronization for correct operation. For example, if telephone exchanges are not synchronized, then bit slips will occur and degrade performance. Telecommunication networks rely on the use of highly accurate primary reference clocks which are distributed network-wide using synchronization links and synchronization supply units. Ideally, clocks in a telecommunications network are synchronous, controlled to run at identical rates, or at the same mean rate with a fixed relative phase displacement, within a specified limited range. However, they may be mesochronous in practice. In common usage, mesochronous networks are often described as synchronous. Components Primary reference clock (PRC) Modern telecommunications networks use highly accurate primary master clocks that must meet the international standards requirement for long-term frequency accuracy better than 1 part in 10¹¹. To get this performance, atomic clocks or GPS-disciplined oscillators are normally used. Synchronization supply unit Synchronization supply units (SSU) are used to ensure reliable synchronisation distribution. They have a number of key functions: They filter the synchronisation signal they receive to remove the higher-frequency phase noise. They provide distribution by providing a scalable number of outputs to synchronise other local equipment. They provide a capability to carry on producing a high-quality output even when their input reference is lost; this is referred to as holdover mode. Quality metrics In telecoms networks two key parameters are used for measurement of synchronisation performance. These parameters are defined by the International Telecommunication Union in its recommendation G.811, by the European Telecommunications Standards Institute in its standard EN 300 462-1-1, and by the ANSI Synchronization Interface Standard T1.101, which defines profiles for clock accuracy at each stratum level, and b
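The practical meaning of a 1-in-10¹¹ accuracy requirement can be checked with a little arithmetic. Assuming a 125 µs slip buffer (one PCM frame) and two plesiochronous clocks each off by 1×10⁻¹¹ in opposite directions (so 2×10⁻¹¹ relative offset, a worst-case assumption for this sketch), the time between slips works out to roughly the often-quoted figure of about 70 days:

```python
FRAME = 125e-6          # slip-buffer depth: one 125 microsecond PCM frame
REL_OFFSET = 2e-11      # worst-case relative frequency offset of two PRCs

# The time error between the clocks grows by REL_OFFSET seconds per second,
# so one full frame of buffer is consumed after FRAME / REL_OFFSET seconds.
seconds_per_slip = FRAME / REL_OFFSET
days_per_slip = seconds_per_slip / 86_400
print(f"one frame slip every {days_per_slip:.1f} days")
```

This is why the standards pin primary reference clocks at such extreme accuracy: relaxing the offset by a factor of ten would push slips from roughly every 72 days to roughly every week.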
https://en.wikipedia.org/wiki/Edward%20Kasner
Edward Kasner (April 2, 1878 – January 7, 1955) was an American mathematician who was appointed Tutor on Mathematics in the Columbia University Mathematics Department. Kasner was the first Jewish person appointed to a faculty position in the sciences at Columbia University. Subsequently, he became an adjunct professor in 1906, and a full professor in 1910, at the university. Differential geometry was his main field of study. In addition to introducing the term "googol", he is known also for the Kasner metric and the Kasner polygon. Education Kasner's 1899 PhD dissertation at Columbia University was titled The Invariant Theory of the Inversion Group: Geometry upon a Quadric Surface; it was published by the American Mathematical Society in 1900 in their Transactions. Googol and googolplex Kasner is perhaps best remembered today for introducing the term "googol." In order to pique the interest of children, Kasner sought a name for a very large number: one followed by 100 zeros. On a walk in the New Jersey Palisades with his nephews, Milton (1911–1981) and Edwin Sirotta, Kasner asked for their ideas. Nine-year-old Milton suggested "googol". In 1940, with James R. Newman, Kasner co-wrote a non-technical book surveying the field of mathematics, called Mathematics and the Imagination. It was in this book that the term "googol" was first popularized. The Internet search engine "Google" originated from a misspelling of "googol", and the "Googleplex" (the Google company headquarters in Mountain View, California) is similarly derived from googolplex. Personal life Kasner was Jewish and was the son of Austrian immigrants. Works Edward Kasner and James R. Newman, Mathematics and the Imagination, Tempus Books of Microsoft Press, 1989.
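The definition is easy to verify with arbitrary-precision integers: a googol is 10^100, a one followed by a hundred zeros.

```python
googol = 10 ** 100
assert str(googol) == "1" + "0" * 100   # one followed by 100 zeros
print(len(str(googol)))                  # 101 digits in total
# A googolplex, 10**googol, has googol + 1 digits -- far too many to print.
```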
https://en.wikipedia.org/wiki/Observations%20Concerning%20the%20Increase%20of%20Mankind%2C%20Peopling%20of%20Countries%2C%20etc.
Observations Concerning the Increase of Mankind, Peopling of Countries, etc. is a short essay written in 1751 by American polymath Benjamin Franklin. It was circulated by Franklin in manuscript to his circle of friends, but in 1755 it was published as an addendum in a Boston pamphlet on another subject. It was reissued ten times during the next 15 years. The essay examines population growth and its limits. Writing as, at the time, a loyal subject of the British Crown, Franklin argues that the British should increase their population and power by expanding across the Americas, taking the view that Europe is too crowded. Content Franklin projected an exponential growth (doubling every 25 years) in the population of the Thirteen Colonies, so that in a century "the greatest Number of Englishmen will be on this Side of the Water", thereby increasing the power of England. As Englishmen they would share language, manners, and religion with their countrymen in England, thus extending English civilization and English rule substantially. Franklin viewed the land in America as underutilized and available for the expansion of farming. This enabled the population to establish households at an earlier age and support larger families than was possible in Europe. The limit to expansion, reached in Europe but not in America, comes when people begin "crowding and interfering with each other's means of subsistence", an idea that would later inspire Malthus. Historian Walter Isaacson writes that Franklin's theory was empirically based on the population data of his day. Franklin's reasoning was essentially correct in that America's population continued to double every twenty years, surpassing England's population in the 1850s, and this continued until the frontier era ended in the early 1900s. According to the United States Census, from 1750 to 1900 the population of colonial and continental America overall doubled every twenty-five years, correctly aligning with Franklin's prediction. Protect
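Franklin's doubling claim is easy to project. Taking an illustrative starting population of about 1.2 million colonists in 1750 (the base figure here is an assumption for the sketch, not Franklin's own number) and doubling every 25 years:

```python
base_year, base_pop = 1750, 1_200_000   # illustrative starting point
DOUBLING = 25                           # years per doubling, per Franklin

for year in range(base_year, base_year + 101, DOUBLING):
    pop = base_pop * 2 ** ((year - base_year) // DOUBLING)
    print(f"{year}: {pop:>12,}")
# 1850 -> 19,200,000: the same order of magnitude as the ~23 million counted
# in the 1850 US census, and comparable to England's ~17 million in 1851.
```

Even with a rough starting figure, the geometric projection lands in the right decade for the American population overtaking England's, which is what the essay predicted a century in advance.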
https://en.wikipedia.org/wiki/H%C3%B6lder%27s%20theorem
In mathematics, Hölder's theorem states that the gamma function does not satisfy any algebraic differential equation whose coefficients are rational functions. This result was first proved by Otto Hölder in 1887; several alternative proofs have subsequently been found. The theorem also generalizes to the q-gamma function. Statement of the theorem For every $n \in \mathbb{N}_0$ there is no non-zero polynomial $P \in \mathbb{C}[Z; Y_0, Y_1, \ldots, Y_n]$ such that $P\bigl(x; \Gamma(x), \Gamma'(x), \ldots, \Gamma^{(n)}(x)\bigr) \equiv 0$, where $\Gamma$ is the gamma function. For example, define $P \in \mathbb{C}[Z; Y_0, Y_1, Y_2]$ by $P(Z; Y_0, Y_1, Y_2) = Z^2 Y_2 + Z Y_1 + (Z^2 - \nu^2) Y_0$. Then the equation $P\bigl(x; f(x), f'(x), f''(x)\bigr) = x^2 f''(x) + x f'(x) + (x^2 - \nu^2) f(x) = 0$ is called an algebraic differential equation, which, in this case, has the solutions $f = J_\nu$ and $f = Y_\nu$, the Bessel functions of the first and second kind respectively. Hence, we say that $J_\nu$ and $Y_\nu$ are differentially algebraic (also algebraically transcendental). Most of the familiar special functions of mathematical physics are differentially algebraic. All algebraic combinations of differentially algebraic functions are differentially algebraic. Furthermore, all compositions of differentially algebraic functions are differentially algebraic. Hölder's Theorem simply states that the gamma function, $\Gamma$, is not differentially algebraic and is therefore transcendentally transcendental. Proof Let $n \in \mathbb{N}_0$ and assume that a non-zero polynomial $P \in \mathbb{C}[Z; Y_0, Y_1, \ldots, Y_n]$ exists such that $P\bigl(x; \Gamma(x), \Gamma'(x), \ldots, \Gamma^{(n)}(x)\bigr) \equiv 0$. As a non-zero polynomial in $\mathbb{C}[Z]$ can never give rise to the zero function on any non-empty open domain of $\mathbb{R}$ (by the fundamental theorem of algebra), we may suppose, without loss of generality, that $P$ contains a monomial term having a non-zero power of one of the indeterminates $Y_0, Y_1, \ldots, Y_n$. Assume also that $P$ has the lowest possible overall degree with respect to the lexicographic ordering $Y_0 < Y_1 < \cdots < Y_n$. For example, $\deg\bigl(Z^3 Y_0^5 Y_1^2\bigr) < \deg\bigl(Z Y_0 Y_1^3\bigr)$ because the highest power of $Y_1$ in any monomial term of the first polynomial is smaller than that of the second polynomial. Next, observe that for all $x$ we have: $\Gamma(x+1) = x\,\Gamma(x)$, and, differentiating this functional equation repeatedly, $\Gamma^{(k)}(x+1) = x\,\Gamma^{(k)}(x) + k\,\Gamma^{(k-1)}(x)$ for $k = 1, \ldots, n$. If we define a second polynomial $Q$ by the transformation $Q(Z; Y_0, Y_1, \ldots, Y_n) = P(Z+1;\, Z Y_0,\, Z Y_1 + Y_0,\, \ldots,\, Z Y_n + n Y_{n-1})$, then we obtain the following algebraic differential equation for $\Gamma$: $Q\bigl(x; \Gamma(x), \Gamma'(x), \ldots, \Gamma^{(n)}(x)\bigr) \equiv 0$. Furthermore, if $Z^h Y_0^{h_0} Y_1^{h_1} \cdots Y_n^{h_n}$ is the highest-degree monomial term in $P$, then the highest-degree monomial term i
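The Bessel example can be checked numerically: for order ν = 0, Bessel's equation reduces to x²y″ + xy′ + x²y = 0. The sketch below builds J₀ from its power series using only the standard library and evaluates the residual with central finite differences (so the residual is zero only up to finite-difference error):

```python
import math

def j0(x, terms=40):
    """Bessel function of the first kind, order 0, via its power series."""
    return sum((-1) ** m / (math.factorial(m) ** 2) * (x / 2) ** (2 * m)
               for m in range(terms))

def residual(x, h=1e-5):
    """x^2*y'' + x*y' + x^2*y for y = J0, derivatives by central differences."""
    y = j0(x)
    y1 = (j0(x + h) - j0(x - h)) / (2 * h)
    y2 = (j0(x + h) - 2 * y + j0(x - h)) / h ** 2
    return x * x * y2 + x * y1 + x * x * y

for x in (0.5, 1.0, 2.0):
    print(f"x={x}: residual = {residual(x):.2e}")  # ~0 up to discretization error
```

The residuals vanish to within the finite-difference error, confirming that J₀ satisfies an algebraic differential equation, which is precisely what Hölder's theorem rules out for Γ.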
https://en.wikipedia.org/wiki/The%20Journal%20of%20Pathology
The Journal of Pathology is a peer-reviewed medical journal that was established in 1892 as The Journal of Pathology and Bacteriology by German Sims Woodhead. It has been the official journal of the Pathological Society of Great Britain and Ireland (present name: Pathological Society) since its foundation in 1906. The journal has published important papers in pathology and experimental medicine including work by Rudolf Virchow and Ilya Ilyich Mechnikov, both of whom contributed to the inaugural issue. In 1969, the journal's title was shortened to The Journal of Pathology. In January 1999, the first of an ongoing series of Annual Review issues was published, on the topic of "Molecular and Cellular Themes in Cancer Research", edited by Peter A. Hall and David P. Lane. A history of the journal was written in 2006 by former editor-in-chief C. Simon Herrington, as a chapter of a book on the history of the Pathological Society. The journal publishes research papers, reviews, commentaries, and perspectives, as well as the abstracts of the Pathological Society Winter and Summer Meetings (in two separate online-only yearly supplements). The current editor in chief is Peter A. Hall MD PhD. The journal is published on behalf of the Pathological Society by John Wiley & Sons. Because of the success of the Journal, in mid 2014 a companion journal was launched with a more clinical focus. This was initially called The Clinical Journal of Pathology but was then renamed Journal of Pathology: Clinical Research. Peter A. Hall was the initial editor of this companion journal. Peter Hall stood down as the Editor in Chief in late 2014 after 7 years.
https://en.wikipedia.org/wiki/Bernhard%20Landwehrmeyer
Georg Bernhard Landwehrmeyer FRCP is a German neurologist and neuroscientist in the field of neurodegeneration, primarily focusing on Huntington's disease. Landwehrmeyer is a professor of neurology at Ulm University Hospital. He was one of the founders of the European Huntington's Disease Network (EHDN) in 2004 and was chairman of its executive committee until 2014. Education and career Landwehrmeyer received his MD and doctoral degrees from the Albert Ludwigs University of Freiburg, Germany, where he also completed a residency in neurology, research training in neuropathology and molecular pharmacology, and a residency in neurology and psychiatry. Landwehrmeyer studied at the Royal Victoria Hospital, Belfast and Kantonsspital St. Gallen, Basel. He was a postdoc from 1993 to 1995 at Massachusetts General Hospital, Harvard Medical School. During 1995–1999, he was a staff member at the Albert Ludwigs University of Freiburg, Department of Neurology & Psychiatry. In 1999, he received his board certification in neurology and has been a full professor since 2000. He served as principal investigator in numerous HD clinical trials and observational studies and is the principal investigator of the CHDI-sponsored Enroll-HD study. Research Landwehrmeyer started working on Huntington's disease (HD) in 1993 when he began a postdoc at Massachusetts General Hospital (MGH) with Anne B. Young, then Chief of Neurology, a couple of months before the HD gene and the HD expansion mutation were discovered. He went to Venezuela with Anne Young and Nancy Wexler several times, which drew his attention to stimulating HD field studies alongside work at the bench. In 2000 he was appointed full Professor of Neurology, 'Clinical Neurobiology,' at the University of Ulm and was given the opportunity to organize (together with Albert Ludolph, the chairperson of the Department of Neurology at the University of Ulm, who initiated the work) the first large (>500 participants), long-term (3-year randomized
https://en.wikipedia.org/wiki/Anne%20Fausto-Sterling
Anne Fausto-Sterling (née Sterling; born July 30, 1944) is an American sexologist who has written extensively on the social construction of gender, sexual identity, gender identity, gender roles, and intersexuality. She is the Nancy Duke Lewis Professor Emerita of Biology and Gender Studies at Brown University. Life and career Fausto-Sterling's mother, Dorothy Sterling, was a noted writer and historian while her father was also a published writer. Fausto-Sterling received her Bachelor of Arts degree in zoology from the University of Wisconsin in 1965 and her Ph.D. in developmental genetics from Brown University in 1970. After earning her Ph.D. she joined the faculty of Brown, where she was appointed Nancy Duke Lewis Professor of Biology and Gender Studies. In a 1993 paper titled "", Fausto-Sterling laid out a thought experiment considering an alternative model of gender containing five sexes: male, female, merm, ferm, and herm. She later said that the paper "had intended to be provocative, but I had also written with tongue firmly in cheek". Fausto-Sterling has written two books intended for a general audience. The first of those books, Myths of Gender, was first published in 1985. Her second book for the general public is Sexing the Body: Gender Politics and the Construction of Sexuality, published in 2000. In the book she sets out to "convince readers of the need for theories that allow for a good deal of human variation and that integrate the analytical powers of the biological and the social into the systematic analysis of human development." Fausto-Sterling married Paula Vogel, a Yale professor and Pulitzer-winning playwright, in 2004. She has served on the editorial board of the journal Perspectives in Biology and Medicine and on the advisory board of the feminist academic journal Signs. She retired from Brown University in 2014, after 44 years on the faculty. Reception Historian of science Evelynn M. Hammonds describes Fausto-Sterling as one of the most i
https://en.wikipedia.org/wiki/Proteinoplast
Proteinoplasts (sometimes called proteoplasts, aleuroplasts, and aleuronaplasts) are specialized organelles found only in plant cells. Proteinoplasts belong to a broad category of organelles known as plastids, specialized double-membrane organelles of plant cells that perform a variety of functions, including energy metabolism and the synthesis and storage of compounds. Plastids are divided into categories, including leucoplasts, chromoplasts, and chloroplasts, on the basis of characteristics such as size, function and physical traits. Chromoplasts synthesize and store large amounts of carotenoids. Chloroplasts are photosynthesizing structures that capture light energy and convert it into chemical energy for the plant. Leucoplasts are a colorless type of plastid in which no photosynthesis occurs; they are colorless because, unlike chloroplasts and chromoplasts, they lack the thylakoid structures that give those plastids their pigmentation. Proteinoplasts are a subtype of leucoplast that contains protein for storage. They contain crystalline bodies of protein and can be the sites of enzyme activity involving those proteins. Proteinoplasts are found in many seeds, such as brazil nuts, peanuts and pulses. Although all plastids contain high concentrations of protein, proteinoplasts were identified in the 1960s and 1970s as having large protein inclusions that are visible with both light microscopes and electron microscopes. Other subtypes of leucoplast include amyloplasts, which store and synthesize starch molecules in plants, and elaioplasts, which synthesize and store lipids in plant cells. See also Chloroplast and etioplast Chromoplast Leucoplast Amyloplast Elaioplast
https://en.wikipedia.org/wiki/Nanobiomechanics
Nanobiomechanics (also bionanomechanics) is an emerging field in nanoscience and biomechanics that applies the powerful tools of nanomechanics to explore the fundamental science of biomaterials and biomechanics. Since its introduction by Yuan-Cheng Fung, the field of biomechanics has become one of the branches of mechanics and bioscience. For many years, biomechanics has examined tissue. Through advancements in nanoscience, the scale of the forces that could be measured, and also the scale of observation of biomaterials, was reduced to the "nano" and "pico" level. Consequently, it became possible to measure the mechanical properties of biological materials at the nanoscale. This is relevant to improving tissue engineering processes and cellular therapy. Most biological materials have different hierarchical levels, and the smallest of these are at the nanoscale. For example, bone has up to seven levels of biological organization, and at the smallest level single collagen fibrils and hydroxylapatite minerals have dimensions well below 100 nm. Being able to probe properties at such small scales therefore provides a great opportunity to better understand the fundamental properties of these materials. For example, measurements have shown that nanomechanical heterogeneity exists even within single collagen fibrils as small as 100 nm. Another highly relevant topic in this field is the measurement of tiny forces on living cells to recognize changes caused by different diseases, including disease progression. For example, it has been shown that red blood cells infected by malaria are 10 times stiffer than normal cells. Likewise, it has been shown that cancer cells are 70 percent softer than normal cells. Early signs of aging cartilage and osteoarthritis have been detected by looking at changes in the tissue at the nanoscale. Methods, instrumentation, and application The common methods in nanobiomechanics include atomic force microscopy (AFM), nanoindentat
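A common way such stiffness numbers are extracted is by fitting AFM force-indentation curves to a contact model. The sketch below uses the Hertz model for a spherical tip, F = (4/3)·(E/(1−ν²))·√R·δ^1.5, and recovers the elastic modulus from synthetic (made-up) data by linear least squares on the δ^1.5 term; tip radius, Poisson's ratio and the "true" modulus are all illustrative assumptions:

```python
import math

R = 1e-6          # tip radius: 1 micrometre (illustrative)
NU = 0.5          # Poisson's ratio, a common assumption for soft cells
E_TRUE = 2_000.0  # "true" modulus in Pa, used only to synthesise the data

def hertz_force(delta, E):
    """Hertz contact force for a sphere: F = (4/3)*(E/(1-nu^2))*sqrt(R)*delta^1.5"""
    return (4 / 3) * (E / (1 - NU ** 2)) * math.sqrt(R) * delta ** 1.5

# Synthetic force curve: indentation depths from 10 to 200 nm.
deltas = [d * 1e-9 for d in range(10, 201, 10)]
forces = [hertz_force(d, E_TRUE) for d in deltas]

# Least squares for F = k * delta^1.5 (one parameter, so a closed form exists),
# then invert the Hertz prefactor to recover E from the fitted slope k.
x = [d ** 1.5 for d in deltas]
k = sum(f * xi for f, xi in zip(forces, x)) / sum(xi * xi for xi in x)
E_fit = k * 3 * (1 - NU ** 2) / (4 * math.sqrt(R))
print(f"fitted E = {E_fit:.1f} Pa")
```

On real AFM data the same fit is run on noisy curves, and comparing the fitted moduli of healthy and diseased cells yields contrasts like the malaria and cancer figures quoted above.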
https://en.wikipedia.org/wiki/Isobutane
Isobutane, also known as i-butane, 2-methylpropane or methylpropane, is a chemical compound with molecular formula HC(CH3)3. It is an isomer of butane. Isobutane is a colorless, odorless gas. It is the simplest alkane with a tertiary carbon atom. Isobutane is used as a precursor molecule in the petrochemical industry, for example in the synthesis of isooctane. Production Isobutane is obtained by isomerization of butane. Uses Isobutane is the principal feedstock in alkylation units of refineries. Using isobutane, gasoline-grade "blendstocks" are generated with high branching for good combustion characteristics. Typical products created with isobutane are 2,4-dimethylpentane and especially 2,2,4-trimethylpentane. Solvent In the Chevron Phillips slurry process for making high-density polyethylene, isobutane is used as a diluent. As the slurried polyethylene is removed, isobutane is "flashed" off, and condensed, and recycled back into the loop reactor for this purpose. Precursor to tert-butyl hydroperoxide Isobutane is oxidized to tert-butyl hydroperoxide, which is subsequently reacted with propylene to yield propylene oxide. The tert-butanol that results as a by-product is typically used to make gasoline additives such as methyl tert-butyl ether (MTBE). Miscellaneous uses Isobutane is also used as a propellant for aerosol spray cans. Isobutane is used as part of blended fuels, especially common in fuel canisters used for camping. Refrigerant Isobutane is used as a refrigerant. Use in refrigerators started in 1993 when Greenpeace presented the Greenfreeze project with the former East German company . In this regard, blends of pure, dry "isobutane" (R-600a) (that is, isobutane mixtures) have negligible ozone depletion potential and very low global warming potential (having a value of 3.3 times the GWP of carbon dioxide) and can serve as a functional replacement for R-12, R-22 (both of these being commonly known by the trademark Freon), R-134a, and other chlorofluo
https://en.wikipedia.org/wiki/Oxhydroelectric%20effect
The oxhydroelectric effect consists of the generation of voltage and electric current in pure liquid water, without any electrolyte, upon exposure to electromagnetic radiation in the infrared range, after a physical (not chemical) asymmetry is created in the liquid water, e.g. by a strongly hydrophilic polymer such as Nafion. Since the publication of the first seminal research, other independent research referring to this effect has been published in reputable, peer-reviewed scientific journals (with impact factors higher than the median in the respective fields). The system can be described as a photovoltaic cell operating in the infrared electromagnetic range, based on liquid water instead of a semiconductor. Theoretical model The model proposed by Roberto Germano and his collaborators, who first observed the effect, is based on the known concept of the exclusion zone. The first observations of a different behaviour of water molecules close to the walls of their container date back to the late 1960s and early 1970s, when Drost-Hansen, upon reviewing many experimental articles, came to the conclusion that interfacial water shows structural differences with respect to bulk liquid water. In 2006 Gerald Pollack published a seminal work on the exclusion zone, and those observations were subsequently reported by several other groups, in which a hydrophilic material creates a coherent water region at the boundary between its surface and the water. Further elaborating on the work of Pollack, the model describes liquid water as a system made of two phases: a matrix of non-coherent water molecules hosting many "Coherence Domains" (CDs), about 0.1 μm in size, found in the exclusion zone but also in the bulk volume. In this model the behaviour of the coherence domains is also considered the cause of the formation of xerosydryle. The two phases are characterized by different thermodynamic parameters and are in a stable non-equilibrium state. The coherent
https://en.wikipedia.org/wiki/Julia%20B.%20Olayanju
Julia B. Olayanju is a geneticist, writer, and nutrition education advocate. She uses different approaches to increase public awareness of the science of food and health. She is the founder of FoodNiche Inc. and GrubEasy Interactive Labs Inc. Education Olayanju developed an interest in scientific research while working on a research project for her master's degree at West Texas A&M University. She then proceeded to Rutgers University, New Brunswick, New Jersey, where she earned her Ph.D. in Microbiology and Molecular Genetics. Her doctoral research focused on understanding the anti-cancer properties and mechanism of action of isothiocyanates in breast cancer cells. She also earned a Master of Public Health from the Harvard T.H. Chan School of Public Health. Work Olayanju is the convener of the FoodNiche Tech Summit and the FoodNiche Global Innovation Summit, creating networking and learning opportunities for food industry stakeholders. The conferences bring together experts from academia and business leaders from the food industry for collaboration and thought-provoking conversations on shaping a healthier food system. Olayanju co-founded GrubEasy Interactive Labs Inc. to leverage technology to promote education on the science of food and health. Through the technology platform FoodNiche®-Ed, teachers can engage, reward, and educate students on the science of food and health. Olayanju uses the media to communicate scientific facts and promote awareness of the importance of food to overall well-being. She started this through her column on Forbes and more recently by hosting scientists and food industry experts from around the world on The Food + Health Podcast. Personal life Julia B. Olayanju married Bunmi Olayanju in 2006; they have two children.
https://en.wikipedia.org/wiki/Music%20on%20demand
Music-on-demand (MOD) is a multi-billion-dollar, subscription-based music distribution model conceived with the growth of two-way computing and telecommunications in the early 1990s, originally architected by Dale Schalow. Primarily, high-quality music is made available to purchase, access by search, and play back instantly using software on set-top boxes (6 MHz separated guard-band channels), coaxial cable, fiber optics, cellular mobile devices, Apple Macintosh, and Microsoft Windows, from an available distribution point, such as a computer host or server located at a telephone, cable TV, or wireless data center facility. History In 1992, computer modem speeds were limited to less than 28 thousand bits per second (28 kbit/s), compared with uncompressed, pulse-code-modulated (PCM) music on compact disc (CD) that required 150 thousand bytes per second. As a result, additional bandwidth was required to deliver real-time audio at CD quality standards: 16-bit frames, 44.1 kHz sampling rate, stereophonic (two-channel) audio. This prompted telephony, CATV, cellular and satellite providers (a Virginia Tech DoD cooperative) to consider changing standards, in terms of building higher capacity into existing telecommunications infrastructures, and to consider business use cases for offering supplemental, U.S.-based, private, affordable monthly on-demand subscription plans with revenue split as compensation among music artists' representation, licensing groups, the telecommunications provider, and the music-on-demand technology provider. Early design, long-range planning, and development of music-on-demand technology, in accordance with the laws of the United States such as the Audio Home Recording Act of 1992 and mechanical copyright licensing, include Access Music Network (AMN) by inventor and technology owner Dale Schalow. Mr. Schalow, in the early 1990s, was an independent audio engineer and programmer in Los Angeles, C
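The bandwidth gap described above can be checked with back-of-the-envelope arithmetic. The sketch below uses the CD parameters named in the text (16-bit frames, 44.1 kHz, stereo); note the raw PCM rate works out to about 176 thousand bytes per second, slightly above the 150 thousand figure cited, which is closer to the commonly quoted 1× CD-ROM transfer rate.

```python
# Raw data rate of CD-quality PCM audio vs. a 1992 modem.
SAMPLE_RATE_HZ = 44_100   # CD sampling rate
BITS_PER_SAMPLE = 16      # 16-bit frames
CHANNELS = 2              # stereophonic

cd_bits_per_second = SAMPLE_RATE_HZ * BITS_PER_SAMPLE * CHANNELS
cd_bytes_per_second = cd_bits_per_second // 8
modem_bits_per_second = 28_800  # the 28 kbit/s ceiling cited for 1992

print(cd_bytes_per_second)                          # 176400 bytes/s of raw PCM
print(cd_bits_per_second // modem_bits_per_second)  # 49: CD audio needs ~49x a modem's bandwidth
```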
https://en.wikipedia.org/wiki/Wireless%20ad%20hoc%20network
A wireless ad hoc network (WANET) or mobile ad hoc network (MANET) is a decentralized type of wireless network. The network is ad hoc because it does not rely on a pre-existing infrastructure, such as routers or wireless access points. Instead, each node participates in routing by forwarding data for other nodes. The determination of which nodes forward data is made dynamically on the basis of network connectivity and the routing algorithm in use. Such wireless networks lack the complexities of infrastructure setup and administration, enabling devices to create and join networks "on the fly". Each device in a MANET is free to move independently in any direction, and will therefore change its links to other devices frequently. Each node must forward traffic unrelated to its own use, and therefore act as a router. The primary challenge in building a MANET is equipping each device to continuously maintain the information required to properly route traffic. This becomes harder as the scale of the MANET increases due to 1) the desire to route packets to or through every other node, 2) the growing percentage of overhead traffic needed to maintain real-time routing status, 3) each node routing its own goodput independently, unaware of the needs of others, and 4) the limited communication bandwidth, such as a slice of radio spectrum, that all nodes must share. Such networks may operate by themselves or may be connected to the larger Internet. They may contain one or multiple and different transceivers between nodes, resulting in a highly dynamic, autonomous topology. MANETs usually have a routable networking environment on top of a link-layer ad hoc network. History Packet radio The earliest wireless data network was called PRNET, the packet radio network, and was sponsored by the Defense Advanced Research Projects Agency (DARPA) in the early 1970s. Bolt, Beranek and Newman Inc. (BBN) and SRI International designed, built, and experimented with these earliest systems. Experimenters included Robert Kahn,
https://en.wikipedia.org/wiki/Fervidicoccus
Fervidicoccus fontis is a species of anaerobic, organotrophic archaeon belonging to the kingdom Thermoproteota. It is thermophilic and slightly acidophilic, being found at the Uzon Caldera at temperatures between 75 °C and 80 °C. Cells are single cocci of 1–3 μm in diameter with no archaella.
https://en.wikipedia.org/wiki/Goodnites
Goodnites (formerly Pull-Ups Goodnites; known as DryNites in the United Kingdom and most markets outside of North America) are disposable underwear designed for managing bedwetting. Goodnites are produced by Kimberly-Clark. The product is also marketed as Huggies Goodnites on official Huggies-branded webpages. Goodnites constitute the middle level of Kimberly-Clark's line of disposable products, being targeted at children, teens and young adults. The company also produces Huggies diapers for babies, Pull-Ups training pants for toddlers undergoing toilet training, Poise pads for adult women, and Depend incontinence products for adults in general. History 1990s 1994 - Goodnites released The original Goodnites were released in 1994. They came in two sizes: medium (45-65 lbs) and large (65-85 lbs). 1999 - Goodnites released a new size In 1999, Kimberly-Clark introduced a new extra-large size (85-125 lbs and up). 2000s 2001 A "Cloth-Like Cover" replaced the previous cover. 2003 - Goodnites introduces a new fit Kimberly-Clark introduced the "Trim-Fit" style (a drastic reduction in padding thickness and the overall size of the pull-up). 2004 - Goodnites introduces gender-specific versions Prior to 2004, Goodnites were unisex, plain-white pull-ups with only a faux tag printed on the back. Kimberly-Clark introduced gender-specific Goodnites with absorbency zoned for boys and girls. Medium Goodnites became small/medium and were designed to fit kids 38-65 pounds. The small/medium size is the equivalent of size 4-8 underwear (size 8 US is a 23.5 in waist). Large and extra-large Goodnites were combined into large/extra-large for kids from 60-125+ pounds (the height for a healthy weight of 125 pounds is 4' 11" up to 5' 8"; the CDC states that the average height for men is 69 in or 5' 9" with an average waist size of 40.3 in, while the average for women is 63.6 in or 5' 3.6" with an average waist size of 38.7 in). The large/extra-large size is equivalent to si
https://en.wikipedia.org/wiki/Rate%20of%20return%20pricing
Rate of return pricing or target-return pricing is a method by which a firm sets the price of its product based on the desired return on that product. The concept is very similar to return on investment; however, in this circumstance the company can manipulate its prices to achieve the desired goal. This method is used primarily by companies that either have a lot of capital or have a monopoly on the market, and when an investor requests a specific return on their investment. In a competitive market, rate of return pricing can be a poor strategy, as it focuses on final profit margins and does not account for supply and demand factors. If a competitor is able to set a lower price, it could decrease demand for the product, resulting in lower sales than forecast and failure to reach the desired profit margin. Formula The formula is: Target-return price = unit cost + [(desired return on investment * invested capital) / expected unit sales] For example, assume a firm invests $100 million in order to produce and market designer snowflakes, and they estimate that with demand for designer snowflakes being what it is, they can sell 2 million flakes per year. Further, from preliminary production data they know that at that level of output their average total cost is $50 per flake. Total annual costs would be $100 million (2 million units at $50 each). Next, management decides they want a 20% return on investment (ROI), which is $20 million (20% of a $100 million investment). Entering this data into the formula: Target-return price = 50 + [(0.2 * 100 million) / 2 million] = 50 + (20 million / 2 million) = 50 + 10 = 60 Thus the price for the snowflakes will be $60 each. Advantages Rate of return pricing enables firms to better assess the profitability of a product or service. It enables the cost of invested capital to be accounted for when setting the price per unit an
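The worked example above reduces to one line of arithmetic; the helper below is an illustrative sketch of the formula, plugged with the snowflake numbers from the text.

```python
def target_return_price(unit_cost, desired_roi, invested_capital, expected_unit_sales):
    """Target-return price = unit cost + (desired ROI * invested capital) / expected unit sales."""
    return unit_cost + (desired_roi * invested_capital) / expected_unit_sales

# The designer-snowflake example: $100M invested, 20% ROI target, 2M units, $50 unit cost.
price = target_return_price(unit_cost=50,
                            desired_roi=0.20,
                            invested_capital=100_000_000,
                            expected_unit_sales=2_000_000)
print(price)  # 60.0
```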
https://en.wikipedia.org/wiki/Bowman%E2%80%93Heidenhain%20hypothesis
The Bowman–Heidenhain hypothesis is an early explanation of renal function and urinary secretion. The hypothesis states that the kidney is first and foremost a secreting gland. According to the theory, the glomeruli are merely filters, while the tubules are the true secretory structures. To prove his hypothesis, Rudolph Heidenhain injected methylene blue into an animal's bloodstream; the dye soon appeared in the urine. Further evidence for the theory is that when the tubules are destroyed, the urine becomes watery while urea and other toxic products remain in the blood. This theory was later merged with the Ludwig theory to form the modern theory.
https://en.wikipedia.org/wiki/GPS%20drawing
GPS drawing, also known as GPS art, is a method of drawing where an artist uses a Global Positioning System (GPS) device and follows a pre-planned route to create a large-scale picture or pattern. The .GPX data file recorded during the drawing process is then visualised, usually by overlaying it as a line on a map of the area. Artists usually run or cycle the route, while cars, vans, boats and aeroplanes are used to create larger pieces. The first known GPS drawing was made by Reid Stowe in 1999: "Voyage of the Turtle" is an ocean-sized drawing with a 5,500-mile circumference, made in the Atlantic using a sailboat. The GPS data was recorded in logbooks and was therefore very low resolution. In 2000, after the US military GPS satellite signals were opened up to the public, artists Jeremy Wood and Hugh Pryor were able to use a newly available GPS tracker to record their movements. To display their drawings, Hugh Pryor wrote a computer program which converted the GPX data into a single line to be shown on screen or turned into an image file. With these tools in place, GPS drawing was able to develop as a distinct art form. Planning GPS artists can spend many hours finding a certain image or text hidden in a map, or can sometimes simply see an existing image in a map due to pareidolia. In many cities and towns the road layout and landscape restrict the routes available, so artists have to find creative ways to show their pictures or characters. In cities with a strong grid pattern, 8-bit-style or pixelated images can be created of almost any object or shape. Many artists will create paper or digital maps of their route to follow on their journey. Artistic style There are many approaches to GPS drawing which an artist can choose depending on their means of travel and the landscape around them. Roads, trails, and paths only One style uses only pre-existing roads, paths, trails, etc. This can make it more challenging to find a route and plan the artwor
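A GPX file is plain XML, so turning a recorded track into a single drawable polyline takes only a few lines. The sketch below is a hypothetical minimal reimplementation of that step, not Pryor's actual program; it assumes the standard GPX 1.1 namespace and uses a tiny two-point sample track for illustration.

```python
import xml.etree.ElementTree as ET

GPX_NS = {"gpx": "http://www.topografix.com/GPX/1/1"}

def track_to_polyline(gpx_xml):
    """Flatten every <trkpt> in a GPX document into one (lat, lon) polyline."""
    root = ET.fromstring(gpx_xml)
    return [(float(pt.get("lat")), float(pt.get("lon")))
            for pt in root.iterfind(".//gpx:trkpt", GPX_NS)]

# Hypothetical two-point track for illustration:
sample = """<gpx xmlns="http://www.topografix.com/GPX/1/1" version="1.1" creator="demo">
  <trk><trkseg>
    <trkpt lat="51.5074" lon="-0.1278"/>
    <trkpt lat="51.5080" lon="-0.1270"/>
  </trkseg></trk>
</gpx>"""

print(track_to_polyline(sample))  # [(51.5074, -0.1278), (51.508, -0.127)]
```

In a real pipeline the (lat, lon) list would then be projected and rendered as a single stroke on a map.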
https://en.wikipedia.org/wiki/The%20Lexicon%20of%20Comicana
The Lexicon of Comicana is a 1980 book by the American cartoonist Mort Walker. It was intended as a tongue-in-cheek look at the devices used by comics cartoonists. In it, Walker invented an international set of symbols called symbolia after researching cartoons around the world (described by the term comicana). In 1964, Walker had written an article called "Let's Get Down to Grawlixes", a satirical piece for the National Cartoonists Society. He used terms such as grawlixes for his own amusement, but they soon began to catch on and acquired an unexpected validity. The Lexicon was written in response to this. The names he invented for them sometimes appear in dictionaries, and serve as convenient terminology occasionally used by cartoonists and critics. A 2001 gallery showing of comic- and street-influenced art in San Francisco, for example, was called "Plewds! Squeans! and Spurls!" Examples Agitrons: wiggly lines around a shaking object or character. Blurgits, swalloops: curved lines preceding or trailing after a character's moving limbs. Briffits (💨): clouds of dust that hang in the wake of a swiftly departing character or object. Dites, hites and vites: straight lines drawn across flat, clear and reflective surfaces, such as windows and mirrors. The first letter indicates direction: diagonal, horizontal and vertical respectively. Hites may also be used trailing after something moving with great speed. Emanata: lines drawn around the head to indicate shock or surprise Grawlixes (#, $, *, @): typographical symbols standing in for profanities, appearing in dialogue balloons in place of actual dialogue. Indotherm (♨): wavy, rising lines used to represent steam or heat. Lucaflect: a shiny spot on a surface of something, depicted as a four-paned window shape. Plewds (💦): flying sweat droplets that appear around a character's head when working hard, stressed, etc. Quimps (🪐): A special example of the grawlix, a symbol resembling the planet Saturn. Solrads: radiating li
https://en.wikipedia.org/wiki/Thiele%27s%20interpolation%20formula
In mathematics, Thiele's interpolation formula defines a rational function from a finite set of inputs and their function values . The problem of generating a function whose graph passes through a given set of function values is called interpolation. The formula is named after the Danish mathematician Thorvald N. Thiele. It is expressed as a continued fraction, where ρ represents the reciprocal difference: Note that the -th level in Thiele's interpolation formula is while the -th reciprocal difference is defined to be . The two terms are different and cannot be cancelled.
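The continued fraction can be evaluated numerically from a table of reciprocal differences. The sketch below (where rho[k][i] holds the k-th reciprocal difference over points i..i+k, an implementation choice rather than Thiele's own notation) recovers the rational function 1/(1+x) exactly from three samples.

```python
def thiele_interpolate(xs, fs, x):
    """Evaluate Thiele's continued-fraction interpolant through (xs[i], fs[i]) at x."""
    n = len(xs)
    # rho[k][i]: k-th reciprocal difference over xs[i..i+k].
    rho = [[0.0] * n for _ in range(n)]
    rho[0] = list(fs)
    for i in range(n - 1):
        rho[1][i] = (xs[i] - xs[i + 1]) / (rho[0][i] - rho[0][i + 1])
    for k in range(2, n):
        for i in range(n - k):
            rho[k][i] = ((xs[i] - xs[i + k]) / (rho[k - 1][i] - rho[k - 1][i + 1])
                         + rho[k - 2][i + 1])
    # Continued-fraction coefficients: a_0 = f(x_0), a_1 = rho_1, a_k = rho_k - rho_{k-2}.
    a = [rho[0][0], rho[1][0]] + [rho[k][0] - rho[k - 2][0] for k in range(2, n)]
    value = a[-1]
    for k in range(n - 2, -1, -1):  # unwind the continued fraction from the bottom
        value = a[k] + (x - xs[k]) / value
    return value

# Three samples of f(x) = 1/(1+x); the interpolant reproduces f exactly at x = 0.5.
print(thiele_interpolate([0.0, 1.0, 2.0], [1.0, 0.5, 1/3], 0.5))  # 0.666... (= 2/3)
```

As with reciprocal differences generally, some data sets produce zero denominators in the table (e.g. when the data comes from a lower-degree rational function), so a robust implementation would need to handle those cases.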
https://en.wikipedia.org/wiki/Continental%20rise
The continental rise is a low-relief zone of accumulated sediments that lies between the continental slope and the abyssal plain. It is a major part of the continental margin, covering around 10% of the ocean floor. Formation This geologic structure results from deposition of sediments, mainly due to mass wasting, the gravity-driven downhill motion of sand and other sediments. Mass wasting can occur gradually, with sediments accumulating discontinuously, or in large, sudden events. Large mass wasting occurrences are often triggered by sudden events such as earthquakes or oversteepening of the continental slope. More gradual accumulation of sediments occurs when hemipelagic sediments suspended in the ocean slowly settle to the ocean basin. Slope Because the continental rise lies below the continental slope and is formed from sediment deposition, it has a very gentle slope, usually ranging from 1:50 to 1:500. As the continental rise extends seaward, the layers of sediment thin, and the rise merges with the abyssal plain, typically forming a slope of around 1:1000. Accompanying Structures Alluvial Fans Deposition of sediments at the mouth of submarine canyons may form enormous fan-shaped accumulations called submarine fans on both the continental slope and continental rise. Alluvial or sedimentary fans are shallow cone-shaped reliefs at the base of the continental slope that merge together, forming the continental rise. Erosional submarine canyons slope downward and lead to alluvial fan valleys with increasing depth. It is in this zone that sediment is deposited, forming the continental rise. Alluvial fans such as the Bengal Fan, which stretches , make up one of the largest sedimentary structures in the world. Many alluvial fans also contain critical oil and natural gas reservoirs, making them key points for the collection of seismic data. Abyssal Plain Beyond the continental rise stretches the abyssal plain, which lies on top of basaltic oceanic crust and spa
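The gradient ratios above translate into very small angles; a quick sketch of the conversion, using standard rise-over-run trigonometry (the angle values are computed here, not taken from the text):

```python
import math

def grade_angle_deg(run):
    """Angle in degrees of a gradient of 1 : run (one unit of drop per `run` units of distance)."""
    return math.degrees(math.atan(1 / run))

for run in (50, 500, 1000):
    print(f"1:{run} -> {grade_angle_deg(run):.3f} deg")
# A 1:50 slope is only about 1.1 degrees; at 1:1000 the incline is under 0.06 degrees.
```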
https://en.wikipedia.org/wiki/Space%20selfie
A space selfie is a selfie (self-portrait photograph typically posted on social media sites) that is taken in outer space. These include selfies taken by astronauts (also known as astronaut selfies), by machines (also known as space robot selfies and rover selfies), and by indirect methods. Astronauts The first known space selfie during an EVA (an earlier shot inside the capsule was taken on Gemini 10 by Michael Collins) was taken by Buzz Aldrin during the Gemini 12 mission. The extra-vehicular activity (EVA) equipment used by astronauts during spacewalks contains a specially designed camera for photography in outer space. The main purpose of the EVA camera is to take pictures of subjects related to the missions. There have been many space selfies, some of which use the visor of another astronaut's helmet as a mirror. Early space selfies taken without assistance from another astronaut, after the word "selfie" was first used in 2002, include those of Donald Pettit and Stephen Robinson. Pettit took one during Expedition 6 in January 2003. Robinson took his during the repair of the Space Shuttle Discovery on August 3, 2005, as part of the STS-114 mission. Another notable space selfie was taken by Japanese astronaut Akihiko Hoshide during a six-hour, 28-minute spacewalk on September 5, 2012. Hoshide's photo became a viral phenomenon after Commander Chris Hadfield uploaded it to his Twitter account on September 30, 2013. Coincidentally, Oxford University Press, the publisher of the Oxford English Dictionary, announced in November 2013 that "selfie" was the word of the year for 2013. The picture topped many selfie lists of the year. Another space selfie of Hoshide also showed up on Instagram and appeared on a list of top selfies of 2013. Machines Space selfies can be dated back to 1976, when the lander of the Viking 2 mission took a photo of its deck after landing on Mars; however, these were not considered by Discovery News as true selfies in its list of top 10 spa
https://en.wikipedia.org/wiki/California%20Games
California Games is a 1987 sports video game originally released by Epyx for the Apple II and Commodore 64, and ported to other home computers and video game consoles. Branching from their Summer Games and Winter Games series, this game consists of a collection of outdoor sports purportedly popular in California. The game was successful and spawned a sequel, California Games II. Gameplay The events available vary slightly depending on the platform, but include all of the following: Half-pipe Footbag Surfing (starring Rippin' Rick) Roller skating BMX Flying disc Development Several members of the development team moved on to other projects. Chuck Sommerville, the designer of the half-pipe game in California Games, later developed the game Chip's Challenge, while Ken Nicholson, the designer of the footbag game, was the inventor of the technology used in Microsoft's DirectX. Kevin Norman, the designer of the BMX game, went on to found the educational science software company Norman & Globus, makers of the ElectroWiz series of products. The sound design for the original version of California Games was done by Chris Grigg, member of the band Negativland. Ports Originally written for the Apple II and Commodore 64, it was eventually ported to Amiga, Apple IIGS, Atari 2600, Atari ST, MS-DOS, Genesis, Amstrad CPC, ZX Spectrum, Nintendo Entertainment System, MSX and Master System. The Atari Lynx version was the pack-in game for the system when it was launched in June 1989. An Atari XE version was planned and contracted out by Atari Corp. to Epyx in 1988 but no code was delivered by the publication deadline. Reception California Games was a commercial blockbuster. With more than 300,000 copies sold in the first nine months, it was the most-successful Epyx game, outselling each of the four previous and two subsequent titles in the company's "Games" series. CEO David Shannon Morse said that it was the first Epyx game to appeal equally to boys and girls during playte
https://en.wikipedia.org/wiki/Hydrodynamica
Hydrodynamica (Latin for Hydrodynamics) is a book published by Daniel Bernoulli in 1738. The title of this book eventually christened the field of fluid mechanics as hydrodynamics. The book deals with fluid mechanics and is organized around the idea of conservation of energy, as received from Christiaan Huygens's formulation of this principle. The book describes the theory of water flowing through a tube and of water flowing from a hole in a container. In doing so, Bernoulli explained the nature of hydrodynamic pressure and discovered the role of loss of vis viva in fluid flow, which would later be known as the Bernoulli principle. The book also discusses hydraulic machines and introduces the notion of work and efficiency of a machine. In the tenth chapter, Bernoulli discussed the first model of the kinetic theory of gases. Assuming that heat increases the velocity of the gas particles, he demonstrated that the pressure of air is proportional to kinetic energy of gas particles, thus making the temperature of gas proportional to this kinetic energy as well.
https://en.wikipedia.org/wiki/Audio%20frequency
An audio frequency or audible frequency (AF) is a periodic vibration whose frequency is audible to the average human. The SI unit of frequency is the hertz (Hz). Frequency is the property of sound that most determines pitch. The generally accepted standard hearing range for humans is 20 to 20,000 Hz. In air at atmospheric pressure, these represent sound waves with wavelengths of to . Frequencies below 20 Hz are generally felt rather than heard, assuming the amplitude of the vibration is great enough. Sound frequencies above 20 kHz are called ultrasonic. Sound propagates as mechanical vibration waves of pressure and displacement, in air or other substances. In general, the frequency components of a sound determine its "color", its timbre. Speaking of the frequency (in the singular) of a sound refers to the property that most determines its pitch. Higher pitches have higher frequency, and lower pitches have lower frequency. The frequencies an ear can hear are limited to a specific range. The audible frequency range for humans is typically given as being between about 20 Hz and 20,000 Hz (20 kHz), though the upper limit usually decreases with age. Other species have different hearing ranges. For example, some dog breeds can perceive vibrations up to 60,000 Hz. In many media, such as air, the speed of sound is approximately independent of frequency, so the wavelength of the sound waves (the distance between repetitions) is approximately inversely proportional to frequency. Frequencies and descriptions See also Absolute threshold of hearing Hypersonic effect, controversial claim for human perception above 20,000 Hz Loudspeaker Musical acoustics Piano key frequencies Scientific pitch notation Whistle register
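The wavelength bounds follow directly from wavelength = speed / frequency. The sketch below assumes a speed of sound of about 343 m/s (air at roughly 20 °C; an assumed figure, not stated in the text):

```python
SPEED_OF_SOUND_M_S = 343.0  # assumption: air at ~20 C

def wavelength_m(frequency_hz):
    """Wavelength in metres; valid where the speed of sound is frequency-independent."""
    return SPEED_OF_SOUND_M_S / frequency_hz

print(wavelength_m(20))      # 17.15   -> roughly 17 m at the low end of hearing
print(wavelength_m(20_000))  # 0.01715 -> roughly 17 mm at the high end
```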
https://en.wikipedia.org/wiki/Extracellular%20RNA
Extracellular RNA (exRNA) describes RNA species present outside of the cells in which they were transcribed. Carried within extracellular vesicles, lipoproteins, and protein complexes, exRNAs are protected from ubiquitous RNA-degrading enzymes. exRNAs may be found in the environment or, in multicellular organisms, within the tissues or biological fluids such as venous blood, saliva, breast milk, urine, semen, menstrual blood, and vaginal fluid. Although their biological function is not fully understood, exRNAs have been proposed to play a role in a variety of biological processes including syntrophy, intercellular communication, and cell regulation. The United States National Institutes of Health (NIH) published in 2012 a set of Requests for Applications (RFAs) for investigating extracellular RNA biology. Funded by the NIH Common Fund, the resulting program was collectively known as the Extracellular RNA Communication Consortium (ERCC). The ERCC was renewed for a second phase in 2019. Background Both prokaryotic and eukaryotic cells are known to release RNA, and this release can be passive or active. The Endosomal Sorting Complex Required for Transport (ESCRT) machinery was previously considered as a possible mechanism for RNA secretion from the cell, but more recently research studying microRNA secretion in human embryonic kidney cells and Cercopithecus aethiops kidney cells identified neutral sphingomyelinase 2 (nSMase2), an enzyme involved in ceramide biosynthesis, as a regulator of microRNA secretion levels. ExRNAs are often found packaged within vesicles such as exosomes, ectosomes, prostasomes, microvesicles, and apoptotic bodies. Although RNAs can be excreted from the cell without an enveloping container, ribonucleases present in extracellular environments would eventually degrade the molecule. Types Extracellular RNA should not be viewed as a category describing a set of RNAs with a specific biological function or belonging to a particular RNA family. S
https://en.wikipedia.org/wiki/Emotive%20Internet
Emotive Internet (also Emotive Web, Emotional Internet) is a conceptualization of the Internet as an emergent emotional public space, such as how it serves as a space for the social sharing of emotions. It can also denote the quality of the Internet that allows it to be used to communicate in an emotive fashion or with emotional intent. Since it is an expressive medium, it also enables users to construct and represent their identities online. This is evident in the way emotional responses have been integrated into online communication and interactions. The concept is also linked to emotional analytics and emotion-sensing applications, particularly the technologies that power the Internet of Things (IoT): smart home devices that have the capability to store and process the user's emotional profile to deliver services. Concept It is recognized that the Internet has the capability to allow its users a genuine display of emotions. This is demonstrated by video conferencing platforms, which represent the closest form of synchronous computer-mediated communication (CMC) to face-to-face communication because they tend to reproduce, on a technological level, relational and communicative experiences that feature complex sensory channels. Video conferencing transmits a considerable amount of information, so that it sustains interpersonal relationships and exploits shared context that allows for mutual understanding. This phenomenon is also no longer confined to communication between users, but extends to communication between users and smart devices or users and applications. An application of the emotive quality of the Internet involves emerging technologies that fall within the affective computing field. These include those that use sensor technologies and computer algorithms to enable smart machines to detect, recognize, and share human emotions. The display of emotions on the Internet can be observed in platforms where users interact, such as online forums. A Polish study, which analyzed thousa
https://en.wikipedia.org/wiki/HP%20Cloud
HP Cloud was a set of cloud computing services available from Hewlett-Packard that offered public cloud, private cloud, hybrid cloud, managed private cloud and other cloud services. It was the combination of the previous HP Converged Cloud business unit and HP Cloud Services, an OpenStack-based public cloud. It was marketed to enterprise organizations to combine public cloud services with internal IT resources to create hybrid clouds, or a mix of private and public cloud environments, from around 2011 until 2016. History HP Converged Cloud was announced in April 2012. HP Converged Cloud was managed by a Hewlett-Packard business unit established in 2013, the Converged Cloud unit, headed by Saar Gillai as Senior Vice President and General Manager of Converged Cloud. HP Public Cloud was announced on March 14, 2011, and launched as a public beta on May 10, 2012. HP Fellow and MySQL author Brian Aker announced the Relational Database Service on stage at the 2012 MySQL User's Conference. The HP Public Cloud beta that went live in May 2012 included OpenStack technology-based storage and content delivery network (CDN) components. HP Cloud Object Storage and HP Cloud CDN were moved into general availability on August 1, 2012. HP Cloud DNS as a Service was moved into general availability on July 1, 2013. The two business units were merged in late 2013, and the merger was announced at the HP Discover event in Barcelona, Spain. On May 7, 2014, HP announced the HP Helion portfolio of products and services, and stated that the company planned to invest over $1 billion in cloud computing. HP Helion included HP's existing cloud products, new OpenStack technology-based services, and both professional and support services to assist businesses in building and managing hybrid IT environments. On October 21, 2015, HP announced that it would shut down the HP Cloud in January 2016. In October 2023, Amaryllo resumed the HP Cloud business with the introduction of an HP cloud storage service through a trademark
https://en.wikipedia.org/wiki/Hamiltonian%20spite
Within the field of social evolution, Hamiltonian spite is a term for spiteful behaviors occurring among conspecifics that have a cost for the actor and a negative impact upon the recipient. Theories on altruism and spitefulness W. D. Hamilton published an influential paper on altruism in 1964 to explain why genetic kin tend to help each other. He argued that genetically related individuals are likely to carry copies of the same alleles; thus, helping kin may ensure that copies of the actor's alleles are passed on to the next generation through both the recipient and the actor. While this became a widely accepted idea, it is less well known that Hamilton published a later paper that modified this view. This paper argues that by measuring the genetic relatedness between any two (randomly chosen) individuals of a population several times, we can identify an average level of relatedness. Theoretical models predict that (1) it is adaptive for an individual to be altruistic toward any other individual that is more closely related to it than this average level, and also that (2) it is adaptive for an individual to be spiteful against any other individual that is less closely related to it than this average level. The indirect adaptive benefits of such acts can surpass certain costs of the act (either helpful or harmful) itself. Hamilton mentioned birds and fishes exhibiting infanticide (more specifically, ovicide) as examples of such behaviors. Briefly, an individual can increase the chance of its alleles being passed to the next generations either by helping those that are more closely related to it than average, or by harming those that are less closely related than would be expected by chance. Doubts about the adaptive nature of spiteful behavior Though altruism and spitefulness appear to be two sides of the same coin, the latter is less accepted among evolutionary biologists. First, unlike the case with the beneficiary of an altruistic act, targets of aggression are likely to act in revenge
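Predictions (1) and (2) above amount to a simple threshold rule on relatedness measured against the population average. The following Python fragment is only a schematic restatement of that rule for illustration; the function name and the numbers are hypothetical, not taken from Hamilton's papers:

```python
# Schematic version of the threshold rule described in the text:
# altruism is favoured toward individuals more related than the
# population average; spite toward those less related than average.
def act_is_adaptive(relatedness, avg_relatedness, act):
    rel = relatedness - avg_relatedness  # relatedness relative to the average
    if act == "altruism":
        return rel > 0   # help those more closely related than average
    if act == "spite":
        return rel < 0   # harm those less closely related than average
    raise ValueError(f"unknown act: {act}")

# Illustrative numbers: a full sibling (r = 0.5) in a loosely related
# group (average r = 0.1), versus an unrelated individual (r = 0.0).
assert act_is_adaptive(0.5, 0.1, "altruism")
assert act_is_adaptive(0.0, 0.1, "spite")
```

The interesting part of Hamilton's later argument is exactly this sign flip: the same inclusive-fitness bookkeeping that favours helping above-average relatives favours harming below-average ones.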
https://en.wikipedia.org/wiki/Glycoside%20hydrolase%20family%2072
In molecular biology, glycoside hydrolase family 72 is a family of glycoside hydrolases. Glycoside hydrolases are a widespread group of enzymes that hydrolyse the glycosidic bond between two or more carbohydrates, or between a carbohydrate and a non-carbohydrate moiety. A classification system for glycoside hydrolases, based on sequence similarity, has led to the definition of >100 different families. This classification is available on the CAZy web site, and also discussed at CAZypedia, an online encyclopedia of carbohydrate active enzymes. This family includes yeast glycolipid proteins anchored to the membrane. It includes Candida albicans pH-regulated protein, which is required for apical growth and plays a role in morphogenesis, and Saccharomyces cerevisiae glycolipid anchored surface protein.
https://en.wikipedia.org/wiki/Protein%20inhibitor%20of%20activated%20STAT
Protein inhibitor of activated STAT (PIAS), also known as E3 SUMO-protein ligase PIAS, is a protein that regulates transcription in mammals. PIAS proteins act as transcriptional co-regulators with at least 60 different proteins in order to either activate or repress transcription. The transcription factors STAT, NF-κB, p73, and p53 are among the many proteins that PIAS interacts with. The seven proteins that belong to the mammalian PIAS family are encoded by four genes: PIAS1, PIAS2 (PIASx), PIAS3, and PIAS4 (PIASy). Apart from PIAS1, each gene encodes two protein isoforms. Homologues of PIAS proteins have been found in other eukaryotes, including Zimp/dPIAS in Drosophila melanogaster and zfPIAS4a in zebrafish. SIZ1 and SIZ2 were two homologues identified in yeast. PIAS proteins contain each conserved domain and motif of the PIAS protein family, with a few exceptions. The known functions of these domains and motifs are similar among all PIAS protein family members. These functions include acting as E3 SUMO-protein ligases during SUMOylation, which is an important process in transcriptional regulation. Presently, less is known about the higher order structure of PIAS proteins. The three-dimensional protein structures of PIAS2, PIAS3, and SIZ1 have only recently been solved. PIAS proteins have potential applications in cancer treatment and prevention. They may also play an important role in regulating immune system responses. Discovery The discovery of PIAS3 was first published in 1997. The discovery was made while the JAK-STAT pathway was being studied. The discovery of other PIAS proteins, including PIAS1, PIASxα, PIASxβ, and PIASy, was published the following year. The interaction between STATs and PIASs was characterized by the yeast two-hybrid assay. PIAS proteins were named based on their ability to inhibit STAT. For example, PIAS1 inhibited STAT1, and PIAS3 inhibited STAT3. When it was discovered that PIAS proteins did far more than simply inhibi
https://en.wikipedia.org/wiki/NeXTdimension
The NeXTdimension (ND) is an accelerated 32-bit color board manufactured and sold by NeXT from 1991 that gave the NeXTcube color capabilities with PostScript planned. The NeXTBus (NuBus-like) card was a full size card for the NeXTcube, filling one of four slots, another one being filled with the main board itself. The NeXTdimension featured S-Video input and output, RGB output, an Intel i860 64-bit RISC processor at 33 MHz for Postscript acceleration, 8 MB main memory (expandable to 64 MB via eight 72-pin SIMM slots) and 4 MB VRAM for a resolution of 1120x832 at 24-bit color plus 8-bit alpha channel. An onboard C-Cube CL550 chip for MJPEG video compression was announced, but never shipped. A handful of engineering prototypes for the MJPEG daughterboard exist. A stripped down Mach kernel was used as the operating system for the card. Due to the supporting processor, 32-bit color on the NeXTdimension was faster than 2-bit greyscale Display PostScript on the NeXTcube. Display PostScript never actually ran on the board so the Intel i860 never did much more than move blocks of color data around. The Motorola 68040 did the crunching and the board, while fast for its time, never lived up to the hype. Since the main board always included the greyscale video logic, each NeXTdimension allowed the simultaneous use of an additional monitor. List price for a NeXTdimension sold as an add-on to the NeXTcube was , and for the MegaPixel Color Display. See also NeXT character set NeXTcube
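A quick back-of-the-envelope check (in Python, purely for illustration) shows why the stated 4 MB of VRAM suffices for the 1120x832 mode at 32 bits per pixel (24-bit color plus the 8-bit alpha channel):

```python
# Framebuffer arithmetic for the NeXTdimension's 1120x832, 32-bit mode.
width, height, bytes_per_pixel = 1120, 832, 4  # 24-bit color + 8-bit alpha
framebuffer = width * height * bytes_per_pixel
assert framebuffer == 3_727_360            # about 3.6 MiB per frame
assert framebuffer <= 4 * 1024 * 1024      # fits within the 4 MB of VRAM
```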
https://en.wikipedia.org/wiki/Octuple-precision%20floating-point%20format
In computing, octuple precision is a binary floating-point-based computer number format that occupies 32 bytes (256 bits) in computer memory. This 256-bit octuple precision is for applications requiring results in higher than quadruple precision. This format is rarely (if ever) used and very few environments support it. IEEE 754 octuple-precision binary floating-point format: binary256 In its 2008 revision, the IEEE 754 standard specifies a binary256 format among the interchange formats (it is not a basic format), as having: Sign bit: 1 bit Exponent width: 19 bits Significand precision: 237 bits (236 explicitly stored) The format is written with an implicit lead bit with value 1 unless the exponent is all zeros. Thus only 236 bits of the significand appear in the memory format, but the total precision is 237 bits (approximately 71 decimal digits). The bits are laid out as follows: Exponent encoding The octuple-precision binary floating-point exponent is encoded using an offset binary representation, with the zero offset being 262143; also known as the exponent bias in the IEEE 754 standard. Emin = −262142 Emax = 262143 Exponent bias = 3FFFF₁₆ = 262143 Thus, as defined by the offset binary representation, in order to get the true exponent the offset of 262143 has to be subtracted from the stored exponent. The stored exponents 00000₁₆ and 7FFFF₁₆ are interpreted specially. The minimum strictly positive (subnormal) value is and has a precision of only one bit. The minimum positive normal value is 2^−262142 ≈ 2.4824 × 10^−78913. The maximum representable value is 2^262144 − 2^261907 ≈ 1.6113 × 10^78913. Octuple-precision examples These examples are given in bit representation, in hexadecimal, of the floating-point value. This includes the sign, (biased) exponent, and significand. 0000 0000 0000 0000 0000 0000 0000 0000 0000 0000 0000 0000 0000 0000 0000 0000₁₆ = +0 8000 0000 0000 0000 0000 0000 0000 0000 0000 0000 0000 0000 0000 0000 0000 0000₁₆ = −0
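The field widths and bias above can be checked directly. The following Python sketch is an illustration, not a library implementation (Python has no native binary256 type); it decodes a 256-bit pattern into an exact value using only the parameters given in the text:

```python
# Decode an IEEE 754 binary256 (octuple-precision) bit pattern:
# 1 sign bit, 19 exponent bits, 236 explicitly stored significand bits.
from fractions import Fraction

EXP_BITS, FRAC_BITS = 19, 236
BIAS = (1 << (EXP_BITS - 1)) - 1  # 262143, the exponent bias from the text

def decode_binary256(bits):
    """Return the exact value of a 256-bit pattern (given as an int)."""
    frac = bits & ((1 << FRAC_BITS) - 1)
    exp = (bits >> FRAC_BITS) & ((1 << EXP_BITS) - 1)
    sign = -1 if (bits >> (FRAC_BITS + EXP_BITS)) & 1 else 1
    if exp == (1 << EXP_BITS) - 1:             # 7FFFF: infinity or NaN
        return float("nan") if frac else sign * float("inf")
    if exp == 0:                                # 00000: zero/subnormal, no implicit 1
        return sign * frac * Fraction(2) ** (1 - BIAS - FRAC_BITS)
    significand = (1 << FRAC_BITS) | frac       # implicit lead bit: 237 bits total
    return sign * significand * Fraction(2) ** (exp - BIAS - FRAC_BITS)

assert BIAS == 262143
assert decode_binary256(0) == 0                 # the all-zero pattern is +0
assert decode_binary256(BIAS << FRAC_BITS) == 1 # exponent field == bias encodes 1.0
# The smallest subnormal has a single significand bit: 2**(Emin - 236).
assert decode_binary256(1) == Fraction(2) ** -262378
```

Working in exact `Fraction` arithmetic sidesteps the fact that the host's native floats cannot represent these values.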
https://en.wikipedia.org/wiki/Ocean%20Biodiversity%20Information%20System
The Ocean Biodiversity Information System (OBIS), formerly Ocean Biogeographic Information System, is a web-based access point to information about the distribution and abundance of living species in the ocean. It was developed as the information management component of the ten-year Census of Marine Life (CoML) (2001-2010), but is not limited to CoML-derived data, and aims to provide an integrated view of all marine biodiversity data that may be made available to it on an open access basis by respective data custodians. According to its web site as of July 2018, OBIS "is a global open-access data and information clearing-house on marine biodiversity for science, conservation and sustainable development." Eight specific objectives are listed on the OBIS site, of which the leading item is to "Provide [the] world's largest scientific knowledge base on the diversity, distribution and abundance of all marine organisms in an integrated and standardized format". History and current status Initial ideas for OBIS were developed at a CoML meeting on benthic (bottom-dwelling) ocean life in October 1997. Recommendations from this workshop led to a web site (http://marine.rutgers.edu/OBIS) at Rutgers in 1998 to demonstrate the initial OBIS concept. An inaugural OBIS International Workshop was held on November 3–4, 1999 in Washington, DC, which led to scoping of the project and outreach to potential partners, with selected contributions published in a special issue of Oceanography magazine, within which OBIS founder Dr. J. F. Grassle articulated the vision of OBIS as "an on-line, worldwide atlas for accessing, modeling and mapping marine biological data in a multidimensional geographic context." In May 2000, US Government Agencies in the National Oceanographic Partnership Program together with the Alfred P. Sloan Foundation funded eight research projects to initiate OBIS. In May 2001, the US National Science Foundation funded Rutgers University to develop a global portal for OBIS. A
https://en.wikipedia.org/wiki/List%20of%20instruments%20used%20in%20forensics
Instruments used in forensics, including those for autopsy dissection, are as follows: Instrument list Serological, chemical, and genetic testing is done by specialists in the respective branches.
https://en.wikipedia.org/wiki/Trimethylsilylpropanoic%20acid
Trimethylsilylpropanoic acid (TMSP or TSP) is a chemical compound containing a trimethylsilyl group. It is used as internal reference in nuclear magnetic resonance for aqueous solvents (e.g. D2O). For that use it is often deuterated (3-(trimethylsilyl)-2,2,3,3-tetradeuteropropionic acid or TMSP-d4). Other internal references that are frequently used in NMR experiments are DSS and tetramethylsilane.
https://en.wikipedia.org/wiki/Soliton%20model%20in%20neuroscience
The soliton hypothesis in neuroscience is a model that claims to explain how action potentials are initiated and conducted along axons based on a thermodynamic theory of nerve pulse propagation. It proposes that the signals travel along the cell's membrane in the form of certain kinds of solitary sound (or density) pulses that can be modeled as solitons. The model is proposed as an alternative to the Hodgkin–Huxley model, in which action potentials arise as voltage-gated ion channels in the membrane open and allow sodium ions to enter the cell (an inward current). The resulting depolarization opens nearby voltage-gated sodium channels, thus propagating the action potential. The transmembrane potential is restored by delayed opening of potassium channels. Proponents of the soliton hypothesis assert that energy is mainly conserved during propagation, apart from dissipation losses, and that measured temperature changes are inconsistent with the Hodgkin–Huxley model. The soliton model (and sound waves in general) depends on adiabatic propagation in which the energy provided at the source of excitation is carried adiabatically through the medium, i.e., the plasma membrane. The measurement of a temperature pulse and the claimed absence of heat release during an action potential were the basis of the proposal that nerve impulses are an adiabatic phenomenon much like sound waves. Synaptically evoked action potentials in the electric organ of the electric eel are associated with substantial positive (only) heat production followed by active cooling to ambient temperature. In the garfish olfactory nerve, the action potential is associated with a biphasic temperature change; however, there is a net production of heat. These published results are inconsistent with the soliton model, and the authors interpret their work in terms of the Hodgkin–Huxley model: The initial sodium current releases heat as the membrane capacitance is discharged; heat is absorbed during recharge of the membrane capacit
https://en.wikipedia.org/wiki/Electron%20quadruplets
Electron quadruplets are a possible phenomenon in an exotic state of matter in which Cooper pairs do not exhibit long-range order, but electron quadruplets do. This "quartic metal" phase is related to but distinct from those superconductors explained by the standard BCS theory; rather than expelling magnetic field lines as in the Meissner effect, it generates them, a spontaneous Nernst effect that indicates the breaking of time-reversal symmetry. After the theoretical possibility was raised, observations consistent with electron quadrupling were published using hole-doped Ba1-xKxFe2As2 in 2021. See also List of states of matter
https://en.wikipedia.org/wiki/Resurrection%20stone
A resurrection stone is a stone of immense weight which was hired out to prevent newly buried corpses from being stolen. List of resurrection stones in the United Kingdom St Laurence Church, Lurgashall Llantrisant Dean Row Chapel, Wilmslow
https://en.wikipedia.org/wiki/John%20Alexander%20Smith
John Alexander Smith (21 April 1863 – 19 December 1939) was a British idealist philosopher, who was the Jowett Lecturer of philosophy at Balliol College, Oxford from 1896 to 1910, and Waynflete Professor of Moral and Metaphysical Philosophy, carrying a Fellowship at Magdalen College in the same university, from 1910 to 1936. He was born in Dingwall and died in Oxford. Life and work Smith was educated at Inverness Academy, the Edinburgh Collegiate School, Edinburgh University (where he was Ferguson classical scholar in 1884), and at Balliol College, Oxford, to which he was admitted as Warner exhibitioner and honorary scholar in Hilary term 1884. His most visible accomplishments were his work with William David Ross on a 12-volume translation of Aristotle, and his Gifford Lectures for 1929–1931 on the Heritage of Idealism, which were never published. The 'Moral' tag in his Professorial title disappeared with R. G. Collingwood's appointment in 1936. Smith expressed some unease about the combination of 'moral' and 'metaphysical' in his inaugural lecture Knowing and Acting: The framer of the Chair's regulations, he remarks, describes the Professor's duties 'in a way which rather sets a problem than furnishes guidance. The Professor, he says, 'shall lecture and give instruction on the principles and history of Mental Philosophy, and on its connexion with Ethics.' He distinguishes two great departments of philosophical thought — so recognizedly different as already to be assigned for separate treatment to two other Professors in the University — and he enjoins that they shall be afresh discussed in their connexion with one another, yet with respect to their distinction. It can scarcely be his meaning that his Professor should attempt the invidious task of harmonising the possibly divergent accounts given of Logic by the Wykeham Professor and of Ethics by Whyte's Professor, of performing in public the higher synthesis of his colleagues' several contributions to philosoph
https://en.wikipedia.org/wiki/Hawaii%20Ocean%20Time-series
The Hawaii Ocean Time-series (HOT) program is a long-term oceanographic study based at the University of Hawaii at Manoa. In 2015, the American Society for Microbiology designated the HOT Program's field site Station ALOHA (A Long-Term Oligotrophic Habitat Assessment; ()) a "Milestone in Microbiology", for playing "a key role in defining the discipline of microbial oceanography and educating the public about the vital role of marine microbes in global ecosystems." Scientists working on the Hawaii Ocean Time-series (HOT) program have been making repeated observations of the hydrography, chemistry and biology of the water column at a station north of Oahu, Hawaii since October 1988. The objective of this research is to provide a comprehensive description of the ocean at a site representative of the North Pacific Subtropical Gyre. Cruises are made approximately once per month to the deep-water Station ALOHA located 100 km north of Oahu, Hawaii. Measurements of the thermohaline structure, water column chemistry, currents, optical properties, primary production, plankton community structure, and rates of particle export are made on each cruise. The HOT program also uses autonomous underwater vehicles, including floats and gliders, to collect data at Station ALOHA between cruises. Overview HOT was founded to understand the processes controlling the fluxes of carbon and associated bioelements in the ocean and to document changes in the physical structure of the water column. To achieve this, the HOT program has several specific goals: 1. Quantify temporal (seasonal to decadal) changes in reservoirs and fluxes of carbon and associated bioelements (nitrogen, oxygen, phosphorus, and silicon). 2. Identify processes controlling air-sea carbon exchange, rates of carbon transformation through the planktonic food web, and fluxes of carbon into the ocean. 3. Form a multi-decadal baseline based on the gathered data that will allow researchers to decipher natural and anthropoge
https://en.wikipedia.org/wiki/Stunted%20projective%20space
In mathematics, a stunted projective space is a construction on a projective space of importance in homotopy theory, introduced by . Part of a conventional projective space is collapsed down to a point. More concretely, in a real projective space, complex projective space or quaternionic projective space KPn, where K stands for the real numbers, complex numbers or quaternions, one can find (in many ways) copies of KPm, where m < n. The corresponding stunted projective space is then KPn,m = KPn/KPm, where the notation implies that the KPm has been identified to a point. This makes a topological space that is no longer a manifold. The importance of this construction was realised when it was shown that real stunted projective spaces arose as Spanier–Whitehead duals of spaces of Ioan James, so-called quasi-projective spaces, constructed from Stiefel manifolds. Their properties were therefore linked to the construction of frame fields on spheres. In this way the vector fields on spheres question was reduced to a question on stunted projective spaces: for RPn,m, is there a degree one mapping on the 'next cell up' (of the first dimension not collapsed in the 'stunting') that extends to the whole space? Frank Adams showed that this could not happen, completing the proof. In later developments spaces KP∞,m and stunted lens spaces have also been used.
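In terms of CW-complexes, the construction is easy to state in the real case: RPn has one cell in each dimension 0 through n, so collapsing the RPm skeleton leaves a basepoint together with one cell in each of the dimensions m + 1 through n:

```latex
\mathbb{RP}^{n,m} \;=\; \mathbb{RP}^{n}/\mathbb{RP}^{m}
\;\simeq\; \{\ast\} \,\cup\, e^{m+1} \cup e^{m+2} \cup \cdots \cup e^{n}
```

The "next cell up" in the question above is thus the (m + 1)-cell, the first dimension not collapsed in the stunting.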
https://en.wikipedia.org/wiki/Indium%20%28111In%29%20capromab%20pendetide
{{DISPLAYTITLE:Indium (111In) capromab pendetide}} Indium (111In) capromab pendetide (trade name Prostascint) is used to image the extent of prostate cancer. Capromab is a mouse monoclonal antibody which recognizes a protein found on both prostate cancer cells and normal prostate tissue. It is linked to pendetide, a derivative of DTPA. Pendetide acts as a chelating agent for the radionuclide indium-111. Following an intravenous injection of Prostascint, imaging is performed using single-photon emission computed tomography (SPECT). Early trials with yttrium (90Y) capromab pendetide were also conducted.
https://en.wikipedia.org/wiki/WiDi
Wireless Display (WiDi) is technology developed by Intel that enables users to stream music, movies, photos, videos and apps without cables from a compatible computer to a compatible HDTV or through the use of an adapter with other HDTVs or computer monitors. Intel WiDi supports HD 1080p video quality, 5.1 surround sound, and low latency for interacting with applications sent to the TV from a PC. Using the Intel WiDi Widget, users can perform different functions simultaneously on their PC and TV such as checking email on the PC while streaming a movie to the TV from the same device. WiDi development was discontinued in 2015 in favor of Miracast, a standard developed by the Wi-Fi Alliance and natively supported by Windows 8.1 and later. In Microsoft's Windows 10 operating system, the built-in Wireless Display is called Project, which can be used to mirror the device's display to a TV if it supports Miracast. Version history 2010 - WiDi 1.0 - Supports 720p 2011 - WiDi 2.0 - Supports 1080p 2012 - WiDi 3.0 - Supports 1080p @ 60 FPS September 2012 - WiDi 3.5 - Supports Windows 8, touch functionality, 1080p output, 3D content, HDCP2, Blu-ray, and USB devices and Miracast. 2013 - WiDi 4.0 2014 - WiDi 4.1 2014 - WiDi 4.2 - 5 GHz Wi-Fi support (with compatible receiver) 2015 - WiDi 5.1 - Supports 4k - Ultra HD displays 2015 - WiDi 6.0 October 2015 - The marketing and development of WiDi applications was discontinued by Intel, who said that this was because the Miracast standard was natively supported in Windows for wireless display. Miracast The Miracast standard is supported in Intel Wireless Display versions 3.5 through 6.0, when it was discontinued. After this development, Intel recommended that business users utilize Intel Unite as a platform for collaboration. Miracast was included in Android 4.2 smart phones through Android 7, and on Windows 8.1 and 10. It can stream on TVs, projectors, and media players. See also AirPlay Chromecast Digital Living Ne
https://en.wikipedia.org/wiki/Reinforcement
In reinforcement theory, it is argued that human behavior is a result of "contingent consequences" to human actions. The publication pushes forward the idea that "you get what you reinforce". This means that behavior, when given the right types of reinforcers, can be changed for the better and negative behavior can be reinforced away. The model of self-regulation has three main aspects of human behavior, which are self-awareness, self-reflection, and self-regulation. Reinforcements traditionally align with self-regulation. The behavior can be influenced by the consequence but behavior also has antecedents. There are four types of behavior management: positive reinforcement, negative reinforcement, positive punishment, negative punishment. In behavioral terms, positive means addition, negative means removal, reinforcement is anything that increases a behavior, and punishment is anything that decreases a behavior. Positive reinforcement is the addition of a stimulus which increases the behavior (like a paycheck). Negative reinforcement is the removal of an aversive stimulus that increases the behavior (like Tylenol removes a headache). Positive punishment is an imposition of an aversive stimulus to decrease a behavior. Negative punishment is the removal of a desired stimulus. Negative punishment commonly occurs as removing a benefit following poor performance. While reinforcement and punishment do not require an individual to consciously perceive an effect elicited by the stimulus, it still requires conscious effort to work towards a desired goal. Extinction involves discontinuing or removing the reinforcer that previously maintained the behavior. If a behavior is no longer contacting reinforcement, it should extinguish. Rewarding stimuli, which are associated with "wanting" and "liking" (desire and pleasure, respectively) and appetitive behavior, function as positive reinforcers; the converse statement is also true: positive reinforcers provide a desirable stim
https://en.wikipedia.org/wiki/Set%20partitioning%20in%20hierarchical%20trees
Set partitioning in hierarchical trees (SPIHT) is an image compression algorithm that exploits the inherent similarities across the subbands in a wavelet decomposition of an image. The algorithm was developed by Brazilian engineer Amir Said with William A. Pearlman in 1996. General description The algorithm codes the most important wavelet transform coefficients first, and transmits the bits so that an increasingly refined copy of the original image can be obtained progressively. See also Embedded Zerotrees of Wavelet transforms (EZW) Wavelet
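The progressive idea can be illustrated with a deliberately simplified Python sketch. This is not SPIHT itself (the real algorithm additionally exploits zerotree relationships across subbands and interleaves sorting and refinement passes); it only shows how transmitting coefficient magnitudes bit plane by bit plane, most significant first, lets a decoder refine its approximation from any prefix of the stream:

```python
# Simplified bit-plane transmission: each pass sends one bit per
# coefficient, starting from the most significant bit plane.
def bitplane_encode(coeffs, planes):
    """Yield one pass per bit plane, most significant first."""
    top = max(abs(c) for c in coeffs).bit_length() - 1
    for p in range(top, top - planes, -1):
        yield [(abs(c) >> p) & 1 for c in coeffs]

def bitplane_decode(passes, top, n):
    """Rebuild magnitudes from however many passes were received."""
    mags = [0] * n
    for k, bits in enumerate(passes):
        for i, b in enumerate(bits):
            mags[i] |= b << (top - k)
    return mags

coeffs = [13, 2, 7, 0]               # toy "wavelet coefficients"
top = max(coeffs).bit_length() - 1   # highest bit plane, here 3
stream = list(bitplane_encode(coeffs, planes=2))
approx = bitplane_decode(stream, top, len(coeffs))
# After only two of four planes, each magnitude is known to within 2**top >> 1:
assert approx == [12, 0, 4, 0]
```

Sending further passes refines `approx` toward the exact coefficients, which is the sense in which "an increasingly refined copy of the original image can be obtained progressively"; SPIHT's contribution is ordering and grouping these bits cheaply via the subband trees.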
https://en.wikipedia.org/wiki/Crypton%20%28particle%29
In particle physics, the crypton is a hypothetical superheavy particle, thought to exist in a hidden sector of string theory. It has been proposed as a candidate particle to explain the dark matter content of the universe. Cryptons arising in the hidden sector of a superstring-derived flipped SU(5) GUT model have been shown to be metastable with a lifetime exceeding the age of the universe. Their slow decays may provide a source for the ultra-high-energy cosmic rays (UHECR).
https://en.wikipedia.org/wiki/History%20of%20genetics
The history of genetics dates from the classical era with contributions by Pythagoras, Hippocrates, Aristotle, Epicurus, and others. Modern genetics began with the work of the Augustinian friar Gregor Johann Mendel. His work on pea plants, published in 1866, provided the initial evidence that, on its rediscovery in 1900, helped to establish the theory of Mendelian inheritance. In ancient Greece, Hippocrates suggested that all organs of the body of a parent gave off invisible “seeds,” miniaturised components, that were transmitted during sexual intercourse and combined in the mother's womb to form a baby. In the Early Modern times, William Harvey's book On Animal Generation contradicted Aristotle's theories of genetics and embryology. The 1900 rediscovery of Mendel's work by Hugo de Vries, Carl Correns and Erich von Tschermak led to rapid advances in genetics. By 1915 the basic principles of Mendelian genetics had been studied in a wide variety of organisms — most notably the fruit fly Drosophila melanogaster. Led by Thomas Hunt Morgan and his fellow "drosophilists", geneticists developed the Mendelian model, which was widely accepted by 1925. Alongside experimental work, mathematicians developed the statistical framework of population genetics, bringing genetic explanations into the study of evolution. With the basic patterns of genetic inheritance established, many biologists turned to investigations of the physical nature of the gene. In the 1940s and early 1950s, experiments pointed to DNA as the portion of chromosomes (and perhaps other nucleoproteins) that held genes. A focus on new model organisms such as viruses and bacteria, along with the discovery of the double helical structure of DNA in 1953, marked the transition to the era of molecular genetics. In the following years, chemists developed techniques for sequencing both nucleic acids and proteins, while many others worked out the relationship between these two forms of biological molecules and disc
https://en.wikipedia.org/wiki/Iomega
Iomega (later LenovoEMC) produced external, portable, and networked data storage products. Established in the 1980s in Roy, Utah, United States, Iomega sold more than 410 million digital storage drives and disks, including the Zip drive floppy disk system. Formerly a public company, it was acquired by EMC Corporation in 2008, and then by Lenovo, which rebranded the product line as LenovoEMC, until discontinuation in 2018. History Iomega started in Roy, Utah, U.S. in 1980 and moved its headquarters to San Diego, California in 2001. For many years, it was a significant name in the data storage industry. Iomega's most famous product, the Zip drive, offered relatively large amounts of storage on portable, high-capacity floppy disks. The original Zip disk's 100MB capacity was a huge improvement over the decades-long standard of 1.44MB standard floppy disks. The Zip drive became a common internal and external peripheral for IBM-compatible and Macintosh personal computers. However, Zip drives sometimes failed after a short period, which failure was commonly referred to as the "click of death." This problem, combined with competition from CD-RW drives, caused Zip drive sales to decline dramatically, even after introducing larger 250MB and 750MB versions. Iomega eventually launched a CD-RW drive. Without the revenue from its proprietary storage disks and drives, Iomega's sales and profits declined considerably. Iomega's stock price, which was over $100 at its height in the 1990s, fell to around $2 in the mid-2000s. Trying to find a niche, Iomega released devices such as the HipZip MP3 player, the FotoShow Digital Image Center, and numerous external hard drives, optical drives, and NAS products. None of these products were successful. In 2012, reporter Vincent Verweij of Dutch broadcaster Katholieke Radio Omroep revealed that at least 16,000 Iomega NAS devices were publicly exposing their users' files on the Internet. This was due to Iomega having disabled password securi
https://en.wikipedia.org/wiki/Society%20for%20Neuroscience
The Society for Neuroscience (SfN) is a professional society, headquartered in Washington, D.C., for basic scientists and physicians around the world whose research is focused on the study of the brain and nervous system. It is especially well known for its annual meeting, consistently one of the largest scientific conferences in the world. History SfN was founded in 1969 by Ralph W. Gerard and, at nearly 37,000 members, has grown to be the largest neuroscience society in the world. The stated mission of the society is to: Advance the understanding of the brain and the nervous system. Provide professional development activities, information, and educational resources. Promote public information and general education about science and neuroscience. Inform legislators and other policy makers about the implications of research for public policy, societal benefit, and continued scientific progress. Annual meeting The society holds an annual meeting that is attended by scientists and physicians from all around the world. The first annual meeting of the society was held in Washington, DC in 1971, and it was attended by 1,396 scientists. Subsequent meetings have been held annually in a variety of cities throughout the US, with the exception of the 1988 meeting, which was held in Canada. The 2022 meeting was held in San Diego, California. Publishing The Journal of Neuroscience was launched in 1981 and has consistently been a multidisciplinary journal publishing papers on a broad range of topics of general interest to those working on the nervous system. In addition, SfN publications offer breadth and depth into the rapidly developing field of neuroscience. eNeuro, launched in 2014, is SfN's open-access journal; it publishes high-quality papers in all areas of neuroscience that increase the understanding of the nervous system, including replication studies and negative results. SfN's digital member magazine, Neuroscience Quarterly, covers SfN news, programs, science, an