https://en.wikipedia.org/wiki/Thompson%E2%80%93LaGarde%20Tests | The Thompson–LaGarde Tests were a series of tests conducted in 1904 to determine which caliber should be used in American military handguns.
History
The Army had previously been using the .38 Long Colt, and the cartridge's relatively poor ballistics were highlighted during the Philippine–American War of 1899–1902, when reports from U.S. Army officers were received regarding the .38 bullet's inability to stop charges of frenzied Moro juramentados in the Moro Rebellion, even at extremely close ranges. A typical instance occurred in 1905 and was later recounted by Col. Louis A. LaGarde:
Antonio Caspi, a prisoner on the island of Samar, P.I. attempted escape on Oct. 26, 1905. He was shot four times at close range in a hand-to-hand encounter by a .38 Colt's revolver loaded with U.S. Army regulation ammunition. He was finally stunned by a blow on the forehead from the butt end of a Springfield carbine.
Col. LaGarde noted Caspi's wounds were fairly well-placed: three bullets entered the chest, perforating the lungs. One passed through the body, one lodged near the back and the other lodged in subcutaneous tissue. The fourth round went through the right hand and exited through the forearm. So the Army began looking for a solution. The task was assigned to Colonel John T. Thompson of the Infantry, and Major Louis Anatole LaGarde of the Medical Corps.
Testing
The tests were conducted at the Nelson Morris Company Union Stock Yards in Chicago, Illinois, using live cattle at a local slaughterhouse as well as human cadavers. To consider different combinations of factors, several different calibers were used during the tests: 7.65×21mm Parabellum (.30 Luger), 9×19mm Parabellum (Germany), .38 Long Colt, .38 ACP, blunt and hollow-point .45 Colt (US), .476 Eley (UK), and the "cupped" .455 Webley (UK).
The first day of testing involved eight live cattle; seven were shot through the lungs using different caliber rounds, and the effects recorded. The remaining anim |
https://en.wikipedia.org/wiki/155th%20meridian%20east | The meridian 155° east of Greenwich is a line of longitude that extends from the North Pole across the Arctic Ocean, Asia, the Pacific Ocean, Australasia, the Southern Ocean, and Antarctica to the South Pole.
The 155th meridian east forms a great circle with the 25th meridian west.
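The pairing of opposite meridians can be sketched with a small hypothetical helper (not from the article): two meridians form a great circle when their longitudes sum to 180°, one measured east and the other west.

```python
# Sketch: a meridian at d degrees east completes a great circle with
# the meridian at (180 - d) degrees west, since the two half-circles
# together span the full 360 degrees around the globe.
def paired_meridian_west(degrees_east):
    return 180 - degrees_east

print(paired_meridian_west(155))  # 25, i.e. the 25th meridian west
```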
From Pole to Pole
Starting at the North Pole and heading south to the South Pole, the 155th meridian east passes through:
- Arctic Ocean
- East Siberian Sea
- Russia (Sakha Republic and Magadan Oblast)
- Sea of Okhotsk (Shelikhov Gulf)
- Russia (Magadan Oblast)
- Sea of Okhotsk
- Russia (Sakhalin Oblast: Antsiferov Island, Kuril Islands)
- Pacific Ocean – passing just east of the island of Onekotan (Sakhalin Oblast), just west of the atoll of Oroluk, just east of the atoll of Nukuoro, just east of the atoll of Kapingamarangi, and just east of the islands of Nuguria
- Papua New Guinea (Bougainville Island)
- Solomon Sea
- Coral Sea – Passi |
https://en.wikipedia.org/wiki/Axiom%20of%20infinity | In axiomatic set theory and the branches of mathematics and philosophy that use it, the axiom of infinity is one of the axioms of Zermelo–Fraenkel set theory. It guarantees the existence of at least one infinite set, namely a set containing the natural numbers. It was first published by Ernst Zermelo as part of his set theory in 1908.
Formal statement
In the formal language of the Zermelo–Fraenkel axioms, the axiom reads:
∃I (∅ ∈ I ∧ ∀x (x ∈ I → (x ∪ {x}) ∈ I))
In words, there is a set I (the set that is postulated to be infinite), such that the empty set is in I, and such that whenever any x is a member of I, the set formed by taking the union of x with its singleton {x} is also a member of I. Such a set is sometimes called an inductive set.
Interpretation and consequences
This axiom is closely related to the von Neumann construction of the natural numbers in set theory, in which the successor of x is defined as x ∪ {x}. If x is a set, then it follows from the other axioms of set theory that this successor is also a uniquely defined set. Successors are used to define the usual set-theoretic encoding of the natural numbers. In this encoding, zero is the empty set:
0 = {}.
The number 1 is the successor of 0:
1 = 0 ∪ {0} = {} ∪ {0} = {0} = {{}}.
Likewise, 2 is the successor of 1:
2 = 1 ∪ {1} = {0} ∪ {1} = {0, 1} = { {}, {{}} },
and so on:
3 = {0, 1, 2} = { {}, {{}}, {{}, {{}}} };
4 = {0, 1, 2, 3} = { {}, {{}}, { {}, {{}} }, { {}, {{}}, {{}, {{}}} } }.
A consequence of this definition is that every natural number is equal to the set of all preceding natural numbers. The number of top-level elements in each set equals the natural number it represents, and the nesting depth of the most deeply nested empty set {} (counting the outermost representing set itself) also equals that number.
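The von Neumann encoding above can be sketched directly with Python frozensets (an illustration, not part of the article):

```python
# Sketch of the von Neumann encoding of the naturals using frozensets,
# where the successor of x is x ∪ {x}.
def successor(x):
    return x | frozenset([x])

zero = frozenset()       # 0 = {}
one = successor(zero)    # 1 = {0}
two = successor(one)     # 2 = {0, 1}
three = successor(two)   # 3 = {0, 1, 2}

# each numeral equals the set of all preceding numerals,
# so its top-level element count equals the number it represents
assert len(three) == 3
assert zero in three and one in three and two in three
```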
This construction forms the natural numbers. However, the other axioms are insufficient to prove the existence of the set of all |
https://en.wikipedia.org/wiki/List%20of%20MeSH%20codes%20%28D12.776%29 | The following is a partial list of the "D" codes for Medical Subject Headings (MeSH), as defined by the United States National Library of Medicine (NLM).
This list continues the information at List of MeSH codes (D12.644). Codes following these are found at List of MeSH codes (D13). For other MeSH codes, see List of MeSH codes.
The source for this content is the set of 2006 MeSH Trees from the NLM.
– proteins
– albumins
– c-reactive protein
– conalbumin
– lactalbumin
– ovalbumin
– avidin
– parvalbumins
– ricin
– serum albumin
– methemalbumin
– prealbumin
– serum albumin, bovine
– serum albumin, radio-iodinated
– technetium tc 99m aggregated albumin
– algal proteins
– amphibian proteins
– xenopus proteins
– amyloid
– amyloid beta-protein
– amyloid beta-protein precursor
– serum amyloid a protein
– serum amyloid p-component
– antifreeze proteins
– antifreeze proteins, type i
– antifreeze proteins, type ii
– antifreeze proteins, type iii
– antifreeze proteins, type iv
– apoproteins
– apoenzymes
– apolipoproteins
– apolipoprotein A
– apolipoprotein A1
– apolipoprotein A2
– apolipoprotein B
– apolipoprotein C
– apolipoprotein E
– aprotinin
– archaeal proteins
– bacteriorhodopsins
– dna topoisomerases, type i, archaeal
– halorhodopsins
– periplasmic proteins
– armadillo domain proteins
– beta-catenin
– gamma catenin
– plakophilins
– avian proteins
– bacterial proteins
See List of MeSH codes (D12.776.097).
– blood proteins
See List of MeSH codes (D12.776.124).
– carrier proteins
See List of MeSH codes (D12.776.157).
– cell cycle proteins
– cdc25 phosphatase
– cellular apoptosis susceptibility protein
– cullin proteins
– cyclin-dependent kinase inhibitor proteins
– cyclin-dependent kinase inhibitor p15
– cyclin-dependent kinase inhibitor p16
– cyclin-dependent kinase inhibitor p18
– cyclin-dependent kinase inhibitor p19
– cyclin-dependent kinase inhibitor p21
– cyclin-dependent kinase inhibitor p27
– cyclin-de |
https://en.wikipedia.org/wiki/List%20of%20unsolved%20problems%20in%20neuroscience | Many problems in neuroscience remain unsolved, although some have evidence supporting a hypothesized solution, and the field is rapidly evolving. One major problem is even deciding what belongs on a list such as this. The problems include:
Consciousness
Consciousness:
How can consciousness be defined?
What is the neural basis of subjective experience, cognition, wakefulness, alertness, arousal, and attention?
Quantum mind: Do quantum mechanical phenomena, such as entanglement and superposition, play an important part in the brain's function, and can they explain critical aspects of consciousness?
Is there a "hard problem of consciousness"?
If so, how is it solved?
What, if any, is the function of consciousness?
What is the nature and mechanism behind near-death experiences?
How can death be defined? Can consciousness exist after death?
If consciousness is generated by brain activity, then how do some patients with physically deteriorated brains suddenly gain a brief moment of restored consciousness prior to death, a phenomenon known as terminal lucidity?
Problem of representation: How exactly does the mind function (or how does the brain interpret and represent information about the world)?
Bayesian mind: Does the mind make sense of the world by constantly trying to make predictions according to the rules of Bayesian probability?
Computational theory of mind: Is the mind a symbol manipulation system, operating on a model of computation, similar to a computer?
Connectionism: Can the mind be explained by mathematical models known as artificial neural networks?
Embodied cognition: Is the cognition of an organism affected by the organism's entire body (rather than just simply its brain), including its interactions with the environment?
Extended mind thesis: Does the mind not only exist in the brain, but also functions in the outside world by using physical objects as mental processes? Or just as prosthetic limbs can becom |
https://en.wikipedia.org/wiki/Submitochondrial%20particle | A submitochondrial particle (SMP) is an artificial vesicle made from the inner mitochondrial membrane. Such particles can be formed by subjecting isolated mitochondria to sonication, freezing and thawing, high pressure, or osmotic shock. SMPs can be used to study the electron transport chain in a cell-free context.
The process of SMP formation forces the inner mitochondrial membrane inside out, meaning that the matrix-facing leaflet becomes the outer surface of the SMP, and the intermembrane space-facing leaflet faces the lumen of the SMP. As a consequence, the F1 particles which normally face the matrix are exposed. Chaotropic agents can destabilize F1 particles and cause them to dissociate from the membrane, thereby uncoupling the final step of oxidative phosphorylation from the rest of the electron transport chain. |
https://en.wikipedia.org/wiki/Pisohamate%20ligament | The pisohamate ligament is a ligament in the hand. It connects the pisiform to the hook of the hamate. It is a prolongation of the tendon of the flexor carpi ulnaris.
It serves as part of the origin for the abductor digiti minimi. It also forms the floor of the ulnar canal, through which the ulnar nerve and ulnar artery pass into the hand. |
https://en.wikipedia.org/wiki/ELF1 | E74-like factor 1 (ets domain transcription factor) is a protein that in humans is encoded by the ELF1 gene.
Function
This gene encodes an E26 transformation-specific related transcription factor. The encoded protein is primarily expressed in lymphoid cells and can act as both an enhancer and a repressor to regulate transcription of various genes. Alternative splicing results in multiple transcript variants. |
https://en.wikipedia.org/wiki/Sulcus%20spiralis%20externus | The basilar crest gives attachment to the outer edge of the basilar membrane; immediately above the crest is a concavity, the sulcus spiralis externus. |
https://en.wikipedia.org/wiki/Uno%20%28video%20game%29 | Uno is a video game based on the card game of the same name. It has been released for a number of platforms. The Xbox 360 version by Carbonated Games and Microsoft Game Studios was released on May 9, 2006, as a digital download via Xbox Live Arcade. A version for iPhone OS and iPod devices was released in 2008 by Gameloft. Gameloft released the PlayStation 3 version on October 1, 2009, and also released a version for WiiWare, Nintendo DSi via DSiWare, and PlayStation Portable. An updated version developed by Ubisoft Chengdu and published by Ubisoft was released for the PlayStation 4 and Xbox One in August 2016, Microsoft Windows in December 2016 and for the Nintendo Switch in November 2017.
Uno's original version was well received by critics. A sequel to the game's original version, Uno Rush, was announced at E3 2008 and released in 2009.
Gameplay
Uno is a video game adaptation of the card game of the same name. For the official rules, see the rules of the physical version.
Differences between versions
Xbox 360 version
The Xbox 360 version of the game offers three different game modes including Standard Uno, Partner Uno, and House Rules Uno. In Partner Uno, players sitting across from each other join forces to form a team, so that a win by either player is a win for the team. In House Rules Uno, the rules can be tweaked and customized to the player's preference.
The Xbox 360 version of Uno offers multiplayer for up to four players through Xbox Live. Players can join or drop-out of in-progress games at any time, with computer players automatically taking over for any missing humans. The game supports the Xbox Live Vision camera, allowing opponents to view an image of the player (or whatever the camera is pointed at) while playing the game.
Theme decks
The Xbox 360 version of Uno supports downloadable content through the Xbox Live Marketplace. This content takes the form of custom theme decks, which feature new visual appearances, sound effec |
https://en.wikipedia.org/wiki/Sherman%20trap | The Sherman trap is a box-style animal trap designed for the live capture of small mammals. It was invented by Dr. H. B. Sherman in the 1920s and became commercially available in 1955. Since that time, the Sherman trap has been used extensively by researchers in the biological sciences for capturing animals such as mice, voles, shrews, and chipmunks. The Sherman trap consists of eight hinged pieces of sheet metal (either galvanized steel or aluminum) that allow the trap to be collapsed for storage or transport. Sherman traps are often set in grids and may be baited with grains and seed.
Description
The hinged design allows the trap to fold flat, to the width of a single side panel. This makes it compact for storage and easy to transport to field locations (e.g. in a backpack). Both ends are hinged, but in normal operation the rear end is closed and the front folds inwards, latching the treadle (trigger plate) in place. When an animal enters far enough to be clear of the front door, its weight releases the latch and the door closes behind it.
The lure or bait is placed at the far end and can be dropped in place through the rear hinged door.
Variants
Later, other variants that built upon the basic design appeared, such as the Elliott trap used in Europe and Australasia. The Elliott trap simplifies the design slightly and is made from just seven hinged panels. |
https://en.wikipedia.org/wiki/Nikitsky%20Botanical%20Garden | Nikita Botanical Garden is one of the oldest botanical gardens in Europe. It is located in Crimea, close to Yalta, by the shores of the Black Sea. It was founded in 1812 and named after the settlement of Nikita in Crimea, Russian Empire.
Its founder and first director was Russian botanist Christian von Steven of Swedish descent. The garden and its collections were greatly expanded by the Livonian Nicolai Anders von Hartwiss, director 1827–1860.
The total area is 11 square kilometres. It is a scientific research centre, a producer of saplings and seeds, and a tourist attraction. The garden opened a representative office in Damascus in 2022.
The garden was part of the Ukrainian Academy of Agrarian Sciences. It has subsidiaries in Crimea and Kherson Oblast. Its collection counts over 50,000 species, cultivars and hybrids. Its scientific work consists of the study of natural flora, collection of the gene pool, and the selection and introduction of new agricultural plants for southern Ukraine, Russia, and other countries.
Gallery |
https://en.wikipedia.org/wiki/54%20%28number%29 | 54 (fifty-four) is the natural number following 53 and preceding 55.
In mathematics
54 is an abundant number and a semiperfect number, like every multiple of 6 greater than 6.
It is twice the third power of three, 3³ + 3³ = 54, and hence is a Leyland number.
54 is the smallest number that can be written as the sum of three positive squares in more than two different ways: 1² + 2² + 7² = 2² + 5² + 5² = 3² + 3² + 6² = 54.
It is a 19-gonal number.
In base 10, 54 is a Harshad number.
The Holt graph has 54 edges.
The sine of an angle of 54 degrees is half the golden ratio.
54 is the number of primes less than or equal to 2⁸ = 256.
A Lehmer-Comtet number.
54 is the only non-trivial "neon number" for the ninth power: 54⁹ = 3,904,305,912,313,344, and 3 + 9 + 0 + 4 + 3 + 0 + 5 + 9 + 1 + 2 + 3 + 1 + 3 + 3 + 4 + 4 = 54.
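The arithmetic claims above can be verified directly; the following sketch (illustrative, not part of the article) checks several of them:

```python
import math
from itertools import combinations_with_replacement

# 54 = 2 * 3^3, and 3^3 + 3^3 = 54 (the Leyland form x^y + y^x)
assert 2 * 3**3 == 54 == 3**3 + 3**3

# exactly three representations as a sum of three positive squares
ways = [t for t in combinations_with_replacement(range(1, 8), 3)
        if sum(n * n for n in t) == 54]
assert ways == [(1, 2, 7), (2, 5, 5), (3, 3, 6)]

# 54 primes are <= 2**8 = 256 (naive trial division)
primes = [n for n in range(2, 257)
          if all(n % d for d in range(2, int(n**0.5) + 1))]
assert len(primes) == 54

# sin(54 degrees) is half the golden ratio (1 + sqrt(5)) / 2
assert abs(math.sin(math.radians(54)) - (1 + 5**0.5) / 4) < 1e-12

# the digit sum of 54**9 returns to 54
assert sum(int(d) for d in str(54**9)) == 54
```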
In science
The atomic number of xenon is 54.
Astronomy
Messier object M54, a magnitude 8.5 globular cluster in the constellation Sagittarius
The New General Catalogue object NGC 54, a spiral galaxy in the constellation Cetus
The number of years in three Saros cycles of eclipses of the sun and moon is known as a Triple Saros or exeligmos (Greek: "turn of the wheel").
In sports
Fewest points scored by a team in an NBA playoff game: 54, by Utah against Chicago (96) on June 7, 1998
The New York Rangers won the Stanley Cup in 1994, 54 years after their previous Cup win. It is the longest drought in the trophy's history.
For years car number 54 was driven by NASCAR's Lennie Pond. More recently, it is known as the Nationwide Series car number for Kyle Busch.
A score of 54 on a par 72 course in golf is colloquially referred to as a perfect round. This score has never been achieved in competition.
The number used when a player is defeated 3 games in a row in racquetball.
In other fields
54 is also:
+54 The code for international direct dial phone calls to Argentina
A broadcast television channel number
54, a 1998 film about Studio 54 starring Ryan Phillippe, Mike Myers, and Salma Hayek
54, a novel by the Wu Ming collective of authors
In the title of a 1960s television show |
https://en.wikipedia.org/wiki/BattlEye | BattlEye is a proprietary anti-cheat software designed to detect players that hack or abusively use exploits in an online game. It was initially released as a third-party anti-cheat for Battlefield Vietnam in 2004 and has since been officially implemented in numerous video games, primarily shooter games such as PUBG: Battlegrounds, Arma 3, Destiny 2, and DayZ.
BattlEye supports Valve Corporation's Proton compatibility layer and is usable on the Steam Deck.
Games using BattlEye
Arma 2 (2009)
PlanetSide 2 (2012)
Arma 3 (2013)
Rainbow Six Siege (2015)
Heroes & Generals (2016)
Escape from Tarkov (2017)
Ark: Survival Evolved (2017)
Unturned (2017)
Destiny 2 (2017)
PUBG: Battlegrounds (2017)
Fortnite Battle Royale (2017)
Ghost Recon: Wildlands (2017)
Atlas (2018)
Z1 Battle Royale (2018)
DayZ (2018)
PlanetSide Arena (2019)
Ghost Recon: Breakpoint (2019)
Watch Dogs: Legion (2020)
Arma Reforger (2022)
The Cycle: Frontier (2022)
Mount & Blade II: Bannerlord (2022)
Tom Clancy's Rainbow Six Extraction (2022)
Tibia (2023)
War Rock (2023) |
https://en.wikipedia.org/wiki/Shift72 | Shift72 is a New Zealand-based company that facilitates white-label video streaming platforms. It was founded in 2008.
Shift72 is best known for facilitating online film festivals during the COVID-19 pandemic, which prevented physical screenings. It hosted festivals for TIFF, Sundance Film Festival, Cannes Film Festival, and SXSW. According to Indiewire, the company has hosted more than 100 virtual festivals since March 2020.
See also
Online film festivals |
https://en.wikipedia.org/wiki/Nanotechnology%20in%20fiction | The use of nanotechnology in fiction has attracted scholarly attention. The first use of the distinguishing concepts of nanotechnology was "There's Plenty of Room at the Bottom", a talk given by physicist Richard Feynman in 1959. K. Eric Drexler's 1986 book Engines of Creation introduced the general public to the concept of nanotechnology. Since then, nanotechnology has been used frequently in a diverse range of fiction, often as a justification for unusual or far-fetched occurrences featured in speculative fiction.
Notable examples
Literature
In 1931, Boris Zhitkov wrote a short story called Microhands (Микроруки), where the narrator builds for himself a pair of microscopic remote manipulators, and uses them for fine tasks like eye surgery. When he attempts to build even smaller manipulators to be manipulated by the first pair, the story goes into detail about the problem of regular materials behaving differently on a microscopic scale.
In his 1956 short story The Next Tenants, Arthur C. Clarke describes tiny machines that operate at the micrometre scale – although not strictly nanoscale (billionth of a meter), they are the first fictional example of the concepts now associated with nanotechnology.
A concept similar to nanotechnology, called "micromechanical devices", was described in Stanisław Lem's 1959 novel Eden. These devices were used by the aliens as "seeds" to grow a wall around the human spaceship. <ref>Doktryna nieingerencji, In: Marek Oramus, Bogowie Lema</ref>
Stanislaw Lem's 1964 novel The Invincible involves the discovery of an artificial ecosystem of minuscule robots, although like in Clarke's story they are larger than what is strictly meant by the term 'nanotechnology'.
Robert Silverberg's 1969 short story How It Was when the Past Went Away describes nanotechnology being used in the construction of stereo loudspeakers, with a thousand speakers per inch.
The 1984 novel Peace on Earth by Stanislaw Lem tells about small bacteria-sized nanorobots lookin |
https://en.wikipedia.org/wiki/Acid%20grassland | Acid grassland is a nutrient-poor habitat characterised by grassy tussocks and bare ground.
Habitat
The vegetation is dominated by grasses and herbaceous plants, growing on soils deficient in lime (calcium). These may be found on acid sedimentary rock such as sandstone; acid igneous rock such as granite; and fluvial or glacial deposits such as sand and gravel. Typical plants of lowland acid grassland in Britain include common bent grass, Agrostis capillaris, wavy hair-grass, Deschampsia flexuosa, bristle bent grass, Agrostis curtisii, tormentil, Potentilla erecta, and flowers such as sheep's sorrel, Rumex acetosella and heath bedstraw, Galium saxatile.
In Britain
In Britain, under 30,000 hectares of lowland acid grassland remain, often on common land and nature reserves. It is considered a nationally important habitat; areas are found in London on freely-draining sandy and gravelly soils. 271 Sites of Special Scientific Interest have been notified with acid grassland as a principal reason for the designation. Greater London's Richmond Park, Epping Forest and Wimbledon Common are all Special Areas of Conservation with considerable areas of acid grassland. |
https://en.wikipedia.org/wiki/Gallium%28II%29%20selenide | Gallium(II) selenide (GaSe) is a chemical compound. It has a hexagonal layer structure, similar to that of GaS. It is a photoconductor, a second harmonic generation crystal in nonlinear optics, and has been used as a far-infrared conversion material at 14–31 THz and above.
Uses
It is said to have potential for optical applications, but the exploitation of this potential has been limited by the difficulty of readily growing single crystals. Gallium selenide crystals show great promise as a nonlinear optical material and as a photoconductor. Nonlinear optical materials are used in the frequency conversion of laser light. Frequency conversion shifts the wavelength of a monochromatic light source, usually laser light, to a higher or lower wavelength that cannot be produced by a conventional laser source.
Several methods of frequency conversion using nonlinear optical materials exist. Second harmonic generation doubles the frequency of infrared carbon dioxide lasers. In optical parametric generation, the wavelength of light is doubled. Near-infrared solid-state lasers are usually used in optical parametric generation.
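As a numeric illustration of frequency doubling (a sketch, not from the article's sources): doubling the frequency halves the wavelength, since wavelength and frequency are inversely related.

```python
# Sketch: in second harmonic generation the output frequency is
# doubled (f2 = 2 * f1), so the output wavelength is halved.
def second_harmonic_wavelength_um(wavelength_um):
    return wavelength_um / 2

# the common 10.6 micrometre CO2 laser line doubles to 5.3 micrometres
print(second_harmonic_wavelength_um(10.6))  # 5.3
```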
One original problem with using gallium selenide in optics is that it is easily broken along cleavage lines and thus it can be hard to cut for practical application. It has been found, however, that doping the crystals with indium greatly enhances their structural strength and makes their application much more practical. There remain, however, difficulties with crystal growth that must be overcome before gallium selenide crystals may become more widely used in optics.
Single layers of gallium selenide are dynamically stable two-dimensional semiconductors, in which the valence band has an inverted Mexican-hat shape, leading to a Lifshitz transition as the hole-doping is increased.
The integration of gallium selenide into electronic devices has been hindered by its air sensitivity. Several approaches have been deve |
https://en.wikipedia.org/wiki/Milan%20Randi%C4%87 | Milan Randić (born 1 October 1930) is a Croatian American scientist who is one of the leading experts in the field of computational chemistry.
Birth and education
Randić was born in the city of Belgrade, where his parents, originally from Kostrena (in the Croatian Primorje, a region on the northern Adriatic), lived at the time. Kostrena is well known for its maritime tradition, shipowners and seamen; Randić's ancestors were sailing-ship owners as well as ship captains. His parents moved to Zagreb in 1941, where he continued his education.
After finishing gymnasium in Zagreb, he studied theoretical physics at the University of Zagreb during 1949–1953, and studied for a Ph.D. at the University of Cambridge, England (1954–1958).
Academic career
From 1960 to 1970 he was at the Ruđer Bošković Institute in Zagreb, Croatia, where he founded the Theoretical Chemistry Group. During 1971–1980 he visited various universities in the USA, including Johns Hopkins, MIT, Harvard, Tufts, and Cornell. From 1973 his research turned toward the application of discrete mathematics, and graph theory in particular, to the characterization of molecules and biomolecules. From 1980 to 1997 he was a professor in the Department of Mathematics and Computer Science at Drake University, Des Moines, Iowa. For the past 15 years he has spent three months each year at the National Institute of Chemistry, Ljubljana, Slovenia, collaborating with scientists from its Laboratory for Chemometrics.
Research and achievements
Randić has been a major contributor to the development of mathematical chemistry, in particular to the development and use of molecular descriptors based on the use of Graph Theory. In 1975 he introduced the Randić index, the first connectivity index.
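The Randić index of a graph G is the sum, over all edges uv, of 1/√(deg(u)·deg(v)). A minimal sketch (an illustration with a hypothetical helper, not code from any chemistry library):

```python
import math

# Sketch: the Randić (connectivity) index,
# R(G) = sum over edges uv of 1 / sqrt(deg(u) * deg(v)).
def randic_index(edges):
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    return sum(1 / math.sqrt(deg[u] * deg[v]) for u, v in edges)

# the carbon skeleton of n-butane: a path on four vertices,
# degrees 1, 2, 2, 1, giving 1/sqrt(2) + 1/2 + 1/sqrt(2)
butane = [(0, 1), (1, 2), (2, 3)]
print(round(randic_index(butane), 3))  # 1.914
```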
He is a corresponding member of the Croatian Academy of Sciences and Arts and founder of the International Academy of Mathematical Chemistry, the seat of which is in Dubrovnik, Croatia. Since 2000 his research has mostly shifted |
https://en.wikipedia.org/wiki/Antistatic%20device | An antistatic device is any device that reduces, dampens, or otherwise inhibits electrostatic discharge, or ESD, which is the buildup or discharge of static electricity. ESD can damage electrical components such as computer hard drives, and even ignite flammable liquids and gases.
Many methods exist for neutralizing static electricity, varying in use and effectiveness depending on the application. Antistatic agents are chemical compounds that can be added to an object, or the packaging of an object, to help deter the buildup or discharge of static electricity. For the neutralization of static charge in a larger area, such as a factory floor, semiconductor cleanroom or workshop, antistatic systems may utilize electron emission effects such as corona discharge or photoemission that introduce ions into the area that combine with and neutralize any electrically charged object. In many situations, sufficient ESD protection can be achieved with electrical grounding.
Symbology
Various symbols can be found on products, indicating that the product is electrostatically sensitive, as with sensitive electrical components, or that it offers antistatic protection, as with antistatic bags.
Reach symbol
The symbol defined in ANSI/ESD standard S8.1-2007 is most commonly seen in applications related to electronics. Its variations consist of a triangle with a reaching hand depicted inside it using negative space.
Versions of the symbol will often have the hand being crossed out as a warning for the component being protected, indicating that it is ESD sensitive and is not to be touched unless antistatic precautions are taken.
Another version of the symbol has the triangle surrounded by an arc. This variant is in reference to the antistatic protective device, such as an antistatic wrist strap, rather than the component being protected. It usually does not feature the hand being crossed out, indicating that it makes contact with the component safe.
Circle
Another common symbol takes the form |
https://en.wikipedia.org/wiki/Image%20persistence | Image persistence, or image retention, is the LCD and plasma display equivalent of screen burn-in. Unlike screen burn, the effects are usually temporary and often not visible without close inspection. In plasma displays, however, severe image persistence can develop into permanent screen burn-in.
Image persistence can occur when something remains unchanged on the screen in the same location for as little as 10 minutes, such as a web page or document. Minor cases of image persistence are generally visible only when looking at darker areas of the screen, and are usually invisible during ordinary computer use.
Cause
Liquid crystals have a natural relaxed state. When a voltage is applied they rearrange themselves to block certain light waves. If left with the same voltage for an extended period of time (e.g. displaying a pointer or the Taskbar in one place, or showing a static picture for extended periods of time), the liquid crystals can develop a tendency to stay in one position. This ever-so-slight tendency to stay arranged in one position can throw the requested color off by a slight degree, which causes the image to look like the traditional "burn-in" on phosphor based displays. In fact, the root cause of LCD image retention is different from phosphor aging, but the phenomenon is the same, namely uneven use of display pixels. Slight LCD image retention can be recovered. When severe image retention occurs, the liquid crystal molecules have been polarized and cannot rotate in the electric field, so they cannot be recovered.
The cause of this tendency is unclear. It might be due to various factors, including accumulation of ionic impurities inside the LCD, impurities introduced during the fabrication of the LCD, imperfect driver settings, electric charge building up near the electrodes, parasitic capacitance, or a DC voltage component that occurs unavoidably in some display pixels owing to anisotropy in the dielectric constant of the liquid c |
https://en.wikipedia.org/wiki/Program%20in%20Placebo%20Studies | The Program in Placebo Studies and the Therapeutic Encounter was founded in July 2011, at Beth Israel Deaconess Medical Center and the Harvard Medical School. Its purpose is to bring together researchers who are examining the placebo response.
External links
Program in Placebo Studies and the Therapeutic Encounter website
Research projects |
https://en.wikipedia.org/wiki/Hypothallus | In true slime molds (myxogastria), lichens, and in species of the family Clavicipitaceae, the hypothallus is the layer on which the fruit body sits, lying in contact with the substrate. The word is derived from the Ancient Greek root hypó ("under") and thallós ("shoot" or "thallus").
The hypothallus is produced by the plasmodium at the beginning of fructification. Depending on the species, it can be membranous to thick or tender to solid and nearly transparent to brightly coloured. It may surround an individual fruit body, or may form a contiguous connection between multiple fruit bodies. In some rare cases it is missing entirely.
In crustose lichens, the hypothallus is the blackish lower layer of the thallus that produces rhizines, which are holdfasts that attach the lichen to its substrate.
In some taxa the hypothallus may be involved in the formation of the fruit body. In the "epihypothallic" Stemonitida, the hypothallus forms hollow, tubular stems and a columella, up which the remaining plasmodium then rises, producing the spores. In all other myxogastria "subhypothallic" development takes place. Here, the hypothallus produces a layer on the plasmodium, which forms the chambers of the individual fruit bodies during fructification. As the surrounding plasmodium flows into the fruit body, the hypothallus comes to lie directly on the substrate, shrinking and forming the edge of the mature fruit body. Here, the hypothallus is part of a morphological unit with the peridium and stem, which serves as a membranous surface of the whole spore-bearing structure. Epihypothallic development is an autapomorphy of the Stemonitida; subhypothallic development is, by comparison, the primitive feature. |
https://en.wikipedia.org/wiki/QNX | QNX ( or ) is a commercial Unix-like real-time operating system, aimed primarily at the embedded systems market.
The product was originally developed in the early 1980s by Canadian company Quantum Software Systems, later renamed QNX Software Systems.
, it is used in a variety of devices including cars, medical devices, programmable logic controllers, robots, trains, and more.
History
Gordon Bell and Dan Dodge, both students at the University of Waterloo in 1980, took a course in real-time operating systems, in which the students constructed a basic real-time microkernel and user programs. Both were convinced there was a commercial need for such a system, and moved to the high-tech planned community Kanata, Ontario, to start Quantum Software Systems that year. In 1982, the first version of QUNIX was released for the Intel 8088 CPU. In 1984, Quantum Software Systems renamed QUNIX to QNX in an effort to avoid any trademark infringement challenges.
One of the first widespread uses of the QNX real-time OS (RTOS) was in the nonembedded world when it was selected as the operating system for the Ontario education system's own computer design, the Unisys ICON. Over the years QNX was used mostly for larger projects, as its 44k kernel was too large to fit inside the one-chip computers of the era. The system garnered a reputation for reliability and became used in running machinery in many industrial applications.
In the late-1980s, Quantum realized that the market was rapidly moving towards the Portable Operating System Interface (POSIX) model and decided to rewrite the kernel to be much more compatible at a low level. The result was QNX 4. During this time Patrick Hayden, while working as an intern, along with Robin Burgener (a full-time employee at the time), developed a new windowing system. This patented concept was developed into the embeddable graphical user interface (GUI) named the QNX Photon microGUI. QNX also provided a version of the X Window System.
To demonstra |
https://en.wikipedia.org/wiki/Pisier%E2%80%93Ringrose%20inequality | In mathematics, Pisier–Ringrose inequality is an inequality in the theory of C*-algebras which was proved by Gilles Pisier in 1978 affirming a conjecture of John Ringrose. It is an extension of the Grothendieck inequality.
Statement
Theorem. If φ is a bounded, linear mapping of one C*-algebra 𝔄 into another C*-algebra 𝔅, then

‖Σⱼ (φ(Aⱼ)*φ(Aⱼ) + φ(Aⱼ)φ(Aⱼ)*)‖ ≤ 4‖φ‖² ‖Σⱼ (Aⱼ*Aⱼ + AⱼAⱼ*)‖

for each finite set {A₁, …, Aₙ} of elements of 𝔄.
See also
Haagerup-Pisier inequality
Christensen-Haagerup Principle
Notes |
https://en.wikipedia.org/wiki/Embryonic%20Stem%20Cell%20Research%20Oversight%20Committees | The National Academies called for the establishment of Embryonic Stem Cell Research Oversight (ESCRO) Committees in its 2005 Guidelines for Human Embryonic Stem Cell Research to manage the ethical and legal concerns in human embryonic stem cell research. Because of the complexity and novelty of many of the issues involved in that research, the Guidelines committee believes that all research institutions engaged in human embryonic stem cell research should create and maintain these committees at the local level.
The composition and responsibilities of ESCRO committees was further clarified in the Amendments to the National Academies' Guidelines for Human Embryonic Stem Cell Research, released in February 2007.
Organization
The National Academies' Guidelines make clear that activities related to human embryonic stem cell research should be overseen by an ESCRO committee. Those committees could be internal to a single institution or established jointly with one or more other institutions. Alternatively, an institution may have its proposals reviewed by an ESCRO committee of another institution or by an independent ESCRO committee. Many of these changes are discussed in the 2007 Amendments.
Composition
The composition of ESCRO committees was specified to include representatives of the public and people with expertise in developmental biology, stem cell research, molecular biology, assisted reproduction, and ethical and legal issues in human embryonic stem cell research.
The 2007 Amendments clarified that the public representatives should be independent laypeople, serving in addition to the listed areas of scientific expertise.
Although an ESCRO committee may overlap with other oversight committees, it should not be a subcommittee of an Institutional Review Board, as its responsibilities extend beyond human-subject protections.
Responsibilities
The Guidelines assigns several responsibilities to ESCRO committees:
provide oversight over all issues related to derivation and us |
https://en.wikipedia.org/wiki/POV-Ray | The Persistence of Vision Ray Tracer, most commonly acronymed as POV-Ray, is a cross-platform ray-tracing program that generates images from a text-based scene description. It was originally based on DKBTrace, written by David Kirk Buck and Aaron A. Collins for Amiga computers. There are also influences from the earlier Polyray raytracer because of contributions from its author, Alexander Enzmann. POV-Ray is free and open-source software, with the source code available under the AGPL-3.0-or-later license.
History
Sometime in the 1980s, David Kirk Buck downloaded the source code for a Unix ray tracer to his Amiga. He experimented with it for a while and eventually decided to write his own ray tracer named DKBTrace after his initials. He posted it to the "You Can Call Me Ray" bulletin board system (BBS) in Chicago, thinking others might be interested in it. In 1987, Aaron A. Collins downloaded DKBTrace and began working on an x86 port of it. He and David Buck collaborated to add several more features.
When the program proved to be more popular than anticipated, they could not keep up with demand for more features. Thus, in July 1991, David turned over the project to a team of programmers working in the "GraphDev" forum on CompuServe. At the same time, David felt that it was inappropriate to use his initials on a program he no longer maintained. The name "STAR-Light" (Software Taskforce on Animation and Rendering) was initially used, but eventually the name became "PV-Ray", and then ultimately "POV-Ray" (Persistence of Vision Ray Tracer), a name inspired by Dalí's painting, The Persistence of Memory.
Features of the application, and a summary of its history, are discussed in a February 2008 interview with David Kirk Buck and Chris Cason on episode 24 of FLOSS Weekly.
Features
POV-Ray has matured substantially since it was created. Recent versions of the software include the following features:
a Turing-complete scene description language (SDL) that supports ma |
https://en.wikipedia.org/wiki/Meitner%E2%80%93Hupfeld%20effect | The Meitner–Hupfeld effect, named after Lise Meitner and Hans-Hermann Hupfeld, is an anomalously large scattering of gamma rays by heavy elements. The effect was later explained by a broad theory from which evolved the Standard Model, a theory for explaining the structure of the atomic nucleus. The anomalous gamma-ray behavior was eventually ascribed to electron–positron pair production and annihilation.
Although Professor Meitner was recognized for her work, Dr. Hupfeld is usually ignored, and little or no account of his life exists.
See also
Pair production
Electron-positron annihilation |
https://en.wikipedia.org/wiki/Networked%20Help%20Desk | Networked Help Desk is an open standard initiative to provide a common API for sharing customer support tickets between separate instances of issue tracking, bug tracking, customer relationship management (CRM) and project management systems to improve customer service and reduce vendor lock-in. The initiative was created by Zendesk in June 2011 in collaboration with eight other founding member organizations including Atlassian, New Relic, OTRS, Pivotal Tracker, ServiceNow and SugarCRM. The first integration, between Zendesk and Atlassian's issue tracking product, Jira, was announced at the 2011 Atlassian Summit. By August 2011, 34 member companies had joined the initiative. A year after launching, over 50 organizations had joined. Within Zendesk instances this feature is branded as ticket sharing.
Basis
Support tools are generally built around a common paradigm that begins with a customer making a request or an incident report, which creates a ticket. Each ticket has a progress status and is updated with annotations and attachments. These annotations and attachments may be visible to the customer (public), or only visible to analysts (private). Customers are notified of progress made on their ticket until it is complete. If the people necessary to complete a ticket are using separate support tools, additional overhead is introduced in maintaining the relevant information in the ticket in each tool while notifying the customer of progress made by each group in completing their ticket. For example, if a customer support issue is caused by a software bug and reported to a help desk using one system, and then the fix is documented by the developers in another, and analyzed in a customer relationship management tool, keeping the records in each system up-to-date and notifying the customer manually using a swivel-chair approach is unnecessarily time-consuming and error-prone. If information is not transferred correctly, a customer may have to re-explain their problem e |
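The ticket paradigm just described can be sketched as a minimal data model. This is a hypothetical illustration of the concepts (status, public vs. private annotations), not the actual Networked Help Desk API schema:

```python
# Hypothetical data model for the shared-ticket paradigm; the real
# Networked Help Desk API schema may differ.
from dataclasses import dataclass, field

@dataclass
class Annotation:
    author: str
    body: str
    public: bool = True   # public: visible to the customer; private: analysts only

@dataclass
class Ticket:
    customer: str
    summary: str
    status: str = "open"                        # progress status, e.g. open/pending/solved
    annotations: list = field(default_factory=list)

    def add_annotation(self, author, body, public=True):
        self.annotations.append(Annotation(author, body, public))

    def customer_view(self):
        # Customers see only the public annotations on their ticket.
        return [a.body for a in self.annotations if a.public]

t = Ticket(customer="alice@example.com", summary="App crashes on login")
t.add_annotation("analyst", "Reproduced; filing bug", public=False)
t.add_annotation("analyst", "We're investigating your report")
print(t.customer_view())  # ["We're investigating your report"]
```

Sharing a ticket between two systems then amounts to synchronizing this record in both tools while notifying the customer from one place.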
https://en.wikipedia.org/wiki/Railroad%20ecology | Railroad ecology or railway ecology is a term used to refer to the study of the ecological community growing along railroad or railway tracks and the effects of railroads on natural ecosystems. Such ecosystems have been studied primarily in Europe. Similar conditions and effects appear also by roads used by vehicles. Railroads along with roads, canals, and power lines are examples of linear infrastructure intrusions.
Conditions
Railroad beds, like road beds, are designed to drain water away from the tracks, so there is usually a bed of rock and gravel resulting in fast drainage away from the tracks. At the same time, this drainage often accumulates in areas fairly near the tracks where drainage is poor, forming small artificial wetlands. These unnatural conditions combine to form different zones, some in which water is scarce, others in which water is abundant.
Maintenance
Railroad companies routinely clear-cut and/or spray with herbicide any vegetation that grows too close to the tracks. This favors vegetation that is able to respond favorably to clearcutting, and/or resist herbicides.
On overhead electrified railroad lines, clear-cutting must be more extensive, vertically as well as horizontally, in order to prevent vegetation (especially tree limbs) from interfering with the pantographs on a moving train, breaking off and falling on the wires, or simply from arcing in proximity to high voltage transmission cables. The same vegetative selection processes described in the previous paragraph apply, but may additionally favor climbing vines due to the presence of catenary and transmission poles in addition to the wooden communications and signal pole lines which often exist(ed) along nonelectrified lines.
History
Historically, conditions along railroad beds were very different from today. Coal engines used to blanket the area with soot, favoring species adapted to these conditions (some of which only occurred naturally in volcanic areas). Newer engines prod |
https://en.wikipedia.org/wiki/Niphimycin | Niphimycin is an antimicrobial made by Streptomyces. |
https://en.wikipedia.org/wiki/AMD%20Eyefinity | AMD Eyefinity is a brand name for AMD video card products that support multi-monitor setups by integrating multiple (up to six) display controllers on one GPU. AMD Eyefinity was introduced with the Radeon HD 5000 Series "Evergreen" in September 2009 and has been available on APUs and professional-grade graphics cards branded AMD FirePro as well.
AMD Eyefinity supports a maximum of 2 non-DisplayPort displays (e.g., HDMI, DVI, VGA, DMS-59, VHDCI) (which AMD calls "legacy output") and up to 6 DisplayPort displays simultaneously using a single graphics card or APU. To feed more than two displays, the additional panels must have native DisplayPort support. Alternatively active DisplayPort-to-DVI/HDMI/VGA adapters can be employed.
The setup of large video walls by connecting multiple computers over Gigabit Ethernet or Ethernet is also supported.
The version of AMD Eyefinity (aka DCE, display controller engine) introduced with Excavator-based Carrizo APUs features a Video underlay pipe.
Overview
AMD Eyefinity is implemented by multiple on-die display controllers. The HD 5000-series designs host two internal clocks and one external clock. Displays connected over VGA, DVI, or HDMI each require their own internal clock. But all displays connected over DisplayPort can be driven from only one external clock. This external clock is what allows Eyefinity to fuel up to six monitors from a single card.
The entire HD 5000 series of products have Eyefinity capabilities supporting three outputs. The Radeon HD 5870 Eyefinity Edition, however, supports six mini DisplayPort outputs, all of which can be simultaneously active.
The display controller has two RAMDACs that drive the VGA or DVI ports in analog mode (for example, when a DVI-to-VGA converter is attached to a DVI port). It also has a maximum of six digital transmitters that can output either a DisplayPort signal or a TMDS signal for either DVI or HDMI, and two clock signal generators to drive the digital outputs in TMDS |
https://en.wikipedia.org/wiki/Carta%20Worldwide | Carta Worldwide is a Canadian financial technology company that offers digital payments technology and modern card issuer processing for banks and financial technology "fintech" companies. In addition to their Canadian headquarters in Toronto, Carta has offices in London, Casablanca, and Charlottetown, PEI. Carta operates internationally, providing financial technology and digital payment software and cloud API issuer processing.
Carta's clients include Vodafone, PayPal, Banco Sabadell, Westpac NZ, and Novum Bank, nets, TransferWise, and Sodexo. The company also partners with Visa and MasterCard.
Carta was the world's first processor to complete integration to MasterCard MOTAPS, enabling rapid deployment for NFC programs. Their Charlottetown, PEI, data centre is the only secure third party issuer processing data host for financial services in Canada.
Carta was part of the development team that produced Apple Pay and was the first processor in the world to perform an Apple Pay transaction.
History
Carta was founded in 2006 in Canada by Frank Svatousek, Robert Elensky, and Rui Mendes. The company's headquarters were set up in Oakville, Ontario, Canada, and later relocated to Switzerland, with service-company offices opened in Hong Kong and London, UK.
In 2009, Brian Semkiw was appointed as CEO. Brian Semkiw is the co-founder of Rand Worldwide which had 1500 employees and had secured revenues of approximately $500 million by the time it was privatized in 2007. Profit Magazine named Semkiw "Entrepreneur of the Year" in 1996.
On September 15, 2014, Carta Worldwide closed the first tranche of a $12 million Series D venture round. Led by Toronto-based merchant bank Difference Capital and DC Thomson (UK), the $7-million investment adds to the more than $50 million the company has invested in its digital transaction processing platform.
In 2015 Carta partnered with Visa Inc. and Vodafone to enable contactless bank card payments. The Vodafone wallet uses a hybrid of a hardware |
https://en.wikipedia.org/wiki/Spatial%20twist%20continuum | In finite element analysis, the spatial twist continuum (STC) is a dual representation of a hexahedral mesh that defines the global connectivity constraint. Generation of an STC can simplify the automated generation of a mesh. The method was published in 1993 by a group led by Peter Murdoch.
The name is derived from the description of the surfaces that define the connectivity of the hexahedral elements. The surfaces are arranged in the three principal dimensions such that they form orthogonal intersections that coincide with the centroid of the hexahedral element. They are arranged predominantly coplanar to each other in their respective dimensions, yet they can twist into the other dimensional planes through transitions. The surfaces are unbroken throughout the entire volume of the mesh, hence they are continua.
Explanation
One of the areas where the STC finds application is computational fluid dynamics, a field of analysis that involves simulating the flow of fluids over and through bodies defined by boundary surfaces. The procedure involves building a mesh and then using it to analyze the system with a finite volume approach.
An analyst has many choices available for creating a mesh for a CFD or CAE simulation. One is to use a tetrahedral, polyhedral, trimmed-Cartesian, or mixed/hybrid hexahedral ("hex-dominant") mesh. These are classified as unstructured meshes and can all be created automatically, but the CFD and FEA results are inaccurate and prone to solution divergence (the simulation fails to solve).
The other option for the analyst is to use an all-hexahedral mesh, which offers far greater solver stability, speed, and accuracy, as well as the ability to run much more powerful turbulence solvers such as large eddy simulation (LES) in transient mode, as opposed to the unstructured meshes, which can only run a steady-state RANS model.
The difficulty with generating an all-hexahedral mesh on a complex geometry is that mesh needs to ta |
https://en.wikipedia.org/wiki/Doris%20Carver | Doris Loveday Carver (born 1946) is an American computer scientist and software engineer at Louisiana State University, where she is Dow Chemical Distinguished Professor of Computer Science and Engineering, and director of the Software Engineering Laboratory. She is the former president of the IEEE Computer Society and editor-in-chief of IEEE Computer.
Education and career
Carver is a graduate of Carson–Newman College. She earned a master's degree in mathematics at the University of Tennessee in 1969, and entered doctoral study at Texas A&M University in the late 1970s, initially in mathematics, but quickly switching to computer science after taking a course in the subject. She completed her Ph.D. in computer science there in 1981, with the dissertation The effects of complexity on COBOL program changes.
After completing her doctorate, she joined the faculty at Southeastern Louisiana University before moving to Louisiana State University in 1986. At Louisiana State, she has been Interim Dean of the Graduate School, Senior Associate Vice Chancellor of Research and Economic Development, and Interim Vice Chancellor of Research and Economic Development. She has also gone on leave from her faculty position to work as a program officer for the National Science Foundation.
She was president of the IEEE Computer Society in 1998, and later became editor-in-chief of IEEE Computer.
Recognition
Carver was named a Fellow of the IEEE in 1998, "for contributions to the field of software engineering". She became a Fellow of the American Association for the Advancement of Science in 2002. In 2004, the IEEE Computer Society gave her their Richard E. Merwin Award for Distinguished Service. |
https://en.wikipedia.org/wiki/Bluetooth | Bluetooth is a short-range wireless technology standard that is used for exchanging data between fixed and mobile devices over short distances and building personal area networks (PANs). In the most widely used mode, transmission power is limited to 2.5 milliwatts, giving it a very short range of up to . It employs UHF radio waves in the ISM bands, from 2.402GHz to 2.48GHz. It is mainly used as an alternative to wire connections, to exchange files between nearby portable devices and connect cell phones and music players with wireless headphones.
Bluetooth is managed by the Bluetooth Special Interest Group (SIG), which has more than 35,000 member companies in the areas of telecommunication, computing, networking, and consumer electronics. The IEEE standardized Bluetooth as IEEE 802.15.1, but no longer maintains the standard. The Bluetooth SIG oversees development of the specification, manages the qualification program, and protects the trademarks. A manufacturer must meet Bluetooth SIG standards to market it as a Bluetooth device. A network of patents apply to the technology, which are licensed to individual qualifying devices. , 4.7 billion Bluetooth integrated circuit chips are shipped annually.
Etymology
The name "Bluetooth" was proposed in 1997 by Jim Kardach of Intel, one of the founders of the Bluetooth SIG. The name was inspired by a conversation with Sven Mattisson who related Scandinavian history through tales from Frans G. Bengtsson's The Long Ships, a historical novel about Vikings and the 10th-century Danish king Harald Bluetooth. Upon discovering a picture of the runestone of Harald Bluetooth in the book A History of the Vikings by Gwyn Jones, Kardach proposed Bluetooth as the codename for the short-range wireless program which is now called Bluetooth.
According to Bluetooth's official website,
Bluetooth is the Anglicised version of the Scandinavian Blåtand/Blåtann (or in Old Norse blátǫnn). It was the epithet of King Harald Bluetooth, who united th |
https://en.wikipedia.org/wiki/Expertise%20finding | Expertise finding is the use of tools for finding and assessing individual expertise. In the recruitment industry, expertise finding is the problem of searching for employable candidates with certain required skills set. In other words, it is the challenge of linking humans to expertise areas, and as such is a sub-problem of expertise retrieval (the other problem being expertise profiling).
Importance of expertise
It can be argued that human expertise is more valuable than capital, means of production or intellectual property. Contrary to expertise, all other aspects of capitalism are now relatively generic: access to capital is global, as is access to means of production for many areas of manufacturing. Intellectual property can be similarly licensed. Furthermore, expertise finding is also a key aspect of institutional memory, as without its experts an institution is effectively decapitated. However, finding and "licensing" expertise, the key to the effective use of these resources, remain much harder, starting with the very first step: finding expertise that you can trust.
Until very recently, finding expertise required a mix of individual, social and collaborative practices, a haphazard process at best. Mostly, it involved contacting individuals one trusts and asking them for referrals, while hoping that one's judgment about those individuals is justified and that their answers are thoughtful.
In the last fifteen years, a class of knowledge management software has emerged to facilitate and improve the quality of expertise finding, termed "expertise locating systems". This software ranges from social networking systems to knowledge bases. Some systems, like those in the social networking realm, rely on users to connect with each other, thus using social filtering to act as "recommender systems".
At the other end of the spectrum are specialized knowledge bases that rely on experts to populate a specialized type of database with their self-determined areas o |
https://en.wikipedia.org/wiki/Bond%20hardening | Bond hardening is a process of creating a new chemical bond by strong laser fields—an effect opposite to bond softening. However, it is not opposite in the sense that the bond becomes stronger, but in the sense that the molecule enters a state that is diametrically opposite to the bond-softened state. Such states require laser pulses of high intensity, in the range of 1013–1015 W/cm2, and they disappear once the pulse is gone.
Theory
Bond hardening and bond softening share the same theoretical basis, which is described under the latter entry. Briefly, the ground and the first excited energy curves of the H2+ ion are dressed in photons. The laser field perturbs the curves and turns their crossings into anticrossings. Bond softening occurs on the lower branches of the anticrossings and bond hardening happens if the molecule is excited to the upper branches – see Fig. 1.
To trap the molecule in the bond-hardened state, the anticrossing gap cannot be too small or too large. If it is too small, the system can undergo a diabatic transition to the lower branch of the anticrossing and dissociate via bond softening. If the gap is too large, the upper branch becomes shallow or even repulsive, and the system can also dissociate. This means that bound bond-hardened states can exist only in a relatively narrow range of laser intensities, which makes them difficult to observe.
Experimental search for bond hardening
When the existence of bond softening was experimentally verified in 1990, the attention turned to bond hardening. Rather noisy photoelectron spectra reported in the early 1990s implied bond hardening occurring at the 1-photon and 3-photon anticrossings. These reports were received with great interest because bond hardening could explain apparent stabilization of the molecular bond in strong laser fields accompanied by a collective ejection of several electrons. However, instead of more convincing evidence, new negative results relegated bond hardening to a remote the |
https://en.wikipedia.org/wiki/Shotgun%20proteomics | Shotgun proteomics refers to the use of bottom-up proteomics techniques in identifying proteins in complex mixtures using a combination of high performance liquid chromatography combined with mass spectrometry. The name is derived from shotgun sequencing of DNA which is itself named after the rapidly expanding, quasi-random firing pattern of a shotgun. The most common method of shotgun proteomics starts with the proteins in the mixture being digested and the resulting peptides are separated by liquid chromatography. Tandem mass spectrometry is then used to identify the peptides.
Targeted proteomics using SRM and data-independent acquisition methods are often considered alternatives to shotgun proteomics in the field of bottom-up proteomics. While shotgun proteomics uses data-dependent selection of precursor ions to generate fragment ion scans, the aforementioned methods use a deterministic method for acquisition of fragment ion scans.
History
Shotgun proteomics arose from the difficulties of using previous technologies to separate complex mixtures. In 1975, two-dimensional polyacrylamide gel electrophoresis (2D-PAGE) was described by O’Farrell and Klose with the ability to resolve complex protein mixtures. The development of matrix-assisted laser desorption ionization (MALDI), electrospray ionization (ESI), and database searching continued to grow the field of proteomics. However these methods still had difficulty identifying and separating low-abundance proteins, aberrant proteins, and membrane proteins. Shotgun proteomics emerged as a method that could resolve even these proteins.
Advantages
Shotgun proteomics allows global protein identification as well as the ability to systematically profile dynamic proteomes. It also avoids the modest separation efficiency and poor mass spectral sensitivity associated with intact protein analysis.
Disadvantages
The dynamic exclusion filtering that is often used in shotgun proteomics maximizes the number of identified |
https://en.wikipedia.org/wiki/Anatoly%20Karatsuba | Anatoly Alexeyevich Karatsuba (his first name often spelled Anatolii) (; Grozny, Soviet Union, 31 January 1937 – Moscow, Russia, 28 September 2008) was a Russian mathematician working in the field of analytic number theory, p-adic numbers and Dirichlet series.
For most of his student and professional life he was associated with the Faculty of Mechanics and Mathematics of Moscow State University, defending a D.Sc. there entitled "The method of trigonometric sums and intermediate value theorems" in 1966. He later held a position at the Steklov Institute of Mathematics of the Academy of Sciences.
His textbook Foundations of Analytic Number Theory went to two editions, 1975 and 1983.
The Karatsuba algorithm is the earliest known divide and conquer algorithm for multiplication and lives on as a special case of its direct generalization, the Toom–Cook algorithm.
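The divide-and-conquer idea behind the algorithm can be sketched as follows. This is an illustrative Python version operating on binary halves of the operands, not Karatsuba's original formulation for n-digit decimal numbers:

```python
# Illustrative sketch of Karatsuba multiplication: three recursive
# multiplications on half-size operands instead of the naive four.

def karatsuba(x: int, y: int) -> int:
    if x < 10 or y < 10:              # base case: small operands
        return x * y
    n = max(x.bit_length(), y.bit_length())
    half = n // 2
    xh, xl = x >> half, x & ((1 << half) - 1)   # split x into high/low halves
    yh, yl = y >> half, y & ((1 << half) - 1)   # split y into high/low halves
    a = karatsuba(xh, yh)                       # product of high parts
    c = karatsuba(xl, yl)                       # product of low parts
    b = karatsuba(xh + xl, yh + yl) - a - c     # cross terms from one extra multiply
    return (a << (2 * half)) + (b << half) + c

print(karatsuba(1234, 5678))  # 7006652
```

The key saving is computing the cross term b from a single multiplication of the half-sums, which yields the O(n^log2 3) ≈ O(n^1.585) running time that the Toom–Cook algorithm later generalized.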
The main research works of Anatoly Karatsuba were published in more than 160 research papers and monographs.
His daughter, Yekaterina Karatsuba, also a mathematician, constructed the FEE method.
Work on informatics
As a student of Lomonosov Moscow State University, Karatsuba attended the seminar of Andrey Kolmogorov and found solutions to two problems set up by Kolmogorov. This was essential for the development of automata theory and started a new branch in Mathematics, the theory of fast algorithms.
Automata
In the paper of Edward F. Moore, an automaton (or a machine) S is defined as a device with n states, m input symbols
and p output symbols. Nine theorems on the structure of S and on experiments with S are proved. Later such machines got the name of Moore machines. At the end of the paper, in the chapter «New problems», Moore formulates the problem of improving the estimates which he obtained in Theorems 8 and 9:
Theorem 8 (Moore). Given an arbitrary (n; m; p) machine S, such that every two states can be distinguished from each other, there exists an experiment of length n(n − 1)/2 that identifies the state of S at the end |
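A Moore machine — an automaton whose output is determined solely by its current state — can be sketched as follows. The class and the two-state example are illustrative, not taken from Moore's paper:

```python
# Minimal sketch of a Moore machine: the output symbol depends only on
# the current state, not on the input symbol. Example machine is made up.

class MooreMachine:
    def __init__(self, transitions, outputs, start):
        self.transitions = transitions  # (state, input_symbol) -> next state
        self.outputs = outputs          # state -> output symbol
        self.state = start

    def step(self, symbol):
        self.state = self.transitions[(self.state, symbol)]
        return self.outputs[self.state]

    def run(self, symbols):
        return [self.step(s) for s in symbols]

# Two-state machine reporting whether an odd number of 'a's has been seen.
m = MooreMachine(
    transitions={("even", "a"): "odd", ("odd", "a"): "even",
                 ("even", "b"): "even", ("odd", "b"): "odd"},
    outputs={"even": 0, "odd": 1},
    start="even",
)
print(m.run("aab"))  # [1, 0, 0]
```

An "experiment" in Moore's sense feeds such a machine an input sequence and tries to infer its internal state from the observed output sequence alone.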
https://en.wikipedia.org/wiki/How%20to%20Solve%20It | How to Solve It (1945) is a small volume by mathematician George Pólya describing methods of problem solving.
Four principles
How to Solve It suggests the following steps when solving a mathematical problem:
First, you have to understand the problem.
After understanding, make a plan.
Carry out the plan.
Look back on your work. How could it be better?
If this technique fails, Pólya advises: "If you cannot solve the proposed problem, try to solve first some related problem. Could you imagine a more accessible related problem?"
First principle: Understand the problem
"Understand the problem" is often neglected as being obvious and is not even mentioned in many mathematics classes. Yet students are often stymied in their efforts to solve it, simply because they don't understand it fully, or even in part. In order to remedy this oversight, Pólya taught teachers how to prompt each student with appropriate questions, depending on the situation, such as:
What are you asked to find or show?
Can you restate the problem in your own words?
Can you think of a picture or a diagram that might help you understand the problem?
Is there enough information to enable you to find a solution?
Do you understand all the words used in stating the problem?
Do you need to ask a question to get the answer?
The teacher is to select the question with the appropriate level of difficulty for each student to ascertain if each student understands at their own level, moving up or down the list to prompt each student, until each one can respond with something constructive.
Second principle: Devise a plan
Pólya mentions that there are many reasonable ways to solve problems. The skill at choosing an appropriate strategy is best learned by solving many problems. You will find choosing a strategy increasingly easy. A partial list of strategies is included:
Guess and check
Make an orderly list
Eliminate possibilities
Use symmetry
Consider special cases
Use direct reasoning
Sol |
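The first strategy, guess and check, can be sketched as a small search loop. The example problem, predicate, and search range below are invented purely for illustration:

```python
def guess_and_check(predicate, candidates):
    """Return the first candidate satisfying the predicate, or None if none does."""
    for guess in candidates:
        if predicate(guess):     # check the guess against the problem's condition
            return guess
    return None

# Hypothetical example problem: which positive integer n satisfies n * (n + 1) = 56?
solution = guess_and_check(lambda n: n * (n + 1) == 56, range(1, 100))
```

The same loop works for any problem whose candidate solutions can be enumerated and cheaply checked.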
https://en.wikipedia.org/wiki/YProxy | yProxy is a Network News Transfer Protocol (NNTP) proxy server for the Windows operating system. yProxy's main function is to convert yEnc-encoded attachments to UUE-encoded attachments on the fly. The main purpose of this is to add functionality to NNTP newsreaders that do not have native support for yEnc.
The inventor of yEnc recommends yProxy for use by Windows users whose newsreaders do not support yEnc decoding.
yProxy comes in two varieties:
yProxy
yProxy Pro
yProxy
The latest free version of yProxy is version 1.3.
History of yProxy
yEnc (an 8-bit ASCII encoding of 8-bit data) was released in 2001, and a utility named yEnc32 almost immediately became the most popular decoder for it. While flexible through its user interface, yEnc32 requires manual steps to decode yEnc attachments.
In the spring of 2002, shortly after yEnc gained popularity in binary newsgroups, yProxy was released as freeware. yProxy was designed to convert yEnc attachments as they are downloaded, without user intervention. Because yProxy is a proxy server, once it is configured, the user must only ensure that yProxy is running in order to use it.
Due to the design of yProxy as a generic NNTP proxy server, yProxy can be used by any NNTP newsreader. There are many free and commercial NNTP newsreader clients that do not natively support yEnc. yProxy was designed to let the user continue to use his or her existing newsreader.
As of May 31, 2007, the following, popular, free newsreaders do not support yEnc:
Outlook Express
Windows Mail
Windows Live Mail
Mozilla Thunderbird
The free version of yProxy is not supported on Windows Vista or Windows 7 due to yProxy's dependency on WinHelp for the help file. In addition, the free version of yProxy only includes instructions for configuring Outlook Express, which does not apply to Windows Vista's free email and NNTP client, Windows Mail or Windows Live Mail for Windows 7.
The f |
https://en.wikipedia.org/wiki/PAGEOS | PAGEOS (PAssive Geodetic Earth Orbiting Satellite) was a balloon satellite which was launched by NASA in June 1966.
Design
PAGEOS had a diameter of exactly , consisted of a thick mylar plastic film coated with vapour deposited aluminium enclosing a volume of and was used for the Weltnetz der Satellitentriangulation (Worldwide Satellite Triangulation Network) – a global cooperation organized by Hellmut Schmid (Switzerland & USA) 1969–1973.
Finished in 1974, the network connected 46 stations (3000–5000 km distance) of all continents with an accuracy of 3–5 m (approx. 20 times better than terrestrial triangulations at that time).
Orbit
The PAGEOS spacecraft was placed into a polar orbit (inclination 85–86°) at a height of approx. 4000 km, which gradually decayed during its 9 years of operation. The satellite partly disintegrated in July 1975; a second break-up in January 1976 released a large number of fragments, most of which re-entered during the following decade. PAGEOS data have been released in 11 data sets.
PAGEOS' predecessors in satellite triangulation were the balloons Echo 1 (1960, 30 m) and Echo 2 (1964, 40 m), which were also used for passive telecommunication. Their apparent magnitude (brightness) was 1 mag; that of PAGEOS was 2 mag (like Polaris) due to its higher orbit. PAGEOS could therefore be observed simultaneously from widely separated ground sites, e.g. in Europe and North America. PAGEOS appeared as a slow-moving star (at first glance it would appear to be stationary), with an orbital period of approximately three hours. Because of its high orbit and polar inclination it could avoid the Earth's shadow and be observed at any time of the night (low-orbit satellites are only observable shortly after sunset and before sunrise). In the early 1970s PAGEOS varied from 2nd apparent magnitude to beyond visibility over a period of a few minutes.
In 2016, one of the largest fragments of PAGEOS de-orbited.
|
https://en.wikipedia.org/wiki/Warazan | Warazan was a system of record-keeping using knotted straw at the time of the Ryūkyū Kingdom. In the dialect of the Sakishima Islands it was known as barasan and on Okinawa Island as warazani or warazai. Formerly used in particular in relation to the "head tax", it is still to be found in connection with the annual , to record the amount of miki or sacred sake dedicated.
See also
Kaidā glyphs
Naha Tug-of-war
Quipu |
https://en.wikipedia.org/wiki/Dynamic%20mutation | In genetics, a dynamic mutation is an unstable heritable element where the probability of expression of a mutant phenotype is a function of the number of copies of the mutation. That is, the replication product (progeny) of a dynamic mutation has a different likelihood of mutation than its predecessor. These mutations, typically short sequences repeated many times, give rise to numerous known diseases, including the trinucleotide repeat disorders.
Robert I. Richards and Grant R. Sutherland called these phenomena, in the framework of dynamical genetics, dynamic mutations. Triplet expansion is caused by slippage during DNA replication. Due to the repetitive nature of the DNA sequence in these regions, 'loop out' structures may form during DNA replication while maintaining complementary base pairing between the parent strand and daughter strand being synthesized. If the loop out structure is formed from sequence on the daughter strand, this will result in an increase in the number of repeats. However, if the loop out structure is formed on the parent strand, a decrease in the number of repeats occurs. It appears that expansion of these repeats is more common than reduction. Generally, the larger the expansion the more likely it is to cause disease or increase the severity of disease. This property results in the characteristic of anticipation seen in trinucleotide repeat disorders. Anticipation describes the tendency of age of onset to decrease and severity of symptoms to increase through successive generations of an affected family due to the expansion of these repeats.
Common features
Most of these diseases have neurological symptoms.
Anticipation/The Sherman paradox refers to progressively earlier or more severe expression of the disease in more recent generations.
Repeats are usually polymorphic in copy number, with mitotic and meiotic instability.
Copy number related to the severity and/or age of onset
Imprinting effects
Reverse mutation - The mutation can rev |
https://en.wikipedia.org/wiki/User%20onboarding | User onboarding is the process of improving an individual's success with a product or service. This term is often used in reference to software products, and it can be done in a manual or automated way. It is the process through which new software is designed such that new users acquire the necessary knowledge, skills, and behaviors in order to become “up and running”, effective users of a website, app, or software service.
The term originates from the human resources term, onboarding, that refers to the mechanism through which new employees acquire the necessary knowledge, skills, and behaviors in order to become effective organizational members.
The goal of user onboarding is to get users to understand the key principles at the heart of the product and to show them how it will improve their lives. If onboarding can make the point of the product clear and easy to understand the first time a user tries it, the product has a better chance of gaining excited and engaged customers.
Offering a free trial is an example of how you can implement user onboarding. If someone is able to see how the product is useful and exciting to them within a free trial period, it can take them from being a user to a consumer—willing to invest in order to continue their experience.
Onboarding techniques
Leveraging the right user onboarding techniques based on the specific requirements can be extremely beneficial for businesses.
Below are different user onboarding techniques and their classifications.
Single sign-on or social login - Ease the login and add less friction with SSO or Social Login.
Interactive tutorials, walkthroughs - In-application tutorials that help onboard new users, highlight features in your application, and convey your value proposition faster and more easily.
Beacons, tooltips - Raise visitors' attention by adding a beacon to any element on the web page.
Checklists - Leverage gamification effect (see Zeigarnik effect) and provide a cle |
https://en.wikipedia.org/wiki/HP%20Multi-Programming%20Executive | MPE (Multi-Programming Executive) is a discontinued business-oriented mainframe computer real-time operating system made by Hewlett-Packard. While initially a mini-mainframe, the final high-end systems supported 12 CPUs and over 2000 simultaneous users.
Description
It runs on the HP 3000 family of computers, which originally used HP custom 16-bit stack architecture CISC CPUs and were later migrated to PA-RISC where the operating system was called MPE XL.
In 1983, the original version of MPE was written in a language called SPL (System Programming Language). MPE XL was written primarily in Pascal, with some assembly language and some of the old SPL code.
In 1992, the OS name was changed to MPE/iX to indicate Unix interoperability with the addition of POSIX compatibility. The discontinuance of the product line was announced in late 2001, with support from HP terminating at the end of 2010. A number of 3rd party companies still support both the hardware and software.
In 2002 HP released the last version MPE/iX 7.5.
Commands
Among others, MPE/iX supports the following list of common commands and programs.
=SHUTDOWN
BASIC
CHDIR
COPY
DEBUG
ECHO
ELSE
EXIT
FORTRAN
HELP
IF
PASCAL
PRINT
RENAME
SH
WHILE
See also
HP 3000 |
https://en.wikipedia.org/wiki/Locus%20heterogeneity | Locus heterogeneity occurs when mutations at multiple genomic loci are capable of producing the same phenotype (i.e. a single trait, pattern of traits, or disorder), and each individual mutation is sufficient to cause the specific phenotype independently. Locus heterogeneity should not be confused with allelic heterogeneity, in which a single phenotype can be produced by multiple mutations, all of which are at the same locus on a chromosome. Likewise, it should not be confused with phenotypic heterogeneity, in which different phenotypes arise among organisms with identical genotypes and environmental conditions. Locus heterogeneity and allelic heterogeneity are the two components of genetic heterogeneity.
Locus heterogeneity may have major implications for a number of human diseases. For instance, it has been associated with retinitis pigmentosa, hypertrophic cardiomyopathy, osteogenesis imperfecta, familial hypercholesterolemia, and hearing loss. Heterogeneous loci involved in the formation of the same phenotype often contribute to similar biological pathways. The role and degree of locus heterogeneity is an important consideration in understanding disease phenotypes and in the development of therapeutic treatment for these diseases.
The detection of causal genes for diseases impacted by locus heterogeneity is difficult with genetic analysis methods such as linkage analysis and genome sequencing. These methods rely on comparison of affected family members, but when different family members have different disease-causing genes, such genes may not be accurately identified. Existing techniques have been modified and new techniques have been developed to overcome these challenges.
Retinitis pigmentosa
Retinitis pigmentosa is a condition that causes damage to the light-sensitive cells of the retina. There have been over 60 genes identified whose mutations independently cause retinitis pigmentosa, and these can be inherited in an autosomal dominant, autosomal recessive, or |
https://en.wikipedia.org/wiki/Location%20estimation%20in%20sensor%20networks | Location estimation in wireless sensor networks is the problem of estimating the location of an object from a set of noisy measurements. These measurements are acquired in a distributed
manner by a set of sensors.
Use
Many civilian and military applications require monitoring that can identify objects in a specific area, such as monitoring the front entrance of a private house by a single camera. Monitored areas that are large relative to objects of interest often require multiple sensors (e.g., infra-red detectors) at multiple locations. A centralized observer or computer application monitors the sensors. The communication, power, and bandwidth requirements call for an efficient design of the sensors, the transmission, and the processing.
The CodeBlue system of Harvard University is an example where a vast number of sensors distributed among hospital facilities allow staff to locate a patient in distress. In addition, the sensor array enables online recording of medical information while allowing the patient to move around. Military applications (e.g. locating an intruder into a secured area) are also good candidates for setting a wireless sensor network.
Setting
Let denote the position of interest. A set of sensors
acquire measurements contaminated by an
additive noise obeying some known or unknown probability density function (PDF). The sensors transmit measurements to a central processor. The th sensor encodes
by a function . The application processing the data applies a pre-defined estimation rule
. The set of message functions
and the fusion rule are
designed to minimize estimation error.
For example: minimizing the mean squared error (MSE),
.
Ideally, sensors transmit their measurements
directly to the processing center, that is . In this setting, the maximum likelihood estimator (MLE) is an unbiased estimator whose MSE is
assuming a white Gaussian noise
. The next sections suggest
alternative designs when the sensors are bandwidth constrained to
1 bit trans |
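As a concrete sketch of the unquantized case described above (sensors forwarding raw measurements to the fusion center), the MLE of a constant location under additive white Gaussian noise reduces to the sample mean. The noise level, sensor count, and true position below are invented for illustration:

```python
import random

def mle_location(measurements):
    """Sample mean: the ML estimate of a constant parameter observed in white Gaussian noise."""
    return sum(measurements) / len(measurements)

random.seed(0)
theta = 3.0                                            # hypothetical true position
# each of 10,000 sensors reports theta corrupted by zero-mean Gaussian noise (sigma = 0.5)
readings = [theta + random.gauss(0.0, 0.5) for _ in range(10_000)]
estimate = mle_location(readings)
```

The estimator is unbiased and its MSE shrinks as 1/N with the number of sensors N, which is the baseline against which the bandwidth-constrained (1-bit) designs in the following sections are compared.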
https://en.wikipedia.org/wiki/Computational%20and%20Systems%20Neuroscience | Computational and Systems Neuroscience (COSYNE or CoSyNe) is an annual scientific conference for the exchange of experimental and theoretical/computational approaches to problems in systems neuroscience. It is an important meeting for computational neuroscientists where many levels of approaches are discussed. It is a single-track meeting with oral and poster sessions and attracts about 800-900 participants from a variety of disciplines, including neuroscience, computer science and machine learning. Until 2018, the three-day main meeting was held in Salt Lake City, followed by two days of workshops at Snowbird, Utah. In 2018, COSYNE moved to Denver (3 days) and Breckenridge (2 days).
History
COSYNE grew out of the Neural Information and Coding (NIC) meetings founded by Anthony Zador in 1996. The first COSYNE was organized in 2004 by Michael Shadlen, Alexandre Pouget, Carlos Brody and Anthony Zador. The current Executive Committee consists of Alexandre Pouget, Zachary Mainen, Stephanie Palmer and Anthony Zador.
Meetings
Related Meetings
Neural Information Processing Systems (since 1987)
Annual meeting of the Organization for Computational Neuroscience (since 1990/1992)
Conference on Cognitive Computational Neuroscience (since 2017)
Bernstein Conference (since 2005) |
https://en.wikipedia.org/wiki/Necuno | The Necuno is a phone-like mobile device exclusively manufactured in Finland. It seeks to provide high level security and user privacy by omitting the cellular modem. For this reason, it cannot be used on a regular mobile phone network. Instead it offers VOIP via a peer-to-peer encrypted communication platform called Ciphra. Standard cellular connectivity is planned for later versions.
The Necuno is mostly open-source, apart from an isolated firmware blob without access to the main memory, used in the Wi-Fi driver for regulatory reasons. The device uses Plasma Mobile by default, but it can run a variety of open-source mobile operating systems. It also has an ethernet port.
See also
Comparison of open-source mobile phones |
https://en.wikipedia.org/wiki/General%20Dirichlet%20series | In the field of mathematical analysis, a general Dirichlet series is an infinite series that takes the form of
where , are complex numbers and is a strictly increasing sequence of nonnegative real numbers that tends to infinity.
A simple observation shows that an 'ordinary' Dirichlet series
is obtained by substituting while a power series
is obtained when .
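The displayed formulas were evidently lost in extraction; the following is a reconstruction of the standard statements the surrounding prose describes (to be checked against the original article):

```latex
% General Dirichlet series: a_n, s complex, (lambda_n) strictly increasing,
% nonnegative, tending to infinity
\[
  \sum_{n=1}^{\infty} a_n e^{-\lambda_n s}
\]
% Substituting \lambda_n = \ln n yields an 'ordinary' Dirichlet series:
\[
  \sum_{n=1}^{\infty} \frac{a_n}{n^{s}}
\]
% while \lambda_n = n yields a power series in the variable x = e^{-s}:
\[
  \sum_{n=1}^{\infty} a_n \left(e^{-s}\right)^{n}
\]
```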
Fundamental theorems
If a Dirichlet series is convergent at , then it is uniformly convergent in the domain
and convergent for any where .
There are now three possibilities regarding the convergence of a Dirichlet series, i.e. it may converge for all, for none or for some values of s. In the latter case, there exists a value such that the series is convergent for and divergent for . By convention, if the series converges nowhere and if the series converges everywhere on the complex plane.
Abscissa of convergence
The abscissa of convergence of a Dirichlet series can be defined as above. Another equivalent definition is
The line is called the line of convergence. The half-plane of convergence is defined as
The abscissa, line and half-plane of convergence of a Dirichlet series are analogous to radius, boundary and disk of convergence of a power series.
On the line of convergence, the question of convergence remains open as in the case of power series. However, if a Dirichlet series converges and diverges at different points on the same vertical line, then this line must be the line of convergence. The proof is implicit in the definition of abscissa of convergence. An example would be the series
which converges at (alternating harmonic series) and diverges at (harmonic series). Thus, is the line of convergence.
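A standard example fitting this description (a reconstruction; the specific series was stripped during extraction) is:

```latex
\[
  \sum_{n=1}^{\infty} \frac{1}{n}\, e^{-ns}
\]
% At s = \pi i the terms become (-1)^n / n (the alternating harmonic series, convergent);
% at s = 0 they become 1/n (the harmonic series, divergent).
% Both points lie on the vertical line \operatorname{Re}(s) = 0,
% so \sigma = 0 is the line of convergence.
```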
Suppose that a Dirichlet series does not converge at , then it is clear that and diverges. On the other hand, if a Dirichlet series converges at , then and converges. Thus, there are two formulas to compute , depending on the convergence of which can be determin |
https://en.wikipedia.org/wiki/Centre%20for%20Materials%20Elaboration%20and%20Structural%20Studies | The "Centre d’Élaboration de Matériaux et d’Etudes Structurales" (CEMES) is a CNRS laboratory located in Toulouse, France.
CEMES is a public fundamental research laboratory specializing in solid state physics, nanosciences, molecular chemistry and materials science. Its activities cover a spectrum from synthesizing (nano)materials and molecular systems to the study and modelling of their structures and physical properties (optical, mechanical, electronic and magnetic), and their integration in devices.
An important part of the laboratory’s experimental activity is studying and manipulating individual objects whose characteristic sizes are at the nanometric or atomic scales. Most of this experimental work uses state-of-the-art instrumentation supported by instrumental and methodological developments in the laboratory’s fields of transmission electron microscopy, near-field microscopy and optical spectroscopy. These research and development themes integrate modelling and theoretical studies carried out at different scales within the laboratory.
CEMES is a CNRS laboratory associated with Toulouse III - Paul Sabatier University and Institut National des Sciences Appliquées de Toulouse (INSA). Created in 1988, CEMES succeeded the earlier Laboratoire d’Optique Electronique (LOE), founded in 1957 by Prof. Gaston Dupouy.
In contact with the academic community, CEMES is involved in training given at the university at every level: Bachelor, Master and Doctorate.
CEMES currently hosts about 160 people. In 2022 this included: 40 full-time CNRS researchers, 26 University professors or assistant professors, 36 engineers, technicians and administrative staff, and 12 postdocs, 31 Ph.D. students, and many undergraduate students.
Objectives
Studying structures and properties of nanomaterials and nanostructures at the atomic scale
Establishing relationships between nano and microstructures and the physical properties of various kinds of materials and nanomaterials
Inventing and d |
https://en.wikipedia.org/wiki/Thetatorquevirus | Thetatorquevirus is a genus in the family of Anelloviridae, in group II in the Baltimore classification. The genus contains 10 species.
Taxonomy
The genus contains the following species:
Torque teno arthrovec virus 3
Torque teno canid virus 1
Torque teno mustilid virus 1
Torque teno procyo virus 5
Torque teno procyo virus 6
Torque teno ursid virus 1
Torque teno ursid virus 2
Torque teno ursid virus 3
Torque teno ursid virus 4
Torque teno viverrid virus 4 |
https://en.wikipedia.org/wiki/Simplicial%20homotopy | In algebraic topology, a simplicial homotopy is an analog of a homotopy between topological spaces for simplicial sets. If
are maps between simplicial sets, a simplicial homotopy from f to g is a map
such that the diagram formed by f, g and h commutes; the key is to use the diagram that results in and for all x in X.
See also
Kan complex
Dold–Kan correspondence (under which a chain homotopy corresponds to a simplicial homotopy)
Simplicial homology |
https://en.wikipedia.org/wiki/HoDoMS | HoDoMS (Heads of Departments of Mathematical Sciences) is an educational company that acts as a body to represent the heads of United Kingdom higher education departments of mathematical sciences. It aims to discuss and promote the interests of higher education mathematics in the UK and to facilitate dialogue between departments.
Governance
HoDoMS is operated by a committee including four officer roles which are listed below with incumbents. The committee includes observers from the Institute of Mathematics and its Applications, The OR Society, the Royal Statistical Society, the Council for the Mathematical Sciences and the Edinburgh Mathematical Society.
Activities
The main activity of HoDoMS is to run an annual conference bringing members together for briefings and discussion on current issues. For example, the 2020 conference heard briefings on policy issues such as research funding, the Research Excellence Framework 2021, the Teaching Excellence Framework as well as practicalities such as online marking, knowledge exchange, teaching as a career for mathematics undergraduates, and academics and mental health.
HoDoMS also collaborates with other organisations, for example with the London Mathematical Society on an 'Education Day' in 2019 and with the Institute of Mathematics and its Applications and the Isaac Newton Institute on an 'Induction Course for New Lecturers in the Mathematical Sciences' in 2021
History
The first meeting of HoDoMS took place on 14th September 1995 at University College London under its first chair, Graham Wilks. On 14th August 2018, HoDoMS was incorporated as a private company limited by guarantee.
Affiliations
HoDoMS is a member of the Joint Mathematical Council of the United Kingdom (JMC). |
https://en.wikipedia.org/wiki/Hydrus%20%28software%29 | Hydrus is a suite of Windows-based modeling software that can be used for analysis of water flow, heat and solute transport in variably saturated porous media (e.g., soils). The HYDRUS suite of software is supported by an interactive graphics-based interface for data-preprocessing, discretization of the soil profile, and graphic presentation of the results. While HYDRUS-1D simulates water flow, solute and heat transport in one dimension and is public domain software, HYDRUS 2D/3D extends the simulation capabilities to the second and third dimensions, and is distributed commercially.
History
HYDRUS 1D
HYDRUS-1D traces its roots to the early work of van Genuchten and his SUMATRA and WORM models, as well as later work by Vogel (1987) and Kool and van Genuchten (1989) and their SWMI and HYDRUS models, respectively. While Hermitian cubic finite element numerical schemes were used in SUMATRA and linear finite elements in WORM and the older HYDRUS code for solution of both the water flow and solute transport equations, SWMI used finite differences to solve the flow equation.
Various features of these four early models were combined first in the DOS-based SWMI_ST model (Šimůnek et al., 1993), and later in the Windows-based HYDRUS-1D simulator (Šimůnek et al., 1998). After releasing versions 1 (for 16-bit Windows 3.1) and 2 (for 32-bit Windows 95), the next two major updates (versions 3 and 4) were released in 2005 and 2008. These last two versions included additional modules applicable to more complex biogeochemical reactions than the standard HYDRUS modules.
While the standard modules of HYDRUS-1D can simulate the transport of solutes that are either fully independent or involved in the sequential first-order degradation chains, the two new modules can consider mutual interactions between multiple solutes, such as cation exchange and precipitation/dissolution.
Version 3 included the UNSATCHEM module (Suarez and Šimůnek, 1997) for simulating carbon dioxide transport a |
https://en.wikipedia.org/wiki/Jot%20Down | Jot Down is a cultural magazine based in Barcelona, Spain. The magazine was started as an online magazine, but has also published quarterly print editions. It describes itself as one of the representatives of slow journalism.
History and profile
Jot Down was launched by a group of journalists and businesspeople led by Mar de Marchis, Ángel Fernández and Ricardo Jonás on 16 May 2011 as an online cultural magazine. As of 2017 Mar de Marchis was also the editor-in-chief.
The magazine features narrations and interviews using visuals, including photographs, illustrations and graphic humor. The early contributors of the magazine include Enric González, Fernando Savater and Félix de Azúa. From 2012 Jot Down published print editions on a quarterly basis. The print edition is published in black and white and consists of 350 pages.
As of 2017 Jot Down had one million unique visitors and sold 10,000–15,000 copies of print edition. |
https://en.wikipedia.org/wiki/Earth%27s%20internal%20heat%20budget | Earth's internal heat budget is fundamental to the thermal history of the Earth. The flow of heat from Earth's interior to the surface is estimated at 47±2 terawatts (TW) and comes from two main sources in roughly equal amounts: the radiogenic heat produced by the radioactive decay of isotopes in the mantle and crust, and the primordial heat left over from the formation of Earth.
Earth's internal heat travels along geothermal gradients and powers most geological processes. It drives mantle convection, plate tectonics, mountain building, rock metamorphism, and volcanism. Convective heat transfer within the planet's high-temperature metallic core is also theorized to sustain a geodynamo which generates Earth's magnetic field.
Despite its geological significance, Earth's interior heat contributes only 0.03% of Earth's total energy budget at the surface, which is dominated by 173,000 TW of incoming solar radiation. This external energy source powers most of the planet's atmospheric, oceanic, and biologic processes. Nevertheless on land and at the ocean floor, the sensible heat absorbed from non-reflected insolation flows inward only by means of thermal conduction, and thus penetrates only several tens of centimeters on the daily cycle and only several tens of meters on the annual cycle. This renders solar radiation minimally relevant for processes internal to Earth's crust.
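The quoted penetration depths are consistent with the classical thermal skin depth δ = √(2κ/ω) of a periodic surface temperature wave conducted into the ground. The diffusivity below is an assumed representative value for soil or rock, not a figure from the text; the temperature variation remains measurable down to a few multiples of δ:

```python
import math

def skin_depth(diffusivity_m2_s: float, period_s: float) -> float:
    """e-folding depth of a periodic surface temperature wave: sqrt(2 * kappa / omega)."""
    omega = 2.0 * math.pi / period_s      # angular frequency of the forcing cycle
    return math.sqrt(2.0 * diffusivity_m2_s / omega)

KAPPA = 1.0e-6                            # assumed thermal diffusivity of soil/rock, m^2/s
DAY = 86_400.0
YEAR = 365.25 * DAY

daily_depth = skin_depth(KAPPA, DAY)      # roughly 0.17 m
annual_depth = skin_depth(KAPPA, YEAR)    # roughly 3.2 m
```

The daily wave thus dies off within tens of centimeters and the annual wave within a few skin depths (tens of meters), matching the penetration scales stated above.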
Global data on heat-flow density are collected and compiled by the International Heat Flow Commission of the International Association of Seismology and Physics of the Earth's Interior.
Heat and early estimate of Earth's age
Based on calculations of Earth's cooling rate, which assumed constant conductivity in the Earth's interior, in 1862 William Thomson, later Lord Kelvin, estimated the age of the Earth at 98 million years, which contrasts with the age of 4.5 billion years obtained in the 20th century by radiometric dating. As pointed out by John Perry in 1895 a variable conductivity in the E |
https://en.wikipedia.org/wiki/Action%20Message%20Format | Action Message Format (AMF) is a binary format used to serialize object graphs such as ActionScript objects and XML, or to send messages between an Adobe Flash client and a remote service, usually a Flash Media Server or third party alternatives. The ActionScript 3 language provides classes for encoding and decoding from the AMF format.
The format is often used in conjunction with Adobe's RTMP to establish connections and control commands for the delivery of streaming media. In this case, the AMF data is encapsulated in a chunk which has a header which defines things such as the message length and type (whether it is a "ping", "command" or media data).
Format analysis
AMF was introduced with Flash Player 6, and this version is referred to as AMF0. It was unchanged until the release of Flash Player 9 and ActionScript 3.0, when new data types and language features prompted an update, called AMF3. Flash Player 10 added vector and dictionary data types documented in a revised specification of January 2013.
Adobe Systems published the AMF binary data protocol specification in December 2007 and announced that it will support the developer community to make this protocol available for every major server platform.
AMF self-contained packet
The following amf-packet is for transmission of messages outside of defined Adobe/Macromedia containers or transports such as Flash Video or the Real Time Messaging Protocol.
If either the header length or the message length is unknown, it is set to -1 (0xFFFFFFFF)
uimsbf: unsigned integer, most significant bit first
simsbf: signed integer, most significant bit first
AMF0
The format specifies the various data types that can be used to encode data. Adobe states that AMF is mainly used to represent object graphs that include named properties in the form of key-value pairs, where the keys are encoded as strings and the values can be of any data type such as strings or numbers as well as arrays and other objects. XML is supported as |
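The scalar encodings described above can be illustrated with a minimal Python sketch using the type markers from Adobe's published AMF0 specification (0x00 = number as a big-endian IEEE-754 double, 0x01 = boolean, 0x02 = short string with a 16-bit big-endian byte length). This covers only three scalar types, not the full format:

```python
import struct

def amf0_number(value: float) -> bytes:
    # marker 0x00, then the value as an 8-byte big-endian double
    return b"\x00" + struct.pack(">d", value)

def amf0_boolean(value: bool) -> bytes:
    # marker 0x01, then a single 0x00/0x01 byte
    return b"\x01" + (b"\x01" if value else b"\x00")

def amf0_string(value: str) -> bytes:
    # marker 0x02, then a u16 big-endian byte length, then UTF-8 data
    data = value.encode("utf-8")
    return b"\x02" + struct.pack(">H", len(data)) + data
```

Object properties are then serialized as a bare (marker-less) length-prefixed string for the key followed by one of these marked values.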
https://en.wikipedia.org/wiki/Securus%2C%20Inc. | Securus Inc., was a Cary, North Carolina based provider of GPS tracking and personal emergency response technology.
The company was founded in 2008 as Positioning Animals Worldwide Inc. and launched SpotLight and SpotLite GPS pet locator in collaboration with American Kennel Club Companion Animal Recovery (AKC CAR). In 2010 the company changed its name to Securus, Inc., to reflect its diversification into new markets. The company's chief executive officer is Chris Newton.
In 2011, Securus, Inc. acquired Zoombak, LLC from TruePosition, Inc., a subsidiary of Liberty Media. Zoombak was known for developing GPS-based products that help people track things, ranging from teenage drivers to employees, pets and automobiles.
In March 2015, it was announced that Securus was selling its personal emergency response service (PERS) platform to Ogden, UT-based Freeus, LLC, sister company of AvantGuard Monitoring Systems. That same month, it was announced that BrickHouse Security acquired Securus, including Zoombak and related brands.
Securus has been involved in buying and selling location data of private citizens that use cell-phones. This has been raised as a privacy concern as it allows Securus to sell for its own profit the "location of nearly any phone in the US without authorization," including to bounty hunters. |
https://en.wikipedia.org/wiki/Bitcrusher | A bitcrusher is an audio effect that produces distortion by reducing the resolution or bandwidth of digital audio data. The resulting quantization noise may produce a "warmer" sound impression, or a harsh one, depending on the amount of reduction.
Methods
A typical bitcrusher uses two methods to reduce audio fidelity: sample rate reduction and resolution reduction.
Sample rate reduction
Digital audio is composed of a rapid series of numeric samples that encode the changing amplitude of an audio waveform. To accurately represent a wideband waveform of substantial duration, digital audio requires a large number of samples at a high sample rate. The higher the rate, the more accurate the waveform; a lower rate requires the source analog signal to be low-pass filtered to limit the maximum frequency component in the signal, or else high-frequency components of the signal will be aliased. Specifically, the frequency of sampling (a.k.a. the sample rate) must be at least twice the maximum frequency component in the signal; this maximum signal frequency of one-half the sampling frequency is called the Nyquist limit.
Though it is a common misconception that the sample rate affects the "smoothness" of the digitally represented waveform, this is not true; sampling theory guarantees that, up to the maximum signal frequency supported by the sample rate (i.e. the Nyquist limit), the digital (discrete) signal will exactly represent the analog (continuous-wave) source, except for the distortion of quantization noise resulting from the finite precision of the individual samples. The original signal can be exactly reconstructed simply by passing the discrete signal through an ideal low-pass filter (with a perfectly vertical cutoff profile). However, as an ideal filter is impossible to build, a real filter, with a gradual transition between the passband and the stopband, must be used, with the consequence that it is impossible to accurately record all frequencies right up t
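The two fidelity reductions a bitcrusher performs can be sketched in a few lines of pure Python. This is an illustrative toy, not a reference implementation: the sample-and-hold decimation deliberately skips the low-pass filtering step, which is what produces the characteristic aliasing.

```python
import math

def crush_sample_rate(samples, factor):
    """Naive sample-and-hold decimation: keep every `factor`-th sample
    and repeat it to preserve the length. No low-pass filter is applied
    first, so content above the new Nyquist limit (rate / (2 * factor))
    aliases -- the artifact a bitcrusher exploits."""
    return [samples[(i // factor) * factor] for i in range(len(samples))]

def crush_resolution(samples, bits):
    """Re-quantize samples in [-1, 1] to `bits` bits of resolution,
    introducing audible quantization noise."""
    levels = 2 ** (bits - 1)
    return [round(s * levels) / levels for s in samples]

# One second of a 440 Hz tone at an (assumed) 8 kHz sample rate.
tone = [math.sin(2 * math.pi * 440 * i / 8000) for i in range(8000)]
crushed = crush_resolution(crush_sample_rate(tone, 4), 4)
```

After crushing, the signal holds each value for 4 samples (an effective 2 kHz rate) and takes at most 17 distinct amplitude levels (4-bit signed quantization).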
https://en.wikipedia.org/wiki/Tradeston%20Flour%20Mills%20explosion | On 9 July 1872 the Tradeston Flour Mills, in Glasgow, Scotland, exploded. Eighteen people died, and at least sixteen were injured. An investigation suggested that the explosion was caused by the grain feed to a pair of millstones stopping, causing them to rub against each other, resulting in a spark or fire igniting the grain dust in the air. That fire was then drawn by a fan into an "exhaust box" designed to collect grain dust, which then ignited, causing a second explosion which destroyed the building. At the time, there were general concerns about similar incidents worldwide, so the incident and investigation were widely reported across the world.
Background
The mill was owned by Matthew Muir & Sons, had been in operation for thirty years, and consisted of a five-storey grain store on King Street (now Kingston Street), another grain store that occupied most of a four-storey building on Clyde Place, and a four-storey grain mill building between the two, with three boilers and an engine shed attached. This occupied the majority of the block surrounded by Clyde Place, Commerce Street, King Street and Centre Streets, with Gorbals Free Church, the Bute Hotel, some shops and some dwelling houses taking up the rest of the block.
Explosion
At 4 pm on 9 July, just as the day shift was about to finish, a large explosion blew out the front and back of the mill building. Survivors of the explosion described a small initial explosion that filled the building with flour, and then a large explosion that blew out the walls. The buildings were then engulfed in fire. Employees of neighbouring businesses were also injured and killed in the explosion. Six people were taken to the Royal Infirmary with serious injuries, while another ten with less serious injuries were sent home to recover.
Firefighters were dispatched from all but one of the city's fire stations with firefighters from Bridgeton station being held in reserve in case of another fire. An off-duty firefighter from t |
https://en.wikipedia.org/wiki/Flavored%20fortified%20wine | Flavored fortified wines or tonic wines (known informally as bum wines or bum vino) are inexpensive fortified wines that typically have an alcohol content between 13% and 20% alcohol by volume (ABV). They are made from various fruits (including grapes and citrus fruits) with added sugar, artificial flavor, and artificial color.
Brands
Buckfast Tonic Wine is a tonic wine with added alcohol, caffeine, and sugar, produced under license from Buckfast Abbey, a Roman Catholic monastery located in Devon, England. It is particularly popular along the central belt of Scotland, especially Glasgow, Faifley, East Kilbride, Hamilton, Coatbridge and other Strathclyde areas, as well as Falkirk, Fife, Edinburgh and the Lothians, but critics have blamed it for being a cause of social problems in Scotland. Some have nicknamed it "Wreck the Hoose Juice". It also enjoys strong popularity and near cult-following in Northern Ireland, often referred to simply as "Bucky" and in some cases "Lurgan Champagne".
Divas Vkat (originally Vodkat). Made in the Riverina region of New South Wales, Australia, Divas produce a range of flavoured and unflavoured fortified wines used as a substitute for traditional spirits such as vodka, triple sec and coconut rum. Alcohol levels vary by flavour, but tend to be between 20% and 22% alcohol by volume (40 to 44 proof). There is also a range of "Ready to Drink" (RTD) 330 mL pre-mixed cocktails in a four-pack, with a strength of 8% alcohol by volume.
MD 20/20 (often called by its nickname Mad Dog) is an American fortified wine. The MD stands for its producer, Mogen David. MD 20/20 has an alcohol content that varies by flavor from 13% to 18%. Initially, 20/20 stood for 20 oz at 20% alcohol. Currently, MD 20/20 is sold neither in 20 oz bottles nor at 20% alcohol by volume. In the Central Belt of Scotland it is a very popular beverage, usually drunk from the bottle and often associated with antisocial behaviour.
Solntsedar (, named after a town on the Bla |
https://en.wikipedia.org/wiki/Spirometry | Spirometry (meaning the measuring of breath) is the most common of the pulmonary function tests (PFTs). It measures lung function, specifically the amount (volume) and/or speed (flow) of air that can be inhaled and exhaled. Spirometry is helpful in assessing breathing patterns that identify conditions such as asthma, pulmonary fibrosis, cystic fibrosis, and COPD. It is also helpful as part of a system of health surveillance, in which breathing patterns are measured over time.
Spirometry generates pneumotachographs, which are charts that plot the volume and flow of air coming in and out of the lungs from one inhalation and one exhalation.
Testing
Spirometer
The spirometry test is performed using a device called a spirometer, which comes in several different varieties. Most spirometers display the following graphs, called spirograms:
a volume-time curve, showing volume (litres) along the Y-axis and time (seconds) along the X-axis
a flow-volume loop, which graphically depicts the rate of airflow on the Y-axis and the total volume inspired or expired on the X-axis
Procedure
The basic forced vital capacity (FVC) test varies slightly depending on the equipment used; it can be performed with either a closed or an open circuit. Regardless of differences in testing procedure, providers are recommended to follow the ATS/ERS Standardisation of Spirometry. The standard procedure ensures accurate and objectively collected data, based on a common reference, to improve comparability of the results when shared across differing medical groups.
The patient is asked to put on a soft nose clip to prevent air escaping and to close their lips around a breathing sensor, forming an airtight seal. Guided by a technician, the patient is given step-by-step instructions to take an abrupt maximum-effort inhale, followed by a maximum-effort exhale lasting for a target of at least 6 seconds. When assessing possible upper airway obstruction, the technician will direct the patient to make an additional
https://en.wikipedia.org/wiki/Senses%20of%20Cinema | Senses of Cinema is a quarterly online film magazine founded in 1999 by filmmaker Bill Mousoulis. Based in Melbourne, Australia, Senses of Cinema publishes work by film critics from all over the world, including critical essays, career overviews of the works of key directors, and coverage of many international festivals.
Its contributors have included Raphaël Bassan, Salvador Carrasco, Barbara Creed, Wheeler Winston Dixon, David Ehrenstein, Thomas Elsaesser, Valie Export, Gwendolyn Audrey Foster, Dušan Makavejev, Edgar Morin, Joseph Natoli, Murray Pomerance, Berenice Reynaud, Jonathan Rosenbaum, David Sanjek, Sally Shafto, David Sterritt, Robert Dassanowsky, and Viviane Vagh.
The magazine's current editors are Amanda Barbour, César Albarrán-Torres, Tara Judah, Abel Muñoz-Hénonin and Fiona Villella.
Format
Every issue of Senses of Cinema follows roughly the same format: about a dozen "featured articles," often related to a unifying theme, a special dossier often devoted to some aspect of Australian cinema, reports from various major international, regional and underground film festivals, book reviews, and articles devoted to recent screenings and retrospectives at the Melbourne Cinematheque.
Senses of Cinema regularly publishes interviews in its featured articles section.
Great directors
Senses of Cinema maintains a database of career overview essays on "great directors." The essays range in scope and length, but they all maintain an auteurist perspective on the filmmakers' work.
See also
Bright Lights Film Journal
The Moving Arts Film Journal |
https://en.wikipedia.org/wiki/Horseradish | Horseradish (Armoracia rusticana, syn. Cochlearia armoracia) is a perennial plant of the family Brassicaceae (which also includes mustard, wasabi, broccoli, cabbage, and radish). It is a root vegetable, cultivated and used worldwide as a spice and as a condiment. The species is probably native to southeastern Europe and western Asia.
Description
Horseradish grows up to tall, with hairless bright green unlobed leaves up to long that may be mistaken for docks (Rumex). It is cultivated primarily for its large, white, tapered root. The white four-petalled flowers are scented and are borne in dense panicles. Established plants may form extensive patches and may become invasive unless carefully managed.
Intact horseradish root has little aroma. When cut or grated, enzymes from within the plant cells digest sinigrin (a glucosinolate) to produce allyl isothiocyanate (mustard oil), which irritates the mucous membranes of the sinuses and eyes. Once exposed to air or heat, horseradish loses its pungency, darkens in color, and develops a bitter flavor.
History
Horseradish has been cultivated since antiquity. Dioscorides listed horseradish equally as Persicon sinapi (Diosc. 2.186) or Sinapi persicum (Diosc. 2.168), which Pliny's Natural History reported as Persicon napy; Cato discusses the plant in his treatises on agriculture, and a mural in Pompeii shows the plant. Horseradish is probably the plant mentioned by Pliny the Elder in his Natural History under the name of Amoracia, and recommended by him for its medicinal qualities, and possibly the wild radish, or raphanos agrios of the Greeks. The early Renaissance herbalists Pietro Andrea Mattioli and John Gerard showed it under Raphanus. Its modern Linnaean genus Armoracia was first applied to it by Heinrich Bernhard Ruppius, in his Flora Jenensis, 1745, but Linnaeus himself called it Cochlearia armoracia.
Both roots and leaves were used as a traditional medicine during the Middle Ages. The root was used as a condiment o |
https://en.wikipedia.org/wiki/Palmar%20aponeurosis | The palmar aponeurosis (palmar fascia) invests the muscles of the palm, and consists of central, lateral, and medial portions.
Structure
The central portion occupies the middle of the palm, is triangular in shape, and of great strength.
Its apex is continuous with the lower margin of the transverse carpal ligament, and receives the expanded tendon of the palmaris longus.
Its base divides below into four slips, one for each finger. Each slip gives off superficial fibers to the skin of the palm and finger, those to the palm joining the skin at the furrow corresponding to the metacarpophalangeal articulations, and those to the fingers passing into the skin at the transverse fold at the bases of the fingers.
The deeper part of each slip subdivides into two processes, which are inserted into the fibrous sheaths of the flexor tendons. From the sides of these processes offsets are attached to the transverse metacarpal ligament.
By this arrangement short channels are formed on the front of the heads of the metacarpal bones; through these the flexor tendons pass. The intervals between the four slips transmit the digital vessels and nerves, and the tendons of the lumbricales.
At the points of division into the slips mentioned, numerous strong, transverse fasciculi bind the separate processes together.
The central part of the palmar aponeurosis is intimately bound to the integument by dense fibroareolar tissue forming the superficial palmar fascia, and gives origin by its medial margin to the palmaris brevis.
It covers the superficial volar arch, the tendons of the flexor muscles, and the branches of the median and ulnar nerves; and on either side it gives off a septum, which is continuous with the interosseous aponeurosis, and separates the intermediate from the collateral groups of muscles.
Lateral and medial portions
The lateral and medial portions of the palmar aponeurosis are thin, fibrous layers, which cover, on the radial side, the muscles of the ball of the thu |
https://en.wikipedia.org/wiki/AMMECR1 | In molecular biology, the AMMECR1 protein (Alport syndrome, intellectual disability, midface hypoplasia and elliptocytosis chromosomal region gene 1 protein) is a protein encoded by the AMMECR1 gene on human chromosome Xq22.3.
The contiguous gene deletion syndrome is characterised by Alport syndrome (A), intellectual disability (M), midface hypoplasia (M), and elliptocytosis (E), as well as generalized hypoplasia and cardiac abnormalities. It is caused by a deletion in Xq22.3, comprising several genes including AMME chromosomal region gene 1 (AMMECR1), which encodes a protein with a nuclear location and presently unknown function. The C-terminal region of AMMECR1 (from residue 122 to 333) is well conserved, and homologues appear in species ranging from bacteria and archaea to eukaryotes. The high level of conservation of the AMMECR1 domain points to a basic cellular function, potentially in either the transcription, replication, repair or translation machinery.
The AMMECR1 domain contains a six-amino-acid motif (LRGCIG) that might be functionally important since it is strikingly conserved throughout evolution. The AMMECR1 domain consists of two distinct subdomains of different sizes. The large subdomain, which contains both the N- and C-terminal regions, consists of five alpha-helices and five beta-strands. These five beta-strands form an antiparallel beta-sheet. The small subdomain consists of four alpha-helices and three beta-strands, and these beta-strands also form an antiparallel beta-sheet. The conserved 'LRGCIG' motif is located at beta(2) and its N-terminal loop, and most of the side chains of these residues point toward the interface of the two subdomains. The two subdomains are connected by only two loops, and the interaction between the two subdomains is not strong. Thus, these subdomains may move dynamically when the substrate enters the cleft. The size of the cleft suggests that the substrate is large, e.g., the substrate may be a nucleic acid or pr |
https://en.wikipedia.org/wiki/Richard%20Schoen | Richard Melvin Schoen (born October 23, 1950) is an American mathematician known for his work in differential geometry and geometric analysis. He is best known for the resolution of the Yamabe problem in 1984.
Career
Born in Celina, Ohio, and a 1968 graduate of Fort Recovery High School, he received his B.S. from the University of Dayton in mathematics. He then received his PhD in 1977 from Stanford University. After faculty positions at the Courant Institute, NYU, University of California, Berkeley, and University of California, San Diego, he was Professor at Stanford University from 1987–2014, as Bass Professor of Humanities and Sciences since 1992. He is currently Distinguished Professor and Excellence in Teaching Chair at the University of California, Irvine. His surname is pronounced "Shane."
Schoen received an NSF Graduate Research Fellowship in 1972 and a Sloan Research Fellowship in 1979. Schoen is a 1983 MacArthur Fellow. He has been invited to speak at the International Congress of Mathematicians (ICM) three times, including twice as a Plenary Speaker. In 1983 he was an Invited Speaker at the ICM in Warsaw, in 1986 he was a Plenary Speaker at the ICM in Berkeley, and in 2010 he was a Plenary Speaker at the ICM in Hyderabad. For his work on the Yamabe problem, Schoen was awarded the Bôcher Memorial Prize in 1989. He was elected to the American Academy of Arts and Sciences in 1988 and to the National Academy of Sciences in 1991, became Fellow of the American Association for the Advancement of Science in 1995, and won a Guggenheim Fellowship in 1996. In 2012 he became a Fellow of the American Mathematical Society. He received the 2014–15 Dean’s Award for Lifetime Achievements in Teaching from Stanford University. In 2015, he was elected Vice President of the American Mathematical Society. He was awarded an Honorary Doctor of Science from the University of Warwick in 2015. He received the Wolf Prize in Mathematics for 2017, shared with Charles Fefferman. I |
https://en.wikipedia.org/wiki/Carefree%20Black%20Girls | Carefree Black Girls is a cultural concept and movement that aims to increase the breadth of "alternative" representations of black women. The origins of this expression can be traced to both Twitter and Tumblr. Zeba Blay was reportedly the first person to use the expression as a hashtag on Twitter in May 2013. Danielle Hawkins soon launched a blog on Tumblr by the same name. In her article for The Root, Diamond Sharp describes "carefree black girls" as an idea that, "[black women] have used to anchor expressions of individuality and whimsy in the face of the heavy stereotypes and painful realities that too often color discussions of their demographic." At Refinery29, Jamala Johns said it was "a way to celebrate all things joyous and eclectic among brown ladies. Cultivated online and driven by social media, it's one telling piece of a much wider development of inspiration assembled by and for black women." Hillary Crosley Coker, a reporter for Jezebel provides specific examples of notable black women embodying the concept. She claims that, "ladies like Chiara de Blasio (with her hippie flower headband), Solange [Knowles] and her eclectic style and Janelle Monae's futurism are their patron saints".
Reception
The "carefree black girl" movement has prompted the development of related concepts and efforts such as "carefree black boys," a term also dubbed by Blay. Another concept that emerged was "carefree black kids" via the hashtag from Another Round host and Late Night with Stephen Colbert writer Heben Nigatu (#carefreeblackkids2k16). In July 2016, Blavity called the photos and videos posted with Nigatu's hashtag "the bright light we needed after this troubling week," which was marked by the state-sponsored killings of Alton Sterling and Philando Castile.
Criticism
As the "carefree black girl" concept gained favorable recognition, it has also faced criticism. Shamira Ibrahim, reporter for The Root compares the emergence of the "carefree black girl" concept to "bl |
https://en.wikipedia.org/wiki/Master%20boot%20record | A master boot record (MBR) is a special type of boot sector at the very beginning of partitioned computer mass storage devices like fixed disks or removable drives intended for use with IBM PC-compatible systems and beyond. The concept of MBRs was publicly introduced in 1983 with PC DOS 2.0.
The MBR holds the information on how the disk's sectors (also known as "blocks") are divided into partitions, each partition notionally containing a file system. The MBR also contains executable code to function as a loader for the installed operating system, usually by passing control over to the loader's second stage, or in conjunction with each partition's volume boot record (VBR). This MBR code is usually referred to as a boot loader.
The organization of the partition table in the MBR limits the maximum addressable storage space of a partitioned disk to 2 TiB. Approaches to slightly raise this limit utilizing 32-bit arithmetic or 4096-byte sectors are not officially supported, as they fatally break compatibility with existing boot loaders, most MBR-compliant operating systems and associated system tools, and may cause serious data corruption when used outside of narrowly controlled system environments. Therefore, the MBR-based partitioning scheme is in the process of being superseded by the GUID Partition Table (GPT) scheme in new computers. A GPT can coexist with an MBR in order to provide some limited form of backward compatibility for older systems.
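The 2 TiB ceiling follows directly from the arithmetic of the partition table: entries store sector addresses and counts in 32-bit fields, and the classic scheme assumes 512-byte sectors. A short check:

```python
SECTOR_SIZE = 512      # bytes; the classic MBR scheme assumes 512-byte sectors
LBA_FIELD_BITS = 32    # partition entries hold 32-bit sector addresses/counts

# 2**32 addressable sectors of 512 bytes each:
max_bytes = (2 ** LBA_FIELD_BITS) * SECTOR_SIZE

print(max_bytes == 2 * 1024 ** 4)  # True: exactly 2 TiB
```

This is also why the 4096-byte-sector workaround mentioned above raises the limit (same 2**32 sector count, eight times the bytes per sector) while breaking tools that hard-code 512-byte sectors.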
MBRs are not present on non-partitioned media such as floppies, superfloppies or other storage devices configured to behave as such, nor are they necessarily present on drives used in non-PC platforms.
Overview
Support for partitioned media, and thereby the master boot record (MBR), was introduced with IBM PC DOS 2.0 in March 1983 in order to support the 10 MB hard disk of the then-new IBM Personal Computer XT, still using the FAT12 file system. The original version of the MBR was written by David Litton of IBM in June 1982. The pa |
https://en.wikipedia.org/wiki/Collusion | Collusion is a deceitful agreement or secret cooperation between two or more parties to limit open competition by deceiving, misleading or defrauding others of their legal right. Collusion is not always considered illegal. It can be used to attain objectives forbidden by law; for example, by defrauding or gaining an unfair market advantage. It is an agreement among firms or individuals to divide a market, set prices, limit production or limit opportunities.
It can involve "unions, wage fixing, kickbacks, or misrepresenting the independence of the relationship between the colluding parties". In legal terms, all acts effected by collusion are considered void.
Definition
In the study of economics and market competition, collusion takes place within an industry when rival companies cooperate for their mutual benefit. Conspiracy usually involves an agreement between two or more sellers to take action to suppress competition between sellers in the market. Because competition among sellers can provide consumers with low prices, conspiracy agreements increase the price consumers pay for the goods. Because of this harm to consumers, it is against antitrust laws to fix prices by agreement between producers, so participants must keep it a secret. Collusion often takes place within an oligopoly market structure, where there are few firms and agreements that have significant impacts on the entire market or industry. To differentiate from a cartel, collusive agreements between parties may not be explicit; however, the implications of cartels and collusion are the same.
Under competition law, there is an important distinction between direct and tacit collusion. Direct collusion generally refers to a group of companies communicating directly with each other to coordinate and monitor their actions, such as cooperating through pricing, market allocation, sales quotas, etc. On the other hand, tacit collusion is where companies coordinate and monitor their behavior without direct c
https://en.wikipedia.org/wiki/Vimentin | Vimentin is a structural protein that in humans is encoded by the VIM gene. Its name comes from the Latin vimentum which refers to an array of flexible rods.
Vimentin is a type III intermediate filament (IF) protein that is expressed in mesenchymal cells. IF proteins are found in all animal cells as well as bacteria. Intermediate filaments, along with tubulin-based microtubules and actin-based microfilaments, comprise the cytoskeleton. All IF proteins are expressed in a highly developmentally-regulated fashion; vimentin is the major cytoskeletal component of mesenchymal cells. Because of this, vimentin is often used as a marker of mesenchymally-derived cells or cells undergoing an epithelial-to-mesenchymal transition (EMT) during both normal development and metastatic progression.
Structure
The assembly of the fibrous vimentin filament that forms the cytoskeleton follows a gradual sequence. The vimentin monomer has a central α-helical domain, capped on each end by non-helical amino (head) and carboxyl (tail) domains. Two monomers are likely co-translationally expressed in a way that facilitates their interaction, forming a coiled-coil dimer, which is the basic subunit of vimentin assembly. A pair of coiled-coil dimers connect in an antiparallel fashion to form a tetramer. Eight tetramers join to form what is known as the unit-length filament (ULF); ULFs then stick to each other and elongate, followed by compaction to form the fibrous protein.
The α-helical sequences contain a pattern of hydrophobic amino acids that contribute to forming a "hydrophobic seal" on the surface of the helix. In addition, there is a periodic distribution of acidic and basic amino acids that seems to play an important role in stabilizing coiled-coil dimers. The spacing of the charged residues is optimal for ionic salt bridges, which allows for the stabilization of the α-helix structure. While this type of stabilization is intuitive for intrachain interactions, rather than interchain i |
https://en.wikipedia.org/wiki/Horton%20graph | In the mathematical field of graph theory, the Horton graph or Horton 96-graph is a 3-regular graph with 96 vertices and 144 edges discovered by Joseph Horton. Published by Bondy and Murty in 1976, it provides a counterexample to the Tutte conjecture that every cubic 3-connected bipartite graph is Hamiltonian.
After the Horton graph, a number of smaller counterexamples to the Tutte conjecture were found. Among them are a 92 vertex graph by Horton published in 1982, a 78 vertex graph by Owens published in 1983, and the two Ellingham-Horton graphs (54 and 78 vertices).
The first Ellingham-Horton graph was published by Ellingham in 1981 and was of order 78. At that time, it was the smallest known counterexample to the Tutte conjecture. The second one was published by Ellingham and Horton in 1983 and was of order 54. In 1989, Georges' graph, the smallest currently-known non-Hamiltonian 3-connected cubic bipartite graph was discovered, containing 50 vertices.
As a non-Hamiltonian cubic graph with many long cycles, the Horton graph provides a good benchmark for programs that search for Hamiltonian cycles.
The Horton graph has chromatic number 2, chromatic index 3, radius 10, diameter 10, girth 6, book thickness 3 and queue number 2. It is also a 3-edge-connected graph.
Algebraic properties
The automorphism group of the Horton graph is of order 96 and is isomorphic to Z/2Z×Z/2Z×S4, the direct product of the Klein four-group and the symmetric group S4.
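As a quick sanity check on the stated order, the direct product Z/2Z×Z/2Z×S4 has order 2 · 2 · 4!, which indeed equals 96:

```python
from math import factorial

# |Z/2Z x Z/2Z x S4| = |Z/2Z| * |Z/2Z| * |S4| = 2 * 2 * 4!
order = 2 * 2 * factorial(4)
print(order)  # → 96
```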
The characteristic polynomial of the Horton graph is:
.
Gallery |
https://en.wikipedia.org/wiki/Decoding%20methods | In coding theory, decoding is the process of translating received messages into codewords of a given code. There have been many common methods of mapping messages to codewords. These are often used to recover messages sent over a noisy channel, such as a binary symmetric channel.
Notation
C ⊂ F_2^n is considered a binary code with the length n; x, y shall be elements of F_2^n; and d(x, y) is the distance between those elements.
Ideal observer decoding
One may be given the message x ∈ F_2^n; then ideal observer decoding generates the codeword y ∈ C. The process results in this solution: y = argmax over y ∈ C of P(y sent | x received).
For example, a person can choose the codeword that is most likely to be received as the message after transmission.
Decoding conventions
The decoded codeword is not always unique: there may be more than one codeword with an equal likelihood of mutating into the received message. In such a case, the sender and receiver(s) must agree ahead of time on a decoding convention. Popular conventions include:
Request that the codeword be resent (automatic repeat-request).
Choose any codeword at random from the set of most likely codewords (those nearest to the received message).
If another code follows, mark the ambiguous bits of the codeword as erasures and hope that the outer code disambiguates them.
Maximum likelihood decoding
Given a received vector y, maximum likelihood decoding picks a codeword x ∈ C that maximizes
P(y received | x sent),
that is, the codeword x that maximizes the probability that y was received, given that x was sent. If all codewords are equally likely to be sent then this scheme is equivalent to ideal observer decoding.
In fact, by Bayes' theorem,
P(y received | x sent) = P(x sent, y received) / P(x sent) = P(x sent | y received) · P(y received) / P(x sent).
Upon fixing y, P(y received) is restructured, and P(x sent)
is constant as all codewords are equally likely to be sent.
Therefore,
P(y received | x sent) is maximised as a function of the variable x precisely when P(x sent | y received) is maximised, and the claim follows.
As with ideal observer decoding, a convention must be agreed to for non-unique decoding.
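On a binary symmetric channel with crossover probability p < 1/2, the likelihood P(y received | x sent) equals p^d (1-p)^(n-d) where d is the Hamming distance between x and y, so maximizing the likelihood reduces to minimizing the distance. A minimal sketch (the length-3 repetition code is an invented example, not from the text):

```python
def hamming(u: str, v: str) -> int:
    """Hamming distance between two equal-length binary strings."""
    return sum(a != b for a, b in zip(u, v))

def ml_decode_bsc(received: str, code: list) -> str:
    """Maximum likelihood decoding over a binary symmetric channel
    with crossover probability p < 1/2: since
        P(y received | x sent) = p**d * (1 - p)**(n - d)
    is decreasing in d = hamming(x, y), maximizing the likelihood is
    the same as choosing the codeword nearest the received word."""
    return min(code, key=lambda codeword: hamming(codeword, received))

code = ["000", "111"]              # toy repetition code
print(ml_decode_bsc("010", code))  # → 000
```

Note that ties (non-unique nearest codewords) are resolved here by `min`'s ordering, which is one arbitrary choice among the decoding conventions listed above.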
The maximum likelihood decoding problem can also be modeled as an integer programming problem.
The |
https://en.wikipedia.org/wiki/Logic%20synthesis | In computer engineering, logic synthesis is a process by which an abstract specification of desired circuit behavior, typically at register transfer level (RTL), is turned into a design implementation in terms of logic gates, typically by a computer program called a synthesis tool. Common examples of this process include synthesis of designs specified in hardware description languages, including VHDL and Verilog. Some synthesis tools generate bitstreams for programmable logic devices such as PALs or FPGAs, while others target the creation of ASICs. Logic synthesis is one step in circuit design within electronic design automation; the others are place and route, and verification and validation.
History
The roots of logic synthesis can be traced to the treatment of logic by George Boole (1815 to 1864), in what is now termed Boolean algebra. In 1938, Claude Shannon showed that the two-valued Boolean algebra can describe the operation of switching circuits. In the early days, logic design involved manipulating the truth table representations as Karnaugh maps. The Karnaugh map-based minimization of logic is guided by a set of rules on how entries in the maps can be combined. A human designer can typically only work with Karnaugh maps containing up to four to six variables.
The first step toward automation of logic minimization was the introduction of the Quine–McCluskey algorithm that could be implemented on a computer. This exact minimization technique presented the notion of prime implicants and minimum cost covers that would become the cornerstone of two-level minimization. Nowadays, the much more efficient Espresso heuristic logic minimizer has become the standard tool for this operation. Another area of early research was in state minimization and encoding of finite-state machines (FSMs), a task that was the bane of designers. The applications for logic synthesis lay primarily in digital computer design. Hence, IBM and Bell Labs played a pivotal role in the early |
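The core merge rule behind Quine–McCluskey's prime-implicant generation can be sketched in a few lines. This is a hypothetical helper written for illustration, not code from any particular minimization tool; implicants are strings over '0', '1', and the don't-care symbol '-'.

```python
def combine_implicants(a: str, b: str):
    """One Quine-McCluskey merge step: two implicants combine iff they
    agree everywhere except at exactly one position where both hold a
    specified bit ('0' or '1'); that position becomes a don't-care '-'.
    Returns the combined implicant, or None if no merge is possible."""
    diff = [i for i, (x, y) in enumerate(zip(a, b)) if x != y]
    if len(diff) != 1:
        return None
    i = diff[0]
    if a[i] == '-' or b[i] == '-':
        return None            # a don't-care cannot pair with a bit
    return a[:i] + '-' + a[i + 1:]

print(combine_implicants("0110", "0100"))  # → 01-0
print(combine_implicants("0110", "1001"))  # → None
```

Repeatedly applying this step to all pairs, and keeping the implicants that never combine, yields the prime implicants that feed the minimum-cost cover selection mentioned above.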
https://en.wikipedia.org/wiki/OR4F5 | Olfactory receptor family 4 subfamily F member 5 is a protein that in humans is encoded by the OR4F5 gene.
Function
Olfactory receptors interact with odorant molecules in the nose, to initiate a neuronal response that triggers the perception of a smell. The olfactory receptor proteins are members of a large family of G-protein-coupled receptors (GPCR) arising from single coding-exon genes. Olfactory receptors share a 7-transmembrane domain structure with many neurotransmitter and hormone receptors and are responsible for the recognition and G protein-mediated transduction of odorant signals. The olfactory receptor gene family is the largest in the genome. The nomenclature assigned to the olfactory receptor genes and proteins for this organism is independent of other organisms.
Trivia
OR4F5 is the first protein coding gene on human chromosome 1. |
https://en.wikipedia.org/wiki/UPt3 | UPt3 is an inorganic binary intermetallic crystalline compound of platinum and uranium.
Production
It can be synthesized in the following ways:
as an intermetallic compound, by direct fusion of pure components according to stoichiometric calculations:
by reduction of uranium dioxide with hydrogen in the presence of platinum:
Physical properties
UPt3 forms crystals of hexagonal symmetry (some studies hypothesize a trigonal structure instead), space group P63/mmc, cell parameters a = 0.5766 nm and c = 0.4898 nm (c denotes the interplanar spacing), with a structure similar to nisnite (Ni3Sn) and MgCd3.
The compound melts congruently at 1700 °C. Its enthalpy of formation is −111 kJ/mol.
At temperatures below 1 K it becomes superconducting, thought to be due to the presence of heavy fermions (the uranium atoms). |
https://en.wikipedia.org/wiki/Journal%20de%20Th%C3%A9orie%20des%20Nombres%20de%20Bordeaux | The Journal de Théorie des Nombres de Bordeaux is a triannual peer-reviewed open-access scientific journal covering number theory and related topics. It was established in 1989 and is published by the Institut de Mathématiques de Bordeaux on behalf of the Société Arithmétique de Bordeaux. The editor-in-chief is Denis Benois (University of Bordeaux).
Abstracting and indexing
The journal is abstracted and indexed in Current Contents/Physical, Chemical & Earth Sciences, Zentralblatt MATH, Mathematical Reviews, Science Citation Index Expanded, and Scopus. According to the Journal Citation Reports, the journal has a 2015 impact factor of 0.294. |
https://en.wikipedia.org/wiki/Nonagon | In geometry, a nonagon or enneagon is a nine-sided polygon or 9-gon.
The name nonagon is a prefix hybrid formation, from Latin nonus ("ninth") and Greek gonon, used equivalently, attested already in the 16th century in French as nonogone and in English from the 17th century. The name enneagon comes from Greek enneagonon (εννεα, "nine" + γωνον (from γωνία, "corner")), and is arguably more correct, though less common than "nonagon".
Regular nonagon
A regular nonagon is represented by Schläfli symbol {9} and has internal angles of 140°. The area of a regular nonagon of side length a is given by
A = (9/4) a² cot(π/9) = (9/2) a r ≈ 6.182 a²,
where the radius r of the inscribed circle of the regular nonagon is
r = (a/2) cot(π/9) ≈ 1.374 a,
and where R is the radius of its circumscribed circle:
R = a / (2 sin(π/9)) = r / cos(π/9) ≈ 1.462 a.
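As a quick numerical check, the standard regular-polygon formulas with n = 9 give these quantities directly (a small illustrative sketch; the `nonagon` function name is mine):

```python
import math

def nonagon(a):
    """Area, inradius r and circumradius R of a regular nonagon of side a."""
    r = a / (2 * math.tan(math.pi / 9))   # apothem (inscribed-circle radius)
    R = a / (2 * math.sin(math.pi / 9))   # circumscribed-circle radius
    area = 9 / 2 * a * r                  # (perimeter x apothem) / 2
    return area, r, R

area, r, R = nonagon(1.0)
# area ≈ 6.1818, r ≈ 1.3737, R ≈ 1.4619 for unit side length
```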
Construction
Although a regular nonagon is not constructible with compass and straightedge (as 9 = 3², which is not a product of distinct Fermat primes), there are very old methods of construction that produce very close approximations.
It can be also constructed using neusis, or by allowing the use of an angle trisector.
Symmetry
The regular enneagon has Dih9 symmetry, order 18. There are 2 subgroup dihedral symmetries: Dih3 and Dih1, and 3 cyclic group symmetries: Z9, Z3, and Z1.
These 6 symmetries can be seen as 6 distinct symmetry types of the enneagon. John Conway labels these by a letter and group order. Full symmetry of the regular form is r18, and the trivial form with no symmetry is labeled a1. The dihedral symmetries are divided depending on whether they pass through vertices (d for diagonal) or edges (p for perpendiculars), and i when reflection lines pass through both edges and vertices. Cyclic symmetries in the middle column are labeled as g for their central gyration orders.
Each subgroup symmetry allows one or more degrees of freedom for irregular forms. Only the g9 subgroup has no degrees of freedom but can be seen as directed edges.
Tilings
The regular enneagon can tessellate the Euclidean plane with gaps. These gaps can be filled with regular hexagons and
https://en.wikipedia.org/wiki/Weak%20hypercharge | In the Standard Model of electroweak interactions of particle physics, the weak hypercharge is a quantum number relating the electric charge and the third component of weak isospin. It is frequently denoted YW and corresponds to the gauge symmetry U(1).
It is conserved (only terms that are overall weak-hypercharge neutral are allowed in the Lagrangian). However, one of the interactions is with the Higgs field. Since the Higgs field vacuum expectation value is nonzero, particles interact with this field all the time even in vacuum. This changes their weak hypercharge (and weak isospin T3). Only a specific combination of them, the electric charge Q = T3 + YW/2, is conserved.
Mathematically, weak hypercharge appears similar to the Gell-Mann–Nishijima formula for the hypercharge of strong interactions (which is not conserved in weak interactions and is zero for leptons).
In the electroweak theory SU(2) transformations commute with U(1) transformations by definition and therefore U(1) charges for the elements of the SU(2) doublet (for example lefthanded up and down quarks) have to be equal. This is why U(1) cannot be identified with U(1)em and weak hypercharge has to be introduced.
Weak hypercharge was first introduced by Sheldon Glashow in 1961.
Definition
Weak hypercharge is the generator of the U(1) component of the electroweak gauge group, and its associated quantum field mixes with the electroweak quantum field to produce the observed gauge boson and the photon of quantum electrodynamics.
The weak hypercharge satisfies the relation
Q = T3 + YW/2,
where Q is the electric charge (in elementary charge units) and T3 is the third component of weak isospin (the SU(2) component).
Rearranging, the weak hypercharge can be explicitly defined as:
YW = 2(Q − T3),
where "left"- and "right"-handed here are left and right chirality, respectively (distinct from helicity).
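A small worked example, using the convention Q = T3 + YW/2 (some texts absorb the factor of two and write Q = T3 + YW, halving all YW values); the `weak_hypercharge` helper name is mine:

```python
from fractions import Fraction as F

def weak_hypercharge(Q, T3):
    """YW = 2 (Q - T3), rearranged from Q = T3 + YW / 2."""
    return 2 * (Q - T3)

# Left-handed quark doublet (u_L, d_L): T3 = +1/2 and -1/2, yet both
# members carry the same YW = +1/3 -- as required for an SU(2) doublet.
Y_uL = weak_hypercharge(F(2, 3), F(1, 2))    # 1/3
Y_dL = weak_hypercharge(F(-1, 3), F(-1, 2))  # 1/3
# The right-handed up quark is an SU(2) singlet, so T3 = 0.
Y_uR = weak_hypercharge(F(2, 3), 0)          # 4/3
```

That the two doublet members come out with equal YW is exactly the point made above about U(1) charges within an SU(2) doublet.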
The weak hypercharge for an anti-fermion is the opposite of that of the corresponding fermion because the electric charge and the third component of th |
https://en.wikipedia.org/wiki/Hilbert%27s%20seventh%20problem | Hilbert's seventh problem is one of David Hilbert's list of open mathematical problems posed in 1900. It concerns the irrationality and transcendence of certain numbers (Irrationalität und Transzendenz bestimmter Zahlen).
Statement of the problem
Two specific equivalent questions are asked:
In an isosceles triangle, if the ratio of the base angle to the angle at the vertex is algebraic but not rational, is then the ratio between base and side always transcendental?
Is a^b always transcendental, for algebraic a (other than 0 or 1) and irrational algebraic b?
Solution
The question (in the second form) was answered in the affirmative by Aleksandr Gelfond in 1934, and refined by Theodor Schneider in 1935. This result is known as Gelfond's theorem or the Gelfond–Schneider theorem. (The restriction to irrational b is important, since it is easy to see that a^b is algebraic for algebraic a and rational b.)
From the point of view of generalizations, this is the case
of the general linear form in logarithms which was studied by Gelfond and then solved by Alan Baker. It is called the Gelfond conjecture or Baker's theorem. Baker was awarded a Fields Medal in 1970 for this achievement.
See also
Hilbert number or Gelfond–Schneider constant |
https://en.wikipedia.org/wiki/Palindromic%20rheumatism | Palindromic rheumatism (PR) is a syndrome characterised by recurrent, self-resolving inflammatory attacks in and around the joints, consisting of arthritis or periarticular soft tissue inflammation. Onset is often acute, with sudden and rapidly developing attacks or flares. There is pain, redness, swelling, and disability of one or multiple joints. The interval between recurrent palindromic attacks and the length of an attack are extremely variable, from a few hours to days. Attacks may become more frequent with time, but there is no joint damage after attacks. It is thought to be an autoimmune disease, possibly an abortive form of rheumatoid arthritis.
Presentation
The exact prevalence of palindromic rheumatism in the general population is unknown, and this condition is often considered a rare disease by nonrheumatologists. However, a recent Canadian study showed that the incidence of PR in a cohort of incident arthritis was one case of PR for every 1.8 cases of rheumatoid arthritis (RA). The incidence of PR is less than that of RA but is not as rare as previously thought.
Palindromic rheumatism is a syndrome presenting with inflammatory para-arthritis (soft tissue rheumatism) and inflammatory arthritis, both of which cause sudden inflammation in one or several joints or the soft tissue around joints. The flares usually present with mono- or oligo-articular involvement, have onset over hours and last a few hours to a few days, and then resolve completely. However, episodes of recurrence form a pattern, with symptom-free periods between attacks lasting for weeks to months. The most commonly involved joints are the knees and the metacarpophalangeal and proximal interphalangeal joints of the hands or feet. Constitutionally, there may or may not be a fever and swelling of the joints. The soft tissues are involved with swelling of the periarticular tissues, especially heel pads and finger pads. Nodules may be found in the subcutaneous tissues. The frequency of attacks
https://en.wikipedia.org/wiki/Direct%20digital%20synthesis | Direct digital synthesis (DDS) is a method employed by frequency synthesizers used for creating arbitrary waveforms from a single, fixed-frequency reference clock. DDS is used in applications such as signal generation, local oscillators in communication systems, function generators, mixers, modulators, sound synthesizers and as part of a digital phase-locked loop.
Overview
A basic Direct Digital Synthesizer consists of a frequency reference (often a crystal or SAW oscillator), a numerically controlled oscillator (NCO) and a digital-to-analog converter (DAC) as shown in Figure 1.
The reference oscillator provides a stable time base for the system and determines the frequency accuracy of the DDS. It provides the clock to the NCO, which produces at its output a discrete-time, quantized version of the desired output waveform (often a sinusoid) whose period is controlled by the digital word contained in the Frequency Control Register. The sampled, digital waveform is converted to an analog waveform by the DAC. The output reconstruction filter rejects the spectral replicas produced by the zero-order hold inherent in the analog conversion process.
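The NCO described above is conventionally built as a phase accumulator whose top bits index a sine lookup table; the sketch below assumes that standard architecture (names and parameters are illustrative, not from any specific device):

```python
import math

def nco_samples(fcw, n_bits, lut_bits, n_samples):
    """Minimal NCO model: an n_bits-wide phase accumulator advances by the
    frequency control word (fcw) each clock; its top lut_bits bits index a
    sine table. Output frequency = fcw * Fclk / 2**n_bits."""
    lut = [math.sin(2 * math.pi * i / 2**lut_bits) for i in range(2**lut_bits)]
    acc, out = 0, []
    for _ in range(n_samples):
        out.append(lut[acc >> (n_bits - lut_bits)])
        acc = (acc + fcw) & (2**n_bits - 1)   # phase wraps modulo 2**n_bits
    return out

# e.g. a 32-bit accumulator clocked at 100 MHz gives a frequency
# resolution of 100e6 / 2**32 ≈ 0.023 Hz.
```

Discarding the accumulator's low bits when indexing the table is the phase truncation mentioned below as a source of spurs.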
Performance
A DDS has many advantages over its analog counterpart, the phase-locked loop (PLL), including much better frequency agility, improved phase noise, and precise control of the output phase across frequency switching transitions. Disadvantages include spurious responses mainly due to truncation effects in the NCO, crossing spurs resulting from high order (>1) Nyquist images, and a higher noise floor at large frequency offsets due mainly to the digital-to-analog converter.
Because a DDS is a sampled system, in addition to the desired waveform at output frequency Fout, Nyquist images are also generated (the primary image is at Fclk-Fout, where Fclk is the reference clock frequency). In order to reject these undesired images, a DDS is generally used in conjunction with an analog reconstruction lowpass filter as shown |
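Enumerating these images is simple arithmetic; a tiny illustrative helper (the function name is mine) lists the image pairs at k·Fclk ± Fout:

```python
def nyquist_images(fout, fclk, n_zones=3):
    """Image frequencies of a sampled output: each multiple of the clock
    contributes a pair k*Fclk - Fout and k*Fclk + Fout."""
    images = []
    for k in range(1, n_zones + 1):
        images += [k * fclk - fout, k * fclk + fout]
    return images

# Fout = 10 MHz, Fclk = 100 MHz: primary image at 90 MHz, then
# 110, 190, 210, ... MHz; the reconstruction low-pass keeps only Fout.
```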
https://en.wikipedia.org/wiki/List%20of%20debuggers | This is a list of debuggers: computer programs that are used to test and debug other programs.
Debuggers
Advanced Debugger — The standard UNIX debugger
Allinea DDT — graphical debugger for debugging multithreaded and multiprocess applications on Linux platforms
AQtime — profiler and memory/resource debugger for Windows
ARM Development Studio 5 (DS-5)
CA/EZTEST — was a CICS interactive test/debug software package
CodeView — was a debugger for the DOS platform
dbx — a proprietary source-level debugger for Pascal/Fortran/C/C++ on UNIX platforms
DEBUG — the built-in debugger of DOS and Microsoft Windows
Dragonfly (Opera) — JavaScript and HTML DOM debugger
Dr. Memory — a DynamoRIO-based memory debugger
Dynamic debugging technique (DDT), and its octal counterpart Octal Debugging Technique
FusionDebug — interactive debugger for Adobe ColdFusion, Railo, and Lucee CFML Engines
FusionReactor — interactive IDE style debugger which includes various extensions/controls for allowing debugging of Java in production environments
GNU Debugger
Parasoft Insure++ — a multi-platform memory debugger
Intel Debugger
Interactive Disassembler (IDA Pro)
Java Platform Debugger Architecture
Jinx — a whole-system debugger for heisenbugs. It works transparently as a device driver.
JSwat — open-source Java debugger
LLDB
MacsBug — a debugger for the classic Mac OS
Memcheck — a Valgrind-based memory debugger
Modular Debugger — a C/C++ source level debugger for Solaris and derivatives
OllyDbg — a disassembly-based debugger for Windows (GUI)
Omniscient Debugger — Forward and backward debugger for Java
Rational Purify (IBM) — multi-platform memory debugger
sdb — a symbolic debugger for C programs for ancient UNIX platforms
SIMMON (Simulation Monitor)
SoftICE — kernel mode debugger for Windows
SEGGER Ozone — debugger and performance analyser for embedded systems
TRACE32 — in-circuit debugger for embedded systems
Turbo Debugger — Pascal/C/assembly debugger for DOS
Undo LiveRecorder — C, C++, Go, Rust, Ja |
https://en.wikipedia.org/wiki/Tore%20Dyb%C3%A5 | Tore Dybå (born 31 July 1961) is a Norwegian scientist and software engineer in the fields of information systems and computer science. He has been a Chief Scientist at SINTEF ICT since 2003.
Career
Dybå received his Master of Science in Electrical Engineering and Computer Science from the Norwegian Institute of Technology in 1987. In 2001 he received his Doctoral degree (PhD) in Computer and Information Science from the Norwegian University of Science and Technology.
He worked as a software engineer and consultant in Norway and Saudi Arabia from 1987 until 1994 when he moved to SINTEF. Dybå had an adjunct position at the Simula Research Laboratory from 2002 to 2009, and from 2010 until 2015 he was a Professor of Software Engineering at the Department of Informatics at the University of Oslo.
Research
Dybå's research is related to organizational and socio-technical aspects of software development and how software development can be improved. He has been particularly concerned with combining rigorous research with topics of importance to the software industry, including software process improvement, agile software development and management, and empirical methods for software engineering.
Awards
For the period 2001–2012, the Journal of Systems and Software ranked Dybå as the top scholar worldwide in agile software development. The ranking named Dybå the most active researcher by total articles in the period, as well as the most cited researcher by total number of citations and by adjusted citations.
In 2014 Dybå, together with Kitchenham and Jørgensen, received the Association for Computing Machinery's ACM SIGSOFT award for the most influential paper in the last ten years for the initial paper on evidence-based software engineering.
Dybå et al.’s article on evidence-based software engineering for practitioners was chosen by the editorial and advisory boards of IEEE Software as one of the magazine's 25th anniversary top picks of recommended reading.
Selected |
https://en.wikipedia.org/wiki/Trial%20division | Trial division is the most laborious but easiest to understand of the integer factorization algorithms. The essential idea behind trial division tests to see if an integer n, the integer to be factored, can be divided by each number in turn that is less than n. For example, for the integer n = 12, the only numbers that divide it are 1, 2, 3, 4, 6, 12. Selecting only the largest powers of primes in this list gives 12 = 4 × 3 = 2² × 3.
Trial division was first described by Fibonacci in his book Liber Abaci (1202).
Method
Given an integer n (n refers to "the integer to be factored"), the trial division consists of systematically testing whether n is divisible by any smaller number. Clearly, it is only worthwhile to test candidate factors less than n, and in order from two upwards because an arbitrary n is more likely to be divisible by two than by three, and so on. With this ordering, there is no point in testing for divisibility by four if the number has already been determined not divisible by two, and so on for three and any multiple of three, etc. Therefore, the effort can be reduced by selecting only prime numbers as candidate factors. Furthermore, the trial factors need go no further than √n because, if n is divisible by some number p, then n = p × q and if q were smaller than p, n would have been detected earlier as being divisible by q or by a prime factor of q.
A definite bound on the prime factors is possible. Suppose Pi is the i'th prime, so that P1 = 2, P2 = 3, P3 = 5, etc. Then the last prime number worth testing as a possible factor of n is Pi where P(i+1)² > n; equality here would mean that P(i+1) is a factor. Thus, testing with 2, 3, and 5 suffices up to n = 48, not just 25, because the square of the next prime is 49, and below n = 25 just 2 and 3 are sufficient. Should the square root of n be an integer, then it is a factor and n is a perfect square.
An example of the trial division algorithm, using successive integers as trial factors, is as follows (in Python):
def trial_division(n: int) -> list[int]:
    """Return a list of the prime factors for a natural number."""
    factors = []
    candidate = 2
    while n > 1:
        if n % candidate == 0:
            factors.append(candidate)   # candidate divides n: record it
            n //= candidate             # and divide it out
        else:
            candidate += 1              # otherwise try the next integer
    return factors
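A variant that applies the square-root bound from the Method section stops far earlier; this sketch (not part of the original article, name my own) relies on the fact that any cofactor remaining after the loop must itself be prime:

```python
def trial_division_bounded(n: int) -> list[int]:
    """Trial division that only tests candidate factors up to sqrt(n)."""
    factors = []
    f = 2
    while f * f <= n:        # equivalent to f <= sqrt(n), without floats
        while n % f == 0:
            factors.append(f)
            n //= f
        f += 1
    if n > 1:                # the remaining cofactor is a prime factor
        factors.append(n)
    return factors

# trial_division_bounded(12) -> [2, 2, 3]
```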
https://en.wikipedia.org/wiki/Rebecca%20Jordan-Young | Rebecca M. Jordan-Young (born 1963), is an American feminist scientist and gender studies scholar. Her research focuses on social medical science, sex, gender, sexuality, and epidemiology. She is an Associate Professor of Women’s, Gender, and Sexuality Studies at Barnard College.
Life and career
Jordan-Young completed her undergraduate work at Bryn Mawr College, receiving her bachelor's degree in political science. She earned her master's degree and Ph.D. from Columbia University.
Jordan-Young was a principal investigator and deputy director of the Social Theory Core at the Center for Drug Use and HIV Research of the National Development and Research Institutes. She has served as a health disparities scholar sponsored by the National Institutes of Health. In 2008, Jordan-Young was a visiting scholar in cognitive neuroscience at the International School for Advanced Studies.
She is the author of Brain Storm: The Flaws in the Science of Sex Differences, a critical analysis of scientific research supporting the theory that psychological sex differences in humans are "hard-wired" into the brain. Jordan-Young argues that studies of "human brain organization theory" fail to meet scientific standards.
In Out of Bounds? A Critique of the New Policies on Hyperandrogenism in Elite Female Athletes, a collaborative article with Katrina Karkazis, Georgiann Davis, and Silvia Camporesi, published in 2012 in the American Journal of Bioethics, the authors argue that a new sex testing policy by the International Association of Athletics Federations aimed at intersex women athletes will not protect against breaches of privacy, will require athletes to undergo unnecessary treatment in order to compete, and will intensify "gender policing". They recommend that athletes be able to compete in accordance with their legal gender.
In 2016, Jordan-Young was awarded a Guggenheim Fellowship to work on a book on testosterone, "T: The Unauthorized Biography", with co-author Katrina Karkazi |
https://en.wikipedia.org/wiki/Finnish%20Food%20Workers%27%20Union | The Finnish Food Workers' Union (, SEL) is a trade union representing workers in the food industry in Finland.
The Finnish Food and Drink Workers' Union was established in 1905, but banned in 1930. As a temporary measure, the new Finnish Federation of Trade Unions (SEK) admitted food workers to the Finnish General Workers' Union, but in December 1932, it split them into a new Finnish Food Workers' Union.
From 1945, the union was led by Kalle Lindholm, a member of the Finnish Communist Party. A minority of Social Democratic Party supporters left in 1960 to join the rival General and Speciality Workers' Union, but that union rejoined the SEL in 1970, soon after the SEK merged into the new Central Organisation of Finnish Trade Unions. The SEL's membership grew rapidly, and by 1998 had reached 43,497. By 2020, this had fallen to 30,047.
Presidents
1932: Lauri Pöyhönen
1941: I. K. Björk
1941: Lauri Pöyhönen
1945: Kalle Lindholm
1949: Armas Reunamo
1951: Eero Teri
1953: Arvo Hautala
1966: Urpo Virtanen
1969: Jarl Sund
1991: Ritva Savtschenko
2006: Veli-Matti Kuntonen
External links |
https://en.wikipedia.org/wiki/JavaOS | JavaOS is a discontinued operating system based on a Java virtual machine. It was originally developed by Sun Microsystems. Unlike Windows, macOS, Unix, or Unix-like systems which are primarily written in the C programming language, JavaOS is primarily written in Java. It is now considered a legacy system.
History
The Java programming language was introduced by Sun in May 1995. Jim Mitchell and Peter Madany at JavaSoft designed a new operating system, codenamed Kona, written completely in Java. In March 1996, Tom Saulpaugh joined the now seven-person Kona team to design an input/output (I/O) architecture, having come from Apple as Macintosh system software engineer since June 1985 and co-architect of Copland.
JavaOS was first evangelized in a Byte article. In 1996, JavaSoft's official product announcement described the compact OS designed to run "in anything from net computers to pagers". In early 1997, JavaSoft transferred JavaOS to SunSoft. In late 1997, Bob Rodriguez led the team to collaborate with IBM who then marketed the platform, accelerated development, and made significant key architectural contributions to the next release of JavaOS, eventually renamed JavaOS for Business. IBM indicated its focus was more on network computer thin clients, specifically to replace traditional IBM 3270 "green screen" and Unix X terminals, and to implement single application clients.
The Chorus distributed real-time operating system was used for its microkernel technology. This began with Chorus Systèmes SA, a French company, licensing JavaOS from Sun and replacing the earlier JavaOS hardware abstraction layer with the Chorus microkernel, thereby creating the Chorus/Jazz product, which was intended to allow Java applications to run in a distributed, real-time embedded system environment. Then in September 1997, it was announced that Sun Microsystems was acquiring Chorus Systèmes SA.
In 1999, Sun and IBM announced the discontinuation of the JavaOS product. As early as 20 |
https://en.wikipedia.org/wiki/Flag%20of%20Derbyshire | The flag of Derbyshire is the flag of the English county of Derbyshire. Created in 2006, the flag has subsequently been registered at the Flag Institute and added to their UK Flags Register.
History
The creation of the flag came about as a result of a feature on Andy Whittaker's breakfast show on BBC Radio Derby in 2006. Jeremy Smith, a listener who had noticed the prominent use of Saint Piran's Flag while visiting Cornwall wanted to know whether Derbyshire had an equivalent symbol. As no flag to represent the county existed, a campaign to design one with the aid of listeners' suggestions was launched. The finished flag, designed by Martin Enright from Derby, was unveiled on 22 September 2006. Ceremonies to mark the first unfurling of the flag were held on the day at various locations around the county, including Derby, Ripley, Ashbourne and Buxton.
Design
The flag features a green cross on a blue background. These colours were chosen to represent Derbyshire's green countryside and its rivers and reservoirs, respectively. In the centre of the flag is a Tudor rose, which has been the county badge since the 15th century. The rose is gold in colour rather than the more usual red and white, partly to differentiate from the emblems of Yorkshire and Lancashire; it is also supposed to symbolise quality. |
https://en.wikipedia.org/wiki/Three-domain%20system | The three-domain system is a biological classification introduced by Carl Woese, Otto Kandler, and Mark Wheelis in 1990 that divides cellular life forms into three domains, namely Archaea, Bacteria, and Eukarya. The key difference from earlier classifications such as the two-empire system and the five-kingdom classification is the splitting of Archaea from Bacteria as completely different organisms. It has been challenged by the two-domain system, which divides organisms into Bacteria and Archaea only, as eukaryotes are considered to be one group within Archaea.
Background
Woese argued, on the basis of differences in 16S rRNA genes, that bacteria, archaea, and eukaryotes each arose separately from an ancestor with poorly developed genetic machinery, often called a progenote. To reflect these primary lines of descent, he treated each as a domain, divided into several different kingdoms. Originally his split of the prokaryotes was into Eubacteria (now Bacteria) and Archaebacteria (now Archaea). Woese initially used the term "kingdom" to refer to the three primary phylogenetic groupings, and this nomenclature was widely used until the term "domain" was adopted in 1990.
Acceptance of the validity of Woese's phylogenetically valid classification was a slow process. Prominent biologists including Salvador Luria and Ernst Mayr objected to his division of the prokaryotes. Not all criticism of him was restricted to the scientific level. A decade of labor-intensive oligonucleotide cataloging left him with a reputation as "a crank", and Woese would go on to be dubbed "Microbiology's Scarred Revolutionary" by a news article printed in the journal Science in 1997. The growing amount of supporting data led the scientific community to accept the Archaea by the mid-1980s. Today, very few scientists still accept the concept of a unified Prokarya.
Classification
The three-domain system adds a level of classification (the domains) "above" the kingdoms present in the previously used five- or |
https://en.wikipedia.org/wiki/Astrophysical%20jet | An astrophysical jet is an astronomical phenomenon where outflows of ionised matter are emitted as extended beams along the axis of rotation. When this greatly accelerated matter in the beam approaches the speed of light, astrophysical jets become relativistic jets as they show effects from special relativity.
The formation and powering of astrophysical jets are highly complex phenomena that are associated with many types of high-energy astronomical sources. They likely arise from dynamic interactions within accretion disks, whose active processes are commonly connected with compact central objects such as black holes, neutron stars or pulsars. One explanation is that tangled magnetic fields are organised to aim two diametrically opposing beams away from the central source by angles only several degrees wide. Jets may also be influenced by a general relativity effect known as frame-dragging.
Most of the largest and most active jets are created by supermassive black holes (SMBH) in the centre of active galaxies such as quasars and radio galaxies or within galaxy clusters. Such jets can exceed millions of parsecs in length. Other astronomical objects that contain jets include cataclysmic variable stars, X-ray binaries and gamma-ray bursts (GRB). Jets on a much smaller scale (~parsecs) may be found in star forming regions, including T Tauri stars and Herbig–Haro objects; these objects are partially formed by the interaction of jets with the interstellar medium. Bipolar outflows may also be associated with protostars, or with evolved post-AGB stars, planetary nebulae and bipolar nebulae.
Relativistic jets
Relativistic jets are beams of ionised matter accelerated close to the speed of light. Most have been observationally associated with central black holes of some active galaxies, radio galaxies or quasars, and also with galactic stellar black holes, neutron stars or pulsars. Beam lengths may extend between several thousand, hundreds of thousands or millions of parsec
https://en.wikipedia.org/wiki/LINC00899 | Long intergenic non-protein coding RNA 899 is a long non-coding RNA that in humans is encoded by the LINC00899 gene.
https://en.wikipedia.org/wiki/Comparison%20of%20DVD%20ripper%20software | This article lists DVD ripper software capable of ripping and converting DVD discs, ISO image files or DVD folders to computer, mobile handsets and media players supported file formats.
General information
Note: Applications with a purple background are no longer in development.
Supported software & hardware, user interface
This table lists the operating systems that different DVD rippers can run on without emulation or a compatibility layer. Applications that require such a layer (e.g. Wine under Linux or other operating systems) are marked "No", mostly with a note, though some applications running under emulation or a compatibility layer may not be marked. Other minimum system requirements are listed; some features (like High Definition support) may be unavailable with these specifications.
Disabling DRM
Note: As of 2009-12-10, much of the data below is based on available wiki pages, official website pages and some limited user experience (i.e. where this table reads 'Yes' or 'No', the true value may instead be 'Partial' or 'Obsolete', as many encryption methods change over time).
Input files supported
Output files supported |
https://en.wikipedia.org/wiki/Abelson%27s%20paradox | Abelson's paradox is an applied statistics paradox identified by Robert P. Abelson. The paradox pertains to a possibly paradoxical relationship between the magnitude of the r² (i.e., coefficient of determination) effect size and its practical meaning.
Abelson's example was obtained from an analysis of the r² between batting average in baseball and skill level. Although batting average is considered among the most significant characteristics necessary for success, the effect size was only a tiny 0.003.
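A hypothetical simulation (illustrative numbers and function name of my own choosing, not Abelson's actual data) shows how a genuine skill difference between batters still yields a tiny r² at the level of a single at-bat:

```python
import random

def single_at_bat_r2(p_low=0.25, p_high=0.29, n_ab=200_000, seed=1):
    """Pearson r^2 between a batter's skill level and the hit/no-hit
    outcome of one at-bat, for two equally common skill levels."""
    rng = random.Random(seed)
    skill = [rng.choice((p_low, p_high)) for _ in range(n_ab)]
    hits = [1.0 if rng.random() < p else 0.0 for p in skill]
    mx, my = sum(skill) / n_ab, sum(hits) / n_ab
    cov = sum((x - mx) * (y - my) for x, y in zip(skill, hits)) / n_ab
    vx = sum((x - mx) ** 2 for x in skill) / n_ab
    vy = sum((y - my) ** 2 for y in hits) / n_ab
    return cov ** 2 / (vx * vy)

# With assumed skills of .250 vs .290, the single-at-bat r^2 comes out
# around 0.002 -- the same order of magnitude as Abelson's 0.003.
```

The per-event variance explained is minuscule even though the skill gap is practically meaningful over a season, which is the heart of the paradox.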
See also
List of paradoxes |
https://en.wikipedia.org/wiki/Reactive%20bonding | Reactive bonding describes a wafer bonding procedure using highly reactive nanoscale multilayer systems as an intermediate layer between the bonding substrates. The multilayer system consists of two alternating different thin metallic films. The self-propagating exothermic reaction within the multilayer system contributes the local heat to bond the solder films. Because the substrate material is exposed to only a limited temperature, temperature-sensitive components and materials with different CTEs (i.e. metals, polymers and ceramics) can be used without thermal damage.
Overview
The bonding is based on reactive nano scale multilayers providing an internal heat source. These foils are combined with additional solder layers to achieve bonding. The heat that is required for the bonding is created by a self-propagating exothermic reaction of the multilayer system. This reaction is ignited by an energy pulse, e.g. heat, mechanical pressure, an electrical spark or a laser pulse. The generated heat is localized at the bonding interface and limited by a short heating phase lasting milliseconds.
This localized heating is an advantage of the approach: the materials used are not exposed to high temperatures, and cooling is rapid. A drawback is that the approach is not applicable for bond frame dimensions of a few tens of micrometres, owing to the limited handling and structuring abilities of the foils at such small dimensions.
The material used for multilayer systems is a bilayer of alternating elements, commonly Ni/Al, Al/Ti or Ti/a-Si. The individual metallic layers are usually 1 to 30 nm thick, can be arranged as horizontal or vertical nanoscale films, and combine a reactive component with a low-melting component. With increased bilayer thickness, the reaction velocity decreases and the reaction heat increases; a specific balance between high reaction velocity and high reaction heat is therefore necessary.
A commercial example of such material is NanoFoil. The co |
https://en.wikipedia.org/wiki/Solomon%20Feferman | Solomon Feferman (December 13, 1928 – July 26, 2016) was an American philosopher and mathematician who worked in mathematical logic. In addition to his prolific technical work in proof theory, recursion theory, and set theory, he was known for his contributions to the history of logic (for instance, via biographical writings on figures such as Kurt Gödel, Alfred Tarski, and Jean van Heijenoort) and as a vocal proponent of the philosophy of mathematics known as predicativism, notably from an anti-platonist stance.
Life
Solomon Feferman was born in The Bronx in New York City to working-class parents who had immigrated to the United States after World War I and had met and married in New York. Neither parent had any advanced education. The family moved to Los Angeles, where Feferman graduated from high school at age 16.
He received his B.S. from the California Institute of Technology in 1948, and in 1957 his Ph.D. in mathematics from the University of California, Berkeley, under Alfred Tarski, after having been drafted and having served in the U.S. Army from 1953 to 1955. In 1956 he was appointed to the Departments of Mathematics and Philosophy at Stanford University, where he later became the Patrick Suppes Professor of Humanities and Sciences. While the majority of his career was spent at Stanford, he also spent time as a post-doctoral fellow at the Institute for Advanced Study in Princeton, a visiting professor at MIT, and a visiting fellow at the University of Oxford (Wolfson College and All Souls College).
Feferman died on 26 July 2016 at his home in Stanford, following a stroke and an illness that had lasted three months. At his death, he had been a member of the Mathematical Association of America (MAA) for 37 years.
Contributions
Feferman was editor-in-chief of the five-volume Collected Works of Kurt Gödel, published by Oxford University Press between 2001 and 2013.
In 2004, together with his wife Anita Burdman Feferman, he published a biography of Alfred Tarski: Alfred Tarski: Life and Logic.
|
https://en.wikipedia.org/wiki/SLIMbus | The Serial Low-power Inter-chip Media Bus (SLIMbus) is a standard interface between baseband or application processors and peripheral components in mobile terminals. It was developed within the MIPI Alliance, founded by ARM, Nokia, STMicroelectronics and Texas Instruments. The interface supports many digital audio components simultaneously, and carries multiple digital audio data streams at differing sample rates and bit widths.
SLIMbus is implemented as a synchronous 2-wire, configurable Time Division Multiplexed (TDM) frame structure. It has supporting bus arbitration mechanisms and message structures which permit reconfiguring the bus's operational characteristics to system application needs at runtime. Physically, the data line (DATA) and the clock line (CLK) interconnect multiple SLIMbus components in a multi-drop bus topology. SLIMbus devices may dynamically "drop off" the bus and "reconnect" to it as required, using the protocols outlined in the SLIMbus specification. When used in a mobile terminal or portable product, SLIMbus may replace legacy digital audio interfaces such as PCM, I2S, and SSI (Synchronous Serial Interface for digital audio), as well as some instances of digital control buses such as I2C, SPI, microWire, UART, or GPIO pins on the digital audio components.
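The TDM idea behind such a frame structure can be illustrated with a toy model. This sketch is not taken from the SLIMbus specification: the slot count, channel names, and slot assignments are invented for illustration. Each data channel owns fixed slot positions within a frame, so streams with different sample rates can share one serial data line, with unassigned slots available for control traffic or padding.

```python
# Toy TDM frame model (illustrative only; not the SLIMbus frame format).
SLOTS_PER_FRAME = 6
# Channel "A" gets 2 slots per frame (higher sample rate),
# channel "B" gets 1 slot per frame (lower sample rate).
slot_map = {0: "A", 3: "A", 5: "B"}

def build_frame(samples):
    """Fill one frame from per-channel sample queues.

    Unassigned slots carry None, standing in for control/padding slots."""
    frame = []
    for slot in range(SLOTS_PER_FRAME):
        ch = slot_map.get(slot)
        frame.append(samples[ch].pop(0) if ch else None)
    return frame

samples = {"A": [10, 11], "B": [20]}
print(build_frame(samples))   # [10, None, None, 11, None, 20]
```

Runtime reconfiguration of the bus corresponds, in this model, to rewriting `slot_map` between frames, which is what allows sample rates and bit widths to change while the bus is running.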
History
The MIPI Alliance was formed in the fall of 2003.
An interface architecture including a low-speed media link bus (LowML) was presented at the face-to-face meeting of the MIPI Alliance in Sophia Antipolis, France, in March 2004.
The LML Investigation Group (LML-IG) was created by the MIPI Alliance in July 2004; its first meeting was a teleconference on August 3, 2004.
The LML Working Group (LML-WG) was created in Q4 2004, and the LML-WG charter was submitted to the MIPI Board in December 2004.
The first meeting as a full Working Group was held on April 12, 2005.
The LML-WG released the first draft of SLIMbus with text in all chapters (v0.55) on October 18, 2005.
SLIMbus specification v1.00 was released to adopters on May 16, 2007 |