Dataset schema (field: type, observed range):
id: int64, 39 to 79M
url: string, length 31 to 227
text: string, length 6 to 334k
source: string, length 1 to 150
categories: list, length 1 to 6
token_count: int64, 3 to 71.8k
subcategories: list, length 0 to 30
68,115,127
https://en.wikipedia.org/wiki/Private%20market%20assets
Private market assets refer to investments in equity (shares) and debt issued by privately owned (non-listed) companies – as opposed to 'public' (listed) corporations. These markets include private equity (PE) and venture capital (VC); real estate (property); infrastructure; and farmland and forestry. Private Market Assets Matrix (PMAM) The Private Market Assets Matrix (PMAM), also called the Infrastructure and Private Markets Investment Matrix, is a strategic assessment tool developed by M. Nicolas Firzli of the World Pensions Council and Joshua Franzel of the MissionSquare Research Institute, International City/County Management Association. The matrix maps out the evolution of "institutional investment by visualizing dynamically the proportion of assets allocated to infrastructure (Y axis) and private-market assets overall (X axis) for a cross-section of pension funds perceived as highly representative" [of future trends]. A schematic reproduction of such a matrix is sketched after this entry. References Economic policy Finance Public policy Infrastructure investment Actuarial science Pension funds Private equity
Private market assets
[ "Mathematics" ]
197
[ "Applied mathematics", "Actuarial science" ]
68,115,628
https://en.wikipedia.org/wiki/List%20of%20filename%20extensions%20%280%E2%80%939%29
This alphabetical list of filename extensions contains extensions of notable file formats used by multiple notable applications or services. See also List of filename extensions List of file formats References External links File Extension Resource The File Extensions Resource File information site File format finder List of file types File format search engine
List of filename extensions (0–9)
[ "Technology" ]
62
[ "Computing-related lists", "Lists of file formats" ]
68,116,053
https://en.wikipedia.org/wiki/Mark%20Shtaif
Mark Shtaif (Hebrew: מרק שטייף) is an Israeli communication scientist and a professor of electrical engineering at the faculty of engineering of Tel Aviv University. As of October 2020, he serves as Tel Aviv University's rector. Biography Early life Mark Shtaif was born in 1966 in Kishinev, in the former USSR. His father Abraham was an engineer of agricultural machinery and his mother Tania worked as a pediatrician. His family immigrated to Israel in April 1973, when he was 7 years old. Education and career After graduating from the Reali high school in Haifa, and following mandatory military service, he completed bachelor's, master's and doctoral degrees in electrical engineering at the Technion, finishing in 1997, and joined AT&T's light-wave research lab in Red Bank, New Jersey. His initial position at AT&T was as a post-doctoral fellow, but he was soon promoted to senior and subsequently principal member of technical staff, specializing in the theoretical modeling of fiber communications systems. In 2000 he assumed the position of principal architect at a newly established optical communication start-up named Celion Networks. In 2002 he joined Tel Aviv University's faculty of engineering, where he has been teaching and conducting research ever since. His research focuses primarily on fiber optics and optical communication systems. Within this general area he integrates the fields of optics, quantum theory, nonlinear systems, communications theory, information theory, and signal processing. Over the years he has contributed to a variety of topics including optical amplification, analysis of nonlinear propagation, polarization-related phenomena, analyses of noise and signal detection, quantum information in fiber systems, and fundamental limits to optical communications. From 2014 to 2017 he headed the department of Physical Electronics within the School of Electrical Engineering at Tel Aviv University, and from 2017 to 2020 he was the head of the entire school. In October 2020 he was appointed rector of Tel Aviv University. Personal life Mark Shtaif is married to Michal, an educational counselor. They have three children and reside in the town of Even Yehuda. References External links Profile on DBLP website Shtaif discussing the history of quantum communications in 2014 at memorial symposium for James P. Gordon on YouTube 1966 births Living people Academics from Tel Aviv Bessarabian Jews Electrical engineers Tel Aviv University people
Mark Shtaif
[ "Engineering" ]
483
[ "Electrical engineering", "Electrical engineers" ]
68,116,358
https://en.wikipedia.org/wiki/Radon%20storm
A radon storm is a day-long episode of increased atmospheric radon concentration caused by moving air masses. In Antarctica and over the Southern Ocean, radon storms often occur with the arrival of continental air from South America and Africa; the concept was coined to describe sudden radon concentration increases there. The radon concentration in Antarctic air also naturally increases threefold in the summer months of December and January. References Radon Regional climate effects Anomalous weather Climate of Antarctica
Radon storm
[ "Physics" ]
95
[ "Weather", "Physical phenomena", "Anomalous weather" ]
68,116,797
https://en.wikipedia.org/wiki/Assembly%20theory
Assembly theory is a framework developed to quantify the complexity of molecules and objects by assessing the minimal number of steps required to assemble them from fundamental building blocks. Proposed by chemist Lee Cronin and his team, the theory assigns an assembly index to molecules, which serves as a measurable indicator of their structural complexity. Cronin and colleagues argue that this approach allows for experimental verification and has applications in understanding selection processes, evolution, and the identification of biosignatures in astrobiology. However, the usefulness of the approach has been disputed. Background The hypothesis was proposed by chemist Leroy Cronin in 2017 and developed by the team he leads at the University of Glasgow, then extended in collaboration with a team at Arizona State University led by astrobiologist Sara Imari Walker, in a paper released in 2021. Assembly theory conceptualizes objects not as point particles, but as entities defined by their possible formation histories. This allows objects to show evidence of selection, within well-defined boundaries of individuals or selected units. Combinatorial objects are important in chemistry, biology and technology, in which most objects of interest (if not all) are hierarchical modular structures. For any object an 'assembly space' can be defined as all recursively assembled pathways that produce this object. The 'assembly index' is the number of steps on a shortest path producing the object. For such a shortest path, the assembly space captures the minimal memory, in terms of the minimal number of operations necessary to construct an object based on objects that could have existed in its past. The assembly is defined as "the total amount of selection necessary to produce an ensemble of observed objects"; for an ensemble containing N_T objects in total, of which N are unique, the assembly is defined to be A = Σ_{i=1}^{N} e^{a_i} (n_i − 1)/N_T, where n_i denotes the 'copy number', the number of occurrences of objects of type i having assembly index a_i. For example, the word 'abracadabra' contains 5 unique letters (a, b, c, d and r) and is 11 symbols long. It can be assembled from its constituents as a + b → ab, ab + r → abr, abr + a → abra, abra + c → abrac, abrac + a → abraca, abraca + d → abracad, abracad + abra → abracadabra, because 'abra' was already constructed at an earlier stage. Because this requires at least 7 steps, the assembly index is 7. The word 'abracadrbaa', of the same length, has no repeats, so its assembly index is 10. Take two binary strings, 01010101 and 00010111, as another example. Both have the same length of 8 bits and the same Hamming weight of 4. However, the assembly index of the first string is 3 ("01" is assembled, joined with itself into "0101", and joined again with "0101" taken from the assembly pool), while the assembly index of the second string is 6, since in this case only "01" can be taken from the assembly pool. In general, for an object made of K subunits, the assembly index a is bounded by log₂(K) ≤ a ≤ K − 1. A brute-force check of these string examples appears after this entry. Once a pathway to assemble an object is discovered, the object can be reproduced. The rate of discovery of new objects defines an expansion rate, which introduces a discovery timescale τ_d; to include copy number in the dynamics of assembly theory, a production timescale τ_p is defined in terms of the production rate of a specific object. Defining these two distinct timescales (τ_d for the initial discovery of an object and τ_p for making copies of existing objects) allows one to determine the regimes in which selection is possible.
While other approaches can provide a measure of complexity, the researchers claim that assembly theory's molecular assembly number is the first to be measurable experimentally. Molecules with a high assembly index are very unlikely to form abiotically, and the probability of abiotic formation goes down as the value of the assembly index increases. The assembly index of a molecule can be obtained directly via spectroscopic methods, and the method could be implemented in a fragmentation tandem mass spectrometry instrument to search for biosignatures. The theory was extended to map chemical space with molecular assembly trees, demonstrating the application of this approach in drug discovery, in particular in research of new opiate-like molecules by connecting the "assembly pool elements through the same pattern in which they were disconnected from their parent compound(s)". It is difficult to identify chemical signatures that are unique to life. For example, the Viking lander biological experiments detected molecules that could be explained by either living or non-living natural processes. It appears that only living samples can produce assembly index measurements above ~15. However, as early as 2021, Cronin explained how polyoxometalates could in theory have large assembly indexes (>15) due to autocatalysis. Critical views Chemist Steven A. Benner has publicly criticized various aspects of assembly theory. Benner argues that the claim that non-living systems, without any intervention of life, cannot contain complex molecules is transparently false, and that people would be misled into thinking these papers must be right because they were published in Nature journals after peer review. A paper published in the Journal of Molecular Evolution concludes that "the hype around Assembly Theory reflects rather unfavorably both on the authors and the scientific publication system in general". The author concludes that what "assembly theory really does is to detect and quantify bias caused by higher-level constraints in some well-defined rule-based worlds"; one "can use assembly theory to check whether something unexpected is going on in a very broad range of computational model worlds or universes". Another paper, authored by a group of chemists and planetary scientists and published in the Journal of the Royal Society Interface, demonstrated that abiotic chemical processes can form crystal structures of great complexity, with values exceeding the proposed abiotic/biotic divide of MA index = 15. They conclude that "while the proposal of a biosignature based on a molecular assembly index of 15 is an intriguing and testable concept, the contention that only life can generate molecular structures with MA index ≥ 15 is in error". Two papers published in 2024 argue that assembly theory provides no insights beyond those already available from algorithmic complexity and Claude Shannon's information theory. See also List of interstellar and circumstellar molecules Smallest grammar problem Word problem for groups References Further reading Extraterrestrial life Molecular biology techniques Theories
Assembly theory
[ "Chemistry", "Astronomy", "Biology" ]
1,318
[ "Hypothetical life forms", "Extraterrestrial life", "Molecular biology techniques", "Astronomical controversies", "Molecular biology", "Biological hypotheses" ]
68,118,346
https://en.wikipedia.org/wiki/Ratko%20Mitrovi%C4%87
Ratko Mitrović (born 1956) is a Montenegrin civil engineer, university professor, scientist and government official, serving as Minister of Ecology, Spatial Planning and Urbanism of Montenegro in the cabinet of Zdravko Krivokapić since December 2020. In addition to his academic career at the University of Montenegro, he is one of the founders of the non-governmental organization "We won't give up Montenegro" (Ne damo Crnu Goru), which was founded by Montenegrin professors and intellectuals in support of the Serbian Orthodox Church in Montenegro after a controversial religion law targeted the legal status and property of the Church. Early life and education Mitrović was born in 1956 in Podgorica, Montenegro, which at the time was part of the FPR Yugoslavia. He graduated in 1981 from two departments of the University of Niš's Faculty of Architecture and Civil Engineering: the Department of Construction and Design of Multi-storey Buildings and the Department of Construction and Design of Infrastructure Buildings. He then enrolled in postgraduate studies at the Faculty of Civil Engineering, University of Belgrade, and defended his master's thesis on improving the technology of prefabricated building systems in 1991. Mitrović received his doctorate in 1994 at the Technical Faculty of the University of Novi Sad's Department of Management and Construction Technologies, defending a doctoral dissertation entitled "Technical and technological modeling of organizational structures of construction companies in market conditions." During 1997 and 1998, he pursued postdoctoral studies at the public University of Florida. Career Professional career He started his professional career in the 1980s as an engineer for the development of new technologies at the General Construction Company in Titograd (now Podgorica), then the capital of SR Montenegro within SFR Yugoslavia, where he soon became head of operations for the construction of high-rise buildings. In 1987, he moved to Montenegro's state-owned electricity company (Elektroprivreda Crne Gore) as chief engineer and construction manager. Between 1991 and 2000, he managed the construction of several residential buildings and cultural institutions in the capital Podgorica, as well as in several other Montenegrin settlements. He was also the director of construction of the Serbian Orthodox Cathedral of the Resurrection of Christ in Podgorica, completed in 2013. In recent years, he has been working on research in the field of renewable energy sources. He is the author of dozens of protected conceptual solutions for the hydropower use of watercourses, as well as several protected conceptual solutions for the use of solar energy. Academic career Mitrović is a full professor at the Faculties of Architecture and of Civil Engineering of the University of Montenegro, where he teaches several subjects: Elements of Buildings, Technology of Construction of Hydropower Facilities, New Technologies and Materials, Organization of Construction, and Project Management. At the Faculty of Architecture, he is the head of the department of master's studies. He has been engaged as a guest lecturer at several faculties in Serbia, the United Kingdom and Italy. Political career Mitrović took part in the 2020 religion law protests.
He entered the political life of Montenegro in mid-2020, at the height of the political crisis and the open conflict between the Serbian Orthodox Church in Montenegro and the DPS-led Montenegrin government, following the adoption of the disputed law on the status of religious communities in Montenegro, supporting the 2019–2020 clerical protests and Serbian Orthodox Church (SPC) rights in Montenegro. In July 2020, together with Zdravko Krivokapić and several other Montenegrin university professors and intellectuals, he was one of the founders of the Church-backed "We won't give up Montenegro" (Ne damo Crnu Goru) NGO and public movement, and was elected the first head of the organization's council. In a short period of time, the organization held public events in which, among others, Bishop of Budimlja and Nikšić Joanikije (Mićović) and the Rector of the Theological Seminary in Cetinje, Gojko Perović, participated. The 30 August 2020 election resulted in a victory for the opposition parties and the fall from power of the ruling DPS, which had ruled the country since the introduction of the multi-party system in 1990; Krivokapić, leader of the "Ne damo Crnu Goru" and opposition list, was selected as the new prime minister-designate of Montenegro by the new parliamentary majority, announcing the withdrawal of the disputed law on religious communities. On 5 November 2020, Prime Minister-designate Krivokapić put him forward as a candidate for the newly created post of Minister of Ecology, Spatial Planning and Urbanism in the new government cabinet of Montenegro. Immediately after the announcement of the candidates for ministerial positions in the new government, several environmental protection organizations and environmental activists demanded the withdrawal of Krivokapić's proposal to nominate Mitrović as Minister of Ecology, Spatial Planning and Urbanism. They said they were revolted by the proposal for Mitrović to head the ecology ministry in the future cabinet, due to his earlier participation in the design of small hydropower plants on mountain rivers in northern Montenegro, against which environmental activists have been protesting for decades. It was also revealed that he was a co-founder of the company "Renevable Energy Montenegro", which is exclusively engaged in the construction of small hydropower plants, and that Mitrović himself had designed more than fifteen small hydropower plants throughout Montenegro. In a statement to the media, Mitrović admitted that he had participated in such projects in the past, but said that he completely gave up small hydropower projects in 2010, when he realized how devastating mini hydropower plants of that type are to the environment, calling such a model of electricity production unacceptable and saying that he now completely opposes the construction of mini hydropower plants in Montenegro. On 4 December 2020, the Parliament of Montenegro officially elected him Minister. In May 2021, Mitrović admitted in an interview that he had been involved for 20 years in the construction of an illegal house near Budva, of which opposition MPs had accused him earlier that year, saying that he was no longer the owner, but rather his son.
After the controversial interview, the opposition and NGOs demanded that Prime Minister Krivokapić seek Mitrović's resignation, which Krivokapić refused to do, saying that "Mitrović misunderstood the journalist's question due to hearing problems and was confused" and that he was a necessary part of his government, as a proven expert in his field. Mitrović later said that his statement had been "clumsily made" and that his son, who is the owner, had submitted a request for the legalization of the house long before he became minister. The parliamentary opposition and NGOs continued to demand his resignation, calling into question his moral credibility to continue running the urbanism ministry. Personal life In addition to his native Serbian, he also speaks English and Russian. Prime Minister of Montenegro Zdravko Krivokapić and Mitrović are best men to each other; after the media revealed this, some media and political circles close to the former DPS-led regime accused Mitrović, Krivokapić and his cabinet of nepotism. His brother Dragan Mitrović is the current steward of the Cathedral of the Resurrection of Christ in Podgorica, whose construction Ratko Mitrović directed between the 1990s and 2013. References 1956 births Living people People from Podgorica Civil engineers Serbs of Montenegro Members of the Serbian Orthodox Church University of Niš alumni University of Belgrade alumni University of Novi Sad alumni University of Florida alumni Academic staff of the University of Montenegro Government ministers of Montenegro Montenegrin engineers
Ratko Mitrović
[ "Engineering" ]
1,560
[ "Civil engineering", "Civil engineers" ]
68,119,217
https://en.wikipedia.org/wiki/Boletus%20pseudopinophilus
Boletus pseudopinophilus is a species of porcini-like fungus native to eastern North America, where it grows under Pinus elliottii and Pinus palustris. Previously regarded as Boletus pinophilus, it was found to have diverged significantly from that species. References pseudopinophilus Fungi of North America Edible fungi Fungi described in 2019 Fungus species
Boletus pseudopinophilus
[ "Biology" ]
81
[ "Fungi", "Fungus species" ]
68,123,871
https://en.wikipedia.org/wiki/Boletus%20quercophilus
Boletus quercophilus is a species of porcini-like fungus native to Costa Rica, where it grows under Quercus copeyensis and Quercus seemannii. References quercophilus Fungi of Central America Fungi described in 1999 Fungus species
Boletus quercophilus
[ "Biology" ]
54
[ "Fungi", "Fungus species" ]
68,123,889
https://en.wikipedia.org/wiki/Boletus%20fagacicola
Boletus fagacicola is a species of porcini-like fungus native to China, where it grows under trees of the family Fagaceae. References fagacicola Fungi of China Fungi described in 2016 Fungus species
Boletus fagacicola
[ "Biology" ]
48
[ "Fungi", "Fungus species" ]
68,123,985
https://en.wikipedia.org/wiki/Modified%20half-normal%20distribution
In probability theory and statistics, the modified half-normal distribution (MHN) is a three-parameter family of continuous probability distributions supported on the positive part of the real line. It can be viewed as a generalization of multiple families, including the half-normal distribution, truncated normal distribution, gamma distribution, and square root of the gamma distribution, all of which are special cases of the MHN distribution. Therefore, it is a flexible probability model for analyzing real-valued positive data. The name of the distribution is motivated by the similarities of its density function with that of the half-normal distribution. In addition to being used as a probability model, the MHN distribution also appears in Markov chain Monte Carlo (MCMC)-based Bayesian procedures, including Bayesian modeling of directional data, Bayesian binary regression, and Bayesian graphical modeling. In Bayesian analysis, new distributions often appear as conditional posterior distributions; the usage of many such probability distributions is too contextual, and they may not carry significance in a broader perspective. Additionally, many such distributions lack a tractable representation of their distributional aspects, such as a known functional form of the normalizing constant. However, the MHN distribution occurs in diverse areas of research, signifying its relevance to contemporary Bayesian statistical modeling and the associated computation. The moments (including variance and skewness) of the MHN distribution can be represented via the Fox–Wright Psi functions. There exists a recursive relation between three consecutive moments of the distribution; this is helpful in developing an efficient approximation for the mean of the distribution, as well as in constructing a moment-based estimation of its parameters. Definitions A random variable X follows the modified half-normal distribution with parameters α > 0, β > 0 and γ ∈ ℝ, written X ~ MHN(α, β, γ), if its probability density function is f(x) = 2 β^(α/2) x^(α−1) exp(−β x² + γ x) / Ψ(α/2, γ/√β) for x > 0, where Ψ(α/2, γ/√β) = ₁Ψ₁[(α/2, 1/2); (1, 0); γ/√β] = Σ_{n≥0} Γ((α + n)/2) (γ/√β)^n / n! denotes the Fox–Wright Psi function. The connection between the normalizing constant of the distribution and the Fox–Wright function is provided in Sun, Kong and Pal. The cumulative distribution function (CDF) is F(x) = [Σ_{n≥0} ((γ/√β)^n / n!) γ((α + n)/2, β x²)] / Ψ(α/2, γ/√β), where γ(s, y) denotes the lower incomplete gamma function. A numerical evaluation of this density is sketched after this entry. Properties The modified half-normal distribution is an exponential family of distributions, and thus inherits the properties of exponential families. Moments Let X ~ MHN(α, β, γ) and choose a real value k such that α + k > 0. Then the kth moment is E(X^k) = β^(−k/2) Ψ((α + k)/2, γ/√β) / Ψ(α/2, γ/√β). Additionally, E(X^(k+2)) = ((α + k) E(X^k) + γ E(X^(k+1))) / (2β). The variance of the distribution is Var(X) = (α + γ E(X))/(2β) − (E(X))². The moment generating function of the MHN distribution is given as M(t) = E(e^(tX)) = Ψ(α/2, (γ + t)/√β) / Ψ(α/2, γ/√β). Modal characterization Consider X ~ MHN(α, β, γ) with α > 0, β > 0, and γ ∈ ℝ. If α ≥ 1, then the probability density function of the distribution is log-concave. If α > 1 (or α = 1 with γ > 0), then the mode of the distribution is located at x_mode = (γ + √(γ² + 8β(α − 1))) / (4β). If α < 1, γ > 0 and γ² > 8β(1 − α), then the density has a local maximum at x₊ = (γ + √(γ² − 8β(1 − α))) / (4β) and a local minimum at x₋ = (γ − √(γ² − 8β(1 − α))) / (4β). The density function is gradually decreasing on the whole support, and a mode does not exist, if either γ ≤ 0 and α ≤ 1, or γ > 0, α < 1 and γ² < 8β(1 − α). Additional properties involving mode and expected values Let X ~ MHN(α, β, γ) with α > 1, β > 0 and γ ∈ ℝ, and let the mode of the distribution be denoted by x_mode. Explicit upper and lower bounds on E(X) in terms of x_mode are available; as α gets larger, the difference between the upper and lower bounds approaches zero, so they also provide a high-precision approximation of E(X) when α is large. The fact that E(X) ≥ x_mode implies that the distribution is positively skewed. Mixture representation Let X ~ MHN(α, β, γ). If γ > 0, then there exists a random variable Y such that the distribution of X can be represented as a mixture through Y. Conversely, if γ < 0, then there exists a random variable Y giving an analogous mixture representation in which the generalized inverse Gaussian distribution appears. References Probability distributions
Modified half-normal distribution
[ "Mathematics" ]
677
[ "Functions and mappings", "Mathematical relations", "Mathematical objects", "Probability distributions" ]
68,124,018
https://en.wikipedia.org/wiki/Boletus%20viscidiceps
Boletus viscidiceps is a species of porcini-like fungus native to Yunnan Province in southwestern China. References viscidiceps Fungi of China Fungi described in 2016 Fungus species
Boletus viscidiceps
[ "Biology" ]
40
[ "Fungi", "Fungus species" ]
68,126,330
https://en.wikipedia.org/wiki/Huang%20Chunping
Huang Chunping (born 1938) is a Chinese scientist in the fields of missile and aerospace engineering. He was the commander in chief of the rocket system for Shenzhou 5, China's first crewed spacecraft. He was also the commander in chief of the Long March 3, Long March 2E and Long March 2F. Biography Huang was born into a peasant family in the town of Xiangqian, Minhou County, Fujian, in 1938, the seventh child in the family. All six of his elder sisters were drowned by his father. He has two younger sisters and one younger brother. His father died when he was 16. He received his secondary education at Minhou No. 2 High School. After graduating from Beijing Institute of Technology in 1964, he was despatched to the 1st Branch of the 5th Academy of the Ministry of National Defense (now the China Academy of Launch Vehicle Technology), where he successively served as technician, engineering group leader, office director, deputy director, director of the Comprehensive Planning Department, director of the Military Product Research and Production Department, assistant to the president, deputy president, deputy director of the Science and Technology Commission, chief designer, deputy chief designer, and chief and deputy commander in chief. On 18 October 2019, Huang founded the Zhongxing Aerospace Information Technology Co., Ltd. in Fuzhou. Personal life Huang married Zhang Yonggui, who is also a space engineer. References 1938 births Living people People from Minhou County Scientists from Fujian Beijing Institute of Technology alumni Aerospace engineers
Huang Chunping
[ "Engineering" ]
299
[ "Aerospace engineers", "Aerospace engineering" ]
68,127,846
https://en.wikipedia.org/wiki/ZTF%20J1901%2B1458
ZTF J1901+1458 (nicknamed Z; formally ZTF J190132.9+145808.7; Gaia ID 4506869128279648512) is a white dwarf about 135 light years away, roughly in the direction of Epsilon Aquilae, discovered by the Zwicky Transient Facility circa 2021. It is the most massive white dwarf yet found, at 1.35 times the mass of the Sun, nearly the largest expected mass for this type of object. Its radius is roughly that of Earth's Moon, and it rotates once every 7 minutes. The object's extreme rate of spin is hard to explain without supposing Z to be the result of a white dwarf merger near the upper mass limit of a stable end product. Larger white dwarf mergers could be another mechanism of supernova production, one not necessarily taken into account in how dark energy has traditionally been inferred from supernova observations. See also List of white dwarfs Bibliography References White dwarfs Aquila (constellation) Astronomical objects discovered in 2021
ZTF J1901+1458
[ "Astronomy" ]
225
[ "Aquila (constellation)", "Constellations" ]
68,130,212
https://en.wikipedia.org/wiki/Salsabil%20%28fountain%29
A salsabil (or salasabil), also known as a shadirwan, is a type of fountain that maximizes the surface area of the water. It is used for evaporative cooling of buildings, for cooling and aeration of drinking water, and for ornament (it has also been used to prevent eavesdropping). The water may flow in a thin sheet or in thin streams, often over a wavy surface with many little waterfalls. Its use extends from southern Spain through north Africa and the Middle East to northern India. Etymology and name The name salsabil likely derives from a Qur'anic reference. The term shadirwan is also used for devices for aerating drinking water. However, the term shadirwan or shadirvan has slightly different uses in other cultures, such as designating a central ablutions fountain for a mosque courtyard in Turkish (see shadirvan). Design and setting The water flows in a manner designed to maximize the surface area, and thus evaporation. A salsabil may be a near-vertical marble waterfall mounted on a wall, or the sheet of water may flow down a slanted chute. Evaporative cooling causes the water and the surrounding air to cool as some of the water evaporates; a rough calculation of this cooling limit is sketched after this entry. Passive ventilation may be used to maximize the flow of unsaturated air over the water surface and carry the cooled air to where it is needed in the building; salsabils are often used with windcatchers. A salsabil may also be used to aerate water for drinking in a sabil (or sebil). Salsabils, in the form of inclined marble slabs over which drinking water flowed before being dispensed, were often included inside the sabils of Mamluk architecture. They were used in Mughal architecture from the 1200s to the 1600s, in recent centuries in Iran, and sometimes as decorative features in Ottoman domestic architecture. See also Passive cooling References Passive cooling Passive ventilation Water and Islam Water treatment
Salsabil (fountain)
[ "Chemistry", "Engineering", "Environmental_science" ]
432
[ "Water treatment", "Water pollution", "Water technology", "Environmental engineering" ]
75,307,284
https://en.wikipedia.org/wiki/%CE%91-Halo%20carboxylic%20acids%20and%20esters
α-Halo carboxylic acids and esters are organic compounds with the respective formulas RCHXCO2H and RCHXCO2R', where R and R' are organic substituents. The X in these compounds is a halide, usually chloride or bromide. These compounds are often used as intermediates in the preparation of more elaborate derivatives. They are often potent alkylating agents. The monohalide derivatives are chiral. Preparation They are often prepared by reaction of the acid or the ester with halogen: RCH2CO2H + X2 → RCHXCO2H + HX A related method is the Hell–Volhard–Zelinsky halogenation. Amino acids are susceptible to diazotization in the presence of chloride, a process that affords chiral 2-chloro carboxylic acids and esters. Reactions Consistent with these compounds being alkylating agents, the α-halide is readily substituted, e.g. by azide. Similarly, α-bromocarboxylic acids undergo nucleophilic substitution with ammonia to give amino acids: RCHBrCO2H + 2 NH3 → RCH(NH2)CO2H + NH4Br The Darzens reaction involves a ketone or aldehyde with an α-haloester in the presence of a base to form an α,β-epoxy ester, also called a "glycidic ester". The reaction begins with deprotonation at the halogenated position. In a related reaction, α-halo carboxylic esters can be reduced by lithium aluminium hydride to the α-halo alcohols, which can be converted to the epoxides. α-Halo esters can be converted to vinyl halides upon reaction with ketones and chromous chloride. Applications A prominent α-halo carboxylic acid is chloroacetic acid, which is used to produce carboxymethyl cellulose, carboxymethyl starch, and several phenoxy herbicides. 2,2-Dichloropropionic acid ("Dalapon") is a herbicide. References Functional groups Alkylating agents Organochlorides
Α-Halo carboxylic acids and esters
[ "Chemistry" ]
427
[ "Alkylating agents", "Functional groups", "Reagents for organic chemistry" ]
75,310,751
https://en.wikipedia.org/wiki/Hoshi%20Meguri%20no%20Uta
Hoshi Meguri no Uta is a piece of music composed in the pentatonic scale by Miyazawa Kenji in 1918. It is featured in his 1934 novel Night on the Galactic Railroad as well as in its 1985 animated adaptation, where it appears in a music box arrangement by Shimizu Osamu and Haruomi Hosono. It has also been used as the ending song of the visual novel and anime Planetarian: The Reverie of a Little Planet. Its name in Esperanto is La Kanto de la Rondiro de la Steloj. Score and translated lyrics The red-eyed Scorpion, and the Eagle's spread wings The blue-eyed Little Dog, and the coiled Snake of Light Orion sings in the heavens From where fall dew and frost The cloud of Andromeda in the shape of a fish's mouth, and the Great Bear who reaches out five times to the North to the brow of the Little Bear, where shines the guide of the pilgrimage of the sky Summary The lyrics are full of fantastic images of the night sky, although in some places they differ from the usual interpretation of the constellations. The "red eye" of the Scorpion mentioned in the song is Antares, heart of the constellation Scorpius, and the "blue eye" of Canis Minor is Procyon. The "guide of the pilgrimage of the sky" on the brow of Ursa Minor is thought to refer to Polaris, which is actually at the end of that constellation's tail. Hoshi Meguri no Uta received renewed attention when it was sung by Ōtake Shinobu and the Suginami Children's Chorus during the closing ceremony of the 2020 Summer Olympics in Tokyo. See also Musica universalis External links Haruomi Hosono and Shimizu Osamu's 1985 version of Hoshi Meguri no Uta on YouTube References 1918 compositions Works by Kenji Miyazawa Japanese children's songs Constellations
Hoshi Meguri no Uta
[ "Astronomy" ]
397
[ "Sky regions", "Constellations" ]
75,311,105
https://en.wikipedia.org/wiki/Gregory%20Odegard
Gregory M. Odegard is a materials researcher and academic. He is the John O. Hallquist Endowed Chair in Computational Mechanics in the Department of Mechanical Engineering – Engineering Mechanics at Michigan Technological University and the director of the NASA Institute for Ultra-Strong Composites by Computational Design. Odegard's work is focused on computational modeling of advanced composite systems, with his research interests spanning multiscale modeling, computational chemistry, materials science, and mechanics of materials. He is the recipient of the 2008 Ferdinand P. Beer and E. Russell Johnston Jr. Outstanding New Mechanics Educator Award, the 2011 Ralph R. Teetor Educational Award, the 2021 Michigan Tech Distinguished Researcher Award, and the 2023 NASA Outstanding Public Leadership Medal. Odegard is a Fellow of the American Society of Mechanical Engineers (ASME) and an Associate Fellow of the American Institute of Aeronautics and Astronautics (AIAA). Education Odegard earned his B.S. in Mechanical Engineering from the University of Colorado Boulder in 1995. He then completed his M.S. in Mechanical Engineering at the University of Denver in 1998, followed by his Ph.D. in materials science from the same institution in 2000 under Maciej S. Kumosa, with a doctoral thesis titled "Shear-Dominated Biaxial Failure Analysis of Polymer-Matrix Composites at Room and Elevated Temperatures." Career Odegard worked as a National Research Council postdoctoral research associate in the Mechanics and Durability Branch at NASA Langley Research Center, Hampton, Virginia, from 2000 to 2002. Subsequently, he held positions as a Staff Scientist at ICASE in 2002 and as a Staff Scientist at the National Institute of Aerospace from 2003 to 2004, both at NASA Langley Research Center. He serves as director of the NASA Space Technology Research Institute (STRI) for Ultra-Strong Composites by Computational Design (US-COMP). Odegard began his academic career at Michigan Technological University in 2004 as an assistant professor in the Department of Mechanical Engineering – Engineering Mechanics, and served as an associate professor from 2009 to 2013. During this time, he briefly served as a Fulbright Research Scholar at the Norwegian University of Science and Technology, Trondheim, Norway. In 2014, he was named the Richard and Elizabeth Henes Professor in Computational Mechanics in the Department of Mechanical Engineering – Engineering Mechanics at Michigan Technological University, a position he held until 2021. He has held an appointment as the John O. Hallquist Endowed Chair of Computational Mechanics at the same university since 2021. Research Odegard has led a multi-institution effort in developing ultra-strong composites for deep space exploration using carbon nanotubes (CNTs) and polymers, employing computational modeling for accurate property prediction, and has received media coverage for his contributions, including features in publications such as Chemical & Engineering News, CompositesWorld, Nature World News, and Space.com. For his efforts in leading US-COMP to achieve its goals, Odegard was awarded the NASA Outstanding Public Leadership Medal in 2023. Computational modeling of nanocomposites Odegard has conducted research on computational simulation of polymer and polymer-composite materials, and made contributions to the development of new multi-scale modeling approaches for advanced composite materials.
During his time at NASA Langley Research Center, he developed techniques to connect computational chemistry with continuum mechanics. This new approach to materials modeling enabled the development of structure-property relationships in nano-structured materials. In collaboration with researchers from the National Institute of Aerospace and Langley Research Center in 2005, he used this approach to develop constitutive models for polymer composite systems reinforced with single-walled CNTs. Additionally, he developed a multiscale model for silica nanoparticle/polyimide composites, which integrated the molecular structures of the nanoparticle, polyimide, and interfacial regions into the bulk-level constitutive behavior. Odegard and his team further developed computational simulation techniques for nanocomposite materials. He developed the simulation of polymer materials using reactive force fields. These force fields allow for the simulation of chemical bond breakage during mechanical deformation, thus allowing for more accurate computational predictions of polymer mechanical behavior and failure. His team used these techniques to computationally design CNT nanocomposites with improved manufacturability and mechanical behavior. In addition, he was a contributor to the development of CNT yarn composites as part of US-COMP, which showed significant increases in mechanical stiffness and strength relative to state-of-the-art aerospace composite materials. Awards and honors 2006 – HJE Reid Award, NASA Langley Research Center 2008 – Ferdinand P. Beer and E. Russell Johnston Jr. Outstanding New Mechanics Educator Award, American Society of Engineering Education 2011 – Ralph R. Teetor Educational Award, Society of Automotive Engineers 2021 – Michigan Tech Distinguished Researcher Award 2023 – Outstanding Public Leadership Medal, NASA Selected articles Odegard, G. M., Gates, T. S., Nicholson, L. M., & Wise, K. E. (2002). Equivalent-continuum modeling of nano-structured materials. Composites Science and Technology, 62(14), 1869–1880. Odegard, G. M., Gates, T. S., Wise, K. E., Park, C., & Siochi, E. J. (2003). Constitutive modeling of nanotube–reinforced polymer composites. Composites science and technology, 63(11), 1671–1687 Odegard, G. M., & Bandyopadhyay, A. (2011). Physical aging of epoxy polymers and their composites. Journal of polymer science Part B: Polymer physics, 49(24), 1695–1716. Odegard, G. M., Jensen, B. D., Gowtham, S., Wu, J., He, J., & Zhang, Z. (2014). Predicting mechanical response of crosslinked epoxy using ReaxFF. Chemical Physics Letters, 591, 175–178. Odegard, G. M., Clancy, T. C., & Gates, T. S. (2017). Modeling of the mechanical properties of nanoparticle/polymer composites. In Characterization of Nanocomposites (pp. 319–342). Jenny Stanford Publishing. Odegard, G. M., Patil, S. U., Deshpande, P. P., Kanhaiya, K., Winetrout, J. J., Heinz, H., ... & Maiaru, M. (2021). Molecular dynamics modeling of epoxy resins using the reactive interface force field. Macromolecules, 54(21), 9815–9824. Odegard, G.M., Liang, Z., Siochi, E.J., & Warren, J.A. (2023). A successful strategy for MGI-inspired research. MRS Bulletin, 48(5), 434–438. References Materials scientists and engineers NASA people Michigan Technological University faculty University of Denver alumni University of Colorado Boulder alumni Living people Year of birth missing (living people)
Gregory Odegard
[ "Materials_science", "Engineering" ]
1,475
[ "Materials scientists and engineers", "Materials science" ]
75,311,307
https://en.wikipedia.org/wiki/CBS-Lifteam
CBS-Lifteam is an international French-Swiss group, founded in 1991, specializing in timber engineering. History CBS-Lifteam specializes in timber building. It was founded in Lausanne in 1991 by the engineer Jean-Luc Sandoz. The group is a pioneer in all-timber construction methods in the bio-sourced sector. Its slogan is "More engineering, less material". The group has founded two design offices, Concept Bois Structure (CBS) in Paris and Concept Bois Technologie (CBT) in Lausanne; a wood prefabrication plant, Ecotim, created in 2005 in Savoie; and a construction supervision company, Lifteam, created in 2006. Business structure Technology department CBT is the continuation of a start-up at the École polytechnique fédérale de Lausanne (EPFL) created to transfer into practice the research carried out at the school by Jean-Luc Sandoz under the guidance of Julius Natterer, then head of the school's wood chair. The company developed two devices in particular: the Sylvatest, which measures the physical strength of wood using ultrasound, and Polux, which assesses the reliability of wooden poles in service using hydro-densitometric measurement. These methods have spread internationally. Structures department The group designs numerous structural concepts, such as frameworks and floors like the Dalle O'Portune and D-Dalle. Its timber design office specializes in the design and optimization of wood construction projects, aiming to meet high standards of performance, durability and aesthetics, and is recognized for its contribution to ambitious timber construction projects. Production The Ecotim plant in La Rochette is the group's production facility, with a surface area of 4,900 m². The current building replaces a 2,200 m² facility built in 2005 and destroyed by fire in 2010. The group also has a presence in French Guiana, where it has been constructing buildings in local tropical materials, such as for CNES, since 2015. In 2023, the company was present at the International Timber Construction Forum together with players from Guyana's wood industry. CBS-Lifteam presents itself as a pioneer in ecological construction, distinguished by its use of biosourced and geosourced materials. Financial data and employees Company sales were €13 million in 2018 and increased by 40% in 2019. The company had 120 employees in 2019 and at the end of 2022. Environmental and humanitarian commitment In 2007, the company took part in the work of the UNECE. In 2021, CBS-Lifteam worked with an NGO to set up training courses for migrants in Mexico. Awards The company has won prizes for innovative projects and ecological achievements, including at the Green Solutions Awards in 2023. References Companies of Switzerland Timber framed buildings Construction Ultrasound Civil engineering
CBS-Lifteam
[ "Engineering" ]
619
[ "Construction", "Civil engineering" ]
75,313,342
https://en.wikipedia.org/wiki/Giant%20Oj%C5%8D-sama
Giant Ojō-sama is a Japanese manga series written and illustrated by Nikumura Q. It began serialization on Shogakukan's Sunday Webry manga website in July 2021. Publication Written and illustrated by Nikumura Q, Giant Ojō-sama was originally a web comic published on Shueisha's Jump Rookie manga website on December 17, 2020. It later began serialization on Shogakukan's Sunday Webry manga website on July 30, 2021. The first volume was released on November 12 of that year. Its chapters have been collected into nine tankōbon volumes as of April 2024. Reception The series was nominated for the eighth Next Manga Awards in 2022. It was ranked 5th in AnimeJapan's 6th "Most Wanted Anime Adaptation" poll in 2023. References External links 2021 manga Comedy anime and manga Japanese comedy webcomics Kaiju Shogakukan manga Shōnen manga Webcomics in print
Giant Ojō-sama
[ "Physics", "Mathematics" ]
191
[ "Fiction about size change", "Quantity", "Physical quantities", "Size" ]
75,313,681
https://en.wikipedia.org/wiki/SpaceX%20Starship%20design%20history
Before settling on the 2018 Starship design, SpaceX successively presented a number of reusable super-heavy-lift vehicle proposals. These preliminary spacecraft designs were known under various names (Mars Colonial Transporter, Interplanetary Transport System, BFR). In November 2005, before SpaceX had launched its first rocket, the Falcon 1, CEO Elon Musk first mentioned a high-capacity rocket concept, dubbed the BFR, able to launch a very heavy payload to low Earth orbit. Later, in 2012, Musk first publicly announced plans to develop a rocket surpassing the capabilities of the existing Falcon 9. SpaceX called it the Mars Colonial Transporter, as the rocket was to transport humans to Mars and back. In 2016, the name was changed to Interplanetary Transport System, as the rocket was planned to travel beyond Mars as well. The design called for a carbon fiber structure and an enormous fully fueled mass and payload to low Earth orbit, while being fully reusable. By 2017, the concept was temporarily re-dubbed the BFR. In December 2018, the structural material was changed from carbon composites to stainless steel, marking the transition from the early design concepts of Starship. Musk cited numerous reasons for the design change: low cost, ease of manufacture, increased strength of stainless steel at cryogenic temperatures, and ability to withstand high temperatures. In 2019, SpaceX began to refer to the entire vehicle as Starship, with the second stage being called Starship and the booster Super Heavy. The company also announced that Starship would use reusable heat shield tiles similar to those of the Space Shuttle. The second-stage design had also settled on six Raptor engines by 2019: three optimized for sea level and three optimized for vacuum. That year, SpaceX also announced a change to the second stage's design, reducing the number of aft flaps from three to two to save weight. In March 2020, SpaceX released a Starship Users Guide stating payload figures for low Earth orbit (LEO) and geostationary transfer orbit (GTO). Early heavy-lift concepts In November 2005, before SpaceX launched the Falcon 1, its first rocket, CEO Elon Musk first referenced a long-term, high-capacity rocket concept named BFR. The BFR would be able to launch a very heavy payload to LEO and would be equipped with Merlin 2 engines. The Merlin 2 would have been in direct lineage to the Merlin engines used on the Falcon 9, described as a scaled-up, regeneratively cooled engine comparable to the F-1 engines used on the Saturn V. In July 2010, a year after the final launch of Falcon 1, SpaceX presented launch vehicle and Mars space tug concepts at a conference. The launch vehicle concepts were called Falcon X, Falcon X Heavy, and Falcon XX; the largest of all was the Falcon XX, with a heavy-lift capacity to low Earth orbit. To deliver such a payload, the rocket would have been as tall as the Saturn V and would have used six powerful Merlin 2 engines. Mars Colonial Transporter In October 2012, the company made the first public articulation of plans to develop a fully reusable rocket system with substantially greater capabilities than SpaceX's existing Falcon 9. Later in 2012, the company first mentioned the Mars Colonial Transporter rocket concept in public. It was to be able to carry 100 people or an equivalent cargo mass to Mars and would be powered by methane-fueled Raptor engines.
Musk referred to this new launch vehicle under the unspecified acronym "MCT", revealed in 2013 to stand for "Mars Colonial Transporter", which would serve the company's Mars system architecture. SpaceX COO Gwynne Shotwell gave a potential payload range between 150–200 tons to low Earth orbit for the planned rocket. For Mars missions, the spacecraft would carry a very large mass of passengers and cargo. According to SpaceX engine development head Tom Mueller, SpaceX could use nine Raptor engines on a single MCT booster or spacecraft. The preliminary design was exceptionally wide, and was expected to have up to three cores totaling at least 27 booster engines. Interplanetary Transport System In 2016, the name of the Mars Colonial Transporter system was changed to the Interplanetary Transport System (ITS), the vehicle being capable of reaching other destinations as well. Additionally, Elon Musk provided more details about the space mission architecture, launch vehicle, spacecraft, and Raptor engines. A Raptor engine was fired on a test stand for the first time on September 26, 2016, a day before the 67th International Astronautical Congress. At the congress, Musk announced that SpaceX was developing a new rocket using Raptor engines called the Interplanetary Transport System. It would have two stages, a reusable booster and a spacecraft. The stages' tanks were to be made from carbon composite, storing liquid methane and liquid oxygen. Despite the rocket's enormous launch capacity to low Earth orbit, it was expected to have a low launch price. The spacecraft featured three variants: crew, cargo, and tanker; the tanker variant would be used to transfer propellant to spacecraft in orbit. The concept, especially the technological feats required to make such a system possible and the funds needed, garnered substantial skepticism. Both stages would use autogenous pressurization of the propellant tanks, eliminating the Falcon 9's problematic high-pressure helium pressurization system. In October 2016, Musk indicated that the initial tank test article, made of carbon-fiber pre-preg and built with no sealing liner, had performed well in cryogenic fluid testing. A pressure test at about 2/3 of the design burst pressure was completed in November 2016. In July 2017, Musk indicated that the architecture design had evolved since 2016 in order to support commercial transport via Earth-orbit and cislunar launches. The ITS booster was to be an enormous reusable first stage powered by 42 engines, for a total liftoff thrust several times that of the Saturn V. It would have used grid fins to help guide the booster through the atmosphere for a precise landing. The engine configuration included 21 engines in an outer ring and 14 in an inner ring. The center cluster of seven engines would be able to gimbal for directional control, although some directional control would be achieved via differential thrust with the fixed engines. Each engine would be capable of throttling between 20 and 100 percent of rated thrust. The design goal was to achieve a high separation velocity while retaining about 7% of the initial propellant, to achieve a vertical landing at the launch pad.
The booster return flights were expected to encounter loads lower than those of the Falcon 9, principally because the ITS would have both a lower mass ratio and a lower density. The booster was to be designed for 20 g nominal loads, and possibly as high as 30–40 g. In contrast to the landing approach used on SpaceX's Falcon 9 (either a large, flat concrete pad or a downrange floating landing platform), the ITS booster was to be designed to land on the launch mount itself, for immediate refueling and relaunch. The ITS second stage was planned to be used for long-duration spaceflight, instead of solely being used for reaching orbit. Both proposed variants aimed to be reusable, and each would carry three sea-level Raptor engines and six optimized for vacuum firing. The Interplanetary Spaceship would have operated as a second stage and interplanetary transport vehicle for cargo and passengers. It aimed to transport a very large payload per trip to Mars following refueling in Earth orbit. Its three sea-level Raptor engines were designed to be used for maneuvering, descent, landing, and initial ascent from the Mars surface. It would have had a dry mass of 150 tonnes (330,000 lb) and a far larger propellant capacity. The ITS tanker would transport propellant to low Earth orbit in a single launch; after refueling operations, it would land and be prepared for another flight. Its propellant capacity and dry mass were tailored to this role. Big Falcon Rocket In September 2017, at the 68th annual meeting of the International Astronautical Congress, Musk announced a new launch vehicle calling it the BFR, again changing the name, though stating that the name was temporary. The acronym was alternatively stated as standing for Big Falcon Rocket or Big Fucking Rocket, a tongue-in-cheek reference to the BFG from the Doom video game series. Musk foresaw the first two cargo missions to Mars as early as 2022, with the goal to "confirm water resources and identify hazards" while deploying "power, mining, and life support infrastructure" for future flights. This would be followed by four ships in 2024, two crewed BFR spaceships plus two cargo-only ships carrying equipment and supplies for a propellant plant. The design balanced objectives such as payload mass, landing capabilities, and reliability. The initial design showed the ship with six Raptor engines (two sea-level, four vacuum), down from nine in the previous ITS design. By September 2017, Raptors had been test-fired for a combined total of 20 minutes across 42 test cycles. The longest test was 100 seconds, limited by the size of the propellant tanks. The test engine operated at a lower chamber pressure than the flight engine was designed for, with still higher pressures targeted in later iterations. In November 2017, Shotwell indicated that about half of all development work on the BFR was focused on the engine. SpaceX looked for manufacturing sites in California, Texas, Louisiana, and Florida. By September 2017, SpaceX had started building launch vehicle components: "The tooling for the main tanks has been ordered, the facility is being built, we will start construction of the first ship [in the second quarter of 2018.]" By early 2018, the first carbon composite prototype ship was under construction, and SpaceX had begun building a new production facility at the Port of Los Angeles, California. In March, SpaceX announced that it would manufacture its launch vehicle and spaceship at a new facility on Seaside Drive at the port.
By May, about 40 SpaceX employees were working on the BFR. SpaceX planned to transport the launch vehicle by barge, through the Panama Canal, to Cape Canaveral for launch; the company has since terminated the agreements to do this. In August 2018, the head of the US Air Force Air Mobility Command expressed interest in the ability of the BFR to move a heavy cargo load anywhere in the world in under 30 minutes, for "less than the cost of a C-5". The BFR was designed to be made of carbon composites. The upper stage, known as the Big Falcon Ship (BFS), included a small delta wing at the rear end with split flaps for pitch and roll control. The delta wing and split flaps were said to expand the flight envelope, allowing the ship to land in a variety of atmospheric densities (vacuum, thin, or heavy atmosphere) with a wide range of payloads. The BFS design originally had six Raptor engines, with four vacuum and two sea-level. By late 2017, SpaceX added a third sea-level engine (totaling 7) to allow greater Earth-to-Earth payload landings and still ensure capability if one of the engines failed. Three BFS versions were described: BFS cargo, BFS tanker, and BFS crew. The cargo version would have been used to reach Earth orbit as well as to carry cargo to the Moon or Mars. After refueling in an elliptical Earth orbit, the BFS was designed to eventually be able to land on the Moon and return to Earth without another refueling. The BFR also aimed to carry passengers and cargo in Earth-to-Earth transport, delivering its payload anywhere within 90 minutes. Changes to early Starship design In December 2018, the structural material was changed from carbon composites to stainless steel, marking the transition from the early design concepts of Starship. Musk cited numerous reasons for the design change: low cost and ease of manufacture, increased strength of stainless steel at cryogenic temperatures, and its ability to withstand high heat. The high temperature at which 300-series steel transitions to plastic deformation would eliminate the need for a heat shield on Starship's leeward side, while the much hotter windward side would be cooled by allowing fuel or water to bleed through micropores in a double-wall stainless steel skin, removing heat by evaporation. The liquid-cooled windward side was changed in 2019 to use reusable heat shield tiles similar to those of the Space Shuttle. In 2019, SpaceX began to refer to the entire vehicle as Starship, with the second stage being called Starship and the booster Super Heavy. In September 2019, Musk held an event about Starship development during which he further detailed the lower-stage booster, the upper stage's method of controlling its descent, the heat shield, orbital refueling capacity, and potential destinations besides Mars. Over the years of design, the proportion of sea-level to vacuum engines on the second stage varied drastically; by 2019, the design had settled on six Raptor engines, three optimized for sea level and three for vacuum. To decrease weight, the aft flaps on the second stage were reduced from three to two. Later in 2019, Musk stated that Starship would initially transport a substantial payload to low Earth orbit, growing over time, and hinted at an expendable variant that could place 250 tonnes into low orbit.
One possible future use of Starship that SpaceX has proposed is point-to-point flights (called "Earth to Earth" flights by SpaceX), traveling anywhere on Earth in under an hour. In 2017, SpaceX president and chief operating officer Gwynne Shotwell stated that point-to-point travel with passengers could become cost-competitive with conventional business class flights. John Logsdon, an academic on space policy and history, said that the idea of transporting passengers in this manner was "extremely unrealistic", as the craft would switch between weightlessness and 5 g of acceleration. He also commented that "Musk calls all of this 'aspirational,' which is a nice code word for more than likely not achievable." See also History of SpaceX Space Shuttle design process SpaceX ambition of colonizing Mars Studied Space Shuttle designs Notes References SpaceX Starship Spacecraft design
SpaceX Starship design history
[ "Engineering" ]
3,074
[ "Spacecraft design", "Design", "Aerospace engineering" ]
75,314,023
https://en.wikipedia.org/wiki/HD%208357
AR Piscium is a binary star system in the constellation of Pisces, abbreviated AR Psc. It has the Henry Draper Catalogue identifier HD 8357; AR Piscium is its variable star designation. The pair have a combined apparent visual magnitude that fluctuates around 7.24, which is too faint to be readily visible to the naked eye. Parallax measurements place it at a distance of 148 light years from the Sun. The motion of this star through the Milky Way suggests it is a member of the intermediate disc population. The variable X-ray source H0123+075 was identified from the HEAO 1 A-2 experiment and published by F. E. Marshall and associates in 1979. The following year, M. Garcia and associates identified the most probable source star as HD 8357 and determined it to be an RS Canum Venaticorum variable. The star has a spectral class of G5 in the Henry Draper Catalogue. Optical observations by D. S. Hall and associates in 1980–1981 confirmed the source star to be optically variable with a period of . In 1993, AR Psc was identified as an extreme ultraviolet source by K. A. Pounds and associates using ROSAT. This is a double-lined spectroscopic binary with an orbital period of 14.3 days and an eccentricity of 0.185. The mass ratio of the two components is . The primary component is an evolving subgiant star with a stellar classification of K1 IV. It is the chromospherically active member of this system, displaying visual flares. Intense X-ray flares have been detected. The smaller and less massive secondary star is a G-type main-sequence star with a stellar class of G7 V. Based on the significant difference between the orbital and photometric periods, the two stars are in pseudosynchronous rotation. References Further reading K-type subgiants G-type main-sequence stars RS Canum Venaticorum variables Spectroscopic binaries X-ray binaries Pisces (constellation) Durchmusterung objects 3095 008357 006454 Piscium, AR
HD 8357
[ "Astronomy" ]
450
[ "Pisces (constellation)", "Constellations" ]
75,315,271
https://en.wikipedia.org/wiki/MEEZA%20QSTP
MEEZA QSTP (Arabic: ميزة كيو اس تس بي) is an IT provider founded in the Qatar Science & Technology Park (QSTP). The company has five Tier III certified data centres known as M-VAULTs, built to comply with international standards and offering a guaranteed uptime of 99.98%. MEEZA provides data centre services, managed IT services, cloud services, and IT security services, as well as smart city solutions and artificial intelligence (AI). MEEZA is a publicly quoted company on the Qatar Stock Exchange main market. History MEEZA was founded in 2008 as a Qatar Foundation joint venture providing information technology (IT) services. The name MEEZA was derived from the Arabic word for "advantage". MEEZA opened its first data centre, M-VAULT, in 2009, offering 99.98% availability. In 2012, MEEZA opened the M-VAULT 3 data centre in the second quarter and the M-VAULT 2 data centre in the third quarter. The data centres have been designed to Uptime Institute Tier III standards, with 99.98% guaranteed availability. In addition, MEEZA was the first company in Qatar to have a Security Operations Centre (SOC), enabling it to mitigate digital and cybersecurity threats. In 2021, MEEZA launched its fourth data centre, M-VAULT 4, hosting a Microsoft cloud data centre. Later in 2022, MEEZA announced the launch of the M-VAULT 5 data centre less than one year after the launch of M-VAULT 4 to meet the increasing demand for data centres and cloud services. In 2023, MEEZA became the first company in Qatar to use a book-building exercise for an initial public offering (IPO). MEEZA shares were listed on the Qatar Stock Exchange main market from 23 August 2023. References Qatari companies established in 2008 Companies listed on the Qatar Stock Exchange Companies based in Doha Data centers Qatari brands
MEEZA QSTP
[ "Technology" ]
393
[ "Data centers", "Computers" ]
75,316,409
https://en.wikipedia.org/wiki/RipX
RipX is audio modification software developed by Hit'n'Mix Ltd. of the UK and used in the entertainment industry. The software separates and edits the individual musical instruments within audio recordings. Company According to UK government records, the Neuratron Group was formed in 2004 and has a subsidiary, Hit'n'Mix, involved in developing software for source separation and audio editing. The firm has developed several versions of professional music recognition software and other analysis tools targeted at AI-enabled music makers. Martin Dawe, the company's programmer, first developed the music recognition tools the later products built on: PhotoScore, optical music recognition software that converted images of sheet music into playable music notation, came out in 1996. Parts of PhotoScore were included with the notation software Sibelius. Later, the ambition to do the same for music recordings, turning recorded notes into MIDI, brought forth the software titles AudioTune (2004) and, two years later, AudioScore. The first software offering from Hit'n'Mix, the eponymous Hit'n'Mix, came to market in 2011, after the developer had spent 10 years developing the system. Hit'n'Mix contained the Rip audio format that RipX still uses. Software RipX was initially released as Infinity, around 2019. A continuation of the earlier Hit'n'Mix software, it bore some resemblance to the popular Celemony pitch editing software, particularly its DNA Direct Note Access feature, which allows users to edit notes from different instruments within an audio recording. Its ability to isolate instruments set it apart from the start, but it came more into its own with the first major update, in 2021, when it was renamed RipX. The first version had relied on algorithmic processing, sinusoidal spectral analysis and resynthesis, but the update added machine learning and came with drastically improved stem separation capabilities. As RipX grew, it added upgrade modules. The affordable base version, DeepRemix, had stem separation, note editing, pitch/tempo editing and audio editing features like volume, EQ and panning for each instrument. The more expensive DeepAudio added more in-depth sound, pitch and harmony editing features, with product names such as Audioshop, Unpitched Editor, Clone, Draw Audio and others. It also added the RipScripts scripting feature, which was touted as the growth feature. Later, a third module, RipX DeepCreate, added some digital audio workstation (DAW)-like features, instrument replacement, audio recording and VST instrument hosting. DeepCreate added a new price point, placed between the affordable DeepRemix and the costlier DeepAudio. RipX DAW In November 2023, the software was relaunched as RipX DAW. At the same time, it switched back to two price tiers. The upgrade centered on changes making the software into a fully fledged DAW. It added some audio effects, recording and playback improvements, "Integrated AI Music Generator access", and various UI changes, along with improvements described as affecting both sound quality and speed. References Audio software
RipX
[ "Engineering" ]
625
[ "Audio engineering", "Audio software" ]
75,317,701
https://en.wikipedia.org/wiki/Exfoliation%20%28chemistry%29
Exfoliation is a process that separates layered materials into nanomaterials by breaking the bonds between layers using mechanical, chemical, or thermal procedures. While exfoliation has historical roots dating back centuries, significant advances and widespread research gained momentum after Novoselov and Geim's discovery of graphene using Scotch tape in 2004. Their Nobel Prize-winning research relied primarily on mechanical exfoliation for the production of graphene, which sparked immediate interest in the exfoliation process. Today, exfoliation is regarded as the most widely used nanomaterial production technique. Exfoliation typically involves breaking weak bonds, called van der Waals bonds, to create two-dimensional materials such as graphene or transition metal dichalcogenide monolayers. While various reversible chemical processes, such as intercalation, can disrupt the weak bonds in a lamellar structure and introduce guest species, many of them fail to produce single-sheet materials, as the processes are not strong enough to cancel the interlayer attractions. However, during exfoliation, the high energy input leads to an extreme bond-breaking process that irreversibly separates the layers into single sheets. Lately, it has been shown that if the energy input is substantial enough, the procedure can even break much stronger bonds, such as metallic or ionic bonds, to create non-van der Waals materials like hematene or other nanoplatelets. In recent years, exfoliation has found practical applications in a wide range of fields, from electronics to biomedicine and beyond. It plays a vital role in creating advanced materials with properties tailored for specific uses, such as high-performance electronics, efficient energy storage devices, and lightweight yet robust materials for aerospace applications. This versatility and adaptability make exfoliation a crucial technique in cutting-edge materials research and various industrial sectors. History While the use of exfoliation can be traced back to ancient Chinese and Mayan pottery, the earliest scientific work involving exfoliation was the production of vermiculite by Thomas H. Webb in 1824. However, during this early period, no substantial research was conducted to understand the nature of the mechanisms that facilitated these reactions. Arguably, the first research that delved into the mechanism of the process rather than its usage was Brodie's work, which revealed in 1855 that certain acids produced lamellar carbon structures. Despite this discovery, extensive research on the topic did not immediately follow. The exfoliation concepts we have today were not developed until the realization in 1926 that graphite absorbs alkali metals. This discovery laid the groundwork for a more solid theoretical framework, enabling scientists to apply the method in their production processes. One method that made use of this theoretical background and eventually led to further development of the process as a production technique was Rüdorff and Hoffman's work, which introduced an electrochemical method for exfoliation in 1938. The development of electrochemical exfoliation piqued the interest of more researchers, and more people started regarding the process as a production technique. One of the most notable examples of the success of the method as a mass production technique was the invention of the first commercial lithium carbon fluoride batteries in 1973. 
The real turning point for exfoliation research came in 2004, when Novoselov and Geim isolated graphene through mechanical exfoliation using Scotch tape. This innovative research earned them the Nobel Prize in Physics in 2010, reigniting interest in exfoliation methods. In subsequent years, numerous processes were developed for more precise manufacturing and higher yields. While most exfoliation research focused on graphite and graphene during the last few decades, the rather difficult processing of graphene and its lack of an obvious band structure have recently led many research groups to begin working on different elements to utilize exfoliation for the production of other nanomaterials. One of the most significant breakthroughs of this new wave of research has been the discovery of non-van der Waals nanoplatelets. This discovery demonstrated that exfoliation could occur without relying on weak bonds, which opened up new and promising applications in industry. Types of Exfoliation The exfoliation process is typically applied to lamellar structures with weak bonds. While these bonds are weak enough to be easily broken by an external force, they are strong enough that the material does not separate into single layers on its own. In order to separate the material into single layers, the attraction between consecutive layers must be overcome. As interest in exfoliated material research surged, many researchers started to develop new and better ways to overcome these interlayer attractions. Despite the high number of methods, it is possible to classify them into three distinct categories based on the source of energy used in the process: mechanical, chemical, and thermal exfoliation. Mechanical Exfoliation In mechanical exfoliation, external forces act upon the material, breaking the bonds due to the stress accumulated within the material. Depending on the intensity and the specific nature of the phenomenon, these forces break the van der Waals forces, separating materials into 2D nanostructures. Sometimes, a solvent is introduced to the material to facilitate complete breakdown, as liquid environments significantly reduce bond strength compared to vacuum conditions. While mechanical exfoliation is effective in separating the layers, it lacks predictability and systematic results. The process requires repetition until individual layers are achieved. To obtain consistent nanomaterials with specific properties, experimentation and fine-tuning of conditions based on the results are required. Therefore, mechanical exfoliation techniques are rather empirical, and most mathematical models rely on empirical results rather than ab initio calculations. The original method proposed by Novoselov and Geim, micromechanical cleavage, was essentially a mechanical exfoliation method. Consequently, mechanical exfoliation methods were developed more rapidly than the others. Major mechanical exfoliation methods include micromechanical cleavage and liquid phase separation. Micromechanical Cleavage Micromechanical cleavage is the original graphene production method proposed by Novoselov and Geim. This process involves using sticky tape to take graphite samples and separating the layers repeatedly until a single layer is obtained. Although the process yielded high-purity single-layered materials, it fell out of favor quickly, as it required several repetitions to produce a single layer of graphene and was likely to damage the graphene layers during the process. 
Liquid Phase Separation Liquid phase separation is one of the most widely used exfoliation methods. Its high yields, high purity, and scalability make it one of the most preferred exfoliation methods. It works by providing a liquid medium for the mechanical exfoliation methods. The liquid medium significantly reduces the strength of the bonds in the material compared to vacuum conditions, making it easier for mechanical forces to break the weak bonds in the material. However, due to interfacial tension forces, liquid phase separation does not always yield uniform results. When the tension forces do not balance out, single graphene layers may break apart. To achieve relatively uniform results, the overall energy of the system must be minimized. The best way to optimize this condition is to use solvents with surface tensions similar to that of the material of interest (a simple screening of candidate solvents along these lines is sketched after this section). Liquid phase separation utilizes various external forces to break the van der Waals forces. The most widely used liquid phase separation methods include sonication, which uses sound waves, and shearing, which uses shear forces. Sonication The sonication method utilizes ultrasonic sound waves to create micrometer-sized bubbles in liquid environments. When these bubbles reach a critical size, they collapse with an instantaneous temperature of 5000 K, a local pressure of 20 MPa, and a heating/cooling rate of up to 10⁹ K s⁻¹. These sudden physical changes create shock waves that can act on lamellar materials and break the weak forces between the layers. Although sonication is a long-known laboratory technique, it was first applied to graphene exfoliation in 2008, and it led to liquid exfoliation becoming the predominant technique. While sonication is generally used as an exfoliation method on its own, it is also used as a further processing step to perfect the nanoflakes created with other exfoliation methods. It is therefore a common technique used in combination with the other methods. However, one disadvantage of sonication is its reaction time: a complete exfoliation reaction may take days to finish. That said, prolonged exfoliation times allow the creation of more stable solutions, making long sonication times favorable for obtaining purer, defect-free products. Nanomaterials created with sonication yield an unperturbed particle size 1.5 times larger. Shearing The shearing method makes use of lab mixers to exfoliate lamellar structures into single-layered nanomaterials. Lab mixers create a sufficient shear force to allow consecutive layers of the material to slip over each other, which produces massive quantities of highly pure material. Although the shearing method had previously been used as a further processing step to break up relatively large clusters of nanomaterials into single layers, in 2010 it was introduced as a direct method to exfoliate graphite into graphene. Later studies confirmed the applicability of the method to other lamellar materials such as h-BN, MoS2, WS2, MoSe2, and MoTe2. While this method has a high yield and purity among the exfoliation methods, its known linear relations with concentration, mixing time, rotor speed, and rotor diameter, and its inverse relation with liquid volume, give it some of the best controllability of all the exfoliation methods. This innovative procedure has been adapted for household kitchen mixers, significantly reducing the costs and complexity of the exfoliation methods, thereby sparking another wave of research in layered structures. 
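The surface-tension-matching rule of thumb mentioned above lends itself to a quick screening calculation. The following is a minimal sketch, assuming approximate room-temperature literature values for surface tension (in mN/m) and a commonly cited effective target of about 40 mN/m for graphene; the numbers are illustrative only, not authoritative data.

```python
# Rank candidate solvents for liquid-phase exfoliation by how closely their
# surface tension matches the target material's. The values below are
# approximate room-temperature literature figures in mN/m, included only to
# illustrate the matching heuristic; consult tabulated data for real work.

GRAPHENE_TARGET = 40.0  # commonly cited effective value for graphene, mN/m

solvents = {
    "NMP": 40.8,          # N-methyl-2-pyrrolidone, a classic exfoliation solvent
    "DMF": 37.1,          # N,N-dimethylformamide
    "water": 72.8,
    "ethanol": 22.1,
    "isopropanol": 23.0,
}

# Sort by absolute mismatch; the smallest mismatch is the best candidate.
ranked = sorted(solvents.items(), key=lambda kv: abs(kv[1] - GRAPHENE_TARGET))
for name, gamma in ranked:
    print(f"{name:12s} {gamma:5.1f} mN/m  mismatch {abs(gamma - GRAPHENE_TARGET):4.1f}")
```

Consistent with the text, such a ranking favors solvents like NMP and DMF over water or short-chain alcohols for graphene, matching the reported success of those solvents in liquid phase separation.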
Chemical Exfoliation Chemical exfoliation employs the intercalation process to separate material layers. During intercalation, guest ions or free electrons are introduced between the layers, disrupting the bond structure and forming new bonds. For example, in the case of van der Waals forces, which are common in chemical exfoliation, positive and negative regions are induced, attracting ions. Given that the bonds between layers are weak, they tend to break, forming new, stronger bonds with these ions. Typically, these stronger bonds lead to the creation of functional groups that significantly reduce interlayer attractions. At this stage, the interlayer attraction becomes low, and thanks to the ability of the functional groups to decompose with further processing, the layers can be easily separated. Chemical exfoliation's scalability advantages over other production methods have made it one of the most widely used techniques. In addition to its scalability, the variety of chemicals available plays an important role in encouraging researchers to explore various production methods. Chemical exfoliation is also commonly used in combination with mechanical and thermal exfoliation methods to obtain purer results. The most widely used chemical exfoliation methods are chemical vapor deposition, graphite oxide reduction, and electrochemical exfoliation. Chemical Vapor Deposition First introduced in 2008, chemical vapor deposition emerged as one of the most popular methods for graphene exfoliation. This method utilizes a transition metal film as a base layer and exposes it to hydrocarbons at high temperatures (900–1000 °C) and ambient pressure. During the process, the hydrocarbon decomposes, and carbon atoms form one to ten layers of graphene flakes over the metal film. The metal film is then cooled at a determined rate to achieve specific particle sizes. This process is especially useful for applications such as circuit drawing and surface-based applications of graphene, including the production of photovoltaic cells. Although the method was widely used until the last decade, its relatively expensive process has led to its replacement by other methods. However, there is still ongoing research to further develop the process for more efficient use with various materials. Oxide Reduction The oxide reduction method is particularly widely used with graphite to create graphene. It involves introducing oxide functional groups into the lamellar structure, which doubles the distance between graphite layers and reduces van der Waals attractions. These functional groups are then removed using reductants, resulting in single graphene layers from the graphite, which can now be easily exfoliated due to the reduced van der Waals attractions. This method is especially valuable for fine-tuning the band gap properties of graphene, which naturally lacks a band gap. While this method was widely used over the last decade, its impurity levels led to its decline in popularity. The presence of a large number of holes and defects made the produced graphene unsuitable for electronics, and the chemicals used were hazardous. In 2014, a research group succeeded in isolating graphene layers without the use of oxidants, significantly increasing the purity of the samples and eliminating the need for further processing of the products. This advancement is expected to reignite interest in oxide reduction exfoliation. 
Electrochemical Exfoliation One of the most promising exfoliation methods is electrochemical exfoliation, which has been popular among researchers since its introduction in 2008. This method is mainly based on 20th-century studies of electrolysis and electrochemical intercalation. Electrochemical exfoliation makes use of potential differences between a lamellar-structured electrode and a platinum electrode to attract oppositely charged ions to the electrodes. These accumulations trigger the intercalation process in the material and ultimately result in the complete exfoliation of the material into single nanomaterial layers. However, intercalation is not always the only reaction mechanism, as bubbles are sometimes observed depending on the solvent and electrolyte used. These bubbles also facilitate exfoliation by creating an effect similar to that of the sonication method. The process may be called cathodic or anodic exfoliation, depending on which electrode is the lamellar-structured one. Cathodic exfoliation requires an organic solvent medium with a lithium or alkylammonium electrolyte, while anodic exfoliation can be done with water and strong electrolytes. Anodic exfoliation is more efficient than cathodic exfoliation, as it forms oxide and hydroxide functional groups, significantly increasing intercalation in the material. However, anodic exfoliation also results in impure products, so the choice between the two methods depends on the specific application. Electrochemical exfoliation products may also require further processing. Unlike liquid exfoliation, electrochemical exfoliation eliminates most of the chemical reactions involved, resulting in purer products. This method increases scalability and controllability, and decreases contamination and reaction time for the exfoliated material. Therefore, many researchers aim to implement the method in industry for the mass production of carbon nanomaterials and transition metal dichalcogenide monolayers. Thermal Exfoliation Thermal exfoliation uses heat as the energy source for the exfoliation process. Despite heat being such a fundamental energy source for most other chemical processes, its use in exfoliation is relatively recent. Most thermal exfoliation methods take the same approach: chemically intercalated lamellar structures are subjected to extreme temperatures to decompose the functional groups created through chemical methods. The decomposition of these functional groups generates gases that build up pressure between layers, countering the van der Waals attractions between material layers. When well-chosen functional group/temperature combinations are used, complete separation of the layers occurs. One advantage of thermal exfoliation methods over others is their higher production rate, a crucial property for mass production applications. Additionally, their reaction times are the shortest among all exfoliation methods. A process that might take days to complete with mechanical exfoliation can be finished within seconds using thermal exfoliation methods. However, reduced reaction time and higher yields come at the cost of reduced control over particle size due to the nature of the process. Therefore, the process still lacks the optimization and reproducibility required by industry. Today, the most widely used thermal methods are high-temperature, low-temperature, and microwave exfoliation. 
High Temperature Thermal Exfoliation High-temperature thermal exfoliation employs temperatures above 550 °C to decompose functional groups. The biggest advantage of this method is its short reaction times. An exfoliation process that might take days to complete with mechanical exfoliation can be done in a matter of seconds through high-temperature thermal exfoliation. However, decreased reaction times come at the price of impure products. Due to the extremely high temperatures, operating costs increase significantly. Moreover, the carbon dioxide and water vapor produced during the decomposition of oxide groups react with the material, causing defects and impurities. Low Temperature Thermal Exfoliation Low-temperature thermal exfoliation aims to retain the benefits of high-temperature thermal exfoliation while avoiding unwanted outcomes such as high costs and impurities. For this purpose, low-temperature thermal exfoliation employs relatively lower temperatures of 200–550 °C to decompose the functional groups. These temperatures yield purer results than high-temperature thermal exfoliation because the chemicals produced at these temperatures do not readily react with the layered material itself. Although this decrease in temperature lengthens reaction times, it is usually favored to achieve purer results. Even though the reaction time is longer in low-temperature thermal exfoliation than in high-temperature, it is still significantly shorter than in other methods. Additionally, low-temperature thermal exfoliation allows for fine-tuning of the bandgap properties of materials, making it an ideal method for electronic applications. Microwave Irradiation Exfoliation Microwave irradiation exfoliation is another exfoliation method that greatly decreases the complexity of exfoliation experiments. First utilized for the production of exfoliated graphite, it was later adapted for other nanomaterials. In the microwave irradiation exfoliation method, materials partially intercalated through chemical processes are exposed to microwave radiation. Ions and molecules trapped between layers absorb microwaves, leading to local temperature changes. These local changes trigger significant physical and chemical phenomena that result in complete exfoliation of the lamellar material. Due to its reduced costs and high efficiency, microwave irradiation exfoliation is one of the most popular exfoliation methods. The method also provides higher yields with pure results within shorter timeframes. Although microwave irradiation exfoliation has great benefits, there is still some ambiguity about the mechanisms of this method, as its products are reported to be exfoliable again through chemical exfoliation. 
In addition to graphene, the exfoliation process enables the production of various other carbon allotropes, the most important being carbon nanotubes and carbon quantum dots. These materials are also expected to create billion-dollar industries, and as a result, their commercialization is anticipated to drive advances in exfoliation methods. Although graphene is expected to be one of the most important materials in the future, there are still some disputes about some of its applications. The challenging processing of graphene and its lack of an obvious band structure have led many researchers to explore new uses of exfoliation methods. This shift has recently increased research into efficient production methods for transition metal dichalcogenide (TMD) monolayers significantly. TMD monolayers have band gaps ranging from insulating to semiconducting, thanks to their quantum confinement effects. Therefore, they are expected to have significant applications in the near future, particularly with the further development of optoelectronics. Currently, TMD monolayers find applications in electronic devices such as solar cells, photodetectors, light-emitting diodes, and phototransistors. There is also growing interest in their use in power storage systems, such as batteries and supercapacitors. Since exfoliation is TMD monolayers' most common production technique, it is projected that their potential commercialization will require extensive use of exfoliation methods, eventually creating new applications for exfoliation. Theoretically, exfoliation requires the presence of weak bonds. However, recent studies have shown that even materials with metallic and ionic bonds can be exfoliated with the proper procedures. The materials created through these methods are called non-van der Waals nanoplatelets. One notable non-van der Waals material is hematene, a single sheet of hematite, the most abundant form of iron ore. Hematene is known to have interesting photocatalytic properties due to its modified bandgap, offering potential applications in energy storage, optoelectronics, and biomedicine. Since one of the most common ways to create hematene is through liquid phase separation, applications of hematene would increase interest in exfoliation. References Chemical processes
Exfoliation (chemistry)
[ "Chemistry" ]
4,392
[ "Chemical process engineering", "Chemical processes", "nan" ]
75,319,288
https://en.wikipedia.org/wiki/Clindamycin/adapalene/benzoyl%20peroxide
Clindamycin/adapalene/benzoyl peroxide, sold under the brand name Cabtreo, is a fixed-dose combination medication used for the treatment of acne. It contains clindamycin, as the phosphate, a lincosamide antibacterial; adapalene, a synthetic retinoid; and benzoyl peroxide, an oxidizing agent. It is applied to the skin. Clindamycin/adapalene/benzoyl peroxide was approved for medical use in the United States in October 2023. It is the first triple-combination topical acne treatment approved by the US Food and Drug Administration. References Anti-acne preparations Combination drugs
Clindamycin/adapalene/benzoyl peroxide
[ "Chemistry" ]
148
[ "Pharmacology", "Pharmacology stubs", "Medicinal chemistry stubs" ]
75,319,478
https://en.wikipedia.org/wiki/Asplund%20Pavilion
The Asplund Pavilion is an installation art structure built in 2018 in Venice, Italy. It is located at the Cini Foundation in a forest on the island of San Giorgio Maggiore. It was created for the 16th Venice Biennale architecture exhibition as one of eleven structures for the "Vatican Chapels" project, which was promoted by the Holy See. The project was the first time the Holy See had sponsored an art project at the exhibition. The Asplund Pavilion is noted for offering insights into contemporary perspectives on worship spaces from architects representing diverse cultural backgrounds. The project was inspired by the Skogskapellet (the Woodland Chapel), designed in 1920 by the architect Erik Gunnar Asplund in the Skogskyrkogården (the Woodland Cemetery) in Stockholm, Sweden. The Pavilion is entirely made of wood and contains an exhibition of original drawings and models by the architect Asplund. History The Holy See took part in the 16th International Architecture Exhibition of the Venice Biennale for the first time in 2018. Led by directors Francesco Dal Co and Micol Forti, the project team included eleven international architects, who reimagined Gunnar Asplund's 1920 Forest Chapel in Stockholm's cemetery within the contemporary socio-cultural context. The Asplund Pavilion was the first chapel constructed for the project, and it was designed by two Italian architects, Francesco Magnani and Traudy Pelzel. It was built by Alpi, an Italian wood manufacturer. The participating architects included: Andrew Berman (United States) Carla Juacaba (Brazil) Eduardo Souto de Moura (Portugal) Eva Prats and Ricardo Flores (Spain) Francesco Cellini (Italy) Francesco Magnani and Traudy Pelzel (Italy) Javier Corvalan (Paraguay) Norman Foster (Great Britain) Sean Godsell (Australia) Smiljan Radic (Chile) Terunobu Fujimori (Japan) Each design adhered to the requirement of using materials assigned to each team, provided by sponsoring commercial partners in the construction sector. The cost of materials was met by the sponsors, not the Vatican. Companies which supplied materials and technology and supported the construction costs included: Alpi; Barth Interni; Gruppofallani; Laboratorio Morselletto; Leucos; LignoAlp; Maeg; Moretti; Panariagroup; Piaggio Group; Sacaim; Saint-Gobain (Italy); Secco Sistemi; Simeon; Tecno; Terna; Zintek. The Vatican Chapels Project The Vatican Chapels project comprises 10 chapels and the Asplund Pavilion, aimed at exploring and reinterpreting the concept of sacred spaces in contemporary society. The Holy See created the project to combine chapels with tourism, based on modern changes in how individuals seek peace, quiet and spiritual space. The Holy See commissioned eleven international architects for the project to create new chapel designs, providing contemporary interpretations from different cultural backgrounds. The project was created to better understand how people view Catholic places of worship and different styles in the 21st century. The eleven chapels are categorized into two archetypal groups: those aligned with the archetype of the hut, based on the concept of a primary interior protective space those following the archetype of the totem, based on an architectural inclination that relinquishes its inherent spatiality, serving as a pre-existing space, transformed through the insertion of an element that alters its proximity in relation to the surrounding space The Asplund Pavilion conforms to the archetype of the hut and is based on the Forest Chapel. 
It functions as a transitional space, incorporating elements of both the extroverted hut and the introverted hut within its design. The Pavilion is designed both to confine an interior space isolated from its surroundings and to define that space by establishing a direct relationship with the exterior. The term interior cabin archetype denotes chapels characterized by a precisely defined interior space that is differentiated from the exterior. This archetype aligns with the Catholic Church's common practice of isolating sacred spaces from their surroundings, such as the interior space of the Forest Chapel. Architects and designers Francesco Magnani and Traudy Pelzel are the architects of the Asplund Pavilion. Both graduated from the IUAV in Venice, Traudy Pelzel in 1994 and Francesco Magnani in 1999. They represent the team of architects whose collaboration created MAP Studio (Magnani Pelzel Associated Architects) in 2004, an international office based in Venice, Italy. The studio carries out projects related to architecture, urbanism and design. The two architects have received awards for their work, including the XXXI Premio Torta in October 2011 for the restoration of the Porta Nuova Tower, and an honourable mention during the Piran Days of Architecture in Slovenia in November 2011 for a project conducted in Venice. Piero Lissoni worked on the interior design of the Asplund Pavilion. Building The architecture of the Asplund Pavilion was inspired by the stavkirke (stave church), a type of medieval wooden Christian church from Scandinavia. The Asplund Pavilion is approximately 11 meters long and 8 meters high, and it is supported by 11 lamellar wood portals that define 10 bays. It presents a pitched roof that emphasizes its height, characterized by continuous wood cladding made of 9000 wooden shingles, interrupted by a series of symmetrical triangular skylights placed on both sides. Materials For this project the company Alpi developed an experimental and unique material for the external part, which had to be waterproof and able to maintain its appearance over time. The exteriors of the building are entirely made of dark gray shingles, positioned like "dragon's skin", featuring the Xilo 2.0 Planked Gray wood pattern. Wood of the Xilo 2.0 Striped White collection was used to cover all internal surfaces. Style of the building The Asplund Pavilion belongs to the archetype of the hut, which consists of three main elements: standalone columns, horizontal beams forming the entablature, and a basic pediment marking the triangular end of a pitched roof. The chapel encompasses the stereometry of the supporting structures crafted by Asplund and Lewerentz for the Stockholm cemetery, drawing inspiration from Nordic woodwork. The design attempts to create a domestic absolute, intertwining themes of shelter in nature and a reinterpreted vernacular architecture. The materials used were chosen to give the exterior of the building a natural look, based on Asplund's original concept. The interior was designed to create a calm and tranquil atmosphere. The Pavilion incorporates shapes and colors from nature, with design elements intended to make a strong visual impression. The external shingles feature a dark wooden material, which creates a contrast with the light wood used for the internal cladding. The external pattern creates an intricate microstructure by playing with light and shadow contrasts, evoking the multitude of the Nordic forest. 
Art Collection The internal space of the Pavilion features an exhibition presenting replica drawings, texts, photographs and scale models referring to the original "Woodland Chapel". The exhibits are provided by the Canadian Centre for Architecture in Montreal and the Swedish Centre for Architecture and Design in Stockholm. The display supplies insight into the development behind both the internal and external spaces of the building, with sketches, black-and-white drawings and photographs. The materials used include graphite, coloured pencils, and pen and black ink, and the items were created between 1918 and 1921. See also Holy See San Giorgio Maggiore (Church) Wood shingle Hut References External links Venice Biennale The Vatican The Woodland Chapel Cini Foundation Alpi Buildings and structures in Venice Sacral architecture Chapels in Italy Christianity and nature
Asplund Pavilion
[ "Engineering" ]
1,560
[ "Sacral architecture", "Architecture" ]
75,322,386
https://en.wikipedia.org/wiki/Klick%20%28company%29
Klick is a global group of companies in the area of health marketing and advertising. History Klick was founded in 1997. The company introduced the SymPulse in 2017. Marketed as a "tele-empathy" device, it was designed to transmit the symptoms of Parkinson's disease so that they could be felt and understood by care providers. In 2021, it created "Community Unity", a video series with the Vaccine Confidence Project at the London School of Hygiene & Tropical Medicine which discusses frequently asked questions about the COVID-19 vaccines. Operations In 2006, Klick Health replaced its internal email systems with a new software program intended for messaging and project management. This program, named Genome, incorporates analytics to potentially improve productivity and has been associated with the agency's expansion. Company co-founder Leerom Segal credited these practices for Klick's low turnover rate of three percent, compared to seven percent across Canadian employers. The system was marketed to external clients as SenseiOS. The organizational structure of Klick Health has not included a human resources department, which was a point of concern for some employees. Its research and development division is Klick Labs. The division developed an AI program able to analyze people's voices to determine whether they have type 2 diabetes. Klick Health is described by the Yale School of Management as the "largest private health marketing company in the world". Awards Webby Award for its video for the "Kindness is Contagious" campaign from the Fred Rogers Center Multiple Clio Awards and Clio Independent Agency of the Year (2023, 2024) References External links 1997 establishments Marketing companies Advertising agencies of Canada Privately held companies of Canada Pharmaceutical industry
Klick (company)
[ "Chemistry", "Biology" ]
343
[ "Pharmaceutical industry", "Pharmacology", "Life sciences industry" ]
75,323,127
https://en.wikipedia.org/wiki/Stepan%20Badalov
Stepan Tigranovich Badalov (26 August 1919 – 17 June 2014) was a Soviet and Uzbek mineralogist and geochemist, Doctor of Geological and Mineralogical Sciences (1962) and Professor (1968). He was a Meritorious Scientist of the Republic of Uzbekistan and an Honorary Member of the Russian Mineralogical Society. He is considered the founder of isotope geochemistry. Life Badalov was born on 26 August 1919 in the city of Chorjou (now Türkmenabat) to a family of service workers. His childhood was spent in the cities of Kogon and Qo‘qon. In June 1941, he graduated from the Mining Faculty of the Central Asian Industrial Institute in Tashkent with a specialisation in geology and exploration of mineral resources. From July 1941 to May 1945, he participated in the German-Soviet War (in 1942 as a technician-intendant 2nd rank) and fought on the Stalingrad, Fourth Ukrainian and Baltic fronts. From 1 June 1946, he worked at the Institute of Geology of the Academy of Sciences of the Uzbek SSR (later renamed the Institute of Geology and Geophysics of the Academy of Sciences of the Uzbek SSR, and subsequently the Institute of Geology and Geophysics named after Khabib Abdullaev of the Academy of Sciences of the Republic of Uzbekistan). He rose from postgraduate student to head of the laboratory of geochemical cycles and processes. In 1950, he defended his candidate dissertation under the supervision of academician Alexander S. Uklonskiy on the topic "Mineralogy of the Uranium-Vanadium Deposit Temir-Kabuk (northern flank of the Nuratinskiy Mountains)". In 1962, he defended his doctoral dissertation on the topic "Mineralogy, Geochemistry and Genetic Features of Endogenous Deposits of the Almalyk Ore District". For over 40 years, he taught geochemistry and mineralogy at the geological faculty of Tashkent State University. In 1969, he created the "Periodic System of Protoisotopes of Chemical Elements", with the protoisotopes ordered according to atomic weights. He identified about 540 isotopes for 81 stable chemical elements, of which 270 were stable. He was the scientific supervisor of 4 doctoral and 42 candidate dissertations, including those of postgraduates from Vietnam, Mali and Afghanistan. He founded a scientific school in the field of mineralogy and geochemistry of deposits. Awards and honours Medal "For the Victory over Germany in the Great Patriotic War 1941–1945" Medal "For Labour Valour" Order of the Patriotic War USSR State Prize The minerals badalovite (IMA 2016-053) and manganobadalovite (IMA 2020-035) are named after him. Membership in organisations 1950: Russian Mineralogical Society; honorary member of the Russian Mineralogical Society (1992) Honorary Chairman of the Mineralogical Society of Uzbekistan References 1919 births 2014 deaths Uzbekistani people Uzbek Soviet Socialist Republic people Soviet geologists Recipients of the USSR State Prize People from Türkmenabat Soviet geochemists
Stepan Badalov
[ "Chemistry" ]
622
[ "Geochemists", "Soviet geochemists" ]
73,928,585
https://en.wikipedia.org/wiki/Binary%20trigger
A binary trigger (or pull-and-release trigger) is a type of device that allows a semi-automatic firearm to fire at an increased rate. A binary trigger works by firing one shot upon pulling the trigger and a subsequent shot upon releasing it. Binary triggers are installed through modification of the fire-control group: the preinstalled trigger of a particular firearm is replaced by the binary trigger assembly. As in all semi-automatic firearms, only one round is fired within a single function of the trigger. This allows guns outfitted with a binary trigger to avoid classification as machine guns within the definitions used by United States federal law, as stated by various ATF private-letter rulings. However, as with all private-letter rulings, these determinations on the U.S. legality of binary triggers are limited to the specific facts about the devices being examined. Any such legal opinion may be modified or revoked at any subsequent time by the Bureau of Alcohol, Tobacco, Firearms and Explosives. Furthermore, agency opinion is not always considered legally binding. Binary triggers became popular in the United States after the 2017 Las Vegas shooting, as trigger cranks and bump stocks, devices similarly used to increase firing rate, had largely disappeared from online sellers due to fear of legal repercussions. However, in the wake of the shooting, binary triggers also received scrutiny from media outlets. On January 1, 2025, a law banning binary triggers went into effect in Minnesota following their use in the 2024 Burnsville shooting. See also Bump stock Forced reset trigger Hell-fire trigger Trigger crank References Firearm components
Binary trigger
[ "Technology" ]
327
[ "Firearm components", "Components" ]
73,929,753
https://en.wikipedia.org/wiki/Trigger%20crank
A trigger crank is a device that allows a semi-automatic firearm to fire at an increased rate. The trigger crank typically consists of a screw-tight clamp and crank assembly. The crank assembly is clamped onto the trigger guard of a semi-automatic firearm, and the device is positioned in front of the trigger. When the crank is turned, tiny gears depress the trigger and cause the weapon to fire. Internally, the firearm is not altered; hence, only one round is fired with every stroke of the trigger. This allows the "trigger crank" to avoid classification as a machine gun for purposes of gun law in the United States, as stated in an IRS revenue ruling and various other private-letter rulings by the ATF. However, a battery-powered "trigger crank" (and by extension a Gatling gun) is a machine gun, as was determined by the ATF in 2004. The devices have drawn scrutiny from gun control advocates and media commentators because of the perceived lax regulation placed upon them. See also Bump stock Binary trigger Hell-fire trigger Forced reset trigger References Firearm components
Trigger crank
[ "Technology" ]
219
[ "Firearm components", "Components" ]
73,929,769
https://en.wikipedia.org/wiki/Malligyong-1
Malligyong-1 (, meaning Telescope-1) is a type of North Korean reconnaissance satellite. It is North Korea's first spy satellite. It is in a sun-synchronous orbit at about altitude and is intended to provide optical imaging surveillance of several countries. The resolution of its imaging capability is not generally known. The mission's first two launch attempts failed, with the third succeeding on 21 November 2023. This was also the first successful flight of North Korea's new launch vehicle, the Chollima-1. History First attempt The first launch attempt occurred on 31 May 2023. The second stage of the launch vehicle, Chollima-1, ignited too early into the mission, causing the mission to fail. Evacuation alerts were issued in Seoul and Okinawa Prefecture. The North Korean government quickly announced the launch failure. The debris crashed into the Yellow Sea, and South Korea attempted to salvage the remainder of the rocket, searching a site off the coast of Eocheongdo. The South Korean Ministry of Defence released an image of a white cylinder suspected to be a part of the rocket. North Korea's National Aerospace Technology Administration (NATA) said it would investigate before conducting a second satellite launch. The White House, Japan, and the UN Secretary-General condemned the launch, citing violations of Security Council resolutions prohibiting the use of ballistic missile technology. Second attempt A second launch attempt of the satellite took place on 23 August 2023, again onboard a Chollima-1 launch vehicle. The launch again resulted in a failure with the loss of the satellite, this time caused by an error in the emergency flight termination system during the third-stage flight. Third attempt A third launch attempt was initially scheduled for October 2023 but was later moved to November due to delays in fixing the technical issues that caused the previous failures. The launch took place on 21 November 2023. The South Korean news agency Yonhap quoted its counterpart in the North, the Korean Central News Agency, as saying the satellite had been successfully inserted into the predetermined orbit, resulting in the first successful flight of the Chollima-1 launch vehicle. However, no immediate independent observations could be made. The satellite has been confirmed to be in orbit; however, its status is not known. According to NATA, Kim Jong Un oversaw the launch. Status On 27 February 2024, South Korean Defense Minister Shin Won-sik stated that there were no signs of Malligyong-1 being operational and that North Korea might launch another satellite in March 2024. According to Dutch astronomer Marco Langbroek, from 18 February 2024 to 24 February 2024, Malligyong-1 made orbit-raising maneuvers to prolong its time in orbit and to circularize its orbit; this demonstrated that the satellite has onboard propulsion and is communicating with ground stations in North Korea. Commands to conduct the orbit-raising maneuvers were transmitted from North Korea. Additional orbit-raising maneuvers were made from 3 June to 7 June 2024 and from 6 September to 10 September 2024. Fourth attempt A fourth launch attempt of a new satellite, called Malligyong-1-1, took place on 27 May 2024, onboard an unnamed new launch vehicle using liquid-oxygen and petroleum propellants. The launch again resulted in a failure with the loss of the satellite. 
References 2023 controversies 2023 in North Korea Spacecraft launched in 2023 Satellite launch failures Space program of North Korea May 2023 events in Asia Astronomical controversies Military controversies Korean People's Army Air and Anti-Air Force
Malligyong-1
[ "Astronomy" ]
735
[ "Astronomical controversies", "History of astronomy" ]
73,930,088
https://en.wikipedia.org/wiki/Taeillo
Taeillo is a furniture company specialising in African-inspired designs. History The company was founded by Jumoke Dada in 2018. In December 2018, Taeillo ran an exhibit to showcase new designers of furniture, textiles and lighting. In February 2020, Taeillo set up an augmented reality exhibition at Lagos Social Media Week. In October 2021, Taeillo held an exhibition at the Eko Design series during Lagos Design Week. In June 2022, Taeillo was featured on CNN's Inside Africa. Funding Taeillo has raised about $3 million in funding over three rounds, starting from a $365,000 pre-seed round in 2019. Its first round was backed by Montane Capital, Co-Creation Hub (CcHub) and B-Knight. In January 2021, CcHub added another $150,000 investment. In December 2022, the company raised $2.5 million in seed funding from Lagos-based investor Aruwa Capital Management to help it scale its existing operations in Kenya. References Furniture manufacturers Design companies Nigerian companies established in 2018 Companies of Nigeria
Taeillo
[ "Engineering" ]
217
[ "Design", "Engineering companies", "Design companies" ]
73,931,619
https://en.wikipedia.org/wiki/Symbiomycota
Symbiomycota is a clade of fungi containing both Glomeromycota and Dikarya. It was supported by phylogenetic analyses based on RNA and multilocus DNA datasets. More recent analyses using genome-scale data have not supported this clade and have recovered Glomeromycota within Mucoromycota instead. It includes all mycorrhizal fungi; hence, the name references symbiosis, although Endogone is a rare exception, being a symbiotic fungus outside this group. References Fungi by classification Fungus taxonomy
Symbiomycota
[ "Biology" ]
134
[ "Fungi", "Eukaryotes by classification", "Fungi by classification" ]
73,932,106
https://en.wikipedia.org/wiki/SQL%3A2023
SQL:2023 or ISO/IEC 9075:2023 (under the general title "Information technology – Database languages – SQL") is the ninth edition of the ISO (1987) and ANSI (1986) standard for the SQL database query language. It was formally adopted in June 2023. New features SQL:2023 includes new and updated features. The changes can be grouped into three main areas: Property graph queries, a graph query language built on top of SQL The new part 16, "Property Graph Queries (SQL/PGQ)", has been added to the SQL standard. New features related to JSON JSON data type (T801) Enhanced JSON data type (T802) String-based JSON (T803) Hex integer literals in SQL/JSON path language (T840) SQL/JSON simplified accessor (T860–T864) SQL/JSON item methods (T865–T878) JSON comparison (T879–T882) Various smaller changes to the existing SQL language (all optional features): UNIQUE null treatment (F292) ORDER BY in grouped table (F868) GREATEST and LEAST (T054) String padding functions (T055) Multi-character TRIM function (T056) Optional string types maximum length (T081) Enhanced cycle mark values (T133) ANY_VALUE (T626) Underscores in numeric literals (T662) Property Graph Queries (SQL/PGQ) SQL/PGQ reduces the difference in functionality between relational DBMSs and native graph DBMSs. This new feature makes it easier to query data in tables as if it were in a graph database, providing a possibly more intuitive alternative to writing complex join queries. In comparison, the GQL standard for graph DBMSs adds graph updates, querying multiple graphs, and queries that return a graph result rather than a binding table. See also SQL/PGQ Property Graph Query SQL:2023 reserved words References External links Declarative programming languages Query languages Computer-related introductions in 2023
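Several of the features listed above are easiest to grasp from example statements. The following SQL sketch is illustrative only: the table and graph names (employees, bank_graph) and their columns are hypothetical, some systems require a FROM clause even for scalar expressions, and actual support for these SQL:2023 features varies by DBMS.

```sql
-- Underscores in numeric literals (T662)
SELECT 1_000_000 AS million;

-- GREATEST and LEAST (T054)
SELECT GREATEST(3, 1, 4) AS hi, LEAST(3, 1, 4) AS lo;

-- ANY_VALUE (T626): pick an arbitrary value from each group
SELECT dept, ANY_VALUE(name) AS sample_employee
FROM employees
GROUP BY dept;

-- Multi-character TRIM (T056)
SELECT TRIM(BOTH 'ab' FROM 'abXYZab') AS trimmed;

-- SQL/PGQ (part 16): match a pattern in a property graph
SELECT owner_name, amount
FROM GRAPH_TABLE (bank_graph
  MATCH (a IS account) -[t IS transfer]-> (b IS account)
  WHERE b.flagged = TRUE
  COLUMNS (a.owner AS owner_name, t.amount AS amount)
);
```

The GRAPH_TABLE query above illustrates the point made in the article: the one-hop transfer pattern would otherwise require an explicit join between an accounts table and a transfers table.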
SQL:2023
[ "Technology" ]
459
[ "Computing stubs" ]
73,933,322
https://en.wikipedia.org/wiki/Goldston%E2%80%93Pintz%E2%80%93Y%C4%B1ld%C4%B1r%C4%B1m%20sieve
The Goldston–Pintz–Yıldırım sieve (also called the GPY sieve or GPY method) is a sieve method and a variant of the Selberg sieve with generalized, multidimensional sieve weights. The sieve led to a series of important breakthroughs in analytic number theory. It is named after the mathematicians Dan Goldston, János Pintz and Cem Yıldırım. They used it in 2005 to show that there are infinitely many prime tuples whose distances are arbitrarily smaller than the average distance that follows from the prime number theorem. The sieve was then modified by Yitang Zhang in order to prove a finite bound on the smallest gap between two consecutive primes that is attained infinitely often. The sieve was later modified again by James Maynard (who lowered the bound to ) and by Terence Tao. Goldston–Pintz–Yıldırım sieve Notation Fix a and the following notation: is the set of prime numbers and the characteristic function of that set, is the von Mangoldt function, is the small prime omega function (which counts the distinct prime factors of ), is a set of distinct nonnegative integers . is another characteristic function of the primes, defined as Notice that . For an we also define , is the number of distinct residue classes of modulo . For example, and because and . If for all , then we call admissible. Construction Let be admissible and consider the following sifting function where is a weight function we derive later. For each , this sifting function counts the primes of the form minus some threshold , so if , then there exists some such that at least of them are prime numbers in . Since does not have such nice analytic properties, one instead chooses the following sifting function Since and , we have only if there are at least two prime numbers and . Next we have to choose the weight function so that we can detect prime k-tuples. Derivation of the weights A candidate for the weight function is the generalized von Mangoldt function which has the following property: if , then . This function also detects factors which are proper prime powers, but these can be removed in applications with a negligible error. So if is a prime k-tuple, then the function will not vanish. The factor is just for computational purposes. The (classical) von Mangoldt function can be approximated with the truncated von Mangoldt function where now no longer stands for the length of but for the truncation position. Analogously we approximate with For technical purposes we want to approximate tuples with primes in multiple components rather than solely prime tuples, and we introduce another parameter so that we can choose to have or fewer distinct prime factors. This leads to the final form Without this additional parameter one has, for a distinct , the restriction , but by introducing this parameter one gets the looser restriction . So one has a -dimensional sieve for a -dimensional sieve problem. Goldston–Pintz–Yıldırım sieve The GPY sieve has the following form with . Proof of the main theorem by Goldston, Pintz and Yıldırım Consider and and and define . In their paper, Goldston, Pintz and Yıldırım proved in two propositions that, under suitable conditions, two asymptotic formulas of the form and hold, where are two constants and are two singular series whose description we omit here. Finally, one can apply these results to to derive the theorem by Goldston, Pintz and Yıldırım on infinitely many prime tuples whose distances are arbitrarily smaller than the average distance. References Sieve theory
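Most of the displayed formulas in this article were lost in extraction. For orientation, the following LaTeX block restates the key objects in the standard form found in the GPY literature; the notation is the conventional one and may differ in minor details from the stripped originals.

```latex
% The von Mangoldt function:
\Lambda(n) =
  \begin{cases}
    \log p & \text{if } n = p^m \text{ for some prime } p \text{ and integer } m \ge 1,\\
    0      & \text{otherwise.}
  \end{cases}

% Its truncated version, with R the truncation position:
\Lambda_R(n) = \sum_{\substack{d \mid n \\ d \le R}} \mu(d) \log\frac{R}{d}.

% For a tuple H = \{h_1, \dots, h_k\}, write P(n; H) = \prod_{i=1}^{k} (n + h_i).
% H is admissible if \nu_p(H) < p for every prime p, where \nu_p(H) counts the
% distinct residue classes modulo p occupied by the elements of H.

% The multidimensional GPY weights, with an extra parameter 0 \le \ell \le k:
\Lambda_R(n; H, \ell) = \frac{1}{(k + \ell)!}
  \sum_{\substack{d \mid P(n; H) \\ d \le R}} \mu(d) \left( \log\frac{R}{d} \right)^{k + \ell}.
```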
Goldston–Pintz–Yıldırım sieve
[ "Mathematics" ]
763
[ "Sieve theory", "Combinatorics" ]
73,934,690
https://en.wikipedia.org/wiki/Airy%20process
The Airy processes are a family of stationary stochastic processes that appear as limit processes in the theory of random growth models and random matrix theory. They are conjectured to be universal limits describing the long time, large scale spatial fluctuations of the models in the (1+1)-dimensional KPZ universality class (Kardar–Parisi–Zhang equation) for many initial conditions (see also KPZ fixed point). The original process Airy2 was introduced in 2002 by the mathematicians Michael Prähofer and Herbert Spohn. They proved that the height function of a model from the (1+1)-dimensional KPZ universality class - the PNG droplet - converges under suitable scaling and initial condition to the Airy2 process and that it is a stationary process with almost surely continuous sample paths. The Airy process is named after the Airy function. The process can be defined through its finite-dimensional distribution with a Fredholm determinant and the so-called extended Airy kernel. It turns out that the one-point marginal distribution of the Airy2 process is the Tracy-Widom distribution of the GUE. There are several Airy processes. The Airy1 process was introduced by Tomohiro Sasamoto and the one-point marginal distribution of the Airy1 process is a scaled version of the Tracy-Widom distribution of the GOE. Another Airy process is the Airystat process. Airy2 process Let $t_1 < t_2 < \cdots < t_n$ be in $\mathbb{R}$. The Airy2 process $\mathcal{A}_2(t)$ has the following finite-dimensional distribution $\mathbb{P}\big(\mathcal{A}_2(t_1) \le x_1, \dots, \mathcal{A}_2(t_n) \le x_n\big) = \det\big(I - \chi_x K_{\mathrm{Ai}}^{\mathrm{ext}} \chi_x\big)_{L^2(\{t_1, \dots, t_n\} \times \mathbb{R})},$ where $\chi_x(t_i, y) = \mathbf{1}_{(x_i, \infty)}(y)$ and $K_{\mathrm{Ai}}^{\mathrm{ext}}$ is the extended Airy kernel $K_{\mathrm{Ai}}^{\mathrm{ext}}(t_1, x_1; t_2, x_2) = \int_0^{\infty} e^{-\lambda(t_1 - t_2)} \operatorname{Ai}(x_1 + \lambda) \operatorname{Ai}(x_2 + \lambda)\, d\lambda$ for $t_1 \ge t_2$ and $K_{\mathrm{Ai}}^{\mathrm{ext}}(t_1, x_1; t_2, x_2) = -\int_{-\infty}^{0} e^{-\lambda(t_1 - t_2)} \operatorname{Ai}(x_1 + \lambda) \operatorname{Ai}(x_2 + \lambda)\, d\lambda$ for $t_1 < t_2$. Explanations If $t_1 = t_2$ the extended Airy kernel reduces to the Airy kernel $K_{\mathrm{Ai}}(x, y) = \int_0^{\infty} \operatorname{Ai}(x + \lambda) \operatorname{Ai}(y + \lambda)\, d\lambda$ and hence $\mathbb{P}(\mathcal{A}_2(t) \le x) = F_2(x)$, where $F_2$ is the Tracy-Widom distribution of the GUE. $\chi_x K_{\mathrm{Ai}}^{\mathrm{ext}} \chi_x$ is a trace class operator on $L^2(\{t_1, \dots, t_n\} \times \mathbb{R})$ with counting measure on $\{t_1, \dots, t_n\}$ and Lebesgue measure on $\mathbb{R}$; the kernel is $\chi_x(t_i, y_1) K_{\mathrm{Ai}}^{\mathrm{ext}}(t_i, y_1; t_j, y_2) \chi_x(t_j, y_2)$. Literature References Stochastic processes Statistical mechanics
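To make the one-point marginal concrete, $F_2(s) = \det(I - K_{\mathrm{Ai}})_{L^2(s, \infty)}$ can be approximated numerically. The Python sketch below is an illustration, not from the article: it uses a Nyström discretization with Gauss–Legendre quadrature and truncates the half-line at $s + L$; the choices of $n = 60$ nodes and $L = 10$ are ad hoc. The diagonal of the Airy kernel is evaluated in its closed form $\operatorname{Ai}'(x)^2 - x \operatorname{Ai}(x)^2$.

```python
import numpy as np
from numpy.polynomial.legendre import leggauss
from scipy.special import airy

def airy_kernel(x, y):
    """Airy kernel K(x, y) with the correct limit on the diagonal."""
    ax, axp, _, _ = airy(x)
    ay, ayp, _, _ = airy(y)
    with np.errstate(divide="ignore", invalid="ignore"):
        off_diag = (ax * ayp - axp * ay) / (x - y)
    diag = axp**2 - x * ax**2  # limit of K(x, y) as y -> x
    return np.where(np.isclose(x, y), diag, off_diag)

def tracy_widom_gue_cdf(s, n=60, L=10.0):
    """Approximate F_2(s) = det(I - K_Ai) on L^2(s, infinity)."""
    nodes, weights = leggauss(n)          # nodes on [-1, 1]
    x = s + (nodes + 1.0) * L / 2.0       # map to the truncated interval [s, s+L]
    w = weights * L / 2.0
    X, Y = np.meshgrid(x, x)
    sq = np.sqrt(w)
    # Symmetrized Nystroem matrix: I - sqrt(w_i) K(x_i, x_j) sqrt(w_j)
    M = np.eye(n) - sq[:, None] * airy_kernel(X, Y) * sq[None, :]
    return np.linalg.det(M)

print(tracy_widom_gue_cdf(0.0))   # approx 0.969, the GUE Tracy-Widom CDF at 0
print(tracy_widom_gue_cdf(-2.0))  # a value near the bulk of the distribution
```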
Airy process
[ "Physics" ]
391
[ "Statistical mechanics" ]
73,935,344
https://en.wikipedia.org/wiki/Piaseczno%20and%20Gr%C3%B3jec%20Narrow-gauge%20Railway
The Piaseczno and Grójec Narrow-gauge Railway (Polish: Piaseczyńska Kolej Wąskotorowa), until 2004 known as the Grójec Commuter Railway (Polish: Grójecka Kolej Dojazdowa), is a tourist narrow-gauge railway in Warsaw, Poland, operated by the Piaseczno–Grójec Narrow-Gauge Railway Society on a metre-gauge (1,000 mm) line. It was originally opened in 1900 and operated as a local transit railway line serving Warsaw and its suburbs. Currently, most of the historical lines have been closed, with the remaining stations serving as a tourist attraction. References Metre gauge railways in Poland Companies set up in the Second Republic of Poland 1900 establishments in Poland History of transport in Warsaw Railway companies established in 1900 Railway lines opened in 1900 Heritage railways in Poland Transport in Warsaw
Piaseczno and Grójec Narrow-gauge Railway
[ "Physics" ]
185
[ "Physical systems", "Transport", "Transport stubs" ]
73,938,350
https://en.wikipedia.org/wiki/Skill-based%20matchmaking
Skill-based matchmaking (SBMM), also referred to as matchmaking ranking (MMR), is a form of matchmaking dependent on the relative skill level of the players involved. History A common rating system in chess is the Elo rating system, developed by Arpad Elo. Former International Chess Federation president Florencio Campomanes described it as an "inseparable partner to high-level chess". (A minimal Elo update sketch appears at the end of this article.) In 2006, Microsoft researchers proposed a skill-based rating system using Bayesian inference and deployed it on the Xbox Live network, then one of the largest deployments of a Bayesian inference algorithm. The researchers were displeased with the ranking system in the beta of Halo 2 (2004). By the time Halo 2 launched, it was using TrueSkill. The term skill-based matchmaking first appeared in a 2008 interview with game designer John Carmack in which he emphasized its importance in Quake Live (2010). Upon setting up an account with id Software, the game asks the player for their skill level and thereafter judges them by their performance. The presence—or lack thereof—of skill-based matchmaking became a point of contention. During the development of Dota 2 (2013), Valve Software believed that the barrier to entry could be solved with, among other things, skill-based matchmaking through its Steamworks service; when Call of Duty: Black Ops (2010) developer Treyarch was asked why the game wouldn't include skill-based matchmaking unlike Halo 3 (2007), multiplayer design director David Vonderhaar said that speed was "more important than anything else". Description Team-based, competitive games such as League of Legends (2009), Counter-Strike: Global Offensive (2012), Dota 2 (2013), and Overwatch (2016) benefit from skill-based matchmaking. In contrast, Call of Duty: Black Ops II (2012)—a game that primarily focuses on single-player accomplishments—does not benefit from skill-based matchmaking. Treyarch, who developed Call of Duty: Black Ops II, consciously queued players exclusively using ping and latency, in a subversion of industry standards at the time. Queues In skill-based matchmaking, queue design focuses on how to divide parties into appropriate skill groups. In contrast to StarCraft II (2010), which focuses on player-on-player action, Blizzard Entertainment's Heroes of the Storm (2015) is a team-based game. The Heroes of the Storm matchmaker aims to have players win at least half of the games that they play. The game's matchmaker also aims to pair coordinated teams with other coordinated teams in order to avoid an unfair communication advantage. According to game director Ben Brode, Hearthstone (2014) maintains a separate pool of new players. Players remain in the pool until they win ten games or obtain two legendary minions. Player rating model The skill rating of a player is a measure of their ability to win a match, estimated from aggregate data. Various models have emerged to achieve this. Mark Glickman implemented skill volatility into the Glicko rating system. In 2008, researchers at Microsoft extended TrueSkill for two-player games by describing a number for a player's ability to force draws. Variability in map, character, and server effects have been considered in at least two research papers. In 2016, two Cornell University graduates modeled skill rating as a vector of numbers, showing "substantial intransitivity". Reception Skill-based matchmaking is a controversial practice. 
In Call of Duty: Warzone (2020), streamers of the game often seek out "bot lobbies"—lobbies with less-skilled players. The Washington Post compared the practice to "LeBron James looking to join pickup games at the local YMCA". Call of Duty: Warzone players who have spoken out against the game's use of skill-based matchmaking include 100 Thieves CEO Nadeshot, former professional Counter-Strike: Global Offensive player Shroud, and 100 Thieves co-owner CouRage. Competitive Call of Duty: Warzone player HusKerrs wrote that high-skilled players must "sweat or try hard" in order to create engaging content. Streamer TimTheTatman refused to stream Call of Duty: Modern Warfare II (2022) upon discovering that it would implement skill-based matchmaking. In February 2023, Destiny 2 (2017) introduced skill-based matchmaking. Higher-skilled players subsequently discovered a way to enter lobbies with lower-skilled players, resulting in outcry from the community. References Citations Works cited Video game terminology
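As background to the rating systems discussed in the article, here is the classic Elo update in Python. This is a generic textbook sketch; the K-factor of 32 is one common convention, not a value used by any particular game mentioned above.

```python
def elo_expected(r_a, r_b):
    """Expected score of player A against player B."""
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400.0))

def elo_update(r_a, r_b, score_a, k=32):
    """Return updated ratings after one game.
    score_a is 1 for a win, 0.5 for a draw, 0 for a loss."""
    e_a = elo_expected(r_a, r_b)
    new_a = r_a + k * (score_a - e_a)
    new_b = r_b + k * ((1 - score_a) - (1 - e_a))
    return new_a, new_b

# A 1500-rated player upsets a 1700-rated player:
print(elo_update(1500, 1700, 1.0))  # approx (1524.3, 1675.7)
```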
Skill-based matchmaking
[ "Technology" ]
953
[ "Computing terminology", "Video game terminology" ]
73,938,464
https://en.wikipedia.org/wiki/Edmonton%20Queer%20History%20Project
The Edmonton Queer History Project (EQHP) is a community-engaged research project focused on documenting, preserving, and making visible the history of Edmonton's 2SLGBTQ+ community. One of EQHP's most prominent initiatives is a map of 27 locations around Downtown Edmonton, each with historical significance to the local 2SLGBTQ+ community, that was initially launched in March 2022. The Project also launched an interactive website, two podcasts (From Here to Queer and Vriend Versus Alberta), and regularly hosts walking tours following the EQHP downtown map to promote the city's queer history, which is often absent from school curricula and left out of public conversation. History The Edmonton Queer History Project, then known as the Queer History Project, began in 2015 as an interactive multi-media art exhibit to celebrate the 35th anniversary of the Edmonton Pride Festival. Dr. Kristopher Wells organized the project with funding support from the Edmonton Community Foundation and Dr. Michelle Lavoie curated an exhibit, called We Are Here: Queer History Project, detailing decades of Edmonton's 2SLGBTQ+ community member stories at the Art Gallery of Alberta, which ran from 5 June until 21 June 2015. Open calls for contributions of personal memorabilia and stories for the exhibit went out in January 2015, to ensure individual experiences were centred. Additionally, recording equipment was available for community members to add their stories to the project during the exhibit's run. The exhibit featured 21 interviews recorded with community leaders like Michael Phair as well as posters, photographs, and other ephemera. After its run at the Art Gallery of Alberta, We Are Here: Queer History Project became a travelling exhibition, visiting small towns in rural Alberta. Current work Following We Are Here: Queer History Project, the Project's members collated their research efforts into more accessible formats. In March 2022, EQHP launched a new website including a historical timeline of Edmonton's Pride Festival, a podcast called From Here to Queer, walking tours, and a map with 27 locations of historical significance to the local 2SLGBTQ+ community. Free physical copies of the map were made available at the Edmonton International Airport and various Edmonton Public Library locations. Dr. Kristopher Wells, Canada Research Chair and associate professor at MacEwan University, leads the project's research activities from MacEwan University's Centre for Sexual and Gender Diversity. Original members of the EQHP team include Dr. Kristopher Wells, Dr. Michelle Lavoie, Darrin Hagen, Michael Phair, Rob Browatzke, Kyler Chittick, Japkaran Saroya, and Paige Simpson. After a successful launch of the downtown map, EQHP was approached by the Old Strathcona Business Association to research queer history landmarks on the south side of the city, like the Pride Corner of Whyte Avenue. EQHP's work continues with the introduction of a community map tagging project, #EQHPStories, and a new podcast, produced in partnership with the Edmonton Community Foundation, entitled Vriend Versus Alberta, marking the 25th anniversary of Vriend v Alberta. Launched in 2023, #EQHPStories is an interactive map of Edmonton to which community members can add pins for locations across the city, along with their own stories. 
A collection of various local 2SLGBTQ+ magazines, newsletters, and other items is also being digitized as part of the Internet Archive for EQHP in collaboration with the MacEwan University Library and University of Alberta Library. Guided tours With support from the Edmonton Downtown Business Association, EQHP hosts free guided walking tours, as well as bus tours, that explore local queer history. The interactive 90-minute tours are intended for small groups to promote interaction and conversation amongst participants. Members of the EQHP team, such as Michael Phair and Darrin Hagen, lead the walking tours. Podcasts The original EQHP podcast, From Here to Queer, is hosted by local playwright and queer historian, Darrin Hagen. Exploring the people, places, and moments important to Edmonton's queer history, the podcast features guests like Edmonton's first openly gay city councillor, Michael Phair, Judge Julie Lloyd, and Alison Redford, the first female Premier of Alberta. Vriend Versus Alberta is the newest podcast from EQHP, produced in collaboration with the Edmonton Community Foundation, as a limited 10-part series. Also hosted by Darrin Hagen, Vriend Versus Alberta commemorates the 25th anniversary of the landmark legal case for 2SLGBTQ+ rights in Canada, Vriend v Alberta. Awards In September 2022, the Edmonton Queer History Project was shortlisted for the Governor General's History Award for Excellence in Community Programming. This award is one of multiple Governor General's Awards, recognizing organizations that "represent remarkable and inspiring initiatives that encourage public engagement in Canadian history." References External links Edmonton Queer History Project website Edmonton Queer History Collection on the Internet Archive #EQHPStories website From Here to Queer podcast LGBTQ in Alberta LGBTQ organizations based in Canada Organizations established in 2015 LGBTQ history in Canada Public awareness campaigns Research projects Social history organizations History organizations based in Canada Digital humanities projects Map websites History of Edmonton Historical geography of Canada Art exhibitions in Canada Multimedia works Popular scholarship Heritage interpretation MacEwan University Projects in North America Projects established in 2015 Collaborative mapping Digital history projects Digital library projects Cultural heritage of Canada
Edmonton Queer History Project
[ "Technology" ]
1,111
[ "Multimedia", "Multimedia works" ]
73,938,561
https://en.wikipedia.org/wiki/Monk%20Skin%20Tone%20Scale
The Monk Skin Tone Scale is an open-source, 10-shade scale describing human skin color, developed by Ellis Monk in partnership with Google and released in 2023. It is meant to replace the Fitzpatrick scale in fields such as computer vision research, after an IEEE study found the Fitzpatrick scale to be "poorly predictive of skin tone" and advised it "not be used as such in evaluations of computer vision applications." In particular, the Fitzpatrick scale was found to under-represent darker shades of skin relative to the global human population. The following table shows the 10 categories of the Monk Skin Tone Scale alongside the six categories of the Fitzpatrick scale, grouped into broad skin tone categories: Predecessor Computer vision researchers initially adopted the Fitzpatrick scale as a metric to evaluate how well a given collection of photos of people sampled the global population. However, the Fitzpatrick scale was developed to predict the risk of skin cancer in lighter-skinned people, and did not initially include darker skin tones at all. Two tones for darker people were later added to the original four tones to make it more inclusive. Despite these improvements, research has found that the Fitzpatrick Skin Tone correlated more with self-reported race than with objective measurements of skin tone, and that computer vision models trained using the Fitzpatrick scale perform poorly on images of people with darker skin. Use The Monk scale includes 10 skin tones. Though other scales (such as those used by cosmetics companies) may include many more shades, Monk claims that a 10-tone scale balances diversity with ease of use, and can be used more consistently across different users than a scale with more tones: "Usually, if you got past 10 or 12 points on these types of scales [and] ask the same person to repeatedly pick out the same tones, the more you increase that scale, the less people are able to do that. Cognitively speaking, it just becomes really hard to accurately and reliably differentiate." The primary intended application of the scale is in evaluating datasets for training computer vision models. Other proposed applications include increasing the diversity of image search results, so that an image search for "doctor" returns images of doctors with a broad range of skin tones. Google has cautioned against equating the shades in the scale with race, noting that skin tone can vary widely within race. The Monk scale is licensed under the Creative Commons Attribution 4.0 International license. See also Von Luschan's chromatic scale References Human skin color Color scales
Monk Skin Tone Scale
[ "Biology" ]
497
[ "Human skin color", "Pigmentation" ]
73,940,047
https://en.wikipedia.org/wiki/Forced%20reset%20trigger
A forced reset trigger (or "hard reset" trigger) is a device that allows a person to fire a semi-automatic firearm at an increased rate. The forced reset trigger works by mechanically resetting the trigger's position after a shot is fired. This allows for an increased rate of fire. However, the shooter must still manually pull the trigger each time it resets for any subsequent shot to be fired. Forced reset triggers are installed through replacement of the trigger control group. The preinstalled trigger of a particular firearm is replaced by the forced reset trigger's assembly. Typically, only one shot is fired per single function of the trigger. However, in the U.S., the ATF considers some forced reset triggers to be machineguns under the National Firearms Act. This determination by the ATF is being litigated by gun rights groups in the United States. On July 24, 2024, District Judge Reed O'Connor of the Wichita Falls Division of the United States District Court for the Northern District of Texas issued a vacatur, under the Administrative Procedure Act, of the ATF's determination that some forced reset triggers are machineguns, finding that the determination was "arbitrary and capricious". History A patent for a forced reset trigger titled "Flex Fire Technology" was filed by Thomas Allen Graves in 2015. Graves states that he initially began developing forced reset trigger technology in the 1970s, focusing on modifications to Colt Navy action revolvers. Over the following decades, Graves continued refining this technology, applying it to platforms such as the Ruger 10/22 carbine and the Bersa Thunder handgun. In 2014, he claims to have invented a forced reset trigger specifically for the AR-15 platform, subsequently acquiring a patent for it in 2015. See also Bump stock Hell-fire trigger Trigger crank Binary trigger References Firearms Firearm actions
Forced reset trigger
[ "Technology" ]
401
[ "Firearm components", "Components" ]
73,940,595
https://en.wikipedia.org/wiki/Accounts%20of%20Materials%20Research
Accounts of Materials Research is a monthly peer-reviewed scientific journal published in partnership between ShanghaiTech University and the American Chemical Society. It was recognized and supported by the Chinese government through the Action Plan for the Excellence of Chinese STM Journals in 2020. The journal is a subscription-access publication that has committed to publishing an increasing number of open access articles, with a future target of transitioning to 100% open access. Abstracting and indexing The journal is abstracted and indexed in: Chemical Abstracts Service Emerging Sources Citation Index Scopus According to the Journal Citation Reports, the journal has a 2022 impact factor of 14.6. References External links American Chemical Society academic journals ShanghaiTech University Academic journals established in 2020 Materials science journals English-language journals
Accounts of Materials Research
[ "Materials_science", "Engineering" ]
149
[ "Materials science journals", "Materials science" ]
73,940,850
https://en.wikipedia.org/wiki/Small%20Nozomi%20and%20Big%20Yume
Small Nozomi and Big Yume is a Japanese manga series written and illustrated by Sou Hamayumiba. It was serialized in Kodansha's manga magazine from May 2019 to January 2021, with its chapters collected in three volumes. Publication Written and illustrated by Sou Hamayumiba, Small Nozomi and Big Yume was serialized in Kodansha's manga magazine from May 22, 2019, to January 22, 2021. Kodansha collected its chapters in three volumes, released from October 23, 2019, to March 23, 2021. In North America, Kodansha USA licensed the manga for an English digital release. Volumes See also Dropout Idol Fruit Tart, another manga series by the same author Hanayamata, another manga series by the same author Notes References Further reading External links Comedy anime and manga Fiction about size change Kodansha manga Seinen manga Slice of life anime and manga
Small Nozomi and Big Yume
[ "Physics", "Mathematics" ]
181
[ "Fiction about size change", "Quantity", "Physical quantities", "Size" ]
73,941,595
https://en.wikipedia.org/wiki/Forest%20Ecosystems
Forest Ecosystems is a bimonthly peer-reviewed open access scientific journal covering research related to the structure and dynamics of "natural" and "domesticated" forest ecosystems. Previously published by Springer Nature, as of 2022 it is published by Elsevier on behalf of KeAi Communications. History The journal was established in 2014 by Weilun Yin (尹伟伦; Beijing Forestry University) and Klaus von Gadow (University of Göttingen). The journal is sponsored by the Beijing Forestry University. Academic conferences The journal organizes two international conferences each year in Beijing under the sponsorship of Beijing Forestry University. Editors-in-chief The editors-in-chief are John A. Kershaw (University of New Brunswick) and Osbert Jianxin Sun (孙建新; Beijing Forestry University). Abstracting and indexing The journal is abstracted and indexed in: BIOSIS Previews Biological Abstracts Current Contents/Agriculture, Biology & Environmental Sciences Science Citation Index Expanded Scopus The Zoological Record According to the Journal Citation Reports, the journal has a 2022 impact factor of 4.1. See also Agricultural and Forest Meteorology Urban Forestry and Urban Greening Forest Ecology and Management References External links Ecology journals Open access journals Elsevier Academic journals established in 2014
Forest Ecosystems
[ "Environmental_science" ]
268
[ "Environmental science journals", "Ecology journals" ]
73,944,672
https://en.wikipedia.org/wiki/List%20of%20mammals%20of%20Cantabria
The vertebrate fauna in Cantabria presents a wide diversity thanks to the variety of ecological niches existing in the community and its geographical position, equidistant between the Mediterranean region of the southern peninsula and the nearby region of Atlantic Europe. These lists show all the wild vertebrates living in Cantabria, classified according to the genus and family they belong to. In addition to the scientific name of each species, it also includes the common name in the Spanish language, the vernacular names most commonly used in this community, a brief description, a map of distribution in Spain and the conservation status. Mammals In Cantabria, 73 species of wild mammals can be found, grouped into 20 families. Some of them, such as the Spanish mole, the desman, the Granada hare and the broom hare, are considered Iberian endemisms, while others, such as the genet, the American mink or the coypu, are exotic species introduced by man. In terms of distribution, species such as the fox or the hedgehog are abundant throughout Cantabria, unlike the hare or the brown bear, whose distribution is much more scarce and localized. The conservation status of wild mammals in Cantabria is also diverse, with sixteen species near-threatened, fifteen vulnerable and one, the brown bear, in critical danger of extinction. Lagomorpha order Lagomorphs (Lagomorpha, from the Greek lagōs, hare and morphē, form) are an order belonging to the placental mammals related to rodents, from which they differ by possessing two pairs of upper incisors covered with a layer of enamel. Four species of lagomorphs are found in Cantabria, all of them within the family Leporidae (rabbits and hares). Three of them are endemic to the Iberian Peninsula and some, such as the broom hare, only inhabit the mountainous areas of the northern peninsula, between Galicia, Asturias, León and Cantabria. Rodentia order Rodents (Rodentia) are the most versatile and numerous order of mammals, with approximately 2280 living species (42% of all mammal species) distributed throughout all terrestrial and freshwater habitats. Their main common feature is their two large, continuously growing incisors, with which they crack seeds, gnaw wood, cut food or defend themselves against predators. Although they have developed a great variety of forms as a result of adaptation to different habitats and ecological niches, the species present in Cantabria have in common their relatively small size, thick grayish-brown fur and short legs and neck. The coypu, an invasive species native to South America and cited in the Soba valley, is the one that most deviates from the common pattern, slightly exceeding half a meter in length and six kilograms in weight. Eulipotyphla order Erinaceomorpha suborder The erinaceomorphs (Erinaceomorpha) are a suborder of placental mammals that includes a single family, the erinaceids (Erinaceidae), formerly included in the former order Insectivora. It includes the well-known hedgehogs of Eurasia and Africa and the subfamily Galericinae of Southeast Asia. Soricomorpha suborder The soricomorphs (Soricomorpha) are a suborder of placental mammals containing the families Nesophontidae, Solenodontidae, Soricidae, and Talpidae. Members of this order belonged to the extinct order Insectivora. Moles and shrews belong to this order. Chiroptera order The chiroptera or bats are an order of placental mammals whose upper limbs developed as wings. 
Three families of Chiroptera are present in Cantabria: the molossids, the rhinolophids or horseshoe bats and the vespertilionids. Of all of them, the most numerous is the family of vespertilionids, composed of twenty species gathered in six genera: for the genus Myotis, Myotis myotis, M. blythii, M. nattereri, M. daubentonii, M. bechsteinii, M. emarginatus, M. mystacinus and M. capaccinii; for the genus Miniopterus, Miniopterus schreibersii; for the genus Nyctalus, Nyctalus noctula, N. lasiopterus and N. leisleri; for the genus Pipistrellus, Pipistrellus pipistrellus, P. pygmaeus, P. kuhlii, P. nathusii and P. savii; for the genus Plecotus, Plecotus auritus and Plecotus austriacus; for the genus Eptesicus, Eptesicus serotinus; and for the genus Barbastella, a single species, Barbastella barbastellus. The family of horseshoe bats includes three species: the greater horseshoe bat (Rhinolophus ferrumequinum), the lesser horseshoe bat (Rhinolophus hipposideros) and the Mediterranean horseshoe bat (Rhinolophus euryale). Finally, the family Molossidae includes the species Tadarida teniotis. Carnivora order The carnivores (Carnivora) are an order of placental mammals, characterized by the shape of their molars. It takes its name from the adaptation of most of its members to meat consumption, although several are omnivores. Artiodactyla order The artiodactyla (Artiodactyla, from Greek άρτιος (ártios), "pair" and δάκτυλος (dáktylos), "finger") are an order of ungulate mammals whose limbs end in an even number of toes. The most developed toes are the third and fourth, which, except in hippopotamids, are the only ones that rest on the ground. Notes References Bibliography Cantabria Fauna of Spain Biodiversity lists Mammals by location Mammals of Europe Mammal images
List of mammals of Cantabria
[ "Biology" ]
1,285
[ "Biodiversity lists", "Biodiversity" ]
73,945,013
https://en.wikipedia.org/wiki/Peter%20Brimblecombe
Peter Brimblecombe (born 1949) is an Australian-born, British atmospheric chemist, currently emeritus professor of atmospheric chemistry at the University of East Anglia and National Sun Yat-sen University in Taiwan. In a five-decade research career, he has written or co-authored seven books and around 350 peer-reviewed papers on air pollution and its effects on human health and the environment, but is probably best known as the author of The Big Smoke, which has been described as a definitive history of air pollution. Education and career Brimblecombe was born in Canberra, Australia and educated at the University of Auckland, New Zealand, where he earned a BSc (1970), MSc (1971), and PhD in chemistry (1973). His thesis, studying the aqueous chemistry of environmental sulfur dioxide, was supervised by David John Spedding. Following his doctorate, he worked in Fiji for a year, lecturing in inorganic chemistry at the School of Natural Resources of the University of the South Pacific. In 1974, he relocated to Britain to become first a lecturer then a professor in atmospheric chemistry at the University of East Anglia (UEA), where he also served as associate dean from 2008 to 2011. Following his retirement, after four decades at UEA, he moved to Hong Kong and shifted the focus of his research to study air pollution in Asia. From 2013 to 2018, he was chair professor of environmental chemistry at the School of Energy and Environment, City University of Hong Kong, then became Distinguished Research Chair Professor at National Sun Yat-sen University in Taiwan. He is currently emeritus professor of atmospheric chemistry at the University of East Anglia and at National Sun Yat-sen University. Research interests Brimblecombe's wide-ranging research has covered many different aspects of atmospheric chemistry and air pollution, but also makes connections to broader history, art, and culture. As he put it in a 2009 lecture: "Environmental pollution is not merely a matter of environmental chemistry. The smells have to be smelt. Painting and poetry can be as informative as a scientific description when trying to understand the complexities of environmental problems". His 1987 book The Big Smoke: A History of Air Pollution in London since Medieval Times is highly cited and often described as a "definitive", "classic" history of air pollution, although historians' views of the book were mixed. He has published numerous papers on the effects of air pollution on historic buildings and monuments, and on both historical artifacts and everyday objects. In 2004, he was one of a group of experts from 10 countries involved in a three-year "Noah's Ark" project designed "to investigate the effects of climate change and pollution on Europe's historic built environment over the next 100 years". He has provided scientific advice on heritage and conservation to the Council of Europe, the European Parliament and the House of Lords. In the late 1990s, while working at UEA, Brimblecombe advised the National Trust on strategies to minimize the impact of dust on its historic collections, which led the organization to "ban" dusting for three years and prompted considerable news comment. Giles Whittell, writing in The Times, noted that Brimblecombe, "who may know more about dust than anyone in the world, has advised historic houses to guide their visitors along routes with as few sharp turns as possible and to position their most precious artefacts at the end of the tour, by which time fatigue has set in and people fidget less". 
In the same paper, Simon Jenkins described Brimblecombe as "the nation's mite-buster king-at-arms, who strikes terror in the sternest housekeeper" and expressed mixed views about the plan. Brimblecombe's recent research includes studies of how microplastics are carried through the environment, how COVID-19 affected air pollution, and how pollution is depicted in the work of artists and writers such as Monet and Dickens. Other activities Brimblecombe served as chief editor of the academic journal Atmospheric Environment and is currently editor in chief of the journal City and Environment Interactions. He sits on the editorial boards of the Journal of Cultural Heritage and Environmental Chemistry. He is a frequent media commentator on issues related to pollution and the environment, including such topics as the ozone layer, climate change, air pollution in China, atmospheric acidity and acid rain, and the 1952 Great Smog of London. Awards Brimblecombe has been awarded the 2005 Società Chimica Italiana Gold Medal for his environmental research and, as part of the Noah's Ark project, mapping the impacts of climate change on heritage, the 2009 European Union Prize for Cultural Heritage / Europa Nostra Awards Grand Prize, which recognizes excellence in heritage conservation. Selected publications Books Articles References External links 1949 births Living people University of Auckland alumni English chemists Academics of the University of East Anglia Environmental historians Environmental scientists Air pollution in the United Kingdom Scientific journal editors
Peter Brimblecombe
[ "Environmental_science" ]
1,001
[ "Australian environmental scientists", "Environmental scientists", "British environmental scientists" ]
73,945,875
https://en.wikipedia.org/wiki/Entoloma%20eugenei
Entoloma eugenei is a species of agaric (gilled mushroom) in the family Entolomataceae. The species has a temperate distribution in the Russian Far East, Japan, and Korea, occurring mainly in mixed hardwood forests. Threats to its habitat have resulted in Entoloma eugenei being assessed as globally "endangered" on the IUCN Red List of Threatened Species. Taxonomy The species was first described from the Russian Far East in 2010. Molecular research, based on cladistic analysis of DNA sequences, has shown that it belongs in the subgenus Leptonia. Description Basidiocarps are agaricoid, up to 80 mm (3 in) tall, the cap hemispherical at first becoming flat, up to 60 mm (2.4 in) across. The cap surface is smooth, finely velvety when young, and deep blue. The lamellae (gills) are white becoming pink from the spores. The stipe (stem) is finely squamulose, cap-coloured or paler, lacking a ring. The spore print is pink, the spores (under a microscope) multi-angled, inamyloid, measuring about 10 to 12.5 by 6 to 8 μm. Similar species Entoloma dichroum is similar, but typically smaller and occurs in Europe. Distribution and habitat Entoloma eugenei is rare and currently known from a very few sites in the Russian Far East and single sites in Japan and Korea. It occurs in mixed hardwood forests. Conservation Because of its rarity and threats to its habitat through logging and deforestation, the species is of global conservation concern and is listed as "endangered" on the IUCN Red List of Threatened Species. References External links Entolomataceae Fungi of Asia Fungi described in 2010 Taxa named by Machiel Noordeloos Fungus species
Entoloma eugenei
[ "Biology" ]
382
[ "Fungi", "Fungus species" ]
73,946,042
https://en.wikipedia.org/wiki/Honeycomb%20%28company%29
Honeycomb (stylized as honeycomb.io) is an American software company known for its eponymous observability and application performance management (APM) platform and for its diversity, equity, and inclusion (DEI) practices. Honeycomb's venture capital investors to date include Headline, Scale Venture Partners, and Insight Partners. Honeycomb's tooling enables software developers to debug live software applications, especially those using a microservice architecture. Honeycomb accepts telemetry from applications instrumented with the OpenTelemetry SDKs, in addition to Structured JSON data or other custom integrations. Honeycomb offers metrics and tracing visualizations as well as AI-assisted debugging capabilities. The underlying software is a proprietary columnar database running on Amazon Web Services. Amazon has promoted the company as an early adopter of the Graviton family of ARM processors. History Honeycomb was founded in 2016 in San Francisco by Charity Majors and Christine Yen, both of whom were engineers at Parse (later acquired by Facebook). After the acquisition of Parse by Facebook, Yen proposed to Majors that they found a startup together. Majors wanted to build a tool that fixes problems with live applications running in the cloud, inspired by the Scuba in-memory analytics tool they had used at Facebook. Investors invested $4 million in seed money, followed by $11M in Series A funding in 2019 and $20M in Series B funding in 2020. In 2021, Honeycomb announced a $50M Series C funding round. In 2023, Honeycomb announced that it had raised an additional $50 million in Series D funding, for a total of $150 million in funding to-date. In December 2021, Honeycomb began an experiment in co-determination by appointing an employee representative to its board of directors, an uncommon governance practice for a United States company. As of 2023, the executive team, board, and employees of Honeycomb are each at least half women and non-binary, making the company unusually gender diverse for a technology startup. References External links Official website 2016 software 2016 establishments in California American companies established in 2016 Software companies established in 2016 Software companies based in the San Francisco Bay Area Software companies of the United States Big data companies Proprietary software Software performance management System administration Systems management Website monitoring software
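As an illustration of the telemetry path described above, the following Python sketch sends a trace span over OTLP using the OpenTelemetry SDK. The endpoint api.honeycomb.io and the x-honeycomb-team header follow Honeycomb's commonly documented ingest configuration, but they, the API key placeholder, and the span names are assumptions for illustration, not guaranteed specifics of the product.

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

# Assumed Honeycomb OTLP ingest endpoint and auth header; the key is a placeholder.
exporter = OTLPSpanExporter(
    endpoint="api.honeycomb.io:443",
    headers=(("x-honeycomb-team", "YOUR_API_KEY"),),
)
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

# Emit one span with an attribute, as an application being debugged might.
tracer = trace.get_tracer("example-service")
with tracer.start_as_current_span("checkout") as span:
    span.set_attribute("cart.items", 3)  # arbitrary example attribute
```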
Honeycomb (company)
[ "Technology" ]
470
[ "Information systems", "System administration" ]
73,946,101
https://en.wikipedia.org/wiki/Titulus%20%28fortification%29
A titulus is a detached segment of rampart, found at the gateways of Roman camps from the middle of the second century BCE. The earliest known example is at Renieblas near Numantia. Examples from the Flavian and Antonine periods are common in Britain. They are interpreted as making it difficult for enemies to rush directly at the gates, but three unusually large examples (at the entrance to the southern Roman camp at the base of Burnswark Hill) may also have been the sites from which ballistas launched the stone missiles that have been found in the fort on Burnswark Hill itself. The neuter form titulum has also been used. References Fortifications Roman fortifications
Titulus (fortification)
[ "Engineering" ]
142
[ "Fortifications", "Military engineering" ]
78,277,947
https://en.wikipedia.org/wiki/Biosecure%20Act
The Biosecure Act, or H.R. 8333, was a bill introduced during the 118th United States Congress that would prohibit entities that receive federal funds from using biotechnology that is from a company associated with a U.S. foreign adversary. However, the bill was not included in the final legislation passed during the 118th United States Congress and as a result, it did not become law in 2024. Bill summary The Biosecure Act prohibits entities that receive federal funds from using biotechnology that is from a company associated with a foreign adversary. Specifically, U.S. federal agencies and recipients of federal funds (e.g., grantees) may not procure or use any biotechnology equipment or service that is from a biotechnology company of concern and may not contract with any entities that do so. A biotechnology company of concern is defined in this bill as "an entity that is under the control of a foreign adversary and that poses a risk to national security based on its research or multiomic data collection (e.g., collection of genomic information)." The Office of Management and Budget (OMB) must, in coordination with the Department of Defense (DOD) and other specified agencies, develop a list of prohibited companies; the list must include five particular companies, as specified in the bill. OMB and DOD may approve waivers of these restrictions on an as-needed basis, which are valid for up to one year and may be extended once for an additional 180 days. In addition, the Office of the Director of National Intelligence must report on the national security risks posed by (1) multiomic data collection by foreign adversaries in connection with biotechnology equipment or services, and (2) biotechnology companies that have such data. Legislative support and activity In May 2024, Representatives Brad Wenstrup and Raja Krishnamoorthi introduced H.R. 8333 in the U.S. House of Representatives. Upon introducing the legislation, the lawmakers said: "The Chinese Communist Party's (CCP) national security laws require all Chinese firms to share any requested data with the CCP, including biotechnology companies that collect, test, and store American genomic data. Beijing Genomics Institute (BGI), a company in the People's Republic of China (PRC), has collected DNA from millions around the world and used that data without consent on genomic projects conducted by the Chinese military. Chinese company WuXi AppTec has sponsored events with China's military, reportedly stolen U.S. IP, and jointly operated genetic collection sites with China's military." Upon introduction, the bill was referred to the Committee on Oversight and Accountability. On May 15, 2024, the Committee advanced the bill in a 40–1 vote, demonstrating unusually high bipartisan support. On September 9, 2024, the U.S. House passed the Biosecure Act by a vote of 306–81, well exceeding the two-thirds vote required to pass. The bill was then sent to the U.S. Senate and referred to the Committee on Homeland Security and Governmental Affairs. In December 2024, WuXi AppTec announced that it would sell its Oxford Genetics and WuXi Advanced Therapies units to a U.S.-based private equity firm, reportedly due to the Biosecure Act, which was deterring potential clients. References Proposed legislation of the 118th United States Congress United States biotechnology law National security of the United States
Biosecure Act
[ "Biology" ]
697
[ "Biotechnology law", "United States biotechnology law" ]
78,280,086
https://en.wikipedia.org/wiki/C/1913%20Y1%20%28Delavan%29
Comet Delavan, formally designated as C/1913 Y1, is a hyperbolic comet discovered by astronomer Pablo T. Delavan on December 18, 1913, from the La Plata Observatory in Argentina. The comet was last seen on September 19, 1915. It is one of 19 comets used in the original sample by Jan Oort for his hypothesis regarding the origin of long-period comets in 1950. References Hyperbolic comets Non-periodic comets
C/1913 Y1 (Delavan)
[ "Astronomy" ]
92
[ "Astronomy stubs", "Comet stubs" ]
78,281,575
https://en.wikipedia.org/wiki/Ammonium%20dihydrogen%20arsenate
Ammonium dihydrogen arsenate is an inorganic chemical compound with the chemical formula NH4H2AsO4. Synthesis It can be prepared by the action of ammonia on a concentrated solution of arsenic acid: NH3 + H3AsO4 → NH4H2AsO4. Physical properties The compound forms colorless crystals, soluble in water. Uses The compound is used as a pharmaceutical intermediate. It is ferroelectric and is used as a nonlinear optics material. References Arsenic compounds Ammonium compounds Acid salts
Ammonium dihydrogen arsenate
[ "Chemistry" ]
76
[ "Acid salts", "Ammonium compounds", "Salts" ]
78,282,552
https://en.wikipedia.org/wiki/HD%20861
HD 861 is a spectroscopic binary star system in the deep northern constellation of Cassiopeia. With an apparent magnitude of 6.622, the star is faintly visible to the naked eye under very dark skies and readily visible using binoculars. It is located approximately distant according to Gaia EDR3 parallax measurements, and is moving further away at a heliocentric radial velocity of 8.80 km/s. Stellar properties The primary star is a typical Am star, enriched in iron and especially so in barium but depleted in carbon, oxygen and calcium. At an age of 724 million (10^8.86) years, it is currently a main-sequence star fusing hydrogen into helium at its core. It will continue to do so for the next 320 million years until it runs out of core hydrogen at 1.05 billion (10^9.02) years old, at which point it will leave the main sequence and enter the subgiant phase. The secondary star is a G-type main-sequence star slightly less massive than the Sun and less than half as luminous. Orbit The orbital properties of the companion were first determined in 1971 by Acker, with an orbital period of 11.2153 days and an eccentricity of 0.22. In 2002, however, Debernardi found an entirely different set of orbital parameters in his PhD thesis and also discovered the stellar spectra of the secondary star. This new orbit has a longer period of 15.9696 days and a lower eccentricity of 0.124. This was backed up by Budaj et al., who also independently found the secondary spectra and obtained a mass ratio between the two stars that agreed with Debernardi's research. Notes References Am stars A-type main-sequence stars G-type main-sequence stars Cassiopeia (constellation) Spectroscopic binaries 000861 001063 BD+61 00016 J00131272+6202271
HD 861
[ "Astronomy" ]
407
[ "Cassiopeia (constellation)", "Constellations" ]
78,283,023
https://en.wikipedia.org/wiki/PKS%202004-447
PKS 2004-447 is a narrow-line Seyfert 1 galaxy located in the constellation of Sagittarius. It has a redshift of z = 0.24 and is the radio-loudest gamma ray emitting AGN known in the southern hemisphere. It was first identified as an astronomical radio source during a very-long-baseline interferometry survey in 1989. The radio spectrum appears to be powerful and compact, making it a compact steep spectrum source. The X-ray emission for this source is described by a simple power-law in the 0.5-10 keV energy range. PKS 2004-447 is classified as a blazar. It is variable across the electromagnetic spectrum and is a source of gamma ray activity. In October 2019, it was found to exhibit a gamma ray flare, reaching a highest flux of (1.3 ± 0.2) x 10^-6 photons cm^-2 s^-1 at gamma-ray energies of 0.1-300 GeV. A further study also showed that during the post-flare period PKS 2004–447 exhibited Balmer, Paschen and helium emission lines, with these lines vanishing within a period of 1.5 years. PKS 2004-447 has a core-jet structure with an angular size measuring 40 mas. Its radio emission is found to be mainly dominated by the radio core, accounting for 42 percent of its total flux density at 1.4 GHz. There is a jet structure emerging from the core with a projected position angle measurement of -90°, which bends to -60° at 20 mas. This jet structure is further divided into two subcomponents completely enveloped by diffuse emission. The host galaxy of PKS 2004-447 is a late-type barred disk galaxy with a stellar mass of ~7 x 10^11 M☉, estimated from its K-band luminosity. It has a bulge showing a relatively low Sersic index of n ~ 1.2 with a bulge-to-total ratio of B/T = 0.39 ± 0.02 in the J-band and B/T = 0.44 ± 0.03 in the Ks-band. These values show that the bulge of PKS 2004-447 is of pseudobulge morphology. The host galaxy shows two faint spiral arms and has an estimated black hole mass of 9 x 10^7 M☉. References External links PKS 2004-447 on SIMBAD PKS 2004-447 on NASA/IPAC Database Seyfert galaxies Active galaxies Sagittarius (constellation) 2830329 Astronomical objects discovered in 1989 Blazars
PKS 2004-447
[ "Astronomy" ]
555
[ "Sagittarius (constellation)", "Constellations" ]
78,283,444
https://en.wikipedia.org/wiki/Robotics%20engineering
Robotics engineering is a branch of engineering that focuses on the conception, design, manufacturing, and operation of robots. It involves a multidisciplinary approach, drawing primarily from mechanical, electrical, software, and artificial intelligence (AI) engineering. Robotics engineers are tasked with designing these robots to function reliably and safely in real-world scenarios, which often require addressing complex mechanical movements, real-time control, and adaptive decision-making through software and AI. Fundamental disciplines Robotics engineering combines several technical disciplines, all of which contribute to the performance, autonomy, and robustness of a robot. Mechanical engineering and kinematics Mechanical engineering is responsible for the physical construction and movement of robots. This involves designing the robot's structure, joints, and actuators, as well as analyzing its kinematics and dynamics. Kinematics Kinematic models are essential for controlling the movements of robots. Robotics engineers use forward kinematics to calculate the positions and orientations of a robot's end-effector, given specific joint angles, and inverse kinematics to determine the joint movements necessary for a desired end-effector position. These calculations allow for precise control over tasks such as object manipulation or locomotion; a minimal worked two-link example appears at the end of this article. Actuation and materials Robotics engineers select actuators—such as electric motors, hydraulic systems, or pneumatic systems—based on the robot's intended function, power needs, and desired performance characteristics. Materials used in the construction of robots are also carefully chosen for strength, flexibility, and weight, with lightweight alloys and composite materials being popular choices for mobile robots. Electrical and electronics engineering Robots depend on electrical systems for power, communication, and control. Power management Powering a robot's motors, sensors, and processing units requires sophisticated electrical circuit design. Robotics engineers ensure that power is distributed efficiently and safely across the system, often using batteries or external power sources in a way that minimizes energy waste. Signal processing and sensors A robot's ability to interact with its environment depends on interpreting data from various sensors. Electrical engineers in robotics design systems to process signals from cameras, LiDAR, ultrasonic sensors, and force sensors, filtering out noise and converting raw data into usable information for the robot's control systems. Software engineering Software engineering is a fundamental aspect of robotics, focusing on the development of the code and systems that control a robot's hardware, manage real-time decision-making, and ensure reliable operation in complex environments. Software in robotics encompasses both low-level control software and high-level applications that enable advanced functionalities. Embedded systems Robotics engineers develop embedded systems that interface directly with a robot's hardware, managing actuators, sensors, and communication systems. These systems must operate in real-time to process sensor inputs and trigger appropriate actions, often with strict constraints on memory and processing power. Software architectures and frameworks Modern robots rely on modular and scalable software architectures. 
A popular framework in the field is the Robot Operating System (ROS), which facilitates communication between different subsystems and simplifies the development of robotic applications. Engineers use such frameworks to build flexible systems capable of handling tasks such as motion planning, perception, and autonomous decision-making. Real-time systems Robots frequently operate in environments where real-time processing is critical. Robotics engineers design software that can respond to sensor data and control actuators within tight time constraints. This includes optimizing algorithms for low-latency and developing robust error-handling procedures to prevent system failure during operation. AI engineering AI engineering plays an increasingly critical role in enabling robots to perform complex, adaptive tasks. It focuses on integrating artificial intelligence techniques such as machine learning, computer vision, and natural language processing to enhance a robot's autonomy and intelligence. Perception and computer vision Robots equipped with AI-powered perception systems can process and interpret visual and sensory data from their surroundings. Robotics engineers develop algorithms for object recognition, scene understanding, and real-time tracking, allowing robots to perceive their environment in ways similar to humans. These systems are often used for tasks such as autonomous navigation or grasping objects in unstructured environments. Machine learning for control and decision-making Machine learning techniques, particularly reinforcement learning and deep learning, allow robots to improve their performance over time. Robotics engineers design AI models that enable robots to learn from their experiences, optimizing control strategies and decision-making processes. This is particularly useful in environments where pre-programmed behavior is insufficient, such as in search and rescue missions or unpredictable industrial tasks. Control systems and feedback loops Control systems engineering ensures that robots move accurately and perform tasks in response to environmental stimuli. Robotics engineers design control algorithms that manage the interaction between sensors, actuators, and software. Closed-loop control Most robots rely on closed-loop control systems, where sensors provide continuous feedback to adjust movements and behaviors. This is essential in applications like robotic surgery, where extreme precision is required, or in manufacturing, where consistent performance over repetitive tasks is critical. Adaptive and nonlinear control systems For more advanced applications, robotics engineers develop adaptive control systems that can modify their behavior in response to changing environments. Nonlinear control techniques are employed when dealing with complex dynamics that are difficult to model using traditional methods, such as controlling the flight of drones or autonomous underwater vehicles. Key tools and technologies Robotics engineers leverage a wide array of software tools and technologies to design, test, and refine robotic systems. Simulation software Before physical prototypes are created, robotics engineers use advanced simulation software to model and predict the behavior of robotic systems in virtual environments. MATLAB and Simulink are standard tools for simulating both the kinematics (motion) and dynamics (forces) of robots. 
These platforms allow engineers to develop control algorithms, run system-level tests, and assess performance under various conditions without needing physical hardware. ROS (Robot Operating System) is another key framework, facilitating the simulation of robot behaviors in different environments. CAD and 3D modeling For mechanical design, robotics engineers use Computer-Aided Design (CAD) software, such as SolidWorks, AutoCAD, and PTC Creo, to create detailed 3D models of robotic components. These models are essential for visualizing the physical structure of the robot and for ensuring that all mechanical parts fit together precisely. CAD models are often integrated with simulation tools to test mechanical functionality and detect design flaws early in the process. Rapid prototyping and 3D printing Once the designs are verified through simulation, rapid prototyping technologies, including 3D printing and CNC machining, allow for the fast and cost-effective creation of physical prototypes. These methods enable engineers to iterate quickly, refining the design based on real-world testing and feedback, reducing the time to market. Finite element analysis (FEA) To ensure the robustness and durability of robotic components, engineers perform structural testing using finite element analysis (FEA) software like ANSYS and Abaqus. FEA helps predict how materials will respond to stress, heat, and other environmental factors, optimizing designs for strength, efficiency, and material usage. Hardware-in-the-loop (HIL) testing To bridge the gap between simulation and physical testing, robotics engineers often use hardware-in-the-loop (HIL) systems. HIL testing integrates real hardware components into simulation models, allowing engineers to validate control algorithms and system responses in real-time without needing the full robotic system to be built, thus reducing risks and costs. Challenges The complexity of robotics engineering presents ongoing challenges. Robustness and fault tolerance Designing robots that can reliably operate in unpredictable environments is a key engineering challenge. Engineers must create systems that can detect and recover from hardware malfunctions, sensor failures, or software errors. This is important in mission-critical applications such as space exploration or medical robotics. Safety in human-robot interaction Ensuring safety in human-robot interaction is a significant challenge in the field of robotics engineering. In addition to technical aspects, such as the development of sensitive control systems and force-limited actuators, engineers must address the ethical and legal implications of these interactions. AI algorithms are employed to enable robots to anticipate and respond to human behavior in collaborative environments; however, these systems are not without flaws. When errors occur—such as a robot misinterpreting human movement or failing to halt its actions in time—the issue of responsibility arises. This question of accountability poses a substantial ethical dilemma. Should the responsibility for such errors fall upon the engineers who designed the robot, the manufacturers who produced it, or the organizations that deploy it? Furthermore, in cases where AI algorithms play a key role in the robot's decision-making process, there is the added complexity of determining whether the system itself could be partly accountable. 
This issue is particularly pertinent in industries such as healthcare and autonomous vehicles, where mistakes may result in severe consequences, including injury or death. Current legal frameworks in many countries have not yet fully addressed the complexities of human-robot interaction. Laws concerning liability, negligence, and safety standards often struggle to keep pace with technological advancements. The creation of regulations that clearly define accountability, establish safety protocols, and safeguard human rights will be crucial as robots become increasingly integrated into daily life. Optimization of motion and energy efficiency Robotics engineers must balance the need for high performance with energy efficiency. Motion-planning algorithms and energy-saving strategies are critical for mobile robots, especially in applications like autonomous drones or long-duration robotic missions where battery life is limited. References Robotics Computer engineering Robotics engineering Engineering disciplines
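To make the article's earlier mention of ROS concrete, here is a minimal publisher node in the ROS 2 Python client library (rclpy); the node name, topic name, and message content are invented for illustration, and a real robot would publish typed sensor messages rather than strings.

```python
import rclpy
from rclpy.node import Node
from std_msgs.msg import String

class SensorPublisher(Node):
    """Toy node that publishes a fake sensor reading at 10 Hz."""

    def __init__(self):
        super().__init__('sensor_publisher')          # node name (invented)
        self.pub = self.create_publisher(String, 'sensor_data', 10)
        self.timer = self.create_timer(0.1, self.tick)

    def tick(self):
        msg = String()
        msg.data = 'distance: 1.23'                   # placeholder reading
        self.pub.publish(msg)

def main():
    rclpy.init()
    rclpy.spin(SensorPublisher())
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```

Other nodes can subscribe to the same topic, which is how ROS decouples perception, planning, and control subsystems.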
Robotics engineering
[ "Technology", "Engineering" ]
1,927
[ "Computer engineering", "Robotics engineering", "Automation", "Robotics", "nan", "Electrical engineering" ]
78,283,507
https://en.wikipedia.org/wiki/C/1915%20C1%20%28Mellish%29
Comet Mellish, also known formally as C/1915 C1, is one of five comets discovered by American astronomer John E. Mellish. It is a hyperbolic comet that reached perihelion on July 17, 1915. However, just two months earlier, Edward E. Barnard had reported that the comet had split into three distinct objects by May 12, later increasing to four by May 24. In addition, it is thought that this comet was the parent body of the June Lyrids meteor shower, which was first discovered in 1966. References External links Non-periodic comets Hyperbolic comets Split comets Meteor shower progenitors
C/1915 C1 (Mellish)
[ "Astronomy" ]
128
[ "Astronomy stubs", "Comet stubs" ]
78,283,566
https://en.wikipedia.org/wiki/Zytron
Zytron, also known as DMPA, is a chlorophenoxy herbicide. It controls crabgrass and other weeds in turf pre-emergently, as well as ants, chinch bugs and grubs. It is used on baseball pitches in Australia. Zytron inhibits microtubule assembly, preventing mitosis, making it a Group 3 / D / K1 herbicide, similar to dinitroanilines like trifluralin. It was tested and commercially available in the US in 1959, and applied at 10–20 lbs per acre on turf, a high rate compared to other herbicides. Zytron disappears almost completely from the body within one hour of mammalian exposure. It does not accumulate in soil and is not harmful to soil microflora. DMPA has in testing been applied at rates as high as 67 lbs per acre. Zytron may cause neurotoxicity in chickens. It is an organophosphorus ester, and other such chemicals are known to cause similar neurotoxicity. 100 mg/kg daily for 10 days was considered the minimum effective dose to observably alter hens' behaviour. Zytron has been sold under the tradenames "Dow Crabgrass Killer", "Dow 1329", "Dowco 118" and "T-H Crabgrass Killer." References Herbicides Chloroarenes Thiophosphoryl compounds Isopropylamino compounds Methyl esters Phosphoramidothioates
Zytron
[ "Chemistry", "Biology" ]
316
[ "Herbicides", "Functional groups", "Phosphoramidothioates", "Biocides", "Thiophosphoryl compounds" ]
78,284,162
https://en.wikipedia.org/wiki/Holafly
Holafly is an international eSIM provider based in Spain. History Holafly was founded in Murcia in 2017 by Pedro Máiquez and Lydia Hu. In 2019, Holafly joined Lanzadera's acceleration program. With the help of the program, Holafly expanded its services to include eSIM without roaming charges in countries such as Germany and France, and moved its headquarters to Valencia. eSIM Holafly was an early adopter of eSIM technology and provides prepaid data plans to international travelers in over 200 countries. The Holafly app is available on iOS and Android. The app helps users monitor their data usage. Reception Ritoban Mukherjee, writing in his review for TechRadar, described the onboarding process as "user-friendly" and its customer support as "excellent". References External links 2017 establishments in Spain Companies based in Valencia ESIM companies
Holafly
[ "Technology" ]
185
[ "ESIM companies", "Mobile technology companies" ]
78,284,627
https://en.wikipedia.org/wiki/Hajo%20Leschke
Hajo Leschke (born 11 February 1945 in Wentorf bei Hamburg) is a German mathematical physicist and (semi-)retired professor of theoretical physics at the Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU). He is known for rigorous results on model systems in quantum (statistical) mechanics obtained through functional-analytic and probabilistic techniques, jointly with his (former) students and other co-workers. His research topics include: Peierls Transition, Functional Formulations of Quantum and Stochastic Dynamics, Pekar–Fröhlich Polaron, Quantum Spin Chains, Feynman–Kac Formulas, (Random) Schrödinger Operators, Landau-Level Broadening, Lifschitz Tails, Anderson Localization, Fermionic Entanglement Entropies, Quantum Spin Glasses. Academic education Leschke studied physics and mathematics at the Universität Hamburg and graduated with a diploma in physics (1970) under thesis advisor Wolfgang Kundt (born 1931). He received his doctorate in physics (1975) under dissertation advisor Uwe Brandt (1944–1997) from the [Technische] Universität Dortmund, where he also earned the habilitation in physics (1981). His studies were supported by the Studienstiftung des deutschen Volkes (German Academic Scholarship Foundation) and the Kurt-Hartwig-Siemers–Wissenschaftspreis on the recommendation of Werner Döring (1911–2006) and of Pascual Jordan (1902–1980), respectively. Career Leschke was a research (and teaching) assistant to Ludwig Tewordt (1926–2016) at the Universität Hamburg, to Uwe Brandt at the Universität Dortmund, to Herbert Wagner (born 1935) at the Forschungszentrum Jülich (then: KFA Jülich), and to Richard Bausch (born 1935) at the [Heinrich-Heine–]Universität Düsseldorf (HHU) before he became a professor there in 1982 and at the FAU in 1983. In 1987, he was a guest professor at the University of Georgia, Athens (UGA) with host David P. Landau (born 1941). In 2004, he organized the workshop "Mathematics and physics of disordered systems" jointly with Michael Baake, Werner Kirsch, and Leonid A. Pastur at the Mathematisches Forschungsinstitut Oberwolfach (MFO), Germany. In 2017, he organized the workshop "Fisher–Hartwig asymptotics, Szegő expansions, and applications to statistical physics" jointly with Alexander V. Sobolev and Wolfgang Spitzer at the American Institute of Mathematics (AIM), then located in San Jose, California. From 1998 to 2011 Leschke belonged to the advisory board of the Annalen der Physik, then edited by Ulrich Eckern (born 1952) at the Universität Augsburg. Notable students Notable doctoral students of Leschke include Peter Müller (born 1967) and Simone Warzel (born 1973). The first one is professor of mathematics at the Ludwig-Maximilians-Universität (LMU) in Munich and dean of the Faculty of Mathematics, Informatics, and Statistics (2021–2025). The second one is professor of mathematics at the Technische Universität München (TUM) in Garching near Munich. Research achievements Leschke's research publications listed below all refer to properties of non-relativistic quantum systems which are modeled by some Hamiltonian, that is, by some self-adjoint operator on Hilbert space representing the total energy of the system, possibly depending on random variables representing disorder. In the publications from 2000 to 2017 the Hamiltonian is of Schrödinger type, that is, an operator for the sum of the kinetic and potential energy of "point-like" particles in Euclidean space. 
The two publications with Kurt Broderix (1962–2000) extend previously known continuity properties of the corresponding one-parameter Schrödinger semi-group (or Gibbs operator for different temperatures) to rather general magnetic fields and to (random) potential fields possibly leading to unbounded semi-groups, by suitably extending the Feynman–Kac formula and using the diamagnetic inequality. The other three publications from 2000 to 2004 consider the case of a single particle subject to a constant magnetic field and a random potential field. For a Poissonian field with positive single-impurity potential U the low-energy behavior of the integrated (or cumulative) density of states is derived, depending on the range of U. For a Gaussian random field (without an underlying lattice structure) the first proofs are given for the existence of the density of states and of Anderson localization in multi-dimensional continuous space. The publications in 2014 and 2017 refer to the case of many non-interacting particles which obey Fermi–Dirac statistics. For the corresponding ideal Fermi gas in thermodynamic equilibrium they contain the first rigorous results on the asymptotic growth of its quantum Rényi entropies of (spatial) entanglement at arbitrary temperature. These results have served as a standard of comparison for approximate arguments and/or numerical methods to better understand the correlations in many-fermion systems with interaction. The publications in 2021 are among the first ones providing rigorous results on quantum versions of the classical Sherrington–Kirkpatrick spin-glass model. In particular, they prove for the first time the existence of a phase transition (related to spontaneous replica-symmetry breaking) if the temperature and the strength of the "transverse" magnetic field are low enough. The publication in 2023 illuminates this phase transition's relevance to the quantum-annealing algorithm in computer science. Selected publications since 2000 References External links Homepage at the FAU Hajo Leschke at the Mathematics Genealogy Project (MGP) Living people 1945 births Studienstiftung alumni 20th-century German physicists 21st-century German physicists German theoretical physicists Theoretical physicists Mathematical physicists University of Hamburg alumni Technical University of Dortmund alumni Academic staff of Heinrich Heine University Düsseldorf Academic staff of the University of Erlangen-Nuremberg
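For readers unfamiliar with the Feynman–Kac formula mentioned in the discussion of the Broderix–Leschke papers, its standard form for Schrödinger semigroups is sketched below. This is background added by the editor with a conventional normalization; it is not quoted from Leschke's publications.

```latex
% Feynman-Kac formula for the Schrödinger semigroup of H = -\tfrac{1}{2}\Delta + V:
\big(\mathrm{e}^{-tH}f\big)(x)
  = \mathbb{E}_x\!\left[\exp\!\Big(-\int_0^t V(B_s)\,\mathrm{d}s\Big)\, f(B_t)\right],
% where (B_s)_{s \ge 0} is Brownian motion started at x. Magnetic fields enter
% through an additional stochastic line integral (the Feynman-Kac-Itô formula),
% which is the setting extended in the papers described above.
```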
Hajo Leschke
[ "Physics" ]
1,305
[ "Theoretical physics", "Theoretical physicists" ]
78,285,690
https://en.wikipedia.org/wiki/Chaos%20%28malware%29
Chaos is malware that infects Windows, Linux and FreeBSD devices. It is written in the Go programming language. It was discovered by Black Lotus Labs in April 2022. It is used for DDoS attacks and crypto mining. Chaos was believed to be an offshoot of Kaiji, a piece of botnet software. References Further reading Chaos Malware Walks Line Between Ransomware and Wiper Chaos Malware Quietly Evolves Persistence and Evasion Techniques Chaos is a Go-based Swiss army knife of malware - Lumen Linux malware Windows malware
Chaos (malware)
[ "Technology" ]
118
[ "Computing stubs" ]
78,285,850
https://en.wikipedia.org/wiki/List%20of%20FreeBSD%20malware
FreeBSD malware includes viruses, Trojans, worms and other types of malware that affect the FreeBSD operating system. Threats The following is a partial list of known FreeBSD malware. Chaos, malware that infects Windows, Linux and FreeBSD devices; Hive, ransomware that encrypts Linux and FreeBSD systems; Interlock, ransomware targeting Windows and FreeBSD operating systems, which appeared at the end of September 2024. References FreeBSD Malware by platform Lists of software
List of FreeBSD malware
[ "Technology" ]
110
[ "Computing-related lists", "Lists of software" ]
78,285,925
https://en.wikipedia.org/wiki/Code%20ownership
In software engineering, code ownership is a term used to describe control of an individual software developer or a development team over source code modifications of a module or a product. Definitions While the term is very popular, there is no universally accepted definition of it. Koana et al., in their 2024 literature review, found 28 different definitions, and classified them as follows: Psychological ownership is a feeling by the developer of ownership and pride in the particular element of the project; Corporeal ownership is a set of formal or informal rules defining responsibility for a particular software piece. The rules depend on the development approach taken by the team, but generally can be partitioned along the lines of "what is being owned?" / "who owns it?" / "what is the degree of control?": while the answer to "what?" is typically some part of the source code, the ownership concept has also been applied to other artifacts of software development as diverse as an entire project or a single software bug; the owner ("who?") might be an individual developer or a group that might include authors of the code, reviewers, and managers. The two extremes are represented by a dedicated ownership with just one developer responsible for any particular piece of code and a collective code ownership, where every member of the team owns all the code; the degree of control by an owner can vary from a mandatory code review to responsibility for testing to a complete implementation. Authorship Some researchers also use the term to describe the authorship of software (identifying who wrote a particular line of software). Koana et al. state that this is a different, although related, meaning, as the code owner might not be the original author of the software piece. Influence upon quality It is generally accepted that the lack of clear code ownership (usually in the form of many developers freely applying small changes to a shared piece of code) causes errors to be introduced. At the same time, with no code owner, the knowledge about an artifact can be lost. This is confirmed by large-scale studies, for example, involving Windows 7 and Windows Vista. Code owners in version control Modern version control systems allow explicit designation of code owners for particular files or directories (cf. the GitHub CODEOWNERS feature; a sketch of the format appears below). Typically, the code owner is either receiving notifications for all the changes in the owned code or is required to approve each change. References Sources Software engineering terminology
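To make the CODEOWNERS mechanism mentioned above concrete, here is a sketch of such a file in the GitHub style; the paths and team names are invented for illustration. Each line pairs a path pattern with one or more owners, and the last matching pattern takes precedence.

```
# Default owners for everything not matched below (hypothetical team names)
*          @example-org/core-team

# The build directory is owned by the release engineers
/build/    @example-org/release-team

# Any JavaScript file, anywhere, is reviewed by the JS team
*.js       @example-org/js-team
```

The designated owners are then typically requested as reviewers, or required to approve, whenever a change touches matching files, which is exactly the notification-or-approval behavior the article describes.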
Code ownership
[ "Technology", "Engineering" ]
492
[ "Software engineering", "Computing terminology", "Software engineering stubs", "Software engineering terminology" ]
78,285,950
https://en.wikipedia.org/wiki/F-Yang%E2%80%93Mills%20equations
In differential geometry, the $F$-Yang–Mills equations (or $F$-YM equations) are a generalization of the Yang–Mills equations. Their solutions are called $F$-Yang–Mills connections (or $F$-YM connections). Simple important cases of $F$-Yang–Mills connections include exponential Yang–Mills connections, which use the exponential function for $F$, and $p$-Yang–Mills connections, which use $p$ as the exponent of a power of the norm of the curvature form, similar to the $p$-norm. Also often considered are Yang–Mills–Born–Infeld connections (or YMBI connections), with positive or negative sign in a function $F$ involving the square root. This makes the Yang–Mills–Born–Infeld equation similar to the minimal surface equation. F-Yang–Mills action functional Let $F\colon\mathbb{R}_{\geq 0}\to\mathbb{R}_{\geq 0}$ be a strictly increasing $C^2$ function (hence with $F'>0$) and $F(0)=0$. Since $F$ is a $C^2$ function, one can also consider the following constant: $d_F:=\sup_{t\geq 0}\frac{tF''(t)}{F'(t)}.$ Let $G$ be a compact Lie group with Lie algebra $\mathfrak{g}$ and $E\twoheadrightarrow M$ be a principal $G$-bundle with an orientable Riemannian manifold $M$ having a metric $g$ and a volume form $\mathrm{vol}_g$. Let $\operatorname{Ad}(E):=E\times_{\operatorname{Ad}}\mathfrak{g}$ be its adjoint bundle. $\Omega^1_{\operatorname{Ad}}(E,\mathfrak{g})\cong\Omega^1(M,\operatorname{Ad}(E))$ is the space of connections, which are either Lie algebra–valued differential forms invariant under the adjoint representation, or vector bundle–valued differential forms. Since the Hodge star operator $\star$ is defined on the base manifold $M$, as it requires the metric $g$ and the volume form $\mathrm{vol}_g$, the second space is usually used. The $F$-Yang–Mills action functional is given by: $\mathcal{YM}_F(A):=\int_M F\!\left(\frac{1}{2}\|F_A\|^2\right)\mathrm{dvol}_g$ for $A\in\Omega^1(M,\operatorname{Ad}(E))$. For a flat connection $A$ (with $F_A=0$), one has $\mathcal{YM}_F(A)=F(0)\operatorname{vol}(M)$. Hence $F(0)=0$ is required to avert divergence for a non-compact manifold $M$, although this condition can also be left out as only the derivative $F'$ is of further importance. F-Yang–Mills connections and equations A connection $A$ is called an $F$-Yang–Mills connection if it is a critical point of the $F$-Yang–Mills action functional, hence if: $\frac{\mathrm{d}}{\mathrm{d}t}\mathcal{YM}_F(A(t))\Big|_{t=0}=0$ for every smooth family $A(t)$ with $A(0)=A$. This is the case iff the $F$-Yang–Mills equations are fulfilled: $\mathrm{d}_A\star\!\left(F'\!\left(\frac{1}{2}\|F_A\|^2\right)F_A\right)=0.$ For an $F$-Yang–Mills connection $A$, its curvature $F_A$ is called an $F$-Yang–Mills field. An $F$-Yang–Mills connection/field with: $F(t)=t$ is just an ordinary Yang–Mills connection/field. $F(t)=e^t$ (or $F(t)=e^t-1$ for normalization) is called a (normed) exponential Yang–Mills connection/field. In this case, one has $d_F=\sup_{t\geq 0}t=\infty$. The exponential and normed exponential Yang–Mills action functionals are denoted with $\mathcal{YM}_{\exp}$ and $\mathcal{YM}_{\exp}^0$ respectively. $F(t)=\frac{1}{p}(2t)^{\frac{p}{2}}$ is called a $p$-Yang–Mills connection/field. In this case, one has $d_F=\frac{p-2}{2}$. Usual Yang–Mills connections/fields are exactly the $2$-Yang–Mills connections/fields. The $p$-Yang–Mills action functional is denoted with $\mathcal{YM}_p$. $F(t)=1-\sqrt{1-2t}$ or $F(t)=\sqrt{1+2t}-1$ is called a Yang–Mills–Born–Infeld connection/field (or YMBI connection/field) with negative or positive sign respectively. In these cases, one has $d_F=\infty$ and $d_F=0$ respectively. The Yang–Mills–Born–Infeld action functionals with negative and positive sign are denoted with $\mathcal{YM}^{-}$ and $\mathcal{YM}^{+}$ respectively. The Yang–Mills–Born–Infeld equations with positive sign are related to the minimal surface equation: $\mathrm{d}_A\star\!\left(\frac{F_A}{\sqrt{1+\|F_A\|^2}}\right)=0.$ Stable F-Yang–Mills connection Analogous to (weakly) stable Yang–Mills connections, one can define (weakly) stable $F$-Yang–Mills connections. An $F$-Yang–Mills connection $A$ is called stable if: $\frac{\mathrm{d}^2}{\mathrm{d}t^2}\mathcal{YM}_F(A(t))\Big|_{t=0}>0$ for every smooth family $A(t)$ with $A(0)=A$. It is called weakly stable if only $\geq 0$ holds. An $F$-Yang–Mills connection which is not weakly stable is called unstable. For a (weakly) stable or unstable $F$-Yang–Mills connection $A$, its curvature $F_A$ is furthermore called a (weakly) stable or unstable $F$-Yang–Mills field. Properties For a Yang–Mills connection with constant curvature, its stability as a Yang–Mills connection implies its stability as an exponential Yang–Mills connection. Every non-flat exponential Yang–Mills connection over $S^n$ satisfying an additional pointwise curvature bound is unstable.
Every non-flat Yang–Mills–Born–Infeld connection with negative sign over $S^n$ satisfying an additional pointwise curvature bound is unstable. All non-flat $F$-Yang–Mills connections over $S^n$ with $4+4d_F<n$ are unstable. This result includes the following special cases: All non-flat Yang–Mills connections over $S^n$ with $4<n$ are unstable. James Simons presented this result without written publication during a symposium on "Minimal Submanifolds and Geodesics" in Tokyo in September 1977. All non-flat $p$-Yang–Mills connections over $S^n$ with $2p<n$ are unstable. All non-flat Yang–Mills–Born–Infeld connections with positive sign over $S^n$ with $4<n$ are unstable. Under a suitable bound on $d_F$, every non-flat $F$-Yang–Mills connection over the Cayley plane is unstable. Literature See also Bi-Yang–Mills equations, modification of the Yang–Mills equation References External links F-Yang-Mills equation at the nLab Differential geometry Mathematical physics Partial differential equations
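As a worked check of the special cases listed above, one can compute the constant $d_F$ for the $p$-Yang–Mills functional. The normalization $F(t)=\tfrac{1}{p}(2t)^{p/2}$ is the one assumed in the reconstruction above, so this is a consistency check rather than a sourced derivation.

```latex
% p-Yang-Mills: F(t) = (2t)^{p/2} / p  (normalization as assumed above)
F'(t)  = (2t)^{\frac{p}{2}-1}, \qquad
F''(t) = (p-2)\,(2t)^{\frac{p}{2}-2},
\qquad\Longrightarrow\qquad
\frac{t\,F''(t)}{F'(t)} = \frac{(p-2)\,t}{2t} = \frac{p-2}{2} ,
% so d_F = (p-2)/2 for all p. For p = 2 this gives F(t) = t, F' = 1 and
% d_F = 0, and the F-Yang-Mills equation reduces to d_A (star F_A) = 0,
% the ordinary Yang-Mills equation.
```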
F-Yang–Mills equations
[ "Physics", "Mathematics" ]
931
[ "Applied mathematics", "Theoretical physics", "Mathematical physics" ]
78,285,966
https://en.wikipedia.org/wiki/Bi-Yang%E2%80%93Mills%20equations
In differential geometry, the Bi-Yang–Mills equations (or Bi-YM equations) are a modification of the Yang–Mills equations. Their solutions are called Bi-Yang–Mills connections (or Bi-YM connections). Simply put, Bi-Yang–Mills connections are to Yang–Mills connections what they are to flat connections. This stems from the fact that Yang–Mills connections are not necessarily flat, but are at least a local extremum of curvature, while Bi-Yang–Mills connections are not necessarily Yang–Mills connections, but are at least a local extremum of the left side of the Yang–Mills equations. While Yang–Mills connections can be viewed as a non-linear generalization of harmonic maps, Bi-Yang–Mills connections can be viewed as a non-linear generalization of biharmonic maps. Bi-Yang–Mills action functional Let $G$ be a compact Lie group with Lie algebra $\mathfrak{g}$ and $E\twoheadrightarrow M$ be a principal $G$-bundle with a compact orientable Riemannian manifold $M$ having a metric $g$ and a volume form $\mathrm{vol}_g$. Let $\operatorname{Ad}(E):=E\times_{\operatorname{Ad}}\mathfrak{g}$ be its adjoint bundle. $\Omega^1_{\operatorname{Ad}}(E,\mathfrak{g})\cong\Omega^1(M,\operatorname{Ad}(E))$ is the space of connections, which are either Lie algebra–valued differential forms invariant under the adjoint representation, or vector bundle–valued differential forms. Since the Hodge star operator $\star$ is defined on the base manifold $M$, as it requires the metric $g$ and the volume form $\mathrm{vol}_g$, the second space is usually used. The Bi-Yang–Mills action functional is given by: $\mathcal{YM}_2(A):=\frac{1}{2}\int_M\|\delta_A F_A\|^2\,\mathrm{dvol}_g$ for $A\in\Omega^1(M,\operatorname{Ad}(E))$, where $\delta_A$ denotes the adjoint of the exterior covariant derivative $\mathrm{d}_A$. Bi-Yang–Mills connections and equation A connection $A$ is called a Bi-Yang–Mills connection if it is a critical point of the Bi-Yang–Mills action functional, hence if: $\frac{\mathrm{d}}{\mathrm{d}t}\mathcal{YM}_2(A(t))\Big|_{t=0}=0$ for every smooth family $A(t)$ with $A(0)=A$. This is the case iff the Bi-Yang–Mills equations are fulfilled. For a Bi-Yang–Mills connection $A$, its curvature $F_A$ is called a Bi-Yang–Mills field. Stable Bi-Yang–Mills connections Analogous to (weakly) stable Yang–Mills connections, one can define (weakly) stable Bi-Yang–Mills connections. A Bi-Yang–Mills connection $A$ is called stable if: $\frac{\mathrm{d}^2}{\mathrm{d}t^2}\mathcal{YM}_2(A(t))\Big|_{t=0}>0$ for every smooth family $A(t)$ with $A(0)=A$. It is called weakly stable if only $\geq 0$ holds. A Bi-Yang–Mills connection which is not weakly stable is called unstable. For a (weakly) stable or unstable Bi-Yang–Mills connection $A$, its curvature $F_A$ is furthermore called a (weakly) stable or unstable Bi-Yang–Mills field. Properties Yang–Mills connections are weakly stable Bi-Yang–Mills connections. See also F-Yang–Mills equations, generalization of the Yang–Mills equation Literature References External links Bi-Yang-Mills equation at the nLab Differential geometry Mathematical physics Partial differential equations
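The property stated at the end, that Yang–Mills connections are weakly stable Bi-Yang–Mills connections, follows from a one-line minimization argument. The notation matches the reconstruction above, and is therefore this editor's choice of symbols rather than sourced.

```latex
% The Bi-Yang-Mills functional is nonnegative,
\mathcal{YM}_2(A) = \tfrac{1}{2}\int_M \|\delta_A F_A\|^2 \,\mathrm{dvol}_g \;\ge\; 0 ,
% and a Yang-Mills connection A_0 satisfies \delta_{A_0} F_{A_0} = 0, hence
\mathcal{YM}_2(A_0) = 0 .
% So A_0 is a global minimizer: along every smooth family A(t) with A(0) = A_0,
\frac{\mathrm{d}}{\mathrm{d}t}\Big|_{t=0}\mathcal{YM}_2(A(t)) = 0 ,
\qquad
\frac{\mathrm{d}^2}{\mathrm{d}t^2}\Big|_{t=0}\mathcal{YM}_2(A(t)) \;\ge\; 0 ,
% which is precisely the definition of a weakly stable Bi-Yang-Mills connection.
```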
Bi-Yang–Mills equations
[ "Physics", "Mathematics" ]
535
[ "Applied mathematics", "Theoretical physics", "Mathematical physics" ]
78,288,408
https://en.wikipedia.org/wiki/Proxipyricularia%20zingiberis
Proxipyricularia zingiberis is a fungus that was originally found in Japan growing on the leaves of ginger plants, Zingiber mioga and Zingiber officinale, in 1917, when it was described as Pyricularia zingiberis. P. zingiberis is a member of plant pathogenic fungi that predominantly affect monocotyledon plants, including ginger. Ginger is a valuable tropical crop used for spices, medicinal purposes, and consumption across the world, making P. zingiberis a concerning pathogenic agent. Pyricularia zingiberis was reclassified using advanced molecular techniques in 2014 as Proxipyricularia zingiberis, which is an evolutionary lineage that is genetically distinct from its previous classification, though morphologically similar. This distinction was made in an effort to resolve the polyphyletic nature of the genus Pyricularia after molecular phylogenetic analysis. Morphologic description and symptoms P. zingiberis is a causal agent of leaf spot and blast disease. Leaf spots appear as small brown lesions on living ginger leaves that are diamond-shaped. Symptoms predominate on leaves but can spread down the stem as symptoms progress. Scattered leaf spots bear sclerotium-like structures and can congregate into large blotches and progress into blast symptoms that can be lethal to the plant. The blast symptoms of the disease have resulted in significant yield losses to ginger growers. In culture, hyphal growth varies in coloration from hyaline to green-gray, often described as olivaceous. Conidiophores are polyblastic, unbranched to slightly branched in shape, and up to 350 μm in size. Conidiophores can be solitary or bundled to enhance spore production and dispersal and can be either intercalary or terminal. Conidia are pyriform and hyaline to pale brown, sporting 1-2 septations. P. zingiberis conidia range from 14 to 24.5 μm in length and 5.0 to 9.0 μm in width. Pathogenicity The pathogen has a polycyclic disease cycle and is dispersed through the air. Under favorable conditions, conidia on the leaf surface grow a germ tube that forms into an appressorium and penetrates the leaf cuticle using turgor pressure. Necrotic lesions form along the leaf surface as hyphae spread throughout cells within host tissues until lesions with spores emerge on the leaf surface for dispersal. Ecology and distribution P. zingiberis is a plant pathogen on Zingiber hosts. Species of Pyricularia have been identified as endophytes living in ginger leaf tissue. P. zingiberis is found in tropical climates with high humidity, which promotes infection. Environmental conditions largely influence disease development, so cultural practices have been used to aid in mitigating infection. Currently, P. zingiberis has been identified in Japan, Indonesia, Vietnam, and Malaysia. Primary pathogen management relies on chemical fungicides to slow or suppress the pathogen. In some Pyricularia species, resistant host cultivars have been developed to control disease, but ultimately have not been successful due to genetic diversity and adaptations in the fungal genome. Breeding resistance in ginger plants has not been noted as a primary management tactic for P. zingiberis. Currently, Z. mioga and Z. officinale are both susceptible. References Magnaporthales Fungus species
Proxipyricularia zingiberis
[ "Biology" ]
749
[ "Fungi", "Fungus species" ]
78,289,819
https://en.wikipedia.org/wiki/Ammonium%20hexabromoselenate%28IV%29
Ammonium hexabromoselenate(IV) is an inorganic chemical compound with the chemical formula . Synthesis Ammonium bromide is added to aqueous solutions of selenium tetrabromide acidified with hydrogen bromide: 2 NH4Br + SeBr4 → (NH4)2[SeBr6] Also, selenium dioxide dissolved in hydrobromic acid and treated with ammonium chloride produces the compound. Physical properties The compound forms red crystals of the cubic system, space group Fm3m. The compound decomposes in water. References Selenium compounds Ammonium compounds
Ammonium hexabromoselenate(IV)
[ "Chemistry" ]
106
[ "Ammonium compounds", "Salts" ]
78,289,889
https://en.wikipedia.org/wiki/Polycab%20India
Polycab India Limited is an Indian electrical equipment company based in Mumbai, India. The company manufactures and sells electrical products, including wires and cables, electric fans, LED lighting and luminaires, switches and switchgear, solar products, and conduits and accessories. It also operates in the engineering, procurement, and construction (EPC) sector. In 2023, the company was ranked 161st on the Fortune India 500 list, with revenues of ₹14,206 crore. It is the largest wire and cable manufacturer in India and holds 25% to 26% of the market share in the wires and cables sector in India. As of March 2023, the company operates 28 manufacturing units in Gujarat, Maharashtra, Karnataka, Uttarakhand, Tamil Nadu, and the Union Territory of Daman, along with over 29 warehouses across India. The company is included in the MSCI Standard Index, and is a constituent of the Nifty Midcap 100 Index and the BSE 200 Index. History The company's origins date back to 1964 with the establishment of Sind Electric Stores in Lohar Chawl by Thakurdas Jaisinghani, who had moved to Bombay from Pakistan after the Partition of India. This store dealt in various electrical products like fans, lighting fixtures, switches, and wires. By 1975, the company established Thakur Industries and land was acquired from the Maharashtra Industrial Development Corporation in Andheri, Mumbai, for the construction of a cable and wire manufacturing facility. In 1983, the company formally entered the electrical goods manufacturing sector with the founding of Polycab Industries by Inder T. Jaisinghani and his brothers. A factory was established in Halol, Gujarat, for the production of PVC insulated wires and cables, copper and aluminum, and bare copper wire. In May 2008, the company entered into a joint venture with Nexans, a French cable manufacturer. The joint venture, with an initial investment of $37 million, focused on the production of rubber cables for the shipbuilding, railway, and wind power industries. In 2009, the company entered the engineering, procurement, and construction (EPC) sector, offering services covering the design, engineering, supply, execution, and commissioning of power distribution and rural electrification projects. In September 2008, the International Finance Corporation, the private investment arm of the World Bank, acquired a 12% stake in Polycab Wires for ₹551.5 crore (US$120 million). This transaction valued the company's cable manufacturing business at ₹4,600 crore (US$1 billion). In 2013, the company formed a joint venture with Nexans, holding a 49% stake, to establish production facilities in Gujarat. The investment for this venture amounted to $55 million (approximately ₹320 crore). In 2014, Polycab India Limited expanded its product range to include electric fans, LED lighting and luminaires, switches and switchgear, solar products, and conduits and accessories. Polycab India Limited established a 50:50 joint venture named Ryker Base with Trafigura (Singapore) in 2016 to strengthen its backward integration for copper. The company acquired full ownership in May 2020 and subsequently, Ryker Base was acquired by Hindalco Industries, a subsidiary of the Aditya Birla Group in November 2021 for ₹323 crore. 
IPO In October 2018, Polycab India filed its draft red herring prospectus (DRHP) with the Securities and Exchange Board of India for an initial public offering. The company went public in April 2019, listing on the National Stock Exchange of India and the Bombay Stock Exchange. The IPO was oversubscribed more than 52 times and raised ₹1,346 crore. Finance In FY24, Polycab India reported a revenue of ₹180,394 million, showing a 28% year-over-year growth. The company's EBITDA increased by 35% YoY to ₹24,918 million, with an EBITDA margin of 13.8%. The Profit After Tax (PAT) rose by 41% YoY to ₹18,029 million, and the PAT margin expanded to 10.0%. Earnings per Share (EPS) for FY24 stood at ₹118.93. The wires and cables segment accounted for 88% of the company's sales, the Fast-Moving Electrical Goods (FMEG) segment contributed 7%, and the remaining 5% came from other segments, primarily the EPC business. The company has received a long-term credit rating of AA+ (Positive) from both CRISIL and India Ratings and Research. Controversies In December 2023, the Income Tax Department raided 50 offices of Polycab and discovered that the company had ₹1,000 crore in unaccounted cash sales, ₹400 crore in unaccounted cash payments made on its behalf by a distributor, and ₹100 crore in non-genuine expenses. In March 2024, Polycab India's IT infrastructure was targeted by a ransomware attack attributed to the Lockbit group. The company subsequently reported that the incident did not have a significant impact on its core systems and operations. Awards and recognition Polycab India has been listed in the Fortune India 500 list for five consecutive years, from 2019 to 2023. Inder Jaisinghani, the chairman and managing director of Polycab India, and his family were ranked at #32 in Forbes India's 2023 list of the 100 Richest Individuals, with a net worth of $6.4 billion (approximately ₹53,298 crore). Jaisinghani was also featured on the Forbes India Rich List in 2022, ranking #60 with a net worth of $3.4 billion, and in 2021, ranking #57 with a net worth of $3.6 billion. The company has been awarded the title of "Superbrand" by Superbrands India. The company was awarded a silver medal in the Consumer Durable category at the ET Brand Equity's 2023 Trendies Awards by The Economic Times. In 2019, the Employer Branding Institute recognized Polycab India with a National Best Employer Brand Award. References Companies based in Mumbai Indian brands Electrical engineering companies Manufacturing companies of India Companies established in 1964 Companies listed on the Bombay Stock Exchange Companies listed on the National Stock Exchange of India Electrical equipment manufacturers Engineering companies of India 1964 establishments in Maharashtra 2019 initial public offerings
Polycab India
[ "Engineering" ]
1,350
[ "Electrical engineering companies", "Electrical engineering organizations", "Electrical equipment manufacturers", "Engineering companies" ]
78,290,160
https://en.wikipedia.org/wiki/HD%20197911
HD 197911 (HIP 102274) is a bluish-white hued star in the deep northern constellation of Cepheus, close to the border with Draco and Cygnus. With an apparent magnitude of 7.669, it is too faint to be seen by the naked eye under most conditions, but readily visible using binoculars. The star's distance has been measured by Gaia EDR3 parallax, and it is moving closer to the Solar System. The star appears close to the reflection nebula and H II region Sh2-130, alongside the A0-type star HD 197809 and G5-type star SAO 18999, though the latter two stars are located much closer to Earth. Properties and origin HD 197911 is a massive B-type star with the spectral type B5, a mass 6.3 times that of the Sun and 9.2 times the radius. It is a runaway star traversing space at a peculiar velocity of 56.69 km/s. The star is thought to have once been part of a binary system, from which it was ejected as its companion ended its life in a supernova. Initially, the star was thought to have originated in an OB association called the Cepheus OB2 association, which it left 2–3 million years ago, when the association was 3–4 million years old. This aligned with the age of the Cepheus bubble, an annular structure of infrared emission, providing compelling evidence for the binary-supernova scenario. However, with updated astrometric data, it is now considered more likely that it formed in either Alessi-Teutsch 5 or NGC 7160, two star clusters that are 12.5 and 9.0 million years old, respectively. Both origins are consistent with the initially proposed scenario. References B-type stars Cepheus (constellation) 197911 102274 BD+62 01854 J20432160+6312329
HD 197911
[ "Astronomy" ]
440
[ "Constellations", "Cepheus (constellation)" ]
78,290,381
https://en.wikipedia.org/wiki/Brauer%27s%20k%28B%29%20conjecture
Richard Brauer's k(B) Conjecture is a conjecture in modular representation theory of finite groups relating the number of complex irreducible characters in a Brauer block and the order of its defect groups. It was first announced in 1955. It is Problem 20 in Brauer's list of problems. Statement Let $G$ be a finite group and $p$ a prime. The set of irreducible complex characters can be partitioned into $p$-blocks. To each $p$-block $B$ is canonically associated a conjugacy class of $p$-subgroups, called the defect groups of $B$. The set of irreducible characters belonging to $B$ is denoted by $\mathrm{Irr}(B)$. The k(B) Conjecture asserts that $k(B):=|\mathrm{Irr}(B)|\leq|D|$, where $D$ is a defect group of $B$. The k(GV) problem In the case of blocks of $p$-solvable groups, the conjecture is equivalent to the following question. Let $V$ be an elementary abelian group of order $p^n$, let $G$ be a finite group of order non-divisible by $p$ and acting faithfully on $V$ by group automorphisms. Let $GV$ denote the associated semidirect product $V\rtimes G$ and let $k(GV)$ be its number of conjugacy classes. Then $k(GV)\leq|V|.$ This was proved by John Thompson and Geoffrey Robinson, except for finitely many prime numbers. A proof of the last open cases was published in 2004. References Conjectures that have been proved Module theory Representation theory of finite groups
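The inequality can be checked by hand in small cases. The following worked example is this editor's illustration, not part of the original article:

```latex
% G = S_3, p = 3. The irreducible complex characters have degrees 1, 1, 2.
% No degree is divisible by 3, so there are no blocks of defect zero, and
% all three characters lie in the principal 3-block B_0. A defect group D
% of B_0 is a Sylow 3-subgroup of S_3, cyclic of order 3. Hence
k(B_0) = |\mathrm{Irr}(B_0)| = 3 \;\leq\; 3 = |D| ,
% consistent with Brauer's k(B) conjecture.
```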
Brauer's k(B) conjecture
[ "Mathematics" ]
263
[ "Fields of abstract algebra", "Module theory", "Conjectures that have been proved", "Mathematical problems", "Mathematical theorems" ]
78,291,740
https://en.wikipedia.org/wiki/S5%201803%2B784
S5 1803+784 is a BL Lacertae object located in the far northern constellation of Draco. It has an estimated redshift of (z) 0.68 and was first discovered as an astronomical radio source in 1981 by a team of astronomers. This object is also classified as a blazar because of its extreme variability across the electromagnetic spectrum and because it is a source of gamma-ray activity. According to preliminary analysis in May 2011, the source of S5 1803+784 has a gamma ray flux (E>100 MeV) of (1.1 ± 0.2) × 10−6 photon cm−2 s−1. Description S5 1803+784 is in a constant flaring state. In April 2020, S5 1803+784 had a major outburst followed by more flaring episodes. During this period, S5 1803+784 exhibited its highest flux level of 1.1 x 10−6 ph cm−2 s−1, while in the pre-flaring region a low flux below 0.2 x 10−6 ph cm−2 s−1 was shown. In August 2020, S5 1803+784 entered a new flaring phase which lasted for 57 days. Its source brightness varied from 13.617 ± 0.009 to 15.888 ± 0.01 in R-bands, with the brightest-ever state for S5 1803+784 observed on August 25. It is also known to show near-infrared flares. In an optical light curve, S5 1803+784 showed an overall variation greater than 3 magnitudes, with the largest changes observed within three flares, though no periodicity was found. However, the radio band variability is found to be different, showing modest oscillations instead of flares with a maximum amplitude of 30%. S5 1803+784 shows a peculiar radio structure with a compact radio core. There is a presence of an ejector nozzle, 0.1 parsecs in diameter, surrounded by a ring structure with a diameter of 1.4 parsecs and a width of 0.25 parsecs. Furthermore, it has a weaker secondary component located 45 arcseconds south and slightly to the west side of the core, with a faint emission bridge joining them together. The jet of S5 1803+784 has a complex morphology. On milliarcsecond scales, it is described as a bent chain of seven individual jet components with separation gaps of 0.2 and 3 mas from its core, where new components appear to be emerging from it every two years. Three of the jet components are found to approach a bright, stationary component (1.4 mas at 8.4 GHz), exhibiting apparent superluminal motion. Further studies showed that on the jet's parsec scale, most of the jet components within the inner core remain constant over a long period of time, with the jet's width changing periodically around every 8–9 years. Interestingly, the jet is shaped into a cone in which the 18-cm emission from the ejector region is found to be weakened by a factor of 300. References External links S5 1803+784 on SIMBAD S5 1803+784 on NASA/IPAC Database BL Lacertae objects Draco (constellation) Blazars Active galaxies 18036+7827 Astronomical objects discovered in 1981
S5 1803+784
[ "Astronomy" ]
693
[ "Constellations", "Draco (constellation)" ]
78,291,990
https://en.wikipedia.org/wiki/Hurwitz-stable%20matrix
In mathematics, a Hurwitz-stable matrix, or more commonly simply Hurwitz matrix, is a square matrix whose eigenvalues all have strictly negative real part. Some authors also use the term stability matrix. Such matrices play an important role in control theory. Definition A square matrix $A$ is called a Hurwitz matrix if every eigenvalue of $A$ has strictly negative real part, that is, $\operatorname{Re}(\lambda_i)<0$ for each eigenvalue $\lambda_i$. $A$ is also called a stable matrix, because then the differential equation $\dot{x}=Ax$ is asymptotically stable, that is, $x(t)\to 0$ as $t\to\infty$. If $G(s)$ is a (matrix-valued) transfer function, then $G$ is called Hurwitz if the poles of all elements of $G$ have negative real part. Note that it is not necessary that $G(s)$ for a specific argument $s$ be a Hurwitz matrix — it need not even be square. The connection is that if $A$ is a Hurwitz matrix, then the dynamical system $\dot{x}(t)=Ax(t)+Bu(t)$, $y(t)=Cx(t)+Du(t)$ has a Hurwitz transfer function. Any hyperbolic fixed point (or equilibrium point) of a continuous dynamical system is locally asymptotically stable if and only if the Jacobian of the dynamical system is Hurwitz stable at the fixed point. The Hurwitz stability matrix is a crucial part of control theory. A system is stable if its control matrix is a Hurwitz matrix. The negative real components of the eigenvalues of the matrix represent negative feedback. Similarly, a system is inherently unstable if any of the eigenvalues have positive real components, representing positive feedback. See also M-matrix Perron–Frobenius theorem, which shows that any Hurwitz matrix must have at least one negative entry Z-matrix References External links Matrices
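Numerically, the defining property is straightforward to test by inspecting eigenvalues. The following minimal Python sketch (the function name and example matrix are invented for illustration) checks whether a given matrix is Hurwitz:

```python
import numpy as np

def is_hurwitz(A):
    """Return True if every eigenvalue of A has strictly negative real part."""
    eigenvalues = np.linalg.eigvals(np.asarray(A, dtype=float))
    return bool(np.all(eigenvalues.real < 0.0))

# Upper-triangular example: eigenvalues are the diagonal entries -1 and -3,
# so the matrix is Hurwitz and x' = Ax decays to zero.
A = np.array([[-1.0,  2.0],
              [ 0.0, -3.0]])
print(is_hurwitz(A))   # True
```

For symbolic or ill-conditioned problems, one would typically use an algebraic criterion such as Routh–Hurwitz rather than floating-point eigenvalues.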
Hurwitz-stable matrix
[ "Mathematics" ]
345
[ "Matrices (mathematics)", "Mathematical objects" ]
78,292,039
https://en.wikipedia.org/wiki/HD%2024733
HD 24733 is a spectroscopic binary system, and also a Beta Lyrae variable, located in the deep northern constellation of Camelopardalis, close to the border with Perseus. It has the variable-star designation DD Camelopardalis (sometimes abbreviated to DD Cam). With a mean apparent magnitude of 7.038, it is too faint to be seen by the naked eye from Earth, but readily visible through binoculars. Description The primary star of the system is an A-type main-sequence star with a spectral type of A3V. It has 2.16 times the mass of the Sun and radiates 24 times the luminosity of the Sun from its photosphere. The star displays no chemical peculiarity. The secondary star is thought to be a G-type main-sequence star with the spectral type G0V. HD 24733 was discovered to be a variable star by László Szabados, who observed the star at the Piszkéstető Station of Konkoly Observatory from late 1991 until early 1995. It was independently discovered to be variable from the Hipparcos satellite data, and was given its variable star designation in 1999. The two stars revolve around each other in a tight, nearly circular (eccentricity 0.084) orbit with a period of 1.762838 days (1 day, 18 hours, 18 minutes). As seen from Earth, one component periodically passes in front of the other, blocking some or all of its light. Hence, the star system appears to vary in brightness; from a maximum apparent magnitude of 6.97, it dips by 0.17 and 0.11 mag as each of the stars is obscured by its companion. The shape of the light curves implies that, because of their close proximity to one another, the two stars are both heavily distorted to an ellipsoidal shape due to mutual gravitational interactions. Additionally, the system may be surrounded by circumstellar material. See also DV Aquarii HD 40372 References A-type main-sequence stars G-type main-sequence stars Camelopardalis Beta Lyrae variables 024733 018585 Camelopardalis, DD BD+53 00718 J03583820+5359193 Eclipsing binaries
HD 24733
[ "Astronomy" ]
486
[ "Camelopardalis", "Constellations" ]
78,292,145
https://en.wikipedia.org/wiki/Routh%E2%80%93Hurwitz%20matrix
In mathematics, the Routh–Hurwitz matrix, or more commonly just Hurwitz matrix, corresponding to a polynomial is a particular matrix whose nonzero entries are coefficients of the polynomial. Hurwitz matrix and the Hurwitz stability criterion Namely, given a real polynomial $p(z)=a_0z^n+a_1z^{n-1}+\cdots+a_{n-1}z+a_n$, the square $n\times n$ matrix $H(p)$ with entries $H_{ij}=a_{2j-i}$ (where $a_k:=0$ for $k<0$ or $k>n$), that is, $H(p)=\begin{pmatrix}a_1&a_3&a_5&\cdots\\a_0&a_2&a_4&\cdots\\0&a_1&a_3&\cdots\\0&a_0&a_2&\cdots\\\vdots&\vdots&\vdots&\ddots\end{pmatrix},$ is called the Hurwitz matrix corresponding to the polynomial $p$. It was established by Adolf Hurwitz in 1895 that a real polynomial with $a_0>0$ is stable (that is, all its roots have strictly negative real part) if and only if all the leading principal minors of the matrix $H(p)$ are positive: $\Delta_1=a_1>0$, $\Delta_2=a_1a_2-a_0a_3>0$, $\Delta_3>0$, and so on. The minors $\Delta_k$ are called the Hurwitz determinants. Similarly, if $a_0<0$ then the polynomial is stable if and only if the principal minors have alternating signs starting with a negative one. Example As an example (the original example is not recoverable from this copy, so a small concrete instance is given instead), consider the matrix $A=\begin{pmatrix}-3&2\\1&-2\end{pmatrix}$ and let $p(z)=\det(zI-A)=z^2+5z+4$ be the characteristic polynomial of $A$. The Routh–Hurwitz matrix associated to $p$ is then $H(p)=\begin{pmatrix}5&0\\1&4\end{pmatrix}.$ The leading principal minors of $H(p)$ are $\Delta_1=5>0$ and $\Delta_2=20>0$. Since the leading principal minors are all positive, all of the roots of $p$ have negative real part. Moreover, since $p$ is the characteristic polynomial of $A$, it follows that all the eigenvalues of $A$ have negative real part, and hence $A$ is a Hurwitz-stable matrix. See also Routh–Hurwitz stability criterion Liénard–Chipart criterion P-matrix Notes References Matrices
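The construction of the matrix and its leading principal minors is mechanical, so it is easy to script. A small Python sketch follows; the indexing convention mirrors the definition as reconstructed above, and the sample polynomial is chosen by the editor for illustration.

```python
import numpy as np

def hurwitz_matrix(coeffs):
    """Hurwitz matrix of p(z) = a0*z^n + ... + an, with coeffs = [a0, ..., an]."""
    n = len(coeffs) - 1
    H = np.zeros((n, n))
    for i in range(n):              # 0-based row index
        for j in range(n):          # 0-based column index
            k = 2 * j - i + 1       # H[i, j] = a_{2(j+1)-(i+1)} in 1-based terms
            if 0 <= k <= n:
                H[i, j] = coeffs[k]
    return H

# p(z) = z^3 + 6z^2 + 11z + 6 = (z+1)(z+2)(z+3): all roots have negative real part
coeffs = [1.0, 6.0, 11.0, 6.0]
H = hurwitz_matrix(coeffs)
minors = [np.linalg.det(H[:k, :k]) for k in range(1, len(coeffs))]
print(H)        # [[ 6.  6.  0.] [ 1. 11.  0.] [ 0.  6.  6.]]
print(minors)   # [6.0, 60.0, 360.0] -- all positive, so p is stable
```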
Routh–Hurwitz matrix
[ "Mathematics" ]
268
[ "Matrices (mathematics)", "Mathematical objects" ]
70,978,425
https://en.wikipedia.org/wiki/Xi%20Octantis
Xi Octantis, Latinized from ξ Octantis, is a solitary variable star in the southern circumpolar constellation Octans. It has an apparent magnitude of about 5.3, allowing it to be faintly seen with the naked eye; however, this varies slightly. Located 514 light years away, the object is receding from the Sun. Xi Octantis has a stellar classification of B6 V, indicating that it is an ordinary B-type main-sequence star. Hiltner et al. give it a luminosity class IV (subgiant), while Houk and Cowley give a classification intermediate between a B5 and B7 dwarf. Nevertheless, it has 4 times the mass of the Sun and is 3 times larger. It shines from its photosphere with a whitish-blue glow. Xi Octantis is 46 million years old – 64.8% through its short main sequence lifetime – and spins modestly. When the Hipparcos catalogue was released in 1997, Xi Octantis was found to vary in magnitude — ranging from 5.32 to 5.36 based on data from the International Variable Star Index. It has since been classified as a slowly pulsating B star with a period of 1.78 days. References Octans B-type main-sequence stars B-type subgiants Octantis, Xi Octantis, 77 Slowly pulsating B-type stars 215573 112781 8663 CD-80 828 Variable stars
Xi Octantis
[ "Astronomy" ]
335
[ "Octans", "Constellations" ]
70,978,956
https://en.wikipedia.org/wiki/Pi1%20Octantis
Pi1 Octantis (Pi1 Oct), Latinized from π1 Octantis, is a solitary star in the southern circumpolar constellation Octans. It is faintly visible to the naked eye with an apparent magnitude of 5.64, and is estimated to be 387 light years away. It is receding from the Sun. Pi1 Oct has a stellar classification of G8/K0 III — intermediate between a G8 and K0 giant star. It has 2.74 times the mass of the Sun and an effective temperature giving it a yellow hue, while its enlarged radius yields a luminosity 76 times that of the Sun. Pi1 Oct has a metallicity around the solar level and spins with a low projected rotational velocity. References Octans K-type giants G-type giants 130650 073540 5525 PD-82 629 Octantis, 21
Pi1 Octantis
[ "Astronomy" ]
206
[ "Octans", "Constellations" ]
70,979,040
https://en.wikipedia.org/wiki/Chronology%20of%20early%20Christian%20monasticism
Christian monasticism first appeared in Egypt and Syria. This is a partial chronology of early Christian monasticism with its notable events listed. It covers 343 years. References Chronology Christian monasticism Early Christian Monasticism History of Catholic monasticism
Chronology of early Christian monasticism
[ "Physics" ]
47
[ "Spacetime", "Chronology", "Physical quantities", "Time" ]
70,979,123
https://en.wikipedia.org/wiki/Gibellula
Gibellula is a genus of parasitic fungi which attacks arachnids. The genus Gibellula was named after Prof. Giuseppe Gibelli. References Cordycipitaceae Parasitic fungi Taxa described in 1877 Taxa named by Pier Andrea Saccardo Hypocreales genera
Gibellula
[ "Biology" ]
59
[ "Fungus stubs", "Fungi" ]
70,979,708
https://en.wikipedia.org/wiki/Sigma%20%28signature%20format%29
Sigma is a generic signature format, based on pattern matching over system log events, used to detect malicious behavior in computer systems. See also YARA Snort Further reading References External links GitHub repository sigmatools on PyPi Computer forensics
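Sigma rules are written as YAML documents describing field–value patterns and a condition over named selections. The following Python sketch mimics that matching idea on an already-parsed rule; the rule content and field names are invented, this is not the actual Sigma specification, and real deployments translate rules into a SIEM's query language rather than evaluating them directly.

```python
# A toy, Sigma-style detection: a named selection of field/value patterns.
# Illustrative only; not the Sigma specification or reference implementation.
rule = {
    "selection": {
        "EventID": 4688,                        # process creation (example)
        "CommandLine|contains": "mimikatz",     # substring-match modifier
    },
    "condition": "selection",
}

def matches(selection, event):
    """Return True if every field pattern in the selection matches the event."""
    for key, expected in selection.items():
        field, _, modifier = key.partition("|")
        value = str(event.get(field, ""))
        if modifier == "contains":
            if str(expected) not in value:
                return False
        elif value != str(expected):
            return False
    return True

event = {"EventID": 4688, "CommandLine": "C:\\tools\\mimikatz.exe sekurlsa"}
if matches(rule["selection"], event):
    print("alert: suspicious process creation")
```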
Sigma (signature format)
[ "Engineering" ]
47
[ "Cybersecurity engineering", "Computer forensics" ]
70,980,151
https://en.wikipedia.org/wiki/Glucametacin
Glucametacin is a non-steroidal anti-inflammatory drug used for the treatment of mild or moderate pain associated with rheumatoid arthritis, osteoarthritis, and other rheumatological disorders. It has analgesic and anti-inflammatory effects. Glucametacin is an amide of indometacin with glucosamine. References Nonsteroidal anti-inflammatory drugs Amino sugars Indoles 4-Chlorophenyl compounds Methoxy compounds Amides
Glucametacin
[ "Chemistry" ]
111
[ "Amino sugars", "Carbohydrates", "Amides", "Functional groups" ]
70,980,270
https://en.wikipedia.org/wiki/Merri%20Sue%20Carter
Merri Sue Carter (born 1964) is an American astronomer who works at the United States Naval Observatory as director of the World Data Center for the Rotation of the Earth, Washington. She is also the author of books on the history of astronomy with her father, geodesist William E. Carter. Education and career Carter was born on November 16, 1964, in Columbus, Ohio, where her father, William E. Carter, was studying geodesy at Ohio State University. He became a research geodesist for the United States Air Force, the University of Hawaii, and the National Oceanic and Atmospheric Administration, and the family moved frequently as Carter was growing up. She graduated from University of Maryland, College Park in 1986, and earned a master's degree in 1999 through University of Maryland University College. She has been an astronomer at the United States Naval Observatory since 1986. There, she directs the World Data Center for the Rotation of the Earth, Washington, which coordinates data for the International Earth Rotation and Reference Systems Service. Books With her father, Carter is the author of: Latitude: How American Astronomers Solved the Mystery of Variation (Naval Institute Press, 2002), on the Chandler wobble Simon Newcomb: America's Unofficial Astronomer Royal (Mantanzas Publishing, 2006), a biography of Simon Newcomb References 1964 births Living people 20th-century American astronomers American women astronomers Historians of astronomy University of Maryland, College Park alumni University of Maryland Global Campus alumni 20th-century American women scientists 21st-century American astronomers 21st-century American women scientists People from Columbus, Ohio Scientists from Ohio
Merri Sue Carter
[ "Astronomy" ]
322
[ "People associated with astronomy", "Historians of astronomy", "History of astronomy" ]
70,980,442
https://en.wikipedia.org/wiki/Outline%20of%20urban%20planning
The following outline is provided as an overview of and topical guide to urban planning: Urban planning – technical and political process that is focused on the development and design of land use and the built environment, including air, water, and the infrastructure passing into and out of urban areas, such as transportation, communications, and distribution networks and their accessibility. Urban planning is also known as town planning, city planning, regional planning, or rural planning. Major ideas Urban area City Metropolitan area Suburb Land use Planning Planning and zoning commission Growth management Branches of planning Physical and real estate Land-use planning Neighborhood planning Comprehensive planning (US) Spatial planning Urban design Redevelopment Regional planning Mixed-use development Community and economic development Community economic development Community development planning Environment Environmental planning Recreation resource planning Sustainable development Climate change adaptation Conservation development Low-impact development Transportation Transportation planning Transit-oriented development Public transport planning Preservation Historic preservation Preservation development Research Urbanism Urban informatics Other approaches Indigenous planning Online land planning Participatory planning Participatory GIS Technical aspects of urban planning Community engagement Participatory development People-centered development History and theory of planning Theories of urban planning and history of urban planning Sanitary movement Garden city movement Back-to-the-land movement Linear city City Beautiful movement Soviet urban planning ideologies of the 1920s Towers in the park New towns movement Strategic urban planning Advocacy planning New Suburbanism Communicative planning Rational planning model Planning by region Planning cultures Planning of Africa Urban planning in Africa Urbanization in Africa Urban planning in ancient Egypt Planning of Asia Urban planning in China Urban planning in Shanghai Ancient Chinese urban planning Urban planning in Singapore Urban planning in Iran Planning of Europe Town and country planning in the United Kingdom Spatial planning in England Town and country planning in Wales Spatial planning in Serbia Urban planning in Spain Urban planning of Barcelona Urban planning in the Czech Republic Urban planning in Nazi Germany Urban planning in communist countries Soviet urban planning ideologies of the 1920s Planning of North America Urban planning in the United States Planning and development in Detroit Planning of Oceania Urban planning in Australia Urban planning in Sydney Planning of South America Planning education Urban planning education Awards for planning Danish Urban Planning Award Kevin Lynch Award Related fields Architecture Civil engineering Development economics Urban ecology Urban economics Geography Land development Landscape architecture Marine spatial planning Public health Public policy Real estate development Social sciences See also Terminology Outline of transport planning Outline of architecture Outline of geography External links Urban planning
Outline of urban planning
[ "Engineering" ]
479
[ "Urban planning", "Architecture" ]
70,982,321
https://en.wikipedia.org/wiki/Great%20Replacement%20conspiracy%20theory%20in%20the%20United%20States
In the United States, the populist Great Replacement conspiracy theory holds the view that "political elites" are purposefully seeking to increase the number of racial and religious minorities in an attempt to displace the Christian white American population. Believers in the conspiracy theory have used it as a racist trope in an attempt to advocate anti-immigration policies and dogwhistle to xenophobic ideology. The theory has received strong support in many sectors of the Republican Party. According to David Smith, "Two in three Republicans agree with the 'great replacement' theory." As a result, it has become a major issue of political debate. It has also stimulated violent reactionary responses, including mass murders. Research published in 2024 found that people who endorse the Great Replacement conspiracy theory tend to have anti-social personality traits, authoritarian views, and negative attitudes toward immigrants, minorities, and women. The name is derived from the "Great Replacement" theory, invented in 2011 by the French author Renaud Camus; it is promoted in Europe, and it also has some similarities to the white genocide conspiracy theory, popularized by the American terrorist David Lane in his 1995 White Genocide Manifesto. Similar views originated in American nativism around 1900. According to Erika Lee, in 1894 the old stock Yankee upper-class founders of the Immigration Restriction League were "convinced that Anglo-Saxon traditions, peoples, and culture were being drowned in a flood of racially inferior foreigners from Southern and Eastern Europe". Responding to demographic projections Changes in the method by which the Census Bureau classifies the population by race led to a 2008 projection that white non-Hispanic Americans will make up less than half the population of the U.S. by 2042. This projection was criticized by academics as misleading, but was widely publicized by national media and white nationalist groups. Sociologist Richard Alba states, "The population projections that undergird the widespread belief in the arrival of a majority-minority society in the next few decades are based on the classification of the great majority of mixed majority-minority individuals as 'not white,' and hence as 'minority.' The evidence so far strongly contradicts this classification." Nevertheless, the projection generated widespread anxiety and even violence. It was a matter of how to read statistics. As the New York Times reported, a study co-authored by demographer Dowell Myers "found that presenting the data differently could produce a much less anxious reaction.... [T]hey found that the negative effects that came from reading about a white decline were largely erased when the same people read about how the white category was in fact getting bigger by absorbing multiracial young people through intermarriage." According to Kaleigh Rogers of FiveThirtyEight, arguments for a "great replacement" in the United States are "built on false assumptions about American demographics and immigration: that white people will soon be a minority in this country, that immigrants and non-white voters are all Democrats, and that no longer being the majority group means a loss of power. When those assumptions are torn down, the true justifications for these fears become transparent." A May 2022 poll by the Associated Press found that one-third of American adults believed that an effort was underway "to replace native-born Americans with immigrants for electoral gains". 
The poll found that those who reported themselves as viewers of conservative and far-right media outlets were more likely to believe the theory, with 45% of One America News Network (OANN) and Newsmax viewers and 31% of Fox News viewers believing in it, as compared to 13% of CNN viewers and 11% of MSNBC viewers. Another May 2022 poll by Yahoo! News and YouGov found that 61% of people who voted for Donald Trump in the 2020 U.S. presidential election believe that "a group of people in this country are trying to replace native-born Americans with immigrants and people of color who share their political views." History Origins The origins of the basic idea of the replacement of the white population in the United States date to the late 19th century, when upper-class Americans began to fear the arrival of what they considered inferior Catholic and Jewish immigrants from Eastern and Southern Europe. Leaders included Republican Senator Henry Cabot Lodge and polemicist Madison Grant, author of The Passing of the Great Race (1916). They sought immigration restriction, which was finally imposed by Congress in the Immigration Act of 1924. However, the main restrictions were removed by the Immigration and Nationality Act of 1965. According to Elle Reeve of CNN, the great replacement "stayed mostly on the margins [in the United States] until 2014". That year, members of Gamergate began "mingl[ing] with neo-Nazis" on 4chan and 8chan, resulting in a "massive wave of young people enter[ing] what had been an old man's world of White nationalism." "Unite the Right" rally In 2017, white supremacist protesters at the Unite the Right rally in Charlottesville, Virginia, used slogans that alluded to similar ideas of ethnic replacement, such as "You will not replace us" and "Jews will not replace us". After that event, Renaud Camus, the French writer who coined the term "Great Replacement," stated that he did not support violence, and disputed any association between his ideas and neo-Nazis; however, he said he approved of the feeling behind the chant. U.S. representative Steve King endorsed the conspiracy theory, stating "Great replacement, yes," and remarking of the European migrant crisis that of "these people walking into Europe by ethnic migration, 80 percent are young men." King presents the Great Replacement as a shared concern of Europe and the United States, claiming that "if we continue to abort our babies and import a replacement for them in the form of young violent men, we are supplanting our culture, our civilization." He has blamed George Soros as an alleged perpetrator behind the conspiracy. Abortion In May 2019, Florida State Senator Dennis Baxley was reported to have used the replacement theory in relation to the abortion debate in the United States. Speaking of Western European birthrates as a warning to Americans, he said: "When you get a birth rate less than 2 percent, that society is disappearing, and it's being replaced by folks that come behind them and immigrate, don't wish to assimilate into that society and they do believe in having children." The following month, Nick Isgro, Vice Chair of the Maine Republican Party, endorsed the conspiracy theory after claiming that financial subsidies were promoted for abortions in the U.S. to "kill our own people", and that asylum seekers were "human pawns who are being played in a game by global elites and their partners here in Augusta."
Greg Kesich, a writer for the Portland Press Herald, reported that the speech of Isgro, who was then mayor of Waterville, displayed the sentiment of the Great Replacement. 2018 Pittsburgh synagogue shooting In October 2018, far-right terrorist Robert Bowers killed 11 people, including Holocaust survivors, at the Tree of Life – Or L'Simcha Congregation synagogue in the Squirrel Hill neighborhood of Pittsburgh, Pennsylvania. It was the deadliest antisemitic attack in modern American history. Before the massacre, Bowers' social media posts, along with accounts from his former co-workers, indicated that his conservatism had quickly spiraled into white nationalism. He was influenced by the right-wing radio host Jim Quinn and became deeply involved in extremist social networking websites, such as Gab, promoting antisemitic conspiracy theories on them. The shooter published posts that supported the white genocide conspiracy theory, such as one that said: "Daily Reminder: Diversity means chasing down the last white person". Shortly before the attack, the shooter made a series of posts on Gab attacking Jewish organizations and promoting the Great Replacement, saying: "HIAS likes to bring invaders in that kill our people. I can't sit by and watch my people get slaughtered. Screw your optics, I'm going in." 2019 El Paso Walmart shooting In August 2019, Patrick Crusius killed 23 people at a Walmart in El Paso, Texas, in the deadliest attack on Latinos in modern American history. Before the massacre, he released an anti-Hispanic, anti-immigrant manifesto promoting the Great Replacement, inspired by the Christchurch mosque shootings carried out in New Zealand by Brenton Tarrant. The document used language about immigrants similar to that used by U.S. president Donald Trump: "[S]ome of the language included in the document parroted Trump's own words, characterizing Hispanic migrants as invaders taking American jobs and arguing to 'send them back'." "Portions of the 2,300-word essay, titled 'The Inconvenient Truth', closely mirror Trump's rhetoric, as well as the language of the white nationalist movement, including a warning about the 'Hispanic invasion of Texas'." Brexit In July 2019, Keith Ellison, the Attorney General of Minnesota, stated that increasing and varied hate crime, exacerbated by the 2016 Brexit vote and election of Donald Trump, was "united by so-called 'replacement' theory", and that communities needed to "vigilantly and consistently counter each of these acts of violence and expressions of hate". At the same time, Mick Davis, the Chief Executive and Treasurer of the British Conservative Party, expressed his outrage at the concept. Writing in The Jewish Chronicle, Davis called the Great Replacement "a driving force behind far right terror" and worse than merely a conspiracy theory, in that it was "profoundly antisemitic". Influence of Donald Trump's Twitter account According to the Institute for Strategic Dialogue, Donald Trump referenced the Great Replacement, and a 2019 tweet in favor of his proposed Border Wall was interpreted by many as endorsing the theory. They also stated that Trump's Twitter account was one of the most influential accounts promoting the theory. His history of describing Muslims and immigrants as "invaders", according to SBS News, closely mirrors the language of explicit supporters of the theory. Political scientist Robert A.
Pape concluded from two surveys led by the Chicago Project on Security and Threats in 2021 that the Great Replacement theory had achieved "iconic status with white nationalists" and "might help explain why such a high percentage of the rioters [involved in the January 6 United States Capitol attack] hail[ed] from counties with fast-rising, non-White populations." Tucker Carlson In April 2021, the conspiracy theory was prominently and repeatedly mentioned by conservative television host Tucker Carlson on the Tucker Carlson Tonight show. Days later, during a committee hearing, Republican Congressman Scott Perry said "For many Americans, what seems to be happening or what they believe right now is happening is, what appears to them is we're replacing national-born Americans, native-born Americans to permanently transform the landscape of this very nation." Former speaker Newt Gingrich echoed the theory's sentiments while discussing immigration in a Fox News interview in August 2021, accusing the "anti-American left" of aiming to "drown traditional classic Americans" with mass immigration. On 22 September 2021, Tucker Carlson promoted the conspiracy theory on a segment of his Fox News show Tucker Carlson Tonight, claiming that President Joe Biden was intentionally trying to replace the population with people from the third world. According to a New York Times analysis published in April 2022, Carlson had made reference to the theory in more than 400 episodes between 2016 and 2021. 2022 Buffalo shooting One day after the 2022 Buffalo shooting by Payton Gendron, avowed Christian nationalist Andrew Torba, a proponent of far-right accelerationism and the CEO of alt-tech platform Gab, said that "The best way to stop White genocide and White replacement, both of which are demonstrably and undeniably happening, is to get married to a White woman and have a lot of White babies". Following the mass shooting in Buffalo, U.S. Representative for New York Elise Stefanik, the third-highest ranking Republican official in the House of Representatives, faced scrutiny for past campaign ads that trafficked in language similar to that used by Great Replacement conspiracy theorists. In one attack ad, she falsely accused "radical Democrats" of planning to permanently undermine elections by "grant[ing] amnesty to 11 million illegal immigrants [who] will overthrow our current electorate and create a permanent liberal majority in Washington". During the 2022 United States infant formula shortage, Stefanik accused President Biden of withholding formula from American mothers while providing it to undocumented immigrants. In response to criticism following the shooting, she refused to repudiate replacement theory, and defended using language reminiscent of QAnon conspiracy theory tropes aimed at the LGBTQ community. Robert F. Kennedy Jr. COVID-19 conspiracy theory In July 2023, Robert F. Kennedy Jr., a candidate in the 2024 U.S. presidential election, promoted a conspiracy theory during a press event in New York City. According to a New York Post article by Jon Levine, Kennedy promoted the conspiracy theory that the novel coronavirus (COVID-19) was a genetically engineered bioweapon that may have been "ethnically targeted" to disproportionately affect white and black Americans, while simultaneously sparing Ashkenazi Jews and Chinese people. Several news media outlets and Jewish organizations labeled Kennedy's remarks antisemitic and said that they promoted anti-Chinese sentiment.
The Anti-Defamation League also publicly responded to Kennedy's remarks. Republican Party Since Donald Trump entered American politics as a presidential candidate in 2015, the great replacement conspiracy theory has become increasingly mainstream, and Republicans have embraced the conspiracy theory in various forms. Prior to this, embrace of the Great Replacement conspiracy theory was largely confined to the fringes of the party. This development represents a shift for the party, which, after the 2012 election defeat, issued a report concluding that appealing to minorities was essential for the Republican Party to succeed in the future. Republican politicians have used the conspiracy theory to discredit the Democrats, falsely accusing them of inviting migrants to the country who would then give the Democratic Party an electoral edge. During the 2022 Senate elections, several Republican candidates such as JD Vance campaigned using Great Replacement theory rhetoric. Other Republicans who have promoted the conspiracy theory include Matt Gaetz and Elise Stefanik. 2024 Republican Party presidential debates On 6 December 2023, Vivek Ramaswamy, one of four candidates participating in the fourth Republican primary debate in Tuscaloosa, touted the Great Replacement theory alongside other fringe far-right conspiracy theories. These included saying that the January 6 United States Capitol attack was "an inside job" and that the 2016 and 2020 elections were stolen from Trump by "Big Tech" and the elites. Ramaswamy claimed that the Great Replacement was not a conspiracy theory, but instead a "basic statement of the Democratic Party's platform". Afterwards, white supremacists such as Nick Fuentes celebrated Ramaswamy's statements. See also Great Replacement theory Christian nationalism Demographic threat Disappearing blonde gene hoax Eurabia conspiracy theory Kalergi Plan Love jihad conspiracy theory Nativism in United States politics Palingenetic ultranationalism Racial views of Donald Trump Racism in the United States Replacement migration The Camp of the Saints White nationalism White supremacy Xenophobia References Further reading Alba, Richard. The Great Demographic Illusion: Majority, Minority, and the Expanding American Mainstream (Princeton UP, 2020). https://doi.org/10.1515/9780691202112 Alba, Richard, and Christopher Maggio. "Demographic change and assimilation in the early 21st-century United States." Proceedings of the National Academy of Sciences 119.13 (2022): e2118678119. online Alexander, Charles C. "Prophet of American Racism: Madison Grant and the Nordic Myth." Phylon 23#1, pp. 73–90. online Daniel, Reginald. "Sociology of Multiracial Identity in the Late 1980s and Early 1990s: The Failure of a Perspective." Journal of Ethnic and Cultural Studies 8.2 (2021): 106–125. online Lee, Erika. "America first, immigrants last: American xenophobia then and now." Journal of the Gilded Age and Progressive Era 19.1 (2020): 3–18. online Lee, Erika. America for Americans: A History of Xenophobia in the United States (2019). excerpt Leslie, Gregory John, and David O. Sears. "The Heaviest Drop of Blood: Black Exceptionalism Among Multiracials." Political Psychology (2022). online Rodríguez-Muñiz, Michael. Figures of the Future: Latino Civil Rights and the Politics of Demographic Change (Princeton University Press, 2021). Song, Miri. "Who counts as multiracial?" Ethnic and Racial Studies 44.8 (2021): 1296–1323. Stefaniak, Anna, and Michael J. A. Wohl.
"In time, we will simply disappear: Racial demographic shift undermines privileged group members’ support for marginalized social groups via collective angst." Group Processes & Intergroup Relations 25.3 (2022): NP1-NP23. online Alt-right Anti-immigration politics in the United States Conspiracy theories involving race and ethnicity Conspiracy theories promoted by Donald Trump Cultural assimilation Demography Democratic backsliding in the United States White genocide conspiracy theory White supremacy in the United States
Great Replacement conspiracy theory in the United States
[ "Environmental_science" ]
3,603
[ "Demography", "Environmental social science" ]
70,982,440
https://en.wikipedia.org/wiki/IQOO%203
The iQOO 3 is an Android-based smartphone designed, developed and marketed by the Vivo-owned iQOO brand as part of its iQOO gaming smartphone series. The phone was announced on February 25, 2020. The iQOO 3 was introduced as a premium-segment smartphone with a special focus on gaming. Design With its slender 9.2 mm body and glossy glass-sandwich finish, the iQOO 3 stands out on its own. It is available in Volcano Orange, Tornado Black and Quantum Silver colors. It features a Full HD+ Super AMOLED HDR10+ display with 800 nits brightness and a punch hole for the front camera. The screen supports a 180 Hz touch-sampling rate ("Super Touch Response"), suited to gaming phones. Additional features include an under-display optical fingerprint sensor along with face recognition for unlocking the phone, and a quad camera for photography. Specification Hardware The iQOO 3 is a smartphone with a slate form factor; its dimensions are 158.5 x 74.9 x 9.2 mm (6.24 x 2.95 x 0.36 in) and it weighs 214.5 g (7.58 oz). The device is equipped with dual SIM support for GSM, HSPA and LTE, supports major 5G bands, and offers Wi-Fi 802.11 a/b/g/n/ac/6, dual-band, Wi-Fi Direct and hotspot connectivity, with Bluetooth 5.1, A2DP, LE and aptX HD support. For navigation, GPS with A-GPS, GLONASS, BeiDou and GALILEO are supported. It has a 6.44-inch, 100.1 cm2 (~84.3% screen-to-body ratio) diagonal Super AMOLED, HDR10+ touchscreen; the screen supports 800 nits of brightness with a peak brightness of up to 1200 nits. The non-removable 4440 mAh lithium-polymer battery supports fast charging at up to 55 W. The phone is powered by a Qualcomm SM8250 Snapdragon 865 5G (7 nm+) octa-core chipset (1x2.84 GHz Kryo 585 & 3x2.42 GHz Kryo 585 & 4x1.8 GHz Kryo 585) with an Adreno 650 GPU. The phone comes in multiple configurations: 128 GB 6 GB RAM (China only), 128 GB 8 GB RAM, 128 GB 12 GB RAM, 256 GB 8 GB RAM and 256 GB 12 GB RAM, all with UFS 3.1 storage. The iQOO 3 features a five-camera setup: a single 32 MP front camera for selfies and a quad camera array at the back. The main camera sensor is a 48 MP, f/1.8 (wide), 1/2.0", 0.8 μm unit with PDAF, supported by a 13 MP, f/2.5, 50 mm telephoto lens with PDAF and 2x optical zoom, and a 13 MP, f/2.2, 120˚, 16 mm ultrawide lens with AF. It also has a 2 MP, f/2.4 depth sensor that helps capture better portrait pictures. The iQOO 3 is capable of recording 4K video at 30/60 fps and full HD 1080p at 30 fps. The main sensor has gyro-EIS support for video stabilization and can also record 720p at 240 fps. For an enhanced gaming experience, the phone also has Monster Touch Buttons, two pressure-sensitive buttons on the side frame that allow quick multi-finger operations in games while providing better grip and comfort. 4D vibration can simulate the recoil when shooting and the vibration of the steering wheel when driving, making the game experience more realistic. Software iQOO launched the iQOO 3 with Android 10 and iQOO UI 1.0. The smartphone received an Android 11-based iQOO UI update in 2021, and an Android 12-based update on the same iQOO UI started rolling out in March and April 2022. References Vivo smartphones Discontinued flagship smartphones Mobile phones with 4K video recording Mobile phones with multiple rear cameras Mobile phones introduced in 2020
IQOO 3
[ "Technology" ]
907
[ "Discontinued flagship smartphones", "Flagship smartphones" ]
70,982,737
https://en.wikipedia.org/wiki/SET%20domain%20containing%20protein%201A
SET domain containing protein 1A (SETD1A) is a protein that serves as a component of a histone methyltransferase (HMT) complex that produces mono-, di-, and trimethylated histone H3 at the Lys4 residue (K4). SETD1A is highly homologous with SETDB1 but has a distinct subnuclear distribution. Clinical significance Mutations of the SETD1A gene can cause neurodevelopmental disorder with speech impairment and dysmorphic facies (NEDSID), a disorder first described in 2021, and early-onset epilepsy with or without developmental delay, first described in 2019. According to a review published in 2018, mutations of the SETD1A gene may increase the risk of schizophrenia, based on studies available up to that date. History The protein was first described in humans in 2003 by Wysocka et al. See also SETDB1 - highly homologous to SETD1A SET domain References Genes on human chromosome 16
SET domain containing protein 1A
[ "Chemistry" ]
214
[ "Biochemistry stubs", "Protein stubs" ]
70,983,296
https://en.wikipedia.org/wiki/Kampot%20sea%20salt
Kampot sea salt (, ) is extracted from seawater through salt evaporation ponds in the coastal Kampot and Kep provinces. Salt farms cover around 4,748 hectares of land in both provinces and are owned by 200 families who are members of the Kampot-Kep Salt Association. The highest quality Kampot sea salt is Kampot Flower of Salt (, ), the fleur de sel harvested in small quantities from each pond only during April and May, the warmest months of the year, when there is little to no wind. History Salt production has a long history in the region, but the industry grew rapidly in the 1940s and 1950s. During the Khmer Rouge era and the subsequent Cambodian Civil War, salt production was nationalized. In 1986, a group of Kampot residents were granted 50 hectares of land by the state and given permission to start salt production as private entrepreneurs. In 2014, due to favourable weather conditions and an extended harvest season, 147,000 tonnes of Kampot sea salt were collected, almost double the previous year's harvest of 80,000 tonnes. In 2015, an even longer dry season allowed for the harvest of 170,000 tonnes of salt, increasing the country's Kampot sea salt reserves to 270,000 tonnes. The amount of harvested Kampot sea salt dropped to 140,000 tonnes in 2016. In 2017, an export contract between local producer Confirel Co Ltd and French company Le Guerandais was signed for 20 tonnes of unprocessed raw Kampot sea salt at a price of 58 USD per tonne, marking the first-ever export contract for a Kampot sea salt producer. In 2018, to tackle iodine deficiency among its population, the government of Cambodia banned the sale of non-iodized salt in Cambodia from 2019, while salt iodization had already been made mandatory for Cambodian salt producers since 2003. In the late 2010s, Cambodian salt producers were reporting that changing weather patterns and the rising sea level caused by global warming were negatively affecting Kampot salt production. A record low of 18,430 tonnes of Kampot sea salt was harvested in 2019. Due to the dwindling income, some Cambodian salt producers began to sell their salt farms and the children of salt workers were increasingly forced to drop out of school, prompting the Ministry of Industry, Science, Technology and Innovation of Cambodia to set up a working group in 2021 to tackle the issue. In 2022, the ministry presented a 2022–2026 salt development strategy to develop, manage and preserve salt farms, as well as improve the economic efficiency of salt production. See also Kampot fish sauce Kampot pepper References External links The Life of a Cambodian Salt Farmer. 11 March 2016. Radio Free Asia via YouTube. Edible salt Cambodian cuisine Kampot province
Kampot sea salt
[ "Chemistry" ]
568
[ "Edible salt", "Salts" ]
70,983,445
https://en.wikipedia.org/wiki/Gamma2%20Octantis
γ2 Octantis, Latinized to Gamma2 Octantis (Gamma2 Oct), is a solitary star in the southern circumpolar constellation Octans. It has an apparent magnitude of 5.72, allowing it to be faintly seen with the naked eye. Parallax measurements place the object at a distance of 320 light years, and it is currently receding with a heliocentric radial velocity of . Gamma2 Oct has a stellar classification of K0 III, indicating that it is a red giant. At present it has 115% of the mass of the Sun but has expanded to 10.54 times the Sun's radius. It shines at 52.6 times the luminosity of the Sun from its enlarged photosphere at an effective temperature of , giving it a yellow-orange glow. Gamma2 Oct has a poorly constrained metallicity of about 91% that of the Sun and spins with a projected rotational velocity of about . References Octans K-type giants
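The quoted luminosity and radius together pin down the effective temperature through the Stefan–Boltzmann law; the following is a worked sketch rather than a value taken from the article, with the solar effective temperature of roughly 5772 K as an assumed input:

\[
L = 4\pi R^{2}\sigma T_{\mathrm{eff}}^{4}
\;\Longrightarrow\;
T_{\mathrm{eff}} = T_{\odot}\left(\frac{L}{L_{\odot}}\right)^{1/4}\left(\frac{R_{\odot}}{R}\right)^{1/2}
\approx 5772\,\mathrm{K}\times\frac{52.6^{1/4}}{\sqrt{10.54}}\approx 4.8\times 10^{3}\,\mathrm{K},
\]

which is consistent with the K0 III classification and the yellow-orange hue described above.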
Gamma2 Octantis
[ "Astronomy" ]
234
[ "Octans", "Constellations" ]
70,983,902
https://en.wikipedia.org/wiki/AG%20Virginis
AG Virginis is an eclipsing binary star system in the equatorial constellation of Virgo. With a maximum apparent visual magnitude of 8.51, it is too faint to be visible to the naked eye. The system is located at a distance of approximately 820 light years from the Sun based on parallax measurements. The variability of this system was first reported by P. Guthnick and R. Prager in 1929, and R. S. Dugan determined the periodicity in 1933. C. Blanco and F. Catalano proposed in 1970 that this is a semidetached binary in which the primary component has filled its Roche lobe, thereby allowing mass transfer. They noted that the orbital period appeared to vary slightly with a ~40 year cycle, which could be explained by a third component. In 1986, J. Kaluzny produced a model for the light curve which suggested this is instead a contact binary. Multiple observers noted a permanent asymmetry in the light curve, with the primary minimum appearing distorted. A localized "hot spot" hypothesis was proposed to explain this feature. This is a close binary system with an orbital period of . It is classified as a W Ursae Majoris variable, which means the components are in near contact with each other and their mutual gravitational influence is distorting their shapes. The components are separated by just 4.5 times the radius of the Sun, and the orbital plane is inclined at an angle of 84.4° to the line of sight from the Earth. This causes the two stars to eclipse each other during every orbit. The net visual brightness decreases by 0.58 magnitude during the primary eclipse and by 0.45 during the secondary eclipse. The combined spectrum of the system has a varying stellar classification in the range A7V–A9V, matching an A-type main-sequence star. The primary component has 2.2 times the mass and radius of the Sun, while the secondary has 74% of the Sun's mass and 136% of the Sun's radius. References Further reading A-type main-sequence stars W Ursae Majoris variables Virgo (constellation)
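As a rough consistency check on the quoted parameters (a derived sketch, not a value from the article), Kepler's third law links the stated separation and masses to the orbital period. Taking the separation a = 4.5 R⊙ ≈ 0.021 AU and total mass M ≈ 2.2 + 0.74 = 2.94 M⊙:

\[
\left(\frac{P}{\mathrm{yr}}\right)^{2} = \frac{(a/\mathrm{AU})^{3}}{M/M_{\odot}}
\;\Longrightarrow\;
P \approx \sqrt{\frac{(0.021)^{3}}{2.94}}\;\mathrm{yr} \approx 0.65\ \mathrm{days},
\]

a fraction of a day, as expected for a near-contact W Ursae Majoris system.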
AG Virginis
[ "Astronomy" ]
455
[ "Virgo (constellation)", "Constellations" ]
70,984,019
https://en.wikipedia.org/wiki/Research%20transparency
Research transparency is a major aspect of scientific research. It covers a variety of scientific principles and practices: reproducibility, data and code sharing, citation standards and verifiability. The definitions and norms of research transparency differ significantly depending on the disciplines and fields of research. Due to the lack of consistent terminology, research transparency has frequently been defined negatively, by addressing non-transparent practices (which are part of questionable research practices). After 2010, recurrent issues of research methodology have been increasingly acknowledged as a structural crisis that involves deep changes at all stages of the research process. Transparency has become a key value of the open science movement, which evolved from an initial focus on publishing to encompass a large diversity of research outputs. New common standards for research transparency, like the TOP Guidelines, aim to build and strengthen open research culture across disciplines and epistemic cultures. Definitions Confused terminologies There is no widespread consensus on the definition of research transparency. Differences between disciplines and epistemic cultures have largely contributed to different meanings. The reproduction of past research has been a leading source of dissent. In an experimental setting, reproduction relies on the same set-up and apparatus, while replication only requires the use of the same methodology. Conversely, computational disciplines use reversed definitions of the terms replicability and reproducibility. Alternative taxonomies have proposed to do away entirely with the ambiguity of reproducibility/replicability/repeatability. Goodman, Fanelli and Ioannidis recommended instead a distinction between method reproducibility (same experimental/computational setup) and result reproducibility (different setup but same overall principles). Core institutional actors continue to disagree on the meaning and usage of key concepts. In 2019, the National Academies of Science of the United States retained the experimental definition of replication and reproduction, which remains "at odds with the more flexible way they are used by [other] major organizations". The Association for Computing Machinery opted in 2016 for the computational definition and also added an intermediary notion of repeatability, where a different research team uses exactly the same measurement system and procedure. The debate over research transparency has also created new convergences between different disciplines and academic circles. In The Problem with Science (2021), Rufus Barker Bausell argues that all disciplines, including the social sciences, currently face similar issues to medicine and the physical sciences: "The problem, which has come to be known as the reproducibility crisis, affects almost all of science, not one or two individual disciplines." Negative definitions Due to the lack of consistent terminology over research transparency, scientists, policy-makers and other major stakeholders have increasingly relied on negative definitions: identifying the practices and forms that harm or disrupt any common ideal of research transparency. The taxonomy of scientific misconduct has been gradually expanded since the 1980s.
The concept of questionable research practices (or QRP) was first introduced in a 1992 report of the Committee on Science, Engineering, and Public Policy as a way to address potentially non-intentional research failures (such as inadequacies in the research data management process). Questionable research practices cover a large grey area of problematic practices, which are frequently associated with deficiencies in research transparency. In 2016, a study identified as many as 34 questionable research practices or "degrees of freedom" that can occur at every step of a project (the initial hypothesis, the design of the study, the collection of the data, the analysis and the reporting). Surveys of disciplinary practices have shown large differences in the admissibility and spread of questionable research practices. While data fabrication and, to a lesser extent, the rounding of statistical indicators like the p value are largely rejected, the non-publication of negative results or the addition of supplementary data are not identified as major issues. In 2009, a meta-analysis of 18 surveys estimated that less than 2% of scientists "admitted to have fabricated, falsified or modified data or results at least once". Real prevalence may be under-estimated due to self-reporting: regarding "the behaviour of colleagues admission rates were 14.12%". Questionable research practices are more widespread, as more than one third of the respondents admitted to having engaged in them at least once. A large 2021 survey of 6,813 respondents in the Netherlands found significantly higher estimates, with 4% of the respondents engaging in data fabrication and more than half of the respondents engaging in questionable research practices. The higher rates can be attributed either to a deterioration of ethical norms or to "the increased awareness of research integrity in recent years". A new dimension of open science? Transparency has been increasingly acknowledged as an important component of open science. Until the 2010s, definitions of open science were mostly focused on technical access and enhanced participation and collaboration between academics and non-academics. In 2016, Liz Lyon identified transparency as a "third dimension" of open science, due to the fact that "the concept of transparency and the associated term ‘reproducibility’, have become increasingly important in the current interdisciplinary research environment." According to Kevin Elliott, the open science movement "encompasses a number of different initiatives aimed at somewhat different forms of transparency." First drafted in 2014, the TOP guidelines have significantly contributed to bringing transparency onto the agenda of the open science movement. They aim to promote an "open research culture" and implement "strong incentives to be more transparent". They rely on eight standards, with different levels of compliance. While the standards are modular, they also aim to articulate a consistent ethos of science, as "they also complement each other, in that commitment to one standard may facilitate adoption of others." This open science framework of transparency has in turn been co-opted by leading contributors and institutions on the topic of research transparency. After 2015, contributions from science historians underlined that there has been no significant deterioration of research quality, as past experiments and research designs were not significantly better conceived and the rate of false or partially false findings has likely remained approximately constant over the last decades.
Consequently, proponents of research transparency have come to embrace more explicitly the discourse of open science: the culture of scientific transparency becomes a new ideal to achieve rather than a fundamental principle to re-establish. The concept of transparency has contributed to creating convergences between open science and other open movements in areas such as open data or open government. In 2015, the OECD described transparency as a common "rationale for open science and open data". History Discourse and practices of research transparency (before 1945) Transparency has been a fundamental criterion of experimental research for centuries. Successful replications became an integral part of the institutional discourse of the natural sciences (then called natural philosophy) in the 17th century. An early scientific society of Florence, the Accademia del Cimento, adopted in 1657 the motto provando e riprovando as a call for "repeated (public) performances of experimental trials". A key member of the Accademia, the naturalist Francesco Redi, described extensively the forms and benefits of procedural experimentation, which made it possible to check for random effects, the soundness of the experimental design, or causal relationships through repeated trials. Replication and the open documentation of scientific experiments became a key component of the diffusion of scientific knowledge in society: once they attained a satisfying rate of success, experiments could be performed in a variety of social spaces such as courts, marketplaces or learned salons. Although transparency was early on acknowledged as a key component of science, it was not defined consistently. Most concepts associated today with research transparency arose as terms of art with no clear and widespread definitions. The concept of reproducibility appeared in an article on the "Methods of illuminations" first published in 1902: one of the methods examined was deemed limited regarding "reproducibility and constancy". In 2019, the National Academies underlined that the distinction between reproduction, repetition and replication has remained largely unclear and unharmonized across disciplines: "What one group means by one word, the other group means by the other word. These terms — and others, such as repeatability — have long been used in relation to the general concept of one experiment or study confirming the results of another." Beyond this lack of formalization, there was a significant drift between the institutional and disciplinary discourse on research transparency and the reality of research work, which has persisted into the 21st century. Due to the high cost of the apparatus and the lack of incentives, most experiments were not reproduced by contemporary researchers: even a committed proponent of experimentalism like Robert Boyle had to resort to a form of virtual experimentalism, describing in detail a research design that had only been run once. For Friedrich Steinle, the gap between the postulated virtue of transparency and the material conditions of science has never been resolved: "The rare cases in which replication actually is attempted are those that either are central for theory development (e.g., by being incompatible with existing theory) or promise broad attention due to major economical perspectives. Despite the formal ideal of replicability, we do not live in a culture of replication."
Preconditions of the transparency crisis (1945–2000) The development of big science after the Second World War created unprecedented challenges for research transparency. The generalization of statistical methods across a large number of fields, as well as the increasing breadth and complexity of research projects, entailed a series of concerns about the lack of proper documentation of the scientific process. Due to the expansion of the published research output, new quantitative methods for literature surveys were developed under the labels of meta-analysis or meta-science. These rely on the assumption that quantitative results and the details of the experimental and observational framework (such as the size or the composition of the sample) are sound. In 1966, Stanley Schor and Irving Karten published one of the first generic evaluations of statistical methods in 67 leading medical journals. While few outright problematic papers were found, "in almost 73% of the reports read (those needing revision and those which should have been rejected), conclusions were drawn when the justification for these conclusions was invalid". In the 1970s and the 1980s, scientific misconduct gradually ceased to be presented as an individual failing and became a collective problem that needed to be addressed by scientific institutions and communities. Between 1979 and 1981, several major cases of scientific fraud and plagiarism drew larger attention to the issue from researchers and policy-makers in the United States. In a well-publicized investigation, Betrayers of the Truth, two scientific journalists described scientific fraud as a structural problem: "As more cases of frauds broke into public view (…) we wondered if fraud wasn't a quite regular minor feature of the scientific landscape (…) Logic, replication, peer review — all had been successfully defied by scientific forgers, often for extended periods of time". The codification of research integrity has been the main institutional answer to this increased public scrutiny, with "numerous codes of conduct field specific, national, and international alike." The reproducibility/transparency debate (2000–2015) In the 2000s, long-standing issues with the standardization of research methodology were increasingly presented as a structural crisis which, "if not addressed the general public will inevitably lose its trust in science." The early 2010s are commonly considered a turning point: "it wasn’t until sometime around 2011–2012 that the scientific community’s consciousness was bombarded with irreproducibility warnings". An early significant contribution to the debate was the controversial and influential claim of John Ioannidis from 2005 that "most published research findings are false". The main argument was based on the excessively lax experimental standards in place, with numerous weak results being presented as solid research: "the majority of modern biomedical research is operating in areas with very low pre- and post-study probability for true findings". Due to its publication in PLOS Medicine, Ioannidis's study had a considerable echo in psychology, medicine and biology. In the following years, large-scale projects attempted to assess experimental reproducibility.
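The core of Ioannidis's argument can be stated as a short formula for the positive predictive value (PPV) of a research finding; the following is a sketch of the standard bias-free form, where R is the pre-study odds that a tested relationship is true, α the significance level and 1 − β the statistical power:

\[
\mathrm{PPV} = \frac{(1-\beta)\,R}{R - \beta R + \alpha}.
\]

With conventional values α = 0.05 and 1 − β = 0.8, a field where only one in twenty tested hypotheses is true (R = 0.05, an illustrative assumption of this sketch) yields PPV = 0.04/0.09 ≈ 0.44: most nominally significant findings would be false, before any bias is even accounted for.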
In 2015, the Reproducibility Project: Psychology attempted to reproduce 100 studies from three top psychology journals (Journal of Personality and Social Psychology, Journal of Experimental Psychology: Learning, Memory, and Cognition, and Psychological Science): while nearly all papers had reproducible effects, it was found that only 36% of the replications were statistically significant (p value below the common threshold of 0.05). In 2021, another Reproducibility Project, Cancer Biology, analyzed 53 top papers about cancer published between 2010 and 2012 and established that the effect sizes were on average 85% smaller than the original findings. During the 2010s, the concept of a reproducibility crisis was expanded to a wider array of disciplines. The share of citations per year of the seminal paper of John Ioannidis, "Why Most Published Research Findings Are False", in the main fields of research according to the metadata recorded by the academic search engine Semantic Scholar (6,349 citations as of June 2022) shows how this framing has especially expanded to the computing sciences. In economics, a replication of 18 experimental studies in two major journals found a failure rate comparable to psychology or medicine (39%). Several global surveys have reported a growing uneasiness of scientific communities over reproducibility and other issues of research transparency. In 2016, Nature highlighted that "more than 70% of researchers have tried and failed to reproduce another scientist's experiments, and more than half have failed to reproduce their own experiments". The survey also found "no consensus on what reproducibility is or should be", in part due to disciplinary differences, which makes it harder to assess the steps necessary to overcome the issues at play. The Nature survey has also been criticized for its paradoxical lack of research transparency, since it was based not on a representative sample but on an online survey: it "relied on convenience samples and other methodological choices that limit the conclusions that can be made about attitudes among the larger scientific community". Despite these mixed results, the Nature survey has been widely disseminated and has become a common reference point for studies of research transparency. The reproducibility crisis and other issues of research transparency have become a public topic addressed in the general press: "Reproducibility conversations are also unique compared to other methodological conversations because they have received sustained attention in both the scientific literature and the popular press". Research transparency and open science (2015–) Since 2000, the open science movement has expanded beyond access to scientific outputs (publications, data or software) to encompass the entire process of scientific production. In 2018, Vicente-Saez and Martinez-Fuentes attempted to map the common values shared by the standard definitions of open science in the English-speaking scientific literature indexed on Scopus and the Web of Science. Access is no longer the main dimension of open science, as it has been extended by more recent commitments toward transparency, collaborative work and social impact. Through this process, open science has been increasingly structured around a consistent set of ethical principles: "novel open science practices have developed in tandem with novel organising forms of conducting and sharing research through open repositories, open physical labs, and transdisciplinary research platforms.
Together, these novel practices and organising forms are expanding the ethos of science at universities." The global scale of the open science movement and its integration into a large variety of technical tools, standards and regulations make it possible to overcome the "classic collective action problem" embodied by research transparency: there is a structural discrepancy between the stated objectives of scientific institutions and the lack of incentives to implement them at an individual level. The formalization of open science as a potential framework to ensure research transparency was initially undertaken by institutional and community initiatives. The TOP guidelines were elaborated in 2014 by a committee for Transparency and Openness Promotion that included "disciplinary leaders, journal editors, funding agency representatives, and disciplinary experts largely from the social and behavioral sciences". As noted above, the guidelines rely on eight modular standards with different levels of compliance, designed to complement one another. After 2015, these initiatives have partly influenced new regulations and codes of ethics. The European Code of Conduct for Research Integrity from 2017 is strongly structured around open science and open data: it "pays data management almost an equal amount of attention as publishing and is also in this sense the most advanced of the four CoCs." First adopted in July 2020, the Hong Kong principles for assessing researchers acknowledge open science as one of the five pillars of scientific integrity: "It seems clear that the various modalities of open science need to be rewarded in the assessment of researchers because these behaviors strongly increase transparency, which is a core principle of research integrity." Forms of research transparency Research transparency takes a large variety of forms depending on the disciplinary culture, the material conditions of research and the interactions between scientists and other social circles (policy-makers, non-academic professionals, the general audience). For Lyon, Jeng and Mattern, "the term ‘transparency’ has been applied in a range of contexts by diverse research stakeholders, who have articulated and framed the concept in a number of different ways." In 2020, Kevin Elliott introduced a taxonomy of eight dimensions of research transparency: purpose, audience, content, timeframe, actors, mechanism, venues and dangers. For Elliott, not all forms of transparency are achievable or desirable, so a proper terminology can help in making the most appropriate decisions: "While these are important objections, the taxonomy of transparency considered here suggests that the best response to them is typically not to abandon the goal of transparency entirely [but] to consider what forms of transparency are best able to minimize them." Method reproducibility Goodman, Fanelli and Ioannidis define method reproducibility as "the provision of enough detail about study procedures and data so the same procedures could, in theory or in actuality, be exactly repeated." This meaning is largely synonymous with replicability in a computational context or reproducibility in an experimental context.
In the report of the National Academies of Science, which opted for an experimental terminology, the counterpart of method reproducibility was described as "obtaining consistent results using the same input data; computational steps, methods, and code; and conditions of analysis". Method reproducibility is more attainable in the computational sciences: as long as it behaves as expected, the same code should produce the same output. Open code, open data and, more recently, research notebooks are common recommendations to enhance method reproducibility. In principle, the wider availability of research outputs makes it possible to assess and audit the process of analysis. In practice, Roger Peng underlined as early as 2011 that many projects require "computing power that may not be available to all researchers". This issue has worsened in some areas such as artificial intelligence or computer vision, as the development of very large deep learning models makes it nearly impossible, or prohibitively costly, to recreate them, even when the original code and data are open. Method reproducibility can also be affected by library dependencies, as open code can rely on external programs which may not always be available or compatible. Two studies in 2018 and 2019 have shown that a large share of the research notebooks hosted on GitHub are no longer usable, either because required extensions are no longer available or because of issues in the code. In the experimental sciences, there is no commonly agreed criterion of method reproducibility: "in practice, the level of procedural detail needed to describe a study as "methodologically reproducible" does not have consensus." Result reproducibility Goodman, Fanelli and Ioannidis define result reproducibility as "obtaining the same results from the conduct of an independent study whose procedures are as closely matched". Result reproducibility is comparable to replication in an experimental context and reproducibility in a computational context. The definition of replicability retained by the National Academies of Science largely applies to it: "obtaining consistent results across studies aimed at answering the same scientific question, each of which has obtained its own data." The reproducibility crisis met in experimental disciplines like psychology or medicine is mostly a crisis of "result reproducibility", since it concerns research that cannot simply be re-executed but involves the independent recreation of the experimental design. As such, it is arguably the most debated form of research transparency of recent years. Result reproducibility is harder to achieve than other forms of research transparency. It involves a variety of issues that may include computational reproducibility, the accuracy of scientific measurement and the diversity of methodological approaches. There is no universal standard to determine how closely the original procedures are matched, and criteria may vary depending on the discipline or even on the field of research. Consequently, meta-analyses of reproducibility have faced significant challenges. A 2015 study of 100 psychology papers conducted by the Open Science Collaboration was confronted with the "lack of a single accepted definition", which "opened the door to controversy about their methodological approach and conclusions" and made it necessary to fall back on "subjective assessments" of result reproducibility.
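In computational settings, the recommendations around open code and notebooks boil down in practice to scripted analyses that pin their sources of nondeterminism and record their environment alongside the results. A minimal illustrative sketch in Python follows; the input file and the bootstrap analysis are hypothetical placeholders, and only the reproducibility scaffolding is the point:

# Minimal sketch of a method-reproducible analysis script.
# The dataset path and the analysis itself are hypothetical placeholders;
# the point is the scaffolding: fixed seeding, environment capture,
# and data fingerprinting.
import hashlib
import json
import platform
import random
import sys

SEED = 20240101  # fixed seed: reruns draw the same pseudo-random numbers

def data_fingerprint(path):
    """Hash the raw input so readers can verify they analyse the same bytes."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def analyse(rows):
    """Placeholder analysis: a bootstrap mean of one numeric column."""
    rng = random.Random(SEED)  # seeded generator, independent of global state
    means = []
    for _ in range(1000):
        sample = [rng.choice(rows) for _ in rows]
        means.append(sum(sample) / len(sample))
    means.sort()
    return {"mean": sum(rows) / len(rows),
            "ci95": (means[25], means[974])}

if __name__ == "__main__":
    path = "data/measurements.csv"  # hypothetical input file
    rows = [float(line.split(",")[1])
            for line in open(path).read().splitlines()[1:]]
    report = {
        "input_sha256": data_fingerprint(path),
        "python": sys.version,
        "platform": platform.platform(),
        "seed": SEED,
        "results": analyse(rows),
    }
    # The JSON report travels with the paper, so a rerun can be checked
    # against the published numbers step by step.
    print(json.dumps(report, indent=2))

Because every random draw flows through the seeded generator and the inputs are fingerprinted, a second team running the same script on the same file should obtain byte-identical results, which is the sense of method reproducibility discussed above.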
Observation reproducibility and verifiability In 2018, Sabina Leonelli defined observation reproducibility as the "expectation being that any skilled researcher placed in the same time and place would pick out, if not the same data, at least similar patterns". This expectation covers a large range of scientific and scholarly practices in non-experimental disciplines: "A tremendous amount of research in the medical, historical and social sciences does not rest on experimentation, but rather on observational techniques such as surveys, descriptions and case reports documenting unique circumstances". The development of open scientific infrastructure has radically transformed the status and the availability of scientific data and other primary sources. Access to these resources has been thoroughly transformed by digitization and the attribution of unique identifiers. Permanent digital object identifiers (DOIs) have been allocated to datasets since the early 2000s, which solved a long-standing debate on the citability of scientific data. Increased transparency of citations to primary sources or research materials has been framed by Andrew Moravcsik as a "revolution in qualitative research". Value transparency Transparency of research values has been a major focus of disciplines with strong involvement in policy-making, such as environmental studies or the social sciences. In 2009, Heather Douglas underlined that the public discourse on science has been largely dominated by normative ideals of objective research: if the procedures have been correctly applied, scientific results should be "value-free". For Douglas, this ideal remains largely at odds with the actual process of research and scientific advising, as pre-defined values may largely predetermine choices about the concepts, the protocols and the data used. Douglas argued instead in favor of a disclosure of the values held by researchers: "the values should be made as explicit as possible in this indirect role, whether in policy documents or in the research papers of scientists." In the 2010s, several philosophers of science attempted to systematize value transparency in the context of open science. In 2017, Kevin Elliott emphasized three conditions for value transparency in research, the first of which involved "being as transparent as possible about (…) data, methods, models and assumptions so that value influence can be scrutinized". Review and editorial transparency Until the 2010s, the editorial practices of scholarly publishing remained largely informal and little studied: "Despite 350 years of scholarly publishing (…) research on ItAs [Instruction to authors], and on their evolution and change, is scarce." Editorial transparency has recently been acknowledged as a natural expansion of the debate over research reproducibility. Several principles laid out in the 2015 TOP guidelines already implied the existence of explicit editorial standards.
Unprecedented attention to editorial transparency has also been motivated by the diversification and growing complexity of the open science publishing landscape: "Triggered by a wide variety of expectations for journals’ editorial processes, journals have started to experiment with new ways of organizing their editorial assessment and peer review systems (...) The arrival of these innovations in an already diverse set of practices of peer review and editorial selection means we can no longer assume that authors, readers, and reviewers simply know how editorial assessment operates." Transparent by design: developing open workflows The TOP guidelines have set an influential transdisciplinary standard to establish result reproducibility in an open science context. While experimental and computational disciplines remain a primary focus, the standards have striven to integrate concerns and formats more specific to other disciplinary practices (such as research materials). Informal incentives like badges or indexes were initially advocated as a way to support the adoption of harmonized policies with regard to research transparency. With the development of open science, regulation and standardized infrastructures or processes are increasingly favored. Sharing of research outputs Data sharing was identified early on as a major potential solution to the reproducibility crisis and the lack of solid guidelines for statistical indicators. In 2005, John Ioannidis hypothesized that "some kind of registration or networking of data collections or investigators within fields may be more feasible than registration of each and every hypothesis-generating experiment." The sharing of research outputs is covered by three standards of the TOP guidelines: Data transparency (2), Analytic/code methods transparency (3) and Research materials transparency (4). All the relevant data, code and research materials are to be stored in a "trusted repository", and all analyses are to be reproduced independently prior to publication. Extended citation standards While citation standards are commonly applied to academic references, there is much less formalization for all the other research outputs, such as data, code, primary sources or qualitative assessments. In 2012, the American Political Science Association adopted new policies for open qualitative research. They covered three dimensions of transparency: data transparency (in the sense of precise bibliographic data on the original sources), analytic transparency (in regard to claims extrapolated from the cited sources) and production transparency (in reference to the editorial choices made in the selection of the sources). In 2014, Andrew Moravcsik advocated the implementation of a transparency appendix, containing detailed quotes of original sources as well as annotations "explaining how the source supports the claim being made". According to the TOP Guidelines, "appropriate citation for data and materials" should be provided in each publication. Consequently, scientific outputs like code or datasets are fully acknowledged as citable contributions: "Regular and rigorous citation of these materials credit them as original intellectual contributions." Pre-registrations Pre-registrations are covered by two TOP guidelines: Preregistration of studies (6) and Preregistration of analysis plans (7). In both cases, for the highest level of compliance, journals should provide a "link and badge in article to meeting requirements".
Preregistration aims to preventively address a variety of questionable research practices. It usually takes the form of "a timestamped uneditable research plan to a public archive [that] states the hypotheses to be tested, target sample sizes". Preregistration acts as an ethical contract, as it theoretically constrains "the researcher degrees of freedom that make QRPs and p-hacking work". Preregistration does not solve the full range of questionable research practices. In particular, selective reporting of results remains compatible with a predefined research plan: "preregistration does not fully counter publication bias as it does not guarantee that findings will be reported." It has been argued that preregistration may also in some cases harm the quality of the research output by creating artificial constraints that do not fit the reality of the research field: "Preregistration may interfere with valid inference because nothing prevents a researcher from preregistering a poor analytical plan." While advocated as a relatively cost-free solution, preregistration may in reality be harder to implement, as it relies on a significant commitment on the part of the researchers. An empirical study of the adoption of open science experiments in psychology journals has shown that "Adoption of pre-registration lags relative to other open science practices (…) from 2015 to 2020". Consequently, "even within researchers who see field-wide benefits of pre-registration, there is uncertainty surrounding the costs and benefits to individuals." Replication studies Replication studies or assessments of replicability aim to redo one or several original studies. Although the concept only appeared in the 2010s, replication studies have existed for decades but were not acknowledged as such. The 2019 report of the National Academies includes a meta-analysis of 25 replications published between 1986 and 2019. It finds that the majority of the replications concern the medical and social sciences (especially psychology and behavioral economics) and that there is for now no standardized evaluation criterion: "methods of assessing replicability are inconsistent and the replicability percentages depend strongly on the methods used." Consequently, at least as of 2019, replication studies cannot be aggregated to extrapolate a replicability rate: they "are not necessarily indicative of the actual rate of non-replicability across science for a number [of reasons]". The TOP guidelines have called for enhanced recognition and valorization of replication studies. The eighth standard states that compliant journals should use "registered Reports as a submission option for replication studies with peer review". Open editorial policies In July 2018, several publishers, librarians, journal editors and researchers drafted a Leiden Declaration for Transparent Editorial Policies. The declaration underlined that journals "often do not contain information about reviewer selection, review criteria, blinding, the use of digital tools such as text similarity scanners, as well as policies on corrections and retractions", and called for this lack of transparency to be addressed. The declaration identifies four main publication and peer review phases that should be better documented: At submission: details on the governance of the journal, its scope, the editorial board or the rejection rates. During review: criteria for selection, timing of the review and model of peer review (double blind, single blind, open).
Publication: disclosure of the "roles in the review process".
Post-publication: "criteria and procedures for corrections, expressions of concern, retraction" and other changes.

In 2020, the Leiden Declaration was expanded and supplemented by a Platform for Responsible Editorial Policies (PREP). This initiative also aims to address the structural scarcity of data and empirical information on editorial policies and peer review practices. As of 2022, this database contains partially crowdsourced information on the editorial procedures of 490 journals, up from an initial base of 353 journals. The procedures evaluated notably include "the level of anonymity afforded to authors and reviewers; the use of digital tools such as plagiarism scanners; and the timing of peer review in the research and publication process". Despite these developments, research on editorial policies still highlights the need for "a comprehensive database that would allow authors or other stakeholders to compare journals based on their (…) requirements or recommendations".

References

Ethics and statistics Open science Scientific method
Research transparency
[ "Technology" ]
6,454
[ "Ethics and statistics", "Ethics of science and technology" ]
70,984,276
https://en.wikipedia.org/wiki/Foundation%20model
A foundation model, also known as a large X model (LxM), is a machine learning or deep learning model that is trained on vast datasets so it can be applied across a wide range of use cases. Generative AI applications like large language models are often examples of foundation models. Building foundation models is often highly resource-intensive, with the most advanced models costing hundreds of millions of dollars to cover the expenses of acquiring, curating, and processing massive datasets, as well as the compute power required for training. These costs stem from the need for sophisticated infrastructure, extended training times, and advanced hardware, such as GPUs. In contrast, adapting an existing foundation model for a specific task, or using it directly, is far less costly, as it leverages pre-trained capabilities and typically requires only fine-tuning on smaller, task-specific datasets.

Early examples of foundation models are language models (LMs) like OpenAI's GPT series and Google's BERT. Beyond text, foundation models have been developed across a range of modalities, including DALL-E and Flamingo for images, MusicGen for music, and RT-2 for robotic control. Foundation models are also being developed for fields like astronomy, radiology, genomics, music, coding, time-series forecasting, mathematics, and chemistry.

Definitions

The Stanford Institute for Human-Centered Artificial Intelligence's (HAI) Center for Research on Foundation Models (CRFM) coined the term "foundation model" in August 2021 to mean "any model that is trained on broad data (generally using self-supervision at scale) that can be adapted (e.g., fine-tuned) to a wide range of downstream tasks". This was based on their observation that preexisting terms, while overlapping, were not adequate, stating that "'(large) language model' was too narrow given [the] focus is not only language; 'self-supervised model' was too specific to the training objective; and 'pretrained model' suggested that the noteworthy action all happened after 'pretraining'." The term "foundation model" was chosen over "foundational model" because "foundational" implies that these models provide fundamental principles in a way that "foundation" does not.

As governments regulate foundation models, new legal definitions have emerged. In the United States, the Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence defines a foundation model as "an AI model that is trained on broad data; generally uses self-supervision; contains at least tens of billions of parameters; is applicable across a wide range of contexts". Also in the United States, the proposed AI Foundation Model Transparency Act of 2023 by House Representatives Don Beyer (D, VA) and Anna Eshoo (D, CA) defines a foundation model as "an artificial intelligence model trained on broad data, generally uses self supervision, generally contains at least 1,000,000,000 parameters, is applicable across a wide range of contexts, and exhibits, or could be easily modified to exhibit, high levels of performance at tasks that could pose a serious risk to security, national economic security, national public health or safety, or any combination of those matters." In the European Union, the European Parliament's negotiated position on the E.U. AI Act defines a foundation model as an "AI model that is trained on broad data at scale, is designed for generality of output, and can be adapted to a wide range of distinctive tasks".
In the United Kingdom, the Competition and Markets Authority's AI Foundation Models: Initial Report defines foundation models as "a type of AI technology that are trained on vast amounts of data that can be adapted to a wide range of tasks and operations."

The United States's definitions are the only ones to make reference to the size of a foundation model, and they differ on magnitude. Beyer and Eshoo's definition also specifies that foundation models must achieve a level of performance such that they are a potential danger. In contrast, the E.U. definition requires the model to be designed for generality of output. All definitions agree that foundation models must be trained on a broad range of data with potential applications in many domains.

History

Technologically, foundation models are built using established machine learning techniques like deep neural networks, transfer learning, and self-supervised learning. Foundation models differ from previous techniques in that they are general-purpose models that function as reusable infrastructure, instead of bespoke, one-off task-specific models. Advances in computer parallelism (e.g., CUDA GPUs), new developments in neural network architecture (e.g., Transformers), and the increased use of training data with minimal supervision all contributed to the rise of foundation models.

Foundation models began to materialize as the latest wave of deep learning models in the late 2010s. Relative to most prior work on deep learning, these language models demonstrated the potential of training on much larger web-sourced datasets using self-supervised objectives (e.g., predicting the next word in a large corpus of text). These approaches, which draw upon earlier works like word2vec and GloVe, deviated from prior supervised approaches that required annotated data (e.g., crowd-sourced labels). The 2022 releases of Stable Diffusion and ChatGPT (initially powered by the GPT-3.5 model) led to foundation models and generative AI entering widespread public discourse. Further, the releases of LLaMA, Llama 2, and Mistral in 2023 contributed to a greater emphasis on how foundation models are released, with open foundation models garnering considerable support and scrutiny.

Related concepts

Frontier models

Certain highly advanced foundation models are termed "frontier models", which have the potential to "possess dangerous capabilities sufficient to pose severe risks to public safety." These "dangerous capabilities" stem from the accidental or intentional misuse of such models, which in conjunction with their powerful nature can lead to severe harms. As foundation models continue to improve, some AI researchers speculate that almost all next-generation foundation models will be considered frontier models.

Since the concept of dangerous capabilities is inherently subjective, there is no strict designation for which foundation models qualify as frontier models. However, some generally held ideas for sufficiently dangerous capabilities include:
Designing and synthesizing new biological or chemical weapons
Producing and propagating convincing, tailored disinformation with minimal user instruction
Harnessing unprecedented offensive cyber capabilities
Evading human control through deceptive means

Due to frontier models' unique capabilities, it is difficult to effectively regulate their development and deployment. Because of their emergent nature, new dangerous capabilities can appear on their own in frontier models, both in the development stage and after being deployed.
Additionally, since frontier models continue to adapt after deployment, it remains difficult to mitigate all harms that arise from already-deployed models. If a frontier model happens to be open-source or is released online, the model can also disseminate rapidly, further hampering regulators by creating a lack of accountability.

General-purpose AI

Due to their adaptability to a wide range of use cases, foundation models are sometimes considered to be examples of general-purpose AI. In designing the EU AI Act, the European Parliament has stated that a new wave of general-purpose AI technologies shapes the overall AI ecosystem. The fuller structure of the ecosystem, in addition to the properties of specific general-purpose AI systems, influences the design of AI policy and research. General-purpose AI systems also often appear in people's everyday lives through applications and tools like ChatGPT or DALL-E.

Government agencies like the EU Parliament have identified the regulation of general-purpose AI, such as foundation models, as a high priority. General-purpose AI systems are often characterized by large size, opacity, and potential for emergence, all of which can create unintended harms. Such systems also heavily influence downstream applications, which further exacerbates the need for regulation. With regard to prominent legislation, a number of stakeholders have pushed for the EU AI Act to include restrictions on general-purpose AI systems, all of which would also apply to foundation models.

Technical details

Modeling

For a foundation model to effectively generalize, it must acquire rich representations of the training data. As a result, expressive model architectures that efficiently process large-scale data are often preferred in building foundation models. Currently, the Transformer architecture is the de facto choice for building foundation models across a range of modalities.

Training

Foundation models are built by optimizing one or more training objectives: mathematical functions that determine how model parameters are updated based on the model's predictions on training data. Language models are often trained with a next-token prediction objective, which measures how well the model predicts the next token in a sequence. Image models are commonly trained with contrastive learning or diffusion training objectives. For contrastive learning, images are randomly augmented before being evaluated on the resulting similarity of the model's representations. For diffusion models, images are noised and the model learns to gradually de-noise them via the objective. Multimodal training objectives also exist, with some separating images and text during training, while others examine them concurrently. In general, the training objectives for foundation models promote the learning of broadly useful representations of data.

With the rise of foundation models and the larger datasets that power them, a training objective must be able to parse through internet-scale data for meaningful data points. Additionally, since foundation models are designed to solve a general range of tasks, training objectives ought to be domain-complete, or able to solve a broad set of downstream capabilities within the given domain. Lastly, foundation model training objectives should seek to scale well and be computationally efficient. With model size and compute power both being relevant constraints, a training objective must be able to overcome such bottlenecks.
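To make the next-token prediction objective described above concrete, here is a minimal sketch in PyTorch. The embedding-plus-linear "model" is a deliberately tiny stand-in (a real foundation model would be a large Transformer trained at vastly greater scale), and all sizes are arbitrary toy values:

```python
# Minimal next-token prediction sketch (illustrative only).
import torch
import torch.nn.functional as F

vocab_size, seq_len, d_model = 100, 16, 32

# Toy stand-in for a foundation model: embedding + linear head.
embed = torch.nn.Embedding(vocab_size, d_model)
head = torch.nn.Linear(d_model, vocab_size)

tokens = torch.randint(0, vocab_size, (1, seq_len))  # one toy training sequence
logits = head(embed(tokens))                         # shape: (1, seq_len, vocab_size)

# Shift by one position so the model at position t predicts token t+1;
# cross-entropy over the vocabulary is the standard self-supervised loss.
loss = F.cross_entropy(
    logits[:, :-1, :].reshape(-1, vocab_size),
    tokens[:, 1:].reshape(-1),
)
loss.backward()  # gradients from this loss drive the parameter update
```

The same shift-and-score pattern applies regardless of architecture, which is part of why the objective scales to internet-sized corpora without manual labels.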
Data

Foundation models are trained on a large quantity of data, working under the maxim "the more data, the better." Performance evaluation does show that more data generally leads to better performance, but other issues arise as data quantity grows. Tasks like managing the dataset, integrating data across new applications, ensuring adherence to data licenses, and maintaining data quality all become more difficult as data size grows. The specific demands of foundation models have only exacerbated such issues, as it remains the norm for large foundation models to use public web-scraped data. Foundation models also draw on search engine data and SEO meta tag data. Public web data remains a plentiful resource, but it also demands stringent moderation and data processing from foundation model developers before it can be successfully integrated into the training pipeline.

Training foundation models often runs the risk of violating user privacy, as private data can be disclosed, collected, or used in ways beyond the stated scope. Even if no private data is leaked, models can still inadvertently compromise security through learned behavior in the resulting foundation model. Data quality is another key point, as web-scraped data frequently contains biased, duplicate, and toxic material. Once foundation models are deployed, ensuring high-quality data is still an issue, as undesirable behavior can still emerge from small subsets of data.

Systems

The size of foundation models also brings about issues with the computer systems they run on. The average foundation model is too large to be run within a single accelerator's memory, and the initial training process requires an expensive amount of resources. Such issues are predicted to worsen further as foundation models grow to new heights. Due to this constraint, researchers have begun looking into compressing model size to enable tighter model inference.

GPUs are the most common choice of compute hardware for machine learning, due to their large memory and high compute power. Typical foundation model training requires many GPUs, all connected in parallel with fast interconnects. Acquiring a sufficient number of GPUs of requisite compute efficiency is a challenge for many foundation model developers, one that has led to an increasing dilemma in the field. Larger models require greater compute power, but often at the cost of improved compute efficiency. Since training remains time-consuming and expensive, the tradeoff between compute power and compute efficiency means that only a few select companies can afford the production costs for large, state-of-the-art foundation models. Some techniques like compression and distillation can make inference more affordable, but they fail to completely shore up this weakness.

Scaling

The accuracy and capabilities of foundation models often scale predictably with the size of the model and the amount of the training data. Specifically, scaling laws have been discovered, which are data-based empirical trends that relate resources (data, model size, compute usage) to model capabilities. In particular, a model's scale is defined by compute, dataset size, and the number of parameters, all of which exhibit a power-law relationship with end performance. However, broken scaling laws have been discovered, in which this relationship smoothly transitions (at points referred to as break(s)) from a power law with one exponent to a power law with another (different) exponent.
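Schematically (with generic parameter names, not those of any particular study), a simple scaling law and its broken variant can be written as:

$$
L(C) = a\,C^{-b},
\qquad
L_{\text{broken}}(C) =
\begin{cases}
a_1\,C^{-b_1}, & C \le C_0,\\[2pt]
a_1\,C_0^{\,b_2 - b_1}\,C^{-b_2}, & C > C_0,
\end{cases}
$$

where $L$ is the loss, $C$ is the scale variable (compute, data, or parameters), and $C_0$ is the break point; the prefactor of the second branch is chosen so that the two branches agree at $C = C_0$. Fitting only the first branch and extrapolating past $C_0$ yields predictions that drift from the true curve at a rate set by the exponent gap $b_2 - b_1$.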
When one does not collect any points near (or after) the break(s), it can be difficult to obtain an accurate extrapolation.

Adaptation

Foundation models are inherently multi-purpose: to use these models for a specific use case requires some form of adaptation. At a minimum, models need to be adapted to perform the task of interest (task specification), but often better performance can be achieved by more extensive adaptation to the domain of interest (domain specialization). A variety of methods (e.g., prompting, in-context learning, fine-tuning, LoRA) provide different tradeoffs between the costs of adaptation and the extent to which models are specialized. Some major facets to consider when adapting a foundation model are compute budget and data availability. Foundation models can be very large, up to trillions of parameters in size, so adapting the entirety of a foundation model can be computationally expensive. Therefore, developers sometimes adapt only the last neural layer or only the bias vectors to save time and space. For particularly niche applications, specific data may also not be available to adapt the foundation model sufficiently. In such circumstances, data must be manually labeled, which is costly and can demand expert knowledge.

Evaluation

Evaluation is a key part of developing foundation models. Not only does evaluation allow for tracking the progress of high-performance models, it also creates benchmarks for future model development. Stakeholders rely on evaluations to understand model behaviors and gain insight into their various attributes. Traditionally, foundation models are evaluated relative to each other through standardized task benchmarks like MMLU, MMMU, HumanEval, and GSM8K. Given that foundation models are multi-purpose, meta-benchmarks that aggregate different underlying benchmarks are increasingly developed. Examples include LM-Harness, BIG-Bench, HELM, OpenLLM Leaderboard, DecodingTrust, and HEIM.

Since foundation models' utility depends on their own general capabilities and the performance of fine-tuned applications, evaluation must cover both metrics. Proper evaluation examines both a foundation model's downstream applications in aggregate and the direct properties the foundation model holds. To ensure further equity in evaluation, certain existing evaluation frameworks account for all adaptation resources, which leads to more informed analyses for the benefit of all stakeholders.

Supply chain

Foundation models' general capabilities allow them to fulfill a unique role in the AI ecosystem, fueled by many upstream and downstream technologies. Training a foundation model requires several resources (e.g., data, compute, labor, hardware, code), with foundation models often involving immense amounts of data and compute (also referred to as computational power). Due to foundation models' large development costs and inexpensive adaptation requirements, the AI landscape has shifted to a small subset of AI companies making foundation models for downstream adaptation. Thus, most foundation model companies outsource this step to specialized data providers (e.g., Scale AI, Surge) and compute providers (e.g., Amazon Web Services, Google Cloud, Microsoft Azure). The foundation model developer itself will then take the data and use the supplied compute to actually train the foundation model. After the foundation model is completely built, much of the data and labor requirements abate.
In this development process, hardware and compute are the most necessary, and also the most exclusive, resources. To train larger and more complex AI, a sufficient amount of compute is key. However, compute is consolidated in the hands of a few select entities, on which most foundation model developers depend. As such, the foundation model pipeline is concentrated heavily around these providers. Compute is also costly; in 2023, AI companies spent more than 80% of total capital on compute resources.

Foundation models require a large amount of general data to power their capabilities. Early foundation models scraped subsets of the internet to provide this data. As the size and scope of foundation models grow, larger quantities of internet scraping become necessary, resulting in higher likelihoods of biased or toxic data. This toxic or biased data can disproportionately harm marginalized groups and exacerbate existing prejudices.

To address this issue of low-quality data that arose with unsupervised training, some foundation model developers have turned to manual filtering. This practice, known as data labor, comes with its own host of issues. Such manual data detoxification is often outsourced to reduce labor costs, with some workers making less than $2 per hour.

The foundation model will then be hosted online either via the developer or via an external organization. Once released, other parties can create applications based on the foundation model, whether through fine-tuning or for wholly new purposes. People can then access these applications to serve their various means, allowing one foundation model to power and reach a wide audience.

Release strategies

After a foundation model is built, it can be released in one of many ways. There are many facets to a release: the asset itself, who has access, how access changes over time, and the conditions on use. All these factors contribute to how a foundation model will affect downstream applications. In particular, the two most common forms of foundation model release are through APIs and direct model downloads. When a model is released via an API, users can query the model and receive responses, but cannot directly access the model itself. Comparatively, the model could be directly downloadable for users to access and modify. Both release strategies are often classified as an open release. The exact definition of an open release is disputed, but widely accepted requirements are provided by the Open Source Initiative.

Some open foundation models are: PaLM 2, Llama 2, Granite, and Mistral. While open foundation models can further research and development more easily, they are also more susceptible to misuse. Open foundation models can be downloaded by anyone, and particularly powerful models can be fine-tuned to intentionally or unintentionally cause harm. During a closed release, the foundation model cannot be accessed by the public, but is used internally by an organization. Such releases are considered safer, but offer no additional value to the research community or the public at large. Some foundation models, like Google DeepMind's Flamingo, are fully closed, meaning they are available only to the model developer; others, such as OpenAI's GPT-4, are limited access, available to the public but only as a black box; and still others, such as Meta's Llama 2, are open, with broadly available model weights enabling downstream modification and scrutiny.
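A schematic contrast between the two release modes, sketched in Python: the hosted endpoint URL and JSON payload in (a) are hypothetical placeholders (real providers each define their own API), while (b) uses the actual Hugging Face transformers interface to download openly released weights, here Mistral-7B, one of the open models named above:

```python
# (a) API release: query a hosted model; the weights stay with the provider.
import requests

resp = requests.post(
    "https://api.example-provider.com/v1/generate",  # hypothetical endpoint
    json={"prompt": "Hello", "max_tokens": 16},      # hypothetical payload
)
print(resp.json())

# (b) Open-weights release: download the model and run (or modify) it locally.
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-v0.1")
model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1")
out = model.generate(**tok("Hello", return_tensors="pt"), max_new_tokens=16)
print(tok.decode(out[0]))
```

The practical difference is visible in the code itself: in (a) the provider can filter or revoke access at any time, whereas in (b) the weights, once downloaded, can be fine-tuned or inspected without the provider's involvement.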
References Natural language processing Computational linguistics Computational fields of study Language modeling Unsupervised learning Deep learning
Foundation model
[ "Technology" ]
4,007
[ "Computational fields of study", "Computational linguistics", "Natural language processing", "Computing and society", "Natural language and computing" ]
70,984,388
https://en.wikipedia.org/wiki/Ministry%20of%20Water%20Supply%2C%20Irrigation%20and%20Energy%20%28Koshi%20Province%29
The Ministry of Water Supply, Irrigation and Energy is the governmental body of Koshi Province, Nepal, responsible for managing drinking water, irrigation, electricity and water-induced disasters (floods and landslides) at the provincial level.

Overview

According to the constitution of Nepal, provinces have been established in the Federal Democratic Republic of Nepal as part of its three-tier government structure. Koshi Province is one of the seven provinces of Nepal. It has 14 districts: Bhojpur, Dhankuta, Ilam, Jhapa, Khotang, Morang, Okhaldhunga, Panchthar, Sankhuwasabha, Solukhumbu, Sunsari, Taplejung, Terhathum and Udayapur. There is one metropolitan city and two sub-metropolitan cities, and a total of 137 local levels consisting of 49 municipalities (including the metropolitan and sub-metropolitan cities) and 88 rural municipalities. After Nepal adopted a federal structure with federal, provincial and local levels of government, the three-level elections were completed in 2017 and the provincial government was formed.

The Ministry of Water Supply, Irrigation and Energy was established on 6 February 2022 with the responsibility of carrying out work in the fields of drinking water, energy and irrigation in all fourteen districts of Koshi Province, as specified in the division-of-labor regulations. Through its subordinate offices, the ministry carries out the construction of drinking water schemes (gravity-fed, pumped, and water-quality improvement), drainage projects, irrigation projects, groundwater resource development, river control (embankments) and renewable energy promotion, together with related study, research, plan formulation, approval, operation and maintenance.

List of former ministers

This is a list of all ministers from 2022 to the present.

References

External links
Official Website of Office of the Chief Minister and Council of Ministers of Koshi Province
Official Website of Ministry of Water Supply, Irrigation and Energy of Koshi Province
Government bodies Energy ministries Koshi Province Government agencies established in 2022
Ministry of Water Supply, Irrigation and Energy (Koshi Province)
[ "Engineering" ]
412
[ "Energy organizations", "Energy ministries" ]
70,984,618
https://en.wikipedia.org/wiki/R%20Volantis
R Volantis is a single variable star in the southern circumpolar constellation of Volans. It has an average apparent magnitude of 8.7, making it readily visible in amateur telescopes but not to the naked eye. The object is relatively distant at about 2,300 light-years, but it is drifting closer with a radial velocity of .

R Volantis' peculiarity was first observed in 1954, when it was found to have emission lines in its spectrum. Observations from 1955 to 1967 revealed that the star was a probable Mira variable, and it was given its current designation. However, its nature as a carbon star was not discovered until 1968, by Pik-Sin The. In that paper, the spectra of R Volantis and V1163 Centauri (HD 114586) were studied, revealing that the former is a carbon star while the latter is an S-type star.

R Volantis has a stellar classification of Ce, indicating that it is a carbon star with emission lines. It is a giant star on the asymptotic giant branch, meaning that it generates energy in hydrogen- and helium-burning shells around an inert carbon core. As a result, it has expanded to times the radius of the Sun and now radiates a luminosity of . R Vol has an effective temperature of , giving it a deep red hue. It fluctuates between magnitudes 8.7 and 15.4 with a period of 445 days.

Notes

References
Volans Carbon stars Mira variables Variable stars CD-72 378 Asymptotic-giant-branch stars Emission-line stars Volantis, R
R Volantis
[ "Astronomy" ]
333
[ "Volans", "Constellations" ]
70,984,843
https://en.wikipedia.org/wiki/Rhizorhabdus%20wittichii
The species Rhizorhabdus wittichii, formerly Sphingomonas wittichii, is a Gram-negative, rod-shaped, motile bacterium with an optimum growth temperature of 30 °C. It forms greyish-white colonies, and its DNA G+C content is 67 mol%. The R. wittichii RW1 genome comprises 5,915,246 bp, arranged in a single circular chromosome and two plasmids.

Background

It was first isolated from water of the River Elbe by R.-M. Wittich, after whom the species is named. The species was originally thought to belong to the genus Sphingomonas, despite poor alignment of its 16S rRNA gene with its putative nearest neighbor. It has since been reclassified to Rhizorhabdus as part of a larger re-evaluation of Alphaproteobacteria.

Strain(s)

Its type strain is R. wittichii RW1 DSM 6014T (= JCM 10273T = EY 4224T).

Mechanism and biotechnological applications

R. wittichii RW1 is notable for metabolising dibenzo-p-dioxin and phenazine-1-carboxylic acid. In fact, strain RW1 is one of the very few strains that can grow on dibenzo-p-dioxin (DD). Furthermore, this bacterium also grows on dibenzofuran and 4-chloro-dibenzofuran, using them as sole carbon sources. Such biodegradative capabilities are not unique to this strain: R. wittichii MPO218 degrades ibuprofen, carrying its degradative genes on a large plasmid. Thanks to its wide-ranging metabolic capabilities and its likely propensity to acquire novel degradation genes, in no small part due to its wealth of plasmids, this organism holds high potential for biotechnological applications. The unusual arrangement of its genes involved in dioxin degradation, and the full description of the dioxin degradation pathway, are still under investigation.

References

Further reading

External links
Type strain of Sphingomonas wittichii at BacDive - the Bacterial Diversity Metadatabase
Sphingomonadales Bacteria described in 2001 Environmental microbiology Gram-negative bacteria
Rhizorhabdus wittichii
[ "Chemistry", "Biology", "Environmental_science" ]
518
[ "Prokaryotes", "Biotechnology", "Biodegradation", "Environmental microbiology", "Ecological techniques", "nan", "Bacteria", "Bioremediation", "Environmental soil science", "Microorganisms" ]
70,985,289
https://en.wikipedia.org/wiki/Staircase%20paradox
In mathematical analysis, the staircase paradox is a pathological example showing that limits of curves do not necessarily preserve their length. It consists of a sequence of "staircase" polygonal chains in a unit square, formed from horizontal and vertical line segments of decreasing length, so that these staircases converge uniformly to the diagonal of the square. However, each staircase has length two, while the length of the diagonal is the square root of 2, so the sequence of staircase lengths does not converge to the length of the diagonal. Martin Gardner calls this "an ancient geometrical paradox". It shows that, for curves under uniform convergence, the length of a curve is not a continuous function of the curve.

For any smooth curve, polygonal chains with segment lengths decreasing to zero, connecting consecutive vertices along the curve, always converge in length to the arc length. The failure of the staircase curves to converge to the correct length can be explained by the fact that some of their vertices do not lie on the diagonal. In higher dimensions, the Schwarz lantern provides an analogous example, showing that polyhedral surfaces that converge pointwise to a curved surface do not necessarily converge to its area, even when the vertices all lie on the surface.

As well as highlighting the need for careful definitions of arc length in mathematics education, the paradox has applications in digital geometry, where it motivates methods of estimating the perimeter of pixelated shapes that do not merely sum the lengths of boundaries between pixels.

See also
Coastline paradox, a similar paradox in which straight-segment approximations diverge
Aliasing, a more general phenomenon of inaccuracies caused by pixelation
Cantor staircase, a fractal curve along the diagonal of a unit square
Taxicab geometry, in which the lengths of the staircases and of the diagonal are equal

References

External links
A Short Note: Extending the Staircase Paradox
Length Mathematical paradoxes Limits (mathematics)
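As a worked check of the lengths above: the n-step staircase in the unit square consists of n horizontal segments of length 1/n and n vertical segments of length 1/n, so

$$
\operatorname{length}(S_n) = n \cdot \tfrac{1}{n} + n \cdot \tfrac{1}{n} = 2 \quad\text{for every } n,
\qquad
\lim_{n\to\infty}\operatorname{length}(S_n) = 2 \neq \sqrt{2},
$$

where $\sqrt{2} = \sqrt{1^2 + 1^2}$ is the length of the diagonal, even though the staircases $S_n$ themselves converge uniformly to that diagonal.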
Staircase paradox
[ "Physics", "Mathematics" ]
376
[ "Scalar physical quantities", "Physical quantities", "Distance", "Quantity", "Size", "Mathematical paradoxes", "Length", "Wikipedia categories named after physical quantities", "Mathematical problems" ]
70,985,345
https://en.wikipedia.org/wiki/Mathemalchemy
Mathemalchemy (French: MathémAlchimie) is a traveling art installation dedicated to a celebration of the intersection of art and mathematics. It is a collaborative work led by Duke University mathematician Ingrid Daubechies and fiber artist Dominique Ehrmann. The cross-disciplinary team of 24 people, who collectively built the installation during the calendar years 2020 and 2021, includes artists, mathematicians, and craftspeople who employed a wide variety of materials to illustrate, amuse, and educate the public on the wonders, mystery, and beauty of mathematics. Including the core team of 24, about 70 people contributed in some way to the realization of Mathemalchemy.

Description

The art installation occupies a footprint of approximately , which extends up to in height (in addition, small custom-fabricated tables are arranged around the periphery to protect the more fragile elements). A map shows the 14 or so different zones or regions within the exhibit, which is filled with hundreds of detailed mathematical artifacts, some smaller than ; the entire exhibit comprises more than 1,000 parts which must be packed for shipment. Versions of some of the complex mathematical objects can be purchased through an associated "Mathemalchemy Boutique" website.

The art installation contains puns (such as "Pi" in a bakery) and Easter eggs, such as a miniature model of the Antikythera mechanism hidden on the bottom of "Knotilus Bay". Mathematically sophisticated visitors may enjoy puzzling out and decoding the many mathematical allusions symbolized in the exhibit, while viewers of all levels are invited to enjoy the self-guided tours, detailed explanations, and videos available on the accompanying official website. A downloadable comic book was created to explore some of the themes of the exhibition, using an independent narrative set in the world of Mathemalchemy.

Themes

The installation features or illustrates mathematical concepts at many different levels. All of the participants regard "recreational mathematics" (especially when it has a strong visual component) as having an important role in education and in culture in general. Jessica Sklar maintains that "mathematics is, at heart, a human endeavor" and feels compelled to make it accessible to those who don't regard themselves as "math people". Bronna Butler talks about the heritage of J. H. Conway, whose lectures were "almost magical in quality" because they used what looked like curios and tricks but in the end arrived at answers to "fundamental questions of mathematics". Henry Segerman, who wrote the book Visualizing Mathematics With 3D Printing, contributed 3D pieces that explore stereographic projection and polyhedra. According to Susan Goldstine, "The interplay between mathematics and fiber arts is endlessly fascinating [and] allows for a deeper understanding [of the] ways that these crafts can illuminate complex concepts in mathematics". Edmund Harriss says, "You don't need a background in math to appreciate the installation, just like you can enjoy a concert without being a musician".

The creators had the goal of illustrating as much of mathematics as possible. Thus the various exhibits touch on number theory, fractals, tessellations, probability theory, Zeno's paradoxes, Venn diagrams, knot theory, calculus, chaos theory, topology, hyperbolic geometry, symbolic logic, and much else, all in a setting that is beautiful and fun. Mathematicians explicitly mentioned or alluded to include Vladimir Arnold,
John H. Conway, Felix Klein, Sofya Kovalevskaya, Henri Lebesgue, Ada Lovelace, Benoit Mandelbrot, Maryam Mirzakhani, August Möbius, Emmy Noether, Marjorie Rice, Bernhard Riemann, Caroline Series, Wacław Sierpiński, Alicia Boole Stott, William Thurston, Helge von Koch, Gladys West, Zeno, and many others. Twenty of the "mathemalchemists" are women, and the installation especially celebrates the contributions of women in mathematics, from amateur Marjorie Rice, who found new kinds of pentagon tilings, to Maryam Mirzakhani, the first woman ever to garner a Fields Medal.

History

Daubechies and Ehrmann presented the project in a special session at the 2020 Joint Mathematics Meetings (JMM) in Denver, Colorado. They soon had a core group of more than a dozen interested mathematicians and artists, who in turn suggested other people not at JMM. Eventually the group would grow to 24 people. Originally, the intent was to design and fabricate the installation collectively in a series of workshops to be held at Duke University in Durham, North Carolina, starting in March 2020. The COVID-19 pandemic disrupted these plans. Working instead over Zoom, under the guidance of Dominique Ehrmann and various "team leaders" for different parts of the installation, the team collectively designed and discussed the installation. In July 2021 the team could finally get together at Duke for the first in-person meeting, where the components that had been fabricated in various locations in the US and Canada were assembled for the first time, leading to the first complete full-scale construction.

The 24 members of the team employed ceramics, knitting, crocheting, quilting, beadwork, 3D printing, welding, woodworking, textile embellishment, origami, metal-folding, water-sculpted brick, and temari balls to create the room-sized installation.

Venues

The finished installation was originally displayed at Duke University, then moved to the National Academy of Sciences (NAS) building in Washington, D.C., where it was on display from December 4, 2021, until June 12, 2022. The installation was next shown at Juniata College in Huntingdon, Pennsylvania before moving to Boston University from January to March 2023, partially overlapping with the 2023 Joint Mathematics Meetings in Boston. The exhibit then moved to the Beaty Biodiversity Museum in Vancouver, British Columbia, and in November of that year it went to Northern Kentucky University, where it remained until February 2024. From May 22 to October 27, 2024, Mathemalchemy was at the National Museum of Mathematics (MoMath) in New York City. From November 6, 2024 to May 2, 2025, the University of Quebec in Montreal (UQAM) hosts the exhibition. Fundraising is underway to mount the exhibition at the Navajo Nation Museum in Window Rock, Arizona. The exhibit is planned to ultimately reside in the Duke University mathematics building, on permanent display.

See also
Mathematica: A World of Numbers... and Beyond – a 1961 iconic mathematics exhibition by Ray and Charles Eames
Mathematics and art

References

External links
Mathemalchemy Art Installation on YouTube
Installation art works Recreational mathematics Mathematics organizations Mathematics conferences Mathematics education Mathematics and art Traveling exhibits Mathematics education in the United States Art and design organizations Organizations established in 2020 Artist groups and collectives 2020 establishments
Mathemalchemy
[ "Mathematics", "Engineering" ]
1,380
[ "Recreational mathematics", "Design", "Art and design organizations" ]
70,985,868
https://en.wikipedia.org/wiki/Pacman%20%28security%20vulnerability%29
Pacman is a side-channel vulnerability in certain ARM CPUs that was made public by Massachusetts Institute of Technology security researchers on June 10, 2022. It affects the pointer authentication (PAC) mechanism in many ARMv8.3 chips, including Apple's M1 CPU. Pacman creates an 'oracle' that lets an attacker guess a pointer's PAC signature without crashing the program if the guess is wrong. PAC signatures are typically less than 16 bits wide, so an attacker can use the oracle to guess the signature in 2^16 (65,536) tries or fewer. The vulnerability is unfixable without hardware changes because it is caused by the inherent design of CPU caches and branch predictors.

Impact and response

Pacman alone is not an exploitable vulnerability. PAC is a 'last line of defense' that detects when software running on the CPU is being exploited by a memory corruption attack and reacts by crashing the software before the attacker completes their exploit. Apple stated that it did not believe the vulnerability posed a serious threat to users because it requires specific conditions to be exploited.

Background

Pacman is similar to Spectre, abusing two key CPU optimizations to create a PAC oracle: branch prediction and memory caching.

PAC (Pointer Authentication Codes)

PAC is a security feature in ARMv8.3-based processors that mitigates return-oriented programming by adding a cryptographic signature to the upper bits of pointers. Compilers emit PAC 'sign' instructions before storing pointers to memory, and 'verify' instructions after loading pointers from memory. If an attacker tampers with a pointer, the signature becomes invalid and the program crashes when the pointer is next accessed. PAC signatures are not cryptographically secure, because they need to be small enough to fit into the unused upper bits of pointers. Therefore, if an attacker can reliably test whether a guessed signature is correct without crashing the program, they can brute-force the correct signature.

Branch prediction

Modern CPUs employ branch prediction to reduce the number of pipeline stalls caused by conditional branches. Branch prediction uses heuristics to guess the direction of a conditional branch and begins executing the predicted path while the condition is still being evaluated. Instructions executed during this period are 'speculative', and the CPU holds their results in the re-order buffer (ROB) without writing them back to memory. Once the CPU finishes evaluating the condition and determines that its initial prediction was correct, it 'retires' the instructions in the ROB by writing their changes back to memory and propagating any exceptions produced. If the speculation was incorrect, the CPU flushes the ROB and resumes execution at the correct location.

Memory caching

CPU caches accelerate memory accesses by keeping frequently accessed memory on the CPU die. This lowers the cost of memory accesses from hundreds of cycles to fewer than 10 by reducing the amount of time spent communicating with the physically separate northbridge and RAM chips. When an uncached address is loaded, the CPU immediately stashes the loaded data into the cache, evicting another entry if the cache is full. These changes are not held in the ROB, because the presence or absence of an address in the cache is considered 'unobservable', so stashes and evictions that occur during speculative execution persist after the ROB has been flushed, even if that path was not ultimately taken.
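To make the signing scheme above tangible, here is a toy Python model (illustrative only: real PAC is computed in hardware, typically with the QARMA block cipher and per-context keys, while this sketch uses a truncated keyed hash and treats the pointer as a plain integer; all constants and names here are simplifications):

```python
import hashlib

PAC_BITS = 16        # real signatures are typically <= 16 bits
PAC_SHIFT = 48       # unused upper bits above a 48-bit virtual address
KEY = b"per-process-secret-key"  # hypothetical; never visible to the attacker

def sign(ptr: int) -> int:
    """Embed a truncated keyed hash of the pointer into its upper bits."""
    digest = hashlib.sha256(KEY + ptr.to_bytes(8, "little")).digest()
    pac = int.from_bytes(digest[:2], "little") & ((1 << PAC_BITS) - 1)
    return ptr | (pac << PAC_SHIFT)

def verify(signed_ptr: int) -> int:
    """Strip the signature and re-check it; 'fault' (raise) on mismatch."""
    ptr = signed_ptr & ((1 << PAC_SHIFT) - 1)
    if sign(ptr) != signed_ptr:
        raise RuntimeError("PAC authentication failure")
    return ptr

p = 0x7FFF_DEAD_BEEF             # a toy 48-bit pointer value
assert verify(sign(p)) == p      # a valid signature passes
try:
    verify(sign(p) ^ (1 << 50))  # flipping one signature bit faults
except RuntimeError:
    print("tampered pointer detected")
```

The small signature width is the crux: with at most 2^16 distinct values, an attacker who can test guesses without crashing the victim, which is exactly the oracle Pacman provides, needs at most 65,536 attempts.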
Mechanism

Principle

Pacman tricks the CPU into checking the validity of a guessed PAC signature within a mispredicted branch, so that exceptions produced by potentially incorrect guesses are discarded during the ROB flush. If the guess was incorrect, the exception thrown during speculative execution forces the CPU to stall, preventing further instructions from being speculatively executed. A Pacman gadget is a sequence of instructions of the following form:

    if (condition):
        ptr = verify(attacker_tamperable_pointer)
        load(ptr)

Sequences of this form are common and can be found in most compiled programs supporting PAC. When the CPU mispredicts the condition, it begins speculatively executing the PAC verification instruction. If the attacker's guess was correct, the verification instruction succeeds and the CPU proceeds to load the address from memory; if the guess was incorrect, the verification instruction throws an exception. This exception is held in the ROB and then discarded once the CPU finds the condition to be false. The attacker then uses a hardware side-channel to determine whether the load instruction was executed, and therefore whether their guessed signature was correct.

Attack

Ravichandran et al. demonstrate that the cache-based Prime and Probe technique can be used to determine whether the load instruction executed. The attacker determines whether the load instruction in a Pacman gadget was executed by filling the cache with data, calling the gadget, and checking the latency of accessing the previously loaded addresses. If one of the addresses takes longer than before, it was evicted by the gadget, and the attacker knows that their guess was correct. The attacker may then use this forged pointer elsewhere in the program to hijack it.

1. Train

The attacker calls the Pacman gadget many times with condition = true. The branch predictor is now trained to guess that the condition is true on subsequent calls. During this period, attacker_tamperable_pointer retains its original value with a valid PAC signature.

2. Prime

The attacker fills the L1 cache by loading from addresses they control. The contents of these memory locations do not matter; the attacker just needs to be able to precisely measure their access latency.

3. Evict

The attacker overwrites attacker_tamperable_pointer with their target pointer and a guess for the target pointer's PAC signature. They then call the Pacman gadget with condition = false, causing the branch to be mispredicted. The CPU will speculatively execute the contents of the if statement before eventually flushing the pipeline and rolling back. During this speculative execution, two things can occur:
The speculative execution proceeds to the load() instruction. This means that the verify() instruction did not fault, implying the guessed signature was correct. The load() instruction will then load the target pointer into the cache, evicting an address in the attacker's eviction set.
Speculative execution faults on the verify() instruction, preventing execution of the load(). This implies the guessed signature was wrong. Since this was speculatively executed within a mispredicted branch, the fault is not propagated to the program.

4. Probe

The attacker measures the access time for each element in their eviction set. If one of the elements was evicted (i.e., the access is slow), then the guess was correct. If none of the elements were evicted (i.e., all accesses are fast), then the guess was wrong.
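Abstracting the four steps above into code, the sketch below shows why a crash-free oracle reduces PAC to a short brute-force search. The oracle here is simulated with a hidden comparison; a real implementation would replace it with the train/prime/evict/probe timing measurement, which is hardware-specific:

```python
# Conceptual model of the Pacman brute force (no real speculation or
# cache timing is performed; this only illustrates the search space).
import secrets

PAC_BITS = 16
_SECRET_PAC = secrets.randbelow(1 << PAC_BITS)  # signature the attacker seeks

def pac_oracle(guessed_pac: int) -> bool:
    """Simulated stand-in for one train/prime/evict/probe round.
    Crucially, a wrong guess never crashes the victim, so the attacker
    can query as many times as needed."""
    return guessed_pac == _SECRET_PAC

def forge_signature() -> int:
    for guess in range(1 << PAC_BITS):  # at most 65,536 oracle queries
        if pac_oracle(guess):
            return guess
    raise RuntimeError("no signature matched")

print(hex(forge_signature()))
```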
This process can be repeated with different guesses until the correct signature is found.

Notes

References

See also
Side-channel attack

External links
Transient execution CPU vulnerabilities 2022 in computing ARM architecture
Pacman (security vulnerability)
[ "Technology" ]
1,424
[ "Transient execution CPU vulnerabilities", "Computer security exploits" ]
70,988,025
https://en.wikipedia.org/wiki/Belvarafenib
Belvarafenib (developed by Hanmi Pharmaceuticals and Genentech) is a small-molecule RAF dimer (type II) inhibitor which shows anti-tumor clinical activity in cancer patients with BRAFV600E and NRAS mutations.

References
Antineoplastic and immunomodulating drugs Isoquinolines Carboxamides Chloroarenes Fluoroarenes Amines Anilines Pyrimidines Thienopyrimidines
Belvarafenib
[ "Chemistry" ]
99
[ "Amines", "Bases (chemistry)", "Functional groups" ]
70,988,103
https://en.wikipedia.org/wiki/Pronova%20BioPharma
Pronova BioPharma is a Norwegian company and a bulk manufacturer of omega-3 products, with a manufacturing plant in Kalundborg, Denmark. It was acquired by BASF in 2013.

Pronova BioPharma ASA had its roots in Norway's codfish liver oil industry. The company was founded in 1991 as a spinout from the JC Martens company, which in turn was founded in 1838 in Bergen, Norway. Pronova developed the concentrated omega-3-acid ethyl esters formulation that is the active pharmaceutical ingredient of Lovaza.

Lovaza

Pronova won approvals to market the drug, called Omacor in Europe (and initially in the US), in several European countries in 2001 after conducting a three-and-a-half-year trial in 11,000 subjects; the company partnered with other companies like Pierre Fabre in France. In 2004, Pronova licensed the US and Puerto Rican rights to Reliant Therapeutics, whose business model was the in-licensing of cardiovascular drugs. In that same year, Reliant and Pronova won FDA approval for the drug, and it was launched in the US and Europe in 2005. Global sales in 2005 were $144M, and by 2008 they were $778M. By 17 November 2010, the constituent companies of the OSEAX included Pronova BioPharma.

In 2009, the generic companies Teva Pharmaceuticals and Par Pharmaceutical made clear their intentions to file Abbreviated New Drug Applications ("ANDAs") to bring generics to market, and in April 2009 Pronova sued them for infringement of the key US patents covering Lovaza, US 5,656,667 (due to expire in April 2017) and US 5,502,077 (due to expire in March 2013). In May 2012, a district court ruled in Pronova's favor, saying that the patents were valid. The generic companies appealed, and in September 2013 the Federal Circuit reversed, because more than one year before Pronova's predecessor company applied for a patent, it had sent samples of the fish oil used in Lovaza to a researcher for testing; this event constituted "public use" that invalidated the patent in question. Generic versions of Lovaza were introduced in the United States in April 2014.

Pronova has continued to manufacture the ingredients in Lovaza, and in 2012 BASF announced it would acquire Pronova for $844 million. The deal closed in 2013.

Research

Pronova BioPharma is a commercial partner in MabCent-SFI.

Brand names

Lovaza (US)/Omacor (Europe), sold by Woodward Pharma Services, LLC in the US; created and manufactured by Pronova. It was approved in the United States in 2004.

References

External links
https://www.epax.com/why-epax/about-us/
Chemical companies Chemical companies established in 1991
Pronova BioPharma
[ "Chemistry" ]
608
[ "Chemical companies" ]
70,989,491
https://en.wikipedia.org/wiki/Allan%20Warner%20%28physician%29
Allan Warner ( – 24 May 1952) was a British physician who served as Leicester's chief medical officer of health. His photographs of various stages of smallpox, taken at the Leicester smallpox isolation hospital, appeared in An Atlas of Illustrations of Clinical Medicine, Surgery and Pathology in 1901. Later, he held administrative roles at the Western Park Open Air School and acted as the medical advisor to Leicester's Mental Deficiency committee.

Early life and family

Allan Warner was born in Finchley, Middlesex, around 1871. He married Elizabeth Maud, eight years his junior, around 1907. By 1911, they had two children, John and Mary.

Photographs

Warner's photographs of various stages of smallpox, taken at the Leicester smallpox isolation hospital, appeared in An Atlas of Illustrations of Clinical Medicine, Surgery and Pathology (1901). At the time, he was Leicester's assistant medical officer of health.

Western Park Open Air School

In 1931, as Leicester's chief school medical officer of health, Warner described the Western Park Open Air School in the committee minutes as having a "healthy environment" that could put right a "child's nervous activity which has degenerated owing to disuse". Two years earlier he had maintained that "many health movements" had produced a "health conscience", which in turn expanded the Leicester School Medical Service. He described the aim of the school as to "so train the children that they would eventually become hardy men and women", something he felt was important for a good citizen in the interwar years. In contrast were the "overcrowded sunless rooms" of the schools in the city centre, with the "stagnant humid atmosphere of the overcrowded house". This, he felt, resulted in children who were "incapable of strenuous muscular action and over sensitive to pain". In one later report, Warner quoted George Newman: "the existence and strength of the nation ultimately depends upon the survival of its children and their physical and mental health".

School Medical Service

He wrote of the rising costs of the School Medical Service, and argued that "parents with tuberculosis should be prevented from having more children". He acted as the medical advisor to Leicester's Mental Deficiency committee, and calculated that Leicester had "60 lower grade children and 300 adult defectives". He "suggested that idiots and imbeciles should be put in an institution, low-grade children could be left with the parents, and the feeble minded segregated so that they could not reproduce".

Death

Warner died on 24 May 1952 at the Regent Road Hospital in Leicester, at the age of 81. He was survived by his wife, Elizabeth Maud.

References
Date of birth unknown 1952 deaths Vaccination advocates 19th-century British medical doctors 20th-century British medical doctors British eugenicists 1870s births People from Finchley
Allan Warner (physician)
[ "Biology" ]
567
[ "Vaccination", "Vaccination advocates" ]
70,989,793
https://en.wikipedia.org/wiki/Cercidospora%20epipolytropa
Cercidospora epipolytropa is a species of lichenicolous fungus in the genus Cercidospora; it has not been assigned to a family. It is known to parasitise the crustose lichen Lecanora polytropa. The fungus was first formally described by the mycologist William Mudd in 1861. Ferdinand Christian Gustav Arnold transferred it to Cercidospora in 1874.

References
Dothideomycetes Fungi described in 1861 Lichenicolous fungi Fungus species
Cercidospora epipolytropa
[ "Biology" ]
107
[ "Fungi", "Fungus species" ]
70,989,810
https://en.wikipedia.org/wiki/Cercidospora%20thamnoliicola
Cercidospora thamnoliicola is a species of lichenicolous fungus in the genus Cercidospora; it has not been assigned to a family. It is known to parasitise the lichen Thamnolia vermicularis in Iceland, where it is rare. The species was first formally described by the mycologist Per G. Ihlen in 1995, from specimens growing on Thamnolia vermicularis in Norway.

References
Dothideomycetes Fungi described in 1995 Fungi of Europe Fungi of Iceland Lichenicolous fungi Fungus species
Cercidospora thamnoliicola
[ "Biology" ]
123
[ "Fungi", "Fungus species" ]