Dataset schema: id (int64, ranging 39 to 79M); url (string, lengths 31 to 227); text (string, lengths 6 to 334k); source (string, lengths 1 to 150); categories (list, lengths 1 to 6); token_count (int64, ranging 3 to 71.8k); subcategories (list, lengths 0 to 30).
61,154,967
https://en.wikipedia.org/wiki/C2532H3854N672O711S16
The molecular formula C2532H3854N672O711S16 (molar mass: 55597.4 g/mol) may refer to: Alglucerase Imiglucerase
C2532H3854N672O711S16
[ "Chemistry" ]
71
[ "Isomerism", "Set index articles on molecular formulas" ]
61,155,401
https://en.wikipedia.org/wiki/Lecanoric%20acid
Lecanoric acid is a chemical produced by several species of lichen. Lecanoric acid is classified as a polyphenol and a didepside, and it functions as an antioxidant. It is an ester of orsellinic acid with itself. The acid is named after the lichen Lecanora, in which it was discovered. The acid has also been isolated from Usnea subvacata, Parmotrema stuppuem, Parmotrema tinctorum, Parmotrema grayana, Xanthoparmelia arida and Xanthoparmelia lecanorica. A related compound, 5-chlorolecanoric acid, is found in some species of Punctelia. References Polyphenols Benzoic acids Benzoate esters Lichen products
Lecanoric acid
[ "Chemistry" ]
170
[ "Natural products", "Lichen products" ]
61,155,496
https://en.wikipedia.org/wiki/Umbilicaric%20acid
Umbilicaric acid is an organic polyphenolic carboxylic acid made by several species of lichen. It is named after Umbilicaria. Umbilicaric acid is a tridepside, containing three phenol rings in orsellinic acid moieties. Identification of umbilicaric acid can be important in the identification of lichen species. See also Gyrophoric acid References Polyphenols Benzoic acids Benzoate esters Lichen products
Umbilicaric acid
[ "Chemistry" ]
106
[ "Natural products", "Lichen products" ]
61,156,704
https://en.wikipedia.org/wiki/C17H26N4O
The molecular formula C17H26N4O (molar mass: 302.41 g/mol, exact mass: 302.2107 u) may refer to: Alniditan Emedastine Molecular formulas
C17H26N4O
[ "Physics", "Chemistry" ]
63
[ "Molecules", "Set index articles on molecular formulas", "Isomerism", "Molecular formulas", "Matter" ]
61,157,251
https://en.wikipedia.org/wiki/MUBII-TB-DB
MUBII-TB-DB is a database that focuses on tuberculosis antibiotic resistance genes. It is a highly structured, text-based database focusing on Mycobacterium tuberculosis at seven different mutation loci: rpoB, pncA, katG, mabA(fabG1)-inhA, gyrA, gyrB, and rrs. MUBII analyzes each query using two parallel strategies: (1) a BLAST search against previously described mutated sequences, and (2) alignment of the query sequence with the wild-type sequence. MUBII outputs graphs of the alignments together with a description of the mutation and its therapeutic significance. Therapeutically relevant mutations are tagged as "High-Confident" based on the criteria set by Sandgren et al. MUBII-TB-DB provides a platform that is easy to use even for users who are not trained in bioinformatics. See also Antimicrobial Resistance databases References Tuberculosis Antimicrobial resistance organizations Biological databases
MUBII-TB-DB
[ "Biology" ]
202
[ "Bioinformatics", "Biological databases" ]
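The second of MUBII's two strategies, aligning a query against the wild-type sequence and describing the differences, can be sketched in a few lines. This is an illustrative assumption, not MUBII's actual implementation: the function name, the toy sequences, and the simple nucleotide mutation notation (e.g. "T4A" for wild-type T replaced by A at position 4) are all invented, and a real tool would first compute a proper gapped alignment.

```python
# Hypothetical sketch of mutation detection by comparison with a
# wild-type reference. Assumes an ungapped, equal-length alignment,
# which a real pipeline would produce with an alignment algorithm.

def find_point_mutations(wild_type: str, query: str) -> list[str]:
    """Return point mutations in '<wt><1-based position><query>' notation."""
    if len(wild_type) != len(query):
        raise ValueError("sketch assumes an ungapped alignment of equal length")
    return [
        f"{wt}{pos + 1}{qt}"
        for pos, (wt, qt) in enumerate(zip(wild_type, query))
        if wt != qt
    ]

# Toy sequence fragment with one substitution at position 4.
print(find_point_mutations("ACGTACGT", "ACGAACGT"))  # ['T4A']
```

A real resistance database would then map each reported position onto the locus coordinates (rpoB, pncA, etc.) and look up its therapeutic significance.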
61,157,820
https://en.wikipedia.org/wiki/MvirDB
In molecular biology, MvirDB was a publicly available database that stored information on toxins, virulence factors and antibiotic resistance genes. Sources that this database used for DNA and protein information included Tox-Prot, SCORPION, the PRINTS Virulence Factors, VFDB, TVFac, Islander, ARGO and VIDA. The database provided a BLAST tool that allowed the user to query their sequence against all DNA and protein sequences in MvirDB. Information on virulence factors could be obtained using the provided browser tool, which returned results as a readable table organized by ascending E-values, each of which was hyperlinked to its related page. MvirDB was implemented in an Oracle 10g relational database. MvirDB appears to have been inactive for some time, and is therefore not current. The last available snapshot was made on August 2, 2017. See also Antimicrobial Resistance databases References Antimicrobial resistance organizations Biological databases
MvirDB
[ "Biology" ]
214
[ "Bioinformatics", "Biological databases" ]
61,158,238
https://en.wikipedia.org/wiki/Vector%20Packet%20Processing
The Vector Packet Processing (VPP) platform is an extensible, open-source framework that offers the functionality of network switches and routers. Vector processing means processing multiple packets at a time with low latency, in contrast to the scalar approach of processing one packet at a time with higher latency, which VPP aims to make obsolete. This open-source, Linux Foundation-backed framework is part of the Fast Data Project (FD.io). VPP uses the Data Plane Development Kit device drivers and libraries for many of its layer 1 functions; however, this functionality is separated into an optional plug-in for VPP. Technology To improve the scalability of networks, VPP reads the largest available vector of packets from the network I/O layer. Instead of processing each packet individually through an entire graph of several nodes, VPP pushes the entire vector of packets through one graph node before moving on to the next node. The instruction cache adapts to the process, and the remaining packets are processed even faster, thanks to the instructions already loaded while handling the first packets of the vector. External links FD.io Project Pages VPP User Documentation FD.io Wiki on VPP References Free routing software Networking hardware
Vector Packet Processing
[ "Engineering" ]
257
[ "Computer networks engineering", "Networking hardware" ]
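The vector-versus-scalar idea above can be illustrated with a toy processing graph. This is only a sketch of the concept: the node names, packet fields, and two-node graph are invented for illustration and bear no relation to VPP's actual C implementation.

```python
# Toy illustration of vector packet processing: each graph node runs
# over the whole batch before the batch moves to the next node, so the
# node's code stays hot in the instruction cache across many packets.

def ethernet_input(packets):
    # Parse L2 headers for the whole batch at once (here just a flag).
    return [p | {"l2_done": True} for p in packets]

def ip4_lookup(packets):
    # Toy route lookup: map the destination onto one of four next hops.
    return [p | {"next_hop": p["dst"] % 4} for p in packets]

GRAPH = [ethernet_input, ip4_lookup]

def process_vector(packets):
    # Scalar processing would run each packet through the whole graph
    # individually; vector processing runs each node over the whole batch.
    for node in GRAPH:
        packets = node(packets)
    return packets

print(process_vector([{"dst": 10}, {"dst": 7}]))
```

In real VPP the same structure applies, but nodes are C functions operating on vectors of packet buffers, and the graph is extensible via plug-ins.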
61,158,619
https://en.wikipedia.org/wiki/C3H5Br
The molecular formula C3H5Br may refer to: Allyl bromide Bromocyclopropane
C3H5Br
[ "Chemistry" ]
38
[ "Isomerism", "Set index articles on molecular formulas" ]
61,158,940
https://en.wikipedia.org/wiki/XCMS%20Online
XCMS Online is a cloud version of the original eXtensible Computational Mass Spectrometry (XCMS) technology (a bioinformatics software package designed for statistical analysis of mass spectrometry data), created by the Siuzdak Lab at Scripps Research. XCMS introduced the concept of nonlinear retention time alignment, which allowed for the statistical assessment of the detected peaks across LC-MS and GC-MS datasets. XCMS Online was designed to facilitate XCMS analyses through a cloud portal and to offer a more straightforward (non-command-driven) way to analyze, visualize and share untargeted metabolomic data. Further to this, the combination of XCMS and METLIN allows for the identification of known molecules using METLIN's tandem mass spectrometry data, and enables the identification of unknown (uncharacterized) molecules via similarity searching of tandem mass spectrometry data. XCMS Online has also become a systems biology tool for integrating different omic data sets. As of January 2021, the XCMS Online - METLIN platform has over 44,000 registered users. XCMS - METLIN was recognized in 2023 as the year's top analytical innovation. XCMS Online works by comparing groups of raw or preprocessed metabolomic data to discover metabolites, using methods such as nonlinear retention time alignment and feature detection and matching. Once analysis is complete, the data can be viewed in several different ways, including bubble plots, heat maps, chromatograms, and box plots. In addition, XCMS Online is integrated with METLIN, a large metabolite database. The following file formats are supported for direct upload to the site. History In 2005, the Siuzdak Lab created an open-source tool named XCMS in the programming language R. Noticing the need for a more accessible, graphical data processing tool, they created the cloud-based XCMS Online in 2012. The ability for users to stream data directly from instruments while data are being acquired was added in 2014. 
Also in that year a commercial version named XCMS Plus (owned by Mass Consortium Corporation) was released and, in 2015, SCIEX became a reseller. In 2017 it was shown that XCMS Online could be used in a systems biology workflow. One year later, in the absence of a publicly available alternative, a version of XCMS Online was released with the ability to perform multiple reaction monitoring (MRM). References External links XCMS Online Bioinformatics software Mass spectrometry software
XCMS Online
[ "Physics", "Chemistry", "Biology" ]
532
[ "Spectrum (physical sciences)", "Chemistry software", "Bioinformatics software", "Bioinformatics", "Mass spectrometry software", "Mass spectrometry" ]
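One ingredient of the comparative analysis described above, matching detected features across samples, can be sketched naively as pairing features whose mass-to-charge ratio (m/z) and retention time agree within tolerances. XCMS's real algorithms (nonlinear retention time correction, density-based grouping) are far more sophisticated; the function name, tolerances, and feature values below are invented for illustration.

```python
# Naive sketch of cross-sample feature matching. Each feature is an
# (m/z, retention time in seconds) pair; two features match when both
# values agree within the given tolerances.

def match_features(sample_a, sample_b, mz_tol=0.01, rt_tol=5.0):
    """Pair (mz, rt) features from two samples that agree within tolerance."""
    matches = []
    for mz_a, rt_a in sample_a:
        for mz_b, rt_b in sample_b:
            if abs(mz_a - mz_b) <= mz_tol and abs(rt_a - rt_b) <= rt_tol:
                matches.append(((mz_a, rt_a), (mz_b, rt_b)))
    return matches

a = [(180.063, 120.0), (255.232, 310.5)]
b = [(180.065, 123.2), (400.100, 600.0)]
print(match_features(a, b))  # [((180.063, 120.0), (180.065, 123.2))]
```

This is why retention time alignment matters: without first correcting nonlinear retention time drift between runs, a fixed rt tolerance would miss genuine matches.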
61,159,333
https://en.wikipedia.org/wiki/Warming%20stripes
Warming stripes (sometimes referred to as climate stripes, climate timelines or stripe graphics) are data visualization graphics that use a series of chronologically ordered coloured stripes to visually portray long-term temperature trends. Warming stripes reflect a "minimalist" style, conceived to use colour alone, free of technical distractions, to intuitively convey global warming trends to non-scientists. The initial concept of visualizing historical temperature data has been extended to involve animation, to visualize sea level rise and predictive climate data, and to visually juxtapose temperature trends with other data such as atmospheric CO2 concentration, global glacier retreat, precipitation, progression of ocean depths, aviation emissions' percentage contribution to global warming, biodiversity loss, soil moisture deviations, and fine particulate matter concentrations. In less technical contexts, the graphics have been embraced by climate activists, used as cover images of books and magazines, used in fashion design, projected onto natural landmarks, and used on athletic team uniforms, music festival stages, and public infrastructure. Background, publication and content In May 2016, to make visualizing climate change easier for the general public, University of Reading climate scientist Ed Hawkins created an animated spiral graphic of global temperature change as a function of time, a representation said to have gone viral. Jason Samenow wrote in The Washington Post that the spiral graph was "the most compelling global warming visualization ever made", before it was featured in the opening ceremony of the 2016 Summer Olympics. Separately, by 10 June 2017, Ellie Highwood, also a climate scientist at the University of Reading, had completed a crocheted "global warming blanket" that was inspired by "temperature blankets" representing temperature trends in respective localities. 
Hawkins provided Highwood with a more user-friendly colour scale to avoid the muted colour differences present in Highwood's blanket. Independently, in November 2015, University of Georgia estuarine scientist Joan Sheldon made a "globally warm scarf" having 400 blue, red and purple rows, but could not contact Hawkins until 2022. Both Highwood and Sheldon credit "sky blankets" and "sky scarves", which are based on daily sky colours, as their original inspirations. On 22 May 2018, Hawkins published graphics constituting a chronologically ordered series of blue and red vertical stripes that he called warming stripes. Hawkins, a lead author for the IPCC 6th Assessment Report, received the Royal Society's 2018 Kavli Medal, in part "for actively communicating climate science and its various implications with broad audiences". As described in a BBC article, in the month the big meteorological agencies release their annual climate assessments, Hawkins experimented with different ways of rendering the global data and "chanced upon the coloured stripes idea". When he tried out a banner at the Hay Festival, according to the article, Hawkins "knew he'd struck a chord". The National Centre for Atmospheric Science (UK), with which Hawkins is affiliated, states that the stripes "paint a picture of our changing climate in a compelling way. Hawkins swapped out numerical data points for colours which we intuitively react to". Others have called Hawkins' warming stripes "climate stripes" or "climate timelines". Warming stripe graphics are reminiscent of colour field painting, a style prominent in the mid-20th century, which strips out all distractions and uses only colour to convey meaning. Colour field pioneer artist Barnett Newman said he was "creating images whose reality is self-evident", an ethos that Hawkins is said to have applied to the problem of climate change. 
Collaborating with Berkeley Earth scientist Robert Rohde, on 17 June 2019 Hawkins published for public use a large set of warming stripes on ShowYourStripes.info. Individualized warming stripe graphics were published for the globe, for most countries, and for certain smaller regions such as states in the US or parts of the UK, since different parts of the world are warming more quickly than others. Data sources and data visualization Warming stripe graphics are defined by various parameters, including: the source of the dataset (meteorological organization); the geographical scope of measurement (global, country, state, etc.); the time period (year range, for the horizontal "axis"); the temperature range (range of anomaly, or deviation, about a reference or baseline temperature); the colour palette (usually shades of blue and red); the colour scale (assignment of colours to represent respective ranges of temperature anomaly); and the temperature boundary (the temperature above which a stripe is red and below which it is blue, usually determined by an average annual temperature over a "reference period" or "baseline" of usually 30 years). Hawkins' original graphics use the eight most saturated blues and reds from the ColorBrewer 9-class single-hue palettes, which optimize colour palettes for maps and are noted for their colourblind-friendliness. Hawkins said the specific colour choice was an aesthetic decision ("I think they look just right"), also selecting baseline periods to ensure equally dark shades of blue and red for aesthetic balance. Hawkins chose the 1971–2000 average as the boundary between reds and blues because the average global temperature in that reference period represented the mid-point of the warming to date. A Republik analysis said that "this graphic explains everything in the blink of an eye", attributing its effect mainly to the chosen colors, which "have a magical effect on our brain, (letting) us recognize connections before we have even actively thought about them". 
The analysis concluded that colors other than blue and red "don't convey the same urgency as (Hawkins') original graphic, in which the colors were used in the classic way: blue=cold, red=warm." ShowYourStripes.info cites the dataset sources Berkeley Earth, NOAA, UK Met Office, MeteoSwiss and DWD (Germany), specifically explaining that the data for most countries come from the Berkeley Earth temperature dataset, while for the US, UK, Switzerland and Germany the data come from the respective national meteorological agencies. For each country-level #ShowYourStripes graphic (Hawkins, June 2019), the average temperature in the 1971–2000 reference period is set as the boundary between blue (cooler) and red (warmer) colours, with the colour scale spanning +/- 2.6 standard deviations of the annual average temperatures between 1901 and 2000. Hawkins noted in 2019 that the graphic for the Arctic "broke the colour scale" since it is warming more than twice as fast as the global average, and reported that the 2023 global average was so extreme that a new, darker shade of red was required. For statistical and geographic reasons, it is expected that graphics for small areas will show more year-to-year variation than those for large regions. Year-to-year changes reflected in graphics for localities result from weather variability, whereas global warming over centuries reflects climate change. The NOAA website warns that the graphics "shouldn't be used to compare the rate of change at one location to another", explaining that "the highest and lowest values on the colour scale may be different at different locations". Further, a certain colour in one graphic will not necessarily correspond to the same temperature in other graphics. 
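The colour-scale mechanics described above (a baseline boundary between reds and blues, eight shades per colour, and anomalies clamped at +/- 2.6 standard deviations) can be sketched as follows. The function name, its simple clamping behaviour, and the example anomaly values are illustrative assumptions, not Hawkins' actual code.

```python
import math

def stripe_colour(anomaly, sd, k=2.6, shades=8):
    """Map one year's temperature anomaly (relative to the baseline
    average) to a colour family and shade index, 1 (palest) to
    `shades` (darkest). `sd` is the standard deviation of annual
    average temperatures over the scaling period."""
    z = max(-k, min(k, anomaly / sd))          # clamp to +/- k std devs
    shade = max(1, math.ceil(abs(z) / k * shades))
    return ("red" if z > 0 else "blue", shade)

# A year 2.6 standard deviations above the baseline gets the darkest red;
# one 1.3 standard deviations below gets a mid-range blue.
print(stripe_colour(2.6, 1.0))   # ('red', 8)
print(stripe_colour(-1.3, 1.0))  # ('blue', 4)
```

The clamping is what the Arctic graphic "broke": anomalies beyond the +/- 2.6 standard deviation range all collapse onto the darkest available shade.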
A climate change denier generated a warming stripes graphic that misleadingly affixed Northern Hemisphere readings over one period to global readings over another period, and omitted readings for the most recent thirteen years, with some of the data being 29-year-smoothed—to give the false impression that recent warming is routine. Calling the graphic "imposter warming stripes", meteorologist Jeff Berardelli described it in January 2020 as "a mishmash of data riddled with gaps and inconsistencies" with an apparent objective to confuse the public. Applications and influence After Hawkins' first publication of warming stripe graphics in May 2018, broadcast meteorologists in multiple countries began to show stripe-decorated neckties, necklaces, pins and coffee mugs on-air, reflecting a growing acceptance of climate science among meteorologists and a willingness to communicate it to audiences. In 2019, the United States House Select Committee on the Climate Crisis used warming stripes in its committee logo, showing horizontally oriented stripes behind a silhouette of the United States Capitol, and three US Senators wore warming stripe lapel pins at the 2020 State of the Union Address. On 17 June 2019, Hawkins initiated a social media campaign with hashtag #ShowYourStripes that encourages people to download their regions' graphics from ShowYourStripes.info, and to post them. The campaign was backed by U.N. Climate Change, the World Meteorological Organization and the Intergovernmental Panel on Climate Change. Called "a new symbol for the climate emergency" by French magazine L'EDN, the graphics have been embraced by climate activists, used as cover images of books and magazines, used in fashion design, projected onto natural landmarks, and used on athletic team uniforms, music festival stages, and public infrastructure. 
More specifically, warming stripes have been applied to knit-it-yourself scarves, a vase, neckties, cufflinks, bath towels, vehicles, and a music festival stage, as well as on the side of Freiburg, Germany, streetcars, as municipal murals in Córdoba, Spain, Anchorage, Alaska, and Jersey, on face masks during the COVID-19 pandemic, in an action logo of the German soccer club 1. FSV Mainz 05, on the side of the Climate Change Observatory in Valencia, on the side of a power station turbine house in Reading, Berkshire, on tech-themed shirts, on designer dresses, on the uniforms of Reading Football Club, on Leipzig's Sachsen Bridge, on a biomethane-powered bus, as a stage backdrop at the 2022 Glastonbury Festival, on the racer uniforms, socks and webpage banner of the Climate Classic bicycle race, on the World Bank's Climate Explainer Series, projected onto the White Cliffs of Dover, on an Envision Racing electric race car, and on numerous bridges and towers noted by Climate Central. Remarking that "infiltrating popular culture is a means of triggering a change of attitude that will lead to mass action", Hawkins surmised that making the graphics available for free has led to their wider use. Hawkins further said that any merchandise-related profits are donated to charity. Through a campaign led by the nonprofit Climate Central using hashtag #MetsUnite, more than 100 TV meteorologists (the scientists that laypeople interact with more than any others) featured warming stripes and used the graphics to focus audience attention during broadcasts on summer solstices, beginning in 2018 with the "Stripes for the Solstice" effort. On 24 June 2019, Hawkins tweeted that nearly a million stripe graphics had been downloaded by visitors from more than 180 countries in the course of their first week. In 2018, the German Weather Service's meteorological training journal Promet showed a warming stripes graphic on the cover of the issue titled "Climate Communication". 
By September 2019, the Met Office, the UK's national weather service, was using both a climate spiral and a warming stripe graphic on its "What is climate change?" webpage. Concurrently, the cover of the 21–27 September 2019 issue of The Economist, dedicated to "The climate issue," showed a warming stripe graphic, as did the cover of The Guardian on the morning of the 20 September 2019 climate strikes. The environmental initiative Scientists for Future (2019) included warming stripes in its logo. The Science Information Service (Germany) noted in December 2019 that warming stripes were a "frequently used motif" in demonstrations by the School strike for the climate and Scientists for Future, and were also on the roof of the German Maritime Museum in Bremerhaven. Also in December 2019, Voilà Information Design said that warming stripes "have replaced the polar bear on a melting iceberg as the icon of the climate crisis". On 18 January 2020, a 20-metre-wide artistic light-show installation of warming stripes was opened at the Gendarmenmarkt in Berlin, with the Berlin-Brandenburg Academy of Sciences building being illuminated in the same way. The cover of the "Climate Issue" (fall 2020) of the Space Science and Engineering Center's Through the Atmosphere journal was a warming stripes graphic, and in June 2021 the WMO used warming stripes to "show climate change is here and now" in its statement that "2021 is a make-or-break year for climate action". The November 2021 UN Climate Change Conference (COP26) exhibited an immersive "climate canopy" sculpture consisting of hanging, blue and red color-coded, vertical lighted bars with fabric fringes. 
On 27 September 2019, the Fachhochschule (University of Applied Sciences) Potsdam announced that warming stripes graphics had won in the science category of an international competition recognising innovative and understandable visualisations of climate change, the jury stating that the graphics make an "impact through their innovative, minimalist design". Hawkins was appointed Member of the Order of the British Empire (MBE) in the 2020 New Year Honours "For services to Climate Science and to Science Communication". In April 2022, textiles from haute couture fashion designer Lucy Tammam with warming stripes won the Best Customer Engagement Campaign title in the Sustainable Fashion 2022 awards by Drapers fashion magazine. In October 2022, the front cover of Greta Thunberg's The Climate Book featured warming stripes. In May 2024, Hawkins received the Royal Geographical Society's Geographical Engagement Award for his work in developing warming stripes. Extensions of warming stripes In 2018, University of Reading post-doctoral research assistant Emanuele Bevacqua juxtaposed vertical-stripe graphics for CO2 concentration and for average global temperature (August), and "circular warming stripes" depicting average global temperature with concentric coloured rings (November). In March 2019, German engineer Alexander Radtke extended Hawkins' historical graphics to show predictions of future warming through the year 2200, a graphic that one commentator described as making the future "a lot more visceral". Radtke bifurcated the graphic to show diverging predictions for different degrees of human action in reducing greenhouse gas emissions. On or before 30 May 2019, UK-based software engineer Kevin Pluck designed animated warming stripes that portray the unfolding of the temperature increase, allowing viewers to experience the change from an earlier stable climate to recent rapid warming. 
By June 2019, Hawkins vertically stacked hundreds of warming stripe graphics from corresponding world locations and grouped them by continent to form a comprehensive, composite graphic, "Temperature Changes Around the World (1901–2018)". On 1 July 2019, Durham University geography research fellow Richard Selwyn Jones published a Global Glacier Change graphic, modeled after and credited as being inspired by Hawkins' #ShowYourStripes graphics, allowing global warming and global glacier retreat to be visually juxtaposed. Jones followed on 8 July 2019 with a stripe graphic portraying global sea level change using only shades of blue. Separately, NOAA displayed a graphic juxtaposing annual temperatures and precipitation, researchers from the Netherlands used stripe graphics to represent the progression of ocean depths, and the Institute of Physics applied the graphic to represent aviation emissions' percentage contribution to global warming. In 2023, University of Derby professor Miles Richardson created sequenced stripes to illustrate biodiversity loss, and the German Meteorological Service represented soil moisture deviations using sequenced green and brown stripes. In August 2024, the website airqualitystripes.info published shareable "air quality stripes" graphics for world cities, using blue, yellow, orange, red and black stripes to represent fine particulate matter (PM2.5) concentrations over time. Critical response Some warned that warming stripes of individual countries or states, taken out of context, could advance the idea that global temperatures are not rising, though research meteorologist J. Marshall Shepherd said that "geographic variations in the graphics offer an outstanding science communication opportunity". Meteorologist and #MetsUnite coordinator Jeff Berardelli said that "local stripe visuals help us tell a nuanced story—the climate is not changing uniformly everywhere". 
Others say the charts should include axes or legends, though the website FAQ page explains the graphics were "specifically designed to be as simple as possible, and to start conversations... (to) fill a gap and enable communication with minimal scientific knowledge required to understand their meaning". J. Marshall Shepherd, former president of the American Meteorological Society, lauded Hawkins' approach, writing that "it is important not to miss the bigger picture. Science communication to the public has to be different" and commending Hawkins for his "innovative" approach and "outstanding science communication" effort. In The Washington Post, Matthew Cappucci wrote that the "simple graphics ... leave a striking visual impression" and are "an easily accessible way to convey an alarming trend", adding that "warming tendencies are plain as day". Greenpeace spokesman Graham Thompson remarked that the graphics are "like a really well-designed logo while still being an accurate representation of very important data". CBS News contributor Jeff Berardelli noted that the graphics "aren't based on future projections or model assumptions" in the context of stating that "science is not left or right. It's simply factual." A September 2019 editorial in The Economist hypothesized that "to represent this span of human history (1850–2018) as a set of simple stripes may seem reductive"—noting those years "saw world wars, technological innovation, trade on an unprecedented scale and a staggering creation of wealth"—but concluded that "those complex histories and the simplifying stripes share a common cause," namely, fossil fuel combustion. Informally, warming stripes have been said to resemble "tie-dyed bar codes" and a "work of art in a gallery". 
See also Climate change art Climate communication Color field Craftivism Data and information visualization Environmental communication Instrumental temperature record Scientific consensus on climate change The Tempestry Project Notes References Further reading — clickable map of warming stripes for each county in 48 contiguous US states — Survey of climate change visualizations External links ShowYourStripes.info — warming stripes portraying historical data for multiple locations Climate change in art Climate communication Climatology Climate and weather statistics Scientific visualization Data and information visualization
Warming stripes
[ "Physics" ]
3,738
[ "Weather", "Physical phenomena", "Climate and weather statistics" ]
61,161,562
https://en.wikipedia.org/wiki/George%20%28snail%29
George ( – January 1, 2019) was a snail of the species Achatinella apexfulva, and the last known individual of his species. Background Achatinella apexfulva was endemic to forests of Oahu, Hawaii. Its populations declined dramatically due to predation by the rosy wolfsnail, which was introduced to Hawaii in the 1950s to control agricultural pests. The species was listed as federally endangered in 1981. Life In 1997, all known remaining specimens of A. apexfulva were collected and bred in captivity. Most offspring died of unknown causes, but one successful offspring was born. This individual, born in a laboratory at the University of Hawaii at Manoa, was named George, after Lonesome George, a Pinta Island tortoise that was also the last of its kind. George's parents were collected from the last known wild population of A. apexfulva, in a few trees near Oahu's Poamoho trail. At the time of his birth, about 20 A. apexfulva individuals survived in captivity; however, by the mid-2000s, George was the only remaining member of the species. George has been described as "a thumbnail-size whorl of dark brown and tan." Although typically referred to using the pronoun "he", George was actually a hermaphrodite. He became sexually mature in 2012, but could not reproduce without a mate. While George was alive, it became a tradition for snail researchers to stop at the spot where the last A. apexfulva were found and scan the trees with binoculars, in the hope of finding him a mate. As of 2016, George lived in a terrarium at the University of Hawaii. At the time of his death, George was kept in a trailer on the outskirts of Kailua, Oahu, cared for by researcher David Sischo, director of the state's snail extinction prevention program, and colleagues. In August 2018, George was among 2000 snails temporarily transferred from Kawainui Marsh to the main Department of Land and Natural Resources offices in downtown Honolulu, to protect against damage from Hurricane Lane. 
Death and legacy On January 1, 2019, George died at age 14, leaving the species reportedly extinct. His body was discovered the following morning. As of 2019, George's remains are stored in a cupboard labelled "DEATH CABINET", alongside the bodies of other dead snail specimens. In 2017, researchers collected a two-millimetre sample of George's foot, which is now kept in storage at San Diego's Frozen Zoo, to be available for possible future cloning attempts. References Notes Citations 2000s animal births 2019 animal deaths Endlings Individual molluscs
George (snail)
[ "Biology" ]
552
[ "Individual organisms", "Endlings" ]
52,023,256
https://en.wikipedia.org/wiki/Red%20Dead%20Redemption%202
Red Dead Redemption 2 is a 2018 action-adventure game developed and published by Rockstar Games. The game is the third entry in the Red Dead series and a prequel to the 2010 game Red Dead Redemption. The story is set in a fictionalized representation of the United States in 1899 and follows the exploits of Arthur Morgan, an outlaw and member of the Van der Linde gang, who must deal with the decline of the Wild West while attempting to survive against government forces, rival gangs, and other adversaries. The game is presented through first- and third-person perspectives, and the player may freely roam its interactive open world. Gameplay elements include shootouts, robberies, hunting, horseback riding, interacting with non-player characters, and maintaining the character's honor rating through moral choices and deeds. A bounty system governs the response of law enforcement and bounty hunters to crimes committed by the player. The game's development lasted over eight years, beginning soon after Red Dead Redemption's release, and it became one of the most expensive video games ever made. Rockstar co-opted all of its studios into one large team to facilitate development. The developers drew influence from real locations as opposed to film or art, focusing on creating an accurate reflection of the time through the game's characters and world. The game was Rockstar's first built specifically for eighth-generation consoles, having tested their technical capabilities while porting Grand Theft Auto V. The game's soundtrack features an original score composed by Woody Jackson and several vocal tracks produced by Daniel Lanois. Development included a crunch schedule of 100-hour weeks, leading to reports of mandatory and unpaid overtime. Red Dead Online, the game's online multiplayer mode, lets up to 32 players engage in a variety of cooperative and competitive game modes. 
Red Dead Redemption 2 was released for the PlayStation 4 and Xbox One in October 2018, and for Windows and Stadia in November 2019. It broke several records and had the second-biggest launch in the history of entertainment, generating $725 million in sales from its opening weekend and exceeding the lifetime sales of Red Dead Redemption in two weeks. The game received critical acclaim, with praise directed at its story, characters, open world, graphics, music, and level of detail, but some criticism at its control scheme and emphasis on realism over player freedom. It won more than 175 Game of the Year awards and received multiple other accolades from awards shows and gaming publications. It is considered one of eighth-generation console gaming's most significant titles and among the greatest video games ever made. It is among the best-selling video games with over 67 million copies shipped.
Gameplay
Red Dead Redemption 2 is a Western-themed action-adventure game. Played from a first- or third-person perspective, the game is set in an open-world environment featuring a fictionalized version of the United States in 1899. It features single-player and online multiplayer components, the latter released under Red Dead Online. For most of the game, the player controls outlaw Arthur Morgan, a member of the Van der Linde gang, as he completes missions—linear scenarios with set objectives—to progress the story; in the epilogue, the player controls Red Dead Redemption protagonist John Marston. Outside of missions, they can freely roam the interactive world. They may engage in combat with enemies using melee attacks, firearms, bow and arrow, throwables, or dynamite, and can dual wield weapons. The player can swim as Arthur but not as John. Red Dead Redemption 2's world features different landscapes with occasional travelers, bandits, and wildlife, and urban settlements ranging from farmhouses to towns and cities.
Horses are the main form of transportation, of which there are various breeds with different attributes. The player can steal horses and must train or tame wild horses to use them; to own a horse, they must saddle or stable it. Repeated use of a horse begins a bonding process, increased by leading, petting, cleaning, and feeding it, and the player will acquire advantages as they ride their horse. Stagecoaches and trains can be used to travel; the player can hijack a train or stagecoach by threatening the driver and rob its contents or passengers. The player may witness or partake in random events in the world, including ambushes, crimes, pleas for assistance, ride-by shootings, public executions, and animal attacks. They may be rewarded when helping others. They may partake in side activities, including tasks with companions and strangers, dueling, bounty hunting, searching for collectibles such as rock carvings, and playing poker, blackjack, dominoes, and five finger filet. Hunting animals provides food, income, and materials for crafting items. The choice of weapon and shot placement affect the quality and value of meat and pelt, and the player can skin the animal or carry the carcass, which will rot over time, decrease in value, and attract predators. Some story moments give the player the option to accept or decline additional missions and lightly shape the plot around their choices. They can choose different dialogue trees with non-player characters (NPCs), such as being friendly or insulting. If they choose to kill an NPC, they can loot their corpse. The Honor system measures how the player's actions are perceived: morally positive choices and deeds like helping strangers, following the law, and sparing opponents in a duel will increase Honor, while negative deeds such as theft and harming innocents will decrease it.
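The Honor mechanic described above amounts to a running morality score, adjusted up or down by deeds and gating benefits at milestones. The sketch below is purely illustrative: the deed names, point values, clamp range, and thresholds are all invented for the example, and this is not Rockstar's actual implementation.

```python
# Illustrative sketch of an Honor-style meter. All deed names, point
# values, and thresholds are hypothetical; this is not the game's code.

class HonorMeter:
    """A morality score that rises with good deeds and falls with bad ones."""

    DEED_VALUES = {
        "help_stranger": 5,        # morally positive deeds raise Honor
        "spare_duel_opponent": 10,
        "theft": -5,               # negative deeds lower it
        "harm_innocent": -10,
    }

    def __init__(self):
        self.score = 0  # neutral starting point

    def record(self, deed):
        # Clamp the score so milestones stay meaningful at both extremes.
        delta = self.DEED_VALUES.get(deed, 0)
        self.score = max(-100, min(100, self.score + delta))

    def perks(self):
        # High Honor: cosmetic and economic bonuses; low Honor: better looting.
        if self.score >= 50:
            return ["special_outfits", "store_discount"]
        if self.score <= -50:
            return ["better_loot"]
        return []
```

Honor-dependent dialogue and outcomes could be driven the same way, by branching on score ranges rather than on the perk list.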
Dialogue and outcomes often differ based on Honor level, and attaining milestones grants unique benefits: high Honor provides special outfits and store discounts, while low Honor grants more items from looting. In addition to health and stamina bars, the player has cores, which affect the rate at which their health and stamina regenerate. Freezing or overheating rapidly drains cores, preventable by wearing weather-appropriate clothing. The player can gain or lose weight depending on how much they eat; an underweight character will have less health but more stamina, while an overweight character can better absorb damage but with less stamina. Eating and sleeping replenish cores. The player can bathe to remain clean and visit a barber to change hairstyles; hair grows realistically over time. Weapons require cleaning to maintain their performance. Using a certain type of gun extensively improves weapon handling, reduces recoil, and increases the rate of reloading. The player can take cover, free aim, and target a person or animal. Individual body parts can be targeted to take down targets without killing them. Weapons consist of pistols, revolvers, repeaters, rifles, shotguns, bows, explosives, lassos, mounted Gatling guns, and melee weapons such as knives and tomahawks. The player can use Dead Eye to slow down time and mark targets. Once the targeting sequence ends, they fire at every marked location in a very short space of time. The Dead Eye system upgrades progressively and grants abilities such as targeting fatal points. When the player commits a crime, witnesses alert the law; the player can stop them to avoid repercussions. Law enforcers investigate once alerted. When the player is caught, a bounty is set on their head; as they commit more crimes, their bounty grows higher and more lawmen will be sent to hunt them. If the player escapes the law, bounty hunters track them down. After committing enough crime, the U.S. Marshals will be sent to the player's location. To escape law enforcement, they can leave the wanted vicinity, hide from pursuers, or kill them. The bounty will remain on their head, lawmen and civilians will be more vigilant, and regions where crimes have been committed will be on lockdown. When caught by lawmen, the player can surrender if they are unarmed and on foot. They can remove their bounty by paying it off or spending time in jail.
Synopsis
Setting and characters
The world of Red Dead Redemption 2 spans five fictitious U.S. states: New Hanover, Ambarino, and Lemoyne are located to the immediate north and east of New Austin and West Elizabeth, which return from Red Dead Redemption. Ambarino is a mountain wilderness, with the largest settlement being the Wapiti Native American reservation; New Hanover encompasses a sweeping valley and woody foothills that feature the cattle town of Valentine, the riverside Van Horn Trading Post, and the coal town of Annesburg; and Lemoyne is composed of bayous and plantations resembling the Southeastern United States, and is home to the Southern town of Rhodes, the Creole village of Lagras, and the former French colony of Saint Denis, analogous to New Orleans. West Elizabeth consists of wide plains, dense forests, and the prosperous port town of Blackwater; the region is expanded from the original game with a northern portion containing the mountain resort town of Strawberry. New Austin is an arid desert region on the border with Mexico and centered on the frontier towns of Armadillo and Tumbleweed, featured in the original game. Parts of New Austin and West Elizabeth were redesigned to reflect the earlier time: Blackwater is under development, while Armadillo is a ghost town as a result of a cholera outbreak.
The player controls Arthur Morgan (Roger Clark), an enforcer and veteran member of the Van der Linde gang, led by Dutch van der Linde (Benjamin Byron Davis), a charismatic anarchist who extols personal freedom and decries the encroaching march of modern civilization. The gang includes Dutch's best friend and co-leader Hosea Matthews (Curzon Dobell), John Marston (Rob Wiethoff) and his partner Abigail Roberts (Cali Elizabeth Moore) and son Jack Marston (Marissa Buccianti and Ted Sutherland), the lazy Uncle (John O'Creagh and James McBride), gunslingers Bill Williamson (Steve J. Palmer), Sean MacGuire (Michael Mellamphy), Javier Escuella (Gabriel Sloyer), and Micah Bell (Peter Blomquist), Black-Native American hunter Charles Smith (Noshir Dalal), and housewife-turned-gunslinger Sadie Adler (Alex McKenna). The gang's criminal acts bring them into conflict with wealthy oil magnate Leviticus Cornwall (John Rue), who recruits the Pinkerton Detective Agency, led by Andrew Milton (John Hickok) and his subordinate Edgar Ross (Jim Bentley), to hunt them down. The gang encounter Italian Mafia boss Angelo Bronte (Jim Pirri), controversial governor Colonel Alberto Fussar (Alfredo Narciso), and Dutch's nemesis Colm O'Driscoll (Andrew Berg), and become entangled with the warring Gray and Braithwaite families, who are rumored to be hoarding Civil War gold. Arthur helps Rains Fall (Graham Greene) and his son Eagle Flies (Jeremiah Bitsui), both members of the Native American Wapiti tribe whose land is targeted by the U.S. Army.
Plot
After a botched ferry heist in 1899, the Van der Linde gang are forced to leave their substantial money stash and flee Blackwater. Realizing the progress of civilization is ending the time of outlaws, they decide to gain enough money to escape the law and retire. They rob a train owned by Cornwall, who hires Pinkertons to apprehend them. The gang perform jobs to earn money, as Dutch continually promises the next heist will be their last.
Following a shootout with Cornwall's men in Valentine, the gang relocate to Lemoyne, where they work simultaneously for the Grays and Braithwaites in an attempt to turn them against each other. However, the families double-cross them: the Grays kill Sean during an ambush in Rhodes, while the Braithwaites kidnap and sell Jack to Bronte. The gang retaliate and destroy both families before retrieving Jack from Bronte, who offers them leads on work but eventually double-crosses them. In revenge, Dutch kidnaps Bronte and feeds him to an alligator, which disturbs Arthur. The gang rob a bank in Saint Denis, but the Pinkertons intervene, killing Hosea and arresting John. Dutch, Arthur, Bill, Javier, and Micah escape the city via a ship heading to Cuba. A torrential storm sinks the ship, and the men wash ashore on an island, Guarma, where they become embroiled in a war between tyrannical sugar plantation owner Fussar and the enslaved local population. After helping the revolutionaries kill Fussar, the group secure transport back to the United States and reunite with the rest of the gang, though they are soon assaulted by Pinkertons, whom they repel. Dutch, paranoid that a gang member is working as an informant, obsesses over one last heist. He doubts Arthur's loyalty after he disobeys him by liberating John earlier than planned, naming Micah his top lieutenant in Arthur's place. Arthur becomes concerned Dutch is no longer the man he knew, as he becomes insular, abandons their ideals, and murders Cornwall. Faced with his mortality after being diagnosed with tuberculosis, Arthur reflects on his actions and how to protect the gang, telling John to run away with Abigail and Jack and openly defying Dutch by aiding the local Native American people. Several gang members become disenchanted and leave, while Dutch and Micah arrange one final heist of an Army payroll train.
Arthur's faith in Dutch is shattered when he abandons Arthur to the Army, leaves John for dead, and refuses to rescue a kidnapped Abigail. Arthur and Sadie rescue Abigail from Milton, who names Micah as the Pinkertons' informer before Abigail kills him. Arthur returns to camp and openly accuses Micah of betrayal. Dutch, Bill, Javier, and Micah turn on Arthur and a newly returned John, but the standoff is broken when Pinkertons attack. The player can choose to have Arthur aid John's escape by delaying the Pinkertons or return to the camp to recover the gang's money. Micah ambushes Arthur, and Dutch intervenes in their fight. Arthur convinces Dutch to abandon Micah and leave. If the player has high honor, Arthur succumbs to his injuries and disease while watching the sunrise; if the player has low honor, Micah executes him. Eight years later, in 1907, John and his family are trying to lead honest lives. They find work at a ranch, where John demonstrates his combat experience while fending off bandits threatening the property. Believing John is unwilling to give up his old ways, Abigail leaves with Jack. John takes a loan from the bank to purchase a ranch. He works with Uncle, Sadie, and Charles to build a new home, and proposes to Abigail on her return. Learning Micah is still alive and has formed his own gang, John, Sadie, and Charles assault his camp. They find Dutch, who shoots Micah after a tense standoff and leaves in silence, allowing John to kill Micah and claim the gang's Blackwater stash to pay his debt. John marries Abigail and they start a new life on their ranch alongside Jack and Uncle, as Sadie and Charles leave for other pursuits. Mid-credits scenes show the fates of other surviving gang members. Edgar Ross tracks down Micah's killer, which leads him to John's ranch, foreshadowing the events of Red Dead Redemption.
Development
Preliminary work on Red Dead Redemption 2 began shortly following the release of the original game, Red Dead Redemption (2010).
Rockstar San Diego, the studio behind the original game, had a rough outline of the game by mid-2011, and by late 2012, rough scripts of the game had been completed. When Rockstar Games realized a group of distinct studios would not necessarily work, it co-opted all of its studios into one large team, presented simply as Rockstar Games, to facilitate development among some 1,600 people; a total of around 2,000 people worked on the game. According to analyst estimations, the game's combined development and marketing budget makes it one of the most expensive video games ever developed. While the main theme of the original game was to protect family at all costs, Red Dead Redemption 2 tells the story of a family's breakdown in the Van der Linde gang. The team was interested in exploring the story of why the gang fell apart, as frequently mentioned in the first game. Rockstar's Vice President of Creativity Dan Houser was inspired by film and literature when writing the game, though he avoided contemporary works to avoid being accused of stealing ideas. The team was not specifically inspired by film or art but rather real locations. They sought to create an accurate reflection of the time, with people and locations: the citizens feature a contrast between rich and poor, while the locales contrast between civilization and wilderness. Houser viewed the game as historical fiction, opting to allude to historical events instead of retelling them due to their unpleasantness. Red Dead Redemption 2's recording sessions began in 2013. Rockstar wanted a diverse cast of characters within the Van der Linde gang. The writers put particular focus on the individual stories behind each character, exploring their life before the gang and their reasons for remaining with the group. Several characters were cut from the game during development as their personalities failed to add to the narrative.
The actors sometimes improvised some additional lines, but mostly remained faithful to the script. The team decided the player would control one character in Red Dead Redemption 2, as opposed to the three protagonists in Rockstar's previous title Grand Theft Auto V (2013), to follow the character more personally and understand how the events impact him. They felt a single character was more appropriate for the narrative structure of a Western. Red Dead Redemption 2 is the first game from Rockstar built specifically for the PlayStation 4 and Xbox One. Rockstar had tested these consoles' technical capabilities when porting Grand Theft Auto V, initially released on the PlayStation 3 and Xbox 360, to them. Once the team had defined what limitations were sustainable, they found the areas requiring the most focus. One of Rockstar's goals with Red Dead Redemption 2's gameplay was to make the player feel as though they are living in a world, instead of playing missions and watching cutscenes. A method used to achieve this was through the gang's moving camp, where the player can interact with other characters. The team ensured the characters maintained the same personality and mood from cutscene to gameplay to make the world feel more alive and realistic. Woody Jackson, who worked with Rockstar on the original game and Grand Theft Auto V, returned to compose Red Dead Redemption 2's original score. Red Dead Redemption 2 has three different types of score: narrative, which is heard during the missions in the game's story; interactive, when the player is roaming the open world or in multiplayer; and environmental, which includes songs sung around a campfire or a character playing music in the world. The game's music regularly reacts according to the player's decisions in the world. Jackson purchased several instruments from the Wrecking Crew featured on classic cowboy films. In total, over 110 musicians worked on the music for the game.
Daniel Lanois produced the original vocal tracks for the game, collaborating with artists such as D'Angelo, Willie Nelson, Rhiannon Giddens, and Josh Homme. Director of music and audio Ivan Pavlovich engaged saxophone player Colin Stetson, experimental band Senyawa, and musician Arca to work on the score. Rockstar Games first teased Red Dead Redemption 2 on October 16–17, 2016, before the official announcement on October 18, 2016. Originally due for release in the second half of 2017, the game was delayed twice: first to Q1/Q2 2018, and later to October 26, 2018. According to Rockstar, the game required extra development time for "polish". To spur pre-order sales, Rockstar collaborated with several retail outlets to provide special edition versions of the game. A companion app, released alongside the game for Android and iOS devices, acts as a second screen wherein the player can view in-game items such as catalogs, journals, and a real-time mini-map. The game was released for Windows on November 5, 2019, and was a launch title for Stadia when the service launched on November 19, 2019. The Windows version has visual and technical improvements.
Reception
Critical response
Red Dead Redemption 2 received "universal acclaim" from critics, according to review aggregator Metacritic. It is one of the highest-rated games on Metacritic, and the highest-rated PlayStation 4 and Xbox One game alongside Rockstar's Grand Theft Auto V. Reviewers praised the story, characters, open world, graphics, music, and level of detail. Matt Bertz of Game Informer described the game as "the biggest and most cohesive adventure Rockstar Games has ever created", and GamesRadar's David Meikleham felt it "represents the current pinnacle of video game design". Keza MacDonald of The Guardian declared it "a landmark game" and "a new high water-mark for lifelike video game worlds"; IGN's Luke Reilly named it "one of the greatest games of the modern age".
Peter Suderman, writing for The New York Times, considered Red Dead Redemption 2 an example of video games as a work of art, comparing the game's abilities to "[tell] individual stories against the backdrop of national and cultural identity, deconstructing their genres while advancing the form" to the current state of film and television with similar works like The Godfather and The Sopranos. Regarding its narrative, Meikleham of GamesRadar called Red Dead Redemption 2 "perhaps the boldest triple-A game ever made", praising the unpredictability of the narrative and comparing its epilogue to The Last of Us (2013). The Guardian's MacDonald praised the story's twists, applauding the writers' ability to feed smaller stories into the overall narrative. Nick Plessas of Electronic Gaming Monthly (EGM) noted the best stories "are to be found in the margins", discovered and written by the player. Game Informer's Bertz felt the narrative rarely suffered from repetition, an impressive feat considering the game's scope, though expressed desire for more passive, quiet moments. Conversely, GameSpot's Kallie Plagge was frustrated by the predictability later in the narrative though acknowledged its importance to Arthur's story. Alex Navarro of Giant Bomb felt the narrative suffered in its clichéd Native American portrayal and side missions. Some reviewers commented on the game's slow opening hours and its lengthy epilogue. EGM's Plessas found the journey of redemption for Arthur "far more redeeming" than John's in Red Dead Redemption, noting his sins heightened his sympathy for the character. Conversely, Eurogamer's Martin Robinson considered Arthur less compelling than Marston, resulting in a confusing narrative. GameSpot's Plagge felt the new characters contributed to the story's quality and Mike Williams of USgamer wrote they "feel like actual people" due to their varied personalities.
IGN's Reilly praised the cultural variety and avoidance of caricatures, and Giant Bomb's Navarro noted the characters possess humanity often lacking in other Rockstar games, particularly in the thoughtful portrayal of Arthur's internal conflicts. MacDonald of The Guardian found the performances led to more believable characters. Polygon's Chris Plante found the political commentary shone when focusing on the Braithwaite and Gray families but considered the portrayal of Native American characters insensitive and confusing. Eirik Gumeny, writing for Polygon, praised the realistic and unfiltered depiction of tuberculosis, including the misguided and hostile reactions from others. Several critics considered Red Dead Redemption 2's open world among the greatest in video games; EGM's Plessas said it "pushes industry boundaries in both size and detail", and The Guardian's MacDonald praised the imitation of real American landscapes. IGN's Reilly considered the world "broader, more beautiful, and more varied" than its predecessor's, due in part to how each environment feels alive. GameSpot's Plagge felt compelled to explore the open world due to its variety, reactivity, and surprises. GamesRadar's Meikleham called Red Dead Redemption 2 "the best looking video game of all time" with some of the most impressive lighting and weather systems, and USgamer's Williams considered it one of the best-looking on PlayStation 4 and Xbox One. IGN's Reilly praised the lighting engine, facial animation, and level of granular detail. Game Informer's Bertz lauded the attention to detail and found the world felt more alive due to "an unrivaled dynamic weather system, ambient sound effects, and the most ambitious ecology of flora and fauna ever seen in games".
Several reviewers lauded the level of detail in all aspects of gameplay—EGM's Plessas felt the attention to detail led to deeper immersion—though some found the sheer amount of realism restricted opportunities and unnecessarily prolonged some animations. IGN's Reilly felt Arthur's movement did not feel cumbersome despite being "heavier" than Grand Theft Auto V's protagonists, and found the intimate battles more exciting. Polygon's Plante considered the conversation options limited but still an improvement over the violence of other action games. Eurogamer's Robinson voiced frustration at the lack of freedom in some story missions. Some reviewers criticized the controls and found its button layout and user interface inconsistent and confusing. Red Dead Redemption 2's Windows release received "universal acclaim" according to Metacritic; it is one of the highest-rated PC games. PCGamesN's Sam White thought the graphics improvements made the open world "[look] the best it ever has". Destructoid's Carter praised the addition of the Photo Mode. Sam Machkovech of Ars Technica felt the cutscene animations did not scale well to higher frame rates but considered the gameplay far superior to console. Rock Paper Shotgun's Matthew Castle lauded the adapted controls, particularly when painting targets in Dead Eye, though felt they took longer to familiarize oneself with. PC Gamer's James Davenport found the first-person perspective superior on the Windows version due to the responsiveness of the mouse but noted the game crashed several times; Jeuxvideo.com's Jean-Kléber Lauret echoed similar criticisms, observing the graphical and technical enhancements required advanced hardware. Polygon's Samit Sarkar criticized the port's technical issues and declared it unplayable at the time. One week after release, PCMag's Tony Polanco said the technical issues had been mostly solved.
Accolades
Red Dead Redemption 2 won over 175 Game of the Year awards, receiving wins at the Australian Games Awards, Brazil Game Awards, Fun & Serious Game Festival, and Italian Video Game Awards, and from outlets such as 4Players, AusGamers, Complex, Digital Trends, Edge, Electronic Gaming Monthly, Gamereactor, GameSpot, The Guardian, Hot Press, news.com.au, The Telegraph, USgamer, and Vulture. On Metacritic, Red Dead Redemption 2 was the highest-rated game of 2018. The game was named among the best games of the 2010s by Entertainment.ie, The Hollywood Reporter, Metacritic, National Post, NME, Stuff, Thrillist, VG247, and Wired UK. At The Game Awards 2018, the game received eight nominations and won four awards: Best Audio Design, Best Narrative, Best Score/Music, and Best Performance for Clark as Arthur. At the 6th SXSW Gaming Awards, Red Dead Redemption 2 was named Trending Game of the Year and won Excellence in SFX and Technical Achievement. The game earned eight nominations at the 22nd Annual D.I.C.E. Awards, seven at the 19th Game Developers Choice Awards, and six at the 15th British Academy Games Awards.
Sales
Since the previous installment in the series was among the highest-reviewed and best-selling games of the seventh generation of video game consoles, many analysts believed Red Dead Redemption 2 would be one of the highest-selling games of 2018 and have a great effect on other game sales during the fourth quarter. After the game's announcement in October 2016, analyst Ben Schacter of Macquarie Research estimated it would sell 12 million copies in its first quarter, while analysts at Cowen and Company gave a "conservative" estimate of 15 million sales.
In July 2018, industry analyst Mat Piscatella predicted Red Dead Redemption 2 would be the best-selling game of 2018, outselling other blockbuster titles such as Battlefield V, Call of Duty: Black Ops 4, and Fallout 76; some industry commentators noted frequent franchises like Assassin's Creed and Call of Duty were launching their 2018 entries—Odyssey and Black Ops 4, respectively—earlier than usual, predicting an avoidance of competition with Red Dead Redemption 2. Shortly before release in October 2018, Schacter estimated the game would sell 15 million copies in its first quarter, though noted investor expectations were at 20 million copies; Michael Pachter of Wedbush Securities predicted 25 million. Michael Olson of Piper Jaffray and Doug Creutz of Cowen Inc. each projected revenue of several hundred million dollars in the first three days. Red Dead Redemption 2 had the largest opening weekend in entertainment history, making over $725 million in revenue in three days, and over 17 million copies shipped in two weeks, exceeding the lifetime sales of Red Dead Redemption. Additionally, Red Dead Redemption 2 was the second-highest-grossing entertainment launch (behind Grand Theft Auto V) and set records for largest pre-orders, first-day sales, and three-day sales on the PlayStation Network. The share price for Rockstar's parent company, Take-Two Interactive, rose nine percent in the week after release. VentureBeat's Dean Takahashi noted the game likely broke even in its first week and, based on analyst estimates, would begin to earn a profit by December 2018. The game shipped 23 million copies in 2018, and sales reached 29 million in 2019, 36 million in 2020, 43 million in 2021, 50 million in 2022, 61 million in 2023, and over 67 million by June 2024. By dollar sales, it was the best-selling game of the latter half of the 2010s, and the decade's seventh-best-selling overall. It is among the best-selling video games.
In the United States, Red Dead Redemption 2 was the second-best-selling game of October 2018, behind Call of Duty: Black Ops 4. It was the nation's best-selling game in November, third-best-selling in December, and overall best-selling of the year. In 2019, it maintained its placement in the nation's top charts, and was the twelfth-best-selling game of the year. It remained in the charts for the first half of 2020. In the United Kingdom, Red Dead Redemption 2 was the best-selling retail game in its first week of release and the second-fastest-selling game of 2018 (behind FIFA 19). The opening week physical sales doubled its predecessor's, with 68 percent of sales from the PlayStation 4 version. Red Dead Redemption 2 was the third-fastest-selling non-FIFA game released in its generation, behind Call of Duty: Black Ops III and Call of Duty: Advanced Warfare. In the United Kingdom, it was the second-best-selling game in 2018, fifth in 2019, eleventh in 2020, sixth in 2021, and ninth in 2022. Within its first week in Japan, the PlayStation 4 version sold 132,984 copies, placing it at number one on the all-format video game sales chart. In Australia, it was the best-selling game of 2018, and the fifteenth-best-selling of 2020. Worldwide, the Windows version sold 406,000 copies upon launch in November 2019, doubling to over one million after its release on Steam the following month.
Red Dead Online
The online multiplayer component to Red Dead Redemption 2, titled Red Dead Online, was released as a public beta on November 27, 2018, to players who owned a special edition of the base game, and then progressively opened to all owners. Players customize a character and are free to explore the environment alone or in a group. The game world features events in which up to 32 players can partake individually or with a posse group.
As players complete activities throughout the game world, they receive experience points to raise their characters in rank and receive bonuses, thereby progressing in the game. Though Red Dead Online and Red Dead Redemption 2 share assets and gameplay, Rockstar viewed them as separate products with independent trajectories, reflected in its decision to launch the multiplayer title separately. Player progression in the public beta carried over when the beta ended on May 15, 2019. A standalone client for Red Dead Online was released on December 1, 2020, for PlayStation 4, Windows, and Xbox One. Post-release content was added to the game through free title updates. In July 2022, Rockstar announced Red Dead Online would not receive more major updates, instead focusing on smaller missions and the expansion of existing modes as development resources were withdrawn to focus on Grand Theft Auto VI.
Controversies
In February 2018, online technology publication Trusted Reviews published an article leaking several features due to be included in Red Dead Redemption 2, including a first-person perspective, and a battle royale mode in Red Dead Online. The information was obtained from a leaked document in August 2017, but the site had hesitated to publish the article as the claims were "unsubstantiated" until promotional material validated its legitimacy; the document was sent to other sites at the time. In November 2018, Trusted Reviews replaced the article with an apology, noting the "information was confidential" and should not have been published. In a settlement with Take-Two Interactive, Trusted Reviews agreed to donate more than £1 million to charities chosen by Take-Two; Rockstar directed the funds be donated to the American Indian College Fund, the American Prairie Reserve, and the First Nations Development Institute. Neither Trusted Reviews nor Take-Two indicated any specific laws had been violated.
Several journalists recognized the uniqueness of successful legal action against media outlets; Seth Barton of MCV/Develop called the outcome "an incredible development for games industry journalism" and felt it would result in hesitancy to leak information regarding Rockstar in the future. Kotaku's Keza MacDonald similarly described the events as "extraordinary" as it likely meant Take-Two argued the information was a trade secret and Trusted Reviews was unable to use a public interest defense; she added "it might prove to be influential" and prevent publications from leaking information in the future, even if obtained legally. Prior to the game's release, Dan Houser stated the team had been working 100-hour weeks "several times in 2018". Many sources interpreted this statement as "crunch time" for the entire development staff of the game, comparable to similar accusations made by wives of Rockstar San Diego employees with regard to the development of the game's predecessor. The following day, Rockstar clarified in a statement the work duration mentioned by Houser only affected the senior writing staff for Red Dead Redemption 2, and the duration had only been the case for three weeks during the entire development. Houser added the company would never expect or force any employee to work as long as was stated, and those staying late at the development studios were driven by their passion for the project. However, other Rockstar employees argued Houser's statements did not give an accurate picture of the "crunch-time culture" at the company many of its employees worked under, which included "mandatory" overtime and years-long periods of crunch. Due to the salary-based nature of employment contracts, many employees were not compensated for their overtime work and instead depended on year-end bonus payments that hinged on the sales performance of the game. 
Nonetheless, a sentiment echoed across many employee statements was the observation that working conditions had somewhat improved since development on the original Red Dead Redemption. By April 2020, several employees reported the company had made significant changes as a result of the publicity surrounding the work culture, and many were cautiously optimistic about Rockstar's future. In November 2018, YouTuber Shirrako posted several videos of his player character murdering a female suffragette NPC, including feeding her to an alligator and dropping her down a mineshaft. Critics noted the majority of comments on the videos were sexist and misogynistic. Shirrako claimed the actions were apolitical and he did not support the sexist comments but did not wish to censor them. Matt Leonard of GameRevolution called Shirrako's response "plain bullshit", noting he continued to post similar videos encouraging the same behavior. In response, YouTube suspended the channel for violation of their community guidelines, citing its graphic nature for shock purposes and for promoting violence. Shirrako protested the decision, claiming it was hypocritical as in-game violence against men did not receive the same response. YouTube restored the channel and designated an age restriction to the suffragette videos, commenting "the reviewer will be educated on this outcome and on how to avoid repeating this mistake". Some critics questioned if Rockstar was partly to blame for the behavior, as the game does not limit attacks on the suffragette as it does other characters, such as children; scholars Kristine Jørgensen and Torill Elvira Mortensen, writing in Games and Culture, acknowledged this concern, but recognized the responsibility ultimately lay with the player, and limiting attacks could be interpreted as both a political statement from Rockstar and a restriction on the player's freedom of expression. 
Writing for Public History Weekly, Moritz Hoffman noted the incident reflects a newer issue of open world games: granting freedom without penalties promotes disinhibition. In The Journal of the Gilded Age and Progressive Era, scholars Hilary Jane Locke and Thomas Mackay wrote it "points to a sharp contrast between the game's portrayal of Progressive Era politics ... and how some players have responded to its depictions thereof". Securitas AB, the parent company of the modern-day Pinkerton agency, issued a cease and desist notice to Take-Two Interactive on December 13, 2018, asserting Red Dead Redemption 2's use of the Pinkerton name and badge imagery violated its trademark, and demanded royalties for each copy of the game sold or it would take legal action. Take-Two filed a complaint against Securitas on January 11, 2019, maintaining the Pinkerton name was strongly associated with the Wild West, and its use of the term did not infringe on the Pinkerton trademark. Take-Two sought a summary judgment to declare the use of Pinkerton in the game as allowed fair use. Game Informer's Javy Gwaltney agreed with Take-Two's claims, questioning why Securitas had not targeted other works depicting the Pinkerton agency in the past; he felt "the company likely just wants a cut of [the game's] profits". In response to Take-Two's complaint, Pinkerton president Jack Zahran described the game's portrayal of Pinkertons as "baseless" and "inaccurate", noting Pinkerton employees would "have to explain to their young game players why Red Dead Redemption 2 encourages people to murder Pinkertons", but hoped the companies could come to an "amicable solution". By April 2019, Securitas withdrew its claims and Take-Two moved to withdraw its complaint. Legacy Critics agreed Red Dead Redemption 2 was among the best games of the eighth generation of video game consoles. 
GQ's White described it as "a generation-defining release", and VG247's McKeand named it "a benchmark for other open world games to aspire to". IGN ranked the game as the third-best Xbox One game and eleventh-best PC and PlayStation 4 game. In November 2020, TechRadar listed it among the greatest games of the eighth generation; editor Gerald Lynch felt it set the bar for believable open world games. In December, GamesRadar+ ranked it the fifth-best game of the generation, noting it had already begun to influence the open-world and role-playing genres. Since its release, Red Dead Redemption 2 has been cited as one of the greatest video games ever made. In March 2019, Popular Mechanics ranked it 24th on its list of greatest games. In October, IGN added Red Dead Redemption 2 to its list of top 100 video games, ranked 62nd in 2019 and promoted to 8th in 2021; editor Luke Reilly praised its "uncompromising detail" and wrote it "stands shoulder-to-shoulder with Grand Theft Auto V as one of gaming's greatest open-world achievements". In July 2020, Dylan Haas of Mashable considered the game his second favorite of all time, citing its realism, world, characters, and narrative. In November 2021, GamesRadar+ ranked it 28th on its list of top 50 games, describing it as "one of the best sandbox games ever made". In April 2022, GamingBolt's Ravi Sinha ranked Red Dead Redemption 2 the second-best game of all time, citing its characters, narrative, attention to detail, and visual fidelity, naming it Rockstar's "finest work". In September, USA Today ranked it 21st on its list of best games, praising Arthur as "one of the most likable protagonists in games" and describing the world as "the real star of the show". 
In May 2023, over 200 developers, journalists, and content creators surveyed by GQ ranked Red Dead Redemption 2 the 15th-best game; GQ's Sam White and Robert Leedham called it "perhaps the greatest flex in video game history", which set a "benchmark for cinematic storytelling and attention to detail". Producer Eiji Aonuma said open world games like Red Dead Redemption 2 inspired developers of The Legend of Zelda: Tears of the Kingdom (2023). Red Dead Redemption 2 was referenced several times in the South Park episodes "Time to Get Cereal" and "Nobody Got Cereal?" in November 2018. Game footage was used in the first music video for "Old Town Road" by Lil Nas X in December, which scholars saw as evidence of the game's influence on Western culture and country music. In April 2022, Joe Meizies won Virtual Photographer of the Year at the London Games Festival for his virtual photography in Red Dead Redemption 2. Tombstone Redemption, a fan event organized by Kenney Palkow, was held in Tombstone, Arizona, on July 29–30, 2023, with an estimated 10,000 attendees, including fourteen cast members from the series. Tombstone was redressed to resemble the in-game Blackwater. The event returned the following year as Black Hills Redemption, held in Deadwood, South Dakota, on June 21–23, with twenty actors present as guests. In July 2021, a study published by the University of Exeter and Truro and Penwith College found Red Dead Redemption 2 players had an increased understanding of ecology and animal behavior; players were able to identify three more animals on average than other gamers. In late 2021, University of Tennessee professor Tore Olsson started teaching "Red Dead America", a class about United States history in the late nineteenth and early twentieth centuries, including the frontier myth, Jim Crow laws, settler colonialism, and women's suffrage, inspired by the lack of academic discourse surrounding the game's history. 
Olsson found the class attracted larger enrolments than other history subjects. He published a book about the topic, titled Red Dead's History, in August 2024; the audiobook is narrated by Roger Clark. Notes References External links 2018 video games Action-adventure games Cultural depictions of the Mafia Euphoria (software) games Fiction about bank robbery Fiction about infectious diseases Fiction about train robbery The Game Awards winners Game Developers Choice Award winners Golden Joystick Award winners Ku Klux Klan in popular culture Multiplayer and single-player video games Open-world video games PlayStation 4 games PlayStation 4 Pro enhanced games Revisionist Westerns Rockstar Advanced Game Engine games Rockstar Games games Stadia games Take-Two Interactive games Video game prequels Video games developed in Canada Video games developed in India Video games developed in the United Kingdom Video games developed in the United States Video games produced by Dan Houser Video games scored by Woody Jackson Video games set in 1899 Video games set in 1907 Video games set in the American frontier Video games set in the Caribbean Video games set on fictional islands Video games with time manipulation Video games written by Dan Houser Western (genre) video games Windows games Works about atonement Xbox One games Xbox One X enhanced games
Red Dead Redemption 2
[ "Biology" ]
9,150
[ "Behavior", "Works about atonement", "Works about behavior" ]
52,023,684
https://en.wikipedia.org/wiki/West%20Adams%20Heritage%20Association
Founded in 1983, the West Adams Heritage Association (WAHA) is an historic preservation organization in Los Angeles, California. It is a tax-exempt non-profit organization that is focused on the preservation of the West Adams section of the city. As stated on their website: "West Adams is located just south and west of Downtown and contains the city's largest concentration of Victorian and Craftsman homes, five of the city's Los Angeles Historic Preservation Overlay Zones, and a concentration of Los Angeles Historic-Cultural Monuments." The organization is known for sponsoring neighborhood tours, an annual holiday tour & progressive dinner party, as well as speaking out about preservation issues affecting West Adams. References External links Non-profit organizations based in Los Angeles Historic preservation organizations in the United States Heritage organizations Architectural history Buildings and structures in Los Angeles 1983 establishments in California Organizations established in 1983
West Adams Heritage Association
[ "Engineering" ]
174
[ "Architectural history", "Architecture" ]
52,024,457
https://en.wikipedia.org/wiki/Valbenazine
Valbenazine, sold under the brand name Ingrezza, is a medication used to treat tardive dyskinesia. It acts as a vesicular monoamine transporter 2 (VMAT2) inhibitor. Medical use Valbenazine is used to treat tardive dyskinesia in adults. Tardive dyskinesia is a drug-induced neurological injury characterized by involuntary movements. The clinical trials that led to the approval of valbenazine by the US Food and Drug Administration (FDA) were six weeks in duration. An industry-sponsored study followed the use of valbenazine for up to 48 weeks and found it to be safe and effective for maintaining short-term (six-week) improvements in tardive dyskinesia. Contraindications There are no contraindications for the use of valbenazine according to the prescribing information. Adverse effects Side effects may include sleepiness or QT prolongation. Significant prolongation has not yet been observed at recommended dosage levels; however, those taking inhibitors of the liver enzymes CYP2D6 or CYP3A4 – or who are poor CYP2D6 metabolizers – may be at risk for significant prolongation. Valbenazine has not been adequately studied in pregnancy, and it is recommended that women who are pregnant or breastfeeding avoid use of valbenazine. Pharmacology Mechanism of action Valbenazine is known to cause reversible reduction of dopamine release by selectively inhibiting pre-synaptic human vesicular monoamine transporter type 2 (VMAT2). In vitro, valbenazine shows great selectivity for VMAT2 and little to no affinity for VMAT1 or other monoamine receptors. Although the exact cause of tardive dyskinesia is unknown, it is hypothesized that it may result from neuroleptic-induced dopamine hypersensitivity because it is exclusively associated with the use of neuroleptic drugs. 
By selectively reducing the ability of VMAT2 to load dopamine into synaptic vesicles, the drug reduces overall levels of available dopamine in the synaptic cleft, ideally alleviating the symptoms associated with dopamine hypersensitivity. The importance of valbenazine selectively inhibiting VMAT2 over other monoamine transporters is that VMAT2 is mainly involved with the transport of dopamine, and to a much lesser extent other monoamines such as norepinephrine, serotonin, and histamine. This selectivity likely reduces the risk of "off-target" adverse effects which may result from the upstream inhibition of these other monoamines. Pharmacokinetics Valbenazine is a prodrug which is an ester of [+]-α-dihydrotetrabenazine (DTBZ) with the amino acid L-valine. It is extensively hydrolyzed to the active metabolite DTBZ. Plasma protein binding of valbenazine is over 99%, and that of DTBZ is about 64%. The biological half-life of both valbenazine and DTBZ is 15 to 22 hours. Liver enzymes involved in inactivation are CYP3A4, CYP3A5 and CYP2D6. The drug is excreted, mostly in the form of inactive metabolites, via the urine (60%) and the feces (30%). Society and culture Legal status Valbenazine is produced by Neurocrine Biosciences. In April 2017, valbenazine became the first medication approved by the FDA for the treatment of tardive dyskinesia. Economics While Neurocrine Biosciences does not hold a final patent for valbenazine or elagolix, it does hold a patent for the VMAT2 inhibitor [9,10-dimethoxy-3-(2-methylpropyl)-1H,2H,3H,4H,6H,7H,11bH-pyrido-[2,1-a]isoquinolin-2-yl]methanol and related compounds, which includes valbenazine. Names The International Nonproprietary Name (INN) is valbenazine. Research Valbenazine is being studied for the treatment of Tourette's syndrome. References Further reading Antidyskinetic agents Monoamine-depleting agents Prodrugs Tardive dyskinesia VMAT inhibitors
Valbenazine
[ "Chemistry" ]
991
[ "Chemicals in medicine", "Prodrugs" ]
52,026,083
https://en.wikipedia.org/wiki/Electric%20overhead%20traveling%20crane
Electric overhead traveling cranes or EOT cranes are a common type of overhead crane, also called bridge cranes. They consist of parallel runways, much like the rails of a railroad, with a traveling bridge spanning the gap. EOT cranes are specifically powered by electricity. Applications EOT cranes are extensively used in warehouses and industry. An EOT crane can carry heavy objects anywhere needed on the factory floor, and can also be used for lifting. However, it cannot be used in every industry. The working temperature is limited to a range between −20°C and 40°C. Single girder EOT crane A single girder EOT crane has one main girder, making it easy to install, and requires less maintenance. The most common single girder EOT cranes are as follows: the LD type single girder EOT crane, the LDP type single girder EOT crane, and the HD type single girder EOT crane. It is used for lighter industrial applications as it has lower weight limits. Double girder EOT crane QD type hook double bridge crane LH electric hoist double girder bridge crane NLH type double girder EOT crane References Cranes (machines)
Electric overhead traveling crane
[ "Engineering" ]
241
[ "Engineering vehicles", "Cranes (machines)" ]
52,031,327
https://en.wikipedia.org/wiki/C.%20Marcella%20Carollo
C. Marcella Carollo worked as a professional astronomer for 25 years between 1994 and 2019. Her scientific career was ended by ETH Zürich, which, following accusations that she had bullied students, made her the first professor to be dismissed in the institution's 165-year history. Carollo has maintained her innocence against these accusations, publicly commenting on her case in terms that indicate "academic mobbing". The dismissal was appealed unsuccessfully to the Swiss Federal Administrative Court. Education Carollo began her studies at the University of Palermo, where she earned a laurea degree in physics in 1987, with a specialization in biophysics. She worked for more than four years outside of academia before starting a PhD in astrophysics at Ludwig Maximilian University of Munich, where she graduated in 1994. Career Carollo was awarded a European Community Prize Fellowship, which she held at Leiden University from 1994 to 1996. She held a Hubble Postdoctoral Fellowship at Johns Hopkins University from 1997 to 1999. Carollo was appointed Assistant Professor in the Astronomy Department at Columbia University in 1999, a position she held until 2002. That year, she moved to ETH Zurich as an Associate Professor, in a dual appointment with her spouse Simon Lilly. She was promoted to Full Professor in 2007. She contributed as a member of the Science Oversight Committee to the development of the WFC3 camera, which was installed on the Hubble Space Telescope in 2009. In 2012, she entered the Top Italian Scientist list from VIA Academy and in 2013 she was awarded the Winton Capital Research Prize. In 2018, she was identified as a Highly Cited Researcher for her research work at ETH between 2006 and 2016 – one of only about 20 ETH scientists so recognized. Research Carollo's contribution to astronomy is in the fields of extragalactic astronomy and specifically galaxy formation and evolution. 
Her early work established the relation between the metallicity gradient and stellar mass in galactic spheroids, demonstrated the presence of dark matter halos beyond their half-light radii and was seminal in discovering and characterizing disk-like (pseudo) bulges and nuclear massive star clusters in disk galaxies like the Milky Way. Later she and her ETH group worked on the role of galactic environment and progenitor bias in galaxy evolution, the growth and "quenching" of massive galaxies at high redshifts, and participated in the discovery and characterization of the most distant galaxies in the universe, in the heart of the reionization epoch. Controversy In December 2016, Professor Carollo informed one of her PhD students that she could no longer supervise her PhD, because of the lack of progress. The PhD student reached out to ETH Ombudsman Wilfred van Gunsteren, complaining that she was insufficiently supervised and had been bullied by Professor Carollo. In January 2017, the ETH Ombudsman as well as the student collected complaints (testimonials) about Carollo from a number of previous students and postdocs. In August 2017, ETH Zurich dissolved its Institute for Astronomy. Marcella Carollo and her spouse Simon Lilly, the head of that institute until the dissolution, were given sabbatical leave. In October 2017, an article about the closure and the allegations against Carollo appeared in a Swiss newspaper and was also reported internationally. Shortly afterwards, the ETH Zurich commissioned an Administrative Investigation from an external lawyer, Dr. Markus Rüssli, of the Zurich law firm Umbricht. His report was delivered to ETH in October 2018. ETH published this Rüssli report in April 2019. Meanwhile, on 17 January 2018, the ETH Zurich announced a second investigation against Carollo. This concerned accusations of scientific misconduct in the same testimonials. The ETH at this time suspended Carollo from her duties at the university. 
A special committee was convened in accordance with the ETH Professors' Ordinance, which gave the final recommendation that Prof. Carollo should not be dismissed. ETH published this special committee report in April 2019. In March 2019, the President of ETH Zurich, not following the special committee's recommendation and therefore not following ETH's own rules, submitted a request to the ETH Board to terminate the employment relationship with Professor Carollo. On July 15, 2019, the ETH Board agreed to this request and dismissed Carollo with six months' notice. Women professors, such as Prof. Ursula Keller and Prof. Janet Hering, both former presidents of the ETH Women Professors Forum, criticized this decision. Similar cases occurred at the Max Planck Society, and 145 women scientists signed an open letter to express their concern. The Carollo case has been widely covered in the German-language Swiss media. One Swiss online magazine, Republik, took sides with Carollo and criticized the ETH Zürich's handling of the case in a series of articles in 2019. Carollo appealed her dismissal to the Federal Administrative Court. In its 2022 decision, the court upheld the dismissal: although the contested termination was unjustified, it could not be qualified as abusive or gender-discriminatory under the Swiss Gender Equality Act. It also found dismissing Carollo without earlier warnings unfair and awarded her compensation of eight months' salary. In his farewell lecture, Simon Lilly spoke publicly for the first time about his observation of the case. ETH decided to remove the lecture from its video platform. In his lecture series about Mobbing and Hierarchies in Academia, Kenneth Westhues described the case as academic mobbing. 
References 21st-century Italian astronomers 20th-century Swiss astronomers Hubble Fellows Women astronomers 1962 births Living people Academic staff of ETH Zurich 20th-century Italian women scientists Scientists from Palermo 20th-century Italian astronomers 21st-century Swiss astronomers 20th-century Swiss women scientists 21st-century Swiss women scientists 21st-century Italian women scientists
C. Marcella Carollo
[ "Astronomy" ]
1,179
[ "Women astronomers", "Astronomers" ]
52,032,623
https://en.wikipedia.org/wiki/Agile%20tooling
Agile tooling is the design and fabrication of manufacturing-related tools such as dies, molds, patterns, jigs and fixtures in a configuration that aims to maximise the tools' performance, minimise manufacturing time and cost, and avoid delay in prototyping. A fully functional agile tooling laboratory consists of CNC milling, turning and routing equipment. It can also include additive manufacturing platforms (such as fused filament fabrication, selective laser sintering, stereolithography, and direct metal laser sintering), hydroforming, vacuum forming, die casting, stamping, injection molding and welding equipment. Agile tooling is similar to rapid tooling, which uses additive manufacturing to make tools or tooling quickly, either directly by making parts that serve as the actual tools or tooling components, such as mold inserts; or indirectly by producing patterns that are in turn used in a secondary process to produce the actual tools. Another similar technique is prototype tooling, where molds, dies and other devices are used to produce prototypes. Rapid manufacturing, and specifically rapid tooling technologies, are earlier in their development than rapid prototyping (RP) technologies, and are often extensions of RP. The aim of all toolmaking is to catch design errors early in the design process, improve product design, reduce product cost, and reduce time to market. Users Hundreds of universities and research centers around the globe are investing in additive manufacturing equipment in order to be positioned to make prototypes and tactile representations of real parts. Few have fully committed to the concept of using additive manufacturing (AM) to create manufacturing tools (fixturing, clamps, molds, dies, patterns, negatives, etc.). AM experts seem to agree that tooling is a large, largely untapped market. Deloitte University Press estimated the AM tooling market at $1.2 billion in 2012 alone. 
At that point in the development cycle of AM Tooling, much of the work was performed under the guise of “let’s try it and see what happens”. Industry applications Additive manufacturing, still in its infancy today, requires manufacturing firms to be flexible, ever-improving users of all available technologies to remain competitive. Advocates of additive manufacturing also predict that this arc of technological development will counter globalization, as end users will do much of their own manufacturing rather than engage in trade to buy products from other people and corporations. The real integration of the newer additive technologies into commercial production, however, is more a matter of complementing traditional subtractive methods rather than displacing them entirely. Automotive – approaching niche vehicle markets (making fewer than 100,000 vehicles), rather than high production volume Aircraft – the U.S. aircraft industry operates in an environment where production volumes are relatively low and resulting product costs are relatively high. Agile tooling can be applied in the early design stage of the development cycle to minimize the high cost of redesign. Medical – cast tooling would benefit a great deal from agile tooling. However, the cost for the tooling may still be significantly greater than the cost of a casting piece, with high lead times. Since only several dozen or several hundred metal parts are needed, the challenge for mass production is still prevalent. A balance between these four areas – quantity, design, material, and speed – is key to designing and producing a fully functional product. 
See also Computer-aided design (CAD) Computer-aided engineering (CAE) Computer-aided manufacturing (CAM) References External links Product Design & Engineering Manufacturing Product design Manufacturing plants Industrial processes 3D printing processes Industrial equipment Computer-aided manufacturing Industrial design Prototypes Management cybernetics Digital manufacturing Fused filament fabrication
Agile tooling
[ "Technology", "Engineering" ]
752
[ "Industrial design", "Design engineering", "Product design", "nan", "Industrial computing", "Design", "Digital manufacturing" ]
52,033,713
https://en.wikipedia.org/wiki/HR%204729
HR 4729 (HD 108250) is a multiple star system located about from the Sun in the constellation of Crux and part of the asterism known as the Southern Cross. It is a close companion of α Crucis and sometimes called α Crucis C. Nomenclature HR 4729 is the star's designation in the Bright Star Catalogue. It is also often referred to by its Henry Draper Catalogue listing of HD 108250. Because of its closeness to α Crucis it is included in many multiple star catalogues as α Crucis C. It is also listed as star 25 in Crux in the Uranometria Argentina, displayed as 25 G. Crucis. Discovery HR 4729 was first observed in 1829, as a companion to α Crucis, by James Dunlop from Paramatta in New South Wales. As early as 1916, HR 4729 was reported to have a variable radial velocity indicating a likely binary system, but the orbital elements were not calculated until 1979. System HR 4729 lies 90 arcseconds away from the triple star system of α Crucis and shares its motion through space, suggesting it may be gravitationally bound to it, and it is therefore generally assumed to be physically associated. In the context of being a companion to α Crucis it is usually referred to as α Crucis C. HR 4729 is itself a close spectroscopic binary system with a period of 1 day 5 hours. It also has a faint visual companion 2.1" away. A further seven faint stars are also listed as members of the α Crucis group out to a distance of about two arc-minutes. One particular companion very close to HR 4729 has been resolved using adaptive optics at infrared wavelengths. It has been named α Crucis P, or α Crucis CP because it is only 2" from HR 4729. Rizzuto and colleagues determined in 2011 that the α Crucis system, including HR 4729, was 66% likely to be a member of the Lower Centaurus–Crux sub-group of the Scorpius–Centaurus association. It was not previously seen to be a member of the group because of confusion over the true radial velocity of the spectroscopic pair. 
On 2008 October 2, the Cassini–Huygens spacecraft resolved three of the components (A, B and C) of the multiple star system as Saturn's disk occulted it. Stellar properties HR 4729 is a hot class B main sequence star nearly ten times as massive as the sun. It is only about twelve million years old, but already shows signs of evolving away from the main sequence. Several studies have assigned a subgiant luminosity class to the star. The spectroscopic companion cannot be seen in the spectrum, therefore little is known about its properties. Analysis of the orbit shows that it has a mass greater than the sun. The physically associated companion α Crucis D or α Crucis CP is a 15th magnitude star. Its relative faintness suggests an M0 V spectral type. Two other, even fainter stars, lie within of HR 4729. References External links http://www.daviddarling.info/encyclopedia/A/Acrux.html Double stars B-type main-sequence stars Crux Lower Centaurus Crux Crucis, 25 Spectroscopic binaries 4729 108250 Durchmusterung objects
HR 4729
[ "Astronomy" ]
709
[ "Crux", "Constellations" ]
52,035,185
https://en.wikipedia.org/wiki/Bevan%20point
In geometry, the Bevan point, named after Benjamin Bevan, is a triangle center. It is defined as the center of the Bevan circle, that is, the circle through the centers of the three excircles of a triangle. The Bevan point of a triangle is the reflection of the incenter across the circumcenter of the triangle. Bevan posed the problem of proving this in 1804, in a mathematical problem column in The Mathematical Repository. The problem was solved in 1806 by John Butterworth. The Bevan point of a triangle has the same distance from the triangle's Euler line as its incenter; this distance can be expressed in terms of the circumradius and the side lengths of the triangle. The Bevan point is also the midpoint of the line segment connecting the Nagel point and the de Longchamps point. The radius of the Bevan circle is 2R, that is, twice the radius R of the circumcircle. References Triangle centers
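The reflection construction described above is easy to check numerically. The following sketch is an illustrative example, not drawn from the article's sources; the sample triangle coordinates and the function name bevan_demo are assumptions made here. It computes the incenter and circumcenter of a triangle, reflects the incenter across the circumcenter to obtain the Bevan point, and verifies that the distance from that point to an excircle center equals twice the circumradius.

```python
# Illustrative check of the Bevan point construction (example triangle
# and function name are assumptions, not taken from the article).
import math

def bevan_demo(A, B, C):
    ax, ay = A
    bx, by = B
    cx, cy = C
    a = math.dist(B, C)  # side length opposite vertex A
    b = math.dist(A, C)  # side length opposite vertex B
    c = math.dist(A, B)  # side length opposite vertex C

    # Incenter: vertices weighted by the opposite side lengths.
    s = a + b + c
    I = ((a * ax + b * bx + c * cx) / s, (a * ay + b * by + c * cy) / s)

    # Circumcenter from the standard perpendicular-bisector formula.
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax * ax + ay * ay) * (by - cy) + (bx * bx + by * by) * (cy - ay)
          + (cx * cx + cy * cy) * (ay - by)) / d
    uy = ((ax * ax + ay * ay) * (cx - bx) + (bx * bx + by * by) * (ax - cx)
          + (cx * cx + cy * cy) * (bx - ax)) / d
    O = (ux, uy)
    R = math.dist(O, A)  # circumradius

    # Bevan point: reflection of the incenter across the circumcenter.
    Bev = (2 * O[0] - I[0], 2 * O[1] - I[1])

    # Excenter opposite A (weights -a, b, c); it lies on the Bevan circle.
    JA = ((-a * ax + b * bx + c * cx) / (-a + b + c),
          (-a * ay + b * by + c * cy) / (-a + b + c))
    return Bev, R, math.dist(Bev, JA)

Bev, R, r_bevan = bevan_demo((0, 0), (4, 0), (1, 3))
print(abs(r_bevan - 2 * R) < 1e-9)  # Bevan circle radius equals 2R
```

For the sample triangle the check confirms that the circle centered at the Bevan point with radius 2R passes through the excenter, consistent with the Bevan circle passing through all three excircle centers.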
Bevan point
[ "Physics", "Mathematics" ]
198
[ "Point (geometry)", "Triangle centers", "Points defined for a triangle", "Geometric centers", "Geometry", "Geometry stubs", "Symmetry" ]
52,035,641
https://en.wikipedia.org/wiki/K.%20Christopher%20Garcia
K. Christopher Garcia is an American scientist known for his research on the molecular and structural biology of cell surface receptors. Garcia is a professor in the Departments of Molecular and Cellular Physiology and Structural Biology at the Stanford University School of Medicine, an Investigator of the Howard Hughes Medical Institute and a member of the National Academies of Science and Medicine. In addition to his role at Stanford, Garcia is a co-founder of several biotechnology companies, including Alexo Therapeutics, Surrozen, and 3T Biosciences. Education Garcia earned his B.S. in biochemistry from Tulane University. He attended graduate school at the Johns Hopkins University School of Medicine, where he received his Ph.D. in Biophysics under the mentorship of L. Mario Amzel. After receiving his Ph.D., Garcia conducted postdoctoral research at Genentech in the laboratories of David Goeddel and Tony Kossiakoff, where he immersed himself in the nascent technologies of protein engineering and recombinant protein expression, and then at The Scripps Research Institute in the laboratory of Ian Wilson. Research career Garcia's research integrates approaches in structural biology, biochemistry and protein engineering to understand how cell surface receptors sense environmental cues through the engagement of extracellular ligands, and transduce signals. The overarching theme of the laboratory is to elucidate the structural and mechanistic basis of receptor activation in systems relevant to human disease, and to exploit this information to design and engineer new molecules with therapeutic properties. Thus there is a close integration of basic science discovery with translation. Garcia's laboratory at Stanford has published numerous scientific articles describing the molecular structure and signaling mechanisms of proteins important for immunity, neurobiology and development. 
Antigen recognition Garcia's earliest research as a graduate student at Johns Hopkins University focused on understanding how anti-idiotypic antibodies recognize peptide antigens. As a postdoctoral scholar at The Scripps Research Institute, Garcia conducted a groundbreaking study that revealed how T cells of the immune system survey peptides presented by major histocompatibility complex proteins (MHC), thus allowing them to distinguish between "self" and "non-self". Garcia's research led to the first visualization of a T cell receptor (TCR) bound to a peptide-MHC (pMHC) complex and was published in the journal Science in 1996. Garcia's 1996 article on the TCR-MHC interaction has had broad impact in the fields of immunology and immunotherapy. At Stanford University, the Garcia Laboratory reported the structure of the pre-B cell receptor (pre-BCR) in 2007, which revealed how pre-BCRs oligomerize to signal in the absence of antigen. Garcia's group has also authored several additional landmark articles exploring various aspects of TCR-pMHC interactions, including the first structure of a γδ TCR-pMHC complex, the molecular basis for dual recognition of "self" and "foreign" MHCs by TCRs, insights into the germline basis of TCR/MHC interactions, the extent of cross-reactivity in the TCR repertoire, and elucidation of the structural trigger for TCR signaling. In Garcia's most recent work, his lab developed a peptide-MHC library technology that has enabled the discovery of antigens for orphan T cell receptors, such as those resident in tumors. This technology also enabled a breakthrough in understanding how signaling is initiated by pMHC engagement. Cytokine signaling Garcia's research has established how structural and biophysical principles govern receptor binding and signal activation in many different cytokine systems.
Key findings include determination of the first crystal structures of the following cytokine family members in complex with their surface receptors: gp130 family (IL-6), common gamma (γc) family (IL-2), Type I Interferons (IFNα2/IFNω) and Type III Interferons. The Garcia Laboratory has also determined crystal structures of many other major cytokine-receptor complexes including those of IL-1, IL-4, IL-13, IL-15, IL-17, IL-23, LIF and CNTF. These structures have revealed a wide range of binding topologies and architectures, and demonstrate how convergent evolution has provided many solutions for cytokine receptors to transduce signals across the cell membrane. In addition to molecular studies of cytokines, Garcia's group has also used directed evolution to engineer high affinity cytokine variants (IL-2, IL-4, IFN-λ) with improved therapeutic properties. Wnt signaling In 2012, Garcia's laboratory determined the crystal structure of a Wnt protein in complex with its cellular receptor, Frizzled. The Wnt-Frizzled structure indicated that Wnts utilize a post-translational lipid modification to directly engage the Frizzled extracellular domain, which represents a highly unusual binding mode among soluble ligands. Garcia's study revealed a striking, donut-shaped architecture adopted by the Wnt-Frizzled complex that adorns the cover of the July 6th, 2012 issue of Science. More recently, Garcia's laboratory reported a breakthrough in being able to recapitulate canonical Wnt signaling using water-soluble bispecific ligands that dimerize Frizzled and Lrp6, which has important implications for the development of therapeutics for regenerative medicine. Notch signaling In 2015 and 2017, Garcia published articles in Science describing the first atomic-level visualizations of Notch signaling complexes. 
Garcia's group used directed evolution to strengthen low-affinity interactions between the receptor Notch1 and ligands Delta-like 4 (DLL4) and Jagged1 (Jag1) as a means of stabilizing the complexes for co-crystallization. Notch1-DLL4 and Notch1-Jag1 structures were determined by x-ray crystallography and revealed long, narrow binding interfaces assisted by multiple O-linked fucose and glucose modifications on Notch1. O-linked glycans are rarely observed at protein-protein interfaces, and their presence at the Notch-ligand interface explained how changes in glycosylation state influence Notch signaling activity. Garcia's 2017 publication also established that Notch-ligand interactions form catch bonds, and that Delta-like and Jagged ligands have different mechanical force thresholds for Notch receptor activation. GPCR signaling In 2015, the Garcia Laboratory reported the x-ray crystal structure of the virally encoded G-protein coupled receptor (GPCR), US28, bound to its chemokine ligand, fractalkine (CX3CL1). The US28-Fractalkine structure was one of the first reports to visualize a protein ligand bound to a GPCR, and revealed that the globular "head" of fractalkine docks onto the extracellular loops of US28, while fractalkine's flexible N-terminal "tail" threads into a cavity in the center of US28 as a means of fine-tuning its downstream signaling activity. In more recent studies, the lab has engineered biased chemokine ligands and shown that GPCR activation is governed by ligands that induce shape changes rather than highly specific bonding chemistries. Cancer immunotherapy Garcia has conducted several studies targeting cellular receptors for applications in cancer immunotherapy. In 2013, Garcia's group developed high affinity antagonists of the receptor CD47 that potently enhance the antitumor effects of established therapeutic antibodies. 
Garcia later determined that the therapeutic effects of CD47 blockade require combination therapy with checkpoint blockade antibodies in immunocompetent hosts, thus proving that CD47-based therapy relies upon stimulation of the adaptive immune system. Garcia's lab published the creation of an "orthogonal" IL-2 receptor complex to enable the selective delivery of IL-2 signals to engineered T cells during adoptive cell therapy. They also reported a new technology using yeast-displayed peptide-MHC molecules to identify tumor antigens recognized by Tumor Infiltrating Lymphocytes. Video highlights Garcia has published descriptions of several research findings online in the form of videos. Awards 1999 - March of Dimes Basil O’Connor Award 1999 - Frederick J. Terman Junior Faculty Award 1999 - Rita Allen Foundation Scholar 1999 - American Heart Association New Investigator Award 2000 - Cancer Research Institute New Investigator Award 2001 - Pew Scholar 2002 - Keck Distinguished Medical Scholar 2004 - Established Investigator of the American Heart Association 2012 - Elected to National Academy of Sciences 2013 - NIH MERIT Award 2015 - Member of Mathematical Sciences Jury for the Infosys Prize 2016 - Elected to National Academy of Medicine 2024 - Passano Award Personal life Garcia is a competitive long-distance runner and has run more than 120 ultramarathons, including several 100-mile races. External links Garcia Lab website at Stanford University References American molecular biologists Year of birth missing (living people) Living people Members of the United States National Academy of Sciences Members of the National Academy of Medicine Howard Hughes Medical Investigators Stanford University School of Medicine faculty Tulane University alumni Johns Hopkins School of Medicine alumni Structural biologists
K. Christopher Garcia
[ "Chemistry" ]
1,888
[ "Structural biologists", "Structural biology" ]
52,035,643
https://en.wikipedia.org/wiki/Ultimate%20tic-tac-toe
Ultimate tic-tac-toe (also known as UTT, super tic-tac-toe, meta tic-tac-toe, (tic-tac-toe)² or Ultimate Noughts and Crosses) is a board game composed of nine tic-tac-toe boards arranged in a 3 × 3 grid. Players take turns playing on the smaller tic-tac-toe boards until one of them wins on the larger board. Compared to traditional tic-tac-toe, strategy in this game is conceptually more difficult and has proven more challenging for computers. Rules Just like in regular tic-tac-toe, the two players (X and O) take turns, starting with X. The game starts with X playing wherever they want in any of the 81 empty spots. Thereafter, each player moves in the small board corresponding to the position of the previous move in its small board, as indicated in the figures. If a move is played so that it wins a small board by the rules of normal tic-tac-toe, then the entire small board is marked as won by the player in the larger board. Once a small board is won by a player or it is filled completely, no more moves may be played in that board. If a player is sent to such a board, then that player may play in any other board. Game play ends when either a player wins the larger board or there are no legal moves remaining, in which case the game is a draw. Gameplay Super tic-tac-toe is significantly more complex than most other variations of tic-tac-toe, as there is no clear strategy to playing. This is because of the game's complicated branching. Even though every move must be played in a small board, equivalent to a normal tic-tac-toe board, each move must take into account the larger board in several ways: Anticipating the next move: Each move played in a small board determines where the opponent's next move can be played. This might make moves that are considered bad in normal tic-tac-toe viable, since the opponent is forced to play on a certain board.
This way a player could play the same smaller board multiple times in a row, without their opponent being able to respond. Therefore, players are forced to consider the larger game board instead of simply focusing on the smaller boards. Visualizing the game tree: Visualizing future branches of the game tree is more difficult than in single-board tic-tac-toe. Each move determines the next move, and therefore reading ahead (predicting future moves) follows a much less linear path. Future board positions are no longer interchangeable, each move leading to starkly different possible future positions. This makes the game tree difficult to visualize, possibly leaving many possible paths overlooked. Winning the game: Due to the rules of super tic-tac-toe, the larger board is never directly affected. It is governed only by actions that occur in smaller boards. This means that the goal of each move is ultimately not to win the small board it is played in, but to win the larger board. In fact, it may be strategic to sacrifice a small board to your opponent in order to win a more important small board yourself. This added layer of complexity makes it harder to analyze the relative importance and significance of moves, and consequently harder to play well. Computer implementations While tic-tac-toe is elementary to solve and can be done nearly instantly using depth-first search, ultimate tic-tac-toe cannot be reasonably solved using any brute-force tactics. Therefore, more creative computer implementations are necessary to play this game. The most common artificial intelligence (AI) tactic, minimax, may be used to play ultimate tic-tac-toe, but it struggles with the game. This is because, despite having relatively simple rules, ultimate tic-tac-toe lacks any simple heuristic evaluation function. This function is necessary in minimax, for it determines how good a specific position is.
Although elementary evaluation functions can be made for ultimate tic-tac-toe by taking into account the number of small board victories, these largely overlook positional advantage, which is much harder to quantify. Without an efficient evaluation function, most typical computer implementations are weak, and therefore there are few computer opponents that can consistently outplay humans. However, artificial intelligence algorithms that do not need evaluation functions, such as Monte Carlo tree search, handle the game well. Monte Carlo tree search relies on random simulations of games, rather than a positional evaluation, to assess how good a position is. Computer implementations using such algorithms therefore tend to outperform minimax solutions and can consistently beat human opponents. Online ultimate tic-tac-toe Online UTT is played over the internet, allowing players from around the world to compete against each other in real time. Not many online UTT platforms exist due to the game's limited popularity and player counts. Few players have formulated theory for UTT, such as openings and winning strategies; however, communities exist for this purpose. Variants A variant of the game allows players to continue playing in already won boxes if there are still empty spaces. This allows the game to last longer and involves further strategic moves. It was shown in 2020 that this set of rules for the game admits a winning strategy for the first player to move, meaning that the first player to move can always win assuming perfect play. If playing with this rule set is still preferred, the forced-win problem can be practically solved by generating the first 4 moves at random.
This is most effectively done by randomly generating a 5-digit number, then using the first digit to select a larger board and the next four digits to place "X"s and "O"s in the appropriate small board. It is also possible to create an expanded version of ultimate tic-tac-toe by effectively creating more layers of nested tic-tac-toe games within the larger board. For example, a game that has a further layer would have 81 base-level tic-tac-toe boards. Tic-Tac-Ku, a game invented by Mark Asperheim and Cris Van Oosterum, has similar rules to ultimate tic-tac-toe; however, a player wins the game by winning at least five small boards rather than three in a line. See also Recursion Tic-tac-toe variants References External links Monte Carlo tree-search implementation AlphaZero-like AI solution for playing Ultimate Tic-Tac-Toe in the browser Ultimate Tic-Tac-Toe game where artificial intelligences confront each other Python implementation Tic-tac-toe Abstract strategy games Mathematical games Paper-and-pencil games Tic-tac-toe variants
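The board-routing rule described in the Rules section can be sketched in Python. This is an illustrative sketch, not from the article; the function and variable names are hypothetical. Boards are numbered 0–8, and a move is a pair (big, small) addressing a cell on one of the nine small boards.

```python
# Sketch of ultimate tic-tac-toe's board-routing rule (illustrative names).
# closed_boards holds the indices of small boards already won or full.

def legal_boards(last_move, closed_boards):
    """Return the set of big-board indices the next player may play in.

    last_move     -- (big, small) of the previous move, or None at game start
    closed_boards -- set of small-board indices that are won or full
    """
    open_boards = set(range(9)) - closed_boards
    if last_move is None:          # first move: anywhere on the 81 cells
        return open_boards
    _, small = last_move
    if small in closed_boards:     # sent to a closed board: play anywhere open
        return open_boards
    return {small}                 # otherwise: forced into that small board
```

For example, after a move in cell 7 of any small board, the opponent must play in small board 7, unless board 7 is already won or full.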
Ultimate tic-tac-toe
[ "Mathematics" ]
1,420
[ "Recreational mathematics", "Mathematical games" ]
52,035,827
https://en.wikipedia.org/wiki/Seth%20Baum
Seth Baum is an American researcher involved in the field of risk research. He is the executive director of the Global Catastrophic Risk Institute (GCRI), a think tank focused on existential risk. He is also affiliated with the Blue Marble Space Institute of Science and the Columbia University Center for Research on Environmental Decisions. Academic career Baum obtained his BS in optics and mathematics in 2003 at the University of Rochester, followed by an MS in Electrical Engineering from Northeastern University in 2006. In 2012, he obtained his PhD in Geography from Pennsylvania State University with a dissertation on climate change policy, "Discounting Across Space and Time in Climate Change Assessment". Later, he completed a post-doctoral fellowship with the Columbia University Center for Research on Environmental Decisions. Baum then steered his research interests into astrophysics and global risks, including global warming and nuclear war, and the development of effective solutions for reducing them. Furthermore, he is a Fellow of the Society for Risk Analysis. Work As a graduate student in Boston, Baum contributed to Whats Up magazine (now Spare Change News) from 2004 to 2007. In 2011, Baum co-founded GCRI along with Tony Barrett, with the mission to "develop the best ways to confront humanity's gravest threats". The institute has since grown rapidly, publishing in peer-reviewed academic journals and media outlets. As of 2016, its main work is on the "Integrated Assessment Project", which assesses all the global catastrophic risks in order to make them available for societal learning and decision making processes. GCRI is funded by "a mix of grants, private donations, and occasional consulting work".
Two years later, Baum hosted a regular blog on Scientific American and was interviewed about his work and research on the History Channel and The O'Reilly Factor, where he was asked about studying possible human contact with extraterrestrial life and the ethics involved. He also started contributing regularly to The Huffington Post, writing about the Russo-Ukrainian War and the Syrian Civil War as possible scenarios for nuclear war. In 2016, after receiving a $100,000 grant from the Future of Life Institute, his research interests shifted to AI safety and the ethics of outer space. That same year, he wrote a monthly column for the Bulletin of the Atomic Scientists, where he discussed AI threats, biological weapons and the risks of nuclear deterrence failure. See also Global catastrophic risk References 1980 births Living people Academics from New York City American ethicists University of Rochester alumni Northeastern University alumni American futurologists Safety researchers
Seth Baum
[ "Engineering" ]
519
[ "Safety engineering", "Safety researchers" ]
52,036,418
https://en.wikipedia.org/wiki/Paecilomyces%20hepiali
Paecilomyces hepiali is an entomophagous fungus. Based on 18S rDNA sequencing, this species is distinct from Ophiocordyceps sinensis. Samsoniella hepiali is defined by NCBI as a homotypic synonym of P. hepiali. Further work on the classification of this species was described in 2020. References Trichocomaceae Fungus species
Paecilomyces hepiali
[ "Biology" ]
85
[ "Fungi", "Fungus species" ]
52,036,756
https://en.wikipedia.org/wiki/Cache%20hierarchy
Cache hierarchy, or multi-level cache, is a memory architecture that uses a hierarchy of memory stores based on varying access speeds to cache data. Highly requested data is cached in high-speed access memory stores, allowing swifter access by central processing unit (CPU) cores. Cache hierarchy is a form and part of memory hierarchy and can be considered a form of tiered storage. This design was intended to allow CPU cores to process faster despite the memory latency of main memory access. Accessing main memory can act as a bottleneck for CPU core performance as the CPU waits for data, while making all of main memory high-speed may be prohibitively expensive. High-speed caches are a compromise allowing high-speed access to the data most-used by the CPU, permitting a faster CPU clock. Background In the history of computer and electronic chip development, there was a period when increases in CPU speed outpaced the improvements in memory access speed. The gap between the speed of CPUs and memory meant that the CPU would often be idle. CPUs were increasingly capable of running and executing larger amounts of instructions in a given time, but the time needed to access data from main memory prevented programs from fully benefiting from this capability. This issue motivated the creation of memory models with higher access rates in order to realize the potential of faster processors. This resulted in the concept of cache memory, first proposed by Maurice Wilkes, a British computer scientist at the University of Cambridge in 1965. He called such memory models "slave memory". Between roughly 1970 and 1990, papers and articles by Anant Agarwal, Alan Jay Smith, Mark D. Hill, Thomas R. Puzak, and others discussed better cache memory designs. The first cache memory models were implemented at the time, but even as researchers were investigating and proposing better designs, the need for faster memory models continued. 
This need resulted from the fact that although early cache models improved data access latency, with respect to cost and technical limitations it was not feasible for a computer system's cache to approach the size of main memory. From 1990 onward, ideas such as adding another cache level (second-level) as a backup for the first-level cache were proposed. Jean-Loup Baer, Wen-Hann Wang, Andrew W. Wilson, and others have conducted research on this model. When several simulations and implementations demonstrated the advantages of two-level cache models, the concept of multi-level caches caught on as a new and generally better model of cache memories. Since 2000, multi-level cache models have received widespread attention and are currently implemented in many systems, such as the three-level caches that are present in Intel's Core i7 products. Multi-level cache Accessing main memory for each instruction execution may result in slow processing, with the clock speed depending on the time required to find and fetch the data. In order to hide this memory latency from the processor, data caching is used. Whenever the data is required by the processor, it is fetched from the main memory and stored in the smaller memory structure called a cache. If there is any further need of that data, the cache is searched first before going to the main memory. This structure resides closer to the processor in terms of the time taken to search and fetch data with respect to the main memory. The advantages of using cache can be proven by calculating the average access time (AAT) for the memory hierarchy with and without the cache. Average access time (AAT) Caches, being small in size, may result in frequent misses – when a search of the cache does not provide the sought-after information – resulting in a call to main memory to fetch data. Hence, the AAT is affected by the miss rate of each structure from which it searches for the data. The AAT for main memory is simply the hit time of main memory.
The AAT for a cache is given by: hit time of the cache + (miss rate of the cache × miss penalty), where the miss penalty is the time taken to go to main memory after missing the cache. The hit time for caches is less than the hit time for the main memory, so the AAT for data retrieval is significantly lower when accessing data through the cache rather than main memory. Trade-offs While using the cache may improve memory latency, it may not always result in the required improvement for the time taken to fetch data due to the way caches are organized and traversed. For example, direct-mapped caches that are the same size usually have a higher miss rate than fully associative caches. This may also depend on the benchmark of the computer testing the processor and on the pattern of instructions. But using a fully associative cache may result in more power consumption, as it has to search the whole cache every time. Due to this, the trade-off between power consumption (and associated heat) and the size of the cache becomes critical in the cache design. Evolution In the case of a cache miss, the purpose of using such a structure will be rendered useless and the computer will have to go to the main memory to fetch the required data. However, with a multiple-level cache, if the computer misses the cache closest to the processor (level-one cache or L1) it will then search through the next-closest level(s) of cache and go to main memory only if these methods fail. The general trend is to keep the L1 cache small and at a distance of 1–2 CPU clock cycles from the processor, with the lower levels of caches increasing in size to store more data than L1, hence being more distant but with a lower miss rate. This results in a better AAT. The number of cache levels can be designed by architects according to their requirements after checking for trade-offs between cost, AATs, and size.
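Because each level's miss penalty is itself the AAT of the next level down, the formula can be computed recursively. The following Python sketch is illustrative (names are not from the article); the example values reproduce the worked example in the Performance gains section.

```python
# Recursive average access time (AAT) for a cache hierarchy:
#   AAT = hit_time + miss_rate * (AAT of the next level down),
# with main memory's AAT equal to its hit time.

def aat(levels, main_memory_time):
    """levels: list of (hit_time_ns, miss_rate) pairs, from L1 downward."""
    if not levels:
        return main_memory_time
    hit_time, miss_rate = levels[0]
    return hit_time + miss_rate * aat(levels[1:], main_memory_time)

print(aat([], 50))                                            # no cache: 50 ns
print(aat([(1, 0.1)], 50))                                    # L1 only: 6.0 ns
print(round(aat([(1, 0.1), (5, 0.01)], 50), 4))               # L1-2: 1.55 ns
print(round(aat([(1, 0.1), (5, 0.01), (10, 0.002)], 50), 4))  # L1-3: 1.5101 ns
```

Each added level multiplies the cost of going further down the hierarchy by that level's miss rate, which is why the AAT improvement per level diminishes.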
Performance gains With the technology scaling that allowed entire memory systems to be accommodated on a single chip, most modern-day processors have up to three or four cache levels. The reduction in the AAT can be understood by this example, where the computer checks AAT for different configurations up to L3 caches. Example: main memory = 50 ns, L1 = 1 ns with 10% miss rate, L2 = 5 ns with 1% miss rate, L3 = 10 ns with 0.2% miss rate. No cache, AAT = 50 ns L1 cache, AAT = 1 ns + (0.1 × 50 ns) = 6 ns L1–2 caches, AAT = 1 ns + (0.1 × [5 ns + (0.01 × 50 ns)]) = 1.55 ns L1–3 caches, AAT = 1 ns + (0.1 × [5 ns + (0.01 × [10 ns + (0.002 × 50 ns)])]) = 1.5101 ns Disadvantages Cache memory comes at a greater marginal cost than main memory and thus can increase the cost of the overall system. Cached data is stored only so long as power is provided to the cache. Increased on-chip area required for memory system. Benefits may be minimized or eliminated in the case of large programs with poor temporal locality, which frequently access the main memory. Properties Banked versus unified In a banked cache, the cache is divided into a cache dedicated to instruction storage and a cache dedicated to data. In contrast, a unified cache contains both the instructions and data in the same cache. During a process, the L1 cache (or most upper-level cache in relation to its connection to the processor) is accessed by the processor to retrieve both instructions and data. Performing both actions at the same time requires multiple ports and more access time in a unified cache. Having multiple ports requires additional hardware and wiring, leading to significant additional structure between the caches and processing units. To avoid this, the L1 cache is often organized as a banked cache, which results in fewer ports, less hardware, and generally lower access times.
Modern processors have split L1 caches, and in systems with multilevel caches, the lower-level caches may be unified while the upper levels are split. Inclusion policies Whether a block present in the upper cache layer can also be present in the lower cache level is governed by the memory system's inclusion policy, which may be inclusive, exclusive or non-inclusive non-exclusive (NINE). With an inclusive policy, all the blocks present in the upper-level cache have to be present in the lower-level cache as well. Each upper-level cache component is a subset of the lower-level cache component. In this case, since there is a duplication of blocks, there is some wastage of memory. However, checking whether a block is present anywhere in the hierarchy is faster, since the lower-level cache summarizes the contents of the levels above it. Under an exclusive policy, all the cache hierarchy components are completely exclusive, so that any element in the upper-level cache will not be present in any of the lower cache components. This enables complete usage of the cache memory. However, there is a high memory-access latency. The above policies require a set of rules to be followed in order to implement them. If none of these are forced, the resulting inclusion policy is called non-inclusive non-exclusive (NINE). This means that the upper-level cache may or may not be present in the lower-level cache. Write policies There are two policies which define the way in which a modified cache block will be updated in the main memory: write through and write back. In the case of the write through policy, whenever the value of the cache block changes, it is further modified in the lower-level memory hierarchy as well. This policy ensures that the data is stored safely as it is written throughout the hierarchy. However, in the case of the write back policy, the changed cache block will be updated in the lower-level hierarchy only when the cache block is evicted. A "dirty bit" is attached to each cache block and set whenever the cache block is modified.
During eviction, blocks with a set dirty bit will be written to the lower-level hierarchy. Under this policy, there is a risk of data loss, as the most recently changed copy of a datum is stored only in the cache, and therefore some corrective techniques must be observed. In the case of a write to a block that is not present in the cache, the block may be brought into the cache, as determined by a write allocate or write no-allocate policy. The write allocate policy states that in the case of a write miss, the block is fetched from the main memory and placed in the cache before writing. Under the write no-allocate policy, if the block misses in the cache, the write is performed in the lower-level memory hierarchy without fetching the block into the cache. The common combinations of the policies are "write back, write allocate" and "write through, write no-allocate". Shared versus private A private cache is assigned to one particular core in a processor, and cannot be accessed by any other cores. In some architectures, each core has its own private cache; this creates the risk of duplicate blocks in a system's cache architecture, which results in reduced capacity utilization. However, this type of design choice in a multi-layer cache architecture can also be good for a lower data-access latency. A shared cache is a cache which can be accessed by multiple cores. Since it is shared, each block in the cache is unique and therefore has a higher hit rate, as there will be no duplicate blocks. However, data-access latency can increase as multiple cores try to access the same cache. In multi-core processors, the design choice to make a cache shared or private impacts the performance of the processor. In practice, the upper-level cache L1 (or sometimes L2) is implemented as private and lower-level caches are implemented as shared. This design provides high access rates for the high-level caches and low miss rates for the lower-level caches.
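The contrast between the two write policies described above can be shown with a toy model. This is an illustrative Python sketch (class and attribute names are hypothetical); a real cache would also track tags, sets, and ways.

```python
# Toy model of write-through vs. write-back with a dirty bit.
# memory is modeled as a dict mapping address -> value.

class ToyCache:
    def __init__(self, policy, memory):
        self.policy = policy   # "write-through" or "write-back"
        self.memory = memory
        self.lines = {}        # address -> (value, dirty_bit)

    def write(self, addr, value):
        if self.policy == "write-through":
            self.lines[addr] = (value, False)
            self.memory[addr] = value        # propagated immediately
        else:                                # write-back: defer, set dirty bit
            self.lines[addr] = (value, True)

    def evict(self, addr):
        value, dirty = self.lines.pop(addr)
        if dirty:                            # dirty block written on eviction
            self.memory[addr] = value

mem = {0x10: 0}
wb = ToyCache("write-back", mem)
wb.write(0x10, 42)   # memory still holds the stale value 0
wb.evict(0x10)       # only now does memory hold 42
```

The sketch makes the data-loss risk of write-back concrete: between the write and the eviction, the only up-to-date copy lives in the cache.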
Recent implementation models Intel Xeon Emerald Rapids (2024) Up to 64 cores: L1 cache - 80 kB per core L2 cache - 2 MB per core L3 cache - 5 MB per core (i.e., up to 320 MB total) Intel i5 Raptor Lake-HX (2024) 6 cores (performance | efficiency): L1 cache - 128 kB per core L2 cache - 2 MB per core | 4-8 MB semi-shared L3 cache - 20-24 MB shared AMD EPYC 9684X (2023) 96 cores: L1 cache - 64 kB per core L2 cache - 1 MB per core L3 cache - 1152 MB shared Apple M1 Ultra (2022) 20 cores (4:1 "performance" core | "efficiency" core): L1 cache - 320|192 kB per core L2 cache - 52 MB semi-shared L3 cache - 96 MB shared AMD Ryzen 7000 (2022) 6 to 16 cores: L1 cache - 64 kB per core L2 cache - 1 MB per core L3 cache - 32 to 128 MB shared AMD Zen 2 microarchitecture (2019) L1 cache – 32 kB data & 32 kB instruction per core, 8-way L2 cache – 512 kB per core, 8-way inclusive L3 cache – 16 MB local per 4-core CCX, 2 CCXs per chiplet, 16-way non-inclusive. Up to 64 MB on desktop CPUs and 256 MB on server CPUs AMD Zen microarchitecture (2017) L1 cache – 32 kB data & 64 kB instruction per core, 4-way L2 cache – 512 kB per core, 4-way inclusive L3 cache – 4 MB local & remote per 4-core CCX, 2 CCXs per chiplet, 16-way non-inclusive.
Up to 16 MB on desktop CPUs and 64 MB on server CPUs Intel Kaby Lake microarchitecture (2016) L1 cache (instruction and data) – 64 kB per core L2 cache – 256 kB per core L3 cache – 2 MB to 8 MB shared Intel Broadwell microarchitecture (2014) L1 cache (instruction and data) – 64 kB per core L2 cache – 256 kB per core L3 cache – 2 MB to 6 MB shared L4 cache – 128 MB of eDRAM (Iris Pro models only) IBM POWER7 (2010) L1 cache (instruction and data) – each 64-banked, each bank has 2 read and 1 write ports, 32 kB, 8-way associative, 128B block, write through L2 cache – 256 kB, 8-way, 128B block, write back, inclusive of L1, 2 ns access latency L3 cache – 8 regions of 4 MB (total 32 MB), local region 6 ns, remote 30 ns, each region 8-way associative, DRAM data array, SRAM tag array See also POWER7 Intel Broadwell Microarchitecture Intel Kaby Lake Microarchitecture CPU cache Memory hierarchy CAS latency Cache (computing) References Cache (computing) Computer architecture Computer hardware Computer memory
Cache hierarchy
[ "Technology", "Engineering" ]
3,115
[ "Computer engineering", "Computer architecture", "Computer hardware", "Computer systems", "Computer science", "Computers" ]
52,037,261
https://en.wikipedia.org/wiki/IEEE%20Journal%20of%20Solid-State%20Circuits
The IEEE Journal of Solid-State Circuits is a monthly peer-reviewed scientific journal on new developments and research in solid-state circuits, published by the Institute of Electrical and Electronics Engineers (IEEE) in New York City. The journal serves as a companion venue for expanding on work presented at the International Solid-State Circuits Conference, the Symposia on VLSI Technology and Circuits, and the Custom Integrated Circuits Conference. The journal has an impact factor of 6.12 and is edited by Dennis Sylvester (University of Michigan). References External links Journal of Solid-State Circuits, IEEE Electronics journals Semiconductor journals Monthly journals English-language journals Academic journals established in 1966 Electrical and electronic engineering journals
IEEE Journal of Solid-State Circuits
[ "Engineering" ]
140
[ "Electrical engineering", "Electronic engineering", "Electrical and electronic engineering journals" ]
52,038,701
https://en.wikipedia.org/wiki/International%20Conference%20on%20Business%20Process%20Management
The International Conference on Business Process Management is an academic conference organized annually by the BPM community. The conference was first organized in 2003 in Eindhoven, Netherlands. Since then the conference has been organized annually. The conference is the premier forum for researchers, practitioners and developers in the field of Business Process Management (BPM). The conference typically attracts over 300 participants from all over the world. The BPM Steering Committee is responsible for the conference, including selection of organizers, invited speakers, workshops, etc. Topics Topics covered by the conference include: Business process modeling BPM/WFM systems Process mining Business process intelligence Workflow automation Process change management Reference process models Process modeling languages Case management Process variability and configuration Operations research for business processes Collaborative business process management Qualitative and quantitative process analysis (e.g. process simulation) Management aspects of BPM Decision management Process discovery Process compliance Process innovation Process execution architectures History The first conference was organized by Wil van der Aalst and was held in conjunction with the 24th International Conference on Applications and Theory of Petri Nets in Eindhoven. Since BPM 2005 in Nancy, the conference has co-located workshops in different subfields of BPM.
BPM 2023 in Utrecht, Netherlands General Chair: Hajo Reijers PC Co-Chairs: Shazia Sadiq, Chiara Di Francescomarino, Andrea Burattin, Christian Janiesch Workshop Chairs: Luise Pufahl, Jochen De Weerdt BPM 2022 in Münster, Germany PC Co-Chairs: Adela del Río Ortega, Claudio Di Ciccio, Remco Dijkman, Stefanie Rinderle-Ma General Chair: Jörg Becker Workshop Chairs: Cristina Cabanillas, Agnes Koschmider, Niels Frederik Garmann-Johnsen BPM 2021 in Rome, Italy PC Co-Chairs: Artem Polyvyanyy, Moe Thandar Wynn, Amy Van Looy, Manfred Reichert General Chair: Massimo Mecella Workshop Chairs: Andrea Marrella, Barbara Weber BPM 2020 in Sevilla, Spain PC Co-Chairs: Dirk Fahland, Chiara Ghidini, Jörg Becker, Marlon Dumas General Chair: Manuel Resinas, Antonio Ruiz-Cortés Workshop Chairs: Adela del-Río-Ortega, Henrik Leopold, Flavia M. Santoro BPM 2019 in Vienna, Austria PC Co-Chairs: Thomas Hildebrandt, Boudewijn van Dongen, Maximilian Röglinger, Jan Mendling General Chair: Jan Mendling, Stefanie Rinderle-Ma Workshop Chairs: Remco Dijkman, Chiara di Francescomarino, Uwe Zdun BPM 2018 in Sydney, Australia PC Co-Chairs: Mathias Weske, Marco Montali, Ingo Weber, Jan vom Brocke General Chair: Boualem Benatallah and Jian Yang Workshop Chairs: Florian Daniel, Hamid Motahari, Michael Sheng BPM 2017 in Barcelona, Spain PC Co-Chairs: Josep Carmona, Gregor Engels, Akhil Kumar General Chair: Josep Carmona Workshop Chairs: Matthias Weidlich, Ernest Teniente BPM 2016 in Rio de Janeiro, Brazil PC Co-Chairs: Marcello La Rosa, Peter Loos, Oscar Pastor General Chair: Flávia Santoro Workshop Chairs: Marlon Dumas, Marcelo Fantinato BPM 2015 in Innsbruck, Austria PC Co-Chairs: Hamid Reza Motahari-Nezhad, Jan Recker, Matthias Weidlich General Chair: Barbara Weber BPM 2014 in Eindhoven, Netherlands (relocated from Haifa, Israel) PC Co-Chairs: Shazia Sadiq, Pnina Soffer, Hagen Völzer General Co-Chairs: Avigdor Gal, Mor Peleg Local Chair: Wil van der Aalst BPM 2013 in Beijing, China PC Co-Chairs: Florian 
Daniel, Jianmin Wang, Barbara Weber General Chair: Jianmin Wang BPM 2012 in Tallinn, Estonia PC Co-Chairs: Alistair Barros, Avi Gal, Ekkart Kindler General Chair: Marlon Dumas BPM 2011 in Clermont-Ferrand, France PC Co-Chairs: Stefanie Rinderle-Ma, Farouk Toumani, Karsten Wolf General Chair: Farouk Toumani, Mohand-Said Hacid BPM 2010 in Hoboken (NJ), United States PC Co-Chairs: Richard Hull, Jan Mendling, Stefan Tai General Chair: Michael zur Mühlen BPM 2009 in Ulm, Germany PC Co-Chairs: Umeshwar Dayal, Johann Eder, Hajo A. Reijers General Chairs: Peter Dadam, Manfred Reichert BPM 2008 in Milan, Italy PC Co-Chairs: Marlon Dumas, Manfred Reichert, Ming-Chien Shan General Chair: Barbara Pernici BPM 2007 in Brisbane, Australia PC Co-Chairs: Gustavo Alonso, Peter Dadam, Michael Rosemann General Chairs: Marlon Dumas, Michael Rosemann BPM 2006 in Vienna, Austria PC Co-Chairs: Schahram Dustdar, José Luiz Fiadeiro, Amit P. Sheth General Chair: Schahram Dustdar BPM 2005 in Nancy, France PC Co-Chairs: Wil M. P. van der Aalst, Boualem Benatallah, Fabio Casati General Chair: Claude Godart BPM 2004 in Potsdam, Germany PC Co-Chairs: Jörg Desel, Barbara Pernici, Mathias Weske General Chair: Mathias Weske BPM 2003 in Eindhoven, Netherlands PC Co-Chairs: Wil van der Aalst, Arthur H. M. ter Hofstede, Mathias Weske General Chair: Wil van der Aalst See also The list of computer science conferences contains other academic conferences in computer science. The topics of the conference cover the field of computer science. Process mining is a process management technique that allows for the analysis of business processes based on event logs. Business Process Management (BPM) includes methods, techniques, and tools to support the design, enactment, management, and analysis of operational business processes. It can be considered as an extension of classical Workflow Management (WFM) systems and approaches. References External links Website of the BPM Conference Series. 
BPM Steering Committee Website of the 15th International Conference on Business Process Management, Barcelona, September 2017. DBLP entry listing all BPM conference proceedings (including workshops) BPM Newsletters (two issues per year) Computer science conferences Recurring events established in 2003
International Conference on Business Process Management
[ "Technology" ]
1,338
[ "Computer science", "Computer science conferences" ]
52,039,583
https://en.wikipedia.org/wiki/NGC%20302
NGC 302 is a magnitude 16.6 star located in the constellation Cetus. It was recorded in 1886 by Frank Muller. References 0302 ? Cetus Discoveries by Frank Muller (astronomer) J005625.30-103947.6
NGC 302
[ "Astronomy" ]
53
[ "Cetus", "Constellations" ]
52,039,606
https://en.wikipedia.org/wiki/NGC%20303
NGC 303 is a lenticular galaxy in the constellation Cetus. It was discovered in 1886 by Francis Leavenworth. References 0303 ? Cetus Lenticular galaxies 003240
NGC 303
[ "Astronomy" ]
39
[ "Cetus", "Constellations" ]
52,039,643
https://en.wikipedia.org/wiki/NGC%20304
NGC 304 is a lenticular galaxy in the constellation Andromeda. It was discovered on October 23, 1878, by Édouard Stephan. One supernova, SN 2021dnn (type Ia, mag. 15.3), was discovered in NGC 304 on 22 February 2021. References External links 0304 00573 3326 18781003 Andromeda (constellation) Discoveries by Édouard Stephan Lenticular galaxies
NGC 304
[ "Astronomy" ]
87
[ "Andromeda (constellation)", "Constellations" ]
52,040,048
https://en.wikipedia.org/wiki/Agile%20architecture
Agile architecture refers to how enterprise architects, system architects and software architects apply architectural practice in agile software development. A number of commentators have identified a tension between traditional software architecture and agile methods along the axis of adaptation (leaving architectural decisions until the last possible moment) versus anticipation (planning in advance) (Kruchten, 2010). Waterman, Nobel, and Allan (2015) explored the tensions between spending too little time designing an up-front architecture, increasing risk, and spending too much time, negatively impacting the delivery of value to the customer. They identify six forces that can affect agile architecture: requirements instability, technical risk, early value, team culture, customer agility and experience. These forces may be addressed by six strategies: respond to change, address risk, emergent architecture, big design up front and use frameworks and template architectures. Definition Several attempts have been made to specify what makes up an agile approach to architecture. According to the SAFe framework, the principles of agile architecture are: Design emerges. Architecture is a collaboration.
(intentional architecture) The bigger the system, the longer the runway (architectural runway) Build the simplest architecture that can possibly work (established design principles) When in doubt, code or model it out (spikes, prototype, domain and use case models) They build it, they test it (design for testability) There is no monopoly on innovation (teams, hackathons) - Facebook's Like button was conceived as part of a hackathon Implement architectural flow (architectural epics and the portfolio kanban) - the portfolio Kanban goes through funnel, review, analysis, portfolio backlog and implementing Principles At the enterprise architecture level, Scott Ambler (2016) proposes the following principles: Communication over perfection Active stakeholder participation Enablement over inspection (exemplars) Evolutionary collaboration over blueprinting Enterprise architects are active participants on development teams High level models (the more complex, the more abstract) Capture details with working code Lean guidance and rules, not bureaucratic procedures Have a dedicated team of experienced enterprise architects Dimensions Svyatoslav Kotusev identifies the following dimensions of "agile" Enterprise Architecture: Agility of strategic planning, including such aspects as (a) the overall amount of time and effort devoted to strategic planning, (b) the organizational scope covered by strategic planning, (c) the time horizon of strategic planning and (d) how exactly the desired future is defined Agility of initiative delivery, including such aspects as (a) the logical flow of initiative delivery and (b) the volume of EA artifacts developed for initiatives, i.e. 
solution overviews and solution designs Agility of finance allocation, including such aspects as (a) the composition of corporate IT investment portfolios and (b) the structure of budgeting processes Agility of architecture governance, including such aspects as (a) the formality of decision-making processes and (b) the adherence to the approved plans Agility of architecture function, including such aspects as (a) the ratio of architects in the total IT workforce and (b) the degree of participation of architects in IT projects Agility of other elements, including such aspects as (a) the level of technical standardization and (b) the sophistication of utilized software tools Practices The open source Design Practice Repository (DPR) collects agile architecting practices such as: SMART NFR Elicitation Architectural decision Capturing Stepwise Service Design References External links Agile Training Agile software development Architectural theory
Agile architecture
[ "Engineering" ]
704
[ "Architectural theory", "Architecture" ]
54,779,970
https://en.wikipedia.org/wiki/New%20Breeding%20Techniques
New Breeding Techniques (NBT), also named New Plant Engineering Techniques, are a suite of methods that could increase and accelerate the development of new traits in plant breeding. These new techniques often involve 'genome editing', whose intention is to modify DNA at specific locations within the plants' genes so that new traits and properties are produced in crop plants. An ongoing discussion in many countries is whether NBTs should be included within the same pre-existing governmental regulations to control genetic modification. Methods involved New breeding techniques (NBTs) make specific changes within plant DNA in order to change its traits, and these modifications can vary in scale from altering a single base, to inserting or removing one or more genes. The various methods of achieving these changes in traits include the following: Cutting and modifying the genome during the repair process (three tools are used to achieve this: zinc finger nucleases, TALENs, and CRISPR/Cas tools) Genome editing to introduce changes to just a few base pairs (using a technique called 'oligonucleotide-directed mutagenesis' (ODM)) Transferring a gene from an identical or closely related species (cisgenesis) Adding in a reshuffled set of regulatory instructions from the same species (intragenesis) Deploying processes that alter gene activity without altering the DNA itself (epigenetic methods) Grafting of an unaltered plant onto a genetically modified rootstock Potential benefits and disbenefits Many European environmental organisations came together in 2016 to jointly express serious concerns over new breeding techniques.
Regulation OECD The Organization for Economic Cooperation and Development (OECD) has its own 'Working Group on Harmonization of Regulatory Oversight in Biotechnology' but, as at 2015, there had been virtually no progress in addressing issues around NBTs, and this includes many major food-producing countries like Russia, South Africa, Brazil, Peru, Mexico, China, Japan and India. Despite its huge potential importance for trade and agriculture, as well as potential risks, the majority of food producing countries in the world at that date still had no policies or protocols for regulating or analysing food products derived specifically from new breeding techniques. South America Argentina introduced regulations and protocols affecting NBTs. These were in place by 2015 and gave clarity to plant developers at an early stage so they could anticipate whether or not their products were likely to be regarded as GMOs. The protocols conform to the internationally recognised 2003 Cartagena Protocol on Biosafety. North America United States The United States Department of Agriculture is responsible for determining whether food products derived from NBTs should be regulated, and this is undertaken on a case-by-case manner under the US Plant Protection Act. As of 2015 there was no specific policy towards NBTs, although in the summer of that year the White House announced plans to update the U.S. Regulatory Framework for Biotechnology. Canada Canada's food regulatory system differs from those of most other countries, and its procedures already accommodate products from any breeding technique, including NBTs. This is because its 1993 'Biotechnology Regulatory Framework' is based upon a concept of regulatory triggering based upon "Plants with Novel Traits". In other words, if a new trait does not exist within normal cultivated plant populations in Canada, then no matter how it was developed, it will trigger the normal regulatory processes and testing. 
See also Genetic engineering in North America Synthetic biology References Further reading Plant genetics Plant breeding
New Breeding Techniques
[ "Chemistry", "Biology" ]
702
[ "Plant breeding", "Plant genetics", "Plants", "Molecular biology" ]
54,780,428
https://en.wikipedia.org/wiki/Stomatophyta
The Stomatophyta are a proposed sister branch of the Marchantiophyta (liverworts), together forming the Embryophyta. The Stomatophyta consist of the Bryophyta (mosses) and the remainder of the Embryophyta, including the Anthocerotophyta (hornworts). The word stomatophyta means plant with stoma. An updated phylogeny of Embryophyta based on the work by Novíkov & Barabaš-Krasni 2015 with plant taxon authors from Anderson, Anderson & Cleal 2007 and some clade names from Pelletier 2012 and Lecointre, et al. References Plants
Stomatophyta
[ "Biology" ]
143
[ "Plant stubs", "Plants" ]
54,784,150
https://en.wikipedia.org/wiki/David%20A.%20Klarner
David Anthony Klarner (October 10, 1940March 20, 1999) was an American mathematician, author, and educator. He is known for his work in combinatorial enumeration, polyominoes, and box-packing. Klarner was a friend and correspondent of mathematics popularizer Martin Gardner and frequently made contributions to Gardner's Mathematical Games column in Scientific American. He edited a book honoring Gardner on the occasion of his 65th birthday. Gardner in turn dedicated his twelfth collection of mathematical games columns to Klarner. Beginning in 1969 Klarner made significant contributions to the theory of combinatorial enumeration, especially focusing on polyominoes and box-packing. Working with Ronald L. Rivest he found upper bounds on the number of n-ominoes. Klarner's Theorem is the statement that an m by n rectangle can be packed with 1-by-x rectangles if and only if x divides one of m and n. He has also published important results in group theory and number theory, in particular working on the Collatz conjecture (sometimes called the 3x + 1 problem). The Klarner-Rado Sequence is named after Klarner and Richard Rado. Biography Klarner was born in Fort Bragg, California, and spent his childhood in Napa, California. He married Kara Lynn Klarner in 1961. Their son Carl Eoin Klarner was born on April 21, 1969. Klarner did his undergraduate work at Humboldt State University (1960–63), got his Ph.D. at the University of Alberta (1963–66), and did post-doctoral work at McMaster University in Hamilton, Ontario (1966–68). He also did post-doctoral work at Eindhoven University of Technology in the Netherlands (1968-1970), at the University of Reading in England working with Richard Rado (1970–71), and at Stanford University (1971–73). He served as an assistant professor at Binghamton University (1973–79) and was a visiting professor at Humboldt State University in California (1979–80). He returned to Eindhoven as a professor (1980–81), and to Binghamton (1981–82). 
From 1982 to 1996 he was a professor of computer science at the University of Nebraska, at Lincoln, with a one-year break at Eindhoven in academic year 1991–92. He retired to Eureka, California in 1997 and died there in 1999. He was a frequent contributor to recreational mathematics and worked with many key mathematics popularizers including Ronald L. Rivest, John H. Conway, Richard K. Guy, Donald Coxeter, Ronald Graham, and Donald Knuth. Organizations and awards Klarner was a member of the Association for Computing Machinery, the American Mathematical Society, the Mathematical Association of America, and the Fibonacci Association. He was awarded a National Science Foundation Fellowship Award in mathematics in 1963. In 1986 Klarner received a University of Nebraska-Lincoln Distinguished Teaching Award in Computer Science. The David A. Klarner Fellowship for Computer Science was set up after Klarner's death by Spyros Magliveras a fellow professor in Computer Science at UNL. Bibliography Asymptotically Optimal Box Packing Theorems: Klarner systems by Michael Reid, Department of Mathematics, University of Central Florida, June, 2008 A Lifetime of Puzzles edited by Erik D. Demaine, Martin L. Demaine, Tom Rodgers; pp. 221–225: Satterfield's Tomb, a puzzle by David A. Klarner and Wade Satterfield; Selected publications Books The Mathematical Gardner (editor), Publisher: Boston : Prindle, Weber & Schmidt; Belmont, Calif. : Wadsworth International, , (electronic book) Papers Polyominoes by Gill Barequet, Solomon W. Golomb, and David A. Klarner, December 2016 The number of tilings of a block with blocks (with F. S. S. Magliveras), European Journal of Combinatorics: Volume 9 Issue 4, July 1988 The number of tiered posets modulo six Discrete Mathematics, Vol. 62, Issue 3, pp. 295–297, December 1986 Asymptotics for coefficients of algebraic functions (with Patricia Woodworth), Aequationes Mathematicae, Volume 23, Issue 1, pp. 
236–241, December 1981 An algorithm to determine when certain sets have 0-density Journal of Algorithms, Vol. 2, Issue 1, Pages 31–43, March 1981 Some remarks on the Cayley-Hamilton theorem American Mathematical Monthly, Vol. 83, No. 5, pp. 367–369, May, 1976 Asymptotic bounds for the number of convex n-ominoes (with Ronald L. Rivest), Discrete Mathematics, Vol. 8, Issue 1, pp. 31–40, March 1974 A finite basis theorem revisited Stanford University: Computer Science Department, April 1973 The number of SDR's in certain regular systems Stanford University: Computer Science Department, April 1973 Selected combinatorial research problems (with Václav Chvátal and Donald E. Knuth), Stanford University: Computer Science Department, June 1972 Sets generated by iteration of a linear operation Stanford University: Computer Science Department, March 1972 Linear Combinations of Sets of Consecutive Integers (with Richard Rado), Stanford University: Computer Science Department, March 1972 Sets generated by iteration of a linear operation Stanford University: Computer Science Department, March 1972 Packing a rectangle with congruent n-ominoes Journal of Combinatorial Theory, Vol. 7, Issue 2, Pages 107–115, September 1969 Packing boxes with congruent figures (with F. Göbel), Indagationes Mathematicae 31, pp. 465–472, MR 40 #6362, 1969 Some Results Concerning Polyominoes Fibonacci Quarterly, 3, pp. 9–20, February 1965 References External links David A. 
Klarner fonds University of Calgary Special Collections Mathematics popularizers Recreational mathematicians 20th-century American mathematicians California State Polytechnic University, Humboldt alumni University of Alberta alumni McMaster University alumni Academic staff of the Eindhoven University of Technology Binghamton University faculty Academic staff of the University of Calgary University of Nebraska faculty American number theorists Combinatorial game theorists 1940 births 1999 deaths People from Fort Bragg, California Writers from California Mathematicians from California
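Klarner's Theorem quoted in the article above (an m × n rectangle can be packed with 1 × x rectangles if and only if x divides one of m and n) can be sanity-checked by exhaustive search for small cases. This backtracking sketch is an illustrative check, not code from Klarner's own work:

```python
def can_pack(m, n, x):
    """Backtracking check: can an m-by-n rectangle be tiled
    completely by 1-by-x rectangles (either orientation)?"""
    if x == 1:
        return True
    if (m * n) % x != 0:  # area must be divisible by the tile length
        return False
    grid = [[False] * n for _ in range(m)]

    def solve():
        # locate the first uncovered cell; a tile must start there
        for i in range(m):
            for j in range(n):
                if not grid[i][j]:
                    # try a horizontal tile starting at (i, j)
                    if j + x <= n and all(not grid[i][j + k] for k in range(x)):
                        for k in range(x):
                            grid[i][j + k] = True
                        if solve():
                            return True
                        for k in range(x):
                            grid[i][j + k] = False
                    # try a vertical tile starting at (i, j)
                    if i + x <= m and all(not grid[i + k][j] for k in range(x)):
                        for k in range(x):
                            grid[i + k][j] = True
                        if solve():
                            return True
                        for k in range(x):
                            grid[i + k][j] = False
                    return False  # first empty cell cannot be covered
        return True  # no empty cells left: the rectangle is tiled

    return solve()

# agrees with Klarner's divisibility criterion on small cases
for m, n, x in [(6, 4, 3), (6, 6, 4), (5, 5, 3), (4, 8, 4)]:
    assert can_pack(m, n, x) == (m % x == 0 or n % x == 0)
```

Note the instructive case (6, 6, 4): the area 36 is divisible by 4, yet no tiling exists because 4 divides neither side, exactly as the theorem predicts.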
David A. Klarner
[ "Mathematics" ]
1,340
[ "Recreational mathematics", "Recreational mathematicians" ]
54,793,295
https://en.wikipedia.org/wiki/Dingle%20Dell%20meteorite
Dingle Dell is a 1.15 kg ordinary chondrite of subclass L/LL5, and the fourth meteorite to be recovered by the Desert Fireball Network camera observatory. It fell in the Morawa region of Western Australia on 31 October 2016 at 8:05 pm local time, and was recovered less than a week later, on the morning of 7 November, in a paddock at Dingle Dell farm. Given the rapid turnaround for meteorite recovery and a lack of rainfall between fall date and find date, the rock is in pristine condition and shows no evidence of terrestrial weathering (W0). This particular meteorite fall demonstrates the proficiency of the DFN as a sample recovery tool for meteoritics. Physical properties and composition The rock is 1.15 kg in mass, and approximately 16 × 9 × 4 cm in size. It was originally slightly wedge-shaped, with pristine fusion crust that is both primary and secondary, which indicates the rock broke up as it was passing through the Earth's atmosphere. Dingle Dell contains chondrules between 1.15 and 4.11 mm in diameter that are poorly defined, which is characteristic of a type 5 ordinary chondrite and moderate amounts of thermal metamorphism. Both olivine and pyroxene have undulose extinction, which is evidence for mild shock, and therefore this rock is classified as an S2. Bulk density is 3.23 g/cm3, and grain density is 3.61 g/cm3. Together, these measurements imply Dingle Dell has a porosity of 10.5%, which is close to the mean for lightly shocked, unweathered ordinary chondrite falls. The magnetic susceptibility and grain density of the meteorite are higher than for typical LL chondrites; however, together with the physical properties of the rock, Dingle Dell belongs to an intermediate population of meteorites that lie between the L and LL chemical groups. Fall description and recovery Several fireball reports were made by the public in the wheat-belt region of Western Australia using the Fireballs in the Sky smartphone app.
Users can report a fireball sighting to help supplement the data obtained from the Desert Fireball Network observatory. Six of the DFN cameras also observed the 6.2-second fireball. The rock entered the Earth's atmosphere traveling with a velocity of 15.43 km/s. It decelerated to a velocity of 3.54 km/s over a distance of 78 km, and stopped ablating at 19.52 km altitude. Members of the Desert Fireball Network team visited the local area around the fall on 3 November, to contact local land owners to seek permission to search. Following this, a search team of 4 people arrived on 5 November; the meteorite was recovered on the second day of searching. References Meteorites found in Australia Chondrite meteorites 2016 in science 21st-century astronomical events October 2016 events in Australia
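The 10.5% porosity quoted above follows directly from the two density measurements via phi = 1 − ρ_bulk/ρ_grain; a quick check:

```python
def porosity(bulk_density, grain_density):
    """Fractional pore space implied by bulk vs. grain density:
    phi = 1 - rho_bulk / rho_grain."""
    return 1.0 - bulk_density / grain_density

# Dingle Dell values from the article, in g/cm^3
phi = porosity(3.23, 3.61)
print(f"porosity = {phi:.1%}")  # porosity = 10.5%
```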
Dingle Dell meteorite
[ "Astronomy" ]
603
[ "Astronomical events", "21st-century astronomical events" ]
54,795,122
https://en.wikipedia.org/wiki/Off-target%20activity
Off-target activity is biological activity of a drug at targets other than its intended biological target. It most commonly contributes to side effects. However, in some cases, off-target activity can be taken advantage of for therapeutic purposes. An example of this is the repurposing of the antimineralocorticoid and diuretic spironolactone, which was found to produce feminization and gynecomastia as side effects, for use as an antiandrogen in the treatment of androgen-dependent conditions like acne and hirsutism in women. Metformin also causes off-target activity. See also Antitarget References Bioactivity Pharmacodynamics
Off-target activity
[ "Chemistry" ]
153
[ "Pharmacology", "Pharmacodynamics" ]
54,796,069
https://en.wikipedia.org/wiki/NGC%205949
NGC 5949 is a dwarf spiral galaxy located around 44 million light-years away in the constellation Draco. NGC 5949 was discovered in 1801 by William Herschel, and it is 30,000 light-years across. NGC 5949 is not known to have an active galactic nucleus, and it is not known for much star formation. Characteristics With a mass of about a hundredth that of the Milky Way, NGC 5949 is a relatively bulky example of a dwarf galaxy. Its classification as a dwarf is due to its relatively small number of constituent stars, but the galaxy’s loosely bound spiral arms also place it in the category of barred spirals. This structure is just visible in the Hubble Space Telescope (HST) image, which shows the galaxy as a bright yet ill-defined pinwheel. Despite its small proportions, NGC 5949’s proximity has meant that its light can be picked up by fairly small telescopes, as discovered by William Herschel. Astronomers have run into several cosmological quandaries when it comes to dwarf galaxies like NGC 5949. For example, the distribution of dark matter within dwarfs is quite puzzling (the “cuspy halo” problem), and our simulations of the Universe predict that there should be many more dwarf galaxies than we see around us (the “missing satellites” problem). References External links 5949 Draco (constellation) 9866 Dwarf spiral galaxies Unbarred spiral galaxies
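As a rough consistency check (a derived estimate, not a figure from the article), the quoted diameter and distance imply the galaxy's apparent size on the sky via the small-angle approximation:

```python
import math

def angular_size_arcmin(diameter_ly, distance_ly):
    """Small-angle approximation theta ~= d / D, converted to arcminutes."""
    return math.degrees(diameter_ly / distance_ly) * 60.0

# NGC 5949 figures quoted above: ~30,000 ly across at ~44 million ly
print(f"{angular_size_arcmin(30_000, 44e6):.2f} arcmin")  # 2.34 arcmin
```

A couple of arcminutes is indeed within reach of fairly small telescopes, consistent with the article's remark about Herschel's discovery.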
NGC 5949
[ "Astronomy" ]
295
[ "Constellations", "Draco (constellation)" ]
54,796,255
https://en.wikipedia.org/wiki/Minimum%20Information%20Required%20About%20a%20Glycomics%20Experiment
The Minimum Information Required About a Glycomics Experiment (MIRAGE) initiative is part of the Minimum Information Standards and specifically applies to guidelines for reporting (describing metadata) on a glycomics experiment. The initiative is supported by the Beilstein Institute for the Advancement of Chemical Sciences. The MIRAGE project focuses on the development of publication guidelines for interaction and structural glycomics data as well as the development of data exchange formats. The project was launched in 2011 in Seattle and started with a description of the project's aims. Organization The MIRAGE Commission consists of three groups which tightly interact with each other. The advisory board consists of leading scientists in glycobiology, who, for example, critically review the outcomes of the working group and promote the reporting guidelines within the community. The working group seeks external consultation and directly interacts with the glycomics community. The group members carry out defined subprojects (e.g. development and revision of guidelines) by focusing on specific research areas to fulfill the overall aims of the MIRAGE project. The co-ordination team links the subprojects from the working group together and passes the outcomes to the advisory board for review. Reporting guidelines The following reporting guidelines were developed and published: MIRAGE MS guidelines for reporting mass spectrometry-based glycan analysis. These guidelines are based on the MIAPE guideline template, i.e. MIAPE-MS version 2.24. MIRAGE Sample preparation guidelines, which are considered a common basis for any further MIRAGE reporting guidelines in order to keep the requirements for data analysis short and consistent. MIRAGE Glycan microarray guidelines, developed for the comprehensive description of glycan array experiments.
To assist authors in reporting in compliance with these guidelines, exemplar publications and a template with a data example are provided. MIRAGE Liquid chromatography guidelines for reporting of liquid chromatography (LC) glycan data. Derivatives The MIRAGE reporting guidelines provide essential frameworks for subsequent projects related to the development of both software tools for the analysis of experimental glycan data and databases for the deposition of interaction analysis data (e.g. from glycan microarray experiments) and structural analysis data (e.g. from mass spectrometry and liquid chromatography experiments). As the guidelines include the definitions of the minimum information required for reporting glycomics experiments comprehensively, this information is incorporated in database structures, data acquisition forms and data exchange formats. The following databases comply with the MIRAGE guidelines: UniCarb-DB, which stores curated data and information on glycan structures and associated fragment data characterised by LC-tandem mass spectrometry strategies. GlycoStore, a curated chromatography, electrophoresis and mass spectrometry derived composition database of N-, O-, glycosphingolipid (GSL) glycans and free oligosaccharides associated with a range of glycoproteins, glycolipids and biotherapeutics.
UniCarb-DR a MSn data repository for glycan structures GlycoPOST a mass spectra repository for glycomics and glycoproteomics The following projects refer to the MIRAGE standards: GlyTouCan is a glycan structure repository where unique identifiers are assigned to individually reported glycan structures UniCarbKB a database of glycans and glycoproteins GlyGen, a data integration and dissemination project for carbohydrate and glycoconjugate related data GlyConnect, an integrated platform for glycomics and glycoproteomics References External links Record in Fairsharing.org for MIRAGE Sample Preparation guidelines Record in Fairsharing.org for MIRAGE Mass spectrometry guidelines Record in Fairsharing.org for MIRAGE Glycan array guidelines Record in Fairsharing.org for MIRAGE Liquid chromatography guidelines Bioinformatics Minimum Information Standards Glycobiology Glycomics Carbohydrates
Minimum Information Required About a Glycomics Experiment
[ "Chemistry", "Engineering", "Biology" ]
856
[ "Biomolecules by chemical classification", "Carbohydrates", "Biological engineering", "Organic compounds", "Glycomics", "Bioinformatics", "Carbohydrate chemistry", "Biochemistry", "Glycobiology" ]
54,798,306
https://en.wikipedia.org/wiki/CONSELF
CONSELF is a computer-aided engineering (CAE) platform used by engineers for design purposes. The platform, which relies heavily on cloud computing, has been developed by CONSELF SRL since its first release in October 2015. In March 2016 a new release of the platform defined guided workflows for the users with focus on turbomachinery, fire scenarios and flows with dispersed solid particles. Through the platform it is possible to run both Computational Fluid Dynamics and Finite Element Analysis. Among the solvers and libraries used by the CONSELF platform, a number of open-source technologies are included, such as: FEA: Code_Aster CFD: OpenFOAM The accuracy of the application is supported by close cooperation with Italian universities and the production of academic papers and research studies. Because of its level of innovation and thanks to the high number of industrial applications, the platform has received an award in Italy from the national industrial association CONFINDUSTRIA. Features The simulation platform is currently capable of running both CFD and FEA simulations, using hardware resources provided on a pay-per-use basis. Considering the two major operative fields in detail, CONSELF offers the following features: Mesh generation Tetrahedral and hexahedral meshing algorithms Boundary layer definition for CFD analysis Finite Element Analysis solvers Single body simulation Static analysis Modal analysis Isotropic linear elastic material model Geometrical non-linear behaviour (large displacements) Computational Fluid Dynamics solvers Incompressible/Compressible single material flow Multiphase incompressible flows Passive scalar transport for HVAC Single Reference Frame (SRF) simulation for Turbomachinery Flow with particles File format CONSELF is currently able to interact with a number of 3D-modelling file formats, such as STEP, IGES and STL.
In addition, the geometry can be directly imported from its partner CAD platform, Onshape. As an output format, CONSELF is fully compatible with the open-source viewer ParaView. References Cloud platforms Computer-aided engineering software for Linux Computer-aided engineering software Finite element software Simulation software
CONSELF
[ "Technology" ]
436
[ "Cloud platforms", "Computing platforms" ]
54,798,805
https://en.wikipedia.org/wiki/FSC%20Millport
FSC Millport, run by the Field Studies Council, is located on the island of Great Cumbrae in the Firth of Clyde, Scotland. The field centre was formerly known as the University Marine Biological Station Millport (UMBSM), a higher education institute that was run by the University of London in partnership with Glasgow University but was closed due to the withdrawal of higher education funding in 2013. FSC reopened the centre in 2014 and continues to host and teach university, school and college groups and to support and host research students from all over the world, whilst also extending its educational reach and providing a variety of courses in natural history and outdoor environmental activities for adult learners and families to enjoy. The centre is a very popular conference venue hosting many international events. The Robertson Museum and Aquarium (named after the founder of the original Marine Station, David Robertson) is open to visitors between March and November. The centre also functions as a Meteorological Office Weather Station and Admiralty Tide Monitor. History The Ark, an 84 ft lighter originally moored in the flooded Granton quarry, was fitted out as a floating laboratory by the father of modern oceanography, Sir John Murray. This boat was brought to Port Loy on the Isle of Cumbrae in 1885 and formed the beginnings of the Scottish Marine Station. She attracted a stream of distinguished scientists, drawn by the richness of the fauna and flora of the Firth of Clyde, but gradually fell into disuse after the opening of the Millport Marine Station, and on the night of 20 January 1900 was completely destroyed by a great storm. In 1894 a committee headed by amateur naturalist David Robertson began to build a marine station on the Isle of Cumbrae and took over the Ark. David Robertson died before the centre was completed, but in 1897 Millport Marine Biological Station (MMBS) was opened by Sir John Murray.
Despite many struggles during its first few decades, in which sufficient funding was difficult to attain and there was much conflict between research priorities and the needs of education, the station persisted. On 21 July 1904 Scotia, the ship of Dr William Speirs Bruce's Scottish National Antarctic Expedition, returned to her first Scottish landing site at the Keppel Pier on the Isle of Cumbrae. From this beginning, the station was gradually built up to its present size. The original building proved too small for the purpose and an architectural copy was built alongside. In 1914 the Scottish Marine Biological Association was established at MMBS. In 1922 Sheina Marshall joined the Scottish Marine Laboratory, beginning a scientific career dedicated to the study of plant and animal plankton. She went on to become one of the first women to be elected a Fellow of the Royal Society of Edinburgh, and later became a Fellow of the Royal Society, as well as being awarded the Order of the British Empire in 1966. From 1966 to 1987 the station ran under the Directorship of Ronald Ian Currie FRSE who was responsible for the creation of RV Challenger and RV Calanus. In 1970 the Scottish Marine Biological Station moved to Dunstaffnage Bay (Oban), and MMBS was taken over by the University of London in partnership with Glasgow University, becoming the University Marine Biological Station Millport (UMBSM). It continued to expand, with a hostel accommodation block opening in 1975. In December 2012 it was announced that the University Marine Biological Station Millport would be forced to close after the Higher Education Funding Council for England withdrew the grant of 400,000 pounds that it gave to the University of London to run the station. UMBSM closed on 31 October 2013. Ownership of MMBS was transferred to the Field Studies Council on 1 January 2014. 
In May 2014 a four-million-pound package of funding was announced that allowed a comprehensive programme of development and refurbishment to be completed over five years. FSC Millport continues to develop and grow as one of the Field Studies Council's centres. See also Sheina Marshall David Robertson (naturalist) Field Studies Council References External links Official website Educational institutions established in 1885 Buildings and structures in North Ayrshire Education in North Ayrshire Science and technology in Scotland Biological stations Marine biology Field studies centres in the United Kingdom Millport, Cumbrae Firth of Clyde 1885 establishments in Scotland Oceanographic organizations Scientific organisations based in the United Kingdom
FSC Millport
[ "Biology" ]
863
[ "Marine biology" ]
54,798,835
https://en.wikipedia.org/wiki/Crustal%20magnetism
Crustal magnetism is the magnetic field of the crust of a planetary body. The crustal magnetism of Earth has been studied, in particular its various magnetic crustal anomalies. Two examples of crustal magnetic anomalies on Earth that have been studied in the Americas are the Brunswick magnetic anomaly (BMA) and the East Coast magnetic anomaly (ECMA). Also, there can be a correlation between physical geological features and certain readings from crustal magnetism on Earth. Below the surface of the Earth, the crustal magnetism is lost because the temperature rises above the Curie temperature of the materials producing the field. On Mars the crustal magnetic fields have been noted as affecting its ionosphere. (See also Atmosphere of Mars) The magnetic fields on Mars from its rocks and crust are thought to come from ferromagnetism, and if the material is heated above its Curie temperature the magnetic imprint is erased. The Mars Global Surveyor (MGS) discovered magnetic stripes in the crust of Mars, especially in the Phaethontis and Eridania quadrangles (Terra Cimmeria and Terra Sirenum). The magnetometer on MGS discovered wide stripes of magnetized crust running roughly parallel for up to . These stripes alternate in polarity, with the north magnetic pole of one pointing up from the surface and the north magnetic pole of the next pointing down. When similar stripes were discovered on Earth in the 1960s, they were taken as evidence of plate tectonics. Researchers believe these magnetic stripes on Mars are evidence for a short, early period of plate tectonic activity. Only roughly half of Mars seems to have a crustal magnetic field; there are several possible explanations for this, such as that an internal dynamo only affected part of the planet, or that a body struck Mars in the past destroying the magnetism. Lunar crustal magnetism has also been discovered and studied.
The Lunar Prospector probe of the 1990s mapped the lunar crustal magnetic field across the whole globe of the Moon for the first time. This allowed the effect of previously identified impact basins on the magnetic field to be studied. The impact basins Orientale and Imbrium had some of the weakest magnetic fields in the survey, and additional study puts constraints on the existence of steady ambient lunar magnetic fields in the Moon's past, given the assumptions of the data and collection techniques. Remnant crustal magnetic fields are studied on the Moon, as in this case, and have also been examined on Earth and on Mars. However, as of the early 21st century these are the only bodies whose crustal magnetism has been studied, though other bodies are predicted to have this geological feature as well. See also Magnetic anomaly (geology) Rock magnetism Paleomagnetism References External links Global mapping of lunar crustal magnetic fields by Lunar Prospector D.L. Mitchell, et al. (2008) Elements Magazine - Crustal Magnetism, Lamellar Magnetism and Rocks That Remember Magnetism in astronomy Geomagnetism
Crustal magnetism
[ "Astronomy" ]
605
[ "Magnetism in astronomy" ]
54,799,459
https://en.wikipedia.org/wiki/Rolling%20and%20wheeled%20creatures%20in%20fiction%20and%20legend
Legends and speculative fiction reveal a longstanding human fascination with rolling and wheeled creatures. Such creatures appear in mythologies from Europe, Japan, pre-Columbian Mexico, the United States, and Australia, and in numerous modern works. Rolling creatures The triskelion is a motif with central symmetry used since ancient times. A variant with three human legs appears in the medieval flag of the Isle of Man. A variant with the head of Medusa at the junction of the legs is associated with Sicily. Its meaning in antiquity and its original Greek name are not known. The hoop snake, a creature of legend in the United States and Australia, is said to grasp its tail in its mouth and roll like a wheel towards its prey. Japanese culture includes a similar mythical creature, the Tsuchinoko. Buer, a demon mentioned in the 16th-century grimoire Pseudomonarchia Daemonum, was described in Collin de Plancy's 1825 edition of Dictionnaire Infernal as having "the shape of a star or wheel". The 1863 edition of this book featured an illustration by Louis Le Breton, depicting a creature with five radially arranged legs. Neil R. Jones' 1937 story "On the Planet Fragment" features aliens dubbed the Disci, which are shaped like wheels, with limbs around the circumference. One of their methods of locomotion is a "rolling motion like that of a cartwheel." The 1944 science fiction short story "Arena", by Fredric Brown, features a telepathic alien called an Outsider, which is roughly spherical and moves by rolling. The story was the basis for a 1967 Star Trek episode of the same name, and possibly also a 1964 episode of The Outer Limits entitled "Fun and Games", though neither television treatment included a spherical creature. E. E. "Doc" Smith's 1950 novel First Lensman features the fontema, which consists of two wheels connected by articulations to an axle, lives on sunlight, and has only two behaviors: rolling, and conjugation/mating, which is scarcely more complicated.
The Dutch graphic artist M. C. Escher invented a creature that was capable of rolling itself forward, which he named Pedalternorotandomovens centroculatus articulosus. He illustrated this creature in a 1951 lithograph known by the English title Curl-up. A 1956 Scrooge McDuck comic, Land Beneath the Ground!, by Carl Barks, introduced Terries and Fermies (a play on the phrase terra firma), creatures who move from place to place by rolling. The Terries and Fermies have made a sport of their rolling abilities, causing earthquakes in the process. Northern Irish author James White's Sector General series features "Rollers" from the planet Drambo, doughnut-shaped aquatic organisms that do not have hearts, but which instead must roll continuously to maintain circulation by means of gravity. The Rollers are described in the short story "Spacebird" in the 1980 edition of Ambulance Ship, and in other works in the series. The 1982 puppet-animated fantasy film The Dark Crystal, directed by Jim Henson and Frank Oz, introduced the character Fizzgig, a dog-like companion creature that rolls from place to place. In 2015, an original film puppet of Fizzgig was put up for auction with an estimated value of $12,000–$15,000. In The Citadel of Chaos (1983) by Steve Jackson, the reader encounters Wheelies, disc-shaped creatures with four arms who move by doing cartwheels. Tuf Voyaging, a 1986 science fiction novel by George R. R. Martin, features an alien called a Rolleram, described as a "berserk living cannonball of enormous size", which kills its prey by rolling over it and crushing it, before digesting it externally. Adults of the species weigh approximately six metric tons and can roll faster than . In the Sonic the Hedgehog video game series, which first appeared in 1991, the eponymous Sonic and his sidekick Tails are capable of moving by rolling.
The 1995 short story "Microbe", by Kenyon College biologist and feminist science fiction writer Joan Slonczewski, describes an exploratory expedition to an alien world whose plant and animal life consists entirely of doughnut-shaped organisms. Wheeled creatures Toy animals with wheels dating from the Pre-Columbian era were uncovered by archaeologists in Veracruz, Mexico, in the 1940s. The indigenous peoples of this region did not use wheels for transportation prior to the arrival of Europeans. L. Frank Baum's 1907 children's novel Ozma of Oz features humanoid creatures with wheels instead of hands and feet, called Wheelers. Their wheels are composed of keratin, which has been suggested by biologists as a means of avoiding nutrient and waste transfer problems with living wheels. Despite moving quickly on open terrain, the Wheelers are stymied by obstacles in their path that do not hinder creatures with limbs. They also appear in the 1985 film Return to Oz. The surrealist artist Remedios Varo (1908–1963) painted images of fantastical creatures with wheels as their bases, such as Homo rodans (1959), Fantastic animal (1959), and The Ladies at Bonhuer. The 1966 novella The Last Castle by Jack Vance describes "power-wagons" as creatures with a mix of biological and mechanical elements, including wheels. The 1968 novel The Goblin Reservation by Clifford D. Simak features an intelligent alien race that uses biological wheels. Kurt Vonnegut's 1973 novel Breakfast of Champions includes a brief description of fictional author Kilgore Trout's novel Plague on Wheels, which features a planet inhabited by sentient wheeled automobiles. Evsise, the narrator of Harlan Ellison’s 1975 novelette "I'm Looking for Kadak", describes himself thus: "I am squat and round and move very close to the ground by a series of caterpillar feet set around the rim of ball joints and sockets on either side of my toches ... 
and when I’ve wound the feet tight, I have to jump off the ground so they can unwind and then I move forward again which makes my movement very peculiar ..." Chorlton and the Wheelies, a British stop-motion-animated television series that aired from 1976 to 1979, was set in "Wheelie World", which was inhabited by three-wheeled creatures called "wheelies". John Varley's 1977 short story "In the Hall of the Martian Kings" features several types of creatures on Mars with wheels (for locomotion) or spinning windmills. Piers Anthony's 1977 book Cluster and its sequels feature aliens called Polarians, which locomote by gripping and balancing atop a large ball. The ball is a living, though temporarily separable, portion of the Polarian's body. David Brin's Uplift Universe includes a wheeled species called the g'Kek, which are described in some detail in the 1995 novel Brightness Reef. In 1996's Infinity's Shore, a g'Kek is described as looking like "a squid in a wheelchair." The g'Kek suffer from arthritic axles in their old age, particularly when living in a high-gravity environment. A 1997 novel in the Animorphs series, The Andalite Chronicles, includes an alien called a Mortron, composed of two separate entities: a yellow and black bottom half with four wheels, and a red, elongated head with razor-sharp teeth and concealed wings. The 2000 novel The Amber Spyglass, by English author Philip Pullman, features an alien race known as the Mulefa, which have diamond-shaped bodies with one leg at the front and back and one on each side. The Mulefa use large, disk-shaped seed pods as wheels. They mount the pods on bone axles on their front and back legs, while propelling themselves with their side legs. The Mulefa have a symbiotic relationship with the seed pod trees, which depend on the rolling action to crack open the pods and allow the seeds to disperse.
In the 2000 novel Wheelers, by English mathematician Ian Stewart and reproductive biologist Jack Cohen, a Jovian species called "blimps" has developed the ability to biologically produce machines called "wheelers", which use wheels for locomotion. The children's television series Jungle Junction, which premiered in 2009, features hybrid jungle animals with wheels rather than legs; one such animal, Ellyvan, is a hybrid of an elephant and a van. These animals traverse their habitat on elevated highways. The 2011 video game Dark Souls features Wheel Skeletons (or "Bonewheels"), which wear a wooden-spiked wheel, allowing them to roll at high speed. The 2021 Japanese children's stop motion animated series Pui Pui Molcar features guinea pig/vehicle hybrids. They are sentient, but are shown being driven around on roads by humans. Although they have wheels, they are usually shown using them like feet to walk and run. References Fantasy tropes Hypothetical life forms Legendary creatures Speculative evolution
Rolling and wheeled creatures in fiction and legend
[ "Biology" ]
1,889
[ "Biological hypotheses", "Speculative evolution", "Hypothetical life forms" ]
54,799,514
https://en.wikipedia.org/wiki/MindSphere
MindSphere is an industrial IoT-as-a-service solution developed by Siemens for applications in the context of the Internet of Things (IoT). MindSphere stores operational data and makes it accessible through digital applications (“MindSphere applications”) to allow industrial customers to make decisions based on valuable factual information. The system is used in applications such as automated production and vehicle fleet management. Assets can be securely connected to MindSphere with auxiliary MindSphere products that collect and transfer relevant machine and plant data. Examples include real-time telemetric data from moving assets like cars, time series data and geographical data, which can be used for predictive maintenance or to develop new analytical tools. MindSphere is now known as Insights Hub. Overview As an industrial IoT as a service solution, MindSphere collects and analyzes all kinds of sensor data in real time. This information can be used to optimize products, production assets and manufacturing processes along the entire value chain. MindSphere’s open application interfaces make it possible to obtain data from machines, plants or entire fleets irrespective of the manufacturer. These interfaces include OPC Foundation’s OPC Unified Architecture (OPC UA). To help customers create their own software applications and services, MindSphere is equipped with open application programming interfaces (APIs) and development tools. This allows OEMs to integrate their own technology. MindSphere is based on the concept of closed feedback loops enabling the bi-directional data flow between production and development: Real-world plants, machines and equipment can be connected to MindSphere in order to extract operational data. Valuable information (i.e., “digital twins” of machines) can then be extrapolated from the raw data through analytics and utilized to optimize products as well as production processes and environments in the next cycle of innovation. 
Timeline August 2017 – End of closed beta phase and release of MindSphere Version 2.0 January 2018 – Release of MindSphere Version 3.0 on AWS May 2018 – Release of MindSphere on Microsoft Azure April 2019 – Release of MindSphere Version 3.0 on Alibaba Cloud June 2023 – MindSphere is now known as Insights Hub See also Internet of Things Industry 4.0 References External links Industrial automation Internet of things Big data products Industrial computing Technology forecasting Cloud platforms
MindSphere
[ "Technology", "Engineering" ]
488
[ "Cloud platforms", "Computing platforms", "Industrial computing", "Industrial engineering", "Automation", "Industrial automation" ]
54,801,001
https://en.wikipedia.org/wiki/1%2C4-Butane%20sultone
1,4-Butane sultone is a six-membered δ-sultone and the cyclic ester of 4-hydroxybutanesulfonic acid. As a sulfo-alkylating agent, 1,4-butanesultone is used to introduce the sulfobutyl group (–(CH2)4–SO3−) into hydrophobic compounds possessing nucleophilic functional groups, for example hydroxy groups (as in the case of β-cyclodextrin) or amino groups (as in the case of polymethine dyes). In such derivatives, the sulfobutyl group is present as its neutral sodium salt and considerably increases their water solubility. Preparation A lab-scale synthesis of 1,4-butanesultone starts from 4,4'-dichlorodibutyl ether (accessible from tetrahydrofuran treated with phosphorus oxychloride and concentrated sulfuric acid), which reacts with sodium sulfite forming the corresponding 4,4'-butanedisulfonic disodium salt. By passing it through an acidic ion exchanger, the disodium salt is converted into the disulfonic acid, which forms two molecules of 1,4-butanesultone at elevated temperature and reduced pressure with elimination of water. The yields obtained range from 72 to 80%. Starting from 4-chlorobutan-1-ol (from tetrahydrofuran and hydrogen chloride in 54 to 57% yield), the sodium salt of 4-hydroxybutan-1-sulfonic acid is obtained with sodium sulfite. This salt is converted with strong acids (such as hydrochloric acid) into the very hygroscopic 4-hydroxybutanesulfonic acid and cyclized to 1,4-butanesultone with elimination of water. The cyclization of 4-hydroxybutanesulfonic acid in aqueous solution proceeds particularly efficiently when heated with high-boiling, water-immiscible solvents (for example 1,2-dichlorobenzene or diethylbenzene, both boiling at about 180 °C) in which 1,4-butanesultone dissolves and is thereby protected from hydrolysis in the aqueous medium. 1,4-Butanesultone is obtained in yields of up to 99% within one hour at reflux.
Vacuum distillation of the sodium salt of 4-hydroxybutanesulfonic acid in the presence of concentrated sulfuric acid leads directly to 1,4-butanesultone. The sodium salt of 4-chlorobutane-1-sulfonic acid, which is obtained from 1,4-dichlorobutane with sodium sulfite, can also be cyclized to 1,4-butanesultone by heating to 180–250 °C. The free-radical-initiated sulfochlorination of 1-chlorobutane leads to a mixture of positionally isomeric sulfochlorides and chlorination products and is therefore not suitable for the direct preparation of 1,4-butanesultone. Properties 1,4-Butanesultone is a viscous, clear, colorless and odorless liquid which reacts with boiling water (to give 4-hydroxybutanesulfonic acid) and with alcohols (to give 4-alkoxybutanesulfonic acids) and dissolves in many organic solvents. At temperatures below the melting point, the compound crystallizes giving "large, magnificent plates". Compared to the homologous γ-sultone 1,3-propanesultone, 1,4-butanesultone is significantly less reactive as an alkylating agent, but is classified as mutagenic and carcinogenic. Applications Sulfobetaines 1,4-Butanesultone reacts smoothly with nucleophiles such as ammonia to form the corresponding zwitterionic, usually very water-soluble sulfobutylbetaines. Sulfobetaines with longer alkyl chains (CnH2n+1 with n > 10) show interesting properties as surface-active compounds (surfactants, detergents) with antimicrobial properties. The reaction of N-butylimidazole with 1,4-butanesultone in toluene forms 1-butylimidazolium-3-(n-butylsulfonate) in 98% yield. As a component of multifunctional catalysts, 1-butylimidazolium-3-(n-butylsulfonate) catalyses the conversion of platform chemicals from biomass (for example levulinic acid or itaconic acid) into the corresponding lactones, diols or cyclic ethers.
Aminoalkylphosphonic acids (such as aminomethane diphosphonic acid, accessible from phosphorus trichloride, formamide and phosphonic acid) react with 1,4-butanesultone to form N-(sulfobutyl)aminomethane diphosphonic acids: N-(sulfobutyl)aminomethane diphosphonic acid is characterized by very high water solubility (> 1000 g·l−1) and a strong capability as a complexing agent and water softener. Sulfobutylation of cyanine dyes leads to readily water-soluble compounds which react with proteins such as antibodies and can be used as pH-sensitive fluorescence markers. Ionic liquids The ionic liquid 4-triethylammonium butane-1-sulfonic acid hydrogensulfate (TEBSA HSO4) is formed by the reaction of 1,4-butanesultone with triethylamine in acetonitrile to give the zwitterion (85% yield), followed by reaction with concentrated sulfuric acid. 4-Triethylammonium butane-1-sulfonic acid hydrogensulfate can replace conventional mineral acids as an effective and easily recyclable acid catalyst in solvent-free reactions. The ring opening of 1,4-butanesultone with organic chloride salts yields ionic liquids of the 4-chlorobutylsulfonate type in quantitative yield. The chlorine atom in the 4-chlorobutylsulfonate anion can be replaced by the respective anion by heating with inorganic (e.g. potassium fluoride) or organic (e.g. sodium acetate) salts. Sulfobutylated β-cyclodextrin The reaction of 1,4-butanesultone with the water-insoluble polysaccharide cellulose in sodium hydroxide solution, which leads to a water-soluble product, was reported as early as 1949. Building on this, the derivatization of β-cyclodextrin to sulfobutyl ether-beta-cyclodextrin (SBECD) is now an important application of 1,4-butanesultone. Sulfobutyl ether-beta-cyclodextrin is a water-soluble inclusion compound for the solubilization and stabilization of sparingly water-soluble and chemically unstable compounds.
β-Cyclodextrin can be reacted with 1,4-butanesultone in sodium hydroxide solution at 70 °C to give the sulfobutyl ether in yields of up to 80% and with a degree of substitution of 6.68. This increases the water solubility of the β-cyclodextrin from 18.5 g·l−1 to more than 900 g·l−1 at 25 °C. Sulfobutyl ether-beta-cyclodextrin also finds a wide range of applications as an inert vehicle for drug delivery (drug transport and release). See also 1,3-Propane sultone References Sultones Esters Six-membered rings
1,4-Butane sultone
[ "Chemistry" ]
1,676
[ "Organic compounds", "Esters", "Functional groups" ]
54,801,052
https://en.wikipedia.org/wiki/Google%27s%20Ideological%20Echo%20Chamber
"Google's Ideological Echo Chamber", commonly referred to as the Google memo, is an internal memo, dated July 2017, by US-based Google engineer James Damore about Google's culture and diversity policies. The memo and Google's subsequent firing of Damore in August 2017 became a subject of interest for the media. Damore's arguments received both praise and criticism from media outlets, scientists, academics and others. The company fired Damore for violation of the company's code of conduct. Damore filed a complaint with the National Labor Relations Board, but later withdrew this complaint. A lawyer with the NLRB wrote that his firing did not violate Federal employment laws, as most employees in the United States can be fired at the employer's discretion. After withdrawing this complaint, Damore filed a class action lawsuit, retaining the services of attorney Harmeet Dhillon, alleging that Google was discriminating against conservatives, Whites, Asians, and men. Damore withdrew his claims in the lawsuit to pursue arbitration against Google. Course of events James Damore wrote the memo after a Google diversity program he attended solicited feedback. The memo was written on a flight to China. Calling the culture at Google an "ideological echo chamber", the memo states that, whereas discrimination exists, it is extreme to ascribe all disparities to oppression, and it is authoritarian to try to correct disparities through reverse discrimination. Instead, the memo argues that male to female disparities can be partly explained by biological differences. Alluding to the work of Simon Baron-Cohen, Damore said that those differences include women generally having a stronger interest in people rather than things, and tending to be more social, artistic, and prone to neuroticism (a higher-order personality trait).
Damore's memorandum also suggests ways to adapt the tech workplace to those differences to increase women's representation and comfort, without resorting to discrimination. The memo is dated July 2017 and was originally shared on an internal mailing list. It was later updated with a preface affirming the author's opposition to workplace sexism and stereotyping. On August 5, a version of the memo (omitting sources and graphs) was published by Gizmodo. The memo's publication resulted in controversy across social media, and in public criticism of the memo and its author from some Google employees. According to Wired, Google's internal forums showed some support for Damore, who said he received private thanks from employees who were afraid to come forward. Damore was fired remotely by Google on August 7, 2017. The same day, prior to being fired, Damore filed a complaint with the National Labor Relations Board. The complaint is marked as "8(a)(1) Coercive Statements (Threats, Promises of Benefits, etc.)". A subsequent statement from Google asserted that its executives were unaware of the complaint when they fired Damore; it is illegal to fire an employee in retaliation for an NLRB complaint. Following his firing, Damore announced he would pursue legal action against Google. Google's VP of Diversity, Danielle Brown, responded to the memo on August 8: "Part of building an open, inclusive environment means fostering a culture in which those with alternative views, including different political views, feel safe sharing their opinions. But that discourse needs to work alongside the principles of equal employment found in our Code of Conduct, policies, and anti-discrimination laws". Google's CEO Sundar Pichai wrote a note to Google employees, supporting Brown's formal response, and adding that much of the document was fair to debate. 
His explanation read "to suggest a group of our colleagues have traits that make them less biologically suited to that work is offensive and not OK ... At the same time, there are co-workers who are questioning whether they can safely express their views in the workplace (especially those with a minority viewpoint). They too feel under threat, and that is also not OK." Anonymously-placed physical ads criticizing Pichai and Google for the firing were put up shortly after. Damore characterized the response by Google executives as having "shamed" him for his views. CNN described the fallout as "perhaps the biggest setback to what has been a foundational premise for [Google] employees: the freedom to speak up about anything and everything". Damore gave interviews to Bloomberg Technology and to the YouTube channels of Canadian professor Jordan Peterson and podcaster Stefan Molyneux. Damore stated that he wanted his first interviews to be with media who were not hostile. He wrote an op-ed in The Wall Street Journal, detailing the history of the memo and Google's reaction, followed by interviews with Reason, Reddit's "IAmA" section, CNN, CNBC, Business Insider, Joe Rogan, Dave Rubin, Milo Yiannopoulos, and Ben Shapiro. In response to the memo, Google's CEO planned an internal "town hall" meeting, fielding questions from employees on inclusivity. The meeting was cancelled a short time before it was due to start, over safety concerns as "our Dory questions appeared externally this afternoon, and on some websites, Googlers are now being named personally". Outlets found to be posting these names, with pictures, included 4chan, Breitbart News, and Milo Yiannopoulos' blog. Danielle Brown, Google's VP for diversity, was harassed online, and temporarily disabled her Twitter account. Damore withdrew his complaint with the National Labor Relations Board before the board released any official findings. 
However, shortly before the withdrawal, an internal NLRB memo found that his firing was legal. The memo, which was not released publicly until February 2018, said that, whereas the law shielded him from being fired solely for criticizing Google, it did not protect discriminatory statements, that his memo's "statements regarding biological differences between the sexes were so harmful, discriminatory, and disruptive as to be unprotected", and that these "discriminatory statements", not his criticisms of Google, were the reason for his firing. After withdrawing his complaint with the National Labor Relations Board, Damore and another ex-Google employee instead shifted focus to a class action lawsuit accusing Google of various forms of discrimination against conservatives, white people, and men. In October 2018, Damore and the other former Google employee dismissed their claims in the lawsuit, in order to pursue private arbitration against Google. Another engineer, Tim Chevalier, later filed a lawsuit against Google claiming that he was terminated in part for criticizing Damore's memo on Google's internal message boards. Reactions On the science Responses from scientists who study gender and psychology reflected the controversial nature of the science Damore cited. Some commentators in the academic community said Damore had understood the science correctly, such as Debra W. Soh, a columnist and psychologist; Lee Jussim, a professor of social psychology at Rutgers University; and Geoffrey Miller, an evolutionary psychology professor at University of New Mexico. Others said that he had got the science wrong and relied on data that was suspect, outdated, irrelevant, or otherwise flawed; these included Gina Rippon, chair of cognitive brain imaging at Aston University; evolutionary biologist Suzanne Sadedin; and Rosalind Barnett, a psychologist at Brandeis University. David P. 
Schmitt, former professor of psychology at Bradley University, said that while some sex differences are "small to moderate" in size and not relevant to occupational performance at Google, "culturally universal sex differences in personal values and certain cognitive abilities are a bit larger in size, and sex differences in occupational interests are quite large. It seems likely these culturally universal and biologically-linked sex differences play some role in the gendered hiring patterns of Google employees." British journalist Angela Saini said that Damore failed to understand the research he cited, while American journalist John Horgan criticized the track record of evolutionary psychology and behavioral genetics. Owen Jones, a columnist for The Guardian, said that the memo was "guff dressed up with pseudo-scientific jargon" and cited a former Google employee saying that it failed to show the desired qualities of an engineer. Feminist journalist Louise Perry, in her book The Case Against the Sexual Revolution, comments on the affair, saying that she is sympathetic to Damore and that the science he quotes is perfectly sound. Alice H. Eagly, professor of psychology at Northwestern University, wrote "As a social scientist who's been conducting psychological research about sex and gender for almost 50 years, I agree that biological differences between the sexes likely are part of the reason we see fewer women than men in the ranks of Silicon Valley's tech workers. But the road between biology and employment is long and bumpy, and any causal connection does not rule out the relevance of nonbiological causes." Impact on Google Prior to his interview with Damore, Steve Kovach interviewed a female Google employee for Business Insider who said she objected to the memo, saying it lumped all women together, and that it came across as a personal attack. Business Insider also reported that several women were preparing to leave Google by interviewing for other jobs. 
Within Google, the memo sparked discussions among staff, some of whom believe they were disciplined or fired for their comments supporting diversity or for criticizing Damore's beliefs. Concerns about sexism In addition to Sheryl Sandberg, who linked to scientific counterarguments, a number of other women in technology condemned the memorandum, including Megan Smith, a former Google vice president. Susan Wojcicki, CEO of YouTube, wrote an editorial in which she described feeling devastated about the potential effect of the memo on young women. Laurie Leshin, president of the Worcester Polytechnic Institute, said that she was heartened by the backlash against the memo, which gave her hope that things were changing. Kara Swisher of Recode criticized the memo as sexist; Cynthia B. Lee, a computer science lecturer at Stanford University, stated that there is ample evidence for bias in tech and that correcting this was more important than whether biological differences might account for a proportion of the numerical imbalances in Google and in technology. Cathy Young in USA Today said that while the memo had legitimate points, it mischaracterized some sex differences as being universal, while Google's reaction to the memo was harmful since it fed into arguments that men are oppressed in modern workplaces. Libertarian author Megan McArdle, writing for Bloomberg View, said that Damore's claims about differing levels of interest between the sexes reflected her own experiences. Christina Cauterucci of Slate drew parallels between arguments from Damore's memo and those of men's rights activists. UC Law legal scholar Joan C. Williams expressed concerns about the prescriptive language used by some diversity training programs and recommended that diversity initiatives be phrased in problem-solving terms. 
Employment law and free speech concerns Yuki Noguchi, a reporter for NPR (National Public Radio), said that Damore's firing raised questions regarding the limits of free speech in the workplace. First Amendment free speech protections usually do not extend into the workplace, as the First Amendment restricts government action but not the actions of private employers, and employers have a duty to protect their employees against a hostile work environment. Several employment law experts interviewed by CNBC said that while Damore could challenge his firing in court, his potential case would be weak and Google would arguably have several defensible reasons for firing him; had Google not made a substantive response to his memo, that could have been cited as evidence of a "hostile work environment" in lawsuits against Google. Additionally, they argued that the memo could indicate that Damore would be unable to fairly assess or supervise the work of female colleagues. Cultural commentary Google's reaction to the memo and its firing of Damore were criticized by several cultural commentators, including Margaret Wente of The Globe and Mail, Erick Erickson, a conservative writer for RedState, David Brooks of The New York Times, Clive Crook of Bloomberg View, and moral philosopher Peter Singer, writing in New York Daily News. Others objected to the intensity of the broader response to the memo in the media and across the internet, such as CNN's Kirsten Powers, Conor Friedersdorf of The Atlantic, and Jesse Singal, writing in The Boston Globe. 
See also Biological determinism Cancel culture Criticism of Google Gender disparity in computing Resistance to diversity efforts in organizations Neuroscience of sex differences Sex differences in psychology Sexism in the technology industry Women in computing Women in STEM fields References Further reading External links The memo as PDF also hosted here Fired for Truth - James Damore's official website Google Video on Unconscious Bias - Making the Unconscious Conscious by Life at Google (YouTube, 4 minutes) 2017 controversies in the United States 2017 documents Ideological Echo Chamber Diversity in computing Memoranda Sexism in the United States Women in computing Works about Google Computing-related controversies Freedom of speech in the United States
Google's Ideological Echo Chamber
[ "Technology" ]
2,655
[ "Computing-related controversies", "Computing and society", "Diversity in computing" ]
54,801,476
https://en.wikipedia.org/wiki/Jeremy%20Baumberg
Jeremy John Baumberg (born 14 March 1967) is a British physicist who is professor of nanoscience in the Cavendish Laboratory at the University of Cambridge, a fellow of Jesus College, Cambridge, and director of the NanoPhotonics Centre. Education Baumberg was born on 14 March 1967. He was educated at the University of Cambridge, where he was an undergraduate student of Jesus College, Cambridge, and was awarded a Bachelor of Arts degree in natural sciences in 1988. He moved to the University of Oxford, where he was awarded a Doctor of Philosophy degree in 1993. During his postgraduate study he was a student of Jesus College, Oxford, supervised by John Francis Ryan; his doctoral research investigated nonlinear optics in semiconductors. Career and research Following his PhD, Baumberg was a visiting IBM Research fellow at the University of California, Santa Barbara (UCSB) from 1994 to 1995. He returned to the UK to work in the Hitachi Cambridge Lab from 1995 to 1998 before being appointed professor of nano-scale physics at the University of Southampton from 1998 to 2007, where he co-founded Mesophotonics Limited, a Southampton University spin-off company. Baumberg's research is in nanotechnology, including nanophotonics, plasmonics, metamaterials and optical microcavities. He is interested in the development of nanostructured optical materials that undergo unusual interactions with light, and his research has various commercial applications. His early work led to the development of a number of pioneering experimental techniques. Baumberg appeared in the documentary The Secret Life of Materials in 2015 and in a Horizon documentary about the Schön scandal first broadcast in 2004. Awards and honours Baumberg has received several awards for his research including the Mullard Award in 2004 and the Rumford Medal in 2014, both from the Royal Society. 
The Institute of Physics (IOP) awarded Baumberg the Silver Young Medal and Prize in 2013 and the Gold Faraday Medal and Prize in 2017. Baumberg was elected a Fellow of the Royal Society (FRS) in 2011. Publications The Secret Life of Science: How It Really Works and Why It Matters (Princeton UP, 2018) Personal life Baumberg is the son of the late Simon Baumberg OBE, a microbiologist who served as professor of bacterial genetics at the University of Leeds from 1996 to 2005. References External links Jeremy Baumberg's Fellows of the Royal Society Fellows of the Institute of Physics Fellows of Jesus College, Cambridge Alumni of Jesus College, Cambridge British materials scientists Living people 1967 births Alumni of Jesus College, Oxford
Jeremy Baumberg
[ "Materials_science" ]
529
[ "Metamaterials scientists", "Metamaterials" ]
54,801,854
https://en.wikipedia.org/wiki/The%20Architect%20and%20His%20Office
The Architect and His Office was a landmark report about the state of the British architectural profession in the early 1960s. It was commissioned by the Royal Institute of British Architects in 1960 and the report was published in February 1962 with an introduction by William Holford, then President of the institute. Running to more than 250 pages, the report examined architectural education, fees and salaries, and management and technical competence. It was based on extensive fieldwork including a questionnaire survey and visits to nearly 70 architects' offices of various kinds. Funding of slightly more than £11,000 was provided for the study by the Leverhulme Trust. According to Frank Duffy, in his book Architectural Knowledge: the Idea of a Profession, the study stands as one of the best studies of a profession ever carried out anywhere in the world (page 173). Duffy considers that the study did much to reform architectural practices in Britain, particularly by contributing to the development of the RIBA Plan of Work, a set of discrete stages of the architect's work from inception to completion. As well as each stage being precisely described, the stages also had a precise fraction of the total architect's fee associated with them. However, as Duffy also comments, the identification of the usual work pattern as 'normal services' also implied that many activities known as 'other services' (management, surveying, engineering) lay outside the work of the architect, and architects lost market share as a consequence of such exclusivity (page 179). Architectural education Architecture books
The Architect and His Office
[ "Engineering" ]
306
[ "Architectural education", "Architecture" ]
54,802,140
https://en.wikipedia.org/wiki/George%20H.%20Dodd
George H. Dodd was a biochemist who specialised in the study and production of perfumes and pheromones. He died on 14 December 2020. Career George Dodd studied at Trinity College Dublin, and obtained his D.Phil. at Oxford under the supervision of George Radda; his thesis described studies of structural transitions of the enzyme glutamate dehydrogenase using new methods based on fluorescence spectroscopy. He worked at Unilever before joining the University of Warwick in 1971. There he returned to his lifelong interest in olfaction, writing, for example, on the effect of odorants on enzyme activity. After leaving Warwick in 1994 he founded a smell biotechnology company, Kiotech, and later opened “The Perfume Studio”. References Year of birth missing (living people) Alumni of Trinity College Dublin Academics of the University of Warwick Perfumers
George H. Dodd
[ "Chemistry" ]
172
[ "Biochemistry stubs", "Biochemists", "Biochemist stubs" ]
54,802,226
https://en.wikipedia.org/wiki/Phyllis%20Zee
Phyllis C. Zee is the Benjamin and Virginia T. Boshes Professor in Neurology, the director of the Center for Circadian and Sleep Medicine (CCSM) and the chief of the Division of Sleep Medicine (neurology) at the Feinberg School of Medicine, Northwestern University, Chicago. She is also the medical director of the Sleep Disorders Center at Northwestern Memorial Hospital. Career As director of CCSM, Zee oversees an interdisciplinary program in basic and translational sleep and circadian rhythm research, and findings from her team have paved the way for innovative approaches to improve sleep and circadian health. Zee is the founder of the first circadian medicine clinic in the US, where innovative treatments are available for patients with circadian rhythm disorders. A central theme of her research program is to understand the role of circadian-sleep interactions in the expression and development of cardiometabolic and neurologic disorders. Zee's research has focused on the effects of age and neurodegeneration on sleep and circadian rhythms and the pathophysiology of circadian sleep-wake disorders. In addition, her laboratory is studying the effects of circadian-sleep-based interventions, such as exercise, bright light and feed-fast schedules, on cognitive, cardiovascular and metabolic functions and their potential to delay cardiometabolic aging and neurodegeneration. Recently her research team has also been interested in the use of acoustic and electrical neurostimulation to enhance slow wave sleep and memory in older adults. Zee has also authored more than 300 peer-reviewed original articles, reviews and chapters on the topics of sleep, circadian rhythms, and sleep/wake disorders. She has also trained over 50 pre-doctoral and post-doctoral students and has mentored numerous faculty members. Zee is a fellow of the American Academy of Sleep Medicine, fellow of the American Academy of Neurology and member of the American Neurological Association. 
She has served on numerous national and international committees, NIH scientific review panels, and international advisory boards. She is past president of the Sleep Research Society, past president of the Sleep Research Society Foundation and past chair of the NIH Sleep Disorders Research Advisory Board. Zee is a member of the NIH National Heart, Lung, and Blood Advisory Council. She is the recipient of the 2011 American Academy of Neurology Sleep Science Award and the 2014 American Academy of Sleep Medicine academic honor, the William C. Dement Academic Achievement Award. References External links http://www.sleepupdates.org/Faculty/PhyllisCZee,MD,PhD.aspx Living people Northwestern University faculty American neurologists Women neurologists American women neuroscientists American neuroscientists Year of birth missing (living people) Sleep researchers American women academics 21st-century American women Fellows of the American Academy of Neurology
Phyllis Zee
[ "Biology" ]
574
[ "Sleep researchers", "Behavior", "Sleep" ]
54,802,660
https://en.wikipedia.org/wiki/Jim%20Horne%20%28neuroscientist%29
James Anthony Horne (April 1946 - October 2023) was a British sleep neuroscientist and emeritus professor of psychophysiology at the School of Sport, Exercise and Health Sciences at Loughborough University. He was a regular commentator in the British media on the subject of sleep. References External links Personal website 1946 births Living people People educated at Wallington County Grammar School Academics of Loughborough University British neuroscientists Sleep researchers British physiologists
Jim Horne (neuroscientist)
[ "Biology" ]
96
[ "Sleep researchers", "Behavior", "Sleep" ]
54,802,851
https://en.wikipedia.org/wiki/Advanced%20Innovation%20Design%20Approach
Advanced Innovation Design Approach (AIDA) is a holistic approach for enhancing the innovative and competitive capabilities of industrial companies. The name Advanced Innovation Design Approach (AIDA) was proposed in the research project "Innovation Process 4.0", run at the University of Applied Sciences Offenburg, Germany in co-operation with 10 German industrial companies in 2015–2019. AIDA can be considered a pioneering mindset and an individually adaptable range of strong innovation techniques: a comprehensive front-end innovation process, advanced innovation methods, the best tools and methods of the theory of inventive problem solving (TRIZ), organisational measures for accelerating innovation, IT solutions for computer-aided innovation, and other tools for new product development elaborated in the recent decade in industry and academia. Initially, AIDA was conceptualised as a systemic approach comprising the analysis, optimization and further development of the innovation process and the promotion of the innovation climate in industrial companies. The innovation process, with self-configuration, self-optimization, self-diagnostics and intelligent information processing and communication, is understood as a holistic system comprising the following typical phases with feedback loops and simultaneous auxiliary or follow-up processes: uncovering of solution-neutral customer needs, technology and market trends; identification of the needs and problems with high market potential and formulation of the innovation tasks and strategy; systematic idea generation and problem solving; evaluation and enhancement of solution ideas; creation of innovation concepts based on solution ideas; evaluation of the innovation concepts; and implementation, validation and market launch of chosen innovation concepts. 
The Advanced Innovation Design Approach was refined and further developed for application in the field of process engineering in the context of the EU research project "Intensified by Design - Platform for the intensification of processes involving solids handling", within an international consortium of 22 universities, research institutes and industrial companies under the H2020 SPIRE programme. In 2020 the European Commission placed AIDA on its Innovation Radar as an innovation with high market potential. Principle of completeness As a holistic innovation approach, AIDA postulates complete problem analysis and comprehensive idea generation and problem solving. The problems faced by industry cannot be solved by a single eureka idea. The principle of completeness in new product development can be illustrated by the following four steps. The initial complex problem must be segmented into partial problems. The problem ranking method helps to identify the problems crucial for innovation success. The strongest TRIZ inventive principles replace random brainstorming, increasing the quality and quantity of ideas within a short period of time. For each partial problem several ideas must be generated. No relevant idea should be overlooked or lost. The complementary solution ideas are combined into solution concepts. A robust solution concept delivers solutions for all partial problems. The solution concepts often have secondary side effects, such as costs, risks or R&D expenditures, which must be limited through concept optimization. Another example demonstrates the principle of completeness in the phase of innovation strategy formulation. For the complete identification of existing and future customer needs or benefits, several complementary methods are used simultaneously (Tool 4. Innovation potential analysis): Voice-of-the-Customer methods, e.g. Lead User identification or web-based monitoring. 
Analysis of the customer working process (Process Mapping) Prediction of the customer needs. Analysis of system functions, and identification of new product features and innovation tasks from the patent literature. Analysis of market and technological trends, and others. AIDA innovation tools AIDA tools or apps most frequently used in practice include: Brainstorming 40x40: Generate 40 ideas with 40 enhanced TRIZ Inventive Principles (incl. 160 inventive sub-principles, 2017). Inno-Workshop: Tool for systematic problem solving and moderation of innovation workshops with TRIZ. TRIZ Inventor: Solving of bottleneck problems with the inventive algorithm ARIZ in its short form. Innovation potential analysis: comprehensive identification of innovation opportunities, customer benefits and segments with high market potential. New concept development: implementation of the selected innovation tasks (tool 04) into new concepts with high market potential. Root-conflict analysis and anticipatory failure identification: tool for eliminating harmful effects. Systematic and creative cost cutting: for products and processes. InnoMonitor: tool for continuous monitoring of the innovative capability of companies (80 parameters and 10 key performance indicators). Database of 200+ best-practice measures for enhancement of innovation capability. Rapid Cross Industry Innovation: an easy-to-use method for fast idea generation with the help of analogies and similarity rules (2019). AIDA Automatic Idea & IP Generator: a new app for fast and complete automatic idea generation based on 200 inventive principles (2020-23), tuned for applying ChatGPT or other GAI tools. Advanced Innovation Design Methods The new Advanced Innovation Methods are the basis for the further development of the AIDA tools or apps. 
The following list will be regularly updated: Advanced Design (2014) - Methods for the early stage of the innovation process, proposed by the research team of the Politecnico di Milano, Italy. Advanced design methods for successful innovation (2013) - new methods in the industrial design from the Dutch research platform Design United, Delft University of Technology Root Conflict Analysis RCA+ (2011) - universal method for comprehensive problem and contradiction analysis. The Business Model Navigator (2014) - engineering method for systematic business model innovation, containing 55 business models for creative copying and recombination. University of St. Gallen. New Product Blueprinting (2012) - The Advanced Innovation and Marketing (AIM) Institute. References Product development Systems engineering Innovation Industrial design TRIZ Science and technology studies
Advanced Innovation Design Approach
[ "Technology", "Engineering" ]
1,146
[ "Industrial design", "Systems engineering", "Design engineering", "Science and technology studies", "Design" ]
54,802,873
https://en.wikipedia.org/wiki/Roger%20Ekirch
Arthur Roger Ekirch (born February 6, 1950) is University Distinguished Professor of history at Virginia Tech in the United States. He was a Guggenheim fellow in 1998. The son of intellectual historian Arthur A. Ekirch Jr. and Dorothy Gustafson, Roger Ekirch is internationally known for his pioneering research into pre-industrial sleeping patterns that was first published in "Sleep We Have Lost: Pre-Industrial Slumber in the British Isles" and later in his award-winning 2005 book At Day's Close: Night in Times Past. Selected publications Books "Poor Carolina": Politics and society in Colonial North Carolina, 1729–1776, University of North Carolina Press, 1981. Bound for America: The Transportation of British Convicts to the Colonies, 1718–1775, Oxford University Press, 1987. At Day's Close: Night in Times Past, W.W. Norton, 2005. Birthright: The True Story of the Kidnapping of Jemmy Annesley, W.W. Norton, 2010. American Sanctuary: Mutiny, Martyrdom, and National Identity in the Age of Revolution, Pantheon, 2017. La Grande Transformation du Sommeil: Comment la Revolution Industrielle a Bouleversé Nos Nuits, Editions Amsterdam, 2021. Articles "Sleep We Have Lost: Pre-Industrial Slumber in the British Isles", The American Historical Review, 2001. "The Modernization of Western Slumber: Or, Does Insomnia Have a History?", Past & Present, 2015. "Segmented Sleep in Preindustrial Societies", Sleep, 2016. "What Sleep Research Can Learn From History", Sleep Health, 2018. See also Biphasic and polyphasic sleep References External links Rethinking Sleep (NY Times) Segmented sleep (Harpers) 21st-century American historians American male non-fiction writers People from Washington, D.C. Dartmouth College alumni Johns Hopkins University alumni Virginia Tech faculty Sleep researchers Living people 1950 births 21st-century American male writers
Roger Ekirch
[ "Biology" ]
405
[ "Sleep researchers", "Behavior", "Sleep" ]
63,291,291
https://en.wikipedia.org/wiki/Marcy%20Zenobi-Wong
Marcy Zenobi-Wong is an American engineer and professor of Tissue Engineering and Biofabrication at the Swiss Federal Institute of Technology (ETH Zurich). She is known for her work in the field of tissue engineering. Education and career Zenobi-Wong completed her undergraduate degree in mechanical engineering at the Massachusetts Institute of Technology, and a graduate degree at Stanford University. She completed her PhD on the role of mechanical forces in skeletal development in 1990. After this, she first worked for a year as a postdoc in the Orthopaedic Research Laboratories at the University of Michigan, before moving to the University of Bern as group leader for Cartilage Biomechanics in 1992, where she habilitated in 2000. In 2003, she moved to ETH Zürich, first to the Institute for Biomedical Engineering, and later to the Department of Health Sciences and Technology, where she became an associate professor in 2017. Work Zenobi-Wong works in the area of tissue engineering, in particular for cartilage regeneration. She develops functional biomaterials which mimic the extracellular matrix. The biofabrication techniques used to develop these materials include electrospinning, casting, two-photon polymerization and bioprinting. Zenobi-Wong holds four licensed patents in the fields of tissue engineering, tissue engineering techniques, and gene expression assays. She was one of the originators of the MSc Biomedical Engineering program at ETH Zürich, and developed several graduate-level courses in tissue engineering and biomedical engineering. Zenobi-Wong currently serves as president of the Swiss Society for Biomaterials and Regenerative Medicine, and as secretary general of the International Society of Biofabrication. 
References External links ETH Zürich Department of Health Sciences and Technology - Tissue Engineering and Biofabrication Group Living people Biomaterials Tissue engineering Academic staff of ETH Zurich 1963 births 21st-century American engineers 21st-century American educators MIT School of Engineering alumni Stanford University alumni American women engineers American women academics 21st-century American women
Marcy Zenobi-Wong
[ "Physics", "Chemistry", "Engineering", "Biology" ]
415
[ "Biomaterials", "Biological engineering", "Cloning", "Chemical engineering", "Materials", "Tissue engineering", "Matter", "Medical technology" ]
63,292,069
https://en.wikipedia.org/wiki/Chemical%20Workers%27%20Union%20%28Finland%29
The Chemical Workers' Union (, KTL) was a trade union representing workers in the chemical industry in Finland. The union was founded in 1970, with the merger of the Finnish General Workers' Union and many workers from the General and Speciality Workers' Union. The new union affiliated to the Central Organisation of Finnish Trade Unions. By the 1980s, the union was keen to collaborate with others in the light industries, and in 1990, it began investigating a merger with the Rubber and Leather Workers' Union. The two eventually merged in 1993, with a new Chemical Union founded on 24 October. References Chemical industry trade unions Trade unions in Finland Trade unions established in 1970 Trade unions disestablished in 1993
Chemical Workers' Union (Finland)
[ "Chemistry" ]
141
[ "Chemical industry trade unions" ]
63,294,746
https://en.wikipedia.org/wiki/Susan%20Sinnott
Susan Buthaina Sinnott is professor and head of materials science and engineering at Pennsylvania State University. Sinnott is a fellow of the Materials Research Society (MRS), the American Association for the Advancement of Science (AAAS) and the American Physical Society (APS). She has served as editor-in-chief of the journal Computational Materials Science since 2014. Early life and education Sinnott received a Bachelor of Science in chemistry at the University of Texas at Austin. She moved to Iowa State University for her graduate studies, and earned her doctoral degree in physical chemistry in 1993. Research and career After graduating, Sinnott moved to the United States Naval Research Laboratory, where she worked on surface chemistry. After two years at the Naval Research Laboratory, Sinnott was appointed an assistant professor at the University of Kentucky. In 2000 she was recruited to the University of Florida as an associate professor. Sinnott was promoted to professor at the University of Florida in 2005, where she led projects on cyberinfrastructure and quantum theory. In 2015 Sinnott was appointed head of Materials Science and Engineering at the Pennsylvania State University. Sinnott's research involves the development of computational methods to understand the electronic and atomic structure of materials. Her computational models include continuum-level modelling and fluid dynamics, and take into account material behaviour at the nanoscale. She has investigated the formation and role of grain boundaries, dopants, defects and heterogeneous interfaces. Her research has considered perovskites, showing that the alignment or tilting of the perovskite oxygen cages impacts the material's properties. Sinnott has served as editor-in-chief of the scientific journal Computational Materials Science since 2014. 
Her principal research interests at Penn State University include two-dimensional and nano-structured materials, gas adsorption and separation in porous solid materials, and condensed matter physics. Selected awards and honours Her awards include: 2005 Elected a fellow of the American Vacuum Society 2009 Distinguished editor of the Physical Review Letters 2010 Elected a fellow of the American Association for the Advancement of Science 2011 Elected a fellow of the American Ceramic Society 2012 Elected a fellow of the Materials Research Society 2013 Elected a fellow of the American Physical Society 2013 Top 25 Women Professors in Florida Selected publications Her publications include A second-generation reactive empirical bond order (REBO) potential energy expression for hydrocarbons Model of carbon nanotube growth through chemical vapor deposition Carbon nanotubes: synthesis, properties, and applications Effect of chemical functionalization on the mechanical properties of carbon nanotubes References University of Texas at Austin alumni Pennsylvania State University faculty Women materials scientists and engineers American materials scientists Iowa State University alumni University of Kentucky faculty University of Florida faculty Fellows of the American Association for the Advancement of Science Fellows of the American Physical Society Year of birth missing (living people) Living people
Susan Sinnott
[ "Materials_science", "Technology" ]
559
[ "Women materials scientists and engineers", "Materials scientists and engineers", "Women in science and technology" ]
63,295,596
https://en.wikipedia.org/wiki/Janson%20inequality
In the mathematical theory of probability, Janson's inequality is a collection of related inequalities giving an exponential bound on the probability of many related events happening simultaneously by their pairwise dependence. Informally, Janson's inequality involves taking a sample of many independent random binary variables and a set of subsets of those variables, and bounding the probability that the sample will contain any of those subsets by their pairwise correlation. Statement Let $\Gamma$ be our set of variables. We intend to sample these variables according to probabilities $(p_i)_{i \in \Gamma}$. Let $R \subseteq \Gamma$ be the random subset of $\Gamma$ that includes $i$ with probability $p_i$. That is, independently, $\Pr[i \in R] = p_i$ for every $i \in \Gamma$. Let $S = \{s_1, s_2, \ldots, s_k\}$ be a family of subsets of $\Gamma$. We want to bound the probability that any $s_i$ is a subset of $R$. We will bound it using the expectation of the number of $s_i$ such that $s_i \subseteq R$, which we call $\lambda$, and a term from the pairwise probability of being in $R$, which we call $\Delta$. For $s_i \in S$, let $X_i$ be the random variable that is one if $s_i \subseteq R$ and zero otherwise. Let $X$ be the random variable of the number of sets in $S$ that are inside $R$: $X = \sum_{i=1}^{k} X_i$. Then we define the following variables: $\lambda = \mathbb{E}[X] = \sum_{i=1}^{k} \Pr[s_i \subseteq R], \qquad \Delta = \frac{1}{2} \sum_{\substack{i \neq j \\ s_i \cap s_j \neq \emptyset}} \mathbb{E}[X_i X_j].$ Then the Janson inequality is: $\Pr[X = 0] \leq e^{-\lambda + \Delta}$ and, if $\Delta \geq \lambda/2$, $\Pr[X = 0] \leq e^{-\lambda^2/(4\Delta)}$. Tail bound Janson later extended this result to give a tail bound on the probability of only a few sets being subsets. Let $0 \leq t \leq \lambda$ give the distance from the expected number of subsets. Let $\bar{\Delta} = \lambda + 2\Delta$. Then we have $\Pr[X \leq \lambda - t] \leq e^{-t^2/(2\bar{\Delta})}.$ Uses Janson's inequality has been used in pseudorandomness for bounds on constant-depth circuits. Research leading to these inequalities was originally motivated by estimating chromatic numbers of random graphs. References Probabilistic inequalities
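As an illustration (an example added here, not part of the original article), the bound can be checked numerically on a toy case: triangles in a random subgraph of $K_4$ with edge probability $1/2$, using the standard definitions of $\lambda$ (the expected number of sets contained in the sample) and $\Delta$ (half the sum of joint containment probabilities over ordered pairs of distinct overlapping sets). A sketch in Python, with illustrative variable names:

```python
from itertools import combinations, product
from math import exp

# Ground set Gamma: the 6 edges of K4, each kept independently with probability p.
vertices = range(4)
edges = list(combinations(vertices, 2))
# The family S: the 4 triangles of K4, each a set of 3 edges.
triangles = [frozenset(combinations(t, 2)) for t in combinations(vertices, 3)]
p = 0.5

# lambda: expected number of triangles fully contained in the random subset.
lam = sum(p ** len(T) for T in triangles)

# Delta: half the sum of E[X_i X_j] over ordered pairs of distinct,
# edge-sharing triangles (any two triangles of K4 share exactly one edge).
delta = 0.5 * sum(p ** len(S | T)
                  for S in triangles for T in triangles
                  if S != T and S & T)

# First Janson inequality: P(no triangle) <= exp(-lambda + Delta).
janson_bound = exp(-lam + delta)

# Exact P(no triangle) by enumerating all 2^6 edge subsets (uniform for p = 1/2).
count = sum(1 for keep in product([0, 1], repeat=len(edges))
            if not any(T <= {e for e, k in zip(edges, keep) if k}
                       for T in triangles))
exact = count / 2 ** len(edges)

print(lam, delta, janson_bound, exact)
```

Here $\lambda = 0.5$ and $\Delta = 0.1875$, so the bound is $e^{-0.3125} \approx 0.732$, while the exact triangle-free probability is $41/64 \approx 0.641$, consistent with the inequality.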
Janson inequality
[ "Mathematics" ]
327
[ "Theorems in probability theory", "Probabilistic inequalities", "Inequalities (mathematics)" ]
63,295,970
https://en.wikipedia.org/wiki/Haar%27s%20Tauberian%20theorem
In mathematical analysis, Haar's Tauberian theorem named after Alfréd Haar, relates the asymptotic behaviour of a continuous function to properties of its Laplace transform. It is related to the integral formulation of the Hardy–Littlewood Tauberian theorem. Simplified version by Feller William Feller gives the following simplified form for this theorem: Suppose that is a non-negative and continuous function for , having finite Laplace transform for . Then is well defined for any complex value of with . Suppose that verifies the following conditions: 1. For the function (which is regular on the right half-plane ) has continuous boundary values as , for and , furthermore for it may be written as where has finite derivatives and is bounded in every finite interval; 2. The integral converges uniformly with respect to for fixed and ; 3. as , uniformly with respect to ; 4. tend to zero as ; 5. The integrals and converge uniformly with respect to for fixed , and . Under these conditions Complete version A more detailed version is given in. Suppose that is a continuous function for , having Laplace transform with the following properties 1. For all values with the function is regular; 2. For all , the function , considered as a function of the variable , has the Fourier property ("Fourierschen Charakter besitzt") defined by Haar as for any there is a value such that for all whenever or . 3. The function has a boundary value for of the form where and is an times differentiable function of and such that the derivative is bounded on any finite interval (for the variable ) 4. The derivatives for have zero limit for and for has the Fourier property as defined above. 5. For sufficiently large the following hold Under the above hypotheses we have the asymptotic formula References Tauberian theorems
Haar's Tauberian theorem
[ "Mathematics" ]
379
[ "Theorems in mathematical analysis", "Tauberian theorems" ]
63,296,031
https://en.wikipedia.org/wiki/Filociclovir
Filociclovir (cyclopropavir, MBX-400) is an antiviral drug which was developed for the treatment of cytomegalovirus infection and also shows some activity against other double-stranded DNA viruses. It has reached Phase II human clinical trials. References Antiviral drugs
Filociclovir
[ "Biology" ]
66
[ "Antiviral drugs", "Biocides" ]
63,297,156
https://en.wikipedia.org/wiki/LTT%203780
LTT 3780, also known as TOI-732 or LP 729-54, is the brighter component of a wide visual binary star system in the constellation Hydra. This star is host to a pair of orbiting exoplanets. Based on parallax measurements, it is located at a distance of 72 light years from the Sun. LTT 3780 has an apparent visual magnitude of 13.07, requiring a telescope to view. The spectrum of LTT 3780 presents as a small M-type main-sequence star, a red dwarf, with a stellar classification of M3.5 V. It is spinning very slowly, with a rotation period of 104 days. The abundance of iron, an indicator of the star's metallicity, appears higher than in the Sun. The star is inactive, showing a negligible level of magnetic activity in its chromosphere. It has about 40% of the mass and 37% of the radius of the Sun. The star is radiating just 1.7% of the Sun's luminosity from its photosphere at an effective temperature of 3,331 K. Collectively designated LDS 3977, the two stars in this system share a common proper motion and have an angular separation of , which corresponds to a (physical) projected separation of . At this separation, the orbital period would be ~9,100 years. The fainter member is a red dwarf with a class of M5.0 V. It has 14% of the mass of the Sun and 17% of the Sun's radius. Planetary system In 2020, an analysis carried out by a team of astronomers led by Ryan Cloutier of the TESS project confirmed the existence of two planets on mildly eccentric orbits, the inner being a super-Earth and the outer a small gas planet about half the mass of Uranus. LTT 3780 c Astronomers utilizing the Gemini South 8.1-meter telescope performed an atmospheric survey of LTT 3780 c through high-resolution transmission spectroscopy. From observations during a single transit, they detected tentative signs of methane in the atmosphere but found no traces of ammonia, even though it is highly detectable in a cloud-free, hydrogen-rich atmosphere.
See also List of extrasolar planets List of multiplanetary systems References M-type main-sequence stars Planetary systems with two confirmed planets Planetary transit variables Hydra (constellation) 732
LTT 3780
[ "Astronomy" ]
486
[ "Hydra (constellation)", "Constellations" ]
63,298,152
https://en.wikipedia.org/wiki/Informal%20housing
Informal housing or informal settlement can include any form of housing, shelter, or settlement (or lack thereof) which is illegal, falls outside of government control or regulation, or is not afforded protection by the state. As such, the informal housing industry is part of the informal sector. To have informal housing status is to exist in "a state of deregulation, one where the ownership, use, and purpose of land cannot be fixed and mapped according to any prescribed set of regulations or the law". While there is no global unified law of property-ownership, the informal occupant or community will typically lack security of tenure and, with this, ready or reliable access to civic amenities (potable water, electricity and gas supply, road creation and maintenance, emergency services, sanitation and waste collection). Due to the informal nature of occupancy, the state will typically be unable to extract rent or land taxes. The term "informal housing" is useful in capturing the informal population other than those living in slum settlements or shanty towns. UN-Habitat more narrowly defines slum housing as lacking at least one of the following criteria: durability, sufficient living space, safe and accessible water, adequate sanitation, and security of tenure. Common categories or terms associated with informal housing include: slums, shanty towns, squats, homelessness, backyard housing and pavement dwellers. In developing countries People around the world face issues of homelessness and insecurity of tenure. However, particularly pernicious circumstances may obtain in developing countries, leading to a large proportion of the population resorting to informal housing. 
According to Saskia Sassen, in the race to become a "global city" with the requisite state-of-the-art economic and regulatory platforms for handling the operations of international firms and markets, radical physical interventions in the fabric of the city are often called for, displacing "modest, low-profit firms and households". Persistent conflict and insecurity can also weaken the institutions that would record and formalize housing transactions. For instance, until 1991 municipal officials possessed a registry of land in Mogadishu, Somalia. But these records are now held by a diasporic Somali living in Sweden, who charges a fee to verify land deeds. If households lack the economic resilience to repurchase in the same area or to relocate to a place that offers similar economic opportunity, they are prime candidates for informal housing. For example, in Mumbai, India, fast-paced economic growth, coupled with inadequate infrastructure, endemic corruption and the legacy of restrictive tenancy laws have left the city unable to house the estimated 54% who now live informally. Informal housing is often built incrementally, as householders acquire the resources, time and security to build additions and enhancements. Many cities in the developing world are experiencing a rapid increase in informal housing, driven by mass migration to cities in search of employment or fleeing from war or environmental disaster. According to Robert Neuwirth, there are over 1 billion (one in seven) squatters worldwide. If current trends continue, this will increase to 2 billion by 2030 (one in four), and 3 billion by 2050 (one in three). In African cities, between half and three-quarters of new housing is developed on informally acquired land. Informal homes, and the often informal livelihoods that accompany them, are set to be defining features of the cities of the future. In the United States Informal housing can also be found in developed countries like the United States. 
Unpermitted secondary units are seen as informal housing. In 2012, among the total stock of approximately 462,000 single-family homes in Los Angeles, California, there were estimated to be close to 50,000 unpermitted secondary units. See also Affordable housing Informal sector Right to housing Subsidized housing References Urban planning Squatting Housing Human settlement External links Solving the Housing Crisis Half-a-House at a Time: Incremental Housing as a Means to Fulfilling the Human Right to Housing (U. Miami Inter-Am. L. Rev. 2021) - Law Review article reviewing legal implementation of incremental housing in Chile and the United States
Informal housing
[ "Engineering" ]
860
[ "Urban planning", "Architecture" ]
63,298,661
https://en.wikipedia.org/wiki/Fenofibrate/simvastatin
Fenofibrate/simvastatin, sold under the brand name Cholib, is a fixed-dose combination medication used, in combination with a low-fat diet and exercise, to treat abnormal blood lipid levels. It contains fenofibrate and simvastatin. It was approved for use in the European Union in August 2013. Medical uses Fenofibrate/simvastatin is indicated as adjunctive therapy to diet and exercise in adults at high cardiovascular risk with mixed dyslipidemia, to reduce triglycerides and increase HDL-C levels when LDL-C levels are adequately controlled with the corresponding dose of simvastatin monotherapy. Adverse effects References Further reading 2-Methyl-2-phenoxypropanoic acid derivatives 17β-Hydroxysteroid dehydrogenase inhibitors Benzophenones Carboxylate esters Chloroarenes Combination lipid-lowering drugs Isopropyl esters Lactones Neuroprotective agents Prodrugs Secondary alcohols Statins Tetrahydropyrans
Fenofibrate/simvastatin
[ "Chemistry" ]
219
[ "Chemicals in medicine", "Prodrugs" ]
63,299,027
https://en.wikipedia.org/wiki/Frank%20Calegari
Francesco Damien "Frank" Calegari is a professor of mathematics at the University of Chicago working in number theory and the Langlands program. Early life and education Frank Calegari was born on December 15, 1975. He has both Australian and American citizenship. He won a bronze medal and a silver medal at the International Mathematical Olympiad while representing Australia in 1992 and 1993 respectively. Calegari received his PhD in mathematics from the University of California, Berkeley in 2002 under the supervision of Ken Ribet. Career Calegari was a Benjamin Peirce Assistant Professor at Harvard University from 2002 to 2006. He then moved to Northwestern University, where he was an assistant professor from 2006 to 2009, an associate professor from 2009 to 2012, and a professor from 2012 to 2015. He has been a professor of mathematics at the University of Chicago since 2015. Calegari was a von Neumann Fellow of mathematics at the Institute for Advanced Study from 2010 to 2011. Calegari was an editor at Mathematische Zeitschrift from 2013 to 2021. He has been an editor of Algebra & Number Theory and an associate editor of the Annals of Mathematics since 2019. Research Calegari works in algebraic number theory, including Langlands reciprocity and torsion classes in the cohomology of arithmetic groups. In collaboration with Vesselin Dimitrov and Yunqing Tang, Calegari proved the unbounded denominators conjecture of A.O.L. Atkin and Swinnerton-Dyer: if a modular form is not modular for some congruence subgroup of the modular group, then the Fourier coefficients of the form have unbounded denominators. It has been known for decades that if a form is modular for some congruence subgroup, then its coefficients have bounded denominators. Also in collaboration with Dimitrov and Tang, he proved the linear independence of and Awards Calegari held a 5-year American Institute of Mathematics Fellowship from 2002 to 2006 and a Sloan Research Fellowship from 2009 to 2012.
He was inducted as a Fellow of the American Mathematical Society in 2013. Selected publications Personal life Mathematician Danny Calegari is Frank Calegari's brother. References External links 20th-century Australian mathematicians 21st-century American mathematicians Number theorists Living people Place of birth missing (living people) University of Chicago faculty UC Berkeley College of Letters and Science alumni Institute for Advanced Study visiting scholars International Mathematical Olympiad participants 1975 births
Frank Calegari
[ "Mathematics" ]
489
[ "Number theorists", "Number theory" ]
63,300,462
https://en.wikipedia.org/wiki/Fritz%20Zwicky%20Prize%20for%20Astrophysics%20and%20Cosmology
The Fritz Zwicky Prize for Astrophysics and Cosmology is awarded biennially to a living person who, in the estimation of the judges, "has obtained fundamental and outstanding results related to astrophysics and/or cosmology". These results may constitute a body of work over a period of time or may be a single specific result. The Prize was established in 2020 and is awarded by the European Astronomical Society (EAS) on behalf of the Fritz Zwicky Foundation, located in Glarus, Switzerland. Recipients are invited to deliver a plenary lecture at the following EAS Annual Meeting. Recipients See also List of astronomy awards References External links Astronomy prizes Physical cosmology Awards established in 2020
Fritz Zwicky Prize for Astrophysics and Cosmology
[ "Physics", "Astronomy", "Technology" ]
145
[ "Astronomy prizes", "Theoretical physics", "Astrophysics", "Physical cosmology", "Science and technology awards", "Astronomical sub-disciplines" ]
63,303,180
https://en.wikipedia.org/wiki/2MASS%20J11263991%E2%88%925003550
2MASS J11263991−5003550 (2MASS J1126−5003) is a brown dwarf about 53 light-years distant from Earth. The brown dwarf is notable for an unusual blue near-infrared color. This brown dwarf does not show subdwarf features, and the blue color cannot be explained by an unresolved binary. Instead the blue color is explained by patchy clouds. The patchy cloud model allows thick clouds and a cloud coverage of 50% to explain the spectra of 2MASS J1126−5003. Other blue L-dwarfs exist, but are quite rare. 2MASS J1126−5003 has a deep water (H2O) absorption feature in its spectra, which is comparable with late L-dwarfs and early T-dwarfs. It also shows weak carbon monoxide (CO) features. It lacks any methane (CH4) feature and is therefore not a T-dwarf. Based on near-infrared spectra this brown dwarf was therefore classified as an L9 spectral type brown dwarf. The optical spectrum, on the other hand, is more similar to that of a mid-type L-dwarf. Here a spectral type of L4.5 fits the optical spectrum. This optical spectral type is a more reliable estimate, as the near-infrared spectrum does not fit spectra from other L-dwarfs. Lower metallicity and higher surface gravity might play a role in the formation of the weather on 2MASS J1126−5003. Lower metallicity reduces the available metal species to form cloud condensates. The higher surface gravity might cause an increased sedimentation of cloud condensates, resulting in thinner clouds. Other factors, like rotation, vertical upwelling and magnetic fields might play a role as well. One previously suggested scenario was thinner clouds. This brown dwarf shows variations in the J-band and at mid-infrared wavelengths with a period of 3.2 ± 0.3 hours. This is a clear indication of patchy clouds. References Centaurus L-type brown dwarfs Brown dwarfs J11263991−5003550
2MASS J11263991−5003550
[ "Astronomy" ]
429
[ "Centaurus", "Constellations" ]
63,303,989
https://en.wikipedia.org/wiki/Vanessa%20Allen%20Sutherland
Vanessa Lorraine Allen Sutherland is a corporate lawyer and former chairperson of the U.S. Chemical Safety and Hazard Investigation Board (CSB). Early life Sutherland was born at Sibley Memorial Hospital in Washington, D.C. She lived in Tantallon, Maryland, where she attended Queen Anne School. She graduated from high school at the age of 16 and enrolled at Drew University, where she received a B.A. in political science and art history, and later attended American University, where she received a J.D. and M.B.A. After graduating from college, she moved to Fort Washington, Maryland. Career After graduating from Drew, Sutherland worked at the office of the Inspector General of the Department of Energy prior to attending law school. While attending American University, she served as an associate at Federal Deposit Insurance Corporation and a clerk at Fulbright & Jaworski. After graduating from law school, she worked as a corporate attorney at the telecommunications company MCI Inc. At this company, she became vice president and deputy general counsel of Digex, a subsidiary. She later worked as a counsel for the tobacco product producer Altria (formerly Philip Morris Companies, Inc.). In 2011, Sutherland began government service as chief counsel for the Pipeline and Hazardous Materials Safety Administration. Chemical Safety Board Sutherland was nominated by President Barack Obama to the U.S. Chemical Safety Board in March 2015 after the resignation of Rafael Moure-Eraso over allegations of mismanagement. She was confirmed by the Senate in August 2015. In 2017, Sutherland was chairperson of the agency when the Trump administration attempted to defund the CSB for the 2018 United States federal budget. In March 2018, the Office of Management and Budget informed Sutherland that the Trump administration had again proposed to shut down the agency as part of the 2019 United States federal budget. 
This caused Sutherland to resign despite having two years left in her five-year term. After leaving the CSB, Sutherland joined Norfolk Southern Railway as a vice president. The agency was ultimately not defunded after the House Appropriations Committee opposed the Trump administration's proposal and proposed a $1 million increase in the agency's 2019 budget. Kristen Kulinowski became the interim executive after Sutherland's departure until Katherine Lemos was confirmed as chair in March 2020. The CSB says it closed thirteen incident investigations under Sutherland. References External links Hearing on the nomination of Vanessa Sutherland to be a member and chairperson of the Chemical Safety Board from the U.S. Government Publishing Office (PDF) United States Chemical Safety and Hazard Investigation Board Drew University alumni American University alumni Obama administration personnel Corporate lawyers Living people Year of birth missing (living people)
Vanessa Allen Sutherland
[ "Chemistry" ]
537
[ "United States Chemical Safety and Hazard Investigation Board" ]
56,324,635
https://en.wikipedia.org/wiki/FRAXD
FRAXD is the gene symbol for the fragile site, aphidicolin type, common, fra(X)(q27.2) D. The locus lies at a fragile site on the q arm of chromosome X, at position 27.2. It is used in genetic testing in Homo sapiens (humans). References Genes on human chromosome X Proteins
FRAXD
[ "Chemistry" ]
83
[ "Biomolecules by chemical classification", "Proteins", "Molecular biology" ]
56,325,079
https://en.wikipedia.org/wiki/Kyropoulos%20method
The Kyropoulos method, also known as the KY method or Kyropoulos technique, is a method of bulk crystal growth used to obtain single crystals. The largest application of the Kyropoulos method is to grow large boules of single crystal sapphire used to produce substrates for the manufacture of gallium nitride-based LEDs, and as a durable optical material. History The method is named for Kyropoulos, who proposed the technique in 1926 as a method to grow brittle alkali halide and alkaline earth metal crystals for precision optics. The method was a response to the limited boule sizes attainable by the Czochralski and Verneuil methods at the time. The Kyropoulos method was applied to sapphire crystal growth in the 1970s in the Soviet Union. The method The feedstock is melted in a crucible. (For sapphire crystal growth, the feedstock is high-purity aluminum oxide—only a few parts per million of impurities—which is then heated above 2100 °C in a tungsten or molybdenum crucible.) A precisely oriented seed crystal is dipped into the molten material. The seed crystal is slowly pulled upwards and may be rotated simultaneously. By precisely controlling the temperature gradients, rate of pulling and rate of temperature decrease, it is possible to produce a large, single-crystal, roughly cylindrical ingot from the melt. In contrast with the Czochralski method, the Kyropoulos technique crystallizes the entire feedstock volume into the boule. The size and aspect ratio of the crucible are close to those of the final crystal, and the crystal grows downward into the crucible, rather than being pulled up and out of the crucible as in the Czochralski method. The upward pulling of the seed is at a much slower rate than the downward growth of the crystal, and serves primarily to shape the meniscus of the solid-liquid interface via surface tension. The growth rate is controlled by slowly decreasing the temperature of the furnace until the entire melt has solidified.
Hanging the seed from a weight sensor can provide feedback to determine the growth rate, although precise measurements are complicated by the changing and imperfect shape of the crystal diameter, the unknown convex shape of the solid-liquid interface, and these features' interaction with buoyant forces and convection within the melt. The Kyropoulos method is characterized by smaller temperature gradients at the crystallization front than the Czochralski method. Like the Czochralski method, the crystal grows free of any external mechanical shaping forces, and thus has few lattice defects and low internal stress. This process can be performed in an inert atmosphere, such as argon, or under high vacuum. Advantages The major advantages include technical simplicity of the process and possibility to grow crystals with large sizes (≥30 cm). The method also shows low dislocation density. Disadvantages The most significant disadvantage of the method is an unstable speed of growth which happens due to heat exchange changes incurred by a growing boule size and which are difficult to predict. Due to this problem the crystals are typically grown at very slow speed in order to avoid unnecessary internal defects. Application Currently the method is used by several companies around the world to produce sapphire for the electronics and optics industries. Crystal sizes The sizes of sapphire crystals grown by the Kyropoulos method have increased dramatically since the 1980s. In the mid-2000s sapphire crystals up to 30 kg were developed which could yield 150 mm diameter substrates. By 2017, the largest reported sapphire grown by the Kyropoulos method was 350 kg, and could produce 300 mm diameter substrates. Because of sapphire's anisotropic crystal structure, the orientation of the cylindrical axis of the boules grown by the Kyropoulos method is perpendicular to the orientation required for deposition of GaN on the LED substrates. 
This means that cores must be drilled through the sides of the boule before being sliced into wafers. As a result, the as-grown boules have a significantly larger diameter than the resulting wafers. As of 2017 the leading manufacturers of blue and white LEDs used 150 mm diameter sapphire substrates, with some manufacturers still using 100 mm and 2-inch substrates. See also Bridgman–Stockbarger method Monocrystalline silicon Float-zone silicon Laser-heated pedestal growth Micro-pulling-down References External links Crystal growth technique summaries Semiconductor growth Industrial processes Crystals Methods of crystal growth
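The boule masses quoted above can be sanity-checked with a rough geometric estimate. The Python sketch below makes two labeled assumptions: an idealized cylindrical boule (real Kyropoulos boules are not perfect cylinders) and a sapphire density of about 3.98 g/cm³; the 45 cm dimensions are a hypothetical size chosen so that a 300 mm diameter core could be drilled through the boule's side, not a figure from the article.

```python
import math

RHO_SAPPHIRE = 3.98  # g/cm^3, approximate density of single-crystal Al2O3

def boule_mass_kg(diameter_cm: float, height_cm: float) -> float:
    """Mass of an idealized cylindrical boule in kg.

    Real boules deviate from a perfect cylinder, so this is only a
    rough consistency check, not a production estimate.
    """
    volume_cm3 = math.pi * (diameter_cm / 2) ** 2 * height_cm
    return RHO_SAPPHIRE * volume_cm3 / 1000.0

# Hypothetical boule sized so a 30 cm (300 mm) diameter core can be
# drilled through its side: both height and diameter must exceed 30 cm.
mass = boule_mass_kg(45, 45)
print(f"~{mass:.0f} kg")
```

An idealized 45 cm × 45 cm cylinder comes out near 285 kg, the same order of magnitude as the 350 kg crystals reported above, which is consistent with boules of that mass yielding 300 mm substrates.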
Kyropoulos method
[ "Chemistry", "Materials_science" ]
899
[ "Crystallography", "Crystals", "Methods of crystal growth" ]
56,325,455
https://en.wikipedia.org/wiki/High%20performance%20organization
The high performance organization (HPO) is a conceptual framework for organizations that leads to improved, sustainable organizational performance. It is an alternative model to the bureaucratic model known as Taylorism. There is not a clear definition of the high performance organization, but research shows that organizations that fit this model all hold a common set of characteristics. Chief among these is the ability to recognize the need to adapt to the surroundings that the organization operates in. High performance organizations can quickly and efficiently change their operating structure and practices to meet needs. These organizations focus on long-term success while delivering on actionable short-term goals. These organizations are flexible, customer-focused, and able to work highly effectively in teams. The culture and management of these organizations support flatter hierarchies, teamwork, diversity, and adaptability to the environment, all of which are of paramount importance to this type of organization. Compared to other organizations, high performance organizations spend much more time on continuously improving their core capabilities and invest in their workforce, leading to increased growth and performance. High performance organizations are sometimes labeled as high commitment organizations. History World War II ushered in a great increase in manufacturing and industrial production. With this came an increased concern over the human impact of work. The Hawthorne studies were part of the reason why more importance was placed on considering the human impact of work. During this period, industrial manufacturers followed the standardized large-scale production method, characterized by mass production, scientific management, and stringent division of labor. This led to increased boredom among blue-collar workers who would do the same repetitive job on a daily basis.
Management in this period was characterized by careful and calculated monitoring, which would cause workers to feel a sense of distrust. By the 1960s, management in the industrial manufacturing industries had difficulty attracting and retaining its workforce. During the 1960s there was a push for job enrichment. This grew out of the sociotechnical systems approach to work, which was pioneered by the Tavistock Institute. This system is characterized by the open systems model and self-directed work teams, which are also key to the success of a high performance organization. Research on the sociotechnical systems approach to work has shown that this approach is related to increased employee satisfaction and motivation. Another important step towards the high performance organization was the Japanese revolution in manufacturing, which pointed out another flaw in the scientific model of production. Because workers were so focused on only doing one monotonous task, they were not aware of the bigger picture. Most employees were completely unaware of the quality of the products that they were producing. The focus that Japanese manufacturing companies put on quality, through their early quality circles, eventually led to the implementation of total quality management, which is a key factor in producing quality products that meet consumer demands at low price points. Another reason for the move away from the older, highly bureaucratic approach towards the high performance organization was the rapid change in the business environment since the 1980s. The 1980s were characterized by difficulty in American production due to increased competition from foreign firms, sharply rising oil prices, and a decrease in productivity. This change was characterized by increased globalization, an increase in diversity in the workplace, large technological advances, and increased competition.
To better meet the demands of the changing marketplace, organizations first tried to implement increased technological innovation in their production facilities in order to regain the competitive advantage. These companies soon came to realize that the human factor was also necessary in regaining their competitive footing. The realization of the importance of human factors in work has led organizations to rely on the high performance organization to drive production and increase their employees' quality of work. Characteristics Organizational design High performance organizations value teamwork and collaboration as priorities in their organizational design. These organizations flatten organizational hierarchies and make it easier for cross-functional collaboration to occur. They do this by reducing barriers between functional units and getting rid of complex organizational bureaucracies. In an HPO, relationships are strengthened among employees who perform distinct functions or who previously worked only within a given business silo, which improves organizational performance. This is particularly evident in organizations that exhibit highly interdependent work, such as hospitals. High performance organizations value sharing of information at all levels by incentivizing information sharing in both bottom-up and top-down processes. The design is also very malleable and can adjust to both external and internal concerns. Teams The most apparent difference in the organizational design of HPOs is their reliance on teamwork. Teams operate semi-autonomously to set schedules, manage quality, and solve problems. These self-directed work teams thrive on information sharing from all levels of the organization and are multi-skilled, with the flexibility to solve problems without the need of direct supervision. Members of self-directed work teams have been shown to have greater job satisfaction, more autonomy and idea input, and improved work variety.
These teams are often small in number, typically ranging from 7 to 15 members. Members of these teams share complementary skills, and membership is often cross-functional. In order for these teams to truly operate at high performance, they must buy into the teamwork framework. Team members who are part of high performance teams tend to have strong personal commitment to one another's growth and success, and to the organization's growth and success. The high sense of commitment exhibited by teams in a high performance organization allows these teams to have a better sense of purpose, more accountability, and more actionable goals, which allows them greater productivity. High performance teams move through the same stage-development framework, popularized by Tuckman, as other teams. They must be guided by a competent leader through the stages until they are ready to truly operate at high performance. Individuals HPOs foster a learning organization, investing heavily in their workforce. They do this typically through leadership development and competency management. HPOs will develop a clear set of core competencies that they want the organization's employees to master. They will invest in keeping these competencies prominent through training and development. These organizations also reinvent the way they refer to their employees in order to place value on the team concept. Employee titles will reflect this. They will often be referred to as team members or associates as opposed to employees or staff. This again increases employee involvement and makes employees more committed to the larger goals and competencies that the organization places value in. Leaders The roles of managers in an HPO are also reinvented. Traditional models for organizations would have leaders closely monitor or supervise their teams. Team leaders in HPOs are more concerned with long-term strategic planning and direction.
They take a more hands-off approach, and their titles reflect this change in responsibility. Leaders in HPOs trust their employees to make the right decisions. They act as a coach to their team members by giving them support and keeping them focused on the project at hand. These leaders are able to lead according to the situation and have the capability to adjust their leadership style based upon the needs of their team members. They know when to inspire people with direct communication and also have the ability to read when a more hands-off approach is necessary. Although these leaders act with a hands-off approach, they hold non-performers accountable for not reaching their goals. Leadership practices are also in line with the company's vision, values, and goals. Leaders of these organizations make all of their decisions with the organization's values in mind. Leadership behavior that is consistent with the organization's vision involves setting clear expectations, promoting a sense of belonging, fostering employee involvement in decision making, and encouraging learning and development. Leaders in an HPO also have the responsibility of understanding the ever-changing marketplace in which their organization operates and of being able to quickly make important decisions about it. Leaders should have the ability to anticipate changes in competition, technology, and economics within their market. Organization strategy and vision HPOs create strong vision, value, and mission statements which guide their organizations and align them with the outside environment. The mission, vision, and values of the organization act as foundations on which the organization is built. They inform employees of what is rewarded and what is not. HPOs implement vision statements that are specific, strategic, and carefully crafted. Leaders propagate the vision at all levels by ensuring that activities are aligned with the vision and strategy of the organization.
HPOs also set lofty but measurable and achievable goals for their organization in order to guide their vision. The vision and strategy of the organization are made clear to employees at all levels. A common understanding of the organization's strategy and direction creates a strategic mind-set among employees that helps the organization achieve its goals. Innovative practices HPOs reward and incentivize behavior that is in line with the organization's goals. They implement reward programs that aim to benefit employees who follow the values of the organization. HPOs streamline information sharing across all levels of the organization. Information sharing is streamlined via communications channels set up with state-of-the-art information technology. Internal communication is interactive, and open exchange is rewarded. Typically, HPOs implement innovative ICT networks within their organization. While HPOs do streamline their information, they also share information across all levels of the organization to make sure that everyone is sharing in the same vision. An HPO is constantly improving its products, manufacturing processes, or services in order to gain a competitive advantage. These organizations focus on the efficiency of all aspects of their product. They implement various process and quality optimization models such as total quality management, Lean Six Sigma, quality circles, process re-engineering, and lean manufacturing. HPOs have innovative human resources practices. For example, employees may be involved in the hiring process. All team members may be involved when hiring a new member to join that team. Human resources may also implement pay-for-knowledge or pay-for-skill programs, where employees are monetarily rewarded for attending training sessions that further their skills and abilities. Generally, there is a more focused approach to training, where specific skills are targeted by the organization through data collection and needs assessment.
These skills are the focus of the training and development programs that are implemented by human resources. It is typical for these organizations to have an internal learning and organizational development team which dedicates its time to conducting skill- and competency-based needs assessments and then training employees where needed. Flexibility and adaptability The success of HPOs is due to their ability to have structures in place that allow them to quickly adjust to the environment in which they operate. HPOs have the ability to reconfigure themselves to meet the demands of the marketplace and avoid its threats. HPOs constantly survey and monitor the environment to understand the context of their business, identify trends, and seek out any competitors. An HPO's growth is facilitated by creating partnerships and networks with other organizations after careful examination of the value added by entering into these relationships. They have a high external orientation and strive to meet customer demands. They meet and exceed customer demands by fostering close relationships with customers, understanding their customers' values, and being responsive to their customers' needs. HPOs maintain relationships with their stakeholders by building mutually beneficial relationships. References Organizational behavior
High performance organization
[ "Biology" ]
2,235
[ "Behavior", "Organizational behavior", "Human behavior" ]
56,326,541
https://en.wikipedia.org/wiki/Greenland%20Telescope
The Greenland Telescope is a radio telescope situated at Thule Air Base in north-western Greenland. It will later be deployed at the Summit Station research camp, located at the highest point of the Greenland ice sheet at an altitude of 3,210 meters (10,530 feet). The telescope is an international collaboration between: The Academia Sinica Institute of Astronomy and Astrophysics (Taiwan) (project leaders) The Smithsonian Astrophysical Observatory of the Center for Astrophysics Harvard & Smithsonian (United States) The National Radio Astronomy Observatory (United States) The Haystack Observatory of the Massachusetts Institute of Technology (United States) In 2011, the U.S. National Science Foundation gave the Smithsonian Astrophysical Observatory a 12-meter radio antenna that had been used as a prototype for the ALMA project in Chile. The antenna was to be deployed in Greenland. The cold, dry atmosphere above the middle of the Greenland ice sheet makes it an ideal site for observations at certain high radio frequencies. The telescope will be used to study the event horizons of black holes and to test how general relativity behaves in environments with extreme gravity. The Greenland Telescope will become part of the global network of telescopes that makes up the Event Horizon Telescope, which will study supermassive black holes and explore the origin of the relativistic jet in the active galaxy Messier 87. Progress and current status Between 2013 and 2015, the Taiwanese Academia Sinica Institute of Astronomy and Astrophysics modified the telescope so that it would work better in the cold environment of the Arctic. The telescope was shipped to Greenland in July 2016 and re-assembled in July 2017 at Thule Air Base in north-western Greenland. The telescope took its first image on 25 December 2017.
An update on "Construction, Commissioning, and Operations" of the telescope at Pituffik Space Base (the revised name for the complex) was published on arXiv in July 2023, describing "the lessons learned from the operations in the Arctic regions, and the prospect of the telescope." One of the systems tested was the location system; when the telescope is deployed on the ice cap summit, it will move with the ground it is mounted on. Establishing the telescope's geographical position to the required accuracy of 5 m required about an hour of observation time. The snow and ice removal systems were also successfully tested. The telescope will be deployed at the Summit Station research camp, located at the highest point of the Greenland ice sheet. References Additional sources Hirashita, Hiroyuki; Koch, Patrick M.; Matsushita, Satoki; Takakuwa, Shigehisa; Nakamura, Masanori; Asada, Keiichi; Liu, Hauyu Baobab; Urata, Yuji; Wang, Ming-Jye; Wang, Wei-Hao; Takahashi, Satoko; Tang, Ya-Wen; Chang, Hsian-Hong; Huang, Kuiyun; Morata, Oscar; Otsuka, Masaaki; Lin, Kai-Yang; Tsai, An-Li; Lin, Yen-Ting; Srinivasan, Sundar; Martin-Cocher, Pierre; Pu, Hung-Yi; Kemper, Francisca; Patel, Nimesh; Grimes, Paul; Huang, Yau-De; Han, Chih-Chiang; Huang, Yen-Ru; Nishioka, Hiroaki; Lin, Lupin Chun-Che; Zhang, Qizhou; Keto, Eric; Burgos, Roberto; Chen, Ming-Tang; Inoue, Makoto; Ho, Paul T. P. "First-generation science cases for ground-based terahertz telescopes". Publications of the Astronomical Society of Japan, 2016: Volume 68, Issue 1, R1. doi:10.1093/pasj/psv115 The M87 Workshop: Towards the 100th Anniversary of the Discovery of Cosmic Jets Arctic Greenland Telescope Opens New Era of Astronomy SpaceRef, 2018-05-31. Telescopes Radio observatories Astronomy in Taiwan Interferometric telescopes Radio telescopes Astronomical imaging Astronomical instruments
Greenland Telescope
[ "Astronomy" ]
842
[ "Telescopes", "Astronomical instruments" ]
56,328,827
https://en.wikipedia.org/wiki/Biological%20methanation
Biological methanation (also: biological hydrogen methanation (BHM) or microbiological methanation) is a conversion process that generates methane by means of highly specialized microorganisms (archaea) within a technical system. The process can be applied in a power-to-gas system to produce biomethane and is regarded as an important storage technology for variable renewable energy in the context of the energy transition. The technology was first successfully implemented at a power-to-gas plant of this kind in 2015. Disambiguation Biological methanation is based on methanogenesis, a specific anaerobic metabolic pathway in which hydrogen and carbon dioxide are converted into methane. A chemical-catalytic analogue of the biological process, known as the Sabatier reaction, also exists. Principle of function Numerous common microorganisms within the domain Archaea convert hydrogen (H2) and carbon dioxide (CO2) into methane in a biocatalytic way, following the overall reaction 4 H2 + CO2 → CH4 + 2 H2O. The relevant metabolic processes run under strictly anaerobic conditions in an aqueous environment. Archaea suitable for this process are so-called methanogens with a hydrogenotrophic metabolism. They belong primarily to the orders Methanopyrales, Methanobacteriales, Methanococcales and Methanomicrobiales. These methanogens are naturally adapted to a variety of anaerobic environments and conditions. In general, methanogens require aqueous, anoxic conditions with at least 50% water and a redox potential below −330 mV. They prefer slightly acidic to alkaline conditions and are found over a very wide temperature range, from 4 to 110 °C. Reactor types The most commonly used reactor type for biological methanation is the stirred-tank reactor, in which mass transfer is influenced by several factors such as reactor geometry, impeller configuration, agitation speed and gas flow rate.
Additionally, less investigated reactor types such as trickle-bed reactors, bubble-column reactors and gas-lift reactors have specific drawbacks and advantages with respect to the mass-transfer limitations mentioned above. Potential applications of biological methanation Biological methanation can take place as an in-situ process within a fermenter (see fig. 3.1) or as an ex-situ process in a separate reactor (see fig. 3.2 to 3.4). Biological methanation in a biogas or clarification plant with a gas processing system (in-situ process): Hydrogen is added directly to the fermentation material during the fermentation process, and biological methanation subsequently takes place in the thoroughly gassed fermentation material. Depending on its purity, the gas is upgraded to methane quality before being fed into the gas grid. Alternatively, biological methanation takes place in a separate methanation plant, where the gas is completely converted into methane before being fed into the gas grid. The carbon dioxide produced in a gas processing system can also be converted into methane in a separate methanation plant by adding hydrogen, and then fed into the gas grid. Biological methanation in combination with an arbitrary carbon dioxide source (ex-situ process): In a separate methanation plant the hydrogen is converted into methane together with carbon dioxide and then fed into the gas grid (stand-alone solution). Biological methanation in a pressurized reactor vessel (in-situ process): Pressure allows for better hydrogen solubility and therefore easier conversion into methane by the microorganisms. A possible reactor configuration is autogenerative high-pressure digestion. Research in Korea has demonstrated that biogas with more than 90% CH4 and an energy content of 180 MJ/m3 can be produced in this way. Implementation in the field Since March 2015 the world's first power-to-gas plant of this kind has been feeding synthetic biomethane, generated by means of biological methanation, into the public gas grid in Allendorf (Eder), Germany.
The plant runs at an output rate of 15 Nm3/h, which corresponds to 400,000 kWh per year. This amount of gas is enough to drive a CNG vehicle about 750,000 kilometers per year. References Biogas technology Methane
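As a rough illustration of the plant figures quoted above, the stoichiometry of hydrogenotrophic methanogenesis (4 H2 + CO2 → CH4 + 2 H2O) fixes the hydrogen demand per unit of methane, and an assumed heating value lets one back out the plant's operating hours. The methane heating value and the resulting full-load hours below are illustrative assumptions, not figures from the article:

```python
# Rough sanity check of the Allendorf plant figures quoted above.
# The methane heating value is an assumption (approximate lower
# heating value), not a number taken from the article.

LHV_CH4 = 9.97           # kWh per Nm^3 of methane (assumed LHV)
ch4_output = 15.0        # Nm^3/h methane output, from the article
annual_energy = 400_000  # kWh per year fed into the grid, from the article

# 4 H2 + CO2 -> CH4 + 2 H2O: four volumes of hydrogen are consumed
# per volume of methane produced (ideal-gas assumption).
h2_demand = 4 * ch4_output  # Nm^3/h of hydrogen at full load

# Hours per year the plant would need to run at full output to
# deliver the quoted annual energy.
full_load_hours = annual_energy / (ch4_output * LHV_CH4)

print(f"Hydrogen demand at full load: {h2_demand:.0f} Nm^3/h")
print(f"Implied full-load hours per year: {full_load_hours:.0f}")
```

Under these assumptions the quoted annual output corresponds to roughly 2,700 full-load hours per year, i.e. intermittent operation, consistent with the plant's role as a storage technology for variable renewable electricity.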
Biological methanation
[ "Chemistry", "Biology" ]
888
[ "Greenhouse gases", "Biofuels technology", "Methane", "Biogas technology" ]
56,330,181
https://en.wikipedia.org/wiki/Central%20American%20Pacific%20Islands
The Central American Pacific Islands is a biogeographical area used in the World Geographical Scheme for Recording Plant Distributions. It has the Level 3 code "CPI". It consists of a number of islands off the western coast of Central America in the Pacific Ocean: Clipperton Island, Cocos Island and Malpelo Island. Clipperton Island is the most north-westerly, lying off Nicaragua and Costa Rica. Politically it belongs to France. Cocos Island and Malpelo Island lie south of Panama, although Cocos Island belongs to Costa Rica and Malpelo Island to Colombia. References Biogeography Geography of Central America Natural history of Central America
Central American Pacific Islands
[ "Biology" ]
137
[ "Biogeography" ]
56,331,770
https://en.wikipedia.org/wiki/NGC%206040
NGC 6040 is a spiral galaxy located about 550 million light-years away in the constellation Hercules. NGC 6040 was discovered by astronomer Édouard Stephan on June 27, 1870. NGC 6040 is interacting with the lenticular galaxy PGC 56942. As a result of this interaction, NGC 6040's southern spiral arm has been warped toward PGC 56942. NGC 6040 and PGC 56942 are both members of the Hercules Cluster. NGC 6040 was classified in the 1966 Atlas of Peculiar Galaxies by Halton Arp, who listed it as Arp 122. However, Arp mistakenly identified NGC 6040 as NGC 6039, which is not part of any Arp object. Neutral hydrogen depletion NGC 6040 and PGC 56942 are both depleted of their neutral hydrogen content. This depletion may have occurred as both galaxies fell into the Hercules Cluster and interacted with the surrounding intracluster medium (ICM). This interaction would have caused ram-pressure stripping and effectively removed the gas from the two galaxies. See also List of NGC objects (6001–7000) Arp 120 Arp 272 References External links Hercules (constellation) Intermediate spiral galaxies Interacting galaxies 6040 056932 122 10165 Astronomical objects discovered in 1870 Hercules Cluster Discoveries by Édouard Stephan
NGC 6040
[ "Astronomy" ]
276
[ "Hercules (constellation)", "Constellations" ]
56,332,137
https://en.wikipedia.org/wiki/Song%20Jian
Song Jian (; born 29 December 1931) is a Chinese aerospace engineer, demographer, and politician. He was deputy chief designer of China's submarine-launched ballistic missile (JL-1) and one of the country's leading scientists in the post-Cultural Revolution era. After a decade of two-child restrictions in the 1970s, and following the Chinese government's announcement in 1979 to advocate for one child per family, he became a leading advocate for rapid implementation and broad coverage of China's one-child policy. He served in high-ranking political positions including Vice Minister of Aerospace Industry, Director of the State Science and Technology Commission (1985–1998), vice-premier-level State Councillor (1986–1998), President of the Chinese Academy of Engineering, Vice Chairperson of the Chinese People's Political Consultative Conference, and a member of the Central Committee of the Chinese Communist Party. Early life and education Song Jian was born on 29 December 1931 in Rongcheng, Shandong Province. In 1946, he enlisted in the Chinese Communist Party's Eighth Route Army during the Chinese Civil War at the age of 14. After the establishment of the People's Republic of China in 1949, he studied at the Harbin Institute of Technology and Beijing Foreign Language Institute, before being sent to the Soviet Union in 1953 on the recommendation of Liu Shaoqi, Vice Chairman of China. Described as a "brilliant" student, he studied cybernetics and military science under the theorist A. A. Feldbaum. He earned an associate Ph.D. degree from Moscow State University and a Ph.D. from Bauman Moscow State Technical University. He published seven papers in Russian on control theory, which won praise from Soviet and American scientists. Career After the Sino-Soviet split in 1960, Song returned to China and was put in charge of control systems at the Fifth Academy (later known as the Seventh Ministry of Machine Building or Missile Ministry) of the Ministry of National Defense. 
He was one of China's top experts on missile guidance systems. Qian Xuesen, the "father of China's space and missile defence programs", highly praised Song's ability and declared that Song was China's foremost control theorist, surpassing Qian himself. Qian personally chose Song to co-author the revised edition of his Engineering Cybernetics, regarded as a bible of Chinese military science. At the beginning of the Cultural Revolution, Song's home was ransacked by the Red Guards before Premier Zhou Enlai included him in the list of the top 50 scientists considered indispensable to national defence and afforded special protection. Song was sent to the Jiuquan Satellite Launch Center in the desert, where he could focus on his studies and research, before returning to Beijing in 1969. His work on anti-ballistic missiles attracted Zhou's attention. In the late 1970s, Song applied his expertise in cybernetics to the problem of population control and became a proponent of China's so-called one-child policy. At the same time, he continued to work in the missile and aerospace programs and rapidly ascended the political hierarchy. He was appointed deputy chief designer of JL-1, China's submarine-launched ballistic missile in February 1980 (under Huang Weilu), and Vice Minister of Aerospace Industry in 1982. In 1985 he became Director of the powerful State Science and Technology Commission, and the next year he additionally became a State Councillor, a vice-premier-level position. He held both positions until 1998, when he was appointed President of the Chinese Academy of Engineering and Vice Chairperson of the Chinese People's Political Consultative Conference (CPPCC). Song was an alternate member of the 12th Central Committee of the Chinese Communist Party, and a full member of the 13th, 14th, and 15th central committees. 
One-child policy In 1970, Premier Zhou Enlai announced a five-year plan that called for population growth targets, in light of Malthusian concerns that a rapidly growing population would derail China's economic development. That program evolved into a two-child policy for the rest of the 1970s. Thereafter, China's new leader Deng Xiaoping continued this program, reducing military spending and urging scientists to focus their energy on solving the country's urgent economic problems, including widespread poverty. In 1978, as China made its initial announcement to tighten the restrictions to one child per family, Song attended the Seventh World Congress of the International Federation of Automatic Control in Helsinki, Finland, where he encountered the cybernetic-based population control theory associated with the Club of Rome. He saw the theory as a precise and scientific approach to the population control problem, which seemed superior to the Marxist perspectives that had long predominated in China. Based on assumptions of future trends, Song and his group performed calculations that determined the "ideal" population for China in the next 100 years was 650 to 700 million, about two-thirds of its then-population of 1 billion. In order to achieve this long-term environmentally sustainable population, he showed that the "optimal" trajectory was to reduce fertility rapidly to one child per couple by 1985 and maintain that level for 20 to 40 years, and then slowly raise it to the replacement level (2.1 children per woman). Following the Chinese central government's decision to advocate for one-child families in 1979, Song and his associates entered the picture, actively supporting and promoting the one-child ideal through conference discussions in 1980 in Chengdu.
They presented their work to members of the Chinese Academy of Sciences and, through the country's top scientists, it came to the attention of, and won the support of, China's top leaders for rapid implementation and broad coverage of one-child limits. Song's work was endorsed by Vice Premiers Chen Muhua and Wang Zhen, who recommended it to Chen Yun, the second most influential official after Deng Xiaoping. Shocked by Song's population projections, the highest of which projected China's population to reach 4 billion by 2080 if women continued to have three births per woman, China's leaders were convinced that rapid adoption of a near-universal one-child policy was the country's only option if it wanted to meet Song's population targets. Although some leaders, including Zhao Ziyang and Hu Yaobang, expressed doubts about its feasibility, at a top-secret high-level meeting convened in April 1980 Song won over many policymakers to his recommendation of universal one-child limits. In September, the third session of the 5th National People's Congress approved the policy. Although it is widely agreed that Song's population projections influenced the speed and scope of implementation of one-child limits, several leading scholars have refuted Greenhalgh's thesis that Song "hijacked the population policymaking process" and that he should be considered both the inventor and central architect of the one-child policy, a thesis that has often been repeated without much critical reflection. Liang Zhongtang, who participated in the critical policy discussions in Chengdu in 1980 and emerged as the foremost internal critic of one-child limits, confirms that Greenhalgh put too much emphasis on Song and his group. Wang et al. agree, concluding that "the idea of the one-child policy came from leaders within the Party, not from scientists who offered evidence to support it."
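The headline projection quoted above, roughly 1 billion people in 1980 growing toward 4 billion by 2080 at three births per woman, can be reproduced in outline with a toy geometric model. The generation length and replacement-fertility values below are illustrative assumptions; this is not Song's actual cybernetic model, which tracked age structure and mortality in far more detail:

```python
# Toy geometric population projection, illustrating the order of
# magnitude of the projection quoted above. All parameters here are
# illustrative assumptions, not Song's actual model inputs.

START_POP = 1.0e9    # China's approximate 1980 population
TFR = 3.0            # assumed total fertility rate (births per woman)
REPLACEMENT = 2.1    # replacement-level fertility
GENERATION = 27.0    # assumed mean generation length in years

def project(years: float) -> float:
    # Each generation the population scales by roughly TFR / REPLACEMENT,
    # ignoring age structure, mortality changes, and migration.
    return START_POP * (TFR / REPLACEMENT) ** (years / GENERATION)

pop_2080 = project(100)  # 1980 -> 2080
print(f"Toy projection for 2080: {pop_2080 / 1e9:.1f} billion")
```

Even this crude model lands in the same ballpark as the 4-billion figure, which suggests why the order of magnitude of the projections, rather than their precise machinery, alarmed policymakers.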
Goodkind suggests that Song and his colleagues were "more like expert witnesses for a government already determined to prosecute one-child restrictions, taking advantage of the opportunity to become players in a vast and expanding government bureaucracy". Indeed, upon learning of Song's work in February 1980, correspondence from Wang Zhen, Chen Muhua, and other top officials suggests that they were already highly sympathetic to Song's position. It is also important to note that the universal one-child limits advocated by Song lasted only five years. In the mid-1980s, China began to permit exemptions for rural parents whose first child was a daughter (an exemption allowed due to the unpopularity of the universal one-child rule), which, along with other exemptions, resulted in a "1.5-child" policy that lasted for nearly 30 years. Thus, the policy in place since the mid-1980s that has been commonly referred to as "the one-child policy" was actually a less restrictive policy, the very sort that China might have adopted in 1980 even without the population projections and cybernetic models of Song and his colleagues. Other programs As the director of the State Science and Technology Commission, Song was in charge of China's science and technology policies. He oversaw the , which aims to popularize scientific knowledge and technology to benefit common people, the , which encourages scientists to commercialize their scientific discoveries, and the 863 Program, which aims to stimulate the development of high-tech research in China. He also launched the Xia–Shang–Zhou Chronology Project to determine a more accurate chronology of the earliest dynasties in Chinese history. Honours and awards Song Jian is an academician of both the Chinese Academy of Sciences and the Chinese Academy of Engineering. He is a foreign member of the US National Academy of Engineering, the Russian Academy of Sciences, and the Royal Swedish Academy of Engineering Sciences.
He is also a member of the Euro-Asian Academy of Sciences and the International Academy of Astronautics. References Bibliography 1931 births Living people Bauman Moscow State Technical University alumni Beijing Foreign Studies University alumni Chinese aerospace engineers Chinese demographers Chinese expatriates in the Soviet Union Control theorists Cyberneticists Foreign associates of the National Academy of Engineering Foreign members of the Russian Academy of Sciences Harbin Institute of Technology alumni Members of the Chinese Academy of Engineering Members of the Chinese Academy of Sciences Members of the 13th Central Committee of the Chinese Communist Party Members of the 14th Central Committee of the Chinese Communist Party Members of the 15th Central Committee of the Chinese Communist Party Members of the Royal Swedish Academy of Engineering Sciences Moscow State University alumni One-child policy Politicians from Weihai Scientists from Shandong State councillors of China Vice Chairpersons of the National Committee of the Chinese People's Political Consultative Conference
Song Jian
[ "Engineering" ]
2,059
[ "Control engineering", "Control theorists" ]
56,332,298
https://en.wikipedia.org/wiki/Cultural%20keystone%20species
A cultural keystone species is one which is of exceptional significance to a particular culture or a people. Such species can be identified by their prevalence in language, cultural practices (e.g. ceremonies), traditions, diet, medicines, material items, and histories of a community. These species influence social systems and culture and are a key feature of a community's identity. The concept was first proposed by Gary Nabhan and John Carr in 1994 and later described by Sergio Cristancho and Joanne Vining in 2000 and by ethnobotanist Ann Garibaldi and ethnobiologist Nancy Turner in 2004. It is a "metaphorical parallel" to the ecological keystone species concept, and may be useful for biodiversity conservation and ecological restoration. Definitions The exact definition of cultural keystone species remains under debate and is considered to be more abstract than the related ecological concept. Garibaldi and Turner emphasize that the cultural keystone species concept is not an extension of ecological keystone species, but rather a parallel concept that bridges social and physical sciences, as well as indigenous knowledge and western knowledge, to offer a more holistic approach. Other researchers debate whether or not cultural keystone species are different from economically important species. Additionally, it is argued that the concept will be reduced to a biological term if it only focuses on specific species, but this may be solved by considering cultural keystone species as a "complex" that develops based on the ways that the species is used and its impacts on cultural practices over time, through conscious social practices, decision-making processes, and changes to societal needs and practices. Garibaldi and Turner outline six elements that should be considered when identifying a cultural keystone species: The magnitude and variety of ways the species is used The species' influence on language The species' role in cultural practices (e.g. 
traditional practices, ceremonies) The continuation of the species' importance even as cultural identity changes over time The irreplaceability of the species by another species accessible to the group The species' role in activities outside its own territory (e.g. trade) Loring argues that this framing misses an essential feature of the cultural keystone concept as originally conceived by Nabhan and Carr: that the importance of the relationship flows in both directions. In other words, a cultural keystone species is not just important culturally; the species, and the cultural practices that surround it, are also essential to the health and structure of the broader ecosystem. They are, in Loring's words, "a point of convergence... where our intersections are intrinsically powerful, meaningful and symbolic." Not all cultural keystone species are beneficial to a community or an ecosystem, particularly when the species is considered invasive. One example of this is the Australian Eucalyptus tree that is now widespread in California and is considered to be culturally important because of its aesthetic value and dietary uses. However, the tree is a threat to native species and has drastically impacted the ecosystems it is found in. Significance The cultural keystone species concept may have important applications for conservation and ecological restoration initiatives because these species may serve as a starting point from which to identify the needs of both the community and the ecosystem. Cultural keystone species reinforce the close relationships between communities and their surrounding environments, particularly for indigenous communities currently facing environmental and economic challenges. These species may offer information about an ecosystem or a community's resilience, and their identification can support the survival of communities who depend on a cultural keystone species. 
It is argued that these species should play a role in environmental policy, for example in the Cultural Impact Assessment of the United Nations Environment Programme, to connect cultural and ecological conservation for indigenous peoples. Legal recognition of cultural keystone species can also improve social justice, ensure continuation of indigenous practices, and promote inclusive social-ecological management practices. Researchers have also found that identification of cultural keystone species supports the integration of indigenous perspectives on environmental stewardship and improved natural resource management practices. The application of the concept can support the development of innovative methods to conserve natural resources as a result of this integration. Additionally, "invisible losses" can be avoided because the identification of these species includes consideration of the cultural and social importance of a species. Examples in North America White pine The white pine (Pinus strobus L.), found across northeastern North America, is a cultural keystone species for the Kitcisakik Algonquin community in Quebec. The tree is prevalent in legends and myths that are central to the culture, history, and identity of the Kitcisakik. The tree is said to offer protection to the people because of its large size when mature, and provides a home to bald eagles, a sacred species for Algonquin societies. Parts of the tree are also used in material goods and medicines, and the species is considered to be irreplaceable by the Kitcisakik, who rely on the services the white pine offers to both humans and the environment. The white pine is currently threatened by logging and environmental changes and the Kitcisakik are central in efforts to modify practices so that the species will survive. 
The researchers that identified the white pine as the cultural keystone species of the Kitcisakik using Garibaldi and Turner's methodology and community interviews note that the tree is not only culturally significant, but ecologically as well - it provides services to animals such as birds, moose, and marten. The identification of this tree as a cultural keystone species offers insight into how and why culturally important components of the environment should be considered when developing natural resource management and restoration strategies. The inclusion of the white pine's cultural significance in these strategies may ensure that cultural needs are considered alongside environmental and economic priorities. Western red-cedar Western red-cedar (Thuja plicata) is a cultural keystone species for the First Nations cultures of the Pacific Northwest Coast of North America, such as the Tsimshian, Haida, Heiltsuk, and Kwakwaka’wakw. It provides wood, bark, and roots for various uses such as canoes, clothing, baskets, and ceremonies. It is considered a sacred gift from the Creator and features in many stories and rituals. Red laver seaweed Red laver seaweed (Porphyra abbottiae) is a cultural keystone species for the Coast Tsimshian, Haida, Heiltsuk, Kwakwaka’wakw, and other coastal peoples of British Columbia. It is harvested, dried, and eaten as a nutritious food. It requires detailed knowledge and skills to collect and process, and is valued as a trade item and a medicine. It is also associated with seasonal indicators, taboos, and narratives. Wapato Wapato (Sagittaria latifolia) is a cultural keystone species for the Katzie and other Sto:lo peoples of British Columbia. It is an aquatic plant that produces edible tubers. It was formerly a staple food and trade item for many groups, especially the Katzie. It was cultivated and managed in wetlands, but its use declined with the introduction of the potato and the loss of habitat. 
References Ecological processes Ethnobotany Ethnobiology
Cultural keystone species
[ "Physics", "Biology", "Environmental_science" ]
1,448
[ "Physical phenomena", "Earth phenomena", "Ecological processes", "Environmental social science", "Ethnobiology" ]
56,337,158
https://en.wikipedia.org/wiki/Bortezomib/dexamethasone
Bortezomib/dexamethasone is a combination drug against multiple myeloma. When bortezomib is used under the trade name Velcade, the combination is called Vel/Dex (or Vel-Dex or Veldex). Bortezomib is a proteasome inhibitor and dexamethasone is a corticosteroid. References Combination cancer drugs
Bortezomib/dexamethasone
[ "Chemistry" ]
88
[ "Pharmacology", "Pharmacology stubs", "Medicinal chemistry stubs" ]
69,024,769
https://en.wikipedia.org/wiki/Sclerococcum%20fissurinae
Sclerococcum fissurinae is a species of lichenicolous fungus in the family Dactylosporaceae. Found in Alaska, it was formally described as a new species in 2020 by Sergio Pérez-Ortega. The type specimen was collected in the Hoonah-Angoon Census Area, just outside of Glacier Bay National Park. Here it was found growing on the script lichen species Fissurina insidiosa, which itself was growing on the bark of an alder tree. The specific epithet refers to its host. The fungus forms black, circular apothecia on its lichen host that are up to 0.6 mm in diameter. Its asci are eight-spored, measuring 25–33 by 8–12 μm. The ascospores are brown with an ellipsoid shape, and typical dimensions of 8–12 by 3–4 μm; they usually have three septa, although sometimes only one or two occur. A morphologically similar species in the same genus is S. parasiticum but in this fungus, which has a different host, the ascospores are slightly larger (9–15 by 3.5–5 μm). References Lecanorales Fungi described in 2020 Fungi of the United States Lichenicolous fungi Fungi without expected TNC conservation status Fungus species
Sclerococcum fissurinae
[ "Biology" ]
277
[ "Fungi", "Fungus species" ]
69,025,987
https://en.wikipedia.org/wiki/Bello%20Bako%20Dambatta
Bello Bako Dambatta (born 15 December 1950 in Kano, Kano State, Nigeria) is a Nigerian chemist and university administrator. He served as the vice-chancellor of Bayero University, Kano from 1995 until 1999. Education and career Dambatta was born on 15 December 1950 in the city of Kano, Nigeria. He began his higher education at Kaduna Polytechnic (1970–72), then moved to England, where he attended Huddersfield Polytechnic (1972–74), Trent Polytechnic (1974–75), the University of Bradford (1975–76), and Lancaster University (1980–83). He completed his PhD at Lancaster University, with a thesis titled A study of the free-radical polymerisations of tri-n-butyltin methacrylate and i-vinylimidazole. He was appointed a lecturer at Bayero University, Kano in 1978, and was promoted through the ranks, reaching full professor in 1994. He served as the head of the Department of Chemistry from 1991 until 1995, and was dean of the Faculty of Science from 1992 until 1995. He was vice-chancellor of Bayero University from 1995 to 1999, and was the first to be selected by the faculty of the university (previous vice-chancellors had been appointed centrally by the government). Dambatta became a fellow of the Chemical Society of Nigeria in August 1996. Honours To honour and remember his work, the Bello Bakko Dambatta Lecture Theatre was named after him. It is located in the pharmacy department at the Bayero University Kano old site. References Dambazawa family Living people 1950 births Academic staff of Bayero University Kano Alumni of Lancaster University Alumni of the University of Huddersfield Alumni of the University of Bradford Vice-chancellors of Nigerian universities Nigerian scientists People from Kano State Chemical Society of Nigeria
Bello Bako Dambatta
[ "Chemistry" ]
377
[ "Chemical Society of Nigeria" ]
69,026,183
https://en.wikipedia.org/wiki/Regonyl
Regonyl (developmental code name TX-380), also known as 17α-ethynyl-5α-androst-2-en-17β-ol 17β-acetate, is a steroidal drug described as an antiprogestogen and "antiprolactin" (prolactin inhibitor). It was studied for lactation inhibition in bitches. It has minimal to no androgenic, estrogenic, or progestogenic activity but is said to strongly inhibit the hypothalamic–pituitary–gonadal axis at central and peripheral levels and to markedly oppose the action of progesterone. However, the antiprogestogenic effects of regonyl do not appear to be due to direct interaction with the progesterone receptor. The actions of regonyl result in estrus cycle disturbances and impaired ovulation. Regonyl was proposed for use in humans, for instance in the treatment of gynecological disorders like endometriosis and benign breast disease, and in hormonal contraception. References Abandoned drugs Alcohols Ethynyl compounds Androstanes Antiprogestogens Drugs with unknown mechanisms of action Prolactin modulators Veterinary drugs
Regonyl
[ "Chemistry" ]
265
[ "Drug safety", "Abandoned drugs" ]
69,026,512
https://en.wikipedia.org/wiki/Mercedes-Benz%20supercharged%20Grand%20Prix%20racing%20engine
Mercedes-Benz made a series of pre-war supercharged Grand Prix racing engines for their Silver Arrow race cars between 1934 and 1939. They made two supercharged inline-8 engines, the M25 and M125, and one V12 engine produced in two generations, the M154 and M163. Background Despite reducing the engine size by roughly half, Daimler engineers soon managed to get more power from the supercharged straight-8 M25 engine than the maximum 300 hp of the SSK. Development of the chassis and the car allowed capacity to be increased to more than 4 litres, and the output of the new M25C engine version was well over 400 hp. As the M25 engines became unreliable when enlarged to 4.7 litres and 490 hp, a V12 engine was tested, but it proved too heavy. The W125's supercharged engine, with 8 cylinders in line (94.0 x 102 mm) and 5,662.85 cc (345.56 CID), attained an output of up to 595 horsepower (444 kW) in race trim. The highest testbed power measured was 637 BHP (646 PS) at 5,800 rpm. It gave 245 BHP (248 PS) at a mere 2,000 rpm. In 1938, the engine capacity of supercharged Grand Prix cars was limited to 3000cc, and the W125 was replaced by the Mercedes-Benz W154. The W125 was considered the most powerful road racing car ever for about three decades, until large-capacity, American-built V8 engines in Can-Am sports cars reached similar power in the late 1960s. In Grand Prix racing itself, the figure was not exceeded until the early 1980s (when Grand Prix racing had become known as Formula One), with the appearance of highly powerful turbocharged engines such as the Renault EF-Type engine, the BMW M12/13, and the Ferrari Tipo 021. With no regulations limiting engine size other than the total car weight limit, Mercedes designed a 5.6-litre engine with eight inline cylinders and double overhead camshafts for the W125. Named the M125, the engine was also fitted with a Roots-type supercharger. 
The engines built varied in power, attaining an output between 560 and 640 horsepower (418–477 kW) at 5800 rpm. Fuel used was a custom mix of 40% methyl alcohol, 32% benzene, 24% ethyl alcohol and 4% light gasoline. The engine weighed 222 kg (490 lbs), approximately 30% of the total weight of the car, and was mounted in the front of the car. 1938 saw changes in the rules, with the maximum limit on weight being replaced by a maximum limit on engine capacity and a minimum weight for the car being introduced; the W125 was no longer eligible for entry without major modification. Instead, Mercedes-Benz developed a new car, the W154, and the W125 was withdrawn from racing. The M154 engine was created as a result of a rule change by the sport's governing body, the AIACR, which limited supercharged engine capacities to 3000cc. Mercedes' previous car, the supercharged 5700cc W125, was therefore ineligible. The company decided that a new car based on the chassis of the W125 and designed to comply with the new regulations would be preferable to modifying the existing car. Although using the same chassis design as the 1938 car, a different body was used for the 1939 season and the M154 engine used during 1938 was replaced by the M163. As a result of the new engine, the 1939 car is often mistakenly referred to as a Mercedes-Benz W163. For the 1938 season, Grand Prix racing's governing body, the AIACR, moved from a formula limited by weight to one limited by engine capacity. The new regulations allowed a maximum capacity of 3000cc with a supercharger or 4500cc without. This meant Mercedes-Benz's previous car, the supercharged 5700cc W125, was ineligible to continue. Its new car was based on the W125 chassis, with a supercharged 3000cc engine chosen after both types had been tested. The new M154 engine was a 3000cc (2,961.54 cc; 67.0 x 70.0 mm) supercharged V12, attaining an output between 425 and 474 horsepower. 
In 1939, the 2-stage supercharged version recorded a testbed power of 476 BHP (483 PS) at 7,800 rpm. Each one of these engines reputedly cost 89,700 German reichsmarks in 1938 (US$949,601 today). Applications Mercedes-Benz W125 Mercedes-Benz W25 Mercedes-Benz W154 Mercedes-Benz W125 Rekordwagen References Mercedes-Benz engines Straight-eight engines V12 engines Engines by model Gasoline engines by model
Mercedes-Benz supercharged Grand Prix racing engine
[ "Technology" ]
1,018
[ "Engines", "Engines by model" ]
69,026,625
https://en.wikipedia.org/wiki/Matroid-constrained%20number%20partitioning
Matroid-constrained number partitioning is a variant of the multiway number partitioning problem, in which the subsets in the partition should be independent sets of a matroid. The input to this problem is a set S of items, a positive integer m, and some m matroids over the same set S. The goal is to partition S into m subsets, such that each subset i is an independent set in matroid i. Subject to this constraint, some objective function should be minimized, for example, minimizing the largest sum of item sizes in a subset. In a more general variant, each of the m matroids has a weight function, which assigns a weight to each element of the ground-set. Various objective functions have been considered. For each of the three operators max, min, sum, one can use this operator on the weights of items in each subset, and on the subsets themselves. All in all, there are 9 possible objective functions, each of which can be maximized or minimized. Special cases Some important special cases of matroid-constrained partitioning problems are: The (max,sum) objective is the maximum, over all subsets, of the total weight in the subset. When the items represent jobs and the weights represent their lengths, this objective is simply the makespan of the schedule. Therefore, minimizing this objective is equivalent to minimizing the makespan under matroid constraints. The dual goal of maximizing (min,sum) has also been studied in this context. The special case in which the matroids are free matroids (no constraints) and the m weight-functions are identical corresponds to identical-machines scheduling, also known as multiway number partitioning. The case of free matroids and different weight-functions corresponds to unrelated-machines scheduling. The special case of uniform matroids corresponds to cardinality constraints on the subsets. The more general case of partition matroids corresponds to categorized cardinality constraints. 
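The cardinality-constrained special case above (uniform matroids) can be illustrated with a short greedy sketch for the (max,sum) makespan objective. This is not an algorithm from the literature cited here; it is a minimal longest-processing-time heuristic under an assumed per-subset capacity, and the function and variable names are illustrative. It only approximates the optimum, and it assumes the instance is feasible (at most m times capacity items).

```python
import heapq

def greedy_partition(sizes, m, capacity):
    """Assign each item (largest first) to the currently lightest of m
    subsets that still has room (uniform-matroid cardinality constraint).
    Heuristic for minimizing the (max,sum) objective; not exact."""
    # Min-heap of (current load, item count, subset index).
    heap = [(0, 0, i) for i in range(m)]
    heapq.heapify(heap)
    subsets = [[] for _ in range(m)]
    for s in sorted(sizes, reverse=True):
        deferred = []  # full subsets popped while searching for room
        while heap:
            load, count, i = heapq.heappop(heap)
            if count < capacity:
                subsets[i].append(s)
                heapq.heappush(heap, (load + s, count + 1, i))
                break
            deferred.append((load, count, i))
        for entry in deferred:  # restore full subsets to the heap
            heapq.heappush(heap, entry)
    return subsets
```

For example, `greedy_partition([5, 4, 3, 2, 1, 1], m=2, capacity=3)` yields two subsets of three items each with equal total weight 8.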
These problems are described in the page on balanced number partitioning. The (sum,sum) objective is the sum of weights of all items in all subsets, where the weights in each subset i are computed by the weight-function of matroid i. Minimizing this objective can be reduced to the weighted matroid intersection problem - finding a maximum-weight subset that is simultaneously independent in two given matroids. This problem is solvable in polynomial time. The (max,max) objective is the maximum weight in all subsets, where the weights in each subset i are computed by the weight-function of matroid i. Minimizing this objective with graphic matroids can be used to solve the minimum bottleneck spanning tree problem. The (sum,min) objective is the sum of minimum weights in all subsets. Maximizing this objective, with k identical graphical matroids, can be used to solve the maximum total capacity spanning tree partition problem. The (sum,max) objective is the sum of maximum weights in all subsets. This objective can represent the total memory needed for scheduling, when each matroid i represents the feasible allocations in machine i. General matroid constraints General matroid constraints were first considered by Burkard and Yao. They showed that minimizing (sum,max) can be done in polynomial time by a greedy algorithm for a subclass of matroids, which includes partition matroids. Hwang and Rothblum presented an alternative sufficient condition. Wu and Yao presented an approximation algorithm for minimizing (max,sum) with general matroid constraints. Abbassi, Mirrokni and Thakur present an approximation algorithm for a problem of diversity maximization under matroid constraints. Kawase, Kimura, Makino and Sumita show that the maximization problems can be reduced to minimization problems. Then, they analyze seven minimization problems: Minimizing (sum,max): the problem is strongly NP-hard even when the matroids and weights are identical. 
There is a PTAS for identical matroids and weights. For general matroids and weights, there is an εm-approximation algorithm for any ε > 0. It is NP-hard to approximate with factor O(log m). Minimizing (min,min), (max,max), (min,max) and (min,sum): there are polynomial-time algorithms. They reduce the problems to the feasibility problem of the matroid partitioning problem. Minimizing (max,min) and (sum,min): there are polynomial-time algorithms for identical matroids and weights. In the general case, it is strongly NP-hard even to approximate. The other two problems were analyzed in previous works: minimizing (max,sum) is known to be strongly NP-hard (3-partition is a special case), and minimizing (sum,sum) can be reduced to weighted matroid intersection, which is polynomial. Related problems Matroid partitioning is a different problem, in which the number of parts m is not fixed. There is a single matroid, and the goal is to partition its elements into a smallest number of independent sets. References Matroid theory Number partitioning
Matroid-constrained number partitioning
[ "Mathematics" ]
1,068
[ "Matroid theory", "Combinatorics" ]
69,026,636
https://en.wikipedia.org/wiki/Wicat%20Systems
Wicat Systems, Inc., was an American computer and software company founded in 1980 in Orem, Utah. Originally a branch of WICAT, the World Institute for Computer-Assisted Teaching (later the Wicat Education Institute), the company manufactured multi-user systems for educational institutions before focusing its efforts on educational software development in the early 1990s. The company was among the first to use the Motorola 68000 microprocessor in a computer with the introduction of the Wicat System 100 in 1980. Both Wicat Systems and its parent institution were founded by Dustin H. Heuston, originally of New York. History At its peak in the mid-1980s, Wicat Systems employed 500 people and had an annual budget of US$40 million. The company formed a joint venture with Control Data Corporation in early 1985. Named Plato/Wicat after Control Data's Plato educational software, the venture was intended to "address the entire educational process, including computer-based instructional courseware, testing and evaluation, and classroom management and administration". In 1992, the company was acquired by Jostens in a stock swap valued at roughly $111 million. Jostens, whose rival educational software division Jostens Learning was aimed at preschools, planned to use the Wicat Systems repertoire to increase its presence in high schools and higher education. From the late 1980s to 1996, Wicat Systems also operated a UK branch in Camberley in Surrey. During this period, Wicat produced CBT (computer-based training, the forerunner of eLearning) and partial cockpit simulations for aviation clients. These included many of the then-leading airlines and aircraft manufacturers, and courses covered pilot, cabin crew and ground crew training. Norfolk Southern Railway was another of many non-aviation clients. 
Citations References 1980 establishments in Utah 1992 disestablishments in Utah 1992 mergers and acquisitions American companies established in 1980 American companies disestablished in 1992 Companies based in Orem, Utah Computer companies established in 1980 Computer companies disestablished in 1992 Defunct computer companies of the United States Defunct software companies of the United States Defunct computer hardware companies Defunct computer systems companies
Wicat Systems
[ "Technology" ]
440
[ "Computing stubs", "Computer company stubs" ]
69,026,877
https://en.wikipedia.org/wiki/LunaNet
LunaNet is a NASA and ESA project and proposed data network aiming to provide a "Lunar Internet" for cis-lunar spacecraft and installations. It will be able to store and forward data to provide a Delay/Disruption Tolerant Network (DTN). The objective is to avoid needing to preschedule data communications back to Earth. LunaNet will also offer navigation services, e.g. for orbit determination and navigation on the lunar surface. Draft interoperability specifications have been issued. The LunaNet Interoperability Specification (LNIS) is the document which publishes the LunaNet standard. LNIS version 4 was published online on September 12, 2022. A draft of LNIS version 5 was provided online for comment in late 2023. NASA's instantiation of LunaNet is called the Lunar Communications Relay and Navigation Systems (LCRNS). The Moonlight Initiative is an ESA project intending to adopt the specifications. JAXA's instantiation of LunaNet is called the Lunar Navigation Satellite System (LNSS). See also Deep Space Network, NASA spacecraft communications Artemis program, NASA's return to the Moon Laser communication in space Coordinated Lunar Time References External links LunaNet Interoperability Specification Documents NASA's Lunar Communications Relay and Navigation Systems (LCRNS) NASA Spacecraft communication
LunaNet
[ "Engineering" ]
260
[ "Spacecraft communication", "Aerospace engineering" ]
69,027,179
https://en.wikipedia.org/wiki/Matthew%20Todd%20%28chemist%29
Matthew Houghton Todd (born 13 January 1973) is a British chemist and the Professor and Chair of Drug Discovery at the School of Pharmacy at University College London. He is the founder of Open Source Malaria (OSM) and his research focuses on drug discovery and development for this disease. Recently, he has expanded to other areas, particularly neglected diseases such as tuberculosis and mycetoma in the Open Source Tuberculosis (OSTB) and Open Source Mycetoma (MycetOS) projects, through a collaboration with the Drugs for Neglected Diseases Initiative and Erasmus MC. In addition, he has some research activity in catalysis and methodology. Education Todd received an MA in Natural Sciences from the University of Cambridge in 1995. He obtained his PhD in Organic Chemistry at the same institution in 1999, working with Chris Abell on encoding and linker strategies for combinatorial chemistry. Todd was a Wellcome Trust Postdoctoral Research Fellow at the University of California, Berkeley from 1999 to 2000, working with Paul A. Bartlett on synthesis of amino acid-derived heterocycles by Lewis acid catalysis and radical cyclisations from peptide acetals. Career and research From 2000 to 2001, he was a College Fellow and Lecturer at New Hall, Cambridge (now Murray Edwards College, Cambridge). He began his independent research career in 2001 at Queen Mary University of London. In 2005, he relocated to Australia, where he was a lecturer, senior lecturer, then associate professor at the School of Chemistry, University of Sydney. In 2018, he returned to the United Kingdom to take the role of professor and Chair of Drug Discovery at UCL School of Pharmacy. In response to Turing Pharmaceuticals' price hike of pyrimethamine (Daraprim), a drug used to treat toxoplasmosis, including in HIV/AIDS patients, Todd and the Open Source Malaria team led a small team of high school students from Sydney Grammar School to synthesise the drug. 
The students produced 3.7 grams of pyrimethamine for US$20, which would be worth between US$35,000 and US$110,000 in the United States based on the hiked prices. This received significant media attention and was featured on ABC, BBC, CNN, The Guardian, and Time. Todd has been a vocal proponent of open science and open research. In 2011, he proposed Six Laws of Open Research to guide present and future open research projects, including OSM and MycetOS. Todd is on the editorial boards of Chemistry Central Journal, ChemistryOpen, PLOS One, Scientific Reports, and Scientific Data. Honours and awards 2011 – NSW Premier's Prize for Science & Engineering (Emerging Research) 2012 – Wellcome Trust/Google/PLOS Accelerating Science Award 2014 – Blue Obelisk Award 2017 – Medicine Maker Power List 2018 – Medicine Maker Power List 2019 – Medicine Maker Power List 2020 – Medicine Maker Power List 2021 – Medicine Maker Power List See also Open access Open collaboration Open innovation Open science data Open Source Drug Discovery Open-source model References External links Matthew H. Todd on Twitter Matthew H. Todd on ORCID Matthew H. Todd on Google Scholar Open Source Malaria Open Source Malaria on GitHub Open Source Malaria on Twitter Open Source Mycetoma on GitHub Open Source Mycetoma on Reddit Open Source Mycetoma on Twitter Open Source Pharma Open Source Tuberculosis Open Source Tuberculosis on GitHub Open Source Tuberculosis on Twitter British organic chemists English chemists Living people 1973 births Alumni of the University of Cambridge Academics of University College London Academics of Queen Mary University of London People associated with the UCL School of Pharmacy
Matthew Todd (chemist)
[ "Chemistry" ]
728
[ "Organic chemists", "British organic chemists" ]
69,027,563
https://en.wikipedia.org/wiki/Patricia%20Glibert
Patricia Marguerite Glibert is a marine scientist known for her research on nutrient use by phytoplankton and harmful algal blooms in Chesapeake Bay. She is an elected fellow of the American Association for the Advancement of Science. Education and career Glibert has an undergraduate degree from Skidmore College and a master's degree from the University of New Hampshire, where she examined the movement of nutrients in an estuary. Glibert moved to Harvard University for her Ph.D., which she earned in 1982 with a dissertation on the uptake of ammonium by small marine organisms. Following her Ph.D., Glibert was a postdoctoral researcher and scientist at Woods Hole Oceanographic Institution. In 1986 Glibert moved to the University of Maryland, where she was promoted to professor in 1993. In 2020, Glibert was elected president-elect of the Association for the Sciences of Limnology and Oceanography (ASLO), and followed Roxane Maranger as president in 2022. Research Glibert's research centers on nutrients, phytoplankton, and harmful algal blooms, especially the connection between harmful algal blooms and nutrients. She has conducted this research in multiple locations, including Shinnecock Bay on Long Island, Florida Bay, Chesapeake Bay, Kuwait Bay, the Scotian Shelf, and the waters off Cape Cod. She has examined the production and consumption of nitrogen, the effect of temperature on nutrient uptake, and the role of mixotrophy in nutrient use. Her work includes investigations into nutrient cycling in model organisms including Trichodesmium, Prorocentrum, and Synechococcus. Glibert's research encompasses issues of climate change and human impacts on the environment. Selected publications Awards and honors Glibert received an honorary doctorate from Linnaeus University in 2011, and was elected a fellow of the American Association for the Advancement of Science in 2012. 
She has also been named one of the top women professors in Maryland (2013), and is a sustaining fellow of the Association for the Sciences of Limnology and Oceanography (2016). Personal life Glibert describes herself as "one-half dual-career couple" and is married to Todd Kana, a phytoplankton ecologist at the University of Maryland. In 2016 they published Aquatic Microbial Ecology and Biochemistry: A Dual Perspective, a collection written by dual career couples who have collaborated on research in the field. They have three children; Glibert's daughter was the first child born to a woman scientist at Woods Hole Oceanographic Institution. References External links Fellows of the American Association for the Advancement of Science Skidmore College alumni University of New Hampshire alumni Harvard University alumni University of Maryland, College Park faculty Biogeochemists Women ecologists Living people Year of birth missing (living people) Presidents of the Association for the Sciences of Limnology and Oceanography
Patricia Glibert
[ "Chemistry" ]
596
[ "Geochemists", "Biogeochemistry", "Biogeochemists" ]
69,029,439
https://en.wikipedia.org/wiki/George%20C.%20Bompas
George Cox Bompas (18 April 1827 – 23 May 1905) was a British solicitor and astronomer. Bompas was born in Bloomsbury. He was the second son of Serjeant Charles Carpenter Bompas. His brother was William Bompas. Bompas was admitted as a solicitor in 1850 and continued to practise until 1903. He worked for the law firm Bischoff, Coxe and Bompas and was a solicitor for several of their companies. He was later employed as a lawyer in the George Earl Church debt contract with Bolivia. He married May Anne Scott Buckland, daughter of Rev. William Buckland, in 1860. They had four children. Bompas took an interest in astronomy and studied periodic meteor showers and the Zodiacal light. He was elected a Fellow of the Royal Astronomical Society on 14 December 1894. He was a Fellow of the Royal Geographical Society, the Geological Society and the Paleontological Society. In 1896, Bompas authored a paper titled "On evolution and design" for the Victoria Institute, of which he was a member, which argued for a form of theistic evolution. This was controversial because such a position usually invited disapproval from the Victoria Institute's membership, who favoured creationism. However, Bompas was a solicitor from a well-regarded Baptist family, so his paper received an unusually mild reaction. Selected publications Life of Frank Buckland: By his Brother-in-Law (1888) The Problem of the Shakespeare Plays (1902) References 1827 births 1905 deaths 19th-century British astronomers British solicitors English Baptists Fellows of the Royal Astronomical Society Fellows of the Royal Geographical Society People from Bloomsbury Theistic evolutionists
George C. Bompas
[ "Biology" ]
337
[ "Non-Darwinian evolution", "Theistic evolutionists", "Biology theories" ]
69,030,017
https://en.wikipedia.org/wiki/Potassium%20hypochromate
Potassium hypochromate is a chemical compound with the formula K3CrO4, containing the unusual Cr5+ ion. The compound is unstable in water but stable in alkaline solution, and was found to have a crystal structure similar to that of potassium hypomanganate. Preparation The compound is commonly prepared by reacting chromium(III) oxide and potassium hydroxide at 850 °C under argon: Cr2O3 + 6 KOH → 2 K3CrO4 + H2O + 2 H2 It can also be prepared in other ways, such as by replacing the chromium oxide with potassium chromate. It is important that no Fe2+ ions are present, because they would reduce the Cr(V) ions to Cr(III) ions. Reactions Potassium hypochromate decomposes in water to form chromium(III) oxide and potassium chromate when alkali is absent or present only at low concentration. Potassium hypochromate also reacts with acids such as hydrochloric acid to form chromium(III) oxide, potassium chromate, and potassium chloride: 6 K3CrO4 + 10 HCl → 4 K2CrO4 + Cr2O3 + 5 H2O + 10 KCl Oxidizing agents such as hydroperoxides can oxidize the hypochromate ion into chromate ions. At extremely high temperatures, it decomposes into potassium chromate and potassium metal. The compound is used to synthesize other compounds, such as chromyl chlorosulfate, by reacting it with chlorosulfuric acid. References Potassium compounds Chromates
Potassium hypochromate
[ "Chemistry" ]
341
[ "Chromates", "Oxidizing agents", "Salts" ]
69,030,134
https://en.wikipedia.org/wiki/2021%20Kitimat%20smelter%20strike
The 2021 Kitimat smelter strike was a 69-day labor strike by Rio Tinto workers in Kitimat, British Columbia, Canada. Background Aluminum producer Rio Tinto is the main employer in the municipality of Kitimat, in the North Coast region of British Columbia. The workers of the Rio Tinto aluminum smelter and Kemano hydroelectric power plant in Kitimat are represented by Unifor, as Local 2301, with around 950 workers in total. Strike The collective bargaining agreement between Unifor Local 2301 and Rio Tinto expired in 2021. Negotiations for a new agreement, however, broke down during the summer, with the union accusing Rio Tinto of having used contractors and temporary employees in violation of the previous agreement, of failing to address concerns over pensions, of forcing younger employees into bad pension plans, and of having several hundred employee grievances in backlog. Rio Tinto accused the union of having too many demands and of rejecting a third-party mediator for negotiations. On 21 July, the union announced a 72-hour strike notice, with the workers having voted unanimously to go on strike. National Unifor president Jerry Dias released a statement saying that "despite record-setting profits, Rio Tinto appears so unwilling to work with us and treat our members fairly" and that the union had proposed the first increase in employee benefits in a decade. On 25 July, at one minute past midnight, the first workers walked off the job. The strike had begun. The B.C. Labour Relations Board, however, issued an essential services order for the smelter, which would continue to operate at 25% capacity during the strike. Striking workers would receive $300 a week from the Unifor strike fund and an additional $100 a week from the Local 2301 strike fund to compensate for lost salaries. Despite reporting decreases in sales of up to 30%, a number of local businesses supported the strike, offering the striking workers free lunches and free haircuts. 
On 29 July, local NDP MP Taylor Bachrach visited the strikers, stating that "when you have a historic strike vote of 100 per cent for the first time in the history of this plant and in the history of this community, that means that things have gotten pretty bad." Local MLA Ellis Ross called for the British Columbian government to intervene, stating that he didn't want "to see Kitimat going into decline" or "people suffering to make mortgage payments." On 5 August, after there had been no negotiations between the union and Rio Tinto, several hundred local residents held a rally to "save the northwest," calling for the two sides to resume negotiating. On 15 August, the captain of MV Indiana, a Norwegian cargo ship that had arrived in Kitimat a week before the strike to collect aluminum shipments and got stuck due to the strike, released a statement saying that the ship was running out of low sulphur marine fuel. Arbutus Point Marine subcontractor Northwest Fuels, which held responsibility for delivering fuel to the ship, was refusing to cross the picket line. On 12 August, the two sides held a meeting to determine whether there was enough common ground to continue negotiations. On 25 August, the union and Rio Tinto agreed to resume collective bargaining negotiations. On 2 September, the two sides released a joint statement saying that the negotiations were progressing well. On 25 September, the union and Rio Tinto announced that a new collective bargaining agreement had been reached. On 4 October, the workers voted to ratify the deal by a majority of 70.6%, bringing an end to the strike. Rio Tinto also announced that a memorandum of understanding had been reached on working together and that a protocol for returning to work on the B.C. Works operation had been outlined. 
See also 1976 CASAW wildcat strike References 2021 labor disputes and strikes 2021 in British Columbia Labour disputes in British Columbia Unifor Aluminum in Canada Metallurgical industry of Canada Kitimat Manufacturing industry strikes in Canada
2021 Kitimat smelter strike
[ "Chemistry" ]
829
[ "Metallurgical industry of Canada", "Metallurgical industry by country" ]
69,031,669
https://en.wikipedia.org/wiki/Saildrone%20%28company%29
Saildrone, Inc. is a United States company based in Alameda, California, that designs, manufactures, and operates a fleet of unmanned/uncrewed surface vehicles (USVs), or ocean drones, known as "saildrones". The company was founded by engineer Richard Jenkins in 2012. Saildrone customers and research partners include various departments of the National Oceanic and Atmospheric Administration (NOAA), NASA, the University of New Hampshire, the University of Rhode Island, the Commonwealth Scientific and Industrial Research Organisation (Australia), the European Centre for Medium-Range Weather Forecasts, GEOMAR Helmholtz Centre for Ocean Research Kiel (Germany), the Monterey Bay Aquarium Research Institute, and others. History Founding (2012-2018) Saildrone was founded by Richard Jenkins in 2012. In 2014, Saildrone began a partnership with the National Oceanic and Atmospheric Administration's Pacific Marine Environmental Laboratory under a Cooperative Research and Development Agreement to develop and refine vehicle capabilities and sensor payloads. Objectives included acoustic fisheries surveys for management and conservation while also collecting metocean data. In 2016, Saildrone closed a $14 million Series A funding round. The round was led by Social Capital and included Capricorn Investment Group and Lux Capital. Saildrone had previously received mission-related investment from The Schmidt Family Foundation, a private foundation created by Eric and Wendy Schmidt. Projects and funding (2017-2023) In 2017, two saildrones deployed from San Francisco took part in the NASA-funded Salinity Processes in the Upper-ocean Regional Study 2 (SPURS-2) field campaign as part of their more than six-month Tropical Pacific Observing System (TPOS)-2020 pilot study in the eastern tropical Pacific. The mission compared saildrone measurements with those of the research vessel Revelle and the Woods Hole Oceanographic Institution (WHOI) SPURS-2 buoy. 
The data collected by the saildrones was found to be in good agreement with the ship and buoy, and demonstrated the saildrone to be “an effective platform for observing a wide range of oceanographic variables important to air-sea interaction studies,” according to a paper published in Oceanography. In 2018, the website DroneBelow reported the company raised a $60 million Series B funding round to scale operations with participation from Horizons Ventures as well as existing investors Social Capital, Capricorn Investment Group, and Lux Capital. In October 2020, the U.S. Coast Guard Research and Development Center in Hawaii began a 30-day test to "assess low-cost, commercially available autonomous solutions to improve maritime domain awareness in remote regions of the Pacific Ocean." Saildrone was one of two platforms tested. In 2021, TechCrunch reported the company had raised a $100 million Series C funding round led by Mary Meeker's investment fund Bond Capital with participation from new investors XN, Standard Investments, Emerson Collective, Crowley Maritime Corporation, as well as previous investors Capricorn's Technology Impact Fund, Lux Capital, Social Capital, and Tribe Capital. In 2022, the Saildrone Surveyor was recognized with the Innovation Award from the Blue Marine Foundation and BOAT International's annual Ocean Awards for revolutionizing ocean mapping. The company says that with 20 Saildrone Surveyors, it should be possible to achieve Seabed 2030's goal of mapping the world's oceans in high-resolution by the end of the decade. Vehicles There are three Saildrone platforms: Explorer, Voyager, and Surveyor. All three Saildrone uncrewed surface vehicles (USVs) combine wind-powered propulsion technology with solar-powered meteorological and oceanographic sensors. Saildrone Explorer The Saildrone Explorer is a USV that can sail at an average speed of (depending on the wind) and stay at sea for up to 365 days. 
The Explorer is designed for fisheries missions, metocean data collection, ecosystem monitoring, and satellite calibration and validation missions. Saildrone Voyager In August 2021, Seapower Magazine reported the company is adding a new mid-size USV to the fleet: the Voyager is a USV with primary wind power and auxiliary propulsion from a 4 kW electric motor for a wide variety of missions including bathymetry (ocean mapping) missions, border patrol and maritime domain awareness. The average speed is 5 knots. Saildrone Surveyor At long and weighing 14 tons, the Surveyor is the largest vehicle in the Saildrone fleet. According to Wired, the Surveyor was first launched in January 2021 and is designed to carry multibeam echo sounders for IHO-compliant bathymetry surveys. The Surveyor's multibeam echo sounders can map the ocean seafloor to depths of . It also carries an acoustic Doppler current profiler to measure the speed and direction of ocean currents. In July 2021, the Surveyor completed its first trans-Pacific mapping mission sailing from San Francisco to Honolulu, Hawaii, and mapping of seafloor along the way. Hawaii News Now reported that 20 Surveyors could map the entire ocean in less than 10 years. In September 2022, it was announced that Austal USA signed an agreement with Saildrone to build Saildrone Surveyor drones by year end for the US Navy and other customers. In April 2024, Saildrone and Thales Australia announced a partnership to integrate the Thales BlueSentry thin-line towed array with the Surveyor for conducting autonomous long-endurance anti-submarine warfare (ASW) missions. Missions 2019 Antarctic circumnavigation In January 2019, a consortium of organizations led by the Li Ka Shing Foundation launched an autonomous circumnavigation of Antarctica using a group of saildrones. 
Researchers from agencies around the world participated including from NOAA, NASA, CSIRO, Palmer Long-Term Ecological Research, the Scripps Institution of Oceanography, the Southern Ocean Observing System, the Japan Agency for Marine-Earth Science and Technology, the Korea Polar Research Institute, the Norwegian Polar Institute, the University of Exeter, the University of Gothenburg, the University of Otago, and the New Zealand National Institute of Water and Atmospheric Research. Bloomberg Businessweek reported that, on August 3, 2019, SD 1020 became the first autonomous vehicle to circumnavigate Antarctica, having spent 196 days in the Southern Ocean sailing 13,670 miles. During the mission, SD 1020 had to survive freezing temperatures, waves, winds, and collisions with giant icebergs. In order to survive the extreme conditions of the Southern Ocean, the saildrone was equipped with a special "square" wing. According to a paper published in Geophysical Research Letters by oceanographers Adrienne Sutton, Nancy Williams, and Bronte Tilbrook, one aspect of the mission focused on using Saildrone in situ data collection to better understand the role of the Southern Ocean in regulating the global carbon budget. Assumptions that the Southern Ocean is a significant carbon sink had previously been made using ship-based measurements, which are limited due to challenging ocean conditions in the Southern Ocean. The data collected by the saildrone was used to reduce uncertainty about Southern Ocean uptake: "By directly measuring air and surface seawater carbon dioxide (CO2) and wind speed on the USV, we were able to observe exchange between the ocean and atmosphere every hour during the mission. Using this data set, we estimated potential errors in these measurements as well as other approaches to estimating exchange." 
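The bulk-flux estimate described by Sutton, Williams, and Tilbrook can be sketched numerically. This is a minimal illustration only: the quadratic wind-speed parameterization (a Wanninkhof-style gas transfer velocity) and every numeric input below are assumptions chosen for the example, not the coefficients actually used in the mission paper.

```python
def co2_flux(u10_ms, pco2_sea_uatm, pco2_air_uatm,
             schmidt=1000.0, solubility_mol_l_atm=0.035):
    """Bulk air-sea CO2 flux in mmol m^-2 day^-1 (positive = outgassing).

    u10_ms -- wind speed at 10 m, as measured on the saildrone (m/s)
    pco2_* -- partial pressures of CO2 in surface water and air (uatm)
    The Schmidt number and solubility defaults are illustrative values.
    """
    # Gas transfer velocity in cm/hr (assumed form: k = 0.251 U^2 (Sc/660)^-0.5)
    k_cm_hr = 0.251 * u10_ms ** 2 * (schmidt / 660.0) ** -0.5
    delta_pco2_atm = (pco2_sea_uatm - pco2_air_uatm) * 1e-6
    # Unit bookkeeping: (cm/hr -> m/hr) * (mol/L/atm -> mol/m^3/atm) * atm * 24 hr
    flux_mol_m2_day = (k_cm_hr * 1e-2) * (solubility_mol_l_atm * 1e3) * delta_pco2_atm * 24
    return flux_mol_m2_day * 1e3  # mol -> mmol

# Example: supersaturated surface water in 10 m/s winds -> outgassing
print(round(co2_flux(10.0, 400.0, 380.0), 2))  # ≈ 3.43 mmol m^-2 day^-1
```

The sign convention makes the direction of exchange explicit: when surface-water pCO2 is below the atmospheric value, the flux goes negative, i.e. the ocean takes up carbon, which is the quantity the Southern Ocean mission set out to constrain.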
2021 Atlantic hurricane mission In partnership with NOAA, Saildrone deployed five vehicles equipped with "hurricane" wings to the tropical Atlantic Ocean to study air-sea heat exchange to better understand hurricane rapid intensification during the 2021 Atlantic hurricane season. On September 30, 2021, SD 1045 became the first Saildrone Explorer to sail into a category 4 hurricane. It collected ocean data and video from inside Hurricane Sam where the sea state included waves and wind speeds reached over . NOAA has stated that it will deploy five more Saildrone USVs during the 2022 hurricane season. References External links Official website Unmanned surface vehicles of the United States Oceanographic instrumentation Data companies Companies based in Alameda, California Technology companies based in the San Francisco Bay Area Unmanned surface vehicle manufacturers
Saildrone (company)
[ "Technology", "Engineering" ]
1,681
[ "Oceanographic instrumentation", "Measuring instruments" ]
69,036,261
https://en.wikipedia.org/wiki/Porsche%20547%20engine
The Porsche 547 and Porsche 547/3 are naturally-aspirated, flat-four boxer racing engines, designed by Porsche for Formula One racing, between and . History In October 1958, the Fédération Internationale de l'Automobile (FIA) announced that for the 1961 Formula One season, engine capacity would be limited to the same 1.5 litres as in Formula Two (F2). This meant that Porsche could use their F2 cars almost unchanged in F1. The 787 would not get the eight-cylinder though, continuing with the air-cooled, DOHC four-cylinder Type 547 boxer engine that had been developed by Ernst Fuhrmann and that had powered the 550 Spyders and 718 series until then. It was powered by a 547/3 four-cylinder engine with Kugelfischer fuel injection. At Monaco the car retired when the fuel injection cut out. A second car, also fitted with the 547/3 engine, was completed in time to appear in the Dutch Grand Prix on 22 May alongside the other 787. Technical data Applications Porsche 787 Porsche 718 References Porsche 1961 in Formula One Formula One engines Porsche in motorsport Boxer engines Engines by model Gasoline engines by model
Porsche 547 engine
[ "Technology" ]
248
[ "Engines", "Engines by model" ]
69,036,377
https://en.wikipedia.org/wiki/Porsche%20753%20engine
The Porsche Type 753 engine is a naturally-aspirated, flat-eight racing engine, designed by Porsche for Formula One racing. It was used for a single season in 1962 in the 1½ litre formula. Background In October 1958 the Fédération Internationale de l'Automobile (FIA) announced that for the 1961 Formula One season, engine capacity would be limited to the same 1.5 litres as in Formula Two (F2). This meant that Porsche could use their F2 cars almost unchanged in F1. For 1961 Porsche launched the Type 787. The car had a new chassis that was longer than that of the 718/2 by an additional to accommodate the Type 753 then in development. While it kept the earlier car's rear suspension, at the front was a new upper and lower A-arm suspension with coil springs. The first chassis completed was powered by a 547/3 four-cylinder engine with Kugelfischer fuel injection. At the Monaco Grand Prix the car retired when the fuel injection cut out. A second car, also fitted with the 547/3 engine, was completed in time to appear in the Dutch Grand Prix alongside the other 787. The cars placed 10th and 11th, but their lack of power and poor handling caused Ferry Porsche to retire the model. Porsche would focus on building a brand new competitive formula race car with an eight-cylinder engine. Like the Porsche 787 before it, the 804 engine had a smoother surface than its predecessor, which was achieved in part by using a horizontal cooling fan (vertical axis) on top of the new engine, in contrast to the vertically-mounted (horizontal axis) cooling fan used on the four-cylinder Fuhrmann engine. Engine The design of the new Type 753 was handled by Hans Hönick and Hans Mezger. It continued Porsche's traditions of a boxer layout and air-cooling. The bore and stroke were respectively, giving a displacement of . 
During the development of the 804, there were concerns about the readiness of the eight-cylinder engine, so a second chassis, 804-02, was modified to accept the air-cooled 1.5-liter four-cylinder boxer engine type 547 from the 787. That chassis was later converted back to the eight-cylinder configuration before ever racing with the four-cylinder engine. Swiss racing driver, engineer, and fuel injection specialist Michael May transferred from Mercedes-Benz to Porsche to work on the 753 engine, but wound up developing improvements for the 547/3 engine instead, before leaving Porsche for Ferrari. Type 753 Work began on Type 753 in 1960, following the announcement of a 1.5-litre displacement limit for the 1961 Formula One (F1) season. The design of the new F1 engine, Porsche's first flat-eight, was done by Hans Hönick and Hans Mezger. The 753 inherited the traditional Porsche features of a boxer layout and air-cooling, but the number of cylinders increased to eight. Bore and stroke were respectively, resulting in a displacement of . The oversquare dimensions kept piston speeds low, and also kept the engine narrow and as far out of the airflow on the sides of the car's tub as possible, although it was still wider than the 120° V6 and 90° V8s of the competition. The centre of the engine was a magnesium crankcase cast in two halves split vertically along the centre-line of the crankshaft. The crankcase carried a one-piece crankshaft in nine main bearings. 
The eight aluminum cylinder barrels had their bores treated with a spray-on molybdenum/steel coating called Ferral. Each finned cylinder had its own separate aluminum cylinder head, with four studs per cylinder holding the heads and barrels to the crankcase. An aluminum valve-gear cover cast as a single piece stabilized the four cylinders on each side of the engine. The valvetrain was similar in some respects to that designed by Ernst Fuhrmann for the Type 547 four-cylinder engine. There were two overhead camshafts per cylinder bank, operating two valves per cylinder. As with the 547, the cams were driven by shafts rather than gears or chains, and the cam lobes were separate pieces that were keyed onto the shaft. The 753 added a second countershaft above the crankshaft to the single one underneath the crankshaft in the 547. Both countershafts rotated at half crankshaft speed. Two layshafts from the upper countershaft drove the left and right intake camshafts, while two other layshafts from the lower countershaft drove the exhaust camshafts, eliminating the vertical shafts in the 547's cylinder heads that gave that engine one of its nicknames. A short vertical shaft from the bevel gear on the right-hand inlet camshaft drove the axial cooling fan at 0.92x crankshaft speed. The valvetrain was designed to operate reliably at up to 10,000 rpm. The engine had a dry sump system with a separate oil tank. A Bosch dual ignition system with four ignition coils and two distributors fired two spark plugs per cylinder. The air-fuel mixture was delivered by four Weber double downdraft carburetors, two on each side. Assembly of the engine was a time-consuming job, often requiring repeated assembly and disassembly with extensive hand-fitting of components. Building and setting up a 753 never took less than 100 hours and could take up to 220 hours. The engine, with exhaust and clutch, was long, wide, high and weighed . 
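The gear ratios quoted above lend themselves to a quick sanity check. A small sketch (illustrative arithmetic only, using the ratios given in the text):

```python
def drivetrain_speeds(crank_rpm):
    """Component speeds implied by the 753's quoted gear ratios."""
    return {
        # Countershafts (and the camshafts driven from them) turn at
        # half crankshaft speed, as in any four-stroke valvetrain.
        "countershaft_rpm": crank_rpm / 2,
        "camshaft_rpm": crank_rpm / 2,
        # The axial cooling fan is quoted at 0.92x crankshaft speed.
        "fan_rpm": 0.92 * crank_rpm,
    }

# At the 9200 rpm quoted for the engine's first race outing:
speeds = drivetrain_speeds(9200)
# camshafts ≈ 4600 rpm, fan ≈ 8464 rpm -- comfortably inside the
# valvetrain's 10,000 rpm design limit at the crank.
```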
A prototype engine was first started on a test-bench on 12 December 1960. Initial power output was disappointing (some sources say ), when the target had been . Mezger and his team worked to improve both the engine's reliability and power output. The earliest engines had a 90° angle between the valves. When this was reduced, first to 84° and subsequently to 72°, power output rose. Other changes included reshaping the combustion chamber, lightening crankpins, and switching to titanium connecting rods. Power was eventually raised to . Although the chassis of the Type 787 F1 car was lengthened to accommodate the 753, the flat-eight was never installed and the car used the 547 throughout its short life. The 753 engine debuted in Porsche's Formula One Type 804 on 20 May 1962 at the Dutch Grand Prix at Zandvoort. With a compression ratio of 10.0:1, the engine produced at 9200 rpm on its first outing. This was still less power than the new Coventry-Climax and BRM V8 engines. With the improved six-speed transmission from the Type 718 and a ZF limited-slip differential, the car reached a top speed of . The 753 delivered Porsche's only F1 win as a constructor at the 1962 French Grand Prix at Rouen-Les-Essarts, in an 804 driven by Dan Gurney. A short-stroke version of the engine was developed, designated the 753/1. The 753 also influenced the design of the engine for Porsche's 901 project, which would become the 911. Technical summary Applications Porsche 804 References Porsche 1962 in Formula One Formula One engines Porsche in motorsport Boxer engines Engines by model Gasoline engines by model
Porsche 753 engine
[ "Technology" ]
1,632
[ "Engines", "Engines by model" ]
69,036,831
https://en.wikipedia.org/wiki/Elimination%20of%20tuberculosis
Elimination of tuberculosis is the effort to reduce the number of tuberculosis (TB) cases to less than one per 1 million population, contrasted with the effort to completely eradicate infection in humans worldwide. The goal of tuberculosis elimination is hampered by the lack of rapid tests, short effective treatment courses, and a completely effective vaccine. The WHO as well as the Stop TB Partnership aim for the full elimination of TB by 2050, requiring a 1000-fold reduction in tuberculosis incidence. As of 2017, tuberculosis has not been eliminated from any country. Feasibility Tuberculosis has been a curable illness since the 1940s when the first drugs became available, although multidrug-resistant and extensively drug-resistant TB present an increasing challenge. According to a 2017 article in International Journal of Infectious Diseases, tuberculosis eradication would be possible if enough technology, funding, and political will were available, but the characteristics of the disease do not make it easy to eradicate. So far, tuberculosis has not been eliminated from any country. According to European Respiratory Review, tuberculosis eradication is not considered possible due to the lack of a completely effective vaccine and the large reservoir of people infected with latent tuberculosis. According to a 2013 review, tuberculosis elimination will require not just treating active tuberculosis but also latent cases, and eliminating tuberculosis worldwide by 2050 is not possible, although great reductions in infections and deaths are possible. Addressing poverty is a further requirement for eliminating tuberculosis. People who are poor are disproportionately affected by tuberculosis because the disease is made worse by inadequate housing and malnutrition, and poverty can make it difficult to get treatment. The WHO has estimated that eliminating poverty would reduce tuberculosis incidence by 84 percent. 
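The scale of the 2050 goal can be made concrete with a one-line calculation. Assuming, purely for illustration, a constant exponential decline from a 2015 baseline, the 1000-fold reduction cited above implies:

```python
# (1 - r)^35 = 1/1000  ->  r = 1 - 1000^(-1/35)
# The 2015 starting year is an assumption for this sketch.
years = 2050 - 2015          # 35 years of decline
factor = 1000                # the 1000-fold reduction cited above
annual_decline = 1 - factor ** (-1 / years)
print(f"required decline: {annual_decline:.1%} per year")  # 17.9% per year
```

A sustained decline of roughly 18% per year, every year for 35 years, is why the 2050 elimination target is widely regarded as out of reach with current tools.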
Elimination strategies Globally In 2014, the World Health Organization launched the End TB Strategy with the goal of reducing tuberculosis deaths by 95% and incidence by 90% before 2035. As of 2020, the world was not on track to meet those goals. The WHO, as well as the Stop TB Partnership, are now aiming for the full elimination of TB by the year 2050, which will require a 1000-fold reduction in the incidence of the disease. India In 2017, the Indian government announced its intention to eliminate tuberculosis in the country by 2025. The previous year, it accounted for 27 percent of tuberculosis cases and 29 percent of deaths worldwide, making it the highest burden country for both tuberculosis and multidrug-resistant tuberculosis. References Epidemiology Health campaigns Tuberculosis Infectious diseases with eradication efforts
Elimination of tuberculosis
[ "Environmental_science" ]
498
[ "Epidemiology", "Environmental social science" ]
69,037,042
https://en.wikipedia.org/wiki/Egil%20Lillest%C3%B8l
Egil Sigurd Lillestøl (19 March 1938 – 27 September 2021) was a Norwegian experimental elementary particle physicist. Education and early career Lillestøl graduated in 1964 from the University of Bergen and obtained his PhD from the same university in 1970. He was appointed associate professor at the university the same year and appointed professor in 1984. Leading up to his professorship in Bergen, he had been a fellow at CERN (1964–1967) in Geneva, Switzerland and a guest researcher at Collège de France (1973) in Paris. Career Lillestøl was involved in experiments carried out at DESY, Hamburg, Germany, where he was central in the PLUTO Collaboration, and at CERN, where he was a member of the DELPHI Collaboration. Furthermore, he was instrumental in the process to develop a long-term funding model allowing Norwegian research groups to participate in the LHC experiments ATLAS and ALICE. Still affiliated with the University of Bergen, Lillestøl took up positions at CERN. He was Deputy Head of CERN's Physics Division from 1990 to 1992. After this he split his time between the university and CERN. From 1992 to 2009 he was Head of the European School of Particle Physics, CERN Schools of Physics. Physics communication Lillestøl put a strong emphasis on physics communication. In addition to teaching at university level and giving presentations at international conferences, he also communicated complex physics concepts to both school children and the general public. He was a regular lecturer at the European Space Camp from 2001 onwards. Energy and climate Lillestøl was also interested in global energy issues and in the possibility of solving the world's energy problems using thorium-based nuclear power. He advocated the use of nuclear power to solve the international energy crisis and argued that Norway should build a prototype of a thorium reactor. He was a founding member of the International Thorium Energy Committee. 
Lillestøl was critical of the Norwegian climate debate and pointed out that the Earth's climate is very complicated, and that climate science is young and still has a long way to go before one can hope to find complete explanations for the Earth's climate variations. Bibliography Lillestøl co-authored the popular science book The Search for Infinity (1994), which has been translated into eight other languages. He also contributed to the preparation of the Norwegian textbook Generell fysikk for universiteter og høgskoler (General Physics for Universities and University Colleges) (2001). Furthermore, he was also one of the main drivers behind the UNESCO-supported traveling exhibition “Science bringing nations together”, organised jointly by JINR and CERN. His scientific output, as recorded by the database Inspire-HEP, amounts to nearly one hundred works. Awards In 2007, he was awarded the Research Council of Norway's Award for Excellence in Communication of Science for "dissemination of basic physics to schoolchildren and the public". Egil Lillestøl was a member of the Norwegian Academy of Technological Sciences and the Royal Norwegian Society of Sciences and Letters. References 1938 births 2021 deaths Members of the Norwegian Academy of Technological Sciences People associated with CERN Norwegian physicists Royal Norwegian Society of Sciences and Letters Particle physicists Science communicators Academic staff of the University of Bergen
Egil Lillestøl
[ "Physics" ]
661
[ "Particle physicists", "Particle physics" ]
69,037,339
https://en.wikipedia.org/wiki/Porsche%20flat-twelve%20engine
Porsche produced a series of naturally-aspirated, and later twin-turbocharged, flat-twelve engines for their Porsche 917 sports prototypes between 1969 and 1973. Overview The engine was designed by chief engineer Hans Mezger under the leadership of Ferdinand Piëch and Helmuth Bott. Power came from a new 4.5-litre air-cooled engine designed by Mezger, effectively a combination of two of Porsche's 2.25-litre flat-6 engines used in previous racing cars. The 'Type 912' engine featured a 180° flat-12 cylinder layout, twin overhead camshafts driven from centrally mounted gears and twin spark plugs fed from two distributors. The large horizontally mounted cooling fan was also driven from centrally mounted gears. It was Porsche's first 12-cylinder engine and used many components made of titanium, magnesium and exotic alloys that had been developed for lightweight "Bergspider" hill climb racers. Some methods of weight reduction were rather simple, such as making the gear shift knob out of birch wood; others were not, such as using the tubular frame itself as oil piping to the front oil cooler. By 1971, the original 4.5-litre engine, which had produced around 520 hp in 1969, had been enlarged through 4.9 litres (600 hp) to 5 litres and produced a maximum of 630 hp. The favorite team to win, Gulf-backed John Wyer Automotive, lined up three 917Ks, two with the 4.9-litre engine, and one with the 4.5-litre unit. Two 917 LH were entered at Le Mans, one in white and red trim by Porsche Salzburg. Driven by Vic Elford and Kurt Ahrens, the pole-sitting car's 4.9-litre engine dropped an inlet valve after 225 laps. Both drivers had also been entered on the team's other car, a red and white 917 K with the 4.5-litre engine, qualified by Hans Herrmann and Richard Attwood in a rather low 15th spot, but they did not drive it after their own car failed. The car with the 4.5-litre engine gained the nickname of the Hippie Car or the Psychedelic Porsche from the team and media. 
At the end it was the red and white #23 917K of Porsche Salzburg, with the standard 4.5-litre engine, carefully driven by Stuttgart's own Hans Herrmann and Englishman Richard Attwood through the pouring rain, that finally scored the first overall win at Le Mans, in a wet race that saw only 7 ranked finishers. The domination of Gulf-Wyer and Martini Porsches in 1971 was overwhelming. The only potential challenger to the 917 appeared early in the season: Roger Penske had bought a used 512S chassis that was dismantled and rebuilt beyond M specification. The car was specially tuned for long races, receiving many unique features among which were a larger rear wing and an aviation-inspired quick refueling system. The engine was tuned by Can-Am V8 specialist Traco and able to deliver more than 600 hp (450 kW). As the new rules for the 3-liter prototypes were not favorable to their existing low-weight, low-power Porsche 908, Porsche decided against developing a new high power engine that could keep up with the F1-based engine designs of the competition — at least in naturally aspirated form. In 1976 they would return to sport-prototype racing with the turbocharged Porsche 936 race cars after the engines were tested in Porsche 911 versions. After their successes with the 917 mainly in Europe, Porsche instead decided to focus on the North American markets and the Can-Am Challenge. For that series, larger and more powerful engines were needed. Although a 16-cylinder engine with about was tested, a turbocharged 12-cylinder engine with comparable power output was ultimately used. The turbocharged 917/10K entered by Penske Racing won the 1972 series with George Follmer, after a testing accident sidelined primary driver Mark Donohue. This broke the five-year stranglehold McLaren had on the series. 
The further evolution of the 917, the 917/30 with revised aerodynamics, a longer wheelbase and an even stronger 5.4-litre engine with around in race trim, won the 1973 edition, winning all races but two: Charlie Kemp won the Mosport race and George Follmer won Road Atlanta, while Mark Donohue won the rest. Most of the opposition was made up of private 917/10Ks as McLaren, unable to compete against the 917 turbos, had already left the series to concentrate on Formula 1 (and USAC, for several years). The 917 and its engine's domination, the oil crisis, and fiery tragedies like Roger Williamson's in Zandvoort pushed the SCCA to introduce a 3 miles per U.S. gallon maximum fuel consumption rule for 1974. Due to this change, the Penske 917/30 competed in only one race in 1974, and some customers retrofitted their 917/10K with naturally aspirated engines. The 917/30 was the most powerful sports car racer ever built and raced. The 5.374-litre 12-cylinder (90.0 x 70.4 mm) twin-turbocharged engine could produce around at 7,800 rpm in race trim. The 917/30 dominated the Can-Am series during the 1973 season. The 917 was also the only championship-winning car in Can-Am not to be powered by Chevrolet. Applications Porsche 917 References Porsche Porsche in motorsport Boxer engines Engines by model Gasoline engines by model Flat engines
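The bore and stroke quoted for the 917/30 make its displacement easy to verify, and the standard mean-piston-speed formula shows what the stroke implies at race rpm. A short check, using only standard formulas and the figures quoted above:

```python
from math import pi

def displacement_cc(bore_mm, stroke_mm, cylinders):
    """Swept volume: n * (pi/4) * bore^2 * stroke, converted mm^3 -> cc."""
    return pi / 4 * bore_mm ** 2 * stroke_mm * cylinders / 1000

def mean_piston_speed_ms(stroke_mm, rpm):
    """Mean piston speed: two strokes per crank revolution, in m/s."""
    return 2 * (stroke_mm / 1000) * rpm / 60

# 90.0 mm x 70.4 mm, 12 cylinders, as quoted for the 917/30:
print(round(displacement_cc(90.0, 70.4, 12)))      # ≈ 5374 cc, i.e. 5.374 litres
print(round(mean_piston_speed_ms(70.4, 7800), 1))  # ≈ 18.3 m/s at 7,800 rpm
```

The first result reproduces the 5.374-litre figure in the text exactly, which is a useful cross-check that the quoted bore/stroke and displacement are mutually consistent.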
Porsche flat-twelve engine
[ "Technology" ]
1,165
[ "Engines", "Engines by model" ]
69,037,644
https://en.wikipedia.org/wiki/Roscosmos%20Cosmonaut%20Corps
The Cosmonaut Corps () is a unit of Russia's Roscosmos state corporation that selects, trains, and provides cosmonauts as crew members for Russian and international space missions. It is part of the Yuri Gagarin Cosmonaut Training Center, based at Star City in Moscow Oblast, Russia. History The development of Soviet science and technology made it possible, by the end of the 1950s, to consider the issues of crewed space flight. At the beginning of 1959, the President of the USSR Academy of Sciences Mstislav Keldysh held a meeting at which questions about crewed space flight were discussed specifically, right down to "who should fly?". The decision on the selection and training of astronauts for the first space flight on the spacecraft "Vostok" was made in the Resolution of the Central Committee of the Communist Party and the Council of Ministers of the USSR No. 22-10 "On the medical selection of candidates for astronauts", dated January 5, 1959, and in the Resolution of the Council of Ministers of the USSR No. 569-264 "On the preparation of man for space flights", May 22, 1959. The selection of candidates for the cosmonaut corps was entrusted to the command of the Air Force of the Armed Forces, military doctors and medical flight commissions, which monitored the health of pilots in units and formations, and the training of future cosmonauts was entrusted to the Air Force of the Armed Forces of the USSR. Later, the selection was directly entrusted to a group of specialists from the Central Military Research Aviation Hospital (TsVNIAH). The cosmonaut corps was formed on January 11, 1960; by the order of the Commander-in-Chief of the Air Force of the Armed Forces of the USSR, dated March 7, 1960, the first 12 pilots who passed the initial selection were appointed to the post of listener-cosmonauts of the Air Force. The first cosmonaut corps, which included the future first cosmonaut, Yuri Gagarin, consisted of twenty people. 
On March 23, 1961, Yuri Gagarin was appointed commander of the cosmonaut corps. The first cosmonaut corps was military unit No. 26266, which was formed with the task of training cosmonauts; a little later it was transformed into the Cosmonaut Training Center of the Air Force of the Armed Forces. After the dissolution of the Soviet Union, the Corps became partly civilian and was managed by the Russian Space and Aviation Agency (RKA). Organization The Cosmonaut Corps is based at the Yuri Gagarin Cosmonaut Training Center in Star City, Russia, although members may be assigned to other locations based on mission requirements. The Chief of the Cosmonaut Office is the most senior leadership position for active cosmonauts in the Corps. The Chief serves as head of the Corps and is the principal adviser to the Roscosmos Director-General on cosmonaut training and operations. The first Chief was Yuri Gagarin, appointed in 1960. The current Chief is Maksim Kharlamov. Requirements In order to enter the cosmonaut corps, a candidate for the role of a space pilot must pass medical and psychological tests (at the Central Research Aviation Hospital), as well as undergo a face-to-face interview. During the Soviet era, membership in the Communist Party of the Soviet Union was also a prerequisite for joining the cosmonaut corps. The main current requirements for joining the cosmonaut corps are Russian citizenship, age up to 35, a higher education, knowledge of English, successful completion of medical and psychological tests, and body weight within the prescribed limit. List of active cosmonauts The corps has 24 "active" cosmonauts: 1 woman and 23 men. All of the current members of the cosmonaut corps were selected in 1996 or later. Missions underlined are in progress. Missions in italics are scheduled and subject to change.
List of former cosmonauts (partial) Russia and the Soviet Union The Soviet space program came under the control of the Russian Federation in December 1991; the new program, now called the Russian Federal Space Agency, retained continuity of equipment and personnel with the Soviet program. While all Soviet and RKA cosmonauts were born within the borders of the U.S.S.R., many were born outside the boundaries of Russia, and may be claimed by other Soviet successor states as nationals of those states. These cosmonauts are marked with an asterisk * and their place of birth is shown in an appended list. All, however, claimed Soviet or Russian citizenship at the time of their space flights. A Viktor Mikhaylovich Afanasyev — Soyuz TM-11, Soyuz TM-18, Soyuz TM-29, Soyuz TM-33/32 Vladimir Aksyonov (1935–2024) — Soyuz 22, Soyuz T-2 Aleksandr Pavlovich Aleksandrov — Soyuz T-9, Soyuz TM-3 Ivan Anikeyev (1933–1992) — Expelled from Vostok program; no flights. Anatoly Artsebarsky* — Soyuz TM-12 Yuri Artyukhin (1930–1998) — Soyuz 14 Oleg Atkov — Soyuz T-10/11 Toktar Aubakirov* — Soyuz TM-13/12 Sergei Avdeyev — Soyuz TM-15, Soyuz TM-22 B Andrei Babkin — No flights. Aleksandr Balandin — Soyuz TM-9 Yuri Baturin — Soyuz TM-28/27, Soyuz TM-32/31 Pavel Belyayev (1925–1970) — Voskhod 2 Georgi Beregovoi* (1921–1995) — Soyuz 3 Anatoly Berezovoy (1942–2014) — Soyuz T-5/7 Valentin Bondarenko (1937–1961) — No flights. Andrei Borisenko — Soyuz TMA-21 Nikolai Budarin — STS-71/Soyuz TM-21, Soyuz TM-27, STS-113/Soyuz TMA-1 Valery Bykovsky — (1934–2019) — Vostok 5, Soyuz 22, Soyuz 31/29 D Vladimir Dezhurov — Soyuz TM-21/STS-71 Georgy Dobrovolsky* (1928–1971), Died on reentry. — Soyuz 11 Lev Dyomin (1926–1998) — Soyuz 15 Vladimir Dzhanibekov* — Soyuz 27/26, Soyuz 39, Soyuz T-12, Soyuz T-13 F Konstantin Feoktistov (1926–2009) — Voskhod 1 Valentin Filatyev (1930–1990) — Expelled from Vostok program; no flights. 
Anatoly Filipchenko (1928–2022) — Soyuz 7, Soyuz 16 G Yuri Gagarin (1934–1968), First person in space. — Vostok 1 Yuri Gidzenko* — Soyuz TM-22, Soyuz TM-31/STS-102, Soyuz TM-34/Soyuz TM-33 Yuri Glazkov (1939–2008) — Soyuz 24 Viktor Gorbatko (1934–2017) — Soyuz 7, Soyuz 24, Soyuz 37/36 Georgi Grechko (1931–2017) — Soyuz 17, Soyuz 26/27, Soyuz T-14/13 Aleksei Gubarev (1931–2015) — Soyuz 17, Soyuz 28 I Aleksandr Ivanchenkov — Soyuz 29/31, Soyuz T-6, Anatoli Ivanishin — Soyuz TMA-22, Soyuz MS-01, Soyuz MS-16, K Aleksandr Kaleri* — Soyuz TM-14, Soyuz TM-24, Soyuz TM-30, Soyuz TMA-3, Soyuz TMA-01M Yevgeny Khrunov (1933–2000) — Soyuz 5/4 Leonid Kizim* (1941–2010) — Soyuz T-3, Soyuz T-10/11, Soyuz T-15 Pyotr Klimuk* — Soyuz 13, Soyuz 18, Soyuz 30 Vladimir Komarov (1927–1967), Died on reentry. — Voskhod 1, Soyuz 1 Yelena V. Kondakova — Soyuz TM-20/STS-84 Dmitri Kondratyev — Soyuz TMA-20 Mikhail Korniyenko — Soyuz TMA-18, Soyuz TMA-16M Valery Korzun — Soyuz TM-24, STS-111/113 Oleg Kotov* — Soyuz TMA-10, Soyuz TMA-17, Soyuz TMA-10M Vladimir Kovalyonok* — Soyuz 25, Soyuz 29/31, Soyuz T-4 Konstantin Kozeyev — Soyuz TM-33/32 Sergei Krikalev — Soyuz TM-7, Soyuz TM-12/ Soyuz TM-13, STS-60, STS-88, Soyuz TM-31/STS-102, Soyuz TMA-6 Valeri Kubasov (1935–2014) — Soyuz 6, Soyuz 19, Soyuz 36/35 L Aleksandr Laveykin — Soyuz TM-2 Vasili Lazarev (1928–1990) — Soyuz 12, Soyuz 18a Aleksandr Lazutkin — Soyuz TM-25 Valentin Lebedev — Soyuz 13, Soyuz T-5/7 Alexei Leonov (1934–2019) — Voskhod 2 (first walk in space), Soyuz 19 Anatoli Levchenko* (1941–1988) — Soyuz TM-4/3 Yuri Lonchakov* — STS-100, Soyuz TMA-1/TM-34, Soyuz TMA-13 Vladimir Lyakhov* (1941–2018) — Soyuz 32/34, Soyuz T-9, Soyuz TM-6/5 M Oleg Makarov (1933–2003) — Soyuz 12, Soyuz 18a, Soyuz 27/26, Soyuz T-3 Yuri Malenchenko* — Soyuz TM-19, STS-106, Soyuz TMA-2, Soyuz TMA-11, Soyuz TMA-05M, Soyuz TMA-19M, Yury Malyshev (1941–1999) — Soyuz T-2, Soyuz T-11/10 Gennadi Manakov (1950–2019) — Soyuz TM-10, Soyuz TM-16 Denis Matveev* — Soyuz 
MS-21 Musa Manarov* — Soyuz TM-4/6, Soyuz TM-11 Alexander Misurkin — Soyuz TMA-08M, Soyuz MS-06, Soyuz MS-20 Boris Morukov (1950–2015) — STS-106 Talgat Musabayev* — Soyuz TM-19, Soyuz TM-27, Soyuz TM-32/31 N Grigori Nelyubov (1934–1966) — Expelled from Vostok program, no flights. Andriyan Nikolayev (1929–2004) — Vostok 3, Soyuz 9 O Yuri Onufrienko* — Soyuz TM-23, STS-108/111 P Gennady Padalka — Soyuz TM-28, Soyuz TMA-4, Soyuz TMA-14, Soyuz TMA-04M, Soyuz TMA-16M Viktor Patsayev* (1933–1971), Died on reentry. — Soyuz 11 Aleksandr Poleshchuk — Soyuz TM-16 Valeri Polyakov (1942–2022) — Soyuz TM-6/7, Soyuz TM-18/20 Leonid Popov* — Soyuz 35/37, Soyuz 40, Soyuz T-7/5 Pavel Popovich* (1930–2009) — Vostok 4, Soyuz 14 R Sergei Revin — Soyuz TMA-04M Roman Romanenko — Soyuz TMA-15, Soyuz TMA-07M Yuri Romanenko — Soyuz 26/27, Soyuz 38, Soyuz TM-2/3 Valery Rozhdestvensky (1939–2011) — Soyuz 23 Nikolai Rukavishnikov (1932–2002) — Soyuz 10, Soyuz 16, Soyuz 33 Sergei Ryazanski — Soyuz TMA-10M, Soyuz MS-05 Valery Ryumin (1939–2022) — Soyuz 25, Soyuz 32/34, Soyuz 35/37, STS-91 S Aleksandr Samokutyayev — Soyuz TMA-21, Soyuz TMA-14M Gennadi Sarafanov (1942–2005) — Soyuz 15 Viktor Savinykh — Soyuz T-4, Soyuz T-13/14, Soyuz TM-3, Svetlana Savitskaya — Soyuz T-7/5, Soyuz T-12 Aleksandr Serebrov (1944–2013) — Soyuz T-7/5, Soyuz T-8, Soyuz TM-8, Soyuz TM-17 Yelena Serova — Soyuz TMA-14M Vitali Sevastyanov (1935–2010) — Soyuz 9, Soyuz 18 Yuri Shargin — Soyuz TMA-5/4 Salizhan Sharipov* — STS-89, Soyuz TMA-5 Vladimir Shatalov* (1927–2021) — Soyuz 4, Soyuz 8, Soyuz 10 Anton Shkaplerov — Soyuz TMA-22, Soyuz TMA-15M, Soyuz MS-07, Soyuz MS-19 Georgi Shonin* (1935–1997) — Soyuz 6 Oleg Skripochka — Soyuz TMA-01M, Soyuz TMA-20M, Soyuz MS-15 Aleksandr Skvortsov — Soyuz TMA-18, Soyuz MS-13 Anatoly Solovyev* — Soyuz TM-5/4, Soyuz TM-9, Soyuz TM-15, STS-71/Soyuz TM-21, Soyuz TM-26 Vladimir Solovyov — Soyuz T-10/11, Soyuz T-15 Gennadi Strekalov (1940–2004) — Soyuz T-3, Soyuz T-8, Soyuz T-11/10, Soyuz 
TM-10, Soyuz TM-21/STS-71 Maksim Surayev — Soyuz TMA-16, Soyuz TMA-13M T Yevgeni Tarelkin — Soyuz TMA-06M Valentina Tereshkova, First woman in space. — Vostok 6 Gherman Titov (1935–2000) — Vostok 2 Vladimir Titov — Soyuz T-8, Soyuz TM-4/6, STS-63, STS-86 Valeri Tokarev — STS-96, Soyuz TMA-7 Sergei Treshchov — STS-111/113 Vasili Tsibliyev* — Soyuz TM-17, Soyuz TM-25 Mikhail Tyurin — STS-105/108, Soyuz TMA-9, Soyuz TMA-11M U Yuri Usachov — Soyuz TM-18, Soyuz TM-23, STS-101, STS-102/STS-105 V Vladimir Vasyutin* (1952–2002) — Soyuz T-14 Aleksandr Viktorenko* — Soyuz TM-3/2, Soyuz TM-8, Soyuz TM-14, Soyuz TM-20 Pavel Vinogradov — Soyuz TM-26, Soyuz TMA-8, Soyuz TMA-08M Igor Volk* (1937–2017) — Soyuz T-12 Alexander Volkov* — Soyuz T-14, Soyuz TM-7, Soyuz TM-13 Sergei Aleksandrovich Volkov* — Soyuz TMA-12, Soyuz TMA-02M Vladislav Volkov (1935–1971), Died on reentry. — Soyuz 7, Soyuz 11 Boris Volynov — Soyuz 5, Soyuz 21 Sergei Vozovikov (1958–1993), drowned during a survival training exercise — No flights. Y Boris Yegorov (1937–1994) — Voskhod 1 Aleksei Yeliseyev — Soyuz 5/4, Soyuz 8, Soyuz 10 Fyodor Yurchikhin* — STS-112, Soyuz TMA-10, Soyuz TMA-19, Soyuz TMA-09M, Soyuz MS-04 Z Dmitri Zaikin (1932–2013) — No flights. Sergei Zalyotin — Soyuz TM-30, Soyuz TMA-1/TM-34 Vitali Zholobov* — Soyuz 21 Vyacheslav Zudov — Soyuz 23 Soviet and Russian cosmonauts born outside Russia All of the locations below were part of the former U.S.S.R. at the time of the cosmonauts' birth. Azerbaidzhan S.S.R. / Azerbaijan Musa Manarov, born in Baku, Azerbaijan Byelorussian S.S.R. / Belarus Pyotr Klimuk, born in Komarovka, Belarus Vladimir Kovalyonok, born in Beloye, Belarus Oleg Novitski, born in Chervyen', Belarus Georgian S.S.R. / Georgia Fyodor Yurchikhin, born in Batumi, Georgia Kazakh S.S.R.
/ Kazakhstan Toktar Aubakirov, born in Karaganda, Kazakhstan Yuri Lonchakov, born in Balkhash, Kazakhstan Talgat Musabayev, born in Kargaly, Kazakhstan Viktor Patsayev, born in Aktyubinsk, Kazakhstan Dmitry Petelin, born in Kustanai, Kazakhstan Vladimir Shatalov, born in Petropavlovsk, Kazakhstan Aleksandr Viktorenko, born in Olginka, Kazakhstan Kirghiz S.S.R. / Kyrgyzstan Salizhan Sharipov, born in Uzgen, Kyrgyzstan Sergey Korsakov, born in Frunze, Kyrgyzstan Latvian S.S.R. / Latvia Aleksandr Kaleri, born in Jūrmala, Latvia Anatoly Solovyev, born in Riga, Latvia Oleg Artemyev, born in Riga, Latvia Turkmen S.S.R. / Turkmenistan Oleg Kononenko, born in Chardzhou, Turkmenistan Ukrainian S.S.R. / Ukraine Anatoly Artsebarsky, born in Prosyana, Ukraine Georgi Beregovoi, born in Federivka, Ukraine Georgiy Dobrovolskiy, born in Odessa, Ukraine Yuri Gidzenko, born in Yelanets, Ukraine Leonid Kizim, born in Krasnyi Lyman, Ukraine Oleg Kotov, born in Simferopol, Ukraine Anatoli Levchenko, born in Krasnokutsk, Ukraine Vladimir Lyakhov, born in Antratsyt, Ukraine Yuri Malenchenko, born in Svitlovodsk, Ukraine Yuri Onufriyenko, born in Ryasne, Ukraine Leonid Popov, born in Oleksandriia, Ukraine Pavel Popovich, born in Uzyn, Ukraine Georgi Shonin, born in Rovenky, Ukraine Vasili Tsibliyev, born in Horikhivka, Ukraine Vladimir Vasyutin, born in Kharkiv, Ukraine Igor Volk, born in Zmiiv, Ukraine Aleksandr Volkov, born in Horlivka, Ukraine Sergei Aleksandrovich Volkov, born in Chuhuiv, Ukraine Vitali Zholobov, born in Zburyivka, Ukraine Uzbek S.S.R.
/ Uzbekistan Vladimir Dzhanibekov, born in Iskandar, Uzbekistan See also Other astronaut corps: Canadian Astronaut Corps European Astronaut Corps NASA Astronaut Corps (United States) JAXA Astronaut Corps (Japan) People's Liberation Army Astronaut Corps (China) Intercosmos, a Soviet space program designed to give nations friendly to the Soviet Union access to crewed and uncrewed space missions Roscosmos, the program's eventual post-Soviet continuation under the Russian Federation Pilot-Cosmonaut of the USSR and Pilot-Cosmonaut of the Russian Federation, honorary titles List of Soviet human spaceflight missions List of Russian human spaceflight missions References External links Roscosmos Lists of astronauts Human spaceflight programs Russian space program personnel Crewed space program of Russia
Roscosmos Cosmonaut Corps
[ "Engineering" ]
4,219
[ "Space programs", "Human spaceflight programs" ]
69,039,000
https://en.wikipedia.org/wiki/Glugging
Glugging (also referred to as "the glug-glug process") is the physical phenomenon which occurs when a liquid is poured rapidly from a vessel with a narrow opening, such as a bottle. It is a facet of fluid dynamics. As liquid is poured from a bottle, the air pressure in the bottle is lowered, and air at higher pressure from outside the bottle is forced into the bottle, in the form of a bubble, impeding the flow of liquid. Once the bubble enters, more liquid escapes, and the process is repeated. The reciprocal action of glugging creates a rhythmic sound. The English word "glug" is onomatopoeic, describing this sound. Other languages have their own onomatopoeic equivalents. Academic papers have been written about the physics of glugging, and about the impact of glugging sounds on consumers' perception of products such as wine. Research into glugging has been done using high-speed photography. Factors which affect glugging are the viscosity of the liquid, its carbonation, the size and shape of the container's neck and its opening (collectively referred to as "bottle geometry"), the angle at which the container is held, and the ratio of air to liquid in the bottle (which means that the rate and the sound of the glugging change as the bottle empties). See also References fluid dynamics food science
Glugging
[ "Chemistry", "Engineering" ]
299
[ "Piping", "Chemical engineering", "Fluid dynamics" ]
71,986,552
https://en.wikipedia.org/wiki/Text-to-video%20model
A text-to-video model is a machine learning model that uses a natural language description as input to produce a video relevant to the input text. Advancements during the 2020s in the generation of high-quality, text-conditioned videos have largely been driven by the development of video diffusion models. Models There are different models, including open source models. CogVideo, which accepts Chinese-language input, is the earliest text-to-video model to be developed, with 9.4 billion parameters; a demo version of its open-source code was first presented on GitHub in 2022. That year, Meta Platforms released a partial text-to-video model called "Make-A-Video", and Google Brain (later merged into Google DeepMind) introduced Imagen Video, a text-to-video model built on a 3D U-Net. In March 2023, a research paper titled "VideoFusion: Decomposed Diffusion Models for High-Quality Video Generation" was published, presenting a novel approach to video generation. The VideoFusion model decomposes the diffusion process into two components: base noise and residual noise, which are shared across frames to ensure temporal coherence. By utilizing a pre-trained image diffusion model as a base generator, the model efficiently generated high-quality and coherent videos. Fine-tuning the pre-trained model on video data addressed the domain gap between image and video data, enhancing the model's ability to produce realistic and consistent video sequences. In the same month, Adobe introduced Firefly AI as part of its product features. In January 2024, Google announced development of a text-to-video model named Lumiere, which is anticipated to integrate advanced video editing capabilities. Matthias Niessner and Lourdes Agapito at AI company Synthesia work on developing 3D neural rendering techniques that can synthesise realistic video by using 2D and 3D neural representations of shape, appearance, and motion for controllable video synthesis of avatars. In June 2024, Luma Labs launched its Dream Machine video tool.
That same month, Kuaishou extended its Kling AI text-to-video model to international users. In July 2024, TikTok owner ByteDance released Jimeng AI in China, through its subsidiary, Faceu Technology. By September 2024, the Chinese AI company MiniMax debuted its video-01 model, joining other established AI model companies like Zhipu AI, Baichuan, and Moonshot AI, which contribute to China's involvement in AI technology. Alternative approaches to text-to-video models include Google's Phenaki, Hour One, Colossyan, Runway's Gen-3 Alpha, and OpenAI's Sora. Several additional text-to-video models, such as Plug-and-Play, Text2LIVE, and Tune-A-Video, have emerged. Google is also preparing to launch a video generation tool named Veo for YouTube Shorts in 2025. FLUX.1 developer Black Forest Labs has announced a state-of-the-art text-to-video model. Architecture and training There are several architectures that have been used to create Text-to-Video models. Similar to Text-to-Image models, these models can be trained using Recurrent Neural Networks (RNNs) such as long short-term memory (LSTM) networks, which have been used for Pixel Transformation Models and Stochastic Video Generation Models, which aid in consistency and realism, respectively. An alternative to these is transformer models. Generative adversarial networks (GANs), variational autoencoders (VAEs) — which can aid in the prediction of human motion — and diffusion models have also been used to develop the image-generation aspects of the model. Text-video datasets used to train models include, but are not limited to, WebVid-10M, HDVILA-100M, CCV, ActivityNet, and Panda-70M. These datasets contain millions of original videos of interest, generated videos, captioned videos, and textual information that help train models for accuracy. Text-prompt datasets used to train models include, but are not limited to, PromptSource, DiffusionDB, and VidProM.
These datasets provide the range of text inputs needed to teach models how to interpret a variety of textual prompts. The video generation process involves synchronizing the text inputs with video frames, ensuring alignment and consistency throughout the sequence. This predictive process is subject to decline in quality as the length of the video increases due to resource limitations. Limitations Despite the rapid evolution of Text-to-Video models in their performance, a primary limitation is that they are very computationally heavy, which limits their capacity to provide high-quality and lengthy outputs. Additionally, these models require a large amount of specific training data to be able to generate high-quality and coherent outputs, which brings about the issue of accessibility. Moreover, models may misinterpret textual prompts, resulting in video outputs that deviate from the intended meaning. This can occur due to limitations in capturing semantic context embedded in text, which affects the model's ability to align generated video with the user's intended message. Various models, including Make-A-Video, Imagen Video, Phenaki, CogVideo, GODIVA, and NUWA, are currently being tested and refined to enhance their alignment capabilities and overall performance in text-to-video generation. Ethics The deployment of Text-to-Video models raises ethical considerations related to content generation. These models have the potential to create inappropriate or unauthorized content, including explicit material, graphic violence, misinformation, and likenesses of real individuals without consent. Ensuring that AI-generated content complies with established standards for safe and ethical usage is essential, as content generated by these models may not always be easily identified as harmful or misleading. The ability of AI to recognize and filter out NSFW or copyrighted content remains an ongoing challenge, with implications for both creators and audiences.
Impacts and applications Text-to-Video models offer a broad range of applications that may benefit various fields, from educational and promotional to creative industries. These models can streamline content creation for training videos, movie previews, gaming assets, and visualizations, making it easier to generate high-quality, dynamic content. These capabilities can provide users with economic and personal benefits. Comparison of existing models See also Text-to-image model VideoPoet, an unreleased Google model and precursor of Lumiere Deepfake Human image synthesis ChatGPT References External links Free text-to-video Artificial intelligence engineering Algorithms Language Natural language processing Video Computers
Text-to-video model
[ "Mathematics", "Technology", "Engineering" ]
1,363
[ "Algorithms", "Mathematical logic", "Applied mathematics", "Software engineering", "Natural language processing", "Artificial intelligence engineering", "Natural language and computing" ]
71,987,233
https://en.wikipedia.org/wiki/List%20of%20Art%20Deco%20architecture%20in%20Kentucky
This is a list of buildings that are examples of the Art Deco architectural style in Kentucky, United States. Bowling Green Booth Fire & Safety, Bowling Green, 1948 Capitol Arts Center (former Columbia Theatre), Bowling Green, 1921 Galloway Farm Equipment (now Booth Fire & Safety), Modern Automotive District, Bowling Green, 1948 Galloway Motor Company Building, Modern Automotive District, Bowling Green, 1948 WLBJ Building, Bowling Green, 1940 William Hardcastle Filling Station, Modern Automotive District, Bowling Green, 1948 Lexington F. W. Woolworth Building, Lexington, 1946 Lexington Laundry Co., Lexington, 1929 Lexington National Guard Armory, Lexington, 1941 Louisville 1495 South 11th Street Building, Louisville, 1935 American Bluegrass Marble, Louisville AT&T Building, Louisville, 1930 Bernheim Distillery Boiling Plant, Louisville, 1937 Bowman Field, Louisville, 1921 Bridges, Smith & Co (former Four Roses Bourbon), Louisville, 1870, 1940s Brown-Forman Warehouse, Louisville, 1936 Byck's Department Store, Louisville, 1924 Charles D. Jacob Elementary School, Louisville, 1932 Courier-Journal Building, Louisville Fire Department Headquarters, Louisville, 1937 Fiscal Court Building, Louisville, 1938 Fisher-Klosterman Building (former Bernheim Distillery Bottling Plant), Louisville, 1937 George Rogers Clark Memorial Bridge, Louisville, 1929 Jacob Elementary, Louisville, 1932 James Russell Lowell Elementary School, Louisville, 1931 Jones–Dabney Company Laboratory, Louisville, 1935 Kelley Technical Coatings, Louisville, 1933 LG&E Building, Louisville, 1925–1928 Louisville Cotton Mills Administration Building (now a restaurant), Louisville, 1936–1941 Louisville Fire Department No. 
9, Louisville, 1946 Louisville National Guard Armory, Louisville, 1942 Louisville Public Works, Louisville, 1934 Meyzeek Middle School, Louisville, 1929 Miller Paper Company Buildings, Louisville, 1947 Norton Healthcare Building, Louisville, 1925 Ohio Theatre façade, Louisville, 1941 Seagram's Distillery, Louisville, 1933 Sears, Roebuck and Company Store, Louisville, 1928 South Central Bell Company Office Building, Louisville, 1930 Trinity High School, Louisville, 1941 Valley Traditional High School, Louisville, 1936, 1953 Vogue Theatre, Louisville, 1939 WHAS Radio Transmitter Building, Louisville, 1930s Wheeling Corrugating Company Building, Louisville Scottsville Lyric Theater, Scottsville, 1930s Horse Shoe Cafeteria, Scottsville Downtown Commercial Historic District, Scottsville, 1915 and 1935 Washington Overall Factory, Scottsville Downtown Commercial Historic District, Scottsville, 1928, 1930s Whitesburg Boone Youth Drop-In Center (former Boone Motor Company), Whitesburg Historic District, Whitesburg, 1930 Quillen Building, Whitesburg Historic District, Whitesburg, 1948 Whitesburg High School Gymnasium, Whitesburg Historic District, Whitesburg, 1940s Other cities Arista Theater, Lebanon Historic Commercial District, Lebanon, 1935 Ashland Armory, Ashland, 1948 Bell Theater, Pineville Burke's Bakery, Danville, 1930 Caldwell County Courthouse, Princeton Downtown Commercial District, Princeton, 1939 Capitol Theater, Princeton Downtown Commercial District, Princeton, 1930s Carlisle Armory, Carlisle, 1941 City Hall, Paintsville, 1940 Coca-Cola Bottling Plant, Paducah, 1939 Coca-Cola Plant, Shelbyville, 1937 Devou Park Band Shell, Covington, 1939 Elizabethtown Armory, Elizabethtown, 1948 Estill County Courthouse, Irvine Historic Business District, Irvine, 1939 Eversole Building, Harlan Commercial District, Harlan, 1936 Estill County Youth Center (former National Guard Armory), Ravenna, 1934 Greenville City Hall, Greenville, 1940 Harrodsburg Armory (now Mercer Area 
Family Education), Harrodsburg, 1942 Horse Cave State Bank, Horse Cave, 1937 Jeffersontown Colored School, Jeffersontown, 1930 Kentucky Theatre/Hartford City Hall Complex, Downtown Hartford Historic District, Hartford, late 1930s Lane Theater, Williamsburg, 1948 Ludlow Theater, Ludlow, 1946 Mack Theatre, Irvine Historic Business District, Irvine, 1930s Madisonville Armory, Madisonville, 1947 Marianne Theater, Bellevue, 1942 Ohio County Courthouse, Downtown Hartford Historic District, Hartford, 1943 Old Wayne County High School, Monticello, 1941 Paintsville City Hall, Paintsville, 1940 Paramount Arts Center, Ashland, 1931 Richmond Armory, Richmond, 1942 Russellville Armory, Russellville, 1934 S. E. Cooke Building, Harrodsburg Simon Kenton High School, Independence, 1937 Somerset Armory, Somerset, 1948 Springfield Armory, Springfield, 1942 State Theatre, Elizabethtown, 1942 United States Bullion Depository, Fort Knox, 1936 United States Post Office, Covington, 1941 University of Kentucky Cooperative Extension Building (former National Guard Armory), Carlisle, 1941 Watkins Department Store, Paducah, 1941 Webster County Courthouse, Dixon, 1938 Williamsburg National Guard Armory, Williamsburg, 1941 Worsham Hall, Henderson, 1936 See also List of Art Deco architecture List of Art Deco architecture in the United States References "Art Deco & Streamline Moderne Buildings." Roadside Architecture.com. Retrieved 2019-01-03. Cinema Treasures. Retrieved 2022-09-06 "Court House Lover". Flickr. Retrieved 2022-09-06 "Louisville Art Deco". Retrieved 2019-01-20. "New Deal Map". The Living New Deal. Retrieved 2020-12-25. "SAH Archipedia". Society of Architectural Historians. Retrieved 2021-11-21. External links Art Deco Lists of buildings and structures in Kentucky
List of Art Deco architecture in Kentucky
[ "Engineering" ]
1,087
[ "Architecture lists", "Architecture" ]
71,987,303
https://en.wikipedia.org/wiki/List%20of%20Art%20Deco%20architecture%20in%20Louisiana
This is a list of buildings that are examples of the Art Deco architectural style in Louisiana, United States. Baton Rouge Baton Rouge Savings and Loan Association, Baton Rouge Campbell Apartment Building, Baton Rouge, 1930 Howard Auditorium, Louisiana State University, Baton Rouge, 1940 S. H. Kress and Co. Building, Baton Rouge, 1935, 1960 Lincoln Theater, Baton Rouge, 1950 Louisiana State Capitol, Baton Rouge, 1932 United States Post Office and Courthouse, Baton Rouge, 1932 Lake Charles Artists Civic Theatre and Studio (ACTS) Theatre, Lake Charles, 1940 Bulber Auditorium, McNeese State University, Lake Charles, 1940 Kaufman Hall, McNeese State University, Lake Charles, 1941 New Orleans Algy Theater, New Orleans, 1940s Alvar Street Library, New Orleans, 1940 Ashton B&B (former Ashton Theater), New Orleans, 1927 Blue Plate Building, New Orleans, 1941 Carver Theater, New Orleans, 1950 Charity Hospital, New Orleans, 1939 Eleanor McMain Secondary School, New Orleans, 1932 F. Edward Hebert Federal Building, New Orleans, 1939 Flint-Goodridge Hospital of Dillard University, New Orleans, 1931 Gem Theater, New Orleans, 1947 General Laundry Cleaners & Dryers, New Orleans, 1929 Gus Mayer Department Store, New Orleans, 1948 Joy Theater, New Orleans, 1947 Lakefront Airport, New Orleans, 1930s Mangel's, New Orleans National American Bank Building, New Orleans, 1929 Orleans Parish Criminal Court, New Orleans, 1931 Rosetree Blown Glass Studio and Gallery (former Rosetree Theater), 1930s Sister Stanislaus Memorial Building, New Orleans, 1938 Tivoli Theatre, New Orleans, 1927 Tremé Market, New Orleans William Frantz Elementary School, New Orleans, 1937 New Iberia Evangeline Theater (now Sliman Theatre for the Performing Arts), 129 East Main Street, Downtown New Iberia Commercial Historic District, New Iberia, 1929 and 1940 Iberia Parish Courthouse and Jail, New Iberia, 1940 Wormser's Department Store, New Iberia Shreveport Capri Theatre (former Saenger Theatre), Shreveport, 1911 
and 1940s Louisiana State Exhibit Building, Shreveport, 1937 Masonic Temple, Shreveport, 1937 Salvation Army, Shreveport, 1937 Shreveport Municipal Memorial Auditorium, Shreveport, 1929 Other cities Acadia City Hall, Crowley Historic District, Crowley, 1931 Acadia Parish Courthouse, Crowley Historic District, Crowley, 1952 Big D Corral Theatre, DeRidder Commercial Historic District, DeRidder, 1940 Caldwell Parish Courthouse, Columbia, 1937 City Hall, Winnfield, 1937 Concordia Parish Courthouse, Vidalia, 1939 Denham Springs City Hall, Denham Springs, 1940 Dixie Center for the Arts, Ruston, 1928 and 1933 Dual State Monument, Union County, 1931 East Carroll Parish Courthouse and Jail, Lake Providence, 1938 Fiske Theatre, Oak Grove, West Carroll Parish, 1950 Gymnasium, Napoleonville, 1930s Jackson Parish Courthouse and Jail, Jonesboro, 1938 Lafayette International Center, Lafayette, 1939 Lafourche Parish Jail, Thibodaux, 1940s Mama's Place, Metairie More Mileage Gas Station, Jennings, 1938 Napoleon Middle School Gymnasium, Napoleonville, 1939 Natchitoches Parish Courthouse, Natchitoches, 1940 Old Brusly High School Gymnasium, Brusly, 1937 Palace Theatre, Jonesboro, 1929 Port Allen High School, Port Allen, 1937 Rapides Parish Courthouse and Jail, Alexandria, 1940 Rayville Light & Water Plant, Rayville, 1940 Rice Theatre, Crowley Historic District, Crowley, 1941 Ruston High School, Ruston, 1921 and 1940 St. Bernard Parish Courthouse, Chalmette, 1939 St. Helena Parish Courthouse, Greensburg, 1937 St. Landry Parish Courthouse, Opelousas Historic District, Opelousas, 1940 Strand Theatre, Jennings, 1939 United States Post Office, Leesville, 1936 United States Post Office and Courthouse, Alexandria, 1932 United States Post Office and Courthouse, Arcadia, 1937 United States Post Office and Courthouse, Monroe, 1934 See also List of Art Deco architecture List of Art Deco architecture in the United States References "Art Deco & Streamline Moderne Buildings."
Roadside Architecture.com. Retrieved 2019-01-03. Cinema Treasures. Retrieved 2022-09-06 "Court House Lover". Flickr. Retrieved 2022-09-06 "New Deal Map". The Living New Deal. Retrieved 2020-12-25. "SAH Archipedia". Society of Architectural Historians. Retrieved 2021-11-21. External links Art Deco Lists of buildings and structures in Louisiana
List of Art Deco architecture in Louisiana
[ "Engineering" ]
920
[ "Architecture lists", "Architecture" ]
71,988,529
https://en.wikipedia.org/wiki/Directional%20freezing
Directional freezing is freezing that proceeds from only one direction. By freezing water from only one direction or side of a container, directional freezing can produce clear ice. In a domestic freezer, this can be done by putting water in an insulated container so that the water freezes from the top down, and removing it before it is fully frozen, so that the minerals in the water are never frozen in. F. Hoffmann-La Roche AG / Roche Diagnostics GmbH holds a 2017 patent on directional freezing for drying solid material. See also Aquamelt Clear ice Hydrogel Freeze-casting§Static vs. dynamic freezing profiles Molecular self-assembly Further reading References Phase transitions Cryobiology Molecular physics Intermolecular forces Nanotechnology
Directional freezing
[ "Physics", "Chemistry", "Materials_science", "Engineering", "Biology" ]
139
[ " and optical physics stubs", "Physical phenomena", "Phase transitions", "Molecular physics", "Biochemistry", "Phases of matter", "Critical phenomena", "Cryobiology", "Intermolecular forces", "Materials science", " molecular", "nan", "Atomic", "Nanotechnology", "Statistical mechanics", ...
71,988,533
https://en.wikipedia.org/wiki/Global%20coordination%20level
Global coordination level (GCL) is a computational method that evaluates the system-wide dependency in multivariate data by calculating the distance correlation between random subsets of the variables. Originally applied to gene expression data, GCL assesses the level of coordination between genes, which are fundamentally linked in performing tasks and biological functions. Unlike traditional methods that require precise knowledge of pairwise interactions between genes, GCL can evaluate coordination without such information. A GCL value of zero signifies independent gene expression, while values above zero indicate gene-to-gene regulatory interactions. For instance, when GCL is applied to known genetic pathways in the Kyoto Encyclopedia of Genes and Genomes (KEGG) database, it yields significantly positive values, while random subsets of genes or mock pathways with similar gene expression levels show very low GCL values. Additionally, GCL can be useful in analyzing high-dimensional ecological and biochemical dynamics. Introduction Genes interact with each other in a complex structure known as the gene regulatory network, which plays a crucial role in implementing various biological functions and performing different tasks within cells. However, inferring the precise pairwise interactions of the gene regulatory network remains challenging due to the large number of functional genes and the inherent stochasticity of these systems. Despite these challenges, certain features of the gene regulatory network can still be extracted without fully inferring all the interactions. For instance, the network connectivity, which refers to the density of actual gene-gene interactions compared to all possible interactions, may have important implications for general cellular processes. Method description The calculation of the GCL is based on multivariate dependencies among genes in a given cohort of cells.
This involves a repeated procedure of randomly selecting subsets of genes and calculating the distance correlation between them, as described in the original work. By averaging over many such gene subsets, a single numerical value, the GCL, is obtained to assess the overall dependencies between the genes. However, several important pre- and post-processing steps need to be taken into account to ensure the accuracy and reliability of the GCL. Firstly, clustering methods should be applied to divide the analyzed cohort of cells into subsets, and the GCL should be calculated separately for each subset or the largest one to ensure homogeneity. Secondly, cells that deviate significantly from the rest of the cells (referred to as 'outliers') or cells that are too similar to each other (referred to as 'inliers') should be filtered out to avoid their undue influence on the GCL calculation. Additionally, jackknife analysis, which involves systematically omitting subsets of cells from the analysis and recalculating the GCL, should be performed to test the stability of the results. These steps are necessary because the GCL, like other correlation measures, can be sensitive to unusual cells and heterogeneous cohorts, especially in the context of sparse, noisy, and outlier-prone scRNA-seq data.

Applications
Aging: Stochastic aberration of transcriptional regulation is a dominant factor in the process of aging. When assessing GCL in multiple single-cell RNA-sequencing datasets, a decline of GCL with age has been consistently observed across various organisms and cell types. Notably, significant decreases in GCL were found in mouse hematopoietic stem cells based on single-cell RNA-seq data, supporting the hypothesis of aging as dys-differentiation.
This idea, originally posited by Richard Cutler in the 1970s, suggests that cells deviate from their proper state of differentiation as they age, as evidenced by the activation of genes that should normally be silent in aged tissues.

Measuring biological variability: The GCL decreases in cohorts of cells with increased 'biological variability' only when it arises from gene interactions. The GCL can be used to assess and compare the ratio between introduced biological and technical variability in cohorts with similar cell-to-cell variability.

References

Computational biology
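The subset-and-average procedure described above can be sketched in a few lines of Python. The following is a minimal illustration, not the authors' implementation: the helper names `distance_correlation` and `gcl` are hypothetical, the genes are simply split into two equal random halves on each iteration, and none of the published pre-processing (clustering, outlier/inlier filtering, jackknife stability checks) is included.

```python
import numpy as np

def distance_correlation(x, y):
    """Empirical distance correlation between two samples.
    Rows are observations (cells), columns are variables (genes)."""
    def centered_dist(a):
        # Pairwise Euclidean distance matrix, then double centering:
        # subtract row means and column means, add back the grand mean.
        d = np.sqrt(((a[:, None, :] - a[None, :, :]) ** 2).sum(-1))
        return d - d.mean(0) - d.mean(1)[:, None] + d.mean()
    A, B = centered_dist(x), centered_dist(y)
    dcov2 = (A * B).mean()                               # squared distance covariance
    dvar = np.sqrt((A * A).mean() * (B * B).mean())      # product of distance variances
    return np.sqrt(max(dcov2, 0.0) / dvar) if dvar > 0 else 0.0

def gcl(expr, n_splits=100, rng=None):
    """GCL sketch: average distance correlation between random
    halves of the genes. expr is a (cells x genes) matrix."""
    rng = np.random.default_rng(rng)
    n_genes = expr.shape[1]
    half = n_genes // 2
    vals = []
    for _ in range(n_splits):
        perm = rng.permutation(n_genes)
        vals.append(distance_correlation(expr[:, perm[:half]],
                                         expr[:, perm[half:2 * half]]))
    return float(np.mean(vals))
```

On data where each gene is driven by a shared factor, this estimate approaches one, while fully independent genes give a much lower value; note that the empirical distance correlation is biased upward for small cohorts, so comparisons between cohorts should use equal cell counts.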
Global coordination level
[ "Biology" ]
836
[ "Computational biology" ]
71,988,586
https://en.wikipedia.org/wiki/List%20of%20Art%20Deco%20architecture%20in%20New%20Jersey
There are numerous buildings that are examples of Art Deco architecture, including Streamline Moderne, in New Jersey, United States.

Asbury Park
Deal Lake Court Apartments, 1930s
Jersey Central Power & Light Building, 1922
Hot Mess Studio, Asbury Park
Old Heating Plant, Asbury Park, 1930
The Santander, 1929
Paramount Theatre, 1930

Camden
Camden City Hall, 1931
KIPP Cooper Norcross High School
Mastery Schools of Camden, McGraw Elementary
McCrory's/Sam's Discount
Pierre Building, 306 Cooper, 1932
Rio Theatre (now a church), 1937
United States Post Office and Courthouse, 1932

East Orange
30 South Munn, East Orange
East Orange VA Hospital, East Orange, 1950
West Colonial Apartments, East Orange

Elizabeth
Altenburg Piano House, Elizabeth, 1929
Hersch Tower, Elizabeth, 1931
Ritz Theatre, Elizabeth, 1925

Jersey City
McGinley Square-Bergen Square-Journal Square corridor along Bergen Avenue: 789 (Bergen Theater; later Pix Theater), 830 (Provident Bank Headquarters, 1920), 872 (Independent Order of Odd Fellows, 1920), 875, 880 (First National Bank Building, 1920), 885, 910, 911, 918, 920 (Hurwitz Building, 1930) and 924
295 Newark Avenue, 1929
500 Communipaw Avenue (former Junction Fishery), 1920s
A. Harry Moore School (NJCU), 1931
Jersey City Armory
The Beacon, former Jersey City Medical Center, 1934–1938 & Jones Hall, 591 Montgomery Street, 1939
596 Communipaw Avenue
61 Duncan Avenue
Miss America Diner, 1942
65 Tonnele Avenue (Ramada Jersey City)
Roosevelt Stadium, 1937 (demolished)
White Mana Diner, 1931
Ellis Island Ferry Building, c.1937

Newark
118-122 Market Street
138-138 1/2 Halsey Street, 1925
34 William Street, 1925
837-839 Broad Street, 1930
87 Halsey Street
9-13 Hill Street, 1929
Chancellor Avenue Elementary School, 1938
Eleven 80 (formerly Lefcourt–Newark Building), 1930
Griffith Building, 1927
Hotel Douglas, 5-21 Hill Street, 1923
Ivy Hill Elementary School, 1931
Lyons Towers Condominiums, 1939
National Newark Building, 744 Broad Street, 1931
Newark Arts High School (formerly Newark School of Fine and Industrial Art), 1931
Newark Metropolitan Airport Buildings, 1928
Newark Museum of Art
Newark Symphony Hall (originally the Mosque Theater, 1925)
Newark Urby, 155 Washington Street (original parking tower converted to residences)
Paramount Theater
Pennsylvania Station, 1935
Science High School (demolished 2017), some original Art Deco terra cotta tiles incorporated into 50 Rector Park
United States Savings Bank, 187 Market Street, 1929
Walker House (formerly New Jersey Bell Headquarters Building), 540 Broad Street, 1929
Weequahic High School, 1932

Trenton
Maxine's (now Rio Sports Bar & Grill), Trenton, 1948
Catholic Youth Organization (former RKO Broad Theatre), Trenton, 1920s
Clarkson S. Fisher Federal Building and United States Courthouse, Trenton, 1932
Daylight/Twilight Alternative High School, Trenton
Hedgepeth–Williams Elementary School, Trenton
Parker Elementary School, Trenton
Paul Robeson Elementary School, Trenton
Trenton War Memorial, Trenton, 1930
Ulysses S. Grant Elementary School, Trenton, 1938
Washington Elementary School, Trenton

Other places

North Jersey
225 Larch Avenue, Teaneck, 1938
Bergen Performing Arts Center, Englewood, 1926
Garden State Crematory, North Bergen, 1907
Sears, Roebuck, and Company Building, Hackensack, 1930s
Sears Roebuck (now Kennedy Center), Union City, 1932
WMCA Transmitter Building, Kearny, 1940
531 Mitchell Street, Orange (now parking garage for Harvard Printing Apts.)
People's Bank and Trust Company Building, Passaic, 1931
Temple Emanuel, Paterson, 1929
Belvidere Theater, Belvidere, 1930s
Clifton Post Office, 1936
Bloomfield Trust Company, circa 1929
US Post Office, West New York, 1937

South Jersey
Boardwalk Hall, Atlantic City, 1929
Gateway 26 (former Hunt's Casino), Wildwood, 1940
Hunt's Casino, Wildwood, 1940
Ventnor Theater, Ventnor, 1922
Landis Theatre-Mori Brothers Building, Vineland (1937)

Central Jersey
Perth Amboy Bank Building, Perth Amboy
224 Smith Street, Perth Amboy
Jersey Central Power & Light Company, Keyport, 1930s
Princeton Fire Department, Princeton Hook & Ladder Company, Princeton, 1933
Thomas Alva Edison Memorial Tower and Museum, Edison, 1938
Brook Arts Center, Bound Brook, 1927
Bow-Tie Warner Theater, Ridgewood, 1932
Lakewood Post Office, Lakewood, 1938

Municipal buildings

Sears, Roebuck and Company
Sears, Roebuck, and Company, 168 Elizabeth Avenue at Bigelow Street, Newark, opened 31 Oct. 1929
Sears, Roebuck, and Company Building, Hackensack, 1930
Sears Roebuck (now Kennedy Center), Union City, 1932

Bridges and tunnels
Route 46 bridges over the Passaic River
Lincoln Tunnel toll plaza and ventilation towers
Holland Tunnel ventilation towers
Morris Goodkind Bridge (formerly College Bridge), 1929, over Raritan River

See also
List of Art Deco architecture
List of Art Deco architecture in the United States

References

External links
"Art Deco & Streamline Moderne Buildings." Roadside Architecture.com. Retrieved 2019-01-03.
Cinema Treasures. Retrieved 2022-09-06.
"New Deal Map". The Living New Deal. Retrieved 2020-12-25.
"SAH Archipedia". Society of Architectural Historians. Retrieved 2021-11-21.

Art Deco
Lists of buildings and structures in New Jersey
List of Art Deco architecture in New Jersey
[ "Engineering" ]
1,133
[ "Architecture lists", "Architecture" ]