https://en.wikipedia.org/wiki/List%20of%20the%20most%20common%20passwords
This is a list of the most common passwords, discovered in various data breaches. Common passwords are generally not recommended on account of their low password strength. List NordPass NordPass conducted research on the most breached passwords in 2021. The company gathered the top 200 worst passwords of that year from a database of 275,699,516 passwords. SplashData The Worst Passwords List is an annual list of the 25 most common passwords from each year as produced by internet security firm SplashData. Since 2011, the firm has published the list based on data examined from millions of passwords leaked in data breaches, mostly in North America and Western Europe, over each year. In the 2016 edition, the 25 most common passwords made up more than 10% of the surveyed passwords, with the most common password of 2016, "123456", making up 4%. Keeper Password manager Keeper compiled its own list of the 25 most common passwords in 2016, from 25 million passwords leaked in data breaches that year. National Cyber Security Centre The National Cyber Security Centre (NCSC) compiled its own list of the 20 most common passwords in 2019, from 100 million passwords leaked in data breaches that year. See also Password cracking 10,000 most common passwords Notes References External links Skullsecurity list of breached password collections Security Password authentication
https://en.wikipedia.org/wiki/Bone%20marrow%20adipose%20tissue
Bone marrow adipose tissue (BMAT), sometimes referred to as marrow adipose tissue (MAT), is a type of fat deposit in bone marrow. It increases in states of low bone density: osteoporosis, anorexia nervosa/caloric restriction, skeletal unweighting such as that which occurs in space travel, and anti-diabetes therapies. BMAT decreases in anaemia, leukaemia, and hypertensive heart failure; in response to hormones such as oestrogen, leptin, and growth hormone; with exercise-induced weight loss or bariatric surgery; in response to chronic cold exposure; and in response to pharmacological agents such as bisphosphonates, teriparatide, and metformin. Anatomy Bone marrow adipocytes (BMAds) originate from mesenchymal stem cell (MSC) progenitors that also give rise to osteoblasts, among other cell types. Thus, it is thought that BMAT results from preferential MSC differentiation into the adipocyte, rather than osteoblast, lineage in the setting of osteoporosis. Since BMAT is increased in the setting of obesity and is suppressed by endurance exercise or vibration, it is likely that BMAT physiology, in the setting of mechanical input/exercise, approximates that of white adipose tissue (WAT). Physiology Exercise regulation The first study to demonstrate exercise regulation of BMAT in rodents was published in 2014; exercise regulation of BMAT has since been confirmed in humans, adding clinical importance. Several studies have demonstrated exercise-induced reduction of BMAT, which occurs along with an increase in bone quantity. Since exercise increases bone quantity, reduces BMAT and increases expression of markers of fatty acid oxidation in bone, BMAT is thought to provide needed fuel for exercise-induced bone formation or anabolism. A notable exception occurs in the setting of caloric restriction: exercise suppression of BMAT does not yield an increase in bone formation and even appears to cause bone loss. Indeed, energy availability appears to be a factor in the ability of e
https://en.wikipedia.org/wiki/List%20of%20image%20resolutions%20used%20in%20digital%20cameras
The following is a list of image resolutions implemented in the image sensors used in various digital cameras.

| Width (px) | Height (px) | Aspect ratio | Actual pixel count | Megapixels | Camera examples |
|---|---|---|---|---|---|
| 32 | 32 | 1:1 | 1,024 | 0.001 | Cromemco Cyclops (1975) |
| 100 | 100 | 1:1 | 10,000 | 0.01 | Kodak Prototype by Steven Sasson (1975) |
| 320 | 240 | | 76,800 | 0.07 | Casio QV-10 (1995) |
| 640 | 480 | | 307,200 | 0.3 | Apple QuickTake 100 (1994) |
| 832 | 608 | | 505,856 | 0.5 | Canon Powershot 600 (1996) |
| 1,024 | 768 | | 786,432 | 0.8 | Olympus D-300L (1996) |
| 1,024 | 1,024 | 1:1 | 1,048,576 | 1.0 | Nikon NASA F4 (1991) |
| 1,280 | 960 | | 1,228,800 | 1.3 | Fujifilm DS-300 (1997) |
| 1,280 | 1,024 | 5:4 | 1,310,720 | 1.3 | Fujifilm MX-700, Fujifilm MX-1700 (1999), Leica Digilux (1998), Leica Digilux Zoom (2000) |
| 1,600 | 1,200 | | 1,920,000 | 2 | Nikon Coolpix 950, Samsung GT-S3500 |
| 1,600 | 1,280 | 5:4 | 2,048,000 | 2 | Pentax EI2000/Hewlett Packard 912 |
| 2,012 | 1,324 | | 2,663,888 | 2.74 | Nikon D1 |
| 2,048 | 1,536 | | 3,145,728 | 3 | Canon PowerShot A75, Nikon Coolpix 995 |
| 2,160 | 1,440 | | 3,110,400 | 3.11 | Canon EOS D30 |
| 2,272 | 1,704 | | 3,871,488 | 4 | Olympus Stylus 410, Contax i4R (although CCD is actually square 2,272×2,272) |
| 2,464 | 1,648 | | 4,060,672 | 4.1 | Canon 1D |
| 2,560 | 1,920 | | 4,915,200 | 5 | Olympus E-1, Sony Cyber-shot DSC-F707, Sony Cyber-shot DSC-F717 |
| 2,816 | 2,112 | | 5,947,392 | 5.9 | Olympus Stylus 600 Digital |
| 3,008 | 2,000 | | 6,016,000 | 6 | D100, Nikon D40, D50, D70, D70s, Pentax K100D, Konica Minolta Maxxum 7D, Konica Minolta Maxxum 5D, Epson R-D1 |
| 3,072 | 2,048 | | 6,291,456 | 6.3 | Canon EOS 10D, Canon EOS 300D |
| 3,072 | 2,304 | | 7,077,888 | 7 | Olympus FE-210, Canon PowerShot A620 |
| 3,456 | 2,304 | | 7,962,624 | 8 | Canon EOS 350D |
| 3,264 | 2,448 | | 7,990,272 | 8 | Olympus E-500, Olympus SP-350, Canon PowerShot A
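The megapixel figures listed for each camera are simply width × height divided by one million. A minimal Python sketch (not from the article; the function name is an illustrative choice) that reproduces a few of the listed values:

```python
def megapixels(width: int, height: int) -> float:
    """Return the sensor resolution in megapixels (millions of pixels)."""
    return width * height / 1_000_000

# A few cameras from the list above:
print(megapixels(640, 480))     # Apple QuickTake 100  -> 0.3072
print(megapixels(1600, 1200))   # Nikon Coolpix 950    -> 1.92
print(megapixels(3456, 2304))   # Canon EOS 350D       -> 7.962624
```

The published figures round these raw values (e.g. 7.962624 is marketed as 8 megapixels).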
https://en.wikipedia.org/wiki/Graphene%20plasmonics
Graphene is a 2D nanosheet with an atomic-scale thickness of 0.34 nm. Owing to this ultrathin thickness, graphene exhibits many properties that are quite different from those of its bulk graphite counterpart. The most prominent advantages are its high electron mobility and high mechanical strength. Thus, it exhibits potential for applications in optics and electronics, especially for the development of wearable devices, where it can serve as a flexible substrate. More importantly, the optical absorption rate of graphene is 2.3% in the visible and near-infrared region. This broadband absorption characteristic has also attracted great attention from the research community for the development of graphene-based photodetectors/modulators. Plasmons are collective electron oscillations usually excited at metal surfaces by a light source. Doped graphene layers have shown surface plasmon effects similar to those of metallic thin films. Through the engineering of metallic substrates or nanoparticles (e.g., gold, silver and copper) with graphene, the plasmonic properties of the hybrid structures can be tuned to improve optoelectronic device performance. It is worth noting that electrons at the metallic structure can transfer to the graphene conduction band, which is attributed to the zero-bandgap property of the graphene nanosheet. Graphene plasmons can also be decoupled from their environment and give rise to genuine Dirac plasmons in the low-energy range where the wavelengths exceed the damping length. These graphene plasmon resonances have been observed in the GHz–THz electronic domain. Graphene plasmonics is an emergent research field that is attracting plenty of interest and has already resulted in a textbook. Application When plasmons are resonant at the graphene/metal surface, a strong electric field is induced, which can enhance the generation of electron-hole pairs in the graphene layer. The number of excited electron carriers increases linearly with the field inten
https://en.wikipedia.org/wiki/Pure%20homopolar%20generator
A pure homopolar generator (PHG) is an electric generator that requires neither brushes nor electronic or semiconductor parts to convert torque into direct current. In other words, this homopolar generator requires only a theoretically homogeneous, or actually (constantly or cyclically) homogenized, magnetic field to be able to produce direct current. The theory of the PHG is explained using Maxwell's equations. This term unifies all devices that work in line with this definition. PHGs can be represented by patents, whose devices should also work the other way round: US5977684 (A), PV2011-293 (CZ) application number 2011-293, AU5801890 (A). Another group of patents uses a superconductor shield or a shield made of a so far unknown material (a low-voltage semiconductor with high current permeability, or a material that excludes magnetic fields at normal temperatures): US5144179 (A), DE102012022152 (A1), CN1671030 (A) See also Pure homopolar motor General Rotating magnetic field brushless motors with electronics References External links References to related patents and activities    Another group    Activities Electrical generators
https://en.wikipedia.org/wiki/Anthropology%20of%20food
Anthropology of food is a sub-discipline of anthropology that connects an ethnographic and historical perspective with contemporary social issues in food production and consumption systems. Although early anthropological accounts often dealt with cooking and eating as part of ritual or daily life, food was rarely regarded as the central point of academic focus. This changed in the latter half of the 20th century, when foundational work by Mary Douglas, Marvin Harris, Arjun Appadurai, Jack Goody, and Sidney Mintz established the study of food as a key window into modern social life. Mintz is known as the "Father of food anthropology" for his 1985 work Sweetness and Power, which linked British demand for sugar with the creation of empire and exploitative industrial labor conditions. Research has traced the material and symbolic importance of food, as well as how the two intersect. Examples of ongoing themes are food as a form of differentiation, commensality, and food's role in industrialization and globalizing labor and commodity chains. Several related and interdisciplinary academic programs exist in the US and UK (listed under Food studies institutions). "Anthropology of food" is also the name of a scientific journal dedicated to the social analysis of food practices and representations. Created in 1999 (first issue published in 2001), it is multilingual (English, French, Spanish, Portuguese). It is open access, available through the portal OpenEdition Journals, and complies with academic standards for scientific journals (double-blind peer review). It publishes a majority of papers in social anthropology, but is also open to contributions from historians, geographers, philosophers, and economists. Recent issues include:
16 | 2022 Feeding genders
15 | 2021 Aesthetics, gestures and tastes in South and East Asia: crossed approaches on culinary arts
14 | 2019 Gastro-politics: Culture, Identity and Culinary Politics in Peru
13 | 2018 Tourism and Gastronomy
https://en.wikipedia.org/wiki/Anbox
Anbox is a free and open-source compatibility layer that aims to allow mobile applications and mobile games developed for Android to run on Linux distributions. Canonical introduced Anbox Cloud for running Android applications in a cloud environment. Anbox executes the Android runtime environment by using LXC (Linux Containers), recreating the directory structure of Android as a mountable loop image, while using the native Linux kernel to execute applications. It makes use of Linux namespaces through LXC for isolation. Applications do not have any direct hardware access; all access is mediated by the Anbox daemon. Anbox was deprecated on February 3, 2023, as it was no longer being actively maintained. See also Android-x86 - An open-source project that makes an unofficial port of Google's Android mobile operating system to run on devices powered by AMD and Intel x86 processors, rather than RISC-based ARM chips. BlueStacks has developed an App Player for Windows and macOS capable of running Android applications in a container. The SPURV compatibility layer is a similar project developed by Collabora. Waydroid is also Android in a container on a regular Linux system, using Wayland. Wine - A Windows compatibility layer for Unix-like systems. References External links Anbox Port to Sailfish OS (not maintained anymore) Port to Purism / Librem 5 Port to postmarketOS Android emulation software Compatibility layers Free system software Linux emulation software
https://en.wikipedia.org/wiki/Internet%20%26%20Jurisdiction%20Policy%20Network
The Internet & Jurisdiction Policy Network, also known as the "I&J Policy Network", "Internet & Jurisdiction", or simply "I&J", is a multistakeholder organization fostering legal interoperability in cyberspace. Its Secretariat facilitates a global policy process between key stakeholders to enable transnational cooperation and policy coherence. Participants in the Policy Network work together to preserve the cross-border nature of the Internet, protect human rights, fight abuses, and enable the global digital economy. Since 2012, the Internet & Jurisdiction Policy Network has engaged more than 300 key entities from different stakeholder groups around the world, including governments, the world's largest Internet companies, the technical community, civil society groups, leading universities and international organizations. The Secretariat of the Internet & Jurisdiction Policy Network is based in Paris, France. It was founded in 2012 by Executive Director Bertrand de La Chapelle and Deputy Executive Director Paul Fehlinger. Enabling multistakeholder cooperation The Internet & Jurisdiction Policy Network bridges relevant stakeholder groups and policy silos in order to enable transnational cooperation. It strives to fill the institutional gap in Internet governance at the intersection of the digital economy, human rights, and cybersecurity. Through global, regional, and thematic meetings, its Secretariat facilitates a neutral dialogue process with the mission of building trust among the different actors and helping them develop the operational solutions necessary for the coexistence of diverse laws on the cross-border Internet. The Internet & Jurisdiction Secretariat reports every year on progress at the Internet Governance Forum (IGF) organized by the United Nations. At the IGF 2016, the Internet & Jurisdiction Policy Network was granted for the first time an "Open Forum", a format traditionally reserved for treaty-based organizations. In June 2016 at the OECD Ministerial Mee
https://en.wikipedia.org/wiki/Total%20algebra
In abstract algebra, the total algebra of a monoid is a generalization of the monoid ring that allows for infinite sums of elements of a ring. Suppose that S is a monoid with the property that, for all s ∈ S, there exist only finitely many ordered pairs (t, u) of elements of S for which tu = s. Let R be a ring. Then the total algebra of S over R is the set of all functions f : S → R, with the addition law given by the (pointwise) operation (f + g)(s) = f(s) + g(s), and with the multiplication law given by (fg)(s) = Σ_{tu = s} f(t)g(u). The sum on the right-hand side is over finitely many pairs by hypothesis, and so is well-defined in R. These operations turn the set of such functions into a ring. There is an embedding of R into this ring, given by the constant functions, which turns it into an R-algebra. An example is the ring of formal power series, where the monoid S is the natural numbers. The product is then the Cauchy product. References : §III.2 Abstract algebra
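A minimal Python sketch (not from the article) of the multiplication law in the special case S = ℕ, where it reduces to the Cauchy product of formal power series. Representing a series by a truncated list of coefficients is an illustrative assumption, as is the function name:

```python
def cauchy_product(f, g):
    """Multiply two formal power series given as truncated coefficient
    lists, using the total-algebra multiplication law for S = N:
    (fg)(n) = sum of f(i) * g(j) over all pairs (i, j) with i + j = n.
    Only the first min(len(f), len(g)) coefficients are reliable after
    truncation, so only those are returned."""
    n = min(len(f), len(g))
    return [sum(f[i] * g[k - i] for i in range(k + 1)) for k in range(n)]

# (1 + x + x^2 + ...) * (1 - x) = 1, up to truncation:
ones = [1, 1, 1, 1, 1]          # 1 + x + x^2 + x^3 + x^4
one_minus_x = [1, -1, 0, 0, 0]  # 1 - x
print(cauchy_product(ones, one_minus_x))  # [1, 0, 0, 0, 0]
```

Note that each output coefficient involves only finitely many products, mirroring the finiteness condition on S in the definition above.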
https://en.wikipedia.org/wiki/Quantum%20materials
Quantum materials is an umbrella term in condensed matter physics that encompasses all materials whose essential properties cannot be described in terms of semiclassical particles and low-level quantum mechanics. These are materials that present strong electronic correlations or some type of electronic order, such as superconducting or magnetic orders, or materials whose electronic properties are linked to non-generic quantum effects – topological insulators, Dirac electron systems such as graphene, as well as systems whose collective properties are governed by genuinely quantum behavior, such as ultra-cold atoms, cold excitons, polaritons, and so forth. On the microscopic level, four fundamental degrees of freedom – that of charge, spin, orbit and lattice – become intertwined, resulting in complex electronic states; the concept of emergence is a common thread in the study of quantum materials. Quantum materials exhibit puzzling properties with no counterpart in the macroscopic world: quantum entanglement, quantum fluctuations, robust boundary states dependent on the topology of the materials' bulk wave functions, etc. Quantum anomalies such as the chiral magnetic effect link some quantum materials with processes in high-energy physics of quark-gluon plasmas. History In 2012, Joseph Orenstein published an article in Physics Today about "ultrafast spectroscopy of quantum materials". Orenstein stated, As a paradigmatic example, Orenstein refers to the breakdown of Landau Fermi liquid theory due to strong correlations. The use of the term "quantum materials" has been extended and applied to other systems, such as topological insulators, and Dirac electron materials. The term has gained momentum since the article "The rise of quantum materials" was published in Nature Physics in 2016. Quoting: References Condensed matter physics Materials science Quantum mechanics
https://en.wikipedia.org/wiki/Encoder%20%28digital%29
An encoder (or "simple encoder") in digital electronics is a one-hot to binary converter. That is, if there are 2^n input lines, and at most only one of them will ever be high, the binary code of this 'hot' line is produced on the n-bit output lines. A binary encoder is the dual of a binary decoder. If the input circuit can guarantee at most a single active input, a simple encoder is a better choice than a priority encoder, since it requires less logic to implement. However, a simple encoder can generate an incorrect output when more than a single input is active, so a priority encoder is required in such cases. Types of encoder 2^n-to-n encoders A 2^n-to-n encoder has n outputs corresponding to its 2^n inputs. It thus reduces the number of transmission lines and can be compared to a multiplexer. Only one of the inputs becomes "high" (logic state "1") at a time. For example, a 4-to-2 simple encoder takes 4 input bits and produces 2 output bits. The illustrated gate-level example implements the simple encoder defined by the truth table, but it must be understood that for all the non-explicitly defined input combinations (i.e., inputs containing 0, 2, 3, or 4 high bits) the outputs are treated as don't-cares. 4-to-2 encoder 8-to-3 encoder See also Binary decoder Multiplexer (MUX) Priority encoder References Digital electronics
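The 4-to-2 simple encoder described above can be modeled in a few lines of Python (an illustrative sketch, not from the article; where hardware would treat multi-hot or all-zero inputs as don't-cares, this model rejects them explicitly):

```python
def encode_4_to_2(inputs):
    """Simple 4-to-2 one-hot encoder: given 4 input lines (i0..i3) with
    exactly one line high, return the 2-bit binary index of that line.
    Any other pattern is undefined for a simple encoder; here it raises."""
    if sum(inputs) != 1:
        raise ValueError("simple encoder assumes exactly one active input")
    hot = inputs.index(1)            # position of the single 'hot' line
    return (hot >> 1) & 1, hot & 1   # (msb, lsb)

print(encode_4_to_2([0, 0, 1, 0]))  # line 2 active -> (1, 0)
```

A priority encoder would differ only in how it handles multiple active inputs: instead of failing, it would encode the highest-priority active line.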
https://en.wikipedia.org/wiki/African%20Biosafety%20Network%20of%20Expertise
The African Biosafety Network of Expertise is a continental network hosted in Ouagadougou, Burkina Faso. Origin The African Biosafety Network of Expertise was launched on 23 February 2010 with the signing of a host agreement between the New Partnership for Africa's Development (NEPAD) and the Government of Burkina Faso. It was conceptualized in Africa's Science and Technology Consolidated Plan of Action (2005) and fulfils the recommendation of the High-Level African Panel on Modern Biotechnology report entitled Freedom to Innovate. The network is funded by the Bill and Melinda Gates Foundation. Mission and activities The network serves as a resource for regulators dealing with safety issues related to the introduction and development of genetically modified organisms. In addition to providing regulators with access to policy briefs and other relevant information online in English and French, the network organizes national and subregional workshops on specific topics. For instance, one-week biosafety courses for African regulators were run by the network in Burkina Faso in November 2013 and in Uganda in July 2014, in partnership with the University of Michigan (USA). Twenty-two regulators from Ethiopia, Kenya, Malawi, Mozambique, Tanzania, Uganda, and Zimbabwe took part in the latter course. In April 2014, the network ran a training workshop in Nigeria at the request of the Federal Ministry of Environment for 44 participants drawn from government ministries, regulatory agencies, universities, and research institutions. The aim was to strengthen the regulatory capacity of institutional biosafety committees. This training was considered important to ensure continued regulatory compliance for ongoing confined field trials and multilocation trials for Maruca-resistant cowpea and biofortified sorghum. The workshop was run in partnership with the International Food Policy Research Institute's Program for Biosafety Systems. From 28 April to 2 May 2014, Togo's Minist
https://en.wikipedia.org/wiki/Quadram%20Institute
The Quadram Institute is a centre for food and health research, combining Quadram Institute Bioscience (formerly the Institute of Food Research), the Norfolk and Norwich University Hospitals' endoscopy centre and aspects of the University of East Anglia's Norwich Medical School and Faculty of Science. It is located on the outskirts of Norwich, England, United Kingdom, and is a member of the Norwich Research Park. The institute is housed in a purpose-built facility on the Norwich Research Park that opened in 2018. Its founding partners are Quadram Institute Bioscience, the University of East Anglia, the Norfolk and Norwich University Hospital and the Biotechnology and Biological Sciences Research Council. The institute combines research teams from the partners with a regional gastrointestinal endoscopy unit and a clinical trials facility. The first patients were treated in the endoscopy unit in December 2018. History The Institute of Food Research was created in 1968, spread over four sites: the Meat Research Institute at Langford near Bristol, the Food Research Institute (FRI) at Colney in Norwich, the National Institute for Research in Dairying (NIRD) in Shinfield near Reading, and the Long Ashton Research Station. At the end of 1990, the Meat Research Institute's Bristol laboratory was closed, and in 1992 the National Institute for Research in Dairying's Reading laboratory was moved to the campus of the University of Reading. In 1999, the institute's activities were consolidated in one location: Norwich. On 28 April 2017, the Institute of Food Research transitioned into Quadram Institute Bioscience, ahead of the full opening of the Quadram Institute in 2018. Research focus The Quadram Institute has a research programme that covers:
Food innovation and health
Gut microbes and health
Microbes in the food chain
Food, microbes and public health
Directors The founding director is Ian Charles, who was appointed in 2015. Building Construction of the Quadram Institute
https://en.wikipedia.org/wiki/Global%20Offset%20Table
The Global Offset Table, or GOT, is a section of a computer program's (executable's or shared library's) memory used to enable program code compiled as an ELF file to run correctly, independent of the memory address where the program's code or data is loaded at runtime. It maps symbols in program code to their corresponding absolute memory addresses, facilitating Position Independent Code (PIC) and Position Independent Executables (PIE), which are loaded to a different memory address each time the program is started. When PIC or PIE code is run, the runtime (absolute) memory addresses of variables and functions are unknown before the program starts, so they cannot be hardcoded during compilation by a compiler. The Global Offset Table is represented by the .got and .got.plt sections in an ELF file, which are loaded into the program's memory at startup. The operating system's dynamic linker updates the Global Offset Table relocations (symbol to absolute memory address) at program startup or as symbols are accessed. This is the mechanism that allows shared libraries (.so) to be relocated to a different memory address at startup, avoiding memory address conflicts with the main program or other shared libraries, and hardening computer program code against exploitation. References Computer programming
https://en.wikipedia.org/wiki/British%20Society%20for%20Developmental%20Biology
The British Society for Developmental Biology (BSDB) is a scientific society promoting developmental biology research; it is open to anyone with an interest in the subject who agrees with the principles of the Society. History The British Society for Developmental Biology was founded in 1948 as the London Embryologists’ Club. In 1964, the club was expanded into a scientific society, named the Society for Developmental Biology. In 1964, the Society for the Study of Growth and Development in the United States had also voted to take on the same name, and they took over sponsorship of the journal Developmental Biology in 1966. Consequently, the smaller British society changed to its current name in 1969. Awards The society administers four annual awards and a studentship. The Waddington Medal was first awarded in 1998. It is named after C. H. Waddington, a leading British embryologist and geneticist, and is awarded to "an outstanding individual who has made major contributions to any aspect of Developmental Biology in the UK". Award winners include:
1998 Cheryll Tickle
1999 Rosa Beddington
2000 Peter Lawrence
2001 Mike Bate
2002 Jonathan Slack
2003 Julian Lewis
2004 Jeff Williams
2005 Michael Akam
2006 Claudio Stern
2007 David Ish-Horowicz
2008 Pat Simpson
2009 Liz Robertson
2010 Robin Lovell-Badge
2011 Christopher Wylie
2012 Alfonso Martinez Arias
2013 Jim Smith
2014 Philip Ingham
2015 Lewis Wolpert
2016 Enrico Coen
2017 William Harris
2018 Richard Lavenham Gardner
In 2016, the society added the Cheryll Tickle Medal, which is awarded to a mid-career female scientist. It is named after the embryologist Cheryll Tickle, the first winner of the Waddington Medal. Winners include:
2016 Abigail Saffron Tucker
2017 Jenny Nichols
2018 Christiana Ruhrberg
2019 Bénédicte Sanson
The society also has awards for early career scientists: The Beddington Medal is awarded annually for the "best PhD thesis in developmental biology" defended in the year p
https://en.wikipedia.org/wiki/S%C3%A3o%20Jo%C3%A3o%20da%20Bahia%20Theater
São João da Bahia Theater was a 19th-century Brazilian theater located at Castro Alves Square (formerly the Sé district) in Salvador, Bahia. Construction began in 1806 and the theater was inaugurated in 1812. It was a very large theater for Brazil, with a seating capacity of around two thousand people. There is a virtual museum, the São João da Bahia Virtual Museum, that introduces this theater. References Virtual museums Buildings and structures in Salvador, Bahia
https://en.wikipedia.org/wiki/EPathshala
ePathshala is a portal and app developed by CIET and NCERT. It was initiated jointly by the Ministry of Human Resource Development, CIET, and NCERT, and launched in November 2015. It hosts educational resources for teachers, students, parents, researchers and educators; it can be accessed on the Web and is available on Google Play, the App Store and Windows. The content is available in English, Hindi and Urdu. The platform offers a slew of educational resources, including NCERT textbooks for classes 1-12, audio-visual resources by NCERT, periodicals, supplements, teacher training modules and a variety of other print and non-print materials. These materials can be downloaded by the user for offline use with no limits on downloads. The app supports a flip-book format to provide a more realistic reading experience. References External links Official Website NCERT Website Ministry of Human Resource Development Indian educational websites Multilingual websites Modi administration initiatives Internet properties established in 2015 Digital India initiatives Computer-related introductions in 2015 E-learning in India Government-owned websites of India Distance education institutions based in India Education companies of India Online tutoring Virtual learning environments
https://en.wikipedia.org/wiki/Ljungstr%C3%B6m%20air%20preheater
The Ljungström air preheater is an air preheater invented by the Swedish engineer Fredrik Ljungström (1875-1964). It was patented in 1930. Even in a modern utility boiler, the preheater provides up to 20 percent of the total heat transfer in the boiler process, yet represents only two percent of the investment. The factory and workshop activities and laboratories in Lidingö would remain throughout the 1920s, with some 70 personnel. In the 1930s the site was used as a film studio, and it was finally demolished in the 1970s to give space for new industry premises. Fredrik Ljungström's air preheater technology is implemented in a vast number of modern power stations around the world to this day, with total attributed worldwide fuel savings estimated at 4,960,000,000 tons of oil; "few inventions have been as successful in saving fuel as the Ljungström Air Preheater". In 1995, the Ljungström air preheater was distinguished as the 44th International Historic Mechanical Engineering Landmark by the American Society of Mechanical Engineers. References External links History of the Ljungström Air Preheater LJUNGSTRÖM Air Preheater (APH) & Gas-gas Heater (GGH) Power Plant Overview Chemical equipment Mechanical engineering Power station technology Engineering thermodynamics Ljungström
https://en.wikipedia.org/wiki/Unused%20drug
An unused drug or leftover drug is medicine which remains after the consumer has stopped using it. Individual patients may have leftover medicines at the end of their treatment. Health care organizations may keep larger amounts of drugs as part of providing care to a community, and may have unused drugs for a range of reasons. Unused drugs should be destroyed completely to eliminate their toxic effects on flora and fauna. Improper disposal of unused drugs can contaminate surface water, groundwater and drinking water. Discharge of unused antibiotics and disinfectants into the sewage system may harm aquatic life or contaminate drinking water. Determining appropriate ways to dispose of unused medications can reduce environmental contamination problems. Several studies document the toxic effects on the environment of medications that are disposed of inappropriately. Causes Various circumstances may cause a consumer to have unused drugs. The consumer might find that their medication is ineffective and quit taking it. The medicine might be effective, but the consumer might not adhere to their treatment and fail to take it for any reason. A patient might die, leaving their medications behind. A patient might move, such as from a hospital to their home, and somehow leave their unused drugs behind with the health care provider. Some medical professional practices lead to patients having unused drugs. Physicians may prescribe more than they should. Physicians and patients might see each other less often than they should, and the physician might agree to prescribe medication for a longer period of time than is best. The physician might neglect to review what medications a patient already has, and recommend more. The medical office might have confused records about what drugs a patient has, especially for offices without full computer records. Also a physician might provide drug
https://en.wikipedia.org/wiki/Drug%20recycling
Drug recycling, also referred to as medication redispensing or medication re-use, is the idea that health care organizations or patients with unused drugs can transfer them in a safe and appropriate way to another patient in need. The purpose of such a program is to reduce medication waste, thereby saving healthcare costs, increasing the availability of medications and alleviating the environmental burden of medication. The debate Despite the need for waste-preventive measures, the debate over drug recycling programs is ongoing. It is traditional to expect that consumers get prescription drugs from a pharmacy and that the pharmacy obtained its drugs from a trusted source, such as a manufacturer or wholesaler. In a drug recycling program, consumers would access drugs through a less standardized supply chain. Consequently, concerns about the quality of the recycled drugs arise. However, in a regulated process, monitored by specialized pharmacies or a medical organization, these uncertainties can be overcome. For example, monitoring the storage conditions, including temperature, light, humidity and agitation of medication, can contribute to regulating the quality of recycled drugs. For this purpose, pharmaceutical packaging could be upgraded with sensing technologies, which can also be designed to detect counterfeits. Such packaging requires an initial investment, but this can be compensated by potential cost savings obtained through a drug recycling program. Accordingly, drug recycling seems economically viable for expensive drugs, such as HIV post-exposure prophylaxis medication. Donating practices In some countries, drug recycling programs operate successfully by donating unused drugs to the less fortunate. In the United States drug recycling programs exist locally. As of 2010, Canada had fewer drug recycling programs than the United States. These programs occur in specific pharmacies only, since these pharmacies are prepared to address the special requirements of participating in a
https://en.wikipedia.org/wiki/Smash%20and%20Grab%20%28biology%29
Smash and Grab is the name given to a technique developed by Charles S. Hoffman and Fred Winston used in molecular biology to rescue plasmids from yeast transformants into Escherichia coli, also known as E. coli, in order to amplify and purify them. In addition, it can be used to prepare yeast genomic DNA (and DNA from tissue samples) for Southern blot analyses or polymerase chain reaction (PCR). References Biology terminology
https://en.wikipedia.org/wiki/Anne%20Penfold%20Street
Anne Penfold Street (1932–2016) was one of Australia's leading mathematicians, specialising in combinatorics. She was the third woman to become a mathematics professor in Australia, following Hanna Neumann and Cheryl Praeger. She was the author of several textbooks, and her work on sum-free sets became a standard reference for its subject matter. She helped found several important organizations in combinatorics, developed a researcher network, and supported young students with an interest in mathematics. Early life and education Street was born on 11 October 1932 in Melbourne, the daughter of a medical researcher. She earned a bachelor's degree in chemistry from the University of Melbourne in 1954, while working there as a tutor in chemistry and also studying mathematics. She finished a master's degree in chemistry at Melbourne in 1956. During this time she married another Melbourne chemist, Norman Street, and in 1957 the Streets and their young daughter moved to the University of Illinois at Urbana–Champaign where Norman Street had a new job. At Illinois, Street took up mathematics again. After moving to Mildura and then returning to Urbana, she completed her doctorate at the University of Illinois in 1966, with a dissertation on group theory supervised by Michio Suzuki. Career After earning her doctorate, Street became a lecturer at the University of Queensland in 1967. While continuing to hold this position, she took a year of postdoctoral research at the University of Alberta, and on her return to Queensland in 1970 was promoted to senior lecturer, promoted again to reader in 1975, and given a personal chair as professor in 1985. At Queensland, she directed the Centre for Discrete Mathematics and Computing from its formation in 1998 until 2004. She also held visiting positions at the University of Waterloo, University of Reading, University of Manitoba, University of Illinois at Urbana–Champaign, University of Western Australia, Auburn University, and Unive
https://en.wikipedia.org/wiki/Savannah%20hypothesis
The savannah hypothesis (or savanna hypothesis) is a hypothesis that human bipedalism evolved as a direct result of human ancestors' transition from an arboreal lifestyle to one on the savannas. According to the hypothesis, hominins left the woodlands that had previously been their natural habitat millions of years ago and adapted to their new habitat by walking upright. The idea that a climate-driven retraction of tropical forests forced early hominins into bipedalism has been around for a long time, often implicitly. Some early authors saw savannahs as open grasslands, while others saw a mosaic of environments from woodlands to grasslands. The hypothesis has seen rising criticism since at least the late 1960s. The open-grassland version is now mostly dismissed, while the mosaic version still has relatively wide support. However, the transition from forest to savanna was probably more gradual than previously thought. History The fundamental ideas behind it date back to Lamarck, Darwin and Wallace. Gustav Steinmann also saw the reduction of rain forest due to climate change as an important driver for bipedalism. Osborn thought man probably originated from the forests and flood-plains of southern Asia. Hilzheimer stated it was open landscapes that stimulated development. The hypothesis first came to prominence, however, with the discovery of Australopithecus africanus by Raymond Dart in 1924. In an article on the discovery, published in the journal Nature, Dart wrote: Weinert stated apes are very reluctant to leave the safety of the trees, and that the ancestors of modern man did not leave the trees, but the trees left them. Grabau echoed this by saying "Instead of the apes leaving the trees, the trees left the apes." Not everyone agreed with this hypothesis, such as Weidenreich, but he did conclude it was a widespread belief. The work of Robert Ardrey helped popularize the ideas that Dart had developed with a wide audience. In the decades following Dart's discove
https://en.wikipedia.org/wiki/Level%20shifter
In digital electronics, a level shifter, also called level converter or logic level shifter, or voltage level translator, is a circuit used to translate signals from one logic level or voltage domain to another, allowing compatibility between integrated circuits with different voltage requirements, such as TTL and CMOS. Modern systems use level shifters to bridge domains between processors, logic, sensors, and other circuits. In recent years, the three most common logic levels have been 1.8V, 3.3V, and 5V, though levels above and below these voltages are also used. Types of level shifter Uni-directional – All input pins are dedicated to one voltage domain, all output pins are dedicated to the other. Bi-directional with Dedicated ports – Each voltage domain has both input and output pins, but the data direction of a pin does not change. Bi-directional with external direction indicator – When an external signal is changed, inputs become outputs and vice versa. Bi-directional, auto-sensing – A pair of I/O spanning voltage domains can act as either inputs or outputs depending on external stimulus without the need for a dedicated direction control pin. Hardware implementation Fixed function level shifter ICs - These ICs provide several different types of level shift in fixed function devices. Often lumped into 2-bit, 4-bit, or 8-bit level shift configurations offered with various VDD1 and VDD2 ranges, these devices translate logic levels without any additional integrated logic or timing adjustment. Configurable mixed-signal ICs (CMICs) – Level shifter circuitry can also be implemented in a CMIC. The no-code programmable nature of CMICs allows designers to implement fully customizable level shifters with the added option to integrate configurable logic or timing adjustments in the same device. Applications of level shifters Since level shifters are used to resolve the voltage incompatibility between various parts of a system, they have a wide range of applic
https://en.wikipedia.org/wiki/IStreamPlanet
iStreamPlanet was a Seattle, Washington-based company which processed and delivered live video broadcasts over the internet. A majority stake of iStreamPlanet was acquired by Turner Broadcasting in 2015, and it was last operated by Warner Bros. Discovery. The company was founded in 2000 by former basketball player Mio Babic. iStreamPlanet streamed a number of major sporting events, including NCAA Division I men's basketball tournaments, the Olympics, the Super Bowl, the FIFA World Cup, and Formula One auto racing. iStreamPlanet was formally shut down in 2023, with 25 employees laid off. Months before this, external companies had been told that iStreamPlanet was changing models to no longer have external customers outside of Warner Bros. Discovery. Customers While not all of iStreamPlanet's live video streaming customers were publicly known, some of their large customers were publicly acknowledged, including: WarnerMedia (including streaming of March Madness and B/R Live) NBC Sports (including streaming of the 2010, 2012, 2014, 2016, 2018, 2020, and 2022 Olympics) Hulu FuboTV Spark Sport References External links Online mass media companies of the United States 2000 software Former Warner Bros. Discovery subsidiaries Streaming software Streaming media systems Media servers Former AT&T subsidiaries
https://en.wikipedia.org/wiki/Michel%20Dumontier
Michel Justin Dumontier (born 1975) is a Distinguished Professor of Data Science at Maastricht University. His research focuses on methods to represent knowledge on the web with applications for drug discovery and personalized medicine. He was previously an Associate Professor of Medicine (Biomedical Informatics) at the Stanford University School of Medicine and an Associate Professor of Bioinformatics at Carleton University. He is best known for his work in biomedical ontologies, linked data and biomedical knowledge discovery. He has taught courses on biochemistry, bioinformatics, computational systems biology, and translational medicine. His research has been funded by the Natural Sciences and Engineering Research Council, the Canada Foundation for Innovation, Mitacs Canada, the Ontario Ministry of Research, Innovation and Science, CANARIE, and the US National Institutes of Health. Dumontier has an h-index of over 30 and has authored over 125 scientific publications in journals and conferences. He lives in Maastricht with his wife Tifany Irene Leung and their lionhead rabbit Storm. Biography Dumontier received his Bachelor of Science in biochemistry from the University of Manitoba in 1998. In his second year of undergraduate study, he joined the lab of James D. Jamieson, where he developed a computational method to reconstruct the Golgi apparatus. He then worked as a research assistant at the Max Planck Institute of Biochemistry in Munich to investigate the cellular dynamics of the small GTPase Rac1. He defended his PhD in the department of biochemistry at the University of Toronto on the subject of "Species-specific optimizations of sequence and structure". After a brief postdoctoral fellowship at the Blueprint Initiative, a project funded by Genome Canada and hosted in the Mount Sinai Hospital Research Institute, he joined the department of biology at Carleton University as an assistant professor in 2005. He was subsequently cross-appointed to the school of computer
https://en.wikipedia.org/wiki/Alternatives%20to%20Darwinian%20evolution
Alternatives to Darwinian evolution have been proposed by scholars investigating biology to explain signs of evolution and the relatedness of different groups of living things. The alternatives in question do not deny that evolutionary changes over time are the origin of the diversity of life, nor that the organisms alive today share a common ancestor from the distant past (or ancestors, in some proposals); rather, they propose alternative mechanisms of evolutionary change over time, arguing against mutations acted on by natural selection as the most important driver of evolutionary change. This distinguishes them from certain other kinds of arguments that deny that large-scale evolution of any sort has taken place, as in some forms of creationism, which do not propose alternative mechanisms of evolutionary change but instead deny that evolutionary change has taken place at all. Not all forms of creationism deny that evolutionary change takes place; notably, proponents of theistic evolution, such as the biologist Asa Gray, assert that evolutionary change does occur and is responsible for the history of life on Earth, with the proviso that this process has been influenced by a god or gods in some meaningful sense. Where the fact of evolutionary change was accepted but the mechanism proposed by Charles Darwin, natural selection, was denied, explanations of evolution such as Lamarckism, catastrophism, orthogenesis, vitalism, structuralism and mutationism (called saltationism before 1900) were entertained. Different factors motivated people to propose non-Darwinian mechanisms of evolution. Natural selection, with its emphasis on death and competition, did not appeal to some naturalists because they felt it immoral, leaving little room for teleology or the concept of progress (orthogenesis) in the development of life. Some who came to accept evolution, but disliked natural selection, raised religious objections. Others felt that evolution was an inherently progressive
https://en.wikipedia.org/wiki/Gibbard%27s%20theorem
In the fields of mechanism design and social choice theory, Gibbard's theorem is a result proven by philosopher Allan Gibbard in 1973. It states that for any deterministic process of collective decision, at least one of the following three properties must hold: The process is dictatorial, i.e. there exists a distinguished agent who can impose the outcome; The process limits the possible outcomes to two options only; The process is open to strategic voting: once an agent has identified their preferences, it is possible that they have no action at their disposal that best defends these preferences irrespective of the other agents' actions. A corollary of this theorem is the Gibbard–Satterthwaite theorem about voting rules. The main difference between the two is that the Gibbard–Satterthwaite theorem is limited to ranked (ordinal) voting rules: a voter's action consists in giving a preference ranking over the available options. Gibbard's theorem is more general and considers processes of collective decision that may not be ordinal: for example, voting systems where voters assign grades to candidates (cardinal voting). Gibbard's theorem can be proven using Arrow's impossibility theorem. Gibbard's theorem is itself generalized by Gibbard's 1978 theorem and Hylland's theorem, which extend these results to non-deterministic processes, i.e. where the outcome may not only depend on the agents' actions but may also involve an element of chance. Gibbard's theorem assumes that the collective decision results in exactly one winner and does not apply to multi-winner voting. Overview Consider three voters who wish to select an option among three alternatives. Assume they use approval voting: each voter assigns to each candidate the grade 1 (approval) or 0 (withhold approval). For example, an authorized ballot may approve of the first two candidates and withhold approval from the third. Once the ballots are collected, the candidate with the highest total
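The approval rule described above (each voter grades every candidate 1 or 0, and the highest total wins) can be sketched in a few lines of Python. A minimal illustration: the candidate names, the ballots, and the alphabetical tie-break are assumptions, not from the source.

```python
def approval_winner(ballots, candidates):
    """Each ballot maps candidate -> 1 (approve) or 0 (withhold approval).
    The candidate with the highest total wins; ties are broken
    alphabetically (an assumption, the source leaves tie-breaking open)."""
    totals = {c: sum(b.get(c, 0) for b in ballots) for c in candidates}
    best = max(totals.values())
    return min(c for c in candidates if totals[c] == best)

ballots = [
    {"a": 1, "b": 1, "c": 0},   # approves a and b, withholds approval from c
    {"a": 0, "b": 1, "c": 1},
    {"a": 0, "b": 0, "c": 1},
]
winner = approval_winner(ballots, ["a", "b", "c"])  # b and c tie at 2; "b" wins
```

Note that the grades are cardinal rather than ordinal, which is why Gibbard's theorem, and not the narrower Gibbard–Satterthwaite theorem, applies to this rule.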
https://en.wikipedia.org/wiki/Hash-based%20cryptography
Hash-based cryptography is the generic term for constructions of cryptographic primitives based on the security of hash functions. It is of interest as a type of post-quantum cryptography. So far, hash-based cryptography is used to construct digital signature schemes such as the Merkle signature scheme, zero-knowledge and computational integrity proofs, such as the zk-STARK proof system and range proofs over issued credentials via the HashWires protocol. Hash-based signature schemes combine a one-time signature scheme, such as a Lamport signature, with a Merkle tree structure. Since a one-time signature scheme key can only sign a single message securely, it is practical to combine many such keys within a single, larger structure. A Merkle tree structure is used to this end. In this hierarchical data structure, a hash function and concatenation are used repeatedly to compute tree nodes. One consideration with hash-based signature schemes is that they can only sign a limited number of messages securely, because of their use of one-time signature schemes. The US National Institute of Standards and Technology (NIST) specified that algorithms in its post-quantum cryptography competition support a minimum of 2^64 signatures safely. In 2022, NIST announced SPHINCS+ as one of three algorithms to be standardized for digital signatures. NIST standardized stateful hash-based cryptography based on the eXtended Merkle Signature Scheme (XMSS) and Leighton–Micali Signatures (LMS), which are applicable in different circumstances, in 2020, but noted that the requirement to maintain state when using them makes them more difficult to implement in a way that avoids misuse. History Leslie Lamport invented hash-based signatures in 1979. The XMSS (eXtended Merkle Signature Scheme) and SPHINCS hash-based signature schemes were introduced in 2011 and 2015, respectively. XMSS was developed by a team of researchers under the direction of Johannes Buchmann and is based both on Merkle's s
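The one-time signature component mentioned above can be sketched as a Lamport scheme over SHA-256. This is a minimal illustration, not a production implementation; the function names are assumptions.

```python
import hashlib
import secrets

def H(data):
    """SHA-256 digest, the only cryptographic primitive used."""
    return hashlib.sha256(data).digest()

def keygen():
    # 256 pairs of random secrets, one pair per digest bit; the public key
    # holds their hashes.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def _bits(msg):
    digest = int.from_bytes(H(msg), "big")
    return [(digest >> i) & 1 for i in range(256)]

def sign(sk, msg):
    # Reveal one secret from each pair, chosen by the corresponding digest bit.
    # Each revealed secret leaks half of a pair, so a key signs ONE message only.
    return [pair[bit] for pair, bit in zip(sk, _bits(msg))]

def verify(pk, msg, sig):
    return all(H(s) == pair[bit] for s, pair, bit in zip(sig, pk, _bits(msg)))
```

The one-message limit is exactly why, as the text explains, many such keys are aggregated under a Merkle tree whose root then serves as a single long-term public key.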
https://en.wikipedia.org/wiki/Cyclomorphosis
Cyclomorphosis (also known as seasonal polyphenism) is the name given to the occurrence of cyclic or seasonal changes in the phenotype of an organism through successive generations. In species undergoing cyclomorphosis, physiological characteristics and development cycles of individuals being born depend on the time of the year at which they are conceived. It occurs in small aquatic invertebrates that reproduce by parthenogenesis and give rise to several generations annually. It occurs especially in marine planktonic animals, and is thought to be caused by the epigenetic effect of environmental cues on the organism, thereby altering the course of their development. References Biological processes Animal physiology
https://en.wikipedia.org/wiki/Miller%20and%20Lents
Miller and Lents, Ltd. is a petroleum consulting company based in Houston, Texas. The firm provides services including reserves certifications, audits, and independent evaluations. They prepare evaluations according to the standards of the United States Securities and Exchange Commission (SEC) Regulation S-X and the Petroleum Resources Management System (PRMS) published by the Society of Petroleum Engineers (SPE). Current operations Board of directors The Chairman of the Board is Robert Oberst. The Senior Vice Presidents are Leslie Fallon and Gary Knapp. Consulting activities Reserves evaluations Miller and Lents, Ltd. prepares reserves estimates by applying both SEC and SPE-PRMS standards. These estimates include the assessment of developed and undeveloped reserves and classification according to Proved, Probable, Possible, Contingent, and Prospective Resources definitions. Economics They also evaluate relevant economic parameters and create financial reports for the United States Securities and Exchange Commission (SEC), the London Stock Exchange (LSE), and the Alternative Investment Market (AIM); cash flow projections; forecasts of future prices; and estimates of Fair Market Value. Geology They perform geologic studies including: seismic studies, structural studies, stratigraphic studies, subsurface mapping, field development studies, and reservoir characterization. Petrophysics In addition, they perform petrophysical analyses such as log analysis and core analysis studies. Areas of operation Miller and Lents, Ltd. provides services to domestic and international clients, with a significant portion of their business coming from clients operating in Russia. In addition to evaluations for clients operating in Russia, Miller and Lents, Ltd. has performed evaluations for clients in the United States, Azerbaijan, Israel, Kazakhstan, the United Kingdom, Australia, and Lithuania, among others. History In 1948, J. R. Butler and Martin Miller formed
https://en.wikipedia.org/wiki/FIR%20transfer%20function
A transfer function filter utilizes the transfer function and the convolution theorem to produce a filter. In this article, an example of such a filter using a finite impulse response is discussed, and an application of the filter to real-world data is shown. FIR (Finite Impulse Response) Linear filters In digital signal processing, an FIR filter is a discrete-time filter that is invariant with time. This means that the filter does not depend on the specific point in time, but rather on the elapsed time. The specification of this filter uses a transfer function whose frequency response passes only the desired frequencies of the input. This type of filter is non-recursive, which means that the output can be completely derived from a combination of the input values without any recursive use of previous output values. This means that there is no feedback loop that feeds previous output values back into the new output. This is an advantage over recursive filters such as the IIR filter (Infinite Impulse Response) in applications that require a linear phase response, because a linear-phase FIR filter passes the input without phase distortion. Mathematical model Let the output function be y(t) and the input be x(t). The convolution of the input with a transfer function provides a filtered output. The mathematical model of this type of filter is: y(t) = ∑_k h(k) x(t − k), where h(k) is the impulse response corresponding to the transfer function. The convolution activates the filter only where the input overlaps the support of h: the filter uses the input values x(t − k) for those k that fall into the support region of the function h. This finite support is the reason why this filter is called finite impulse response. If k is outside of the support region, the impulse response is zero, which makes the corresponding contribution to the output zero. The central idea is that this h() function can be thought of as a quotient of two functions. According to Huang (1981) Using this mathematical model, there are four methods of designing non-recursive linear filters with various concurren
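The convolution model above, in which the output is the input weighted by a finite-support impulse response, can be sketched directly in Python. The 5-tap moving-average impulse response is an illustrative choice, not from the text.

```python
def fir_filter(x, h):
    """Direct-form FIR: y[n] = sum_k h[k] * x[n - k].
    Non-recursive: the output uses only input samples, never past outputs."""
    y = []
    for n in range(len(x)):
        acc = 0.0
        for k, hk in enumerate(h):
            # h has finite support; outside the input, x is treated as zero.
            if 0 <= n - k < len(x):
                acc += hk * x[n - k]
        y.append(acc)
    return y

h = [0.2] * 5        # 5-tap moving average: a simple low-pass impulse response
x = [1.0] * 10       # unit-step input
y = fir_filter(x, h)
# The step response ramps up over 5 samples and then settles at 1.0.
```

Because the taps here are symmetric, the filter has the linear phase response the text highlights as the main advantage of FIR over IIR designs.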
https://en.wikipedia.org/wiki/Branches%20of%20microbiology
The branches of microbiology can be classified into pure and applied sciences. Microbiology can also be classified based on taxonomy, as in the cases of bacteriology, mycology, protozoology, and phycology. There is considerable overlap between the specific branches of microbiology with each other and with other disciplines, and certain aspects of these branches can extend beyond the traditional scope of microbiology. In general, the field of microbiology can be divided into the more fundamental branch (pure microbiology) and applied microbiology (biotechnology). In the more fundamental field the organisms are studied as the subject itself on a deeper (theoretical) level. Applied microbiology refers to the fields where the micro-organisms are applied in certain processes such as brewing or fermentation. The organisms themselves are often not studied as such, but are applied to sustain certain processes. Pure microbiology Bacteriology: the study of bacteria Mycology: the study of fungi Protozoology: the study of protozoa Phycology/algology: the study of algae Parasitology: the study of parasites Immunology: the study of the immune system Virology: the study of viruses Nematology: the study of nematodes Microbial cytology: the study of microscopic and submicroscopic details of microorganisms Microbial physiology: the study of how the microbial cell functions biochemically. Includes the study of microbial growth, microbial metabolism and microbial cell structure Microbial pathogenesis: the study of pathogens which happen to be microbes Microbial ecology: the relationship between microorganisms and their environment Microbial genetics: the study of how genes are organized and regulated in microbes in relation to their cellular functions. Closely related to the field of molecular biology Cellular microbiology: a discipline bridging microbiology and cell biology Evolutionary microbiology: the study of the evolution of microbes. This field can be subdivided into: Micr
https://en.wikipedia.org/wiki/Supersymmetric%20theory%20of%20stochastic%20dynamics
Supersymmetric theory of stochastic dynamics or stochastics (STS) is an exact theory of stochastic (partial) differential equations (SDEs), the class of mathematical models with the widest applicability covering, in particular, all continuous time dynamical systems, with and without noise. The main utility of the theory from the physical point of view is a rigorous theoretical explanation of the ubiquitous spontaneous long-range dynamical behavior that manifests itself across disciplines via such phenomena as 1/f, flicker, and crackling noises and the power-law statistics, or Zipf's law, of instantonic processes like earthquakes and neuroavalanches. From the mathematical point of view, STS is interesting because it bridges the two major parts of mathematical physics – the dynamical systems theory and topological field theories. Besides these and related disciplines such as algebraic topology and supersymmetric field theories, STS is also connected with the traditional theory of stochastic differential equations and the theory of pseudo-Hermitian operators. The theory began with the application of the BRST gauge fixing procedure to Langevin SDEs, which was later adapted to classical mechanics and its stochastic generalization, higher-order Langevin SDEs, and, more recently, to SDEs of arbitrary form, which made it possible to link the BRST formalism to the concept of transfer operators and to recognize spontaneous breakdown of BRST supersymmetry as a stochastic generalization of dynamical chaos. The main idea of the theory is to study, instead of trajectories, the SDE-defined temporal evolution of differential forms. This evolution has an intrinsic BRST or topological supersymmetry representing the preservation of topology and/or the concept of proximity in the phase space by continuous time dynamics. The theory identifies a model as chaotic, in the generalized, stochastic sense, if its ground state is not supersymmetric, i.e., if the supersymmetry is broken spontaneously. Accordingly,
https://en.wikipedia.org/wiki/Machine%20learning%20in%20bioinformatics
Machine learning in bioinformatics is the application of machine learning algorithms to bioinformatics, including genomics, proteomics, microarrays, systems biology, evolution, and text mining. Prior to the emergence of machine learning, bioinformatics algorithms had to be programmed by hand; for problems such as protein structure prediction, this proved difficult. Machine learning techniques, such as deep learning, can learn features of data sets, instead of requiring the programmer to define them individually. The algorithm can further learn how to combine low-level features into more abstract features, and so on. This multi-layered approach allows such systems to make sophisticated predictions when appropriately trained. These methods contrast with other computational biology approaches which, while exploiting existing datasets, do not allow the data to be interpreted and analyzed in unanticipated ways. In recent years, the size and number of available biological datasets have skyrocketed. Tasks Machine learning algorithms in bioinformatics can be used for prediction, classification, and feature selection. Methods to achieve this task are varied and span many disciplines; most well known among them are machine learning and statistics. Classification and prediction tasks aim at building models that describe and distinguish classes or concepts for future prediction. The differences between them are the following: classification/recognition outputs a categorical class, while prediction outputs a numerically valued feature; and the predictive models are built from data by various types of algorithm or process, using analogies, rules, neural networks, probabilities, and/or statistics. Due to the exponential growth of information technologies and applicable models, including artificial intelligence and data mining, in addition to access to ever-more comprehensive data sets, new and better information analysis techniques have been created, based on their ability to learn.
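The distinction drawn above, a categorical class for classification versus a numeric value for prediction, can be illustrated with a toy nearest-neighbour sketch. The data and the neighbour rules below are illustrative assumptions, not methods from the source.

```python
def _dist2(p, q):
    # Squared Euclidean distance between two feature vectors.
    return sum((a - b) ** 2 for a, b in zip(p, q))

def classify_1nn(train, query):
    # Classification: return the CATEGORICAL label of the nearest training point.
    return min(train, key=lambda t: _dist2(t[0], query))[1]

def predict_knn_mean(train, query, k=2):
    # Prediction/regression: return a NUMERIC value, here the mean target
    # of the k nearest training points.
    nearest = sorted(train, key=lambda t: _dist2(t[0], query))[:k]
    return sum(t[1] for t in nearest) / len(nearest)

# Hypothetical expression levels -> disease class (categorical output)
labelled = [((0.1, 0.2), "healthy"), ((0.9, 0.8), "disease"), ((0.2, 0.1), "healthy")]
# The same hypothetical features -> survival months (numeric output)
numeric = [((0.1, 0.2), 60.0), ((0.9, 0.8), 12.0), ((0.2, 0.1), 55.0)]
```

The same training features feed both tasks; only the output type differs, which is the point the text makes.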
https://en.wikipedia.org/wiki/Hopf%20decomposition
In mathematics, the Hopf decomposition, named after Eberhard Hopf, gives a canonical decomposition of a measure space (X, μ) with respect to an invertible non-singular transformation T:X→X, i.e. a transformation which with its inverse is measurable and carries null sets onto null sets. Up to null sets, X can be written as a disjoint union C ∐ D of T-invariant sets where the action of T on C is conservative and the action of T on D is dissipative. Thus, if τ is the automorphism of A = L∞(X) induced by T, there is a unique τ-invariant projection p in A such that pA is conservative and (I–p)A is dissipative. Definitions Wandering sets and dissipative actions. A measurable subset W of X is wandering if its characteristic function q = χ_W in A = L∞(X) satisfies qτ^n(q) = 0 for all n; thus, up to null sets, the translates T^n(W) are pairwise disjoint. An action is called dissipative if X = ∐ T^n(W) a.e. for some wandering set W. Conservative actions. If X has no wandering subsets of positive measure, the action is said to be conservative. Incompressible actions. An action is said to be incompressible if whenever a measurable subset Z satisfies T(Z) ⊆ Z then Z ∖ T(Z) has measure zero. Thus if q = χ_Z and τ(q) ≤ q, then τ(q) = q a.e. Recurrent actions. An action T is said to be recurrent if q ≤ τ(q) ∨ τ^2(q) ∨ τ^3(q) ∨ ... a.e. for any q = χ_Y. Infinitely recurrent actions. An action T is said to be infinitely recurrent if q ≤ τ^m(q) ∨ τ^(m+1)(q) ∨ τ^(m+2)(q) ∨ ... a.e. for any q = χ_Y and any m ≥ 1. Recurrence theorem Theorem. If T is an invertible transformation on a measure space (X, μ) preserving null sets, then the following conditions are equivalent on T (or its inverse): T is conservative; T is recurrent; T is infinitely recurrent; T is incompressible. Since T is dissipative if and only if T^−1 is dissipative, it follows that T is conservative if and only if T^−1 is conservative. If T is conservative, then r = q ∧ (τ(q) ∨ τ^2(q) ∨ τ^3(q) ∨ ⋅⋅⋅)⊥ = q ∧ τ(1 − q) ∧ τ^2(1 − q) ∧ τ^3(1 − q) ∧
https://en.wikipedia.org/wiki/Channelizer
In digital signal processing, a channelizer is a term used for algorithms which select a certain frequency band from an input signal. The input signal typically has a higher sample rate than the sample rate of the selected channel. It is also used for algorithms that can select multiple channels from an input signal in an efficient way. One of the most common methods for selecting a channel from an input signal is to first shift the frequency by multiplying it with a complex sinusoid, then passing the signal through a low pass filter. Alternatively, a decimator (rate changer) can be used. One common type of channelizer is the polyphase channelizer. Decimation is the process of reducing sample rate. Decimation originally meant "take one sample in every 10", but later this term was generalized to simply mean any reduction in sample rate. Digital signal processing
https://en.wikipedia.org/wiki/Salivary%20microbiome
The salivary microbiome consists of the nonpathogenic, commensal bacteria present in the healthy human salivary glands. It differs from the oral microbiome which is located in the oral cavity. Oral microorganisms tend to adhere to teeth. The oral microbiome possesses its own characteristic microorganisms found there. Resident microbes of the mouth adhere to the teeth and gums. "[T]here may be important interactions between the saliva microbiome and other microbiomes in the human body, in particular, that of the intestinal tract." Characteristics Unlike the uterine, placental and vaginal microbiomes, the types of organisms in the salivary microbiota remain relatively constant. There is no difference between populations of microbes based upon gender, age, diet, obesity, alcohol intake, race, or tobacco use. The salivary microbiome characteristically remains stable over a lifetime. One study suggests sharing an environment (e.g., living together) may influence the salivary microbiome more than genetic components. Porphyromonas, Solobacterium, Haemophilus, Corynebacterium, Cellulosimicrobium, Streptococcus and Campylobacter are some of the genera found in the saliva. Genetic markers and diagnostic testing "There is high diversity in the salivary microbiome within and between individuals, but little geographic structure. Overall, ~13.5% of the total variance in the composition of genera is due to differences among individuals, which is remarkably similar to the fraction of the total variance in neutral genetic markers that can be attributed to differences among human populations." "[E]nvironmental variables revealed a significant association between the genetic distances among locations and the distance of each location from the equator. Further characterization of the enormous diversity revealed here in the human salivary microbiome will aid in elucidating the role it plays in human health and disease, and in the identification of potentially informative species
https://en.wikipedia.org/wiki/Dynamical%20dimensional%20reduction
Dynamical dimensional reduction or spontaneous dimensional reduction is the apparent reduction in the number of spacetime dimensions as a function of the distance scale, or conversely the energy scale, with which spacetime is probed. At least within the current level of experimental precision, our universe has three dimensions of space and one of time. However, the idea that the number of dimensions may increase at extremely small length scales was first proposed more than a century ago, and is now fairly commonplace in theoretical physics. Contrary to this, a number of recent results in quantum gravity suggest the opposite behavior, a dynamical reduction of the number of spacetime dimensions at small length scales. Evidence for dimensional reduction The phenomenon of dimensional reduction has now been reported in a number of different approaches to quantum gravity. String theory, causal dynamical triangulations, renormalization group approaches, noncommutative geometry, loop quantum gravity and Horava-Lifshitz gravity all find that the dimensionality of spacetime appears to decrease from approximately 4 on large distance scales to approximately 2 on small distance scales. The evidence for dimensional reduction has come mainly, although not exclusively, from calculations of the spectral dimension. The spectral dimension is a measure of the effective dimension of a manifold at different resolution scales. Early numerical simulations within the causal dynamical triangulation (CDT) approach to quantum gravity found a spectral dimension of 4.02 ± 0.10 at large distances and 1.80 ± 0.25 at small distances. This result created significant interest in dimensional reduction within the quantum gravity community. A more recent study of the same point in the parameter space of CDT found consistent results, namely 4.05 ± 0.17 at large distances and 1.97 ± 0.27 at small distances. Currently, there is no consensus on the correct theoretical explanation for the mechanism of di
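The spectral dimension mentioned above has a concrete operational definition: run a diffusion process (a random walk) on the space and measure how the return probability decays with diffusion time, P(s) ~ s^(-d_s/2). A minimal Python sketch on a flat two-dimensional lattice illustrates the estimator itself (this is a plain statistical illustration, not any quantum gravity calculation) and recovers d_s ≈ 2:

```python
import numpy as np

rng = np.random.default_rng(0)

def return_probability(dim, checkpoints, walkers=400_000):
    """P(s): fraction of simple random walks on Z^dim that are back at the
    origin after s steps, recorded at the requested checkpoint times."""
    pos = np.zeros((walkers, dim), dtype=np.int64)
    probs = {}
    for s in range(1, max(checkpoints) + 1):
        axis = rng.integers(0, dim, size=walkers)      # pick a coordinate axis
        step = rng.choice((-1, 1), size=walkers)       # step +1 or -1 along it
        pos[np.arange(walkers), axis] += step
        if s in checkpoints:
            probs[s] = np.mean(np.all(pos == 0, axis=1))
    return probs

# Spectral dimension from the diffusion scaling P(s) ~ s^(-d_s/2),
# i.e. d_s = -2 * d ln P / d ln s, estimated from two diffusion times.
p = return_probability(dim=2, checkpoints={200, 400})
d_s = -2 * (np.log(p[400]) - np.log(p[200])) / (np.log(400) - np.log(200))
print(round(d_s, 1))  # close to 2 on a flat two-dimensional lattice
```

Approaches such as CDT compute an analogous quantity on an ensemble of quantum geometries rather than a fixed lattice, which is where the scale dependence of the dimension appears.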
https://en.wikipedia.org/wiki/Bonobo%20%28GNOME%29
Bonobo is an obsolete component framework for the GNOME free desktop environment. Bonobo was designed for creating reusable software components and compound documents. Throughout its development history it resembled Microsoft's OLE technology, and it was GNOME's equivalent of KDE's KParts. Bonobo was developed as a solution to the problems and requirements of the free software community in the development of complex applications. Bonobo is based on the Common Object Request Broker Architecture (CORBA), or rather its GNOME implementation, ORBit. Through Bonobo the functions of one application can be integrated into another: for example, Gnumeric spreadsheet tables can be embedded directly into an AbiWord text document by including Gnumeric as a Bonobo component. Available components include: the Gnumeric spreadsheet, the ggv PostScript viewer, the Xpdf PDF viewer and the gill SVG viewer. History Inspired by Microsoft's OLE, Bonobo was originally developed by Ximian for compound documents. Bonobo was included for the first time in GNOME 1.2 in May 2000. As of GNOME 2.4, Bonobo is officially considered obsolete, and developers are advised to switch to alternatives such as D-Bus or the GIO component of GLib instead. D-Bus replaced Bonobo as part of the Ridley project, whose final results were slated for release in GTK+ 3.0. The Bonobo and ORBit libraries were removed from GNOME in version 2.22. References External links Bonobo Bonobo site at Gnome The Art of Writing a Bonobo Control Inter-process communication Application programming interfaces GNOME obsolete
https://en.wikipedia.org/wiki/Comparison%20of%20single-board%20microcontrollers
This is a comparison of single-board microcontrollers, excluding single-board computers. See also Comparison of single-board computers References Further reading External links Arduino Computing comparisons Microcontrollers
https://en.wikipedia.org/wiki/Gray%20molasses
Gray molasses is a method of sub-Doppler laser cooling of atoms. It employs principles from Sisyphus cooling in conjunction with a so-called "dark" state whose transition to the excited state is not addressed by the resonant lasers. Ultracold atomic physics experiments on atomic species with poorly-resolved hyperfine structure, like isotopes of lithium and potassium, often utilize gray molasses instead of Sisyphus cooling as a secondary cooling stage after the ubiquitous magneto-optical trap (MOT) to achieve temperatures below the Doppler limit. Unlike a MOT, which combines a molasses force with a confining force, a gray molasses can only slow but not trap atoms; hence, its efficacy as a cooling mechanism lasts only milliseconds before further cooling and trapping stages must be employed. Overview Like Sisyphus cooling, the cooling mechanism of gray molasses relies on a two-photon Raman-type transition between two hyperfine-split ground states mediated by an excited state. Orthogonal superpositions of these ground states constitute "bright" and "dark" states, so called since the former couples to the excited state via dipole transitions driven by the laser, and the latter is only accessible via spontaneous emission from the excited state. As neither are eigenstates of the kinetic energy operator, the dark state also evolves into the bright state with frequency proportional to the atom's external momentum. Gradients in the polarization of the molasses beam create a sinusoidal potential energy landscape for the bright state in which atoms lose kinetic energy by traveling "uphill" to potential energy maxima that coincide with circular polarizations capable of executing electric dipole transitions to the excited state. Atoms in the excited state are then optically pumped to the dark state and subsequently evolve back to the bright state to restart the cycle. Alternately, the pair of bright and dark ground states can be generated by electromagnetically-induced transpar
https://en.wikipedia.org/wiki/RSPB%20Medal
The RSPB Medal is awarded annually by the Royal Society for the Protection of Birds. According to the RSPB: The RSPB Medal is the Society's most prestigious award. It is presented to an individual in recognition of wild bird protection and countryside conservation. It is usually awarded annually to one or occasionally two people. The medal was first awarded in 1908. Recipients Rt Hon John Gummer Prof Chris Perrins FRS (1992) Bill Oddie (1997) Chris Mead (1999) Sir David Attenborough (2000) Robert Gillmor (2003) Professor Chris Baines (2004) Dr Colin Bibby (2004) Michael McCarthy (2007) Dr Jeff Watson (2007) Charles, Prince of Wales (2011) Prof John Lawton FRS (2011) Tristan da Cunha community (2012) Peter Harrison (2012) Prof Robert Watson (2013) The BTO, Birdwatch Ireland and SOC team who produced the Bird Atlas (2014) Stanley Johnson (2015) Prof Georgina Mace FRS (2016) Dick Potts (2017) Caroline Lucas (2018) Dara McAnulty (youngest ever recipient 2019) See also List of environmental awards References British awards Conservation biology Environmental awards Royal Society for the Protection of Birds
https://en.wikipedia.org/wiki/Simple%20task-actor%20protocol
Simple Task-Actor Protocol (STAP) is a machine-readable format for specifying user-interface changes. STAP enables symmetric user-interface access for AI and human users. Description The main focus of STAP is providing a functionally equivalent task experience for human and computational users alike. In its aim to make human software usable by machine agents, STAP eliminates non-task-essential design choices (e.g. font type and size may be irrelevant for many task types), leaving those to be optionally specified via customizable templates (e.g. CSS). STAP messages adhere to JSON formatting and can be deserialized with any standard JSON library. Deployment Deploying a STAP application is similar to deploying a web application, with STAP taking the place of HTML as the language for UI description. Much like HTML, STAP is a means of serializing task-interface display and interactions. Unlike HTML documents, STAP messages are incremental updates to the display. Whereas HTML is focused on hypertext look and feel, STAP is focused on the function, structure, and affordances of the task display. Benefits Benefits of task development with STAP: less code, more GUI; cross-platform and web-friendly; precision timing; a consistent cross-task API, allowing computational agents to interact with the same software that human users interact with. Benefits of agent development for STAP-compliant tasks: a consistent cross-task API, allowing computational agents to interact with the same software that human users interact with; precise user-time [with faster-than-real-time and slower-than-real-time capabilities]; cross-platform and web-friendly; scalable (i.e., the core API is minimal, and additional UI feature handlers can be added to the agent framework on a per-task basis). Sample interactions -> Message sent from task software to participant software <- Message sent to task software from participant software Sample Interaction 1: // user software ready <- [0,[0]]
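Because STAP messages are plain JSON, the handshake shown in Sample Interaction 1 can be read with any standard JSON library. A minimal Python sketch (the interpretation of the two fields follows the sample's inline comment; the full message vocabulary is defined by the STAP specification, not here):

```python
import json

# The sample handshake from the article, sent by the participant (user)
# software to the task software: "[0,[0]]" signals "user software ready".
raw = "[0,[0]]"

message = json.loads(raw)   # any standard JSON library can deserialize STAP
msg_id, payload = message   # first element: message number; second: body

print(msg_id, payload)      # 0 [0]
```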
https://en.wikipedia.org/wiki/Common%20Electrical%20I/O
The Common Electrical I/O (CEI) refers to a series of influential Interoperability Agreements (IAs) that have been published by the Optical Internetworking Forum (OIF). CEI defines the electrical and jitter requirements for 3.125, 6, 11, 25-28, and 56 Gbit/s electrical interfaces. CEI, the Common Electrical I/O The Common Electrical I/O (CEI) Interoperability Agreement published by the OIF defines the electrical and jitter requirements for 3.125, 6, 11, 25-28, and 56 Gbit/s SerDes interfaces. This CEI specification has defined SerDes interfaces for the industry since 2004, and it has been highly influential. The development of electrical interfaces at the OIF began with SPI-3 in 2000, and the first differential interface was published in 2003. The seventh generation electrical interface, CEI-56G, defines five reaches of 56 Gbit/s interfaces. The OIF completed work on its eighth generation through its CEI-112G project. The OIF has launched its ninth generation with its CEI-224G project. CEI has influenced or has been adopted or adapted in many other serial interface standards by many different standards organizations over its long lifetime. SerDes interfaces have been developed based on CEI for most ASIC and FPGA products. CEI direct predecessors Throughout the 2000s, the OIF produced an important series of interfaces that influenced the development of multiple generations of devices. Beginning with the donation of the PL-3 interface by PMC-Sierra in 2000, the OIF produced the System Packet Interface (SPI) family of packet interfaces. SPI-3 and SPI-4.2 defined two generations of devices before they were supplanted by the closely related Interlaken standard in the SPI-5 generation in 2006. The OIF also defined the SerDes Framer Interface (SFI) family of specifications in parallel with SPI. As a part of the SPI-5 and SFI-5 development, a common electrical interface was developed termed SxI-5. SxI-5 abstracted the electrical I/O interface away from the indivi
https://en.wikipedia.org/wiki/UK%20Molecular%20R-matrix%20Codes
The UK Molecular R-Matrix codes are a set of software routines used to calculate the effects of collision of electrons with atoms and molecules. The R-matrix method is used in computational quantum mechanics to study scattering of positrons and electrons by atomic and molecular targets. The fundamental idea was originally introduced by Eugene Wigner and Leonard Eisenbud in the 1940s. The method uses the fixed nuclei approximation, in which the molecule's nuclei are considered fixed when the collision occurs and the electronic part of the problem is solved. This information is then fed into calculations which take nuclear motion into account. The UK Molecular R-Matrix codes were developed by the Collaborative Computational Project Q (CCPQ). Software The CCPQ and CCP2 have supported various incarnations of the UK Molecular R-matrix project for almost 40 years. The UK Molecular R-Matrix Group is a subgroup of CCP2, and its codes are maintained by Professor Jonathan Tennyson and his group of researchers. Advances in research have shown that the UK Molecular R-matrix codes can be used to explain scattering problems involving light molecular targets. Quantemol-N (QN) is software that allows the UK molecular R-matrix codes, which are used to model electron-polyatomic molecule interactions, to be employed quickly with reduced set-up times. QN is an interface that simplifies the process of using the sophisticated UK molecular R-Matrix codes. See also Quantemol References Matrices Particle physics Quantum mechanics
https://en.wikipedia.org/wiki/Miami%20Science%20Barge
The Miami Science Barge (also known as the Science Barge) was a floating marine laboratory and education platform docked in Museum Park, Miami, FL from 2016. The Barge, designed to help support a more sustainable city, had three main areas of focus: marine ecology and conservation, sustainability, and alternative agriculture. It was completely off-grid and off-pipe and provided approximately enough energy and food production to support an average American family. In its first year, over 3,000 students came aboard to learn about the innovative technology on the Barge. The vessel was open to the public on Saturdays. The Miami Science Barge was conceived by Nathalie Manzano and designed by Manzano and Ted Caplow. They were inspired by the Science Barge built in 2006 by New York Sun Works, designed by Caplow. The vessels were of similar size and both had a sustainable-technology focus, but they responded to very different local environments and housed differing technology and unique public education programs; the Miami Science Barge emphasized aquaculture. The Miami Science Barge was donated in April 2017 to the brand-new Phillip and Patricia Frost Museum of Science, which took over operations. The Miami Science Barge is no longer in use. Early history In 2015, Nathalie Manzano and Ted Caplow of CappSci won the Knight Cities Challenge grant competition from the John S. and James L. Knight Foundation with a proposal to build the Miami Science Barge. The Barge was a 120x30 steel construction barge from Grady Marine retrofitted with second-hand shipping containers in 2015. With the generosity of Beau Payne of P & L Towing, the staff of CappSci were able to design and build the power system and exhibits of the Barge on the Miami River before moving it to its official location in Museum Park in downtown Miami, FL. The Miami Science Barge opened on Earth Day, April 22, 2016. The following April, the Barge was gifted to the Phillip and Patricia Frost Museum of Science. Technical det
https://en.wikipedia.org/wiki/Pfafstetter%20Coding%20System
The Pfafstetter Coding System is a hierarchical method of hydrologically coding river basins. It was developed by the Brazilian engineer Otto Pfafstetter in 1989. It is designed such that topological information is embedded in the code, which makes it easy to determine whether an event in one river basin will affect another by direct examination of their codes. History and use In the 1950s, Pfafstetter suggested the use of a hierarchical system of coding river basins, later described in a 1989 paper. The method was applied to Brazilian water networks, and has been used in a number of other applications. Description The Pfafstetter system relies on the properties of the base-10 numbering system. In a water system to be coded, the main stem is defined as the path which drains the greatest area. The four major tributaries, in terms of water drainage, of the main stem are determined, and the water basin of each defined. This results in four tributary basins, as well as five inter-basin regions which are drained by the main stem. Each region is then numbered from 1–9, with the downstream-most inter-basin region denoted 1; therefore, the inter-basin regions and tributary basins are numbered 1, 3, 5, 7, 9 and 2, 4, 6, 8, respectively. The number 0 is reserved for closed drainage systems. Each tributary basin is then coded in an identical manner, and the resulting number appended to the end of the tributary basin number. In this way, the entire waterway may be coded recursively to arbitrary precision. Properties The primary advantage of the Pfafstetter system is that the water drainage topology is directly described by the code: At each level, higher digits denote upstream segments. Odd digits denote segments on the main stem; even digits denote tributaries of the main stem. Therefore, given a point with code A on the water system, a point with code B is downstream if: The first n digits of each code match, for some n ≥ 0, AND The remaining digits of B are: les
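The downstream test can be sketched in a few lines of Python. This sketch follows the commonly stated form of the Pfafstetter rule and assumes codes of equal length within the same basin: after stripping the shared prefix, every remaining digit of B must be odd (main-stem inter-basins), and B's remainder must be numerically smaller than A's. A production implementation would also need to handle codes of different lengths and closed basins (digit 0):

```python
def is_downstream(b: str, a: str) -> bool:
    """Is the region coded `b` downstream of the region coded `a`?

    Sketch of the standard Pfafstetter comparison for equal-length codes.
    """
    # Strip the longest common prefix of the two codes.
    n = 0
    while n < len(a) and n < len(b) and a[n] == b[n]:
        n += 1
    rest_a, rest_b = a[n:], b[n:]
    if not rest_a or not rest_b:
        return False  # identical codes or prefix cases: not handled in this sketch
    # All remaining digits of b must be odd (on the main stem), and the
    # remainder must be numerically smaller (lower numbers lie downstream).
    return all(d in "13579" for d in rest_b) and int(rest_b) < int(rest_a)

print(is_downstream("8833", "8835"))  # True: within basin 883, 3 is downstream of 5
print(is_downstream("8834", "8835"))  # False: 4 is a tributary basin (even digit)
```

Note that water from tributary basin 8834 also passes through inter-basin 8833, so `is_downstream("8833", "8834")` is likewise true under this rule.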
https://en.wikipedia.org/wiki/Genetically%20encoded%20voltage%20indicator
Genetically encoded voltage indicator (or GEVI) is a protein that can sense membrane potential in a cell and relate the change in voltage to a form of output, often a fluorescence level. It is a promising optogenetic recording tool that enables exporting electrophysiological signals from cultured cells, live animals, and ultimately the human brain. Examples of notable GEVIs include ArcLight, ASAP1, ASAP3, Archons, SomArchon, and Ace2N-mNeon. History Although the idea of optical measurement of neuronal activity was proposed in the late 1960s, the first successful GEVI that was convenient enough to put into actual use was not developed until technologies of genetic engineering had matured in the late 1990s. The first GEVI, coined FlaSh, was constructed by fusing a modified green fluorescent protein with a voltage-sensitive K+ channel (Shaker). Unlike fluorescent proteins, new GEVIs were seldom discovered by looking to nature, for it is hard to find an organism which naturally has the ability to change its fluorescence based on voltage. Therefore, new GEVIs are mostly the products of genetic and protein engineering. Two methods can be utilized to find novel GEVIs: rational design and directed evolution. The former method has contributed most of the new GEVI variants, but recent research using directed evolution has shown promising results in GEVI optimization. Structure A GEVI can have many configuration designs in order to realize its voltage-sensing function. An essential feature of GEVI structure is that it must be situated on the cell membrane. Conceptually, the structure of a GEVI should permit the function of sensing the voltage difference and reporting it by a change in fluorescence. Usually, the voltage-sensing domain (VSD) of a GEVI spans the membrane and is connected to the fluorescent protein(s). However, it is not necessary that sensing and reporting happen in different structures, e.g. Archons. By structure, GEVIs can be classified in
https://en.wikipedia.org/wiki/Hovhannes%20Avoyan
Hovhannes Avoyan (born March 23, 1965) is an Armenian-American serial entrepreneur, investor and scholar. He is the co-founder and CEO of Picsart, a suite of online photo and video editing applications. He also served as President of the Union of Information Technology Enterprises of Armenia (UITE) from its inception in 2000 until 2015. Avoyan is a member of the American University of Armenia Corporation Board of Trustees. Early life and education Avoyan was born in 1965 in Yerevan, Armenia. He graduated from the State Engineering University of Armenia with a B.S. degree in Computer Science in 1987 and received a Master's in Political Science and International Affairs (MPSIA) from the American University of Armenia in 1995. He is also a 2005 graduate of Harvard Business School's Bertelsmann Senior Executive program. Career In 1996, Avoyan founded CEDIT, a software development services company, which was acquired by mobile communications software developer Brience in 2000. In 2005, Avoyan founded and served as CEO of Sourcio, a software development and system integration service provider acquired by software developer HelpSystems in 2016. In 2007, Avoyan founded and served as General Manager of Monitis Inc., which was acquired by TeamViewer/GFI Software in 2011. In 2011, with his friend Artavazd Mehrabyan, Avoyan founded Picsart, a social photo and video editing platform, and became the company's CEO. Picsart's software reportedly has more than 150 million monthly active users and had been downloaded 1 billion times as of September 2020. Accolades In 2016, Avoyan received the Distinguished Alumni Award from the American University of Armenia. In 2019, The Armenian Trade Network named Avoyan as the recipient of its Global Achievement Award. Philanthropy Avoyan supports AI development in Armenia by establishing schools and best practices in the country to help its data scientists compete globally. Artificial i
https://en.wikipedia.org/wiki/Flavor%20lexicon
Flavor lexicons (American English) or flavour lexicons (Commonwealth English; see spelling differences) are used by professional taste testers to develop and detail the sensory perception experienced from food. The lexicon is a word bank developed by professional taste testers in order to identify an objective, nuanced and cross-cultural word bank for food. Background Flavor is the sensory impression of food or other substances and is determined primarily by the chemical senses of taste and smell. Flavor is said to be 80 percent aroma, detected primarily through the retronasal smell mechanism. Sight, sound, and touch also impact flavor. The color of a food, sound as one bites into it, and texture are all factors that contribute to a person’s perception of food. In addition, culture provides a context to food. All these experiences put together can affect a person’s description of the food item. Flavor lexicons seek to provide an objective word bank for food. This streamlines the variations created by the different language ascribed to food. The development of flavor language allows tasters to pinpoint descriptions about the food they taste. There are three major descriptive analysis techniques used by professional taste testers: Flavor Profile Method (FPM); Quantitative Descriptive Analysis (QDA); and the Spectrum method. Taste testing In developing a flavor lexicon, the tongue is a taster’s instrument. A taster is trained to remain objective. The first documented descriptive analysis technique, FPM, was first implemented in the 1950s. The structure of FPM is also the simplest of the three — fewer panelists are employed with a minimum of four. Following a 60-hour training session, panel members work together to develop a consensus profile of the taste properties and their intensities on a four-point scale. QDA features a sensory professional leading 8 to 12 prescreened individuals who make up the panel. The professional orders the evaluation of each descript
https://en.wikipedia.org/wiki/Neurogastronomy
Neurogastronomy is the study of flavor perception and the ways it affects cognition and memory. This interdisciplinary field is influenced by the psychology and neuroscience of sensation, learning, satiety, and decision making. Areas of interest include how olfaction contributes to flavor, food addiction and obesity, taste preferences, and the linguistics of communicating and identifying flavor. The term neurogastronomy was coined by neuroscientist Gordon M. Shepherd. Olfaction and flavor Out of all the sensory modalities, olfaction contributes most to the sensation and perception of flavor processing. Olfaction has two sensory modalities, orthonasal smell, the detection of odor molecules originating outside the body, and retronasal smell, the detection of odor molecules originating during mastication. It is retronasal smell, whose sensation is felt in the mouth, that contributes to flavor perception. Anthropologically, over human evolution, the shortening of the nasopharynx and other shifts in bone structure suggest a constant improvement of flavor perception capabilities. After mastication, odor molecules travel through the back of the mouth and up the nasopharynx. The odorants are detected by myriad receptors on the olfactory epithelium. These receptors respond to a variety of dimensions of chemical properties. Odor receptors that respond to a dimension within a molecular receptive range are aggregated by glomeruli in the olfactory bulb. Here, the multi-dimensional nature of odorant stimuli is reduced to two dimensions. This input undergoes edge enhancement, increasing its signal-to-noise ratio by way of lateral inhibition due to mitral cells stemming from the glomerular layer. This input then reaches the olfactory cortex. Here, Hebbian learning networks allow for recall with partial or weak stimuli, indicating the first stage of conscious perception. Here, connections with the hypothalamus and hippocampus indicate that olfaction stimuli affect emotion, dec
https://en.wikipedia.org/wiki/Power%20amplifier%20classes
In electronics, power amplifier classes are letter symbols applied to different power amplifier types. The class gives a broad indication of an amplifier's characteristics and performance. The classes are related to the time period during which the active amplifier device passes current, expressed as a fraction of the period of a signal waveform applied to the input. A class-A amplifier conducts through the entire period of the signal; class B only for one-half of the input period, and class C for much less than half of the input period. A class-D amplifier operates its output device in a switching manner; the fraction of the time that the device is conducting is adjusted so that a pulse-width-modulation output is obtained from the stage. Additional letter classes are defined for special-purpose amplifiers, with additional active elements or particular power-supply improvements; sometimes a new letter symbol is used by a manufacturer to promote its proprietary design. By December 2010, classes AB and D dominated nearly the entire audio amplifier market, with the former favored in portable music players, home audio and cell phones owing to the lower cost of class-AB chips. Power amplifier classes Power amplifier circuits (output stages) are classified as A, B, AB and C for linear designs, and as D and E for switching designs. The classes are based on the proportion of each input cycle (conduction angle) during which the amplifying device passes current. The image of the conduction angle derives from amplifying a sinusoidal signal. If the device is always on, the conduction angle is 360°. If it is on for only half of each cycle, the angle is 180°. The angle of flow is closely related to amplifier power efficiency. In the illustrations below, a bipolar junction transistor is shown as the amplifying device; however, the same attributes are found with MOSFETs or vacuum tubes. Class A In a class-A amplifier, 100% of the input signal is used (conduction angle Θ = 360°). The activ
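The conduction-angle definitions above can be illustrated numerically: apply a sinusoidal signal on top of a bias and measure the fraction of each cycle for which the device input is positive, i.e. the device conducts. A brief Python sketch (an idealized illustration of the definition, not a circuit simulation; the bias values are arbitrary examples):

```python
import numpy as np

def conduction_angle(bias, amplitude=1.0, n=360_000):
    """Degrees per cycle during which (bias + sinusoidal signal) is positive,
    taken here as the fraction of the cycle the idealized device conducts."""
    t = np.linspace(0, 2 * np.pi, n, endpoint=False)
    conducting = bias + amplitude * np.sin(t) > 0
    return 360.0 * conducting.mean()

print(round(conduction_angle(bias=1.5)))   # class A: always conducting -> 360
print(round(conduction_angle(bias=0.0)))   # class B: half of each cycle -> 180
print(round(conduction_angle(bias=-0.5)))  # class C: well under half -> 120
```

Raising the bias pushes the stage toward class A (high linearity, low efficiency); lowering it toward class C trades linearity for efficiency, exactly the trade-off the letter classes summarize.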
https://en.wikipedia.org/wiki/Stoned%20Fox
Stoned Fox is an anthropomorphic taxidermied fox that became an Internet sensation in Russia. The fox died of natural causes. It was stuffed in 2012 by Welsh artist Adele Morse (originally from Blackwood, Caerphilly, now based in Dalston, East London), who at the time was taking a Master's course at the Royal Academy of London and made it as part of a school project. She put the finished work on eBay, where it unexpectedly attracted considerable attention. At the end of 2012, the fox quickly became an Internet meme in Russia. It has since been edited into famous paintings, photographs, videos, and other visual media. See also Lion of Gripsholm Castle Homunculus loxodontus References Taxidermy art Internet memes introduced in 2012 Internet in Russian language Foxes in popular culture Internet memes introduced from Russia Individual taxidermy exhibits
https://en.wikipedia.org/wiki/Data%20Act%20%28Sweden%29
The Data Act () is the world's first national data protection law and was enacted in Sweden on 11 May 1973. It went into effect on 1 July 1974 and required licenses by the Swedish Data Protection Authority for information systems handling personal data. History Information and communications technologies (ICTs) were far developed in Sweden due to multiple circumstances and the use of computers in public administration was introduced relatively early. Furthermore, the concepts of transparency, public access and openness were traditionally widely present in Swedish society. Widespread public concern was raised in 1969 due to the year's public census. In 1969, the Royal Commission on Publicity and Secrecy was set up to investigate problems associated with the increasing use of computers to store and process personal data. They provided the initial analysis, recommendations and drafts that addressed these problems. In July 1972, they published their report Computers and Privacy (Sw. Data och integritet). The Data Inspection Board (DIB), proposed in the report, was set up in July 1973. In April 1973, the Riksdag uncontentiously passed the Data Act, also proposed in the report, which only slightly modified the commission's draft. It then came into force in July 1973. An associated amendment to the Freedom of the Press Act was adopted in February 1974 − around the same time as the Credit Information Act and the Debt Recovery Acts which regulated computerized credit information. Problems and succession As the law's data registration and transborder data flow requirements were considered cumbersome and confusing by private and public organizations and the DIB was soon overcome by the magnitude of registrations the law was amended in 1982 which made the private sector and the government more self-sufficient in terms of registration. After several more amendments in 1989 a Commission on Data Protection was set up to make a total revision of the act. The commission submit
https://en.wikipedia.org/wiki/Jasmine%20Directory
Jasmine Directory is a human-edited web directory providing websites and businesses categorized topically and regionally. It offers thirteen topic-based categories and one region-based category with hand-picked and reviewed user-suggested resources. Jasmine Directory was founded in 2009 by Pécsi András and Robert Gomboș and is headquartered in Valley Cottage, New York. It won eight prizes during 2013–14 for its editorial discretion and manually added resources. Because its editors manually add about 90% of its resources, the directory has been described as useful for search engine optimization in Google search results. History The directory was launched in 2009 at the Budapest University of Technology and Economics by Pécsi András and Robert Gomboș. It operates from Valley Cottage, New York. Operation and structure Jasmine Directory lists educational resources and businesses of public interest, which can be filtered by business category. Jasmine Directory's editors manually add resources to the index, which according to co-founder Gomboș represents 90% of its listings. Businesses and site owners can also suggest their websites for review by paying a "suggestion fee"; however, inclusion is not guaranteed if the suggested resources do not comply with the editorial guidelines, in which case the review fee is refunded. The annual fee for standard listings is $59. Site owners do not pay for placement; the fee covers the editorial and administrative effort of reviewing and creating a listing. The directory labels listings chosen by editors with an "EP" mark to separate those websites from ones submitted by site owners. To suggest a resource for inclusion, users select an appropriate category; they can also customize their listing by adding their websites' social media fan pages and contact information, from which the business's Google Maps location is generated. Once the listing is posted, its publisher can edit the details at any time, and an upgrade feature is also available
https://en.wikipedia.org/wiki/Stromagen
Stromagen is a product made of stem cells taken from a patient's bone marrow and grown in the laboratory. After a patient's bone marrow is destroyed by treatment with whole-body irradiation or chemotherapy, these cells are injected back into the patient to help rebuild bone marrow. Stromagen has been studied in the prevention of graft-versus-host disease during stem cell transplant in patients receiving treatment for cancer. Stromagen is used in cellular therapy. It is also called autologous expanded mesenchymal stem cells or OTI-010. Peripheral stem cell transplantation may allow doctors to give higher doses of chemotherapy and kill more tumor cells. It is not yet known whether Stromagen improves the success of stem cell transplantation in women with breast cancer. References Biotechnology Biotechnology products Pharmaceutical industry Life sciences industry Specialty drugs Pharmacy Regenerative biomedicine
https://en.wikipedia.org/wiki/Futile%20game
In game theory, a futile game is a game that permits a draw or a tie when optimal moves are made by both players. An example of this type of game is the classical form of tic-tac-toe, though not all of its variants are futile games. The term does not apply to intransitive games, such as the iterated prisoner's dilemma or rock–paper–scissors, in which either there is no path to a draw or every strategy in the game can be beaten by another strategy. See also Partisan game Impartial game Solved game References Combinatorial game theory
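That classical tic-tac-toe admits a draw under optimal play can be checked mechanically with a minimax search over the full game tree. The sketch below is illustrative (not from the article's sources); it scores a position +1 for an X win, -1 for an O win, and 0 for a draw, with "." marking empty cells.

```python
from functools import lru_cache

WINS = [(0, 1, 2), (3, 4, 5), (6, 7, 8),      # rows
        (0, 3, 6), (1, 4, 7), (2, 5, 8),      # columns
        (0, 4, 8), (2, 4, 6)]                 # diagonals

def winner(board):
    for a, b, c in WINS:
        if board[a] != "." and board[a] == board[b] == board[c]:
            return board[a]
    return None

@lru_cache(maxsize=None)
def value(board, player):
    """Game value under optimal play: +1 = X wins, -1 = O wins, 0 = draw."""
    w = winner(board)
    if w == "X":
        return 1
    if w == "O":
        return -1
    if "." not in board:
        return 0  # full board, no winner: a draw
    nxt = "O" if player == "X" else "X"
    moves = [value(board[:i] + player + board[i + 1:], nxt)
             for i, cell in enumerate(board) if cell == "."]
    return max(moves) if player == "X" else min(moves)

# With optimal moves by both players, classical tic-tac-toe is a draw.
print(value("." * 9, "X"))  # → 0
```

A value of 0 from the empty board is exactly the "futile game" property: neither player can force a win against best play.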
https://en.wikipedia.org/wiki/Augmented%20marked%20graph
An augmented marked graph is essentially a Petri net with a specific set of places called resource places. If these resource places and their associated arcs are removed, the net becomes a marked graph in which every cycle is marked. For each resource place, there are pairs of outgoing and incoming transitions connected by elementary paths. Application Augmented marked graphs are often used for modelling systems with shared resources, such as manufacturing systems. Based on the special properties of augmented marked graphs, the properties of the modelled systems, such as liveness, boundedness and reversibility, can be effectively analyzed. References King Sing Cheung, Augmented Marked Graphs, Springer, 2014. Petri nets Formal specification languages Models of computation Concurrency (computer science)
https://en.wikipedia.org/wiki/Lightning%20Network
The Lightning Network (LN) is a "layer 2" payment protocol built on the Bitcoin blockchain and those of other cryptocurrencies. It is intended to enable fast transactions among participating nodes (independently run members of the network) and has been proposed as a solution to the bitcoin scalability problem. It is a peer-to-peer system for making micropayments of cryptocurrency through a network of bidirectional payment channels, without delegating custody of funds. Transacting parties use the Lightning Network by opening a payment channel and transferring (committing) funds to the relevant layer-1 blockchain (e.g. Bitcoin) under a smart contract. The parties then make any number of off-chain Lightning Network transactions that update the tentative distribution of the channel's funds, without broadcasting to the blockchain. Whenever the parties have finished their transaction session, they close the payment channel, and the smart contract distributes the committed funds according to the transaction record. To initiate closing, one node first broadcasts the current state of the transaction record to the network, including a proposed settlement, a distribution of the committed funds. If both parties confirm the proposal, the funds are immediately paid on-chain. The other option is uncooperative closure, for example if one node has dropped from the network, or if it is broadcasting an incorrect (possibly fraudulent) transaction state. In this case settlement is delayed during a dispute period, when nodes may contest the proposal. If the second node broadcasts a more up-to-date timestamped distribution, including some transactions omitted by the first proposal, then all committed funds are transferred to the second node: this punitive breach remedy transaction thwarts attempts to defraud the other node by broadcasting out-of-date transactions. History Joseph Poon and Thaddeus Dryja published a Lightning Network white paper in February 2015. 2019 Bitcoin Lightning
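The channel life cycle described above (commit funds on-chain, update balances off-chain any number of times, settle the final state on-chain) can be illustrated with a toy model. This sketch is purely conceptual and omits the cryptographic machinery that makes Lightning trustless: commitment transactions, signatures, timelocks, and the penalty mechanism are all assumptions left out here.

```python
class PaymentChannel:
    """Toy two-party payment channel: funds are committed once on-chain,
    then any number of off-chain balance updates occur, and only the
    final state is settled back to the chain."""

    def __init__(self, balance_a, balance_b):
        # Opening: both parties commit funds on-chain (one transaction).
        self.capacity = balance_a + balance_b
        self.state = {"A": balance_a, "B": balance_b}
        self.version = 0       # newer states supersede older ones
        self.onchain_txs = 1   # the funding transaction

    def pay(self, sender, receiver, amount):
        # Off-chain update: a new agreed state, no blockchain write.
        if self.state[sender] < amount:
            raise ValueError("insufficient channel balance")
        self.state[sender] -= amount
        self.state[receiver] += amount
        self.version += 1

    def close(self):
        # Cooperative close: the latest state is settled on-chain.
        self.onchain_txs += 1
        return dict(self.state)

ch = PaymentChannel(balance_a=100, balance_b=0)
ch.pay("A", "B", 30)
ch.pay("A", "B", 20)
ch.pay("B", "A", 5)
print(ch.close(), ch.onchain_txs)  # three payments, only 2 on-chain transactions
```

The point of the model is the ratio it demonstrates: an arbitrary number of payments costs exactly two on-chain transactions, which is the claimed answer to the scalability problem.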
https://en.wikipedia.org/wiki/Succinimidyl%204-%28N-maleimidomethyl%29cyclohexane-1-carboxylate
Succinimidyl 4-(N-maleimidomethyl)cyclohexane-1-carboxylate (SMCC) is a heterobifunctional amine-to-sulfhydryl crosslinker, which contains two reactive groups at opposite ends: N-hydroxysuccinimide-ester and maleimide, reactive with amines and thiols respectively. SMCC is often used in bioconjugation to link proteins with other functional entities (fluorescent dyes, tracers, nanoparticles, cytotoxic agents). For example, a targeted anticancer agent – trastuzumab emtansine (antibody-drug conjugate containing an antibody trastuzumab chemically linked to a highly potent drug DM-1) – is prepared using SMCC reagent. References Reagents for biochemistry Maleimides Succinimides
https://en.wikipedia.org/wiki/EternalBlue
EternalBlue is a computer exploit developed by the U.S. National Security Agency (NSA). It was leaked by the Shadow Brokers hacker group on April 14, 2017, one month after Microsoft released patches for the vulnerability. On May 12, 2017, the worldwide WannaCry ransomware used this exploit to attack unpatched computers. On June 27, 2017, the exploit was again used to help carry out the 2017 NotPetya cyberattack on more unpatched computers. The exploit was also reported to have been used since March 2016 by the Chinese hacking group Buckeye (APT3), after they likely found and re-purposed the tool, as well as reported to have been used as part of the Retefe banking trojan since at least September 5, 2017. EternalBlue was among the several exploits used, in conjunction with the DoublePulsar backdoor implant tool, in executing the 2017 WannaCry attacks. Details EternalBlue exploits a vulnerability in Microsoft's implementation of the Server Message Block (SMB) protocol. This vulnerability is denoted by an entry in the Common Vulnerabilities and Exposures (CVE) catalog. The vulnerability exists because the SMB version 1 (SMBv1) server in various versions of Microsoft Windows mishandles specially crafted packets from remote attackers, allowing them to remotely execute code on the target computer. The NSA did not alert Microsoft about the vulnerability, and held on to it for more than five years before the breach forced its hand. The agency then warned Microsoft after learning about EternalBlue's possible theft, allowing the company to prepare a software patch issued in March 2017, after delaying its regular release of security patches in February 2017. On Tuesday, March 14, 2017, Microsoft issued security bulletin MS17-010, which detailed the flaw and announced that patches had been released for all Windows versions that were currently supported at that time, these being Windows Vista, Windows 7, Windows 8.1, Windows 10, Windows Server 2008, Windows Server 2012, and
https://en.wikipedia.org/wiki/NJFX
NJFX, also known as New Jersey Fiber Exchange, is a Wall Township, NJ-based data center and subsea cable landing station operator. The company offers Tier 3 data center, meet-me room and colocation services, and a cable landing station on a 58-acre campus. History NJFX was founded by Gil Santaliz, a telecommunications executive who in 2008 sold metro dark fiber provider 4Connections to Optimum Lightpath, a subsidiary of NY cable operator Cablevision (now Altice USA). Tata Communications was a founding partner of NJFX. NJFX opened a meet-me room (MMR) within Tata Communications' Wall, NJ subsea cable landing station (CLS). One of Tata's cables terminating in the cable landing station is the Seabras-1 undersea cable, which links North America and Brazil, with a landing point in Sao Paulo. Tata's TGN Atlantic subsea cable also lands in Wall Township, connecting to Highbridge, Somerset, United Kingdom. As the MMR operator, NJFX managed the network connections between its own customers and those of Tata Communications' CLS. In September 2015, NJFX announced it would be constructing a 64,000 sq ft Tier III data center adjacent to Tata's CLS, providing direct access to their European and South American subsea cables. Design would be done by Boston-based Bala Consulting Engineers. In January 2016, voice and data network provider Windstream announced it was extending its 100 Gigabit Ethernet (100G) network from NJFX's presence at the CLS to Ashburn, Virginia's Internet hub. In January 2017, its Tier III center was completed. In March, NJFX announced it was adding an additional data center on its campus. In September 2018, the company announced that the HAVFRUE transatlantic submarine network cable would be landing at its Wall, NJ cable landing station. The cable was planned to run between New Jersey and Denmark, with branches to Norway and Ireland. In March 2019, Amazon Web Services signed an agreement with Norwegian infrastructure company Bulk Infrastru
https://en.wikipedia.org/wiki/Labeled%20data
Labeled data is a group of samples that have been tagged with one or more labels. Labeling typically takes a set of unlabeled data and augments each piece of it with informative tags. For example, a data label might indicate whether a photo contains a horse or a cow, which words were uttered in an audio recording, what type of action is being performed in a video, what the topic of a news article is, what the overall sentiment of a tweet is, or whether a dot in an X-ray is a tumor. Labels can be obtained by asking humans to make judgments about a given piece of unlabeled data. Labeled data is significantly more expensive to obtain than raw unlabeled data. Crowdsourced labeled data In 2006 Fei-Fei Li, the co-director of the Stanford Human-Centered AI Institute, set out to improve the artificial intelligence models and algorithms for image recognition by significantly enlarging the training data. The researchers downloaded millions of images from the World Wide Web and a team of undergraduates started to apply labels for objects to each image. In 2007 Li outsourced the data labeling work on Amazon Mechanical Turk, an online marketplace for digital piece work. The 3.2 million images that were labeled by more than 49,000 workers formed the basis for ImageNet, one of the largest hand-labeled databases for object recognition. Automated data labeling After obtaining a labeled dataset, machine learning models can be applied to the data so that new unlabeled data can be presented to the model and a likely label can be guessed or predicted for that piece of unlabeled data. Data-driven bias Algorithmic decision-making is subject to programmer-driven bias as well as data-driven bias. Training data that relies on biased labels will result in prejudices and omissions in a predictive model, despite the machine learning algorithm being legitimate. The labeled data used to train a specific machine learning algorithm needs to be a statistically representativ
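How labeled samples drive supervised learning can be shown with a minimal example. The data and the nearest-centroid model below are hypothetical stand-ins (the feature vectors, the "cow"/"horse" labels, and the function names are all invented for illustration); the pattern of "train on labeled data, predict labels for unlabeled data" is the point.

```python
# Each sample is a feature vector tagged with a human-assigned label.
labeled_data = [
    ((1.0, 1.2), "cow"), ((0.8, 1.0), "cow"), ((1.1, 0.9), "cow"),
    ((3.0, 3.2), "horse"), ((3.2, 2.9), "horse"), ((2.9, 3.1), "horse"),
]

def train_centroids(data):
    """Average the feature vectors of each label (a nearest-centroid model)."""
    sums, counts = {}, {}
    for features, label in data:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            acc[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {lab: tuple(v / counts[lab] for v in vec) for lab, vec in sums.items()}

def predict(centroids, features):
    """Label new, unlabeled data by the closest learned centroid."""
    def dist2(c):
        return sum((a - b) ** 2 for a, b in zip(c, features))
    return min(centroids, key=lambda lab: dist2(centroids[lab]))

model = train_centroids(labeled_data)
print(predict(model, (0.9, 1.1)))  # → cow
print(predict(model, (3.1, 3.0)))  # → horse
```

The quality of `model` depends entirely on the labels: mislabel the training tuples and the predictions inherit the error, which is the data-driven bias the article describes.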
https://en.wikipedia.org/wiki/Response%20modeling%20methodology
Response modeling methodology (RMM) is a general platform for statistical modeling of a linear/nonlinear relationship between a response variable (dependent variable) and a linear predictor (a linear combination of predictors/effects/factors/independent variables), often denoted the linear predictor function. It is generally assumed that the modeled relationship is monotone convex (delivering a monotone convex function) or monotone concave (delivering a monotone concave function). However, many non-monotone functions, like the quadratic equation, are special cases of the general model. RMM was initially developed as a series of extensions to the original inverse Box–Cox transformation: y = (1 + λz)^(1/λ), where y is a percentile of the modeled response, Y (the modeled random variable), z is the respective percentile of a normal variate and λ is the Box–Cox parameter. As λ goes to zero, the inverse Box–Cox transformation becomes: y = exp(z), an exponential model. Therefore, the original inverse Box–Cox transformation contains a trio of models: linear (λ = 1), power (λ ≠ 1, λ ≠ 0) and exponential (λ = 0). This implies that on estimating λ, using sample data, the final model is not determined in advance (prior to estimation) but rather as a result of estimating. In other words, the data alone determine the final model. Extensions to the inverse Box–Cox transformation were developed by Shore (2001a) and were denoted Inverse Normalizing Transformations (INTs). They had been applied to model monotone convex relationships in various engineering areas, mostly to model physical properties of chemical compounds (Shore et al., 2001a, and references therein). Once it had been realized that INT models may be perceived as special cases of a much broader general approach for modeling non-linear monotone convex relationships, the new Response Modeling Methodology had been initiated and developed (Shore, 2005a, 2011 and references therein). The RMM model expresses the relationship between a response, Y (the modeled ra
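The trio of models contained in the inverse Box–Cox transformation can be checked numerically. This is an illustrative sketch using the symbols defined above (z, a standard normal percentile, and λ, the Box–Cox parameter); it is not drawn from the RMM literature itself.

```python
import math

def inverse_box_cox(z, lam):
    """y = (1 + lam*z)**(1/lam) for lam != 0, with the limit exp(z) at lam = 0."""
    if lam == 0:
        return math.exp(z)
    return (1 + lam * z) ** (1 / lam)

z = 0.7
print(inverse_box_cox(z, 1))    # linear case: 1 + z = 1.7
print(inverse_box_cox(z, 0.5))  # power case: (1 + 0.35)**2
print(inverse_box_cox(z, 0))    # exponential case: e**0.7
# As lam -> 0 the power model approaches the exponential model:
print(abs(inverse_box_cox(z, 1e-9) - math.exp(z)) < 1e-6)  # → True
```

The last line is the sense in which a single parametric family contains all three models: estimating λ from data selects among linear, power and exponential without the analyst committing to one in advance.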
https://en.wikipedia.org/wiki/Engineering%20biology
Engineering biology is the set of methods for designing, building, and testing engineered biological systems which have been used to manipulate information, construct materials, process chemicals, produce energy, provide food, and help maintain or enhance human health and environment. History Rapid advances in the ability to genetically modify biological organisms have advanced a new engineering discipline, commonly referred to as synthetic biology. This approach seeks to harness the power of living systems for a variety of manufacturing applications, such as advanced therapeutics, sustainable fuels, chemical feedstocks, and advanced materials. To date, research in synthetic biology has typically relied on trial-and-error approaches, which are costly, laborious, and inefficient. References Bibliography H.R.4521 - America COMPETES Act of 2022 https://www.congress.gov/congressional-record/2022/03/17/senate-section/article/S1237-5 Schuergers, N., Werlang, C., Ajo-Franklin, C., & Boghossian, A. (2017). A Synthetic Biology Approach to Engineering Living Photovoltaics. Energy & Environmental Science. doi:10.1039/C7EE00282C Teague, B. P., Guye, P., & Weiss, R. (2016). Synthetic Morphogenesis. Cold Spring Harbor Perspectives in Biology, 8(9), a023929. doi:10.1101/cshperspect.a023929 Kelley, N. J. (2015). Engineering Biology for Science & Industry: Accelerating Progress. http://nancyjkelley.com/wp-content/uploads/Meeting-Summary.Final_.6.9.15-Formatted.pdf H.R.591 - Engineering Biology Research and Development Act of 2015. https://www.congress.gov/bill/114th-congress/house-bill/591 Kelley, N. J. (2014). The promise and challenge of engineering biology in the United States. Industrial Biotechnology, 10(3), 137–139. doi:10.1089/ind.2014.1516 Beal, J., Weiss, R., Densmore, D., Adler, A., Babb, J., Bhatia, S., ... & Loyall, J. (2011, June). TASBE: A tool-chain to accelerate synthetic biological engineering. In Proceedings of the 3rd International Workshop on Bi
https://en.wikipedia.org/wiki/Council%20for%20the%20Regulation%20of%20Engineering%20in%20Nigeria
Council for the Regulation of Engineering in Nigeria (COREN), formerly known as the Council for the Registration of Engineers in Nigeria, is the regulatory body that governs the practice of engineering in Nigeria. Registration Procedure in private company Registration is done online and first requires the individual's institution to send the individual's transcript to a specified mail address. This also involves paying a fee and uploading documents to the body's website. COREN membership is required to practice engineering independently, is a requirement for some engineering firms, and is mandatory for government contracts. Council Structure The Council is the highest policy-making body. The members of the Council must be registered engineering personnel. The Council consists of 26 members in accordance with the Act, as follows: President – elected by the Council Six Representatives of the Nigerian Society of Engineers Four Representatives of the Universities with Engineering Faculties One Representative of the Polytechnics One Representative of the Technical Colleges Six Representatives from States of the Federation Four Representatives of the Minister One Representative of NATE One Representative of NISET One Representative of NAEC Council Committees Regulation and Control Finance and general purpose Registration Committee Education and training Appointment, Promotion and Disciplinary Committee of Council Other Nigerian Engineering Bodies The Nigerian Society of Engineers (NSE) National Academy of Engineering (NAE) Association of Consulting Engineers in Nigeria (ACEN) Association of Professional Women Engineers in Nigeria (APWEN) Society of Petroleum Engineers (SPE) Nigerian Institution of Civil Engineers (NICE) References Engineering societies Organizations based in Nigeria
https://en.wikipedia.org/wiki/Chandrasekhar%27s%20variational%20principle
In astrophysics, Chandrasekhar's variational principle provides the stability criterion for a static barotropic star, subjected to radial perturbation, named after the Indian American astrophysicist Subrahmanyan Chandrasekhar. Statement A barotropic star with and is stable if the quantity is non-negative for all real functions that conserve the total mass of the star . where is the coordinate system fixed to the center of the star is the radius of the star is the volume of the star is the unperturbed density is the small perturbed density such that in the perturbed state, the total density is is the self-gravitating potential from Newton's law of gravity is the gravitational constant References Variational principles Stellar dynamics Astrophysics Fluid dynamics Equations of astronomy
https://en.wikipedia.org/wiki/InfoSec%20Institute
InfoSec Institute is a technology training company providing training courses for security professionals, businesses, agencies and technology professionals. The company's training library provides multi-course tracks by job function, certification-specific training and short-form, continuing education training. Its course library includes over 95 courses covering topics like ethical hacking, network security, mobile forensics and more. InfoSec Institute's SecurityIQ integrates security awareness training, phishing simulations and personalized learning. It scales with employees' security aptitudes, roles and learning styles. History InfoSec Institute was founded by Jack Koziol in 2004. In January 2022, Cengage Group announced an agreement to acquire InfoSec for $190.8 million; the transaction was completed in March 2022. In June 2022, InfoSec Institute was named one of the top 20 online learning libraries by Training Industry for the fourth consecutive year. See also Cyber security References Computer security companies Computer security software companies Defunct software companies of the United States Training companies
https://en.wikipedia.org/wiki/Circannual%20cycle
A circannual cycle is a biological process that occurs in living creatures over the period of approximately one year. This cycle was first discovered by Ebo Gwinner and Canadian biologist Ted Pengelley. It is classified as an infradian rhythm, that is, a biological process with a period longer than that of a circadian rhythm, recurring less than once per 24 hours. These processes continue even in artificial environments in which seasonal cues have been removed by scientists. The term circannual is Latin, circa meaning approximately and annual relating to one year. Chronobiology is the field of biology pertaining to periodic rhythms that occur in living organisms in response to external stimuli such as photoperiod. Cycles arise from genetic evolution in animals, which allows them to create regulatory cycles to improve their fitness. These traits evolved through the increased reproductive success of animals most capable of predicting regular changes in the environment, such as seasonal changes, and adapting to capitalize on the times when success was greatest. The idea of evolved biological clocks exists not only for animals but also in plant species, which exhibit cyclic behaviors without environmental cues. Plentiful research has been done on biological clocks and the behaviors they are responsible for in animals; circannual rhythms are just one example of a biological clock. Rhythms are driven by hormone cycles, and seasonal rhythms can endure for long periods of time in animals even without the photoperiod signaling that comes with seasonal changes. They are a driver of annual behaviors such as hibernation, mating and the gain or loss of weight for seasonal changes. Circannual cycles can be defined by three main aspects: they must persist without apparent time cues, be able to be phase shifted, and should not be changed by temperature. Circannual cycles have important impacts on when animal behaviors are performed and the success of those behaviors. Circannu
https://en.wikipedia.org/wiki/Na%C5%9Fide%20G%C3%B6zde%20Durmu%C5%9F
Naside Gözde Durmuş (born 1985, Izmir) is a Turkish scientist and geneticist. She is currently Assistant Professor (Research) of Radiology at Stanford University. Her research focuses on nanotechnology and micro-technology applications for current world-threatening health issues, like cancer and antibiotic resistance. In 2015, MIT Technology Review listed her under the category of pioneers in the magazine's list of 35 Innovators Under 35. Biography Durmuş was born in 1985 in Izmir, Turkey. In 2003, she started her undergraduate studies at the Middle East Technical University, specializing in Molecular Biology and Genetics. Later on, she obtained a Fulbright scholarship and moved to the United States to pursue higher education, earning a Master's in Engineering from Boston University in 2009 and receiving a Ph.D. degree in Biomedical Engineering from Brown University in May 2013. Durmuş is currently an Assistant Professor at Stanford University; in 2014, she took a position as a post-doctoral researcher at Stanford. She conducted her research with Ronald W. Davis at the Stanford University Genome Technology Center and Stanford University School of Medicine. In 2015, she was recognized among the "Top 35 Innovators Under 35" (TR35), as a pioneer in biotechnology and medicine, by MIT Technology Review magazine. Career Her work focuses on developing low-cost nanotechnology tools that can be used for the diagnosis and treatment of diseases. One example is a fast method for detecting the physical features of a cell by having it levitate in a magnetic field, which makes it possible to measure in a shorter period of time how a microbe responds to a certain drug, and to differentiate cancerous cells from healthy ones. References External links Living people 1985 births 21st-century biologists 21st-century women scientists Turkish geneticists Women geneticists Middle East Technical University alumni Boston University College of Engineering alu
https://en.wikipedia.org/wiki/CISPE
CISPE (Cloud Infrastructure Services Providers in Europe) is a non-profit trade association for infrastructure as a service (IaaS) cloud providers in Europe. It was started to aid IaaS providers in explaining their business model to policymakers. Registered in early 2017, CISPE has been operating since 2015. The association aims to advocate for an EU-wide cloud-first public procurement policy and to engage for a European Digital Single Market, including the promotion of high-level security and data protection rules/standards as well as the avoidance of vendor lock-in. In June 2020, the association became one of the 22 founding members of GAIA-X, announced by the German and French Ministers of Economic Affairs Peter Altmaier and Bruno Le Maire. CISPE joined forces with European cloud users and providers like BMW, EDF, Safran, Atos, Siemens, Bosch, OVHcloud, and Deutsche Telekom. The CISPE Data Protection Code of Conduct To help IaaS providers and their customers comply with the EU General Data Protection Regulation (GDPR), which entered into force on 25 May 2018, CISPE released the CISPE Data Protection Code of Conduct. On top of the required compliance with the GDPR, the code also ensures that IaaS customers can choose to have their data located and processed exclusively in Europe, and that the supplier will not re-use a customer's data. Compliance has to be declared by CISPs/IaaS providers service by service. The CISPE Code of Conduct was launched on 27 September 2016 at the European Parliament, and the first thirty services had been declared by the first CISPs/IaaS providers on 14 February 2017. Announcements received press coverage from Le Monde, InfoDSI, El Pais, La Repubblica, Silicon, Cloud Magazine, Computer Sweden, Tom's Hardware, L'informaticien, Global Security Mag, EU Observer, Politico, Computer Weekly, IAPP, Il corriere della Sicurezza, LeMagIT, Bloomberg Television, ITR Manager, Heise.de, COR.COM, ZDNet, ElEconomista.es, IT Channel, EuropaPre
https://en.wikipedia.org/wiki/Independent%20engineer
An independent engineer, also known as a lender's engineer, is a term often given to the engineering representative of the lender, or financier, of a large capital project. The key is to be independent so that opinions on the technical aspects of the project are not biased either in favor of the lenders or the developer/owners. To maintain independence, the independent engineer is typically selected by the lender but paid by the developer/owner. The role of the independent engineer is to conduct an independent technical assessment of a project or technical due diligence. The qualifications of an independent engineer are unusual in that in addition to understanding the engineering aspects of a project, the independent engineer must also be well versed in the business aspects of project financing. This includes the assessment of the technical aspects of major contracts such as EPC contracts, power purchase agreement, off-take agreements, long-term service agreements, and O&M agreements. An independent engineer will review the technical inputs (output, efficiency, O&M expenses, availability, etc.) to the financial model used by the lender and the developer/owner to justify the financing of the project. Scope of work While the role of an independent engineer is similar to an owner's engineer, they are distinctly different. An independent engineer will deal with the lenders and legal counsel on a regular basis to evaluate the financial health of the project as well as the technical aspects. Conversely, an owner's engineer more often deals directly with the engineer of record, the constructor, and equipment suppliers with a focus on ensuring that the technical details of the project meet the specifications. Typical scopes of work for an independent engineer often include: Fatal flaw and/or technology reviews Project reviews to support project financing Review of the engineering design Assessment of project participants and the project site Technology review
https://en.wikipedia.org/wiki/TO-252
TO-252, also known as DPAK or Decawatt Package, is a semiconductor package developed by Motorola for surface mounting on circuit boards. It represents a surface-mount variant of TO-251 package, and smaller variant of the D2PAK package. It is often used for high-power MOSFETs and voltage regulators. Variants Package can have 3 pins with pitch or 5 pins with pitch. The middle pin is usually connected to the tab. The middle pin is sometimes omitted. See also TO-263 References External links TO-252 standard from JEDEC TO-252 drawings from ON Semiconductor TO-252 package details from Central Semiconductor Corp. TO-252 (DPAK) package information from Amkor Technology Semiconductor packages
https://en.wikipedia.org/wiki/Multiplicative%20partitions%20of%20factorials
Multiplicative partitions of factorials are expressions of values of the factorial function as products of powers of prime numbers. They have been studied by Paul Erdős and others. The factorial of a positive integer is a product of decreasing integer factors, which can in turn be factored into prime numbers. This means that any factorial can be written as a product of powers of primes. For example, if we wish to write as a product of factors of the form , where each is a prime number, and the factors are sorted in nondecreasing order, then we have three ways of doing so. The number of such "sorted multiplicative partitions" of grows with , and is given by the sequence 1, 1, 3, 3, 10, 10, 30, 75, 220, 220, 588, 588, 1568, 3696, 11616, ... . Not all sorted multiplicative partitions of a given factorial have the same length. For example, the partitions of have lengths 4, 3 and 5. In other words, exactly one of the partitions of has length 5. The number of sorted multiplicative partitions of that have length equal to is 1 for and , and thereafter increases as 2, 2, 5, 12, 31, 31, 78, 78, 191, 418, 1220, 1220, 3015, ... . Consider all sorted multiplicative partitions of that have length , and find the partition whose first factor is the largest. (Since the first factor in a partition is the smallest within that partition, this means finding the maximum of all the minima.) Call this factor . The value of is 2 for and , and thereafter grows as 2, 2, 2, 3, 3, 3, 3, 3, 3, 4, 4, 4, 4, 4, 5, 5, 5, 5, 5, 5, 5, 5, 7, 7, 7, 7, 7, 7, ... . To express the asymptotic behavior of , let As tends to infinity, approaches a limiting value, the Alladi–Grinstead constant (named for the mathematicians Krishnaswami Alladi and Charles Grinstead). The decimal representation of the Alladi–Grinstead constant begins, 0.80939402054063913071793188059409131721595399242500030424202871504... . The exact value of the constant can be written as the exponential of a certain infinite series.
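Because a factor of the form p^a involves only one prime, a sorted multiplicative partition of n! amounts to choosing, independently for each prime, an integer partition of that prime's exponent in n!. The sketch below uses this observation to reproduce the first terms of the counting sequence quoted above (illustrative code, not from the article's sources).

```python
def prime_exponents(n):
    """Exponent of each prime in the factorization of n! (Legendre's formula)."""
    exps = {}
    for p in range(2, n + 1):
        if all(p % q for q in range(2, int(p ** 0.5) + 1)):  # p is prime
            e, q = 0, p
            while q <= n:
                e += n // q
                q *= p
            exps[p] = e
    return exps

def partitions(n):
    """All integer partitions of n, each as a nonincreasing tuple."""
    result = []
    def build(remaining, max_part, acc):
        if remaining == 0:
            result.append(tuple(acc))
            return
        for part in range(min(remaining, max_part), 0, -1):
            build(remaining - part, part, acc + [part])
    build(n, n, [])
    return result or [()]

def count_sorted_multiplicative_partitions(n):
    """Each prime p with exponent e contributes factors p**a for one
    partition of e, so the total count is a product of partition numbers."""
    count = 1
    for p, e in prime_exponents(n).items():
        count *= len(partitions(e))
    return count

print([count_sorted_multiplicative_partitions(n) for n in range(2, 9)])
# → [1, 1, 3, 3, 10, 10, 30]
```

The output matches the start of the sequence 1, 1, 3, 3, 10, 10, 30, ... given in the text, confirming the per-prime counting argument.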
https://en.wikipedia.org/wiki/%27Mamphono%20Khaketla
Mamphono Khaketla (born 5 March 1960) is a Lesotho mathematician and senator who served as Minister of Finance from March 2015 to June 2017. Early life and education Khaketla was born in Maseru on 5 March 1960 to Bennett Makalo and Caroline Ntseliseng ’Masechele Khaketla. Her father was a novelist, journalist, politician and former minister, as well as the major shareholder of Mohlabani Property Company, and left her a sizeable estate. Her mother was a teacher and author, one of the first women published in Lesotho. Khaketla did her primary and secondary schooling in Maseru, before receiving a Bachelor of Education from the National University of Lesotho in 1980. She has a master's degree in education and a PhD in mathematics education from the University of Wisconsin (1991). Her thesis was titled "An analysis of the Lesotho Junior Certificate Mathematics Examination and its impact on instructions". Career Khaketla was a lecturer in mathematics at the National Teacher Training College from 1981 until 1995 and became assistant director of the college. She worked at the Institute of Development Management in Lesotho and Botswana from 1996 until 2001 before becoming the director of the Centre for Accounting Studies. Khaketla was appointed as a senator by Prime Minister Pakalitha Mosisili in 2002 and served as Minister of Communications, Science and Technology from 2002 until 2004. At the 2007 election, she lost her seat but was elected to the National Assembly as one of the Lesotho Congress for Democracy members on a party list for proportional representation submitted by the National Independence Party. She served as Minister of Education and Training from 2007 until 2012. In 2011, Khaketla was one of seven women ministers in the Cabinet, alongside: Mannete Ramali, Maphoka Motoboli, Mathabiso Lepono, Mphu Keneileo Ramatlapeng, Mpeo Mahase-Moiloa and Pontso Suzan Matumelo Sekatle. On 30 March 2015 she was appointed Minister of Finance. In November 2015, she presided
https://en.wikipedia.org/wiki/Lempel%E2%80%93Ziv%20complexity
The Lempel–Ziv complexity is a measure that was first presented in the article On the Complexity of Finite Sequences (IEEE Trans. On IT-22,1 1976), by two Israeli computer scientists, Abraham Lempel and Jacob Ziv. This complexity measure is related to Kolmogorov complexity, but the only function it uses is the recursive copy (i.e., the shallow copy). The underlying mechanism in this complexity measure is the starting point for some algorithms for lossless data compression, like LZ77, LZ78 and LZW. Even though it is based on an elementary principle of word copying, this complexity measure is not too restrictive in the sense that it satisfies the main qualities expected of such a measure: sequences with a certain regularity do not have too large a complexity, and the complexity grows as the sequence grows in length and irregularity. The Lempel–Ziv complexity can be used to measure the repetitiveness of binary sequences and text, like song lyrics or prose. Fractal dimension estimates of real-world data have also been shown to correlate with Lempel–Ziv complexity. Principle Let S be a binary sequence, of length n, for which we have to compute the Lempel–Ziv complexity, denoted C(S). The sequence is read from the left. Imagine you have a delimiting line, which can be moved in the sequence during the calculation. At first, this line is set just after the first symbol, at the beginning of the sequence. This initial position is called position 1, from where we have to move it to a position 2, which is considered the initial position for the next step (and so on). We have to move the delimiter (starting at position 1) as far as possible to the right, so that the subword between position 1 and the delimiter position is a word of the sequence that starts before position 1 of the delimiter. As soon as the delimiter is set on a position where this condition is not met, we stop, move the delimiter to this position, and start again by marking this position as a new i
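The delimiter-moving procedure described above can be expressed compactly in code. The following Python sketch implements the Kaspar–Schuster formulation of the Lempel–Ziv (LZ76) complexity for a sequence given as a string; the function name and variable names are illustrative, not taken from the original paper.

```python
def lz76_complexity(s):
    """Number of words in the exhaustive history of s (Lempel-Ziv, 1976),
    computed with the Kaspar-Schuster scanning procedure."""
    n = len(s)
    if n <= 1:
        return n
    i, k, k_max = 0, 1, 1  # i: candidate start of a copy; k: current match length
    l = 1                  # l: position of the moving delimiter
    c = 1                  # c: complexity; the first symbol is always a new word
    while True:
        if s[i + k - 1] == s[l + k - 1]:
            k += 1                       # the sub-word can still be copied
            if l + k > n:
                c += 1                   # the last word reaches the end of s
                break
        else:
            k_max = max(k, k_max)
            i += 1
            if i == l:                   # no earlier start reproduces more:
                c += 1                   # close the word, advance the delimiter
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c

print(lz76_complexity("0001101001000101"))  # → 6
```

For the sequence 0001101001000101 the exhaustive history is 0·001·10·100·1000·101, giving a complexity of 6: each component is the shortest extension of a sub-word that can be copied from earlier in the sequence.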
https://en.wikipedia.org/wiki/Software%20as%20a%20Product
Software as a product (SaaP, also programming product, software product) is software made to be sold to users, who pay for a licence that allows them to use it, in contrast to SaaS, where users buy a subscription and the software is centrally hosted. One example of software as a product has historically been Microsoft Office, which has traditionally been distributed as a file package on CD-ROM or other physical media, or downloaded over a network. Office 365, on the other hand, is an example of SaaS, where a monthly subscription is required. Development effort estimation In the book The Mythical Man-Month, Fred Brooks notes that when estimating project times, it should be remembered that programming products (which can be sold to paying customers) are three times as hard to write as simple independent in-house programs, because of the requirement to work in a variety of situations, which increases the testing and documentation effort. See also The Mythical Man-Month Minimum viable product Product manager Software as a service Literature References Software distribution
https://en.wikipedia.org/wiki/Data%20localization
Data localization or data residency law requires data about a nation's citizens or residents to be collected, processed, and/or stored inside the country, often before being transferred internationally. Such data is usually transferred only after meeting local privacy or data protection laws, such as giving the user notice of how the information will be used, and obtaining their consent. Data localization builds upon the concept of data sovereignty that regulates certain data types by the laws applicable to the data subjects or processors. While data sovereignty may require that records about a nation's citizens or residents follow its personal or financial data processing laws, data localization goes a step further in requiring that initial collection, processing, and storage first occur within the national boundaries. In some cases, data about a nation's citizens or residents must also be deleted from foreign systems before being removed from systems in the data subject's nation. Motivations and concerns One of the first moves towards data localization occurred in 2005 when the Government of Kazakhstan passed a law for all ".kz" domains to be run domestically (with later exceptions for Google). However, the push for data localization greatly increased after revelations by Edward Snowden regarding United States counter-terrorism surveillance programs in 2013. Since then, various governments in Europe and around the world have expressed the desire to be able to control the flow of residents' data through technology. Some governments are accused of and some openly admit to using data localization laws as a way to surveil their own populaces or to boost local economic activity. Technology companies and multinational organizations often oppose data localization laws because they impact efficiencies gained by regional aggregation of data centers and unification of services across national boundaries. Some vendors, such as Microsoft, have used data storage locale cont
https://en.wikipedia.org/wiki/Radeon%20RX%20Vega%20series
The Radeon RX Vega series is a series of graphics processors developed by AMD. These GPUs use the Graphics Core Next (GCN) 5th generation architecture, codenamed Vega, and are manufactured on 14 nm FinFET technology, developed by Samsung Electronics and licensed to GlobalFoundries. The series consists of desktop graphics cards and APUs aimed at desktops, mobile devices, and embedded applications. The lineup was released on 14 August 2017. It included the RX Vega 56 and the RX Vega 64, priced at $399 and $499 respectively. These were followed by two mobile APUs, the Ryzen 2500U and Ryzen 2700U, in October 2017. February 2018 saw the release of two desktop APUs, the Ryzen 3 2200G and the Ryzen 5 2400G, and the Ryzen Embedded V1000 line of APUs. In September 2018 AMD announced several Vega APUs in their Athlon line of products. Later, in January 2019, the Radeon VII was announced, based on the 7 nm FinFET node manufactured by TSMC. History The Vega microarchitecture was AMD's high-end graphics card line, and is the successor to the enthusiast Fury products of the R9 300 series. Partial specifications of the architecture and the Vega 10 GPU were announced with the Radeon Instinct MI25 in December 2016. AMD later released the details of the Vega architecture. Announcement Vega was originally announced at AMD's CES 2017 presentation on 5 January 2017, alongside the Zen line of CPUs. New features Vega targets increased instructions per clock, higher clock speeds, and support for HBM2. AMD's Vega has a new memory hierarchy with a high-bandwidth cache and its controller. Support for HBM2 features double the bandwidth-per-pin over the previous generation of HBM. HBM2 allows for higher capacities with less than half the footprint of GDDR5 memory. The Vega architecture is optimized for streaming very large datasets and can work with a variety of memory types, with up to 512 TB of virtual address space. Primitive shader for improved geometry processing. Replaces vertex and geometry shaders in geo
https://en.wikipedia.org/wiki/Routing%20diagram
A routing diagram or route diagram in the field of management engineering is a type of diagram that shows a route through an accessible physical space. Routing diagrams are used in plant layout study and manufacturing plant design. Overview A routing diagram shows a route through a physical space. Routing diagrams are often considered a type of flow diagram, but they differ from flowcharts in that the routing is pictured in a physical layout. There is a similarity with the design of electrical equipment, where routing diagrams also show "the physical layout of the facility and equipment and how the circuit to the various equipment is run." A picture with a routing in geographical space is often called a route map. While the road map and transit map (such as the railway map, metro map, bus map, etc.) show all the roads or lines, the route map typically shows one route for a particular occasion. Likewise, while a ground plan or site map shows all the space, buildings and/or rooms, the routing map shows one specific route on site. Routing diagrams are used in plant layout study. The routing diagram can consist of a floor plan with a trace attached, or a 3D cross-section of a building with a trace. The routing diagram transforms into a flow diagram when the physical dimensions are taken out of the equation. References Further reading Willard C. Brinton, "Flow Charts," in Graphic presentation. New York city, Brinton associates, 1939; This is mainly about flow diagrams. Charles Day, "The Routing Diagram as a Basis for laying Out Industrial Plants." Engineering Magazine, September, 1910. p. 809-821; Republished in: Industrial Plants, 1911. Chapter VII. D. C. Eberhart. "222.41 c. Diagrammatic routing," in: The Code of Federal Regulations of the United States of America; Title 14. U.S. Government Printing Office, 1964. p. 48 Karl G. Karsten, "Route-Charts," in: Charts and graphs; an introduction to graphic methods in the control and analysis of statistics, 1923. p. 21
https://en.wikipedia.org/wiki/3-Arylpropiolonitriles
3-Arylpropiolonitriles (APN) belong to a class of electron-deficient alkyne derivatives substituted by two electron-withdrawing groups – a nitrile and an aryl moiety. Such activation results in improved selectivity towards highly reactive thiol-containing molecules, namely cysteine residues in proteins. APN-based modification of proteins has been reported to overcome several important drawbacks of existing strategies in bioconjugation, notably the presence of side reactions with other nucleophilic amino acid residues and the relative instability of the resulting bioconjugates in the blood stream. The latter drawback is especially important for the preparation of targeted therapies, such as antibody-drug conjugates. Synthesis The synthesis of 3-arylpropiolonitriles has been the subject of several studies. The most elaborate and frequently used approach is based on MnO2-mediated free-radical oxidation of the corresponding propargylic alcohols, obtained using Sonogashira coupling of the corresponding iodo-derivative in the presence of ammonia (Figure 1). Applications in biotechnology In bioconjugation (forming a stable covalent link between a biomolecule and a functional payload, such as a fluorescent dye, cytotoxic agent, or tracer), linking of the payload was classically achieved using maleimide heterobifunctional reagents (for example, see SMCC). However, when administered into living organisms, maleimide-containing bioconjugates were found to be relatively unstable and to lose the payload in the blood circulation due to the reversibility of the addition reaction between the maleimide moiety and the cysteine residue of a protein (retro-Michael addition). Due to the increased stability of bioconjugates obtained with analogous APN-based payloads (a schematic reaction is shown in Figure 2 below), their use is often preferable when high selectivity and biostability are especially important: namely, for the preparation of antibody−drug conjugates and other biologics. Standard procedure for
https://en.wikipedia.org/wiki/Quantum%20image%20processing
Quantum image processing (QIMP) is the use of quantum computing or quantum information processing to create and work with quantum images. Due to some of the properties inherent to quantum computation, notably entanglement and parallelism, it is hoped that QIMP technologies will offer capabilities and performance that surpass their traditional equivalents in terms of computing speed, security, and minimum storage requirements. Background A. Y. Vlasov's work in 1997 focused on the use of a quantum system to recognize orthogonal images. This was followed by efforts using quantum algorithms to search for specific patterns in binary images and to detect the posture of certain targets. Notably, more optics-based interpretations of quantum imaging were initially demonstrated experimentally in and formalized in after seven years. In 2003, Salvador Venegas-Andraca and S. Bose presented Qubit Lattice, the first published general model for storing, processing and retrieving images using quantum systems. Later, in 2005, Latorre proposed another kind of representation, called the Real Ket, whose purpose was to encode quantum images as a basis for further applications in QIMP. Furthermore, in 2010 Venegas-Andraca and Ball presented a method for storing and retrieving binary geometrical shapes in quantum mechanical systems, in which it is shown that maximally entangled qubits can be used to reconstruct images without using any additional information. Technically, these pioneering efforts and the subsequent studies related to them can be classified into three main groups: Quantum-assisted digital image processing (QDIP): These applications aim at improving digital or classical image processing tasks and applications. Optics-based quantum imaging (OQI) Classically-inspired quantum image processing (QIMP) A survey of quantum image representation has been published in. Furthermore, the recently published book Quantum Image Processing provides a comprehensive introduction to quant
https://en.wikipedia.org/wiki/The%20Quantum%20Vacuum
The Quantum Vacuum: An Introduction to Quantum Electrodynamics is a physics textbook authored by Peter W. Milonni in 1993. The book provides a careful and thorough treatment of zero-point energy, spontaneous emission, the Casimir and van der Waals forces, the Lamb shift, and the anomalous magnetic moment of the electron at a level of detail not found in other introductory texts on quantum electrodynamics. The first chapter, Zero‐Point Energy in Early Quantum Theory, was originally published in 1991 in the American Journal of Physics. In 2008 Milonni received the Max Born Award "For exceptional contributions to the fields of theoretical optics, laser physics and quantum mechanics, and for dissemination of scientific knowledge through authorship of a series of outstanding books". References Physics textbooks Quantum electrodynamics Quantum electronics Electrodynamics Quantum field theory
https://en.wikipedia.org/wiki/Paludiculture
Paludiculture is wet agriculture and forestry on peatlands. Paludiculture combines the reduction of greenhouse gas emissions from drained peatlands through rewetting with continued land use and biomass production under wet conditions. “Paludi” comes from the Latin “palus”, meaning “swamp, morass”, and "paludiculture" as a concept was developed at Greifswald University. Paludiculture is a sustainable alternative to drainage-based agriculture, intended to maintain carbon storage in peatlands. This differentiates paludiculture from agriculture like rice paddies, which involves draining, and therefore degrading, wetlands. Characteristics Impact of peatland drainage and rewetting Peatlands store an enormous amount of carbon. Covering only 3% of the land surface, they store more than 450 gigatonnes of carbon - more than is stored by forests (which cover 30% of the land surface). Drained peatlands cause numerous negative environmental impacts such as greenhouse gas emissions, nutrient leaching, subsidence and loss of biodiversity. Although drained peatlands cover only 0.3% of the land surface, peatland drainage is estimated to be responsible for 6% of all human greenhouse gas emissions. When peatlands are rewetted and soils become waterlogged, decomposition of organic matter (~50% carbon) almost ceases, and hence carbon no longer escapes into the atmosphere as carbon dioxide. Peatland rewetting can significantly reduce the environmental impacts caused by drainage by restoring hydrological buffering and reducing the water table's sensitivity to atmospheric evaporative demand. Due to the drainage of soils for agriculture in many areas, peat soil depth and water quality have dropped significantly over the years. These problems are mitigated by rewetting peatlands. As such, rewetting can also make installations against rising sea levels (levees, pumps) unnecessary. Wet bogs act as nitrogen sinks, whereas mineralisation and fertilisation from agriculture on drained bogs produce nitrogen run-off in
https://en.wikipedia.org/wiki/Visual%20Expert
Visual Expert is a static code analysis tool, extracting design and technical information from software source code by reverse-engineering, used by programmers for software maintenance, modernization or optimization. It is designed to parse several programming languages at the same time (PL/SQL, Transact-SQL, PowerBuilder...) and analyze cross-language dependencies, in addition to each language's source code. Visual Expert checks source code against hundreds of code inspection rules for vulnerability assessment, bug fixing, and maintenance issues. Features Cross-reference exploration: impact analysis, E/R diagrams, call graphs, CRUD matrices, dependency graphs. Software documentation: a documentation generator produces technical documentation and low-level design descriptions. Code inspection to detect bugs, security vulnerabilities and maintainability issues. Native integration with Jenkins. Reports on duplicate code, unused objects and methods, and naming conventions. Calculates software metrics and source lines of code. Code comparison: finds differences between several versions of the same code. Performance analysis: identifies code parts that slow down the application because of their syntax - it extracts statistics about code execution from the database and combines them with the static analysis of the code. Usage Visual Expert is used in several contexts: Change impact analysis: evaluating the consequences of a change in the code or in a database; avoiding negative side effects when evolving a system. Static Application Security Testing (SAST): detecting and removing security issues. Continuous Integration / Continuous Inspection: adding a static code analysis job in a CI/CD workflow to automatically verify the quality and security of a new build when it is released. Program comprehension: helping programmers understand and maintain existing code, or modernize legacy systems; transferring knowledge of the code from one programmer to another. Software
https://en.wikipedia.org/wiki/Side%20effects%20of%20penicillin
The side effects of penicillin are bodily responses to penicillin and closely related antibiotics that do not relate directly to its effect on bacteria. A side effect is an effect that is not intended with normal dosing. Some of these reactions are visible and some occur in the body's organs or blood. Penicillins are a widely used group of medications that are effective for the treatment of a wide variety of bacterial infections in human adults and children, as well as in other species. Some side effects are predictable: some are common but not serious, some are uncommon and serious, and others are rare. The route of administration of penicillin can have an effect on the development of side effects. An example of this is the irritation and inflammation that develop at a peripheral infusion site when penicillin is administered intravenously. In addition, penicillin is available in different forms. There are different penicillin medications (penicillin G benzathine, penicillin G potassium, penicillin G sodium, penicillin G procaine, and penicillin V) as well as a number of β-lactam antibiotics derived from penicillin (e.g. amoxicillin). Side effects may only last for a short time and then go away. Side effects can be relieved in some cases with non-pharmacological treatment. Some side effects require treatment to correct potentially serious and sometimes fatal reactions to penicillin. Penicillin has not been found to cause birth defects. Allergies and cross-sensitivities Many people have indicated that they have a side effect related to an allergic reaction to penicillin. It has been proposed that as many as 90% of those claiming to have an allergy to penicillin are able to take it and do not have a true allergy. Research has suggested that having a penicillin allergy incorrectly noted in the medical records can have negative consequences. Identifying an allergy to penicillin requires a hypersensitivity skin test, which diagnoses IgE-mediated immune responses cause
https://en.wikipedia.org/wiki/The%20Screening%20Room
The Screening Room is a proposed service that would stream movies to the home on the same day they are released in theaters. It was co-founded by Sean Parker (the co-founder of Napster) and Prem Akkaraju, who also served as CEO of the company. The Screening Room's shareholders include filmmakers Steven Spielberg, Ron Howard, J. J. Abrams, Martin Scorsese and Peter Jackson. In April 2020, The Screening Room announced it had raised $27.5 million in equity and named Man Jit Singh as CEO. Man Jit Singh was the former President of Home Entertainment at Sony Pictures. Sean Parker and Prem Akkaraju remain on the board of directors, and Prem Akkaraju has been elevated to Executive Chairman. The Screening Room, now renamed SR Labs, has been issued thirteen US technology utility patents involving the company's proprietary software to deliver high-quality and secure film content. References Streaming media systems Video on demand services Internet properties established in 2016
https://en.wikipedia.org/wiki/Gal4%20transcription%20factor
The Gal4 transcription factor is a positive regulator of gene expression of galactose-induced genes. This protein represents a large fungal family of transcription factors, the Gal4 family, which includes over 50 members in the yeast Saccharomyces cerevisiae, e.g. Oaf1, Pip2, Pdr1, Pdr3 and Leu3. Gal4 recognizes genes with a UAS, an upstream activating sequence, and activates them. In yeast cells, the principal targets are GAL1 (galactokinase), GAL10 (UDP-glucose 4-epimerase), and GAL7 (galactose-1-phosphate uridylyltransferase), three enzymes required for galactose metabolism. This binding has also proven useful in constructing the GAL4/UAS system, a technique for controlling expression in insects. In yeast, Gal4 is by default repressed by Gal80, and activated in the presence of galactose as Gal3 binds away Gal80. Domains Two executive domains, a DNA-binding domain and an activation domain, provide the key functions of the Gal4 protein, as in most transcription factors. DNA binding The Gal4 N-terminus is a zinc finger and belongs to the Zn(2)-C6 fungal family. It forms a Zn–cysteine thiolate cluster and specifically recognizes the UAS in the GAL1 promoter. Gal4 transactivation Localised to the C-terminus, the transactivation domain belongs to the nine-amino-acid transactivation domain family, 9aaTAD, together with Oaf1, Pip2, Pdr1 and Pdr3, but also p53, E2A and MLL. Regulation Galactose induces Gal4-mediated transcription, whereas glucose causes severe repression. As a part of Gal4 regulation, the inhibitory protein Gal80 recognises and binds to the Gal4 region (853-874 aa). The inhibitory protein Gal80 is sequestered by the regulatory protein Gal3 in a galactose-dependent manner. This allows Gal4 to function when galactose is present. Mutants The Gal4 loss-of-function mutant gal4-64 (1-852 aa, a deletion of the Gal4 C-terminal 29 aa) lost both interaction with Gal80 and activation function. In the reverted mutant Gal4C-62, a sequence (QTAY N AFMN) with the 9aaTAD pattern emerged and restored activati
https://en.wikipedia.org/wiki/Burst%20buffer
In the high-performance computing environment, a burst buffer is a fast intermediate storage layer positioned between the front-end computing processes and the back-end storage systems. It bridges the performance gap between the processing speed of the compute nodes and the input/output (I/O) bandwidth of the storage systems. Burst buffers are often built from arrays of high-performance storage devices, such as NVRAM and SSDs. A burst buffer typically offers one to two orders of magnitude higher I/O bandwidth than the back-end storage systems. Use cases Burst buffers accelerate scientific data movement on supercomputers. For example, scientific applications' life cycles typically alternate between computation phases and I/O phases. Namely, after each round of computation (i.e., computation phase), all the computing processes concurrently write their intermediate data to the back-end storage systems (i.e., I/O phase), followed by another round of computation and data movement operations. With the deployment of a burst buffer, processes can quickly write their data to the burst buffer after one round of computation instead of writing to the slow hard-disk-based storage systems, and immediately proceed to the next round of computation without waiting for the data to be moved to the back-end storage systems; the data are then asynchronously flushed from the burst buffer to the storage systems concurrently with the next round of computation. In this way, the long I/O time spent in moving data to the storage systems is hidden behind the computation time. In addition, buffering data in a burst buffer also gives applications plenty of opportunities to reshape the data traffic to the back-end storage systems for efficient bandwidth utilization of the storage systems. In another common use case, scientific applications can stage their intermediate data in and out of the burst buffer without interacting with the slower storage systems. Bypassing the storage systems allows applications to rea
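The write-then-continue pattern described above can be illustrated with a toy model (plain Python with illustrative names; real burst buffers are hardware and middleware, not a Python API). Compute phases write to a fast in-memory queue and resume immediately, while a background thread drains the data to the slow back-end store:

```python
import queue
import threading

class BurstBuffer:
    """Toy burst buffer: fast writes, asynchronous drain to slow storage."""
    def __init__(self, backend):
        self.q = queue.Queue()
        self.backend = backend  # list standing in for the parallel file system
        self.drainer = threading.Thread(target=self._drain)
        self.drainer.start()

    def write(self, data):
        self.q.put(data)        # returns at memory speed: compute resumes at once

    def _drain(self):
        while True:
            item = self.q.get()
            if item is None:    # sentinel: no more data will arrive
                break
            self.backend.append(item)  # the slow back-end write happens here,
                                       # overlapped with the next compute phase

    def close(self):
        self.q.put(None)
        self.drainer.join()     # wait until everything has reached the back end

backend = []
bb = BurstBuffer(backend)
for step in range(3):
    intermediate = f"checkpoint-{step}"  # result of one computation phase
    bb.write(intermediate)               # fast write; next phase starts immediately
bb.close()
print(backend)  # ['checkpoint-0', 'checkpoint-1', 'checkpoint-2']
```

Because the queue is FIFO and drained by a single thread, data reaches the back end in write order, while each compute phase only ever waits for a fast in-memory enqueue rather than a slow back-end write.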
https://en.wikipedia.org/wiki/Theoretical%20strength%20of%20a%20solid
The theoretical strength of a solid is the maximum possible stress a perfect solid can withstand. It is often much higher than what current real materials can achieve. The lowered fracture stress is due to defects, such as interior or surface cracks. One of the goals for the study of mechanical properties of materials is to design and fabricate materials exhibiting strength close to the theoretical limit. Definition When a solid is in tension, its atomic bonds stretch elastically. Once a critical strain is reached, all the atomic bonds on the fracture plane rupture and the material fails mechanically. The stress at which the solid fractures is the theoretical strength, often denoted as σ_theory. After fracture, the stretched atomic bonds return to their initial state, except that two surfaces have formed. The theoretical strength is often approximated as σ_theory ≈ E/10, where σ_theory is the maximum theoretical stress the solid can withstand and E is the Young's modulus of the solid. Derivation The stress–displacement (σ vs x) relationship during fracture can be approximated by a sine curve, σ = σ_theory sin(2πx/λ), up to x = λ/4. The initial slope of the σ vs x curve can be related to Young's modulus through the following relationship: dσ/dx at x = 0 equals (dσ/dε)(dε/dx) = E(dε/dx), where σ is the stress applied, E is the Young's modulus of the solid, ε is the strain experienced by the solid, and x is the displacement. The strain can be related to the displacement x by ε = x/a₀, where a₀ is the equilibrium inter-atomic spacing. The strain derivative is therefore given by dε/dx = 1/a₀. The relationship of the initial slope of the σ vs x curve with Young's modulus thus becomes dσ/dx at x = 0 equals E/a₀. The sinusoidal relationship of stress and displacement gives a derivative dσ/dx at x = 0 equal to σ_theory(2π/λ). By setting the two together, the theoretical strength becomes σ_theory = Eλ/(2πa₀); with λ on the order of a₀, this is on the order of E/10, consistent with the approximation above. The theoretical strength can also be approximated using the fracture work per unit area, which results in slightly different numbers. However, the above derivation and final approximation is a commonly used metric for evaluating the advantages of a material's mechanical properti
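A quick numerical illustration of the common rule of thumb σ_theory ≈ E/10 (the Young's moduli below are approximate handbook values, not figures from this article):

```python
# Rough theoretical-strength estimates from sigma_theory ≈ E / 10.
young_modulus_gpa = {"steel": 200, "aluminium": 70, "glass": 70}  # approximate values
for material, E in young_modulus_gpa.items():
    sigma_theory = E / 10  # rule-of-thumb estimate of the ideal tensile strength
    print(f"{material}: sigma_theory ≈ {sigma_theory:.0f} GPa")
```

Real engineering steels fracture at roughly 0.2–2 GPa, one to two orders of magnitude below the 20 GPa estimate, which is attributed to the defects mentioned above.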
https://en.wikipedia.org/wiki/ArcaOS
ArcaOS is an operating system based on OS/2, developed and marketed by Arca Noae, LLC under license from IBM. It was codenamed Blue Lion during its development. It builds on OS/2 Warp 4.52 by adding support for new hardware, fixing defects and limitations in the operating system, and by including new applications and tools, and it includes some Linux/Unix tool compatibility. It is targeted at professional users who need to run their OS/2 applications on new hardware, as well as personal users of OS/2. Like OS/2 Warp, ArcaOS is a 32-bit single-user, multiprocessing, preemptive multitasking operating system for the x86 architecture. It is supported on both physical hardware and virtual machine hypervisors. Features Hardware compatibility ArcaOS supports symmetric multiprocessing systems with up to 64 processor cores, although it is recommended to disable hyperthreading. As of version 5.0.8, ArcaOS is ACPI 6.1-compliant and includes the 20220331 release of ACPICA. While ArcaOS is a 32-bit operating system, it has limited PAE support which allows it to use RAM in excess of 4 GB as a RAM disk. ArcaOS supports being run as a virtual machine guest inside VirtualBox, VMware ESXi, VMware Workstation and Microsoft Virtual PC. In addition to the device drivers included with OS/2 Warp 4, ArcaOS includes a variety of drivers developed by Arca Noae and various third parties: Network adapters are supported either with Arca Noae's MultiMac technology, which employs FreeBSD driver code, or a selection of GenMAC drivers. Support for wireless networking is somewhat limited, though MultiMac support for additional chipsets is planned for future releases of ArcaOS. ArcaOS replaces the 16-bit IBM OS/2 USB driver with a new 32-bit driver capable of supporting USB 2.0 and USB 3.0 controllers. Audio support utilizes the Uniaud generic audio driver, now maintained by Arca Noae. Uniaud is based on the ALSA framework from the Linux kernel. In addition, a selection of device-specific dri
https://en.wikipedia.org/wiki/Delsarte%E2%80%93Goethals%20code
The Delsarte–Goethals code is a type of error-correcting code. History The concept was introduced by mathematicians Ph. Delsarte and J.-M. Goethals in their published paper. A new proof of the properties of the Delsarte–Goethals code was published in 1970. Function The Delsarte–Goethals code DG(m,r) for even m ≥ 4 and 0 ≤ r ≤ m/2 − 1 is a binary, non-linear code of length , size and minimum distance The code sits between the Kerdock code and the second-order Reed–Muller codes. More precisely, we have When r = 0, we have DG(m,r) = K(m) and when r = m/2 − 1 we have DG(m,r) = RM(2,m). For r = m/2 − 1 the Delsarte–Goethals code has strength 7 and is therefore an orthogonal array OA(. References Coding theory Error detection and correction
https://en.wikipedia.org/wiki/Polygenic%20adaptation
Polygenic adaptation describes a process in which a population adapts through small changes in allele frequencies at hundreds or thousands of loci. Many traits in humans and other species are highly polygenic, i.e., affected by standing genetic variation at hundreds or thousands of loci. Under normal conditions, the genetic variation underlying such traits is governed by stabilizing selection, in which natural selection acts to hold the population close to an optimal phenotype. However, if the phenotypic optimum changes, then the population can adapt by small directional shifts in allele frequencies spread across all the variants that affect the trait. Polygenic adaptation can occur relatively quickly (as described by the breeder's equation); however, it is difficult to detect from genomic data because the changes in allele frequencies at individual loci are very small. Polygenic adaptation represents an alternative to adaptation by selective sweeps. In classic selective sweep models, a single new mutation sweeps through a population to fixation, purging variation from a region of linkage around the selected site. More recent models have focused on partial sweeps, and on soft sweeps - i.e., sweeps that start from standing variation or comprise multiple sweeping variants at the same locus. All of these models focus on adaptation through genetic changes at a single locus, and they generally assume large changes in allele frequencies. The concept of polygenic adaptation is related to classical models from quantitative genetics. However, traditional models in quantitative genetics usually abstract away the contributions of individual loci by focusing instead on the means and variances of genetic scores. In contrast, population genetics models and data analysis have generally emphasized models of adaptation through sweeps at individual loci. The modern formulation of polygenic adaptation in population genetics was developed in a pair of 2010 review articles. Examples o
https://en.wikipedia.org/wiki/Apache%20Kudu
Apache Kudu is a free and open source column-oriented data store of the Apache Hadoop ecosystem. It is compatible with most of the data processing frameworks in the Hadoop environment. It completes Hadoop's storage layer to enable fast analytics on fast data. The open source project to build Apache Kudu began as an internal project at Cloudera. The first version, Apache Kudu 1.0, was released on 19 September 2016. Comparison with other storage engines Kudu was designed and optimized for OLAP workloads. Like HBase, it is a real-time store that supports key-indexed record lookup and mutation. Kudu differs from HBase in that Kudu's data model is a more traditional relational model, while HBase is schemaless. Kudu's "on-disk representation is truly columnar and follows an entirely different storage design than HBase/Bigtable". See also List of column-oriented DBMSes References External links Apache Kudu GitHub repository Kudu Software using the Apache license C++ software Free system software Free software Free database management systems Cloud computing Online analytical processing Data warehousing Data warehousing products Data analysis software Distributed data stores Structured storage
https://en.wikipedia.org/wiki/Photographic%20Display%20Unit
The Photographic Display Unit, or PDU, was a large-format display system used by the Royal Air Force to present radar images for interpretation by a number of operators and commanders. Made by Kelvin Hughes, it projected an image that could optionally be overlaid with a map and range rings. The PDU was originally designed for the ROTOR system and subsequent AMES Type 80-based Master Radar Stations. A smaller version was used onboard Royal Navy ships, and a larger version projected onto movie screens was used in the SAGE system in the US. Description The PDU was essentially an automated 35mm film processing system. A bright cathode ray tube duplicated the image from one of the radar consoles, while the film was exposed in front of it through an f/2 lens. Each complete scanning rotation of the radar took 15 seconds, but as it reached a selected location, normally opposite the direction the radar was designed to observe, the film was pulled out from in front of the lens and a new frame pulled into place. The recently exposed frame then moved through four stations, passing through one every 15 seconds. The first sprayed it with a developer, the next with water, then a fixer, water again, and then it was dried with an air blower. The entire developing process took one minute, after which the frame was pulled into a large film projector that shone upwards into the bottom of the plotting table. Plotters viewing the table from the top used methods developed in World War I to maintain tracks for the various aircraft. As each new frame appeared, they would place small arrow-shaped markers on the new location, leaving behind the ones they had placed earlier. This produced a trail that indicated the direction of travel. The plotting table was normally covered with a large sheet of semi-transparent paper that contained a National Grid map and allowed the operators to make notes directly on its surface. Each PDU was fed from a reel of film. Two machin
https://en.wikipedia.org/wiki/Eleks
ELEKS, also known as ELEKS Software, is an international company that provides custom software engineering and consulting services, headquartered in Tallinn, Estonia. The company has about 2,000 employees and operates offices in the United States, Canada, Germany, Ukraine, Poland, Switzerland, Japan, Croatia, UAE, KSA and the United Kingdom. History Eleks was established as a product company in 1991 by Oleksiy Skrypnyk and his son Oleksiy Skrypnyk, Jr. The company started with the launch of Dakar, specialized software for power distribution systems aimed at Eastern European markets. By 2016, Dakar was used in more than 20 Eastern European power systems, and by 2019, the company had 1,400 employees. As of 2019, more than 200 companies were using its services. Industries and technologies Eleks provides its services to enterprises in the Finance, Media & Entertainment, Healthcare, Retail, Agriculture and Logistics industries. Activities include: Custom Software Development Advanced Analytics Virtual Reality Drones Mobile and Wearable Development Solutions for Retail Data Science Research and innovations The company supports the Ukrainian armed forces by developing military drone software and hardware. References Companies based in Lviv Software companies established in 1991 Consulting firms established in 1991 Development software companies Engineering software companies Information technology consulting firms Outsourcing companies Software companies of Ukraine Ukrainian brands
https://en.wikipedia.org/wiki/Tarski%20Lectures
The Alfred Tarski Lectures are an annual lecture series and distinction in mathematical logic held at the University of California, Berkeley. Established in tribute to Alfred Tarski on the fifth anniversary of his death, the award has been given every year since 1989. After a two-year hiatus, during which the 2020 lecture was cancelled because of the COVID-19 pandemic, the lectures resumed in 2023. Tarski Lecturers The list of past Tarski lecturers is maintained by UC Berkeley. See also Center for New Media Lectures Howison Lectures Gödel Lecture List of mathematics awards List of philosophy awards List of logicians External links Site of the Alfred Tarski Lectures at UC Berkeley Mathematics Site of the Alfred Tarski Lectures at UC Berkeley Logic List of past Alfred Tarski Lectures References Mathematics awards Mathematical logic University of California, Berkeley 1989 establishments in California Recurring events established in 1989 University and college lecture series
https://en.wikipedia.org/wiki/J%20%26%20E%20Hall
J & E Hall is an English manufacturer of refrigeration equipment (today part of the Daikin group). It was originally established as an iron works in Dartford, Kent in 1785, with products including papermaking machines, steam engines and gun carriages, before it started producing refrigeration machinery in the 1880s. During the early 20th century, the company diversified to produce commercial vehicles (branded as Hallford, 1906–1926), lifts and escalators, before refocusing on its core refrigeration and air conditioning products in the late 1960s. The company retains a head office and some R&D facilities in Dartford. History The company was originally established in 1785 in Lowfield Street, Dartford by smith and millwright John Hall (1764–1836). Originally from Alton, Hampshire, the second son of a millwright who had previously worked in Dartford, Hall moved to Dartford in 1784, and was employed to repair a mill on the River Darent in Hawley, after which he set up his own business, repairing and maintaining machinery used in corn, paper, oil and powder mills in and around Dartford. Bryan Donkin was one of the firm's earliest apprentices. Around 1800, the firm moved to larger premises in Waterside (now Hythe Street) on land which had once formed part of Dartford Priory, where its association with Donkin, now involved in the area's papermaking industry, helped it expand in partnership with the Fourdrinier brothers and John Gamble, to make paper machines. Donkin, Hall and Gamble also collaborated on canning food in metal containers. Hall acquired Peter Durand's patent in 1812 and after various experiments, Donkin, Hall and Gamble set up a canning factory in Blue Anchor Lane in Bermondsey, the first cannery to use tinned iron containers. By late spring 1813 they were appointing agents on the south coast to sell the preserved food to outbound ships, and the British Admiralty placed large orders with the firm of Donkin, Hall and Gamble for tinned meat. The firm later mer
https://en.wikipedia.org/wiki/Liouville%20space
In the mathematical physics of quantum mechanics, Liouville space, also known as line space, is the space of operators on Hilbert space. Liouville space is itself a Hilbert space under the Hilbert-Schmidt inner product. Abstractly, Liouville space is equivalent (isometrically isomorphic) to the tensor product of a Hilbert space with its dual. A common technique for organizing computations in Liouville space is vectorization. Liouville space underlies the density operator formalism and is widely used in the study of open quantum systems. References Hilbert spaces Linear algebra Operator theory Functional analysis
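As a concrete illustration of vectorization and the Hilbert-Schmidt inner product, the following NumPy sketch (using arbitrary random operators on a 3-dimensional Hilbert space, chosen here purely for illustration) checks the standard column-stacking identity vec(A X B) = (Bᵀ ⊗ A) vec(X) and the fact that the Hilbert-Schmidt inner product of two operators equals the ordinary inner product of their Liouville-space vectors:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 3  # dimension of the underlying Hilbert space (arbitrary)

def rand_op():
    return rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))

A, B, X = rand_op(), rand_op(), rand_op()

def vec(M):
    # Column-stacking maps a d x d operator to a vector in Liouville space
    return M.flatten(order="F")

# Superoperator identity: vec(A X B) = (B^T kron A) vec(X)
lhs = vec(A @ X @ B)
rhs = np.kron(B.T, A) @ vec(X)
assert np.allclose(lhs, rhs)

# Hilbert-Schmidt inner product <A, B> = Tr(A^dagger B) agrees with the
# ordinary inner product of the vectorized operators
hs = np.trace(A.conj().T @ B)
assert np.isclose(hs, np.vdot(vec(A), vec(B)))
```

This is the sense in which a superoperator acting on density matrices becomes an ordinary matrix (of dimension d² x d²) acting on vectors, which is what makes the formalism convenient for open-quantum-system calculations.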