| source | text |
|---|---|
https://en.wikipedia.org/wiki/123Movies | 123Movies, GoMovies, GoStream, MeMovies or 123movieshub was a network of file streaming websites operating from Vietnam which allowed users to watch films for free. It was called the world's "most popular illegal site" by the Motion Picture Association of America (MPAA) in March 2018, before being shut down a few weeks later following a criminal investigation by the Vietnamese authorities. Websites imitating the brand remain active.
Development
The site went through several name changes as it was shut down at successive domains; sometimes the name appeared as "123Movies", and other times as "123movies". The original name, and URL, was 123movies.to, which changed to other domains including 123movies.is before redirecting to gomovies.to and later gomovies.is. It was changed to gostream.is, and then to memovies.to, before changing to 123movieshub.to/is and remaining there until shutdown.
In October 2016, the MPAA listed 123Movies in its Online Notorious Markets overview to the Office of the United States Trade Representative (USTR), stating that: "The site has a global Alexa rank of 559 and a local rank of 386 in the U.S. 123movies.to had 9.26 million worldwide unique visitors in August 2016 according to SimilarWeb data". In October 2016, Business Insider reported that 123movies.to was the "most-used pirate website" in the United Kingdom.
123Movies included HD, HD-RIP, Blu-ray and camera qualities of films. The video hosters and players it used included Openload, Streamango, and MyCloud. During its existence and shutdown period, the site was covered by TorrentFreak regarding its features, uptime/downtime, shutdown, and reasons for shutdown.
In December 2017, the creators of 123movies launched another streaming site dedicated to anime, named AnimeHub.to, which remained online for months after 123Movies's shutdown.
Shutdown
In March 2017, TorrentFreak reported that the US ambassador to Vietnam, Ted Osius, had been in talks with the local Minister of Inform |
https://en.wikipedia.org/wiki/Uno%20Platform | Uno Platform is an open source cross-platform graphical user interface framework that allows WinUI and Universal Windows Platform (UWP) - based code to run on iOS, macOS, Linux, Android, and WebAssembly. Uno Platform is released under the Apache 2.0 license.
Applications can be built by using the UWP tools in Visual Studio on Windows, including XAML and C# Edit and Continue, and run on iOS, Android or in WebAssembly in a web browser. A plug-in for Microsoft Visual Studio is available from Microsoft's Visual Studio Marketplace. The community surrounding the Uno Platform open source project comes together at its annual conference, UnoConf.
See also
WebAssembly
Blazor
.NET Multi-platform App UI (.NET MAUI)
Windows App SDK
References
Further reading
The Register: WinUI and WinRT: Official modern Windows API now universal thanks to WebAssembly
InfoWorld: Put Windows apps on the web with Uno
Channel9: Uno Platform Part 1
External links
Uno Platform
Mobile software development
Mobile software programming tools
Software development by platforms
Software using the Apache license
Web development software |
https://en.wikipedia.org/wiki/PROSPERO | The International Prospective Register of Systematic Reviews, better known as PROSPERO, is an open access online database of systematic review protocols on a wide range of topics. While it was initially restricted to medicine, it now also accepts protocols in criminology, social care, education and international development, as long as there is a health-related outcome. Researchers can choose to have their reviews prospectively registered with PROSPERO. The database is produced by the Centre for Reviews and Dissemination at the University of York in England, and it is funded by the National Institute for Health Research. Registration of systematic reviews in the database has been supported by PLoS Medicine, BioMed Central, the EQUATOR Network, and BMJ editor-in-chief Fiona Godlee, among others.
History
After the PRISMA statement was published in 2010, the University of York responded to its recommendation for prospective systematic review registration by beginning development of an online database of systematic reviews. The resulting PROSPERO database was launched in February 2011 by Frederick Curzon, 7th Earl Howe, the Parliamentary Under-Secretary of State for Health. It was simultaneously launched at a Vancouver, Canada meeting organized by the Canadian Institutes of Health Research that month. By October 2011, the database included 200 records of systematic reviews that were being conducted at the time. In October 2013, the Cochrane Collaboration began automatically including protocols of its systematic reviews in PROSPERO. By October 10, 2017, the number of registered reviews in the database had increased to 26,535.
Responses
In 2017, concern was raised that some protocols in PROSPERO could be "zombie reviews" for which the protocol had been registered, but its record in the database had not been updated to indicate that it had been completed. Andrade et al. showed that only 7% of all reviews registered in PROSPERO from 2011 to 2015 had since been marked as "co |
https://en.wikipedia.org/wiki/Waifu2x | waifu2x is an image scaling and noise reduction program for anime-style art and other types of photos.
waifu2x was inspired by Super-Resolution Convolutional Neural Network (SRCNN). It uses Nvidia CUDA for computing, although alternative implementations that allow for OpenCL and Vulkan have been created.
Etymology
Waifu (from the Japanese pronunciation of "wife") is anime slang for a female character to whom one is attracted.
2x means two-times magnification.
Example
See also
Comparison gallery of image scaling algorithms
References
External links
Image processing
Free software |
https://en.wikipedia.org/wiki/Parallel%20external%20memory | In computer science, a parallel external memory (PEM) model is a cache-aware, external-memory abstract machine. It is the parallel-computing analogy to the single-processor external memory (EM) model. In a similar way, it is the cache-aware analogy to the parallel random-access machine (PRAM). The PEM model consists of a number of processors, together with their respective private caches and a shared main memory.
Model
Definition
The PEM model is a combination of the EM model and the PRAM model. The PEM model is a computation model which consists of P processors and a two-level memory hierarchy. This memory hierarchy consists of a large external memory (main memory) of size N and P small internal memories (caches). The processors share the main memory. Each cache is exclusive to a single processor. A processor can't access another's cache. The caches have a size M which is partitioned in blocks of size B. The processors can only perform operations on data which are in their cache. The data can be transferred between the main memory and the cache in blocks of size B.
I/O complexity
The complexity measure of the PEM model is the I/O complexity, which determines the number of parallel block transfers between the main memory and the cache. During a parallel block transfer each processor can transfer a block. So if P processors each load a data block of size B in parallel from the main memory into their caches, it is counted as an I/O complexity of O(1), not O(P). A program in the PEM model should minimize the data transfer between main memory and caches and operate as much as possible on the data in the caches.
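As a small illustration of this cost measure, the sketch below (not part of the original article; the parameter names follow the standard PEM notation) counts the parallel block transfers needed to scan N items with P processors and block size B, and compares it with a single-processor external-memory scan.

```python
# Minimal sketch: counting parallel block transfers in the PEM model.
# N items are scanned; each of P processors can fetch one block of B
# items per parallel I/O round.  Values below are illustrative only.
import math

def pem_scan_io(N: int, P: int, B: int) -> int:
    """Parallel I/O complexity of scanning N items: P*B items per round."""
    return math.ceil(N / (P * B))

def sequential_scan_io(N: int, B: int) -> int:
    """Single-processor external-memory scan, for comparison."""
    return math.ceil(N / B)

if __name__ == "__main__":
    N, P, B = 1_000_000, 8, 1024
    print(pem_scan_io(N, P, B))        # 123 parallel block transfers
    print(sequential_scan_io(N, B))    # 977 block transfers on one processor
```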
Read/write conflicts
In the PEM model, there is no direct communication network between the P processors. The processors have to communicate indirectly over the main memory. If multiple processors try to access the same block in main memory concurrently, read/write conflicts occur. Like in the PRAM model, three different variations of this problem are considered:
Concu |
https://en.wikipedia.org/wiki/Microprotein | A microprotein (miP) is a small protein encoded from a small open reading frame (smORF). They are a class of protein with a single protein domain that are related to multidomain proteins. Microproteins regulate larger multidomain proteins at the post-translational level. Microproteins are analogous to microRNAs (miRNAs) and heterodimerize with their targets causing dominant and negative effects. In animals and plants, microproteins have been found to greatly influence biological processes. Because of microproteins' dominant effects on their targets, microproteins are currently being studied for potential applications in biotechnology.
History
The first microprotein (miP) was discovered in the early 1990s during research on genes for basic helix–loop–helix (bHLH) transcription factors from a murine erythroleukaemia cell cDNA library. The protein was found to be an inhibitor of DNA binding (ID protein), and it negatively regulated the transcription factor complex. The ID protein was 16 kDa and consisted of a helix-loop-helix (HLH) domain. The microprotein formed bHLH/HLH heterodimers which disrupted the functional basic helix–loop–helix (bHLH) homodimers.
The first microprotein discovered in plants was the LITTLE ZIPPER (ZPR) protein. The LITTLE ZIPPER protein contains a leucine zipper domain but does not have the domains required for DNA binding and transcription activation. Thus, the LITTLE ZIPPER protein is analogous to the ID protein. Although not all of these proteins are small, in 2011 this class of protein was given the name microproteins because their negative regulatory actions are similar to those of miRNAs.
Evolutionarily, the ID protein or proteins similar to ID are found in all animals. In plants, microproteins are found only in higher plants. However, the homeodomain transcription factors that belong to the three-amino-acid loop-extension (TALE) family are targets of microproteins, and these homeodomain proteins are conserved in animals, plants, and fungi.
Str |
https://en.wikipedia.org/wiki/Inter%20University%20Center%20for%20Bioscience | Inter University Centre for Bioscience (IUCB) was established at the School of Life Sciences, Kannur University, Kerala, India, by the Higher Education Department, Government of Kerala, to be a global center of excellence for research in biological sciences. Former Vice-President of India Mohammad Hamid Ansari inaugurated the centre on July 10, 2010. IUCB also has a herbal garden on its premises named after E.K. Janaki Ammal, a renowned ethnobotanist from Thalassery who formerly served as Director-General of the Botanical Survey of India. The School of Life Sciences together with the Inter University Center for Bioscience has active research collaborations with different research institutes and industries across the country.
Research Highlights
References
Biology education
Kannur University
University research institutes |
https://en.wikipedia.org/wiki/Biodiversity%20Monitoring%20Switzerland | The Biodiversity Monitoring Switzerland (BDM) is a Swiss Confederation programme for the long-term monitoring of species diversity in Switzerland.
Introduction
The Biodiversity Monitoring Switzerland surveys the long-term development of species diversity in selected organism groups in Switzerland. The focus is on surveying common and widespread species in order to make informed statements about the development of species diversity in common landscapes.
Biodiversity Monitoring Switzerland is a programme run by the Federal Office for the Environment FOEN. It is a long-term environmental monitoring project, comparable with other national programmes, such as the Swiss National Forest Inventory (NFI), the National Surface Water Quality Monitoring Programme (NAWA), the Swiss Soil Monitoring Network (NABO) and the project “Monitoring the Effectiveness of Habitat Conservation in Switzerland” (WBS). There are similar biodiversity monitoring programmes in place in the United Kingdom (UK Countryside Survey by the UK Centre for Ecology & Hydrology) and in parts of Canada (Alberta Biodiversity Monitoring run by the Alberta Biodiversity Monitoring Institute).
Tasks and objectives
Together with other environmental information, the data from the Biodiversity Monitoring Switzerland underpin national conservation policy and other policy areas that are relevant to biodiversity such as agriculture and forestry. By signing the UN Convention on Biological Diversity (CBD), Switzerland also has an obligation under international law to monitor the long-term development of biodiversity.
The objectives of the Biodiversity Monitoring Switzerland are to
draw representative conclusions about biodiversity in Switzerland as a whole (sometimes broken down by biogeographic region or main type of land use, e.g. grassland, forests, settlements etc.);
monitor the evolution of species diversity as a whole, i.e. including in intensively managed areas and therefore draw conclusions about the com |
https://en.wikipedia.org/wiki/Tight%20junction%20proteins | Tight junction proteins (TJ proteins) are molecules situated at the tight junctions of epithelial, endothelial and myelinated cells. This multiprotein junctional complex has a regulatory function in the passage of ions, water and solutes through the paracellular pathway. It can also coordinate the motion of lipids and proteins between the apical and basolateral surfaces of the plasma membrane. The tight junction thereby conducts signaling molecules that influence the differentiation, proliferation and polarity of cells, and so plays a key role in the maintenance of osmotic balance and the trans-cellular transport of tissue-specific molecules. More than 40 different proteins are now known to be involved in these selective TJ channels.
Structure of tight junction
The morphology of the tight junction is formed by transmembrane strands on the inner side of the plasma membrane with complementary grooves on the outer side. This TJ strand network is composed of transmembrane proteins that interact with the actin cytoskeleton and with submembrane proteins, which relay signals into the cell. The complexity of the network structure depends on the cell type, and it can be visualized and analyzed by freeze-fracture electron microscopy, which shows the individual strands of the tight junction.
Function of tight junction proteins
TJ proteins can be divided into different groups according to their function or localization within the tight junction. TJ proteins are mostly described in epithelia and endothelia, but also in myelinated cells. In the central and peripheral nervous systems, TJs are localized between glia and axons and within myelin sheaths, where they facilitate signaling. Some TJ proteins act as scaffolds that connect integral proteins with the actin cytoskeleton. Others are able to crosslink junctional molecules or transport vesicles through the tight junction. Some submembrane proteins are involved in cell signaling and gene expression due t |
https://en.wikipedia.org/wiki/Marian%20Pour-El | Marian Boykan Pour-El (April 29, 1928 – June 10, 2009) was an American mathematical logician who did pioneering work in computable analysis.
Early life and education
Marian Boykan was born in 1928 in New York City; her parents were dentist Joseph Boykan and his wife Matilda (Mattie, née Caspe), a former laboratory technician and housewife.
As a young girl, she performed ballet at the Metropolitan Opera House,
and this influenced her later life where she was often more comfortable speaking before large audiences than in small groups.
Although she wanted to attend the Bronx High School of Science, it was at that time only for boys; instead, she went to a girls' school, Hunter College High School.
Her parents were unwilling to pay the tuition for private college for her, so she went to Hunter College, an inexpensive local higher-education establishment primarily aimed at training schoolteachers. There she earned a bachelor's degree in physics in 1949. She also completed enough courses in mathematics for a second major but was not allowed to have two majors by Hunter College's rules.
She was accepted to Harvard University for graduate studies in mathematics, with full support, as the only woman in the program.
At Harvard, she earned a master's degree in 1951 and a Ph.D. in mathematical logic in 1958. She was very isolated and lonely at Harvard, with few friends and, initially, no other students even willing to sit next to her in her classes. The nearest restroom to her classes was in a different building, and one of the few buildings with air conditioning in the summers was off-limits to women, even when she was assigned as an instructor to a class in that building. Because there were no logicians at Harvard at that time, she spent five years of her time as a visiting student at the University of California, Berkeley. Her doctoral dissertation was Computable Functions.
Career
After finishing her doctorate, Pour-El joined the mathematics faculty at Pennsylvania State |
https://en.wikipedia.org/wiki/Rank-maximal%20allocation | Rank-maximal (RM) allocation is a rule for fair division of indivisible items. Suppose we have to allocate some items among people. Each person can rank the items from best to worst. The RM rule says that we have to give as many people as possible their best (#1) item. Subject to that, we have to give as many people as possible their next-best (#2) item, and so on.
In the special case in which each person should receive a single item (for example, when the "items" are tasks and each task has to be done by a single person), the problem is called rank-maximal matching or greedy matching.
The idea is similar to that of utilitarian cake-cutting, where the goal is to maximize the sum of utilities of all participants. However, the utilitarian rule works with cardinal (numeric) utility functions, while the RM rule works with ordinal utilities (rankings).
Definition
There are several items and several agents. Each agent has a total order on the items. Agents can be indifferent between some items; for each agent, we can partition the items into equivalence classes that contain items of the same rank. For example, if Alice's preference-relation is x > y,z > w, it means that Alice's 1st choice is x, which is better for her than all other items; Alice's 2nd choice is y and z, which are equally good in her eyes but not as good as x; and Alice's 3rd choice is w, which she considers worse than all other items.
For every allocation of items to the agents, we construct its rank-vector as follows. Element #1 in the vector is the total number of items that are 1st-choice for their owners; Element #2 is the total number of items that are 2nd-choice for their owners; and so on.
A rank-maximal allocation is one in which the rank-vector is maximum, in lexicographic order.
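A minimal sketch (illustrative only, not from the article) of how a rank-vector can be computed for a given allocation; Python's built-in list comparison then gives the lexicographic order used to pick a rank-maximal allocation. The agent and item names are placeholders matching the example below.

```python
# Minimal sketch: computing the rank-vector of an allocation.
def rank_vector(rankings, allocation):
    """rankings: dict agent -> list of items, best first.
    allocation: dict agent -> item assigned to that agent.
    Returns a list v where v[r] counts agents who got their (r+1)-th choice."""
    n_ranks = max(len(r) for r in rankings.values())
    v = [0] * n_ranks
    for agent, item in allocation.items():
        r = rankings[agent].index(item)  # 0-based rank of the received item
        v[r] += 1
    return v

rankings = {"Alice": ["x", "y", "z"], "Bob": ["x", "y", "z"], "Carl": ["y", "x", "z"]}
print(rank_vector(rankings, {"Alice": "x", "Bob": "y", "Carl": "z"}))  # [1, 1, 1]
print(rank_vector(rankings, {"Alice": "x", "Bob": "z", "Carl": "y"}))  # [2, 0, 1]
# [2, 0, 1] > [1, 1, 1] lexicographically, so the second allocation is better.
```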
Example
Three items, x y and z, have to be divided among three agents whose rankings are:
Alice: x > y > z
Bob: x > y > z
Carl: y > x > z
In the allocation (x, y, z), Alice gets her 1st choice (x), Bob ge |
https://en.wikipedia.org/wiki/Amicable%20triple | In mathematics, an amicable triple is a set of three different numbers so related that the restricted sum of the divisors of each is equal to the sum of the other two numbers.
In another equivalent characterization, an amicable triple is a set of three different numbers so related that the sum of the divisors of each is equal to the sum of the three numbers.
So a triple (a, b, c) of natural numbers is called amicable if s(a) = b + c, s(b) = a + c and s(c) = a + b, or equivalently if σ(a) = σ(b) = σ(c) = a + b + c. Here σ(n) is the sum of all positive divisors, and s(n) = σ(n) − n is the aliquot sum.
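A minimal sketch verifying the second characterization with a brute-force sum-of-divisors function; (1980, 2016, 2556) is a commonly cited example of an amicable triple.

```python
# Minimal sketch: verifying an amicable triple via sigma(a) = sigma(b) = sigma(c) = a + b + c.
def sigma(n: int) -> int:
    """Sum of all positive divisors of n (including n itself)."""
    return sum(d for d in range(1, n + 1) if n % d == 0)

def is_amicable_triple(a: int, b: int, c: int) -> bool:
    total = a + b + c
    return len({a, b, c}) == 3 and sigma(a) == sigma(b) == sigma(c) == total

print(is_amicable_triple(1980, 2016, 2556))  # True: sigma of each is 6552 = 1980 + 2016 + 2556
```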
References
Divisor function
Integer sequences
Number theory |
https://en.wikipedia.org/wiki/The%20Shift%20Project | The Shift Project (also called The Shift or TSP) is a French nonprofit created in 2010 that aims to limit both climate change and the dependency of our economy on fossil fuels.
Presentation, goals and organization
The Shift Project is a French nonprofit created in January 2010 in Paris by energy-climate experts such as Jean-Marc Jancovici, Geneviève Férone-Creuzet and Michel Lepetit. The organization aims to address two issues raised by the use of carbon: climate change and the depletion of fossil fuels. The Shift works as a think tank that shares ideas with economic, political, academic and voluntary actors.
The Shift Project is funded by corporate sponsors. Its budget for 2017 was about 600,000 euros.
The organization is led by a group of three people elected by the board of directors, which includes members of the sponsoring companies. A group of experts, called the "Expert Committee" (), ensures the scientific validity of the work done by The Shift Project. This group of experts (in economics, finance, climate, physics, history...) includes Alain Grandjean, Gaël Giraud, Hervé Le Treut, Jean-Pascal van Ypersele and Jacques Treiner. When the think tank was created, the first director was Cédric Ringenbach. He held this position until 2016, when he left The Shift Project and created the nonprofit organization The Climate Collage, which was later renamed to The Climate Fresk. Now headed by Matthieu Auzanneau, The Shift has a team of about ten employees and works with volunteers who are grouped into an independent nonprofit called The Shifters.
The Shift examines the economy's dependency on oil from three angles: the potential return of economic growth, the issues raised by the finite supply of oil, and the climate change caused by carbon emissions. According to The Shift, although GDP has its uses, it is of limited value, in particular because it does not account for natural resources (and so does not reflect their limited availability) |
https://en.wikipedia.org/wiki/Nekrasov%20matrix | In mathematics, a Nekrasov matrix or generalised Nekrasov matrix is a type of diagonally dominant matrix (i.e. one in which the diagonal elements are in some way greater than some function of the non-diagonal elements). Specifically, if A = (a_{ij}) is an n × n Nekrasov matrix, its diagonal elements are non-zero and the diagonal elements also satisfy

$|a_{ii}| > h_i(A)$ for every $i = 1, \dots, n$,

where

$h_1(A) = \sum_{j \neq 1} |a_{1j}|, \qquad h_i(A) = \sum_{j=1}^{i-1} \frac{|a_{ij}|}{|a_{jj}|}\, h_j(A) + \sum_{j=i+1}^{n} |a_{ij}| \quad (i = 2, \dots, n).$
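A minimal sketch, assuming the recursively defined row sums h_i(A) given above, that checks whether a small matrix satisfies the Nekrasov condition; the example matrix is illustrative only.

```python
# Minimal sketch: checking the Nekrasov condition |a_ii| > h_i(A).
import numpy as np

def nekrasov_h(A: np.ndarray) -> np.ndarray:
    """Recursively defined row sums h_i(A) used in the Nekrasov condition."""
    n = A.shape[0]
    h = np.zeros(n)
    h[0] = np.sum(np.abs(A[0, 1:]))
    for i in range(1, n):
        left = sum(abs(A[i, j]) * h[j] / abs(A[j, j]) for j in range(i))
        right = np.sum(np.abs(A[i, i + 1:]))
        h[i] = left + right
    return h

def is_nekrasov(A: np.ndarray) -> bool:
    d = np.abs(np.diag(A))
    return bool(np.all(d > 0) and np.all(d > nekrasov_h(A)))

A = np.array([[4.0, 1.0, 1.0],
              [1.0, 3.0, 1.0],
              [1.0, 1.0, 3.0]])
print(is_nekrasov(A))  # True: this strictly diagonally dominant matrix passes the test
```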
References
Matrices |
https://en.wikipedia.org/wiki/Image%20destriping | Image destriping is the process of removing stripes or streaks from images and videos without disrupting the original image/video. These artifacts plague a range of fields in scientific imaging including atomic force microscopy, light sheet fluorescence microscopy, and planetary satellite imaging.
The most common image processing technique for reducing stripe artifacts is Fourier filtering. Unfortunately, filtering methods risk altering or suppressing useful image data. Methods developed for multiple-sensor imaging systems in planetary satellites use statistical-based methods to match signal distribution across multiple sensors. More recently, a new class of approaches leverages compressed sensing to regularize an optimization problem and recover stripe-free images. In many cases, these destriped images have little to no artifacts, even at low signal-to-noise ratios.
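A minimal sketch of the Fourier-filtering idea for vertical stripes (not taken from any of the cited methods); purely vertical stripes concentrate energy in a narrow band of the 2-D spectrum, so a notch filter there suppresses them. The notch width and synthetic test data are illustrative only.

```python
# Minimal sketch: Fourier-domain destriping of vertical stripes.
import numpy as np

def destripe_vertical(image: np.ndarray, notch_half_width: int = 2,
                      keep_low: int = 5) -> np.ndarray:
    """Suppress vertical stripes by notching the stripe band of the shifted
    2-D spectrum while keeping the lowest frequencies around DC."""
    f = np.fft.fftshift(np.fft.fft2(image))
    rows, cols = image.shape
    cr, cc = rows // 2, cols // 2
    mask = np.ones((rows, cols))
    band = slice(cr - notch_half_width, cr + notch_half_width + 1)
    mask[band, :cc - keep_low] = 0.0        # zero the stripe band left of DC
    mask[band, cc + keep_low + 1:] = 0.0    # and right of DC
    return np.real(np.fft.ifft2(np.fft.ifftshift(f * mask)))

# Synthetic test: a smooth vertical gradient corrupted by vertical stripes.
y = np.linspace(0.0, 1.0, 128)
clean = np.tile(y[:, None], (1, 128))
stripes = 0.2 * np.sin(40 * np.pi * np.linspace(0.0, 1.0, 128))[None, :]
noisy = clean + stripes
print(np.abs(noisy - clean).mean())                     # roughly 0.13
print(np.abs(destripe_vertical(noisy) - clean).mean())  # much smaller residual
```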
References
Computer vision
Image processing |
https://en.wikipedia.org/wiki/Ebbe%20Nielsen%20Challenge | The Ebbe Nielsen Challenge is an international science competition conducted annually from 2015 onwards by the Global Biodiversity Information Facility (GBIF), with a set of cash prizes that recognize researcher(s)' submissions in creating software or approaches that successfully address a GBIF-issued challenge in the field of biodiversity informatics. It succeeds the Ebbe Nielsen Prize, which was awarded annually by GBIF between 2002 and 2014. The name of the challenge honours the memory of prominent entomologist and biodiversity informatics proponent Ebbe Nielsen, who died of a heart attack in the U.S.A. en route to the 2001 GBIF Governing Board meeting.
History
In 2001, GBIF created the Ebbe Nielsen Prize to honour the recently deceased Danish-Australian entomologist Ebbe Nielsen, who was a keen proponent of both GBIF and the biodiversity informatics discipline. That prize recognized a global researcher or research team for their retrospective contribution(s) to the field of biodiversity informatics, according to criteria set out by GBIF in the terms of the award. From 2015 onwards, GBIF re-launched the award process as a competition between global individuals or teams of researchers to create new software or approaches to using biodiversity data according to a theme announced annually for each round of the competition, and also to split the prize money among multiple groups instead of a single winner as in the initial era of the Prize. Calls for entries to the competition, now called the "Ebbe Nielsen Challenge", have been issued annually from 2015 to the present, with winners announced through a competitive process in all years except for 2017, when an insufficient number of entries was received.
List of winners, 2015 onwards
2015
Challenge: "Make significant use of GBIF-mediated data in a way that provides new representations or insights. Your submission could involve a range of results—websites, stand-alone or mobile applications, or outputs of analyses—or |
https://en.wikipedia.org/wiki/Valeriepieris%20circle | A Valeriepieris circle is a figure drawn on the Earth's surface such that the majority of the human population lives within its interior. The concept was originally popularized by a map posted on Reddit in 2013, made by a Texas ESL teacher named Ken Myers, whose username on the site gave the figure its name. Myers's original circle covers only about 6.7% of the Earth's total surface area, with a radius around in length, centered in the South China Sea. The map became a popular meme, and was featured in numerous internet media outlets.
Myers's original map uses the Winkel tripel projection, which means that his circle, not having been adjusted to the projection, does not correspond to a circle on the surface of a sphere.
In 2015, Singaporean professor Danny Quah—with the aid of an intern named Ken Teoh—verified Myers's original claim, as well as presenting a new, considerably smaller circle centered on the township of Mong Khet in Myanmar, with a radius of . In fact, Quah claimed this circle to be the smallest one possible, having been produced from more rigorous calculations and updated data, as well as being a proper circle on the Earth's surface.
In 2022, Myers's original circle was again tested by Riaz Shah, a professor at Hult International Business School. Shah used recently published data from the United Nations' World Population Prospects to estimate that 4.2 billion people lived inside the circle as of 2022, out of a total human population of 8 billion.
References
Geographical regions
World population
Population density
World maps
Internet memes |
https://en.wikipedia.org/wiki/GDPR%20fines%20and%20notices | The General Data Protection Regulation (GDPR) is a European Union regulation that specifies standards for data protection and electronic privacy in the European Economic Area, and the rights of European citizens to control the processing and distribution of personally-identifiable information.
Violators of GDPR may be fined up to €20 million, or up to 4% of the annual worldwide turnover of the preceding financial year, whichever is greater. The following is a list of fines and notices issued under the GDPR, including reasoning.
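A trivial sketch of the fine cap described above, taking the greater of €20 million and 4% of annual worldwide turnover; the turnover figures are illustrative only.

```python
# Minimal sketch of the GDPR maximum-fine rule quoted above.
def gdpr_max_fine(annual_turnover_eur: float) -> float:
    """Greater of EUR 20 million and 4% of worldwide annual turnover."""
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

print(gdpr_max_fine(100_000_000))    # 20,000,000 -- the fixed cap applies
print(gdpr_max_fine(2_000_000_000))  # 80,000,000 -- 4% of turnover applies
```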
Fines and notices
References
External links
European Data Protection Board
Privacy law
Law enforcement
Crime in the European Union |
https://en.wikipedia.org/wiki/5G%20NR | 5G NR (New Radio) is a new radio access technology (RAT) developed by the 3rd Generation Partnership Project (3GPP) for the 5G (fifth generation) mobile network. It was designed to be the global standard for the air interface of 5G networks. It is based on orthogonal frequency-division multiplexing (OFDM), as is the 4G (fourth generation) long-term evolution (LTE) standard.
The 3GPP specification 38 series provides the technical details behind 5G NR, the successor of LTE.
The study of NR within 3GPP started in 2015, and the first specification was made available by the end of 2017. While the 3GPP standardization process was ongoing, the industry had already begun efforts to implement infrastructure compliant with the draft standard, with the first large-scale commercial launch of 5G NR having occurred in the end of 2018. Since 2019, many operators have deployed 5G NR networks and handset manufacturers have developed 5G NR enabled handsets.
Frequency bands
5G NR uses frequency bands in two broad frequency ranges:
Frequency Range 1 (FR1), for bands within MHz – MHz
Frequency Range 2 (FR2), for bands within MHz – MHz
Network deployments
Ooredoo was the first carrier to launch a commercial 5G NR network, in May 2018 in Qatar. Other carriers around the world have been following suit.
Development
In 2018, 3GPP published Release 15, which includes what is described as "Phase 1" standardization for 5G NR. The timeline for Release 16, which will be "5G phase 2", follows a freeze date of March 2020 and a completion date of June 2020. Release 17 was originally scheduled for delivery in September 2021 but, because of the COVID-19 pandemic, was rescheduled for June 2022.
Release 18 work has started in 3GPP. Release 18 is referred to as "NR Advanced", signifying another milestone in wireless communication systems. NR Advanced will include features such as eXtended Reality (XR), AI/ML studies, and mobility enhancements. Mobility is at the core of 3GPP technolog |
https://en.wikipedia.org/wiki/AWM%E2%80%93Microsoft%20Research%20Prize%20in%20Algebra%20and%20Number%20Theory | The AWM–Microsoft Research Prize in Algebra and Number Theory is a prize given every other year by the Association for Women in Mathematics to an outstanding young female researcher in algebra or number theory. It was funded in 2012 by Microsoft Research and first issued in 2014.
Winners
Sophie Morel (2014), for her research in number theory, particularly her contributions to the Langlands program, an application of her results on weighted cohomology, and a new proof of Brenti's combinatorial formula for Kazhdan-Lusztig polynomials.
Lauren Williams (2016), for her research in algebraic combinatorics, particularly her contributions on the totally nonnegative Grassmannian, her work on cluster algebras, and her proof (with Musiker and Schiffler) of the famous Laurent positivity conjecture.
Melanie Wood (2018), for her research in number theory and algebraic geometry, particularly her contributions in arithmetic statistics and tropical geometry, as well as her work with Ravi Vakil on the limiting behavior of natural families of varieties.
Melody Chan (2020), in recognition of her advances at the interface between algebraic geometry and combinatorics.
Jennifer Balakrishnan (2022), in recognition of her advances in computing rational points on algebraic curves over number fields.
See also
List of awards honoring women
List of mathematics awards
References
External links
AWM–Microsoft Research Prize, Association for Women in Mathematics
Awards honoring women
Awards and prizes of the Association for Women in Mathematics
Algebra
Number theory
Microsoft Research
2014 establishments in the United States
Awards established in 2014
Research awards
Biennial events |
https://en.wikipedia.org/wiki/Joan%20%26%20Joseph%20Birman%20Research%20Prize%20in%20Topology%20and%20Geometry | The Joan & Joseph Birman Research Prize in Topology and Geometry is a prize given every other year by the Association for Women in Mathematics to an outstanding young female researcher in topology or geometry. The prize fund for the award was endowed by a donation in 2013 from Joan Birman and her husband, Joseph Birman, and first awarded in 2015.
Winners
Elisenda Grigsby (2015), for her research in low-dimensional topology, particularly in knot theory and categorified invariants.
Emmy Murphy (2017), for her research in symplectic geometry where she developed new techniques for studying symplectic manifolds and contact geometry.
Kathryn Mann (2019), for "major breakthroughs in the theory of dynamics of group actions on manifolds".
Emily Riehl (2021), for "deep and foundational work in category theory and homotopy theory."
Kristen Hendricks (2023), for "highly influential work on equivariant aspects of Floer homology theories".
See also
List of awards honoring women
List of mathematics awards
References
External links
AWM Birman Research Prize, Association for Women in Mathematics
Awards honoring women
Awards and prizes of the Association for Women in Mathematics
Awards established in 2015
Biennial events
Early career awards
Research awards
Geometry
Topology
2015 establishments in the United States |
https://en.wikipedia.org/wiki/Novolak | Novolaks (sometimes: novolacs) are low molecular weight polymers derived from phenols and formaldehyde. They are related to Bakelite, which is more highly crosslinked. The term comes from Swedish "lack" for lacquer and Latin "novo" for new, since these materials were envisioned to replace natural lacquers such as copal resin.
Typically novolaks are prepared by the condensation of phenol or a mixture of p- and m-cresol with formaldehyde (as formalin). The reaction is acid catalyzed. Oxalic acid is often used because it can be subsequently removed by thermal decomposition. Novolaks have a degree of polymerization of approximately 20-40. The branching density, determined by the processing conditions, the m- vs p-cresol ratio, and the CH2O/cresol ratio, is typically around 15%.
Novolaks are especially important in microelectronics where they are used as photoresist materials. They are also used as tackifiers in rubber.
See also
Epoxy
References
Plastics
Synthetic resins
Semiconductor device fabrication
Thermosetting plastics |
https://en.wikipedia.org/wiki/Sega%20SC-3000%20character%20set | Sega SC-3000 is a character set developed by Sega Corporation for the SC-3000 home computer.
Character sets
The following table shows the SC-3000 character set. Each character is shown with a potential Unicode equivalent. Space and control characters are represented by the abbreviations for their names.
References
Character sets
Sega |
https://en.wikipedia.org/wiki/Well-known%20text%20representation%20of%20coordinate%20reference%20systems | Well-known text representation of coordinate reference systems (WKT or WKT-CRS) is a text markup language for representing spatial reference systems and transformations between spatial reference systems. The formats were originally defined by the Open Geospatial Consortium (OGC) and described in their Simple Feature Access and Well-known text representation of coordinate reference systems specifications. The current standard definition is ISO 19162:2019. This supersedes ISO 19162:2015.
Version history
This WKT format was initially defined by the Open Geospatial Consortium (OGC) in 1999, then extended in 2001. This format, also defined in ISO 19125-1:2004, is sometimes known as "WKT 1". Later, the evolution of the coordinate reference system conceptual model, new requirements, and inconsistencies in implementations of the WKT 1 format across different software encouraged a revision of that format. The updated "Well-known text representation of coordinate reference systems" standard, sometimes known as "WKT 2", was adopted by the Open Geospatial Consortium in 2015. This standard is published jointly by the International Organization for Standardization as ISO 19162:2015.
Confusingly, the original 2015 "WKT 2" standard has a version number 1 for the new, stricter WKT-CRS specification. A newer revision called WKT-CRS 2 was published in 2018, with the ISO version being ISO 19162:2019.
Backward compatibility
Software capable of reading coordinate reference systems in the WKT 2 format can also read many (but not all) equivalent systems in the WKT 1 format. Some caveats exist, notably the removal of the TOWGS84 element, which is replaced by the BOUNDCRS element. Another caveat concerns the units of measurement. Some of them were unspecified in the oldest WKT 1 specifications (for example the PRIMEM unit), which led to different interpretations by different software. Those units of measurement have been clarified in the 2001 update and the WKT 2 specification is consistent with t |
https://en.wikipedia.org/wiki/Von%20Bertalanffy%20function | The von Bertalanffy growth function (VBGF), or von Bertalanffy curve, is a type of growth curve for a time series and is named after Ludwig von Bertalanffy. It is a special case of the generalised logistic function. The growth curve is used to model mean length from age in animals. The function is commonly applied in ecology to model fish growth and in paleontology to model sclerochronological parameters of shell growth.
The model can be written as the following:
$L(a) = L_\infty \bigl(1 - e^{-k(a - t_0)}\bigr)$

where $a$ is age, $k$ is the growth coefficient, $t_0$ is the theoretical age when size is zero, and $L_\infty$ is asymptotic size. It is the solution of the following linear differential equation:

$\frac{dL}{da} = k\,(L_\infty - L)$
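A minimal sketch evaluating the growth function above; the parameter values below are made up for illustration and do not correspond to any particular species.

```python
# Minimal sketch: evaluating the von Bertalanffy growth function.
import math

def von_bertalanffy(age: float, L_inf: float, k: float, t0: float) -> float:
    """Mean length at a given age under the VBGF."""
    return L_inf * (1.0 - math.exp(-k * (age - t0)))

# Illustrative parameters: asymptotic length 60 cm, growth coefficient 0.3/yr,
# theoretical age at zero length -0.5 yr.
for age in (1, 3, 5, 10):
    print(age, round(von_bertalanffy(age, L_inf=60.0, k=0.3, t0=-0.5), 1))
```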
Seasonally-adjusted von Bertalanffy
The seasonally-adjusted von Bertalanffy is an extension of this function that accounts for organism growth that occurs seasonally. It was created by I. F. Somers in 1988.
See also
Gompertz function
Monod equation
Michaelis–Menten kinetics
References
Growth curves
Mathematical modeling |
https://en.wikipedia.org/wiki/4-Amino-5-hydroxymethyl-2-methylpyrimidine | Within the field of biochemistry, 4-amino-5-hydroxymethyl-2-methylpyrimidine (HMP) also known as toxopyrimidine together with its mono phosphate (HMP-P) and pyrophosphate (HMP-PP) esters are biogenetic precursors to the important biochemical cofactor thiamine pyrophosphate (TPP), a derivative of thiamine (vitamin B1).
HMP, HMP-P and HMP-PP are found along with thiamine forms in a wide variety of living organisms. Thiamine in various salt, formulation and biological matrix forms is used to supplement human and animal diets because these organisms lack the capability to produce it. Methodologies are being sought for biotechnology-based production of thiamine forms and for increasing thiamine content in food sources.
TPP biogenesis
In microorganisms and plants TPP results from coupling of pyrimidine fragment HMP-PP with thiazole fragment HET-P to give thiamine monophosphate, followed by conversion to the pyrophosphate.
Biogenesis of HMP-P and HET-P vary with types of organism.
HMP-P biogenesis
In bacteria, HMP-P arises by conversion of the purine biosynthetic precursor 5-aminoimidazole ribotide (AIR) through the action of enzymes such as phosphomethylpyrimidine synthase, a member of the radical SAM superfamily. Studies using isotopically labelled AIR have shown which atoms carry into the product. Mechanisms by which this occurs are not yet known with certainty.
In yeasts, HMP-P is derived from metabolites of histidine and pyridoxine. Some of these transformations appear to be catalyzed by radical SAM enzymes. Isotopically labelled precursors have been used to investigate this biogenesis. Mechanisms of the transformations are unknown.
In Salmonella, HMP-P can be derived independently of purine biogenesis when AICAR is available.
In algae, thiamine forms and precursors are scavenged by uptake from water of exogenous products from other organisms. In higher plants, thiamine biogenesis resembles that of bacteria. In some circumstances, thiamine forms and |
https://en.wikipedia.org/wiki/Haplotype%20block | In genetics, a haplotype block is a region of an organism's genome in which there is little evidence of a history of genetic recombination, and which contain only a small number of distinct haplotypes. According to the haplotype-block model, such blocks should show high levels of linkage disequilibrium and be separated from one another by numerous recombination events. The boundaries of haplotype blocks cannot be directly observed; they must instead be inferred indirectly through the use of algorithms. However, some evidence suggests that different algorithms for identifying haplotype blocks give very different results when used on the same data, though another study suggests that their results are generally consistent. The National Institutes of Health funded the HapMap project to catalog haplotype blocks throughout the human genome.
Definition
There are two main ways that the term "haplotype block" is defined: one based on whether a given genomic sequence displays higher linkage disequilibrium than a predetermined threshold, and one based on whether the sequence consists of a minimum number of single nucleotide polymorphisms (SNPs) that explain a majority of the common haplotypes in the sequence (or a lower-than-usual number of unique haplotypes). In 2001, Patil et al. proposed the following definition of the term: "Suppose we have a number of haplotypes consisting of a set of consecutive SNPs. A segment of consecutive SNPs is a block if at least α percent of haplotypes are represented more than once".
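A minimal sketch of the Patil et al. criterion quoted above, under the assumption that "represented more than once" refers to the share of haplotype copies whose segment pattern recurs; the haplotype strings and threshold are made up for illustration.

```python
# Minimal sketch: testing whether a segment of consecutive SNPs forms a block.
from collections import Counter

def is_block(haplotypes, start, end, alpha=0.8):
    """haplotypes: list of strings of SNP alleles (one per chromosome copy).
    start/end: slice bounds of the candidate segment of consecutive SNPs.
    Returns True if at least alpha of the segment haplotypes recur."""
    segment = [h[start:end] for h in haplotypes]
    counts = Counter(segment)
    repeated = sum(c for c in counts.values() if c > 1)
    return repeated / len(segment) >= alpha

haplotypes = ["ACGTA", "ACGTC", "ACGTA", "TCGTA", "ACGTA", "ACGTC"]
print(is_block(haplotypes, 0, 4))             # True: 5 of 6 segment haplotypes recur
print(is_block(haplotypes, 0, 5, alpha=0.9))  # False: only 5 of 6 recur, below 90%
```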
References
Genomics
Biology terminology |
https://en.wikipedia.org/wiki/Computational%20materials%20science | Computational materials science and engineering uses modeling, simulation, theory, and informatics to understand materials. The main goals include discovering new materials, determining material behavior and mechanisms, explaining experiments, and exploring materials theories. It is analogous to computational chemistry and computational biology as an increasingly important subfield of materials science.
Introduction
Just as materials science spans all length scales, from electrons to components, so do its computational sub-disciplines. While many methods and variations have been and continue to be developed, seven main simulation techniques, or motifs, have emerged.
These computer simulation methods use underlying models and approximations to understand material behavior in more complex scenarios than pure theory generally allows and with more detail and precision than is often possible from experiments. Each method can be used independently to predict materials properties and mechanisms, to feed information to other simulation methods run separately or concurrently, or to directly compare or contrast with experimental results.
One notable sub-field of computational materials science is integrated computational materials engineering (ICME), which seeks to use computational results and methods in conjunction with experiments, with a focus on industrial and commercial application. Major current themes in the field include uncertainty quantification and propagation throughout simulations for eventual decision making, data infrastructure for sharing simulation inputs and results, high-throughput materials design and discovery, and new approaches given significant increases in computing power and the continuing history of supercomputing.
Materials simulation methods
Electronic structure
Electronic structure methods solve the Schrödinger equation to calculate the energy of a system of electrons and atoms, the fundamental units of condensed matter.
Many variations |
https://en.wikipedia.org/wiki/Grizzly%20399 | Grizzly 399 (born 1996) is a grizzly bear inhabiting Grand Teton National Park and Bridger-Teton National Forest. She is followed by as many as 40 wildlife photographers, and millions of tourists come to the Greater Yellowstone Ecosystem to see her and the other grizzly bears. Grizzly 399 is the most famous brown bear mother in the world, with her own Facebook, Twitter, and Instagram accounts.
Background
Grizzly bears Ursus arctos horribilis are a subspecies of the North American brown bear species U. arctos. Several decades ago, grizzlies were assessed as being at risk of rapid extinction due to the rate at which the population was declining. Protection under the Endangered Species Act of 1973 has resulted in a population rebound: there are now approximately 2,000 grizzly bears in the contiguous United States, of which about half are estimated to live in the Greater Yellowstone Ecosystem. Grizzlies are stereotyped as ferocious, but the typical bear avoids contact with humans, living away from settlements and attacking only to protect themselves when startled by a human.
Grand Teton Wildlife Brigade
Created in 2007 in response to the magnitude of visitors coming to Grand Teton to view Grizzly 399 and her cubs, the Grand Teton Wildlife Brigade keeps the animals and the people apart and unharmed. In 2011, ranger Kate Wilmot, whose official title is "bear management specialist", said that things had become "completely chaotic" that year. The real duty is managing the behavior of the people. This was partly due to social media increasing the popularity of the bears, and drawing more people to want to interact with them.
Wilmot directs 16 volunteers in the brigade throughout the summer until snowfall. If not for the brigaders, "wildlife watching would be a mess". The brigaders carry bear spray, but their primary role is to persuade tourists to respect the 100-yard viewing guideline established after incidents with Grizzly 610, 399's daughter.
Feeding the bears is il |
https://en.wikipedia.org/wiki/Symbiosome | A symbiosome is a specialised compartment in a host cell that houses an endosymbiont in a symbiotic relationship.
The term was first used in 1983 to describe the vacuole structure in the symbiosis between the animal host Hydra and the endosymbiont Chlorella. Symbiosomes are also seen in other cnidaria-dinoflagellate symbioses, including those found in coral-algal symbioses. In 1989 the concept was applied to the similar structure found in the nitrogen-fixing root nodules of certain plants.
The symbiosome in the root nodules has been much more successfully researched due in part to the complexity of isolating the symbiosome membrane in animal hosts. The symbiosome in a root nodule cell in a plant is an organelle-like structure that has formed in a symbiotic relationship with nitrogen-fixing bacteria. The plant symbiosome is unique to those plants that produce root nodules. The majority of such symbioses are made between legumes and diazotrophic Rhizobia bacteria. The rhizobia-legume symbioses are the most studied due to the importance in agriculture.
Each symbiosome in a root nodule cell encloses a single rhizobium that differentiates into a bacteroid. However, in some cases a symbiosome may house several bacteroids. The symbiosome membrane, or peribacteroid membrane, surrounds the bacteroid membrane, separated by a symbiosome space. This unit provides an inter-kingdom, micro-environment for the production of nitrogen for the plant, and the receipt of malate for energy for the bacteroid.
History
The concept of the symbiosome was first described in 1983, by Neckelmann and Muscatine, as seen in the symbiotic relationship between Chlorella (a genus of green algae) and Hydra (a cnidarian animal host). Until then it had been described as a vacuole. A few years later in 1989, Lauren Roth with Gary Stacey as well as Robert B Mellor applied this concept to the nitrogen-fixing unit seen in the plant root nodule, previously called an infection vacuole.
This has sinc |
https://en.wikipedia.org/wiki/Molecular%20demon | A molecular demon or biological molecular machine is a biological macromolecule that resembles and seems to have the same properties as Maxwell's demon. These macromolecules gather information in order to recognize their substrate or ligand within a myriad of other molecules floating in the intracellular or extracellular plasm. This molecular recognition represents an information gain which is equivalent to an energy gain or decrease in entropy. When the demon is reset, i.e. when the ligand is released, the information is erased, energy is dissipated and entropy increases, obeying the second law of thermodynamics. The difference between biological molecular demons and the thought experiment of Maxwell's demon is the latter's apparent violation of the second law.
Cycle
The molecular demon switches mainly between two conformations. The first, or basic state, upon recognizing and binding the ligand or substrate following an induced fit, undergoes a change in conformation which leads to the second quasi-stable state: the protein-ligand complex. In order to reset the protein to its original, basic state, it needs ATP. When ATP is consumed or hydrolyzed, the ligand is released and the demon acquires again information reverting to its basic state. The cycle may start again.
Ratchet
The second law of thermodynamics is a statistical law. Hence, occasionally, single molecules may not obey the law. All molecules are subject to the molecular storm, i.e. the random movement of molecules in the cytoplasm and the extracellular fluid. Molecular demons or molecular machines either biological or artificially constructed are continuously pushed around by the random thermal motion in a direction that sometimes violates the law. When this happens and the gliding back of the macromolecule from the movement it had made or the conformational change it underwent to its original state can be prevented, as is the case with molecular demons, the molecule works as a ratchet; it is possible t |
https://en.wikipedia.org/wiki/PROB1 | Proline-rich basic protein 1 (PROB1) is a protein encoded by the PROB1 gene located on human chromosome 5, open reading frame 65. PROB1 is also known as C5orf65 and weakly similar to basic proline-rich protein.
Gene
Characteristics
The PROB1 gene is 3251 bp long and contains a single exon.
Location
The PROB1 gene is located on human chromosome 5, cytogenetic band 5q31.2.
mRNA
Expression
PROB1 is expressed in 89 types of tissue in the human body, with highest expression in the skeletal muscle of the leg and cardiac muscle of the heart. While mRNA expression is somewhat ubiquitous and was also elevated in the spinal cord, cerebrum, and lymphocytes, measurable protein expression was only recorded in cardiac and skeletal muscle.
Protein
PROB1 is composed of 1015 amino acids. It contains two proline-rich regions, which compose the majority of the protein, and a domain of unknown function (DUF).
Structure
Predicted secondary structures for PROB1 reveal that the protein is mostly composed of random coils, with a small percentage of alpha helices and beta sheets present. This is likely due to the properties of proline; its large size, ring structure, and confined phi angle cause it to disrupt secondary structure formation. The DUF, which resides in the second proline-rich region of the protein, is also predicted to be completely composed of random coils. A tertiary structure prediction for PROB1 was generated using I-Tasser and rendered in PyMOL; overall, the protein displays an elongated structure.
Sub-cellular Localization
Analysis of protein structure, post-translational modifications, and localization signals reveals that PROB1 has no transmembrane domains and is an intracellular protein. Immunohistochemistry indicates its localization to the nucleoplasm of the cell.
Post-translational Modifications
An array of post-translational modifications has been found for PROB1, including an S-palmitoylation site and a multitude of overlapping O-GlcNAcylation and phos |
https://en.wikipedia.org/wiki/Redfish%20%28specification%29 | The Redfish standard is a suite of specifications that deliver an industry standard protocol providing a RESTful interface for the management of servers, storage, networking, and converged infrastructure.
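A minimal sketch of querying a Redfish service with Python's requests library; the service root path /redfish/v1 is part of the standard, while the BMC address, credentials, and the specific properties printed are placeholders that depend on the actual implementation.

```python
# Minimal sketch: reading the Redfish service root and Systems collection.
# The BMC address and credentials are placeholders; real services normally
# require HTTPS with valid certificates and proper authentication.
import requests

BMC = "https://bmc.example.com"        # placeholder management controller
session = requests.Session()
session.auth = ("admin", "password")   # placeholder credentials
session.verify = False                 # lab setups often use self-signed certs

root = session.get(f"{BMC}/redfish/v1").json()
print(root.get("RedfishVersion"))

# Walk the Systems collection and print each member's model and power state.
systems = session.get(BMC + root["Systems"]["@odata.id"]).json()
for member in systems.get("Members", []):
    system = session.get(BMC + member["@odata.id"]).json()
    print(system.get("Model"), system.get("PowerState"))
```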
History
The Redfish standard was developed under the SPMF umbrella at the DMTF in 2014. The first specification with base models (1.0) was published in August 2015. In 2016, models for BIOS, disk drives, memory, storage, volume, endpoint, fabric, switch, PCIe device, zone, software/firmware inventory & update, multi-function NICs, host interface (KCS replacement) and privilege mapping were added. In 2017, models for composability, location and errata were added. There is work in progress for Ethernet switching, DCIM, and OCP.
In August 2016, SNIA released a first model for network storage services (Swordfish), an extension of the Redfish specification.
Industry adoption
Redfish support on server
Advantech SKY Server BMC
Dell iDRAC BMC with minimum iDRAC 7/8 FW 2.40.40.40, iDRAC9 FW 3.00.00.0
Fujitsu iRMCS5 BMC
HPE iLO BMC with minimum iLO4 FW 2.30, iLO5
HPE Moonshot BMC with minimum FW 1.41
Lenovo XClarity Controller (XCC) BMC with minimum XCC FW 1.00
Supermicro X10 BMC with minimum FW 3.0 and X11 with minimum FW 1.0
IBM Power Systems BMC with minimum OpenPOWER (OP) firmware level OP940
IBM Power Systems Flexible Service Processor (FSP) with minimum firmware level FW860.20
Cisco Integrated Management Controller with minimum IMC SW Version 3.0
Redfish support on BMC
Insyde Software Supervyse BMC
OpenBMC a Linux Foundation collaborative open-source BMC firmware stack
American Megatrends MegaRAC Remote Management Firmware
Vertiv Avocent Core Insight Embedded Management Systems
Software using Redfish APIs
OpenStack Ironic bare metal deployment project has a Redfish driver.
Ansible has multiple Redfish modules for Remote Management including redfish_info, redfish_config, and redfish_command
ManageIQ
Redfish libraries and tools
DMTF libr |
https://en.wikipedia.org/wiki/Client%20to%20Authenticator%20Protocol | The Client to Authenticator Protocol (CTAP) or X.1278 enables a roaming, user-controlled cryptographic authenticator (such as a smartphone or a hardware security key) to interoperate with a client platform such as a laptop.
Standard
CTAP is complementary to the Web Authentication (WebAuthn) standard published by the World Wide Web Consortium (W3C). WebAuthn and CTAP are the primary outputs of the FIDO2 Project, a joint effort between the FIDO Alliance and the W3C.
CTAP is based upon previous work done by the FIDO Alliance, in particular the Universal 2nd Factor (U2F) authentication standard. Specifically, the FIDO U2F 1.2 Proposed Standard (July 11, 2017) became the starting point for the CTAP Proposed Standard, the latest version of which was published on January 30, 2019.
The CTAP specification refers to two protocol versions, the CTAP1/U2F protocol and the CTAP2 protocol. An authenticator that implements CTAP2 is called a FIDO2 authenticator (also called a WebAuthn authenticator). If that authenticator implements CTAP1/U2F as well, it is backward compatible with U2F.
The protocol uses the CBOR binary data serialization format.
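A minimal sketch of CBOR round-tripping using the third-party cbor2 Python library; the payload is illustrative only and is not an actual CTAP2 message.

```python
# Minimal sketch: encoding and decoding a small map with CBOR.
import cbor2

payload = {1: "example.org", 2: {"id": b"\x01\x02\x03", "name": "demo user"}}
encoded = cbor2.dumps(payload)   # compact binary encoding
decoded = cbor2.loads(encoded)

print(len(encoded), "bytes")
print(decoded == payload)        # True: the map round-trips losslessly
```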
The standard was adopted as ITU-T Recommendation X.1278.
References
External links
FIDO Specifications Overview
FIDO Specifications
Authentication
Identification
Internet security
ITU-T recommendations
ITU-T X Series Recommendations |
https://en.wikipedia.org/wiki/Opportunistic%20Wireless%20Encryption | Opportunistic Wireless Encryption (OWE) is a Wi-Fi standard which ensures that the communication between each pair of endpoints is protected from other endpoints. Unlike conventional Wi-Fi, it provides "Individualized Data Protection" such that data traffic between a client and access point is "individualized". Other clients can still sniff and record this traffic, but they can't decrypt it.
OWE is an extension to IEEE 802.11. It is an encryption technique similar to that of Simultaneous Authentication of Equals (SAE) and is specified by the Internet Engineering Task Force (IETF) in RFC 8110, with devices certified as Wi-Fi Certified Enhanced Open by the Wi-Fi Alliance.
See also
Wi-Fi Protected Access
References
Further reading
Internet privacy |
https://en.wikipedia.org/wiki/Map%20%28graph%20theory%29 | In topology and graph theory, a map is a subdivision of a surface such as the Euclidean plane into interior-disjoint regions,
formed by embedding a graph onto the surface and forming connected components (faces) of the complement of the graph.
That is, it is a tessellation of the surface. A map graph is a graph derived from a map by creating a vertex for each face and an edge for each pair of faces that meet at a vertex or edge of the embedded graph.
References
Topology
Graph theory objects |
https://en.wikipedia.org/wiki/JS%2B%2B | JS++ is a proprietary programming language for web development that extends JavaScript with a sound type system. It includes imperative, object-oriented, functional, and generic programming features.
History
JS++ first appeared on October 8, 2011. The modern implementation was announced at DeveloperWeek 2016 and released on May 31, 2016. The language is designed by Roger Poon and Anton Rapetov.
Syntax
Type annotations
Since JS++ is a superset of JavaScript, declaring types for variables is optional.
int x = 1; // declares the variable x with an "internal type" (JS++ type)
var y = 2; // declares the variable y with an "external type" (JavaScript type)
bool z = true; // declares the variable z with an "internal type" (JS++ type)
Features
JS++ features a type system that is sound.
JS++ is able to efficiently analyze out-of-bounds errors at compile time.
Development tools
Compiler
The JS++ compiler is available for Windows, Mac OS X, and Linux. The compiler generates JavaScript output.
Editor integration
JS++ integrates with various code editors including Visual Studio Code, Atom, and Sublime Text.
Build tools
JS++ can be integrated with third-party build tools like Webpack.
Release history
See also
TypeScript
PureScript
References
Programming languages
Web programming
Class-based programming languages
Functional languages
Statically typed programming languages
High-level programming languages
Programming languages created in 2011 |
https://en.wikipedia.org/wiki/Higman%E2%80%93Sims%20asymptotic%20formula | In finite group theory, the Higman–Sims asymptotic formula gives an asymptotic estimate on the number of groups of prime power order.
Statement
Let p be a (fixed) prime number. Define f(n, p) as the number of isomorphism classes of groups of order p^n. Then:
f(n, p) = p^((2/27)n^3 + O(n^(8/3)))
Here, the big-O notation is with respect to n, not with respect to p (the constant under the big-O notation may depend on p).
References
group theory
Theorems in group theory |
https://en.wikipedia.org/wiki/Pixel%20Visual%20Core | The Pixel Visual Core (PVC) is a series of ARM-based system in package (SiP) image processors designed by Google. The PVC is a fully programmable image, vision and AI multi-core domain-specific architecture (DSA) for mobile devices and in future for IoT.
It first appeared in the Google Pixel 2 and 2 XL which were introduced on October 19, 2017. It has also appeared in the Google Pixel 3 and 3 XL. Starting with the Pixel 4, this chip was replaced with the Pixel Neural Core.
History
Google previously used Qualcomm Snapdragon's CPU, GPU, IPU, and DSP to handle image processing for its Google Nexus and Google Pixel devices. With the increasing importance of computational photography techniques, Google developed the Pixel Visual Core (PVC). Google claims the PVC uses less power than the CPU and GPU while still being fully programmable, unlike its tensor processing unit (TPU) application-specific integrated circuit (ASIC).
Indeed, classical mobile devices are equipped with an image signal processor (ISP), a fixed-functionality image-processing pipeline. In contrast to this, the PVC has flexible programmable functionality, not limited only to image processing.
The PVC in the Google Pixel 2 and 2 XL is labeled SR3HX X726C502.
The PVC in the Google Pixel 3 and 3 XL is labeled SR3HX X739F030.
Thanks to the PVC, the Pixel 2 and Pixel 3 obtained a mobile DxOMark of 98 and 101.
The latter one was the top-ranked single-lens mobile DxOMark score, tied with the iPhone XR.
Pixel Visual Core software
A typical image-processing program of the PVC is written in Halide. Currently, it supports just a subset of the Halide programming language, without floating-point operations and with limited memory access patterns.
Halide is a domain-specific language that lets the user decouple the algorithm and the scheduling of its execution.
In this way, the developer can write a program that is optimized for the target hardware architecture.
Pixel Visual Core ISA
The PVC has two types |
https://en.wikipedia.org/wiki/PG%283%2C2%29 | In finite geometry, PG(3,2) is the smallest three-dimensional projective space. It can be thought of as an extension of the Fano plane.
It has 15 points, 35 lines, and 15 planes. It also has the following properties:
Each point is contained in 7 lines and 7 planes
Each line is contained in 3 planes and contains 3 points
Each plane contains 7 points and 7 lines
Each plane is isomorphic to the Fano plane
Every pair of distinct planes intersect in a line
A line and a plane not containing the line intersect in exactly one point
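These counts can be checked directly by modelling PG(3,2) on the nonzero vectors of the four-dimensional vector space over GF(2); the short Python sketch below (an illustration, not taken from the article) enumerates the points, lines and planes as the 1-, 2- and 3-dimensional subspaces:
from itertools import combinations, product

# Points of PG(3,2): the nonzero vectors of GF(2)^4 (scalars are trivial over GF(2)).
points = [p for p in product((0, 1), repeat=4) if any(p)]          # 15 points

def span(vectors):
    """All nonzero GF(2)-linear combinations of the given vectors."""
    combos = set()
    for coeffs in product((0, 1), repeat=len(vectors)):
        v = tuple(sum(c * x for c, x in zip(coeffs, col)) % 2 for col in zip(*vectors))
        if any(v):
            combos.add(v)
    return frozenset(combos)

# A line is the span of two independent points (3 nonzero vectors);
# a plane is the span of three independent points (7 nonzero vectors).
lines  = {s for a, b in combinations(points, 2) if len(s := span([a, b])) == 3}
planes = {s for a, b, c in combinations(points, 3) if len(s := span([a, b, c])) == 7}

print(len(points), len(lines), len(planes))   # expected: 15 35 15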
Constructions
Construction from K6
Take a complete graph K6. It has 15 edges, 15 perfect matchings and 20 triangles. Create a point for each of the 15 edges, and a line for each of the 20 triangles and 15 matchings. The incidence structure between each triangle or matching (line) and its three constituent edges (points) induces a PG(3,2).
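A quick computational check of the ingredient counts in this construction (a sketch with arbitrarily chosen vertex labels, not code from the article):
from itertools import combinations

vertices = range(6)
edges = list(combinations(vertices, 2))                                    # 15 points
triangles = [set(combinations(t, 2)) for t in combinations(vertices, 3)]   # 20 triangles

def perfect_matchings(verts):
    """Yield every perfect matching of the complete graph on the given vertices."""
    if not verts:
        yield []
        return
    a, *rest = verts
    for b in rest:
        for m in perfect_matchings([v for v in rest if v != b]):
            yield [(a, b)] + m

matchings = [set(m) for m in perfect_matchings(list(vertices))]            # 15 matchings
print(len(edges), len(triangles), len(matchings))                          # 15 20 15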
Construction from Fano planes
Take a Fano plane and apply all 5040 permutations of its 7 points. Discard duplicate planes to obtain a set of 30 distinct Fano planes. Pick any of the 30, and pick the 14 others that have exactly one line in common with the first, not 0 or 3. The incidence structure between the 1+14 = 15 Fano planes and the 35 triplets they mutually cover induces a PG(3,2).
Representations
Tetrahedral depiction
PG(3,2) can be represented as a tetrahedron. The 15 points correspond to the 4 vertices + 6 edge-midpoints + 4 face-centers + 1 body-center. The 35 lines correspond to the 6 edges + 12 face-medians + 4 face-incircles + 4 altitudes from a face to the opposite vertex + 3 lines connecting the midpoints of opposite edges + 6 ellipses connecting each edge midpoint with its two non-neighboring face centers. The 15 planes consist of the 4 faces + the 6 "medial" planes connecting each edge to the midpoint of the opposite edge + 4 "cones" connecting each vertex to the incircle of the opposite face + one "sphere" with the 6 edge centers and the body center. This was described by Burkhar |
https://en.wikipedia.org/wiki/Pseudo-panspermia | Pseudo-panspermia (sometimes called soft panspermia, molecular panspermia or quasi-panspermia) is a well-supported hypothesis for a stage in the origin of life. The theory first asserts that many of the small organic molecules used for life originated in space (for example, being incorporated in the solar nebula, from which the planets condensed). It continues that these organic molecules were distributed to planetary surfaces, where life then emerged on Earth and perhaps on other planets. Pseudo-panspermia differs from the fringe theory of panspermia, which asserts that life arrived on Earth from distant planets.
Background
Theories of the origin of life have been current since the 5th century BC, when the Greek philosopher Anaxagoras proposed an initial version of panspermia: life arrived on earth from the heavens. In modern times, panspermia has little support amongst mainstream scientists.
Extraterrestrial creation of organic molecules
Interstellar molecules are formed by chemical reactions within very sparse interstellar or circumstellar clouds of dust and gas. Usually this occurs when a molecule becomes ionised, often as the result of an interaction with cosmic rays. This positively charged molecule then draws in a nearby reactant by electrostatic attraction of the neutral molecule's electrons. Molecules can also be generated by reactions between neutral atoms and molecules, although this process is generally slower. The dust plays a critical role of shielding the molecules from the ionizing effect of ultraviolet radiation emitted by stars. The Murchison meteorite contains the organic molecules uracil and xanthine, which must therefore already have been present in the early Solar System, where they could have played a role in the origin of life.
Nitriles, key molecular precursors of the RNA World scenario, are among the most abundant chemical families in the universe and have been found in molecular clouds in the center of the Milky Way, protostars of di |
https://en.wikipedia.org/wiki/Oxford%20Concordance%20Program | The Oxford Concordance Program (OCP) was first released in 1981 and was a result of a project started
in 1978 by Oxford University Computing Services (OUCS) to create a machine independent text analysis program for producing word lists, indexes and concordances in a variety of languages and alphabets.
In the 1980s it was claimed to have been licensed to around 240 institutions in 23 countries.
History
OCP was designed and written in FORTRAN by Susan Hockey and Ian Marriott of Oxford University Computing Services in the period 1979–1980 and its authors acknowledged that it owed much to the earlier COCOA and CLOC (University of Birmingham) concordance systems.
During 1985–86 OCP was completely rewritten as version 2 to increase the efficiency of the program; a version called Micro-OCP was also produced for the IBM PC.
See also
Concordance (publishing)
References
History of software
Digital humanities |
https://en.wikipedia.org/wiki/Ruzsa%E2%80%93Szemer%C3%A9di%20problem | In combinatorial mathematics and extremal graph theory, the Ruzsa–Szemerédi problem or (6,3)-problem asks for the maximum number of edges in a
graph in which every edge belongs to a unique triangle.
Equivalently it asks for the maximum number of edges in a balanced bipartite graph whose edges can be partitioned into a linear number of induced matchings, or the maximum number of triples one can choose from n points so that every six points contain at most two triples. The problem is named after Imre Z. Ruzsa and Endre Szemerédi, who first proved that its answer is smaller than n² by a slowly-growing (but still unknown) factor.
Equivalence between formulations
The following questions all have answers that are asymptotically equivalent: they differ by, at most, constant factors from each other.
What is the maximum possible number of edges in a graph with n vertices in which every edge belongs to a unique triangle? The graphs with this property are called locally linear graphs or locally matching graphs.
What is the maximum possible number of edges in a bipartite graph with n vertices on each side of its bipartition, whose edges can be partitioned into n induced subgraphs that are each matchings?
What is the largest possible number of triples of points that one can select from n given points, in such a way that every six points contain at most two of the selected triples?
The Ruzsa–Szemerédi problem asks for the answer to these equivalent questions.
To convert the bipartite graph induced matching problem into the unique triangle problem, add a third set of vertices to the graph, one for each induced matching, and add edges from both endpoints of each bipartite edge to the vertex representing the induced matching that contains that edge.
The result is a balanced tripartite graph with 3n vertices and the unique triangle property. In the other direction, an arbitrary graph with the unique triangle property can be made into a balanced tripartite graph by choosing a pa |
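The first direction of this conversion is straightforward to express in code. The sketch below (a toy illustration with assumed labels, not taken from the source) builds the tripartite graph from a small bipartite graph whose edges are split into induced matchings, and checks the unique-triangle property:
def tripartite_from_matchings(matchings):
    # Each bipartite edge (u, v) in matching k becomes the triangle
    # {("L", u), ("R", v), ("M", k)} of the tripartite graph.
    edges = set()
    for k, matching in enumerate(matchings):
        for u, v in matching:
            a, b, c = ("L", u), ("R", v), ("M", k)
            edges |= {frozenset((a, b)), frozenset((b, c)), frozenset((a, c))}
    return edges

# Toy bipartite graph: edges (0,0), (1,1), (0,2), split into two induced matchings.
matchings = [[(0, 0), (1, 1)], [(0, 2)]]
E = tripartite_from_matchings(matchings)
V = {v for e in E for v in e}

# Unique-triangle property: every edge lies in exactly one triangle.
def common_neighbors(x, y):
    return [z for z in V if frozenset((x, z)) in E and frozenset((y, z)) in E]

assert all(len(common_neighbors(*e)) == 1 for e in E)
print(len(V), len(E))   # 7 vertices, 9 edges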
https://en.wikipedia.org/wiki/Iron%20law%20of%20processor%20performance | In computer architecture, the iron law of processor performance (or simply iron law of performance) describes the performance trade-off between complexity and the number of primitive instructions that processors use to perform calculations. This formulation of the trade-off spurred the development of Reduced Instruction Set Computers (RISC) whose instruction set architectures (ISAs) leverage a smaller set of core instructions to improve performance. The term was coined by Douglas Clark based on research performed by Clark and Joel Emer in the 1980s.
Explanation
The performance of a processor is the time it takes to execute a program: time/program. This can be further broken down into three factors: time/program = (instructions/program) × (clock cycles/instruction) × (time/clock cycle).
Selection of an instruction set architecture affects instructions/program, whereas time/clock cycle is largely determined by the manufacturing technology. Classic Complex Instruction Set Computer (CISC) ISAs optimized instructions/program by providing a larger set of more complex CPU instructions. Generally speaking, however, complex instructions inflate the number of clock cycles per instruction because they must be decoded into simpler micro-operations actually performed by the hardware. After converting X86 binary to the micro-operations used internally, the total number of operations is close to what is produced for a comparable RISC ISA. The iron law of processor performance makes this trade-off explicit and pushes for optimization of time/program as a whole, not just a single component.
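A small numerical illustration of the trade-off (the figures below are invented purely to show how the three factors combine, not measurements of any real processor):
# Iron law: time/program = (instructions/program) * (cycles/instruction) * (seconds/cycle)
def runtime(instructions, cpi, clock_hz):
    return instructions * cpi / clock_hz

# A CISC-like design: fewer, more complex instructions but more cycles each.
cisc = runtime(instructions=1.0e9, cpi=3.0, clock_hz=2.0e9)
# A RISC-like design: more instructions, but pipelining keeps CPI near 1.
risc = runtime(instructions=1.4e9, cpi=1.1, clock_hz=2.0e9)

print(f"CISC-like: {cisc:.2f} s, RISC-like: {risc:.2f} s")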
While the iron law is credited for sparking the development of RISC architectures, it does not imply that a simpler ISA is always faster. If that were the case, the fastest ISA would consist of simple binary logic. A single CISC instruction can be faster than the equivalent set of RISC instructions when it enables multiple micro-operations to be performed in a single clock cycle. In practice, however, the regularity of RISC instructions allowed a pipelined implementation where the total execution time of an instruction was (typically) ~5 clock cycl |
https://en.wikipedia.org/wiki/Donald%20H.%20Sinnott | Donald Hugh Sinnott is an Australian engineer and academic notable in the area of radar. His expertise is in applied electromagnetics, including radio and radar systems, antennas and radio propagation, signal processing and global navigation satellite systems (GPS and related systems). He played a major role in development of Australia's Jindalee over-the-horizon radar system.
Early life and education
He received his PhD from Syracuse University, in 1972, under Roger F. Harrington.
Career and research
As a researcher in radio and radar technologies, he worked for many years at DST, supporting and advising Australia's Department of Defence. He played a major role in Australia's development of over-the-horizon radar, embodied in Australia's world-leading Jindalee project, before moving to senior defence management positions.
He was Chief of a number of Australia's Defence Science and Technology Group research Divisions in sensing and IT disciplines (1987–2000), the Department of Defence's Canberra-based First Assistant Secretary Science Policy (1995–1997), CEO of the Cooperative Research Centre for Sensor Signal and Information Processing and Company Board Chairman of the CRC's spin-off companies (2000–2003). He is currently a board member for the CRC on Contamination Assessment and Remediation of the Environment (CRC CARE).
He is an adjunct professor with the University of Adelaide, Australia, where he has taught and supervised higher-degree students. He is a fellow of Engineers Australia and the IEEE. In 2014 he was awarded the M. A. Sargent Medal, for which the citation refers to his 'eminence and leadership' in his field.
He authored the book Radar Men: A. P. Rowe and John Strath in War and Peace, in 2016.
Honours and awards
M. A. Sargent Medal (2014)
References
External links
Radar Men
Australian electrical engineers
Living people
People from Adelaide
Syracuse University College of Engineering and Computer Science alumni
Fellow Members of the IEEE
Recipie |
https://en.wikipedia.org/wiki/Salem%E2%80%93Spencer%20set | In mathematics, and in particular in arithmetic combinatorics, a Salem-Spencer set is a set of numbers no three of which form an arithmetic progression. Salem–Spencer sets are also called 3-AP-free sequences or progression-free sets. They have also been called non-averaging sets, but this term has also been used to denote a set of integers none of which can be obtained as the average of any subset of the other numbers. Salem-Spencer sets are named after Raphaël Salem and Donald C. Spencer, who showed in 1942 that Salem–Spencer sets can have nearly-linear size. However a later theorem of Klaus Roth shows that the size is always less than linear.
Examples
For k = 1, 2, 3, ..., the smallest values of n such that the numbers from 1 to n have a k-element Salem–Spencer set are
1, 2, 4, 5, 9, 11, 13, 14, 20, 24, 26, 30, 32, 36, ...
For instance, among the numbers from 1 to 14, the eight numbers
{1, 2, 4, 5, 10, 11, 13, 14}
form the unique largest Salem-Spencer set.
This example is shifted by adding one to the elements of an infinite Salem–Spencer set, the Stanley sequence
0, 1, 3, 4, 9, 10, 12, 13, 27, 28, 30, 31, 36, 37, 39, 40, ...
of numbers that, when written as a ternary number, use only the digits 0 and 1. This sequence is the lexicographically first infinite Salem–Spencer set. Another infinite Salem–Spencer set is given by the cubes
0, 1, 8, 27, 64, 125, 216, 343, 512, 729, 1000, ...
It is a theorem of Leonhard Euler that no three cubes are in arithmetic progression.
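The Stanley sequence is easy to generate: reading the binary representation of 0, 1, 2, ... as a base-3 numeral yields exactly the numbers whose ternary digits are 0 and 1. A short sketch (illustrative, not from the article):
from itertools import combinations

def stanley(count):
    # Interpret the binary digits of 0, 1, 2, ... as base-3 digits.
    return [int(format(i, "b"), 3) for i in range(count)]

seq = stanley(16)
print(seq)   # 0, 1, 3, 4, 9, 10, 12, 13, 27, 28, 30, 31, 36, 37, 39, 40

# No three elements form an arithmetic progression.
assert not any(2 * b == a + c for a, b, c in combinations(seq, 3))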
Size
In 1942, Salem and Spencer published a proof that the integers in the range from 1 to n have large Salem–Spencer sets, of size n/e^(O(log n / log log n)). The denominator of this expression uses big O notation, and grows more slowly than any power of n, so the sets found by Salem and Spencer have a size that is nearly linear. This bound disproved a conjecture of Paul Erdős and Pál Turán that the size of such a set could be at most n^(1−δ) for some δ > 0.
The construction of Salem and Spencer was improved by Felix Behrend in 1946, who found sets of size |
https://en.wikipedia.org/wiki/Promiscuous%20gene%20expression | Promiscuous gene expression (PGE), formerly referred to as ectopic expression, is a process specific to the thymus that plays a pivotal role in the establishment of central tolerance. This phenomenon enables the generation of self-antigens, so-called tissue-restricted antigens (TRAs), which are expressed in the body only by one or a few specific tissues (antigens rank among TRAs if they are expressed by fewer than five of the sixty tissues tested). These antigens are represented for example by insulin from the pancreas or defensins from the gastrointestinal tract. Antigen-presenting cells (APCs) of the thymus, namely medullary thymic epithelial cells (mTECs), dendritic cells (DCs) and B cells, are capable of presenting peptides derived from TRAs to developing T cells (the thymus is the major site of T cell development) and thereby test whether their T cell receptors (TCRs) engage self entities and therefore whether their occurrence in the body can potentially lead to the development of autoimmune disease. In that case, thymic APCs either induce apoptosis in these autoreactive T cells (negative selection) or they deviate them to become T regulatory cells (Treg selection), which suppress self-reactive T cells in the body that escaped negative selection in the thymus. Thus, PGE is crucial for the protection of tissues against autoimmunity.
Characteristics of PGE in distinct cell types
The usual level of gene expression in peripheral tissues (e.g. spleen, kidney, liver, etc.) reaches about 60% of the mouse coding genome. Some peripheral tissues, including the lungs, brain and testis, reveal a repertoire of expressed genes about 10% broader. Importantly, PGE in the thymus, which is mediated by a unique subset of epithelial cells called mTECs, triggers expression of the vast majority of the genes from the whole genome (~85%). Such a broad repertoire of expressed genes has not been shown in any other tissue of the body.
mTECs
The process of PGE in the thymus was discovered in the late 1980s; however, it took |
https://en.wikipedia.org/wiki/Felix%20Behrend | Felix Adalbert Behrend (23 April 1911 – 27 May 1962) was a German mathematician of Jewish descent who escaped Nazi Germany and settled in Australia. His research interests included combinatorics, number theory, and topology. Behrend's theorem and Behrend sequences are named after him.
Life
Behrend was born on 23 April 1911 in Charlottenburg, a suburb of Berlin. He was one of four children of Dr. Felix W. Behrend, a politically liberal mathematics and physics teacher. Although of Jewish descent, their family was Lutheran. Behrend followed his father in studying both mathematics and physics, both at Humboldt University of Berlin and the University of Hamburg, and completed a doctorate in 1933 at Humboldt University. His dissertation, Über numeri abundantes [On abundant numbers] was supervised by Erhard Schmidt.
With Adolf Hitler's rise to power in 1933, Behrend's father lost his job, and Behrend himself moved to Cambridge University in England to work with Harold Davenport and G. H. Hardy. After taking work with a life insurance company in Zurich in 1935 he was transferred to Prague, where he earned a habilitation at Charles University in 1938 while continuing to work as an actuary. He left Czechoslovakia in 1939, just before the war reached that country, and returned through Switzerland to England, but was deported on the HMT Dunera to Australia as an enemy alien in 1940.
Although both Hardy and J. H. C. Whitehead intervened for an early release, he remained in the prison camps in Australia, teaching mathematics there to the other internees.
After Thomas MacFarland Cherry added to the calls for his release, he gained his freedom in 1942 and began working at the University of Melbourne. He remained there for the rest of his career, and married a Hungarian dance teacher in 1945 in the Queen's College chapel; they had two children. Although his highest rank was associate professor, Bernhard Neumann writes that "he would have been made a (personal) professor" if not f |
https://en.wikipedia.org/wiki/CLOC | CLOC (an acronym derived from CoLOCation) was a first generation general purpose text analyzer program. It was produced at the University of Birmingham and could produce concordances as well as word lists and collocational analysis of text. First-generation concordancers were typically held on a mainframe computer and used at a single site; individual research teams would build their own concordancer and use it on the data they had access to locally, any further analysis was done by separate programs.
History
CLOC was written by Alan Reed in Algol 68-R, which was available only on the ICT 1900 series of computers at that time. Perhaps because it was designed for use in a department of linguistics rather than by computer specialists, it had the distinction of a comparatively simple user interface; it also had some useful features for studying collocations, or the co-occurrence of words.
CLOC was used in the COBUILD project that was headed by Professor John Sinclair.
Further reading
References
History of software |
https://en.wikipedia.org/wiki/EP%20matrix | In mathematics, an EP matrix (or range-Hermitian matrix or RPN matrix) is a square matrix A whose range is equal to the range of its conjugate transpose A*. Another equivalent characterization of EP matrices is that the range of A is orthogonal to the nullspace of A. Thus, EP matrices are also known as RPN (Range Perpendicular to Nullspace) matrices.
EP matrices were introduced in 1950 by Hans Schwerdtfeger, and since then, many equivalent characterizations of EP matrices have been investigated in the literature. The EP abbreviation originally stood for Equal Principal, but it is widely believed that it stands for Equal Projectors instead, since an equivalent characterization of EP matrices is the equality of the projectors AA+ and A+A.
The range of any matrix A is perpendicular to the null-space of A*, but is not necessarily perpendicular to the null-space of A. When A is an EP matrix, the range of A is precisely perpendicular to the null-space of A.
Properties
An equivalent characterization of an EP matrix A is that A commutes with its Moore-Penrose inverse, that is, the projectors AA+ and A+A are equal. This is similar to the characterization of normal matrices where A commutes with its conjugate transpose. As a corollary, nonsingular matrices are always EP matrices.
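This characterization gives a simple numerical test for the EP property (a sketch using NumPy's pseudoinverse; the tolerance is an arbitrary choice):
import numpy as np

def is_ep(A, tol=1e-10):
    """A square matrix is EP iff it commutes with its Moore-Penrose inverse,
    i.e. the projectors A A+ and A+ A coincide."""
    A_pinv = np.linalg.pinv(A)
    return np.allclose(A @ A_pinv, A_pinv @ A, atol=tol)

# Any nonsingular matrix is EP; this singular, non-normal example is not.
print(is_ep(np.array([[2.0, 1.0], [1.0, 3.0]])))   # True (nonsingular)
print(is_ep(np.array([[0.0, 1.0], [0.0, 0.0]])))   # False (range equals nullspace here)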
The sum of EP matrices Ai is an EP matrix if the null-space of the sum is contained in the null-space of each matrix Ai.
To be an EP matrix is a necessary condition for normality: A is normal if and only if A is EP matrix and AA*A2 = A2A*A.
When A is an EP matrix, the Moore-Penrose inverse of A is equal to the group inverse of A.
A is an EP matrix if and only if the Moore-Penrose inverse of A is an EP matrix.
Decomposition
The spectral theorem states that a matrix is normal if and only if it is unitarily similar to a diagonal matrix.
Weakening the normality condition to EPness, a similar statement is still valid. Precisely, a matrix A of rank r is a |
https://en.wikipedia.org/wiki/Acxiom | Acxiom (pronounced "ax-ee-um") is a Conway, Arkansas-based database marketing company. The company collects, analyzes and sells customer and business information used for targeted advertising campaigns. The company was formed in 2018 when Acxiom Corporation (since renamed LiveRamp) spun off its Acxiom Marketing Services (AMS) division to global advertising network Interpublic Group of Companies.
The company has offices in the United States, Europe and Asia.
History
The business that became Acxiom was founded in 1969 as Demographics, Inc. by Charles D. Ward in Conway, Arkansas. In 1988 it became Acxiom Corporation, and by 2012, the New York Times reported that the company had the world’s largest commercial database on consumers.
On May 14, 2014, Acxiom Corporation announced that it had acquired LiveRamp, a data onboarding company, for $310 million.
In January 2017, Acxiom Corporation launched Audience Cloud, an anonymous targeting tool that allowed demographic segmentation of customers without revealing their actual identities.
In February 2018, Acxiom Corporation announced a reorganization into two divisions - Acxiom Marketing Solutions (AMS) and LiveRamp. In July, advertising company Interpublic Group of Companies (IPG) announced they were buying Acxiom Corporation's AMS business for $2.3 billion. In September, Acxiom Corporation officially changed its name to LiveRamp, allowing the AMS business owned by IPG to keep the Acxiom name. Also in September, the company introduced an open data framework allowing clients to combine online and offline data sources.
In December 2019, Acxiom integrated its data on AWS Data Exchange.
Business
Acxiom provides anonymized customer data to marketers, allowing the delivery of more relevant ads to consumers, with more effective measurement.
Acxiom's client base in the United States consists primarily of companies in the financial, insurance and investment services, automotive, retail, telecommunications, healthcare, travel |
https://en.wikipedia.org/wiki/Randomized%20benchmarking | Randomized benchmarking is an experimental method for measuring the average error rates of quantum computing hardware platforms. The protocol estimates the average error rates by implementing long sequences of randomly sampled quantum gate operations.
Randomized benchmarking is the industry-standard protocol used by quantum hardware developers such as IBM and Google to test the performance of the quantum operations.
The original theory of randomized benchmarking, proposed by Joseph Emerson and collaborators, considered the implementation of sequences of Haar-random operations, but this had several practical limitations. The now-standard protocol for randomized benchmarking (RB) relies on uniformly random Clifford operations, as proposed in 2006 by Dankert et al. as an application of the theory of unitary t-designs. In current usage randomized benchmarking sometimes refers to the broader family of generalizations of the 2005 protocol involving different random gate sets that can identify various features of the strength and type of errors affecting the elementary quantum gate operations. Randomized benchmarking protocols are an important means of verifying and validating quantum operations and are also routinely used for the optimization of quantum control procedures.
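The data analysis behind an RB experiment reduces to fitting an exponential decay. The sketch below (synthetic numbers and the standard single-qubit conversion formula; not code from any particular toolkit) shows that fitting step:
import numpy as np
from scipy.optimize import curve_fit

# Fit the average survival probability versus Clifford sequence length m to the
# decay model A*p**m + B, then convert the decay parameter p into an error rate.
def rb_decay(m, A, p, B):
    return A * p**m + B

# Synthetic data standing in for measured sequence fidelities.
lengths = np.array([1, 2, 5, 10, 20, 50, 100, 200])
survival = 0.5 * 0.99**lengths + 0.5 + np.random.normal(0, 0.002, lengths.size)

(A, p, B), _ = curve_fit(rb_decay, lengths, survival, p0=(0.5, 0.98, 0.5))
d = 2                                  # Hilbert-space dimension for one qubit
avg_error = (d - 1) * (1 - p) / d      # standard single-qubit conversion
print(f"decay p = {p:.4f}, average error per Clifford = {avg_error:.2e}")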
Overview
Randomized benchmarking offers several key advantages over alternative approaches to error characterization. For example, the number of experimental procedures required for full characterization of errors (called tomography) grows exponentially with the number of quantum bits (called qubits). This makes tomographic methods impractical for even small systems of just 3 or 4 qubits. In contrast, randomized benchmarking protocols are the only known approaches to error characterization that scale efficiently as number of qubits in the system increases. Thus RB can be applied in practice to characterize errors in arbitrarily large quantum processors. Additionally, in experimental quantum comp |
https://en.wikipedia.org/wiki/Signs%20Of%20LIfe%20Detector | Signs Of LIfe Detector (SOLID) is an analytical instrument under development to detect extraterrestrial life in the form of organic biosignatures obtained from a core drill during planetary exploration.
The instrument is based on fluorescent immunoassays and it is being developed by the Spanish Astrobiology Center (CAB) in collaboration with the NASA Astrobiology Institute. SOLID is currently undergoing testing for use in astrobiology space missions that search for common biomolecules that may indicate the presence of extraterrestrial life, past or present. The system was validated in field tests and engineers are looking into ways to refine the method and miniaturize the instrument further.
Science background
Modern astrobiology inquiry has emphasized the search for water on Mars, chemical biosignatures in the permafrost, soil and rocks at the planet's surface, and even biomarker gases in the atmosphere that may give away the presence of past or present life. The detection of preserved organic molecules of unambiguous biological origin is fundamental for the confirmation of present or past life, but the 1976 Viking lander biological experiments failed to detect organics on Mars, and it is suspected it was because of the combined effects of heat applied during analysis and the unexpected presence of oxidants such as perchlorates in the Martian soil. The recent discovery of near surface ground ice on Mars supports arguments for the long-term preservation of biomolecules on Mars.
SOLID demonstrated that antibodies are unaffected by acidity, heat and oxidants such as perchlorates, and it has emerged as a viable choice for an astrobiology mission directly searching for biosignatures.
For a time, the ExoMars' Rosalind Franklin rover was planned to carry a similar instrument called Life Marker Chip.
Instrument
SOLID was designed for automatic in situ detection and identification of substances from liquid and crushed samples under the conditions of outer space. The |
https://en.wikipedia.org/wiki/Robert%20Shostak | Robert Eliot Shostak (born July 26, 1948, in Arlington, Virginia) is an American computer scientist and Silicon Valley entrepreneur. He is most noted academically for his seminal work in the branch of distributed computing known as Byzantine Fault Tolerance. He is also known for co-authoring the Paradox Database, and most recently, the founding of Vocera Communications, a company that makes wearable, Star Trek-like communication badges.
Shostak has authored more than forty academic papers and patents, and was editor of the 7th Conference on Automated Deduction. He has Erdős number 2 through his collaboration with Kenneth Kunen.
Shostak is a brother of Seth Shostak, who is Senior Astronomer at the SETI Institute and who frequently appears on television and radio.
Education
Robert Shostak was born into a Jewish family in Arlington, Virginia, the son of Arthur and Bertha Shostak (née Gortenburg); his father was an electrical engineer. He studied mathematics and computer science at Harvard College, graduating in 1970 with high honors. As part of his senior dissertation work, he designed and built one of the earliest personal computers using discrete RTL logic (microprocessors were not yet available) and a magnetic core memory. He continued at Harvard to earn his A.M. degree and Ph.D. in Computer Science in 1974. While at Harvard he was awarded the Detur Book Prize, and fellowships from IBM and the National Science Foundation.
Career
Afterwards, Shostak joined the research staff in the Computer Science Lab (CSL) at SRI International (formerly the Stanford Research Institute) in Menlo Park, California. Much of his work there focused on automated theorem proving, and specifically on the development of decision procedure algorithms for mechanized proof of the kinds of mathematical formulas that occur frequently in the formal verification of correctness of computer programs.
In collaboration with CSL's Richard L. Schwartz and P. Michael Melliar-Smith, Shostak |
https://en.wikipedia.org/wiki/Engineering%20consulting | Engineering consulting is the practice of performing engineering as a Consulting Engineer. It assists individuals, public and private companies with process management, idea organization, product design, fabrication, MRO (Maintenance, Repair and Operations), servicing, tech advice, tech specifications, tech estimating, costing, budgeting, valuation, branding, and marketing.
Engineering consulting firms may involve Civil, Structural, Mechanical, Electrical, Environmental, Chemical, Industrial, Agricultural, Electronics and Telecom, Computer and Network, Instrumentation and Control, IT, Manufacturing and Production, Aerospace, Marine, Fire and Safety, etc.
Education
In certain countries, the title "Consulting Engineer" lacks legal protection, while in other countries, it necessitates a minimum of a Bachelor's Degree in Engineering and a Government License.
References
See also
FIDIC
Consulting by type
Engineering disciplines |
https://en.wikipedia.org/wiki/Surveying%20in%20Oceania | Surveying in Oceania may refer to:
Surveying in Australia
Surveying in New Zealand
Surveying |
https://en.wikipedia.org/wiki/Tammann%20Commemorative%20Medal | The Tammann Commemorative Medal is awarded once a year and was established in remembrance of Gustav Heinrich Johann Apollon Tammann. It honors members of the Deutsche Gesellschaft für Materialkunde, who have made outstanding contributions to the field of materials research.
Awardees
1973 Heinrich Wollenberger
1974 Hans Hillmann
1975 Manfred Wilkens
1976 Otmar Vöhringer
1977 Werner Pepperhoff
1978 Heinrich Mecking
1979 Theodor Hehenkamp
1980 Ulrich Heubner
1981 Helmut Holleck
1982 Friedrich Pfeifer
1983 Sigfried Steeb
1984 Christian Herzig
1986 Rudolf Akeret
1988 Florian Schubert
1989 Hans Paul Hougardy
1990 Herbert Stephan
1991 Georg Grathwohl
1992 Ernst-Theo Henig
1993 Wolfgang Gust
1994 Gerhard Inden
1995 Gerhard Sauthoff
1996 Hans Jürgen Grabke
1997 Günter Lange
1998 Hans-Georg Sockel
1999 Fritz Appel
2000 Gerhard K. Wolf
2001 Hans-Eckhardt Schaefer
2002 Dmitri Molodov
2004 Hermann Riedel
2005 Gunther Eggeler
2006 Stefanie Tschegg
2007 Jürgen Hirsch
2008 Rainer Schmid-Fetzer
2009 Reinhard Pippan
2011 Werner Skrotzki
2012 Ralf Riedel
2013 Ulrich Martin
2014 Peter Uggowitzer
2015 Willem J. Quadakkers
2016 Birgit Skrotzki
2017 Michael Zehetbauer
2018 Robert Danzer
2019 Jiří Svoboda
2020 Christos G. Aneziris
References
German science and technology awards
Awards established in 1973
Materials science awards
Research awards |
https://en.wikipedia.org/wiki/Human%20interactions%20with%20insects%20in%20southern%20Africa | Various cultures throughout Africa utilize insects for many things and have developed unique interactions with insects: as food sources, for sale or trade in markets, or for use in traditional practices and rituals, as ethnomedicine or as part of their traditional ecological knowledge. As food, also known as entomophagy, a variety of insects are collected as part of a protein rich source of nutrition for marginal communities. Entomophagy had been part of traditional culture throughout Africa, though this activity has been diminishing gradually with the influx of Western culture and market economies. Often the collection of insects for food has been the activity of children, both male and female.
Within Southern Africa different communities have established practices for regulating and maintaining their insect harvests. Some groups, through taboos, ritual, and hierarchical organizational structures acting as regulating bodies, have maintained their traditional practice for centuries. They monitor the development of certain caterpillar species' life cycles to ensure proper time frame for harvesting and sustainability.
Understanding the diversity of relationships to nature is a crucial aspect of fully grasping and contending with the challenges of modernity and ecology. According to the Food and Agriculture Organization of the United Nations report from January 2012, it has been recommended that insects be utilized both for human consumption as well as for animal feed. However, as the climate changes many agencies are reporting on the risk of the decline in insect populations within the larger ongoing phenomenon of biodiversity loss and how it may affect the world's ecology.
Southern Africa
Blouberg, Limpopo
Maize is a staple crop of Blouberg, Limpopo. Yet due to the processing methods of removing the germ and pericarp, maize is a poor source of protein which often requires supplementation. Within the Blouberg Region, Limpopo, there are some 30 species of insect w |
https://en.wikipedia.org/wiki/Popov%20criterion | In nonlinear control and stability theory, the Popov criterion is a stability criterion discovered by Vasile M. Popov for the absolute stability of a class of nonlinear systems whose nonlinearity must satisfy an open-sector condition. While the circle criterion can be applied to nonlinear time-varying systems, the Popov criterion is applicable only to autonomous (that is, time invariant) systems.
System description
The sub-class of Lur'e systems studied by Popov is described by:
where x ∈ Rn, ξ,u,y are scalars, and A,b,c and d have commensurate dimensions. The nonlinear element Φ: R → R is a time-invariant nonlinearity belonging to open sector (0, ∞), that is, Φ(0) = 0 and yΦ(y) > 0 for all y not equal to 0.
Note that the system studied by Popov has a pole at the origin and there is no direct pass-through from input to output, and the transfer function from u to y is given by
Criterion
Consider the system described above and suppose
A is Hurwitz
(A,b) is controllable
(A,c) is observable
d > 0 and
Φ ∈ (0,∞)
then the system is globally asymptotically stable if there exists a number r > 0 such that
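The displayed formulas in this entry did not survive extraction; the following is a hedged LaTeX reconstruction based on the standard textbook statement of the Popov criterion, with notation matching the surrounding text (it should be checked against the original source):
\dot{x} = A x + b u, \qquad \dot{\xi} = u, \qquad y = c x + d \xi, \qquad u = -\Phi(y)
H(s) = \frac{d}{s} + c\,(sI - A)^{-1} b
\exists\, r > 0 : \quad \inf_{\omega \in \mathbb{R}} \operatorname{Re}\big[(1 + j\omega r)\, H(j\omega)\big] > 0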
See also
Circle criterion
References
Nonlinear control
Stability theory |
https://en.wikipedia.org/wiki/Videotex%20character%20set | The character sets used by Videotex are based, to greater or lesser extents, on ISO/IEC 2022. Three Data Syntax systems are defined by ITU T.101, corresponding to the Videotex systems of different countries.
Data Syntax 1
Data Syntax 1 is defined in Annex B of T.101:1994. It is based on the CAPTAIN system used in Japan. Its graphical sets include JIS X 0201 and JIS X 0208.
The following G-sets are available through ISO/IEC 2022-based designation escapes:
Mosaic sets for Data Syntax 1
The mosaic sets supply characters for use in semigraphics.
Data Syntax 2
Data Syntax 2 is defined in Annex C of T.101:1994. It corresponds to some European Videotex systems such as CEPT T/CD 06-01. The graphical character coding of Data Syntax 2 is based on T.51.
The default G2 set of Data Syntax 2 is based on an older version of T.51, lacking the non-breaking space, soft hyphen, not sign (¬) and broken bar (¦) present in the current version, but adding a dialytika tonos (΅—combining form is U+0344) at the beginning of the row of diacritical marks for combination with codes from a Greek primary set. An umlaut diacritic code distinct from the diaeresis code, as included in some versions of T.61, is also sometimes included.
The default G1 set is the second mosaic set, corresponding roughly to the second mosaic set of Data Syntax 1. The default G3 set is the third mosaic set, matching the first mosaic set of Data Syntax 1 for 0x60 through 0x6D and 0x70 through 0x7D, and otherwise differing. The first mosaic set matches the second except for 0x40 through 0x5E: 0x40 through 0x5A follow ASCII (supplying uppercase letters), whereas the remainder are national variant characters; the displaced full block is placed at 0x7F.
Representation of 0x5B-5E is not guaranteed in international communication and may be replaced by national application oriented variants.
0x5F may be displayed either as ⌗ (square) or _ (lower bar) to represent the terminator function required by |
https://en.wikipedia.org/wiki/Qiu%20Dahong | Qiu Dahong (; born 6 April 1930) is a Chinese coastal and offshore engineer. He served as chief engineer of the Dalian Fishing Port, the New Dalian Port, the Qinhuangdao Petroleum Port, and many other projects. He is a professor of the Dalian University of Technology and directed the State Key Laboratory of Coastal and Offshore Engineering. He was elected an academician of the Chinese Academy of Sciences in 1991.
Biography
Qiu was born on 6 April 1930 in Shanghai, Republic of China, with his ancestral home in Huzhou, Zhejiang. After graduating from the Department of Civil Engineering of Tsinghua University in 1951, he joined the Dalian University of Technology (DUT), where he worked under Professor Qian Lingxi and helped create China's first port and harbour engineering program.
In 1958, Qiu was appointed chief engineer for the construction of Dalian Fishing Port at the age of 28. The port, designed to occupy of open water with docks for 300 fishing boats, was unprecedented in China in both scale and difficulty. When completed in 1966, it was Asia's largest fishing port. In 1987, Qiu again served as chief engineer for the port's expansion project, which was completed in 1989.
In 1973, Qiu became chief engineer of the New Dalian Port, the first port in China capable of handling oil tankers with a displacement of 100,000 tons. It was opened in 1976, and won a national gold medal for its design.
Qiu later led or participated in the design of the Qinhuangdao Petroleum Port, the Lianyungang Container Port, the Shenzhen Chiwan Port, the Hainan Petroleum Port, the Yamen Shipping Channel of the Pearl River estuary, and the Yangshan Port of Shanghai.
Qiu served as Director of the State Key Laboratory of Coastal and Offshore Engineering at DUT and was elected an academician of the Chinese Academy of Sciences in 1991. In 1992, he was elected a Central Committee member of the Jiusan Society.
Publications
Qiu published more than 100 scientific papers and multiple monog |
https://en.wikipedia.org/wiki/Neural%20style%20transfer | Neural style transfer (NST) refers to a class of software algorithms that manipulate digital images, or videos, in order to adopt the appearance or visual style of another image. NST algorithms are characterized by their use of deep neural networks for the sake of image transformation. Common uses for NST are the creation of artificial artwork from photographs, for example by transferring the appearance of famous paintings to user-supplied photographs. Several notable mobile apps use NST techniques for this purpose, including DeepArt and Prisma. This method has been used by artists and designers around the globe to develop new artwork based on existent style(s).
Earlier style transfer algorithms
NST is an example of image stylization, a problem studied for over two decades within the field of non-photorealistic rendering. The first two example-based style transfer algorithms were image analogies and image quilting. Both of these methods were based on patch-based texture synthesis algorithms.
Given a training pair of images–a photo and an artwork depicting that photo–a transformation could be learned and then applied to create new artwork from a new photo, by analogy. If no training photo was available, it would need to be produced by processing the input artwork; image quilting did not require this processing step, though it was demonstrated on only one style.
NST
NST was first published in the paper "A Neural Algorithm of Artistic Style" by Leon Gatys et al., originally released on arXiv in 2015, and subsequently accepted by the peer-reviewed CVPR conference in 2016. The original paper used a VGG-19 architecture that had been pre-trained to perform object recognition using the ImageNet dataset.
In 2017, Google AI introduced a method that allows a single deep convolutional style transfer network to learn multiple styles at the same time. This algorithm permits style interpolation in real-time, even when done on video media.
Formulation
The process of NST ass |
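The formulation follows Gatys et al.: a content loss compares feature maps of a pre-trained CNN directly, while a style loss compares their Gram matrices. A minimal NumPy sketch of these two losses (shapes and the weighting factor are illustrative assumptions, not values from the paper):
import numpy as np

def gram_matrix(features):
    """features: (channels, height, width) activation map from one CNN layer."""
    c, h, w = features.shape
    F = features.reshape(c, h * w)
    return F @ F.T / (c * h * w)

def content_loss(gen_feat, content_feat):
    return np.mean((gen_feat - content_feat) ** 2)

def style_loss(gen_feat, style_feat):
    return np.mean((gram_matrix(gen_feat) - gram_matrix(style_feat)) ** 2)

# Toy activations standing in for VGG-19 feature maps.
rng = np.random.default_rng(0)
gen, content, style = (rng.standard_normal((64, 32, 32)) for _ in range(3))
total = content_loss(gen, content) + 1e3 * style_loss(gen, style)  # weighted sum
print(total)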
https://en.wikipedia.org/wiki/Net%20ecosystem%20production | Net ecosystem production (NEP) in ecology, limnology, and oceanography, is the difference between gross primary production (GPP) and net ecosystem respiration. Net ecosystem production represents all the carbon produced by plants in water through photosynthesis that does not get respired by animals, other heterotrophs, or the plants themselves.
Overview
Net ecosystem production describes the total carbon in an ecosystem that can be stored, exported, or oxidized back into carbon dioxide gas. NEP is written in units of mass of carbon per unit area per time, for example, grams carbon per square meter per year (g C m−2 yr−1). In a given ecosystem, carbon quantified as net ecosystem production can eventually end up: oxidized by fire or ultraviolet radiation, accumulated as biomass, exported as organic carbon to another system, or accumulated in sediments or soils. Carbon classified as NEP can be in the form of particles in the particulate organic carbon (POC) pool such as phytoplankton cells (living) and detritus (non-living), or it can be in the form of dissolved substances that have not yet been decomposed in the dissolved organic carbon (DOC) pool. In any form, if the carbon gets respired or decomposed by any living organism (plant, animal, bacteria, or other microscopic organism) to release carbon dioxide, that carbon no longer counts as NEP.
NEP = GPP - respiration [by plants] - respiration [by animals and other heterotrophs]
Net ecosystem production vs. net primary production
Net ecosystem production is all the carbon not respired, including respiration by plants and heterotrophic organisms such as animals and microbes. In contrast, net primary production (NPP) is all the carbon taken up by plants (autotrophs) minus the carbon that the plants themselves respire through cellular respiration.
NPP = GPP - respiration [by plants]
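A toy calculation showing how the two definitions differ (numbers invented for illustration only):
# Units: g C per square meter per year.
GPP = 1200.0              # gross primary production
R_plants = 700.0          # autotrophic (plant) respiration
R_heterotrophs = 350.0    # respiration by animals, microbes and other heterotrophs

NPP = GPP - R_plants                      # net primary production
NEP = GPP - R_plants - R_heterotrophs     # net ecosystem production
print(NPP, NEP)                           # 500.0 150.0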
Net community production
Net community production (NCP) is the difference between net primary production and respiration by animals a |
https://en.wikipedia.org/wiki/Memory%20T%20cell%20inflation | Memory T cell inflation phenomenon is the formation and maintenance of a large population of specific CD8+ T cells in reaction to cytomegalovirus (CMV) infection.
Cytomegalovirus (CMV)
CMV is a widespread virus which affects 60–80% of the human population in developed countries. The virus is spread through saliva or urine, and in healthy individuals it can survive under immune system control without any visible symptoms. The CMV life strategy is to integrate its DNA into the genome of the host cells and escape the mechanisms of natural immunity.
Infection
The immune response against CMV is primarily provided by CD8+ T cells, which recognize viral fragments in the MHC class I complex on the surface of infected cells and destroy these cells. Specific CD8+ T cells are generated in secondary lymphoid organs, where naïve T cells encounter cytomegalovirus antigen on antigen-presenting cells. This results in a population of migrating effector CD8+ T lymphocytes and a second, small population called central memory T cells that remains in the secondary lymphatic organs and the bone marrow. These cells are capable of responding and proliferating immediately after repeated pathogen recognition. The amount of memory cells generated as a response to cytomegalovirus is approximately 9.1%–10.2% of all circulating CD4+ and CD8+ memory cells.
Memory CD8 + T lymphocytes characteristics
Generally, these cells express a low level of node localization markers - CD62L, CCR7 and occur in peripheral organs. They retain their standard functions like cytokine production and cytotoxicity. They do not express costimulatory molecules (CD28) and PD-1 receptor inhibitors on the surface, but they express the inhibitory molecules KLRG1 and CD85.
Immunosenescence
Remodeling of the immune response and a reduced ability to protect individuals from infectious diseases are observed in relation to age. Especially in the elderly, long-term CMV infection leads to a rapid increase in the number of CMV-speci |
https://en.wikipedia.org/wiki/Vandi%20Verma | Vandana "Vandi" Verma is a space roboticist and chief engineer at NASA's Jet Propulsion Laboratory, known for driving the Mars rovers, notably Curiosity and Perseverance, using software including PLEXIL programming technology that she co-wrote and developed.
Biography
Verma was born and grew up partly in Halwara, India; her father was a pilot in the Indian Air Force. She gained her first qualification, a bachelor's degree in electrical engineering, at Punjab Engineering College in Chandigarh, India. She went on to gain a master's in robotics from Carnegie Mellon University (CMU)
followed by a PhD in robotics from Carnegie Mellon in 2005, with a thesis entitled Tractable Particle Filters for Robot Fault Diagnosis.
At CMU, she developed an interest in robotics in unknown environments. She was involved in a 3-year astrobiology experimental station in the Atacama desert. The desert was chosen because of the similarities between its hostile environment and the surface of Mars. She won a competition to create a robot to navigate a maze and collect balloons. She tested robotic technologies in the Arctic and Antarctic.
Between studies, she gained her pilot's license.
Her first post-graduate job was at Ames Research Center as a research scientist.
In 2006, Verma co-wrote PLEXIL, an open source programming language now used in automation technologies such as the NASA K10 rover, Mars Curiosity rover's percussion drill, the International Space Station, the Deep Space Habitat and Habitat Demonstration Unit, the Edison Demonstration of Smallsat Networks, LADEE, and Autonomy Operating System (AOS).
In 2007 Verma joined NASA's Jet Propulsion Laboratory (JPL) with a special interest in robotics and flight software and became part of the Mars rover team in 2008. As of 2019, she leads JPL's Autonomous Systems, Mobility and Robotic Systems group.
Verma has written academic papers in her field on subjects such as the AEGIS (Autonomous Exploration for Gathering Increased Science) |
https://en.wikipedia.org/wiki/Map%20projection%20of%20the%20triaxial%20ellipsoid | In geodesy, a map projection of the triaxial ellipsoid maps Earth or some other astronomical body modeled as a triaxial ellipsoid to the plane. Such a model is called the reference ellipsoid. In most cases, reference ellipsoids are spheroids, and sometimes spheres. Massive objects have sufficient gravity to overcome their own rigidity and usually have an oblate ellipsoid shape. However, minor moons and small Solar System bodies are not in hydrostatic equilibrium. Usually such bodies have irregular shapes. Furthermore, some gravitationally rounded objects may have a triaxial ellipsoid shape due to rapid rotation (such as Haumea) or unidirectional strong tidal forces (such as Io).
Examples
A triaxial equivalent of the Mercator projection was developed by John P. Snyder.
Equidistant map projections of a triaxial ellipsoid were developed by Paweł Pędzich.
Conic Projections of a triaxial ellipsoid were developed by Maxim Nyrtsov.
Equal-area cylindrical and azimuthal projections of the triaxial ellipsoid were developed by Maxim Nyrtsov.
Jacobi conformal projections were described by Carl Gustav Jacob Jacobi.
See also
Geodesics on a triaxial ellipsoid
Map projection
Reference ellipsoid
Jacobi ellipsoid
Latitude
Ellipsoidal coordinates
Planetary coordinate system
References
Map projections |
https://en.wikipedia.org/wiki/Manbang | Manbang () is a series of state-owned digital media players issued by North Korea's Korean Central Broadcasting Committee, providing over-the-top content in the form of channels. It was created in response to streaming platforms like Netflix and Roku in the west, and the popularity of Chinese-made Notel players in North Korea.
Manbang, which ironically translates to "everywhere", is only available to citizens in Pyongyang, Sinuiju and Sariwon. Due to North Korea's isolationism, users connect to the service not via the internet but via the state-controlled intranet using the IPTV protocol. It is hard to tell whether the underlying technology is IPTV or VOD, but according to descriptions it is a mixture of both.
Like Apple TV or Roku, the device is an Internet Protocol television (IPTV) receiver that works through a separate box. The system comes as a set-top box which first has to be connected to a modem and then to the phone line. The box can be connected to a television through an HDMI cable.
History
The exact release date of the Manbang system is unclear. One of the first set-top boxes appears to have been manufactured in 2015. A 2015 listing of intranet sites included a site named "Manbang", with the operator being Korean Central Television.
On 16 August 2016, Manbang appeared for the first time in a report "망 TV다매체열람기 '만방'" (roughly, "network TV multimedia viewer 'Manbang'") by Korean Central Television. It reported that the implementation of "Intranet" Protocol Television (IPTV), which runs on North Korea's Kwangmyong intranet, had begun. It also showcased a set-top box developed by the Manbang IT company, on which the service is based. KCTV also stated that the new service already had "several hundred users" and was "making the lives of citizens and children flourish".
On 25 August 2016, Netflix took a light-hearted jab at Manbang by changing its Twitter bio description to read: "Manbang knockoff."
During the following years it appears that the North Korean government has been working towards making the service widely accessible. It was reported that the de |
https://en.wikipedia.org/wiki/Audiomack | Audiomack is an on-demand music streaming and audio discovery platform that allows artists and creators to upload limitless music and podcasts for listeners through its mobile apps and website. In February 2021, Billboard announced Audiomack streaming data would begin informing some of its flagship charts, including the Hot 100, the Billboard 200, and the Global 200. In March 2021, Fast Company magazine named Audiomack one of the 10 most innovative companies in music.
History
Co-founded in 2012 by Dave Macli, David Ponte, Thomas Klinger, Ty Wangsness and Brian Zisook, Audiomack allowed artists to freely share their mixtapes, songs, and albums. In April 2013, J. Cole (Yours Truly 2) and Chance the Rapper (Acid Rap) released new projects exclusively on the platform. In September 2018, Eminem released "Killshot", a diss track about Machine Gun Kelly, exclusively on Audiomack and gained 8.6 million plays in four months. In February, 2019, Nicki Minaj released three songs exclusively on the platform, including a remix of Blueface's "Thotiana."
In November 2020, Audiomack signed a music licensing agreement with Warner Music Group, covering the United States, Canada, Jamaica, and five "key African territories," including Ghana, Kenya, Nigeria, South Africa, and Tanzania. In February 2021, Variety reported Audiomack has music licensing agreements in the United States with Universal Music Group and Sony Music Entertainment. Audiomack also receives music through licensing deals with labels and distributors such as 300 and EMPIRE, among others, and manages rights through its relationship with Songtrust.
In December 2020, Audiomack opened its monetization program, AMP, to all eligible creators based in the United States, Canada, and the United Kingdom. In July 2021, Audiomack expanded the program to creators worldwide and introduced a partnership with Ziiki Media to help promote artists across Africa.
In December 2021, Audiomack launched Supporters, a "feature that will en |
https://en.wikipedia.org/wiki/Systematic%20Reviews%20in%20Pharmacy | The Systematic Reviews in Pharmacy is a monthly peer-reviewed open-access medical journal covering pharmaceutics, biopharmaceutics, pharmaceutical chemistry, pharmacognosy, pharmacology, pharmaceutical analysis, pharmacy practice, clinical and Biomedical sciences.
Abstracting and indexing
The journal is abstracted and indexed in:
According to Scopus, the journal has a 2019 CiteScore of 3.9
References
External links
Open access journals
Monthly journals
English-language journals
Pharmacology journals
Academic journals established in 2009
2009 establishments in Malaysia
Medknow Publications academic journals
Biomedical engineering journals |
https://en.wikipedia.org/wiki/Behrend%20sequence | In number theory, a Behrend sequence is an integer sequence whose multiples include almost all integers. The sequences are named after Felix Behrend.
Definition
If A is a sequence of integers greater than one, and if M(A) denotes the set of positive integer multiples of members of A, then A is a Behrend sequence if M(A) has natural density one. This means that the proportion of the integers from 1 to n that belong to M(A) converges, in the limit of large n, to one.
Examples
The prime numbers form a Behrend sequence, because every integer greater than one is a multiple of a prime number. More generally, a subsequence of the prime numbers forms a Behrend sequence if and only if the sum of the reciprocals of its members diverges.
The semiprimes, the products of two prime numbers, also form a Behrend sequence. The only integers that are not multiples of a semiprime are the prime powers. But as the prime powers have density zero, their complement, the multiples of the semiprimes, have density one.
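These density statements can be checked numerically for a finite cutoff. The short Python sketch below is only an illustration: the helper names and the cutoff N are chosen here, not taken from the text. It estimates the fraction of the integers up to N that are multiples of some prime, and of some semiprime; the first fraction is essentially one, and the second tends to one as N grows, because the only integers missed are 1 and the prime powers.

def primes_up_to(n):
    # Simple sieve of Eratosthenes.
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
    return [i for i, is_p in enumerate(sieve) if is_p]

def density_of_multiples(seq, n):
    # Fraction of the integers 1..n that are a multiple of some member of seq.
    hits = sum(1 for m in range(1, n + 1) if any(m % a == 0 for a in seq if a <= m))
    return hits / n

N = 10_000
primes = primes_up_to(N)
semiprimes = sorted({p * q for p in primes for q in primes if p * q <= N})
print(density_of_multiples(primes, N))      # 0.9999: only 1 itself is not a multiple of a prime
print(density_of_multiples(semiprimes, N))  # lower at this cutoff, but tends to 1, since prime powers have density 0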
History
The problem of characterizing these sequences was described as "very difficult" by Paul Erdős in 1979.
These sequences were named "Behrend sequences" in 1990 by Richard R. Hall, with a definition using logarithmic density in place of natural density. Hall chose their name in honor of Felix Behrend, who proved that for a Behrend sequence A, the sum of the reciprocals of A must diverge. Later, Hall and Gérald Tenenbaum used natural density to define Behrend sequences in place of logarithmic density. This variation in definitions makes no difference in which sequences are Behrend sequences, because the Davenport–Erdős theorem shows that, for sets of multiples, having natural density one and having logarithmic density one are equivalent.
Derived sequences
When A is a Behrend sequence, one may derive another Behrend sequence by omitting from A any finite number of elements.
Every Behrend sequence may be decomposed into the disjoint union of infinitely many Behrend sequences.
References
Integer sequences |
https://en.wikipedia.org/wiki/Cost%20of%20reproduction%20hypothesis | In life history theory, the cost of reproduction hypothesis is the idea that reproduction is costly in terms of future survival and reproduction. This is mediated by various mechanisms, with the two most prominent being hormonal regulation and differential allocation of internal resources.
Definition and predictions
The cost of reproduction hypothesis posits that reproduction (and increased reproductive effort) is costly in terms of future survival and reproduction. These costs may be exacerbated in certain organisms, such as first-time breeders. Along with the idea that organisms are selected to maximize lifetime reproductive success, this hypothesis results in a trade-off between current reproduction and future fitness that is pivotal in life history theory. This trade-off can be analyzed on three levels: the genetic (which analyzes the genetic basis of covariation between traits), the phenotypic (which assesses how traits directly connected to fitness covary), and the intermediate level (which involves the analysis of the mechanisms connecting the genetic and phenotypic levels, like physiological mechanisms).
A major prediction of the cost of reproduction hypothesis is that the importance of the cost declines as an organism ages, resulting in increased reproductive effort in older organisms (this prediction is the terminal investment hypothesis). The cost of reproduction hypothesis also predicts that the optimal reproductive effort in a season is less than the effort that would maximize the number of offspring produced that season. This is especially true in organisms with a long lifespan, as their residual reproductive value (measured as the total reproductive value minus the current reproductive investment) would be higher compared to those with a shorter life.
Trade-offs and causes
Costs of reproduction arise from multiple factors, including physiological, ecological, and behavioural factors. The two most prominent physiological factors are hormones and diff |
https://en.wikipedia.org/wiki/Stephen%20Drury%20%28mathematician%29 | Stephen William Drury is an Anglo-Canadian mathematician and professor of mathematics at McGill University. He specializes in mathematical analysis, harmonic analysis and linear algebra. He received his doctorate from the University of Cambridge in 1970 under the supervision of Nicholas Varopoulos and completed his postdoctoral training at the Faculté des sciences d'Orsay, France. He was recruited to McGill by Professor Carl Herz in 1972.
Among other contributions, he solved the Sidon set union problem, worked on restrictions of Fourier and Radon transforms to curves, and generalized von Neumann's inequality. In operator theory, the Drury–Arveson space is named after William Arveson and him.
His research now pertains to the interplay between matrix theory and harmonic analysis and their applications to graph theory.
References
Year of birth missing (living people)
Living people
20th-century Canadian mathematicians
21st-century Canadian mathematicians
Academic staff of McGill University
Mathematical analysts
Alumni of the University of Cambridge
20th-century British mathematicians
British emigrants to Canada
21st-century British mathematicians |
https://en.wikipedia.org/wiki/List%20of%20astrometric%20solvers | Programs capable of astrometric solving:
The solvers Elbrus and Charon are obsolete and no longer developed.
External links
Astrometry.net webpage
All sky solver webpage
ANSVR webpage
Astrometry.net API lite
Astrotortilla webpage
CloudMakers webpage
Stellar Solver
ASTAP webpage
Regim webpage
Siril webpage
SIPS webpage
Tetra3 webpage
XParallax viu webpage
Astrometrica webpage
Observatory webpage
PinPoint webpage
PixInsight webpage
PlaneWave Instruments webpage
TheSky Astronomy Software webpage
Logiciel PRISM
PlateSolve3 info
Astronomical imaging
Astronomy software
astrometric solvers |
https://en.wikipedia.org/wiki/The%20Cypher%20%28video%20game%29 | The Cypher is an interactive fiction video game by EPG Multimedia Inc.
Plot and gameplay
The Cypher is a digital interactive fiction novel in eight episodic chapters that explores a murder mystery spanning ten centuries. Characters include a medieval lord, a 1900s archaeologist and a modern-day criminologist.
Players act as time-traveling detectives investigating two murders committed by the same murderer a thousand years apart. They explore numerous locations as they hunt for clues: a hidden chamber beneath a medieval castle, the offices of a Scotland Yard inspector, and the hotel room of a modern-day cryptologist. Along the way, players sift through forensic evidence, examine ancient manuscripts, decode encrypted emails, pore over illuminated books, or puzzle over ancient technology.
The story opens with John Shoresby, a cryptanalyst for SciCrypt Systems, who receives an email message from an unknown sender with no traceable source and no Internet identifier, as if from thin air. The message pleads for his help and asks him to look into a hundred-year-old missing person's case. When he downloads the email attachments, he recognizes the face of one of his ancestors, James Francis Ravenshim, an archeologist who vanished without a trace in the summer of 1900. Intrigued, he starts to dig into the case and discovers nothing is what it seems. James Francis’ disappearance wasn't the only inexplicable event that occurred all those years ago at Ravenshim Castle, his family's ancestral home. Weeks before, a man was found murdered in one of the underground chambers. Scotland Yard removed the body for an autopsy, but before they could examine it, the body shriveled up and mummified right before their eyes!
Shoresby continues to chase down clues, but faces obstruction at every turn. He discovers there may have been a cover-up of the murder and the disappearance, aided by Scotland Yard, which ended with the complete destruction of Ravenshim Castle. What was everyone hidin |
https://en.wikipedia.org/wiki/Fluorescence%20imaging | Fluorescence imaging is a type of non-invasive imaging technique that can help visualize biological processes taking place in a living organism. Images can be produced from a variety of methods including: microscopy, imaging probes, and spectroscopy.
Fluorescence itself is a form of luminescence that results from matter emitting light of a certain wavelength after absorbing electromagnetic radiation. Molecules that re-emit light upon absorption of light are called fluorophores.
Fluorescence imaging photographs fluorescent dyes and fluorescent proteins to mark molecular mechanisms and structures. It allows one to experimentally observe the dynamics of gene expression, protein expression, and molecular interactions in a living cell. It essentially serves as a precise, quantitative tool regarding biochemical applications.
Despite a common misconception, fluorescence differs from bioluminescence in how the proteins from each process produce light. Bioluminescence is a chemical process that involves enzymes breaking down a substrate to produce light. Fluorescence is the physical excitation of an electron and its subsequent return to the ground state, emitting light.
Attributes
Fluorescence mechanism
When a certain molecule absorbs light, the energy of the molecule is briefly raised to a higher excited state. The subsequent return to the ground state results in emission of fluorescent light that can be detected and measured. The emitted light, resulting from the absorbed photon of energy hν, has a specific wavelength. It is important to know this wavelength beforehand so that when an experiment is running, the measuring device knows what wavelength to be set at to detect light production. This wavelength is determined by the equation:
λ = hc / E
where h = Planck's constant, c = the speed of light, and E = the energy of the photon. Typically a large scanning device or CCD is used here to measure the intensity and digitally photograph an image.
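As a small worked illustration of that relation, assuming the standard form λ = hc/E (the example photon energy below is an arbitrary choice, not taken from the text), the wavelength can be computed directly:

# Wavelength from photon energy, lambda = h * c / E (values in SI units).
h = 6.626e-34          # Planck's constant, J*s
c = 2.998e8            # speed of light, m/s

def wavelength_from_energy(energy_joules):
    # Returns the wavelength in metres for a photon of the given energy.
    return h * c / energy_joules

# Example: a photon of about 2.4 eV (1 eV = 1.602e-19 J).
E = 2.4 * 1.602e-19
print(wavelength_from_energy(E) * 1e9, "nm")   # roughly 517 nm, in the green part of the spectrum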
Fluorescent dyes versus proteins
Fluorescent dyes, with no maturation time, offer higher photo sta |
https://en.wikipedia.org/wiki/Amplitwist | In mathematics, the amplitwist is a concept created by Tristan Needham in the book Visual Complex Analysis (1997) to represent the derivative of a complex function visually.
Definition
The amplitwist associated with a given function f is its derivative f′ in the complex plane. More formally, it is a complex number f′(z) such that in an infinitesimally small neighborhood of a point z in the complex plane, f(z + ξ) ≈ f(z) + f′(z)ξ for an infinitesimally small vector ξ. The complex number f′(z) is defined to be the derivative of f at z.
Uses
The concept of an amplitwist is used primarily in complex analysis to offer a way of visualizing the derivative of a complex-valued function as a local amplification and twist of vectors at a point in the complex plane.
Examples
Consider a function f and its derivative at a point z. Since the derivative of f at z is f′(z), an infinitesimal vector ξ at z is carried to approximately f′(z)ξ: it is amplified by the factor |f′(z)| and twisted through the angle arg f′(z).
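A short numerical illustration of this amplify-and-twist picture, using a hypothetical example function f(z) = z**3 chosen here only for demonstration, is sketched below; the modulus of the derivative gives the amplification and its argument gives the twist.

import cmath

# Hypothetical example function: f(z) = z**3, so f'(z) = 3*z**2.
def f(z):
    return z ** 3

def f_prime(z):
    return 3 * z ** 2

z = cmath.exp(1j * cmath.pi / 4)        # a sample point on the unit circle
amplitwist = f_prime(z)

amplification = abs(amplitwist)         # how much an infinitesimal vector at z is stretched
twist = cmath.phase(amplitwist)         # how much it is rotated, in radians
print(amplification, twist)             # 3.0 and pi/2 for this particular z

# Sanity check against the definition: f(z + xi) - f(z) is close to amplitwist * xi for tiny xi.
xi = 1e-6 * cmath.exp(0.3j)
print(abs((f(z + xi) - f(z)) - amplitwist * xi))   # a very small residual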
References
Functions and mappings
Complex analysis |
https://en.wikipedia.org/wiki/Unihertz%20Jelly | The Unihertz Jelly (also the enhanced model, Unihertz Jelly Pro) is an Android smartphone developed by Unihertz, which billed it as "the smallest 4G smartphone" upon its release in 2017. It was initially developed with a successful Kickstarter project which reached its $30,000 goal in just 57 minutes, and eventually raised over $1.25 million. The Unihertz Jelly Pro remains available for sale in over 60 countries, including through major retailers.
Specifications
Unihertz Jelly and Jelly Pro are both marketed as a phone with the full features of Android 7.0 Nougat and support for a 4G network, but are much more lightweight and compact than other phones with this same functionality. They feature a 2.45-inch display and weigh in at 60 grams.
In 2020, its successor, Unihertz Jelly 2, was released, which features a slightly larger display.
Software
Hardware
Reception
Unihertz was associated with a previous very small 3G smartphone, the Posh Micro X, which launched in 2015.
Reviews of the Jelly and Jelly Pro, the "world's smallest 4G smartphone", have been mixed, but the device drew international attention.
There have been accusations of poor battery performance and of network traffic possibly sending personal data to China. Responses claim the network traffic is used to speed up apps, and the company has been updating the phone software to improve performance; however, others have disputed this claim. Some critics have called the phone "innovative", praising it for its capabilities at such a small size.
References
Smart devices
Android (operating system) devices |
https://en.wikipedia.org/wiki/Differentiable%20programming | Differentiable programming is a programming paradigm in which a numeric computer program can be differentiated throughout via automatic differentiation. This allows for gradient-based optimization of parameters in the program, often via gradient descent, as well as other learning approaches that are based on higher order derivative information. Differentiable programming has found use in a wide variety of areas, particularly scientific computing and artificial intelligence. One of the early proposals to adopt such a framework in a systematic fashion to improve upon learning algorithms was made by the Advanced Concepts Team at the European Space Agency in early 2016.
Approaches
Most differentiable programming frameworks work by constructing a graph containing the control flow and data structures in the program. Attempts generally fall into two groups:
Static, compiled graph-based approaches such as TensorFlow, Theano, and MXNet. They tend to allow for good compiler optimization and easier scaling to large systems, but their static nature limits interactivity and the types of programs that can be created easily (e.g. those involving loops or recursion), as well as making it harder for users to reason effectively about their programs. A proof of concept compiler toolchain called Myia uses a subset of Python as a front end and supports higher-order functions, recursion, and higher-order derivatives.
Operator-overloading, dynamic graph-based approaches such as PyTorch and AutoGrad. Their dynamic and interactive nature lets most programs be written and reasoned about more easily. However, they lead to interpreter overhead (particularly when composing many small operations), poorer scalability, and reduced benefit from compiler optimization. Zygote, a package for the Julia programming language, works directly on Julia's intermediate representation, allowing it to still be optimized by Julia's just-in-time compiler.
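As a minimal sketch of the operator-overloading style described above (a toy forward-mode automatic differentiation written for illustration, not the mechanism actually used by PyTorch or Zygote, which rely on reverse mode), the following Python class carries a value together with its derivative through ordinary arithmetic, which is already enough to drive gradient descent on a small differentiable program.

class Dual:
    # A number together with its derivative with respect to the variable being differentiated.
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def _wrap(self, other):
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        other = self._wrap(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __sub__(self, other):
        other = self._wrap(other)
        return Dual(self.value - other.value, self.deriv - other.deriv)

    def __mul__(self, other):
        other = self._wrap(other)
        return Dual(self.value * other.value,
                    self.value * other.deriv + self.deriv * other.value)

    __rmul__ = __mul__

def grad(func, x):
    # Derivative of a scalar function at x, computed by seeding the derivative with 1.
    return func(Dual(x, 1.0)).deriv

# Gradient descent on the differentiable program f(x) = (x - 3)^2.
f = lambda x: (x - 3) * (x - 3)
x = 0.0
for _ in range(100):
    x -= 0.1 * grad(f, x)
print(x)   # converges towards the minimum at 3.0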
A limitation of earlier approaches is that they are o |
https://en.wikipedia.org/wiki/Cysteine-rich%20protein | Cysteine-rich proteins (also cysteine-rich peptide, CRP, disulphide-rich peptide) are small proteins that contain a large number of cysteines. These cysteines either cross-link to form disulphide bonds, or bind metal ions by chelation, stabilising the protein's tertiary structure. CRPs include a highly conserved secretion peptide signal at the N-terminus and a cysteine-rich region at the C-terminus.
Structure
Disulphides
In an oxidising environment cysteines cross-link to form disulphide bonds. CRPs that form these typically have an even number of cysteines.
Metal binding
Cysteines can coordinate one or more metal ions by forming a chelation complex around them.
Functions in plants
CRPs are numerous in plants, with 756 CRP-encoding genes in the Arabidopsis thaliana genome. Several CRPs bind known receptors, but most CRP signaling mechanisms and protein interactions are uncharacterized. Characterized CRPs function as short-range intercellular signals during processes such as plant defense, bacterial symbiosis, stomatal patterning, fertilization, vegetative tissue development, and seed development.
Many CRPs function in plant defense. Defensins, a major class of CRP with an eight-cysteine motif forming four disulfide bridges, are involved in pathogen response. Other putative antimicrobial CRPs include lipid transfer proteins, thionins, knottins, heveins, and snakins. Additionally, some CRPs have allergenic, ɑ-amylase inhibitory, or protease inhibitory functions that deter herbivores.
In plant reproduction, CRPs are involved in pollen tube growth and guidance and early embryo patterning, in addition to other functions. Among those involved in pollen tube attraction are the LUREs, a group of ovular pollen-tube attractants in Arabidopsis thaliana and Torenia fournieri that preferentially attract conspecific pollen, and STIG1, a CRP expressed in the stigma of Solanum lycopersicum that interacts with the pollen-specific receptor PRK2. In early embryo development, |
https://en.wikipedia.org/wiki/BitSight | BitSight is a cybersecurity ratings company that analyzes companies, government agencies, and educational institutions. It is based in Back Bay, Boston. Security ratings that are delivered by BitSight are used by banks and insurance companies among other organizations.
The company rates more than 200,000 organizations with respect to their cybersecurity.
History
BitSight was founded in 2011 by Nagarjuna Venna and Stephen Boyer and currently has both United States-based and international employees. In September 2016, BitSight raised US$40 million in funding.
In 2014, BitSight acquired AnubisNetworks, a Portugal-based cybersecurity firm that tracks real-time data threats.
By September 2016, BitSight had raised $40 million in a Series C round led by GGV Capital, with participation from Flybridge Capital Partners, Globespan Capital Partners, Menlo Ventures, Shaun McConnon, and the VC divisions of Comcast Ventures, Liberty Global Ventures, and Singtel Innov8.
Shaun McConnon stepped down as the CEO of BitSight in July 2017 but remains the executive chairman of the board. The CEO position was filled by Tom Turner in 2017, and then by Stephen Harvey in 2020.
In June 2018, BitSight closed $60 million in Series D funding, bringing the company's total funding to $155 million. BitSight's Series D financing was led by Warburg Pincus, with participation from existing investors Menlo Ventures, GGV Capital and Singtel Innov8.
In 2018, the company was located in Cambridge but purchased property in order to shift to Back Bay, where BitSight is currently located. Forbes has estimated BitSight's revenue as being US$100 million as of 2018.
Services
Organizations purchase BitSight's services in order to understand "security risks associated with sharing sensitive data with business partners." As of 2018, BitSight serves clients, including Lowe's, AIG, and Safeway.
BitSight assembles models that produce company ratings, which are based on a scale that enables |
https://en.wikipedia.org/wiki/Elanor%20Huntington | Elanor H. Huntington is Executive Director of Digital, National Facilities & Collections at the Commonwealth Scientific and Industrial Research Organisation and a Professor of Quantum Cybernetics at the Australian National University. She led a research program in the Australian Research Council Centre of Excellence for Quantum Computation and Communication Technology.
Early life and education
Huntington studied physics at the Australian National University and graduated in 1996 with a University Medal. She decided that she enjoyed using science to help others, and switched to engineering. She earned her PhD in 1999 working in experimental quantum optics. Huntington joined the Australian Defence Science and Technology Organisation after graduating, where she worked for 18 months before joining the University of New South Wales Canberra at the Australian Defence Force Academy.
Research
Huntington specialises in high speed measurements and the generation of non-classical states. She works on quantum computation, creating optical microchips that can detect, generate and manipulate states of light. She is interested in the intersection of quantum theory and applications. She joined the University of New South Wales in 2000. She has worked in the School of Engineering and Information Technology at the Australian Defence Force Academy at University of New South Wales, where she was made Head of the School of Engineering and IT in 2010. She leads a research program in the Australian Research Council Centre of Excellence for Quantum Computation and Communication Technology.
In 2011, Huntington and collaborators made a major breakthrough in quantum computation, by demonstrating that it was possible to teleport quantum non-Gaussian beams of light on a quantum superposition. These days, she makes use of waveguide technology, coupled with systems engineering, to design and build quantum technologies. She was appointed Dean of the Australian National University College of |
https://en.wikipedia.org/wiki/DNA-templated%20organic%20synthesis | DNA‐templated organic synthesis (DTS) is a way to control the reactivity of synthetic molecules by using nature's molarity‐based approach. Historically, DTS was used as a model of prebiotic nucleic acid replication. Now however, it is capable of translating DNA sequences into complex small‐molecule and polymer products of multistep organic synthesis.
Base Editors
The DNA base editors, developed at Harvard University under David Liu, allow altering the genomic structure of DNA. The base editors include BE3, BE4 and ABE7.
BE3 and its later version, BE4, allow the nucleobase C to be changed to T and the nucleobase G to A. ABE7 allows A-T base pairs to be changed into G-C base pairs. The system works by rearranging the atoms in the target base pair and then tricking cells into fixing the other DNA strand to make the change permanent.
References
Biological engineering
Biotechnology
Emerging technologies
Genome editing
Molecular biology |
https://en.wikipedia.org/wiki/VTuber | A , or , is an online entertainer who uses a virtual avatar generated using computer graphics. Real-time motion capture software or technology are often—but not always—used to capture movement. The digital trend originated in Japan in the mid-2010s, and has become an international online phenomenon in the 2020s. A majority of VTubers are English and Japanese-speaking YouTubers or live streamers who use avatar designs. By 2020, there were more than 10,000 active VTubers. Although the term is an allusion to the video platform YouTube, they also use websites such as Niconico, Twitch, Facebook, Twitter, and Bilibili.
The first entertainer to use the phrase "virtual YouTuber", Kizuna AI, began creating content on YouTube in late 2016. Her popularity sparked a VTuber trend in Japan, and spurred the establishment of specialized agencies to promote them, including major ones such as Hololive Production, Nijisanji, and VShojo. Fan translations and foreign-language VTubers have marked a rise in the trend's international popularity. Virtual YouTubers have appeared in domestic advertising campaigns, and have broken livestream-related world records.
Overview
Virtual YouTubers (although more commonly referred to as VTubers) are online entertainers who are typically YouTubers or live streamers. They use avatars created with programs such as Live2D, portraying characters designed by online artists. VTubers are not bound by physical limitations, and many of them engage in activities that are unconstrained by their real-world identity. Some VTubers, particularly those from marginalized communities, choose to use avatars to reflect their online identity for personal comfort and safety reasons. Transgender VTubers may use their avatars as a means to better reflect their preferred presentation to their audience.
VTubers often portray themselves as a kayfabe character, not unlike professional wrestling; Mace, a WWE wrestler who himself began streaming on Twitch as a VTuber in 2021, r |
https://en.wikipedia.org/wiki/Indistinguishability%20obfuscation | In cryptography, indistinguishability obfuscation (abbreviated IO or iO) is a type of software obfuscation with the defining property that obfuscating any two programs that compute the same mathematical function results in programs that cannot be distinguished from each other. Informally, such obfuscation hides the implementation of a program while still allowing users to run it. Formally, IO satisfies the property that obfuscations of two circuits of the same size which implement the same function are computationally indistinguishable.
Indistinguishability obfuscation has several interesting theoretical properties. Firstly, iO is the "best-possible" obfuscation (in the sense that any secret about a program that can be hidden by any obfuscator at all can also be hidden by iO). Secondly, iO can be used to construct nearly the entire gamut of cryptographic primitives, including both mundane ones such as public-key cryptography and more exotic ones such as deniable encryption and functional encryption (which are types of cryptography that no-one previously knew how to construct), but with the notable exception of collision-resistant hash function families. For this reason, it has been referred to as "crypto-complete". Lastly, unlike many other kinds of cryptography, indistinguishability obfuscation continues to exist even if P=NP (though it would have to be constructed differently in this case), though this does not necessarily imply that iO exists unconditionally.
Though the idea of cryptographic software obfuscation has been around since 1996, indistinguishability obfuscation was first proposed by Barak et al. (2001), who proved that iO exists if P=NP is the case. For the P!=NP case (which is harder, but also more plausible), progress was slower: Garg et al. (2013) proposed a construction of iO based on a computational hardness assumption relating to multilinear maps, but this assumption was later disproven. A construction based on "well-founded assumptions" (hardn |
https://en.wikipedia.org/wiki/Learning%20curve%20%28machine%20learning%29 | In machine learning, a learning curve (or training curve) plots the optimal value of a model's loss function for a training set against this loss function evaluated on a validation data set with the same parameters as produced the optimal function. Synonyms include error curve, experience curve, improvement curve and generalization curve.
More abstractly, the learning curve is a curve of (learning effort)-(predictive performance), where usually learning effort means number of training samples and predictive performance means accuracy on testing samples.
The machine learning curve is useful for many purposes including comparing different algorithms, choosing model parameters during design, adjusting optimization to improve convergence, and determining the amount of data used for training.
Formal definition
One model of machine learning is producing a function, f(x), which given some information, x, predicts some variable, y, from training data X_train and Y_train. It is distinct from mathematical optimization because f should predict well for x outside of X_train.
We often constrain the possible functions to a parameterized family of functions, {f_θ(x) : θ ∈ Θ}, so that our function is more generalizable or so that the function has certain properties such as those that make finding a good θ easier, or because we have some a priori reason to think that these properties are true.
Given that it is not possible to produce a function that perfectly fits our data, it is then necessary to produce a loss function L(f_θ(X), Y′) to measure how good our prediction is. We then define an optimization process which finds a θ which minimizes L(f_θ(X), Y), referred to as θ*(X, Y).
Training curve for amount of data
Then if our training data is X, Y and our validation data is X′, Y′, a learning curve is the plot of the two curves: the training loss L(f_{θ*(X_i, Y_i)}(X_i), Y_i) and the validation loss L(f_{θ*(X_i, Y_i)}(X′), Y′), both as functions of the number of training examples i, where (X_i, Y_i) denotes the first i examples of the training data.
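A minimal sketch of such a curve over training-set size, on synthetic data and with a plain least-squares model (all choices below are illustrative, not prescribed by the definition), refits the model on growing prefixes of the training data and records both losses:

import numpy as np

# Learning curve over training-set size: fit on the first i training examples,
# then record the loss on that prefix and on a held-out validation set.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 200)
y = 2.0 * x + 0.5 + rng.normal(0, 0.2, 200)     # noisy linear data (illustrative)

x_train, y_train = x[:150], y[:150]
x_val, y_val = x[150:], y[150:]

def fit(xs, ys):
    # Least-squares fit of y = a*x + b.
    A = np.column_stack([xs, np.ones_like(xs)])
    (a, b), *_ = np.linalg.lstsq(A, ys, rcond=None)
    return a, b

def loss(a, b, xs, ys):
    return float(np.mean((a * xs + b - ys) ** 2))

for i in range(10, len(x_train) + 1, 20):
    a, b = fit(x_train[:i], y_train[:i])
    train_loss = loss(a, b, x_train[:i], y_train[:i])
    val_loss = loss(a, b, x_val, y_val)
    print(i, round(train_loss, 4), round(val_loss, 4))
# Plotting the two loss columns against i gives the training and validation curves.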
Training curve for number of iterations
Many optimization processes are iterative, repeating the same step until the process converges to an optimal value. Gradient descent is one such algorithm. If you define as the approxim |
https://en.wikipedia.org/wiki/Learning%20rate | In machine learning and statistics, the learning rate is a tuning parameter in an optimization algorithm that determines the step size at each iteration while moving toward a minimum of a loss function. Since it influences to what extent newly acquired information overrides old information, it metaphorically represents the speed at which a machine learning model "learns". In the adaptive control literature, the learning rate is commonly referred to as gain.
In setting a learning rate, there is a trade-off between the rate of convergence and overshooting. While the descent direction is usually determined from the gradient of the loss function, the learning rate determines how big a step is taken in that direction. Too high a learning rate will make the learning jump over minima, but too low a learning rate will either take too long to converge or get stuck in an undesirable local minimum.
In order to achieve faster convergence, prevent oscillations, and avoid getting stuck in undesirable local minima, the learning rate is often varied during training, either in accordance with a learning rate schedule or by using an adaptive learning rate. The learning rate and its adjustments may also differ per parameter, in which case it is a diagonal matrix that can be interpreted as an approximation to the inverse of the Hessian matrix in Newton's method. The learning rate is related to the step length determined by inexact line search in quasi-Newton methods and related optimization algorithms.
Learning rate schedule
The initial rate can be left as the system default or can be selected using a range of techniques. A learning rate schedule changes the learning rate during learning and is most often changed between epochs/iterations. This is mainly done with two parameters: decay and momentum. There are many different learning rate schedules, but the most common are time-based, step-based and exponential.
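A compact sketch of the three schedule families named above, with arbitrary illustrative constants (these are not canonical default values), is:

import math

def time_based(eta0, decay, iteration):
    # Learning rate shrinks hyperbolically with the iteration count.
    return eta0 / (1.0 + decay * iteration)

def step_based(eta0, drop, epochs_per_drop, epoch):
    # Learning rate is multiplied by `drop` every `epochs_per_drop` epochs.
    return eta0 * drop ** math.floor(epoch / epochs_per_drop)

def exponential(eta0, decay, iteration):
    # Learning rate decays exponentially with the iteration count.
    return eta0 * math.exp(-decay * iteration)

for t in range(0, 50, 10):
    print(t,
          round(time_based(0.1, 0.05, t), 4),
          round(step_based(0.1, 0.5, 10, t), 4),
          round(exponential(0.1, 0.05, t), 4))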
Decay serves to settle the learning in a nice place and avoid oscillations, a situation tha |
https://en.wikipedia.org/wiki/Debbie%20Leung | Debbie Leung is a University Research Chair at the Institute for Quantum Computing at the University of Waterloo, where she is also affiliated with the Department of Combinatorics and Optimization. She works in theoretical quantum information processing.
Leung's research areas include quantum cryptography, quantum communication, measurement-based quantum computation, fault-tolerant quantum computation and error correction.
Leung earned her Bachelor of Science in mathematics and physics from Caltech in 1995. She received her PhD under doctoral advisors Yoshihisa Yamamoto and Isaac Chuang at Stanford. In her PhD thesis, entitled "Towards Robust Quantum Computation", she demonstrated the surprising result that approximate quantum error-correcting codes can outperform their exact counterparts.
In 2002, Leung won the Tolman postdoctoral fellowship at the Institute of Quantum Information at Caltech and the Croucher Fellowship. In 2005, she won a 10-year Tier II Canada Research Chair in Quantum Communications. Her recent work focuses on quantum channel capacities, quantum network coding, and quantum information processing with limited entanglement.
References
External links
21st-century Canadian mathematicians
Canadian women mathematicians
Academic staff of the University of Waterloo
Quantum cryptography
Year of birth missing (living people)
Living people
Stanford University alumni
California Institute of Technology alumni |
https://en.wikipedia.org/wiki/Nature%20Machine%20Intelligence | Nature Machine Intelligence is a monthly peer-reviewed scientific journal published by Nature Portfolio covering machine learning and artificial intelligence. The editor-in-chief is Liesbeth Venema.
History
The journal was created in response to the machine learning explosion of the 2010s. It launched in January 2019, and its opening was met with controversy and boycotts within the machine learning research community due to opposition to Nature publishing the journal as closed access. To address this issue, Nature Machine Intelligence now gives authors the option to publish open access papers for an additional fee, and "authors remain owners of the research reported, and the code and data supporting the main findings of an article should be openly available. Moreover, preprints are allowed, in fact encouraged, and a link to the preprint can be added below the abstract, visible to all readers."
Abstracting and indexing
According to the Journal Citation Reports, the journal has a 2021 impact factor of 25.898, ranking it 1st out of 144 journals in the category "Computer Science, Artificial intelligence" and first out of 113 journals in the category "Computer Science, Interdisciplinary Applications".
References
External links
Official website
Nature Research academic journals
Computer science journals
Machine learning
Artificial intelligence publications
Academic journals established in 2019
English-language journals |
https://en.wikipedia.org/wiki/Electrical%20Products%20Corporation | The Electrical Products Corporation (EPCO) was a major producer of electric signs, especially neon signs, in the western region of the United States. Electrical Products Corp. was established in Los Angeles, incorporated on November 7, 1912. By 1923, EPCO had acquired the rights to the neon patents of neon light inventor Georges Claude and began the manufacture of neon lighting and signs. In 1962 it was acquired by the Federal Sign and Signal Corporation.
External links
"The neon sign maker that lit up California", LACurbed, Feb 15, 2019
References
Signage
https://en.wikipedia.org/wiki/AppSheet | AppSheet is an application that provides a no-code development platform for application software, which allows users to create mobile, tablet, and web applications using data sources like Google Drive, DropBox, Office 365, and other cloud-based spreadsheet and database platforms. The platform can be utilized for a broad set of business use cases including project management, customer relationship management, field inspections, and personalized reporting.
AppSheet was acquired by Google in January 2020.
Platform
The AppSheet platform allows users to create mobile apps from cloud-based spreadsheets and databases. Apps can also be created directly as an add-on from spreadsheet platforms like Google Sheets. The platform is available from both a self-service model and a corporate licensing model for larger organizations with more governance, data analytics, and performance options. Compared to low-code development platforms which allow developers to develop with faster iteration cycles, AppSheet is a no-code platform which allows business users familiar with basic spreadsheet and database operations to build apps.
AppSheet compatible data sources include:
Google Sheets
Google Forms
Microsoft Excel on Office 365
Microsoft Excel on Dropbox
Microsoft Excel on Box (company)
Smartsheet
Salesforce
DreamFactory
Microsoft SQL Server
MySQL
PostgreSQL
Amazon DynamoDB
Features
Data Capture
AppSheet apps capture data in the form of images, signatures, geolocation, barcodes, and NFC. Data is automatically synced to the cloud-based data source, or users can opt to manually sync the data at any time. Common uses for data capture include field or equipment inspections, safety inspections, reporting, and inventory management.
Data Collaboration
Synced, shared data allows users to collaborate across mobile or desktop devices. Workflow rules can also be used to trigger notifications or work-based assignments where appropriate. Offline access is also possible as data storage is loca |
https://en.wikipedia.org/wiki/Darwinian%20threshold | Darwinian threshold or Darwinian transition is a term introduced by Carl Woese to describe a transition period during the evolution of the first cells when genetic transmission moves from a predominantly horizontal mode to a vertical mode. The process starts when the ancestors of the Last Universal Common Ancestor (the LUCA) become refractory to horizontal (or lateral) gene transfer (HGT) and become individual entities with vertical heredity upon which natural selection is effective. After this transition, life is characterized by genealogies that have a modern tree-like phylogeny.
Before the Darwinian threshold
The Last Universal Common Ancestor is often considered to be an already complex organism with a DNA-based genome, a complex informational flow and an efficient metabolism, but some authors, like Carl Woese, believe instead that the LUCA was not a discrete entity but rather a diverse community of cells that survived and evolved as a biological unit.
Carl Woese indicated that most likely there existed high mutation rates and small genomes. Also present were small proteins and larger imprecisely translated "statistical proteins". Entities in which translation had not yet developed to the point that proteins of the modern type could arise, have been termed “progenotes,” and the era during which these were the most advanced forms of life, the “progenote era”.
These organisms or biological entities, these progenotes (or ribocytes), had RNA as informational molecule instead of DNA. RNA is capable of both catalysis and replication and could have been central to the origins of heredity and life itself. It has been proposed that the initial molecular events were carried out by transfer RNAs (tRNAs). It is hypothesized that structured tRNAs could have provided amino acids during a process called self-translation of a single extended tRNA strand.
Compartmentalization with membranes was not yet completed and translation of proteins was not precise. Not every progeno |
https://en.wikipedia.org/wiki/Synthetic%20microbial%20consortia | Synthetic microbial consortia (commonly called co-cultures) are multi-population systems that can contain a diverse range of microbial species, and are adjustable to serve a variety of industrial, ecological, and tautological interests. For synthetic biology, consortia take the ability to engineer novel cell behaviors to a population level.
Consortia are more common than not in nature, and generally prove to be more robust than monocultures. Just over 7,000 species of bacteria have been cultured and identified to date. Many of the estimated 1.2 million bacteria species that remain have yet to be cultured and identified, in part due to inabilities to be cultured axenically. Evidence for symbiosis between microbes strongly suggests it to have been a necessary precursor of the evolution of land plants and for their transition from algal communities in the sea to land. When designing synthetic consortia, or editing naturally occurring consortia, synthetic biologists keep track of pH, temperature, initial metabolic profiles, incubation times, growth rate, and other pertinent variables.
Biofuel
One of the more salient applications of engineering behaviors and interactions between microbes in a community is the ability to combine or even switch metabolisms. The combination of autotrophic and heterotrophic microbes allows the unique possibility of a self-sufficient community that may produce desired biofuels to be collected. Co-culture dyads of autotrophic Synechococcus elongatus and heterotrophic Escherichia coli were found to be able to grow synchronously when the strain of S. elongatus was transformed to include a gene for sucrose export. The commensal combination of the sucrose-producing cyanobacteria with the modified E. coli metabolism may allow for a diverse array of metabolic products such as various butanol biofuels, terpenoids, and fatty-acid derived fuels.
Including a heterotroph also provides a solution to the issues of contamination when producing carbohydr |
https://en.wikipedia.org/wiki/Deontay%20Wilder%20vs.%20Tyson%20Fury%20II | Deontay Wilder vs. Tyson Fury II, billed as Unfinished Business, was a heavyweight professional boxing rematch between undefeated and reigning WBC champion Deontay Wilder and undefeated former unified heavyweight champion Tyson Fury, for the WBC and vacant The Ring heavyweight titles. The event took place on February 22, 2020, at the MGM Grand Garden Arena, Paradise, Nevada. Fury won the bout by seventh-round technical knockout (TKO).
The first fight had ended in a controversial split draw. Commentators thought that Fury had done enough to dethrone Wilder in the first bout, but uncertainty remained as Fury had been knocked down twice, and the bookmakers had Wilder as a slight favorite going into the rematch. In the rematch, Fury dominated Wilder, knocking him down twice, before Wilder's corner threw in the towel in the seventh round.
The fight was jointly promoted by Al Haymon's Premier Boxing Champions, Bob Arum's Top Rank and Frank Warren's Queensberry Promotions. According to Arum, the fight achieved 800,000-850,000 pay-per-view buys in the United States. Fury's performance gained widespread praise; it was hailed as "sensational" and one of the most impressive displays from a heavyweight title bout in recent years.
Background
The fight was a rematch after the controversial split decision draw between Wilder and Fury on December 1, 2018. Out of 27 prominent boxing journalists, 15 scored Fury as the winner of the first bout, 3 scored it for Wilder, and 9 scored it as a draw.
After the bout, there were calls for an immediate rematch. The WBC announced in February 2019 that there would not be an immediate rematch as Fury signed a contract with ESPN and Top Rank that meant the negotiations would be more difficult as the rematch would be a co-promotion, of which there had only been two before: Lennox Lewis vs. Mike Tyson in 2002 and Floyd Mayweather Jr. vs. Manny Pacquiao in 2015.
As the rematch was delayed, Fury and Wilder both scheduled fight |
https://en.wikipedia.org/wiki/CRISPR%20gene%20editing | CRISPR gene editing (pronounced "crisper") is a genetic engineering technique in molecular biology by which the genomes of living organisms may be modified. It is based on a simplified version of the bacterial CRISPR-Cas9 antiviral defense system. By delivering the Cas9 nuclease complexed with a synthetic guide RNA (gRNA) into a cell, the cell's genome can be cut at a desired location, allowing existing genes to be removed and/or new ones added in vivo.
The technique is considered highly significant in biotechnology and medicine as it enables editing genomes in vivo very precisely, cheaply, and easily. It can be used in the creation of new medicines, agricultural products, and genetically modified organisms, or as a means of controlling pathogens and pests. It also has possibilities in the treatment of inherited genetic diseases as well as diseases arising from somatic mutations such as cancer. However, its use in human germline genetic modification is highly controversial. The development of the technique earned Jennifer Doudna and Emmanuelle Charpentier the Nobel Prize in Chemistry in 2020. The third researcher group that shared the Kavli Prize for the same discovery, led by Virginijus Šikšnys, was not awarded the Nobel prize.
Working like genetic scissors, the Cas9 nuclease opens both strands of the targeted sequence of DNA to introduce the modification by one of two methods. Knock-in mutations, facilitated via homology directed repair (HDR), is the traditional pathway of targeted genomic editing approaches. This allows for the introduction of targeted DNA damage and repair. HDR employs the use of similar DNA sequences to drive the repair of the break via the incorporation of exogenous DNA to function as the repair template. This method relies on the periodic and isolated occurrence of DNA damage at the target site in order for the repair to commence. Knock-out mutations caused by CRISPR-Cas9 result from the repair of the double-stranded break by means of non |
https://en.wikipedia.org/wiki/Equivalence%20problem | In theoretical computer science and formal language theory, the equivalence problem is the question of determining, given two representations of formal languages, whether they denote the same formal language.
The complexity and decidability of this decision problem depend upon the type of representation under consideration.
For instance, in the case of finite-state automata, equivalence is decidable, and the problem is PSPACE-complete.
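For deterministic finite automata in particular, equivalence can be decided by exploring the product of the two machines and rejecting as soon as a reachable pair of states disagrees on acceptance. The Python sketch below uses a hypothetical encoding of a DFA as a transition dictionary plus a set of accepting states; it is an illustration of the idea, not a reference implementation.

from collections import deque

def equivalent(start1, delta1, accept1, start2, delta2, accept2, alphabet):
    # Breadth-first search over reachable state pairs of the product automaton.
    seen = {(start1, start2)}
    queue = deque([(start1, start2)])
    while queue:
        p, q = queue.popleft()
        if (p in accept1) != (q in accept2):
            return False                        # some string is accepted by one DFA only
        for a in alphabet:
            pair = (delta1[p, a], delta2[q, a])
            if pair not in seen:
                seen.add(pair)
                queue.append(pair)
    return True

# Two DFAs over {"0", "1"} that both accept strings containing an even number of 1s.
d1 = {("e", "0"): "e", ("e", "1"): "o", ("o", "0"): "o", ("o", "1"): "e"}
d2 = {(0, "0"): 0, (0, "1"): 1, (1, "0"): 1, (1, "1"): 0}
print(equivalent("e", d1, {"e"}, 0, d2, {0}, ["0", "1"]))   # True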
Further, in the case of deterministic pushdown automata, equivalence is decidable; Géraud Sénizergues won the Gödel Prize for this result. Subsequently, the problem was shown to lie in TOWER, the least non-elementary complexity class.
It becomes an undecidable problem for pushdown automata or any machine that can decide context-free languages or more powerful languages.
References
Formal languages |
https://en.wikipedia.org/wiki/Crouzeix%27s%20conjecture | Crouzeix's conjecture is an unsolved (as of 2018) problem in matrix analysis. It was proposed by Michel Crouzeix in 2004, and it refines Crouzeix's theorem, which states:
‖f(A)‖ ≤ 11.08 sup{ |f(z)| : z ∈ W(A) },
where the set W(A) is the field of values of an n×n (i.e. square) complex matrix A and f is a complex function that is analytic in the interior of W(A) and continuous up to the boundary of W(A). The constant 11.08 is independent of the matrix dimension, thus transferable to infinite-dimensional settings. The not yet proved conjecture states that the constant can be sharpened to 2:
‖f(A)‖ ≤ 2 sup{ |f(z)| : z ∈ W(A) }.
Michel Crouzeix and Cesar Palencia proved in 2017 that the result holds for the constant 1 + √2, improving the original constant of 11.08.
Slightly reformulated, the conjecture can be stated as follows: For all square complex matrices A and all complex polynomials p,
‖p(A)‖ ≤ 2 sup{ |p(z)| : z ∈ W(A) }
holds, where the norm on the left-hand side is the spectral operator 2-norm.
While the general case is unknown, it is known that the conjecture holds for tridiagonal 3×3 matrices with elliptic field of values centered at an eigenvalue and for general n×n matrices that are nearly Jordan blocks. Furthermore, Anne Greenbaum and Michael L. Overton provided numerical support for Crouzeix's conjecture.
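Numerical evidence of the kind mentioned above can be gathered in a few lines. The Python sketch below is only an illustration with names chosen here; the boundary of the field of values is approximated by the usual construction from the Hermitian part of a rotated matrix, and the conjectured inequality is checked for one random matrix and one random polynomial.

import numpy as np

def field_of_values_boundary(A, npoints=360):
    # For each angle theta, the top eigenvector of the Hermitian part of e^{-i theta} A
    # gives a boundary point x* A x of the field of values W(A).
    pts = []
    for theta in np.linspace(0.0, 2 * np.pi, npoints, endpoint=False):
        H = (np.exp(-1j * theta) * A + np.exp(1j * theta) * A.conj().T) / 2
        _, V = np.linalg.eigh(H)
        x = V[:, -1]
        pts.append(np.vdot(x, A @ x))
    return np.array(pts)

def matrix_polynomial(coeffs, A):
    # Evaluate p(A) by Horner's rule; coefficients are given from highest degree down.
    P = np.zeros_like(A)
    I = np.eye(A.shape[0])
    for c in coeffs:
        P = P @ A + c * I
    return P

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
coeffs = rng.standard_normal(4)                         # a random cubic polynomial p

lhs = np.linalg.norm(matrix_polynomial(coeffs, A), 2)   # spectral norm of p(A)
rhs = 2 * np.abs(np.polyval(coeffs, field_of_values_boundary(A))).max()
print(lhs, rhs, lhs <= rhs)                             # the conjecture predicts True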
Further reading
References
Conjectures
Matrix theory
Unsolved problems in mathematics |
https://en.wikipedia.org/wiki/Tripod%20packing | In combinatorics, tripod packing is a problem of finding many disjoint tripods in a three-dimensional grid, where a tripod is an infinite polycube, the union of the grid cubes along three positive axis-aligned rays with a shared apex.
Several problems of tiling and packing tripods and related shapes were formulated in 1967 by Sherman K. Stein. Stein originally called the tripods of this problem "semicrosses", and they were also called Stein corners by Solomon W. Golomb. A collection of disjoint tripods can be represented compactly as a monotonic matrix, a square matrix whose nonzero entries increase along each row and column and whose equal nonzero entries are placed in a monotonic sequence of cells, and the problem can also be formulated in terms of finding sets of triples satisfying a compatibility condition called "2-comparability", or of finding compatible sets of triangles in a convex polygon.
The best lower bound known for the number of tripods that can have their apexes packed into an n × n × n grid is , and the best upper bound is , both expressed in big Omega notation.
Equivalent problems
The coordinates of the apexes of a solution to the tripod problem form a 2-comparable set of triples, where two triples are defined as being 2-comparable if there are either at least two coordinates where one triple is smaller than the other, or at least two coordinates where one triple is larger than the other. This condition ensures that the tripods defined from these triples do not have intersecting rays.
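The condition is easy to test directly. The short Python sketch below (function names chosen here for illustration) checks whether every pair in a set of apex triples is 2-comparable, which by the discussion above means the corresponding tripods are disjoint.

from itertools import combinations

def two_comparable(t, u):
    # One triple must be strictly smaller, or strictly larger, in at least two coordinates.
    smaller = sum(a < b for a, b in zip(t, u))
    larger = sum(a > b for a, b in zip(t, u))
    return smaller >= 2 or larger >= 2

def is_valid_packing(apexes):
    # True when every pair of apex triples is 2-comparable.
    return all(two_comparable(t, u) for t, u in combinations(apexes, 2))

print(is_valid_packing([(1, 1, 1), (2, 2, 1), (3, 1, 2)]))   # True
print(is_valid_packing([(1, 2, 2), (2, 1, 2)]))              # False: neither triple wins in two coordinates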
Another equivalent two-dimensional version of the question asks how many cells of an n × n array of square cells (indexed from 1 to n) can be filled in by the numbers from 1 to n in such a way that the non-empty cells of each row and each column of the array form strictly increasing sequences of numbers, and the positions holding each value form a monotonic chain within the array. An array with these properties is called a monotonic matrix. A collection of disjoint tripods with a |
https://en.wikipedia.org/wiki/Elgin%20Bryce%20Holt | Elgin Bryce Holt (September 4, 1873 – October 6, 1945) was an American geologist, mine owner and engineer, amateur scientist, anthropologist and entrepreneur who reorganized and managed the Cerro de Plata Mining Company in Magdalena, Sonora, Mexico.
Biography
Holt was born in Harrison, Arkansas, the sixth of Lydia Elizabeth (née Ryan) and "Judge" Isham Right Holt's eight children. In 1879, the family moved to a homestead raising cattle along the San Francisco river near Alma, New Mexico. In 1892, the family moved to Las Cruces, New Mexico, allowing the four youngest children to attend the New Mexico Agricultural College. Very successful in mining silver in Mexico, he was known as the "Silver King of Sonora". A member of the American Institute of Mining Engineers and the American Association of Engineers, Holt died in Los Angeles, California and is buried in Forest Lawn Cemetery.
Education
In 1897, Holt was a member of the fourth graduating class of the New Mexico College of Agriculture and Mechanic Arts (now New Mexico State University) having completed the Mining Engineering course. He earned degrees in Geology and Mineralogy. His senior thesis was entitled "The Potassium Cyanide Method of the Determination of Copper".
During his senior year Holt was manager of the college football team and editor-in-chief of the New Mexico Collegian in 1897, the college student newspaper.
Early career
In 1903, Holt and a former classmate, W. C. Mossman, left for the 1904 World's Fair in St. Louis, Missouri, to join Zach Mulhall's Congress of Rough Riders and Ropers in the show's "broncho riding act".
Holt began his career renting his father's cattle business, working the family herd with his brother Isham for six years. During that time, Holt completed a post-graduate course in assaying.
Holt's older brother Ernest had a number of mining interests in Sonora, Mexico but was killed in 1900 by a revolver that was said to have fallen from his cot and exploded. Holt sold his cattl |
https://en.wikipedia.org/wiki/Alex%20Ma | Wan-Chun Alex Ma (; born 1978) is a Taiwanese software engineer.
Ma was born in 1978. Ma became interested in visual effects at a young age, influenced by the Star Wars film series and Wing Commander III: Heart of the Tiger, a video game released in 1994. He completed his bachelor's, master's, and doctoral degrees at National Taiwan University. With the help of the Graduate Student Study Abroad Program of Taiwan's National Science Council, Ma went to the University of Southern California in 2005, completing his studies under Paul Debevec at the Institute for Creative Technologies. Since obtaining his doctorate, Ma has worked for Activision and Google.
In 2019, Debevec's research team, including Ma, were jointly awarded an Academy Award for Technical Achievement for the development of the Polarized Spherical Gradient Illumination, a facial appearance capture and modeling technology used in several films.
References
External links
Software engineers
21st-century engineers
Google employees
Taiwanese expatriates in the United States
Activision employees
National Taiwan University alumni
Taiwanese engineers
Academy Award for Technical Achievement winners
1978 births
Living people |
https://en.wikipedia.org/wiki/Electric%20bacteria | Electric bacteria are forms of bacteria that directly consume and excrete electrons at different energy potentials without requiring the metabolization of any sugars or other nutrients. This form of life appears to be especially adapted to low-oxygen environments. Most life forms require an oxygen environment in which to release the excess of electrons which are produced in metabolizing sugars. In a low oxygen environment, this pathway for releasing electrons is not available. Instead, electric bacteria "breathe" metals instead of oxygen, which effectively results in both an intake of and excretion of electrical charges.
Some electric bacteria:
Shewanella
Geobacter
Methanobacterium palustre
Methanococcus maripaludis
Mycobacterium smegmatis
Modified Escherichia coli
A broad group of 30 bacteria varieties
See also
Electron transport chain
References
Bacteria |