source | text |
|---|---|
https://en.wikipedia.org/wiki/Radeon%20RX%207000%20series | The Radeon RX 7000 series is a series of graphics processing units developed by AMD, based on their RDNA 3 architecture. It was announced on November 3, 2022, and is the successor to the Radeon RX 6000 series. AMD has announced five graphics cards in the series: the RX 7600, RX 7700 XT, RX 7800 XT, RX 7900 XT, and RX 7900 XTX. AMD officially launched the RX 7900 XT and RX 7900 XTX on December 13, 2022, and released the RX 7600 on May 25, 2023. The last two RDNA 3 graphics processing units of the family, the RX 7700 XT and RX 7800 XT, were released on September 6, 2023.
Radeon RX 7000 series features
RDNA 3 microarchitecture
Up to 96 Compute Units (CU) compared to the maximum of 80 in the RX 6000 series
New dual-issue shader ALUs in each CU with the ability to execute two instructions per cycle
Second-generation Ray tracing accelerators
Dedicated AI accelerators with Wave MMA (matrix multiply-accumulate) instructions
First consumer graphics card to be based on a chiplet design
TSMC N5 for Graphics Compute Die (GCD)
TSMC N6 for Memory Cache Die (MCD)
Up to 24GB of GDDR6 memory
Doubled L1 cache from 128 KB to 256 KB per array
50% larger L2 cache, increased from a maximum of 4 MB to 6 MB
Second-generation Infinity Cache with up to 2.7x peak bandwidth and up to 96MB in capacity
PCIe 4.0 x16 interface
Support for AV1 hardware encoding and decoding for 12-bit video up to 8K60
New "Radiance Display" Engine with:
DisplayPort 2.1 UHBR 13.5 support (up to 54 Gbit/s bandwidth)
HDMI 2.1a support (up to 48 Gbit/s bandwidth)
Support for up to 8K 165 Hz or 4K 480 Hz output with DSC
12-bit color and Rec. 2020 support for HDR
Navi 3x dies
Navi 31
The Navi 31 multi-chip module features 58 billion transistors, a 165% increase in transistor density over the previous-generation Navi 2x, across seven dies: one Graphics Compute Die (GCD) and six Memory Cache Dies (MCD). The full Navi 31 die contains 12,288 FP32 cores, equivalent to 6,144 Stream Processors. Reportedl |
https://en.wikipedia.org/wiki/Fungi%20in%20art | Fungi are a common theme or working material in art. They appear in many different artworks around the world, starting as early as around 8000 BCE. Fungi appear in nearly all art forms, including literature, paintings, and graphic arts; and more recently, contemporary art, music, photography, comic books, sculptures, video games, dance, cuisine, architecture, fashion, and design. There are a few exhibitions dedicated to fungi, and even an entire museum in Chile.
Contemporary artists experimenting with fungi often work within the realm of BioArts and may use fungi as materials. Artists may use fungi as allegory, narrative, or props; they may also film fungi with time-lapse photography to display fungal life cycles or try more experimental techniques. Artists using fungi may explore themes of transformation, decay, renewal, sustainability, or cycles of matter. They may also work with mycologists, ecologists, designers, or architects in a multidisciplinary way.
Artists may be indirectly influenced by fungi via derived substances (such as alcohol or psilocybin). They may depict the effects of these substances, make art under the influence of these substances, or in some cases, both.
By artistic area
In Western art, fungi have been historically saturated with negative associations, whereas Asian art and folk art are generally more favourable towards fungi. Reflecting these representations of mushrooms, Western cultures have been referred to as mycophobes (fear, loathing, or hostility towards mushrooms), a term first coined as fungophobia by British mycologist William Delisle Hay in his 1887 book An Elementary Text-Book of British Fungi, whereas Asian cultures have been generally described as mycophiles.
Since 2020, the annual Fungi Film Festival has recognized movies about fungi in all genres.
In some stories or artworks, fungi play an allegorical role, or part of mythology and folklore. The visible parts of some fungi – particularly mushrooms with a distin |
https://en.wikipedia.org/wiki/Symbiosis%20in%20Amoebozoa | Amoebozoa of the free living genus Acanthamoeba and the social amoeba genus Dictyostelium are single celled eukaryotic organisms that feed on bacteria, fungi, and algae through phagocytosis, with digestion occurring in phagolysosomes. Amoebozoa are present in most terrestrial ecosystems including soil and freshwater. Amoebozoa contain a vast array of symbionts that range from transient to permanent infections, confer a range of effects from mutualistic to pathogenic, and can act as environmental reservoirs for animal pathogenic bacteria. As single celled phagocytic organisms, amoebas simulate the function and environment of immune cells like macrophages, and as such their interactions with bacteria and other microbes are of great importance in understanding functions of the human immune system, as well as understanding how microbiomes can originate in eukaryotic organisms.
Amoeba-resistant microorganisms
Some microorganisms have evolved to become resistant to Amoebozoa and are able to survive in, grow in, and exit free-living amoebae after phagocytosis. To survive in an amoeba, an organism must avoid or withstand digestion in its host's acidic and oxidative phagolysosomes. Many of these amoeba-resistant microorganisms (ARMs) survive either in the amoeba cytoplasm or in host-derived vacuoles surrounded by plasma membrane, allowing them not only to avoid digestion but to actively reproduce inside their host, with some capable of lysing the amoeba host cell. Known symbionts of Amoebozoa include bacteria from Alphaproteobacteria, Betaproteobacteria, Bacteroidetes, Firmicutes, Proteobacteria, Chlamydiae, and Paraburkholderia, all with different effects on their host, even within the same phylum. For example, some Chlamydiae bacteria are able to increase the growth rates or motility of their hosts, other Chlamydiae strains are able to fight off other pathogenic symbionts like Legionella, and some Chlamydiae are parasitic and |
https://en.wikipedia.org/wiki/Filter%20%28social%20media%29 | Filters are appearance-altering digital image effects often used on social media. They initially simulated the effects of camera filters, and they have since developed with facial recognition technology and computer-generated augmented reality. Social media filters—especially beauty filters—are often used to alter the appearance of selfies taken on smartphones or other similar devices.
History
In 2010, Apple introduced the iPhone 4, the first iPhone model with a front camera. It gave rise to a dramatic increase in selfies, which could be touched up with more flattering lighting effects in applications such as Instagram. The American photographer Cole Rise was involved in the creation of the original filters for Instagram around 2010, designing several of them himself, including Sierra, Mayfair, Sutro, Amaro and Willow. In September 2011, the Instagram 2.0 update introduced "live filters," which allowed the user to preview the effect of a filter while shooting with the application's camera. #NoFilter, a hashtag labelling an image that had not been filtered, became popular around 2013.
An update in 2014 allowed users to adjust the intensity of the filters as well as fine-tune other aspects of the image, features that had been available for years on applications such as VSCO and Litely.
In 2014, Snapchat started releasing sponsored filters to monetize the participatory use of the application. In September 2015, Snapchat acquired Looksery and released a feature called "lenses," animated filters using facial recognition technology. Some of the early lenses available on Snapchat at the time were Heart Eyes, Terminator, Puke Rainbows, Old, Scary, Rage Face, and Heart Avalanche. The Coachella filter, released in April 2016, was a popular early augmented reality filter.
Beauty filter
A beauty filter is a filter applied to still photographs, or to video in real time, to enhance the physical attractiveness of the subject. Typical effects of such f |
https://en.wikipedia.org/wiki/MACH%20Alliance | The MACH Alliance is a not-for-profit advocacy group whose members include software vendors, systems integrators, agencies, and individual experts, called "Ambassadors", advocating for open and best-of-breed enterprise technology ecosystems. The Alliance was formed in June 2020 and had, as of February 2023, 78 members spanning three continents. Notable members include, in alphabetical order, Amazon Web Services, Capgemini, Deloitte, Google Cloud Platform, MongoDB, Publicis Sapient, Vercel, and Wunderman Thompson.
History
The MACH Alliance was founded in June 2020 by four companies: Contentstack, Commercetools, EPAM Systems, and Valtech, plus ten inaugural members: Algolia, Amplience, Cloudinary, Constructor.io, Contentful, E2X, Fluent Commerce, Frontastic, Mobify and Vue Storefront.
MACH is an acronym for:
Microservices-based,
API-first,
Cloud-native software-as-a-service and
Headless offerings.
About a year later, MACH membership had reached 30, and a year after that it had roughly doubled to about 60 members.
Membership
The MACH Alliance actively seeks software vendors, systems integrators, agencies, consultancies, and individual experts who share their vision for open and best-of-breed enterprise technology ecosystems. For like-minded organizations the MACH Alliance established certification standards that help identify those that embrace MACH philosophies and offer MACH-certified services. To become a member, an organization must be in full compliance with these standards. Sticking to a concise definition of which services qualify and zealously enforcing it throughout the member application process has earned the MACH Alliance a reputation as "bouncers controlling the velvet rope at the entrance of the Coolest Tech in Town Club".
Activities
The MACH Alliance's main activities in support of its advocacy of open and best-of-breed enterprise technology ecosystems are events and the publication of various content pieces.
Benefits of MACH
The advantages of an open and best-of-b |
https://en.wikipedia.org/wiki/Bishop%27s%20graph | In mathematics, a bishop's graph is a graph that represents all legal moves of the chess piece the bishop on a chessboard. Each vertex represents a square on the chessboard and each edge represents a legal move of the bishop; that is, there is an edge between two vertices (squares) if they occupy a common diagonal. When the chessboard has dimensions m × n, the induced graph is called the m × n bishop's graph.
Properties
The fact that the chessboard has squares of two colors, say red and black, such that squares that are horizontally or vertically adjacent have opposite colors, implies that the bishop's graph has two connected components, whose vertex sets are the red and the black squares, respectively. The reason is that the bishop's diagonal moves do not allow it to change colors, but by one or more moves a bishop can get from any square to any other of the same color. The two components are isomorphic if the board has a side of even length, but not if both sides are odd.
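The two-component structure described above can be checked directly. The sketch below (an illustrative implementation, not taken from the article) builds the bishop's graph for an n × m board with union-find and counts its connected components:

```python
from itertools import combinations

def bishop_components(n, m):
    """Count connected components of the n x m bishop's graph.

    Vertices are board squares; two squares are adjacent when they
    share a diagonal, i.e. |row difference| == |column difference| != 0.
    """
    squares = [(r, c) for r in range(n) for c in range(m)]
    parent = {s: s for s in squares}

    def find(x):
        # Path-halving find.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for (r1, c1), (r2, c2) in combinations(squares, 2):
        if abs(r1 - r2) == abs(c1 - c2):
            union((r1, c1), (r2, c2))

    return len({find(s) for s in squares})
```

For any board with both sides at least 2, this returns 2, matching the light-square and dark-square components described above.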
A component of the bishop's graph can be treated as a rook's graph on a diamond if the original board is square and has sides of odd length, because if the red squares (say) are turned 45 degrees, the bishop's moves become horizontal and vertical, just like those of the rook.
Domination
A square is said to be attacked by a bishop if the bishop can get to that square in exactly one move. A dominating set is an arrangement of bishops such that every square is attacked or occupied by one of those bishops. An independent dominating set is a dominating set in which no bishop attacks any other. The minimum number of bishops needed to dominate a square board of side n is exactly n, and this is also the smallest number of bishops that can form an independent dominating set.
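The domination bound can be illustrated with a small checker. The code below is an illustrative sketch; the central-column placement used in the example is one assumed construction achieving n bishops on an odd-sided board, not a claim about the article's proof:

```python
def dominates(bishops, n):
    """Return True if every square of the n x n board is occupied by,
    or attacked by, some bishop in `bishops` (graph-theoretic
    domination: blocking is ignored)."""
    occupied = set(bishops)
    for r in range(n):
        for c in range(n):
            if (r, c) in occupied:
                continue
            # A square is attacked if some bishop shares its diagonal.
            if not any(abs(r - br) == abs(c - bc) for br, bc in occupied):
                return False
    return True

# For odd n, n bishops placed on the central column dominate the board.
n = 5
placement = [(r, n // 2) for r in range(n)]
```

Removing any bishop from this placement leaves at least one square undominated, consistent with n being the minimum for the square board.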
By contrast, a total domination set, which is a dominating set for which every square, including those occupied by bishops, is attacked by one of the bishops, requires more bishops; on the square board of side n ≥ 3, the least size |
https://en.wikipedia.org/wiki/Ford%20Model%20A%20engine | The Ford Model A engine -- primarily developed for the popular Ford Model A automobile (1927–1931, 4.8 million built) -- was one of the most mass-produced automobile engines of the 1920s and 1930s, widely used in automobiles, trucks, tractors and a wide variety of other vehicles and machinery.
A four-cylinder, carbureted, gasoline-fueled, piston engine, derived from the Ford Model T engine, the Ford Model A engine -- with a bigger bore and stroke, and higher compression ratio -- was twice as powerful as the Model T engine. Some derivatives, with improvements, were produced until 1958. Tens of thousands of the original design remain active even in the 21st century.
Design and development
(initial text derived from Ford_Model_T_engine#Ford_Model_A_engine)
Development and production history
The Ford Model A engine was an evolution of the Ford Model T engine, but with double the power. It was developed in secret at Ford's Rouge Plant, in Michigan, and unveiled -- with the Ford Model A automobile -- on December 2, 1927. The first Model A engine was completed earlier, on October 20, 1927, and eventually installed in a 1928 Model A Fordor sedan, which Henry Ford gave to his friend, inventor Thomas A. Edison. There was immediate market demand for the Model A, but by January 1, 1928, just 5,275 Model A engines had been built -- some not yet installed in a chassis, let alone shipped to a dealer.
However, by February 1929, production of the engines reached 1,000,000 units. By the end of Model A production, in March 1932, 4,849,340 Model As had been built. (Several hundred thousand Model AA trucks had also been built, typically with the same Model A engine.)
Model A historian Steve Plucker, using Ford company records, calculates that 4,830,806 production engines were built between October 1927 and November 1931.
All Model A engines built in the U.S. were made at the Rouge plant; others were built at Ford plants in Canada and Europe. During that time, the Model A and A |
https://en.wikipedia.org/wiki/L3cos | L3COS (Level 3 Consensus Operating System) is an algorithm for digitising processes based on Blockchain, which has a three-level structure and is distributed as Platform as a Service for state bodies and businesses. The algorithm is based on the blockchain, in which any decision made at any of the levels will become part of the common chain. The technology involves a three-level framework that provides national governments, businesses, and private individuals with the tools to create a digital economy that does not allow fraudulent activity, financial or otherwise.
History
Work on the “three-in-one” algorithm started in 2013 and was completed in October 2019. The founder and CEO of the project is Zurab Ashvil, who holds a PhD in Cybernetics and Applied Mathematics and was previously an executive at SoftBank Capital. The official launch of the algorithm was held at the World Economic Forum in Davos from 21 to 24 January 2020. Over the first eight years, more than $65 million was invested in the company.
In March 2020, the Bank of England invited private companies to participate in the development of a CBDC (Central Bank Digital Currency). In June 2020, a proposal was made to the Bank of England to use L3COS technology.
In March 2021, information emerged that AgriDex, described as the world's first digital platform for agricultural goods and foodstuffs, was investing $85 million to develop its platform on the L3COS blockchain. The creation of a smart marketplace was intended to improve global food security by reducing transaction costs and food costs to consumers. The plan was to create a tokenised payment and exchange platform for AgriDex based on blockchain. The volume of supplies was estimated at $2.25 trillion annually. The technology was claimed to enable the company to process more than 1.5 million concurrent transactions per second and to provide more than 150 companies with access to the marketplace.
In April 2021, L3COS announced a new partnership with US label ENT G |
https://en.wikipedia.org/wiki/DisCoCat | DisCoCat (Categorical Compositional Distributional) is a mathematical framework for natural language processing which uses category theory to unify distributional semantics with the principle of compositionality. The grammatical derivations in a categorial grammar (usually a pregroup grammar) are interpreted as linear maps acting on the tensor product of word vectors to produce the meaning of a sentence or a piece of text. String diagrams are used to visualise information flow and reason about natural language semantics.
History
The framework was first introduced by Bob Coecke, Mehrnoosh Sadrzadeh, and Stephen Clark as an application of categorical quantum mechanics to natural language processing. It started with the observation that pregroup grammars and quantum processes shared a common mathematical structure: they both form a rigid category (also known as a non-symmetric compact closed category). As such, they both benefit from a graphical calculus, which allows a purely diagrammatic reasoning. Although the analogy with quantum mechanics was kept informal at first, it eventually led to the development of quantum natural language processing.
Definition
There are multiple definitions of DisCoCat in the literature, depending on the choice made for the compositional aspect of the model. The common denominator of all existing versions, however, is a categorical definition of DisCoCat as a structure-preserving functor from a category of grammar to a category of semantics, the latter usually encoding the distributional hypothesis.
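As a concrete illustration of the functorial semantics in FinVect (a toy sketch with made-up vectors and dimensions, not an example from the original paper): the meaning of a transitive sentence such as "Alice loves Bob" is obtained by contracting a verb tensor against its subject and object vectors, mirroring the cups of the pregroup reduction n · (nʳ s nˡ) · n → s.

```python
import numpy as np

# Toy noun space N of dimension 2; the vectors are illustrative assumptions.
alice = np.array([1.0, 0.0])
bob   = np.array([0.0, 1.0])

# A transitive verb lives in N (x) S (x) N; here the sentence space S is
# taken 1-dimensional, so `loves` is a 2 x 1 x 2 tensor of weights.
loves = np.zeros((2, 1, 2))
loves[0, 0, 1] = 1.0   # "Alice loves Bob" carries weight 1.0
loves[1, 0, 0] = 0.2   # "Bob loves Alice" carries weight 0.2

def sentence_meaning(subj, verb, obj):
    """Contract subject and object wires against the verb tensor
    (the two cups of the string diagram)."""
    return np.einsum('i,isj,j->s', subj, verb, obj)
```

Contracting in the opposite order yields the smaller weight for "Bob loves Alice", so word order changes the meaning even though the same three word representations are used.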
The original paper used the categorical product of FinVect with a pregroup seen as a posetal category. This approach has some shortcomings: all parallel arrows of a posetal category are equal, which means that pregroups cannot distinguish between different grammatical derivations for the same syntactically ambiguous sentence. A more intuitive manner of saying the same is that one works with diagrams rather than with parti |
https://en.wikipedia.org/wiki/Quantum%20natural%20language%20processing | Quantum natural language processing (QNLP) is the application of quantum computing to natural language processing (NLP). It computes word embeddings as parameterised quantum circuits, with the aim of solving NLP tasks faster than is possible classically. It is inspired by categorical quantum mechanics and the DisCoCat framework, making use of string diagrams to translate from grammatical structure to quantum processes.
Theory
The first quantum algorithm for natural language processing used the DisCoCat framework and Grover's algorithm to show a quadratic quantum speedup for a text classification task. It was later shown that quantum language processing is BQP-complete, i.e. quantum language models are more expressive than their classical counterparts, unless quantum mechanics can be efficiently simulated by classical computers.
These two theoretical results assume fault-tolerant quantum computation and a QRAM, i.e. an efficient way to load classical data on a quantum computer. Thus, they are not applicable to the noisy intermediate-scale quantum (NISQ) computers available today.
Experiments
The algorithm of Zeng and Coecke was adapted to the constraints of NISQ computers and implemented on IBM quantum computers to solve binary classification tasks. Instead of loading classical word vectors onto a quantum memory, the word vectors are computed directly as the parameters of quantum circuits. These parameters are optimised using methods from quantum machine learning to solve data-driven tasks such as question answering, machine translation and even algorithmic music composition.
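The idea of word parameters as circuit angles can be sketched classically. The toy single-qubit simulation below is purely illustrative (the words, angles, and labels are invented, and no real experiment is reproduced): each word's trainable parameter sets a rotation angle, and the measurement statistics act as the model output.

```python
import math

def prob_one(theta):
    """Probability of measuring |1> after applying RY(theta) to |0>.

    RY(theta)|0> = cos(theta/2)|0> + sin(theta/2)|1>.
    """
    return math.sin(theta / 2) ** 2

# Each word is represented by a trainable circuit parameter (an angle)
# rather than a classical vector loaded into quantum memory.
# The values below are illustrative, not trained.
params = {"cooking": 0.3, "software": 2.8}

def classify(word):
    """Toy binary classifier: the |1>-probability decides the label."""
    return "IT" if prob_one(params[word]) > 0.5 else "food"
```

In a real NISQ experiment the angles would be optimised with quantum machine learning methods against labelled data, as described above.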
See also
Categorical quantum mechanics
Natural language processing
Quantum machine learning
Applied category theory
String diagram
References
External links
DisCoPy, a Python toolkit for computing with string diagrams
lambeq, a Python library for quantum natural language processing
Quantum computing
Natural language processing |
https://en.wikipedia.org/wiki/Mycophycobiosis | A mycophycobiosis (composed of myco-, from the Ancient Greek mukês, "mushroom"; phyco-, from the Ancient Greek phûkos, "seaweed"; and -biosis, from the Ancient Greek bióô, "to spend one's life") is a symbiotic organism made up of a multicellular alga and an ascomycete fungus housed inside the alga (in the thallus, for example). The alga and fungus involved in this association are called mycophycobionts.
The essential role of the alga is to carry out photosynthesis; the role of the fungus is less obvious, but it could be linked to the transfer of minerals within the thallus, to a repellent effect on herbivores and, above all, to resistance to desiccation in the intertidal zone.
Such symbioses have been reported in a few green algae (Prasiola, Blidingia) and red algae (Apophlaea), both in seawater and in freshwater.
Definition elements
Although compared to lichens by certain authors, mycophycobioses realize an association of the opposite type: the algal partner is multicellular and forms the external structure of the symbiotic organization. Moreover, the reproduction of the two partners is always disjoint (the alga and the fungus reproduce separately). To explain the nuances of this duality, the ecologists Chantal Delzenne-Van Haluwyn and Michel Lerond propose the analogy of the two symbionts with an "ideal couple". In a lichen, the host is compared to a "macho fungus"; in a mycophycobiosis, the host is "the alga that wears the trousers".
According to Hawksworth, the physiology of this symbiosis could well be comparable to that of lichens, but it remains to be better explored. Unlike lichens, mycophycobioses look like their algal partner, which remains fertile. These associations appear to be less coevolved than lichens, as they exhibit neither joint asexual multiplication of the partners nor the equivalent of lichen products.
History
The term mycophycobiosis was introduced by Jan and Erika Kohlmeyer in 1972, base |
https://en.wikipedia.org/wiki/Arnold%20conjecture | The Arnold conjecture, named after mathematician Vladimir Arnold, is a mathematical conjecture in the field of symplectic geometry, a branch of differential geometry.
Statement
Let (M, ω) be a compact symplectic manifold. For any smooth function H : M → ℝ, the symplectic form ω induces a Hamiltonian vector field X_H on M, defined by the identity ω(X_H, ·) = dH.
The function H is called a Hamiltonian function.
Suppose there is a 1-parameter family of Hamiltonian functions H_t, t ∈ [0, 1], inducing a 1-parameter family of Hamiltonian vector fields X_{H_t} on M. The family of vector fields integrates to a 1-parameter family of diffeomorphisms φ^t : M → M. Each individual φ^t is called a Hamiltonian diffeomorphism of M.
The Arnold conjecture says that each Hamiltonian diffeomorphism of M possesses at least as many fixed points as a smooth function on M possesses critical points.
Nondegenerate Hamiltonian and weak Arnold conjecture
A Hamiltonian diffeomorphism φ : M → M is called nondegenerate if its graph intersects the diagonal of M × M transversely. For nondegenerate Hamiltonian diffeomorphisms, a variant of the Arnold conjecture says that the number of fixed points is at least equal to the minimal number of critical points of a Morse function on M, called the Morse number of M.
In view of the Morse inequality, the Morse number is also greater than or equal to a homological invariant of M, for example the sum of the Betti numbers over a field F: Σ_i dim H_i(M; F).
The weak Arnold conjecture says that for a nondegenerate Hamiltonian diffeomorphism on M, the above integer is a lower bound on its number of fixed points.
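As a standard illustration in the notation above (a well-known example, not stated in this article): on the 2-torus T² the Betti numbers over a field F are 1, 2, 1, so

```latex
\sum_{i=0}^{2} \dim H_i(T^2; F) = 1 + 2 + 1 = 4,
```

and the weak Arnold conjecture predicts at least four fixed points for any nondegenerate Hamiltonian diffeomorphism of the torus.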
See also
Arnold–Givental conjecture
References
Symplectic geometry
Conjectures |
https://en.wikipedia.org/wiki/Evolved%20wireless%20ad%20hoc%20network | An evolved wireless ad hoc network (EVAN) is a decentralized type of wireless network that compensates for the shortcomings of the existing wireless ad hoc network (WANET). An EVAN is ad hoc like a WANET because it does not rely on a pre-existing infrastructure, such as routers in wired networks or access points in wireless networks. Further advantages of WANETs over networks with a fixed topology include flexibility (an ad hoc network can be created anywhere with mobile devices), scalability (more nodes can easily be added to the network) and lower administration costs (no need to build an infrastructure first). These characteristics of WANETs are maintained in an EVAN as well. However, unlike existing WANETs, an EVAN has a physically separate resource management channel called the tone channel. In a WANET the data channel performs two roles, resource management and data transfer, but in an EVAN the data channel is used only for data transfer.
Challenges
Several books and works have described the technical and research challenges facing wireless ad hoc networks (MANETs). UEs moving within a WANET rapidly change the network topology. Many resources are consumed by channel management. A resource collision occurs when a UE allocates a resource that is already in use, or moves while occupying a resource. A UE located between two simultaneous transmitters experiences collisions because their packets arrive at the same time.
In summary:
Dynamic network topology by mobile UEs
Limited channel bandwidth
Communication resource collision
Hidden node problem
Solution
To solve these problems, the tone channel is used. This channel is dedicated to resource management. The types of tones used in the tone channel include an allocation tone for allocating a resource, a clearing tone for occupying a resource, and a detection tone for detecting a collision of an occupied resource. These tones are transmitted in tone slots. A tone slot consists of multiple tone subslots where a tone is transmitted. A slot 'n' in a tone channel maps to a slot 'n+1' in |
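The allocation mechanism described above can be sketched roughly as follows. This is an illustrative toy model, not the actual EVAN specification: each UE transmits an allocation tone in the subslot for the resource it wants, and two tones in the same subslot signal a collision.

```python
def detect_collisions(allocation_requests):
    """Toy model of a tone-channel slot.

    `allocation_requests` maps a UE id to the resource index it
    requests.  A resource requested by exactly one UE is granted;
    a resource requested by several UEs collides.
    """
    subslots = {}
    for ue, resource in allocation_requests.items():
        subslots.setdefault(resource, []).append(ue)
    granted = {r: ues[0] for r, ues in subslots.items() if len(ues) == 1}
    collided = {r: ues for r, ues in subslots.items() if len(ues) > 1}
    return granted, collided

# Two UEs contend for resource 0; a third requests resource 3 alone.
granted, collided = detect_collisions({"UE1": 0, "UE2": 0, "UE3": 3})
```

In the real design, the colliding UEs would observe the detection tone and back off, while the winner occupies the resource with a clearing tone.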
https://en.wikipedia.org/wiki/Transformer%20ratio%20arm%20bridge | The transformer ratio arm bridge or TRA bridge is a type of bridge circuit for measuring electronic components, using a.c. It can be designed to work in terms of either impedance or admittance. It can be used on resistors, capacitors and inductors, measuring minor as well as major terms, e.g. series resistance in capacitors. It is probably the most accurate type of bridge available, being capable of the precision needed, for example, when checking secondary component standards against national standards.
Like all bridges, the TRA bridge involves comparing an unknown component against a standard. Like all a.c. bridges, it requires a signal source and a null detector. The accuracy of this class of bridge depends on the ratio of the turns on one or more transformers. A notable advantage is that normal stray capacitance across the transformer, including lead capacitance, may affect the sensitivity of the bridge but does not affect its measuring accuracy.
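The comparison against a standard can be made concrete with the textbook idealisation of a two-winding ratio arm (a toy model that ignores losses and strays, and assumes one particular winding arrangement): at null, the ampere-turns contributed by the unknown and the standard cancel, so the unknown is read off from the standard and the turns ratio.

```python
def unknown_capacitance(c_standard, n_standard, n_unknown):
    """Idealised TRA bridge balance for capacitors.

    At null the ampere-turns from the two arms cancel:
        n_unknown * C_unknown = n_standard * C_standard
    so C_unknown = C_standard * n_standard / n_unknown.
    (Toy model: losses, strays, and minor terms are ignored.)
    """
    return c_standard * n_standard / n_unknown

# Example: a 100 pF standard balanced against a 10:1 turns ratio
# measures a 1 nF unknown.
cx = unknown_capacitance(100e-12, 10, 1)
```

Because the ratio is set by transformer turns rather than by resistor values, it can be made exact to a few parts per million, which is the source of the accuracy claimed above.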
History
The invention of the TRA bridge is credited to Alan Blumlein in his UK patent 323037 (published 1929), and this class of bridge is sometimes known as a Blumlein bridge, although links to earlier types of bridge can be seen. Blumlein's first patent was for a capacitance-measuring bridge: Fig. 1 is redrawn from one of the diagrams in the patent.
Subsequently the ratio arm principle was applied more generally, to other classes of electronic components and at frequencies up to r.f., and with many variations in how the unknown component was connected to the transformer or transformers.
Blumlein himself was responsible for several further related patents. He made his first bridge while employed by the British company Standard Telephones and Cables, which did not manufacture test instruments. TRA bridges have since been made by many specialist manufacturers, including Boonton, ESI (formerly Brown Engineering and BECO), General Radio, Marconi Instruments, H. W. Sullivan (now part of Megger) and Wayne Kerr.
Princi |
https://en.wikipedia.org/wiki/Branch%20%28company%29 | Branch (formerly Branch Metrics) is a mobile software company focused on mobile deep linking and attribution. The company is headquartered in Palo Alto, California.
History
Branch Metrics was founded on April 15, 2014 by Alex Austin, Mike Molinet, Mada Seghete, and Dmitri Gaskin. In the summer of 2014, the company completed the StartX Accelerator program at Stanford University.
In September 2014, Branch announced the completion of a $3 million venture funding round led by New Enterprise Associates. In 2015 Branch won a startup competition at Mobile World Congress.
In February 2015, the company completed a $15 million Series A funding followed by additional $35 million funding in January 2016.
In April 2017, the company raised $60 million in Series C funding from Andy Rubin’s Playground Ventures.
In September 2018, the company authorized the sale of $129 million in Series D shares, and acquired the attribution analytics platform TUNE.
In February 2022, Branch raised $300M in Series F funding at a $4B valuation, led by New Enterprise Associates.
In 2022, the company acquired two Android customization applications, Nova Launcher and Sesame Search, as well as a data platform, AdLibertas.
Technology
Branch has developed deep linking technology that directs users to a specific place within an app the first time a link is clicked, even if the app has not yet been installed. To do this, Branch combines deep linking with matching technology.
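A toy illustration of the matching idea (not Branch's actual proprietary method; the field names and window are invented for the sketch): when no device identifier survives the install, a browser click and a later app install can be matched probabilistically on coarse signals such as IP address and device model within a time window.

```python
import time

clicks = []   # recorded when a user taps a link in the browser

def record_click(link_url, ip, device_model):
    """Store a click event with a coarse device fingerprint."""
    clicks.append({"url": link_url, "ip": ip,
                   "model": device_model, "ts": time.time()})

def match_install(ip, device_model, window_seconds=7200):
    """Return the most recent click whose fingerprint matches the
    freshly installed app, or None if nothing matched in the window."""
    now = time.time()
    candidates = [c for c in clicks
                  if c["ip"] == ip and c["model"] == device_model
                  and now - c["ts"] <= window_seconds]
    return max(candidates, key=lambda c: c["ts"]) if candidates else None
```

On a match, the app can route the new user straight to the content the original link pointed at, which is the "deferred" part of deferred deep linking.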
Branch also uses tools for cohort analysis and touchpoint tracking.
References
Companies based in Palo Alto, California
Internet technology companies |
https://en.wikipedia.org/wiki/Krzysztof%20R.%20Apt | Krzysztof R. Apt (born 26 December 1949 in Katowice, Poland) is a Polish computer scientist. He defended his PhD in mathematical logic in Warsaw, Poland in 1974. His research interests include program correctness and semantics, the use of logic as a programming language, distributed computing, and game theory. Besides his own research, he has been heavily involved in service to the computing community, notably by promoting the use of logic in computer science (in particular by founding a new journal) and by advocating open access to scientific literature.
Academic career
Apt has held various scientific positions in Poland, the Netherlands, France, the U.S. (the William B. Blakemore II Professor of Computer Science, UT Austin, 1987–1990), and Singapore (Visiting Professor of Computer Science, NUS, 2002–2005).
Apt is a Fellow at CWI (National Research Institute for Mathematics and Computer Science) in Amsterdam and Affiliated Professor at the University of Warsaw. Since 2014 he has also been Professor Emeritus at the University of Amsterdam.
In particular, with coauthors he introduced the concept of stratification in logic programming to provide a way to deal with negation in logic and Datalog programs. His comprehensive survey of Hoare logic, written with Ernst-Rüdiger Olderog, summarizes the history of the subject since its inception in 1969.
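The idea of stratification can be sketched with a two-stratum program. The toy evaluator below (an illustration of the general technique, not Apt's formulation) uses graph reachability: negation is only applied to a predicate that has been fully computed in an earlier stratum, which keeps the semantics well defined.

```python
def reachable(edges, source):
    """Stratum 1: positive recursion computing reachability,
    evaluated to a fixed point."""
    reach = {source}
    changed = True
    while changed:
        changed = False
        for a, b in edges:
            if a in reach and b not in reach:
                reach.add(b)
                changed = True
    return reach

def unreachable(nodes, edges, source):
    """Stratum 2: negates `reachable`, which is safe because the
    earlier stratum has already been fully evaluated."""
    return set(nodes) - reachable(edges, source)

nodes = {"a", "b", "c", "d"}
edges = [("a", "b"), ("b", "c")]
```

Evaluating stratum by stratum in this way is exactly what makes negation unproblematic in stratified Datalog programs.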
Apt is a member of Academia Europaea, which serves as "a pan-European Academy of Sciences, Humanities and Letters"; membership is by invitation only and follows a rigorous peer review selection process.
He is the founder and first
Editor-in-Chief of the ACM Transactions on Computational Logic and
past president of the Association for Logic Programming (ALP). He is one of the three initiators of the Witold Lipski Prize for Young Researchers in Computer Science.
Apt has long been an active advoc |
https://en.wikipedia.org/wiki/IPadOS%2017 | iPadOS 17 is the fifth and current major release of the iPadOS operating system developed by Apple for its iPad line of tablet computers. The successor to iPadOS 16, it was announced at the company's Worldwide Developers Conference (WWDC) on June 5, 2023 and was released on September 18, 2023 along with iOS 17.
iPadOS 17 drops support for the first-generation iPad Pro and the fifth-generation iPad, making it the first iPadOS release to be exclusive to iPads with Apple Pencil support, as well as the first version of iPadOS to drop support for an iPad Pro.
The first public beta was released on July 12, 2023 and the final version was released on September 18, 2023.
Features
Lock screen
The lock screen has been redesigned to match the appearance of iOS 16 and beyond.
PDF document handling
iPadOS can now identify PDF form fields for quicker text input.
Siri
Users can now say "Siri" instead of "Hey Siri" to activate Siri by voice.
Health App
The Apple Health app is now available on iPads as well as on iPhones.
Notes App
The Notes app now supports real-time collaboration between users in PDF documents.
Automatic verification codes
Adds support for one-time verification codes in the Mail app.
Adds feature to automatically delete verification codes.
Compatibility (supported devices)
iPadOS 17 requires an A10 chip or newer, which means it drops support for iPad models with A9 and A9X chips, officially marking the end of support for non-Apple Pencil compatible iPads. This also marks the third time Apple has dropped support for 64-bit devices.
Those using A10 or A10X SoC have limited support.
Those using A12, A12X, A12Z, or A13 SoC get additional features that are unavailable on older models.
Those using A14 or A15 SoC have almost full support.
Those using M1 or M2 SoC get full support.
iPad (6th generation)
iPad (7th generation)
iPad (8th generation)
iPad (9th generation)
iPad (10th generation)
iPad Air (3rd generation)
iPad Air |
https://en.wikipedia.org/wiki/Software%20composition%20analysis | It is a common software engineering practice to develop software by using different components. Using software components segments the complexity of larger elements into smaller pieces of code and increases flexibility by enabling easier reuse of components to address new requirements. The practice has widely expanded since the late 1990s with the popularization of open-source software (OSS) to help speed up the software development process and reduce time to market.
However, using open-source software introduces many risks for the software applications being developed. These risks can be organized into five categories:
OSS Version Control: risks of changes introduced by new versions
Security: risks of vulnerabilities in components, tracked as Common Vulnerabilities and Exposures (CVEs)
License: risks of Intellectual property (IP) legal requirements
Development: risks of compatibility between existing codebase and open-source software
Support: risks of poor documentation and obsolete software components
Shortly after the foundation of the Open Source Initiative in February 1998, the risks associated with OSS were recognized, and organizations tried to manage them using spreadsheets and documents to track all the open-source components used by their developers.
For organizations using open-source components extensively, there was a need to help automate the analysis and management of open source risk. This resulted in a new category of software products called Software Composition Analysis (SCA) which helps organizations manage open source risk.
SCA strives to detect all the third-party components in use within a software application to help reduce risks associated with security vulnerabilities, IP licensing requirements, and obsolescence of components being used.
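The core matching step an SCA tool performs can be sketched in a few lines. The manifest entries and vulnerability records below are hypothetical illustrations, not real components or CVEs; real tools match on richer identifiers (package URLs, version ranges) and pull from curated advisory databases.

```python
# Minimal sketch of the core SCA matching step: compare a project's
# declared third-party components against a known-vulnerability list.
# All names and versions below are hypothetical.

# (name, version) pairs, as a dependency manifest might declare them
manifest = [("libfoo", "1.2.0"), ("libbar", "0.9.1"), ("libbaz", "2.0.0")]

# toy vulnerability database: component name -> set of affected versions
known_vulns = {
    "libfoo": {"1.2.0", "1.2.1"},
    "libqux": {"3.0.0"},
}

def scan(components, vuln_db):
    """Return the components whose exact version appears in the database."""
    return [(name, ver) for name, ver in components
            if ver in vuln_db.get(name, set())]

print(scan(manifest, known_vulns))  # [('libfoo', '1.2.0')]
```

A production scanner would also resolve transitive dependencies and compare version ranges rather than exact strings, but the flag-what-matches structure is the same.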
Overview
Software composition analysis (SCA) is a practice in the fields of Information technology and software engineering for analyzing custom-built software applications to detect embedded open-source software an |
https://en.wikipedia.org/wiki/Rockingham%20Kiln | The Rockingham, or Waterloo, Kiln in Swinton, South Yorkshire, England, is a pottery kiln dating from 1815. It formed part of the production centre for the Rockingham Pottery which, in the early 19th century, produced highly-decorative Rococo porcelain. The pottery failed in the mid-19th century, and the kiln is one of the few remaining elements of the Rockingham manufactory. It is a Grade II* listed building and forms part of the Rockingham Works Scheduled monument. The kiln is currently on the Historic England Heritage at Risk Register.
History
The original factory on the Swinton site produced simple earthenware pottery. The first recorded operator was a Joseph Flint, who in the 1740s was renting the site from the Marquess of Rockingham. A partnership with the Leeds Pottery failed and was dissolved by 1806. The subsequent owners, the Brameld family, built the Rockingham Kiln, and other structures on the site, in 1815. The date, the year of the Battle of Waterloo, led to the kiln's alternative name, the Waterloo Kiln. Despite the Bramelds' investigations into the production of high-quality porcelain, the venture continued to be unsuccessful and the firm was extricated from a further bankruptcy in 1826 only by the intervention of William Fitzwilliam, 4th Earl Fitzwilliam, who had inherited the Wentworth Woodhouse estate from his uncle, the second Marquess of Rockingham.
The Earl's patronage, permitting the use of the Rockingham name and family crest, together with providing direct financial support, saw the Rockingham Pottery develop into a major producer of elaborate rococo-style porcelain, which enjoyed royal endorsement at home and considerable sales abroad. The factory produced major pieces including a full dessert service for William IV which took eight years to complete. Ruth Harman, in her 2017 revised volume, Yorkshire West Riding: Sheffield and the South, of the Pevsner Buildings of England series, notes that "perfection was their undoing" and by 1842 th |
https://en.wikipedia.org/wiki/Sven%20Apel | Sven Apel (born 1977) is a German computer scientist and professor of software engineering at Saarland University.
His research focuses on software product lines and configurable systems, domain-specific generation and optimization, software analytics and intelligence, as well as empirical methods and the human factor in software development.
Education and career
Sven Apel studied computer science at the University of Magdeburg from 1996 to 2002. At the same university, he also received his doctorate in computer science in 2007 with a thesis on the “Role of Features and Aspects in Software Development.”
After his doctorate, Apel was a postdoctoral researcher at the University of Passau until 2010. From 2010 to 2013, he led the Emmy Noether Junior Research Group “Secure and Efficient Software Product Lines” there before he was appointed professor in Passau in 2013 as part of the DFG's Heisenberg Program.
Since 2019, Sven Apel has been a professor of software engineering at Saarland University.
In 2019, Apel, together with Christian Kästner and Martin Kuhlemann, received the “Most Influential Paper Award” at the Systems and Software Products Line Conference (SPLC) for the paper “Granularity in Software Product Lines”. In the article, the three researchers demonstrate how programs can be extended by fine-grained import from other software.
In 2022, together with Janet Feigenspan, Christian Kästner, Jörg Liebig and Stefan Hanenberg, he was awarded the “Most Influential Paper Award” at the International Conference on Program Comprehension (ICPC) for the paper “Measuring programming experience”. In the article, the researchers present a questionnaire and an experiment to assess and measure a programmer's level of experience.
According to Google Scholar, he has an h-index of 69.
Research areas
Sven Apel's research focuses in particular on methods, tools, and theories for the construction of manageable, reliable, efficient, configurable, and evolvable software sys |
https://en.wikipedia.org/wiki/Neoproterozoic%20oxygenation%20event | The Neoproterozoic Oxygenation Event (NOE), also called the Second Great Oxidation Event (the first having occurred during the Palaeoproterozoic), was a time interval between around 850 and 540 million years ago which saw a very significant increase in oxygen levels in Earth's atmosphere and oceans. Bringing an end to the Boring Billion, a period of extremely low atmospheric oxygen spanning from the Statherian to the Tonian, the NOE was the second major increase in environmental oxygen on Earth, though it was not as major as the Great Oxidation Event (GOE). Unlike the GOE, it is unclear whether the NOE was a synchronous, global event or a series of asynchronous, regional oxygenation intervals with unrelated causes.
Evidence for oxygenation
Carbon isotopes
From around 850 Mya to around 720 Mya, a time interval roughly corresponding to the late Tonian, between the end of the Boring Billion and the onset of the Cryogenian "Snowball Earth", marine deposits record a very significant positive carbon isotope excursion. These elevated δ13C values are believed to be linked to an evolutionary radiation of eukaryotic plankton and enhanced organic burial, which in turn indicate a spike in oxygen production during this interval. Further positive carbon isotope excursions occurred during the Cryogenian. Although several negative carbon isotope excursions, associated with warming events, are known from the late Tonian all the way up to the Proterozoic-Phanerozoic boundary, the carbon isotope record nonetheless maintains a noticeable positive trend throughout the Neoproterozoic.
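The δ13C values discussed above use standard delta notation: a per-mil deviation of a sample's isotope ratio from a reference standard (VPDB for carbon). A minimal sketch, with an illustrative sample ratio rather than a measured value:

```python
# Delta notation: per-mil deviation of a sample's isotope ratio from a
# reference standard. The sample ratio below is illustrative, not measured.

def delta_per_mil(r_sample, r_standard):
    return (r_sample / r_standard - 1.0) * 1000.0

R_VPDB = 0.011238  # approximate 13C/12C ratio of the VPDB standard

# A sample enriched in 13C relative to the standard gives a positive
# excursion, like those recorded in the Tonian deposits described above.
print(round(delta_per_mil(0.011294, R_VPDB), 1))  # 5.0
```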
Nitrogen isotopes
δ15N data from 750-to-580-million-year-old marine sediments hailing from four different Neoproterozoic basins show similar nitrogen isotope ratios to modern oceans, with a mode of +4‰ and a range from −4‰ to +11‰. No significant change is observed across the Cryogenian-Ediacaran boundary, implying that oxygen was already ubiquitous in the global ocean as early as 750 Mya, during th |
https://en.wikipedia.org/wiki/ETransportation | eTransportation is a peer-reviewed open-access scientific journal covering all modes of transportation by using electricity (vehicles, ships and airplanes). The journal was established in 2019 and is published by Elsevier. The editor-in-chief is Minggao Ouyang (Tsinghua University). It is emphasized that efforts to advocate UN's goals of sustainable development are welcomed, specifically "Affordable and clean energy".
Abstracting and indexing
The journal is abstracted and indexed in Ei Compendex, Scopus, and the Science Citation Index Expanded. According to the Journal Citation Reports, the journal has a 2021 impact factor of 1.65.
References
External links
Electrical and electronic engineering journals
Academic journals established in 2019
English-language journals
Elsevier academic journals
Creative Commons Attribution-licensed journals
Continuous journals
Transportation journals |
https://en.wikipedia.org/wiki/Phylogenetic%20reconciliation | In phylogenetics, reconciliation is an approach to connect the history of two or more coevolving biological entities. The general idea of reconciliation is that a phylogenetic tree representing the evolution of an entity (e.g. homologous genes or symbionts) can be drawn within another phylogenetic tree representing an encompassing entity (respectively, species, hosts) to reveal their interdependence and the evolutionary events that have marked their shared history. The development of reconciliation approaches started in the 1980s, mainly to depict the coevolution of a gene and a genome, and of a host and a symbiont, which can be mutualist, commensalist or parasitic. It has also been used for example to detect horizontal gene transfer, or understand the dynamics of genome evolution.
Phylogenetic reconciliation can account for a diversity of evolutionary trajectories of what makes life's history, intertwined with each other at all scales that can be considered, from molecules to populations or cultures. A recent illustration of the importance of interactions between levels of organization is the holobiont concept, where a macro-organism is seen as a complex partnership of diverse species. Modeling the evolution of such complex entities is one of the challenging and exciting directions of current research on reconciliation.
Phylogenetic trees as nested structures
Phylogenetic trees are intertwined at all levels of organization, integrating conflicts and dependencies within and between levels. Macro-organism populations migrate between continents, their microbe symbionts switch between populations, the genes of their symbionts transfer between microbe species, and domains are exchanged between genes.
This list of organization levels is not representative or exhaustive, but gives a view of levels where reconciliation methods have been used.
As a generic method, reconciliation could take into account numerous other levels. For instance, it could consider the syntenic organiz |
https://en.wikipedia.org/wiki/Curb%20your%20dog | In New York City from the 1930s to 1978, before citywide Pooper-scooper laws were enacted, street signs were put in place informing and encouraging citizens to "curb their dog." The phrase meant that it was acceptable and desirable to allow/have your dog defecate in the edge of the street, near the curb and in "the gutter", rather than on the sidewalk.
The first known "curb your dog" signs in New York City, twenty-five in number, were distributed in 1937 "at points around the city" "in an effort to train owners."
In the 1970s, a curb your dog sign campaign was launched in response to a problem that was becoming intolerable. Signs were erected to educate residents that it was required for them to have their dogs defecate in the street gutter, as opposed to the sidewalk, with the intent that NYC Sanitation Department street sweeping machines would clean the streets on an overnight basis. This expensive approach to managing dog waste coincided with an NYC livability, demographic, and financial crisis and proved to be economically untenable.
The signs were of a civic nature, being informational and educational. They did not list fines, cite law, or express consequences. In New York City beginning in 1955, education regarding sanitation (including signage and campaigns) was seen as a cost-effective way to manage public quality-of-life and health concerns known as "street pollution." Benchmarked against today's civic sensibilities, it is hardly imaginable that a large amount of, or any, dog waste would be acceptably disposed of at the curbside or on the sidewalk.
"Curb Your Dog" signs from the late 1960s to 1970s were spartan in presentation, with a white border and white lettering stating "Curb your dog" above "Keep New York Clean" against a black background. A similar sign stated "leash, gutter and clean up after your dog Please."
The legacy of "curb your dog" signage remains in generational memory to such an extent that subs |
https://en.wikipedia.org/wiki/DREAM%20Challenges | DREAM Challenges (Dialogue for Reverse Engineering Assessment and Methods) is a non-profit initiative for advancing biomedical and systems biology research via crowd-sourced competitions. Started in 2006, DREAM challenges collaborate with Sage Bionetworks to provide a platform for competitions run on the Synapse platform. Over 60 DREAM challenges have been conducted over the span of over 15 years.
Overview
DREAM Challenges were founded in 2006 by Gustavo Stolovizky from IBM Research and Andrea Califano from Columbia University. The current chair of the DREAM organization is Paul Boutros of the University of California. The organization also includes emeritus chairs Justin Guinney and Gustavo Stolovizky, and multiple DREAM directors.
Individual challenges focus on tackling a specific biomedical research question, typically narrowed down to a specific disease. A prominent disease focus has been on oncology, with multiple past challenges focused on breast cancer, acute myeloid leukemia, and prostate cancer or similar diseases. The data involved in an individual challenge reflects the disease context; while cancers typically involve data such as mutations in the human genome, gene expression and gene networks in transcriptomics, and large scale proteomics, newer challenges have shifted towards single cell sequencing technologies as well as emerging gut microbiome related research questions, thus reflecting trends in the wider research community.
The motivation for DREAM Challenges is that by crowd-sourcing data analysis to a larger audience via competitions, better models and insights are gained than if the analysis were conducted by a single entity. Past competitions have been published in such scientific venues as the flagship journals of the Nature Portfolio and PLOS publishing groups. Results of DREAM challenges are announced via web platforms, and the top performing participants are invited to present their results in the annual RECOMB/ISCB Conferences with RSG/DREAM organized by the |
https://en.wikipedia.org/wiki/FinVect | In the mathematical field of category theory, FinVect (or FdVect) is the category whose objects are all finite-dimensional vector spaces and whose morphisms are all linear maps between them.
Properties
FinVect has two monoidal products:
the direct sum of vector spaces, which is both a categorical product and a coproduct,
the tensor product, which makes FinVect a compact closed category.
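Both monoidal products have concrete matrix descriptions: the direct sum of two linear maps is a block-diagonal matrix, and the tensor product is the Kronecker product. A plain-Python sketch (matrices represented as lists of rows, no libraries assumed):

```python
def direct_sum(f, g):
    """Block-diagonal matrix f (+) g: acts as f on the first summand, g on the second."""
    rf, cf = len(f), len(f[0])
    rg, cg = len(g), len(g[0])
    return [[f[i][j] if i < rf and j < cf
             else g[i - rf][j - cf] if i >= rf and j >= cf
             else 0
             for j in range(cf + cg)] for i in range(rf + rg)]

def tensor(f, g):
    """Kronecker product f (x) g, the product making FinVect compact closed."""
    return [[f[i][j] * g[k][l]
             for j in range(len(f[0])) for l in range(len(g[0]))]
            for i in range(len(f)) for k in range(len(g))]

f = [[2]]          # scalar multiplication k -> k, as a 1x1 matrix
g = [[0, 1],
     [1, 0]]       # the swap map on k^2
print(direct_sum(f, g))  # [[2, 0, 0], [0, 0, 1], [0, 1, 0]]
print(tensor(f, g))      # [[0, 2], [2, 0]]
```

Note how the dimensions add under the direct sum and multiply under the tensor product, matching the two monoidal structures.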
Examples
Tensor networks are string diagrams interpreted in FinVect.
Group representations are functors from groups, seen as one-object categories, into FinVect.
DisCoCat models are monoidal functors from a pregroup grammar to FinVect.
See also
FinSet
ZX-calculus
category of modules
References
Categories in category theory
Dimension |
https://en.wikipedia.org/wiki/Fast%20probability%20integration | Fast probability integration (FPI) is a method of determining the probability of a class of events, particularly a failure event, that is faster to execute than Monte Carlo analysis. It is used where large numbers of time-variant variables contribute to the reliability of a system. The method was proposed by Wen and Chen in 1987.
For a simple failure analysis with one stress variable, there will be a time-variant failure barrier beyond which the system will fail. This simple case may have a deterministic solution, but for more complex systems, such as crack analysis of a large structure, there can be a very large number of variables, for instance, because of the large number of ways a crack can propagate. In many cases, it is infeasible to produce a deterministic solution even when the individual variables are all individually deterministic. In this case, one defines a probabilistic failure barrier surface over the vector space of the stress variables.
If failure barrier crossings are assumed to comply with the Poisson counting process an expression for maximum probable failure can be developed for each stress variable. The overall probability of failure is obtained by averaging (that is, integrating) over the entire variable vector space. FPI is a method of approximating this integral. The input to FPI is a time-variant expression, but the output is time-invariant, allowing it to be solved by first-order reliability method (FORM) or second-order reliability method (SORM).
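As a minimal illustration of the kind of time-invariant problem handed to FORM, consider the simplest linear limit state g = R − S with independent, normally distributed resistance R and stress S: the reliability index β and failure probability Φ(−β) follow directly. The numerical values below are illustrative only, not from any real FPI analysis:

```python
import math

# FORM for the simplest case: linear limit state g = R - S with
# independent normal resistance R and stress S (illustrative values).
mu_R, sigma_R = 10.0, 1.0   # resistance: mean, standard deviation
mu_S, sigma_S = 6.0, 1.5    # stress: mean, standard deviation

# Reliability index: distance from the mean of g to the failure
# surface g = 0, measured in standard deviations of g.
beta = (mu_R - mu_S) / math.hypot(sigma_R, sigma_S)

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

p_fail = phi(-beta)  # probability of failure, Pf = Phi(-beta)
print(round(beta, 3), f"{p_fail:.2e}")
```

For nonlinear limit states, FORM linearizes the failure surface at the most probable failure point instead, and SORM adds a second-order correction; the β-to-probability step is the same.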
An FPI package is included as part of the core modules of the NASA-designed NESSUS software. It was initially used to analyse risks and uncertainties concerning the Space Shuttle main engine, but is now used much more widely in a variety of industries.
References
Bibliography
Beck, André T.; Melchers, Robert E., "Fatigue and fracture reliability analysis under random loading", pp. 2201–2204 in, Bathe, K.J (ed), Proceedings of the Second MIT Conference on Computational Fluid |
https://en.wikipedia.org/wiki/Color%20reproduction | Color reproduction is an aspect of color science concerned with producing light spectra that evoke a desired color, either through additive (light emitting) or subtractive (surface color) models. It converts physical correlates of color perception (CIE 1931 XYZ color space tristimulus values and related quantities) into light spectra that can be experienced by observers. In this way, it is the opposite of colorimetry.
It is concerned with the faithful reproduction of a color in one medium, with a color in another, so it is a central concept in color management and relies heavily on color calibration. For example, food packaging must be able to faithfully reproduce the colors of the foods therein in order to appeal to a customer. This involves proper color calibration of at least four devices:
Lighting, which must have a high color rendering index and not give a color cast to the object.
Camera, which measures the reflected spectrum of the object and converts to a trichromatic color space (e.g. RGB).
Screen, which reproduces color so a designer can proof the captured image and make color corrections as necessary.
Printer, which reproduces the final color on paper.
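One step in this chain, mapping device-independent CIE XYZ values to an sRGB screen signal, can be sketched with the standard sRGB matrix and transfer function (D65 white point). This is a simplified illustration that omits gamut mapping and ICC profile handling:

```python
# XYZ -> sRGB conversion: linear matrix step (sRGB specification, D65
# white point) followed by the sRGB transfer ("gamma") function.
M = [( 3.2406, -1.5372, -0.4986),
     (-0.9689,  1.8758,  0.0415),
     ( 0.0557, -0.2040,  1.0570)]

def srgb_gamma(c):
    c = min(max(c, 0.0), 1.0)  # clip out-of-gamut channel values
    return 12.92 * c if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

def xyz_to_srgb(x, y, z):
    return tuple(srgb_gamma(r[0] * x + r[1] * y + r[2] * z) for r in M)

# The D65 white point should map to (1, 1, 1): pure white on screen.
print([round(v, 3) for v in xyz_to_srgb(0.9505, 1.0, 1.0890)])  # [1.0, 1.0, 1.0]
```

A color-managed pipeline would apply analogous characterizations for the camera and printer, so that the same XYZ stimulus is reproduced consistently across all four devices.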
References
Further reading
Image processing
Visual perception
Psychophysics
Color |
https://en.wikipedia.org/wiki/Fluentgrid | Fluentgrid (formerly Phoenix IT Solutions) is an Indian software product company, which was founded in 1998. It provides digital transformation services for power, water, and gas distribution utilities and smart cities and communities. The company received the 2012 IBM Beacon Award and was also the winner of the Fierce Innovation Award 2015.
In 2016, Fluentgrid placed 14th in the Deloitte Technology Fast 50 Awards, recognising it as one of the fastest-growing technology businesses in India. The company was also selected for the Deloitte Technology Fast 500 Asia Pacific ratings. It was formerly known as Phoenix IT Solutions and is a signatory to the United Nations Global Compact.
History
The company was originally founded by Gannamani Sesha Murali Krishna as Phoenix Cybertech India Pvt Ltd in 1998. In 2001, the company was renamed Phoenix IT Solutions Ltd. In December 2015, the name of the company was changed to Fluentgrid Limited.
The company helped the Greater Visakhapatnam Municipal Corporation to set up and launch a state-of-the-art centralized City Command Center in 2016.
UPPCL CIS project implemented by the company became the finalist in DCD Global Awards 2018 under the Cloud Migration of the Year category.
In October 2021, the state government of Odisha assigned the company as the system integrator for the newly launched OPTCL's New Electricity Consumer Billing System in Odisha.
The company is credited with conceptualising the Utility Operations Center (UOC) in India. One of its customer care and billing systems was live for over 11 million consumers on a cloud pay-as-you-go model for multiple DISCOMs in Uttar Pradesh. It became a case study under the Ujwal DISCOM Assurance Yojana (UDAY) program of the Ministry of Power, presently serving over 23 million rural consumers on cloud. The AMI product suite of Fluentgrid powers smart metering rollouts in New Delhi Municipal Council and Kanpur Electricity Supply Company.
In July 2020, during the COVID-1 |
https://en.wikipedia.org/wiki/S2S%20%28mathematics%29 | In mathematics, S2S is the monadic second order theory with two successors. It is one of the most expressive natural decidable theories known, with many decidable theories interpretable in S2S. Its decidability was proved by Rabin in 1969.
Basic properties
The first order objects of S2S are finite binary strings. The second order objects are arbitrary sets (or unary predicates) of finite binary strings. S2S has functions s→s0 and s→s1 on strings, and predicate s∈S (equivalently, S(s)) meaning string s belongs to set S.
Some properties and conventions:
By default, lowercase letters refer to first order objects, and uppercase to second order objects.
The inclusion of sets makes S2S second order, with "monadic" indicating absence of k-ary predicate variables for k>1.
Concatenation of strings s and t is denoted by st, and is not generally available in S2S, not even s→0s. The prefix relation between strings is definable.
Equality is primitive, or it can be defined as s = t ⇔ ∀S (S(s) ⇔ S(t)) and S = T ⇔ ∀s (S(s) ⇔ T(s)).
In place of strings, one can use (for example) natural numbers with n→2n+1 and n→2n+2 but no other operations.
The set of all binary strings is denoted by {0,1}*, using Kleene star.
Arbitrary subsets of {0,1}* are sometimes identified with trees: a subset corresponds to a {0,1}-labeling of {0,1}*, which forms the complete infinite binary tree.
For formula complexity, the prefix relation on strings is typically treated as first order. Without it, not all formulas would be equivalent to Δ¹₂ formulas.
For properties expressible in S2S (viewing the set of all binary strings as a tree), for each node, only O(1) bits can be communicated between the left subtree and the right subtree and the rest (see communication complexity).
For a fixed k, a function from strings to k (i.e. natural numbers below k) can be encoded by a single set. Moreover, (s,t) ↦ s01t′, where t′ doubles every character of t, is injective, and s ↦ {s01t : t∈{0,1}*} is S2S definable. B |
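The correspondence noted above between finite binary strings and natural numbers (the empty string is 0, and the successors s→s0 and s→s1 correspond to n→2n+1 and n→2n+2) can be sketched as a round-trip computation:

```python
# Bijection between {0,1}* and the natural numbers induced by the two
# successor functions: epsilon <-> 0, s0 <-> 2n+1, s1 <-> 2n+2.

def string_to_nat(s):
    n = 0
    for ch in s:
        n = 2 * n + (1 if ch == "0" else 2)
    return n

def nat_to_string(n):
    chars = []
    while n > 0:
        n, r = divmod(n - 1, 2)   # inverts 2n+1 (r=0) and 2n+2 (r=1)
        chars.append("01"[r])
    return "".join(reversed(chars))

print(string_to_nat("01"), nat_to_string(4))  # 4 01
```

This is the sense in which one can work with natural numbers equipped with n→2n+1 and n→2n+2 in place of binary strings with the successors s→s0 and s→s1.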
https://en.wikipedia.org/wiki/Intracellular%20space | Intracellular space is the interior space of the plasma membrane. It contains about two-thirds of TBW. Cellular rupture may occur if the intracellular space becomes dehydrated, or if the opposite happens, where it becomes too bloated. Thus it is important for the liquid to stay in optimal quantity.
See also
Extracellular space
References
Cell anatomy
Cell biology |
https://en.wikipedia.org/wiki/AMD%20%C3%89lan | AMD Élan is a family of 32-Bit embedded SoCs marketed by AMD based on x86 microprocessors. All of these products have been backed with a long-term supply guarantee to meet the needs of embedded processors. However, when AMD acquired the Geode division from National Semiconductor in August 2003, the product was suddenly discontinued. The Élan processors have certainly provided some momentum in the embedded world.
SC3xx family
The SC300 and SC310 combine a 32-bit, x86-compatible, low-voltage 25 MHz or 33 MHz Am386SX CPU with a memory controller, PC/AT peripheral controllers, a real-time clock, PLL clock generators and an ISA bus interface. The SC300 additionally integrates two PCMCIA 2.1 slots and a CGA-compatible LCD controller. Power consumption at 33 MHz was only 600 mW including the CPU, memory controller and peripherals; in suspend mode consumption drops to 0.17 mW. The low-budget 9.7" laptop Brother GeoBook NB60 uses an SC300.
SC4xx family
The SC400 and SC410 combine a 32-bit, x86-compatible, low-voltage 33 MHz, 66 MHz or 100 MHz Am486 CPU with a memory controller, PC/AT peripheral controllers, a real-time clock, PLL clock generators, a VESA Local Bus and an ISA bus interface. The SC400 additionally integrates two PCMCIA 2.1 slots and a 4-bit color super-twisted nematic (STN) LCD controller. The C64 "Web.it" Internet Computer uses an SC400 with 16 MB of RAM, a 3.5" floppy disk drive, a 56k modem and PCMCIA. The AirPort Base Station uses an SC410, and the Nokia 9000 Communicator an SC450. AMD announced the SC400 on October 15, 1996.
SC5xx family
The SC520 combines a 32-bit, x86-compatible, low-voltage 100 MHz or 133 MHz Am5x86 CPU with a memory controller supporting SDRAM, PC/AT peripheral controllers, a real-time clock and a PCI bus. ISA and PCMCIA were no longer supported. AMD announced the SC520 on August 25, 1999.
Comparison
References
Microcontrollers
AMD x86 microprocessors
System on a chip
Embedded microprocessors |
https://en.wikipedia.org/wiki/Yellow.ai | Yellow.ai, formerly known as Yellow Messenger, is an enterprise conversational AI platform founded in 2016 and headquartered in San Mateo, California. It is an artificial intelligence platform that automates conversational experiences for customers and employees. It enables businesses to deliver human-like personalized interactions in the preferred language. The platform supports more than 135 languages.
History
Yellow.ai was founded by Raghu Ravinutala, Jaya Kishore Reddy Gollareddy and Rashid Khan in 2016 in Bangalore, India. Raghavendra Ravinutala and Jaya Kishore Reddy Gollareddy had quit their full-time jobs to establish Yellow Messenger; they met Rashid Khan at a college hackathon, after which he began working with them. In 2016, the company became part of Microsoft's accelerator and SAP Startup Studio.
In April 2021, during COVID-19, the company built chatbots to help governments with vaccinations. It launched Yellow Messenger Care to create COVID-19 help-related omnichannel chatbots, which helped NGOs and hospitals in their crisis management efforts. In June 2021, the company rebranded itself from Yellow Messenger to Yellow.ai. In 2023, Yellow.ai announced the launch of its Dynamic Automation Platform (DAP) and revealed a new logo as part of a larger rebranding strategy.
Partnership and client base
In January 2019, the company collaborated with Microsoft to work on transforming its voice automation using Azure Al Speech Services and Natural language processing (NLP) tools. In February 2022, the company partnered with Tech Mahindra to develop enterprise AI technology. It partnered with the e-commerce company Unicommerce in July 2020. In February 2022, Edelweiss General Insurance launched its AI Voice Bot, using Yellow.ai's technology. Yellow.ai implemented its AI-based customer service technology in Urja, a virtual assistant launched by the public sector company BPCL. The company has also formed partnerships with Accenture, Infosys, TCS |
https://en.wikipedia.org/wiki/Facilitation%20cascade | A facilitation cascade is a sequence of ecological interactions that occur when a species benefits a second species that in turn has a positive effect on a third species. These facilitative interactions can take the form of amelioration of environmental stress and/or provision of refuge from predation. Autogenic ecosystem engineering species, structural species, habitat-forming species, and foundation species are associated with the most commonly recognized examples of facilitation cascades, sometimes referred to as a habitat cascades. Facilitation generally is a much broader concept that includes all forms of positive interactions including pollination, seed dispersal, and co-evolved commensalism and mutualistic relationships, such as between cnidarian hosts and symbiodinium in corals, and between algae and fungi in lichens. As such, facilitation cascades are widespread through all of the earth's major biomes with consistently positive effects on the abundance and biodiversity of associated organisms.
Overview
Facilitation cascades occur when prevalent foundation species, or less abundant but ecologically important keystone species, are involved in a hierarchy of positive interactions and consist of a primary facilitator which positively affects one or more secondary facilitators which support a suite of beneficiary species. Facilitation cascades at a minimum have a primary and secondary facilitator, although tertiary, quaternary, etc. facilitators may be found in some systems.
A typical example of facilitation cascades in a tropical coastal ecosystem
Origin of concept and related terms
The term facilitation cascade was coined by Altieri, Silliman, and Bertness during a study on New England cobblestone beaches to explain the chain of positive interactions that allow a diverse community to exist in a habitat that is otherwise characterized by substrate instability, elevated temperatures, and desiccation stress. Cordgrass is able to establish independently, and t |
https://en.wikipedia.org/wiki/Megaherbivore | Megaherbivores (Greek μέγας megas "large" and Latin herbivora "herbivore") are large terrestrial herbivores that can exceed in weight. This polyphyletic group of megafauna includes elephants, rhinos, hippos, and giraffes. The largest bovids (gaurs and American bison) occasionally reach a weight of , but they are generally not considered to be megaherbivores. There are nine extant species of megaherbivores living in Africa and Asia. The African bush elephant is the largest extant species, with bulls reaching a height of up to and a maximum weight of .
All megaherbivores are keystone species in their environment. Their ecological role is to change the vegetative structure through feeding behavior and seed dispersal. Megaherbivores, like most large mammals, are K-selected species. They are characterized by their large size, invulnerability to predators, their impact on vegetation, and their dietary tolerance. Megaherbivores have been around for over 300 million years, but they are now extirpated from much of their historic range.
Species
This is a list of all nine extant species of megaherbivores, with a brief description for each.
Ecology
Elephants are mixed feeders, giraffes and Javan rhinos are browsers, while white and Indian rhinoceroses are true grazers. Megaherbivores consume graminoids and dicotyledons in varying proportions, with non-graminaceous monocots grouped together with the dicots. They prefer eating the foliage, stemmy material and fruits of plants. Elephants and rhinos exhibit hindgut fermentation, while giraffes, like bovids, are ruminants with foregut fermentation. Hippos display foregut fermentation but lack the distinctly divided stomach sections and the remastication that are typical of ruminants.
Due to their size, megaherbivores can defoliate the landscape. Because of this they are considered keystone species in their environment. They use their size, power and feeding behavior to change the structure and composition of vegetation, which affects both |
https://en.wikipedia.org/wiki/Data%20cooperative | A data cooperative is a group of individuals voluntarily pooling together their data. As an entity, a data cooperative is a type of data infrastructure, formed through the voluntary and collaborative pooling efforts of individuals. Data cooperatives allow individuals to get paid for the data they create and to exercise more pricing power than they would have on their own or in another type of data exchange. Examples include cooperatives of music artists, video producers, and gig workers. The income is not a subsidy, but rather the result of individual economic activity channeled through exchanges that aggregate the data of producers and workers, thereby turning individuals into data entrepreneurs. As a data infrastructure, data cooperatives are created, owned and operated by community members, which enables communities and their members to have full control over their data and over the decisions made from the insights gathered from it. By giving individual community members control over their data, data cooperatives are an innovative type of data infrastructure that acts as a counterweight against data brokers and data-driven corporations.
Key aspects
Ownership rights
One key aspect of data cooperatives is that the individual members of a data cooperative have control and legal ownership over their data. As a key aspect, ownership rights also refers to the notion that all members of a data cooperative must be able to collect copies of their data. This can be done either automatically through electronic means (e.g. passive data-traffic copying software on their devices) or by manually uploading data files to the cooperative. The collected data is stored in a member's personal data store (PDS). Within an individual's personal data store, members have the ability to add, remove or restrict access to personal data.
Fiduciary obligations to members
This key aspect of data cooperatives refers to the legally bound oblig |
https://en.wikipedia.org/wiki/L%C3%B6vheim%20Cube%20of%20Emotions | Lövheim Cube of Emotion is a theoretical model for the relationship between the monoamine neurotransmitters serotonin, dopamine and noradrenaline and emotions. The model was presented in 2012 by Swedish researcher Hugo Lövheim.
Lövheim classifies emotions according to Silvan Tomkins, and orders the basic emotions in a three-dimensional coordinate system where the level of the monoamine neurotransmitters form orthogonal axes. The model is regarded as a dimensional model of emotion.
The main concepts of the hypothesis are that the monoamine neurotransmitters are orthogonal in essence, and the proposed one-to-one relationship between the monoamine neurotransmitters and emotions.
References
Psychology articles needing expert attention
Emotion
Mathematical psychology
Affective science |
https://en.wikipedia.org/wiki/Landau%20kernel | The Landau kernel is named after the German number theorist Edmund Landau. The kernel is a summability kernel defined as:
where the coefficients are defined as follows
Visualisation
Using integration by parts, one can show that:
Hence, this implies that the Landau Kernel can be defined as follows:
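The displayed equations in this entry did not survive extraction. A standard statement of the Landau kernel, consistent with the surrounding description (normalisation conventions vary between sources, so this reconstruction may differ in form from the original article), is:

```latex
L_n(t) =
  \begin{cases}
    \dfrac{(1 - t^2)^n}{c_n}, & -1 \le t \le 1,\\[4pt]
    0, & \text{otherwise,}
  \end{cases}
\qquad
c_n = \int_{-1}^{1} (1 - s^2)^n \, ds .
% Integration by parts evaluates the normalising constant:
c_n = \frac{2^{2n+1}\,(n!)^2}{(2n+1)!},
\qquad\text{hence}\qquad
L_n(t) = \frac{(2n+1)!}{2^{2n+1}\,(n!)^2}\,(1 - t^2)^n \quad\text{on } [-1, 1].
```

As a quick check, for n = 1 the integral gives c_1 = 2 - 2/3 = 4/3, matching the closed form 2^3 (1!)^2 / 3! = 4/3.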
Plotting this function for different values of n reveals that, as n goes to infinity, the kernel approaches the Dirac delta function, as seen in the image, where the following functions are plotted.
Properties
Some general properties of the Landau kernel are that it is nonnegative and continuous on . These properties are made more concrete in the following section.
Dirac sequences
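The bullet list defining Dirac sequences is missing from this extract; the usual defining properties of a sequence of functions \(\{K_n\}\), which the Landau kernels satisfy, are:

```latex
% 1. Nonnegativity:
K_n(t) \ge 0 \quad \text{for all } t \text{ and all } n.
% 2. Unit mass:
\int_{-\infty}^{\infty} K_n(t)\, dt = 1 \quad \text{for all } n.
% 3. Concentration at the origin:
\lim_{n \to \infty} \int_{|t| \ge \delta} K_n(t)\, dt = 0 \quad \text{for every } \delta > 0.
```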
The third bullet point means that the area under the graph of the function becomes increasingly concentrated close to the origin as n approaches infinity. This definition lends us to the following theorem.
Proof: We prove the third property only. In order to do so, we introduce the following lemma:
Proof of the Lemma:
Using the definition of the coefficients above, we find that the integrand is even, and so we may write out the integral, completing the proof of the lemma. A corollary of this lemma is the following:
See also
Poisson Kernel
Fejér Kernel
Dirichlet Kernel
References
Mathematical analysis |
https://en.wikipedia.org/wiki/Paul%20Chaleff | Paul Chaleff (born 1947) is an American ceramist and professor emeritus of Fine Arts at Hofstra University. He is considered a pioneer of the revival of wood-fired ceramics in the US and credited as one of the first to use wood-burning dragon kilns in the style of the anagama tradition. He is best known as an innovator of large-scale ceramic sculpture. His work can be found in the collections of the Museum of Modern Art Department of Architecture and Design, and in the Metropolitan Museum of Art.
Paul Chaleff's work was strongly influenced by master potter Takeshi Nakazato. In 1989, Chaleff began collaborating with sculptor Sir Anthony Caro. Together they created nearly 50 works, both figurative and abstract. Caro's sculpture has had a direct influence on Chaleff's work as has the sculpture of Isamu Noguchi, and the ceramics of John Mason and Lucie Rie. Chaleff has also been recognized as an innovator of large-scale ceramic sculpture. The strength of his works stems from their being rough, gestural, split, and impure while remaining elegant.
Education
Chaleff attended the Bronx High School of Science. In 1968, while studying biology at the City College of New York, Chaleff survived a drowning accident that took his friend's life. He graduated in 1969 with a degree in Fine Arts. In 1971, Chaleff received his Master of Fine Arts in Ceramic Design from City College of New York. In 1975 he traveled to Japan to study Japanese pottery and wood-burning kiln design and returned to New York in 1977 where he built a studio and kilns in Pine Plains.
Career
Chaleff's anagama kiln was one of the first in the US. In 1980, the Museum of Modern Art purchased and exhibited his work from that kiln. In 1980, his wood-fired work was showcased at an official State dinner at the White House. Between 1989 and 2000, Chaleff collaborated on a series of clay sculptures with Sir Anthony Caro in his studio, first in Pine Plains and then Ancram. In 1995, he participated in Fire and Clay, a s |
https://en.wikipedia.org/wiki/Heterogonesis | Heterogonesis describes the segregation of parental genomes into distinct cell lineages in the dividing zygote.
Fertilisation occurs when an ovum fuses with a sperm, forming a zygote. Normally, the genomes of the two parents assort into two diploid bi-parental daughter cells. In a heterogoneic cell division, the genome of only one parent assorts into a single daughter cell following the formation of a tripolar (rather than the normal bipolar) spindle apparatus. Heterogonesis allows for chromosomal segregation to occur in a dispermic fertilisation which may subsequently result in chimerism or sesquizygosis.
The term heterogonesis was coined in 2016 by Destouni and Vermeesch who observed the phenomenon in bovine zygotes. The word is derived from the Greek meaning "different parental origin".
References
Cell anatomy
Chimerism
Developmental biology
Genetics concepts
Mitosis
Reproduction |
https://en.wikipedia.org/wiki/Navin%20Kartik | Navin Kartik is an American economist. He is a professor of economics at Columbia University.
Biography
Kartik received his B.A. from Brandeis University and Ph.D. from Stanford University. He was a member of the Institute for Advanced Study from 2007 to 2008. He taught at University of California, San Diego from 2004 to 2009 before joining the Columbia faculty. His research has focused on applied game theory and political economy.
Kartik was elected a fellow of the Econometric Society in 2022. He was also a recipient of a Sloan Research Fellowship in 2010. In 2023 he became Editor of the American Economic Journal: Microeconomics; he also received the Lenfest Distinguished Faculty award at Columbia.
References
Living people
American economists
Econometricians
Columbia University faculty
University of California, San Diego faculty
Sloan Research Fellows
Institute for Advanced Study people
Brandeis University alumni
Stanford University alumni
Game theorists
Political economists
Year of birth missing (living people) |
https://en.wikipedia.org/wiki/Transpirational%20cooling%20%28biological%29 | Transpirational cooling is the cooling provided as plants transpire water. Excess heat generated by solar radiation is damaging to plant cells, and thermal injury occurs during drought or when rapid transpiration produces wilting. Green vegetation contributes to moderating climate by being cooler than adjacent bare earth or constructed areas. As plant leaves transpire, they use energy to evaporate water, which aggregates to a huge volume globally every day.
An individual tree transpiring 100 litres of water provides a cooling effect equivalent to about 70 kWh. Urban heat island effects can be attributed to the replacement of vegetation by constructed surfaces. Deforested areas show a higher temperature than adjacent intact forest. Forests and other natural ecosystems support climate stabilisation.
The Earth’s energy budget reveals pathways to mitigate climate change, drawing on our knowledge of how effectively plants cool and moderating Western approaches with proven indigenous and traditional sources of knowledge.
Transpiration and cooling
Evapotranspiration is the combined processes moving water from the earth’s surface into the atmosphere. Transpiration is the movement of water through a plant and out of its leaves and other aerial parts into the atmosphere. This movement is driven by solar energy. In the tallest trees, such as Sequoia sempervirens, the water rises well over 100 metres from root-tip to canopy leaves. Such trees also exploit evaporation to keep the surface cool. Water vapour from evapotranspiration mixed with air moves upwards to the point of saturation and then, helped by the emissions of cloud condensation nuclei, forms clouds. Each gram molecule (mole) of condensing water brings about a marked reduction in volume of more than 1,200-fold. The simultaneous release of latent heat will drive air from below to fill the partial vacuum. The energy required for the surrounding air to move in is readily calculated from the small (one-fifteenth of late |
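The 1,200-fold condensation figure can be sanity-checked with a quick ideal-gas estimate. This is a back-of-the-envelope sketch of my own (not from the original article), assuming ideal-gas behaviour near 0 °C and a liquid molar volume of 18.02 mL:

```python
# Estimate the volume reduction when one mole of water vapour condenses.
# Assumptions: ideal gas at 273.15 K and 1 atm; liquid water at 18.02 mL/mol.
R = 8.314            # gas constant, J/(mol*K)
T = 273.15           # temperature, K
p = 101325.0         # pressure, Pa

molar_volume_gas = R * T / p        # m^3/mol, about 0.0224 (22.4 L)
molar_volume_liquid = 18.02e-6      # m^3/mol (18.02 mL)

ratio = molar_volume_gas / molar_volume_liquid
print(round(ratio))  # ~1244, consistent with the "1,200-fold plus" figure
```

The ratio grows with temperature (the vapour expands while the liquid volume barely changes), so "more than 1,200-fold" is a conservative statement of the effect.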
https://en.wikipedia.org/wiki/Kleptotype | In taxonomy, a kleptotype is an unofficial term referring to a stolen, unrightfully displaced type specimen or part of a type specimen.
Etymology
The term is composed of klepto-, from the Ancient Greek (kléptō) meaning "to steal", and -type referring to type specimens. It translates to "stolen type".
History
During the Second World War, biological collections such as the herbarium in Berlin were destroyed. This led to the loss of type specimens. In some cases only kleptotypes survived the destruction, as the type material had been removed from the original collections. For instance, the type of Taxus celebica was thought to have been destroyed during the war, but a kleptotype survived in Stockholm.
Kleptotypes have been taken by researchers, who subsequently added their unauthorised type duplicates to their own collections.
Consequences
Taking kleptotypes has been criticised as destructive, wasteful, and unethical. The displacement of type material complicates the work of taxonomists, as species identities may become ambiguous when type material is missing. It can cause problems, as researchers have to search multiple collections to get a complete perspective on the displaced material. To combat this issue, it has been proposed to weigh specimens before loaning types, and to identify loss of material by comparing the type's weight upon return. Also, in some herbaria, such as the Kew herbarium, specimens are glued to the herbarium sheets to hinder the removal of plant material. However, this also makes it difficult to handle the specimens.
Rules concerning type specimens
The International Code of Nomenclature for algae, fungi, and plants (ICN) does not explicitly prohibit the removal of material from type specimens; however, it strongly recommends that type specimens be properly conserved. It is paramount that types remain intact, as they are an irreplaceable resource and point of reference.
References
Biological concepts
|
https://en.wikipedia.org/wiki/WordUp%20%28program%29 | WordUp is a word processor for the Atari ST platform released by Neocept in 1988. It was one of the first word processors on the platform to offer a true what you see is what you get (WYSIWYG) display, using GDOS to work with multiple fonts and embedded graphics. Most previous word processors on the platform were either entirely text-based, like Atari's own ST Writer, or did not use GDOS and did not support multiple fonts and effects on-screen.
Overall, the program was relatively simple, similar to MacWrite. It did offer some more powerful features, like the ability to generate a glossary and good control over typography. Reviews were generally positive, especially regarding its ability to easily perform layout and editing in rich documents, but the lack of a spell checker and the very slow printing were notable concerns in most reviews.
Description
GDOS
The Graphics Environment Manager, or GEM, formed the basis of the Atari ST's graphical user interface (GUI). GEM included a system known as GDOS, short for Graphics Device Operating System, which was designed to virtualize graphics output in the same fashion that CP/M's BDOS allowed different input/output devices to be virtualized. This meant that graphics generated for one device could be sent to any other GDOS device, for instance, from the screen to a printer. GDOS introduced significant overhead which noticeably affected the speed of all applications on the system, not just those using it.
For performance reasons, few word processors on the ST made use of GDOS and instead called the underlying graphics routines directly. This meant they lacked the features like multiple fonts and WYSIWYG layout that would be seen on the Macintosh even in simple programs like MacWrite. Their GUIs were limited to issuing commands through the menu system and interaction using dialog boxes. Among the most popular word processors on the ST was Atari's own ST Writer, which took this to its extreme and removed any GUI at all, opening in a |
https://en.wikipedia.org/wiki/Data%20care | Data care refers to treating people and their private information fairly and with dignity. Data has become progressively more utilized in societies all over the world, whether in securely storing a medical patient's data, an employee's data, or a citizen's private data. The concept of data care emerged from the increase in data usage over the years; it is a term used to describe the act of treating people and their data with care and respect. The concept holds that caring for people's data is the responsibility of those who govern data, for example businesses and policy makers; that data should be cared for in an ethical manner, keeping in mind the people the data belongs to; and it draws on the concept of 'slow computing' as a means of creating and maintaining proper data care.
Defining data care
Defining data care means treating people and their private information fairly and with dignity in terms of their data. It is a term used by the cybersecurity industry to teach people to be more careful with their data on social media and their mobile devices. Such information could include banking information, addresses, and other personal information. In 2019, a United States bill required social media platforms to be more responsible with their users' private data, which will help ensure proper data care. This is one example of how implementing proper data care policy can put pressure on these companies to achieve data justice. Data care aims to allow data navigation while countering data power, and encourages "slow computing" (see below), all of which helps reduce datafication and makes it more difficult for people's data to be traced. It also encourages open-source alternatives that make data more difficult to trace, something the cybersecurity industry has been working toward for some time as a means to help protect people's privacy. Proper d |
https://en.wikipedia.org/wiki/Hive%20Social | Hive Social is a microblogging service and mobile app. The app received news coverage during the acquisition of Twitter by Elon Musk in November 2022.
Hive Social was developed by Raluca Pop, also known as Kassandra Pop, with the help of a freelance developer, and the first version launched on the Apple App Store in October 2019. A beta version for Android was released via the Google Play store on November 10, 2022. The app has been described as a hybrid between Twitter, Instagram and Tumblr, with text, image and video posts supported. Hive is rated for ages 17+ and explicit images are permitted, including sexual intercourse, genitalia and nude close-ups. Hive features a chronological timeline with no personalisation algorithms, while profiles offer a MySpace-like song playing feature.
the app has reached number one on the top free social apps chart on the App Store and has over one million user accounts, despite email verification limits being reached. This spike in growth can be attributed to users who were leaving Twitter due to the chaos that ensued following Elon Musk's purchase of Twitter on October 27, 2022.
On 30 November 2022, "zerforschung", a German hacker collective, published information about severe security issues with Hive Social, among them the possibility to access all personal data, including private posts, private messages, shared media and even deleted direct messages. This also included private email addresses and phone numbers entered during login. Attackers could also overwrite data such as posts owned by other users. In response to the publication of the security report, Hive abruptly shut down their service to attempt to address the vulnerabilities. As of December 15, 2022, Hive's servers were back online and the platform returned online. The app also left beta on Android devices and saw a full release on the Google Play Store. Some features were disabled in the 2.0.0 update after the shutdown, such as direct messages, music, and polls.
See also |
https://en.wikipedia.org/wiki/Structural%20identifiability | In the area of system identification, a dynamical system is structurally identifiable if it is possible to infer its unknown parameters by measuring its output over time. This problem arises in many branches of applied mathematics, since dynamical systems (such as those described by ordinary differential equations) are commonly used to model physical processes, and these models contain unknown parameters that are typically estimated using experimental data.
However, in certain cases, the model structure may not permit a unique solution for this estimation problem, even when the data is continuous and free from noise. To avoid potential issues, it is recommended to verify the uniqueness of the solution in advance, prior to conducting any actual experiments. The lack of structural identifiability implies that there are multiple solutions for the problem of system identification, and the impossibility of distinguishing between these solutions suggests that the system has poor forecasting power as a model. On the other hand, control systems have been proposed with the goal of rendering the closed-loop system unidentifiable, decreasing its susceptibility to covert attacks targeting cyber-physical systems.
Examples
Linear time-invariant system
Consider a linear time-invariant system with the following state-space representation:
and with initial conditions given by and . The solution of the output is
which implies that the parameters and are not structurally identifiable. For instance, the parameters generate the same output as the parameters .
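The article's state-space matrices and parameter values did not survive extraction, so the following minimal sketch uses a hypothetical scalar system of my own (not the article's example) to illustrate the idea: two different parameter pairs produce exactly the same output trajectory, so the individual parameters cannot be recovered from the output.

```python
import math

def output(theta1, theta2, t):
    # Hypothetical model: x' = -x with x(0) = theta1, and y = theta2 * x,
    # so y(t) = theta1 * theta2 * exp(-t).  Only the product theta1*theta2
    # appears in the output, so the parameters are not structurally
    # identifiable individually.
    return theta1 * theta2 * math.exp(-t)

ts = [0.1 * k for k in range(50)]
y_a = [output(2.0, 3.0, t) for t in ts]   # parameter pair (2, 3)
y_b = [output(1.0, 6.0, t) for t in ts]   # different pair, same product

# The two trajectories coincide at every sample point.
print(all(abs(a - b) < 1e-12 for a, b in zip(y_a, y_b)))  # True
```

No amount of noise-free output data can distinguish (2, 3) from (1, 6) here; only a reparameterisation in terms of the product would be identifiable.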
Non-linear system
A model of a possible glucose homeostasis mechanism is given by the differential equations
where (c, si, p, α, γ) are parameters of the system, and the states are the plasma glucose concentration G, the plasma insulin concentration I, and the beta-cell functional mass β. It is possible to show that the parameters p and si are not structurally identifiable: any numerical choice of parameters p an |
https://en.wikipedia.org/wiki/Alceon%20Group | Alceon Group is an Australian manufacturer. It owns the Decor Corporation (founded in Melbourne in 1958 and acquired by Marlin Brands in 2017) and Willow Ware Australia (founded in Melbourne in 1887).
Willow had its origin in Wilson Brothers Pty, founded by Ralph and Richard, sons of Ralph Wilson, sen., (c. 1826 – 14 June 1901) and Elizabeth Wilson ( – 21 April 1912). With start-up capital from their parents, they began making tin cans in 1887, then developed a factory in North Melbourne. Ralph Wilson (1865 – 10 December 1930) married Agnes Kirkwood Twaddell (1870–1946) in 1896, and had a home "Benarty", in High street, Malvern. He seems to have been a respected employer, but nothing has been found of his brother Richard's involvement, apart from his retirement in 1906. Apart from robberies and vandalism, the company was never in the news; they established the "Willow" brand in the 1920s, making billies, Coolgardie safes, etc.; became W., M., Y., and A. H. Wilson Ltd. They later moved to Tullamarine, and now only make plastic products. Since 2018 owned by Decor Corporation, a subsidiary of Marlin Management Services.
Brands
Marlin Brands
Owns 50% of Marlin Brands with Oaktree Capital Management.
EziBuy
Zanui - Homewares Stores
Decor - Food Storage Containers, Brushware & Mops
Reva - Pegs
Starmaid - Document Storage
Willow - BBQ Needs, Document Storage
Mosaic Brands
In July 2018, City Chic Collective sold five of its brands; Autograph, Crossroads, Katies, Millers and Rivers, to Noni B (later renamed to Mosaic Brands) leaving only one brand, City Chic, in its portfolio. The remaining operations were rebranded City Chic Collective in November 2018.
Owns 36% of Mosaic Brands.
Autograph
BeMe
Crossroads
Events
Katies
Liz Jordan
Maggie T
Millers
Noni B
Rivers
Rockmans
Table Eight
W. Lane
See also
List of companies of Australia
Manufacturing in Australia
References
External links
Alceon Group | Shop Ethical! company profile
Australian companies established in 1887
Compa |
https://en.wikipedia.org/wiki/Fist%20and%20rose | The fist and rose, sometimes called the rose in the fist or fist with a rose, is an emblem used or formerly used by a number of socialist and social democratic parties around the world.
It depicts a rose, symbolizing the promises of a better life under a socialist government, and a clenched fist holding it, symbolizing the activist commitment and solidarity necessary to achieve it. The rose is displayed in the red colour associated with left-wing politics; recent variants display the leaves in green, reflecting the rise of environmental concerns. Its design involves political symbolism drawn from the history of socialism and social democracy, while also alluding to the counterculture of the 1960s.
The emblem was drawn in 1969 by the French graphic artist Marc Bonnet and became popular within the Socialist Party (PS), which made it its official logo in 1971. It was later used, with slight or large alterations and adaptations, by several parties elsewhere in Europe as well as in Africa, America, and Asia, although some have retired it since the end of the 20th century. In 1979, it was also taken up by the Socialist International (SI). It has often been chosen to provide an attractive visual alternative to the communist hammer and sickle, and to signal a party's affiliation to the SI and kinship with foreign left-wing parties.
History and use
France
Creation and adoption
The emblem was created in France within the Socialist Party (PS), at the time of its transformation from the prior SFIO at the Alfortville Congress (May 1969) and of its enlargement to the rest of the “non-communist left” at the Épinay Congress (June 1971). The emblem of the SFIO was the Three Arrows, a 1930s anti-fascist symbol, which was falling into disuse as the party wished to modernize. The Ceres, a left-leaning faction, had taken control of the PS Paris Federation and actively sought to change the party. The initiative for the emblem is frequently, although disputedly, credited to Didier Motchan
https://en.wikipedia.org/wiki/The%20Erd%C5%91s%20Distance%20Problem | The Erdős Distance Problem is a monograph on the Erdős distinct distances problem in discrete geometry: how can one place points into -dimensional Euclidean space so that the pairs of points make the smallest possible distance set? It was written by Julia Garibaldi, Alex Iosevich, and Steven Senger, and published in 2011 by the American Mathematical Society as volume 56 of the Student Mathematical Library. The Basic Library List Committee of the Mathematical Association of America has suggested its inclusion in undergraduate mathematics libraries.
Topics
The Erdős Distance Problem consists of twelve chapters and three appendices.
After an introductory chapter describing the formulation of the problem by Paul Erdős and Erdős's proof that the number of distances is always at least proportional to , the next six chapters cover the two-dimensional version of the problem. They build on each other to describe successive improvements to the known results on the problem, reaching a lower bound proportional to in Chapter 7. These results connect the problem to other topics including the Cauchy–Schwarz inequality, the crossing number inequality, the Szemerédi–Trotter theorem on incidences between points and lines, and methods from information theory.
Subsequent chapters discuss variations of the problem: higher dimensions, other metric spaces for the plane, the number of distinct inner products between vectors, and analogous problems in spaces whose coordinates come from a finite field instead of the real numbers.
Audience and reception
Although the book is largely self-contained, it assumes a level of mathematical sophistication aimed at advanced university-level mathematics students. Exercises are included, making it possible to use it as a textbook for a specialized course. Reviewer Michael Weiss suggests that the book is less successful than its authors hoped at reaching "readers at different levels of mathematical experience": the density of some of its material, n |
https://en.wikipedia.org/wiki/Geometric%20Origami | Geometric Origami is a book on the mathematics of paper folding, focusing on the ability to simulate and extend classical straightedge and compass constructions using origami. It was written by Austrian mathematician and published by Arbelos Publishing (Shipley, UK) in 2008. The Basic Library List Committee of the Mathematical Association of America has suggested its inclusion in undergraduate mathematics libraries.
Topics
The book is divided into two main parts. The first part is more theoretical. It outlines the Huzita–Hatori axioms for mathematical origami, and proves that they are capable of simulating any straightedge and compass construction. It goes on to show that, in this mathematical model, origami is strictly more powerful than straightedge and compass: with origami, it is possible to solve any cubic equation or quartic equation. In particular, origami methods can be used to trisect angles, and for doubling the cube, two problems that have been proven to have no exact solution using only straightedge and compass.
The second part of the book focuses on folding instructions for constructing regular polygons using origami, and on finding the largest copy of a given regular polygon that can be constructed within a given square sheet of origami paper. With straightedge and compass, it is only possible to exactly construct regular for which is a product of a power of two with distinct Fermat primes (powers of two plus one): this allows to be 3, 5, 6, 8, 10, 12, etc. These are called the constructible polygons. With a construction system that can trisect angles, such as mathematical origami, more numbers of sides are possible, using Pierpont primes in place of Fermat primes, including for equal to 7, 13, 14, 17, 19, etc. Geometric Origami provides explicit folding instructions for 15 different regular polygons, including those with 3, 5, 6, 7, 8, 9, 10, 12, 13, 17, and 19 sides. Additionally, it discusses approximate constructions for polygons that cann |
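The constructibility criterion described above — a regular polygon is origami-constructible when its number of sides is a product of a power of two, a power of three, and distinct Pierpont primes — can be checked with a short script. The helper names below are my own; the criterion itself is as stated in the text.

```python
def prime_factors(n):
    """Return {prime: multiplicity} for n >= 2 by trial division."""
    factors = {}
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors[d] = factors.get(d, 0) + 1
            n //= d
        d += 1
    if n > 1:
        factors[n] = factors.get(n, 0) + 1
    return factors

def is_pierpont_prime(p):
    """Prime p with p - 1 = 2^u * 3^v (e.g. 2, 3, 5, 7, 13, 17, 19, ...)."""
    if p < 2 or prime_factors(p) != {p: 1}:
        return False
    m = p - 1
    for q in (2, 3):
        while m % q == 0:
            m //= q
    return m == 1

def origami_constructible(n):
    """n-gon foldable iff n = 2^a * 3^b * (distinct Pierpont primes > 3)."""
    if n < 3:
        return False
    for p, mult in prime_factors(n).items():
        if p in (2, 3):
            continue  # powers of 2 and 3 are always allowed
        if mult > 1 or not is_pierpont_prime(p):
            return False
    return True

# 11 is the first side count that origami cannot reach exactly.
print([n for n in range(3, 21) if origami_constructible(n)])
```

The resulting list includes the 7-, 13-, 17-, and 19-gons mentioned in the text, which straightedge and compass alone cannot produce.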
https://en.wikipedia.org/wiki/Method%20of%20virtual%20quanta | The method of virtual quanta is a method used to calculate radiation produced by interactions of electromagnetic particles, particularly in the case of bremsstrahlung. It can also be applied in the context of gravitational radiation, and more recently to other field theories. The method was first developed by C. F. Weizsaecker and E. J. Williams in 1934.
Background
In problems of collision between charged particles or systems, the incident particle is often travelling at relativistic speeds when impacting the struck system, producing the field of a moving charge as follows:
where indicates the component of the electric field in the direction of travel of the particle, indicates the E-field in the direction perpendicular to and in the plane of the collision, is the impact parameter, is the Lorentz factor, the charge and the velocity of the incident particle.
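The field expressions referenced here are missing from the extract. For a charge \(q\) moving with speed \(v\) and Lorentz factor \(\gamma\) along the \(x_1\) axis, observed at impact parameter \(b\), the standard textbook result in Gaussian units is as follows (this reconstruction follows the conventional treatment and may differ in notation or sign convention from the original article):

```latex
E_1(t) = -\,\frac{q\,\gamma v t}{\left(b^2 + \gamma^2 v^2 t^2\right)^{3/2}},
\qquad
E_2(t) = \frac{q\,\gamma b}{\left(b^2 + \gamma^2 v^2 t^2\right)^{3/2}},
\qquad
B_3(t) = \beta\, E_2(t),
\quad \beta = v/c .
```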
In the ultrarelativistic limit, and have the form of a pulse of radiation travelling in the direction. This creates the virtual radiation pulse (virtual quanta) denoted by . Moreover, an additional magnetic field may be added in order to turn into a radiation pulse travelling along , denoted . This virtual magnetic field will turn out to be much smaller than , hence its contribution to the motion of particles is minimal.
By taking this point of view, the problem of the collision can be treated as a scattering of radiation. Similar analogies can be made for other processes (e.g. the ionisation of an atom by a fast electron can be treated as photoexcitation).
Bremsstrahlung
In the case of bremsstrahlung, the problem becomes one of the scattering of the virtual quanta in the nuclear Coulomb potential. This is a standard problem, and the cross section of the scattering is known as the Thomson cross section, whose total value is σ_T = (8π/3)(e²/mc²)² ≈ 6.65 × 10⁻²⁵ cm².
The differential radiation cross section per unit frequency is hence the product of the Thomson cross section with N(ω), the frequency spectrum of virtual quanta produced by the incident particle over all possible impact parameters.
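For orientation, the total Thomson cross section quoted above can be evaluated numerically; this sketch hard-codes the CODATA value of the classical electron radius r_e = e²/mc²:

```python
import math

# Classical electron radius in metres (CODATA value, hard-coded here).
r_e = 2.8179403262e-15

# Total Thomson cross section: sigma_T = (8 * pi / 3) * r_e^2,
# about 6.65e-29 m^2 = 6.65e-25 cm^2 (0.665 barn).
sigma_T = (8 * math.pi / 3) * r_e ** 2
```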
Oth |
https://en.wikipedia.org/wiki/Spell%20Catcher | Spell Catcher, originally known as Thunder!, is a stand-alone spell checker for Atari ST, Macintosh and Microsoft Windows systems. It was published continually from 1985 until the untimely 2012 death of the primary developer, Evan Gross. Its original name refers to its lightning-fast speed, which set it apart from other spell checkers on the platform like Spellswell.
In addition to basic spell checking, later versions of the program offered a viewable dictionary and thesaurus, user-defined macro expansions, and auto-complete. Another notable feature was strong multilingual support, a rarity among such programs of the era. Over time, many of these features were also added to Mac OS X and Windows built-in spell checking functionality, but Spell Catcher remained in use due to its other features. It no longer runs under the most recent versions of macOS, which require 64-bit applications.
History
The program was originally released in 1985 for the Atari ST platform by Toronto-based Batteries Included, who sold it under the original name Thunder!. The designer of this version is listed as Mark Skapinker, but this name appears on no other documentation. There were two spell checkers with similar names at the same time, Turbo Lightning for DOS, and Mac Lightning for the Mac. Thunder also included statistics functions, including a word count and "grade level".
The Mac version was released a couple of months later. Batteries Included was purchased by Electronic Arts (EA) in 1987, as part of an EA buying spree. Thunder II was released in 1988, changing from a desk accessory to a control panel that hooked directly into the classic Mac OS text editing system to allow inline checking and replacement. During this period, the program won MacUser Eddy awards in 1986 and 1988.
In 1990, Gross was readying a version that ran under Apple's new System 7. EA had lost interest in the utilities market and sold the publishing rights to Baseline Publishing. The new Thunder 7 was release |
https://en.wikipedia.org/wiki/List%20of%20lichenicolous%20fungi%20of%20Iceland | This list of lichenicolous fungi of Iceland is based on a compiled checklist from 2009 with the taxonomy of the fungi revised in 2022 using the Global Biodiversity Information Facility online database.
Abrothallus parmeliarum
Arthonia epiphyscia
Arthonia fuscopurpurea
Arthonia gelidae
Arthonia intexta
Arthonia stereocaulina
Arthonia varians
Arthophacopsis parmeliarum
Bachmanniomyces punctum (listed as Phaeopyxis punctum)
Bachmanniomyces uncialicola
Buellia adjuncta
Carbonea supersparsa
Carbonea vitellinaria
Cecidonia umbonella
Cecidonia xenophana
Cercidospora epipolytropa
Cercidospora macrospora
Cercidospora punctillata
Cercidospora stereocaulorum
Cercidospora thamnoliicola
Cercidospora trypetheliza
Cercidospora verrucosaria
Clypeococcum placopsiphilum
Collemopsidium cephalodiorum (listed as Cercidispora cephalodiorum)
Corticifraga peltigerae
Didymellopsis pulposi
Endococcus fusiger
Endococcus propinquus
Endococcus rugulosus (also listed as Endococcus perpusillus which is a synonym of E. rugulosus)
Epibryon conductrix
Geltingia associata
Heterocephalacria bachmannii (listed as Syzygospora bachmannii)
Homostegia piggotii
Intralichen christiansenii
Lasiosphaeriopsis christiansenii
Lasiosphaeriopsis stereocaulicola
Lichenochora lepidiotae (listed as Sphaerulina lepidiotae)
Lichenodiplis lecanorae
Lichenopeltella cetrariicola
Lichenopeltella cladoniarum
Lichenosticta alcicornaria
Merismatium nigritellum
Muellerella erratica (listed as Muellerella pygmaea var. athallina)
Muellerella pygmaea
Muellerella pygmaea var. pygmaea
Muellerella ventosicola (listed as Muellerella pygmaea var. ventosicola)
Niesslia peltigericola (listed as Raciborskiomyces peltigericola)
Opegrapha pulvinata
Opegrapha stereocaulicola
Phaeocalicium populneum
Polycoccum amygdalariae
Polycoccum deformans
Polycoccum pulvinatum
Polycoccum trypethelioides
Polycoc |
https://en.wikipedia.org/wiki/Source%20attribution | In the field of epidemiology, source attribution refers to a category of methods with the objective of reconstructing the transmission of an infectious disease from a specific source, such as a population, individual, or location. For example, source attribution methods may be used to trace the origin of a new pathogen that recently crossed from another host species into humans, or from one geographic region to another. It may be used to determine the common source of an outbreak of a foodborne infectious disease, such as a contaminated water supply. Finally, source attribution may be used to estimate the probability that an infection was transmitted from one specific individual to another, i.e., "who infected whom".
Source attribution can play an important role in public health surveillance and management of infectious disease outbreaks. In practice, it tends to be a problem of statistical inference, because transmission events are seldom observed directly and may have occurred in the distant past. Thus, there is an unavoidable level of uncertainty when reconstructing transmission events from residual evidence, such as the spatial distribution of the disease. As a result, source attribution models often employ Bayesian methods that can accommodate substantial uncertainty in model parameters.
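The Bayesian viewpoint mentioned above can be illustrated with a toy calculation (all names and numbers below are hypothetical, not from any real outbreak): a prior over candidate sources is combined with the likelihood of the observed case characteristics under each source.

```python
# Toy Bayesian source attribution via Bayes' rule.
priors = {"water_supply": 0.3, "poultry": 0.5, "produce": 0.2}       # hypothetical
likelihoods = {"water_supply": 0.08, "poultry": 0.01, "produce": 0.02}

# Marginal probability of the observed evidence, then the posterior.
evidence = sum(priors[s] * likelihoods[s] for s in priors)
posterior = {s: priors[s] * likelihoods[s] / evidence for s in priors}
# Here the water supply, despite a lower prior, is the most probable source.
```

Real molecular source attribution models replace the scalar likelihoods with models of sequence evolution and transmission, but the inferential structure is the same.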
Molecular source attribution is a subfield of source attribution that uses the molecular characteristics of the pathogen — most often its nucleic acid genome — to reconstruct transmission events. Many infectious diseases are routinely detected or characterized through genetic sequencing, which can be faster than culturing isolates in a reference laboratory and can identify specific strains of the pathogen at substantially higher precision than laboratory assays, such as antibody-based assays or drug susceptibility tests. On the other hand, analyzing the genetic (or whole genome) sequence data requires specialized computational methods to fit models of transmission. |
https://en.wikipedia.org/wiki/From%20Zero%20to%20Infinity | From Zero to Infinity: What Makes Numbers Interesting is a book in popular mathematics and number theory by Constance Reid. It was originally published in 1955 by the Thomas Y. Crowell Company. The fourth edition was published in 1992 by the Mathematical Association of America in their MAA Spectrum series. A K Peters published a fifth "Fiftieth anniversary edition" in 2006.
Background
Reid was not herself a professional mathematician, but came from a mathematical family that included her sister Julia Robinson and brother-in-law Raphael M. Robinson. She had worked as a schoolteacher, but by the time of the publication of From Zero to Infinity she was a "housewife and free-lance writer". She became known for her many books about mathematics and mathematicians, aimed at a popular audience, of which this was the first.
Reid's interest in number theory was sparked by her sister's use of computers to discover Mersenne primes. She published an article on a closely related topic, perfect numbers, in Scientific American in 1953, and wrote this book soon afterward. Her intended title was What Makes Numbers Interesting; the title From Zero to Infinity was a change made by the publisher.
Topics
The twelve chapters of From Zero to Infinity are numbered by the ten decimal digits, e (Euler's number, approximately 2.71828), and ℵ₀ (aleph-null), the smallest infinite cardinal number. Each chapter's topic is in some way related to its chapter number, with a generally increasing level of sophistication as the book progresses:
Chapter 0 discusses the history of number systems, the development of positional notation and its need for a placeholder symbol for zero, and the much later understanding of zero as being a number itself. It discusses the special properties held by zero among all other numbers, and the concept of indeterminate forms arising from division by zero.
Chapter 1 concerns the use of numbers to count things, arithmetic, and the concepts of prime numbers and integer factorization.
The |
https://en.wikipedia.org/wiki/Simplicial%20complex%20recognition%20problem | The simplicial complex recognition problem is a computational problem in algebraic topology. Given a simplicial complex, the problem is to decide whether it is homeomorphic to another fixed simplicial complex. The problem is undecidable for complexes of dimension 5 or more.
Background
An abstract simplicial complex (ASC) is a family of sets that is closed under taking subsets (every subset of a set in the family is also a set in the family). Every abstract simplicial complex has a unique geometric realization in a Euclidean space as a geometric simplicial complex (GSC), where each set with k elements in the ASC is mapped to a (k-1)-dimensional simplex in the GSC. Thus, an ASC provides a finite representation of a geometric object. Given an ASC, one can ask several questions regarding the topology of the GSC it represents.
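The closure property defining an ASC can be checked directly; a minimal sketch (the function name and the convention of including the empty face are ours):

```python
from itertools import combinations

def is_abstract_simplicial_complex(faces):
    """Check the defining closure property of an ASC: every subset of a
    face is itself a face (the empty face is included by convention)."""
    face_set = {frozenset(f) for f in faces}
    return all(
        frozenset(sub) in face_set
        for f in face_set
        for k in range(len(f))
        for sub in combinations(f, k)
    )

# Boundary of a triangle: three vertices, three edges, and the empty face.
# Each set of k elements realizes a (k-1)-dimensional simplex.
triangle_boundary = [(), ("a",), ("b",), ("c",),
                     ("a", "b"), ("a", "c"), ("b", "c")]
```

A bare edge without its vertices, such as [("a", "b")], fails the check, since its subsets are missing from the family.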
Homeomorphism problem
The homeomorphism problem is: given two finite simplicial complexes representing smooth manifolds, decide if they are homeomorphic.
If the complexes are of dimension at most 3, then the problem is decidable. This follows from the proof of the geometrization conjecture.
For every d ≥ 4, the homeomorphism problem for d-dimensional simplicial complexes is undecidable.
The same is true if "homeomorphic" is replaced with "piecewise-linear homeomorphic".
Recognition problem
The recognition problem is a sub-problem of the homeomorphism problem, in which one simplicial complex is given as a fixed parameter. Given another simplicial complex as an input, the problem is to decide whether it is homeomorphic to the given fixed complex.
The recognition problem is decidable for the 3-dimensional sphere S³. That is, there is an algorithm that can decide whether any given simplicial complex is homeomorphic to the boundary of a 4-dimensional ball.
The recognition problem is undecidable for the d-dimensional sphere for any d ≥ 5. The proof is by reduction from the word problem for groups. From this, it can be proved that the recognition pr
https://en.wikipedia.org/wiki/Math%20on%20Trial | Math on Trial: How Numbers Get Used and Abused in the Courtroom is a book on mathematical and statistical reasoning in legal argumentation, for a popular audience. It was written by American mathematician Leila Schneps and her daughter, French mathematics educator Coralie Colmez, and published in 2013 by Basic Books.
Topics
Math on Trial consists of ten chapters, each outlining a particular mathematical fallacy, presenting a case study of a trial in which it arose, and then detailing the effects of the fallacy on the case's outcome. The cases span a wide range of years and locations, and are roughly ordered by the sophistication of the reasoning needed to resolve them. Their descriptions are based on case records, contemporary newspaper accounts, later scholarship, and in some cases interviews with the principals. In particular:
Chapter 1 involves the incorrect assumption that related events have independent probabilities of occurring, a recurring theme in several other cases presented in later chapters. It illustrates this through Sally Clark, an English mother who was convicted of murdering her two infants, both of whom died suddenly soon after their birth. The case involved the testimony of pediatrician Roy Meadow, who testified that the probability of this occurring naturally was one in 73 million, based on an incorrect calculation in which he used the assumption of independence and squared the probability of a single sudden crib death. A second fallacy, also present in the case, is the assumption that an unlikely event cannot happen, when in fact many unlikely events (such as that some particular person wins a lottery) happen routinely.
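Meadow's figure can be reproduced by the same fallacious independence calculation, squaring a claimed single-death probability of roughly 1 in 8,543 (the figure commonly reported from his testimony):

```python
# Meadow's calculation squared the claimed probability of a single sudden
# infant death in such a family, wrongly treating the two deaths as
# independent events.
p_single = 1 / 8543
p_meadow = p_single ** 2      # the fallacious "1 in 73 million"
odds_meadow = 1 / p_meadow    # 8543**2 = 72,982,849, i.e. ~73 million
```

The squaring step is only valid if the two deaths are statistically independent, which is exactly the assumption the chapter disputes: shared genetic or environmental risk factors make a second death in the same family far more likely than the product rule suggests.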
Chapter 2 concerns another case of the false assumption of independence, used in the case of People v. Collins to argue that a certain combination of physical features used to identify a suspect was so exceedingly rare that only the defendants could have matched them.
Chapter 3 involves the Joe E. Sneed murder trial, in w |
https://en.wikipedia.org/wiki/Bias%20in%20the%20introduction%20of%20variation | Bias in the introduction of variation ("arrival bias") refers to a theory in the domain of evolutionary biology that asserts biases in the introduction of heritable variation are reflected in the outcome of evolution. It is relevant to topics in molecular evolution, evo-devo, and self-organization. In the context of this theory, "introduction" ("origination") is a technical term for events that shift an allele frequency upward from zero (mutation is the genetic process that converts one allele to another, whereas introduction is the population genetic process that adds to the set of alleles in a population with non-zero frequencies).
Formal models demonstrate that when an evolutionary process depends on introduction events, mutational and developmental biases in the generation of variation may influence the course of evolution by a first come, first served effect, so that evolution reflects the arrival of the likelier, not just the survival of the fitter.
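The "first come, first served" effect can be sketched with a toy weak-mutation race (an illustrative simulation, not any specific published model): destined-to-fix mutants of each type arrive as a Poisson process at a rate proportional to the mutation rate times an approximate fixation probability, and whichever arrives first takes over.

```python
import random

def origin_fixation_race(u1, s1, u2, s2, rng):
    """Toy origin-fixation race: successful mutants of type i arrive with
    rate u_i * 2*s_i (introduction rate times ~2s fixation probability).
    Returns the type whose successful mutant arrives first."""
    t1 = rng.expovariate(u1 * 2 * s1)
    t2 = rng.expovariate(u2 * 2 * s2)
    return 1 if t1 < t2 else 2

rng = random.Random(42)
trials = 100_000
# Type 1 is mutationally favoured (10x the rate) but only half as beneficial.
wins1 = sum(origin_fixation_race(1e-8, 0.01, 1e-9, 0.02, rng) == 1
            for _ in range(trials))
frac1 = wins1 / trials  # expected ~ (10 * 0.01) / (10 * 0.01 + 1 * 0.02) = 5/6
```

Despite conferring the smaller fitness benefit, the mutationally likelier type wins about five times out of six: the arrival of the likelier, not just the survival of the fitter.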
Whereas mutational explanations for evolutionary patterns are often associated with neutral evolution, the theory of arrival biases distinctively predicts that biases in the generation of variation may shape adaptive change.
The most direct evidence for this kind of cause-effect relationship comes from laboratory studies showing that adaptive changes are systematically enriched for mutationally likely types of changes.
Retrospective analyses of natural cases of adaptation also provide support for the theory.
This theory is notable as an example of contemporary structuralist thinking, contrasting with a classical functionalist view in which the course of evolution is determined by natural selection.
History
The theory of biases in the introduction process as a cause of orientation or direction in evolution has been explained as the convergence of two threads. The first, from theoretical population genetics, is the explicit recognition by theoreticians (toward the end of the 20th century) that a correct |
https://en.wikipedia.org/wiki/Panasonic%20Senior%20Partner | The Senior Partner (stylized as the Sr. Partner) is an IBM PC-compatible portable computer that was introduced by the Panasonic Corporation in 1984. Weighing roughly in its base configuration, the computer came equipped with a cathode-ray tube display and a built-in thermal printer.
Specifications
In its stock configuration, the Senior Partner weighs and measures . Its monochrome, green-phosphor cathode-ray tube display measures nine inches diagonally and supports the CGA video mode for IBM PCs and compatibles, displaying text at up to 80 columns by 25 rows and graphics up to 640 by 200 pixels. The Senior Partner runs an Intel 8088 microprocessor clocked at the IBM-PC-standard 4.77 MHz. A slot for an aftermarket 8087 floating-point co-processor is included on the motherboard. The computer's base configuration is equipped with 128 KB of RAM, expandable to 256 KB via a proprietary plug-in expansion board. At the rear of the system unit are an RS-232 serial port, a Centronics-style parallel port (in a deviation from the IBM-PC-standard DB-25 parallel connector), and an RGBI port.
Panasonic offered three models of the Senior Partner: one with one 5.25-inch floppy drive; another with two such drives; and the last with one 5.25-inch floppy drive and one 10 MB hard drive. Panasonic dubbed the lattermost model the Super Senior Partner. The company supplied all units with MS-DOS 2.11, as well as a bundle of application software including GW-BASIC, WordStar, VisiCalc, pfs:File, pfs:Graph, and pfs:Report.
The Senior Partner features a built-in thermal printer capable of operating at up to 55 cps. The printer can feed out up to 80 inches of paper before jamming due to lacking a tractor-feed mechanism. It can print up to 132 columns of text per row.
Development and release
Panasonic announced the Senior Partner in November 1983 and began delivering units to customers in March 1984. The hard drive–based Super Senior Partner was unveiled in May 1984, to be available in Augus |
https://en.wikipedia.org/wiki/Panasonic%20Executive%20Partner | The Executive Partner (stylized as the Exec. Partner; model number FT-70) is an IBM PC-compatible portable computer that was introduced by the Panasonic Corporation in 1985. The portable computer is AC-powered exclusively, weighs between 28 and , and features a built-in printer. The Executive Partner was one of the first affordable portable computers with a plasma display.
Specifications
The Executive Partner is a portable computer in a clamshell form factor that measures . Depending on the configuration, the computer weighs between 28 and . Two models of the computer were released: one with dual 5.25-inch floppy drives and the other with one such floppy drive and a hard drive. All models in the Executive Partner range feature an Intel 8086-2 microprocessor running at a user-switchable clock speed of 7.16 MHz or 4.77 MHz, the latter being the standard clock speed of the original IBM PC. The stock Executive Partner comes with 256 KB of RAM, expandable to 640 KB. Users upgrading to 640 KB must replace all the RAM at once, because the 64-kilobit chips of the 128 KB RAM do not interoperate with the 256-kilobit chips of the 256 KB RAM. The computer has one ISA expansion slot, supporting only certain cards up to 6 inches in length. An expansion box offering slots for three full-length (13 in) cards was offered as an optional accessory.
The portable features a flat-panel gas plasma display measuring 11 in diagonally, with the display housing holding the plasma panel only 2 in thick. Because of the heavy current draw of the plasma display, the Executive Partner is powered through mains AC exclusively. The plasma display produces a neon-orange image that was said to exhibit less glare than contemporary cathode-ray tubes and LCDs. A special hinge mechanism prevents the display housing from slamming into the keyboard half of the chassis and potentially breaking the fragile glass layers of the plasma display. The graphics adapter supports CGA video; i
https://en.wikipedia.org/wiki/Sensory%20trap%20hypothesis | The sensory trap hypothesis describes an evolutionary idea that revolves around mating behavior and female mate choice. It is a model of female preference and male sexual trait evolution through what is known as sensory exploitation. Sensory exploitation, or a sensory trap is an event that occurs in nature where male members of a species perform behaviors or display visual traits that resemble a non-sexual stimulus which females are responsive to. This tricks females into engaging with the males, thus creating more mating opportunities for males. What makes it a sensory trap is that these female responses evolved in a non-sexual context, and the male produced stimulus exploits the female response which would not otherwise occur without the mimicked stimulus.
Limitations
The term "trap" indicates that these sensory trap events may be detrimental to female mating success, but they may not always be costly. In fact, there are circumstances where not responding to the stimulus itself can be costly, as females may ignore the actual stimulus in the correct context, and lose the fitness benefits that come with it. There are also circumstances where these traps can actually be beneficial in the context of mate choice, where the females who are responding to the trap end up gaining high-quality males to mate with.
While these sensory traps can be quite successful when they appropriately mimic the non-sexual stimulus, they often become exaggerated as a result of excessive selection to the point where they are no longer useful. This is due to the trait or behavior becoming imperceptible or no longer resembling the original stimulus.
Sensory traps in nature
In the firefly genera Photinus, Photuris, and Pyractomena, males flying above females use patterned light flashes that mimic the females' prey species; this evokes a female response, including the females' own pattern of flashing lights, which the males use to locate them for mating.
Neumania papillator males engage in leg trembli |
https://en.wikipedia.org/wiki/D%C3%A9borah%20Oliveros | Déborah Oliveros Braniff is a Mexican mathematician whose research interests include discrete geometry, combinatorics, and convex geometry, including the geometry of bodies of constant width and related topics.
Education and career
After earning an undergraduate degree in mathematics from the National Autonomous University of Mexico (UNAM) in 1992, and earning a master's degree in 1994 under the mentorship of Mónica Clapp, Oliveros continued at UNAM for graduate study in mathematics, with doctoral research on an unsolved question of Stanislaw Ulam concerning the buoyancy of floating convex bodies. Her 1997 dissertation on the topic, Los volantines: sistemas dinamicos asociados al problema de la flotacion de los cuerpos, was jointly supervised by Luis Montejano and Javier Bracho.
She became a professor at UNAM in 1996, but left in 1999 for postdoctoral research at the University of Calgary in Canada. She was a professor there from 2001 to 2005, when she returned to a professorship at UNAM. She became one of the founders of the branch of the UNAM Institute of Mathematics at the UNAM Juriquilla campus, and directed it for 2015–2016. She also holds an affiliation with the Faculty of Engineering of the Autonomous University of Querétaro.
Book
Oliveros is a coauthor with Horst Martini and Luis Montejano of the book Bodies of Constant Width: An Introduction to Convex Geometry with Applications (Birkhäuser, 2019).
Recognition
UNAM gave Oliveros the "Reconocimiento Sor Juana Inés de la Cruz" award in 2014. She is a member of the Mexican Academy of Sciences.
References
External links
Home page
Year of birth missing (living people)
Living people
Mexican mathematicians
Mexican women mathematicians
Combinatorialists
Geometers
National Autonomous University of Mexico alumni
Academic staff of the University of Calgary
Academic staff of the National Autonomous University of Mexico
Members of the Mexican Academy of Sciences |
https://en.wikipedia.org/wiki/Trusted%20Information%20Security%20Assessment%20Exchange | Trusted Information Security Assessment Exchange (TISAX) is an assessment and exchange mechanism for the information security of enterprises, developed by the ENX Association and published by the Verband der Automobilindustrie (German Association of the Automotive Industry or VDA). TISAX concerns the secure processing of information from business partners, the protection of prototypes and data protection in accordance with the General Data Protection Regulation (GDPR) for potential business transactions between automobile manufacturers and their service providers or suppliers. The VDA established TISAX in 2017 together with the ENX Association.
Assessments according to TISAX, especially of service providers and suppliers, are carried out by "TISAX test service providers". The ENX Association acts as the governance organization in the system: it approves the testing service providers and monitors the quality of execution and of the assessment results. This is to ensure both that the final results meet the desired quality and objectivity, and that the rights and obligations of the participants are safeguarded. A purchasing company can then decide whether the resulting maturity level of a supplier (service provider or supplier) meets its requirements.
The testing requirements have been revised several times; version 5.0 was published in October 2020. Backgrounds, areas of application, execution processes and testing requirements are summarized in a manual.
References
Information sensitivity
Automotive industry in Europe
Data security |
https://en.wikipedia.org/wiki/L%27%C3%AEle%20du%20Gouvernement | L'île du Gouvernement is an island in the St. Brandon archipelago. The island is uninhabited, and mostly functions as a bird and turtle sanctuary.
References
Islands of St. Brandon
Mascarene Islands
Outer Islands of Mauritius
Reefs of the Indian Ocean
Fishing areas of the Indian Ocean
Insular ecology
Important Bird Areas of Mauritius
Flora of Mauritius
Atolls of the Indian Ocean
Biodiversity
Conchology |
https://en.wikipedia.org/wiki/Avocar%C3%A9%20Island | Avocaré Island (Avoquer, Avocaire, L'Avocaire) is an island located in the St Brandon archipelago.
In the World Bank Report creating the marine protected area of St. Brandon, Avocaré Island was classified as a Group 3 Island together with Île Raphael, L'Île Coco and L'île du Sud.
In 1846, Avocaré Island was visited by British naval officer Edward Belcher aboard HMS Samarang, who confirmed that it was then a principal fishing station with fishermen catching 102 kg of fish per day. Avocaré Island is today an uninhabited bird and turtle sanctuary. Access to the island is restricted to prevent the introduction of invasive alien species.
See also
Mascarene Islands
St Brandon
Mauritius
Île Raphael
L'île du Sud
L'île du Gouvernement
L'Île Coco
References
Islands of St. Brandon
Outer Islands of Mauritius
Reefs of the Indian Ocean
Fishing areas of the Indian Ocean
Important Bird Areas of Mauritius
Atolls of the Indian Ocean |
https://en.wikipedia.org/wiki/Column%20groups%20and%20row%20groups | In tables and matrices, a column group or row group usually refers to a subset of columns or rows, respectively. Short names or notational names include col group or colgroup, and row group or rowgroup. They can have varying uses depending on context:
In mathematics, a partitioned matrix is an interpretation of a matrix as being broken down into submatrices which may be more precisely referred to as a collection of row groups and column groups
In web development, colgroup is a standard HTML element, used for example to apply color formatting to entire columns in HTML tables. The colgroup tag acts as a "parent container of one or more <col> elements". Rowgroup exists as a corresponding ARIA role for groups of table rows, carried implicitly by the <thead>, <tbody>, and <tfoot> elements.
In reporting (including business reporting, data reporting and financial reporting), colgroups and rowgroups can be used for constructing tables and matrices that dynamically adjust the size of their columns and rows, respectively, by displaying the set of columns in the colgroup set (which again is a subset of the underlying data).
In reporting, colgroups and rowgroups can also be used for grouping of collapsible categories in the presentation of a table (with or without aggregation for the groups). One example use case is a table that contains a lot of detailed information, where summarizing information for groups should be displayed in the same table.
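The colgroup element described above can be sketched in a short fragment (table content is hypothetical); each <col> styles one column, and the span attribute lets a single <col> cover several:

```html
<!-- Minimal sketch: shade the first column of a table via <colgroup>/<col>. -->
<table>
  <colgroup>
    <col style="background-color: #f0f0f0"> <!-- first column -->
    <col span="2">                          <!-- next two columns, unstyled -->
  </colgroup>
  <thead>
    <tr><th>Item</th><th>Qty</th><th>Price</th></tr>
  </thead>
  <tbody>
    <tr><td>Widget</td><td>4</td><td>1.50</td></tr>
  </tbody>
</table>
```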
See also
Column (database)
Group by (SQL)
Row (database)
Row and column vectors in mathematics
Row and column spaces in mathematics
Table (database)
References
Matrices
Data modeling |
https://en.wikipedia.org/wiki/Account%20sharing | Account sharing, also known as credential sharing, is the process of sharing login information with multiple users to access online accounts or services. This can include sharing information like e-mail addresses, usernames and passwords for social media accounts, subscription services, gaming platforms or other online services.
Reasons for account sharing
Account sharing is a common practice, especially among younger users who may not have the financial resources to pay for multiple accounts or subscriptions. It is also commonly used among families or groups of friends who want to access a shared account or service. Another reason may be to gain access to special features tied to a single account, such as special items in a game account.
People may also share passwords with their significant others as a symbol of affection or absolute trust.
Reasons against account sharing
Account sharing is prohibited by the terms of service of many online accounts, such as Google and Facebook. It can result in account suspension or even termination, possibly causing users to lose access to important data or services. From the provider's perspective, account sharing also means lost revenue for the service whose account is being shared.
Account sharing can also make it easier for hackers to gain access to multiple accounts using a single set of login credentials. This can lead to sensitive data being compromised or accounts being taken over by unauthorized users. The person the information is shared with could handle the login data carelessly or insecurely, or could steal the information for other purposes as part of a social engineering strategy.
Prevention
Some services offer combined accounts at a lower price to discourage account sharing, such as family plans (Family Sharing) with special child-safety options, or software licenses covering multiple company accounts. Another measure that makes account sharing more difficult is multi-factor authentication, which often requires the sharer to i
https://en.wikipedia.org/wiki/Manuel%20Saez%20%28industrial%20designer%29 | Manuel Saez (born 1973, Tucumán, Argentina) is an industrial designer. He graduated with honors from the University of Bridgeport. His e-scooter company Beyond partnered with the Metropolitan Transportation Authority during the COVID-19 pandemic. In 2016, US President Barack Obama received a CMYK 4.0 model bicycle designed by Saez as the presidential gift from Argentina's president Mauricio Macri. Saez has designed bicycles, electric scooters, smart helmets, furniture, and consumer goods.
References
American industrial designers
Argentine industrial designers
Living people
1973 births |
https://en.wikipedia.org/wiki/Prigogine%27s%20theorem | Prigogine's theorem is a theorem of non-equilibrium thermodynamics, originally formulated by Ilya Prigogine.
According to Prigogine's theorem, the stationary state of a linear non-equilibrium system (under conditions that prevent the achievement of an equilibrium state) corresponds to the minimum entropy production. If there are no such obstacles, then the production of entropy reaches its absolute minimum, zero. A linear system is one in which linear phenomenological relationships hold between thermodynamic flows and driving forces. The coefficients of proportionality in these relationships are called phenomenological coefficients.
The theorem was proved by Prigogine in 1947 from the Onsager relations. Prigogine's theorem is valid if the kinetic coefficients in the Onsager relations are constant (do not depend on driving forces and flows); for real systems, it is valid only approximately, so the minimum entropy production for a stationary state is not such a general principle as the maximum entropy for an equilibrium state. It has been experimentally established that Onsager's linear relations are valid in a fairly wide range of parameters for heat conduction and diffusion processes (for example, Fourier's law, Fick's law). For chemical reactions, the linear assumption is valid in a narrow region near the state of chemical equilibrium. The principle is also violated for systems odd with respect to time reversal.
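The content of the theorem can be illustrated with the standard two-force linear system (a textbook-style sketch, not Prigogine's original 1947 proof):

```latex
% Two flows J_i coupled linearly to two forces X_k, with Onsager
% reciprocity L_{12} = L_{21}:
J_1 = L_{11} X_1 + L_{12} X_2, \qquad J_2 = L_{21} X_1 + L_{22} X_2 .
% The entropy production is the bilinear form
\sigma = J_1 X_1 + J_2 X_2
       = L_{11} X_1^2 + 2 L_{12} X_1 X_2 + L_{22} X_2^2 \geq 0 .
% Hold X_1 fixed (the constraint preventing equilibrium) and let X_2 float;
% minimising \sigma over X_2 gives
\frac{\partial \sigma}{\partial X_2}
    = 2 \left( L_{21} X_1 + L_{22} X_2 \right) = 2 J_2 = 0 ,
% so the minimum of entropy production coincides with the stationary
% state in which the unconstrained flow J_2 vanishes.
```

The argument relies on the coefficients L_ik being constant and symmetric, which is exactly why the principle fails outside the linear regime.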
References
External links
1977 Nobel Prize lecture by Ilya Prigogine
Attribution note: early versions of this article were translated from the Russian-language Wikipedia article on this topic.
Theorems
Thermodynamics |
https://en.wikipedia.org/wiki/Antilimit | In mathematics, the antilimit is the equivalent of a limit for a divergent series. The concept is not necessarily unique or well defined, but the general idea is to find a closed-form formula for a series and then evaluate it outside its radius of convergence.
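As a concrete illustration (this example is ours, not from the source): the geometric series 1 + x + x² + … has closed form 1/(1 − x) inside |x| < 1; evaluating that formula at x = 2, outside the radius of convergence, assigns the divergent series 1 + 2 + 4 + … the antilimit −1.

```python
from fractions import Fraction

def geometric_partial_sum(x, n):
    """Partial sum 1 + x + x^2 + ... + x^(n-1)."""
    return sum(Fraction(x) ** k for k in range(n))

def geometric_closed_form(x):
    """The formula 1/(1 - x), used here even where the series diverges."""
    return Fraction(1, 1) / (1 - Fraction(x))

# At x = 2 the partial sums blow up: 1, 3, 7, 15, 31, ...
# yet the closed form still evaluates, giving the antilimit:
print(geometric_closed_form(2))  # -1
```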
Common divergent series
See also
Abel summation
Cesàro summation
Lindelöf summation
Euler summation
Borel summation
Mittag-Leffler summation
Lambert summation
Euler–Boole summation and Van Wijngaarden transformation can also be used on divergent series
References
Divergent series
Summability methods
Sequences and series
Mathematical analysis |
https://en.wikipedia.org/wiki/Cocaine%20Bear%20%28bear%29 | The Cocaine Bear, also known as Pablo Eskobear (sometimes spelled Escobear) or Cokey the Bear, was a 175-pound (79-kilogram) American black bear that fatally overdosed on cocaine in 1985. The cocaine had been dropped by drug smugglers in the wilderness in Tennessee, United States. The bear was found dead in northern Georgia and was stuffed and displayed at a mall in Kentucky. It inspired the 2023 horror comedy thriller film Cocaine Bear, as well as the 2023 documentary film Cocaine Bear: The True Story.
History
On September 11, 1985, former Lexington police department narcotics officer turned drug smuggler Andrew C. Thornton II was trafficking cocaine from Colombia into the United States. After dropping off a shipment in Blairsville, Georgia, Thornton and an accomplice, Bill Leonard, departed in a self-piloted Cessna 404 Titan. En route, the duo dropped a load of 40 plastic containers of cocaine into the wilderness before abandoning the plane above Knoxville, Tennessee. Allegedly, Thornton was killed when his parachute failed to open. According to the FBI, Thornton dumped his cargo because the load of two men, in addition to the cocaine, was too heavy for the plane to carry.
On December 23, the Georgia Bureau of Investigation reported finding a dead black bear that had eaten a large amount of the cocaine from the jettisoned containers. The containers had held about 75 pounds (34 kilograms) of cocaine, valued at $20 million, and by the time the scene was studied by government authorities, all of the containers had been ripped open, with their contents scattered. The chief medical examiner from the Georgia State Crime Lab, Dr. Kenneth Alonso, stated that its stomach was "literally packed to the brim with cocaine", although he estimated the bear had absorbed only 3 to 4 grams into its bloodstream at the time of its death.
Dr. Alonso did not want to waste the body of the bear, so he had it taxidermied and gave it to the Chattahoochee R |
https://en.wikipedia.org/wiki/Gurzadyan%20theorem | In cosmology, the Gurzadyan theorem, proved by Vahe Gurzadyan, states the most general functional form for the force satisfying the condition of identity of the gravity of a sphere and of a point mass located at the sphere's center. The theorem thus refers to the first statement of Isaac Newton's shell theorem (the identity mentioned above) but not the second one, namely, the absence of gravitational force inside a shell.
The theorem has appeared, for example, on a physics reference website, and its importance for cosmology has been outlined in several papers as well as in the article on the shell theorem.
The formula and the cosmological constant
The formula for the force derived by Gurzadyan has the form
F(r) = −A/r² + B·r,
where A and B are constants. The first term is the familiar law of universal gravitation, while the second corresponds to the cosmological constant term in general relativity and McCrea–Milne cosmology.
The field is then force-free only at the center of a shell, but the confinement (oscillator) term does not change the initial symmetry of the Newtonian field. This field also corresponds to the only field possessing the defining property of the Newtonian one: the closing of orbits at any negative value of energy, i.e. the coincidence of the period of variation of the radius vector with that of its revolution by 2π (resonance principle).
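The identification of the second term with the cosmological constant can be sketched as follows (a hedged reconstruction; the coefficient Λc²/3 is the conventional general-relativistic weak-field form, assumed here rather than quoted from the source).

```latex
% General force law satisfying the sphere/point-mass identity:
% an inverse-square term plus a linear (oscillator) term.
F(r) = -\frac{A}{r^2} + B\,r
% With A = G M m and the linear term matched to the weak-field
% limit of general relativity with cosmological constant \Lambda:
F(r) = -\frac{G M m}{r^2} + \frac{\Lambda c^2}{3}\, m\, r
```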
Consequences: cosmological constant as a physical constant
Einstein called the cosmological constant a universal constant, introducing it to define his static cosmological model.
From this theorem the cosmological constant emerges as an additional constant of gravity along with Newton's gravitational constant. The cosmological constant is then dimension-independent and matter-uncoupled, and hence can be considered even more universal than Newton's gravitational constant.
Joining the set of fundamental constants (Newton's gravitational constant, the speed of light, and the Planck constant) yields
and a dimensionless quantity emerges for the 4-consta |
https://en.wikipedia.org/wiki/Silvilization | Silvilization is a conceptual framework or a vision of the world whereby the forest, a metaphor for primordial living, is the best place for human development and fulfilment. It is a portmanteau of the Latin word silva, meaning forest, and civilization.
History
The term was first coined by Pierre-Doris Maltais, leader of the Iriadamant eco-cult. Erkki Pulliainen, an MP of the Green League, in collaboration with Maltais and the University of Helsinki, implemented the interdisciplinary ESSOC project (“Ecological Sylvilisation and Survival with the Aid of Original Cultures”) in 1991. The project was considered a failure.
In 1997, a publication in the journal Interculture by the Intercultural Institute of Montreal was devoted entirely to the theme of silvilization and ecosophy. The articles were written by authors such as Edward Goldsmith, Gary Snyder, and Gita Mehta.
References
Sociology
Ecology |
https://en.wikipedia.org/wiki/Sasikanth%20Manipatruni | Sasikanth Manipatruni is an American engineer and inventor in the fields of computer engineering, integrated circuit technology, materials engineering and semiconductor device fabrication. Manipatruni has contributed to developments in silicon photonics, spintronics and quantum materials.
Manipatruni is a co-author of 50 research papers and roughly 400 patents (cited about 7,500 times) in the areas of electro-optic modulators, cavity optomechanics, nanophotonics and optical interconnects, spintronics, and new logic devices for the extension of Moore's law. His work has appeared in Nature, Nature Physics, Nature Communications, Science Advances and Physical Review Letters.
Early life and education
Manipatruni received a bachelor's degree in Electrical Engineering and Physics from IIT Delhi in 2005 where he graduated with the institute silver medal. He also completed research under the Kishore Vaigyanik Protsahan Yojana at Indian Institute of Science working at Inter-University Centre for Astronomy and Astrophysics and in optimal control at Swiss Federal Institute of Technology at Zurich.
Research career
Manipatruni received his Ph.D. in Electrical Engineering with a minor in applied engineering physics from Cornell University. The title of his thesis was "Scaling silicon nanophotonic interconnects: silicon electrooptic modulators, slowlight & optomechanical devices". His thesis advisors were Michal Lipson and Alexander Gaeta at Cornell University. He has co-authored academic research with Michal Lipson, Alexander Gaeta, Keren Bergman, Ramamoorthy Ramesh, Lane W. Martin, Naresh Shanbhag, Jian-Ping Wang, Paul McEuen, Christopher J. Hardy, Felix Casanova, Ehsan Afshari, Alyssa Apsel, Jacob T. Robinson, and Manuel Bibes, spanning condensed matter physics, electronics and devices, photonics, circuit theory, computer architecture, and hardware for artificial intelligence.
Silicon optical links
Manipatruni's PhD thesis was focused on developing the then nascent field of silicon |
https://en.wikipedia.org/wiki/Impacts%20of%20California%20High-Speed%20Rail | In addition to the direct reduction in travel times the HSR project will produce, there are also economic and environmental impacts of the high-speed rail system. These were also specifically noted in Proposition 1A at the time the project sought authorization from the voters of the state in 2008. The anticipated benefits apply both generally to the state overall, as well as to the regions the train will pass through, and to the areas immediately around the train stations.
Estimates of current & past impacts
Latest 2022 & 2023 Impact Information, Overall and by Region
On March 17, 2023 the Authority released its latest 2022 and 2023 impact report.
Job training: The Central Valley Training Center
The Central Valley Training Center (located in Selma, California) is an organization supported by the Authority and local non-profit and governmental organizations. Since 2020 it has provided hands-on, free, 12-week pre-apprenticeship programs in 11 trades to prepare Central Valley veterans, at-risk young adults, minority, and low-income populations for construction jobs on the CAHSR project. As of December 2022 it has graduated 7 cohorts, totaling over 100 students, and further assisted them by providing job placement as well as other support services.
Annual Sustainability Reports
CAHSR is designed to be an entirely environmentally sustainable system. Each year since 2018 the Authority has produced a Sustainability Report. Highlights of the 2022 report are:
"Restoring more than 2,972 acres of habitat and protecting more than 3,190 acres of agricultural land;
Planting more than 7,100 trees;
Avoiding or sequestering 420,245 metric tons of carbon dioxide – the equivalent of removing one natural gas-fired power plant from the grid for a year;
Increasing small business participation to over 700 entities;
Generating between $12.7 and 13.7 billion in total economic activity in the state, with 56% investment in disadvantaged communities."
Cumulative economic impact estimat |
https://en.wikipedia.org/wiki/Paul%20Portier%20%28physiologist%29 | Paul Jules Portier (22 May 1866 – 26 January 1962) was a French physiologist who made important contributions to the discovery of anaphylaxis and the development of symbiogenesis. On a scientific expedition organised by Albert I, Prince of Monaco, he and Charles Richet discovered that toxins produced by marine animals (cnidarians such as the Portuguese man o' war and sea anemones) could induce fatal shock. They named the medical phenomenon "anaphylaxis", for which Richet went on to receive the 1913 Nobel Prize in Physiology or Medicine. In 1918, Portier became the first scientist to propose that the cell organelle known as the mitochondrion arose by symbiosis.
Biography
Portier was born in Bar-sur-Seine, France, to Ernest Paul and Julie Moreau Laure. He studied elementary education at the Lycée de Troyes from 1878 to 1885. After passing the final secondary examination (called bac) from the Saint-Sigisbert in Nancy, he qualified for service in the Ministry of Finance in 1888. However, he chose to study biology, following his childhood dream. In 1889, he entered the University of Paris from where he earned an M.D. in 1897 and Doctor of Science (docteur ès sciences) degree in 1912. He continued to work in the university as an assistant physician.
In 1906, Albert I, Prince of Monaco founded the Institute of Oceanography (Institut océanographique de Paris), and Portier was appointed its professor. When the institute was inaugurated in 1911, Portier became its first director. In 1920, he was appointed professor of comparative physiology at the University of Paris. In 1923, the University of Paris created a chair of physiology, which he held for the rest of his career. He retired in 1936, and the university awarded him the position of honorary professor. He played active roles in the administrations of the French Academy of Sciences and the French Academy of Medicine. He published his last book, The Biology of Butterflies, in 1949.
Portier m |
https://en.wikipedia.org/wiki/Eugenia%20O%27Reilly-Regueiro | Eugenia O'Reilly-Regueiro is a Mexican mathematician specializing in algebraic combinatorics, and particularly in the symmetries of combinatorial designs, circulant graphs, and abstract polytopes. She is a researcher in the Institute of Mathematics of the National Autonomous University of Mexico (UNAM).
Education and career
O'Reilly-Regueiro is originally from Mexico City. She was a mathematics student at UNAM, graduating in 1995. For the next two years she continued to work at UNAM as an assistant in the mathematics department of the Faculty of Chemistry, while studying harpsichord at UNAM with musician Luisa Durón.
Next, with a scholarship from the UNAM Dirección General de Asuntos del Personal Académico (DGAPA), she traveled to England for graduate study at Imperial College London, at that time part of the University of London system. She completed her PhD in 2003. Her dissertation, Flag-Transitive Symmetric Designs, was supervised by Martin Liebeck.
On completing her doctorate, she returned to UNAM as a researcher for the Institute of Mathematics.
Recognition
O'Reilly-Regueiro was elected to the Mexican Academy of Sciences in 2022.
References
External links
Home page
Year of birth missing (living people)
Living people
Mexican mathematicians
Mexican women mathematicians
Combinatorialists
Members of the Mexican Academy of Sciences |
https://en.wikipedia.org/wiki/Logic%20and%20Sexual%20Morality | Logic and Sexual Morality is a 1965 book by John Boyd Wilson in which the author provides a critique of philosophical arguments about sex.
Reception
The book was reviewed by John C. Hall and David Sladen.
References
External links
Logic and Sexual Morality
1965 non-fiction books
Sexual ethics books
Penguin Books books
Logic books
English-language books |
https://en.wikipedia.org/wiki/Motus%20%28wildlife%20tracking%20network%29 | Motus (Latin for movement) is a network of radio receivers for tracking signals from transmitters attached to wild animals. Motus uses radio telemetry for real-time tracking. It was launched by Birds Canada in 2014 in the US and Canada.
By 2022, more than 1,500 receiver stations had been installed in 34 countries; most receivers are concentrated in the United States and Canada, where the network began. The Motus network has spread rapidly because it provides key data useful to researchers and conservationists, both nationally and internationally.
The great advantage of Motus transmitters is their small size and weight: they can weigh from about 0.2 grams to about 2.6 grams, and can therefore be attached to almost any animal, even small ones such as insects, for example a bee or a butterfly.
Once a researcher or organization receives state and federal permits, they only need to acquire the appropriate transmitters and attach them to their study subjects. The range of current transmitters (depending on size) is up to 12 miles (20 kilometers).
The long-used geolocators and GPS loggers are light and small but only store the desired data; they cannot transmit it wirelessly. This means that researchers must recapture the tagged animal to read the stored information, which can take a long time and often does not succeed.
Depending on the animal to be tracked, the transmitter is attached in a suitable way, either with a thread or an adhesive. After a certain time the glue or thread dissolves and the transmitter falls off, having in the meantime transmitted all its data to the receivers it passed.
References
External links
Schematic view of the motus system.
Picture of a swallow fitted with a motus transmitter.
Radio technology |
https://en.wikipedia.org/wiki/Quiet%20and%20loud%20aliens | The concept of quiet and loud aliens is used in the modelling of hypotheses for the prevalence of extraterrestrial intelligence, particularly in the context of the Fermi Paradox. Hypothetical "loud" aliens expand their sphere of influence rapidly in a highly detectable way; hypothetical "quiet" aliens are hard or impossible to detect. A special case of loud alien civilizations are "grabby aliens" who also inhibit the development of other technological civilizations in their sphere of influence.
See also
Anthropic principle
Dark forest hypothesis
Search for extraterrestrial intelligence
References
Astrobiology
Extraterrestrial life
Fermi paradox
Search for extraterrestrial intelligence |
https://en.wikipedia.org/wiki/Yield%20%28hydrology%29 | The term yield is used to describe the volume of water escaping from a spring over a certain period of time; the discharge quantity is measured in litres per second (l/s). Measurement methods include volume–filling-time measurement and water level measurement.
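The volume–filling-time method can be sketched as follows (a minimal illustration; the function name and figures are ours): the yield is simply the collected volume divided by the filling time.

```python
def spring_yield_l_per_s(volume_litres: float, fill_time_seconds: float) -> float:
    """Discharge Q = V / t, expressed in litres per second."""
    if fill_time_seconds <= 0:
        raise ValueError("fill time must be positive")
    return volume_litres / fill_time_seconds

# A 10-litre vessel that fills in 25 seconds gives a yield of 0.4 l/s:
print(spring_yield_l_per_s(10, 25))  # 0.4
```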
The discharge of a spring can fluctuate to a greater or lesser extent depending on precipitation and evaporation. Karst springs show particularly large time-dependent differences in the discharge.
References
Bibliography
Murawski, Hans and Wilhelm Meyer (2010). Geologisches Wörterbuch. 12th edn. Heidelberg: Spectrum.
Limnology
Hydrogeology |
https://en.wikipedia.org/wiki/Indigenous%20data%20governance | Data governance in the context of Indigenous data involves supporting the data interests, gaps and priorities of Indigenous peoples, in order to enable Indigenous self-determination. Generally, data governance refers to who has ownership, control and access over the use of data. Indigenous data governance requires the data to surround Indigenous peoples and its purpose to reflect Indigenous needs and priorities, rather than omitting Indigenous peoples in the production of Indigenous data.
Overview
Indigenous data governance is key to enabling Indigenous self-determination and rebuilding strong Indigenous nations. Oftentimes, Indigenous peoples do not have access to relevant Indigenous data. Currently in Canada, much information on Indigenous peoples is considered government data that falls under Crown copyright, limiting access to relevant data, such as data on archaeological sites that are of significance to Indigenous nations. Thus, Indigenous data that lacks strong data governance often misrepresents Indigenous peoples, helps inform policies that have discriminatory impacts on Indigenous peoples, and upholds colonial practices.
Definition of Indigenous data
Indigenous data can include knowledge and information from censuses, health records and other administrative data about Indigenous peoples; information on the environment, non-humans and resources; and information on cultural heritage such as oral histories, clan knowledge and cultural sites. Indigenous data can be produced by Indigenous people, governments, other institutions, and corporations. In terms of rebuilding Indigenous nations, Indigenous data can be useful for tribal governments when making decisions about their resources and communities.
Indigenous data sovereignty
Companies and states often have the power in deciding what kind of data is produced and for what purposes. Data sovereignty in the context of Indigenous data is about ensuring that Indigenous people have a say in the data that is produced about them, how this d |
https://en.wikipedia.org/wiki/Jacobi%20bound%20problem | The Jacobi Bound Problem concerns the veracity of Jacobi's inequality which is an inequality on the absolute dimension of a differential algebraic variety in terms of its defining equations.
The inequality is the differential algebraic analog of Bezout's theorem in affine space.
Although first formulated by Jacobi, the problem was recognized as non-rigorous by Joseph Ritt in 1936, since Jacobi did not have a rigorous notion of absolute dimension (Jacobi and Ritt used the term "order", for which Ritt first gave a rigorous definition using the notion of transcendence degree).
Intuitively, the absolute dimension is the number of constants of integration required to specify a solution of a system of ordinary differential equations.
A mathematical proof of the inequality has been open since 1936.
Statement
Let K be a differential field of characteristic zero, and consider a differential algebraic variety determined by the vanishing of differential polynomials f_1, …, f_n in the differential indeterminates x_1, …, x_n.
If V is an irreducible component of this variety of finite absolute dimension, then its absolute dimension is at most the Jacobi number J of the system.
The Jacobi number is defined to be
J = max over permutations σ of {1, …, n} of Σ_i ord_{x_σ(i)}(f_i),
where ord_{x_j}(f_i) denotes the order of f_i in the variable x_j.
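Assuming the standard definition of the Jacobi number as the maximum, over permutations σ, of Σ_i a_{i,σ(i)}, where a_{ij} is the order of the i-th polynomial in the j-th variable (the order matrix below is hypothetical), the quantity is an assignment problem and can be brute-forced for small systems:

```python
from itertools import permutations

def jacobi_number(orders):
    """Maximum over permutations s of sum_i orders[i][s[i]].

    orders[i][j] is the order of the i-th differential polynomial
    in the j-th differential indeterminate.
    """
    n = len(orders)
    return max(sum(orders[i][s[i]] for i in range(n))
               for s in permutations(range(n)))

# Hypothetical 2x2 order matrix: the identity permutation gives 2 + 3 = 5,
# the swap gives 1 + 0 = 1, so the Jacobi number is 5.
print(jacobi_number([[2, 1], [0, 3]]))  # 5
```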
References
Unsolved problems in mathematics
Differential algebra |
https://en.wikipedia.org/wiki/Crash%20Team%20Rumble | Crash Team Rumble is an online multiplayer video game developed by Toys for Bob and published by Activision. It was released on June 20, 2023, for PlayStation 4, PlayStation 5, Xbox One and Xbox Series X/S. The game is the third of the party genre released in the Crash Bandicoot series, and features several members of its cast as playable characters. The gameplay pits two teams of players against each other as they stockpile Wumpa Fruit while impeding the opposing team's efforts.
Gameplay
Crash Team Rumble is an online multiplayer "strategic platformer" with a competitive four-versus-four format. A number of Crash Bandicoot characters are playable, including Crash, Cortex, Coco, Dingodile, Brio, the female counterpart of N. Tropy and Tawna. Teams must capture more Wumpa Fruit than the other team to claim victory. In addition to depositing their own Wumpa Fruit at a drop-off zone, players must also defend their opponent's drop-off zone to prevent them from depositing their own supply. Each character has unique skills and abilities with which to battle the opposing team. The character roster is divided into three roles: "Blocker", "Booster", and "Scorer". The game features cross-platform play.
Marketing and release
On October 7, 2022, Activision delivered a package to influencers consisting of a pizza box with an attached receipt announcing the release of Crash Bandicoot 4: It's About Time via Steam on October 18. A message at the bottom of the receipt teased the announcement of a new Crash Bandicoot title on December 8, the date of The Game Awards 2022. At the awards ceremony, Crash Team Rumble was announced via a debut trailer, with a projected 2023 release for the PlayStation 4, PlayStation 5, Xbox One and Xbox Series X/S. It is the latest Crash Bandicoot title to be developed by Toys for Bob after Crash Bandicoot 4: It's About Time.
On March 21, 2023, Activision announced a closed beta release from April 20 to 24 for those who pre-ordered the game, with a ful |
https://en.wikipedia.org/wiki/Indigenous%20statistics | Indigenous statistics is a quantitative research method specific to Indigenous people. It can be better understood as an Indigenous quantitative methodology. Indigenous quantitative methodologies include practices, processes, and research that are done through an Indigenous lens.
The purpose of Indigenous statistics is to diminish the disparities and inequalities faced by Indigenous people globally. Statistics are a reliable source of data, which can be used in the present and future. This is a relatively new concept in the research world. Statistics are the collection of quantitative data that is used to interpret and present data. Indigenous refers to an ethnic group of people who are the earliest inhabitants of, or native to, a land. Connecting these two terms, researchers aim to provide fair and reliable data on Indigenous communities by focusing on three central themes, situated in entering research through a solely Indigenous lens: the cultural framework of data, quantitative methodologies in data, and the situated activity within academic research.
Background
Statistics
Statistics are a collection of quantitative data. Statistics are how data is interpreted and presented. Statistics interpret our reality and influence the understanding of societies. The purpose of Indigenous statistics is to have Indigenous people collect their own data in a fashion they find best suitable for their community. This is done by Indigenous researchers, or through the perspective of Indigenous communities. Statistics, in turn, provide information used to determine theoretical and practical development and produce the notion of open data. Indigenous statistics aims to make statistics a source of reliable information regarding Indigenous societies.
Indigenous people
Indigenous Peoples is a term used to define people with ancestral origins in the land they inhabit. Indigenous peoples are the earliest known inhabitants of the land they inhabit.
Concerns with open dat |
https://en.wikipedia.org/wiki/Sigma%20hole%20interactions | In chemistry, sigma hole interactions (or σ-hole interactions) are a family of intermolecular forces that can occur between several classes of molecules and arise from an energetically stabilizing interaction between a positively-charged site, termed a sigma hole, and a negatively-charged site, typically a lone pair, on different atoms that are not covalently bonded to each other. These interactions are usually rationalized primarily via dispersion and electrostatic charge-transfer, and are characterized by a strong directional preference that makes them useful in applications in which control over supramolecular chemistry is desired.
Molecular basis of interaction
The basis of a sigma hole interaction is an energetically stabilizing interaction between a positively charged site (sigma hole) and a negatively charged site (lone pair) on different atoms. The positive site is produced by a covalent sigma bond between the atom hosting the sigma hole and a neighboring atom. The presence of the bond results in the distortion of the electron density around the host atom, with the density increasing equatorially (with respect to the bond) about the atom but decreasing along the extension of the bond. Through this mechanism, a region of positive electrostatic potential, termed a sigma hole, can be localized onto the surface of an atom bearing a sigma bond. This sigma hole could then engage in electrostatic interactions with a lone pair associated with a negative electrostatic potential.
In addition to the electrostatic interaction described above, dispersive forces are also thought to play a role in the overall interaction. Studies have found electrostatic and dispersive contributions to be roughly comparable in magnitude, and for the dominant contributor to vary from system to system.
Alternatively, sigma hole pair interactions can be conceptualized in terms of the mixing of molecular orbitals. The occupied sigma bonding orbital associated with the bond would give rise |
https://en.wikipedia.org/wiki/Transition-rate%20matrix | In probability theory, a transition-rate matrix (also known as a Q-matrix, intensity matrix, or infinitesimal generator matrix) is an array of numbers describing the instantaneous rate at which a continuous-time Markov chain transitions between states.
In a transition-rate matrix Q, element q_ij (for i ≠ j) denotes the rate departing from state i and arriving in state j. The rates satisfy q_ij ≥ 0, and the diagonal elements are defined such that
q_ii = −Σ_{j ≠ i} q_ij,
and therefore the rows of the matrix sum to zero.
Up to a global sign, a large class of examples of such matrices is provided by the Laplacian of a directed, weighted graph. The vertices of the graph correspond to the Markov chain's states.
Properties
The transition-rate matrix has the following properties:
There is at least one eigenvector with a vanishing eigenvalue, exactly one if the graph of Q is strongly connected.
All other eigenvalues λ fulfill Re(λ) < 0.
All eigenvectors v with a non-zero eigenvalue fulfill Σ_i v_i = 0.
Example
An M/M/1 queue, a model which counts the number of jobs in a queueing system with arrivals at rate λ and services at rate μ, has a tridiagonal transition-rate matrix with q_{i,i+1} = λ, q_{i,i−1} = μ, and diagonal entries q_00 = −λ and q_ii = −(λ + μ) for i ≥ 1.
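The M/M/1 rate matrix can be sketched in code (ours, with an assumed finite truncation of the infinite state space), checking the zero-row-sum property along the way:

```python
def mm1_rate_matrix(lam, mu, n_states):
    """Truncated transition-rate matrix of an M/M/1 queue.

    Off-diagonal: Q[i][i+1] = lam (arrival), Q[i][i-1] = mu (service).
    Diagonal: minus the sum of the other entries in the row.
    """
    Q = [[0.0] * n_states for _ in range(n_states)]
    for i in range(n_states):
        if i + 1 < n_states:
            Q[i][i + 1] = lam
        if i - 1 >= 0:
            Q[i][i - 1] = mu
        Q[i][i] = -sum(Q[i])  # the diagonal is 0.0 here, so this sums the rest
    return Q

Q = mm1_rate_matrix(lam=2.0, mu=3.0, n_states=4)
# Every row sums to zero, as required of a transition-rate matrix:
assert all(abs(sum(row)) < 1e-12 for row in Q)
```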
See also
Stochastic matrix
References
Markov processes
Matrices |
https://en.wikipedia.org/wiki/Diversity%20%28mathematics%29 | In mathematics, a diversity is a generalization of the concept of metric space. The concept was introduced in 2012 by Bryant and Tupper,
who call diversities "a form of multi-way metric". The concept finds application in nonlinear analysis.
Given a set X, let F(X) denote the set of finite subsets of X.
A diversity is a pair (X, δ) consisting of a set X together with a function δ : F(X) → ℝ satisfying
(D1) δ(A) ≥ 0, with δ(A) = 0 if and only if |A| ≤ 1,
and
(D2) if B ≠ ∅ then δ(A ∪ C) ≤ δ(A ∪ B) + δ(B ∪ C).
Bryant and Tupper observe that these axioms imply monotonicity; that is, if A ⊆ B, then δ(A) ≤ δ(B). They state that the term "diversity" comes from the appearance of a special case of their definition in work on phylogenetic and ecological diversities. They give the following examples:
Diameter diversity
Let (X, d) be a metric space. Setting δ(A) = max_{a,b ∈ A} d(a, b) for all finite A defines a diversity.
L₁ diversity
For all finite A ⊆ ℝⁿ, if we define δ(A) = Σ_i max_{a,b ∈ A} |a_i − b_i| then (ℝⁿ, δ) is a diversity.
Phylogenetic diversity
If T is a phylogenetic tree with taxon set X, then for each finite A ⊆ X, define δ(A)
as the length of the smallest subtree of T connecting the taxa in A. Then (X, δ) is a (phylogenetic) diversity.
Steiner diversity
Let (X, d) be a metric space. For each finite A ⊆ X, let δ(A) denote
the minimum length of a Steiner tree within X connecting the elements of A. Then (X, δ) is a
diversity.
Truncated diversity
Let (X, δ) be a diversity. For all finite A define
δ^(t)(A) = min{δ(A), t}. Then if t > 0, (X, δ^(t)) is a diversity.
Clique diversity
If is a graph, and is defined for any finite A as the largest clique of A, then is a diversity.
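The diameter diversity can be sketched in code (a minimal illustration, ours; it checks the triangle-like axiom (D2) on sample one-point sets in the Euclidean plane):

```python
from itertools import combinations
from math import dist  # Euclidean distance (Python 3.8+)

def diameter_diversity(A):
    """delta(A) = maximum pairwise distance; 0 on the empty set and singletons."""
    if len(A) <= 1:
        return 0.0
    return max(dist(a, b) for a, b in combinations(A, 2))

A, B, C = {(0.0, 0.0)}, {(1.0, 0.0)}, {(3.0, 0.0)}
# (D1): delta vanishes exactly on sets with at most one point.
assert diameter_diversity(A) == 0.0
# (D2): delta(A u C) <= delta(A u B) + delta(B u C) for nonempty B.
assert diameter_diversity(A | C) <= diameter_diversity(A | B) + diameter_diversity(B | C)
```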
References
Metric spaces |
https://en.wikipedia.org/wiki/Van%20den%20Berg%E2%80%93Kesten%20inequality | In probability theory, the van den Berg–Kesten (BK) inequality or van den Berg–Kesten–Reimer (BKR) inequality states that the probability that two random events both happen, and at the same time one can find "disjoint certificates" showing that they both happen, is at most the product of their individual probabilities. The special case for two monotone events (the notion as used in the FKG inequality) was first proved by van den Berg and Kesten in 1985, who also conjectured that the inequality holds in general, without requiring monotonicity. Reimer later proved this conjecture. The inequality is applied to probability spaces with a product structure, such as in percolation problems.
Statement
Let X_1, …, X_n be probability spaces, each with finitely many elements. The inequality applies to spaces of the form X = X_1 × ⋯ × X_n, equipped with the product measure, so that each element x = (x_1, …, x_n) is given the probability
P(x) = P_1(x_1) ⋯ P_n(x_n).
For two events A, B ⊆ X, their disjoint occurrence A □ B is defined as the event consisting of configurations x whose memberships in A and in B can be verified on disjoint subsets of indices. Formally, x ∈ A □ B if there exist disjoint subsets I, J ⊆ {1, …, n} such that:
every y ∈ X that agrees with x on I (in other words, y_i = x_i for all i ∈ I) is also in A, and
similarly, every y that agrees with x on J is in B.
The inequality asserts that
P(A □ B) ≤ P(A) P(B)
for every pair of events A and B.
Examples
Coin tosses
If X corresponds to tossing a fair coin 10 times, then each X_i consists of the two possible outcomes, heads or tails, with equal probability. Consider the event A that there are 3 consecutive heads, and the event B that there are at least 5 heads in total. Then A □ B is the following event: there are 3 consecutive heads, and discarding those there are another 5 heads remaining. This event has probability at most P(A)P(B), which is to say the probability of getting A in 10 tosses and getting B in another 10 tosses, independently of each other.
Numerically, P(A) = 520/1024 ≈ 0.51 and P(B) = 638/1024 ≈ 0.62; their disjoint occurrence would imply at least 8 heads, so P(A □ B) ≤ P(at least 8 heads) = 56/1024 ≈ 0.05.
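For these two events the bound can be verified by exhaustive enumeration of all 2^10 outcomes (a sketch, ours; the certificate search simplifies because a certificate for the run is an all-heads window, leaving sum − 3 heads outside it for the other certificate):

```python
from itertools import product

n = 10
outcomes = list(product((0, 1), repeat=n))  # 1 = heads, all equally likely

def has_run(x):          # event A: 3 consecutive heads
    return any(x[i] == x[i + 1] == x[i + 2] == 1 for i in range(n - 2))

def at_least_5(x):       # event B: at least 5 heads in total
    return sum(x) >= 5

def disjoint(x):
    # Certificate for A: an all-heads window of length 3; certificate for B:
    # 5 heads outside that window, i.e. sum(x) - 3 >= 5 on top of a run.
    return has_run(x) and sum(x) >= 8

def p(event):
    return sum(map(event, outcomes)) / len(outcomes)

# Counts: 520/1024, 638/1024 and 56/1024 respectively.
print(p(has_run), p(at_least_5), p(disjoint))
assert p(disjoint) <= p(has_run) * p(at_least_5)   # the BK inequality
```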
Percolation
In (Bernoulli) bond percolation of a graph, the 's are index |
https://en.wikipedia.org/wiki/Black%20Mathematicians%20and%20Their%20Works | Black Mathematicians and Their Works is an edited volume of works in and about mathematics, by African-American mathematicians. It was edited by Virginia Newell, Joella Gipson, L. Waldo Rich, and Beauregard Stubblefield, with a foreword by Wade Ellis, and published in 1980 by Dorrance & Company.
The Basic Library List Committee of the Mathematical Association of America has recommended its inclusion in undergraduate mathematics libraries.
Contents
The book celebrates the achievements of black mathematicians and also records their struggle against racism. It includes reprints of 23 papers of mathematics research and three more on mathematics education, by black mathematicians. It provides brief biographies and photographs of 62 black mathematicians, all long-established at the time of publication (having doctorates prior to 1973). It also reproduces several letters by Lee Lorch documenting racist behavior in mathematical societies, such as exclusion from conferences and their associated social gatherings. An appendix lists universities that have worked with black mathematicians, by the number of doctorates conferred and the number of faculty hired.
As well as two of the editors (Gipson and Stubblefield), the authors whose works are reproduced in the book include
Albert Turner Bharucha-Reid,
David Blackwell,
Lillian K. Bradley,
Marjorie Lee Browne,
Edward M. Carroll,
William Schieffelin Claytor,
Vivienne Malone-Mayes,
Clarence F. Stephens,
Walter Richard Talbot, and
J. Ernest Wilkins Jr.
Reception
Black Mathematicians and Their Works was the first book to collect the works of black mathematicians, and 40 years after its publication it remained the only such book. By demonstrating the successes of black mathematicians, it aimed to counter the then-current opinion that black people could not do mathematics, and provide encouragement to young black future mathematicians.
Edray Herber Goins has named this book his "mathematical comfort food".
References
|
https://en.wikipedia.org/wiki/Pteroma%20pendula | Pteroma pendula, the oil palm bagworm or simply bagworm, is a species of bagworm moth found in East and Southeast Asia that infests oil palm plantations.
Pteroma pendula is among the most economically damaging pests of oil palm plantations in Malaysia and Indonesia, along with Metisa plana. The caterpillars also feed on other trees and shrubs, including Acacia mangium, Delonix regia, Cassia fistula, and Callerya atropurpurea. A total of 31 different species have been identified as host plants for P. pendula. Insecticides are the favoured method of controlling the moth in most commercial plantations. Natural enemies such as predators, parasitoids, and fungi kill up to 4.85% of the population.
Life cycle
The survival rate of P. pendula eggs differs based on the chosen host plant. The species has six larval instars. Pupae are typically found on middle and lower fronds, while caterpillars move higher in search of fresh ones. Dimorphism has been reported in the pupal and imago stages. Males generally live longer than females.
Damage symptoms
P. pendula infestations can be detected by a number of symptoms: holes in leaves, sometimes defoliation, and discolouration of the fronds.
References
Psychidae
Pests (organism)
Pests of oil palm
Fauna of Southeast Asia
Moths described in 1929 |
https://en.wikipedia.org/wiki/Aperiodic%20crystal | Aperiodic crystals lack three-dimensional translational symmetry but still exhibit three-dimensional long-range order. In other words, they are periodic crystals in higher dimensions. They are classified into three different categories: incommensurate modulated structures, incommensurate composite structures, and quasicrystals.
The diffraction patterns of aperiodic crystals contain two sets of peaks: "main reflections" and "satellite reflections". Main reflections are usually stronger in intensity and span a lattice defined by three-dimensional reciprocal lattice vectors. Satellite reflections are weaker in intensity and are known as "lattice ghosts". These reflections do not correspond to any lattice points in physical space and cannot be indexed with the original three vectors. To understand aperiodic crystal structures, one must use the superspace approach. In materials science, "superspace" or higher-dimensional space refers to the concept of describing the structures and properties of materials in terms of dimensions beyond the three dimensions of physical space. This may involve using mathematical models to describe the behavior of atoms or molecules in a material in four, five, or even higher dimensions.
History
The history of aperiodic crystals can be traced back to the early 20th century, when the science of crystallography was in its infancy. At that time, it was generally accepted that the ground state of matter was always an ideal crystal with three-dimensional space group symmetry, or lattice periodicity. However, in the late 1900s, a number of developments in the field of crystallography challenged this belief. Researchers began to focus on the scattering of X-rays and other particles beyond just the Bragg peaks, which allowed them to better understand the effects of defects and finite size on the structure of crystals, as well as the presence of additional spots in diffraction patterns due to periodic variations in the crystal struc |
https://en.wikipedia.org/wiki/RoboForm | RoboForm is a password manager, which is a class of software that allows users to have secure, unique passwords for every website accessed. It is amongst the older password managers on the market, developed by US company Siber Systems, distributed as a freemium product with a subscription plan, available on macOS, Windows, iOS and Android and as a plugin for web browsers.
Overview
Siber Systems is a company founded in 1995 by Vadim Maslov with headquarters in Fairfax, Virginia. The company was founded to capitalize on research into text parsing, compilation and transformation to produce useful, commercially-viable technologies. They released RoboForm as their first consumer product in 1999.
RoboForm was initially a form-filling utility and was further developed into a full-fledged password manager, then delivered with password generator, password capturer, password importer, multi-factor authentication and secure password sharing.
The first business version of RoboForm was released in 2009. A premium cross-platform subscription service for individuals was introduced in 2010, and in 2015 Siber Systems launched RoboForm as a software as a service (SaaS) solution. The freemium model became available in 2017.
See also
List of password managers
References
Password managers
Utilities for Windows |
https://en.wikipedia.org/wiki/EDIM%20technology | Epitope Detection in Monocytes (EDIM) is a technology that uses the innate immune system's mechanisms to detect biomarkers or antigens in immune cells. It is a non-invasive form of liquid biopsy, i.e. biopsy from blood, which analyzes activated macrophages (CD14+/CD16+) for disease-specific epitopes, such as tumor cell components.
Macrophages are part of the human immune system. They are involved in the detection, phagocytosis and destruction of organisms which are deemed harmful.
In the case of cancerous tumors, macrophages ingest tumor cells and dissolve them with the help of enzymes, storing tumor proteins intracellularly, even when little tumor mass is present. With the help of EDIM technology, activated macrophages containing intracellular tumor epitopes can be detected using CD14- and CD16-specific antibodies.
Areas of Application
Currently, EDIM technology is used for the blood test PanTum Detect. Here, the method is applied to examine which individuals would benefit from further cancer detection examinations with imaging procedures (MRI, PET/CT) to clarify a possible tumor disease. The two biomarkers used for PanTum Detect are TKTL1 and DNaseX.
The PanTum Detect blood test exploits the EDIM technology utilizing the fact that activated monocytes/macrophages phagocytose tumor cells and contain tumor proteins intracellularly.
References
Blood tests
Biomarkers
Biotechnology
Endocrine function tests
Cancer screening |
https://en.wikipedia.org/wiki/NebulaGraph | NebulaGraph is an open-source distributed graph database built for super large-scale graphs with millisecond latency. NebulaGraph adopts the Apache 2.0 license and also comes with a wide range of data visualization tools.
History
NebulaGraph was developed in 2018 by Vesoft Inc. with the aim of providing stable and reliable infrastructure software to enterprises across the globe. In May 2019, NebulaGraph was open-sourced on GitHub, and its alpha version was released the same year.
In June 2020, NebulaGraph raised $8M in a series pre-A funding round led by Redpoint China Ventures and Matrix Partners China.
In June 2019, the NebulaGraph 1.0 GA version was released, while version 2.0 GA was released in March 2021. The latest version of NebulaGraph, 3.0.2, was released in March 2022.
See also
Graph database
References
External links
Free database management systems
Document-oriented databases
Distributed computing architecture
Key-value databases
Structured storage
Graph databases |
https://en.wikipedia.org/wiki/Vermont%20SportsCar | Vermont SportsCar (VSC) is a race car manufacturer that designs, engineers, and builds rally, rallycross and other specialty vehicles for teams and private clients. Since 2006 Vermont SportsCar has been the technical partner to Subaru of America and manages the automaker’s racing division Subaru Motorsports USA.
Founded by Lance Smith in 1988, VSC operates in Milton, Vermont, with more than 70 full-time employees. VSC manages motorsports programs primarily within the American Rally Association Championship, Nitro Rallycross series, and the Mount Washington Hill Climb. Additionally VSC designs, builds and campaigns the Gymkhana Subaru vehicles driven by Travis Pastrana.
VSC manufactures and sells aftermarket performance parts for Subaru vehicles under the name VSC Performance. VSC also offers prototyping and engineering services for private race teams and builds custom race and road cars.
In early 2022 VSC launched a multi-car effort in the 2022–23 Nitro Rallycross Championship’s all-electric Group E class.
History
VSC was founded in 1988 by Lance Smith. A native of Williston, Vermont, Smith started as a mechanic preparing race cars for car builder and racer Tivvy Shenton and then in the 1980s for rally driver John Buffum. In 1988, he modified a Volkswagen Golf for a private team in the SCCA ProRally series and spent the next several years competing as a co-driver for several seasons in other cars he built, including a Toyota Celica, Mitsubishi Eclipse, and Mitsubishi Galant VR4. In 1992, Smith won the co-driver's championship in the North American Rally Cup and the Subaru "Pride and Professionalism" award for the best-prepped car. Smith reduced the amount of his co-driving in 1997 to focus on building a Mitsubishi Evolution V for Buffum and growing his business. In 2001 VSC supported Prodrive and Subaru of America with the launch of Subaru Rally Team USA competing in the SCCA ProRally Championship in the USA. In 2003, VSC helped manage the Mitsubishi factory |
https://en.wikipedia.org/wiki/Hooley%27s%20delta%20function | In mathematics, Hooley's delta function Δ(n), also called the Erdős–Hooley delta function, gives the maximum number of divisors of n lying in any interval (u, eu], over all u > 0, where e is Euler's number. The first few terms of this sequence are
.
History
The sequence was first introduced by Paul Erdős in 1974, then studied by Christopher Hooley in 1979.
In 2023, Dimitris Koukoulopoulos and Terence Tao proved that the sum of the first terms, , for . In particular, the average order of to is for any .
Later in 2023 Kevin Ford, Koukoulopoulos, and Tao proved the lower bound , where , fixed , and .
Usage
This function measures the tendency of divisors of a number to cluster.
The growth of this sequence is limited by Δ(n) ≤ d(n), where d(n) is the number of divisors of n.
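As a concrete illustration of the definition, the function can be computed naively from the divisor list. The Python sketch below (names are illustrative) uses the observation that the sliding window (u, eu] attains its maximum divisor count when u sits just below a divisor, so it suffices to count, for each divisor d, the divisors in [d, e·d).

```python
import math

def divisors(n):
    """Sorted list of the divisors of n."""
    small = [d for d in range(1, math.isqrt(n) + 1) if n % d == 0]
    large = [n // d for d in reversed(small) if d * d != n]
    return small + large

def hooley_delta(n):
    """Delta(n): the maximum number of divisors of n in an interval
    (u, e*u], maximised over all real u > 0."""
    divs = divisors(n)
    return max(sum(1 for d2 in divs if d <= d2 < math.e * d) for d in divs)

print([hooley_delta(n) for n in (1, 2, 12, 24)])
```

For example, the divisors of 12 lying in [3, 3e) ≈ [3, 8.15) are 3, 4, and 6, and no window of ratio e captures more, so Δ(12) = 3.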
See also
Divisor function
Euler's number
References
Divisor function
Arithmetic functions
Number theory
Integer sequences |
https://en.wikipedia.org/wiki/Jq%20%28programming%20language%29 | jq is a very high-level lexically scoped functional programming language in which every JSON value is a constant. jq supports backtracking and managing indefinitely long streams of JSON data. It is related to the Icon and Haskell programming languages. The language supports a namespace-based module system and has some support for closures. In particular, functions and functional expressions can be used as parameters of other functions.
The original implementation of jq was in Haskell before being immediately ported to C.
History
jq was created by Stephen Dolan, and released in October 2012.
It was described as being "like sed for JSON data". Support for regular expressions was added in jq version 1.5.
A "wrapper" program for jq named yq adds support for YAML, XML and TOML. It was first released in 2017.
The Go implementation, gojq, was initially released in 2019. gojq notably extends jq to include support for YAML.
The first version of jaq to include extensive support for regular expressions was released in March 2023. This version (0.10) also includes a fast JSON parser. Support for nested functions and recursively defined functions was subsequently added, but as of June 2023, jaq still does not include a "streaming parser" for processing very large JSON documents with minimal memory requirements.
Usage
Command-line usage
jq is typically used at the command line and can be used with other command-line utilities, such as curl. Here is an example showing how the output of a command can be piped to a jq filter to determine the category names associated with this Wikipedia page:
$ curl 'https://en.wikipedia.org/w/api.php?action=parse&page=jq_(programming_language)&format=json' | jq '.parse.categories[]."*"'
The output produced by this pipeline consists of a stream of JSON strings, the first few of which are:
"Articles_with_short_description"
"Short_description_matches_Wikidata"
"Dynamically_typed_programming_languages"
"Functional_languages"
"Programming_l |
https://en.wikipedia.org/wiki/Atkinson%20dithering | Atkinson dithering is a variant of Floyd-Steinberg dithering designed by Bill Atkinson at Apple Computer, and used in the original Macintosh computer.
Implementation
The algorithm achieves dithering using error diffusion, meaning it pushes (adds) the residual quantization error of a pixel onto its neighboring pixels, to be dealt with later. It spreads the debt out over six nearby pixels, each receiving one eighth of the error: the next two pixels to the right of the pixel currently being scanned, the pixel directly below it together with that pixel's immediate left and right neighbors, and the pixel two rows below. Previously scanned pixels receive no error.
The algorithm scans the image from left to right, top to bottom, quantizing pixel values one by one. Each time, the quantization error is transferred to the neighboring pixels without affecting pixels that have already been quantized. Hence, if a number of pixels have been rounded downwards, it becomes more likely that the next pixel is rounded upwards, such that on average, the quantization error is reduced.
Unlike Floyd-Steinberg dithering, only 3/4 of the error is diffused outward. This leads to a more localized dither, at the cost of lower performance on near-white and near-black areas, but the increase in contrast on those areas may be regarded as more visually desirable for some purposes.
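The scan-and-diffuse procedure described above can be sketched in Python as follows. This is a minimal illustrative implementation, not Apple's original code; the function name and the list-of-rows image representation are assumptions made for the example.

```python
def atkinson_dither(pixels):
    """1-bit Atkinson dithering of a grayscale image, given as a list of
    rows of floats in [0, 1]. Returns rows of 0/1 values. The error is
    split into eighths and only 6/8 of it is diffused to neighbours."""
    h, w = len(pixels), len(pixels[0])
    img = [row[:] for row in pixels]          # mutable working copy
    out = [[0] * w for _ in range(h)]
    # neighbours receiving 1/8 of the error each, as (dy, dx) offsets
    offsets = [(0, 1), (0, 2), (1, -1), (1, 0), (1, 1), (2, 0)]
    for y in range(h):
        for x in range(w):
            old = img[y][x]
            new = 1 if old >= 0.5 else 0      # threshold to black/white
            out[y][x] = new
            err = (old - new) / 8.0
            for dy, dx in offsets:
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    img[ny][nx] += err
    return out

# A flat mid-gray patch dithers to a scattered mix of black and white.
print(atkinson_dither([[0.5] * 6 for _ in range(4)]))
```

Because only 6/8 of the error is propagated, flat near-white and near-black regions lose some error entirely, which produces the characteristic high-contrast Atkinson look.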
References
External links
Article on Atkinson dithering by John Earnest
Atkinson Dithering in HTML by Andrew Stephens
Image processing
Computer graphics algorithms |