source | text |
|---|---|
https://en.wikipedia.org/wiki/Valuation%20%28geometry%29 | In geometry, a valuation is a finitely additive function from a collection of subsets of a set to an abelian semigroup.
For example, Lebesgue measure is a valuation on finite unions of convex bodies of $\mathbb{R}^n$. Other examples of valuations on finite unions of convex bodies of $\mathbb{R}^n$ are surface area, mean width, and Euler characteristic.
In geometry, continuity (or smoothness) conditions are often imposed on valuations, but there are also purely discrete facets of the theory. In fact, the concept of valuation has its origin in the dissection theory of polytopes and in particular Hilbert's third problem, which has grown into a rich theory reliant on tools from abstract algebra.
Definition
Let $S$ be a set, and let $\mathcal{S}$ be a collection of subsets of $S$. A function $\phi$ on $\mathcal{S}$ with values in an abelian semigroup is called a valuation if it satisfies
$\phi(A \cup B) + \phi(A \cap B) = \phi(A) + \phi(B)$
whenever $A$, $B$, $A \cup B$, and $A \cap B$ are elements of $\mathcal{S}$.
If $\emptyset \in \mathcal{S}$, then one always assumes $\phi(\emptyset) = 0$.
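As a minimal worked illustration (a standard special case supplied here, not one of the article's examples): the length $\lambda$ on finite unions of intervals in $\mathbb{R}$ satisfies the valuation property, e.g.
$$\lambda([0,2] \cup [1,3]) + \lambda([0,2] \cap [1,3]) = \lambda([0,3]) + \lambda([1,2]) = 3 + 1 = 2 + 2 = \lambda([0,2]) + \lambda([1,3]).$$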
Examples
Some common examples of $\mathcal{S}$ are
the convex bodies in $\mathbb{R}^n$
compact convex polytopes in $\mathbb{R}^n$
convex cones
smooth compact polyhedra in a smooth manifold $X$
Let $\mathcal{K}^n$ be the set of convex bodies in $\mathbb{R}^n$. Then some valuations on $\mathcal{K}^n$ are
the Euler characteristic
Lebesgue measure restricted to $\mathcal{K}^n$
intrinsic volume (and, more generally, mixed volume)
the map $K \mapsto h_K$, where $h_K$ is the support function of $K$
Some other valuations are
the lattice point enumerator $P \mapsto \#(P \cap \mathbb{Z}^n)$, where $P$ is a lattice polytope
cardinality, on the family of finite sets
Valuations on convex bodies
From here on, let $V$ be a finite-dimensional real vector space, let $\mathcal{K}(V)$ be the set of convex bodies in $V$, and let $\phi$ be a valuation on $\mathcal{K}(V)$.
We say $\phi$ is translation invariant if, for all $x \in V$ and $K \in \mathcal{K}(V)$, we have $\phi(K + x) = \phi(K)$.
Let $K, L \in \mathcal{K}(V)$. The Hausdorff distance $d(K, L)$ is defined as
$d(K, L) = \inf\{\varepsilon > 0 : K \subseteq L_\varepsilon \text{ and } L \subseteq K_\varepsilon\},$
where $K_\varepsilon$ is the $\varepsilon$-neighborhood of $K$ under some Euclidean inner product. Equipped with this metric, $\mathcal{K}(V)$ is a locally compact space.
The space of continuous, translation-invariant valuations from $\mathcal{K}(V)$ to $\mathbb{C}$ is denoted by $\mathrm{Val}(V).$
The topology on $\mathrm{Val}(V)$ is the topology of uniform convergence on compact subsets of $\mathcal{K}(V).$ Equipped with the norm
$\|\phi\| = \sup\{|\phi(K)| : K \subseteq B\},$
where $B \subset V$ is a bounded subset with nonempty interior |
https://en.wikipedia.org/wiki/Farshad%20Fatemi | Seyed Farshad Fatemi Ardestani (born 15 May 1973) is an Iranian economist and a member of Iranian National Competition Council.
He is Associate Professor of Economics and Vice President for Administration and Finance at Sharif University of Technology.
Fatemi is known for his works on game theory and industrial organization.
References
External links
Farshad Fatemi
Living people
Iranian economists
Academic staff of Sharif University of Technology
Alumni of University College London
1973 births
Alumni of the University of Essex
Game theorists |
https://en.wikipedia.org/wiki/Quasicrystals%20and%20Geometry | Quasicrystals and Geometry is a book on quasicrystals and aperiodic tiling by Marjorie Senechal, published in 1995 by Cambridge University Press.
One of the main themes of the book is to understand how the mathematical properties of aperiodic tilings such as the Penrose tiling, and in particular the existence of arbitrarily large patches of five-way rotational symmetry throughout these tilings, correspond to the properties of quasicrystals including the five-way symmetry of their Bragg peaks. Neither kind of symmetry is possible for a traditional periodic tiling or periodic crystal structure, and the interplay between these topics led from the 1960s into the 1990s to new developments and new fundamental definitions in both mathematics and crystallography.
Topics
The book is divided into two parts. The first part covers the history of crystallography, the use of X-ray diffraction to study crystal structures through the Bragg peaks formed on their diffraction patterns, and the discovery in the early 1980s of quasicrystals, materials that form Bragg peaks in patterns with five-way symmetry, impossible for a repeating crystal structure. It models the arrangement of atoms in a substance by a Delone set, a set of points in the plane or in Euclidean space that are neither too closely spaced nor too far apart, and it discusses the mathematical and computational issues in X-ray diffraction and the construction of the diffraction spectrum from a Delone set.
Finally, it discusses a method for constructing Delone sets that have Bragg peaks by projecting bounded subsets of higher-dimensional lattices into lower-dimensional spaces.
This material also has strong connections to spectral theory and ergodic theory, deep topics in pure mathematics, but these were omitted in order to make the book accessible to non-specialists in those topics.
Another method for the construction of Delone sets that have Bragg peaks is to choose as points the vertices of certain aperiodic tilings |
https://en.wikipedia.org/wiki/Naiomi%20Cameron | Naiomi Tuere Cameron is an American mathematician working in the field of combinatorics. She is an associate professor at Spelman College as well as the vice president of National Association of Mathematicians. She was previously an associate professor at Lewis & Clark College in Portland, OR.
Cameron was born in Washington, D.C. and raised in Providence, Rhode Island. She attended Howard University for her undergraduate and graduate school, receiving both her B.S. and PhD in Mathematics. In 2019, she was featured on Mathematically Gifted and Black as a Black History Month 2019 Honoree.
Research
Cameron's academic work has been focused on enumerative and algebraic combinatorics and number theory. Her 2002 doctoral dissertation was Random Walks, Trees and Extensions of Riordan Group Techniques. Her other publications include:
Additional work
Cameron is the vice president of the National Association of Mathematicians for the 2019–2020 term.
References
External links
21st-century American mathematicians
American women mathematicians
Combinatorialists
Howard University alumni
Spelman College faculty
Living people
Year of birth missing (living people)
21st-century American women |
https://en.wikipedia.org/wiki/Pokhozhaev%27s%20identity | Pokhozhaev's identity is an integral relation satisfied by stationary localized solutions to a nonlinear Schrödinger equation or nonlinear Klein–Gordon equation. It was obtained by S.I. Pokhozhaev and is similar to the virial theorem. This relation is also known as G.H. Derrick's theorem. Similar identities can be derived for other equations of mathematical physics.
The Pokhozhaev identity for the stationary nonlinear Schrödinger equation
Here is a general form due to H. Berestycki and P.-L. Lions.
Let $g$ be continuous and real-valued, with $g(0) = 0$.
Denote $G(s) = \int_0^s g(t)\,dt$.
Let
$u \in L^\infty_{\mathrm{loc}}(\mathbb{R}^n)$, with $\nabla u \in L^2(\mathbb{R}^n)$, $G(u) \in L^1(\mathbb{R}^n)$, $n \in \mathbb{N}$,
be a solution to the equation
$-\nabla^2 u = g(u)$,
in the sense of distributions.
Then $u$ satisfies the relation
$\frac{n-2}{2} \int_{\mathbb{R}^n} |\nabla u(x)|^2 \, dx = n \int_{\mathbb{R}^n} G(u(x)) \, dx.$
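A heuristic way to see where this relation comes from (a standard scaling argument, sketched here as a supplement rather than quoted from the article): the equation is the Euler–Lagrange equation of the action $S(u) = \int_{\mathbb{R}^n} \left( \tfrac{1}{2} |\nabla u|^2 - G(u) \right) dx$, and for the dilations $u_\lambda(x) = u(x/\lambda)$ one has
$$S(u_\lambda) = \frac{\lambda^{n-2}}{2} \int_{\mathbb{R}^n} |\nabla u|^2 \, dx - \lambda^n \int_{\mathbb{R}^n} G(u) \, dx,$$
so requiring $\frac{d}{d\lambda} S(u_\lambda) \big|_{\lambda = 1} = 0$ at a solution yields exactly the identity above.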
The Pokhozhaev identity for the stationary nonlinear Dirac equation
There is a form of the virial identity for the stationary nonlinear Dirac equation in three spatial dimensions (and also the Maxwell-Dirac equations) and in arbitrary spatial dimension.
Let $n \geq 1$,
and let $\alpha^i$ ($1 \leq i \leq n$) and $\beta$ be the self-adjoint Dirac matrices of size $N \times N$:
$\alpha^i \alpha^j + \alpha^j \alpha^i = 2\delta_{ij} I_N, \qquad \beta^2 = I_N, \qquad \alpha^i \beta + \beta \alpha^i = 0.$
Let $D_0 = -\mathrm{i}\,\alpha \cdot \nabla = -\mathrm{i} \sum_{i=1}^n \alpha^i \frac{\partial}{\partial x^i}$ be the massless Dirac operator.
Let $g$ be continuous and real-valued, with $g(0) = 0$.
Denote $G(s) = \int_0^s g(t)\,dt$.
Let $\phi \in L^\infty_{\mathrm{loc}}(\mathbb{R}^n, \mathbb{C}^N)$ be a spinor-valued solution that satisfies the stationary form of the nonlinear Dirac equation,
$\omega \phi = D_0 \phi + g(\phi^\ast \beta \phi)\,\beta \phi,$
in the sense of distributions,
with some $\omega \in \mathbb{R}$.
Assume that $\phi \in H^1(\mathbb{R}^n, \mathbb{C}^N)$ and $G(\phi^\ast \beta \phi) \in L^1(\mathbb{R}^n)$.
Then $\phi$ satisfies the relation
$\omega \int_{\mathbb{R}^n} \phi^\ast \phi \, dx = \frac{n-1}{n} \int_{\mathbb{R}^n} \phi^\ast D_0 \phi \, dx + \int_{\mathbb{R}^n} G(\phi^\ast \beta \phi) \, dx.$
See also
Virial theorem
Derrick's theorem
References
Mathematical_identities
Theorems in mathematical physics
Physics theorems |
https://en.wikipedia.org/wiki/Photoactivated%20adenylyl%20cyclase | Photoactivated adenylyl cyclase (PAC) is a protein consisting of an adenylyl cyclase enzyme domain directly linked to a BLUF (blue light receptor using FAD) type light sensor domain. When illuminated with blue light, the enzyme domain becomes active and converts ATP to cAMP, an important second messenger in many cells. In the unicellular flagellate Euglena gracilis, PACα and PACβ (euPACs) serve as a photoreceptor complex that senses light for photophobic responses and phototaxis. Small but potent PACs were identified in the genome of the bacteria Beggiatoa (bPAC) and Oscillatoria acuminata (OaPAC). While natural bPAC has some enzymatic activity in the absence of light, variants with no dark activity have been engineered (PACmn).
Use of PACs as optogenetic tools
As PACs consist of a light sensor and an enzyme in a single protein, they can be expressed in other species and cell types to manipulate cAMP levels with light. When bPAC is expressed in mouse sperm, blue light illumination speeds up the swimming of transgenic sperm cells and aids fertilization. When expressed in neurons, illumination changes the branching pattern of growing axons. PAC has been used in mice to clarify the function of neurons in the hypothalamus, which use cAMP signaling to control mating behavior. Expression of PAC together with K+-specific cyclic-nucleotide-gated ion channels (CNGs) has been used to hyperpolarize neurons at very low light levels, which prevents them from firing action potentials.
Other light-activated cyclases
Photoactivated guanylyl cyclases have been discovered in the aquatic fungi Blastocladiella emersonii and Catenaria anguillulae. Unlike PACs, these light-activated cyclases use retinal as their light sensor and are therefore rhodopsin guanylyl cyclases (RhGC). When expressed in Xenopus oocytes or mammalian neurons, RhGCs generate cGMP in response to green light. Therefore, they are considered useful optogenetic tools to investigate cGMP signaling.
References
Pro |
https://en.wikipedia.org/wiki/Protocol%20Wars | A long-running debate in computer science known as the Protocol Wars occurred from the 1970s to the 1990s when engineers, organizations and nations became polarized over the issue of which communication protocol would result in the best and most robust computer networks. This culminated in the Internet–OSI Standards War in the 1980s and early 1990s, which was ultimately "won" by the Internet protocol suite (TCP/IP) by the mid-1990s and has since resulted in most other protocols disappearing.
The pioneers of packet switching technology built computer networks to research and provide data communications in the late 1960s and early 1970s. As more networks emerged in the mid to late 1970s, the debate about interface standards was described as a "battle for access standards". An international collaboration between several national postal, telegraph and telephone (PTT) providers and commercial operators agreed to the X.25 standard in 1976, which was adopted on public data networks providing global coverage. Separately, proprietary data communication protocols also emerged, most notably IBM's Systems Network Architecture and Digital Equipment Corporation's DECnet.
The United States Department of Defense developed and tested TCP/IP during the 1970s in collaboration with universities and researchers in the United States, United Kingdom and France. IPv4 was released in 1981 and the DoD made it standard for all military computer networking. By 1984, an international reference model known as the OSI model had been agreed upon, with which TCP/IP was not compatible. Many governments in Europe – particularly France, West Germany, the United Kingdom and the European Economic Community – and also the United States Department of Commerce mandated compliance with the OSI model and the US Department of Defense planned to transition away from TCP/IP to OSI.
Meanwhile, the development of a complete Internet protocol suite by 1989, and partnerships with the telecommunication and comput |
https://en.wikipedia.org/wiki/Prime%20Suspects%3A%20The%20Anatomy%20of%20Integers%20and%20Permutations | Prime Suspects: The Anatomy of Integers and Permutations is a graphic novel by Andrew Granville, Jennifer Granville, and Robert J. Lewis, released on August 6, 2019 and published by Princeton University Press.
Plot
Prime Suspects: Anatomy of Integers and Permutations is a unique graphic novel that blurs the boundaries between pure visual art, deep mathematics, film noir and police procedurals whilst exploring the nature of scientific research, the role of women in mathematics and paying homage to the titans of mathematical history.
Reception
Judith Reveal of the New York Journal of Books said that "Prime Suspects will appeal to... the mathematician who eats, sleeps, and drinks numbers, start on page one and just enjoy the story . . . the book is fun, and interesting, and a challenge on many levels."
Benjamin Linowitz of MAA Reviews stated that ". . . It's very difficult to write a book on an advanced topic in mathematics that's accessible to math students and enthusiasts yet touches on contemporary research that is of interest to a broad swath of practicing mathematicians. Prime Suspects is such a book. And it's entertaining to boot. I recommend it in the strongest terms."
Paolo Mancosu of the Journal Of Humanistic Mathematics said the book "does a terrific job at presenting readers with a fascinating and realistic picture of how mathematical research is conducted. It does so in a deep way and yet with a light hand without falling into the trap of transforming the novel into a lecture on advanced mathematics or on methodology. Both the story and the illustrations are a delight."
References
External links
Prime Suspects documentary videos by Tommy Britt
Prime Suspects trailer by Hasan Abdulla
90.5 WICN Public Radio podcast interview with Andrew and Jennifer Granville
Reviews
New York Journal Of Books by Judith Reveal,
European Mathematical Society review by Adhemar Bultheel
Rogues Portal review by Anelise Farris
Sondrabooks review by Sondra Ekl |
https://en.wikipedia.org/wiki/Reverse%20Mathematics%3A%20Proofs%20from%20the%20Inside%20Out | Reverse Mathematics: Proofs from the Inside Out is a book by John Stillwell on reverse mathematics, the process of examining proofs in mathematics to determine which axioms are required by the proof. It was published in 2018 by the Princeton University Press.
Topics
The book begins with a historical overview of the long struggles with the parallel postulate in Euclidean geometry, and of the foundational crisis of the late 19th and early 20th centuries. Then, after reviewing background material in real analysis and computability theory, the book concentrates on the reverse mathematics of theorems in real analysis, including the Bolzano–Weierstrass theorem, the Heine–Borel theorem, the intermediate value theorem and extreme value theorem, the Heine–Cantor theorem on uniform continuity, the Hahn–Banach theorem, and the Riemann mapping theorem.
These theorems are analyzed with respect to three of the "big five" subsystems of second-order arithmetic, namely arithmetical comprehension, recursive comprehension, and weak Kőnig's lemma.
Audience
The book is aimed at a "general mathematical audience" including undergraduate mathematics students with an introductory-level background in real analysis. It is intended both to excite mathematicians, physicists, and computer scientists about the foundational issues in their fields, and to provide an accessible introduction to the subject. However, it is not a textbook; for instance, it has no exercises. One theme of the book is that many theorems in this area require axioms in second-order arithmetic that encompass infinite processes and uncomputable functions.
Reception and related reading
Jeffry Hirst criticizes the book, writing that "if one is not too obsessive about the details, Proofs from the Inside Out is an interesting introduction," while finding details that he would prefer to be handled differently, in a topic for which details are important. In particular, in this area, there are multiple choices for how to b |
https://en.wikipedia.org/wiki/Semiorthogonal%20decomposition | In mathematics, a semiorthogonal decomposition is a way to divide a triangulated category into simpler pieces. One way to produce a semiorthogonal decomposition is from an exceptional collection, a special sequence of objects in a triangulated category. For an algebraic variety X, it has been fruitful to study semiorthogonal decompositions of the bounded derived category of coherent sheaves, $D^b(X)$.
Semiorthogonal decomposition
Alexei Bondal and Mikhail Kapranov (1989) defined a semiorthogonal decomposition of a triangulated category $\mathcal{T}$ to be a sequence $\mathcal{A}_1, \ldots, \mathcal{A}_n$ of strictly full triangulated subcategories such that:
for all $1 \leq i < j \leq n$ and all objects $A_i \in \mathcal{A}_i$ and $A_j \in \mathcal{A}_j$, every morphism from $A_j$ to $A_i$ is zero. That is, there are "no morphisms from right to left".
$\mathcal{T}$ is generated by $\mathcal{A}_1, \ldots, \mathcal{A}_n$. That is, the smallest strictly full triangulated subcategory of $\mathcal{T}$ containing $\mathcal{A}_1, \ldots, \mathcal{A}_n$ is equal to $\mathcal{T}$.
The notation $\mathcal{T} = \langle \mathcal{A}_1, \ldots, \mathcal{A}_n \rangle$ is used for a semiorthogonal decomposition.
Having a semiorthogonal decomposition implies that every object of $\mathcal{T}$ has a canonical "filtration" whose graded pieces are (successively) in the subcategories $\mathcal{A}_1, \ldots, \mathcal{A}_n$. That is, for each object T of $\mathcal{T}$, there is a sequence
$0 = T_n \to T_{n-1} \to \cdots \to T_0 = T$
of morphisms in $\mathcal{T}$ such that the cone of $T_i \to T_{i-1}$ is in $\mathcal{A}_i$, for each i. Moreover, this sequence is unique up to a unique isomorphism.
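For orientation, a standard example (Beilinson's decomposition of projective space, stated here as a supplement and of the kind the article's later examples cover) is
$$D^b(\mathbb{P}^n) = \langle \mathcal{O}, \mathcal{O}(1), \ldots, \mathcal{O}(n) \rangle.$$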
One can also consider "orthogonal" decompositions of a triangulated category, by requiring that there are no morphisms from $\mathcal{A}_i$ to $\mathcal{A}_j$ for any $i \neq j$. However, that property is too strong for most purposes. For example, for an (irreducible) smooth projective variety X over a field, the bounded derived category of coherent sheaves never has a nontrivial orthogonal decomposition, whereas it may have a semiorthogonal decomposition, by the examples below.
A semiorthogonal decomposition of a triangulated category may be considered as analogous to a finite filtration of an abelian group. Alternatively, one may consider a semiorthogonal decomposition $\mathcal{T} = \langle \mathcal{A}, \mathcal{B} \rangle$ as closer to a split exact sequence, because the exact sequence $0 \to \mathcal{A} \to \mathcal{T} \to \mathcal{T}/\mathcal{A} \to 0$ of triangulated categories is split by the subcategory $\mathcal{B}$, ma |
https://en.wikipedia.org/wiki/99%20Points%20of%20Intersection | 99 Points of Intersection: Examples—Pictures—Proofs is a book on constructions in Euclidean plane geometry in which three or more lines or curves meet in a single point of intersection. This book was originally written in German by Hans Walser as 99 Schnittpunkte (Eagle / Ed. am Gutenbergplatz, 2004), translated into English by Peter Hilton and Jean Pedersen, and published by the Mathematical Association of America in 2006 in their MAA Spectrum series.
Topics and organization
The book is organized into three sections. The first section provides introductory material, describing different mathematical situations in which multiple curves might meet, and providing different possible explanations for this phenomenon, including symmetry, geometric transformations, and membership of the curves in a pencil of curves. The second section shows the 99 points of intersection of the title. Each is given on its own page, as a large figure with three smaller figures showing its construction, with a one-line caption but no explanatory text. The third section provides background material and proofs for some of these points of intersection, as well as extending and generalizing some of these results.
Some of these points of intersection are standard; for instance, these include the construction of the centroid of a triangle as the point where its three median lines meet, the construction of the orthocenter as the point where the three altitudes meet, and the construction of the circumcenter as the point where the three perpendicular bisectors of the sides meet, as well as two versions of Ceva's theorem. However, others are new to this book, and include intersections related to silver rectangles, tangent circles, the Pythagorean theorem, and the nine-point hyperbola.
Audience
John Jensen writes that "the clear and uncluttered illustrations of intersection make for a rich source for geometric investigation by high school geometry students".
And although Gerry Leversha calls the |
https://en.wikipedia.org/wiki/Fall%20Guys | Fall Guys (formerly known as Fall Guys: Ultimate Knockout) is a free-to-play platform battle royale game developed by Mediatonic and published by Epic Games. The game involves up to 40 players who control jellybean-like characters and compete against each other in a series of randomly selected mini-games such as obstacle courses or soccer. Players are eliminated as the rounds progress until, eventually, the last remaining player is crowned the winner. The game draws inspiration from game shows like Takeshi's Castle, It's a Knockout, Wipeout, and playground games like tag and British Bulldog.
The game was released by Devolver Digital for PlayStation 4 and Windows on 4 August 2020. Following their acquisition of Mediatonic, the publishing rights were transferred to Epic Games. Subsequently, the game was made free-to-play on 21 June 2022 and released on additional platforms including Nintendo Switch, PlayStation 5, Xbox One and Xbox Series X/S, with full cross-platform play support among all platforms. As part of the transition, the game adopted a battle pass system for its monetization, offering avatar customizations to players.
Fall Guys received positive reviews from critics for its chaotic gameplay and visual appearance; however, fan reception has been more critical in recent times. The game was a commercial success, selling more than 10 million copies and attracting more than 50 million players after the game went free-to-play. Since the game's launch, Mediatonic has released numerous updates for the game. The latest update added new objects to the level editor, vault changes, bug fixes, and a new theme.
Gameplay
Up to 40 players compete in shows with battle royale-style gameplay. Players, represented as jellybean-like figures, move around in a three-dimensional playing field, with additional moves such as jumping, grabbing/climbing, or diving to assist gameplay. The aim is to qualify for subsequent rounds by successfully completing each of the randomly select |
https://en.wikipedia.org/wiki/Invasion%20genetics | Invasion genetics is the area of study within biology that examines evolutionary processes in the context of biological invasions. Invasion genetics considers how genetic and demographic factors affect the success of a species introduced outside of its native range, and how the mechanisms of evolution, such as natural selection, mutation, and genetic drift, operate in these populations. Researchers exploring these questions draw upon theory and approaches from a range of biological disciplines, including population genetics, evolutionary ecology, population biology, and phylogeography.
Invasion genetics, due to its focus on the biology of introduced species, is useful for identifying potential invasive species and developing practices for managing biological invasions. It is distinguished from the broader study of invasive species because it is less directly concerned with the impacts of biological invasions, such as environmental or economic harm. In addition to applications for invasive species management, insights gained from invasion genetics also contribute to a broader understanding of evolutionary processes such as genetic drift and adaptive evolution.
History
Descriptions of invasive species
Charles Elton formed the basis for examining biological invasions as a unified issue in his 1958 monograph, The Ecology of Invasions by Animals and Plants, drawing together case studies of species introductions. Other important events in the study of invasive species include a series of issues published by the Scientific Committee on Problems of the Environment in the 1980s and the founding of the journal Biological Invasions in 1999. Much of the research motivated by Elton's monograph is generally identified with invasion ecology, and focuses on the ecological causes and impacts of biological invasions.
The Genetics of Colonizing Species
The evolutionary modern synthesis in the early 20th century brought together Charles Darwin's theory of evolution by natural s |
https://en.wikipedia.org/wiki/Difference%20Equations%3A%20From%20Rabbits%20to%20Chaos | Difference Equations: From Rabbits to Chaos is an undergraduate-level textbook on difference equations, a type of recurrence relation in which the values of a sequence are determined by equations involving differences of successive terms of the sequence. It was written by Paul Cull, Mary Flahive, and Robby Robson, and published by Springer-Verlag in their Undergraduate Texts in Mathematics series (Vol. 111, 2005, doi:10.1007/0-387-27645-9).
Topics
After an introductory chapter on the Fibonacci numbers and the rabbit population dynamics example based on these numbers that Fibonacci introduced in his book Liber Abaci, the book includes chapters on
homogeneous linear equations, finite difference equations and generating functions, nonnegative difference equations and roots of characteristic polynomials, the Leslie matrix in population dynamics, matrix difference equations and Markov chains, recurrences in modular arithmetic, algorithmic applications of fast Fourier transforms, and nonlinear difference equations and dynamical systems. Four appendices include a set of worked problems, background on complex numbers and linear algebra, and a method of Morris Marden for testing whether the sequence defined by a difference equation converges to zero.
Reception and related reading
Other books on similar topics include A Treatise on the Calculus of Finite Differences by George Boole, Introduction to Difference Equations by S. Goldberg, Difference Equations: An Introduction with Applications by W. G. Kelley and A. C. Peterson, An Introduction to Difference Equations by S. Elaydi, Theory of Difference Equations: An Introduction by V. Lakshmikantham and D. Trigiante, and Difference Equations: Theory and Applications by R. E. Mickens. However, From Rabbits to Chaos places a greater emphasis on computation than theory compared to some of these other books. Reviewer Henry Ricardo writes that the book is "more suitable to an undergraduate course" than its alternatives, despite be |
https://en.wikipedia.org/wiki/Complexity%20and%20Real%20Computation | Complexity and Real Computation is a book on the computational complexity theory of real computation. It studies algorithms whose inputs and outputs are real numbers, using the Blum–Shub–Smale machine as its model of computation. For instance, this theory is capable of addressing a question posed in 1991 by Roger Penrose in The Emperor's New Mind: "is the Mandelbrot set computable?"
The book was written by Lenore Blum, Felipe Cucker, Michael Shub and Stephen Smale, with a foreword by Richard M. Karp, and published by Springer-Verlag in 1998 (doi:10.1007/978-1-4612-0701-6).
Purpose
Stephen Vavasis observes that this book fills a significant gap in the literature: although theoretical computer scientists working on discrete algorithms had been studying models of computation and their implications for the complexity of algorithms since the 1970s, researchers in numerical algorithms had for the most part failed to define their model of computation, leaving their results on a shaky foundation. Beyond the goal of making this aspect of the topic more well-founded, the book also has the aims of presenting new results in the complexity theory of real-number computation, and of collecting previously-known results in this theory.
Topics
The introduction of the book reprints the paper "Complexity and real computation: a manifesto", previously published by the same authors. This manifesto explains why classical discrete models of computation such as the Turing machine are inadequate for the study of numerical problems in areas such as scientific computing and computational geometry, motivating the newer model studied in the book. Following this, the book is divided into three parts.
Part I of the book sets up models of computation over any ring, with unit cost per ring operation. It provides analogues of recursion theory and of the P versus NP problem in each case, and proves the existence of NP-complete problems analogously to the proof of the Cook–Levin theorem in the cl |
https://en.wikipedia.org/wiki/Heaven%20Benchmark | Heaven Benchmark is benchmarking software based on the UNIGINE Engine. The benchmark was developed and published by UNIGINE Company in 2009. The main purpose of the software is performance and stability testing of GPUs. Users can choose a workload preset, Basic or Extreme, or set custom parameters. The benchmark 3D scene is a steampunk-style city on flying islands in the middle of the clouds. The scene is GPU-intensive because of tessellation used for all the surfaces, dynamic sky with volumetric clouds and day-night cycle, real-time global illumination, and screen-space ambient occlusion.
Heaven and other benchmarks by UNIGINE Company are often used by hardware reviewers to compare performance of GPUs and by overclockers for online and offline competitions in GPU overclocking. Running Heaven (or another benchmark by UNIGINE Company) produces a performance score: the higher the numbers, the better the performance. Heaven Benchmark was shipped with Zotac GPUs and is included in the Phoronix Test Suite.
Heaven Benchmark is claimed to be the first DirectX 11 benchmark. It was officially introduced at the Windows 7 presentation on October 22, 2009.
Technological features
Visuals powered by UNIGINE 1 Engine
Support for Windows XP, Windows Vista, Windows 7, Windows 8, Linux, macOS
Support for DirectX 9, DirectX 11 and OpenGL 4.0
Support for NVIDIA SLI and AMD CrossFire
GPU temperature and clock monitoring
Adaptive hardware tessellation
Dynamic sky with volumetric clouds and tweakable day-night cycle
Real-time global illumination and screen-space ambient occlusion
Support for stereo 3D and multi-monitor configurations
Cinematic and interactive fly/walk-through camera modes
See also
Benchmark
Overclocking
References
External links
Official website
Benchmarks (computing) |
https://en.wikipedia.org/wiki/Generalized%20pencil-of-function%20method | Generalized pencil-of-function method (GPOF), also known as matrix pencil method, is a signal processing technique for estimating a signal or extracting information with complex exponentials. Being similar to Prony and original pencil-of-function methods, it is generally preferred to those for its robustness and computational efficiency.
The method was originally developed by Yingbo Hua and Tapan Sarkar for estimating the behaviour of electromagnetic systems by its transient response, building on Sarkar's past work on the original pencil-of-function method. The method has a plethora of applications in electrical engineering, particularly related to problems in computational electromagnetics, microwave engineering and antenna theory.
Method
Mathematical basis
A transient electromagnetic signal can be represented as:
$y(t) = x(t) + n(t) \approx \sum_{i=1}^{M} R_i e^{s_i t} + n(t), \qquad 0 \leq t \leq T,$
where
$y(t)$ is the observed time-domain signal,
$n(t)$ is the signal noise,
$x(t)$ is the actual signal,
$R_i$ are the residues,
$s_i$ are the poles of the system, defined as $s_i = -\alpha_i + \mathrm{j}\omega_i$,
$z_i = e^{s_i T_s}$ by the identities of Z-transform,
$\alpha_i$ are the damping factors and
$\omega_i$ are the angular frequencies.
The same sequence, sampled by a period of $T_s$, can be written as the following:
$y[k T_s] = x[k T_s] + n[k T_s] \approx \sum_{i=1}^{M} R_i z_i^k + n[k T_s], \qquad k = 0, 1, \ldots, N-1.$
Generalized pencil-of-function estimates the optimal $M$ and the $z_i$'s.
Noise-free analysis
For the noiseless case, two $(N-L) \times L$ Hankel matrices, $Y_1$ and $Y_2$, are produced:
$Y_1 = \begin{bmatrix} y(0) & y(1) & \cdots & y(L-1) \\ y(1) & y(2) & \cdots & y(L) \\ \vdots & \vdots & \ddots & \vdots \\ y(N-L-1) & y(N-L) & \cdots & y(N-2) \end{bmatrix}, \qquad Y_2 = \begin{bmatrix} y(1) & y(2) & \cdots & y(L) \\ y(2) & y(3) & \cdots & y(L+1) \\ \vdots & \vdots & \ddots & \vdots \\ y(N-L) & y(N-L+1) & \cdots & y(N-1) \end{bmatrix},$
where $L$ is defined as the pencil parameter. $Y_1$ and $Y_2$ can be decomposed into the following matrices:
$Y_1 = Z_1 B Z_2, \qquad Y_2 = Z_1 B Z_0 Z_2,$
where
$Z_1 = \begin{bmatrix} 1 & 1 & \cdots & 1 \\ z_1 & z_2 & \cdots & z_M \\ \vdots & \vdots & \ddots & \vdots \\ z_1^{N-L-1} & z_2^{N-L-1} & \cdots & z_M^{N-L-1} \end{bmatrix}, \qquad Z_2 = \begin{bmatrix} 1 & z_1 & \cdots & z_1^{L-1} \\ 1 & z_2 & \cdots & z_2^{L-1} \\ \vdots & \vdots & \ddots & \vdots \\ 1 & z_M & \cdots & z_M^{L-1} \end{bmatrix},$
and $B$ and $Z_0$ are diagonal matrices with sequentially-placed $R_i$ and $z_i$ values, respectively.
If $M \leq L \leq N - M$, the generalized eigenvalues of the matrix pencil
$Y_2 - \lambda Y_1$
yield the poles of the system, which are $\{z_i\}_{i=1}^{M}$. Then, the generalized eigenvectors $p_i$ can be obtained by the following identities:
$\left( Y_1^{+} Y_2 - z_i I \right) p_i = 0, \qquad i = 1, \ldots, M,$
where the ${}^{+}$ denotes the Moore–Penrose inverse, also known as the pseudo-inverse. Singular value decomposition can be employed to compute the pseudo-inverse.
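The noise-free pencil step above is straightforward to prototype. The following is a minimal NumPy sketch (an illustration, not the authors' code; the function name, the default choice of pencil parameter, and the test signal are assumptions), recovering the poles from the eigenvalues of $Y_1^{+} Y_2$ and the residues by a least-squares fit:

import numpy as np

def matrix_pencil(y, M, L=None):
    # Estimate poles z_i and residues R_i of y[k] ~ sum_i R_i * z_i**k.
    N = len(y)
    L = L or N // 2                          # pencil parameter, M <= L <= N - M
    # Hankel matrices: Y2 is Y1 shifted forward by one sample.
    Y1 = np.array([y[i:i + L] for i in range(N - L)])
    Y2 = np.array([y[i + 1:i + L + 1] for i in range(N - L)])
    # Poles: the M dominant eigenvalues of pinv(Y1) @ Y2 (pseudo-inverse via SVD).
    eig = np.linalg.eigvals(np.linalg.pinv(Y1) @ Y2)
    z = eig[np.argsort(-np.abs(eig))][:M]
    # Residues: linear least-squares fit of y against the columns z_i**k.
    V = np.vander(z, N, increasing=True).T   # V[k, i] = z_i**k
    R, *_ = np.linalg.lstsq(V, y, rcond=None)
    return z, R

# Example: two damped complex exponentials, recovered exactly in the noiseless case.
k = np.arange(64)
y = 2.0 * (0.9 * np.exp(1j * 0.3)) ** k + 0.5 * (0.95 * np.exp(-1j * 1.1)) ** k
z, R = matrix_pencil(y, M=2)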
Noise filtering
If noise is present in the system, $Y_1$ and $Y_2$ are combined in a general data matrix, $Y$:
$Y = \begin{bmatrix} y(0) & y(1) & \cdots & y(L) \\ y(1) & y(2) & \cdots & y(L+1) \\ \vdots & \vdots & \ddots & \vdots \\ y(N-L-1) & y(N-L) & \cdots & y(N-1) \end{bmatrix},$
where $y$ is the noisy data. For efficient fil |
https://en.wikipedia.org/wiki/Unraid | Unraid is a proprietary Linux-based operating system designed to run on home media server setups that operates as a network-attached storage device, application server, and virtualization host. Unraid is proprietary software developed and maintained by Lime Technology, Inc. Users of the software are encouraged to write and use plugins and Docker applications to extend the functionality of their systems.
Features
Unraid's primary feature is the ability to easily create and manage RAID arrays in hardware-agnostic ways, allowing users to use nearly any combination of hard drives to create an array, regardless of model, capacity, or connection type. Since Unraid saves data to individual drives rather than spreading single files out over multiple drives, users can create shares, which are groups of files that can be written to multiple drives (as determined by the user or system) and allow easy access and management by users.
Unraid also allows users to create Docker containers or virtual machines to host applications on the system. For example, a user could use a pre-made Docker container to host applications such as Plex, Jellyfin, and others.
Unraid's user license is attached to a specific USB flash drive, which may be linked to a user's forum account.
Technical specifications
Unraid is based on Linux Slackware.
Supported filesystems: XFS, Btrfs, ZFS and ReiserFS. ReiserFS is supported only for legacy reasons and backward compatibility, and as a general rule shouldn't be used on new implementations.
GPL compliance
Unraid uses the Linux kernel and its filesystems. It most notably contains a greatly modified version of the Linux md facilities. The source code is distributed as part of the USB system image and is visible in the running Unraid OS; standard archive tools can be used to extract it from the system image without booting.
References
Operating systems based on the Linux kernel
Proprietary operating systems
RAID |
https://en.wikipedia.org/wiki/Recfiles | recfiles is a file format for human-editable, plain text databases.
Databases using this file format can be edited using any text editor. recfiles allow for basic relational database operations, typing, auto-incrementing, as well as a simple join operation.
Recutils is a collection of tools, like recfmt, recsel, and rec2csv, used to work with recfile databases.
Various software libraries support the format.
Syntax
Data are stored in text files with empty lines separating records. Fields within a record are lines starting with their name and a colon; it is possible to wrap long entries. Multiple record types can be maintained in a single text file.
Example
# This is a recfile document.
%rec: Texts
%type: Year int

Author: Doug McIlroy
Year: 1964
Note: The Origin of Unix Pipes

Title: Unix Text Processing
Author: Dale Dougherty
Author: Tim O'Reilly
Year: 1987
Publisher: Hayden Books

Author: William Shakespeare
Title: Hamlet
Year: 1599
Year: 1600
Year: 1601
The following example command outputs three lines (from the two matching entries, one of which has two authors):
$ recsel -e 'Year > "1900"' -p Author
Author: Doug McIlroy
Author: Dale Dougherty
Author: Tim O'Reilly
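A further query in the same style (a hypothetical example, assuming the document above is saved as books.rec; recsel's exact quoting and output formatting may vary across recutils versions):

$ recsel -e 'Title = "Hamlet"' -p Author books.rec
Author: William Shakespeare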
See also
asciidoc
TOML
org-mode
References
External links
Computer file formats
Computer-related introductions in 2010
Data serialization formats
GNU Project software
Lightweight markup languages
Open formats |
https://en.wikipedia.org/wiki/Adrian%20Mathias | Adrian Richard David Mathias (born 12 February 1944) is a British mathematician working in set theory.
The forcing notion Mathias forcing is named for him.
Career
Mathias was educated at Shrewsbury and Trinity College, Cambridge, where he read mathematics and graduated in 1965. After graduation, he moved to Bonn in Germany where he
studied with Ronald Jensen, visiting UCLA, Stanford, the University of Wisconsin, and Monash University during that period.
In 1969, he returned to Cambridge as a research fellow at Peterhouse and was admitted to the Ph.D. at Cambridge University in 1970. From 1969 to 1990, Mathias was a fellow of Peterhouse; during this period, he was the editor of the Mathematical Proceedings of the Cambridge Philosophical Society from 1972 to 1974, spent one academic year (1978/79) as Hochschulassistent to Jensen in Freiburg and another year (1989/90) at the MSRI in Berkeley. After leaving Peterhouse in 1990, Mathias had visiting positions in Warsaw, at the Mathematisches Forschungsinstitut Oberwolfach, at the CRM in Barcelona, and in Bogotá, before becoming Professor at the Université de la Réunion. He retired from his professorship in 2012 and was admitted to the higher degree of Doctor of Science at the University of Cambridge in 2015.
Work
Mathias became mathematically active soon after the introduction of forcing by Paul Cohen, and Kanamori credits his survey of forcing that was eventually published as Surrealist landscape with figures as being a "vital source" on forcing in its early days.
His paper Happy families, extending his 1968 Cambridge thesis, proves important properties of the forcing now known as Mathias forcing. In the same paper he shows that no (infinite) maximal almost disjoint family can be analytic.
Mathias also used forcing to separate two weak forms of the Axiom of choice, showing that the ordering principle, which states that any set can be linearly ordered, does not imply the Boolean Prime Ideal Theorem.
His more recent |
https://en.wikipedia.org/wiki/Boltzmann%20sampler | A Boltzmann sampler is an algorithm intended for random sampling of combinatorial structures. If the object size is viewed as its energy, and the argument of the corresponding generating function is interpreted in terms of the temperature of the physical system, then a Boltzmann sampler returns an object from a classical Boltzmann distribution.
The concept of Boltzmann sampler was proposed by Philippe Duchon, Philippe Flajolet, Guy Louchard and Gilles Schaeffer in 2004.
Description
The concept of Boltzmann sampling is closely related to the symbolic method in combinatorics.
Let $\mathcal{C}$ be a combinatorial class with an ordinary generating function $C(z)$ which has a nonzero radius of convergence $\rho$, i.e. is complex analytic. Formally speaking, if each object
$c \in \mathcal{C}$ is equipped with a non-negative integer size $|c|$, then the generating function is defined as
$C(z) = \sum_{c \in \mathcal{C}} z^{|c|} = \sum_{n \geq 0} c_n z^n,$
where $c_n$ denotes the number of objects of size $n$. The size function is typically used to denote the number of vertices in a tree or in a graph, the number of letters in a word, etc.
A Boltzmann sampler for the class $\mathcal{C}$ with a parameter $x$ such that $0 < x < \rho$, denoted as
$\Gamma \mathcal{C}(x),$
returns an object $c \in \mathcal{C}$ with probability
$\mathbb{P}(c) = \frac{x^{|c|}}{C(x)}.$
Construction
Finite sets
If $\mathcal{C}$ is finite, then an element $c \in \mathcal{C}$ is drawn with probability proportional to $x^{|c|}$.
Disjoint union
If the target class is a disjoint union of two other classes, $\mathcal{C} = \mathcal{A} + \mathcal{B}$, and the generating functions $A(z)$ and $B(z)$ of $\mathcal{A}$ and $\mathcal{B}$ are known, then the Boltzmann sampler for $\mathcal{C}$ can be obtained as
$\Gamma \mathcal{C}(x) = \left( \mathrm{Bern}\!\left( \tfrac{A(x)}{A(x) + B(x)} \right) \longrightarrow \Gamma \mathcal{A}(x) \mid \Gamma \mathcal{B}(x) \right),$
where $(\mathrm{Bern}(p) \longrightarrow f \mid g)$ stands for "if the random variable $\mathrm{Bern}(p)$ is 1, then execute $f$, else execute $g$". More generally, if the disjoint union is taken over a finite set, the resulting Boltzmann sampler can be represented using a random choice with probabilities proportional to the values of the generating functions.
Cartesian product
If $\mathcal{C} = \mathcal{A} \times \mathcal{B}$ is a class constructed of ordered pairs $(a, b)$ where $a \in \mathcal{A}$ and $b \in \mathcal{B}$, then the corresponding Boltzmann sampler can be obtained as
$\Gamma \mathcal{C}(x) = \left( \Gamma \mathcal{A}(x),\ \Gamma \mathcal{B}(x) \right),$
i.e. by forming a pair with $a$ and $b$ drawn independently from $\Gamma \mathcal{A}(x)$ and $\Gamma \mathcal{B}(x)$.
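The finite-set, disjoint-union, and product rules above already suffice to sample a nontrivial class. The following is a minimal Python sketch (an illustration supplied here, not code from the original paper) for plane binary trees, whose class satisfies $\mathcal{B} = 1 + \mathcal{Z} \times \mathcal{B} \times \mathcal{B}$, so that $B(z) = 1 + z B(z)^2$ with singularity at $z = 1/4$:

import math
import random

def B(x):
    # Ordinary generating function of binary trees, sized by internal nodes.
    return (1.0 - math.sqrt(1.0 - 4.0 * x)) / (2.0 * x)

def sample_tree(x):
    # Boltzmann sampler: disjoint union (leaf vs. internal node), then product.
    assert 0.0 < x < 0.25
    if random.random() < 1.0 / B(x):          # leaf branch, probability 1/B(x)
        return "leaf"
    return (sample_tree(x), sample_tree(x))   # two independent recursive draws

# The expected tree size grows as x approaches the singularity 1/4.
tree = sample_tree(0.24)

The leaf branch has probability $1/B(x)$ because the leaf subclass has generating function 1, in line with the disjoint-union rule above.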
Sequence
If is composed of all the finite sequences of ele |
https://en.wikipedia.org/wiki/The%20Mathematical%20Coloring%20Book | The Mathematical Coloring Book: Mathematics of Coloring and the Colorful Life of Its Creators is a book on graph coloring, Ramsey theory, and the history of development of these areas, concentrating in particular on the Hadwiger–Nelson problem and on the biography of Bartel Leendert van der Waerden. It was written by Alexander Soifer and published by Springer-Verlag in 2009.
Topics
The book "presents mathematics as a human endeavor" and "explores the birth of ideas and moral dilemmas of the times between and during the two World Wars". As such, as well as covering the mathematics of its topics, it includes biographical material and correspondence with many of the people involved in creating it, including in-depth coverage of Issai Schur and Bartel Leendert van der Waerden, in particular studying the question of van der Waerden's complicity with the Nazis in his war-time service as a professor in Nazi Germany. It also includes biographical material on Paul Erdős, Frank P. Ramsey, Emmy Noether, Alfred Brauer, Richard Courant, Kenneth Falconer, Nicolas de Bruijn, Hillel Furstenberg, and Tibor Gallai, among others, as well as many historical photos of these subjects.
Mathematically, the book considers problems "on the boundary of geometry, combinatorics, and number theory", involving graph coloring problems such as the four color theorem, and generalizations of coloring in Ramsey theory where the use of a too-small number of colors leads to monochromatic structures larger than a single graph edge. Central to the book is the Hadwiger–Nelson problem, the problem of coloring the points of the Euclidean plane in such a way that no two points of the same color are a unit distance apart. Other topics covered by the book include Van der Waerden's theorem on monochromatic arithmetic progressions in colorings of the integers and its generalization to Szemerédi's theorem, the Happy ending problem, Rado's theorem, and questions in the foundations of mathematics involving th |
https://en.wikipedia.org/wiki/Derived%20noncommutative%20algebraic%20geometry | In mathematics, derived noncommutative algebraic geometry, the derived version of noncommutative algebraic geometry, is the geometric study of derived categories and related constructions of triangulated categories using categorical tools. Some basic examples include the bounded derived category of coherent sheaves on a smooth variety, $D^b(X)$, called its derived category, or the derived category of perfect complexes on an algebraic variety, denoted $\mathrm{Perf}(X)$. For instance, the derived category of coherent sheaves on a smooth projective variety can be used as an invariant of the underlying variety for many cases (if $X$ has an ample (anti-)canonical sheaf). Unfortunately, studying derived categories as geometric objects in themselves does not have a standardized name.
Derived category of projective line
The derived category of $\mathbb{P}^1$ is one of the motivating examples for derived non-commutative schemes due to its easy categorical structure. Recall that the Euler sequence of $\mathbb{P}^1$ is the short exact sequence
$0 \to \mathcal{O}(-2) \to \mathcal{O}(-1)^{\oplus 2} \to \mathcal{O} \to 0;$
if we consider the two terms on the right as a complex, then we get the distinguished triangle
$\mathcal{O}(-1)^{\oplus 2} \to \mathcal{O} \to \mathcal{O}(-2)[+1].$
Since $\mathcal{O}(-2) \cong \operatorname{Cone}\!\left( \mathcal{O}(-1)^{\oplus 2} \to \mathcal{O} \right)[-1]$, we have constructed this sheaf using only categorical tools. We could repeat this again by tensoring the Euler sequence by the flat sheaf $\mathcal{O}(-1)$, and apply the cone construction again. If we take the duals of the sheaves, then we can construct all of the line bundles in $D^b(\mathbb{P}^1)$ using only its triangulated structure. It turns out the correct way of studying derived categories from their objects and triangulated structure is with exceptional collections.
Semiorthogonal decompositions and exceptional collections
The technical tools for encoding this construction are semiorthogonal decompositions and exceptional collections. A semiorthogonal decomposition of a triangulated category $\mathcal{T}$ is a collection of full triangulated subcategories $\mathcal{T}_1, \ldots, \mathcal{T}_n$ such that the following two properties hold
(1) For objects $T_i \in \mathcal{T}_i$ we have $\operatorname{Hom}(T_i, T_j) = 0$ for $i > j$
(2) The subcategories generate $\mathcal{T}$, meaning every object can be decomposed into a sequence |
https://en.wikipedia.org/wiki/De%20numeris%20triangularibus%20et%20inde%20de%20progressionibus%20arithmeticis%3A%20Magisteria%20magna | De numeris triangularibus et inde de progressionibus arithmeticis: Magisteria magna is a 38-page mathematical treatise written in the early 17th century by Thomas Harriot, lost for many years, and finally published in facsimile form in 2009 in the book Thomas Harriot's Doctrine of Triangular Numbers: the "Magisteria Magna". Harriot's work dates from before the invention of calculus, and uses finite differences to accomplish many of the tasks that would later be made more easy by calculus.
De numeris triangularibus
Thomas Harriot wrote De numeris triangularibus et inde de progressionibus arithmeticis: Magisteria magna in the early 1600s, and showed it to his friends. By 1618 it was complete, but in 1621 Harriot died before publishing it. Some of its material was published posthumously, in 1631, as Artis analyticae praxis, but the rest languished in the British Library among many other pages of Harriot's works, and became forgotten until its rediscovery in the late 1700s. It was finally published in its entirety, as part of the 2009 book Thomas Harriot’s Doctrine of Triangular Numbers: the "Magisteria Magna".
The title can be translated as "The Great Doctrine of triangular numbers and, through them, of arithmetic progressions". Harriot's work concerns finite differences, and their uses in interpolation for calculating mathematical tables for navigation. Harriot forms the triangular numbers through the inverse process to finite differencing, partial summation, starting from a sequence of constant value one.
Repeating this process produces the higher-order binomial coefficients,
which in this way can be thought of as generalized triangular numbers, and which give the first part of Harriot's title.
Harriot's results were only improved 50 years later by Isaac Newton, and prefigure Newton's use of Newton polynomials for interpolation. As reviewer Matthias Schemmel writes, this work "shows what was possible in dealing with functional relations before the advent of the ca |
https://en.wikipedia.org/wiki/AG%20Neovo | Associated Industries China, Inc., known as AG Neovo, is a Taiwan-based multinational computer hardware and electronics company, headquartered in Nangang District, Taipei, Taiwan. Its main products include computer monitors, digital signage, commercial displays, large format displays, surveillance displays, and healthcare displays. The company was established on May 18, 1978. In 1999, it transitioned its business direction to the development of electronic technology. In October of the same year, the company launched its own brand, AG Neovo, with branch offices for Europe, Asia, and North America. Its digital photo frames and desktop computer monitors have received the iF Product Design Award and Taiwan Excellence Awards.
Brand Name
AG Neovo. AG is an abbreviation of Aktiengesellschaft, which is a German term for a public limited company. Neovo is said to be a portmanteau of two Greek words, Neo and Vo.
History
Associated Industries China, Inc. was founded in 1978 in Taipei, Taiwan, producing steel intermodal containers as the revenue source.
In 1992, it was first listed on the Taiwan Stock Exchange under the ticker code 9912. In October 1999, it launched its own brand name AG Neovo.
In 2000, the company's business direction changed to the hi-tech industry. Initially, the business covered both OEM computer monitor manufacturing and the sales and marketing of AG Neovo-branded products.
In 2003, it withdrew from the OEM business to focus on its own-brand business. The product line includes computer monitors, digital photo frames, large format displays, surveillance displays and digital signage display products.
In 2014, it set up the healthcare business unit. This product line includes dental handpieces and portable dental units.
In 2017, it set up the Solutions business unit. This product line includes cloud-based digital signage, interactive flat panel displays and display management software.
Products
Monitor and Hardware Displays: Desktop monitors, sec |
https://en.wikipedia.org/wiki/Neil%20Ferguson%20%28epidemiologist%29 | Neil Morris Ferguson (born 1968) is a British epidemiologist and professor of mathematical biology, who specialises in the patterns of spread of infectious disease in humans and animals. He is the director of the Jameel Institute, and of the MRC Centre for Global Infectious Disease Analysis, and head of the Department of Infectious Disease Epidemiology in the School of Public Health and Vice-Dean for Academic Development in the Faculty of Medicine, all at Imperial College London.
Ferguson has used mathematical modelling to provide data on several disease outbreaks including the 2001 United Kingdom foot-and-mouth outbreak, the swine flu outbreak in 2009 in the UK, the 2012 Middle East respiratory syndrome coronavirus outbreak and the ebola epidemic in Western Africa in 2016. His work has also included research on mosquito-borne diseases including zika fever, yellow fever, dengue fever and malaria.
In February 2020, during the COVID-19 pandemic, which was first detected in China, Ferguson and his team used statistical models to estimate that cases of coronavirus disease 2019 (COVID-19) were significantly under-detected in China. He is part of the Imperial College COVID-19 Response Team.
Early life and education
Ferguson was born in Whitehaven, Cumberland, but grew up in Mid Wales, where he attended Llanidloes High School. His father was an educational psychologist, while his mother was a librarian who later became an Anglican priest.
He received his Bachelor of Arts degree in Physics in 1990 at Lady Margaret Hall, Oxford, and his Doctor of Philosophy degree in theoretical physics in 1994 at Linacre College, Oxford. His doctoral research investigated interpolations from crystalline to dynamically triangulated random surfaces and was supervised by John Wheater. It was there that he attended a lecture by Robert May on modelling the HIV epidemic, which together with the death of a friend's brother from AIDS, interested him in pursuing the mathematical modelling of in |
https://en.wikipedia.org/wiki/Cartier%20isomorphism | In algebraic geometry, the Cartier isomorphism is a certain isomorphism between the cohomology sheaves of the de Rham complex of a smooth algebraic variety over a field of positive characteristic, and the sheaves of differential forms on the Frobenius twist of the variety. It is named after Pierre Cartier. Intuitively, it shows that de Rham cohomology in positive characteristic is a much larger object than one might expect. It plays an important role in the approach of Deligne and Illusie to the degeneration of the Hodge–de Rham spectral sequence.
Statement
Let k be a field of characteristic p > 0, and let $f : X \to S$ be a morphism of k-schemes. Let $X^{(p)}$ denote the Frobenius twist and let $F : X \to X^{(p)}$ be the relative Frobenius. The Cartier map is defined to be the unique morphism
$C^{-1} : \bigoplus_i \Omega^i_{X^{(p)}/S} \to \bigoplus_i \mathcal{H}^i\!\left( F_\ast \Omega^\bullet_{X/S} \right)$
of graded $\mathcal{O}_{X^{(p)}}$-algebras such that $C^{-1}(d(x \otimes 1)) = [x^{p-1}\,dx]$ for any local section x of $\mathcal{O}_X$. (Here, for the Cartier map to be well-defined in general it is essential that one takes cohomology sheaves for the codomain.) The Cartier isomorphism is then the assertion that the map $C^{-1}$ is an isomorphism if $f$ is a smooth morphism.
In the above, we have formulated the Cartier isomorphism in the form it is most commonly encountered (e.g., in the 1970 paper of Katz). In his original paper, Cartier actually considered the inverse map $C$ in a more restrictive setting, whence the notation $C^{-1}$ for the Cartier map.
The smoothness assumption is not essential for the Cartier map to be an isomorphism. For instance, one has it for ind-smooth morphisms since both sides of the Cartier map commute with filtered colimits. By Popescu's theorem, one then has the Cartier isomorphism for a regular morphism of noetherian k-schemes. Ofer Gabber has also proven a Cartier isomorphism for valuation rings. In a different direction, one can dispense with such assumptions entirely if one instead works with derived de Rham cohomology (now taking the associated graded of the conjugate filtration) and the exterior powers of the cotangent complex.
References
Algebraic geometry |
https://en.wikipedia.org/wiki/Essential%20systems%20analysis | Essential systems analysis is a methodology for software specification published in 1984 by Stephen M. McMenamin and John F. Palmer for performing structured systems analysis, based on the concept of event partitioning.
The essence of a system is "its required behavior independent of the technology used to implement the system". It is an abstract model of what the system must do without describing how it will do it.
The methodology proposed that finding the true requirements for an information system entails the development of an essential model for the system, based on the concepts of a perfect internal technology, composed of:
a perfect memory, that is infinitely fast and big, and
a perfect processor, that is infinitely potent and fast.
Edward Yourdon later adapted it to develop modern structured analysis.
The main result was a new and more systematic way to develop the data-flow diagrams, which are the most characteristic tool of structured analysis.
Essential analysis, as adopted in Yourdon's modern structured analysis, was the main software development methodology until object-oriented analysis became mainstream.
References
Software design |
https://en.wikipedia.org/wiki/Viewpoints%3A%20Mathematical%20Perspective%20and%20Fractal%20Geometry%20in%20Art | Viewpoints: Mathematical Perspective and Fractal Geometry in Art is a textbook on mathematics and art. It was written by mathematicians Marc Frantz and Annalisa Crannell, and published in 2011 by the Princeton University Press. The Basic Library List Committee of the Mathematical Association of America has recommended it for inclusion in undergraduate mathematics libraries.
Topics
The first seven chapters of the book concern perspectivity, while its final two concern fractals and their geometry. Topics covered within the chapters on perspectivity include coordinate systems for the plane and for Euclidean space, similarity, angles, and orthocenters, one-point and multi-point perspective, and anamorphic art. In the fractal chapters, the topics include self-similarity, exponentiation, and logarithms, and fractal dimension. Beyond this mathematical material, the book also describes methods for artists to depict scenes in perspective, and for viewers of art to understand the perspectives in the artworks they see, for instance by finding the optimal point from which to view an artwork. The chapters are ordered by difficulty, and begin with experiments that the students can perform on their own to motivate the material in each chapter.
The book is heavily illustrated by artworks and photography (such as the landscapes of Ansel Adams) and includes a series of essays or interviews by contemporary artists on the mathematical content of their artworks.
An appendix contains suggestions aimed at teachers of this material.
Audience and reception
Viewpoints is intended as a textbook for mathematics classes aimed at undergraduate liberal arts students, as a way to show these students how geometry can be used in their everyday life. However, it could even be used for high school art students,
and reviewer Paul Kelley writes that "it will be of value to anyone interested in an elementary introduction to the mathematics and practice of perspective drawing". It differs from many |
https://en.wikipedia.org/wiki/Video%20Services%20Forum | The Video Services Forum (VSF) is an industry association that provides a platform for cooperation and communication between organizations with a stake in media networking. VSF activities include standards development, interoperability testing and the ongoing VidTrans conferences.
VSF published the TR-03 and TR-04 technical recommendations for professional video which were further developed by SMPTE to become SMPTE 2110.
Awards
Technology & Engineering Emmy Award for "Standardization and Productization of JPEG2000 (J2K) Interoperability."
References
1998 establishments in the United States
Broadcast engineering
Film and video technology |
https://en.wikipedia.org/wiki/The%20Pursuit%20of%20Perfect%20Packing | The Pursuit of Perfect Packing is a book on packing problems in geometry. It was written by physicists Tomaso Aste and Denis Weaire, and published in 2000 by Institute of Physics Publishing (doi:10.1887/0750306483) with a second edition published in 2008 by Taylor & Francis.
Topics
The mathematical topics described in the book include sphere packing (including the Tammes problem, the Kepler conjecture, and higher-dimensional sphere packing), the Honeycomb conjecture and the Weaire–Phelan structure, Voronoi diagrams and Delaunay triangulations, Apollonian gaskets, random sequential adsorption, and the physical realizations of some of these structures by sand, soap bubbles, the seeds of plants, and columnar basalt. A broader theme involves the contrast between locally ordered and locally disordered structures, and the interplay between local and global considerations in optimal packings.
As well, the book includes biographical sketches of some of the contributors to this field, and histories of their work in this area, including Johannes Kepler, Stephen Hales, Joseph Plateau, Lord Kelvin, Osborne Reynolds, and J. D. Bernal.
Audience and reception
The book is aimed at a general audience rather than to professional mathematicians. Therefore, it avoids mathematical proofs and is otherwise not very technical. However, it contains pointers to the mathematical literature where readers more expert in these topics can find more detail. Avoiding proof may have been a necessary decision as some proofs in this area defy summarization: the proof by Thomas Hales of the Kepler conjecture on optimal sphere packing in three dimensions, announced shortly before the publication of the book and one of its central topics, is hundreds of pages long.
Reviewer Johann Linhart complains that some figures in the first edition are inaccurately drawn. Although finding the book "entertaining and easy to read", William Satzer finds it "frustrating" in the lack of detail in its stori |
https://en.wikipedia.org/wiki/Northern%20European%20Enclosure%20Dam | The Northern European Enclosure Dam (NEED) is a proposed solution to the problem of rising ocean levels in Northern Europe. It would be a megaproject, involving the construction of two massive dams in the English Channel and the North Sea; the former between France and England, and the latter between Scotland and Norway. The concept was conceived by the oceanographers Sjoerd Groeskamp and Joakim Kjellsson.
The scheme remains a thought experiment intended to portray engineered solutions to the effects of climate change as too "extreme" to be pursued. The scheme's authors describe it as "more of a warning than a solution".
Groeskamp estimates that the NEED will cost 250 to 500 billion euros and will take 50 to 100 years to complete. He has not revealed how he determined the cost projection or construction timetable.
Channel Dam
The southern enclosure (NEED South) would be a single dam across the Channel between The Lizard, Cornwall, England in the north and Plouescat, Ploudalmézeau, Brittany, France in the south. The stipulated length is , with an average depth of about and a maximum depth of .
North Sea Dam
The northern enclosure (NEED North) would be a multiple section dam at the perimeter of the northern rim of the North Sea. The detailed engineering is not stated, although some form of continuous structure could provide for overland infrastructure—road and/or railway between Great Britain and Norway.
Scotland–Shetland
The western section of the North Sea Dam would island-hop from mainland Scotland in the southwest, through the Orkney Islands, to Shetland in the northeast, with a total length stipulated at 145 km.
The first stretch originates at Duncansby Head, Caithness, on mainland Scotland, and crosses the Pentland Firth to Brough Ness, the southern tip of South Ronaldsay in the Orkney Islands. Although this is a narrow strait of 10 km, the sea floor reaches depths of 100 m.
The stretch through southern Orkney continues to Bur |
https://en.wikipedia.org/wiki/Plotting%20algorithms%20for%20the%20Mandelbrot%20set | There are many programs and algorithms used to plot the Mandelbrot set and other fractals, some of which are described in fractal-generating software. These programs use a variety of algorithms to determine the color of individual pixels efficiently.
Escape time algorithm
The simplest algorithm for generating a representation of the Mandelbrot set is known as the "escape time" algorithm. A repeating calculation is performed for each x, y point in the plot area and based on the behavior of that calculation, a color is chosen for that pixel.
Unoptimized naïve escape time algorithm
In both the unoptimized and optimized escape time algorithms, the x and y locations of each point are used as starting values in a repeating, or iterating calculation (described in detail below). The result of each iteration is used as the starting values for the next. The values are checked during each iteration to see whether they have reached a critical "escape" condition, or "bailout". If that condition is reached, the calculation is stopped, the pixel is drawn, and the next x, y point is examined. For some starting values, escape occurs quickly, after only a small number of iterations. For starting values very close to but not in the set, it may take hundreds or thousands of iterations to escape. For values within the Mandelbrot set, escape will never occur. The programmer or user must choose how many iterations–or how much "depth"–they wish to examine. The higher the maximal number of iterations, the more detail and subtlety emerge in the final image, but the longer it will take to calculate the fractal image.
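The following is a minimal sketch of this naïve loop; the plot region, resolution, iteration limit, and text-based rendering are illustrative choices rather than part of the algorithm:

```python
def escape_time(cx, cy, max_iter=100):
    """Iterate z -> z*z + c from z = 0 and return the iteration count
    at which |z| exceeds 2, or max_iter if the orbit never escapes."""
    zx, zy = 0.0, 0.0
    for n in range(max_iter):
        if zx * zx + zy * zy > 4.0:       # |z|^2 > 4 means |z| > 2
            return n
        zx, zy = zx * zx - zy * zy + cx, 2.0 * zx * zy + cy
    return max_iter

# Map each character cell to a point c in [-2, 1] x [-1.5, 1.5]
WIDTH, HEIGHT, MAX_ITER = 80, 40, 100
for j in range(HEIGHT):
    row = ""
    for i in range(WIDTH):
        c_re = -2.0 + 3.0 * i / WIDTH
        c_im = -1.5 + 3.0 * j / HEIGHT
        row += "#" if escape_time(c_re, c_im, MAX_ITER) == MAX_ITER else " "
    print(row)
```

In a real renderer, the returned iteration count (rather than just set membership) would be mapped to a color for each pixel.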
Escape conditions can be simple or complex. Because no complex number with a real or imaginary part greater than 2 can be part of the set, a common bailout is to escape when either coefficient exceeds 2. A more computationally complex method that detects escapes sooner is to compute distance from the origin using the Pythagorean theorem, i.e., to determine the absolute |
https://en.wikipedia.org/wiki/Robert%20France | Robert Bertrand France (October 8, 1960 – February 15, 2015) was a Jamaica-born American computer scientist.
Robert B. France was born in Jamaica on October 8, 1960, the eldest son of Robert W. and Jeanette France. He attended high school in Guyana and studied for a BSc degree in Natural Sciences at the University of the West Indies in Saint Augustine, Trinidad and Tobago, majoring in Computer Science and Mathematics and receiving a first class degree in 1984. He then attended Massey University in New Zealand funded by a Commonwealth Scholarship, where he achieved a PhD degree in computer science in 1990. During the same year, he married Sheriffa R. Soleyn in Saint Vincent. They emigrated to the United States together and in due course moved to Fort Collins, Colorado.
During 1990–92, France was a postdoctoral research associate at the Institute for Advanced Computer Studies, University of Maryland. From 1992 to 1997, he was an assistant professor in the computer science and engineering department at Florida Atlantic University (FAU), becoming tenured in 1997–98. France was then appointed an associate professor from 1998 until 2004 and then full professor at Colorado State University within the department of computer science. He undertook research on model-driven software development, especially concerning formal software modeling languages and associated analysis tools. He was co-founder and editor-in-chief of the Software and Systems Modeling journal from 1999 until 2015.
In 2008, Robert France and his co-authors Andy Evans, Kevin Lano, and Bernhard Rumpe, were awarded the Ten Year Most Influential Paper Award at the MODELS 2008 Conference on Model Driven Engineering Languages and Systems for the 1998 paper "The UML as a Formal Modeling Notation". In 2013, France was awarded a five-year International Chair at INRIA in France. He was awarded a senior Dahl–Nygaard Prize for his research by the Association Internationale pour les Technologies Objets (AITO) in 2014. |
https://en.wikipedia.org/wiki/Energy-based%20model | An energy-based model (EBM) is a form of generative model (GM) imported directly from statistical physics to learning. GMs learn an underlying data distribution by analyzing a sample dataset. Once trained, a GM can produce other datasets that also match the data distribution. EBMs provide a unified framework for many probabilistic and non-probabilistic approaches to such learning, particularly for training graphical and other structured models.
An EBM learns the characteristics of a target dataset and generates a similar but larger dataset. EBMs detect the latent variables of a dataset and generate new datasets with a similar distribution.
Target applications include natural language processing, robotics and computer vision.
History
The term "energy-based models" was first coined in a JMLR paper where the authors defined a generalisation of independent components analysis to the overcomplete setting using EBMs.
Other early work on EBMs proposed models that represented energy as a composition of latent and observable variables. EBMs surfaced in 2003.
Approach
EBMs capture dependencies by associating an unnormalized probability scalar (energy) to each configuration of the combination of observed and latent variables. Inference consists of finding (values of) latent variables that minimize the energy given a set of (values of) the observed variables. Similarly, the model learns a function that associates low energies to correct values of the latent variables, and higher energies to incorrect values.
Traditional EBMs rely on stochastic gradient-descent (SGD) optimization methods that are typically hard to apply to high-dimension datasets. In 2019, OpenAI publicized a variant that instead used Langevin dynamics (LD). LD is an iterative optimization algorithm that introduces noise to the estimator as part of learning an objective function. It can be used for Bayesian learning scenarios by producing samples from a posterior distribution.
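As a concrete illustration, here is a minimal sketch of Langevin-dynamics sampling from an energy function; the double-well energy, step size, and iteration count are illustrative assumptions, not details of any published model:

```python
import numpy as np

def energy(x):
    # Illustrative energy: low near the two modes at x = -2 and x = +2
    return 0.25 * (x**2 - 4.0)**2

def grad_energy(x):
    # d/dx of the energy above
    return (x**2 - 4.0) * x

def langevin_sample(n_steps=1000, step=0.01, seed=0):
    """Noisy gradient descent on the energy: each step moves downhill
    and injects Gaussian noise, so the chain approximately samples the
    density proportional to exp(-E(x))."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal()
    for _ in range(n_steps):
        x = x - step * grad_energy(x) + np.sqrt(2.0 * step) * rng.standard_normal()
    return x

samples = [langevin_sample(seed=s) for s in range(5)]
print(samples)  # values cluster near the low-energy modes at -2 and +2
```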
EBMs do not require that e |
https://en.wikipedia.org/wiki/Closing%20the%20Gap%3A%20The%20Quest%20to%20Understand%20Prime%20Numbers | Closing the Gap: The Quest to Understand Prime Numbers is a book on prime numbers and prime gaps by Vicky Neale, published in 2017 by the Oxford University Press (). The Basic Library List Committee of the Mathematical Association of America has suggested that it be included in undergraduate mathematics libraries.
Topics
The main topic of the book is the conjecture that there exist infinitely many twin primes, dating back at least to Alphonse de Polignac (who conjectured more generally in 1849 that every even number appears infinitely often as the difference between two primes), and the significant progress made recently by Yitang Zhang and others on this problem. Zhang did not solve the twin prime conjecture, but in 2013 he announced a proof that there exists an even number that is the difference between infinitely many pairs of primes. Zhang's original proof shows only that this number is less than 70 million, but subsequent work by others including the highly collaborative efforts of the Polymath Project reduced this bound to 246, or even, assuming the truth of the Elliott–Halberstam conjecture, to 6.
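As a small illustration (not taken from the book), a simple sieve suffices to list the prime pairs at the twin-prime distance of 2 below a given bound:

```python
def primes_below(n):
    """Sieve of Eratosthenes: all primes strictly below n."""
    sieve = [True] * n
    sieve[0:2] = [False, False]
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = [False] * len(sieve[p * p::p])
    return [i for i, is_prime in enumerate(sieve) if is_prime]

ps = primes_below(100)
twin_pairs = [(p, q) for p, q in zip(ps, ps[1:]) if q - p == 2]
print(twin_pairs)  # [(3, 5), (5, 7), (11, 13), (17, 19), (29, 31), ...]
```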
The book is structured with chapters that alternate between giving the chronological development of the twin prime problem, and providing mathematical background on related topics in number theory; reviewer Michael N. Fried describes this unusual structure as a rondo with the chronological sequence as its refrain and the mathematical parts as its verses. The mathematical topics covered in these chapters include Goldbach's conjecture that every even number is the sum of two primes, sums of squares and Waring's problem on representation by sums of powers, the Hardy–Littlewood circle method for comparing the area of a circle to the number of integer points in the circle and solving analogous problems in analytic number theory, the arithmetic of quaternions, Fermat’s Last Theorem, the fundamental theorem of arithmetic on the existence and uniqueness of prime factorizations, alm |
https://en.wikipedia.org/wiki/Using%20the%20Borsuk%E2%80%93Ulam%20Theorem | Using the Borsuk–Ulam Theorem: Lectures on Topological Methods in Combinatorics and Geometry is a graduate-level mathematics textbook in topological combinatorics. It describes the use of results in topology, and in particular the Borsuk–Ulam theorem, to prove theorems in combinatorics and discrete geometry. It was written by Czech mathematician Jiří Matoušek, and published in 2003 by Springer-Verlag in their Universitext series ().
Topics
The topic of the book is part of a relatively new field of mathematics crossing between topology and combinatorics, now called topological combinatorics. The starting point of the field, and one of the central inspirations for the book, was a proof that László Lovász published in 1978 of a 1955 conjecture by Martin Kneser, according to which the Kneser graphs $KG_{2n+k,n}$ have no graph coloring with $k+1$ colors. Lovász used the Borsuk–Ulam theorem in his proof, and Matoušek gathers many related results, published subsequently, to show that this connection between topology and combinatorics is not just a proof trick but an area.
The book has six chapters. After two chapters reviewing the basic notions of algebraic topology, and proving the Borsuk–Ulam theorem, the applications to combinatorics and geometry begin in the third chapter, with topics including the ham sandwich theorem, the necklace splitting problem, Gale's lemma on points in hemispheres, and several results on colorings of Kneser graphs. After another chapter on more advanced topics in equivariant topology, two more chapters of applications follow, separated according to whether the equivariance is modulo two or using a more complicated group action. Topics in these chapters include the van Kampen–Flores theorem on embeddability of skeletons of simplices into lower-dimensional Euclidean spaces, and topological and multicolored variants of Radon's theorem and Tverberg's theorem on partitions into subsets with intersecting convex hulls.
Audience and reception
The book is written at a |
https://en.wikipedia.org/wiki/Sergiy%20Vilkomir | Sergiy A. Vilkomir (November 19, 1956 – February 9, 2020) was a Ukrainian-born computer scientist.
Sergiy Vilkomir was born in 1956 in present-day Ukraine. He finished Mathematical College at the Moscow State University National Mathematical Boarding High School no. 18 (Head-Academician A. Kolmogorov, 1972–74), studied for an MSc degree in Mathematics and Mathematics Education at Kharkov State University (1974–79), and for a PhD degree at Kharkov Polytechnic Institute (1985–90). In Kharkiv, Ukraine, he then worked at the Ukrainian Polytechnic Institute (1979–82), the Central Institute of Complex Automation (1985–91), the Institute of Safety and Reliability of Technological Systems (1992–93), the Ukrainian State Scientific and Technical Centre on Nuclear and Radiation Safety (part of the Nuclear Safety Regulatory Authority of Ukraine, 1993–2000). His role included licensing and audits of computer-based safety systems at nuclear power plants.
In 2000, Vilkomir moved to the Centre for Applied Formal Methods at London South Bank University, becoming a Research Fellow there. He then joined the University of Wollongong in Australia, also as a Research Fellow. He subsequently worked with David Parnas at the University of Limerick in Ireland, before moving to the United States, initially as a Research Associate Professor at the University of Tennessee during 2007–8, then rising to an associate professor position at East Carolina University, which he joined in 2008. There he achieved academic tenure in 2012 and was Head of the Software Testing Research Group (STRG).
Vilkomir's main research contributions have been in the formalization of software testing. In particular, he proposed reinforced condition/decision coverage (RC/DC), a stronger version of the modified condition/decision coverage (MC/DC) coverage criterion for software testing in safety-critical systems.
Vilkomir was awarded the Google Faculty Research Award for 2010–11, the East Carolina University Scholar- |
https://en.wikipedia.org/wiki/Behind%20the%20Mask%20%28organisation%29 | Behind the Mask was an online archive for LGBT African activists and LGBT rights in Africa. Its offices were in Johannesburg, South Africa. Its web address was www.mask.org.za. It mainly focused on South Africa, but it also supported LGBT activists in East and Central Africa (Kenya, Uganda, Tanzania, Rwanda, Burundi, Zambia), southern Africa (Namibia, Botswana), and West Africa (Ghana and Sierra Leone). It was started by the Dutch journalist Bart Luirink. Its aim was to support LGBT African movements with information. Behind the Mask "bolstered LGBT organizing throughout Africa because its website shared information about gains and setbacks that activists in different places experienced." Behind the Mask's websites aided the visibility of LGBT Africans and served as "beacons of hope for sexually and gender-variant Africans." Behind the Mask shut down in 2012 and the contents of its website were almost entirely lost. Parts of the website are still viewable through the Internet Archive Wayback Machine.
References
LGBT-related websites
Online archives
LGBT organisations in South Africa |
https://en.wikipedia.org/wiki/Poincar%C3%A9%20and%20the%20Three-Body%20Problem | Poincaré and the Three-Body Problem is a monograph in the history of mathematics on the work of Henri Poincaré on the three-body problem in celestial mechanics. It was written by June Barrow-Green, as a revision of her 1993 doctoral dissertation, and published in 1997 by the American Mathematical Society and London Mathematical Society as Volume 11 in their shared History of Mathematics series (). The Basic Library List Committee of the Mathematical Association of America has suggested its inclusion in undergraduate mathematics libraries.
Topics
The three-body problem concerns the motion of three bodies interacting under Newton's law of universal gravitation, and the existence of orbits for those three bodies that remain stable over long periods of time. This problem has been of great interest mathematically since Newton's formulation of the laws of gravity, in particular with respect to the joint motion of the sun, earth, and moon. The centerpiece of Poincaré and the Three-Body Problem is a memoir on this problem by Henri Poincaré, entitled Sur le problème des trois corps et les équations de la dynamique [On the problem of the three bodies and the equations of dynamics]. This memoir won the King Oscar Prize in 1889, commemorating the 60th birthday of
Oscar II of Sweden, and was scheduled to be published in Acta Mathematica on the king's birthday, until Lars Edvard Phragmén and Poincaré determined that there were serious errors in the paper. Poincaré called for the paper to be withdrawn, spending more than the prize money to do so. In 1890 it was finally published in revised form, and over the next ten years Poincaré expanded it into a monograph, Les méthodes nouvelles de la mécanique céleste [New methods in celestial mechanics]. Poincaré's work led to the discovery of chaos theory, set up a long-running separation between mathematicians and dynamical astronomers over the convergence of series, and became the initial claim to fame for Poincaré himself. The detailed s |
https://en.wikipedia.org/wiki/Radial%20plane | A radial plane is an anatomical plane that is used to describe a virtual slice along a radius of a somewhat cylindrical shaped body part. The radial planes need not be perfectly drawn to overlap on an exact intersection point, particularly when the body part being sectioned is not a perfect cylinder, such as in the case of the maxilla and mandible.
Usefulness
The radial plane can be useful because certain anatomical elements repeat in a circumferential manner (such as around the curvature of the dental arch, i.e. the jaw), and speaking of these entities using parallel planes becomes cumbersome and inaccurate.
For instance, the segment of bone on the outer circumference of each individual tooth is referred to as the facial plate of bone. Because the facial plate of bone is anterior to the incisors (in the front of the mouth) but lateral to the premolars and molars (in the back of the mouth), to visualize the facial plate of bone on various teeth will require sagittal slices for the former but coronal slices for the latter. To achieve greater uniformity and diminished confusion, simply speaking of radial slices provides a satisfactory solution for all teeth in both (upper and lower) arches.
Before the advent of this terminology, this plane was referred to as the axial plane relative to the body of the jawbone. It was believed that the jawbone was straightened out as though it were a straight tube, and then transverse (axial) sections were made of that tube.
References
Anatomical planes |
https://en.wikipedia.org/wiki/Livestreamed%20news | Livestreamed news refers to live video streams of television news which are provided via streaming television or via streaming media by various television networks and television news outlets, from various countries. The majority of live news streams are produced as world news broadcasts, by major television networks, or by major news channels; however, there are some live news streams which are produced by individual local television channels as well.
A live news stream is distinct from news broadcasts transmitted via conventional broadcast television, since it is delivered neither via cable television services nor via over-the-air television. These streams are provided through smart TVs, the networks' own websites, internet television platforms (especially YouTube), video on demand and subscription video on demand services such as Hulu, mobile apps, or digital media players designed to play streaming television, such as the Roku media player.
For some twenty-four-hour news channels, the content being shown via its streaming news service, and via its broadcast television channels, may be identical; however, for regular commercial networks, the content of the streaming news may be quite different than what is being broadcast; i.e. the broadcast channels may regularly carry standard television entertainment, while the streaming service is devoted to news only. One example of this is the American broadcaster ABC Television Network, and its streaming online ABC News Live service.
News sources by region
North American news outlets
Various networks and news outlets in North America have provided official live video streams of news for most or all of the day, as described below.
The ABC Television Network has provided a live streaming service of world news, known as "ABC News Live," for eighteen hours per day, since 2018. This is available via ABC's official platform on Hulu, as well as the network's of |
https://en.wikipedia.org/wiki/Christofari | Christofari (2019) and Christofari Neo (2021) are supercomputers of Sberbank of Russia, based on Nvidia hardware. Their main purpose is neural network training. They are also used for scientific research and commercial calculations.
The supercomputers are named after the first customer of Sberbank, the holder of the Bank's first savings account passbook. The supercomputers are listed in the Top 500 ranking of most powerful commercially available computer systems.
Development
Sberbank presented the supercomputers together with its subsidiary SberCloud. In December 2019, Sberbank and SberCloud commercially launched the Christofari supercomputer. Within a year, the power of Christofari became the foundation of a cloud based ML Space platform. It was configured to work with machine learning models. Sberbank and SberCloud announced this platform in December 2020.
The more powerful Christofari Neo supercomputer was presented at the AI Journey international conference in November 2021 by David Rafalovsky, the CTO of Sberbank Group. Rafalovsky is no longer a member of Sberbank Group.
Usage
The supercomputers can be used by scientific, commercial and government organizations working in the various sectors of the economy. The machines were developed to work with artificial intelligence algorithms, neural network learning, and inference of various models.
Sber uses Christofari for internal tasks, e.g. speech recognition and autoresponder voice generation in a call center (40% of customer inquiries are already answered automatically by bots), as well as for the analysis of CT scan images of the lungs. The SberDevices and Sber AI teams were the first to receive access to Christofari Neo. They developed the first service based on the DALL-E neural network that generates images from queries in Russian.
The power of supercomputers is also provided to other organizations when connecting the services of the cloud platform |
https://en.wikipedia.org/wiki/Birkhoff%27s%20theorem%20%28equational%20logic%29 | In logic, Birkhoff's theorem in equational logic states that an equality t = u is a semantic consequence of a set of equalities E if and only if t = u can be proven from E using the rules of equational logic. It is named after Garrett Birkhoff.
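In symbols (a standard formulation, with $\models$ denoting semantic consequence and $\vdash$ denoting derivability using the rules of equational logic, namely reflexivity, symmetry, transitivity, congruence, and substitution), the theorem states that for all terms $t$ and $u$:

$E \models t = u \quad \Longleftrightarrow \quad E \vdash t = u.$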
References
Logic
Formal sciences |
https://en.wikipedia.org/wiki/Complexities%3A%20Women%20in%20Mathematics | Complexities: Women in Mathematics is an edited volume on women in mathematics that "contains the stories and insights of more than eighty female mathematicians". It was edited by Bettye Anne Case and Anne M. Leggett, based on a collection of material from the Newsletter of the Association for Women in Mathematics, and published by Princeton University Press in 2005 ().
Topics
The book contains over 100 articles, by over 70 authors, divided into five sections. The first of these, "Inspiration", discusses the work of famous women in mathematics
(such as Sofya Kovalevskaya, Julia Robinson, and Emmy Noether) and of women mathematicians from the 18th and 19th centuries, offering insights into their personal life as well as their mathematics. Next, "Joining Together" covers the history of the Association for Women in Mathematics and related topics in the organization of women in mathematics including European Women in Mathematics
and the participation of women at the International Congress of Mathematicians.
The middle section, "Choices and Challenges", covers the problems facing women in contemporary mathematics, and includes a statistical quantification of these problems by Case and Leggett. "Celebration" is a collection of plenary talks and other materials from the Olga Taussky-Todd Celebration of Careers for Women in Mathematics, a conference held in 1999 to celebrate women in mathematics; its plenary speakers included Evelyn Boyd Granville, Lisa Goldberg, Fern Hunt, Diane Lambert, Cathleen Synge Morawetz, Linda Petzold, Helene Shapiro, Richard S. Varga, Margaret H. Wright, and Lani Wu. The final chapter, "Into a New Century", consists of essays by young women mathematicians of the time the book was published, many of them in non-academic careers. A collection of photographs from 1975 to 2003 is included as an appendix.
Despite its material on the difficulties faced by women in mathematics, the tone of the book is "factual and upbeat", in many cases covering or |
https://en.wikipedia.org/wiki/United%20Kingdom%20Global%20Navigation%20Satellite%20System | The United Kingdom Global Navigation Satellite System (UK GNSS) was a United Kingdom Space Agency (UKSA) research programme which, between May 2018 and September 2020, developed outline proposals for a United Kingdom (UK) owned and operated conventional satellite navigation system, as a British alternative to the European Union (EU) owned and operated Galileo Global Navigation Satellite System. The main reason was to provide a national and independent system, to ensure UK security following its withdrawal from the EU as a result of Brexit. It was fully supported by the Ministry of Defence.
In September 2020, the UK GNSS programme concluded; it was relaunched as a new entity, namely the United Kingdom Space Based Positioning, Navigation and Timing Programme (UK SBPNTP).
History
With the now universal reliance on the output provided by satellite navigation systems in many aspects of everyday life, in both private and commercial sectors, along with critical uses by military, maritime, and emergency services, continued and reliable access to such navigation systems is vital for the United Kingdom. An earlier study by the UK Government warned that sustained disruption to satellite navigation could cost the British economy £1 billion per day.
The United Kingdom Global Navigation Satellite System was first discussed by the UK Government in May 2018, after the European Union told the United Kingdom that it would no longer have full access to, nor be able to use the encrypted secure component (known as the Public Regulated Service, which is only accessible to the military, emergency services, and government agencies) of the Galileo system, the European equivalent of the United States of America owned and operated Global Positioning System (GPS), originally known as Navstar GPS. This UK exclusion from Galileo was despite the fact that the United Kingdom had already contributed more than £1.2 billion towards the cost of setting up Galileo, together with provi |
https://en.wikipedia.org/wiki/Filter%20quantifier | In mathematics, a filter on a set informally gives a notion of which subsets are "large". Filter quantifiers are a type of logical quantifier which, informally, say whether or not a statement is true for "most" elements of Such quantifiers are often used in combinatorics, model theory (such as when dealing with ultraproducts), and in other fields of mathematical logic where (ultra)filters are used.
Background
Here we will use the set theory convention, where a filter $F$ on a set $X$ is defined to be an order-theoretic filter in the poset $(\mathcal{P}(X), \subseteq)$, that is, a subset of $\mathcal{P}(X)$ such that:
$X \in F$ and $\varnothing \notin F$;
For all $A, B \in F$, we have $A \cap B \in F$;
For all $A \in F$, if $A \subseteq B \subseteq X$, then $B \in F$.
Recall a filter $F$ on $X$ is an ultrafilter if, for every $A \subseteq X$, either $A \in F$ or $X \setminus A \in F$.
Given a filter $F$ on a set $X$, we say a subset $A \subseteq X$ is $F$-stationary if, for all $B \in F$, we have $A \cap B \neq \varnothing$.
Definition
Let $F$ be a filter on a set $X$. We define the filter quantifiers $\forall_F x$ and $\exists_F x$ as formal logical symbols with the following interpretation:
$\forall_F x\, \varphi(x) \iff \{x \in X : \varphi(x)\} \in F$;
$\exists_F x\, \varphi(x) \iff \{x \in X : \varphi(x)\}$ is $F$-stationary;
for every first-order formula $\varphi(x)$ with one free variable. These also admit the alternative definitions $\forall_F x\, \varphi(x) \iff \neg \exists_F x\, \neg \varphi(x)$ and $\exists_F x\, \varphi(x) \iff \neg \forall_F x\, \neg \varphi(x)$.
When $F$ is an ultrafilter, the two quantifiers defined above coincide, and we will often use the notation $Q_F x$ instead. Verbally, we might pronounce $Q_F x\, \varphi(x)$ as "for $F$-almost all $x$", "for $F$-most $x$", "for the majority of $x$ (according to $F$)", or "for most $x$ (according to $F$)". In cases where the filter is clear, we might omit mention of $F$.
Properties
The filter quantifiers $\forall_F$ and $\exists_F$ satisfy the following logical identities, for all formulae $\varphi, \psi$:
Duality: $\forall_F x\, \varphi(x) \iff \neg \exists_F x\, \neg \varphi(x)$
Weakening: $\forall x\, \varphi(x) \implies \forall_F x\, \varphi(x) \implies \exists_F x\, \varphi(x) \implies \exists x\, \varphi(x)$
Conjunction: $\forall_F x\, (\varphi(x) \wedge \psi(x)) \iff \forall_F x\, \varphi(x) \wedge \forall_F x\, \psi(x)$
Disjunction: $\exists_F x\, (\varphi(x) \vee \psi(x)) \iff \exists_F x\, \varphi(x) \vee \exists_F x\, \psi(x)$
If $F \subseteq G$ are filters on $X$, then: $\forall_F x\, \varphi(x) \implies \forall_G x\, \varphi(x)$ and $\exists_G x\, \varphi(x) \implies \exists_F x\, \varphi(x)$
Additionally, if $F$ is an ultrafilter, the two filter quantifiers coincide: $\forall_F x\, \varphi(x) \iff \exists_F x\, \varphi(x)$. Renaming this quantifier $Q_F x$, the following properties hold:
Negation: $\neg Q_F x\, \varphi(x) \iff Q_F x\, \neg \varphi(x)$
Weakening: $\forall x\, \varphi(x) \implies Q_F x\, \varphi(x) \implies \exists x\, \varphi(x)$
Conjunction: $Q_F x\, (\varphi(x) \wedge \psi(x)) \iff Q_F x\, \varphi(x) \wedge Q_F x\, \psi(x)$
Disjunction: $Q_F x\, (\varphi(x) \vee \psi(x)) \iff Q_F x\, \varphi(x) \vee Q_F x\, \psi(x)$
In general, filter quantifiers do not commute with each other, nor with the usual $\forall$ and $\exists$ quantifiers.
Examples
If $F = \{X\}$ is the trivial filter on $X$, then, unpacking the definition, we have $\forall_F x\, \varphi(x) \iff \forall x\, \varphi(x)$ and $\exists_F x\, \varphi(x) \iff \exists x\, \varphi(x)$. This recovers the usual $\forall$ and $\exists$ quantifiers.
Let $F$ be the Fréchet filter on an infini |
https://en.wikipedia.org/wiki/The%20Calculating%20Machines | Die Rechenmaschinen, by Ernst Martin, and its English translation, The Calculating Machines (Die Rechenmaschinen): Their History and Development, are books on mechanical desktop calculators from prior to World War II.
Publication history
Die Rechenmaschinen, the original book by Martin, was published in 1925, and revised in 1937. Both editions are very rare. Little is known about Martin beyond these books.
The 1925 edition was edited and translated into English by Peggy A. Kidwell and Michael R. Williams, and published in 1992 by the MIT Press as the final 16th volume of its The Charles Babbage Institute Reprint Series for the History of Computing (). Kidwell and Williams chose this edition, rather than the revised edition, because of "the rarity of the books and the poor condition of the illustrations in extant copies". Indeed, they were only able to locate three copies of Martin's book.
The book and its translation include many illustrations, and the translation preserves some idiosyncrasies of the original work, including a set of advertisements for calculating machines at the end of the book.
Topics
After an introduction grouping calculating machines into seven types,
the book describes over 200 machines, comprising "almost every desk-top calculator available before World War II", ordered chronologically. It also contains biographical information about some of the people who contributed to the design of these machines, including Blaise Pascal, Gottfried Wilhelm Leibniz, and Giovanni Poleni.
Audience and reception
At the time Martin wrote the book, "mechanical calculating machines were a symbol of high-tech sophistication in the workplace"; reviewer Jonathan Samuel Golan suggests that it was aimed at collectors rather than historians, while the editors of the Bulletin of Science, Technology & Society suggest that instead its purpose was to inform the public. Nowadays, reviewer A. D. Booth suggests that readers of the book are likely to be people who once us |
https://en.wikipedia.org/wiki/Acoustic%20panel | Acoustic panels (also sound absorption panels, soundproof panels or sound panels) are sound-absorbing fabric-wrapped boards designed to control echo and reverberation in a room. They are most commonly used to resolve speech intelligibility issues in commercial soundproofing treatments. Most panels are constructed with a wooden frame, filled with sound absorption material (mineral wool, fiber glass, cellulose, open cell foam, or a combination of these) and wrapped with fabric.
An acoustic board is a board made from sound-absorbing materials, designed to provide sound insulation. Sound-absorbing material is inserted between two porous outer walls. Thus, when sound passes through an acoustic board, its intensity is decreased, and the lost sound energy is dissipated as heat. Acoustic boards are used in auditoriums, halls, seminar rooms, libraries, courts and wherever sound insulation is needed. They are also used in speaker boxes.
See also
Acoustics
Architectural acoustics
Room acoustics
Absorption (acoustics)
References
Acoustics
Building engineering |
https://en.wikipedia.org/wiki/Transplant%20engineering | Transplant engineering (or allograft engineering) is a variant of genetic organ engineering which comprises allograft, autograft and xenograft engineering. In allograft engineering the graft is substantially modified by altering its genetic composition. The genetic modification can be permanent or transient. The aim of modifying the allograft is usually the mitigation of immunological graft rejection.
History
Transient genetic allograft engineering was pioneered by Shaf Keshavjee and Marcelo Cypel at University Health Network in Toronto, using adenoviral transduction for transgenic expression of the IL-10 gene. Permanent genetic allograft engineering was first performed by Rainer Blasczyk and Constanca Figueiredo at Hannover Medical School in Hanover, using lentiviral transduction to knock down MHC expression.
References
Genetic engineering
Transplantation medicine |
https://en.wikipedia.org/wiki/Sociology%20of%20quantification | The sociology of quantification is the investigation of quantification as a sociological phenomenon in its own right.
Content
According to a review published in 2018, the sociology of quantification is an expanding field which includes the literature on the quantified self, on algorithms, and on various forms of metrics and indicators. Older works which can be classified under the same heading are Theodore Porter’s Trust in Numbers, the works of French sociologists Pierre Bourdieu and Alain Desrosières, and the classic works on probability by Ian Hacking and Lorraine Daston. The discipline gained traction due to the increasing importance and scope of quantification, its relation to the economics of conventions, and the perception of its dangers as a weapon of oppression or as a means to undesirable ends.
For Sally Engle Merry, quantification is a technology of control, but whether it is reformist or authoritarian depends on who harnesses it and for what purpose. The ‘governance by numbers’ is seen by the jurist Alain Supiot as repudiating the goal of governing by just laws, advocating in its stead the attainment of measurable objectives. For Supiot, the normative use of economic quantification leaves countries and economic actors no option but to ride roughshod over social legislation and pledge allegiance to stronger powers.
The French movement of ‘statactivisme’ suggests fighting numbers with numbers under the slogan “a new number is possible". On the other extreme, algorithmic automation is seen as an instrument of liberation by Aaron Bastani, spurring a debate on digital socialism. According to Espeland and Stevens an ethics of quantification would naturally descend from a sociology of quantification, especially at an age where democracy, merit, participation, accountability and even "fairness" are assumed to be best discovered and appreciated via numbers. Andrea Mennicken and Espeland provide a review (2019) of the main concerns about the "increasing expansion |
https://en.wikipedia.org/wiki/The%20Higher%20Infinite | The Higher Infinite: Large Cardinals in Set Theory from their Beginnings is a monograph in set theory by Akihiro Kanamori, concerning the history and theory of large cardinals, infinite sets characterized by such strong properties that their existence cannot be proven in Zermelo–Fraenkel set theory (ZFC). This book was published in 1994 by Springer-Verlag in their series Perspectives in Mathematical Logic, with a second edition in 2003 in their Springer Monographs in Mathematics series, and a paperback reprint of the second edition in 2009 ().
Topics
Not counting introductory material and appendices, there are six chapters in The Higher Infinite, arranged roughly in chronological order by the history of the development of the subject. The author writes that he chose this ordering "both because it provides the most coherent exposition of the mathematics and because it holds the key to any epistemological concerns".
In the first chapter, "Beginnings", the material includes inaccessible cardinals, Mahlo cardinals, measurable cardinals, compact cardinals and indescribable cardinals. The chapter covers the constructible universe and inner models, elementary embeddings and ultrapowers, and a result of Dana Scott that measurable cardinals are inconsistent with the axiom of constructibility.
The second chapter, "Partition properties", includes the partition calculus of Paul Erdős and Richard Rado, trees and Aronszajn trees, the model-theoretic study of large cardinals, and the existence of the set 0# of true formulae about indiscernibles. It also includes Jónsson cardinals and Rowbottom cardinals.
Next are two chapters on "Forcing and sets of reals" and "Aspects of measurability". The main topic of the first of these chapters is forcing, a technique introduced by Paul Cohen for proving consistency and inconsistency results in set theory; it also includes material in descriptive set theory. The second of these chapters covers the application of forcing by Robert M. Solo |
https://en.wikipedia.org/wiki/List%20of%20megaprojects%20in%20Bangladesh | This is a list of megaprojects in Bangladesh, i.e. projects "characterized by: large investment commitment, vast complexity (especially in organizational terms), and long-lasting impact on the economy, the environment, and society". The number of such projects is so large that the list may never be fully completed. The Finance Minister of Bangladesh has recently unveiled an extensive roster of ambitious megaprojects encompassing various sectors. These projects primarily focus on the construction of hospitals, schools, colleges, and other essential infrastructure. Consequently, this development surge is expected to generate a substantial demand for cement within the country.
Terms Explanation
Airports
Bridges
Road and highways
Railways
Energy projects
Ports
Defense
Buildings
Sports
Barrages
Delta Plan
Satellites
Special Economic Zone
References
Megaprojects |
https://en.wikipedia.org/wiki/SIG%20Group | SIG Group may refer to:
SIG Group AG, a Swiss company that is one of the world's most important in the packaging industry.
Semen Indonesia Group, an Indonesian cement company. |
https://en.wikipedia.org/wiki/Treks%20into%20Intuitive%20Geometry | Treks into Intuitive Geometry: The World of Polygons and Polyhedra is a book on geometry, written as a discussion between a teacher and a student in the style of a Socratic dialogue. It was written by Japanese mathematician Jin Akiyama and science writer Kiyoko Matsunaga, and published by Springer-Verlag in 2015 ().
Topics
The term "intuitive geometry" of the title was used by László Fejes Tóth to refer to results in geometry that are accessible to the general public, and the book concerns topics of this type.
The book has 16 self-contained chapters, each beginning with an illustrative puzzle or real-world application.
It includes material on tessellations, polyhedra, and honeycombs, unfoldings of polyhedra and tessellations of unfoldings, cross sections of polyhedra, measuring boxes, gift wrapping, packing problems, wallpaper groups, pentagonal tilings, the Conway criterion for prototiles and Escher-like tilings of the plane by animal-shaped figures, aperiodic tilings including the Penrose tiling, the art gallery theorem, the Euler characteristic, dissection problems and the Dehn invariant, and the Steiner tree problem.
The book is heavily illustrated, and although its results are demonstrated in an accessible way, it provides sequences of deductions leading to each major claim, with more-complete proofs and references provided in an appendix.
Audience and reception
Although it was initially developed from course material offered to undergraduates at the Tokyo University of Science, the book is aimed at a broad audience, and assumes only a high-school level knowledge of geometry. It could be used to encourage children in mathematics as well as to provide material for teachers and public lecturers. There is enough depth of material to also retain the interest of readers with a more advanced mathematical background.
Reviewer Matthieu Jacquemet writes that the ordering of topics is unintuitive and the dialogue-based format "artificial", but r |
https://en.wikipedia.org/wiki/A%20Guide%20to%20the%20Classification%20Theorem%20for%20Compact%20Surfaces | A Guide to the Classification Theorem for Compact Surfaces is a textbook in topology, on the classification of two-dimensional surfaces. It was written by Jean Gallier and Dianna Xu, and published in 2013 by Springer-Verlag as volume 9 of their Geometry and Computing series (, ). The Basic Library List Committee of the Mathematical Association of America has recommended its inclusion in undergraduate mathematics libraries.
Topics
The classification of surfaces (more formally, compact two-dimensional manifolds without boundary) can be stated very simply, as it depends only on the Euler characteristic and orientability of the surface. An orientable surface of this type must be topologically equivalent (homeomorphic) to a sphere, torus, or more general handlebody, classified by its number of handles. A non-orientable surface must be equivalent to a projective plane, Klein bottle, or more general surface characterized by an analogous number, its number of cross-caps. For compact surfaces with boundary, the only extra information needed is the number of boundary components. This result is presented informally at the start of the book, as the first of its six chapters. The rest of the book presents a more rigorous formulation of the problem, a presentation of the topological tools needed to prove the result, and a formal proof of the classification.
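In terms of the Euler characteristic $\chi$, the classification can be summarized by standard formulas: an orientable surface with $g$ handles and $b$ boundary components has $\chi = 2 - 2g - b$, while a non-orientable surface with $k$ cross-caps and $b$ boundary components has $\chi = 2 - k - b$.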
Other topics in topology discussed as part of this presentation include simplicial complexes, fundamental groups, simplicial homology and singular homology, and the Poincaré conjecture. Appendices include additional material on embeddings and self-intersecting mappings of surfaces into three-dimensional space such as the Roman surface, the structure of finitely generated abelian groups, general topology, the history of the classification theorem, and the Hauptvermutung (the theorem that every surface can be triangulated).
Audience and reception
This is a textbook aimed at the level of advanced undergraduates or beginning gr |
https://en.wikipedia.org/wiki/Immunometabolism | Immunometabolism is a branch of biology that studies the interplay between metabolism and immunology in all organisms. In particular, immunometabolism is the study of the molecular and biochemical underpinnings for i) the metabolic regulation of immune function, and ii) the regulation of metabolism by molecules and cells of the immune system. Further categorization includes i) systemic immunometabolism and ii) cellular immunometabolism.
Immunometabolism first appears in academic literature in 2011, where it is defined as "an emerging field of investigation at the interface between the historically distinct disciplines of immunology and metabolism." A later article defines immunometabolism as describing "the changes that occur in intracellular metabolic pathways in immune cells during activation". Broadly, immunometabolic research records the physiological functioning of the immune system in the context of different metabolic conditions in health and disease. These studies can cover molecular and cellular aspects of immune system function in vitro, in situ, and in vivo, under different metabolic conditions. For example, highly proliferative cells such as cancer cells and activating T cells undergo metabolic reprogramming, increasing glucose uptake to shift towards aerobic glycolysis during normoxia. While aerobic glycolysis is an inefficient pathway for ATP production in quiescent cells, this so-called “Warburg effect” supports the bioenergetic and biosynthetic needs of rapidly proliferating cells.
Signalling and metabolic network
There are many indispensable signalling molecules connected to metabolic processes, which play an important role in both immune system homeostasis and the immune response. Among these, the most significant are mammalian target of rapamycin (mTOR), liver kinase B1 (LKB1), 5' AMP-activated protein kinase (AMPK), phosphoinositide 3 kinase (PI3K) and protein kinase B (Akt). All of the aforementioned molecules together control the mo |
https://en.wikipedia.org/wiki/Runtime%20predictive%20analysis | Runtime predictive analysis (or predictive analysis) is a runtime verification technique in computer science for detecting property violations in program executions inferred from an observed execution. An important class of predictive analysis methods has been developed for detecting concurrency errors (such as data races) in concurrent programs, where a runtime monitor is used to predict errors which did not happen in the observed run, but can happen in an alternative execution of the same program. The predictive capability comes from the fact that the analysis is performed on an abstract model extracted online from the observed execution, which admits a class of executions beyond the observed one.
Overview
Informally, given an execution $t$, predictive analysis checks for errors in a reordered trace $t'$ of $t$. The trace $t'$ is called feasible from $t$ (alternatively, a correct reordering of $t$) if any program that can generate $t$ can also generate $t'$.
In the context of concurrent programs, a predictive technique is sound if it only predicts concurrency errors in feasible executions of the causal model of the observed trace. Assuming the analysis has no knowledge about the source code of the program, the analysis is complete (also called maximal) if the inferred class of executions contains all executions that have the same program order and communication order prefix of the observed trace.
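As an illustration of the idea, the following sketch uses a lockset-style approximation (a simplification; sound predictive analyses instead build a precise causal model of feasible reorderings) to predict a race from an observed trace in which the conflicting accesses never actually ran side by side:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Event:
    thread: int
    op: str      # "read", "write", "acquire", or "release"
    target: str  # variable or lock name

# Observed trace: thread 1 protects its write to x with lock l,
# but thread 2 later writes x with no lock held.
observed = [
    Event(1, "acquire", "l"),
    Event(1, "write", "x"),
    Event(1, "release", "l"),
    Event(2, "write", "x"),
]

def predicted_races(trace):
    """Report conflicting accesses by different threads that share no
    common lock; such a pair can be made adjacent in some correct
    reordering of the trace, i.e. it is a predicted race."""
    held = {}      # thread -> set of locks currently held
    seen = []      # (access event, snapshot of locks held at the access)
    races = []
    for e in trace:
        locks = held.setdefault(e.thread, set())
        if e.op == "acquire":
            locks.add(e.target)
        elif e.op == "release":
            locks.discard(e.target)
        else:
            for other, other_locks in seen:
                if (other.thread != e.thread and other.target == e.target
                        and "write" in (other.op, e.op)
                        and not (other_locks & locks)):
                    races.append((other, e))
            seen.append((e, set(locks)))
    return races

print(predicted_races(observed))  # one predicted race on x
```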
Applications
Predictive analysis has been applied to detect a wide class of concurrency errors, including:
Data races
Deadlocks
Atomicity violations
Order violations, e.g., use-after-free errors
Implementation
As is typical with dynamic program analysis, predictive analysis first instruments the source program. At runtime, the analysis can be performed online, in order to detect errors on the fly. Alternatively, the instrumentation can simply dump the execution trace for offline analysis. The latter approach is preferred for expensive refined predictive analyses that require random access to the exe |
https://en.wikipedia.org/wiki/Success | Success is the state or condition of meeting a defined range of expectations. It may be viewed as the opposite of failure. The criteria for success depend on context, and may be relative to a particular observer or belief system. One person might consider a success what another person considers a failure, particularly in cases of direct competition or a zero-sum game. Similarly, the degree of success or failure in a situation may be differently viewed by distinct observers or participants, such that a situation that one considers to be a success, another might consider to be a failure, a qualified success or a neutral situation. For example, a film that is a commercial failure or even a box-office bomb can go on to receive a cult following, with the initial lack of commercial success even lending a cachet of subcultural coolness.
It may also be difficult or impossible to ascertain whether a situation meets criteria for success or failure due to ambiguous or ill-defined definition of those criteria. Finding useful and effective criteria, or heuristics, to judge the failure or success of a situation may itself be a significant task.
In American culture
DeVitis and Rich link the success to the notion of the American Dream. They observe that "[t]he ideal of success is found in the American Dream which is probably the most potent ideology in American life" and suggest that "Americans generally believe in achievement, success, and materialism." Weiss, in his study of success in the American psyche, compares the American view of success with Max Weber's concept of the Protestant work ethic.
In biology
Natural selection is the variation in successful survival and reproduction of individuals due to differences in phenotype. It is a key mechanism of evolution, the change in the heritable traits characteristic of a population over generations. Charles Darwin popularized the term "natural selection", contrasting it with artificial selection, which in his view is intentiona |
https://en.wikipedia.org/wiki/Hussein%20Zedan | Hussein S. M. Zedan (1 July 1953 – 23 February 2019) was a computer scientist of Egyptian descent, mainly based in the United Kingdom.
Hussein Zedan was born in 1953. He received his PhD degree in 1981 at the University of Bristol, studying under John Derwent Pryce and Hubert Schwetlick for a thesis entitled Modified Rosenbrock-Wanner methods for solving systems of stiff ordinary differential equations.
Zedan was an academic in the Department of Computer Science at the University of York. Prof. Zedan then headed the Software Technology Research Laboratory (STRL) as Technical Director at De Montfort University. He was also Head of Computing Research. Later STRL was headed by Zedan's PhD student and subsequently colleague François Siewe. Zedan was subsequently appointed Assistant Vice-President of Academic Affairs and Development at the Applied Science University in Manama, Bahrain, until 2017.
Hussein Zedan died on 23 February 2019. He was married with two daughters.
Selected publications
– republished as:
References
External links
Hussein Zedan on ResearchGate
Hussein Zedan on Academia.edu
Hussein Zedan on LinkedIn
Hussein Zedan on DBLP
Hussein Zedan on IEEE Xplore
1953 births
2019 deaths
Alumni of the University of Bristol
Egyptian computer scientists
Egyptian expatriates in England
British computer scientists
Formal methods people
Software engineering researchers
Academics of the University of York
Academics of De Montfort University |
https://en.wikipedia.org/wiki/Combinatorial%20Games%3A%20Tic-Tac-Toe%20Theory | Combinatorial Games: Tic-Tac-Toe Theory is a monograph on the mathematics of tic-tac-toe and other positional games, written by József Beck. It was published in 2008 by the Cambridge University Press as volume 114 of their Encyclopedia of Mathematics and its Applications book series ().
Topics
A positional game is a game in which players alternate in taking possession of a given set of elements, with the goal of forming a winning configuration of elements; for instance, in tic-tac-toe and gomoku, the elements are the squares of a grid, and the winning configurations are lines of squares. These examples are symmetric: both players have the same winning configurations. However, positional games also include other possibilities such as the maker-breaker games in which one player (the "maker") tries to form a winning configuration and the other (the "breaker") tries to put off that outcome indefinitely or until the end of the game. In symmetric positional games one can use a strategy-stealing argument to prove that the first player has an advantage, but realizing this advantage by a constructive strategy can be very difficult.
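For the ordinary 3×3 game, the drawn outcome under optimal play can be checked directly by exhaustive game-tree search; the following minimal sketch (an illustration, not taken from the book) evaluates the game value for the first player:

```python
from functools import lru_cache

LINES = [(0,1,2), (3,4,5), (6,7,8), (0,3,6), (1,4,7), (2,5,8), (0,4,8), (2,4,6)]

def winner(board):
    """Return 'X' or 'O' if a line is complete, else None."""
    for a, b, c in LINES:
        if board[a] != "." and board[a] == board[b] == board[c]:
            return board[a]
    return None

@lru_cache(maxsize=None)
def value(board, player):
    """Value of the position for X under optimal play:
    +1 = X wins, 0 = draw, -1 = O wins."""
    w = winner(board)
    if w is not None:
        return 1 if w == "X" else -1
    moves = [i for i, s in enumerate(board) if s == "."]
    if not moves:
        return 0
    nxt = "O" if player == "X" else "X"
    vals = [value(board[:i] + player + board[i+1:], nxt) for i in moves]
    return max(vals) if player == "X" else min(vals)

print(value("." * 9, "X"))  # 0: tic-tac-toe is a draw with best play
```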
According to the Hales–Jewett theorem, in tic-tac-toe-like games involving forming lines on a grid or higher-dimensional lattice, grids that are small relative to their dimension cannot lead to a drawn game: once the whole grid is partitioned between the two players, one of them will necessarily have a line. One of the main results of the book is that somewhat larger grids lead to a "weak win", a game in which one player can always force the formation of a line (not necessarily before the other player does), but that grid sizes beyond a certain threshold lead to a "strong draw", a game in which both players can prevent the other from forming a line. Moreover, the threshold between a weak win and a strong draw can often be determined precisely. The proof of this result uses a combination of the probabilistic method, to prove the existence of stra |
https://en.wikipedia.org/wiki/Sliding%20DFT | In applied mathematics, the sliding discrete Fourier transform is a recursive algorithm to compute successive STFTs of input data frames that are a single sample apart (hopsize = 1). The calculation for the sliding DFT is closely related to the Goertzel algorithm.
Definition
Assuming that the hopsize between two consecutive DFTs is 1 sample, the $k$-th bin of the length-$N$ DFT of the frame ending at sample $n$ can be updated as

$X_k(n) = e^{j 2 \pi k / N} \left[ X_k(n-1) + x(n) - x(n-N) \right].$
From the definition above, the DFT can then be computed recursively. However, implementing a window function on a sliding DFT is difficult due to its recursive nature; therefore, windowing is done exclusively in the frequency domain.
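A minimal sketch of this recursion for a single bin follows; the frame length, bin index, and test signal are illustrative choices:

```python
import numpy as np

def sliding_dft_bin(x, N, k):
    """Track bin k of the length-N DFT of the most recent samples,
    updated in O(1) per sample via
    X_k(n) = exp(j*2*pi*k/N) * (X_k(n-1) + x[n] - x[n-N])."""
    W = np.exp(2j * np.pi * k / N)
    buf = np.zeros(N)            # circular buffer of the last N samples
    Xk = 0.0 + 0.0j
    out = []
    for n, sample in enumerate(x):
        Xk = W * (Xk + sample - buf[n % N])
        buf[n % N] = sample
        out.append(Xk)
    return np.array(out)

x = np.random.default_rng(0).standard_normal(64)
N, k = 16, 3
sdft = sliding_dft_bin(x, N, k)
# Once the buffer is full, the result matches a direct DFT of the frame:
assert np.allclose(sdft[-1], np.fft.fft(x[-N:])[k])
```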
Sliding windowed infinite Fourier transform
It is not possible to implement asymmetric window functions in a sliding DFT. However, the IIR version, called the sliding windowed infinite Fourier transform (SWIFT), provides an exponential window, and the αSWIFT variant calculates two sDFTs in parallel, subtracting the fast-decaying one from the slow-decaying one to obtain a window function equal to the difference of two exponentials.
References
FFT algorithms |
https://en.wikipedia.org/wiki/Word%20Processing%20in%20Groups | Word Processing in Groups is a monograph in mathematics on the theory of automatic groups, a type of abstract algebra whose operations are defined by the behavior of finite automata. The book's authors are David B. A. Epstein, James W. Cannon, Derek F. Holt, Silvio V. F. Levy, Mike Paterson, and William Thurston. Widely circulated in preprint form, it formed the foundation of the study of automatic groups even before its 1992 publication by Jones and Bartlett Publishers ().
Topics
The book is divided into two parts, one on the basic theory of these structures and another on recent research, connections to geometry and topology, and other related topics.
The first part has eight chapters. They cover automata theory and regular languages, and the closure properties of regular languages under logical combinations; the definition of automatic groups and biautomatic groups; examples from topology and "combable" structure in the Cayley graphs of automatic groups; abelian groups and the automaticity of Euclidean groups; the theory of determining whether a group is automatic, and its practical implementation by Epstein, Holt, and Sarah Rees; extensions to asynchronous automata; and nilpotent groups.
The second part has four chapters, on braid groups, isoperimetric inequalities, geometric finiteness, and the fundamental groups of three-dimensional manifolds.
Audience and reception
Although not primarily a textbook, the first part of the book could be used as the basis for a graduate course. More generally, reviewer Gilbert Baumslag recommends it "very strongly to everyone who is interested in either group theory or topology, as well as to computer scientists."
Baumslag was an expert in a related but older area of study, groups defined by finite presentations, in which research was eventually stymied by the phenomenon that many basic problems are undecidable. Despite tracing the origins of automatic groups to early 20th-century mathematician Max Dehn, he writes that the |
https://en.wikipedia.org/wiki/Newman%E2%80%93Janis%20algorithm | In general relativity, the Newman–Janis algorithm (NJA) is a complexification technique for finding exact solutions to the Einstein field equations. In 1964, Newman and Janis showed that the Kerr metric could be obtained from the Schwarzschild metric by means of a coordinate transformation and allowing the radial coordinate to take on complex values. Originally, no clear reason why the algorithm works was known.
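The key step can be sketched as follows (a standard presentation of the trick; conventions vary between authors, and this is not a verbatim statement from the original paper). The radial coordinate is shifted into the complex plane, and real functions of it are replaced by symmetrized combinations:
$$r \to r' = r + ia\cos\theta, \qquad \frac{2m}{r} \to m\left(\frac{1}{r'} + \frac{1}{\bar{r}'}\right) = \frac{2mr}{r^2 + a^2\cos^2\theta},$$
which turns the Schwarzschild function $1 - 2m/r$ into the Kerr function $1 - 2mr/\Sigma$ with $\Sigma = r^2 + a^2\cos^2\theta$.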
In 1998, Drake and Szekeres gave a detailed explanation of the success of the algorithm and proved the uniqueness of certain solutions. In particular, the only perfect fluid solution generated by NJA is the Kerr metric and the only Petrov type D solution is the Kerr–Newman metric.
The algorithm works well on ƒ(R) and Einstein–Maxwell–Dilaton theories, but does not return the expected results on braneworld and Born–Infeld theories.
See also
Birkhoff's theorem (relativity)
References
Algorithms
Exact solutions in general relativity |
https://en.wikipedia.org/wiki/Nutrient%20depletion | Nutrient depletion is a form of resource depletion and refers to the loss of nutrients and micronutrients in a habitat or parts of the biosphere, most often the soil (soil depletion, soil degradation). On the level of a complete ecological niche or ecosystem, nutrient depletion can also come about via the loss of the nutrient substrate (soil loss, wetland loss, etc.). Nutrients are usually the first link in the food chain, thus a loss of nutrients in a habitat will affect nutrient cycling and eventually the entire food chain.
Nutrient depletion can refer both to shifts in the relative nutrient composition and to the overall nutrient quantity (i.e. food abundance). Human activity has extensively changed both in the natural environment, usually with negative effects on wild flora and fauna.
The opposite effect is known as eutrophication or nutrient pollution. Both depletion and eutrophication lead to shifts in biodiversity and species abundance (usually a decline).
The effects are bidirectional in that a shift in species composition in a habitat may also lead to a shift in the nutrient composition.
See also
Soil nutrient
Soil erosion
References
Ecology
Natural resources
Nutritional physiology |
https://en.wikipedia.org/wiki/Horse%20Ridge%20%28chip%29 | Intel "Horse Ridge" is a cryogenic control chip that was presented at the 2020 International Solid-State Circuits Conference in San Francisco.
Horse Ridge is based on Intel's 22nm FFL (FinFET Low Power) CMOS technology. Intel and QuTech published a study in Nature in which they demonstrated that they have been able to operate qubits at temperatures above 1 kelvin (−272.15 degrees Celsius).
In December 2020, Intel released Horse Ridge II, adding enhanced capabilities and higher levels of integration for sophisticated control of the quantum system. New features include the ability to manipulate and read qubit states (and drive up to 16 spin qubits with a direct digital synthesis (DDS) architecture) and control the potential of multiple gates needed to correlate multiple qubits (features 22 high-speed digital-to-analog converters (DACs)).
Horse Ridge II is also implemented using Intel's low-power 22nm FinFET technology (22FFL), and its operation has been tested at a temperature of 4 kelvin.
References
Quantum computing |
https://en.wikipedia.org/wiki/Combinatorics%20of%20Finite%20Geometries | Combinatorics of Finite Geometries is an undergraduate mathematics textbook on finite geometry by Lynn Batten. It was published by Cambridge University Press in 1986 with a second edition in 1997 ().
Topics
The types of finite geometry covered by the book include partial linear spaces, linear spaces, affine spaces and affine planes, projective spaces and projective planes, polar spaces, generalized quadrangles, and partial geometries. A central connecting concept is the "connection number" of a point and a line not containing it, equal to the number of lines that meet the given point and intersect the given line.
The second edition adds a final chapter on blocking sets.
Beyond the basic theorems and proofs of this subject, the book includes many examples and exercises, and some history and information about current research.
Audience and reception
The book is aimed at advanced undergraduates, assuming only introductory-level abstract algebra and some knowledge of linear algebra. Its coverage of recent research also makes it useful as background reading for researchers in this area.
Reviewer Michael J. Kallaher cites as a "serious shortcoming" of the first edition its lack of coverage of applications of this subject, for instance to the design of experiments and to coding theory. The second edition has a section on applications but reviewer Tamás Szőnyi writes that it needs additional expansion.
Because of the many types of geometry covered in the book, the coverage of each of them is, at times, shallow; for instance, reviewer Theodore G. Ostrom complains that there is only half a page on non-Desarguesian planes. Additionally, Kallaher feels that block designs should have been included in place of some of the more esoteric geometries described by Batten. Reviewer Thomas Brylawski criticizes the book for "glossing over or ignoring" important results, for overcomplicated proofs, and for missed cases in some of its case analysis.
On the other hand, reviewer |
https://en.wikipedia.org/wiki/European%20Battery%20Alliance | The European Battery Alliance (EBA) is Europe's plan to create its own competitive and sustainable battery cell manufacturing value chain. Its purpose is to ensure that Europe benefits from the technological evolution in the Electric Vehicle Market and beyond. The action plan includes cleaner and more sustainable vehicles as well as safer traffic operations across Europe.
Stakeholders
120 industrial innovation stakeholders are currently active under the EBA in partnership with participating EU states and the European Investment Bank. This reflects the growth and investment potential of the battery sector across Europe and beyond.
Operations
The action plan adopted focuses on innovating and developing a sustainable and competitive battery 'ecosystem' within Europe. The primary objective is to create a manufacturing value chain with sustainable battery cells at its core, in order to avoid technological dependence on third parties. According to analysts, by 2025 Europe could capture a €250 billion market. EU-wide demand is forecast to require 10 to 20 large-scale battery cell production facilities. The first planned plant, called the "Automotive Cell Co.", will be located at a Groupe PSA site in Kaiserslautern. Saft Groupe S.A. will be taking part in the operations, as this plant will complement another production facility located in Hauts-de-France. The two plants are to be operational by 2024, employ around 2,000 people and serve up to 15% of Europe's demand.
Environmental impact
The European Union and all stakeholders involved are considering the entire life-cycle of batteries, including the environmental gains of use and the impact triggered by their production. Recycling and the recovery of materials at the end of the life cycle reduce the impact of both mining and manufacturing: emissions and the use of hazardous substances are lowered, and the impact of mining is reduced. The EU wants to ensure recycling through the Battery Directive to offset the negative
https://en.wikipedia.org/wiki/Art%20Gallery%20Theorems%20and%20Algorithms | Art Gallery Theorems and Algorithms is a mathematical monograph on topics related to the art gallery problem, on finding positions for guards within a polygonal museum floorplan so that all points of the museum are visible to at least one guard, and on related problems in computational geometry concerning polygons. It was written by Joseph O'Rourke, and published in 1987 in the International Series of Monographs on Computer Science of the Oxford University Press. Only 1000 copies were produced before the book went out of print, so to keep this material accessible O'Rourke has made a pdf version of the book available online.
Topics
The art gallery problem, posed by Victor Klee in 1973, asks for the number of points at which to place guards inside a polygon (representing the floor plan of a museum) so that each point within the polygon is visible to at least one guard. Václav Chvátal provided the first proof that the answer is at most $\lfloor n/3 \rfloor$ for a polygon with $n$ corners, but a simplified proof by Steve Fisk based on graph coloring and polygon triangulation is more widely known. This is the opening material of the book, which goes on to cover topics including visibility, decompositions of polygons, coverings of polygons, triangulations and triangulation algorithms, and higher-dimensional generalizations, including the result that some polyhedra such as the Schönhardt polyhedron do not have triangulations without additional vertices. More generally, the book has as a theme "the interplay between discrete and computational geometry".
It has 10 chapters, whose topics include the original art gallery theorem and Fisk's triangulation-based proof; rectilinear polygons; guards that can patrol a line segment rather than a single point; special classes of polygons including star-shaped polygons, spiral polygons, and monotone polygons; non-simple polygons; prison yard problems, in which the guards must view the exterior, or both the interior and exterior, of a polygon; visibility gr |
https://en.wikipedia.org/wiki/IBM%204769 | The IBM 4769 PCIe Cryptographic Coprocessor is a hardware security module (HSM) that includes a secure cryptoprocessor implemented on a high-security, tamper resistant, programmable PCIe board. Specialized cryptographic electronics, microprocessor, memory, and random number generator housed within a tamper-responding environment provide a highly secure subsystem in which data processing and cryptography can be performed. Sensitive key material is never exposed outside the physical secure boundary in a clear format.
The IBM 4769 is designed to meet FIPS PUB 140-2 Level 4, the highest level of certification achievable for commercial cryptographic devices. The 4769 is part of IBM's pervasive encryption and enterprise security schemes. The IBM 4769 data sheet describes the coprocessor in detail.
IBM supplies two cryptographic-system implementations:
The PKCS#11 implementation, called IBM Enterprise PKCS11 (EP11), creates a high-security solution for application programs developed for this industry-standard API.
The IBM Common Cryptographic Architecture (CCA) implementation provides many functions of special interest in the finance industry, extensive support for distributed key management, and a base on which custom processing and cryptographic functions can be added.
Applications may include financial PIN transactions, bank-to-clearing-house transactions, EMV transactions for integrated circuit (chip) based credit cards, and general-purpose cryptographic applications using symmetric key algorithms, hashing algorithms, and public key algorithms.
The operational keys (symmetric or asymmetric private (RSA or Elliptic Curve)) are generated in the coprocessor and are then saved either in a keystore file or in application memory, encrypted under the master key of that coprocessor. Any coprocessor with an identical master key can use those keys. See elliptic curve cryptography (ECC) for more information about ECC. New hardware in the 4769 adds support to accelerate the |
https://en.wikipedia.org/wiki/Polyhedra%20%28book%29 | Polyhedra is a book on polyhedra, by Peter R. Cromwell. It was published in 1997 by Cambridge University Press, with an unrevised paperback edition in 1999.
Topics
The book covers both the mathematics of polyhedra and its historical development, limiting itself only to three-dimensional geometry. The notion of what it means to be a polyhedron has varied over the history of the subject, as have other related definitions, an issue that the book handles largely by keeping definitions informal and flexible, and by pointing out problematic examples for these intuitive definitions. Many digressions help make the material readable, and the book includes many illustrations, including historical reproductions, line diagrams, and photographs of models of polyhedra.
Polyhedra has ten chapters, the first four of which are primarily historical, with the remaining six more technical. The first chapter outlines the history of polyhedra from the ancient world up to Hilbert's third problem on the possibility of cutting polyhedra into pieces and reassembling them into different polyhedra. The second chapter considers the symmetries of polyhedra, the Platonic solids and Archimedean solids, and the honeycombs formed by space-filling polyhedra. Chapter 3 covers the history of geometry in medieval Islam and early Europe, including connections to astronomy and the study of visual perspective, and Chapter 4 concerns the contributions of Johannes Kepler to polyhedra and his attempts to use polyhedra to model the structure of the universe.
Among the remaining chapters, Chapter 5 concerns angles and trigonometry, the Euler characteristic, and the Gauss–Bonnet theorem (including also some speculation on whether René Descartes knew about the Euler characteristic prior to Euler). Chapter 6 covers Cauchy's rigidity theorem and flexible polyhedra, and chapter 7 covers self-intersecting star polyhedra. Chapter 8 returns to the symmetries of polyhedra and the classification of possible sym |
https://en.wikipedia.org/wiki/Chases%20and%20Escapes | Chases and Escapes: The Mathematics of Pursuit and Evasion is a mathematics book on continuous pursuit–evasion problems. It was written by Paul J. Nahin, and published by the Princeton University Press in 2007. It was reissued as a paperback reprint in 2012. The Basic Library List Committee of the Mathematical Association of America has rated this book as essential for inclusion in undergraduate mathematics libraries.
Topics
The book has four chapters, covering the solutions to 21 continuous pursuit–evasion problems, with an additional 10 "challenge problems" left for readers to solve, with solutions given in an appendix. The problems are presented as entertaining stories that "breathe life into the mathematics and invite wider engagement", and their solutions use varied methods, including the computer calculation of numerical solutions for differential equations whose solutions have no closed form.
Most of the material was previously known, but is collected here for the first time. The book also provides background material on the history of the problems it describes, although this is not its main focus.
Even before beginning its main content, the preface of the book begins with an example of pure evasion from known pursuit, the path used by the Enola Gay to escape the blast of the nuclear bomb it dropped on Hiroshima. The first chapter of the book concerns the opposite situation of "pure pursuit" without evasion, including the initial work in this area by Pierre Bouguer in 1732. Bouguer studied a problem of pirates chasing a merchant ship, in which the merchant ship (unaware of the pirates) travels on a straight line while the pirate ship always travels towards the current position of the merchant ship. The resulting pursuit curve is called a radiodrome, and this chapter studies several similar problems and stories involving a linearly moving target, including variations where the pursuer may aim ahead of the target and the tractrix curve generated by a pursuer t
https://en.wikipedia.org/wiki/Intercollegiate%20Biomathematics%20Alliance | The Intercollegiate Biomathematics Alliance (IBA) is a syndicate of organizations focused on connecting both academic and non-academic institutions to promote the study of biomathematics, ecology, and other related fields. Biomathematics is a scientific area connecting biology, ecology, mathematics, and computer science. Founded in 2014 by executive director Olcay Akman of Illinois State University, the Intercollegiate Biomathematics Alliance helps organizations to work together and share resources that are not regularly available at all institutions. The IBA is still young and typically attracts smaller colleges around the United States, which tend to benefit more from being part of a consortium. However, in recent years, universities such as Arizona State University have joined, and the IBA continues to maintain connections with larger research groups such as the Mathematical Bioscience Institute (MBI) and the National Institute for Mathematical and Biological Synthesis (NIMBioS).
History
In 2007, Olcay Akman, of mathematics, and Steven Juliano, of biological sciences, started a master's degree program at Illinois State University. The program grew and is now operated under the same umbrella as the IBA, the Center for Collaborative Studies in Mathematical Biology. In 2008, the first BEER (Biomathematics Ecology Education and Research) conference was held at Illinois State University with only 10 speakers and fewer than 50 attendees. In 2014, the BEER conference was the second largest biomathematics conference globally with more than 100 speakers. Then in 2014, other universities were asked to collaborate with the common goal of educating students about biomathematics, and this led to the creation of the Intercollegiate Biomathematics Alliance (IBA).
The IBA is not the first to create a network of institutions. Morehouse College in Atlanta, GA participates in its own network of institutions that helps to provide students with greater access to resources |
https://en.wikipedia.org/wiki/Astranis | Astranis Space Technologies Corp. is an American company specializing in geostationary communications satellites. It is headquartered in San Francisco, California.
In 2018, Astranis launched DemoSat-2, a prototype 3U cubesat. The launch aimed to test Software-Defined Radio (SDR) technology for future larger communications satellites.
The company publicly disclosed its projects in March 2018, following a funding round that was aimed at the development of geostationary communications satellites.
In January 2019, Astranis initiated a commercial program with Pacific Dataport, Inc. to increase the satellite internet capacity in Alaska. A 350 kg satellite was launched on April 30, 2023, as part of a multi-satellite payload.
Astranis was part of the Winter 2016 cohort of the Y Combinator accelerator program and has raised over $350 million in venture funding from firms such as BlackRock, Venrock, and Andreessen Horowitz.
History
Demonstration satellite
On January 12, 2018, Astranis launched its first satellite, "DemoSat 2", using an Indian PSLV-XL rocket. The satellite was a 3U cubesat measuring 10 cm x 10 cm x 30 cm and weighing less than 3 kg. It carried a prototype of the company's software-defined radio.
Geostationary satellites
Block 1
In 2019, Astranis leased its first MicroGEO spacecraft to Pacific Dataport, Inc., a subsidiary of Microcom. The satellite, named Arcturus, initially had an anticipated launch date in early 2022, which was later delayed to April 2023. After the launch, the company confirmed successful communication with the satellite and hardware deployment. Subsequent tests showed the spacecraft could deliver up to 8.5 Gbps, compared to its design specification of 7.5 Gbps.
In July 2023, Astranis reported a malfunction in an externally supplied solar array drive assembly on Arcturus, which affected the spacecraft's ability to provide internet service. According to Astranis CEO John Gedmark, no hardware built by Astranis failed.
Block 2
In Ap |
https://en.wikipedia.org/wiki/In%20Pursuit%20of%20the%20Traveling%20Salesman | In Pursuit of the Traveling Salesman: Mathematics at the Limits of Computation is a book on the travelling salesman problem, by William J. Cook, published in 2011 by the Princeton University Press, with a paperback reprint in 2014. The Basic Library List Committee of the Mathematical Association of America has suggested its inclusion in undergraduate mathematics libraries.
Topics
The travelling salesman problem asks to find the shortest cyclic tour of a collection of points, in the plane or in more abstract mathematical spaces.
Because the problem is NP-hard, algorithms that take polynomial time are unlikely to be guaranteed to find its optimal solution; on the other hand a brute-force search of all permutations would always solve the problem exactly but would take far too long to be usable for all but the smallest problems. Threading a middle ground between these too-fast and too-slow running times, and developing a practical system that can find the exact solution of larger instances, raises difficult questions of algorithm engineering, which have sparked the development of "many of the concepts and techniques of combinatorial optimization".
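To make the brute-force end of that spectrum concrete, here is a minimal sketch (function and variable names are illustrative, not from the book) that tries every cyclic tour; its $(n-1)!$ growth is exactly why it is unusable beyond tiny instances:

```python
from itertools import permutations

def tour_length(tour):
    """Total Euclidean length of a cyclic tour through 2-D points."""
    return sum(
        ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5
        for p, q in zip(tour, tour[1:] + tour[:1])
    )

def brute_force_tsp(points):
    """Exact but hopeless for large n: check all (n-1)! distinct tours,
    fixing the first city to avoid re-counting rotations of a tour."""
    first, *rest = points
    return min(
        ([first] + list(perm) for perm in permutations(rest)),
        key=tour_length,
    )

best = brute_force_tsp([(0, 0), (0, 1), (2, 1), (2, 0)])  # tiny example
```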
The introductory chapter of the book explores the limits of calculation on the problem, from 49-point problems solved by hand in the mid-1950s by George Dantzig, D. R. Fulkerson, and Selmer M. Johnson to a problem with 85,900 points solved optimally in 2006 by the Concorde TSP Solver, which Cook helped develop. The next chapters cover the early history of the problem and of related problems, including Leonhard Euler's work on the Seven Bridges of Königsberg, William Rowan Hamilton's Icosian game, and Julia Robinson first naming the problem in 1949. Another chapter describes real-world applications of the problem, ranging "from genome sequencing and designing computer processors to arranging music and hunting for planets". Reviewer Brian Hayes cites "the most charming revelation" of the book as being the fact that one of tho
https://en.wikipedia.org/wiki/Cl%C3%A0udia%20Valls | Clàudia Valls Anglés is a mathematician and an expert in dynamical systems. She is an associate professor in the Instituto Superior Técnico of the University of Lisbon in Portugal.
Education
Valls completed a doctorate at the University of Barcelona in 1999. Her dissertation, The Classical Arnold Example of Diffusion with Two Equal Parameters, was supervised by .
Books
Valls is the co-author of books with Luís Barreira and others, including:
Instability in Hamiltonian systems (with Antonio Pumariño, Electronic Journal of Qualitative Theory of Differential Equations Monograph Series, Vol. 1, 2005)
Stability of nonautonomous differential equations (with Luís Barreira, Lecture Notes in Mathematics, Vol. 1926, Springer, 2008)
Complex analysis and differential equations (with Luís Barreira, Springer Undergraduate Mathematics Series, Springer, 2012), translated into French as Analyse complexe et équations différentielles (Enseignement SUP-Maths, EDP Sciences, 2011)
Exercices d’analyse complexe et équations différentielles [Exercises in complex analysis and differential equations] (with Luís Barreira, Enseignement SUP-Maths, EDP Sciences, 2011)
Equações diferenciais: Teoria qualitativa (with Luís Barreira, Ensino da Ciência e da Tecnologia, Vol. 33, IST Press, 2010), translated into English as Ordinary differential equations: Qualitative theory (Graduate Studies in Mathematics, Vol. 137, American Mathematical Society, 2012)
Dynamical systems: An introduction (with Luís Barreira and Davor Dragičević, Universitext, Springer, 2013; originally published in Portuguese in 2012)
Exercises in linear algebra (with Luís Barreira, World Scientific, 2016)
Admissibility and hyperbolicity (with Luís Barreira, SpringerBriefs in Mathematics, Springer, 2018)
Dynamical systems by example (with Luís Barreira, Problem Books in Mathematics, Springer, 2019)
References
Year of birth missing (living people)
Living people
21st-century Spanish mathematicians
21st-century Portuguese mathematicia |
https://en.wikipedia.org/wiki/Computing%20the%20Continuous%20Discretely | Computing the Continuous Discretely: Integer-Point Enumeration in Polyhedra is an undergraduate-level textbook in geometry, on the interplay between the volume of convex polytopes and the number of lattice points they contain. It was written by Matthias Beck and Sinai Robins, and published in 2007 by Springer-Verlag in their Undergraduate Texts in Mathematics series (Vol. 154). A second edition was published in 2015, and a German translation of the first edition by Kord Eickmeyer, Das Kontinuum diskret berechnen, was published by Springer in 2008.
Topics
The book begins with a motivating problem, the coin problem of determining which amounts of money can be represented (and what is the largest non-representable amount of money) for a given system of coin values.
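As a small illustration of the coin problem (a brute-force sketch, not the book's method; names are illustrative), for two coprime coin values $a$ and $b$ the largest non-representable amount is known to be $ab - a - b$:

```python
from functools import lru_cache

def frobenius(a, b):
    """Largest amount not representable with coins of values a and b
    (assumed coprime), found by brute force below the bound a*b."""
    @lru_cache(maxsize=None)
    def representable(amount):
        if amount == 0:
            return True
        return any(representable(amount - c) for c in (a, b) if c <= amount)
    return max(n for n in range(1, a * b) if not representable(n))

assert frobenius(4, 7) == 4 * 7 - 4 - 7 == 17
```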
Other topics touched on include face lattices of polytopes and the Dehn–Sommerville equations relating numbers of faces; Pick's theorem and the Ehrhart polynomials, both of which relate lattice counting to volume; generating functions, Fourier transforms, and Dedekind sums, different ways of encoding sequences of numbers into mathematical objects; Green's theorem and its discretization; Bernoulli polynomials; the Euler–Maclaurin formula for the difference between a sum and the corresponding integral; special polytopes including zonotopes, the Birkhoff polytope, and permutohedra; and the enumeration of magic squares. In this way, the topics of the book connect together geometry, number theory, and combinatorics.
Audience and reception
This book is written at an undergraduate level, and provides many exercises, making it suitable as an undergraduate textbook. Little mathematical background is assumed, except for some complex analysis towards the end of the book. The book also includes open problems, of more interest to researchers in these topics. As reviewer Darren Glass writes, "Even people who are familiar with the material would almost certainly learn something from the clear and engaging exposition that |
https://en.wikipedia.org/wiki/Pierre%20Peytier | Pierre Peytier - Jean Pierre Eugène Félicien Peytier (15 October 1793 – 1864), sometimes named Eugène Peytier - was a French officer, geographer, engineer, cartographer and painter.
Life
Pierre Peytier entered the École polytechnique in 1811, where he obtained his diploma (X1811). He was then integrated into the topographical service of the French army in the corps of the engineers-geographers in 1813. He was promoted to lieutenant in 1817, then to captain in 1827.
In the Pyrenees (1825)
Engineer geographer and geodesist, he was one of the first geodesic officers charged in 1825 with the triangulation of the Pyrenees in order to establish the map of France, together with his colleague Paul-Michel Hossard. By necessity of service and together with the officers Corabœuf and Testuhe, he was also one of the first pyreneists.
He made the first ascents of the Pyrenean peaks Palas, Balaïtous and Saint-Barthélemy. These feats went completely unnoticed at the time, and many later climbers, believing they were making first ascents, found traces of the geodesists' passage. That was the case for the explorer Charles Packe when he reached the summit of the Balaïtous.
Scientific expedition of Morea (1829)
Captain Pierre Peytier, of the topographic service of the French army, had already been invited to Greece by its Governor Ioannis Kapodistrias when the latter had come to Paris in October 1827 to ask the French government for advisers and officers of the French army to organize the army of the newly founded Greek State (during the Greek War of Independence). Thus, on the recommendation of the French Ministry of War, Peytier and three other officers arrived in Greece, in order to train young Greek engineers who would undertake surveying projects, while Peytier himself was to draw the plans for the city of Corinth and the map of Peloponnese. Then when the scientific expedition of Morea landed at Navarino in the Peloponnese on 3 March 1829, Peytier was thu
https://en.wikipedia.org/wiki/Timeline%20of%20bordism | This is a timeline of bordism, a topological theory based on the concept of the boundary of a manifold. For context see timeline of manifolds. Jean Dieudonné wrote that cobordism goes back to the attempt in 1895 to define homology theory using only (smooth) manifolds.
Integral theorems
Cohomology
Homotopy theory
Notes
Algebraic topology
Differential topology
Bordism |
https://en.wikipedia.org/wiki/Library%20Hub%20Discover | Library Hub Discover is a union catalog operated by Jisc. It replaces Copac and SUNCAT. Its user interface is centred around a simple search engine-like query box.
References
External links
https://discover.libraryhub.jisc.ac.uk/
Academic libraries in the United Kingdom
British digital libraries
Jisc
Bibliographic databases and indexes
Databases in the United Kingdom
Higher education in the United Kingdom
Library cataloging and classification
Online databases |
https://en.wikipedia.org/wiki/Genome-wide%20CRISPR-Cas9%20knockout%20screens | Genome-wide CRISPR-Cas9 knockout screens aim to elucidate the relationship between genotype and phenotype by ablating gene expression on a genome-wide scale and studying the resulting phenotypic alterations. The approach utilises the CRISPR-Cas9 gene editing system, coupled with libraries of single guide RNAs (sgRNAs), which are designed to target every gene in the genome. Over recent years, the genome-wide CRISPR screen has emerged as a powerful tool for performing large-scale loss-of-function screens, with low noise, high knockout efficiency and minimal off-target effects.
History
Early studies in Caenorhabditis elegans and Drosophila melanogaster saw large-scale, systematic loss of function (LOF) screens performed through saturation mutagenesis, demonstrating the potential of this approach to characterise genetic pathways and identify genes with unique and essential functions. The saturation mutagenesis technique was later applied in other organisms, for example zebrafish and mice.
Targeted approaches for gene knockdown emerged in the 1980s with techniques such as homologous recombination, trans-cleaving ribozymes, and antisense technologies.
By the year 2000, RNA interference (RNAi) technology had emerged as a fast, simple, and inexpensive technique for targeted gene knockdown, and was routinely being used to study in vivo gene function in C. elegans. Indeed, in the span of only a few years following its discovery by Fire et al. (1998), almost all of the ~19,000 genes in C. elegans had been analysed using RNAi-based knockdown.
The production of RNAi libraries facilitated the application of this technology on a genome-wide scale, and RNAi-based methods became the predominant approach for genome-wide knockdown screens.
Nevertheless, RNAi-based approaches to genome-wide knockdown screens have their limitations. For one, the high off-target effects cause issues with false-positive observations. Additionally, because RNAi reduces gene expression at the post-tr |
https://en.wikipedia.org/wiki/Area%20of%20Special%20Conservation%20Interest | An Area of Special Conservation Interest (ASCI) is a protected area in Europe or North Africa, part of the Emerald network established by the countries who have signed the Berne Convention on the Conservation of European Wildlife and Natural Habitats.
The purpose of the ASCIs is to conserve and protect habitats and species defined in the convention.
Emerald Network
The Emerald Network is an ecological network of Areas of Special Conservation Interest.
The Council of Europe launched the network when it adopted Recommendation No. 16 (1989) of the Standing Committee to the Bern Convention.
The European Union's Natura 2000 network covers the portion of the Emerald network within the EU.
The network also includes conservation units in non-EU European states such as Andorra, Belarus, Georgia, the Republic of Moldova, Norway, Switzerland and Ukraine, and in several African states.
In the United Kingdom, Special Areas of Conservation and Special Protection Areas are Areas of Special Conservation Interest.
The African signatories to the Bern Convention include Burkina Faso, Morocco, Tunisia and Senegal.
Criteria
An ASCI should meet one or more of the following criteria as defined at UNEP-WCMC 2014, Biodiversity A-Z website: www.biodversitya-z.org, UNEP-WCMC, Cambridge, UK:
Contributes substantially to the survival of threatened species, endemic species, or any species listed in Appendices I and II of the Bern convention;
Supports significant numbers of species in an area of high species diversity or supports important populations of one or more species;
Contains an important and/or representative sample of endangered habitat types;
Contains an outstanding example of a particular habitat type or a mosaic of different habitat types;
Represents an important area for one or more migratory species;
Otherwise contributes substantially to the achievement of the objectives of the convention.
Notes
Sources
Biogeography
Environment of Europe
Protected areas of Europe
Pro |
https://en.wikipedia.org/wiki/Decentralized%20identifier | Decentralized identifiers (DIDs) are a type of globally unique identifier that enables an entity to be identified in a manner that is verifiable, persistent (as long as the DID controller desires), and does not require the use of a centralized registry. DIDs enable a new model of decentralized digital identity that is often referred to as self-sovereign identity or decentralized identity. They are an important component of decentralized web applications.
DID documents
A decentralized identifier resolves (points) to a DID document, a set of data describing the DID subject, including mechanisms, such as cryptographic public keys, that the DID subject or a DID delegate can use to authenticate itself and prove its association with the DID.
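A minimal sketch of what such a DID document can look like, written here as a Python dict mirroring the JSON shape of the W3C examples (the did:example method and all key and service values are placeholders):

```python
did_document = {
    "@context": "https://www.w3.org/ns/did/v1",
    "id": "did:example:123456789abcdefghi",
    # Cryptographic material the subject can use to authenticate itself.
    "verificationMethod": [{
        "id": "did:example:123456789abcdefghi#keys-1",
        "type": "Ed25519VerificationKey2020",
        "controller": "did:example:123456789abcdefghi",
        "publicKeyMultibase": "z6Mk...placeholder",  # placeholder key value
    }],
    "authentication": ["did:example:123456789abcdefghi#keys-1"],
    # Optional service endpoints for interacting with the subject.
    "service": [{
        "id": "did:example:123456789abcdefghi#agent",
        "type": "ExampleAgentService",  # hypothetical service type
        "serviceEndpoint": "https://agent.example.com/",
    }],
}
```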
DID methods
Just as there are many different types of URIs, all of which conform to the URI standard, there are many different types of DID methods, all of which must conform to the DID standard. Each DID method specification must define:
The name of the DID method (which must appear between the first and second colon, e.g., did:example:).
The structure of the unique identifier that must follow the second colon.
The technical specifications for how a DID resolver can apply the CRUD operations to create, read, update, and deactivate a DID document using that method.
The W3C DID Working Group maintains a registry of DID methods.
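A tiny sketch of the did:&lt;method&gt;:&lt;method-specific-id&gt; syntax described above (a real resolver also validates the allowed character set of each part; did:example is a placeholder method):

```python
def parse_did(did):
    """Split a DID into its method name and method-specific identifier."""
    scheme, method, method_specific_id = did.split(":", 2)
    if scheme != "did":
        raise ValueError("not a DID: " + did)
    return method, method_specific_id

assert parse_did("did:example:123456789abcdefghi") == (
    "example", "123456789abcdefghi")
```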
Usage of DIDs
A DID identifies any subject (e.g., a person, organization, thing, data model, abstract entity, etc.) that the controller of the DID decides that it identifies. DIDs are designed to enable the controller of a DID to prove control over it and to be implemented independently of any centralized registry, identity provider, or certificate authority. DIDs are URIs that associate a DID subject with a DID document. Each DID document can express cryptographic material, verification methods, and service endpoints to enable trusted interactions associated with the DID subject. A DID document might |
https://en.wikipedia.org/wiki/System%20and%20Organization%20Controls | System and Organization Controls (SOC), (also sometimes referred to as service organizations controls) as defined by the American Institute of Certified Public Accountants (AICPA), is the name of a suite of reports produced during an audit. It is intended for use by service organizations (organizations that provide information systems as a service to other organizations) to issue validated reports of internal controls over those information systems to the users of those services. The reports focus on controls grouped into five categories called Trust Service Criteria. The Trust Services Criteria were established by The AICPA through its Assurance Services Executive Committee (ASEC) in 2017 (2017 TSC). These control criteria are to be used by the practitioner/examiner (Certified Public Accountant, CPA) in attestation or consulting engagements to evaluate and report on controls of information systems offered as a service. The engagements can be done on an entity wide, subsidiary, division, operating unit, product line or functional area basis. The Trust Services Criteria were modeled in conformity to The Committee of Sponsoring Organizations of the Treadway Commission (COSO) Internal Control - Integrated Framework (COSO Framework). In addition, the Trust Services Criteria can be mapped to NIST SP 800 - 53 criteria and to EU General Data Protection Regulation (GDPR) Articles. The AICPA auditing standard Statement on Standards for Attestation Engagements no. 18 (SSAE 18), section 320, "Reporting on an Examination of Controls at a Service Organization Relevant to User Entities' Internal Control Over Financial Reporting", defines two levels of reporting, type 1 and type 2. Additional AICPA guidance materials specify three types of reporting: SOC 1, SOC 2, and SOC 3.
Trust Service Criteria
Trust Services Criteria were designed such that they can provide flexibility in application to better suit the unique controls implemented by an organization to address its unique ris |
https://en.wikipedia.org/wiki/Internet%20security%20awareness | Internet security awareness or cyber security awareness refers to how much end-users know about the cyber security threats their networks face, the risks they introduce, and the security best practices that mitigate those risks and guide their behavior. End users are considered the weakest link and the primary vulnerability within a network. Since end-users are a major vulnerability, technical means to improve security are not enough. Organizations could also seek to reduce the risk of the human element (end users). This could be accomplished by providing security best practice guidance to raise end users' awareness of cyber security. Employees could be taught about common threats and how to avoid or mitigate them.
Cyber security awareness, training, education
A cyber security risk mitigating end user program could consist of a combination of multiple approaches, including cyber security awareness, cyber security training, and cyber security education.
Threats
Threat agents or threat actors are the perpetrators of the threat and usually look for the easiest way to gain access into a network, which is often the human element. However, these cyber threats can be mitigated. Some common threats include but are not limited to below.
Social engineering is when someone uses a compelling story, authority, or other means to convince someone to hand over sensitive information such as usernames and passwords. An end user with cyber security awareness will have the ability to recognize these types of attacks which improves their ability to avoid them.
Phishing is a form of social engineering. It is a popular attack that attempts to trick users into clicking a link within an email or on a website in hopes that they divulge sensitive information. This attack generally relies on a bulk email approach and the low cost of sending phishing emails. Few targets are fooled, but so many are targeted t |
https://en.wikipedia.org/wiki/Matrix%20factorization%20%28algebra%29 | In homological algebra, a branch of mathematics, a matrix factorization is a tool used to study infinitely long resolutions, generally over commutative rings.
Motivation
One of the problems with non-smooth algebras, such as Artin algebras, is that their derived categories are poorly behaved due to infinite projective resolutions. For example, in the ring $R = k[x]/(x^2)$ there is an infinite resolution of the $R$-module $k$ given by $\cdots \xrightarrow{\cdot x} R \xrightarrow{\cdot x} R \to k \to 0$. Instead of looking at only the derived category of the module category, David Eisenbud studied such resolutions by looking at their periodicity. In general, such resolutions are periodic with period 2 after finitely many objects in the resolution.
Definition
For a commutative ring $R$ and an element $f \in R$, a matrix factorization of $f$ is a pair of square matrices $(A, B)$ such that $AB = BA = f \cdot \mathrm{Id}$. This can be encoded more generally as a $\mathbb{Z}/2$-graded $R$-module $M$ with an endomorphism $d$ such that $d^2 = f \cdot \mathrm{Id}_M$.
Examples
(1) For $R = k[[x]]$ and $f = x^n$ there is a matrix factorization $(d_0, d_1)$ with $d_0 = x^i$ and $d_1 = x^{n-i}$ for $0 \le i \le n$.
(2) If and , then there is a matrix factorization where
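A further worked instance of the definition, chosen here for concreteness rather than taken from the cases above: for $f = xy + zw$ in $k[[x, y, z, w]]$, one checks by direct multiplication that
$$A = \begin{pmatrix} x & -z \\ w & y \end{pmatrix}, \qquad B = \begin{pmatrix} y & z \\ -w & x \end{pmatrix}, \qquad AB = BA = (xy + zw)\,\mathrm{Id}_2,$$
so $(A, B)$ is a matrix factorization of $f$.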
Periodicity
Definition
Main theorem
Given a regular local ring $S$ and an ideal $I$ generated by an $S$-sequence, set $R = S/I$ and let
$F_\bullet$ be a minimal $R$-free resolution of the ground field. Then $F_\bullet$ becomes periodic after at most $\dim R$ steps.
Maximal Cohen-Macaulay modules
page 18 of eisenbud article
Categorical structure
Support of matrix factorizations
See also
Derived noncommutative algebraic geometry
Derived category
Homological algebra
Triangulated category
References
Further reading
Homological Algebra on a Complete Intersection with an Application to Group Representations
Geometric Study of the Category of Matrix Factorizations
https://web.math.princeton.edu/~takumim/takumim_Spr13_JP.pdf
https://arxiv.org/abs/1110.2918
Homological algebra |
https://en.wikipedia.org/wiki/MNase-seq | MNase-seq, short for micrococcal nuclease digestion with deep sequencing, is a molecular biological technique that was first pioneered in 2006 to measure nucleosome occupancy in the C. elegans genome, and was subsequently applied to the human genome in 2008, though the term ‘MNase-seq’ was not coined until a year later, in 2009. Briefly, this technique relies on the use of the non-specific endo-exonuclease micrococcal nuclease, an enzyme derived from the bacterium Staphylococcus aureus, to bind and cleave protein-unbound regions of DNA on chromatin. DNA bound to histones or other chromatin-bound proteins (e.g. transcription factors) may remain undigested. The uncut DNA is then purified from the proteins and sequenced through one or more of the various Next-Generation sequencing methods.
MNase-seq is one of four classes of methods used for assessing the status of the epigenome through analysis of chromatin accessibility. The other three techniques are DNase-seq, FAIRE-seq, and ATAC-seq. While MNase-seq is primarily used to sequence regions of DNA bound by histones or other chromatin-bound proteins, the other three are commonly used for: mapping Deoxyribonuclease I hypersensitive sites (DHSs), sequencing the DNA unbound by chromatin proteins, or sequencing regions of loosely packaged chromatin through transposition of markers, respectively.
History
Micrococcal nuclease (MNase) was first discovered in S. aureus in 1956, crystallized in 1966, and characterized in 1967. MNase digestion of chromatin was key to early studies of chromatin structure, being used to determine that each nucleosomal unit of chromatin was composed of approximately 200bp of DNA. This, alongside Olins’ and Olins’ “beads on a string” model, confirmed Kornberg’s ideas regarding the basic chromatin structure. Upon additional studies, it was found that MNase could not degrade histone-bound DNA shorter than ~140bp and that DNase I and II could degrade the bound DNA to as low as 10bp. Thi
https://en.wikipedia.org/wiki/Euler%27s%20Gem | Euler's Gem: The Polyhedron Formula and the Birth of Topology is a book on the formula for the Euler characteristic of convex polyhedra and its connections to the history of topology. It was written by David Richeson and published in 2008 by the Princeton University Press, with a paperback edition in 2012. It won the 2010 Euler Book Prize of the Mathematical Association of America.
Topics
The book is organized historically, and reviewer Robert Bradley divides the topics of the book into three parts. The first part discusses the earlier history of polyhedra, including the works of Pythagoras, Thales, Euclid, and Johannes Kepler, and the discovery by René Descartes of a polyhedral version of the Gauss–Bonnet theorem (later seen to be equivalent to Euler's formula). It surveys the life of Euler, his discovery in the early 1750s that the Euler characteristic is equal to two for all convex polyhedra, and his flawed attempts at a proof, and concludes with the first rigorous proof of this identity in 1794 by Adrien-Marie Legendre,
based on Girard's theorem relating the angular excess of triangles in spherical trigonometry to their area.
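In modern notation, the identity at the heart of the book states that a convex polyhedron with $V$ vertices, $E$ edges, and $F$ faces satisfies
$$V - E + F = 2.$$
For example, the cube has $8 - 12 + 6 = 2$.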
Although polyhedra are geometric objects, Euler's Gem argues that Euler discovered his formula by being the first to view them topologically (as abstract incidence patterns of vertices, faces, and edges), rather than through their geometric distances and angles. (However, this argument is undermined by the book's discussion of similar ideas in the earlier works of Kepler and Descartes.) The birth of topology is conventionally marked by an earlier contribution of Euler, his 1736 work on the Seven Bridges of Königsberg, and the middle part of the book connects these two works through the theory of graphs. It proves Euler's formula in a topological rather than geometric form, for planar graphs, and discusses its uses in proving that these graphs have vertices of low degree, a key component in proofs of the four color theorem. It even make |
https://en.wikipedia.org/wiki/Levitation%20based%20inertial%20sensing | Levitation based inertial sensing is a new and rapidly growing technique for measuring linear acceleration, rotation and orientation of a body. Inertial sensors based on this technique, such as accelerometers and gyroscopes, enable ultra-sensitive inertial sensing. For example, the world's best accelerometer used in the LISA Pathfinder in-flight experiment is based on a levitation system which reaches a sensitivity of and noise of .
History
The pioneering work related to microparticle levitation was performed by Artur Ashkin in 1970. He demonstrated optical trapping of dielectric microspheres for the first time, forming an optical levitation system, by using a focused laser beam in air and liquid. This new technology was later named "optical tweezer" and applied in biochemistry and biophysics. Later, significant scientific progress on optically levitated systems was made, for example the cooling of the center of mass motion of a micro- or nanoparticle in the millikelvin regime. Very recently a research group published a paper showing motional quantum ground state cooling of a levitated nanoparticle. In addition, levitation based on electrostatic and magnetic approaches has also been proposed and realized.
Levitation systems have shown high force sensitivities in the range. For example, an optically levitated dielectric particle has been shown to exhibit force sensitivities beyond ~ . Thus, levitation systems show promise for ultra-sensitive force sensing, such as detection of short-range interactions. By levitating micro- or mesoparticles with a relatively large mass, this system can be employed as a high-performance inertial sensor, demonstrating nano-g sensitivity.
Method
One possible working principle behind a levitation based inertial sensing system is the following. By levitating a micro-object in vacuum and after a cool-down process, the center of mass motion of the micro-object can be controlled and coupled to the kinematic states of the system |
https://en.wikipedia.org/wiki/Proofs%20That%20Really%20Count | Proofs That Really Count: the Art of Combinatorial Proof is an undergraduate-level mathematics book on combinatorial proofs of mathematical identities. That is, it concerns equations between two integer-valued formulas, shown to be equal either by showing that both sides of the equation count the same type of mathematical objects, or by finding a one-to-one correspondence between the different types of object that they count. It was written by Arthur T. Benjamin and Jennifer Quinn, and published in 2003 by the Mathematical Association of America as volume 27 of their Dolciani Mathematical Expositions series. It won the Beckenbach Book Prize of the Mathematical Association of America.
Topics
The book provides combinatorial proofs of thirteen theorems in combinatorics and 246 numbered identities (collated in an appendix). Several additional "uncounted identities" are also included. Many proofs are based on a visual-reasoning method that the authors call "tiling", and in a foreword, the authors describe their work as providing a follow-up for counting problems of the Proofs Without Words books by Roger B. Nelsen.
The first three chapters of the book start with integer sequences defined by linear recurrence relations, the prototypical example of which is the sequence of Fibonacci numbers. These numbers can be given a combinatorial interpretation as the number of ways of tiling a strip of squares with tiles of two types, single squares and dominos; this interpretation can be used to prove many of the fundamental identities involving the Fibonacci numbers, and generalized to similar relations about other sequences defined similarly, such as the Lucas numbers, using "circular tilings and colored tilings". For instance, for the Fibonacci numbers, considering whether a tiling does or does not connect positions $m$ and $m+1$ of a strip of length $m+n$ immediately leads to the identity $f_{m+n} = f_m f_n + f_{m-1} f_{n-1}$, where $f_k$ denotes the number of tilings of a strip of length $k$.
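A short sketch verifying this tiling identity numerically (names are illustrative; $f_k$ here counts square-and-domino tilings of a length-$k$ strip, with $f_0 = f_1 = 1$):

```python
def tilings(n):
    """Number of tilings of a 1 x n strip by squares and dominoes
    (shifted Fibonacci numbers: 1, 1, 2, 3, 5, ...)."""
    a, b = 1, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# f_{m+n} = f_m * f_n + f_{m-1} * f_{n-1}
assert all(
    tilings(m + n) == tilings(m) * tilings(n) + tilings(m - 1) * tilings(n - 1)
    for m in range(1, 12) for n in range(1, 12)
)
```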
Chapters four through seven of the book concern identities involving continued fractions, binomial coef |
https://en.wikipedia.org/wiki/SnRNA-seq | snRNA-seq, also known as single nucleus RNA sequencing, single nuclei RNA sequencing or sNuc-seq, is an RNA sequencing method for profiling gene expression in cells which are difficult to isolate, such as those from archived tissues or tissues that are hard to dissociate. It is an alternative to single cell RNA seq (scRNA-seq), as it analyzes nuclei instead of intact cells.
snRNA-seq minimizes the occurrence of spurious gene expression, as the localization of fully mature ribosomes to the cytoplasm means that any mRNAs of transcription factors that are expressed after the dissociation process cannot be translated, and thus their downstream targets cannot be transcribed. Additionally, snRNA-seq technology enables the discovery of new cell types which would otherwise be difficult to isolate.
Methods and technology
The basic snRNA-seq method requires 4 main steps: tissue processing, nuclei isolation, cell sorting, and sequencing. In order to isolate and sequence RNA inside the nucleus, snRNA-seq involves using a quick and mild nuclear dissociation protocol. This protocol allows for minimization of technical issues that can affect studies, especially those concerned with immediate early gene (IEG) behavior.
The resulting dissociated cells are suspended and the suspension gently lysed, allowing the cell nuclei to be separated from their cytoplasmic lysates using centrifugation. These separated nuclei/cells are sorted using fluorescence-activated cell sorting (FACS) into individual wells, and amplified using microfluidics machinery. Sequencing occurs as normal and the data can be analyzed as appropriate for its use.
This basic snRNA-seq methodology is capable of profiling RNA from tissues that are preserved or cannot be dissociated, but it does not have high throughput capability due to its reliance on nuclei sorting by FACS. This technique cannot be scaled easily to profiling large numbers of nuclei or samples. Massively parallel scRNA-seq methods exist and |
https://en.wikipedia.org/wiki/Nucleomodulin | Nucleomodulins are a family of bacterial proteins that enter the nucleus of eukaryotic cells.
The term is a contraction of "nucleus" and "modulins", microbial molecules that modulate the behaviour of eukaryotic cells. Nucleomodulins are produced by pathogenic or symbiotic bacteria. They act on various processes in the nucleus: remodelling of the chromatin structure, transcription, splicing of pre-messenger RNA, and cell division.
The identification of nucleomodulins in several species of bacterial pathogens of humans, animals and plants has led to the emergence of the concept that direct control of the nucleus is one of the most sophisticated strategies used by microbes to bypass host defences. Nucleomodulins can be directly secreted into the intracellular medium after entry of the bacteria into the cell, like Listeria monocytogenes, or they can be injected from the extracellular medium or intracellular organelles using a type III or IV bacterial secretion system, also known as a "molecular syringe".
More recently, it has been shown that some of them, such as YopM from Yersinia pestis and IpaH9.8 from Shigella flexneri, can autonomously penetrate eukaryotic cells thanks to a membrane transduction domain.
The diversity of molecular mechanisms triggered by nucleomodulins is a source of inspiration for new biotechnologies. They are true nano-machines capable of hijacking a multitude of nuclear processes. In research, nucleomodulins are the subject of in-depth studies that have led to the discovery of new human nuclear regulators, such as the epigenetic regulator BAHD1.
Examples
Agrobacterium tumefaciens, responsible for crown gall disease, produces an arsenal of Vir proteins, including VirD2 and VirE2, enabling the precise integration of a piece of its DNA, called T-DNA, into that of the host plant
Listeria monocytogenes, responsible for listeriosis, can modulate the expression of immunity genes. One of the mechanisms at play involves th |
https://en.wikipedia.org/wiki/Pythagorean%20Triangles | Pythagorean Triangles is a book on right triangles, the Pythagorean theorem, and Pythagorean triples. It was originally written in the Polish language by Wacław Sierpiński (titled Trójkąty pitagorejskie), and published in Warsaw in 1954. Indian mathematician Ambikeshwar Sharma translated it into English, with some added material from Sierpiński, and published it in the Scripta Mathematica Studies series of Yeshiva University (volume 9 of the series) in 1962. Dover Books republished the translation in a paperback edition in 2003. There is also a Russian translation of the 1954 edition.
Topics
As a brief summary of the book's contents, reviewer Brian Hopkins quotes The Pirates of Penzance: "With many cheerful facts about the square of the hypotenuse."
The book is divided into 15 chapters (or 16, if one counts the added material as a separate chapter). The first three of these define the primitive Pythagorean triples (the ones in which the two sides and hypotenuse have no common factor), derive the standard formula for generating all primitive Pythagorean triples, compute the inradius of Pythagorean triangles, and construct all triangles with sides of length at most 100.
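The standard generating formula referred to here is the classical one: every primitive Pythagorean triple arises, up to swapping the two legs, as
$$a = m^2 - n^2, \qquad b = 2mn, \qquad c = m^2 + n^2$$
for coprime integers $m > n > 0$ of opposite parity; for instance $m = 2$, $n = 1$ gives $(3, 4, 5)$.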
Chapter 4 considers special classes of Pythagorean triangles, including those with sides in arithmetic progression, nearly-isosceles triangles, and the relation between nearly-isosceles triangles and square triangular numbers. The next two chapters characterize the numbers that can appear in Pythagorean triples, and chapters 7–9 find sets of many Pythagorean triangles with the same side, the same hypotenuse, the same perimeter, the same area, or the same inradius.
Chapter 10 describes Pythagorean triangles with a side or area that is a square or cube, connecting this problem to Fermat's Last Theorem. After a chapter on Heronian triangles, Chapter 12 returns to this theme, discussing triangles whose hypotenuse and sum of sides are squares. Chapter 13 relates Pythagorean triangles to rational points |
https://en.wikipedia.org/wiki/Convexity%20%28algebraic%20geometry%29 | In algebraic geometry, convexity is a restrictive technical condition for algebraic varieties originally introduced to analyze Kontsevich moduli spaces in quantum cohomology. These moduli spaces are smooth orbifolds whenever the target space is convex. A variety is called convex if the pullback of the tangent bundle to a stable rational curve has globally generated sections. Geometrically this implies the curve is free to move around infinitesimally without any obstruction. Convexity is generally phrased as the technical condition $H^1(C, f^*T_X) = 0$ for every stable map $f\colon C \to X$ from a rational curve,
since Serre's vanishing theorem guarantees this sheaf has globally generated sections. Intuitively this means that on a neighborhood of a point, with a vector field in that neighborhood, the local parallel transport can be extended globally. This generalizes the idea of convexity in Euclidean geometry, where given two points $p, q$ of a convex set $U \subset \mathbb{R}^n$, all of the points on the line segment $\overline{pq}$ are contained in that set. There is a vector field in a neighborhood of $p$ transporting $p$ to each point of $\overline{pq}$. Since the tangent bundle of $\mathbb{R}^n$ is trivial, hence globally generated, there is a vector field on $U$ such that the equality holds on restriction.
Examples
There are many examples of convex spaces, including the following.
Spaces with trivial rational curves
If the only maps from a rational curve to $X$ are constant maps, then the pullback of the tangent sheaf is the free sheaf $\mathcal{O}_C^{\oplus n}$ where $n = \dim X$. These sheaves have trivial non-zero cohomology, and hence they are always convex. In particular, Abelian varieties have this property since the Albanese variety of a rational curve is trivial, and every map from a variety to an Abelian variety factors through the Albanese.
Projective spaces
Projective spaces are examples of homogeneous spaces, but their convexity can also be proved using a sheaf cohomology computation. Recall the Euler sequence relates the tangent space of $\mathbb{P}^n$ through a short exact sequence
$$0 \to \mathcal{O} \to \mathcal{O}(1)^{\oplus (n+1)} \to T_{\mathbb{P}^n} \to 0.$$
If we only need to consider degree $d$ embeddings $f : \mathbb{P}^1 \to \mathbb{P}^n$, there is a short exact sequence
$$0 \to \mathcal{O}_{\mathbb{P}^1} \to \mathcal{O}_{\mathbb{P}^1}(d)^{\oplus (n+1)} \to f^{*}T_{\mathbb{P}^n} \to 0$$
gi |
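The truncated computation presumably continues with the long exact sequence in cohomology; a sketch of the standard conclusion, assuming $d \geq 0$:
$$\cdots \to H^1(\mathbb{P}^1, \mathcal{O}_{\mathbb{P}^1}(d))^{\oplus (n+1)} \to H^1(\mathbb{P}^1, f^{*}T_{\mathbb{P}^n}) \to H^2(\mathbb{P}^1, \mathcal{O}_{\mathbb{P}^1}) = 0.$$
Since $H^1(\mathbb{P}^1, \mathcal{O}_{\mathbb{P}^1}(d)) = 0$ for $d \geq -1$, the middle term vanishes, which is exactly the convexity condition for $\mathbb{P}^n$.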
https://en.wikipedia.org/wiki/1942%20Herefordshire%20TRE%20Halifax%20crash | V9977 was a Handley Page Halifax II that had been sent to the Telecommunications Research Establishment (TRE) at RAF Defford to be used as a flying testbed for the H2S radar.
On the afternoon of 7 June 1942, one of its Rolls-Royce Merlin engines caught fire, leading to a crash near the England–Wales border that killed all eleven crew members. Among the dead was Alan Blumlein of EMI, well known as the inventor of stereophonic sound recording and of the 405-line television system used in the UK until 1985.
Investigators determined that improper engine maintenance and assembly procedures caused the accident. It remains the deadliest crash in the history of military test flying in the UK.
History
Construction
V9977 was an early model Halifax II, which introduced the more powerful Merlin XX engine and a number of other detail changes over the original model.
Chosen for H2S
At a meeting on 23 December 1941, the Secretary of State for Air, Archibald Sinclair, directed that the TRE should focus its work on H2S radar on the new four-engined bombers: the Short Stirling, Handley Page Halifax and Avro Lancaster. Immediately thereafter, Philip Dee, B J O'Kane and Geoffrey Hensby visited the Aeroplane and Armament Experimental Establishment at Boscombe Down to examine the available aircraft and concluded that the Halifax had the best possibilities for mounting the scanner in different locations for testing.
On 1 January 1942, Bernard Lovell received orders from Albert Rowe, director of the TRE, to take over the direction of H2S. Three days later he visited Handley Page with Bob King, a TRE fitter who was well acquainted with the installation of test systems on a variety of aircraft, and Whitaker from Nash & Thompson, who were building the scanner system. They had collectively planned for the radar to be installed in a large, elongated radome under the aircraft.
They were met at the factory by a team of high-ranking members of the Halifax design team, including the c |
https://en.wikipedia.org/wiki/Kr00k | Kr00k (also written as KrØØk) is a security vulnerability that allows some WPA2-encrypted WiFi traffic to be decrypted. The vulnerability was originally discovered by security company ESET in 2019 and assigned CVE-2019-15126 on August 17, 2019. ESET estimates that this vulnerability affects over a billion devices.
Discovery
Kr00k was discovered by the ESET Experimental Research and Detection Team, most prominently ESET security researcher Miloš Čermák. It was named Kr00k by Robert Lipovsky and Stefan Svorencik, and was discovered while trying variations of the KRACK attack.
The vulnerability was initially found in chips made by Broadcom and Cypress; similar vulnerabilities have since been found in other implementations, including those by Qualcomm and MediaTek.
Patches
The vulnerability is known to be patched in:
iOS 13.2 and iPadOS 13.2 - October 28, 2019
macOS Catalina 10.15.1, Security Update 2019-001, and Security Update 2019-006 - October 29, 2019
Vulnerable devices
During their research, ESET confirmed over a dozen popular devices were vulnerable.
Cisco has found several of its devices to be vulnerable and is working on patches; it is tracking the issue under advisory ID cisco-sa-20200226-wi-fi-info-disclosure.
Known vulnerable devices include:
Amazon Echo 2nd gen
Amazon Kindle 8th gen
Apple iPad mini 2
Apple iPhone 6, 6S, 8, XR
Apple MacBook Air Retina 13-inch 2018
Asus wireless routers (RT-AC1200G+, RT-AC68U), but fixed in firmware Version 3.0.0.4.382.5161220 during March 2020
Google Nexus 5
Google Nexus 6
Google Nexus 6P
Raspberry Pi 3
Samsung Galaxy S4 (GT-I9505)
Samsung Galaxy S8
Xiaomi Redmi 3S
References
Computer security exploits
Hardware bugs
Wi-Fi
Computer-related introductions in 2019
Telecommunications-related introductions in 2019 |
https://en.wikipedia.org/wiki/Commercial%20National%20Security%20Algorithm%20Suite | The Commercial National Security Algorithm Suite (CNSA) is a set of cryptographic algorithms promulgated by the National Security Agency as a replacement for NSA Suite B Cryptography algorithms. It serves as the cryptographic base to protect US National Security Systems information up to the top secret level, while the NSA plans for a transition to quantum-resistant cryptography.
The suite includes
Advanced Encryption Standard with 256 bit keys
Elliptic-curve Diffie–Hellman and Elliptic Curve Digital Signature Algorithm with curve P-384
SHA-2 with 384 bits (SHA-384)
Diffie–Hellman key exchange with a minimum 3072-bit modulus
RSA with a minimum modulus size of 3072 bits
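As a concrete illustration of these parameter choices, the following minimal Haskell sketch (assuming the cryptonite package; the all-zero key and the message are placeholders invented for the example, not a real key-management scheme) computes a SHA-384 digest and initialises an AES-256 cipher context:
{-# LANGUAGE OverloadedStrings #-}
import Crypto.Hash (hashWith, SHA384 (..))
import Crypto.Cipher.AES (AES256)
import Crypto.Cipher.Types (cipherInit)
import Crypto.Error (throwCryptoError)
import qualified Data.ByteString as BS

main :: IO ()
main = do
  -- SHA-384 is the CNSA 1.0 hash function
  print (hashWith SHA384 ("example message" :: BS.ByteString))
  -- AES-256 takes a 32-byte (256-bit) key; never use a fixed key in practice
  let key = BS.replicate 32 0
      cipher = throwCryptoError (cipherInit key) :: AES256
  cipher `seq` putStrLn "AES-256 cipher context initialised"
The same package also provides ECDH and ECDSA over curve P-384; those are omitted here for brevity.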
The CNSA transition is notable for moving RSA from the temporary legacy status it had in Suite B to fully supported status. It also did not include the Digital Signature Algorithm. This, together with the overall delivery and timing of the announcement in the absence of post-quantum standards, raised considerable speculation about whether the NSA had found weaknesses, e.g. in elliptic-curve algorithms or others, or was trying to distance itself from an exclusive focus on ECC for non-technical reasons.
In September 2022, the NSA announced CNSA 2.0, which includes its first recommendations for post-quantum cryptographic algorithms.
CNSA 2.0 includes
Advanced Encryption Standard with 256 bit keys
CRYSTALS-Kyber and CRYSTALS-Dilithium with Level V parameters
SHA-2 with 384 or 512 bits
eXtended Merkle Signature Scheme (XMSS) and Leighton-Micali Signatures (LMS) with all parameters approved, with SHA256/192 recommended
Note that compared to CNSA 1.0, CNSA 2.0:
Suggests separate post-quantum algorithms (XMSS/LMS) for software/firmware signing for use immediately
Allows SHA-512
Announced the selection of CRYSTALS-Kyber and CRYSTALS-Dilithium early, with the expectation that they will be mandated only when the final standards and FIPS-validated implementations are released.
RSA, Diffie-Hellman, and elliptic cur |
https://en.wikipedia.org/wiki/Functor%20%28functional%20programming%29 | In functional programming, a functor is a design pattern inspired by the definition from category theory that allows one to apply a function to values inside a generic type without changing the structure of the generic type. In Haskell this idea can be captured in a type class:
class Functor f where
fmap :: (a -> b) -> f a -> f b
This declaration says that any type of Functor must support a method fmap, which maps a function over the element(s) of the Functor.
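For instance, Maybe is a Functor instance shipped with GHC's standard library; in a GHCi session, fmap applies the function under Just and leaves Nothing untouched:
ghci> fmap (+1) (Just 2)
Just 3
ghci> fmap (+1) Nothing
Nothing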
Functors in Haskell should also obey functor laws, which state that the mapping operation preserves the identity function and composition of functions:
fmap id = id
fmap (g . h) = (fmap g) . (fmap h)
(where . stands for function composition).
In Scala a trait can be used:
trait Functor[F[_]] {
def map[A,B](a: F[A])(f: A => B): F[B]
}
Functors form a base for more complex abstractions like Applicative Functor, Monad, and Comonad, all of which build atop a canonical functor structure. Functors are useful in modeling functional effects by values of parameterized data types. Modifiable computations are modeled by allowing a pure function to be applied to values of the "inner" type, thus creating the new overall value, which represents the modified computation (which might not have been run yet).
Examples
In Haskell, lists are a simple example of a functor. We may implement fmap as
instance Functor [] where
  fmap f [] = []
  fmap f (x:xs) = (f x) : fmap f xs
A binary tree may similarly be described as a functor:
data Tree a = Leaf | Node a (Tree a) (Tree a)
instance Functor Tree where
fmap f Leaf = Leaf
fmap f (Node x l r) = Node (f x) (fmap f l) (fmap f r)
If we have a binary tree t :: Tree a and a function f :: a -> b, the function fmap f will apply f to every element of t. For example, if f is (+1), adding 1 to each element of a tree t of numbers can be expressed as fmap (+1) t.
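Putting this together, a self-contained sketch (the example tree is invented for illustration; deriving Show is added so the result can be printed):
-- A complete module exercising the Tree functor defined above.
data Tree a = Leaf | Node a (Tree a) (Tree a) deriving Show

instance Functor Tree where
  fmap f Leaf = Leaf
  fmap f (Node x l r) = Node (f x) (fmap f l) (fmap f r)

main :: IO ()
main = print (fmap (+1) (Node 1 (Node 2 Leaf Leaf) Leaf))
-- prints: Node 2 (Node 3 Leaf Leaf) Leaf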
See also
Functor in category theory
Applicative functor, a special type of functor
References
External links
Section about Functor in Haskell Typeclassopedia
Chapter 11 Functors, Applicative Func |
https://en.wikipedia.org/wiki/Nature%20Food | Nature Food is a monthly peer-reviewed academic journal published by Nature Portfolio. It was established in 2020. The editor-in-chief is Anne Mullen.
Abstracting and indexing
The journal is abstracted and indexed in:
Science Citation Index Expanded
Scopus
According to the Journal Citation Reports, the journal has a 2021 impact factor of 20.430, ranking it 1st out of 143 journals in the category "Food Science & Technology".
References
External links
Nature Research academic journals
English-language journals
Food science journals
Academic journals established in 2020
Monthly journals
Online-only journals |
https://en.wikipedia.org/wiki/Perceptics | Perceptics LLC is a developer and manufacturer of automated license plate recognition (LPR) equipment based in Farragut, Tennessee, founded in approximately 1978. John Dalton is the CEO. A large 2019 hack of the company's data exposed its operations, as well as the locations of its installations.
Its technology is used by U.S. Customs and Border Protection (CBP) at 43 border crossings on both the Mexican and Canadian borders, as part of a partnership with Unisys Federal Systems. Perceptics is the exclusive license plate recognition provider for CBP, and operated as a subcontractor to Unisys on a license plate reader contract worth $229 million over several years. As of 2019, Perceptics had worked on CBP contracts for "nearly 30 years". It also provides "under-vehicle surveillance systems", and has contracts with Drug Enforcement Administration checkpoints, the Canada Border Services Agency, the United Arab Emirates, Saudi Arabia's Special Forces, and the Jordanian army.
Perceptics was previously a subsidiary of Northrop Grumman. It has been filling CBP contracts since 1982 and supplying license plate readers since 1997. In 2002 the equipment cost was approximately $90,000 per lane.
Perceptics also discussed promoting its license plate reading technology for use in a congestion pricing scheme with the MTA in New York City, in a presentation titled "Smart Imaging Solutions for New York City Congestion Pricing", and demoed the technology to the MTA's Bridges and Tunnels division. The Perceptics system provides far more than license plate reading: its "Vehicle Occupancy Imaging System" can identify drivers and passengers, and the system can track car locations and driver behavior as a profile. Perceptics and Unisys were also involved in a CBP trial project called the "Vehicle Face System", involving facial recognition of car occupants.
Perceptics used Amazon Rekognition as of August 2018.
Canadian operations
Data from the hack revealed the Canada Border Services Age |