https://en.wikipedia.org/wiki/Hooper%27s%20paradox | Hooper's paradox is a falsidical paradox based on an optical illusion. A geometric shape with an area of 32 units is dissected into four parts, which afterwards get assembled into a rectangle with an area of only 30 units.
Explanation
Upon close inspection one can notice that the triangles of the dissected shape are not identical to the triangles in the rectangle. The length of the shorter side at the right angle measures 2 units in the original shape but only 1.8 units in the rectangle. This means that the real triangles of the original shape overlap in the rectangle. The overlapping area is a parallelogram, the diagonals and sides of which can be computed via the Pythagorean theorem.
The area of this parallelogram can be determined using Heron's formula for triangles. This yields
s = (a + b + c) / 2
for the halved circumference of the triangle (half of the parallelogram, with side lengths a, b and c) and with that
F = 2·√(s(s − a)(s − b)(s − c))
for the area of the parallelogram.
So the overlapping area of the two triangles accounts exactly for the vanished area of 2 units.
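This computation can be checked numerically. The sketch below (Python) assumes the overlap parallelogram has vertices (0, 0), (4, 1), (10, 3) and (6, 2), as in the standard 10 × 3 layout of the rectangle; the coordinates are an illustrative assumption rather than data taken from the article's figure.

```python
from math import sqrt

def shoelace_area(pts):
    """Polygon area via the shoelace formula (vertices in cyclic order)."""
    n = len(pts)
    s = sum(pts[i][0] * pts[(i + 1) % n][1] - pts[(i + 1) % n][0] * pts[i][1]
            for i in range(n))
    return abs(s) / 2

def heron_area(a, b, c):
    """Triangle area from its side lengths via Heron's formula."""
    s = (a + b + c) / 2  # halved circumference (semiperimeter)
    return sqrt(s * (s - a) * (s - b) * (s - c))

# Assumed overlap parallelogram inside the 10 x 3 rectangle (illustrative).
p = [(0, 0), (4, 1), (10, 3), (6, 2)]
dist = lambda u, v: sqrt((u[0] - v[0]) ** 2 + (u[1] - v[1]) ** 2)

# Half the parallelogram is the triangle (0,0), (4,1), (10,3); its side
# lengths come from the Pythagorean theorem.
a, b, c = dist(p[0], p[1]), dist(p[1], p[2]), dist(p[0], p[2])
print(shoelace_area(p))          # 2.0
print(2 * heron_area(a, b, c))   # ~2.0 (two congruent triangle halves)
```

Both routes give an overlap of exactly 2 square units, matching the vanished area.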
History
William Hooper published the paradox in 1774 in his book Rational Recreations, in which he named it The geometric money. The 1774 edition of his book still contained a false drawing, which was corrected in the 1782 edition. However, Hooper was not the first to publish this geometric fallacy, since Hooper's book was largely an adaptation of Edmé-Gilles Guyot's Nouvelles récréations physiques et mathématiques, which had been published in France in 1769. The description in that book contains the same false drawing as in Hooper's book, but it was corrected in a later edition as well.
See also
Chessboard paradox
Missing square puzzle
References
Martin Gardner: Mathematics, Magic and Mystery. Courier (Dover), 1956, pp. 129–155
Greg N. Frederickson: Dissections: Plane and Fancy. Cambridge University Press, 2003, chapter 23, pp. 268–277, in particular pp. 271–274 (online update to chapter 23)
Simon During: Modern Enchantments: The Cultural Power of Secular Magic. Har |
https://en.wikipedia.org/wiki/Quadric%20%28algebraic%20geometry%29 | In mathematics, a quadric or quadric hypersurface is the subspace of N-dimensional space defined by a polynomial equation of degree 2 over a field. Quadrics are fundamental examples in algebraic geometry. The theory is simplified by working in projective space rather than affine space. An example is the quadric surface
xy = zw
in projective space P^3 over the complex numbers C. A quadric has a natural action of the orthogonal group, and so the study of quadrics can be considered as a descendant of Euclidean geometry.
Many properties of quadrics hold more generally for projective homogeneous varieties. Another generalization of quadrics is provided by Fano varieties.
Property of quadric
By definition, a quadric X of dimension n over a field k is the subspace of P^(n+1) defined by q = 0, where q is a nonzero homogeneous polynomial of degree 2 over k in variables x_0, ..., x_(n+1). (A homogeneous polynomial is also called a form, and so q may be called a quadratic form.) If q is the product of two linear forms, then X is the union of two hyperplanes. It is common to assume that n ≥ 1 and q is irreducible, which excludes that special case.
Here algebraic varieties over a field k are considered as a special class of schemes over k. When k is algebraically closed, one can also think of a projective variety in a more elementary way, as a subset of projective space P^N(k) defined by homogeneous polynomial equations with coefficients in k.
If q can be written (after some linear change of coordinates) as a polynomial in a proper subset of the variables, then X is the projective cone over a lower-dimensional quadric. It is reasonable to focus attention on the case where X is not a cone. For k of characteristic not 2, X is not a cone if and only if X is smooth over k. When k has characteristic not 2, smoothness of a quadric is also equivalent to the Hessian matrix of q having nonzero determinant, or to the associated bilinear form b(x,y) = q(x+y) – q(x) – q(y) being nondegenerate. In general, for k of characteristic not 2, the rank of |
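The Hessian criterion is easy to test in practice. A minimal sketch (Python, assuming sympy is available) for the quadric xy = zw from the introduction:

```python
import sympy as sp

x0, x1, x2, x3 = sp.symbols('x0 x1 x2 x3')
q = x0 * x1 - x2 * x3  # homogeneous form of the quadric surface xy = zw

# Hessian matrix of second partial derivatives of q.
H = sp.hessian(q, (x0, x1, x2, x3))
print(H.det())  # 1: nonzero, so (in characteristic not 2) the quadric is smooth
```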
https://en.wikipedia.org/wiki/List%20of%20elevator%20test%20towers | This is a list of all known elevator test towers in the world.
List
Elevator test towers of unknown height
References
Lists of buildings and structures
Engineering-related lists |
https://en.wikipedia.org/wiki/Paya%2C%20Inc. | Paya Holdings, Inc. is an American payment processor service that operates in North America and is headquartered in Atlanta, Georgia. As well as its headquarters, it has offices in Florida, Ohio and Virginia.
The company provides online payments processing as well as products for face-to-face and telephone payments. It was known by the name Sage Payment Solutions while under the ownership of The Sage Group plc between 2006 and 2017. The company processes more than $30 billion for over 100,000 customers annually. The company was acquired by Nuvei in February 2023.
History
Sage Payment Solutions (SPS) North America was founded in 2006 by the London Stock Exchange-listed FTSE 100 company Sage Group, which primarily operates in the accounting and payroll business software industry. The same year, the Sage Group entered the payments industry in its home country, the United Kingdom, through the acquisition of Protx, though the companies were kept separate.
Sage Group divested the US Sage Payment Solutions business to GTCR in 2017 for $260 million. Sage Payments Solutions was later rebranded as Paya in January 2018.
GTCR provided Paya with $350 million in new equity. In November 2018 the company appointed their current CEO, Jeff Hack, and that same month acquired Stewardship Technologies. In January 2019 Paya acquired First Billing Services.
On August 3, 2020, Paya announced that it would merge with Nasdaq-listed FinTech Acquisition Corp. III. The merged company would be listed on the Nasdaq stock exchange under the symbol PAYA with a value of $1.3 billion, and GTCR would remain the company's largest shareholder.
In January 2023, Canadian financial services company Nuvei agreed to buy Paya in an all-cash deal worth around US$1.3 billion. The acquisition was completed the following month.
See also
List of online payment service providers
References
External links
Companies based in Atlanta
Financial services companies established in 2006
Online payments
Payment service p |
https://en.wikipedia.org/wiki/Lectures%20in%20Geometric%20Combinatorics | Lectures in Geometric Combinatorics is a textbook on polyhedral combinatorics. It was written by Rekha R. Thomas, based on a course given by Thomas at the 2004 Park City Mathematics Institute, and published by the American Mathematical Society and Institute for Advanced Study in 2006, as volume 33 of their Student Mathematical Library book series.
Topics
The 14 chapters of the book can be grouped into two parts, with the first 2/3 of the book concerning the combinatorial properties of convex polytopes and the remainder connecting these topics to abstract algebra.
The topics covered include Schlegel diagrams and Gale diagrams, irrational polytopes, point set triangulations, regular triangulations and their polyhedral representation by secondary polytopes, the permutohedron as an example of a secondary polytope, Gröbner bases, toric ideals, and toric varieties, and the connections between Gröbner bases of toric ideals and regular triangulations of points.
Audience and reception
Although originally presented as an advanced undergraduate course, the book is also suitable for graduate students and for researchers interested in beginning work in this area. It requires only an undergraduate level of background material in mathematics (particularly linear algebra), and includes exercises making it suitable as a textbook. Reviewers Miklós Bóna and Alexander Zvonkin suggest it as a "quick introduction" to its topics, after which other books on the same topics can provide greater depth.
See also
List of books about polyhedra
References
Polyhedral combinatorics
Mathematics textbooks
2006 non-fiction books
Books of lectures |
https://en.wikipedia.org/wiki/Leavitt%20path%20algebra | In mathematics, a Leavitt path algebra is a universal algebra constructed from a directed graph. Leavitt path algebras generalize Leavitt algebras and may be considered as algebraic analogues of graph C*-algebras.
History
Leavitt path algebras were simultaneously introduced in 2005 by Gene Abrams and Gonzalo Aranda Pino as well as by Pere Ara, María Moreno, and Enrique Pardo, with neither of the two groups aware of the other's work. Leavitt path algebras have been investigated by dozens of mathematicians since their introduction, and in 2020 Leavitt path algebras were added to the Mathematics Subject Classification with code 16S88 under the general discipline of Associative Rings and Algebras.
The basic reference is the book Leavitt Path Algebras.
Graph terminology
The theory of Leavitt path algebras uses terminology for graphs similar to that of C*-algebraists, which differs slightly from that used by graph theorists. The term graph is typically taken to mean a directed graph E = (E^0, E^1, r, s) consisting of a countable set of vertices E^0, a countable set of edges E^1, and maps r, s : E^1 → E^0 identifying the range and source of each edge, respectively. A vertex v ∈ E^0 is called a sink when s^{-1}(v) = ∅; i.e., there are no edges in E with source v. A vertex v ∈ E^0 is called an infinite emitter when s^{-1}(v) is infinite; i.e., there are infinitely many edges in E with source v. A vertex is called a singular vertex if it is either a sink or an infinite emitter, and a vertex is called a regular vertex if it is not a singular vertex. Note that a vertex v is regular if and only if the number of edges in E with source v is finite and nonzero. A graph is called row-finite if it has no infinite emitters; i.e., if every vertex is either a regular vertex or a sink.
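For a concrete finite graph, these vertex classes are easy to compute. A small illustrative sketch (Python; the encoding of a graph as a vertex list plus (source, range) pairs is an assumption of the example, not notation from the literature):

```python
def classify(vertices, edges):
    """Classify vertices of a finite directed graph.

    edges is a list of (source, range) pairs. In a finite graph every
    vertex emits only finitely many edges, so no infinite emitters occur
    and every vertex is either a sink or a regular vertex.
    """
    emit_count = {v: 0 for v in vertices}
    for src, rng in edges:
        emit_count[src] += 1  # count edges with source src
    return {v: ('sink' if n == 0 else 'regular') for v, n in emit_count.items()}

# A two-vertex graph with edges v -> w and v -> v:
print(classify(['v', 'w'], [('v', 'w'), ('v', 'v')]))
# {'v': 'regular', 'w': 'sink'} -- no infinite emitters, so it is row-finite
```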
A path is a finite sequence of edges e_1 e_2 ... e_n with r(e_i) = s(e_{i+1}) for all 1 ≤ i ≤ n−1. An infinite path is a countably infinite sequence of edges e_1 e_2 ... with r(e_i) = s(e_{i+1}) for all i. A cycle is a path e_1 e_2 ... e_n with r(e_n) = s(e_1), and an exit for a cycle e_1 e_2 ... e_n is an edge f such that s(f) = s(e_i) and f ≠ e_i for some i. A cycle e_1 e_2 ... e_n is called a simple cycle if s(e_i) ≠ s(e_1) for all 2 ≤ i ≤ n.
The |
https://en.wikipedia.org/wiki/Ultragraph%20C%2A-algebra | In mathematics, an ultragraph C*-algebra is a universal C*-algebra generated by partial isometries on a collection of Hilbert spaces constructed from ultragraphs. These C*-algebras were created in order to simultaneously generalize the classes of graph C*-algebras and Exel–Laca algebras, giving a unified framework for studying these objects. This is because every graph can be encoded as an ultragraph, and similarly, every infinite graph giving an Exel–Laca algebra can also be encoded as an ultragraph.
Definitions
Ultragraphs
An ultragraph 𝒢 = (G^0, 𝒢^1, r, s) consists of a set of vertices G^0, a set of edges 𝒢^1, a source map s : 𝒢^1 → G^0, and a range map r : 𝒢^1 → P(G^0) \ {∅} taking values in the collection of nonempty subsets of the vertex set. A directed graph is the special case of an ultragraph in which the range of each edge is a singleton, and ultragraphs may be thought of as generalized directed graphs in which each edge starts at a single vertex and points to a nonempty subset of vertices.
Example
An easy way to visualize an ultragraph is to draw its vertices as labelled points and to draw each edge e as a bundle of arrows, one from the source vertex s(e) to each vertex in the range r(e); an edge whose range is a singleton then looks like an ordinary directed edge.
Ultragraph algebras
Given an ultragraph 𝒢 = (G^0, 𝒢^1, r, s), we define 𝒢^0 to be the smallest subset of P(G^0) containing the singleton sets {v} for v ∈ G^0, containing the range sets r(e) for e ∈ 𝒢^1, and closed under intersections, unions, and relative complements. A Cuntz–Krieger 𝒢-family is a collection of projections {p_A : A ∈ 𝒢^0} together with a collection of partial isometries {s_e : e ∈ 𝒢^1} with mutually orthogonal ranges satisfying
p_∅ = 0, p_A p_B = p_{A∩B}, and p_{A∪B} = p_A + p_B − p_{A∩B} for all A, B ∈ 𝒢^0,
s_e^* s_e = p_{r(e)} for all e ∈ 𝒢^1,
p_v = Σ_{s(e)=v} s_e s_e^* whenever v is a vertex that emits a finite number of edges, and
s_e s_e^* ≤ p_{s(e)} for all e ∈ 𝒢^1.
The ultragraph C*-algebra C*(𝒢) is the universal C*-algebra generated by a Cuntz–Krieger 𝒢-family.
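For a finite ultragraph the collection 𝒢^0 can be computed directly as a fixed point of the three closure operations. A minimal sketch (Python; representing vertex sets as frozensets, and omitting the empty set, are choices of this illustration):

```python
def generate_G0(vertices, ranges):
    """Smallest collection containing every singleton {v} and every range set,
    closed under intersection, union, and relative complement."""
    sets = {frozenset([v]) for v in vertices} | {frozenset(r) for r in ranges}
    changed = True
    while changed:  # iterate the closure operations to a fixed point
        changed = False
        current = list(sets)
        for A in current:
            for B in current:
                for C in (A & B, A | B, A - B):
                    if C and C not in sets:  # skip the empty set
                        sets.add(C)
                        changed = True
    return sets

G0 = generate_G0(['u', 'v', 'w'], [{'v', 'w'}])  # one edge with range {v, w}
for A in sorted(G0, key=lambda s: (len(s), sorted(s))):
    print(set(A))
```

Because all singletons are present and the collection is closed under unions, the closure of a finite vertex set is the whole nonempty power set; for infinite ultragraphs 𝒢^0 is in general a much smaller lattice.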
Properties
Every graph C*-algebra is seen to be an ultragraph algebra by simply considering the graph as a special case of |
https://en.wikipedia.org/wiki/Calculus%20on%20Euclidean%20space | In mathematics, calculus on Euclidean space is a generalization of calculus of functions in one or several variables to calculus of functions on Euclidean space as well as a finite-dimensional real vector space. This calculus is also known as advanced calculus, especially in the United States. It is similar to multivariable calculus but is somewhat more sophisticated in that it uses linear algebra (or some functional analysis) more extensively and covers some concepts from differential geometry such as differential forms and Stokes' formula in terms of differential forms. This extensive use of linear algebra also allows a natural generalization of multivariable calculus to calculus on Banach spaces or topological vector spaces.
Calculus on Euclidean space is also a local model of calculus on manifolds, a theory of functions on manifolds.
Basic notions
Functions in one real variable
This section is a brief review of function theory in one-variable calculus.
A real-valued function f is continuous at a if it is approximately constant near a; i.e.,
lim_{h→0} (f(a + h) − f(a)) = 0.
In contrast, the function f is differentiable at a if it is approximately linear near a; i.e., there is some real number λ such that
lim_{h→0} (f(a + h) − f(a) − λh)/h = 0.
(For simplicity, suppose f(a) = 0. Then the above means that f(a + h) = λh + g(h), where g(h) goes to 0 faster than h going to 0 and, in that sense, f(a + h) behaves like λh.)
The number λ depends on a and thus is denoted as f′(a). If f is differentiable on an open interval (a, b) and if f′ is a continuous function on (a, b), then f is called a C^1 function. More generally, f is called a C^k function if its derivative f′ is a C^{k−1} function. Taylor's theorem states that a C^k function is precisely a function that can be approximated by a polynomial of degree k.
If f is a C^1 function and f′(a) ≠ 0 for some a, then either f′(a) > 0 or f′(a) < 0; i.e., either f is strictly increasing or strictly decreasing in some open interval containing a. In particular, f is bijective from some open interval containing a onto its image U. The inverse function theorem then says that the inverse function f^{−1} is differentiable on U with |
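The limit defining differentiability can be illustrated numerically. A small sketch (Python), checking for f(x) = x² at a = 1 that the error f(a + h) − f(a) − λh vanishes faster than h when λ = 2:

```python
f = lambda x: x * x
a, lam = 1.0, 2.0  # candidate slope: f'(1) = 2

for h in (0.1, 0.01, 0.001, 0.0001):
    err = f(a + h) - f(a) - lam * h
    print(h, err / h)  # the ratio equals h here, so it tends to 0 with h
```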
https://en.wikipedia.org/wiki/Multidrug-resistant%20bacteria | Multidrug-resistant bacteria (MDR bacteria) are bacteria that are resistant to three or more classes of antimicrobial drugs. MDR bacteria have seen an increase in prevalence in recent years and pose serious risks to public health. MDR bacteria can be broken into 3 main categories: Gram-positive, Gram-negative, and other (acid-fast). These bacteria employ various adaptations to avoid or mitigate the damage done by antimicrobials. With increased access to modern medicine, there has been a sharp increase in the amount of antibiotics consumed. Given the abundant use of antibiotics, there has been a considerable increase in the evolution of antimicrobial resistance factors, now outpacing the development of new antibiotics.
Examples identified as serious threats to public health
Examples of MDR bacteria identified as serious threats to public health include:
Gram-positive MDR bacteria
Clostridioides difficile
Staphylococcus aureus
Streptococcus pneumoniae
Gram-negative MDR bacteria
Carbapenem-resistant Acinetobacter
Escherichia coli
Klebsiella pneumoniae
Neisseria gonorrhoeae
Campylobacter
Pseudomonas aeruginosa
Salmonella
Shigella
Other MDR bacteria
Mycobacterium tuberculosis
Microbial adaptations
MDR bacteria employ a plurality of adaptations to overcome the environmental insults caused by antibiotics. Bacteria are capable of sharing these resistance factors in a process called horizontal gene transfer, in which resistant bacteria pass genetic information that encodes resistance on to the naive population.
Antibiotic inactivation: bacteria create proteins that can prevent damage caused by antibiotics. They can do this in two ways: first, by inactivating or modifying the antibiotic so that it can no longer interact with its target; second, by degrading the antibiotic directly.
Multidrug efflux pumps: The use of transporter proteins to expel the antibiotic.
Modification of target sites: mutating or modifying elements of the bacteria structure to prevent interaction |
https://en.wikipedia.org/wiki/Gale%20diagram | In the mathematical discipline of polyhedral combinatorics, the Gale transform turns the vertices of any convex polytope into a set of vectors or points in a space of a different dimension, the Gale diagram of the polytope. It can be used to describe high-dimensional polytopes with few vertices, by transforming them into sets with the same number of points, but in a space of a much lower dimension. The process can also be reversed, to construct polytopes with desired properties from their Gale diagrams. The Gale transform and Gale diagram are named after David Gale, who introduced these methods in a 1956 paper on neighborly polytopes.
Definitions
Transform
Given a d-dimensional polytope with n vertices, adjoin 1 to the Cartesian coordinates of each vertex, to obtain a (d + 1)-dimensional column vector. The matrix A of these n column vectors has dimensions (d + 1) × n, defining a linear mapping from n-space to (d + 1)-space, surjective with rank d + 1. The kernel of A describes the linear dependencies among the original vertices with coefficients summing to zero; this kernel has dimension n − d − 1. The Gale transform of A is a matrix B of dimension n × (n − d − 1), whose column vectors are a chosen basis for the kernel of A. Then B has n row vectors of dimension n − d − 1. These row vectors form the Gale diagram of the polytope. A different choice of basis for the kernel changes the result only by a linear transformation.
Note that the vectors in the Gale diagram are in natural bijection with the n vertices of the original d-dimensional polytope, but the dimension of the Gale diagram is smaller whenever n ≤ 2d.
A proper subset of the vertices of a polytope forms the vertex set of a face of the polytope, if and only if the complementary set of vectors of the Gale transform has a convex hull that contains the origin in its relative interior.
Equivalently, the subset of vertices forms a face if and only if its affine span does not intersect the convex hull of the complementary vectors.
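The transform is straightforward to compute numerically. A minimal sketch (Python, assuming numpy and scipy are available) for a square, with d = 2 and n = 4, so that the Gale diagram lives in dimension n − d − 1 = 1:

```python
import numpy as np
from scipy.linalg import null_space

# Vertices of a square (d = 2, n = 4).
V = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)

# Adjoin 1 to each vertex to get (d + 1)-dimensional columns; A is (d+1) x n.
A = np.vstack([V.T, np.ones(len(V))])

# The columns of B are a basis of ker(A); the rows of B are the Gale diagram.
B = null_space(A)   # shape (n, n - d - 1) = (4, 1)
print(B.ravel())    # proportional to (1, -1, 1, -1)
```

Opposite vertices of the square receive vectors of the same sign, and any other choice of kernel basis changes the diagram only by a linear transformation.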
Linear diagram
Because the Gale transform is defined only up |
https://en.wikipedia.org/wiki/LitePoint | LitePoint is a wireless test company. The company’s hardware and software can be used to verify quality of wireless connectivity in smartphones, tablets, PCs, wireless access points and chipsets across WLAN and cellular technologies, including 5G, Bluetooth, Low Power Wide Area (LPWAN), NFC, Wi-Fi 6, Wi-Fi 6E, UWB, V2X and Zigbee. LitePoint claims it works with all of the leading chipset companies.
The company is headquartered in San Jose, California, with additional offices. LitePoint is a wholly owned subsidiary of Teradyne, a developer and supplier of automatic test equipment (ATE).
History
LitePoint was founded in 2000 by Benny Madsen, Christian Olgaard and Spiros Bouas. In 2011, LitePoint was acquired by Teradyne. In 2014, the company claimed it had supporting test equipment for 300 cellular and wireless connectivity chipsets. In 2019, LitePoint moved their headquarters from Sunnyvale, California to San Jose.
Work with FiRa Consortium
In 2019, LitePoint joined the FiRa Consortium, an organization aimed at facilitating the evolution of UWB wireless technology and fine-ranging (FiRa) technology. When the consortium formed, LitePoint was the only test vendor, joining other consortium members including Bosch, Samsung and Sony.
Awards
In 2013, the company’s IQxel-160 connectivity test system won the EDN China Innovation Award as the product leader in communications testing in the test and measurement category.
References
American companies established in 2000
Networking companies of the United States |
https://en.wikipedia.org/wiki/Zero-overhead%20looping | Zero-overhead looping is a feature of some processor instruction sets whose hardware can repeat the body of a loop automatically, rather than requiring software instructions which take up cycles (and therefore time) to do so. Zero-overhead loops are common in digital signal processors and some CISC instruction sets.
Background
In many instruction sets, a loop must be implemented by using instructions to increment or decrement a counter, check whether the end of the loop has been reached, and if not jump to the beginning of the loop so it can be repeated. Although this typically only represents around 3–16 bytes of space for each loop, even that small amount could be significant depending on the size of the CPU caches. More significant is that those instructions each take time to execute, time which is not spent doing useful work.
The overhead of such a loop is apparent compared to a completely unrolled loop, in which the body of the loop is duplicated exactly as many times as it will execute. In that case, no space or execution time is wasted on instructions to repeat the body of the loop. However, the duplication caused by loop unrolling can significantly increase code size, and the larger size can even impact execution time due to cache misses. (For this reason, it's common to only partially unroll loops, such as transforming it into a loop which performs the work of four iterations in one step before repeating. This balances the advantages of unrolling with the overhead of repeating the loop.) Moreover, completely unrolling a loop is only possible for a limited number of loops: those whose number of iterations is known at compile time.
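The shape of such a partially unrolled loop can be sketched in any language; the Python version below is purely illustrative, since the real savings occur at the machine-instruction level:

```python
def sum_unrolled(xs):
    """Sum a sequence using a loop partially unrolled by a factor of four."""
    total, i, n = 0, 0, len(xs)
    # Main loop: four elements of work per pass, so the loop-control
    # overhead (increment, compare, jump) is paid once per four elements.
    while i + 4 <= n:
        total += xs[i]
        total += xs[i + 1]
        total += xs[i + 2]
        total += xs[i + 3]
        i += 4
    while i < n:  # remainder loop for the final n % 4 elements
        total += xs[i]
        i += 1
    return total
```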
For example, a simple counted loop written in C is typically compiled into x86 assembly code in which a register serves as the counter and a conditional jump returns control to the top of the loop on each iteration.
Implementation
Processors with zero-overhead looping have machine instructions and registers to automatically repeat one or more instructions. Depending on the instructions available, these may only be suitable for count-cont |
https://en.wikipedia.org/wiki/List%20of%20inflammatory%20disorders |
Nervous system
CNS
Encephalitis
Myelitis
Meningitis
Arachnoiditis
PNS
Neuritis
eye
Dacryoadenitis
Scleritis
Episcleritis
Keratitis
Retinitis
Chorioretinitis
Blepharitis
Conjunctivitis
Uveitis
ear
Otitis externa
Otitis media
Labyrinthitis
Mastoiditis
Cardiovascular system
Carditis
Endocarditis
Myocarditis
Pericarditis
Vasculitis
Arteritis
Phlebitis
Capillaritis
Respiratory system
upper
Sinusitis
Rhinitis
Pharyngitis
Laryngitis
lower
Tracheitis
Bronchitis
Bronchiolitis
Pneumonitis
Pleuritis
Mediastinitis
Digestive system
Mouth
Stomatitis
Gingivitis
Gingivostomatitis
Glossitis
Tonsillitis
Sialadenitis/Parotitis
Cheilitis
Pulpitis
Gnathitis
Gastrointestinal tract
Esophagitis
Gastritis
Gastroenteritis
Enteritis
Colitis
Enterocolitis
Duodenitis
Ileitis
Caecitis
Appendicitis
Proctitis
Accessory digestive organs
Hepatitis
Ascending cholangitis
Cholecystitis
Pancreatitis
Peritonitis
Integumentary system
Dermatitis
Folliculitis
Cellulitis
Hidradenitis
Musculoskeletal system
Arthritis
Dermatomyositis
soft tissue
Myositis
Synovitis/Tenosynovitis
Bursitis
Enthesitis
Fasciitis
Capsulitis
Epicondylitis
Tendinitis
Panniculitis
Osteochondritis: Osteitis/Osteomyelitis
Spondylitis
Periostitis
Chondritis
Urinary system
Nephritis
Glomerulonephritis
Pyelonephritis
Ureteritis
Cystitis
Urethritis
Reproductive system
Female
Oophoritis
Salpingitis
Endometritis
Parametritis
Cervicitis
Vaginitis
Vulvitis
Mastitis
Male
Orchitis
Epididymitis
Prostatitis
Seminal vesiculitis
Balanitis
Posthitis
Balanoposthitis
Pregnancy/newborn
Chorioamnionitis
Funisitis
Omphalitis
Endocrine system
Insulitis
Hypophysitis
Thyroiditis
Parathyroiditis
Adrenalitis
Lymphatic system
Lymphangitis
Lymphadenitis
Physiology
Inflammations |
https://en.wikipedia.org/wiki/Les%20Archives%20du%20spectacle | Les Archives du spectacle – The Performing Arts Archive – is an online French database covering live performance (theatre, dance, opera, puppetry, etc.). It was created in 2007.
The site is designed to provide free information about plays, actors, actresses, directors, playwrights and other people and companies involved in the development of a show, a play, a musical or an opera in a French-speaking country (France, Switzerland, Belgium, Luxembourg, Canada). Details are given of major revivals as well as new works. Foreign shows performed in France are sometimes covered. The site offers no judgments on the quality of the shows it covers.
The site is run from the city of Montpellier. It operates in co-operation with the French Performing Arts Archives and ARTCENA, a national centre created by the French Ministry of Culture to coordinate and increase digital resources on circus, street and theatre arts.
References
Online databases
Theatre databases |
https://en.wikipedia.org/wiki/Convex%20Polyhedra%20%28book%29 | Convex Polyhedra is a book on the mathematics of convex polyhedra, written by Soviet mathematician Aleksandr Danilovich Aleksandrov, and originally published in Russian in 1950, under the title Выпуклые многогранники. It was translated into German by Wilhelm Süss as Konvexe Polyeder in 1958. An updated edition, translated into English by Nurlan S. Dairbekov, Semën Samsonovich Kutateladze and Alexei B. Sossinsky, with added material by Victor Zalgaller, L. A. Shor, and Yu. A. Volkov, was published as Convex Polyhedra by Springer-Verlag in 2005.
Topics
The main focus of the book is on the specification of geometric data that will determine uniquely the shape of a three-dimensional convex polyhedron, up to some class of geometric transformations such as congruence or similarity. It considers both bounded polyhedra (convex hulls of finite sets of points) and unbounded polyhedra (intersections of finitely many half-spaces).
The 1950 Russian edition of the book included 11 chapters. The first chapter covers the basic topological properties of polyhedra, including their topological equivalence to spheres (in the bounded case) and Euler's polyhedral formula. After a lemma of Augustin Cauchy on the impossibility of labeling the edges of a polyhedron by positive and negative signs so that each vertex has at least four sign changes, the remainder of chapter 2 outlines the content of the remaining book. Chapters 3 and 4 prove Alexandrov's uniqueness theorem, characterizing the surface geometry of polyhedra as being exactly the metric spaces that are topologically spherical and locally like the Euclidean plane except at a finite set of points of positive angular defect, obeying Descartes' theorem on total angular defect that the total angular defect should be 4π. Chapter 5 considers the metric spaces defined in the same way that are topologically a disk rather than a sphere, and studies the flexible polyhedral surfaces that result.
Chapters 6 through 8 of the book are related to a |
https://en.wikipedia.org/wiki/Brooklyn%20Bridge%20%28software%29 | The Brooklyn Bridge from White Crane Systems was a data transfer enabler. Although it came with some hardware, it was the software which was the basis of the product. It could also transform the data's format.
Overview
The New York Times described its category as being among "communications packages used to transfer files." In an era of 300 baud, Brooklyn Bridge operated at "115,200 baud", so that a transfer which "at 300 baud took 4 minutes and 36 seconds" needed only 5 seconds. Unlike some communications packages, this one retains a file's original version-date, so as not to alarm people when they seem to have what looks like an update when it is not.
Description
Once the software is installed, users comfortable with typing the word "COPY" can transfer files as readily as they could by sneakernet. An earlier review described it as "less cumbersome than conventional communications software". Requiring neither specialized hardware nor specialized communications software is ideal in an era when such transfers could otherwise only be done using online or other "outside" services.
See also
BLAST (protocol)
Kermit (protocol)
Zamzar
References
Communication software
Computer data
Data management
History of software |
https://en.wikipedia.org/wiki/Codablock | Codablock is a family of stacked 1D barcodes (Codablock A, Codablock F, Codablock 256) invented at Identcode Systeme GmbH in Germany in 1989 by Heinrich Oehlmann. Codablock barcodes are based on the stacked Code 39 and Code 128 symbologies and have some advantages of 2D barcodes.
The barcodes were used mostly in the health care industry (HIBC); presently, Codablock codes have been fully replaced by Data Matrix.
History
Codablock codes were developed from 1989 to 1995. Codablock A was invented in 1989 and standardized as an AIM standard in 1994. Codablock A was based on stacked Code 39 barcodes and was not widely used because of Code 39's restrictions.
The next variant, Codablock F, was based on the stacked Code 128 symbology and was standardized as an AIM standard in 1995. At this time, Codablock F is officially classed as a historical standard and is not recommended for use in new applications.
Codablock 256 was invented as an internal ICS Identcode-Systeme standard and was not standardized. It was also based on the stacked Code 128 symbology. Codablock 256 could encode the full 256-symbol ISO 8859-1 charset with the FNC4 character, and each line had error correction. Because it had issues with being read by Code 128 scanners, 8-bit charset encoding was added to the Codablock F standard and Codablock 256 was almost never used.
The Codablock family also played an important pioneering role in the advance of 2D codes, because it alone could be read reliably with only the slightest modification of the laser scanners used at the time.
Codablock types
Codablock symbologies were developed as stacked versions of the Code 39 and Code 128 barcodes and have some advantages of 2D barcodes. They utilize rectangular space more effectively than a 1D barcode and have additional checking characters to ensure the integrity of the overall message.
Codablock can be compared with a line break in a text editor. As soon as one line is full, the next is broken, whereby the line number is inserted into each line and the num |
https://en.wikipedia.org/wiki/Dos%20por%20Dos | Dos por Dos (transl: Two by two) is a Philippine national radio program. It was first aired over DZMM-AM from April 24, 2000, to May 5, 2020, and was simulcast on its sister cable television channel from its launch on March 12, 2007, until July 31, 2020. It resumed airing over DZRH and DZRH News Television on August 31, 2020.
Created by Angelo Palmones, the program is hosted by Anthony Taberna and Gerry Baja.
History
DZMM: 2000–2020
Dos por Dos was the brainchild of then-DZMM manager Angelo Palmones. The program name was derived from the Filipino slang term for the length and width of a wooden jack. Network promotional material touted its hard-hitting commentary format as being likened to the strength of the jack's striking when it is being utilized for construction.
Prior to becoming the show's hosts, Taberna and Baja were concurrently serving as DZMM Radyo Patrol field reporters and were already paired as co-anchors for the station's weekend current affairs show Ito ang Radyo Patrol, which they handled from 1999 to 2002, and the weekday early-morning newscast Gising Pilipinas from 2001 to 2007.
The program premiered on April 24, 2000, as a half-hour commentary program at the 7:30 am slot. Initially, the hosts tried to emulate the formal, high-brow and seriously-toned commentary programs of the time before retooling to exchanges of wacky banter interspersed between interview segments and field reports. This proved to be successful. The program expanded to an hour in 2006. Upon the launch of DZMM TeleRadyo in 2007, the show was expanded to the late afternoon slot as a lead-in to the radio station's simulcast of TV Patrol, a position it held until 2020.
In 2011, due to programming changes, most notably with Noli de Castro's Kabayan moving to the 6:00 am slot, the morning edition of Dos por Dos was moved to 5:00 am, more concentrating on early-morning field reports. With Gerry Baja's addition as co-host to ABS-CBN's morning show Umagang Kay Ganda in J |
https://en.wikipedia.org/wiki/C4%20model | The C4 model is a lean graphical notation technique for modelling the architecture of software systems. It is based on a structural decomposition of a system into containers and components and relies on existing modelling techniques such as the Unified Modelling Language (UML) or Entity Relation Diagrams (ERD) for the more detailed decomposition of the architectural building blocks.
History
The C4 model was created by the software architect Simon Brown between 2006 and 2011, building on the Unified Modelling Language (UML) and the 4+1 architectural view model. The launch of an official website under a Creative Commons license and an article published in 2018 popularised the emerging technique.
Overview
The C4 model documents the architecture of a software system, by showing multiple points of view that explain the decomposition of a system into containers and components, the relationship between these elements, and, where appropriate, the relation with its users.
The viewpoints are organized according to their hierarchical level:
Context diagrams (level 1): show the system in scope and its relationship with users and other systems;
Container diagrams (level 2): decompose a system into interrelated containers. A container represents an application or a data store;
Component diagrams (level 3): decompose containers into interrelated components, and relate the components to other containers or other systems;
Code diagrams (level 4): provide additional details about the design of the architectural elements that can be mapped to code. The C4 model relies at this level on existing notations such as Unified Modelling Language (UML), Entity Relation Diagrams (ERD) or diagrams generated by Integrated Development Environments (IDE).
For level 1 to 3, the C4 model uses 5 basic diagramming elements: persons, software systems, containers, components and relationships. The technique is not prescriptive for the layout, shape, colour and style of these elements. Instead, |
https://en.wikipedia.org/wiki/Intracellular%20bacteria | Intracellular bacteria are bacteria that have the capability to enter and survive within the cells of the host organism. Many of them are capable of growth extracellularly, but some of them can grow and reproduce only intracellularly (obligate intracellular parasites). Besides bacteria, there are other kinds of intracellular microorganisms.
Examples of non-obligate intracellular bacteria include members of the genera Brucella, Legionella, Listeria, and Mycobacterium. Examples of obligate intracellular bacteria include members of the order Rickettsiales and members of the genus Mycoplasma.
See also
Endosymbiont
References
Bacteria
Microbiology
Cells |
https://en.wikipedia.org/wiki/Cardinal%20and%20Ordinal%20Numbers | Cardinal and Ordinal Numbers is a book on transfinite numbers, by Polish mathematician Wacław Sierpiński. It was published in 1958 by Państwowe Wydawnictwo Naukowe, as volume 34 of the series Monografie Matematyczne of the Institute of Mathematics of the Polish Academy of Sciences. Sierpiński wrote on the same topic earlier, in his 1928 book Leçons sur les nombres transfinis, but his 1958 book on the topic was completely rewritten and significantly longer. A second edition of Cardinal and Ordinal Numbers was published in 1965.
Topics
After five introductory chapters on naive set theory and set-theoretic notation, and a sixth chapter on the axiom of choice, the book has four chapters on cardinal numbers, their arithmetic, and series and products of cardinal numbers, comprising approximately 50 pages. Following this, four longer chapters (totalling roughly 180 pages) cover orderings of sets, order types, well-orders, ordinal numbers, ordinal arithmetic, and the Burali-Forti paradox according to which the collection of all ordinal numbers cannot be a set. Three final chapters concern aleph numbers and the continuum hypothesis, statements equivalent to the axiom of choice, and consequences of the axiom of choice.
The second edition makes only minor changes to the first except for adding footnotes concerning two later developments in the area: the proof by Paul Cohen of the independence of the continuum hypothesis, and the construction by Robert M. Solovay of the Solovay model in which all sets of real numbers are Lebesgue measurable.
Audience and reception
Sierpiński was known for his significant contributions to the theory of transfinite numbers; reviewer Reuben Goodstein calls his book "a goldmine of results", and similarly Leonard Gillman writes that it is highly valuable "as a compendium of interesting mathematical information, presented with care and clarity". Both Gillman and John C. Oxtoby call the writing style "leisurely" and "unhurried", and although Gillm |
https://en.wikipedia.org/wiki/Neuropixels | Neuropixels probes (or "Neuropixels") are electrodes developed in 2017 to record the activity of hundreds of neurons in the brain. The probes are based on CMOS technology and have 1,000 recording sites arranged in two rows on a thin, 1-cm long shank.
The probes are used in hundreds of neuroscience laboratories including the International Brain Laboratory, to record brain activity mostly in mice and rats. By revealing the activity of vast numbers of neurons, Neuropixels probes are allowing new approaches to the study of brain processes such as sensory processing, decision making, internal state, and emotions and to create brain-machine interfaces.
The probes were announced in 2017. They are designed and fabricated by imec, an electronics research center in Belgium. In 2022, Neuropixels probes were inserted in human patients.
References
External links
UCL Neuropixels page
neuropixels.org
Neuroscience
Neural engineering |
https://en.wikipedia.org/wiki/LEAPER%20gene%20editing | LEAPER (Leveraging endogenous ADAR for programmable editing of RNA) is a genetic engineering technique in molecular biology by which RNA can be edited. The technique relies on engineered strands of RNA to recruit native ADAR enzymes to swap out different compounds in RNA. Developed by researchers at Peking University in 2019, the technique, some have claimed, is more efficient than the CRISPR gene editing technique. Initial studies have claimed that editing efficiencies of up to 80% can be achieved.
Synopsis
As opposed to DNA gene editing techniques (e.g., using CRISPR-Cas proteins to make modifications directly to a defective gene), LEAPER targets the messenger RNA (mRNA) transcribed from the same gene for editing before it is translated into a protein. Post-transcriptional RNA modification typically involves the strategy of converting adenosine to inosine (A-to-I), since inosine (I) demonstrably mimics guanosine (G) during translation into a protein. A-to-I editing is catalyzed by adenosine deaminase acting on RNA (ADAR) enzymes, whose substrates are double-stranded RNAs. Three human ADAR genes have been identified, with the ADAR1 (official symbol ADAR) and ADAR2 (ADARB1) proteins having characterized activity profiles. LEAPER achieves this targeted RNA editing through the use of short engineered ADAR-recruiting RNAs (arRNAs): guide RNAs between 100 and 150 nt in length, a range chosen for high editing efficiency, designed to recruit endogenous ADAR protein, with its several RNA binding domains (RBDs), to a target site; this contrasts with earlier systems in which an ADAR domain is fused with a peptide or a CRISPR-Cas13b protein to form a chimeric editing protein.
This results in a change in which protein is synthesized during translation.
History
The technique was discovered by a team of researchers at Peking University in Beijing, China. The discovery was announced in the journal Nature Biotechnology in July 2019.
Applications
Chinese researchers have utilized LEAPER to restore functional enzyme activity in cells from patients with Hurler syndrome. They have claimed that LEA |
https://en.wikipedia.org/wiki/Francis%20Allard | Francis Allard is a French academic, engineer and Distinguished Professor in Civil Engineering. Since February 2017, Allard has been professor emeritus at La Rochelle University and Chairman of Tipee (Building Innovation Platform). He has expertise in heat and mass transfer phenomena with applications in energy efficiency and indoor environment in buildings and urban microclimate.
Publications
Books
Allard, Francis. Natural ventilation in buildings. A design handbook. (1998). (Cited 796 times, according to Google Scholar.)
Allard F, Ghiaus C, editors. Natural ventilation in the urban environment: assessment and design. Routledge; 2012 Jun 25. (Cited 165 times, according to Google Scholar.)
Journal articles
Blondeau P, Iordache V, Poupard O, Genin D, Allard F. Relationship between outdoor and indoor air quality in eight French schools. Indoor air. 2005 Feb 1;15(1):2-12.(Cited 339 times, according to Google Scholar.)
Blondeau P, Spérandio M, Allard F. Night ventilation for building cooling in summer. Solar energy. 1997 Nov 1;61(5):327-35. (Cited 209 times, according to Google Scholar.)
Kurnitski J, Allard F, Braham D, Goeders G, Heiselberg P, Jagemar L, Kosonen R, Lebrun J, Mazzarella L, Railio J, Seppänen O. How to define nearly net zero energy buildings nZEB. Rehva Journal. 2011 May;48(3):6-12. (Cited 168 times, according to Google Scholar.)
Ghiaus C, Allard F, Santamouris M, Georgakis C, Nicol F. Urban environment influence on natural ventilation potential. Building and environment. 2006 Apr 1;41(4):395-406. (Cited 140 times, according to Google Scholar.)
References
Year of birth missing (living people)
Living people
Place of birth missing (living people)
French academics
French civil engineers |
https://en.wikipedia.org/wiki/Carbon%20Copy%20%28software%29 | Carbon Copy was "a remote control/communications program" with for-its-day advanced features for remote screen sharing, background file transfer, and "movable chat windows".
Overview
The New York Times described it thus: "you can sit at the console of either machine and call up the programs and files stored on the other". Computerworld called it "a package that mirrors every action a user takes on two connected PCs".
Part of its user base was acquired via inclusion as bonus software for a modem that could communicate at "300, 1200 and 2400 baud."
Carbon Copy's vendor, Meridian Technology, was acquired by Microcom in early 1988, and accepted tax credits to move software duplication and packaging of Carbon Copy to Puerto Rico. Meridian had a British subsidiary, also acquired by Microcom.
History
Computerworld covered the flow of features and newer releases: 3.0 in 1986, followed by further releases in 1987 and 1989. By 1991, although Version 5.2.2 was still actively marketed, Version 6.0 was released to coincide with the release of MS-DOS 5.0.
By 1994, DOS versions topped out at 6.0, and the 2.0 version of Carbon Copy Plus for Windows was available.
See also
BLAST (protocol)
Kermit (protocol)
References
History of software
History of telecommunications |
https://en.wikipedia.org/wiki/Convex%20Polytopes | Convex Polytopes is a graduate-level mathematics textbook about convex polytopes, higher-dimensional generalizations of three-dimensional convex polyhedra. It was written by Branko Grünbaum, with contributions from Victor Klee, Micha Perles, and G. C. Shephard, and published in 1967 by John Wiley & Sons. It went out of print in 1970. A second edition, prepared with the assistance of Volker Kaibel, Victor Klee, and Günter M. Ziegler, was published by Springer-Verlag in 2003, as volume 221 of their book series Graduate Texts in Mathematics.
Convex Polytopes was the winner of the 2005 Leroy P. Steele Prize for mathematical exposition, given by the American Mathematical Society. The Basic Library List Committee of the Mathematical Association of America has recommended its inclusion in undergraduate mathematics libraries.
Topics
The book has 19 chapters. After two chapters introducing background material in linear algebra, topology, and convex geometry, two more chapters provide basic definitions of polyhedra, in their two dual versions (intersections of half-spaces and convex hulls of finite point sets), introduce Schlegel diagrams, and provide some basic examples including the cyclic polytopes. Chapter 5 introduces Gale diagrams, and the next two chapters use them to study polytopes with a number of vertices only slightly higher than their dimension, and neighborly polytopes.
Chapters 8 through 11 study the numbers of faces of different dimensions in polytopes through Euler's polyhedral formula, the Dehn–Sommerville equations, and the extremal combinatorics of numbers of faces in polytopes. Chapter 11 connects the low-dimensional faces together into the skeleton of a polytope, and proves the van Kampen–Flores theorem about non-embeddability of skeletons into lower-dimensional spaces. Chapter 12 studies the question of when a skeleton uniquely determines the higher-dimensional combinatorial structure of its polytope. Chapter 13 provides a complete answer to this the |
https://en.wikipedia.org/wiki/AC%2000-69 | The Advisory Circular AC 00-69, Best Practices for Airborne Software Development Assurance Using EUROCAE ED-12( ) and RTCA DO-178( ), initially issued in 2017, supports application of the active revisions of ED-12C/DO-178C and AC 20-115. The AC does not state FAA guidance, but rather provides information in the form of "best practices" complementary to the objectives of ED-12C/DO-178C.
Notably, the guidance of FAA Order 8110.49 regarding "Software Change Impact Analysis" was removed in Revision A of that order in 2017. The best practices that AC 00-69 now describes for Software Change Impact Analysis are much reduced and less prescriptive than what was removed from Order 8110.49.
This AC clarifies that Data Coupling Analysis and Control Coupling Analysis are distinct activities and that both are required for satisfying objective A-7 (8) of ED-12C/DO-178C and ED-12B/DO-178B, adding that data and control coupling analyses rely upon detailed design specification of interfaces and dependencies between components.
The AC also recommends that error handling (how the software avoids, detects, and handles runtime error) should be defined in explicit, reviewed design specifications rather than implemented ad hoc in the source code.
References
External links
AC 00-69, Best Practices for Airborne Software Development Assurance Using EUROCAE ED-12( ) and RTCA DO-178( )
Avionics
Safety
Software requirements
RTCA standards
Computer standards |
https://en.wikipedia.org/wiki/Focused%20proof | In mathematical logic, focused proofs are a family of analytic proofs that arise through goal-directed proof-search, and are a topic of study in structural proof theory and reductive logic. They form the most general definition of goal-directed proof-search—in which someone chooses a formula and performs hereditary reductions until the result meets some condition. The extremal case where reduction only terminates when axioms are reached forms the sub-family of uniform proofs.
A sequent calculus is said to have the focusing property when focused proofs are complete for some terminating condition. For System LK, System LJ, and System LL, uniform proofs are focused proofs where all the atoms are assigned negative polarity. Many other sequent calculi have been shown to have the focusing property, notably the nested sequent calculi of both the classical and intuitionistic variants of the modal logics in the S5 cube.
Uniform proofs
In the sequent calculus for an intuitionistic logic, the uniform proofs can be characterised as those in which the upward reading performs all right rules before the left rules. Typically, uniform proofs are not complete for the logic, i.e., not all sequents or formulas admit a uniform proof, so one considers fragments where they are complete, e.g., the hereditary Harrop fragment of intuitionistic logic. Due to this deterministic behaviour, uniform proof-search has been used as the control mechanism defining the programming language paradigm of logic programming. Occasionally, uniform proof-search is implemented in a variant of the sequent calculus for the given logic where context management is automatic, thereby increasing the fragment for which one can define a logic programming language.
Focused proofs
The focusing principle was originally classified through the disambiguation between synchronous and asynchronous connectives in linear logic, i.e., connectives that interact with the context and those that do not, as a consequence of research on lo |
https://en.wikipedia.org/wiki/Gateway%20Energy%20Storage | Gateway Energy Storage is a large-scale lithium-ion battery, operated by grid infrastructure developer LS Power.
It has a storage capacity of 250 MWh, and it is located in Otay Mesa, California, on the outskirts of San Diego. It uses cells from LG Chem.
The purpose of the battery is to provide power during times of peak demand after being charged with solar power during the day.
References
Battery (electricity)
Power stations in California |
https://en.wikipedia.org/wiki/Observer%20%28meteorological%29 | A meteorological observer, or weather observer, is a person authorized by a weather authority to make or record meteorological observations. They are technicians who are responsible for the accurate observation, rapid measurement, timely collection, recording, and timely submission of meteorological parameters and information and various atmospheric phenomena to the Meteorological Center. Surface, upper air, radar, and satellite are all forms of weather observations.
Role
Meteorological observers play a key role in many flood, drought, environmental and water resources applications. Whilst rainfall observations are most widely used, other parameters of interest include air temperatures, humidity and wind speeds. The main measurement techniques include raingauges, weather stations and weather radar, with satellite precipitation estimates playing an important role in data-sparse regions.
METARs are generated by both government-owned and privately contracted facilities. The collection of weather data can be automated by machines, such as the AWOS, and the ASOS. These automated facilities help gather large amounts of data.
Surface observations provide specific and relevant information around an airport. The FAA recommends this guideline of information in a report (a decoding sketch follows the list):
type of report: interval update, or update for rapid changes
station identifier: 4 letter code
date and time: date is the first two digits; last four digits are the time in UTC
modifier: denotes report came from an automated source, or was manually corrected
wind: first three digits are direction (in tens of degrees), last two digits are wind speed measured in knots
visibility: reported in statute miles, runway visibility is also reported
weather phenomena: intensity, proximity, description
sky condition: amount, height, vertical visibility
temperature and dew point: measured in Celsius, "M" denotes negative temperature
altimeter setting: measured in inches of mercury (inHg)
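A small decoding sketch (Python) for the wind, temperature/dew point, and altimeter groups described above; the sample METAR string is illustrative:

```python
import re

sample = "METAR KLAX 131753Z 25008KT 10SM FEW015 19/13 A2992"

wind = re.search(r'\b(\d{3})(\d{2,3})KT\b', sample)
print(f"wind: {int(wind.group(1))} degrees at {int(wind.group(2))} knots")

temp = re.search(r'\b(M?\d{2})/(M?\d{2})\b', sample)
to_c = lambda s: -int(s[1:]) if s.startswith('M') else int(s)  # 'M' = minus
print(f"temperature/dew point: {to_c(temp.group(1))}/{to_c(temp.group(2))} C")

alt = re.search(r'\bA(\d{4})\b', sample)
print(f"altimeter: {int(alt.group(1)) / 100:.2f} inHg")  # 29.92 inHg
```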
The information can be obtaine |
https://en.wikipedia.org/wiki/Searchable%20symmetric%20encryption | Searchable symmetric encryption (SSE) is a form of encryption that allows one to efficiently search over a collection of encrypted documents or files without the ability to decrypt them. SSE can be used to outsource files to an untrusted cloud storage server without ever revealing the files in the clear but while preserving the server's ability to search over them.
Description
A searchable symmetric encryption scheme is a symmetric-key encryption scheme that encrypts a collection of documents D = (D_1, ..., D_n), where each document D_i is viewed as a set of keywords from a keyword space W. Given the encryption key K and a keyword w, one can generate a search token tk with which the encrypted data collection can be searched for w. The result of the search is the subset of encrypted documents that contain the keyword w.
Static SSE
A static SSE scheme consists of three algorithms (Setup, Token, Search) that work as follows:
Setup takes as input a security parameter k and a document collection D and outputs a symmetric key K, an encrypted index I, and an encrypted document collection ED
Token takes as input the secret key K and a keyword w and outputs a search token tk
Search takes as input the encrypted index I, the encrypted document collection ED and a search token tk and outputs a set of encrypted documents
A static SSE scheme is used by a client and an untrusted server as follows. The client encrypts its data collection using the Setup algorithm, which returns a secret key K, an encrypted index I, and an encrypted document collection ED. The client keeps K secret and sends I and ED to the untrusted server. To search for a keyword w, the client runs the Token algorithm on K and w to generate a search token tk, which it sends to the server. The server runs Search with I, ED, and tk and returns the resulting encrypted documents back to the client.
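A toy illustration of the static scheme (Python), using an HMAC of each keyword as the search token over an inverted index; it is deliberately simplified and insecure, and real constructions also encrypt the documents and hide the index structure and result sizes:

```python
import hmac, hashlib, secrets

def setup(documents):
    """Setup: generate the key K and build the encrypted index I.

    documents maps a document id to its keyword set; the encryption of the
    documents themselves (ED) is omitted from this sketch for brevity.
    """
    key = secrets.token_bytes(32)
    index = {}
    for doc_id, words in documents.items():
        for w in words:
            label = hmac.new(key, w.encode(), hashlib.sha256).hexdigest()
            index.setdefault(label, set()).add(doc_id)
    return key, index

def token(key, keyword):
    """Token: deterministic search token tk for a keyword."""
    return hmac.new(key, keyword.encode(), hashlib.sha256).hexdigest()

def search(index, tk):
    """Search: run by the server, which sees only tk, never the keyword."""
    return index.get(tk, set())

docs = {1: {'cloud', 'storage'}, 2: {'cloud', 'search'}, 3: {'search'}}
K, I = setup(docs)
print(search(I, token(K, 'search')))  # {2, 3}
```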
Dynamic SSE
A dynamic SSE scheme supports, in addition to search, the insertion and deletion of documents. A dynamic SSE scheme consists of seven algorithms (Setup, Token, Search, AddToken, Add, DeleteToken, Delete), where Setup, Token, and Search are as in the static case and the remaining algorithms work as follows |
https://en.wikipedia.org/wiki/Zscaler | Zscaler () is a cloud security company, with headquarters in San Jose, California. The company offers enterprise cloud security services.
History
Zscaler was founded in 2007 by Jay Chaudhry and K. Kailash. In August 2012, Zscaler secured $38 million in funding from strategic investors. In March 2018, the company had an initial public offering in which it raised $192 million. The company is traded on the Nasdaq using the symbol ZS. Zscaler stock was added to the Nasdaq 100 on December 17, 2021.
References
External links
Computer security companies specializing in botnets
Computer forensics
Content-control software
Companies based in San Jose, California
Software companies based in the San Francisco Bay Area
American companies established in 2007
Cloud computing providers
2007 establishments in California
Software companies established in 2007
Computer security companies
2018 initial public offerings
Companies listed on the Nasdaq
Software companies of the United States |
https://en.wikipedia.org/wiki/Statactivism | The French movement of statactivism advocates for the mobilization of statistics in support of social movements and agendas.
Content
The program of French statactivists is to ‘fight against’ as well as ‘fight with’ numbers, using a variety of possible strategies:
‘Statistical judo’. This is a strategy of self-defence, whereby existing measures are ‘gamed’ as described by Goodhart's law;
Denouncing the inadequacy or bias or unfairness of existing indicators and measures, e.g. from official statistics of poverty or inequality;
Developing alternative indicators to substitute for those above;
Identifying social contexts and problems which are invisible to existing statistics
Statactivism intellectually belongs to the tradition of the sociology of numbers.
Following Alain Desrosières and Theodore Porter, statactivists use statistics as a “tool of weakness”, which offers the weak members of society the opportunity to act against their oppression by making injustice visible.
See also
Sociology of quantification
Ethics of quantification
Society for the Social Studies of Quantification - SSSQ
References
Statistical organizations
Quantification (science)
Mathematical and quantitative methods (economics) |
https://en.wikipedia.org/wiki/Maria%20Watkins | Marja "Maria" Ludwika Watkins (2 December 1918 – 2 September 2010) was a defence electronics engineer, lecturer and President of the Women's Engineering Society.
Early life
Ziff was born on 2 December 1918 in Vienna, Austria, and grew up in the Polish city of Lvov. Her parents were of Ukrainian descent; her father was director of a bank, and her mother was a research chemist at Lvov University.
Education
In 1938, Ziff applied to study electrical engineering at the University of Edinburgh. She was accepted and moved to Scotland, surprising the professor who had offered her a place, as he had believed her application was from a Polish man. She became the first woman to study electrical engineering there. She joined the Women's Engineering Society on her arrival in the UK in 1939. In 1941, she graduated from the University of Edinburgh with a degree in Electrical Engineering (Communications). As the situation worsened in Europe, her family refused to join her. Following the invasion of Poland by Russia and Germany, her parents and grandparents died in the concentration camps, only her sister surviving. She never felt able to return to Poland.
Career
In 1942, Ziff became a technical assistant at Johnson and Phillips Ltd. The company made cabling and navigation items for aircraft, and she worked on technical problems of their distribution systems. Her job was varied, including working as a research assistant for new airplane guidance systems, to supervising the repair of overhead power cables shot down by drunken soldiers or repairing electrical exchanges damaged by bombings. She was one of the assistants to Jules Thorn, the founder of Thorn Electrical Industries, one of the United Kingdom's largest electrical businesses. She lived in Blackheath, London during the latter part of the Second World War, volunteering as an air raid warden in the evenings. During this time she was working on research for the PLUTO Pipeline Under The Ocean project and on a secret airplane |
https://en.wikipedia.org/wiki/Matrix%202%20of%205 | Matrix 2 of 5 (also known as Code 2 of 5 Matrix) is a variable-length, discrete, two-width symbology. Matrix 2 of 5 is a subset of the two-out-of-five code family. Unlike the Industrial 2 of 5 code, Matrix 2 of 5 can encode data not only with black bars but also with white spaces.
Matrix 2 of 5 was developed in the 1970s by Nieaf Co. in the Netherlands and was commonly used for warehouse sorting, photo finishing, and airline ticket marking.
Matrix 2 of 5 can encode only the digits 0–9 and can include an optional check digit. Most barcode readers support this symbology.
Encoding
Matrix 2 of 5 is a subset of the two-out-of-five code family and uses wide and narrow elements for encoding. Unlike the previously developed Industrial 2 of 5, it uses both black bars and white spaces for data encoding. However, it has lower density than the Interleaved 2 of 5 code, because it is a discrete symbology and requires additional space between data patterns. Its main advantage over Interleaved 2 of 5 is the ability to encode an odd number of characters in a message.
Matrix 2 of 5 encodes only digits from 0 to 9, using three black bars and two white spaces per digit, with each data pattern separated by an additional white space. Matrix 2 of 5 can include an optional checksum character, which is added to the end of the barcode.
Matrix 2 of 5 features:
the character set is numeric (0–9);
moderate encoding density: barcodes are about 11% longer than in the Interleaved 2 of 5 symbology and about 82% longer than in Code 128;
variable symbol length;
can include an optional check character.
The first four bars and spaces of a pattern have weights that encode the value of the symbol (except zero). The last black bar is used as a parity bit to protect against single errors. The value of a symbol is the sum of the nonzero weights of the first four pattern elements.
N - narrow black bar or white space.
W - wide black bar or white space.
The ratio of the narrow element width to the wide element width can range from 1/3 to 2/5.
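To make the weighting scheme concrete, here is a minimal sketch that decodes one five-element pattern; it assumes the classic two-out-of-five weight assignment (1, 2, 4, 7 on the first four elements, with a weight sum of 11 read as 0) and exactly two wide elements per digit, and the helper name decode_pattern is illustrative rather than part of any specification.

    WEIGHTS = [1, 2, 4, 7]  # assumed classic two-out-of-five weights; the fifth element acts as the parity bar

    def decode_pattern(pattern):
        """Decode a 5-element pattern such as "WNNNW" (W = wide, N = narrow) to a digit."""
        if len(pattern) != 5 or pattern.count("W") != 2:
            raise ValueError("exactly two of the five elements must be wide")
        value = sum(w for w, e in zip(WEIGHTS, pattern[:4]) if e == "W")
        return 0 if value == 11 else value

    print(decode_pattern("WNNNW"))  # 1: only the first weighted element is wide
    print(decode_pattern("NNWWN"))  # weights 4 + 7 = 11, which encodes 0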
The barcode has the following physical structure:
1. Start character
2. Variable le |
https://en.wikipedia.org/wiki/Spermosphere | In plant science, the spermosphere is the zone in the soil surrounding a germinating seed. This is a small volume with radius perhaps 1 cm but varying with seed type, the variety of soil microorganisms, the level of soil moisture, and other factors. Within the spermosphere a range of complex interactions take place among the germinating seed, the soil, and the microbiome. Because germination is a brief process, the spermosphere is transient, but the impact of the microbial activity within the spermosphere can have strong and long-lasting effects on the developing plant.
Seeds exude various molecules that influence their surrounding microbial communities, either inhibiting or stimulating their growth. The composition of the exudates varies according to the plant type and such properties of the soil as its pH and moisture content. With these biochemical effects, the spermosphere develops both downward—to form the rhizosphere (upon the emergence of the plant's radicle)—and upward to form the laimosphere, which is the soil surrounding the growing plant stem.
References
Soil biology
Plant roots
Environmental soil science |
https://en.wikipedia.org/wiki/Meshulam%27s%20game | In graph theory, Meshulam's game is a game used to explain a theorem of Roy Meshulam related to the homological connectivity of the independence complex of a graph, which is the smallest index k such that all reduced homology groups up to and including k are trivial. The formulation of this theorem as a game is due to Aharoni, Berger and Ziv.
Description
The game-board is a graph G. It is a zero-sum game for two players, CON and NON. CON wants to show that I(G), the independence complex of G, has a high connectivity; NON wants to prove the opposite.
On each turn, CON chooses an edge e from the remaining graph. NON then chooses one of two options:
Disconnection – remove the edge e from the graph.
Explosion – remove both endpoints of e, together with all their neighbors and the edges incident to them.
The score of CON is defined as follows:
If at some point the remaining graph has an isolated vertex, the score is infinity;
Otherwise, at some point the remaining graph contains no vertex; in that case the score is the number of explosions.
For every given graph G, the game value on G (i.e., the score of CON when both sides play optimally) is denoted by Ψ(G).
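For very small graphs, the game value can be computed directly by minimax search over CON's edge offers and NON's two responses. The sketch below is a brute-force illustration of the definition (exponential time, toy graphs only); the function name psi is an assumption.

    def psi(vertices, edges):
        """Game value of Meshulam's game by exhaustive minimax (illustrative only)."""
        def value(vs, es):
            if any(all(v not in e for e in es) for v in vs):
                return float("inf")      # an isolated vertex remains: CON scores infinity
            if not vs:
                return 0                 # no vertices remain: no further explosions needed
            best = 0
            for e in es:                 # CON maximizes over edge offers
                disconnect = value(vs, es - {e})
                doomed = set(e) | {x for f in es if f & e for x in f}
                explode = 1 + value(vs - doomed,
                                    frozenset(f for f in es if not (f & doomed)))
                best = max(best, min(disconnect, explode))  # NON minimizes
            return best
        return value(frozenset(vertices), frozenset(frozenset(e) for e in edges))

    print(psi({1, 2}, [(1, 2)]))         # 1: a single clique falls to one explosion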
Game value and homological connectivity
Meshulam proved that, for every graph G: η_H(I(G)) ≥ Ψ(G), where η_H(I(G)) is the homological connectivity of I(G) plus 2.
Examples
If G is the empty graph, then Ψ(G) = 0, since no explosions are needed.
If G has k connected components, then Ψ(G) ≥ k. Regardless of the order in which CON offers edges, each explosion made by NON destroys vertices in a single component, so NON needs at least k explosions to destroy all vertices.
If G is a union of k vertex-disjoint cliques, each of which contains at least two vertices, then Ψ(G) = k, since each explosion completely destroys a single clique.
If G has an independence domination number of at least k, then Ψ(G) ≥ k. Proof: Let A be an independent set with domination number at least k. CON starts by offering all edges (a,b) where a is in A. If NON d |
https://en.wikipedia.org/wiki/Geopositioning | Geopositioning is the process of determining or estimating the geographic position of an object.
Geopositioning yields a set of geographic coordinates (such as latitude and longitude) in a given map datum; positions may also be expressed as a bearing and range from a known landmark.
In turn, positions can determine a meaningful location, such as a street address.
Specific instances include: animal geotracking, the process of inferring the location of animals over time; positioning system, the mechanisms for the determination of geographic positions in general; internet geolocation, geolocating a device connected to the internet; and mobile phone tracking.
Background
Geopositioning uses various visual and electronic methods including position lines and position circles, celestial navigation, radio navigation, and the use of satellite navigation systems.
The calculation requires measurements or observations of distances or angles to reference points whose positions are known. In 2D surveys, observations of three reference points are enough to compute a position in a two-dimensional plane. In practice, observations are subject to errors resulting from various physical and atmospheric factors that influence the measurement of distances and angles.
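As a toy illustration of a two-dimensional fix from angle observations, the sketch below intersects two lines of position given by compass bearings (degrees clockwise from north) to two known landmarks; the coordinate convention (east, north) and the function name fix_from_bearings are assumptions for illustration.

    import numpy as np

    def fix_from_bearings(p1, b1, p2, b2):
        """Intersect two lines of position from bearings b1, b2 taken to landmarks p1, p2."""
        d1 = np.array([np.sin(np.radians(b1)), np.cos(np.radians(b1))])
        d2 = np.array([np.sin(np.radians(b2)), np.cos(np.radians(b2))])
        # The observer x satisfies x + t1*d1 = p1 and x + t2*d2 = p2; eliminate x:
        t = np.linalg.solve(np.column_stack([d1, -d2]), np.asarray(p1) - np.asarray(p2))
        return np.asarray(p1) - t[0] * d1

    # A landmark due north (bearing 000) and one due east (bearing 090) place the observer at the origin:
    print(fix_from_bearings((0.0, 5.0), 0.0, (5.0, 0.0), 90.0))  # approximately [0. 0.]

In practice each bearing carries error, so with three or more lines of position one would solve the overdetermined system in a least-squares sense rather than intersecting the lines exactly.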
A practical example of obtaining a position fix would be for a ship to take bearing measurements on three lighthouses positioned along the coast. These measurements could be made visually using a hand bearing compass, or in poor visibility, electronically using radar or radio direction finding. Since all physical observations are subject to errors, the resulting position fix is also subject to inaccuracy. Although in theory two lines of position (LOP) are enough to define a point, in practice 'crossing' more LOPs provides greater accuracy and confidence, especially if the lines cross at a good angle to each other. Three LOPs are considered the minimum for a practical navigational fix. The three LOPs when drawn on the char |
https://en.wikipedia.org/wiki/Glossary%20of%20microelectronics%20manufacturing%20terms | Glossary of microelectronics manufacturing terms
This is a list of terms used in the manufacture of electronic micro-components. Many of the terms are already defined and explained in Wikipedia; this glossary is for looking up, comparing, and reviewing the terms.
2.5D integration – an advanced integrated circuit packaging technology that bonds dies and/or chiplets onto an interposer for enclosure within a single package
3D integration – an advanced semiconductor technology that incorporates multiple layers of circuitry into a single chip, integrated both vertically and horizontally
3D-IC (also 3DIC or 3D IC) – Three-dimensional integrated circuit; an integrated circuit built with 3D integration
advanced packaging – the aggregation and interconnection of components before traditional packaging
ALD – see atomic layer deposition
atomic layer deposition (ALD) – chemical vapor deposition process by which very thin films of a controlled composition are grown
back end of line (BEoL) – wafer processing steps from the creation of metal interconnect layers through the final etching step that creates pad openings (see also front end of line, far back end of line, post-fab)
BEoL – see back end of line
bonding – any of several technologies that attach one electronic circuit or component to another; see wire bonding, thermocompression bonding, flip chip, hybrid bonding, etc.
breadboard – a construction base for prototyping of electronics
bumping – the formation of microbumps on the surface of an electronic circuit in preparation for flip chip assembly
carrier wafer – a wafer that is attached to dies, chiplets, or another wafer during intermediate steps, but is not a part of the finished device
chip – an integrated circuit; may refer to either a bare die or a packaged device
chip carrier – a package built to contain an integrated circuit
chiplet – a small |
https://en.wikipedia.org/wiki/Perspectiva%20corporum%20regularium | Perspectiva corporum regularium (from Latin: Perspective of the Regular Solids) is a book of perspective drawings of polyhedra by German Renaissance goldsmith Wenzel Jamnitzer, with engravings by Jost Amman, published in 1568.
Despite its Latin title, the book is written mainly in German. It was "the most lavish of the perspective books published in Germany in the late sixteenth century" and was included in several royal art collections. It may have been the first work to depict chiral icosahedral symmetry.
Topics
The book focuses on the five Platonic solids, with the subtitles of its title page citing Plato's Timaeus and Euclid's Elements for their history. Each of these five shapes has a chapter, whose title page relates the connection of its polyhedron to the classical elements in medieval cosmology: fire for the tetrahedron, earth for the cube, air for the octahedron, and water for the icosahedron, with the dodecahedron representing the heavens, its 12 faces corresponding to the 12 symbols of the zodiac. Each chapter includes four engravings of polyhedra, each showing six variations of the shape including some of their stellations and truncations, for a total of 120 polyhedra. This great amount of variation, some of which obscures the original Platonic form of each polyhedron, demonstrates the theory of the time that all the variation seen in the physical world comes from the combination of these basic elements.
Following these chapters, additional engravings depict additional polyhedral forms, including polyhedral compounds such as the stella octangula, polyhedral variations of spheres and cones, and outlined skeletons of polyhedra following those drawn by Leonardo da Vinci for Luca Pacioli's earlier book Divina proportione. In this part of the book, the shapes are arranged in a three-dimensional setting and often placed on smaller polyhedral pedestals.
Creation process
The roughly 50 engravings for the book were made by Jost Amman, a German woodcut artist, based on drawings by Jam |
https://en.wikipedia.org/wiki/Plant%20microbiome | The plant microbiome, also known as the phytomicrobiome, plays roles in plant health and productivity and has received significant attention in recent years. The microbiome has been defined as "a characteristic microbial community occupying a reasonably well-defined habitat which has distinct physio-chemical properties. The term thus not only refers to the microorganisms involved but also encompasses their theatre of activity".
Plants live in association with diverse microbial consortia. These microbes, referred to as the plant's microbiota, live both inside (the endosphere) and outside (the episphere) of plant tissues, and play important roles in the ecology and physiology of plants. "The core plant microbiome is thought to comprise keystone microbial taxa that are important for plant fitness and established through evolutionary mechanisms of selection and enrichment of microbial taxa containing essential functions genes for the fitness of the plant holobiont."
Plant microbiomes are shaped by both factors related to the plant itself, such as genotype, organ, species and health status, as well as factors related to the plant's environment, such as management, land use and climate. The health status of a plant has been reported in some studies to be reflected by or linked to its microbiome.
Overview
The study of the association of plants with microorganisms precedes that of the animal and human microbiomes, notably the roles of microbes in nitrogen and phosphorus uptake. The most notable examples are plant root-arbuscular mycorrhizal (AM) and legume-rhizobial symbioses, both of which greatly influence the ability of roots to uptake various nutrients from the soil. Some of these microbes cannot survive in the absence of the plant host (obligate symbionts include viruses and some bacteria and fungi), which provides space, oxygen, proteins, and carbohydrates to the microorganisms. The association of AM fungi with plants has been known since 1842, and over 80% of lan |
https://en.wikipedia.org/wiki/RNA%20therapeutics | RNA therapeutics are a new class of medications based on ribonucleic acid (RNA). Researchers have been working toward clinical use since the 1990s, with significant success in cancer therapy in the early 2010s. In 2020 and 2021, mRNA vaccines were developed globally for use in combating the coronavirus disease 2019 (COVID-19) pandemic. The Pfizer–BioNTech COVID-19 vaccine was the first mRNA vaccine approved by a medicines regulator, followed by the Moderna COVID-19 vaccine, and others.
The main types of RNA therapeutics are those based on messenger RNA (mRNA), antisense RNA (asRNA), RNA interference (RNAi), and RNA aptamers. Of the four types, mRNA-based therapy is the only one that triggers the synthesis of proteins within cells, making it particularly useful in vaccine development. Antisense RNA is complementary to coding mRNA and is used to trigger mRNA inactivation, preventing the mRNA from being used in protein translation. RNAi-based systems use a similar mechanism, and involve the use of both small interfering RNA (siRNA) and micro RNA (miRNA) to prevent mRNA translation and/or degrade mRNA. In contrast, RNA aptamers are short, single-stranded RNA molecules produced by directed evolution to bind a variety of biomolecular targets with high affinity, thereby affecting their normal in vivo activity.
RNA is synthesized from template DNA by RNA polymerase, with messenger RNA (mRNA) serving as the intermediary biomolecule between DNA expression and protein translation. Because of its unique properties (such as its typically single-stranded nature and its 2' OH group) and its ability to adopt many different secondary and tertiary structures, both coding and noncoding RNAs have attracted attention in medicine. Research has begun to explore RNA's potential for therapeutic benefit, and unique challenges have arisen during drug discovery and implementation of RNA therapeutics.
mRNA
Messenger RNA (mRNA) is a single-stranded RNA molecule that is complementary |
https://en.wikipedia.org/wiki/Checkmarx | Checkmarx is an enterprise application security company headquartered in Atlanta, Georgia in the United States. Founded in 2006, the company provides application security testing (AST) solutions that embed security into every phase of the software development lifecycle (SDLC), an approach to software testing known as "shift everywhere."
History
Checkmarx was founded in 2006 by Maty Siman, the company's CTO, and Emmanuel Benzaquen, former CEO (2006–2023), and has over 900 employees. Sandeep Johri has been serving as CEO since February 2023. The application security platform was designed for CISOs, AppSec managers, security advisors, and software developers.
On July 17, 2017, Checkmarx acquired Codebashing and started offering it as a service to help developers learn secure coding practices with gamified modules in their chosen programming language. In 2018, it also acquired Custodela, a company that provides software security program development as well as consulting services.
Checkmarx was acquired in April 2020 by Hellman & Friedman, a private equity firm with headquarters in San Francisco.
In August 2021, Checkmarx acquired Dustico, a software that detects backdoors and malicious attacks in the software supply chain.
In 2021, the company launched Checkmarx One, a cloud-native Enterprise Application Security platform, which became its most known product. It offers enterprises a full suite of application security testing tools to enable DevSecOps, including static application security testing (SAST), dynamic application security testing (DAST), Software Composition Analysis (SCA), supply chain security (SCS), API security, container security, infrastructure as code security (KICS), as well as CheckMarx Codebashing.
Checkmarx One also offers Checkmarx Fusion, a scan correlation engine (83% of scans are currently cross-correlated in Checkmarx One deployments) and CheckAI.
In January 2022, the company launched AppSec Program Maturity Assessment (APMA |
https://en.wikipedia.org/wiki/Split%20and%20pool%20synthesis | The split and pool (split-mix) synthesis is a method in combinatorial chemistry that can be used to prepare combinatorial compound libraries. It is a stepwise, highly efficient process realized in repeated cycles. The procedure makes it possible to prepare millions or even trillions of compounds as mixtures that can be used in drug research.
History
Traditionally, most organic compounds are synthesized one by one, coupling building blocks together one after the other in a stepwise manner. Before 1982, nobody contemplated making hundreds or thousands of compounds in a single process, let alone millions or trillions, so the productivity of the split and pool method, invented by Prof. Á. Furka (Eötvös Loránd University, Budapest, Hungary) in 1982, seemed incredible at first sight. The method was described in a document notarized in the same year; the document was written in Hungarian and later translated into English.
The motivations that led to the invention are described in a 2002 paper, and the method was first presented at international congresses in 1988 and then published in print in 1991.
The split and pool synthesis and its features
The split and pool synthesis (S&P synthesis) differs from traditional synthetic methods. The important novelty is the use of compound mixtures in the process, which is the reason for its unprecedentedly high productivity. Using the method, a single chemist can make more compounds in a week than all chemists have produced in the whole history of chemistry.
The S&P synthesis is applied in a stepwise manner by repeating three operations in each step of the process (a minimal simulation follows the list):
Dividing a compound mixture into equal portions
Coupling one different building block (BB) to each portion
Pooling and thoroughly mixing the portions
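A tiny simulation makes the combinatorial growth evident: starting from empty beads, each cycle splits the pool, couples one building block per portion, and re-pools, so n cycles over b building blocks yield b^n sequences. The function name split_and_pool is an assumption for illustration.

    def split_and_pool(building_blocks, n_cycles):
        """Simulate S&P cycles: split the pool, couple one BB per portion, then pool."""
        pool = [""]                                   # beads start with no residues attached
        for _ in range(n_cycles):
            portions = [[seq + bb for seq in pool] for bb in building_blocks]  # split + couple
            pool = [seq for portion in portions for seq in portion]            # pool and mix
        return pool

    library = split_and_pool(["A", "B", "C"], 2)
    print(len(library))                               # 9 = 3**2 distinct dimers after two cycles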
The original method is based on the solid-phase synthesis of Merrifield. The procedure is illustrated in the figure by the flow diagram of a two-cycle synthesis using the same three BBs in b |
https://en.wikipedia.org/wiki/Uthapuram%20caste%20wall | The Uthapuram caste wall, known by various names such as the "wall of shame" and the "wall of untouchability", is a 12 ft high, 600-metre-long wall built by dominant-caste villagers reportedly to segregate the Dalit population in the village of Uthapuram in Tamil Nadu. The village witnessed violence between Dalits and the dominant castes in 1948, 1964 and 1989 and was also known for its caste-based discrimination.
Protests campaigning to demolish the wall started in 2008, led mostly by the Communist Party of India (Marxist) and left-wing organizations. Later, a small portion of the wall was demolished by the government to allow the Dalits access to the main road. Many dominant-caste villagers left the village and moved 3 km away with their belongings, reportedly in protest at the demolition of the wall.
Seventy houses belonging to Dalits were attacked in October 2008, reportedly in retaliation for the demolition of the wall, and a Dalit man was shot dead by the police. Tensions continued until 2015, when, during a clash between the communities, several vehicles were set on fire and many people were hospitalized.
Background
Caste divisions and clashes
The village of Uthapuram in the Madurai district has two major castes: the dominant Pillai caste and the Pallar caste. The village was known for its caste tensions, and there were violent conflicts between the castes in 1948, 1964 and 1989.
Caste discrimination
The dominant-caste villagers reportedly blocked attempts by the Dalits to build a bus stop and raised a parapet close to the bus stop to discourage the Dalits from sitting in front of them. The tea shops managed by caste Hindus are not visited by the Dalits. The Dalits are not permitted to enter dominant-caste streets, are refused space in the community halls and village squares, and were also denied entry to burial sites.
The wall
The wall, which was 600 metres long and 12 ft high, was variously described as a caste |
https://en.wikipedia.org/wiki/Hewlett-Packard%20Nanoprocessor | The Hewlett-Packard Nanoprocessor (part number 1820-1692) was a small control-oriented microcontroller with neither an ALU nor the ability to add or subtract. It was released in 1974 by HP and used in many HP products. It was packaged in a 40-pin ceramic DIP that dissipated less than one watt.
Description
The Nanoprocessor is an 8-bit control-oriented CPU built from nMOS logic. It has an 11-bit address bus that can directly address 2048 bytes of program ROM, expandable to 512 KB with bank switching.
The processor has sixteen 8-bit registers and an 8-bit accumulator. A 1-bit Extend register (E) acts as a carry flag. As well as the 11-bit program counter (PC), it has an 11-bit subroutine return register (SRR) and 11-bit Interrupt Return Register (IRR), each acting as a single-level stack.
In place of an arithmetic logic unit, it has a Control Logic Unit (CLU) and a magnitude comparator.
For input/output, the Nanoprocessor has 7 bidirectional control lines as well as 15 input and 15 output ports for 8-bit data transfers.
Code for the Nanoprocessor was written in assembly language using an assembler and loader that ran on an HP 2100 computer.
References
HP microprocessors
Microcontrollers |
https://en.wikipedia.org/wiki/Entangled%20Life | Entangled Life: How fungi make our worlds, change our minds and shape our futures is a 2020 non-fiction book on mycology by British biologist Merlin Sheldrake. His first book, it was published by Random House on 12 May 2020.
Summary
The book looks at fungi from a number of angles, including decomposition, fermentation, nutrient distribution, psilocybin production, the evolutionary role fungi play in plants, and the ways in which humans relate to the fungal kingdom. It uses music and philosophy to illustrate its thesis, and introduces readers to a number of central strands of research on mycology. It is also a personal account of Sheldrake's experiences with fungi.
Sheldrake is an expert in mycorrhizal fungi, holds a PhD in tropical ecology from the University of Cambridge for his work on underground fungal networks in tropical forests in Panama, where he was a predoctoral research fellow of the Smithsonian Tropical Research Institute, and his research is primarily in the fields of fungal biology and the history of Amazonian ethnobotany. He is the son of Rupert Sheldrake, a biologist, and Jill Purce, an author and therapist, and the brother of musician Cosmo Sheldrake.
Reception
The book was published to largely positive reviews. Jennifer Szalai of The New York Times called the book an "ebullient and ambitious exploration" of fungi, adding, "reading it left me not just moved but altered, eager to disseminate its message of what fungi can do." Eugenia Bone of The Wall Street Journal called it "a gorgeous book of literary nature writing in the tradition of [Robert] Macfarlane and John Fowles, ripe with insight and erudition." Rachel Cooke of The Observer called it "an astonishing book that could alter our perceptions of fungi forever." Richard Kerridge, reviewing the book in The Guardian, wrote that "when we look closely [at fungi], we meet large, unsettling questions... [Sheldrake] carries us easily into these questions with ebullience and precision."
The book was |
https://en.wikipedia.org/wiki/Eberhard%27s%20theorem | In mathematics, and more particularly in polyhedral combinatorics, Eberhard's theorem partially characterizes the multisets of polygons that can form the faces of simple convex polyhedra. It states that, for given numbers of triangles, quadrilaterals, pentagons, heptagons, and other polygons other than hexagons,
there exists a convex polyhedron with those given numbers of faces of each type (and an unspecified number of hexagonal faces) if and only if those numbers of polygons obey a linear equation derived from Euler's polyhedral formula.
The theorem is named after Victor Eberhard, a blind German mathematician, who published it in 1888 in his habilitation thesis and in expanded form in an 1891 book on polyhedra.
Definitions and statement
For an arbitrary convex polyhedron, one can define numbers p_3, p_4, p_5, etc., where p_i counts the faces of the polyhedron that have exactly i sides. A three-dimensional convex polyhedron is defined to be simple when every vertex of the polyhedron is incident to exactly three edges. In a simple polyhedron, every vertex is incident to three angles of faces, and every edge is incident to two sides of faces. Since the numbers of angles and sides of the faces are given, one can calculate the three numbers V (the total number of vertices), E (the total number of edges), and F (the total number of faces), by summing over all faces and multiplying by an appropriate factor:
V = (1/3) Σ_i i·p_i, E = (1/2) Σ_i i·p_i, and F = Σ_i p_i.
Plugging these values into Euler's polyhedral formula V − E + F = 2 and clearing denominators leads to the equation
Σ_{i ≥ 3} (6 − i)·p_i = 12,
which must be satisfied by the face counts of every simple polyhedron. However, this equation is not affected by the value of p_6 (as its multiplier 6 − 6 is zero), and, for some choices of the other face counts, changing p_6 can change whether or not a polyhedron with those face counts exists. That is, obeying this equation on the face counts is a necessary condition for the existence of a polyhedron, but not a sufficient condition, and a complete characterization of which face counts a |
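The necessary condition is easy to check mechanically. The sketch below evaluates the left-hand side of the equation for a multiset of face counts; the function name eberhard_lhs is an assumption for illustration.

    def eberhard_lhs(face_counts):
        """Sum (6 - i) * p_i over face sizes i; equals 12 for every simple convex polyhedron."""
        return sum((6 - i) * p for i, p in face_counts.items())

    print(eberhard_lhs({5: 12}))        # 12: the regular dodecahedron (twelve pentagons)
    print(eberhard_lhs({4: 6, 6: 8}))   # 12: the truncated octahedron; its hexagons contribute nothing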
https://en.wikipedia.org/wiki/Ridge%20function | In mathematics, a ridge function is any function f : R^d → R that can be written as the composition of a univariate function with an affine transformation, that is: f(x) = g(a · x) for some g : R → R and a ∈ R^d.
Coinage of the term 'ridge function' is often attributed to B.F. Logan and L.A. Shepp.
Relevance
A ridge function is not susceptible to the curse of dimensionality, making it an instrumental tool in various estimation problems. This is a direct result of the fact that ridge functions are constant in d − 1 directions:
Let a_1, ..., a_{d−1} be linearly independent vectors that are orthogonal to a, so that these vectors span d − 1 dimensions.
Then
f(x + c_1 a_1 + ... + c_{d−1} a_{d−1}) = g(a · x) = f(x)
for all scalars c_1, ..., c_{d−1}.
In other words, any shift of x in a direction perpendicular to a does not change the value of f.
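The constancy in the orthogonal directions is easy to verify numerically; the sketch below uses a hypothetical ridge function with g = tanh and a fixed direction vector a.

    import numpy as np

    a = np.array([1.0, 2.0, -1.0])
    f = lambda x: np.tanh(a @ x)        # a ridge function f(x) = g(a . x) with g = tanh

    x = np.array([0.5, -1.0, 2.0])
    b = np.array([2.0, -1.0, 0.0])      # orthogonal to a, since a @ b == 0
    print(np.isclose(f(x), f(x + 3.7 * b)))  # True: shifts orthogonal to a leave f unchanged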
Ridge functions play an essential role in, among other areas, projection pursuit and generalized linear models, and as activation functions in neural networks. Surveys and books devoted to ridge functions are available in the literature.
References
Functions and mappings |
https://en.wikipedia.org/wiki/Likewise%2C%20Inc. | Likewise, Inc., is an American technology startup company which provides a social networking service for finding and saving content recommendations for movies, TV shows, books, and podcasts. A team of ex-Microsoft employees founded Likewise in October 2017 with financial investment from Microsoft co-founder Bill Gates.
The company is led by CEO Ian Morris and now has a team of about 35 employees. Its headquarters operates in Bellevue, Washington. As of July 2020, 1 million users have joined the platform.
History
Ideation (October 2017)
In 2017, former Microsoft Communications Chief Larry Cohen came up with the idea for Likewise in Bill Gates’ private office, Gates Ventures. Cohen currently serves as Gates Ventures’ CEO and managing partner.
Cohen collaborated with colleagues Michael Dix and Ian Morris to co-found what would become Likewise, with Morris as its CEO. Gates funded the company's early development.
The company developed its platform in stealth mode before launching publicly in October 2018. Gates served as the “ultimate beta tester” during development, giving his input on the app's design, and remains an active user and advisor.
Release (October 2018)
Likewise officially released its platform in the US and Canada on October 3, 2018.
Growth (2020 COVID-19 Pandemic)
Likewise experienced accelerated growth alongside the COVID-19 pandemic. From March 2020 to July 2020, the platform's monthly active users tripled in numbers. The company reached one million users in July 2020.
Applications
Mobile
Likewise is available as a mobile app for the Android and iOS mobile operating systems. Users receive recommendations from the Likewise algorithm, people they follow, and the Likewise editorial team.
Likewise TV
In October 2019, the company launched its Apple TV app called Likewise TV. The television app organizes shows across streaming services under one watchlist. On July 20, 2020, Likewise TV expanded to Android TV and Amazon Fire TV users.
Reference |
https://en.wikipedia.org/wiki/Cap%27n%20Proto | Cap’n Proto is a data serialization format and Remote Procedure Call (RPC) framework for exchanging data between computer programs. The high-level design focuses on speed and security, making it suitable for network as well as inter-process communication. Cap'n Proto was created by the former maintainer of Google's popular Protocol Buffers framework (Kenton Varda) and was designed to avoid some of its perceived shortcomings.
Technical overview
IDL Schema
Like most RPC frameworks dating as far back as Sun RPC and OSF DCE RPC (and their object-based descendants CORBA and DCOM), Cap'n Proto uses an Interface Description Language (IDL) to generate RPC libraries in a variety of programming languages, automating many low-level details such as handling network requests, converting between data types, etc. The Cap'n Proto interface schema uses a C-like syntax and supports common primitive data types (booleans, integers, floats, etc.), compound types (structs, lists, enums), as well as generics and dynamic types. Cap'n Proto also supports object-oriented features such as multiple inheritance, which has been criticized for its complexity.

    @0xa558ef006c0c123;  # unique identifiers are manually or automatically assigned to files and compound types

    struct Date @0x5c5a558ef006c0c1 {
        year @0 :Int16;  # @n marks the order in which values were added to the schema
        month @1 :UInt8;
        day @2 :UInt8;
    }

    struct Contact @0xf032a54bcb3667e0 {
        name @0 :Text;
        birthday @2 :Date;  # fields can be added anywhere in the definition, but their numbering must reflect the order in which they were added
        phones @1 :List(PhoneNumber);

        struct PhoneNumber {  # compound types without a static ID cannot be renamed, as automatic IDs are deterministically generated
            number @0 :Text;
            type @1 :PhoneType = mobile;  # default value

            enum PhoneType {
                mobile @0;
                landline @1;
            }
        }
    }

Values in Cap'n Proto messages are represented in binary, as opposed to text encoding used by "human-r |
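For a sense of how such a schema is consumed, the following minimal sketch uses the pycapnp Python bindings; the schema file name contact.capnp is an assumption, and the calls shown follow pycapnp's documented style but should be checked against the version in use.

    import capnp  # pycapnp bindings; a sketch under stated assumptions, not a definitive example

    contact_capnp = capnp.load("contact.capnp")           # compile the IDL schema at runtime
    msg = contact_capnp.Contact.new_message(name="Ada")   # builder for a Contact struct
    msg.birthday.year = 1815                              # nested struct fields are set by attribute
    data = msg.to_bytes()                                 # compact binary wire format

    with contact_capnp.Contact.from_bytes(data) as contact:
        print(contact.name, contact.birthday.year)        # Ada 1815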
https://en.wikipedia.org/wiki/Anisyl%20acetate | Anisyl acetate (4-methoxybenzyl acetate) is an acetate ester of anisyl alcohol. It is a naturally occurring flavor found in various fruits and types of vanilla. It is also used as a flavoring agent to produce a flavor profile described variously as sweet, smooth, fruity (cherry or plum) and vanilla or almond.
References
Flavors
Acetate esters
Phenol ethers
Benzyl compounds |
https://en.wikipedia.org/wiki/De%20quinque%20corporibus%20regularibus | De quinque corporibus regularibus (sometimes called Libellus de quinque corporibus regularibus) is a book on the geometry of polyhedra written in the 1480s or early 1490s by Italian painter and mathematician Piero della Francesca. It is a manuscript, in the Latin language; its title means [the little book] on the five regular solids. It is one of three books known to have been written by della Francesca.
Along with the Platonic solids, De quinque corporibus regularibus includes descriptions of five of the thirteen Archimedean solids, and of several other irregular polyhedra coming from architectural applications. It was the first of what would become many books connecting mathematics to art through the construction and perspective drawing of polyhedra, including Luca Pacioli's 1509 Divina proportione (which incorporated without credit an Italian translation of della Francesca's work).
Lost for many years, De quinque corporibus regularibus was rediscovered in the 19th century in the Vatican Library and the Vatican copy has since been republished in facsimile.
Background
The five Platonic solids (the regular tetrahedron, cube, octahedron, dodecahedron, and icosahedron) were known to della Francesca through two classical sources: Timaeus, in which Plato theorizes that four of them correspond to the classical elements making up the world (with the fifth, the dodecahedron, corresponding to the heavens), and the Elements of Euclid, in which the Platonic solids are constructed as mathematical objects. Two apocryphal books of the Elements concerning the metric properties of the Platonic solids, sometimes called pseudo-Euclid, were also commonly considered to be part of the Elements in the time of della Francesca. It is the material from the Elements and pseudo-Euclid, rather than from Timaeus, that forms della Francesca's main inspiration.
The thirteen Archimedean solids, convex polyhedra in which the vertices but not the faces are symmetric to each other, were classif |
https://en.wikipedia.org/wiki/Nagamori%20Awards | The Nagamori Award is an international award given by the Nagamori Foundation of Kyoto, Japan. The award recognizes outstanding researchers and engineers working in electrical and electronics engineering, especially in areas related to motors, power generation, actuators, and energy.
History
The award was founded by Shigenobu Nagamori in 2014, with the purpose of recognizing global researchers and engineers who have made outstanding and innovative technological advances in the area of electric motors and power generation.
Awards details
The award is made annually to outstanding early- to mid-career researchers and engineers. Six awards are given each year, one of which is designated the Grand Nagamori Award. Prize money of two million yen accompanies each regular Nagamori Award, while the grand award carries a cash prize of five million yen. An award ceremony is held in Kyoto, Japan, around September each year, where winners are present to receive their prizes.
Recent winners
The following are some of the recent winners.
References
International awards
Engineering awards
Electrical and electronic engineering awards
Japanese science and technology awards |
https://en.wikipedia.org/wiki/Schweizerisches%20Volksliedarchiv | The Schweizerische Volksliedarchiv is an institution established at the University of Basel.
It was founded in 1906. Under the direction of John Meier, associations and the press were called upon to send in folk songs known in Switzerland so that they could be scientifically and critically reviewed and published in a suitable manner. The appeal was sent to all parts of Switzerland.
As a result of this first call, a collection of 31,000 German-language songs, 3,200 songs from Romandy, 1,400 from Ticino and 1,200 in Rhaeto-Romance was secured.
The collection was subsequently expanded through bequests. For example, the holdings of Hanns In der Gand, the folk song collection of Arthur Rossat, and the estate of Armin Breu, with around 400 records and tapes of historical field recordings, were added to the archive.
See also
Deutsches Volksliedarchiv
Music archives
Swiss folklore
Sound archives
Music organisations based in Switzerland
1906 establishments in Switzerland
Organisations based in Basel |
https://en.wikipedia.org/wiki/PokerGO | PokerGO is an over-the-top content platform based in Las Vegas, Nevada. PokerGO was launched in 2017 as a subscription-based streaming service, offering poker centric online streaming.
The content offered on PokerGO includes poker tournaments, along with cash game-oriented shows.
As of February 2021, PokerGO's library of content includes over 2,400 videos totaling over 3,800 continuous hours.
Content
The content of PokerGO includes shows, tournament replays, and cash games. Other media includes episodes, live streams, and recap videos, and events that were streamed live become on-demand videos afterward.
Poker tournaments and cash games
High Stakes Poker
High Stakes Poker is a cash game poker television program that sees a mix of professional and amateur poker players playing high stakes No-Limit Hold’em with buy-ins ranging from $100,000 to $500,000.
The show debuted in January 2006 and ran for seven seasons until May 2011. In February 2020, PokerGO announced that they had acquired the High Stakes Poker brand and assets. In December 2020, a new season of High Stakes Poker aired on PokerGO and included returning players Tom Dwan, Phil Hellmuth, Brandon Adams, and Phil Ivey, while also introducing new players to High Stakes Poker including Jason Koon, Jean-Robert Bellande, Bryn Kenney, Doug Polk, Michael Schwimer, and Chamath Palihapitiya.
There are nine seasons of High Stakes Poker and 126 episodes, and current hosts include Gabe Kaplan and A.J. Benza.
Poker After Dark
Poker After Dark takes an intimate look at one table as it develops over a series of episodes. The seasons were split into weeks with each given a theme based on the players involved. Poker After Dark originally began under a No-Limit Hold’em sit-n-go format before evolving to cash games that also featured different game variations such as Pot-Limit Omaha, 2-7 Triple Draw, H.O.R.S.E., or Short Deck.
The original series would see one table play over five episodes with the sixth episode be |
https://en.wikipedia.org/wiki/Guillermo%20Rein | Guillermo Rein (born May 1975) is a professor of fire science in the Department of Mechanical Engineering at Imperial College London. His research is focused on fire, combustion, and heat transfer. He is the editor-in-chief of the journal Fire Technology and Fellow of the Combustion Institute.
Rein is best known for his contributions to smouldering combustion research in the field of fire science.
Biography
Rein obtained his Industrial Engineering degree at the ICAI School of Engineering in 1999. He studied mechanical engineering at the University of California, Berkeley, obtaining an MSc in 2003 and a PhD in 2005. He taught at the School of Engineering of the University of Edinburgh (2006–2012), where he was a senior lecturer before moving to Imperial College in 2012.
Research
His research is centred on heat transfer, combustion and fire. He is best known in three areas: polymer and wood ignition; design of fire-resistant structures; and wildfire spread and mitigation.
Rein, together with his research group and collaborators, has edited two books and published six book chapters and over 100 journal articles. His current h-index is 51 and his citation count is over 6,900 on Google Scholar.
Rein has been editor-in-chief of the journal Fire Technology since 2012. He was associate editor of Proceedings of the Combustion Institute from 2013 to 2019; associate editor of Thermal and Mass Transport (Frontiers of Mechanical Engineering) from 2016; and is on the editorial board of Safety Science and the advisory board of International Journal of Wildland Fire since 2016. He was also on the editorial board of Fire Safety Journal from 2014 to 2017.
Selected Awards
2009 Hinshelwood Prize
2016 SFPE Lund Award
2017 The Engineer Collaborate-to-Innovate Prize
2017 Sugden Award
2018 Arthur B. Guise Medal
2020 Research Excellence Award
References
External links
Imperial Hazelab's webpage
Mechanical engineers
Living people
University of California, Berkeley alumni
|
https://en.wikipedia.org/wiki/Among%20Us | Among Us is a 2018 online multiplayer social deduction game developed and published by American game studio Innersloth. The game was inspired by the party game Mafia and the science fiction horror film The Thing. The game allows for cross-platform play; it was released on iOS and Android devices in June 2018 and on Windows later that year in November. It was ported to the Nintendo Switch in December 2020 and on the PlayStation 4, PlayStation 5, Xbox One and Xbox Series X/S in December 2021. A virtual reality adaptation, Among Us VR, was released on November 10, 2022.
Among Us takes place in space-themed settings where players are colorful, armless cartoon astronauts. Each player takes on one of two roles: most are Crewmates, but a small number are Impostors. Crewmates work to complete assigned tasks in the game while identifying and voting out suspected Impostors (who appear identical to Crewmates) using social deduction, while Impostors have the objective of killing the Crewmates.
While the game was initially released in 2018 to little mainstream attention, it received a massive rise in popularity in 2020 due to many Twitch streamers and YouTubers playing it during the COVID-19 pandemic. It received favorable reviews from critics for fun and entertaining gameplay. The game and its stylized characters have been the subject of various internet memes.
Gameplay
Among Us is a multiplayer game for four to fifteen players. Up to three players are randomly and secretly chosen to be the Impostors each round. As of 2023, five playable maps are available: a spaceship called "The Skeld", an office building called "MIRA HQ", a planet base called "Polus", "The Airship", a setting from Innersloth's Henry Stickmin series, and the Fungle, a jungle map. The Crewmates can win the game one of two ways: either by completing all assigned tasks or by ejecting all Impostors. Impostors can likewise win in two ways: either by killing or ejecting all Crewmates, or by sabotaging a critic |
https://en.wikipedia.org/wiki/The%20Social%20Dilemma | The Social Dilemma is a 2020 American docudrama film directed by Jeff Orlowski and written by Orlowski, Davis Coombe, and Vickie Curtis about the negative social effects of social media.
Synopsis
This documentary dives into the psychological underpinnings and the manipulation techniques by which, it claims, social media and technology companies addict users. People's online activity is watched, tracked, and measured by these companies, who then use this data to build artificial intelligence models that predict the actions of their users. Tristan Harris, former Google design ethicist and co-founder of the Center for Humane Technology, explains in the documentary that there are three main goals of tech companies:
The engagement goal: to increase usage and to make sure users continue scrolling.
The growth goal: to ensure users are coming back and inviting friends that invite even more friends.
The advertisement goal: to make sure that while the above two goals are happening, the companies are also making as much money as possible from advertisements.
Harris summed this up with the warning: "If you're not paying for the product, you are the product", paraphrasing earlier insights from Television Delivers People, Tom Johnson and Andrew Lewis.
Harris likens the manipulation tactics used in technology to magic: how do you persuade people by manipulating what they see and how can this psychology be integrated into technology?
Another interviewee, Jonathan Haidt, a social psychologist at NYU Stern School of Business, brings up the concerns of mental health in relation to social media. There has been an increase in depression and suicide rates among teens and young adults since the early 2000s and Haidt states that this pattern points to the year social media was made available on mobile phones. The dangers of fake news are also discussed in the documentary. Harris argues that this is a "disinformation-for-profit business model" and that companies make more money by a |
https://en.wikipedia.org/wiki/Rectangle%20packing | Rectangle packing is a packing problem where the objective is to determine whether a given set of small rectangles can be placed inside a given large polygon, such that no two small rectangles overlap. Several variants of this problem have been studied.
Packing identical rectangles in a rectangle
In this variant, there are multiple instances of a single rectangle of size (l,w), and a bigger rectangle of size (L,W). The goal is to pack as many small rectangles as possible into the big rectangle without overlap between any rectangles (small or large). Common constraints of the problem include limiting small rectangle rotation to 90° multiples and requiring that each small rectangle is orthogonal to the large rectangle.
This problem has some applications such as loading of boxes on pallets and, specifically, woodpulp stowage. As an example result: it is possible to pack 147 small rectangles of size (137,95) in a big rectangle of size (1600,1230).
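As a quick sanity check on results like the one just quoted, the sketch below computes the trivial bound obtained by packing every small rectangle in a single orientation (or all rotated by 90°); the function name axis_aligned_count is an assumption, and optimal packings that mix orientations region by region can beat it.

    def axis_aligned_count(L, W, l, w):
        """Pack all small rectangles one way, or all rotated 90 degrees; take the better count."""
        return max((L // l) * (W // w), (L // w) * (W // l))

    # The single-orientation bound falls short of the best known packing of 147:
    print(axis_aligned_count(1600, 1230, 137, 95))  # 132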
Packing identical squares in a rectilinear polygon
Given a rectilinear polygon (whose sides meet at right angles) R in the plane, a set S of points in R, and a set of identical squares, the goal is to find the largest number of non-overlapping squares that can be packed in points of S.
Suppose that, for each point p in S, we put a square centered at p. Let G_S be the intersection graph of these squares. A square packing is equivalent to an independent set in G_S. Finding a largest square packing is NP-hard; one may prove this by a reduction from 3SAT.
Packing different rectangles in a given rectangle
In this variant, the small rectangles can have varying lengths and widths, and they should be packed in a given large rectangle. The decision problem of whether such a packing exists is NP-hard. This can be proved by a reduction from 3-partition. Given an instance of 3-partition with 3m positive integers a_1, ..., a_{3m}, with a total sum of mT, we construct 3m small rectangles, all with a width of 1, such that the length of r |
https://en.wikipedia.org/wiki/Shallow%20%28underwater%20relief%29 | A shallow is an elevation of the bottom of a sea, river, or lake that impedes navigation. It is a type of underwater relief where the depth of the water is low compared with that of the surrounding area.
Usually formed by sand or pebble deposits, shallows can also be of volcanic origin or the result of human or animal activity.
A shallow near the shore of a reservoir or watercourse is called a shoal; the shallow ocean area adjacent to the mainland is the continental shelf.
Shallows can be permanently hidden under water or appear above the surface periodically (for example, at low tide in the seas, or with changes in river water levels) in the form of islands, sediments, side channels, spits, etc.
Fords are established on river shoals where it is possible to cross the river on foot or by land transport.
See also
Spit (landform)
Rapids
Reef
Ocean bank
Bibliography
Jean-Jacques Delannoy, Philip Deline, René Lhénaff, Géographie physique: aspects et dynamique du géosystème terrestre, De Boeck Superieur, 2016, p. 634.
Republished in 2001 then in 2014 under the title Dictionnaire de la mer: savoir-faire, traditions, vocabulaires-techniques, Omnibus, XXIV-861 p.,
Hydrology |
https://en.wikipedia.org/wiki/Soul%20Electronics | Soul Electronics (stylized as SOUL Electronics or simply SOUL) is an audio equipment company. Founded in 2010, it produces various lines of wireless, Bluetooth-enabled headphones, earbuds, and speakers.
History
Soul Electronics was founded in 2010. The entity was originally known as Signeo USA, an American subsidiary of the Hong Kong-based Signeo Design International. In January 2011, at that year's Consumer Electronics Show (CES), the company introduced SOUL by Ludacris, a line of headphones designed in collaboration with the rapper Ludacris. The line featured five different pairs of headphones, which were released to the public later in 2011. By that time, the company had become known as Soul Electronics.
In 2012, Soul Electronics partnered with sprinter Usain Bolt on the design of a new line of headphones and earbuds intended for use while running or exercising. In the following two years, the company entered into a number of other sponsored partnerships with athletes, including Brendan Schaub, Tim Tebow, and Alex Fowler. In 2015, the company introduced the Combat+ Sync headphones, which came with a built-in walkie-talkie feature. At CES 2018, the company introduced a new artificial intelligence feature in its "Run Free Pro Bio" and "Blade" models that would offer users live, in-ear coaching while running.
In 2019, the company introduced four new products: the Ultra Wireless over-ear headphones, the ST-XX wireless earbuds, the ST-XS2 wireless earphones, and the S-Storm portable Bluetooth speaker. Four additional earbud and earphone lines were released in 2020, including SYNC Pro, the SYNC ANC (active noise-canceling), S-Fit, and S-Gear.
In 2023, the company introduced their first open-ear style headphones called the Openear Series which include Openear 2, Openear Plus, Openear S-Clip and Openear S-Free.
Products
Soul Electronics produces a number of audio products, generally targeted at runners and other consumers who exercise. Its current product |
https://en.wikipedia.org/wiki/Mathematical%20Models%20%28Cundy%20and%20Rollett%29 | Mathematical Models is a book on the construction of physical models of mathematical objects for educational purposes. It was written by Martyn Cundy and A. P. Rollett, and published by the Clarendon Press in 1951, with a second edition in 1961. Tarquin Publications published a third edition in 1981.
The vertex configuration of a uniform polyhedron, a generalization of the Schläfli symbol that describes the pattern of polygons surrounding each vertex, was devised in this book as a way to name the Archimedean solids, and has sometimes been called the Cundy–Rollett symbol as a nod to this origin.
Topics
The first edition of the book had five chapters, including its introduction which discusses model-making in general and the different media and tools with which one can construct models. The media used for the constructions described in the book include "paper, cardboard, plywood, plastics, wire, string, and sheet metal".
The second chapter concerns plane geometry, and includes material on the golden ratio, the Pythagorean theorem, dissection problems, the mathematics of paper folding, tessellations, and plane curves, which are constructed by stitching, by graphical methods, and by mechanical devices.
The third chapter, and the largest part of the book, concerns polyhedron models, made from cardboard or plexiglass. It includes information about the Platonic solids, Archimedean solids, their stellations and duals, uniform polyhedron compounds, and deltahedra.
The fourth chapter is on additional topics in solid geometry and curved surfaces, particularly quadrics but also including topological manifolds such as the torus, Möbius strip and Klein bottle, and physical models helping to visualize the map coloring problem on these surfaces. Also included are sphere packings. The models in this chapter are constructed as the boundaries of solid objects, via two-dimensional paper cross-sections, and by string figures.
The fifth chapter, and the final one of the first editi |
https://en.wikipedia.org/wiki/Wassim%20Dhaouadi | Wassim Dhaouadi is a Tunisian mechanical engineer. He is known for having solved Bretherton's buoyant bubble problem in physics as an undergraduate at the EPFL (École Polytechnique Fédérale de Lausanne).
Career and recognition
Dhaouadi studied mechanical engineering and received a BSc from EPFL in 2018 and an MSc from ETH Zurich in 2020. As a bachelor's student in John Kolinski's laboratory at EPFL, Dhaouadi solved Bretherton's buoyant bubble problem, a long-standing problem in physics. By using an optical interference method, he was able to prove the existence of, and measure the physical properties of, a thin liquid film that had been hypothesized by physicists to explain why air bubbles appear to get stuck in capillaries.
In 2020, Dhaouadi was named one of the ten Outstanding Young Persons in the World in the category of Academic Leadership by the Junior Chamber International. He was also awarded with the Sanford C. Bernstein & Co. Leadership and Ethics Award from Columbia Business School.
He currently works as an intern at NASA's Jet Propulsion Laboratory.
References
École Polytechnique Fédérale de Lausanne alumni
Tunisian engineers
Year of birth missing (living people)
Living people
ETH Zurich alumni
Jet Propulsion Laboratory |
https://en.wikipedia.org/wiki/Potamology | Potamology (from the Greek potamós, 'river', and lógos, 'study') is the hydrology of rivers, one of the largest branches of land hydrology. The subjects of study are the hydrological processes of rivers, the morphometry of river basins, the structure of river networks, channel processes, the regime of river mouth areas, evaporation and infiltration of water in a river basin, the water, thermal and ice regimes of rivers, the sediment regime, the sources and types of river feeding, and various chemical and physical processes in rivers.
Bibliography
Lindeman, R. L., "The trophic-dynamic aspect of ecology", Ecology, 1942, XXIII, pp. 399–418.
Williams, R. B., "Computer simulation of energy flow in Cedar Bog Lake, Minnesota, based on the classical studies of Lindeman", Systems analysis and simulation in ecology (a cura di B. C. Patten), vol. I, New York 1971, pp. 543–582.
Hydrology |
https://en.wikipedia.org/wiki/List%20of%20topologies | The following is a list of named topologies or topological spaces, many of which are counterexamples in topology and related branches of mathematics. This is not a list of properties that a topology or topological space might possess; for that, see List of general topology topics and Topological property.
Discrete and indiscrete
Discrete topology − All subsets are open.
Indiscrete topology, chaotic topology, or Trivial topology − Only the empty set and its complement are open.
Cardinality and ordinals
Cocountable topology
Given a topological space (X, τ), the cocountable extension topology on X is the topology having as a subbasis the union of τ and the family of all subsets of X whose complements in X are countable.
Cofinite topology
Double-pointed cofinite topology
Ordinal number topology
Pseudo-arc
Ran space
Tychonoff plank
Finite spaces
Discrete two-point space − The simplest example of a totally disconnected discrete space.
Either–or topology
Finite topological space
Pseudocircle − A finite topological space on 4 elements that fails to satisfy any separation axiom besides T0. However, from the viewpoint of algebraic topology, it has the remarkable property that it is indistinguishable from the circle
Sierpiński space, also called the connected two-point set − A 2-point set with the particular point topology
Integers
Arens–Fort space − A Hausdorff, regular, normal space that is not first-countable or compact. It has an element x (namely x = (0, 0)) for which there is no sequence in X ∖ {x} that converges to x, but there is a sequence (x_n) in X ∖ {x} such that x is a cluster point of (x_n).
Arithmetic progression topologies
The Baire space − N^N with the product topology, where N denotes the natural numbers endowed with the discrete topology. It is the space of all sequences of natural numbers.
Divisor topology
Partition topology
Deleted integer topology
Odd–even topology
Fractals and Cantor set
Apollonian gasket
Cantor set − A subset of the closed interval [0, 1] with remarkable properties.
Cantor dust
Cantor space |
https://en.wikipedia.org/wiki/Resonant%20interaction | In nonlinear systems, a resonant interaction is the interaction of three or more waves, usually but not always of small amplitude. Resonant interactions occur when a simple set of criteria coupling wave-vectors and the dispersion equation are met. The simplicity of the criteria makes the technique popular in multiple fields. Its most prominent and well-developed forms appear in the study of gravity waves, but it also finds numerous applications from astrophysics and biology to engineering and medicine. Theoretical work on partial differential equations provides insights into chaos theory; there are curious links to number theory. Resonant interactions allow waves to (elastically) scatter, diffuse or become unstable. Diffusion processes are responsible for the eventual thermalization of most nonlinear systems; instabilities offer insight into high-dimensional chaos and turbulence.
Discussion
The underlying concept is that when the sum total of the energy and momentum of several vibrational modes sum to zero, they are free to mix together via nonlinearities in the system under study. Modes for which the energy and momentum do not sum to zero cannot interact, as this would imply a violation of energy/momentum conservation. The momentum of a wave is understood to be given by its wave-vector and its energy follows from the dispersion relation for the system.
For example, for three waves in continuous media, the resonant condition is conventionally written as the requirement that $\vec{k}_1 \pm \vec{k}_2 = \vec{k}_3$ and also $\omega_1 \pm \omega_2 = \omega_3$, the minus sign being taken depending on how energy is redistributed among the waves. For waves in discrete media, such as in computer simulations on a lattice, or in (nonlinear) solid-state systems, the wave vectors are quantized, and the normal modes can be called phonons. The Brillouin zone defines an upper bound on the wave vector, and waves can interact when they sum to integer multiples of the Brillouin vectors (Umklapp scattering).
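For illustration, a small Python sketch that tests the two conditions numerically for a supplied dispersion relation; the deep-water gravity-wave relation $\omega(k) = \sqrt{g|k|}$ below is only an example choice:

```python
# A minimal numerical check of the three-wave resonance conditions
# k1 ± k2 = k3 and ω1 ± ω2 = ω3 for a user-supplied dispersion relation.
# The deep-water gravity-wave dispersion below is an assumption made
# only for illustration; any ω(k) can be substituted.
import numpy as np

def omega(k, g=9.81):
    return np.sqrt(g * np.linalg.norm(k))

def is_resonant_triad(k1, k2, k3, sign=+1, tol=1e-6):
    k_match = np.allclose(k1 + sign * k2, k3, atol=tol)
    w_match = abs(omega(k1) + sign * omega(k2) - omega(k3)) < tol
    return k_match and w_match

k1, k2 = np.array([1.0, 0.0]), np.array([0.5, 0.2])
print(is_resonant_triad(k1, k2, k1 + k2))  # False: a generic triad is not resonant
```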
Although three-wave systems provide the simples |
https://en.wikipedia.org/wiki/Data%20embassy | A data embassy is a solution traditionally implemented by nation states to ensure a country's digital continuity with particular respect to critical databases. It consists of a set of servers that store one country's data and are under that country's jurisdiction while being located in another country.
Purpose
Data embassies are regarded as a tool to ensure a government's digital continuity, meaning the survival of critical databases to allow the continuation of government even in a situation where governing from within the country's borders is no longer an option. Among the threats that might lead to such a situation are natural disasters, large-scale cyberattacks, and military invasion. In the worst-case scenario, a data embassy could enable a government to provide its digital services without having the national territory under its control. This makes data embassies particularly attractive to countries that have already digitalized their most crucial databases and are situated in the vicinity of the aforementioned threat vectors. Additionally, data embassies can offer additional computing power for heightened server traffic, for example during election season or the period of electronic tax return filing.
History
The 2007 cyberattacks on Estonia disrupted websites of Estonian organizations including the Estonian parliament as well as newspapers and banks. Furthermore, Estonia has implemented a stringent paperless policy, meaning that many crucial databases only exist in a digital format. Tasked with ensuring the security and immutability of these databases, the ministries looked towards data embassies as a possible solution for digital continuity. This was crucial not just for Estonia's own citizens but also for e-Residents who rely on these services around the world. These efforts were also written down in the Estonian Cyber Security Strategy 2014-2017 which created an outline for ensuring the digital continuity of the state.
In 2013, then-CIO of the Estonian government T |
https://en.wikipedia.org/wiki/Sports%20Engineering | Sports Engineering is a peer-reviewed academic journal covering "the application of engineering to sport". It is the journal of the International Sports Engineering Association, published for them by Springer. It was founded in 1998 by Steve Haake, then at the University of Sheffield. It is published quarterly and is a hybrid open-access journal. The contents and abstracts are available online from volume 6 (2003) onward. It is indexed in services including Ei Compendex, Inspec and Scopus.
References
Sports technology
Engineering journals
English-language journals
Academic journals established in 1998
Hybrid open access journals |
https://en.wikipedia.org/wiki/Aurora%20%28supercomputer%29 | Aurora is a planned supercomputer, originally contracted to be completed by 2018 but, through a series of delays at the prime contractor, Intel Corporation, now planned to be completed in 2023. It was originally planned to be the world's fastest supercomputer, with over 2 exaflops, however the series of delays has cast that into doubt. It is sponsored by the United States Department of Energy (DOE) and designed by Intel and Cray for the Argonne National Laboratory. It will have 2 exaFLOPS in computing power, which is approximately a quintillion (2^60 or 10^18) calculations per second, and will have an expected cost of US$500 million. It will follow Frontier, which was the world's first exascale supercomputer in 2022 and, as of June 2023, the world's fastest. Olivier Franza is the chief architect and principal investigator of this design.
History
In 2013 DOE presented their exascale vision of one exaFLOP at 20 MW by 2020. Aurora was first announced in 2015 and was to be finished in 2018. It was expected to have a speed of 180 petaFLOPS, which would be around the speed of Summit. Aurora was meant to be the most powerful supercomputer at the time of its launch and to be built by Cray with Intel processors. Later, in 2017, Intel announced that Aurora would be delayed to 2021 but scaled up to 1 exaFLOP. In March 2019, DOE said that it would build the first supercomputer with a performance of one exaFLOP in the United States in 2021. In October 2020, DOE said that Aurora would be delayed again for a further 6 months and would no longer be the first exascale computer in the US. In late October 2021 Intel announced Aurora would now exceed 2 exaFLOPS in peak double-precision compute. The system has been fully installed.
Planned usage
Planned functions include research on nuclear fusion, low carbon technologies, subatomic particles, cancer and cosmology. It will also develop new materials that will be useful for batteries and more efficient solar cells. It is to be available to the ge |
https://en.wikipedia.org/wiki/List%20of%20power%20engineering%20measuring%20equipment | Below is a list of measuring instruments used in power engineering work.
See also
E-meter
Power
Electronic test equipment
Measuring instruments |
https://en.wikipedia.org/wiki/Three-wave%20equation | In nonlinear systems, the three-wave equations, sometimes called the three-wave resonant interaction equations or triad resonances, describe small-amplitude waves in a variety of non-linear media, including electrical circuits and non-linear optics. They are a set of completely integrable nonlinear partial differential equations. Because they provide the simplest, most direct example of a resonant interaction, have broad applicability in the sciences, and are completely integrable, they have been intensively studied since the 1970s.
Informal introduction
The three-wave equation arises by consideration of some of the simplest imaginable non-linear systems. Linear differential systems have the generic form
$$D\psi = 0$$
for some differential operator $D$. The simplest non-linear extension of this is to write
$$D\psi = \varepsilon\,\psi^2.$$
How can one solve this? Several approaches are available. In a few exceptional cases, there might be known exact solutions to equations of this form. In general, these are found in some ad hoc fashion after applying some ansatz. A second approach is to assume that $\varepsilon \ll 1$ and use perturbation theory to find "corrections" to the linearized theory. A third approach is to apply techniques from scattering matrix (S-matrix) theory.
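For illustration, the following Python sketch integrates the spatially uniform resonant-triad amplitude equations, assuming the decay-type reduced form $\dot{A}_1 = i g A_2 A_3$, $\dot{A}_2 = i g A_1 A_3^*$, $\dot{A}_3 = i g A_1 A_2^*$; the coefficient and initial amplitudes are arbitrary:

```python
# A sketch integrating a spatially uniform resonant triad, assuming the
# decay-type reduced form of the three-wave equations; g and the initial
# amplitudes below are invented for illustration.
import numpy as np
from scipy.integrate import solve_ivp

g = 1.0  # interaction coefficient (assumed)

def rhs(t, y):
    A1, A2, A3 = y[0] + 1j*y[1], y[2] + 1j*y[3], y[4] + 1j*y[5]
    dA1 = 1j * g * A2 * A3
    dA2 = 1j * g * A1 * np.conj(A3)
    dA3 = 1j * g * A1 * np.conj(A2)
    return [dA1.real, dA1.imag, dA2.real, dA2.imag, dA3.real, dA3.imag]

y0 = [1.0, 0.0, 0.1, 0.0, 0.1, 0.0]          # pump A1, small daughters A2, A3
sol = solve_ivp(rhs, (0.0, 10.0), y0, rtol=1e-9)
A1, A2 = sol.y[0] + 1j*sol.y[1], sol.y[2] + 1j*sol.y[3]
# Manley–Rowe invariant: |A1|^2 + |A2|^2 is conserved during the exchange
print(np.ptp(np.abs(A1)**2 + np.abs(A2)**2))  # ~0 up to solver tolerance
```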
In the S-matrix approach, one considers particles or plane waves coming in from infinity, interacting, and then moving out to infinity. Counting from zero, the zero-particle case corresponds to the vacuum, consisting entirely of the background. The one-particle case is a wave that comes in from the distant past and then disappears into thin air; this can happen when the background is absorbing, deadening or dissipative. Alternately, a wave appears out of thin air and moves away. This occurs when the background is unstable and generates waves: one says that the system "radiates". The two-particle case consists of a particle coming in, and then going out. This is appropriate when the background is non-uniform: for example, an acoustic plane wave comes in, s |
https://en.wikipedia.org/wiki/Urban%20evolution | Urban evolution refers to the heritable genetic changes of populations in response to urban development and anthropogenic activities in urban areas. Urban evolution can be caused by mutation, genetic drift, gene flow, or evolution by natural selection. Biologists have observed evolutionary change in numerous species compared to their rural counterparts on a relatively short timescale.
Strong selection pressures due to urbanization play a major role in this process. The changed environmental conditions lead to selection and adaptive changes in city-dwelling plants and animals. There is also a significant change in species composition between rural and urban ecosystems.
Shared aspects of cities worldwide also give ample opportunity for scientists to study the specific evolutionary responses in these rapidly changed landscapes independently. How certain organisms manage to adapt to urban environments while others cannot offers a live perspective on rapid evolution.
Urbanization
With urban growth, the urban-rural gradient has seen a large shift in distribution of humans, moving from low density to very high in the last millennia. This has brought a large change to environments as well as societies.
Urbanization transforms natural habitats to completely altered living spaces that sustain large human populations. Increasing congregation of humans accompanies the expansion of infrastructure, industry and housing. Natural vegetation and soil are mostly replaced or covered by dense grey materials. Urbanized areas continue to expand both in size and number globally; in 2018, the United Nations estimated that 68% of people globally will live in ever-larger urban areas by 2050.
Urban evolution selective agents
Urbanization intensifies diverse stressors spatiotemporally such that they can act in concert to cause rapid evolutionary consequences such as extinction, maladaptation, or adaptation. Three factors have come to the forefront as the main evolutionary influencer |
https://en.wikipedia.org/wiki/Bifacial%20solar%20cells | A bifacial solar cell (BSC) is any photovoltaic solar cell that can produce electrical energy when illuminated on both its surfaces, front or rear. In contrast, monofacial solar cells produce electrical energy only when photons impinge on their front side. Bifacial solar cells can make use of albedo radiation, which is useful for applications where a lot of light is reflected on surfaces such as roofs. The concept was introduced as a means of increasing the energy output in solar cells. Efficiency of solar cells, defined as the ratio of generated electrical power to incident luminous power under one or several suns (1 sun = 1000 W/m²), is measured independently for the front and rear surfaces of bifacial solar cells. The bifaciality factor (%) is defined as the ratio of the rear efficiency to the front efficiency subject to the same irradiance.
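For illustration, a minimal Python sketch of the bifaciality factor and a simple front-plus-rear output estimate; the efficiencies and irradiance values below are invented:

```python
# A minimal sketch of the bifaciality factor and a simple bifacial-gain
# estimate; the efficiencies and irradiance values are assumptions.
def bifaciality_factor(eta_rear: float, eta_front: float) -> float:
    """Rear efficiency relative to front efficiency, as a percentage."""
    return 100.0 * eta_rear / eta_front

def output_power(eta_front, eta_rear, g_front=1000.0, g_rear=200.0, area=1.0):
    """Front + rear generation in watts for irradiances in W/m^2."""
    return area * (eta_front * g_front + eta_rear * g_rear)

print(bifaciality_factor(0.165, 0.22))   # 75.0 (%)
print(output_power(0.22, 0.165))         # 253.0 W vs 220 W for the monofacial case
```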
The vast majority of solar cells today are made of silicon (Si). Silicon is a semiconductor and as such, its external electrons are in an interval of energies called the valence band and they completely fill the energy levels of this band. Above this valence band there is a forbidden band, or band gap, of energies within which no electron can exist, and further above, we find the conduction band. The conduction band of semiconductors is almost empty of electrons, but it is where valence band electrons will find accommodation after being excited by the absorption of photons. The excited electrons have more energy than the ordinary electrons of the semiconductor. The electrical conductivity of Si, as described so far, called intrinsic silicon, is exceedingly small. Introducing impurities to the Si in the form of phosphorus atoms will provide additional electrons located in the conduction band, rendering the Si n-type, with a conductivity that can be engineered by modifying the density of phosphorus atoms. Alternatively, impurification with boron or aluminum atoms renders the Si p-type, with a conductivity that can als |
https://en.wikipedia.org/wiki/Human%20Medicines%20Regulations%202012 | The Human Medicines Regulations 2012 in the United Kingdom were created in 2012 under the statutory authority of the European Communities Act 1972 and the Medicines Act 1968. The body responsible for their upkeep is the Medicines and Healthcare products Regulatory Agency. The regulations partially repealed the Medicines Act 1968 in line with EU legislation.
Amendments
In October 2020, the regulations were amended to expand the workforce eligible to administer COVID-19 vaccines, so enabling additional healthcare professionals to vaccinate the public. This was a temporary provision, but in January 2022 it was announced that this would be made permanent as would the provision for community pharmacy contractors to provide COVID-19 and flu vaccines “away from their normal registered premises”.
Regulation 174
Regulation 174 provides an exemption from the authorisation requirement of Regulation 46, allowing the sale or supply of any medicinal product to be temporarily authorised by the licensing authority (MHRA) in response to the suspected or confirmed spread of pathogenic agents, toxins, chemical agents or nuclear radiation.
References
External links
National Health Service
Pharmaceutics
Life sciences industry
Pharmacy
2012 establishments in the United Kingdom
Department of Health and Social Care
Medical regulation in the United Kingdom
Biotechnology
Statutory Instruments of the United Kingdom
Health law in the United Kingdom |
https://en.wikipedia.org/wiki/Interactive%20Application%20System | Interactive Application System (IAS) was a DEC operating system for the PDP-11. It was a fork from RSX-11D.
The last major release, Version 3.0, began distribution late 1979; the final version, 3.4, came out May 1990.
Overview
DEC's RSX-11A and C were paper-tape based, B had limited disk support, "D" was for disk, and the "M" designation was for "small Memory requirement", later "Multi-user" (with RSX-11M-Plus being a follow-up). IAS was designed for a mix of concurrent timesharing, real-time and batch use. A retrospective described it as "bare basics .. handled interrupts .. scheduled processes, and provided interprocess communications" without being "all things to all people." Another description, rather than focusing on stripped-down overhead, wrote that "IAS (Interactive Application System) was created by adding two things to 11D."
IAS retained RSX-11's use of a version number as part of a file's identifier, e.g. MYFILE.DAT;3.
The batch facility's command files used the same syntax as the indirect command files available to interactive users; multiple batch jobs could run concurrently. The system could be tuned to either leave unused CPU cycles to batch, or to guarantee a minimum level (without taking from Real Time requirements).
DEC's Sort/Merge utility program was distributed as part of IAS.
Performance
The system can be operated in one of three modes: Real-Time, Multi-User, and Timesharing.
Multi-User shares the system with Real-Time tasks; Timesharing adds effective concurrent use of batch processing alongside "noncritical real-time tasks" and interactive users. Timesharing also adds Timesharing Control Primitives (TCP), described as a "mechanism for timesharing tasks to invoke and communicate with other timesharing tasks." An evaluation by TRW's Defense and Space Systems Group for Tactical Operations Analysis Support Facility at Langley AFB VA highlighted the "IAS heuristic timesharing scheduler" and "subtasking support at the Kernel Executive level via the |
https://en.wikipedia.org/wiki/Sterbenz%20lemma | In floating-point arithmetic, the Sterbenz lemma or Sterbenz's lemma is a theorem giving conditions under which floating-point differences are computed exactly: if $x$ and $y$ are floating-point numbers in a system with subnormal numbers and $\frac{y}{2} \leq x \leq 2y$, then $x - y$ is itself a floating-point number, so the subtraction incurs no rounding error.
It is named after Pat H. Sterbenz, who published a variant of it in 1974.
The Sterbenz lemma applies to IEEE 754, the most widely used floating-point number system in computers.
Proof
Let $\beta$ be the radix of the floating-point system and $p$ the precision.
Consider several easy cases first:
If $x$ is zero then $x - y = -y$, and if $y$ is zero then $x - y = x$, so the result is trivial because floating-point negation is always exact.
If $x = y$ the result is zero and thus exact.
If $x < 0$ then we must also have $y/2 \leq x < 0$ so $y < 0$. In this case, $x - y = -((-x) - (-y))$, so the result follows from the theorem restricted to $x, y > 0$.
If $x \leq y$, we can write $x - y = -(y - x)$ with $x/2 \leq y \leq 2x$, so the result follows from the theorem restricted to $x \geq y$.
For the rest of the proof, assume $0 < y \leq x \leq 2y$ without loss of generality.
Write $x$ and $y$ in terms of their positive integral significands $s_x, s_y \leq \beta^p - 1$ and minimal exponents $e_x, e_y$:
$$x = s_x \cdot \beta^{e_x}, \qquad y = s_y \cdot \beta^{e_y}.$$
Note that $x$ and $y$ may be subnormal; we do not assume $s_x, s_y \geq \beta^{p-1}$.
The subtraction gives:
$$x - y = s_x \beta^{e_x} - s_y \beta^{e_y} = \left(s_x \beta^{e_x - e_y} - s_y\right) \beta^{e_y}.$$
Let $s' = s_x \beta^{e_x - e_y} - s_y$.
Since $0 < y \leq x$ we have:
$e_y \leq e_x$, so $e_x - e_y \geq 0$, from which we can conclude $\beta^{e_x - e_y}$ is an integer and therefore so is $s'$; and
$x - y > 0$, so $s' > 0$.
Further, since $x \leq 2y$, we have $x - y \leq y$, so that
$$s' \beta^{e_y} = x - y \leq y = s_y \beta^{e_y},$$
which implies that
$$0 < s' \leq s_y \leq \beta^p - 1.$$
Hence
$$x - y = s' \beta^{e_y}, \quad 0 < s' \leq \beta^p - 1,$$
so $x - y$ is a floating-point number.
Note: Even if $x$ and $y$ are normal, i.e., $s_x, s_y \geq \beta^{p-1}$, we cannot prove that $s' \geq \beta^{p-1}$, and therefore cannot prove that $x - y$ is also normal.
For example, the difference of the two smallest positive normal floating-point numbers $\beta^{p-1} \cdot \beta^{e_{\min}}$ and $(\beta^{p-1} + 1) \cdot \beta^{e_{\min}}$ is $\beta^{e_{\min}}$, which is necessarily subnormal.
In floating-point number systems without subnormal numbers, such as CPUs in nonstandard flush-to-zero mode instead of the standard gradual underflow, the Sterbenz lemma does not apply.
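For illustration, a minimal Python check of the lemma in IEEE 754 double precision, verifying exactness with arbitrary-precision rationals (the example values are arbitrary):

```python
# A minimal demonstration, in IEEE 754 double precision, that the
# subtraction is exact whenever y/2 <= x <= 2y; exactness is verified
# with arbitrary-precision rationals. The example values are arbitrary.
from fractions import Fraction

x, y = 1.9371726, 1.0617
assert y / 2 <= x <= 2 * y            # Sterbenz condition holds
computed = x - y                      # one rounded floating-point operation
exact = Fraction(x) - Fraction(y)     # exact value of the real difference
print(Fraction(computed) == exact)    # True: no rounding error occurred
```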
Relation to catastrophic cancellation
The Sterbenz lemma may be contrasted with the phenomenon of catastrophic cancellation:
The Sterbenz lemma asserts that if $x$ and $y$ are sufficiently close floating-point numbers (specifically, $y/2 \leq x \leq 2y$), then their difference $x - y$ is computed exactly by floating-point arithmetic, with no rounding needed.
The phenomenon of catastrophic c |
https://en.wikipedia.org/wiki/Perron%27s%20irreducibility%20criterion | Perron's irreducibility criterion is a sufficient condition for a polynomial to be irreducible in —that is, for it to be unfactorable into the product of lower-degree polynomials with integer coefficients.
This criterion is applicable only to monic polynomials. However, unlike other commonly used criteria, Perron's criterion does not require any knowledge of prime decomposition of the polynomial's coefficients.
Criterion
Suppose we have the following polynomial with integer coefficients:
$$f(x) = x^n + a_{n-1}x^{n-1} + \cdots + a_1 x + a_0,$$
where $a_0 \neq 0$. If either of the following two conditions applies:
$|a_{n-1}| > 1 + |a_{n-2}| + \cdots + |a_1| + |a_0|$, or
$|a_{n-1}| = 1 + |a_{n-2}| + \cdots + |a_1| + |a_0|$ and $f(\pm 1) \neq 0$,
then $f$ is irreducible over the integers (and by Gauss's lemma also over the rational numbers).
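For illustration, a small Python sketch checking only the strict-inequality form of the condition (the helper name and example polynomials are invented):

```python
# A minimal sketch checking Perron's strict-inequality condition for a
# monic integer polynomial given as coefficients, lowest degree first.
def satisfies_perron(coeffs):
    """coeffs: list of ints [a0, a1, ..., a_{n-1}, 1] with a0 != 0."""
    assert coeffs[-1] == 1 and coeffs[0] != 0
    a = coeffs[:-1]                       # a0 .. a_{n-1}
    lead = abs(a[-1])                     # |a_{n-1}|
    rest = 1 + sum(abs(c) for c in a[:-1])
    return lead > rest

print(satisfies_perron([1, 1, 5, 1]))     # True: x^3 + 5x^2 + x + 1, since 5 > 1+1+1
print(satisfies_perron([3, -2, 0, 1, 7, 1]))  # False: equality case, needs f(±1) != 0
```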
History
The criterion was first published by Oskar Perron in 1907 in Journal für die reine und angewandte Mathematik.
Proof
A short proof can be given based on the following lemma due to Panaitopol:
Lemma. Let $f(x) = x^n + a_{n-1}x^{n-1} + \cdots + a_1 x + a_0$ be a polynomial with $|a_{n-1}| > 1 + |a_{n-2}| + \cdots + |a_1| + |a_0|$. Then exactly one zero $z$ of $f$ satisfies $|z| > 1$, and the other zeroes of $f$ satisfy $|z| < 1$.
Suppose that $f(x) = g(x)h(x)$, where $g$ and $h$ are nonconstant integer polynomials. Since, by the above lemma, $f$ has only one zero with modulus not less than $1$, one of the polynomials $g$, $h$ has all its zeroes strictly inside the unit circle. Suppose that $z_1, \ldots, z_k$ are the zeroes of that polynomial, say $g$, with $|z_1|, \ldots, |z_k| < 1$. Note that $g(0)$ is a nonzero integer, and $|g(0)| = |z_1| \cdots |z_k| < 1$, a contradiction. Therefore, $f$ is irreducible.
Generalizations
In his publication Perron provided variants of the criterion for multivariate polynomials over arbitrary fields. In 2010, Bonciocat published novel proofs of these criteria.
See also
Eisenstein's criterion
Cohn's irreducibility criterion
References
Polynomials
Theorems in algebra |
https://en.wikipedia.org/wiki/Jared%20Roach | Jared C. Roach is an American biologist who invented the pairwise end sequencing strategy while a graduate student at the University of Washington.
Education and early career
Roach attended Cornell University, where he received his Bachelor of Science in biology in 1990. He then attended the University of Washington, where he received his PhD in immunology in 1998, and his MD in 1999. He trained in internal medicine at the University of Utah through 2001.
Career
Starting as a graduate student in the 1990s, Roach worked on the Human Genome Project from its early days through its conclusion in 2003. He invented pairwise end-sequencing while a graduate student in Leroy Hood's laboratory.
Roach was a senior fellow at the department of molecular biotechnology at the University of Washington from 1999 to 2000. In 2001, he became a research scientist at the Institute for Systems Biology.
In 2009, Roach was first author on a project which sequenced the whole genomes of a family of four, including two children affected by Miller syndrome and primary ciliary dyskinesia. This effort identified the cause of Miller syndrome, a simple recessive Mendelian disorder. It also produced the first complete whole-chromosomal parental haplotypes in humans. Parental haplotyping is the process of assigning all the variants in the genome to paternal and maternal chromosomes. The team applied these techniques to identify genetic mutations related to several genetic diseases, including genes for Adams–Oliver syndrome, alternating hemiplegia of childhood, certain subtypes of epilepsy, palmoplantar keratoderma, and Fanconi anemia.
From 2007 to 2009, he was scientific director of the High-Throughput Analysis Core (HAC) laboratory at Seattle Children’s Hospital. Since 2009, he has been a senior research scientist at the Institute for Systems Biology. Roach's group currently applies systems biology to complex genetic diseases, focusing on Alzheimer’s disease.
In 2020, Roach was involved in |
https://en.wikipedia.org/wiki/Cure%20Rare%20Disease | Cure Rare Disease is a non-profit biotechnology company based in Boston, Massachusetts that is working to create novel therapeutics using gene therapy, gene editing (CRISPR technology) and antisense oligonucleotides to treat people impacted by rare and ultra-rare genetic neuromuscular conditions.
History
Richard Horgan founded Terry's Foundation for Muscular Dystrophy in 2017, which became Cure Rare Disease in 2018, in order to develop a cure for Duchenne muscular dystrophy for his brother, who has battled the disease since childhood. Leveraging his network from Harvard Business School, Horgan formed a collaboration of leading researchers and clinicians around the country to develop the treatment.
Horgan connected first with a scientist at Boston Children's Hospital, Dr. Timothy Yu, who had just successfully created a custom drug, using antisense oligonucleotide (ASO) technology, for a girl with the neurodegenerative condition Batten disease. Horgan's brother's mutation is not amenable to ASO technology, so Horgan adopted the process but used CRISPR as the technology to attempt to cure his brother.
This collaboration has expanded over the past three years and has led to the addition of notable researchers and institutions collaborating with Cure Rare Disease on their mission to treat rare disease.
Research
There are currently three drugs approved by the FDA for Duchenne muscular dystrophy, treating patients with mutations of the dystrophin gene encompassing exons 51, 53, and 45. However, people with DMD have mutations impacting different exons of the gene, so these drugs do not work for all patients.
Cure Rare Disease is developing novel therapeutics using gene replacement, gene editing (CRISPR gene-editing) and antisense oligonucleotide technologies. To systemically deliver a subset of therapeutics, including CRISPR, the therapeutic is inserted into the adeno-associated virus (AAV |
https://en.wikipedia.org/wiki/Hard%20privacy%20technologies | Hard privacy technologies are methods of protecting data. Hard privacy technologies and soft privacy technologies both fall under the category of privacy-enhancing technologies. Hard privacy technologies allow online users to protect their privacy through different services and applications without having to trust third parties. The data-protection goals are data minimization, the reduction of trust in third parties, and the freedom (and techniques) to conceal information or to communicate.
Applications of hard privacy technologies include onion routing, VPNs and the secret ballot used for democratic elections.
Systems for anonymous communications
Mix networks
Mix networks use both cryptography and permutations to provide anonymity in communications. The combination makes monitoring end-to-end communications more challenging for eavesdroppers, since it breaks the link between the sender and recipients.
Dining Cryptographers Net (DC-net)
DC-net is a round-based communication protocol that enables secure, unobservable communication: in each round, a participant can publish a one-bit message without observers being able to tell who sent it.
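A toy Python sketch of a single round with three participants may clarify the mechanism (this is an illustration, not production cryptography):

```python
# A toy sketch of one DC-net round with three participants. Each pair of
# participants shares a secret random bit; every participant publishes the
# XOR of their shared bits, and the sender additionally XORs in the message
# bit. The XOR of all published bits reveals the message but not the sender.
import secrets

def dc_net_round(sender_index, message_bit, n=3):
    # pairwise shared secret bits
    shared = {(i, j): secrets.randbits(1) for i in range(n) for j in range(i + 1, n)}
    published = []
    for i in range(n):
        bit = 0
        for (a, b), s in shared.items():
            if i in (a, b):
                bit ^= s          # each shared bit appears in exactly two outputs
        if i == sender_index:
            bit ^= message_bit
        published.append(bit)
    result = 0
    for b in published:
        result ^= b               # shared bits cancel; only the message survives
    return result

print(dc_net_round(sender_index=1, message_bit=1))  # 1
```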
The Integrated Services Digital Network (ISDN)
ISDN is based on a digital telecommunications network, i.e. a digital 64 kbit/s channel network. ISDN is primarily used for the swapping of networks; therefore it offers effective service for communication.
Attacks against anonymous communications
Attacks on anonymity systems typically rely on traffic analysis, which traces information such as who is talking with whom and extracts communication profiles. Traffic analysis can be used against both vanilla and hardened systems.
Examples of hard privacy technologies
Onion routing
Onion routing is an internet-based encrypted technique to prevent eavesdropping, traffic analysis attacks and so on. Messages in an onion network are wrapped in layers of encryption, with the next destination encrypted in each layer. For each router, the message is decrypted
https://en.wikipedia.org/wiki/Shoploop | Shoploop is a video shopping platform developed by Google that enables products to be promoted in videos of at most 90 seconds.
References
External links
Google software
E-commerce websites
E-commerce software |
https://en.wikipedia.org/wiki/Lion%20Attacking%20a%20Dromedary | Lion Attacking a Dromedary is an orientalist diorama by French taxidermist Édouard Verreaux in the collection of the Carnegie Museum of Natural History. It depicts a fictional scene of a man on a dromedary struggling to fend off an attack by a Barbary lion.
The diorama was created for the Paris Exposition of 1867 and subsequently shown at the American Museum of Natural History, Centennial Exposition, and the Carnegie Museum of Natural History. Since the 1890s, Lion Attacking a Dromedary has been criticized for its sensationalism and lack of accuracy. The male figure, referred to as an Arab by Verreaux, is a fictional pastiche of five North African cultures. The diorama is considered to be Verreaux's masterpiece.
Lion Attacking a Dromedary was purchased by the Carnegie Museum of Natural History in 1898. As part of a 2017 restoration, the museum found human remains in the diorama. In 2020, the diorama was removed from view in response to the Black Lives Matter movement and the lack of accuracy. Later that year it was returned to public view with additional context. Three years later, the exhibit was permanently removed from public view due to a newly enacted human-remains policy.
Creation and early exhibitions
Lion Attacking a Dromedary was created by French taxidermist Édouard Verreaux. Édouard was part of Maison Verreaux, a French taxidermy studio, with his brother Jules Verreaux. Verreaux created the work with the remains of a human, two Barbary lions, and a dromedary that were collected in Africa. The location from which the skins and bones were sourced and the date on which they were collected are unknown. The positioning of the human and lions in the diorama was based on Arab Horseman Killing a Boar and The Tiger Hunt by French sculptor Antoine-Louis Barye.
Lion Attacking a Dromedary was first displayed at the Paris Exposition of 1867 where it won a gold medal. After the death of Verreaux in 1867, Lion Attacking a Dromedary was sold to the American Museum of N |
https://en.wikipedia.org/wiki/Strategic%20Advisory%20Group%20of%20Experts | The Strategic Advisory Group of Experts (SAGE) is the principal advisory group to the World Health Organization (WHO) for vaccines and immunization. It was established in 1999 by WHO Director-General Gro Harlem Brundtland through the merger of two previous committees: the Scientific Advisory Group of Experts (which served the Programme for Vaccine Development) and the Global Advisory Group (which served the EPI programme). It is charged with advising WHO on overall global policies and strategies, ranging from vaccines and biotechnology, research and development, to delivery of immunization and its linkages with other health interventions. SAGE is concerned not just with childhood vaccines and immunization, but with all vaccine-preventable diseases. SAGE provides global recommendations on immunization policy, which are further translated by advisory committees at the country level.
Membership
The SAGE has 15 members, who are recruited and selected as acknowledged experts from around the world in the fields of epidemiology, public health, vaccinology, paediatrics, internal medicine, infectious diseases, immunology, drug regulation, programme management, immunization delivery, health-care administration, health economics, and vaccine safety. Members are appointed by the Director-General of the WHO to serve an initial term of 3 years, which can be renewed only once.
Working groups
SAGE meets at least twice annually in April and November, with working groups established for detailed review of specific topics prior to discussion by the full group. Priorities of work and meeting agendas are developed by the Group in consultation with WHO.
UNICEF, the Secretariat of the GAVI Alliance, and WHO Regional Offices participate as observers in SAGE meetings and deliberations. WHO also invites other observers to SAGE meetings, including representatives from WHO regional technical advisory groups, non-governmental organizations, international professional organizations, |
https://en.wikipedia.org/wiki/Doces%20de%20ovos | Doce de ovos is a sweet egg cream from Portuguese cuisine made with egg yolks and simple syrup. It is used as a filling for layered sponge cakes, and can be used as a sweet topping for ice creams and other desserts like Natas do Céu. The cream must be prepared at low temperature or in a bain-marie to prevent the egg yolks from coagulating.
It is a common component of products offered in Portuguese doçaria (confectionery) stores.
See also
Fios de ovos
Ovos moles
References
Portuguese confectionery
Egg dishes
Food ingredients |
https://en.wikipedia.org/wiki/2Sum | 2Sum is a floating-point algorithm for computing the exact round-off error in a floating-point addition operation.
2Sum and its variant Fast2Sum were first published by Ole Møller in 1965.
Fast2Sum is often used implicitly in other algorithms such as compensated summation algorithms; Kahan's summation algorithm was published first in 1965, and Fast2Sum was later factored out of it by Dekker in 1971 for double-double arithmetic algorithms.
The names 2Sum and Fast2Sum appear to have been applied retroactively by Shewchuk in 1997.
Algorithm
Given two floating-point numbers $a$ and $b$, 2Sum computes the floating-point sum $s := a \oplus b$ rounded to nearest and the floating-point error $t$ so that $a + b = s + t$, where $\oplus$ and $\ominus$ denote correctly rounded floating-point addition and subtraction.
The error $t$ is itself a floating-point number.
Inputs: floating-point numbers $a, b$
Outputs: sum $s$ and error $t$
  $s := a \oplus b$
  $a' := s \ominus b$
  $b' := s \ominus a'$
  $\delta_a := a \ominus a'$
  $\delta_b := b \ominus b'$
  $t := \delta_a \oplus \delta_b$
  return $(s, t)$
Provided the floating-point arithmetic is correctly rounded to nearest (with ties resolved any way), as is the default in IEEE 754, and provided the sum does not overflow and, if it underflows, underflows gradually, it can be proven that $a + b = s + t$.
A variant of 2Sum called Fast2Sum uses only three floating-point operations, for floating-point arithmetic in radix 2 or radix 3, under the assumption that the exponent of $a$ is at least as large as the exponent of $b$, such as when $|a| \geq |b|$:
Inputs: radix-2 or radix-3 floating-point numbers $a$ and $b$, of which at least one is zero, or which respectively have normalized exponents $e_a \geq e_b$
Outputs: sum $s$ and error $t$
  $s := a \oplus b$
  $z := s \ominus a$
  $t := b \ominus z$
  return $(s, t)$
Even if the conditions are not satisfied, 2Sum and Fast2Sum often provide reasonable approximations to the error, i.e. $t \approx a + b - s$, which enables algorithms for compensated summation, dot-product, etc., to have low error even if the inputs are not sorted or the rounding mode is unusual.
More complicated variants of 2Sum and Fast2Sum also exist for rounding modes other than round-to-nearest.
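For illustration, a direct Python rendering of the two procedures, assuming IEEE 754 doubles with round-to-nearest and no overflow:

```python
# A minimal Python rendering of 2Sum and Fast2Sum (IEEE 754 doubles,
# round-to-nearest assumed, no overflow).
def two_sum(a: float, b: float):
    s = a + b
    a_prime = s - b
    b_prime = s - a_prime
    delta_a = a - a_prime
    delta_b = b - b_prime
    t = delta_a + delta_b
    return s, t  # a + b == s + t exactly

def fast_two_sum(a: float, b: float):
    # requires the exponent of a >= exponent of b, e.g. abs(a) >= abs(b)
    s = a + b
    z = s - a
    t = b - z
    return s, t

s, t = two_sum(1e16, 1.0)
print(s, t)  # 1e+16 1.0  (the 1.0 is lost in s but recovered in t)
```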
See also
Kahan summation algorithm
Round-off error
Double-double arithmetic
References
Computer arithmetic
Floating point
Numerical analysis |
https://en.wikipedia.org/wiki/List%20of%20elephant%20species%20by%20population | This is a list of estimated global populations of proboscidean species, including their delineated subspecies. The list is generally comprehensive, but there is uncertainty in some of the estimates.
See also
Lists of organisms by population
Lists of mammals by population
References
Mammals
Elephant
Proboscideans
Elephants |
https://en.wikipedia.org/wiki/Huang%27s%20law | Huang's law is an observation in computer science and engineering that advancements in graphics processing units (GPU) are growing at a rate much faster than with traditional central processing units (CPU). The observation is in contrast to Moore's law that predicted the number of transistors in a dense integrated circuit (IC) doubles about every two years. Huang's law states that the performance of GPUs will more than double every two years. The hypothesis is subject to questions about its validity.
History
The observation was made by Jensen Huang, then chief executive officer of Nvidia, at its 2018 GPU Technology Conference (GTC) held in San Jose, California. He observed that Nvidia's GPUs were "25 times faster than five years ago" whereas Moore's law would have expected only a ten-fold increase. As microchip components became smaller, it became harder for chip advancement to keep pace with Moore's law.
In 2006 Nvidia's GPU had a 4x performance advantage over comparable CPUs. In 2018 the Nvidia GPU was 20 times faster than a comparable CPU node: the GPUs became about 1.7x faster each year. Moore's law would predict a doubling every two years; instead, Nvidia's GPU performance more than tripled every two years, fulfilling Huang's law.
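As a back-of-envelope check of the rates quoted above (illustrative arithmetic only):

```python
# Compare compound growth under Moore's law (double every two years)
# with the ~1.7x annual GPU speedup quoted above.
years = 2
gpu_per_year = 1.7              # quoted annual GPU speedup
moore = 2 ** (years / 2)        # Moore's law over the same period
huang = gpu_per_year ** years
print(f"Moore over {years} yr: {moore:.2f}x, Huang over {years} yr: {huang:.2f}x")
# Moore over 2 yr: 2.00x, Huang over 2 yr: 2.89x (close to "more than tripled")
```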
Huang's law claims that a synergy between hardware, software and artificial intelligence makes the new 'law' possible. Huang said: "The innovation isn't just about chips. It's about the entire stack." He said that graphics processors especially are important to a new paradigm. Elimination of bottlenecks can speed up the process and create advantages in getting to the goal. "Nvidia is a one trick pony," Huang has said. According to Huang: "Accelerated computing is liberating, ... Let's say you have an airplane that has to deliver a package. It takes 12 hours to deliver it. Instead of making the plane go faster, concentrate on how to deliver the package faster, look at 3D printing at the destination." The object "... is to del
https://en.wikipedia.org/wiki/Gastronomica | Gastronomica: The Journal of Food and Culture is a peer-reviewed interdisciplinary academic journal with a focus on food. It is published by the University of California Press. It was founded by Darra Goldstein in 2001.
Awards
The journal has received a number of accolades:
Prix d'Or at the Gourmet Voice World Media Festival in 2004
2007 Utne Independent Press Award for Social/Cultural Coverage
Best Food Magazine in the World at the 2011 Gourmand Awards in Paris
Co-winner of the 2012 James Beard Foundation Award for Best Publication of the Year.
References
External links
University of California Press academic journals
Food and drink magazines
Quarterly magazines published in the United States
Magazines established in 2001
Magazines published in California
History of food and drink |
https://en.wikipedia.org/wiki/Shelf-break%20front | Shelf-break fronts are a process by which stratification of the water column occurs. This stratification normally results in thermoclines, since the fronts occur where a sudden change in water depth causes a constriction of the current flow. They can be characterized by the ratio of the potential energy needed to maintain mixed (non-stratified) conditions to the energy dissipated by the current being forced across the sudden change in depth.
The energy terms can be expressed in very detailed equations, but with constant terms factored out, the important terms are the water velocity (average velocity $\bar{u}$) and the water depth ($h$).
The stratification index can be expressed as:
$$S = \log_{10}\!\left(\frac{h}{c_D\,\bar{u}^{3}}\right),$$
where $c_D$ is a friction coefficient, approximated as 0.003 for a sandy bottom. This index can be calculated for any coastal region, usually in the range of +3 (highly stratified) to -2 (highly turbulent).
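For illustration, a minimal Python sketch, assuming the index takes the form $S = \log_{10}(h/(c_D\bar{u}^3))$ given above; the depth and speed values are invented:

```python
# A minimal sketch of the stratification index, assuming the form
# S = log10(h / (c_d * u**3)) with depth h in metres and mean current
# speed u in m/s; the example values are arbitrary.
import math

def stratification_index(h: float, u: float, c_d: float = 0.003) -> float:
    """h: water depth (m); u: depth-averaged current speed (m/s);
    c_d: bottom friction coefficient (0.003 for a sandy bottom)."""
    return math.log10(h / (c_d * u ** 3))

print(round(stratification_index(h=2.5, u=3.0), 2))  # ~1.49, near the productive ~1.5
```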
Reason to calculate
The stratification index for a Shelf Break Front is an indication of how productive phytoplankton will be. When the stratification index is approximately 1.5, this produces a nutrient-rich environment for the growth of phytoplankton. Too much higher, and the stratification of the water column will not cause the upwellings of nutrients needed for the phytoplankton to prosper, too much lower, and the water will be too turbulent for the phytoplankton to use the nutrients available.
Stability of the front, in addition to nutrients, is a key to phytoplankton production.
An illustration of the stratification index for Narragansett Bay can be produced with estimated average speeds, the actual bathymetry of the bay, and an estimated $c_D$ for silt, which composes much of the bay's bottom. Using the Stokes spreadsheet, with some customization for the size of silt particles, a value of $c_D = 0.0011$ was used. More accurate speed measurements and detailed values for the bay's bottom could yield a higher-fidelity image.
Notice the green color (a stratification ind |
https://en.wikipedia.org/wiki/Mary%20Clem | Mary A. Clem (née Mary A. McLaughlin; 1905–1979) was an American mathematician and human computer. She was a staff member at Iowa State University, and was recognized for inventing the "zero check" technique for detecting errors.
Biography
Clem was born on October 19, 1905 in the small town of Nevada, in Story County, central Iowa. She completed high school and found employment for several years with the Iowa State Highway Commission and Iowa State College as a computing clerk, auditing clerk, and bookkeeper.
In 1931, she joined the Mathematics Statistical Service of the Mathematics Department of Iowa State College to work as a human computer under the supervision of George Snedecor. Although she complained that mathematics was her poorest subject in high school, she was fascinated with figures and data. Most of her work was done via punch cards, both creating formulas and cards, and running accuracy checks on them. She invented the “zero check” while working in Snedecor’s lab. The “zero check” is a sum that should equal zero if all other numbers had been correctly calculated. These sums helped check for errors in computing algorithms. Clem expressed that her lack of training as a mathematician is what made her notice these sums, as they had often been overlooked by others. In 1940, Clem was advanced to be technician and chief statistical clerk in charge of the Computing Service of the Statistical Laboratory. In 1962, she transferred to the new Computation Center at Iowa State University.
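As a toy illustration of the idea (the numbers are invented), a modern analogue of the zero check appends a balancing entry to each row so that a correct computation sums to zero:

```python
# A toy illustration of a "zero check": derive a control entry so that each
# row sums to zero; any nonzero row sum flags a computation or keypunch error.
rows = [
    [12.5, -3.25, 9.0],
    [4.0, 4.0, -8.0],
]
checked = [row + [-sum(row)] for row in rows]   # append the balancing entry
for row in checked:
    assert abs(sum(row)) < 1e-9                 # zero check passes
print(checked)
```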
Clem went on the 2nd Allied Mission to Greece in 1946 as a junior statistician, and there she observed the elections. In 1952, she was a statistical consultant to the Atomic Bomb Casualty Commission in Hiroshima, Japan.
Publications
Homeyer, Paul G.; Clem, Mary A.; and Federer, Walter T. (1947) "Punched card and calculating machine methods for analyzing lattice experiments including lattice squares and the cubic lattice," Research Bulletin (Iowa Agriculture and |
https://en.wikipedia.org/wiki/Operation%20DisrupTor | Operation DisrupTor was an international investigation targeting drug traffickers on the dark web. Coordinated by the Joint Criminal Opioid and Darknet Enforcement, the operation was initiated and managed by the Federal Bureau of Investigation. The operation also included assistance from the Drug Enforcement Administration, U.S. Immigration and Customs Enforcement, United States Secret Service, United States Postal Inspection Service, IRS Criminal Investigation, Bureau of Alcohol, Tobacco, Firearms and Explosives, and local law enforcement agencies.
The investigation resulted in 179 arrests worldwide, in addition to 6.5 million dollars in seized funds (including digital currency) and 500 kilograms of drugs.
The operation was announced on September 22, 2020 by Deputy Attorney General Jeffrey A. Rosen at a joint press conference. The name of the operation is a portmanteau of "disrupt" and Tor, an open-source anonymity network that is often used to access the dark web.
References
External links
https://www.justice.gov/usao-dc/press-release/file/1301021/download
https://www.justice.gov/usao-wdpa/press-release/file/1318346/download
https://www.justice.gov/usao-wdpa/press-release/file/1318351/download
https://www.justice.gov/usao-wdpa/press-release/file/1318341/download
https://www.justice.gov/usao-wdpa/press-release/file/1318331/download
https://www.justice.gov/opa/page/file/1318706/download
https://www.justice.gov/opa/page/file/1318771/download
https://www.justice.gov/opa/page/file/1318671/download
https://www.justice.gov/opa/page/file/1318776/download
Federal Bureau of Investigation operations
United States intelligence operations
2020 in American law
Cybercrime |
https://en.wikipedia.org/wiki/Division%20lattice | The division lattice is an infinite complete bounded distributive lattice whose elements are the natural numbers ordered by divisibility. Its least element is 1, which divides all natural numbers, while its greatest element is 0, which is divisible by all natural numbers. The meet operation is greatest common divisor while the join operation is least common multiple.
The prime numbers are precisely the atoms of the division lattice, namely those natural numbers divisible only by themselves and 1.
For any square-free number n, its divisors form a Boolean algebra that is a sublattice of the division lattice. The elements of this sublattice are representable as the subsets of the set of prime factors of n. The converse also holds, namely that every sublattice of the division lattice that forms a Boolean algebra is isomorphic to the lattice of divisors of a square-free number.
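For illustration, a minimal Python sketch of the lattice operations, with 0 acting as the greatest element:

```python
# A small sketch of the division-lattice operations: meet = gcd, join = lcm,
# with 0 as the greatest element (every natural number divides 0).
from math import gcd

def meet(a: int, b: int) -> int:
    return gcd(a, b)           # gcd(0, n) == n, consistent with 0 being the top

def join(a: int, b: int) -> int:
    if a == 0 or b == 0:
        return 0               # 0 is the greatest element
    return a * b // gcd(a, b)

print(meet(12, 18), join(12, 18))  # 6 36
```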
References
Lattice theory
Boolean algebra
Prime numbers |
https://en.wikipedia.org/wiki/Cryptomenysis%20Patefacta | Cryptomenysis Patefacta, or Art of Secret Information Disclosed Without a Key is a 1685 non-fiction book written by John Falconer; it was only the second text written in English on the topic of cryptography. In 1693 it was republished as Rules for Explaining and Deciphering All Manner of Secret Writing.
The book serves as a guide to various cyphers including Egyptian hieroglyphs and fingerspelling.
References
1685 books
Cryptography books
Non-fiction books |
https://en.wikipedia.org/wiki/Kryptographik | Kryptographik Lehrbuch der Geheimschreibekunst (Cryptology: Instruction Book on the Art of Secret Writing) is an 1809 book on cryptography written by Johann Ludwig Klüber.
In 2011 the National Security Agency included a copy of Kryptographik, used by a German cryptographer during World War II, as part of a 50,000 page release of classified documents.
References
1809 non-fiction books
Cryptography books
Classified documents |
https://en.wikipedia.org/wiki/Shadow%20library | Shadow libraries are online databases of readily available content that is normally obscured or otherwise not readily accessible. Such content may be inaccessible for a number of reasons, including the use of paywalls, copyright controls, or other barriers to accessibility placed upon the content by its original owners. Shadow libraries usually consist of textual information as in electronic books, but may also include other digital media, including software, music, or films.
Examples of shadow libraries include Anna's Archive, Library Genesis, Sci-Hub and Z-Library, which are popular book and academic shadow libraries and may be the largest public libraries for books and literature.
Motivation
One of the primary motivations behind the creation of shadow libraries is to more readily disseminate academic content, especially papers from academic journals. Academic literature has become increasingly expensive, as costs to access information created by scholars have risen dramatically in recent years, especially the cost of books. The term serials crisis has emerged to describe this ongoing trend.
Conversely, the forces behind the serials crisis have also given rise to a concerted international political movement to make academic knowledge free or very cheap, known as the Open Access movement. The Open Access movement strives to establish both journals that are free to access (known as open access journals) and free-to-access repositories of academic journal papers published elsewhere. However, many open access journals require academics to pay fees to be published, which disincentivizes academics from publishing in such journals.
A tertiary motivator for the establishment of shadow libraries is the tacit endorsement by many academics of such efforts. Academics are rarely compensated by publishers for their work, regardless of whether their work is published in an open access journal or a conventionally priced journal. Thus, there i |
https://en.wikipedia.org/wiki/Dangerous%20Things | Dangerous Things is a Seattle-based cybernetic microchip biohacking implant retailer formed in 2013 by Amal Graafstra, following a crowdfunding campaign.
Dangerous Things built the first personal, publicly available, implantable NFC-compliant transponder in 2013. In September 2020, Dangerous Things began another highly successful crowdfunding campaign to realize the world's first titanium-encased, fully biocompatible sensing magnet, named the Titan.
References
Companies based in Seattle
Bionics
Biotechnology companies of the United States
Companies established in 2013 |
https://en.wikipedia.org/wiki/Modular%20Cognition%20Framework | The Modular Cognition Framework (MCF) is an open-ended theoretical framework for research into the way the mind is organized. It draws on the common ground shared by contemporary research in the various areas that are collectively known as cognitive science and is designed to be applicable to all these fields of research. It was established, by Michael Sharwood Smith and John Truscott in the first decade of the 21st century with a particular focus on language cognition when it was known as the MOGUL framework (Modular Online Growth and Use of Language).
The MCF is open-ended in the sense that it has a set of basic principles (see below) describing the architecture of the human mind: these amount to setting out a skeleton model of the mind and providing a template for cognitive scientists to use. Both mind and brain are viewed as biological phenomena but at different levels of abstraction. These fundamental principles can be further interpreted in various ways by any researcher working with a theoretical approach that reflects, or can be aligned with, the basic principles. In doing so, researchers can identify their own hypotheses and research findings not only as confirming or challenging their own theory but also as a manifestation of the basic principles underlying all cognitive processing and representation.
By the end of 2020 four books based specifically on the framework had been published, along with over 35 articles and chapters; numerous publications and theses by researchers using the MCF for their own purposes had also appeared. This work has built out the framework, giving it a richer, more elaborate structure in those areas that have been investigated. Nonetheless, different versions of the elaborations can still be proposed.
The predominant assumption of the MCF is that the mind is composed of a collaborative network of functionally specialized systems which have evolved over time together with their physical manifestations in the brain t |
https://en.wikipedia.org/wiki/Tenable%2C%20Inc. | Tenable, Inc. is a cybersecurity company based in Columbia, Maryland. It is known as the creator of the vulnerability scanning software Nessus.
History
Tenable was founded in 2002 as Tenable Network Security, Inc. The original co-founders of Tenable were Ron Gula, Jack Huffard, and Renaud Deraison. In 2012, Tenable received its first round of institutional funding in the form of $50 million from the venture capital firm Accel Partners. In 2017, the company was renamed Tenable, Inc. Its initial public offering (IPO) took place on the Nasdaq on July 26, 2018.
Acquisitions
In 2016, Tenable acquired the cybersecurity company FlawCheck. In 2019, Tenable paid $78 million to acquire the Israel-based operational technology company Indegy Ltd.
In April 2022, Tenable announced plans to acquire an attack surface management software startup, Bit Discovery, for $45 million in cash. The acquisition was expected to close in the second quarter of 2022.
In September 2023, Tenable announced plans to acquire Ermetic—an Israeli cloud-native application protection startup—for $240 million in cash and $25 million in restricted stock and RSUs. The acquisition is expected to close in the fourth quarter of 2023.
References
External links
American companies established in 2002
Security companies of the United States
Technology companies of the United States
Companies based in Columbia, Maryland
Computer security companies |
https://en.wikipedia.org/wiki/Chance-constrained%20portfolio%20selection | Chance-constrained portfolio selection is an approach to portfolio selection under Loss aversion.
The formulation assumes that (i) an investor's preferences are representable by the expected utility of final wealth, and that (ii) the probability of final wealth falling below a survival or safety level must be acceptably low.
The chance-constrained portfolio problem is then to find:
$$\max_w \sum_j w_j E(X_j), \quad \text{subject to} \quad \Pr\Big(\sum_j w_j X_j < s\Big) \leq \alpha, \quad \sum_j w_j = 1, \quad w_j \geq 0 \text{ for all } j,$$
where $s$ is the survival level and $\alpha$ is the admissible probability of ruin; $w_j$ is the weight and $X_j$ is the (random) value of the $j$th asset to be included in the portfolio.
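Under an assumption of normally distributed returns (made here only for illustration; it is not required by the general formulation), the probabilistic constraint can be rewritten deterministically as $\sum_j w_j E(X_j) - z_{1-\alpha}\sqrt{w^\top \Sigma w} \geq s$. The following Python sketch solves this reformulation numerically; the return vector, covariance matrix, and survival level are invented:

```python
# A hedged sketch of chance-constrained selection assuming normal returns:
# Pr(w'X < s) <= alpha  becomes  w'mu - z_{1-alpha} * sqrt(w'Sigma w) >= s.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

mu = np.array([1.05, 1.10])               # expected gross returns (assumed)
Sigma = np.array([[0.0016, 0.0002],
                  [0.0002, 0.0064]])      # return covariance (assumed)
s, alpha = 0.98, 0.05                     # survival level, ruin probability
z = norm.ppf(1 - alpha)

cons = [
    {"type": "eq",   "fun": lambda w: w.sum() - 1},
    {"type": "ineq", "fun": lambda w: w @ mu - z * np.sqrt(w @ Sigma @ w) - s},
]
res = minimize(lambda w: -(w @ mu), x0=np.array([0.5, 0.5]),
               bounds=[(0, 1)] * 2, constraints=cons)
print(res.x, res.x @ mu)   # weights and expected wealth at the optimum
```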
The original implementation is based on the seminal work of Abraham Charnes and William W. Cooper on stochastic programming in 1959,
and was first applied to finance by Bertil Naslund and Andrew B. Whinston in 1962
and in 1969 by N. H. Agnew, et al.
For fixed $\alpha$ the chance-constrained portfolio problem represents lexicographic preferences and is an implementation of capital asset pricing under loss aversion.
In general, though, it is observed that no utility function can represent the preference ordering of chance-constrained programming, because a fixed $\alpha$ does not admit compensation for a small increase in $\alpha$ by any increase in expected wealth.
Comparisons to mean-variance and safety-first portfolio problems, surveys of solution methods, and discussions of the risk-aversion properties of chance-constrained portfolio selection are available in the literature.
See also
Portfolio optimization
Loss aversion
Stochastic programming
Expected utility theory
Lexicographic preferences
Capital asset pricing model
Post modern portfolio theory
References
Portfolio theories
Stochastic optimization
Financial economics
Actuarial science |