https://en.wikipedia.org/wiki/List%20of%20College%20and%20University%20Agricultural%20Engineering%20Departments | Below is a listing of known academic programs that offer bachelor's degrees (B.S. or B.S.E. or B.E / B.Tech) in what ABET terms "Agricultural Engineering", "Biosystems Engineering", "Biological Engineering", or similarly named programs. ABET accredits college and university programs in the disciplines of applied science, computing, engineering, and engineering technology.
The Americas
North America
Mexico, Central and South America
Europe
Asia
Oceania
Africa
External links
Bureau of Labor Statistics, Agricultural Engineering Education
https://www.bls.gov/ooh/architecture-and-engineering/agricultural-engineers.htm#tab-4
Engineering education
Agricultural Engineering Departments |
https://en.wikipedia.org/wiki/Palmitate%20mediated%20localization | Palmitate mediated localization is a biological process that trafficks a palmitoylated protein to ordered lipid domains.
Biological function
One function is thought to cluster proteins to increase the efficiency of protein-protein interactions and facilitate biological processes. In the opposite scenario palmitate mediated localization sequesters proteins away from a non-localized molecule. In theory, disruption of palmitate mediated localization then allows a transient interaction of two molecules through lipid mixing. In the case of an enzyme, palmitate can sequester an enzyme away from its substrate. Disruption of palmitate mediated localization then activates the enzyme by substrate presentation.
Mechanism of sequestration
Palmitate mediated localization utilizes lipid partitioning and the formation of lipid rafts. Sequestration of palmitoylated proteins is regulated by cholesterol. Depletion of cholesterol with methyl-beta cyclodextrin disrupts palmitate mediated localization.
References
Biological processes |
https://en.wikipedia.org/wiki/Grigore%20Ro%C8%99u | Grigore Roșu is a computer science professor at the University of Illinois at Urbana-Champaign and a researcher in the
Information Trust Institute.
He is known for his contributions in runtime verification, the K framework,
matching logic,
and automated coinduction.
Biography
Roșu received a B.A. in Mathematics in 1995 and an M.S. in Fundamentals of Computing in 1996, both from the University of Bucharest, Romania, and a Ph.D. in Computer Science in 2000 from the University of California at San Diego. Between 2000 and 2002 he was a research scientist at NASA Ames Research Center. In 2002, he joined the department of computer science at the University of Illinois at Urbana–Champaign as an assistant professor. He became an associate professor in 2008 and a full professor in 2014.
Awards
IEEE/ACM most influential paper of the International Conference on Automated Software Engineering (ASE) award in 2016 (for an ASE 2001 paper)
Runtime Verification (RV) test of time award (for an RV 2001 paper)
ACM distinguished paper awards at ASE 2008, ASE 2016, and OOPSLA 2016
Best software science paper award at ETAPS 2002
NSF CAREER award in 2005
Ad Astra award in 2016
Contributions
Roșu coined the term "runtime verification" together with Havelund as the name of a workshop started in 2001, aimed at addressing problems at the boundary between formal verification and testing. Roșu and his collaborators introduced algorithms and techniques for parametric property monitoring, efficient monitor synthesis, runtime predictive analysis, and monitoring-oriented programming. Roșu also founded Runtime Verification, Inc., a company aimed at commercializing runtime verification technology.
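To give a flavor of what runtime monitoring does, the sketch below checks a simple parametric property over an event trace: every file that is opened must eventually be closed, tracked per file parameter. This is a toy illustration of the idea, not one of Roșu's published algorithms.

```python
def monitor_open_close(trace):
    """Toy parametric monitor: for each resource name, an 'open' event must be
    matched by a later 'close'. Returns the set of violating resources."""
    open_files = set()
    for event, resource in trace:
        if event == "open":
            open_files.add(resource)
        elif event == "close":
            open_files.discard(resource)
    # Resources still open at the end of the trace violate the property.
    return open_files

trace = [("open", "a.txt"), ("open", "b.txt"), ("close", "a.txt")]
print(monitor_open_close(trace))  # {'b.txt'}
```

A real monitor synthesized from a formal specification would also report violations online, as events arrive, rather than only at the end of the trace.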
Roșu created and led the design and development of the K framework, an executable semantic framework in which programming languages, type systems, and formal analysis tools are defined using configurations, computations, and rewrite rules.
Language tools such as inter |
https://en.wikipedia.org/wiki/Linux%20kernel%20version%20history | This article documents the version history of the Linux kernel. The Linux kernel is a free and open-source, monolithic, Unix-like operating system kernel. It was conceived and created in 1991 by Linus Torvalds.
Linux kernels have different support levels depending on the version. Usually, each stable version continues to backport bug fixes from the mainline until the next stable version is released. However, if a stable version has been designated as a long-term support (LTS) kernel, it will be maintained for an extra few years. After that, versions designated as Super-Long-Term Support (SLTS) will then be maintained by the Civil Infrastructure Platform (CIP) for many more years.
Releases 6.x.y
Releases 5.x.y
Releases 4.x.y
Releases 3.x.y
The jump from 2.6.x to 3.x was not caused by a breaking update; it simply marked the first release under a new versioning scheme, introduced as a more convenient system.
Releases 2.6.x.y
Versions 2.6.16 and 2.6.27 of the Linux kernel were unofficially given long-term support (LTS), before a 2011 working group in the Linux Foundation started a formal long-term support initiative.
Releases up to 2.6.0
See also
Linux adoption
Linux kernel
History of Linux
Timeline of free and open-source software
References
External links
Official Linux kernel website
Active kernel releases, on the official Linux kernel website
Linux versions changelog, in Linux Kernel Newbies
Linux Kernel Version History: Consolidated list
Linux kernel
Software version histories |
https://en.wikipedia.org/wiki/Futhark%20%28programming%20language%29 | Futhark is a functional data parallel array programming language originally developed at UCPH Department of Computer Science (DIKU) as part of the HIPERFIT project. It focuses on enabling data parallel programs written in a functional style to be executed with high performance on massively parallel hardware, in particular on graphics processing units (GPUs). Futhark is strongly inspired by NESL, and its implementation uses a variant of the flattening transformation, but imposes constraints on how parallelism can be expressed in order to enable more aggressive compiler optimisations. In particular, irregular nested data parallelism is not supported.
Overview
Futhark is a language in the ML family, with an indentation-insensitive syntax derived from OCaml, Standard ML, and Haskell. The type system is based on Hindley-Milner with a variety of extensions, such as uniqueness types and size-dependent types. Futhark is not intended as a general-purpose programming language for writing full applications, but is instead focused on writing computational "kernels" (not necessarily the same as a GPU kernel) which are then invoked from applications written in conventional languages.
Examples
Dot product
The following program computes the dot product of two vectors containing double-precision numbers.
def dotprod xs ys = f64.sum (map2 (*) xs ys)
It can also be equivalently written with explicit type annotations as follows.
def dotprod [n] (xs: [n]f64) (ys: [n]f64) : f64 = f64.sum (map2 (*) xs ys)
This makes the size-dependent types explicit: this function can only be invoked with two arrays of the same size, and the type checker will reject any program where this cannot be statically determined.
Matrix multiplication
The following program performs matrix multiplication, using the definition of dot product above.
def matmul [n][m][p] (A: [n][m]f64) (B: [m][p]f64) : [n][p]f64 =
  map (\A_row ->
         map (\B_col -> dotprod A_row B_col)
             (transpose B))
      A |
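For readers unfamiliar with Futhark's notation, the same two computations can be sketched in plain Python. The sketch is sequential; Futhark's versions are data-parallel and compiled for the GPU.

```python
def dotprod(xs, ys):
    # Elementwise product followed by a sum, mirroring f64.sum (map2 (*) xs ys).
    return sum(x * y for x, y in zip(xs, ys))

def matmul(A, B):
    # Each output row pairs a row of A with every column of B (transpose B).
    B_cols = list(zip(*B))  # transpose of B
    return [[dotprod(row, col) for col in B_cols] for row in A]

print(dotprod([1.0, 2.0, 3.0], [4.0, 5.0, 6.0]))  # 32.0
print(matmul([[1.0, 0.0], [0.0, 1.0]], [[2.0, 3.0], [4.0, 5.0]]))
```

Unlike the Futhark version, Python cannot statically reject mismatched array sizes; the size-dependent types in the annotated `dotprod` above do exactly that at compile time.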
https://en.wikipedia.org/wiki/Necuno | The Necuno is a phone-like mobile device exclusively manufactured in Finland. It seeks to provide a high level of security and user privacy by omitting the cellular modem. For this reason, it cannot be used on a regular mobile phone network. Instead it offers VoIP via a peer-to-peer encrypted communication platform called Ciphra. Standard cellular connectivity is planned for later versions.
The Necuno is mostly open-source, apart from an isolated firmware blob without access to the main memory, used in the Wi-Fi driver for regulatory reasons. The device uses Plasma Mobile by default, but it can run a variety of open-source mobile operating systems. It also has an ethernet port.
See also
Comparison of open-source mobile phones
References
Mobile Linux
Linux-based devices
Open-source mobile phones
Peer-to-peer computing |
https://en.wikipedia.org/wiki/Categorical%20trace | In category theory, a branch of mathematics, the categorical trace is a generalization of the trace of a matrix.
Definition
The trace is defined in the context of a symmetric monoidal category C, i.e., a category equipped with a suitable notion of a product ⊗. (The notation reflects that the product is, in many cases, a kind of a tensor product.) An object X in such a category C is called dualizable if there is another object X^∨ playing the role of a dual object of X. In this situation, the trace of a morphism f: X → X is defined as the composition of the following morphisms:
tr(f): 1 → X ⊗ X^∨ → X ⊗ X^∨ ≅ X^∨ ⊗ X → 1 (coevaluation, then f ⊗ id, then the symmetry, then evaluation)
where 1 is the monoidal unit and the extremal morphisms are the coevaluation and evaluation, which are part of the definition of dualizable objects.
The same definition applies, to great effect, also when C is a symmetric monoidal ∞-category.
Examples
If C is the category of vector spaces over a fixed field k, the dualizable objects are precisely the finite-dimensional vector spaces, and the trace in the sense above is the morphism tr(f): k → k
which is the multiplication by the trace of the endomorphism f in the usual sense of linear algebra.
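In the vector-space case the categorical recipe can be spelled out concretely: the coevaluation inserts a sum over basis elements e_i ⊗ e_i^*, and the evaluation pairs each e_i^* with f(e_i), so the composite scalar is Σ_i e_i^*(f(e_i)), the familiar sum of diagonal entries. A small Python sketch, with matrices as nested lists, purely for illustration:

```python
def categorical_trace(f):
    """Trace of a linear map f (a square matrix) computed as
    ev ∘ (f ⊗ id) ∘ coev: pair each dual basis vector e_i* with f(e_i)."""
    total = 0
    for i in range(len(f)):
        # e_i* applied to f(e_i) picks out the i-th diagonal entry.
        total += f[i][i]
    return total

M = [[2, 1], [7, 5]]
print(categorical_trace(M))  # 7, matching the linear-algebra trace 2 + 5
```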
If C is the ∞-category of chain complexes of modules (over a fixed commutative ring R), dualizable objects V in C are precisely the perfect complexes. The trace in this setting captures, for example, the Euler characteristic, which is the alternating sum of the ranks of its terms: χ(V) = Σ_i (−1)^i rank(V_i)
Further applications
Categorical trace methods have been used to prove an algebro-geometric version of the Atiyah–Bott fixed point formula, an extension of the Lefschetz fixed point formula.
References
Category theory
Fixed-point theorems
Geometry |
https://en.wikipedia.org/wiki/Journal%20of%20Surveying%20Engineering | The Journal of Surveying Engineering is a quarterly peer-reviewed scientific journal published by the American Society of Civil Engineers. It covers traditional areas of surveying and mapping, as well as new developments such as satellite positioning and navigation, computer applications, and digital mapping. It was established in 1956 (when ASCE Transactions was split into 12 technical journals).
Abstracting and indexing
The journal is abstracted and indexed in the Emerging Sources Citation Index and Scopus.
Current Editorial Board
Editor:
Michael J. Olsen, Ph.D., M.ASCE, Oregon State University
Associate Editors:
Alireza Amiri-Simkooei, Ph.D., University of Isfahan
Sergio Baselga, Ph.D., M.ASCE, Universidad Politécnica de Valencia
Said M. Easa, Ph.D., P.E., M.ASCE, Ryerson University, Toronto
Craig Glennie, Ph.D., P.E., University of Houston
Jen-Yu Han, Ph.D., M.ASCE, National Taiwan University
Editorial Board:
Bishwa N. Acharya, Ph.D., M.ASCE, EMI, Inc.
James M. Anderson, Ph.D., P.E., P.L.S., University of California, Berkeley
David Belton, Curtin University
Michael L. Dennis, Ph.D., P.E., R.L.S., M.ASCE, National Geodetic Survey
Robert Duchnowski, University of Warmia and Mazury in Olsztyn
Ahmed F. Elaksher, Ph.D., P.L.S., New Mexico State University
Andrew C. Kellie, P.L.S., M.ASCE, Murray State University
Andrzej Kobryń, Bialystok University of Technology
Thomas H. Meyer, Ph.D., M.ASCE, University of Connecticut
Chris Parrish, Aff.M.ASCE, Oregon State University
Jacek Paziewski, Ph.D., University of Warmia and Mazury in Olsztyn
Elena Rangelova, Ph.D., P.Eng., University of Calgary
David A. Rolbiecki, P.L.S., M.ASCE, State of Texas Adjutant General’s Department
Michael Starek, Ph.D., M.ASCE, Texas A&M University, Corpus Christi
Ergin Tari, Ph.D., Istanbul Technical University
Guoquan Wang, Ph.D., M.ASCE, University of Houston
Benjamin E. Wilkinson, Ph.D., University of Florida
Book Review Editor:
Boudewijn H.W. van Gelder, Ph.D., M.ASCE, Purdue Univers |
https://en.wikipedia.org/wiki/National%20Heritage%20Database | The National Heritage Database is an online database containing information about various types of heritage-listed places in Australia and around the world.
It is a searchable database which includes:
places in the World Heritage List;
places in the Australian National Heritage List;
places in the Commonwealth Heritage List;
places in the Register of the National Estate (a non-statutory archived list);
places in the List of Overseas Places of Historic Significance to Australia; and
places that have ever been considered for, or are currently under consideration for, any one of these lists.
References
Heritage registers in Australia
Heritage registers
Online databases |
https://en.wikipedia.org/wiki/Hoffman%27s%20packing%20puzzle | Hoffman's packing puzzle is an assembly puzzle named after Dean G. Hoffman, who described it in 1978. The puzzle consists of 27 identical rectangular cuboids, each of whose edges have three different lengths. Its goal is to assemble them all to fit within a cube whose edge length is the sum of the three lengths.
The first person to solve the puzzle is reported to have been David A. Klarner, and typical solution times can range from 20 minutes to multiple hours.
Construction
The puzzle itself consists only of 27 identical rectangular cuboid-shaped blocks, although physical realizations of the puzzle also typically supply a cubical box to fit the blocks into. If the three lengths of the block edges are a, b, and c, then the cube should have edge length a + b + c.
Although the puzzle can be constructed with any three different edge lengths, it is most difficult when the three edge lengths of the blocks are close enough together that 4 · min(a, b, c) > a + b + c, as this prevents alternative solutions in which four blocks of the minimum width are packed next to each other. Additionally, having the three lengths form an arithmetic progression can make it more confusing, because in this case placing three blocks of the middle width next to each other produces a row of the correct total width but one that cannot lead to a valid solution to the whole puzzle.
Mathematical analysis
Each valid solution to the puzzle arranges the blocks in an approximate 3 × 3 × 3 grid of blocks, with the sides of the blocks all parallel to the sides of the outer cube, and with one block of each width along each axis-parallel line of three blocks. Counting reflections and rotations as being the same solution as each other, the puzzle has 21 combinatorially distinct solutions.
The total volume of the pieces, 27abc, is less than the volume (a + b + c)³ of the cube that they pack into. If one takes the cube root of both volumes, and divides by three, then the number obtained in this way from the total volume of the pieces is the geometric mean of a, b, and c, whil
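The volume comparison above is exactly the arithmetic mean–geometric mean inequality. A quick numerical check, with edge lengths chosen arbitrarily for illustration:

```python
# Check that 27 blocks of size a*b*c occupy strictly less volume than the
# (a+b+c)-cube, i.e. the AM-GM inequality (abc)^(1/3) < (a+b+c)/3 for
# distinct a, b, c.
a, b, c = 4.0, 5.0, 6.0  # arbitrary distinct edge lengths

pieces_volume = 27 * a * b * c      # the 27 identical blocks
cube_volume = (a + b + c) ** 3      # the target cube

geometric_mean = (a * b * c) ** (1 / 3)
arithmetic_mean = (a + b + c) / 3

print(pieces_volume, cube_volume)        # 3240.0 3375.0
print(geometric_mean < arithmetic_mean)  # True: some slack space is unavoidable
```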
https://en.wikipedia.org/wiki/Total%20sounding | Total sounding (TS) is a sounding method performed as part of geotechnical investigation. The sounding combines conventional rotary-pressure sounding with bedrock drilling, including rotation, ramming and flushing modes. The result indicates sediment stratification, occasionally soil type and may verify depth to bedrock.
History
The rotary-pressure sounding method was developed by the Norwegian Geotechnical Institute (NGI) and the Norwegian Public Roads Administration (NPRA) in 1967.
References
External links
In situ geotechnical investigations |
https://en.wikipedia.org/wiki/Rotary-pressure%20sounding | Rotary-pressure sounding is a method of testing soil conditions that might be performed as part of a geotechnical investigation. A series of rods, with a specially designed tip, is forced into the ground under downward pressure. The rotation and speed of insertion are maintained at a constant rate, and the amount of force required to maintain that rate is measured. The results can be interpreted to provide information about sediment stratification, and sometimes also the type of soil and the depth to bedrock.
The rotary-pressure sounding method was developed by the Norwegian Geotechnical Institute (NGI) and the Norwegian Public Roads Administration (NPRA) in 1967.
References
In situ geotechnical investigations |
https://en.wikipedia.org/wiki/Roch%20Gu%C3%A9rin | Roch Guérin is a French computer scientist. He is the Harold B. & Adelaide G. Welge Professor of Computer Science at the McKelvey School of Engineering at Washington University in St. Louis, and chair of the Computer Science & Engineering department at that university. Prior to that he was the Alfred Fitler Moore Professor of Telecommunications Networks and professor of electrical and systems engineering and computer and information science at the University of Pennsylvania. He worked for 12 years at the IBM Thomas J. Watson Research Center.
He obtained his BS from the École nationale supérieure des télécommunications, and received his MS in 1984 and PhD in 1986 from the California Institute of Technology.
His research centers on computer networks, cloud computing, performance analysis, and network economics. He is a Fellow of the ACM and IEEE for contributions to the theory and practice of quality-of-service guarantees in packet networks, and the development and application of the equivalent bandwidth concept.
Selected research
Yavatkar, Raj, Dimitrios Pendarakis, and Roch Guerin. "A framework for policy-based admission control." (2000).
Apostolopoulos, George, et al. "QoS routing mechanisms and OSPF extensions." RFC 2676, August 1999.
Guerin, Roch A., and Ariel Orda. "QoS routing in networks with inaccurate information: theory and algorithms." IEEE/ACM transactions on Networking 7.3 (1999): 350-364.
Guerin, Roch, Hamid Ahmadi, and Mahmoud Naghshineh. "Equivalent capacity and its application to bandwidth allocation in high-speed networks." IEEE Journal on selected areas in communications 9.7 (1991): 968-981.
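The "equivalent capacity" idea from the 1991 paper can be illustrated with its stationary (Gaussian) approximation, under which an aggregate of sources with mean rate m and standard deviation σ needs roughly C ≈ m + α·σ, with α = √(−2 ln ε − ln 2π) for a target overflow probability ε. The sketch below assumes this commonly quoted form of the formula, and the parameter values are purely illustrative:

```python
import math

def equivalent_capacity(mean_rate, std_dev, epsilon):
    """Gaussian approximation of equivalent capacity: bandwidth such that the
    probability of the aggregate rate exceeding it is about epsilon."""
    alpha = math.sqrt(-2 * math.log(epsilon) - math.log(2 * math.pi))
    return mean_rate + alpha * std_dev

# 100 Mbit/s aggregate mean, 20 Mbit/s std dev, 1e-6 overflow target (illustrative).
c = equivalent_capacity(100.0, 20.0, 1e-6)
print(round(c, 1))  # noticeably above the mean, far below any worst-case peak sum
```

The point of the concept is admission control: a switch can admit a new connection whenever the equivalent capacity of the resulting aggregate still fits in the link, without reserving for every source's peak rate.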
References
Living people
Washington University in St. Louis faculty
French computer scientists
Year of birth missing (living people)
ENSTA Paris alumni
California Institute of Technology alumni
University of Pennsylvania faculty
Fellows of the Association for Computing Machinery
Fellow Members of the IEEE
Computer networking people |
https://en.wikipedia.org/wiki/Frontiers%20of%20Biogeography | Frontiers of Biogeography is a peer-reviewed open access scientific journal publishing biogeographical science, with the academic standards expected of a journal operated by and for an academic society. It is published on behalf of the International Biogeography Society, using the eScholarship Publishing platform. The current editor-in-chief is Robert J. Whittaker.
Abstracting and indexing
The journal is abstracted and indexed in:
References
External links
Open access journals
Ecology journals
Geography journals
Biogeography
Academic journals established in 2009
English-language journals |
https://en.wikipedia.org/wiki/Eric%20M.%20Rains | Eric Michael Rains (born 23 August 1973) is an American mathematician specializing in coding theory and special functions, especially applications from and to noncommutative algebraic geometry.
Biography
Eric Rains was 14 when he began classes at Case Western Reserve University in 1987. He left the university at age 17 with bachelor's degrees in computer science and physics and a master's degree in mathematics.
By means of a Churchill Scholarship he studied mathematics and physics at the University of Cambridge for the academic year 1991–1992, receiving a Certificate of Advanced Study in Mathematics. He received his PhD in 1995 from Harvard University with thesis Topics in Probability on Compact Lie Groups under the supervision of Persi Diaconis. From 1995 to 1996, Rains worked at the IDA's Center for Communications Research (CCR) in Princeton. From 1996 to 2002 he was a researcher for AT&T Labs. From 2002 to 2003 he returned to the CCR in Princeton. In 2003, Rains became a full professor at the University of California, Davis and since 2007 has been a full professor at Caltech where he currently works. He has served as the Executive Officer of the Caltech Mathematics Department from 2019 to 2022.
In the fall of 2006 he was a visiting professor at the University of Melbourne. He is the co-author with Gabriele Nebe and Neil J. A. Sloane of the 2006 book Self-Dual Codes and Invariant Theory.
In 2007, Rains was a plenary speaker at the Western Sectional meeting of the American Mathematical Society (AMS). In 2010 he was an invited speaker at the International Congress of Mathematicians in Hyderabad. He was elected a Fellow of the AMS in the class of 2018 for "contributions to coding theory, the theory of random matrices, the study of special functions, non-commutative geometry and number theory".
Selected publications
(This article has over 1200 citations.)
References
1973 births
Living people
20th-century American mathematicians
21st-century American mathematicians
Case Wes |
https://en.wikipedia.org/wiki/Journal%20of%20Aerospace%20Engineering | The Journal of Aerospace Engineering is a peer-reviewed scientific journal published by the American Society of Civil Engineers and combines civil engineering with aerospace technology (but also incorporates other elements of civil engineering) to develop structures for space and extreme conditions. Topics of interest include aerodynamics, computational fluid dynamics, wind tunnel testing of buildings and structures, aerospace structures and materials, and more.
History
The journal previously published under the names Journal of the Aero-Space Transport Division (1962–1966) and Journal of the Air Transport Division (1956–1961).
Abstracting and indexing
The journal is abstracted and indexed in Ei Compendex, Science Citation Index Expanded, ProQuest databases, Civil Engineering Database, Inspec, Scopus, and EBSCO databases.
External links
Aerospace engineering journals
American Society of Civil Engineers academic journals |
https://en.wikipedia.org/wiki/Journal%20of%20Composites%20for%20Construction | The Journal of Composites for Construction is a peer-reviewed scientific journal published by the American Society of Civil Engineers and publishes original content dealing with the use of fiber-reinforced composite materials in construction. The journal editors are looking for papers that bridge the gap between research in the mechanics and manufacturing science of composite materials and the analysis and design of large civil engineering structural systems and their construction processes.
Abstracting and indexing
The journal is abstracted and indexed in Ei Compendex, Science Citation Index Expanded, ProQuest databases, Civil Engineering Database, Inspec, Scopus, and EBSCO databases.
References
External links
Engineering journals
American Society of Civil Engineers academic journals |
https://en.wikipedia.org/wiki/Bent%20Stumpe | Bent Stumpe (born 12 September 1938, Copenhagen, Denmark) is a Danish electronic engineer who spent most of his career at the international research laboratory CERN, Geneva, Switzerland. In 1972, following an idea launched by Frank Beck, Stumpe built a capacitive touchscreen for controlling CERN's Super Proton Synchrotron accelerator. In 1973 Beck and Stumpe published a CERN report outlining the concept for a prototype touchscreen as well as a multi-function computer-configurable knob.
Education
Bent Stumpe was educated within the Royal Danish Air Force and obtained a certificate as a radio/radar engineer in 1959.
Career
Leaving the Air Force, Stumpe was employed from 1959 to 1961 at the Danish radio and television factory TO-R Radio, before he was employed by CERN from 1961 until 2003. In combination with his activities at CERN, Stumpe was a consultant to the World Health Organization, working on the development of an instrument for the early detection of leprosy.
References
20th-century Danish engineers
Engineers from Copenhagen
1938 births
People associated with CERN
Living people
Electronics engineers
20th-century Danish inventors |
https://en.wikipedia.org/wiki/Alex%20Kipman | Alex Kipman (born 1979) is a Brazilian engineer. He was the lead developer of the Microsoft HoloLens smartglasses and helped develop the Xbox Kinect.
Biography
Kipman was born in Curitiba in 1979. The son of a Brazilian diplomat, Kipman grew up around the world. When he was seven or eight, he learned how to program the Atari 2600. He later attended the Rochester Institute of Technology (RIT), graduating in 2001 with a degree in software engineering, and joined Microsoft that same year, starting development on Microsoft's integrated development environment (IDE) Visual Studio. Starting in 2005, he helped in the development of Microsoft Windows, until joining the Xbox department in 2008, where he oversaw the acquisition of the technology for the Xbox Kinect from an Israeli company, PrimeSense. The product was finished two years later.
In 2011, Time magazine named him to its list of its 100 Most Influential People in the World, a list consisting of leaders, artists, innovators, icons and heroes. In a subsequent interview with Fast Company, he said "Software is the only art form in existence that is not bound by the confines of physics." In 2012 he was named Inventor of the Year by the Intellectual Property Owners Association.
In 2013, Kipman gave the commencement speech at his alma mater, the Rochester Institute of Technology (RIT).
In 2016, he gave a Ted Talk on mixed reality, called "A futuristic vision of the age of Holograms". In a 2017 interview with Alice Bonasio, he emphasized his passion for mixed reality, stating how it gives him a sense of "displacement superpowers". During the Hololens 2 reveal at the Mobile World Congress in 2019, Alex Kipman talked about how the Hololens 2 would be the "next era" of mixed reality, making it more culturally relevant.
In 2019 while he was developing metaverse technologies, the Smithsonian Institution in Washington, D.C named Kipman the winner of an American Ingenuity Award, calling him a pioneer of holographic and augmented reality technology. Later that y |
https://en.wikipedia.org/wiki/Memory-hard%20function | In cryptography, a memory-hard function (MHF) is a function that costs a significant amount of memory to efficiently evaluate. It differs from a memory-bound function, which incurs cost by slowing down computation through memory latency. MHFs have found use in key stretching and proof of work as their increased memory requirements significantly reduce the computational efficiency advantage of custom hardware over general-purpose hardware compared to non-MHFs.
Introduction
MHFs are designed to consume large amounts of memory on a computer in order to reduce the effectiveness of parallel computing. In order to evaluate the function using less memory, a significant time penalty is incurred. As each MHF computation requires a large amount of memory, the number of function computations that can occur simultaneously is limited by the amount of available memory. This reduces the efficiency of specialised hardware, such as application-specific integrated circuits and graphics processing units, which utilise parallelisation, in computing a MHF for a large number of inputs, such as when brute-forcing password hashes or mining cryptocurrency.
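As a concrete illustration, password-hashing functions such as scrypt (available in Python's standard library) expose the memory cost directly: the cost parameter n sets the size of the internal memory table, so raising it increases the RAM needed for each parallel guess. A minimal sketch; the parameter values are illustrative, not a security recommendation:

```python
import hashlib
import os

# Derive a key whose computation needs roughly 128 * r * n bytes of memory.
# With n=2**14 and r=8 that is about 16 MiB per evaluation, which limits how
# many guesses an attacker can run concurrently on fixed hardware.
salt = os.urandom(16)
key = hashlib.scrypt(b"correct horse battery staple", salt=salt,
                     n=2**14, r=8, p=1, dklen=32)
print(len(key))  # 32
```

Doubling n doubles both the memory and the time per evaluation, letting the defender tune the cost to available hardware.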
Motivation and examples
Bitcoin's proof-of-work uses repeated evaluation of the SHA-256 function, but modern general-purpose processors, such as off-the-shelf CPUs, are inefficient when computing a fixed function many times over. Specialized hardware, such as application-specific integrated circuits (ASICs) designed for Bitcoin mining, can use 30,000 times less energy per hash than x86 CPUs whilst having much greater hash rates. This led to concerns about the centralization of mining for Bitcoin and other cryptocurrencies. Because of this inequality between miners using ASICs and miners using CPUs or off-the shelf hardware, designers of later proof-of-work systems utilised hash functions for which it was difficult to construct ASICs that could evaluate the hash function significantly faster than a CPU.
As memory cost is platform-inde |
https://en.wikipedia.org/wiki/Induced%20cell%20cycle%20arrest | Induced cell cycle arrest is the use of a chemical or genetic manipulation to artificially halt progression through the cell cycle. Cellular processes like genome duplication and cell division stop. It can be temporary or permanent. It is an artificial activation of naturally occurring cell cycle checkpoints, induced by exogenous stimuli controlled by an experimenter.
Model organisms
In an academic research context, cell cycle arrest is typically performed in model organisms and cell extracts, such as Saccharomyces cerevisiae (yeast) or Xenopus oocytes (frog eggs). Frog egg cell extracts have been used extensively in cell cycle research because the eggs are relatively large, reaching a diameter of 1 mm, and so contain large amounts of protein, making protein levels more easily measurable.
Purposes
There are a variety of reasons a researcher may want to temporarily or permanently prevent progress through the cell cycle.
Cell cycle synchronization
In some experiments, a researcher may want to control and synchronize the time when a group of cells progress to the next phase of the cell cycle. The cells can be induced to arrest as they arrive (at different time points) at a certain phase, so that when the arrest is lifted (for instance, rescuing cell cycle progression by introducing another chemical) all the cells resume cell cycle progression at the same time. In addition to this method acting as a scientific control for when the cells resume the cell cycle, this can be used to investigate necessity and sufficiency.
Another reason synchrony is important is the control for amount of DNA content, which varies at different parts of the cell cycle based on whether DNA replication has occurred since the last round of completed mitosis and cytokinesis.
Furthermore, synchronization of large numbers of cells into the same phase allows for the collection of large enough groups of cells in the same cycle for the use in other assays, such as western blot and RNA sequencing.
D |
https://en.wikipedia.org/wiki/ALTS | Application Layer Transport Security (ALTS) is a Google-developed authentication and transport encryption system used for securing Remote Procedure Call (RPC) within Google machines. Google started its development in 2007, as a tailored modification of TLS.
Background
ALTS, similar to TLS, was designed specifically for Google's data centers and relies on two protocols, Handshake and Record. Google began developing ALTS in 2007 in order to create a security solution for the company's infrastructure.
The ALTS whitepaper was published in December 2017. When development began in 2007, the dominant application-layer security protocols were SSL and TLS 1.1 (TLS 1.2 was only published as an RFC in 2008); these supported many legacy algorithms and had poor security standards. As Google was in full control of the machines that needed secure transport of RPCs, deployment of the system was relatively easy, and so Google developers could afford to design their own system from scratch.
Another requirement that made a new system necessary was a different trust model: in TLS, the server side is committed to its own domain name (and corresponding naming scheme), while Google needed the same identity (i.e. RPC) to be used with multiple naming schemes, in order to simplify microservice replication, load balancing and rescheduling between hosts.
Details
Handshake protocol
The ALTS handshake protocol is based on authenticated Diffie-Hellman key exchange scheme, and supports both perfect forward secrecy (access to current keys does not compromise future security) and session resumption (noticeable speedups in the protocol after the first session between the parties).
Unlike TLS, in ALTS both parties — server and client — have a certificate proving their respective identities. The certificate chains to a trusted signing service verification key, with the leaf being an Elliptic curve Diffie-Hellman key, that is eventually used for key exchange. The elliptic curve used in the key exchange is Curve25519.
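The key agreement at the core of the handshake can be sketched with a toy finite-field Diffie-Hellman exchange. This is an illustration only: ALTS itself uses Curve25519 with certificate-bound keys, while the small prime and generator below are assumptions chosen for readability and are insecure.

```python
import secrets

# Toy finite-field Diffie-Hellman sketch (NOT Curve25519; illustration only).
P = 4294967291  # a small 32-bit prime, far too small for real use
G = 5           # assumed generator for this sketch

def keypair():
    priv = secrets.randbelow(P - 2) + 1
    pub = pow(G, priv, P)
    return priv, pub

# Client and server each contribute an ephemeral share; in ALTS both
# sides additionally present certificates chaining to a trusted signer.
c_priv, c_pub = keypair()
s_priv, s_pub = keypair()

client_secret = pow(s_pub, c_priv, P)
server_secret = pow(c_pub, s_priv, P)
assert client_secret == server_secret  # both derive the same key material
```

Both sides combine their own private key with the peer's public key and arrive at the same secret, which a real implementation then feeds into a key-derivation function to produce session keys.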
|
https://en.wikipedia.org/wiki/Software%20bot | A software bot is a type of software agent in the service of software project management and software engineering. A software bot has an identity and potentially personified aspects in order to serve its stakeholders. Software bots often compose software services and provide an alternative user interface, which is sometimes, but not necessarily, conversational.
Software bots are typically used to execute tasks, suggest actions, engage in dialogue, and promote social and cultural aspects of a software project.
The term bot is derived from robot. However, robots act in the physical world and software bots act only in digital spaces. Some software bots are designed and behave as chatbots, but not all chatbots are software bots. Erlenhov et al. discuss the past and future of software bots and show that software bots have been adopted for many years.
Usage
Software bots are used to support development activities, such as communication among software developers and automation of repetitive tasks. Software bots have been adopted by several communities related to software development, such as open-source communities on GitHub and Stack Overflow.
GitHub bots have user accounts and can open, close, or comment on pull requests and issues. GitHub bots have been used to assign reviewers, ask contributors to sign the Contributor License Agreement, report continuous integration failures, review code and pull requests, welcome newcomers, run automated tests, merge pull requests, fix bugs and vulnerabilities, etc.
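As a sketch of how such a bot interacts with GitHub, the snippet below builds (without sending) the REST request a bot account would use to comment on an issue. The repository name, issue number, and token are placeholders; the endpoint shape follows GitHub's public REST API.

```python
import json
import urllib.request

GITHUB_API = "https://api.github.com"

def build_issue_comment_request(repo, issue_number, body, token):
    """Build (but do not send) the REST call a bot account would make
    to comment on an issue. `repo` is 'owner/name'; `token` is a
    placeholder personal-access or app token."""
    url = f"{GITHUB_API}/repos/{repo}/issues/{issue_number}/comments"
    data = json.dumps({"body": body}).encode()
    req = urllib.request.Request(url, data=data, method="POST")
    req.add_header("Authorization", f"Bearer {token}")
    req.add_header("Accept", "application/vnd.github+json")
    return req

# Hypothetical welcome-bot usage:
req = build_issue_comment_request(
    "octocat/hello-world", 42,
    "Thanks for contributing! A maintainer will review shortly.",
    "ghp_placeholder_token")
print(req.full_url)
```

A real bot would pass the request to `urllib.request.urlopen` (or an HTTP client library) and typically run in response to webhook events.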
The Slack tool includes an API for developing software bots. There are Slack bots for keeping track of todo lists, coordinating standup meetings, and managing support tickets. Chatbot company products further simplify the process of creating a custom Slack bot.
On Wikipedia, Wikipedia bots automate a variety of tasks, such as creating stub articles, consistently updating the format of multiple articles, and so on. Bots like ClueBot NG are capable of recogniz |
https://en.wikipedia.org/wiki/Guilty%20Gear%20Strive | is a fighting video game developed and published by Arc System Works. It is the seventh mainline installment of the Guilty Gear series, and the 25th overall. The game was released for PlayStation 4, PlayStation 5 and Windows in June 2021, for Japanese arcades in July 2021, and for Xbox One and Xbox Series X/S in March 2023.
Guilty Gear Strive received generally positive reviews from critics, who praised its visuals, gameplay and netcode, and has sold over 2.5 million units as of August 2023.
Gameplay
Intended as a "complete reconstruction of the franchise", Guilty Gear Strive retains the core essence of the series but revamps many features and mechanics, and removes the series' signature mechanic, the Instant Kill. It introduces the "Wall Break" feature, which allows for stage transitions when a combo is initiated in the corner of the arena.
Synopsis
Main Story
The story continues after the events of Guilty Gear Xrd. It is the conclusion of Sol Badguy's story (A.K.A. The Gear Hunters Saga), set in Washington, D.C., featuring his final confrontation with That Man, Asuka R. Kreutz.
Three weeks after the events of Guilty Gear Xrd, I-No frees the powerful magic-user Happy Chaos from the body of former Sanctus Populi, Ariels, who is imprisoned inside a special holding cell in Illyria. Chaos notes that she is physically incapable of feeling desire, then offers to help her find her “other half” so he can enjoy some drama. Asuka turns himself in to the US president Colin Vernon E. Groubitz, intending to join the White House's G4 peace summit from a holding cell and ask the other nations for assistance in ridding the world of the Tome of Origin and Sol Badguy. The world's leaders fear an attack from I-No and hire knights from each country, including Sol Badguy, now the world's renowned Gear hero who remains a bounty hunter. Sol and his second lover, Jack-O', plan to refuse, but accept after noting Ariels' warning about I-No and Chaos' plot. After releasin |
https://en.wikipedia.org/wiki/Indentation%20size%20effect | The indentation size effect (ISE) is the observation that hardness tends to increase as the indent size decreases at small scales. When an indent (any small mark, but usually made with a special tool) is created during material testing, the hardness of the material is not constant. At the small scale, materials will actually be harder than at the macro-scale. For the conventional indentation size effect, the smaller the indentation, the larger the difference in hardness. The effect has been seen through nanoindentation and microindentation measurements at varying depths. Dislocations increase material hardness by increasing flow stress through dislocation blocking mechanisms. Materials contain statistically stored dislocations (SSD) which are created by homogeneous strain and are dependent upon the material and processing conditions. Geometrically necessary dislocations (GND) on the other hand are formed, in addition to the dislocations statistically present, to maintain continuity within the material.
These additional geometrically necessary dislocations (GND) further increase the flow stress in the material and therefore the measured hardness. Theory suggests that plastic flow is impacted by both strain and the size of the strain gradient experienced in the material. Smaller indents have higher strain gradients relative to the size of the plastic zone and therefore have a higher measured hardness in some materials.
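This depth dependence is commonly captured by the Nix-Gao strain-gradient model, a standard relation from the literature (not stated above), where H is the measured hardness at indentation depth h, H0 is the macroscopic hardness, and h* is a characteristic length that depends on the material and indenter geometry:

```latex
\frac{H}{H_0} = \sqrt{1 + \frac{h^{*}}{h}}
```

As h shrinks toward h*, the GND density and hence the measured hardness rise; as h grows large, the ratio tends to 1 and the macroscopic hardness is recovered.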
For practical purposes this effect means that hardness in the low micro and nano regimes cannot be directly compared if measured using different loads. However, the benefit of this effect is that it can be used to measure the effects of strain gradients on plasticity. Several new plasticity models have been developed using data from indentation size effect studies, which can be applied to high strain gradient situations such as thin films.
References
Hardness tests
Materials science
Plasticity (physics) |
https://en.wikipedia.org/wiki/Microsoft%20SEAL | Simple Encrypted Arithmetic Library or SEAL is a free and open-source cross platform software library developed by Microsoft Research that implements various forms of homomorphic encryption.
History
Development originally came out of the Cryptonets paper, demonstrating that artificial intelligence algorithms could be run on homomorphically encrypted data.
It is open-source (under the MIT License) and written in standard C++ without external dependencies and so it can be compiled cross platform. An official .NET wrapper written in C# is available and makes it easier for .NET applications to interact with SEAL.
Features
Algorithms
Microsoft SEAL supports both asymmetric and symmetric (added in version 3.4) encryption algorithms.
Scheme types
Microsoft SEAL comes with two different homomorphic encryption schemes with very different properties:
BFV: The BFV scheme allows modular arithmetic to be performed on encrypted integers. For applications where exact values are necessary, the BFV scheme is the only choice.
CKKS: The CKKS scheme allows additions and multiplications on encrypted real or complex numbers, but yields only approximate results. In applications such as summing up encrypted real numbers, evaluating machine learning models on encrypted data, or computing distances of encrypted locations CKKS is going to be by far the best choice.
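To illustrate the general idea of computing on ciphertexts, the sketch below implements a toy Paillier cryptosystem, which is additively homomorphic, with tiny insecure parameters. This is not SEAL's API, and BFV/CKKS are lattice-based schemes that also support ciphertext multiplication; the example only shows why homomorphic arithmetic is possible at all.

```python
import math
import secrets

# Toy Paillier cryptosystem (additively homomorphic); parameters are
# tiny and insecure, purely for illustration.
p, q = 47, 59
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = math.lcm(p - 1, q - 1)

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)  # precomputed decryption constant

def encrypt(m):
    while True:
        r = secrets.randbelow(n - 1) + 1
        if math.gcd(r, n) == 1:  # r must be invertible mod n
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Multiplying ciphertexts adds the underlying plaintexts:
c_sum = (encrypt(12) * encrypt(30)) % n2
print(decrypt(c_sum))  # → 42
```

In BFV the analogous exact integer arithmetic (and multiplication) happens over polynomial rings; in CKKS the arithmetic is approximate over encoded real or complex vectors.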
Compression
Data compression can be achieved by building SEAL with Zlib support. By default, data is compressed using the DEFLATE algorithm which achieves significant memory footprint savings when serializing objects such as encryption parameters, ciphertexts, plaintexts, and all available keys: Public, Secret, Relin (relinearization), and Galois. Compression can always be disabled.
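As an illustration of the serialization savings described above, the following stdlib-only sketch round-trips a repetitive byte string through DEFLATE via Python's zlib module; the data is a stand-in, not actual SEAL serialization output.

```python
import zlib

# Repetitive stand-in for a serialized HE object (assumed example data).
data = b"encryption parameters; ciphertext polynomial coefficients; " * 50
compressed = zlib.compress(data)  # zlib implements the DEFLATE algorithm
assert zlib.decompress(compressed) == data  # compression is lossless
print(f"{len(data)} bytes -> {len(compressed)} bytes")
```

Highly structured objects such as keys and parameter sets compress well, which is why enabling Zlib support noticeably reduces SEAL's memory footprint for serialized data.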
Availability
There are several known ports of SEAL to other languages in active development:
C++
Microsoft SEAL (Microsoft's source)
C#/F#
NuGet (Microsoft's official package)
Python
PySEAL
SEAL-Python
tf-seal
Pyfhel
JavaScri |
https://en.wikipedia.org/wiki/HElib | Homomorphic Encryption library or HElib is a free and open-source cross platform software developed by IBM that implements various forms of homomorphic encryption.
History
HElib was primarily developed by Shai Halevi and Victor Shoup, shortly after Craig Gentry, then a researcher at IBM, constructed the first fully homomorphic encryption scheme; the initial release was on May 5, 2013.
Features
The library implements the Brakerski-Gentry-Vaikuntanathan (BGV) fully homomorphic encryption scheme, as well as optimizations such as Smart-Vercauteren ciphertext packing techniques.
HElib is written in C++ and uses the NTL mathematical library.
References
Homomorphic encryption
Cryptographic software
Free and open-source software
IBM software |
https://en.wikipedia.org/wiki/Choreutoscope | The choreutoscope is a pre-cinema device that employed a system similar to that of early film projectors. It was the first projection device to use an intermittent movement, which became the basis of many cine cameras and projectors. It was formed by a sheet of glass on which different drawings were made, and the sheet was mounted on a type of Maltese cross mechanism, which made the image move suddenly. The most common drawing was the 'dancing skeleton', in which six sequential images of a skeleton were animated in the viewing pane.
History
The choreutoscope was invented by Lionel Smith Beale in 1866. Beale used it for demonstrations at the Royal Polytechnic. However, Beale was not the only one to create a choreutoscope: William C. Hughes built his own in 1884, and B. Brown created a similar machine in 1896.
References
External links
Dancing skeleton (animated) from the Alexis du Pont stereoviews and lantern slides collection at Hagley Museum and Library
Precursors of film
Projectors |
https://en.wikipedia.org/wiki/Helopeltis%20antonii | Helopeltis antonii, also known as the tea mosquito bug, are heteropterans within the family Miridae. They have a relatively large geographical distribution and are a known pest of many agricultural "cash" crops such as cocoa, cashew, and tea. Consequently, they negatively affect economic growth in the regions they inhabit. Their impact on humans has made them of great biological interest, with significant environmental implications.
Distribution
Helopeltis antonii are found in a region known as the old-world tropics which encompasses places such as India, Northern Australia, Guinea, Vietnam, Tanzania, Nigeria, and Indonesia. More specifically, they are more concentrated in the agricultural regions of the old-world tropics. In India their distribution is primarily found within the “cashew belt” which is located along the western coast and central regions of the country due to its high affinity for these plants. However, different nations grow certain crops in various locations within their borders. Crops that H. antonii prefer will ultimately determine their specific distribution within a country.
Identification of distribution
H. antonii are often mistaken for and misidentified as other Helopeltis species. Thus, identifying the exact geographical range of H. antonii has been difficult. However, recent advances in species identification through DNA barcoding have made it much easier. DNA barcoding is a rapid and relatively inexpensive identification technique that locates unique genetic markers in DNA, allowing for the accurate identification not only of H. antonii but of other species as well.
Mating
Reproduction for H. antonii occurs in four stages (arousal, mounting, copulation, and termination of copulation) and takes place year-round. Arousal, mounting, and termination of copulation occur within a short time frame; copulation is much longer and more variable in length. Mating typically occurs in shade
https://en.wikipedia.org/wiki/Mycetome | A mycetome is a specialized organ in a variety of animal species which houses that animal's symbionts, isolating them from the animal's natural cellular defense mechanisms and allowing sustained controlled symbiotic growth. In several species, such as bed bugs and certain families of leech, these symbionts are attached to the gut and aid in the production of vitamin B from ingested meals of blood. In insects, the organisms that inhabit these structures are either bacteria or yeasts.
In bed bugs, it has been found that heat stress can cause damage to the mycetome, preventing the symbionts from being successfully passed from the adult female to her eggs at the time of oogenesis, causing the resulting nymphs to develop abnormally or to die prematurely.
References
Insect biology
Symbiosis
Animal anatomy |
https://en.wikipedia.org/wiki/OpenFHE | OpenFHE is an open-source cross platform software library that provides implementations of fully homomorphic encryption schemes. OpenFHE is a successor of PALISADE and incorporates selected design features of HElib, HEAAN, and FHEW libraries.
History
PALISADE
Development began with the OpenFHE precursor, PALISADE.
PALISADE adopted the open modular design principles of the predecessor SIPHER software library from the DARPA PROCEED program.
SIPHER development began in 2010, with a focus on modular open design principles to support rapid application deployment over multiple FHE schemes and hardware accelerator back-ends, including on mobile, FPGA and CPU-based computing systems.
PALISADE began building from earlier SIPHER designs in 2014, with an open-source release in 2017 and substantial improvements every subsequent 6 months. Much of the development was done at Raytheon BBN and NJIT.
PALISADE development was funded originally by the DARPA PROCEED and SafeWare programs, with subsequent improvements funded by additional DARPA programs, IARPA, the NSA, NIH, ONR, the United States Navy, the Sloan Foundation and commercial entities such as Duality Technologies. PALISADE has subsequently been used in commercial offerings, such as by Duality Technologies who raised funding in a Seed round and a later Series A round led by Intel Capital.
OpenFHE
PALISADE authors along with selected authors of HElib, HEAAN, and FHEW libraries released a new library in July 2022.
The initial release of the library included all features of PALISADE v1.11 and added several new design features, such as Hardware Acceleration Layer for multiple hardware acceleration backends and new bootstrapping procedures.
OpenFHE is used as an FHE backend for the Google Transpiler project.
Features
OpenFHE includes the following features:
Post-quantum public-key encryption
Fully homomorphic encryption (FHE)
Brakerski/Fan-Vercauteren (BFV) scheme for integer arithmetic with approximate bo |
https://en.wikipedia.org/wiki/Taxon%20cycle | Taxon cycles refer to a biogeographical theory of how species evolve through range expansions and contractions over time associated with adaptive shifts in the ecology and morphology of species. The taxon cycle concept was explicitly formulated by biologist E. O. Wilson in 1961 after he surveyed the distributions, habitats, behavior and morphology of ant species in the Melanesian archipelago.
Stages of the taxon cycle
Wilson categorized species into evolutionary "stages", which today are commonly described in the outline by Ricklefs & Cox (1972). However, with the advent of molecular techniques to construct time-calibrated phylogenetic relationships between species, the taxon cycle concept was further developed to include well-defined temporal scales and combined with concepts from ecological succession and speciation cycle theories. Taxon cycles have mainly been described in island settings (archipelagos), where the distributions and movements of species are readily recognized, but may also occur in continental biota.
Stage I: Young, rapidly expanding, undifferentiated, widely and continuously distributed species in the initial colonization stage inhabiting small island, coastal or disturbed (marginal) habitat. Such species are hypothesized to include very good dispersers, ephemeral species and ecological "supertramps".
Stage II: Species that are generally widespread across many islands, but where geographical expansion has slowed, population differentiation has generated subspecies or incipient species, and local extinction on small islands may have created gaps in the distribution. This stage includes species that have maintained a relatively good dispersal ability such as "great speciators". Early-stage "species complexes" may consist of stage II species.
Stage III: Older, well-differentiated and well-defined species that have moved to habitats inland (and uphill) and where reduced dispersal ability and extinctions have fragmented the distribution to fewer |
https://en.wikipedia.org/wiki/Free%20disposal | In various parts of economics, the term free disposal implies that resources can be discarded without any cost. For example, a fair division setting with free disposal is a setting where some resources have to be divided fairly, but some of the resources may be left undivided, discarded or donated.
Examples of situations with free disposal are the allocation of food, clothes, jewels, etc. Examples of situations without free disposal are:
Chore division - since all chores must be done.
Allocation of land with an old structure - since the structure may have to be demolished, and demolition is costly.
Allocation of an old car - since the car may have to be towed to a used-car garage, and moving it may be costly.
Allocation of shares in a firm that may have debts - since the firm cannot be disposed of without paying its debts first.
The free disposal assumption may be useful for several reasons:
It enables truthful cake-cutting algorithms: The option to discard some of the cake gives the players an incentive to reveal their true valuations.
It enables fast envy-free cake-cutting algorithms, and more economically-efficient envy-free allocations: Discarding some of the cake helps to reduce envy.
It enables online assignment algorithms.
References
Fair division |
https://en.wikipedia.org/wiki/Viral%20strategies%20for%20immune%20response%20evasion | The mammalian immune system has evolved complex methods for addressing and adapting to foreign antigens. At the same time, viruses have co-evolved evasion machinery to address the many ways that host organisms attempt to eradicate them. DNA and RNA viruses use complex methods to evade immune cell detection through disruption of the Interferon Signaling Pathway, remodeling of cellular architecture, targeted gene silencing, and recognition protein cleavage.
Interferon system
The human immune system relies on a plethora of cell-cell signaling pathways to transmit information about a cell's health and microenvironment. Many of these pathways are mediated by soluble ligands, cytokines, that fit like a lock-and-key into adjacent cell surface receptors. This language of cell communication imparts both specificity and spatiotemporal control for the transmission of data.
The Interferon System is composed of a family of cytokines. Type-I interferons (IFN-α/β) and Type-III interferons (IFN-λ) play key roles in adaptive immunity, acting as communication highways between cells infected with foreign double-stranded DNA or double-stranded RNA. Mammalian cells utilize specialized receptors known as pattern recognition receptors (PRRs) to detect viral infection; these receptors are able to recognize pathogen-associated molecular patterns (PAMPs) inscribed in viral DNA and RNA. These pattern recognition receptors, often localized to either the cytosol or the nucleus, are responsible for notifying infected cells and initiating the secretion of interferon cytokines.
Double-stranded RNA mediated immune response
The precise role of double-stranded (ds)RNA is still widely investigated as a central player in the Interferon System. Groups have found that positive-strand RNA viruses and dsRNA viruses produced significant amounts of dsRNA, but the precise methods mammalian cells leverage to distinguish between self vs. non-self dsRNA have yet to be uncovered. Studies suggest that recogniti |
https://en.wikipedia.org/wiki/International%20Journal%20of%20Geomechanics | The International Journal of Geomechanics is a monthly peer-reviewed scientific journal published by the American Society of Civil Engineers that focuses on geomechanics, emphasizing theoretical aspects, to include computational and analytical methods, and related validations.
Abstracting and indexing
The journal is indexed in Ei Compendex, ProQuest databases, Civil Engineering Database, Inspec, Science Citation Index Expanded, and EBSCO databases.
References
External links
Engineering journals
American Society of Civil Engineers academic journals
Mining journals
Geotechnical engineering |
https://en.wikipedia.org/wiki/Headless%20engine | A headless engine or fixed-head engine is an engine where the end of the cylinder is cast as one piece with the cylinder and crankcase. The best-known headless engines are the Fairbanks-Morse Z and the Witte Headless hit-and-miss engine.
See also
Monobloc engine
References
Engines |
https://en.wikipedia.org/wiki/List%20of%20construction%20methods | The list of construction methods covers the processes and techniques used in the construction process. Construction methods are essential for civil engineers; utilizing them appropriately can help to achieve the desired results. Construction refers to the creation of physical structures such as buildings, bridges or railways. One of the four types of structures is residential, and construction methods are easiest to study in these structures.
Background
Construction involves the creation of physical structures such as buildings, bridges or railways.
Bricks are small rectangular blocks that can be used to form parts of buildings, typically walls. Before 7,000 BC, bricks were formed from hand-molded mud and dried by the sun. During the Industrial Revolution, mass-produced bricks became a common alternative to stone. Stone was typically more expensive, less predictable and more difficult to handle. Bricks remain in common use. They are small and easy to handle, strong in compression, durable and low maintenance. They can be formed into complex shapes, providing ample opportunity for the construction of aesthetic designs.
The four basic types of structure are residential, institutional and commercial, industrial, and infrastructure/heavy.
Residential
Residential buildings go through five main stages: foundations, formwork, scaffolding, concrete work, and reinforcement.
Foundation
Foundations provide support for structures, transferring their load to layers of soil or rock that have sufficient bearing capacity and suitable settlement characteristics to support them. There are four types of foundation depending on the bearing capacity. Civil engineers will often determine what type of foundation is suitable for the respective bearing capacity.
The foundation construction method depends on considerations such as:
The nature of the load requiring support
Ground conditions
The presence of water
Space availability
Accessibility
Sensitivity to noise and |
https://en.wikipedia.org/wiki/Bacillus%20submarinus | Bacillus submarinus is a species in the genus Bacillus, meaning it is rod-shaped and capable of producing endospores. B. submarinus is Gram-positive, with a thick layer of peptidoglycan in its cell wall.
Description
Bacillus submarinus is a Gram-positive, aerobic (oxygen-requiring), rod-shaped, sporulating bacterium of the genus Bacillus that is commonly found in the ocean at extreme depths and pressures. Sporulation occurs when the cell packages its genetic information into a spore during a dormant phase. As with other members of the genus Bacillus, it can form an endospore, a structure containing its genetic information that allows it to survive extreme conditions and resume growth once conditions become more hospitable.
Habitat
This species is commonly found in ocean waters, primarily in the Atlantic Ocean. Bacillus submarinus is able to live in oceans at depths of more than 5000 m, withstanding extreme hydrostatic pressure of around 1.1 × 10⁸ Pa (about 15,954 psi). In contrast, the human femur can only withstand a maximum of 1,700 psi before shattering.
Reproduction
Bacillus submarinus divides symmetrically into two daughter cells, producing a single endospore that can remain viable for decades and is resistant to unfavourable environmental conditions such as ocean acidification. It does not reproduce by mitosis like eukaryotic cells, but by a process known as binary fission. In binary fission, the DNA in the prokaryote is not condensed into chromosome-like structures; instead, the cell makes a copy of its DNA and divides in half.
Uses
Bacillus submarinus is proven to decompose oil that is found in the ocean such as after an oil spill. As B. submarinus begins the process of decomposing oil in the ocean they form tarballs. In these tarballs the B. submarinus works with other organisms such as Chromobacterium violaceum and Candida marina to change the chemical structure of the oil |
https://en.wikipedia.org/wiki/Ampere%20Computing | Ampere Computing LLC is an American fabless semiconductor company based in Santa Clara, California, that develops processors for servers operating in large-scale environments. Ampere also has offices in Portland, Oregon; Taipei, Taiwan; Raleigh, North Carolina; Bangalore, India; Warsaw, Poland; and Ho Chi Minh City, Vietnam.
History
Ampere Computing was founded in the fall of 2017 by Renée James, a former president of Intel, with funding from The Carlyle Group. James acquired a team from MACOM Technology Solutions (formerly AppliedMicro), in addition to several industry hires, to start the company. Ampere Computing is an ARM architecture licensee and develops its own server microprocessors. Ampere fabricates its products at TSMC.
In April 2019, Ampere announced its second major investment round, including investment from Arm Holdings and Oracle Corporation. In June 2019, Nvidia announced a partnership with Ampere to bring support for Compute Unified Device Architecture (CUDA). In November 2019, Nvidia announced a reference design platform for graphics processing unit (GPU)-accelerated ARM-based servers including Ampere.
In the first half of 2020, Ampere announced the 80-core Ampere Altra and the 128-core Ampere Altra Max, processors that do not use hyper-threading.
In March 2020, the company announced a partnership with Oracle. In September of that year, Oracle said it would launch bare-metal and virtual machine instances in early 2021 based on Ampere Altra.
In November 2020, Ampere was named one of the top 10 hottest semiconductor startups by CRN.
In May 2021, the company announced a partnership with Microsoft. In July of that year, Ampere acquired OnSpecta, an AI technology startup. After the acquisition, the companies were able to demonstrate four times faster acceleration on Ampere-based instances running AI-inference workloads.
In April 2022, Ampere said that it had filed a confidential prospectus with the U.S. Securities and Exchange Commission, signaling its i |
https://en.wikipedia.org/wiki/European%20Structural%20Integrity%20Society | The European Society for Structural Integrity (ESIS) is an international non-profit engineering scientific society.
Its purpose is to create and expand knowledge about all aspects of structural integrity and the dissemination of that knowledge. The goal is to improve the safety and performance of structures and components.
History
The origins of the European Structural Integrity Society date back to November 1978, during a summer school in Darmstadt, Germany. At the time, the name was the European Group on Fracture. Between 1979 and 1988 the first technical committees were created; the first bore the designation Elasto-Plastic Fracture Mechanics. The initial idea was to reproduce in Europe the equivalent of the ASTM committees. The first president of the European Structural Integrity Society was Dr. L.H. Larsson (European Commission Joint Research Centre). ESIS has a total of 24 technical committees and national groups in each European country.
The current president of ESIS is Prof. Aleksandar Sedmak from the University of Belgrade (Serbia).
Scientific Journals
ESIS is institutionally responsible for the following scientific journals:
Engineering Failure Analysis
Engineering Fracture Mechanics
International Journal of Fatigue
Theoretical and Applied Fracture Mechanics
Procedia Structural Integrity
Organization of International Conferences
ESIS is the organizer or supporter of various international conference series:
ECF, European Conference on Fracture (biennial)
ICSI, International Conference on Structural Integrity (biennial)
IRAS, International Conference on Risk Analysis and Safety of Complex Structures and Components (biennial)
Awards
ESIS, at its events, confers the following awards:
The Griffith Medal
The August-Wöhler Medal
The Award of Merit
Honorary Membership
The Young Scientist Award
Robert Moskovic Award (ESIS TC12)
The August Wöhler Medal Winners
2022: Youshi Hong, Chinese Academy of Sciences, China
2020: Filippo Berto, |
https://en.wikipedia.org/wiki/Truthful%20resource%20allocation | Truthful resource allocation is the problem of allocating resources among agents with different valuations over the resources, such that agents are incentivized to reveal their true valuations over the resources.
Model
There are m resources that are assumed to be homogeneous and divisible. Examples are:
Materials, such as wood or metal;
Virtual resources, such as CPU time or computer memory;
Financial resources, such as shares in firms.
There are n agents. Each agent has a function that attributes a numeric value to each "bundle" (combination of resources).
It is often assumed that the agents' value functions are linear, so that if the agent receives a fraction r_j of each resource j, then his/her value is the sum over j of r_j · v_j.
Design goals
The goal is to design a truthful mechanism, that will induce the agents to reveal their true value functions, and then calculate an allocation that satisfies some fairness and efficiency objectives. The common efficiency objectives are:
Pareto efficiency (PE);
Utilitarian social welfare --- defined as the sum of agents' utilities. An allocation maximizing this sum is called utilitarian or max-sum; it is always PE.
Nash social welfare --- defined as the product of agents' utilities. An allocation maximizing this product is called Nash-optimal or max-product or proportionally-fair; it is always PE. When agents have additive utilities, it is equivalent to the competitive equilibrium from equal incomes.
The most common fairness objectives are:
Equal treatment of equals (ETE) --- if two agents have exactly the same utility function, then they should get exactly the same utility.
Envy-freeness --- no agent should envy another agent. It implies ETE.
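Under the linear valuations described above, envy-freeness can be checked mechanically. The sketch below uses hypothetical example numbers: v[i][j] is agent i's value for the whole of resource j, and alloc[i][j] is the fraction of resource j assigned to agent i.

```python
# Check envy-freeness for divisible resources under additive (linear) utilities.

def utility(values, bundle):
    """Agent's utility for a bundle of resource fractions."""
    return sum(frac * val for frac, val in zip(bundle, values))

def is_envy_free(v, alloc):
    """True if no agent strictly prefers another agent's bundle to its own."""
    n = len(v)
    return all(
        utility(v[i], alloc[i]) >= utility(v[i], alloc[k])
        for i in range(n) for k in range(n)
    )

v = [[8, 2], [3, 7]]              # two agents, two divisible resources
alloc = [[1.0, 0.0], [0.0, 1.0]]  # each agent gets the resource they value more
print(is_envy_free(v, alloc))     # → True
```

Since envy-freeness implies equal treatment of equals, the same check subsumes ETE for agents with identical value functions.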
https://en.wikipedia.org/wiki/List%20of%20human%20protein-coding%20genes%203 |
References
Human protein-coding genes 3 |
https://en.wikipedia.org/wiki/List%20of%20human%20protein-coding%20genes%204 |
References
Human protein-coding genes 4 |
https://en.wikipedia.org/wiki/Perturbed%20angular%20correlation | The perturbed γ-γ angular correlation, PAC for short or PAC-Spectroscopy, is a method of nuclear solid-state physics with which magnetic and electric fields in crystal structures can be measured. In doing so, electrical field gradients and the Larmor frequency in magnetic fields as well as dynamic effects are determined. With this very sensitive method, which requires only about 10-1000 billion atoms of a radioactive isotope per measurement, material properties in the local structure, phase transitions, magnetism and diffusion can be investigated. The PAC method is related to nuclear magnetic resonance and the Mössbauer effect, but shows no signal attenuation at very high temperatures.
Today only the time-differential perturbed angular correlation (TDPAC) is used.
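As a rough sense of scale for one quantity PAC measures, the nuclear Larmor angular frequency ω_L = g·μ_N·B/ħ can be computed directly; the g-factor and field strength below are assumed values for illustration, not those of any particular PAC probe:

```python
# Toy calculation of a nuclear Larmor angular frequency:
#   omega_L = g * mu_N * B / hbar
# g and B are assumed illustrative values, not real probe parameters.
MU_N = 5.0507837e-27   # nuclear magneton, J/T
HBAR = 1.0545718e-34   # reduced Planck constant, J*s
g, B = 1.0, 10.0       # assumed nuclear g-factor and field in tesla
omega_L = g * MU_N * B / HBAR
print(omega_L)         # angular frequency in rad/s, roughly 4.8e8
```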
History and development
PAC goes back to a theoretical work by Donald R. Hamilton from 1940. The first successful experiment was carried out by Brady and Deutsch in 1947. Essentially, the spins and parities of nuclear states were investigated in these first PAC experiments. However, it was recognized early on that electric and magnetic fields interact with the nuclear moment, providing the basis for a new form of material investigation: nuclear solid-state spectroscopy.
Step by step the theory was developed.
After Abragam and Pound published their work on the theory of PAC in 1953 including extra nuclear fields, many studies with PAC were carried out afterwards. In the 1960s and 1970s, interest in PAC experiments sharply increased, focusing mainly on magnetic and electric fields in crystals into which the probe nuclei were introduced. In the mid-1960s, ion implantation was discovered, providing new opportunities for sample preparation. The rapid electronic development of the 1970s brought significant improvements in signal processing. From the 1980s to the present, PAC has emerged as an important method for the study and characterization of materials, e.g. for the study of semiconductor materials, intermetal |
https://en.wikipedia.org/wiki/Fluorescence-activating%20and%20absorption-shifting%20tag | FAST (Fluorescence-Activating and absorption-Shifting Tag) is a small, genetically encoded protein tag which allows for fluorescence reporting of proteins of interest. Unlike natural fluorescent proteins and derivatives such as GFP or mCherry, FAST is not fluorescent by itself. It selectively binds a fluorogenic chromophore derived from 4-hydroxybenzylidene rhodanine (HBR), which is itself non-fluorescent unless bound. Once bound, the pair of molecules goes through a unique fluorogen activation mechanism based on two spectroscopic changes, an increase of fluorescence quantum yield and an absorption red shift, hence providing high labeling selectivity. The FAST-fluorogen reporting system can be used in fluorescence microscopy, flow cytometry and any other fluorometric method to explore the living world: biosensors, protein trafficking.
FAST, a small 14 kDa protein, was engineered from the photoactive yellow protein (PYP) by directed evolution. It was reported for the first time in 2016 by researchers from Ecole normale supérieure de Paris.
Mechanism
FAST pertains to a chemical-genetic strategy for specific labeling of proteins. A peptide domain, called a "tag", is genetically encoded and fused to a protein of interest (by combining their respective genes by means of transfection or infection). This tag is the anchor for a synthetic fluorescent probe that is added afterwards. Such a chemical-genetic approach had already been implemented, alongside natural fluorescent proteins such as GFP and derivatives such as mCherry, in several widely used systems:
since 2003, SNAP-tag, a bi-component reporting system consisting of a 19 kDa peptide derived from a human enzyme, O6-methylguanine-DNA methyltransferase, evolved to form covalent bonds with fluorescent O6-benzylguanine derivatives; SNAP-tag was later evolved into an orthogonal tag, CLIP-tag;
since 2008, HaloTag, a bi-component reporting system consisting of a 33 kDa peptide derived from a bacterial enzyme, a |
https://en.wikipedia.org/wiki/Outline%20of%20web%20design%20and%20web%20development | The following outline is provided as an overview of and topical guide to web design and web development, two very related fields:
Web design – field that encompasses many different skills and disciplines in the production and maintenance of websites. The different areas of web design include web graphic design; interface design; authoring, including standardized code and proprietary software; user experience design; and search engine optimization. Often many individuals will work in teams covering different aspects of the design process, although some designers will cover them all. The term web design is normally used to describe the design process relating to the front-end (client side) design of a website including writing markup. Web design partially overlaps web engineering in the broader scope of web development. Web designers are expected to have an awareness of usability and if their role involves creating markup then they are also expected to be up to date with web accessibility guidelines.
Web development – work involved in developing a web site for the Internet (World Wide Web) or an intranet (a private network). Web development can range from developing a simple single static page of plain text to complex web-based internet applications (web apps), electronic businesses, and social network services. A more comprehensive list of tasks to which web development commonly refers, may include web engineering, web design, web content development, client liaison, client-side/server-side scripting, web server and network security configuration, and e-commerce development.
Among web professionals, "web development" usually refers to the main non-design aspects of building web sites: writing markup and coding. Web development may use content management systems (CMS) to make content changes easier and available with basic technical skills.
For larger organizations and businesses, web development teams can consist of hundreds of people (web developers) and follow |
https://en.wikipedia.org/wiki/Marcia%20Groszek | Marcia Jean Groszek is an American mathematician whose research concerns mathematical logic, set theory, forcing, and recursion theory. She is a professor of mathematics at Dartmouth College.
Education
As a high school student, Groszek felt isolated for her interest in mathematics,
but she found a sense of community through her participation in the Hampshire College Summer Mathematics Program, and she went on to earn her bachelor's degree at Hampshire College. She completed her Ph.D. in 1981 at Harvard University. Her dissertation, Iterated Perfect Set Forcing and Degrees of Constructibility, was supervised by Akihiro Kanamori.
Research
With Theodore Slaman, Groszek showed that (if they exist at all) non-constructible real numbers must be widespread, in the sense that every perfect set contains one of them, and they asked analogous questions of the non-computable real numbers. With Slaman, she has also shown that the existence of a maximally independent set of Turing degrees, of cardinality less than the cardinality of the continuum, is independent of ZFC.
In the theory of ordinal definable sets, an unordered pair of sets is said to be a Groszek–Laver pair if the pair is ordinal definable but neither of its two elements is; this concept is named for Groszek and Richard Laver, who observed the existence of such pairs in certain models of set theory.
Service and outreach
Groszek was program chair of the 2014 North American annual meeting of the Association for Symbolic Logic. Her interest in logic extends to education as well as to research; she has participated in the Association for Symbolic Logic Committee on Logic Education, and in 2011 she was co-organizer of an Association for Symbolic Logic special session on "Logic in the Undergraduate Mathematics Curriculum".
With mathematics colleague Dorothy Wallace and performance artist Josh Kornbluth, Groszek has also helped write and produce a sequence of educational videos about mathematics.
Selected publications
|
https://en.wikipedia.org/wiki/Lattice%20of%20stable%20matchings | In mathematics, economics, and computer science, the lattice of stable matchings is a distributive lattice whose elements are stable matchings. For a given instance of the stable matching problem, this lattice provides an algebraic description of the family of all solutions to the problem. It was originally described in the 1970s by John Horton Conway and Donald Knuth.
By Birkhoff's representation theorem, this lattice can be represented as the lower sets of an underlying partially ordered set. The elements of this set can be given a concrete structure as rotations, with cycle graphs describing the changes between adjacent stable matchings in the lattice. The family of all rotations and their partial order can be constructed in polynomial time, leading to polynomial time solutions for other problems on stable matching including the minimum or maximum weight stable matching. The Gale–Shapley algorithm can be used to construct two special lattice elements, its top and bottom element.
Every finite distributive lattice can be represented as a lattice of stable matchings.
The number of elements in the lattice can vary from an average case that is polynomial in the number of participants to a worst case that is exponential.
Computing the number of elements is #P-complete.
Background
In its simplest form, an instance of the stable matching problem consists of two sets of the same number of elements to be matched to each other, for instance doctors and positions at hospitals. Each element has a preference ordering on the elements of the other type: the doctors each have different preferences for which hospital they would like to work at (for instance based on which cities they would prefer to live in), and the hospitals each have preferences for which doctors they would like to work for them (for instance based on specialization or recommendations). The goal is to find a matching that is stable: no pair of a doctor and a hospital prefer each other to their assigned match. Versions of this problem are used, for instance, |
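On a tiny instance the lattice structure can be checked by brute force. The sketch below uses hypothetical 3×3 preference lists (chosen so that several stable matchings exist), enumerates all stable matchings, and verifies that the doctor-wise "worse of two partners" operation (the lattice meet) always yields another stable matching:

```python
from itertools import permutations

# Hypothetical 3x3 instance; preference lists are made up so that
# several stable matchings exist.
doc_pref = [[0, 1, 2], [1, 2, 0], [2, 0, 1]]   # doctor d's hospitals, best first
hos_pref = [[1, 2, 0], [2, 0, 1], [0, 1, 2]]   # hospital h's doctors, best first
n = 3
drank = [[p.index(h) for h in range(n)] for p in doc_pref]  # drank[d][h]: 0 = best
hrank = [[p.index(d) for d in range(n)] for p in hos_pref]

def is_stable(match):          # match[d] = hospital assigned to doctor d
    hosp_of = {match[d]: d for d in range(n)}
    for d in range(n):
        for h in range(n):
            # (d, h) is a blocking pair if both strictly prefer each other
            if drank[d][h] < drank[d][match[d]] and hrank[h][d] < hrank[h][hosp_of[h]]:
                return False
    return True

stable = [m for m in permutations(range(n)) if is_stable(m)]
print(len(stable))             # this instance has 3 stable matchings

# Lattice meet: give every doctor the worse of its two partners; by the
# lattice theorem the result is again a stable matching.
def meet(m1, m2):
    return tuple(m1[d] if drank[d][m1[d]] > drank[d][m2[d]] else m2[d]
                 for d in range(n))

for m1 in stable:
    for m2 in stable:
        assert is_stable(meet(m1, m2))
```

The analogous join (each doctor's better partner) is also closed over the stable matchings; swapping the comparison in `meet` checks it the same way.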
https://en.wikipedia.org/wiki/Anti-social%20Media%20Bill%20%28Nigeria%29 | The Anti-social Media Bill was introduced by the Senate of the Federal Republic of Nigeria on 5 November 2019 to criminalise the use of social media in peddling false or malicious information. The original title of the bill is the Protection from Internet Falsehood and Manipulations Bill 2019. It was sponsored by Senator Mohammed Sani Musa from the largely conservative northern Nigeria. After the bill passed second reading on the floor of the Nigerian Senate and its details were made public, information emerged on social media accusing the sponsor of the bill of plagiarising a similar law from Singapore, a country that ranks near the bottom of global rankings for freedom of speech and of the press. The senator denied that he plagiarised the Singaporean law.
Opposition to the bill
Angry reactions trailed the introduction of the bill, and a number of civil society organisations, human rights activists, and Nigerian citizens unanimously opposed it. The international rights groups Amnesty International and Human Rights Watch condemned the proposed legislation, saying it was aimed at gagging freedom of speech, a universal right, in a country of over two hundred million people.
Opposition political parties were very critical of the bill and accused the government of attempting to strip Nigerian citizens of their right to free speech and of destroying the same social media on whose power and influence the ruling All Progressives Congress (APC) came to power in 2015. Nigeria's Information Minister, Lai Mohammed, has been at the center of public criticism because he is suspected to be the brain behind the proposed act. Lai was formerly a spokesman of the then-opposition All Progressives Congress.
A "Stop the Social Media Bill! You can no longer take our rights from us" online petition campaign to force the Nigeria parliament to drop the bill received over 90,000 signatures within 24 hours. In November 2019, after the bill passed second reading in the senate, Akon Eyakenyi, a senator from A |
https://en.wikipedia.org/wiki/Godzilla%20head | The Godzilla head is a landmark and tourist attraction in Kabukichō, Shinjuku, Tokyo, Japan. The sculpture is accessible from Hotel Gracery Shinjuku's Godzilla Terrace, on the Shinjuku Toho Building. It depicts Godzilla, occasionally with "glowing eyes and smoky breath". The 80-ton head, based on Godzilla's appearance in Godzilla vs. Mothra (1992), was unveiled in 2015. Its placement on the hotel's terrace matches the 50-meter height at which Godzilla was depicted in the franchise's Shōwa-era films.
Reception
Editors of Time Out Tokyo included the Godzilla head in their 2019 list of the city's "best public art sculptures".
See also
Godzilla in popular culture
Godzilla Street
References
2015 establishments in Japan
2015 sculptures
Godzilla (franchise)
Outdoor sculptures in Tokyo
Shinjuku
Tourist attractions in Tokyo
Heads in the arts |
https://en.wikipedia.org/wiki/Partial%20allocation%20mechanism | The Partial Allocation Mechanism (PAM) is a mechanism for truthful resource allocation. It is based on the max-product allocation - the allocation maximizing the product of agents' utilities (also known as the Nash-optimal allocation or the Proportionally-Fair solution; in many cases it is equivalent to the competitive equilibrium from equal incomes). It guarantees to each agent at least 0.368 of his/her utility in the max-product allocation. It was designed by Cole, Gkatzelis and Goel.
Setting
There are m resources that are assumed to be homogeneous and divisible.
There are n agents, each of whom has a personal function that attributes a numeric value to each "bundle" (combination of resources). The valuations are assumed to be homogeneous functions.
The goal is to decide what "bundle" to give to each agent, where a bundle may contain a fractional amount of each resource.
Crucially, some resources may have to be discarded, i.e., free disposal is assumed.
Monetary payments are not allowed.
Algorithm
PAM works in the following way.
Calculate the max-product allocation; denote it by z.
For each agent i:
Calculate the max-product allocation when i is not present.
Let fi = [(the product of the other agents' utilities in z) / (the maximum product of the other agents' utilities when i is not present)]^(1/(n-1)).
Give to agent i a fraction fi of each resource he gets in z.
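A minimal numerical sketch of these steps, assuming two agents with additive linear valuations over two divisible goods, and using a coarse grid search in place of a proper convex solver for the max-product allocations (all values are made up, and the exponent 1/(n-1) is trivial since n = 2 here):

```python
import itertools

# Sketch of the Partial Allocation Mechanism for two agents with additive
# linear valuations over two divisible goods (values below are made up).
# Max-product allocations are found by a coarse grid search, so this is
# an approximation for illustration only.
V = [(2.0, 1.0),   # agent 0's value per unit of goods 0 and 1
     (1.0, 2.0)]   # agent 1's value per unit of goods 0 and 1

def utility(agent, bundle):
    return sum(v * x for v, x in zip(V[agent], bundle))

grid = [i / 100 for i in range(101)]

def max_product(agents):
    """Grid-search the allocation maximizing the product of utilities."""
    if len(agents) == 1:          # a lone agent simply takes everything
        return {agents[0]: (1.0, 1.0)}
    a0, a1 = agents
    best, best_alloc = -1.0, None
    for x, y in itertools.product(grid, repeat=2):
        p = utility(a0, (x, y)) * utility(a1, (1 - x, 1 - y))
        if p > best:
            best, best_alloc = p, {a0: (x, y), a1: (1 - x, 1 - y)}
    return best_alloc

z = max_product([0, 1])
fractions = {}
for i in [0, 1]:
    others = [j for j in [0, 1] if j != i]
    z_without_i = max_product(others)
    num, den = 1.0, 1.0
    for j in others:
        num *= utility(j, z[j])
        den *= utility(j, z_without_i[j])
    # With n agents the ratio is raised to the power 1/(n-1); here n = 2,
    # so the exponent is 1 and can be omitted.
    fractions[i] = num / den

# Each agent keeps the fraction f_i of every resource in its max-product bundle.
pam_alloc = {i: tuple(fractions[i] * x for x in z[i]) for i in [0, 1]}
print(fractions)   # both fractions are 2/3 for this symmetric instance
```

For this instance the max-product allocation gives each agent its favored good entirely; each agent then keeps 2/3 of its bundle, comfortably above the 1/e guarantee.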
Properties
PAM has the following properties.
It is a truthful mechanism - each agent's utility is maximized by revealing his/her true valuations.
For each agent i, the utility of i is at least 1/e ≈ 0.368 of his/her utility in the max-product allocation.
When the agents have additive linear valuations, the allocation is envy-free.
PA vs VCG
The PA mechanism, which does not use payments, is analogous to the VCG mechanism, which uses monetary payments. VCG starts by selecting the max-sum allocation, and then for each agent i it calculates the max-sum allocation when i is not present, and pays i the difference (max-sum wh |
https://en.wikipedia.org/wiki/Russell%20Lyons | Russell David Lyons (born 6 September 1957) is an American mathematician, specializing in probability theory on graphs, combinatorics, statistical mechanics, ergodic theory and harmonic analysis.
Lyons graduated with a B.A. in mathematics in 1979 from Case Western Reserve University, where he became a Putnam Fellow in 1977 and 1978. He received his Ph.D. in 1983 from the University of Michigan with the thesis A Characterization of Measures Whose Fourier-Stieltjes Transforms Vanish at Infinity, which was supervised by Hugh L. Montgomery and Allen Shields. Lyons was a postdoc for the academic year 1984–1985 at the University of Paris-Sud. He was an assistant professor at Stanford University from 1985 to 1990 and an associate professor at Indiana University from 1990 to 1994. At Georgia Tech he was a full professor from 2000 to 2003. At Indiana University he was a professor of mathematics from 1994 to 2014 and has been, since 2014, the James H. Rudy Professor of Mathematics; there he has also been an adjunct professor of statistics since 2006.
Lyons has held visiting positions in the United States, France, and Israel. In 2012 he was elected a Fellow of the American Mathematical Society. In 2014 he was an invited speaker of the International Congress of Mathematicians (ICM) in Seoul. In 2017 a conference was held in Tel Aviv in honor of his 60th birthday.
Selected publications
References
External links
(joint with Yuval Peres)
1957 births
Living people
20th-century American mathematicians
21st-century American mathematicians
Probability theorists
Case Western Reserve University alumni
University of Michigan alumni
Stanford University faculty
Indiana University faculty
Georgia Tech faculty
Fellows of the American Mathematical Society
Putnam Fellows |
https://en.wikipedia.org/wiki/Security%20token%20offering | A security token offering (STO) / tokenized IPO is a type of public offering in which tokenized digital securities, known as security tokens, are sold in security token exchanges. Tokens can be used to trade real financial assets such as equities and fixed income, and use a blockchain virtual ledger system to store and validate token transactions.
Due to tokens being classified as securities, STOs are more susceptible to regulation and thus represent a more secure investment alternative than ICOs, which have been subject to numerous fraudulent schemes.
Furthermore, since STOs are not held in traditional exchanges, they can be a less expensive funding source for small and medium-sized companies when compared to an IPO. An STO on a regulated stock exchange (referred to as a tokenized IPO) has the potential to deliver significant efficiencies and cost savings, however.
By the end of 2019, STOs had been used in multiple scenarios including the trading of Nasdaq-listed company stocks, the pre-IPO of World Chess, FIDE's official broadcasting platform, and the creation of Singapore Exchange's own STO market, backed by Japan's Tokai Tokyo Financial Holdings.
Controversy regarding ICOs
Though sharing some core concepts with ICOs and IPOs, STOs are in fact different from both, standing as an intermediary model. Similarly to ICOs, STOs are offerings made by selling digital tokens to the general public in cryptocurrency exchanges such as Binance, Kraken, Binaryx and others. The main difference is that ICO tokens are the offered cryptocurrency's actual coins, entirely digital, and classified as utilities. New ICO currencies can be generated ad infinitum, as in some cases can their tokens. Additionally, their value is almost entirely speculative and arises from the perceived utility value buyers expect them to provide.
Security tokens, on the other hand, are actual securities, like bonds or stocks, tied to a real company.
In terms of legislation, so |
https://en.wikipedia.org/wiki/China%20Cables | The China Cables are a collection of secret Chinese government documents from 2017 which were leaked by exiled Uyghurs to the International Consortium of Investigative Journalists, and published on 24 November 2019. The documents include a telegram which details the first known operations manual for running the Xinjiang internment camps, and bulletins which illustrate how China's centralized data collection system and mass surveillance tool, known as the Integrated Joint Operations Platform, uses artificial intelligence to identify people for interrogation and potential detention.
The Chinese government has called the cables "pure fabrication" and "fake news", further stating that the West was "slandering and smearing" China. The documents' release sparked renewed attention to the Uyghur internment camps and the Uyghur genocide.
Description and contents
On November 24, 2019, the International Consortium of Investigative Journalists published secret Chinese government documents from 2017 dubbed as the "China Cables", which exiled Uyghurs had leaked to them. The documents consisted of a classified telegram called "New Secret 5656" from 2017, four bulletins/security briefings and one court document.
The classified telegram detailed the first known operations manual for running "between 1,300 and 1,400" internment camps of Muslim Uyghurs in Xinjiang. It was signed by Zhu Hailun, head of Xinjiang's Political and Legal Commission and, at the time, deputy secretary of Xinjiang's Party Committee of the Chinese Communist Party. According to the American delegate to the UN committee on the elimination of racial discrimination, China is holding one million Uyghurs in these camps.
The four bulletins are secret government intelligence briefings from China's centralized data collection system, the "Integrated Joint Operations Platform" (IJOP), which uses artificial intelligence to identify people for questioning and potential detention. It illustrated a connection between mass surveillance in China
https://en.wikipedia.org/wiki/Stable%20matching%20polytope | In mathematics, economics, and computer science, the stable matching polytope or stable marriage polytope is a convex polytope derived from the solutions to an instance of the stable matching problem.
Description
The stable matching polytope is the convex hull of the indicator vectors of the stable matchings of the given problem. It has a dimension for each pair of elements that can be matched, and a vertex for each stable matching. For each vertex, the Cartesian coordinates are one for pairs that are matched in the corresponding matching, and zero for pairs that are not matched.
The stable matching polytope has a polynomial number of facets. These include the conventional inequalities describing matchings without the requirement of stability (each coordinate must be between 0 and 1, and for each element to be matched the sum of coordinates for the pairs involving that element must be exactly one), together with inequalities constraining the resulting matching to be stable (for each potentially matched pair of elements, the sum of coordinates for matches that are at least as good for one of the two elements must be at least one). The points satisfying all of these constraints can be thought of as the fractional solutions of a linear programming relaxation of the stable matching problem.
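Written out in symbols, the constraints just described take the following shape. This is a sketch in the doctors/hospitals notation used in this article, assuming a one-to-one market where every participant must be matched; it follows what is usually called Rothblum's linear description:

```latex
% Sketch of the facet constraints, with x_{dh} the coordinate for the
% pair (doctor d, hospital h); \succ_d denotes d's preference order.
\begin{align*}
  0 \le x_{dh} \le 1 \qquad & \text{for every pair } (d,h),\\
  \sum_{h} x_{dh} = 1 \qquad & \text{for every doctor } d,\\
  \sum_{h' \,:\, h' \succeq_d h} x_{dh'} \;+\; \sum_{d' \,:\, d' \succ_h d} x_{d'h} \;\ge\; 1
      \qquad & \text{for every pair } (d,h).
\end{align*}
```

The last family of inequalities encodes stability: if neither d nor h is matched at least as well as with each other, the pair (d, h) would be blocking.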
Integrality
It is a theorem that the polytope described by the facet constraints listed above has only the vertices described above. In particular it is an integral polytope. This can be seen as an analogue of the theorem of Garrett Birkhoff that an analogous polytope, the Birkhoff polytope, describing the set of all fractional matchings between two sets, is integral.
An equivalent way of stating the same theorem is that every fractional matching can be expressed as a convex combination of integral matchings. This is proven by constructing a probability distribution on integral matchings whose expected value can be set equal to any given fractional matching. To do so, the proof performs th
https://en.wikipedia.org/wiki/Caveasphaera | Caveasphaera is a multicellular organism found in 609-million-year-old rocks laid down during the Ediacaran period in the Guizhou Province of South China. The organism is not easily defined as an animal or non-animal. The organism is notable due to the study of related embryonic fossils (measuring about a half-millimeter in diameter) which display different stages of its development: from early single-cell stages to later multicellular stages. Such fossil studies present the earliest evidence of an essential step in animal evolution – the ability to develop distinct tissue layers and organs. According to researchers, fossil studies of Caveasphaera have suggested that animal-like embryonic development arose much earlier than the oldest clearly defined animal fossils and may be consistent with studies suggesting that animal evolution may have begun about 750 million years ago. Caveasphaera fossils look similar to starfish and coral embryos, however, and researchers have concluded, "Parental investment in the embryonic development of Caveasphaera and co-occurring Tianzhushania and Spiralicellula, as well as delayed onset of later development, may reflect an adaptation to the heterogeneous nature of the early Ediacaran nearshore marine environments in which early animals evolved."
References
External links
Tree of life (biology)
Enigmatic eukaryote taxa |
https://en.wikipedia.org/wiki/HelpSmith | HelpSmith is a Windows-based help authoring tool published by Divcom Software.
HelpSmith allows a technical writer to create documentation in various formats, such as HTML Help (CHM), Web Help (an HTML-based help system), PDF, and ePub. HelpSmith also includes screen capture and image annotation tools.
History
Version 1.0 of HelpSmith was released in 2007 as a help authoring tool that had support for a single HTML Help (CHM) format. On February 7, 2008, HelpSmith was presented on the Giveaway of the Day website where the product received initial feedback and feature requests from its users.
Over the past few years, HelpSmith has obtained support for the major documentation formats, support for High DPI displays, improvements to the user interface, and other enhancements.
Key features
Similarly to common help authoring tools, HelpSmith includes a word processor to edit the content of help topics, customizable templates, user-defined variables, the ability to import existing documentation, media files management tools, support for various output formats, conditional compilation capabilities, and other functions.
Image editing capabilities
The integrated Image Tool can be used by a technical writer to capture screenshots of an application or website, and to add annotations for elements demonstrated on a screenshot. The Image Tool also has support for control annotations which can be used for creation of user interface documentation.
Awards
On the G2 Crowd website, HelpSmith has been marked as a high performer (Fall 2020, Winter 2021) in the category of "help authoring tool" (HAT) software, and is placed in the list of top help authoring tools. The G2 Crowd score is based on ratings from real users, taking into account factors such as ease of use, ease of setup, quality of support, compliance with requirements, and others.
On October 30, 2019, Visual Studio Magazine announced the winners of its 26th annual Reader's Choice Awards where Hel |
https://en.wikipedia.org/wiki/Structural%20Ramsey%20theory | In mathematics, structural Ramsey theory is a categorical generalisation of Ramsey theory, rooted in the idea that many important results of Ramsey theory have "similar" logical structures. The key observation is noting that these Ramsey-type theorems can be expressed as the assertion that a certain category (or class of finite structures) has the Ramsey property (defined below).
Structural Ramsey theory began in the 1970s with the work of Nešetřil and Rödl, and is intimately connected to Fraïssé theory. It received some renewed interest in the mid-2000s due to the discovery of the Kechris–Pestov–Todorčević correspondence, which connected structural Ramsey theory to topological dynamics.
History
The idea of a Ramsey property was invented in the early 1970s; the first publication of this idea appears to be Graham, Leeb and Rothschild's 1972 paper on the subject. Key development of these ideas was done by Nešetřil and Rödl in their series of 1977 and 1983 papers, including the famous Nešetřil–Rödl theorem. This result was reproved independently by Abramson and Harrington, and was later further generalised. More recently, Mašulović and Solecki have done some pioneering work in the field.
Motivation
This article will use the set theory convention that each natural number n can be identified with the set of all natural numbers less than it, i.e. n = {0, 1, ..., n - 1}. For any set A, a k-colouring of A is an assignment of one of k labels to each element of A. This can be represented as a function mapping each element of A to its label (the representation this article will use), or equivalently as a partition of A into k pieces.
Here are some of the classic results of Ramsey theory:
(Finite) Ramsey's theorem: for every m, n, k, there exists N such that for every k-colouring of all the m-element subsets of N, there exists a subset A ⊆ N, with |A| = n, such that the set of m-element subsets of A is monochromatic.
(Finite) van der Waerden's theorem: for every n and k, there exists N such that for every k-colouring of N, there exists a monochromatic arithmetic progression of length n.
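The smallest nontrivial case of the finite Ramsey theorem above (k = 2 colours, m = 2-element subsets, n = 3) can be verified exhaustively: every 2-colouring of the edges of the complete graph on 6 vertices contains a monochromatic triangle, while 5 vertices do not suffice. A brute-force sketch:

```python
from itertools import combinations, product

def has_mono_triangle(N, colouring):
    """Does this 2-colouring of the edges of K_N contain a monochromatic triangle?"""
    edges = list(combinations(range(N), 2))
    col = dict(zip(edges, colouring))
    return any(col[(a, b)] == col[(a, c)] == col[(b, c)]
               for a, b, c in combinations(range(N), 3))

def ramsey_holds(N):
    """Check every 2-colouring of the N*(N-1)/2 edges of K_N."""
    m = N * (N - 1) // 2
    return all(has_mono_triangle(N, c) for c in product((0, 1), repeat=m))

print(ramsey_holds(5), ramsey_holds(6))  # False True
```

This is exactly the statement R(3, 3) = 6; the `False` for N = 5 is witnessed by the pentagon/pentagram colouring.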
https://en.wikipedia.org/wiki/Code%20Ninjas | Code Ninjas is a for-profit educational organization specializing in teaching coding to kids, and is the largest kids coding franchise in the world with over 400 locations open and operating in three countries. It is headquartered in Pearland, Texas. It was founded by David Graham in 2016, inspired by watching his son learn Tae Kwon Do. It has locations in the United States, Canada, and United Kingdom.
Structure
Code Ninjas buildings are separated into classrooms and lobbies. The lobbies are for parents to pick up and drop off their kids and have free Wi-Fi, refreshments, and games or toys for the kids to play with while on break or waiting for their parents. Meanwhile, the classrooms (referred to as dojos) have small desks and are restricted to Code Senseis (the educators) and Ninjas (the students), aged 7–14, who are given laptops to do programming. Each of the kids starts out at white belt and works their way up the "Path of Enlightenment" to black belt. In the "Create" program, different belts have different coding languages: for example, white, yellow, orange, and green belts learn JavaScript, blue belts learn Luau (Roblox's own version of Lua), and purple, brown, red, and black belts learn C# with Unity. Impact, the latest curriculum, launched around May 2023, is structured around Microsoft MakeCode Arcade, a system that allows for text-based (JavaScript) or block-based coding; the previous program utilized a Konva-based game engine. In black belt, the ninjas are directed to create their own game through the Unity platform. These games are then approved and uploaded to the Code Ninjas website. Throughout the curriculum, ninjas learn about computer science concepts such as control flow, object-oriented programming, and many other common programming concepts.
Summer Camps
During the summer, Code Ninjas offers camps alongside normal classes, where the parents drop their children off for a half-day summer class during the weekdays, either in the morning or in |
https://en.wikipedia.org/wiki/Polynomial%20method%20in%20combinatorics | In mathematics, the polynomial method is an algebraic approach to combinatorics problems that involves capturing some combinatorial structure using polynomials and proceeding to argue about their algebraic properties. Recently, the polynomial method has led to the development of remarkably simple solutions to several long-standing open problems. The polynomial method encompasses a wide range of specific techniques for using polynomials and ideas from areas such as algebraic geometry to solve combinatorics problems. While a few techniques that follow the framework of the polynomial method, such as Alon's Combinatorial Nullstellensatz, have been known since the 1990s, it was not until around 2010 that a broader framework for the polynomial method has been developed.
Mathematical overview
Many uses of the polynomial method follow the same high-level approach. The approach is as follows:
Embed some combinatorial problem into a vector space.
Capture the hypotheses of the problem by constructing a low-degree polynomial that is zero on a certain set.
After constructing the polynomial, argue about its algebraic properties to deduce that the original configuration must satisfy the desired properties.
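Step 2 of this recipe can be made concrete with a short linear-algebra sketch (purely illustrative: the field size p, the point set, and the degree bound are arbitrary choices, not from the article). Whenever the number of monomials of degree at most d exceeds the number of points, the evaluation matrix has a nonzero null vector over $\mathbb{F}_p$, i.e. a nonzero low-degree polynomial vanishing on the whole set:

```python
# Find a nonzero polynomial of degree <= 2 over F_p vanishing on a point set S.
# Works whenever #monomials > #points, so the evaluation matrix has a null space.
p = 5                                    # illustrative field size (assumption)
points = [(1, 2), (3, 4), (0, 1)]        # the combinatorial set S (assumption)
monos = [(a, b) for a in range(3) for b in range(3) if a + b <= 2]  # 6 monomials

# Evaluation matrix: row per point, column per monomial x^a * y^b, entries mod p.
M = [[pow(x, a, p) * pow(y, b, p) % p for (a, b) in monos] for (x, y) in points]

def null_vector(M, ncols, p):
    """Return a nonzero null-space vector of M over F_p (Gaussian elimination)."""
    M = [row[:] for row in M]
    pivots = {}                          # pivot column -> row index
    r = 0
    for c in range(ncols):
        pr = next((i for i in range(r, len(M)) if M[i][c] % p), None)
        if pr is None:
            continue                     # c stays a free column
        M[r], M[pr] = M[pr], M[r]
        inv = pow(M[r][c], p - 2, p)     # inverse via Fermat's little theorem
        M[r] = [v * inv % p for v in M[r]]
        for i in range(len(M)):
            if i != r and M[i][c] % p:
                f = M[i][c]
                M[i] = [(vi - f * vr) % p for vi, vr in zip(M[i], M[r])]
        pivots[c] = r
        r += 1
    free = next(c for c in range(ncols) if c not in pivots)  # exists: rank < ncols
    v = [0] * ncols
    v[free] = 1
    for c, row in pivots.items():
        v[c] = (-M[row][free]) % p
    return v

coeffs = null_vector(M, len(monos), p)
# Verify the polynomial really vanishes on every point of S.
for (x, y) in points:
    assert sum(c * pow(x, a, p) * pow(y, b, p)
               for c, (a, b) in zip(coeffs, monos)) % p == 0
print("vanishing polynomial coefficients:", coeffs)
```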
Example
As an example, we outline Dvir's proof of the Finite Field Kakeya Conjecture using the polynomial method.
Finite Field Kakeya Conjecture: Let $\mathbb{F}_q$ be a finite field with $q$ elements. Let $K \subseteq \mathbb{F}_q^n$ be a Kakeya set, i.e. for each vector $y \in \mathbb{F}_q^n$ there exists $x \in \mathbb{F}_q^n$ such that $K$ contains a line $\{x + ty : t \in \mathbb{F}_q\}$. Then the set $K$ has size at least $c_n q^n$ where $c_n$ is a constant that only depends on $n$.
Proof: The proof we give will show that $K$ has size at least $\binom{q+n-2}{n-1}$. The bound of $c_n q^n$ can be obtained using the same method with a little additional work.
Assume we have a Kakeya set $K$ with $|K| < \binom{q+n-2}{n-1}$.
Consider the set of monomials of the form $x_1^{a_1} x_2^{a_2} \cdots x_n^{a_n}$ of degree exactly $q-1$. There are exactly $\binom{q+n-2}{n-1}$ such monomials. Thus, there exists a nonzero homogeneous polynomial $P$ of degree $q-1$ that vanishes on all points in $K$. Note this is because finding such a polynomia
https://en.wikipedia.org/wiki/Phylogenetic%20classification%20of%20bony%20fishes | The phylogenetic classification of bony fishes is a phylogenetic classification of bony fishes and is based on phylogenies inferred using molecular and genomic data for nearly 2000 fishes. The first version was published in 2013 and resolved 66 orders. The latest version (version 4) was published in 2017 and recognised 72 orders and 79 suborders.
Phylogeny
The following cladograms show the phylogeny of the Osteichthyes down to order level, with the number of families in parentheses.
The 43 orders of spiny-rayed fishes are related as follows:
References
External links
www.deepfin.org - Phylogeny of all Fishes (redirects to https://sites.google.com/site/guilleorti/home)
Phylogenetics
Bony fish |
https://en.wikipedia.org/wiki/Mathematica%20Applicanda | Mathematica Applicanda is a peer-reviewed scientific journal covering applied mathematics. It was established in 1973 by the Polish Mathematical Society as Series III of the Annales Societatis Mathematicae Polonae, under the name Matematyka Stosowana (ISSN 0137-2890). The first editor-in-chief was Marceli Stark. In 1999 the journal was renamed Matematyka Stosowana-Matematyka dla Społeczeństwa (ISSN 1730-2668). Since 2012 its main edition has been the electronic one, published under the name Mathematica Applicanda (ISSN 2299-4009).
Former Editors-in-chief
Marceli Stark (volume I)
Robert Bartoszyński (volumes II - XXIX)
Andrzej Kiełbasiński (volumes XXX - XLI)
Witold Kosiński (volumes XLII - LIV)
Krzysztof J. Szajowski (volumes LV - LXIII)
Krzysztof Burnecki (volume LXIV)
Abstracting and indexing
The journal is abstracted and indexed in
MathSciNet
Zentralblatt MATH
CEON The Library of Science (Biblioteka Nauki)
BazTech
Scopus
See also
List of mathematical physics journals
List of probability journals
List of statistics journals
References
External links
Mathematics journals
Academic journals established in 1973
English-language journals
Biannual journals |
https://en.wikipedia.org/wiki/Counting%20lemma | The counting lemmas this article discusses are statements in combinatorics and graph theory. The first one extracts information from $\varepsilon$-regular pairs of subsets of vertices in a graph $G$, in order to guarantee patterns in the entire graph; more explicitly, these patterns correspond to the count of copies of a certain graph $H$ in $G$. The second counting lemma provides a similar yet more general notion on the space of graphons, in which a scalar multiple of the cut distance between two graphons bounds the difference of their homomorphism densities with respect to a graph $H$.
Graph embedding version of counting lemma
Whenever we have an $\varepsilon$-regular pair of subsets $(X, Y)$ of vertices in a graph $G$, we can interpret this in the following way: the bipartite graph between $X$ and $Y$, which has density $d$, is close to being a random bipartite graph in which every edge appears with probability $d$, with some error $\varepsilon$.
In a setting where we have several clusters of vertices, some of the pairs between these clusters being $\varepsilon$-regular, we would expect the count of small, or local, patterns to be roughly equal to the count of such patterns in a random graph. These small patterns can be, for instance, the number of graph embeddings of some $H$ in $G$, or more specifically, the number of copies of $H$ in $G$ formed by taking one vertex in each vertex cluster.
The above intuition works, yet there are several important conditions that must be satisfied in order to have a complete statement of the theorem; for instance, lower bounds on the pairwise densities and on the cluster sizes are required. Being more careful about these details, the statement of the graph counting lemma is as follows:
Statement of the theorem
If $H$ is a graph with vertices $1, \ldots, k$ and $m$ edges, and $G$ is a graph with (not necessarily disjoint) vertex subsets $V_1, \ldots, V_k$, such that $|V_i| \ge n$ for all $i$ and for every edge $(i, j)$ of $H$ the pair $(V_i, V_j)$ is $\varepsilon$-regular with density $d_{ij}$, then $G$ contains at least $(1 - m\varepsilon) \prod_{(i,j) \in E(H)} (d_{ij} - \varepsilon) \cdot n^k$ many copies of $H$ with the copy of vertex $i$ in $V_i$.
This theorem is a generalization of the triangle counting lemma, which states the above but |
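The triangle case can be sanity-checked by simulation (a hypothetical experiment; the cluster size, density, and random model are our assumptions, not the article's): a random tripartite graph has $\varepsilon$-regular pairs with high probability, so the number of transversal triangles should be close to $d^3 n^3$:

```python
import random

random.seed(7)
n, d = 40, 0.5  # cluster size and target density (illustrative values)

# Sample three random bipartite graphs between clusters X, Y, Z with edge
# probability d; a truly random pair is eps-regular with high probability.
XY = [[random.random() < d for _ in range(n)] for _ in range(n)]
YZ = [[random.random() < d for _ in range(n)] for _ in range(n)]
XZ = [[random.random() < d for _ in range(n)] for _ in range(n)]

# Count "transversal" triangles: one vertex taken from each cluster.
triangles = sum(XY[x][y] and YZ[y][z] and XZ[x][z]
                for x in range(n) for y in range(n) for z in range(n))

expected = d ** 3 * n ** 3  # the counting-lemma prediction d^3 |X||Y||Z|
print(triangles, expected, abs(triangles / expected - 1))
```

The observed count lands within a few percent of the prediction, which is the qualitative content of the lemma in this model.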
https://en.wikipedia.org/wiki/Pl%C3%BCnnecke%E2%80%93Ruzsa%20inequality | In additive combinatorics, the Plünnecke–Ruzsa inequality is an inequality that bounds the size of various sumsets of a set $B$, given that there is another set $A$ so that $A + B$ is not much larger than $A$. A slightly weaker version of this inequality was originally proven and published by Helmut Plünnecke (1970).
Imre Ruzsa (1989) later published a simpler proof of the current, more general, version of the inequality.
The inequality forms a crucial step in the proof of Freiman's theorem.
Statement
The following sumset notation is standard in additive combinatorics. For subsets $A$ and $B$ of an abelian group and a natural number $k$, the following are defined:
$A + B = \{a + b : a \in A,\, b \in B\}$
$A - B = \{a - b : a \in A,\, b \in B\}$
$kA = \underbrace{A + A + \cdots + A}_{k}$
The set $A + B$ is known as the sumset of $A$ and $B$.
Plünnecke-Ruzsa inequality
The most commonly cited version of the statement of the Plünnecke–Ruzsa inequality is the following: if $A$ and $B$ are finite subsets of an abelian group with $|A + B| \le K|A|$, then for all nonnegative integers $m$ and $n$, $|nB - mB| \le K^{m+n} |A|$.
This is often used when $A = B$, in which case the constant $K = |A + A|/|A|$ is known as the doubling constant of $A$. In this case, the Plünnecke–Ruzsa inequality states that sumsets formed from a set with small doubling constant must also be small.
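The notation and the shape of the bound can be illustrated by brute force in the case $A = B$ (an illustrative script; the choice of $A$ as an arithmetic progression, which has small doubling, is our assumption):

```python
from itertools import product

def sumset(A, B):
    """A + B = {a + b : a in A, b in B}."""
    return {a + b for a, b in product(A, B)}

def iterated(A, k):
    """kA = A + A + ... + A (k-fold sumset)."""
    S = {0}
    for _ in range(k):
        S = sumset(S, A)
    return S

A = set(range(10))               # an arithmetic progression: small doubling
K = len(sumset(A, A)) / len(A)   # doubling constant |A+A| / |A| = 19/10
print("doubling constant:", K)

# Check |mA - nA| <= K^(m+n) * |A| for a few small m, n (with B = A).
for m, n in [(1, 1), (2, 1), (2, 2), (3, 2)]:
    diff = {x - y for x in iterated(A, m) for y in iterated(A, n)}
    assert len(diff) <= K ** (m + n) * len(A)
    print(m, n, len(diff), "<=", K ** (m + n) * len(A))
```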
Plünnecke's inequality
The version of this inequality that was originally proven by Plünnecke (1970) is slightly weaker: if $|A + B| \le K|A|$, then for all nonnegative integers $n$, $|nB| \le K^n |A|$.
Proof
Ruzsa triangle inequality
The Ruzsa triangle inequality is an important tool which is used to generalize Plünnecke's inequality to the Plünnecke–Ruzsa inequality. Its statement is: if $A$, $B$, and $C$ are finite subsets of a group, then $|A||B - C| \le |A - B||A - C|$.
Proof of Plünnecke-Ruzsa inequality
The following simple proof of the Plünnecke–Ruzsa inequality is due to Petridis (2014).
Lemma: Let $A$ and $B$ be finite subsets of an abelian group. If $X \subseteq A$ is a nonempty subset that minimizes the value of $K' = |X + B|/|X|$, then for all finite subsets $C$, $|X + B + C| \le K'|X + C|$.
Proof: This is demonstrated by induction on the size of $C$. For the base case of $|C| = 1$, note that $X + B + C$ is simply a translation of $X + B$ for any $c \in C$, so $|X + B + C| = |X + B| = K'|X| = K'|X + C|$.
For the inductive step, assume the inequality holds for all $C$ with $|C| \le s$ for some positive integer $s$. Let $C$ be a subset with $|C| = s + 1$, and let $C = C' \cup \{c\}$ for some $c \in C$. (In particular, the inequality holds for $C'$.) Finally, let $Z = \{x \in X : x + B + \{c\} \subseteq X + B + C'\}$. The definition of $Z$ implie
https://en.wikipedia.org/wiki/Ruzsa%20triangle%20inequality | In additive combinatorics, the Ruzsa triangle inequality, also known as the Ruzsa difference triangle inequality to differentiate it from some of its variants, bounds the size of the difference of two sets in terms of the sizes of both their differences with a third set. It was proven by Imre Ruzsa (1996), and is so named for its resemblance to the triangle inequality. It is an important lemma in the proof of the Plünnecke-Ruzsa inequality.
Statement
If $A$ and $B$ are subsets of a group, then the sumset notation $A + B$ is used to denote $\{a + b : a \in A,\, b \in B\}$. Similarly, $A - B$ denotes $\{a - b : a \in A,\, b \in B\}$. Then, the Ruzsa triangle inequality states the following: if $A$, $B$, and $C$ are finite subsets of a group, then $|A||B - C| \le |A - B||A - C|$.
An alternate formulation involves the notion of the Ruzsa distance.
Definition. If $A$ and $B$ are finite subsets of a group, then the Ruzsa distance between these two sets, denoted $d(A, B)$, is defined to be $d(A, B) = \log \frac{|A - B|}{\sqrt{|A||B|}}$.
Then, the Ruzsa triangle inequality has the following equivalent formulation: $d(B, C) \le d(A, B) + d(A, C)$.
This formulation resembles the triangle inequality for a metric space; however, the Ruzsa distance does not define a metric space since $d(A, A)$ is not always zero.
Proof
To prove the statement, it suffices to construct an injection from the set $A \times (B - C)$ to the set $(A - B) \times (A - C)$. Define a function $\gamma$ as follows. For each $d \in B - C$ choose a $b(d) \in B$ and a $c(d) \in C$ such that $d = b(d) - c(d)$. By the definition of $B - C$, this can always be done. Let $\gamma : A \times (B - C) \to (A - B) \times (A - C)$ be the function that sends $(a, d)$ to $(a - b(d), a - c(d))$. For every point $(x, y)$ in the image of $\gamma$, it must be the case that $y - x = b(d) - c(d) = d$, which determines $b(d)$ and $c(d)$, and hence $a = x + b(d)$. Hence, $\gamma$ maps every point in $A \times (B - C)$ to a distinct point in $(A - B) \times (A - C)$ and is thus an injection. In particular, there must be at least as many points in $(A - B) \times (A - C)$ as in $A \times (B - C)$. Therefore, $|A - B||A - C| \ge |A||B - C|$,
completing the proof.
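Since the inequality holds for all finite sets, it can be spot-checked by brute force over random integer sets (an illustrative script, not part of the proof):

```python
import random
from itertools import product

random.seed(0)

def diffset(A, B):
    """A - B = {a - b : a in A, b in B}."""
    return {a - b for a, b in product(A, B)}

# Randomized check of |A| * |B - C| <= |A - B| * |A - C| over subsets of Z.
for _ in range(100):
    A = set(random.sample(range(30), 8))
    B = set(random.sample(range(30), 8))
    C = set(random.sample(range(30), 8))
    assert len(A) * len(diffset(B, C)) <= len(diffset(A, B)) * len(diffset(A, C))
print("Ruzsa triangle inequality held on all random trials")
```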
Variants of the Ruzsa triangle inequality
The Ruzsa sum triangle inequality is a corollary of the Plünnecke–Ruzsa inequality (which is in turn proved using the ordinary Ruzsa triangle inequality). It states that if $A$, $B$, and $C$ are finite subsets of an abelian group, then $|A||B + C| \le |A + B||A + C|$.
Proof. The proof uses the following lemma from the proof of the Plünnecke-Ruzsa inequality.
Lemma. Let $A$ and $B$ be finite subsets of an abelian group. If $X \subseteq A$ is a nonempty subset that minimizes the value of $K' = |X + B|/|X|$, then for all finite subsets $C$, $|X + B + C| \le K'|X + C|$.
If is the empty s |
https://en.wikipedia.org/wiki/Equal-area%20projection | In cartography, an equivalent, authalic, or equal-area projection is a map projection that preserves relative area measure between any and all map regions. Equivalent projections are widely used for thematic maps showing scenario distribution such as population, farmland distribution, forested areas, and so forth, because an equal-area map does not change apparent density of the phenomenon being mapped.
By Gauss's Theorema Egregium, an equal-area projection cannot be conformal. This implies that an equal-area projection inevitably distorts shapes. Even though a point or points or a path or paths on a map might have no distortion, the greater the area of the region being mapped, the greater and more obvious the distortion of shapes inevitably becomes.
Description
In order for a map projection of the sphere to be equal-area, its generating formulae must meet this Cauchy–Riemann-like condition:
$\frac{\partial x}{\partial \lambda} \frac{\partial y}{\partial \varphi} - \frac{\partial x}{\partial \varphi} \frac{\partial y}{\partial \lambda} = s \cos \varphi$
where $s$ is constant throughout the map. Here, $\varphi$ represents latitude; $\lambda$ represents longitude; and $x$ and $y$ are the projected (planar) coordinates for a given $(\varphi, \lambda)$ coordinate pair.
For example, the sinusoidal projection is a very simple equal-area projection. Its generating formulæ are:
$x = R \lambda \cos \varphi$
$y = R \varphi$
where $R$ is the radius of the globe. Computing the partial derivatives,
$\frac{\partial x}{\partial \lambda} = R \cos \varphi, \quad \frac{\partial x}{\partial \varphi} = -R \lambda \sin \varphi, \quad \frac{\partial y}{\partial \lambda} = 0, \quad \frac{\partial y}{\partial \varphi} = R$
and so
$\frac{\partial x}{\partial \lambda} \frac{\partial y}{\partial \varphi} - \frac{\partial x}{\partial \varphi} \frac{\partial y}{\partial \lambda} = R \cos \varphi \cdot R - (-R \lambda \sin \varphi) \cdot 0 = R^2 \cos \varphi$
with $s$ taking the value of the constant $R^2$.
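The computation can be verified numerically with finite differences (an illustrative sketch; the radius value is arbitrary): at every test point the Jacobian of the sinusoidal projection equals $s \cos\varphi$ with $s = R^2$.

```python
import math

R = 6371.0  # sphere radius in km (any positive value works)

def project(lam, phi):
    """Sinusoidal projection: x = R * lambda * cos(phi), y = R * phi."""
    return R * lam * math.cos(phi), R * phi

def jacobian(lam, phi, h=1e-6):
    """dx/dlam * dy/dphi - dx/dphi * dy/dlam via central finite differences."""
    x1, y1 = project(lam + h, phi)
    x0, y0 = project(lam - h, phi)
    dx_dlam, dy_dlam = (x1 - x0) / (2 * h), (y1 - y0) / (2 * h)
    x1, y1 = project(lam, phi + h)
    x0, y0 = project(lam, phi - h)
    dx_dphi, dy_dphi = (x1 - x0) / (2 * h), (y1 - y0) / (2 * h)
    return dx_dlam * dy_dphi - dx_dphi * dy_dlam

# Equal-area condition: the Jacobian is s * cos(phi) with s = R^2 everywhere.
for lam, phi in [(0.3, 0.1), (-1.2, 0.8), (2.0, -0.5)]:
    s = jacobian(lam, phi) / math.cos(phi)
    assert abs(s - R * R) / (R * R) < 1e-4
print("constant s = R^2 confirmed at all test points")
```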
For an equal-area map of the ellipsoid, the corresponding differential condition that must be met is:
where $e$ is the eccentricity of the ellipsoid of revolution.
Statistical grid
The term "statistical grid" refers to a discrete grid (global or local) of an equal-area surface representation, used for data visualization, geocode and statistical spatial analysis.
List of equal-area projections
These are some projections that preserve area:
Azimuthal
Lambert azimuthal equal-area
Wiechel (pseudoazimuthal)
Conic
Albers
Lambert equal-area conic projection
Pseudoconical
Bonne
Bottomley
Werner
Cylindrical (with latitude of no distortion)
Lambert cylindrical equal-area (0 |
https://en.wikipedia.org/wiki/Hopper%20%28microarchitecture%29 | Hopper is a graphics processing unit (GPU) microarchitecture developed by Nvidia. It is designed for datacenters and was launched in parallel with the consumer-oriented Ada Lovelace microarchitecture.
Named for computer scientist and United States Navy rear admiral Grace Hopper, the Hopper architecture was leaked in November 2019 and officially revealed in March 2022. It improves upon its predecessors, the Turing and Ampere microarchitectures, featuring a new streaming multiprocessor and a faster memory subsystem.
Architecture
The Nvidia Hopper H100 GPU is implemented using the TSMC 4N process with 80 billion transistors. It consists of up to 144 streaming multiprocessors. In SXM5, the Nvidia Hopper H100 offers better performance than PCIe.
Streaming multiprocessor
The streaming multiprocessors for Hopper improve upon the Turing and Ampere microarchitectures, although the maximum number of concurrent warps per SM remains the same between the Ampere and Hopper architectures, 64. The Hopper architecture provides a Tensor Memory Accelerator (TMA), which supports bidirectional asynchronous memory transfer between shared memory and global memory. Under TMA, applications may transfer up to 5D tensors. When writing from shared memory to global memory, elementwise reduction and bitwise operators may be used, avoiding registers and SM instructions while enabling users to write warp-specialized code. TMA is exposed through cuda::memcpy_async.
When parallelizing applications, developers can use thread block clusters. Thread blocks may perform atomics in the shared memory of other thread blocks within its cluster, otherwise known as distributed shared memory. Distributed shared memory may be used by an SM simultaneously with L2 cache; when used to communicate data between SMs, this can utilize the combined bandwidth of distributed shared memory and L2. The maximum portable cluster size is 8, although the Nvidia Hopper H100 can support a cluster size of 16 by using the cudaFuncAttributeNonPortableClusterSizeAllowed function, pot |
https://en.wikipedia.org/wiki/Alon%E2%80%93Boppana%20bound | In spectral graph theory, the Alon–Boppana bound provides a lower bound on the second-largest eigenvalue of the adjacency matrix of a -regular graph, meaning a graph in which every vertex has degree . The reason for the interest in the second-largest eigenvalue is that the largest eigenvalue is guaranteed to be due to -regularity, with the all-ones vector being the associated eigenvector. The graphs that come close to meeting this bound are Ramanujan graphs, which are examples of the best possible expander graphs.
Its discoverers are Noga Alon and Ravi Boppana.
Theorem statement
Let $G$ be a $d$-regular graph on $n$ vertices with diameter $m$, and let $A$ be its adjacency matrix. Let $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_n$ be its eigenvalues. Then
$\lambda_2 \ge 2\sqrt{d-1} - \frac{2\sqrt{d-1} - 1}{\lfloor m/2 \rfloor}.$
The above statement is the original one proved by Noga Alon. Some slightly weaker variants exist to improve the ease of proof or improve intuition. Two of these are shown in the proofs below.
Intuition
The intuition for the number $2\sqrt{d-1}$ comes from considering the infinite $d$-regular tree. This graph is a universal cover of $d$-regular graphs, and it has spectral radius $2\sqrt{d-1}$.
Saturation
A graph that essentially saturates the Alon–Boppana bound is called a Ramanujan graph. More precisely, a Ramanujan graph is a $d$-regular graph such that every eigenvalue $\lambda_i$ with $|\lambda_i| < d$ satisfies $|\lambda_i| \le 2\sqrt{d-1}$.
A theorem by Friedman shows that, for every $\varepsilon > 0$ and every $d$, and for sufficiently large $n$, a random $d$-regular graph $G$ on $n$ vertices satisfies $\max\{|\lambda_i| : |\lambda_i| < d\} \le 2\sqrt{d-1} + \varepsilon$ with high probability. This means that a random $n$-vertex $d$-regular graph is typically "almost Ramanujan."
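As a concrete check (an illustrative script; the choice of example graph is ours, not the article's), the Petersen graph is 3-regular and Ramanujan: its second-largest adjacency eigenvalue is 1, safely below the Alon–Boppana barrier $2\sqrt{d-1} \approx 2.83$.

```python
import numpy as np

# Petersen graph: 3-regular on 10 vertices; outer 5-cycle, inner pentagram, spokes.
edges = [(i, (i + 1) % 5) for i in range(5)]              # outer cycle
edges += [(5 + i, 5 + (i + 2) % 5) for i in range(5)]     # inner pentagram
edges += [(i, 5 + i) for i in range(5)]                   # spokes
A = np.zeros((10, 10))
for u, v in edges:
    A[u, v] = A[v, u] = 1

d = 3
lams = np.sort(np.linalg.eigvalsh(A))[::-1]   # eigenvalues in descending order
print(np.round(lams, 6))                      # spectrum: 3, 1 (x5), -2 (x4)

assert abs(lams[0] - d) < 1e-9                # largest eigenvalue is d (regularity)
assert lams[1] <= 2 * np.sqrt(d - 1) + 1e-9   # Ramanujan: lambda_2 <= 2*sqrt(d-1)
```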
First proof (slightly weaker statement)
We will prove a slightly weaker statement, namely dropping the specificity on the second term and simply asserting $\lambda_2 \ge 2\sqrt{d-1} - o(1)$. Here, the $o(1)$ term refers to the asymptotic behavior as $n$ grows without bound while $d$ remains fixed.
Let the vertex set be $V$. By the min-max theorem, it suffices to construct a nonzero vector $z$ such that $z^{\mathsf{T}} \mathbf{1} = 0$ and $\frac{z^{\mathsf{T}} A z}{z^{\mathsf{T}} z} \ge 2\sqrt{d-1} - o(1)$.
Pick some value $r \in \mathbb{N}$. For each vertex $v$ in $V$, define a vector $x(v)$ as follows. Each component will be indexed by a vertex $u$ in the graph. For each $u$, if the distance between $u$ and $v$ is th
https://en.wikipedia.org/wiki/Append-only | Append-only is a property of computer data storage such that new data can be appended to the storage, but where existing data is immutable.
Access control
Many file systems' Access Control Lists implement an "append-only" permission:
chattr in Linux can be used to set the append-only flag to files and directories. This corresponds to the flag in .
NTFS ACL has a control for "Create Folders / Append Data", but it does not seem to keep data immutable.
Many cloud storage providers provide the ability to limit access as append-only. This feature is especially important to mitigate the risk of data loss for backup policies in the event that the computer being backed-up becomes infected with ransomware capable of deleting or encrypting the computer's backups.
Data structures
Many data structures and databases implement immutable objects, effectively making their data structures append-only. Implementing an append-only data structure has many benefits, such as ensuring data consistency, improving performance, and permitting rollbacks.
The prototypical append-only data structure is the log file. Log-structured data structures found in Log-structured file systems and databases work in a similar way: every change (transaction) that happens to the data is logged by the program, and on retrieval the program must combine the pieces of data found in this log file. Blockchains add cryptography to the logs so that every transaction is verifiable.
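The replay mechanism behind log-structured storage can be sketched as a toy in-memory key-value store (a simplified model; a real implementation appends to a file and compacts the log periodically):

```python
import json

class AppendOnlyLog:
    """A minimal log-structured key-value store: writes only ever append."""

    def __init__(self):
        self._log = []  # a real system would use a file opened in append mode

    def set(self, key, value):
        # Updates (and deletes) are new records; old records are never mutated.
        self._log.append(json.dumps({"key": key, "value": value}))

    def get(self, key):
        # Retrieval replays the log; the latest record for a key wins.
        result = None
        for line in self._log:
            rec = json.loads(line)
            if rec["key"] == key:
                result = rec["value"]
        return result

log = AppendOnlyLog()
log.set("a", 1)
log.set("a", 2)      # an "update" is just another appended record
print(log.get("a"))  # -> 2; both records remain in the log
print(len(log._log)) # -> 2
```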
Append-only data structures may also be mandated by the hardware or software environment:
All objects are immutable in purely functional programming languages, where every function is pure and global states do not exist.
Flash storage cells can only be written to once before erasing. Erasing on a flash drive works on the level of pages which cover many cells at once, so each page is treated as an append-only set of cells until it fills up.
Hard drives that use shingled magnetic recording cannot be written to randomly because |
https://en.wikipedia.org/wiki/Ted%20Janssen | Theo Willem Jan Marie Janssen (13 August 1936 – 29 September 2017), better known as Ted Janssen, was a Dutch physicist and Full Professor of Theoretical Physics at the Radboud University Nijmegen. Together with Pim de Wolff and Aloysio Janner, he was one of the founding fathers of the N-dimensional superspace approach in crystal structure analysis for the description of quasiperiodic crystals and modulated structures. For this work he received the Aminoff Prize of the Royal Swedish Academy of Sciences (together with de Wolff and Janner) in 1988 and the Ewald Prize of the International Union of Crystallography (with Janner) in 2014. These achievements were the fruit of his unique talent, combining a deep knowledge of physics with a rigorous mathematical approach. Their theoretical description of the structure and symmetry of incommensurate crystals using higher-dimensional superspace groups also included the quasicrystals that were discovered in 1982 by Dan Shechtman, who received the Nobel Prize in Chemistry in 2011. The Swedish Academy of Sciences explicitly mentioned their work on this occasion.
Early life and education
Ted Janssen was born on August 13, 1936, in Vught, near 's-Hertogenbosch in the Netherlands. Already as a young boy he was fascinated by the sciences. He built radios, set up a chemistry lab in the attic of his parental home, was an avid bird watcher and he built his own telescopes. He remembered high school as ‘not very inspiring’ and he passed all exams without much effort, but viewed it as a time that truly formed him. Instead of spending time on homework he studied the history and philosophy of science and was very interested in astronomy and astrophysics.
During his high school years he also developed a deep appreciation of literature and music. Later he added the visual arts, ballet, and architecture to that list. The enjoyment of the arts was vital to Ted. He called it essential components of life. He started playing the piano, harpsichord and c |
https://en.wikipedia.org/wiki/Sunny%20Cove%20%28microarchitecture%29 | Sunny Cove is a codename for a CPU microarchitecture developed by Intel, first released in September 2019. It succeeds the Palm Cove microarchitecture and is fabricated using Intel's 10 nm process node. The microarchitecture is implemented in 10th-generation Intel Core processors for mobile (codenamed Ice Lake) and third generation Xeon scalable server processors (codenamed Ice Lake-SP). 10th-generation Intel Core mobile processors were released in September 2019, while the Xeon server processors were released on April 6, 2021.
There are no desktop products featuring Sunny Cove. However, a variant named Cypress Cove is used for the 11th-generation Intel Core desktop processors (codenamed Rocket Lake). Cypress Cove is a version of the Sunny Cove microarchitecture backported to Intel's 14 nm process node.
The direct successor to the Sunny Cove microarchitecture is the Willow Cove microarchitecture, which powers the 11th-generation Intel Core mobile processors.
Features
Sunny Cove was designed by Intel Israel's processor design team in Haifa, Israel.
Intel released details of Ice Lake and its microarchitecture, Sunny Cove, during Intel Architecture Day in December 2018, stating that the Sunny Cove cores would be focusing on single-thread performance, new instructions, and scalability improvements. Intel stated that the performance improvements would be achieved by making the core "deeper, wider, and smarter".
Sunny Cove features a 50% increase in the size of L1 data cache, a larger L2 cache dependent on product size, larger μOP cache, and larger second-level TLB. The core has also increased in width, by increasing execution ports from eight to ten and by doubling the L1 store bandwidth. Allocation width has also increased from four to five. The 5-level paging scheme supports a linear address space up to 57 bits and a physical address space up to 52 bits, increasing the virtual memory space to 128 petabytes, up from 256 terabytes, and the addressable physical memo |
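The address-space figures can be checked by direct arithmetic (an illustrative script):

```python
# Virtual address bits with 5-level paging vs. the older 4-level scheme,
# and the Sunny Cove physical address width.
virt_5level, virt_4level, phys = 57, 48, 52

PiB = 2 ** 50
TiB = 2 ** 40

assert 2 ** virt_5level // PiB == 128   # 57-bit linear addresses -> 128 PiB
assert 2 ** virt_4level // TiB == 256   # 48-bit linear addresses -> 256 TiB
print(2 ** phys // PiB)                 # 52 physical address bits -> 4 PiB
```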
https://en.wikipedia.org/wiki/Multifactorial%20disease | Multifactorial diseases are not confined to any specific pattern of single gene inheritance and are likely to be caused when multiple genes come together along with the effects of environmental factors.
In fact, the terms ‘multifactorial’ and ‘polygenic’ are used as synonyms, and they are commonly used to describe the architecture of the disease-causing genetic component. Multifactorial diseases often cluster in families, yet they do not show any distinct pattern of inheritance. It is difficult to study and treat multifactorial diseases because the specific factors associated with these diseases have not yet been identified. Some common multifactorial disorders include schizophrenia, diabetes, asthma, depression, high blood pressure, Alzheimer's disease, obesity, epilepsy, heart disease, hypothyroidism, club foot, cancer, birth defects and even dandruff.
The multifactorial threshold model assumes that liability to multifactorial traits is normally distributed within populations. Firstly, different populations might have different thresholds. This is the case in which the occurrence of a particular disease differs between males and females (e.g. pyloric stenosis): the distribution of susceptibility is the same, but the threshold is different. Secondly, the threshold may be the same, but the distributions of susceptibility may be different. This explains the elevated risks present in first-degree relatives of affected individuals.
Characteristics
Multifactorial disorders exhibit a combination of distinct characteristics which are clearly differentiated from Mendelian inheritance.
The risk of multifactorial diseases may get increased due to environmental influences.
The disease is not sex-limited but it occurs more frequently in one gender than the other; females are more likely to have neural tube defects compared to males.
The disease occurs more commonly in a distinct ethnic group (i.e., Africans, Asians, Caucasians etc.)
The diseases may have more in common than g |
https://en.wikipedia.org/wiki/Personal%20Data%20Protection%20Bill%2C%202019 | The Personal Data Protection Bill 2019 (PDP Bill 2019) was a legislative proposal in the Parliament of India that was later withdrawn. The bill covers mechanisms for protection of personal data and proposes the setting up of a Data Protection Authority of India for the same. The 2019 Bill includes some key provisions that the 2018 draft Bill did not, such as a power for the central government to exempt any government agency from the Bill, and the Right to Be Forgotten.
Background and timeline
In July 2017, the Ministry of Electronics and Information Technology set up a committee to study issues related to data protection. The committee was chaired by retired Supreme Court judge Justice B. N. Srikrishna.
The committee submitted the draft Personal Data Protection Bill, 2018 in July 2018.
After further deliberations the Bill was approved by the cabinet ministry of India on 4 December 2019 as the Personal Data Protection Bill 2019 and tabled in the Lok Sabha on 11 December 2019.
As of March 2020 the Bill was being analyzed by a Joint Parliamentary Committee (JPC) in consultation with experts and stakeholders. The JPC, which was set up in December 2019, was headed by Meenakshi Lekhi, Member of Parliament. While the JPC was tasked with a short deadline to finalize the draft law before the Budget Session of 2020, it has sought more time to study the Bill and consult stakeholders.
The bill was withdrawn in August 2022.
Provisions
The Bill aims to:
It provided for extensive provisions around collection of consent, assessment of datasets, data flows and transfers of personal data, including to third countries and other aspects around anonymized and non-personal data.
Criticism
The revised 2019 Bill was criticized by Justice B. N. Srikrishna, the drafter of the original Bill, as having the ability to turn India into an “Orwellian State". In an interview with Economic Times, Srikrishna said that, "The government can at any time access private data |
https://en.wikipedia.org/wiki/Carleman%20linearization | In mathematics, Carleman linearization (or Carleman embedding) is a technique to transform a finite-dimensional nonlinear dynamical system into an infinite-dimensional linear system. It was introduced by the Swedish mathematician Torsten Carleman in 1932. Carleman linearization is related to the composition operator and has been widely used in the study of dynamical systems. It has also been used in many applied fields, such as control theory and quantum computing.
Procedure
Consider the following autonomous nonlinear system:
$\dot{x}(t) = f(x(t)) + \sum_{j} g_j(x(t))\, d_j(t)$
where $x(t)$ denotes the system state vector. Also, $f$ and the $g_j$'s are known analytic vector functions, and $d_j(t)$ is the $j$-th element of an unknown disturbance to the system.
At the desired nominal point, the nonlinear functions in the above system can be approximated by Taylor expansion
where $\partial^k f/\partial x^k \big|_{x_0}$ is the $k$-th partial derivative of $f$ with respect to $x$ at $x = x_0$ and $\otimes$ denotes the Kronecker product.
Without loss of generality, we assume that $x_0$ is at the origin.
Applying Taylor approximation to the system, we obtain
where and .
Consequently, the following linear system for higher orders of the original states is obtained:
where , and similarly .
Employing Kronecker product operator, the approximated system is presented in the following form
where , and and matrices are defined in (Hashemian and Armaou 2015).
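A minimal worked instance (an illustrative toy, not the general multivariate setup above): for the scalar system $\dot{x} = -x + x^2$, the Carleman variables $y_k = x^k$ satisfy $\dot{y}_k = k x^{k-1}\dot{x} = -k y_k + k y_{k+1}$, an infinite linear chain; truncating it at order $N$ already tracks the nonlinear solution closely.

```python
import math

# Carleman linearization of x' = -x + x^2 (toy example, chosen for this demo).
# With y_k = x^k, the chain rule gives y_k' = -k*y_k + k*y_(k+1);
# truncation at order N drops the single coupling term k*y_(N+1).
N = 8
x0, T, steps = 0.2, 2.0, 20000
dt = T / steps

y = [x0 ** k for k in range(1, N + 1)]   # initial moments y_k = x0^k
for _ in range(steps):                   # explicit Euler on the truncated system
    dy = [-k * y[k - 1] + (k * y[k] if k < N else 0.0) for k in range(1, N + 1)]
    y = [yk + dt * dyk for yk, dyk in zip(y, dy)]

# Exact solution of the Bernoulli equation x' = -x + x^2 for comparison.
exact = x0 * math.exp(-T) / (1 - x0 + x0 * math.exp(-T))
print(y[0], exact, abs(y[0] - exact))
```

The first Carleman variable y_1 approximates x(T); the error combines the Euler step size and the truncation of the chain at order N.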
See also
Carleman matrix
Composition operator
References
External links
A lecture about Carleman linearization by Igor Mezić
Dynamical systems
Functions and mappings
Functional analysis |
https://en.wikipedia.org/wiki/OneAPI%20%28compute%20acceleration%29 | oneAPI is an open standard, adopted by Intel, for a unified application programming interface (API) intended to be used across different computing accelerator (coprocessor) architectures, including GPUs, AI accelerators and field-programmable gate arrays. It is an open, cross-industry, standards-based, multi-architecture, multi-vendor programming model that aims to deliver a common developer experience across accelerator architectures. The oneAPI initiative encourages collaboration on the oneAPI specification and compatible oneAPI implementations across the ecosystem. It is intended to eliminate the need for developers to maintain separate code bases, multiple programming languages, tools, and workflows for each architecture.
The oneAPI specification
The oneAPI specification extends existing developer programming models to enable multiple hardware architectures through a data-parallel language, a set of library APIs, and a low-level hardware interface to support cross-architecture programming. It builds upon industry standards and provides an open, cross-platform developer stack.
Data Parallel C++
DPC++ is an open, cross-architecture language built upon the ISO C++ and Khronos Group SYCL standards. DPC++ is an implementation of SYCL with extensions that are proposed for inclusion in future revisions of the SYCL standard. An example of this is the contribution of unified shared memory, group algorithms, and sub-groups to SYCL 2020.
oneAPI libraries
The set of APIs spans several domains, including libraries for linear algebra, deep learning, machine learning, video processing, and others.
The source code of most implementations of the above libraries is available on GitHub.
The oneAPI documentation also lists the "Level Zero" API defining the low-level direct-to-metal interfaces and a set of ray tracing components with its own APIs.
Hardware abstraction layer
oneAPI Level Z |
https://en.wikipedia.org/wiki/Engineer%20Command%20%28Italy%29 | The Engineer Command () in Rome-Cecchignola commands the specialized engineer regiments of the Italian Army and it is tasked with training of all officers and troops destined for engineer units, as well as with both doctrinal and operational tasks.
The Engineer Command was established in 2010 and underwent a series of reorganizations, shifting from a Brigade-level command to a Division-level element. Nowadays, it keeps the traditions and the honours of the Arm of Engineers, and its commander is the Inspector of the Arm of the Engineers.
History
The Engineer Command of the Italian Army was established on 10 September 2010, but it traces its origins back to the Engineer Brigade (based in Udine) and the Engineer School in Rome.
Engineer School
The Pioneers Engineer School was established on 10 March 1950 in Rome. However, the School was heir to two further training institutes: the Central Engineer School and the Reserve Officers School of Engineers.
The Central Engineer School was established in Manziana as a result of the decree of 18 January 1920 (moved to Civitavecchia in 1925). The School had the task of training non-commissioned officers and troops in the various specialties of the Engineers: sappers, miners, cable operators, photoelectricists, telegraphers and radiotelegraphers. The School also held refresher courses for senior officers and captains about to be promoted, as well as training courses for reserve officers called back into service.
The Reserve Officers School of Engineers was created by decree of 1 May 1930 in Verona (moved to Pavia in 1936).
The Italian Civil War forced both schools to suspend their activities. In 1944 three training bodies were established:
Italian School of Radio-Telegraph Connections in Nocera Inferiore;
Telegraph School in Francavilla Fontana;
232nd Workers Battalion in Bracciano.
In 1948 the School reopened the courses for Reserve officer cadets. In January 1949, a Specialized Battalion an |
https://en.wikipedia.org/wiki/Timor%20Telecom | Timor Telecom, S.A. (TT) is an East Timorese telecommunications company, based in the national capital Dili.
The company originally had a state monopoly on telecommunications in East Timor. The monopoly was lifted by the government in 2010 in response to overwhelming public opinion in favour of liberalisation.
Shareholdings
, the largest shareholder of the company (54.01%) was Telecomunicações Públicas de Timor, S.A. (TPT), which was controlled by Investel Communications, a Brazilian company owned by Timorese businessman , with partners and capital from the Middle East and China.
The shareholders of TPT were Investel (76%), the Harii Foundation – Sociedade para o Desenvolvimento de Timor-Leste (linked to the Roman Catholic Diocese of Baucau) (18%), and Fundação Oriente (6%).
Investel held a further 3.05% of TT via another company, PT Participações SGPS, S.A. The remaining shareholders in TT were the State of Timor-Leste (20.59%), VDT Holding Limited, a Macau-based company (17.86%) and East Timorese businessman Júlio Alfaro (4.49%).
History
In September 1999, the telecommunications infrastructure in East Timor was destroyed during the crisis following the East Timorese independence referendum. In 2001, the United Nations Transitional Administration in East Timor (UNTAET) launched an international tender for the construction of a replacement telecommunications system. The new network was to be operated according to a concession agreement as a BOT (Build–operate–transfer) arrangement. In July 2002, the Timor Telecom consortium (led by Portugal Telecom) was awarded the tender.
On 17 October 2002, the Timor Telecom consortium was transformed into Timor Telecom, S.A., the first corporation to be formed in the newly independent East Timor. Under the concession agreement, TT was granted a monopoly on telecommunications in East Timor for a term of 15 years.
By 1 March 2003, the company had created East Timor's first national telecommunications network, and set up its |
https://en.wikipedia.org/wiki/Proth%20prime | A Proth number is a natural number N of the form N = k × 2^n + 1, where k and n are positive integers, k is odd and k < 2^n. A Proth prime is a Proth number that is prime. They are named after the French mathematician François Proth. The first few Proth primes are
3, 5, 13, 17, 41, 97, 113, 193, 241, 257, 353, 449, 577, 641, 673, 769, 929, 1153, 1217, 1409, 1601, 2113, 2689, 2753, 3137, 3329, 3457, 4481, 4993, 6529, 7297, 7681, 7937, 9473, 9601, 9857 ().
It is still an open question whether an infinite number of Proth primes exist. It was shown in 2022 that the reciprocal sum of Proth primes converges to a real number near 0.747392479, substantially less than the value of 1.093322456 for the reciprocal sum of Proth numbers.
The primality of Proth numbers can be tested more easily than many other numbers of similar magnitude.
Definition
A Proth number takes the form N = k × 2^n + 1, where k and n are positive integers, k is odd and k < 2^n. A Proth prime is a Proth number that is prime. Without the condition that k < 2^n, all odd integers larger than 1 would be Proth numbers.
Primality testing
The primality of a Proth number N can be tested with Proth's theorem, which states that a Proth number is prime if and only if there exists an integer a for which a^((N − 1)/2) ≡ −1 (mod N).
This theorem can be used as a probabilistic test of primality, by checking for many random choices of a whether a^((N − 1)/2) ≡ −1 (mod N). If this fails to hold for several random a, then it is very likely that the number is composite.
This test is a Las Vegas algorithm: it never returns a false positive but can return a false negative; in other words, it never reports a composite number as "probably prime" but can report a prime number as "possibly composite".
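The probabilistic use of Proth's theorem can be sketched in a few lines; this is a minimal sketch (function names and the round count are illustrative):

```python
import random

def is_proth_number(N):
    """Check that N = k * 2**n + 1 with k odd, n >= 1 and 2**n > k."""
    if N < 3 or N % 2 == 0:
        return False
    m = N - 1
    n = (m & -m).bit_length() - 1     # largest power of two dividing N - 1
    k = m >> n
    return n >= 1 and 2 ** n > k

def proth_test(N, rounds=64):
    """Proth's theorem as a Las Vegas test: True means N is certainly prime,
    False means N is very probably composite."""
    if not is_proth_number(N):
        raise ValueError("not a Proth number")
    for _ in range(rounds):
        a = random.randrange(2, N - 1)
        if pow(a, (N - 1) // 2, N) == N - 1:   # a**((N-1)/2) ≡ -1 (mod N)
            return True                         # a witness proves primality
    return False
```

For prime N, about half of all choices of a satisfy the congruence, so the chance that 64 random rounds all miss is about 2⁻⁶⁴: a True answer is a proof, and a False answer is overwhelmingly reliable.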
In 2008, Sze created a deterministic algorithm that runs in at most time, where Õ is the soft-O notation. For typical searches for Proth primes, usually k is either fixed (e.g. 321 Prime Search or Sierpinski Problem) or of order (e.g. Cullen prime search). In these cases the algorithm runs in at most , or time for all . There is
https://en.wikipedia.org/wiki/GUIDE-Seq | GUIDE-Seq (Genome-wide, Unbiased Identification of DSBs Enabled by Sequencing) is a molecular biology technique that allows for the unbiased in vitro detection of off-target genome editing events in DNA caused by CRISPR/Cas9 as well as other RNA-guided nucleases in living cells. Similar to LAM-PCR, it employs multiple PCRs to amplify regions of interest that contain a specific insert that preferentially integrates into double-stranded breaks. As gene therapy is an emerging field, GUIDE-Seq has gained traction as a cheap method to detect the off-target effects of potential therapeutics without needing whole genome sequencing.
Principles
Conceived to work in concert with next-gen sequencing platforms such as Illumina dye sequencing, GUIDE-Seq relies on the integration of a blunt, double-stranded oligodeoxynucleotide (dsODN) that has been phosphorothioate-modified on two of the phosphate linkages on the 5' end of both strands. The dsODN cassette integrates into any site in the genome that contains a double-stranded break (DSB). This means that along with the target and off-target sites that may exist as a result of the activity of a nuclease, the dsODN cassette will also integrate into any spurious sites in the genome that have a DSB. This makes it critical to have a dsODN-only condition that controls for errant and naturally occurring DSBs, and is required to use the GUIDE-seq bioinformatic pipeline.
After integration of the dsODN cassette, genomic DNA (gDNA) is extracted from the cell culture and sheared to 500bp fragments via sonication. The resulting sheared gDNA undergoes end-repair and adapter ligation. From here, DNA specifically containing the dsODN insert is amplified via two rounds of polymerase chain reaction (PCR) that proceeds in a unidirectional manner starting from the primers that are complementary to the dsODN. This process allows for the reading of the adjacent sequences, both the sense and anti-sense strands, flanking the insert. The final product is a pano |
https://en.wikipedia.org/wiki/DeGoogle | The DeGoogle movement (also called the de-Google movement) is a grassroots campaign that has spawned as privacy advocates urge users to stop using Google products entirely due to growing privacy concerns regarding the company. The term refers to the act of removing Google from one's life. As the growing market share of the internet giant creates monopolistic power for the company in digital spaces, increasing numbers of journalists have noted the difficulty to find alternatives to the company's products. Some projects, such as ungoogled-chromium, primarily distinguish themselves from Google-maintained products by their lessened dependence on the company's infrastructure. It can be seen as part of a broader opposition to big tech companies, sometimes referred to as "techlash."
History
In 2008, Len Hinman began making the move away from Google tools, "in the interests of privacy", and blogged about his experience. In 2010, publisher Jack Yan used the term as he removed himself from Google's services, citing privacy concerns. Five days later, Kirk McElhearn wrote a piece about "dropping Google" in Macworld, citing privacy, deletions of Blogger blogs, and censorship. In 2013, John Koetsier of VentureBeat said Amazon's Kindle Fire Android-based tablet was "a de-Google-ized version of Android." In 2014, John Simpson of US News wrote about the "right to be forgotten" by Google and other search engines. In 2015, Derek Scally of The Irish Times wrote an article on how to "De-Google your life." In 2016, Kris Carlon of Android Authority suggested that users of CyanogenMod 14 could "de-Google" their phones, because CyanogenMod works without Google apps. In 2018, Nick Lucchesi of Inverse wrote about how ProtonMail was promoting ways to "completely de-Google-fy your life." Lifehacker's Brendan Hesse wrote a detailed tutorial on "quitting Google." Gizmodo journalist Kashmir Hill claims that she missed meetings and had difficulties organizing meet ups without the us
https://en.wikipedia.org/wiki/Novalnet | Novalnet is a payment service provider and a European payment institute that provides e-commerce businesses with electronic and point-of-sale payment processing services. The platform is designed to automate merchants' business processes across the e-commerce value chain, from checkout to debt collection.
Competitors include, for example, Adyen and Stripe.
Payment platform
The Novalnet SaaS platform provides merchants with a solution for integrating payment services into pre-built or self-developed systems, such as online shops, marketplaces, content management systems (CMS), customer relationship management (CRM), enterprise resource planning (ERP) and inventory management (WAWI) software. Business model-based integration via API (application programming interface), iFrame, SDKs (software development kits) and WebView establishes a real-time data flow between systems for individual payment processing, fraud prevention and other payment-related services.
History
Novalnet was founded in 2007 by Gabriel Dixon. The global company network employs 200 people in four different countries:
Novalnet AG (Headquarters) in Germany,
Novalnet Ltd. in the United Kingdom,
Novalnet Payment Corp. in the United States,
and Novalnet E-Solutions Pvt. Ltd in India.
The company is independent of external investors and is led by Gabriel Dixon (CEO), who is also the chairman of the board.
Growth
The company has been profitable since 2008. Between 2011 and 2016, Novalnet received the "Usage Award" at the European MPE Awards, and was one of the nominees for the Bavarian SME Award hosted by the European Business Forum. In 2011, Novalnet came first as the "Best Payment-Service Provider 2011" at the t3n Magazin Web Awards.
In 2017, Novalnet was awarded Deloitte's "Technology Fast 50" prize for high growth in turnover over the previous four years.
In 2018, Novalnet received the "Best Payment Service Provider" prize at the E-Commerce Berlin Awards.
Novalnet AG's revenue grew by 37.82% in the first q |
https://en.wikipedia.org/wiki/IBM%20drum%20storage | In addition to the drums used as main memory by IBM, e.g., IBM 305, IBM 650, IBM offered drum devices as secondary storage for the 700/7000 series and System/360 series of computers.
IBM 731
The IBM 731 is a discontinued storage unit used on the IBM 701. It has a storage capacity of 2048 36-bit words (9,216 8-bit bytes).
IBM 732
The IBM 732 is a discontinued storage unit used on the IBM 702. It has a storage capacity of 60,000 6-bit characters (45,000 8-bit bytes).
IBM 733
The IBM 733 is a discontinued storage unit used on the IBM 704 and IBM 709. It has a storage capacity of 8192 36-bit words (36,864 8-bit bytes).
IBM 734
The IBM 734 is a discontinued storage unit used on the IBM 705. It has a storage capacity of 60,000 6-bit characters (45,000 8-bit bytes).
IBM 7320
The IBM 7320 is a discontinued storage unit announced by IBM on December 10, 1962 for the IBM 7090 and 7094 computer systems; it was retained for the earliest System/360 systems as a count key data device and was discontinued in 1965. The 7320 is a vertically mounted head-per-track device with 449 tracks: 400 data tracks, 40 alternate tracks, and 9 clock/format tracks. The rotational speed is 3490 rpm, so the average rotational delay is 8.6 milliseconds.
Attachment to a 709x system is through an IBM 7909 Data Channel and an IBM 7631 File Control unit, which can attach up to five random-access storage units, a mix of 7320 and 1301 DASD. One or two 7631 controllers can attach to a computer system, but the system can still attach only a total of five DASD. When used with a 709x, a track holds 2796 six-bit characters, and a 7320 unit holds 1,118,400 characters. Data transfer rate is 202,800 characters per second.
The 7320 attaches to a System/360 through a channel and a 2841 Storage Control unit. Each 2841 can attach up to eight 7320 devices. When used with System/360, a track holds 2081 eight-bit bytes, and a 7320 unit holds 878,000 bytes. Data transfer rate is 135,000 bytes per second |
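The quoted rotational-delay and 709x-capacity figures can be checked with simple arithmetic:

```python
# Back-of-the-envelope check of the 7320 figures (709x attachment).
RPM = 3490
rev_ms = 60_000 / RPM                # one revolution ≈ 17.2 ms
avg_delay_ms = rev_ms / 2            # average rotational delay ≈ 8.6 ms

DATA_TRACKS = 400                    # data tracks (excluding alternates)
CHARS_PER_TRACK = 2796               # six-bit characters per track on a 709x
capacity = DATA_TRACKS * CHARS_PER_TRACK   # 1,118,400 characters
```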
https://en.wikipedia.org/wiki/Lionel%20Briand | Lionel Claude Briand, born in Paris, France, on November 21, 1965, is a software engineer, and professor at the University of Ottawa and University of Luxembourg. He is an IEEE Fellow, a Canada Research Chair in Intelligent Software Dependability and Compliance and a European Research Council Advanced grantee. His research foci are testing, verification, and validation of software systems; applying machine learning and evolutionary computation to software engineering; and software quality assurance, among others. He was vice-director of the University of Luxembourg's SnT - Interdisciplinary Centre for Security, Reliability and Trust from 2014 to 2019, and editor in chief of Empirical Software Engineering (Springer) from 2003 to 2016.
In 2012, he was the recipient of the Harlan D. Mills Award.
In 2022, he was the recipient of the ACM SIGSOFT Outstanding Research Award.
Selected research
Arcuri, Andrea, and Lionel Briand. "A practical guide for using statistical tests to assess randomized algorithms in software engineering." 2011 33rd International Conference on Software Engineering (ICSE). IEEE, 2011.
Andrews, James H., Lionel C. Briand, and Yvan Labiche. "Is mutation an appropriate tool for testing experiments?." Proceedings of the 27th international conference on Software engineering. ACM, 2005.
Briand, Lionel C., John W. Daly, and Jurgen K. Wust. "A unified framework for coupling measurement in object-oriented systems." IEEE Transactions on Software Engineering 25.1 (1999): 91–121.
Basili, Victor R., Lionel C. Briand, and Walcélio L. Melo. "A validation of object-oriented design metrics as quality indicators." IEEE Transactions on Software Engineering 22.10 (1996): 751–761.
References
External links
Software engineers
Academic staff of the University of Ottawa
University of Luxembourg
Living people
1965 births
Software testing people |
https://en.wikipedia.org/wiki/Richard%20M.%20Murray | Richard M. Murray is a synthetic biologist and Thomas E. and Doris Everhart Professor of Control & Dynamical Systems and Bioengineering at Caltech, California.
He was elected to the National Academy of Engineering in 2013 for "contributions in control theory and networked control systems with applications to aerospace engineering, robotics, and autonomy".
Murray is a co-author of several textbooks on feedback and control systems, and helped to develop the Python Control Systems Library to provide operations for use in feedback control systems. He was a founding member of the Department of Defense's Defense Innovation Advisory Board as of 2016.
Education
Murray received a BS in electrical engineering from the California Institute of Technology (Caltech) in 1985. He received an MS (1988) and a PhD (1990) from the University of California, Berkeley.
Career
Murray joined Caltech in 1991 as an assistant professor of mechanical engineering. He became an associate professor in 1997, a professor in 2000, and the Everhart Professor of Control and Dynamical Systems in 2006. He was named the Everhart Professor of Control and Dynamical Systems and Bioengineering in 2009. He has served as Chair of the Division of Engineering and Applied Science (2000–2005) and Director of Information Science and Technology (2006–2009).
Research
Murray is a pioneer in the fields of biological engineering, synthetic biology, and control theory, including feedback in networked control systems, biomolecular feedback, engineered biological circuits, and novel architectures.
Murray is a founder and steering group member of the Build-a-Cell Initiative, an international collaboration investigating creation of synthetic live cells.
He is a co-founder of Tierra Biosciences, for cell-free synthetic biology.
Books
Awards and honors
2019, John R. Ragazzini Education Award
2017, IEEE Control Systems Award, Institute of Electrical and Electronics Engineers
2013, National Academy of Engineering for "contrib |
https://en.wikipedia.org/wiki/Queering%20the%20Map | Queering the Map is a community-based online collaborative and counter-mapping platform on which users submit their personal queer experiences to specific locations on a single collective map. Since its inception, users have contributed more than 500,000 posts in 23 languages to the platform.
History
In 2017, Canadian artist and designer Lucas LaRochelle began working on Queering the Map for a class project at Concordia University in Montreal. The project was launched in May of the same year. LaRochelle has cited the lasting impact of personal memories on their perceptions towards places and Sara Ahmed's ideas on queerness as an orientation towards space as influences behind the project. For LaRochelle, a queer space can be a relational experience created by and/or shared between queer people. LaRochelle has stated that their main intent for initiating the project was to archive these spaces, which transcend the traditional notion of queer spaces as fixed places (like businesses or neighborhoods) that are reclaimed by clearly defined communities.
In February 2018, Montreal DJ Frankie Teardrop shared Queering the Map on Facebook, greatly increasing its visibility. During this month, the number of pins on the map increased from 600 to 6,500 within a three-day span. The same month, a cyberattack generating pins with comments in support of U.S. president Donald Trump forced LaRochelle to take down the site and ask for help on its URL. Over the next two months, 8 volunteers developed a more secure version of the site on GitHub, and the project qualified for Cloudflare's free Project Galileo cybersecurity service. Notably, a moderation system was developed for the platform through this process. In April 2018, Queering the Map was relaunched.
In 2019, LaRochelle began developing QT.bot, an artificial neural network trained to generate hypothetical queer stories using the data submitted to Queering the Map.
Reception
Queering the Map has received press coverage throug |
https://en.wikipedia.org/wiki/Stephanie%20Dinkins | Stephanie Dinkins (born 1964) is a transdisciplinary American artist based in Brooklyn, New York. She is known for creating art about artificial intelligence (AI) as it intersects race, gender, and history.
Her aim is to "create a unique culturally attuned AI entity in collaboration with coders, engineers and in close consultation with local communities of color that reflects and is empowered to work toward the goals of its community."
Dinkins is best known for her projects Conversations with Bina48, a series of conversations between Dinkins and BINA48, the first social, artificially intelligent humanoid robot, which resembles a Black woman, and Not the Only One, a multigenerational artificially intelligent memoir trained on three generations of Dinkins's family.
Early life and education
Dinkins was born in Perth Amboy, New Jersey to Black American parents who raised her in Staten Island, New York. She credits her grandmother with teaching her how to think about art as a social practice, saying "my grandmother . . . was a gardener and the garden was her art . . . that was a community practice."
Dinkins attended the International Center of Photography School in 1995, where she completed the general studies in photography certificate program. Dinkins received an MFA in photography from the Maryland Institute College of Art in 1997. She completed the Independent Study Program at the Whitney Museum of American Art in 1998.
Career
Dinkins is an associate professor in the art department at Stony Brook University.
Activism
Dinkins advocates for co-creation within a social practice art framework, so that vulnerable communities understand how to use technology to their advantage, instead of being subjected to their use. This is exemplified in her works such as Project al-Khwarzmi, a series of workshops entitled PAK POP-UP at the nonprofit community center Recess in Brooklyn, NY. The workshops involved collaborating with youth in the criminal justice system and uplift |
https://en.wikipedia.org/wiki/Andrea%20Morello | Andrea Morello (born 26 June 1972, in Pinerolo, Italy) is the Scientia Professor of quantum engineering in the School of Electrical Engineering and Telecommunications at the University of New South Wales, and a Program Manager at the ARC Centre of Excellence for Quantum Computation and Communication Technology (CQC2T). Morello is the head of the Fundamental Quantum Technologies Laboratory at UNSW.
Education
Morello completed his undergraduate degree in electrical engineering at the Politecnico di Torino in Italy in 1998. His research career began at the Grenoble High Magnetic Field Laboratory, where he investigated the magnetic phase diagram of high-temperature superconductors. He obtained his PhD in experimental physics from the Kamerlingh Onnes Laboratory in Leiden in 2004, during which he explored the quantum dynamics of molecular nanomagnets at low temperatures. Morello spent two years at the University of British Columbia before joining UNSW Sydney in 2006.
Research
Morello's research is primarily focused on designing and building the basic components of a quantum computer using the spins of single atoms in silicon. His team were the first in the world to demonstrate the coherent control and readout of the electron and nuclear spin of an individual phosphorus atom in silicon, and for many years they held the record for the longest quantum memory time for a single qubit in the solid state (35.6 seconds). Morello's research also focuses on using highly coherent spin systems to study the foundations of quantum mechanics.
Outreach
Outside of his research Morello is actively engaged in science outreach and education. He has produced a series of YouTube videos 'The Quantum Around You' and 'Quantum Computing Concepts' to bring the fundamental concepts of quantum physics to a wider audience. Morello also starred in a series of videos produced by YouTuber Derek Muller on his channel Veritasium, explaining the fundamental concepts of quantum computing, with the highest viewed |
https://en.wikipedia.org/wiki/KLJN%20Secure%20Key%20Exchange | Random-resistor-random-temperature Kirchhoff-law-Johnson-noise key exchange, also known as RRRT-KLJN or simply KLJN, is an approach for distributing cryptographic keys between two parties that claims to offer unconditional security. This claim, which has been contested, is significant, as the only other key exchange approach claiming to offer unconditional security is Quantum key distribution.
The KLJN secure key exchange scheme was proposed in 2005 by Laszlo Kish and Granqvist. It has the advantage over quantum key distribution in that it can be performed over a metallic wire with just four resistors, two noise generators, and four voltage measuring devices: equipment that is low-priced and can be readily manufactured. It has the disadvantage that several attacks against KLJN have been identified which must be defended against.
"Given that the amount of effort and funding that goes into Quantum Cryptography is substantial (some even mock it as a distraction from the ultimate prize which is quantum computing), it seems to me that the fact that classic thermodynamic resources allow for similar inherent security should give one pause," wrote Henning Dekant, the founder of the Quantum Computing Meetup, in April 2013.
The Cybersecurity Curricula 2017, a joint project of the Association for Computing Machinery, the IEEE Computer Society, the Association for Information Systems, and the International Federation for Information Processing Technical Committee on Information Security Education (IFIP WG 11.8) recommends teaching the KLJN Scheme as part of teaching "Advanced concepts" in its knowledge unit on cryptography.
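The resistor-based exchange can be caricatured in a few lines of code. The following is a toy logical model only, with no noise physics or attack model; the resistor values and the bit convention are this sketch's assumptions:

```python
import random

R_L, R_H = 1.0, 10.0   # illustrative low/high resistor values (arbitrary units)

def kljn_round(rng):
    """One exchange round of a toy KLJN model."""
    a = rng.choice([R_L, R_H])   # Alice's randomly chosen resistor
    b = rng.choice([R_L, R_H])   # Bob's randomly chosen resistor
    loop = a + b                 # what an eavesdropper can infer from the
                                 # measured Johnson-noise level on the wire
    if loop == R_L + R_H:        # mixed case: LH or HL
        # Each party knows its own resistor, so each can deduce the other's;
        # the eavesdropper cannot tell LH from HL, so a secret bit is shared.
        return 1 if a == R_H else 0
    return None                  # LL or HH: publicly distinguishable, discarded

rng = random.Random(42)
bits = [b for b in (kljn_round(rng) for _ in range(1000)) if b is not None]
```

About half the rounds fall in the mixed case and yield a key bit; the rest are discarded.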
See also / Further reading
http://www.scholarpedia.org/article/Secure_communications_using_the_KLJN_scheme
http://noise.ece.tamu.edu/research_files/research_secure.htm
Science: Simple Noise May Stymie Spies without Quantum Weirdness, Adrian Cho, September 30, 2005. http://noise.ece.tamu.edu/news_files/science_secure.pdf
References
Cryptography |
https://en.wikipedia.org/wiki/Impulse%20vector | An impulse vector is a mathematical tool to graphically design and analyze input shapers that could suppress residual vibration. The impulse vector can be applied for both undamped and underdamped systems, and for both positive and negative impulses in a unified way. The impulse vector makes it easy to obtain impulse time and magnitude of the input shaper graphically.
A vector concept for an input shaper was first introduced by W. Singhose for undamped systems with positive impulses, and an impulse vector was first introduced by C.-G. Kang to generalize Singhose's idea to underdamped systems with positive and negative impulses.
Definition
For a vibratory second-order system with undamped natural frequency ω_n and damping ratio ζ, the magnitude I and angle θ of an impulse vector corresponding to an impulse function A δ(t − t₀) are defined in a 2-dimensional polar coordinate system as
I = |A| e^(ζ ω_n t₀), θ = ω_d t₀,
where A implies the magnitude of the impulse function, t₀ implies the time location of the impulse function, and ω_d implies the damped natural frequency ω_n √(1 − ζ²). For a positive impulse function with A > 0, the initial point of the impulse vector is located at the origin of the polar coordinate system, while for a negative impulse function with A < 0, the terminal point of the impulse vector is located at the origin. □
In this definition, the magnitude I is the product of |A| and a scaling factor e^(ζ ω_n t₀) for damping during the time interval t₀, which represents the magnitude before being damped; the angle θ is the product of the impulse time t₀ and the damped natural frequency ω_d. δ(t − t₀) represents the Dirac delta function with impulse time at t = t₀.
Note that an impulse function is a purely mathematical quantity, while the impulse vector includes a physical quantity (that is, and of a second-order system) as well as a mathematical impulse function. Representing more than two impulse vectors in the same polar coordinate system makes an impulse vector diagram. The impulse vector diagram is a graphical representation of an impulse sequence.
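As a small illustration, the following sketch computes the magnitude and angle of an impulse vector, assuming the convention from the input-shaping literature that the magnitude is |A|·e^(ζ·ω_n·t₀) and the angle is ω_d·t₀ (symbol choices here are this sketch's assumption):

```python
import math

def impulse_vector(A, t0, wn, zeta):
    """Magnitude and angle (radians) of the impulse vector for A*delta(t - t0),
    for a second-order system with natural frequency wn and damping ratio zeta."""
    wd = wn * math.sqrt(1 - zeta ** 2)        # damped natural frequency
    magnitude = abs(A) * math.exp(zeta * wn * t0)
    angle = wd * t0
    return magnitude, angle
```

For an undamped system (ζ = 0) the magnitude reduces to |A| and the angle to ω_n·t₀, recovering the original undamped vector diagram.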
Consider two impu |
https://en.wikipedia.org/wiki/Dolby%20Voice | Dolby Voice is an audio communication technology developed by Dolby Laboratories since at least 2012.
This solution is aimed at improving audio quality in virtual environments such as enterprise-level videoconferencing.
It is implemented using commercially available hardware and/or software and uses the proprietary Dolby Voice Codec (DVC) audio codec.
Features
This technology was created to improve audio quality, compared with other similar technologies, through various audio processing features:
dynamic audio leveling, which focuses on the human voice and equalizes the audio level of each participant to ease listening
spatialization of audio, which improves voice clarity and reduces fatigue by preventing speech from overlapping when multiple participants are talking at the same time
noise reduction, which limits unwanted background sounds in noisy environments
echo reduction, which limits audio reinjection when input and output devices are placed close together
while, at the same time, keeping bandwidth usage low and maintaining network resilience through the use of heavy compression.
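Dynamic audio leveling of the kind described above can be sketched as a simple automatic gain control. This is a generic illustration, not Dolby's proprietary algorithm; the frame size and parameters are arbitrary:

```python
import numpy as np

def agc(frames, target_rms=0.1, smoothing=0.9):
    """Toy automatic gain control: smoothly drive each frame toward a target RMS."""
    gain = 1.0
    out = []
    for frame in frames:
        rms = np.sqrt(np.mean(frame ** 2)) + 1e-12   # current loudness
        # Move the gain a little toward the value that would hit the target.
        gain = smoothing * gain + (1 - smoothing) * (target_rms / rms)
        out.append(frame * gain)
    return out

# A too-quiet speaker is gradually brought up to the target level.
quiet = [np.full(160, 0.01) for _ in range(50)]
levelled = agc(quiet)
```

The smoothing term is what makes the leveling "dynamic" rather than an abrupt per-frame normalization, avoiding audible gain pumping.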
Products
Dolby.io platform
Dolby Conference Phone
Dolby Voice Room
BlueJeans application
Laptops
References
External links
Description from Dolby website
Description from Dolby.io website
Description from BlueJeans website
Audio codecs
Digital audio
Dolby Laboratories
Web conferencing |
https://en.wikipedia.org/wiki/List%20of%20Tesla%20factories | Tesla, Inc. operates plants worldwide for the manufacture of their products, including electric vehicles, lithium-ion batteries, solar shingles, chargers, automobile parts, and manufacturing equipment and tools for its own factories. The following is a list of current, future and former facilities.
Current production facilities
Future production facilities
{| class="wikitable sortable"
! Name
! City
! Country
! Products
! Expected to Open
! Employees
! Floor Area
! VIN Code
! Comments
|-
|Hayward facility
|Hayward, California
|
|Automotive parts
|2023
|
|—
|
|-
|Fremont Battery Factory
|Fremont, California
|
|Lithium-ion batteries
|2023
|
|
|—
|Supports Kato Road 4680 battery "pilot factory."
|-
| Tesla Lithium Refinery||Robstown, Texas
||||| 2024 || 250
| || —|| Estimated annual material capacity for 50 GWh of batteries.
|-
| Tesla Shanghai Megafactory|| Shanghai
||| Megapack || 2024 || || || —|| Estimated annual production of 10,000 Megapacks.
|-
| Gigafactory Mexico||Monterrey, Nuevo Leon
|||Next-gen vehicle || 2026 || 10,000
| || —||
|}
Former production facilities
Note: Maxwell Technologies was acquired by Tesla in 2019 for their battery technology. Maxwell continued to operate as subsidiary until 2021. Due to the short holding time and no known products produced under Tesla, their production facilities are not listed above.
Notes
References
External links
Manufacturing official website
"Here's Where Tesla Produces Its Electric Cars Around the World", Newsweek, 2021-08-03
Tesla
Tesla, Inc.
Battery manufacturers
Tesla, Inc.-related lists |
https://en.wikipedia.org/wiki/Novak%E2%80%93Tyson%20model | The Novak–Tyson Model is a non-linear dynamics framework developed in the context of cell-cycle control by Bela Novak and John J. Tyson. It is a prevalent theoretical model that describes a hysteretic, bistable bifurcation of which many biological systems have been shown to express.
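The hysteretic, bistable behaviour referred to above can be illustrated with a generic one-variable toy system, dx/dt = −x³ + x + s (not the Novak–Tyson cell-cycle equations themselves), which has two stable steady states for |s| < 2/(3√3) ≈ 0.385:

```python
import numpy as np

def settle(s, x0, steps=5000, dt=0.01):
    """Relax dx/dt = -x**3 + x + s to a stable steady state, starting from x0."""
    x = x0
    for _ in range(steps):
        x += dt * (-x ** 3 + x + s)
    return x

s_vals = np.linspace(-0.6, 0.6, 25)

# Sweep the control parameter s upward, then back down, each time starting
# from the previously settled state (as a slow experimental ramp would).
x, up = settle(-0.6, -1.0), []
for s in s_vals:
    x = settle(s, x)
    up.append(x)

down = []
for s in s_vals[::-1]:
    x = settle(s, x)
    down.append(x)
down.reverse()

# Inside the bistable window the two sweeps settle on different branches:
# the state remembers its history, which is the hallmark of hysteresis.
```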
Historical background
Bela Novak and John Tyson were at the Department of Biology at the Virginia Polytechnic Institute and State University in Blacksburg, Virginia, when the model was first published in the Journal of Cell Science in 1993.
In 1990, two key papers were published that identified and characterized important dynamic relationships between cyclin and MPF in interphase-arrested frog egg extracts. The first was Solomon's 1990 Cell paper, titled "Cyclin activation of p34cdc2", and the second was Felix's 1990 Nature paper, titled "Triggering of cyclin degradation in interphase extracts of amphibian eggs by cdc2 kinase". Solomon's paper showed a distinct cyclin concentration threshold for the activation of MPF. Felix's paper looked at cyclin B degradation in these extracts and found that MPF degrades cyclin B in a concentration-dependent and time-delayed manner.
In response to these observations, three competing models were published in the next year, 1991, by Norel and Agur, Goldbeter, and Tyson. These competing theories all attempted to model the experimental observations seen in the 1990 papers regarding the cyclin-MPF network.
The Norel and Agur model
Norel and Agur's model proposes a mechanism where cyclin catalytically drives the production of MPF, which in turn autocatalyzes. This model assumes that MPF activates cyclin degradation via APC activation, and it decouples cyclin degradation from MPF destruction. However, this model is unable to recreate the observed cyclin dependent MPF activity relationship seen in Solomon's 1990 paper, as it shows no upper steady-state level of MPF activity.
Goldbeter model
Goldbeter proposed a model where cyclin also catalytically |
https://en.wikipedia.org/wiki/Open%20Insulin%20Project | The Open Insulin Project is a community of researchers and advocates working to develop an open-source protocol for producing insulin that is affordable, has transparent pricing, and is community-owned.
History
The Open Insulin Project was started in 2015 by Anthony Di Franco, himself a type 1 diabetic. He started the project in response to the unreasonably high prices of insulin in the US. The project has been housed in Counter Culture Labs, a community laboratory and maker space in the Bay Area. Other collaborators include ReaGent, BioCurious and BioFoundry.
Goals
The project aims to develop both the methodology and hardware to allow communities and individuals to produce medical-grade insulin for the treatment of diabetes. These methods will be low-cost in order to combat the high price of insulin in places like the US. There is also potential for small-scale distributed production that may allow for improved insulin access in places with poor availability infrastructure. Access to insulin remains so insufficient around the globe that "Half of all people who need insulin lack the financial or logistical means to obtain adequate supplies".
Motivation
Researcher Frederick Banting famously refused to put his name on the patent after co-discovering insulin in 1921. The original patent for insulin was later sold in 1923 by his collaborators for just $1 to the University of Toronto in an effort to make it as available as possible. Despite this, for various reasons, there remains no generic version of insulin available in the US. Insulin remains controlled by a small number of large pharmaceutical companies and sold at prices unaffordable to many who rely on it to live, particularly those without insurance. This lack of availability has led to fatalities, such as Alec Smith, who died in 2017 due to lack of insulin. The Open Insulin Project is motivated by the urgent need to protect the health of those with diabetes regardless of their economic or employment status by develop
https://en.wikipedia.org/wiki/Cui%20Tiejun | Cui Tiejun (; born September 1965) is a Chinese scientist specializing in electromagnetic field and microwave technology. He is the deputy director of Southeast University's State Key Laboratory of Millimeter Waves and the Synergetic Innovation Center of Wireless Communication Technology and deputy dean of School of Information Science & Engineering.
Education
Cui was born in Luanping County, Hebei, in September 1965. He earned a bachelor's degree in 1987, a master's degree in 1989, and a doctorate in 1993, all in engineering science and all from Xidian University. From 1995 to 1997 he was supported by the Alexander von Humboldt Foundation as a Humboldt Research Fellow at Karlsruhe University. He was a postdoctoral fellow at the University of Illinois at Urbana-Champaign from 1997 to 1999, and worked there as a research scientist from 2000.
Career
In October 2001 he was hired as a professor and doctoral supervisor at the School of Information Science & Engineering, Southeast University.
He was a delegate to the 12th National People's Congress. In December 2017, he was elected a member of the 14th Central Committee of Jiu San Society.
Honours and awards
2002 National Science Fund for Distinguished Young Scholars
2014 State Natural Science Award (Second Class)
2015 Fellow of the Institute of Electrical and Electronics Engineers (IEEE) for contributions to microwave metamaterials and computational electromagnetics.
2018 State Natural Science Award (Second Class)
November 22, 2019 Member of the Chinese Academy of Sciences (CAS)
References
1965 births
Living people
People from Luanping County
Xidian University alumni
Scientists from Hebei
Academic staff of Southeast University
Members of the Chinese Academy of Sciences
Delegates to the 12th National People's Congress
Fellow Members of the IEEE
Microwave engineers
Chinese electrical engineers
Metamaterials scientists |
https://en.wikipedia.org/wiki/Fra%C3%AFss%C3%A9%20limit | In mathematical logic, specifically in the discipline of model theory, the Fraïssé limit (also called the Fraïssé construction or Fraïssé amalgamation) is a method used to construct (infinite) mathematical structures from their (finite) substructures. It is a special example of the more general concept of a direct limit in a category. The technique was developed in the 1950s by its namesake, French logician Roland Fraïssé.
The main point of Fraïssé's construction is to show how one can approximate a (countable) structure by its finitely generated substructures. Given a class K of finite relational structures, if K satisfies certain properties (described below), then there exists a unique countable structure, called the Fraïssé limit of K, which contains all the elements of K as substructures.
The general study of Fraïssé limits and related notions is sometimes called Fraïssé theory. This field has seen wide applications to other parts of mathematics, including topological dynamics, functional analysis, and Ramsey theory.
Finitely generated substructures and age
Fix a language L. By an L-structure, we mean a logical structure having signature L.
Given an L-structure A with domain D, and a subset X ⊆ D, we use ⟨X⟩ to denote the least substructure of A whose domain contains X (i.e. the closure of X under all the function and constant symbols in L).
A substructure S of A is then said to be finitely generated if S = ⟨X⟩ for some finite subset X ⊆ D. The age of A, denoted Age(A), is the class of all finitely generated substructures of A.
One can prove that any class K that is the age of some structure satisfies the following two conditions:
Hereditary property (HP)
If A ∈ K and B is a finitely generated substructure of A, then B is isomorphic to some structure in K.
Joint embedding property (JEP)
If A, B ∈ K, then there exists C ∈ K such that both A and B are embeddable in C.
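A standard illustrating example (added here as a sketch, not drawn from the excerpt above): the class of all finite linear orders satisfies both HP and JEP, and its Fraïssé limit is the ordered set of rationals.

```latex
% Example (standard): let K be the class of all finite linear orders.
% HP holds: every substructure of a finite linear order is again a finite linear order.
% JEP holds: any two finite linear orders A and B both embed into their concatenation.
\[
\mathcal{K} \;=\; \{\,\text{finite linear orders}\,\}
\qquad\Longrightarrow\qquad
\operatorname{Flim}(\mathcal{K}) \;\cong\; (\mathbb{Q},<),
\]
% since (Q, <) is countable and homogeneous, and its age is exactly K:
% every finite linear order embeds into the rationals.
```

Here Flim(K) denotes the Fraïssé limit of K; the isomorphism reflects Cantor's theorem that every countable dense linear order without endpoints is isomorphic to (ℚ, <).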
Fraïssé's theorem
As above, we noted that for any L-structure A, Age(A) satisfies the HP and JEP. Fraïssé proved a sort-of-converse result: when K is any non-e |
https://en.wikipedia.org/wiki/William%20Henry%20Whitfeld | William Henry Whitfeld (15 October 1856, Ashford, Kent – 1 December 1915) was an English mathematician, leading expert on bridge and whist, and card editor for The Field. He is known as the poser of the Whitfeld Six problem in double dummy bridge.
After graduating from Chatham House Grammar School, Whitfeld matriculated in 1876 at Trinity College, Cambridge. He graduated there in 1880 with a B.A. as twelfth wrangler in the Mathematical Tripos, and took his M.A. in 1884. For several years he was a tutor and lecturer at Cavendish College, Cambridge. In 1880 he published some double-dummy whist problems in The Cambridge Review: A Journal of University Life and Thought (an undergraduates' journal founded in 1879). His famous problem, now known as "Whitfeld Six", was published in the London magazine The Field in the issue of 31 January 1885.
Whitfeld's whist problems are related to the mathematics of nested balanced incomplete block designs. He wrote the article Bridge for the 11th edition of the Encyclopaedia Britannica. He also wrote a 6-page essay entitled Probabilities for the 1902 book Principles and Practice of Whist.
In 1900 he married Ida Alberta Russell (1877–1958). They had three sons and three daughters. The three sons were Francis Russell (1902–1975), Miles (1903–1997), and Ivan (1904–1983); the three daughters were Ida Mary ("Mary") (1901–1987), Rachel ("Ray") Elizabeth (1908–1993), and Margaret ("Maggie") Penmon (1913–1995). All moved to Western Australia: Miles and Ivan first, and their mother Ida Alberta, with Francis and the three daughters, a few months later. Ida Alberta, Ivan, Ida Mary, Rachel and Margaret eventually returned to England, leaving Francis and Miles in Australia.
References
External links
1856 births
1915 deaths
19th-century English mathematicians
20th-century English mathematicians
People educated at Chatham House Grammar School
Alumni of Trinity College, Cambridge
Academics of the University of Cambridge
Combinatorialists
English ma |
https://en.wikipedia.org/wiki/Michael%20Genesereth | Michael Genesereth (born 1948) is an American logician and computer scientist, who is most known for his work on computational logic and applications of that work in enterprise management, computational law, and general game playing. Genesereth is professor in the Computer Science Department at Stanford University and a professor by courtesy in the Stanford Law School. His 1987 textbook on Logical Foundations of Artificial Intelligence remains one of the key references on symbolic artificial intelligence. He is the author of the influential Game Description Language (GDL) and Knowledge Interchange Format (KIF), the latter of which led to the ISO Common Logic standard.
Education
Genesereth received a B.S. in Physics (1972) from Massachusetts Institute of Technology, and both an M.S. (1974) and Ph.D. (1978) in Applied Mathematics from Harvard University. As a graduate student, he worked on the Macsyma computer algebra system and wrote his dissertation on an automated advisor for Macsyma users.
Career
Genesereth has been a faculty member in the computer science department at Stanford University since 1979. He is the director of the Logic Group at Stanford and a founder and the research director of the Stanford CodeX Center for Legal Informatics. He is one of the founders of the companies Teknowledge, CommerceNet, Mergent Systems, SIPX and Symbium. Symbium is the most recent spinoff from the computational law research undertaken by CodeX and is a winner of the Ivory Innovation Prize for Policy and Regulatory Reform.
Research
Genesereth's research is broadly based on the use of computational logic for such applications as integrating knowledge from heterogeneous sources, as a common format for exchanging knowledge, as a foundation for agent-based knowledge representation and software engineering, as an enhancement to spreadsheets known as a Logical spreadsheet, and for optimizing queries in a deductive database system. He invented the notion of Model-based Diagno |
https://en.wikipedia.org/wiki/Oh%20Shit%21 | Oh Shit! is a Pac-Man clone released in 1985 for the MSX by The ByteBusters (Aackosoft's in-house development team) and published by Dutch publisher Aackosoft under the Classics range of games; a range that consists of clones of arcade games, i.e. Scentipede being a clone of Atari's Centipede. Oh Shit!'s level and art design is identical to that of Pac-Man.
Oh Shit! was later republished with differing names and cover art several times; Oh Shit! was renamed to Oh No! for the game's UK release due to the name being considered 'too obscene', and the name was shortened to Shit! for its release by Premium III Software Distribution. The European re-release Shit! notably uses cover art from 1985 horror novel The Howling III: Echoes, possibly without permission. Oh Shit! features digitized speech; when the player loses a life, the eponymous phrase "Oh Shit!" is said. For the renamed releases, Oh No! and Shit!, the speech is changed accordingly.
Releases
The 1985 MSX release was published by Aackosoft, but later releases of the MSX version were published by different publishers: the European version of Oh Shit! was later published by Eaglesoft (an alternate label of Aackosoft), and Oh Shit! was published by Compulogical in Spain. The UK release, Oh No!, was also published by Eaglesoft. The European re-release, Shit!, was developed by Eurosoft and published by Premium III Software Distribution. The original MSX version of Oh Shit! was made for compatibility with 32K MSX computers, and later re-releases offer 64K compatibility. Unlike other Aackosoft titles in the Classics range, Oh Shit! is incompatible with 16K MSX computers.
Aackosoft went bankrupt in 1988, after which Shit!, alongside other Aackosoft titles, were re-published by Premium III Software Distribution and developed by Eurosoft (a former label of Aackosoft) in the same year. Premium III Software Distribut |
https://en.wikipedia.org/wiki/International%20Association%20for%20Statistical%20Education | The International Association for Statistical Education (IASE) is a section of the International Statistical Institute (ISI), a professional association of statisticians, devoted to statistics education. It was founded in 1991 as an outgrowth of the ISI Statistical Education Committee, which had operated since 1948.
Since 2002 the ISI and IASE have published the Statistics Education Research Journal. The IASE is also associated with the quadrennial International Conference on Teaching Statistics, with satellite conferences of the World Statistics Congress, and with smaller roundtable workshops.
The presidents of the IASE have included
David Vere-Jones (1991–1993),
David S. Moore (1993–1995),
Anne Hawkins (1995–1997),
Maria Gabriella Ottaviani (1997–1999),
Brian Phillips (1999–2001),
Carmen Batanero (2001–2003),
Chris Wild (2003–2005),
Gilberte Schuyten (2005–2007),
Allan Rossman (2007–2009),
Helen MacGillivray (2009–2011),
John Harraway (2011–2013),
Iddo Gal (2013–2015),
Andrej Blejec (2015–2017),
Gail F. Burrill (2017–2019),
and Joachim Engel (2019–2021).
References
External links
Home page
International Statistical Institute
Statistics education |
https://en.wikipedia.org/wiki/Kaplansky%27s%20theorem%20on%20projective%20modules | In abstract algebra, Kaplansky's theorem on projective modules, first proven by Irving Kaplansky, states that a projective module over a local ring is free; where a not-necessarily-commutative ring is called local if for each element x, either x or 1 − x is a unit element. The theorem can also be formulated so to characterize a local ring (#Characterization of a local ring).
For a finitely generated projective module over a commutative local ring, the theorem is an easy consequence of Nakayama's lemma. For the general case, the proof (both the original and a later one) consists of the following two steps:
Observe that a projective module over an arbitrary ring is a direct sum of countably generated projective modules.
Show that a countably generated projective module over a local ring is free (by a "[reminiscence] of the proof of Nakayama's lemma").
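The easy finitely generated case mentioned above can be sketched via Nakayama's lemma (a standard derivation, supplied here for illustration rather than quoted from the article):

```latex
% Sketch: a finitely generated projective module P over a commutative local
% ring (R, m) is free. Choose x_1, ..., x_n in P whose images form a basis
% of P/mP over the residue field k = R/m. By Nakayama's lemma the x_i
% generate P, giving a surjection
\[
\varphi : R^{n} \twoheadrightarrow P, \qquad e_i \mapsto x_i .
\]
% Projectivity of P splits the sequence 0 -> K -> R^n -> P -> 0, so
\[
R^{n} \;\cong\; P \oplus K .
\]
% Reducing mod m gives k^n = (P/mP) + (K/mK) with dim_k P/mP = n,
% so K/mK = 0, i.e. K = mK. Since K is a direct summand of R^n it is
% finitely generated, and Nakayama's lemma forces K = 0; hence P = R^n.
```

The countably generated step in the general proof replaces this dimension count with a transfinite "peeling off" of free summands.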
The idea of the proof of the theorem was also later used by Hyman Bass to show big projective modules (under some mild conditions) are free. According to , Kaplansky's theorem "is very likely the inspiration for a major portion of the results" in the theory of semiperfect rings.
Proof
The proof of the theorem is based on two lemmas, both of which concern decompositions of modules and are of independent general interest.
Proof: Let N be a direct summand; i.e., M = N ⊕ L. Using the assumption, we write M = ⊕_{i∈I} M_i, where each M_i is a countably generated submodule. For each subset A ⊆ I, we write M_A = ⊕_{i∈A} M_i, N_A for the image of M_A under the projection M → N ⊆ M, and L_A the same way. Now, consider the set of all triples (J, B, C) consisting of a subset J ⊆ I and subsets B, C such that M_J = N_J ⊕ L_J and N_J, L_J are the direct sums of the modules in B, C. We give this set a partial ordering such that (J, B, C) ≤ (J′, B′, C′) if and only if J ⊆ J′, B ⊆ B′, C ⊆ C′. By Zorn's lemma, the set contains a maximal element (J, B, C). We shall show that J = I; i.e., that N is the direct sum of the modules in B. Suppose otherwise. Then we can inductively construct a sequence of at most countable subsets I_1 ⊆ I_2 ⊆ ⋯ such that I_1 ⊄ J and, for each integer n ≥ 1,
M_{I_n} ⊆ N_{I_{n+1}} + L_{I_{n+1}}.
Let I′ = ⋃_{n≥1} I_n and J′ = J ∪ I′. We claim:
M_{J′} = N_{J′} ⊕ L_{J′}. The inclusion ⊆ is trivial. Conversely, N_{J′} is the image of M_{J′} and so N_{J′} ⊆ M_{J′}. The sam |