https://en.wikipedia.org/wiki/Alluxio
Alluxio is an open-source virtual distributed file system (VDFS). Initially developed as the research project "Tachyon", Alluxio was created at the University of California, Berkeley's AMPLab as Haoyuan Li's Ph.D. thesis, advised by Professors Scott Shenker and Ion Stoica. Alluxio sits between computation and storage in the big data analytics stack. It provides a data abstraction layer for computation frameworks, enabling applications to connect to numerous storage systems through a common interface. The software is published under the Apache License. Data-driven applications, such as data analytics, machine learning, and AI, use APIs provided by Alluxio (such as the Hadoop HDFS API, S3 API, and FUSE API) to interact with data from various storage systems at high speed. Popular frameworks running on top of Alluxio include Apache Spark, Presto, TensorFlow, Trino, Apache Hive, and PyTorch. Alluxio can be deployed on-premises, in the cloud (e.g. Microsoft Azure, AWS, Google Compute Engine), or in a hybrid cloud environment. It can run on bare metal or in containerized environments such as Kubernetes, Docker, and Apache Mesos. History Alluxio was initially started by Haoyuan Li at UC Berkeley's AMPLab in 2013 and open-sourced in 2014. Alluxio had more than 1,000 contributors in 2018, making it one of the most active projects in the data ecosystem. Enterprises that use Alluxio The following is a list of notable enterprises that have used or are using Alluxio: See also Clustered file system Comparison of distributed file systems Global Namespace List of file systems References External links Free and open-source software
https://en.wikipedia.org/wiki/Stanley%20sequence
In mathematics, a Stanley sequence is an integer sequence generated by a greedy algorithm that chooses the sequence members to avoid arithmetic progressions. If is a finite set of non-negative integers on which no three elements form an arithmetic progression (that is, a Salem–Spencer set), then the Stanley sequence generated from starts from the elements of , in sorted order, and then repeatedly chooses each successive element of the sequence to be a number that is larger than the already-chosen numbers and does not form any three-term arithmetic progression with them. These sequences are named after Richard P. Stanley. Binary–ternary sequence The Stanley sequence starting from the empty set consists of those numbers whose ternary representations have only the digits 0 and 1. That is, when written in ternary, they look like binary numbers. These numbers are 0, 1, 3, 4, 9, 10, 12, 13, 27, 28, 30, 31, 36, 37, 39, 40, ... By their construction as a Stanley sequence, this sequence is the lexicographically first arithmetic-progression-free sequence. Its elements are the sums of distinct powers of three, the numbers such that the th central binomial coefficient is 1 mod 3, and the numbers whose balanced ternary representation is the same as their ternary representation. The construction of this sequence from the ternary numbers is analogous to the construction of the Moser–de Bruijn sequence, the sequence of numbers whose base-4 representations have only the digits 0 and 1, and the construction of the Cantor set as the subset of real numbers in the interval whose ternary representations use only the digits 0 and 2. More generally, they are a 2-regular sequence, one of a class of integer sequences defined by a linear recurrence relation with multiplier 2. This sequence includes three powers of two: 1, 4, and 256 = 3^5 + 3^2 + 3 + 1. Paul Erdős conjectured that these are the only powers of two that it contains. Growth rate Andrew Odlyzko and Richard P. Stanley obse
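The greedy construction described above can be sketched in Python. This is an illustrative implementation (the function name and signature are my own, not from any cited source); the seed is assumed to be a Salem–Spencer set:

```python
def stanley_sequence(n_terms, seed=()):
    """Greedily extend `seed` term by term, always taking the smallest
    number that forms no 3-term arithmetic progression with the
    numbers chosen so far."""
    seq = sorted(seed)
    chosen = set(seq)
    c = seq[-1] + 1 if seq else 0
    while len(seq) < n_terms:
        # c would complete a progression a, b, c exactly when a = 2*b - c
        # for some already-chosen b, with a also already chosen.
        if not any((2 * b - c) in chosen for b in seq):
            seq.append(c)
            chosen.add(c)
        c += 1
    return seq

# Starting from the empty set reproduces the binary-ternary sequence:
print(stanley_sequence(8))  # [0, 1, 3, 4, 9, 10, 12, 13]
```

Starting from the empty seed, the output matches the ternary-digit characterization above: each chosen number, written in ternary, uses only the digits 0 and 1.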
https://en.wikipedia.org/wiki/Center%20for%20Life%20Detection
The Center for Life Detection (CLD) is a collaboration among scientists and technologists from NASA’s Ames Research Center and Goddard Space Flight Center, which formed in 2018 to support the planning and implementation of missions that will seek evidence of life beyond Earth. CLD is supported by NASA’s Planetary Science Division and is one of three core teams in the Network for Life Detection. CLD’s perspectives on life detection science and technology development are summarized in “Groundwork for Life Detection”, a white paper submitted to and cited in the 2023-2032 Planetary Science and Astrobiology Decadal Survey. Activities The search for life elsewhere is among the NASA Science Mission Directorate's high-level priorities (Science 2020-2024: A Vision for Scientific Excellence, Priority 1). The Center for Life Detection was founded to support this search by: conducting research on biosignature “detectability” to help inform target/sample selection and measurement strategies/requirements; developing tools and engagement activities that enable members of the broader astrobiology community to formulate their knowledge, research, and expertise in a way that facilitates use in mission planning; supporting the instrument development community in mapping existing and emerging measurement technology to life detection science objectives, in order to establish science traceability and identify technology development needs. Research Multiple worlds within and beyond the Solar System are considered potentially habitable by virtue of the presence of liquid water, and mission concepts to seek evidence of life on these worlds are being developed. On Earth, the abundance distribution of life and its products ranges over many orders of magnitude, as a function of multiple environmental and ecological factors. Similar variability can be expected both within and among inhabited worlds beyond Earth, if any exist, and understanding it can inform target selection, observing
https://en.wikipedia.org/wiki/Arnon%20Avron
Arnon Avron (; born 1952) is an Israeli mathematician and Professor at the School of Computer Science at Tel Aviv University. His research focuses on applications of mathematical logic to computer science and artificial intelligence. Biography Born in Tel Aviv in 1952, Arnon Avron studied mathematics at Tel Aviv University and the Hebrew University of Jerusalem, receiving a Ph.D. magna cum laude from Tel Aviv University in 1985. Between 1986 and 1988, he was a visitor at the University of Edinburgh's Laboratory for Foundations of Computer Science, where he began his association with computer science. In 1988 he became a senior faculty member of the Department of Computer Science (later School of Computer Science) of Tel Aviv University, chairing the School in 1996–1998, and becoming a Full Professor in 1999. Research Avron's research interests include proof theory, automated reasoning, non-classical logics, foundations of mathematics, and applications of mathematical logic in computer science and artificial intelligence. Avron made a significant contribution to the theory of automated reasoning with his introduction of hypersequents, a generalization of the sequent calculus. Avron also introduced the use of bilattices to paraconsistent logic, and made contributions to predicative set theory and geometry. Selected works Books Articles References 1952 births Living people Einstein Institute of Mathematics alumni Israeli computer scientists Israeli Jews Israeli mathematicians Mathematical logicians Tel Aviv University alumni Academic staff of Tel Aviv University
https://en.wikipedia.org/wiki/Unihertz%20Atom
The Unihertz Atom was released in 2018 as Unihertz's second smartphone model after the Jelly. It was initially launched through a similar Kickstarter project, which reached its $50,000 goal in only 60 seconds. It features a 2.45-inch display, very small by industry standards, and at 108 grams (plus battery) is also very lightweight. It has a dedicated push-to-talk (PTT) button on its side. Background The Unihertz Atom was marketed as "the ultimate rugged phone for outdoor adventures". Unihertz was associated with a previous very small 3G smartphone, the Posh Micro X, which launched in 2015. Reviews of the Jelly and Jelly Pro, the "world's smallest 4G smartphone", have been mixed, but the phone drew international attention. Specifications Software Hardware Controversies There have been accusations of poor battery performance, and of network traffic possibly sending personal data to China. Responses claim the network traffic is used to speed up apps, and the company has been updating the phone software to improve performance. It is not known whether this is connected to similar widespread problems, but the predecessor Posh Micro X was also criticized for running the suspect FOTA updater software by Adups, as does the Jelly Pro. References Smart devices Android (operating system) devices
https://en.wikipedia.org/wiki/ISO-IR-169
ISO-IR-169 is a character set developed by the Blissymbolics Communication International Institute (BCI), and registered with the ISO-IR registry for use with ISO/IEC 2022 by the Standards Council of Canada. It contains 2304 characters for communicating with Blissymbols, including 2267 Blissymbolic dictionary words taken from Wood, Star and Reich's 1991 Blissymbol Reference Guide. Code charts Character set 0x21 (row 1) Row 1 contains a space, an "error sign", punctuation and ordinal numbers. This is the only row for which Unicode mappings currently exist, as of Unicode 13.0. Character set 0x23 (row 3) Row 3 contains 19 combining indicator characters for combination with Blissymbolic symbols. Character sets 0x30 and onward (rows 16 and onward) Rows 16 and onward include Blissymbolic dictionary words in alphabetical order by English name. They are annotated with respect to which combining indicators may be applied to the word, and whether certain combinations with indicators are forbidden due to being redundant to another encoded word. These characters do not exist in Unicode, as of Unicode 13.0. References Character sets
https://en.wikipedia.org/wiki/Neukom%20Institute%20for%20Computational%20Science
The Neukom Institute for Computational Science is a collection of offices and laboratory facilities at Dartmouth College in Hanover, New Hampshire. The institute was funded by a donation from Bill Neukom in 2004, then Dartmouth's largest gift for an academic program. The institute provides programs for undergraduates and graduate students, as well as encouraging public engagement with computer science through programs such as the Neukom Institute Literary Arts Award. Literary Arts Award The Neukom Institute Literary Arts Award is presented to celebrate new works of speculative fiction. The three categories are: Speculative Fiction, Debut Speculative Fiction and Playwriting. Speculative Fiction This award is for any work of speculative fiction published in the last two and a half years or that is about to be published. Recipients The inaugural award in 2018 was presented to Central Station by Lavie Tidhar and On the Edge of Gone by Corinne Duyvis. Debut Speculative Fiction This award is for an author's first work of speculative fiction. Recipients The inaugural award in 2018 was presented to Best Worst American by Juan Martinez. Playwriting This award is for a full-length play addressing the question "What does it mean to be a human in a computerized world?" Recipients The inaugural award in 2018 was presented to Choices People Make by Jessica Andrewartha. References External links Neukom Institute homepage Neukom Institute Literary Arts Award home page Dartmouth College facilities Computational science American literary awards American theater awards Awards established in 2018
https://en.wikipedia.org/wiki/List%20of%20software%20to%20detect%20low%20complexity%20regions%20in%20proteins
Computational methods can study protein sequences to identify regions with low complexity, which can have particular properties regarding their function and structure. For a comprehensive review of the various methods and tools, see the references. In addition, a web meta-server named PLAtform of TOols for LOw COmplexity (PlaToLoCo) has been developed for the visualization and annotation of low-complexity regions (LCRs) in proteins. PlaToLoCo integrates and collects the output of five different state-of-the-art tools for discovering LCRs and provides functional annotations such as domain detection, transmembrane segment prediction, and calculation of amino acid frequencies. Furthermore, the union or intersection of the results of a search on a query sequence can be obtained. A neural network webserver named LCR-hound has been developed to predict the function of prokaryotic and eukaryotic LCRs, based on their amino acid or di-amino acid content. References Proteomics proteins
https://en.wikipedia.org/wiki/Specific%20quantity
In the natural sciences, including physiology and engineering, a specific quantity generally refers to an intensive quantity "per unit mass", i.e., obtained by dividing an extensive quantity of interest by mass; in this case, it is also known as a mass-specific quantity. For example, specific leaf area is leaf area divided by leaf mass. Derived SI units involve the reciprocal kilogram (kg−1), e.g., square metre per kilogram (m2·kg−1). Another kind of specific quantity, termed a named specific quantity, is a generalization of the original concept. The divisor quantity is not restricted to mass, and the name of the divisor is usually placed before "specific" in the full term (e.g., "volume-specific storage" or "thrust-specific fuel consumption"). Named and unnamed specific quantities are given for the terms below. List Mass-specific quantities Per unit of mass (short form of mass-specific): Specific absorption rate, power absorbed per unit mass of tissue at a given frequency Specific activity, radioactivity in becquerels per unit mass Specific energy, defined as energy per unit mass Specific internal energy, internal energy per unit mass Specific kinetic energy, kinetic energy of an object per unit of mass Specific enthalpy, enthalpy per unit mass Specific enzyme activity, activity per milligram of total protein Specific force, defined as the non-gravitational force per unit mass Specific growth rate, increase in cell mass per unit cell mass per unit time Specific heat capacity, heat capacity per unit mass, unless another unit is named, such as mole-specific heat capacity, or volume-specific heat capacity Specific latent heat, latent heat per unit mass Specific leaf area, leaf area per unit dry leaf mass Specific modulus, a materials property consisting of the elastic modulus per mass density of a material Specific orbital energy, orbital energy per unit mass Specific power, per unit of mass (or volume or area) Specific relative angular momentum, of two orb
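As a minimal numerical illustration of the definition (all values here are invented), a mass-specific quantity is simply the extensive quantity divided by the mass it is measured over:

```python
# Specific energy: an extensive quantity (energy, in J) divided by mass (kg).
energy_joules = 9.0e5
mass_kg = 2.0
specific_energy = energy_joules / mass_kg  # J/kg; derived unit involves kg^-1

# Specific leaf area: leaf area per unit dry leaf mass (m^2/kg).
leaf_area_m2 = 0.03
dry_leaf_mass_kg = 0.0012
specific_leaf_area = leaf_area_m2 / dry_leaf_mass_kg

print(specific_energy)     # 450000.0
print(specific_leaf_area)  # 25.0
```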
https://en.wikipedia.org/wiki/Miller%27s%20recurrence%20algorithm
Miller's recurrence algorithm is a procedure for calculating a rapidly decreasing solution of a linear recurrence relation developed by J. C. P. Miller. It was originally developed to compute tables of the modified Bessel function but also applies to Bessel functions of the first kind and has other applications such as computation of the coefficients of Chebyshev expansions of other special functions. Many families of special functions satisfy a recurrence relation that relates the values of the functions of different orders with common argument . The modified Bessel functions of the first kind satisfy the recurrence relation . However, the modified Bessel functions of the second kind also satisfy the same recurrence relation . The first solution decreases rapidly with . The second solution increases rapidly with . Miller's algorithm provides a numerically stable procedure to obtain the decreasing solution. To compute the terms of a recurrence through according to Miller's algorithm, one first chooses a value much larger than and computes a trial solution taking initial condition to an arbitrary non-zero value (such as 1) and taking and later terms to be zero. Then the recurrence relation is used to successively compute trial values for , down to . Noting that a second sequence obtained from the trial sequence by multiplication by a constant normalizing factor will still satisfy the same recurrence relation, one can then apply a separate normalizing relationship to determine the normalizing factor that yields the actual solution. In the example of the modified Bessel functions, a suitable normalizing relation is a summation involving the even terms of the recurrence: where the infinite summation becomes finite due to the approximation that and later terms are zero. Finally, it is confirmed that the approximation error of the procedure is acceptable by repeating the procedure with a second choice of larger than the initial choice and confirming
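The procedure above can be sketched in Python for the modified Bessel functions of the first kind, I_n(x). The function name, signature, and choice of starting order are illustrative, not from the original source; the normalizing relationship used is the even-term identity cosh(x) = I_0(x) + 2·Σ_{k≥1} I_{2k}(x):

```python
import math

def modified_bessel_i(x, n_max, extra=20):
    """Miller's algorithm sketch for I_0(x)..I_{n_max}(x).

    Seed a trial solution far above n_max with (..., 0, 1), recur
    downward with I_{k-1}(x) = I_{k+1}(x) + (2k/x) I_k(x), then rescale
    using cosh(x) = I_0(x) + 2 * sum_{k>=1} I_{2k}(x)."""
    big = n_max + extra                # starting order, well above n_max
    trial = [0.0] * (big + 2)
    trial[big] = 1.0                   # arbitrary nonzero seed; trial[big+1] stays 0
    for k in range(big, 0, -1):        # backward recurrence: numerically stable
        trial[k - 1] = trial[k + 1] + (2.0 * k / x) * trial[k]
    norm = trial[0] + 2.0 * sum(trial[k] for k in range(2, big + 1, 2))
    scale = math.cosh(x) / norm        # normalizing factor
    return [t * scale for t in trial[:n_max + 1]]

vals = modified_bessel_i(1.0, 1)
# vals[0] ~ 1.266065878 = I_0(1), vals[1] ~ 0.565159104 = I_1(1)
```

As the text notes, in practice one repeats the computation with a larger starting order (here, a larger `extra`) and confirms that the results agree to the desired accuracy.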
https://en.wikipedia.org/wiki/Windows%20Remote%20Management
WinRM (Windows Remote Management) is Microsoft's implementation of WS-Management in Windows which allows systems to access or exchange management information across a common network. Using scripting objects or the built-in command-line tool, WinRM can acquire data from remote computers that have baseboard management controllers (BMCs). On Windows-based computers that include WinRM, certain data supplied by Windows Management Instrumentation (WMI) can also be obtained. Components WinRM Scripting API Provides an application programming interface enabling scripts to remotely acquire data from computers that perform WS-Management operations. winrm.cmd Built-in systems management command-line tool allowing a machine operator to configure WinRM. The implementation consists of a VBScript file (Winrm.vbs) written using the aforementioned WinRM Scripting API. winrs.exe Another command-line tool allowing the remote execution of most Cmd.exe commands. This tool uses the WS-Management protocol. Intelligent Platform Management Interface (IPMI) driver Provides hardware management and facilitates control of remote server hardware through BMCs. IPMI is most useful when the operating system is not running or deployed, as it allows for continued remote operation of the bare-metal hardware/software. WMI plug-in Allows WMI data to be made available to WinRM clients. WMI service Leverages the WMI plug-in to provide requested data or control and can also be used to acquire data from most WMI classes. Examples include Win32_Process, in addition to any IPMI-supplied data. WS-Management protocol Web Services Management is a DMTF open standard defining a SOAP-based protocol for the management of servers, devices, applications and various Web services. WS-Management provides a common way for systems to access and exchange management information across the IT infrastructure. Ports By default WinRM HTTPS uses 598
https://en.wikipedia.org/wiki/History%20of%20CAD%20software
Designers have used computers for calculations since their invention. Digital computers were used in power system analysis or optimization as early as proto-"Whirlwind" in 1949. Circuit design theory or power network methodology was algebraic, symbolic, and often vector-based. 1940s–1950s Between the mid-1940s and 1950s, various developments were made in computer software. Some of these developments include servo-motors controlled by generated pulse (1949), a digital computer with built-in operations to automatically coordinate transforms to compute radar related vectors (1951), and the graphic mathematical process of forming a shape with a digital machine tool (1952). In 1953, MIT researcher Douglas T. Ross saw the "interactive display equipment" being used by radar operators, believing it would be exactly what his SAGE-related data reduction group needed. Ross and the other researchers from the Massachusetts Institute of Technology Lincoln Laboratory were the sole users of the complex display systems installed for the pre-SAGE Cape Cod system. Ross claimed in an interview that they "used it for their own personal workstation." The designers of these early computers built utility programs to ensure programmers could debug software, using flowcharts on a display scope, with logical switches that could be opened and closed during the debugging session. They found that they could create electronic symbols and geometric figures to create simple circuit diagrams and flowcharts. These programs also enabled objects to be reproduced at will; it also was possible to change their orientation, linkage (flux, mechanical, lexical scoping), or scale. This presented numerous possibilities to them. Ross coined the term computer-aided design (CAD) in 1959. 1960s The invention of the 3D CAD/CAM is often attributed to French engineer Pierre Bézier (Arts et Métiers ParisTech, Renault). Between 1966 and 1968, after his mathematical work concerning surfaces, he developed UNISURF
https://en.wikipedia.org/wiki/Medical%20applications%20of%20radio%20frequency
Medical applications of radio frequency (RF) energy, in the form of electromagnetic waves (radio waves) or electrical currents, have existed for over 125 years, and now include diathermy, hyperthermia treatment of cancer, electrosurgery scalpels used to cut and cauterize in operations, and radiofrequency ablation. Magnetic resonance imaging (MRI) uses radio frequency waves to generate images of the human body. Radio frequencies at non-ablation energy levels are commonly used as a part of aesthetic treatments that can tighten skin, reduce fat by lipolysis and also apoptosis, or promote healing. RF diathermy is a medical treatment that uses RF-induced heat as a form of physical therapy and in surgical procedures. It is commonly used for muscle relaxation. It is also a method of heating tissue electromagnetically for therapeutic purposes in medicine. Diathermy is used in physical therapy to deliver moderate heat directly to pathologic lesions in the deeper tissues of the body. Surgically, the extreme heat that can be produced by diathermy may be used to destroy neoplasms, warts, and infected tissues, and to cauterize blood vessels to prevent excessive bleeding. The technique is particularly valuable in neurosurgery and surgery of the eye. Diathermy equipment typically operates in the short-wave radio frequency range (1–100 MHz) or with microwave energy (434–915 MHz). Pulsed electromagnetic field therapy (PEMF) is a medical treatment that purportedly helps to heal bone tissue, as reported in a NASA study. This method usually employs electromagnetic radiation of different frequencies – ranging from static magnetic fields, through extremely low frequencies (ELF), to higher radio frequencies (RF) administered in pulses. History The idea that high-frequency electromagnetic currents could have therapeutic effects was explored independently around the same time (1890–91) by French physician and biophysicist Jacques-Arsène d'Arsonval and Serbian American engineer Nikol
https://en.wikipedia.org/wiki/Hachimoji%20DNA
Hachimoji DNA (from Japanese hachimoji, "eight letters") is a synthetic nucleic acid analog that uses four synthetic nucleotides in addition to the four present in the natural nucleic acids, DNA and RNA. This leads to four allowed base pairs: two unnatural base pairs formed by the synthetic nucleobases in addition to the two normal pairs. Hachimoji bases have been demonstrated in both DNA and RNA analogs, using deoxyribose and ribose respectively as the backbone sugar. Benefits of such a nucleic acid system may include an enhanced ability to store data, as well as insights into what may be possible in the search for extraterrestrial life. The hachimoji DNA system produced one type of catalytic RNA (ribozyme or aptamer) in vitro. Description Natural DNA is a molecule carrying the genetic instructions used in the growth, development, functioning, and reproduction of all known living organisms and many viruses. DNA and ribonucleic acid (RNA) are nucleic acids; alongside proteins, lipids and complex carbohydrates (polysaccharides), nucleic acids are one of the four major types of macromolecules that are essential for all known forms of life. DNA is a polynucleotide as it is composed of simpler monomeric units called nucleotides; when double-stranded, the two chains coil around each other to form a double helix. In natural DNA, each nucleotide is composed of one of four nucleobases (cytosine [C], guanine [G], adenine [A] or thymine [T]), a sugar called deoxyribose, and a phosphate group. The nucleotides are joined to one another in a chain by covalent bonds between the sugar of one nucleotide and the phosphate of the next, resulting in an alternating sugar-phosphate backbone. The nitrogenous bases of the two separate polynucleotide strands are bound to each other with hydrogen bonds, according to base pairing rules (A with T and C with G), to make double-stranded DNA. Hachimoji DNA is similar to natural DNA but differs in the number, and type, of nucleobases. Unn
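The claimed gain in data-storage capacity is easy to quantify: an eight-letter alphabet carries log2 8 = 3 bits per position instead of the log2 4 = 2 bits of natural DNA. This back-of-the-envelope illustration says nothing about practical storage systems:

```python
import math

bits_natural = math.log2(4)    # A, C, G, T -> 2 bits per position
bits_hachimoji = math.log2(8)  # four natural + four synthetic bases -> 3 bits

# Number of distinct length-100 sequences under each alphabet:
n = 100
ratio = 8 ** n // 4 ** n       # hachimoji admits 2**100 times as many
```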
https://en.wikipedia.org/wiki/Google%20Cloud%20Dataflow
Google Cloud Dataflow is a fully managed service for executing Apache Beam pipelines within the Google Cloud Platform ecosystem. History Google Cloud Dataflow was announced in June 2014 and released to the general public as an open beta in April 2015. In January 2016 Google donated the underlying SDK, the implementation of a local runner, and a set of IOs (data connectors) to access Google Cloud Platform data services to the Apache Software Foundation. The donated code formed the original basis for Apache Beam. References External links Dataflow Cloud computing
https://en.wikipedia.org/wiki/Eden%27s%20conjecture
In the mathematics of dynamical systems, Eden's conjecture states that the supremum of the local Lyapunov dimensions on the global attractor is achieved on a stationary point or an unstable periodic orbit embedded into the attractor. The validity of the conjecture has been proved for a number of well-known systems having a global attractor (e.g. for the global attractors in the Lorenz system and the complex Ginzburg–Landau equation). It is named after Alp Eden, who proposed it in 1987. Kuznetsov–Eden's conjecture For local attractors, a conjecture on the Lyapunov dimension of self-excited attractors, refined by N. Kuznetsov, states that for a typical system the Lyapunov dimension of a self-excited attractor does not exceed the Lyapunov dimension of one of the unstable equilibria whose unstable manifold intersects with the basin of attraction and visualizes the attractor. The conjecture is valid, e.g., for the classical self-excited Lorenz attractor and for the self-excited attractors in the Hénon map (even in the case of multistability and the coexistence of local attractors with different Lyapunov dimensions). For a hidden attractor, the conjecture is that the maximum of the local Lyapunov dimensions is achieved on an unstable periodic orbit embedded into the attractor. References Dynamical systems Chaos theory Hidden oscillation
https://en.wikipedia.org/wiki/Maria%20Heep-Altiner
Maria Heep-Altiner (born 29 December 1959 in Niederzeuzheim) is a German mathematician, actuary and university lecturer. Life After graduating from the Prince Johann Ludwig School in Hadamar in 1978, Heep-Altiner studied mathematics and economics at the University of Bonn. In 1989 she earned her doctorate in mathematics on the number theory topic "Period relations for " under Günter Harder and Michael Rapoport. She then worked as an actuary for Gerling before moving in 1994 to Allgemeine Versicherungs-AG, where she became the actuarial manager for property insurance. In 2006, she moved to Talanx, where she was responsible for setting up an internal holding model. In 2008, Heep-Altiner returned to academia as a professor at the Institute of Insurance at the Cologne University of Applied Sciences, where she is responsible for the area of financing in insurance companies. She is a member of the German Actuarial Society executive board. In addition, she has co-authored various publications on actuarial topics, in particular on the Solvency II Directive of 2009. Publications For the following books Heep-Altiner was the main author or a significant part of the writing team: References 1959 births Living people 20th-century German mathematicians 21st-century German mathematicians Actuaries Number theory Academic staff of the University of Bonn Women mathematicians
https://en.wikipedia.org/wiki/PyClone
PyClone is software that implements a hierarchical Bayes statistical model to estimate cellular frequency patterns of mutations in a population of cancer cells, using observed alternate allele frequencies, copy number, and loss of heterozygosity (LOH) information. PyClone outputs clusters of variants based on the calculated cellular frequencies of mutations. Background According to the clonal evolution model proposed by Peter Nowell, a mutated cancer cell can accumulate more mutations as it progresses to create sub-clones. These cells divide and mutate further to give rise to other sub-populations. In keeping with the theory of natural selection, some mutations may be advantageous to the cancer cells and thus make the cell immune to previous treatment. Heterogeneity within a single cancer tumour can arise from single nucleotide polymorphism/variation (SNP/SNV) events, microsatellite shifts and instability, loss of heterozygosity (LOH), copy number variation, and karyotypic variations including chromosome structural aberrations and aneuploidy. With current methods of molecular analysis, in which a mixed population of cancer cells is lysed and sequenced, heterogeneity within the tumour cell population is under-detected. This results in a lack of information on the clonal composition of cancer tumours; more knowledge in this area would aid decisions about therapies. PyClone is a hierarchical Bayes statistical model that uses measurements of allele frequency and allele-specific copy numbers to estimate the proportion of tumour cells harbouring a mutation. By using deeply sequenced data to find putative clonal clusters, PyClone estimates the cellular prevalence, the portion of cancer cells harbouring a mutation, of the input sample. Progress has been made in measuring variant allele frequency with deep sequencing data, but statistical approaches to cluster mutations into biologically relevant groups remain underdeveloped. The commonness of a mutation between
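The quantity PyClone estimates can be illustrated with a deliberately simplified forward model. This is not PyClone's actual statistical model, and the purity, copy numbers, and multiplicity below are hypothetical inputs; it only shows how cellular prevalence relates to an observed variant allele frequency:

```python
def expected_vaf(prevalence, purity, multiplicity, tumour_cn, normal_cn=2):
    """Expected variant allele frequency (VAF) under a toy model:
    variant reads come only from tumour cells carrying the mutation
    (fraction `prevalence` of tumour cells, `multiplicity` mutant
    copies each), while total reads come from all copies of the locus
    in both tumour and contaminating normal cells."""
    variant_copies = purity * prevalence * multiplicity
    total_copies = (1 - purity) * normal_cn + purity * tumour_cn
    return variant_copies / total_copies

# A clonal heterozygous mutation (prevalence 1.0) in a pure, diploid
# tumour is expected at VAF 0.5:
print(expected_vaf(1.0, 1.0, 1, 2))  # 0.5
```

PyClone, in effect, inverts this kind of relationship probabilistically: given observed VAFs and copy numbers, it infers the posterior distribution of cellular prevalences and clusters mutations with similar prevalence.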
https://en.wikipedia.org/wiki/Divsha%20Amir%C3%A0
Divsha Amirà (; 1899 – 9 April 1966) was an Israeli mathematician and educator. Biography Amirà was born in Brańsk, Russian Empire to Rivka () and Aharon Itin. She immigrated to Israel with her family in 1906. Her father was one of the founders of Ahuzat Bayit (today Tel Aviv), a founder of the Tel Aviv Great Synagogue, and the owner of the first publishing house in Jaffa. She graduated in the second class of the Herzliya Gymnasium in 1914. Amirà studied at the University of Göttingen and obtained her doctorate from the University of Geneva in 1924 under the guidance of Herman Müntz. Her doctoral thesis, published in 1925, provided a projective synthesis of Euclidean geometry. Pedagogic career After leaving Geneva, Amirà worked at Gymnasia Rehavia in Jerusalem, and taught several courses on geometry at the Einstein Institute of Mathematics. She later taught at the and Beit-Hakerem High School, where her students included such future mathematicians as Ernst G. Straus. Published works Amirà published an introductory school textbook on geometry in 1938, following the axiomatic approach of Hilbert's Grundlagen der Geometrie. She published a more advanced textbook on the same topic in 1963. See also Education in Israel Women in Israel References 1899 births 1966 deaths 20th-century women mathematicians Burials at Har HaMenuchot Geometers Immigrants to Ottoman Palestine Academic staff of the Hebrew University of Jerusalem Herzliya Hebrew Gymnasium alumni Emigrants from the Russian Empire to the Ottoman Empire Israeli mathematicians Israeli women mathematicians Mathematics educators Textbook writers Women textbook writers University of Geneva alumni University of Göttingen alumni
https://en.wikipedia.org/wiki/Eilon%20Solan
Eilon Solan () (born 1969) is an Israeli mathematician and professor at the School of Mathematical Sciences of Tel Aviv University. His research focuses on game theory, stochastic processes, and measure theory. Biography Solan obtained a B.Sc. in mathematics and computer science from the Hebrew University of Jerusalem in 1989, and an M.Sc. in mathematics from Tel Aviv University in 1993. He completed his doctorate at the Hebrew University of Jerusalem in 1998 under the supervision of Abraham Neyman, with a dissertation on stochastic games. Scientific career Solan was one of the inventors of CAPTCHA in 1997, along with Eran Reshef and Gili Raanan. Solan has coauthored 12 research papers with his son, Omri Nisan Solan, some of which were published before Omri finished his undergraduate studies. References External links Date of birth missing (living people) Game theorists Hebrew University of Jerusalem alumni Israeli mathematicians Israeli statisticians Living people Tel Aviv University alumni Academic staff of Tel Aviv University 1969 births
https://en.wikipedia.org/wiki/ROSE%20test
The resistivity of solvent extract (ROSE) test is a test for the presence and average concentration of soluble ionic contaminants, for example on a printed circuit board (PCB). It was developed in the early 1970s. Some manufacturers use it as part of Six Sigma processes. Some modern fluxes have low solubility in traditional ROSE solvents such as water and isopropyl alcohol, and therefore require the use of different solvents. References Chemical tests Printed circuit board manufacturing
https://en.wikipedia.org/wiki/Lyapunov%20dimension
In the mathematics of dynamical systems, the concept of Lyapunov dimension was suggested by Kaplan and Yorke for estimating the Hausdorff dimension of attractors. The concept was further developed and rigorously justified in a number of papers, and nowadays various approaches to the definition of Lyapunov dimension are used. Note that attractors with noninteger Hausdorff dimension are called strange attractors. Since the direct numerical computation of the Hausdorff dimension of attractors is often a problem of high numerical complexity, estimates via the Lyapunov dimension have become widespread. The Lyapunov dimension was named after the Russian mathematician Aleksandr Lyapunov because of its close connection with the Lyapunov exponents. Definitions Consider a dynamical system , where is the shift operator along the solutions: , of ODE , , or difference equation , , with continuously differentiable vector-function . Then is the fundamental matrix of solutions of the linearized system, and denote by , the singular values with respect to their algebraic multiplicity, ordered decreasingly, for any and . Definition via finite-time Lyapunov dimension The concept of finite-time Lyapunov dimension and the related definition of the Lyapunov dimension, developed in the works of N. Kuznetsov, is convenient for numerical experiments where only finite time can be observed. Consider an analog of the Kaplan–Yorke formula for the finite-time Lyapunov exponents: with respect to the ordered set of finite-time Lyapunov exponents at the point . The finite-time Lyapunov dimension of the dynamical system with respect to an invariant set is defined as follows In this approach the use of the analog of the Kaplan–Yorke formula is rigorously justified by the Douady–Oesterlé theorem, which proves that for any fixed the finite-time Lyapunov dimension for a closed bounded invariant set is an upper estimate of the Hausdorff dimension: Looking for the best such estimate , the
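The Kaplan–Yorke formula underlying the Lyapunov dimension can be evaluated mechanically once the Lyapunov exponents are known: take the largest j for which the sum of the first j ordered exponents is non-negative, then add a fractional correction. The sketch below assumes the exponents are already computed (the commonly cited values for the Lorenz attractor are used as an example); it does not perform the underlying ODE integration.

```python
def kaplan_yorke_dimension(exponents):
    """Kaplan-Yorke (Lyapunov) dimension from a list of Lyapunov
    exponents: d = j + (l_1 + ... + l_j) / |l_{j+1}|, where j is the
    largest index with a non-negative partial sum."""
    lam = sorted(exponents, reverse=True)  # order decreasingly
    partial, j = 0.0, 0
    for k, l in enumerate(lam):
        if partial + l >= 0:
            partial += l
            j = k + 1
        else:
            break
    if j == 0:            # even the largest exponent is negative
        return 0.0
    if j == len(lam):     # all partial sums are non-negative
        return float(len(lam))
    return j + partial / abs(lam[j])

# Commonly cited exponents for the Lorenz attractor (classic parameters):
print(kaplan_yorke_dimension([0.9056, 0.0, -14.5723]))  # approx. 2.062
```

Here j = 2 because the first two exponents sum to 0.9056 > 0, while adding the third makes the sum negative.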
https://en.wikipedia.org/wiki/Oddly%20satisfying%20videos
Oddly satisfying videos are internet videos that portray repetitive events or actions that viewers find pleasing. Common subjects of oddly satisfying videos include domino shows, parlor tricks, slime, pressure washing, hydraulic presses, soap cutting, and paint mixing. Viewers may watch oddly satisfying videos as a form of escapism, and some oddly satisfying videos may bring about ASMR. The term emerged on the internet forum Reddit after the /r/oddlysatisfying subreddit was established in 2013. Oddly satisfying videos are now widespread on online video platforms such as YouTube and Instagram. Wired UK called slime, a common feature of oddly satisfying videos, the "biggest DIY trend of 2017, even causing a national shortage of glue in the US." In 2021, The Guardian wrote that "the pandemic, coupled with an increasingly chaotic news cycle, has us striving towards the ultimate #smoothbrain state – meaning there’s a global audience regularly seeking out the oddly satisfying, and a huge wave of content creators actively manufacturing it." Psychology The appeal of oddly satisfying videos is thought to lie in the human preference for symmetry, patterns and repetition, the interest of exploring the behavior of materials, or hand movements. It may be related to the autonomous sensory meridian response, a pleasant tingling sensation in the neck and scalp. Evan Malone, a professor of art and film philosophy, theorized that the appeal of oddly satisfying videos may lie in their portrayal of everyday experiences as cinematic and, in Baudrillard's words, "hyper-real". The effect of watching such videos has been described as a "brain massage" or "lightly hypnotizing", and as a form of psychological self-care to help overcome stress or anxiety. The satisfaction derived from oddly satisfying videos may have to do with mirror neurons, which fire both when one performs a motion and when one watches someone else perform a motion. References External links /r/oddlysatisfying on Re
https://en.wikipedia.org/wiki/Day%20convolution
In mathematics, specifically in category theory, Day convolution is an operation on functors that can be seen as a categorified version of function convolution. It was first introduced by Brian Day in 1970 in the general context of enriched functor categories. Day convolution acts as a tensor product for a monoidal category structure on the category of functors over some monoidal category . Definition Let be a monoidal category enriched over a symmetric monoidal closed category . Given two functors , we define their Day convolution as the following coend. If the monoidal structure on the base category is symmetric, then the Day convolution is also symmetric. We can show that this defines an associative monoidal product. References External links Category theory
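In the Set-enriched case, the coend defining the Day convolution of two functors F, G : C → Set on a monoidal category (C, ⊗) is commonly written as follows (the notation is supplied here, since the symbols did not survive extraction above):

```latex
(F \star G)(c) \;=\; \int^{x,\,y \in \mathbf{C}} \mathbf{C}(x \otimes y,\; c) \times F(x) \times G(y)
```

The unit for this monoidal structure on the functor category is the representable functor C(I, −), where I is the monoidal unit of C.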
https://en.wikipedia.org/wiki/HoloLens%202
Microsoft HoloLens 2 is a mixed reality head-mounted display developed and manufactured by Microsoft. It is the successor to the original Microsoft HoloLens. The first variant of the device, the HoloLens 2 enterprise edition, debuted on February 24, 2019. This was followed by a developer edition that was announced on May 2, 2019. The HoloLens 2 was subsequently released in limited numbers on November 7, 2019. Description The HoloLens 2 was announced by lead HoloLens developer Alex Kipman on February 24, 2019 at Mobile World Congress (MWC) in Barcelona, Spain. On May 7, 2019, the HoloLens 2 was shown again at the Microsoft Build developer conference. There, it showcased an application created with the Unreal Engine. The HoloLens 2 is a pair of combination waveguide- and laser-based stereoscopic, full-color mixed reality smartglasses developed and manufactured by Microsoft. The US military's Integrated Visual Augmentation System is a further development of the HoloLens 2. The HoloLens 2 is an early AR device. The displays on the HoloLens 2 are simple waveguide displays with a fixed focus of approximately two meters. Because of the fixed focus, the displays exhibit the vergence-accommodation conflict, an unpleasant visual sensation for the viewer. On August 20, 2019, at the Hot Chips 31 symposium, Microsoft presented their Holographic Processing Unit (HPU) 2.0 custom design for the HoloLens 2 with the following features:
7x SIMD Fixed Point (SFP) for 2D processing
6x Floating Vector Processor (FVP) for 3D processing
>1 TOP of programmable compute
125 Mb SRAM
79 mm2 die size and 2 billion transistors
TSMC 16FF+ process
PCIe 2.0 x1 at 100 MB/s bandwidth to Snapdragon 850
On August 29, 2019, at the World Artificial Intelligence Conference in Shanghai, Microsoft's Executive Vice President, Harry Shum, revealed that HoloLens 2 would go on sale in September 2019. The product started shipping on November 7, 2019. Improvements over the previous model Microsoft hig
https://en.wikipedia.org/wiki/FAM178B
FAM178B is a protein-coding gene located on the plus strand of chromosome 2. The locus for the gene is 2q11.2. It is also known by the aliases Family with Sequence Similarity 178, Member B, and HSPC234. In total there are 24 exons in the gene. The FAM178B gene spans 110,720 base pairs, and its protein contains 827 amino acids. Forms There are two isoforms of the gene transcript that exist by alternative splicing, and one gene precursor. SLF2 (FAM178A) is an important paralog of FAM178B. SLF2 is predicted to play a role in the DNA damage response (DDR) pathway by regulating post-replication repair of UV-damaged DNA, and genomic stability maintenance. Protein structure The molecular weight of the protein is 76.5 kilodaltons, and the isoelectric point is 5.47. The gene has 6 transcript splice variants. The protein has been phenotypically associated with bipolar disease due to its locus, as well as with body mass index (BMI) and cell adhesion. A proposed structure for the protein can be found in the images for proposed structures. The secondary structure of the FAM178B protein is predicted to be primarily alpha helices. The tertiary structure of the protein may assume a coiled-coil structure. Expression FAM178B is most highly expressed in skeletal muscle and brain tissues. The structure in which it is most highly concentrated is the corpus callosum of the brain. Additionally, it is expressed at high levels in the trigeminal nerve and spinal cord. Further, there are also high concentrations in the heart, testes and olfactory regions. According to the Allen Brain Atlas, the olfactory regions and the hippocampus of the mouse brain showed the greatest expression of the gene when tested experimentally. DNA Level Regulation The proposed promoter region of the FAM178B protein is below. A table of relevant transcription factor binding sites that correspond to the sequence and colors highlighted in the promoter region is also included. The promoter region of FAM178B is highly
https://en.wikipedia.org/wiki/Solventogenesis
Solventogenesis is the biochemical production of solvents (usually acetone and butanol) by Clostridium species. It is the second phase of ABE fermentation. Process Solventogenic Clostridium species have a biphasic metabolism composed of an acidogenic phase and a solventogenic phase. During acidogenesis, these bacteria are able to convert several carbon sources into organic acids, commonly butyrate and acetate. As acid accumulates, cells begin to assimilate the organic acids into solvents. In Clostridium acetobutylicum, a model solventogenic Clostridium species, a combination of low pH and high undissociated butyrate, referred to as the "pH-acid effect", triggers the metabolic shift from acidogenesis to solventogenesis. Products Acetone, butanol, and ethanol are the most common products of solventogenesis. Some species such as Clostridium beijerinckii, Clostridium puniceum and Clostridium roseum are able to further reduce acetone to isopropanol. Several species are able to produce additional solvents under various culture conditions. For example, glycerol fermentation results in the production of 1,3-propanediol in several species. Acetoin is produced by several species and is further reduced to 2,3-butanediol by Clostridium beijerinckii. List of Solventogenic Clostridium References Solvents Biochemistry Clostridia Clostridium Fermentation
https://en.wikipedia.org/wiki/Tanja%20Stadler
Tanja Stadler is a mathematician and professor of computational evolution at the Swiss Federal Institute of Technology (ETH Zurich). She is the current president of the Swiss Scientific Advisory Panel COVID-19. Career Tanja Stadler studied applied mathematics and statistics at the Technical University of Munich, the University of Cardiff, and the University of Canterbury. She continued at the Technical University of Munich to obtain a PhD in 2008 on the topic 'Evolving Trees – Models for Speciation and Extinction in Phylogenetics' (with Prof. Anusch Taraz and Prof. Mike Steel). After a postdoctoral period with Prof. Sebastian Bonhoeffer in the Department of Environmental Systems Sciences at ETH Zürich, she was promoted to Junior Group Leader at ETH Zürich in 2011. In 2014, she became an assistant professor at the Department of Biosystems Science and Engineering of ETH Zürich, where she was promoted to associate professor in 2017 and to full professor in 2021. Research Scientific Contributions Stadler's research has a strong evolutionary perspective and covers questions ranging from macroevolution and epidemiology to developmental biology and immunology. Her research questions include fundamental aspects such as how speciation processes led to the current biodiversity, as well as questions directly relevant to human societies, such as the spread of pathogens like COVID-19 or Ebola. At the core, Stadler develops and applies statistical phylodynamic tools, combining mathematics, computer science and biology, to estimate evolutionary and population dynamics from genomic sequencing data. Stadler made major theoretical contributions to the field of phylodynamics by developing statistical frameworks that use birth-death processes in the context of phylogenetic trees. In particular, she laid the foundations for accounting for sampling through time in birth-death models – enabling coherent analysis of genetic sequencing data collected through time duri
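The birth–death processes at the heart of this phylodynamic work can be illustrated with a minimal Gillespie-style simulation of the number of lineages through time. This is a toy sketch, not Stadler's sampling-through-time framework; the function and parameter names are ours.

```python
import random

def simulate_birth_death(birth=1.0, death=0.5, t_max=5.0, seed=0):
    """Simulate a linear birth-death process started from one lineage
    and return the number of lineages alive at time t_max."""
    rng = random.Random(seed)
    n, t = 1, 0.0
    while n > 0:
        # Time to the next event: total rate is n * (birth + death).
        t += rng.expovariate(n * (birth + death))
        if t >= t_max:
            break
        if rng.random() < birth / (birth + death):
            n += 1  # birth: speciation / transmission
        else:
            n -= 1  # death: extinction / recovery
    return n

print(simulate_birth_death())  # seeded, hence reproducible
```

Phylodynamic inference runs in the opposite direction: given the tree relating sampled lineages, it estimates the birth and death rates that generated it.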
https://en.wikipedia.org/wiki/CDV3%20%28gene%29
Protein CDV3 homolog, also known as carnitine deficiency-associated gene expressed in ventricle 3, is a protein that in humans is encoded by the CDV3 gene. CDV3 is a biomarker for hepatocellular carcinoma. CDV3 has been considered as a potential target for gene therapy. Related gene families include plasma proteins and predicted intracellular proteins. Gene Aliases The CDV3 protein is also commonly known as tyrosine-phosphorylated protein 36 (TPP36). TPP36 isoforms have been found to be substrates of Abl tyrosine kinase. Locus The CDV3 gene is on chromosome 3 (3q22.1). Exons The listed number of exons for CDV3 varies between genetic databases. The number of exons also varies with the isoform in question, with most transcript isoforms having 5 exons. Span The exons of the human CDV3 gene's longest transcript isoform span 16,711 bp. Transcripts Isoforms CDV3 has seven isoforms, and more are continuously added to databases as they are discovered. Currently there are isoforms a-f. Protein Molecular weight: 27.3 kDa Protein length: 258 aa Isoelectric point: 5.89 Motifs A SAPS analysis of the human CDV3 protein sequence found one uncharged cluster segment from 28-75 aa. There were no signs of high-scoring hydrophobic segments. One high-scoring transmembrane segment was found from 28-55 aa. CDV3 was found to have significant maximal spacing from 27-76 aa. Repeats The following repetitive structures were found for the protein.
Aligned matching blocks: [45-52] AGAAGGGA, [66-73] AGAAGPGA; with superset: [32-36] AGAAG, [45-49] AGAAG, [66-70] AGAAG; also [134-137] MEKS, [213-216] MEKS.
Simple tandem repeat: [31-43] AAGAA_GSAGGSSG, [44-54] AAGAAGGGAGA.
Predicted Motifs PROSITE found several potential motifs in CDV3. Predicted Secondary Structure The following programs were used to develop this figure: JPred, CFSSP, and GOR4. The majority of the CDV3 structure is hypothesized to be alpha helices and random coil. P
https://en.wikipedia.org/wiki/Bioche%27s%20rules
Bioche's rules, formulated by the French mathematician Charles Bioche (1859–1949), are rules to aid in the computation of certain indefinite integrals in which the integrand contains sines and cosines. In the following, f(t) is a rational expression in sin(t) and cos(t). In order to calculate ∫f(t) dt, consider the integrand ω(t) = f(t) dt. We consider the behavior of this entire integrand, including the dt, under translations and reflections of the t axis. The translations and reflections are ones that correspond to the symmetries and periodicities of the basic trigonometric functions. Bioche's rules state that: If ω(−t) = ω(t), a good change of variables is u = cos(t). If ω(π − t) = ω(t), a good change of variables is u = sin(t). If ω(π + t) = ω(t), a good change of variables is u = tan(t). If two of the preceding relations both hold, a good change of variables is u = cos(2t). In all other cases, use u = tan(t/2). Because rules 1 and 2 involve flipping the t axis, they flip the sign of dt, and therefore the behavior of ω under these transformations differs from that of ƒ by a sign. Although the rules could be stated in terms of ƒ, stating them in terms of ω has a mnemonic advantage, which is that we choose the change of variables u(t) that has the same symmetry as ω. These rules can, in fact, be stated as a theorem: one shows that the proposed change of variable reduces (if the rule applies and if f is actually a rational expression in sin(t) and cos(t)) to the integration of a rational function in a new variable, which can be calculated by partial fraction decomposition. Case of polynomials To calculate the integral ∫ sin^p(t) cos^q(t) dt, Bioche's rules apply as well. If p and q are odd, one uses u = cos(2t); if p is odd and q even, one uses u = cos(t); if p is even and q odd, one uses u = sin(t); if not, one is reduced to linearization. Another version for hyperbolic functions Suppose one is calculating ∫G(cosh(t), sinh(t)) dt. If Bioche's rules suggest calculating by u = cos(t) (respectively, u = sin(t)), in the case of hyperbolic sine and cosine, a good change of variable is u = cosh(t) (respectively, u = sinh(t)). In every case, the change of variable u = e^t allows one to reduce to a rational function, this last change of variable being mos
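As a concrete check of rule 1, take ∫ sin³(t) dt: the integrand ω(t) = sin³(t) dt is unchanged by t ↦ −t (since sin(−t)³ d(−t) = sin³(t) dt), so the rules suggest u = cos(t), which turns the integrand into the polynomial (u² − 1) du. The numerical verification below is our own illustration of that substitution.

```python
import math

# With u = cos(t): du = -sin(t) dt and sin^2(t) = 1 - u^2, so
# sin^3(t) dt = (u^2 - 1) du, whose antiderivative is u^3/3 - u.
def antiderivative(t):
    u = math.cos(t)
    return u**3 / 3 - u

# Check F'(t) = sin^3(t) numerically at a sample point
# using a central difference.
t0, h = 0.7, 1e-6
numeric_derivative = (antiderivative(t0 + h) - antiderivative(t0 - h)) / (2 * h)
assert abs(numeric_derivative - math.sin(t0)**3) < 1e-8
print("substitution u = cos(t) verified")
```

The same pattern (substitute, integrate a rational function of u, back-substitute) applies to each of the five rules.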
https://en.wikipedia.org/wiki/Human%E2%80%93robot%20collaboration
Human-Robot Collaboration is the study of collaborative processes in which human and robot agents work together to achieve shared goals. Many new applications for robots require them to work alongside people as capable members of human-robot teams. These include robots for homes, hospitals, and offices, space exploration and manufacturing. Human-Robot Collaboration (HRC) is an interdisciplinary research area comprising classical robotics, human-computer interaction, artificial intelligence, process design, layout planning, ergonomics, cognitive sciences, and psychology. Industrial applications of human-robot collaboration involve collaborative robots, or cobots, that physically interact with humans in a shared workspace to complete tasks such as collaborative manipulation or object handovers. Collaborative Activity Collaboration is defined as a special type of coordinated activity, one in which two or more agents work jointly with each other, together performing a task or carrying out the activities needed to satisfy a shared goal. The process typically involves shared plans, shared norms and mutually beneficial interactions. Although collaboration and cooperation are often used interchangeably, collaboration differs from cooperation in that it involves a shared goal and joint action, where the success of both parties depends on each other. For effective human-robot collaboration, it is imperative that the robot is capable of understanding and interpreting several communication mechanisms similar to the mechanisms involved in human-human interaction. The robot must also communicate its own set of intents and goals to establish and maintain a set of shared beliefs and to coordinate its actions to execute the shared plan. In addition, all team members demonstrate commitment to doing their own part, to the others doing theirs, and to the success of the overall task. Theories Informing Human-Robot Collaboration Human-human collaborative activities are studied in depth in or
https://en.wikipedia.org/wiki/ChemRxiv
ChemRxiv (pronounced "chem archive"—the X represents the Greek letter chi [χ]) is an open access preprint archive for chemistry. It is operated by the American Chemical Society, the Royal Society of Chemistry and the German Chemical Society. The preprint server was announced in 2016, but only opened online in 2017. Initially, editors of ACS journals were skeptical, and only 80% of the editors allowed submissions to be uploaded to the preprint server in 2017. In 2019 the Chinese Chemical Society and the Chemical Society of Japan joined as co-owners of the preprint server. The initial reception of ChemRxiv was one of hesitation, with several major journals of the founding organizations initially unsupportive: Angewandte Chemie gave support in March 2018 and JACS only in August 2018. However, ChemRxiv received more than 1,000 submissions in its first eighteen months, growing to 2,314 in 2019. Like other preprint servers, it saw a surge in COVID-19 preprints in 2020. See also List of preprint repositories References External links Eprint archives Online archives Chemistry journals
https://en.wikipedia.org/wiki/Spherical%20Bernstein%27s%20problem
The spherical Bernstein's problem is a possible generalization of the original Bernstein's problem in the field of global differential geometry, first proposed by Shiing-Shen Chern in 1969, and then again in 1970, during his plenary address at the International Congress of Mathematicians in Nice. The problem Are the equators in S^(n+1) the only smooth embedded minimal hypersurfaces which are topological n-dimensional spheres? Additionally, the spherical Bernstein's problem, while itself a generalization of the original Bernstein's problem, can, too, be generalized further by replacing the ambient space by a simply-connected, compact symmetric space. Some results in this direction are due to the work of Wu-Chung Hsiang and Wu-Yi Hsiang. Alternative formulations Below are two alternative ways to express the problem: The second formulation Let the (n − 1)-sphere be embedded as a minimal hypersurface in S^n(1). Is it necessarily an equator? By the Almgren–Calabi theorem, it is true when n = 3 (or n = 2 for the 1st formulation). Wu-Chung Hsiang proved it for n ∈ {4, 5, 6, 7, 8, 10, 12, 14} (or n ∈ {3, 4, 5, 6, 7, 9, 11, 13}, respectively). In 1987, Per Tomter proved it for all even n (or all odd n, respectively). Thus, it only remains unknown for all odd n ≥ 9 (or all even n ≥ 8, respectively). The third formulation Is it true that an embedded, minimal hypersphere inside the Euclidean n-sphere is necessarily an equator? Geometrically, the problem is analogous to the following problem: Is the local topology at an isolated singular point of a minimal hypersurface necessarily different from that of a disc? For example, the affirmative answer for the spherical Bernstein problem when n = 3 is equivalent to the fact that the local topology at an isolated singular point of any minimal hypersurface in an arbitrary Riemannian 4-manifold must be different from that of a disc. Further reading F.J. Almgren, Jr., Some interior regularity theorems for minimal surfaces and an extension of the Ber
https://en.wikipedia.org/wiki/Aquatic%20plant%20management
Aquatic plant management involves the science and methodologies used to control invasive and non-invasive aquatic plant species in waterways. Methods used include spraying herbicide, biological controls, mechanical removal, as well as habitat modification. Preventing the introduction of invasive species is ideal. Aquaculture has been a source of exotic and ultimately invasive species introductions, such as Oreochromis niloticus. Aquatic plants released from home fish tanks have also been an issue. Impact Aquatic weeds are most economically problematic where human activity and water meet. Water weeds reduce our capacity for hydroelectric generation, drinking water supply, industrial water supply, agricultural water supply, and recreational use of water bodies including recreational boating. Some weeds do this by increasing - rather than decreasing - the evaporation loss at the surface. Particular weeds and aquatic insects have a special relationship which makes the plants a source of insect pests. Organizations In Florida, the Florida Fish and Wildlife Conservation Commission (FWC) has an aquatic plant management section. The State of Washington has an Aquatic Plant Management Program. The Aquatic Plant Management Society is an organization in the U.S. that publishes the Journal of Aquatic Plant Management. The City of Winter Park, Florida has a herbicide program. Species Invasive aquatic species include: Eichhornia crassipes (water hyacinth), invasive outside its native habitat in the Amazon Basin Hydrilla, invasive in North America Limnobium laevigatum, invasive in the U.S. Myriophyllum spicatum, invasive in North America Myriophyllum verticillatum, invasive in North America Monochoria vaginalis, invasive outside its native habitat in Asia and the Pacific Pistia Salvinia molesta Aquatic plant harvesting methods Harvesting methods Harvesting refers to anthropogenic removal of aquatic plants from their environment. Aquatic plant harvesting is often
https://en.wikipedia.org/wiki/Cirq
Cirq is an open-source framework for noisy intermediate-scale quantum (NISQ) computers. History Cirq was developed by the Google AI Quantum Team, and the public alpha was announced at the International Workshop on Quantum Software and Quantum Machine Learning on July 18, 2018. A demo by QC Ware showed an implementation of QAOA solving an example of the maximum cut problem on a Cirq simulator. Usage Quantum programs in Cirq are represented by a "Circuit", which is made up of a series of "Moments" representing slices of quantum gates that should be applied at the same time. The programs can be executed on local simulators or against hardware supplied by IonQ, Pasqal, Rigetti, and Alpine Quantum Technologies. The following example shows how to create and measure a Bell state in Cirq.

import cirq

# Pick qubits
qubit0 = cirq.GridQubit(0, 0)
qubit1 = cirq.GridQubit(0, 1)

# Create a circuit
circuit = cirq.Circuit(
    cirq.H(qubit0),
    cirq.CNOT(qubit0, qubit1),
    cirq.measure(qubit0, key='m0'),
    cirq.measure(qubit1, key='m1'),
)

Printing the circuit displays its diagram:

print(circuit)
# prints
# (0, 0): ───H───@───M('m0')───
#                │
# (0, 1): ───────X───M('m1')───

Simulating the circuit repeatedly shows that the measurements of the qubits are correlated:

simulator = cirq.Simulator()
result = simulator.run(circuit, repetitions=5)
print(result)
# prints
# m0=11010
# m1=11010

Projects OpenFermion OpenFermion is a library that compiles quantum simulation algorithms to Cirq. TensorFlow Quantum TensorFlow Quantum is an extension of TensorFlow that allows TensorFlow to be used to explore hybrid classical-quantum machine learning algorithms. ReCirq ReCirq is a repository of research projects done using Cirq. Qsim Qsim is a high-performance wave function simulator that leverages gate fusion, AVX/FMA instructions, and OpenMP to achieve fast simulation rates. Qsimcirq allows one to use qsim from within Cirq. References
https://en.wikipedia.org/wiki/Timed%20word
In model checking, a subfield of computer science, a timed word is an extension of the notion of words, in a formal language, in which each letter is associated with a positive time tag. The sequence of time tags must be non-decreasing, which intuitively means that letters are received in order. For example, a system receiving a word over a network may associate to each letter the time at which the letter is received. The non-decreasing condition here means that the letters are received in the correct order. A timed language is a set of timed words. Example Consider an elevator. What is formally called a letter could in fact be information such as "someone pressed the button on the 2nd floor", or "the doors opened on the third floor". In this case, a timed word is a sequence of actions taken by the elevator and its users, with time stamps to record those actions. The timed word can then be analyzed by formal methods to check whether a property such as "each time the elevator is called, it arrives in less than three minutes assuming that no one held the door for more than fifteen seconds" holds. A statement such as this one is usually expressed in metric temporal logic, an extension of linear temporal logic that allows the expression of time constraints. A timed word may be passed to a model, such as a timed automaton, which will decide, given the letters or actions that already occurred, what is the next action that should be done. In our example, to which floor the elevator must go. Then a program may test this timed automaton and check the above-mentioned property. That is, it will try to generate a timed word in which the door is never held open for more than fifteen seconds, and in which a user must wait more than three minutes after calling the elevator. Definition Given an alphabet A, a timed word is a sequence (a_1, t_1)(a_2, t_2)…, finite or infinite, with a_i ∈ A and t_i ≤ t_(i+1) for each integer i. If the sequence is infinite but the sequence of time tags t_i is bounded, then this word is said to be a Zeno t
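The definition translates directly into a validity check on finite timed words, here represented as lists of (letter, time) pairs. This is a small illustrative sketch; the concrete representation and names are ours.

```python
def is_valid_timed_word(word):
    """A finite timed word is valid when every time tag is non-negative
    and the sequence of tags is non-decreasing."""
    times = [t for _, t in word]
    return all(t >= 0 for t in times) and all(
        earlier <= later for earlier, later in zip(times, times[1:])
    )

# Elevator-style example: actions tagged with the times they occur.
trace = [("call_2nd_floor", 0.0), ("doors_open_3rd", 2.5), ("doors_close", 10.0)]
print(is_valid_timed_word(trace))                     # True
print(is_valid_timed_word([("a", 1.0), ("b", 0.5)]))  # False: tags decrease
```

A model checker works over such traces symbolically rather than one trace at a time, but the underlying objects are the same.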
https://en.wikipedia.org/wiki/Smart%20port
A smart port equips the workforce with relevant skills and technology to solve the unique internal and external challenges of the organisation, and to facilitate the efficient movement of goods, delivery of services and smooth flow of information. Using a holistic approach, the smart port achieves results without creating new challenges internally or elsewhere in the supply chain eco-system. Background The smart port minimises the negative impacts of its activities on the natural environment and enhances the surrounding communities - economically and socially. The material benefits of chosen technologies allow the smart port to: Improve efficiency to gain competitive advantage Increase business resilience to economic shocks or disruptive forces Extract maximum value from physical assets Develop new revenue streams based on digital value propositions Increase employee engagement and wellbeing Achieve and exceed environmental commitments A smart port is not defined by the use of any one particular technology or concept. Concept Smart ports employ smart technology solutions to increase efficiency, effectiveness and security by making ports more environmentally sustainable, economically efficient and capable of handling increased port traffic. Efficiency Due to the increasing size and volume of container, transport and cruise ships, ports continue to face new challenges with daily traffic and processing. Technologies such as IoT can improve warehouse logistics, inventory management and similar operations, and help automate loading, dispatching and transporting goods. In smart ports, parking spaces could be optimised and traffic streamlined by making more efficient use of limited space. Sensors, cameras, drones and other technologies can automatically collect and share information such as weather, traffic and pollution data for port owners and customers. Optimizing workflow could double capacity without needing additional space or having to invest in new infrastructu
https://en.wikipedia.org/wiki/Yau%27s%20conjecture%20on%20the%20first%20eigenvalue
In mathematics, Yau's conjecture on the first eigenvalue is, as of 2018, an unsolved conjecture proposed by Shing-Tung Yau in 1982. It asks: Is it true that the first eigenvalue for the Laplace–Beltrami operator on an embedded minimal hypersurface of S^(n+1) is n? If true, it would imply that the area of embedded minimal hypersurfaces in S^3 has an upper bound depending only on the genus. Some possible reformulations are as follows: The first eigenvalue of every closed embedded minimal hypersurface in the unit sphere S^(n+1)(1) is n. The first eigenvalue of an embedded compact minimal hypersurface of the standard (n + 1)-sphere with sectional curvature 1 is n. If S^(n+1) is the unit (n + 1)-sphere with its standard round metric, then the first Laplacian eigenvalue on a closed embedded minimal hypersurface is n. Yau's conjecture has been verified in several special cases, but is still open in general. Shiing-Shen Chern conjectured that a closed, minimally immersed hypersurface in S^(n+1)(1), whose second fundamental form has constant length, is isoparametric. If true, it would have established Yau's conjecture for minimal hypersurfaces whose second fundamental form has constant length. A possible generalization of Yau's conjecture: Let M be a closed minimal submanifold of dimension n in the unit sphere. Is it true that the first eigenvalue of M is n? Further reading (Problem 100) Differential geometry Conjectures Unsolved problems in mathematics
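The conjectured value is exactly what the equator itself achieves: on the totally geodesic equator S^n ⊂ S^(n+1), the restrictions of the ambient coordinate functions are first eigenfunctions. This is a standard computation, shown here for orientation (with the sign convention that makes the Laplacian non-negative):

```latex
\Delta_{S^n} x_i = n\, x_i \quad (i = 1, \dots, n+1)
\;\;\Longrightarrow\;\; \lambda_1(S^n) = n .
```

The conjecture asserts that every closed embedded minimal hypersurface of the sphere, not just the equator, attains this value.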
https://en.wikipedia.org/wiki/Doubtnut
Doubtnut is an Indian educational app started by Tanushree Nagori and Aditya Shankar. Doubtnut is available as an Android application on the Play Store or by accessing its official website. Operation The platform uses image recognition technologies to provide answers and explanations for mathematics and science questions from high school to college level. To find the solution to a question, one uploads an image depicting the question. The app extracts text from the image and tries to match it in its database of pre-answered questions, which have recorded video solutions. If it finds a match, the result is provided to the user; otherwise it asks the user to post the question publicly for tutors available on the platform, who provide a video explaining the query. Funding Doubtnut received its first funding of US$100,000 from Ankit Nagori of Cure.fit in April 2017. In 2019, Omidyar Network and WaterBridge invested about $470,000. In January 2020, the company raised $15 million in a Series A round of funding from Tencent Holdings and existing investors Sequoia Capital and Omidyar Network India. See also Byju's Vedantu Unacademy References Science websites Organisations based in Gurgaon Indian educational websites Educational math software Online tutoring Education companies of India Distance education institutions based in India E-learning in India Educational technology companies of India 2008 establishments in Haryana Indian companies established in 2008
https://en.wikipedia.org/wiki/Signal%20%28model%20checking%29
In model checking, a subfield of computer science, a signal or timed state sequence is an extension of the notion of words in a formal language, in which letters are continuously emitted. While a word is traditionally defined as a function from a set of non-negative integers to letters, a signal is a function from a set of real numbers to letters. This allows the use of formalisms similar to those of automata theory to deal with continuous signals. Example Consider an elevator. What is formally called a letter could in fact be information such as "someone is pressing the button on the 2nd floor", or "the doors are currently open on the third floor". In this case, a signal indicates, at each time, the current state of the elevator and its buttons. The signal can then be analyzed using formal methods to check whether a property such as "each time the elevator is called, it arrives in less than three minutes, assuming that no one held the door for more than fifteen seconds" holds. A statement such as this one is usually expressed in metric temporal logic, an extension of linear temporal logic that allows the expression of time constraints. A signal may be passed to a model, such as a signal automaton, which decides, given the letters or actions that already occurred, what the next action to perform should be; in our example, to which floor the elevator must go. Then a program may test this signal and check the above-mentioned property. That is, it will try to generate a signal in which the door is never held open for more than fifteen seconds, and in which a user must wait more than three minutes after calling the elevator. Definition Given an alphabet A, a signal is a sequence (I_0, a_0), (I_1, a_1), …, finite or infinite, such that each a_i belongs to A, the intervals I_i are pairwise disjoint, and their union is also an interval. Given t in I_i for some i, the value of the signal at time t is a_i. Properties Some authors restrict the kind of signals they consider. We list here some standard properties that a signal may or may
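The definition above can be illustrated with a minimal sketch (names and representation are my own, not from any model-checking library): a finite signal stored as contiguous half-open intervals, each carrying a letter, so that the union of the intervals is itself an interval.

```python
# A finite signal over an alphabet, represented as (start, end, letter)
# triples whose half-open intervals [start, end) are pairwise disjoint
# and contiguous, so their union is a single interval.

def make_signal(pieces):
    """Validate that the pieces form a well-formed signal and return them."""
    for s, e, _ in pieces:
        assert s < e, "each interval must be nonempty"
    for (_, e1, _), (s2, _, _) in zip(pieces, pieces[1:]):
        assert e1 == s2, "intervals must be disjoint and contiguous"
    return pieces

def value_at(signal, t):
    """Return the letter emitted at time t, or None if t is outside the signal."""
    for start, end, letter in signal:
        if start <= t < end:
            return letter
    return None

# Elevator example: doors open on floor 3, then closed, then open on floor 2.
sig = make_signal([(0.0, 1.5, "open@3"), (1.5, 4.0, "closed"), (4.0, 5.0, "open@2")])
print(value_at(sig, 2.0))  # closed
```

Real formalisms also allow infinite signals and open or closed interval endpoints; this sketch keeps only the finite, half-open case for clarity.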
https://en.wikipedia.org/wiki/KARMA%20attack
In information security, KARMA is an attack that exploits a behaviour of some Wi-Fi devices, combined with the lack of access point authentication in numerous WiFi protocols. It is a variant of the evil twin attack. Details of the attack were first published in 2004 by Dino dai Zovi and Shaun Macaulay. Vulnerable client devices broadcast a "preferred network list" (PNL), which contains the SSIDs of access points to which they have previously connected and are willing to automatically reconnect without user intervention. These broadcasts are not encrypted and hence may be received by any WiFi access point in range. The KARMA attack consists in an access point receiving this list and then giving itself an SSID from the PNL, thus becoming an evil twin of an access point already trusted by the client. Once that has been done, if the client receives the malicious access point's signal more strongly than that of the genuine access point (for example, if the genuine access point is nowhere nearby), and if the client does not attempt to authenticate the access point, then the attack should succeed. If the attack succeeds, then the malicious access point becomes a man in the middle (MITM), which positions it to deploy other attacks against the victim device. What distinguishes KARMA from a plain evil twin attack is the use of the PNL, which allows the attacker to know, rather than simply to guess, which SSIDs (if any) the client will automatically attempt to connect to. See also Wireless security References Wi-Fi Computer security exploits Computer-related introductions in 2004
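The client behaviour that KARMA exploits can be shown with a toy simulation (purely illustrative, no real Wi-Fi involved; all names are invented): a device auto-joins the strongest visible access point whose SSID appears in its preferred network list, without authenticating the access point itself.

```python
# Toy model of auto-reconnect: the client picks the strongest visible AP
# whose SSID is in its preferred network list (PNL), trusting SSID alone.

def auto_join(pnl, visible_aps):
    """visible_aps: list of (ssid, signal_strength_dBm, owner) tuples.
    Return the AP the client would join, or None if no SSID matches."""
    candidates = [ap for ap in visible_aps if ap[0] in pnl]
    return max(candidates, key=lambda ap: ap[1], default=None)

pnl = ["HomeWiFi", "CoffeeShop"]          # broadcast in the client's probe requests
# The attacker overheard the PNL and advertises "HomeWiFi" nearby,
# while the genuine "HomeWiFi" AP is out of range.
visible = [("CoffeeShop", -70, "genuine"),
           ("HomeWiFi",  -40, "attacker")]
print(auto_join(pnl, visible))  # ('HomeWiFi', -40, 'attacker')
```

Because the SSID is the only thing matched, the stronger attacker-controlled AP wins the comparison, which is the core of the attack described above.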
https://en.wikipedia.org/wiki/33344-33434%20tiling
In geometry of the Euclidean plane, a 33344-33434 tiling is one of two of the 20 2-uniform tilings of the Euclidean plane by regular polygons. They contain regular triangle and square faces, arranged in two vertex configurations: 3.3.3.4.4 and 3.3.4.3.4. The first has triangles in groups of 3 and squares in groups of 1 and 2. It has 4 types of faces and 5 types of edges. The second has triangles in groups of 4, and squares in groups of 2. It has 3 types of faces and 6 types of edges. Geometry Its two vertex configurations are shared with two 1-uniform tilings: Circle Packings These 2-uniform tilings can be used as circle packings. In the first 2-uniform tiling (whose dual resembles a key-lock pattern): cyan circles are in contact with 5 other circles (3 cyan, 2 pink), corresponding to the V33.42 planigon, and pink circles are also in contact with 5 other circles (4 cyan, 1 pink), corresponding to the V32.4.3.4 planigon. It is homeomorphic to the ambo operation on the tiling, with the cyan and pink gap polygons corresponding to the cyan and pink circles (mini-vertex configuration polygons; one dimensional duals to the respective planigons). Both images coincide. In the second 2-uniform tiling (whose dual resembles jagged streams of water): cyan circles are in contact with 5 other circles (2 cyan, 3 pink), corresponding to the V33.42 planigon, and pink circles are also in contact with 5 other circles (3 cyan, 2 pink), corresponding to the V32.4.3.4 planigon. It is homeomorphic to the ambo operation on the tiling, with the cyan and pink gap polygons corresponding to the cyan and pink circles (mini-vertex configuration polygons; one dimensional duals to the respective planigons). Both images coincide. Dual tilings The dual tilings have right triangle and kite faces, defined by face configurations: V3.3.3.4.4 and V3.3.4.3.4, and can be seen as combining the prismatic pentagonal and Cairo pentagonal tilings. Notes References Keith Critchlow, Order in Space: A
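A quick check (my own illustration, not from the article) confirms that both vertex configurations close up: the interior angles of the regular polygons meeting at each vertex must sum to 360 degrees.

```python
# Verify that 3.3.3.4.4 and 3.3.4.3.4 are valid vertex configurations:
# the interior angles of the polygons around a vertex must total 360.

def interior_angle(n_sides):
    """Interior angle, in degrees, of a regular polygon with n_sides sides."""
    return 180.0 * (n_sides - 2) / n_sides

for config in ([3, 3, 3, 4, 4], [3, 3, 4, 3, 4]):
    total = sum(interior_angle(n) for n in config)
    print(config, total)   # both sum to 360.0
```

Both configurations use the same multiset of polygons (three triangles at 60 degrees, two squares at 90 degrees); they differ only in the cyclic order around the vertex.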
https://en.wikipedia.org/wiki/Kerstin%20Nordstrom
Kerstin N. Nordstrom is an American physicist who is the Clare Boothe Luce Assistant Professor of Physics in the Department of Physics at Mount Holyoke College. Her research focuses on soft matter physics; her work has been featured in the LA Times and in the BBC News. Early life and education Nordstrom completed a bachelor's degree in physics and mathematics at Bryn Mawr College in 2004. She joined the University of Pennsylvania as a graduate student, earning a Master of Science in 2006 and a PhD in 2010. Her doctoral thesis focused on the "Jamming and flow of soft particle suspensions." In 2011, Nordstrom joined the University of Maryland, College Park as a postdoctoral researcher. At the University of Maryland, Nordstrom worked on several topics, including how beds of granular materials respond to impact and how razor clams burrow in sand. Research and career In 2014, Nordstrom joined Mount Holyoke College as an Assistant Professor. She is interested in complex fluid flows, including the systems of solid particles found in granular materials. Awards and honors 2012 AAAS Mass Media Fellow 2018 Cottrell Scholar Award 2019 National Science Foundation CAREER Award External media In 2016, Nordstrom appeared on Jeopardy!. References Living people Bryn Mawr College alumni University of Pennsylvania alumni Mount Holyoke College faculty Year of birth missing (living people) American women physicists Bionics
https://en.wikipedia.org/wiki/Anirudh%20Devgan
Anirudh Devgan (born September 15, 1969) is an Indian-American computer scientist and CEO, known for his contributions to electronic design automation, specifically circuit simulation, physical design and signoff, statistical design and optimization, and verification and hardware platforms. He was elected a fellow of the Institute of Electrical and Electronics Engineers with the class of 2007. Devgan serves as President and CEO of Cadence Design Systems. He also serves on the boards of the Global Semiconductor Alliance and the ESD Alliance. Education IIT Delhi, Bachelor of Technology degree in Electrical Engineering (1986-1990) Carnegie Mellon University, Master's (1990-1991) and PhD (1991-1993) in Electrical and Computer Engineering Stanford University, Graduate School of Business (since 2018) Career Devgan began his career at International Business Machines Corporation (IBM), where he spent 12 years in management and research at the IBM Thomas J. Watson Research Center, IBM Server Division, IBM Microelectronics Division, and IBM Austin Research Lab. He later went to Magma Design Automation, where he served as Corporate Vice President and General Manager, Custom Design Business Unit. In 2012, Devgan joined Cadence Design Systems, serving in a number of senior leadership roles before being named President in 2017. In 2021, Devgan joined the board of directors, and later assumed the role of CEO, taking over from Lip-Bu Tan. Awards and honors Award-winning scholarly work Devgan's scholarly work on statistical analysis and optimization of integrated circuits was published in peer-reviewed proceedings of international conferences. Two such publications were recognized with awards from the top conferences in the field of electronic design automation, sponsored by the Institute of Electrical and Electronics Engineers and the Association for Computing Machinery. In 2003, Devgan shared the IEEE William J. McCalla Best Paper Award at the ACM/IEEE International Conference o
https://en.wikipedia.org/wiki/The%20Logical%20Structure%20of%20Linguistic%20Theory
The Logical Structure of Linguistic Theory or LSLT is a major work in linguistics by American linguist Noam Chomsky. It was written in 1955 and published in 1975. In 1955, Chomsky submitted a part of this book as his PhD thesis titled Transformational Analysis, setting out his ideas on transformational grammar; he was awarded a Ph.D. for it, and it was privately distributed among specialists on microfilm. Chomsky offered the manuscript of LSLT for publication, but MIT's Technology Press refused to publish it. It was published by Springer in 1975. References External links The Logical Structure of Linguistic Theory preview in Google Books Books by Noam Chomsky Cognitive science literature 1975 non-fiction books Syntax books Logic books Theses
https://en.wikipedia.org/wiki/Stardust%20%281987%20video%20game%29
Stardust is a top-scrolling shoot 'em up developed by Spanish studio Topo Soft and released in the UK by Kixx in 1987 for the ZX Spectrum. The full version was included on a Sinclair User covertape in 1991. It was also released for Amstrad CPC, DOS, and MSX. The introductory screens included music composed by Pablo Toledo. The same music was later re-used for the game Bronx. Gameplay The player controls a small space ship called an "Astrohunter", which flies over the surface of a series of large enemy supercruisers on their way to attack Earth. The player must avoid or destroy various ground targets and free-flying drones in order to eventually reach an array of shield generators. The ship is equipped with a gun that can be improved by collecting power-ups, and a second weapon that targets objects on the ground. When all the supercruisers have been passed, the Astrohunter lands in an enemy starship and its pilot continues on foot to reach the shield generators. After destroying the generators the pilot must be returned to the ship to escape. References 1987 video games Amstrad CPC games DOS games MSX games Multiplayer and single-player video games Shoot 'em ups Topo Soft games U.S. Gold games Video games developed in Spain ZX Spectrum games
https://en.wikipedia.org/wiki/Stenter
A stenter (sometimes called a tenter) is a machine used in textile finishing. It serves multiple purposes, including heat setting, drying, and applying various chemical treatments. This may be achieved through the use of certain attachments such as padding or coating. The machine works by holding the fabric's edges while it is fed from rollers, allowing it to advance gradually while maintaining its dimensions. Eventually, the stretched sheet is pulled off at a specific speed by a second set of rollers. At the delivery end, the edges are released by the stenter pins or clamps that were holding it. Etymology and History Stenter is derived from "tenter," which has its origins in the Latin word "tendere," meaning "to stretch," passing through an intermediate French stage. The primary purpose of this machine is to stretch and dry fabric. In the past, frames used for this purpose were called "tenter," and the metal hooks employed to hold the fabric to the frame were known as "tentering hooks." History Tenters were primarily utilized to process woolen fabric. During the cleaning process, after squeezing out excess water, crumpled woolen cloth needed to be straightened and dried under tension; otherwise, it would shrink. The wet cloth was stretched on a large wooden frame, referred to as a "tenter," and left to dry. To accomplish this, lengths of wet cloth were fastened to the tenter's perimeter using hooks (nails driven through the wood) all around the frame. This ensured that, as the cloth dried, it would maintain its shape and size. Initially, the tentering process was conducted in the open air when Higher Mill was constructed, with the tenter frames erected on the hillside to the east of the mill. However, toward the end of World War I, the process was brought indoors and utilized steam heating for drying. Over time, this technique evolved into the modern-day stenter machine. Function The process of drying textiles is known to consume a significant amount of en
https://en.wikipedia.org/wiki/Qiddiya
Qiddiya (, ) is an entertainment megaproject to be established in Riyadh. Construction started at the beginning of 2019. The project is one of the tourism megaprojects to be established in Saudi Arabia under the auspices of Saudi Vision 2030, which aims to diversify the income resources of the country and alleviate its reliance on oil. On 26 June 2019, the master plan for Qiddiya was revealed, composed of five primary projects. These projects include resorts, parks, and a city center. Phase one, slated to be completed in 2023, will feature Six Flags Qiddiya as a family attraction. The master plan is to create a project that offers visitors a wide variety of activities. Such designed experiences are planned to be provided in a way that takes into consideration the cultural, natural and professional aspects. Thus, the natural pattern of the site has been taken into consideration. By doing so, the proposed design will take visitors through a green belt network. Etymology Qiddiya is named after a region which forms a slit at Jabal Tuwaiq to establish a hajj road between Al-Yamama and Hejaz. The stated main idea behind the project is to establish a healthy, happy and engaging lifestyle place that is full of different opportunities and to encourage young ambitious Saudis who look forward to creating a prosperous Saudi Arabia. Location Qiddiya City, or the Qiddiya project, which is supported by the Public Investment Fund (PIF), will be located 40 km away from Riyadh's city center. It will have a number of recreation facilities including amusement parks, sport areas, car and bike riding paths, water parks, natural sceneries and cultural activities. The project will provide jobs to many Saudi men and women as it will be capable of hosting global sport competitions and a wide range of activities. By 2030 the project is expected to be the largest tourism destination worldwide, with a total area of 334 km². Phase One The first phase of the project is planned to be compl
https://en.wikipedia.org/wiki/Metric%20interval%20temporal%20logic
In model checking, Metric Interval Temporal Logic (MITL) is a fragment of Metric Temporal Logic (MTL). This fragment is often preferred to MTL because some problems that are undecidable for MTL become decidable for MITL. Definition A MITL formula is an MTL formula such that each set of reals used in subscript is an interval which is not a singleton, and whose bounds are either natural numbers or infinite. Difference from MTL MTL can express a statement such as the sentence S: "P held exactly ten time units ago". This is impossible in MITL. Instead, MITL can say T: "P held between 9 and 10 time units ago". Since MITL can express T but not S, in a sense, MITL is a restriction of MTL which allows only less precise statements. Problems that MITL avoids One reason to want to avoid a statement such as S is that its truth value may change an arbitrary number of times in a single time unit. Indeed, the truth value of this statement may change as many times as the truth value of P changes, and P itself may change an arbitrary number of times in a single time unit. Let us now consider a system, such as a timed automaton or a signal automaton, which wants to know at each instant whether S holds or not. This system should recall everything that occurred in the last 10 time units. As seen above, this means that it must recall an arbitrarily large number of events. This cannot be implemented by a system with finite memory and clocks. Bounded variability One of the main advantages of MITL is that each operator has the bounded variability property. Example: Given the statement T defined above. Each time the truth value of T switches from false to true, it remains true for at least one time unit. Proof: At a time t where T becomes true, it means that: between 9 and 10 time units ago, P was true. Just before time t, P was false. Hence, P was true exactly 9 time units ago. It follows that, for each 0 ≤ ε ≤ 1, at time t + ε, P was true 9 + ε time units ago. Since 9 ≤ 9 + ε ≤ 10, at time t + ε, T holds.
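The statement T can be evaluated directly over a signal, which makes the bounded-variability behaviour concrete. The sketch below is my own illustration (not a standard MITL tool): P is true exactly on a given list of closed intervals, and T holds at t whenever P held at some instant in the window [t - 10, t - 9].

```python
# Evaluate T: "P held between 9 and 10 time units ago" over a signal
# in which P is true exactly on the listed closed intervals.

def t_holds(p_intervals, t):
    """T holds at t iff P was true at some instant in [t - 10, t - 9].
    Two closed intervals [a, b] and [c, d] meet iff max(a, c) <= min(b, d)."""
    return any(max(a, t - 10) <= min(b, t - 9) for a, b in p_intervals)

p_true = [(0.0, 2.0)]          # P holds exactly on [0, 2]
print(t_holds(p_true, 9.5))    # True:  window [-0.5, 0.5] meets [0, 2]
print(t_holds(p_true, 12.5))   # False: window [2.5, 3.5] misses [0, 2]
```

With P true on [0, 2], T is true exactly on [9, 12]: an interval of length 3, never shorter than one time unit, consistent with the bounded variability property proved above.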
https://en.wikipedia.org/wiki/Dict.cc
dict.cc is a free, multilingual online dictionary. For offline use the dictionaries can be downloaded as text files and used in various programs on Windows, iOS, Android and Palm OS. Dict.cc GmbH has its main office in the Austrian capital city of Vienna. History The website was put on the internet in 2002 by the Austrian Paul Hemetsberger. His vision is to provide hundreds of languages within a few years. At the beginning of February 2005, Paul Hemetsberger allowed vocabulary lists to be downloaded under the conditions of the free GNU General Public License (GPL). Later in the year he changed to a proprietary license, which provides less freedom to the users. Despite this, he released all user-generated posts, which were created under the former GPL-licensed project, into the lists. Vocabulary structure Like some other online dictionaries, dict.cc allows users to suggest new entries. These new entries are not moderated (unlike in LEO, another online multilingual dictionary), but rather reviewed by the users through a multi-stage validation system. Languages The biggest vocabulary set is currently the German–English dictionary, from which the website developed. Since November 2009 additional languages have been available, with translations into German and English respectively. Nowadays, there are 50 further dictionaries; in order of verified vocabulary extent (descending): Swedish, Icelandic, Russian, Italian, French, Romanian, Portuguese, Hungarian, Latin, Dutch, Slovak, Spanish, Croatian, Bulgarian, Finnish, Norwegian, Czech, Danish, Turkish, Polish, Serbian, Greek, Esperanto, Bosnian and Albanian. The level of user interest in adding additional languages is used to develop a wish list, where interested users can register with their e-mail address as future translators or employees. Data The German–English dictionary, with over 1,180,600 translations (November 2018), is larger than the competing site LEO, and as of late 2018 was gro
https://en.wikipedia.org/wiki/Rose%20Funja
Rose Peter Funja (born 1981) is a Tanzanian software engineer and developer. She is a dean at the College of Science at the University of Bagamoyo, lecturing on ICT. She has been active since 2014 with one app, ATIZAVO, on Google Play. The name is a short form of Airtel, Tigo, Zantel and Vodacom. Early life and education Funja was born in 1981 and raised in Dar es Salaam. She attended the Kifungilo and Shaaban Robert schools and the University of Dar es Salaam, from which she received a Bachelor of Science in computer engineering in 2005, and a Master of Engineering degree in Communication and Information Systems from Huazhong University of Science and Technology in 2008. Career Funja worked at Huawei Technologies International as a Senior Product Manager in Shenzhen, China. She is experienced with wireless telecommunication technologies and pioneered projects in various countries and regions including China, Kenya, Rwanda, Madagascar, Zambia and Tanzania. Funja has been actively involved in Rotaract and served as the club president when she was a student at the University of Dar es Salaam in 2005. In 2012 Funja returned to Tanzania and engaged in academia by lecturing at the University of Bagamoyo in science courses. She quickly rose to the level of dean at the College of Science and Environmental Sciences. Funja is a Mandela Washington Fellow of President Obama's Young African Leaders Initiative (YALI), through which she attended a public management course at Syracuse University in New York. Funja has always seen information and communication as a solution to challenges facing the Tanzanian agricultural sector. With this focus she founded Agrinfo Company Limited. Agrinfo's initial idea was to serve as a search engine for agricultural information in Tanzania, and it has pivoted over the years to include technologies such as drones for agriculture, as Funja is a trained drone pilot. Funja is the current Managing Director of Agrinfo Company Limited. Community
https://en.wikipedia.org/wiki/Barbara%20Canright
Barbara Canright (née St. John; 1919–1997) was an American human computer at NASA's Jet Propulsion Laboratory (JPL), and the first female mathematician to be employed there. Canright joined the team in 1939 as a human computer, a role in which "teams of people were frequently used to undertake long and often tedious calculations; the work was divided so that this could be done in parallel." During her time at the JPL program she was instrumental in calculating both the thrust-to-weight ratio for the performance of engines under various conditions, and the potential of rocket propellant (which would be used by the U.S. Navy). Canright was critical in the development of the JPL program and laid the foundations for other women to work in a field which was previously closed off to them. Pre-JPL Program Canright was born on November 12, 1919, in Iowa. Canright was an exceptionally smart student in high school and took upper-level classes, most notably in mathematics and chemistry. She met her future husband, Richard Canright, when they were both undergraduates at Miami University of Ohio, where her father was a professor and dean of students. After eloping when she was 19 and he was 21, they moved to Pasadena for Richard to attend graduate school at Caltech. Before joining JPL, Canright worked as a typist while attending Occidental College in Los Angeles, from which she graduated in 1940. Career In 1939 the National Academy of Sciences approached Caltech's Guggenheim Aeronautical Laboratory (known around Caltech's campus as the "Suicide Squad" because of numerous experiments gone wrong) with a grant for rocket research of $1,000. The next year, the U.S. government invested $10,000 into the program, which prompted the group to hire outside help. One of the men who started JPL, Frank Malina, approached Barbara and Richard Canright for positions as mathematicians. Both Barbara and Richard accepte
https://en.wikipedia.org/wiki/Nuclear%20fallout%20effects%20on%20an%20ecosystem
This article uses Chernobyl as a case study of nuclear fallout effects on an ecosystem. Chernobyl Officials used hydrometeorological data to create an image of what the potential nuclear fallout looked like after the Chernobyl disaster in 1986. Using this method, they were able to determine the distribution of radionuclides in the surrounding area, and discovered emissions from the nuclear reactor itself. These emissions included fuel particles, radioactive gases, and aerosol particles. The fuel particles were due to the violent interaction between hot fuel and the cooling water in the reactor, and attached to these particles were cerium, zirconium, lanthanum, and strontium. All of these elements have low volatility, meaning they prefer to stay in a liquid or solid state rather than condensing into the atmosphere and existing as vapor. Cerium and lanthanum can cause irreversible damage to marine life by deteriorating cell membranes, affecting reproductive capability, and crippling the nervous system. Strontium in its non-radioactive form is stable and harmless; however, when the radioactive isotope Sr-90 is released into the atmosphere it can lead to anemia, cancers, and oxygen shortages. The aerosol particles had traces of tellurium, a toxic element which can create issues in developing fetuses, along with caesium, which is an unstable, incredibly reactive, and toxic element. Also found in the aerosol particles was enriched uranium-235. The most prevalent radioactive gas detected was radon, a noble gas that has no odor, no color, and no taste, and can travel into the atmosphere or bodies of water. Radon is also directly linked to lung cancer, and is the second leading cause of lung cancer in the general population. All of these elements deteriorate only through radioactive decay, at a rate characterized by their half-life. Half-lives of the nuclides previously discussed can range from mere hours to decades. The shortest half-life for the previous ele
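The role of a half-life can be made concrete with a short calculation (my own illustration, not from the article): the fraction of a radionuclide remaining after time t follows N(t) = N0 * (1/2)^(t / t_half).

```python
# Fraction of a radionuclide remaining after time t, given its half-life.

def remaining_fraction(t, t_half):
    """Both arguments in the same unit of time (e.g. years)."""
    return 0.5 ** (t / t_half)

# Caesium-137, a major Chernobyl contaminant, has a half-life of about 30.17 years.
print(round(remaining_fraction(30.17, 30.17), 3))   # 0.5
print(round(remaining_fraction(100.0, 30.17), 3))   # about a tenth remains after a century
```

This is why nuclides with half-lives of hours disappear quickly while those with half-lives of decades dominate long-term contamination.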
https://en.wikipedia.org/wiki/Macie%20Roberts
Macie "Bobby" Roberts is a former supervisor at NASA's Jet Propulsion Laboratory (JPL). She was the supervisor for a group of women nicknamed "computers" during the 1960s. Roberts paved the way for the next generation of female supervisors and computers. The team that she led had a hand in almost every project at NASA before the development of electronic computers. Early working life Prior to working at JPL, Roberts worked as an auditor for the Internal Revenue Service (IRS). Career As JPL expanded from rocket technology to missile technology, the lab's director, Frank Malina, promoted long-time computer Macie Roberts to supervisor of the expanding division of female computers. Macie "Bobby" Roberts was the original supervisor of the human computers at NASA, later dubbed the rocket women. She believed that it would be too difficult to work with men, so she created a culture of all women that outlasted her time at JPL. Roberts's job was not limited to calculations: she hired and trained new employees for over thirty years after being promoted. Her successor, Helen Ling, continued the tradition of hiring only women. The women on Roberts's team performed trajectory calculations for all space flights before the advent of the desktop computer. One of the techniques that Roberts employed to find women for the job was to list the job as "not requiring a degree." This was code meant for women to know that the job was open to all women. Legacy Her legacy was carried on by the supervisors that followed her. Many women who were just as smart as male engineers could not find work, but they were hired on as rocket women at JPL. These women formed a sisterhood that has lived on throughout the lifetime of NASA. Famous quotes "You have to look like a girl, act like a lady, think like a man, and work like a dog." References Human computers Living people NASA people Year of birth missing (living people)
https://en.wikipedia.org/wiki/COMOS
COMOS is a plant engineering software from Siemens. The applications for this software are in the process industries for the engineering, operation, and maintenance of process plants as well as their asset management. History The COMOS (acronym for Component Object Server) software system was originally developed and sold by the Logotec Software GmbH, then by the innotec GmbH (founded in 1991 with headquarters in Schwelm, Germany). The first version hit the market in 1996. In 2008, the innotec GmbH was taken over by the Siemens Corporation COMOS is developed further and marketed by a subsidiary, the Siemens Industry Software GmbH. The current status is COMOS Generation 10. Product characteristics Originally, COMOS was developed as an integrated CAE system for engineering in plant construction: all process engineering trades and the involved disciplines of the Electrical, Instrumentation & Control system engineering should be able to work together seamlessly on one system platform. The system incorporates the characteristics of object orientation, central data administration, and open system architecture. Interfaces enable the integration into existing IT infrastructures or cooperation with supplementary software systems. The COMOS software system is based on a central data platform and includes applications that can be combined. They help with the engineering and set-up, operation, and shut-down of industrial plants. Applications The software is used by plant developers (e.g. EPC) to plan process plants (chemical, energy, water / waste water, pharmaceuticals, oil, natural gas, food, etc.). It is also used by plant owner/operators in the mentioned industries, since COMOS not only supports engineering but also operational processes. There are regular user conferences. Its architecture makes COMOS suitable for engineering: it can manage large quantities of data and provide it on an integrated basis. Siemens cooperates in the standardization of export and import
https://en.wikipedia.org/wiki/Elizabeth%20W.%20Jones%20Award%20for%20Excellence%20in%20Education
The Elizabeth W. Jones Award for Excellence in Education is awarded annually by the Genetics Society of America to recognize individuals who have made noteworthy contributions to genetics education. It was founded in 2007 as the Genetics Society of America Award for Excellence in Education. Its first recipient was Elizabeth W. Jones, after whom the award was renamed following her death in 2008. Recipients Source: Genetics Society of America 2007: Elizabeth W. Jones 2008: R. Scott Hawley 2009: Sarah Elgin 2010: Utpal Banerjee 2011: Peter J. Bruns 2012: David A. Micklos 2013: A. Malcolm Campbell 2014: Robin Wright 2015: Louisa A. Stark 2016: William Wood 2017: Sally G. Hoskins 2018: Steven A. Farber, Carnegie Institution for Science & Jamie Shuda 2019: Bruce Weir 2020: Seth Bordenstein, Vanderbilt University 2021: Edward J. Smith, Virginia Tech 2022: Alana O’Reilly and Dara Ruiz-Whalen, Fox Chase Cancer Center and eCLOSE Institute See also List of genetics awards References Genetics awards Awards established in 2007 Genetics education American education awards
https://en.wikipedia.org/wiki/EasyEffects
EasyEffects (formerly known as PulseEffects) is a free and open-source GTK application for Unix-like systems which provides a large array of audio effects and filters to apply to input and output audio streams. The application originally used the PulseAudio sound server, as it allowed effects to be added to audio streams with ease; however, it now runs exclusively on the PipeWire sound server after a port in 2021. It is published under the GPL-3.0-or-later license. Overview EasyEffects uses PipeWire to process incoming and outgoing audio streams independently and can apply various sound effects in the form of plug-ins made by different developer teams such as Calf Studio Gear, MDA.LV2 and GStreamer. All plugins have their own presets and can be applied inside the suite rather than having to use a different mixer or execute a script from the command line. Available output effects are limiter, autovolume, dynamic range compressor, filter, 30-band parametric equalizer, bass enhancer, exciter, reverberation, crossfeed, delay, maximizer and spectrum analyzer. Available input effects are WebRTC, limiter, compressor, filter, equalizer, de-esser, reverberation, pitch shift and spectrum analyzer. References External links EasyEffects on GitHub Audio software Audio software for Linux Audio software that uses GTK Free audio software Linux audio video-related software Pitch modification software Software that uses GStreamer Software using the GPL license
https://en.wikipedia.org/wiki/Neurotubule
Neurotubules are microtubules found in neurons in nervous tissues. Along with neurofilaments and microfilaments, they form the cytoskeleton of neurons. Neurotubules are undivided hollow cylinders that are made up of tubulin protein polymers and are arrayed parallel to the plasma membrane in neurons. Neurotubules have an outer diameter of about 23 nm and an inner diameter, also known as the central core, of about 12 nm. The wall of the neurotubules is about 5 nm in width. There is a non-opaque clear zone surrounding the neurotubule, and it is about 40 nm in diameter. Like microtubules, neurotubules are highly dynamic, and their length can be adjusted by polymerization and depolymerization of tubulin. Despite having similar mechanical properties, neurotubules are distinct from microtubules found in other cell types with regard to their function and intracellular arrangement. Most neurotubules are not anchored in the microtubule organizing center (MTOC) as conventional microtubules are. Instead, they are released for transport into dendrites and axons after their nucleation in the centrosome. Therefore, both ends of the neurotubules terminate in the cytoplasm. Neurotubules are crucial in various cellular processes in neurons. Together with neurofilaments, they help to maintain the shape of a neuron and provide mechanical support. Neurotubules also aid the transportation of organelles, vesicles containing neurotransmitters, messenger RNA and other intracellular molecules inside a neuron. Structure and dynamics Composition Like microtubules, neurotubules are made up of protein polymers of α-tubulin and β-tubulin. α-tubulin and β-tubulin are globular proteins that are closely related. They join together to form a dimer, called tubulin. Neurotubules are generally assembled from 13 protofilaments which are polymerized from tubulin dimers. As a tubulin dimer consists of one α-tubulin and one β-tubulin, one end of the neurotubule is exposed with the α-tubulin a
https://en.wikipedia.org/wiki/HEAAN
HEAAN (Homomorphic Encryption for Arithmetic of Approximate Numbers) is an open source homomorphic encryption (HE) library which implements an approximate HE scheme proposed by Cheon, Kim, Kim and Song (CKKS). The first version of HEAAN was published on GitHub on 15 May 2016, and later a new version of HEAAN with a bootstrapping algorithm was released. Currently, the latest version is Version 2.1. CKKS plaintext space Unlike other HE schemes, the CKKS scheme supports approximate arithmetic over complex numbers (hence, real numbers). More precisely, the plaintext space of the CKKS scheme is for some power-of-two integer . To deal with the complex plaintext vector efficiently, Cheon et al. proposed plaintext encoding/decoding methods which exploit a ring isomorphism . Encoding method Given a plaintext vector and a scaling factor , the plaintext vector is encoded as a polynomial by computing where denotes the coefficient-wise rounding function. Decoding method Given a message polynomial and a scaling factor , the message polynomial is decoded to a complex vector by computing . Here the scaling factor enables us to control the encoding/decoding error introduced by the rounding process. Namely, one can obtain the approximate equation by controlling where and denote the encoding and decoding algorithm, respectively. From the ring-isomorphic property of the mapping , for and , the following hold: , , where denotes the Hadamard product of same-length vectors. These properties guarantee the approximate correctness of the computations in the encoded state when the scaling factor is chosen appropriately. Algorithms The CKKS scheme basically consists of the following algorithms: key generation, encryption, decryption, homomorphic addition and multiplication, and rescaling. For a positive integer , let be the quotient ring of modulo . Let , and be distributions over which output polynomials with small coefficients. These distributions, the i
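The role of the scaling factor can be illustrated with a toy sketch in Python. This is not the real CKKS encoder (which maps vectors through the ring isomorphism into polynomial coefficients); it only demonstrates how scaling by a factor delta before rounding bounds the encode/decode error. All names and numbers here are invented for illustration:

```python
# Toy illustration of scaling-factor rounding (NOT the actual CKKS encoder):
# a larger delta leaves a smaller rounding error after decoding.

def encode(vec, delta):
    """Scale each complex entry by delta and round to Gaussian integers."""
    return [complex(round(z.real * delta), round(z.imag * delta)) for z in vec]

def decode(coeffs, delta):
    """Undo the scaling; rounding error is at most ~1/(2*delta) per part."""
    return [z / delta for z in coeffs]

msg = [0.12345 + 0.6789j, -1.5 + 0.25j]
for delta in (2 ** 10, 2 ** 20):
    approx = decode(encode(msg, delta), delta)
    err = max(abs(a - b) for a, b in zip(msg, approx))
    print(delta, err)
```

In the real scheme a larger scaling factor buys precision at the cost of consuming more of the ciphertext modulus, which is why the rescaling operation listed above is applied after each homomorphic multiplication.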
https://en.wikipedia.org/wiki/Donegall%20Lectureship%20at%20Trinity%20College%20Dublin
The Donegall Lectureship at Trinity College Dublin is one of two endowed mathematics positions at Trinity College Dublin (TCD), the other being the Erasmus Smith's Chair of Mathematics. The Donegall (sometimes spelt Donegal) Lectureship was endowed in 1668 by The 3rd Earl of Donegall. In 1675, after the Restoration, it was combined with the previous public Professor in Mathematics position that had been created in 1652 by the Commonwealth parliament. For much of its history, the Donegall Lectureship was awarded to a mathematician as an additional honour which came with a supplementary income. Since 1967, the lectureship has been awarded to a leading international scientist who visits the Department of Pure and Applied Mathematics and gives talks, including a public lecture called the Donegall Lecture. List of Donegall Lecturers 1675–1685: Miles Symner (1610?–1686) 1685–1692: St. George Ashe (1657–1718) 1692–1694: Charles Willoughby (1630?–1694) 1694–1696: Edward Smyth (1665–1720) 1696–1723: Claudius Gilbert (1670–1743) 1723–1730: Richard Helsham (1682–1738) 1730–1731: Charles Stuart (circa 1698–1746) 1731–1734: Lambert Hughes (1698–1771) 1734–1735: Robert Shawe (1699?–1752) 1735–1738: Caleb Cartwright (1696?–1763) 1738–1747: John Pellisier (1703–1781) 1747–1750: John Whittingham (1712–1778) 1750–1759: William Clement (1707–1782) 1759–1760: Theaker Wilder (1717–1777) 1760–1762: John Stokes (1720?–1781) 1762–1764: Richard Murray (1725?–1799) 1764–1769: Henry Joseph Dabzac (1737–1790) 1769–1770: Henry Ussher (1741–1790) 1770–1782: Gerald Fitzgerald (1739?–1819) 1782–1786: Matthew Young (1750–1800) 1786–1790: Digby Marsh (1750?–1791) 1790–1795: Thomas Elrington (1760–1835) 1795–1800: Whitley Stokes (1763–1845) 1800–1807: Robert Phipps (1765?–1844) 1807–1820: James Wilson (1774?–1829) 1820–1827: Richard MacDonnell (1787–1867) 1827–1832: Henry Harte (1790–1848) 1832–1847: Thomas Luby (1800–1870) 1847–1858: Andrew Hart (1811–
https://en.wikipedia.org/wiki/Erasmus%20Smith%27s%20Professor%20of%20Mathematics
The Erasmus Smith's Professor of Mathematics at Trinity College Dublin is one of two endowed mathematics positions at Trinity College Dublin (TCD), the other being the Donegall Lectureship at Trinity College Dublin. It was founded in 1762 and funded by the Erasmus Smith Trust, which was established by Erasmus Smith (1611–1691). Since 1851 the position has been funded by Trinity College. Some of the people listed here also held the Erasmus Smith's Chair of Natural and Experimental Philosophy for a period; that is another of the four named professorships honouring Smith's memory. List of the professors 1762–1764: John Stokes (1720–1781) 1764–1795: Richard Murray (1725?–1799) 1795–1799: Thomas Elrington (1760–1835) 1799–1800: George Hall (1753–1811) 1800–1813: William Magee (1766–1831) 1813–1822: Bartholomew Lloyd (1772–1837) 1822–1825: James Wilson (1774?–1829) 1825–1835: Franc Sadleir (1775–1851) 1835–1843: James MacCullagh (1809–1847) 1843–1862: Charles Graves (1812–1899) 1862–1879: Michael Roberts (1817–1882) 1879–1913: William Burnside (1839–1920) 1914–1917: Stephen Kelleher (1875–1917) 1917–1921: Robert Russell (1858?–1938) 1921–1926: (vacant) 1926–1943: Charles Rowe (1893–1943) 1944–1962: TS (Stan) Broderick (1893–1962) 1962–1964: Heini Halberstam (1926–2014) 1964–1966: Gabriel Dirac (1925–1984) 1966–1989: Brian Murdoch (1930–2020) 1989–2000: (vacant) 2000–2001: Paul Feehan (born 1961) 2001–2004: (vacant) 2004–2008: Adrian Constantin (born 1970) 2008– : (vacant) See also List of professorships at the University of Dublin References 1762 establishments in Ireland Mathematics, Smith's, Erasmus Mathematics, Smith's, Erasmus, Dublin, Trinity College
https://en.wikipedia.org/wiki/Cell%20cycle%20withdrawal
Cell cycle withdrawal refers to the natural halting of the cell cycle during cell division. When cells divide, many internal or external factors can lead to a stoppage of division. This stoppage can be permanent or temporary, and can occur in any one of the four cycle phases (G1, S, G2 and M), depending on the status of the cells or the activities they are undergoing. During the process, all cell duplication processes, including mitosis, meiosis and DNA replication, are paused. The mechanisms involve the proteins and DNA sequences inside cells. Permanent cell cycle withdrawal Permanent cell cycle withdrawal refers to the permanent cessation of cell division. In organisms, cells do not divide endlessly. Certain mechanisms prevent cells from dividing indefinitely, mostly through programmed failure of DNA synthesis. These mechanisms prevent cells from over-dividing. The process also enables cells to proceed to senescence, a later stage of cell life and growth. Mechanism Permanent cell cycle withdrawal is mainly brought about by the wearing away of DNA sequences during S phase, the DNA-replication stage of the cell cycle. This loss occurs in the end sequences of the linear chromosome, named telomeres. Telomeres are sequences of repetitive nucleotides which serve no coding function. During the replication process, the DNA replication enzymes are not able to copy the ending sequences at the telomere. Those sequences, located at the end of the telomere and chromosome, are hence lost gradually. Once all of these sequences have been worn away, the useful genetic information in the cell's chromosome would also begin to be lost. This prevents cells from dividing, withdrawing them from the cell division cycle. Telomeres therefore act as a buffer that lets cells continue dividing; when the telomeres are worn out, cells lose their ability to divide. Not all cells carry out
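The "buffer" role of telomeres described above can be caricatured in a few lines of Python. The numbers here are placeholders, not measured biological values; the point is only that a fixed loss per division exhausts a finite buffer after a bounded number of divisions:

```python
# Caricature of telomere shortening: a finite buffer shrinks by a fixed
# amount each division until division must stop. Numbers are illustrative.

def divisions_until_arrest(telomere_bp, loss_per_division_bp):
    """Count divisions until the telomere buffer is used up."""
    count = 0
    while telomere_bp > 0:
        telomere_bp -= loss_per_division_bp  # end sequences not copied
        count += 1
    return count

print(divisions_until_arrest(10_000, 100))  # 100 divisions, then arrest
```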
https://en.wikipedia.org/wiki/Hardware%20security%20bug
In digital computing, hardware security bugs are hardware bugs or flaws that create vulnerabilities affecting computer central processing units (CPUs), or other devices which incorporate programmable processors or logic and have direct memory access, and which allow data to be read by a rogue process when such reading is not authorized. Such vulnerabilities are considered "catastrophic" by security analysts. Speculative execution vulnerabilities Starting in 2017, a series of security vulnerabilities were found in the implementations of speculative execution on common processor architectures which effectively enabled an elevation of privileges. These include: Foreshadow Meltdown Microarchitectural Data Sampling Spectre SPOILER Pacman Intel VISA In 2019 researchers discovered that a manufacturer debugging mode, known as VISA, was an undocumented feature of Intel Platform Controller Hubs, the chipsets included on most Intel-based motherboards, which have direct memory access. The mode was found to be accessible on a normal motherboard, possibly leading to a security vulnerability. See also Hardware security Security bug Computer security Threat (computer) References Computer security exploits Hardware bugs Side-channel attacks 2018 in computing
https://en.wikipedia.org/wiki/Emma%20Haruka%20Iwao
Emma Haruka Iwao is a Japanese computer scientist and cloud developer advocate at Google. In 2019 Haruka Iwao calculated the then world-record most accurate value of pi (π), comprising 31.4 trillion digits and exceeding the previous record of 22 trillion. This record was surpassed in 2020 by Timothy Mullican, who calculated 50 trillion digits, but she reclaimed the record in 2022 with 100 trillion digits. She identifies as queer. Early life and education As a child, Iwao became interested in pi. She was inspired by Japanese mathematicians, including Yasumasa Kanada. She studied computer science at the University of Tsukuba, where she was taught by Daisuke Takahashi. She was awarded the Dean's Award for Excellence in 2008, before starting graduate studies in computing. Her master's dissertation considered high performance computer systems. After graduating, Iwao took on several software engineering positions, working on site reliability for Panasonic, GREE and Red Hat. Career Iwao joined Google as a Cloud Developer Advocate in 2015. She originally worked for Google in Tokyo, before moving to Seattle in 2019. Iwao offers training in the use of the Google Cloud Platform (GCP), as well as supporting application developers. She works to make cloud computing accessible for everyone, creating online demos and teaching materials. In March 2019 Iwao calculated the value of pi to 31,415,926,535,897 digits (equal to π × 10^13, rounded down), using 170 terabytes (TB) of data. The calculation was performed with a multithreaded program called y-cruncher, running on over 25 machines for 121 days. In March 2022 she extended the world record to 100 trillion digits of pi. See also Chronology of computation of π References Japanese women mathematicians Japanese women computer scientists World record holders Living people Year of birth missing (living people) University of Tsukuba alumni Japanese LGBT scientists Queer women Queer scientists Japanese emigrants to the United States Google employees Pi-related people
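Record attempts use the fast Chudnovsky algorithm as implemented in y-cruncher. As a small-scale illustration of how arbitrary-precision digits of pi can be computed at all, the following Python sketch uses the much older Machin formula with scaled integer arithmetic; the guard digits absorb truncation error from the floor divisions:

```python
# Machin's formula: pi = 16*arctan(1/5) - 4*arctan(1/239), evaluated with
# big integers scaled by a power of ten (NOT the Chudnovsky method used
# for record computations, which converges far faster).

def arctan_inv(x, digits):
    """arctan(1/x) scaled by 10**(digits + 10), via the Taylor series."""
    one = 10 ** (digits + 10)          # ten guard digits
    term = one // x
    total, n, sign = term, 1, 1
    while term:
        term //= x * x
        n += 2
        sign = -sign
        total += sign * (term // n)
    return total

def pi_digits(digits):
    """Integer whose decimal digits are 3 followed by `digits` digits of pi."""
    scaled = 16 * arctan_inv(5, digits) - 4 * arctan_inv(239, digits)
    return scaled // 10 ** 10          # drop the guard digits

print(pi_digits(30))
```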
https://en.wikipedia.org/wiki/Edinburgh%20Compatible%20Context%20Editor
ECCE (the Edinburgh Compatible Context Editor) is a text editor for computing systems and operating environments that support a command line interface. Its original command set is logical and regular. It was written in the 1960s by Hamish Dewar, an experienced compiler writer, who used this skill to design a command set which could be easily parsed and coded, allowing complex commands to be built up; this is a technique similar to threaded code in the Forth environment. The current ECCE release is licensed under the BSD License, recoded into C and released by Graham Toal in 2007. History In the early 1960s Hamish Dewar recognised a need for a more powerful text editor. At the time, editing files was laborious, as editors could only load one line of code into memory at a time and insert, delete or replace only a whole line. Because of memory limitations (a large computer might have between 8k and 32k of memory), few editors could execute repeated commands or support macros for text processing. Dewar used his talent as a compiler author to create ECCE with a much more capable command set while retaining a small footprint. From the start, ECCE would buffer as much of the file as memory allowed, while earlier editors could only buffer one line of the file at a time. ECCE became the default text editor for computers at the University of Edinburgh and remained almost unchanged for a period of almost 25 years. The editor's survival is attributed to the fact that thousands of undergraduates and postgraduates used the tool in their higher education, and wherever in the world they settled, the benefits of ECCE were promoted and local implementations created from Hamish Dewar's source code. ECCE became one of the most popular and well-respected text editors of the 1970s. ECCE was originally written in Imp, a language created at the University of Edinburgh; the second implementation was coded in PDP-8 assembler and was ported to numerous other platforms. Source
https://en.wikipedia.org/wiki/Creative%20Computing%20Benchmark
The Creative Computing Benchmark, also called Ahl's Simple Benchmark, is a computer benchmark that was used to compare the performance of the BASIC programming language on various machines. It was first introduced in the November 1983 issue of Creative Computing magazine with measures from a number of 8-bit computers that were popular at the time. Over a period of a few months, the list was greatly expanded to include practically every contemporary machine, topped by the Cray-1 supercomputer, which ran it in 0.01 seconds. The Creative Computing Benchmark was one of three common benchmarks of the era. Its primary competition in the early 1980s in the United States was the Byte Sieve of September 1981, while the earlier Rugg/Feldman benchmarks of June 1977 were not as well known in the United States but were widely used in the United Kingdom. History The benchmark first appeared in the November 1983 issue of Creative Computing under the title "Benchmark Comparison Test". In the article, author David H. Ahl was careful to state that it tested only a few aspects of the BASIC language, mostly its looping performance. He stated: The initial results were provided for common machines of the era, including the Apple II, Commodore 64 and the recently released IBM Personal Computer. Most of these machines ran some variation of the stock Microsoft BASIC and thus provided similar times on the order of two minutes, while the 16-bit PC was near the top of the list at only 24 seconds. The fastest machine in this initial suite was the Olivetti M20 at 13 seconds, and the slowest was Atari BASIC on the Atari 800 at 6 minutes 58 seconds. In the months following its publication, the magazine was inundated with results for other platforms. It became a regular feature for a time, placed prominently near the front of the magazine with an ever-growing list of results. By March the fastest machine on the list was the Cray-1 at 0.01 seconds, and the slowest was the TI SR-50 progra
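The original benchmark was a short BASIC program whose listing is not reproduced in this excerpt. Its spirit, timing a loop of simple arithmetic and accumulating the rounding error, can be sketched in Python as follows; the loop count and operations here are illustrative, not Ahl's actual program:

```python
import math
import time

# A loop-and-arithmetic micro-benchmark in the spirit of Ahl's test:
# time repeated square-root/square round trips and accumulate the
# floating-point error. (Illustrative only, not the BASIC listing.)

def simple_benchmark(n=100_000):
    start = time.perf_counter()
    error = 0.0
    for i in range(1, n + 1):
        a = math.sqrt(i)
        error += abs(a * a - i)   # how far sqrt(i)**2 drifts from i
    return time.perf_counter() - start, error

elapsed, err = simple_benchmark()
print(f"{elapsed:.3f}s, accumulated error {err:.3e}")
```

Like Ahl's test, this exercises loop overhead and floating-point accuracy together, which is why interpreters with slow loops or low-precision arithmetic stood out in the published tables.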
https://en.wikipedia.org/wiki/Microsoft%20Docs
Microsoft Learn, formerly known as Microsoft Docs, is the library of technical documentation for end users, developers, and IT professionals who work with Microsoft products. The Microsoft Learn website provides technical specifications, conceptual articles, tutorials, guides, training, API references, code samples and other information related to Microsoft software and web services. Microsoft Docs was introduced in 2016 as a replacement for the MSDN and TechNet libraries, which previously hosted some of these materials. In 2022, Microsoft Docs was made part of the Microsoft Learn site. Structure and features The content on Microsoft Learn is organised into groups based on product or technology and the steps of working with it: evaluating, getting started, planning, deploying, managing, and troubleshooting; the navigation panel and product/service pages show material breakdowns. The service allows users to download specific documentation sections as a PDF file for offline use and includes an estimated reading time for each article. Each article is represented as a Markdown file in various GitHub repositories, and most of the documentation content is open-sourced and accepts pull requests. Microsoft released a set of Visual Studio Code extensions, the Docs Authoring Pack, to assist in editing documentation content. It includes support for Docs-specific Markdown features. The Microsoft Docs preview was introduced in June 2016, initially containing .NET documentation. The process of migrating the bulk of the MSDN and TechNet libraries' content took approximately two years. See also Microsoft Developer Network Microsoft TechNet References External links Technical documentation - Microsoft Learn Docs Authoring Pack Microsoft Microsoft websites Software documentation
https://en.wikipedia.org/wiki/SM9%20%28cryptography%20standard%29
SM9 is a Chinese national cryptography standard for Identity Based Cryptography issued by the Chinese State Cryptographic Authority in March 2016. It is represented by the Chinese National Cryptography Standard (Guomi) GM/T 0044-2016 SM9. The standard contains the following components: (GM/T 0044.1) The Identity-Based Asymmetric Cryptography Algorithm (GM/T 0044.2) The Identity-Based Digital Signature Algorithm, which allows one entity to digitally sign a message which can be verified by another entity. (GM/T 0044.3) The Identity-Based Key Establishment and Key Wrapping (GM/T 0044.4) The Identity-Based Public-Key Encryption Key Encapsulation Algorithm, which allows one entity to securely send a symmetric key to another entity. Identity Based Cryptography Identity Based Cryptography is a type of public key cryptography that uses a widely known representation of an entity's identity (name, email address, phone number, etc.) as the entity's public key. This eliminates the need to have a separate public key bound by some mechanism (such as a digitally signed public key certificate) to the identity of an entity. In Identity Based Cryptography (IBC) the public key is often taken as the concatenation of an entity's identity and a validity period for the public key. In Identity Based Cryptography, one or more trusted agents use their private keys to compute an entity's private key from their public key (identity and validity period). The corresponding public keys of the trusted agent or agents are known to everyone using the network. If only one trusted agent is used, that trusted agent can compute all the private keys for users in the network. To avoid that situation, some researchers propose using multiple trusted agents in such a way that more than one of them need to be compromised in order to compute individual private keys. Chinese Cryptographic Standards The SM9 Standard adopted in 2016 is one of a number of Chinese national cryptography standards. Other pu
https://en.wikipedia.org/wiki/Butler%20matrix
A Butler matrix is a beamforming network used to feed a phased array of antenna elements. Its purpose is to control the direction of a beam, or beams, of radio transmission. It consists of an n × n matrix (n some power of two) with hybrid couplers and fixed-value phase shifters at the junctions. The device has n input ports (the beam ports) to which power is applied, and n output ports (the element ports) to which antenna elements are connected. The Butler matrix feeds power to the elements with a progressive phase difference between elements such that the beam of radio transmission is in the desired direction. The beam direction is controlled by switching power to the desired beam port. More than one beam, or even all of them, can be activated simultaneously. The concept was first proposed by Butler and Lowe in 1961. It is a development of the work of Blass in 1960. Its advantage over other methods of angular beamforming is the simplicity of the hardware. It requires far fewer phase shifters than other methods and can be implemented in microstrip on a low-cost printed circuit board. Antenna elements The antenna elements fed by a Butler matrix are typically horn antennae at the microwave frequencies at which Butler matrices are usually used. Horns have limited bandwidth, and more complex antennae may be used if more than an octave is required. The elements are commonly arranged in a linear array. A Butler matrix can also feed a circular array giving 360° coverage. A further application with a circular antenna array is to produce omnidirectional beams with orthogonal phase-modes so that multiple mobile stations can all simultaneously use the same frequency, each using a different phase-mode. A circular antenna array can be made to simultaneously produce an omnidirectional beam and multiple directional beams when fed through two Butler matrices back-to-back. Butler matrices can be used with both transmitters and receivers. Since they are passive and reciproc
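An ideal, lossless Butler matrix behaves like a discrete Fourier transform between beam ports and element ports: driving one beam port yields equal-amplitude element signals with a constant phase step from element to element, which is what steers the beam. The sketch below uses a plain DFT convention; real designs differ in port ordering and fixed phase offsets:

```python
import cmath

# Idealized Butler matrix as a DFT: unit power into beam port k splits
# equally across n element ports with a progressive phase 2*pi*k/n.
# Port/phase conventions here are the plain DFT, not any specific design.

def butler_outputs(n, beam_port):
    """Element-port signals when unit power drives one beam port."""
    scale = 1 / n ** 0.5   # equal power split across n elements
    return [scale * cmath.exp(2j * cmath.pi * beam_port * m / n)
            for m in range(n)]

out = butler_outputs(4, beam_port=1)
steps = [cmath.phase(out[m + 1] / out[m]) for m in range(3)]
print(steps)  # constant step of pi/2 (90 degrees) between adjacent elements
```

Switching to a different beam port changes the phase step, and hence the beam direction, without any variable phase shifters, which is the hardware simplicity noted above.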
https://en.wikipedia.org/wiki/Extinction%20symbol
The extinction symbol represents the threat of Holocene extinction on Earth; a circle represents the planet and a stylised hourglass is a warning that time is running out for many species. The symbol dates to at least 2012 and has been attributed to anonymous East London artist Goldfrog ESP. The symbol has been called "this generation's peace sign". It is used by environmental protesters, and has been incorporated in works by artists and designers such as Banksy. In 2019, the Victoria and Albert Museum acquired a digital copy of the symbol, and other artifacts featuring the symbol, for its permanent collection. Attribution In 2019, The Guardian reported that "Where the symbol has come from is something of a mystery". The Guardian noted that the most reliable attribution is to an anonymous East London artist known only as "ESP" or "Goldfrog ESP", who declines to be contacted directly except via his Extinction Symbol website, which has been supported by design media and the wider media. In 2019, the New Statesman reported that after ESP created the symbol in 2011, he contacted over 20 environmental groups to promote the symbol, with little success. However, in 2018, Extinction Rebellion (XR) contacted ESP regarding adoption and use of the symbol, and ESP clarified on his Extinction Symbol website that the symbol is freely available to those who wish to use it for non-commercial purposes. In May 2019, Gail Bradbrook of XR issued a public statement to clarify that: "Not only does XR not support or endorse any corporations, it reminds them that the Extinction Symbol ⧖⃝ may never be used for commercial purposes, including fundraising. The Extinction Symbol is loaned in good faith to XR by UK street artist ESP". In 2019, noted typeface blogger Jason Kottke remarked that the above licensing structure means that while individuals can create their own clothing and signs using the symbol, a non-profit organization could not raise funds, or even use economies of
https://en.wikipedia.org/wiki/EURO%20Journal%20on%20Decision%20Processes
The EURO Journal on Decision Processes (EJDP) is a peer-reviewed academic journal that was established in 2012 and originally published by Springer Science+Business Media. In 2021, publication of the journal moved to Elsevier. It is an official journal of the Association of European Operational Research Societies, publishing scientific knowledge on the theoretical, methodological, behavioural and organizational topics that contribute to the understanding and appropriate use of operational research in supporting different phases of decision-making processes. The editor-in-chief is Jutta Geldermann. Past editors-in-chief: Vincent Mousseau (2016–2021) and Ahti Salo (2012–2016). References External links Behavioural sciences Operations research English-language journals Academic journals established in 2012
https://en.wikipedia.org/wiki/Audio%20bus
In audio engineering, a bus (alternate spelling buss, plural busses) is a signal path that can be used to combine (sum) individual audio signal paths together. It is typically used to group several individual audio tracks, which can then be manipulated, as a group, like another track. This can be achieved by routing the signal physically by way of switches and cable patches on a mixing console, or by manipulating software features on a digital audio workstation (DAW). Using buses allows the engineer to work in a more efficient way and with better consistency, for instance, to apply sound processing effects and adjust levels for several tracks at a time, a workflow known as stem mixing and mastering. By grouping tracks together on a bus, engineers can apply audio effects and level adjustments to numerous tracks at once; this is especially helpful when working on complicated audio projects with many tracks, as it streamlines the workflow and helps maintain a consistent acoustic balance throughout the mix. See also Bus (computing) Intel High Definition Audio I²S Live sound mixing Software bus Sound recording and reproduction Stem mixing References Audio engineering
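Conceptually, a bus is just a sample-wise sum of its input tracks, after which a single gain (or effect) applies to the whole group. A minimal Python sketch, where the sample values and the 0.5 gain are arbitrary illustration numbers:

```python
# Minimal mixer-bus sketch: sum several tracks into one bus, then one
# gain change affects the whole group at once. Values are illustrative.

def sum_to_bus(tracks):
    """Sample-wise sum of equal-length tracks."""
    return [sum(samples) for samples in zip(*tracks)]

def apply_gain(bus, gain):
    """One level adjustment for everything routed to the bus."""
    return [s * gain for s in bus]

drums = [0.2, -0.1, 0.4]
guitar = [0.1, 0.3, -0.2]
bus = apply_gain(sum_to_bus([drums, guitar]), gain=0.5)
print([round(s, 3) for s in bus])  # [0.15, 0.1, 0.1]
```

Changing `gain` once here stands in for turning one bus fader instead of two track faders, which is exactly the consistency benefit described above.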
https://en.wikipedia.org/wiki/Mauro%20Martino
Mauro Martino is an Italian artist, designer and researcher. He is the founder and director of the Visual Artificial Intelligence Lab at IBM Research, and Professor of Practice at Northeastern University. Career He graduated from the Polytechnic University of Milan, and was a research affiliate with the Senseable City Lab at MIT. Martino was formerly an Assistant Research Professor at Northeastern University, working with Albert-Laszlo Barabasi at the Center for Complex Network Research, and with David Lazer and fellows at the Institute for Quantitative Social Science (IQSS) at Harvard University. His works have been published in "The Best American Infographics" in the 2015 and 2016 editions and have been shown at international festivals and exhibitions including Ars Electronica, the RIXC Art Science Festival, Global Exchange at Lincoln Center, TEDx Cambridge THRIVE, TEDx Riga, and the Serpentine Gallery. His work is in the permanent collection of the Ars Electronica Center. In 2017, Martino and his team received the National Science Foundation's award for Best Scientific Video for the project Network Earth. In 2019, Martino and Luca Stornaiuolo won the 2019 Webby People's Voice Award in the category NetArt for the project AI Portraits. The project 150 Years of Nature won multiple awards, including the Fast Company Innovation by Design Award for Best Data Design 2020, a Webby Award 2020, and a Webby People's Voice Award 2020. This project, along with other works created in collaboration with Barabási Lab (e.g., Wonder Net, A Century of Physics, Data Sculpture in Bronze, Control, Resilience, Success in Science, Fake News), was shown at the "Barabási Lab. Hidden Patterns" exhibitions at the ZKM Center for Art and Media and the Ludwig Museum - Museum of Contemporary Art, Budapest. Mauro Martino is a pioneer in the use of artificial neural networks in sculpture. Notable works Strolling Cities is an interactive AI art project created in collaboration with Politecnico di Milano and exhibited in the Ita
https://en.wikipedia.org/wiki/Neurosexism
Neurosexism is an alleged bias in the neuroscience of sex differences towards reinforcing harmful gender stereotypes. The term was coined by feminist scholar Cordelia Fine in a 2008 article and popularised by her 2010 book Delusions of Gender. The concept is now widely used by critics of the neuroscience of sex differences in neuroscience, neuroethics and philosophy. Definition Neuroscientist Gina Rippon defines neurosexism as follows: "'Neurosexism' is the practice of claiming that there are fixed differences between female and male brains, which can explain women's inferiority or unsuitability for certain roles." For example, "this includes things such as men being more logical and women being better at languages or nurturing." Fine and Rippon, along with Daphna Joel, state that "the point of critical enquiry is not to deny differences between the sexes, but to ensure a full understanding of the findings and meaning of any particular report." Many of the issues they discuss to support their position are "serious issues for all areas of behavioral research", but they argue that "in sex/gender differences research... they are often particularly acute." Nonetheless, the common factor influencing logical maturity between males and females is the maturity of the frontal cortex, which matures at the age of 25, at the earliest. The topic of neurosexism is thus closely tied to wider debates about scientific methodology, especially in the behavioral sciences. History The history of science contains many examples of scientists and philosophers drawing conclusions about the mental inferiority of women, or their lack of aptitude for certain tasks, on the basis of alleged anatomical differences between male and female brains. In the late 19th century, George J. Romanes used the difference in average brain weight between men and women to explain the "marked inferiority of intellectual power" of the latter. Absent a sexist background assumption about male superiority, there wo
https://en.wikipedia.org/wiki/Vineet%20Bafna
Vineet Bafna is an Indian bioinformatician and professor of computer science and director of the bioinformatics program at the University of California, San Diego. He was elected a Fellow of the International Society for Computational Biology (ISCB) in 2019 for outstanding contributions to the fields of computational biology and bioinformatics. He has also been a member of the Research in Computational Molecular Biology (RECOMB) conference steering committee. Career and research Bafna received his Ph.D. in computer science from Pennsylvania State University in 1994 under the supervision of Pavel Pevzner, and was a post-doctoral researcher at the Center for Discrete Mathematics and Theoretical Computer Science. From 1999 to 2002, he worked at Celera Genomics, ultimately as director of informatics research, where he was part of the team (along with J. Craig Venter and Gene Myers) that assembled and annotated the human genome in 2001. He was also a member of the team that published the first diploid (six-billion-letter) genome of an individual human in 2007. He joined the faculty at the University of California, San Diego in the Department of Computer Science and Engineering in 2003, where he now serves as professor and director of the bioinformatics program. References University of California, San Diego faculty Living people Fellows of the International Society for Computational Biology Year of birth missing (living people) Indian Institutes of Technology alumni Indian bioinformaticians Indian expatriate academics in the United States Indian expatriates in the United States
https://en.wikipedia.org/wiki/Clock%20%28model%20checking%29
In model checking, a subfield of computer science, a clock is a mathematical object used to model time. More precisely, a clock measures how much time has passed since a particular event occurred; in this sense, a clock is an abstraction of a stopwatch. In a model of some particular program, the value of a clock may be either the time since the program was started or the time since a particular event occurred in the program. Such clocks are used in the definition of timed automata, signal automata, timed propositional temporal logic and clock temporal logic. They are also used in tools such as UPPAAL which implement timed automata. Generally, the model of a system uses many clocks. Multiple clocks are required in order to track a bounded number of events. All of these clocks are synchronized: the difference in value between two fixed clocks is constant until one of them is restarted. In the language of electronics, this means that clock jitter is zero. Example Let us assume that we want to model an elevator in a building with ten floors. Our model may have clocks , such that the value of the clock is the time for which someone has been waiting for the elevator at floor . This clock is started when someone calls the elevator on floor (and the elevator was not already called on this floor since the last time it visited that floor). This clock can be turned off when the elevator arrives at floor . In this example, we need ten distinct clocks because we need to track ten independent events. Another clock may be used to check how much time the elevator spends at a particular floor. A model of this elevator can then use those clocks to assert whether the elevator's program satisfies properties such as "assuming the elevator is not kept on a floor for more than fifteen seconds, then no one has to wait for the elevator for more than three minutes". In order to check whether this statement holds, it suffices to check that, in every run of
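The two defining behaviours above — all clocks advance together, while each can be reset individually — can be illustrated with a minimal Python sketch (illustrative only, not taken from any model-checking tool):

```python
# Synchronized clocks as used in timed-automaton models: time elapsing
# advances every clock by the same amount, so the difference between two
# clocks stays constant until one of them is reset.

class Clocks:
    def __init__(self, names):
        # every clock starts at 0 when the model starts
        self.value = {name: 0.0 for name in names}

    def delay(self, d):
        # time elapses: all clocks advance together (synchronization)
        for name in self.value:
            self.value[name] += d

    def reset(self, name):
        # restarting one clock models "the event just occurred"
        self.value[name] = 0.0

# Example: one clock per floor of a two-floor elevator model.
c = Clocks(["x1", "x2"])
c.delay(5.0)          # 5 seconds pass
c.reset("x1")         # someone calls the elevator on floor 1
c.delay(2.0)
# x1 measures the time since the floor-1 call; x2 the time since the start.
print(c.value["x1"], c.value["x2"])  # 2.0 7.0
```

After the second delay, the difference x2 − x1 remains 5.0 for as long as neither clock is reset, which is exactly the synchronization property described above.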
https://en.wikipedia.org/wiki/Timed%20propositional%20temporal%20logic
In model checking, a field of computer science, timed propositional temporal logic (TPTL) is an extension of propositional linear temporal logic (LTL) in which variables are introduced to measure times between two events. For example, while LTL allows one to state that each event p is eventually followed by an event q, TPTL furthermore allows one to give a time limit for q to occur. Syntax The future fragment of TPTL is defined similarly to linear temporal logic, in which, furthermore, clock variables can be introduced and compared to constants. Formally, given a set of clocks, TPTL is built up from: a finite set of propositional variables AP, the logical operators ¬ and ∨, and the temporal modal operator U, a clock comparison , with , a number and a comparison operator such as <, ≤, =, ≥ or >, a freeze quantification operator , for a TPTL formula with set of clocks . Furthermore, for an interval, is considered as an abbreviation for ; and similarly for every other kind of interval. The logic TPTL+Past is built as the future fragment of TPTL and also contains the temporal modal operator S. Note that the next operator N is not considered to be a part of TPTL syntax; it will instead be defined from other operators. A closed formula is a formula over an empty set of clocks. Models Let , which intuitively represents a set of times. Let be a function that associates to each moment a set of propositions from AP. A model of a TPTL formula is such a function . Usually, is either a timed word or a signal. In those cases, is either a discrete subset or an interval containing 0. Semantics Let and be as above. Let be a set of clocks. Let (a clock valuation over ). We are now going to explain what it means for a TPTL formula to hold at time for a valuation . This is denoted by . Let and be two formulas over the set of clocks , a formula over the set of clocks , , , a number and being a comparison operator such as <, ≤, =, ≥ or >: We first consider form
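The syntax listed above can be summarized in standard notation. The symbols below are a reconstruction (the original formulas did not survive extraction), following the usual presentation of TPTL with clock variables x and constants c:

```latex
\varphi ::= p
  \mid \neg \varphi
  \mid \varphi_1 \vee \varphi_2
  \mid \varphi_1 \,\mathcal{U}\, \varphi_2
  \mid x \sim c
  \mid x.\varphi
\qquad p \in \mathit{AP},\quad \sim\; \in \{<, \le, =, \ge, >\}
```

Here the freeze quantifier $x.\varphi$ binds the clock $x$ to the time at which the formula is evaluated. For example, $x.\Diamond(q \wedge x \le 5)$ states that q occurs within 5 time units — the time-bounded version of the plain LTL property $\Diamond q$ mentioned in the introduction.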
https://en.wikipedia.org/wiki/Lower%20Seyhan%20Irrigation%20Project
Lower Seyhan Irrigation Project () is one of the major irrigation projects of Turkey, located in the Seyhan River basin. Location The Seyhan is a -long river in southern Turkey, which flows into the Mediterranean Sea. The upper reaches of the river lie in the Taurus Mountains. It flows through the city of Adana. Seyhan Dam is located to the north of the city and the irrigation project is situated to the south. The Project The irrigation project comprises four phases. During the first phase, between 1957 and 1968, of land was irrigated and of land was protected against floods. In the second phase, between 1968 and 1974, land covering was irrigated. The third project phase took place between 1974 and 1985, and dealt mainly with the Tarsus area to the west. In this phase, was irrigated and land was protected against floods. The fourth phase, which is still under construction, deals with the coastal area. The total area of the irrigation project will stretch over . Turkey's top 50 civil engineering projects The Turkish Chamber of Civil Engineers lists the first phase of this project as one of the fifty civil engineering feats in Turkey, a list of remarkable engineering projects realized in the first 50 years of the chamber. References Irrigation in Turkey Irrigation projects Economic history of Turkey Mediterranean Region, Turkey Planned developments
https://en.wikipedia.org/wiki/Bioctl
The bio(4) pseudo-device driver and the bioctl(8) utility implement a generic RAID volume management interface in OpenBSD and NetBSD. The idea behind this software is similar to ifconfig: a single utility from the operating system can be used to control any RAID controller through a generic interface, instead of relying on many proprietary and custom RAID management utilities specific to each hardware RAID manufacturer. Features include monitoring of the health status of the arrays, controlling identification by blinking LEDs, management of audible alarms, and specifying hot spare disks. Additionally, the softraid configuration in OpenBSD is delegated to bioctl as well, whereas the initial creation of volumes and configuration of hardware RAID is left to the card BIOS, as non-essential once the operating system has been booted. Interfacing between the kernel and userland is performed through the ioctl system call via the /dev/bio pseudo-device. Overview The bio/bioctl subsystem is deemed an important part of OpenBSD's advocacy for open hardware documentation, and the 3.8 release title and the titular song were dedicated to the topic — Hackers of the Lost RAID. The development took place during a controversy in which Adaptec refused to release the hardware documentation necessary to make the aac(4) driver work reliably, after which OpenBSD disabled support for the driver. In the commentary to the 3.8 release, the developers note the irony that hardware RAID controllers are supposed to provide reliability through redundancy and repair, whereas in reality many vendors expect system administrators to install and depend on huge binary blobs in order to assess volume health and service their disk arrays. Specifically, OpenBSD is making a reference to the modus operandi of FreeBSD, where the documentation of the aac(4) driver for Adaptec specifically suggests enabling L
https://en.wikipedia.org/wiki/List%20of%20EC%20numbers%20%28EC%207%29
This list contains the sub-classes for the seventh group of Enzyme Commission numbers, EC 7, translocases, placed in numerical order as determined by the Nomenclature Committee of the International Union of Biochemistry and Molecular Biology. All official information is tabulated at the website of the committee. The database is developed and maintained by Andrew McDonald.

EC 7.1: Catalysing the translocation of hydrons

EC 7.1.1: Linked to oxidoreductase reactions
: proton-translocating NAD(P)+ transhydrogenase *
: NADH:ubiquinone reductase (H+-translocating) *
: ubiquinol oxidase (H+-transporting) *
: caldariellaquinol oxidase (H+-transporting) *
: menaquinol oxidase (H+-transporting) *
: plastoquinol—plastocyanin reductase *
: quinol oxidase (electrogenic, proton-motive force generating) *
: quinol—cytochrome-c reductase *
: cytochrome-c oxidase *
: ferredoxin—quinone oxidoreductase (H+-translocating) *
: ferredoxin—NAD+ oxidoreductase (H+-transporting) *
* No Wikipedia article

EC 7.1.2: Linked to the hydrolysis of a nucleoside triphosphate
: P-type H+-exporting transporter *
: H+-transporting two-sector ATPase *
* No Wikipedia article

EC 7.1.3: Linked to the hydrolysis of diphosphate
: H+-exporting diphosphatase *
: Na+-exporting diphosphatase *
* No Wikipedia article

EC 7.2: Catalysing the translocation of inorganic cations and their chelates

EC 7.2.1: Linked to oxidoreductase reactions
: NADH:ubiquinone reductase (NAD+-transporting) *
: ferredoxin—NAD+ oxidoreductase (NAD+-transporting) *
: ascorbate ferrireductase (transmembrane) *
* No Wikipedia article

EC 7.2.2: Linked to the hydrolysis of a nucleoside triphosphate
: Na+-transporting two-sector ATPase *
: ABC-type Cd2+ transporter *
: P-type Na+ transporter *
: ABC-type Na+ transporter *
: ABC-type Mn2+ transporter *
: P-type K+ transporter *
: ABC-type Fe2+ transporter *
: P-type Cu+ transporter *
: P-type Cu2+ transporter *
: P-type Ca2+
https://en.wikipedia.org/wiki/Raja%20Koduri
Rajabali Makaradhwaja Koduri (born 31 August 1968) is an Indian-American computer engineer and executive for computer graphics hardware. He was the chief architect and executive vice president of Intel's architecture, graphics and software (IAGS) division until April 2023. Before Intel, he worked as the senior vice president and chief architect of the Radeon Technologies Group, the graphics division at Intel's competitor AMD. Early life Raja Koduri was born to a Telugu family in Kovvur, West Godavari district of Andhra Pradesh, India. Noted film director S. S. Rajamouli is his cousin. Other members of his extended family also work in the film industry as writers, music composers and singers. He earned a bachelor's degree in electronics and communications from Andhra University. He holds a Master of Technology degree from IIT Kharagpur. Career Raja Koduri joined S3 Graphics in 1996. He became the director of advanced technology development at ATI Technologies in 2001. Following Advanced Micro Devices's 2006 acquisition of ATI, he served as chief technology officer for graphics at AMD until 2009. At S3 and ATI he made key contributions to several generations of GPU architectures that evolved from DirectX version 3 through version 11. He then went to Apple Inc., where he worked on graphics hardware, which allowed Apple to transition to high-resolution Retina displays for its Mac computers. He returned to AMD in 2013 as a vice president in Visual Computing, a role covering both GPU hardware and software, unlike his pre-2009 role at AMD, which concerned only GPU hardware. AMD reorganized its graphics division in 2015, promoting Koduri to the executive level by naming him senior vice president and chief architect of the newly formed Radeon Technologies Group. In this role, Koduri reported directly to AMD CEO Lisa Su. Koduri led the architecture transformation of the Radeon Technologies Group with the Polaris, Vega and Navi architectures, which made their way into several PC, Mac and Game
https://en.wikipedia.org/wiki/Algorithmic%20technique
In mathematics and computer science, an algorithmic technique is a general approach for implementing a process or computation. General techniques There are several broadly recognized algorithmic techniques that offer a proven method or process for designing and constructing algorithms. Different techniques may be used depending on the objective, which may include searching, sorting, mathematical optimization, constraint satisfaction, categorization, analysis, and prediction. Brute force Brute force is a simple, exhaustive technique that evaluates every possible outcome to find a solution. Divide and conquer The divide and conquer technique decomposes complex problems recursively into smaller sub-problems. Each sub-problem is then solved and these partial solutions are recombined to determine the overall solution. This technique is often used for searching and sorting. Dynamic Dynamic programming is a systematic technique in which a complex problem is decomposed recursively into smaller, overlapping subproblems for solution. Dynamic programming stores the results of the overlapping sub-problems locally using an optimization technique called memoization. Evolutionary An evolutionary approach develops candidate solutions and then, in a manner similar to biological evolution, performs a series of random alterations or combinations of these solutions and evaluates the new results against a fitness function. The most fit or promising results are selected for additional iterations, to achieve an overall optimal solution. Graph traversal Graph traversal is a technique for finding solutions to problems that can be represented as graphs. This approach is broad, and includes depth-first search, breadth-first search, tree traversal, and many specific variations that may include local optimizations and excluding search spaces that can be determined to be non-optimum or not possible. These techniques may be used to solve a variety of problems including shortest path and con
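Two of the techniques described above can be contrasted in a short Python sketch (illustrative examples, not from the article): plain recursion re-solves overlapping subproblems, dynamic programming memoizes them, and breadth-first search is a graph-traversal technique that finds a shortest path in an unweighted graph.

```python
from collections import deque
from functools import lru_cache

# Brute force / plain recursion: re-solves the same subproblems repeatedly,
# so its running time grows exponentially in n.
def fib_naive(n):
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

# Dynamic programming: the same recursion with memoization, so each
# overlapping subproblem is solved once and its result stored.
@lru_cache(maxsize=None)
def fib_memo(n):
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

# Graph traversal: breadth-first search finds a path with the fewest
# edges in an unweighted graph, a simple shortest-path computation.
def shortest_path(graph, start, goal):
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # goal unreachable

g = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
print(fib_memo(30))                # 832040
print(shortest_path(g, "a", "d"))  # ['a', 'b', 'd']
```

The memoized version computes fib(30) with 31 distinct subproblem evaluations, whereas the naive version would need over a million recursive calls — the trade-off that motivates dynamic programming.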
https://en.wikipedia.org/wiki/Symbolic%20language%20%28programming%29
In computer science, a symbolic language is a language that uses characters or symbols to represent concepts, such as mathematical operations and the entities (or operands) on which these operations are performed. Modern programming languages use symbols to represent concepts and/or data and are therefore examples of symbolic languages. Some programming languages (such as Lisp and Mathematica) make it easy to represent higher-level abstractions as expressions in the language, enabling symbolic programming. See also Mathematical notation Notation (general) Programming language specification Symbol table Symbolic language (other) References External links Common LISP: A Gentle Introduction to Symbolic Computation - Carnegie Mellon University Mathematical notation Programming constructs Writing systems
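The idea of representing expressions as data that a program can inspect and transform — the essence of symbolic programming in languages like Lisp — can be sketched in Python (an illustrative toy, not tied to any particular Lisp or Mathematica API):

```python
# Symbolic programming sketch: an expression is data (a nested tuple such
# as ("*", "x", "x")), and diff() transforms it into a new expression
# representing its derivative, without ever evaluating numbers.

def diff(expr, var):
    """Symbolically differentiate expr with respect to var."""
    if isinstance(expr, (int, float)):
        return 0                      # d/dx of a constant
    if isinstance(expr, str):
        return 1 if expr == var else 0  # d/dx of a variable
    op, a, b = expr
    if op == "+":
        return ("+", diff(a, var), diff(b, var))
    if op == "*":  # product rule: (ab)' = a'b + ab'
        return ("+", ("*", diff(a, var), b), ("*", a, diff(b, var)))
    raise ValueError(f"unknown operator: {op}")

# d/dx (x * x) = 1*x + x*1
print(diff(("*", "x", "x"), "x"))  # ('+', ('*', 1, 'x'), ('*', 'x', 1))
```

Because the result is itself an expression in the same representation, it can be fed back into the same machinery — the property that makes symbolic languages well suited to this style.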
https://en.wikipedia.org/wiki/Symbolic%20language%20%28mathematics%29
In mathematics, a symbolic language is a language that uses characters or symbols to represent concepts, such as mathematical operations, expressions, and statements, and the entities or operands on which the operations are performed. See also Formal language Language of mathematics List of mathematical symbols Mathematical Alphanumeric Symbols Mathematical notation Notation (general) Symbolic language (other) References External links Mathematical Symbols Mathematical notation Writing systems
https://en.wikipedia.org/wiki/2020%20United%20States%20presidential%20debates
The 2020 United States presidential debates between Joe Biden and Donald Trump, the major candidates in the 2020 United States presidential election, were sponsored by the Commission on Presidential Debates. Three debates were initially scheduled. The first debate took place on September 29, 2020. The next debate was scheduled for October 15 but was canceled due to Trump's COVID-19 diagnosis and his refusal to appear remotely rather than in person. As a result, 2020 had the fewest debates since 1996. The final debate took place on October 22. Additionally, a debate between the vice presidential candidates Mike Pence and Kamala Harris took place on October 7. Background On October 11, 2019, the Commission on Presidential Debates (CPD) announced that it would host three presidential debates and one vice presidential debate. In 2019, Trump claimed that the 2016 debates were "biased", and suggested that he might not participate in further CPD-hosted debates. In December 2019, Frank J. Fahrenkopf Jr., the co-chairman of the CPD, met with Brad Parscale, Trump's campaign chairman, to discuss Trump's comments. Fahrenkopf said "the president wanted to debate, but they had concerns about whether or not to do it with the commission." While Trump did not press the issue further publicly, in June 2020 he requested debates in addition to the traditional three, which Biden's campaign declined. At the end of June, representatives of the Biden campaign confirmed that they had agreed to the original schedule. The Trump campaign submitted a request to the CPD to move the scheduled debates up in the calendar, or to add a fourth debate, in relation to mail-in voting; the request was declined in August 2020. Later that month, Speaker of the House Nancy Pelosi suggested that Biden should skip the debates, claiming that Trump would "probably act in a way that is beneath the dignity of the presidency". Biden responded by stating that he would go ahead and partic
https://en.wikipedia.org/wiki/Finiteness%20properties%20of%20groups
In mathematics, finiteness properties of a group are a collection of properties that allow the use of various algebraic and topological tools, for example group cohomology, to study the group. It is mostly of interest for the study of infinite groups. Special cases of groups with finiteness properties are finitely generated and finitely presented groups. Topological finiteness properties Given an integer n ≥ 1, a group is said to be of type Fn if there exists an aspherical CW-complex whose fundamental group is isomorphic to (a classifying space for ) and whose n-skeleton is finite. A group is said to be of type F∞ if it is of type Fn for every n. It is of type F if there exists a finite aspherical CW-complex of which it is the fundamental group. For small values of n these conditions have more classical interpretations: a group is of type F1 if and only if it is finitely generated (the rose with petals indexed by a finite generating family is the 1-skeleton of a classifying space, the Cayley graph of the group for this generating family is the 1-skeleton of its universal cover); a group is of type F2 if and only if it is finitely presented (the presentation complex, i.e. the rose with petals indexed by a finite generating set and 2-cells corresponding to each relation, is the 2-skeleton of a classifying space, whose universal cover has the Cayley complex as its 2-skeleton). It is known that for every n ≥ 1 there are groups of type Fn which are not of type Fn+1. Finite groups are of type F∞ but not of type F. Thompson's group is an example of a torsion-free group which is of type F∞ but not of type F. A reformulation of the Fn property is that a group has it if and only if it acts properly discontinuously, freely and cocompactly on a CW-complex whose homotopy groups vanish. Another finiteness property can be formulated by replacing homotopy with homology: a group is said to be of type FHn if it acts as above on a CW-complex whose n first homology grou
https://en.wikipedia.org/wiki/Juliette%20Kennedy
Juliette Kennedy is an associate professor in the Department of Mathematics and Statistics at the University of Helsinki. Her main research interests are mathematical logic and the foundations of mathematics. In the course of her work she has published extensively on the works of Kurt Gödel. Education and career Kennedy is an associate professor in the Department of Mathematics and Statistics at the University of Helsinki. Research areas Kennedy's research at the University of Helsinki focuses on mathematical logic in the areas of set-theoretic model theory and set theory. In the course of her mathematical work she also researches the history of mathematics and the foundations of mathematics. In this context she has sustained an extensive project to place the works of Kurt Gödel in their historical and foundational context. In 2017 she published her research on the interplay between the work of Alan Turing and that of Gödel, whose 1956 letter to John von Neumann anticipated the P versus NP problem. Books Kennedy and Roman Kossak are the editors of Set Theory, Arithmetic, and Foundations of Mathematics: Theorems, Philosophies, published as Book 36 in the series Lecture Notes in Logic in 2012 by Cambridge University Press. Kennedy is the editor of Interpreting Gödel: Critical Essays, published in 2014 by Cambridge University Press and reprinted in 2017. In the book Kennedy brought together leading contemporary philosophers and mathematicians to explore the impact of Gödel's work on the foundations and philosophy of mathematics. The logician Kurt Gödel formulated the incompleteness theorems in 1931, which among other things prove that within any formal system with resources sufficient to code arithmetic, questions exist which are neither provable nor disprovable on the basis of the axioms which define the system. References External links Publication list at DBLP Living people Women mathematicians Mathematical logicians Women logicians University of Helsin
https://en.wikipedia.org/wiki/Symbolic%20language%20%28engineering%29
In engineering, a symbolic language is a language that uses standard symbols, marks, and abbreviations to represent concepts such as entities, aspects, attributes, and relationships. Engineering symbolic language may be used for the specification, design, implementation, management, operation, and execution of engineered systems. Communication using precise, concise representations of concepts is critical in engineering. The Nuclear Principles in Engineering book begins with a quote from Erich Fromm on symbolic language and its power to express and depict associations. Engineering employs symbolic language in a way that is neither purely text-based nor purely image-based to represent and communicate knowledge. Examples in chemical engineering include the symbolic languages developed for process flow diagrams and for piping and instrumentation diagrams (P&IDs). In electrical engineering, examples include the symbolic languages developed for network diagrams used in computing. Ladder logic was originally a written symbolic language for the design and construction of programmable logic controller (PLC) operations in mechanical and control engineering. See also Electronic symbol Engineering drawing Engineering drawing abbreviations and symbols List of symbols Mathematical Alphanumeric Symbols Notation (general) Symbolic language (other) References External links Basic Symbols Used in Engineering Drawings Construction documents Engineering concepts Notation Product development Standards Technical communication Technical specifications Writing systems
https://en.wikipedia.org/wiki/Mahara%20%28software%29
Mahara is a free and open-source web-based electronic portfolio (eportfolio) management system written in PHP and distributed under the GNU General Public License. The Māori-language word mahara means "to think about or consider". History Mahara began in 2006 as a collaboration between Massey University, Auckland University of Technology, the Open Polytechnic of New Zealand and Victoria University of Wellington, funded by the New Zealand Tertiary Education Commission. Mahara was initially developed by Catalyst IT Limited, a New Zealand open-source software company, and first released in April 2008. Development of Mahara has since expanded to include a community of contributors, including the New Zealand Ministry of Education. The software was designed to be an open-source electronic portfolio platform to support the student learning and personal learning environment goals of educational institutions. Mahara allows students to select their own work and prepare an online portfolio, both to share in a university classroom context and to show to future employers. Language support Mahara supports translation into different languages using language packs, and contributions of complete or near-complete coverage have been provided for the Japanese, Basque, French, Māori, Slovenian, German, Czech, and Danish languages. References External links Cross-platform software Free educational software Free learning support software Free software programmed in PHP Free content management systems Classroom management software
https://en.wikipedia.org/wiki/Physical%20mapping
Physical mapping is a technique used in molecular biology to find the order of, and physical distance between, DNA base pairs by using DNA markers. It is one of the gene mapping techniques that can determine the sequence of DNA base pairs with high accuracy. Genetic mapping, another approach to gene mapping, can provide the markers needed for physical mapping. However, because genetic mapping deduces relative gene positions from recombination frequencies, it is less accurate than physical mapping. Physical mapping uses DNA fragments and DNA markers to assemble larger DNA pieces. From the overlapping regions of the fragments, researchers can deduce the positions of the DNA bases. There are different techniques to visualize gene location, including somatic cell hybridization, radiation hybridization and in situ hybridization. Different approaches to physical mapping are available for analyzing different sizes of genome and achieving different levels of accuracy. Low- and high-resolution mapping are the two classes corresponding to different resolutions of the genome, particularly for the investigation of chromosomes. The three basic varieties of physical mapping are fluorescent in situ hybridization (FISH), restriction site mapping and sequencing by clones. The goal of physical mapping, as a common mechanism underlying genomic analysis, is to obtain a complete genome sequence in order to deduce any association between the target DNA sequence and phenotypic traits. If the actual positions of genes which control certain phenotypes are known, it is possible to address genetic diseases by providing advice on prevention and developing new treatments. Low-resolution mapping Low-resolution physical mapping is typically capable of resolving DNA ranging from one base pair to several megabases. In this category, most mapping methods involve generating a somatic cell hybrid panel, which is able to map any human DNA sequence, the gene of interest, to specific chromosomes of animal cells, such as those of mice and hamsters
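The core idea of assembling larger pieces from overlapping fragments can be sketched in Python. This is a toy illustration of the overlap principle, not a real mapping or assembly pipeline (the greedy chaining and the example fragments are invented for demonstration):

```python
# Toy sketch of fragment assembly: order DNA fragments by the overlap
# between the end of one fragment and the start of another, then merge.

def overlap(a, b):
    """Length of the longest suffix of a that is a prefix of b."""
    for n in range(min(len(a), len(b)), 0, -1):
        if a[-n:] == b[:n]:
            return n
    return 0

def assemble(fragments):
    """Greedily chain fragments by best suffix/prefix overlap."""
    frags = list(fragments)
    seq = frags.pop(0)
    while frags:
        best = max(frags, key=lambda f: overlap(seq, f))
        n = overlap(seq, best)
        seq += best[n:]        # append only the non-overlapping tail
        frags.remove(best)
    return seq

# Three fragments whose overlapping ends determine their order.
print(assemble(["ATTAGACC", "GACCTGCC", "TGCCATGG"]))  # ATTAGACCTGCCATGG
```

Real physical mapping works at a far larger scale and uses markers and hybridization data rather than raw string matching, but the deduction is the same: shared regions fix the relative order of the pieces.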
https://en.wikipedia.org/wiki/Whitehead%27s%20algorithm
Whitehead's algorithm is a mathematical algorithm in group theory for solving the automorphic equivalence problem in the finite rank free group Fn. The algorithm is based on a classic 1936 paper of J. H. C. Whitehead. It is still unknown (except for the case n = 2) if Whitehead's algorithm has polynomial time complexity. Statement of the problem Let be a free group of rank with a free basis . The automorphism problem, or the automorphic equivalence problem for asks, given two freely reduced words whether there exists an automorphism such that . Thus the automorphism problem asks, for whether . For one has if and only if , where are conjugacy classes in of accordingly. Therefore, the automorphism problem for is often formulated in terms of -equivalence of conjugacy classes of elements of . For an element , denotes the freely reduced length of with respect to , and denotes the cyclically reduced length of with respect to . For the automorphism problem, the length of an input is measured as or as , depending on whether one views as an element of or as defining the corresponding conjugacy class in . History The automorphism problem for was algorithmically solved by J. H. C. Whitehead in a classic 1936 paper, and his solution came to be known as Whitehead's algorithm. Whitehead used a topological approach in his paper. Namely, consider the 3-manifold , the connected sum of copies of . Then , and, moreover, up to a quotient by a finite normal subgroup isomorphic to , the mapping class group of is equal to ; see. Different free bases of can be represented by isotopy classes of "sphere systems" in , and the cyclically reduced form of an element , as well as the Whitehead graph of , can be "read-off" from how a loop in general position representing intersects the spheres in the system. Whitehead moves can be represented by certain kinds of topological "swapping" moves modifying the sphere system. Subsequently, Rapaport, and later, based on
https://en.wikipedia.org/wiki/Structural%20reliability
Structural reliability is about applying reliability engineering theories to buildings and, more generally, structural analysis. Reliability is also used as a probabilistic measure of structural safety. The reliability of a structure is defined as the probability of the complement of failure. Failure occurs when the total applied load is larger than the total resistance of the structure. Structural reliability has become known as a design philosophy in the twenty-first century, and it might replace traditional deterministic approaches to design and maintenance. Theory In structural reliability studies, both loads and resistances are modeled as probabilistic variables. Using this approach the probability of failure of a structure is calculated. When loads and resistances are explicit and have their own independent distribution functions, the probability of failure can be formulated as follows. where is the probability of failure, is the cumulative distribution function of resistance (R), and is the probability density of load (S). However, in most cases, the distributions of loads and resistances are not independent and the probability of failure is defined via the following more general formula. where X is the vector of the basic variables, and G(X), called the limit state function, defines a failure surface — a line, surface or volume over which the integral is taken. Solution approaches Analytical solutions In some cases, when load and resistance are explicitly expressed (as in the first equation above) and their distributions are normal, the integral has a closed-form solution as follows. Simulation In most cases load and resistance are not normally distributed. Therefore, solving the integrals above analytically is impossible. Using Monte Carlo simulation is an approach that can be used in such cases. References Reliability analysis Reliability engineering Structural engineering
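The Monte Carlo approach mentioned above can be sketched in Python. The parameters below are assumed example values (not from the article); for independent normal load and resistance the standard closed form P_f = Φ(−β), with β = (μ_R − μ_S)/√(σ_R² + σ_S²), gives a reference against which the simulation can be checked:

```python
import math
import random

# Structural reliability sketch: failure occurs when load S exceeds
# resistance R. Both are modeled as independent normal random variables
# with assumed example parameters.
mu_r, sigma_r = 10.0, 1.0   # resistance: mean, standard deviation
mu_s, sigma_s = 6.0, 1.5    # load: mean, standard deviation

# Closed form for independent normals: reliability index beta,
# failure probability P_f = Phi(-beta) with Phi the standard normal CDF.
beta = (mu_r - mu_s) / math.sqrt(sigma_r**2 + sigma_s**2)
p_closed = 0.5 * (1.0 + math.erf(-beta / math.sqrt(2.0)))

# Monte Carlo: sample (R, S) pairs and count how often R < S.
random.seed(0)
trials = 200_000
failures = sum(
    random.gauss(mu_r, sigma_r) < random.gauss(mu_s, sigma_s)
    for _ in range(trials)
)
p_mc = failures / trials

print(f"closed form: {p_closed:.4f}, Monte Carlo: {p_mc:.4f}")
```

When the distributions are not normal, or when R and S are dependent, the closed form no longer applies, but the sampling loop stays the same — which is exactly why simulation is the fallback the article describes.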
https://en.wikipedia.org/wiki/Quasi-unmixed%20ring
In algebra, specifically in the theory of commutative rings, a quasi-unmixed ring (also called a formally equidimensional ring in EGA) is a Noetherian ring A such that for each prime ideal p, the completion of the localization Ap is equidimensional, i.e. for each minimal prime ideal q in the completion of Ap, the dimension of the quotient of that completion by q equals the Krull dimension of Ap.

Equivalent conditions
A Noetherian integral domain is quasi-unmixed if and only if it satisfies Nagata's altitude formula. (See also: formally catenary ring below.)

Precisely, a quasi-unmixed ring is a ring in which the unmixed theorem, which characterizes a Cohen–Macaulay ring, holds for the integral closure of an ideal; specifically, for a Noetherian ring A, the following are equivalent:
A is quasi-unmixed.
For each ideal I generated by a number of elements equal to its height, the integral closure of I is unmixed in height (each prime divisor has the same height as the others).
For each ideal I generated by a number of elements equal to its height and for each integer n > 0, the integral closure of I^n is unmixed.

Formally catenary ring
A Noetherian local ring A is said to be formally catenary if for every prime ideal p, the quotient A/p is quasi-unmixed. As it turns out, this notion is redundant: Ratliff has shown that a Noetherian local ring is formally catenary if and only if it is universally catenary.

References
Appendix of Stephen McAdam, Asymptotic Prime Divisors. Lecture Notes in Mathematics.

Further reading
Herrmann, M., S. Ikeda, and U. Orbanz: Equimultiplicity and Blowing Up. An Algebraic Study with an Appendix by B. Moonen. Springer Verlag, Berlin Heidelberg New-York, 1988.

Algebra Commutative algebra
https://en.wikipedia.org/wiki/VT100%20encoding
The VT100 code page is a character encoding used to represent text on the Classic Mac OS for compatibility with the VT100 terminal. It encodes 256 characters, the first 128 of which are identical to ASCII, with the remaining characters including mathematical symbols, diacritics, and additional punctuation marks. It is suitable for English and several other Western languages. It is similar to Mac OS Roman, but includes all characters in ISO 8859-1 except for the currency sign (which was superseded by the euro sign), the no-break space, and the soft hyphen. It also includes all characters in DEC Special Graphics (code page 1090), except for the new line and no-break space controls. The VT100 encoding is used only by the VT100 font on the Classic Mac OS, and is not an official Mac OS character encoding.

Codepage layout
The following table shows how characters are encoded in the VT100 character set. Each character is shown with its Unicode equivalent.

References Encodings
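Decoding any single-byte codepage of this shape amounts to passing the low half through as ASCII and looking the high half up in a table. The sketch below is illustrative only: the two high-half entries are made-up placeholders, not the real VT100 mappings, which would have to be filled in from the published table:

```python
# Hypothetical sample entries -- NOT the actual VT100 codepage assignments.
VT100_HIGH = {
    0xA0: "\u00C4",   # placeholder mapping for byte 0xA0
    0xA1: "\u00C5",   # placeholder mapping for byte 0xA1
}

def decode_vt100(data: bytes) -> str:
    """Decode a single-byte codepage: bytes below 0x80 are ASCII,
    higher bytes go through the lookup table."""
    out = []
    for b in data:
        if b < 0x80:
            out.append(chr(b))                      # identical to ASCII
        else:
            out.append(VT100_HIGH.get(b, "\ufffd"))  # unmapped -> U+FFFD
    return "".join(out)
```

Unmapped high bytes are rendered as the Unicode replacement character rather than raising an error, a common choice for lossy legacy-text decoding.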
https://en.wikipedia.org/wiki/Noise-induced%20order
Noise-induced order is a mathematical phenomenon appearing in the Matsumoto–Tsuda model of the Belousov–Zhabotinsky reaction. In this model, adding noise to the system causes a transition from "chaotic" behaviour to more "ordered" behaviour; the original article was seminal in the area, generated a large number of citations, and gave birth to a line of research in applied mathematics and physics. The phenomenon was later observed in the Belousov–Zhabotinsky reaction itself.

Mathematical background
Interpolating experimental data from the Belousov–Zhabotinsky reaction, Matsumoto and Tsuda introduced a one-dimensional model, a random dynamical system with uniform additive noise, driven by a piecewise-defined map whose parameters are chosen so that a critical value of the map lands on a repelling fixed point (in some way analogous to a Misiurewicz point). This random dynamical system is simulated with different noise amplitudes using floating-point arithmetic, and the Lyapunov exponent along the simulated orbits is computed; the Lyapunov exponent of this simulated system was found to transition from positive to negative as the noise amplitude grows. The behavior of the floating-point system and of the original system may differ; therefore, this is not a rigorous mathematical proof of the phenomenon. A computer-assisted proof of noise-induced order for the Matsumoto–Tsuda map with the parameters above was given in 2017. In 2020 a sufficient condition for noise-induced order was given for one-dimensional maps: the Lyapunov exponent for small noise sizes is positive, while the average of the logarithm of the derivative with respect to the Lebesgue measure is negative.

See also Self-organization Stochastic resonance References Non-equilibrium thermodynamics Name reactions Pattern formation
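The Lyapunov-exponent computation described above can be sketched as follows. The logistic map stands in for the Matsumoto–Tsuda map (whose exact form is not reproduced here), so this shows only the estimation procedure, averaging log|f'(x)| along a noisy floating-point orbit, not the noise-induced transition itself:

```python
import math
import random

def lyapunov_with_noise(noise_amp, n=100_000, r=4.0, seed=0):
    """Estimate the Lyapunov exponent of x -> r*x*(1-x) with uniform
    additive noise of amplitude noise_amp, clipping the orbit to [0, 1]."""
    rng = random.Random(seed)
    x, acc = 0.3, 0.0
    for _ in range(n):
        # log|f'(x)| at the current point; tiny offset avoids log(0)
        acc += math.log(abs(r * (1.0 - 2.0 * x)) + 1e-300)
        x = r * x * (1.0 - x) + rng.uniform(-noise_amp, noise_amp)
        x = min(max(x, 0.0), 1.0)   # keep the noisy orbit in the interval
    return acc / n
```

For the deterministic logistic map at r = 4 the estimate is positive (the true value is ln 2); sweeping noise_amp over a grid and plotting the resulting exponents is how the positive-to-negative transition is detected in maps that exhibit noise-induced order.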
https://en.wikipedia.org/wiki/Scanning%20vibrating%20electrode%20technique
Scanning vibrating electrode technique (SVET), also known as vibrating probe within the field of biology, is a scanning probe microscopy (SPM) technique which visualizes electrochemical processes at a sample. It was originally introduced in 1974 by Jaffe and Nuccitelli to investigate the electrical current densities near living cells. Starting in the 1980s, Hugh Isaacs began to apply SVET to a number of different corrosion studies. SVET measures local current density distributions in the solution above the sample of interest, to map electrochemical processes in situ as they occur. It utilizes a probe, vibrating perpendicular to the sample of interest, to enhance the measured signal. It is related to the scanning ion-selective electrode technique (SIET), which can be used with SVET in corrosion studies, and the scanning reference electrode technique (SRET), which is a precursor to SVET.

History
Scanning vibrating electrode technique was originally introduced to sensitively measure extracellular currents by Jaffe and Nuccitelli in 1974. Jaffe and Nuccitelli then demonstrated the ability of the technique through the measurement of the extracellular currents involved with amputated and regenerating newt limbs, developmental currents of chick embryos, and the electrical currents associated with amoeboid movement. In corrosion, the scanning reference electrode technique (SRET) existed as the precursor to SVET, and was first introduced commercially and trademarked by Uniscan Instruments, now part of Bio-Logic Science Instruments. SRET is an in situ technique in which a reference electrode is scanned near a sample surface to map the potential distribution in the electrolyte above the sample. Using SRET it is possible to determine the anodic and cathodic sites of a corroding sample without the probe altering the corrosion process. SVET was first applied to and developed for the local investigation of corrosion processes by Hugh Isaacs.

Principle of Operation
SVET measures the
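In a uniform electrolyte, the potential difference measured between the extremes of the probe's vibration converts to a local current density through Ohm's law, j = −κ · dV/dz, with the gradient approximated over the vibration span. A minimal sketch (function and parameter names are mine, and the single finite-difference step is a simplifying assumption):

```python
def current_density(delta_v, conductivity, vibration_amplitude):
    """Local current density (A/m^2) normal to the sample from Ohm's law
    in the electrolyte:  j = -kappa * dV/dz, with dV/dz approximated as
    delta_v (V) over the peak-to-peak vibration span (m)."""
    return -conductivity * delta_v / vibration_amplitude
```

For example, a 1 µV difference over a 30 µm vibration span in an electrolyte of conductivity 5 S/m corresponds to a current density of about 0.17 A/m² directed toward the surface.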
https://en.wikipedia.org/wiki/Birgit%20Penzenstadler
Birgit Penzenstadler (born September 9, 1981 in Erding, Germany) is a German assistant professor of Software Engineering at Chalmers University of Technology and adjunct docent at Lappeenranta University of Technology. She is well known for her work on environmental sustainability in software engineering and for being one of the founders of the sustainability design initiative, which seeks to advance research on sustainability in technical disciplines such as computer science and software engineering.

Work
Before joining Chalmers University of Technology, Penzenstadler was a professor at California State University, Long Beach. She also completed a postdoctoral fellowship at the University of California, Irvine with Prof. Debra J. Richardson and Prof. Bill Tomlinson, with whom she developed a framework called SE4S that supports the infusion of sustainability into the requirements engineering (RE) and quality assurance (QA) stages of software engineering processes. Penzenstadler coined the term "Software Engineering for Sustainability" in 2013. She has also been the main organizer of the workshop series "Requirements Engineering for Sustainable Systems" since 2012 at the International Requirements Engineering Conferences. From 2015 to 2019 she led the Resilience Lab at California State University, Long Beach, which focused on research evaluating the properties of a software system in relation to sustainability.

References Living people People from Erding (district) 1981 births California State University, Long Beach faculty Technical University of Munich alumni German expatriates in the United States German software engineers German women engineers Software engineering researchers Sustainability advocates Engineers from Bavaria
https://en.wikipedia.org/wiki/Kiwi%20Farms
Kiwi Farms, formerly known as CWCki Forums, is a web forum that facilitates the discussion and harassment of online figures and communities. Its targets are often subject to organized group trolling and stalking, as well as doxxing and real-life harassment. These actions have tied Kiwi Farms to the suicides of three people targeted by members of the forum. Kiwi Farms' connection to several controversies and harassment campaigns has caused the forum to be blocked by Internet service providers or refused service by companies. After the Christchurch mosque shootings, some Internet service providers in New Zealand blocked the site. In 2021, after the suicide of Near, a non-binary software developer who was subject to targeted and organised group harassment by members of the site, DreamHost stopped providing their domain registration services to Kiwi Farms. In September 2022, Kiwi Farms was blocked by Cloudflare due to "an imminent and emergency threat to human life". Following intermittent availability, The Daily Dot confirmed VanwaTech was providing content delivery network services to the site, which brought it back online. In September 2022, Kiwi Farms suffered a data breach; the site operator told users to assume that IP addresses, email addresses, and passwords had been leaked.

History
Kiwi Farms was founded in 2013 by Joshua Conner Moon (known as "Null" on the website), a former 8chan administrator. It was originally launched as a forum website to troll and harass a webcomic artist who was first noticed in 2007 on the Something Awful forums. Eventually, an Encyclopedia Dramatica page was created about the artist. A dedicated wiki, titled "CWCki" based on the artist's initials, was created by people who felt that the Encyclopedia Dramatica entry was not detailed or accurate enough. Kiwi Farms was originally called "CWCki Forums" before "Kiwi Farms" was coined in 2014. It now hosts threads targeting many individuals, including minorities, women, LGBT peop
https://en.wikipedia.org/wiki/Algae%20DNA%20barcoding
DNA barcoding of algae is commonly used for species identification and phylogenetic studies. Algae form a phylogenetically heterogeneous group, meaning that the application of a single universal barcode/marker for species delimitation is unfeasible; thus different markers/barcodes are applied for this aim in different algal groups.

Diatoms
Diatom DNA barcoding is a method for the taxonomic identification of diatoms, even to species level. It is conducted by extracting DNA or RNA, amplifying and sequencing specific, conserved regions in the diatom genome, and then performing taxonomic assignment. One of the main challenges in identifying diatoms is that samples are often collected as mixtures of diatoms from several species. DNA metabarcoding is the process of identifying individual species from a mixed sample of environmental DNA (also called eDNA), which is DNA extracted directly from the environment, such as from soil or water samples. A newly applied method is diatom DNA metabarcoding, which is used for ecological quality assessment of rivers and streams because of the specific response of diatoms to particular ecological conditions. As species identification via morphology is relatively difficult and requires a lot of time and expertise, high-throughput sequencing (HTS) DNA metabarcoding enables taxonomic assignment, and therefore identification, for the complete sample with respect to the group-specific primers chosen for the preceding DNA amplification. Several DNA markers have already been developed, mainly targeting the 18S rRNA gene. Using the V4 hypervariable region of the ribosomal small subunit DNA (SSU rDNA), DNA-based identification was found to be more efficient than the classical morphology-based approach. Other conserved regions of the genome frequently used as marker genes are ribulose-1,5-bisphosphate carboxylase (rbcL), cytochrome oxidase I (cox1, COI), ITS and 28S. It has been shown repeatedly that the molecular data gained by diatom eDNA meta
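Taxonomic assignment of a metabarcoding read against a reference library can be sketched as a nearest-reference search. This toy example uses k-mer Jaccard similarity with made-up reference sequences and a made-up cutoff, not a real assignment pipeline (production tools use curated databases and alignment- or classifier-based matching):

```python
def kmer_set(seq, k=4):
    """All overlapping k-mers of a sequence, as a set."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def assign_taxon(read, references, k=4, min_similarity=0.5):
    """Assign a read to the reference taxon with the highest Jaccard
    similarity of k-mer sets; return None below the similarity cutoff."""
    best, best_sim = None, 0.0
    q = kmer_set(read, k)
    for taxon, ref_seq in references.items():
        r = kmer_set(ref_seq, k)
        union = q | r
        sim = len(q & r) / len(union) if union else 0.0
        if sim > best_sim:
            best, best_sim = taxon, sim
    return best if best_sim >= min_similarity else None
```

A read identical to a reference scores similarity 1.0 and is assigned to that taxon; a read sharing no k-mers with any reference falls below the cutoff and is left unassigned.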
https://en.wikipedia.org/wiki/Microbial%20DNA%20barcoding
Microbial DNA barcoding is the use of DNA metabarcoding to characterize a mixture of microorganisms. DNA metabarcoding is a method of DNA barcoding that uses universal genetic markers to identify the DNA of a mixture of organisms.

History
Using metabarcoding to assess microbial communities has a long history. Back in 1972, Carl Woese, Mitchell Sogin and Stephen Sogin first tried to detect several families within bacteria using the 5S rRNA gene. Only a few years later, a new tree of life with three domains was proposed, again by Woese and colleagues, who were the first to use the small subunit of the ribosomal RNA (SSU rRNA) gene to distinguish between bacteria, archaea and eukaryotes. Out of this approach, the SSU rRNA gene became the most frequently used genetic marker for both prokaryotes (16S rRNA) and eukaryotes (18S rRNA). The tedious process of cloning these DNA fragments for sequencing was accelerated by the steady improvement of sequencing technologies. With the development of high-throughput sequencing (HTS) in the early 2000s and the ability to handle this massive amount of data using modern bioinformatics and clustering algorithms, investigating microbial life became much easier.

Genetic markers
Genetic diversity varies from species to species. Therefore, it is possible to identify distinct species by recovering a short DNA sequence from a standard part of the genome. This short sequence is defined as the barcode sequence. A part of the genome suited to serve as a barcode should show high variation between two different species, but little variation between two individuals of the same species, to make differentiating individual species easier. For both bacteria and archaea the 16S rRNA/rDNA gene is used. It is a common housekeeping gene in all prokaryotic organisms and is therefore used as a standard barcode to assess prokaryotic diversity. For protists, the corresponding 18S rRNA/rDNA gene is used. To distinguish diff