| source | text |
|---|---|
https://en.wikipedia.org/wiki/Whitworth%20Society | The Whitworth Society was founded in 1923 by Henry Selby Hele-Shaw, then president of the Institution of Mechanical Engineers. Its purposes are to promote engineering in the United Kingdom and, more specifically, to support all Whitworth Scholars, the recipients of awards under Joseph Whitworth's scholarship scheme, which started in 1868. A Whitworth Scholar is someone who has successfully completed a Whitworth Scholarship. Membership of the Society is limited to Whitworth Scholars, Senior Scholars, Fellows, Exhibitioners and Prizemen. The Society provides a means of contact among all successful Whitworths, fostering informal contacts and connections between more senior members and recently successful Scholars. The Society also serves to commemorate Joseph Whitworth and acknowledge his contributions to engineering education.
Activities
Commemorative dinner and annual general meeting
The annual dinner and annual general meeting are held on the evening of 18 March (or the nearest Friday) to commemorate the date in 1868 when Joseph Whitworth wrote to Benjamin Disraeli offering to found the Whitworth Scholarships. Traditionally the dinner was held in London; in more recent times the venue has alternated annually between London and Manchester.
Summer meeting
A summer meeting is held over two days, normally at the beginning of July. The event is largely informal and is ordinarily arranged by the President of the Society.
Record
A record of all scholars is kept by the Society. Until recent years this took the form of a hardback register presented when an individual was elected a scholar; the register is now kept electronically and provided on a USB flash drive as part of the awards ceremony.
Whitworth Scholarship
A Whitworth Scholarship, named after Joseph Whitworth, is an "award for outstanding engineers, who have excellent academic and practical skills and the qualities needed to succeed i |
https://en.wikipedia.org/wiki/Sustainability%20in%20construction | Precise definitions of sustainable construction vary from place to place, and are constantly evolving to encompass varying approaches and priorities. In the United States, the Environmental Protection Agency (EPA) defines sustainable construction as "the practice of creating structures and using processes that are environmentally responsible and resource-efficient throughout a building's life-cycle from siting to design, construction, operation, maintenance, renovation and deconstruction." The Netherlands defines sustainable construction as "a way of building which aims at reducing (negative) health and environmental impacts caused by the construction process or by buildings or by the built-up environment." More comprehensively, sustainability can be considered from the three dimensions of planet, people, and profit across the entire construction supply chain. Key concepts include the protection of the natural environment, choice of non-toxic materials, reduction and reuse of resources, waste minimization, and the use of life-cycle cost analysis.
Definition of sustainable construction
Sustainable construction can be defined as the introduction of healthy living and workplace environments and the use of materials that are sustainable, durable, and by extension environmentally friendly.
Evolution path
In the 1970s, awareness of sustainability emerged amidst the oil crises. At that time, people began to realize the necessity and urgency of energy conservation: utilizing energy efficiently and finding alternatives to contemporary sources of energy. Shortages of other natural resources, such as water, also drew public attention to the importance of sustainability and conservation. Even earlier, in the late 1960s, the construction industry had begun to explore ecological approaches to construction, aiming to seek harmony with nature.
The concept of sustainable construction was born out of sustainable development discourse. The term sustainable deve |
https://en.wikipedia.org/wiki/Mara%20Alagic | Mara Alagic is a Serbian mathematics educator and the editor-in-chief of the Journal of Mathematics and the Arts. She is an Associate Professor in the Department of Curriculum and Instruction and Graduate Coordinator at Wichita State University.
Education
Alagic obtained her Bachelor of Science in Mathematics, her Master of Science in Mathematics, and her PhD from the University of Belgrade in Yugoslavia. Her Master's thesis was on the Category of Multivalued Mappings (Hypertopology). She completed her PhD in 1985 under the direction of Đuro Kurepa; her dissertation title was Categorical Views of Some Relational Models.
Books
Alagic is the co-author of the book Locating Intercultures: Educating for Global Collaboration (2010). In addition, with Glyn M. Rimmington of Wichita State University, Alagic wrote the book Third place learning: Reflective inquiry into intercultural and global cage painting (Information Age Publishing, 2012).
References
External links
Mara Alagic ResearchGate Profile
Year of birth missing (living people)
Living people
Serbian mathematicians
21st-century American mathematicians
Women mathematicians
Mathematics educators
University of Belgrade alumni
Wichita State University faculty |
https://en.wikipedia.org/wiki/Wheatstone%20Corporation | Wheatstone Corporation is an American company that produces digital and analog professional audio equipment for broadcast radio, television, and new media. Products include audio consoles, Audio over IP (AoIP) audio networking, audio processing, audio recording and editing, and custom furniture. The corporation also does business under the brand names Audioarts Engineering, Pacific Research & Engineering, and VoxPro.
Founder
Gary Snow’s interest in audio came early: “By age 12, I was running a neighborhood radio and TV repair shop. I built my first stereo system at 15 and then moved on to guitar amps and loudspeaker enclosures,” Snow said. After high school, Gary took a job repairing amplifiers and special effects devices while attending Onondaga Community College in Syracuse, New York, where he majored in electrical engineering.
Snow’s career then progressed to larger companies, where he engaged in more sophisticated high fidelity repair and installations. Gary explained, “I was sent to KLH, McIntosh Laboratories, and the Allen Organ Company for further technical training. In 1971, I was offered employment at Theatre Sound Inc. in New Haven, Connecticut, where I expanded into electronic circuit design, and large system design and installation.”
After some encouragement by friends, he produced a "for sale" product in 1974. Snow chose the name "Audioarts" for his nascent company.
Snow is the recipient of three Industry Innovator awards announced by trade publications in 2017.
History
Wheatstone Corporation was founded as Audioarts Engineering in 1974 and was incorporated under its current name in 1981. Originally founded in Bethany, Connecticut, the company moved twice: first to Syracuse, New York, in 1986, then to its present location just outside New Bern, North Carolina, in 1998.
The company's first product was a simple disco mixer designed by the founder. In the years that followed, Audioarts designed and sold outboard equipment for the recording industry, |
https://en.wikipedia.org/wiki/Borg%20%28cluster%20manager%29 | Borg is a cluster manager used by Google. It led to widespread use of similar approaches, such as Docker and Kubernetes.
See also
Apache Mesos
List of cluster management software
Kubernetes
DC/OS
Operating-system-level virtualization (containerization)
References
Further reading
A New Era of Container Cluster Management with Kubernetes
Cluster computing
Google software |
https://en.wikipedia.org/wiki/EEG%20analysis | EEG analysis is the application of mathematical signal analysis methods and computer technology to extract information from electroencephalography (EEG) signals. The aims of EEG analysis are to help researchers gain a better understanding of the brain, to assist physicians in diagnosis and treatment choices, and to advance brain-computer interface (BCI) technology. EEG analysis methods can be roughly categorized in several ways. If a mathematical model is fitted to the sampled EEG signals, the method is parametric; otherwise, it is non-parametric. Traditionally, most EEG analysis methods fall into four categories: time domain, frequency domain, time-frequency domain, and nonlinear methods. Later methods include deep neural networks (DNNs).
Methods
Frequency domain methods
Frequency domain analysis, also known as spectral analysis, is the most conventional yet one of the most powerful and standard methods for EEG analysis. It gives insight into information contained in the frequency domain of EEG waveforms by adopting statistical and Fourier Transform methods. Among all the spectral methods, power spectral analysis is the most commonly used, since the power spectrum reflects the 'frequency content' of the signal or the distribution of signal power over frequency.
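As an illustration of power spectral analysis, the sketch below estimates the power spectrum of a synthetic EEG-like signal with Welch's method; the 256 Hz sampling rate, the 10 Hz test rhythm, and the 8–13 Hz band are illustrative assumptions, not values from the text.

```python
import numpy as np
from scipy.signal import welch

fs = 256                                  # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)              # 10 s of synthetic "EEG"
# toy signal: a 10 Hz alpha-like rhythm buried in noise
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)

# Welch's method: average periodograms over 2 s segments
f, pxx = welch(x, fs=fs, nperseg=2 * fs)

band = (f >= 8) & (f <= 13)               # alpha band mask
print("alpha-band power:", np.trapz(pxx[band], f[band]))
```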
Time domain methods
There are two important methods for time domain EEG analysis: linear prediction and component analysis. Generally, linear prediction gives an estimated value equal to a linear combination of past output values together with present and past input values. Component analysis is an unsupervised method in which the data set is mapped to a feature set. Notably, the parameters in time domain methods are entirely based on time, but they can also be extracted from statistical moments of the power spectrum. As a result, time domain methods build a bridge between physical time interpretation and conventional spectral analysis. Besides, time domain met |
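A minimal sketch of the linear-prediction idea from the passage above, assuming a plain autoregressive (AR) model fitted by least squares; the model order and the synthetic signal are illustrative choices.

```python
import numpy as np

def ar_fit_predict(x, order=4):
    """Least-squares AR(order) fit; returns coefficients and one-step predictions."""
    # Row k holds the `order` samples preceding target sample x[order + k],
    # so column i corresponds to lag i + 1.
    X = np.column_stack([x[order - 1 - i : len(x) - 1 - i] for i in range(order)])
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs, X @ coeffs

order = 4
x = np.cumsum(np.random.randn(500))        # synthetic slowly varying signal
coeffs, pred = ar_fit_predict(x, order)
print("one-step RMSE:", np.sqrt(np.mean((x[order:] - pred) ** 2)))
```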
https://en.wikipedia.org/wiki/Dynamic%20texture | Dynamic texture (sometimes referred to as temporal texture) is texture with motion, found in videos of sea waves, fire, smoke, wavy trees, etc. Dynamic texture has a spatially repetitive pattern with a time-varying visual appearance. Modeling and analyzing dynamic texture is a topic of image processing and pattern recognition in computer vision.
Extracted features that describe a dynamic texture can be utilized for tasks such as image sequence classification, segmentation, recognition, and retrieval. Compared with the texture found in static images, analyzing dynamic texture is a challenging problem. It is important that the features extracted from a dynamic texture combine motion and appearance descriptions and are invariant to transformations such as rotation, translation, and illumination.
Analysis methods of dynamic texture
The methods of dynamic texture recognition can be categorized as follows:
Methods based on optical flow: by applying optical flow to the dynamic texture, velocities with direction and magnitude can be detected and used to recognize the dynamic texture. Due to the simplicity of its computation, this is currently the most popular approach (see the sketch after this list).
Methods computing geometric properties: these methods track the surfaces of motion trajectories in the spatiotemporal domain.
Methods based on local spatiotemporal filtering: these methods analyze local spatiotemporal patterns, along with their orientation and energy, and employ them as features for classification.
Methods based on global spatiotemporal transforms: these methods characterize motion at different scales using wavelets, which can decompose the motion into local and global components.
Model-based methods: these methods aim at generating a model that describes the motion by a set of parameters.
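A minimal sketch of the optical-flow approach referenced in the first item above, assuming OpenCV (`cv2`) is available; the two frames are synthetic placeholders for consecutive grayscale video frames.

```python
import cv2
import numpy as np

# synthetic stand-ins for two consecutive grayscale frames of a video
prev = np.random.randint(0, 255, (120, 160), dtype=np.uint8)
curr = np.roll(prev, 2, axis=1)            # fake uniform horizontal motion

# dense Farneback optical flow: per-pixel (dx, dy) displacements
flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)
mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])

# simple dynamic-texture feature: magnitude-weighted histogram of
# flow directions, plus the mean speed
hist, _ = np.histogram(ang, bins=8, range=(0, 2 * np.pi), weights=mag)
feature = np.append(hist / (hist.sum() + 1e-9), mag.mean())
print(feature)
```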
Applications
- Segmenting image sequences of natural scenes. This helps to differentiate between streets and the grass alongside them, which could be used in navigation applications.
- |
https://en.wikipedia.org/wiki/Neuronal%20cell%20cycle | The neuronal cell cycle represents the life cycle of the biological cell: its creation, reproduction and eventual death. The process by which cells divide into two daughter cells is called mitosis. Once these cells are formed they enter G1, the phase in which many of the proteins needed to replicate DNA are made. After G1, the cells enter S phase, during which the DNA is replicated. After S, the cell enters G2, where the proteins required for mitosis are synthesized. Unlike most cell types, however, neurons are generally considered incapable of proliferating once they are differentiated, as they are in the adult nervous system. Nevertheless, it remains plausible that neurons may re-enter the cell cycle under certain circumstances. Sympathetic and cortical neurons, for example, try to reactivate the cell cycle when subjected to acute insults such as DNA damage, oxidative stress, and excitotoxicity. This process is referred to as “abortive cell cycle re-entry” because the cells usually die at the G1/S checkpoint before DNA has been replicated.
Cell cycle regulation
Transitions through the cell cycle from one phase to the next are regulated by cyclins binding their respective cyclin dependent kinases (Cdks) which then activate the kinases (Fisher, 2012). During G1, cyclin D is synthesized and binds to Cdk4/6, which in turn phosphorylates retinoblastoma (Rb) protein and induces the release of the transcription factor E2F1 which is necessary for DNA replication (Liu et al., 1998). The G1/S transition is regulated by cyclin E binding to Cdk2 which phosphorylates Rb as well (Merrick and Fisher, 2011). S phase is then driven by the binding of cyclin A with Cdk2. In late S phase, cyclin A binds with Cdk1 to promote late replication origins and also initiates the condensation of the chromatin in the late G2 phase. The G2/M phase transition is regulated by the formation of the Cdk1/cyclin B complex.
Inhibition through the cell cycle is maintained by cyclin-dep |
https://en.wikipedia.org/wiki/Biomarkers%20of%20diabetes | Diabetes mellitus (DM) is a metabolic disease characterized by hyperglycemia. It is caused by defective insulin secretion, impaired biological action of insulin, or both. Prolonged high blood glucose levels lead to dysfunction of a variety of tissues.
Type 2 diabetes is a progressive condition in which the body becomes resistant to the normal effects of insulin and/or gradually loses the capacity to produce enough insulin in the pancreas.
Pre-diabetes means that the blood sugar level is higher than normal but not yet high enough to be type 2 diabetes.
Gestational diabetes is a condition in which a woman without diabetes develops high blood sugar levels during pregnancy.
Type 2 diabetes mellitus and prediabetes are associated with changes in levels of metabolic markers; these markers could serve as potential prognostic or therapeutic targets for patients with prediabetes or type 2 diabetes mellitus.
Metabolic markers
Oxytocin (OXT)
Omentin
Endothelin-1
Nesfatin-1
Irisin
Betatrophin
Hepatocyte growth factor (HGF)
Fibroblast growth factor
- Biomarkers with insulin-sensitizing properties (irisin, omentin, oxytocin)
- Biomarkers of metabolic dysfunction (HGF, nesfatin and betatrophin)
Biomarkers with insulin-sensitizing properties
Oxytocin
Oxytocin (OXT), a hormone most commonly associated with labor and lactation, may have a wide variety of physiological and pathological functions, which makes Oxytocin and its receptor potential targets for drug therapy.
OXT may have positive metabolic effects; this is based on the change in glucose metabolism, lipid profile, and insulin sensitivity. It may modify glucose uptake and insulin sensitivity both through direct and indirect effects. It may also cause regenerative changes in diabetic pancreatic islet cells. So, the activation of the OXT receptor pathway by infusion of OXT, OXT analogues, or OXT agonists may represent a promising approach for the management of obesity and related metabolic d |
https://en.wikipedia.org/wiki/Joan%20Mott%20Prize%20Lecture | The Joan Mott Prize Lecture is a prize lecture awarded annually by The Physiological Society in honour of Joan Mott.
Laureates
Laureates of the award have included:
- Intestinal absorption of sugars and peptides: from textbook to surprises
See also
Physiological Society Annual Review Prize Lecture
References
Academic awards
British lecture series
1996 establishments in the United Kingdom
Science lecture series
Recurring events established in 1996
Awards established in 1996
Annual events in London
Biology education
The Physiological Society |
https://en.wikipedia.org/wiki/Kerr%E2%80%93Dold%20vortex | In fluid dynamics, the Kerr–Dold vortex is an exact solution of the Navier–Stokes equations representing steady periodic vortices superposed on the stagnation point flow (or extensional flow). The solution was discovered by Oliver S. Kerr and John W. Dold in 1994. These steady solutions exist as a result of a balance between vortex stretching by the extensional flow and viscous dissipation, similar to the Burgers vortex. The vortices were observed experimentally in a four-roll mill apparatus by Lagnado and L. Gary Leal.
Mathematical description
The stagnation point flow, which is already an exact solution of the Navier–Stokes equation is given by , where is the strain rate. To this flow, an additional periodic disturbance can be added such that the new velocity field can be written as
where the disturbance and are assumed to be periodic in the direction with a fundamental wavenumber . Kerr and Dold showed that such disturbances exist with finite amplitude, thus making the solution an exact solution of the Navier–Stokes equations. Introducing a stream function for the disturbance velocity components, the equations for disturbances in vorticity-streamfunction formulation can be shown to reduce to
where is the disturbance vorticity. A single parameter
can be obtained upon non-dimensionalization, which measures the strength of the converging flow relative to viscous dissipation. The solution will be assumed to be
Since is real, it is easy to verify that . Since the expected vortex structure has the symmetry , we have . Upon substitution, an infinite sequence of non-linearly coupled differential equations is obtained. To derive the following equations, the Cauchy product rule is used. The equations are
The boundary conditions
and the corresponding symmetry condition are enough to solve the problem. It can be shown that non-trivial solutions exist only when . On solving this equation numerically, it is verified that keeping the first 7 to 8 terms suffices to produc |
https://en.wikipedia.org/wiki/NFL%20Network%20Exclusive%20Game%20Series | {{Infobox television
| image = NFL Network Special Logo.png
| caption = The current NFL Network logo for its exclusive package of games as of the 2022 season.
| alt_name = {{ubl|Saturday Night Football (2006–2008)|Thursday Night Special (2016)|Thursday Night Football: Saturday Edition (2014–2015)|NFL Network Special (2016–2022)}}
| genre = NFL football telecasts
| creator =
| director =
| presenter = List of NFL Network Exclusive Game Series broadcasters
| theme_music_composer =
| opentheme = NFL GameDay theme
| endtheme = Same as open
| composer =
| country = United States
| language = English
| num_seasons = 5
| num_episodes = 4 per season
| list_episodes =
| executive_producer =
| producer =
| location = Various NFL stadiums
| camera = Multi-camera
| runtime = 180 minutes or until game ends (inc. adverts)
| company =
| network = NFL Network
| first_aired =
| last_aired = present
| related =
| image_size =
}}NFL Network Exclusive Game Series (formerly called NFL Network Special) is the branding currently used for broadcasts of National Football League (NFL) games aired by NFL Network. Prior to the 2022 NFL season, the NFL Network Special branding was only used on Thursday Night Football (TNF) games not played on Thursdays (from 2022 on it is used for all games); as of 2022, this arrangement has included at least one NFL London Game played in a Sunday morning (U.S. time) window, and one or more late-season games on Saturdays.
After having briefly used Saturday Night Football to brand the games (alongside the overall blanket title Run to the Playoffs), from 2008 through 2016 the games were branded as "special editions" of Thursday Night Football or a variant there |
https://en.wikipedia.org/wiki/Spectral%20gap%20%28physics%29 | In quantum mechanics, the spectral gap of a system is the energy difference between its ground state and its first excited state. The mass gap is the spectral gap between the vacuum and the lightest particle. A Hamiltonian with a spectral gap is called a gapped Hamiltonian, and those that do not are called gapless.
In solid-state physics, the most important spectral gap is for the many-body system of electrons in a solid material, in which case it is often known as an energy gap.
In quantum many-body systems, ground states of gapped Hamiltonians have exponential decay of correlations.
In 2015, it was shown that the problem of determining the existence of a spectral gap is undecidable in two or more dimensions. The authors used an aperiodic tiling of quantum Turing machines and showed that this hypothetical material becomes gapped if and only if the machine halts. The one-dimensional case was also proven undecidable in 2020 by constructing a chain of interacting qudits divided into blocks that gain energy if and only if they represent a full computation by a Turing machine, and showing that this system becomes gapped if and only if the machine does not halt.
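As a toy numerical illustration (not from the article), the spectral gap can be read directly off the eigenvalues of a small Hamiltonian; the two-spin transverse-field Ising matrix and field strength below are arbitrary choices.

```python
import numpy as np

# toy Hamiltonian: H = -Z(x)Z - g * (X(x)I + I(x)X) on two spins
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
I = np.eye(2)
g = 0.5                                    # arbitrary field strength
H = -np.kron(Z, Z) - g * (np.kron(X, I) + np.kron(I, X))

energies = np.linalg.eigvalsh(H)           # eigenvalues in ascending order
gap = energies[1] - energies[0]            # spectral gap: E1 - E0
print("ground-state energy:", energies[0], " spectral gap:", gap)
```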
See also
List of undecidable problems
Spectral gap, in mathematics
References
Quantum mechanics
Physical quantities
Undecidable problems |
https://en.wikipedia.org/wiki/List%20of%20desert%20and%20xeric%20shrubland%20ecoregions | The World Wide Fund for Nature defines a number of ecoregions that belong to the deserts and xeric shrublands biome:
List of ecoregions
References
Desert and xeric shrublands |
https://en.wikipedia.org/wiki/Fraunhofer-Center%20for%20High%20Temperature%20Materials%20and%20Design%20HTL | The Fraunhofer Center for High Temperature Materials and Design is a research center of the Fraunhofer Institute for Silicate Research in Würzburg, a research institute of the Fraunhofer Society. It predominantly conducts research in high-temperature technologies and energy-efficient heating processes, and thus contributes to sustainable technological progress. It is headquartered in Bayreuth and has additional locations in Würzburg and Münchberg.
History
The centre was founded in 2012 with the aim of pooling the ceramics research of the Fraunhofer ISC. Its research building in Bayreuth was opened in 2015 and funded by the Bavarian Ministry for Economic Affairs, the German Federal Ministry of Education and Research, and the European Regional Development Fund. In 2014, the Fraunhofer Application Center for Textile Fiber Ceramics (TFK) was founded in cooperation with the Hof University of Applied Sciences. Since 2017, the premises of the Fraunhofer-Center HTL in Bayreuth have been extended by a technical center with a fiber pilot plant, due for completion in late 2019. The costs for this plant amount to 20 million euros, predominantly borne by the Bavarian Ministry for Economic Affairs and the German Federal Ministry of Education and Research. The plant is one of a kind in Europe, and its goal is to open up production of ceramic fibers in Europe.
Research areas
The Fraunhofer-Center HTL has two business areas: thermal process technology and CMCs (ceramic matrix composites). One application of CMCs is the production of ceramic brakes, which are currently expensive to produce; the Fraunhofer-Center HTL is researching ways to reduce these costs. In the CMC business field, HTL has a closed manufacturing chain from fibre development to textile fibre processing to matrix construction to finishing and coating of CMC components. CMCs are characterised by high operating temperatures, corrosion resistance an |
https://en.wikipedia.org/wiki/Pseudoprotease | Pseudoproteases are catalytically deficient pseudoenzyme variants of proteases that are represented across the kingdoms of life.
Examples
See also
Protease
Pseudoenzyme
Catalytic triad
References
Molecular biology |
https://en.wikipedia.org/wiki/Network%20synthesis | Network synthesis is a design technique for linear electrical circuits. Synthesis starts from a prescribed impedance function of frequency or frequency response and then determines the possible networks that will produce the required response. The technique is to be compared to network analysis in which the response (or other behaviour) of a given circuit is calculated. Prior to network synthesis, only network analysis was available, but this requires that one already knows what form of circuit is to be analysed. There is no guarantee that the chosen circuit will be the closest possible match to the desired response, nor that the circuit is the simplest possible. Network synthesis directly addresses both these issues. Network synthesis has historically been concerned with synthesising passive networks, but is not limited to such circuits.
The field was founded by Wilhelm Cauer after reading Ronald M. Foster's 1924 paper A reactance theorem. Foster's theorem provided a method of synthesising LC circuits with arbitrary number of elements by a partial fraction expansion of the impedance function. Cauer extended Foster's method to RC and RL circuits, found new synthesis methods, and methods that could synthesise a general RLC circuit. Other important advances before World War II are due to Otto Brune and Sidney Darlington. In the 1940s Raoul Bott and Richard Duffin published a synthesis technique that did not require transformers in the general case (the elimination of which had been troubling researchers for some time). In the 1950s, a great deal of effort was put into the question of minimising the number of elements required in a synthesis, but with only limited success. Little was done in the field until the 2000s when the issue of minimisation again became an active area of research, but as of 2023, is still an unsolved problem.
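To illustrate the partial-fraction idea behind Foster's method, the sketch below expands a made-up LC driving-point impedance with sympy; the particular Z(s) is an arbitrary example, not one from the text, and each resulting term maps onto an inductor, a capacitor, or a parallel LC resonator in Foster's first form.

```python
import sympy as sp

s = sp.symbols('s')
# made-up reactance function with poles at s = 0 and s**2 = -4
Z = (s**2 + 1) * (s**2 + 9) / (s * (s**2 + 4))

# partial-fraction expansion over s; each term realizes one element
print(sp.apart(Z, s))
# -> s + 9/(4*s) + 15*s/(4*(s**2 + 4))   (up to term ordering)
#    series L, series C, and a parallel LC resonant pair
```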
A primary application of network synthesis is the design of network synthesis filters but this is not its only application. |
https://en.wikipedia.org/wiki/Ramanujan%20Math%20Park | The Ramanujan Math Park is an Indian museum and activity center dedicated to mathematics education inside the Agastya Campus Creativity Lab located in Kuppam, in Chittoor, Andhra Pradesh. It is named after the Indian mathematician Srinivasa Ramanujan (1887-1920) who was from nearby Madras State. It is a joint project of Agastya International Foundation and the non-profit organization Gyanome.
Agastya is known for its hands-on teaching methods, and the Math Park follows this tradition. The park features both indoor and outdoor exhibits as well as interactive touch-screen stations, all designed to enhance the mathematical experience. There are plans to replicate this math park experience at other government-run schools elsewhere in India.
History
Ramanujan Math Park was conceived, partially funded, and executed by Sujatha Ramdorai and her husband Srinivasan Ramdorai, along with V.S.S Sastry, an Indian mathematics communicator based in nearby Kolar. It was inaugurated on 22 December 2017, Ramanujan's birthday and the day celebrated in India as National Mathematics Day.
References
External links
Agastya International Foundation: official website
Ramanujan Math Park - Agastya Campus Creativity Lab Video
Mathematics museums
Srinivasa Ramanujan
Science museums in India
Museums established in 2017 |
https://en.wikipedia.org/wiki/Convolutional%20sparse%20coding | The convolutional sparse coding paradigm is an extension of the global sparse coding model, in which a redundant dictionary is modeled as a concatenation of circulant matrices. While the global sparsity constraint describes a signal as a linear combination of a few atoms in the redundant dictionary , usually expressed as for a sparse vector , the alternative dictionary structure adopted by the convolutional sparse coding model allows the sparsity prior to be applied locally instead of globally: independent patches of are generated by "local" dictionaries operating over stripes of .
The local sparsity constraint allows stronger uniqueness and stability conditions than the global sparsity prior, and has been shown to be a versatile tool for inverse problems in fields such as image understanding and computer vision. A recently proposed multi-layer extension of the model has also shown conceptual benefits for more complex signal decompositions, as well as a tight connection to the convolutional neural network model, allowing a deeper understanding of how the latter operates.
Overview
Given a signal of interest and a redundant dictionary , the sparse coding problem consists of retrieving a sparse vector , denominated the sparse representation of , such that . Intuitively, this implies that is expressed as a linear combination of a small number of elements in . The global sparsity constraint prior has been shown to be useful in many ill-posed inverse problems such as image inpainting, super-resolution, and coding. It has been of particular interest for image understanding and computer vision tasks involving natural images, allowing redundant dictionaries to be efficiently inferred.
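A minimal sketch of global sparse coding with the iterative soft-thresholding algorithm (ISTA); the dictionary, dimensions, sparsity weight, and iteration count are all illustrative assumptions.

```python
import numpy as np

def ista(D, x, lam=0.05, n_iter=300):
    """Approximately solve min_g 0.5*||x - D@g||^2 + lam*||g||_1."""
    step = 1.0 / np.linalg.norm(D, 2) ** 2      # 1/L for the gradient step
    g = np.zeros(D.shape[1])
    for _ in range(n_iter):
        r = g - step * (D.T @ (D @ g - x))       # gradient descent step
        g = np.sign(r) * np.maximum(np.abs(r) - lam * step, 0.0)  # shrinkage
    return g

rng = np.random.default_rng(0)
D = rng.normal(size=(64, 256))                   # redundant dictionary
D /= np.linalg.norm(D, axis=0)                   # unit-norm atoms
g_true = np.zeros(256)
g_true[rng.choice(256, 5, replace=False)] = 1.0
x = D @ g_true                                   # signal = few atoms of D
g_hat = ista(D, x)
print("recovered non-zeros:", np.count_nonzero(np.abs(g_hat) > 1e-3))
```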
As an extension to the global sparsity constraint, recent pieces in the literature have revisited the model to reach a more profound understanding of its uniqueness and stability conditions. Interestingly, by imposing a local sparsity prior in , meaning that its independent patches can be interprete |
https://en.wikipedia.org/wiki/Neurometric%20function | In neuroscience, a neurometric function is a mathematical formula relating the activity of brain cells to aspects of an animal's sensory experience or motor behavior. Neurometric functions provide a quantitative summary of the neural code of a particular brain region.
In sensory neuroscience, neurometric functions measure the probability with which a sensory stimulus would be perceived based on decoding the activity of a given neuron or collection of neurons.
The concept was introduced to investigate the visibility of visual stimuli, by applying Detection theory to the output of single neurons of visual cortex.
Comparing neurometric functions to psychometric functions (by recording from neurons in the brain of the observer) can reveal whether the neural representation in the recorded region constrains perceptual accuracy.
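A sketch of how a sensory neurometric function is commonly computed from spike counts via signal-detection (ROC) analysis; all firing rates, trial counts, and stimulus levels below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def detect_probability(noise, signal):
    """ROC area: P(signal-trial count > noise-trial count), ties split evenly."""
    greater = np.mean(signal[:, None] > noise[None, :])
    ties = np.mean(signal[:, None] == noise[None, :])
    return greater + 0.5 * ties

# invented Poisson spike counts: baseline vs. responses at rising contrasts
baseline = rng.poisson(5.0, size=200)
for contrast, rate in [(0.05, 5.5), (0.1, 7.0), (0.2, 10.0), (0.4, 15.0)]:
    counts = rng.poisson(rate, size=200)
    print(f"contrast {contrast:.2f}: P(detect) = "
          f"{detect_probability(baseline, counts):.2f}")
```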
In motor neuroscience, neurometric functions are used to predict body movements from the activity of neuronal populations in regions such as motor cortex. Such neurometric functions are used in the design of brain–computer interfaces.
See also
Psychometric function
Psychometrics
References
Neuroscience |
https://en.wikipedia.org/wiki/2020%20Democratic%20Party%20presidential%20debates | Debates took place among candidates in the campaign for the Democratic Party's nomination for the president of the United States in the 2020 presidential election.
There were a total of 29 major Democratic candidates. Of these, 23 candidates participated in at least one debate. Only Joe Biden and Bernie Sanders participated in all the debates; Pete Buttigieg, Amy Klobuchar, and Elizabeth Warren participated in all but one debate.
Overview
Schedule
In December 2018, the Democratic National Committee (DNC) announced the schedule for 12 official DNC-sanctioned debates, set to begin in June 2019, with six debates in 2019 and the remaining six during the first four months of 2020. Candidates were allowed to participate in forums featuring multiple other candidates as long as only one candidate appeared on stage at a time. Any presidential candidates who participated in unsanctioned debates with each other would have lost their invitations to the next DNC-sanctioned debate. No unsanctioned debates took place during the 2019–2020 debate season.
The DNC also announced that it would not partner with Fox News as a media sponsor for any debates; Fox News last held a Democratic debate in 2003. Under a new rule, all media sponsors selected to host a debate were required to appoint at least one female moderator for each debate, to ensure that the treatment of the candidates and debate topics would not be gender-skewed.
Participation
The following is a table of participating candidates in each debate:
Debates in 2019
First debates (June 26–27, 2019)
Qualification
To qualify for the first debates, entrants had to, at a minimum, achieve one of the two criteria listed. If this had resulted in more than 20 qualified candidates, the two criteria would have been evaluated in combination per an outlined set of tiebreaking rules, but since 20 candidates qualified, no tiebreaker was necessary. The deadline for candidates to meet either of the below criteria was June 12.
{| class="wi |
https://en.wikipedia.org/wiki/Amitsur%20complex | In algebra, the Amitsur complex is a natural complex associated to a ring homomorphism. It was introduced by Shimshon Amitsur. When the homomorphism is faithfully flat, the Amitsur complex is exact (thus determining a resolution), which is the basis of the theory of faithfully flat descent.
The notion should be thought of as a mechanism to go beyond the conventional localization of rings and modules.
Definition
Let be a homomorphism of (not necessarily commutative) rings. First define the cosimplicial set (where refers to , not ) as follows. Define the face maps by inserting at the th spot:
Define the degeneracies by multiplying out the th and th spots:
They satisfy the "obvious" cosimplicial identities and thus is a cosimplicial set. It then determines the complex with the augmentation , the Amitsur complex:
where
Exactness of the Amitsur complex
Faithfully flat case
In the above notations, if is right faithfully flat, then a theorem of Alexander Grothendieck states that the (augmented) complex is exact and thus is a resolution. More generally, if is right faithfully flat, then, for each left -module ,
is exact.
Proof:
Step 1: The statement is true if splits as a ring homomorphism.
That " splits" is to say for some homomorphism ( is a retraction and a section). Given such a , define
by
An easy computation shows the following identity: with ,
.
This is to say that is a homotopy operator and so determines the zero map on cohomology: i.e., the complex is exact.
Step 2: The statement is true in general.
We remark that is a section of . Thus, Step 1 applied to the split ring homomorphism implies:
where , is exact. Since , etc., by "faithfully flat", the original sequence is exact.
Arc topology case
show that the Amitsur complex is exact if and are (commutative) perfect rings, and the map is required to be a covering in the arc topology (which is a weaker condition than being a cover in the flat topology).
Notes
Citations
References
|
https://en.wikipedia.org/wiki/National%20Quantum%20Initiative%20Act | The National Quantum Initiative Act is an Act of Congress passed on December 13, 2018, and signed into law on December 21, 2018. The law gives the United States a plan for advancing quantum technology, particularly quantum computing.
Act
The act was passed unanimously by the United States Senate and was signed into law by President Donald Trump. The National Quantum Initiative (NQI) provides an umbrella under which a number of government agencies develop and operate programs related to improving the climate for quantum science and technology in the US, coordinated by the National Quantum Coordination Office. These agencies include the National Institute of Standards and Technology (NIST), the National Science Foundation (NSF), and the United States Department of Energy (DOE). Under the authority of the NQI, the NSF and the DOE have established new research centers and institutes, and NIST has established the Quantum Economic Development Consortium (QED-C), a consortium of industrial, academic, and governmental entities.
References
External links
As codified in 15 U.S.C. chapter 114 of the United States Code from the LII
As codified in 15 U.S.C. chapter 114 of the United States Code from the US House of Representatives
National Quantum Initiative Act (PDF/details) as amended in the GPO Statute Compilations collection
National Quantum Initiative Act (PDF/details) as enacted in the US Statutes at Large
Quantum computing
Science and technology in the United States
2018 in computing
Acts of the 115th United States Congress
United States federal computing legislation |
https://en.wikipedia.org/wiki/Bithumb | Bithumb is a South Korean cryptocurrency exchange. Founded in 2014, Bithumb Korea has 8 million registered users, 1 million mobile app users, and a cumulative transaction volume that has exceeded US$1 trillion.
History
In October 2018, BK Global Consortium signed a deal to buy a majority share of BTC Holding Co. which is Bithumb's largest investor.
On January 22, 2019, OTC-listed holding company Blockchain Industries signed a binding letter of intent to merge with Bithumb on or before March 1, 2019. The plan is to form a new publicly traded entity called the Blockchain Exchange Alliance (BXA) that would ‘up-list’ on either the New York Stock Exchange or NASDAQ and make BXA the first major cryptocurrency exchange to go public.
On April 11, 2019, Bithumb announced a net loss of KRW205.5 billion (US$180 million) in 2018, a sharp turnaround from the KRW427.2 billion profit in 2017, despite 2018's sales rising 17.5% to KRW391.7 million. The company blamed the loss on the sharp decline in the price of cryptocurrencies and reduced trading volume.
Controversy
In June 2017, hackers stole user information from a Bithumb employee's personal computer.
In January 2018, Bithumb was raided by the government for alleged tax evasion. They were found not guilty, but still had to pay nearly $28 million in back taxes.
In June 2018, approximately $32 million of cryptocurrency was stolen from Bithumb in a hack.
In January 2019, 30 out of 340 total Bithumb employees were laid off in response to declining trading volume and profits in 2018.
On March 29, 2019, Bithumb said that it was hacked. It pointed its fingers at insiders. Nearly $20 million worth of EOS and Ripple tokens were estimated to have been stolen.
On September 2, 2020, local Korean news reported that the Bithumb exchange had been raided by the Seoul Metropolitan Police Agency's Intelligent Crime Investigation Unit. According to the report, the search and seizure was related to suspicion of investment f |
https://en.wikipedia.org/wiki/Upbit | Upbit is a South Korean cryptocurrency exchange founded in 2017. It is operated by Dunamu, which is one of the highest-valued startups in South Korea.
History
Upbit launched in South Korea on October 24, 2017, with the help of their partnership with American cryptocurrency exchange Bittrex.
Sirgoo Lee was named CEO of Dunamu, Upbit's parent company, on December 21, 2017, with Dunamu founder and CEO Chi-hyung Song assuming the role of chairman. Lee previously served as Co-CEO of Kakao Corp. and JOINS, Inc.
Approximately two months after its launch, Upbit became the top global cryptocurrency exchange in terms of 24-hour trading volume.
On May 10, 2018, its main office was raided as part of a fraud probe.
The exchange began expanding into Southeast Asia in late 2018, first by launching in Singapore on October 30, and then beginning services in Indonesia starting January 2019, and Thailand starting January 2021.
On December 21, 2018, three Upbit officials were indicted for allegedly making fake orders. The exchange has denied the allegations.
In December 2018, Upbit became the first cryptocurrency exchange in the world to receive certifications from the Korea Internet and Security Agency for Information Security Management System (ISMS) and the International Organization for Standardization (ISO) for information security (ISO 27001), cloud security (ISO 27017) and cloud privacy (ISO 27018).
On November 27, 2019, Upbit lost about US$48.5 million worth of Ethereum from a hack.
In September 2021, South Korea started to regulate virtual asset service providers.
References
External links
Companies based in Seoul
Cryptocurrencies |
https://en.wikipedia.org/wiki/Successive%20interference%20cancellation | Successive Interference Cancellation (SIC) is a technique used by a receiver in a wireless data transmission that allows decoding of two or more packets that arrived simultaneously (in a regular system, more packets arriving at the same time cause a collision).
SIC is achieved by the receiver decoding the stronger signal first, subtracting it from the combined signal and then decoding the difference as the weaker signal.
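A toy numerical sketch of that procedure for two superposed BPSK packets (not a real physical-layer implementation; the powers, noise level, and symbol counts are invented):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
strong = rng.choice([-1.0, 1.0], n)        # stronger packet's BPSK symbols
weak = rng.choice([-1.0, 1.0], n)          # weaker packet's BPSK symbols
rx = 2.0 * strong + 0.5 * weak + 0.05 * rng.normal(size=n)

# 1) decode the stronger signal, treating the weaker one as noise
strong_hat = np.sign(rx)
# 2) subtract the reconstructed strong signal from the combined signal
residual = rx - 2.0 * strong_hat
# 3) decode the weaker signal from what remains
weak_hat = np.sign(residual)

print("strong BER:", np.mean(strong_hat != strong))
print("weak BER:  ", np.mean(weak_hat != weak))
```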
References
Wireless
Wireless networking |
https://en.wikipedia.org/wiki/Radoslav%20Rochallyi | Radoslav Rochallyi (born 1 May 1980 in Bardejov, Czechoslovakia) is a Slovak writer and poet living in Malta and the Czech Republic.
Biography
Rochallyi was born in Bardejov, Czechoslovakia, to a family with Lemko and Hungarian roots. He started reading even before primary school; the first book he read was Black Ships by Maciej Słomczyński. Around the age of eight, he came across Lermontov's poems. Rochallyi started writing poetry as a ten-year-old and published his own work in magazines from the age of sixteen. He graduated in Management at the London International Graduate School and holds a certificate in fine arts from the Pratt Institute. He also studied philosophy and mathematics (linear algebra).
He is a member of Mensa.
Writing
Rochallyi is the author of fifteen books. In addition to Slovak and English, he writes in Hungarian, Czech, and German. He debuted with the poetry collection Panoptikum: Haikai no renga (2004), written in the Japanese haiku form.
According to Jan Balaz, Rochallyi's poetry is characterized by the use of free verse, which gives the author the freedom and directness needed to retain the specific nature of the testimony without embellishment. His book Mythra Invictus has received a positive reception.
According to Lenka Vrebl, Rochallyi's perception is not playful; it is serious, direct, and focused.
In the DNA-Canvases of Poetry collection he uses mathematical equations to express his poetry.
In addition to his books, his poetic equations have been published in many anthologies and journals, for example in anthologies and journals published at Stanford University, California State University, Utah Tech University, Olivet College, and Las Positas College.
In the Punch collection, he uses poems based on mathematics, especially on mathematical equations. Both the texts and the equations are based on the author's need to divide the text into a semantically an |
https://en.wikipedia.org/wiki/Stan%20Twitter | Stan Twitter is a community of Twitter users that post opinions related to celebrities, music, TV shows, movies, and social media. The community has been noted for its particular shared terminology but also for incidents of harassment and bullying. Usually, Stan Twitter revolves around discussing actors, singers, rappers, athletes, and politicians.
Background and description
The origin of the term stan is often credited to the 2000 song "Stan", about an obsessed fan, by American rapper Eminem featuring British singer Dido. The word itself was added to the Oxford English Dictionary in 2017. The term was originally a noun, but over time evolved and began to be used as a verb as well.
Stan Twitter has been noted by The Atlantic as one of the "tribes" of Twitter. Polygon has described Stan Twitter as "an overarching collection of various fandoms", and additionally as a community that "[signifies] individuals congregated around certain, specific interests ranging from queer identity to K-pop groups", and added that "Stan Twitter is essentially synonymous with fandom twitter."
The Daily Dot wrote that "Stan Twitter is essentially a community of Extremely Online like-minded individuals who discuss their various fandoms and what they 'stan'." Stan Twitter has also been noted for its common overlap with LGBTQ+ Twitter communities. The Guardian noted, for example, that "Gay male culture has always coalesced around female pop stars, from Judy Garland to Lady Gaga and Ariana Grande."
Mat Whitehead of HuffPost described stans as "volcanic", and added that they are "organised, ... dedicated and—at times—completely unhinged." Whitehead went on to describe stans of recording artists, writing "stans aren't just superfans, they're a community of like-minded souls coming together, unified under the banner of wanting to see their chosen celebrity flourish. Friendships are made, bonding over a shared love of an artist, their work, their achievements."
Culture
Stan Twitter has bee |
https://en.wikipedia.org/wiki/List%20of%20medical%20tests | A medical test is a medical procedure performed to detect, diagnose, or monitor diseases, disease processes, or susceptibility, or to determine a course of treatment. The tests below are classified by specialty, indicating the hospital ward in which, or the specialist by whom, they are usually performed. This list is not exhaustive but may be useful as a guide.
Where available, ICD-10 codes are listed.
Consulting Room Tests
These tests are usually performed in a consulting room by any doctor and require no advanced equipment.
general:
Temperature measurement, with a thermometer
Patient's Respiratory rate measurement
Blood oxygen concentration measurement
taking the patient's pulse
weighing, and measuring height and girth
measuring blood pressure
specific:
abdominal palpation
cardiac auscultation
HEENT examination
digital rectal examination
neurological examination
psychiatric assessment
pulmonary auscultation
vaginal examination
Cardiovascular
coronary catheterization
echocardiography
electrocardiogram
ballistocardiogram
Dermatology
skin allergy test
skin biopsy
Ear, Nose and Throat
hearing test
laryngoscopy
vestibular tests
electronystagmography (ENG)
videonystagmography (VNG)
Gastrointestinal
capsule endoscopy
colonoscopy
endoscopic retrograde cholangiopancreatography
esophagogastroduodenoscopy
esophageal motility study
esophageal pH monitoring
liver biopsy
Hematology
bone marrow examination
Laboratory
biochemistry
Arterial blood gas (ABG)
Complete blood count (CBC)
Comprehensive metabolic panel (CMP) (including CHEM-7)
coagulation tests
C-reactive protein
Erythrocyte sedimentation rate (ESR)
FibroTest
urea breath test
urinalysis
Agostini's reaction
cytogenetics and molecular genetics
Genetic testing
immunology
autoantibodies
microbiology
blood culture
mantoux test
sputum culture
stool culture
urine culture
Neurological
electroencephalogram
electromyography (EMG)
lumbar puncture
neuropsych |
https://en.wikipedia.org/wiki/Parallels%20RAS | Parallels RAS is application virtualization software produced by Parallels that allows Windows applications to be accessed via individual devices from a shared server or cloud system. Parallels RAS was first released in 2014 by 2X Software.
Product overview
Parallels RAS is application virtualization software that delivers centrally-hosted Windows applications to local devices without the necessity of installing them.
With Parallels RAS, Windows applications can be used on devices that typically could not run them, including Macintosh computers, mobile devices, and Google Chromebook.
Parallels RAS is accessed on all devices via Parallels Client. The software can be delivered from on-premises or public, private, or hybrid clouds.
See also
Parallels
2X Software
References
External links
Virtualization software |
https://en.wikipedia.org/wiki/Classification%20of%20low-dimensional%20real%20Lie%20algebras | This mathematics-related list provides Mubarakzyanov's classification of low-dimensional real Lie algebras, published in Russian in 1963. It complements the article on Lie algebra in the area of abstract algebra.
An English version and review of this classification was published by Popovych et al. in 2003.
Mubarakzyanov's Classification
Let be an -dimensional Lie algebra over the field of real numbers
with generators , . For each algebra we list only the non-zero commutators between basis elements.
One-dimensional
, abelian.
Two-dimensional
, abelian ;
, solvable ,
Three-dimensional
, abelian, Bianchi I;
, decomposable solvable, Bianchi III;
, Heisenberg–Weyl algebra, nilpotent, Bianchi II,
, solvable, Bianchi IV,
, solvable, Bianchi V,
, solvable, Bianchi VI, Poincaré algebra when ,
, solvable, Bianchi VII,
, simple, Bianchi VIII,
, simple, Bianchi IX,
Algebra can be considered as an extreme case of , when , forming a contraction of the Lie algebra.
Over the field algebras , are isomorphic to and , respectively.
Four-dimensional
, abelian;
, decomposable solvable,
, decomposable solvable,
, decomposable nilpotent,
, decomposable solvable,
, decomposable solvable,
, decomposable solvable,
, decomposable solvable,
, unsolvable,
, unsolvable,
, indecomposable nilpotent,
, indecomposable solvable,
, indecomposable solvable,
, indecomposable solvable,
, indecomposable solvable,
, indecomposable solvable,
, indecomposable solvable,
, indecomposable solvable,
, indecomposable solvable,
, indecomposable solvable,
Algebra can be considered as an extreme case of , when , forming a contraction of the Lie algebra.
Over the field algebras , , , , are isomorphic to , , , , , respectively.
See also
Table of Lie groups
Simple Lie group#Full classification
Notes
References
Lie algebras
Mathematics-related lists
Mathematical classification systems |
https://en.wikipedia.org/wiki/Lie%20operad | In mathematics, the Lie operad is an operad whose algebras are Lie algebras. The notion (at least one version) was introduced by in their formulation of Koszul duality.
Definition à la Ginzburg–Kapranov
Fix a base field k and let denote the free Lie algebra over k with generators and the subspace spanned by all the bracket monomials containing each exactly once. The symmetric group acts on by permutations of the generators and, under that action, is invariant. The operadic composition is given by substituting expressions (with renumbered variables) for variables. Then, is an operad.
Koszul-Dual
The Koszul-dual of is the commutative-ring operad, an operad whose algebras are the commutative rings over k.
Notes
References
External links
Todd Trimble, Notes on operads and the Lie operad
https://ncatlab.org/nlab/show/Lie+operad
Algebra |
https://en.wikipedia.org/wiki/Bell-shaped%20function | A bell-shaped function or simply 'bell curve' is a mathematical function having a characteristic "bell"-shaped curve. These functions are typically continuous or smooth, asymptotically approach zero for large negative/positive x, and have a single, unimodal maximum at small x. Hence, the integral of a bell-shaped function is typically a sigmoid function. Bell-shaped functions are also commonly symmetric.
Many common probability distribution functions are bell curves.
Some bell shaped functions, such as the Gaussian function and the probability distribution of the Cauchy distribution, can be used to construct sequences of functions with decreasing variance that approach the Dirac delta distribution. Indeed, the Dirac delta can roughly be thought of as a bell curve with variance tending to zero.
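A quick numeric check of both claims, using the Gaussian as the standard example: the running integral of a bell curve is a sigmoid rising from 0 to 1, and shrinking the variance drives the bump toward a Dirac-delta-like spike.

```python
import numpy as np

def gaussian(x, sigma):
    return np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

x = np.linspace(-5, 5, 2001)
dx = x[1] - x[0]

bell = gaussian(x, 1.0)
sigmoid = np.cumsum(bell) * dx             # running integral of the bell
print(sigmoid[0], sigmoid[x.size // 2], sigmoid[-1])   # ~0, ~0.5, ~1

for sigma in (1.0, 0.1, 0.01):             # variance -> 0: delta-like spike
    print(sigma, gaussian(0.0, sigma))     # peak height grows like 1/sigma
```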
Some examples include:
Gaussian function, the probability density function of the normal distribution. This is the archetypal bell shaped function and is frequently encountered in nature as a consequence of the central limit theorem.
Fuzzy Logic generalized membership bell-shaped function
Hyperbolic secant. This is also the derivative of the Gudermannian function.
Witch of Agnesi, the probability density function of the Cauchy distribution. This is also a scaled version of the derivative of the arctangent function.
Bump function
Raised cosines type like the raised cosine distribution or the raised-cosine filter
Most of the window functions like the Kaiser window
The derivative of the logistic function. This is a scaled version of the derivative of the hyperbolic tangent function.
Some algebraic functions. For example
Gallery
References
Functions and mappings |
https://en.wikipedia.org/wiki/Libby%20Heaney | Libby Heaney is a British artist and quantum physicist known for her pioneering work on AI and quantum computing. She works on the impact of future technologies and is widely known to be the first artist to use quantum computing as a functioning artistic medium. Her work has been featured internationally, including in the Victoria and Albert Museum, Tate Modern and the Science Gallery.
Early life and scientific career
Heaney is from Tamworth, Staffordshire. She studied physics at Imperial College London, graduating in 2005 with first-class honours. She pursued a successful career in quantum physics, completing a PhD thesis on mode entanglement in ultra-cold atomic gases at the University of Leeds, and continued her research as a postdoctoral fellow at the University of Oxford and at the National University of Singapore. In 2008, Heaney was awarded the Institute of Physics Very Early Career Woman in Physics Award (now the Jocelyn Bell Burnell Medal and Prize).
Artistic career
In 2013 Heaney returned to the UK and completed a master's degree at the University of the Arts London. She studied arts and science at Central Saint Martins and graduated in 2015. She then became a lecturer at the Royal College of Art, teaching Information Experience Design. In 2016, she created Lady Chatterley's Tinderbot which presented Tinder conversations between real users and AI bots programmed using Lady Chatterley's Lover. Lady Chatterley's Tinderbot was covered by BBC News, TheJournal.ie and the Irish Examiner and was exhibited internationally.
In 2017, Heaney was commissioned by Sky Arts and the Barbican Centre to design Britbot, an internet bot built using artificial intelligence and the citizenship book Life in the UK: a guide for new residents. The book, a manual for the citizenship test, has been described by Heaney as being "largely a white male privileged version of British history and culture". The bot spoke to the public about what it meant to be British and learnt fro |
https://en.wikipedia.org/wiki/List%20of%20aerospace%20flight%20test%20centres | Flight test centers around the world all have similar missions: to conduct flight research and testing of new aircraft concepts and prototypes. Notable centers are listed below (by year of foundation):
Government establishments
U.K. Aeroplane and Armament Experimental Establishment, based at Boscombe Down, England (founded 1917)
U.S. Navy Air Warfare Test Center, based at Naval Air Station Patuxent River, Maryland, United States (founded 1918, as the Navy's Flight Test Group based at Naval Air Station Anacostia)
Swedish Armed Forces Flight Test and Evaluation Center (FMV:PROV is a part of FMV), based at Linköping, Sweden (founded 1933)
Italian Air Force Flight Test Center (Reparto Sperimentale di Volo), based at Pratica di Mare (founded 1935)
Russian State Flight Research and Test Center, based at Zhukovsky, Russia (founded 1941)
I.N.T.A. Spanish Aerospace Research and Test Center, based at Torrejón de Ardoz, Community of Madrid, Spain (founded 1942)
CLAEX Spanish Air Force Experimentation Center, based at Torrejón de Ardoz, Spain (founded 1992)
U.S. Air Force Test Center, based at Edwards Air Force Base, California, United States (founded 1942, as the new location of 477th Air Base Headquarters and Test Squadron)
Flight Test Center (CEV) of the French Ministry of Armed Forces (CEV is a part of the Directorate General of Armaments), based at 217 Air Base in Brétigny-sur-Orge, France (founded 1945)
NASA Flight Research Center, based at Edwards Air Force Base, California, United States (founded 1946, as the Muroc Flight Test Unit)
NRC Institute for Aerospace Research, based at Ottawa and Montreal, Canada (founded 1951, as the National Aeronautical Establishment - NAE)
Brazilian Air Force Flight Testing and Research Institute (part of CTA), São José dos Campos, Brazil (founded 1953)
Japan Air Self-Defense Force Flight Test Center, based at Gifu Air Field, Japan (founded 1955)
DLR German Aerospace Research and Test Center, based at Braunschweig, Germany (foun |
https://en.wikipedia.org/wiki/Aitken%20interpolation | Aitken interpolation is an algorithm used for polynomial interpolation that was derived by the mathematician Alexander Aitken. It is similar to Neville's algorithm.
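A minimal pure-Python sketch of the successive-linear-interpolation scheme shared by Aitken's and Neville's algorithms (the example points are arbitrary):

```python
def aitken_interpolate(xs, ys, x):
    """Evaluate at x the polynomial through the points (xs[i], ys[i]).

    Builds the table of successively higher-order interpolants, each
    formed by linearly combining two interpolants of one order lower.
    """
    p = list(ys)
    n = len(xs)
    for j in range(1, n):                       # current interpolant order
        for i in range(n - 1, j - 1, -1):
            # combine the interpolants on xs[i-j..i-1] and xs[i-j+1..i]
            p[i] = ((x - xs[i - j]) * p[i] - (x - xs[i]) * p[i - 1]) \
                   / (xs[i] - xs[i - j])
    return p[-1]

# exact for quadratics: y = x**2 through three points, evaluated at 1.5
print(aitken_interpolate([0.0, 1.0, 2.0], [0.0, 1.0, 4.0], 1.5))  # -> 2.25
```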
See also Aitken's delta-squared process or Aitken extrapolation.
References
External links
Polynomials
Interpolation |
https://en.wikipedia.org/wiki/Permutation%20category | In mathematics, the permutation category is a category where
the objects are the natural numbers,
the morphisms from a natural number n to itself are the elements of the symmetric group $S_n$, and
there are no morphisms from m to n if $m \neq n$.
It is equivalent as a category to the category of finite sets and bijections between them.
References
Category theory |
https://en.wikipedia.org/wiki/Bauer%20maximum%20principle | Bauer's maximum principle is the following theorem in mathematical optimization:
Any function that is convex and continuous, and defined on a set that is convex and compact, attains its maximum at some extreme point of that set.
It is attributed to the German mathematician Heinz Bauer.
Bauer's maximum principle immediately implies the analogue minimum principle:
Any function that is concave and continuous, and defined on a set that is convex and compact, attains its minimum at some extreme point of that set.
Since a linear function is simultaneously convex and concave, it satisfies both principles, i.e., it attains both its maximum and its minimum at extreme points.
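As a small numerical illustration (not from the source): a linear objective is convex, so by the principle its maximum over a compact convex feasible set sits at an extreme point, which is exactly what a linear-programming solver returns. The objective and box constraints below are arbitrary choices.

```python
from scipy.optimize import linprog

# Maximize the linear (hence convex) function f(x, y) = 2x + 3y over the
# compact convex set [0, 1] x [0, 1]; linprog minimizes, so negate c.
res = linprog(c=[-2, -3], bounds=[(0, 1), (0, 1)])

print(res.x)     # [1. 1.]: an extreme point (corner) of the square
print(-res.fun)  # 5.0: the maximum, attained at that extreme point
```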
Bauer's maximum principle has applications in various fields, for example, differential equations and economics.
References
Mathematical optimization
Mathematical theorems |
https://en.wikipedia.org/wiki/Applied%20Spectral%20Imaging | Applied Spectral Imaging or ASI is a multinational biomedical company that develops and manufactures microscopy imaging and digital analysis tools for hospitals, service laboratories and research centers. The company provides cytogenetic, pathology, and research laboratories with bright-field, fluorescence and spectral imaging in clinical applications. Test slides can be scanned, captured, archived, reviewed on the screen, analyzed with computer-assisted algorithms, and reported. ASI system platforms automate the workflow process to reduce human error in the identification and classification of chromosomal disorders, genome instability, various oncological malignancies, among other diseases.
History
Founded in 1993, ASI initially focused on spectral imaging devices for the research community.
In 2002, ASI made a strategic move to expand into the clinical cytogenetics market and thereby, introduced its CytoLabView system for karyotyping and FISH imaging.
In 2005, ASI launched its automated scanning system in order to increase throughput for case analysis, compensating for higher sample volumes and helping laboratories better cope with a shortage of laboratory technicians and other professionals. As the demand for more diagnostics increased, ASI focused on providing faster imaging and analysis to improve turnaround time for patient results. Scanning automation and algorithms enabled laboratory technologists to spend more time on results and analysis rather than manual labor.
In 2011, ASI launched a proprietary software platform named GenASIs. The software automates the manual diagnostic process. Physicians, medical scientists and laboratory technicians integrate digital technology to manage the visualization of the slide and compute the analysis. Through algorithms, tissue, suspension cells and chromosomes are analyzed for aberrations, cell classification, tumor proportion score, etc. ASI's high-throughput tray loader, introduced the same year, was manufacture |
https://en.wikipedia.org/wiki/Journal%20of%20Daylighting | Journal of Daylighting is a biannual, online peer-reviewed scientific journal devoted to investigations of daylighting in buildings. It is published by SolarLits, and the current editor-in-chief is Dr Irfan Ullah.
Abstracting and indexing
The journal is abstracted and indexed in:
Scopus
Directory of Open Access Journals
Avery Index to Architectural Periodicals
References
External links
Engineering journals
English-language journals
Academic journals established in 2014 |
https://en.wikipedia.org/wiki/S-object | In algebraic topology, an $S$-object (also called a symmetric sequence) is a sequence $(X(n))$ of objects such that each $X(n)$ comes with an action of the symmetric group $S_n$.
The category of combinatorial species is equivalent to the category of finite $S$-sets (roughly because the permutation category is equivalent to the category of finite sets and bijections).
S-module
By $S$-module, we mean an $S$-object in the category of finite-dimensional vector spaces over a field $k$ of characteristic zero (the symmetric groups act from the right by convention). Then each $S$-module determines a Schur functor on the category of finite-dimensional vector spaces.
This definition of -module shares its name with the considerably better-known model for highly structured ring spectra due to Elmendorf, Kriz, Mandell and May.
See also
Highly structured ring spectrum
Notes
References
Algebraic topology |
https://en.wikipedia.org/wiki/Apache%20ORC | Apache ORC (Optimized Row Columnar) is a free and open-source column-oriented data storage format. It is similar to the other columnar-storage file formats available in the Hadoop ecosystem such as RCFile and Parquet. It is used by most of the major data-processing frameworks, including Apache Spark, Apache Hive, Apache Flink and Apache Hadoop.
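As a quick illustration of working with the format (not from the source), recent builds of the pyarrow library can round-trip a table through an ORC file; the table contents and file name here are arbitrary:

```python
import pyarrow as pa
from pyarrow import orc  # ORC support ships with recent pyarrow releases

# Build a small in-memory table and round-trip it through an ORC file.
table = pa.table({"id": [1, 2, 3], "name": ["a", "b", "c"]})
orc.write_table(table, "example.orc")  # stored column by column on disk

restored = orc.read_table("example.orc")
print(restored.to_pydict())  # {'id': [1, 2, 3], 'name': ['a', 'b', 'c']}
```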
History
In February 2013, the Optimized Row Columnar (ORC) file format was announced by Hortonworks in collaboration with Facebook.
A month later, the Apache Parquet format was announced, developed by Cloudera and Twitter.
See also
Apache Spark
Apache Arrow
Apache Hive
Apache NiFi
Pig (programming tool)
Trino (SQL query engine)
Presto (SQL query engine)
References
2013 software
ORC
Cloud computing
Free system software
Hadoop
Software using the Apache license |
https://en.wikipedia.org/wiki/Chip%20on%20board | Chip on board (COB) is a method of circuit board manufacturing in which the integrated circuits (e.g. microprocessors) are attached (wired, bonded directly) to a printed circuit board, and covered by a blob of epoxy. By eliminating the packaging of individual semiconductor devices, the completed product can be more compact, lighter, and less costly. In some cases, COB construction improves the operation of radio frequency systems by reducing the inductance and capacitance of integrated circuit leads.
COB effectively merges two levels of electronic packaging: level 1 (components) and level 2 (wiring boards), and may be referred to as "level 1.5".
Construction
A finished semiconductor wafer is cut into dies. Each die is then physically bonded to the PCB.
Three different methods are used to connect the terminal pads of the integrated circuit (or other semiconductor device) with the conductive traces of the printed circuit board.
Flip chip
In "flip chip on board", the device is inverted, with the top layer of metallization facing the circuit board. Small balls of solder are placed on the circuit board traces where connections to the chip are required. The chip and board are passed through a reflow soldering process to make the electrical connections.
Wire bonding
In "wire bonding", the chip is attached to the board with an adhesive. Each pad on the device is connected with a fine wire lead that is welded to the pad and to the circuit board. This is similar to the way that an integrated circuit is connected to its lead frame, but instead the chip is wire-bonded directly to the circuit board.
Flexible circuit board
In "tape-automated bonding", thin flat metal tape leads are attached to the semiconductor device pads, then welded to the printed circuit board.
Glob-top
In all cases, the chip and connections are covered with an encapsulant to reduce entry of moisture or corrosive gases to the chip, to protect the wire bonds or tape leads from physical damag |
https://en.wikipedia.org/wiki/Newest%20vertex%20bisection | Newest Vertex Bisection is an algorithmic method to locally refine triangulations. It is widely used in computational science, numerical simulation, and computer graphics. The advantage of newest vertex bisection is that it allows local refinement of triangulations without degenerating the shape of the triangles after repeated usage.
In newest vertex bisection, whenever a triangle is to be split into smaller triangles, it will be bisected by drawing a line from the newest vertex to the midpoint of the edge opposite to that vertex. That midpoint becomes the newest vertex of the two newer triangles. One can show that repeating this procedure for a given triangulation leads to triangles that belong to only a finite number of similarity classes.
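A minimal sketch of one refinement step (not from the source; the convention of storing the newest vertex last in each triple is my own):

```python
def bisect(tri, verts):
    """Split a triangle by newest vertex bisection.

    tri is a triple of vertex indices with the newest vertex last; verts is
    a list of 2D points, to which the new midpoint is appended.  Returns the
    two children, each carrying the midpoint as its newest vertex."""
    a, b, p = tri                 # edge (a, b) lies opposite the newest vertex p
    (ax, ay), (bx, by) = verts[a], verts[b]
    verts.append(((ax + bx) / 2, (ay + by) / 2))
    m = len(verts) - 1            # index of the new (newest) vertex
    return (a, p, m), (p, b, m)   # the bisecting edge runs from p to m

# Three uniform rounds on one right triangle: 8 children, but only finitely
# many distinct shapes ever appear, so the triangles never degenerate.
verts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
tris = [(0, 1, 2)]
for _ in range(3):
    tris = [c for t in tris for c in bisect(t, verts)]
print(len(tris))  # 8
```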
Generalizations of newest vertex bisection to dimension three and higher are known. Newest vertex bisection is used in local mesh refinement for adaptive finite element methods, where it is an alternative to red-green refinement and uniform mesh refinement.
References
Algorithms |
https://en.wikipedia.org/wiki/Alexandroff%20plank | Alexandroff plank in topology, an area of mathematics, is a topological space that serves as an instructive example.
Definition
The construction of the Alexandroff plank starts by defining the topological space $(X, \tau)$ to be the Cartesian product of $[0, \omega_1]$ and $[-1, 1]$, where $\omega_1$ is the first uncountable ordinal, and both carry the interval topology. The topology $\tau$ is extended to a topology $\sigma$ by adding the sets of the form
$U(\alpha, n) = \{p\} \cup \left( (\alpha, \omega_1] \times (0, 1/n) \right)$,
where $p = (\omega_1, 0) \in X$.
The Alexandroff plank is the topological space $(X, \sigma)$.
It is called plank for being constructed from a subspace of the product of two spaces.
Properties
The space $(X, \sigma)$ has the following properties:
It is Urysohn, since $(X, \tau)$ is regular. The space $(X, \sigma)$ is not regular, since $C = [0, \omega_1) \times \{0\}$ is a closed set not containing $p$, while every neighbourhood of $C$ intersects every neighbourhood of $p$.
It is semiregular, since each basis rectangle in the topology $\tau$ is a regular open set and so are the sets $U(\alpha, n)$ defined above with which the topology was expanded.
It is not countably compact, since the set $\{(\omega_1, -1/n) : n = 1, 2, 3, \ldots\}$ has no upper limit point.
It is not metacompact, since if $\{A_\beta\}$ is a covering of the ordinal space $[0, \omega_1)$ with no point-finite refinement, then the covering of $X$ it induces has no point-finite refinement.
See also
References
Lynn Arthur Steen and J. Arthur Seebach, Jr., Counterexamples in Topology. Springer-Verlag, New York, 1978. Reprinted by Dover Publications, New York, 1995. (Dover edition).
S. Watson, The Construction of Topological Spaces. Recent Progress in General Topology, Elsevier, 1992.
Topological spaces |
https://en.wikipedia.org/wiki/Arens%20square | In mathematics, the Arens square is a topological space, named for Richard Friederich Arens. Its role is mainly to serve as a counterexample.
Definition
The Arens square is the topological space where
The topology is defined from the following basis. Every point of is given the local basis of relatively open sets inherited from the Euclidean topology on . The remaining points of are given the local bases
Properties
The space is:
T2½, since neither points of , nor , nor can have the same second coordinate as a point of the form , for .
not T3 or T3½, since for there is no open set such that since must include a point whose first coordinate is , but no such point exists in for any .
not Urysohn, since the existence of a continuous function such that and implies that the inverse images of the open sets and of with the Euclidean topology, would have to be open. Hence, those inverse images would have to contain and for some . Then if , it would occur that is not in . Assuming that , then there exists an open interval such that . But then the inverse images of and under would be disjoint closed sets containing open sets which contain and , respectively. Since , these closed sets containing and for some cannot be disjoint. Similar contradiction arises when assuming .
semiregular, since the neighbourhood basis that defined the topology consists of regular open sets.
second countable, since $X$ is countable and each point has a countable local basis. On the other hand, $X$ is neither weakly countably compact nor locally compact.
totally disconnected but not totally separated, since its connected components and its quasi-components are all single points, except for the set $\{(0,0), (1,0)\}$, which is a two-point quasi-component.
not scattered (a space is scattered if every nonempty subset $A$ of it contains a point isolated in $A$), since each basis set is dense-in-itself.
not zero-dimensional, since doesn't have a local basis consisting of open and closed sets. This is because |
https://en.wikipedia.org/wiki/Mark%20Vishik%27s%20seminar%20at%20Moscow%20State%20University | Mark Vishik started his seminar at Moscow State University in the spring of 1961,
at the suggestion of I. M. Gelfand.
The seminar ran on Mondays in Room 13-06 of the main building of Moscow State University, starting at 18:00 (initially at 16:00) and lasting for several hours.
It featured talks of many world class mathematicians from Russia, France, and other countries.
Traditionally, the speakers of the seminars were guests at M. I. Vishik's apartment on the Sunday night, before the seminar.
The seminar ran for more than 50 years, until Mark Vishik's death in June 2012.
Mark Iosifovich considered this seminar one of the main achievements of his life.
The seminar attracted the cream of the mathematical community, and speaking at its meetings was considered a great honour.
Talks during the last (2011–2012) academic year
See also
Séminaire Nicolas Bourbaki
French mathematical seminars
References
External links
Mark Vishik's seminar
Moscow State University
Mathematics conferences |
https://en.wikipedia.org/wiki/Microfluidic%20diffusional%20sizing | Microfluidic diffusional sizing (MDS) is a method to measure the size of particles based on the degree to which they diffuse within a microfluidic laminar flow. It allows size measurements to be taken from extremely small quantities of material (nanograms) and is particularly useful when sizing molecules whose size may vary depending on their environment - e.g. protein molecules, which may unfold or become denatured in unfavourable conditions.
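MDS instruments report size by first inferring a diffusion coefficient and then converting it to a hydrodynamic radius. The conversion step rests on the standard Stokes-Einstein relation, sketched below (a textbook relation, not the vendor's proprietary pipeline; default values assume water at 25 °C):

```python
from math import pi

def hydrodynamic_radius(D, T=298.15, eta=8.9e-4):
    """Stokes-Einstein: diffusion coefficient D (m^2/s) -> radius (m),
    for absolute temperature T (K) and solvent viscosity eta (Pa s)."""
    k_B = 1.380649e-23  # Boltzmann constant, J/K
    return k_B * T / (6 * pi * eta * D)

# A small protein diffusing at ~1e-10 m^2/s in water:
print(hydrodynamic_radius(1e-10))  # ~2.4e-9 m, i.e. a radius of ~2.4 nm
```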
Applications
MDS is primarily used in protein analyses, where size, concentration and interactions are important.
Protein size measurement
Measuring the size of a protein molecule is useful as an overall quality indicator, since misfolding, unfolding, oligomerization, aggregation or degradation can all affect size.
The literature specifically demonstrates the use of MDS in sizing protein-nanobody complexes, in monitoring the formation of α-synuclein amyloid fibrils, and in observing protein assembly into oligomers.
MDS can also be used to size membrane proteins, as the use of a protein specific labelling and detection system allows other species present in the solution (such as free lipid micelles or detergents) to be ignored.
Protein interactions
MDS has been used to characterise interactions between biomolecules under native conditions, and has been demonstrated to detect specific interactions within complex mixtures. It has also been used in detecting and quantifying protein-ligand interactions and protein-lipid interactions.
Protein concentration
The concentration of purified protein solutions in the laboratory is useful in determining yield and measuring the success of a prep. MDS reports concentration as well as size for each test.
Since the detection is not based on inherent fluorescence of tryptophan or tyrosine residues, MDS has been used as an alternative to A280 UV-Vis quantification.
Advantages
If protein specific labelling is applied, MDS allows membrane proteins to be sized. This is particularly usefu |
https://en.wikipedia.org/wiki/Schedule%20K | Schedule K is a geographic coding scheme originally developed by the United States Maritime Administration and currently maintained by the United States Army Corps of Engineers to identify seaports handling waterborne shipments involved with foreign trade of the United States. The codes consist of five numeric digits and are primarily for electronic communications concerning U.S. Customs.
Code lists are maintained and published by the Waterborne Commerce Statistics Center division of the United States Army Corps of Engineers.
Each country's ports share a common three-digit prefix, which is unique per country (with the exception of Canada, which has 21 prefixes grouped by geographic region). For example, all ports in Denmark begin with 409, while the United Kingdom prefix is 412.
The system is organized such that the first digit roughly corresponds to a broad geographic region, as the sketch below illustrates.
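A toy parser (not from the source) makes the structure concrete; the helper name, the sample code, and the two-entry prefix table (taken from the Denmark and United Kingdom examples above) are all illustrative only:

```python
# Known three-digit country prefixes, per the examples above.
COUNTRY_PREFIXES = {"409": "Denmark", "412": "United Kingdom"}

def describe_schedule_k(code: str) -> str:
    """Break a five-digit Schedule K code into its structural parts."""
    if len(code) != 5 or not code.isdigit():
        raise ValueError("Schedule K codes are exactly five numeric digits")
    country = COUNTRY_PREFIXES.get(code[:3], "unknown country")
    return f"{code}: region digit {code[0]}, country {country}, port suffix {code[3:]}"

print(describe_schedule_k("41201"))  # hypothetical code with the UK prefix
```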
External links
Waterborne Commerce Statistics Center
US Customs & Border Patrol: Where can I find foreign port codes or Schedule K codes?
Automated Manifest Interface Requirements - Ocean ACE M1 Appendix F (February 2017)
References
Geocodes |
https://en.wikipedia.org/wiki/Stabilization%20hypothesis | In mathematics, specifically in category theory and algebraic topology, the Baez–Dolan stabilization hypothesis, proposed by John Baez and James Dolan in 1995, states that suspension of a weak n-category has no more essential effect after n + 2 times. Precisely, it states that the suspension functor $S \colon n\mathrm{Cat}_k \to n\mathrm{Cat}_{k+1}$, from $k$-tuply monoidal weak $n$-categories to $(k+1)$-tuply monoidal ones, is an equivalence for $k \geq n + 2$.
References
Sources
External links
https://ncatlab.org/nlab/show/stabilization+hypothesis
Algebraic topology
Higher category theory |
https://en.wikipedia.org/wiki/Stuttgart%20Database%20of%20Scientific%20Illustrators%201450%E2%80%931950 | The Stuttgart Database of Scientific Illustrators 1450–1950 (abbreviated DSI) is an online repository of bibliographic data about people who illustrated published scientific works from the time of the invention of the printing press, around 1450, until 1950; the latter cut-off chosen with the intention of excluding currently-active illustrators. The database includes those who worked in a variety of fields, including anatomical, astronomical, botanical, zoological and medical illustration.
The database is hosted by the University of Stuttgart. Content is displayed in English, and is free to access. As of October 2023, the database includes over 13,000 illustrators. The site is searchable by 20 fields.
Suggestions for additional entries, or amendments, may be submitted by members of the public, but are subject to editorial review before inclusion.
References
Further reading
External links
https://dsi.hi.uni-stuttgart.de/
Bibliographic databases and indexes
Online databases
German websites
English-language websites
University of Stuttgart
2011 establishments in Germany |
https://en.wikipedia.org/wiki/Conservation%20paleobiology | Conservation paleobiology is a field of paleontology that applies the knowledge of the geological and paleoecological record to the conservation and restoration of biodiversity and ecosystem services. Although the influence of paleontology on the ecological sciences can be traced back at least to the 18th century, the current field was established by the work of K.W. Flessa and G.P. Dietl in the first decade of the 21st century. The discipline utilizes paleontological and geological data to understand how biotas respond to climate and other natural and anthropogenic environmental change. This information is then used to address the challenges faced by modern conservation biology, like understanding the extinction risk of endangered species, providing baselines for restoration and modelling future scenarios of species range contraction or expansion.
Description of the discipline
The main strength of conservation paleobiology is the availability of long-term data on species, communities and ecosystems that exceeds the timeframe of direct human experience. The discipline takes one of two approaches: near-time and deep-time.
Near-time conservation paleobiology
The near-time approach uses the recent fossil record (usually from the Late Pleistocene or the Holocene) to provide a long-term context to extant ecosystems dynamics. The fossil record is, in many cases, the only source of information on conditions previous to human impacts. These records can be used as reference baselines for comparisons in order to identify targets for restoration ecology, to analyze species responses to perturbations (natural and anthropogenic), understand historical species distributions and their variability, discriminate the factors that distinguish natural from non-natural changes in biological populations and identify ecological legacies only explicable by referring to past events or conditions.
Example - Conservation of the European bison
The European bison or wisent (Bison |
https://en.wikipedia.org/wiki/Interchange%20lemma | In the theory of formal languages, the interchange lemma states a necessary condition for a language to be context-free, just like the pumping lemma for context-free languages.
It states that for every context-free language $L$ there is a constant $c > 0$ such that for all integers $n \geq m \geq 2$ and for any collection $Q$ of length-$n$ words in $L$ there is a subset $\{w_1, \ldots, w_k\} \subseteq Q$ with $k \geq |Q| / (c n^2)$, and decompositions $w_i = x_i y_i z_i$ such that each of $|x_i|$, $|y_i|$, $|z_i|$ is independent of $i$; moreover, $m/2 < |y_i| \leq m$, and the words $x_i y_j z_i$ are in $L$ for every $i$ and $j$.
The first application of the interchange lemma was to show that the set of repetitive strings (i.e., strings of the form $xyyz$ with $y \neq \varepsilon$) over an alphabet of three or more characters is not context-free.
See also
Pumping lemma for regular languages
References
Formal languages
Lemmas |
https://en.wikipedia.org/wiki/Museum-digital | museum-digital is a project of museums to collaboratively publish their data online. Increasingly, it has also been targeting inventorying. Having published information on over 281,000 objects in Germany and 95,000 objects in Hungary, the project's work is currently focused on these countries.
Concept
museum-digital offers museums the option to publish their information, especially object information, online. The platform displays both textual and visual information on the objects. Once a respective object has been set public, its information is available for public reuse according to the given license.
To enrich search results, museum-digital makes use of controlled vocabularies, which are shared between the different instances. The larger international versions have their own, language-specific controlled vocabularies.
Museums from different regions of Germany have bound together in regional instances of museum-digital, organized through their respective museum associations. These regional instances are aggregated into a national instance, where information can be searched across regions.
Furthermore, museum-digital can serve museums as an aggregator for data to be exported to the Deutsche Digitale Bibliothek and the Europeana.
History
The project was founded in 2009, based on an initiative of the "AG Digitalisierung" (Working Group Digitization) of the Museum Association of Saxony-Anhalt. In October of the same year, 187 museums from within Germany were participating and 15,400 objects were available online.
By 2016, a number of additional regional, international (Hungary, Brazil, Indonesia), and topical ("agrargeschichte" [History of Agriculture]) instances had been created.
Currently, 572 museums in Germany are participating in the project, with over 281,000 objects.
In Saxony-Anhalt and Rhineland-Palatinate, the project has received funding from the respective states.
Development
The different tools museum-digital provides are created using PHP, JavaScript an |
https://en.wikipedia.org/wiki/Cell%20unroofing | Cell unroofing is any of various methods to isolate and expose the cell membrane of cells. Unlike the more common membrane-extraction protocols performed with multiple steps of centrifugation (whose goal is to separate the membrane fraction from a cell lysate), cell unroofing aims to tear away and preserve patches of the plasma membrane in order to perform in situ experiments (microscopy and biomedical spectroscopy).
History
The first observation of the bilayer cell membrane was made in 1959 on a section of a cell using the electron microscope.
The first micrograph of the internal side of a cell, however, dates back to 1977, by M.V. Nermut. Professor John Heuser made substantial contributions in the field, imaging the detailed internal structure of the membrane and the cytoskeleton bound to it with extensive use of the electron microscope.
It was only after the development of the atomic force microscope operated in liquid that it was possible to image the cell membranes in almost-physiological conditions and to test its mechanical properties.
Methods
Freeze-fracturing of monolayers
Quick-freeze deep-etch electron microscopy and cryofixation
Sonication for atomic force microscopy
Single-cell unroofing
See also
Sonoporation
Lysis
References
Cell biology
Scientific techniques |
https://en.wikipedia.org/wiki/Replit | Replit, formerly Repl.it, is an American start-up and an online integrated development environment (IDE). As a software-as-a-service (SaaS) offering, Replit allows users to create online projects (called Repls, not to be confused with REPLs) and write code.
Replit has a global community for programmers to interact and offers Teams for Education, a product to assist in teaching programming in the classroom.
History
Replit was co-founded by programmers Amjad Masad, Faris Masad, and designer Haya Odeh in 2016. Once listed as a co-founder alongside Masad, Max Shawabkeh left the venture early on. Its name comes from the acronym REPL, which stands for "read–evaluate–print loop".
Before creating Replit, Amjad Masad worked in engineering roles at Yahoo and Facebook, where he built development tools. He also helped found Codecademy. Masad had come up with the idea for Replit over a decade before its creation.
In 2009, Amjad Masad sought to write implementations of other programming languages in JavaScript, but realized it was not practically feasible. He saw great leaps in browser and web technologies and was inspired by the web capabilities of Google Docs. He thought of the idea of being able to write and share code entirely in a web browser. He spent two years creating an open-source product with Haya Odeh called "JSRepl", which allowed him to compile languages into JavaScript; it powered Udacity's and Codecademy's tutorials. After he became an early employee of Codecademy, the project was put off until years later, when he and Odeh decided to revive the idea of a programming environment in a browser.
As Replit was taking shape, Masad and Odeh wanted to have "a real environment and not something emulated in the browser." The focus was first directed at the education market, and then later towards professional developers.
Since March 2021, "replit.com" has been the default domain name for the web service replacing the older "repl.it". This change was attributed to Masad' |
https://en.wikipedia.org/wiki/Ant%20Wars%20%28board%20game%29 | Ant Wars is a 1982 board game published by Jason McAllister Games.
Gameplay
Ant Wars is a game in which tribes of ants battle each other for control of the back yard.
Reception
Steve Jackson reviewed Ant Wars in Space Gamer No. 66. Jackson commented that "On the whole [...] it's fun. I can't help it . . . I have to say this: I like it despite the bugs in it. Sorry . . . couldn't resist."
Reviews
White Wolf #43 (May 1994)
References
Biology-themed board games
Board games introduced in 1982 |
https://en.wikipedia.org/wiki/PlaidML | PlaidML is a portable tensor compiler. Tensor compilers bridge the gap between the universal mathematical descriptions of deep learning operations, such as convolution, and the platform- and chip-specific code needed to perform those operations with good performance. Internally, PlaidML makes use of the Tile eDSL to generate OpenCL, OpenGL, LLVM, or CUDA code. It enables deep learning on devices where the available computing hardware is either not well supported or the available software stack contains only proprietary components. For example, it does not require the usage of CUDA or cuDNN on Nvidia hardware, while achieving comparable performance.
PlaidML supports the machine learning libraries Keras, ONNX, and nGraph. However, Keras has dropped support for multiple backends, and the latest Keras versions are not compatible with PlaidML. An integration with TensorFlow's Keras is planned as a replacement.
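With a compatible multi-backend Keras release, PlaidML is enabled by monkey-patching the backend before Keras is imported; this is a sketch of that documented pattern (PlaidML must already be installed and configured via `plaidml-setup`), and it will not work with current Keras versions for the reason above:

```python
# Swap in the PlaidML backend before any other Keras import.
import plaidml.keras
plaidml.keras.install_backend()

import keras  # models built from here on compile through PlaidML
print(keras.backend.backend())
```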
History
In August 2018 Intel acquired Vertex.AI, a startup whose mission statement was “deep learning for every platform”. Intel released PlaidML as free software under the terms of the Apache License (version 2.0) to improve compatibility with nGraph, TensorFlow, and other ecosystem software.
References
External links
Tensors
Compilers |
https://en.wikipedia.org/wiki/Phase%20reduction | Phase reduction is a method used to reduce a multi-dimensional dynamical equation describing a nonlinear limit cycle oscillator into a one-dimensional phase equation. Many phenomena in our world, such as chemical reactions, electric circuits, mechanical vibrations, cardiac cells, and spiking neurons, are examples of rhythmic phenomena and can be considered nonlinear limit cycle oscillators.
History
The theory of the phase reduction method was first introduced in the 1950s, when the existence of periodic solutions for nonlinear oscillators under perturbation was discussed by Malkin. In the 1960s, Winfree illustrated the importance of the notion of phase and formulated the phase model for a population of nonlinear oscillators in his studies on biological synchronization. Since then, many researchers have discovered different rhythmic phenomena related to phase reduction theory.
Phase model of reduction
Consider the dynamical system of the form
$\dot{X} = F(X),$
where $X = X(t) \in \mathbb{R}^N$ is the oscillator state variable and $F(X)$ is the baseline vector field. Let $\phi(t, x)$ be the flow induced by the system, that is, $\phi(t, x)$ is the solution of the system for the initial condition $X(0) = x$. This system of differential equations can describe a conductance-based neuron model with $X = (V, m_1, \ldots, m_{N-1})$, where $V$ represents the voltage difference across the membrane and $(m_1, \ldots, m_{N-1})$ represents the $(N-1)$-dimensional vector that defines the gating variables. When a neuron is perturbed by a stimulus current, the dynamics of the perturbed system will no longer be the same as the dynamics of the baseline neural oscillator.
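As a concrete illustration (not from the source), the Stuart-Landau oscillator below is a standard limit-cycle system whose asymptotic phase is simply the polar angle, so the reduction from a two-dimensional state to a single phase variable can be read off directly:

```python
import numpy as np
from scipy.integrate import solve_ivp

def stuart_landau(t, s, omega=1.0):
    """Stuart-Landau oscillator: attracting limit cycle at radius 1."""
    x, y = s
    r2 = x * x + y * y
    return [(1 - r2) * x - omega * y, (1 - r2) * y + omega * x]

sol = solve_ivp(stuart_landau, (0, 50), [0.1, 0.0])
x, y = sol.y[:, -1]
print(np.hypot(x, y))    # ~1.0: the state has relaxed onto the limit cycle
print(np.arctan2(y, x))  # the one-dimensional phase variable theta
```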
The target here is to reduce the system by defining a phase for each point in some neighborhood of the limit cycle. Sufficiently small perturbations (e.g. external forcing or stimulus effects on the system) might cause a large deviation of the phase, but the amplitude is only slightly perturbed because of the attraction of the limit cycle. Hence we need to extend the definition of the phase to points in the neighborhood of the cycle by introducing the |
https://en.wikipedia.org/wiki/Squeeze%20flow | Squeeze flow (also called squeezing flow, squeezing film flow, or squeeze flow theory) is a type of flow in which a material is pressed out or deformed between two parallel plates or objects. First explored in 1874 by Josef Stefan, squeeze flow describes the outward movement of a droplet of material, its area of contact with the plate surfaces, and the effects of internal and external factors such as temperature, viscoelasticity, and heterogeneity of the material. Several squeeze flow models exist to describe Newtonian and non-Newtonian fluids undergoing squeeze flow under various geometries and conditions. Numerous applications across scientific and engineering disciplines including rheometry, welding engineering, and materials science provide examples of squeeze flow in practical use.
Basic Assumptions
Conservation of mass (expressed as a continuity equation), the Navier-Stokes equations for conservation of momentum, and the Reynolds number provide the foundations for calculating and modeling squeeze flow. Boundary conditions for such calculations include the assumptions of an incompressible fluid, a two-dimensional system, negligible body forces, and negligible inertial forces.
Relating applied force to material thickness:
$F = \frac{\mu L W^3}{h^3} \left( -\frac{dh}{dt} \right),$
where $F$ is the applied squeezing force, $L$ is the initial length of the droplet, $\mu$ is the fluid viscosity, $W$ is the width of the assumed rectangular plate, $h$ is the final height of the droplet, and $dh/dt$ is the change in droplet height over time. To simplify most calculations, the applied force is assumed to be constant.
Newtonian fluids
Several equations accurately model Newtonian droplet sizes under different initial conditions.
Consideration of a single asperity, or surface protrusion, allows for measurement of a very specific cross-section of a droplet. To measure macroscopic squeeze flow effects, models exist for the two most common geometries: circular and rectangular plate squeeze flows.
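For the circular-plate case, the classical result is Stefan's equation; the sketch below evaluates it (a textbook relation stated here on my own initiative, with illustrative numbers, and with h taken as the full gap between the plates):

```python
from math import pi

def stefan_force(mu, R, h, dh_dt):
    """Stefan's equation for a Newtonian fluid squeezed between two
    no-slip circular plates: F = (3*pi*mu*R**4 / (2*h**3)) * (-dh/dt).
    mu in Pa*s, R and h in m, dh_dt in m/s; returns the force in N."""
    return 3 * pi * mu * R**4 / (2 * h**3) * (-dh_dt)

# Glycerol (mu ~ 1.4 Pa s) between plates of radius 2 cm, gap 1 mm,
# plates approaching at 1 mm/s:
print(stefan_force(1.4, 0.02, 1e-3, -1e-3))  # ~1.1 N
```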
Single asperity
For single asperity squeeze flow:
Wh |
https://en.wikipedia.org/wiki/Patinho%20Feio | Patinho Feio (Portuguese for "Ugly Duckling", in reference to the fairy tale) was the first minicomputer designed and manufactured entirely in Brazil. It was made between 1971 and 1972 in the Polytechnic School of the University of São Paulo by the Digital Systems Laboratory (currently called Department of Computer Engineering and Digital Systems).
Technology
Patinho Feio was an 8-bit minicomputer with a memory of 8 kB and an instruction cycle of 2 microseconds. It was programmed in assembly language.
See also
Zezinho
References
Computers designed in Brazil
Minicomputers
Early microcomputers
One-of-a-kind computers
University of São Paulo |
https://en.wikipedia.org/wiki/Interconnect%20%28integrated%20circuits%29 | In integrated circuits (ICs), interconnects are structures that connect two or more circuit elements (such as transistors) together electrically. The design and layout of interconnects on an IC is vital to its proper function, performance, power efficiency, reliability, and fabrication yield. The material interconnects are made from depends on many factors. Chemical and mechanical compatibility with the semiconductor substrate and the dielectric between the levels of interconnect is necessary, otherwise barrier layers are needed. Suitability for fabrication is also required; some chemistries and processes prevent the integration of materials and unit processes into a larger technology (recipe) for IC fabrication. In fabrication, interconnects are formed during the back-end-of-line after the fabrication of the transistors on the substrate.
Interconnects are classified as local or global interconnects depending on the signal propagation distance they are able to support. The width and thickness of the interconnect, as well as the material from which it is made, are some of the significant factors that determine the distance a signal may propagate. Local interconnects connect circuit elements that are very close together, such as transistors separated by ten or so other contiguously laid out transistors. Global interconnects can transmit further, such as over large-area sub-circuits. Consequently, local interconnects may be formed from materials with relatively high electrical resistivity such as polycrystalline silicon (sometimes silicided to extend its range) or tungsten. To extend the distance an interconnect may reach, various circuits such as buffers or restorers may be inserted at various points along a long interconnect.
Interconnect properties
The geometric properties of an interconnect are width, thickness, spacing (the distance between an interconnect and another on the same level), pitch (the sum of the width and spacing), and aspect ratio, or AR, (the thickn |
https://en.wikipedia.org/wiki/Dounce%20homogenizer | Invented by and named for Alexander Dounce
, a Dounce homogenizer or "Douncer", is a cylindrical glass tube, closed at one end, with two glass pestles of carefully specified outer diameters, intended for the gentle homogenization of eukaryotic cells (e.g. mammalian cells). Dounce homogenizers are still commonly used today to isolate cellular organelles.
The two Dounce homogenizer pestles (known as the "loose" or "A" and "tight" or "B" pestles) have carefully specified outer diameters relative to the inner diameter of the cylinder. The "A" (loose) pestle has a clearance from the cylinder wall of ~0.0025 - 0.0055 in., while the "B" (tight) pestle has a clearance of ~0.0005 - 0.0025 in. This allows tissue and cells to be lysed by shear stress with minimal (if any) heating, thereby leaving extracted organelles or heat-sensitive enzyme complexes largely intact.
Typically, a soft tissue (e.g. mammalian liver) is cut or broken into smaller pieces and placed into the glass cylinder, alongside a suitable volume of an appropriate lysis buffer. Homogenization is performed by a defined number of "passes" of the pestles, first with the loose pestle, then with the tight pestle, up and down the cylinder. Five to ten passes are typical. Dounce homogenizers are typically produced from borosilicate glass, but are still fragile, and should be used with care. Especially hard or tough tissues should be pre-homogenized before use in a dounce homogenizer.
Eukaryotic cells with tough cell walls, such as Saccharomyces cerevisiae, cannot be directly lysed with a dounce homogenizer, unless the cell wall is first broken down (e.g. with lyticase, or zymolyase in the case of S. cerevisiae).
References
Cell biology
Biochemistry
Laboratory glassware |
https://en.wikipedia.org/wiki/Pyramid%20wavefront%20sensor | A pyramid wavefront sensor is a type of a wavefront sensor. It measures the optical aberrations of an optical wavefront. This wavefront sensor uses a pyramidal prism with a large apex angle to split the beam into multiple parts at the geometric focus of a lens. A four-faceted prism, with its tip centered at the peak of the point spread function, will generate four identical pupil images in the absence of optical aberrations. In the presence of optical aberrations, the intensity distribution among the pupils will change. The local wavefront gradients can be obtained by recording the distribution of intensity in the pupil images. The wavefront aberrations can be evaluated from the estimated wavefront gradients.
It has potential applications in astronomy and ophthalmology.
Modulation
The prism is often modulated (mechanically moved in a circle/square) for averaging purposes and to make sure that the ray spends an equal fraction of the total time on every face of the pyramidal prism. This makes the wavefront sensor slightly inconvenient to use due to the need for mechanically moving parts – either the prism or the beam is modulated. A light-diffusing plate can eliminate the need for mechanically moving parts. Alternatively, it has been shown that the need for mechanically moving parts can be overcome in a digital pyramid wavefront sensor with a spatial light modulator.
References
Observational astronomy
Optical instruments
Sensors |
https://en.wikipedia.org/wiki/Cryptoeconomics | Cryptoeconomics is an evolving economic paradigm for a cross-disciplinary approach to the study of digital economies and decentralized finance (DeFi) applications. Cryptoeconomics integrates concepts and principles from traditional economics, cryptography, computer science, and game theory disciplines. Just as traditional economics provides a theoretical foundation for traditional financial (a.k.a., Centralized Finance or CeFi) services, cryptoeconomics provides a theoretical foundation for DeFi services bought and sold via fiat cryptocurrencies, and executed by smart contracts.
Definitions and goals
The term cryptoeconomics was coined by the Ethereum community during its formative years (2014-2015), but was initially inspired by the application of economic incentives in the original Bitcoin protocol in 2008. Although the phrase is typically attributed to Vitalik Buterin, the earliest public documented usage is a 2015 talk by Vlad Zamfir entitled “What is Cryptoeconomics?” Zamfir's view of cryptoeconomics is relatively broad and academic: “… a formal discipline that studies protocols that govern the production, distribution, and consumption of goods and services in a decentralized digital economy. Cryptoeconomics is a practical science that focuses on the design and characterization of these protocols”. Alternatively, in a 2017 talk, Buterin's view is more narrow and pragmatic: “… a methodology for building systems that try to guarantee certain kinds of information security properties”.
According to Binance, the primary goals of cryptoeconomics are to understand how to fund, design, develop, and facilitate the operations of DeFi systems, and to apply economic incentives and penalties to regulate the distribution of goods and services in emerging digital economies.
Cryptoeconomics may be considered an evolution of digital economics, which in turn evolved from traditional economics (commonly divided into microeconomics and macroeconomics). Consequently, tradition |
https://en.wikipedia.org/wiki/Lewis%27%20law | For the statement about feminism of the same name, see Helen Lewis.
Lewis' law gives a relationship between the size and the shape of epithelial cells. It states that the average apical area of an epithelial cell is linearly related to its neighbor number $n$. It is a phenomenological law that was first described in the cucumber epidermis by the morphologist Frederic Thomas Lewis in 1928. The simplest version of Lewis' law can be expressed as $\bar{A}_n / \bar{A} = (n - 2) / 4$, which reads: the average apical area $\bar{A}_n$ of a cell with $n$ neighbors, divided by the average apical area $\bar{A}$ of all cells, is a linear function of the neighbor number. While neighbor number distributions change throughout organogenesis, the average neighbor number of epithelial cells is $n = 6$, which can be traced back to Euler's formula for polygons.
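Evaluating the linear form quoted above makes the prediction tangible; this tiny check (my own illustration) prints the relative areas for common neighbor numbers:

```python
# Relative apical areas predicted by Lewis' law, A_n / A_mean = (n - 2) / 4.
for n in range(4, 10):
    print(n, (n - 2) / 4)
# A hexagonal cell (n = 6) has exactly the mean area: (6 - 2) / 4 = 1.0.
```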
Discovery
Frederic Thomas Lewis noticed that epidermal cells display a patterning similar to froths, which led him to quantify and analyze the sizes and shapes of epidermal cells.
Confirmation and mechanism
A variety of empirical studies in different epithelial tissues have confirmed Lewis' law.
It has been suggested that the emergence of Lewis' law on the apical surface of epithelia is a result of the concurrence of
the tendency of cells to minimize intercellular contact surface energy, and
the distribution of apical cell areas (as a result of cellular processes such as cell division).
According to this theory, the observed tissue-specific polygon distributions and Lewis' law arise as a compromise in order to maintain tissue integrity.
Importance
In order to understand morphogenetic events, i.e. the growth and shaping of tissues and organs, it is necessary to analyze the packing of cells into tissues. In that context, an analysis of patterning processes can help to identify the underlying mechanisms that drive morphogenesis.
References
Epithelium
Biological theorems |
https://en.wikipedia.org/wiki/List%20of%20BioBlitzes%20in%20New%20Zealand | This is a list of BioBlitzes that have been held in New Zealand. The date is the first day of the BioBlitz if held over several days. This list only includes those that were major public events. BioBlitz was established in New Zealand by Manaaki Whenua - Landcare Research, initially based on seed funding from The Royal Society of NZ's "Science & Technology Promotion Fund 2003/2004". BioBlitz events have always been a collaborative activity of professional and amateur taxonomic experts from multiple organisations and the public. Auckland BioBlitz events were coordinated by Manaaki Whenua, later from 2015 moving to events coordinated by Auckland Museum. The first events were 24 hours continuously, e.g. from 3 pm Friday overnight to 3 pm Saturday. Subsequently, this changed to 24 hours spread across mostly daylight hours over 2 consecutive days.
References
Biological censuses |
https://en.wikipedia.org/wiki/Frank%E2%80%93Van%20der%20Merwe%20growth | Frank–Van der Merwe growth (FM growth) is one of the three primary modes by which thin films grow epitaxially at a crystal surface or interface. It is also known as 'layer-by-layer growth'. It is considered an ideal growth model, requiring perfect lattice matching between the substrate and the layer growing on to it, and it is usually limited to homoepitaxy. For FM growth to occur, the atoms that are to be deposited should be more attracted to the substrate than to each other, which is in contrast to the layer-plus-island growth model. FM growth is the preferred growth model for producing smooth films.
It was first described by South African physicist Jan van der Merwe and British physicist Frederick Charles Frank in a series of four papers based on Van der Merwe's PhD research between 1947 and 1949.
See also
Epitaxy
Thin films
Molecular-beam epitaxy
References
Thin films |
https://en.wikipedia.org/wiki/The%20spider%20and%20the%20fly%20problem | The spider and the fly problem is a recreational mathematics problem with an unintuitive solution, asking for a shortest path or geodesic between two points on the surface of a cuboid. It was originally posed by Henry Dudeney.
Problem
In the typical version of the puzzle, an otherwise empty cuboid room 30 feet long, 12 feet wide and 12 feet high contains a spider and a fly. The spider is 1 foot below the ceiling and horizontally centred on one 12′×12′ wall. The fly is 1 foot above the floor and horizontally centred on the opposite wall. The problem is to find the minimum distance the spider must crawl along the walls, ceiling and/or floor to reach the fly, which remains stationary.
Solutions
A naive solution is for the spider to remain horizontally centred, and crawl up to the ceiling, across it and down to the fly, giving a distance of 42 feet. Instead, the shortest path, 40 feet long, spirals around five of the six faces of the cuboid. Alternatively, it can be described by unfolding the cuboid into a net and finding a shortest path (a line segment) on the resulting unfolded system of six rectangles in the plane. Different nets produce different segments with different lengths, and the question becomes one of finding a net whose segment length is minimum. Another path, of intermediate length $\sqrt{1744} \approx 41.76$ feet, crosses diagonally through four faces instead of five.
For a room of length $l$, width $w$ and height $h$, the spider a distance $b$ below the ceiling, and the fly a distance $a$ above the floor, the length of the spiral path is $\sqrt{(l + a + b)^2 + (w + h)^2}$, while the naive solution has length $l + b + (h - a)$. Depending on the dimensions of the cuboid, and on the initial positions of the spider and fly, one or another of these paths, or of four other paths, may be the optimal solution. However, there is no rectangular cuboid, and two points on the cuboid, for which the shortest path passes through all six faces of the cuboid.
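Using the path lengths reconstructed above (the code therefore shares their confidence), Dudeney's numbers check out directly:

```python
from math import hypot

l, w, h = 30, 12, 12  # room dimensions in feet
a, b = 1, 1           # fly above the floor; spider below the ceiling

naive = l + b + (h - a)           # over the ceiling, staying centred
four = hypot(l + h - a - b, w)    # diagonal route across four faces
spiral = hypot(l + a + b, w + h)  # five-face spiral, after unfolding

print(naive, four, spiral)        # 42  ~41.76  40.0
```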
A different lateral thinking solution, beyond the stated rules of the puzzle, involves the spider attach |
https://en.wikipedia.org/wiki/Human%20milk%20immunity | Human milk immunity is the protection provided to the immune system of an infant via the biologically active components in human milk. Human milk was previously thought to only provide passive immunity primarily through Secretory IgA, but advances in technology have led to the identification of various immune-modulating components. Human milk constituents provide nutrition and protect the immunologically naive infant as well as regulate the infant's own immune development and growth.
Immune factors and immune-modulating components in human milk include cytokines, growth factors, proteins, microbes, and human milk oligosaccharides. Immune factors in human milk are categorized mainly as anti-inflammatory, working without inducing inflammation or activating the complement system.
Immune factors
Bio-active constituents of human milk that have been cataloged to possess immune-modulating capabilities include immunoglobulins, Lactoferrin, Lysozyme, oligosaccharides, lipids, cytokines, hormones, and growth factors. Some of the roles of bio-actives in human milk are theorized based on their function in other parts of the body, but the mechanisms and function of their activities remain to be discovered.
IgA
Immunoglobulin A is the most well-known immune factor in human milk. In its secretory form, SIgA, it is the most plentiful antibody in human milk, constituting 80-90% of all immunoglobulins present in milk. SIgA provides adaptive immunity by directly targeting specific pathogens that both infant and mother have been exposed to in their environments.
Lactoferrin
Lactoferrin is an immune protein with strong anti-microbial function in human milk. Lactoferrin protects the infant intestine by binding to iron to prevent pathogens from utilizing it as a resource. It also modulates immunity by blocking inflammatory signaling cytokines.
Cytokines
Cytokines are pluripotent signaling molecules with the ability to bind to specific receptors. They can cross |
https://en.wikipedia.org/wiki/Amazon%20DocumentDB | Amazon DocumentDB is a managed proprietary NoSQL database service that supports document data structures, with some compatibility with MongoDB version 3.6 (released by MongoDB in 2017) and version 4.0 (released by MongoDB in 2018). As a document database, Amazon DocumentDB can store, query, and index JSON data. It is available on Amazon Web Services. As of March 2023, AWS introduced some compatibility with MongoDB 5.0, but still lacks time series collection support.
Main features
A document database natively stores JSON data. DocumentDB provides single document lookups, index scans, regular expression queries, and aggregations. It can create single-field, compound, and multi-key indexes to improve the performance of query patterns. Reads from the indexes on the primary instance are read-after-write consistent and users can delete or create new indexes at any time.
DocumentDB was an enhancement to the Amazon Aurora relational database system, specifically the PostgreSQL-compatible edition. Its architecture separates storage and computing so that each layer can scale independently, though the system is limited to a single writable primary. Amazon DocumentDB, through Aurora PostgreSQL, uses the Aurora Storage Engine, originally built for the MySQL relational database. This storage engine is distributed, fault-tolerant, self-healing, and durable, which it maintains by replicating data six ways across three AWS Availability Zones (AZs). Amazon DocumentDB databases cannot span AWS Regions or span cloud providers, nor can Amazon DocumentDB run on-premises. The system can support up to 15 low-latency readable replicas and continuously backs up all changes to Amazon S3.
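Because of that wire-protocol compatibility, a standard MongoDB driver such as pymongo can talk to a cluster; the endpoint, credentials, database name, and document below are placeholders, not real values:

```python
from pymongo import MongoClient

# Connect over TLS using the Amazon-provided CA bundle; DocumentDB does
# not support retryable writes, so they must be disabled explicitly.
client = MongoClient(
    "mongodb://user:password@mycluster.docdb.amazonaws.com:27017",
    tls=True,
    tlsCAFile="global-bundle.pem",
    retryWrites=False,
)
db = client["appdb"]
db.items.insert_one({"sku": "abc-123", "qty": 4})
print(db.items.find_one({"sku": "abc-123"}))
```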
See also
Amazon Aurora
Amazon Redshift
Amazon DynamoDB
Amazon Relational Database Service
References
External links
A Guide To Amazon Web Services
Comparing MongoDB to DocumentDB
DocumentDB Test Compatibility With MongoDB
DocumentDB
Cloud storage
Distributed data stores
Structured storage
NoSQL products
Cl |
https://en.wikipedia.org/wiki/European%20Project%20on%20Ocean%20Acidification | The European Project on Ocean Acidification (EPOCA) was Europe's first major research initiative and the first large-scale international research effort devoted to studying the impacts and consequences of ocean acidification. EPOCA was an EU FP7 Integrated Project active during four years, from 2008 to 2012.
The EPOCA consortium brought together more than 160 researchers from 32 institutes in 10 European countries (Belgium, France, Germany, Iceland, Italy, The Netherlands, Norway, Sweden, Switzerland, and the United Kingdom) and was coordinated by the French Centre National de la Recherche Scientifique (CNRS) with the project office based at the Institut de la Mer de Villefranche, France (formerly Observatoire Océanologique de Villefranche).
Scope
The research carried out through EPOCA was structured around four themes:
Theme 1 investigated the changes in ocean chemistry and biogeography across space and time. Paleo-reconstruction methods were used on several archives, including foraminifera and deep-sea corals, to determine the past variability in ocean chemistry and to tie these to present-day chemical and biological observations;
Theme 2 studied the sensitivity of marine organisms, communities and ecosystems to ocean acidification. Key climate-relevant biogeochemical processes such as calcification, primary production and nitrogen fixation were investigated using a large array of techniques, ranging from molecular tools to physiological and ecological approaches. Perturbation experiments were carried out both in the laboratory and in the field, including a major large-scale offshore mesocosm experiment in Svalbard in 2010.
Theme 3 focused on the integration of th |
https://en.wikipedia.org/wiki/Symmorphosis | Symmorphosis is the regulation of biological units to produce an optimal outcome: a quantitative match of design and function within an organism's functional systems. Symmorphosis can be broken down into three predictions that are required for organs to evolve within a species.
This proposes that if organs were matched structurally and functionally, and paired with the correct energy and minerals, the body would create an organ of optimal design. Some examples of this in the human body could be how the respiratory system distributes oxygen, how bones are structured to withstand stress, how blood vessels are designed to distribute blood throughout the body without using a lot of energy, or even how a person's body, as they become more physically fit or endure more cardiovascular exercise, adjusts to maintain higher functional demands. The use of symmorphosis can allow other fields of science to work with evolutionary biology to better understand adaptation.
Requirements
For symmorphosis to occur, there must be three predictions or guidelines in place and functioning at the same time. These three predictions work together to let an organ or organ system work at full potential.
Structure
When looking at the theory of symmorphosis, one must consider whether the design of the organism is fully optimized. Structural design in terms of symmorphosis means that the organ is designed to allow the full capacity of its function and can allow for adjustments when necessary. This design must contain a sufficient amount of economy material for the organ in question. In this circumstance, economy material is the careful management of resources such as tissues.
Capacity
The functional capacity is the maximal capacity determined by all functional units working together. Functional capacity is overall determined by the structural design. Once the design is optimized in terms of biological materials, then the structure must |
https://en.wikipedia.org/wiki/Hybrid%20incompatibility | Hybrid incompatibility is a phenomenon in plants and animals, wherein offspring produced by the mating of two different species or populations have reduced viability and/or are less able to reproduce. Examples of hybrids include mules and ligers from the animal world, and subspecies of the Asian rice crop Oryza sativa from the plant world. Multiple models have been developed to explain this phenomenon. Recent research suggests that the source of this incompatibility is largely genetic, as combinations of genes and alleles prove lethal to the hybrid organism. Incompatibility is not solely influenced by genetics, however, and can be affected by environmental factors such as temperature. The genetic underpinnings of hybrid incompatibility may provide insight into factors responsible for evolutionary divergence between species.
Background
Hybrid incompatibility occurs when the offspring of two closely related species are not viable or suffer from infertility. Charles Darwin posited that hybrid incompatibility is not a product of natural selection, stating that the phenomenon is an outcome of the hybridizing species diverging, rather than something that is directly acted upon by selective pressures. The underlying causes of the incompatibility can be varied: earlier research focused on things like changes in ploidy in plants. More recent research has taken advantage of improved molecular techniques and has focused on the effects of genes and alleles in the hybrid and its parents.
Dobzhansky-Muller model
The first major breakthrough in the genetic basis of hybrid incompatibility is the Dobzhansky-Muller model, a combination of findings by Theodosius Dobzhansky and Joseph Muller between 1937 and 1942. The model provides an explanation as to why a negative fitness effect like hybrid incompatibility is not selected against. By hypothesizing that the incompatibility arose from alterations at two or more loci, rather than one, the incompatible alleles are in one hybrid in |
https://en.wikipedia.org/wiki/Biotic%20homogenization | Biotic homogenization is the process by which two or more spatially distributed ecological communities become increasingly similar over time. This process may be genetic, taxonomic, or functional, and it leads to a loss of beta (β) diversity. While the term is sometimes used interchangeably with "taxonomic homogenization", "functional homogenization", and "genetic homogenization", biotic homogenization is actually an overarching concept that encompasses the other three. This phenomenon stems primarily from two sources: extinctions of native species and invasions of nonnative species. While this process pre-dates human civilization, as evidenced by the fossil record, and still occurs due to natural impacts, it has recently been accelerated due to anthropogenic pressures. Biotic homogenization has become recognized as a significant component of the biodiversity crisis, and as such has become of increasing importance to conservation ecologists.
Overview
Homogenization versus differentiation
Homogenization is the process of assemblages becoming increasingly similar: the reverse is the process of assemblages becoming increasingly different over time, a process known as "biotic differentiation". Just as biotic homogenization has genetic, taxonomic, and functional components, differentiation can occur at any of these levels of organization.
Alpha and beta diversity
Understanding homogenization requires an understanding of the difference between alpha (α) and beta (β) diversity. Alpha diversity refers to diversity within a community: it addresses how many species are present. A community with high α diversity has many species present. Beta diversity compares multiple communities: for two communities to show high β diversity, their species compositions must be distinct, with each containing unique species.
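To make the distinction concrete, here is a minimal Python sketch (ours, not from the source; the species names and the use of Whittaker's multiplicative beta are illustrative assumptions) computing α richness, pooled γ richness, and β = γ / mean(α) for two hypothetical communities:

# Two hypothetical communities, listed as sets of species
community_1 = {"oak", "maple", "birch", "pine"}
community_2 = {"oak", "maple", "cedar", "spruce"}

alpha_1, alpha_2 = len(community_1), len(community_2)   # within-community richness
gamma = len(community_1 | community_2)                  # pooled (regional) richness
beta = gamma / ((alpha_1 + alpha_2) / 2)                # Whittaker's beta: 1.0 = identical, 2.0 = no shared species

overlap = len(community_1 & community_2) / gamma        # Jaccard similarity; rises as homogenization proceeds
print(beta, overlap)                                    # 1.5 0.333...

As the two communities come to share more species, β falls toward 1 even though each community's α diversity is unchanged, which is exactly the signature of biotic homogenization.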
Species introduction, extinction, and richness
When organisms are introduced to a habitat, be it naturally or artificially, overall species richness increases (assumin |
https://en.wikipedia.org/wiki/Temperature-size%20rule | The temperature-size rule denotes the plastic response (i.e. phenotypic plasticity) of organismal body size to environmental temperature variation. Organisms exhibiting a plastic response are capable of allowing their body size to fluctuate with environmental temperature. First coined by David Atkinson in 1996, it is considered to be a unique case of Bergmann's rule that has been observed in plants, animals, birds, and a wide variety of ectotherms. Although exceptions to the temperature-size rule exist, recognition of this widespread "rule" has spurred efforts to understand the physiological mechanisms (via possible tradeoffs) underlying growth and body size variation in differing environmental temperatures.
History
Relation to Bergmann's rule
In 1847, Carl Bergmann published his observation that the body size of endotherms (e.g. mammals) increases with increasing latitude, a pattern now known as Bergmann's rule. His rule postulated that, within species, selection favored individuals with larger body sizes in cooler climates because total heat loss is diminished by a lower surface-area-to-volume ratio. However, ectotherms allow their internal body temperature to fluctuate with environmental temperature, whereas endotherms maintain a constant internal body temperature. Bergmann's heat-loss mechanism therefore describes observed body size variation in ectotherms poorly, since they routinely allow evaporative heat loss and do not maintain constant internal temperatures. Despite this, ectotherms have largely been observed to still exhibit larger body sizes in colder environments.
Formulation of the rule
Ray (1960) originally examined body sizes in several species of ectotherms and discovered that around 80% of them exhibited larger body sizes in lower temperatures. A few decades later, Atkinson (1994) performed a similar review of temperature effects on body size in ectotherms. His study, which included 92 species of ectotherms ranging from anim |
https://en.wikipedia.org/wiki/Cancer%20selection | Cancer selection can be viewed through the lens of natural selection. The animal host's body is the environment which applies the selective pressures upon cancer cells. The most fit cancer cells will have traits that allow them to outcompete other cancer cells to which they are related but from which they are genetically different. This genetic diversity of cells within a tumor gives cancer an evolutionary advantage over the host's ability to inhibit and destroy tumors. Therefore, other selective pressures, such as clinical and pharmaceutical treatments, are needed to help destroy the large number of genetically diverse cancerous cells within a tumor. This high genetic diversity among the cancer cells within a tumor is what makes cancer a formidable foe for the survival of animal hosts. It has also been proposed that cancer selection is a selective force that has driven the evolution of animals. Therefore, cancer and animals have been paired as competitors in co-evolution throughout time.
Natural selection
Evolution, which is driven by natural selection, is the cornerstone for nearly all branches of biology including cancer biology. In 1859, Charles Darwin's book On the Origin of Species was published, in which Darwin proposed his theory of evolution by means of natural selection. Natural selection is the force that drives changes in the phenotypes observed in populations over time, and is therefore responsible for the diversity amongst all living things. Pressures applied by natural selection upon individuals lead to evolutionary change over time. Natural selection is simply the set of selective pressures that act upon individuals within a population as their environment changes, favoring the traits best suited to those changes.
Selection and cancer
These same observations that Darwin proposed for the diversity in phenotypes amongst all living things can also be applied to cancer biology to explai |
https://en.wikipedia.org/wiki/IBM%20Q%20System%20One | IBM Quantum System One is the first circuit-based commercial quantum computer, introduced by IBM in January 2019.
This integrated quantum computing system is housed in an airtight glass cube that maintains a controlled physical environment. A cylindrical protrusion from the center of the ceiling is a dilution refrigerator, containing a 20-qubit transmon quantum processor. It was tested for the first time in the summer of 2018, for two weeks, in Milan, Italy.
IBM Quantum System One was developed by IBM Research, with assistance from the Map Project Office and Universal Design Studio. CERN, ExxonMobil, Fermilab, Argonne National Laboratory and Lawrence Berkeley National Laboratory are among the clients signed up to access the system remotely.
From April 6 to May 31, 2019, the Boston Museum of Science hosted an exhibit featuring a replica of the IBM Quantum System One.
On June 15, 2021, IBM deployed the first unit of Quantum System One in Germany at its headquarters in Ehningen.
See also
IBM Eagle
IBM Quantum Platform
Timeline of quantum computing and communication
Superconducting quantum computing
Qiskit
References
External links
Official website
Quantum computing
Computer-related introductions in 2019
IBM computers |
https://en.wikipedia.org/wiki/Light-emitting%20diode%20physics | Light-emitting diodes (LEDs) produce light (or infrared radiation) by the recombination of electrons and electron holes in a semiconductor, a process called "electroluminescence". The wavelength of the light produced depends on the energy band gap of the semiconductors used. Since these materials have a high index of refraction, design features of the devices such as special optical coatings and die shape are required to efficiently emit light. An LED is a long-lived light source, but certain mechanisms can cause slow loss of efficiency or sudden failure. The wavelength of the light emitted is a function of the band gap of the semiconductor material used; materials such as gallium arsenide, and others, with various trace doping elements, are used to produce different colors of light. Another type of LED uses a quantum dot which can have its properties and wavelength adjusted by its size. Light-emitting diodes are widely used in indicator and display functions, and white LEDs are displacing other technologies for general illumination purposes.
Electroluminescence
The p–n junction in any direct band gap material emits light when electric current flows through it. This is electroluminescence. Electrons cross from the n-region and recombine with the holes existing in the p-region. Free electrons are in the conduction band of energy levels, while holes are in the valence energy band. Thus the energy level of the holes is lower than that of the electrons. When an electron recombines with a hole, some portion of the energy must be dissipated; this energy is emitted in the form of heat and light.
In indirect band gap materials such as crystalline silicon and germanium, electrons dissipate this energy in the form of heat, but in gallium arsenide phosphide (GaAsP) and gallium phosphide (GaP) semiconductors, electrons dissipate energy by emitting photons. If the semiconductor is translucent, the junction becomes the source of light, thu |
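A back-of-the-envelope illustration of the band-gap/wavelength relationship in Python (ours, not from the source; the band-gap values are approximate textbook figures):

# Peak emission wavelength from the band gap: lambda = h*c / Eg,
# using h*c ~= 1239.84 eV*nm
HC_EV_NM = 1239.84

def emission_wavelength_nm(band_gap_ev):
    return HC_EV_NM / band_gap_ev

for material, eg in [("GaAs", 1.42), ("GaP", 2.26), ("GaN", 3.4)]:
    print(material, round(emission_wavelength_nm(eg)), "nm")
# GaAs ~873 nm (infrared), GaP ~549 nm (green), GaN ~365 nm (near-UV)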
https://en.wikipedia.org/wiki/ZeroTier | ZeroTier, Inc. is a software company with a freemium business model based in Irvine, California. ZeroTier provides proprietary software, SDKs and commercial products and services to create and manage virtual software-defined networks. The company's flagship end-user product ZeroTier One is a client application that enables devices such as PCs, phones, servers and embedded devices to securely connect to peer-to-peer virtual networks.
Software tools
ZeroTier markets proprietary tools, licensed under a Business Source License 1.1, intended to support the development and deployment of virtual data centers.
As of 2021, the product line consists of the following tools:
ZeroTier One, first released in 2014, is a portable client application that provides connectivity to public or private virtual networks.
Central, a web-based UI portal for managing virtual networks.
libzt (SDK), a linkable library that provides the functionality of ZeroTier One but that can be embedded in applications or services.
LF (pronounced "aleph"), a fully decentralized, fully replicated key/value store.
Client
The ZeroTier client is used to connect to virtual networks previously created in the ZeroTier Central web-based UI. Endpoint connections are peer-to-peer and end-to-end encrypted. STUN and hole punching are used to establish direct connections between peers behind NAT. Direct connection route discovery is made with the help of a global network of root servers via a mechanism similar to ICE in WebRTC.
Controller
Virtual networks are created and managed using a ZeroTier controller. Management is done using an API, a proprietary web-based UI (ZeroTier Central), or open-source web-based and CLI alternatives. Using root servers other than those hosted by ZeroTier Inc. is impeded by the software's license.
Security
The following considerations apply to ZeroTier's use as an SDWAN or VPN application:
Asymmetric public key encryption is Curve25519, a 256-bit elliptic curve variant.
All tra |
https://en.wikipedia.org/wiki/The%20Dark%20Overlord%20%28hacker%20group%29 | The Dark Overlord (also known as the TDO) is an international hacker organization which garnered significant publicity through cybercrime extortion of high-profile targets and public demands for ransom to prevent the release of confidential or potentially embarrassing documents.
The group gained its initial notoriety through the sale of stolen medical records on TheRealDeal, a darkweb marketplace. Major targets for the group included the extortion of Netflix, which resulted in the leak of unreleased episodes of the series Orange Is the New Black, and Disney.
In 2017, the group broke its trend of hacking and extortion, and began a series of terror-based attacks starting with the Columbia Falls school district in Montana. The group sent life-threatening text messages to students and their parents, demanding payment to prevent the murder of children. These attacks forced the closure of more than 30 schools across multiple school districts, resulting in more than 15,000 students being home from school for an entire week. During a Senate committee hearing, Senator Steve Daines (MT) referred to these attacks as "unprecedented".
On December 31, 2018, TDO announced the Lloyd's of London and Silverstein Properties "9/11 Papers" hack on Twitter, with thousands of incriminating documents to be released in stages unless US$2,000,000 in bitcoin were paid.
TDO was subsequently banned from many social media platforms including Twitter, Reddit, Pastebin and removed from the front end of an uncensorable blockchain called Steem/Hive. Platforms unrelated to TDO such as www.hpub.org also had their social media accounts eliminated or followers deleted for serving as mirrors of TDO hacked documents.
Arrests
Nathan Wyatt, a member of The Dark Overlord hacking group was extradited from the UK to the US in December 2019 to face charges in St. Louis for his involvement in the group. According to the charges, Wyatt "conspired to steal sensitive personally identifying information from vic |
https://en.wikipedia.org/wiki/K-outerplanar%20graph | In graph theory, a k-outerplanar graph is a planar graph that has a planar embedding in which the vertices belong to at most k concentric layers. The outerplanarity index of a planar graph is the minimum value of k for which it is k-outerplanar.
Definition
An outerplanar graph (or 1-outerplanar graph) has all of its vertices on the unbounded (outside) face of the graph. A 2-outerplanar graph is a planar graph with the property that, when the vertices on the unbounded face are removed, the remaining vertices all lie on the newly formed unbounded face. And so on.
More formally, a graph is k-outerplanar if it has a planar embedding such that, for every vertex, there is an alternating sequence of at most k faces and k vertices of the embedding, starting with the unbounded face and ending with the vertex, in which each consecutive face and vertex are incident to each other.
Properties and applications
The k-outerplanar graphs have treewidth at most 3k − 1. However, some bounded-treewidth planar graphs such as the nested triangles graph may be k-outerplanar only for very large k, linear in the number of vertices.
Baker's technique covers a planar graph with a constant number of k-outerplanar graphs and uses their low treewidth in order to quickly approximate several hard graph optimization problems.
In connection with the GNRS conjecture on metric embedding of minor-closed graph families, the k-outerplanar graphs are one of the most general classes of graphs for which the conjecture has been proved.
A conjectured converse of Courcelle's theorem, according to which every graph property recognizable on graphs of bounded treewidth by finite state tree automata is definable in the monadic second-order logic of graphs, has been proven for the k-outerplanar graphs.
Recognition
The smallest value of k for which a given graph is k-outerplanar (its outerplanarity index) can be computed in quadratic time.
References
Planar graphs |
https://en.wikipedia.org/wiki/Graph%20cut%20optimization | Graph cut optimization is a combinatorial optimization method applicable to a family of functions of discrete variables, named after the concept of cut in the theory of flow networks. Thanks to the max-flow min-cut theorem, determining the minimum cut over a graph representing a flow network is equivalent to computing the maximum flow over the network. Given a pseudo-Boolean function f, if it is possible to construct a flow network with positive weights such that
each cut C of the network can be mapped to an assignment of variables x to f (and vice versa), and
the cost of C equals f(x) (up to an additive constant)
then it is possible to find the global optimum of f in polynomial time by computing a minimum cut of the graph. The mapping between cuts and variable assignments is done by representing each variable with one node in the graph and, given a cut, each variable will have a value of 0 if the corresponding node belongs to the component connected to the source, or 1 if it belongs to the component connected to the sink.
Not all pseudo-Boolean functions can be represented by a flow network, and in the general case the global optimization problem is NP-hard. There exist sufficient conditions to characterise families of functions that can be optimised through graph cuts, such as submodular quadratic functions. Graph cut optimization can be extended to functions of discrete variables with a finite number of values, that can be approached with iterative algorithms with strong optimality properties, computing one graph cut at each iteration.
Graph cut optimization is an important tool for inference over graphical models such as Markov random fields or conditional random fields, and it has applications in computer vision problems such as image segmentation, denoising, registration and stereo matching.
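As a concrete illustration of the construction, the following Python sketch (ours, not from the source; it assumes the networkx library and uses invented unary and pairwise costs) minimizes a small submodular two-variable energy by computing a minimum s–t cut:

import networkx as nx

# Energy: E(x1, x2) = u1(x1) + u2(x2) + 2*[x1 != x2],
# with u1(0)=5, u1(1)=1, u2(0)=1, u2(1)=5 (invented values)
G = nx.DiGraph()
G.add_edge("s", "x1", capacity=1)   # paid when x1 = 1 (node ends on the sink side)
G.add_edge("x1", "t", capacity=5)   # paid when x1 = 0
G.add_edge("s", "x2", capacity=5)   # paid when x2 = 1
G.add_edge("x2", "t", capacity=1)   # paid when x2 = 0
G.add_edge("x1", "x2", capacity=2)  # paid when x1 = 0 and x2 = 1
G.add_edge("x2", "x1", capacity=2)  # paid when x2 = 0 and x1 = 1

cut_value, (source_side, sink_side) = nx.minimum_cut(G, "s", "t")
assignment = {v: 0 if v in source_side else 1 for v in ("x1", "x2")}
print(cut_value, assignment)        # 4 {'x1': 1, 'x2': 0}, i.e. E(1,0) = 1 + 1 + 2

The cut of value 4 matches the energy of the best assignment computed by hand, illustrating the cut-to-assignment mapping described above.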
Representability
A pseudo-Boolean function f is said to be representable if there exists a graph G with non-negative weights and with source and sink nodes s and t respectively, a |
https://en.wikipedia.org/wiki/Square-law%20detector | In electronic signal processing, a square law detector is a device that produces an output proportional to the square of some input. For example, in demodulating radio signals, a semiconductor diode can be used as a square law detector, providing an output current proportional to the square of the amplitude of the input voltage over some range of input amplitudes. A square law detector provides an output directly proportional to the power of the input electrical signal.
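A toy NumPy sketch of square-law demodulation (ours, not from the source; all signal parameters are invented): squaring an AM input yields a term proportional to the squared envelope plus a component at twice the carrier frequency, which a simple low-pass filter removes.

import numpy as np

fs, fc, fm = 100_000, 10_000, 200             # sample rate, carrier, message (Hz)
t = np.arange(0, 0.02, 1 / fs)
message = 0.3 * np.sin(2 * np.pi * fm * t)
am = (1 + message) * np.cos(2 * np.pi * fc * t)

squared = am ** 2                             # output of the square-law element
# Boxcar low-pass over one carrier period removes the 2*fc component,
# leaving roughly (1 + message)**2 / 2, from which the envelope follows
kernel = np.ones(fs // fc) / (fs // fc)
baseband = np.convolve(squared, kernel, mode="same")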
References
Signal processing |
https://en.wikipedia.org/wiki/Jam.py%20%28web%20framework%29 | Jam.py is an event-driven, low-code development platform for database-driven business web applications, based on the DRY principle, with an emphasis on CRUD.
Jam.py is free and open-source low-code/no-code "full stack" WSGI rapid application development framework for the JavaScript and Python programming language.
The server component runs on any computer with Python 2.6 or later.
It offers a built-in web server, GUI builder and database access for third-party databases.
Features
Single distribution which runs with both Python 2.6+ and 3.x
Can run as a standalone web development server or be used with any web server which supports WSGI
Built-in GUI builder called Application Builder
Support for JSON client data (for REST and JavaScript clients)
Support for popular databases Oracle Database, Microsoft SQL Server, PostgreSQL, SQLite, MySQL, Firebird (database server), SQLCipher
Example
The following code shows a simple web application that displays "Hello World!" when visited:
Task/client module:
task.create_menu($("#menu"), $("#content"), {
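// build the application menu in #menu and render views into #content;
// the splash_screen HTML below is displayed as the first view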
splash_screen: '<h1 class="text-center">Hello World!</h1>',
view_first: true
});
PythonAnywhere
Deployment to PythonAnywhere with Python 3.x is supported.
Awards
2015. 10 Best Frameworks for Web Design
2016. 35 Best HTML5 and CSS3 Responsive Frameworks
Notes
References
See also
Flask (web framework)
Pylons project
Web2py
Django (web framework)
Comparison of web frameworks
List of low-code development platforms
External links
2015 software
Free software programmed in Python
Python (programming language) web frameworks |
https://en.wikipedia.org/wiki/Normopathy | Normopathy is the pathological pursuit of conformity and societal acceptance at the expense of individuality. In her book, Plea for a Measure of Abnormality, psychoanalyst Joyce McDougall coined the term normopathy to describe fear of individuality. Normopathy is difficult to diagnose because normopaths are integrated in society. Normopaths depend on social approval and validation.
Christopher Bollas studied normopathy during the 1970s and 1980s with patients who had nervous breakdowns. Bollas, who called it normotic illness, considered it an obsession with fitting into society at the cost of the person's own personality. Normopaths can experience an emotional crisis – such as a teenager fumbling a football during a game at school – as mania, and may resort to violence or other dangerous behavior.
Normopaths often feel crippled, unable to speak or act. Normopaths perform best given a strict protocol to follow. The condition can cost some people a job or interfere with relationships. Normopaths constantly seek outside validation. The normopath may ask a friend what they think about a new song, dress or hairstyle before forming an opinion. Normopaths look to others to inform them how to think or believe.
The concept of normopathy parallels Winnicott’s idea of the false self, which is formed in response to the demands of the external environment rather than from within. Cognitive behavioral therapy is applied in treatment of normopathy to find individuality and restructure self-image.
Definition
Normopathy is defined as:
Anxiety about examining one’s psyche, with diminished curiosity about inner life.
Hyper-rationality in dealing with others and an intense focus on factual data to seek reassurance, as, according to Bollas, “the normopath attempts to become an object in the object world.” For the normopath, human feelings are troublemakers that require “formulaic structuring in order to be controllable.” Because normopaths cannot fully go through the cycle of grief, they develop what Bol |
https://en.wikipedia.org/wiki/Samantha%20Payne | Samantha Joanne Payne MBE is an English entrepreneur. The co-founder of Open Bionics, a bionics company developing affordable prosthetics for children, Payne has won a number of international awards for her work. These include the MIT Technology Review 'Innovators under 35' in 2018, James Dyson gong for innovative engineering and Wired Innovation Fellow in 2016. In the Queen's Birthday Honours list 2020, Payne was awarded an MBE, for her work making bionic technology more accessible.
Early life and education
Born and raised in Knowle West, outside of Bristol, England, Payne is a graduate of Whitworth University and holds a Bachelor of Arts/Science degree.
Career
She worked as a journalist, specialising in technology before becoming a co-founder of Open Bionics. In 2013, whilst working as a journalist, Payne interviewed Joel Gibbard, who was a robotics graduate at the time. Gibbard and Payne later became business partners and co-founders of Open Bionics.
Payne and Gibbard founded Open Bionics in 2014. The start-up was initially based at the Technology Business Incubator at Bristol Robotics Laboratory. The aim of the company was to develop "affordable, assistive devices that enhance the human body."
Open Bionics has partnered with Disney to make prosthetics based on Disney characters for children.
Her work at Open Bionics has been featured in The Guardian and Daily Mirror.
Innovation
Open Bionics uses 3D scanning to take the initial prosthetic fitting and 3D printing to improve the prosthetic design. These innovations significantly reduce the build-time and the material costs for a personalised hand, making prosthetics more affordable for amputees. Payne estimates that, if bought from private providers, bionic hands with multi-grip functionality cost up to £60,000, compared to £5,000 from Open Bionics.
Awards and recognition
In 2015, Payne was shortlisted for Women in Business 'Young Entrepreneur of The Year' award. In 2018, Payne featured on the Forbes 30 Under 30 |
https://en.wikipedia.org/wiki/Extremophiles%20in%20biotechnology | Extremophiles in biotechnology is the application of organisms that thrive in extreme environments to biotechnology.
Extremophiles are organisms that thrive in the most volatile environments on the planet, and it is due to their talents that they have begun playing a large role in biotechnology. They make their homes everywhere from environments of high acidity or salinity to areas with limited or no oxygen. Scientists show keen interest in organisms with rare or strange abilities, and in the past 20-30 years extremophiles have been at the forefront, with thousands of researchers delving into their capabilities. The area in which there has been the most discussion, research, and development in relation to these organisms is biotechnology. Scientists around the globe are either extracting DNA to modify genomes or directly using extremophiles to complete tasks. Thanks to the discovery of and interest in these organisms, the enzymes used in PCR were found, making the rapid replication of DNA in the lab possible. Since they gained the spotlight, researchers have been amassing databases of genome data in the hope that new traits and abilities can be used to further biotechnical advancements. Everything from the biodegradation of waste to the production of new fuels is on the horizon with the developments made in the field of biotechnology. There are many different kinds of extremophiles, with each kind favoring a different environment. These organisms have become more and more important to biotechnology as their genomes have been uncovered, revealing a plethora of genetic potential. Currently the main uses of extremophiles lie in processes such as PCR, biofuel generation and biomining, but there are many other smaller-scale operations at play. There are also labs that have identified what they wish to do with extremophiles but haven't been able to fully achieve their goals. While these large-scale goals have not yet been met, the scientific community is working |
https://en.wikipedia.org/wiki/Trilateration | Trilateration is the use of distances (or "ranges") for determining the unknown position coordinates of a point of interest, often around Earth (geopositioning).
When more than three distances are involved, it may be called multilateration, for emphasis.
The distances or ranges might be ordinary Euclidean distances (slant ranges) or spherical distances (scaled central angles), as in true-range multilateration; or biased distances (pseudo-ranges), as in pseudo-range multilateration.
Trilateration or multilateration should not be confused with triangulation, which uses angles for positioning; and direction finding, which determines the line of sight direction to a target without determining the radial distance.
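As an illustration of true-range positioning in two dimensions, the following NumPy sketch (ours, not from the source; anchor positions and the target are invented data) subtracts the first circle equation from the others, which linearizes the system so it can be solved directly:

import numpy as np

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])  # known positions
target = np.array([3.0, 4.0])
ranges = np.linalg.norm(anchors - target, axis=1)           # ideal, noise-free ranges

# (x-xi)^2 + (y-yi)^2 = ri^2; subtracting equation 1 from the others
# gives the linear system A @ [x, y] = b
A = 2 * (anchors[1:] - anchors[0])
b = (ranges[0] ** 2 - ranges[1:] ** 2
     + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
position, *_ = np.linalg.lstsq(A, b, rcond=None)
print(position)                                             # ~[3. 4.]

With more than three anchors (multilateration) the same least-squares solve simply gains extra rows, which averages out range noise.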
Terminology
Multiple, sometimes overlapping and conflicting terms are employed for similar concepts – e.g., multilateration without modification has been used for aviation systems employing both true-ranges and pseudo-ranges. Moreover, different fields of endeavor may employ different terms. In geometry, trilateration is defined as the process of determining absolute or relative locations of points by measurement of distances, using the geometry of circles, spheres or triangles. In surveying, trilateration is a specific technique.
True-range multilateration
Pseudo-range multilateration
References
Geometry
Geopositioning |
https://en.wikipedia.org/wiki/Lone%20divider | The lone divider procedure is a procedure for proportional cake-cutting. It involves a heterogeneous and divisible resource, such as a birthday cake, and n partners with different preferences over different parts of the cake. It allows the n people to divide the cake among them such that each person receives a piece with a value of at least 1/n of the total value according to his own subjective valuation.
The procedure was developed by Hugo Steinhaus for n = 3 people. It was later extended by Harold W. Kuhn to n > 3, using the Frobenius–König theorem. Descriptions of the cases n = 3 and n = 4, as well as the general case, appear in the literature.
Description
For convenience we normalize the valuations such that the value of the entire cake is n for all agents. The goal is to give each agent a piece with a value of at least 1.
Step 1. One player chosen arbitrarily, called the divider, cuts the cake into n pieces whose value in his/her eyes is exactly 1.
Step 2. Each of the other n − 1 partners evaluates the resulting n pieces and says which of these pieces he considers "acceptable", i.e., worth at least 1.
Now the game proceeds according to the replies of the players in step 3. We present first the case n = 3 and then the general case.
Steinhaus' procedure for the case n = 3
There are two cases.
Case A: At least one of the non-dividers marks two or more pieces as acceptable. Then, the third partner picks an acceptable piece (by the pigeonhole principle he must have at least one); the second partner picks an acceptable piece (he had at least two before, so at least one remains); and finally the divider picks the last piece (for the divider, all pieces are acceptable).
Case B: Both other partners mark only one piece as acceptable. Then, there is at least one piece that is acceptable only for the divider. The divider takes this piece and goes home. This piece is worth less than 1 for the remaining two partners, so the remaining two pieces are worth at least 2 for |
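For the general case, the key step after the non-dividers mark their acceptable pieces is finding a matching of non-dividers to acceptable pieces, which is where the Frobenius–König theorem enters. A Python sketch of that matching step alone (ours, not from the source; it assumes the networkx library, and the 'acceptable' sets are invented example data):

import networkx as nx

acceptable = {                     # pieces each non-divider values at >= 1
    "Bob":   {0, 2},
    "Carol": {1, 2},
}
n_pieces = 3

G = nx.Graph()
G.add_nodes_from(acceptable, bipartite=0)
G.add_nodes_from(range(n_pieces), bipartite=1)
for agent, pieces in acceptable.items():
    G.add_edges_from((agent, piece) for piece in pieces)

matching = nx.bipartite.maximum_matching(G, top_nodes=acceptable.keys())
assignment = {a: matching[a] for a in acceptable if a in matching}
print(assignment)                  # e.g. {'Bob': 0, 'Carol': 1}; the divider takes the leftover

If not every non-divider can be matched to an acceptable piece, the procedure recurses on a subset of the agents and the leftover pieces.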
https://en.wikipedia.org/wiki/Diagram%20%28mathematical%20logic%29 | In model theory, a branch of mathematical logic, the diagram of a structure is a simple but powerful concept for proving useful properties of a theory, for example the amalgamation property and the joint embedding property, among others.
Definition
Let ℒ be a first-order language and T be a theory over ℒ. For a model 𝒜 of T one expands ℒ to a new language ℒ_A by adding a new constant symbol c_a for each element a in A, where A is a subset of the domain of 𝒜. Now one may expand 𝒜 to the model 𝒜_A.
The positive diagram of 𝒜, sometimes denoted D⁺(𝒜), is the set of all those atomic sentences which hold in 𝒜_A, while the negative diagram, denoted D⁻(𝒜), is the set of all those atomic sentences which do not hold in 𝒜_A.
The diagram D(𝒜) of 𝒜 is the set of all atomic sentences and negations of atomic sentences of ℒ_A that hold in 𝒜_A. Symbolically, D(𝒜) = D⁺(𝒜) ∪ {¬φ : φ ∈ D⁻(𝒜)}.
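A small worked example (ours, not from the source): let 𝒜 be the structure with domain {0, 1} in the language with one relation symbol < interpreted as the usual order, and take A = {0, 1} with new constants c_0 and c_1. The atomic sentences of ℒ_A holding in 𝒜_A are c_0 = c_0, c_1 = c_1 and c_0 < c_1, so these form D⁺(𝒜); the diagram D(𝒜) additionally contains the negations ¬(c_0 = c_1), ¬(c_1 = c_0), ¬(c_0 < c_0), ¬(c_1 < c_1) and ¬(c_1 < c_0).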
See also
Elementary diagram
References
Mathematical logic
Model theory |
https://en.wikipedia.org/wiki/Turkish%20Space%20Agency | The Turkish Space Agency (, TUA) is a government agency for national aerospace research as a part of the space program of Turkey. It was formally established by a presidential decree on 13 December 2018.
Headquartered in Ankara, the agency is subordinated to the Ministry of Industry and Technology. With the establishment of TUA, the Department for Aviation and Space Technologies at the Ministry of Transportation and Infrastructure was abolished. TUA prepares strategic plans that include medium and long-term goals, basic principles and approaches, objectives and priorities, performance measures, methods to be followed and resource allocation for aerospace science and technologies.
TUA works in close collaboration with the TÜBİTAK Space Technologies Research Institute (TÜBİTAK UZAY). It is administered by an executive board of seven members. The tenure of board members, the chairperson excluded, is three years.
National Space Program
The ten preliminary objectives:
In 2023 (the centennial of the Republic) Turkey will perform a hard landing on the Moon.
A new company will be established for satellite production.
Work and strive for regional superiority.
A spaceport will be established in the country, and possibly one in Somalia.
Competence in space will be increased by researching space weather and meteorology.
Meteorites, planets and more in space will be tracked from Earth.
The space industry will conduct integrated studies.
A space technologies development zone will be established.
Undergraduate and graduate education will focus on space and aviation.
A Turkish citizen astronaut will be sent to space.
The last objective is planned for the second half of 2023, when a Turkish citizen will be sent to the International Space Station (ISS) within the scope of the Turkish Space Traveler and Science Mission project.
Duties
The agency's duties and areas of authority are:
To prepare and implement the National Space Program in line with the policies determined by the pr |
https://en.wikipedia.org/wiki/City%20Nature%20Challenge | The City Nature Challenge is an annual, global, community science competition to document urban biodiversity. The challenge is a bioblitz that engages residents and visitors to find and document plants, animals, and other organisms living in urban areas. The goals are to engage the public in the collection of biodiversity data, with three awards each year for the cities that make the most observations, find the most species, and engage the most people.
Participants primarily use the iNaturalist app and website to document their observations, though some areas use other platforms, such as Natusfera in Spain. The observation period is followed by several days of identification and the final announcement of winners. Participants need not know how to identify the species; help is provided through iNaturalist's automated species identification feature as well as the community of users on iNaturalist, including professional scientists and expert naturalists.
History
The City Nature Challenge was founded by Alison Young and Rebecca Johnson of the California Academy of Sciences and Lila Higgins of the Natural History Museum of Los Angeles County. The first challenge was in the spring of 2016 between Los Angeles and San Francisco. Participants documented over 20,000 observations with the iNaturalist platform. In 2017, the challenge expanded to 16 cities across the United States and participants collected over 125,000 observations of wildlife in 5 days. In 2018, the challenge expanded to 68 cities across the world. In four days, over 441,000 observations of more than 18,000 species were observed, and over 17,000 people participated. The 2019 challenge more than doubled in scale, with almost a million observations of over 31,000 species observed by around 35,000 people.
Taking the competition beyond its US roots, the 2019 event was a much more international affair, with the winning city for observations and species coming from Africa (Cape Town), and three South American |
https://en.wikipedia.org/wiki/Adiantum%20%28cipher%29 | Adiantum is a cipher construction for disk encryption, which uses the ChaCha and Advanced Encryption Standard (AES) ciphers, and Poly1305 cryptographic message authentication code (MAC).
It was designed in 2018 by Paul Crowley and Eric Biggers at Google specifically for low-powered mobile devices running Android Go. It has been included in the Linux kernel since version 5.0.
HPolyC is an earlier variant of Adiantum, which uses a different construction for the Poly1305 hash function.
Adiantum is implemented in Android 10 as an alternative cipher for device encryption, particularly on low-end devices lacking hardware-accelerated support for AES. The company stated that Adiantum ran five times faster than AES-256-XTS on ARM Cortex-A7 CPUs. Google had previously exempted devices from mandatory device encryption if enabling it would unacceptably degrade system performance. With the introduction of Adiantum, device encryption became mandatory on all Android devices beginning with Android 10.
References
External links
Android Open Source Project: Enabling Adiantum
Ciphers
Computer-related introductions in 2018
Google |
https://en.wikipedia.org/wiki/Secure%20Scuttlebutt | Secure Scuttlebutt (SSB) is a peer-to-peer communication protocol, mesh network, and self-hosted social media ecosystem. Each user hosts their own content and the content of the peers they follow, which provides fault tolerance and eventual consistency. Messages are digitally signed and added to an append-only list of messages published by an author. SSB is primarily used for implementing distributed social networks, and utilizes cryptography to assure that content remains unforged as it is propagated through the network.
In contrast to the major corporate social media platforms, user data and content on Secure Scuttlebutt is not monetized, there are no software design decisions being made in order to maximize user engagement or boost marketing metrics, and there is no paid advertising. According to Forbes, "Scuttlebutt itself isn't supported by venture capital. Instead ... Scuttlebutt is backed by grants that helped jump-start the process ... [and] there are now hundreds of users who personally donate to the cause and an estimated 30,000 people using one of at least six social networks on the protocol".
History
SSB was created by Dominic Tarr in 2014 as part of experimental development in alternative databases and distributed systems. Tarr lived on a sailboat with an unreliable internet connection, and became interested in creating an offline-friendly secure gossip protocol for social networking. The word scuttlebutt is slang for "water-cooler gossip" among sailors. SSB gained popularity amid the wave of privacy controversies surrounding traditional social media.
Protocol
Secure Scuttlebutt operates as a database of immutable append-only feeds, which allows resilient replication over the Internet, local area networks, and sneakernets. Messages are hashed with SHA256 and verified with an Ed25519 signature; this makes it impossible to forge a message without the private key of the author. Users only download messages from peers that they follow (and optional |
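A minimal Python sketch of this signing scheme (ours, not from the source; it assumes the PyNaCl library, and real SSB uses a specific canonical JSON encoding and identifier format that are simplified away here):

import hashlib, json
from nacl.signing import SigningKey

key = SigningKey.generate()
message = {
    "previous": None,                              # hash of the prior feed entry (None for the first)
    "author": key.verify_key.encode().hex(),       # the feed is identified by the public key
    "sequence": 1,
    "content": {"type": "post", "text": "hello"},
}
payload = json.dumps(message, sort_keys=True).encode()
signed = key.sign(payload)                         # appended to the author's feed
message_id = hashlib.sha256(payload).hexdigest()   # referenced by later entries as 'previous'

key.verify_key.verify(signed)                      # raises BadSignatureError on any forgery

Because each entry names the hash of its predecessor and carries the author's signature, a peer replicating the feed can detect both tampering and reordering.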
https://en.wikipedia.org/wiki/International%20Journal%20of%20Low-Carbon%20Technologies | The International Journal of Low-Carbon Technologies is an open-access, peer-reviewed academic journal covering low-carbon technologies. It is published by Oxford University Press.
References
Oxford University Press academic journals
Open access journals
Engineering journals |
https://en.wikipedia.org/wiki/Lola.com | Lola.com is a software as a service (SaaS) company based in Boston, Massachusetts. It is best known for developing corporate travel management and expense software for web browsers, the App Store and Google Play. The company was founded in 2015 by former Kayak.com executives, Paul M. English and Bill O'Donnell.
The website operates under a travel agency model for hotel and flight search information as well as booking services for businesses. It also has administrative analytics on employee travel and associated costs. Lola has received more than $80 million in funding since its foundation.
History
In July 2015, Blade, a Boston-based incubator, began focusing on a single startup. By December, English announced that Lola had emerged from stealth mode. The company's name was derived from a combination of the words "latitude" and "longitude".
It acquired HopOn, a travel booking company, in 2015 and Room77, a hotel metasearch website, in 2016. The company launched an iOS application in April 2016 where users chatted with human travel agents. That same month, it completed a $20 million Series A funding round led by General Catalyst and Accel.
The company had more than $44 million in total funding after a December 2016 Series B round led by Charles River Ventures. GV and Tenaya Capital each invested $5 million in the round, while previous investors General Catalyst and Accel also participated. In July 2017, Lola had its second major release on iOS and the Android operating system. This iteration of the application focused on business travel by adding self-service hotel and flight booking and personalized travel recommendations.
In July 2018, English announced he would assume the role of chief technology officer at Lola, with Mike Volpe, the chief marketing officer at Cybereason, becoming the company's chief executive officer. Lola announced a five-year exclusive partnership with American Express Global Business Travel in November 2018 to sell its travel management sof |
https://en.wikipedia.org/wiki/Collection%20No.%201 | Collection #1 is the name of a set of email addresses and passwords that appeared on the dark web around January 2019. The database contains over 773 million unique email addresses and 21 million unique passwords, resulting in more than 2.7 billion email/password pairs. The list, reviewed by computer security experts, contains exposed addresses and passwords from over 2000 previous data breaches as well as an estimated 140 million new email addresses and 10 million new passwords from previously unknown sources, collectively making it the largest data breach on the Internet.
Collection #1 was discovered by security researcher Troy Hunt, founder of "Have I Been Pwned?," a website that allows users to search their email addresses and passwords to know if either has appeared in a known data breach. The database had been briefly posted to Mega in January 2019, and links to the database were posted in a popular hacking forum. Hunt discovered that the offering contained 87 gigabytes of data across 12,000 files. Not only was this discovery of concern to Hunt, but he further found that the passwords were available in plaintext format rather than in their hashed version. This implied that the creators of the database had been able to crack the password hashes, likely by exploiting weak implementations of hashing algorithms. Security researchers noted that unlike other username/password lists which are usually sold on the dark web, Collection #1 was temporarily available at no cost, and could potentially be used by a larger number of malicious agents, primarily for credential stuffing.
By January 30, 2019, security researchers observed that similar sets of data, named Collections #2 through #5, have been seen for sale on the dark web. Collections #2-5 included over 845 gigabytes of data, with a total of 25 billion email/password records. Security researchers at Hasso Plattner Institute estimated that Collections #2-5, after removing duplicates, has about three times as mu |
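Hunt's "Have I Been Pwned?" service exposes a k-anonymity range API over its Pwned Passwords corpus, which lets a client check a password against breach data like Collection #1 without revealing it. A minimal Python sketch (ours, not from the source; the endpoint shown is the publicly documented one, but treat the details as an assumption):

import hashlib
import requests

def pwned_count(password):
    sha1 = hashlib.sha1(password.encode()).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    # Only the first five hex characters of the hash leave the machine
    resp = requests.get("https://api.pwnedpasswords.com/range/" + prefix, timeout=10)
    resp.raise_for_status()
    for line in resp.text.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)      # times this password appears in known breaches
    return 0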
https://en.wikipedia.org/wiki/Effects%20bargaining | Effects bargaining is bargaining over the effects, on mandatory subjects of bargaining, of decisions that are within management's right to make. It commonly arises from business decisions such as laying off or transferring employees. Bargaining over these impacts or effects is called effects bargaining.
For example, a contract may give an employer the ability to introduce new technology; however, if the new technology will have a significant impact on employment, the employer is required to give the union advance notice to allow bargaining on the effects before the technology is put in place.
References
Bargaining theory |
https://en.wikipedia.org/wiki/Simultaneous%20Authentication%20of%20Equals | In cryptography, Simultaneous Authentication of Equals (SAE) is a password-based authentication and password-authenticated key agreement method.
Authentication
SAE is a variant of the Dragonfly Key Exchange defined in , based on Diffie–Hellman key exchange using finite cyclic groups which can be a primary cyclic group or an elliptic curve. A problem with plain Diffie–Hellman key exchange is that it has no authentication mechanism; to solve this, the resulting key is influenced by a pre-shared key and the MAC addresses of both peers.
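The algebra can be illustrated with a toy Python sketch (ours, not from the source; the group parameters are deliberately tiny and insecure, and the real protocol's "hunting and pecking" password-element derivation and confirm phase are omitted):

import hashlib, secrets

p = 2**127 - 1        # toy prime modulus; real SAE uses standardized groups
q = p - 1             # order of the multiplicative group mod p

def password_element(password, mac_a, mac_b):
    # Both peers derive the same group element from the password and the
    # (order-independent) MAC addresses, binding the key to all three
    seed = hashlib.sha256(password + min(mac_a, mac_b) + max(mac_a, mac_b)).digest()
    return 2 + int.from_bytes(seed, "big") % (p - 3)

def commit(pe):
    rand = secrets.randbelow(q - 2) + 2
    mask = secrets.randbelow(q - 2) + 2
    scalar = (rand + mask) % q
    element = pow(pe, q - mask, p)           # pe^(-mask) mod p
    return rand, scalar, element             # scalar and element are sent; rand is kept

def shared_secret(pe, my_rand, peer_scalar, peer_element):
    # (pe^peer_scalar * peer_element)^my_rand = pe^(peer_rand * my_rand)
    return pow(pow(pe, peer_scalar, p) * peer_element % p, my_rand, p)

pe = password_element(b"correct horse", b"\xaa" * 6, b"\xbb" * 6)
rand_a, scalar_a, element_a = commit(pe)
rand_b, scalar_b, element_b = commit(pe)
assert shared_secret(pe, rand_a, scalar_b, element_b) == \
       shared_secret(pe, rand_b, scalar_a, element_a)

Both sides arrive at the password element raised to the product of their secret values, and a party that does not know the password cannot derive the password element, which is what gives the exchange its authentication property.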
Use
IEEE 802.11s
SAE was originally implemented for use between peers in IEEE 802.11s. When peers discover each other (and security is enabled) they take part in an SAE exchange. If SAE completes successfully, each peer knows the other party possesses the mesh password and, as a by-product of the SAE exchange, the two peers establish a cryptographically strong key. This key is used with the "Authenticated Mesh Peering Exchange" (AMPE) to establish a secure peering and derive a session key to protect mesh traffic, including routing traffic.
WPA3
In January 2018, the Wi-Fi Alliance announced WPA3 as a replacement to WPA2. The new standard uses 128-bit encryption in WPA3-Personal mode (192-bit in WPA3-Enterprise) and forward secrecy. The WPA3 standard also replaces the pre-shared key (PSK) exchange with Simultaneous Authentication of Equals as defined in IEEE 802.11-2016 resulting in a more secure initial key exchange in personal mode. The Wi-Fi Alliance also claims that WPA3 will mitigate security issues posed by weak passwords and simplify the process of setting up devices with no display interface.
Security
In 2019 Eyal Ronen and Mathy Vanhoef (co-author of the KRACK attack) released an analysis of WPA3's Dragonfly handshake and found that "an attacker within range of a victim can still recover the password" and the bugs found "allow an adversary to impersonate any user, and thereby access the Wi- |