https://en.wikipedia.org/wiki/List%20of%20JavaScript%20libraries
This is a list of notable JavaScript libraries. Constraint programming Cassowary (software) CHR.js DOM (manipulation) oriented Google Polymer Dojo Toolkit jQuery midori MooTools Prototype JavaScript Framework Graphical/visualization (canvas, SVG, or WebGL related) AnyChart Babylon.js Chart.js Cytoscape D3.js Dojo Toolkit FusionCharts Google Charts Highcharts JavaScript InfoVis Toolkit p5.js Plotly Processing.js Raphaël RGraph seen.js Snap.svg SWFObject Teechart Three.js Velocity.js Verge3D Webix GUI (Graphical user interface) and widget related Angular (application platform) by Google AngularJS by Google Bootstrap Dojo Widgets Ext JS by Sencha Foundation by ZURB jQuery UI jQWidgets OpenUI5 by SAP Polymer (library) by Google qooxdoo React.js by Facebook Vue.js Webix WinJS Svelte No longer actively developed Glow Lively Kernel Script.aculo.us YUI Library Pure JavaScript/Ajax Google Closure Library Joose JsPHP Microsoft's Ajax library MochiKit PDF.js Socket.IO Spry framework Underscore.js Template systems jQuery Mobile Mustache Jinja-JS Twig.js Unit testing Jasmine Mocha QUnit Unit.js Web-application related (MVC, MVVM) Angular (application platform) by Google AngularJS by Google Backbone.js Echo Ember.js Enyo Express.js Ext JS Google Web Toolkit JavaScriptMVC JsRender/JsViews Knockout Meteor Mojito MooTools Next.js OpenUI5 by SAP Polymer (library) by Google Prototype JavaScript Framework PureMVC qooxdoo React.js SproutCore Vue.js Wakanda Framework Other Blockly Cannon.js MathJax Modernizr TensorFlow Brain.js See also Ajax framework Comparison of JavaScript frameworks JavaScript library
https://en.wikipedia.org/wiki/Rhode%20Island%20statistical%20areas
The U.S. state of Rhode Island currently has two statistical areas that have been delineated by the Office of Management and Budget (OMB). On March 6, 2020, the OMB delineated the Providence-Warwick, RI-MA Metropolitan Statistical Area and the Boston-Worcester-Providence, MA-RI-NH-CT Combined Statistical Area. All five counties of Rhode Island are a part of both the Providence-Warwick, RI-MA Metropolitan Statistical Area and the more extensive Boston-Worcester-Providence, MA-RI-NH-CT Combined Statistical Area. Statistical areas The Office of Management and Budget (OMB) has designated more than 1,000 statistical areas for the United States and Puerto Rico. These statistical areas are important geographic delineations of population clusters used by the OMB, the United States Census Bureau, planning organizations, and federal, state, and local government entities. The OMB defines a core-based statistical area (commonly referred to as a CBSA) as "a statistical geographic entity consisting of the county or counties (or county-equivalents) associated with at least one core of at least 10,000 population, plus adjacent counties having a high degree of social and economic integration with the core as measured through commuting ties with the counties containing the core." The OMB further divides core-based statistical areas into metropolitan statistical areas (MSAs) that have "a population of at least 50,000" and micropolitan statistical areas (μSAs) that have "a population of at least 10,000, but less than 50,000." The OMB defines a combined statistical area (CSA) as "a geographic entity consisting of two or more adjacent core-based statistical areas with employment interchange measures of at least 15%." The primary statistical areas (PSAs) include all combined statistical areas and any core-based statistical area that is not a constituent of a combined statistical area. Table The table below describes the two United States statistical areas and five counties of the State of Rhode Island and Providence
https://en.wikipedia.org/wiki/South%20Carolina%20statistical%20areas
The U.S. state of South Carolina currently has 20 statistical areas that have been delineated by the Office of Management and Budget (OMB). On July 21, 2023, the OMB delineated four combined statistical areas, ten metropolitan statistical areas, and six micropolitan statistical areas in South Carolina. Statistical areas The Office of Management and Budget (OMB) has designated more than 1,000 statistical areas for the United States and Puerto Rico. These statistical areas are important geographic delineations of population clusters used by the OMB, the United States Census Bureau, planning organizations, and federal, state, and local government entities. The OMB defines a core-based statistical area (commonly referred to as a CBSA) as "a statistical geographic entity consisting of the county or counties (or county-equivalents) associated with at least one core of at least 10,000 population, plus adjacent counties having a high degree of social and economic integration with the core as measured through commuting ties with the counties containing the core." The OMB further divides core-based statistical areas into metropolitan statistical areas (MSAs) that have "a population of at least 50,000" and micropolitan statistical areas (μSAs) that have "a population of at least 10,000, but less than 50,000." The OMB defines a combined statistical area (CSA) as "a geographic entity consisting of two or more adjacent core-based statistical areas with employment interchange measures of at least 15%." The primary statistical areas (PSAs) include all combined statistical areas and any core-based statistical area that is not a constituent of a combined statistical area. Table The table below describes the 22 United States statistical areas and 46 counties of the State of South Carolina with the following information: The combined statistical area (CSA) as designated by the OMB. The CSA population according to 2019 US Census Bureau population estimates. The core based statistical area (CBSA) as designated by the O
https://en.wikipedia.org/wiki/South%20Dakota%20statistical%20areas
The U.S. state of South Dakota currently has 13 statistical areas that have been delineated by the Office of Management and Budget (OMB). On March 6, 2020, the OMB delineated one combined statistical area, three metropolitan statistical areas, and nine micropolitan statistical areas in South Dakota. Statistical areas The Office of Management and Budget (OMB) has designated more than 1,000 statistical areas for the United States and Puerto Rico. These statistical areas are important geographic delineations of population clusters used by the OMB, the United States Census Bureau, planning organizations, and federal, state, and local government entities. The OMB defines a core-based statistical area (commonly referred to as a CBSA) as "a statistical geographic entity consisting of the county or counties (or county-equivalents) associated with at least one core of at least 10,000 population, plus adjacent counties having a high degree of social and economic integration with the core as measured through commuting ties with the counties containing the core." The OMB further divides core-based statistical areas into metropolitan statistical areas (MSAs) that have "a population of at least 50,000" and micropolitan statistical areas (μSAs) that have "a population of at least 10,000, but less than 50,000." The OMB defines a combined statistical area (CSA) as "a geographic entity consisting of two or more adjacent core-based statistical areas with employment interchange measures of at least 15%." The primary statistical areas (PSAs) include all combined statistical areas and any core-based statistical area that is not a constituent of a combined statistical area. Table The table below describes the 13 United States statistical areas and 66 counties of the State of South Dakota with the following information: The combined statistical area (CSA) as designated by the OMB. The CSA population according to 2019 US Census Bureau population estimates. The core based statistical area (CBSA) as designated by the OM
https://en.wikipedia.org/wiki/Utah%20statistical%20areas
The U.S. state of Utah currently has eleven statistical areas that have been delineated by the Office of Management and Budget (OMB). On July 21, 2023, the OMB delineated one combined statistical area, five metropolitan statistical areas, and six micropolitan statistical areas in Utah. Statistical areas The Office of Management and Budget (OMB) has designated more than 1,000 statistical areas for the United States and Puerto Rico. These statistical areas are important geographic delineations of population clusters used by the OMB, the United States Census Bureau, planning organizations, and federal, state, and local government entities. The OMB defines a core-based statistical area (commonly referred to as a CBSA) as "a statistical geographic entity consisting of the county or counties (or county-equivalents) associated with at least one core of at least 10,000 population, plus adjacent counties having a high degree of social and economic integration with the core as measured through commuting ties with the counties containing the core." The OMB further divides core-based statistical areas into metropolitan statistical areas (MSAs) that have "a population of at least 50,000" and micropolitan statistical areas (μSAs) that have "a population of at least 10,000, but less than 50,000." The OMB defines a combined statistical area (CSA) as "a geographic entity consisting of two or more adjacent core-based statistical areas with employment interchange measures of at least 15%." The primary statistical areas (PSAs) include all combined statistical areas and any core-based statistical area that is not a constituent of a combined statistical area. Table The table below describes the 11 United States statistical areas and 29 counties of the State of Utah with the following information: The combined statistical area (CSA) as designated by the OMB. The CSA population according to 2019 US Census Bureau population estimates. The core based statistical area (CBSA) as designated by the OMB. T
https://en.wikipedia.org/wiki/Vermont%20statistical%20areas
The U.S. state of Vermont currently has six statistical areas that have been delineated by the Office of Management and Budget (OMB). On March 6, 2020, the OMB delineated one combined statistical area, one metropolitan statistical area, and six micropolitan statistical areas in Vermont. Statistical areas The Office of Management and Budget (OMB) has designated more than 1,000 statistical areas for the United States and Puerto Rico. These statistical areas are important geographic delineations of population clusters used by the OMB, the United States Census Bureau, planning organizations, and federal, state, and local government entities. The OMB defines a core-based statistical area (commonly referred to as a CBSA) as "a statistical geographic entity consisting of the county or counties (or county-equivalents) associated with at least one core of at least 10,000 population, plus adjacent counties having a high degree of social and economic integration with the core as measured through commuting ties with the counties containing the core." The OMB further divides core-based statistical areas into metropolitan statistical areas (MSAs) that have "a population of at least 50,000" and micropolitan statistical areas (μSAs) that have "a population of at least 10,000, but less than 50,000." The OMB defines a combined statistical area (CSA) as "a geographic entity consisting of two or more adjacent core-based statistical areas with employment interchange measures of at least 15%." The primary statistical areas (PSAs) include all combined statistical areas and any core-based statistical area that is not a constituent of a combined statistical area. Table The table below describes the 6 United States statistical areas and 14 counties of the State of Vermont with the following information: The combined statistical area (CSA) as designated by the OMB. The CSA population according to 2019 US Census Bureau population estimates. The core based statistical area (CBSA) as designated by the OMB.
https://en.wikipedia.org/wiki/Washington%20%28state%29%20statistical%20areas
The U.S. state of Washington currently has 28 statistical areas that have been delineated by the Office of Management and Budget (OMB). On March 6, 2020, the OMB delineated six combined statistical areas, 13 metropolitan statistical areas, and nine micropolitan statistical areas in Washington. Statistical areas The Office of Management and Budget (OMB) has designated more than 1,000 statistical areas for the United States and Puerto Rico. These statistical areas are important geographic delineations of population clusters used by the OMB, the United States Census Bureau, planning organizations, and federal, state, and local government entities. The OMB defines a core-based statistical area (commonly referred to as a CBSA) as "a statistical geographic entity consisting of the county or counties (or county-equivalents) associated with at least one core of at least 10,000 population, plus adjacent counties having a high degree of social and economic integration with the core as measured through commuting ties with the counties containing the core." The OMB further divides core-based statistical areas into metropolitan statistical areas (MSAs) that have "a population of at least 50,000" and micropolitan statistical areas (μSAs) that have "a population of at least 10,000, but less than 50,000." The OMB defines a combined statistical area (CSA) as "a geographic entity consisting of two or more adjacent core-based statistical areas with employment interchange measures of at least 15%." The primary statistical areas (PSAs) include all combined statistical areas and any core-based statistical area that is not a constituent of a combined statistical area. Table The table below describes the 28 United States statistical areas and 39 counties of the State of Washington with the following information: The combined statistical area (CSA) as designated by the OMB. The CSA population according to 2019 US Census Bureau population estimates. The core based statistical area (CBSA) as designated by
https://en.wikipedia.org/wiki/Sharadchandra%20Shankar%20Shrikhande
Sharadchandra Shankar Shrikhande (19 October 1917 – 21 April 2020) was an Indian mathematician with notable achievements in combinatorial mathematics. He was notable for his breakthrough work, along with R. C. Bose and E. T. Parker, in their disproof of the famous conjecture made by Leonhard Euler in 1782 that there do not exist two mutually orthogonal Latin squares of order 4n + 2 for any n. Shrikhande's specialties were combinatorics and statistical designs. The Shrikhande graph is used in statistical designs. Life, education and career He was the fifth of ten siblings. His father worked at a flour mill. He completed his B.Sc. from Government Science College, Nagpur and went for further studies at the Indian Statistical Institute. He then briefly worked as a lecturer at the Government Science College, Nagpur. Shrikhande received a Ph.D. in 1950 from the University of North Carolina at Chapel Hill under the supervision of Raj Chandra Bose. Shrikhande taught at various universities in the USA and in India. Shrikhande was a professor of mathematics at Banaras Hindu University, Banaras, the founding head of the department of mathematics, University of Mumbai, and the founding director of the Center of Advanced Study in Mathematics, Mumbai, until he retired in 1978. He was a fellow of the Indian National Science Academy, the Indian Academy of Sciences, and the Institute of Mathematical Statistics, USA. In 1988, his wife Shakuntala died and he moved to the United States. Shrikhande returned to India in 2009. He turned 100 in October 2017 and died in April 2020 at the age of 102. His son Mohan Shrikhande is a professor of combinatorial mathematics at Central Michigan University in Mt. Pleasant, Michigan.
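The orthogonality property at the heart of the disproved Euler conjecture is easy to state computationally: two n × n Latin squares are mutually orthogonal when superimposing them yields every ordered pair of symbols exactly once. The sketch below is illustrative only; the order-3 pair shown is a standard small example, whereas Euler's conjecture concerned orders 4n + 2 (2, 6, 10, ...), for which Bose, Shrikhande, and Parker showed orthogonal pairs exist for every order except 2 and 6.

```python
def mutually_orthogonal(a, b):
    """Return True if Latin squares a and b are mutually orthogonal,
    i.e. the n*n ordered pairs (a[i][j], b[i][j]) are all distinct."""
    n = len(a)
    pairs = {(a[i][j], b[i][j]) for i in range(n) for j in range(n)}
    return len(pairs) == n * n

# A classical pair of mutually orthogonal Latin squares of order 3.
A = [[0, 1, 2],
     [1, 2, 0],
     [2, 0, 1]]
B = [[0, 1, 2],
     [2, 0, 1],
     [1, 2, 0]]
print(mutually_orthogonal(A, B))   # True
print(mutually_orthogonal(A, A))   # False: only the pairs (x, x) occur
```

Superimposing A on itself produces only the three diagonal pairs (0,0), (1,1), (2,2), which is why a square is never orthogonal to itself.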
https://en.wikipedia.org/wiki/Grand%20rounds
Grand rounds are a methodology of medical education and inpatient care, consisting of presenting the medical problems and treatment of a particular patient to an audience consisting of doctors, pharmacists, residents, and medical students. It was first conceived by clinicians as a way for junior colleagues to round on patients. The patient was traditionally present for the round and would answer questions; grand rounds have evolved, and most sessions now rarely have a patient present, being more like lectures. An actor portrays the patient in some instances. Grand rounds help doctors and other healthcare professionals keep up to date in important evolving areas which may be outside of their core practice. Most departments at major teaching hospitals will have their own specialized, often weekly, grand rounds. Attending grand rounds is also an important supplement to medical school and on-the-job resident training. Grand rounds can also be distinguished from rounds, which is the (typically) daily visit by the attending physician and team to all that physician's patients on the ward. Rounding with an attending physician is an important part of medical on-the-job training and education, but its primary focus is immediate care for the patients on the ward. Grand rounds tend to present the bigger picture, including experience with patients over many years, and the newest research and treatments in an area. Grand rounds tend to be open to the entire medical professional community, whereas rounds are specific to individual attending physicians and their teams. A 1966 report in the Australian journal Clinical Pediatrics called into question the sometimes inconsiderate behavior of practitioners who treated patients who were featured cases of grand rounds. Video archives Many teaching and research hospitals have started providing streaming video of their grand rounds presentations for free over the Internet. This is an opportunity for medical professionals and students t
https://en.wikipedia.org/wiki/OpenWire%20%28binary%20protocol%29
OpenWire is a binary protocol designed for working with message-oriented middleware. It is the native wire format of ActiveMQ. External links: ActiveMQ - OpenWire; C# - OpenWire. Message-oriented middleware Network protocols
https://en.wikipedia.org/wiki/Survey%20of%20Health%2C%20Ageing%20and%20Retirement%20in%20Europe
The Survey of Health, Ageing and Retirement in Europe (SHARE) is a multidisciplinary and cross-national panel database of micro data on health, socio-economic status and social and family networks. In seven survey waves to date, SHARE has conducted approximately 380,000 interviews with about 140,000 individuals aged 50 and over. The survey covers 28 European countries and Israel. SHARE was founded as a response to the European Commission's call to "examine the possibility of establishing, in co-operation with Member States, a European Longitudinal Ageing Survey". It has since become a major pillar of the European Research Area, selected as one of the projects to be implemented in the European Strategy Forum on Research Infrastructures (ESFRI) in 2006 and was given a new legal status as the first ever European Research Infrastructure Consortium (SHARE-ERIC) in March 2011. About SHARE Founded in 2002, SHARE is coordinated centrally at the Munich Center for the Economics of Aging (MEA), Max-Planck-Institute for Social Law and Social Policy and led by Managing Director Axel Börsch-Supan. It is a collaborative effort of more than 150 researchers worldwide who are organized in multidisciplinary national teams and cross-national working groups. A Scientific Monitoring Board composed of eminent international researchers and a network of advisors help to maintain and improve the project’s high scientific standards. SHARE is harmonized with its role models and sister studies, the U.S. Health and Retirement Study (HRS) and the English Longitudinal Study of Ageing (ELSA), and has the advantage of encompassing cross-national variation in public policy, culture and history across a variety of European countries. Its scientific power is based on its panel design, which captures the dynamic character of the ageing process. SHARE’s multi-disciplinary approach delivers a full picture of the ageing process. Procedural guidelines and programs ensure an ex-ante harmonized cross-national
https://en.wikipedia.org/wiki/Dreaming%20in%20Code
Dreaming in Code: Two Dozen Programmers, Three Years, 4,732 Bugs, and One Quest for Transcendent Software is a 2007 Random House literary nonfiction book by Salon.com editor and journalist Scott Rosenberg. It documents the workers of Mitch Kapor's Open Source Applications Foundation as they struggled with collaboration and the software development task of building the open-source calendar application Chandler. Rosenberg spent time observing the organization at work and wrote about its milestones and problems. The book combines narrative with explanations of software development philosophy, methodology, and process, referring to The Mythical Man-Month and other texts of the field. In a review published in The Atlantic, James Fallows compared the book to Tracy Kidder's The Soul of a New Machine. At the time of the book's publication, OSAF had not yet released Chandler 1.0. Chandler 1.0 was released on August 8, 2008.
https://en.wikipedia.org/wiki/LLT%20polynomial
In mathematics, an LLT polynomial is one of a family of symmetric functions introduced by Alain Lascoux, Bernard Leclerc, and Jean-Yves Thibon (1997) as q-analogues of products of Schur functions. J. Haglund, M. Haiman, N. Loehr (2005) showed how to expand Macdonald polynomials in terms of LLT polynomials. Ian Grojnowski and Mark Haiman (2007, preprint) proved a positivity conjecture for LLT polynomials that combined with the previous result implies the Macdonald positivity conjecture for Macdonald polynomials, and extended the definition of LLT polynomials to arbitrary finite root systems.
https://en.wikipedia.org/wiki/Maternal%20effect%20dominant%20embryonic%20arrest
Maternal effect dominant embryonic arrest (Medea) is a selfish gene composed of a toxin and an antidote. A mother carrying Medea will express the toxin in her germline, killing her progeny. If the children also carry Medea, they produce copies of the antidote, saving their lives. Therefore, if a mother has one Medea allele and one non-Medea allele, half of her children will inherit Medea and survive while the other half will inherit the non-Medea allele and die (unless they receive Medea from their father). Medea's selfish behavior gives it a selective advantage over normal genes. If introduced into a population at sufficiently high levels, the Medea gene will spread, replacing entire populations of normal beetles with beetles carrying Medea. Because of this, Medea has been proposed as a way of genetically modifying insect populations. By linking the Medea construct to a gene of interest - for instance, a gene conferring resistance to malaria - Medea's unique dynamics could be exploited to drive both genes into a population. These findings have dramatic implications for the control of insect-borne diseases such as malaria and dengue fever. Construction of Medea Medea, which has been found in nature only in flour beetles, is an example of a selfish gene that has been simulated in the lab and tested in the fruit fly Drosophila melanogaster. The toxin was a microRNA that blocked the expression of myd88, a gene vital for embryonic development in insects. The antidote was an extra copy of myd88. The offspring receiving the extra copy of myd88 survived and hatched, while those without the extra copy died. In lab trials where 25% of the original members were homozygous for Medea, the gene spread to the entire population within 10 to 12 generations. Etymology Medea was named for the Greek mythological figure of Medea, who killed her children when her husband left her for another woman. See also Intragenomic conflict Green-beard effect Medea gene Toxi
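The spread dynamics described above can be sketched as a simple deterministic population-genetics model. This is an idealized illustration, not the published experimental protocol: random mating, non-overlapping generations, and a strict "offspring of a carrier mother dies unless it inherits at least one Medea copy" rule are all modeling assumptions.

```python
from itertools import product

GENOTYPES = ("MM", "M+", "++")   # M = Medea allele, + = wild-type allele

def m_gamete_prob(geno):
    """Probability that a gamete from this genotype carries the Medea allele."""
    return geno.count("M") / 2

def next_generation(freq):
    """One generation of random mating with the Medea toxin/antidote rule:
    offspring of a Medea-carrying mother die unless they inherit at least
    one Medea copy (from either parent)."""
    out = {g: 0.0 for g in GENOTYPES}
    for mom, dad in product(GENOTYPES, repeat=2):
        weight = freq[mom] * freq[dad]
        if weight == 0.0:
            continue
        pm, pf = m_gamete_prob(mom), m_gamete_prob(dad)
        offspring = {
            "MM": pm * pf,
            "M+": pm * (1 - pf) + (1 - pm) * pf,
            "++": (1 - pm) * (1 - pf),
        }
        if "M" in mom:
            offspring["++"] = 0.0   # maternal toxin, no inherited antidote
        for g, p in offspring.items():
            out[g] += weight * p
    total = sum(out.values())       # renormalize after embryonic deaths
    return {g: v / total for g, v in out.items()}

# Start like the lab trials: 25% Medea homozygotes, 75% wild type.
freq = {"MM": 0.25, "M+": 0.0, "++": 0.75}
for _ in range(12):
    freq = next_generation(freq)
carriers = freq["MM"] + freq["M+"]
print(round(carriers, 3))   # Medea carriers dominate within ~12 generations
```

Even this toy model reproduces the qualitative result quoted in the text: starting from 25% homozygotes, carriers make up well over 95% of the population after about a dozen generations.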
https://en.wikipedia.org/wiki/Metal%20bellows
Metal bellows are elastic vessels that can be compressed when pressure is applied to the outside of the vessel, or extended under vacuum. When the pressure or vacuum is released, the bellows will return to its original shape, provided the material has not been stressed past its yield strength. They are used both for their ability to deform under pressure and to provide a hermetic seal that allows movement. Precision bellows technology of the 20th and 21st century is centered on metal bellows with less demanding applications using ones made of rubber and plastic. These products bear little resemblance to the original leather bellows used traditionally in fireplaces and forges. Types There are three main types of metal bellows: formed, welded and electroformed. Formed bellows are produced by reworking tubes, normally produced by deep drawing, with a variety of processes, including cold forming (rolling), and hydroforming. They are also called convoluted bellows or sylphons. Welded bellows (also called edge-welded, or diaphragm bellows) are manufactured by welding a number of individually formed diaphragms to each other. The comparison between the two bellows types generally centers on cost and performance. Hydroformed bellows generally have a high tooling cost, but, when mass-produced, may have a lower piece price. However, hydroformed bellows have lower performance characteristics due to relatively thick walls and high stiffness. Welded metal bellows are produced with a lower initial tooling cost and maintain higher performance characteristics. The drawback of welded bellows is the reduced metal strength at weld joints, caused by the high temperature of welding. Electroformed bellows are produced by plating (electroforming) a metal layer onto a model (mandrel), and subsequently removing the mandrel. They can be produced with modest tooling costs and with thin walls (25 micrometres or less), providing such bellows with high sensitivity and precision in many exac
https://en.wikipedia.org/wiki/XAdES
XAdES (short for XML Advanced Electronic Signatures) is a set of extensions to the XML-DSig recommendation making it suitable for advanced electronic signatures. W3C and ETSI maintain and update XAdES together. Description While XML-DSig is a general framework for digitally signing documents, XAdES specifies precise profiles of XML-DSig making it compliant with the European eIDAS regulation (Regulation on electronic identification and trust services for electronic transactions in the internal market). The eIDAS regulation enhances and repeals the Electronic Signatures Directive 1999/93/EC. eIDAS has been legally binding in all EU member states since July 2014. An electronic signature that has been created in compliance with eIDAS has the same legal value as a handwritten signature. An electronic signature technically implemented based on XAdES has the status of an advanced electronic signature. This means that it is uniquely linked to the signatory; it is capable of identifying the signatory; only the signatory has control of the data used for the signature creation; and any change to the data attached to the signature after signing can be detected. A resulting property of XAdES is that electronically signed documents can remain valid for long periods, even if underlying cryptographic algorithms are broken. However, courts are not obliged to accept XAdES-based electronic signatures as evidence in their proceedings; at least in the EU, this is compulsory only for "qualified" signatures. A "qualified electronic signature" must be based on a qualified digital certificate, created by a secure signature creation device, and the identity of the owner of this signing certificate must have been verified according to the "high" assurance level of the eIDAS regulation. Profiles XAdES defines four profiles (forms) differing in the protection level offered. XAdES-B-B (Basic Electronic Signature): the lowest and simplest version, just containing the SignedInfo, SignatureValue, KeyInfo
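The XAdES-B-B elements named above are the skeleton of an XML-DSig signature. The sketch below builds only that element structure with Python's standard library: no canonicalization, digesting, or actual signing is performed, the `SignatureValue` placeholder is fake, and the algorithm URIs shown are common choices rather than requirements of the standard.

```python
# Sketch: the element skeleton of a basic XML-DSig signature, the structure
# that XAdES-B-B extends. Structure only -- no real cryptography is done here.
import xml.etree.ElementTree as ET

DSIG = "http://www.w3.org/2000/09/xmldsig#"   # XML-DSig namespace

sig = ET.Element(f"{{{DSIG}}}Signature")
signed_info = ET.SubElement(sig, f"{{{DSIG}}}SignedInfo")
ET.SubElement(signed_info, f"{{{DSIG}}}CanonicalizationMethod",
              Algorithm="http://www.w3.org/2001/10/xml-exc-c14n#")
ET.SubElement(signed_info, f"{{{DSIG}}}SignatureMethod",
              Algorithm="http://www.w3.org/2001/04/xmldsig-more#rsa-sha256")
# Placeholder: a real signature puts base64 signature bytes here.
ET.SubElement(sig, f"{{{DSIG}}}SignatureValue").text = "...base64..."
# KeyInfo typically carries the signer's certificate chain.
ET.SubElement(sig, f"{{{DSIG}}}KeyInfo")

print(ET.tostring(sig, encoding="unicode")[:80])
```

A XAdES profile then adds a `ds:Object` containing qualifying properties (signing time, certificate digests, and so on) on top of this skeleton.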
https://en.wikipedia.org/wiki/Real-time%20analyzer
A real-time analyzer (RTA) is a professional audio device that measures and displays the frequency spectrum of an audio signal; a spectrum analyzer that works in real time. An RTA can range from a small PDA-sized device to a rack-mounted hardware unit to software running on a laptop. It works by measuring and displaying sound input, often from an integrated microphone or with a signal from a PA system. Basic RTAs show three measurements per octave at 3 or 6 dB increments; sophisticated software solutions can show 24 or more measurements per octave as well as 0.1 dB resolution. Types There are generally two types of RTAs: RTAs employing analog signal processing, and RTAs employing digital signal processing (DSP). The main difference between the two types is that the analog RTAs use a series of hardwired, analog bandpass filters to break the signal into frequency bands prior to measuring it. Digital RTAs use digital sampling technology and microprocessor-based digital signal processing to perform necessary calculations, such as fast Fourier transforms, to perform the measurements and thus do not need analog hardware filters to isolate each frequency band. The digital approach to signal analysis generally yields much higher accuracy and resolution and thus most RTAs currently in production use digital signal processing technology. Digital signal processing is more cost effective. Professional use RTAs are often used by sound engineers and by acousticians installing audio systems in all kinds of listening spaces: Venues, home theatres, cars etc. The parameters that can be measured are the spectral aspects of sound reproduction caused by effects like resonances and constructive and destructive interference, but not imaging and spatial aspects. In professional audio many systems incorporate an RTA along with a device that also performs equalization. While measuring pink noise or other test tones, such a controller can level out the frequency response by employing
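The core of the DSP approach described above (FFT, then grouping into fractional-octave bands) can be sketched in a few lines. The base-2 third-octave band centers and edges below are a common convention, and a real RTA would add windowing, time averaging, and calibrated dB SPL scaling; this sketch only shows the band-grouping idea.

```python
# Sketch of a DSP-style RTA core: sum an FFT power spectrum into
# third-octave bands and find the loudest band for a 1 kHz test tone.
import numpy as np

fs = 48000                                  # sample rate in Hz
t = np.arange(fs) / fs                      # one second of signal
x = np.sin(2 * np.pi * 1000 * t)            # 1 kHz test tone

spectrum = np.abs(np.fft.rfft(x)) ** 2      # power spectrum
freqs = np.fft.rfftfreq(len(x), 1 / fs)     # bin frequencies in Hz

# Third-octave band centers from ~31 Hz to 16 kHz (base-2 spacing around 1 kHz).
centers = 1000 * 2.0 ** (np.arange(-15, 13) / 3)
edges_lo = centers * 2.0 ** (-1 / 6)        # lower band edges
edges_hi = centers * 2.0 ** (1 / 6)         # upper band edges

levels = []
for lo, hi in zip(edges_lo, edges_hi):
    power = spectrum[(freqs >= lo) & (freqs < hi)].sum()
    levels.append(10 * np.log10(power) if power > 0 else float("-inf"))

loudest = centers[int(np.argmax(levels))]   # band holding the test tone
print(loudest)                              # 1000.0
```

Swapping the band edges for narrower spacing is all it takes to go from the "three measurements per octave" of a basic RTA to the 24-plus bands of a software analyzer.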
https://en.wikipedia.org/wiki/CVS%20Health
CVS Health Corporation (previously CVS Corporation and CVS Caremark Corporation) is an American healthcare company that owns CVS Pharmacy, a retail pharmacy chain; CVS Caremark, a pharmacy benefits manager; and Aetna, a health insurance provider, among many other brands. The company is the world's largest healthcare company, and its headquarters are in Woonsocket, Rhode Island. History of CVS Consumer Value Stores (CVS) was founded in 1963 by three partners - brothers Stanley and Sidney Goldstein and Ralph Hoagland - in Lowell, Massachusetts. They grew the venture from a parent company, Mark Steven, Inc., which helped retailers manage their health and beauty aid product lines. The business began as a chain of health and beauty aid stores, but within several years, pharmacies were added. After two years of operations, it was able to open 17 stores. To expand, the company joined the Melville Corporation, which managed a string of retail businesses. Following a period of growth in the 1980s and 1990s, CVS Corporation spun off from Melville in 1996, becoming a standalone company trading on the New York Stock Exchange as . In December 2017, CVS agreed to acquire Aetna for $69 billion and completed the acquisition in November 2018. Legal issues related to the merger were resolved in September 2019. In February 2020, CVS Health announced changes to its board of directors, whose size was reduced from 16 to 13 directors. In 2021, CVS Health was ranked 4th on the Fortune 500 list, and 7th on the Fortune Global 500 list. On November 18, 2021, CVS Health announced that the company plans to close 900 stores over the next three years, with closings due to begin in the spring of 2022. In November 2021, a federal jury found that CVS, along with Walgreens and Walmart, "had substantially contributed to" the opioid crisis. Company name CVS, started in Lowell, Massachusetts by brothers Stanley and Sidney Goldstein and their partner Ralph Hoagland, later had to sell to the Melville C
https://en.wikipedia.org/wiki/Low-rate%20picture%20transmission
The low-rate picture transmission (LRPT) is a digital transmission system intended to deliver images and data from an orbital weather satellite directly to end users via a VHF radio signal. It is used aboard polar-orbiting, near-Earth weather satellite programs such as MetOp and NPOESS. Purpose LRPT provides three image channels at full sensor resolution (10-bit, 1 km/pixel, six lines/second) in addition to data from other sensors, such as atmospheric sounders and GPS positioning information. The system is an update and replacement of the existing analog system called automatic picture transmission (APT), which has been used since the 1960s aboard NOAA's TIROS polar-orbiting satellites. The APT system provided only two image channels, which were at a reduced accuracy and resolution (8-bit, 4 km/pixel, two lines/second). Compared to the APT system, LRPT images are four times more accurate and contain twelve times the resolution. Further, the additional data from other sensors broadens the applications of the satellites and the base of users who receive the signal. Design LRPT uses a packetized datastream transmitted at an approximately 62 kilobits per second (kbit/s) rate. Each sensor using the LRPT is considered an application and is provided a percentage of the transmission bandwidth in the form of a virtual channel. For example, the advanced very-high-resolution radiometer (AVHRR) imaging sensor is provided approximately 40 kbit/s to transmit its three image channels, and the high-resolution infrared radiation sounder (HIRS) is provided approximately 2900 bit/s. The packetized application system provides the flexibility to transmit and receive new types of data in the future using the same hardware. The datastream is processed using Reed–Solomon error correction, then convolutionally encoded, interleaved, and padded with unique synchronization words. The resulting binary stream is approximately 160 kbit/s. It is transmitted as an 80 kilobaud quadrature phase-shift keyed
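The modulation figures quoted above can be cross-checked with simple arithmetic, since QPSK carries two bits per symbol; a minimal sketch (the function name is illustrative):

```python
# Cross-check of the LRPT link figures: an 80-kilobaud QPSK signal
# carries 2 bits per symbol, giving the ~160 kbit/s coded stream.

def channel_bit_rate(symbol_rate_baud, bits_per_symbol):
    """Gross channel bit rate of a linearly modulated signal."""
    return symbol_rate_baud * bits_per_symbol

QPSK_BITS_PER_SYMBOL = 2   # 4 constellation points -> log2(4) = 2 bits/symbol
symbol_rate = 80_000       # 80 kilobaud, as stated in the text

rate = channel_bit_rate(symbol_rate, QPSK_BITS_PER_SYMBOL)
print(rate)  # 160000 bit/s
```

The roughly 62 kbit/s of application data grows to this 160 kbit/s figure through the Reed–Solomon, convolutional, and synchronization overhead described in the text.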
https://en.wikipedia.org/wiki/Serum%20Institute%20of%20India
Serum Institute of India (SII) is an Indian biotechnology and biopharmaceuticals company based in Pune. It is the world's largest manufacturer of vaccines. It was founded by Cyrus Poonawalla in 1966 and is a part of Cyrus Poonawalla Group. Overview The Serum Institute of India was founded in 1966 in the city of Pune, India. The company set out to produce immunobiologicals, which were imported into India at high prices. Among the first products the Serum Institute of India manufactured in large quantities were the tetanus antitoxin, snake antivenom, DPT vaccine, and MMR vaccine. The company's product lines were expanded to include different types of vaccines against bacterial or viral infections, combination vaccines, influenza vaccine, and meningococcal vaccine. Besides vaccines, the company also manufactures antisera, blood plasma, and hormone products. As of 2014, the vaccines manufactured by the Serum Institute of India have been used in international vaccination programmes run by the World Health Organization (WHO), UNICEF, and the Pan American Health Organization (PAHO). Today the Serum Institute of India is run by the Poonawalla Group and engages in research, development, and manufacturing. In 2009, the company began developing an intranasal swine flu vaccine. In 2012, the company's first international acquisition was Bilthoven Biologicals, a biopharmaceutical company in the Netherlands. In 2016, with support from US-based Mass Biologics of University of Massachusetts Medical School, the Serum Institute of India invented a fast-acting anti-rabies agent, Rabies Human Monoclonal Antibody (RMAb), also known as Rabishield. , the company is the world's largest vaccine producer by number of doses produced, manufacturing around 1.9 billion doses of vaccines each year, with plans to produce 4 billion doses of vaccines. The products developed include tuberculosis vaccine Tubervac (BCG), Poliovac for poliomyelitis, and other vaccinations for the childhood vaccination schedul
https://en.wikipedia.org/wiki/Ballantine%20scale
The Ballantine scale is a biologically defined scale for measuring the degree of exposure to wave action on a rocky shore. Devised in 1961 by W. J. Ballantine, then at the zoology department of Queen Mary College, London, the scale is based on the observation that where shoreline species are concerned "Different species growing on rocky shores require different degrees of protection from certain aspects of the physical environment, of which wave action is often the most important." The species present in the littoral zone therefore indicate the degree of the shore's exposure. Summary An abbreviated summary of the scale is given below. The scale runs from 1) an "extremely exposed" shore, to 8) an "extremely sheltered" shore. The littoral zone generally is the zone between low and high tides. The supra-littoral is above the barnacle line. Modifications to the scale A modified exposure scale of five stages applying to the shores of the British Isles is given by Lewis (1964): "Very exposed shores" have a wide Verrucaria zone entirely above the upper tide level, a Porphyra zone above the barnacle level and Lichina pygmaea is locally abundant. The eu-littoral zone is dominated by barnacles and limpets with a coralline belt in the very low littoral along with other Rhodophyta and Alaria in the upper sub-littoral. "Exposed shores" show a Verrucaria belt mainly above the high tide, with Porphyra and L. pygmaea. The mid shore is dominated by barnacles, limpets and some Fucus, with some Rhodophyta. Himanthalia and some Rhodophyta such as Mastocarpus and Corallina are found in the low littoral, with Himanthalia, Alaria and L. digitata dominant in the upper sub-littoral. "Semi-exposed shores" show a Verrucaria belt just above high tide with clear Pelvetia in the upper-littoral and F. serratus in the lower-littoral. Limpets, barnacles and short F. vesiculosus mid-shore. F. serratus with Rhodophyta (Laurencia, Mastocarpus stellatus, Rhodymenia and Lomentaria). Laminaria, Sa
https://en.wikipedia.org/wiki/Hebrew%20Braille
Hebrew Braille () is the braille alphabet for Hebrew. The International Hebrew Braille Code is widely used. It was devised in the 1930s and completed in 1944. It is based on international norms, with additional letters devised to accommodate differences between English Braille and the Hebrew alphabet. Unlike Hebrew, but in keeping with other braille alphabets, Hebrew Braille is read from left to right instead of right to left, and unlike English Braille, it is an abjad, with all letters representing consonants. History Prior to the 1930s, there were several regional variations of Hebrew Braille, but no universal system. In 1930, the Jewish Braille Institute of America, under the direction of the Synagogue Council of America, assembled an international committee for the purpose of producing a unified embossed code to be used by sightless people throughout the world. The committee membership consisted of Isaac Maletz, representing the Jewish Institute for the Blind, Jerusalem; Dr. Max Geffner, of the Blindeninstitut of Vienna; Canon C. F. Waudby, of the National Institute for the Blind, Great Britain; Leopold Dubov, of the Jewish Braille Institute of America; and Rabbi Harry J. Brevis, representing the New York Board of Rabbis. Rabbi Brevis, who had lost his eyesight in his mid-20s, and who had developed a system of Hebrew braille for his personal use as a rabbinical student while attending the Jewish Institute of Religion from 1926 to 1929, was named chairman of the committee, and Leopold Dubov, the executive director of the Jewish Braille Institute of America, was appointed secretary. In 1933, the committee voted unanimously to approve and sponsor Brevis' system for adoption as the International Hebrew Braille Code. Brevis published a selection of readings from the Bible, Mishnah, and contemporary literature using this system in 1935 under the title A Hebrew Braille Chrestomathy. Among the greater challenges faced by the committee was the accommodation of the Hebr
https://en.wikipedia.org/wiki/Microsoft%20Visual%20Programming%20Language
Microsoft Visual Programming Language, or VPL, is a visual programming and dataflow programming language developed by Microsoft for the Microsoft Robotics Studio. VPL is based on the event-driven and data-driven approach. The programming language is distinguished from other Microsoft programming languages such as Visual Basic and C#, as it is the only Microsoft language that is a true visual programming language. Microsoft has utilized the term "Visual" in its previous programming products to reflect that a large degree of development in these languages can be performed by "dragging and dropping" in a traditional WYSIWYG fashion. See also Dataflow programming Visual programming languages Microsoft Robotics Developer Studio VIPLE: Visual IoT/Robotics Programming Language Environment
https://en.wikipedia.org/wiki/Empirical%20probability
In probability theory and statistics, the empirical probability, relative frequency, or experimental probability of an event is the ratio of the number of outcomes in which a specified event occurs to the total number of trials, i.e., by means not of a theoretical sample space but of an actual experiment. More generally, empirical probability estimates probabilities from experience and observation. Given an event E in a sample space, the relative frequency of E is the ratio m/n, where m is the number of outcomes in which the event E occurs and n is the total number of outcomes of the experiment. In statistical terms, the empirical probability is an estimator or estimate of a probability. In simple cases, where the result of a trial only determines whether or not the specified event has occurred, modelling using a binomial distribution might be appropriate and then the empirical estimate is the maximum likelihood estimate. It is the Bayesian estimate for the same case if certain assumptions are made for the prior distribution of the probability. If a trial yields more information, the empirical probability can be improved on by adopting further assumptions in the form of a statistical model: if such a model is fitted, it can be used to derive an estimate of the probability of the specified event. Advantages and disadvantages Advantages An advantage of estimating probabilities using empirical probabilities is that this procedure is relatively free of assumptions. For example, consider estimating the probability among a population of men that they satisfy two conditions: that they are over 6 feet in height. that they prefer strawberry jam to raspberry jam. A direct estimate could be found by counting the number of men who satisfy both conditions to give the empirical probability of the combined condition. An alternative estimate could be found by multiplying the proportion of men who are over 6 feet in height with the proportion of men who prefer strawberry jam to raspb
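The relative-frequency ratio described above is straightforward to compute; a minimal sketch (the function name and example data are illustrative):

```python
def empirical_probability(outcomes, event):
    """Relative frequency: (# outcomes where the event occurs) / (total # outcomes)."""
    n = len(outcomes)
    if n == 0:
        raise ValueError("need at least one trial")
    m = sum(1 for o in outcomes if event(o))
    return m / n

# Ten recorded die rolls; estimate P(roll is even).
rolls = [2, 5, 6, 1, 4, 4, 3, 2, 6, 5]
p_hat = empirical_probability(rolls, lambda r: r % 2 == 0)
print(p_hat)  # 0.6  (6 even rolls out of 10)
```

As the text notes, this estimate is also the maximum likelihood estimate under a binomial model for the trial outcomes.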
https://en.wikipedia.org/wiki/Triple%20correlation
The triple correlation of an ordinary function on the real line is the integral of the product of that function with two independently shifted copies of itself: ∫ f*(x) f(x + s) f(x + t) dx, where f* denotes the complex conjugate. The Fourier transform of the triple correlation is the bispectrum. The triple correlation extends the concept of autocorrelation, which correlates a function with a single shifted copy of itself and thereby enhances its latent periodicities. History The theory of the triple correlation was first investigated by statisticians examining the cumulant structure of non-Gaussian random processes. It was also independently studied by physicists as a tool for spectroscopy of laser beams. Hideya Gamo in 1963 described an apparatus for measuring the triple correlation of a laser beam, and also showed how phase information can be recovered from the real part of the bispectrum, up to sign reversal and linear offset. However, Gamo's method implicitly requires the Fourier transform to never be zero at any frequency. This requirement was relaxed, and the class of functions which are known to be uniquely identified by their triple (and higher-order) correlations was considerably expanded, by the study of Yellott and Iverson (1992). Yellott & Iverson also pointed out the connection between triple correlations and the visual texture discrimination theory proposed by Béla Julesz. Applications Triple correlation methods are frequently used in signal processing for treating signals that are corrupted by additive white Gaussian noise; in particular, triple correlation techniques are suitable when multiple observations of the signal are available and the signal may be translating in between the observations, e.g., a sequence of images of an object translating on a noisy background. What makes the triple correlation particularly useful for such tasks are three properties: (1) it is invariant under translation of the underlying signal; (2) it is unbiased in additive Gaussian noise; and (3) it retains nearly all of the relevant
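For sampled data, a discrete periodic analogue of the definition above replaces the integral with a sum over cyclic shifts; a sketch assuming a real-valued, periodically extended signal (so no conjugation is needed), which also checks the translation-invariance property from the text:

```python
def triple_correlation(f):
    """Discrete triple correlation of a real periodic signal f of length N:
    T[s][t] = sum over x of f[x] * f[x+s] * f[x+t], indices taken mod N."""
    n = len(f)
    return [[sum(f[x] * f[(x + s) % n] * f[(x + t) % n] for x in range(n))
             for t in range(n)]
            for s in range(n)]

signal = [1, 2, 0, -1, 3]
shifted = signal[2:] + signal[:2]   # the same signal, circularly translated

# Property (1): the triple correlation is invariant under translation.
assert triple_correlation(signal) == triple_correlation(shifted)
print(triple_correlation(signal)[0][0])  # 35 (the sum of cubes of the samples)
```

Integer samples are used so the equality check is exact; for noisy float data one would compare within a tolerance.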
https://en.wikipedia.org/wiki/Neurovascular%20bundle
A neurovascular bundle is a structure that binds nerves and veins (and in some cases arteries and lymphatics) with connective tissue so that they travel in tandem through the body. Structure There are two types of neurovascular bundles: superficial bundles and deep bundles. As arteries do not travel within the superficial fascia (loose connective tissue under the skin), superficial neurovascular bundles differ from deep neurovascular bundles in both composition and function. Superficial bundles Superficial neurovascular bundles do not include arteries, and consist primarily of capillaries and nerves. Because capillaries function as the sites for substance exchange between interstitial fluid and blood, they tend to have a large surface area and a short diffusion path. Normally, capillaries consist of a central lumen lined with an endothelium, a single layer of squamous epithelial cells. Deep bundles Deep neurovascular bundles, which often include arteries, have a more complicated structure than superficial neurovascular bundles. Since arteries have high intraluminal blood pressure relative to capillaries and veins, these bundles have smooth muscle and connective tissue structures outside the endothelium. This structure allows arteries to contract and relax, remaining flexible and able to transfer blood under pressure. Function Neurovascular bundles are useful for axons, ensuring a continuous supply of oxygenated blood to important nerves. Clinical significance Both superficial and deep neurovascular bundles are at risk during surgical incisions. Leg surgery In surgeries, the principal superficial neurovascular bundles at risk are, medially, the great saphenous vein and its accompanying nerve, and, laterally, the superficial peroneal nerve. The superficial peroneal nerve originates from the common peroneal nerve near the neck of the fibula and passes between the peroneus longus and brevis muscles, supplying motor branches to these muscles. The superficial branch then con
https://en.wikipedia.org/wiki/Drum%20charts
Drum charts are musical charts written for drummers. They are used to help guide the drummer through the music. Sometimes they are meant to be read literally and other times they are used as suggestions for what the drummer should play. Drum charts include their own musical vocabulary. The music written for drummers is not the same as that written for, say, a pianist. Drummers use their own symbols and language in their charts. For example, a "middle C" note written on a staff for pianists is equivalent to the "snare drum" for drummers. Or, the note "F" on the piano staff is equal to the "bass drum." There is no set standard for writing drum music, but there are conventions that are usually adhered to. For example, in Steve Houghton's book Studio & Big Band Drumming, p. 9, under "Rock Patterns", Houghton lists each drum or cymbal used on a percussion staff and states the assigned "note." The placement of the drum notes is very typical of a drum chart but may not always be the same within the drum community. Sources Houghton, Steve. Studio & Big Band Drumming. Iowa: C.L. Barnhouse Company, 1985
https://en.wikipedia.org/wiki/Squirrel%20Systems
Squirrel Systems is a Burnaby-based point of sale vendor specializing in hospitality management systems. Squirrel is based in Burnaby, Canada. History Squirrel Systems was founded in 1984, and released the first restaurant point of sale system to use an integrated diskless touchscreen terminal for order management. Originally a wholly owned subsidiary of Sulcus Hospitality Technologies Corporation, in 1998 Sulcus merged with Eltrax Systems, Incorporated (Nasdaq SmallCap: ELTX). Squirrel is currently a wholly owned subsidiary of Marin Investments Ltd. Squirrel Workstation One of the unique characteristics of Squirrel's original product was the use of hardened LCD touchscreen terminals. Unlike other systems that used keyboards and CRT monitors, Squirrel terminals had no moving parts and were easily adapted to any operating environment. The original Squirrel terminals reached over 35,000 installed units worldwide, and were the first to integrate an LCD panel, credit card reader, employee ID reader, and CPU inside a single unit. Later units would incorporate IP connectivity, remote booting of a customized Linux operating system, and a Java virtual machine. Squirrel Embedded Linux In 1998 Squirrel Systems released Squirrel Embedded Linux (SEL), a customized distribution of Linux for "thin client" terminal architecture. SEL has several characteristics that were unique at the time of development, including primary support for diskless workstations, customized high-volume touchscreen drivers, integrated Java virtual machine with hardware control, and two-stage booting from a Windows server. Industry awards In 2010, O'Charley's named Squirrel as its Enterprise Support Partner of the Year at the annual Inukshuk Business Partner Awards. Squirrel Systems was awarded the 2009 Epson Envision Award for Innovation for its Squirrel in a Box product. Squirrel Systems was awarded the 1999 Independent Cash Register Dealers Association Silver Award for Outstanding Sponsor in Syst
https://en.wikipedia.org/wiki/Christopher%20Budd%20%28mathematician%29
Christopher John Budd (born 15 February 1960) is a British mathematician known especially for his contribution to non-linear differential equations and their applications in industry. He is currently Professor of Applied Mathematics at the University of Bath, and was Professor of Geometry at Gresham College from 2016 to 2020. Budd gained his Bachelor's degree in mathematics at St John's College, Cambridge, where he was senior wrangler. He went on to be awarded a D.Phil. from Oxford University, studying numerical methods for nonlinear elliptic partial differential equations under the supervision of John Norbury. He spent three years as a fellow of St John's College, Oxford, working in numerical analysis at the Oxford University Computing Laboratory and as a fellow sponsored by the CEGB developing numerical methods for third-order partial differential equations. He went on to a permanent post as a lecturer in numerical analysis at the University of Bristol before gaining a position as Professor of Applied Mathematics at the University of Bath in 1995. He was appointed the Professor of Geometry at Gresham College in 2016, where he delivered a series of public lectures on Mathematics and the Making of the Modern World. His research interests involve the analysis, application and numerical analysis of the solution of nonlinear differential equations with a particular emphasis on problems which arise in industry. His recent work has been in geometric integration which aims to develop numerical methods which reproduce qualitative structures in differential equations. He is co-director of the interdisciplinary Centre for Nonlinear Mechanics at the University of Bath and is active in promoting interdisciplinary collaboration both nationally and internationally. Budd is a passionate populariser of mathematics, reflected in his appointment as Chair of Mathematics of the Royal Institution of Great Britain in 2000. He works on a number of projects with schools and has wr
https://en.wikipedia.org/wiki/Flag%20of%20Cantabria
The flag of the Spanish region of Cantabria is made up of two horizontal stripes of equal width, white on the top and red on the bottom, and the region's coat of arms in its centre. The design is established in the text of the Autonomy Statute, except for the coat of arms, which was established by a Law of the Regional Assembly approved on 30 December 1981. The design traces its lineage to the ship registration flag of the maritime province of Santander, assigned by Royal Order on 30 July 1845. In 2016, the Parliament of Cantabria also recognized the Cantabrian labarum as a symbol of the Cantabrian people, urging the institutions and civil society of Cantabria to promote its use. Most townships have already accepted the proposition of using said flag placing it on the balcony of the Town Hall. Notes
https://en.wikipedia.org/wiki/Department%20of%20Mathematics%2C%20University%20of%20Manchester
The Department of Mathematics at the University of Manchester is one of the largest unified mathematics departments in the United Kingdom, with over 90 academic staff and an undergraduate intake of roughly 400 students per year (including students studying mathematics with a minor in another subject) and approximately 200 postgraduate students in total. The School of Mathematics was formed in 2004 by the merger of the mathematics departments of University of Manchester Institute of Science and Technology (UMIST) and the Victoria University of Manchester (VUM). In July 2007 the department moved into a purpose-designed building, the first three floors of the Alan Turing Building, on Upper Brook Street. In a Faculty restructure in 2019 the School of Mathematics reverted to the Department of Mathematics. It is one of five Departments that make up the School of Natural Sciences, which together with the School of Engineering now constitutes the Faculty of Science and Engineering at Manchester. Organization The current head of the department is Andrew Hazel. The department is divided into three groups: Pure Mathematics (Head: Charles Eaton), Applied Mathematics (Head: David Sylvester), and Probability and Statistics (Head: Korbinian Strimmer). The director of research is William Parnell. The Manchester Institute for Mathematical Sciences (MIMS) is a unit of the department focusing on the organising of mathematical colloquia and conferences, and research visitors. MIMS is headed by Nick Higham. Other high-profile mathematicians at Manchester include Martin Taylor and Jeff Paris. Since its formation, the department has made some influential appointments including the topologist Viktor Buchstaber and model theorist Alex Wilkie. Numerical analyst Jack Dongarra, one of the authors of LINPACK, was appointed in 2007 as Turing Fellow. In the autumn of 2007, Albert Shiryaev was appointed to a 20% chair. Shiryaev is known for his work on probability theory (he was a student of An
https://en.wikipedia.org/wiki/Tangential%20and%20normal%20components
In mathematics, given a vector at a point on a curve, that vector can be decomposed uniquely as a sum of two vectors, one tangent to the curve, called the tangential component of the vector, and another one perpendicular to the curve, called the normal component of the vector. Similarly, a vector at a point on a surface can be broken down the same way. More generally, given a submanifold N of a manifold M, and a vector in the tangent space to M at a point of N, it can be decomposed into the component tangent to N and the component normal to N. Formal definition Surface More formally, let S be a surface, and x be a point on the surface. Let v be a vector at x. Then one can write uniquely v as a sum v = v∥ + v⊥, where the first vector in the sum is the tangential component and the second one is the normal component. It follows immediately that these two vectors are perpendicular to each other. To calculate the tangential and normal components, consider a unit normal n to the surface, that is, a unit vector perpendicular to S at x. Then, v⊥ = (v · n)n and thus v∥ = v − (v · n)n, where "·" denotes the dot product. Another formula for the tangential component is v∥ = −n × (n × v), where "×" denotes the cross product. Note that these formulas do not depend on the particular unit normal n used (there exist two unit normals to any surface at a given point, pointing in opposite directions, so one of the unit normals is the negative of the other one). Submanifold More generally, given a submanifold N of a manifold M and a point p of N, we get a short exact sequence involving the tangent spaces: 0 → TpN → TpM → TpM/TpN → 0. The quotient space TpM/TpN is a generalized space of normal vectors. If M is a Riemannian manifold, the above sequence splits, and the tangent space of M at p decomposes as a direct sum of the component tangent to N and the component normal to N: TpM = TpN ⊕ NpN. Thus every tangent vector v in TpM splits as v = v∥ + v⊥, where v∥ is in TpN and v⊥ is in NpN. Computations Suppose N is given by non-degenerate equations. If N is given explicitly, via parametric equations (such as a parametric curve), then the derivative gi
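The dot-product formula for the normal component translates directly into code; a minimal sketch for vectors in three dimensions (function and variable names are illustrative):

```python
def split_components(v, n_hat):
    """Split vector v into its tangential and normal components relative to a
    surface with unit normal n_hat: the normal part is (v . n_hat) n_hat, and
    the tangential part is the remainder v - (v . n_hat) n_hat."""
    dot = sum(a * b for a, b in zip(v, n_hat))
    normal = tuple(dot * c for c in n_hat)
    tangential = tuple(a - b for a, b in zip(v, normal))
    return tangential, normal

# A horizontal plane has unit normal along z.
v = (3.0, -2.0, 5.0)
tang, norm = split_components(v, (0.0, 0.0, 1.0))
print(tang, norm)  # (3.0, -2.0, 0.0) (0.0, 0.0, 5.0)
```

The two pieces sum back to v, and the tangential part is perpendicular to the normal, as the definition requires.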
https://en.wikipedia.org/wiki/Geniculum
A geniculum is a small genu, or angular knee-like structure. The term is often used in anatomical nomenclature to designate a sharp knee-like bend in a small structure or organ. For example, in the facial canal, the genicular ganglion is situated on the geniculum of the facial nerve, the point where the nerve changes its direction.
https://en.wikipedia.org/wiki/BamHI
BamHI (pronounced "Bam H one") (from Bacillus amyloliquefaciens) is a type II restriction endonuclease, having the capacity for recognizing short sequences (6 bp) of DNA and specifically cleaving them at a target site. The structure–function relations of BamHI described here follow Newman et al. (1995). BamHI binds at the recognition sequence 5'-GGATCC-3', and cleaves these sequences just after the 5'-guanine on each strand. This cleavage results in sticky ends which are 4 bp long. In its unbound form, BamHI displays a central β-sheet, which resides in between α-helices. BamHI undergoes a series of unconventional conformational changes upon DNA recognition. This allows the DNA to maintain its normal B-DNA conformation without distorting to facilitate enzyme binding. BamHI is a symmetric dimer. DNA is bound in a large cleft that is formed between dimers; the enzyme binds in a "crossover" manner. Each BamHI subunit makes the majority of its backbone contacts with the phosphates of a DNA half site but base pair contacts are made between each BamHI subunit and nitrogenous bases in the major groove of the opposite DNA half site. The protein binds the bases through either direct hydrogen bonds or water-mediated H-bonds between the protein and every H-bond donor/acceptor group in the major groove. Major groove contacts are formed by atoms residing on the amino-terminus of a parallel 4 helix bundle. This bundle marks the BamHI dimer interface, and it is thought that the dipole moments of the NH2-terminal atoms on this bundle may contribute to electrostatic stabilization. Sites of Recognition Between BamHI and DNA The BamHI enzyme is capable of making a large number of contacts with DNA. Water-mediated hydrogen bonding, as well as both main-chain and side-chain interactions aid in binding of the BamHI recognition sequence. In the major groove, the majority of enzyme/DNA contacts take place at the amino terminus of the parallel-4-helix bundle, made up of
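The recognition and cleavage behaviour described above can be illustrated by scanning a sequence string for GGATCC sites; a sketch (the function name and example sequence are invented for illustration, and only top-strand cut positions are reported):

```python
RECOGNITION = "GGATCC"  # BamHI recognition site, written 5'->3'

def bamhi_cut_positions(dna):
    """Indices at which BamHI cleaves the top strand: just after the 5'
    guanine of each GGATCC site (i.e. the cut falls G^GATCC)."""
    dna = dna.upper()
    cuts = []
    start = 0
    while True:
        i = dna.find(RECOGNITION, start)
        if i == -1:
            return cuts
        cuts.append(i + 1)   # cleave between the first and second base
        start = i + 1

plasmid = "ATGGATCCTTAAGGATCCGA"
print(bamhi_cut_positions(plasmid))  # [3, 13]
```

Cutting after the first G on both strands of the palindromic site is what produces the 4-base (GATC) 5' overhangs, the "sticky ends" mentioned in the text.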
https://en.wikipedia.org/wiki/Stream%20X-Machine
The Stream X-machine (SXM) is a model of computation introduced by Gilbert Laycock in his 1993 PhD thesis, The Theory and Practice of Specification Based Software Testing. Based on Samuel Eilenberg's X-machine, an extended finite-state machine for processing data of the type X, the Stream X-Machine is a kind of X-machine for processing a memory data type Mem with associated input and output streams In* and Out*, that is, where X = Out* × Mem × In*. The transitions of a Stream X-Machine are labelled by functions of the form φ: Mem × In → Out × Mem, that is, functions which compute an output value and update the memory, from the current memory and an input value. Although the general X-machine had been identified in the 1980s as a potentially useful formal model for specifying software systems, it was not until the emergence of the Stream X-Machine that this idea could be fully exploited. Florentin Ipate and Mike Holcombe went on to develop a theory of complete functional testing (Holcombe and Ipate, 1998, Correct Systems: Building a Business Process Solution, Springer-Verlag), in which complex software systems with hundreds of thousands of states and millions of transitions could be decomposed into separate SXMs that could be tested exhaustively, with a guaranteed proof of correct integration. Because of the intuitive interpretation of Stream X-Machines as "processing agents with inputs and outputs", they have attracted increasing interest because of their utility in modelling real-world phenomena. The SXM model has important applications in fields as diverse as computational biology, software testing and agent-based computational economics. The Stream X-Machine A Stream X-Machine (SXM) is an extended finite-state machine with auxiliary memory, inputs and outputs. It is a variant of the general X-machine, in which the fundamental data type X = Out* × Mem × In*, that is, a tuple consisti
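The labelled transitions φ: Mem × In → Out × Mem can be sketched directly; the machine below is a toy single-state example invented for illustration, not one from the literature:

```python
# Each processing function phi maps (memory, input) -> (output, new memory).

def add(mem, x):       # accumulate the input into memory
    return f"sum={mem + x}", mem + x

def reset(mem, x):     # discard the old memory, restart from the new input
    return "reset", x

# Transition table: (state, input label) -> (phi, next state).
TRANSITIONS = {
    ("run", "num"):   (add,   "run"),
    ("run", "clear"): (reset, "run"),
}

def run_sxm(inputs, state="run", mem=0):
    """Consume an input stream, producing an output stream and final memory."""
    out = []
    for label, value in inputs:
        phi, state = TRANSITIONS[(state, label)]
        output, mem = phi(mem, value)
        out.append(output)
    return out, mem

stream = [("num", 3), ("num", 4), ("clear", 0), ("num", 5)]
print(run_sxm(stream))  # (['sum=3', 'sum=7', 'reset', 'sum=5'], 5)
```

A realistic SXM would have several control states selecting among many such functions; the point of the sketch is only the shape of φ, which reads and rewrites Mem while mapping the input stream to the output stream.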
https://en.wikipedia.org/wiki/Majority%20problem
The majority problem, or density classification task, is the problem of finding one-dimensional cellular automaton rules that accurately perform majority voting. Using local transition rules, cells cannot know the total count of all the ones in the system. In order to count the number of ones (or, by symmetry, the number of zeros), the system requires a logarithmic number of bits in the total size of the system. It also requires that the system send messages over a distance linear in the size of the system and that the system recognize a non-regular language. Thus, this problem is an important test case in measuring the computational power of cellular automaton systems. Problem statement Given a configuration of a two-state cellular automaton with i + j cells total, i of which are in the zero state and j of which are in the one state, a correct solution to the voting problem must eventually set all cells to zero if i > j and must eventually set all cells to one if i < j. The desired eventual state is unspecified if i = j. The problem can also be generalized to testing whether the proportion of zeros and ones is above or below some threshold other than 50%. In this generalization, one is also given a threshold ρ; a correct solution to the voting problem must eventually set all cells to zero if the fraction of cells in the one state is less than ρ, and must eventually set all cells to one if that fraction is greater than ρ. The desired eventual state is unspecified if the fraction equals ρ. Approximate solutions Gács, Kurdyumov, and Levin found an automaton that, although it does not always solve the majority problem correctly, does so in many cases. In their approach to the problem, the quality of a cellular automaton rule is measured by the fraction of the possible starting configurations that it correctly classifies. The rule proposed by Gács, Kurdyumov, and Levin sets the state of each cell as follows. If a cell is 0, its next state is formed as the majority among the values of itself, its immediate neighbor to the left, and its neighbor three spaces to the left.
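The quoted rule covers only 0-cells; in the standard Gács–Kurdyumov–Levin rule a 1-cell symmetrically takes the majority of itself, its immediate neighbor to the right, and its neighbor three spaces to the right. A sketch of one update step on a ring of cells, assuming that mirrored completion:

```python
def majority(a, b, c):
    """Majority vote among three binary values."""
    return 1 if a + b + c >= 2 else 0

def gkl_step(cells):
    """One synchronous update of the Gacs-Kurdyumov-Levin rule on a ring:
    0-cells vote with their left-hand neighbours (self, left, left-by-3),
    1-cells with the mirrored right-hand neighbours."""
    n = len(cells)
    nxt = []
    for i, c in enumerate(cells):
        if c == 0:
            nxt.append(majority(c, cells[(i - 1) % n], cells[(i - 3) % n]))
        else:
            nxt.append(majority(c, cells[(i + 1) % n], cells[(i + 3) % n]))
    return nxt

# A lone dissenting 0 in a sea of 1s is voted out in a single step.
print(gkl_step([1, 1, 1, 0, 1, 1, 1]))  # [1, 1, 1, 1, 1, 1, 1]
```

Iterating `gkl_step` drives most (but, as the text says, not all) initial configurations to the all-zeros or all-ones fixed point matching the initial majority.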
https://en.wikipedia.org/wiki/Neuroregeneration
Neuroregeneration involves the regrowth or repair of nervous tissues, cells or cell products. Neuroregenerative mechanisms may include generation of new neurons, glia, axons, myelin, or synapses. Neuroregeneration differs between the peripheral nervous system (PNS) and the central nervous system (CNS) by the functional mechanisms involved, especially in the extent and speed of repair. When an axon is damaged, the distal segment undergoes Wallerian degeneration, losing its myelin sheath. The proximal segment can either die by apoptosis or undergo the chromatolytic reaction, which is an attempt at repair. In the CNS, synaptic stripping occurs as glial foot processes invade the dead synapse. Nervous system injuries affect over 90,000 people every year. Spinal cord injuries alone affect an estimated 10,000 people each year. As a result of this high incidence of neurological injuries, nerve regeneration and repair, a subfield of neural tissue engineering, is becoming a rapidly growing field dedicated to the discovery of new ways to recover nerve functionality after injury. The nervous system is divided by neurologists into two parts: the central nervous system (which consists of the brain and spinal cord) and the peripheral nervous system (which consists of cranial and spinal nerves along with their associated ganglia). While the peripheral nervous system has an intrinsic ability for repair and regeneration, the central nervous system is, for the most part, incapable of self-repair and regeneration. There is no treatment for recovering human nerve function after injury to the central nervous system. Multiple attempts at nerve re-growth across the PNS-CNS transition have not been successful. There is simply not enough knowledge about regeneration in the central nervous system. In addition, although the peripheral nervous system has the capability for regeneration, much research still needs to be done to optimize the environment for maximum regrowth potential. Neurore
https://en.wikipedia.org/wiki/Temperature-sensitive%20mutant
Temperature-sensitive mutants are variants of genes that allow normal function of the organism at low temperatures, but altered function at higher temperatures. Cold-sensitive mutants are variants of genes that allow normal function of the organism at higher temperatures, but altered function at low temperatures. Mechanism Most temperature-sensitive mutations affect proteins, and cause loss of protein function at the non-permissive temperature. The permissive temperature is one at which the protein typically can fold properly, or remain properly folded. At higher temperatures, the protein is unstable and ceases to function properly. These mutations are usually recessive in diploid organisms. Temperature-sensitive mutants provide a reversible mechanism for depleting particular gene products at chosen stages of growth, which is easily done by changing the temperature of growth. Permissive temperature The permissive temperature is the temperature at which a temperature-sensitive mutant gene product takes on a normal, functional phenotype. When a temperature-sensitive mutant is grown in a permissive condition, the mutated gene product behaves normally (meaning that the phenotype is not observed), even if there is a mutant allele present. This results in the survival of the cell or organism, as if it were a wild type strain. In contrast, the nonpermissive temperature or restrictive temperature is the temperature at which the mutant phenotype is observed. Temperature-sensitive mutations are usually missense mutations, which retain the function of the necessary gene at the standard, permissive, low temperature, lack that function at the high, non-permissive temperature, and display a hypomorphic (partial loss of gene function) phenotype at an intermediate, semi-permissive temperature. Use in research Temperature-sensitive mutants are useful in biological research. They allow the study of essential processes required for the surviv
https://en.wikipedia.org/wiki/Mixed%20Hodge%20module
In mathematics, mixed Hodge modules are the culmination of Hodge theory, mixed Hodge structures, intersection cohomology, and the decomposition theorem, yielding a coherent framework for discussing variations of degenerating mixed Hodge structures through the six functor formalism. Essentially, these objects are a pair of a filtered D-module together with a perverse sheaf such that the functor from the Riemann–Hilbert correspondence sends to . This makes it possible to construct a Hodge structure on intersection cohomology, one of the key problems when the subject was discovered. This was solved by Morihiko Saito, who found a way to use the filtration on a coherent D-module as an analogue of the Hodge filtration for a Hodge structure. This made it possible to give a Hodge structure on an intersection cohomology sheaf, the simple objects in the Abelian category of perverse sheaves. Abstract structure Before going into the details of defining mixed Hodge modules, which are quite elaborate, it is useful to get a sense of what the category of mixed Hodge modules actually provides. Given a complex algebraic variety there is an abelian category with the following functorial properties There is a faithful functor called the rationalization functor. This gives the underlying rational perverse sheaf of a mixed Hodge module. There is a faithful functor sending a mixed Hodge module to its underlying D-module These functors behave well with respect to the Riemann–Hilbert correspondence, meaning for every mixed Hodge module there is an isomorphism . In addition, there are the following categorical properties The category of mixed Hodge modules over a point is isomorphic to the category of mixed Hodge structures, Every object in admits a weight filtration such that every morphism in preserves the weight filtration strictly, the associated graded objects are semi-simple, and in the category of mixed Hodge modules over a point, this correspo
https://en.wikipedia.org/wiki/Wave%20turbulence
In continuum mechanics, wave turbulence is a set of nonlinear waves deviating far from thermal equilibrium. Such a state is usually accompanied by dissipation. It is either decaying turbulence or requires an external source of energy to sustain it. Examples are waves on a fluid surface excited by winds or ships, and waves in plasma excited by electromagnetic waves, etc. Appearance External sources usually excite, by some resonant mechanism, waves with frequencies and wavelengths in some narrow interval. For example, shaking a container with frequency ω excites surface waves with frequency ω/2 (parametric resonance, discovered by Michael Faraday). When wave amplitudes are small – which usually means that the wave is far from breaking – only those waves exist that are directly excited by an external source. When, however, wave amplitudes are not very small (for surface waves: when the fluid surface is inclined by more than a few degrees) waves with different frequencies start to interact. That leads to an excitation of waves with frequencies and wavelengths in wide intervals, not necessarily in resonance with an external source. In experiments with high shaking amplitudes one initially observes waves that are in resonance with one another. Thereafter, both longer and shorter waves appear as a result of wave interaction. The appearance of shorter waves is referred to as a direct cascade while longer waves are part of an inverse cascade of wave turbulence. Statistical wave turbulence and discrete wave turbulence Two generic types of wave turbulence should be distinguished: statistical wave turbulence (SWT) and discrete wave turbulence (DWT). In SWT theory exact and quasi-resonances are omitted, which allows using some statistical assumptions and describing the wave system by kinetic equations and their stationary solutions – the approach developed by Vladimir E. Zakharov. These solutions are called Kolmogorov–Zakharov (KZ) energy spectra and have the form k^(−α), with k t
https://en.wikipedia.org/wiki/Catherine%20Flon
Catherine Flon (1772-1831) was a Haitian seamstress, patriot and national heroine. She is regarded as one of the symbols of the Haitian Revolution and independence. She is celebrated for sewing the first Haitian flag on May 18, 1803 and maintains an important place in Haitian memory of the Revolution to this day. Life Catherine Flon was born on December 2, 1772 in Arcahaie in Saint-Domingue. Her parents traded in textiles from France. She became a seamstress with her own workshop, and had several apprentices. She was the goddaughter of Jean-Jacques Dessalines. Creation of the flag According to Haitian revolutionary tradition, Flon created the country's first flag on May 18, 1803, the last day of the Congress of Arcahaie. There, the leader of the Revolution, Jean-Jacques Dessalines, Flon's godfather, cut apart a French tricolor with his sabre, demonstrating his desire to break away from France. He gave the pieces to Flon, who stitched them back together, while leaving out the central white strip. In Haitian lore, the colors of the new flag took on a racialized meaning: the blue and red stripes represented a union between the black and mulatto citizens of Haiti. Historians have noted some limitations within this legendary history of the flag's creation. For instance, primary sources from the Revolution reveal that rebels had used blue-and-red flags before the Arcahaie conference. Also, the first Haitians to use the bicolor flag had meant it to represent an extension of French Revolutionary values, rather than a rejection of them; early revolutionaries had fought to preserve the 1794 law of emancipation rather than to gain independence. In Haitian culture and memory Catherine Flon is regarded as one of the three most symbolic heroines of the Haitian independence, alongside Cécile Fatiman and Dédée Bazile. Her birthplace of Arcahaie is today referred to as "flag town" and the date on which she is said to have made the first flag, May 18, has become a national ho
https://en.wikipedia.org/wiki/Shimura%20variety
In number theory, a Shimura variety is a higher-dimensional analogue of a modular curve that arises as a quotient variety of a Hermitian symmetric space by a congruence subgroup of a reductive algebraic group defined over Q. Shimura varieties are not algebraic varieties but are families of algebraic varieties. Shimura curves are the one-dimensional Shimura varieties. Hilbert modular surfaces and Siegel modular varieties are among the best known classes of Shimura varieties. Special instances of Shimura varieties were originally introduced by Goro Shimura in the course of his generalization of the complex multiplication theory. Shimura showed that while initially defined analytically, they are arithmetic objects, in the sense that they admit models defined over a number field, the reflex field of the Shimura variety. In the 1970s, Pierre Deligne created an axiomatic framework for the work of Shimura. In 1979, Robert Langlands remarked that Shimura varieties form a natural realm of examples for which equivalence between motivic and automorphic L-functions postulated in the Langlands program can be tested. Automorphic forms realized in the cohomology of a Shimura variety are more amenable to study than general automorphic forms; in particular, there is a construction attaching Galois representations to them. Definition Shimura datum Let S = ResC/R Gm be the Weil restriction of the multiplicative group from complex numbers to real numbers. It is a real algebraic group, whose group of R-points, S(R), is C* and group of C-points is C*×C*. A Shimura datum is a pair (G, X) consisting of a (connected) reductive algebraic group G defined over the field Q of rational numbers and a G(R)-conjugacy class X of homomorphisms h: S → GR satisfying the following axioms: For any h in X, only weights (0,0), (1,−1), (−1,1) may occur in gC, i.e. the complexified Lie algebra of G decomposes into a direct sum where for any z ∈ S, h(z) acts trivially on the first summand and via
https://en.wikipedia.org/wiki/Acetic%20acid/hydrocortisone
Acetic acid/hydrocortisone is a commonly used combination drug to treat infections of the outer ear and ear canal. Branded as Vosol HC and Acetasol HC, it combines the antibacterial and antifungal action of acetic acid with the anti-inflammatory functions of hydrocortisone.
https://en.wikipedia.org/wiki/Scleroscope
A scleroscope is a device used to measure rebound hardness. It consists of a steel ball dropped from a fixed height. The device was invented in 1907. As an improvement on this rough method, the Leeb Rebound Hardness Test, invented in the 1970s, uses the ratio of impact and rebound velocities (as measured by a magnetic inducer) to determine hardness. See also External links Mechanical Properties of Metals Scleroscope Hardness Test Testing the Hardness of Metals Hardness instruments Materials science Metallurgy
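The Leeb test mentioned above conventionally reports hardness as the rebound-to-impact velocity ratio scaled by 1000 (HL = 1000 · v_rebound / v_impact). A minimal sketch of that calculation, with illustrative velocities:

```python
def leeb_hardness(v_impact, v_rebound):
    """Leeb hardness number HL = 1000 * (rebound velocity / impact velocity)."""
    if v_rebound > v_impact:
        raise ValueError("rebound velocity cannot exceed impact velocity")
    return 1000.0 * v_rebound / v_impact

# e.g. an impact body striking at 2.0 m/s and rebounding at 1.4 m/s:
print(leeb_hardness(2.0, 1.4))  # → 700.0
```

Harder materials dissipate less energy in plastic deformation, so the rebound velocity, and hence HL, is higher.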
https://en.wikipedia.org/wiki/Electronic%20oscillation
Electronic oscillation is a repeating cyclical variation in voltage or current in an electrical circuit, resulting in a periodic waveform. The frequency of the oscillation in hertz is the number of times the cycle repeats per second. The recurrence may be in the form of a varying voltage or a varying current. The waveform may be sinusoidal or some other shape when its magnitude is plotted against time. Electronic oscillation may be intentionally caused, as in devices designed as oscillators, or it may be the result of unintentional positive feedback from the output of an electronic device to its input. The latter appears often in feedback amplifiers (such as operational amplifiers) that do not have sufficient gain or phase margins. In this case, the oscillation often interferes with or compromises the amplifier's intended function, and is known as parasitic oscillation.
https://en.wikipedia.org/wiki/Master%20of%20Monsters%20%28video%20game%29
Master of Monsters is a turn-based strategy game developed by SystemSoft for the MSX and NEC PC8801. It was ported to a variety of consoles and PCs including the PC Engine CD, NEC PC9801, and Sega Genesis/Mega Drive. While it never had the same success as its SystemSoft stablemate Daisenryaku, the game garnered a loyal following. Its success in the North American market on the Sega Genesis proved sufficient for a sequel on the Sega Saturn, and an anime art-style enhanced Sony PlayStation version titled Disciples of Gaia with a Japanese role-playing game feel. Master of Monsters: Disciples of Gaia was released in 1998. Gameplay Gameplay engages players by permitting them to summon and move monsters around a board in an effort to capture towers and to eventually defeat the opponents (which are controlled either by other humans or by the computer program). Moves are based on a hexagonal board structure, such that every tile on the board is adjacent to six other tiles. Other notable features were the large variety of monsters, upgrading ("leveling up") of veteran units and control of a "Master" character who, if killed, can end the game for that player. The focus of the game is strategic, despite the fantasy-type characters that might imply an RPG element. Other than the existence of the Master character and magic in the game, the gameplay is very similar to System Soft's more hardcore modern warfare strategic wargame series Daisenryaku, with the exception that some versions of the Master of Monsters (such as Master of Monsters – Final) series allow equippable items, weapons and armor. Comparisons The later Lords of Chaos by Julian Gollop of Mythos Games shares many of the same elements of summoning and tactics, along with the earlier title Chaos from 1985. David White, creator of the open-source turn-based strategy game The Battle for Wesnoth, cited Master of Monsters as an inspiration. Master of Monsters was also compared to later games such as the role-playing
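The hexagonal board structure described above, where every tile touches six others, is commonly modeled with axial coordinates. A small illustrative sketch (not the game's actual data model):

```python
# The six neighbor offsets of a hex tile in axial coordinates (q, r).
AXIAL_DIRECTIONS = [(1, 0), (1, -1), (0, -1), (-1, 0), (-1, 1), (0, 1)]

def hex_neighbors(q, r):
    """Return the six tiles adjacent to (q, r) on an unbounded hex grid."""
    return [(q + dq, r + dr) for dq, dr in AXIAL_DIRECTIONS]

print(len(hex_neighbors(0, 0)))  # → 6
```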
https://en.wikipedia.org/wiki/Spherical%20polyhedron
In geometry, a spherical polyhedron or spherical tiling is a tiling of the sphere in which the surface is divided or partitioned by great arcs into bounded regions called spherical polygons. Much of the theory of symmetrical polyhedra is most conveniently derived in this way. The most familiar spherical polyhedron is the soccer ball, thought of as a spherical truncated icosahedron. The next most popular spherical polyhedron is the beach ball, thought of as a hosohedron. Some "improper" polyhedra, such as hosohedra and their duals, dihedra, exist as spherical polyhedra, but their flat-faced analogs are degenerate. The hexagonal beach ball, for example, is a hosohedron, and its dual is a dihedron. History The first known man-made polyhedra are spherical polyhedra carved in stone. Many have been found in Scotland, and appear to date from the neolithic period (the New Stone Age). During the 10th century, the Islamic scholar Abū al-Wafā' Būzjānī (Abu'l Wafa) wrote the first serious study of spherical polyhedra. At the start of the 19th century, Poinsot used spherical polyhedra to discover the four regular star polyhedra. In the middle of the 20th century, Coxeter used them to enumerate all but one of the uniform polyhedra, through the construction of kaleidoscopes (Wythoff construction). Examples All regular polyhedra, semiregular polyhedra, and their duals can be projected onto the sphere as tilings: Improper cases Spherical tilings allow cases that polyhedra do not, namely hosohedra: figures as {2,n}, and dihedra: figures as {n,2}. Generally, regular hosohedra and regular dihedra are used. Relation to tilings of the projective plane Spherical polyhedra having at least one inversive symmetry are related to projective polyhedra (tessellations of the real projective plane) – just as the sphere has a 2-to-1 covering map of the projective plane, projective polyhedra correspond under 2-fold cover to spherical polyhedra that are symmetric under ref
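The "improper" hosohedra and dihedra still behave like sphere tilings combinatorially: an n-gonal hosohedron {2,n} has 2 vertices (the poles), n edges and n lune faces, and its dual dihedron {n,2} swaps vertices and faces, so both satisfy Euler's formula V − E + F = 2. A quick check:

```python
def euler_characteristic(v, e, f):
    """Euler characteristic V - E + F; any tiling of the sphere gives 2."""
    return v - e + f

def hosohedron_counts(n):
    """An n-gonal hosohedron {2, n}: 2 polar vertices, n edges, n lune faces."""
    return (2, n, n)

def dihedron_counts(n):
    """Its dual, the n-gonal dihedron {n, 2}: n vertices, n edges, 2 faces."""
    return (n, n, 2)

print(euler_characteristic(*hosohedron_counts(6)))  # → 2
print(euler_characteristic(*dihedron_counts(6)))    # → 2
```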
https://en.wikipedia.org/wiki/American%20Association%20of%20Physicists%20in%20Medicine
The American Association of Physicists in Medicine (AAPM) is a scientific, educational, and professional organization of Medical Physicists. In 2011, it absorbed the American College of Medical Physics. Their headquarters are located at 1631 Prince Street, Alexandria, Virginia. Publications include two scientific journals, Medical Physics and the Journal of Applied Clinical Medical Physics (JACMP), as well as technical reports and symposium proceedings. The purposes of the American Association of Physicists in Medicine are to promote the application of physics to medicine and biology and to encourage interest and training in medical physics and related fields. AAPM has established Medical Physics as its primary scientific and informational journal. AAPM is a Member of the American Institute of Physics and has over 9700 members. Regional chapters of the AAPM hold regular scientific meetings for their members. For example, the New England Chapter typically meets three times per year. More information for the NEAAPM can be found at . See also American Board of Science in Nuclear Medicine Institute of Physics and Engineering in Medicine
https://en.wikipedia.org/wiki/Equidigital%20number
In number theory, an equidigital number is a natural number in a given number base that has the same number of digits as the number of digits in its prime factorization in the given number base, including exponents but excluding exponents equal to 1. For example, in base 10, 1, 2, 3, 5, 7, and 10 (2 × 5) are equidigital numbers. All prime numbers are equidigital numbers in any base. A number that is either equidigital or frugal is said to be economical. Mathematical definition Let b be the number base, and let D_b(n) be the number of digits in a natural number n for base b. A natural number n has the prime factorisation n = ∏_{p | n} p^(v_p(n)), where v_p(n) is the p-adic valuation of n, and n is an equidigital number in base b if D_b(n) = Σ_{p | n, v_p(n) = 1} D_b(p) + Σ_{p | n, v_p(n) > 1} (D_b(p) + D_b(v_p(n))). Properties Every prime number is equidigital. This also proves that there are infinitely many equidigital numbers. See also Extravagant number Frugal number Smith number Notes
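The prose definition can be checked directly: count the digits of n, then count the digits of every prime factor plus every exponent greater than 1. A minimal sketch for base 10 (helper names are my own):

```python
def digits(n, base=10):
    """Number of digits of a positive integer n in the given base."""
    d = 0
    while n:
        n //= base
        d += 1
    return d

def prime_factorization(n):
    """Return {prime: exponent} for n >= 2 by trial division."""
    factors = {}
    p = 2
    while p * p <= n:
        while n % p == 0:
            factors[p] = factors.get(p, 0) + 1
            n //= p
        p += 1
    if n > 1:
        factors[n] = factors.get(n, 0) + 1
    return factors

def is_equidigital(n, base=10):
    if n == 1:
        return True  # empty factorization; 1 is listed as equidigital above
    total = 0
    for p, exp in prime_factorization(n).items():
        total += digits(p, base)
        if exp > 1:          # exponents equal to 1 are not written, so not counted
            total += digits(exp, base)
    return digits(n, base) == total

print([n for n in range(1, 11) if is_equidigital(n)])  # → [1, 2, 3, 5, 7, 10]
```

The output reproduces the example list from the text; 16 = 2^4 is also equidigital (two digits, "2" and "4", versus the two digits of 16).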
https://en.wikipedia.org/wiki/Lutheran%20antigen%20system
The Lutheran antigen system is a classification of human blood based on the presence of substances called Lutheran antigens on the surfaces of red blood cells. There are 19 known Lutheran antigens. All of these antigens arise from variations in a gene called BCAM (basal cell adhesion molecule). The system is based on the expression of two codominant alleles, designated Lua and Lub. The antigens Aua and Aub, known as the Auberger antigens, were once thought to make up a separate blood group but were later shown to be Lutheran antigens arising from variations in the BCAM gene. The phenotypes Lu(a+b−) and Lu(a+b+) are found at various frequencies within populations. The Lu(a−b+) phenotype is the most common in all populations, whereas the Lu(a−b−) phenotype is uncommon. Though present in the fetus, it is seldom the cause of erythroblastosis fetalis or of transfusion reactions. Notes Blood antigen systems Transfusion medicine
https://en.wikipedia.org/wiki/HITRAN
HITRAN (an acronym for High Resolution Transmission) molecular spectroscopic database is a compilation of spectroscopic parameters used to simulate and analyze the transmission and emission of light in gaseous media, with an emphasis on planetary atmospheres. The knowledge of spectroscopic parameters for transitions between energy levels in molecules (and atoms) is essential for interpreting and modeling the interaction of radiation (light) within different media. For half a century, HITRAN has been considered to be an international standard which provides the user a recommended value of parameters for millions of transitions for different molecules. HITRAN includes both experimental and theoretical data which are gathered from a worldwide network of contributors as well as from articles, books, proceedings, databases, theses, reports, presentations, unpublished data, papers in preparation and private communications. A major effort is then dedicated to evaluating and processing the spectroscopic data. A single transition in HITRAN has many parameters, including a default 160-byte fixed-width format used since HITRAN2004. Wherever possible, the retrieved data are validated against accurate laboratory data. The original version of HITRAN was compiled by the US Air Force Cambridge Research Laboratories (1960s) in order to enable surveillance of military aircraft detected through the terrestrial atmosphere. One of the early applications of HITRAN was a program called Atmospheric Radiation Measurement (ARM) for the US Department of Energy. In this program spectral atmospheric measurements were made around the globe in order to better understand the balance between the radiant energy that reaches Earth from the sun and the energy that flows from Earth back out to space. The US Department of Transportation also utilized HITRAN in its early days for monitoring the gas emissions (NO, SO2, NO2) of supersonic transports flying at high altitude. HITRAN was first made public
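The 160-byte fixed-width record mentioned above can be sliced by column position. The sketch below reads only the leading fields; the assumed layout (a 2-character molecule number, 1-character isotopologue number, 12-character wavenumber in cm⁻¹, 10-character line intensity) follows the published HITRAN2004 format description, and the sample record is fabricated for illustration, not real HITRAN data:

```python
def parse_hitran_record(line):
    """Parse the leading fields of a HITRAN2004-style 160-character record.

    Column widths are assumptions from the published format description
    (I2, I1, F12.6, E10.3); the remaining 135 columns are ignored here.
    """
    return {
        "molecule_id": int(line[0:2]),
        "isotopologue": int(line[2:3]),
        "wavenumber_cm1": float(line[3:15]),
        "intensity": float(line[15:25]),
    }

# Illustrative (made-up) record, padded to the full 160 characters:
record = " 11 1538.249400 1.464E-22" + " " * 135
print(parse_hitran_record(record))
```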
https://en.wikipedia.org/wiki/Lagrange%20multipliers%20on%20Banach%20spaces
In the field of calculus of variations in mathematics, the method of Lagrange multipliers on Banach spaces can be used to solve certain infinite-dimensional constrained optimization problems. The method is a generalization of the classical method of Lagrange multipliers as used to find extrema of a function of finitely many variables. The Lagrange multiplier theorem for Banach spaces Let X and Y be real Banach spaces. Let U be an open subset of X and let f : U → R be a continuously differentiable function. Let g : U → Y be another continuously differentiable function, the constraint: the objective is to find the extremal points (maxima or minima) of f subject to the constraint that g is zero. Suppose that u0 is a constrained extremum of f, i.e. an extremum of f on g^(−1)(0) = {x ∈ U | g(x) = 0}. Suppose also that the Fréchet derivative Dg(u0) : X → Y of g at u0 is a surjective linear map. Then there exists a Lagrange multiplier λ : Y → R in Y∗, the dual space to Y, such that Df(u0) = λ ∘ Dg(u0). (L) Since Df(u0) is an element of the dual space X∗, equation (L) can also be written as Df(u0) = (Dg(u0))∗(λ), where (Dg(u0))∗(λ) is the pullback of λ by Dg(u0), i.e. the action of the adjoint map (Dg(u0))∗ on λ, as defined by (Dg(u0))∗(λ) = λ ∘ Dg(u0). Connection to the finite-dimensional case In the case that X and Y are both finite-dimensional (i.e. linearly isomorphic to Rm and Rn for some natural numbers m and n) then writing out equation (L) in matrix form shows that λ is the usual Lagrange multiplier vector; in the case n = 1, λ is the usual Lagrange multiplier, a real number. Application In many optimization problems, one seeks to minimize a functional defined on an infinite-dimensional space such as a Banach space. Consider, for example, the Sobolev space X = H^1([−1, 1]) and the functional f : X → R given by f(u) = ∫_{−1}^{+1} u′(x)^2 dx. Without any constraint, the minimum value of f would be 0, attained by u0(x) = 0 for all x between −1 and +1. One could also consider the constrained optimization problem, to minimize f among all those u ∈ X such that the mean value of u is +1. In terms of the above theorem, the c
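In the finite-dimensional case the theorem reduces to the familiar stationarity condition ∇f(u0) = λ ∇g(u0). A worked instance, with the numbers verified rather than derived by the code: minimize f(x, y) = x² + y² subject to g(x, y) = x + y − 1 = 0; solving by hand gives the point (1/2, 1/2) with multiplier λ = 1.

```python
# Verify the hand-derived solution of: minimize x^2 + y^2 s.t. x + y = 1.
def grad_f(x, y):
    return (2 * x, 2 * y)

def grad_g(x, y):
    return (1.0, 1.0)

x, y, lam = 0.5, 0.5, 1.0
fx, fy = grad_f(x, y)
gx, gy = grad_g(x, y)

assert abs(x + y - 1) < 1e-12      # the constraint g = 0 holds
assert abs(fx - lam * gx) < 1e-12  # stationarity: grad f = lambda * grad g (x)
assert abs(fy - lam * gy) < 1e-12  # stationarity (y component)
print("constrained minimum at", (x, y), "with multiplier", lam)
```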
https://en.wikipedia.org/wiki/Mixed%20dark%20matter
Mixed dark matter (MDM) is a dark matter (DM) model proposed during the late 1990s. Mixed dark matter is also called hot + cold dark matter. The most abundant form of dark matter is cold dark matter, almost one fourth of the energy contents of the Universe. Neutrinos are the only known particles whose Big-Bang thermal relic should compose at least a fraction of hot dark matter (HDM), albeit other candidates are speculated to exist. In the early 1990s, the power spectrum of fluctuations in the galaxy clustering did not agree entirely with the predictions for a standard cosmology built around pure cold DM. Mixed dark matter with a composition of about 80% cold and 20% hot (neutrinos) was investigated and found to agree better with observations. This large amount of HDM was made obsolete by the discovery in 1998 of the acceleration of universal expansion, which eventually led to the dark energy + dark matter paradigm of this decade. The cosmological effects of cold DM are almost opposite to the hot DM effects. Given that cold DM promotes the growth of large scale structures, it is often believed to be composed of weakly interacting massive particles (WIMPs). Conversely, hot DM suffers from free-streaming for most of the history of the Universe, washing out the formation of small scales. In other words, the mass of hot DM particles is too small to produce the observed gravitationally bound objects in the Universe. For that reason, the hot DM abundance is constrained by cosmology to less than one percent of the Universe contents. The mixed dark matter scenario recovered relevance when DM was proposed to be a thermal relic of a Bose–Einstein condensate made of very light bosonic particles, as light as neutrinos or even lighter like the axion. This cosmological model predicts that cold DM is made of many condensed particles, while a small fraction of these particles resides in excited energetic states contributing to hot DM.
https://en.wikipedia.org/wiki/Coat%20of%20arms%20of%20Cantabria
The coat of arms of Cantabria has a rectangular shield, round in base (also called Spanish shield in heraldry) and the field is party per fess. In field azure, a tower or crenellated and masoned, port and windows azure, to its right a ship in natural colours that with its bow has broken a chain sable going from the tower to the dexter flank of the shield. At the base, sea waves argent and azure, all surmounted in chief by two male heads, severed and haloed. In field gules, a disc-shaped stele with geometric ornaments of the kind of the Cantabrian steles of Barros or Lombera. The crest is a closed royal crown, a circle of jeweled gold, made up of eight rosettes in the shape of acanthus leaves, only five visible, interpolated with pearls, and with half-arches topped with pearls raising from each leaf and converging in an orb azure, with submeridian and equator or, topped with cross or. The crown, covered in gules. The coat of arms was designed by a commission of experts made up of members of the Royal Academy of History. After long debates they decided to have two differentiated parts: one historical and hagiographic, and the other characteristical. The historic part of the first field shows the emblem of the conquest of Seville by Cantabrian marines in 1248, with the tower (representing the Torre del Oro) and the ship breaking the chain boom that blocked the way through the river Guadalquivir. It symbolizes the eight centuries of activity that characterised the maritime Cantabria. The hagiographic references consist in the heads of the martyr saints Emeterius and Celedonius, representing the unity of the territory under their patronage. The second field shows the image of one of the most important legacies left by the primitive people who inhabited the region: the giant steles of the Cantabri. The Stele of Barros (discovered in the town of the same name) was taken as model. The official coat of arms of Cantabria is completed with the inclusion of the Spanish royal crow
https://en.wikipedia.org/wiki/Image%20warping
Image warping is the process of digitally manipulating an image such that any shapes portrayed in the image have been significantly distorted. Warping may be used for correcting image distortion as well as for creative purposes (e.g., morphing). The same techniques are equally applicable to video. While an image can be transformed in various ways, pure warping means that points are mapped to points without changing the colors. This can be based mathematically on any function from (part of) the plane to the plane. If the function is injective the original can be reconstructed. If the function is a bijection any image can be inversely transformed. Some methods are: Images may be distorted through simulation of optical aberrations. Images may be viewed as if they had been projected onto a curved or mirrored surface. (This is often seen in ray traced images.) Images can be partitioned into image polygons and each polygon distorted. Images can be distorted using morphing. The most obvious approach to transforming a digital image is the forward mapping. This applies the transform directly to the source image, typically generating unevenly-spaced points that will then be interpolated to generate the required regularly-spaced pixels. However, for injective transforms reverse mapping is also available. This applies the inverse transform to the target pixels to find the unevenly-spaced locations in the source image that contribute to them. Estimating them from source image pixels will require interpolation of the source image. To work out what kind of warping has taken place between consecutive images, one can use optical flow estimation techniques. See also Anamorphosis Softwarp, a software technique to warp an image so that it can be projected on a curved screen. Keystone effect, one of the defects often repaired by warping
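Reverse mapping can be sketched in a few lines: for each target pixel, apply the inverse transform to locate its source position, then sample the source there. This minimal sketch uses a dict as a toy "image" and crude truncation instead of proper interpolation; the warp (a 2× horizontal stretch, whose inverse halves x) and all names are illustrative:

```python
def warp_reverse(src, width, height, inverse_map):
    """Reverse-mapping warp: src is {(x, y): value}; returns the warped image."""
    out = {}
    for y in range(height):
        for x in range(width):
            sx, sy = inverse_map(x, y)      # inverse transform of the target pixel
            sx, sy = int(sx), int(sy)       # truncate: a crude sample, no interpolation
            if (sx, sy) in src:
                out[(x, y)] = src[(sx, sy)]
    return out

# A 4x1 source "image"; stretching horizontally by 2 means the inverse halves x.
src = {(0, 0): 10, (1, 0): 20, (2, 0): 30, (3, 0): 40}
out = warp_reverse(src, 8, 1, lambda x, y: (x / 2.0, y))
print([out[(x, 0)] for x in range(8)])  # → [10, 10, 20, 20, 30, 30, 40, 40]
```

Note that every target pixel receives a value, which is exactly the advantage of reverse mapping over forward mapping, where target pixels can be left unevenly covered.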
https://en.wikipedia.org/wiki/Nonochton
is the Classical Nahuatl name for a plant whose identity is uncertain. Suggested plants include Portulaca, Pereskiopsis, and Lycianthes mociniana, a plant now called tlanochtle in the local variety of modern Nahuatl spoken by highland farmers that cultivate it for its fruit. Medicinal uses In Aztec medicine, was used as an ingredient in a remedy for pain at the heart: See also Aztec entheogenic complex
https://en.wikipedia.org/wiki/Institute%20of%20Biological%20Engineering
The Institute of Biological Engineering or IBE is a non-profit professional organization which encourages inquiry and interest in the field of biological engineering. Overview IBE promotes the view that biological engineering is a science-based, application-independent discipline that is aligned with the perspective and foundation of biology. IBE espouses the view that biological engineers should possess the scientific knowledge of biology, including its philosophical views, be proficient in the principles and practices of engineering, and be capable of integrating discoveries from multiple disciplines to design sustainable solutions. IBE supports: Scholarship in education, research and service. Professional standards for engineering practices. Professional and technical development of biological engineering. Interactions among academia, industry and government. Public understanding and responsible uses of biological engineering products. Through publications, meetings, distribution of information and services, IBE encourages: Cooperation among engineers, scientists, technologists and allied professionals. Timely availability of new knowledge and technology. Collaboration in education, research and economic activities worldwide. Active promotion and growth of its members. History The IBE was established in 1995 to encourage inquiry and interest in biological engineering in the broadest and most liberal manner and promote the professional development of its members. The organization was proposed on May 20, 1995 by ten individuals who met in Atlanta, Georgia to discuss the creation of a new professional organization and who became the first council of IBE: Susan Blanchard, Susan Capps, Mike Delwiche, Mark Eiteman, Kathrine Flechter, Belinda Roettger, Jonathan Scott, Tim Taylor, John Henry Wells, and Brahm Verma, who served as the first president of IBE. The first annual meeting of IBE was held July 13–15, 1996 in Phoenix, Arizona. Sources External links Offic
https://en.wikipedia.org/wiki/QuickLOAD
QuickLOAD is an internal ballistics predictor computer program for firearms. For computations, apart from other parameters, the cartridge, the projectile (bullet), the gun barrel length, the cartridge overall length, and the propellant type and quantity must be entered to calculate an estimated maximum chamber gas piezo pressure, muzzle velocity, muzzle pressure and other relevant data. QuickLOAD database QuickLOAD has a default database of predefined bullets, cartridges and propellants. The database of the more recent versions of QuickLOAD also includes dimensional technical drawings of the predefined cartridges and, for most cartridges, photographic images. Data can later be imported or entered by the user to expand the program's database. The default database contains more than 2,500 projectiles, over 1,200 cartridges, over 225 powders and dimensional drawings and photos of many cartridges. The default database, however, contains some errors, so measuring the sizes, weights and case capacities of components intended for use and, where appropriate, correcting the supplied default data is wise to avoid surprises and make the predictions more accurate. Some default data is incomplete, since it was not released by the manufacturer or when components that are neither officially registered with nor sanctioned by C.I.P. (Commission Internationale Permanente Pour L'Epreuve Des Armes A Feu Portative) or its American equivalent, SAAMI (Sporting Arms and Ammunition Manufacturers’ Institute) come into play. Such wildcat cartridges have no official dimensions nor other performance related specifications. Cartridge case volume establishment Besides the standard entered information, the actual internal volume or cartridge case capacity of the used cases is an important parameter for QuickLOAD to obtain usable predictions. The internal case volume has to be established by weighing empty once-fired cartridge cases from a production lot, then filling the cases with fresh or distilled water up to
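The weigh-empty-then-weigh-filled procedure described above reduces to simple arithmetic. The following is an illustrative sketch (not part of QuickLOAD itself); the function names, the example weights, and the choice of reporting capacity in grains of water are assumptions for illustration only:

```python
# Case water capacity is conventionally reported in grains of water
# (1 gram = 15.4324 grains).
GRAINS_PER_GRAM = 15.4324

def case_capacity_grains(empty_case_g, water_filled_case_g):
    """Water capacity of one fired case, in grains, from its weight
    empty and its weight filled with water (both in grams)."""
    water_g = water_filled_case_g - empty_case_g
    if water_g < 0:
        raise ValueError("filled weight must exceed empty weight")
    return water_g * GRAINS_PER_GRAM

def average_capacity_grains(measurements):
    """Average capacity over a production lot, given a list of
    (empty_g, filled_g) weight pairs."""
    caps = [case_capacity_grains(e, f) for e, f in measurements]
    return sum(caps) / len(caps)

# Hypothetical lot of two cases (weights are made-up example values):
lot = [(11.20, 14.80), (11.30, 14.86)]
avg = average_capacity_grains(lot)
```

Averaging over several cases from the same lot, as the text recommends, smooths out case-to-case weight variation before the value is entered into the program.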
https://en.wikipedia.org/wiki/Nephelometry%20%28medicine%29
Nephelometry is a technique used in immunology to determine the levels of several blood plasma proteins. For example, the total levels of antibody isotypes or classes: Immunoglobulin M, Immunoglobulin G, and Immunoglobulin A. It is important in the quantification of free light chains in diseases such as multiple myeloma. Quantification is important for disease classification and for disease monitoring once a patient has been treated (increased skewing of the ratio between kappa and lambda light chains after a patient has been treated is an indication of disease recurrence). It is performed by measuring the scattered light at an angle from the sample being measured. In diagnostic nephelometry, the ascending branch of the Heidelberger-Kendall curve is extended by optimizing the course of the reaction so that most plasma proteins’ (from human blood) measurement signals fall at the left side of the Heidelberger-Kendall curve, even at very high concentrations. This technique is widely used in clinical laboratories because it is relatively easy to automate. It is based on the principle that a dilute suspension of small particles will scatter light (usually a laser) passed through it rather than simply absorbing it. The amount of scatter is determined by collecting the light at an angle (usually at 30 and 90 degrees). Antibody and antigen are mixed in concentrations such that only small aggregates are formed that do not quickly settle to the bottom. The amount of light scatter is measured and compared to the amount of scatter from known mixtures. The amount of the unknown is determined from a standard curve. Nephelometry can be used to detect either antigen or antibody, but it is usually run with antibody as the reagent and the patient antigen as the unknown. In the Immunology Medical Lab, two types of tests can be run: "end point nephelometry" and "kinetic (rate) nephelometry". End point nephelometry tests are run by allowing the antibody/antigen reaction to r
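The "determined from a standard curve" step can be sketched as straightforward interpolation. This is a minimal illustration, not a description of any particular analyzer's software; the function name, the linear-interpolation choice, and the example calibration values are all assumptions, and it presumes all calibrators lie on the ascending branch of the curve (signal increasing with concentration):

```python
def interpolate_concentration(signal, standards):
    """Estimate an analyte concentration from a scatter signal by linear
    interpolation on a standard curve.

    standards: list of (concentration, scatter_signal) pairs from known
    mixtures, assumed monotonically increasing in signal (ascending
    branch of the Heidelberger-Kendall curve).
    """
    pts = sorted(standards, key=lambda p: p[1])
    if not (pts[0][1] <= signal <= pts[-1][1]):
        raise ValueError("signal outside calibrated range; dilute and rerun")
    for (c0, s0), (c1, s1) in zip(pts, pts[1:]):
        if s0 <= signal <= s1:
            frac = (signal - s0) / (s1 - s0)
            return c0 + frac * (c1 - c0)

# Hypothetical calibration: (concentration in g/L, scatter in arbitrary units)
standards = [(0.0, 2.0), (1.0, 10.0), (2.0, 18.0), (4.0, 34.0)]
```

The out-of-range check matters in practice: at very high antigen concentration (antigen excess) the signal can fall back down the curve, so a sample outside the calibrated range must be diluted and rerun rather than read off the curve.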
https://en.wikipedia.org/wiki/Lipid%20polymorphism
Polymorphism in biophysics is the ability of lipids to aggregate in a variety of ways, giving rise to structures of different shapes, known as "phases". This can be in the form of spheres of lipid molecules (micelles), pairs of layers that face one another (lamellar phase, observed in biological systems as a lipid bilayer), a tubular arrangement (hexagonal), or various cubic phases (Fd3m, Im3m, Ia3m, Pn3m, and Pm3m being those discovered so far). More complicated aggregations have also been observed, such as rhombohedral, tetragonal and orthorhombic phases. It forms an important part of current academic research in the fields of membrane biophysics (polymorphism), biochemistry (biological impact) and organic chemistry (synthesis). Determination of the topology of a lipid system is possible by a number of methods, the most reliable of which is x-ray diffraction. This uses a beam of x-rays that are scattered by the sample, giving a diffraction pattern as a set of rings. The ratio of the distances of these rings from the central point indicates which phase(s) are present. The structural phase of the aggregation is influenced by the ratio of lipids present, temperature, hydration, pressure and ionic strength (and type). Hexagonal phases In lipid polymorphism, if the packing ratio of lipids is greater or less than one, lipid membranes can form two separate hexagonal phases, or nonlamellar phases, in which long, tubular aggregates form according to the environment in which the lipid is introduced. Hexagonal I phase (HI) This phase is favored in detergent-in-water solutions and has a packing ratio of less than one. The micellar population in a detergent/water mixture cannot increase without limit as the detergent to water ratio increases. In the presence of low amounts of water, lipids that would normally form micelles will form larger aggregates in the form of micellar tubules in order to satisfy the requirements of the hydrophobic effect. These aggregates can be t
https://en.wikipedia.org/wiki/Standard%20probability%20space
In probability theory, a standard probability space, also called Lebesgue–Rokhlin probability space or just Lebesgue space (the latter term is ambiguous) is a probability space satisfying certain assumptions introduced by Vladimir Rokhlin in 1940. Informally, it is a probability space consisting of an interval and/or a finite or countable number of atoms. The theory of standard probability spaces was started by von Neumann in 1932 and shaped by Vladimir Rokhlin in 1940. Rokhlin showed that the unit interval endowed with the Lebesgue measure has important advantages over general probability spaces, yet can be effectively substituted for many of these in probability theory. The dimension of the unit interval is not an obstacle, as was clear already to Norbert Wiener. He constructed the Wiener process (also called Brownian motion) in the form of a measurable map from the unit interval to the space of continuous functions. Short history The theory of standard probability spaces was started by von Neumann in 1932 and shaped by Vladimir Rokhlin in 1940. Nowadays standard probability spaces may be (and often are) treated in the framework of descriptive set theory, via standard Borel spaces. This approach is based on the isomorphism theorem for standard Borel spaces. An alternate approach of Rokhlin, based on measure theory, neglects null sets, in contrast to descriptive set theory. Standard probability spaces are used routinely in ergodic theory. Definition One of several well-known equivalent definitions of the standardness is given below, after some preparations. All probability spaces are assumed to be complete. Isomorphism An isomorphism between two probability spaces (Ω1, F1, P1) and (Ω2, F2, P2) is an invertible map f : Ω1 → Ω2 such that f and its inverse f−1 both are (measurable and) measure preserving maps. Two probability spaces are isomorphic if there exists an isomorphism between them. Isomorphism modulo zero Two probability spaces (Ω1, F1, P1), (Ω2, F2, P2) are iso
https://en.wikipedia.org/wiki/Neolentinus%20ponderosus
Neolentinus ponderosus, commonly known as the giant sawgill, or ponderous lentinus, is a species of fungus in the family Gloeophyllaceae. Found in western North America, it was originally described in 1965 as a species of Lentinus by American mycologist Orson K. Miller. Taxonomy The fungus was first described as Lentinus ponderosus by Orson K. Miller, based on collections that he had made in Idaho. In 1985 it was transferred to Neolentinus, a segregate genus created for Lentinus-type fungi that cause a brown rot in wood. The specific epithet ponderosus derives from the Latin word for "heavy". Description The fruit bodies have convex to flattened caps ranging from in diameter. The caps have small cinnamon-brown scales (squamules) on the surface and a margin that is usually curved inward initially. The narrow gills have an adnate attachment to the stipe and are closely spaced, with intervening lamellulae (short gills) that extend about two-thirds of the distance to the stipe. The gill edges are serrated (notched like a saw), a feature that inspired the mushroom's common name. Gills are initially whitish before aging to light buff to light orange. The stipe measures long by thick. Its reddish-brown surface is made of small scales that are less dense in the upper half, where it has a more whitish or buff color. Fruit bodies produce a dull white to buff spore print. Microscopically, the spores are somewhat spindle-shaped when viewed from the side, and elliptical viewed from the front; they measure 8–10.5 by 3.5–4.4 μm and are inamyloid. The basidia (spore-bearing cells) are thin-walled and club-shaped, four-spored, and measure 26–36 by 5–8.8 μm. The cystidia on both the faces and edges of the gills are thin-walled, hyaline (translucent), narrowly club-shaped, and measure 26–36 by 5–8.8 μm. The cap cuticle comprises threadlike hyphae with a diameter of 4.4–8 μm, while the cap flesh is made of interwoven hyphae (both thick- and thin-walled) measuring 2.5–6 μm. Clamp
https://en.wikipedia.org/wiki/Agaricus%20bitorquis
Agaricus bitorquis, commonly known as torq, banded agaric, spring agaric, banded agaricus, urban agaricus, or pavement mushroom, is an edible white mushroom of the genus Agaricus, similar to the common button mushroom that is sold commercially. The name supersedes Agaricus rodmani. It has been recorded pushing up paving slabs. Taxonomy The specific epithet bitorquis is Latin "having two collars", and refers to the two rings resulting from detachment of the veil from both the top and bottom of the stipe. The species was first defined by the French mycologist Lucien Quélet in 1884, in the form of Psalliota bitorquis (using another genus name in place of the modern Agaricus). Description The cap is dry, smooth, and white (but stains yellowish in age), and measures 4 to 15 cm in diameter, convex to flat, often with dirt on the cap. The gills are free, very narrow and close. They are a light pink color when young, becoming dark reddish-brown as the spores mature. The spore print is chocolate brown. The stipe is 3–11 cm long, 1–4 cm thick, cylindrical to clavate (club-shaped), stout, white, smooth, with a membranous veil and thick white mycelial sheathing near the base. Distinctively it has both a thick upper ring which is shaped like a funnel and a thinner skirt-like lower ring, giving rise to the species name bitorquis. The flesh is solid and firm, with a mild odor. It is often confused with the briny-smelling Agaricus bernardii. It also resembles Agaricus campestris somewhat, but that species only has a single fragile ring. Microscopic details Basidiospores are elliptical in shape, smooth, and with dimensions of 5–7 x 4–5.5 µm. Basidia are 20–25 x 6.5–8.5 µm, usually four-spored, but often with two-spored basidia present. Cystidia are present and numerous. Distribution and habitat Agaricus bitorquis may be found growing solitary or in small groups in gardens (noted as growing in a gregarious manner), and at roadsides, usually on the pavement, often where s
https://en.wikipedia.org/wiki/Paradoxes%20of%20set%20theory
This article contains a discussion of paradoxes of set theory. As with most mathematical paradoxes, they generally reveal surprising and counter-intuitive mathematical results, rather than actual logical contradictions within modern axiomatic set theory. Basics Cardinal numbers Set theory as conceived by Georg Cantor assumes the existence of infinite sets. As this assumption cannot be proved from first principles it has been introduced into axiomatic set theory by the axiom of infinity, which asserts the existence of the set N of natural numbers. Every infinite set which can be enumerated by natural numbers is the same size (cardinality) as N, and is said to be countable. Examples of countably infinite sets are the natural numbers, the even numbers, the prime numbers, and also all the rational numbers, i.e., the fractions. These sets have in common the cardinal number |N| = ℵ0 (aleph-nought), a number greater than every natural number. Cardinal numbers can be defined as follows. Define two sets to have the same size by: there exists a bijection between the two sets (a one-to-one correspondence between the elements). Then a cardinal number is, by definition, a class consisting of all sets of the same size. To have the same size is an equivalence relation, and the cardinal numbers are the equivalence classes. Ordinal numbers Besides the cardinality, which describes the size of a set, ordered sets also form a subject of set theory. The axiom of choice guarantees that every set can be well-ordered, which means that a total order can be imposed on its elements such that every nonempty subset has a first element with respect to that order. The order of a well-ordered set is described by an ordinal number. For instance, 3 is the ordinal number of the set {0, 1, 2} with the usual order 0 < 1 < 2; and ω is the ordinal number of the set of all natural numbers ordered the usual way. Neglecting the order, we are left with the cardinal number |N| = |ω| = ℵ0. Ordinal numbers c
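The bijections that witness countability can be written down concretely. The sketch below (an illustration added here, with function names chosen freely) exhibits a bijection between N and the even numbers, and the classical Cantor pairing bijection N × N → N that underlies the countability of the rationals:

```python
def nat_to_even(n):
    """Bijection from N onto the even numbers: n maps to 2n."""
    return 2 * n

def even_to_nat(m):
    """Inverse of nat_to_even."""
    assert m % 2 == 0
    return m // 2

def cantor_pair(i, j):
    """Cantor pairing function: a bijection from N x N onto N,
    enumerating pairs diagonal by diagonal (i + j constant)."""
    return (i + j) * (i + j + 1) // 2 + j

# Enumerating a 20 x 20 grid of pairs hits 400 distinct naturals,
# covering every value on the first 20 diagonals without gaps.
seen = {cantor_pair(i, j) for i in range(20) for j in range(20)}
```

Since every positive rational can be written as a pair of naturals (numerator, denominator), composing with the pairing function shows the rationals have the same cardinality ℵ0 as N, as the article states.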
https://en.wikipedia.org/wiki/Hirsch%20conjecture
In mathematical programming and polyhedral combinatorics, the Hirsch conjecture is the statement that the edge-vertex graph of an n-facet polytope in d-dimensional Euclidean space has diameter no more than n − d. That is, any two vertices of the polytope must be connected to each other by a path of length at most n − d. The conjecture was first put forth in a letter by Warren M. Hirsch to George B. Dantzig in 1957 and was motivated by the analysis of the simplex method in linear programming, as the diameter of a polytope provides a lower bound on the number of steps needed by the simplex method. The conjecture is now known to be false in general. The Hirsch conjecture was proven for d < 4 and for various special cases, while the best known upper bounds on the diameter are only sub-exponential in n and d. After more than fifty years, a counter-example was announced in May 2010 by Francisco Santos Leal, from the University of Cantabria. The result was presented at the conference 100 Years in Seattle: the mathematics of Klee and Grünbaum and appeared in Annals of Mathematics. Specifically, the paper presented a 43-dimensional polytope of 86 facets with a diameter of more than 43. The counterexample has no direct consequences for the analysis of the simplex method, as it does not rule out the possibility of a larger but still linear or polynomial number of steps. Various equivalent formulations of the problem had been given, such as the d-step conjecture, which states that the diameter of any 2d-facet polytope in d-dimensional Euclidean space is no more than d; Santos Leal's counterexample also disproves this conjecture. Statement of the conjecture The graph of a convex polytope P is any graph whose vertices are in bijection with the vertices of P in such a way that any two vertices of the graph are joined by an edge if and only if the two corresponding vertices of P are joined by an edge of the polytope. The diameter of P, denoted δ(P), is the diameter of any one of its graphs. These d
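The quantities involved can be checked by hand on a small example. The sketch below (added for illustration; the function names are arbitrary) computes the graph diameter of the d-dimensional cube by breadth-first search. The d-cube has n = 2d facets and graph diameter exactly d = n − d, so it meets the Hirsch bound with equality, which is also the content of the d-step conjecture for this family:

```python
from itertools import product
from collections import deque

def graph_diameter(adj):
    """Diameter of a connected graph given as {vertex: set_of_neighbors},
    computed as the maximum eccentricity over all vertices (BFS from each)."""
    def eccentricity(src):
        dist = {src: 0}
        queue = deque([src])
        while queue:
            v = queue.popleft()
            for w in adj[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    queue.append(w)
        return max(dist.values())
    return max(eccentricity(v) for v in adj)

def cube_graph(d):
    """Edge-vertex graph of the d-cube: vertices are 0/1 vectors,
    edges join vectors differing in exactly one coordinate."""
    verts = list(product((0, 1), repeat=d))
    return {v: {w for w in verts if sum(a != b for a, b in zip(v, w)) == 1}
            for v in verts}
```

For the 3-cube, n = 6 facets and d = 3, and the computed diameter is 3 = n − d.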
https://en.wikipedia.org/wiki/Schwinger%20variational%20principle
Schwinger variational principle is a variational principle which expresses the scattering T-matrix as a functional depending on two unknown wave functions. The functional attains a stationary value equal to the actual scattering T-matrix. The functional is stationary if and only if the two functions satisfy the Lippmann-Schwinger equation. The development of the variational formulation of scattering theory can be traced to the works of L. Hulthén and J. Schwinger in the 1940s. Linear form of the functional The T-matrix expressed in the form of a stationary value of the functional reads ⟨φ′|T|φ⟩ = ⟨φ′|V|ψ⁺⟩ + ⟨ψ⁻|V|φ⟩ − ⟨ψ⁻|V − VG⁺(E)V|ψ⁺⟩, where φ and φ′ are the initial and the final states respectively, V is the interaction potential and G⁺(E) is the retarded Green's operator for collision energy E. The condition for the stationary value of the functional is that the functions ψ⁺ and ψ⁻ satisfy the Lippmann-Schwinger equation ψ⁺ = φ + G⁺(E)Vψ⁺ and ψ⁻ = φ′ + G⁻(E)Vψ⁻. Fractional form of the functional A different form of the stationary principle for the T-matrix reads ⟨φ′|T|φ⟩ = ⟨ψ⁻|V|φ⟩⟨φ′|V|ψ⁺⟩ / ⟨ψ⁻|V − VG⁺(E)V|ψ⁺⟩. The wave functions ψ⁺ and ψ⁻ must satisfy the same Lippmann-Schwinger equations to get the stationary value. Application of the principle The principle may be used for the calculation of the scattering amplitude in a similar way to the variational principle for bound states, i.e. the form of the wave functions is guessed, with some free parameters that are determined from the condition of stationarity of the functional. See also Lippmann-Schwinger equation Quantum scattering theory T-matrix Green's operator
https://en.wikipedia.org/wiki/Spherical%20mean
In mathematics, the spherical mean of a function around a point is the average of all values of that function on a sphere of given radius centered at that point. Definition Consider an open set U in the Euclidean space Rn and a continuous function u defined on U with real or complex values. Let x be a point in U and r > 0 be such that the closed ball B(x, r) of center x and radius r is contained in U. The spherical mean over the sphere of radius r centered at x is defined as (1/ωn−1(r)) ∫∂B(x, r) u(y) dS(y), where ∂B(x, r) is the (n − 1)-sphere forming the boundary of B(x, r), dS denotes integration with respect to spherical measure and ωn−1(r) is the "surface area" of this (n − 1)-sphere. Equivalently, the spherical mean is given by (1/ωn−1) ∫‖y‖=1 u(x + ry) dS(y), where ωn−1 is the area of the (n − 1)-sphere of radius 1. The spherical mean is often denoted as ū(x, r). The spherical mean is also defined for Riemannian manifolds in a natural manner. Properties and uses From the continuity of u it follows that the function r ↦ ū(x, r) is continuous, and that its limit as r → 0 is u(x). Spherical means can be used to solve the Cauchy problem for the wave equation in odd space dimension. The result, known as Kirchhoff's formula, is derived by using spherical means to reduce the wave equation in Rn (for odd n) to the wave equation in R, and then using d'Alembert's formula. The expression itself is presented in the wave equation article. If G is an open set in Rn and u is a C2 function defined on G, then u is harmonic if and only if for all x in G and all r > 0 such that the closed ball B(x, r) is contained in G one has u(x) = ū(x, r). This result can be used to prove the maximum principle for harmonic functions.
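The mean-value characterization of harmonic functions can be checked numerically. The sketch below (an added illustration; names and sample counts are arbitrary choices) approximates the spherical mean in the plane by averaging over equally spaced points on a circle, and compares it with the value at the center:

```python
import math

def spherical_mean_2d(u, x0, y0, r, n=360):
    """Average of u over the circle of radius r centered at (x0, y0),
    approximated with n equally spaced sample angles."""
    total = 0.0
    for k in range(n):
        t = 2 * math.pi * k / n
        total += u(x0 + r * math.cos(t), y0 + r * math.sin(t))
    return total / n

# u(x, y) = x^2 - y^2 is harmonic, so its spherical mean over any circle
# should equal its value at the center (the mean-value property).
u = lambda x, y: x * x - y * y

# v(x, y) = x^2 + y^2 is not harmonic: its mean over a circle of radius r
# centered at the origin is r^2, not v(0, 0) = 0.
v = lambda x, y: x * x + y * y
```

For a trigonometric polynomial integrand such as these, the equally spaced quadrature is exact up to floating-point error, so the harmonic case matches the center value essentially to machine precision.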
https://en.wikipedia.org/wiki/Phase%20space%20method
In applied mathematics, the phase space method is a technique for constructing and analyzing solutions of dynamical systems, that is, solving time-dependent differential equations. The method consists of first rewriting the equations as a system of differential equations that are first-order in time, by introducing additional variables. The original and the new variables form a vector in the phase space. The solution then becomes a curve in the phase space, parametrized by time. The curve is usually called a trajectory or an orbit. The (vector) differential equation is reformulated as a geometrical description of the curve, that is, as a differential equation in terms of the phase space variables only, without the original time parametrization. Finally, a solution in the phase space is transformed back into the original setting. The phase space method is used widely in physics. It can be applied, for example, to find traveling wave solutions of reaction–diffusion systems. See also Reaction–diffusion system Fisher's equation
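The first step of the method, rewriting a higher-order equation as a first-order system by introducing additional variables, can be sketched concretely. The code below (an added illustration; the helper names and step sizes are arbitrary) converts x″ = f(x, x′) into the system (x, v)′ = (v, f(x, v)) on the phase space and follows a trajectory with a standard Runge-Kutta step:

```python
def to_first_order(f):
    """Rewrite x'' = f(x, x') as the first-order system
    (x, v)' = (v, f(x, v)), where v = x' is the added phase variable."""
    def system(state):
        x, v = state
        return (v, f(x, v))
    return system

def rk4_step(sys, state, h):
    """One classical fourth-order Runge-Kutta step of size h."""
    def shift(s, k, c):
        return tuple(si + c * ki for si, ki in zip(s, k))
    k1 = sys(state)
    k2 = sys(shift(state, k1, h / 2))
    k3 = sys(shift(state, k2, h / 2))
    k4 = sys(shift(state, k3, h))
    return tuple(s + h / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

# Harmonic oscillator x'' = -x started at (x, v) = (1, 0): its orbit in
# the phase space is the unit circle x^2 + v^2 = 1, parametrized by time.
system = to_first_order(lambda x, v: -x)
state = (1.0, 0.0)
for _ in range(1000):          # integrate to t = 10 with h = 0.01
    state = rk4_step(system, state, 0.01)
```

The geometric picture described in the text is visible here: the time-parametrized solution traces a closed curve (the unit circle) in the (x, v) phase plane.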
https://en.wikipedia.org/wiki/Image%20fusion
The image fusion process is defined as gathering all the important information from multiple images, and their inclusion into fewer images, usually a single one. This single image is more informative and accurate than any single source image, and it consists of all the necessary information. The purpose of image fusion is not only to reduce the amount of data but also to construct images that are more appropriate and understandable for the human and machine perception. In computer vision, multisensor image fusion is the process of combining relevant information from two or more images into a single image. The resulting image will be more informative than any of the input images. In remote sensing applications, the increasing availability of space borne sensors gives a motivation for different image fusion algorithms. Several situations in image processing require high spatial and high spectral resolution in a single image. Most of the available equipment is not capable of providing such data convincingly. Image fusion techniques allow the integration of different information sources. The fused image can have complementary spatial and spectral resolution characteristics. However, the standard image fusion techniques can distort the spectral information of the multispectral data while merging. In satellite imaging, two types of images are available. The panchromatic image acquired by satellites is transmitted with the maximum resolution available and the multispectral data are transmitted with coarser resolution. This will usually be two or four times lower. At the receiver station, the panchromatic image is merged with the multispectral data to convey more information. Many methods exist to perform image fusion. The very basic one is the high-pass filtering technique. Later techniques are based on Discrete Wavelet Transform, uniform rational filter bank, and Laplacian pyramid. Multi-focus image fusion Multi-focus image fusion is used to collect useful and necessa
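The high-pass filtering technique mentioned above can be sketched in a few lines: extract the high-frequency detail of the panchromatic image (the image minus a low-pass version of itself) and add it to an already upsampled multispectral band. This is a minimal pure-Python illustration with made-up example data; the box-blur low-pass, the function names, and the equal-size assumption on the inputs are simplifying choices:

```python
def mean3x3(img):
    """3x3 box blur with edge clamping; img is a list of rows of floats."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            vals = [img[min(max(i + di, 0), h - 1)][min(max(j + dj, 0), w - 1)]
                    for di in (-1, 0, 1) for dj in (-1, 0, 1)]
            out[i][j] = sum(vals) / 9.0
    return out

def hpf_fuse(ms_band, pan):
    """High-pass-filter fusion: inject the pan image's high-frequency
    detail (pan minus its low-pass) into a same-size multispectral band."""
    low = mean3x3(pan)
    return [[m + (p - l) for m, p, l in zip(mr, pr, lr)]
            for mr, pr, lr in zip(ms_band, pan, low)]

# Toy example: a featureless pan image contributes no detail, so the
# fused band equals the multispectral band unchanged.
pan = [[5.0] * 4 for _ in range(4)]
ms = [[float(i + j) for j in range(4)] for i in range(4)]
fused = hpf_fuse(ms, pan)
```

The fused band keeps the spectral values of the multispectral input while gaining the spatial detail of the pan image, which is exactly the complementary-resolution behavior the text describes; the spectral distortion mentioned above arises when the injected detail correlates poorly with the band.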
https://en.wikipedia.org/wiki/Semifield
In mathematics, a semifield is an algebraic structure with two binary operations, addition and multiplication, which is similar to a field, but with some axioms relaxed. Overview The term semifield has two conflicting meanings, both of which include fields as a special case. In projective geometry and finite geometry (MSC 51A, 51E, 12K10), a semifield is a nonassociative division ring with multiplicative identity element. More precisely, it is a nonassociative ring whose nonzero elements form a loop under multiplication. In other words, a semifield is a set S with two operations + (addition) and · (multiplication), such that (S,+) is an abelian group, multiplication is distributive on both the left and right, there exists a multiplicative identity element, and division is always possible: for every a and every nonzero b in S, there exist unique x and y in S for which b·x = a and y·b = a. Note in particular that the multiplication is not assumed to be commutative or associative. A semifield that is associative is a division ring, and one that is both associative and commutative is a field. A semifield by this definition is a special case of a quasifield. If S is finite, the last axiom in the definition above can be replaced with the assumption that there are no zero divisors, so that a·b = 0 implies that a = 0 or b = 0. Note that due to the lack of associativity, the last axiom is not equivalent to the assumption that every nonzero element has a multiplicative inverse, as is usually found in definitions of fields and division rings. In ring theory, combinatorics, functional analysis, and theoretical computer science (MSC 16Y60), a semifield is a semiring (S,+,·) in which all nonzero elements have a multiplicative inverse. These objects are also called proper semifields. A variation of this definition arises if S contains an absorbing zero that is different from the multiplicative unit e, it is required that the non-zero elements be invertible, and a·0
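A standard example of a semifield in the second (semiring-theoretic) sense is the max-plus, or "tropical", semifield. The sketch below is an added illustration (the names are arbitrary): "addition" is max, "multiplication" is ordinary +, the absorbing zero is −∞, the multiplicative unit is 0, and every element other than the zero has a multiplicative inverse, namely its negation:

```python
# Max-plus ("tropical") semifield on R together with -infinity:
#   a (+) b = max(a, b)     -- semiring addition (no additive inverses)
#   a (*) b = a + b         -- semiring multiplication
#   zero    = -infinity     -- absorbing: a (*) zero = zero
#   one     = 0             -- multiplicative unit: a (*) 0 = a
NEG_INF = float("-inf")

def t_add(a, b):
    return max(a, b)

def t_mul(a, b):
    return a + b

def t_inv(a):
    """Multiplicative inverse: a (*) (-a) = 0, the unit."""
    if a == NEG_INF:
        raise ZeroDivisionError("the absorbing zero has no inverse")
    return -a
```

Note that additive inverses do not exist (max cannot be undone), which is exactly the field axiom this structure relaxes; only multiplicative invertibility of nonzero elements is retained.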
https://en.wikipedia.org/wiki/Manuscript%20paper
Manuscript paper (sometimes staff paper in U.S. English, or just music paper) is paper preprinted with staves ready for musical notation. A staff is made up of lines and spaces, and these lines and spaces are named according to the clef in use (bass or treble). Manuscript paper is also available for drum notation and guitar tablature. See also Rastrum Sheet music Works cited “Staff Paper”. All About Music Theory. 10 Oct. 2014. Accessed 3 Apr. 2020. Sainsbury, Christopher. “Bi-tone Techniques and Notation in Contemporary Guitar Music Composition”. Master’s thesis, NSW Conservatorium of Music, 2001–2002. External links The "Manuscript" Page — downloadable manuscript paper Music Paper — blank music papers in PostScript and PDF formats in Letter paper size
https://en.wikipedia.org/wiki/Black-bag%20cryptanalysis
In cryptography, black-bag cryptanalysis is a euphemism for the acquisition of cryptographic secrets via burglary, or other covert means – rather than mathematical or technical cryptanalytic attack. The term refers to the black bag of equipment that a burglar would carry or a black bag operation. As with rubber-hose cryptanalysis, this is technically not a form of cryptanalysis; the term is used sardonically. However, given the free availability of very high strength cryptographic systems, this type of attack is a much more serious threat to most users than mathematical attacks because it is often much easier to attempt to circumvent cryptographic systems (e.g. steal the password) than to attack them directly. Regardless of the technique used, such methods are intended to capture highly sensitive information e.g. cryptographic keys, key-rings, passwords or unencrypted plaintext. The required information is usually copied without removing or destroying it, so capture often takes place without the victim realizing it has occurred. Methods In addition to burglary, the covert means might include the installation of keystroke logging or trojan horse software or hardware installed on (or near to) target computers or ancillary devices. It is even possible to monitor the electromagnetic emissions of computer displays or keyboards from a distance of 20 metres (or more), and thereby decode what has been typed. This could be done by surveillance technicians, or via some form of bug concealed somewhere in the room. Although sophisticated technology is often used, black bag cryptanalysis can also be as simple as the process of copying a password which someone has unwisely written down on a piece of paper and left inside their desk drawer. The case of United States v. Scarfo highlighted one instance in which FBI agents using a sneak and peek warrant placed a keystroke logger on an alleged criminal gang leader. See also
https://en.wikipedia.org/wiki/Max%20Planck%20Institute%20for%20Marine%20Microbiology
The Max Planck Institute for Marine Microbiology is located in Bremen, Germany. It was founded in 1992, almost a year after the foundation of its sister institute, the Max Planck Institute for Terrestrial Microbiology at Marburg. In 1996, the institute moved into new buildings at the campus of the University of Bremen. It is one of 80 institutes in the Max Planck Society (Max Planck Gesellschaft). Currently, the institute consists of three departments with several associated research groups: Biogeochemistry (headed by Dr. Marcel Kuypers) Molecular Ecology (headed by Prof. Dr. Rudolf Amann) Symbiosis (headed by Prof. Dr. Nicole Dubilier) Additionally, the following research groups reside in the institute: Microbial Physiology (headed by Dr. Boran Kartal) Greenhouse Gases (headed by Dr. Jana Milucka) Microbial Genomics and Bioinformatics (headed by Prof. Dr. Frank Oliver Glöckner) Flow Cytometry (headed by Dr. Bernhard Fuchs) Metabolic Interactions (headed by Dr. Manuel Liebeke) Microsensors (headed by Dr. Dirk de Beer) HGF MPG Joint Research Group for Deep-Sea Ecology and Technology (headed by Prof. Dr. Antje Boetius) MARUM MPG Bridge Group Marine Glycobiology (headed by Dr. Jan-Hendrik Hehemann) Max Planck Research Group Microbial Metabolism (headed by Dr. Tristan Wagner) Marine Geochemistry Group (headed by Prof. Dr. Thorsten Dittmar) Max Planck Research Group for Marine Isotope Geochemistry (headed by Dr. Katharine Pahnke-May) Degree programme The MPI for Marine Microbiology offers the PhD programme "International Max-Planck Research School (IMPRS) of Marine Microbiology" (Marmic) together with the Alfred Wegener Institute for Polar and Marine Research, the University of Bremen and Jacobs University Bremen.
https://en.wikipedia.org/wiki/Donald%20J.%20Newman
Donald Joseph (D. J.) Newman (July 27, 1930 – March 28, 2007) was an American mathematician. He gave simple proofs of the prime number theorem and the Hardy-Ramanujan partition formula. He excelled on multiple occasions at the annual Putnam competition while studying at City College of New York and New York University, and later received his PhD from Harvard University in 1953. Life and works Newman was born in Brooklyn, New York in 1930, and studied at New York's Stuyvesant High School. When he was 14 he worked with Dubble Bubble Gum to help solve the statistical question of how often a gum purchaser would receive the same joke for their gum wrapper. He was an avid problem-solver, and as an undergraduate was a Putnam Fellow all three years he took part in the Putnam math competition; he was only the third person to attain that feat. His mathematical specialties included complex analysis, approximation theory and number theory. In 1980 he found a short proof of the prime number theorem, which can now be found in his textbook on complex analysis. He also gave a simplified proof of the Hardy-Ramanujan partition formula. Newman was a friend and associate of John Nash. His career included posts as a Professor of Mathematics at MIT, Brown University, Yeshiva University, Temple University and a distinguished chair at Bar Ilan University in Israel. He held government and industry positions at Avco, Republic Aviation, Bell Laboratories, IBM and the NSA. Newman's love of problem solving comes through in his writing; his published output as a mathematician includes 150 papers and five books. He taught numerous students over the years, including Robert Feinerman, Jonah Mann, Eli Passow, Louis Raymon, Joseph Bak, Shmuel Weinberger, and Gerald Weinstein at Yeshiva University, and Bo Gao, Don Kellman, Jonathan Knappenberger, and Yuan Xu at Temple University. See also A Beautiful Mind (1998) by Sylvia Nasar Selected publications --. (1979) Approximation with rational fun
https://en.wikipedia.org/wiki/Electrostatic%20deflection%20%28molecular%20physics/nanotechnology%29
In molecular physics/nanotechnology, electrostatic deflection is the deformation of a beam-like structure/element bent by an electric field. It can be due to interaction between electrostatic fields and net charge or electric polarization effects. The beam-like structure/element is generally cantilevered (fixed at one of its ends). In nanomaterials, carbon nanotubes (CNTs) are typical examples for electrostatic deflection. Mechanisms of electric deflection due to electric polarization can be understood as follows: When a material is brought into an electric field (E), the field tends to shift the positive charge (in red) and the negative charge (in blue) in opposite directions. Thus, induced dipoles are created. Fig. 3 shows a beam-like structure/element in an electric field. The interaction between the molecular dipole moment and the electric field results in an induced torque (T). This torque then tends to align the beam toward the direction of the field. In the case of a cantilevered CNT, it would be bent toward the field direction. Meanwhile, the electrically induced torque and the stiffness of the CNT compete against each other. This deformation has been observed in experiments. This property is an important characteristic that makes CNTs promising for nanoelectromechanical systems applications, as well as for their fabrication, separation and electromanipulation. Recently, several nanoelectromechanical systems based on cantilevered CNTs have been reported, such as nanorelays, nanoswitches, nanotweezers and feedback devices, which are designed for memory, sensing or actuation uses. Furthermore, theoretical studies have been carried out to try to get a full understanding of the electric deflection of carbon nanotubes.
https://en.wikipedia.org/wiki/Max%20Planck%20Institute%20of%20Microstructure%20Physics
The Max Planck Institute of Microstructure Physics in Halle (Saale) is a research institute in Germany in the field of materials research. It was founded in 1992 by Hellmut Fischmeister as a follow-up to the German Academy of Sciences Institute of Solid State Physics and Electron Microscopy. The institute moved into new buildings between 1997 and 1999. It is one of 84 institutes in the Max Planck Society (Max-Planck-Gesellschaft). The institute has three main departments: Stuart Parkin Joyce Poon Xinliang Feng Former departments include the following: The Theory Department, headed by Prof. Eberhard Gross, mainly carried out theoretical research on the electronic, magnetic, optical, and electrical properties of micro- and nanostructured solid-state systems. The Experimental Department 1, headed by Prof. Jürgen Kirschner, mainly dealt with the magnetic properties of dimensionally reduced systems and their dependence on electronic structure, crystalline structure and morphology. The Experimental Department 2, headed by Prof. Ulrich Gösele, focused on the scientific understanding, design and fabrication of new materials for information, communication, engineering and biotechnological applications. The Experimental Department 3, headed by Prof. Johannes Heydenreich, focused on analytical methods using high-resolution electron microscopy. PhD program The Max Planck Institute of Microstructure Physics, the Martin Luther University of Halle-Wittenberg, and the Fraunhofer Institute for Mechanics of Materials offer a PhD program under the "International Max Planck Research School (IMPRS) of Nanostructures".
https://en.wikipedia.org/wiki/Polar%20Research
Polar Research is a biannual peer-reviewed scientific journal covering natural and social scientific research on the polar regions. It is published by the Norwegian Polar Institute. It covers a wide range of fields from biology to oceanography, including socio-economic and management topics. According to the Journal Citation Reports, the journal has a 2014 impact factor of 1.141.
https://en.wikipedia.org/wiki/Classification%20of%20Fatou%20components
In mathematics, Fatou components are the connected components of the Fatou set. They were named after Pierre Fatou. Rational case If f is a rational function defined in the extended complex plane, and if it is a nonlinear function (degree > 1), then for a periodic component of the Fatou set, exactly one of the following holds: the component contains an attracting periodic point; the component is parabolic; the component is a Siegel disc: a simply connected Fatou component on which f(z) is analytically conjugate to a Euclidean rotation of the unit disc onto itself by an irrational rotation angle; the component is a Herman ring: a doubly connected Fatou component (an annulus) on which f(z) is analytically conjugate to a Euclidean rotation of a round annulus, again by an irrational rotation angle. Attracting periodic point The components of the map contain the attracting points that are the solutions to . This is because the map is the one used for finding solutions to the equation by the Newton–Raphson formula. The solutions must naturally be attracting fixed points. Herman ring The map and t = 0.6151732... will produce a Herman ring. Shishikura showed that the degree of such a map must be at least 3, as in this example. More than one type of component If the degree d is greater than 2, then there is more than one critical point, and there can be more than one type of component. Transcendental case Baker domain In the case of transcendental functions there is another type of periodic Fatou component, called a Baker domain: these are "domains on which the iterates tend to an essential singularity (not possible for polynomials and rational functions)"; one example of such a function is: Wandering domain Transcendental maps may have wandering domains: these are Fatou components that are not eventually periodic. See also No-wandering-domain theorem Montel's theorem John domains Basins of attraction
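As an illustration of the attracting-periodic-point case, the Newton–Raphson map for f(z) = z³ − 1 is a standard example: its attracting fixed points are the three cube roots of unity, each sitting in its own Fatou component. A minimal Python sketch (the starting point below is an arbitrary illustrative choice, not taken from the article):

```python
def newton_step(z: complex) -> complex:
    # Newton-Raphson map N(z) = z - f(z)/f'(z) for f(z) = z**3 - 1.
    # Its attracting fixed points are the three cube roots of unity.
    return z - (z**3 - 1) / (3 * z**2)

def iterate(z: complex, n: int = 50) -> complex:
    # Iterate the map; a point in the Fatou set converges to an
    # attracting fixed point.
    for _ in range(n):
        z = newton_step(z)
    return z

# A point started at 0.5 + 0.2j lies in the basin of the root 1.
root = iterate(0.5 + 0.2j)
print(abs(root - 1))  # effectively zero
```

Each basin of attraction is a union of Fatou components, and their common boundary is the (fractal) Julia set of the map.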
https://en.wikipedia.org/wiki/Pair%20potential
In physics, a pair potential is a function that describes the potential energy of two interacting objects solely as a function of the distance between them. Some interactions, like Coulomb's law in electrodynamics or Newton's law of universal gravitation in mechanics, naturally have this form for simple spherical objects. For other types of more complex interactions or objects it is useful and common to approximate the interaction by a pair potential, for example interatomic potentials in physics and computational chemistry that use approximations like the Lennard-Jones and Morse potentials. Functional form The total energy of a system of objects in positions , that interact through a pair potential , is given by This expression uses the fact that the interaction is symmetric between particles and . It also avoids self-interaction by not including the case . Potential range A fundamental property of a pair potential is its range. Pair potentials are expected to go to zero at infinite distance, as particles that are too far apart do not interact. In some cases the potential goes quickly to zero, and the interaction for particles beyond a certain distance can be assumed to be zero; these are said to be short-range potentials. Other potentials, like the Coulomb or gravitational potential, are long-range: they go slowly to zero, and particles at long distances still contribute to the total energy. Computational cost The total energy expression for pair potentials is quite simple to use for analytical and computational work. It has some limitations, however, as the computational cost is proportional to the square of the number of particles. This might be prohibitively expensive when the interaction between large groups of objects needs to be calculated. For short-range potentials the sum can be restricted to include only particles that are close, reducing the cost to linear in the number of particles. Infinitely pe
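The pairwise sum over distinct pairs (symmetric interaction, no self-interaction) and its quadratic cost can be sketched in Python. The Lennard-Jones potential with unit parameters is an illustrative choice here, not prescribed by the text:

```python
import math

def lj(r: float, epsilon: float = 1.0, sigma: float = 1.0) -> float:
    """Lennard-Jones pair potential: 4*eps*((sigma/r)**12 - (sigma/r)**6)."""
    sr6 = (sigma / r) ** 6
    return 4 * epsilon * (sr6 * sr6 - sr6)

def total_energy(positions, pair, cutoff=None) -> float:
    """Sum the pair potential over all distinct pairs i < j.
    Iterating i < j uses the symmetry of the interaction and skips
    the self-interaction case i == j; the double loop makes the
    O(N**2) cost explicit. The optional cutoff skips distant pairs
    (a real short-range code would use cell lists to reach O(N))."""
    E = 0.0
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            r = math.dist(positions[i], positions[j])
            if cutoff is None or r <= cutoff:
                E += pair(r)
    return E

# Two particles at the LJ minimum separation r = 2**(1/6)*sigma
# contribute exactly -epsilon to the total energy.
pos = [(0.0, 0.0, 0.0), (2 ** (1 / 6), 0.0, 0.0)]
print(total_energy(pos, lj))  # -1.0 up to rounding
```

The cutoff argument only avoids evaluating the potential; the loop itself is still quadratic, which is exactly the limitation the text describes for large systems.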
https://en.wikipedia.org/wiki/ASC%20X9
The Accredited Standards Committee X9 (ASC X9, Inc.) is an ANSI (American National Standards Institute) accredited standards developing organization, responsible for developing voluntary open consensus standards for the financial services industry in the U.S. ASC X9 is the USA Technical Advisory Group (TAG) to the International Technical Committee on Financial Services ISO/TC 68 under the International Organization for Standardization (ISO) of Geneva, Switzerland, and submits X9 American National Standards to the international committee to be considered for adoption as international (ISO) standards. Membership in ASC X9 is open to all U.S.-domiciled companies and organizations in the financial services industry. Domestic Role The Accredited Standards Committee X9 develops, establishes, maintains, and promotes standards for the financial services industry in the United States in order to facilitate delivery of financial services and products. Committees ASC X9 is composed of five subcommittees based on financial services business sectors: X9A – Retail Payments (including mobile payments and a card-not-present fraud group); X9B – Checks and Back-office Operations (all things related to checks); X9C – Corporate Banking (B2B payments and the Balance Transaction Reporting Standard (BTRS)); X9D – Securities (stocks and bonds, CUSIP); and X9F – Data & Information Security (methods and cryptography to secure financial data). Standards Examples of X9 standards commonly used today in the US and around the world are: paper and electronic check standards (X9 owns nearly 98% of the standardized real estate on the front and back of checks, and the magnetic ink character record was originally developed by X9); the credit and debit card transaction standard operating all card transactions, first developed under X9; and numerous data security standards for financial services, including the "PIN" (personal identification number) standard in wide use today. Data E
https://en.wikipedia.org/wiki/Puerto%20Rico%20statistical%20areas
Puerto Rico currently has 15 statistical areas that have been delineated by the United States Office of Management and Budget (OMB). On March 6, 2020, the OMB delineated three combined statistical areas, eight metropolitan statistical areas, and four micropolitan statistical areas in Puerto Rico. Statistical areas The Office of Management and Budget (OMB) has designated more than 1,000 statistical areas for the United States and Puerto Rico. These statistical areas are important geographic delineations of population clusters used by the OMB, the United States Census Bureau, planning organizations, and federal, state, and local government entities. The OMB defines a core-based statistical area (commonly referred to as a CBSA) as "a statistical geographic entity consisting of the county or counties (or county-equivalents) associated with at least one core of at least 10,000 population, plus adjacent counties having a high degree of social and economic integration with the core as measured through commuting ties with the counties containing the core." The OMB further divides core-based statistical areas into metropolitan statistical areas (MSAs) that have "a population of at least 50,000" and micropolitan statistical areas (μSAs) that have "a population of at least 10,000, but less than 50,000." The OMB defines a combined statistical area (CSA) as "a geographic entity consisting of two or more adjacent core-based statistical areas with employment interchange measures of at least 15%." The primary statistical areas (PSAs) include all combined statistical areas and any core-based statistical area that is not a constituent of a combined statistical area. Table The table below describes the 15 statistical areas and 78 municipios of the Commonwealth of Puerto Rico with the following information: The combined statistical area (CSA) as designated by the OMB. The CSA population according to 2019 US Census Bureau population estimates. The core based statistical area (CBSA) as designated by
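The population and employment-interchange thresholds quoted above can be expressed as a small Python helper. This is a hypothetical illustration of the quoted rules, not an OMB tool:

```python
def classify_cbsa(core_population: int) -> str:
    """Classify a core-based statistical area by the OMB population
    thresholds quoted in the text."""
    if core_population >= 50_000:
        return "metropolitan statistical area"
    if core_population >= 10_000:
        return "micropolitan statistical area"
    return "not a CBSA core"

def qualifies_for_csa(employment_interchange_pct: float) -> bool:
    """Adjacent CBSAs combine into a CSA when the employment
    interchange measure is at least 15%."""
    return employment_interchange_pct >= 15.0

print(classify_cbsa(60_000))    # metropolitan statistical area
print(classify_cbsa(25_000))    # micropolitan statistical area
print(qualifies_for_csa(16.2))  # True
```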
https://en.wikipedia.org/wiki/Syn-Propanethial-S-oxide
syn-Propanethial S-oxide (or (Z)-propanethial S-oxide), a member of a class of organosulfur compounds known as thiocarbonyl S-oxides (formerly "sulfines"), is a volatile liquid that acts as a lachrymatory agent (triggers tearing and stinging on contact with the eyes). The chemical is released from onions, Allium cepa, as they are sliced. The release is due to the breaking open of the onion cells and their releasing enzymes called alliinases, which then break down amino acid sulfoxides, generating sulfenic acids. A specific sulfenic acid, 1-propenesulfenic acid, formed when onions are cut, is rapidly rearranged by a second enzyme, called the lachrymatory factor synthase or LFS, giving syn-propanethial S-oxide. The gas diffuses through the air and, on contact with the eye, it stimulates sensory neurons creating a stinging, painful sensation. Tears are released from the tear glands to dilute and flush out the irritant. A structurally related lachrymatory compound, syn-butanethial S-oxide, C4H8OS, has been found in another genus Allium plant, Allium siculum. See also Allicin
https://en.wikipedia.org/wiki/6-Phosphogluconic%20acid
6-Phosphogluconic acid (6-phosphogluconate) is an intermediate in the pentose phosphate pathway and the Entner–Doudoroff pathway. It is formed by 6-phosphogluconolactonase, and acted upon by phosphogluconate dehydrogenase to produce ribulose 5-phosphate. It may also be acted upon by 6-phosphogluconate dehydratase to produce 2-keto-3-deoxy-6-phosphogluconate.
https://en.wikipedia.org/wiki/Sedoheptulose%207-phosphate
Sedoheptulose 7-phosphate is an intermediate in the pentose phosphate pathway. It is formed by transketolase and acted upon by transaldolase. Sedoheptulokinase is an enzyme that uses sedoheptulose and ATP to produce ADP and sedoheptulose 7-phosphate. Sedoheptulose-bisphosphatase is an enzyme that uses sedoheptulose 1,7-bisphosphate and H2O to produce sedoheptulose 7-phosphate and phosphate. See also Sedoheptulose 3-Deoxy-D-arabino-heptulosonic acid 7-phosphate, a related compound and an intermediate in the biosynthesis of shikimic acid
https://en.wikipedia.org/wiki/6-Phosphogluconolactone
6-Phosphogluconolactone is an intermediate in the pentose phosphate pathway (PPP). In this pathway, it is produced from glucose-6-phosphate by glucose-6-phosphate dehydrogenase. It is then converted to 6-phosphogluconic acid by 6-phosphogluconolactonase. See also Lactone Organophosphates Delta-lactones
https://en.wikipedia.org/wiki/Computer%20appliance
A computer appliance is a computer system with a combination of hardware, software, or firmware that is specifically designed to provide a particular computing resource. Such devices became known as appliances because of their similarity in role or management to home appliances, which are generally closed and sealed, and are not serviceable by the user or owner. The hardware and software are delivered as an integrated product and may even be pre-configured before delivery to a customer, to provide a turn-key solution for a particular application. Unlike general-purpose computers, appliances are generally not designed to allow the customer to change the software or the underlying operating system, or to flexibly reconfigure the hardware. Another form of appliance is the virtual appliance, which has similar functionality to a dedicated hardware appliance but is distributed as a software virtual machine image for a hypervisor-equipped device. Overview Traditionally, software applications run on top of a general-purpose operating system, which uses the hardware resources of the computer (primarily memory, disk storage, processing power, and networking bandwidth) to meet the computing needs of the user. The main issue with the traditional model is complexity. It is complex to integrate the operating system and applications with a hardware platform, and complex to support the combination afterwards. By tightly constraining the variations of the hardware and software, the appliance becomes easily deployable and can be used without nearly as wide (or deep) IT knowledge. Additionally, when problems and errors appear, the supporting staff rarely need to explore them deeply to understand the matter thoroughly; the staff merely need training on the appliance management software to be able to resolve most problems. In all forms of the computer appliance model, customers benefit from easy operations. The appliance has exactly one combination of hardware and operati
https://en.wikipedia.org/wiki/Web%20Hosting%20Magazine
Web Hosting Magazine was a web hosting industry print magazine published from 2000 to 2002. It spawned a companion tradeshow, Web Hosting Expo. Its founders and editors were Dmitri Eroshenko and Isabel Wang, and it was published by Infotonics Media. The magazine was written and edited with a deliberately "edgy" style, designed to appeal to the primarily young and male constituency of the ISP industry. In 2003, Ping! Zine Web Hosting Magazine was launched, based on the concept of Web Hosting Magazine, and Isabel Wang came on board as a main part of its editorial board. This magazine continued the edgy style. Web Host Industry Review Magazine (WHIR Magazine) was born out of the need for a more serious, hard-hitting news and trade publication in the web hosting industry.
https://en.wikipedia.org/wiki/Mouse%20Genome%20Informatics
Mouse Genome Informatics (MGI) is a free, online database and bioinformatics resource hosted by The Jackson Laboratory, with funding from the National Human Genome Research Institute (NHGRI), the National Cancer Institute (NCI), and the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD). MGI provides access to data on the genetics, genomics and biology of the laboratory mouse to facilitate the study of human health and disease. The database integrates multiple projects, with the two largest contributions coming from the Mouse Genome Database and the Mouse Gene Expression Database (GXD). MGI contains data curated from over 230,000 publications. The MGI resource was first published online in 1994 and is a collection of data, tools, and analyses created and tailored for use in the laboratory mouse, a widely used model organism. It is "the authoritative source of official names for mouse genes, alleles, and strains", which follow the guidelines established by the International Committee on Standardized Genetic Nomenclature for Mice. The history and focus of Jackson Laboratory research and production facilities generate tremendous knowledge and depth which researchers can mine to advance their research. A dedicated community of mouse researchers worldwide enhances and contributes to this knowledge as well. MGI is an indispensable tool for any researcher using the mouse as a model organism, and for researchers interested in genes that share homology with mouse genes. Various mouse research support resources, including animal collections and free colony management software, are also available at the MGI site. Mouse Genome Database The Mouse Genome Database collects and curates comprehensive phenotype and functional annotations for mouse genes and alleles. This is an NHGRI-funded project which contributes to the Mouse Genome Informatics database. Mouse gene expression database The Gene Expression Database is a comm
https://en.wikipedia.org/wiki/Texas%20Neurosciences%20Institute
The Texas Neurosciences Institute (TNI) is the name of a medical office building in San Antonio, Texas. The building is adjacent to the University of Texas Health Science Center medical school. Medical specialties in the building include pediatrics, pediatric hematology/oncology, gastroenterology, neurosurgery, internal medicine, etc. There are also diagnostic labs and radiology imaging centers located there. See also South Texas Medical Center
https://en.wikipedia.org/wiki/Computational%20mathematics
Computational mathematics is an area of mathematics devoted to the interaction between mathematics and computer computation. A large part of computational mathematics consists of using mathematics to enable and improve computer computation in areas of science and engineering where mathematics is useful. This involves in particular algorithm design, computational complexity, numerical methods and computer algebra. Computational mathematics also refers to the use of computers for mathematics itself. This includes mathematical experimentation for establishing conjectures (particularly in number theory), the use of computers for proving theorems (for example the four color theorem), and the design and use of proof assistants. Areas of computational mathematics Computational mathematics emerged as a distinct part of applied mathematics by the early 1950s. Currently, computational mathematics can refer to or include: Computational science, also known as scientific computation or computational engineering Solving mathematical problems by computer simulation as opposed to analytic methods of applied mathematics Numerical methods used in scientific computation, for example numerical linear algebra and numerical solution of partial differential equations Stochastic methods, such as Monte Carlo methods and other representations of uncertainty in scientific computation The mathematics of scientific computation, in particular numerical analysis, the theory of numerical methods Computational complexity Computer algebra and computer algebra systems Computer-assisted research in various areas of mathematics, such as logic (automated theorem proving), discrete mathematics, combinatorics, number theory, and computational algebraic topology Cryptography and computer security, which involve, in particular, research on primality testing, factorization, elliptic curves, and mathematics of blockchain Computational linguistics, the use of mathematical and comput
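As a toy instance of "solving mathematical problems by computer simulation as opposed to analytic methods", a Monte Carlo estimate of π can be written in a few lines of Python. This is an illustrative sketch, not an example from the article:

```python
import random

def monte_carlo_pi(samples: int, seed: int = 0) -> float:
    """Estimate pi by sampling points uniformly in the unit square and
    counting how many fall inside the quarter circle of radius 1; the
    fraction approximates pi/4."""
    rng = random.Random(seed)
    inside = sum(
        1 for _ in range(samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4 * inside / samples

print(monte_carlo_pi(100_000))  # close to 3.14159
```

The statistical error shrinks only as 1/sqrt(samples), which is exactly the kind of trade-off between simulation and analytic methods that computational mathematics studies.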
https://en.wikipedia.org/wiki/Greenseas
Greenseas is a brand of shelf-stable fish products owned by the H. J. Heinz Company. They produce a range of popular goods, including tuna, salmon and sardines. See also Canned tuna
https://en.wikipedia.org/wiki/P-wave%20modulus
There are two kinds of seismic body waves in solids, pressure waves (P-waves) and shear waves. In linear elasticity, the P-wave modulus M, also known as the longitudinal modulus or the constrained modulus, is one of the elastic moduli available to describe isotropic homogeneous materials. It is defined as the ratio of axial stress to axial strain in a uniaxial strain state, M = σ_axial/ε_axial, where all the other strains are zero. This state occurs when expansion in the transverse direction is prevented by the inertia of neighboring material, such as in an earthquake or an underwater seismic blast. The definition is equivalent to stating that M = ρVP², where VP is the velocity of a P-wave and ρ is the density of the material through which the wave is propagating.
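The relation M = ρVP² can be evaluated in either direction; a small Python sketch, with assumed illustrative rock values (density 2700 kg/m³, VP 6000 m/s, not taken from the article):

```python
import math

def p_wave_modulus(density: float, vp: float) -> float:
    """P-wave modulus from density (kg/m^3) and P-wave speed (m/s):
    M = rho * Vp**2."""
    return density * vp ** 2

def p_wave_speed(modulus: float, density: float) -> float:
    """Invert the relation: Vp = sqrt(M / rho)."""
    return math.sqrt(modulus / density)

# Illustrative (assumed) rock values.
M = p_wave_modulus(2700.0, 6000.0)
print(M / 1e9)               # modulus in GPa
print(p_wave_speed(M, 2700.0))  # recovers 6000.0 m/s
```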
https://en.wikipedia.org/wiki/Electrostatic%20deflection%20%28structural%20element%29
In molecular physics/nanotechnology, electrostatic deflection is the deformation of a beam-like structure/element bent by an electric field (Fig. 1). It can be due to interaction between electrostatic fields and net charge, or to electric polarization effects. The beam-like structure/element is generally cantilevered (fixed at one of its ends). Among nanomaterials, carbon nanotubes (CNTs) are typical examples for electrostatic deflection. The mechanism of electric deflection due to electric polarization can be understood as follows: as shown in Fig. 2, when a material is brought into an electric field (E), the field tends to shift the positive charge (in red) and the negative charge (in blue) in opposite directions. Thus, induced dipoles are created. Fig. 3 shows a beam-like structure/element in an electric field. The interaction between the molecular dipole moment and the electric field results in an induced torque (T). This torque tends to align the beam toward the direction of the field. In the case of a cantilevered CNT (Fig. 1), it would be bent toward the field direction. Meanwhile, the electrically induced torque and the stiffness of the CNT compete against each other. This deformation has been observed in experiments. This property is an important characteristic of CNTs for promising nanoelectromechanical systems applications, as well as for their fabrication, separation and electromanipulation. Recently, several nanoelectromechanical systems based on cantilevered CNTs have been reported, such as nanorelays, nanoswitches, nanotweezers and feedback devices designed for memory, sensing or actuation uses. Furthermore, theoretical studies have been carried out to try to gain a full understanding of the electric deflection of carbon nanotubes.
https://en.wikipedia.org/wiki/Computed%20tomography%20angiography
Computed tomography angiography (also called CT angiography or CTA) is a computed tomography technique used for angiography—the visualization of arteries and veins—throughout the human body. Using contrast injected into the blood vessels, images are created to look for blockages, aneurysms (dilation of walls), dissections (tearing of walls), and stenosis (narrowing of vessels). CTA can be used to visualize the vessels of the heart, the aorta and other large blood vessels, the lungs, the kidneys, the head and neck, and the arms and legs. CTA can also be used to localise arterial or venous bleeding of the gastrointestinal system. Medical uses CTA can be used to examine blood vessels in many key areas of the body, including the brain, kidneys, pelvis, and lungs. Coronary CT angiography Coronary CT angiography (CCTA) is the use of CT angiography to assess the arteries of the heart. The patient receives an intravenous injection of contrast and the heart is then scanned using a high-speed CT scanner. With advances in CT technology, patients are typically able to be scanned without needing medication, simply by holding their breath during the scan. CCTA is used to assess heart or vessel irregularities, the location of stents and whether they are still open, and occasionally to check for atherosclerotic disease. This method displays the anatomical detail of blood vessels more precisely than magnetic resonance imaging (MRI) or ultrasound. Today, many patients can undergo CTA in place of a conventional catheter angiogram, a minor procedure during which a catheter is passed through the blood vessels all the way to the heart; however, CCTA has not fully replaced this procedure. CCTA is able to detect narrowing of blood vessels in time for corrective therapy to be done. CCTA is a useful way of screening for arterial disease because it is safer and much less time-consuming than catheter angiography, and is also a cost-effective procedure. Aorta and great arteries CTA can be used i
https://en.wikipedia.org/wiki/Float%20%28woodworking%29
A woodworking float (more rarely used in silversmithing), also called a planemaker's float, is a tapered, flat, single-cut file of two types, the edge float and the flat-sided float, which are traditional woodworking tools generally used when making a wooden plane. The float is used to cut, flatten, and smooth (or "float") key areas of wood by abrasion. Despite the name, its woodworking uses go well beyond planemaking. Floats are similar to rasps and files. Rasps are generally coarse and cannot be resharpened. Files have angled ridges or teeth and cannot be resharpened. Floats have parallel teeth and can be resharpened as many times as the thickness of the blade will allow. Edge floats resemble saw blades and are generally used to cut wedge slots in wood. Flat-sided floats are more similar to a file or rasp, but their cutting edges are a series of parallel teeth. Types of woodworking floats include the joiner's float, the bed float, and the side float.
https://en.wikipedia.org/wiki/Apache%20Thrift
Thrift is an interface definition language and binary communication protocol used for defining and creating services for numerous programming languages. It was developed at Facebook for "scalable cross-language services development" and as of 2020 is an open-source project of the Apache Software Foundation. As a remote procedure call (RPC) framework, it combines a software stack with a code generation engine to build cross-platform services which can connect applications written in a variety of languages and frameworks, including ActionScript, C, C++, C#, Cappuccino, Cocoa, Delphi, Erlang, Go, Haskell, Java, JavaScript, Objective-C, OCaml, Perl, PHP, Python, Ruby, Elixir, Rust, Scala, Smalltalk and Swift. The implementation was described in an April 2007 technical paper released by Facebook, now hosted on Apache. Architecture Thrift includes a complete stack for creating clients and servers. The top part is code generated from the Thrift definition file; from this file, client and processor code for the services is generated. In contrast to built-in types, created data structures are sent as a result of generated code. The protocol and transport layer are part of the runtime library. With Thrift, it is possible to define a service and change the protocol and transport without recompiling the code. Besides the client part, Thrift includes server infrastructure to tie protocols and transports together, such as blocking, non-blocking, and multi-threaded servers. The underlying I/O part of the stack is implemented differently for different languages. Thrift supports a number of protocols: TBinaryProtocol – A straightforward binary format; simple, but not optimized for space efficiency. Faster to process than the text protocols but more challenging to debug. TCompactProtocol – A more compact binary format; typically more efficient to process as well. TJSONProtocol – Uses JSON for encoding of data. TSimpleJSONProtocol – A write-only protocol that cannot be parsed by Thrift becaus
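One ingredient of TCompactProtocol's space savings is zigzag-plus-varint integer encoding, so signed values of small magnitude occupy only one or two bytes instead of a fixed-width field. A pure-Python sketch of that idea (illustrative only; this is not the Apache Thrift library API):

```python
def zigzag(n: int) -> int:
    # Map signed 64-bit ints to unsigned so small magnitudes stay small:
    # 0 -> 0, -1 -> 1, 1 -> 2, -2 -> 3, ...
    return (n << 1) ^ (n >> 63)

def unzigzag(u: int) -> int:
    # Inverse mapping back to a signed int.
    return (u >> 1) ^ -(u & 1)

def varint_encode(u: int) -> bytes:
    # 7 data bits per byte; the high bit flags a continuation byte.
    out = bytearray()
    while u >= 0x80:
        out.append((u & 0x7F) | 0x80)
        u >>= 7
    out.append(u)
    return bytes(out)

def varint_decode(data: bytes) -> int:
    u, shift = 0, 0
    for byte in data:
        u |= (byte & 0x7F) << shift
        if not byte & 0x80:
            break
        shift += 7
    return u

# Small magnitudes, positive or negative, fit in a single byte
# rather than a fixed 8-byte integer field.
encoded = varint_encode(zigzag(-1))
print(len(encoded), unzigzag(varint_decode(encoded)))  # 1 -1
```

Zigzag mapping matters because a plain two's-complement negative number has all its high bits set and would otherwise always need the maximum number of varint bytes.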
https://en.wikipedia.org/wiki/Gene%20targeting
Gene targeting is a biotechnological tool used to change the DNA sequence of an organism (hence it is a form of genome editing). It is based on the natural DNA-repair mechanism of homology-directed repair (HDR), including homologous recombination. Gene targeting can be used to make DNA edits of a range of sizes, from larger edits such as inserting entire new genes into an organism, through to much smaller changes to the existing DNA such as a single base-pair change. Gene targeting relies on the presence of a repair template to introduce the user-defined edits to the DNA. The user (usually a scientist) designs the repair template to contain the desired edit, flanked by DNA sequence corresponding (homologous) to the region of DNA to be edited; hence the edit is targeted to a particular genomic region. In this way gene targeting is distinct from natural homology-directed repair, during which the "natural" DNA repair template of the sister chromatid is used to repair broken DNA (the sister chromatid is the second copy of the gene). The alteration of DNA sequence in an organism can be useful in both a research context, for example to understand the biological role of a gene, and in biotechnology, for example to alter the traits of an organism (e.g. to improve crop plants). Methods To create a gene-targeted organism, DNA must be introduced into its cells. This DNA must contain all of the parts necessary to complete the gene targeting. At a minimum this is the homology repair template, containing the desired edit flanked by regions of DNA homologous (identical in sequence) to the targeted region; these homologous regions are called "homology arms". Often a reporter gene and/or a selectable marker is also required, to help identify and select for cells (or "events") where gene targeting has actually occurred. It is also common practice to increase gene-targeting rates by causing a double-strand break (DSB) in the targeted DNA region. Hence the genes encoding for the s
https://en.wikipedia.org/wiki/Booting%20process%20of%20Linux
The multi-stage booting process of Linux is in many ways similar to the BSD and other Unix-style boot processes, from which it derives. Booting a Linux installation involves multiple stages and software components, including firmware initialization, execution of a boot loader, loading and startup of a Linux kernel image, and execution of various startup scripts and daemons. For each of these stages and components there are different variations and approaches; for example, GRUB, coreboot or Das U-Boot can be used as boot loaders (historical examples are LILO, SYSLINUX or Loadlin), while the startup scripts can be either traditional init-style, or the system configuration can be performed through modern alternatives such as systemd or Upstart. Overview Early stages of the Linux startup process depend very much on the computer architecture. IBM PC compatible hardware is one architecture Linux is commonly used on; on these systems, the BIOS plays an important role, which might not have exact analogs on other systems. In the following example, IBM PC compatible hardware is assumed: The BIOS performs startup tasks like the Power-on self-test specific to the actual hardware platform. Once the hardware is enumerated and the hardware which is necessary for boot is initialized correctly, the BIOS loads and executes the boot code from the configured boot device. The boot loader often presents the user with a menu of possible boot options and has a default option, which is selected after some time passes. Once the selection is made, the boot loader loads the kernel into memory, supplies it with some parameters and gives it control. The kernel, if compressed, will decompress itself. It then sets up system functions such as essential hardware and memory paging, and calls start_kernel() which performs the majority of system setup (interrupts, the rest of memory management, device and driver initialization, etc.). It then starts up, separately, the idle process, schedule
https://en.wikipedia.org/wiki/Cognio
Cognio, Inc. was an American company that developed and marketed radio frequency (RF) spectrum analysis products that find and solve channel interference problems on wireless networks and in wireless applications. Cognio’s Spectrum Expert product was designed for frequency bands used by common technologies such as RFID and Wi-Fi. It was sold primarily to network engineers responsible for the security of wireless networks or of applications that run on them. Cognio was acquired by Cisco Systems in 2007. History Cognio was founded in 2000 and was originally called Aryya Communications. It first announced a "cognitive link processing" technology, and was located in Waltham, Massachusetts, near Boston. In February 2001, Gary Ambrosino became interim CEO to work with the founding team in commercializing the product and raising additional capital. In March 2003 it announced $12.5 million of venture capital funding from North Bridge Venture Partners and ABS Ventures as round B. In January 2005, Thomas McPherson became chief executive officer. Cognio was headquartered in Germantown, Maryland (near Washington, DC), United States, and raised additional investment that included Avansis Ventures in a fourth round in April 2007, for a total of $30 million. Products In June 2003, Cognio announced "intelligent spectrum management" technology, sometimes called cognitive radio. Cognio shipped its Wi-Fi management software in the spring of 2005. Cognio was granted 12 patents, and submitted 172 patent applications for its RF analysis technology. IEEE 802.11 wireless network protocols (the standards which are marketed under the "Wi-Fi" brand) operate in unlicensed spectrum. That is, their frequency bands are not licensed by the U.S. Federal Communications Commission (FCC) or other organizations exclusively for 802.11 traffic. Instead, the spectrum is shared with many other types of devices, such as cordless phones and Bluetooth devices. Because of this shared spectrum, an 802.11 device must contend with interference from these other devices.
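A small calculation makes the crowding of the shared 2.4 GHz band concrete. In 802.11, channel n (for channels 1-13) is centered at 2407 + 5n MHz, but each channel occupies roughly 22 MHz, so most adjacent channels overlap; the overlap test below is a simplification for illustration:

```python
# Why 2.4 GHz Wi-Fi channels interfere: channel centers are only 5 MHz
# apart while each channel is about 22 MHz wide, so nearby channels
# overlap. Channels 1, 6 and 11 are the classic non-overlapping set.

CHANNEL_WIDTH_MHZ = 22

def center_mhz(channel):
    """Center frequency (MHz) of a 2.4 GHz Wi-Fi channel, channels 1-13."""
    return 2407 + 5 * channel

def overlaps(a, b):
    """True if two channels' ~22 MHz-wide bands overlap (simplified model)."""
    return abs(center_mhz(a) - center_mhz(b)) < CHANNEL_WIDTH_MHZ

print(center_mhz(1))    # 2412
print(overlaps(1, 2))   # True: centers only 5 MHz apart
print(overlaps(1, 6))   # False: centers 25 MHz apart
print(overlaps(6, 11))  # False
```

Non-Wi-Fi devices in the band (cordless phones, Bluetooth, microwave ovens) are not bound by this channel plan at all, which is what made wide-band spectrum analyzers like Cognio's useful for diagnosing interference that per-channel Wi-Fi tools could not see.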
https://en.wikipedia.org/wiki/Human%20Universals
Human Universals is a book by Donald Brown, an American professor of anthropology (emeritus) who worked at the University of California, Santa Barbara. It was published by McGraw-Hill in 1991. Brown says human universals "comprise those features of culture, society, language, behavior, and psyche for which there are no known exception." According to Brown, there are many universals common to all human societies. Steven Pinker lists all of Brown's universals in the appendix of his book The Blank Slate. The list includes several hundred universals, and Pinker also notes Brown's later article on human universals in The MIT Encyclopedia of the Cognitive Sciences. The list is seen by Brown (and Pinker) as evidence of mental adaptations to communal life in our species' evolutionary history (p. 53). The issues raised by Brown's list are essentially Darwinian. They occur in Darwin's Descent of Man (1871) and The Expression of the Emotions in Man and Animals (1872), and in Huxley's Evidence as to Man's Place in Nature (1863). The list gives little emphasis to the issues of aggression, physical conflict and warfare, which have an extensive literature in ethology. Brown's list does, however, include conflict and its mediation as items. He also notes that human males are more prone to violence and aggression than females.