| source | text |
|---|---|
https://en.wikipedia.org/wiki/Reliable%20Internet%20Stream%20Transport | Reliable Internet Stream Transport (RIST) is an open-source, open-specification transport protocol designed for reliable transmission of video over lossy networks (including the Internet) with low latency and high quality. It is currently under development in the Video Services Forum's "RIST Activity Group."
RIST is intended as a more reliable successor to Secure Reliable Transport, and as an open alternative to proprietary commercial options such as ActionStreamer, Zixi, VideoFlow, QVidium, and DVEO (Dozer).
Technology
Technically, RIST seeks to provide reliable, high performance media transport by using RTP / UDP at the transport layer to avoid the limitations of TCP. Reliability is achieved by using NACK-based retransmissions (ARQ). SMPTE-2022 Forward Error Correction can be combined with RIST but is known to be significantly less effective than ARQ.
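As an illustration of NACK-based ARQ (a schematic sketch only: real RIST retransmission requests are carried in RTCP packets with their own wire format, and the class and range representation here are invented for illustration), a receiver can track RTP sequence numbers and request retransmission of the ranges it finds missing:

```python
# Illustrative sketch of NACK-based loss detection: a receiver tracks
# 16-bit RTP sequence numbers and reports (start, end) ranges of missing
# packets so the sender can retransmit them. Reordering is ignored here.

class NackTracker:
    def __init__(self):
        self.expected = None  # next sequence number we expect to see

    def on_packet(self, seq):
        """Process an arriving sequence number.
        Returns a list of (start, end) ranges to NACK, empty if none."""
        nacks = []
        if self.expected is not None and seq != self.expected:
            # Packets self.expected .. seq-1 never arrived.
            nacks.append((self.expected, (seq - 1) % 65536))
        self.expected = (seq + 1) % 65536
        return nacks

tracker = NackTracker()
requests = []
for seq in [100, 101, 104, 105, 110]:
    requests.extend(tracker.on_packet(seq))
print(requests)  # [(102, 103), (106, 109)]
```

A real receiver would additionally bound how long it waits for retransmissions, since ARQ trades latency for reliability.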
RIST Simple Profile was published in October 2018 and includes the following features:
The base stream uses RTP for compatibility with existing equipment.
Retransmission requests use RTCP. Two types of retransmission requests are defined:
A Bitmask NACK, defined in RFC 4585.
A Range NACK, defined as an APP RTCP packet.
Bonding of multiple links for load sharing.
Seamless switching using SMPTE-2022-7.
Out-of-band transmission of protection data.
The RIST AG is working on an update to RIST Simple Profile that adds link probing to allow for dynamic ARQ protection.
RIST Main Profile was published in March 2020 and adds the following features to Simple Profile:
Tunneling based on RFC 8086, with bidirectional send/receive in the same tunnel.
Multiplexing of multiple streams into the same tunnel.
In-band data support in the tunnel, useful for remote management.
Client/Server architecture.
Firewall traversal.
DTLS encryption.
Pre-Shared Key encryption, with multicast support, access control, and authentication.
Advanced authentication options using either public key certificates or TLS-SRP.
Bandwidth optimiza |
https://en.wikipedia.org/wiki/1960%20United%20States%20presidential%20debates | The 1960 United States presidential debates were a series of debates held during the 1960 presidential election between Democratic nominee John F. Kennedy and Republican nominee Richard Nixon. The four presidential debates were the first series of debates conducted for any presidential election. The next presidential debate did not occur until 1976, after which debates would become a regular feature of all presidential campaigns.
Some believe that those who listened to the first debate on radio thought that Nixon had won, while those who watched that debate on television thought that Kennedy had won.
Background
In 1960, John F. Kennedy, a senator from Massachusetts, was nominated by the Democratic Party as their presidential nominee. He chose the Senate Democratic leader, Lyndon B. Johnson, as his running mate. The Republican Party nominated the incumbent vice president, Richard Nixon, as their presidential nominee, with Henry Cabot Lodge Jr., the United States ambassador to the United Nations, as his running mate. Most polls after the party conventions showed the Nixon–Lodge ticket having a six-point lead over the Kennedy–Johnson ticket.
Debates
Schedule
First presidential debate (WBBM-TV, Chicago)
The first presidential debate was held at WBBM-TV, Chicago on Monday September 26, 1960. Howard K. Smith moderated the debate with Sander Vanocur, Charles Warren, Stuart Novins and Bob Fleming as panelists. Questions were restricted to internal or domestic American matters. The format decided was:
Eight-minute opening statements
Two-and-a-half-minute responses to questions
Optional rebuttal
Three-minute closing statements.
Nixon refused make-up for the first debate; consequently, his facial stubble showed prominently on the black-and-white television screens of the time. During the debate, Nixon started sweating under the studio lights. His light gray suit faded into the backdrop of the set and seemed to match his skin tone. Reacting to this, his mother immediately ca |
https://en.wikipedia.org/wiki/International%20Conference%20on%20Surface%20Plasmon%20Photonics | The International Conference on Surface Plasmon Photonics (SPP) is a biennial conference series in the field of plasmonics, including electron-plasmon interactions; energy harvesting; graphene, mid-IR, and THz plasmonics; near-field instrumentation; novel plasmonic materials; particle manipulations; plasmonic, metasurface, and metamaterial devices; sensors and transducers for biomedical applications; ultrafast and nonlinear phenomena; and quantum plasmonics.
Special issues
Several scientific journals have published special issues reporting results from recent SPP conferences:
SPP6:
SPP7:
SPP8:
SPP9: Mortensen, N. Asger; Berini, Pierre; Levy, Uriel; Bozhevolnyi, Sergey I. (2020-02-04). "Proceedings of 9th International Conference on Surface Plasmon Photonics (SPP9)". Nanophotonics 9 (2): 245. doi:10.1515/nanoph-2019-0532
References
Physics conferences
Biennial events
Recurring events established in 2001
Plasmonics |
https://en.wikipedia.org/wiki/West%20PC-800 | The West PC-800 was a home computer introduced by Norwegian company West Computer AS in 1984. The computer was designed as an alarm center allowing use of several CPUs (6502, Z80, 8086, 68000) and operating systems. The company introduced an IBM PC compatible in early 1986 and the West PC-800 line was phased out.
History
West Computer AS was founded in late 1983 by Tov Westby, Terje Holen and Geir Ståle Sætre. In early 1984, the company presented its computer, then called Sherlock, at the Mikrodata'84 fair. The new computer had both 6502 and Z80 CPUs, promised rich expansion capabilities and included two rather unusual features: a wireless keyboard and an alarm device, which could report fire, flood or burglary via phone and the built-in modem. The machine was released in autumn 1984 at the Sjølyst "Home and Hobby" fair. The West PC 800 did not sell as well as expected, probably due to the Apple II's weak position in Norway, and West Computer AS announced in late 1985 the IBM PC compatible West PC 1600.
In March 1985, the price of the basic computer was NOK10,200. An additional package with one floppy disk drive (200 KB unformatted capacity), 3 applications and 3 games was available for NOK3,750 and another floppy disk drive for NOK3,300.
Features
West Computer designed its computer primarily as an alarm center, while emphasizing that it could also function as a games machine thanks to its Apple II compatibility. From about serial no. 100 the machine became Apple II Plus compatible due to an updated BIOS. Built-in software included two BASIC variants (one for the 6502, one for the Z80), though only an older BASIC variant was available for the 6502 (for full Apple BASIC compatibility). Disk drives are controlled by West DOS (similar to Apple DOS), whose commands are accessible directly from BASIC. However, ProDOS was not compatible with West DOS at the time of the machine's introduction.
A Z80 CPU was available for CP/M compatibility. As access to the Z80 is via the 6502, its performa |
https://en.wikipedia.org/wiki/Power%20Rangers%3A%20Battle%20for%20the%20Grid | Power Rangers: Battle for the Grid is a fighting game developed by San Francisco-based game developer nWay, featuring characters from the Power Rangers franchise. It was released digitally for Nintendo Switch and Xbox One on March 26, 2019, for PlayStation 4 on April 2, 2019, for Microsoft Windows on September 24, 2019, and for Stadia on June 1, 2020. Limited Run Games released a standard physical version on the Switch and PlayStation 4 alongside a more expensive Mega Edition, which included a SteelBook case, an 18" × 24" poster, and 5 coins in addition to the game. Pre-orders went on sale in June 2019, with the game delivered in November 2019. In October 2020, Maximum Games published the "Collector's Edition", which included the character Lauren Shiba, both physically and digitally. A third version, the Super Edition, containing all previous downloadable content, was released digitally in May 2021 and physically in August 2021.
Gameplay
Power Rangers: Battle for the Grid is a fighting game in which players compete in battle using characters with different fighting styles and special attacks. Players select teams of three characters to engage in one-on-one combat, and can choose to switch between them at any point during the match, as well as call teammates to perform assists. Players can also call on Megazords to assist with the battle, via a meter that fills as their team members take damage. When all three opposing teammates have been eliminated, a victory is declared.
The game features Ranked Online, Arcade, Versus, Casual Online, Training and Tutorial modes, along with a Story Mode loosely based on the Power Rangers comic book storyline "Shattered Grid".
Playable Characters
The game features 12 playable characters, with 14 additional characters available as downloadable content.
Mighty Morphin Power Rangers
Jason Lee Scott - MMPR Red Ranger
Trini Kwan - MMPR Yellow Ranger (Dragon Armor)
Tommy Oliver - MMPR Green Ranger/White Ranger
Adam |
https://en.wikipedia.org/wiki/Concurrent%20hash%20table | A concurrent hash table or concurrent hash map is an implementation of the hash table data structure that allows concurrent access by multiple threads.
Concurrent hash tables represent a key concurrent data structure for use in concurrent computing, allowing multiple threads to cooperate more efficiently on a computation over shared data.
Due to the natural problems associated with concurrent access - namely contention - the way and scope in which the table can be concurrently accessed differs depending on the implementation. Furthermore, the resulting speed-up might not be linear in the number of threads used, as contention needs to be resolved, producing processing overhead. There exist multiple solutions to mitigate the effects of contention, each of which preserves the correctness of operations on the table.
As with their sequential counterpart, concurrent hash tables can be generalized and extended to fit broader applications, such as allowing more complex data types to be used for keys and values. These generalizations can, however, negatively impact performance and should thus be chosen in accordance with the requirements of the application.
Concurrent hashing
When creating concurrent hash tables, the functions accessing the table with the chosen hashing algorithm need to be adapted for concurrency by adding a conflict resolution strategy. Such a strategy must manage accesses so that conflicts between them do not corrupt data, while ideally allowing them to proceed efficiently in parallel.
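One common conflict resolution strategy is lock striping: the table is split into several independently locked segments, so threads operating on different segments never block each other. A minimal Python sketch (the class and method names are illustrative, not from any particular library):

```python
# Lock-striping sketch: each stripe is an ordinary dict guarded by its own
# lock, so a put/get only contends with operations that hash to the same
# stripe rather than serializing the whole table.
import threading

class StripedHashMap:
    def __init__(self, num_stripes=16):
        self.stripes = [{} for _ in range(num_stripes)]
        self.locks = [threading.Lock() for _ in range(num_stripes)]

    def _stripe(self, key):
        # The hash function decides which stripe (and lock) a key belongs to.
        return hash(key) % len(self.stripes)

    def put(self, key, value):
        i = self._stripe(key)
        with self.locks[i]:          # only this stripe is locked
            self.stripes[i][key] = value

    def get(self, key, default=None):
        i = self._stripe(key)
        with self.locks[i]:
            return self.stripes[i].get(key, default)

m = StripedHashMap()
threads = [threading.Thread(target=m.put, args=(k, k * k)) for k in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(m.get(7))  # 49
```

Production implementations add resizing and finer-grained or lock-free schemes; the sketch only shows how striping limits contention to one segment per operation.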
Herlihy and Shavit describe how the accesses to a hash table without such a strategy - in their example based on a basic implementation of the Cuckoo hashing algorithm - can be adapted for concurrent use. Fan et al.
further describe a table access scheme based on cuckoo hashing that is not only concurrent, but also keeps the space efficiency of its hashing function while also improving cache locality as well as the throughput of insertions.
Wh |
https://en.wikipedia.org/wiki/Broadcast%20%28parallel%20pattern%29 | Broadcast is a collective communication primitive in parallel programming to distribute programming instructions or data to nodes in a cluster. It is the reverse operation of reduction. The broadcast operation is widely used in parallel algorithms, such as matrix-vector multiplication, Gaussian elimination and shortest paths.
The Message Passing Interface implements broadcast in MPI_Bcast.
Definition
A message of length n should be distributed from one node to all other nodes.
T_byte is the time it takes to send one byte.
T_start is the time it takes for a message to travel to another node, independent of its length.
Therefore, the time to send a package of length n from one node to another is T_start + n · T_byte.
p is the number of nodes and the number of processors.
Binomial Tree Broadcast
With Binomial Tree Broadcast the whole message is sent at once. Each node that has already received the message sends it on further. The number of informed nodes grows exponentially, since the number of sending nodes doubles in each time step. The algorithm is ideal for short messages but falls short with longer ones, as during the time when the first transfer happens only one node is busy.
Sending a message to all nodes takes ⌈log_2 p⌉ time steps, which results in a runtime of ⌈log_2 p⌉ · (T_start + n · T_byte).
Message M
id := node number
p := number of nodes
if id > 0
blocking_receive M
for (i := ceil(log_2(p)) - 1; i >= 0; i--)
if (id % 2^(i+1) == 0 && id + 2^i < p)
send M to node id + 2^i
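The pseudocode above can be checked with a small Python simulation that only tracks which set of nodes holds the message after each time step (assuming nodes are numbered 0 to p - 1 with node 0 as the root, as in the pseudocode):

```python
# Simulation of binomial tree broadcast: in time step i every node whose
# id is a multiple of 2^(i+1) and already holds the message sends it to
# node id + 2^i, so the number of informed nodes doubles each step.
from math import ceil, log2

def binomial_broadcast(p):
    """Return, per time step, the set of nodes holding the message."""
    has_msg = {0}                      # node 0 is the root
    history = [set(has_msg)]
    for i in range(ceil(log2(p)) - 1, -1, -1):
        for src in list(has_msg):      # snapshot: new receivers send next step
            dst = src + 2 ** i
            if src % 2 ** (i + 1) == 0 and dst < p:
                has_msg.add(dst)
        history.append(set(has_msg))
    return history

steps = binomial_broadcast(p=8)
print([len(s) for s in steps])  # [1, 2, 4, 8]
```

For p = 8 the informed set doubles every step, matching the ⌈log_2 p⌉ = 3 steps the analysis predicts.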
Linear Pipeline Broadcast
The message is split up into k packages and sent piecewise from node i to node i + 1. The time needed to distribute the first message piece is (p - 1) · t, whereby t = T_start + (n/k) · T_byte is the time needed to send one package from one processor to another.
Sending a whole message takes (p + k - 2) · (T_start + (n/k) · T_byte), since after the first piece arrives at the last node, the remaining k - 1 pieces arrive one per step.
Optimal is to choose k = sqrt(n · (p - 2) · T_byte / T_start), resulting in a runtime of approximately n · T_byte + p · T_start + 2 · sqrt(n · p · T_start · T_byte).
The run time depends not only on the message length but also on the number of processors. This approach shines when the length of the message is much larger than the number of processors.
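To make the trade-off concrete, the following Python sketch evaluates the pipelined runtime under a cost model where one package of n/k bytes costs T_start + (n/k) · T_byte to transfer, and where the broadcast needs p + k - 2 package sends in total (that send count, and all concrete numbers, are assumptions of this sketch):

```python
# Sketch: numerically find the number of packages k that minimizes the
# pipelined-broadcast runtime (p + k - 2) * (T_start + (n / k) * T_byte).
# Model assumptions: T_start per message, T_byte per byte, a chain of p
# nodes. The concrete values below are arbitrary illustration.

def pipeline_time(n, p, k, t_start, t_byte):
    """Runtime of a linear pipeline broadcast of n bytes in k packages."""
    return (p + k - 2) * (t_start + (n / k) * t_byte)

n, p = 10_000, 16            # message length in bytes, number of nodes
t_start, t_byte = 100.0, 1.0

best_k = min(range(1, n + 1),
             key=lambda k: pipeline_time(n, p, k, t_start, t_byte))
# best_k lands near sqrt(n * (p - 2) * t_byte / t_start) ≈ 37
print(best_k, pipeline_time(n, p, best_k, t_start, t_byte))
```

Splitting too little (small k) wastes the pipeline; splitting too much (large k) pays the T_start overhead k times, so the optimum balances the two terms.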
Message M := [m_1, m_2, ..., m_n]
id = node |
https://en.wikipedia.org/wiki/Liquid%20Haskell | Liquid Haskell is a program verifier for the programming language Haskell which allows specifying correctness properties by using refinement types. Properties are verified using a satisfiability modulo theories (SMT) solver which is SMTLIB2-compliant, such as the Z3 Theorem Prover.
See also
Formal verification
References
Further reading
External links
Formal methods tools
Static program analysis tools
Type systems |
https://en.wikipedia.org/wiki/996%20working%20hour%20system | The 996 working hour system is a work schedule practiced by some companies in China. It derives its name from its requirement that employees work from 9:00 am to 9:00 pm, 6 days per week; i.e. 72 hours per week, 12 hours per day. A number of Mainland Chinese internet companies have adopted this system as their official work schedule. Critics argue that the 996 working hour system is a violation of Chinese Labour Law and have called it "modern slavery".
In March 2019, an "anti-996" protest was launched via GitHub. Since then, the 996 issue has been met with growing discontent in China, but despite China's official promises that the 996 working hour system will disappear, it is still widely present.
Background
The culture of overtime work has a long history in Chinese IT companies, where the focus is typically on speed and cost reduction. Companies employ a range of measures, such as reimbursing taxi fares for employees who remain working at the office late into the night, to encourage overtime work.
This system of working long hours with few breaks is known to increase the occurrence of mental and physical problems in workers. It is estimated that more than three-quarters of urban workers in big cities like Beijing, Shanghai, and Guangzhou suffer from work-related fatigue, musculoskeletal pain, sleep or eating disorders, occupational stress, and work-family imbalance. According to China's state-owned media People's Daily, a 2013 survey showed that 98.8% of Chinese IT industry workers said they had health problems. Numerous overwork deaths and suicides have occurred during past decades due to the 996 system and other overtime working systems in China.
In 2020, a study found that "Chinese businesses are more likely to follow long work hours than American ones". Another study likened 996 culture to "modern slavery", formed through the combination of "unrestricted global capitalism and a Confucian culture of hierarchy and obedience".
Relevant leg |
https://en.wikipedia.org/wiki/Alternating%20timed%20automaton | In automata theory, an alternating timed automaton (ATA) is a mix of both timed automaton and alternating finite automaton. That is, it is a kind of automaton that can measure time and in which there exist both universal and existential transitions.
ATAs are more expressive than timed automata. A one-clock alternating timed automaton (OCATA) is the restriction of an ATA allowing the use of a single clock. OCATAs can express timed languages which cannot be expressed using timed automata.
Definition
An alternating timed automaton is defined as a timed automaton, where the transitions are more complex.
Difference from a timed automaton
Given a set X, let B+(X) denote the set of positive Boolean combinations of elements of X, i.e. the smallest set containing the elements of X and containing φ ∧ ψ and φ ∨ ψ for each φ, ψ ∈ B+(X).
For each letter a and location ℓ, fix a finite set of clock constraints whose zones partition R≥0^|C|, with |C| the number of clocks. Given a clock valuation ν, let c_ν be the only one of those clock constraints which is satisfied by ν.
An alternating timed automaton contains a transition function δ, which associates to each 3-tuple (ℓ, a, c), consisting of a location, a letter and a clock constraint, an element of B+(L × P(C)), where L is the set of locations and P(C) is the powerset of the set of clocks C.
For example, (ℓ_1, ∅) ∨ (ℓ_2, {x}) ∨ (ℓ_2, {y}) is an element of B+(L × P(C)). Intuitively, it means that the run may either continue by moving to location ℓ_1 and resetting no clock, or by moving to location ℓ_2, where it should be successful when either x or y is reset.
Formal definition
Formally, an alternating timed automaton is a tuple A = (Σ, L, C, L_0, F, δ) that consists of the following components:
Σ is a finite set called the alphabet or actions of A.
L is a finite set. The elements of L are called the locations or states of A.
C is a finite set called the clocks of A.
L_0 ⊆ L is the set of start locations.
F ⊆ L is the set of accepting locations.
δ is the transition function of A. It is a partial function, defined as explained in the previous section.
Any Boolean expression can be rewritten into an equivalent expression in disjunctive normal form. In the representation of an ATA, each disjunction is represented by a different arrow. Each |
https://en.wikipedia.org/wiki/Marilda%20Sotomayor | Marilda A. Oliveira Sotomayor (born March 13, 1944) is a Brazilian mathematician and economist known for her research on auction theory and stable matchings. She is a member of the Brazilian Academy of Sciences, Brazilian Society of Econometrics, and Brazilian Society of Mathematics. She was elected fellow of the Econometric Society in 2003 and international honorary member of the American Academy of Arts and Sciences in 2020.
Education
Sotomayor grew up in Rio de Janeiro, Brazil. She began her education at Federal University of Rio de Janeiro where she received her degree in Mathematics in 1967. Sotomayor continued her education at Institute of Pure and Applied Mathematics where she received her master's degree in Mathematics in 1972. She received her Ph.D. in Mathematics from Catholic University of Rio de Janeiro in 1981.
Areas of interest
Marilda Sotomayor specializes in game theory, matching markets, and market design. She is the only expert in both game theory and matching markets in Brazil.
Personal
Sotomayor married Jorge Sotomayor and had two children, a son and a daughter.
Selected works
References
External links
1944 births
Living people
Brazilian mathematicians
Brazilian women economists
Brazilian women mathematicians
20th-century Brazilian economists
21st-century Brazilian economists
Game theorists
Federal University of Rio de Janeiro alumni
Pontifical Catholic University of Rio de Janeiro alumni
Academic staff of the University of São Paulo
Fellows of the Econometric Society
Fellows of the American Academy of Arts and Sciences |
https://en.wikipedia.org/wiki/Campaign%20to%20Electrify%20Britain%27s%20Railway | The Campaign to Electrify Britain's Railway (CEBR) is an internet-based campaign group formed in 2018 whose aim is to convince the government to completely electrify the British railway network. Its slogan is "Down with Dirty Diesel." The campaign promotes a rolling programme of electrification, which it considers essential to improve UK railways and help to decarbonise transport. It collaborates with groups such as the Railway Industry Association, Rail Delivery Group, Birmingham Centre for Railway Research and Education, Campaign for Better Transport, Institute of Electrical Engineers and the Permanent Way Institution. The group has given evidence to the Transport Select Committee. Huw Merriman, the committee chair at the time, put in writing that he agreed with their view. Merriman was appointed Minister of State for Rail and HS2 in October 2022. The desire to achieve net-zero carbon in transport has increased calls for electrification.
Origins
The CEBR manifesto states: "The UK has suffered from too many boom and bust infrastructure projects. A steady, planned, rolling programme will reduce costs, speed up journey times, create more seats on more reliable trains and ultimately reduce ticket prices." The group staged a protest on top of Snowdon in 2018. In July 2019, the final report of the rail decarbonisation project was published by the group.
Main organisational goals
Many, but not all, diesel trains use only friction brakes (as do cars and trucks) to slow or stop the train. This wears the discs and pads, releasing particulates into the atmosphere. Electric trains predominantly use their motors in regeneration mode to slow the train, producing almost zero particulates, although the technology does not yet exist to stop the train completely this way. Using this technology would improve the health of the nation, particularly for people who live close to the railway. In addition, regenerative braking saves energy and is more efficient, and thus helps the low-carbon economy. |
https://en.wikipedia.org/wiki/TI-BASIC%2083 | TI-BASIC 83, TI-BASIC Z80, or simply TI-BASIC, is the built-in programming language for the Texas Instruments programmable calculators in the TI-83 series. Calculators that implement TI-BASIC have a built-in editor for writing programs. While the considerably faster Z80 assembly language is supported for the calculators, TI-BASIC's in-calculator editor and more user-friendly syntax make it easier to use. TI-BASIC is interpreted.
Syntax
The syntax for TI-BASIC 83 is significantly different compared to most dialects of BASIC. For example, the language does not permit indentation with whitespace characters. It also depends on the TI calculator character set because it is tokenized. Aside from these differences, TI-BASIC retains most control flow statements: conditionals, various loops, GOTOs and Labels. Conditionals and loops use End to denote the end of their bodies.
Each command can be placed on a new line, or separated by a colon for brevity. As such, the following snippets are identical in function.
:Disp "FOO
:Disp "BAR
and
:Disp "FOO:Disp "BAR
In the above example the closing double quotes can be omitted because the colon causes all open markers to be closed.
Unlike many high-level programming languages, TI-BASIC has only one assignment operator: →. The rightward arrow assigns the value on the left to the variable on the right.
Conditionals
TI-BASIC includes simple constructs using the If statement. When the If token does not have a Then token on the following line, it will only execute the next single command.
:If condition
:command
Where condition is any boolean statement. One benefit of this format is brevity as it does not include Then and End. An If statement may have more than one command in its body if, instead of a command, a Then token is placed.
:If condition
:Then
:command
:command
:End
When using Then, the body must be closed by an End token. One more construct utilizes Else. This allows one of two bodies to be executed.
:If condition
:Then
:bod |
https://en.wikipedia.org/wiki/Channel%20system%20%28computer%20science%29 | In computer science, a channel system is a finite state machine similar to a communicating finite-state machine in which there is a single system communicating with itself instead of many systems communicating with each other. A channel system is similar to a pushdown automaton where a queue is used instead of a stack. Those queues are called channels. Intuitively, each channel represents a sequence of messages to be sent, and to be read in the order in which they were sent.
Definition
Channel system
Formally, a channel system (or perfect channel system) is defined as a tuple S = (Q, q_0, A, C, M, Δ) with:
Q a finite set of control states,
q_0 ∈ Q an initial state,
A a finite alphabet,
C a finite set of channels (for the sake of notation simplicity, let C = {1, …, n}),
M a finite alphabet of messages,
Δ ⊆ Q × C × {!, ?} × M × Q a finite set of transition rules; below, M* denotes the set of finite (potentially empty) words over the alphabet M.
Depending on the author, a channel system may have no initial state and may have an empty alphabet.
Configuration
A configuration or global state of the channel system is a tuple (q, w_1, …, w_n) belonging to Q × (M*)^n, where n is the number of channels. Intuitively, a configuration represents that a run is in state q and that its i-th channel contains the word w_i.
The initial configuration is (q_0, ε, …, ε), with ε the empty word.
Step
Intuitively, a transition (q, c, !, a, q') means that the system may go from control state q to q' by writing the letter a to the end of channel c. Similarly, a transition (q, c, ?, a, q') means that the system may go from control state q to q' by removing the letter a from the start of the word in channel c.
Formally, given a configuration (q, w_1, …, w_n) and a transition (q, c, !, a, q'), there is a perfect step (q, w_1, …, w_n) → (q', w_1, …, w_c a, …, w_n), where the step adds the letter a to the end of the c-th word. Similarly, given a transition (q, c, ?, a, q'), there is a perfect step (q, w_1, …, w_n) → (q', w_1, …, w'_c, …, w_n) where w_c = a w'_c, i.e. the first letter of the c-th word is a and it has been removed during the step.
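The step relation can be made concrete with a small Python sketch: a configuration pairs a control state with one FIFO queue per channel, a write transition appends a message to the end of a channel, and a read transition removes a message from its front (the tuple encoding and function names here are illustrative, not standard notation):

```python
# Illustrative encoding of perfect channel-system steps. A transition is
# a tuple (q, c, action, a, q') where action "!" writes letter a to the
# end of channel c and action "?" reads it from the front.
from collections import deque

def step(config, transition):
    """Apply one perfect step; raise ValueError if it is not enabled."""
    state, channels = config
    src, chan, action, msg, dst = transition
    if state != src:
        raise ValueError("transition not enabled in this control state")
    channels = {c: deque(w) for c, w in channels.items()}  # copy: configs stay immutable
    if action == "!":              # write: append msg to the end of the channel
        channels[chan].append(msg)
    elif action == "?":            # read: msg must be the first letter of the channel
        if not channels[chan] or channels[chan][0] != msg:
            raise ValueError("channel head does not match")
        channels[chan].popleft()
    else:
        raise ValueError("unknown action")
    return (dst, channels)

# q0 -(c!a)-> q1 -(c!b)-> q2 -(c?a)-> q3
config = ("q0", {"c": deque()})
for t in [("q0", "c", "!", "a", "q1"),
          ("q1", "c", "!", "b", "q2"),
          ("q2", "c", "?", "a", "q3")]:
    config = step(config, t)
print(config[0], list(config[1]["c"]))  # q3 ['b']
```

The final configuration shows the FIFO discipline: "a" was written first, so it is the first letter that a read transition can remove, leaving "b" in the channel.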
Run
A perfect run is a sequence of perfect steps, of the form γ_0 → γ_1 → ⋯ → γ_k. We write γ →* γ' to denote that there is a perfect run starting at γ and ending at γ'.
Languages
Given a perfect or a lossy channel system , multiple languages may be defined.
A word over is acce |
https://en.wikipedia.org/wiki/Eurovision%20Debate | The Eurovision Debate is a live televised debate between the lead political candidates (“Spitzenkandidaten”) running to be the next President of the European Commission. Produced by the European Broadcasting Union (EBU) and broadcast across Europe via the Eurovision network, it is hosted by the European Parliament in Brussels, Belgium. The aim of the debate is to help public service media play their role in the democratic process by helping to better inform citizens and encouraging participation in the elections.
History
The first Eurovision Debate took place on 15 May 2014 and was the first-ever live televised format to bring democratic political debate to a pan-European level. An Italian journalist and Director of Rai News24 moderated the debate, with RTÉ's Conor McNally as its social media co-presenter. The Eurovision Debate is produced by the EBU under the guidance and supervision of senior editors from European Public Service Media (the "Editorial Board") and was directed by Rob Hopkin.
The 2019 edition was broadcast live from the European Parliament in Brussels on 15 May 2019 at 21:00 CET, moderated by TV anchors from ARD/WDR, France Télévisions (Émilie Tran Nguyen) and Yle, and broadcast by the EBU's public service media members and others throughout Europe. During the 90-minute debate, the following issues were to be addressed: migration, unemployment, security and climate change, and the role of Europe in the world. In fact, the debate was more about the common minimum wage, the European business tax, the reduction of greenhouse gas emissions, border control combined with solidarity, and the use of trade to improve working conditions in Europe.
Format
The debate is presented by two television anchors who ask the candidates on stage a series of questions on pre-determined themes, although the questions themselves are not known in advance. The debate obeys the strictest rules of transparency and neutrality; all candidates are allocate |
https://en.wikipedia.org/wiki/Benjamin%20S.%20Cook | Benjamin S. Cook is an American scientist, entrepreneur, advisory board member, professor, and author. He is best known for his pioneering work in printed electronics and for implementing the first semiconductor-compatible printed electronics process, VIPRE. He holds over 150 patents and patents pending, and over 100 peer reviewed journal and conference publications.
Biography
Benjamin S. Cook received the Bachelor of Science degree from Rose-Hulman Institute of Technology, the Master of Science degree from King Abdullah University of Science and Technology, and the Doctor of Philosophy degree in electrical engineering and materials science from Georgia Institute of Technology.
From 2006 to 2014, he was the founder and president of Soft-Tronics, a technology consulting firm which partnered with technology startups to accelerate growth and market penetration. In 2014, he joined Texas Instruments Kilby Labs to industrialize his pioneering work in semiconductor printed electronics and additive manufacturing. Currently, he is the Sr. Director of Texas Instruments' Nanotechnology Organization and holds advisory positions on the Rose-Hulman Academic Advisory Board, the Elsevier Journal of Additive Manufacturing, as well as several other research consortiums.
Awards
Member of the Group Technical Staff, Texas Instruments (2016)
Rose-Hulman Career Achievement Award, Alumni of the Year (2016)
Intel Doctoral Fellowship Award (2013)
IEEE Antennas & Propagation Society Doctoral Fellowship Award (2012)
Rose-Hulman Engineer of the Year (2010)
Books
Handbook of Flexible Electronics: Materials, Manufacturing and Applications, Woodhead Publishing
Handbook of Antenna Technologies: Advanced Antenna Fabrication Processes (MEMS/LTCC/LCP/Printing), Springer Publishing
Green RFID Systems: Materials and Substrates, Cambridge Press
References
External links
Google Scholar profile
Living people
Year of birth missing (living people)
American electronics engineers
Rose–Hul |
https://en.wikipedia.org/wiki/Suspension%20culture | A cell suspension or suspension culture is a type of cell culture in which single cells or small aggregates of cells are allowed to function and multiply in an agitated growth medium, thus forming a suspension. Suspension culture is one of the two classical types of cell culture, the other being adherent culture. The history of suspension cell culture closely aligns with the history of cell culture overall, but differs in maintenance methods and commercial applications. The cells themselves can either be derived from homogenized tissue or from heterogeneous cell solutions. Suspension cell culture is commonly used to culture nonadhesive cell lines like hematopoietic cells, plant cells, and insect cells. While some cell lines are cultured in suspension, the majority of commercially available mammalian cell lines are adherent. Suspension cell cultures must be agitated to maintain cells in suspension, and may require specialized equipment (e.g. magnetic stir plates, orbital shakers, incubators) and flasks (e.g. culture flasks, spinner flasks, shaker flasks). These cultures need to be maintained with nutrient-containing media and cultured in a specific cell density range to avoid cell death.
History
The history of suspension cell culture is closely tied to the overall history of cell and tissue culture. In 1885, Wilhelm Roux laid the groundwork for future tissue culture, by developing a saline buffer that was used to maintain living cells (chicken embryos) for a few days. Ross Granville Harrison in 1907 then developed in vitro cell culture techniques, including modifying the hanging drop technique for nerve cells and introducing aseptic technique to the culture process. Later in 1910, Montrose Thomas Burrows adapted Harrison's technique and collaborated with Alexis Carrel to establish multiple tissue cultures that could be maintained in vitro using fresh plasma combined with saline solutions. Carrel went on to develop the first known cell line, a line derived from chick |
https://en.wikipedia.org/wiki/List%20of%20forests%20managed%20by%20Forestry%20and%20Land%20Scotland | Forestry and Land Scotland (FLS) was formed on 1 April 2019 and is responsible for managing and promoting the National Forest Estate in Scotland. The national forest estate owned by FLS covers 6,400 km2, roughly 8% of the land area of Scotland. Around two-thirds of this land is forested, with the remainder consisting of a mixture of agricultural land and open areas such as moorland.
As of January 2020 there were 307 individual forests listed on the FLS website; there are also 6 designated forest parks.
List of Forests
References
Forestry in Scotland
Forests and woodlands of Scotland |
https://en.wikipedia.org/wiki/Paper%20snowflake | A paper snowflake is a type of paper craft based on a snowflake that combines origami with papercutting. The designs can vary significantly depending on the folds and cuts made.
An online version of the craft is known as "Make-A-Flake", and was created by Barkley Inc. in 2008.
See also
Kirigami
References
External links
How to Make 6-Pointed Paper Snowflakes on Instructables
Paper folding
Paper art |
https://en.wikipedia.org/wiki/24-Norcholestane | 24-Norcholestane, a steroid derivative, is used as a biomarker to constrain the source age of sediments and petroleum through the ratio between 24-norcholestane and 27-norcholestane (24-norcholestane ratio, NCR), especially when used with other age diagnostic biomarkers, like oleanane. While the origins of this compound are still unknown, it is thought that they are derived from diatoms due to their identification in diatom rich sediments and environments. In addition, it was found that 24-norcholestane levels increased in correlation with diatom evolution. Another possible source of 24-norcholestane is from dinoflagellates, albeit to a much lower extent.
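The ratio itself is simple arithmetic on measured abundances. The sketch below assumes the common definition NCR = 24-norcholestane / (24-norcholestane + 27-norcholestane); this definition is an assumption not stated explicitly in the text and should be checked against the primary literature:

```python
def norcholestane_ratio(c26_24nor, c26_27nor):
    """24-norcholestane ratio (NCR) from measured peak abundances.

    Assumes NCR = 24-nor / (24-nor + 27-nor); higher values are
    associated with younger, diatom-influenced source rocks."""
    total = c26_24nor + c26_27nor
    if total == 0:
        raise ValueError("no norcholestane signal measured")
    return c26_24nor / total
```

For example, peak abundances of 3.0 and 7.0 (arbitrary units) give an NCR of 0.3.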
Structure
24-Norcholestane is a tetracyclic compound, with 20R,5α(H),14α(H),17α(H) stereochemistry, derived from steroids or sterols. It consists of three 6-membered rings and one 5-membered ring, with carbon 24 removed from the side chain off of C17.
Background
24-Norcholestane is a 26-carbon (C26) sterane created from the removal of carbon 24 from cholestane. It has been found that 24-norcholestane is relatively high in abundance, up to 10% of sterols, in Thalassiosira aff. antarctica, a diatom. It has also been found in the dinoflagellate Gymnodinium simplex, albeit at much lower levels (around 0.2% of sterols).
Origins
Since the origins of 24-norcholestane are still unknown, its synthesis pathway is likewise unknown. However, some pathways have been proposed. Possible sources of 24-norcholestane include 24-norcholesterol, which is present in many marine invertebrates and some algae in addition to diatoms and dinoflagellates.
Measurement techniques
Samples are collected from rocks or crude oils. Asphaltenes are first extracted before the sample is fractionated by passing through a silica column and eluting with solvents of increasing polarity. Traditional gas chromatography-mass spectrometry (GC/MS) techniques are not used, as C26 steranes are present in samples in much lower quantities, generally a magnitud |
https://en.wikipedia.org/wiki/Delta-matroid | In mathematics, a delta-matroid or Δ-matroid is a family of sets obeying an exchange axiom generalizing an axiom of matroids. A non-empty family F of sets is a delta-matroid if, for every two sets A and B in the family, and for every element x in their symmetric difference A △ B, there exists an element y in A △ B such that A △ {x, y} is in the family. For the basis sets of a matroid, the corresponding exchange axiom requires in addition that x is in A \ B and y is in B \ A, ensuring that A △ {x, y} and A have the same cardinality. For a delta-matroid, either of the two elements may belong to either of the two sets, and it is also allowed for the two elements to be equal.
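The exchange axiom can be verified by brute force on a small family of sets; a minimal sketch (the function name is ours, not from the literature):

```python
def is_delta_matroid(family):
    """Brute-force test of the symmetric exchange axiom:
    for all A, B in the family and all x in the symmetric difference
    A ^ B, some y in A ^ B must make A ^ {x, y} a member of the
    family (x = y is allowed)."""
    fam = {frozenset(s) for s in family}
    if not fam:
        return False  # the family must be non-empty
    for a in fam:
        for b in fam:
            diff = a ^ b  # symmetric difference
            for x in diff:
                if not any(a ^ {x, y} in fam for y in diff):
                    return False
    return True
```

For instance, {∅, {1, 2}} satisfies the axiom (taking x = 1, y = 2 flips both elements at once), while {∅, {1, 2, 3}} does not, since no two-element flip of ∅ lands back in the family.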
An alternative and equivalent definition is that a family of sets forms a delta-matroid when the convex hull of its indicator vectors (the analogue of a matroid polytope) has the property that every edge length is either one or the square root of two.
Delta-matroids were defined by André Bouchet in 1987.
Algorithms for matroid intersection and the matroid parity problem can be extended to some cases of delta-matroids.
Delta-matroids have also been used to study constraint satisfaction problems. As a special case, an even delta-matroid is a delta-matroid in which either all sets have an even number of elements, or all sets have an odd number of elements. If a constraint satisfaction problem has a Boolean variable on each edge of a planar graph, and if the variables of the edges incident to each vertex of the graph are constrained to belong to an even delta-matroid (possibly a different even delta-matroid for each vertex), then the problem can be solved in polynomial time. This result plays a key role in a characterization of the planar Boolean constraint satisfaction problems that can be solved in polynomial time.
References
Matroid theory |
https://en.wikipedia.org/wiki/Public%20Radio%20Service | Public Radio Service (PRS) is a walkie-talkie personal radio service in the People's Republic of China, including Hong Kong and Macau, but excluding Taiwan. It can be used without a license. It uses 409 MHz. It is also known as PRS409.
It is similar to the American Family Radio Service (FRS) and PMR446 in the European Union.
Technical information
The PRS radios use narrow-band frequency modulation (NBFM) with a maximum deviation of 2.5 kHz. The channels are spaced at 12.5 kHz intervals. They are limited to 500 milliwatts effective radiated power.
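The channel plan can be generated from the spacing. The sketch below assumes the commonly cited 20-channel layout beginning at 409.7500 MHz; the start frequency and channel count are assumptions, as only the 12.5 kHz spacing and the 409 MHz band appear in the text:

```python
def prs_channels(start_mhz=409.7500, spacing_khz=12.5, count=20):
    """Generate PRS409 channel frequencies in MHz.

    The 20-channel plan starting at 409.7500 MHz is an assumption;
    only the 12.5 kHz spacing comes from the article."""
    step = spacing_khz / 1000.0  # convert kHz to MHz
    return [round(start_mhz + i * step, 4) for i in range(count)]
```

Under these assumptions, the last channel falls at 409.9875 MHz.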
See also
70-centimeter band
CDCSS
Continuous Tone-Coded Squelch System (CTCSS)
General Mobile Radio Service
KDR 444
LPD433
Multi-Use Radio Service
Personal radio service
Personal radio service#Taiwan
UHF CB
Notes
References
Bandplans
Radio hobbies
Radio regulations
Radio technology |
https://en.wikipedia.org/wiki/Allan%20Hills%2077005 | Allan Hills 77005 (also known as Allan Hills A77005, ALHA77005, ALH77005 and ALH-77005) is a Martian meteorite that was found in the Allan Hills of Antarctica in 1977 by a Japanese National Institute of Polar Research mission team and ANSMET. Like other members of the group of SNCs (shergottite, nakhlite, chassignite), ALH-77005 is thought to be from Mars.
Description
On discovery, the mass of ALH-77005 was . Initial geological examination determined that the meteorite was composed of ~55% olivine, ~35% pyroxene, ~8% maskelynite and ~2% opaques.
In March 2019, researchers reported the possibility of biosignatures in this Martian meteorite based on its microtexture and morphology as detected with optical microscopy and FTIR-ATR microscopy, and on the detection of mineralized organic compounds, suggesting that microbial life could have existed on the planet Mars. More broadly, and as a result of their studies, the researchers suggest Solar System materials should be carefully studied to determine whether there may be signs of microbial forms within other space rocks as well.
See also
Allan Hills 84001
Glossary of meteoritics
History of Mars observation
Life on Mars
List of Martian meteorites on Earth
List of meteorites on Mars
Nakhla meteorite
Mars sample return mission
Panspermia
Shergotty meteorite
Water on Mars
References
Further reading
External links
Meteoritical Society
The British and Irish Meteorite Society
The Natural History Museum's meteorite catalogue database
Astrobiology
Extraterrestrial life
Martian meteorites
Meteorites found in Antarctica |
https://en.wikipedia.org/wiki/Future%2050%20Foods%20report | The Future 50 Foods report, subtitled "50 foods for healthier people and a healthier planet", was published in February 2019 by the World Wide Fund for Nature (WWF) and Knorr. It identifies 50 plant-based foods that can increase dietary nutritional value and reduce environmental impacts of the food supply, promoting sustainable global food systems.
Description
The report identifies 12 plant sources and five animal sources that make up 75 percent of the food humans consume, and three crops (wheat, corn and rice) accounting for about "60 percent of the plant-based calories in most diets". The report points out that lack of variety in food sources threatens food security, and "repeatedly harvesting the same crop on the same land depletes nutrients in the soil, leading to intensive use of fertilizers and pesticides that, when misused, can hurt wildlife and damage the environment".
The report offers five steps to identifying a future food: "focus on plant-based foods, optimize nutrient density, evaluate environmental impact, consider culture and flavor, and deliver diversity."
Criteria for inclusion on the list of 50 foods indicated they must be "highly nutritious, have as little impact on the environment as possible, affordable, accessible, and of course, tasty". The foods are grouped into categories:
Algae
Algae are rich in protein and contain essential fatty acids and antioxidants, and are a potential replacement for meat.
1. Laver seaweed Porphyra umbilicalis
2. Wakame seaweed Undaria pinnatifida
Beans and pulses
Beans are in the legume family, and are a source of fiber, protein and B vitamins.
3. Adzuki beans Vigna angularis
4. Black turtle beans Phaseolus vulgaris
5. Broad beans (fava beans) Vicia faba
6. Bambara groundnuts/Bambara beans Vigna subterranea
7. Cowpeas Vigna unguiculata
8. Lentils Lens culinaris
9. Marama beans Tylosema esculentum
10. Mung beans Vigna radiata
11. Soy beans Glycine max
Cacti
Cacti contain vitamins C and E, carotenoids, |
https://en.wikipedia.org/wiki/Neuroprivacy | Neuroprivacy, or "brain privacy," is a concept which refers to the rights people have regarding the imaging, extraction and analysis of neural data from their brains. This concept is highly related to fields like neuroethics, neurosecurity, and neurolaw, and has become increasingly relevant with the development and advancement of various neuroimaging technologies. Neuroprivacy is an aspect of neuroethics specifically regarding the use of neural information in legal cases, neuromarketing, surveillance and other external purposes, as well as corresponding social and ethical implications.
History
Neuroethical concepts such as neuroprivacy developed initially in the 2000s, after the initial invention and development of neuroimaging techniques such as positron emission tomography (PET), electroencephalography (EEG), and functional magnetic resonance imaging (fMRI). As neuroimaging became highly studied and popularized in the 1990s, it also started entering the commercial market as entrepreneurs sought to market the practical applications of neuroscience, such as neuromarketing, neuroenhancement and lie detection. Neuroprivacy consists of the privacy issues raised by both neuroscience research and applied uses of neuroimaging techniques. The relevance of neuroprivacy debate increased significantly after the 9/11 terrorist attacks, which led to a push for increased neuroimaging in the context of information/threat detection and surveillance.
Neuroanalysis techniques
Brain fingerprinting
Brain fingerprinting is a controversial and unproven EEG technique that relies on identifying the P300 event-related potential, which is correlated with recognition of some stimulus. The purpose of this technique is to determine if a person has incriminating information or memory. In its current state, brain fingerprinting is only able to determine the existence of information, and is unable to provide any specific details about that information. Its creator, Dr. Lawrence Farwe |
https://en.wikipedia.org/wiki/Generalized%20Cohen%E2%80%93Macaulay%20ring | In algebra, a generalized Cohen–Macaulay ring is a commutative Noetherian local ring of Krull dimension d > 0 that satisfies any of the following equivalent conditions:
For each integer i < d, the length of the i-th local cohomology module of A is finite:
length(H_m^i(A)) < ∞.
sup_Q (length(A/Q) − e(Q)) < ∞, where the sup is over all parameter ideals Q and e(Q) is the multiplicity of Q.
There is an m-primary ideal Q such that for each system of parameters x_1, ..., x_d in Q,
For each prime ideal p of Spec(A) that is not the maximal ideal m, dim A_p + dim A/p = d and A_p is Cohen–Macaulay.
The last condition implies that the localization A_p is Cohen–Macaulay for each prime ideal p other than m.
A standard example is the local ring at the vertex of an affine cone over a smooth projective variety. Historically, the notion grew out of the study of a Buchsbaum ring, a Noetherian local ring A in which length(A/Q) − e(Q) is constant for parameter ideals Q; see the introduction of.
References
Ring theory |
https://en.wikipedia.org/wiki/PureVPN | PureVPN is a commercial VPN service and a part of PureSquare. It is owned by GZ Systems Ltd. Founded in 2007, the company is based in the British Virgin Islands.
PureVPN allows users to select from four categories: Stream, Internet Freedom, Security/Privacy, and File Sharing. The user's selection then determines the servers through which their traffic is routed. PureVPN's 6,500 high-speed servers are located in 78 countries. PureVPN requires users to provide their real names to use the service. It stores the day and the Internet service provider through which a user accesses the service, but does not store the name of the website visited or the actual time of access.
The service has been criticized for having inconsistent speeds, being unable to access Netflix videos, and having usability problems. It has been praised for its feature set.
History
PureVPN is owned by GZ Systems Limited, a software company that creates cybersecurity apps. Its mailing address is in Tortola, the British Virgin Islands. PureVPN was co-founded by Uzair Gadit who is based in Pakistan. Founded in 2007, it employs contractors in the United States, United Kingdom, Ukraine, Pakistan, the British Virgin Islands, and formerly Hong Kong.
Technology
PureVPN's homepage allows users to select from four categories: Stream, Internet Freedom, Security/Privacy, and File Sharing. Other configuration options include the transport protocol and split tunneling. PureVPN offers users the option to turn on the "VPN Hotspot", allowing other devices to use the PureVPN hotspot connection. PureVPN provides desktop clients for Linux, macOS, and Microsoft Windows and mobile clients for Android and iOS. PureVPN can be run on five sessions at the same time. It allocated 200 servers for peer-to-peer file sharing and BitTorrent usage but does not provide any servers for accessing the Tor network. PureVPN has over 6,500 high-speed servers across 78 countries and supports the IPsec/IKEv2 protocol.
In August 20 |
https://en.wikipedia.org/wiki/Aftown | aftown is an Internet-based music download and streaming service. Aftown as a company consists of two main platforms: streaming and downloads. The aftown app is free and generates revenue through premium subscriptions and advertisements shown to free users.
It is currently the top indigenous music streaming and download site in West Africa.
History
In September 2016, the technation team collaborated with M.anifest to release his album "No Where Cool" on 15ghana.com, an eCommerce website. Upon release, the album sold over a thousand copies across the country within a week. The sale of M.anifest's album presented several challenges, especially in regard to delivery, since the project involved the sale of CDs. These challenges and the overall success of the project led the technation gh team to come up with the idea for aftown, a name coined from the words Africa and Town.
In October 2016, aftown was officially launched. Upon its launch, aftownmusic had its first major sales from Sonnie Badu's album "Soundz Of Afrika". Aftown went on to sell albums from top Ghanaian artists like Sarkodie, MzVee, Akan, M.anifest, Joe Mettle, EL, Stonebwoy, Okyeame Kwame, Adomaa, and Teephlow.
Aftown generated about GHC 80,000 in artist revenue in its first year.
The transition from 15ghana to aftown was a positive one; however, there were still challenges that had to be addressed, hence the inception of aftownmusic. Due to feedback from aftown users, a streaming platform was introduced in June 2017. Aftownmusic's first big project was with Akan's album "Onipa Akoma", which was a huge success.
Business Model
Aftownmusic is a free music streaming platform. Revenue is generated from premium subscriptions and advertisements shown to free users. Aftown encourages users to support indigenous artists by avoiding piracy and streaming local content. Artistes gain revenue from their total play counts.
Accounts And Subscriptions
Aftownmusic has two distinct apps on both android and iOS for |
https://en.wikipedia.org/wiki/Guided%20analytics | Guided analytics is a sub-field at the interface of visual analytics and predictive analytics focused on the development of interactive visual interfaces for business intelligence applications. Such interactive applications help the analyst make important decisions by easily extracting information from the data.
Overview
Guided analytics applications lie in the intersection between business intelligence and predictive analytics. A great number of business analysts rely on business intelligence tools to flexibly extract specific information from data. It is often required to automatically run an analysis on the raw data before information can be extracted. However, it is not always possible to automate the entire process from any kind of raw data access to the extraction of useful information. Furthermore, the expertise of data scientists will be necessary each time new data or questions come into the picture. This is especially true when predictive analytics (machine learning) is applied.
To create an application that is flexible to different data problems and usable by the domain experts without continuous help by a data scientist, it is required to insert a number of interaction points in the analysis process. The interactions will determine the sequence of steps in the analysis. In this way, the application guides the user with no need of customization by the data science expert. Guided analytics is about building such interactive applications. By mixing and matching automation and interaction, guided analytics applications empower business analysts to independently extract insights and future outcomes from the data.
History
The term “guided analytics” was coined for the first time in an online magazine by a TIBCO expert in 2004. Back then, predictive analytics in business intelligence was fairly new. Guided analytics applications were focused entirely on interactive visualization to ease the access to trustworthy KPI metrics through a database. This was |
https://en.wikipedia.org/wiki/Museo%20de%20Mujeres%20Artistas%20Mexicanas | The Museo de Mujeres Artistas Mexicanas or MUMA (The Museum of Mexican Women Artists) is a virtual museum exhibiting the work of Mexican women artists, founded by the photographer Lucero González in 2008 to show the work of Mexican women in distinct fields of the arts. The museum's advisory board both curates the works exhibited and conducts research related to the artists and their work.
The first exhibition of the museum featured the work of 50 artists including Leonora Carrington, Lola Álvarez Bravo, Laura Anderson, Mariana Yampolsky and Graciela Iturbide, and feminist artists like Helen Escobedo, Marta Lick, Silvia Navarrete and Lucero González, among others. It was produced with support from the Mexican Society for Women's Rights, AC, Semillas (Seeds), and is a non-profit project.
By 2015, the museum had exhibited the work of 270 Mexican women artists.
Advisory board
The museum's multidisciplinary advisory board members are:
References
Virtual museums
Arts organizations based in Mexico
|
https://en.wikipedia.org/wiki/Normally%20flat%20ring | In algebraic geometry, a normally flat ring along a proper ideal I is a local ring A such that I^n/I^(n+1) is flat over A/I for each integer n ≥ 0.
The notion was introduced by Hironaka in his proof of the resolution of singularities as a refinement of equimultiplicity and was later generalized by Alexander Grothendieck and others.
References
Herrmann, M., S. Ikeda, and U. Orbanz: Equimultiplicity and Blowing Up. An Algebraic Study with an Appendix by B. Moonen. Springer Verlag, Berlin Heidelberg New-York, 1988.
Algebraic geometry |
https://en.wikipedia.org/wiki/MeWatch | meWatch (stylized as mewatch, and formerly as meWATCH) is a Singaporean digital video on demand service brand owned by Mediacorp. It was launched on 1 February 2013 as Toggle, an over-the-top media service and entertainment and lifestyle website.
On 1 April 2015, xinmsn, an internet portal which was a joint venture between MediaCorp and Microsoft, was closed down and merged into Toggle. On 30 January 2020, Toggle was renamed meWATCH.
Content
meWatch offers worldwide audiences streaming and on-demand programs from Mediacorp's archived library as well as original web series. In addition, meWatch offers live streaming of Mediacorp's free-to-air channels (exclusive to Singapore). It also offers catch-up viewing, letting viewers watch prime-time shows they have missed from the previous few days.
In 2016, meWatch (then Toggle) began to offer made-for-digital productions under the brand Toggle Originals. Certain original series, such as I Want to Be a Star, may be telecast on Mediacorp's television channels, typically in late-night programming slots.
In 2019, meWatch began offering additional content from HBO Go for Singapore residents only. In the same year, Mediacorp and Wattpad inked a partnership to develop scripted series and films for meWatch and Mediacorp FTA channels, based on stories by Singapore-based Wattpad writers. On 30 May 2019, meWatch began offering on-demand Korean movies from tvN Movies.
On 30 January 2020, Toggle was renamed meWATCH. At the same time, Mediacorp made a content deal with HOOQ to stream the latter's original content. However, on 27 March 2020, HOOQ filed for liquidation and shut down on 30 April 2020; it was eventually acquired by Coupang in July 2020 to serve as the basis of its streaming service, Coupang Play.
Most recently, meWatch has carried anime from Medialink, Mighty Media and Muse Communication, dozens of Mandarin series from Chinese production |
https://en.wikipedia.org/wiki/Essential%20Video%20Coding | MPEG-5 Essential Video Coding (EVC) is a video compression standard completed in April 2020 by decision of MPEG Working Group 11 at its 130th meeting.
The standard consists of a royalty-free subset and individually switchable enhancements.
Concept
The publicly available requirements document outlines a development process that is defensive against patent threats: Two sets of coding tools, base and enhanced, are defined:
The base set consists of tools that were made public more than 20 years ago or for which a Type 1 declaration has been received. Type 1, or option 1, means "royalty-free" in the nomenclature used in ISO documents.
The "enhanced" set consists of 21 other tools which have passed an extra compression efficiency justification and which can be disabled individually.
Each of the 21 payable tools can have separately acquired, separately negotiated, and separately traded license agreements. Each can be individually turned off and, when necessary, replaced by a corresponding cost-free baseline profile tool. This structure makes it easy to fall back to a smaller set of tools in the future if, for example, licensing complications occur around a specific tool, without breaking compatibility with already deployed decoders.
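The per-tool fallback mechanism can be sketched as a simple selection step; the tool names below are invented for illustration and do not come from the standard:

```python
# Hypothetical illustration of EVC's per-tool fallback: each enhanced
# tool can be disabled individually, in which case the corresponding
# baseline tool is used instead. Tool names are invented for the sketch.
BASELINE_FALLBACK = {
    "enhanced_intra": "baseline_intra",
    "enhanced_deblock": "baseline_deblock",
    "enhanced_mv_pred": "baseline_mv_pred",
}

def select_tools(licensed):
    """Return the toolset to use, given the set of enhanced tools
    for which a license has been secured."""
    return {
        enhanced if enhanced in licensed else fallback
        for enhanced, fallback in BASELINE_FALLBACK.items()
    }
```

Disabling a single problematic tool thus degrades only that tool's contribution, rather than forcing a wholesale switch to a different profile.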
A proposal by Samsung, Huawei and Qualcomm forms the basis of EVC.
Implementations
XEVE (eXtra-fast Essential Video Encoder) is self-described as a fast open source EVC encoder. It is written in C99 and supports both the baseline and main profiles of EVC. Its license is a custom 3-clause BSD license.
MPAI-EVC
MPAI aims to significantly enhance the performance of EVC by improving or replacing traditional tools with AI-based tools, with the goal of reaching at least 25% improvement over the baseline profile of EVC.
See also
MPEG-5 Part 2 / Low Complexity Enhancement Video Coding / LC EVC
H.266 / MPEG-I Part 3 / Versatile Video Coding / VVC
AV1
IP core - Semiconductor intellectual property core - Licensing scheme |
https://en.wikipedia.org/wiki/RF%20chain | An RF chain is a cascade of electronic components and sub-units which may include amplifiers, filters, mixers, attenuators and detectors. It can take many forms, for example, as a wide-band receiver-detector for electronic warfare (EW) applications, as a tunable narrow-band receiver for communications purposes, as a repeater in signal distribution systems, or as an amplifier and up-converters for a transmitter-driver. In this article, the term RF (radio frequency) covers the frequency range "Medium Frequencies" up to "Microwave Frequencies", i.e. from 100 kHz to 20 GHz.
The key electrical parameters for an RF chain are system gain, noise figure (or noise factor) and overload level. Other important parameters, related to these properties, are sensitivity (the minimum signal level which can be resolved at the output of the chain); dynamic range (the total range of signals that the chain can handle from a maximum level down to smallest level that can be reliably processed) and spurious signal levels (unwanted signals produced by devices such as mixers and non-linear amplifiers). In addition, there may be concerns regarding the immunity to incoming interference or, conversely, the amount of undesirable radiation emanating from the chain. The tolerance of a system to mechanical vibration may be important too. Furthermore, the physical properties of the chain, such as size, weight and power consumption may also be important considerations.
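The cascade behaviour of these parameters follows standard formulas: gains add in dB, and the overall noise factor is given by the Friis formula. A minimal sketch with illustrative stage values (the example line-up is ours, not taken from this article):

```python
import math

def cascade(stages):
    """Friis formula for a chain of two-port stages.

    `stages` is a list of (gain_dB, noise_figure_dB) tuples in signal
    order. Returns (total_gain_dB, total_noise_figure_dB)."""
    f_total = 1.0    # running noise factor (linear)
    g_running = 1.0  # linear gain of all stages preceding the current one
    for gain_db, nf_db in stages:
        f = 10 ** (nf_db / 10)          # stage noise factor, linear
        f_total += (f - 1) / g_running  # Friis: later stages divided by prior gain
        g_running *= 10 ** (gain_db / 10)
    return 10 * math.log10(g_running), 10 * math.log10(f_total)
```

For an illustrative line-up of an LNA (20 dB gain, 1 dB NF), a mixer (−6 dB, 7 dB NF) and an IF amplifier (30 dB, 4 dB NF), the chain gain is 44 dB and the overall noise figure stays close to the LNA's, at roughly 1.33 dB, showing why a low-noise first stage dominates.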
In addition to considering the performance of the RF chain, the signal and signal-to-noise requirements of the various signal processing components which may follow it are discussed, because they often determine the target figures for a chain.
Parameter sets
Each two-port network in an RF chain can be described by a parameter set, which relates the voltages and currents appearing at the terminals of that network. Examples are: impedance parameters, i.e. z-parameters; admittance parameters, i.e. y-parameters or, for high frequen |
https://en.wikipedia.org/wiki/Digico%20Limited |
Digico was a British computer company founded in 1965 by Keith Trickett and Avo Hiiemae, two ex-ICL electronics engineers. Former MP Eric Lubbock became chairman in 1969. The company was based in Letchworth initially, moving to a new factory in Stevenage in 1973 and employing about 90 staff.
Digico's first product was a laboratory data-logging and spectrum analyser hardware system named DIGIAC. This product had been developed before Digico was formed, so it was an immediate source of income. Digico soon developed a 16-bit minicomputer series, the Micro 16, for which it was best known.
Digico Micro 16
Digico quickly started developing a general purpose single accumulator 16-bit minicomputer, the Micro 16, which became available in 1966. Digico was assisted by the Ministry of Technology and the National Research Development Corporation in this development. The first version produced was the Digico Micro 16S (1968), followed by the 16P (1970), then the 16V in 1972.
The Digico Micro 16V had a standard memory of 4K words with a 950-nanosecond cycle time, expandable to 64K words, and was able to support up to 64 external interfaces. It had an optional microprogrammed floating-point unit. The Micro 16V was supported by a simple and flexibly sized executive that could optionally support multiprogramming, disc files and teletypes. The Micro 16V used semiconductor memory, rather than the magnetic-core memory of the previous models.
Digico primarily sold into the data logging market until 1969, when it expanded into areas like process control, stock control and front-end processors for the ICL 1900 mainframe. In 1974 Digico had a turnover of over £1 million and in 1977 well over £1 million.
In 1978 the Digico Micro 16E stackable minicomputer, which was well suited to an office environment, won a Design Council Award for Engineering Products.
See also
Computer Technology Limited
PDP-8
References
External links
Digico Micro 16V, Time-Line Com |
https://en.wikipedia.org/wiki/Sims%20conjecture | In mathematics, the Sims conjecture is a result in group theory, originally proposed by Charles Sims. He conjectured that if G is a primitive permutation group on a finite set S and G_a denotes the stabilizer of the point a in S, then there exists an integer-valued function f such that |G_a| ≤ f(d) whenever d is the length of an orbit of G_a in the set S \ {a}.
The conjecture was proven by Peter Cameron, Cheryl Praeger, Jan Saxl, and Gary Seitz using the classification of finite simple groups, in particular the fact that only finitely many isomorphism types of sporadic groups exist.
Thus, in a primitive permutation group with "large" stabilizers, these stabilizers cannot have any small orbit. A consequence of their proof is that there exist only finitely many connected distance-transitive graphs of any given degree greater than 2.
References
Algebraic graph theory
Finite groups
Permutation groups
Theorems in graph theory
Theorems in group theory |
https://en.wikipedia.org/wiki/Reinsurance%20to%20close | Reinsurance to close (RITC) is a business transaction whereby the estimated future liabilities of an insurance company are reinsured into another, in order that the profitability of the former can be finally determined. It is most closely associated with the Lloyd's of London insurance market that comprises numerous competing "syndicates", and in order to close each accounting year and declare a profit or loss, each syndicate annually "reinsures to close" its books. In most cases, the liabilities are simply reinsured into the subsequent accounting year of the same syndicate, however, in some circumstances the RITC may be made to a different syndicate or even to a company outside of the Lloyd's market.
History
At Lloyd's, traditionally each year of each syndicate is a separate enterprise, and the profitability of each year is determined essentially by payments for known liabilities (claims) and money reserved for unknown liabilities that may emerge in the future on claims that have been incurred but not reported (IBNR). The estimation of the quantity of IBNR is difficult and can be inaccurate.
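As a toy illustration of the mechanics (all figures hypothetical, not from the article): a closing year pays an RITC premium covering its estimated outstanding and IBNR liabilities, and the accepting year books that premium as income:

```python
def year_of_account_result(premiums, paid_claims, expenses, ritc_premium_paid,
                           ritc_premium_received=0.0):
    """Hypothetical sketch of a syndicate year-of-account result at closing.

    The RITC premium paid transfers the estimated outstanding and IBNR
    liabilities to the accepting year, which books it as
    ritc_premium_received. All figures are illustrative."""
    return (premiums + ritc_premium_received
            - paid_claims - expenses - ritc_premium_paid)
```

For example, a year with 100 of premium, 40 of paid claims, 10 of expenses, and a 30 RITC premium paid distributes a profit of 20; if the RITC estimate of future claims proves too low, the loss falls on the accepting year, which is why accurate IBNR estimation matters.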
Capital providers typically "joined" their syndicate for one calendar year only, and at the end of the year the syndicate as an ongoing trading entity was effectively disbanded. However, usually the syndicate re-formed for the next calendar year with more or less the same capital membership. In this way, a syndicate could have a continuous existence for many years, but each year was accounted for separately. Since some claims can take time to be reported and then paid, the profitability of each syndicate took time to realise. The practice at Lloyd's was to wait three years from the beginning of the year in which the business was written before "closing" the year and declaring a result. For example, for the 1984 year a syndicate would ordinarily declare its result at 31 December 1986. The syndicate's 1984 members would therefore be paid any profit during 1987 (in proportion to |
https://en.wikipedia.org/wiki/Rubin%20Braunstein | Rubin Braunstein (1922–2018) was an American physicist and educator. In 1955 he published the first measurements of light emission by semiconductor diodes made from crystals of gallium arsenide (GaAs), gallium antimonide (GaSb), and indium phosphide (InP). GaAs, GaSb, and InP are examples of III-V semiconductors. The III-V semiconductors absorb and emit light much more strongly than silicon, which is the best-known semiconductor. Braunstein's devices are the forerunners of contemporary LED lighting and semiconductor lasers, which typically employ III-V semiconductors. The 2000 and 2014 Nobel Prizes in Physics were awarded for further advances in closely related fields.
Braunstein was raised in New York City. He earned a doctorate in physics from Syracuse University in 1954. He then joined the research laboratory of the RCA Corporation, which was among the most active industrial laboratories at the time. In the following decade at RCA Laboratories he published broadly on semiconductor physics and technology. Beyond his seminal work with light emission from III-V semiconductors, in 1964 he exploited newly invented lasers to publish the first paper on two-photon absorption in semiconductors. Typically, only individual photons (particles of light) with some minimum energy are absorbed by a given semiconductor. For very high intensity beams of light, two photons, each with half that minimum energy, can be absorbed simultaneously. He also published highly cited foundation papers on the electronic, optical, and vibrational properties of III-V semiconductors, silicon, and germanium.
In 1964 Braunstein became a professor of physics at University of California, Los Angeles (UCLA), where he remained for the rest of his career. His research there continued his RCA work with optoelectronic properties of semiconductors as well as contributions related to the optical properties of highly transparent materials such as tungstate glasses. Some of Braunstein's work was theoretical, |
https://en.wikipedia.org/wiki/Microwave%20welding | Microwave welding is a plastic welding process that utilizes alternating electromagnetic fields in the microwave band to join thermoplastic base materials that are melted by the phenomenon of dielectric heating.
See also
Dielectric heating
Plastic welding
Radio-frequency welding
References
Welding
Radio technology |
https://en.wikipedia.org/wiki/Region%20%28model%20checking%29 | In model checking, a field of computer science, a region is a convex polytope in ℝ^d for some dimension d, and more precisely a zone, satisfying some minimality property. The regions partition ℝ^d.
The set of zones depends on a set C of constraints of the form x ∼ c and x − y ∼ c, with x and y some clock variables, c a constant, and ∼ a comparison such as < or ≤. The regions are defined such that if two vectors ν and μ belong to the same region, then they satisfy the same constraints of C. Furthermore, when those vectors are considered as tuples of clocks, both vectors have the same set of possible futures. Intuitively, it means that any timed propositional temporal logic formula, timed automaton, or signal automaton using only the constraints of C cannot distinguish the two vectors.
The set of regions allows one to construct the region automaton, which is a directed graph in which each node is a region, and each edge from r to r′ ensures that r′ is a possible future of r. Taking the product of this region automaton and of a timed automaton A which accepts a timed language L creates a finite automaton or a Büchi automaton which accepts the untimed version of L. In particular, this reduces the emptiness problem for A to the emptiness problem for a finite or Büchi automaton. This technique is used, for example, by the software UPPAAL.
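The region equivalence that this construction rests on can be sketched in code. The following Python helper is an illustrative sketch only (the function name, dictionary representation, and the 1e-12 tie tolerance are choices made for the example, not part of any tool): it maps a clock assignment to a canonical signature such that two assignments share a signature exactly when they lie in the same region — same capped clocks, same integer parts, same clocks sitting exactly on an integer, and the same ordering (with ties) among fractional parts.

```python
import math

def region_signature(nu, ceilings):
    """Canonical signature of the region containing the clock assignment
    `nu` (dict: clock name -> nonnegative real), given per-clock ceilings
    c_x (the largest constants each clock is compared against).
    Two assignments lie in the same region iff their signatures are equal."""
    # Clocks above their ceiling can never again affect any comparison.
    capped = frozenset(x for x in ceilings if nu[x] > ceilings[x])
    relevant = sorted(x for x in ceilings if x not in capped)
    int_parts = tuple(math.floor(nu[x]) for x in relevant)
    frac = {x: nu[x] - math.floor(nu[x]) for x in relevant}
    # Which clocks sit exactly on an integer, and how the remaining
    # fractional parts are ordered, with equal fractions grouped together.
    zero_frac = frozenset(x for x in relevant if frac[x] == 0)
    groups = []
    for x in sorted(relevant, key=lambda x: frac[x]):
        if groups and abs(frac[x] - frac[groups[-1][-1]]) < 1e-12:
            groups[-1].append(x)
        else:
            groups.append([x])
    ordering = tuple(frozenset(g) for g in groups)
    return (capped, int_parts, zero_frac, ordering)
```

For instance, (x, y) = (0.3, 0.7) and (0.4, 0.9) share a region, while (0.5, 0.5) does not share theirs, because its two fractional parts coincide and so its possible futures differ.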
Definition
Let X be a set of clocks. For each clock x ∈ X, let c_x be a constant. Intuitively, this number represents an upper bound on the values to which the clock x can be compared. The definition of a region over the clocks of X uses those numbers c_x. Three equivalent definitions are now given.
Given a clock assignment ν, [ν] denotes the region to which ν belongs. The set of regions is denoted by R.
Equivalence of clock assignments
The first definition allows one to test easily whether two assignments belong to the same region.
A region may be defined as an equivalence class for some equivalence relation. Two clock assignments ν and μ are equivalent if they satisfy the following constraints:
ν(x) ∼ c iff μ(x) ∼ c, for each clock x ∈ X and each integer c ≤ c_x, with ∼ being one of the relations =, < |
https://en.wikipedia.org/wiki/DPLL%28T%29 | In computer science, DPLL(T) is a framework for determining the satisfiability of SMT problems. The algorithm extends the original SAT-solving DPLL algorithm with the ability to reason about an arbitrary theory T. At a high level, the algorithm works by transforming an SMT problem into a SAT formula where atoms are replaced with Boolean variables. The algorithm repeatedly finds a satisfying valuation for the SAT problem, consults a theory solver to check consistency under the domain-specific theory, and then (if a contradiction is found) refines the SAT formula with this information.
Many modern SMT solvers, such as Microsoft's Z3 Theorem Prover, use DPLL(T) to power their core solving capabilities.
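The loop described above — find a Boolean model of the abstraction, check it against the theory, and learn a blocking clause on conflict — can be sketched in a few lines of Python. This is an illustrative toy, not how Z3 is built: the brute-force `find_sat_model` stands in for a real CDCL SAT solver, and real solvers learn much smaller theory lemmas than a full blocking clause.

```python
from itertools import product

def find_sat_model(atoms, clauses):
    """Brute-force SAT 'solver': a clause is a list of (atom, polarity)
    pairs, satisfied when some atom takes its listed polarity."""
    for values in product([False, True], repeat=len(atoms)):
        model = dict(zip(atoms, values))
        if all(any(model[a] == pol for a, pol in c) for c in clauses):
            return model
    return None

def dpll_t(atoms, clauses, theory_consistent):
    """Lazy SMT loop: repeatedly get a Boolean model of the abstraction,
    ask the theory solver whether it is T-consistent, and on conflict
    refine the formula with a clause blocking that assignment."""
    learned = []
    while True:
        model = find_sat_model(atoms, clauses + learned)
        if model is None:
            return None                     # unsatisfiable modulo T
        if theory_consistent(model):
            return model                    # T-satisfiable model found
        learned.append([(a, not v) for a, v in model.items()])

# Toy theory solver: the atoms name integer equalities, so "x=1" and
# "x=2" cannot both hold at once.
def consistent(model):
    return not (model["x=1"] and model["x=2"])
```

With clauses requiring both atoms, `dpll_t` returns `None`; requiring at least one of them, it returns a model in which exactly one equality is true.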
References
Automated theorem proving
SAT solvers
Constraint programming |
https://en.wikipedia.org/wiki/Transcription%20translation%20feedback%20loop | Transcription-translation feedback loop (TTFL) is a cellular model for explaining circadian rhythms in behavior and physiology. Widely conserved across species, the TTFL is auto-regulatory: transcription of clock genes is regulated by their own protein products.
Discovery
Circadian rhythms have been documented for centuries. For example, French astronomer Jean-Jacques d’Ortous de Mairan noted the periodic 24-hour movement of Mimosa plant leaves as early as 1729. However, science has only recently begun to uncover the cellular mechanisms responsible for driving observed circadian rhythms. The cellular basis of circadian rhythms is supported by the fact that rhythms have been observed in single-celled organisms.
Beginning in the 1970s, experiments conducted by Ron Konopka and colleagues, in which forward genetic methods were used to induce mutation, revealed that Drosophila melanogaster specimens with altered period (Per) genes also demonstrated altered periodicity. As genetic and molecular biology experimental tools improved, researchers further identified genes involved in sustaining normal rhythmic behavior, giving rise to the concept that internal rhythms are modified by a small subset of core clock genes. Hardin and colleagues (1990) were the first to propose that the mechanism driving these rhythms was a negative feedback loop. Subsequent major discoveries confirmed this model; notably experiments led by Thomas K. Darlington and Nicholas Gekakis in the late 1990s that identified clock proteins and characterized their methods in Drosophila and mice, respectively. These experiments gave rise to the transcription-translation feedback loop (TTFL) model that has now become the dominant paradigm for explaining circadian behavior in a wide array of species.
General mechanisms of TTFL
The TTFL is a negative feedback loop, in which clock genes are regulated by their protein products. Generally, the TTFL involves two main arms: positive regulatory elements th |
https://en.wikipedia.org/wiki/Pig-a%20gene%20mutation%20assay | Pig-a gene mutation assay is a flow cytometry-based method for detecting mammalian cells that have inactivating mutations in the endogenous X-linked reporter gene called the phosphatidylinositol glycan class A gene (PIG-A in humans and non-human primates, Pig-a in other mammalian species). PIG-A is involved in the synthesis of glycosylphosphatidylinositol (GPI), an anchor molecule that tethers multiple protein marker molecules at the surface of cells. When a sample containing wild-type and PIG-A mutant cells is labeled with fluorescent antibodies raised against GPI-anchored protein markers (such as CD24, CD48, CD55, CD59), the wild-type cells will fluoresce and the PIG-A mutant cells will not. The fraction of non-fluorescent PIG-A mutant cells in the antibody-labeled sample can be efficiently determined on any modern high-throughput flow cytometer. The PIG-A mutant frequency can be determined with high accuracy within minutes by processing samples containing a total of a million cells or more.
Application
Originally, the PIG-A assay was proposed as a method for monitoring humans for somatic mutation. The assay was developed as an extension of the flow cytometric procedure for diagnosing an acquired human genetic disorder, paroxysmal nocturnal hemoglobinuria (PNH). PIG-A assays were developed for cells of peripheral blood, such as red blood cells (RBCs) and white blood cells (WBCs). Due to the conserved nature of GPI biosynthesis in mammalian species, similar flow cytometry protocols were developed for mammalian species of toxicological interest, i.e., mice and rats. The RBC-based Pig-a assay requires only microliter volumes of peripheral blood, which are easy to obtain without harming the animals. Therefore, the Pig-a assay can be added to various non-clinical in vivo safety evaluations mandated by regulatory authorities as a test measuring a valuable end-point, i.e., gene mutation, without requiring additional groups of animals for testing. Government |
https://en.wikipedia.org/wiki/Commercial%20minus%20sign | The commercial minus sign is a typographical and mathematical symbol used in commercial and financial documents in some European languages, in specific contexts.
In some commercial and financial documents, especially in Germany and Scandinavia, the symbol was used to indicate subtraction or to denote a negative quantity. The Unicode Consortium has allocated the code point U+2052 to identify this usage uniquely; the exact form of the displayed symbol is typeface (font) dependent. The symbol is also used in the margins of letters to indicate an enclosure, where the upper point is sometimes replaced with the corresponding number.
The Uralic Phonetic Alphabet uses commercial minus signs to denote borrowed forms of a sound.
In Finland, it is used as a symbol for a correct response (the check mark indicates an incorrect response).
Typographic variant
In Germany, the form was historically an alternative to the formal glyph, since this could be conveniently typed on a typewriter. It also provides a convenient alternative means for typing on a modern keyboard, without needing to resort to Unicode input.
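As a concrete illustration of such Unicode input, the code point can be produced directly by its escape sequence in most programming languages; this small Python snippet is only an example:

```python
# U+2052 COMMERCIAL MINUS SIGN, entered via its Unicode escape.
sign = "\u2052"
print(sign)                   # prints the commercial minus sign
print(f"U+{ord(sign):04X}")   # U+2052
```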
In Japan, the triangle (either △ or ▲) is used as the commercial minus sign.
See also
Obelus, the predecessor of this variant
Arabic percent sign (almost identical symbol except that the dots are squares rather than circles)
Notes
References
Typographical symbols
Mathematical symbols
de:Minuszeichen |
https://en.wikipedia.org/wiki/Vkernel | A virtual kernel architecture (vkernel) is an operating system virtualisation paradigm where kernel code can be compiled to run in the user space, for example, to ease debugging of various kernel-level components, in addition to general-purpose virtualisation and compartmentalisation of system resources. It is used by DragonFly BSD in its vkernel implementation since DragonFly 1.7, having been first revealed in , and first released in the stable branch with DragonFly 1.8 in .
The long-term goal, in addition to easing kernel development, is to make it easier to support internet-connected computer clusters without compromising local security.
Similar concepts exist in other operating systems as well; in Linux, a similar virtualisation concept is known as user-mode Linux; whereas in NetBSD since the summer of 2007, it has been the initial focus of the rump kernel infrastructure.
The virtual kernel concept is nearly the exact opposite of the unikernel concept — with vkernel, kernel components get to run in userspace to ease kernel development and debugging, supported by a regular operating system kernel; whereas with a unikernel, userspace-level components get to run directly in kernel space for extra performance, supported by baremetal hardware or a hardware virtualisation stack. However, both vkernels and unikernels can be used for similar tasks as well, for example, to self-contain software to a virtualised environment with low overhead. In fact, NetBSD's rump kernel, originally having a focus of running kernel components in userspace, has since shifted into the unikernel space as well (going after the anykernel moniker for supporting both paradigms).
The vkernel concept is different from a FreeBSD jail in that a jail is only meant for resource isolation, and cannot be used to develop and test new kernel functionality in the userland, because each jail is sharing the same kernel. (DragonFly, however, still has FreeBSD jail support as well.)
In DragonFly, the v |
https://en.wikipedia.org/wiki/Current%20Opinion%20In%20Food%20Science | Current Opinion in Food Science is a peer-reviewed scientific journal published by Elsevier. It covers the field of food science and nutrition, and is divided into themed sections. The journal was established in 2015 as part of Elsevier's Current Opinion series, which is a collection of journals publishing invited reviews aimed at experts and non-specialists in various disciplines. The current editors-in-chief are A.G. Marangoni (University of Guelph) and A. Sant'Ana (State University of Campinas).
Abstracting and Indexing
The journal is abstracted and indexed in Science Citation Index Expanded, Current Contents/Agriculture, Biology & Environmental Sciences, Essential Science Indicators, and Scopus. According to the Journal Citation Reports, the journal has a 2018 impact factor of 3.828.
See also
Current Opinion (Elsevier)
Food science
References
External links
Elsevier academic journals
Food science journals
Academic journals established in 2015
English-language journals |
https://en.wikipedia.org/wiki/Evolution%20of%20bacteria | The evolution of bacteria has progressed over billions of years since the Precambrian, with their first major divergence from the archaeal/eukaryotic lineage roughly 3.2-3.5 billion years ago. This was discovered through gene sequencing of bacterial nucleoids to reconstruct their phylogeny. Furthermore, evidence of permineralized microfossils of early prokaryotes was also discovered in the Australian Apex Chert rocks, dating back roughly 3.5 billion years, during the Precambrian. This suggests that an organism of the phylum Thermotogota (formerly Thermotogae) was the most recent common ancestor of modern bacteria.
Further chemical and isotopic analysis of ancient rock reveals that by the Siderian period, roughly 2.45 billion years ago, oxygen had appeared. This indicates that oceanic, photosynthetic cyanobacteria evolved during this period, because they were the first microbes to produce oxygen as a byproduct of their metabolism. This phylum was therefore thought to have been predominant roughly 2.3 billion years ago. However, some scientists argue they could have lived as early as 2.7 billion years ago, since this predates the Great Oxygenation Event, giving oxygen levels time to build up in the atmosphere before the event altered the ecosystem.
The rise in atmospheric oxygen led to the evolution of Pseudomonadota (formerly proteobacteria). Today this phylum includes many nitrogen fixing bacteria, pathogens, and free-living microorganisms. This phylum evolved approximately 1.5 billion years ago during the Paleoproterozoic era.
However, there are still many conflicting theories surrounding the origins of bacteria. Even though microfossils of ancient bacteria have been discovered, some scientists argue that the lack of identifiable morphology in these fossils means they can not be utilised to draw conclusions on an accurate evolutionary timeline of bacteria. Nevertheless, more recent |
https://en.wikipedia.org/wiki/Semi-automation | Semi-automation is a process or procedure that is performed by the combined activities of man and machine with both human and machine steps typically orchestrated by a centralized computer controller.
Within manufacturing, production processes may be fully manual, semi-automated, or fully automated. In this case, semi-automation may vary in its degree of manual and automated steps.
Semi-automated manufacturing processes are typically orchestrated by a computer controller which sends messages to the worker at the time in which he/she should perform a step. The controller typically waits for feedback that the human performed step has been completed via either a human-machine interface or via electronic sensors distributed within the process. Controllers within semi-automated processes may either directly control machinery or send signals to machinery distributed within the process. Centralized computer controllers within semi-automated processes orchestrate processes by instructing the worker, providing electronic communication and control to process equipment, tools, or machines, as well as perform data management to record and ensure that the process meets established process criteria.
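The orchestration pattern described above can be illustrated with a minimal, hypothetical controller loop in Python. The step format, `operator_ack` queue, and `machine` callback are invented for this sketch; a real controller would talk to an HMI and to PLC I/O rather than to Python objects.

```python
import queue

def run_process(steps, operator_ack, machine):
    """Walk an ordered recipe of steps: instruct the worker and block on
    confirmation (from an HMI or sensor) for manual steps, signal the
    process equipment directly for automated ones, and log each step
    completed for the process record."""
    log = []
    for step in steps:
        if step["kind"] == "manual":
            print(f"Operator: please perform '{step['name']}'")
            operator_ack.get()          # wait for feedback that it is done
        else:
            machine(step["name"])       # direct control of process equipment
        log.append(step["name"])
    return log
```

The returned log plays the role of the data-management record the passage mentions: evidence that each step, human or machine, occurred in order.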
Many manufacturers choose not to fully automate a process, and instead implement semi-automation due to the complexity of the task, or the number of products produced is too low to justify the investment in full automation. Other processes may not be fully automated because it may reduce the flexibility to easily adapt the processes to reflect production needs.
See also
Automation
Autonomation
Manual labour
Distributed control system
Industrial control system
Control system
References
M. Langer and D. Söffker, "Human guidance and supervision of a manufacturing system for semi-automated production," 2011 IEEE Jordan Conference on Applied Electrical Engineering and Computing Technologies (AEECT), Amman, 2011, pp. 1-6. (LINK)
R. Parasuraman, T. B. Sheridan and C. D. Wicken |
https://en.wikipedia.org/wiki/EulerOS | EulerOS is a commercial Linux distribution developed by Huawei based on Red Hat Enterprise Linux to provide an operating system for server and cloud environments. Its open-source community version is known as openEuler, of which source code was released by Huawei at Gitee on December 31, 2019.
OpenEuler became an open-source project operated by OpenAtom Foundation after Huawei donated the source code of openEuler to the foundation on November 9, 2021.
KunLun Mission Critical Server
EulerOS 2.0, running on the Huawei KunLun Mission Critical Server, was certified to conform to The Open Group's UNIX 03 standard; however, the certification expired in September 2022.
EulerOS/KunLun supports hot swapping of CPU and memory: central processing unit board modules and memory modules can be replaced without stopping the OS.
Code shared with HarmonyOS
EulerOS shares kernel technology with Huawei's mobile operating system, HarmonyOS. Huawei plans to unify additional components between both OSes.
References
External links
EulerOS
OpenEuler
OpenEuler Gitee Repository
EulerOS at Docker Hub
Huawei products
Enterprise Linux distributions
RPM-based Linux distributions
Unix variants
X86-64 Linux distributions
Linux distributions |
https://en.wikipedia.org/wiki/Wizzy%20Active%20Lifestyle%20Telephone | The Wizzy Active Lifestyle Telephone (W.A.L.T.) was a prototype "phone companion" created by Apple Computer in collaboration with BellSouth. W.A.L.T. featured "touchscreen, fax functionality, on-display caller ID, a built-in address book, customizable ringtones, and online banking access". The system was based on the PowerBook 100, and included touchscreen, stylus, and handwriting recognition. The operating system was based on System 6 with a HyperCard GUI. Announced in 1993, the system was not mass-produced. A prototype machine was sold on eBay in 2012 for US$8,000. In 2019 a video demonstration of a prototype machine was uploaded to the internet.
References
External links
YouTube video of a working W.A.L.T. prototype
Apple Inc. hardware
Macintosh platform
Network computer (brand) |
https://en.wikipedia.org/wiki/PebblePost | PebblePost is a New York-based marketing technology company founded by Lewis Gersh, Tom Gibbons and Robert Victor in 2014. The company invented a marketing channel called Programmatic Direct Mail, which takes online web browsing intent data to send relevant direct mail. PebblePost was selected for The ARF's First Innovators A-List and named in the 2016 list of the 100 Most Exciting Startups.
History
The company was founded by Lewis Gersh, Tom Gibbons and Robert Victor in 2014 and is headquartered in New York. Gersh has stated that at first the company had trouble finding support and was "mostly kicked to the curb with it" but the founders persisted, raising a successful seed round followed by the release of their Programmatic Direct Mail technology platform.
In 2018 PebblePost gathered support and funds from Advance Venture Partners, Capital One Growth Ventures and other investors to close out the company's Series C round, which totaled $31 million. PebblePost has also received funding from RRE, Greycroft, Tribeca Venture Partners, and other investors in digital media.
The company has filed a number of utility patents, currently pending, on the production of privacy-compliant, targeted direct mail through its digital-to-direct mail technology. In addition, PebblePost was granted the trademark for “Programmatic Direct Mail” on March 22, 2016. PebblePost currently has approximately 51-200 employees and is headquartered in the NoHo neighborhood of New York City.
Functions
PebblePost is a digital-to-direct mail marketing platform that provides brands with a medium to reach shoppers at home with highly targeted mail. PebblePost operates by using clickstream data from a brand's website visitors to help brands determine which customers are most likely to be interested in certain products at a given time, and then sends branded mail to them within 12–24 hours.
Reception
PebblePost received positive reception with the release of the Programmatic D |
https://en.wikipedia.org/wiki/Genus%20g%20surface | In mathematics, a genus g surface (also known as a g-torus or g-holed torus) is a surface formed by the connected sum of g many tori: the interior of a disk is removed from each of g many tori and the boundaries of the g many disks are identified (glued together), forming a g-torus. The genus of such a surface is g.
A genus g surface is a two-dimensional manifold. The classification theorem for surfaces states that every compact connected two-dimensional manifold is homeomorphic to either the sphere, the connected sum of tori, or the connected sum of real projective planes.
Definition of genus
The genus of a connected orientable surface is an integer representing the maximum number of cuttings along non-intersecting closed simple curves without rendering the resultant manifold disconnected. It is equal to the number of handles on it. Alternatively, it can be defined in terms of the Euler characteristic χ, via the relationship χ = 2 − 2g for closed surfaces, where g is the genus.
The genus (sometimes called the demigenus or Euler genus) of a connected non-orientable closed surface is a positive integer representing the number of cross-caps attached to a sphere. Alternatively, it can be defined for a closed surface in terms of the Euler characteristic χ, via the relationship χ = 2 − g, where g is the non-orientable genus.
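The two Euler-characteristic relations can be captured in a few lines; this is a trivial helper for illustration, and the function names are choices made for the example:

```python
def orientable_genus(chi):
    """Genus g of a closed orientable surface, from chi = 2 - 2g."""
    assert chi <= 2 and chi % 2 == 0, "closed orientable surfaces have even chi <= 2"
    return (2 - chi) // 2

def nonorientable_genus(chi):
    """Non-orientable genus g (number of cross-caps), from chi = 2 - g."""
    assert chi <= 1, "closed non-orientable surfaces have chi <= 1"
    return 2 - chi
```

The sphere (chi = 2) has genus 0 and the torus (chi = 0) has genus 1; the projective plane (chi = 1) has non-orientable genus 1 and the Klein bottle (chi = 0) has non-orientable genus 2.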
Genus 0
An orientable surface of genus zero is the sphere S2. Another surface of genus zero is the disc.
Genus 1
A genus one orientable surface is the ordinary torus. A non-orientable surface of genus one is the projective plane.
Elliptic curves over the complex numbers can be identified with genus 1 surfaces. The formulation of elliptic curves as the embedding of a torus in the complex projective plane follows naturally from a property of Weierstrass's elliptic functions that allows elliptic curves to be obtained from the quotient of the complex plane by a lattice.
Genus 2
The term double torus is occasionally used to denote a genus 2 surfa |
https://en.wikipedia.org/wiki/Magnetic%20space%20group | In solid state physics, the magnetic space groups, or Shubnikov groups, are the symmetry groups which classify the symmetries of a crystal both in space, and in a two-valued property such as electron spin. To represent such a property, each lattice point is colored black or white, and in addition to the usual three-dimensional symmetry operations, there is a so-called "antisymmetry" operation which turns all black lattice points white and all white lattice points black. Thus, the magnetic space groups serve as an extension to the crystallographic space groups which describe spatial symmetry alone.
The application of magnetic space groups to crystal structures is motivated by Curie's principle. Compatibility with a material's symmetries, as described by the magnetic space group, is a necessary condition for a variety of material properties, including ferromagnetism, ferroelectricity, and topological insulation.
History
A major step was the work of Heinrich Heesch, who first rigorously established the concept of antisymmetry as part of a series of papers in 1929 and 1930. Applying this antisymmetry operation to the 32 crystallographic point groups gives a total of 122 magnetic point groups. However, although Heesch correctly laid out each of the magnetic point groups, his work remained obscure, and the point groups were later re-derived by Tavger and Zaitsev. The concept was more fully explored by Shubnikov in terms of color symmetry. When applied to space groups, the number increases from the usual 230 three-dimensional space groups to 1651 magnetic space groups, as found in the 1953 thesis of Alexandr Zamorzaev. While the magnetic space groups were originally found using geometry, it was later shown that the same magnetic space groups can be found using generating sets.
Description
Magnetic space groups
The magnetic space groups can be placed into three categories. First, the 230 colorless groups contain only spatial symmetry, and correspond to the crystallographic spac |
https://en.wikipedia.org/wiki/Second%20Conference%20on%20the%20Epistemology%20of%20the%20Exact%20Sciences | The Second Conference on the Epistemology of the Exact Sciences () was held on 5–7 September 1930 in Königsberg, then located in East Prussia. It was at this conference that Kurt Gödel first publicly presented his incompleteness theorems, though just "in an off-hand remark during a general discussion on the last day". Gödel had, however, first presented the result in Vienna.
The conference was organised by Kurt Reidemeister of the University of Königsberg. The presentations were grouped around two themes: firstly, the foundation of mathematics, and secondly, philosophical questions arising from quantum mechanics. The conference was closely related to the journal Erkenntnis, which published the associated papers and accounts of the discussion in Erkenntnis (1931), 2, pp. 87–190.
The foundation of mathematics
The presentations as regards the foundation of mathematics were as follows:
Session 1:
Rudolf Carnap (Vienna), presented the thought of the logicist school as developed by Bertrand Russell
Arend Heyting (Enschede), presented the thought of the intuitionist school as developed by L. E. J. Brouwer
John von Neumann (Berlin), presented the thought of the formalist school as developed by David Hilbert
Friedrich Waismann (Vienna), presented the thought of the linguistic school as developed by Ludwig Wittgenstein
Kurt Gödel (Vienna), "On the Completeness of the Logical Calculus"
Arnold Scholz (Freiburg), "On the Use of the Term Holism in Axiomatics"
Session 2:
Otto Neugebauer (Göttingen), "On Pre-Greek Mathematics"
Session 3:
Discussion on the foundation of mathematics involving Hans Hahn, Carnap, Heyting, von Neumann, Gödel, Scholz and Reidemeister
Philosophical questions arising from quantum mechanics
There were two key presentations.
Session 4:
Hans Reichenbach (Berlin), presented on the supersession of two-value logic by probability logic
Werner Heisenberg (Leipzig), presented on the meaninglessness of strict assertions about natural phenomena at the micro level.
Session |
https://en.wikipedia.org/wiki/First%20International%20Topological%20Conference | The First International Topological Conference was held in Moscow, 4–10 September 1935. With presentations by topologists from 10 different countries, it constituted the first genuinely international meeting devoted to topology in the history of the mathematical community. Although a previous mathematical conference had been held in Kharkiv, and attended by Jacques Hadamard, this turned out to be the only truly international conference organised under the Stalin regime. Pavel Aleksandrov played a key role in organising the conference. The foreign delegates were accommodated in major hotels across Moscow, although according to André Weil, the principal form of sustenance was caviar canapés served in the conference hall, as no food was available in the hotel restaurants.
Presentations
Documentation of the conference varies, but this summary was drawn from various sources.
Homology Theory
Karol Borsuk: "On spheroidal spaces"
Eduard Čech: "Accessibility and Homology"
Israel Isaakovich Gordon: "On the intersection invariants of a complex and its residual space"
Solomon Lefschetz: "On locally connected sets"
Attendees
The following topologists made presentations:
Czechoslovakia:
Eduard Čech
France:
André Weil
Netherlands
Hans Freudenthal
Egbert van Kampen
Poland:
Karol Borsuk
Kazimierz Kuratowski
Juliusz Schauder
Kazimierz Zarankiewicz
USA:
James Waddell Alexander II
Garrett Birkhoff
Solomon Lefschetz
John von Neumann
Albert W. Tucker
Hassler Whitney
USSR:
Pavel Aleksandrov
Felix Frankl
Israel Isaakovich Gordon
Maria A. Nikolaenko
Julia Rozanska
Lev Pontryagin
Vyacheslav Stepanov
Lev Tumarkin
References
1935 in the Soviet Union
Topology |
https://en.wikipedia.org/wiki/Data%20Center%20Manageability%20Interface | Data Center Manageability Interface (DCMI) is a data center system management standard based on the Intelligent Platform Management Interface (IPMI) but designed to be more suitable for data center management: it uses the interfaces defined in IPMI, but minimizes the number of optional interfaces and includes power capping control, among other differences.
The DCMI specification was developed at Intel, and first published in 2008.
See also
Redfish (specification)
External links
DCMI specification revision 1.5
Original Intel DCMI white paper, published in 2008
Networking standards
System administration
Out-of-band management
2008 establishments |
https://en.wikipedia.org/wiki/Gigatron%20TTL | The Gigatron TTL is a retro-style 8-bit computer in which the CPU is implemented by a set of TTL chips instead of a single microprocessor, imitating the hardware of early arcade machines. It is aimed at computing enthusiasts, for study or hobby purposes.
Architecture
The CPU is implemented with a small set of TTL 7400-series chips running at a 6.25 MHz base clock rate, which can be overclocked by fitting faster chips. RAM can likewise be expanded by substituting larger chips.
Three CPU modes are implemented:
8-bit native assembly code, which implements a Harvard architecture. This mode offers 17 distinct instructions and supports up to 256 opcode combinations: 8 ALU operations, 8 addressing modes and 4 bus modes. The ROM firmware and the vCPU interpreter are written in the 8-bit native assembly code.
16-bit vCPU interpreter, which implements a von Neumann architecture and has a set of 34 instructions. It loads and runs programs from RAM. The integrated programs are written for this vCPU.
MOS 6502 emulator (experimental), able to run MOS 6502 machine code.
The video is generated by the ROM firmware (native assembly code), and supports a resolution of 160x120 pixels with 64 colours stored in RAM starting at address 0x0800 and ending at 0x7F9F as 120 segments of 160 bytes of non-contiguous RAM. Pixels are stored as 1 byte per pixel in XXBBGGRR format, (the top 2 bits are unused and may be used by the programmer for their own usage). The video display contains a configurable number of black (empty) scanlines in order to save vCPU time for programs; these empty/black scanlines can be configured by the user to get more displayed raster scanlines or more vCPU time for user programs. Off-screen RAM begins at 0x08A0 and ends at 0x7FFF as 120 segments of 96 bytes of non-contiguous RAM; these fragmented sections of RAM may be used for storing data or code or for scrolling effects using the video indirection table. System RAM is trivially expandable from the default 32K bytes to the full 16bi |
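From the addresses given above (0x0800 through 0x7F9F for 120 rows of 160 bytes), each on-screen row begins 0x100 bytes after the previous one, i.e., on its own 256-byte page. Under that assumption, the default pixel addressing and the XXBBGGRR packing can be sketched as follows; note this ignores the video indirection table mentioned above, through which rows can be remapped for scrolling effects:

```python
SCREEN_BASE = 0x0800
ROW_STRIDE = 0x0100   # one 256-byte page per row: 0x0800 + 119*0x100 + 159 = 0x7F9F
WIDTH, HEIGHT = 160, 120

def pixel_address(x, y):
    """RAM address of pixel (x, y) in the default (unscrolled) layout."""
    assert 0 <= x < WIDTH and 0 <= y < HEIGHT
    return SCREEN_BASE + y * ROW_STRIDE + x

def pack_xxbbggrr(r, g, b):
    """Pack three 2-bit channels into one XXBBGGRR pixel byte
    (top two bits left free for the programmer)."""
    assert all(0 <= c <= 3 for c in (r, g, b))
    return (b << 4) | (g << 2) | r
```

Here `pixel_address(159, 119)` gives 0x7F9F, the last on-screen byte quoted above; the 96 bytes from offset 0xA0 of each page form the off-screen region.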
https://en.wikipedia.org/wiki/Local%20differential%20privacy | Local differential privacy (LDP) is a model of differential privacy with the added requirement that if an adversary has access to the personal responses of an individual in the database, that adversary will still be unable to learn much of the user's personal data. This is contrasted with global differential privacy, a model of differential privacy that incorporates a central aggregator with access to the raw data.
Local differential privacy (LDP) is an approach to mitigating the concern that data fusion and analysis techniques may be used to expose individuals to attacks and disclosures. LDP is a well-known privacy model for distributed architectures that aims to provide privacy guarantees for each user while data is collected and analyzed, protecting both client and server from privacy leaks. LDP has been widely adopted to alleviate contemporary privacy concerns in the era of big data.
History
In 2003, Alexandre V. Evfimievski, Johannes Gehrke, and Ramakrishnan Srikant gave a definition equivalent to local differential privacy. In 2008, Kasiviswanathan et al. gave a formal definition conforming with the now-standard definition of differential privacy.
The prototypical example of a mechanism with local differential privacy is the randomized response survey technique proposed by Stanley L. Warner in 1965. Warner's innovation was the introduction of the “untrusted curator” model, where the entity collecting the data may not be trustworthy. Before users' responses are sent to the curator, the answers are randomized in a controlled manner guaranteeing differential privacy while still allowing valid population-wide statistical inferences.
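Warner's randomized response is easy to state in code. A minimal sketch (illustrative, not from any particular library): each respondent reports their true bit with probability p and the flipped bit otherwise, which satisfies ε-local differential privacy with ε = ln(p / (1 − p)); the curator then inverts the known noise to recover an unbiased population estimate.

```python
import random

def randomized_response(truth: bool, p: float = 0.75) -> bool:
    """Report the true answer with probability p, the opposite otherwise.
    Satisfies epsilon-LDP with epsilon = ln(p / (1 - p))."""
    return truth if random.random() < p else not truth

def estimate_proportion(responses, p: float = 0.75) -> float:
    """Unbiased estimate of the true 'yes' fraction pi from noisy responses.
    E[observed] = p*pi + (1 - p)*(1 - pi), a linear map we can invert."""
    observed = sum(responses) / len(responses)
    return (observed - (1 - p)) / (2 * p - 1)
```

With p = 0.75 each individual answer is plausibly deniable (ε = ln 3 ≈ 1.1), yet the aggregate estimate converges to the true proportion as the number of respondents grows.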
Applications
The era of big data exhibits high demand for machine learning services that provide privacy protection for users. Demand for such services has pushed research into algorithmic paradigms that provably satisfy specific privacy requirements.
Anomaly Detection
Anomaly detection is formally defined as the process of identifyi |
https://en.wikipedia.org/wiki/Differentially%20private%20analysis%20of%20graphs | Differentially private analysis of graphs studies algorithms for computing accurate graph statistics while preserving differential privacy. Such algorithms are used for data represented in the form of a graph where nodes correspond to individuals and edges correspond to relationships between them. For examples, edges could correspond to friendships, sexual relationships, or communication patterns.
A party that collected sensitive graph data can process it using a differentially private algorithm and publish the output of the algorithm. The goal of differentially private analysis of graphs is to design algorithms that compute accurate global information about graphs while preserving privacy of individuals whose data is stored in the graph.
Variants
Differential privacy imposes a restriction on the algorithm. Intuitively, it requires that the algorithm has roughly the same output distribution on neighboring inputs. If the input is a graph, there are two natural notions of neighboring inputs, edge neighbors and node neighbors, which yield two natural variants of differential privacy for graph data.
Let ε be a positive real number and A be a randomized algorithm that takes a graph as input and returns an output from a set O.
The algorithm A is ε-differentially private if, for all neighboring graphs G and G′ and all subsets S of O,
Pr[A(G) ∈ S] ≤ exp(ε) · Pr[A(G′) ∈ S],
where the probability is taken over the randomness used by the algorithm.
Edge differential privacy
Two graphs are edge neighbors if they differ in one edge. An algorithm is ε-edge-differentially private if, in the definition above, the notion of edge neighbors is used. Intuitively, an edge differentially private algorithm has similar output distributions on any pair of graphs that differ in one edge, thus protecting changes to graph edges.
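For intuition, the number of edges in a graph changes by exactly one between edge neighbors (sensitivity 1), so adding Laplace noise of scale 1/ε yields an ε-edge-differentially-private release of the edge count. A stdlib-only sketch (the function names are illustrative):

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise by inverse-transform sampling."""
    u = random.random() - 0.5                  # uniform on [-0.5, 0.5)
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def private_edge_count(num_edges: int, epsilon: float) -> float:
    """Edge count has sensitivity 1 under edge neighbors, so Laplace
    noise with scale 1/epsilon gives epsilon-edge-differential privacy."""
    return num_edges + laplace_noise(1.0 / epsilon)
```

Under node differential privacy the same statistic is much harder to release accurately, since deleting a single node can remove up to n − 1 edges, making the sensitivity grow with the size of the graph.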
Node differential privacy
Two graphs are node neighbors if one can be obtained from the other by deleting a node and its adjacent edges. An algorithm is ε-node-differentially private if, in the definition ab
https://en.wikipedia.org/wiki/Reconstruction%20attack | A reconstruction attack is any method for partially reconstructing a private dataset from public aggregate information. Typically, the dataset contains sensitive information about individuals, whose privacy needs to be protected. The attacker has no or only partial access to the dataset, but has access to public aggregate statistics about the datasets, which could be exact or distorted, for example by adding noise. If the public statistics are not sufficiently distorted, the attacker is able to accurately reconstruct a large portion of the original private data. Reconstruction attacks are relevant to the analysis of private data, as they show that, in order to preserve even a very weak notion of individual privacy, any published statistics need to be sufficiently distorted. This phenomenon was called the Fundamental Law of Information Recovery by Dwork and Roth, and formulated as "overly accurate answers to too many questions will destroy privacy in a spectacular way."
The Dinur-Nissim Attack
In 2003, Irit Dinur and Kobbi Nissim proposed a reconstruction attack based on noisy answers to multiple statistical queries. Their work was recognized by the 2013 ACM PODS Alberto O. Mendelzon Test-of-Time Award in part for being the seed for the development of differential privacy.
Dinur and Nissim model a private database as a sequence of n bits d = (d_1, ..., d_n), where each bit is the private information of a single individual. A database query is specified by a subset S ⊆ {1, ..., n}, and is defined to equal Σ_{i ∈ S} d_i. They show that, given approximate answers a_1, ..., a_k to queries specified by sets S_1, ..., S_k, such that
|a_j − Σ_{i ∈ S_j} d_i| ≤ E for all j, if the error bound E is sufficiently small and the number of queries k is sufficiently large, then an attacker can reconstruct most of the private bits in d. Here the error bound E can be a function of k and n. Dinur and Nissim's attack works in two regimes: in one regime, k is exponential in n, and the error E can be linear in n; in the other regime, k is polynomial in n, and the error E is on the order of √n.
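In the exponential-query regime, the attack is simple enough to sketch directly: try every candidate database and keep one that is consistent, within the error bound E, with all the noisy subset-sum answers. Any such candidate provably differs from the true database in at most 4E bits. A toy illustration (feasible only for very small n):

```python
import itertools

def subset_sum(db, S):
    """Answer the query specified by index set S exactly."""
    return sum(db[i] for i in S)

def reconstruct(n, queries, answers, E):
    """Return any n-bit candidate database whose exact answers are
    within E of every noisy answer. If the true database satisfies
    this (noise <= E), the result differs from it in at most 4E bits."""
    for cand in itertools.product([0, 1], repeat=n):
        if all(abs(subset_sum(cand, S) - a) <= E
               for S, a in zip(queries, answers)):
            return cand
    return None
```

The 4E bound follows because any two consistent candidates answer every subset query within 2E of each other, so the sets of positions where they disagree in each direction each have size at most 2E.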
References
Theory of cryptography
Informati |
https://en.wikipedia.org/wiki/IEEE%20Transactions%20on%20Image%20Processing | The IEEE Transactions on Image Processing is a monthly peer-reviewed scientific journal covering aspects of image processing in the field of signal processing. It was established in 1992 and is published by the IEEE Signal Processing Society. The editor-in-chief is Alessandro Foi (Tampere University). According to the Journal Citation Reports, the journal has a 2020 impact factor of 10.856.
References
External links
Image processing
IEEE academic journals
Monthly journals
Academic journals established in 1992
Computer science journals
English-language journals |
https://en.wikipedia.org/wiki/SIGTOT | SIGTOT was a one-time tape machine for encrypting teleprinter communication that was used by the United States during World War II and after for the most sensitive message traffic. It was developed after security flaws were discovered in an earlier rotor machine for the same purpose, called SIGCUM. SIGTOT was designed by Leo Rosen and used the same Bell Telephone 131B2 mixer as SIGCUM. The British developed a similar machine called the 5-UCO. Later an improved mixer, the SSM-33, replaced the 131B2.
The phenomenon, codenamed TEMPEST, of sensitive information leaking by way of unintended electromagnetic radiation from the circuits used inside encryption machines was first discovered coming from the 131B2 mixers used in SIGTOT.
SIGTOT required large amounts of key tape to operate on a continual basis, which was needed for traffic flow security. In 1955, NSA produced some 1,660,000 rolls of one-time tape. The logistical problems involved in the generation, supply, and destruction of sufficient quantities of key tape limited its use to only the most sensitive traffic. In the 1950s, the U.S. Army Security Agency began developing a replacement, an effort later taken over by the newly formed National Security Agency and resulting in the fielding of the KW-26 (ROMULUS) system.
References
History of cryptography
Encryption devices |
https://en.wikipedia.org/wiki/Stickman%20Soccer | Stickman Soccer is a video game series for Android and iOS by Austrian studio Djinnworks, first released in 2013.
Games
Stickman Soccer Classic (2013)
Stickman Soccer 2014 or Stickman Soccer 14 (2014)
Stickman Soccer 2016 or Stickman Soccer 16 (2016)
Stickman Soccer 2018 or Stickman Soccer 18 (2018)
References
Video game franchises introduced in 2013
Association football video games
Video games with cross-platform play
Android (operating system) games
IOS games
Video games developed in Austria
Video game franchises |
https://en.wikipedia.org/wiki/3%20nm%20process | In semiconductor manufacturing, the 3 nm process is the next die shrink after the 5 nanometer MOSFET (metal–oxide–semiconductor field-effect transistor) technology node. South Korean chipmaker Samsung started shipping its 3 nm gate all around (GAA) process, named 3GAA, in mid-2022. On December 29, 2022, Taiwanese chip manufacturer TSMC announced that volume production using its 3 nm semiconductor node termed N3 is under way with good yields. An enhanced 3 nm chip process called N3E may start production in 2023. American manufacturer Intel plans to start 3 nm production in 2023.
Samsung's 3 nm process is based on GAAFET (gate-all-around field-effect transistor) technology, a type of multi-gate MOSFET technology, while TSMC's 3 nm process still uses FinFET (fin field-effect transistor) technology, despite TSMC developing GAAFET transistors. Specifically, Samsung plans to use its own variant of GAAFET called MBCFET (multi-bridge channel field-effect transistor). Intel's process dubbed "Intel 3" without the "nm" suffix will use a refined, enhanced and optimized version of FinFET technology compared to its previous process nodes in terms of performance gained per watt, use of EUV lithography, and power and area improvement.
The term "3 nanometer" has no relation to any actual physical feature (such as gate length, metal pitch or gate pitch) of the transistors. According to the projections contained in the 2021 update of the International Roadmap for Devices and Systems published by IEEE Standards Association Industry Connection, a 3 nm node is expected to have a contacted gate pitch of 48 nanometers and a tightest metal pitch of 24 nanometers.
However, in real world commercial practice, "3 nm" is used primarily as a marketing term by individual microchip manufacturers to refer to a new, improved generation of silicon semiconductor chips in terms of increased transistor density (i.e. a higher degree of miniaturization), increased speed and reduced power consumption. T |
https://en.wikipedia.org/wiki/Engelbert%20Krauskopf | Engelbert Krauskopf (August 21, 1820 – July 11, 1881) was a German-American settler, gunsmith, and naturalist. Born in Bendorf, Germany, he emigrated to the United States in 1846, and became a settler of Fredericksburg, Texas. He was trained as a cabinetmaker and gunsmith, and during the American Civil War once made a gun barrel especially for Robert E. Lee. He was also an inventor: when ammunition became scarce during the Civil War he and silversmith Adolph Lungkwitz developed a process for the manufacture of gun-caps. In 1872, he patented an improvement to a throttle valve stand with John M. Compant, and one of his last inventions was a microscope in the form of a magic lantern. An amateur botanist, he described the species Hesperaloe engelmannii (commonly known as Engelmann's red yucca).
References
1820 births
1881 deaths
People from Fredericksburg, Texas
Prussian emigrants to the United States
19th-century naturalists
People from Bendorf
19th-century American botanists
Gunsmiths |
https://en.wikipedia.org/wiki/Error%20exponents%20in%20hypothesis%20testing | In statistical hypothesis testing, the error exponent of a hypothesis testing procedure is the rate at which the probabilities of Type I and Type II errors decay exponentially with the size of the sample used in the test. For example, if the probability of error of a test decays as e^(−nβ), where n is the sample size, the error exponent is β.
Formally, the error exponent of a test is defined as the limiting value of the ratio of the negative logarithm of the error probability to the sample size for large sample sizes: lim_{n→∞} −(1/n) log P_error. Error exponents for different hypothesis tests are computed using Sanov's theorem and other results from large deviations theory.
Error exponents in binary hypothesis testing
Consider a binary hypothesis testing problem in which observations are modeled as independent and identically distributed random variables under each hypothesis. Let Z_1, Z_2, ..., Z_n denote the observations. Let f_0 denote the probability density function of each observation under the null hypothesis H_0 and let f_1 denote the probability density function of each observation under the alternate hypothesis H_1.
In this case there are two possible error events. Error of type 1, also called false positive, occurs when the null hypothesis is true and it is wrongly rejected. Error of type 2, also called false negative, occurs when the alternate hypothesis is true and the null hypothesis is not rejected. The probability of type 1 error is denoted α and the probability of type 2 error is denoted β.
Optimal error exponent for Neyman–Pearson testing
In the Neyman–Pearson version of binary hypothesis testing, one is interested in minimizing the probability of type 2 error subject to the constraint that the probability of type 1 error is less than or equal to a pre-specified level α. In this setting, the optimal testing procedure is a likelihood-ratio test. Furthermore, the optimal test guarantees that the type 2 error probability decays exponentially in the sample size n according to β_n ≈ e^(−n D(f_0 || f_1)). The error exponent is the Kullback–Leibler
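This exponent — the Kullback–Leibler divergence D(f_0 || f_1), by Stein's lemma — can be checked numerically. The sketch below (illustrative code, not from any standard library) computes, for Bernoulli observations, the exact type 2 error of the threshold (likelihood-ratio) test and lets one watch −log β_n / n climb toward the divergence:

```python
import math

def kl_bernoulli(p: float, q: float) -> float:
    """Kullback-Leibler divergence D(Bernoulli(p) || Bernoulli(q)) in nats."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def type2_error(n: int, p0: float, p1: float, alpha: float = 0.05) -> float:
    """Exact type 2 error of the optimal threshold test for Bernoulli(p0)
    versus Bernoulli(p1) with p1 > p0, at type 1 level alpha."""
    pmf0 = [math.comb(n, k) * p0**k * (1 - p0)**(n - k) for k in range(n + 1)]
    # Smallest threshold t with P_{p0}(X >= t) <= alpha: reject H0 when X >= t.
    tail, t = 0.0, n + 1
    for k in range(n, -1, -1):
        if tail + pmf0[k] > alpha:
            break
        tail += pmf0[k]
        t = k
    # Type 2 error: probability under p1 of failing to reject, P_{p1}(X < t).
    return sum(math.comb(n, k) * p1**k * (1 - p1)**(n - k) for k in range(t))
```

For p0 = 0.5 and p1 = 0.8 the divergence is about 0.223 nats; the empirical rate −log β_n / n increases toward it as n grows, though convergence is slow because of polynomial prefactors.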
https://en.wikipedia.org/wiki/Validated%20numerics | Validated numerics, or rigorous computation, verified computation, reliable computation, numerical verification, is numerics including mathematically strict error evaluation (rounding error, truncation error, discretization error), and it is one field of numerical analysis. For computation, interval arithmetic is used, and all results are represented by intervals. Validated numerics were used by Warwick Tucker in order to solve the 14th of Smale's problems, and they are recognized today as a powerful tool for the study of dynamical systems.
Importance
Computation without verification may cause unfortunate results. Below are some examples.
Rump's example
In the 1980s, Rump constructed an example. He made a complicated function and tried to obtain its value. The single precision, double precision, and extended precision results all seemed to be correct, but the computed sign differed from that of the true value.
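Rump's expression can be reproduced with modern tools. The sketch below evaluates it once in IEEE double precision and once in exact rational arithmetic (the coefficients 333.75 = 1335/4 and 5.5 = 11/2 are exactly representable in both); the double-precision result is wildly wrong, while the true value is about −0.8274:

```python
from fractions import Fraction

def rump(a, b, c1, c2):
    """Rump's ill-conditioned expression, evaluated in whatever
    arithmetic a, b, c1, c2 carry (floats or Fractions)."""
    return (c1 * b**6
            + a**2 * (11 * a**2 * b**2 - b**6 - 121 * b**4 - 2)
            + c2 * b**8
            + a / (2 * b))

a, b = 77617, 33096
in_double = rump(float(a), float(b), 333.75, 5.5)
exact = rump(Fraction(a), Fraction(b), Fraction(1335, 4), Fraction(11, 2))
# The huge terms cancel almost exactly; in double precision the rounding
# of those terms dominates, so in_double is nowhere near float(exact).
```

The exact double-precision output depends on the evaluation order, but in all orderings it is far from the true value, which makes the example a standard motivation for interval arithmetic.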
Phantom solution
Breuer–Plum–McKenna used the spectral method to solve the boundary value problem of the Emden equation, and reported that an asymmetric solution was obtained. This result conflicted with the theoretical study by Gidas–Ni–Nirenberg, which claimed that there is no asymmetric solution. The solution obtained by Breuer–Plum–McKenna was a phantom solution caused by discretization error. This is a rare case, but it tells us that when we want to discuss differential equations rigorously, numerical solutions must be verified.
Accidents caused by numerical errors
The following examples are known as accidents caused by numerical errors:
Failure of missile interception in the Gulf War (1991)
Failure of the Ariane 5 rocket (1996)
Mistakes in election result totalization
Main topics
The study of validated numerics is divided into the following fields:
Tools
See also
References
Further reading
Tucker, Warwick (2011). Validated Numerics: A Short Introduction to Rigorous Computations. Princeton University Press.
Moore, Ramon Edgar, Kearfott, R. Baker., Cloud |
https://en.wikipedia.org/wiki/Arabic%20Ontology | Arabic Ontology is a linguistic ontology for the Arabic language, which can be used as an Arabic WordNet with ontologically clean content. People also use it as a tree (i.e., classification) of the concepts/meanings of Arabic terms. It is a formal representation of the concepts that Arabic terms convey, and its content is ontologically well-founded and benchmarked to scientific advances and rigorous knowledge sources rather than to speakers' naïve beliefs, as wordnets typically are. The Ontology tree can be explored online.
Ontology Structure
The ontology structure (i.e., data model) is similar to WordNet structure. Each concept in the ontology is given a unique concept identifier (URI), informally described by a gloss, and lexicalized by one or more of synonymous lemma terms. Each term-concept pair is called a sense, and is given a SenseID. A set of senses is called synset. Concepts and senses are described by further attributes such as era and area - to specify when and where it is used, lexicalization type, example sentence, example instances, ontological analysis, and others. Semantic relations (e.g., SubTypeOf, PartOf, and others) are defined between concepts. Some important individuals are included in the ontology, such as individual countries and seas. These individuals are given separate IndividualIDs and linked with their concepts through the InstanceOf relation.
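The data model described above can be pictured with a small sketch (Python dataclasses; the field names are our own illustration, not the ontology's actual schema):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Concept:
    uri: str                       # unique concept identifier (URI)
    gloss: str                     # informal description
    subtype_of: List[str] = field(default_factory=list)   # parent concept URIs
    instance_of: List[str] = field(default_factory=list)  # for individuals, e.g. countries

@dataclass
class Sense:
    sense_id: str                  # SenseID for one term-concept pair
    term: str                      # lemma term lexicalizing the concept
    concept_uri: str
    era: str = ""                  # when the sense is/was in use
    area: str = ""                 # where the sense is used

def synset(senses: List[Sense], concept_uri: str) -> List[Sense]:
    """A synset is the set of senses attached to one concept."""
    return [s for s in senses if s.concept_uri == concept_uri]
```

This mirrors the description: a concept carries a URI and gloss, each term-concept pair is a sense with its own identifier, and the senses sharing a concept form its synset.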
Mappings to other resources
Concepts in the Arabic Ontology are mapped to synsets in WordNet, as well as to BFO and DOLCE. Terms used in the Arabic Ontology are mapped to lemmas in the LDC's SAMA database.
Arabic Ontology versus Arabic WordNet
The Arabic Ontology can be seen as a next generation of WordNet - or as an ontologically clean Arabic WordNet. It follows the same structure (i.e., data model) as WordNet, and it is fully mapped to WordNet. However, there are critical foundational differences between them:
The ontology is benchmarked on state-of-art scientific discoveries, whi |
https://en.wikipedia.org/wiki/Mother%20Earth%27s%20Plantasia | Mother Earth's Plantasia (subtitled "warm earth music for plants and the people who love them"), commonly referred to as simply Plantasia, is an electronic album by Mort Garson first released in 1976.
Background
The music on it was composed specifically for plants to listen to. Garson was inspired by his wife, who grew many plants in their home. Garson used a Moog synthesizer to compose the album, the first album on the West Coast composed entirely on the Moog synthesizer.
The album had a very limited distribution upon release, only being available to people who bought a houseplant from a store called Mother Earth on Melrose Avenue in Los Angeles or those who purchased a Simmons mattress from a Sears outlet, both of which came with the record. As a result, the album failed to attain widespread popularity around the time of its release. However, it has since gained a cult following as an early work of electronic music.
Legacy
The album has long been a favorite source for hip-hop producers, who sample snippets of its soundscapes. One of the best-known uses, and probably one of the reasons that this album (and Garson) came back to public attention, was the Pharcyde's use of a snippet from the title track "Plantasia" for the song "Guestlist", from their 2000 album "Plain Rap".
The album also gained popularity on YouTube, with the full album (uploaded without permission) gaining millions of views and thousands of comments spread over multiple different bootleg uploads.
In March 2019, Sacred Bones Records announced that they were officially reissuing Mother Earth's Plantasia. The reissue is available on music streaming services and was released on vinyl, CD and cassette as well on June 21, 2019. Angie Martoccio, writing for Rolling Stone in 2019, described Mother Earth's Plantasia as Garson's magnum opus. Stephen M. Deusner, writing for Pitchfork, described it as perhaps Garson's "most beloved album, at least among crate-diggers and record collectors."
For the 2023 tax season, Intuit us |
https://en.wikipedia.org/wiki/Pl%C3%B6n%20Evolution%20Path | The Max Planck Institute for Evolutionary Biology's Plön Evolution Path ("Plöner Evolutionspfad", German pronunciation: [ˈpløːnɐ evoluˈt͡si̯oːns pfaːt]) is an educational public works project that presents the history and evolution of life on Earth. It is one of a number of Evolution Paths in Germany.
Located in Plön, Germany, the Evolution Path is composed of 11 dual-language English/German stations extending around the Großer Plöner See. The path extends 1.3 km in total, starting at the Plön Market Bridge () that also serves as the start of the Plön Planet Walk, and makes its way to the Max Planck Institute for Evolutionary Biology.
From beginning to end, the Evolution Path describes events during evolutionary history, beginning with the origin of life (3.8 billion years ago) up until the evolution of man (5 million years ago).
The exhibition was officially inaugurated on the 14th of September, 2018, in celebration of the 70th anniversary of the Max Planck Society.
History
The project was conceived by Diethard Tautz of the Max Planck Institute for Evolutionary Biology, with support from Plön's Bürgermeister Lars Winter, the Plön city committee, and the Giordano Bruno Foundation.
Stations
The Plön Evolution Path presents the history and evolution of life on earth in 11 stations. The distance between each station is proportional to the time interval between the corresponding evolutionary periods, broken into three separate linear time scales. As a result, the overall time path does not represent a linear scale, which would be difficult to represent in such a path. Each station includes an overview of the state of the Earth's continental drift at each respective time frame, includ |
https://en.wikipedia.org/wiki/Ordered%20exponential%20field | In mathematics, an ordered exponential field is an ordered field together with a function which generalises the idea of exponential functions on the ordered field of real numbers.
Definition
An exponential E on an ordered field K is a strictly increasing isomorphism of the additive group of K onto the multiplicative group of positive elements of K. The ordered field K together with the additional function E is called an ordered exponential field.
Examples
The canonical example for an ordered exponential field is the ordered field of real numbers R with any function of the form x ↦ a^x, where a is a real number greater than 1. One such function is the usual exponential function, that is, x ↦ e^x. The ordered field R equipped with this function gives the ordered real exponential field, denoted by Rexp. It was proved in the 1990s that Rexp is model complete, a result known as Wilkie's theorem. This result, when combined with Khovanskiĭ's theorem on pfaffian functions, proves that Rexp is also o-minimal. Alfred Tarski posed the question of the decidability of Rexp, and hence it is now known as Tarski's exponential function problem. It is known that if the real version of Schanuel's conjecture is true then Rexp is decidable.
The ordered field of surreal numbers admits an exponential which extends the exponential function exp on R. Since the surreal numbers do not have the Archimedean property, this is an example of a non-Archimedean ordered exponential field.
The ordered field of logarithmic-exponential transseries is constructed specifically in a way such that it admits a canonical exponential.
Formally exponential fields
A formally exponential field, also called an exponentially closed field, is an ordered field that can be equipped with an exponential. For any formally exponential field K, one can choose an exponential E on K such that
1 + 1/n < E(1) < n
for some natural number n.
Properties
Every ordered exponential field K is root-closed, i.e., every positive element of K has an n-th root for every positive integer n (or in o
https://en.wikipedia.org/wiki/MikroTik | MikroTik (officially SIA "Mikrotīkls") is a Latvian network equipment manufacturing company. MikroTik develops and sells wired and wireless network routers, network switches, and access points, as well as operating systems and auxiliary software. The company was founded in 1996, and as of 2022 it employed 351 people.
With its headquarters in Riga, Latvia, MikroTik serves a diverse array of customers around the world. The company's products and services are utilized in various sectors, such as telecommunications, government agencies, educational institutions, and enterprises of all sizes.
In 2022, with a value of €1.30 billion, MikroTik was the 4th largest company in Latvia and the first private company in Latvia to surpass a €1 billion valuation.
History
MikroTik was established in 1996 by founders John Tully and Arnis Riekstiņš in Riga, Latvia, developing networking software for x86 PC hardware that would develop into a product called RouterOS. The earliest versions of RouterOS were based on Linux 2.2.
In 2002, MikroTik expanded its product line by producing their own networking-focused low-power single-board computers (SBCs), branded RouterBoard, that ran RouterOS. These early SBCs could be expanded and/or integrated as components of other systems, but as time passed, this RouterBoard/RouterOS platform would develop into a full line of network equipment.
Timeline
1997 – Release of software for x86 PC platform, called simply MikroTik Router Software, that would eventually develop into RouterOS.
2002 – Release of RouterBoard series PCI add-in boards to be used with MikroTik x86-based PCs running RouterOS.
2003 – Release of RouterBoard 200, a single-board router platform, and RouterBoard 220, with the SBC integrated into an enclosure with a 2.4 GHz wireless antenna, powered by Power over Ethernet (PoE). The original RouterBoard was based on the Geode CPU, but later models used MIPS.
2012 – Release of Cloud Core Router (CCR) 1000-series, using Tilera |
https://en.wikipedia.org/wiki/Snails%20as%20food | Snails are considered edible in many areas such as the Mediterranean region, Africa, or Southeast Asia, while in other cultures, snails are seen as a taboo food. In American English, edible land snails are also called escargot, taken from the French word for 'snail,' and the production of snails for consumption is called snail farming or heliciculture. Snails as a food date back to ancient times, with numerous cultures worldwide having traditions and practices that attest to their consumption.
The snails are collected after the rains and are put to "purge" (fasting). In the past, the consumption of snails had a marked seasonality, from April to June. However, thanks to snail breeding techniques, today they are available all year round. Heliciculture occurs mainly in Spain, France, and Italy, which are also the countries with the greatest culinary tradition of the snail. Although throughout history the snail has had little value in the kitchen because it is considered "poverty food", in recent times it can be classified as a delicacy thanks to the appreciation given to it by haute cuisine chefs.
Etymology of escargot
Escargot comes from the French word for snail. One of the first recorded uses of the word escargot in English dates from 1892. The French word (1549) derives from escaragol (Provençal) and thence escargol (Old French), and is ultimately – via Vulgar Latin coculium and Classical Latin conchylium – from the Ancient Greek konchylion (κογχύλιον), which meant "edible shellfish, oyster". The Online Etymological Dictionary writes, "The form of the word in Provençal and French seem to have been influenced by words related to the scarab."
History
Researchers have not been able to pinpoint when humans began consuming snails, although archaeological discoveries point to earlier stages than the invention of hunting. A lot of broken snail shells have been found in the Franchthi Cave, in the Greek Argolis, from the year 10,700 BCE. In Historia de gastron |
https://en.wikipedia.org/wiki/Military%20geology | Military geology is the application of geological theory to warfare and the peacetime practices of the military. The formal practice of military geology began during the Napoleonic Wars; however, geotechnical knowledge has been applied since the earliest days of siege warfare. In modern warfare military geologists are used for terrain analysis, engineering, and the identification of resources. Military geologists have included both specially trained military personnel and civilians incorporated into the military. The peacetime application of military geology includes the building of infrastructure, typically during local emergencies or foreign peacekeeping deployments.
Warfare can change the physical geology. Examples of this include artillery shattering the bedrock on the Western Front during World War I and the detonation of nuclear weapons creating new rock types. Military research has also led to many important geological discoveries.
Terrain analysis
Geologists have been employed since the Napoleonic Wars to provide an analysis of terrain which was expected to become a war theater, both in case of an upcoming battle and to assess the difficulty of logistical supply. Academically, it has been found that battles are likely to occur on rocks of Permian, Triassic, or Upper Carboniferous age, possibly due to their typical relief and drainage. More practically, geology has been used in identifying the best Allied invasion sites during World War II, including those in North Africa, Italy, and France. This included studying the properties of the sand of Normandy beaches, the tolerance of the soil in the hinterland to bombardment, the sediment of the English Channel sea floor, and the occurrence of landslides in Sicily. Likewise, German geologists created maps of southern England for Operation Sea Lion, identifying quarry locations and the suitability of rock types for excavating trenches, etc.
In the Demilitarized Zone between North and South Korea, very rugged terr |
https://en.wikipedia.org/wiki/Severi%20variety%20%28Hilbert%20scheme%29 | In mathematics, a Severi variety is an algebraic variety in a Hilbert scheme that parametrizes curves in projective space with given degree d and geometric genus g, having at most node singularities. Its dimension is 3d + g − 1.
It is a theorem that Severi varieties are irreducible algebraic varieties.
References
Maksym Fedorchuk, Severi varieties and the moduli space of curves, Ph.D. thesis, 2008.
Joe Harris and Ian Morrison. Moduli of curves, volume 187 of Graduate Texts in Mathematics. Springer-Verlag, New York, 1998.
Algebraic geometry
Scheme theory |
https://en.wikipedia.org/wiki/Hurwitz%20scheme | In algebraic geometry, the Hurwitz scheme is the scheme parametrizing pairs (C, π) where C is a smooth curve of genus g and π is a covering of the projective line of degree d.
References
Algebraic geometry |
https://en.wikipedia.org/wiki/The%20Tale%20of%20the%20Three%20Beautiful%20Raptor%20Sisters%2C%20and%20the%20Prince%20Who%20Was%20Made%20of%20Meat | "The Tale of the Three Beautiful Raptor Sisters, and the Prince Who Was Made of Meat" is a fantasy story by Brooke Bolander. It was first published in Uncanny Magazine, in 2018.
Synopsis
In a fairy tale setting, three dromaeosaurid sisters have an unpleasant encounter with a handsome prince.
Reception
"The Tale of the Three Beautiful Raptor Sisters, and the Prince Who Was Made of Meat" was a finalist for the 2019 Hugo Award for Best Short Story.
Tangent Online found it to be "very light comedy, which provides a moderate amount of amusement", calling it "a featherweight story" and "quite lengthy".
References
Works originally published in online magazines
Fiction about dinosaurs
2018 short stories |
https://en.wikipedia.org/wiki/Online%20integrated%20development%20environment | An online integrated development environment, also known as a web IDE or cloud IDE, is a browser based integrated development environment. An online IDE can be accessed from a web browser, such as Firefox, Google Chrome or Microsoft Edge, enabling software development on low-powered devices that are normally unsuitable. An online IDE does not usually contain all of the same features as a traditional desktop IDE, only basic IDE features such as a source-code editor with syntax highlighting. Integrated version control and Read–Eval–Print Loop (REPL) may also be included.
Online IDEs can be further categorized into professional and educational.
Examples
Cloud9 IDE
Codeanywhere
Codio
CodeSandbox
Codiva
Dockside
Eclipse Che IDE
Gitpod
Glitch
GitHub Codespaces
goormIDE
JDoodle
JSFiddle
Replit
SourceLair
StackBlitz
OneCompiler
References
Integrated development environments |
https://en.wikipedia.org/wiki/Southern%20Textile%20Exposition | The Southern Textile Exposition (1915-2004) was an intermittent trade fair for textile manufacturers held in Greenville, South Carolina.
By the early 20th century, American textile production had moved into the Carolina Piedmont from its earlier center in New England. By the second decade of the century, South Carolina ranked second only to Massachusetts in textile production; and Greenville, located between Charlotte and Atlanta, was central to the industry.
In 1914, the Southern Textile Association approved the bid of Greenville mill owners to host the first textile machinery trade fair in the South. The first show, in 1915, was held in borrowed warehouses; but the trade fair was so successful that Greenville's Southern Textile Exposition, Inc. soon raised the money needed to build a permanent exhibition space, Textile Hall, on West Washington Street, which was effectively completed before the second exposition in 1917. In succeeding years the exhibition was often held biennially.
By 1946 Greenville could advertise itself as the "Textile Capital of the World," and by 1962 Textile Hall, even with nine annexes and additional leased space, proved inadequate to host the Textile Exposition. The Greenville corporation put up a larger building adjoining the Greenville Downtown Airport on the new U.S. Route 29-Bypass. In 1969 the Exposition joined with the American Textile Machinery Association to sponsor the American Textile Machinery Exhibition-International, the largest textile machinery show ever held in the United States.
By the end of the 20th century, low wages and new production capacity in countries such as China, India, and Brazil dramatically reduced textile production in the United States, especially in the Southeast. The Southern Textile Exposition was held in Greenville for a final time in 2004.
References
Buildings and structures in Greenville, South Carolina
Textile industry
Textile engineering |
https://en.wikipedia.org/wiki/Variable-buoyancy%20propulsion | In engineering, variable-buoyancy propulsion is the use of a buoyancy engine to provide propulsion for a vehicle. The concept was first explored in the 1960s for use with underwater gliders, but has since been applied to autonomous aircraft as well.
Principle
Variable-buoyancy propulsion is based on the ability of a vehicle to change its buoyancy from negative to positive and vice versa (for aircraft, this means alternating between being heavier and lighter than air). While positively buoyant, the vehicle trims bow up and uses its hydrofoils or wings to glide forward while rising, using buoyancy as the driving force. At the top of the climb, buoyancy is made negative and the vehicle trims bow down and glides forward while descending, using gravity as the driving force.
The process can be repeated for as long as the buoyancy engine can operate, and allows for highly energy-efficient albeit generally slow propulsion. The vehicle's trajectory typically presents a sawtooth-like profile. Various methods may be used to alter the buoyancy.
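The sawtooth trajectory described above can be sketched with a toy kinematic model (all numbers here are illustrative assumptions, not from the article): the vehicle alternates buoyancy sign each half-cycle and glides at a fixed slope.

```python
GLIDE_SLOPE = 0.25     # vertical metres per horizontal metre (assumed)
HALF_CYCLE_DIST = 100  # horizontal metres per climb or descent leg (assumed)

def trajectory(n_half_cycles):
    """Return (x, depth) samples for a sawtooth glide path starting at 30 m depth."""
    x, depth = 0.0, 30.0
    points = [(x, depth)]
    ascending = True  # positively buoyant first
    for _ in range(n_half_cycles):
        sign = -1 if ascending else 1   # ascending reduces depth
        x += HALF_CYCLE_DIST
        depth += sign * GLIDE_SLOPE * HALF_CYCLE_DIST
        points.append((x, depth))
        ascending = not ascending
    return points

print(trajectory(4))  # depth alternates 30 -> 5 -> 30 -> 5 -> 30 m
```

Each half-cycle trades buoyancy (or gravity) for forward motion, which is why the profile is periodic rather than level.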
References
Buoyancy
Fluid mechanics |
https://en.wikipedia.org/wiki/Gary%20Antonick | Gary Antonick ( ; born February 11, 1963) is an American journalist and recreational mathematician who for many years wrote a puzzle-based column called "Numberplay" for the New York Times.
Education and career
Antonick has a BS in Engineering from the University of Michigan and an MBA from Harvard Business School.
Numberplay
From December 2009 to October 2016, Antonick wrote the puzzle-themed "Numberplay" column for The New York Times. The puzzles generally involved math or logic problems. They came from many sources, and many were descended from columns by the celebrated Scientific American columnist Martin Gardner. He often wrote about Gardner and considered him to be the leading popularizer of recreational mathematics. Conferences called Gathering 4 Gardner are held every two years to celebrate Gardner's legacy, and Antonick has twice spoken at these events. He also supports the Julia Robinson Mathematics Festival.
Among the many classic problems of recreational mathematics featured in "Numberplay" are The Prisoner's Dilemma, The Two Child Problem, The Monty Hall Problem, The Monkey and the Coconuts, The Two-cube Calendar, and The Zebra Puzzle. Sometimes "Numberplay" was used to celebrate other mathematicians such as Paul Erdős, or simply to report a breakthrough in mathematics or game theory.
"Numberplay" columns led to five sequences originated by Antonick being listed in the On-Line Encyclopedia of Integer Sequences (OEIS).
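Among the classic problems listed above is the Monty Hall Problem; a quick simulation (an illustrative sketch of my own, not from any "Numberplay" column) confirms the counterintuitive result that switching doors wins about two-thirds of the time.

```python
import random

def play(switch, rng):
    """One round of Monty Hall; returns 1 if the final pick is the car."""
    doors = [0, 0, 1]          # one car, two goats
    rng.shuffle(doors)
    pick = rng.randrange(3)
    # Host opens a goat door that is neither the pick nor the car.
    opened = next(i for i in range(3) if i != pick and doors[i] == 0)
    if switch:
        pick = next(i for i in range(3) if i not in (pick, opened))
    return doors[pick]

rng = random.Random(0)
trials = 50_000
wins = sum(play(switch=True, rng=rng) for _ in range(trials))
print(wins / trials)  # close to 2/3
```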
English Channel Swim
On August 8, 1988, Antonick swam the English Channel, starting from Dover, England, and finishing in France 8 hours and 46 minutes later.
References
External links
list of articles by Gary Antonick in the New York Times
1963 births
Living people
University of Michigan School of Education alumni
Harvard Business School alumni
Recreational mathematicians
Writers from Detroit
Mathematics popularizers |
https://en.wikipedia.org/wiki/Aligner%20%28semiconductor%29 | An aligner, or mask aligner, is a system that produces integrated circuits (IC) using the photolithography process. It holds the photomask over the silicon wafer while a bright light is shone through the mask and onto the photoresist. The "alignment" refers to the ability to place the mask over precisely the same location repeatedly as the chip goes through multiple rounds of lithography. Aligners were a major part of IC manufacture from the 1960s into the late 1970s, when they began to be replaced by the stepper.
There are several distinct generations of aligner technology. The early contact aligners placed the mask in direct contact with the top surface of the wafer, which often damaged the pattern when the mask was lifted off again. Used only briefly, proximity aligners held the mask slightly above the surface to avoid this problem, but were difficult to work with and required considerable manual adjustment. Finally, the Micralign projection aligner, introduced by Perkin-Elmer in 1973, held the mask entirely separate from the chip and made the adjustment of the image much simpler. Through these stages of development, yields improved from perhaps 10% to about 70%, leading to a corresponding reduction in chip prices.
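The economics behind that yield improvement can be sketched with simple arithmetic (the wafer cost and die count below are assumed, illustrative figures; only the ~10% and ~70% yields come from the text): with a fixed cost per processed wafer, the cost per good chip scales inversely with yield.

```python
def cost_per_good_chip(wafer_cost, chips_per_wafer, yield_fraction):
    """Cost of each working chip when only yield_fraction of chips are good."""
    return wafer_cost / (chips_per_wafer * yield_fraction)

contact = cost_per_good_chip(1000.0, 100, 0.10)     # contact-aligner era, ~10% yield
projection = cost_per_good_chip(1000.0, 100, 0.70)  # Micralign era, ~70% yield
print(contact, projection)  # the 10% -> 70% jump cuts cost per good chip ~7x
```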
The stepper is similar to an aligner in concept, but with one key difference. The aligner uses a mask that holds the pattern for the entire wafer, which may require large masks. The stepper instead uses a mask holding the pattern for a single chip layer and steps across the surface of the wafer, repeating the exposure at each site. This reduces mask costs dramatically and allows a single wafer to be used for different mask designs in a single run. More importantly, by focusing the light source onto a single area of the wafer, the stepper can produce much higher resolutions, thus allowing for smaller features on chips (minimum feature size). The disadvantage to the stepper is that each chip on the wafer has to be individually imaged, and thus the process of exposing the wafer as a whol |
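The trade-off between one whole-wafer exposure and many per-die exposures can be sketched with a rough count of stepper sites (the wafer and die dimensions below are assumed, illustrative values):

```python
import math

def stepper_exposures(wafer_diameter_mm, die_mm):
    """Upper-bound count of die-sized exposure sites on a square grid over the wafer."""
    per_side = math.floor(wafer_diameter_mm / die_mm)
    return per_side * per_side  # ignores edge sites lost to the circular wafer

# An aligner needs 1 exposure per wafer layer; a stepper needs one per die site.
print(stepper_exposures(100, 10))  # 100 mm wafer, 10 mm dies -> 100 exposures
```

The per-die count is why stepper throughput is lower, even though its resolution and mask economics are better.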
https://en.wikipedia.org/wiki/Striation%20%28fatigue%29 | Striations are marks produced on the fracture surface that show the incremental growth of a fatigue crack. A striation marks the position of the crack tip at the time it was made. The term striation generally refers to ductile striations which are rounded bands on the fracture surface separated by depressions or fissures and can have the same appearance on both sides of the mating surfaces of the fatigue crack. Although some research has suggested that many loading cycles are required to form a single striation, it is now generally thought that each striation is the result of a single loading cycle.
The presence of striations is used in failure analysis as an indication that a fatigue crack has been growing. Striations are generally not seen when a crack is small even though it is growing by fatigue, but will begin to appear as the crack becomes larger. Not all periodic marks on the fracture surface are striations. The size of a striation for a particular material is typically related to the magnitude of the loading characterised by stress intensity factor range, the mean stress and the environment. The width of a striation is indicative of the overall crack growth rate but can be locally faster or slower on the fracture surface.
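The link between striation width and the stress intensity factor range is commonly sketched with the standard Paris-law model, which is not from this article; the constants below are assumed, purely illustrative values. Under the one-striation-per-cycle view, the crack advance per cycle da/dN is roughly one striation width.

```python
C = 1e-12   # assumed Paris coefficient (illustrative, m/cycle units)
m = 3.0     # assumed Paris exponent (illustrative)

def striation_width_m(delta_K):
    """Paris-law estimate of crack growth per cycle for a range ΔK (MPa·√m)."""
    return C * delta_K ** m

for dk in (10, 20, 40):  # doubling ΔK multiplies the spacing by 2**m = 8 here
    print(dk, striation_width_m(dk))
```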
Striation features
The study of the fracture surface is known as fractography. Images of the crack can be used to reveal features and understand the mechanisms of crack growth. While striations are fairly straight, they tend to curve at the ends, allowing the direction of crack growth to be determined from an image. Striations generally form at different levels in metals and are separated by a tear band between them. Tear bands are approximately parallel to the direction of crack growth and produce what is known as a river pattern, so called because it looks like the diverging pattern seen with river flows. The source of the river pattern converges to a single point that is typically the origin of the fatigue failure.
Striations can ap |
https://en.wikipedia.org/wiki/Nagad | Nagad () is a Bangladeshi Digital Financial Service (DFS), operating under the authority of the Bangladesh Post Office, an attached department of the Ministry of Post and Telecommunication (Bangladesh). It is the new version of the previously introduced Postal Cash Card and Electronic Money Transfer System (EMTS) of the Bangladesh Post Office. Its headquarters is located at 36 Kemal Ataturk Avenue, Banani, Dhaka-1213, Bangladesh. The second unicorn in the country, Nagad has been recognized as the fastest company in Bangladesh to reach the milestone of becoming a unicorn startup. It is also the first digital bank in the country.
Formation
In 2017, Third Wave Technologies Ltd made an agreement with the Bangladesh Post Office (BPO) to provide a Mobile Financial Service (MFS) under BPO. Though BPO had no ownership of Third Wave Technologies Ltd, there was profit sharing between them.
Third Wave Technologies Ltd was renamed Nagad Ltd in February 2019, allegedly without informing BPO.
In March 2019, Nagad Ltd started out as the mobile financial service (MFS) of the Bangladesh Post Office (BPO) under the "Nagad" brand.
In June 2021, BPO declared it will own 51% share, while the remaining 49% share will be held by Nagad Ltd.
Now, the government has set out to form a new subsidiary to help it secure a full-fledged licence from the central bank.
This financial service is regulated under the Bangladesh Postal Act Amendment 2010, Section 3(2), a unique law enacted especially for the Bangladesh Post Office by the Government of Bangladesh. The digital financial service was launched by the Bangladesh Post Office on 11 November 2018. It started operations on 26 March 2019, celebrating the 48th Independence Day of the country.
Nagad started its journey with demanding services like Cash-In, Cash-Out 2 Taka/minute, Send Money, 50% Bonus and Mobile Recharge. More popular services like Bills Payment, E-commerce Payment gateway are now available. From the very beginning, Nagad has its own Mobile App for Customers an |
https://en.wikipedia.org/wiki/Black%20Queen%20hypothesis | The Black Queen hypothesis (BQH) is a reductive evolution theory which seeks to explain how natural selection (as opposed to genetic drift) can drive gene loss. In a microbial community, different members may have genes which produce certain chemicals or resources in a "leaky" fashion, making them accessible to other members of that community. If such a resource is available to certain members of a community in a way that allows them sufficient access to it without generating it themselves, those members may lose the biological function (or the gene) involved in producing that chemical. Put another way, the Black Queen hypothesis is concerned with the conditions under which it is advantageous to lose certain biological functions. By accessing resources without the need to generate them themselves, these microbes conserve energy and streamline their genomes to enable faster replication.
Jeffrey Morris proposed the Black Queen hypothesis in his 2011 PhD dissertation. In the following year, Morris wrote another publication on the subject alongside Richard Lenski and Erik Zinser more fully refining and fleshing out the hypothesis. The name of the hypothesis—"Black Queen hypothesis"—is a play on the Red Queen hypothesis, an earlier theory of coevolution which states that organisms must constantly refine and adapt to keep up with the changing environment and the evolution of other organisms.
Principles
Original theory
The "Black Queen" refers to the "Queen of Spades" from the card game Hearts. The goal of Hearts is to end up as the player with the fewest number of points. However, the Queen of Spades is worth the same number of points as all the other cards combined. For this reason, players seek to avoid getting the Queen of Spades. At the same time, one player must end up with the Queen. Similarly, the BQH posits that members of a community will dispense with any functions (or genes) that become dispensable. At the same time, at least on |
https://en.wikipedia.org/wiki/DNA%20barcoding%20in%20diet%20assessment | DNA barcoding in diet assessment is the use of DNA barcoding to analyse the diet of organisms and to further detect and describe their trophic interactions. This approach is based on the identification of consumed species by characterization of the DNA present in dietary samples, e.g. individual food remains, regurgitates, gut and fecal samples, or the homogenized body of the host organism that is the target of the diet study (for example, the whole body of insects).
The DNA sequencing approach to be adopted depends on the diet breadth of the target consumer. For organisms feeding on one or only a few species, traditional Sanger sequencing techniques can be used. For polyphagous species with diet items that are more difficult to identify, all consumed species can be determined using NGS methodology.
The barcode markers utilized for amplification will differ depending on the diet of the target organism. For herbivore diets, the standard DNA barcode loci differ significantly depending on the plant taxonomic level. For identifying plant tissue at the family or genus level, the markers rbcL and the trnL intron are used, while the loci ITS2, matK, and trnH-psbA (a noncoding intergenic spacer) are used to identify diet items to genus and species level. For animal prey, the most broadly used DNA barcode markers for identifying diets are mitochondrial cytochrome c oxidase (COI) and cytochrome b (cytb). When the diet is broad and diverse, DNA metabarcoding is used to identify most of the consumed items.
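The marker choices described above amount to a lookup by diet type and desired taxonomic resolution; the marker names below come from the text, but the table structure and key names are my own illustration.

```python
# Marker names from the article; the dictionary layout itself is illustrative.
BARCODE_MARKERS = {
    ("plant", "family_or_genus"): ["rbcL", "trnL intron"],
    ("plant", "genus_or_species"): ["ITS2", "matK", "trnH-psbA"],
    ("animal", "species"): ["COI", "cytb"],
}

def markers_for(diet_type, resolution):
    """Return candidate barcode loci for a diet item, or [] if none listed."""
    return BARCODE_MARKERS.get((diet_type, resolution), [])

print(markers_for("animal", "species"))  # ['COI', 'cytb']
```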
Advantages
A major benefit of using DNA barcoding in diet assessment is the ability to provide high taxonomic resolution of consumed species. Indeed, when compared to traditional morphological analysis, DNA barcoding enables a more reliable separation of closely related taxa, reducing the observed bias. Moreover, DNA barcoding enables the detection of soft and highly digested items that are not recognisable through morphological identification. For example, arachnids feed on pre-dige |
https://en.wikipedia.org/wiki/List%20of%20Rockchip%20products | This is a list of Rockchip products.
Products
Featured products
RK3399
RK3399 was the flagship SoC of Rockchip, with a dual-core Cortex-A72 and quad-core Cortex-A53 CPU and a Mali-T860MP4 GPU, providing high computing and multimedia performance with rich interfaces and peripherals. Its software supports multiple APIs: OpenGL ES 3.2, Vulkan 1.0, OpenCL 1.1/1.2, and OpenVX 1.0, while its AI interfaces support TensorFlow Lite and the Android NN API.
RK3399 Linux source code and hardware documents are available on GitHub and the Rockchip open-source wiki website.
RK3288
RK3288 is a high-performance IoT platform with a quad-core Cortex-A17 CPU and Mali-T760MP4 GPU, 4K video decoding, and 4K display output. It is applied to products in various industries, including vending machines, commercial displays, medical equipment, gaming, intelligent POS, interactive printers, robots, and industrial computers.
RK3288 Linux source code and hardware documents are available on GitHub and the Rockchip open-source wiki website.
RK3326 & PX30
RK3326 and PX30 were announced in 2018, designed for smart AI solutions. PX30 is a variant of RK3326 targeting the IoT market, supporting dual VOP. Both use Arm's new generation of CPU, the Cortex-A35, and the Mali-G31 GPU. The RK3326 is widely used in handheld consoles designed for emulation.
RK3308
RK3308 is another newly released chipset targeting Smart AI solutions. It is an entry-level chipset aimed at mainstream devices. The chip has multiple audio input interfaces, and greater energy efficiency, featuring an embedded VAD (Voice Activation Detection).
RV1108
The announcement of RV1108 indicated Rockchip's moves to AI/computer vision territory.
With a CEVA DSP embedded, RV1108 powers smart cameras including 360° video cameras, IPCs, drones, car camcorders, sport DVs, VR devices, etc. It has also been deployed for new retail and intelligent marketing applications with integrated algorithms.
RK1808
The RK1808 is Rockchip's first chip with Neural Processing Unit (NPU) for artificial intelligence applications. The RK1808 specifications include:
Dual-core ARM Cortex-A35 CPU
N |
https://en.wikipedia.org/wiki/Spatial%20cloaking | Spatial cloaking is a privacy mechanism that is used to satisfy specific privacy requirements by blurring users’ exact locations into cloaked regions. This technique is usually integrated into applications in various environments to minimize the disclosure of private information when users request location-based service. Since the database server does not receive the accurate location information, a set including the satisfying solution would be sent back to the user. General privacy requirements include K-anonymity, maximum area, and minimum area.
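A minimal sketch of the mechanism (my own illustration, not a published algorithm from the article): starting from a minimum-area square around the querying user, grow the cloaked region until it covers at least k users, so the server only ever sees the region, never the exact location.

```python
def cloak(user_xy, all_users_xy, k, step=1.0, min_half=1.0):
    """Return (center, half_width) of a square region covering >= k users."""
    cx, cy = user_xy
    half = min_half                     # minimum-area requirement
    while True:
        inside = [(x, y) for x, y in all_users_xy
                  if abs(x - cx) <= half and abs(y - cy) <= half]
        if len(inside) >= k:
            return (cx, cy), half       # K-anonymity satisfied
        half += step                    # grow the cloaked region

users = [(0, 0), (2, 1), (5, 5), (9, 9)]
print(cloak((0, 0), users, k=3))  # -> ((0, 0), 5.0)
```

A maximum-area requirement would add a second stopping condition; this sketch only enforces K-anonymity and a minimum area.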
Background
With the emergence and popularity of location-based services, people are getting more personalized services, such as the names and locations of nearby restaurants and gas stations. Receiving these services requires users to send their positions either directly or indirectly to the service provider. A user's location information could be shared more than 5000 times in two weeks. Therefore, this convenience also exposes users' privacy to certain risks, since attackers may illegally identify users' locations and even further exploit their personal information. Continuously tracking users' location has been identified not only as a technical issue but also as a privacy concern. It has been realized that quasi-identifiers, which refer to a set of information attributes, can be used to re-identify the user when linked with some external information. For example, the social security number could be used by adversaries to identify a specific user, and the combined disclosure of birth date, zip code, and gender can uniquely identify a user. Thus, multiple solutions have been proposed to preserve and enhance users' privacy when using location-based services. Among all the proposed mechanisms, spatial cloaking is one that has been widely accepted and revised, and it has thus been integrated into many practical applications.
Location privacy
Location privacy is usually considered falling int |
https://en.wikipedia.org/wiki/Airport%20privacy | Airport privacy involves the right of personal privacy for passengers when it comes to screening procedures, surveillance, and personal data being stored at airports. This practice intertwines airport security measures and privacy, specifically the advancement of security measures following the 9/11 attacks in the United States and other global terrorist attacks. Several terrorist attacks, such as 9/11, have led airports all over the world to adopt new technology such as body and baggage screening, detection dogs, facial recognition, and the use of biometrics in electronic passports. Amid the introduction of new technology and security measures in airports and the growing number of travelers, there has been a rise in privacy risks and concerns.
History of airport policies
Before the 9/11 terrorist attacks, the only security measure in place in U.S. airports was the metal detector. A metal detector's ability to detect only metal weapons made it ineffective against nonmetallic threats such as liquids, sharp objects, or explosives. After the 9/11 terrorist attacks in the United States, the Transportation Security Administration (TSA) increased security measures across airports. Policies were made to prohibit the carry-on of liquids, sharp objects, and explosives. Airlines instructed passengers to arrive 2 hours before their flight's departure if traveling domestically and 3 hours if traveling internationally. After passing through screening, passengers were selected at random for additional screening, including bag checks. After an incident that involved a passenger carrying a bomb in his shoe, security screeners asked passengers to remove their shoes when passing through checkpoints. In February 2002, the TSA officially took over the responsibility for airport security. In 2009, airport security measures were once again shaken when a passenger, now commonly known as the "underwear bomber," smuggled a bomb into the airport facility in his under |
https://en.wikipedia.org/wiki/Microphysiometry | Microphysiometry is the in vitro measurement of the functions and activities of life or of living matter (as organs, tissues, or cells) and of the physical and chemical phenomena involved on a very small (micrometer) scale. The term microphysiometry emerged in the scientific literature at the end of the 1980s.
The primary parameters assessed in microphysiometry comprise pH and the concentration of dissolved oxygen, glucose, and lactic acid, with an emphasis on the first two. Measuring these parameters experimentally in combination with a fluidic system for cell culture maintenance and a defined application of drugs or toxins provides the quantitative output parameters extracellular acidification rates (EAR), oxygen consumption rates (OUR), and rates of glucose consumption or lactate release to characterize the metabolic situation.
Due to the label-free nature of sensor-based measurements, dynamic monitoring of cells or tissues for several days or even longer is feasible. On an extended timescale, a dynamic analysis of a cell’s metabolic response to an experimental treatment can distinguish acute effects (e.g., one hour after a treatment), early effects (e.g., at 24 hours), and delayed, chronic responses (e.g., at 96 hours). As stated by Alajoki et al., "The concept is that it is possible to detect receptor activation and other physiological changes in living cells by monitoring the activity of energy metabolism".
See also
Organ-on-a-chip
References
Biotechnology
Research methods
Laboratory techniques |
https://en.wikipedia.org/wiki/Close-space%20sublimation | Close-space sublimation is a method of producing thin films, especially cadmium telluride photovoltaics, though it is used for other materials such as antimony triselenide. It is a type of physical vapor deposition in which the substrate to be coated and the source material are held close to one another. They are both placed in a vacuum chamber, which is pumped down. The source and substrate are then heated. The source is heated to some fraction of its melting temperature, and the substrate to some lower temperature, e.g. 640 °C and 600 °C, respectively. This causes sublimation of the source, allowing vapors to travel a short distance to the substrate, where they condense, producing a thin film. This short-path diffusion is similar in principle to short-path distillation. Compared to other techniques, it is a relatively insensitive process, and takes as little as 15 minutes for an entire cycle. This makes it a very viable technique for large-scale manufacturing.
References
Thin film deposition
Semiconductor device fabrication |
https://en.wikipedia.org/wiki/2-Methoxybenzaldehyde | 2-Methoxybenzaldehyde is an organic compound with the formula CH3OC6H4CHO. It is also commonly referred to as o-anisaldehyde. As a methylated version of salicylaldehyde, the molecule consists of a benzene ring with adjacent formyl and methoxy groups. It is a colorless solid with a pleasant aroma. The related isomer 4-anisaldehyde is better known, being a commercial flavorant. 2-Anisaldehyde is prepared commercially by formylation of anisole.
References
Flavors
Benzaldehydes |
https://en.wikipedia.org/wiki/Terminology%20of%20alternative%20medicine | Alternative medicine describes any practice which aims to achieve the healing effects of medicine, but which lacks biological plausibility and is untested or untestable. Complementary medicine (CM), complementary and alternative medicine (CAM), integrated medicine or integrative medicine (IM), and holistic medicine are among many rebrandings of the same phenomenon.
Terms for alternative medicine
The terms alternative medicine, complementary medicine, integrative medicine, holistic medicine, natural medicine, unorthodox medicine, fringe medicine, unconventional medicine, and new age medicine are used interchangeably as having the same meaning and are almost synonymous in most contexts.
The meaning of the term "alternative" in the expression "alternative medicine", is not that it is an effective alternative to medical science, although some alternative medicine promoters may use the loose terminology to give the appearance of effectiveness. Loose terminology may also be used to suggest meaning that a dichotomy exists when it does not, e.g., the use of the expressions "Western medicine" and "Eastern medicine" to suggest that the difference is a cultural difference between the Asiatic east and the European west, rather than that the difference is between evidence-based medicine and treatments that do not work.
Complementary or integrative medicine
Complementary medicine (CM) or integrative medicine (IM) is when alternative medicine is used together with functional medical treatment, in a belief that it improves the effect of treatments. For example, acupuncture (piercing the body with needles to influence the flow of a supernatural energy) might be believed to increase the effectiveness or "complement" science-based medicine when used at the same time. Instead, significant drug interactions caused by alternative therapies may make treatments less effective, notably in cancer therapy. Integrative medicine has been described as an attempt to bring pseudoscience int |
https://en.wikipedia.org/wiki/Timed%20comments | Timed comments are a feature offered by some audio and video players and websites where people can add comments associated with specific times in an audio or video. These comments are then displayed in the player when that time is reached while playing the audio or video.
Timed comments differ from annotations, captions, and subtitles in an important respect: they can be added by viewers, not just video creators, and they include the identity of the person adding the comment.
Examples
SoundCloud, an audio distribution platform and music sharing website: Timed comments can be added at a specific minute and second mark in a soundtrack, and are displayed when the track reaches that minute and second mark. Users can see each other's comments.
Viki, a video streaming website that hosts a number of television shows and movies from Korea, Japan, China, and Taiwan. Viki's timed commenting system is one of its distinguishing features.
Viddler, a video platform used for training videos.
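The feature described above is essentially a timestamp-keyed comment store that the player queries during playback. A minimal model (illustrative only, not any site's actual API):

```python
import bisect

class TimedComments:
    """Comments keyed by their offset (in seconds) into the audio or video."""

    def __init__(self):
        self._times = []    # sorted offsets
        self._items = []    # (offset, author, text), kept in the same order

    def add(self, offset, author, text):
        i = bisect.bisect(self._times, offset)
        self._times.insert(i, offset)
        self._items.insert(i, (offset, author, text))

    def between(self, start, end):
        """Comments with start <= offset < end, in time order."""
        lo = bisect.bisect_left(self._times, start)
        hi = bisect.bisect_left(self._times, end)
        return self._items[lo:hi]

tc = TimedComments()
tc.add(75, "alice", "love this drop")
tc.add(12, "bob", "great intro")
print(tc.between(0, 60))  # -> [(12, 'bob', 'great intro')]
```

Keeping the author with each comment reflects the distinction drawn above between timed comments and captions or subtitles.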
References
Streaming media systems |
https://en.wikipedia.org/wiki/Preimplantation%20factor | Preimplantation factor (PIF) is a peptide secreted by trophoblast cells prior to placenta formation in early embryonic development. Human embryos begin to express PIF at the 4-cell stage, with expression increasing by the morula stage and continuing to do so throughout the first trimester. Expression of preimplantation factor in the blastocyst was discovered as an early correlate of the viability of the eventual pregnancy. Preimplantation factor was identified in 1994 by a lymphocyte platelet-binding assay, where it was thought to be an early biomarker of pregnancy. It has a simple primary structure with a short sequence of fifteen amino acids without any known quaternary structure. A synthetic analogue of preimplantation factor (commonly abbreviated in studies as sPIF or PIF*) that has an identical amino acid sequence and mimics the normal biological activity of PIF has been developed and is commonly used in research studies, particularly those that aim to study potential adult therapeutics.
Preimplantation factor acts by paracrine signaling; that is to say trophoblast cells, which collectively form extra-embryonic tissues, secrete it onto the surface of the endometrium. PIF is known to influence many events in the implantation process, the process by which an early embryo implants into the uterine wall. A crucial event in human implantation is when trophoblast cells expressing preimplantation factor invade the uterine wall and found the placenta, an organ that connects maternal blood supply, and along with it, nutrients, to the growing fetus. This requires changes to the histology of the endometrium; a process called decidualisation. Upregulated expression of PIF increases the presence of integrins on the endometrium wall, promoting the embryo's adhesion to the uterine wall. PIF is thought to modulate and facilitate the depth of the trophoblast's invasion into the uterus at physiological doses.
Maternal immune system regulation is also a critical event in implan |
https://en.wikipedia.org/wiki/Crenobacter%20cavernea | Crenobacter cavernea Cave-375 is a gram-negative bacterium that is closely related to the previously described Crenobacter cavernae strain K1W11S-77ᵀ. C. cavernea Cave-375 has not been directly described morphologically; however, the related strain K1W11S-77ᵀ is a "rod-shaped, motile, and strictly aerobic novel bacteria".
Its metabolism has not yet been determined.
C. cavernea Cave-375 was first identified from a water sample coming from a dripping stalactite. This stalactite was located in the Algar do Pena cave in the karst Estremadura Limestone Massif in central western Portugal.
C. cavernea Cave-375 was first isolated and "grown on nutrient agar at 25 degrees Celsius".
Its ecology is not yet known. With the sequencing of the genome of C. cavernea Cave-375, the ecological impact should be able to be identified.
Diversity
C. cavernea Cave-375 belongs to the phylum Proteobacteria, the family Neisseriaceae, and the species Crenobacter cavernea. By comparing the 16S rRNA of the Cave-375 strain to the Crenobacter cavernea species, a 99% similarity value was calculated. When comparing DNA-DNA hybridization using a Genome-to-Genome Distance Calculator, a 62.66% hybridization percentage was found.
Genome
"Genomic DNA was extracted from C. cavernea Cave-375 using an NZY microbial gDNA isolation kit (NZYTech, Portugal)". The whole genome was then sequenced using the whole-genome shotgun sequencing method. With this, "17,325,372 high-quality raw sequences were assembled into 15 contigs with an N50 value of 323,281 and a total genome size of 2,273,143 base pairs (2.9 Mb)". The NCBI Prokaryotic Genome Annotation Pipeline identified a 65.9% GC content and sequences coding for proteins and tRNA. "2,779 protein coding sequences and 63 tRNA sequences" were identified using this method.
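The GC content figure reported above is simply the percentage of G and C bases in the assembled sequence; a minimal sketch on a made-up toy fragment (not from the actual genome):

```python
def gc_content(seq):
    """Percentage of bases in seq that are G or C."""
    seq = seq.upper()
    return 100.0 * sum(base in "GC" for base in seq) / len(seq)

print(gc_content("GCGCATGCGC"))  # 80.0 for this toy fragment
```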
References
Bacteria |
https://en.wikipedia.org/wiki/Overabundant%20species | In biology, an overabundant species is one with an excessive number of individuals, occurring when the normal population density has been exceeded. Increases in animal populations are influenced by a variety of factors, some of which include habitat destruction or augmentation by human activity, the introduction of invasive species, and the reintroduction of threatened species to protected reserves.
Population overabundance can have a negative impact on the environment, and in some cases on the public as well. There are various methods through which populations can be controlled such as hunting, contraception, chemical controls, disease and genetic modification. Overabundant species is an important area of research as it can potentially impact the biodiversity of ecosystems.
Most research studies have examined negative impacts of overabundant species, whereas very few have documented or performed an in-depth examination on positive impacts. As a result, this article focuses on the negative impact of overabundant species.
Definitions
When referring to animals as “overabundant”, various definitions apply. The following classes explore the different associations with overabundance:
The inconvenience of animals in a certain region or area that threatens human livelihood, for example the tropics are considered to contain an overabundant population of the Anopheles mosquito which carries the malaria parasite.
The population density of a preferred species has been reduced by another species, whose population is then considered overabundant; for example, predator populations of lions and hyenas reducing zebra and wildebeest numbers.
A species population within a specific habitat exceeds the carrying capacity, for example national parks reducing herbivore populations to maintain and manage habitat equilibrium.
The entire equilibrium consisting of animal and plant organisations is already out of balance, for example existing populations colonising new habitat.
Out of all |
https://en.wikipedia.org/wiki/Planar%20SAT | In computer science, the planar 3-satisfiability problem (abbreviated PLANAR 3SAT or PL3SAT) is an extension of the classical Boolean 3-satisfiability problem to a planar incidence graph. In other words, it asks whether the variables of a given Boolean formula—whose incidence graph consisting of variables and clauses can be embedded on a plane—can be consistently replaced by the values TRUE or FALSE in such a way that the formula evaluates to TRUE. If this is the case, the formula is called satisfiable. On the other hand, if no such assignment exists, the function expressed by the formula is FALSE for all possible variable assignments and the formula is unsatisfiable. For example, the formula "a AND NOT b" is satisfiable because one can find the values a = TRUE and b = FALSE, which make (a AND NOT b) = TRUE. In contrast, "a AND NOT a" is unsatisfiable.
Like 3SAT, Planar 3SAT is NP-complete, and it is commonly used in reductions.
Definition
Every 3SAT instance can be converted to an incidence graph in the following manner: for every variable x_i, the graph has one corresponding node v_i, and for every clause c_j, the graph has one corresponding node u_j. An edge (v_i, u_j) is created between variable x_i and clause c_j whenever x_i or ¬x_i appears in c_j. Positive and negative literals are distinguished using edge colorings.
The formula is satisfiable if and only if there is a way to assign TRUE or FALSE to each variable node such that every clause node is connected to at least one TRUE by a positive edge or FALSE by a negative edge.
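The graph-based satisfiability condition above can be sketched directly: represent each literal as a "colored" edge (variable, clause index, polarity), and check whether some assignment covers every clause node. A minimal brute-force illustration on a hypothetical toy formula (not from the article):

```python
from itertools import product

# Toy formula: (x1 OR x2 OR NOT x3) AND (NOT x1 OR x2 OR x3).
# Each clause is a list of (variable, polarity) pairs;
# polarity True is a positive literal, False a negated one.
clauses = [
    [(1, True), (2, True), (3, False)],
    [(1, False), (2, True), (3, True)],
]

# Incidence graph: one edge per literal; the polarity plays the
# role of the edge coloring from the definition.
edges = [(v, j, pol) for j, clause in enumerate(clauses)
         for (v, pol) in clause]

def satisfiable(n_vars):
    """A clause node is covered iff it touches a TRUE variable via a
    positive edge or a FALSE variable via a negative edge."""
    for bits in product([False, True], repeat=n_vars):
        assign = {i + 1: b for i, b in enumerate(bits)}
        if all(any(assign[v] == pol for (v, jj, pol) in edges if jj == j)
               for j in range(len(clauses))):
            return True
    return False

print(satisfiable(3))  # -> True (e.g. x2 = TRUE covers both clauses)
```

This exponential search is only for illustrating the definition; the point of the incidence graph is its use in planarity arguments, not in solving.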
A planar graph is a graph that can be drawn in the plane such that no two of its edges cross. Planar 3SAT is the subset of 3SAT instances in which the incidence graph of the variables and clauses of the Boolean formula is planar. It is important because it is a restricted variant that is still NP-complete. Many problems (for example, games and puzzles) can only represent planar graphs; hence, Planar 3SAT provides a way to prove those problems NP-hard.
Proof of NP-c |
https://en.wikipedia.org/wiki/Hot%20potassium%20carbonate | Hot potassium carbonate, HPC, is a method used to remove carbon dioxide from gas mixtures, in some contexts referred to as carbon scrubbing. The inorganic, basic compound potassium carbonate is mixed with a gas mixture, and the liquid absorbs carbon dioxide through chemical processes. The technology is a form of chemical absorption, and was developed for natural gas sweetening (i.e., removal of acidic gases from raw natural gas). Currently it is also considered, among other uses, as a post-combustion capture process, in the contexts of carbon capture and storage and carbon capture and utilization. As a post-combustion CO2 capture process, the technology is planned to be used on full scale on a heat plant in Stockholm from 2025.
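In chemical terms (a standard textbook description, not stated in the article itself), the absorption proceeds by the reversible reaction of potassium carbonate with carbon dioxide and water to form potassium bicarbonate:

K2CO3 + CO2 + H2O ⇌ 2 KHCO3

Heating the CO2-rich solution drives the reaction back to the left, releasing the captured carbon dioxide and regenerating the carbonate solvent for reuse.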
References
Carbon capture and storage
Climate engineering
Scrubbers
Gas technologies |
https://en.wikipedia.org/wiki/Solstar | Solstar Space Co., also known as Solstar, is an American company that provides commercial wireless internet services to space travelers and Internet of Things devices in space. It also provides a two-way internet link connecting people on Earth to things in space. Based out of Santa Fe, New Mexico, the company was founded in March 2017.
History
Solstar was founded by M. Brian Barnett in March 2017, with Michael Potter and Mark Matossian as co-founders. Prior to this, Barnett had developed an initial design of a communication system which was used to successfully transmit the first-ever commercial text message from Earth to space in November 2013, with students from Albuquerque sending 16 messages to a device aboard a UP Aerospace rocket launched from Spaceport America.
In 2017, Solstar received a Phase I small business contract with NASA to develop a preliminary design for a commercial router on the International Space Station, under the Small Business Innovation Research (SBIR) program. The device is intended for low Earth orbit service and was named the Slayton Space Communicator (SC-Slayton) after Mercury astronaut Deke Slayton, NASA's first Chief of the Astronaut Office. The company also signed a Space Act Agreement with NASA to test WiFi technologies in space.
In April 2018, Solstar tested the Schmitt Space Communicator SC-1x, a three-pound device, in a Blue Origin capsule on a New Shepard rocket which was launched from Blue Origin's launch facility near Van Horn, Texas, and reached a height of 66 miles. The test was successful, with the founder Barnett using the on-flight internet connection to send out a tweet. The project's $2 million cost was partly funded by NASA as part of its Flight Opportunities program. The device is named after Harrison Schmitt, one of the last men to walk on the Moon and a Solstar adviser. It conducted a second successful test in July 2018, with the flight reaching a peak height of 73.8 miles above sea level. The device |