source | text |
|---|---|
https://en.wikipedia.org/wiki/Free%20software%20in%20India | The history of Free Software in India can be seen from three different perspectives: the growth of Free Software usage, the growth of Free Software communities, and the adoption of Free Software policies by governments. India was quite late to the free software scene, with adoption and penetration growing towards the end of the 1990s and the formation of pockets of Free Software communities spread across the country. The communities were typically centered around educational institutions or free software supporting organizations.
Communities primarily revolved around support mailing lists. Some of the largest and earliest communities were those based out of Chennai, Delhi, Kochi, Mumbai, Pune, and Trivandrum. The Free Software Foundation of India was formed in 2001 to promote the use and development of free software in India.
Some state governments, notably Kerala and Tamil Nadu, created policies on the use of Free Software in state-level organizations and launched ambitious projects like IT@School and the Elcot OpenSuse migration. The government of India rolled out a policy to adopt open standards and promote open source and open technologies in 2015.
Free software usage in India
Free Software was almost synonymous with Linux and associated software in the past. However, with the growing popularity of Free Software applications like Firefox and operating systems like Android, it has become quite difficult to quantify the user base for Free Software. Looking at the usage of Linux operating systems on desktop computers in India, the market share had increased to 1.8% by July 2016. A report of similar data in 2011 had shown that India was one of the top 20 users of Linux desktops globally. According to the State of FOSS in India report, "India still lags behind the global landscape in building sustainable home-grown projects and needs a strategic plan to incubate and proliferate domestic FOSS innovations worldwide".
Free software communities / maili |
https://en.wikipedia.org/wiki/Erich%20H%C3%BCttenhain | Erich Hüttenhain (26 January 1905 in Siegen – 1 December 1990 in Brühl) was a German academic mathematician and cryptographer, considered a leading cryptanalyst in the Third Reich. He was Head of the cryptanalysis unit at OKW/Chi, the Cipher Department of the High Command of the Wehrmacht.
Life and work
Hüttenhain was the son of a Conrector. After completing his high school diploma in Siegen in 1924, he studied at the University of Marburg, the Johann Wolfgang Goethe-University Frankfurt and the University of Münster. He studied mathematics with Heinrich Behnke and astronomy at Münster. There he was assistant to Martin Lindow
(1880–1967), who was director of the observatory at Münster. In 1933, at the University of Münster, he took his doctoral examination (Dr. phil.) in astronomy under Lindow with a thesis titled Spatial infinitesimal orbits around the libration points in the straight-line case of the (3 + 1) bodies. In 1936, he was sent to the cipher bureau of the OKW, OKW/Chi, under Director Min.Rat. Wilhelm Fenner. Hüttenhain had an interest in Mayan chronology, which led him to cryptology and thus to OKW/Chi. As a recruitment test, Fenner sent him a message that had been enciphered with a private cipher; Hüttenhain duly deciphered it and was accepted as a prospective cryptanalyst. At OKW/Chi he was employed as a specialist to build a cryptanalytic research unit, and latterly was Head of Group IV, analytical cryptanalysis.
During his time in OKW/Chi he succeeded, among other things, in deciphering the Japanese Purple cipher machine. He and his staff also temporarily succeeded in deciphering American cipher systems, such as the M-138-A and the M-209 in North Africa. Later in the war, when the Allies invaded Italy, the Allies learned in turn, by deciphering Italian ciphers, that their later systems, among others the SIGABA designed by William Frederick Friedman, had not been broken and aro |
https://en.wikipedia.org/wiki/Mycofactocin | Mycofactocin (MFT) is a family of small molecules derived from a peptide of the type known as RiPP (ribosomally synthesized and post-translationally modified peptides), naturally occurring in many types of Mycobacterium. It was discovered in a bioinformatics study in 2011. All mycofactocins share a precursor in the form of premycofactocin (PMFT); they differ by the cellulose tail added. Being redox active, both PMFT and MFT have an oxidized dione (mycofactocinone) form and a reduced diol (mycofactocinol) form, respectively termed PMFTH2 and MFTH2.
Name
The name "mycofactocin" is derived from three words, the genus name "Mycobacterium" (across which it is nearly universal), "cofactor" because its presence in a genome predicts the co-occurrence of certain families of enzymes as if it is a cofactor they require, and "bacteriocin" because a radical SAM enzyme critical to its biosynthesis, MftC, is closely related to the key enzyme for the biosynthesis of subtilosin A, a bacteriocin, from its precursor peptide.
Nomenclature
An MFT with a glucose tail of n units is termed MFT-n; MFT-nH2 in the reduced form. An MFT bearing a 2-O-methylglucose in its tail is termed a methylmycofactocin (MMFT), with analogous numbering.
Function
Mycofactocin is thought to play a role in redox pathways involving nicotinoproteins, enzymes with non-exchangeable bound nicotinamide adenine dinucleotide (NAD). This notion comes largely from comparative genomics work that highlighted the many parallels between mycofactocin and pyrroloquinoline quinone (PQQ). In both cases, maturation of the RiPP requires post-translational modification of a precursor peptide by a radical SAM enzyme, the system appears in very similar form in large numbers of species, the product appears to be used within the cell rather than exported, and several families of enzymes occur exclusively in bacteria with those systems. The number of putatively mycofactocin-dependent oxidoreductases encoded by a single genome can be quite larg |
https://en.wikipedia.org/wiki/Serverless%20computing | Serverless computing is a cloud computing execution model in which the cloud provider allocates machine resources on demand, taking care of the servers on behalf of their customers. "Serverless" is a misnomer in the sense that servers are still used by cloud service providers to execute code for developers. However, developers of serverless applications are not concerned with capacity planning, configuration, management, maintenance, fault tolerance, or scaling of containers, VMs, or physical servers. Serverless computing does not hold resources in volatile memory; computing is rather done in short bursts with the results persisted to storage. When an app is not in use, there are no computing resources allocated to the app. Pricing is based on the actual amount of resources consumed by an application. It can be a form of utility computing.
Serverless computing can simplify the process of deploying code into production. Serverless code can be used in conjunction with code deployed in traditional styles, such as microservices or monoliths. Alternatively, applications can be written to be purely serverless and use no provisioned servers at all. This should not be confused with computing or networking models that do not require an actual server to function, such as peer-to-peer (P2P).
Serverless runtimes
Serverless vendors offer compute runtimes, also known as Function as a Service (FaaS) platforms, which execute application logic but do not store data. Common languages supported by serverless runtimes are Java, Python and PHP. Generally, the functions run under isolation boundaries, such as Linux containers.
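To make the FaaS model concrete, the sketch below shows a minimal Python function written in the style expected by an AWS Lambda-like Python runtime (a handler taking event and context arguments). The event shape and the greet helper are illustrative assumptions, not part of any particular platform's documentation.

```python
import json

def greet(name: str) -> str:
    """Pure application logic; the platform only ever calls the handler below."""
    return f"Hello, {name}!"

def handler(event, context):
    """Entry point invoked by a FaaS platform (AWS Lambda-style signature).

    The platform provisions an isolated execution environment (for example a
    Linux container), calls this function per request, and may freeze or discard
    the environment afterwards, so no local state should be relied on.
    """
    # 'event' carries the request payload; its shape depends on the trigger
    # (HTTP gateway, queue message, etc.). Here we assume a simple JSON object.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": greet(name)}),
    }
```

Billing in such a model is typically per invocation and per unit of execution time, which is what allows no compute resources to be held while the function is idle.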
Commercial offerings
The first "pay as you go" code execution platform was Zimki, released in 2006, but it was not commercially successful. In 2008, Google released Google App Engine, which featured metered billing for applications that used a custom Python framework, but could not execute arbitrary code. PiCloud, released in 2010, offered FaaS support f |
https://en.wikipedia.org/wiki/Social%20immunity | Social immunity is any antiparasite defence mounted for the benefit of individuals other than the actor. For parasites, the frequent contact, high population density and low genetic variability make social groups of organisms a promising target for infection: this has driven the evolution of collective and cooperative anti-parasite mechanisms that both prevent the establishment of and reduce the damage of diseases among group members. Social immune mechanisms range from the prophylactic, such as burying beetles smearing their carcasses with antimicrobials or termites fumigating their nests with naphthalene, to the active defenses seen in the imprisoning of parasitic beetles by honeybees or by the miniature 'hitchhiking' leafcutter ants which travel on larger workers' leaves to fight off parasitoid flies. Whilst many specific social immune mechanisms had been studied in relative isolation (e.g. the "collective medication" of wood ants), it was not until Sylvia Cremer et al.'s 2007 paper "Social Immunity" that the topic was seriously considered. Empirical and theoretical work in social immunity continues to reveal not only new mechanisms of protection but also implications for our understanding of the evolution of group living and polyandry.
Social immunity (also termed collective immunity) describes the additional level of disease protection arising in social groups from collective disease defences, performed either jointly or towards one another. These collective defences complement the individual immunity of all group members and constitute an extra layer of protection at the group level, combining behavioural, physiological and organisational adaptations. These defences can be employed either prophylactically or on demand.
Definition
Sylvia Cremer defined social immunity in her seminal 2007 Current Biology paper 'Social Immunity' as the "collective action or altruistic behaviours of infected individuals that benefit the colony". She laid out a conceptual framewor |
https://en.wikipedia.org/wiki/Amazon%20ElastiCache | Amazon ElastiCache is a fully managed in-memory data store and cache service by Amazon Web Services (AWS). The service improves the performance of web applications by retrieving information from managed in-memory caches, instead of relying entirely on slower disk-based databases. ElastiCache supports two open-source in-memory caching engines: Memcached and Redis (also called "ElastiCache for Redis").
As a web service running in the computing cloud, Amazon ElastiCache is designed to simplify the setup, operation, and scaling of memcached and Redis deployments. Complex administration processes like patching software, backing up and restoring data sets and dynamically adding or removing capabilities are managed automatically. Scaling ElastiCache resources can be performed by a single API call.
Amazon ElastiCache was first released on August 22, 2011, supporting memcached. This was followed by support for reserved instances on April 5, 2012 and Redis on September 4, 2013.
Uses
As a managed database service with multiple supported engines, Amazon ElastiCache has a wide range of uses, including
Performance acceleration
Database limitations are often a bottleneck for application performance. By placing Amazon ElastiCache between an application and its database tier, database operations can be accelerated.
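A common way to place a cache "between an application and its database tier" is the cache-aside pattern. The sketch below is a minimal illustration using the redis Python client; the endpoint, key naming and the query_database function are assumptions made for the example, not ElastiCache-specific APIs (an ElastiCache for Redis node is addressed like any other Redis endpoint).

```python
import json
import redis

# Placeholder endpoint: an ElastiCache for Redis node is reached the same way
# as any Redis server, via its hostname and port.
cache = redis.Redis(host="my-cache.example.internal", port=6379, decode_responses=True)

def query_database(user_id: int) -> dict:
    """Placeholder for a slow disk-based database lookup."""
    return {"id": user_id, "name": "example"}

def get_user(user_id: int, ttl_seconds: int = 300) -> dict:
    """Cache-aside read: try the in-memory cache first, fall back to the database."""
    key = f"user:{user_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)                       # cache hit: no database round trip
    user = query_database(user_id)                      # cache miss: query the database...
    cache.set(key, json.dumps(user), ex=ttl_seconds)    # ...and populate the cache with a TTL
    return user
```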
Cost reduction
Using ElastiCache for database performance acceleration can significantly reduce the infrastructure needed to support the database. In many cases, the cost savings outweigh the cache costs. Expedia was able to use ElastiCache to reduce provisioned DynamoDB capacity by 90%, reducing total database cost by 6x.
Processing time series data
Using the Redis engine, ElastiCache can rapidly process time-series data, quickly selecting the newest or oldest records or events within a range around a point in time.
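One Redis feature that supports this is the sorted set, where the score is a timestamp, so selecting records by time range is a single command. The sketch below uses the redis Python client; the key name and event identifiers are illustrative assumptions.

```python
import time
from typing import Optional

import redis

r = redis.Redis(decode_responses=True)  # placeholder endpoint; an ElastiCache Redis node works the same way

def record_event(event_id: str, timestamp: Optional[float] = None) -> None:
    """Store an event keyed by time; the sorted-set score is the timestamp."""
    r.zadd("events", {event_id: time.time() if timestamp is None else timestamp})

def events_between(start: float, end: float):
    """Select all events whose timestamps fall within [start, end]."""
    return r.zrangebyscore("events", start, end)

def latest_events(n: int = 10):
    """Return the newest n events (highest timestamps first)."""
    return r.zrevrange("events", 0, n - 1)
```

The same sorted-set structure, with a score per player instead of a timestamp, is what makes the leaderboard use case described next a natural fit for Redis.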
Leaderboards
Leaderboards are an effective way to show a user quickly where they currently stand within a gamified system. For systems with large numbers of gam |
https://en.wikipedia.org/wiki/Colm%20Mulcahy | Colm Mulcahy (born September 1958) is an Irish mathematician, academic, columnist, book author, public outreach speaker, and amateur magician. He is Professor Emeritus at Spelman College, where he was on the faculty from 1988 to 2020. In addition to algebra, number theory, and geometry, his interests include mathemagical card magic and the culture of mathematics–particularly the contributions of Irish mathematicians and also the works of iconic mathematics writer Martin Gardner. He has blogged for the Mathematical Association of America, The Huffington Post, Scientific American, and (aperiodically) for The Aperiodical; his puzzles have been featured in The New York Times. Mulcahy serves on the Advisory Council of the Museum of Mathematics in New York City. As of January 2021, he is Chair of Gathering 4 Gardner, Inc. He is the creator and curator of the Annals of Irish Mathematics and Mathematicians.
Education and career
Mulcahy got his BSc and MSc in mathematical science at University College Dublin in 1978 and 1979, and a PhD from Cornell University in 1985 where his advisor was Alex F. T. W. Rosenberg. From 1988 to 2020, he taught mathematics at Spelman College, in Atlanta, Georgia, and is now Professor Emeritus. He served as chair of the department there from 2003 to 2006 and recently created the Archive of Spelman Mathematicians. In 1997 he received the MAA's Allendoerfer Award for excellence in expository writing for a paper on the basics of wavelet image compression. In 2014 he was one of the organizers of Mathematics Awareness Month. An article he co-authored, on the centennial of Martin Gardner, was featured in the book, The Best Writing on Mathematics 2015.
Mulcahy has an Erdős number of 2 as a result of a collaboration with Neil Calkin.
Card magic
Mulcahy is recognised as an authority on the mathematical principles and effects underlying card tricks. From 2004 to 2014 he authored Card Colm, a column about mathematics and magic–especially card magic–fo |
https://en.wikipedia.org/wiki/Ricardian%20contract | The Ricardian contract, as invented by Ian Grigg in 1996, is a method of recording a document as a contract at law, and linking it securely to other systems, such as accounting, for the contract as an issuance of value. It is robust through use of identification by cryptographic hash function, transparent through use of readable text for legal prose and efficient through markup language to extract essential information.
A Ricardian contract places the defining elements of a legal agreement in a format that can be expressed and executed in software.
The method arises out of the work that Ian Grigg completed in the mid-1990s through contributions to Ricardo, a system of asset transfers that was built in 1995–1996 by Systemics and included the pattern.
Definition
Diagram
The Ricardian contract separates the agreement of parties across time and domain. On the left of the "Bowtie" representation, the negotiation and formation of a legally binding contract leads to a single parent document that defines all of the intent of that agreement. On the right, the performance of that agreement might involve many transactions to be accounted for, logically separated from the meaning of the issue. The join between the legal world and the accounting world is formed by the hash — each transaction locks in the terms and conditions of the precise deal of the parties by including the hash of the contract in every relevant transaction record, yet the operation of the transactions and the issuance of the contract are cleanly separated and thus perverse incentives are eliminated.
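As a minimal illustration of that hash linkage (a sketch with assumed field names and hash choice, not the actual Ricardo format): the contract text is hashed once, and every accounting record carries that digest, tying each transaction back to the exact terms that were agreed.

```python
import hashlib
import json

contract_text = """Issuer: Example Pty Ltd
Unit: one gram of gold, redeemable on demand
Signed: -----BEGIN PGP SIGNATURE----- ... -----END PGP SIGNATURE-----
"""

# The contract is identified by the hash of its canonical text (SHA-256 assumed here).
contract_id = hashlib.sha256(contract_text.encode("utf-8")).hexdigest()

def make_transaction(payer: str, payee: str, amount: int) -> dict:
    """Each accounting record cites the contract hash, linking the performance
    side of the 'bowtie' to the legal prose without copying it."""
    return {
        "contract_hash": contract_id,
        "payer": payer,
        "payee": payee,
        "amount": amount,
    }

tx = make_transaction("alice", "bob", 5)
print(json.dumps(tx, indent=2))
```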
Legal relationship
The role of the Ricardian contract is to capture the contractual relationship between contracting parties to assist later performance of that contract by programs. In its contractual form, it is the recording of an offer from an issuer to a holder. The offer is signed digitally within the format by the offerer, typically using a plaintext digital signature such as provided by OpenPGP.
The acce |
https://en.wikipedia.org/wiki/Mathematics%20mastery | Mathematics mastery is an approach to mathematics education which is based on mastery learning in which most students are expected to achieve a high level of competence before progressing. This technique is used in countries such as China and Singapore where good results have been achieved and so the approach is now being promoted in the UK by people such as schools minister Nick Gibb. Chinese teachers were brought to the UK to demonstrate the Shanghai mastery approach in 2015. A trial was made in the UK with about 10,000 students of ages 5–6 and 11–12. In one year, test scores indicated that the students were about a month ahead of students in schools using other approaches. This result was considered small but significant.
Mathematics mastery is a new way of thinking and teaching, where the whole class moves through content at the same pace and students are given time to think deeply about the maths. The methodology builds self-confidence in learners and differentiates through depth rather than acceleration.
References
External links
Mathematics Mastery – a programme of the Ark charity to support and encourage this approach
Maths Mastery Guide for primary school education - a deep dive
Mathematics education |
https://en.wikipedia.org/wiki/Carbon%20nanotubes%20in%20interconnects | In nanotechnology, carbon nanotube interconnects refer to the proposed use of carbon nanotubes in the interconnects between the elements of an integrated circuit. Carbon nanotubes (CNTs) can be thought of as single atomic layer graphite sheets rolled up to form seamless cylinders. Depending on the direction in which they are rolled, CNTs can be semiconducting or metallic. Metallic carbon nanotubes have been identified as a possible interconnect material for future technology generations and as a replacement for copper interconnects. Electron transport can extend over long nanotube lengths, 1 μm, enabling CNTs to carry very high currents (i.e. up to a current density of 10⁹ A·cm⁻²) with essentially no heating, due to their nearly one-dimensional electronic structure. Despite the current saturation in CNTs at high fields, the mitigation of such effects is possible due to encapsulated nanowires.
Carbon nanotubes for interconnect applications in integrated chips have been studied since 2001; however, the extremely attractive performance of individual tubes is difficult to reach when they are assembled in the large bundles necessary to make real vias or lines in integrated chips. Two approaches proposed to overcome the limitations to date are either to make the very tiny local connections that will be needed in future advanced chips or to make carbon–metal composite structures that will be compatible with existing microelectronic processes.
Hybrid interconnects that employ CNT vias in tandem with copper interconnects may offer advantages in reliability and thermal management. In 2016, the European Union funded a four million euro project over three years to evaluate the manufacturability and performance of composite interconnects employing both CNT and copper interconnects. The project, named CONNECT (CarbON Nanotube compositE InterconneCTs), involves the joint efforts of seven European research and industry partners on fabrication techniques and processes to enable reliable carbon nanotubes for |
https://en.wikipedia.org/wiki/Serafim%20Batzoglou | Serafim Batzoglou is Chief Data Officer at Seer Inc. Prior to that he was Chief Data Officer at insitro, VP of computational genomics at Illumina, and professor of computer science at Stanford University between 2001 and 2016. His lab focused on computational genomics with special interest in developing algorithms, machine learning methods, and systems for the analysis of large scale genomic data. He has also been involved with the Human Genome Project and ENCODE.
Background
Batzoglou did his undergraduate studies at MIT and obtained his PhD in Computer Science from MIT in 2000 under the supervision of Bonnie Berger.
Awards
ISCB Fellow (2020)
ISCB Innovator Award (2016)
Sloan Research Fellowship, Alfred P. Sloan Foundation
Career Award in Computer Science, National Science Foundation
Top 100 Young Technology Innovators, MIT Technology Review
Best Paper Award, ISMB (2003)
References
Living people
Stanford University faculty
Massachusetts Institute of Technology alumni
Greek computer scientists
Human Genome Project scientists
Year of birth missing (living people) |
https://en.wikipedia.org/wiki/RepRap%20Ormerod | The RepRap Ormerod is an open-source fused deposition modeling 3D printer and is part of the RepRap project. The RepRap Ormerod is named after the English entomologist Eleanor Anne Ormerod and was designed by RepRapPro. There have been two versions of the Ormerod: the Ormerod 1, released in December 2013, and the Ormerod 2, released in December 2014.
The RepRap Ormerod has a 200 mm × 200 mm × 200 mm build volume and uses a Bowden extruder; it also has a micro SD card slot and USB and Ethernet connections allowing it to be connected to a network. The printer was praised for the simplicity of construction and its low cost.
See also
RepRap Fisher
Prusa i3
References
External links
RepRap Ormerod page on RepRap.org
Open Source repository on Github by RepRapPro
Open hardware electronic devices
3D printers
RepRap project
2013 introductions |
https://en.wikipedia.org/wiki/Methyl%20green | Methyl green (CI 42585) is a cationic or positive charged stain related to Ethyl Green that has been used for staining DNA since the 19th century. It has been used for staining cell nuclei either as a part of the classical Unna-Pappenheim stain or as a nuclear counterstain ever since.
In recent years, its fluorescent properties, when bound to DNA, have positioned it as useful for far-red imaging of live cell nuclei.
Fluorescent DNA staining is routinely used in cancer prognosis.
Methyl green also emerges as an alternative stain for DNA in agarose gels, fluorometric assays, and flow cytometry. It has also been shown that it can be used as an exclusion viability stain for cells.
Its interaction with DNA has been shown to be non-intercalating, in other words, not inserting itself into the DNA, but instead electrostatic with the DNA major groove. It is used in combination with pyronin in the methyl green–pyronin stain, which stains and differentiates DNA and RNA.
When excited at 244 or 388 nm in a neutral aqueous solution, methyl green produces a fluorescent emission at 488 or 633 nm, respectively. The presence or absence of DNA does not affect these fluorescence behaviors. When binding DNA under neutral aqueous conditions, methyl green also becomes fluorescent in the far red with an excitation maximum of 633 nm and an emission maximum of 677 nm.
Commercial methyl green preparations are often contaminated with crystal violet, which can be removed by chloroform extraction.
References
Biological techniques and tools
Staining dyes
Triarylmethane dyes |
https://en.wikipedia.org/wiki/Vicnet | Vicnet (Victoria's Network) was a business unit of the State Library of Victoria, Australia operating between 1994 and 2014. It was an early Australian internet service provider that provided website space and training. It was Australia's largest web host for community organisations and projects such as Skills.Net and Libraries Online. The State Library of Victoria closed Vicnet on 31 January 2014.
History
The State Library of Victoria and the Royal Melbourne Institute of Technology (RMIT) established a joint project to build a web-based publishing service and internet access provider for community organisations in 1993.
Vicnet worked with the State and federal government, private providers, the Victorian public library network and community based organisations across Victoria to address Digital Divide issues. Through a range of ICT programs Vicnet drove the roll out of public access internet points across Victoria and in the process connected every library in Victoria to the Internet for public access. To facilitate access, Vicnet staff delivered extensive training/community development programs across Victoria through government funded programs such as the Skills.net program (a program that was responsible for training more than 100,000 Victorians).
Additionally, Vicnet developed an online publication platform and an extensive web directory for community and other organizations, as well as for members of the general population. Among many hundreds of sites, Vicnet published, and provided training in editing, the first websites for the Melbourne International Comedy Festival, the Melbourne Formula One Grand Prix, the Indigenous Flora and Fauna Society, the Council on the Ageing (Victoria), The Age newspaper, and the Victorian Department of Premier and Cabinet.
With government assistance from Multimedia Victoria, Vicnet provided internet access to regional Victorian communities then out of the reach of any internet service, such as Mallacoota and Apollo Bay. With other ear |
https://en.wikipedia.org/wiki/Magic%20hexagram | A magic hexagram of order 2 is an arrangement of numbers in a hexagram with triangular cells with 2 cells on each edge, in such a way that the numbers in each row, in all three directions, sum to the same magic constant M.
Magic star hexagram
Magic star hexagram or 6-pointed magic star is a star polygon with Schläfli symbol {6/2} in which numbers are placed at each of the six vertices and six intersections, such that the four numbers on each line sum to the same magic constant.
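As a concrete illustration, the sketch below searches for such labelings under the usual assumption that the twelve positions receive the numbers 1 through 12, which forces the magic constant to be 2·78/6 = 26 (each number lies on exactly two of the six lines). The tip/inner-point indexing is just one way of encoding the star's incidence structure.

```python
from itertools import permutations

# Outer tips are t[0..5] in cyclic order; inner intersection points are p[0..5],
# with p[k] lying between tips t[k] and t[k+1]. Line k of the {6/2} star then
# contains t[k], p[k], p[k+1] and t[k+2] (indices mod 6).
MAGIC = 26  # 1..12 sum to 78 and each number lies on exactly 2 of the 6 lines: 2*78/6 = 26

def solutions():
    """Yield (tips, inner_points) labelings of 1..12 making every line sum to 26."""
    for t in permutations(range(1, 13), 6):          # choose the six tip values
        remaining = set(range(1, 13)) - set(t)
        for p0 in remaining:                          # one free inner point...
            p = [p0, 0, 0, 0, 0, 0]
            used = set(t) | {p0}
            ok = True
            for k in range(5):                        # ...the rest are forced by lines 0..4
                nxt = MAGIC - t[k] - t[(k + 2) % 6] - p[k]
                if not 1 <= nxt <= 12 or nxt in used:
                    ok = False
                    break
                p[k + 1] = nxt
                used.add(nxt)
            # line 5 (t[5], p[5], p[0], t[1]) is the last remaining constraint
            if ok and t[5] + p[5] + p[0] + t[1] == MAGIC:
                yield list(t), p

# Finding the first labeling takes at most a few seconds of brute force.
tips, inner = next(solutions())
print("tips:", tips, "inner points:", inner)
```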
Magic star hexagram with triangular cell
There are two solutions of the magic star hexagram with 12 triangular cells.
Magic star hexagram with more than 12 vertices
Harold Reiter and David Ritchie calculated the solution of magic hexagrams with 19 vertices.
See also
Magic square
Magic hexagon
References
External links
The Magic Hexagram
A Complete Solution to the Magic Hexagram Problem
Magic shapes
Star symbols |
https://en.wikipedia.org/wiki/How%20Not%20to%20Be%20Wrong | How Not to Be Wrong: The Power of Mathematical Thinking, written by Jordan Ellenberg, is a New York Times Best Selling book that connects various economic and societal philosophies with basic mathematics and statistical principles.
Summary
How Not to Be Wrong explains the mathematics behind some of the simplest day-to-day thinking. It then goes into more complex decisions people make. For example, Ellenberg explains many misconceptions about lotteries and whether or not they can be mathematically beaten.
Ellenberg uses mathematics to examine real-world issues ranging from the love of straight lines in the reporting of obesity to the game theory of missing flights, from the relevance to digestion of regression to the mean to the counter-intuitive Berkson's paradox.
Chapter summaries
Part 1: Linearity
Chapter 1, Less Like Sweden: Ellenberg encourages his readers to think nonlinearly, and know that "where you should go depends on where you are". To develop his thought, he relates this to Voodoo economics and the Laffer curve of taxation. Although there are few numbers in this chapter, the point is that the overall concept still ties back to mathematical thinking.
Chapter 2, Straight Locally, Curved Globally: This chapter puts an emphasis on recognizing that "not every curve is a straight line", and makes reference to multiple mathematical concepts including the Pythagorean theorem, the derivation of Pi, Zeno's paradox, and non-standard analysis.
Chapter 3, Everyone is Obese: Here, Ellenberg dissects some common statistics about Obesity trends in the United States. He ties it into linear regression, and points out basic contradictions made by the original arguments presented. He uses many examples to make his point, including the correlation between SAT scores and tuition rates, as well as the trajectory of missiles.
Chapter 4, How Many Is That In Dead Americans: Ellenberg analyzes statistics about the number of casualties around the world in different countries |
https://en.wikipedia.org/wiki/Dragon%20silk | Dragon silk is a material created by Kraig Biocraft Laboratories of Ann Arbor, Michigan from genetically modified silkworms to create body armor. Dragon silk combines the elasticity and strength of spider silk. Its tensile strength is reported to be as high as 1.79 gigapascals (as much as 37% higher) and its elasticity above 38%, exceeding the maximum reported properties of spider silk. Dragon silk is reported to be more flexible than Monster silk and stronger than "Big Red", a recombinant spider silk designed for increased strength.
Properties
Mechanical properties
Dragon silk has mechanical properties higher than those of any other fiber yet reported.
Tensile Strength
In comparison, Dragon silk's tensile strength is higher than that of steel (450–2000 MPa). One report states that the strength of Dragon silk is as high as 1.79 GPa, which is 37% higher than widely reported values for spider silk. Its tensile strength is also higher than that of "Big Red" silk, which had been reported as the strongest fiber ever made. "Big Red" silk was developed in the same laboratories as Dragon silk.
Flexibility
Dragon silk is far more flexible than Kevlar (the material used by the US Army to develop body armor). Its flexibility is 38% higher than that of normal spider silk, and it is noticeably more flexible than the "Monster silk" from the same lab. In percentage terms, Kevlar's flexibility is 3%, while Dragon silk's flexibility is 30% to 40%.
History
In 2010, scientists discovered the first spider silk, which was a great achievement, as it is one of the strongest natural fibers. The problem, however, was that spiders are cannibalistic and territorial, so it is impossible to create a cost-effective spider farm. To overcome this problem, scientists at Kraig Labs developed a method for making spider silk from silkworms. In 2011, Malcolm J. Fraser, Donald L. Jarvis and their colleagues published a study in which they described how they removed the silkworm's silk-making protein and replaced it with the spider's protein to build unique |
https://en.wikipedia.org/wiki/Timeline%20of%20GitHub | This is a timeline of GitHub, a web-based Git or version control repository and Internet hosting service.
Big picture
Full timeline
See also
Censorship of GitHub
Timeline of social media
Timeline of online food delivery
Timeline of online advertising
References
GitHub |
https://en.wikipedia.org/wiki/Marine%20thruster | A marine thruster is a device for producing directed hydrodynamic thrust mounted on a marine vehicle, primarily for maneuvering or propulsion. There are a variety of different types of marine thrusters and each of them plays a role in the maritime industry. Marine thrusters come in many different shapes and sizes, for example screw propellers, Voith-Schneider propellers, waterjets, ducted propellers, tunnel bow and stern thrusters, azimuth thrusters, rim-driven thrusters, and ROV and submersible drive units. A marine thruster consists of a propeller or impeller which may be encased in some kind of tunnel or ducting that directs the flow of water to produce a resultant force intended to obtain movement in the desired direction or resist forces which would cause unwanted movement. The two subcategories of marine thrusters are maneuvering and propulsion thrusters, with maneuvering thrusters typically taking the form of bow or stern thrusters and propulsion thrusters ranging from azimuth thrusters to rim-drive thrusters.
Positioning Thrusters
Positioning thrusters come in two applications: bow thrusters at the forward end of the vessel, and stern thrusters mounted aft on the boat. Their purpose is to maneuver or position the boat with greater precision than the propulsion device can accomplish. Their positioning along the length of the vessel allows for directed lateral thrust ahead and astern of the centre of lateral resistance so that the vessel may be maneuvered away from obstructions in its path, or towards a desired position, especially when coming to or away from a dock. These positioning thrusters are usually significantly smaller than the main propulsion thrusters because they only have to make small adjustments rather than moving the whole vessel at speed. Both bow and stern thrusters may be housed in through-hull tunnels. Depending on the size of the motors driving these propellers, they could draw an insignificant amount of power or a large amount of power that requi |
https://en.wikipedia.org/wiki/FUEL%20Project | The FUEL Project aims to solve the problem of inconsistency and lack of standardization in software translation across platforms. The project develops content especially for language users, working as an interpreter: it localizes the language of technology and maintains glossaries of keywords that are accessible to the user. The FUEL Project currently works in 40 different languages worldwide, helping users translate and understand computer terminology in their own native languages. Additionally, the content and glossaries are provided as open source and are completely free for users to access.
History
The FUEL Project was launched in 2008, initiated by Red Hat. It was started to create a desktop for Hindi-speaking users, inspired by the idea of enabling a Hindi user to understand the language of technology. A glossary of terms and commands, a style guide in that language, and other content were generated, standardizing the translation work. This effort led other language communities to also work with the FUEL Project. The FUEL Project has organized several language community workshops with the help of local communities, language academies, university language departments, and important organizations and bodies such as Red Hat, C-DAC, and the Wikimedia Foundation.
Need
It is widely seen that many users are not able to understand computer terminology, so those terminologies need to be standardized for them, making technology talk in the language they want it to. With each passing day, society drifts further towards a digital society where things are done online: government services, online exams, filling in forms for different purposes, and so on. The digital platform is considered the easiest method, but for some, understanding it is not easy. Making such users understand what each term or command means, in a language they understand, gave birth to the FUEL Project. So for |
https://en.wikipedia.org/wiki/Libraries.io | Libraries.io is an open source web service that lists software development project dependencies and alerts developers to new versions of the software libraries they are using.
Libraries.io is written by Andrew Nesbitt, who has also used the code as the basis for DependencyCI, a service that tests project dependencies. A key feature is that the service checks for software license compliance.
As of 17 April 2022, the web service monitors 6,921,905 open source libraries and supports 32 different package managers. To gather the information on libraries, it uses the dominant package manager for each programming language that is supported. The website organizes them by programming language, package manager, license (such as GPL or MIT), and by keyword.
On November 14, 2017, Libraries.io announced its acquisition by Tidelift, an open-source software support company, with an intention to continue to develop and operate the service.
The code that runs the web service is available on GitHub under the GNU Affero General Public License.
External links
Libraries.io source code
References
Free software websites
Software metrics
Code search engines
Internet properties established in 2015 |
https://en.wikipedia.org/wiki/Olimp%C3%ADada%20de%20Matem%C3%A1tica%20do%20Grande%20ABC | The Olimpíada de Matemática do Grande ABC (English: Grande ABC Mathematical Olympiad), or OMABC, is a mathematical competition for pre-collegiate Brazilian students of the Grande ABC region, composed of the following cities:
Santo André
São Caetano do Sul
São Bernardo do Campo
Diadema
Mauá
Ribeirão Pires
Rio Grande da Serra
The Faculdade de Ciências Exatas e Tecnológicas da Universidade Metodista de São Paulo is the main organizer of this event, creating and grading the tests. The main purpose of this olympiad is to improve mathematical knowledge, encouraging study and research in scientific areas and helping students to participate in national mathematical competitions such as the Olimpíada Brasileira de Matemática das Escolas Públicas and the Olimpíada Brasileira de Matemática. The first edition was held in 2004.
Awards
Students
The participants are ranked based on their individual scores. Medals are awarded to the highest ranked participants, such that slightly less than half of them receive a medal. Subsequently, the cutoffs (minimum scores required to receive a gold, silver or bronze medal respectively) are chosen such that the ratio of gold to silver to bronze medals awarded approximates 1 : 2 : 3.
Gold medal
Silver medal
Bronze medal
Schools
Special prizes are awarded to the schools:
Trophy: for schools whose students received at least one gold medal.
Honorable Mention: for schools where at least one student received an award.
Champions of OMABC
Schools awarded with trophies
References
External links
OMABC - Olimpíada de Matemática do Grande ABC
UMESP - Universidade Metodista de São Paulo
IMPA - Instituto de Matemática Pura e Aplicada
Mathematics competitions |
https://en.wikipedia.org/wiki/Horizon%20of%20predictability | A horizon of predictability is the point after which a dynamical system becomes unpredictable given initial conditions. This includes
Cauchy horizon
Lyapunov exponent
Lyapunov time
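For a chaotic system, this horizon can be estimated from the largest Lyapunov exponent λ: an initial uncertainty δ₀ grows roughly like δ₀·e^(λt), so predictions stay useful only until the error reaches some tolerance Δ. A standard textbook estimate (the symbols here are the conventional ones, not taken from this article) is

```latex
T_{\text{predict}} \approx \frac{1}{\lambda} \ln\frac{\Delta}{\delta_0}
```

The logarithm is why the horizon grows only slowly as measurement precision improves: halving δ₀ extends the horizon by only (ln 2)/λ.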
See also
Butterfly effect
Dynamical systems |
https://en.wikipedia.org/wiki/Human%20uses%20of%20living%20things | Human uses of living things, including animals, plants, fungi, and microbes, take many forms, both practical, such as the production of food and clothing, and symbolic, as in art, mythology, and religion. The skills and practices involved are transmitted by human culture through social learning. Social sciences including archaeology, anthropology and ethnography are starting to take a multispecies view of human interactions with nature, in which living things are not just resources to be exploited, practically or symbolically, but are involved as participants.
Plants provide the greater part of the food for people and their domestic animals: much of human culture and civilisation came into being through agriculture. While many plants have been used for food, a small number of staple crops including wheat, rice, and maize provide most of the food in the world today. In turn, animals provide much of the meat eaten by the human population, whether farmed or hunted, and until the arrival of mechanised transport, terrestrial mammals provided a large part of the power used for work and transport. A variety of living things serve as models in biological research, such as in genetics, and in drug testing. Until the 19th century, plants yielded most of the medicinal drugs in common use, as described in the 1st century by Dioscorides. Plants are the source of many psychoactive drugs, some such as coca known to have been used for thousands of years. Yeast, a fungus, has been used to ferment cereals such as wheat and barley to make bread and beer; other fungi such as Psilocybe and fly agaric mushrooms have been gathered as psychoactive drugs.
Many species of animal are kept as pets, the most popular being mammals, especially dogs and cats. Plants are grown for pleasure in gardens and greenhouses, yielding flowers, shade, and decorative foliage; some, such as cactuses, able to tolerate dry conditions, are grown as houseplants.
Animals such as horses and deer are among the earl |
https://en.wikipedia.org/wiki/Content%20Disarm%20%26%20Reconstruction | Content Disarm & Reconstruction (CDR) is a computer security technology for removing potentially malicious code from files. Unlike malware analysis, CDR technology does not determine or detect malware's functionality but removes all file components that are not approved within the system's definitions and policies.
It is used to prevent cyber security threats from entering a corporate network perimeter. Channels that CDR can be used to protect include email and website traffic. Advanced solutions can also provide similar protection on computer endpoints, or cloud email and file sharing services.
There are three levels of CDR: 1) flattening and converting the original file to a PDF, 2) stripping active content while keeping the original file type, and 3) eliminating all file-borne risk while maintaining file type, integrity and active content. Beyond these three levels, there are also more advanced forms of CDR that are able to perform "soft conversion" and "hard conversion", based on the user's preference in balancing usability and security.
Applications
CDR works by processing all incoming files of an enterprise network, deconstructing them, and removing the elements that do not match the file type's standards or set policies. CDR technology then rebuilds the files into clean versions that can be sent on to end users as intended.
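As a toy illustration of the "deconstruct, strip, rebuild" idea (not a production CDR engine), the sketch below copies a macro-enabled Office document while omitting its embedded VBA macro part; Office files are ZIP containers, so the Python standard library is enough. The file paths are assumptions for the example.

```python
import zipfile

ACTIVE_CONTENT = {"word/vbaProject.bin", "xl/vbaProject.bin"}  # embedded macro parts

def disarm_office_file(src_path: str, dst_path: str) -> list:
    """Rebuild an Office (ZIP-based) document without its macro components.

    Real CDR engines go much further: they re-generate each component against
    the file-format specification rather than copying the remaining parts.
    """
    removed = []
    with zipfile.ZipFile(src_path) as src, zipfile.ZipFile(dst_path, "w") as dst:
        for item in src.infolist():
            if item.filename in ACTIVE_CONTENT:
                removed.append(item.filename)   # drop the active content
                continue
            dst.writestr(item, src.read(item.filename))
    return removed

# Example (hypothetical paths):
# stripped = disarm_office_file("incoming.docm", "incoming_clean.docm")
```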
Because CDR removes all potentially malicious code, it can be effective against zero-day vulnerabilities that rely on being an unknown threat that other security technologies would need to patch against to maintain protection.
CDR can be used to prevent cyber threats from a variety of sources:
Email
Data Diodes
Web Browsers
Endpoints
File Servers
FTP
Cloud email or webmail programs
SMB/CIFS
Removable media scanning (CDR Kiosk)
CDR can be applied to a variety of file formats including:
Images
Office documents
PDF
Audio/video file formats
Archives
HTML
Open Source Implementations
DocBleach
ExeFilter
See also
Adva |
https://en.wikipedia.org/wiki/Alibaba%20Cloud | Alibaba Cloud, also known as Aliyun (), is a cloud computing company, a subsidiary of Alibaba Group. Alibaba Cloud provides cloud computing services to online businesses and Alibaba's own e-commerce ecosystem. Its international operations are registered and headquartered in Singapore.
Alibaba Cloud offers cloud services that are available on a pay-as-you-go basis, and include elastic compute, data storage, relational databases, big-data processing, anti-DDoS protection and content delivery networks (CDN).
It is the largest cloud computing company in China, and in Asia Pacific according to Gartner. Alibaba Cloud operates data centers in 24 regions and 74 availability zones around the globe. As of June 2017, Alibaba Cloud is placed in the Visionaries' quadrant of Gartner's Magic Quadrant for cloud infrastructure as a service, worldwide.
History
September 2009 – Alibaba Cloud is founded and R&D centers and operation centers are subsequently opened in Hangzhou, Beijing, and Silicon Valley.
November 2010 – Supported the first Singles' Day (11.11) Taobao shopping festival, with 2.4 billion page views in 24 hours.
November 2012 – Became the first Chinese cloud service provider to pass ISO27001:2005 (Information Security Management System).
January 2013 – Merged with HiChina (founded by Xiangning Zhang), the www.net.cn business, in one of the largest acquisitions in the company's history at the time.
August 2013 – ApsaraDB architecture supported 5000 physical machines in a single cluster.
December 2014 – Defended against a 14-hour-long DDoS attack peaking at 453.8 Gbit/s.
May 2014 – Hong Kong data center went online.
October 2015 – Two US data centers went online.
July 2015 – Alibaba Group invested a further US$1 billion in Alibaba Cloud.
August 2015 – Alibaba Cloud's first Singapore data center opened. Singapore is announced as Alibaba Cloud's overseas headquarters.
October 2015 – MaxCompute took the lead of the Sort Benchmark, sorting 100TB data in 377s compared with Apache |
https://en.wikipedia.org/wiki/Gyula%20Strommer | Gyula Strommer (8 May 1920 – 28 August 1995) was a Hungarian mathematician and astronomer.
He discovered an asteroid, 1537 Transylvania, on 27 August 1940. This was his first scientific success. From 1942, he was a teaching assistant at the Descriptive Geometry Department of the Technical University of Budapest. In 1952, he became the head of the Descriptive Geometry Department. In 1972, he was appointed a university professor. Between 1981 and 1987, he was the dean of the Faculty of Mechanical Engineering.
His research topics included the foundations of geometry and Bolyai–Lobachevsky geometry.
References
External links
1920 births
1995 deaths
People from Aiud
20th-century Hungarian mathematicians
Hungarian astronomers
Discoverers of minor planets
Geometers |
https://en.wikipedia.org/wiki/Uhuru%20Mobile | Uhuru Mobile is a secure Android-based operating system.
An operating system is the principal program that allows smartphones, tablets and PCs to run. This central tool connects and coordinates all the components, such as the kernel, computer programs, software and device drivers, letting users manage their devices.
Uhuru Mobile is a solution composed of its application market with encryption, a modified Android-based OS, a virtual private network and an SMS encryption solution.
The purpose of Uhuru Mobile is to prevent physical attacks.
History
The name Uhuru comes from the Swahili language and means freedom and independence.
Uhuru Mobile is the result of a research and development project initiated in 2012 to promote digital sovereignty.
As an Android-based operating system, Uhuru Mobile focuses on security and privacy for end-users, individuals or businesses, on mobile devices.
The operating system is currently developed by a software publisher called Teclib’.
System architecture
Software Overview
Multi-layers Protection
Kernel: The system core is protected against malicious or unknown code as well as physical attacks or access.
System protection: Critical resources are dynamically protected from malware and vulnerability exploits ensuring the integrity of the operating system’s components.
Data protection: User’s data on the device are encrypted. User’s authentication resources are protected by using certificates.
Application protection: The applications that can be installed on the device come exclusively from a market of certified applications. All those applications are validated and certified before being made available within the Uhuru applications market.
Additional Features
To ensure the OS protection and security while using applications, a dedicated market has been installed (replacing the Google Play Store). Uhuru Mobile’s applications market only provides apps approved and certified by a team of security experts. Companies can al |
https://en.wikipedia.org/wiki/Human%20interactions%20with%20fungi | Human interactions with fungi include both beneficial uses, whether practical or symbolic, and harmful interactions such as when fungi damage crops, timber, food, or are pathogenic to animals.
Yeasts have been used since ancient times to leaven bread and to ferment beer and wine. More recently, mould fungi have been exploited to create a wide range of industrial products, including enzymes and drugs. Medicines based on fungi include antibiotics, immunosuppressants, statins and many anti-cancer drugs. The yeast species Saccharomyces cerevisiae is an important model organism in cell biology. The fruiting bodies of some larger fungi are collected as edible mushrooms, including delicacies like the chanterelle, cep, and truffle, while a few species are cultivated. Mould fungi provide the meaty (umami) flavour of fermented soybean products such as tempeh, miso and soy sauce, and contribute flavour and colour to blue cheeses including Roquefort and Stilton. Moulds also yield vegetarian meat substitutes like Quorn. Some fungi, especially the fly agaric and psilocybin mushrooms are used for the psychoactive drugs that they contain; these in particular are the focus of academic study in the field of ethnomycology. Fungi have appeared, too, from time to time, in literature and art.
Fungi create harm by spoiling food, destroying timber, and by causing diseases of crops, livestock, and humans. Fungi, mainly moulds like Penicillium and Aspergillus, spoil many stored foods. Fungi cause the majority of plant diseases, which in turn cause serious economic losses. Sometimes, as in the Great Irish Famine of 1845–1849, fungal diseases of plants, in this case potato blight caused by Phytophthora, result in large-scale human suffering. Fungi are similarly the main cause of economic losses of timber in buildings. Finally, fungi cause many diseases of humans and livestock; Aspergillosis kills some 600,000 people a year, mainly however those with already weakened immune systems.
Context
|
https://en.wikipedia.org/wiki/Bayesian-optimal%20pricing | Bayesian-optimal pricing (BO pricing) is a kind of algorithmic pricing in which a seller determines the sell-prices based on probabilistic assumptions on the valuations of the buyers. It is a simple kind of a Bayesian-optimal mechanism, in which the price is determined in advance without collecting actual buyers' bids.
Single item and single buyer
In the simplest setting, the seller has a single item to sell (with zero cost), and there is a single potential buyer. The highest price that the buyer is willing to pay for the item is called the valuation of the buyer. The seller would like to set the price exactly at the buyer's valuation. Unfortunately, the seller does not know the buyer's valuation. In the Bayesian model, it is assumed that the buyer's valuation is a random variable drawn from a known probability distribution.
Suppose the cumulative distribution function of the buyer is F(v), defined as the probability that the buyer's valuation is less than v. Then, if the price is set to p, the expected value of the seller's revenue is:
R(p) = p · (1 − F(p)), because the probability that the buyer will want to buy the item is 1 − F(p), and if this happens, the seller's revenue will be p.
The seller would like to find the price p* that maximizes R(p). The first-order condition, that the optimal price p* should satisfy, is:
1 − F(p*) − p* · f(p*) = 0, where f is the probability density function (the derivative of F).
For example, if the probability distribution of the buyer's valuation is uniform in [a, b], then F(v) = (v − a)/(b − a) and f(v) = 1/(b − a) (in [a, b]). The first-order condition is 1 − (p* − a)/(b − a) − p*/(b − a) = 0, which implies p* = b/2. This is the optimal price only if it is in the range [a, b] (i.e., when b ≥ 2a).
Otherwise (when b < 2a), the optimal price is a.
This optimal price has an alternative interpretation: it is the solution to the equation:
w(p*) = 0, where w(v) = v − (1 − F(v))/f(v) is the virtual valuation of the agent. So in this case, BO pricing is equivalent to the Bayesian-optimal mechanism, which is an auction with reserve price p*.
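A short numerical check of the uniform example above (a minimal sketch; the specific endpoints a and b below are arbitrary illustrative choices):

```python
def expected_revenue(p: float, a: float, b: float) -> float:
    """Revenue p * (1 - F(p)) for a valuation uniform on [a, b], with price p in [a, b]."""
    F = (p - a) / (b - a)
    return p * (1.0 - F)

def best_price(a: float, b: float, steps: int = 100_000) -> float:
    """Grid search over [a, b] for the revenue-maximizing posted price."""
    prices = [a + (b - a) * i / steps for i in range(steps + 1)]
    return max(prices, key=lambda p: expected_revenue(p, a, b))

print(best_price(0.0, 1.0))  # ~0.5 = b/2, since b >= 2a
print(best_price(0.8, 1.0))  # ~0.8 = a,   since b <  2a
```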
Single item and many buyers
In this setting, the seller has a single item to sell (with zero cost), and there are multiple potential bu |
https://en.wikipedia.org/wiki/Genetic%20demixing | In biology, genetic demixing refers to a phenomenon in which an initial mixture of individuals with two or more distinct genotypes rearranges over the course of time, giving rise to a spatial organization where some or all genotypes are concentrated in distinct patches.
See also
Population genetics
Microbiology
Genomics
Ecology
Microbial ecology
Genetic admixture
References
Microbial population biology
Genetics |
https://en.wikipedia.org/wiki/Turbomixer | A Turbo mixer, also known as a high-speed mixer or a tank mixer, is a type of industrial mixer used for mixing raw materials, typically PVC, to form a free-flowing powder blend.
Design
It includes a cylindrical tank with a mixing tool assembled on the bottom that typically operates at a peripheral speed of between 20 and 50 m/s, depending on the material to blend. The material inside is heated through the mechanical energy produced between the mixing tools and the material, which generates mutual impacts of the particles. During the mixing phase, the Turbo-mixer creates an axial vortex. The structure and position of the blades inside the mixer guarantee homogeneous material dispersion.
To avoid thermal degradation, it is usually combined with a cooler that cools down the dry blend to a temperature of around 45–55 °C. Due to the poor heat conductivity of the cooler, the cooler is usually three times larger than the mixer, as the cooling time is proportional to the contact surface.
Applications
The typical uses of the Turbo mixer are for the production of PVC (rigid or plasticized dry-blend) and for other kinds of thermoplastic composites (such as masterbatch, wood-plastic composites, additives and thermoplastic polymers). The largest high-speed mixer known on the market has a tank volume of 2500 litres, which corresponds to a PVC batch size of about 1160 kg, and is combined with a horizontal cooler of 8600 L. Due to the kind of products mixed, this machine also introduces around 500 kg into the cooler mixer at a time and can produce around 14 tonnes/hour. It was manufactured by the Italian company PROMIXON S.r.L. in 2014.
References
Industrial machinery |
https://en.wikipedia.org/wiki/Ampleon | Ampleon is a global semiconductor manufacturer headquartered in Nijmegen (Gelderland), Netherlands and founded on December 7, 2015, spun off from NXP Semiconductors in May 2015 following the acquisition of the NXP Semiconductors RF Power business by Jianguang Asset Management Co., Ltd. for US$1.8 billion.
References
External links
Semiconductor companies of the Netherlands
Semiconductor device fabrication
Government-owned companies of China
Companies based in Gelderland
Nijmegen |
https://en.wikipedia.org/wiki/Roberts%20linkage | A Roberts linkage is a four-bar linkage which converts a rotational motion to approximate straight-line motion.
The linkage was developed by Richard Roberts.
The Roberts linkage can be classified as:
Watt-type linkage
Grashof rocker-rocker
Symmetrical four-bar linkage
References
See also
Straight line mechanism
Four-bar linkage
Linkages (mechanical)
Straight line mechanisms |
https://en.wikipedia.org/wiki/Xerox%20Sigma%209 | The Xerox Sigma 9, also known as the XDS Sigma 9, was a high-speed, general purpose computer.
Xerox first became interested in office automation through computers in 1969 and purchased Scientific Data Systems (SDS). It then renamed the division Xerox Data Systems (XDS); the division saw limited success and was ultimately sold to Honeywell at a significant loss.
The Sigma 9 was announced in 1970 and the first delivery was made in 1971. Three models were built: the Sigma 9, the Sigma 9 Model 2 and the Sigma 9 Model 3. The original was the most powerful and was universally applicable to all data processing applications at the time. The Model 2 was able to process in multi-programmed batch, remote batch, conversational time-sharing, real-time, and transaction processing modes. The Model 3 was designed for the scientific real-time community.
Features of the Basic Systems
All models featured a CPU with at least a floating-point arithmetic unit, a memory map with access protection, memory write protection, two real-time clocks, a power fail-safe, an external interface and ten internal interrupt levels, as well as a multiplexor input/output processor (MIOP) featuring Channel A with eight sub-channels.
Listed below are the individual specifications:
Sigma 9
CPU featuring:
Decimal arithmetic unit
Two 16-register general purpose register blocks
Interrupt control chassis with eight external interrupt levels
Memory reconfiguration control unit
Main Memory of 64K words
Motor generator set
Model 2
CPU featuring:
Decimal arithmetic unit
Two 16-register general purpose register blocks
Interrupt control chassis with two external interrupt levels
Main Memory of 32K words
Model 3
CPU featuring:
One 16-register general purpose register blocks
Interrupt control chassis with two external interrupt levels
Main Memory of 32K words
Interesting facts
The Sigma 9 had a very long run, about 10 years, and around 1980 other companies started building computers that coul |
https://en.wikipedia.org/wiki/Information%20oriented%20software%20development | Information Oriented Software Development is a software development methodology focused on working with information inside a computer program as opposed to working with just data. A significant difference exists between data and information. Information Oriented Software Development relies on data structures specifically designed to hold information, and relies on frameworks that support those data structures. Information oriented software development focuses on the conceptual needs of users and customers rather than the data storage models and object models.
Information data structures
Information data structures are data structures specifically intended to support information inside a computer program. Two common ones are as follows:
Data structures to support Fuzzy logic.
Data structures to support concept combinations and concept Permutations.
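As an illustration only (not from the article; the names are invented for the sketch), a minimal Python data structure that holds information rather than bare data might pair a concept label with a fuzzy degree of truth and support simple concept combinations:

```python
from dataclasses import dataclass

@dataclass
class FuzzyFact:
    """A piece of information: a concept paired with a degree of truth in [0, 1]."""
    concept: str
    degree: float

    def combine_and(self, other: "FuzzyFact") -> "FuzzyFact":
        # Classic fuzzy-logic conjunction: the combined degree is the minimum.
        return FuzzyFact(f"({self.concept} AND {other.concept})",
                         min(self.degree, other.degree))

    def combine_or(self, other: "FuzzyFact") -> "FuzzyFact":
        # Classic fuzzy-logic disjunction: the combined degree is the maximum.
        return FuzzyFact(f"({self.concept} OR {other.concept})",
                         max(self.degree, other.degree))

warm = FuzzyFact("room is warm", 0.7)
bright = FuzzyFact("room is bright", 0.4)
print(warm.combine_and(bright))   # degree 0.4
print(warm.combine_or(bright))    # degree 0.7
```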
See also
Knowledge representation
Domain-driven design
Information model
Data science
References
Software development
Software development process
Information |
https://en.wikipedia.org/wiki/Anthropometric%20measurement%20of%20the%20developing%20fetus | Anthropometry is defined as the scientific study of human body measurements and proportions. These studies are generally used by clinicians and pathologists for adequate assessment of the growth and development of the fetus at any specific point of gestational maturity. Fetal height, fetal weight, head circumference (HC), crown-to-rump length (CR), and dermatological observations such as skin thickness are measured individually to assess the growth and development of the organs and of the fetus as a whole, and can serve as parameters for normal or abnormal development, including the adaptation of the fetus to its newer environment.
Another important factor that contributes to the anthropometric measurement of human fetal growth is maternal nutrition and maternal well-being. Malnutrition, as already established by the WHO, is a serious global health problem not only in adults but also in pregnant and lactating mothers, and is a serious problem in third world countries. In Africa and South Asia, 27–50% of women of reproductive age are underweight, resulting in 30 million low-birth-weight babies.
For decades, the question of how crown-rump length (CR), crown-heel length (CH), and head circumference (HC) relate to the body weight of the human fetus at different periods of gestation has baffled many developmental researchers and biostatisticians. These biological variations are all based on linear curves derived from human fetuses between 9 and 28 weeks of gestation.
Co-relation of fetal weight and fetal growth
Body weight, for example, is an important parameter of growth with respect to the gestational age of the fetus. There will be great variation in the body weight of a 16-week-old fetus. The weight will not be constant for every fetus and will vary from individual to individual. Therefore, rather than a single standard value, a range can be specified, such as 90 to 100 grams. This variation applies to all other ant
https://en.wikipedia.org/wiki/McKay%20conjecture | In mathematics, specifically in the field of group theory, the McKay conjecture is a conjecture of equality between the number of irreducible complex characters of a finite group whose degree is not divisible by a prime number $p$ and the corresponding number for the normalizer of a Sylow $p$-subgroup. It is named after Canadian mathematician John McKay.
Statement
Suppose $p$ is a prime number, $G$ is a finite group, and $P \le G$ is a Sylow $p$-subgroup. Define
$$\mathrm{Irr}_{p'}(G) = \{\chi \in \mathrm{Irr}(G) : p \nmid \chi(1)\},$$
where $\mathrm{Irr}(G)$ denotes the set of complex irreducible characters of the group $G$. The McKay conjecture claims the equality
$$|\mathrm{Irr}_{p'}(G)| = |\mathrm{Irr}_{p'}(N_G(P))|,$$
where $N_G(P)$ is the normalizer of $P$ in $G$.
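As a small worked illustration (added here, not part of the original text): take $G = S_3$ and $p = 2$. The irreducible characters of $S_3$ have degrees $1, 1, 2$, so $|\mathrm{Irr}_{p'}(G)| = 2$. A Sylow $2$-subgroup $P$ is generated by a transposition and equals its own normalizer, $N_G(P) = P \cong C_2$, which has exactly two irreducible characters, both of degree $1$, so $|\mathrm{Irr}_{p'}(N_G(P))| = 2$, in agreement with the conjecture.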
References
(Corrected reprint of the 1976 original, published by Academic Press.)
Representation theory of groups |
https://en.wikipedia.org/wiki/Topological%20Galois%20theory | In mathematics, topological Galois theory is a mathematical theory which originated from a topological proof of Abel's impossibility theorem found by V. I. Arnold and concerns the applications of some topological concepts to some problems in the field of Galois theory. It connects many ideas from algebra to ideas in topology. As described in Khovanskii's book: "According to this theory, the way the Riemann surface of an analytic function covers the plane of complex numbers can obstruct the representability of this function by explicit formulas. The strongest known results on the unexpressibility of functions by explicit formulas have been obtained in this way."
References
Galois theory
Topology |
https://en.wikipedia.org/wiki/Bump-in-the-wire | Bump-in-the-wire (BITW) is a class of communications devices which can be inserted into existing (legacy) systems to enhance the integrity, confidentiality, or reliability of communications across an existing logical link without altering the communications endpoints. The term was originally used to indicate that the device should introduce only a relatively small increased latency in communications compared to the original, unsecured, approach.
An example of such a device might be a security appliance which applies IPsec protection to communications between existing devices that themselves lack an IPsec implementation in their protocol stack. Such a device might also be called a security gateway or could be implemented as part of a network firewall to implement a tunneling protocol.
References
Computer-mediated communication |
https://en.wikipedia.org/wiki/Firebase%20Cloud%20Messaging | Firebase Cloud Messaging (FCM), formerly known as Google Cloud Messaging (GCM), is a cross-platform cloud service for messages and notifications for Android, iOS, and web applications, which as of May 2023 can be used at no cost. Firebase Cloud Messaging allows third-party application developers to send notifications or messages from servers hosted by FCM to users of the platform or end users.
The service is provided by Firebase, a subsidiary of Google. On October 21, 2014, Firebase announced it had been acquired by Google for an undisclosed amount. The official Google Cloud Messaging website points to Firebase Cloud Messaging (FCM) as the new version of GCM. Firebase is a mobile platform which supports users in developing mobile and web applications. Firebase Cloud Messaging is one of many products which are part of the Firebase platform. On the platform users can integrate and combine different Firebase features in both web and mobile applications.
History
Firebase Cloud Messaging (FCM) is part of the Firebase platform, which is a cloud service model that automates backend development, a Backend-as-a-Service (BaaS). After the Firebase company was acquired by Google in 2014, some Firebase platform products or technologies were integrated with Google’s existing services. Google’s mobile notification service Google Cloud Messaging (GCM) was replaced by FCM in 2016. Google deprecated GCM on April 10, 2018, and the GCM server and client APIs were removed on May 29, 2019; FCM has become the replacement for GCM. However, FCM is compatible with existing Google Software Development Kits (SDKs).
Firebase Cloud Messaging is a cross-platform messaging service on which the user can deliver messages without cost. FCM is compatible with various platforms including Android and iOS. Google launched support for web applications on October 17, 2016, including mobile web applications. On FCM, third party application developers can send push notifications and messages via a
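As an illustrative sketch only (not part of the article), sending a single notification from a server using the Firebase Admin SDK for Python could look roughly like the following; the credential path and device token are placeholders:

```python
import firebase_admin
from firebase_admin import credentials, messaging

# Initialize the Admin SDK with a service-account key file (placeholder path).
cred = credentials.Certificate("service-account.json")
firebase_admin.initialize_app(cred)

# Build a notification message addressed to one device registration token.
message = messaging.Message(
    notification=messaging.Notification(
        title="Hello",
        body="A test notification delivered through FCM.",
    ),
    token="DEVICE_REGISTRATION_TOKEN",  # placeholder token
)

# send() hands the message to FCM and returns the ID assigned to it.
message_id = messaging.send(message)
print("Sent:", message_id)
```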
https://en.wikipedia.org/wiki/IET%20A%20F%20Harvey%20Prize | The IET A F Harvey Engineering Research Prize is a global engineering research prize awarded annually to an innovative researcher by the Institution of Engineering and Technology. It was named after an engineer, Arthur Frank Harvey.
The award was made for the first time in 2011 and one award is made each year. Between 2011 and 2015 the prize money was £300,000. From 2016, the prize increased to £350,000.
The prize follows a three-year cycle, as follows:
Year one: Medical engineering
Year two: Microwaves and radar
Year three: Lasers and optoelectronics
Conditions
The prize money is to be used for the furtherance of scientific research into the fields of medical, microwave, laser or radar engineering.
The IET A F Harvey Engineering Prize committee searches for potential candidates from around the world for the prize, drawing on wide international networks. The committee draws up a short-list of candidates from whom additional information is requested for further detailed consideration.
The selection takes into account outstanding achievement and potential for further substantial advances in engineering and technology to the benefit of society.
List of recipients
Source: IET
See also
List of engineering awards
References
British science and technology awards
Engineering awards
Institution of Engineering and Technology |
https://en.wikipedia.org/wiki/International%20Consortium%20of%20Universities%20for%20the%20Study%20of%20Biodiversity%20and%20the%20Environment | The International Consortium of Universities for the Study of Biodiversity and the Environment or iCUBE aims to connect a group of public research universities to form a consortium to address the problems and issues related to biodiversity and the environment, for both research and educational purposes.
Members
iCUBE is composed of a key group of public research universities sharing a common mission whilst committed to education and research on the environment, biodiversity and climate change, namely:
King's College London
Korea University
Monash University
National University of Singapore
Universiti Brunei Darussalam
University of Auckland
University of Bonn
University of North Carolina at Chapel Hill
Objective
iCUBE aims to promote awareness and understanding, disseminate knowledge, as well as conduct collaborative research on problems and issues relevant to the environment and biodiversity. The group is designed to strengthen existing international linkages and further them by promoting educational and intellectual exchanges as well as collaboration among scholars and students.
References
Biodiversity |
https://en.wikipedia.org/wiki/PrivacyIDEA | privacyIDEA is a two-factor authentication system which is multi-tenancy- and multi-instance-capable. It is open source, written in Python, and hosted on GitHub. privacyIDEA is a fork of LinOTP from 2014.
Fields of use
privacyIDEA provides an authentication backend for various kinds of applications (including SSH, VPN, as well as web applications such as ownCloud). Thus it is meant to replace classical proprietary two factor authentication systems such as RSA SecurID or Vasco. It supports single sign-on via SAML. It is also possible to login with a second factor to Windows desktops using a privacyIDEA Credential Provider.
Installation
privacyIDEA runs on-premises as a web application on a Linux system and can be set up quickly. It runs on Debian, Ubuntu and Red Hat.
Authentication devices
privacyIDEA supports a wide variety of authentication devices. Amongst those are hardware tokens like Feitian C200, the Yubikey by Yubico or other U2F/WebAuthn devices. Many smartphone apps compliant with HOTP and TOTP are also supported.
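For illustration only (this is not privacyIDEA code), the HOTP and TOTP algorithms (RFC 4226 and RFC 6238) that such tokens and apps implement can be sketched in a few lines of Python:

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226: HMAC-SHA1 over an 8-byte counter, then dynamic truncation."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, period: int = 30, digits: int = 6) -> str:
    """RFC 6238: HOTP with a counter derived from the current Unix time."""
    return hotp(secret, int(time.time()) // period, digits)

print(totp(b"shared-secret-key"))   # prints a 6-digit one-time password
```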
References
Computer security software
Authentication methods
Computer access control
Linux |
https://en.wikipedia.org/wiki/Fractone | In biology, fractones are structures consisting primarily of laminin and heparan sulfate proteoglycan (HSPG) first discovered in the extracellular matrix niche of the subventricular zone of the lateral ventricle (SVZa) in the mouse brain. Recent research has suggested its importance in adult neurogenesis, gliogenesis, and angiogenesis.
Fractones are found near or connected to stem cells, and are highly implicated in cell proliferation, differentiation and migration.
New work suggests that fractones are implicated in corticalization during embryogenesis, as well as in cancer and neurodegenerative disease.
History
The term fractone is derived from fractal, a term coined by Benoît Mandelbrot in 1975.
Fractones were discovered in 2002 in the extracellular matrix niche of the subventricular zone of the lateral ventricle (SVZa) in a mouse brain. Originally found in the neurogenic areas of the brain, recent studies indicate that fractones are also present in multiple organisms, including but not limited to plants, fungi, invertebrates, and vertebrate animals.
As the discovery of fractones marked a turning point in neuroscience and in the understanding of stem cell niches in the mammalian brain, other projects have been conducted in different organs, and fractones were found to be highly implicated not only in physiology but also in numerous pathologies. For instance, fractones are extremely reduced in autism but highly represented in inflammation, cancer and other pathologies.
Properties
Fractones are structures composed of proteoglycans, consisting primarily but not limited to laminin and HSPG. Different patterns of sulfation in HSPG and chain length are responsible for numerous pathways in physiology as well as in pathology, being involved in most growth factor bindings, and in embryo development, viral infection, cancer, and other pathologies.
The HSPG part of fractones is responsible for growth factor binding, retention and release in the extracellular matrix. More |
https://en.wikipedia.org/wiki/List%20of%20fellows%20of%20the%20American%20Statistical%20Association | Like many other academic professional societies, the American Statistical Association (ASA) uses the title of Fellow of the American Statistical Association as its highest honorary grade of membership. The number of new fellows per year is limited to one third of one percent of the membership of the ASA. The people who have been named as Fellows are listed below.
Fellows
1914
John Lee Coulter
Miles Menander Dawson
Frank H. Dixon
David Parks Fackler
Henry Walcott Farnam
Charles Ferris Gettemy
Franklin Henry Giddings
Henry J. Harris
Edward M. Hartwell
Joseph A. Hill
George K. Holmes
William Chamberlin Hunt
John Koren
Thomas Bassett Macaulay
S. N. D. North
Warren M. Persons
Edward B. Phelps
LeGrand Powers
William Sidney Rossiter
Charles H. Verrill
Cressy L. Wilbur
S. Herbert Wolfe
Allyn Abbott Young
1916
Victor S. Clark
Frederick Stephen Crum
Louis Israel Dublin
Walter Sherman Gifford
James Waterman Glover
Royal Meeker
Wesley Clair Mitchell
Charles P. Neill
Julius Hall Parmelee
George E. Roberts
I. M. Rubinow
1917
Leonard Porter Ayres
Robert E. Chaddock
Willford I. King
Max O. Lorenz
Henry Ludwell Moore
Albert Henry Mowbray
Nahum I. Stone
Frank H. Streightoff
Edward Thorndike
1918
Kate Claghorn
John Cummings
William A. Hathaway
Horace Secrist
1920
F. Stuart Chapin
Roland P. Falkner
Abb Landis
William Fielding Ogburn
Raymond Pearl
Ethelbert Stewart
1921
Charles J. Bullock
Melvin T. Copeland
Charles Davenport
Edmund Ezra Day
Edwin Francis Gay
Emanuel Goldenweiser
John H. Gray
Lewis Henry Haney
Louis N. Robinson
Elihu Root
Malcolm C. Rorty
1922
Willard C. Brinton
Robert H. Coates
James H. Field
Arne Fisher
David Friday
James Arthur Harris
F. Leslie Hayford
Don D. Lescohier
Roswell F. Phelps
Joseph E. Pogue
Horatio Pollock
Harold Rugg
Edgar Sydenstricker
Fred G. Tryon
George P. Watkins
Leo Wolman
1923
W. Leonard Crum
Truman Lee Kelley
Frederick Macaulay
Henry Lewis Rietz |
https://en.wikipedia.org/wiki/Bridge%20Stress%20Committee | The Bridge Stress Committee was appointed in 1923 by the UK Department of Scientific and Industrial Research under Sir Alfred Ewing, to investigate stresses in railway bridges, especially as regards the effects of moving loads. Its report, published in 1928, was very influential in British locomotive design as it enabled larger multi-cylinder locomotive classes.
Background
The increased weight of express trains in the United Kingdom during the first quarter of the twentieth century required larger, six-coupled locomotives, but new designs were being limited by the weight restrictions imposed on many underline bridges. On most mainlines this was restricted to no more than on any axle. However, engineers were becoming increasingly aware of the significance of ‘hammer blow’ rather than ’deadweight’ in determining the safe loads for such bridges.
Committee
A committee of was established in 1923, funded jointly by the UK government and the railway companies, to carry out investigations into the effects of hammer blow on bridges. The committee under chairmanship of the physicist Sir James Alfred Ewing consisted primarily of railway civil engineers, but Sir Henry Fowler, Chief Mechanical Engineer of the London Midland and Scottish Railway, was also later invited to join.
Among the results was a better understanding of hammer blow and the effects of oscillations in both locomotive springs and bridges. One immediate impact of the investigations was an easing of the axle-load limit for locomotives with both inside and outside cylinders from to . This enabled successful new designs such as the GWR 6000 Class and the fitting of larger boilers to Gresley A1 class.
References
Civil engineering
Railway bridges |
https://en.wikipedia.org/wiki/Euphylliidae | Euphylliidae (Greek eu-, true; Greek phyllon, leaf) are a family of polyped stony corals in the order Scleractinia.
This family consists of multiple genera (more than one genus) and various species which are found on the ocean floor. These corals may be sparse or conspicuous in the wild. However, they are commonly kept in home aquariums, where they are enjoyed for their beauty by their owners and provide protection to many fish.
Classification
Marine organisms are studied and classified just as any other member of the animal kingdom. However, marine taxa are observed, and therefore classified, differently than reptiles or mammals would be. When any marine animal is classified, there is a group of main characteristics that are observed and used to differentiate between phylum, class (potentially subclass), order, family, and of course species. The key characteristics that scientists look for are categorized by body type (symmetry, presence of segments, limbs, head or tail), reproduction, digestion,
As of the year 2000, the order Scleractinia was divided into 18 artificial families, known as the Acroporidae, Astrocoeniidae, Pocilloporidae, Euphyllidae, Oculinidae, Meandrinidae, Siderastreidae, Agariciidae, Fungiidae, Rhizangiidae, Pectiniidae, Merulinidae, Dendrophylliidae, Caryophylliidae, Mussidae, Faviidae, Trachyphylliidae, and Poritidae (sensu Veron 2000). During this time, only 11 families were known to contain corals that can be classified as truly reef-building. All scleractinian families considered here are zooxanthellates (contain photo-endo-symbiontic zooxanthellae). However, in 2022 there are more than 30 families determined under the Scleractinia (according to the World Register of Marine Species) order and 845 species of coral which are known to be reef-building.
Among the countless organisms in the Animalia kingdom, the families of coral will always remain as a unique group. Although they are stationary and stony structures, they belong in the same Cn |
https://en.wikipedia.org/wiki/Model-theoretic%20grammar | Model-theoretic grammars, also known as constraint-based grammars, contrast with generative grammars in the way they define sets of sentences: they state constraints on syntactic structure rather than providing operations for generating syntactic objects. A generative grammar provides a set of operations such as rewriting, insertion, deletion, movement, or combination, and is interpreted as a definition of the set of all and only the objects that these operations are capable of producing through iterative application. A model-theoretic grammar simply states a set of conditions that an object must meet, and can be regarded as defining the set of all and only the structures of a certain sort that satisfy all of the constraints. The approach applies the mathematical techniques of model theory to the task of syntactic description: a grammar is a theory in the logician's sense (a consistent set of statements) and the well-formed structures are the models that satisfy the theory.
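As a toy illustration only (invented for this text, not drawn from the article), the contrast can be sketched in Python: the generative view enumerates strings by applying rewrite operations, while the model-theoretic view simply checks whether a given candidate satisfies every constraint:

```python
# Toy illustration: generative vs. constraint-based (model-theoretic) views.

# Generative view: rewrite rules produce the set of well-formed strings.
RULES = {"S": [["NP", "VP"]], "NP": [["she"]], "VP": [["runs"]]}

def generate(symbol="S"):
    """Enumerate strings by iteratively applying rewrite operations."""
    if symbol not in RULES:            # terminal symbol
        return [[symbol]]
    results = []
    for expansion in RULES[symbol]:
        partial = [[]]
        for sym in expansion:
            partial = [p + tail for p in partial for tail in generate(sym)]
        results.extend(partial)
    return results

# Model-theoretic view: constraints that any candidate structure must satisfy.
CONSTRAINTS = [
    lambda words: len(words) >= 2,        # toy constraint: a clause has at least two words
    lambda words: words[-1] == "runs",    # toy constraint: ends with a verb
]

def well_formed(words):
    """A structure is grammatical iff it satisfies every constraint."""
    return all(constraint(words) for constraint in CONSTRAINTS)

print(generate())                      # [['she', 'runs']]
print(well_formed(["she", "runs"]))    # True
print(well_formed(["runs"]))           # False
```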
History
David E. Johnson and Paul M. Postal introduced the idea of model-theoretic syntax in their 1980 book Arc Pair Grammar.
Examples of model-theoretic grammars
The following is a sample of grammars falling under the model-theoretic umbrella:
the non-procedural variant of Transformational grammar (TG) of George Lakoff, that formulates constraints on potential tree sequences
Johnson and Postal's formalization of Relational grammar (RG) (1980)
Generalized phrase structure grammar (GPSG) in the variants developed by Gazdar et al. (1988), Blackburn et al. (1993) and Rogers (1997)
Lexical functional grammar (LFG) in the formalization of Ronald Kaplan (1995)
Head-driven phrase structure grammar (HPSG) in the formalization of King (1999)
Constraint Handling Rules (CHR) grammars
The implicit model underlying The Cambridge Grammar of the English Language
Strengths
One benefit of model-theoretic grammars over generative grammars is that they allow for gradience in grammaticality. A structur |
https://en.wikipedia.org/wiki/Synergy%20DBL | Synergy DBL (Data Business Language) is a compiled, imperative programming language designed for business use. The language was originally called DBL; later it was referred to as Synergy Language; as of 2012 the official name is Synergy DBL. It is based on Digital Equipment Corporation’s DIBOL programming language.
DBL has an English-like syntax that was designed to be self-documenting and highly readable, but not verbose. The language is procedural and, since 2007 (version 9.1), object-oriented. Support for Microsoft’s .NET Framework was added in 2010 (version 9.5).
Code is split into two divisions (data and procedure) and uses a rigid hierarchy. The language includes a standard library consisting of 240 built-in subroutines and functions, 10 built-in classes, and 11 APIs that provide functionality such as access to XML from within DBL programs and sending and receiving data via HTTP/HTTPS.
Synergy DBL is cross-platform, with the current version running on all modern Windows platforms (Windows 7/Server 2008 R2 and higher), as well as on HP-UX, IBM AIX, Oracle Solaris, several varieties of Linux and OpenVMS. Applications can be developed on one platform and ported to other platforms.
Traditional DBL is implemented as bytecode, which is executed by the Synergy Runtime. Synergy .NET programs are CLS-compliant and run under the .NET Framework.
DBL is distributed as part of a suite of programming tools sold as Synergy/DE Professional Series by Synergex International Corporation.
History
Synergy DBL is based on Digital Equipment Corporation’s DIBOL. DBL was developed by Digital Information Systems Corporation (DISC; the company name was changed to Synergex in 1996) in the late 1970s as a DIBOL alternative, targeting system integrators who combined DEC hardware with third-party peripherals. DIBOL ran only on DEC hardware, while DBL ran on most major business computer platforms.
By mid-1979, DBL was being sold as a DIBOL-compatible compiler for PDP-11 (and compatib |
https://en.wikipedia.org/wiki/Oil%20Mines%20Regulations-1984 | Oil Mines Regulations-1984 (OMR 1984) replaced the Oil Mines Regulations-1933, with effect from October 1984, to deal with matters for the prevention of possible dangers in oil mines in India.
OMR 1984 was published in 1986 by the Directorate General of Mines Safety, Ministry of Labour, in Dhanbad, Jharkhand.
Salient Features
Chapter-I : Preliminary
Short Title; Extent; Application and; Definitions. (Reg 1–2)
Chapter-II
Returns, Notices and Plans. (Reg 3–9)
Chapter-III : Inspectors, Management and Duties
Qualifications; Appointment; General Management; and Duties of Persons Employed in Mines for various functions. (Reg 10–23)
Chapter-IV : Drilling and Workover
Reg 24- Derricks; 25- Derrick platforms and floors; 26- Ladders; 27- Safety belts and life lines; 28- Emergency escape device; 29- Weight indicator; 30- Escape exits; 31- Guardrails, handrails and covers; 32- Draw-works; 33- Cathead and cat line; 34- Tongs; 35- Safety chains or wire lines; 36- Casing lines; 37- Rigging equipment for material handling; 38- Storage of materials; 39- Construction and loading of pipe-racks; 40- Rigging-up and rig dismantling; 41- Mud tanks and mud pumps; 42- Blowout preventer assembly; 43- Control system for blowout preventers; 44- Testing of blowout preventer assembly; 45- Precautions against blowout; 46- Precautions after a blowout has occurred; 47- Drilling workover and other operations; 48- Precautions during drill stem test.
Chapter-V: Production
Well completion, Testing and Activation (Reg 49–50)
Group Gathering Station and Emergency Plan (Reg 51-51A)
Precautions during acidizing operations; fracturing operations and; loading and unloading of petroleum tankers. (Reg 52–54)
Storage Tank; Well servicing operations; Artificial lifting of oil; Temporary closure of producing well and; Plugging requirements of abandoned wells (Reg 55–59)
Chapter-V: Production
Application (Reg-60)
Chapter-VI : Transport and pipelines
Approval and design of the route and design of pipe |
https://en.wikipedia.org/wiki/RepRap%20Morgan | The RepRap Morgan is an open-source fused deposition modeling 3D printer. The Morgan is part of the RepRap project and has an unusual SCARA arm design. The first Morgan printer was designed by Quentin Harley, a South African engineer (working for Siemens at the time) at the House4Hack Makerspace in Centurion. The SCARA arm design was developed due to the lack of access to components of existing 3D printer designs in South Africa and their relatively high cost. In 2013 the Morgan won the HumanityPlus Uplift Personal Manufacturing Prize and third place in the Gauteng Accelerator Program.
The Morgan name comes from the RepRap convention of naming printers after famous deceased biologists. The Morgan printer was named after Thomas Hunt Morgan, who worked on the genetics of the common fruit fly with his wife, Lilian Vaughan Morgan. Their names were used as the development codenames for the first two generations of Morgan 3D printers.
Morgan printers are now manufactured full-time by the inventor in a small workshop factory in the House4Hack makerspace.
Versions
There are four versions of the RepRap Morgan, the Morgan v1 (codenamed Thomas), Morgan Pro, Morgan Mega and Morgan Pro 2 (codenamed Lilian).
External links
Official website
RepRap Morgan page on the RepRap.org
RepRap Morgan files on Github
References
Open hardware electronic devices
3D printers
RepRap project |
https://en.wikipedia.org/wiki/Greg%20Nelson%20%28computer%20scientist%29 | Charles Gregory Nelson (27 March 1953 – 2 February 2015) was an American computer scientist.
Biography
Nelson grew up in Honolulu. As a boy he excelled at gymnastics and tennis. He attended the University Laboratory School. He received his B.A. degree in mathematics from Harvard University in 1976. He received his Ph.D. in computer science from Stanford University in 1980 under the supervision of Robert Tarjan. He lived in Juneau, Alaska for a year before settling permanently in the San Francisco Bay Area.
Notable work
His thesis, Techniques for Program Verification, influenced both program verification and automated theorem proving, especially in the area now named satisfiability modulo theories, where he contributed techniques for combining decision procedures, as well as efficient decision procedures for quantifier-free constraints in first-order logic and term algebra. He received the Herbrand Award in 2013.
He was instrumental in developing the Simplify theorem prover used by ESC/Java. He made significant contributions in several other areas. He contributed to the field of programming language design as a member of the Modula-3 committee. In distributed systems he contributed to Network Objects. He made pioneering contributions with his constraint-based graphics editors (Juno and Juno-2), windowing system (Trestle), optimal code generation (Denali), and multi-threaded programming (Eraser).
See also
List of computer scientists
List of programmers
References
1953 births
2015 deaths
Harvard University alumni
Stanford University School of Engineering alumni
20th-century American scientists
21st-century American scientists
American computer scientists
American computer programmers
Programming language researchers
Programming language designers
Formal methods people
Scientists from Hawaii
Scientists at PARC (company) |
https://en.wikipedia.org/wiki/Zona%20%28streaming%20video%20software%29 | Zona is a BitTorrent client for watching streaming video content. Described as a "Popcorn Time beater", the application provides a free alternative to subscription-based video streaming services (such as Netflix). In addition to on-demand movies and television series, Zona offers streaming music, live television channels, news, live sports, and games. Zona has been criticized for being closed-source as well as having an installer that has been implicated as malware.
See also
Popcorn Time
Porn Time
Comparison of BitTorrent clients
References
2014 software
BitTorrent clients
Media players
Peer-to-peer
Streaming media systems
Streaming software
Video on demand services |
https://en.wikipedia.org/wiki/Ricardo%20Peralta%20y%20Fabi | Ricardo Peralta y Fabi (August 15, 1950 – December 31, 2017) was a Mexican mechanical engineer and former astronaut trainee who was a backup for astronaut Rodolfo Neri Vela on STS-61-B. Peralta was one of three people selected among 400 applicants to the Mexican space program.
Accident
Peralta was the alternate for the first Mexican astronaut, Rodolfo Neri Vela, at the time of an accident involving an ultralight aircraft (bought as remuneration for astronaut training), in which he was seriously injured. He was unable to complete his astronaut training and his training at Indiana University. He left the corps of astronauts on 3 December 1985.
He taught at the University in Mexico City for many years after his astronaut career and died in 2017.
See also
List of Hispanic astronauts
References
1950 births
2017 deaths
Mexican mechanical engineers |
https://en.wikipedia.org/wiki/Plant%20epithet | A plant epithet is a name used to label a person or group, by association with some perceived quality of a plant. Vegetable epithets may be pejorative, such as turnip, readily giving offence, or positive, such as rose or other flowers implying beauty. Tree and flower forenames such as Hazel, Holly, Jasmine and Rose are commonly given to girls. Tree surnames such as Oakes (Oak) and Nash (Ash) are toponymic, given to a person in the Middle Ages who lived in a place near a conspicuous tree. A few plant surnames such as Pease and Onions are metonymic, for sellers of peas and onions respectively. Finally, plant surnames are sometimes emblematic, as in the name Rose, used as a family emblem.
Vegetable insults
Plant epithets may be pejorative, used humorously and sometimes offensively. Some plant epithets are used directly as insults, as when people are called turnips, potatoes, or cabbages. When the England football team lost to Sweden under Graham Taylor, The Sun newspaper led with the headline "Swedes 2 Turnips 1", swede being a pun on a particular vegetable, and turnip being an insult.
In English, the collective term vegetable is also pejorative. Plant epithets are used around the world, but the choice of plants and their meanings vary. Thus in China, "stupid melon" is used as an insult. In Britain, coconut is sometimes used by black people to insult other people of colour; the term indicates betrayal, as coconuts are brown on the outside but white on the inside. Trembling or quaking like an aspen leaf means shaking with fear; this may be descriptive or pejorative, and is recorded from around 1700 onwards, starting with Edward Taylor's Poems. In 2022, the British prime minister Liz Truss was described as "Lettuce Liz" and "The Iceberg Lady", her short term in office compared unfavourably to the shelf life of a lettuce.
Flower and tree names
In contrast to vegetable epithets, flower and tree names are generally positive. "English rose" has traditionally been used to |
https://en.wikipedia.org/wiki/ICORES | The International Conference on Operations Research and Enterprise Systems (ICORES) is an annual conference in the field of operations research. Two tracks are held simultaneously, covering domain independent methodologies and technologies and also practical work developed in specific application areas. These tracks are present in the conference not only in technical sessions but also in poster sessions, keynote lectures and tutorials.
The works presented at the conference are published in the conference proceedings and are made available in the SCITEPRESS digital library. Usually, a cooperation with Springer is established for post-publication of some of the conference's best papers.
The first edition of ICORES was held in 2012 in conjunction with the International Conference on Agents and Artificial Intelligence (ICAART) and the International Conference on Pattern Recognition Applications and Methods (ICPRAM).
Areas
Methodologies and Technologies
Analytics for Enterprise (Engineering) Systems
Inventory theory
Linear programming
Management sciences
Network optimization
Optimization
Predictive analytics
Queuing theory
Simulation
Stochastic optimization
Data mining and business analytics
Decision analysis
Design in ES (e.g., Real Options for Flexible Design, etc.)
Dynamic programming
Forecasting
Game theory
Industrial engineering
Information systems
Applications
Automation of operations
OR in education
OR in emergency management
OR in health
OR in national defense/international security
OR in telecommunications
OR in transportation
Project management
Resource allocation
Risk management
Routing
Decision support systems
Scheduling
Supply chain management
Systems of systems/teams and socio-technical systems
Energy and environment
Engines for innovation
Globalization and Productivity
Logistics
Maintenance
New applications of OR
Optimization in finance
Current chairs
Conference Chair
Marc Demange, RMIT University, School o |
https://en.wikipedia.org/wiki/Ethereum%20Classic | Ethereum Classic is a blockchain-based distributed computing platform which offers smart contract (scripting) functionality. It is open source and supports a modified version of Nakamoto consensus via transaction-based state transitions executed on a public Ethereum Virtual Machine (EVM).
Ethereum Classic maintains the original, unaltered history of the Ethereum network. The Ethereum project's mainnet initially released via Frontier on 30 July 2015. However, due to a hack of a third-party project, The DAO, the Ethereum Foundation created a new version of the Ethereum mainnet on 20 July 2016 with an irregular state change implemented that erased the DAO theft from the Ethereum blockchain history. The Ethereum Foundation applied their trademark to the new, altered version of the Ethereum blockchain; Ethereum (code: ETH). The older, unaltered version of Ethereum was renamed and continued on as Ethereum Classic (code: ETC).
Ethereum Classic's native Ether token is a cryptocurrency traded on digital currency exchanges under the currency code ETC. Ether is created as a reward to network nodes for a process known as "mining", which validates computations performed on Ethereum Classic's EVM. Implemented on 11 December 2017, the current ETC monetary policy seeks the same goals as bitcoin of being mechanical, algorithmic, and capped. ETC can be exchanged for network transaction fees or other assets, commodities, currencies, products, and services.
Ethereum Classic provides a decentralized Turing-complete virtual machine, the Ethereum Virtual Machine (EVM), which can execute scripts using an international network of public nodes. The virtual machine's instruction set is Turing-complete in contrast to others like bitcoin script. Gas, an internal transaction pricing mechanism, is used to mitigate spam and allocate resources on the network.
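As an illustrative sketch only (the gas price here is hypothetical): the fee paid for a transaction is the gas it consumes multiplied by the gas price the sender offers, and a simple value transfer consumes 21,000 gas units:

```python
# Illustrative fee calculation for a simple value transfer (hypothetical gas price).
GAS_USED = 21_000                 # gas consumed by a plain value transfer
GAS_PRICE_WEI = 20 * 10**9        # hypothetical gas price of 20 gwei
WEI_PER_ETC = 10**18

fee_wei = GAS_USED * GAS_PRICE_WEI
print(fee_wei / WEI_PER_ETC, "ETC")   # 0.00042 ETC
```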
Milestones
Frontier
Several codenamed prototypes of the Ethereum platform were developed by the Ethereum Foundation, as part of their |
https://en.wikipedia.org/wiki/One-relator%20group | In the mathematical subject of group theory, a one-relator group is a group given by a group presentation with a single defining relation. One-relator groups play an important role in geometric group theory by providing many explicit examples of finitely presented groups.
Formal definition
A one-relator group is a group G that admits a group presentation of the form
$$G = \langle X \mid r = 1 \rangle \qquad (1)$$
where X is a set (in general possibly infinite), and where $r \in F(X)$ is a freely and cyclically reduced word.
If Y is the set of all letters $x \in X$ that appear in $r$ and $X' = X \setminus Y$, then $G \cong \langle Y \mid r = 1 \rangle \ast F(X')$.
For that reason X in (1) is usually assumed to be finite where one-relator groups are discussed, in which case (1) can be rewritten more explicitly as
$$G = \langle x_1, \dots, x_n \mid r = 1 \rangle, \qquad (2)$$
where $X = \{x_1, \dots, x_n\}$ for some integer $n \ge 1$.
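Two standard examples, added here for concreteness: the fundamental group of a closed orientable surface of genus $g \ge 1$, with presentation $\langle a_1, b_1, \dots, a_g, b_g \mid [a_1, b_1] \cdots [a_g, b_g] = 1 \rangle$, and the Baumslag–Solitar groups $BS(m,n) = \langle a, b \mid b a^m b^{-1} a^{-n} = 1 \rangle$ are one-relator groups.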
Freiheitssatz
Let G be a one-relator group given by presentation (1) above. Recall that r is a freely and cyclically reduced word in F(X). Let $a \in X$ be a letter such that $a$ or $a^{-1}$ appears in $r$. Let $H = \langle X \setminus \{a\} \rangle \le G$. The subgroup $H$ is called a Magnus subgroup of G.
A famous 1930 theorem of Wilhelm Magnus, known as the Freiheitssatz, states that in this situation H is freely generated by $X \setminus \{a\}$, that is, $H \cong F(X \setminus \{a\})$. See also for other proofs.
Properties of one-relator groups
Here we assume that a one-relator group G is given by presentation (2) with a finite generating set $X = \{x_1, \dots, x_n\}$ and a nontrivial freely and cyclically reduced defining relation $r \in F(X)$.
A one-relator group G is torsion-free if and only if $r$ is not a proper power.
Every one-relator group G is virtually torsion-free, that is, admits a torsion-free subgroup of finite index.
A one-relator presentation is diagrammatically aspherical.
If $r$ is not a proper power then the presentation complex P for presentation (2) is a finite Eilenberg–MacLane complex $K(G, 1)$.
If $r$ is not a proper power then a one-relator group G has cohomological dimension $\le 2$.
A one-relator group G is free if and only if $r$ is a primitive element of $F(X)$; in this case G is free of rank n − 1.
Suppose the element is of minimal length under the action of , and suppose that for every either or occurs in r. The |
https://en.wikipedia.org/wiki/AngelHack | AngelHack is an American company based in San Francisco that primarily organizes and hosts hackathons for other companies.
History
Founded in 2011, AngelHack distinguished itself from other hackathon organizers by coordinating global hackathons which took place simultaneously in different places.
The company now brands itself as a developer ecosystem. As of 2023, it is headed by Justin Ng.
Users
AngelHack claims over 200,000 members and hosts various educational events for developers.
Controversies
One of AngelHack's co-founders, Greg Gopman, was sued in 2014 by a fellow AngelHack co-founder for allegedly using the company's finances for "elaborate vacations" in Thailand and Colombia. After leaving AngelHack, Gopman garnered further controversy for referring to homeless people in San Francisco as "hyenas."
References
Technology companies established in 2011
2011 establishments in California
Technology companies based in the San Francisco Bay Area
Hackathons
Hacker culture
Software development events |
https://en.wikipedia.org/wiki/Fibonacci%20word%20fractal | The Fibonacci word fractal is a fractal curve defined on the plane from the Fibonacci word.
Definition
This curve is built iteratively by applying the Odd–Even Drawing rule to the Fibonacci word 0100101001001...:
For each digit at position k:
Draw a segment forward
If the digit is 0:
Turn 90° to the left if k is even
Turn 90° to the right if k is odd
To a Fibonacci word of length $F_n$ (the nth Fibonacci number) is associated a curve made of $F_n$ segments. The curve displays three different aspects whether n is in the form 3k, 3k + 1, or 3k + 2.
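A minimal sketch (added here, not part of the article) that builds a prefix of the Fibonacci word by the substitution 0 → 01, 1 → 0 and applies the odd–even drawing rule above to obtain the curve's vertices:

```python
def fibonacci_word(n_iterations: int) -> str:
    """Build a prefix of the infinite Fibonacci word by substitution: 0 -> 01, 1 -> 0."""
    word = "0"
    for _ in range(n_iterations):
        word = "".join("01" if ch == "0" else "0" for ch in word)
    return word

def fractal_points(word: str):
    """Apply the odd-even drawing rule: draw forward; on a '0', turn 90 degrees left
    when the (1-based) position is even and 90 degrees right when it is odd."""
    x, y = 0.0, 0.0
    dx, dy = 0.0, 1.0              # start heading "up"
    points = [(x, y)]
    for k, digit in enumerate(word, start=1):
        x, y = x + dx, y + dy      # draw one segment forward
        points.append((x, y))
        if digit == "0":
            if k % 2 == 0:
                dx, dy = -dy, dx   # turn 90 degrees to the left
            else:
                dx, dy = dy, -dx   # turn 90 degrees to the right
    return points

points = fractal_points(fibonacci_word(10))
print(len(points), "vertices, first few:", points[:4])
```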
Properties
Some of the Fibonacci word fractal's properties include:
The curve contains segments, right angles and flat angles.
The curve never self-intersects and does not contain double points. At the limit, it contains an infinity of points asymptotically close.
The curve presents self-similarities at all scales. The reduction ratio is $1 + \sqrt{2}$. This number, also called the silver ratio, is present in a great number of properties listed below.
The number of self-similarities at level n is a Fibonacci number \ −1. (more precisely: ).
The curve encloses an infinity of square structures of decreasing sizes in a ratio (see figure). The number of those square structures is a Fibonacci number.
The curve can also be constructed in different ways (see gallery below):
Iterated function system of 4 and 1 homothety of ratio and
By joining together the curves and
Lindenmayer system
By an iterated construction of 8 square patterns around each square pattern.
By an iterated construction of octagons
The Hausdorff dimension of the Fibonacci word fractal is $3\,\frac{\log \varphi}{\log(1+\sqrt{2})} \approx 1.6379$, with $\varphi$ the golden ratio.
Generalizing to an angle between 0 and , its Hausdorff dimension is , with .
The Hausdorff dimension of its frontier is .
Exchanging the roles of "0" and "1" in the Fibonacci word, or in the drawing rule yields a similar curve, but oriented 45°.
From the Fibonacci word, one can define the «dense Fibonacci word», on an alphabet of 3 lette |
https://en.wikipedia.org/wiki/International%20Electric%20Propulsion%20Conference | The International Electric Propulsion Conference (IEPC) in its current form is a biennial academic conference in the field of electric space propulsion, hosted by the Electric Rocket Propulsion Society (ERPS). It was originally organized by the American Rocket Society (ARS) and later by the American Institute of Aeronautics and Astronautics (AIAA) as a US national conference, and was expanded starting in 1976 to its current international character with the joining of international space engineering societies. Currently, a few hundred engineers and scientists join the conference and present and discuss the latest developments and research results regarding electric propulsion.
List of electric propulsion conferences
Proceedings of the recent conferences are available on the website of the ERPS: IEPC archive.
External links
IEPC 2024
Electric Rocket Propulsion Society
Conference paper archive
Academic conferences |
https://en.wikipedia.org/wiki/Microbiomes%20of%20the%20built%20environment | Microbiomes of the built environment is a field of inquiry into the communities of microorganisms that live in human constructed environments like houses, cars and water pipes. It is also sometimes referred to as microbiology of the built environment.
The field has accelerated somewhat in recent years, with significant funding from the Alfred P. Sloan Foundation and with the increased attention being given to microbiomes and communities of microbes generally.
The National Academies of Sciences, Engineering, and Medicine of the USA is conducting a study of this field with the study entitled "Microbiomes of the Built Environment: From Research to Application".
The American Association for the Advancement of Science ran a symposium on the topic in 2014.
The American Academy of Microbiology had a colloquium on this topic in September 2015 and published a report "Microbiology of Built Environments".
A 2016 paper by Brent Stephens highlights some of the key findings of studies of "microbiomes of the indoor environment". These key findings include those listed below:
"Culture-independent methods reveal vastly greater microbial diversity compared to culture-based methods"
"Indoor spaces often harbor unique microbial communities"
"Indoor bacterial communities often originate from indoor sources."
"Humans are also major sources of bacteria to indoor air"
"Building design and operation can influence indoor microbial communities."
The microbiomes of the built environment are being studied for multiple reasons, including how they may impact the health of humans and other organisms occupying the built environment, but also for non-health reasons such as diagnostics of building properties, forensic applications, impact on food production, impact on built environment function, and more.
Studied environments
Extensive research has been conducted on individual microbes found in the built environment. More recently there has been a significant expansion in the number |
https://en.wikipedia.org/wiki/TCP-seq | Translation complex profile sequencing (TCP-seq) is a molecular biology method for obtaining snapshots of momentary distribution of protein synthesis complexes along messenger RNA (mRNA) chains.
Application
Expression of genetic code in all life forms consists of two major processes, synthesis of copies of the genetic code recorded in DNA into the form of mRNA (transcription), and protein synthesis itself (translation), whereby the code copies in mRNA are decoded into amino acid sequences of the respective proteins. Both transcription and translation are highly regulated processes essentially controlling everything of what happens in live cells (and multicellular organisms, consequently).
Control of translation is especially important in eukaryotic cells where it forms part of post-transcriptional regulatory networks of genes expression. This additional functionality is reflected in the increased complexity of the translation process, making it a hard object to investigate. Yet details on when and what mRNA is translated and what mechanisms are responsible for this control are key to understanding of normal and pathological cell functionality. TCP-seq can be used to obtain this information.
Principles
With the advent of the high-throughput DNA and RNA sequence identification methods (such as Illumina sequencing), it became possible to efficiently analyse nucleotide sequences of large numbers of relatively short DNA and RNA fragments. Sequences of these fragments can be superimposed to reconstruct the source. Alternatively, if the source sequence is already known, the fragments can be found within it (“mapped”), and their individual numbers counted. Thus, if an initial stage exists whereby the fragments are differentially present or selected (“enriched”), this approach can be used to quantitatively describe such stage over even a very large number or length of the input sequences, most usually encompassing the entire DNA or RNA of the cell.
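A toy sketch (invented for this text, with a made-up reference and reads) of the mapping-and-counting step described above:

```python
from collections import Counter

# Hypothetical reference sequence and hypothetical short sequenced fragments.
reference = "AUGGCCAUUGUAAUGGGCCGCUGAAAGGGUGCCCGAUAG"
fragments = ["GCCAUUGUA", "GGGCCGCUG", "GCCAUUGUA", "CCGAUAG"]

coverage = Counter()
for fragment in fragments:
    position = reference.find(fragment)   # "map" the fragment onto the reference
    if position != -1:
        coverage[position] += 1           # count fragments starting at each position

# Positions with higher counts correspond to enriched (more frequently captured) regions.
print(coverage)
```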
TCP-seq is based on |
https://en.wikipedia.org/wiki/Popp%20%26%20Asocia%C8%9Bii | Popp & Asociații is a professional services company based in Bucharest, Romania. It provides structural design & assessment, consultancy, retrofitting design, and project management services for all aspects of the built environment either existing or new, including infrastructure works design.
General Information
The company was founded in 2002 by prof. Traian Popp, a preeminent Romanian senior structural engineer, along with fellow engineering professionals Dragoș Marcu and Mădălin Coman.
Popp & Asociații had its breakthrough projects in 2004, with the structural design of the Charles de Gaulle Plaza office tower and the seismic structural retrofitting of the 110-year-old Bucharest Palace of Justice. The Charles de Gaulle Plaza tower site conditions imposed a top-down infrastructure approach for the building's 5 underground levels, the first of its kind in Romania. Equivalent ground-breaking techniques had to be used for the Palace of Justice retrofitting and strengthening, as the palace had been listed in the 1980s to be demolished in 1990 due to irreparable seismic damage following the 1977 earthquake.
The company has continuously evolved thereafter, progressively broadening its expertise and range of services. The Popp & Asociații Group now provides geotechnical engineering services, along with building and site monitoring, on site and laboratory material testing services, research and development and structural design consultancy services.
The Popp & Asociații portfolio spans over a large range of projects, including high-rise office and residential buildings, large-scale commercial centres, deep excavations and historical monuments of national and international importance. Further business includes international consulting contracts for the evaluation of existing buildings to seismic actions, along with research activities for the development of design codes and guides.
Notable Projects
New Structures
Existing Structures Rehabilitation
Awards
The Romania |
https://en.wikipedia.org/wiki/RepRap%20Snappy | The RepRap Snappy is an open-source fused deposition modeling 3D printer. Part of the RepRap project, it is the most self-replicating 3D printer in the world.
The RepRap Snappy is designed to address the core goal of the RepRap project of creating a 'general-purpose self-replicating manufacturing machine'. The RepRap Snappy is able to create 73% of its own parts by volume, with a design that eliminates as many of the non-3D-printed parts as possible, including belts and bearings, which are replaced with a rack and pinion system. The name Snappy comes from the use of snap-fit connectors on the small printed parts to construct larger pieces; this both cuts down on the use of non-3D-printed parts and means a smaller build volume is needed on the machine producing the parts. The only non-self-replicating parts on the printer are the motors, electronics, a glass build plate and one 686 bearing; the 3D-printed parts take around 150 hours to create. The RepRap Snappy received an honourable mention in the Uplift Prize Grand Personal Manufacturing Prize.
External links
RepRap Snappy page on RepRap.org
RepRap Snappy repository on Github
References
Open hardware electronic devices
3D printers
RepRap project
Self-replicating machines |
https://en.wikipedia.org/wiki/Perceptual%20quantizer | The perceptual quantizer (PQ), published by SMPTE as SMPTE ST 2084, is a transfer function that allows for HDR display by replacing the gamma curve used in SDR. It is capable of representing luminance levels up to 10000 cd/m2 (nits) and down to 0.0001 nits. It was developed by Dolby and standardized in 2014 by SMPTE and also in 2016 by the ITU in Rec. 2100. The ITU specifies the use of PQ or HLG as transfer functions for HDR-TV. PQ is the basis of HDR video formats (such as Dolby Vision, HDR10 and HDR10+) and is also used for HDR still picture formats. PQ is not backward compatible with the BT.1886 EOTF (i.e. the gamma curve of SDR), while HLG is compatible.
PQ is a non-linear transfer function based on the human visual perception of banding and is able to produce no visible banding in 12 bits. A power function (used as EOTFs in standard dynamic range applications) extended to 10000 cd/m2 would have required 15 bits.
Technical details
The PQ EOTF (electro-optical transfer function) is as follows:
$$F_D = \mathrm{EOTF}[E'] = 10000 \cdot Y, \qquad Y = \left( \frac{\max\left[E'^{1/m_2} - c_1,\, 0\right]}{c_2 - c_3\, E'^{1/m_2}} \right)^{1/m_1}$$
The PQ inverse EOTF is as follows:
$$E' = \mathrm{EOTF}^{-1}[F_D] = \left( \frac{c_1 + c_2\, Y^{m_1}}{1 + c_3\, Y^{m_1}} \right)^{m_2}, \qquad Y = \frac{F_D}{10000}$$
where
$E'$ is the non-linear signal value, in the range $[0:1]$.
$F_D$ is the displayed luminance in cd/m2
$Y$ is the normalized linear displayed value, in the range [0:1] (with $Y = 1$ representing the peak luminance of 10000 cd/m2)
and the constants are $m_1 = 2610/16384 = 0.1593017578125$, $m_2 = 2523/4096 \times 128 = 78.84375$, $c_1 = 3424/4096 = 0.8359375$, $c_2 = 2413/4096 \times 32 = 18.8515625$, $c_3 = 2392/4096 \times 32 = 18.6875$.
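A minimal numerical sketch of these formulas (added here; not part of the standard's text), using the ST 2084 constants:

```python
import math  # not strictly needed for the formulas below, kept for clarity

# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875
PEAK = 10000.0             # peak luminance in cd/m^2

def pq_eotf(signal: float) -> float:
    """Map a non-linear PQ signal E' in [0, 1] to displayed luminance F_D in cd/m^2."""
    e = signal ** (1.0 / M2)
    y = (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1.0 / M1)
    return PEAK * y

def pq_inverse_eotf(luminance: float) -> float:
    """Map displayed luminance F_D in cd/m^2 to a non-linear PQ signal E' in [0, 1]."""
    y = luminance / PEAK
    return ((C1 + C2 * y ** M1) / (1.0 + C3 * y ** M1)) ** M2

for nits in (0.0001, 100.0, 1000.0, 10000.0):
    code = pq_inverse_eotf(nits)
    print(f"{nits:>8} cd/m^2 -> E' = {code:.4f} -> back to {pq_eotf(code):.4f} cd/m^2")
```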
See also
Transfer functions in imaging
High-dynamic-range television
References
Transfer functions |
https://en.wikipedia.org/wiki/High-dynamic-range%20television | High-dynamic-range television (HDR-TV) is a technology that uses high dynamic range (HDR) to improve the quality of display signals. It is contrasted with the retroactively-named standard dynamic range (SDR). HDR changes the way the luminance and colors of videos and images are represented in the signal, and allows brighter and more detailed highlight representation, darker and more detailed shadows, and more intense colors.
HDR allows compatible displays to receive a higher-quality image source. It does not improve a display's intrinsic properties (brightness, contrast, and color capabilities). Not all HDR displays have the same capabilities, and HDR content will look different depending on the display used, and the standards specify the required conversion depending on display capabilities.
HDR-TV was first used in 2014 to enhance videos, and it is now also available for still pictures. HDR-TV is a part of HDR imaging, an end-to-end process of increasing the dynamic range of images and videos from their capture and creation to their storage, distribution and display.
HDR is frequently used together with wide color gamut (WCG) technology, which aims to increase the gamut and number of distinct colors available, whereas HDR increases the range of luminance available for each color. HDR and WCG are separable but complementary technologies. A standards-compliant HDR display also has WCG capabilities, as mandated by Rec. 2100 and other common HDR specifications.
Description
Before HDR, improvements in display fidelity were typically achieved by increasing the pixel quantity, density (resolution) and the display's frame rate. By contrast, HDR improves the perceived fidelity of the existing individual pixels. Standard dynamic range (SDR) is still based on and limited by the characteristics of older cathode ray tubes (CRT), despite the huge advances in screen and display technologies since CRT's obsolescence.
SDR formats are able to represent a maximum luminance level of ar |
https://en.wikipedia.org/wiki/DJI%20Osmo | The DJI Osmo (stylized as DJI OSMO) is a line of compact camcorders manufactured by DJI. It is capable of shooting 4K UHD video and 12–16 MP stills, with a maximum of 64 GB of microSD storage.
Unlike most camcorders, it uses a modular design: camera gimbals attach to its base, which allows it to support different file formats, such as lossless compression and RAW format with the Zenmuse X5R gimbal. It is backwards compatible with the Inspire 1 UAV Zenmuse gimbal.
Overview
The DJI Osmo is a camcorder, as it is only packaged with its handle, camera gimbal, and case. The interface of the camera is controlled through a smartphone app called "DJI Go", which is used in certain Phantom drones.
Specifications
Hardware
Camera
The DJI Osmo uses a 12.4 megapixel Sony Exmor R CMOS 1/2.3" sensor with a 94-degree FOV and an f/2.8 aperture. It has an ISO range of 100–3200 for video and 100–1600 for still photos. The DJI Osmo can shoot in the following frame rates and resolutions:
DCI 4K (4096 x 2160) 24/25p
UHD 4K (3840 x 2160) 24/25/30p
2.7K (2704 x 1520) 24/25/30p
FHD: 1920 x 1080 24/25/30/48/50/60/120p
HD: 1280 x 720 24/25/30/48/50/60p
General
The DJI Osmo's handle itself weighs 201 g and measures 61.8 x 48.2 x 161.5 mm (2.43 x 1.9 x 6.4 in). Controls are a record button, a joystick, and a trigger on the front. A power slider for turning the camera on and off is located on the right side of the handle. On the left side, a mount can be attached to the camera.
Microphone
The camera is equipped with an internal microphone for audio recording, and a 3.5mm jack placed near the trigger allows external microphones to be plugged in.
Camera gimbals
The DJI Osmo is compatible with four Zenmuse gimbal-cameras: the X3, X3 Zoom, X5, and X5R. The X3, which is packaged, can shoot up to 4K video and take 12MP stills. The X3 Zoom keeps the same X3 sensor but adds 3.5x optical zoom and 2x digital lossless zoom. The X5, can shoot 4K at 30fps, and can take 16MP s |
https://en.wikipedia.org/wiki/CenterPOS%20Malware | CenterPOS (also known as "Cerebrus") is a point of sale (POS) malware discovered by cyber security experts. It was discovered in September 2015 along with other kinds of POS malware, such as NewPOSThings, BlackPOS, and Alina. Two versions have been released by the developer responsible: version 1.7 and version 2.0. CenterPOS 2.0 has similar functionality to CenterPOS version 1.7. The 2.0 variant of CenterPOS malware added some more effective features, such as the addition of a configuration file for storing command and control server information.
Overview
CenterPOS has been used to target retailers in order to illegally obtain payment card information using a memory scraper. It uses two distinct modes to scrape and store information: a "smart scan" and a "normal scan". At the normal scan mode, the malware looks at all of the processes on a device and determines which ones are not currently running processes, are not named "system", "system idle process" or "idle", and do not contain keywords such as Microsoft or Mozilla. If the process meets the criteria list, the malware will search all memory regions within the process, searching for credit card data with regular expressions in the regular expression list. In smart scan mode, the malware starts by performing a normal scan, and any process that has a regular expression match will be added to the smart scan list. After the first pass, the malware will only search the processes that are in the smart scan list. The malware contains functionality that allows cybercriminals to create a configuration file.
Process Details
CenterPOS malware searches for the configuration file that contains the C&C information. If unable to find the configuration file, it asks for a password. If the password entered is correct, then it payloads the functions to create a configuration file. This malware is very different from other point of sale system malware in that it has a separate component called builder to cre |
https://en.wikipedia.org/wiki/Tactoid | Tactoids are liquid crystal microdomains nucleated in isotropic phases, which can be distinguished as spherical or spindle-shaped birefringent microdroplets under polarized light microscopy. Tactoids are a transition state between isotropic and macroscopic liquid crystalline phases. The first observation of tactoids was made by Zocher in 1925, when he studied the nematic phase formed in vanadium pentoxide sols. After that, tactoids have been found in the phase transition processes in many lyotropic liquid crystalline substances, such as tobacco mosaic virus, polypeptides, and cellulose nanocrystals.
In biology
It has been shown that filamin causes actin to condense into tactoids.
References
Biochemistry
Chemical properties
Liquid crystals |
https://en.wikipedia.org/wiki/Teleology%20in%20biology | Teleology in biology is the use of the language of goal-directedness in accounts of evolutionary adaptation, which some biologists and philosophers of science find problematic. The term teleonomy has also been proposed. Before Darwin, organisms were seen as existing because God had designed and created them; their features, such as eyes, were taken by natural theology to have been made to enable them to carry out their functions, such as seeing. Evolutionary biologists often use similar teleological formulations that invoke purpose, but these imply natural selection rather than actual goals, whether conscious or not. Some biologists and religious thinkers held that evolution itself was somehow goal-directed (orthogenesis), and in vitalist versions, driven by a purposeful life force. With evolution working by natural selection acting on inherited variation, the use of teleology in biology has attracted criticism, and attempts have been made to teach students to avoid teleological language.
Nevertheless, biologists still often write about evolution as if organisms had goals, and some philosophers of biology such as Francisco Ayala and biologists such as J. B. S. Haldane consider that teleological language is unavoidable in evolutionary biology.
Context
Teleology
Teleology, from Greek τέλος, telos "end, purpose" and -λογία, logia, "a branch of learning", was coined by the philosopher Christian von Wolff in 1728. The concept derives from the ancient Greek philosophy of Aristotle, where the final cause (the purpose) of a thing is its function. However, Aristotle's biology does not envisage evolution by natural selection.
Phrases used by biologists like "a function of ... is to ..." or "is designed for" are teleological at least in language. The presence of real or apparent teleology in explanations of natural selection is a controversial aspect of the philosophy of biology, not least for its echoes of natural theology.
Natural theology
Before Darwin, natural theology bo |
https://en.wikipedia.org/wiki/Free%20software%20in%20Kerala | The state of Kerala, in India, has had an active Free software community since the early 1980s. The initial users were those who started using TeX in the city of Thiruvananthapuram. Subsequently, Free software user groups were formed in cities such as Thiruvananthapuram and Kochi, and around engineering colleges in the state. The Free software community in Kerala was instrumental in creating a policy environment at the state government level that favoured Free software. The Government of Kerala's policy on Free software gives first preference to Free and Open Source software for its IT requirements. The state claims to be the only state in the world where IT education is imparted using a Free software operating system.
Government Policy on Free software
The IT Policy for the state of Kerala had acknowledged the relevance of Free software as early as 2001.
The 2001 IT policy states: "The Government wishes to encourage the judicious use of open source/free software that complements/supplements proprietary software, to reduce the total cost of ownership of IT applications/solutions without compromising on the immediate and medium term value provided by the application. The Government welcomes research into the use of open/free software in the context of education, governance, and for general use at home, to make IT truly a part of the daily lives of the people of the State. The Government also encourages projects such as the Simputer that is low cost, based on open software, and attuned to the needs of the common man." In 2007, the Government of Kerala released its ICT policy, one of whose objectives was to mandate the appropriate use of Free Software in all ICT initiatives: "The Government realizes that Free Software presents a unique opportunity in building a truly egalitarian knowledge society. The Government will take all efforts to develop Free Software and Free Knowledge and shall encourage and mandate the appropriate use of Free Software in all ICT |
https://en.wikipedia.org/wiki/3DBenchy | The 3DBenchy is a 3D computer model specifically designed for testing the accuracy and capabilities of 3D printers. The 3DBenchy is described by its creator, Creative Tools, as "the jolly 3D printing torture-test" and was released (STL only) in April 2015, with a multi-part, multi-color model released in July 2015.
Due to its status as a common benchmark, it is believed to be the world's most 3D-printed object. The popular 3D-printing website Thingiverse (where the model was originally uploaded) lists the 3DBenchy as its most popular model of all time. The model itself is a tugboat design and, as with many 3D prints, actually floats in water given the right printing conditions.
Gallery
See also
Standard test image
Stanford bunny
Utah teapot
Suzanne (3D model)
References
External links
Download 3DBenchy
3DBenchy Thingiverse Page
DIY culture
Computing output devices
Engineering projects
3D graphics models
Test items
Fused filament fabrication
3D printing
Boats |
https://en.wikipedia.org/wiki/Metric%20Systems%20Corporation | Metric Systems Corporation (MSC) is an American company that develops, manufactures and sells wireless networking equipment and systems. Based in Carlsbad, California, MSC focuses on white space (radio) equipment and other systems for the commercial, industrial, and government marketplace.
History
In the late 2000s Metric Systems Corporation was tasked by Microsoft to provide White Space test equipment for use by the Federal Communications Commission in its decision to allow unlicensed use of White Space. Following the FCC's release of final rules for the use of TV-band devices in late 2010, Metric Systems began developing its line of VHF/UHF White Space Broadband Radios.
Patents and technology
Metric Systems Corporation holds several patents in the United States and Canada which focus on Dynamic Spectrum Management and wide area wireless networking. Key patents include methods and apparatuses for adaptively setting frequency channels in a multi-point wireless networking system and maintaining connectivity in the presence of noise and interference.
Product use and field trials
MSC's White Space products have been evaluated by a number of domestic and international agencies and companies. In February 2013 MSC's first generation White Space VHF/UHF Broadband Radios were delivered to Brazil's CPqD for evaluation.
In July 2013, Metric Systems Corporation's White Space equipment was used by the Port of Pittsburgh for testing inland waterways.
The first carrier-class TV Band White Space Radio, the RaptorX, was certified by the FCC for unlicensed use in 2015.
In Summer 2016, MSC's production White Space Infrastructure Radio, the RaptorXR, began field trials in the mid-western United States for educational, financial, and public safety applications.
References
Wireless network organizations |
https://en.wikipedia.org/wiki/Christos%20Kozyrakis | Christos (Christoforos) Kozyrakis (; born 1974) is a professor of Electrical Engineering and Computer Science at Stanford University, where he leads the multi-scale architecture & systems team (MAST). His current research interests are resource-efficient cloud computing, energy-efficient compute and memory systems, and architectural support for security. Kozyrakis received the 2015 ACM Maurice Wilkes Award for outstanding contributions to transactional memory systems.
Kozyrakis holds a Ph.D. degree from UC Berkeley (advised by David A. Patterson) and a B.Sc. from University of Crete. He is an IEEE fellow and an ACM fellow.
References
External links
Christos Kozyrakis homepage
Christos Kozyrakis profile at Google Scholar
The Maurice Wilkes Award
Living people
Greek computer scientists
Computer designers
University of Crete alumni
Fellow Members of the IEEE
Stanford University faculty
Stanford University Department of Computer Science faculty
Stanford University Department of Electrical Engineering faculty
Stanford University School of Engineering faculty
Fellows of the Association for Computing Machinery
Scientists from Heraklion
1974 births |
https://en.wikipedia.org/wiki/Schuylkill%20and%20Susquehanna%20Navigation%20Company | The Schuylkill and Susquehanna Navigation Company was a limited liability corporation founded in Pennsylvania on September 29, 1791.
The company was founded for the purpose of improving river navigation, which in the post-colonial United States era of the 1790s meant improving river systems, not canals. In this Pennsylvania scheme, however, two rivers, a large river, the Susquehanna and a smaller one, the Schuylkill were to be improved by clearing channels through obstructions and building dams where needed. To connect the two watersheds, the company proposed a summit level crossing at Lebanon, a length of almost between the two rivers. The completed project was intended to be part of a navigable water route from Philadelphia to Lake Erie and the Ohio Valley.
The original engineering concept developed by the Society, as well as the navigation company's charter, had been to build a canal up the Schuylkill Valley to Norristown, improving the Schuylkill River from there to Reading. From Reading, the canal was to extend to the Susquehanna via Lebanon. This would have required a four-mile summit crossing between the Tulpehocken and the Quittapahilla, with an artificial waterway connecting two separate river valleys, namely the Susquehanna and the Schuylkill watersheds. Its successful completion would have made the middle reach the first summit-level canal in the United States. The term refers to a canal that rises and then falls, as opposed to a lateral canal, which has a continuous fall only. In this case, the proposed canal, at 80 miles in length, would rise from the Susquehanna River in the west to the summit and then fall to the Schuylkill River in the east. It was to be the golden link between Philadelphia and the vast interior of Pennsylvania and beyond.
This proposed summit crossing offered a severe test of 18th-century engineering skills, materials and construction techniques. For both designing and operating a water-conveyance transportation syst |
https://en.wikipedia.org/wiki/International%20Speech%20Communication%20Association | The International Speech Communication Association (ISCA) is a non-profit organization and one of the two main professional associations for speech communication science and technology, the other association being the IEEE Signal Processing Society.
Purpose of the association
The purpose is to promote the study and application of automatic speech processing (in the two directions: speech recognition and speech synthesis) with several sub-topics like speaker recognition or speech compression. The activity of the association concerns all aspects of speech processing, from the computational aspects to the linguistic aspects as well as the theoretical aspects.
Conferences
ISCA organizes yearly the INTERSPEECH conference.
Most recent INTERSPEECH:
2013 Lyon
2014 Singapore
2015 Dresden
2016 San Francisco
2017 Stockholm
2018 Hyderabad
2019 Graz
2020 Shanghai (fully virtual)
2021 Brno (hybrid)
Forthcoming INTERSPEECH:
2022 Incheon
2023 Dublin
2024 Jerusalem
ISCA board
The current ISCA president is Sebastian Möller. The vice president is Odette Scharenborg, and the other board members are professionals in the field.
History of ISCA
ISCA is the result of the merger of ESCA (the European Speech Communication Association, created in 1987 in Europe) and PC-ICSLP (the Permanent Council for the organization of the International Conference on Spoken Language Processing, created in 1986 in Japan). The first ISCA event was held in 2000 in Beijing, China.
See also
Natural language processing
Speech technology
References
External links
ISCA web page
Linguistics organizations
Applied linguistics |
https://en.wikipedia.org/wiki/Botanical%20Latin | Botanical Latin is a technical language based on Neo-Latin, used for descriptions of botanical taxa. Until 2012, the International Code of Botanical Nomenclature mandated that Botanical Latin be used for the descriptions of most new taxa. It is still the only language other than English accepted for descriptions. The names of organisms governed by the Code also have forms based on Latin.
Botanical Latin is primarily a written language. It includes taxon names derived from any language or even arbitrarily derived, and consequently there is no single consistent pronunciation system. When speakers of different languages use Botanical Latin in speech, they use pronunciations influenced by their own languages, or, notably in French, there may be variant spellings based on the Latin. There are at least two pronunciation systems used for Latin by English speakers. Neither system, however, works across the full spectrum of botanical names, because many non-Latin words, such as people's names, have been used.
Origin
Alphonse Pyramus de Candolle described the language in 1880:
C'est le latin arrangé par Linné à l'usage des descriptions et, j'oserai dire, à l'usage de ceux qui n'aiment ni les complications grammaticales, ni les phrases disposées sens dessus dessous." (Quoted by W. T. Stearn) [It is the Latin arranged by Linnaeus for the purpose of descriptions and, I dare say, for the use of those who love neither grammatical complications nor phrases arranged topsy-turvy.]
De Candolle estimated that to learn Botanical Latin would take three months' work for an English speaker not already familiar with any language of Latin origin, and one month for an Italian.
William T. Stearn wrote:
Botanical Latin is best described as a modern Romance language of special technical application, derived from Renaissance Latin with much plundering of ancient Greek, which has evolved, mainly since 1700 and primarily through the work of Carl Linnaeus (1707–78), to serve as |
https://en.wikipedia.org/wiki/Solar-powered%20waste%20compacting%20bin | A solar-powered waste compactor is a smart device that reads a waste bin's fill-level in real-time and triggers an automatic compaction of the waste, effectively increasing the bin's capacity by up to 5-8 times. The compaction mechanism runs on a battery, which is charged by the solar panel. Fully charged, the battery reserve lasts for approximately 3–4 weeks, depending on the compaction frequency and usage patterns.
Solar-powered waste compactors are typically connected to a remote software platform through wireless 2G/3G networks. The platform enables waste collection managers to access real-time data analytics and route optimization.
Solar-powered compactors are primarily used in high foot traffic areas such as town centers, shopping malls, amusement parks, beaches, transit stations and sports stadiums.
Advantages
Some of the benefits of using solar-powered waste compactors include:
Reduced frequency of waste collections
Cleaner and more hygienic public spaces
Historical waste collection data analytics
Reduction in greenhouse gas emissions
Savings in operational waste collection costs
See also
BigBelly
Clever Bins
Ecube Labs
CitySolar
References
External links
Anta Swiss AG
CitySolar
BigBelly Solar
PEL Waste Reduction Equipment
Mr. Fill
Ecube Labs
BritishBins SolaPacta
Binology LLC
Waste management
Smart devices |
https://en.wikipedia.org/wiki/Ion%20%28serialization%20format%29 | Ion is a data serialization language developed by Amazon. It may be represented by either a human-readable text form or a compact binary form. The text form is a superset of JSON; thus, any valid JSON document is also a valid Ion document.
Data types
As a superset of JSON, Ion includes the following data types:
null: An empty value
bool: Boolean values
string: Unicode text literals
list: Ordered heterogeneous collection of Ion values
struct: Unordered collection of key/value pairs
The nebulous JSON 'number' type is strictly defined in Ion to be one of:
int: Signed integers of arbitrary size
float: 64-bit IEEE binary-encoded floating point numbers
decimal: Decimal-encoded real numbers of arbitrary precision
Ion adds these types:
timestamp: Date/time/time zone moments of arbitrary precision
symbol: Unicode symbolic atoms (aka identifiers)
blob: Binary data of user-defined encoding
clob: Text data of user-defined encoding
sexp: Ordered collections of values with application-defined semantics
Each Ion type supports a null variant, indicating a lack of value while maintaining a strict type (e.g., null.int, null.string).
The Ion format permits annotations to any value in the form of symbols. Such annotations may be used as metadata for otherwise opaque data (such as a blob).
Implementations
Amazon supported library implementations
C#
Go Lang
Python
JS
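As a usage illustration, the following minimal Python sketch assumes the amazon-ion package from PyPI, which exposes a simpleion module; the document content and field names are invented for the example, and API details may vary between library versions.
# Minimal sketch using the Python Ion implementation (assumes the amazon-ion
# package from PyPI; API details may differ between library versions).
from amazon.ion import simpleion

# Parse a text Ion document (Ion text is a superset of JSON, so unquoted
# field names and symbol values are allowed).
doc = simpleion.loads('{ name: "sensor-1", reading: 3.14, tags: [a, b] }')
print(doc['name'])      # -> sensor-1
print(doc['reading'])   # -> 3.14 (an Ion decimal, mapped to a Decimal subclass)

# Serialize back to Ion text, or to the compact binary form.
text_form = simpleion.dumps(doc, binary=False)
binary_form = simpleion.dumps(doc, binary=True)
print(text_form)
print(len(binary_form), "bytes of binary Ion")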
Examples
Sample document
// comments are allowed in Ion files using the double forward slash
{
key: "value", // key here is a symbol, it can also be a string as in JSON
nums: 1_000_000, // equivalent to 1000000, use of underscores with numbers is more readable
'A float value': 31415e-4, // key is a value that contains spaces
"An int value": .int,
annotated: age::35, // age here is the annotation to number 35
lists : 'hw grades'::[80, 85, 90], // any symbol can be used as an annotation
many_annot: I::have::many::annotations::true, // annotations are not nested, but rather, a list of annotations
sexp: (this (is a [valid] "Ion") last::value 42) // Ion S-expres |
https://en.wikipedia.org/wiki/SystemC%20AMS | SystemC AMS is an extension to SystemC for analog, mixed-signal and RF functionality. The SystemC AMS 2.0 standard was released on April 6, 2016 as IEEE Std 1666.1-2016.
Language specification
ToDo: description
Language features
ToDo: description
MoC - Model of Computation
A model of computation (MoC) is a set of rules defining the behavior and interaction between SystemC AMS primitive modules. SystemC AMS defines the following models of computation: timed data flow (TDF), linear signal flow (LSF) and electrical linear networks (ELN).
TDF - Timed Data Flow
In the timed data flow (TDF) model, components exchange analogue values with each other on a periodic basis at a chosen sampling rate, for example every 10 microseconds. By the sampling theorem, this would be sufficient to convey signals of up to 50 kHz bandwidth without aliasing artefacts. A TDF model defines a method called processing() that is invoked at the appropriate rate as simulation time advances. A so-called cluster of models shares a static schedule of when they should communicate; this sets the relative ordering of the calls to the processing() methods of each TDF instance in the cluster. The periodic behaviour of TDF allows it to operate independently of the main SystemC event-driven kernel used for digital logic.
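The scheduling idea can be illustrated outside SystemC. The following Python sketch is not SystemC AMS code; it is only a conceptual illustration, with invented module names, of a cluster whose processing() callbacks are invoked in a fixed static order once per sample period.
# Conceptual illustration of timed data flow (TDF) scheduling in plain Python.
# This is NOT SystemC AMS code; it only mimics a static schedule that invokes
# each module's processing() once per sample period.
import math

SAMPLE_PERIOD = 10e-6   # 10 microseconds, i.e. a 100 kHz sampling rate

class SineSource:
    def __init__(self, freq_hz):
        self.freq_hz = freq_hz
        self.out = 0.0
    def processing(self, t):
        self.out = math.sin(2 * math.pi * self.freq_hz * t)

class Gain:
    def __init__(self, source, k):
        self.source, self.k = source, k
        self.out = 0.0
    def processing(self, t):
        self.out = self.k * self.source.out

source = SineSource(freq_hz=1000.0)
amp = Gain(source, k=2.0)
schedule = [source, amp]            # static schedule for the cluster

for step in range(5):               # advance simulation time sample by sample
    t = step * SAMPLE_PERIOD
    for module in schedule:         # fixed relative ordering of processing() calls
        module.processing(t)
    print("t = %.1e s, amplified sample = %+.4f" % (t, amp.out))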
ELN - Electrical Linear Networks
The SystemC electrical linear networks (ELN) library provides a set of standard electrical components that enable SPICE-like simulations to be run. The three basic components, resistors, capacitors and inductors, are of course available. Further voltage-controlled variants, such as a transconductance amplifier (voltage-controlled current generator), enable most FET and other semiconductor models to be readily created. Current flowing in ELN networks of resistors can be solved with a suitable simultaneous equation solver; these are called the nodal equations. Where time-varying components, such as capacitors and inductors, are included, Euler |
https://en.wikipedia.org/wiki/1337x | 1337x is a website that provides a directory of torrent files and magnet links used for peer-to-peer file sharing through the BitTorrent protocol. According to the TorrentFreak news blog, 1337x is the second most popular torrent website as of 2023.
History
1337x was founded in 2007 and saw increasing popularity in 2016 after the closure of KickassTorrents. In October 2016, it introduced a website redesign with new functionality. The site is delisted from Google search and does not appear in Google search results; this action was taken following a request by Feelgood Entertainment in 2015. Also in 2015, the site moved from its older .pl domain to .to, partly in order to evade the block.
1337x's design has been compared to that of the now-defunct h33t. It has been touted as an alternative to The Pirate Bay in the face of that site's potential demise.
See also
Comparison of BitTorrent sites
Online piracy
Leet
References
External links
BitTorrent
BitTorrent websites
Web directories |
https://en.wikipedia.org/wiki/Brachymeiosis | Brachymeiosis was a hypothesized irregularity in the sexual reproduction of ascomycete fungi, a variant of meiosis following an "extra" karyogamy (nuclear fusion) step. The hypothesized process would have transformed four diploid nuclei into eight haploid ones. The current scientific consensus is that brachymeiosis does not occur in any fungi.
According to the current understanding, ascomycetes reproduce by forming male and female organs (antheridia/spermatia and ascogonia), transferring haploid nuclei from the antheridium to the ascogonium, and growing a dikaryotic ascus containing both nuclei. Karyogamy then occurs in the ascus to form a diploid nucleus, followed by meiosis and mitosis to form eight haploid nuclei in the ascospores. In 1895, the botanist R.A. Harper reported the observation of a second karyogamy event in the ascogonium prior to ascogeny. This would imply the creation of a tetraploid nucleus in the ascus, rather than a diploid one; in order to produce the observed haploid ascospores, a second meiotic reduction in chromosome count would then be necessary. The second reduction was hypothesized to occur during the second or third mitotic division in the ascus, even though chromosome reduction does not typically occur during mitosis. This supposed form of meiosis was termed “brachymeiosis” in 1908 by H. C. I. Fraser.
The existence of brachymeiosis was controversial throughout the first half of the twentieth century, with many conflicting results published. Then, research with improved staining techniques established clearly that only one reductive division occurs in the asci of all examined species, including some which had been believed to undergo brachymeiosis. As a result of these studies, the theories of double fusion and subsequent brachymeiosis were discarded around 1950.
References
Mycology
Cell cycle
Reproduction |
https://en.wikipedia.org/wiki/Cap%20set | In affine geometry, a cap set is a subset of $\mathbb{F}_3^n$ (an $n$-dimensional affine space over a three-element field) with no three elements in a line.
The cap set problem is the problem of finding the size of the largest possible cap set, as a function of $n$. The first few cap set sizes are 1, 2, 4, 9, 20, 45, 112, ... .
Cap sets may be defined more generally as subsets of finite affine or projective spaces with no three in line, where these objects are simply called caps. The "cap set" terminology should be distinguished from other unrelated mathematical objects with the same name, and in particular from sets with the compact absorption property in function spaces as well as from compact convex co-convex subsets of a convex set.
Example
An example of cap sets comes from the card game Set, a card game in which each card has four features (its number, symbol, shading, and color), each of which can take one of three values. The cards of this game can be interpreted as representing points of the four-dimensional affine space $\mathbb{F}_3^4$, where each coordinate of a point specifies the value of one of the features. A line, in this space, is a triple of cards that, in each feature, are either all the same as each other or all different from each other. The game play consists of finding and collecting lines among the cards that are currently face up, and a cap set describes an array of face-up cards in which no lines may be collected.
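The line condition can be checked mechanically: three distinct points of $\mathbb{F}_3^4$ form a line exactly when their coordinate-wise sum is zero modulo 3, which is the same as being all-equal or all-different in every feature. The following Python sketch (card encodings invented for the example) verifies the 16-card construction described in the next paragraph.
# Three Set cards (points of F_3^4) form a line exactly when, in every feature,
# the values are all the same or all different; equivalently, when the three
# coordinates sum to 0 modulo 3.
from itertools import combinations, product

def is_line(a, b, c):
    return all((x + y + z) % 3 == 0 for x, y, z in zip(a, b, c))

# Construction from the next paragraph: keep only two of the three values
# (here 0 and 1) in every feature, giving 2**4 = 16 cards.
cap = [p for p in product(range(3), repeat=4) if all(v in (0, 1) for v in p)]

# No three of these 16 cards form a line, so they are a cap set.
assert not any(is_line(a, b, c) for a, b, c in combinations(cap, 3))
print(len(cap), "cards, no line among them")   # -> 16 cards, no line among them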
One way to construct a large cap set in the game Set would be to choose two out of the three values for each feature, and place face up each of the cards that uses only one of those two values in each of its features. The result would be a cap set of 16 cards. More generally, the same strategy would lead to cap sets in $\mathbb{F}_3^n$ of size $2^n$. However, in 1971, Giuseppe Pellegrino proved that four-dimensional cap sets have maximum size 20. In terms of Set, this result means that some layouts of 20 cards have no line to be collected, but that every layout of 21 cards has at leas |
https://en.wikipedia.org/wiki/NIST%20Cybersecurity%20Framework | NIST Cybersecurity Framework is a set of guidelines for mitigating organizational cybersecurity risks, published by the US National Institute of Standards and Technology (NIST) based on existing standards, guidelines, and practices. The framework "provides a high level taxonomy of cybersecurity outcomes and a methodology to assess and manage those outcomes", in addition to guidance on the protection of privacy and civil liberties in a cybersecurity context. It has been translated to many languages, and is used by several governments and a wide range of businesses and organizations.
A 2016 study found that 70% of organizations surveyed see the NIST Cybersecurity Framework as a popular best practice for computer security, but many note that it requires significant investment.
Overview
The NIST Cybersecurity Framework is designed for individual businesses and other organizations to assess risks they face.
Version 1.0 was published by the US National Institute of Standards and Technology in 2014, originally aimed at operators of critical infrastructure. In 2017, a draft version of the framework, version 1.1, was circulated for public comment. Version 1.1 was announced and made publicly available on April 16, 2018. Version 1.1 is still compatible with version 1.0.
The changes include guidance on how to perform self-assessments, additional detail on supply chain risk management, guidance on how to interact with supply chain stakeholders, and encouragement of a vulnerability disclosure process.
The framework is divided into three parts, "Core", "Profile" and "Tiers". The "Framework Core" contains an array of activities, outcomes and references about aspects and approaches to cybersecurity. The "Framework Implementation Tiers" are used by an organization to clarify for itself and its partners how it views cybersecurity risk and the degree of sophistication of its management approach. A "Framework Profile" is a list of outcomes that an organization has chosen from the cate |
https://en.wikipedia.org/wiki/Professional%20Engineers%20Day%20%28U.S.%29 | Professional Engineers Day was launched by the National Society of Professional Engineers in 2016 to celebrate and raise public awareness of the contributions of licensed professional engineers in the United States. As of 2015, there were 474,777 licensed professional engineers in the U.S. The first Professional Engineers Day was celebrated on August 3, 2016.
History
The idea for Professional Engineers Day came from Tim Austin, PE, a professional engineer from Kansas who served as president of the National Society of Professional Engineers in 2015–16. While promotion of engineering in the US is common, such as the attention given to STEM fields and events such as the USA Science and Engineering Festival and National Engineers Week (U.S.), which was also founded by NSPE in 1951, Austin believed attention should be paid specifically to the contributions of licensed professional engineers because of NSPE's core principle which states, "Being a licensed professional engineer means more than just holding a certificate and possessing technical competence. It is a commitment to hold the public health, safety, and welfare above all other considerations."
Led by then Executive Director Mark J. Golden, CAE, Communications Director David Siegel, and Public Relations & Outreach Manager Stacey Ober, the NSPE staff turned the idea into the celebratory event known today as PE Day, which is mostly a virtual event across social media platforms. The success of PE Day and the efforts of NSPE staff were recognized by the Association of Media & Publishing in the category "Promotional Content: Social Media Campaign" in June 2018.
Licensing of professional engineers in the US began in 1907, when Clarence Johnston, the state engineer of Wyoming, presented a bill to the Wyoming legislature that would require registration for those representing themselves to the public as an engineer or land survey |
https://en.wikipedia.org/wiki/Homomorphic%20equivalence | In the mathematics of graph theory, two graphs, G and H, are called homomorphically equivalent if there exists a graph homomorphism $f \colon G \to H$ and a graph homomorphism $g \colon H \to G$. An example usage of this notion is that any two cores of a graph are homomorphically equivalent.
Homomorphic equivalence also comes up in the theory of databases. Given a database schema, two instances I and J on it are called homomorphically equivalent if there exists an instance homomorphism $f \colon I \to J$ and an instance homomorphism $g \colon J \to I$.
Deciding whether two graphs are homomorphically equivalent is NP-complete.
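For small graphs the definition can be checked directly by brute force, as in the following Python sketch (the example graphs are invented; the exhaustive search is exponential, reflecting the NP-completeness of the problem).
# Brute-force check of homomorphic equivalence for small undirected graphs.
# A graph is a pair (list of vertices, set of edges given as frozensets).
from itertools import product

def has_homomorphism(g, h):
    gv, ge = g
    hv, he = h
    # Try every mapping f: V(G) -> V(H); keep one that sends every edge to an edge.
    for images in product(hv, repeat=len(gv)):
        f = dict(zip(gv, images))
        if all(frozenset({f[u], f[v]}) in he for u, v in (tuple(e) for e in ge)):
            return True
    return False

def homomorphically_equivalent(g, h):
    return has_homomorphism(g, h) and has_homomorphism(h, g)

# A triangle and a triangle with a pendant vertex are homomorphically
# equivalent: the triangle is a core of both.
triangle = ([0, 1, 2], {frozenset(e) for e in [(0, 1), (1, 2), (2, 0)]})
triangle_plus = ([0, 1, 2, 3], {frozenset(e) for e in [(0, 1), (1, 2), (2, 0), (2, 3)]})
print(homomorphically_equivalent(triangle, triangle_plus))   # -> True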
In fact for any category C, one can define homomorphic equivalence. It is used in the theory of accessible categories, where "weak universality" is the best one can hope for in terms of injectivity classes; see
References
Graph theory
Equivalence (mathematics) |
https://en.wikipedia.org/wiki/Governing%20equation | The governing equations of a mathematical model describe how the values of the unknown variables (i.e. the dependent variables) change when one or more of the known (i.e. independent) variables change.
Physical systems can be modeled phenomenologically at various levels of sophistication, with each level capturing a different degree of detail about the system. A governing equation represents the most detailed and fundamental phenomenological model currently available for a given system.
For example, at the coarsest level, a beam is just a 1D curve whose torque is a function of local curvature. At a more refined level, the beam is a 2D body whose stress-tensor is a function of local strain-tensor, and strain-tensor is a function of its deformation. The equations are then a PDE system. Note that both levels of sophistication are phenomenological, but one is deeper than the other. As another example, in fluid dynamics, the Navier-Stokes equations are more refined than Euler equations.
As the field progresses and our understanding of the underlying mechanisms deepens, governing equations may be replaced or refined by new, more accurate models that better represent the system's behavior. These new governing equations can then be considered the deepest level of phenomenological model at that point in time.
Mass balance
A mass balance, also called a material balance, is an application of conservation of mass to the analysis of physical systems. It is the simplest governing equation, and it is simply a budget (balance calculation) over the quantity in question:
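Written out, the budget takes the standard form below (a hedged reconstruction of the usual balance equation rather than a formula taken from this excerpt):
\frac{\mathrm{d}m}{\mathrm{d}t} = \dot{m}_{\mathrm{in}} - \dot{m}_{\mathrm{out}} + \dot{m}_{\mathrm{generated}} - \dot{m}_{\mathrm{consumed}}
That is, the accumulation of mass in the system equals what flows in, minus what flows out, plus what is generated by reactions, minus what is consumed by them; for a non-reactive system the last two terms vanish.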
Differential equation
Physics
The governing equations in classical physics that are taught at universities are listed below.
balance of mass
balance of (linear) momentum
balance of angular momentum
balance of energy
balance of entropy
Maxwell-Faraday equation for induced electric field
Ampère-Maxwell equation for induced magnetic field
Gauss equation for electric flux
Gauss equation for magnetic |
https://en.wikipedia.org/wiki/Coffee%20Lake | Coffee Lake is Intel's codename for its eighth-generation Core microprocessor family, announced on September 25, 2017. It is manufactured using Intel's second 14 nm process node refinement. Desktop Coffee Lake processors introduced i5 and i7 CPUs featuring six cores (along with hyper-threading in the case of the latter) and Core i3 CPUs with four cores and no hyperthreading.
On October 8, 2018, Intel announced what it branded its ninth generation of Core processors, the Coffee Lake Refresh family. To avoid running into thermal problems at high clock speeds, Intel soldered the integrated heat spreader (IHS) to the CPU die instead of using thermal paste as on the Coffee Lake processors. The generation was defined by another increase of core counts.
Coffee Lake is used with the 300-series chipset, and officially does not work with the 100- and 200-series chipset motherboards. Although desktop Coffee Lake processors use the same physical LGA 1151 socket as Skylake and Kaby Lake, the pinout is electrically incompatible with these older processors and motherboards.
On April 2, 2018, Intel released additional desktop Core i3, i5, i7, Pentium Gold, Celeron CPUs, the first six-core Core i7 and i9 mobile CPUs, hyper-threaded four-core Core i5 mobile CPUs, and the first Coffee Lake ultra-power CPUs with Intel Iris Plus graphics.
On June 8, 2018, to commemorate the 40th anniversary of the Intel 8086 CPU architecture, Intel released the i7-8086K as a limited edition CPU, a renumbered and slightly higher clocked batch of the i7-8700K dies.
History
Coffee Lake's development was led by Intel Israel's processor design team in Haifa, Israel, as an optimization of Kaby Lake. Intel first launched its 8th-generation Intel Core family processors in August 2017. With the release of the 8th-generation Intel Core i9 mobile processor in 2018, Intel said it would be the highest-performance laptop processor it had ever built.
Features
Coffee Lake CPUs are built using the second refin |
https://en.wikipedia.org/wiki/System%20information%20modelling | System information modelling (SIM) is the process of modelling complex connected systems. System information models are digital representations of connected systems, such as electrical instrumentation and control, power, and communication systems. The objects modelled in a SIM have a 1:1 relationship with the objects in the physical system. Components, connections and functions are defined and linked as they would be in the real world.
Origins
The concept of SIM has existed since the mid 1990s. It was first proposed in 1994 by an Australian instrument, electrical and control system engineering company – I&E Systems Pty Ltd. Like many technological innovations the idea for SIM was born out of necessity. Since the mid-nineties, the complexity of power, control and Information and Communication Technology (ICT) systems has been growing exponentially due to rapid advances in technology; this has rendered the traditional paper-based methodologies and applications used for system design to become obsolete.
The cost of design-related activities can be up to 70% of the total project expenditure in an electrical instrumentation and control system (EICS) engineering project. Analyses revealed that the limitations of paper-based methods and workflows contributed significantly to this high design cost: information had to be duplicated across multiple documents, resulting in design errors and omissions and increasing the cost of labour. With this in mind, the company realized there was a need to shift away from traditional paper-based methods to a more efficient, systematic digital modelling approach.
The term 'System Information Modelling' was first published in a technical report in 2012 by Peter E.D. Love and Jingyang Zhou. The report presented empirical evidence to demonstrate that the use of a SIM could potentially improve productivity and reduce the cost to produce EICS documentation. The research examined a set of electrical engineering drawings of an |
https://en.wikipedia.org/wiki/Purely%20functional%20programming | In computer science, purely functional programming usually designates a programming paradigm—a style of building the structure and elements of computer programs—that treats all computation as the evaluation of mathematical functions.
Program state and mutable objects are usually modeled with temporal logic, as explicit variables that represent the program state at each step of a program execution: a variable state is passed as an input parameter of a state-transforming function, which returns the updated state as part of its return value. This style handles state changes without losing the referential transparency of the program expressions.
Purely functional programming consists of ensuring that functions, inside the functional paradigm, will only depend on their arguments, regardless of any global or local state. A pure functional subroutine only has visibility of changes of state represented by state variables included in its scope.
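A small Python sketch of this state-passing style (the names are invented, and Python does not itself enforce purity):
# State-passing style: the "state" of a counter is never mutated in place.
# Every operation takes the current state and returns (result, new state).
def increment(count):
    new_count = count + 1
    return new_count, new_count

def add(count, n):
    new_count = count + n
    return new_count, new_count

state0 = 0
value1, state1 = increment(state0)   # state0 is left untouched
value2, state2 = add(state1, 10)
print(state0, state1, state2)        # -> 0 1 11
# Because the functions depend only on their arguments, re-running
# increment(state0) always yields the same result (referential transparency).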
Difference between pure and impure functional programming
The exact difference between pure and impure functional programming is a matter of controversy. Sabry's proposed definition of purity is that all common evaluation strategies (call-by-name, call-by-value, and call-by-need) produce the same result, ignoring strategies that error or diverge.
A program is usually said to be functional when it uses some concepts of functional programming, such as first-class functions and higher-order functions. However, a first-class function need not be purely functional, as it may use techniques from the imperative paradigm, such as arrays or input/output methods that use mutable cells, which update their state as side effects. In fact, the earliest programming languages cited as being functional, IPL and Lisp, are both "impure" functional languages by Sabry's definition.
Properties of purely functional programming
Strict versus non-strict evaluation
Each evaluation strategy which ends on a purely functional program returns the same res |
https://en.wikipedia.org/wiki/Fractional%20synthetic%20rate | A fractional synthetic rate (FSR) is the rate at which a precursor compound is incorporated into a product per unit of product mass. The metric has been used to estimate the rate at which proteins, lipids, and lipoproteins are synthesized within humans and other animals. The formula used to calculate the FSR from a stable isotope tracer experiment is:
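The formula itself is missing from this excerpt; in the tracer literature it is commonly written in the form below (a hedged reconstruction, with the symbols defined here rather than taken from the source):
\mathrm{FSR} = \frac{E_{b,t_2} - E_{b,t_1}}{E_{p} \times (t_2 - t_1)} \times 100
where E_{b,t_1} and E_{b,t_2} are the isotopic enrichments of the product (for example, a protein-bound amino acid) at the start and end of the measurement period, E_p is the enrichment of the precursor pool, and t_2 - t_1 is the time between samples; the factor of 100 expresses the rate as a percentage per unit time.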
References
Metrics
Biochemistry
Biosynthesis |
https://en.wikipedia.org/wiki/Mean%20square | In mathematics and its applications, the mean square is normally defined as the arithmetic mean of the squares of a set of numbers or of a random variable.
It may also be defined as the arithmetic mean of the squares of the deviations between a set of numbers and a reference value (e.g., the reference value may be a mean or an assumed mean of the data), in which case it may be known as the mean square deviation.
When the reference value is the assumed true value, the result is known as mean squared error.
A typical estimate for the sample variance from a set of sample values uses a divisor of the number of values minus one, n-1, rather than n as in a simple quadratic mean, and this is still called the "mean square" (e.g. in analysis of variance): $s^2 = \frac{1}{n-1}\sum_{i=1}^{n}(x_i - \overline{x})^2$.
The second moment of a random variable, $\operatorname{E}[X^2]$, is also called the mean square.
The square root of a mean square is known as the root mean square (RMS or rms), and can be used as an estimate of the standard deviation of a random variable.
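A short numerical illustration of these quantities, using an arbitrary data set in Python:
# Mean square, mean square deviation, sample variance and root mean square
# for a small arbitrary data set.
import math

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
n = len(data)
mean = sum(data) / n

mean_square = sum(x * x for x in data) / n                       # arithmetic mean of the squares
msd = sum((x - mean) ** 2 for x in data) / n                     # mean square deviation about the mean
sample_variance = sum((x - mean) ** 2 for x in data) / (n - 1)   # divisor n - 1, as in analysis of variance
rms = math.sqrt(mean_square)                                     # root mean square

print(mean, mean_square, msd, sample_variance, rms)
# -> 5.0 29.0 4.0 4.571... 5.385...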
References
Means |
https://en.wikipedia.org/wiki/Apache%20Beam | Apache Beam is an open source unified programming model to define and execute data processing pipelines, including ETL, batch and stream (continuous) processing. Beam Pipelines are defined using one of the provided SDKs and executed in one of the Beam’s supported runners (distributed processing back-ends) including Apache Flink, Apache Samza, Apache Spark, and Google Cloud Dataflow.
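A minimal word-count style pipeline using the Python SDK (assuming the apache-beam package is installed; the input strings are invented). With no runner specified, the pipeline executes on the local DirectRunner:
# Minimal Apache Beam pipeline using the Python SDK.
import apache_beam as beam

with beam.Pipeline() as pipeline:   # defaults to the local DirectRunner
    (
        pipeline
        | "Create" >> beam.Create(["alpha beta", "beta gamma", "beta"])
        | "SplitWords" >> beam.FlatMap(lambda line: line.split())
        | "PairWithOne" >> beam.Map(lambda word: (word, 1))
        | "CountPerWord" >> beam.CombinePerKey(sum)
        | "Print" >> beam.Map(print)
    )
# The same pipeline code can be submitted to other runners (Flink, Spark,
# Dataflow, ...) by passing pipeline options instead of changing the code.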
History
Apache Beam is one implementation of the Dataflow model paper. The Dataflow model is based on previous work on distributed processing abstractions at Google, in particular on FlumeJava and Millwheel.
Google released an open SDK implementation of the Dataflow model in 2014 and an environment to execute Dataflows locally (non-distributed) as well as in the Google Cloud Platform service.
Timeline
Apache Beam makes minor releases every 6 weeks.
See also
List of Apache Software Foundation projects
References
Apache Software Foundation
Apache Software Foundation projects
Big data products
Cluster computing
Distributed stream processing
Google software
Hadoop
Java platform
Free software programmed in Java (programming language) |
https://en.wikipedia.org/wiki/Toshiba%20T1000LE | The Toshiba T1000LE was a laptop manufactured by Toshiba starting in 1990 as a member of their LE/SE/XE family. It used a 9.54/4.77 MHz Intel 80C86, with the clock speed being switchable by using function keys on the keyboard. The laptop came with a 20 MB hard drive, 1 MB of RAM, a 1.44M/720K switchable 3.5" floppy drive, and a blue-on-white, back-lit, "Toshiba Graphics" 640x400 STN LCD. The laptop came with a choice of either MS-DOS 3.3 or 4.01 stored in a socketed ROM. The laptop's RAM was expandable to 2 MB, 3 MB, 5 MB, or 9 MB with 1 MB, 2 MB, 4 MB, and 8 MB proprietary memory cards, respectively.
The Toshiba T1000LE was discontinued prior to 1994.
Specifications
Features
The Toshiba T1000LE was one of the first laptops to include both a hard drive and a NiCd battery. Previous laptops did not have enough power to run a hard drive from battery power (exceptions include the Toshiba T1200, which had a proprietary 26-pin JVC hard drive, and the Macintosh Portable, which used a lead-acid battery instead of a NiCd one).
The laptop has a notable lack of expansion ports, offering only an RS-232 port, a printer port, and the docking connector. The computer used the docking connector to connect to a Toshiba DeskStation II, giving it extra capabilities.
Unlike the original Toshiba T1000, this model does not have a handle that flips out of the bottom, nor a display that tilts 180 degrees.
Reliability Problems
As these laptops age further, with the oldest models being over 30 years old now, reliability becomes more of an issue. These laptops have a few known issues, none of which have a known fix as of yet. Most of these problems plague many other Toshiba portables from the era.
The most common of these issues is one in which the system boots for about one second and then turns off again, with a power supply error indicated by the LEDs. It is suspected that this is damage caused by leaking electrolytic capacitors in the system. All of these models will have leaking capacitors on |
https://en.wikipedia.org/wiki/Cylance | Cylance Inc., is an American software firm based in Irvine, California, that develops antivirus programs and other kinds of computer software that prevents viruses and malware.
In February 2019, the company was acquired by BlackBerry Limited. After the acquisition, it continues to operate as an independent subsidiary and will remain headquartered in Irvine, California.
Founding
Cylance was founded by Stuart McClure and Ryan Permeh in 2012. McClure was previously a co-founder of Foundstone, a security consultancy. He sold Foundstone to McAfee in 2004 and became that firm's chief technology officer.
Cylance's founding came about as a result of McClure's speeches on cybersecurity. In them, he was often asked how he protected his own computer. He noted that he lacked trust in any security technology since it was all reactive in nature (for example, legacy antivirus technology), meaning it only cleaned up after an attack or prevented it a second time. Consequently, McClure began developing a technology based on "proactive protection".
Funding
A July 2015 report indicated that Cylance had raised $42 million from investors including Draper Fisher Jurvetson, Kohlberg Kravis Roberts, Dell, Capital One, and TenEleven Ventures. It received another $100 million in June 2016 with lead investors Blackstone Tactical Opportunities (part of The Blackstone Group) and Insight Venture Partners. They received an investment from In-Q-Tel in September 2015.
Product Features
McClure says that Cylance's antivirus product does not rely on conventional security techniques such as unique signatures, heuristics, behavioral analysis, sandboxing, virtualization, or blacklisting. Rather, the product is claimed to use artificial intelligence to identify and stop attackers. McClure likens this approach to the human brain's way of identifying threats, in that the product "teaches" computers to identify indicators of an attack.
Operation Cleaver
Operation Cleaver was a covert cyberwarfare operation alle |
https://en.wikipedia.org/wiki/Hedonic%20game | In cooperative game theory, a hedonic game (also known as a hedonic coalition formation game) is a game that models the formation of coalitions (groups) of players when players have preferences over which group they belong to. A hedonic game is specified by giving a finite set of players, and, for each player, a preference ranking over all coalitions (subsets) of players that the player belongs to. The outcome of a hedonic game consists of a partition of the players into disjoint coalitions, that is, each player is assigned a unique group. Such partitions are often referred to as coalition structures.
Hedonic games are a type of non-transferable utility game. Their distinguishing feature (the "hedonic aspect") is that players only care about the identity of the players in their coalition, but do not care about how the remaining players are partitioned, and do not care about anything other than which players are in their coalition. Thus, in contrast to other cooperative games, a coalition does not choose how to allocate profit among its members, and it does not choose a particular action to play. Some well-known subclasses of hedonic games are given by matching problems, such as the stable marriage, stable roommates, and the hospital/residents problems.
The players in hedonic games are typically understood to be self-interested, and thus hedonic games are usually analyzed in terms of the stability of coalition structures, where several notions of stability are used, including the core and Nash stability. Hedonic games are studied both in economics, where the focus lies on identifying sufficient conditions for the existence of stable outcomes, and in multi-agent systems, where the focus lies on identifying concise representations of hedonic games and on the computational complexity of finding stable outcomes.
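As a concrete toy example ahead of the formal definition, the following Python sketch encodes a three-player hedonic game by utility numbers over coalitions (the values are invented) and tests Nash stability: a partition is Nash stable if no player prefers joining another coalition of the partition, or being alone, to staying put.
# A three-player hedonic game: each player ranks only the coalitions that
# contain it, encoded here by utilities (higher = more preferred; invented).
utility = {
    1: {frozenset({1, 2, 3}): 3, frozenset({1, 2}): 2, frozenset({1, 3}): 1, frozenset({1}): 0},
    2: {frozenset({1, 2, 3}): 1, frozenset({1, 2}): 3, frozenset({2, 3}): 2, frozenset({2}): 0},
    3: {frozenset({1, 2, 3}): 1, frozenset({1, 3}): 2, frozenset({2, 3}): 3, frozenset({3}): 0},
}

def nash_stable(partition):
    """True if no player prefers joining another coalition of the partition, or going alone."""
    for coalition in partition:
        for player in coalition:
            current = utility[player][coalition]
            deviations = [frozenset(other | {player}) for other in partition if other != coalition]
            deviations.append(frozenset({player}))      # going alone is always possible
            if any(utility[player][d] > current for d in deviations):
                return False
    return True

print(nash_stable([frozenset({1, 2, 3})]))               # -> True (grand coalition)
print(nash_stable([frozenset({1, 2}), frozenset({3})]))  # -> False (player 3 would join {1, 2})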
Definition
Formally, a hedonic game is a pair of a finite set of players (or agents), and, for each player a complete and transitive preference relation |
https://en.wikipedia.org/wiki/Rank%20of%20an%20elliptic%20curve | In mathematics, the rank of an elliptic curve is the rational Mordell–Weil rank of an elliptic curve defined over the field of rational numbers. Mordell's theorem says the group of rational points on an elliptic curve has a finite basis. This means that for any elliptic curve there is a finite subset of the rational points on the curve, from which all further rational points may be generated. If the number of rational points on a curve is infinite then some point in a finite basis must have infinite order. The number of independent basis points with infinite order is the rank of the curve.
The rank is related to several outstanding problems in number theory, most notably the Birch–Swinnerton-Dyer conjecture. It is widely believed that there is no maximum rank for an elliptic curve, and it has been shown that there exist curves with rank as large as 28, but it is widely believed that such curves are rare. Indeed, Goldfeld and later Katz–Sarnak conjectured that in a suitable asymptotic sense (see below), the rank of elliptic curves should be 1/2 on average. In other words, half of all elliptic curves should have rank 0 (meaning that the infinite part of its Mordell–Weil group is trivial) and the other half should have rank 1; all remaining ranks consist of a total of 0% of all elliptic curves.
Heights
The Mordell–Weil theorem shows that $E(\mathbb{Q})$ is a finitely generated abelian group, thus $E(\mathbb{Q}) \cong E(\mathbb{Q})_{\mathrm{tors}} \oplus \mathbb{Z}^r$, where $E(\mathbb{Q})_{\mathrm{tors}}$ is the finite torsion subgroup and r is the rank of the elliptic curve.
In order to obtain a reasonable notion of 'average', one must be able to count elliptic curves somehow. This requires the introduction of a height function on the set of rational elliptic curves. To define such a function, recall that a rational elliptic curve can be given in terms of a Weierstrass form, that is, we can write
$E_{A,B} \colon y^2 = x^3 + Ax + B$ for some integers $A, B$. Moreover, this model is unique if for any prime number $p$ such that $p^4$ divides $A$, we have that $p^6$ does not divide $B$. We can then assume that $A, B$ are integers that satisfy this property and defin |
https://en.wikipedia.org/wiki/Genetic%20method | The genetic method is a method of teaching mathematics coined by Otto Toeplitz in 1927. As an alternative to the axiomatic system, the method suggests using history of mathematics to deliver excitement and motivation and engage the class.
History
Otto Toeplitz, a research mathematician in the area of functional analysis, introduced the method in his manuscript "The problem of calculus courses at universities and their demarcation against calculus courses at high schools" in 1927. A part of this manuscript was published in a book in 1949, after Toeplitz's death.
Toeplitz's method was not completely new at the time. In his 1895 talk given at the public meeting of the royal society of sciences in Goettingen, "On the arithmetization of mathematics", the famous German mathematician Felix Klein suggested the idea "that on a small scale, a learner naturally and always has to repeat the same developments that the sciences went through on a large scale".
In addition, the genetic method was occasionally applied in Gerhard Kowalewski's book from 1909, "The classical problems of the analysis of the infinite".
In 1962 the mathematics education in the US met a situation similar to that of Toeplitz in 1926 in Germany, in connection with the introduction of "New Mathematics". Shortly after the Sputnik crisis, a "New Mathematics" reform was introduced to improve the level of mathematics education in the US, so that the threat of Soviet engineers, assumed to be well educated in mathematics, could be met. To prepare students for advanced mathematics, the curriculum shifted to focus on abstraction and rigor. One of the more reasonable responses to "New Mathematics" was a collective statement by Lipman Bers, Morris Kline, George Pólya, and Max Schiffer, cosigned by 61 others, that was published in "The Mathematics Teacher" and The American Mathematical Monthly in 1962. In this letter, the undersigned called for the use of the genetic method:
Also, in the 1980s, departments of mat |