source stringlengths 31 203 | text stringlengths 28 2k |
|---|---|
https://en.wikipedia.org/wiki/Journal%20of%20Group%20Theory | The Journal of Group Theory is a bimonthly peer-reviewed mathematical journal covering all aspects of group theory. It was established in 1998 and is published by Walter de Gruyter. The editor-in-chief is Chris Parker (University of Birmingham).
Abstracting and indexing
The journal is abstracted and indexed in:
Its 2018 MCQ is 0.48. According to the Journal Citation Reports, the journal has a 2018 impact factor of 0.47, and the 5-year impact factor is 0.52.
References
External links
Mathematics journals
Bimonthly journals
De Gruyter academic journals
Academic journals established in 1998
English-language journals |
https://en.wikipedia.org/wiki/Zero-rating | Zero-rating is the practice of providing Internet access without financial cost under certain conditions, such as by permitting access to only certain websites or by subsidizing the service with advertising or by exempting certain websites from the data allowance.
Commentators often discuss zero-rating in the context of net neutrality. While most sources report that the use of zero-rating is contrary to the principle of net neutrality, opinions are mixed among net neutrality advocates about the extent to which people can benefit from zero-rating programs while retaining net neutrality protections. Supporters of zero-rating argue that it enables consumers to choose to access more data and leads to more people using online services, but critics believe zero-rating exploits the poor, creates opportunities for censorship, and disrupts the free market.
Existing programs
Internet services like Facebook, Wikipedia and Google have built special programs that use zero-rating as a means to provide their services more broadly in developing markets. The benefit for these new customers, most of whom rely on mobile networks to connect to the Internet, is subsidised access to these providers' services. The results of these efforts have been mixed, with adoption in a number of markets, sometimes overestimated expectations, and a perceived lack of benefits for mobile network operators. In Chile, the national telecom regulator ruled that this practice violated net neutrality laws and had to end by June 1, 2014. The Federal Communications Commission did not ban zero-rating programs, but it "acknowledged that they could violate the spirit of net neutrality".
Since June 2014, U.S. mobile provider T-Mobile US has offered zero-rated access to participating music streaming services to its mobile internet customers. T-Mobile launched its plan called “Music Freedom” which would exempt users of T-Mobile from having to pay premium prices fo |
https://en.wikipedia.org/wiki/Ulam%20matrix | In mathematical set theory, an Ulam matrix is an array of subsets of a cardinal number with certain properties. Ulam matrices were introduced by Stanislaw Ulam in his 1930 work on measurable cardinals: they may be used, for example, to show that a real-valued measurable cardinal is weakly inaccessible.
Definition
Suppose that κ and λ are cardinal numbers, and let F be a λ-complete filter on λ. An Ulam matrix is a collection of subsets A_{α,β} of λ, indexed by pairs (α, β) with α in κ and β in λ, such that
If β ≠ γ, then A_{α,β} and A_{α,γ} are disjoint.
For each β, the union over α of the sets A_{α,β} is in the filter F.
References
Set theory |
https://en.wikipedia.org/wiki/Space%20Shuttle%3A%20A%20Journey%20into%20Space | Space Shuttle: A Journey into Space is a space flight simulator game designed by Steve Kitchen for the Atari 2600 and published by Activision in 1983. It is one of the first realistic spacecraft simulations available for home systems. Space Shuttle was adapted to the Atari 8-bit family and Atari 5200 by Bob Henderson (1984), then ported to the ZX Spectrum (1984), Commodore 64 (1984), Amstrad CPC (1986), and MSX (1986). The 1984 Activision Software catalog also mentions an Apple II version.
The player controls the most critical flight phases of the Space Shuttle such as launch, stabilizing orbit, docking, deorbit burn, reentry, and landing, each with its own set of instructions to follow. The original Atari 2600 version came with an overlay since it made use of all the switches.
Reception
In an April 1984 review for Video Games magazine, Dan Persons wrote:
Space Shuttle is not a game for everybody. It requires a considerable amount of patience and, perhaps not too surprisingly, a considerable amount of brainpower. Players who seek the visceral thrills of the standard shoot'em-up may ultimately find this simulation's complexity frustrating. But those of you who are ready for a richer, more sophisticated experience will probably recognize Space Shuttle for the monumental achievement it is.
References
External links
Review in GAMES Magazine
1983 video games
Activision games
Amstrad CPC games
Atari 2600 games
Atari 5200 games
Atari 8-bit family games
Cancelled Apple II games
Commodore 64 games
MSX games
Realistic space simulators
Space flight simulator games
ZX Spectrum games
Video games developed in the United States |
https://en.wikipedia.org/wiki/Fair%20cake-cutting | Fair cake-cutting is a kind of fair division problem. The problem involves a heterogeneous resource, such as a cake with different toppings, that is assumed to be divisible – it is possible to cut arbitrarily small pieces of it without destroying their value. The resource has to be divided among several partners who have different preferences over different parts of the cake, i.e., some people prefer the chocolate toppings, some prefer the cherries, some just want as large a piece as possible. The division should be unanimously fair – each person should receive a piece believed to be a fair share.
The "cake" is only a metaphor; procedures for fair cake-cutting can be used to divide various kinds of resources, such as land estates, advertisement space or broadcast time.
The prototypical procedure for fair cake-cutting is divide and choose, which is mentioned in the book of Genesis. It solves the fair division problem for two people. The modern study of fair cake-cutting was initiated during World War II, when Hugo Steinhaus asked his students Stefan Banach and Bronisław Knaster to find a generalization of divide-and-choose to three or more people. They developed the last diminisher procedure. Today, fair cake-cutting is the subject of intense research in mathematics, computer science, economics and political science.
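The divide-and-choose procedure mentioned above can be sketched directly. The sketch below is a minimal numerical illustration, not a canonical implementation: the unit-interval cake, the density functions, and the bisection tolerance are modeling choices of this example.

```python
def value(density, a, b, steps=2000):
    """Numerically integrate an agent's value density over the piece [a, b]."""
    h = (b - a) / steps
    return sum(density(a + (i + 0.5) * h) for i in range(steps)) * h

def divide_and_choose(d1, d2):
    """Fair division of the unit-interval 'cake' between two agents.

    Agent 1 cuts at a point where the two pieces are equal in her own
    valuation (found by bisection); agent 2 takes the piece he prefers.
    Returns (agent 1's piece, agent 2's piece) as (start, end) intervals.
    """
    half = value(d1, 0.0, 1.0) / 2
    lo, hi = 0.0, 1.0
    for _ in range(40):                      # bisection for the halving cut
        mid = (lo + hi) / 2
        if value(d1, 0.0, mid) < half:
            lo = mid
        else:
            hi = mid
    x = (lo + hi) / 2
    # Agent 2 chooses whichever piece he values more.
    if value(d2, 0.0, x) >= value(d2, x, 1.0):
        return (x, 1.0), (0.0, x)
    return (0.0, x), (x, 1.0)
```

Each agent ends up with a piece worth at least half of the whole cake by her own measure, which is exactly the "unanimously fair" guarantee described above for two partners.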
Assumptions
There is a cake C, which is usually assumed to be either a finite 1-dimensional segment, a 2-dimensional polygon, or a finite subset of the multidimensional Euclidean space R^d.
There are n people with subjective value functions over C. Each person i has a value function Vi which maps subsets of C to numbers. All value functions are assumed to be absolutely continuous with respect to the length, area or (in general) Lebesgue measure. This means that there are no "atoms" – there are no singular points to which one or more agents assign a positive value, so all parts of the cake are divisible. In many cases, the value functions are assumed t |
https://en.wikipedia.org/wiki/Cantor%20algebra | In mathematics, a Cantor algebra, named after Georg Cantor, is one of two closely related Boolean algebras, one countable and one complete.
The countable Cantor algebra is the Boolean algebra of all clopen subsets of the Cantor set. This is the free Boolean algebra on a countable number of generators. Up to isomorphism, this is the only nontrivial Boolean algebra that is both countable and atomless.
The complete Cantor algebra is the complete Boolean algebra of Borel subsets of the reals modulo meager sets. It is isomorphic to the completion of the countable Cantor algebra. (The complete Cantor algebra is sometimes called the Cohen algebra, though "Cohen algebra" usually refers to a different type of Boolean algebra.) The complete Cantor algebra was studied by von Neumann in 1935, who showed that it is not isomorphic to the random algebra of Borel subsets modulo measure zero sets.
References
Forcing (mathematics)
Boolean algebra |
https://en.wikipedia.org/wiki/Dream%20Build%20Play | Dream Build Play (also known as Dream-Build-Play, Dream.Build.Play, and DreamBuildPlay) is an annual $75,000 Microsoft video game contest used to promote Microsoft XNA Game Studio and eventually Xbox LIVE Indie Games. It was announced in 2006, started in 2007, and ran until 2012. It was restarted in 2017 as the Dream.Build.Play 2017 Challenge. Notable winners include Dust: An Elysian Tail and The Dishwasher: Dead Samurai.
Concept
Dream-Build-Play, as it was first called, is a game development contest designed to promote and encourage experienced game developers and enthusiasts to create innovative and fun-to-play indie games for Windows and Xbox 360 using Microsoft XNA Game Studio. In 2017, the contest was relaunched to challenge indie game developers to create UWP games across four categories.
Notable entries
Many winners have gone on to be published on Xbox LIVE Arcade:
A.R.E.S.: Extinction Agenda (Entered as A.R.E.S.)
Blazing Birds
Capsized
Cloudberry Kingdom
Dust: An Elysian Tail
Kung Fu Strike (Entered as HurricaneX, HurricaneX2, and HurricaneX2 Evolution in three separate Dream Build Play contests.)
The Adventures of Shuggy (Entered as Shuggy.)
The Bridge
The Dishwasher: Dead Samurai
Yo Ho Kablammo!
Zeit²
Dream Build Play Challenges
Here is a breakdown of all Dream Build Play Challenges, including lesser-known Warm-Up Challenges:
2007 Warm-Up Challenge
Microsoft offered a Relatives of Spacewar Warm-Up Challenge as a precursor to the first Dream Build Play, awarding US$500 to the top five entries. Entrants had to create a game based on the Spacewar Starter Kit.
Winners
Top 5:
Earth vs. Mars (Minsk, Belarus)
Udder Assault (Bellevue, WA USA)
G, (Bologna, Italy)
Viduce (Lyon, France)
Cracklin Crackles (Chicago, IL USA)
2007 Challenge
Also known as Dream-Build-Play 1.0, this is the first Dream Build Play Challenge. In it, Microsoft offered an Xbox LIVE Arcade publishing contract to the top four entries out of 4,500 participants, with the |
https://en.wikipedia.org/wiki/Rudder%20%28software%29 | Rudder is an open source audit and configuration management utility to help automate system configuration across large IT infrastructures. Rudder relies on a lightweight local agent installed on each managed machine.
Rudder is produced by Normation, founded in 2010. Its server-side web interface is written in Scala and its local agent in C; both are published as free software under the GNU General Public License 3.0.
Features
Host inventory
Feature-complete Web interface
Standardized, reusable policies
Custom Policy editor
Central reporting and historic information for policy applied to hosts
Grouping based on search queries run against inventory
Automatic updating of such groups (dynamic groups)
Dynamic generation of per-host policies (lessens risk of data leaks from shared policy)
Change Request / Validation
REST API
Git backend
History
Rudder was created by the founding team of Normation and first released as free software in October 2011.
Rudder 3.0 was released in February 2015.
Platform support
The following operating systems are supported as a Root server:
Debian Linux 9 and 10
Ubuntu 16.04 LTS, 18.04 LTS and 20.04 LTS
Red Hat Enterprise Linux (RHEL) / CentOS 7 and 8
SUSE Linux Enterprise Server (SLES) 12 and 15
The following operating systems are supported for Rudder Nodes and packages are available for these platforms:
Debian Linux 5 to 10
Ubuntu 10.04 LTS to 20.04 LTS
Red Hat Enterprise Linux (RHEL) / CentOS 3 to 8
SUSE Linux Enterprise Server (SLES) 10 to 15
IBM AIX 5 to 7
Slackware 14
Microsoft Windows Server 2008R2 or higher
See also
CFEngine
Ansible (software)
Bcfg2
Chef (software)
Puppet (software)
Salt (software)
Comparison of open source configuration management software
DevOps
Otter (software)
References
External links
Official website
Configuration management
Virtualization software for Linux
Software using the GNU AGPL license
Linux configuration utilities
Linux package management-related software
Uni |
https://en.wikipedia.org/wiki/Evaluation%20of%20binary%20classifiers | The evaluation of binary classifiers compares two methods of assigning a binary attribute, one of which is usually a standard method and the other of which is being investigated. There are many metrics that can be used to measure the performance of a classifier or predictor; different fields have different preferences for specific metrics due to different goals. For example, in medicine sensitivity and specificity are often used, while in computer science precision and recall are preferred. An important distinction is between metrics that are independent of prevalence (how often each category occurs in the population) and metrics that depend on prevalence – both types are useful, but they have very different properties.
Probabilistic classification models go beyond providing binary outputs and instead produce probability scores for each class. These models are designed to assess the likelihood or probability of an instance belonging to different classes. In the context of evaluating probabilistic classifiers, alternative evaluation metrics have been developed to properly assess the performance of these models. These metrics take into account the probabilistic nature of the classifier's output and provide a more comprehensive assessment of its effectiveness in assigning accurate probabilities to different classes. These evaluation metrics aim to capture the degree of calibration, discrimination, and overall accuracy of the probabilistic classifier's predictions.
Contingency table
Given a data set, a classification (the output of a classifier on that set) gives two numbers: the number of positives and the number of negatives, which add up to the total size of the set. To evaluate a classifier, one compares its output to another reference classification – ideally a perfect classification, but in practice the output of another gold standard test – and cross tabulates the data into a 2×2 contingency table, comparing the two classifications. One then evaluates the clas |
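The cross-tabulation just described can be computed in a few lines. The sketch below is a minimal illustration; the label encoding (1 = positive, 0 = negative) and the particular metrics chosen are assumptions of this example, not the only possible conventions.

```python
def confusion_counts(y_true, y_pred):
    """Cross-tabulate predictions against reference labels (1 = positive)."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))  # true positives
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))  # false positives
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))  # false negatives
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))  # true negatives
    return tp, fp, fn, tn

def metrics(tp, fp, fn, tn):
    return {
        # prevalence-independent: each is computed within one true class
        "sensitivity": tp / (tp + fn),   # a.k.a. recall
        "specificity": tn / (tn + fp),
        # prevalence-dependent: these mix counts from both true classes
        "precision": tp / (tp + fp),
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }
```

Note how sensitivity and specificity condition on the true class, so they do not change if the class balance of the test set changes, whereas precision and accuracy do — the distinction drawn in the opening paragraph.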
https://en.wikipedia.org/wiki/Maharam%20algebra | In mathematics, a Maharam algebra is a complete Boolean algebra with a continuous submeasure (defined below). They were introduced by Dorothy Maharam in 1947.
Definitions
A continuous submeasure or Maharam submeasure on a Boolean algebra is a real-valued function m such that
m(0) = 0, m(1) = 1, and m(x) > 0 if x ≠ 0.
If x ≤ y, then m(x) ≤ m(y).
m(x ∨ y) ≤ m(x) + m(y).
If x_n is a decreasing sequence with greatest lower bound 0, then the sequence m(x_n) has limit 0.
A Maharam algebra is a complete Boolean algebra with a continuous submeasure.
Examples
Every probability measure is a continuous submeasure, so as the corresponding Boolean algebra of measurable sets modulo measure zero sets is complete, it is a Maharam algebra.
solved a long-standing problem by constructing a Maharam algebra that is not a measure algebra, i.e., that does not admit any countably additive strictly positive finite measure.
References
Boolean algebra |
https://en.wikipedia.org/wiki/Collapsing%20algebra | In mathematics, a collapsing algebra is a type of Boolean algebra sometimes used in forcing to reduce ("collapse") the size of cardinals. The posets used to generate collapsing algebras were introduced by Azriel Lévy in 1963.
The collapsing algebra of λ^ω is a complete Boolean algebra with at least λ elements but generated by a countable number of elements. As the size of countably generated complete Boolean algebras is unbounded, this shows that there is no free complete Boolean algebra on a countable number of elements.
Definition
There are several slightly different sorts of collapsing algebras.
If κ and λ are cardinals, then the Boolean algebra of regular open sets of the product space κ^λ is a collapsing algebra. Here κ and λ are both given the discrete topology. There are several different options for the topology of κ^λ. The simplest option is to take the usual product topology. Another option is to take the topology generated by open sets consisting of functions whose value is specified on less than λ elements of λ.
References
Boolean algebra
Forcing (mathematics) |
https://en.wikipedia.org/wiki/Reza%20Farajidana | Reza Faraji-Dana (born 1960 in Qom) is a reformist Iranian politician and a former minister of Science, Research and Technology under President Hassan Rouhani. He served as president of the University of Tehran and as president of the Iranian branch of the Institute of Electrical and Electronics Engineers.
Biography
Reza Faraji-Dana was born in 1960 in Qom and received his mathematics diploma from Hakim Nezami high school in 1978. In the same year, he was admitted to Shiraz University to study electrical engineering.
He studied electrical engineering at the University of Tehran and at the University of Waterloo in Ontario, Canada, graduating with a Ph.D. from the University of Waterloo in 1993. He is a professor of electrical engineering at the University of Tehran.
Dismissal
The Iranian parliament dismissed him as Minister of Science through impeachment on 20 August 2014. Conservative MPs accused him of permitting students involved in the 2009–10 Iranian election protests to continue their education in public universities, and of appointing senior department heads from among people involved in the mass protests of 2009. Some of the MPs who voted for his dismissal had allegedly benefited from unlawful scholarships under the previous government, and a list of those granted unlawful scholarships was revealed during Faraji-Dana's tenure.
References
Government ministers of Iran
University of Waterloo alumni
1960 births
Living people
Presidential advisers of Iran
Impeached Iranian officials removed from office
Islamic Revolutionary Guard Corps personnel of the Iran–Iraq War
Iranian electrical engineers
Iranian engineers
Ministers of science of Iran
People from Qom
Microwave engineers
Academic staff of the University of Tehran |
https://en.wikipedia.org/wiki/Ronald%20J.%20Ross | Ronald J. Ross is a Cleveland, Ohio radiologist known for research on brain injury in professional and amateur boxers and for the first clinical use of nuclear magnetic resonance imaging (NMR, later known as MRI) on human patients. Ross is also credited with the first use of head and whole-body computed tomography (CT) imaging in a private clinical setting in the United States.
Biography
Early life and education
Ross is one of three children born of Lithuanian and Russian immigrants. He grew up and attended school in Cleveland Heights, Ohio. Ross earned his undergraduate degree from Case Western Reserve University in 1956 and received his medical degree from Albert Einstein College of Medicine in 1960.
Career
After completing his internship and residency in radiology in Cleveland, Ross was awarded a Fulbright scholarship and Government of Sweden Award in 1964 to complete his fellowship studies in radiology at Karolinska University Hospital. In the mid-1970s, he opened the first U.S. based private diagnostic imaging center that featured a whole body CT scanner for clinical use. In 1981, he performed the world's first MRI clinical studies on patients.
Ross's medical research includes brain damage in boxers and the MRI evaluation of diseases of the breast. Ross served as an abstractor for radiology journals. In 1983, his study of brain injuries in professional and amateur boxers was published by the Journal of the American Medical Association; the study drew global attention, stirred up controversy in the boxing world, and spurred the American Medical Association to call for stricter medical supervision of the sport in the United States.
In 1998, Ross was elected to the National Fulbright Association Board of Trustees in Washington, D.C., and served as its national president from 1998 to 2000.
Ross retired from the practice of medicine in 2000 after a career that spanned 40 years. He is currently the Director Emeritus of the Department of Radiology and Chairman of |
https://en.wikipedia.org/wiki/Threema | Threema is a paid cross-platform encrypted instant messaging app developed by Threema GmbH in Switzerland and launched in 2012. The service operates on a decentralized architecture and offers end-to-end encryption. Users can make voice and video calls, send photos, files, and voice notes, share locations, and create groups. Unlike many other popular secure messaging apps, Threema does not require a phone number or email address for registration, only a one-time purchase. Threema is available on iOS and Android, has clients for Windows, macOS, and Linux, and can be accessed via a web browser, though the web client requires the mobile app to function.
Features
The service claims to be based on the privacy by design principles by not requiring a phone number or other personally identifiable information. This helps anonymize the users to a degree.
Threema uses a user ID, created by a random generator after the initial app launch, instead of requiring a linked email address or phone number to send messages. It is possible to find other users by phone number or email address if the user allows the app to synchronize their address book. Linking a phone number or email address to a Threema ID is optional, so the service can be used anonymously. Users can verify the identity of their Threema contacts by scanning their QR code when they meet physically. The QR code contains the public key of the user, which is cryptographically tied to the ID and will not change during the lifetime of the identity. Using this strong authentication feature, users can make sure they have the correct public key from their chat partners, which provides additional security against a man-in-the-middle attack. Threema distinguishes three levels of authentication (trust levels of a contact's identity). The verification level of each contact is displayed in the Threema application as dots next to the corresponding contact.
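The idea behind this kind of in-person QR verification can be illustrated in a few lines. The sketch below is hypothetical: the fingerprint scheme, function names, and key format are assumptions of this example, not Threema's actual QR payload format or verification code.

```python
import hashlib

def fingerprint(public_key: bytes) -> str:
    # Hypothetical scheme: a truncated SHA-256 digest of the raw public key.
    # Real apps define their own fingerprint/QR payload formats.
    return hashlib.sha256(public_key).hexdigest()[:32]

def verify_contact(stored_key: bytes, scanned_key: bytes) -> bool:
    """Grant the highest trust level only if the key scanned from the
    contact's QR code in person matches the key the app already holds."""
    return fingerprint(stored_key) == fingerprint(scanned_key)
```

The security argument is the one given above: a man-in-the-middle would have to substitute his own public key, and that substitution is exactly what an in-person comparison of fingerprints detects.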
In addition to text messaging, users can make voice and video calls, send multimedia, locations, voice |
https://en.wikipedia.org/wiki/Winquist%20and%20Hansen%20classification | The Winquist and Hansen classification is a system of categorizing femoral shaft fractures based upon the degree of comminution.
Classification
References
Bone fractures
Orthopedic classifications
Injuries of hip and thigh |
https://en.wikipedia.org/wiki/Marsh%20Award%20for%20Conservation%20Biology | The Marsh Award for Conservation Biology, established 1991, is an award run in partnership between the Zoological Society of London (ZSL) and the Marsh Charitable Trust that recognises an individual for his or her "contributions of fundamental science to the conservation of animal species and habitats".
Recipients
1991 – Robert M. May
1992 – Derek A. Ratcliffe
1993 – Georgina M. Mace
1994 – Ian Newton
1995 – John Goss-Custard
1996 – Jeremy A. Thomas
1997 – Rhys E. Green
1998 – Peter S. Maitland
1999 – John Croxall
2000 – Andrew Balmford
2001 – E.J. Milner-Gulland
2002 – Callum Roberts
2003 – Stuart Pimm
2004 – Chris D. Thomas
2005 – William J. Sutherland
2006 – Sarah Wanless
2007 – Stuart Butchart
2008 – Isabelle M. Côté
2009 – Ana Rodrigues
2010 – Paul Donald
2011 – Jane Hill
2012 – Dave Goulson
2013 – Debbie Pain
2014 – Ben Collen
2015 – Stephen Redpath
2016 – Richard Griffiths
2017 – Susan Cheyne
2018 – Steffen Oppel
2020 – Michael W. Bruford
2021 – Rosie Woodroffe
2022 – Kate Jones
See also
List of environmental awards
References
Ecology
1991 establishments in the United Kingdom
British awards
Environmental awards |
https://en.wikipedia.org/wiki/AMD%20PowerTune | AMD PowerTune is a series of dynamic frequency scaling technologies built into some AMD GPUs and APUs that allow the clock speed of the processor to be dynamically changed (to different P-states) by software. This allows the processor to meet the instantaneous performance needs of the operation being performed while minimizing power draw, heat generation and noise. AMD PowerTune aims to satisfy thermal design power constraints while maximizing performance.
Besides the reduced energy consumption, AMD PowerTune helps to lower the noise levels created by the cooling in desktop computers, and extends battery life in mobile devices. AMD PowerTune is the successor to AMD PowerPlay.
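The core idea — dynamically choosing a P-state so the chip stays inside its thermal design power budget — can be sketched in a few lines. The P-state table, the power estimates, and the selection policy below are illustrative assumptions, not AMD's actual firmware logic.

```python
# Illustrative P-state table: (clock in MHz, estimated board power in watts),
# sorted fastest-first. The numbers are made up, not AMD specifications.
PSTATES = [(880, 250.0), (700, 180.0), (500, 120.0)]

def pick_pstate(budget_watts):
    """Pick the fastest P-state whose power estimate fits the thermal budget."""
    for mhz, watts in PSTATES:
        if watts <= budget_watts:
            return mhz
    return PSTATES[-1][0]  # out of budget entirely: clamp to the slowest state
```

Under a generous budget the governor leaves the chip at its top clock; as the budget shrinks (e.g. a hot enclosure or a laptop power profile), it steps down through the table, which is the trade-off between performance and power/heat/noise described above.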
Support for "PowerPlay" was added to the Linux kernel driver "amdgpu" on November 11, 2015.
As a 2014 lecture at the CCC shows, AMD's x86-64 SMU firmware is executed on a LatticeMico32 core, and PowerTune was modeled using Matlab. This is similar to Nvidia's PDAEMON, the RTOS responsible for power management on their GPUs.
Overview
AMD PowerTune was introduced in the TeraScale 3 (VLIW4) with Radeon HD 6900 on 15 December 2010 and has been available in different development stages on Radeon- and AMD FirePro-branded products ever since.
Over the years, reviews which document the development of AMD PowerTune have been published by AnandTech.
An additional technology named AMD ZeroCore Power has been available since the Radeon HD 7000 Series, implementing the Graphics Core Next microarchitecture.
The pointlessness of a fixed clock frequency was noted in January 2014 by SemiAccurate.
Operating system support
AMD Catalyst is available for Microsoft Windows and Linux and supports AMD PowerTune since version.
The free and open-source "Radeon" graphics device driver has some support for AMD PowerTune, see "Enduro".
Feature overview for AMD APUs
Feature overview for AMD graphics cards
See also
AMD Cool'n'Quiet (for desktop CPUs)
AMD PowerNow! (for laptop CPUs)
AMD Turbo Core (for CPUs)
AMD PowerX |
https://en.wikipedia.org/wiki/TeraScale%20%28microarchitecture%29 | TeraScale is the codename for a family of graphics processing unit microarchitectures developed by ATI Technologies/AMD and their second microarchitecture implementing the unified shader model following Xenos. TeraScale replaced the old fixed-pipeline microarchitectures and competed directly with Nvidia's first unified shader microarchitecture named Tesla.
TeraScale was used in HD 2000 manufactured in 80 nm and 65 nm, HD 3000 manufactured in 65 nm and 55 nm, HD 4000 manufactured in 55 nm and 40 nm, HD 5000 and HD 6000 manufactured in 40 nm. TeraScale was also used in the AMD Accelerated Processing Units code-named "Brazos", "Llano", "Trinity" and "Richland". TeraScale is even found in some of the succeeding graphics card brands.
TeraScale is a VLIW SIMD architecture, while Tesla is a RISC SIMD architecture, similar to TeraScale's successor Graphics Core Next.
TeraScale implements HyperZ.
An LLVM code generator (i.e. a compiler back-end) is available for TeraScale, though it appears to be missing from LLVM's target matrix. Mesa 3D, for example, makes use of it.
TeraScale 1 (VLIW)
At SIGGRAPH 08 in December 2008 AMD employee Mike Houston described some of the TeraScale microarchitecture.
At FOSDEM 2009, Matthias Hopf from AMD's technology partner SUSE Linux presented a slide on programming the open-source driver for the R600.
Unified shaders
Previous GPU architectures implemented fixed-pipelines, i.e. there were distinct shader processors for each type of shader. TeraScale leverages many flexible shader processors which can be scheduled to process a variety of shader types, thereby significantly increasing GPU throughput (dependent on application instruction mix as noted below). The R600 core processes vertex, geometry, and pixel shaders as outlined by the Direct3D 10.0 specification for Shader Model 4.0 in addition to full OpenGL 3.0 support.
The new unified shader functionality is based upon a very long instruction word (VLIW) architecture in which the core executes opera |
https://en.wikipedia.org/wiki/AMD%20APP%20SDK | AMD APP SDK is a software development kit by AMD for "Accelerated Parallel Processing" (APP). AMD APP SDK also targets Heterogeneous System Architecture (not only GPU).
AMD APP SDK was available for 32-bit and 64-bit versions of Microsoft Windows and Linux but was removed from AMD's official website. A developer stated in a forum post that the SDK was discontinued as the required libraries are now included with the drivers.
AMD intends developers to employ AMD APP SDK to utilize Video Coding Engine hybrid mode to create hybrid encoders that pair custom motion estimation, inverse discrete cosine transform and motion compensation with the hardware entropy encoding to achieve faster than real-time encoding.
The AMD APP SDK v3.0 supports OpenCL 2.0 and the Catalyst Omega 15.7 driver; it also includes samples for OpenCL as well as accelerated libraries such as Bolt (an open-source C++ template library) and the OpenCL-accelerated OpenCV (Open Computer Vision) library. This last version can be downloaded with the aid of StackOverflow.
History
AMD APP SDK replaced AMD Stream SDK (formerly named ATI Stream SDK), which had in turn replaced the AMD CAL (Compute Abstraction Layer) SDK; these were available for Microsoft Windows and Linux, in 32-bit and 64-bit versions. As of February 2021, there are few signs on AMD's official website that AMD APP SDK ever existed.
References
External links
Unofficial Windows mirror
Unofficial Linux mirror
ROCm - Radeon Open Compute, designed to be a successor to the AMD APP SDK
AMD software
GPGPU libraries
Heterogeneous System Architecture |
https://en.wikipedia.org/wiki/Cyclic%20algebra | In algebra, a cyclic division algebra is one of the basic examples of a division algebra over a field and plays a key role in the theory of central simple algebras.
Definition
Let A be a finite-dimensional central simple algebra over a field F. Then A is said to be cyclic if it contains a strictly maximal subfield E such that E/F is a cyclic field extension (i.e., the Galois group is a cyclic group).
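The construction behind this definition can be written out explicitly. Assuming E/F is cyclic of degree n with Galois group generated by σ, and b is a nonzero element of F (the symbols σ, b, u here are our notation for the standard presentation), the cyclic algebra is built as

```latex
(E/F,\sigma,b) \;=\; \bigoplus_{i=0}^{n-1} E\,u^{i},
\qquad u^{n} = b,
\qquad u\,x = \sigma(x)\,u \quad (x \in E),
```

with multiplication extended distributively; the result is a central simple algebra over F of degree n containing E as a strictly maximal subfield. The classic example is Hamilton's quaternions: take F = R, E = C, σ = complex conjugation, and b = −1.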
See also
Factor system#Cyclic algebras - cyclic algebras described by factor systems.
Brauer group#Cyclic algebras - cyclic algebras are representative of Brauer classes.
References
Algebra |
https://en.wikipedia.org/wiki/Multiprogram%20Research%20Facility | The Multiprogram Research Facility (MRF, also known as Building 5300) is a facility at the Oak Ridge National Laboratory in Oak Ridge, Tennessee. It is used by the U.S. National Security Agency (NSA) to design and build supercomputers for cryptanalysis and other classified projects. It houses the classified component program of the High Productivity Computing Systems (HPCS) project sponsored by the Defense Advanced Research Projects Agency (DARPA).
History
The High Productivity Computing Systems program was launched in 2004 as a multiagency project led by DARPA with the goal of increasing computing speed a thousandfold, creating a supercomputer capable of one petaflop (a quadrillion [10^15] floating-point operations per second). The project is sited at the Oak Ridge National Laboratory in Tennessee and is split into two tracks, one top secret and one unclassified, housed in separate facilities. The secret facility, used by the NSA, is located within Building 5300 at the laboratory and is known as the Multiprogram Research Facility.
The MRF was constructed in 2006 at a cost of $41 million. Located on the laboratory's East Campus, the building covers and rises five stories high. As of 2012, it is staffed by 318 computer scientists and engineers.
While the unclassified portion of the HPCS project succeeded in designing the 1.3 petaflop Cray XT5 supercomputer in 2007, the MRF succeeded in developing an even faster machine, designed specifically for cryptanalysis and targeted against one or more specific algorithms, such as the Advanced Encryption Standard (AES). A former NSA official called the MRF's breakthrough "enormous", giving the agency the ability to break current public encryption standards. The data upon which the supercomputer operates is stored at the agency's Utah Data Center in Bluffdale, Utah.
The MRF's next goal is to achieve a machine capable of one exaflop (10^18 floating-point operations per second) and then one zettaflop (10^21). To achieve an exaflop
https://en.wikipedia.org/wiki/Methode%20Electronics | Methode Electronics (NYSE: MEI ) is an American multinational company headquartered in Chicago, Illinois, with engineering, manufacturing and sales operations in more than 35 locations in 14 countries. The company employs around 4,566 people worldwide.
References
Electronics manufacturing
Companies listed on the New York Stock Exchange
1946 establishments in the United States
Electrical connectors |
https://en.wikipedia.org/wiki/Scality | Scality is a global technology provider of software-defined storage (SDS) solutions, specializing in distributed file and object storage with cloud data management. Scality maintains offices in Paris (France), London (UK), San Francisco and Washington DC (USA), and Tokyo (Japan) and has employees in 14 countries.
History
Scality was founded in 2009 by Jérôme Lecat, Giorgio Regni, Daniel Binsfeld, Serge Dugas, and Brad King.
Scality raised $7 million of venture capital funding in March 2011. A C-round of $22 million was announced in June 2013, led by Menlo Ventures and Iris Capital with participation from FSN PME and all existing investors, including Idinvest Partners, OMNES Capital and Galileo Partners. Scality raised $45 million in August 2015. This Series D funding was led by Menlo Ventures with participation from all existing investors and one new strategic investor, BroadBand Tower. In 2016, HPE made a strategic investment in the company. In April, 2018, the company announced a $60 million round of funding.
Scality announced a refreshed brand, along with a distribution agreement with Hewlett-Packard in October 2014. Scality added Dell and Cisco Systems as resellers in 2015.
Products
RING
Scality released the first version of its principal product, RING, in 2010. The object storage software platform is a multitiered architecture and can scale up to thousands of servers and up to 100 petabytes under a single namespace. The RING product depends on a keyspace, calculated using a Monte Carlo simulation at install time and spread across all of its node servers. While the company aims for the RING to function without the need of any external management process, a Supervisor server is functionally required to kick off data integrity operations and keep track of node state, while also providing a single source of truth for data about the ring itself. The Supervisor process is relatively lightweight and can be installed on a node server if required, but the company recommend
https://en.wikipedia.org/wiki/Cataclysm%3A%20Dark%20Days%20Ahead | Cataclysm: Dark Days Ahead (CDDA) is an open-source survival horror roguelike video game. Cataclysm: Dark Days Ahead is a fork of the original game Cataclysm. The game is freely downloadable on the game's website and the source code is also freely available on the project's GitHub repository under the CC BY-SA Creative Commons license. The game is currently largely developed by its community. Rock, Paper, Shotgun named CDDA one of "The 50 Best Free Games on PC" in 2016.
The game is text-based, though alternative graphical tilesets are available. Prior to playing, the player generates a world which can be shared between different characters. The player then creates a character and is placed into one of many possible starting scenarios. The player is then free to do whatever they please. This includes both things required for day-to-day survival such as scavenging, hunting, and finding shelter, as well as larger goals such as farming, installing bionics, making and repairing vehicles, and constructing homes.
Plot
The game is set in near-future New England after a catastrophic event killed most of the human population and spawned various monsters and hazards. After a UFO crash in the '90s, the US government began to research alternate dimensions. In their explorations they obtained a sample called "XE-037", a mysterious black goo able to reanimate the dead and cause targeted phenotype mutations. XE-037 (also known as The Blob) escaped the lab environment and reproduced in the world's groundwater supply, causing an enormous zombie outbreak and manipulating the population through people's moods, making many people violent. In the following months, the world began to further spiral out of control, with mass demonstrations, rioting, and internal deployment of troops. Amidst the chaos, Earth became the target of a multidimensional portal attack, leading to vast portal storms, with portals opening all over the world, affecting the weather and the r
https://en.wikipedia.org/wiki/Workerbot | The workerbot is a trademark, which was developed by the pi4 robotics GmbH to describe an industrial robot, which was modeled with its possibilities of movement and its sensory abilities of a human.
The industrial robot has two arms with seven degrees of freedom. Force sensors are integrated into the arms, enabling the robot to measure the forces occurring while it works and, much like a human, to adapt its gripping and machining processes to those forces accordingly. The robot is also equipped with cameras so that it can detect its environment and react to it. This industrial robot was developed within the EU-funded project PISA (Flexible Assembly Systems through Workplace-Sharing and Time-Sharing Human-Machine Cooperation). The project consortium consists of the lead company pi4_robotics GmbH, the Fraunhofer IPK, the Universidad Politécnica de Madrid and the company EICAS Automazione S.p.A.
The aim is to enable manufacturing companies within the European Union to produce cost-effectively using highly flexible industrial robots and to prevent the migration of production to low-wage countries. The workerbot is the first operative humanoid factory worker worldwide that can be acquired by purchase. In connection with the workerbot there is also the first webshop for humanoid robots.
References
External links
pi4_robotics: The official homepage of workerbot
pi4_robotics: Video of workerbot
Deutsches Patent- und Markenamt: register information on the trademark "workerbot"
EU FP6 framework programme: PISA project
EU FP6 framework programme: PISA project, Sub Project 2 - Time Sharing Intelligent Assist Systems
European Commission: CORDIS information service, national R&D
Awards / Prizes: The workerbot of pi4_robotics won the MM Award 2010 at the Automatica trade fair
Awards / Prizes: The workerbot of pi4_robotics was a finalist in the euRobotics Technology Transfer Award 2011
VDMA Nachrichten magazine 02-2011: "Robotik: Neue Ära der flexiblen Automation" (Robotics: a new era of flexible automation)
Zeitschrift Blechnet |
https://en.wikipedia.org/wiki/Mya%20%28program%29 | Mya was an intelligent personal assistant under development by Motorola. Proposed features for the program included the ability to read emails and answer questions 24 hours a day. Mya was intended to work with an internet service Motorola was developing called Myosphere, and was planned to be a paid service that would eventually be used by other mobile carriers. A female computer-generated character was created to represent Mya in advertising. While the quality of the character's animation was praised, it received criticism for being over sexualised.
Both the character and the program were announced to the public via an advertisement in March 2000, though the program was not ready for use at that time. Despite the announcement generating a considerable amount of attention, little was heard regarding the project in subsequent months. The program was never officially released nor cancelled, though the trademarks for both Myosphere and Mya were abandoned by Motorola in 2002. The name Mya was believed to be a play on the words 'My assistant'.
Proposed features and development
The Internet service that Mya was developed for was called Myosphere. Motorola began development of Myosphere in 1998, and it had been described as a speech-enabled service "which enables consumers to manage and control wireless and wireline communications from a single point of access using natural voice commands." Several other companies had already announced plans for similar software at the time; Alan Reiter from Wireless Internet and Mobile Computing was puzzled at Motorola's announcement of Myosphere, saying "They're kind of late to the [voice activation] party. But the party is likely to be very big. ... Motorola's entry will help further legitimize the value of voice response systems. But it's a tough market, and it will take time." The term myosphere was "a play on the theme of connecting the elements of an individual's world, or sphere."
Intended to provide a human-like interface to th |
https://en.wikipedia.org/wiki/Suslin%20algebra | In mathematics, a Suslin algebra is a Boolean algebra that is complete, atomless, countably distributive, and satisfies the countable chain condition. They are named after Mikhail Yakovlevich Suslin.
The existence of Suslin algebras is independent of the axioms of ZFC, and is equivalent to the existence of Suslin trees or Suslin lines.
See also
Andrei Suslin
References
Boolean algebra
Forcing (mathematics)
Independence results |
https://en.wikipedia.org/wiki/African%20Geodetic%20Reference%20Frame | The African Geodetic Reference Frame (AFREF) is a project designed to unify the many geodetic reference frames of Africa using data from a network of permanent Global Navigation Satellite Systems (GNSS) stations as the primary data source for the implementation of a uniform reference frame.
See also
European Terrestrial Reference System 1989
References
Geodetic datums |
https://en.wikipedia.org/wiki/Strigocossus%20crassa | Strigocossus crassa is a moth in the family Cossidae. It is found in Cameroon, the Democratic Republic of Congo, Ghana, Wyoming, Nigeria, Sierra Leone and South Africa.
Description
Upper side: antennae filiform, whiteish at the base, black at the tips. Head whiteish, small. Thorax whiteish, having two black tufts of hair on the shoulders, and two next the abdomen; upper part yellowish brown. Abdomen dark brown, almost black, being ringed and edged with dark grey. Anterior wings whiteish, intermixed with many patches and irregular spots of grey and dark brown. Posterior wings darkish grey brown, but lighter along the posterior edges.
Under side: all the parts on this side are of a dark yellow brown, of the same colour with the upper side of the inferior wings. Abdomen rather lighter, with a dark brown line running along its middle from the thorax to the anus. Wingspan 7 inches (178 mm).
References
Zeuzerinae
Moths described in 1782
Descriptions from Illustrations of Exotic Entomology |
https://en.wikipedia.org/wiki/Welfare%20biology | Welfare biology is a proposed cross-disciplinary field of research to study the positive and negative well-being of sentient individuals in relation to their environment. Yew-Kwang Ng first advanced the field in 1995. Since then, its establishment has been advocated for by a number of writers, including philosophers, who have argued for the importance of creating the research field, particularly in relation to wild animal suffering. Some researchers have put forward examples of existing research that welfare biology could draw upon and suggested specific applications for the research's findings.
History
Welfare biology was first proposed by the welfare economist Yew-Kwang Ng, in his 1995 paper "Towards welfare biology: Evolutionary economics of animal consciousness and suffering". In the paper, Ng defines welfare biology as the "study of living things and their environment with respect to their welfare (defined as net happiness, or enjoyment minus suffering)." He also distinguishes between "affective" and "non-affective" sentients, affective sentients being individuals with the capacity for perceiving the external world and experiencing pleasure or pain, while non-affective sentients have the capacity for perception, with no corresponding experience; Ng argues that because the latter experience no pleasure or suffering, "[t]heir welfare is necessarily zero, just like nonsentients". He concludes, based on his modelling of evolutionary dynamics, that suffering dominates enjoyment in nature.
Matthew Clarke and Ng, in 2006, used Ng's welfare biology framework to analyse the costs, benefits and welfare implications of the culling of kangaroos—classified as affective sentients—in Puckapunyal, Australia. They concluded that while their discussion "may give some support to the culling of kangaroos or other animals in certain circumstances, a more preventive measure may be superior to the resort to culling". In the same year, Thomas Eichner and Rüdiger Pethi analyzed Ng's |
https://en.wikipedia.org/wiki/Mean%20of%20a%20function | In calculus, and especially multivariable calculus, the mean of a function is loosely defined as the average value of the function over its domain. In one variable, the mean of a function f(x) over the interval (a,b) is defined by:
Recall that a defining property of the average value \bar{y} of finitely many numbers y_1, y_2, \dots, y_n is that n\bar{y} = y_1 + y_2 + \cdots + y_n. In other words, \bar{y} is the constant value which when added to itself n times equals the result of adding the n terms y_1, \dots, y_n. By analogy, a defining property of the average value \bar{f} of a function over the interval [a,b] is that

\int_a^b \bar{f}\,dx = \int_a^b f(x)\,dx.

In other words, \bar{f} is the constant value which when integrated over [a,b] equals the result of integrating f over [a,b]. But the integral of a constant is just

\int_a^b \bar{f}\,dx = \bar{f}(b-a).

See also the first mean value theorem for integration, which guarantees that if f is continuous then there exists a point c \in (a,b) such that

\int_a^b f(x)\,dx = f(c)(b-a).

The point f(c) is called the mean value of f(x) on [a,b]. So we write \bar{f} = f(c) and rearrange the preceding equation to get the above definition.
In several variables, the mean over a relatively compact domain U in a Euclidean space is defined by

\bar{f} = \frac{1}{\operatorname{vol}(U)}\int_U f.
This generalizes the arithmetic mean. On the other hand, it is also possible to generalize the geometric mean to functions by defining the geometric mean of f to be

\exp\left(\frac{1}{b-a}\int_a^b \ln f(x)\,dx\right).
More generally, in measure theory and probability theory, either sort of mean plays an important role. In this context, Jensen's inequality places sharp estimates on the relationship between these two different notions of the mean of a function.
There is also a harmonic average of functions and a quadratic average (or root mean square) of functions.
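The two notions of mean can be checked numerically; the sketch below is only an illustration (the midpoint rule and the test function f(x) = e^x are chosen arbitrarily), approximating the arithmetic mean e − 1 and the geometric mean e^{1/2} on [0, 1].

```python
import math

def midpoint_integral(f, a, b, n=10_000):
    # midpoint-rule approximation of the integral of f over [a, b]
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

def arithmetic_mean(f, a, b):
    # the constant value which, integrated over [a, b], matches the integral of f
    return midpoint_integral(f, a, b) / (b - a)

def geometric_mean(f, a, b):
    # exp of the arithmetic mean of ln f (f must be positive on [a, b])
    return math.exp(arithmetic_mean(lambda x: math.log(f(x)), a, b))

print(arithmetic_mean(math.exp, 0, 1))  # close to e - 1
print(geometric_mean(math.exp, 0, 1))   # close to e^0.5
```

With n = 10,000 midpoint samples both approximations agree with the closed forms to several decimal places.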
See also
Mean
Means
Calculus
References |
https://en.wikipedia.org/wiki/Hilbert%E2%80%93Kunz%20function | In algebra, the Hilbert–Kunz function of a local ring (R, m) of prime characteristic p is the function
where q is a power of p and m[q] is the ideal generated by the q-th powers of elements of the maximal ideal m.
The notion was introduced by Ernst Kunz, who used it to characterize a regular ring as a Noetherian ring in which the Frobenius morphism is flat. If d is the dimension of the local ring, Monsky showed that f(q)/q^d is c + O(1/q) for some real constant c. This constant, the "Hilbert–Kunz multiplicity", is greater than or equal to 1. Watanabe and Yoshida strengthened some of Kunz's results, showing that in the unmixed case, the ring is regular precisely when c = 1.
Hilbert–Kunz functions and multiplicities have been studied for their own sake. Brenner and Trivedi have treated local rings coming from the homogeneous co-ordinate rings of smooth projective curves, using techniques from algebraic geometry. Han, Monsky and Teixeira have treated diagonal hypersurfaces and various related hypersurfaces. But there is no known technique for determining the Hilbert–Kunz function or c in general. In particular the question of whether c is always rational wasn't settled until recently (by Brenner—it needn't be, and indeed can be transcendental). Hochster and Huneke related Hilbert-Kunz multiplicities to "tight closure" and Brenner and Monsky used Hilbert–Kunz functions to show that localization need not preserve tight closure. The question of how c behaves as the characteristic goes to infinity (say for a hypersurface defined by a polynomial with integer coefficients) has also received attention; once again open questions abound.
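As a small illustration (a sketch not drawn from the sources above): for the regular local ring R = k[[x, y]] with maximal ideal m = (x, y), the ideal m^[q] is (x^q, y^q), so f(q) can be computed by counting the monomials x^a y^b with all exponents below q. This gives f(q) = q^2 and Hilbert–Kunz multiplicity c = 1, consistent with the characterization of regular rings.

```python
from itertools import product

def hilbert_kunz_regular(q, d=2):
    # length of k[[x_1, ..., x_d]] / (x_1^q, ..., x_d^q): count the monomials
    # whose exponents are all strictly below q (q^d for a regular local ring
    # of dimension d)
    return sum(1 for _ in product(range(q), repeat=d))

p = 3
for e in (1, 2, 3):
    q = p ** e
    print(q, hilbert_kunz_regular(q))  # f(q) grows as q^2, so f(q)/q^2 = 1
```

For non-regular rings (for example diagonal hypersurfaces) no such direct count is available, which is why the function is computed case by case in the literature.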
A comprehensive overview is to be found in Craig Huneke's article "Hilbert-Kunz multiplicities and the F-signature" arXiv:1409.0467. This article is also found on pages 485-525 of the Springer volume "Commutative Algebra: Expository Papers Dedicated to David Eisenbud on the Occasion of His 65th Birthday", edited by Irena Peeva.
References
|
https://en.wikipedia.org/wiki/GestureWorks%20Gameplay | GestureWorks Gameplay was a utility created by Ideum using its GestureWorks technology to enable a variety of touch and gesture controls for games on Windows 8 devices. The software was discontinued as of June 7, 2016.
The Gameplay utility was used to create and use touchscreen controllers for PC games running on Windows 8 without the need for an external controller or mouse and keyboard.
Users could select from a wide variety of onscreen buttons and gesture controls that may be mapped to keyboard and mouse commands used within the game. Accelerometer support was also available for many devices. In addition to providing a control overlay on mobile devices, Gameplay was compatible with Android devices so that gamers could use their Android device with Bluetooth as a controller for their PC games. Gameplay controls were highly customizable and some aspects of the controls such as button size and location could even be adjusted during gameplay while the controller was in use.
In addition to allowing users to create their own controllers, there was a library of pre-built controllers authored by Gestureworks Gameplay and members of the community, containing more than 250 virtual controllers. Gameplay was available through Steam.
GestureWorks Gameplay received favorable reviews from Penny Arcade's Gabe (Mike Krahulik) and Tom's Hardware. Game developer The Behemoth viewed GestureWorks Gameplay as a good new way to make their popular Castle Crashers game tablet-friendly, and The Behemoth project manager Emil Ayoubkhan said GestureWorks Gameplay was "a perfect fit" for bringing their game to tablet PCs. The technology was also featured in a keynote speech at the Intel Developer Forum in 2013.
References
External links
"GestureWorks"
"GestureWorks Gameplay Wiki"
Display technology companies
Companies based in Albuquerque, New Mexico |
https://en.wikipedia.org/wiki/Choquet%20game | The Choquet game is a topological game named after Gustave Choquet, who was in 1969 the first to investigate such games. A closely related game is known as the strong Choquet game.
Let X be a non-empty topological space. The Choquet game of X, G(X), is defined as follows: Player I chooses U_0, a non-empty open subset of X, then Player II chooses U_1, a non-empty open subset of U_0, then Player I chooses U_2, a non-empty open subset of U_1, etc. The players continue this process, constructing a sequence U_0 ⊇ U_1 ⊇ U_2 ⊇ ⋯. If ⋂_n U_n = ∅ then Player I wins, otherwise Player II wins.
It was proved by John C. Oxtoby that a non-empty topological space is a Baire space if and only if Player I has no winning strategy. A nonempty topological space in which Player II has a winning strategy is called a Choquet space. (Note that it is possible that neither player has a winning strategy.) Thus every Choquet space is Baire. On the other hand, there are Baire spaces (even separable metrizable ones) which are not Choquet spaces, so the converse fails.
The strong Choquet game of X, G^s(X), is defined similarly, except that Player I chooses a pair (x_0, U_0) with x_0 ∈ U_0, then Player II chooses U_1, then Player I chooses (x_1, U_2), etc., such that x_n ∈ U_{n+1} ⊆ U_n for all n. A topological space in which Player II has a winning strategy for G^s(X) is called a strong Choquet space. Every strong Choquet space is a Choquet space, although the converse does not hold.
All nonempty complete metric spaces and compact T2 spaces are strong Choquet. (In the first case, Player II, given (x_n, U_n), chooses an open set U_{n+1} such that x_n ∈ U_{n+1} ⊆ U_n and diam(U_{n+1}) < 2^{−n}. Then the sequence (x_n) is Cauchy and converges to a point in ⋂_n U_n.) Any subset of a strong Choquet space which is a G_δ set is strong Choquet. Metrizable spaces are completely metrizable if and only if they are strong Choquet.
References
Descriptive set theory
Topological games |
https://en.wikipedia.org/wiki/Binary%20game | In mathematics, the binary game is a topological game introduced by Stanislaw Ulam in 1935 in an addendum to problem 43 of the Scottish book as a variation of the Banach–Mazur game.
In the binary game, one is given a fixed subset X of the set {0,1}^N of all sequences of 0s and 1s. The players take it in turns to choose a digit 0 or 1, and the first player wins if the sequence they form lies in the set X. Another way to represent this game is to pick a subset S of the interval [0,1] on the real line; the players then alternately choose binary digits x_1, x_2, …, and Player I wins the game if and only if the binary number 0.x_1x_2… lies in S, that is, if Σ_{i≥1} x_i 2^{−i} ∈ S.
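A brief sketch of the real-line representation (illustrative only; the digits shown are arbitrary): the number determined by a play is the sum of the chosen digits weighted by powers of 1/2.

```python
def play_value(bits):
    # the real number 0.b1 b2 b3 ... determined by the chosen digits
    return sum(b / 2**i for i, b in enumerate(bits, start=1))

# Player I chooses the odd-indexed digits, Player II the even-indexed ones;
# the winner is decided by whether the limit value of the full infinite
# play lies in the chosen set S. A finite prefix only pins the value down
# to an interval of width 2**(-len(prefix)).
prefix = [1, 0, 1]          # digits chosen by I, II, I
print(play_value(prefix))   # 0.625
```

Since a finite prefix of digits confines the value to a dyadic interval, the game is a disguised form of the Banach–Mazur game played with such intervals.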
The binary game is sometimes called Ulam's game, but "Ulam's game" usually refers to the Rényi–Ulam game.
References
Topological games |
https://en.wikipedia.org/wiki/Utimaco%20Atalla | Utimaco Atalla, founded as Atalla Technovation and formerly known as Atalla Corporation or HP Atalla, is a security vendor, active in the market segments of data security and cryptography. Atalla provides government-grade end-to-end products in network security, and hardware security modules (HSMs) used in automated teller machines (ATMs) and Internet security. The company was founded by Egyptian engineer Mohamed M. Atalla in 1972. Atalla HSMs are the payment card industry's de facto standard, protecting 250million card transactions daily (more than billion transactions annually) as of 2013, and securing the majority of the world's ATM transactions as of 2014.
Company history
1970s
The company was originally founded in 1972, initially as Atalla Technovation, before it was later called Atalla Corporation. The company was founded by Dr. Mohamed M. Atalla, the inventor of the MOSFET (metal–oxide–semiconductor field-effect transistor). In 1972, Atalla filed a patent for a remote PIN verification system, which utilized encryption techniques to assure telephone link security while entering personal ID information, which would be transmitted as encrypted data over telecommunications networks to a remote location for verification.
He invented the first hardware security module (HSM), dubbed the "Atalla Box", a security system which encrypted PIN and ATM messages, and protected offline devices with an un-guessable PIN-generating key. He commercially released the "Atalla Box" in 1973. The product was released as the Identikey. It was a card reader and customer identification system, providing a terminal with plastic card and PIN capabilities. The system was designed to let banks and thrift institutions switch to a plastic card environment from a passbook program. The Identikey system consisted of a card reader console, two customer PIN pads, intelligent controller and built-in electronic interface package. The device consisted of two keypads, one for the customer and one for the |
https://en.wikipedia.org/wiki/Micro-incineration | Micro-incineration or microincineration is a technique to determine the manner and distribution of mineral elements in biological cells, biological tissues and organs. Slide preparation of tissues can be used. Examples include calcium (Ca), potassium (K), sodium (Na), magnesium (Mg), iron (Fe), and silicon (Si).
The organic matter is vaporised by heating. The nature and position of the mineral ash is determined microscopically. Aqueous or cryo-EM fixed tissue materials can also be examined under transmission and scanning electron microscopy (TEM & SEM).
The ashing procedure produces cellular oxidised residues rich in Na2O, CaO, MgO, Fe2O3, SiO2, Ca3(PO4)2, Mg3(PO4)2, etc., which are detected by X-ray microanalysis with a 2-4 times gain in sensitivity after incineration of the sample, due to increased mineral concentration and reduced nonspecific background radiation.
External links
http://jcb.rupress.org/content/39/1/55.full.pdf
Biochemistry
Biophysics |
https://en.wikipedia.org/wiki/G%C3%B6del%20logic | In mathematical logic, a first-order Gödel logic is a member of a family of finite- or infinite-valued logics in which the sets of truth values V are closed subsets of the unit interval [0,1] containing both 0 and 1. Different such sets V in general determine different Gödel logics. The concept is named after Kurt Gödel.
References
Set theory
Mathematical logic
Formal methods |
https://en.wikipedia.org/wiki/Sample%20complexity | The sample complexity of a machine learning algorithm represents the number of training-samples that it needs in order to successfully learn a target function.
More precisely, the sample complexity is the number of training-samples that we need to supply to the algorithm, so that the function returned by the algorithm is within an arbitrarily small error of the best possible function, with probability arbitrarily close to 1.
There are two variants of sample complexity:
The weak variant fixes a particular input-output distribution;
The strong variant takes the worst-case sample complexity over all input-output distributions.
The no-free-lunch theorem, discussed below, proves that, in general, the strong sample complexity is infinite, i.e. that there is no algorithm that can learn the globally-optimal target function using a finite number of training samples.
However, if we are only interested in a particular class of target functions (e.g., only linear functions) then the sample complexity is finite, and it depends linearly on the VC dimension of the class of target functions.
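The dependence on sample size can be seen in a toy experiment (a hedged sketch; the setup, function names, and the uniform threshold target are invented for illustration): empirical risk minimization over one-dimensional threshold classifiers, a class of VC dimension 1, returns a hypothesis whose true error shrinks as the number of training samples grows.

```python
import random

def sample(n, rng):
    # n labeled points: x ~ U[0,1], label 1 iff x > 0.5 (the target threshold)
    return [(x, 1 if x > 0.5 else 0) for x in (rng.random() for _ in range(n))]

def erm_threshold(data):
    # empirical risk minimization over the class {x -> 1 if x > t else 0}:
    # pick the candidate threshold with the fewest training mistakes
    candidates = [0.0] + [x for x, _ in data]
    return min(candidates,
               key=lambda t: sum((1 if x > t else 0) != y for x, y in data))

def true_error(t):
    # for U[0,1] inputs, the risk of threshold t against the target 0.5 is |t - 0.5|
    return abs(t - 0.5)

rng = random.Random(0)
for n in (10, 100, 1000):
    print(n, round(true_error(erm_threshold(sample(n, rng))), 4))
```

The printed error typically decays roughly like 1/n here, matching the intuition that a low-VC-dimension class has small sample complexity.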
Definition
Let X be a space which we call the input space, and Y be a space which we call the output space, and let Z denote the product X × Y. For example, in the setting of binary classification, X is typically a finite-dimensional vector space and Y is the set {−1, 1}.
Fix a hypothesis space H of functions h: X → Y. A learning algorithm over H is a computable map from Z^*, the set of finite sequences of elements of Z, to H. In other words, it is an algorithm that takes as input a finite sequence of training samples and outputs a function from X to Y. Typical learning algorithms include empirical risk minimization, without or with Tikhonov regularization.
Fix a loss function L: Y × Y → R≥0, for example, the square loss L(y, y′) = (y − y′)^2, where y′ = h(x). For a given distribution ρ on X × Y, the expected risk of a hypothesis h (a function) is

E(h) = E_ρ[L(h(x), y)] = ∫_{X×Y} L(h(x), y) dρ(x, y).
In our setting, we have h = A(S_n), where A is a learning algorithm and S_n = ((x_1, y_1), …, (x_n, y_n)) is a sequence of vectors which are all drawn independently from ρ. Define the optimal risk E*_H = inf_{h ∈ H} E(h). Set h_n = A(S_n), for each n. N
https://en.wikipedia.org/wiki/Bernd%20Bruegge | Bernd Bruegge () (born 1951) is a German computer scientist, full professor at the Technische Universität München (TUM) and the head of the Chair for Applied Software Engineering. He is also an adjunct associate professor at Carnegie Mellon University (CMU) in Pittsburgh.
Biography
Born in 1951, Bruegge received a bachelor's degree in computer science at the University of Hamburg in 1978, a master's degree in computer science from Carnegie Mellon University in 1982 and a PhD degree in computer science from Carnegie Mellon University in 1985.
Bruegge has been a professor at the Technische Universität München in Munich since 1997. From 2000 to 2003, he was a member of the Deutsche Telekom research committee. He has been a member of the research committee of Munich district (), a nonprofit association, since 2003 and member of the CIO Colloquium scientific advisory board since 2009. Bruegge is also the liaison professor for the German National Academic Foundation ().
Work
His principal research areas are modeling and semantics, computational intelligence and machine learning, knowledge management and representation, process support and human factors, and process models and methodologies.
Publications
Bernd Bruegge is the author of the following books:
Bernd Bruegge, Allen Dutoit: Object-Oriented Software Engineering: Using UML, Patterns and Java (Third Edition). Prentice Hall, 2009.
Eva-Maria Kern, Heinz-Gerd Hegering, Bernd Brügge: Managing Development and Application of Digital Technologies. Springer, 2006.
He is also the author of many academic papers, for example:
Stephan Krusche, Dora Dzvonyar, Han Xu and Bernd Bruegge. Software Theater — Teaching Demo Oriented Prototyping. Transactions on Computing Education. ACM Journal. 2018
Stephan Krusche, Bernd Bruegge, Irina Camilleri, Kirill Krinkin, Andreas Seitz and Cecil Wöbker. Chaordic Learning: A Case Study. 39th International Conference on Software Engineering (ICSE'17), Software Engineering Educatio |
https://en.wikipedia.org/wiki/HTML5test | HTML5test is a web application for evaluating a web browser's accuracy in implementing the web standards HTML5 and Web SQL Database (developed by the World Wide Web Consortium), as well as the WebGL standard (developed by the Mozilla Foundation and the Khronos Group).
The test suite was developed by Dutch web programmer Niels Leenheer, and published in March 2010. To test a web browser, the user must visit the home page of the website which is located at html5test.com. The application returns an integer score out of a possible 555 points. The point total has changed multiple times through the evolution of the software; Leenheer introduced the present scoring system as part of a major redesign of the test introduced in November 2013.
HTML5test evaluates the browser's support for Web storage, the W3C Geolocation API, HTML5-specific HTML elements (including the canvas element), and other features. It does not evaluate a browser's conformance to other web standards, such as Cascading Style Sheets, ECMAScript, Scalable Vector Graphics, or the Document Object Model. Conformance testing for those standards is within the purview of Acid3, an automated test published by Ian Hickson in 2008. Similarly, Acid3 does not evaluate a browser's HTML5 conformance. The test scope of HTML5test and the test scope of Acid3 are mutually exclusive.
As of July 2020, the maximum score is 555 and Google Chrome scores 535, Microsoft Edge 84 scores 532, Falkon 3.1.0 scores 528, Opera 45 scores 518, Mozilla Firefox 112 scores 515, GNOME Web 3.36 scores 432 and Internet Explorer 11 scores 312.
See also
Acid1
Acid2
Acid3
XHTML
References
External links
Computing websites
Dutch websites
HTML5
Internet properties established in 2010
Web 2.0
Web software
Software testing tools |
https://en.wikipedia.org/wiki/Iron%20founder | An iron founder (also iron-founder or ironfounder) in its more general sense is a worker in molten ferrous metal, generally working within an iron foundry. However, the term 'iron founder' is usually reserved for the owner or manager of an iron foundry, a person also known in Victorian England as a 'master'. Workers in a foundry are generically described as 'foundrymen'; however, the various craftsmen working in foundries, such as moulders and pattern makers, are often referred to by their specific trades.
Historically the appellation "founder" was given to the supervisor of a blast furnace, and persons who made castings in iron or other heavy metal. The term is also often applied to the company or works in which an iron foundry operates.
See also
Foundry
Casting (metalworking)
Bellfounding
Coremaking
Foundry sand testing
Smelting
References
Casting (manufacturing)
History of metallurgy
Metalworking occupations |
https://en.wikipedia.org/wiki/Axel%20Sveinsson | Axel Sveinsson (3 April 1896 - 12 August 1957) was an Icelandic civil engineer. He received his engineering degree in Copenhagen in 1927.
He is noted for having designed several lighthouses in Iceland; these include the lighthouses at Hólmsbergsviti, Garðskagi, Þrídrangaviti and Knarrarós. A postage stamp depicting the Vattarnes lighthouse at Reyðarfjörður, which was designed by Axel, was issued in 2013.
Notes
1896 births
1957 deaths
Icelandic civil engineers
20th-century engineers |
https://en.wikipedia.org/wiki/Product%20ecosystem%20theory | Product ecosystem theory is an emerging theory that describes how the design of manufactured products evolves over time and draws parallels with how species evolve within a natural ecosystem. Fundamental to this theory is that manufactured product lines respond to external threats and opportunities in much the same way that species respond to threats and opportunities. Competition and other environmental pressures may cause a species to become extinct. An example of the parallel in consumer products is the way in which the typewriter became displaced (or extinct) due to pressures from the personal computer.
Product lines can be seen to change and branch incrementally over time, following the principle of phyletic gradualism. Alternatively, they can show periods of stasis punctuated by disruptive innovation, following the principle of punctuated equilibrium.
Product ecosystem theory provides a conceptual framework that helps designers and others understand the mechanisms underpinning product innovation in a tangible and visual way.
Technology change is one of the environmental variables that provide both opportunity and threat for products in much the same way that environmental variables such as climate provide opportunity and threat for species.
Evaluating designed products from this perspective is useful as it shows that the value of a product is not contained entirely within the product itself. Value is also obtained from the rest of the ecosystem.
History and usage of the term
The term was first used in this context by Tim Williams in 2013 in a paper he wrote with Marianella Chamorro-Koc. This paper proposed a methodology for understanding the difficulties of implementation for disruptive innovation based on a case study of the MIT City Car. The term has been used non-specifically prior to this on a number of occasions.
References
Manufactured goods |
https://en.wikipedia.org/wiki/Road%20curve | Road curves are bends in roads that effect a gradual change of direction. Similar curves occur on railways and canals.
Curves in the horizontal plane are known as horizontal curves and are generally circular or parabolic. Curves in the vertical plane are known as vertical curves.
Five types of horizontal curves on roads and railways:
Simple curve
Compound curve
Transition curve
Reverse curve
Deviation curve
Two types of vertical curves on roads:
Valley curve
Summit curve
Horizontal curves
Simple curve
A simple curve has the same radius throughout and is a single arc of a circle, with two tangents meeting at the intersection (B in this diagram).
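The standard elements of a simple circular curve follow from elementary trigonometry. A minimal sketch (the function name and sample values are illustrative, not from the article): given the radius R and the deflection angle Δ at the intersection point, the tangent length is R·tan(Δ/2), the arc length is R·Δ (Δ in radians), and the long chord is 2R·sin(Δ/2).

```python
import math

def simple_curve_elements(radius, deflection_deg):
    """Return tangent length, arc length and long chord of a simple
    circular curve with the given radius and deflection angle."""
    delta = math.radians(deflection_deg)
    tangent = radius * math.tan(delta / 2)    # tangent point to intersection point
    length = radius * delta                   # arc length along the curve
    chord = 2 * radius * math.sin(delta / 2)  # straight line between tangent points
    return tangent, length, chord

t, l, c = simple_curve_elements(radius=300.0, deflection_deg=60.0)
```

For a 300 m radius and a 60° deflection angle this gives a tangent length of about 173.2 m, an arc length of about 314.2 m and a long chord of 300 m.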
Compound curve
A compound curve has two or more simple curves with different radii that bend the same way and are on the same side of a common tangent. In this diagram, MN is the common tangent.
Reverse curve
Also called a serpentine curve, a reverse curve consists of two simple curves bent in opposite directions, lying on opposite sides of the common tangent.
Deviation curve
A deviation curve is simply a combination of two reverse curves. It is used when it is necessary to deviate from a given straight path to avoid intervening obstructions such as a building, a body of water, or other significant site.
Transition curve
A transition curve introduces a gradual change in curvature, with a corresponding gradual change in the elevation of the outside of the curve, to help drivers take turns comfortably at higher speeds.
Vertical curves
Valley curve
Also called a sag curve, a valley curve dips down and then rises back up; valley curves are placed at the base of hills. It is the opposite of a summit curve.
Summit curve
Also called a crest curve, a summit curve rises and then dips down; summit curves occur at the peaks of hills. It is the opposite of a valley curve.
See also
Geometric design of roads
Hairpin turn
Ranging rods
Survey camp
Tape (surveying)
Transition curve
References
Surveying
Civil engineering |
https://en.wikipedia.org/wiki/Eveve | Eveve is a company which provides restaurant reservation systems. The firm is the largest independently owned supplier in the industry, managing two million online diners per year, and is the largest supplier in Minnesota, booking more table reservations than its main rival OpenTable.
Founded in 1997, the company launched its restaurant reservation system, for which it is best known, in 2007, initially with a selection of restaurants in Edinburgh. As of July 2014, the company has 870 restaurant clients in 11 countries, and is the largest supplier in New Zealand and Chile (under a subsidiary, Reservarte) and the second-largest supplier in the US by online reservations. In 2014, Eveve's operations in Texas expanded significantly, and 2015 saw accelerated expansion as Eveve gained market share in Seattle and Denver.
See also
List of websites about food and drink
References
External links
Eveve New Zealand
Companies based in Edinburgh
Companies established in 1997
1997 establishments in Scotland
Online companies of the United Kingdom
Computer_reservation_systems |
https://en.wikipedia.org/wiki/Anatole%20Katok | Anatoly Borisovich Katok (August 9, 1944 – April 30, 2018) was an American mathematician of Russian-Jewish origin. Katok was the director of the Center for Dynamics and Geometry at the Pennsylvania State University.
His field of research was the theory of dynamical systems.
Early life and education
Anatole Katok graduated from Moscow State University, from which he received his master's degree in 1965 and PhD in 1968 (with a thesis on "Applications of the Method of Approximation of Dynamical Systems by Periodic Transformations to Ergodic Theory" under Yakov Sinai). In 1978 he immigrated to the USA. He was married to the mathematician Svetlana Katok, who also works on dynamical systems and has been involved with Katok in the MASS Program for undergraduate students at Penn State.
Work and research
While in graduate school, Katok (together with A. Stepin) developed a theory of periodic approximations of measure-preserving transformations, commonly known as Katok–Stepin approximations. This theory helped to solve some problems that went back to von Neumann and Kolmogorov, and won the prize of the Moscow Mathematical Society in 1967.
His next result was the theory of monotone (or Kakutani) equivalence, which is based on a generalization of the concept of time-change in flows. Several constructions in the theory of dynamical systems are due to Katok. Among these are the Anosov–Katok construction of smooth ergodic area-preserving diffeomorphisms of compact manifolds, the construction of Bernoulli diffeomorphisms with nonzero Lyapunov exponents on any surface, and the first construction of an invariant foliation for which Fubini's theorem fails in the worst possible way ("Fubini foiled").
With Elon Lindenstrauss and Manfred Einsiedler, Katok made important progress on the Littlewood conjecture in the theory of Diophantine approximations.
Katok was also known for formulating conjectures and problems (for some of which he even offered prizes) that influenced |
https://en.wikipedia.org/wiki/Passthrough%20%28electronics%29 | In signal processing, a passthrough is a circuit or device that enables a signal to "pass through" unaltered, or with only minor alteration. The concept of a passthrough can also involve daisy-chain logic.
Examples of passthroughs
Analog passthrough (for digital TV)
Sega 32X (passthrough for Sega Genesis video games)
VCRs, DVD recorders, etc. act as a "pass-through" for composite video and S-video, though sometimes they act as an RF modulator for use on Channel 3.
Tape monitor features allow an AV receiver (sometimes the recording device itself) to act as a "pass-through" for audio.
An AV receiver usually allows pass-through of the video signal while amplifying the audio signal to drive speakers.
See also
Dongle, a device that converts signal, instead of just being a "passthrough" for unaltered signal
Signal processing
Electrical engineering
https://en.wikipedia.org/wiki/Lamellar%20phase | Lamellar phase refers generally to packing of polar-headed long chain nonpolar-tail molecules in an environment of bulk polar liquid, as sheets of bilayers separated by bulk liquid. In biophysics, polar lipids (mostly, phospholipids, and rarely, glycolipids) pack as a liquid crystalline bilayer, with hydrophobic fatty acyl long chains directed inwardly and polar headgroups of lipids aligned on the outside in contact with water, as a 2-dimensional flat sheet surface. Under transmission electron microscope (TEM), after staining with polar headgroup reactive chemical osmium tetroxide, lamellar lipid phase appears as two thin parallel dark staining lines/sheets, constituted by aligned polar headgroups of lipids. 'Sandwiched' between these two parallel lines, there exists one thicker line/sheet of non-staining closely packed layer of long lipid fatty acyl chains. This TEM-appearance became famous as Robertson's unit membrane - the basis of all biological membranes, and structure of lipid bilayer in unilamellar liposomes. In multilamellar liposomes, many such lipid bilayer sheets are layered concentrically with water layers in between.
In lamellar lipid bilayers, the polar headgroups of lipids align together at the interface with water, and the hydrophobic fatty-acid acyl chains align parallel to one another, 'hiding away' from water. The lipid head groups are somewhat more 'tightly' packed than the relatively 'fluid' long hydrocarbon fatty acyl chains. The lamellar lipid bilayer organization thus exhibits a 'flexibility gradient' of increasing freedom of motion from near the head groups towards the terminal methyl groups of the fatty acyl chains. The existence of such a dynamic organization of the lamellar phase in liposomes as well as in biological membranes can be confirmed by spin-label electron paramagnetic resonance and high-resolution nuclear magnetic resonance spectroscopy studies of biological membranes and liposomes.
In 'soft matter science', where physics and chemistry meet biological scie |
https://en.wikipedia.org/wiki/Knuth%27s%20Simpath%20algorithm | Simpath is an algorithm introduced by Donald Knuth that constructs a zero-suppressed decision diagram (ZDD) representing all simple paths between two vertices in a given graph.
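For intuition about what the ZDD built by Simpath represents, the same set of simple paths can be enumerated directly with a plain depth-first search. This sketch is not Knuth's ZDD construction itself (which encodes the path set compactly rather than listing it), just an illustration of the object being computed:

```python
def simple_paths(graph, s, t):
    """Enumerate all simple paths from s to t in an undirected graph,
    given as an adjacency dict {vertex: set(neighbours)}.  This lists
    explicitly the path set that Simpath encodes as a ZDD."""
    paths = []

    def dfs(v, visited, path):
        if v == t:
            paths.append(list(path))
            return
        for w in graph[v]:
            if w not in visited:       # "simple" means no repeated vertex
                visited.add(w)
                path.append(w)
                dfs(w, visited, path)
                path.pop()
                visited.remove(w)

    dfs(s, {s}, [s])
    return paths

# A 4-cycle has exactly two simple paths between opposite corners.
square = {1: {2, 4}, 2: {1, 3}, 3: {2, 4}, 4: {1, 3}}
paths = simple_paths(square, 1, 3)
```

The explicit listing grows exponentially with graph size, which is precisely why the ZDD representation matters for large graphs.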
References
External links
Graphillion library which implements the algorithm for manipulating large sets of paths and other structures.
, A CWEB implementation by Donald Knuth.
Computer arithmetic algorithms
Donald Knuth
Graph algorithms
Mathematical logic
Theoretical computer science |
https://en.wikipedia.org/wiki/Water%20level | Water level, also known as gauge height or stage, is the elevation of the free surface of a sea, stream, lake or reservoir relative to a specified vertical datum.
See also
Water level (device), device utilizing the surface of liquid water to establish a local horizontal plane of reference
Flood stage
Hydraulic head
Stream gauge
Water level gauges
Tide gauge
Level sensor
Liquid level
Reference water level
Stage (hydrology)
Sea level
References
Hydrology
Vertical position |
https://en.wikipedia.org/wiki/Orthodox%20semigroup | In mathematics, an orthodox semigroup is a regular semigroup whose set of idempotents forms a subsemigroup. In more recent terminology, an orthodox semigroup is a regular E-semigroup. The term orthodox semigroup was coined by T. E. Hall and presented in a paper published in 1969. Certain special classes of orthodox semigroups had been studied earlier; for example, semigroups that are unions of groups in which the set of idempotents forms a subsemigroup were studied by P. H. H. Fantham in 1960.
Examples
Consider the binary operation in the set S = { a, b, c, x } defined by the following Cayley table:
Then S is an orthodox semigroup under this operation, the subsemigroup of idempotents being { a, b, c }.
Inverse semigroups and bands are examples of orthodox semigroups.
Some elementary properties
The set of idempotents in an orthodox semigroup has several interesting properties. Let S be a regular semigroup and for any a in S let V(a) denote the set of inverses of a. Then the following are equivalent:
S is orthodox.
If a and b are in S and if x is in V(a) and y is in V(b) then yx is in V(ab).
If e is an idempotent in S then every inverse of e is also an idempotent.
For every a, b in S, if V(a) ∩ V(b) ≠ ∅ then V(a) = V(b).
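The defining condition can be checked mechanically for any finite semigroup given by its multiplication rule: verify that every element has an inverse (regularity) and that the idempotents are closed under multiplication. A minimal sketch; the two-element left-zero band used here is an illustrative example (not the article's Cayley table), and bands are orthodox, as noted above:

```python
from itertools import product

def is_orthodox(elements, mul):
    """Check whether a finite semigroup (elements plus a multiplication
    function) is orthodox: regular, with idempotents forming a subsemigroup."""
    def inverses(a):
        # y is an inverse of a when a*y*a = a and y*a*y = y
        return [y for y in elements
                if mul(a, mul(y, a)) == a and mul(y, mul(a, y)) == y]

    # Regularity: every element must have at least one inverse.
    if any(not inverses(a) for a in elements):
        return False

    idempotents = {e for e in elements if mul(e, e) == e}
    # Orthodoxy: the idempotents must be closed under multiplication.
    return all(mul(e, f) in idempotents
               for e, f in product(idempotents, repeat=2))

# Left-zero band on {'a', 'b'}: x * y = x.  Every element is idempotent.
left_zero = lambda x, y: x
result = is_orthodox(['a', 'b'], left_zero)
```

In a left-zero band every element is its own inverse and every product is idempotent, so the check returns True.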
Structure
The structure of orthodox semigroups has been determined in terms of bands and inverse semigroups. The Hall–Yamada pullback theorem describes this construction, which requires the concepts of pullbacks (in the category of semigroups) and the Nambooripad representation of a fundamental regular semigroup.
See also
Catholic semigroup
Special classes of semigroups
References
Semigroup theory |
https://en.wikipedia.org/wiki/Catholic%20semigroup | In mathematics, a catholic semigroup is a semigroup in which no two distinct elements have the same set of inverses. The terminology was introduced by B. M. Schein in a paper published in 1979. Every catholic semigroup either is a regular semigroup or has precisely one element that is not regular. The semigroup of all partial transformations of a set is a catholic semigroup, and it follows that every semigroup is embeddable in a catholic semigroup. However, the full transformation semigroup on a set is not catholic unless the set is a singleton. Regular catholic semigroups are both left and right reductive, that is, their representations by inner left and right translations are faithful. A regular semigroup is both catholic and orthodox if and only if it is an inverse semigroup.
See also
Special classes of semigroups
Orthodox semigroup
References
Semigroup theory |
https://en.wikipedia.org/wiki/Sunday%20Night%20Football%20%28Australian%20TV%20program%29 | Sunday Night Football is an Australian rules football sports broadcast television program that aired on the Seven Network from 28 April 1991 until 9 April 2000. It returned to broadcast on Seven from 6 April 2014 until 29 June 2014 in VIC, SA, WA and TAS, and on 7mate over the same period in NSW and QLD.
See also
Friday Night Football
Saturday Night Footy
Seven Sport § Australian rules football
References
1991 Australian television series debuts
2000 Australian television series endings
2014 Australian television series debuts
2014 Australian television series endings
Australian Football League
Australian sports television series
Seven Network original programming
Simulcasts |
https://en.wikipedia.org/wiki/Industrial%20and%20production%20engineering | Industrial and production engineering (IPE) is an interdisciplinary engineering discipline that includes manufacturing technology, engineering sciences, management science, and the optimization of complex processes, systems, or organizations. It is concerned with the understanding and application of engineering procedures in manufacturing processes and production methods. Industrial engineering dates back to the Industrial Revolution of the 1700s and was shaped by pioneers such as Adam Smith, Henry Ford, Eli Whitney, Frank and Lillian Gilbreth, Henry Gantt, and F. W. Taylor. After the 1970s, industrial and production engineering developed worldwide and began to make wide use of automation and robotics. Industrial and production engineering comprises three areas: mechanical engineering (from which production engineering derives), industrial engineering, and management science.
The objective is to improve efficiency, increase the effectiveness of manufacturing and quality control, and reduce cost while making products more attractive and marketable. Industrial engineering is concerned with the development, improvement, and implementation of integrated systems of people, money, knowledge, information, equipment, energy and materials, as well as with the analysis and synthesis of such systems. The principles of IPE draw on the mathematical, physical and social sciences, together with methods of engineering design, to specify, predict, and evaluate the results to be obtained from systems or processes currently in place or being developed. The target of production engineering is to complete the production process in the smoothest, most judicious and most economical way. Production engineering also overlaps substantially with manufacturing engineering and industrial engineering; the concept of production engineering is largely interchangeable with manufacturing engineering.
As for education, undergraduates normally start off by taking courses such as physics, mathematics (calculus, linear analysis, differential equations), c |
https://en.wikipedia.org/wiki/The%20Tor%20Project | The Tor Project, Inc. is a 501(c)(3) research-education nonprofit organization based in Winchester, New Hampshire. It was founded by computer scientists Roger Dingledine, Nick Mathewson, and five others. The Tor Project is primarily responsible for maintaining software for the Tor anonymity network.
History
The Tor Project was founded in December 2006 by computer scientists Roger Dingledine, Nick Mathewson and five others. The Electronic Frontier Foundation (EFF) acted as the Tor Project's fiscal sponsor in its early years, and early financial supporters of the Tor Project included the U.S. International Broadcasting Bureau, Internews, Human Rights Watch, the University of Cambridge, Google, and Netherlands-based Stichting NLnet.
In October 2014, the Tor Project hired the public relations firm Thomson Communications in order to improve its public image (particularly regarding the terms "Dark Net" and "hidden services") and to educate journalists about the technical aspects of Tor.
In May 2015, the Tor Project ended the Tor Cloud Service.
In December 2015, the Tor Project announced that it had hired Shari Steele, former executive director of the Electronic Frontier Foundation, as its new executive director. Roger Dingledine, who had been acting as interim executive director since May 2015, remained at the Tor Project as a director and board member. Later that month, the Tor Project announced that the Open Technology Fund would be sponsoring a bug bounty program that was coordinated by HackerOne. The program was initially invite-only and focuses on finding vulnerabilities that are specific to the Tor Project's applications.
On May 25, 2016 Tor Project employee Jacob Appelbaum stepped down from his position; this was announced on June 2 in a two-line statement by Tor. Over the following days, allegations of sexual mistreatment were made public by several people.
On July 13, 2016, the complete board of the Tor Project – Meredith Hoban Dunn, Ian Goldberg, Julius M |
https://en.wikipedia.org/wiki/Noncommutative%20projective%20geometry | In mathematics, noncommutative projective geometry is a noncommutative analog of projective geometry in the setting of noncommutative algebraic geometry.
Examples
The quantum plane, the most basic example, is the quotient ring of the free ring:
More generally, the quantum polynomial ring is the quotient ring:
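The defining relations, elided above, are commonly presented as follows; this is the standard presentation, supplied here for completeness, with q and the q_{ij} nonzero scalars in the base field k:

```latex
% Quantum plane: free algebra on x, y modulo the q-commutation relation
k_q[x, y] = k\langle x, y\rangle / (yx - qxy)

% Quantum polynomial ring in n variables
k_{q_{ij}}[x_1, \dots, x_n]
  = k\langle x_1, \dots, x_n\rangle / (x_j x_i - q_{ij}\, x_i x_j),
  \qquad 1 \le i < j \le n
```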
Proj construction
By definition, the Proj of a graded ring R is the quotient category of the category of finitely generated graded modules over R by the subcategory of torsion modules. If R is a commutative Noetherian graded ring generated by degree-one elements, then the Proj of R in this sense is equivalent to the category of coherent sheaves on the usual Proj of R. Hence, the construction can be thought of as a generalization of the Proj construction for a commutative graded ring.
See also
Elliptic algebra
Calabi–Yau algebra
Sklyanin algebra
References
Fields of geometry |
https://en.wikipedia.org/wiki/Science%20and%20Hypothesis | Science and Hypothesis () is a book by French mathematician Henri Poincaré, first published in 1902. Aimed at a non-specialist readership, it deals with mathematics, space, physics and nature. It puts forward the theses that absolute truth in science is unattainable, and that many commonly held beliefs of scientists are held as convenient conventions rather than because they are more valid than the alternatives.
In this book, Poincaré describes open scientific questions regarding the photoelectric effect, Brownian motion, and the relativity of physical laws in space.
Reading this book inspired Albert Einstein's subsequent Annus Mirabilis papers published in 1905.
A new translation was published in November 2017.
References
1902 non-fiction books
Popular science books
Mathematics books
French non-fiction books |
https://en.wikipedia.org/wiki/Jonathan%20Bennett%20%28mathematician%29 | Jonathan Bennett is a mathematician and Professor of Mathematical Analysis at the University of Birmingham. He was a recipient of the Whitehead Prize of the London Mathematical Society in 2011 for "his foundational work on multilinear inequalities in harmonic and geometric analysis, and for a number of major results in the theory of oscillatory integrals."
Education
In 1995 he graduated with a BA in mathematics from Hertford College at the University of Oxford. He went on to study for a PhD in harmonic analysis under Anthony Carbery at the University of Edinburgh, graduating in 1999.
Career
Bennett has done postdoctoral work at the University of Edinburgh, the Universidad Autonoma de Madrid and Trinity College Dublin. He joined the University of Birmingham in 2005. Bennett is an editor for the journals Mathematika and Quarterly Journal of Mathematics.
Bennett is known for his work in harmonic analysis, particularly in applying the methods of heat flow monotonicity and induction-on-scale arguments to prove inequalities arising in harmonic and geometric analysis, in particular for his work (jointly with Anthony Carbery and Terence Tao) on the multilinear Kakeya conjecture. Bennett has an Erdős number of 3, via his collaboration with Tao.
References
External links
Bennett's page at the Mathematics Genealogy Project.
Research published by Bennett.
A blog post by Tao explaining induction-on-scales arguments, with reference to his joint work with Bennett and Carbery.
Year of birth missing (living people)
Living people
20th-century British mathematicians
21st-century British mathematicians
Alumni of Hertford College, Oxford
Alumni of the University of Edinburgh College of Science and Engineering
Academics of the University of Birmingham
Mathematical analysts
Whitehead Prize winners
British mathematicians |
https://en.wikipedia.org/wiki/Local%20attraction | In compass surveying, the magnetic needle is sometimes deflected from its normal position under the influence of external attractive forces. Such a disturbing influence is called local attraction. These external forces are produced by sources of local attraction, which may be current-carrying wires or magnetic and metal objects. The term is also used to denote the amount of deviation of the needle from its normal position. Local attraction causes errors in observations made while surveying, so suitable methods are employed to eliminate these errors.
Sources
The sources of local attraction may be natural or artificial. Natural sources include iron ores and magnetic rocks, while artificial sources include steel structures, iron pipes and current-carrying conductors. Iron surveying instruments such as metric chains, ranging rods and arrows should also be kept at a safe distance from the compass.
Detection
Local attraction at a place can be detected by observing the bearings of a line from both of its ends. If the fore bearing and back bearing of the line differ by exactly 180°, there is no local attraction at either station. If the difference is not equal to 180°, local attraction exists at one or both ends of the line.
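The 180° test is easy to automate. A minimal sketch (the function name and tolerance parameter are illustrative, not standard surveying software):

```python
def local_attraction_free(fore_bearing, back_bearing, tolerance=0.0):
    """Return True when the fore and back bearings of a line (in degrees)
    differ by exactly 180 degrees, i.e. neither station is affected by
    local attraction."""
    diff = abs(fore_bearing - back_bearing) % 360
    return abs(diff - 180.0) <= tolerance

# FB 60.5 deg and BB 240.5 deg differ by exactly 180 deg: no local attraction.
ok = local_attraction_free(60.5, 240.5)
# FB 60.5 deg and BB 242.0 deg differ by 181.5 deg: local attraction present.
bad = local_attraction_free(60.5, 242.0)
```

A small nonzero tolerance allows for reading errors of the compass graduation, which in practice are distinct from genuine local attraction.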
Remedies
There are two common methods of correcting the observed bearings of lines taken in an area affected by local attraction. The first corrects each bearing with the help of corrected included angles; the second corrects the bearings of the traverse starting from one correct bearing (one in which the fore bearing and back bearing differ by exactly 180°), distributing the error among the other bearings.
References
Surveying
Civil engineering |
https://en.wikipedia.org/wiki/Terbium%20gallium%20garnet | Terbium gallium garnet (TGG) is a kind of synthetic garnet, with the chemical composition . This is a Faraday rotator material with excellent transparency properties and is very resistant to laser damage. TGG can be used in optical isolators for laser systems, in optical circulators for fiber optic systems, in optical modulators, and in current and magnetic field sensors.
TGG has a high Verdet constant, which quantifies the strength of its Faraday effect. The Verdet constant increases substantially as the crystal approaches cryogenic temperatures. The highest Verdet constants are found in terbium-doped dense flint glasses and in crystals of TGG. The Faraday effect is chromatic (i.e. it depends on wavelength), so the Verdet constant is a fairly strong function of wavelength. At 632 nm, the Verdet constant for TGG is reported to be , whereas at 1064 nm it falls to . This behavior means that devices manufactured for a certain degree of rotation at one wavelength will produce much less rotation at longer wavelengths. Many Faraday rotators and isolators are adjustable by varying the extent to which the Faraday rotator material is inserted into the magnetic field of the device; in this way, the device can be tuned for use with a range of lasers within its design range.
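The rotation produced by a Faraday rotator follows the standard relation θ = V·B·d, where V is the Verdet constant, B the magnetic flux density along the light path, and d the path length. A minimal sketch with purely illustrative numbers (they are not the elided TGG values from the text):

```python
import math

def faraday_rotation_deg(verdet, field_tesla, length_m):
    """Polarisation rotation angle in degrees: theta = V * B * d,
    with the Verdet constant V given in rad T^-1 m^-1."""
    return math.degrees(verdet * field_tesla * length_m)

# Illustrative values only: V = 100 rad/(T*m), B = 1 T, d = 2 cm.
theta = faraday_rotation_deg(100.0, 1.0, 0.02)
```

Because V depends on wavelength, the same crystal and field give different rotation angles at different laser wavelengths, which is why isolators are specified for a design wavelength range.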
See also
Gadolinium gallium garnet
Yttrium iron garnet
Yttrium aluminium garnet
References
Synthetic minerals
Oxides
Terbium compounds
Gallium compounds
Nonlinear optical materials |
https://en.wikipedia.org/wiki/Largeur.com | Largeur.com is an online magazine published by the Swiss media agency LargeNetwork.
History
On 19 April 1999, two Swiss journalists, Pierre Grosjean and Gabriel Sigrist, created Largeur.com, an online magazine that publishes investigations, columns and news reports on a daily basis. Organized under five headings (Glocal, Kapital, Pop Culture, Technophile and Latitudes), the articles focus on new trends, original viewpoints and exclusive information.
Largeur.com soon expanded into a media agency which produces original content for Swiss media publications and magazines, books and other products (print and online) for companies and other institutions. In July 2009, these activities were grouped into a new entity, LargeNetwork. The online magazine retained its original name.
Grosjean and Sigrist were previously involved in the creation of the daily newspaper Le Temps and both worked for Le Nouveau Quotidien.
References
External links
Media coverage of Largeur.com (in French)
History of Largeur.com
1999 establishments in Switzerland
French-language magazines
French-language websites
Magazines established in 1999
Magazines published in Geneva
News magazines published in Europe
Swiss news websites
Online magazines |
https://en.wikipedia.org/wiki/Balls%20into%20bins%20problem | The balls into bins (or balanced allocations) problem is a classic problem in probability theory with many applications in computer science. The problem involves m balls and n boxes (or "bins"). Balls are placed one at a time into the bins; once all balls have been placed, we look at the number of balls in each bin, called the load on the bin. The problem can be modelled using a multinomial distribution, and may involve asking questions such as: what is the expected number of bins containing a ball?
Obviously, it is possible to make the maximum load as small as ⌈m/n⌉ by putting each ball into the least loaded bin. The interesting case is when the bin is selected at random, or at least partially at random. A powerful balls-into-bins paradigm is the "power of two random choices", in which each ball chooses two (or more) random bins and is placed in the less loaded one. This paradigm has found wide practical application in shared-memory emulations, efficient hashing schemes, randomized load balancing of tasks on servers, and routing of packets within parallel networks and data centers.
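The power-of-two-choices effect is easy to observe in simulation. A small sketch (function name, parameters and seed are illustrative): throw m balls into n bins, letting each ball sample one or two candidate bins and go into the less loaded one, then compare the maximum loads.

```python
import random

def max_load(n_balls, n_bins, choices=1, seed=0):
    """Throw n_balls into n_bins; each ball samples `choices` random bins
    and is placed in the least loaded of them.  Return the maximum load."""
    rng = random.Random(seed)
    load = [0] * n_bins
    for _ in range(n_balls):
        candidates = [rng.randrange(n_bins) for _ in range(choices)]
        best = min(candidates, key=lambda b: load[b])
        load[best] += 1
    return max(load)

# With m = n, one random choice gives a maximum load around ln n / ln ln n,
# while two choices typically reduce it to around ln ln n / ln 2.
one = max_load(10000, 10000, choices=1)
two = max_load(10000, 10000, choices=2)
```

The exponential drop from a logarithmic to a doubly logarithmic maximum load is what makes two-choice allocation attractive in hashing and load balancing.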
Random allocation
When the bin for each ball is selected at random, independently of other choices, the maximum load might be as large as m. However, it is possible to calculate a tighter bound that holds with high probability, that is, with a probability that tends to 1 as n grows to infinity.
For the case m = n, with high probability the maximum load is:
(1 + o(1)) ln n / ln ln n.
Gonnet gave a tight bound for the expected value of the maximum load, which for m = n is Γ⁻¹(n) − 3/2 + o(1), where Γ⁻¹ is the inverse of the gamma function, and it is known that Γ⁻¹(n) = (ln n / ln ln n)(1 + o(1)).
The maximum load can also be calculated for , and for example, for it is , and for it is , with high probability.
Exact probabilities for small can be computed as for defined in OEIS A208250.
Partially random allocation
Instead of just selecting a random bin for each ball, it is possible to select two or more bins for each ball and |
https://en.wikipedia.org/wiki/Ceramic%20valve | A ceramic valve is a valve with ceramic trim, ball, seat, disc or lining. A carbon steel or stainless steel body is used to protect the ceramic trim from being damaged by sudden thermal or mechanical shock. Advanced ceramics used in their manufacture include alumina, zirconia and silicon nitride. Significant benefits of the use of ceramic in valves, compared with steel or other traditional materials, include resistance to wear and lower mass. Owing to their excellent corrosion, abrasion and wear resistance, ceramic valves are often used in severely corrosive and abrasive applications, such as flue-gas desulfurization (FGD) and pneumatic refuse-conveying systems.
References
Ceramic engineering |
https://en.wikipedia.org/wiki/PlayCanvas | PlayCanvas is an open-source 3D game engine/interactive 3D application engine alongside a proprietary cloud-hosted creation platform that allows for simultaneous editing from multiple computers via a browser-based interface. It runs in modern browsers that support WebGL, including Mozilla Firefox and Google Chrome. The engine is capable of rigid-body physics simulation, handling three-dimensional audio and 3D animations.
PlayCanvas has gained the support of ARM, Activision and Mozilla.
The PlayCanvas engine was open-sourced on June 4, 2014.
In April 2019, BusinessInsider.com reported that the company was acquired by Snap Inc. in 2017.
Features
The PlayCanvas platform has a collaborative real-time editor that allows a project to be edited by multiple developers simultaneously. The engine supports the WebGL 1.0 and 2.0 standards to produce GPU-accelerated 3D graphics and allows for scripting via the JavaScript programming language.
Projects can be distributed via a URL web link or packaged in native wrappers, e.g. for Android using CocoonJS or for Steam using Electron, among many other options and platforms.
Notable PlayCanvas applications
Various companies use PlayCanvas in projects of different disciplines of interactive 3D content in the web.
Disney created an educational game for Hour of Code based on its Moana film.
King published Shuffle Cats Mini as a launch title for Facebook Instant Games.
TANX – massively multiplayer online game of cartoon styled tanks.
Miniclip has published a number of games on its platform as HTML5 games have grown in popularity on the web.
Mozilla collaborated with PlayCanvas team creating After the Flood demo for presenting cutting-edge features of WebGL 2.0.
See also
List of WebGL frameworks
List of game engines
JavaScript
HTML5
WebGL
References
External links
PlayCanvas Official Website
PlayCanvas Engine (Open Source)
PlayCanvas API Reference
PlayCanvas Tutorials
Various free-to-play games built with PlayCanvas
Cloud applica |
https://en.wikipedia.org/wiki/Bryozoology | Bryozoology is a branch of zoology specializing in Bryozoa, commonly known as moss animals, a phylum of aquatic invertebrates that live in clonal colonies.
Organizations
The International Bryozoology Association was founded in August 1968 by 16 zoologists and paleozoologists in Stockholm.
Journals
Annals of Bryozoology
Bryozoologists
Samantha L.L. Hill
Eliza Jelly
Randolph Kirkpatrick
Raymond C. Osburn
Mary Dora Rogick
Ehrhard Voigt
Timothy S. Wood
References
Subfields of zoology |
https://en.wikipedia.org/wiki/Felgo | Felgo (previously V-Play Engine until February 2019) is a cross-platform development tool, based on the Qt framework. It can be used to create mobile apps or games. Felgo apps and games are supported on iOS, Android, Windows Phone, embedded devices and desktop devices. Felgo developers use QML, JavaScript and C++ to create mobile apps and games.
Apps and games built with Felgo use a single code base and work across multiple platforms and screen resolutions. Felgo was founded in 2012 and is based in Vienna, Austria.
Architecture
Felgo is based on the Qt cross-platform development framework, which provides abstraction layers for timers, threads, storage, networking and UI rendering on different platforms. Felgo uses Qt as its core and offers components and plugins on top of it, which further simplify the development of apps and games.
Engine Features
Qt Creator IDE
The Qt Creator IDE supports editing QML and JavaScript code with context-sensitive help, code completion of Felgo components, navigation between components and more.
It includes a QML debugger and profiler for debugging custom components and JavaScript functions. It can inspect and change property values and QML code at runtime and is able to measure the time of element creation and binding evaluations.
Declarative Language Features
Felgo apps and games are written in JavaScript and QML, a declarative language that features property bindings, state machines, and fluid animations of any property.
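The property-binding idea can be illustrated with a conceptual sketch in plain Python (not Felgo/QML code; all class and variable names here are hypothetical): a bound value is re-derived automatically whenever its source property changes, which is what a QML binding expresses declaratively.

```python
# Conceptual sketch (plain Python, not Felgo/QML): a property binding
# re-evaluates a derived value whenever its source property changes.
# All class and variable names here are hypothetical.

class Property:
    def __init__(self, value):
        self._value = value
        self._observers = []      # callbacks re-run on every change

    def bind(self, callback):
        self._observers.append(callback)
        callback()                # establish the initial bound value

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, new):
        self._value = new
        for callback in self._observers:
            callback()

# Roughly what a QML binding like "halfWidth: width / 2" expresses:
width = Property(100)
half_width = Property(0)
width.bind(lambda: setattr(half_width, "value", width.value / 2))

print(half_width.value)   # -> 50.0
width.value = 300
print(half_width.value)   # -> 150.0
```

In real QML the dependency tracking is automatic; the explicit `bind` call here only mimics the effect.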
Resolution & Aspect Ratio Independence
Felgo is built to handle all possible aspect ratios and resolutions of modern mobile devices. Developers create their game for a logical scene using a content scaling approach.
Native Dialogs and Functionality
Felgo apps and games provide a native look and feel on all platforms. Felgo offers abstracted components for displaying native input dialogs and alert boxes without the need to write a single line of native code.
Felgo Game Network
The Felgo Game Network is a mobi |
https://en.wikipedia.org/wiki/Kaon%20Media | Kaon Media () is a South Korean technology company which specialises in the development and manufacturing of digital connectivity devices (including set-top boxes) and residential gateways for pay-TV operators, broadband operators and telcos. It provides digital broadcasting services to more than 120 operators in 80 countries. The company is based in Seongnam, South Korea and was founded in 2001.
In November 2013, KAONMEDIA announced that it had started to provide Android set-top boxes, with the latest Google services for TV, on South Korean broadband; this was the world's first commercial launch of IPTV with the latest Android OS (Jelly Bean 4.2).
In February 2021, the company partnered with Minim and Irdeto to preintegrate their Trusted Home solution into its cable gateway.
References
External links
Mass media companies of South Korea
Seongnam
Companies based in Gyeonggi Province
Interactive television
Television technology
Digital television
Internet broadcasting
Streaming television
Multimedia
Streaming media systems
Video on demand services
South Korean companies established in 2001 |
https://en.wikipedia.org/wiki/List%20of%20BPMN%202.0%20engines | This is a list of notable Business Process Model and Notation 2.0 (BPMN 2.0) Workflow Management Systems (WfMSs).
List of BPMN 2.0 engines
See also
Comparison of Business Process Modeling Notation tools
List of BPEL engines
Notes
References
BPMN 2.0 engines
Middleware
BPMN |
https://en.wikipedia.org/wiki/Skipper%20%28computer%20software%29 | Skipper is a visualization tool and code/schema generator for PHP ORM frameworks like Doctrine2, Doctrine, Propel, and CakePHP, which are used to create a database abstraction layer.
Skipper is developed by Czech company Inventic, s.r.o. based in Brno, and was known as ORM Designer prior to rebranding in 2014.
Overview
Generates visual model from the schema definition files
Repetitive import/export of schema definitions in supported formats (XML, YML, PHP annotations)
Schema definition files are automatically generated from the visual model
Visual representation uses ER diagram extended by concepts of inheritance and many-to-many
Supports customization using .xml configuration files and JavaScript
Does not support direct connections to the database
Crude and simplistic visual representation and menus
Architecture
Skipper was built on the Qt framework. Import/export of the schema definitions uses XSL transformations powered by the LibXslt library. Imported source files are first converted to XML format: no conversion for XML, simple conversion for YML, and creation of an Abstract Syntax Tree with subsequent conversion to XML for PHP annotations.
The import/export scripts are configured in JavaScript and can be freely customized.
Supported ORM frameworks
Frameworks supported for visual model and schema files generation:
Doctrine2
Doctrine
Propel
CakePHP
History
Skipper was created as an internal tool for the web applications developed by Inventic. It was first published as a commercial tool under the name ORM Designer in 2009.
The application was reworked and optimized in January 2013, and released as ORM Designer 2.
In May 2013 ORM Designer became part of the South Moravian Innovation Center Incubator program (support program for innovative technological startups).
In June 2014, ORM Designer version 3 was released and rebranded under the name of Skipper.
See also
List of object-relational mapping software
Comparison of object-relational mapping software
Object-rel |
https://en.wikipedia.org/wiki/Springloops | Springloops is a web-based Git and Subversion version control and hosting service with integrated deploy and code collaboration features for web and software developers. Springloops was also the name of the company behind the service until it rebranded to 84kids in May 2013.
Springloops can be used both from the command line and through a web-based graphical interface shared with BamBam!, Chime and Anchor apps from 84kids.
Springloops offers two kinds of paid plans: Personal, with 3-6GB of disk space and 10-25 repositories, and Business, with 12-60GB of space and 50 to unlimited repositories and a range of extra features. The service also provides a free plan for teachers available upon direct request via email (presumably with no limits on the account), as well as a free plan with 1 repository and 100MB of space. All plans start with a 14-day free trial.
History
Springloops launched in late 2006 as an SVN version control and deploy tool and gradually grew into a full-scale project management application. v2.0 was released in June 2009 with a new interface and features: tasks, time tracking, a wiki module and Git support. On May 5, 2013, the service was upgraded to v3.0 and received a heavy graphical overhaul. The core features were extracted into four separate apps (probably for marketing reasons): BamBam! (tasks), Chime (time tracking), Anchor (wiki) and Springloops, which retained its original role as a source & deploy tool. The apps can be used separately or integrated within one account, depending on user needs and profile.
The software’s account settings also mention Turbine, described as “discussions and note-taking for teams”. It has been greyed out and labeled “Coming soon” since the division, however.
Controversy
The v3.0 upgrade spawned a wide range of reactions, especially due to drastic changes in UI and UX. In September 2013 the company released a major update which was a direct response to the feedback from users. The team wrote:
If we were to descr |
https://en.wikipedia.org/wiki/Digital%20Signal%20Processing%20%28journal%29 | Digital Signal Processing is a monthly peer-reviewed open access scientific journal covering all areas of signal processing. It was established in 1991 and is published by Academic Press, now Elsevier. The editor-in-chief is Ercan E. Kuruoglu (ISTI-CNR, Pisa, Italy).
Abstracting and indexing
The journal is abstracted and indexed in:
Current Contents/Engineering, Computing & Technology
Engineering Index Monthly
EBSCOhost
INSPEC
Science Citation Index
Scopus
According to the Journal Citation Reports, the journal has a 2014 impact factor of 1.495.
References
External links
Digital signal processing
English-language journals
Electrical and electronic engineering journals
Open access journals
Monthly journals
Academic journals established in 1991 |
https://en.wikipedia.org/wiki/Computation%20offloading | Computation offloading is the transfer of resource-intensive computational tasks to a separate processor, such as a hardware accelerator, or an external platform, such as a cluster, grid, or a cloud. Offloading to a coprocessor can be used to accelerate applications including image rendering and mathematical calculations. Offloading computing to an external platform over a network can provide computing power and overcome hardware limitations of a device, such as limited computational power, storage, and energy.
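The trade-off just described, where offloading pays off only when the remote speed-up outweighs the cost of moving data over the network, can be sketched as a simple break-even check (an illustrative model with hypothetical numbers, not drawn from any specific system):

```python
# Illustrative break-even model for computation offloading. A task of
# `cycles` operations either runs locally, or its `data_bytes` input is
# shipped over a link of `bandwidth` bytes/s to a faster remote
# platform. All function names and figures below are hypothetical.

def should_offload(cycles, local_speed, remote_speed, data_bytes, bandwidth):
    local_time = cycles / local_speed
    remote_time = data_bytes / bandwidth + cycles / remote_speed
    return remote_time < local_time

# A compute-heavy task with a small input favours offloading...
print(should_offload(1e12, 1e9, 1e11, 1e6, 1e7))  # -> True
# ...while a light task with a large input does not.
print(should_offload(1e8, 1e9, 1e11, 1e9, 1e7))   # -> False
```

Real offloading frameworks also weigh energy, memory pressure and result-return time, but the same comparison sits at the core of the decision.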
History
The first concepts of stored-program computers were developed in the design of the ENIAC, the first general-purpose digital computer. The ENIAC was limited in performance to single tasks, which led to the development of the EDVAC, which would become the first computer designed to perform instructions of various types. Developing computing technologies facilitated the increase in performance of computers, and subsequently has led to a variety of configurations and architectures.
The first instances of computation offloading were the use of simple sub-processors to handle Input/output processing through a separate system called Channel I/O. This concept improved overall system performance as the mainframe only needed to set parameters for the operations while the channel processors carried out the I/O formatting and processing. During the 1970s, coprocessors began being used to accelerate floating-point arithmetic faster than earlier 8-bit and 16-bit processors which used software. As a result, math coprocessors became common for scientific and engineering calculations. Another form of coprocessor was the graphics coprocessor. As image processing became more popular, specialized graphics chips began being used to offload the creation of images from the CPU. Coprocessors were common in most computers, however, declined in usage due to the development in microprocessor technologies which integrated many coprocessor functions. Dedicated graphics processi |
https://en.wikipedia.org/wiki/Biospeleology | Biospeleology, also known as cave biology, is a branch of biology dedicated to the study of organisms that live in caves and are collectively referred to as troglofauna.
Biospeleology as a science
History
The first documented mention of a cave organism dates back to 1689, with the documentation of the olm, a cave salamander. Discovered in a cave in Slovenia, in the region of Carniola, it was mistaken for a baby dragon and was recorded by Johann Weikhard von Valvasor in his work The Glory of the Duchy of Carniola.
The first formal study of cave organisms was conducted on the blind cave beetle, found in 1831 by Luka Čeč, an assistant to the lamplighter, while exploring the newly discovered inner portions of the Postojna cave system in southwestern Slovenia. The specimen was turned over to Ferdinand J. Schmidt, who described it in the journal Illyrisches Blatt (1832). He named it Leptodirus hochenwartii after the donor, and also gave it the Slovene name drobnovratnik and the German name Enghalskäfer, both meaning "slender-necked (beetle)". The article represents the first formal description of a cave animal (the olm, described in 1768, wasn't recognized as a cave animal at the time).
Subsequent research by Schmidt revealed further previously unknown cave inhabitants, which aroused considerable interest among natural historians. For this reason, the discovery of L. hochenwartii (along with the olm) is considered the starting point of biospeleology as a scientific discipline. Biospeleology was formalized as a science in 1907 by Emil Racoviță with his seminal work Essai sur les problèmes biospéologiques ("Essay on biospeleological problems").
Subdivisions
Organism categories
Cave organisms fall into three basic classes:
Troglobite
Troglobites are obligatory cavernicoles, specialized for cave life. Some can leave caves for short periods, and may complete parts of their life cycles above ground, but cannot live their entire lives outside of a cave environment. Examp |
https://en.wikipedia.org/wiki/Microbead%20%28research%29 | Microbeads, also called Ugelstad particles after the Norwegian chemist, professor John Ugelstad, who invented them in 1977 and patented the method in 1978, are uniform polymer particles, typically 0.5 to 500 microns in diameter. Bio-reactive molecules can be absorbed or coupled to their surface, and used to separate biological materials such as cells, proteins, or nucleic acids.
Microbeads have been used for isolation and handling of specific material or molecules, as well as for analyzing sensitive molecules, or those that are in low abundance, e.g. in miniaturized and automated settings.
Background
Microbeads were created when John Ugelstad managed to form polystyrene beads of the same spherical sizes at the Norwegian University of Science and Technology (NTNU) in 1977. A few years later, he created superparamagnetic microbeads (Dynabeads), which exhibit magnetic properties when placed in a magnetic field. When they are removed from the magnetic field, there is no residual magnetism, which led to the development of magnetic separation technology. Other processes such as centrifugation, filtration, columns, or precipitation are not needed.
Microbeads display a large surface area per volume. This, together with uniformity of size and shape, provides for very good accessibility and fast liquid-phase reaction kinetics, and rapid and efficient binding.
Use
Black polyethylene microspheres can have magnetic or conductive functionality, and have uses in electronic devices, EMI shielding, and microscopy techniques.
Fluorescent polyethylene microspheres are commonly used to run blind tests on laboratory and industrial processes, in order to develop proper methods and minimize cross-contamination of equipment and materials. Microspheres that appear to be invisible in the daylight can be illuminated to display a bright fluorescent response under UV light.
Colored polyethylene microspheres are used for fluid flow visualization to enable observation and characterization |
https://en.wikipedia.org/wiki/Single%20instruction%2C%20multiple%20threads | Single instruction, multiple threads (SIMT) is an execution model used in parallel computing where single instruction, multiple data (SIMD) is combined with multithreading. It is different from SPMD in that all instructions in all "threads" are executed in lock-step. The SIMT execution model has been implemented on several GPUs and is relevant for general-purpose computing on graphics processing units (GPGPU), e.g. some supercomputers combine CPUs with GPUs.
The processors, say a number p of them, seem to execute many more than p tasks. This is achieved by each processor having multiple "threads" (or "work-items" or "Sequence of SIMD Lane operations"), which execute in lock-step, and are analogous to SIMD lanes.
The simplest way to understand SIMT is to imagine a multi-core system in which each core has its own register file, its own ALUs (both SIMD and scalar) and its own data cache. Unlike a standard multi-core system, which has multiple independent instruction caches and decoders as well as multiple independent Program Counter registers, the instructions are synchronously broadcast to all SIMT cores from a single unit with a single instruction cache and a single instruction decoder, which reads instructions using a single Program Counter.
The key difference between SIMT and SIMD lanes is that each of the SIMT cores may have a completely different Stack Pointer (and thus perform computations on completely different data sets), whereas SIMD lanes are simply part of an ALU that knows nothing about memory per se.
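The lock-step broadcast described above can be caricatured in a few lines of Python (a conceptual sketch, not any vendor's execution model; the names `simt_execute`, `r0`, `r1` are invented for illustration): one shared instruction stream and program counter, while each lane keeps its own private register state.

```python
# Conceptual sketch of lock-step SIMT execution (not any vendor's API):
# one shared instruction stream and program counter, while each lane
# keeps its own private register file.

def simt_execute(program, n_lanes):
    # Each lane has its own registers; the program counter (the
    # position in `program`) is shared by all lanes.
    lanes = [{"r0": lane_id, "r1": 0} for lane_id in range(n_lanes)]
    for instruction in program:      # single shared instruction fetch
        for regs in lanes:           # broadcast to every lane in lock-step
            instruction(regs)
    return lanes

# A tiny "program": every lane squares its lane id, then adds 1.
program = [
    lambda r: r.update(r1=r["r0"] * r["r0"]),
    lambda r: r.update(r1=r["r1"] + 1),
]

print([lane["r1"] for lane in simt_execute(program, n_lanes=4)])  # -> [1, 2, 5, 10]
```

Each lane computed a different result from the same instruction sequence, which is the essence of the model; real hardware adds divergence masking, scheduling and memory coalescing on top.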
History
SIMT was introduced by Nvidia in the Tesla GPU microarchitecture with the G80 chip. ATI Technologies, now AMD, released a competing product slightly later on May 14, 2007, the TeraScale 1-based "R600" GPU chip.
Description
As access time of all the widespread RAM types (e.g. DDR SDRAM, GDDR SDRAM, XDR DRAM, etc.) is still relatively high, engineers came up with the idea to hide the latency that inevitably comes with each memory access. St |
https://en.wikipedia.org/wiki/Orchid%20Graphics%20Adapter | The Orchid Graphics Adapter is a graphics board for IBM PC compatible computers, released in 1982 by Orchid Technology.
It was intended to provide high-resolution (at the time) monochrome graphics capabilities to computers limited to text displays. It was aimed at the business market and was one of the first three third-party graphics boards for PCs (the others being the Plantronics Colorplus and the Hercules Graphics Card).
It offered a monochrome 720 × 350 pixel resolution and required an existing MDA board to function. The board also offered an IBM PC joystick adapter. No software, other than GSX-86 and that supplied with the board (Dr. Halo by Media Cybernetics), offered support for the hardware. Graphic routines could be called from FORTRAN, PASCAL or IBM BASIC.
Output capabilities
720 × 350 monochrome graphics, pixel aspect ratio of 1:1.55.
See also
Plantronics Colorplus
Hercules Graphics Card
IBM Monochrome Display Adapter
References
Computer display standards
Graphics cards
Monochrome Display Adapter
Computer-related introductions in 1982 |
https://en.wikipedia.org/wiki/PLINK%20%28genetic%20tool-set%29 | PLINK is a free, commonly used, open-source whole-genome association analysis toolset designed by Shaun Purcell. The software is designed flexibly to perform a wide range of basic, large-scale genetic analyses.
PLINK currently supports the following functionalities:
data management;
basic statistics (FST, missing data, tests of Hardy–Weinberg equilibrium, inbreeding coefficient, etc.);
Linkage disequilibrium (LD) calculation;
Identity by descent (IBD) and identity by state (IBS) matrix calculation;
population stratification analysis, such as principal component analysis;
association analysis such as genome-wide association study for both basic case/control studies and quantitative traits;
tests for epistasis
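The Hardy–Weinberg test in the list above can be illustrated with a small chi-square sketch. This is the textbook goodness-of-fit calculation, not PLINK's own implementation (PLINK also offers an exact test); the function name is invented for illustration.

```python
# Textbook chi-square goodness-of-fit check for Hardy-Weinberg
# equilibrium at one biallelic site, from observed genotype counts
# (AA, Aa, aa). Illustrative only, not PLINK's implementation.

def hwe_chi_square(n_AA, n_Aa, n_aa):
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)          # frequency of allele A
    q = 1 - p
    expected = (p * p * n, 2 * p * q * n, q * q * n)
    observed = (n_AA, n_Aa, n_aa)
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Counts in perfect Hardy-Weinberg proportions give a statistic of 0;
# with 1 degree of freedom, values above ~3.84 reject HWE at the 5% level.
print(hwe_chi_square(25, 50, 25))            # -> 0.0
print(round(hwe_chi_square(50, 21, 29), 2))  # a strong departure from HWE
```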
Input and output files
PLINK has its own format of text files (.ped) and binary files (.bed) that serve as input files for most analyses. A .map file accompanies a .ped file and provides information about variants, while .bim and .fam files accompany .bed files as part of the binary dataset. Additionally, PLINK accepts inputs of VCF, BCF, Oxford, and 23andMe files, which are typically converted into the binary .bed format prior to performing desired analyses. With certain formats such as VCF, some information such as phase and dosage will be discarded.
PLINK has a variety of output files depending on the analysis. PLINK has the ability to output files for BEAGLE and can recode a .bed file into a VCF for analyses in other programs. Additionally, PLINK is designed to work in conjunction with R, and can output files to be processed by certain R packages.
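As a concrete illustration of the text formats, a minimal reader for the six-column .fam sample file (family ID, within-family individual ID, paternal ID, maternal ID, sex code, phenotype) might look like the following sketch; the function name is invented and error handling is omitted:

```python
import io

# Minimal reader for PLINK's whitespace-delimited, six-column .fam
# sample file: family ID, within-family ID, paternal ID, maternal ID,
# sex code (1 = male, 2 = female, 0 = unknown), phenotype.
# Illustrative sketch only; no error handling.

def read_fam(handle):
    samples = []
    for line in handle:
        fid, iid, pat, mat, sex, pheno = line.split()
        samples.append({"fid": fid, "iid": iid, "father": pat,
                        "mother": mat, "sex": int(sex),
                        "phenotype": pheno})
    return samples

example = io.StringIO("FAM1 ID1 0 0 1 2\nFAM1 ID2 0 0 2 1\n")
samples = read_fam(example)
print(len(samples), samples[0]["iid"])  # -> 2 ID1
```

In practice the binary .bed genotype matrix is read alongside the .fam and .bim files, with each .fam row indexing one sample.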
Extensions and current developments
PLINK 2.0 is a comprehensive update to PLINK, developed by Christopher Chang, with improved speed for various genome-wide association (GWA) calculations, including identity-by-state (IBS) matrix calculation, LD-based pruning and association analysis.
PLINK/SEQ is an open-source C/C++ library designed for analyzing large-scale whole-genome and whole-exome studies.
MQFAM is a |
https://en.wikipedia.org/wiki/Thread%20%28network%20protocol%29 | Thread is an IPv6-based, low-power mesh networking technology for Internet of things (IoT) products. The Thread protocol specification is available at no cost; however, this requires agreement and continued adherence to an End-User License Agreement (EULA), which states that "Membership in Thread Group is necessary to implement, practice, and ship Thread technology and Thread Group specifications."
Thread uses 6LoWPAN, which, in turn, uses the IEEE 802.15.4 wireless protocol with mesh communication (on the 2.4 GHz spectrum), as do Zigbee and other systems. However, Thread is IP-addressable, with cloud access and AES encryption. A BSD-licensed open-source implementation of Thread, called "OpenThread", is available from and managed by Google.
In 2019, the Connected Home over IP project (later renamed "Matter"), led by Zigbee Alliance (now Connectivity Standards Alliance), Google, Amazon, and Apple, announced a broad collaboration to create a royalty-free standard and open-source code base to promote interoperability in home connectivity, leveraging Thread, Wi-Fi, and Bluetooth Low Energy. In 2021, Thread was awarded the Smart Home Innovation of the Year from The Ambient's Smart Home Awards.
Thread Group
In July 2014, the Thread Group alliance was formed as an industry group to develop, maintain and drive adoption of Thread as an industry networking standard for IoT applications. Thread Group provides certification for components and products to ensure adherence to the spec. Initial members were ARM Holdings, Big Ass Solutions, NXP Semiconductors/Freescale, Google-subsidiary Nest Labs, OSRAM, Samsung, Silicon Labs, Somfy, Tyco International, Qualcomm, and the Yale lock company. In August 2018 Apple Inc. joined the group and released its first Thread product, the HomePod Mini, in late 2020.
Selling points and key features
Thread is a low-power and low-latency wireless mesh networking protocol built using open and proven standards. It uses 6LoWPAN, which is based |
https://en.wikipedia.org/wiki/Journal%20of%20Non-Equilibrium%20Thermodynamics | The Journal of Non-Equilibrium Thermodynamics is a quarterly peer-reviewed scientific journal covering the field of non-equilibrium thermodynamics. It was established in 1976 by Jurgen Keller and its current editor-in-chief is Karl-Heinz Hoffmann (Chemnitz University of Technology).
Abstracting and indexing
The journal is abstracted and indexed in:
According to the Journal Citation Reports, the journal has a 2021 impact factor of 4.290.
References
External links
De Gruyter academic journals
Quarterly journals
English-language journals
Academic journals established in 1976
Engineering journals
Physical chemistry journals
Thermodynamics |
https://en.wikipedia.org/wiki/Honeywell%20Aerospace%2C%20Cambridge | COM DEV International was a satellite technology, space sciences, and telecommunications company based in Cambridge, Ontario, Canada. The company had branches and offices in Ottawa, the United States, the United Kingdom, China and India.
COM DEV developed and manufactured specialized satellite systems, including microwave systems, switches, optical systems, specialized satellite antennas, as well as components for the aviation and aerospace industry. COM DEV also produced custom equipment designs for commercial, military and civilian purposes, as well as providing contract research for the space sciences.
History
COM DEV International was founded in 1974 and specialized in microwave technology for the aviation and aerospace industry. The company would go on to become a leader in space satellite componentry and hardware, specializing in telecommunication systems; a global designer and builder of telecommunication components and systems for space satellites; as well as one of Canada's largest sources of spacecraft instrumentation.
In 2001, its space products division opened an approximate $7-million Surface Acoustic Wave (SAW) development and manufacturing laboratory in its Cambridge facility.
In 2005, it purchased the EMS Technologies Space Science optical division in Ottawa, formerly CAL Corporation, from MacDonald, Dettwiler and Associates for $5 million.
In 2007, it purchased a Passive Microwave division in El Segundo, California, for $8.75 million. In 2010 it purchased Ottawa-based space instrument supplier Routes AstroEngineering for $1.7 million. Later that year, it established a subsidiary called exactEarth offering global ship tracking data services. In 2015, it purchased MESL Microwave of Edinburgh, Scotland. Also that year, it entered the waveguide market with the purchase of Pacific Wave Systems (PWS) of Garden Grove, California.
On November 15, 2015, Honeywell announced that it would acquire COM DEV, which would become part of Honeywell's Defense |
https://en.wikipedia.org/wiki/American%20Sports%20Network | American Sports Network (ASN) was a sports brand owned by the U.S. television station owner Sinclair Broadcast Group through its Sinclair Networks subsidiary. Formed in July 2014, the multicast network component of ASN produced broadcasts of sporting events that were aired primarily across stations owned by Sinclair (in particular, The CW and MyNetworkTV stations owned and/or operated by the company, or, in some markets, on a digital subchannel of a Sinclair station), and syndicated to non-Sinclair stations and regional sports networks.
The multicast network component of ASN primarily dealt in college sports from NCAA Division I conferences, including live football and basketball games from the Atlantic 10 Conference, Big South Conference, Colonial Athletic Association, Conference USA, Horizon League, Ivy League, Mid-American Conference, Ohio Valley Conference, Patriot League, Southern Conference, Southland Conference, and Western Athletic Conference, as well as a limited number of professional sports events. In 2015, ASN acquired regional rights to Real Salt Lake and D.C. United of Major League Soccer, with games aired on Sinclair stations in the teams' market area, as well as television rights to the newly established Arizona Bowl.
In 2017, Sinclair announced that it would fold the multicast network component of ASN into a new joint venture with Silver Chalice called Stadium, which would combine ASN's broadcast distribution platforms with content from Silver Chalice's digital outlets 120 Sports and Campus Insiders. ASN-branded multicast programming continued on-air until September 6, when the network formally transitioned on-air to Stadium.
History
Sinclair Broadcast Group formally announced the launch of the American Sports Network on July 17, 2014; the service was led by Doron Gorshein, who joined the company in January 2014 in the role of chief operating officer of Sinclair Networks. ASN carried live broadcasts of mainly collegiate sporting events, along wit |
https://en.wikipedia.org/wiki/Transmission%20gate | A transmission gate (TG) is an analog gate, similar to a relay, that can conduct in both directions or block by a control signal with almost any voltage potential. It is a CMOS-based switch, in which the PMOS passes a strong 1 but a poor 0, and the NMOS passes a strong 0 but a poor 1. Both the PMOS and the NMOS work simultaneously.
Structure
In principle, a transmission gate is made up of two field-effect transistors (FETs), in which – in contrast to traditional discrete field-effect transistors – the substrate terminal (bulk) is not connected internally to the source terminal. The two transistors, an n-channel MOSFET and a p-channel MOSFET, are connected in parallel with the drain and source terminals of the two transistors connected together. Their gate terminals are connected to each other by a NOT gate (inverter), to form the control terminal.
Unlike with discrete FETs, the substrate terminal is not connected to the source connection. Instead, the substrate terminals are connected to the respective supply potentials in order to ensure that the parasitic substrate diode (between source/drain and substrate) is always reverse-biased and so does not affect signal flow. The substrate terminal of the p-channel MOSFET is thus connected to the positive supply potential, and the substrate terminal of the n-channel MOSFET is connected to the negative supply potential.
Function
When the control input is a logic zero (negative power supply potential), the gate of the n-channel MOSFET is also at the negative supply voltage potential, and the inverter drives the gate terminal of the p-channel MOSFET to the positive supply voltage potential. Regardless of which switching terminal of the transmission gate (A or B) a voltage is applied to (within the permissible range), the gate-source voltage of the n-channel MOSFET is always negative and that of the p-channel MOSFET is always positive. Accordingly, neither of the two transistors conducts and the transmission gate turns off.
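The switching behaviour can be summarised as a toy logic-level model (a sketch for intuition, not a circuit simulator; the function name is hypothetical):

```python
# Toy logic-level model of a CMOS transmission gate's switching
# behaviour (a sketch for intuition, not a circuit simulator).

def transmission_gate(signal, control):
    if control == 0:
        # Off state: neither transistor conducts, so terminals A and B
        # are isolated from each other (high impedance).
        return "Z"
    # On state: the NMOS passes the strong 0 and the PMOS passes the
    # strong 1, so the parallel pair transmits either level cleanly.
    return signal

print(transmission_gate(1, 1), transmission_gate(0, 1))  # -> 1 0
print(transmission_gate(1, 0))                           # -> Z
```

The model deliberately hides the analog detail (degraded levels, on-resistance, charge injection) that the surrounding text explains at the transistor level.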
When the control inpu |
https://en.wikipedia.org/wiki/Honeywell%20Level%206 | The Honeywell Level 6 was a line of 16-bit minicomputers, later upgraded to 32-bit, manufactured by Honeywell, Inc. from the mid-1970s. Honeywell literature for Models 6/06, 6/34 and 6/36 says "Series 60 (Level 6)". In 1979 the Level 6 was renamed the DPS 6, subsequently DPS 6 Plus and finally DPS 6000.
Description
As initially introduced the Level 6 consisted of three models: the 6/06, the 6/34, and the 6/36. The CPU featured a real-time clock, a ROM bootstrap loader and 64 interrupt levels. The architecture provided a variety of addressing modes and 18 programmer-visible registers. Rack-mount and tabletop versions were available.
These systems supported up to 64 K words (KW) of MOS memory with a cycle time of 650 nanoseconds.
All three models featured the Megabus, a proprietary asynchronous bus architecture.
By 1978 the line had been extended downwards with the introduction of the 6/23 and 6/33, and upwards with the 6/43, 6/47, 6/53, and 6/57. The 6/23 did not support the Megabus. The 6/33 was the new entry-level upgradable model. The other four models supported up to 1 MW (Mega Words) of memory and 26 registers. A memory management unit (MMU), optional on the 6/43 and 6/47, and standard on the 6/53 and 6/57, supported memory segmentation and four protection rings. An optional Scientific Instruction Processor (SIP) added single- and double-precision hardware floating-point instructions. The 6/47 and 6/57 were enhanced versions of the 6/43 and 6/53 respectively which added a Commercial Instruction Processor (CIP) including 30 additional instructions for character-string manipulation and decimal arithmetic. Among the final developments in the line were the high-end 32-bit 6/95-1, 6/98-1 and dual processor 6/95-2 and 6/98-2 models.
In the 1980s, Honeywell's Datanet 8 line of communications processors, often used as front-end processors for DPS 8 mainframes, shared many hardware components with DPS 6. Another specialised derivative of the Level |
https://en.wikipedia.org/wiki/AXIOM%20%28camera%29 | AXIOM is an open hardware and free software digital cinema camera family of devices being developed by a DIY community around the apertus° project.
The community’s second generation camera, AXIOM Beta Compact, is presently in development.
History
In 2006 Oscar Spierenburg, a Dutch film director, noticed a discussion taking place on DVInfo.net entitled “3 channel 36 bit 1280 X 720 low $ camera”, in which Elphel cameras, which are typically used in scientific applications, were mentioned.
In the same year a discussion thread entitled "High Definition with Elphel model 333 camera" was posted on the DVInfo.net forum, whereupon the forum’s members discussed how best to adapt Elphel open hardware camera devices for use in film production. Sebastian Pichelhofer discovered this thread in 2008 and assisted with the project by developing an Elphel camera internal hard-disk recorder user interface.
By early 2009, and because over the course of three years upwards of 1000 posts had been submitted to this thread, the community realised that it was going to be difficult to maintain a full overview of the project in this way, and consequently a dedicated website was established. All of the decision making and naming/logo design was decided upon by the community.
After having done some contracting work with Elphel, Pichelhofer focused full-time on the project across 2011 and in July 2012 the plan to create an AXIOM camera hardware prototype from scratch, and thereby overcome some of the limitations that were found to be inherent with Elphel hardware at the time (mainly due to Elphel Inc. having shifted the company's core business focus towards development of a panoramic camera solution), was announced at the Libre Software Meeting in Geneva. This prototype became known as AXIOM Alpha and was intended to gather feedback from typical shooting scenarios with a view to incorporating ideas into a future, more modular, kit version of the camera aimed at developers and early |
https://en.wikipedia.org/wiki/Myhomepage | Myhomepage.com, founded in 2009, was a patented personalised homepage service with synchronised bookmarking and password manager software.
Background
The URL was initially listed for $250,000 by BuyDomains.com but later sold for $50,000 by Tucows. In 2010 myhomepage.com became a Microsoft Certified Partner, built on a .NET SQL CLR development platform with an HTML/AJAX front end, developed in collaboration with Axosoft. The company subsequently raised $1.2 million through German Startups Group Berlin AG. Myhomepage Ltd. was incorporated at Companies House in 2009 by its two founding entrepreneurs, Max Aengevelt and Massimo 'Max' Agostinelli. In 2010 both Aengevelt and Agostinelli were interviewed in Tavria-V by OK TV.
Tucows Inc.
In 2009 Domain Name Wire, a leading news site covering the domain name industry, reported that Tucows' subsidiary BuyDomains had sold the domain name myhomepage.com for $50,000. "The buyer appears to be in Germany and the page currently resolves to WhyPark's name servers", wrote Andrew Allemann. "This will be one to watch, as the buyer clearly has plans for it." According to the 'DN Journal Top 20', the domain name was ranked #3 by GoDaddy's Afternic DLS. The URL myhomepage.com is ranked among the global index of top-level domains (TLDs). Myhomepage had trust-seal certificates from Verisign, McAfee and TRUSTe linked to the domain name registrar, including Yourhomepage.com, in line with the Domain Name System Security Extensions (DNSSEC).
See also
German Startups Group Berlin AG
iGoogle
Netvibes
Pageflakes
Tucows
References
News aggregators
Web portals
Internet properties established in 2009
2009 establishments in Germany
Cross-platform software
Computer-related introductions in 2009
2009 software |
https://en.wikipedia.org/wiki/Gene%20drive | A gene drive is a natural process and technology of genetic engineering that propagates a particular suite of genes throughout a population by altering the probability that a specific allele will be transmitted to offspring (instead of the Mendelian 50% probability). Gene drives can arise through a variety of mechanisms. They have been proposed to provide an effective means of genetically modifying specific populations and entire species.
The technique can employ adding, deleting, disrupting, or modifying genes.
Proposed applications include exterminating insects that carry pathogens (notably mosquitoes that transmit malaria, dengue, and zika pathogens), controlling invasive species, or eliminating herbicide or pesticide resistance.
As with any potentially powerful technique, gene drives can be misused in a variety of ways or induce unintended consequences. For example, a gene drive intended to affect only a local population might spread across an entire species. Gene drives that eradicate populations of invasive species in their non-native habitats may have consequences for the population of the species as a whole, even in its native habitat. Any accidental return of individuals of the species to its original habitats, through natural migration, environmental disruption (storms, floods, etc.), accidental human transportation, or purposeful relocation, could unintentionally drive the species to extinction if the relocated individuals carried harmful gene drives.
Gene drives can be built from many naturally occurring selfish genetic elements that use a variety of molecular mechanisms. These naturally occurring mechanisms induce similar segregation distortion in the wild, arising when alleles evolve molecular mechanisms that give them a transmission chance greater than the normal 50%.
Most gene drives have been developed in insects, notably mosquitoes, as a way to control insect-borne pathogens. Recent developments designed gene drives directly in viruses, notabl |
https://en.wikipedia.org/wiki/Primitive%20recursive%20set%20function | In mathematics, primitive recursive set functions or primitive recursive ordinal functions are analogs of primitive recursive functions, defined for sets or ordinals rather than natural numbers. They were introduced by .
Definition
A primitive recursive set function is a function from sets to sets that can be obtained from the following basic functions by repeatedly applying the following rules of substitution and recursion:
The basic functions are:
Projection: Pn,m(x1, ..., xn) = xm for 1 ≤ m ≤ n
Zero: F(x) = 0
Adjoining an element to a set: F(x, y) = x ∪ {y}
Testing membership: C(x, y, u, v) = x if u ∈ v, and C(x, y, u, v) = y otherwise.
The rules for generating new functions by substitution are
F(x, y) = G(x, H(x), y)
F(x, y) = G(H(x), y)
where x and y are finite sequences of variables.
The rule for generating new functions by recursion is
F(z, x) = G(∪u∈z F(u, x), z, x)
A primitive recursive ordinal function is defined in the same way, except that the initial function F(x, y) = x ∪ {y} is replaced by F(x) = x ∪ {x} (the successor of x). The primitive recursive ordinal functions are the same as the primitive recursive set functions that map ordinals to ordinals.
Examples of primitive recursive set functions:
TC, the function assigning to a set its transitive closure.
Given a hereditarily finite set y, the constant function with value y.
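As an illustration of the recursion rule, the transitive closure can be obtained from the schema F(z, x) = G(⋃u∈z F(u, x), z, x) by taking G(s, z) = z ∪ s (set union itself being primitive recursive), which gives:

```latex
% TC as an instance of the recursion schema F(z) = G(\bigcup_{u\in z} F(u),\, z)
% with G(s, z) = z \cup s:
\mathrm{TC}(z) \;=\; z \,\cup\, \bigcup_{u \in z} \mathrm{TC}(u)
```

That is, the transitive closure of z consists of the elements of z together with the transitive closures of those elements.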
Extensions
One can also add more initial functions to obtain a larger class of functions. For example, the ordinal function is not primitive recursive, because the constant function with value ω (or any other infinite set) is not primitive recursive, so one might want to add this constant function to the initial functions.
The notion of a set function being primitive recursive in ω has the same definition as that of primitive recursion, except with ω as a parameter kept fixed, not altered by the primitive recursion schemata.
Examples of functions primitive recursive in ω:
.
The function assigning to the th level of Gödel's c |
https://en.wikipedia.org/wiki/Elbrus-8S | The Elbrus-8S () is a Russian 28 nanometer 8-core microprocessor developed by Moscow Center of SPARC Technologies (MCST). The first prototypes were produced by the end of 2014 and serial production started in 2016. The Elbrus-8S is to be used in servers and workstations. The processor's architecture allows support of up to 32 processors on a single server motherboard.
In 2018 MCST announced plans to produce the Elbrus-8SV, an upgraded version of the 8S with double the performance. The CPU delivers up to 576 GFLOPS at a frequency of 1.5 GHz and supports DDR4 instead of DDR3. Engineering samples were completed in Q3 2017. Development was completed in 2019 and fabrication started in 2020.
In 2021 the processor was offered to Sberbank, Russia's largest bank, for evaluation in light of a potential use for some of the company's hardware needs. The evaluation had a negative outcome, as the functional requirements were not met.
A 2023 benchmark showed that the Elbrus-8SV performed moderately well in games that were ten years old but was incompatible with the modern games tested.
The successor, Elbrus-16C, was announced in 2020, with manufacturing declared to start in October 2021, but as of 2023 it had not yet entered the market.
Supported operating systems
The Elbrus-8S and -SV processors support binary compatibility with Intel x86 and x86-64 processors via runtime binary translation. The documentation suggests that the processors can run Windows XP and Windows 7. The processors can also run a Linux kernel based OS compiled for Elbrus.
Elbrus Elbrus-8S information
Elbrus Elbrus-8SV information
References
External links
Official MCST announcements
Data provided by MCST
Very long instruction word computing
X86 microprocessors
VLIW microprocessors |
https://en.wikipedia.org/wiki/Southern%20celestial%20hemisphere | The southern celestial hemisphere, also called the Southern Sky, is the southern half of the celestial sphere; that is, it lies south of the celestial equator. This arbitrary sphere, on which seemingly fixed stars form constellations, appears to rotate westward around a polar axis as the Earth rotates.
At all times, the entire Southern Sky is visible from the geographic South Pole; less of the Southern Sky is visible the further north the observer is located. The northern counterpart is the northern celestial hemisphere.
Astronomy
In the context of astronomical discussions or writing about celestial mapping, it may also simply then be referred to as the Southern Hemisphere.
For the purpose of celestial mapping, astronomers consider the sky as the inside of a sphere divided into two halves by the celestial equator. The Southern Sky, or southern celestial hemisphere, is therefore that half of the celestial sphere that lies south of the celestial equator. Although the celestial equator is the ideal projection of the terrestrial equator onto the imaginary celestial sphere, the northern and southern celestial hemispheres must not be confused with the terrestrial hemispheres of Earth itself.
Observation
From the South Pole, in good visibility conditions, the Southern Sky features over 2,000 fixed stars that are easily visible to the naked eye, and some 20,000 to 40,000 with optical aid. In large cities, about 300 to 500 stars can be seen, depending on the extent of light and air pollution. The farther north the observer, the fewer stars are visible.
The brightest stars are all larger than the Sun. Sirius, in the constellation Canis Major, has the brightest apparent magnitude, −1.46; it has a radius twice that of the Sun and is 8.6 light-years away. Canopus and the nearest fixed star, α Centauri, 4.2 light-years away, are also located in the Southern Sky, having declinations around −60°, too close to the south celestial pole for either to be visible from Central Eu |
https://en.wikipedia.org/wiki/Demolition | Demolition (also known as razing, cartage, and wrecking) is the science and engineering of safely and efficiently tearing down buildings and other artificial structures. Demolition contrasts with deconstruction, which involves taking a building apart while carefully preserving valuable elements for reuse.
For small buildings, such as houses, that are only two or three stories high, demolition is a rather simple process. The building is pulled down either manually or mechanically using large hydraulic equipment: elevated work platforms, cranes, excavators or bulldozers. Larger buildings may require the use of a wrecking ball, a heavy weight on a cable that is swung by a crane into the side of the buildings. Wrecking balls are especially effective against masonry, but are less easily controlled and often less efficient than other methods. Newer methods may use rotational hydraulic shears and silenced rock-breakers attached to excavators to cut or break through wood, steel, and concrete. The use of shears is especially common when flame cutting would be dangerous.
The tallest planned demolition of a building was the 52-storey 270 Park Avenue in New York City, which was built in 1960 and torn down in 2019–2021 to be replaced by a new skyscraper at the same address.
Manual
Before any demolition activities can take place, there are many steps that must be carried out beforehand, including performing asbestos abatement, removing hazardous or regulated materials, obtaining necessary permits, submitting necessary notifications, disconnecting utilities, rodent baiting and the development of site-specific safety and work plans.
The typical razing of a building is accomplished as follows:
Hydraulic excavators may be used to topple one- or two-story buildings by an undermining process. The strategy is to undermine the building while controlling the manner and direction in which it falls.
The demolition project manager/supervisor will determine where undermining is necessary so tha |
https://en.wikipedia.org/wiki/Knapsack%20cryptosystems | Knapsack cryptosystems are cryptosystems whose security is based on the hardness of solving the knapsack problem. They remain unpopular because simple versions of these algorithms were broken decades ago. However, this type of cryptosystem is a good candidate for post-quantum cryptography.
The most famous knapsack cryptosystem is the Merkle–Hellman public key cryptosystem, one of the first public key cryptosystems, published the same year as the RSA cryptosystem. However, this system has been broken by several attacks: one by Shamir, one by Adleman, and the low-density attack.
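To make the knapsack mechanism concrete, here is a minimal Python sketch of the (long-broken) Merkle–Hellman scheme. The function names and the parameter values (the superincreasing sequence w, modulus q, and multiplier r) are illustrative choices for readability, not taken from any standard, and the scheme itself is insecure:

```python
# Toy sketch of the Merkle-Hellman knapsack cryptosystem.
# Parameters are illustrative only; this construction is broken and
# must never be used for real security.

def make_keys():
    w = [2, 7, 11, 21, 42, 89, 180, 354]  # private superincreasing sequence
    q = 881                               # modulus, q > sum(w)
    r = 588                               # multiplier with gcd(r, q) = 1
    public = [(r * wi) % q for wi in w]   # disguised "hard" knapsack, published
    return (w, q, r), public

def encrypt(bits, public):
    # Ciphertext is the subset sum of public elements selected by the bits.
    return sum(b for bit, b in zip(bits, public) if bit)

def decrypt(ciphertext, private):
    w, q, r = private
    # Undo the modular disguise, reducing to an easy superincreasing knapsack.
    target = (ciphertext * pow(r, -1, q)) % q
    bits = []
    for wi in reversed(w):                # greedy solve, largest element first
        if wi <= target:
            bits.append(1)
            target -= wi
        else:
            bits.append(0)
    return list(reversed(bits))
```

`pow(r, -1, q)` (Python 3.8+) computes the modular inverse of r; the greedy loop is correct because each element of a superincreasing sequence exceeds the sum of all smaller elements. Shamir's attack recovers an equivalent private key from the public sequence alone.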
However, there exist modern knapsack cryptosystems that are considered secure so far: among them is Nasako-Murakami 2006.
Knapsack cryptosystems, when not subject to classical cryptanalysis, are believed to be difficult even for quantum computers. That is not the case for systems that rely on factoring large integers, like RSA, or on computing discrete logarithms, like ECDSA; those problems can be solved in polynomial time with Shor's algorithm.
References
Bibliography
Cryptography |
https://en.wikipedia.org/wiki/Euclid%E2%80%93Euler%20theorem | The Euclid–Euler theorem is a theorem in number theory that relates perfect numbers to Mersenne primes. It states that an even number is perfect if and only if it has the form 2^(p−1)(2^p − 1), where 2^p − 1 is a prime number. The theorem is named after mathematicians Euclid and Leonhard Euler, who respectively proved the "if" and "only if" aspects of the theorem.
It has been conjectured that there are infinitely many Mersenne primes. Although the truth of this conjecture remains unknown, it is equivalent, by the Euclid–Euler theorem, to the conjecture that there are infinitely many even perfect numbers. However, it is also unknown whether there exists even a single odd perfect number.
Statement and examples
A perfect number is a natural number that equals the sum of its proper divisors, the numbers that are less than it and divide it evenly (with remainder zero). For instance, the proper divisors of 6 are 1, 2, and 3, which sum to 6, so 6 is perfect.
A Mersenne prime is a prime number of the form 2^p − 1, one less than a power of two. For a number of this form to be prime, p itself must also be prime, but not all primes give rise to Mersenne primes in this way. For instance, 2^3 − 1 = 7 is a Mersenne prime, but 2^11 − 1 = 2047 = 23 × 89 is not.
The Euclid–Euler theorem states that an even natural number is perfect if and only if it has the form 2^(p−1)(2^p − 1), where 2^p − 1 is a Mersenne prime. The perfect number 6 comes from p = 2 in this way, as 2^1 × (2^2 − 1) = 2 × 3 = 6, and the Mersenne prime 7 corresponds in the same way to the perfect number 28 = 2^2 × (2^3 − 1).
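The correspondence can be checked numerically for small exponents; the following brute-force Python sketch (function names are mine, and the trial-division primality test is deliberately naive) generates even perfect numbers from Mersenne primes and verifies them against the proper-divisor definition:

```python
# Numeric check of the Euclid-Euler correspondence: whenever 2**p - 1 is
# prime (a Mersenne prime), the number 2**(p-1) * (2**p - 1) is perfect.

def is_prime(n):
    # Naive trial division; fine for the small numbers used here.
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def proper_divisor_sum(n):
    # Sum of the divisors of n that are strictly less than n.
    return sum(d for d in range(1, n) if n % d == 0)

def even_perfect_numbers(limit_p):
    # Yield even perfect numbers arising from primes p <= limit_p.
    for p in range(2, limit_p + 1):
        if is_prime(p) and is_prime(2**p - 1):
            yield 2**(p - 1) * (2**p - 1)
```

For p up to 7 this produces 6, 28, 496, and 8128, each of which equals the sum of its proper divisors.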
History
Euclid proved that 2^(p−1)(2^p − 1) is an even perfect number whenever 2^p − 1 is prime. This is the final result on number theory in Euclid's Elements; the later books in the Elements instead concern irrational numbers, solid geometry, and the golden ratio. Euclid expresses the result by stating that if a finite geometric series beginning at 1 with ratio 2 has a prime sum, then this sum multiplied by the last term in the series is perfect. Expressed in these terms, the sum of the finite series is the Mersenne prime and the last term in |
https://en.wikipedia.org/wiki/Bioinformatics%20Open%20Source%20Conference | The Bioinformatics Open Source Conference (BOSC) is an academic conference on open-source programming and other open science practices in bioinformatics, organised by the Open Bioinformatics Foundation. The conference has been held annually since 2000 and is run as a two-day meeting, either within the Intelligent Systems for Molecular Biology (ISMB) conference or as a joint conference with the Galaxy community.
Program
The conference is held as a single track consisting of presentations, poster sessions and two keynote talks by people of influence in open-source bioinformatics.
Since 2010, an informal two-day "CollaborationFest" (formerly Codefest) has been held directly preceding the conference.
History
National Institutes of Health Associate Director for Data Science Philip Bourne and C. Titus Brown gave keynote talks at BOSC 2014.
BOSC 2016 was organized in Orlando, Florida from July 8–9 before the main ISMB conference.
In 2018 and 2020, BOSC partnered with Galaxy to organize two joint conferences, called GCCBOSC and the Bioinformatics Community Conference (BCC) respectively. The 2018 event was held in Portland, Oregon. The BCC in 2020 took place online, with two schedules for eastern and western time zones.
Since 2021, BOSC has been taking place within the ISMB conferences again. In 2023 BOSC will take place in Lyon, France, on 24–28 July as part of the ISMB/ECCB conference.
Past conferences
As of November 2022, 23 BOSC meetings have been held around the world; 20 were purely in-person conferences, two were purely remote due to the COVID-19 pandemic, and one was organized as a hybrid meeting.
References
Computational science
Bioinformatics
Biology conferences |
https://en.wikipedia.org/wiki/Prem%20%28food%29 | Prem is a brand of canned meat similar to Spam first introduced in 1939 by the original Swift & Company in the United States. The brand is currently owned by Zwanenberg Food Group USA. In Canada, Prem continues to be sold under the Swift umbrella brand and both trademarks are currently owned by Maple Leaf Foods. Prem was shipped to England during World War II.
See also
Potted meat
Treet
References
Brand name meats
Canned food
Canned meat
Food processing |
https://en.wikipedia.org/wiki/Robert%20E.%20Collin | Robert Emmanuel Collin (24 October 1928 – 29 November 2010) was a Canadian American electrical engineer, university professor and life fellow of the IEEE. Collin was elected to the National Academy of Engineering in 1990.
Biography
Collin was born on 24 October 1928 in the small town of Donalda, Alberta, Canada. He received an undergraduate degree from the University of Saskatchewan and a PhD in electrical engineering from University of London (Imperial College). He worked at the Canadian Armament and Research Development Establishment on guided missile antennas, radomes and radar system evaluations.
Collin taught at Case Western Reserve University between 1958 and 1997. He served stints as the electrical engineering department chair and as interim dean of engineering. He was a distinguished visiting professor at Ohio State University and was a visiting professor at universities in Brazil, China and Germany.
He made significant contributions to the field of microwaves. He is widely known for his textbooks on electromagnetic waves, microwave engineering and antennas. He was a life fellow of the IEEE. Among his students, Collin was viewed as remarkable for his ability to recount the uttermost details of lengthy mathematical proofs from memory. He was an outstanding scholar of microwave and radar engineering and relativistic electrodynamics based on tensor calculus. During the Korean War era, Dr. Collin achieved many important engineering breakthroughs for His Majesty's and Her Majesty's governments.
Publications
Books
Collin RE, Antennas and Radiowave Propagation, McGraw-Hill, 1985.
Collin RE, Field Theory of Guided Waves, 2nd ed, IEEE, 1991.
Collin RE, Foundations for Microwave Engineering, 2nd ed, IEEE, 2001.
Collin RE, Zucker FJ, eds, Antenna Theory, 2 vols, McGraw-Hill, 1969.
Hansen RC, Collin RE, Small Antenna Handbook, Wiley, 2011.
Plonsey R; Collin RE, Principles and Applications of Electromagnetic Fields, McGraw-Hill, 1961.
Articles
Collin RE, "A Sim |
https://en.wikipedia.org/wiki/Jeffrey%20Skolnick | Jeffrey Skolnick is an American computational biologist. He is currently a Georgia Institute of Technology School of Biology Professor, the Director of the Center for the Study of Systems Biology, the Mary and Maisie Gibson Chair, the Georgia Research Alliance Eminent Scholar in Computational Systems Biology, the Director of the Integrative BioSystems Institute, and was previously the Scientific Advisor at Intellimedix.
He has focused on the development of computational algorithms and their application to proteomes for the prediction of protein structure and function, the prediction of small molecule ligand-protein interactions with applications to drug discovery, the prediction of off-target uses of existing drugs, and the exploration of the interplay between protein physics and evolution in determining protein structure and function. He is a pioneer in the field of protein structure prediction, including the development of CABS and CAS methods of lattice based conformation sampling, and the algorithms Touchstone II and TASSER.
Skolnick is best known for demonstrating that the number of ligand-binding pockets in proteins is quite small, which supports the likelihood that large-scale drug repurposing will work. Combined with the ability to use predicted as well as experimental structures in virtual ligand screening at higher accuracy and precision than existing approaches, this should enable FDA-approved drugs with novel mechanisms of action to be identified computationally with a high likelihood of experimental success.
He is also known for his unique teaching methodology and interactive pedagogy to simplify the comprehension of complex concepts in computational chemistry.
Major Discoveries
Completeness of the library of protein structures and interactions
Skolnick was first to demonstrate that the library of single domain protein structures is likely complete and that the observed folds in nature arise from the confinement of dense polymer chains. He furt |
https://en.wikipedia.org/wiki/AHPL | A Hardware Programming Language (AHPL) is software developed at University of Arizona that has been used as a tool for teaching computer organization.
It started as a set of notations for representing computer hardware in academia, and later came to be considered a hardware description language after a compiler and simulator were developed for it. The language describes hardware functionality as a flow of data between ports or sub-modules.
The notation, syntax, and semantics were based on the APL
programming language.
References
Hardware description languages |
https://en.wikipedia.org/wiki/Liquid%20droplet%20radiator | The liquid droplet radiator (LDR) or previously termed liquid droplet stream radiator is a proposed lightweight radiator for the dissipation of waste heat generated by power plants, propulsion or spacecraft systems in space.
Background
An advanced or future space mission must have a power source or propulsion system that will require the rejection of waste heat.
Disposing of large quantities of waste heat must be considered in order to realize a large space structure (LSS) that handles high power, such as a nuclear reactor or a space solar power satellite (SPS).
Such space systems require advanced high-temperature thermal control systems.
Liquid metal heat pipes with conventional radiators are considered ideally suited for such applications.
However, the required radiator surface area is huge, hence, the system mass is very large. The liquid droplet radiator (LDR) has an advantage in terms of the rejected heat power-weight ratio. The results of the studies indicate that for rejection temperatures below approximately 700 K, the LDR system is significantly lighter in weight than the other advanced radiator concepts. A LDR can be seven times lighter than conventional heat pipe radiators of similar size.
The LDR is more resistant to meteorite impacts due to less critical surface or windage, and requires less storage volume.
Therefore, the LDR has attracted attention as an advanced radiator for high-power space systems.
In 1978, John M. Hedgepeth proposed the use of a dust radiator to reduce the radiator weight of solar power satellites, in "Ultralightweight Structures for Space Power", in Radiation Energy Conversion in Space, Vol. 61 of Progress in Astronautics and Aeronautics, K. W. Billman, ed. (AIAA, New York, 1978), p. 126. Practical problems of this dust system led to the LDR concept in 1979.
Numerous studies have been made by companies, organizations and universities around the world.
Practical experiments were carried out for example with STS-77 and at drop shafts |
https://en.wikipedia.org/wiki/Lack%27s%20principle | Lack's principle, proposed by the British ornithologist David Lack in 1954, states that "the clutch size of each species of bird has been adapted by natural selection to correspond with the largest number of young for which the parents can, on average, provide enough food". As a biological rule, the principle can be formalised and generalised to apply to reproducing organisms in general, including animals and plants. Work based on Lack's principle by George C. Williams and others has led to an improved mathematical understanding of population biology.
Principle
Lack's principle implies that birds that happen to lay more eggs than the optimum will most likely have fewer fledglings (young that successfully fly from the nest) because the parent birds will be unable to collect enough food for them all. Evolutionary biologist George C. Williams notes that the argument applies also to organisms other than birds, both animals and plants, giving the example of the production of ovules by seed plants as an equivalent case. Williams formalised the argument to create a mathematical theory of evolutionary decision-making, based on the framework outlined in 1930 by R. A. Fisher, namely that the effort spent on reproduction must be worth the cost, compared to the long-term reproductive fitness of the individual. Williams noted that this would contribute to the discussion on whether (as Lack argued) an organism's reproductive processes are tuned to serve its own reproductive interest (natural selection), or as V.C. Wynne-Edwards proposed, to increase the chances of survival of the species to which the individual belonged (group selection). The zoologist J.L. Cloudsley-Thompson argued that a large bird would be able to produce more young than a small bird. Williams replied that this would be a bad reproductive strategy, as large birds have lower mortality and therefore a higher residual reproductive value over their whole lives (so taking a large short-term risk is unjustified). |