| text | source |
|---|---|
The School of Clinical Medicine is the medical school of the University of Cambridge in England. The medical school ranks 2nd in the world in the 2023 Times Higher Education Ranking, and 1st in The Complete University Guide, ahead of Oxford University Medical School, Harvard Medical School, and Stanford School of Medicine. The Cambridge Graduate Course in Medicine (A101) is the most competitive course offered by the University and in the UK, and is among the most competitive medical programmes for entry in the world.
|
https://huggingface.co/datasets/fmars/wiki_stem
|
GRE Subject Biochemistry, Cell and Molecular Biology was a standardized exam provided by ETS (Educational Testing Service) that was discontinued in December 2016. It was a paper-based exam with no computer-based version. ETS offered the exam three times per year: once in April, once in October and once in November.
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Simeon Chituru Achinewhu (born 15 August 1946) is a Nigerian food and nutrition biochemist, scholar and university administrator, and a past president-general of the Ogbakor Ikwerre Socio-cultural Organisation Worldwide. He was vice-chancellor of Rivers State University (formerly Rivers State University of Science and Technology) from October 2000 until May 2007. In 2005 he was named the most research-active vice-chancellor in the Nigerian university system.
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Anaerobic exercise is a type of exercise that breaks down glucose in the body without using oxygen; anaerobic means "without oxygen". In practical terms, this means that anaerobic exercise is more intense, but shorter in duration than aerobic exercise.
The biochemistry of anaerobic exercise involves a process called glycolysis, in which glucose is broken down to generate adenosine triphosphate (ATP), the primary source of energy for cellular reactions
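The net reaction of glycolysis summarized in the passage can be written as:

```latex
\mathrm{C_6H_{12}O_6} + 2\,\mathrm{NAD^+} + 2\,\mathrm{ADP} + 2\,\mathrm{P_i}
\longrightarrow 2\,\text{pyruvate} + 2\,\mathrm{NADH} + 2\,\mathrm{H^+} + 2\,\mathrm{ATP} + 2\,\mathrm{H_2O}
```

The two ATP shown are the net gain per glucose: four are produced, but two are consumed in the preparatory steps.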
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Excess post-exercise oxygen consumption (EPOC, informally called afterburn) is a measurably increased rate of oxygen intake following strenuous activity. In historical contexts the term "oxygen debt" was popularized to explain or perhaps attempt to quantify anaerobic energy expenditure, particularly as regards lactic acid/lactate metabolism; in fact, the term "oxygen debt" is still widely used to this day. However, direct and indirect calorimeter experiments have definitively disproven any association of lactate metabolism as causal to an elevated oxygen uptake
|
https://huggingface.co/datasets/fmars/wiki_stem
|
The Fick principle states that blood flow to an organ can be calculated using a marker substance if the following information is known:
Amount of marker substance taken up by the organ per unit time
Concentration of marker substance in arterial blood supplying the organ
Concentration of marker substance in venous blood leaving the organ
Developed by Adolf Eugen Fick (1829–1901), the Fick principle has been applied to the measurement of cardiac output. Its underlying principles may also be applied in a variety of clinical situations.
In Fick's original method, the "organ" was the entire human body and the marker substance was oxygen
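In Fick's oxygen form, cardiac output equals oxygen uptake divided by the arteriovenous oxygen content difference. A minimal sketch (function name and example values are illustrative textbook figures, not from any clinical source):

```python
def fick_cardiac_output(vo2_ml_per_min, ca_o2_ml_per_l, cv_o2_ml_per_l):
    """Cardiac output (L/min) = O2 uptake / arteriovenous O2 content difference."""
    return vo2_ml_per_min / (ca_o2_ml_per_l - cv_o2_ml_per_l)

# Textbook-style example: VO2 = 250 mL/min, arterial O2 content = 200 mL/L,
# mixed venous O2 content = 150 mL/L -> 250 / 50 = 5 L/min
print(fick_cardiac_output(250, 200, 150))  # 5.0
```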
|
https://huggingface.co/datasets/fmars/wiki_stem
|
VO2 max (also maximal oxygen consumption, maximal oxygen uptake or maximal aerobic capacity) is the maximum rate of oxygen consumption attainable during physical exertion. The name is derived from three abbreviations: "V̇" for volume (the dot appears over the V to indicate "per unit of time"), "O2" for oxygen, and "max" for maximum. A similar measure is VO2 peak (peak oxygen consumption), which is the measurable value from a session of physical exercise, be it incremental or otherwise
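VO2 max is commonly reported relative to body mass in mL/kg/min; a minimal sketch of the unit conversion, with illustrative numbers:

```python
def relative_vo2(vo2_l_per_min, body_mass_kg):
    """Convert absolute VO2 (L/min) to relative VO2 (mL/kg/min)."""
    return vo2_l_per_min * 1000 / body_mass_kg

# e.g. an absolute VO2 max of 3.5 L/min for a 70 kg person
print(relative_vo2(3.5, 70))  # 50.0
```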
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Fermentation is a metabolic process that produces chemical changes in organic substances through the action of enzymes. In biochemistry, it is narrowly defined as the extraction of energy from carbohydrates in the absence of oxygen. In food production, it may more broadly refer to any process in which the activity of microorganisms brings about a desirable change to a foodstuff or beverage
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Bioconversion, also known as biotransformation, is the conversion of organic materials, such as plant or animal waste, into usable products or energy sources by biological processes or agents, such as certain microorganisms. One example is the industrial production of cortisone, in which one step is the bioconversion of progesterone to 11-alpha-hydroxyprogesterone by Rhizopus nigricans. Another example is the bioconversion of glycerol to 1,3-propanediol, which has been the subject of scientific research for decades.
|
https://huggingface.co/datasets/fmars/wiki_stem
|
2,3-Butanediol fermentation is anaerobic fermentation of glucose with 2,3-butanediol as one of the end products. The overall stoichiometry of the reaction is
2 pyruvate + NADH → 2 CO2 + 2,3-butanediol.
Butanediol fermentation is typical for the facultative anaerobes Klebsiella and Enterobacter and is tested for using the Voges–Proskauer (VP) test
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Ethanol fermentation, also called alcoholic fermentation, is a biological process which converts sugars such as glucose, fructose, and sucrose into cellular energy, producing ethanol and carbon dioxide as by-products. Because yeasts perform this conversion in the absence of oxygen, alcoholic fermentation is considered an anaerobic process. It also takes place in some species of fish (including goldfish and carp) where (along with lactic acid fermentation) it provides energy when oxygen is scarce
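For glucose, the overall reaction of ethanol fermentation is:

```latex
\mathrm{C_6H_{12}O_6} \longrightarrow 2\,\mathrm{C_2H_5OH} + 2\,\mathrm{CO_2}
```

One six-carbon sugar yields two molecules of ethanol and two of carbon dioxide, with a net gain of two ATP via glycolysis.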
|
https://huggingface.co/datasets/fmars/wiki_stem
|
In food processing, fermentation is the conversion of carbohydrates to alcohol or organic acids using microorganisms—yeasts or bacteria—under anaerobic (oxygen-free) conditions. Fermentation usually implies that the action of microorganisms is desired. The science of fermentation is known as zymology or zymurgy
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Fermentative hydrogen production is the fermentative conversion of organic substrates to H2. Hydrogen produced in this manner is often called biohydrogen. The conversion is effected by bacteria and protozoa, which employ enzymes
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Industrial fermentation is the intentional use of fermentation in manufacturing processes. In addition to the mass production of fermented foods and drinks, industrial fermentation has widespread applications in the chemical industry. Commodity chemicals such as acetic acid, citric acid, and ethanol are made by fermentation.
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Lactic acid fermentation is a metabolic process by which glucose or other six-carbon sugars (and disaccharides of six-carbon sugars, e.g. sucrose or lactose) are converted into cellular energy and the metabolite lactate, which is lactic acid in solution
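In the homolactic case, the overall conversion of glucose is:

```latex
\mathrm{C_6H_{12}O_6} \longrightarrow 2\,\mathrm{C_3H_6O_3}\quad(\text{lactic acid})
```

As with ethanol fermentation, the net energy yield is two ATP per glucose.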
|
https://huggingface.co/datasets/fmars/wiki_stem
|
In biochemistry, mixed acid fermentation is the metabolic process by which a six-carbon sugar (e.g. glucose, C6H12O6) is converted into a complex and variable mixture of acids
|
https://huggingface.co/datasets/fmars/wiki_stem
|
The Pasteur effect describes how available oxygen inhibits ethanol fermentation, driving yeast to switch toward aerobic respiration for increased generation of the energy carrier adenosine triphosphate (ATP).
Discovery
The effect was described by Louis Pasteur in 1857 in experiments showing that aeration of yeasted broth causes cell growth to increase while the fermentation rate decreases, based on lowered ethanol production.
Explanation
Yeast fungi, being facultative anaerobes, can either produce energy through ethanol fermentation or aerobic respiration
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Solid state fermentation (SSF) is a biomolecule manufacturing process used in the food, pharmaceutical, cosmetic, fuel and textile industries. These biomolecules are mostly metabolites generated by microorganisms grown on a solid support selected for this purpose. This technology for the culture of microorganisms is an alternative to liquid or submerged fermentation, used predominantly for industrial purposes
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Solventogenesis is the biochemical production of solvents (usually acetone and butanol) by Clostridium species. It is the second phase of ABE fermentation.
Process
Solventogenic Clostridium species have a biphasic metabolism composed of an acidogenic phase and a solventogenic phase
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Microbial production of succinic acid can be performed with wild bacteria like Actinobacillus succinogenes, Mannheimia succiniciproducens and Anaerobiospirillum succiniciproducens, or genetically modified Escherichia coli, Corynebacterium glutamicum and Saccharomyces cerevisiae. Understanding of the central carbon metabolism of these organisms is crucial in determining the maximum obtainable yield of succinic acid on the carbon source employed as substrate.
Metabolic pathways
Neglecting the carbon utilised for biomass formation (known to be a small fraction of the total carbon utilised) basic biochemistry balances can be performed based on the established metabolic pathways of these organisms
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Acceptance test–driven development (ATDD) is a development methodology based on communication between the business customers, the developers, and the testers. ATDD encompasses many of the same practices as specification by example (SBE), behavior-driven development (BDD), example-driven development (EDD), and support-driven development also called story test–driven development (SDD). All these processes aid developers and testers in understanding the customer's needs prior to implementation and allow customers to be able to converse in their own domain language
|
https://huggingface.co/datasets/fmars/wiki_stem
|
In software development, agile practices (sometimes written "Agile") include requirements discovery and solutions improvement through the collaborative effort of self-organizing and cross-functional teams with their customer(s)/end user(s). Popularized in the 2001 Manifesto for Agile Software Development, these values and principles were derived from and underpin a broad range of software development frameworks, including Scrum and Kanban. While there is much anecdotal evidence that adopting agile practices and values improves the effectiveness of software professionals, teams and organizations, the empirical evidence is mixed and hard to find.
History
Iterative and incremental software development methods can be traced back as early as 1957, with evolutionary project management and adaptive software development emerging in the early 1970s
|
https://huggingface.co/datasets/fmars/wiki_stem
|
In software engineering, behavior-driven development (BDD) is an agile software development process that encourages collaboration among developers, quality assurance experts, and customer representatives in a software project. It encourages teams to use conversation and concrete examples to formalize a shared understanding of how the application should behave. It emerged from test-driven development (TDD)
|
https://huggingface.co/datasets/fmars/wiki_stem
|
A best practice is a method or technique that has been generally accepted as superior to other known alternatives because it often produces results that are superior to those achieved by other means or because it has become a standard way of doing things, e.g. a standard way of complying with legal or ethical requirements
|
https://huggingface.co/datasets/fmars/wiki_stem
|
The Cathedral and the Bazaar: Musings on Linux and Open Source by an Accidental Revolutionary (abbreviated CatB) is an essay, and later a book, by Eric S. Raymond on software engineering methods, based on his observations of the Linux kernel development process and his experiences managing an open source project, fetchmail. It examines the struggle between top-down and bottom-up design
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Software development is the process of conceiving, specifying, designing, programming, documenting, testing, and bug fixing involved in creating and maintaining applications, frameworks, or other software components. Software development involves writing and maintaining the source code, but in a broader sense, it includes all processes from the conception of the desired software through the final manifestation, typically in a planned and structured process often overlapping with software engineering. Software development also includes research, new development, prototyping, modification, reuse, re-engineering, maintenance, or any other activities that result in software products
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Comment programming, also known as comment-driven development (CDD), is a (mostly) satirical software development technique that is heavily based on commenting out code. In comment programming, the comment tags are not used to describe what a certain piece of code is doing, but rather to stop some parts of the code from being executed. The aim is to have the commented code at the developer's disposal at any time it might be needed
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Cowboy coding is software development where programmers have autonomy over the development process. This includes control of the project's schedule, languages, algorithms, tools, frameworks and coding style. Typically, little to no coordination exists with other developers or stakeholders
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Domain-driven design (DDD) is a major software design approach, focusing on modeling software to match a domain according to input from that domain's experts. Under domain-driven design, the structure and language of software code (class names, class methods, class variables) should match the business domain. For example: if software processes loan applications, it might have classes like "loan application", "customers", and methods such as "accept offer" and "withdraw"
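Continuing the loan-application example above, a minimal sketch in Python (class and method names are illustrative, not from any real codebase):

```python
class LoanApplication:
    """Domain model whose names mirror the business domain (the ubiquitous language)."""

    def __init__(self, customer, amount):
        self.customer = customer
        self.amount = amount
        self.status = "pending"

    def accept_offer(self):
        # Operations are named after business actions, not technical details.
        self.status = "accepted"

    def withdraw(self):
        self.status = "withdrawn"

app = LoanApplication(customer="Ada", amount=10_000)
app.accept_offer()
print(app.status)  # accepted
```

The point is not the code itself but that a domain expert could read the class and method names and recognize their own vocabulary.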
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Extreme programming (XP) is a software development methodology intended to improve software quality and responsiveness to changing customer requirements. As a type of agile software development, it advocates frequent releases in short development cycles, intended to improve productivity and introduce checkpoints at which new customer requirements can be adopted.
Other elements of extreme programming include: programming in pairs or doing extensive code review, unit testing of all code, not programming features until they are actually needed, a flat management structure, code simplicity and clarity, expecting changes in the customer's requirements as time passes and the problem is better understood, and frequent communication with the customer and among programmers
|
https://huggingface.co/datasets/fmars/wiki_stem
|
In Agile software development, the Fibonacci scale consists of a sequence of numbers used for estimating the relative size of user stories in points. Agile Scrum is based on the concept of working iteratively in short sprints, typically two weeks long, where the requirements and development are continuously being improved. The Fibonacci sequence consists of numbers that are the summation of the two preceding numbers, starting with [0, 1]
|
https://huggingface.co/datasets/fmars/wiki_stem
|
In computer science, formal methods are mathematically rigorous techniques for the specification, development, analysis, and verification of software and hardware systems. The use of formal methods for software and hardware design is motivated by the expectation that, as in other engineering disciplines, performing appropriate mathematical analysis can contribute to the reliability and robustness of a design. Formal methods employ a variety of theoretical computer science fundamentals, including logic calculi, formal languages, automata theory, control theory, program semantics, type systems, and type theory
|
https://huggingface.co/datasets/fmars/wiki_stem
|
"Homesteading the Noosphere" (abbreviated HtN) is an essay written by Eric S. Raymond about the social workings of open-source software development. It follows his previous piece "The Cathedral and the Bazaar" (1997)
|
https://huggingface.co/datasets/fmars/wiki_stem
|
KISS, an acronym for "Keep it simple, stupid!", is a design principle noted by the U.S. Navy in 1960
|
https://huggingface.co/datasets/fmars/wiki_stem
|
The law of conservation of complexity, also known as Tesler's Law, or Waterbed Theory, is an adage in human–computer interaction stating that every application has an inherent amount of complexity that cannot be removed or hidden. Instead, it must be dealt with, either in product development or in user interaction.
This poses the question of who should be exposed to the complexity
|
https://huggingface.co/datasets/fmars/wiki_stem
|
"The Magic Cauldron" is an essay by Eric S. Raymond on the open-source economic model. It can be read freely online and was published in his 1999 book The Cathedral and the Bazaar
|
https://huggingface.co/datasets/fmars/wiki_stem
|
A Mayo-Smith pyramid is a triangle divided into a sequence of isosceles trapezoids configured such that the outer perimeter maintains the shape of a triangle with each additional element. A Mayo-Smith pyramid is used to describe system development methodologies adapted for scenarios characterized by schedule and resource uncertainty. "Two Ways to Build a Pyramid" was published in 2001
|
https://huggingface.co/datasets/fmars/wiki_stem
|
In computing, minimalism refers to the application of minimalist philosophies and principles in the design and use of hardware and software. Minimalism, in this sense, means designing systems that use the least hardware and software resources possible.
History
In the late 1970s and early 1980s, programmers worked within the confines of relatively expensive and limited resources of common platforms
|
https://huggingface.co/datasets/fmars/wiki_stem
|
In object-oriented programming, the open–closed principle (OCP) states "software entities (classes, modules, functions, etc.) should be open for extension, but closed for modification";
that is, such an entity can allow its behaviour to be extended without modifying its source code.
The name open–closed principle has been used in two ways
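The principle can be sketched in Python (the shape classes are illustrative): new behaviour arrives as a new subclass, while the existing classes and the function that consumes them are never edited.

```python
from abc import ABC, abstractmethod

class Shape(ABC):
    @abstractmethod
    def area(self):
        ...

class Rectangle(Shape):
    def __init__(self, w, h):
        self.w, self.h = w, h
    def area(self):
        return self.w * self.h

# Extension point: adding Circle requires no change to Shape, Rectangle,
# or total_area below -- the entities stay closed for modification.
class Circle(Shape):
    def __init__(self, r):
        self.r = r
    def area(self):
        return 3.14159 * self.r ** 2

def total_area(shapes):
    return sum(s.area() for s in shapes)

print(total_area([Rectangle(2, 3), Circle(1)]))  # 9.14159
```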
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Planning poker, also called Scrum poker, is a consensus-based, gamified technique for estimating effort, used mostly in agile software development. In planning poker, members of the group make estimates by playing numbered cards face-down on the table, instead of speaking them aloud. The cards are revealed, and the estimates are then discussed
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Release early, release often (also known as ship early, ship often, or time-based releases, and sometimes abbreviated RERO) is a software development philosophy that emphasizes the importance of early and frequent releases in creating a tight feedback loop between developers and testers or users, contrary to a feature-based release strategy. Advocates argue that this allows the software development to progress faster, enables the user to help define what the software will become, better conforms to the users' requirements for the software,
and ultimately results in higher quality software. The development philosophy attempts to eliminate the risk of creating software that no one will use
|
https://huggingface.co/datasets/fmars/wiki_stem
|
In programming, the rule of least power is a design principle that
"suggests choosing the least powerful [computer] language suitable for a given purpose". Stated alternatively, given a choice among computer languages, classes of which range from descriptive (or declarative) to procedural, the less procedural, more descriptive the language one chooses, the more one can do with the data stored in that language.
This rule is an application of the principle of least privilege to protocol design
|
https://huggingface.co/datasets/fmars/wiki_stem
|
The scaled agile framework (SAFe) is a set of organization and workflow patterns intended to guide enterprises in scaling lean and agile practices. Along with disciplined agile delivery (DAD), SAFe is one of a growing number of frameworks that seek to address the problems encountered when scaling beyond a single team. SAFe promotes alignment, collaboration, and delivery across large numbers of agile teams
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Scrum is an agile project management system commonly used in software development and other industries.
Scrum prescribes for teams to break work into goals to be completed within time-boxed iterations, called sprints. Each sprint is no longer than one month and commonly lasts two weeks
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Secure by design, in software engineering, means that software products and capabilities have been designed to be foundationally secure.
Alternate security strategies, tactics and patterns are considered at the beginning of a software design, and the best are selected and enforced by the architecture, and they are used as guiding principles for developers. It is also encouraged to use strategic design patterns that have beneficial effects on security, even though those design patterns were not originally devised with security in mind
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Specification by example (SBE) is a collaborative approach to defining requirements and business-oriented functional tests for software products based on capturing and illustrating requirements using realistic examples instead of abstract statements. It is applied in the context of agile software development methods, in particular behavior-driven development. This approach is particularly successful for managing requirements and functional tests on large-scale projects of significant domain and organisational complexity
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Continuous test-driven development (CTDD) is a software development practice that extends test-driven development (TDD) by means of automatic test execution in the background, sometimes called continuous testing.
Practice
In CTDD the developer writes a test first but is not forced to execute the tests manually. The tests are run automatically by a continuous testing tool running in the background
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Test-driven development (TDD) is a software development process relying on software requirements being converted to test cases before software is fully developed, and tracking all software development by repeatedly testing the software against all test cases. This is as opposed to software being developed first and test cases created later.
Software engineer Kent Beck, who is credited with having developed or "rediscovered" the technique, stated in 2003 that TDD encourages simple designs and inspires confidence
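The red-green-refactor cycle described above can be sketched in Python (the `leap_year` example is illustrative, not from Beck's writing):

```python
# Step 1 (red): write the test first; it fails initially because
# leap_year does not exist yet.
def test_leap_year():
    assert leap_year(2024) is True    # divisible by 4
    assert leap_year(1900) is False   # centuries are not leap years...
    assert leap_year(2000) is True    # ...unless divisible by 400

# Step 2 (green): the simplest implementation that makes the test pass.
def leap_year(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Step 3 (refactor): improve the code while keeping the test green.
test_leap_year()
print("all tests pass")
```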
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Transformation Priority Premise (TPP) is a programming approach developed by Robert C. Martin (Uncle Bob) as a refinement to make the process of test-driven development (TDD) easier and more effective for a computer programmer.
Transformation Priority Premise states that simpler transformations should be preferred:
|
https://huggingface.co/datasets/fmars/wiki_stem
|
The Unix philosophy, originated by Ken Thompson, is a set of cultural norms and philosophical approaches to minimalist, modular software development. It is based on the experience of leading developers of the Unix operating system. Early Unix developers were important in bringing the concepts of modularity and reusability into software engineering practice, spawning a "software tools" movement
|
https://huggingface.co/datasets/fmars/wiki_stem
|
The waterfall model is a breakdown of project activities into linear sequential phases, meaning they are passed down onto each other, where each phase depends on the deliverables of the previous one and corresponds to a specialization of tasks. The approach is typical for certain areas of engineering design. In software development, it tends to be among the less iterative and flexible approaches, as progress flows in largely one direction ("downwards" like a waterfall) through the phases of conception, initiation, analysis, design, construction, testing, deployment and maintenance
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Write once, compile anywhere (WOCA) is a philosophy taken by a compiler and its associated software libraries or by a software library/software framework which refers to a capability of writing a computer program that can be compiled on all platforms without the need to modify its source code. As opposed to Sun's write once, run anywhere slogan, cross-platform compatibility is implemented only at the source code level, rather than also at the compiled binary code level.
Introduction
There are many languages that follow the WOCA philosophy, such as C++, Pascal (see Free Pascal), Ada, Cobol, or C, on condition that they don't use functions beyond those provided by the standard library
|
https://huggingface.co/datasets/fmars/wiki_stem
|
"You aren't gonna need it" (YAGNI) is a principle which arose from extreme programming (XP) that states a programmer should not add functionality until deemed necessary. Other forms of the phrase include "You aren't going to need it" (YAGTNI) and "You ain't gonna need it". Ron Jeffries, a co-founder of XP, explained the philosophy: "Always implement things when you actually need them, never when you just foresee that you [will] need them
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Uno Platform is an open source cross-platform graphical user interface framework that allows WinUI and Universal Windows Platform (UWP)-based code to run on iOS, macOS, Linux, Android, and WebAssembly. Uno Platform is released under the Apache 2.0 license
|
https://huggingface.co/datasets/fmars/wiki_stem
|
In the context of software engineering, software quality refers to two related but distinct notions:
Software's functional quality reflects how well it complies with or conforms to a given design, based on functional requirements or specifications. That attribute can also be described as the fitness for purpose of a piece of software or how it compares to competitors in the marketplace as a worthwhile product. It is the degree to which the correct software was produced
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Adaptability (Latin: adaptō "fit to, adjust") is a feature of a system or of a process. This word has been put to use as a specialised term in different disciplines and in business operations. Word definitions of adaptability as a specialised term differ little from dictionary definitions
|
https://huggingface.co/datasets/fmars/wiki_stem
|
In computer science, algorithmic efficiency is a property of an algorithm which relates to the amount of computational resources used by the algorithm. An algorithm must be analyzed to determine its resource usage, and the efficiency of an algorithm can be measured based on the usage of different resources. Algorithmic efficiency can be thought of as analogous to engineering productivity for a repeating or continuous process
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Backporting is the action of taking parts from a newer version of a software system or software component and porting them to an older version of the same software. It forms part of the maintenance step in a software development process, and it is commonly used for fixing security issues in older versions of the software and also for providing new features to older versions.
Overview
The simplest and probably most common situation of backporting is a fixed security hole in a newer version of a piece of software
|
https://huggingface.co/datasets/fmars/wiki_stem
|
The Consortium for IT Software Quality (CISQ) is an IT industry group comprising IT executives from the Global 2000, systems integrators, outsourced service providers, and software technology vendors committed to making improvements in the quality of IT application software.
Overview
Jointly organized by the Software Engineering Institute (SEI) at Carnegie Mellon University and the Object Management Group (OMG), CISQ is designed to be a neutral forum in which customers and suppliers of IT application software can develop an industry-wide agenda of actions for defining, measuring, and improving IT software quality.
History
CISQ was launched in August 2009 by 24 founders including SEI and OMG
|
https://huggingface.co/datasets/fmars/wiki_stem
|
coala is a free and open-source language-independent analysis toolkit, written in Python. The primary goal of coala is to make it easier for developers to create rules to which a project's code should conform. coala emphasizes reusability and pluggability of analysis routines, and the principle of don't repeat yourself (DRY)
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Computerized system validation (CSV) (Computerised system validation in European countries, and usually referred to as "Computer Systems Validation") is the process of testing/validating/qualifying a regulated (e.g., US FDA 21 CFR Part 11) computerized system to ensure that it does exactly what it is designed to do in a consistent and reproducible manner that is as safe, secure and reliable as paper-based records
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Continued process verification (CPV) is the collection and analysis of end-to-end production components and processes data to ensure product outputs are within predetermined quality limits. In 2011 the Food and Drug Administration published a report outlining best practices regarding business process validation in the pharmaceutical industry. Continued process verification is outlined in this report as the third stage in Process Validation
|
https://huggingface.co/datasets/fmars/wiki_stem
|
In theoretical computer science, an algorithm is correct with respect to a specification if it behaves as specified. Best explored is functional correctness, which refers to the input-output behavior of the algorithm (i.e
|
https://huggingface.co/datasets/fmars/wiki_stem
|
CTQ trees (critical-to-quality trees) are the key measurable characteristics of a product or process whose performance standards or specification limits must be met in order to satisfy the customer. They align improvement or design efforts with customer requirements.
CTQs are used to decompose broad customer requirements into more easily quantified elements
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Feature interaction is a software engineering concept. It occurs when the integration of two features would modify the behavior of one or both features.
The term feature is used to denote a unit of functionality of a software application
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Flexibility is used as an attribute of various types of systems. In the field of engineering systems design, it refers to designs that can adapt when external changes occur. Flexibility has been defined differently in many fields of engineering, architecture, biology, economics, etc
|
https://huggingface.co/datasets/fmars/wiki_stem
|
GQM, the initialism for goal, question, metric, is an established goal-oriented approach to software metrics to improve and measure software quality.
History
GQM has been promoted by Victor Basili of the University of Maryland, College Park and the Software Engineering Laboratory at the NASA Goddard Space Flight Center after supervising a Ph. D
|
https://huggingface.co/datasets/fmars/wiki_stem
|
A hazard analysis is used as the first step in a process used to assess risk. The result of a hazard analysis is the identification of different types of hazards. A hazard is a potential condition and exists or not (probability is 1 or 0)
|
https://huggingface.co/datasets/fmars/wiki_stem
|
High-integrity software is software whose failure may cause serious damage with possible "life-threatening consequences." "Integrity is important as it demonstrates the safety, security, and maintainability of… code." Examples of high-integrity software are nuclear reactor control, avionics software, and process control software
|
https://huggingface.co/datasets/fmars/wiki_stem
|
ISO/IEC 9126 Software engineering — Product quality was an international standard for the evaluation of software quality. It has been replaced by ISO/IEC 25010:2011.
The fundamental objective of the ISO/IEC 9126 standard is to address some of the well-known human biases that can adversely affect the delivery and perception of a software development project
|
https://huggingface.co/datasets/fmars/wiki_stem
|
A kludge or kluge is a workaround or quick-and-dirty solution that is clumsy, inelegant, inefficient, difficult to extend and hard to maintain. This term is used in diverse fields such as computer science, aerospace engineering, Internet slang, evolutionary neuroscience, and government. It is similar in meaning to the naval term jury rig
|
https://huggingface.co/datasets/fmars/wiki_stem
|
In computing and systems design, a loosely coupled system is one
in which components are weakly associated (have breakable relationships) with each other, and thus changes in one component have minimal effect on the existence or performance of another component.
in which each of its components has, or makes use of, little or no knowledge of the definitions of other separate components. Subareas include the coupling of classes, interfaces, data, and services
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Maintainability is the ease of maintaining or providing maintenance for a functioning product or service. Depending on the field, it can have slightly different meanings.
Engineering
In engineering, maintainability is the ease with which a product can be maintained to:
correct defects or their cause,
repair or replace faulty or worn-out components without having to replace still-working parts,
prevent unexpected working conditions,
maximize a product's useful life,
maximize efficiency, reliability, and safety,
meet new requirements,
make future maintenance easier, or
cope with a changing environment
|
https://huggingface.co/datasets/fmars/wiki_stem
|
In systems engineering and requirements engineering, a non-functional requirement (NFR) is a requirement that specifies criteria that can be used to judge the operation of a system, rather than specific behaviours. They are contrasted with functional requirements that define specific behavior or functions. The plan for implementing functional requirements is detailed in the system design
|
https://huggingface.co/datasets/fmars/wiki_stem
|
In computer programming, orthogonality means that operations change just one thing without affecting others. The term is most frequently used regarding assembly instruction sets, as in an orthogonal instruction set.
Orthogonality in a programming language means that a relatively small set of primitive constructs can be combined in a relatively small number of ways to build the control and data structures of the language
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Process validation is the analysis of data gathered throughout the design and manufacturing of a product in order to confirm that the process can reliably output products of a determined standard. Regulatory authorities like EMA and FDA have published guidelines relating to process validation. The purpose of process validation is to ensure varied inputs lead to consistent and high quality outputs
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Quality engineering is the discipline of engineering concerned with the principles and practice of product and service quality assurance and control. In software development, it is the management, development, operation and maintenance of IT systems and enterprise architectures with a high quality standard.
Description
Quality engineering is the discipline of engineering that creates and implements strategies for quality assurance in product development and production as well as software development
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Reliability engineering is a sub-discipline of systems engineering that emphasizes the ability of equipment to function without failure. Reliability describes the ability of a system or component to function under stated conditions for a specified period of time. Reliability is closely related to availability, which is typically described as the ability of a component or system to function at a specified moment or interval of time
|
https://huggingface.co/datasets/fmars/wiki_stem
|
In computer science and software engineering, reusability is the use of existing assets in some form within the software product development process; these assets are products and by-products of the software development life cycle and include code, software components, test suites, designs and documentation. The opposite concept of reusability is leverage, which modifies existing assets as needed to meet specific system requirements. Because reuse implies the creation of a separately maintained version of the assets, it is preferred over leverage
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Reverse semantic traceability (RST) is a quality control method for verification improvement that helps to ensure high quality of artifacts by backward translation at each stage of the software development process.
Brief introduction
Each stage of development process can be treated as a series of “translations” from one language to another. At the very beginning a project team deals with customer’s requirements and expectations expressed in natural language
|
https://huggingface.co/datasets/fmars/wiki_stem
|
In computer science, robustness is the ability of a computer system to cope with errors during execution and cope with erroneous input. Robustness can encompass many areas of computer science, such as robust programming, robust machine learning, and Robust Security Network. Formal techniques, such as fuzz testing, are essential to showing robustness since this type of testing involves invalid or unexpected inputs
|
https://huggingface.co/datasets/fmars/wiki_stem
|
A safety-critical system (SCS) or life-critical system is a system whose failure or malfunction may result in one (or more) of the following outcomes:
death or serious injury to people
loss or severe damage to equipment/property
environmental harm
A safety-related system (or sometimes safety-involved system) comprises everything (hardware, software, and human aspects) needed to perform one or more safety functions, in which failure would cause a significant increase in the safety risk for the people or environment involved. Safety-related systems are those that do not have full responsibility for controlling hazards such as loss of life, severe injury or severe environmental damage. The malfunction of a safety-involved system would only be that hazardous in conjunction with the failure of other systems or human error
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Search-based software engineering (SBSE) applies metaheuristic search techniques such as genetic algorithms, simulated annealing and tabu search to software engineering problems. Many activities in software engineering can be stated as optimization problems. Optimization techniques of operations research such as linear programming or dynamic programming are often impractical for large scale software engineering problems because of their computational complexity or their assumptions on the problem structure
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Software assurance (SwA) is a critical process in software development that ensures the reliability, safety, and security of software products. It involves a variety of activities, including requirements analysis, design reviews, code inspections, testing, and formal verification. One crucial component of software assurance is secure coding practices, which follow industry-accepted standards and best practices, such as those outlined by the Software Engineering Institute (SEI) in their CERT Secure Coding Standards (SCS)
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Software bloat is a process whereby successive versions of a computer program become perceptibly slower, use more memory, disk space or processing power, or have higher hardware requirements than the previous version, while making only dubious user-perceptible improvements or suffering from feature creep. The term is not applied consistently; it is often used as a pejorative by end users (bloatware) to describe undesired user interface changes even if those changes had little or no effect on the hardware requirements. In long-lived software, perceived bloat can occur from the software servicing a large, diverse marketplace with many differing requirements
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Software crisis is a term used in the early days of computing science for the difficulty of writing useful and efficient computer programs in the required time. The software crisis was due to the rapid increases in computer power and the complexity of the problems that could now be tackled. With the increase in the complexity of the software, many software problems arose because existing methods were inadequate
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Security, as part of the software development process, is an ongoing process involving people and practices, and ensures application confidentiality, integrity, and availability. Secure software is the result of security aware software development processes where security is built in and thus software is developed with security in mind. Security is most effective if planned and managed throughout every stage of software development life cycle (SDLC), especially in critical applications or those that process sensitive information
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Software entropy is the idea that software eventually rots as it is changed if sufficient care is not taken to maintain coherence with product design and established design principles. The common usage is only tangentially related to entropy as defined in classical thermodynamics and statistical physics.
Another aspect can be found in what is perceived to be a decay in the quality of otherwise static software that is the result of the inevitable changes to its environment, that often occur as operating systems and other components are upgraded or retired
|
https://huggingface.co/datasets/fmars/wiki_stem
|
A software map represents static, dynamic, and evolutionary information of software systems and their software development processes by means of 2D or 3D map-oriented information visualization. It constitutes a fundamental concept and tool in software visualization, software analytics, and software diagnosis. Its primary applications include risk analysis for and monitoring of code quality, team activity, or software development progress and, generally, improving effectiveness of software engineering with respect to all related artifacts, processes, and stakeholders throughout the software engineering process and software maintenance
|
https://huggingface.co/datasets/fmars/wiki_stem
|
A computer program is said to be portable if there is very low effort required to make it run on different platforms. The pre-requirement for portability is the generalized abstraction between the application logic and system interfaces. When software with the same functionality is produced for several computing platforms, portability is the key issue for development cost reduction
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Software quality assurance (SQA) is a means and practice of monitoring all software engineering processes, methods, and work products to ensure compliance against defined standards. It may include ensuring conformance to standards or models, such as ISO/IEC 9126 (now superseded by ISO 25010), SPICE or CMMI. It includes standards and procedures that managers, administrators or developers may use to review and audit software products and activities to verify that the software meets quality criteria which link to standards
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Software quality control is the set of procedures used by organizations to ensure that a software product will meet its quality goals at the best value to the customer, and to continually improve the organization’s ability to produce software products in the future. Software quality control refers to specified functional requirements as well as non-functional requirements such as supportability, performance and usability. It also refers to the ability for software to perform well in unforeseeable scenarios and to keep a relatively low defect rate
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Software quality management (SQM) is a management process that aims to develop and manage the quality of software in such a way as to best ensure that the product meets the quality standards expected by the customer while also meeting any necessary regulatory and developer requirements. Software quality managers require software to be tested before it is released to the market, and they do this using a cyclical process-based quality assessment in order to reveal and fix bugs before release. Their job is not only to ensure their software is in good shape for the consumer but also to encourage a culture of quality throughout the enterprise
|
https://huggingface.co/datasets/fmars/wiki_stem
|
A mock execution is a stratagem in which a victim is deliberately but falsely made to feel that their execution or that of another person is imminent or is taking place. The subject is made to believe that they are being led to their execution. This might involve blindfolding the subjects, telling them they are about to die, making them recount last wishes, making them dig their own grave, holding an unloaded gun to their head and pulling the trigger, shooting near (but not at) the victim, or firing blanks
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Psychological torture or mental torture is a type of torture that relies primarily on psychological effects, and only secondarily on any physical harm inflicted. Although not all psychological torture involves the use of physical violence, there is a continuum between psychological torture and physical torture. The two are often used in conjunction with one another and often overlap in practice, with the fear and pain induced by physical torture often resulting in long-term psychological effects, and many forms of psychological torture involving some form of pain or coercion
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Sensory deprivation or perceptual isolation is the deliberate reduction or removal of stimuli from one or more of the senses. Simple devices such as blindfolds or hoods and earmuffs can cut off sight and hearing, while more complex devices can also cut off the sense of smell, touch, taste, thermoception (heat-sense), and the ability to know which way is down. Sensory deprivation has been used in various alternative medicines and in psychological experiments (e
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Sensory overload occurs when one or more of the body's senses experiences over-stimulation from the environment.
There are many environmental elements that affect an individual. Examples of these elements are urbanization, crowding, noise, mass media, and technology
|
https://huggingface.co/datasets/fmars/wiki_stem
|
Sleep deprivation, also known as sleep insufficiency or sleeplessness, is the condition of not having adequate duration and/or quality of sleep to support decent alertness, performance, and health. It can be either chronic or acute and may vary widely in severity.
Acute sleep deprivation is when an individual sleeps less than usual or does not sleep at all for a short period of time – normally one to two days, though the sleepless pattern can continue for longer when no outside factors are in play
|
https://huggingface.co/datasets/fmars/wiki_stem
|
The Affordable Weapon System is a US Navy program to design and produce a low cost "off the shelf" cruise missile launchable from a self-contained unit mounted in a standard shipping container.
Specifications
Length (w/o booster): 3.32 m (10 ft 11 in)
Diameter: 34
|
https://huggingface.co/datasets/fmars/wiki_stem
|
The ArcLight program was a missile development program of the Defense Advanced Research Projects Agency with the goal of equipping ships like Aegis cruisers with a weapon system capable of striking targets nearly anywhere on the globe, thereby increasing the power of surface ships to a level comparable to that of ballistic missile-equipped submarines. According to DARPA, the ArcLight program was to develop a high-tech missile based on the booster stack of the current SM-3 and equipped with a hypersonic glide vehicle capable of carrying a 100-200 lb (45-90 kg) warhead. The configuration would allow ships carrying the ArcLight missile to strike targets 2,300 miles (3,700 km) away from the launch point
|
https://huggingface.co/datasets/fmars/wiki_stem
|