58,083,234
https://en.wikipedia.org/wiki/Exterior%20calculus%20identities
This article summarizes several identities in exterior calculus, a mathematical notation used in differential geometry.

Notation
The following summarizes short definitions and notations that are used in this article.

Manifold
$M$, $N$ are $n$-dimensional smooth manifolds, where $n \in \mathbb{N}$; that is, differentiable manifolds that can be differentiated enough times for the purposes on this page. $p \in M$, $q \in N$ denote one point on each of the manifolds. The boundary of a manifold $M$ is a manifold $\partial M$, which has dimension $n - 1$. An orientation on $M$ induces an orientation on $\partial M$. We usually denote a submanifold by $\Sigma \subset M$.

Tangent and cotangent bundles
$TM$, $T^{*}M$ denote the tangent bundle and cotangent bundle, respectively, of the smooth manifold $M$. $T_pM$, $T_qN$ denote the tangent spaces of $M$, $N$ at the points $p$, $q$, respectively. $T_p^{*}M$ denotes the cotangent space of $M$ at the point $p$. Sections of the tangent bundle, also known as vector fields, are typically denoted as $X, Y, Z \in \Gamma(TM)$ such that at a point $p \in M$ we have $X|_p \in T_pM$. Sections of the cotangent bundle, also known as differential 1-forms (or covector fields), are typically denoted as $\alpha, \beta \in \Gamma(T^{*}M)$ such that at a point $p \in M$ we have $\alpha|_p \in T_p^{*}M$. An alternative notation for $\Gamma(T^{*}M)$ is $\Omega^1(M)$.

Differential k-forms
Differential $k$-forms, which we refer to simply as $k$-forms here, are differential forms defined on $TM$. We denote the set of all $k$-forms as $\Omega^k(M)$. For $0 \leq k, l, m \leq n$ we usually write $\alpha \in \Omega^k(M)$, $\beta \in \Omega^l(M)$, $\gamma \in \Omega^m(M)$. $0$-forms $f \in \Omega^0(M) = C^{\infty}(M)$ are just scalar functions on $M$. $\mathbf{1} \in \Omega^0(M)$ denotes the constant $0$-form equal to $1$ everywhere.

Omitted elements of a sequence
When we are given $(k+1)$ inputs $X_0, \ldots, X_k$ and a $k$-form $\alpha$ we denote omission of the $i$th entry by writing
$$\alpha(X_0, \ldots, \hat{X}_i, \ldots, X_k) := \alpha(X_0, \ldots, X_{i-1}, X_{i+1}, \ldots, X_k).$$

Exterior product
The exterior product is also known as the wedge product. It is denoted by $\wedge : \Omega^k(M) \times \Omega^l(M) \to \Omega^{k+l}(M)$. The exterior product of a $k$-form $\alpha$ and an $l$-form $\beta$ produces a $(k+l)$-form $\alpha \wedge \beta$. It can be written using the set $S(k, k+l)$ of all permutations $\sigma$ of $\{1, \ldots, k+l\}$ such that $\sigma(1) < \cdots < \sigma(k)$, $\sigma(k+1) < \cdots < \sigma(k+l)$ as
$$(\alpha \wedge \beta)(X_1, \ldots, X_{k+l}) = \sum_{\sigma \in S(k, k+l)} \operatorname{sign}(\sigma)\, \alpha\big(X_{\sigma(1)}, \ldots, X_{\sigma(k)}\big)\, \beta\big(X_{\sigma(k+1)}, \ldots, X_{\sigma(k+l)}\big).$$

Directional derivative
The directional derivative of a $0$-form $f$ along a section $X$ is a $0$-form denoted $\partial_X f$.

Exterior derivative
The exterior derivative $d_k : \Omega^k(M) \to \Omega^{k+1}(M)$ is defined for all $0 \leq k \leq n$. We generally omit the subscript when it is clear from the context.
For a $0$-form $f$ we have $d_0 f$ as the $1$-form that gives the directional derivative, i.e., for the section $X$ we have $(d_0 f)(X) = \partial_X f$, the directional derivative of $f$ along $X$. For $0 < k \leq n$,
$$(d_k\alpha)(X_0, \ldots, X_k) = \sum_{0 \leq i \leq k} (-1)^i\, \partial_{X_i}\big(\alpha(X_0, \ldots, \hat{X}_i, \ldots, X_k)\big) + \sum_{0 \leq i < j \leq k} (-1)^{i+j}\, \alpha\big([X_i, X_j], X_0, \ldots, \hat{X}_i, \ldots, \hat{X}_j, \ldots, X_k\big).$$

Lie bracket
The Lie bracket of sections $X, Y \in \Gamma(TM)$ is defined as the unique section $[X, Y] \in \Gamma(TM)$ that satisfies
$$\partial_{[X,Y]} = \partial_X \partial_Y - \partial_Y \partial_X.$$

Tangent maps
If $\varphi : M \to N$ is a smooth map, then $d\varphi|_p : T_pM \to T_{\varphi(p)}N$ defines a tangent map from $M$ to $N$. It is defined through curves $\gamma$ on $M$ with derivative $\gamma'(0) = X|_p$ such that
$$d\varphi(X) := (\varphi \circ \gamma)'.$$
Note that $\varphi$ is a $0$-form with values in $N$.

Pull-back
If $\varphi : M \to N$ is a smooth map, then the pull-back of a $k$-form $\alpha \in \Omega^k(N)$ is defined such that for any $k$-dimensional submanifold $\Sigma \subset M$,
$$\int_{\Sigma} \varphi^{*}\alpha = \int_{\varphi(\Sigma)} \alpha.$$
The pull-back can also be expressed as
$$(\varphi^{*}\alpha)(X_1, \ldots, X_k) = \alpha\big(d\varphi(X_1), \ldots, d\varphi(X_k)\big).$$

Interior product
Also known as the interior derivative, the interior product given a section $Y \in \Gamma(TM)$ is a map $\iota_Y : \Omega^{k+1}(M) \to \Omega^k(M)$ that effectively substitutes the first input of a $(k+1)$-form with $Y$. If $\alpha \in \Omega^{k+1}(M)$ and $X_1, \ldots, X_k \in \Gamma(TM)$ then
$$(\iota_Y\alpha)(X_1, \ldots, X_k) = \alpha(Y, X_1, \ldots, X_k).$$

Metric tensor
Given a nondegenerate bilinear form $g_p(\cdot, \cdot)$ on each $T_pM$ that is continuous on $M$, the manifold becomes a pseudo-Riemannian manifold. We denote the metric tensor $g$, defined pointwise by $g(X, Y)|_p := g_p(X|_p, Y|_p)$. We call $s$ the signature of the metric. A Riemannian manifold has $s = 1$, whereas Minkowski space has $s = -1$.

Musical isomorphisms
The metric tensor $g(\cdot, \cdot)$ induces duality mappings between vector fields and one-forms: these are the musical isomorphisms flat $\flat$ and sharp $\sharp$. A section $X \in \Gamma(TM)$ corresponds to the unique one-form $X^\flat \in \Omega^1(M)$ such that for all sections $Y \in \Gamma(TM)$, we have:
$$X^\flat(Y) = g(X, Y).$$
A one-form $\alpha \in \Omega^1(M)$ corresponds to the unique vector field $\alpha^\sharp \in \Gamma(TM)$ such that for all $Y \in \Gamma(TM)$, we have:
$$\alpha(Y) = g(\alpha^\sharp, Y).$$
These mappings extend via multilinearity to mappings from $k$-vector fields to $k$-forms and $k$-forms to $k$-vector fields through
$$(X_1 \wedge \cdots \wedge X_k)^\flat = X_1^\flat \wedge \cdots \wedge X_k^\flat, \qquad (\alpha_1 \wedge \cdots \wedge \alpha_k)^\sharp = \alpha_1^\sharp \wedge \cdots \wedge \alpha_k^\sharp.$$

Hodge star
For an $n$-manifold $M$, the Hodge star operator $\star : \Omega^k(M) \to \Omega^{n-k}(M)$ is a duality mapping taking a $k$-form $\alpha \in \Omega^k(M)$ to an $(n-k)$-form $(\star\alpha) \in \Omega^{n-k}(M)$. It can be defined in terms of an oriented frame $(X_1, \ldots, X_n)$ for $TM$, orthonormal with respect to the given metric tensor $g$:
$$\star\big(X_1^\flat \wedge \cdots \wedge X_k^\flat\big) = X_{k+1}^\flat \wedge \cdots \wedge X_n^\flat.$$

Co-differential operator
The co-differential operator $\delta : \Omega^k(M) \to \Omega^{k-1}(M)$ on an $n$-dimensional manifold $M$ is defined by
$$\delta := (-1)^{k}\, {\star}^{-1}\, d\, {\star}.$$
The Hodge–Dirac operator, $d + \delta$, is a Dirac operator studied in Clifford analysis.

Oriented manifold
An $n$-dimensional orientable manifold $M$ is a manifold that can be equipped with a choice of an $n$-form $\mu \in \Omega^n(M)$ that is continuous and nonzero everywhere on $M$.
Volume form
On an orientable manifold the canonical choice of a volume form given a metric tensor $g$ and an orientation is
$$\mathbf{vol} := \sqrt{|\det g|}\; dX^1 \wedge \cdots \wedge dX^n$$
for any basis $dX^1, \ldots, dX^n$ ordered to match the orientation.

Area form
Given a volume form $\mathbf{vol}$ and a unit normal vector $N$ we can also define an area form
$$\omega := \iota_N\, \mathbf{vol}$$
on the boundary $\partial M$.

Bilinear form on k-forms
A generalization of the metric tensor, the symmetric bilinear form between two $k$-forms $\alpha, \beta \in \Omega^k(M)$, is defined pointwise on $M$ by
$$\langle \alpha, \beta \rangle\big|_p := \star\big(\alpha \wedge \star\beta\big)\big|_p.$$
The $L^2$-bilinear form for the space of $k$-forms is defined by
$$\langle\!\langle \alpha, \beta \rangle\!\rangle := \int_M \alpha \wedge \star\beta.$$
In the case of a Riemannian manifold, each is an inner product (i.e. is positive-definite).

Lie derivative
We define the Lie derivative $\mathcal{L}_X : \Omega^k(M) \to \Omega^k(M)$ through Cartan's magic formula for a given section $X \in \Gamma(TM)$ as
$$\mathcal{L}_X := d \circ \iota_X + \iota_X \circ d.$$
It describes the change of a $k$-form along a flow associated to the section $X$.

Laplace–Beltrami operator
The Laplacian $\Delta : \Omega^k(M) \to \Omega^k(M)$ is defined as $\Delta := -(d\delta + \delta d)$.

Important definitions
Definitions on $\Omega^k(M)$: a form $\alpha \in \Omega^k(M)$ is called...
closed if $d\alpha = 0$;
exact if $\alpha = d\beta$ for some $\beta \in \Omega^{k-1}(M)$;
coclosed if $\delta\alpha = 0$;
coexact if $\alpha = \delta\beta$ for some $\beta \in \Omega^{k+1}(M)$;
harmonic if closed and coclosed.

Cohomology
The $k$-th cohomology of a manifold $M$ and its exterior derivative operators is given by
$$H^k(M) := \ker(d_k)\, /\, \operatorname{im}(d_{k-1}).$$
Two closed $k$-forms $\alpha, \beta \in \Omega^k(M)$ are in the same cohomology class if their difference is an exact form, i.e.
$$[\alpha] = [\beta] \iff \alpha - \beta = d\eta \text{ for some } \eta \in \Omega^{k-1}(M).$$
A closed surface of genus $g$ will have $2g$ generators which are harmonic.

Dirichlet energy
Given $\alpha \in \Omega^k(M)$, its Dirichlet energy is
$$\mathcal{E}_{\mathrm{D}}(\alpha) := \tfrac{1}{2} \langle\!\langle d\alpha, d\alpha \rangle\!\rangle + \tfrac{1}{2} \langle\!\langle \delta\alpha, \delta\alpha \rangle\!\rangle.$$

Properties
Exterior derivative properties
$$\int_M d\alpha = \int_{\partial M} \alpha \quad \text{(Stokes' theorem)}$$
$$dd\alpha = 0 \quad \text{for } \alpha \in \Omega^k(M) \quad \text{(cochain complex)}$$
$$d(\alpha \wedge \beta) = d\alpha \wedge \beta + (-1)^k\, \alpha \wedge d\beta \quad \text{for } \alpha \in \Omega^k(M) \quad \text{(Leibniz rule)}$$
$$df(X) = \partial_X f \quad \text{for } f \in \Omega^0(M) \quad \text{(directional derivative)}$$

Exterior product properties
$$\alpha \wedge \beta = (-1)^{kl}\, \beta \wedge \alpha \quad \text{for } \alpha \in \Omega^k(M),\ \beta \in \Omega^l(M) \quad \text{(alternating)}$$
$$(\alpha \wedge \beta) \wedge \gamma = \alpha \wedge (\beta \wedge \gamma) \quad \text{(associativity)}$$
$$(\lambda\alpha) \wedge \beta = \lambda\,(\alpha \wedge \beta) \quad \text{for } \lambda \in \mathbb{R} \quad \text{(compatibility of scalar multiplication)}$$
$$\alpha \wedge (\beta + \gamma) = \alpha \wedge \beta + \alpha \wedge \gamma \quad \text{(distributivity over addition)}$$
$$\alpha \wedge \alpha = 0 \quad \text{when } k \text{ is odd or } \operatorname{rank} \alpha \leq 1.$$
The rank of a $k$-form $\alpha$ means the minimum number of monomial terms (exterior products of one-forms) that must be summed to produce $\alpha$.
Pull-back properties
$$d(\varphi^{*}\alpha) = \varphi^{*}(d\alpha) \quad \text{(commutative with } d\text{)}$$
$$\varphi^{*}(\alpha \wedge \beta) = (\varphi^{*}\alpha) \wedge (\varphi^{*}\beta) \quad \text{(distributes over } \wedge\text{)}$$
$$(\varphi_1 \circ \varphi_2)^{*} = \varphi_2^{*}\, \varphi_1^{*} \quad \text{(contravariant)}$$
$$\varphi^{*}f = f \circ \varphi \quad \text{for } f \in \Omega^0(N) \quad \text{(function composition)}$$

Musical isomorphism properties
$$(X^\flat)^\sharp = X, \qquad (\alpha^\sharp)^\flat = \alpha$$

Interior product properties
$$\iota_X \iota_X = 0 \quad \text{(nilpotent)}$$
$$\iota_X \iota_Y = -\iota_Y \iota_X$$
$$\iota_X(\alpha \wedge \beta) = (\iota_X\alpha) \wedge \beta + (-1)^k\, \alpha \wedge (\iota_X\beta) \quad \text{for } \alpha \in \Omega^k(M) \quad \text{(Leibniz rule)}$$
$$\iota_X\alpha = \alpha(X) \quad \text{for } \alpha \in \Omega^1(M)$$
$$\iota_X f = 0 \quad \text{for } f \in \Omega^0(M)$$

Hodge star properties
$$\star(\lambda\alpha + \mu\beta) = \lambda(\star\alpha) + \mu(\star\beta) \quad \text{for } \lambda, \mu \in \mathbb{R} \quad \text{(linearity)}$$
$$\star\star\alpha = s\,(-1)^{k(n-k)}\, \alpha \quad \text{for } \alpha \in \Omega^k(M),\ n = \dim M, \text{ and } s \text{ the sign of the metric} \quad \text{(inversion)}$$
$$\star(f\alpha) = f\,(\star\alpha) \quad \text{for } f \in \Omega^0(M) \quad \text{(commutative with } 0\text{-forms)}$$
$$\langle \star\alpha, \star\beta \rangle = s\, \langle \alpha, \beta \rangle \quad \text{(Hodge star preserves } k\text{-form norm)}$$
$$\star\mathbf{1} = \mathbf{vol} \quad \text{(Hodge dual of constant function 1 is the volume form)}$$

Co-differential operator properties
$$\delta\delta = 0 \quad \text{(nilpotent)}$$
$$\star\,\delta\alpha = (-1)^k\, d\,{\star}\alpha \quad \text{for } \alpha \in \Omega^k(M) \quad \text{(Hodge adjoint to } d\text{)}$$
$$\langle\!\langle d\alpha, \beta \rangle\!\rangle = \langle\!\langle \alpha, \delta\beta \rangle\!\rangle \quad \text{if } \partial M = \emptyset \quad \text{(} L^2\text{-adjoint to } d\text{)}$$
In general,
$$\langle\!\langle d\alpha, \beta \rangle\!\rangle - \langle\!\langle \alpha, \delta\beta \rangle\!\rangle = \int_{\partial M} \alpha \wedge \star\beta \quad \text{for } \alpha \in \Omega^{k-1}(M),\ \beta \in \Omega^k(M).$$

Lie derivative properties
$$d\,\mathcal{L}_X = \mathcal{L}_X\, d \quad \text{(commutative with } d\text{)}$$
$$\iota_X \mathcal{L}_X = \mathcal{L}_X \iota_X \quad \text{(commutative with } \iota_X\text{)}$$
$$\mathcal{L}_X(\alpha \wedge \beta) = (\mathcal{L}_X\alpha) \wedge \beta + \alpha \wedge (\mathcal{L}_X\beta) \quad \text{(Leibniz rule)}$$

Exterior calculus identities
$$\langle \alpha, \beta \rangle\, \mathbf{vol} = \alpha \wedge \star\beta \quad \text{(bilinear form)}$$
$$[[X, Y], Z] + [[Y, Z], X] + [[Z, X], Y] = 0 \quad \text{(Jacobi identity)}$$

Dimensions
If $n = \dim M$,
$$\dim \Omega^k(M) = \binom{n}{k} \quad \text{for } 0 \leq k \leq n, \qquad \dim \Omega^k(M) = 0 \quad \text{for } k < 0 \text{ or } k > n.$$
If $(dX^1, \ldots, dX^n)$ is a basis of $\Omega^1(M)$, then a basis of $\Omega^k(M)$ is
$$\{\, dX^{i_1} \wedge \cdots \wedge dX^{i_k} \;:\; 1 \leq i_1 < \cdots < i_k \leq n \,\}.$$

Exterior products
Let $X, Y \in \Gamma(TM)$ be vector fields.

Projection and rejection
$$\langle \iota_X\alpha, \beta \rangle = \langle \alpha, X^\flat \wedge \beta \rangle \quad \text{for } \alpha \in \Omega^k(M),\ \beta \in \Omega^{k-1}(M) \quad \text{(interior product dual to wedge)}$$
If $\|X\| = 1$, then $\iota_X(X^\flat \wedge \alpha)$ is the projection of $\alpha$ onto the orthogonal complement of $X^\flat$, and $X^\flat \wedge (\iota_X\alpha)$ is the rejection of $\alpha$, the remainder of the projection; thus
$$\alpha = \iota_X(X^\flat \wedge \alpha) + X^\flat \wedge (\iota_X\alpha) \quad \text{(projection–rejection decomposition)}$$
Given the boundary $\partial M$ with unit normal vector $N$,
$\iota_N(N^\flat \wedge \alpha)$ extracts the tangential component at the boundary, and
$N^\flat \wedge (\iota_N\alpha)$ extracts the normal component at the boundary.

Sum expressions
Given a positively oriented orthonormal frame $(E_1, \ldots, E_n)$.

Hodge decomposition
If $\partial M = \emptyset$, there exist $\alpha \in \Omega^{k-1}(M)$, $\beta \in \Omega^{k+1}(M)$ and a harmonic form $\gamma \in \Omega^k(M)$, such that any $\omega \in \Omega^k(M)$ decomposes as
$$\omega = d\alpha + \delta\beta + \gamma.$$

Poincaré lemma
If a boundaryless manifold $M$ has trivial cohomology $H^k(M) = \{0\}$, then any closed $\alpha \in \Omega^k(M)$ is exact. This is the case if M is contractible.

Relations to vector calculus
Identities in Euclidean 3-space
Let the Euclidean metric be $g(X, Y) := \langle X, Y \rangle = X \cdot Y$. We use the differential operator $\nabla$ for $\mathbb{R}^3$.
$$X \cdot (Y \times Z) = \star\big(X^\flat \wedge Y^\flat \wedge Z^\flat\big) \quad \text{(scalar triple product)}$$
$$X \times Y = \big(\star(X^\flat \wedge Y^\flat)\big)^\sharp \quad \text{(cross product)}$$
$$X \cdot Y = \star\big(X^\flat \wedge \star Y^\flat\big) \quad \text{(scalar product)}$$
$$\nabla f = (df)^\sharp \quad \text{(gradient)}$$
$$X \cdot \nabla f = \partial_X f \quad \text{(directional derivative)}$$
$$\nabla \cdot X = \star\, d\, {\star} X^\flat \quad \text{(divergence)}$$
$$\nabla \times X = \big(\star\, dX^\flat\big)^\sharp \quad \text{(curl)}$$
$$\int_M (\nabla \cdot X)\, \mathbf{vol} = \int_{\partial M} X \cdot N\, \omega,$$
where $N$ is the unit normal vector of $\partial M$ and $\omega$ is the area form on $\partial M$ (divergence theorem).

Lie derivatives
$$\mathcal{L}_X f = X \cdot \nabla f \quad (0\text{-forms})$$
$$\mathcal{L}_X(\rho\, \mathbf{vol}) = \big(\nabla \cdot (\rho X)\big)\, \mathbf{vol} \quad (n\text{-forms})$$
References Calculus Mathematical identities Mathematics-related lists Differential forms Differential operators Generalizations of the derivative
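Via the musical isomorphisms and the Hodge star, the cochain property $dd = 0$ reduces in Euclidean 3-space to the familiar vector-calculus facts $\nabla \times (\nabla f) = 0$ and $\nabla \cdot (\nabla \times X) = 0$. A minimal symbolic check of this correspondence, using Python's sympy.vector module (an illustrative aid, not part of the article; the particular fields chosen are arbitrary):

```python
from sympy.vector import CoordSys3D, gradient, curl, divergence

# Cartesian frame on R^3. d(df) = 0 for a 0-form corresponds to
# curl(grad f) = 0, and d(d alpha) = 0 for a 1-form to div(curl F) = 0.
N = CoordSys3D('N')
x, y, z = N.x, N.y, N.z

f = x**2 * y + y * z**3                      # an arbitrary smooth scalar field
F = x*y*z*N.i + (x**2 - z)*N.j + y**3*N.k    # an arbitrary smooth vector field

print(curl(gradient(f)))    # the zero vector: curl of a gradient vanishes
print(divergence(curl(F)))  # 0: divergence of a curl vanishes
```

Both results are identically zero for any smooth fields; the polynomial choices above merely let sympy cancel the mixed partial derivatives exactly.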
Exterior calculus identities
[ "Mathematics", "Engineering" ]
1,627
[ "Mathematical analysis", "Mathematical theorems", "Tensors", "Calculus", "Differential forms", "Mathematical identities", "Mathematical problems", "Differential operators", "Algebra" ]
58,083,320
https://en.wikipedia.org/wiki/Carbonyl%20selenide
Carbonyl selenide is a chemical compound with the chemical formula COSe. It is a linear molecule that is primarily of interest for research purposes. Properties Carbonyl selenide is a colorless gas with an unpleasant odor. Although the compound is quite stable, its solutions gradually revert to elemental selenium and carbon monoxide. Synthesis and reactions Carbonyl selenide can be produced by treating selenium with carbon monoxide in the presence of amines. It is used in organoselenium chemistry as a means of incorporating selenium into organic compounds, e.g. for the preparation of selenocarbamates (O-selenocarbamates and Se-selenocarbamates, where R is organyl and R' and R" are any group, typically H or organyl). References Inorganic carbon compounds Oxides Selenides
Carbonyl selenide
[ "Chemistry" ]
178
[ "Inorganic compounds", "Inorganic carbon compounds", "Oxides", "Salts" ]
58,083,428
https://en.wikipedia.org/wiki/NGC%202108
NGC 2108 is a globular cluster located in the constellation of Dorado. NGC 2108 was discovered in 1835 by John Herschel. See also Globular Cluster References External links NGC 2108 on SIMBAD Astronomical objects discovered in 1835 2108 Globular clusters Dorado
NGC 2108
[ "Astronomy" ]
59
[ "Dorado", "Constellations" ]
58,084,076
https://en.wikipedia.org/wiki/Julia%20Weertman
Julia Randall Weertman (February 10, 1926 – July 31, 2018) was an American materials scientist who taught at Northwestern University as the Walter P. Murphy Professor of Materials Science and Engineering. Education She was the first female student of the College of Science and Engineering at the Carnegie Institute of Technology, where she earned her baccalaureate and graduate degrees. Weertman met her husband Johannes at Carnegie, and both later joined the Northwestern University faculty. Career In 1986, Julia Weertman was awarded a Guggenheim Fellowship. She became the first woman in the United States to lead a materials science department when she was appointed chair of Northwestern's Department of Materials Science and Engineering the next year. Weertman was granted membership into the National Academy of Engineering in 1988, "for exceptional research on failure mechanisms in high-temperature alloys." In 1989, she became the first female member of the Board of Directors of The Minerals, Metals & Materials Society. Fellowships She was also a fellow of the American Academy of Arts and Sciences, ASM International, the American Physical Society, and the American Geophysical Union and the first female Fellow of The Minerals, Metals & Materials Society. Death Weertman died, aged 92, on July 31, 2018. References 1926 births 2018 deaths American materials scientists Northwestern University faculty Fellows of the American Academy of Arts and Sciences Members of the United States National Academy of Engineering Carnegie Mellon University alumni Women materials scientists and engineers 20th-century American engineers 20th-century American women engineers Fellows of the Minerals, Metals & Materials Society
Julia Weertman
[ "Materials_science", "Technology" ]
307
[ "Women materials scientists and engineers", "Materials scientists and engineers", "Women in science and technology" ]
58,084,624
https://en.wikipedia.org/wiki/Electromagnetic%20radio%20frequency%20convergence
Electromagnetic radio frequency (RF) convergence is a signal-processing paradigm that is utilized when several RF systems have to share a finite amount of resources among each other. RF convergence indicates the ideal operating point for the entire network of RF systems, at which the systems can efficiently share resources in a manner that is mutually beneficial. With communications spectral congestion recently becoming an increasingly important issue for the telecommunications sector, researchers have begun studying methods of achieving RF convergence for cooperative spectrum sharing between remote sensing systems (such as radar) and communications systems. Consequently, RF convergence is commonly referred to as the operating point of a remote sensing and communications network at which spectral resources are jointly shared by all nodes (or systems) of the network in a mutually beneficial manner. Remote sensing and communications have conflicting requirements and functionality. Furthermore, spectrum sharing approaches between remote sensing and communications have traditionally been to separate or isolate both systems (temporally, spectrally or spatially). This results in stovepipe designs that lack backward compatibility. Future hybrid RF systems demand coexistence and cooperation between sensing and communications capabilities, with flexible system design and implementation. Hence, achieving RF convergence can be an incredibly complex and difficult problem to solve. Even for a simple network consisting of one remote sensing system and one communications system, there are several independent factors in the time, space, and frequency domains that have to be taken into consideration in order to determine the optimal method to share spectral resources. For a given spectrum-space-time resource manifold, a practical network will incorporate numerous remote sensing modalities and communications systems, making the problem of achieving RF convergence intractable.
Motivation Spectral congestion is caused by too many RF communications users concurrently accessing the electromagnetic spectrum. This congestion may degrade communications performance and decrease or even restrict access to spectral resources. Spectrum sharing between radar and communications applications was proposed as a way to alleviate the issues caused by spectral congestion. This has led to a greater emphasis being placed by researchers on investigating methods of radar-communications cooperation and co-design. Government agencies such as the Defense Advanced Research Projects Agency (DARPA) and others have begun funding research that investigates methods of coexistence for military radar systems, such that their performance will not be affected when sharing spectrum with communications systems. These agencies are also interested in fundamental research investigating the limits of cooperation between military radar and communications systems, which in the long run will lead to better co-design methods that improve performance. However, the problems caused by spectrum sharing do not affect just military systems. A wide variety of remote sensing and communications applications, such as automotive radars, medical devices, and 5G, will be adversely affected by sharing spectrum. Furthermore, applications like autonomous automobiles and smart home networks stand to benefit substantially from cooperative remote sensing and communications. Consequently, researchers have started investigating fundamental approaches to joint remote sensing and communications. Remote sensing and communications fundamentally tend to conflict with one another. Remote sensing typically transmits known information into the environment (or channel) and measures a reflected response, which is then used to extract unknown information about the environment.
For example, in the case of a radar system, the known information is the transmitted signal and the unknown information is the target channel that is to be estimated. On the other hand, a communications system basically sends unknown information into a known environment. Although a communications system does not know what the environment (also called a propagation channel) is beforehand, every system operates under the assumption that it is either previously estimated or its underlying probability distribution is known. Due to both systems' conflicting nature, it is clear that when it comes to designing systems that can jointly sense and communicate, the solution is non-trivial. Due to difficulties in jointly sensing and communicating, both systems are often designed to be isolated in time, space, and/or frequency. Often, the only time legacy systems consider the other user in their mode of operation is through regulations, defined by agencies such as the FCC (United States), that constrain the other user's functionality. As spectral congestion continues to force both remote sensing and communications systems to share spectral resources, achieving RF convergence is the solution to optimally function in an increasingly crowded wireless spectrum. Applications of joint sensing-communications systems Several applications can benefit from RF convergence research, such as autonomous driving, cloud-based medical devices, and light-based applications. Each application may have different goals, requirements, and regulations, which present different challenges to achieving RF convergence. A few examples of joint sensing-communications applications are listed below.
Intelligent Transport Systems (Vehicle-to-vehicle Communications)
Commercial Flight Control Communications & Military Radar
Remote Medical Monitoring and Wearable Medical Sensors
High Frequency Imaging and Communications
Li-Fi and Lidar
RFID & Asset Tracking Capable Wireless Sensor Networks
Joint sensing-communications system design and integration Joint sensing-communications systems can be designed based on four different types of system integration. These levels range from complete isolation to complete co-design of systems. Some levels of integration, such as non-integration (or isolation) and coexistence, are not complex in nature and do not require an overhaul of how either sensing or communications systems operate. However, this lack of complexity also implies that joint systems employing such methods of system integration will not see significant performance benefits on achieving RF convergence. As such, non-integration and coexistence methods are more short-term solutions to the spectral congestion problem. In the long term, systems will have to be co-designed together to see significant improvements in joint system performance. Non-integration Systems employing non-integration methods are forced to operate in isolated regions of spectrum-space-time. However, in the real world, perfect isolation is not realizable, and as a result, isolated systems will leak out and occupy segments of spectrum-space-time occupied by other systems. This is why systems that employ non-integration methods end up interfering with each other, and due to the philosophy of isolation being employed, each system makes no attempt at interference mitigation. Consequently, each user's performance is degraded. Non-integration is one of the common and traditional solutions, and as highlighted here, is a part of the problem.
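The practical effect of the integration levels discussed in this section can be illustrated with a toy link-budget calculation. All numbers below are hypothetical and purely illustrative: the sketch assumes that each integration level cancels some fraction of the interfering system's received power, and compares the resulting signal-to-interference-plus-noise ratio (SINR).

```python
import math

def sinr_db(signal_w, interference_w, noise_w):
    """Signal-to-interference-plus-noise ratio, in dB."""
    return 10 * math.log10(signal_w / (interference_w + noise_w))

signal = 1e-9         # received power of the desired signal, W (hypothetical)
interference = 5e-10  # received power from the other RF system, W (hypothetical)
noise = 1e-12         # thermal noise floor, W (hypothetical)

# Fraction of interference each level is assumed able to cancel (illustrative):
levels = {
    "non-integration": 0.0,   # no mitigation attempted
    "coexistence":     0.7,   # mitigation limited by estimated knowledge
    "cooperation":     0.99,  # mitigation aided by shared knowledge
}

for name, cancelled in levels.items():
    residual = interference * (1 - cancelled)
    print(f"{name:16s} SINR = {sinr_db(signal, residual, noise):5.1f} dB")
```

Under these assumed numbers, the SINR improves monotonically with the level of integration, which is the qualitative point of the taxonomy; real systems would of course require channel models, waveform design, and estimation error analysis rather than a fixed cancellation fraction.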
Coexistence Remote sensing and communications systems that implement coexistence methods are forced to coexist with each other and treat each other as sources of interference. This means that unlike non-integration methods, each system tries to perform interference mitigation. However, since both systems are not cooperative and have no knowledge about the other system, any information required to perform such interference mitigation is not shared or known and has to be estimated. As a result, interference mitigation performance is limited since it is dependent on the estimated information. Cooperation Cooperative techniques, unlike coexistence methods, do not require that both sensing and communications systems treat each other as sources of interference; instead, both systems share some knowledge or information. Cooperative methods exploit this joint knowledge to enable both systems to effectively perform interference mitigation and subsequently improve their performance. Systems willingly share necessary information with each other in order to facilitate mutual interference mitigation. Cooperative methods are the first step toward designing joint systems and achieving RF convergence as an effective solution to the spectral congestion problem. Co-design Co-design methods consist of jointly considering radar and communications systems when designing new systems to optimally share spectral resources. Such systems are jointly designed from scratch to efficiently utilize the spectrum and can potentially result in performance benefits when compared to an isolated approach to system design. Co-designed systems are not necessarily physically co-located. When operating from the same platform, co-design includes the cases where radar beams and waveforms are modulated to convey communications messages, an approach which is typically referred to as dual-function radar communications systems.
For example, some recent experimentally demonstrated co-design approaches include:
Tandem hopped radar and communications (THoRaCs), where undistorted orthogonal frequency-division multiplexing (OFDM) sub-carriers are embedded into a frequency modulation (FM) radar waveform
Phase-attached radar/communication (PARC), where FM and continuous phase modulation (CPM) are merged into a single waveform
Far-field radiated emission design (FFRED), where FM multiple-input and multiple-output (MIMO) waveforms produce separate radar and communication beams in different spatial directions
See also Radar Communications Systems Co-channel interference Spectrum Management Radio resource management References Radio frequency propagation Electromagnetic components Electromagnetism
Electromagnetic radio frequency convergence
[ "Physics" ]
1,717
[ "Physical phenomena", "Electromagnetism", "Spectrum (physical sciences)", "Radio frequency propagation", "Electromagnetic spectrum", "Waves", "Fundamental interactions" ]
58,085,116
https://en.wikipedia.org/wiki/NGC%203883
NGC 3883 is a large low surface brightness spiral galaxy located about 330 million light-years away in the constellation Leo. NGC 3883 has a prominent bulge but does not host an AGN. The galaxy also has flocculent spiral arms in its disk. It was discovered by astronomer William Herschel on April 13, 1785 and is a member of the Leo Cluster. Star formation Despite being rich in neutral atomic hydrogen (HI), NGC 3883 is very red and has a low amount of H-alpha emission. This suggests that star formation in the galaxy's outer regions ended a long time ago, while the inner regions continued to form stars that enriched the interstellar medium (ISM) and eventually used up the remaining gas. Possibly, the outer regions of NGC 3883 went through only a few generations of star formation because the HI density has been low throughout the galaxy's life. However, J. Donas et al. suggest that the UV emission of NGC 3883, which comes mainly from the disk of the galaxy, originates from young intermediate-mass stars and reveals star formation in the outer regions of NGC 3883. Because of the low level of ongoing star formation in NGC 3883, it has been classified as an anemic galaxy. See also List of NGC objects (3001–4000) NGC 4921, an anemic spiral galaxy in the Coma Cluster Malin 1, a giant low surface brightness galaxy References External links 3883 36740 Leo (constellation) Leo Cluster Astronomical objects discovered in 1785 Flocculent spiral galaxies Low surface brightness galaxies 6754
NGC 3883
[ "Astronomy" ]
325
[ "Leo (constellation)", "Constellations" ]
58,085,715
https://en.wikipedia.org/wiki/2018%20Station%20Square%20Derailment
The 2018 Station Square Derailment occurred on the afternoon of August 5, 2018, in Pittsburgh, Pennsylvania, United States. The accident involved a Norfolk Southern manifest freight train, about 57 cars long, traveling at approximately 25 mph at the time of the incident. 10 cars derailed: 3 fell onto the Port Authority of Allegheny County tracks, blocking the line and taking 6 containers with them, while the other 7 and their associated containers remained on the NS right of way. No injuries were reported, and the other 47 cars were not damaged. The accident caused $1.8M in damage, and was caught on video. An investigation by the U.S. Federal Railroad Administration (FRA) agreed with an initial assessment by Norfolk Southern Railway that a defect in the rails caused the accident. A rail inspection performed approximately 3 weeks earlier had failed to detect the defect. The investigation faulted Norfolk Southern's contractor Sperry Rail Service for failing to detect the defect. Cleanup and removal The cleanup and removal started immediately after the derailment and ended on August 8, when all 57 cars and their containers were removed. Train The train involved in the incident was heading from New Jersey to Chicago. It consisted of 3 locomotives and 57 well cars loaded with double-stack intermodal containers, one of which was carrying Listerine. The consist was 7,687 feet (1.46 miles) long, and weighed 4,838 tons. References Derailments in the United States Accidents and incidents involving Norfolk Southern Railway
2018 Station Square Derailment
[ "Technology" ]
306
[ "Railway accidents and incidents", "Rail accident stubs" ]
58,085,890
https://en.wikipedia.org/wiki/Ash%20Meadows%20Sky%20Ranch
Ash Meadows Sky Ranch was a brothel in Nevada near the ghost town of Shoshone, California. The procurer Vickie Starr said of the establishment: “We were six prostitutes in the middle of the desert, isolated from the rest of the world, screwing horny and frustrated men for money. That was our life and as far as we were concerned it was as normal as Rice Krispies.” Built as a motel and restaurant called the Ash Meadows Lodge, a brothel was later added to the property. Located on a gravel road 2.5 miles from the California border, it was one of the first three brothels to be licensed by Nye County in 1958. Vickie Starr bought the brothel in 1971 after selling Vickie’s Star Ranch in Beatty. She changed the name to Ash Meadows Sky Ranch, “sky” coming from the brothel's airstrip. The ranch was one of the most impressive brothels in Nevada during the period, featuring a swimming pool, restaurant, hotel, and golf course. A few years later Nye County declined to renew the brothel license because, due to its remote location, it was costly for the medical examiner to visit weekly for the prostitutes' check-ups. The brothel closed. Several scenes from the 1987 film Cherry 2000 were shot at the disused brothel. See also List of brothels in Nevada References Brothels in Nevada 1958 establishments in Nevada Nye County, Nevada
Ash Meadows Sky Ranch
[ "Biology" ]
280
[ "Behavior", "Sexuality stubs", "Sexuality" ]
58,086,456
https://en.wikipedia.org/wiki/Nuclear%20Physics%20and%20Atomic%20Energy%20%28journal%29
Nuclear Physics and Atomic Energy is a quarterly peer-reviewed open-access scientific journal published by the Institute for Nuclear Research of the National Academy of Sciences of Ukraine. It was established in 2000 and covers all aspects of nuclear physics, particle physics, atomic energy, radiation physics, plasma physics, radiobiology, radioecology, and experimental techniques and methods. The editor-in-chief is V.I. Slisenko (Institute for Nuclear Research). Articles are published in English, Ukrainian or Russian with titles and abstracts in all three languages. Abstracting and indexing The journal is abstracted and indexed in Scopus. References External links Quarterly journals Multilingual journals Nuclear physics journals Particle physics journals Plasma science journals Academic journals established in 2002
Nuclear Physics and Atomic Energy (journal)
[ "Physics" ]
148
[ "Plasma science journals", "Nuclear physics journals", "Plasma physics", "Nuclear and atomic physics stubs", "Particle physics", "Plasma physics stubs", "Nuclear physics", "Particle physics stubs", "Particle physics journals" ]
58,086,538
https://en.wikipedia.org/wiki/TBPO
TBPO is an extremely toxic bicyclic phosphate convulsant and GABA receptor antagonist. It is the most toxic bicyclic phosphate known, with an LD50 of 36 μg/kg in mice. Some sources claim that TBPO is as toxic as VX. Synthesis The synthesis is equivalent to the synthesis of IPTBO, while the triol is produced by the condensation between 3,3-dimethylbutyraldehyde and formaldehyde, analogous to the synthesis of trimethylolpropane. See also TBPS IPTBO References Convulsants Neurotoxins Bicyclic phosphates GABAA receptor negative allosteric modulators Tert-butyl compounds
TBPO
[ "Chemistry" ]
152
[ "Neurochemistry", "Neurotoxins" ]
58,089,132
https://en.wikipedia.org/wiki/ABB%20Research%20Award%20in%20honor%20of%20Hubertus%20von%20Gruenberg
The ABB Research Award in Honor of Hubertus von Gruenberg is named after Dr. Hubertus von Gruenberg, who was the Chairman of the Board of Directors of the technology company ABB from 2007 to 2015. Endowment and format Awarded every three years, the prize is endowed with a research grant of US$300,000. The prize is awarded to a postdoctoral researcher working in the fields of energy, manufacturing, transport, infrastructure, digitalization, or related fields. The grant enables the awardee to continue research on his or her topic for three years. Purpose The award was created to sustain high levels of research and encourage the development of future technologies and applications. Awardees 2016 Dr. Jef Beerten from KU Leuven for his PhD thesis on "Modeling and Control of DC Power Systems" 2019 Dr. Ambuj Varshney from Uppsala University/University of California, Berkeley for his PhD thesis on "Enabling Sustainable Networked Embedded Systems" Jury 2016 The 2016 jury consisted of Prof. Robert Armstrong, Massachusetts Institute of Technology Prof. Ulrike Grossner, Swiss Federal Institute of Technology in Zurich (ETH Zurich) Prof. Nina Thornhill, Imperial College London Prof. Zheyao Wang, Tsinghua University, Beijing Bazmi Husain, ABB's Chief Technology Officer Dr. Hubertus von Gruenberg, former ABB Chairman 2019 The 2019 jury consisted of Prof. Nina Thornhill (Imperial College London) Prof. M. Granger Morgan (Carnegie Mellon University Pittsburgh) Prof. Roland Siegwart (ETH Zurich) Prof. C.L. Philip Chen (University of Macau, Taipa, Macau) Bazmi Husain, CTO of ABB Dr. Hubertus von Grünberg, former ABB Chairman 2022 The 2022 jury consisted of Prof. Ambuj Varshney (National University of Singapore) Prof. Nina Thornhill (Imperial College London) Prof. Roland Siegwart (ETH Zurich) Prof. Manfred Morari (University of Pennsylvania) Dr. Bernhard Eschermann, CTO of ABB Process Automation Dr. Hubertus von Grünberg, former ABB Chairman References Science and technology awards
ABB Research Award in honor of Hubertus von Gruenberg
[ "Technology" ]
449
[ "Science and technology awards" ]
58,090,938
https://en.wikipedia.org/wiki/C17H12O7
{{DISPLAYTITLE:C17H12O7}} The molecular formula C17H12O7 may refer to: Aflatoxin B1 exo-8,9-epoxide, a toxic metabolite of aflatoxin B1 Aflatoxin M1, a chemical compound of the aflatoxin class
C17H12O7
[ "Chemistry" ]
67
[ "Isomerism", "Set index articles on molecular formulas" ]
58,090,951
https://en.wikipedia.org/wiki/C17H14N2O2
{{DISPLAYTITLE:C17H14N2O2}} The molecular formula C17H14N2O2 (molar mass: 278.305 g/mol, exact mass: 278.1055 u) may refer to: Bimakalim Sudan Red G Molecular formulas
C17H14N2O2
[ "Physics", "Chemistry" ]
64
[ "Molecules", "Set index articles on molecular formulas", "Isomerism", "Molecular formulas", "Matter" ]
58,091,424
https://en.wikipedia.org/wiki/Pool%20fire
A pool fire is a type of diffusion flame where a layer of volatile liquid fuel is evaporating and burning. The fuel layer can be either on a horizontal solid substrate or floating on a higher-density liquid, usually water. Pool fires are an important scenario in fire process safety and combustion science, as large amounts of liquid fuels are stored and transported by different industries. Physical properties The most important physical parameter describing a pool fire is the heat release rate, which determines the minimum safe distance needed to avoid burns from thermal radiation. The heat release rate is limited by the rate of evaporation of the fuel, as the combustion reaction takes place in the gas phase. The evaporation rate, in turn, is determined by other physical parameters, such as the depth, surface area and shape of the pool, as well as the fuel's boiling point, heat of vaporization, heat of combustion, thermal conductivity and others. A feedback loop exists between the heat release rate and the evaporation rate, as a significant part of the energy released in the combustion reaction is transmitted from the gas phase to the liquid fuel, where it can supply the needed heat of vaporization. In the case of large pool fires, most of this heat transfer happens in the form of thermal radiation. Typical fuels in accidental pool fires, or experiments simulating them, include aliphatic hydrocarbons (n-heptane, liquefied propane gas), aromatic hydrocarbons (toluene, xylene), alcohols (methanol, ethanol) or mixtures thereof (kerosene). A pool fire involving a water-insoluble fuel should not be extinguished with water, as this can trigger explosive boiling and spattering of the burning material. Open-top tank fires are pool fires of industrial scale that occur when the roof of an atmospheric tank fails due to an internal tank blast, followed by the contents of the tank catching fire.
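The dependence of the evaporation (burning) rate on pool size described above is often estimated with the empirical Babrauskas correlation, in which the mass burning rate per unit area approaches an asymptotic value as the pool diameter grows. A rough sketch in Python, using representative handbook-style values for n-heptane; the specific numbers are illustrative assumptions and should be taken from a current fire-safety reference for any real calculation:

```python
import math

def pool_fire_hrr(diameter_m, m_inf, k_beta, dh_c):
    """Estimate the heat release rate (W) of a circular pool fire using the
    Babrauskas correlation: m'' = m''_inf * (1 - exp(-k*beta*D))."""
    burn_rate = m_inf * (1 - math.exp(-k_beta * diameter_m))  # kg/(m^2*s)
    area = math.pi * (diameter_m / 2) ** 2                    # m^2
    return burn_rate * area * dh_c                            # W

# Representative literature-style values for n-heptane (illustrative only):
M_INF = 0.101    # asymptotic mass burning rate, kg/(m^2*s)
K_BETA = 1.1     # combined extinction coefficient * beam-length factor, 1/m
DH_C = 44.6e6    # effective heat of combustion, J/kg

Q = pool_fire_hrr(2.0, M_INF, K_BETA, DH_C)  # a 2 m diameter heptane pool
print(f"Heat release rate: {Q / 1e6:.1f} MW")

# Point-source radiation model for a rough safe-distance estimate
# (radiative fraction ~0.3 and harm threshold ~5 kW/m^2, both illustrative):
chi_r, q_limit = 0.3, 5e3
R = math.sqrt(chi_r * Q / (4 * math.pi * q_limit))
print(f"Distance to {q_limit / 1e3:.0f} kW/m^2 flux: {R:.1f} m")
```

With these assumed inputs the estimate comes out on the order of ten megawatts, illustrating why even a modest open pool demands a standoff distance of several meters; the point-source model underlying the second estimate is itself a simplification that degrades close to the flame.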
If a layer of water is present underneath the fuel and the fuel is a mixture of chemical species with several different boiling points, a boilover may eventually occur, greatly aggravating the fire. The boilover onset occurs as soon as a hot zone propagates down through the fuel, reaching the water and making it boil. See also Radiative transfer Fire safety References Types of fire Process safety
Pool fire
[ "Chemistry", "Engineering" ]
481
[ "Chemical process engineering", "Safety engineering", "Process safety" ]
58,091,599
https://en.wikipedia.org/wiki/Healthy%20building
Healthy building refers to an emerging area of interest that supports the physical, psychological, and social health and well-being of people in buildings and the built environment. Buildings can be key promoters of health and well-being since most people spend a majority of their time indoors. According to the National Human Activity Pattern Survey, Americans spend "an average of 87% of their time in enclosed buildings and about 6% of their time in enclosed vehicles." Healthy building can be seen as the next generation of green building that not only includes environmentally responsible and resource-efficient building concepts, but also integrates human well-being and performance. These benefits can include "reducing absenteeism and presenteeism, lowering health care costs, and improving individual and organizational performance." Integrated design Healthy building involves many different concepts, fields of interest, and disciplines. 9 Foundations describes healthy building as an approach built on building science and health science. An integrated design team can consist of stakeholders and specialists such as facility managers, architects, building engineers, health and wellness experts, and public health partners. Conducting charrettes with an integrated design team can foster collaboration and help the team develop goals, plans, and solutions. Buildings and health components There are many different components that can support health and well-being in buildings. Indoor air quality Spengler considers indoor air quality an important determinant of healthy design. Buildings with poor indoor air quality can contribute to chronic lung diseases such as asthma, asbestosis and lung cancer. Chemical emissions can be outgassed by building materials, furnishings, and supplies. 
Air fresheners, cleaning products, paints, printing, flooring, and wax and polish products can also be a source of volatile organic compounds (VOCs) and semi-volatile organic compounds (SVOCs). The LEED v4 Handbook posits that indoor air quality is "one of the most pivotal factors in maintaining building occupants' safety, productivity, and well-being." Ventilation Higher rates of ventilation affect indoor pollutants, odors, and the perceived freshness of air by diluting contaminants in the air. ASHRAE's Standard 55-2017 has minimum standards of 8.3 L/s/person. In one study, raising the rate to 15 L/s/person increased performance by 1.1% and decreased sick building symptoms by 18.8%. The Whole Building Design Guide recommends separating ventilation from thermal conditioning so as to increase comfort. Natural ventilation is discouraged in buildings that have strict filtration requirements, contaminant dilution concerns, special pressure relationships, speech privacy concerns, and internal heat load demands. The San Joaquin ASHRAE chapter recommends assessing the outside air quality and the configuration of the facade and building before demonstrating compliance and control of natural ventilation. ASHRAE Standard 55-2017 section 6.4 requires that natural ventilation be "manually controlled or controlled through the use of electrical or mechanical actuators under direct occupant control." Chris Schaffner, CEO of The Green Engineer, describes operable windows as the "HVAC engineer's ultimate safety factor." Spengler and Chen recommend using natural ventilation wherever possible. Dust and pests Dust and dirt can be a source of exposure to VOCs and lead as well as pesticides and allergens. High-efficiency filter vacuums can remove particles such as dander and allergens that otherwise result in breathing issues. A study of asthmatic children in inner-city urban communities suggests they became sensitized to cockroaches, mice, or rats due to the presence of these pests in their homes. 
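The dilution effect of higher ventilation rates can be illustrated with the standard well-mixed mass-balance model. The per-person CO2 generation rate and outdoor concentration below are assumed typical values for illustration, not figures from the studies cited above:

```python
def steady_state_co2_ppm(outdoor_ppm, gen_l_per_s, vent_l_per_s):
    """Steady-state indoor CO2 from the well-mixed dilution balance
    C = C_out + G/Q, with generation G and ventilation Q in L/s per
    person; the result is in ppm."""
    return outdoor_ppm + (gen_l_per_s / vent_l_per_s) * 1e6

# Assumed values: ~420 ppm outdoors, ~0.005 L/s CO2 generated per person:
c_min_rate  = steady_state_co2_ppm(420.0, 0.005, 8.3)   # at 8.3 L/s/person
c_high_rate = steady_state_co2_ppm(420.0, 0.005, 15.0)  # at 15 L/s/person
```

The same inverse relationship between ventilation rate and steady-state concentration applies to any contaminant with a constant indoor source, which is why higher rates improve perceived air freshness.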
The use of disposable material US culture relies heavily on disposable products, especially within healthcare, to minimize cost and time. In hospitals, for example, healthcare providers cut costs associated with sterilizing equipment between patients by using ready-to-use disposable trays. However, this may come at a cost to the environment; in one study, disposable cotton towels were suspected to have an adverse environmental impact. It is estimated that cotton production requires 6.6 kg of carbon dioxide equivalents and 0.024 kg of nitrogen emissions, in addition to a substantial amount of water, fertilizer and labor. Healthcare managers are urged to request transparency of medical product production (and waste management) lines to provide assurance that the products used have zero or minimal impact on human health and the environment. Thermal comfort Thermal comfort is influenced by factors such as air temperature, mean radiant temperature, relative humidity, air speed, metabolic rate, and clothing. Thermal conditions can affect learning, cognitive performance, task completion, disease transmission, and sleep. ASHRAE defines an acceptable thermal environment as one that 80% of occupants find acceptable, though individual occupant thermal control results in higher occupant satisfaction. Indoor spaces that are not air conditioned can create indoor heat waves if the outside air cools but the thermal mass of the building traps the hotter air inside. Cedeño-Laurent et al. believe these may become worse as climate change increases the "frequency, duration, and intensity of heat waves" and will be harder to adjust to in areas that are designed for colder climates. Moisture and humidity The Whole Building Design Guide recommends keeping indoor relative humidity between 30 and 50% to prevent unwanted moisture, and designing for proper drainage and ventilation. 
Moisture is introduced into the building by rainwater intrusion, infiltration of humid outside air, internally generated moisture, or vapor diffusion through the building envelope. High temperatures, precipitation, and building age enable mold growth, and excess moisture contributes to mold and poor indoor air quality. Vapor retarders have traditionally been used to prevent moisture in walls and roofs. Noise While noise is not always controllable, it is strongly linked, both correlationally and causally, to mental health, stress, and blood pressure. One study suggests that there is a higher correlation between noise irritation and bodily pain or discomfort in women. Effects of excessive noise pollution include hearing impairment, reduced speech intelligibility, sleep disturbance, altered physiological functions, mental illness, and impaired performance. The World Health Organization recommends creating a "National Plan for a Sustainable Noise Indoor Environment" specific to each country. Water quality Water quality can be contaminated by inorganic chemicals, organic chemicals, and microorganisms. The World Health Organization considers waterborne diseases to be one of the world's major health concerns, especially for developing countries and children. WHO recommends following water safety plans that include management, maintenance, good design, cleaning, temperature management, and preventing stagnation. Stagnant water is found to deteriorate the microbiological quality of water and increase corrosion, odors, and taste issues. The bacterial pathogen Legionella may have a higher potential for growth in large buildings due to long water distribution systems and insufficient maintenance. 
Awareness of these issues is recommended by the WHO in order to maintain water quality: Backflow Cross connections External quality management Independent water supply Material use Minimization of dead ends and stagnation Seasonal use areas Storage tank integrity Water pressure Water temperature Safety and security Concerns about safety affect the mental and possibly physical health of residents by reducing the amount of physical activity. Fear of crime can result in less physical activity as well as increased social isolation. Atkinson posits that crime is based on motivated offenders, targets, and the absence of guardians. Adjusting these in buildings may increase perceived safety. Lighting and view The type and timing of light throughout the day affects circadian rhythms and human physiology. In a study by Shamsul et al., cool white light and artificial daylight (approximately 450-480 nanometers) were associated with higher levels of alertness. Blue light positively affects mood, performance, fatigue, concentration, and eye comfort, and enables better sleep at night. Bright light during winter has also been shown to improve self-reported health and reduce distress. Daylighting refers to providing access to natural daylight, which can be aesthetically pleasing and improve sleep duration and quality. The LEED handbook writes that daylighting can save energy while "increasing the quality of the visual environment" and occupant satisfaction. Views of green landscapes can significantly increase attention and stress recovery. They can also have a positive influence on emotional states. Ko et al. consider views to be "important for the comfort, emotion, and working memory and concentration of occupants." Providing a view to nature through a glass window may benefit occupants' well-being and increase employees' effectiveness. 
Site selection Creating a walkable environment that connects people to workplaces, green spaces, public transportation, fitness centers, and other basic needs and services can influence daily physical activity as well as diet and type of commute. In particular, proximity to green spaces (e.g., parks, walking trails, gardens) or therapeutic landscapes can reduce absenteeism and improve well-being. Foundations Problems in foundations and underground conditions can have drastic impacts on the health of the building, including structural issues such as cracking, as well as humidity, indoor air quality and mold problems. These can arise from various factors. Different types of soil possess varying properties, and some can challenge foundation stability. Expansive soils like clay expand and shrink with moisture changes, causing foundation movement and settlement. Loose or poorly compacted soil can lead to uneven settling. Inadequate drainage around a building can lead to excessive soil moisture, exerting pressure on the foundation. This can result in movement, heaving, or settling. Large trees near a building can have extensive root systems that extract moisture from the soil, causing it to shrink and destabilize the foundation. Root growth can also apply pressure, leading to cracks or movement. Poor construction practices, including low-quality materials, inadequate reinforcement, or improper design, can compromise the foundation's structural integrity. Leaking or burst water pipes beneath the foundation can saturate the soil, destabilizing it and causing foundation movement or settlement. Earthquakes, floods, or storms can subject the foundation to extreme forces and vibrations, resulting in damage or shifting. Soil erosion due to heavy rainfall, landscaping issues, or improper grading can wash away supporting soil around the foundation, causing settlement or shifting. 
Extreme temperature fluctuations and freeze-thaw cycles can cause soil expansion and contraction, exerting pressure on the foundation and potentially leading to cracks or movement. Climatic effects in particular are expected to worsen as climate change progresses. Addressing foundation issues promptly is crucial to mitigate these consequences and maintain the safety, value, and integrity of the building. Monitoring solutions include ground penetrating radar and electrical resistivity tomography. Protective measures include regular inspections and maintenance, good drainage and moisture control, appropriate ventilation, and tending to trees and plants at the site. Building design There are many aspects of a building that can be designed to support positive health and well-being. For example, creating well-placed collaboration and social areas (e.g., break rooms, open collaboration areas, cafe spaces, courtyard gardens) can encourage social interaction and well-being. Quiet and wellness rooms can provide quiet zones or rooms that help improve well-being and mindfulness. Specifically, a designated lactation room can support nursing mothers by providing privacy and helping them return to work more easily. Biophilic design has been linked to health outcomes such as stress reduction, improved mood, cognitive performance, social engagement, and sleep. Ergonomics can also minimize stress and strain on the body by providing ergonomically designed workstations. Occupant engagement While some components of healthy buildings are inherently designed into the built environment, other components rely on the behavioral change of occupants, users, or organizations residing within the building. Well-lit and accessible stairwells can provide building occupants the opportunity to increase regular physical activity. 
Fitness centers or an exercise room can encourage exercise during the work day, which can improve mood and performance, leading to improved focus and better work-based relationships. Exercise can also be promoted by encouraging alternative means of transportation (e.g., cycling, walking, running) to and from the building. Providing facilities such as bicycle storage and locker/changing rooms can increase the appeal of cycling, walking, or running. Active workstations, such as sit/stand desks, treadmill desks, or cycle desks, can encourage increased movement and exercise as well. "Behavioral measures" can be taken to "encourage better public health outcomes: e.g., reducing sedentary behaviors by increasing access to stairways, using more active transportation options, and working at sit-to-stand desks." Other examples that can promote health and well-being include establishing workplace wellness programs, health promotion campaigns, and encouraging activity and collaboration. Infectious disease ASHRAE states that "Transmission of SARS-CoV-2 through the air is sufficiently likely that airborne exposure to the virus should be controlled. Changes to building operations, including the operation of heating, ventilating, and air-conditioning systems, can reduce airborne exposures." Current recommendations include increasing air supply and exhaust ventilation, using operable windows, limiting air recirculation, increasing hours of ventilation system operation and upgrading filtration. Joseph Allen of the Healthy Buildings Program at Harvard suggests 4-6 air changes per hour in classrooms, especially when masks are off. Proper ventilation of areas has been found to have the same effect as vaccinating 50-60% of the population for influenza. Enhanced filtration using a MERV 13 filter would be adequate to protect against transmission of viruses. Allen mentions three ways humidity can affect transmission: respiratory health, virus decay, and droplet evaporation. 
Drier air also dries out the respiratory cilia that catch particles. Viruses decay faster between 40 and 60% humidity. Respiratory droplets are less likely to become aerosols at higher humidity. Above 60%, however, mold growth begins to be encouraged. Sustainable design of patient rooms, intensive care units, and courtyards could offer opportunities not only to maximize human safety and wellbeing, but also environmental energy efficiency, waste management and recycling, and performance optimization – all of which constitute the core of sustainability. However, this may come at the unexpected cost of enabling growth and spread of opportunistic microbes. Health and well-being in standards and rating systems There are several international and governmental standards, guidelines, and building rating systems that incorporate health and well-being concepts: AirRated ANSI/ASHRAE/USGBC/IES Standard 189.1-2014, Standard for the Design of High-Performance Green Buildings BAIOTEQ Fitwel General Services Administration Facilities Standards for the Public Buildings Service (P-100) Green Building Initiative Green Globes Leadership in Energy and Environmental Design United States Department of Defense Unified Facilities Criteria Program WELL Building Standard GreenSeal Standards for Healthy Buildings and Schools Founded in 1989, GreenSeal is a leading global ecolabeling organization (part of the Global Ecolabelling Network) that has set strict criteria for occupant health, sustainability, and product performance. The Healthy Green Schools & Colleges initiative assists facility managers in locating low- or no-cost actions that have a significant impact on indoor air quality and health. 
The curriculum covers the full spectrum of facilities management methods and was created in collaboration with renowned school facility management professionals: Indoor Air Quality Testing and Monitoring Cleaning and Disinfecting Integrated Pest Control Sustainable Purchasing HVAC and Electrical Management Training and Intercommunication WELL Building Standard Certification The WELL Building Standard Certification was first launched in 2014 (WELL v1), and it focuses on the well-being and health of occupants in buildings. It was developed by Delos Living LLC and is currently administered by the International WELL Building Institute (IWBI), which released the second version (WELL v2) in 2020. Generally speaking, WELL v2 has updated requirements for investigating the relationship between building design and human health, adds more diversity to spaces and applications of the standard, and features a single rating system that resembles USGBC LEED's efforts. More specifically, WELL v1 discussed 100 performance features that can be considered for the certification of a building. Those 100 performance features are classified into 7 "concepts" as follows: Air, Water, Nourishment, Light, Fitness, Comfort, and Mind. Of these 100 features, 41 were required preconditions, and 59 were optional optimizations. In order to achieve a WELL certification, a building has to meet the following: For a WELL silver certification: 41 required preconditions. For a WELL gold certification: all the requirements for silver certification plus 40% of optimizations. For a WELL platinum certification: all the requirements for gold certification plus 80% of optimizations. On the other hand, WELL v2 uses a four-certification system that mimics LEED's scoring system. The required preconditions decreased to 23 (vs. 41 in v1), and the optimizations rose to 92 (vs. 59 in v1). WELL v2 also added 3 more "concepts": Sound, Materials and Community. 
With these updates, more buildings could qualify for a certification under the new system: For a WELL bronze certification: 40 points are required (this is only available for shell-and-core buildings). For a WELL silver certification: 50 points are required. For a WELL gold certification: 60 points are required. For a WELL platinum certification: 80 points are required. There are some caveats with WELL v2, however. For instance, a building has to meet all 23 required preconditions before qualifying for certification. If one precondition is not satisfied, the building may not proceed with WELL standard certification irrespective of how many points are achieved. Additionally, a building must earn at least 4 points in the "Thermal Comfort" and "Air" concepts, and at least 2 points in each of the remaining concepts. Lastly, a building can attain a maximum of 110 points because of an additional 10 points that can be achieved for innovation and performance. Based on the most recent surveys, more than 72 million square feet of residential and commercial space has been certified around the globe to date. See also Built environment Cleanroom Environmental psychology Healing environments Indoor air quality Particulates Sustainable refurbishment Ventilation (architecture) References External links Hyundai E&C creates Hillstate without worrying about fine dust (machine translation, original text in Korean) Building biology Sustainable building Architectural design Determinants of health Indoor air pollution
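The WELL v2 award logic described above can be summarized as a small decision function. This is a simplified sketch of the thresholds stated in the text; it omits the per-concept minimum-point requirements (e.g., at least 4 points in "Air" and "Thermal Comfort"):

```python
def well_v2_level(points, preconditions_met, core_and_shell=False):
    """Simplified WELL v2 award logic: every precondition must be met before
    any level is granted; bronze is only available to core-and-shell projects.
    Point thresholds per the text: 40 bronze, 50 silver, 60 gold, 80 platinum.
    Returns the level name, or None if no certification is awarded."""
    if not preconditions_met:
        return None  # failing any precondition blocks certification outright
    if points >= 80:
        return "platinum"
    if points >= 60:
        return "gold"
    if points >= 50:
        return "silver"
    if points >= 40 and core_and_shell:
        return "bronze"
    return None
```

The precondition gate is the key design point: unlike a purely additive score, a single unmet precondition voids any number of optimization points.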
Healthy building
[ "Engineering" ]
3,738
[ "Sustainable building", "Building engineering", "Construction", "Architectural design", "Design", "Building biology", "Architecture" ]
58,092,933
https://en.wikipedia.org/wiki/Theorem%20of%20the%20highest%20weight
In representation theory, a branch of mathematics, the theorem of the highest weight classifies the irreducible representations of a complex semisimple Lie algebra . There is a closely related theorem classifying the irreducible representations of a connected compact Lie group . The theorem states that there is a bijection from the set of "dominant integral elements" to the set of equivalence classes of irreducible representations of or . The difference between the two results is in the precise notion of "integral" in the definition of a dominant integral element. If is simply connected, this distinction disappears. The theorem was originally proved by Élie Cartan in his 1913 paper. The version of the theorem for a compact Lie group is due to Hermann Weyl. The theorem is one of the key pieces of representation theory of semisimple Lie algebras. Statement Lie algebra case Let be a finite-dimensional semisimple complex Lie algebra with Cartan subalgebra . Let be the associated root system. We then say that an element is integral if is an integer for each root . Next, we choose a set of positive roots and we say that an element is dominant if for all . An element is dominant integral if it is both dominant and integral. Finally, if and are in , we say that is higher than if is expressible as a linear combination of positive roots with non-negative real coefficients. A weight of a representation of is then called a highest weight if is higher than every other weight of . The theorem of the highest weight then states: If is a finite-dimensional irreducible representation of , then has a unique highest weight, and this highest weight is dominant integral. If two finite-dimensional irreducible representations have the same highest weight, they are isomorphic. For each dominant integral element , there exists a finite-dimensional irreducible representation with highest weight . 
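The integrality and dominance conditions in the statement above are the standard textbook ones; written out explicitly, together with the rank-one example of \(\mathfrak{sl}(2,\mathbb{C})\) as a sanity check (this block spells out standard conventions, not notation specific to this article):

```latex
% Integrality and dominance for \lambda \in \mathfrak{h}^*:
\lambda \text{ is integral} \iff
  \frac{2\langle\lambda,\alpha\rangle}{\langle\alpha,\alpha\rangle} \in \mathbb{Z}
  \quad \text{for every root } \alpha \in R,
\qquad
\lambda \text{ is dominant} \iff
  \langle\lambda,\alpha\rangle \ge 0
  \quad \text{for every positive root } \alpha \in R^{+}.

% Rank-one example: \mathfrak{g} = \mathfrak{sl}(2,\mathbb{C}) with unique
% positive root \alpha. The dominant integral elements are
%   \lambda = m\,\tfrac{\alpha}{2}, \qquad m \in \mathbb{Z}_{\ge 0},
% and the irreducible representation with highest weight m\,\alpha/2
% has dimension m + 1 (the spin-(m/2) representation).
```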
The most difficult part is the last one: the construction of a finite-dimensional irreducible representation with a prescribed highest weight. The compact group case Let be a connected compact Lie group with Lie algebra and let be the complexification of . Let be a maximal torus in with Lie algebra . Then is a Cartan subalgebra of , and we may form the associated root system . The theory then proceeds in much the same way as in the Lie algebra case, with one crucial difference: the notion of integrality is different. Specifically, we say that an element is analytically integral if is an integer whenever where is the identity element of . Every analytically integral element is integral in the Lie algebra sense, but there may be integral elements in the Lie algebra sense that are not analytically integral. This distinction reflects the fact that if is not simply connected, there may be representations of that do not come from representations of . On the other hand, if is simply connected, the notions of "integral" and "analytically integral" coincide. The theorem of the highest weight for representations of is then the same as in the Lie algebra case, except that "integral" is replaced by "analytically integral." Proofs There are at least four proofs: Hermann Weyl's original proof from the compact group point of view, based on the Weyl character formula and the Peter–Weyl theorem. The theory of Verma modules contains the highest weight theorem. This is the approach taken in many standard textbooks (e.g., Humphreys and Part II of Hall). The Borel–Weil–Bott theorem constructs an irreducible representation as the space of global sections of an ample line bundle; the highest weight theorem results as a consequence. (The approach uses a fair bit of algebraic geometry but yields a very quick proof.) The invariant theoretic approach: one constructs irreducible representations as subrepresentations of a tensor power of the standard representations. 
This approach is essentially due to H. Weyl and works quite well for classical groups. See also Classifying finite-dimensional representations of Lie algebras Representation theory of a connected compact Lie group Weights in the representation theory of semisimple Lie algebras Notes References . Representation theory Lie algebras Theorems about algebras Theorems in representation theory
Theorem of the highest weight
[ "Mathematics" ]
852
[ "Representation theory", "Fields of abstract algebra" ]
58,093,928
https://en.wikipedia.org/wiki/Bicyclic%20phosphate
Bicyclic phosphate is a class of organophosphate compounds that are used as flame retardants, stabilizers and antioxidants. They are also used in spectroscopic studies. Some bicyclic phosphates, such as TBPS, TBPO and IPTBO, are highly toxic. They have toxicity comparable to nerve agents. However, they are not acetylcholinesterase inhibitors. They act as GABA receptor antagonists and have potent convulsant effects. See also Convulsant TBPS TBPO IPTBO References Convulsants Neurotoxins GABAA receptor negative allosteric modulators Organophosphates Phosphate esters
Bicyclic phosphate
[ "Chemistry" ]
148
[ "Neurochemistry", "Neurotoxins" ]
58,094,662
https://en.wikipedia.org/wiki/Optically%20detected%20magnetic%20resonance
In physics, optically detected magnetic resonance (ODMR) is a double resonance technique by which the electron spin state of a crystal defect may be optically pumped for spin initialisation and readout. Like electron paramagnetic resonance (EPR), ODMR makes use of the Zeeman effect in unpaired electrons. The negatively charged nitrogen-vacancy centre (NV−) has been the target of considerable interest with regard to performing experiments using ODMR. ODMR of NV−s in diamond has applications in magnetometry and sensing, biomedical imaging, quantum information and the exploration of fundamental physics. NV ODMR The nitrogen-vacancy defect in diamond consists of a single substitutional nitrogen atom (replacing one carbon atom) and an adjacent gap, or vacancy, in the lattice where normally a carbon atom would be located. The nitrogen vacancy occurs in three possible charge states: positive (NV+), neutral (NV0) and negative (NV−). As NV− is the only one of these charge states that has been shown to be ODMR active, it is often referred to simply as the NV. The energy level structure of the NV− consists of a triplet ground state, a triplet excited state and two singlet states. Under resonant optical excitation, the NV may be raised from the triplet ground state to the triplet excited state. The centre may then return to the ground state via two routes: by the emission of a photon of 637 nm in the zero phonon line (ZPL) (or longer wavelength from the phonon sideband), or alternatively via the aforementioned singlet states through intersystem crossing and the emission of a 1042 nm photon. A return to the ground state via the latter route will preferentially result in the m_s = 0 spin state. Relaxation via this route necessarily results in a decrease in visible-wavelength fluorescence (as the emitted photon is in the infrared range). Microwave pumping at the resonant frequency of approximately 2.87 GHz places the centre in the degenerate m_s = ±1 state. 
The application of a magnetic field lifts this degeneracy, causing Zeeman splitting and a decrease of fluorescence at two resonant frequencies, given by f± = D ± gμBB/h, where D is the zero-field splitting, h is the Planck constant, g is the electron g-factor and μB is the Bohr magneton. Sweeping the microwave field through these frequencies results in two characteristic dips in the observed fluorescence, the separation between which enables determination of the strength of the magnetic field. Hyperfine splitting Further splitting in the fluorescence spectrum may occur due to the hyperfine interaction, which leads to further resonance conditions and corresponding spectral lines. In NV ODMR, this detailed structure usually originates from nitrogen and carbon-13 atoms near the defect. These atoms have small magnetic fields which interact with the spectral lines of the NV, causing further splitting. Hyperfine interactions in nitrogen-vacancy (NV) centres arise from nearby nuclear spins, primarily due to nitrogen (14N or 15N) and, in some cases, 13C atoms near the defect. These interactions are significant because they further split the energy levels of the NV centre, resulting in additional resonances in the ODMR spectrum. The nitrogen atom in the NV centre can exist as either 14N (with nuclear spin I = 1) or 15N (with nuclear spin I = 1/2). The most common isotope, 14N, couples with the electron spin of the NV centre, leading to a hyperfine splitting of the states into three sub-levels. The interaction of the NV electron spin with the 14N nuclear spin can be described by a hyperfine Hamiltonian of the form S·A·I, where S represents the NV electron spin and I the nitrogen nuclear spin. This splitting depends on the axial and transverse hyperfine coupling constants, both on the order of a few MHz, and can be observed as three peaks in the hyperfine-resolved ODMR spectrum. In NV centres, hyperfine splitting arises from the interaction between the NV electron spin magnetic moment and nuclear spin magnetic moments. 
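The magnetometry relation described above, two fluorescence dips split symmetrically about the zero-field splitting, can be sketched numerically. The constants below are standard values for the NV centre; the simple axial-field formula neglects strain and hyperfine terms:

```python
H_PLANCK = 6.62607015e-34    # Planck constant, J*s
MU_BOHR  = 9.2740100783e-24  # Bohr magneton, J/T
G_E      = 2.003             # NV electron g-factor (approximate)
D_ZFS    = 2.87e9            # NV zero-field splitting, Hz

def odmr_resonances(b_tesla):
    """The two ODMR dip frequencies f = D +/- g*muB*B/h for a magnetic
    field aligned with the NV axis (strain and hyperfine terms neglected)."""
    zeeman_hz = G_E * MU_BOHR * b_tesla / H_PLANCK
    return D_ZFS - zeeman_hz, D_ZFS + zeeman_hz

def field_from_splitting(delta_f_hz):
    """Invert the dip separation: B = h * (f_plus - f_minus) / (2*g*muB)."""
    return H_PLANCK * delta_f_hz / (2.0 * G_E * MU_BOHR)

f_minus, f_plus = odmr_resonances(1e-3)          # 1 mT axial field
b_recovered = field_from_splitting(f_plus - f_minus)
```

A 1 mT axial field shifts each dip by roughly 28 MHz, so the measured dip separation of ~56 MHz recovers the field magnitude directly, which is the basis of NV magnetometry.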
NV spin magnetic moments also depend upon the magnitude and orientation of the external magnetic field. To perform hyperfine-resolved ODMR, a single-NV ODMR experiment is generally preferable. If 15N is present instead of 14N, each resonance splits into two sublevels instead of three. Nearby 13C atoms (with nuclear spin I = 1/2) can also interact with the NV centre. 13C atoms are randomly distributed in diamond and have a natural abundance of about 1.1%. When located near the NV centre, they induce additional fine structure in the ODMR signal. The coupling strength varies with the position of the 13C nuclei relative to the NV centre. References Bibliography Quantum mechanics Materials science Scientific techniques Spectroscopy
Optically detected magnetic resonance
[ "Physics", "Chemistry", "Materials_science", "Engineering" ]
975
[ "Applied and interdisciplinary physics", "Spectrum (physical sciences)", "Molecular physics", "Instrumental analysis", "Theoretical physics", "Materials science", "Quantum mechanics", "nan", "Spectroscopy" ]
59,728,169
https://en.wikipedia.org/wiki/Vitim%20radar
Vitim is a Russian project by NIIDAR for a stationary long-range over-the-horizon radar operating in the UHF band. It is positioned as a means of continuous monitoring of the aerospace situation, providing detection, tracking and classification of ballistic, space and aerodynamic objects. The radar determines the coordinates and motion parameters of ballistic missiles, warheads, artificial earth satellites and aerodynamic targets passing through its observation zone, and classifies tracked targets as ballistic missiles ("BR"), artificial earth satellites ("AES") or aerodynamic targets ("ADC"). It also detects and tracks active jamming sources emitting in the radar's operating band, determining their radiation characteristics, trajectory parameters and impact areas, and generates standard messages giving the consumer information about detected and tracked targets and sources of interference. References External links Vitim Radar, NIIDAR. Russian military radars Over-the-horizon radars NIIDAR products Early warning systems
Vitim radar
[ "Technology" ]
204
[ "Warning systems", "Early warning systems" ]
59,730,114
https://en.wikipedia.org/wiki/Parallel%20external%20memory
In computer science, a parallel external memory (PEM) model is a cache-aware, external-memory abstract machine. It is the parallel-computing analogue of the single-processor external memory (EM) model and, likewise, the cache-aware analogue of the parallel random-access machine (PRAM). The PEM model consists of a number of processors, together with their respective private caches and a shared main memory. Model Definition The PEM model is a combination of the EM model and the PRAM model. It is a computation model consisting of P processors and a two-level memory hierarchy: a large external memory (main memory) of size N and P small internal memories (caches). The processors share the main memory. Each cache is exclusive to a single processor, and a processor cannot access another's cache. Each cache has size M and is partitioned into blocks of size B. The processors can only perform operations on data which are in their caches. Data can be transferred between the main memory and the caches in blocks of size B. I/O complexity The complexity measure of the PEM model is the I/O complexity, which counts the number of parallel block transfers between the main memory and the caches. During a parallel block transfer each processor can transfer one block. So if P processors load a data block of size B from the main memory into their caches in parallel, this counts as an I/O complexity of 1, not P. A program in the PEM model should minimize the data transfer between main memory and caches and operate as much as possible on the data in the caches. Read/write conflicts In the PEM model, there is no direct communication network between the P processors. The processors have to communicate indirectly through the main memory. If multiple processors try to access the same block in main memory concurrently, read/write conflicts occur.
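The I/O-counting convention above can be illustrated with a small sketch (hypothetical Python; the names are mine, not from the PEM literature): since every processor may move one block per parallel round, the parallel I/O cost of an access pattern is determined by the busiest processor, not by the total number of transfers.

```python
# Hypothetical sketch of the PEM I/O cost convention. In one parallel
# round every processor may transfer one block of size B between main
# memory and its private cache, so the parallel I/O cost is the maximum
# number of blocks any single processor transfers, not the sum.

def parallel_io_cost(blocks_per_processor):
    """blocks_per_processor: one entry per processor, giving how many
    blocks that processor moves between main memory and its cache."""
    return max(blocks_per_processor, default=0)

# P = 4 processors each load one block in parallel:
# this counts as 1 parallel I/O, not 4.
assert parallel_io_cost([1, 1, 1, 1]) == 1

# An unbalanced access pattern is dominated by the busiest processor.
assert parallel_io_cost([3, 0, 1, 2]) == 3
```

This is why a good PEM algorithm balances block transfers evenly across processors: any skew shows up directly in the I/O complexity.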
Like in the PRAM model, three different variations of this problem are considered: Concurrent Read Concurrent Write (CRCW): The same block in main memory can be read and written by multiple processors concurrently. Concurrent Read Exclusive Write (CREW): The same block in main memory can be read by multiple processors concurrently. Only one processor can write to a block at a time. Exclusive Read Exclusive Write (EREW): The same block in main memory cannot be read or written by multiple processors concurrently. Only one processor can access a block at a time. The following two approaches address the CREW and EREW settings when P processors write to the same block simultaneously. A first approach is to serialize the write operations: one processor after the other writes to the block, which results in a total of P parallel block transfers. A second approach needs O(log P) parallel block transfers and an additional block for each processor. The main idea is to schedule the write operations in a binary tree fashion and gradually combine the data into a single block: in the first round the processors combine their P blocks into P/2 blocks, then P/2 of them combine those into P/4 blocks, and this procedure is continued until all the data is combined in one block. Comparison to other models Examples Multiway partitioning Let M = {m_1, ..., m_{d-1}} be a vector of d−1 pivots sorted in increasing order, and let A be an unordered set of N elements. A d-way partition of A is a set of buckets {A_1, ..., A_d}, where A_i is called the i-th bucket; the elements of A_i are greater than m_{i-1} and smaller than m_i. In the following algorithm the input is partitioned into N/P-sized contiguous segments in main memory, and processor i primarily works on the i-th segment. The multiway partitioning algorithm (PEM_DIST_SORT) uses a PEM prefix sum algorithm, which simulates an optimal PRAM prefix sum algorithm, to calculate the prefix sums with optimal I/O complexity.
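As a rough illustration of the counting-plus-prefix-sum idea behind the partition step, here is a sequential Python sketch (function names and bucket-boundary conventions are mine, not from the original algorithm): each segment counts its items per bucket, prefix sums over the counts give every (segment, bucket) pair a disjoint write offset, and the scatter writes could then proceed independently.

```python
import bisect
from itertools import accumulate

# Hypothetical sequential sketch of the d-way partition step. Each of P
# "processors" owns a contiguous segment; prefix sums over the per-segment
# bucket counts assign disjoint write offsets, so in the PEM model all P
# scatter phases could run in parallel without write conflicts.

def d_way_partition(A, pivots, P):
    d = len(pivots) + 1
    n = len(A)
    segments = [A[i * n // P:(i + 1) * n // P] for i in range(P)]
    # Step 1: each segment counts its items per bucket.
    counts = [[0] * d for _ in range(P)]
    for i, seg in enumerate(segments):
        for x in seg:
            counts[i][bisect.bisect_left(pivots, x)] += 1
    # Step 2: prefix sums turn counts into disjoint write offsets.
    bucket_sizes = [sum(c[b] for c in counts) for b in range(d)]
    bucket_start = [0] + list(accumulate(bucket_sizes))[:-1]
    offset = [[0] * d for _ in range(P)]
    for b in range(d):
        run = bucket_start[b]
        for i in range(P):
            offset[i][b] = run
            run += counts[i][b]
    # Step 3: scatter into contiguous buckets; the writes of different
    # segments never overlap, so they are independent of each other.
    out = [None] * n
    for i, seg in enumerate(segments):
        pos = list(offset[i])
        for x in seg:
            b = bisect.bisect_left(pivots, x)
            out[pos[b]] = x
            pos[b] += 1
    return out, bucket_sizes

out, sizes = d_way_partition([5, 2, 9, 1, 7, 3, 8, 4], pivots=[4, 8], P=2)
assert sizes == [4, 3, 1]               # buckets: <=4, (4, 8], >8
assert sorted(out[:4]) == [1, 2, 3, 4]  # first bucket is contiguous
```

Note the deliberate two-phase structure: because all offsets are known before any element is written, no coordination between segments is needed during the scatter, which is exactly what the PEM prefix sum buys the parallel algorithm.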
// Compute a d-way partition on the data segments in parallel
for each processor i in parallel do
    Read the vector of pivots M into the cache.
    Partition the segment into d buckets and let a vector record the number of items in each bucket.
end for
Run PEM prefix sum on the set of count vectors simultaneously.
// Use the prefix sums to compute the final partition
for each processor i in parallel do
    Write the elements of each bucket into memory locations offset appropriately by the prefix sums.
end for
Using the prefix sums stored in the last processor, calculate the vector of bucket sizes and return it.
If the vector of pivots M and the input set A are located in contiguous memory, then the d-way partitioning problem can be solved in the PEM model with optimal I/O complexity. The contents of the final buckets then have to be located in contiguous memory. Selection The selection problem is about finding the k-th smallest item in an unordered list A of size N. The following code makes use of PRAMSORT, a PRAM-optimal sorting algorithm, and SELECT, a cache-optimal single-processor selection algorithm.
if the list is small enough to sort directly then
    sort A with PRAMSORT and return its k-th element
end if
// Find the median of each processor's segment
for each processor in parallel do
    find the median of the segment with SELECT
end for
// Sort the medians
// Partition A around the median of medians
if k falls in the lower part then
    return PEMSELECT on the lower part
else
    return PEMSELECT on the upper part
end if
Under the assumption that the input is stored in contiguous memory, PEMSELECT has an I/O complexity of:
Otherwise the following algorithm is used:
// Sample elements from the input
for each processor in parallel do
    if the segment does not fit into the cache then
        Load the segment in cache-sized pages and sort the pages individually
    else
        Load and sort the segment as a single page
    end if
    Pick evenly spaced elements from each sorted memory page into a contiguous vector of samples
end for
in parallel do
    Combine the sample vectors into a single contiguous vector
    Make copies of it
end do
// Find the pivots
for each copy in parallel do
    select one pivot from it
end for
Pack the pivots in a contiguous array
// Partition the input around the pivots into buckets
// Recursively sort the buckets
for each bucket in parallel do
    recursively call PEMDISTSORT on the bucket, using the processors responsible for elements in that bucket
end for
The I/O complexity of PEMDISTSORT depends on the cost of the sorting and selection subroutines; if the number of processors is chosen appropriately, the I/O complexity becomes O((N/(PB)) log_{M/B}(N/B)). Other PEM algorithms The bounds for other PEM algorithms are commonly stated in terms of the time it takes to sort N items with P processors in the PEM model. See also Parallel random-access machine (PRAM) Random-access machine (RAM) External memory (EM) References Algorithms Models of computation Analysis of parallel algorithms External memory algorithms Cache (computing)
Parallel external memory
[ "Mathematics" ]
1,344
[ "Applied mathematics", "Algorithms", "Mathematical logic" ]
59,730,366
https://en.wikipedia.org/wiki/Video%20games%20and%20charity
Due to the perceived negative connotations of video games, both industry members and consumers of video games have frequently collaborated to counter this perception by engaging in video gaming for charitable purposes. Some of these have been charitable groups, or regular and annual events, and the scope of these efforts has continued to grow, with more than having been raised by video game-related charity efforts in the first half of 2018 alone. Organizations and events Child's Play The creators of the Penny Arcade webcomic, Jerry Holkins and Mike Krahulik, established the Child's Play charity in 2003 following a string of mass media stories that attempted to portray video games in a negative light. The charity was designed to provide toys, video game systems and games to various children's hospitals in the United States, through both monetary and physical donations. By 2017, Child's Play had raised over through both cash donations and donated items. Extra Life Extra Life is an annual charity fund-raising event to support the Children's Miracle Network Hospitals. The event was started in 2008 to honor Victoria , a teenager who died of acute lymphoblastic leukemia. While gamers accept funds for Extra Life throughout the year, the event encourages streamers to play video games for twenty-four hours straight via Twitch or other streaming services over a specific weekend in November, and collect additional donations from their viewers. In 2017, over 50,000 streamers helped to raise over for the hospitals. Desert Bus For Hope The comedy group LoadingReadyRun ran a Child's Play event in 2007 by marathon-playing Desert Bus, a game created by Penn & Teller, in which the player must drive a bus on a desolate stretch of highway from Tucson to Las Vegas, roughly eight hours of continuous gameplay, with little challenge outside of player fatigue.
The event was successful, in part due to recognition from Penn & Teller, and eventually spun out into its own annual "Desert Bus for Hope" event. During the stream, broadcast over Twitch and other streaming sites, viewers can donate to get virtual time behind the bus's wheel, as well as participate in various auctions. The 2018 event raised over for Child's Play, with total accumulated donations exceeding . Games Done Quick Games Done Quick was launched in 2010, inspired by Desert Bus for Hope, with the idea that invited participants speedrun numerous games over the course of the event, typically five to six days, usually with commentary over the course of the games. During each semi-annual event, the speedruns are performed live in front of an audience and broadcast to Twitch and other services, with viewers able to donate during the length of the event. There have also been shorter one-off special Games Done Quick events for specific occasions, such as one to support the victims of the 2011 Tōhoku earthquake and tsunami. As of January 21, 2019, Games Done Quick has raised over for various charities, including Prevent Cancer Foundation in its winter events and Doctors Without Borders in its summer events. Humble Bundle Humble Bundle was initially launched by Wolfire Games in 2010 as a series of game bundles, most frequently indie games, offered at a pay-what-you-want price, with all but a few dollars of each sale going to a designated charity. As the bundles became more successful, Humble Bundle's approach expanded to include other bundles, such as mobile games, console games, and digital books; as the company itself grew, it created publishing support for indie games and established a dedicated storefront, where a portion of each purchase goes to a selected charity.
By 2017, the various activities of Humble Bundle had raised over across 50 different charities, including Action Against Hunger, Child's Play, the Electronic Frontier Foundation, charity: water, the American Red Cross, WaterAid and the Wikimedia Foundation. Jingle Jam Since 2011, The Yogscast have organised a series of live streams every year in December to benefit charity. The idea began when fans would send presents to founders Lewis Brindley and Simon Lane during the Christmas season, but they would instead insist that the money be donated to charity. By the end of December 2023, Jingle Jams had raised over £25m for over 40 different charities. In August 2022, Jingle Jam was registered as a charity in England and Wales with the Charity Commission, independent of The Yogscast. Notable one-off efforts Pink Overwatch Mercy skin During May 2018, to support the Breast Cancer Research Foundation, Blizzard Entertainment offered a limited-time character customization skin for the Overwatch character Mercy that reflected the pink colors of breast cancer awareness, with all revenue from the sale of the skin going to the Foundation. By the end of the sale, Blizzard had raised over , which, at the time, was the largest single donation that the Foundation had seen. 2018 E3 Fortnite Pro-Am Epic Games ran a celebrity Fortnite Battle Royale pro-am during the Electronic Entertainment Expo 2018 in June 2018, which paired 50 popular streamers with 50 celebrities, with an overall prize pool to be given to the winners in the names of their desired charities. H.Bomberguy's stream for Mermaids In January 2019, streamer Harry Brewis, aka "hbomberguy", ran a marathon stream to beat Donkey Kong 64 and raise money for Mermaids, a British charity for transgender youth. The stream began following comments made by Graham Linehan which Brewis considered transphobic, and whilst potential funding of Mermaids by the British National Lottery was under review.
Word of mouth about the stream quickly spread and several notable pro-trans supporters briefly joined his stream, including John Romero, Chelsea Manning, and U.S. congresswoman Alexandria Ocasio-Cortez. Brewis finished the marathon after 57 hours, having raised over (£265,000) for Mermaids, with 659,000 viewers having watched the stream. References Charity fundraising History of video games Online charity Video game culture
Video games and charity
[ "Technology" ]
1,210
[ "History of video games", "History of computing" ]
59,730,452
https://en.wikipedia.org/wiki/Turkish%20Medicines%20and%20Medical%20Devices%20Agency
The Turkish Medicines and Medical Devices Agency (TMMDA; ) is a regulatory agency of the Government of Turkey that acts as the highest sanitary authority in terms of medical safety on medicines, health products, cosmetics and personal care products. It is responsible for the enforcement of the Turkish Cosmetic regulations, which came into effect in 2005. The current Medical Device Regulation came into force on June 7, 2011. A new draft regulation aligned with the new European regulation was published in 2018. In 2013 the Agency proposed to permit some pharmacies to import medicine independently, bypassing the Turkish Pharmacists Association, although the Association successfully contested this decision. Under Law No. 7151 on Amendment of Certain Laws and Decree Laws related to Healthcare, published in December 2018, the Social Security Institution and state institutions and organisations approved by the Ministry of Health can import medicine. The Agency announced principles for the approval of secondary packaging and storage facilities for pharmaceuticals in January 2019. See also Health care in Turkey Ministry of Health Pharmacoepidemiology References External links Medical and health organizations based in Turkey Medical and health regulators National agencies for drug regulation
Turkish Medicines and Medical Devices Agency
[ "Chemistry" ]
223
[ "National agencies for drug regulation", "Drug safety" ]
59,730,509
https://en.wikipedia.org/wiki/Microprotein
A microprotein (miP) is a small protein encoded by a small open reading frame (sORF), and is also known as a sORF-encoded protein (SEP). Microproteins are a class of protein with a single protein domain that are related to multidomain proteins, which they regulate at the post-translational level. Microproteins are analogous to microRNAs (miRNAs): they heterodimerize with their targets, causing dominant-negative effects. In animals and plants, microproteins have been found to greatly influence biological processes. Because of microproteins' dominant effects on their targets, they are currently being studied for potential applications in biotechnology. History The first microprotein (miP) was discovered during research in the early 1990s on genes for basic helix–loop–helix (bHLH) transcription factors from a murine erythroleukaemia cell cDNA library. The protein was found to be an inhibitor of DNA binding (ID protein), and it negatively regulated the transcription factor complex. The ID protein was 16 kDa and consisted of a helix-loop-helix (HLH) domain. The microprotein formed bHLH/HLH heterodimers which disrupted the functional basic helix–loop–helix (bHLH) homodimers. The first microprotein discovered in plants was the LITTLE ZIPPER (ZPR) protein. The LITTLE ZIPPER protein contains a leucine zipper domain but does not have the domains required for DNA binding and transcription activation; it is thus analogous to the ID protein. Although not all such proteins are small, in 2011 this class of protein was given the name microproteins because their negative regulatory actions are similar to those of miRNAs. Evolutionarily, the ID protein, or proteins similar to it, are found in all animals. In plants, microproteins are only found in higher plants.
However, the homeodomain transcription factors that belong to the three-amino-acid loop-extension (TALE) family are targets of microproteins, and these homeodomain proteins are conserved in animals, plants, and fungi. Structure Microproteins are generally small proteins with a single protein domain. The active form of a microprotein is translated from a small open reading frame (smORF), which can be shorter than 100 codons. However, not all microproteins are small; the name was given because their actions are analogous to those of miRNAs. Function Microproteins function as post-translational regulators. They disrupt the formation of heterodimeric, homodimeric, or multimeric complexes, and can interact with any protein that requires functional dimers to function normally. The primary targets of microproteins are transcription factors that bind to DNA as dimers. Microproteins regulate these complexes by forming dimers with the targets and inhibiting protein complex function. There are two types of miP inhibition: homotypic and heterotypic. In homotypic miP inhibition, microproteins interact with proteins carrying a similar protein-protein interaction (PPI) domain; in heterotypic miP inhibition, they interact with proteins carrying a different but compatible PPI domain. In both types of inhibition, microproteins interfere with the PPI domains and prevent them from interacting with their normal partners. References Protein classification Post-translational modification
Microprotein
[ "Chemistry", "Biology" ]
731
[ "Post-translational modification", "Gene expression", "Protein classification", "Biochemical reactions" ]
59,733,031
https://en.wikipedia.org/wiki/Metallochaperones
Metallochaperones are a distinct class of molecular chaperones that facilitate the intracellular transport of metal ions to different metalloproteins, e.g., metalloenzymes, in cells through specific protein-protein interactions. In this way, for example, the proteins ensure that the correct metal ion cofactor is acquired by its corresponding metalloenzyme. Metallochaperones are essential to the proper functioning of cells, playing a vital role in a large number of biological processes including, for example, respiration, photosynthesis, neurotransmission, and protein folding. Prior to the discovery of metallochaperones in the late 1990s, biologists believed that metal ions freely diffused within cells without the aid of auxiliary proteins. Today, it is well established that these special molecules contribute to the intracellular homeostatic control of biometal ions. References Transport proteins Metalloproteins
Metallochaperones
[ "Chemistry" ]
194
[ "Biochemistry stubs", "Metalloproteins", "Protein stubs", "Bioinorganic chemistry" ]
59,733,369
https://en.wikipedia.org/wiki/Architecture%20of%20the%20Bulgarian%20Revival
The architecture of the Bulgarian Revival is an architectural style that developed between 1770 and 1900, during the Ottoman period. Plovdiv's Old Town is a living museum of the type of National Revival architecture that developed there (there were regional differences) in the early to mid-1800s. The houses of the Bulgarian Revival follow a tradition of building rooted in the architecture of the Second Bulgarian Empire. Cities in Bulgaria with preserved Revival architecture include the old town of Plovdiv; the mountain towns of Tryavna, Kotel, Sopot, Koprivshtitsa and Elena; the old Bulgarian capital, Veliko Tarnovo; and others. Gallery References Architecture in Bulgaria Bulgarian National Revival
Architecture of the Bulgarian Revival
[ "Engineering" ]
142
[ "Architecture stubs", "Architecture" ]
59,734,475
https://en.wikipedia.org/wiki/Journal%20of%20Astrophysics%20and%20Astronomy
The Journal of Astrophysics and Astronomy is a peer-reviewed scientific journal of astrophysics and astronomy established in 1980. It is co-published bimonthly by Springer India, the Indian Academy of Sciences, and Astronomical Society of India. The journal is edited by Annapurni Subramaniam. Indexing and abstracting The journal is abstracted and indexed in the following bibliographic databases: According to the Journal Citation Reports, the journal has a 2020 impact factor of 1.270. References External links Academic journals established in 1980 Bimonthly journals English-language journals Astrophysics journals Astronomy journals Springer Science+Business Media academic journals
Journal of Astrophysics and Astronomy
[ "Physics", "Astronomy" ]
130
[ "Astrophysics journals", "Astronomy journals", "Works about astronomy", "Astrophysics" ]
59,736,785
https://en.wikipedia.org/wiki/NetBlocks
NetBlocks is a watchdog organization that monitors cybersecurity and the governance of the Internet. The service was launched in 2017 to monitor Internet freedom. Work Projects NetBlocks publishes original reporting on Internet governance and sustainable energy, providing tools to the public to observe possible Internet restrictions and to estimate the economic consequences of network disruptions. NetBlocks has established a high level of trust in communities around the world, facilitating the spread of information during emergencies and Internet censorship events, according to peer-reviewed research published in the scientific journal Nature. Events On 25 November 2017, NetBlocks and the Digital Rights Foundation provided information about the nationwide censorship of Facebook, Twitter, YouTube and other social media services by the Pakistani government following the Tehreek-e-Labaik protests. During the 2018–2019 Sudanese protests, NetBlocks stated that the Sudanese government maintains "an extensive Internet censorship regime" following the censorship of social media websites in the country. Following the 2019 Gabonese coup d'état attempt, NetBlocks monitored censorship in the country. The cost of the three-day Internet shutdown following the Zimbabwean fuel protests was also calculated to cost Zimbabwe an estimated $17 million. The block of Wikipedia in Venezuela and other censorship incidents during the Venezuelan presidential crisis were also monitored by NetBlocks, with several international media outlets covering the situation with NetBlocks' work. In July 2020, as the Somalian Parliament passed a motion of no confidence in Prime Minister Hassan Ali Khaire, NetBlocks reported that Internet access had been disrupted impeding media coverage of political and public reactions to events on the ground, presenting evidence contradicting network operator Hormuud Telecom's claim that the outage was due to "windy conditions." 
From February 2022, NetBlocks set up a reporting initiative providing extensive coverage on the Russian invasion of Ukraine, documenting Russian efforts to disable communications at nuclear sites and in conflict zones. References Internet censorship Information technology organizations Organizations established in 2017
NetBlocks
[ "Technology" ]
404
[ "Information technology", "Information technology organizations" ]
59,737,009
https://en.wikipedia.org/wiki/Biodiversity%20Monitoring%20Switzerland
The Biodiversity Monitoring Switzerland (BDM) is a Swiss Confederation programme for the long-term monitoring of species diversity in Switzerland. Introduction The Biodiversity Monitoring Switzerland surveys the long-term development of species diversity in selected organism groups in Switzerland. The focus is on surveying common and widespread species in order to make informed statements about the development of species diversity in common landscapes. Biodiversity Monitoring Switzerland is a programme run by the Federal Office for the Environment FOEN. It is a long-term environmental monitoring project, comparable with other national programmes, such as the Swiss National Forest Inventory (NFI), the National Surface Water Quality Monitoring Programme (NAWA), the Swiss Soil Monitoring Network (NABO) and the project “Monitoring the Effectiveness of Habitat Conservation in Switzerland” (WBS). There are similar biodiversity monitoring programmes in place in the United Kingdom (UK Countryside Survey by the UK Centre for Ecology & Hydrology) and in parts of Canada (Alberta Biodiversity Monitoring run by the Alberta Biodiversity Monitoring Institute). Tasks and objectives Together with other environmental information, the data from the Biodiversity Monitoring Switzerland underpin national conservation policy and other policy areas that are relevant to biodiversity such as agriculture and forestry. By signing the UN Convention on Biological Diversity (CBD), Switzerland also has an obligation under international law to monitor the long-term development of biodiversity. The objectives of the Biodiversity Monitoring Switzerland are to draw representative conclusions about biodiversity in Switzerland as a whole (sometimes broken down by biogeographic region or main type of land use, e.g. grassland, forests, settlements etc.); monitor the evolution of species diversity as a whole, i.e. 
including in intensively managed areas and therefore draw conclusions about the common landscape; record the taxonomic groups in full, i.e. including all species, and thus supplement existing knowledge on rare and endangered species; document changes in species diversity and highlight long-term trends. Methodology The Biodiversity Monitoring Switzerland comprises three sampling grids on different scales, which cover the whole of Switzerland and yield a representative sample. The sampling grid to observe species diversity in landscapes consists of some 500 sampling areas, each covering one square kilometre. On a precisely defined transect of this quadrant, vascular plants, butterflies and breeding birds are surveyed. Data on breeding birds are collected by the Swiss Ornithological Institute Sempach. These surveys are coordinated with the Monitoring of Common Breeding Birds. The density of the sampling grid in the Jura and in Southern Switzerland was increased in order to obtain reliable data for these regions. The sampling grid to observe species diversity in habitats consists of some 1,450 sampling sites, each covering ten square metres. In terms of habitats a distinction is drawn between forests, meadows and pastures, settlements, farmland, alpine pastures and mountain areas. All the vascular plants found in a circular sampling area are surveyed. In addition, bryophyte samples are collected, which are subsequently identified by a team of experts, and soil samples are taken to study mollusc diversity in the laboratory. The sampling grid to survey aquatic insects comprises approximately 500 small sections of minor watercourses measuring around 5–100 metres long. It surveys the larvae of mayflies, stoneflies and caddisflies (so-called EPT species group). The sampling areas can be precisely located as they are permanent observation plots. A fifth of all areas are surveyed every year, which means that a survey is repeated at the same location every five years. 
Routine surveys of vascular plants, bryophytes, molluscs and breeding birds were started in 2001, with surveys of butterflies added in 2003 and aquatic invertebrates added in 2010. The species’ coordinates are integrated in the databases of InfoSpecies, the Swiss Information Centre for Species. Indicators The data obtained are routinely used to calculate four indicators: The species diversity in landscapes indicator shows the diversity of flora and fauna in the landscape. It describes the influence of habitat mosaics on species diversity. The species diversity in habitats indicator documents the small-scale species diversity of a habitat type, e.g. meadows, forests or settlements. The population size of common species indicator documents changes in widespread species. They are of ecological importance as they make up the majority of living biomass, provide a significant share of ecosystem services and constitute an abundant food source for other organisms. They shape the appearance of their habitats and characterise entire landscapes. The diversity in species communities indicator looks at whether Switzerland's habitats and landscapes are becoming more similar. It therefore provides information on the heterogeneity or homogeneity of species diversity. In addition, the data can be used for various special analyses. They form the basis of numerous scientific research projects. Thanks to the systematic sampling design, the standardised methodology and the long-term nature of the programme, the data can be used to answer new, as yet undefined questions. The data are also incorporated in European biodiversity indicators, e.g. the European Grassland Butterfly Index compiled by Butterfly Conservation Europe and the European Environment Agency EEA. 
Furthermore, data from the Biodiversity Monitoring Switzerland contributed to the determination of critical loads of nitrogen deposition in Europe, assessed under the Convention on Long-Range Transboundary Air Pollution (CLRTAP) and implemented by the European Monitoring and Evaluation Programme (EMEP). Distinctive features of the Biodiversity Monitoring Switzerland The specific contribution of the Biodiversity Monitoring Switzerland to the analysis of species diversity in Switzerland is the fact that species lists can be drawn up that are as comprehensive as possible for all sampling areas, which increases the probability of detecting species absences. In addition, the Biodiversity Monitoring Switzerland is not restricted to well-known, highly species-rich areas or sites where rarities are found, but rather monitors randomly selected locations that would hardly ever be surveyed otherwise. Common and widespread species are thus also surveyed. Repeat surveys at exactly the same location using exactly the same method allow precise conclusions to be drawn regarding changes in species diversity. Biodiversity Monitoring Switzerland provides a cross section of the overall landscape covering a wide variety of uses. It serves as a reference for programmes that study the development of selected habitats or of specific rare species, e.g. the project “Monitoring the effectiveness of habitat conservation in Switzerland” (WBS), and Switzerland's Red Lists. Notes FOEN: Swiss Biodiversity Monitoring BDM. Description of Methods and Indicators. Environmental Studies No. 1410, Federal Office for the Environment FOEN, Bern, 2014. FOEN: Biodiversity in Switzerland: Status and Trends. Results of the biodiversity monitoring system in 2016. State of the Environment No 1630, Federal Office for the Environment FOEN, Bern, 2014. Swiss Biodiversity Forum (eds.): 20 Jahre Biodiversitätsmonitoring Schweiz BDM, Special Issue of HOTSPOT 46, Swiss Biodiversity Forum, Bern, 2022.
(in German and French only) References External links Website of the Biodiversity Monitoring Switzerland FOEN: Indicators that illustrate the changes and state of the environment Environmental studies Convention on Biological Diversity Conservation biology
Biodiversity Monitoring Switzerland
[ "Biology" ]
1,412
[ "Convention on Biological Diversity", "Conservation biology", "Biodiversity" ]
59,737,071
https://en.wikipedia.org/wiki/Hygrocybe%20flavescens
Hygrocybe flavescens, commonly known as the golden waxy cap, is a species of Hygrocybe described from Michigan. It is considered nonpoisonous to humans. The species can be found in various forests and woodlands. The mushroom is yellow-orange. Its cap ranges from 2.5 to 6 cm wide, and can be more orange in youth. The stalk is 4 to 7 cm long and 0.5 to 1.5 cm wide. The gills are paler than the cap and stipe. The spores are white, elliptical, smooth and inamyloid. It has a mild taste and odor. Hygrocybe chlorophana is similar, noted in North America as having a more viscid stipe. This distinction is not made in Europe, indicating that they may be the same species. It is considered edible, but undesirable. Mycologist David Arora describes it as "edible, but far from incredible". References External links flavescens Fungi of the United States Fungus species
Hygrocybe flavescens
[ "Biology" ]
220
[ "Fungi", "Fungus species" ]
59,737,081
https://en.wikipedia.org/wiki/Bigoud%C3%A8ne
In Breton tradition, a coiffe bigoudène is a women's coif worn with traditional Breton costumes. By extension, the women wearing the coif and the costume associated with it are also called bigoudènes. The coif is about 30 cm high, and up to 40 cm in Penmarc'h. The bigoudène coif is worn by the women of the Bigouden Country (Breton: Bro-Vigoudenn; French: Le Pays Bigouden), historically known as "Cap Caval" and located along the Bay of Audierne (Bro Kernev), south-west of Quimper, Brittany. The area has officially been part of the French département of Finistère since 1790. The term bigoudène should not be confused with "bigoudénnie", the geographical concentration of these women, or with the Bigouden region. Etymology The first attestation of the term bigoudène in the French language (from Breton: bigoudenn) was in 1881 in the Revue des deux Mondes (French: [ʁəvy de dø mɔ̃d], Review of the Two Worlds). It had been used in the Breton language around 1830 with the meaning "headdress of linen or cotton worn in the region of Pont-l'Abbé". It is related to the terms bigoudi (hair curler), bigot (part of the racage from a yard on top of a traditional square-rigged ship) and bigue (kind of pulley, type of spar used as a crane). Literary quotes "Very strong, vaulted, thick waist, they [the women from Plomeur] wear three superimposed cloth skirts (...) and they wear the strange bigoudène coif, a kind of variegated headband that hides their ears and lets their pinned-up hair show from behind". ― (François Coppée, Prose, Mon franc-parler I, 1894, p. 115) "But nothing could stop the stubborn bigoudène". ― (Hervé Bazin, , 1956, page 37). 
History Contrary to a widespread legend that explains the headdress's height as a response to the steeples cut down during the Revolt of the papier timbré (an anti-fiscal revolt in the west of Ancien Régime France under Louis XIV, April to September 1675), the bigoudène headdress only became really high in the twentieth century, especially in the Interwar period (November 1918 - September 1939), when it gained a centimeter per year. The maximum height of the cap was reached at the end of the Second World War, when the Breton costume started to become old-fashioned. The high headdress is for ceremonies or states of mourning: the everyday headdress worn during daily work is a simple black velvet ribbon around the comb, behind which the chignon was concealed. In 1977, 31% of women over 47 years old wore the headdress. This figure drastically decreased to only 500 women (of all ages) in 1993. In 2011 Maria Lambour was one of the last women to wear this headdress on a daily basis. Today it is worn only during cultural events and by rare women on an almost daily basis. On 11 June 2018 the then doyenne (eldest) of the Bigoudènes, Marie Pochat, died at the age of 102 in her native Brittany. She was one of the last few diehard Bretons still wearing the headdress. Born on 29 February 1916 in Léchiagat (now Treffiagat) in the Bigouden country, Marie Pochat regularly wore the headdress from the age of 12. "Without this headdress, I feel that I am missing something," she told France 3 Brittany on the occasion of the celebration of her centenary in 2016. Only a handful of Bretons still wear this lace headdress, a true symbol of Brittany that appeared in 1747. In 2015 the Museum of Brittany hosted an exhibition by photographer Charles Fréger showing the considerable richness and diversity of Breton headdresses. Sartorial aspects The making of the bigoudènes' traditional costume is recognized as a landmark of French sartorial heritage and high craftsmanship. 
The oldest known bigoudène headdress dates back to 1830; still ample, it largely covers the hair, and its embroidery, limited to a small rectangle, is nascent. It is exhibited at the Bigouden Museum in Pont-l'Abbé. One of the most important sartorial events for bigoudènes is the "Feast of Embroiderers" (French: Fête des Brodeuses), taking place every year in July in Pont-l'Abbé, Finistère, Brittany. In the arts Numerous artists immortalized the bigoudènes, such as: Henri Guinier (1867–1927) François Hippolyte Lalaisse (1810–1884) Henri Delavallée (1860–1943) Georges A. L. Boisselier (1876–1943) (1825–1893) Georges Lacombe (1868–1916) Lucien Simon (1861–1945) Joseph-Félix Bouchor (1853–1937) Émile Malo-Renault (1870–1938) Paul Gauguin (1848–1903) Pascal Dagnan-Bouveret (1852–1929) In popular culture In the French-speaking world, since the 1970s, television commercials from Breizh Cola and most importantly the French food industry company "" have been portraying elderly women dressed as Bigoudènes while shouting "Tipiak, Pirates!". This famous slogan propelled the term "" to become synonymous with "hacker" in web communities and now refers to hackers or counterfeiters. The sticker made by the textile enterprise depicting a small figure wearing a bigoudène headdress has been stuck on more than 1.5 million cars across the world as of July 2011 and has become a popular symbol of recognition for Bretons. The bigouden coiffe is part of local identity and omnipresent in tourist brochures. Pâtisserie The Bigoudène briochée (Brioched Bigoudène) is a pâtisserie popularized during the ' centenary in Loctudy. It is composed of a raised dough wrapped around a cylinder, cooked on a spit and slowly browned. It is sold on city markets around the bigoudénnie and most prominently in Locronan, regularly elected "one of the most beautiful villages in France". 
There are two kinds of these pastry headdresses: a savoury one with emmenthal and black olives, and a sweet one covered with sugar or chocolate. The idea was partly inspired by Eastern European countries (Romania, Hungary), where this type of dough, left to rise for a moment before being put on a grill, is very popular. Sources Breton art Breton-language singers Culture of Brittany Brittany Culture of France Costume design French fashion
Bigoudène
[ "Engineering" ]
1,466
[ "Costume design", "Design" ]
59,737,107
https://en.wikipedia.org/wiki/Hygrocybe%20singeri
Hygrocybe singeri or witch's hat is a species of Hygrocybe from Northwestern California. The species is very similar to Hygrocybe conica, differing in its viscid stipe. References External links singeri Fungi of California Fungus species
Hygrocybe singeri
[ "Biology" ]
63
[ "Fungi", "Fungus species" ]
59,737,151
https://en.wikipedia.org/wiki/Laccaria%20proxima
Laccaria proxima is a species of edible mushroom in the genus Laccaria found in the conifer forests of California, as well as eastern and northern North America. References External links Edible fungi proxima Fungus species
Laccaria proxima
[ "Biology" ]
47
[ "Fungi", "Fungus species" ]
59,737,184
https://en.wikipedia.org/wiki/Laccaria%20fraterna
Laccaria fraterna is a species of Laccaria that grows on Eucalyptus and Acacia trees. References External links fraterna Fungus species
Laccaria fraterna
[ "Biology" ]
31
[ "Fungi", "Fungus species" ]
63,329,110
https://en.wikipedia.org/wiki/CAST-32A
CAST-32A, Multi-core Processors is a position paper by the Certification Authorities Software Team (CAST). It is not official guidance, but is considered informational by certification authorities such as the FAA and EASA. A key point is that multi-core processor "interference can affect execution timing behavior, including worst case execution time (WCET)." The original document was published in 2014 by an "international group of certification and regulatory authority representatives." The current revision A was released in 2016. "The Federal Aviation Administration (FAA) and European Aviation Safety Agency (EASA) worked with industry to quantify a set of requirements and guidance that should be met to certify and use multi-core processors in civil aviation, described e.g. in the FAA CAST-32A Position Paper and the EASA Use of MULticore proCessORs in airborne Systems (MULCORS) research report." For applicants certifying under EASA, AMC 20-193 has now superseded CAST-32A since its release on 21 January 2022. The FAA is expected to release its Advisory Circular AC 20-193 in 2023, with guidance almost identical to AMC 20-193. One of the first mixed-criticality multicore avionics systems was expected to be certified sometime in 2020. The objectives of the standard apply to software on multicore processors, including the operating system. However, the nature of the underlying processor hardware must be examined in detail to identify potential interference channels due to inter-core contention for shared resources. Verification that multicore interference channels have been mitigated can be accomplished through the use of interference generators, i.e. software tuned to create a heavy usage pattern on a shared resource. Objectives The paper presents ten objectives that must be met for Design Assurance Level (DAL) A or B. Six of the objectives apply for DAL C. The paper does not apply for DAL D or E. 
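The interference-generator technique mentioned above can be illustrated with a small sketch. The following Python code is a hypothetical, non-avionics illustration (all names are my own, and a qualified tool would be written in C and tuned per interference channel): helper processes repeatedly sweep a large buffer to contend for shared cache and memory bandwidth while a workload's execution time is measured, loosely mimicking how an interference generator stresses a shared resource during timing tests.

```python
import multiprocessing as mp
import time

def interference_generator(stop, size=8 * 1024 * 1024):
    """Hammer a large buffer to generate traffic on shared
    cache/memory resources (illustrative, not a qualified tool)."""
    buf = bytearray(size)
    while not stop.is_set():
        # Touch one byte per 64-byte cache line across the buffer.
        for i in range(0, size, 64):
            buf[i] = (buf[i] + 1) & 0xFF

def measure_under_contention(workload, n_interferers=2):
    """Time `workload` while interference generators run in parallel."""
    stop = mp.Event()
    procs = [mp.Process(target=interference_generator, args=(stop,))
             for _ in range(n_interferers)]
    for p in procs:
        p.start()
    try:
        t0 = time.perf_counter()
        workload()
        return time.perf_counter() - t0
    finally:
        stop.set()
        for p in procs:
            p.join()

if __name__ == "__main__":
    elapsed = measure_under_contention(lambda: sum(range(2_000_000)))
    print(f"execution time under contention: {elapsed:.4f} s")
```

In a real CAST-32A verification campaign the generators would be tuned per interference channel (cache, interconnect, DRAM controller) and pinned to specific cores; this sketch only conveys the structure of such a test.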
References RTCA standards Computer standards Avionics Safety engineering Software requirements
CAST-32A
[ "Technology", "Engineering" ]
411
[ "Systems engineering", "Software requirements", "Safety engineering", "Computer standards", "Avionics", "Software engineering", "Aircraft instruments" ]
63,329,330
https://en.wikipedia.org/wiki/DO-297
DO-297, Integrated Modular Avionics (IMA) Development Guidance and Certification Considerations is one of the primary documents by which certification authorities such as the FAA and EASA approve Integrated Modular Avionics (IMA) systems for flight. The FAA Advisory Circular (AC) 20-170 refers to DO-297. Along with ARINC 653 and DO-248, the DO-297 standard guides "Safety of flight for IMA systems". DO-297 provides specific guidance for the stakeholders, defining the following roles: platform and module suppliers, application suppliers, IMA system integrator, certification applicant, maintenance organization, and certification authority. The DO-297 standard formalizes the use of more powerful computing hardware to host multiple software functions of mixed safety-criticality. IMA produces benefits of reduced Size, Weight, and Power (SWaP) by integrating into a single computing platform software functions that were formerly on separate (federated) computing systems. The standard describes how safety is maintained through the isolation provided by a partitioning environment, ensuring that independent functions cannot adversely impact one another's behavior. History The document was published by RTCA, Incorporated, in a joint effort with EUROCAE, and completed in November 2005. The lessons learned in certifying early approaches to IMA in commercial aircraft such as the Boeing 787 Dreamliner and the Airbus A380 helped inform the development of the standard. References Computer standards Safety engineering RTCA standards Avionics
DO-297
[ "Technology", "Engineering" ]
292
[ "Systems engineering", "Safety engineering", "Computer standards", "Avionics", "Aircraft instruments" ]
63,329,438
https://en.wikipedia.org/wiki/Jens%20Meiler
Jens Meiler (born August 31, 1974) is a German-American biologist and structural chemist. He currently serves as a Professor of Chemistry and Associate Professor of Pharmacology and Biomedical Informatics at Vanderbilt University. His research focuses on protein structures and computational biology, drawing on interdisciplinary techniques from other sciences. Biography Meiler was born in Leipzig, East Germany. He attended the University of Leipzig, where he received a B.S. in biology in 1995. He then continued to the University of Frankfurt, receiving a Ph.D. in structural biology in 2001, funded by a German National Merit Foundation scholarship. His doctoral adviser was Christian Griesinger, Director of the Max Planck Institute for Biophysical Chemistry. Meiler then completed his postdoctoral work in the same field through the Human Frontier Science Program at the University of Washington from 2001 to 2004. His postdoctoral adviser was the biochemist David Baker, the Henrietta and Aubrey Davis Endowed Professor in Biochemistry at the University of Washington. After completing his postdoctoral fellowship, Meiler served as an Assistant Professor of Chemistry, Pharmacology, and Biomedical Informatics at Vanderbilt University. In 2011, he received tenure and was promoted to Associate Professor. During this time, he received the Vanderbilt Institute for Chemical Biology Prize for Highly Cited Article award (2014). In 2019, Meiler was awarded the Alexander von Humboldt Professorship from the Alexander von Humboldt Foundation for his research in bioinformatics and protein structures. As part of the award, Meiler collaborated with colleagues at Leipzig University on the study of G-protein coupled receptors. He was also named the Stevenson Chair in Chemistry. At Vanderbilt, his lab conducts research on cheminformatics, ligand docking, and protein design. 
It is funded by a number of national organizations, including the National Science Foundation and the National Institutes of Health. The Meiler Lab at Vanderbilt University specializes in computational, structural, and chemical biology. Their focus is on protein-protein interactions, protein design, ligand docking, and cheminformatics. Their findings on small-molecule therapeutics and receptor-binding proteins have been published in academic journals like Nature. In recent years, Meiler has also conducted research on artificial intelligence. His work has been featured in newspapers in both the United States and Germany. Honors and awards 2019 Alexander von Humboldt Professorship from the Alexander von Humboldt Foundation 2015 Vanderbilt University Chancellor Faculty Fellow 2014 Vanderbilt Institute for Chemical Biology Prize for Highly Cited Article 2002–2005 Human Frontier Science Program postdoctoral fellowship 1998–2001 Kekulé Scholarship, German Chemical Industry Association 1994–1998 German National Merit Foundation scholarship Personal life Jens Meiler lives in Nashville, Tennessee and Leipzig, Germany. Notable publications References External links 1974 births Living people 21st-century American chemists 21st-century German chemists Structural biologists
Jens Meiler
[ "Chemistry" ]
558
[ "Structural biologists", "Structural biology" ]
63,331,699
https://en.wikipedia.org/wiki/Recursive%20islands%20and%20lakes
A recursive island or lake, also known as a nested island or lake, is an island or a lake that lies within a lake or an island. For the purposes of defining recursion, small continental land masses such as Madagascar and New Zealand count as islands, while large continental land masses do not. Islands found within lakes in these countries are often recursive islands because the lake itself is located on an island. Recursive islands Islands in lakes Islands in lakes on islands There are nearly 1,000 islands in lakes on islands in Finland alone. Islands in lakes on islands in lakes Islands in lakes on islands in lakes on islands Islands in lakes on islands in lakes on islands in lakes Moose Boulder was claimed to exist in the seasonal pond of Moose Flats on Ryan Island in Siskiwit Lake on Isle Royale in Lake Superior in the United States. In 2020, an expedition to the island found that it, along with the aforementioned seasonal pond, is potentially a hoax. Recursive lakes Lakes on islands Lakes on islands in lakes Lakes on islands in lakes on islands Lakes on islands in lakes on islands in lakes Fourth-grade recursive lakes are incredibly rare; only two of them have been found so far. See also List of endorheic basins Volcanic crater lake List of islands by area List of lakes by area List of islands by population Notes References Coastal and oceanic landforms Coastal geography Lake islands Lakes Lists of islands Recursion
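The nesting grades listed above can be expressed as a small sketch. This hypothetical Python helper (the function name and list representation are my own) checks that islands and lakes alternate and counts the containment steps, from the innermost feature outward:

```python
def nesting_order(chain):
    """Return the recursion grade of a containment chain.

    `chain` lists features innermost-first, e.g.
    ["lake", "island", "lake", "island", "lake"] describes a lake on
    an island in a lake on an island in a lake (grade 4).
    """
    if not chain or any(f not in ("island", "lake") for f in chain):
        raise ValueError("chain must contain only 'island' and 'lake'")
    for inner, outer in zip(chain, chain[1:]):
        if inner == outer:
            raise ValueError("islands and lakes must alternate")
    return len(chain) - 1

# Moose Boulder as claimed: an island in a pond on Ryan Island in
# Siskiwit Lake on Isle Royale in Lake Superior.
print(nesting_order(["island", "lake", "island", "lake", "island", "lake"]))  # 5
```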
Recursive islands and lakes
[ "Mathematics" ]
294
[ "Mathematical logic", "Recursion" ]
63,331,751
https://en.wikipedia.org/wiki/Time%20in%20Tuvalu
Time in Tuvalu is given by Tuvalu Time (TVT; UTC+12:00). Tuvalu Time does not have an associated daylight saving time. Tuvalu lies between longitudes 176° and 180°, west of the International Date Line. IANA time zone database In the IANA time zone database, Tuvalu is given one time zone: References Geography of Tuvalu
Time in Tuvalu
[ "Physics" ]
88
[ "Spacetime", "Physical quantities", "Time", "Time by country" ]
63,331,864
https://en.wikipedia.org/wiki/Ruth%20Blake
Ruth E. Blake is an American geochemist and environmental scientist. She is a professor at Yale University in earth & planetary sciences, environmental studies, and chemical & environmental engineering. Blake's work focuses on marine biogeochemical processes, paleoclimate, astrobiology, and stable isotope geochemistry. Education Ruth Blake completed a B.S. degree in geology from Wayne State University and an M.S. degree in hydrogeology from the University of Texas. She earned a Ph.D. in geochemistry from the University of Michigan in 1998. Blake's doctoral research focused on how microbial activity can affect oxygen isotopes in phosphates. Career and research While a professor at Yale, Blake expanded on her graduate research focus, using isotopic evidence in ancient marine phosphates to show that there was significant biological activity in the ocean during the Archean era. Blake has worked on numerous other research topics related to biological and/or chemical activity in oceans, sediments, and soils. She has worked on methods development in isotope geochemistry. Awards and honors Blake was the 2002 winner of the F.W. Clarke Medal from the Geochemical Society. References External links Yale University faculty Living people Year of birth missing (living people) American geophysicists American environmental scientists Women geophysicists 21st-century American physicists 21st-century American women scientists Wayne State University alumni American women geologists American women physicists 21st-century American geologists University of Texas at Austin alumni University of Michigan alumni American women academics 21st-century African-American women 21st-century African-American scientists
Ruth Blake
[ "Environmental_science" ]
323
[ "American environmental scientists", "Environmental scientists" ]
63,333,004
https://en.wikipedia.org/wiki/Kim%20Doochul
Kim Doochul is a South Korean theoretical physicist. He was head of the Department of Physics, director of the BK21 Physics Research Division, and professor emeritus at Seoul National University. He was also a fellow and chairperson in the Korean Academy of Science and Technology before becoming the fifth president of the Korea Institute for Advanced Study and the second president of the Institute for Basic Science. He was a standing trustee with the Asia Pacific Center for Theoretical Physics and a member of the board of trustees of the Korean Physical Society. Education Kim received his bachelor of science in electronic engineering from Seoul National University in 1970 and a Ph.D. in electrical engineering from Johns Hopkins University in 1974 with a focus in statistical physics. Career From 1974 to 1977, he was a postdoctoral research fellow at New York University and the University of Melbourne. He then became a professor in the Department of Physics and Astronomy of Seoul National University (SNU), a position he held through August 2010. While at SNU, he was also chairperson of the Department of Physics, director of the BK21 Physics Research Division, head of the Department of Physics, and professor emeritus. Within the Korean Academy of Science and Technology, he was a fellow and later chairperson of the Division of Natural Sciences. From July 2010 to June 2013, he was the 5th president of the Korea Institute for Advanced Study. After winning the 20th Sudang Prize in Basic Science in 2011, he served as the Basic Science Committee Member for the Sudang Award from the 21st presentation to the 29th (2012-2019). His final position was as the second president of the Institute for Basic Science from September 2014 until September 2019. His November farewell ceremony was also the inauguration ceremony for incoming president Noh Do Young. 
Awards and honors 2011: Sudang Award in Basic Science 2011: 52nd Samil Cultural Award in Natural Sciences, () 2009: 58th Seoul City Cultural Award in Natural Sciences 2008: Education Award, Seoul National University 1998: Academic Award for Outstanding Research, Korean Physical Society 1987: Best Paper Award, Korean Physical Society References External links Kim Doochul - Google Scholar Seoul National University alumni Academic staff of KAIST 1948 births Living people Theoretical physicists South Korean physicists Johns Hopkins University alumni Presidents of the Institute for Basic Science South Korean scientists
Kim Doochul
[ "Physics" ]
450
[ "Theoretical physics", "Theoretical physicists" ]
63,334,526
https://en.wikipedia.org/wiki/Cluster%20of%20Excellence%20Frankfurt%20Macromolecular%20Complexes
The Cluster of Excellence Frankfurt "Macromolecular Complexes" (CEF) was established in 2006 by Goethe University Frankfurt together with the Max Planck Institute of Biophysics and the Max Planck Institute for Brain Research in the context of the German Universities Excellence Initiative. Funding by the Deutsche Forschungsgemeinschaft (DFG) ended in October 2019. CEF grew out of the long-standing collaborative research on membrane proteins and RNA molecules and strengthened research efforts in these fields by recruiting further scientists to Frankfurt/Main. CEF brought together the research activities of up to 45 research groups, the majority of which were based on Riedberg Campus in Frankfurt/Main. CEF founded the Buchmann Institute for Molecular Life Sciences (BMLS). Aims CEF scientists set out to investigate the structure and function of large macromolecular complexes, in particular membrane proteins and their assemblies, complexes involved in signal transduction and quality control, and RNA-protein complexes. Research Important structures of macromolecular complexes were determined in CEF. Examples of important membrane complexes include the atomic structures of complex I and the ATP synthase of the mitochondrial respiratory chain and of the transporter associated with antigen processing (TAP). Research on RNA structure and function led to the definition of regulatory principles of temperature-sensing riboswitches, the structure-function relationship of RNA polymerase I, the functions of microRNAs and the mechanisms of rRNA maturation and downstream processes during ribosome biogenesis and recycling. For instance, CEF scientists identified the receptors of ubiquitin chains on the proteasome, deciphered the role of linear ubiquitin chains and described macromolecules regulating mitophagy, xenophagy and ER-phagy. They delineated the role of sumoylation in ribosome quality control and characterized the process of genetic quality control in oocytes. 
The efforts in these three research areas were accompanied by approaches to design or reprogram macromolecular complexes and by new methods developed to expand the already strong expertise. CEF scientists established and advanced the principles of optogenetics as well as biochemical methods for light regulation. They also developed biophysical techniques for the structural and functional characterization of macromolecules. Examples include light-switchable molecules designed for in-cell applications and time-resolved techniques to study RNA folding. Light sheet fluorescence microscopy for the observation of development and LILBID mass spectrometry for the analysis of membrane complexes were improved. PELDOR-EPR was developed to a resolution that allows in-cell measurements. The Cluster promoted scientific exchange through a range of programmes as well as through workshops, international conferences and lecture series. Optogenetics and light sheet fluorescence microscopy were selected as the "Method of the Year" across all fields of science and engineering by the interdisciplinary research journal Nature Methods in 2010 and 2014, respectively. The five research areas of CEF were: (A) Structure, mechanisms and dynamics of complexes in the membrane, (B) Composition and dynamics of macromolecular complexes in quality control and signalling, (C) Dynamics of ribonucleic acid-protein complexes, (D) Design of macromolecular complexes, and (E) Methods for studying macromolecular complexes. CEF Research Area A - Structure, mechanisms and dynamics of complexes in the membrane Biological membranes have a very important role in life processes, as everything a cell needs to live, grow and respond has to either pass through or act on them. The energy conversion processes of cellular respiration and photosynthesis happen in membranes, and every sensory stimulus and the information processing in the brain is mediated by them. 
This array of diverse actions is performed by a large number of different membrane proteins. In the crowded conditions of the cell membrane, most membrane proteins associate into complex dynamic assemblies to carry out their various tasks. For this reason, and because they are embedded in the lipid bilayer of the membrane, most membrane proteins are difficult to study and their functions have often been intractable. CEF scientists have done groundbreaking work to overcome some of these challenges and made major contributions to elucidating the structure, mechanisms and regulation of a number of important large complexes, including respiratory complex I, rotary ATPases, supercomplex I1III2IV1, cytochrome cbb3 oxidase, cytochrome bd oxidase, a sulfide:quinone oxidoreductase, a fungal TOM core complex, a bacterial double-pore K+ uptake system KtrAB, the Na+-independent carnitine/butyrobetaine antiporter CaiT, the betaine/Na+ symporter BetP, the multidrug efflux transporter AcrB, the chaperone and editing TAPBPR–MHC I complex and the human MHC-I peptide-loading complex. Antigenic peptide recognition on TAP was resolved by DNP-enhanced solid-state NMR spectroscopy. The conformational coupling and trans-inhibition in the human antigen transporter ortholog TmrAB were resolved with the aid of dipolar EPR spectroscopy. The progress in 3D structure determination of membrane proteins by X-ray crystallography and cryo electron microscopy has created an increasing demand and opportunity for in-depth mechanistic studies by magnetic resonance methods. Due to the challenges intrinsic to membrane proteins, progress relies on the availability of techniques at the forefront of method development. Especially solid-state (MAS) NMR enables bridging the gap between 'static' structures and biochemical data by probing membrane proteins directly within the bilayer environment. 
Such experiments are challenging, and breakthroughs could only be achieved thanks to the availability of dynamic nuclear polarization for sensitivity enhancement and very high magnetic fields for spectral resolution. CEF scientists were able to provide new insights into the catalytic mechanism of ABC transporters. Based on real-time 31P-MAS-NMR, they found that the homodimeric lipid A flippase MsbA is able to catalyze a reverse adenylate kinase-like reaction in addition to ATP hydrolysis. In addition, the ATP hydrolysis cycle of the ABC transporter LmrA was probed by site-directed spin labeling and pulsed electron–electron double resonance (PELDOR/DEER) spectroscopy. The secondary multidrug efflux pump EmrE from E. coli was extensively studied with 31P- and DNP-enhanced solid-state NMR. Also, a number of photoreceptors such as microbial rhodopsins are involved in trans-membrane transport processes. For example, fundamental contributions were made by groups within CEF towards the structural and functional description of proteorhodopsin, a pentameric light-driven proton pump. CEF researchers have developed mass spectrometry approaches specifically suitable for large membrane protein complexes. Laser induced liquid beam/bead ion desorption mass spectrometry (LILBID) enables mass analysis of whole membrane protein complexes of 1 MDa or more. A team of CEF scientists resolved the mechanism of the subtype selectivity of human bradykinin receptors for their peptide agonists by integrating DNP-enhanced solid-state nuclear magnetic resonance with advanced molecular modeling and docking. CEF Research Area B - Composition and dynamics of macromolecular complexes in quality control and signalling The characterization of the function and structural composition of signalling complexes controlling cellular quality control programs was one of the major topics of CEF research. 
The view that proteins act as single entities has been replaced with the concept that dynamic reorganization of multimeric soluble complexes, termed signalosomes, is essential for signal transmission in the cell. Regulation of the activity of these complexes is achieved by their dynamic composition as well as by post-translational modifications (PTMs) of proteins. Domains that recognize these modifications play decisive roles in a cell's ability to respond to alterations in its microenvironment. Significant progress has been accomplished by CEF in characterizing several signalling pathways and their regulation by PTMs including ubiquitylation, phosphorylation and acetylation. A particular focus of research in CEF has been on protein quality control mechanisms that are the basis for the autophagic and the ubiquitin/proteasomal pathways, the two cellular systems used to degrade faulty or superfluous proteins, complexes and organelles. Additional foci of CEF research were genetic quality control in oocytes and epithelial stem cells by the p53 protein and the regulation of and by kinases. Research into autophagy During selective autophagy, cargo is specifically targeted for degradation, and distinct cargo receptors have been described that regulate selectivity. This process is facilitated by autophagy receptors specifically recognizing and binding their cargo, and delivering it to the phagophore. In humans, there are six different LC3/GABARAP proteins, which play a central role by connecting nascent autophagosome membranes and cargo-loaded autophagy receptors to facilitate engulfment, sometimes mediated or supported by additional adaptor proteins. CEF scientists showed that GABARAP proteins are not only involved in autophagy but also in the ubiquitin-dependent degradation of TIAM1. Breakthroughs were achieved in understanding how cells fight intracellular pathogens and how intracellular bacteria try to evade these countermeasures. 
The kinase Tbk1 was identified as important for mediating optineurin-based xenophagy to remove the bacteria from infected cells. Using mass spectrometry, a global analysis of the ubiquitinome of Salmonella-infected cells was carried out, which enabled CEF scientists to identify specific targets of bacterial ligases that are secreted into the cellular cytoplasm by the pathogens. CEF scientists also revealed the molecular mechanism of a novel type of phosphoribosyl-linked serine ubiquitination by the effector SdeA of the pathogen Legionella, which is very different from the canonical lysine-based ubiquitination mechanism. They further showed that another effector of Legionella bacteria, SidJ, opposes the toxicity of SidE in yeast and mammalian cells. Mass spectrometry analysis revealed that SidJ is a glutamylase that modifies the catalytic glutamate in the mono-ADP ribosyl transferase domain of SdeA, thus blocking the ubiquitin ligase activity of SdeA. They further discovered that reticulon-type proteins act as ER-specific autophagy receptors and simulated their effect on membrane curvature. Ubiquitination Ubiquitination plays a central role in marking proteins to be degraded either via the autophagy pathway or via the proteasome. Several groups of CEF have contributed to advances in understanding how ubiquitin signalling is not only used as a degradation signal but is also involved in several other cellular processes. p63 Research on TP63, also known as p63, has shown that this protein plays essential roles both for the proliferation and differentiation of stratified epithelial tissues and for the surveillance of genetic quality in female germ cells. Investigations by CEF scientists showed that a specific isoform of p63 is highly expressed in primordial oocytes, which are arrested in prophase of meiosis I. 
This isoform adopts a closed, inactive and only dimeric conformation in which both the interaction with DNA and the interaction with the transcriptional machinery are significantly reduced. The inhibition is achieved by blocking the tetramerization interface of the oligomerization domain with a six-stranded anti-parallel beta-sheet. Activation requires phosphorylation and follows a spring-loaded, irreversible activation mechanism. These discoveries open up the possibility of developing a therapy for preserving oocytes during chemotherapy, which in female cancer patients usually results in infertility and the premature onset of menopause. CEF scientists also helped to identify the molecular mechanism causing ankyloblepharon-ectodermal dysplasia-cleft lip/palate syndrome, a disease characterized by skin erosions, oral clefting abnormalities and fused eyelids, which is based on mutations in the SAM domain or in the C-terminus of p63. Complexes involved in tumorigenesis were studied by several CEF groups, including the leukemogenic AF4-MLL fusion protein and RIP1-containing cytosolic complexes that are critical for the initiation and fine-tuning of different forms of cell death, i.e. apoptosis and necroptosis.

SGC Frankfurt

Goethe University became a member of the Structural Genomics Consortium (SGC) in 2017, an international consortium and public-private partnership dedicated to the determination of structures of important proteins and the development of inhibitors and probes for biological macromolecules to be used in functional investigations. Goethe University has also become the home and reference center for the SGC's donated probes programme, which makes small molecules that are no longer being pursued by industry as drug targets freely available to researchers worldwide. CEF scientists have developed bromodomain inhibitors that can be used to study the function of these acetyl-lysine-binding domains.
A set of probes has been characterized and validated as tools for specific bromodomains.

Interactions with soluble domains at the membrane

CEF showed that vascular endothelial growth factor receptor-2 needs to be internalized and is regulated by its association with ephrin B2 in endothelial cells. Ephrin B2 was also found to be essential for controlling levels of AMPA receptors at the synaptic membrane. The mechanism of membrane insertion of tail-anchored proteins was studied by structural and biochemical characterization of the interaction of the soluble Get3 protein with the cytoplasmic domains of the membrane-bound receptors Get1 and Get2.

CEF Research Area C - Dynamics of ribonucleic acid-protein complexes

Many discoveries, including the identification of multiple classes of noncoding RNAs and regulatory RNA elements, have broadened the perspective on RNA function from a passive carrier of information to an active cellular component. Its structural and functional description is required to understand the molecular interactions and the dynamics involved.

Structural description of RNA elements and their dynamics

The combination of high-resolution NMR-based analysis of RNA structures and time-resolved ligand-induced refolding of RNAs by caging distinct conformations, together with pulsed electron paramagnetic resonance methods (PELDOR) after base-specific spin-labeling and ultrafast laser spectroscopy of RNA dynamics, has led to the description of the structural dynamics of several RNAs. CEF scientists showed that the regulation mechanism of the adenine-sensing riboswitch of the human pathogenic bacterium Vibrio vulnificus is notably different from a two-state switch mechanism in that it involves three distinct stable conformations. This translational adenine-sensing riboswitch represented the first example of a temperature-compensated regulatory RNA element.
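The equilibrium occupancy of such competing riboswitch conformations follows Boltzmann statistics. The following is a generic, stdlib-only sketch with assumed free-energy values, not data from the CEF studies:

```python
import math

R = 8.314e-3  # gas constant in kJ/(mol*K)

def populations(delta_g, temp_k=310.0):
    """Boltzmann populations of conformers given relative free energies (kJ/mol)."""
    weights = [math.exp(-g / (R * temp_k)) for g in delta_g]
    z = sum(weights)  # partition function over the listed conformers
    return [w / z for w in weights]

# Three hypothetical conformers: a ground state plus two at 2 and 5 kJ/mol
print(populations([0.0, 2.0, 5.0]))
```

Raising the temperature flattens these populations, which is one way to reason about why temperature compensation of a multi-state RNA switch is non-trivial.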
The composition and structure of the HIV TAR RNA-ligand complex was analyzed by LILBID and NMR, leading to a description of the complexity of peptide binding sites in RNAs. Furthermore, the guanine-sensing riboswitch-aptamer domain of the Bacillus subtilis xpt-pbuX operon, the Diels-Alderase ribozyme, an RNA-based thermometer, and the N1–ribostamycin complex were structurally and functionally analyzed. CEF scientists also showed that for the guanine-sensing xpt-pbuX riboswitch of B. subtilis, the conformation of the full-length transcripts is static: it exclusively populates the functional off-state but cannot switch to the on-state, regardless of the presence or absence of ligand. Only the combined matching of transcription rates and ligand binding enables transcription intermediates to undergo ligand-dependent conformational refolding (Steinert et al., 2017).

Components involved in ribosome biogenesis in eukaryotes

CEF scientists, in collaboration with the Max Planck Institute for Biophysical Chemistry, visualized RNA polymerase I (Pol I) in the process of actively transcribing ribosome genes in a cellular environment and solved its structure with and without nucleic acids at 3.8 Å resolution by cryo-EM. Their structures explained the regulation of transcription elongation, in which contracted and expanded polymerase conformations are associated with active and inactive states, respectively. Work by a collaboration between several CEF groups unravelled the molecular nature of Bowen-Conradi syndrome by demonstrating that the disease-causing point mutation of the ribosome biogenesis factor Nep1 impairs its nucleolar localisation and RNA binding. Another study, in collaboration with Edinburgh University, analysed the RNA helicase Prp43 by crosslinking of RNA and analysis of cDNA (CRAC) and provided the first insights into the functional roles of this enzyme in ribosome biogenesis. CEF scientists also identified plant-specific ribosome biogenesis factors in A.
thaliana with essential functions in rRNA processing and showed that the 60S-associated ribosome biogenesis factor LSG1-2 is required for 40S maturation in A. thaliana.

Distribution of RNA-modifying enzymes and RNA molecules

The dynamics of RNPs in native environments in eukaryotic cells were visualized and quantified using high-resolution microscopy. Adenosine-to-inosine (A-to-I) RNA editing, which is catalyzed by adenosine deaminase acting on RNA (ADAR) enzymes, is important in the epitranscriptomic regulation of RNA metabolism. Cathepsin S (CTSS) mRNA, which encodes a cysteine protease associated with angiogenesis and atherosclerosis, was shown to be highly edited in human endothelial cells. A-to-I RNA editing controls cathepsin S expression in atherosclerosis by enabling HuR-mediated post-transcriptional regulation. mRNA export from the nucleus to the cytoplasm is a highly regulated step in gene expression. CEF scientists evaluated members of the SR protein family (SRSF1–7) for their potential to act as adaptors for nuclear export factor 1 (NXF1) and thereby couple pre-mRNA processing to mRNA export. They found that >1000 endogenous mRNAs required individual SR proteins for nuclear export in vivo. To address the mechanism, transcriptome-wide RNA-binding profiles of NXF1 and SRSF1–7 were determined in parallel by individual-nucleotide-resolution UV crosslinking and immunoprecipitation (iCLIP). SRSF3 emerged as the most potent NXF1 adaptor, conferring sequence specificity to RNA binding by NXF1 in last exons. Numerous human diseases are characterised by a widespread dysregulation of RNA-binding proteins (RBPs) and massively altered transcriptome patterns. CEF scientists used computational methods to study the mechanisms of posttranscriptional regulation on a transcriptomic scale, in collaboration with researchers at IMB Mainz.
Noncoding RNAs

CEF scientists also investigated the influence of novel noncoding RNAs, such as long noncoding RNAs (lncRNAs) and microRNAs (miRNAs), on cellular function. miRNAs regulate gene expression by binding to target mRNAs and preventing their translation. One of the CEF Focus Projects succeeded in observing activity-dependent, spatially localized miRNA maturation in neuronal dendrites. Local maturation of the miRNA was found to be associated with a local reduction in protein synthesis, showing that localized miRNA maturation can modulate target gene expression with local and temporal precision. LncRNA Meg3 was found to control endothelial cell aging, and its inhibition may serve as a potential therapeutic strategy to rescue aging-mediated impairment of endothelial cell function. LncRNA MALAT1 was found to regulate endothelial cell function and vessel growth and to protect against atherosclerosis by regulating inflammation.

CEF Research Area D - Design of macromolecular complexes

A major focus of work in CEF was to develop and use methods and to explore proteins that enable modulating cellular and molecular function with light. In the field of optogenetics, control of membrane potential and intracellular signalling in neurons and other cells is achieved by expression of photosensor proteins, in most cases of microbial origin, e.g. ion channels or pumps, as well as light-activated enzymes. Optochemical approaches, in contrast, use chemically engineered molecules to achieve light effects in biological tissue.

Optogenetics

The origin of optogenetics lies in the work of the Bamberg group at the MPI of Biophysics in Frankfurt, who showed that channelrhodopsin-2 (ChR2) is a light-gated cation channel that can depolarize the cells in which it is expressed. During CEF, the Bamberg lab continued to work in this field and contributed several seminal papers, e.g.
on the characterization but also on the engineering of ChR2 into optogenetic tools with different properties. The first utilization of ChR2 for depolarization of mammalian cells and the generation of the first ChR2-transgenic animal took place in Frankfurt. The Gottschalk lab introduced ChR2, the light-driven Cl− pump halorhodopsin and other rhodopsins into the nervous system of the nematode C. elegans, to stimulate single neurons and correlate their function with a behavioural output. In addition, they studied synaptic transmission after photostimulation, using ChR2 and a photoactivated adenylyl cyclase (PAC), in combination with electrophysiology and electron microscopy, and introduced modified or novel optogenetic tools with altered properties, for blocking synaptic transmission, or for the manipulation of cyclic GMP. Several CEF groups joined forces to unravel the photocycle of ChR2 at different time scales and also provided, in collaboration with the Research Centre Jülich, structural insights into ion conduction by ChR2. They also generated several mutant ChR2 versions with altered ion conductance (for example, increased Ca2+ permeability in "CatCh", a Ca2+-transporting channelrhodopsin) or kinetics, representing highly useful additions to the optogenetic toolbox. In 2015, CEF scientists presented the first NMR study that resolved structural details of the retinal cofactor of ChR2. This study was only possible because DNP (a hybrid method linking EPR with solid-state NMR spectroscopy) enhanced the detection sensitivity 60-fold, so that metastable intermediates could be detected. In this way, the first unambiguous evidence was provided for an exclusive all-trans retinal conformation in the dark state, and a new photointermediate could be identified. The study showed that DNP-enhanced solid-state NMR is a key method for bridging the gap between X-ray-based structure analysis and functional studies towards a highly resolved molecular picture.
It gradually emerged that rhodopsins have a wide spectrum of functions and distribution and are found in all phyla of life. With the new rhodopsins came the observation that they represent a rather versatile family of proteins while retaining the structural scaffold of seven transmembrane helices with a retinal chromophore bound to a conserved lysine. CEF scientists have studied the structure as well as the function of microbial rhodopsins. One of these is proteorhodopsin, found in marine microbes, which is the most abundant retinal-based photoreceptor on our planet. Variants of proteorhodopsins show high levels of environmental adaptation, as their colours are tuned to the optimal wavelength of available light. CEF scientists, together with colleagues from other German universities, developed a novel approach to alter the functional properties of rhodopsin optogenetic tools, namely by modifications of the retinal chromophore. Synthetic retinal analogs were introduced into ChR2 or other rhodopsin tools in C. elegans, Drosophila and human cells, to change the light sensitivity, photocycle kinetics and colour spectrum of the optogenetic actuators. They also established the tightly light-regulated guanylyl-cyclase opsin CyclOp, which enabled rapid light-triggered cGMP increases. CEF scientists have also used optogenetic tools for the analysis of neural circuits and how they drive behaviour.

Optochemical approaches

To control proteins and nucleic acids by light, CEF scientists have designed and applied a range of photoswitchable tethers, ribonucleosides and nucleic acids, RNA aptamers and "beacons". They also developed an approach for the chemoenzymatic synthesis of position-specifically modified RNA for biophysical studies, including light control. Furthermore, light-activatable interaction of DNA nanoarchitectures, light-dependent conformational changes in nucleic acids, light-dependent RNA interference and light-dependent transcription were realized.
Wavelength-selective light-triggering was established for nucleic acids, as well as three-dimensional control of DNA hybridization by orthogonal two-colour two-photon uncaging. CEF scientists developed a red-shifted two-photon-only caging group for three-dimensional photorelease. They also developed a minimal light-switchable module enabling the formation of an intermolecular and conformationally well-defined DNA G-quadruplex structure with a photoswitchable azobenzene residue as part of the backbone structure. Also important was the development of an inducible fluorescent probe, which enabled the detection of activity-dependent, spatially localized miRNA maturation in neuronal dendrites. Using light-inducible antimiRs, CEF scientists also investigated whether locally restricted target miRNA activity has a therapeutic benefit in diabetic wound healing and found that light can be used to locally activate therapeutically active antimiRs in vivo. New building principles for DNA nanoarchitectures have been established in CEF. Also, new RNA riboswitches have been designed that can be triggered with small metabolites, exogenous molecules or by temperature changes, as well as aptamers or self-cleaving ribozymes, which can be used to control gene expression in vivo. Making macromolecules further accessible on the nanoscale for manipulation, CEF developed generally applicable methods to organize macromolecular complexes in two dimensions with very high precision, as well as small synthetic gatekeepers and novel "light switches" to control biomolecular interactions and the assembly of macromolecular complexes. An approach to assemble three-dimensional protein networks by two-photon activation was developed. CEF scientists also achieved optical control of antigen translocation using synthetic photo-conditional viral inhibitors.
Protein engineering

CEF scientists used detailed structural knowledge of the fatty acid synthase (FAS) megacomplex to engineer FAS for the biosynthesis of short-chain fatty acids and polyketides, guided by a combined in vitro and in silico approach. They reprogrammed chain-length control of the FAS of Saccharomyces cerevisiae to create a baker's yeast able to produce short-chain fatty acids. A rational and minimally invasive protein engineering approach was used that left the molecular mechanisms of FASs unchanged and identified five mutations that can make baker's yeast produce short-chain fatty acids. To manipulate a protein photocycle in a directed manner, CEF groups collaborated to modify the flavoprotein dodecin at its key amino acid tryptophan with substituents carefully selected for their structural and electronic influence.

CEF Research Area E - Methods for studying macromolecular complexes

The development of cutting-edge methodologies, including electron paramagnetic resonance (EPR), time-resolved nuclear magnetic resonance spectroscopy (NMR), advanced fluorescence microscopy, as well as optogenetics and optochemical biology, has been instrumental in the research efforts of CEF. The Cluster also integrated new developments in electron microscopy and tomography as well as in super-resolution microscopy into the methods portfolio of Riedberg Campus.

Cryo-electron microscopy

Cryo-electron microscopy, Nature Method of the Year 2015 and the method for which a Nobel Prize was awarded in 2017, was extensively employed by several CEF groups, at the MPI of Biophysics as well as at Goethe University's Buchmann Institute for Molecular Life Sciences. Direct electron detectors, in the development of which the MPI of Biophysics was involved, have exceeded all expectations. With these detectors, images can be captured with much higher contrast than with the CCD cameras previously used, which has led to remarkable progress in structural biology.
By investing in this new technology, CEF members have been able to speed up structure determination and also solve the structures of macromolecular complexes that were not amenable to X-ray crystallography studies. Another focus of CEF's electron microscopists was to reveal the macromolecular organisation of living cells by means of cryo-electron tomography. Cryo-ET is the only technique that can obtain molecular-resolution images of intact cells in a quasi-native environment. Such tomograms contain a large amount of information, as they are essentially a three-dimensional map of the cellular proteome and depict the whole network of macromolecular interactions. Information-mining algorithms exploit structural data from various techniques, identify distinct macromolecules and computationally fit atomic-resolution structures into the cellular tomograms, thereby bridging the resolution gap.

Light microscopy

The Cluster also strongly supported new developments in advanced light microscopy. A particularly important technique CEF added to the research technique portfolio in Frankfurt is light sheet fluorescence microscopy (LSFM). In LSFM, optical sectioning in the excitation process minimizes fluorophore bleaching and phototoxic effects. Because with LSFM biological specimens survive long-term three-dimensional imaging at high spatiotemporal resolution, such microscopes have become the tool of choice in developmental biology. The impact of LSFM was recognized in 2015, when the journal Nature Methods named it "Method of the Year 2014". CEF scientists used LSFM, for example, to image in detail the complete embryonic development of different evolutionarily unrelated insects and to establish the rules and self-organizing properties of post-embryonic plant organ cell division patterns.
The large amount of data produced by advanced light microscopy has made automated image analysis a necessity, and CEF has contributed to improved data processing and modelling of advanced light microscopy data. Other novel light microscopy techniques used by CEF scientists include techniques that provide single-molecule sensitivity and a spatial resolution below the diffraction limit to study the structural organization of biomolecules in cells. Software tools developed by CEF scientists include, for example, SuReSim, a software tool developed in collaboration with Heidelberg University that simulates localization data of arbitrary three-dimensional structures represented by ground-truth models, allowing users to systematically explore how changing experimental parameters can affect potential imaging outcomes. Using the newly developed techniques, CEF scientists were able to establish the role of the linear ubiquitin coat around the cytosolic pathogen Salmonella Typhimurium as the local NF-κB signalling platform and provided insights into the function of OTULIN in NF-κB activation during bacterial pathogenesis. Another example is the identification of reticulon 3 (RTN3) as a specific receptor for the degradation of ER tubules. The close collaborative teamwork of the consortium allowed tachling of two major challenges in live-cell as well as single-molecule localization microscopy: efficient delivery of fluorophores across cell membranes and high-density protein tracing by ultrasmall labels. Collectively, the new tools provide additional avenues to specifically manipulate and trap cellular proteins and, at the same time, allow high-resolution read-out by single-molecule-based microscopy.

Spectroscopy methods

A wide range of spectroscopy methods for biological applications were available within CEF, and CEF scientists have made significant progress in further developing biomolecular NMR and EPR.
The members of the Center for Biomolecular Magnetic Resonance (BMRZ) improved the sensitivity of liquid- and solid-state NMR by a spectrometer featuring dynamic nuclear polarization (DNP). Together with researchers from the Russian Academy of Sciences, CEF scientists developed a high-power gyrotron source for DNP. The source operates at 260 GHz with an output power of 20 W and is connected by a quasi-optical corrugated waveguide to one liquid- and one solid-state 400 MHz NMR spectrometer. The microwave board, which detects the EPR signal and connects the high-power microwave source to the NMR probe, was constructed in collaboration with scientists from the Ukrainian Academy of Sciences. This unique device is based on a metallo-dielectric waveguide system, which guarantees ultra-low losses combined with a high degree of flexibility in terms of instrument design. CEF scientists demonstrated a proton NMR signal enhancement in aqueous liquids of up to 80-fold at a magnetic field of 9.2 T, thus exceeding theoretical predictions by more than a factor of 20. First applications to macromolecular complexes have been equally successful. They also recorded signal enhancements by a factor of up to 40 under magic-angle sample spinning (MAS) conditions at 100 K with proteorhodopsin reconstituted into lipid bilayers. By integrating DNP-enhanced solid-state NMR spectroscopy with advanced molecular modeling and docking, the mechanism of the subtype selectivity of human kinin G-protein-coupled receptors for their peptide agonists was resolved. DNP-enhanced solid-state NMR spectroscopy enabled CEF scientists to determine the atomic-resolution backbone conformation of an antigenic peptide bound to the human ABC transporter TAP. Their NMR data also provided unparalleled insights into the nature of the interactions between the side chains of the antigen peptide and TAP.
Their findings revealed a structural and chemical basis of substrate selection rules, which define the crucial function of this ABC transporter in human immunity and health. This work was the first NMR study of a eukaryotic transporter protein complex and demonstrated the power of solid-state NMR in this field. They also demonstrated the power of DNP-enhanced solid-state NMR to bridge the gap between functional and structural data and models. In parallel to the DNP developments, a pulsed electron–electron double resonance (PELDOR) spectrometer with a magnetic field of 6.4 T was constructed. A protein amount of only 10 pmol is sufficient for a measurement at 40 K. With this instrument, CEF scientists were able to determine the dimeric structure of non-covalent protein complexes. This method is also applicable to membrane proteins and to spin-labelled RNA and DNA molecules in vivo. PELDOR spectroscopy proved to be a versatile tool for structural investigations of proteins, even in the cellular environment. In order to investigate, for example, the structural implications of the asymmetric nucleotide-binding domains and the trans-inhibition mechanism in TAP orthologs, spin-label pairs were introduced via double cysteine mutants at the nucleotide-binding domains and transmembrane domains in TmrAB (a functional homologue of the human antigen translocation complex TAP), and the conformational changes and the equilibrium populations were followed using PELDOR spectroscopy. This study defined the mechanistic basis for trans-inhibition, which operates by a reverse transition from the outward-facing state through an occluded conformation. The results uncovered the central role of reversible conformational equilibrium in the function and regulation of an ABC exporter and established a mechanistic framework for future investigations on other medically important transporters with imprinted asymmetry.
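The distance readout behind such PELDOR experiments rests on the dipolar coupling between the two spin labels. The following sketch uses the generic point-dipole relation for two g ≈ 2 nitroxide labels (a textbook formula, not a value taken from the CEF studies):

```python
# Point-dipole approximation: nu_dd = D / r^3, with D ≈ 52.04 MHz·nm^3
# for a pair of g ≈ 2 nitroxide spin labels.
D = 52.04  # MHz·nm^3

def distance_nm(nu_dd_mhz):
    """Inter-spin-label distance (nm) from the perpendicular dipolar frequency (MHz)."""
    return (D / nu_dd_mhz) ** (1.0 / 3.0)

# A dipolar frequency of ~6.5 MHz corresponds to a label separation of about 2 nm,
# squarely in the 1.5-8 nm window where PELDOR is most informative.
print(distance_nm(6.505))
```

Because the frequency falls off with the cube of the distance, PELDOR is very sensitive to the conformational changes (e.g. between outward-facing and occluded states) that such transporter studies track.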
The study also demonstrated for the first time the feasibility of resolving equilibrium populations at multiple domains and their interdependence for global conformational changes in a large membrane protein complex.

Mass spectrometry

Native mass spectrometry has emerged as an important tool in structural biology. Advantages of mass spectrometry compared to other methods such as X-ray crystallography or nuclear magnetic resonance include its lower limits of detection, its speed and its capability to deal with heterogeneous samples. CEF contributed to the development of laser-induced liquid bead ion desorption mass spectrometry (LILBID), a method developed at Goethe University that is especially suited to the analysis of large membrane protein complexes. A challenge in native mass spectrometry is maintaining the features of the proteins of interest, such as oligomeric state, bound ligands, or the conformation of the protein complex, during the transfer from the solution to the gas phase. This is an essential prerequisite for drawing conclusions about the solution-state protein complex based on the gas-phase measurements. Therefore, soft ionization techniques are required. While standard methods, such as nESI and matrix-assisted laser desorption/ionization (MALDI), reliably deliver valuable results for soluble proteins, they are not universally applicable to the more challenging matrices which are often required for membrane protein complexes. Generally, an artificial membrane-mimetic environment is required to maintain a membrane protein complex in its native state outside of the cellular environment. With LILBID, the analyte is transferred into the mass spectrometer in small droplets (30 or 50 μm diameter) of the sample solution produced by a piezo-driven droplet generator and is desorbed from the aqueous solution by irradiation with a mid-IR laser. This results in biomolecular ions with lower, more native-like charge states in comparison to nESI.
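One routine use of such intact-complex masses is inferring subunit stoichiometry. A hedged sketch with hypothetical masses (illustrative only, not values from the CEF work):

```python
# Infer the oligomeric state of a complex from a native-MS mass measurement:
# find the integer copy number whose total subunit mass matches within a
# fractional tolerance. Masses below are hypothetical examples.
def stoichiometry(complex_mass_da, subunit_mass_da, tol=0.01):
    """Return the subunit copy number, or None if no integer multiple fits."""
    n = round(complex_mass_da / subunit_mass_da)
    if n >= 1 and abs(n * subunit_mass_da - complex_mass_da) / complex_mass_da <= tol:
        return n
    return None

# e.g. a measured 120 kDa complex built from 30 kDa subunits -> tetramer
print(stoichiometry(120_000, 30_000))  # -> 4
```

Real spectra add charge-state deconvolution and adduct corrections on top of this, but the stoichiometry logic is the same.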
At ultra-soft desorption conditions, even weakly interacting subunits of large protein complexes remain associated, so that the mass of the whole complex can be determined. At higher laser intensities, the complex dissociates by thermolysis and subunit masses are recorded. A broad range of macromolecular complexes from CEF research areas A, C and D, including complex I, ATP synthase, drug transporters with binding proteins, ion channels, proteorhodopsins and DNA/RNA complexes, have been analysed using LILBID.

Time-resolved spectroscopy

Femtosecond time-resolved spectroscopy was used by CEF scientists to study molecular dynamics and function. This method enables the observation of extremely fast chemical and biological reactions in real time, involving a wide variety of molecules from small organic compounds to complex enzymes. Studies included molecular systems such as optical switches, natural and non-natural photosynthetic model systems and membrane protein complexes. Fundamental processes in molecular physical chemistry were investigated, such as photoisomerization, energy and electron transfer, and reaction dynamics at surfaces. Modern methods in quantum optics for the generation of appropriately shaped and tunable femtosecond pulses in the visible and infrared spectral range were employed and further developed. Examples of these studies include the investigation and deciphering of the dynamics of photoswitchable or photolabile compounds as a basis for the design of photoresponsive biomacromolecules, of the primary reaction dynamics of channelrhodopsin-2 (ChR2), and of the conformational dynamics of antibiotic-binding aptamers. Photochromic spiropyrans, for example, are organic molecules that can be used to trigger biological reactions.
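Transient signals from such pump-probe experiments are typically analysed by fitting exponential decays to extract the lifetimes of intermediates. A minimal, noise-free sketch of the log-linear estimate (assumed lifetime, illustrative only):

```python
import math

# Ideal single-exponential transient S(t) = exp(-t / tau) with an assumed
# lifetime of 2 ps, sampled at a few pump-probe delay times.
tau_true = 2.0  # ps (hypothetical)
delays = [0.5 * i for i in range(1, 10)]          # delay times in ps
signal = [math.exp(-t / tau_true) for t in delays]

# Log-linear fit through the origin: ln S(t) = -t / tau
slope = sum(t * math.log(s) for t, s in zip(delays, signal)) / sum(t * t for t in delays)
tau_est = -1.0 / slope
print(tau_est)
```

Real transients usually need sums of several exponentials (global or target analysis) plus convolution with the instrument response, but the single-exponential case shows the core idea.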
Theoretical biophysics and bioinformatics

Method development in theoretical biophysics plays an increasingly important role in the study of macromolecular complexes and has made essential contributions to many studies in the other research areas of CEF. Bridging between fundamental physics, chemistry and biology, CEF scientists studied biomolecular processes over a broad resolution range, from quantum mechanics to chemical kinetics, from atomistic descriptions of physical processes and chemical reactions in molecular dynamics (MD) simulations to highly coarse-grained models of the non-equilibrium operation of molecular machines and network descriptions of protein interactions. Their goal was to develop detailed and quantitative descriptions of key biomolecular processes, including energy conversion, molecular transport, signal transduction and enzymatic catalysis. Within CEF, they worked in close collaboration with experimental scientists who employ a wide variety of methods. Their computational and theoretical studies aided in the interpretation of increasingly complex measurements and guided the design of future experiments. The interdisciplinary field of bioinformatics opened new perspectives on molecular processes and cellular function. CEF scientists used custom-tailored code and pipelines for fast and efficient analysis of omics data, with a primary focus on protein-RNA interactions and posttranscriptional regulation. They also developed algorithms to solve problems in molecular biology, ranging from atomic protein structure analysis to computational systems biology. Their tools leverage graph theory, Petri nets and Boolean networks, with broad applications within CEF. Their collaborations cover diverse topics, from plant metabolomics to human signal transduction networks and the dissection of the macromolecular complexome.

Organisation

The CEF Assembly coordinated the research and elected the CEF Speaker and the CEF Board of Directors.
The CEF Assembly consisted of the Principal Investigators, Adjunct Investigators, Senior Investigators as well as Associated Members. Speakers of CEF included Werner Müller-Esterl (Nov 2006 - Jan 2009), Harald Schwalbe (Feb 2009 - Feb 2013) and Volker Dötsch (March 2013 - October 2019).

Publications

CEF scientists published more than 2600 original research publications (incl. 479 research papers in journals with an impact factor of ≥10) during the Cluster's lifetime. A full list can be found here.

Honours and prizes awarded to CEF scientists

A full list can be found here.

References

External links

CEF website
Buchmann Institute for Molecular Life Sciences website
Goethe University Frankfurt website
Max Planck Institute of Biophysics website
Max Planck Institute for Brain Research website
Center for Biomolecular Magnetic Resonance (BMRZ) website
Deutsche Forschungsgemeinschaft website

2006 establishments in Germany
Goethe University Frankfurt
Nanotechnology institutions
Research institutes established in 2006
Research institutes in Germany
The International Linear Algebra Society (ILAS) is a professional mathematical society organized to promote research and education in linear algebra, matrix theory and matrix computation. It serves the international community through conferences, publications, prizes and lectures. Membership in ILAS is open to all mathematicians and scientists interested in furthering its aims and participating in its activities.

History

ILAS was founded in 1989. Its genesis occurred at the Combinatorial Matrix Analysis Conference held at the University of Victoria in British Columbia, Canada, May 20–23, 1987, hosted by Dale Olesky and Pauline van den Driessche. ILAS was initially known as the International Matrix Group, founded in 1987. The founding officers of ILAS were Hans Schneider, President; Robert C. Thompson, Vice President; Daniel Hershkowitz, Secretary; and James R. Weaver, Treasurer.

ILAS Conferences

The inaugural meeting of ILAS took place at Brigham Young University (including one day at the Sundance Mountain Resort) in Provo, Utah, USA, from August 12–15, 1989. The organizing committee consisted of Wayne Barrett, Daniel Hershkowitz, Charles Johnson, Hans Schneider, and Robert C. Thompson. Much additional support came from Don Robinson, Chair of the BYU Mathematics Department, and James R. Weaver, ILAS Treasurer. The conference received support from Brigham Young University, the National Security Agency, and the National Science Foundation. There were 85 attendees at the conference from 15 countries, including Olga Taussky-Todd, a renowned mathematician in matrix theory. The proceedings of the conference appeared in volume 150 of the journal Linear Algebra and Its Applications. The 2nd ILAS conference was held in Lisbon, Portugal, August 3–7, 1992. The chair of the organizing committee was José Dias da Silva. There were 150 participants from 27 countries, and the conference was supported by 11 different organizations.
The proceedings of the conference can be found in volumes 197-198 of Linear Algebra and Its Applications. ILAS conferences were held the next 4 years, alternating between the United States and Europe, before beginning the standard pattern of holding the Conference two of every three years (with a few exceptions). The number of participants at each ILAS conference has grown steadily through the years. The first ILAS conference outside of the United States and Europe was held in Haifa, Israel in 2001. The first in the Far East was in Shanghai in 2007 and the first in Latin America was in Cancun, Mexico in 2008. The complete list of locations hosting ILAS conferences follows: 1. Provo, Utah, USA (1989) 2. Lisbon, Portugal (1992) 3. Pensacola, Florida, USA (1993) 4. Rotterdam, The Netherlands (1994) 5. Atlanta, Georgia, USA (1995) 6. Chemnitz, Germany (1996) 7. Madison, Wisconsin, USA (1998) 8. Barcelona, Spain (1999) 9. Haifa, Israel (2001) 10. Auburn, Alabama, USA (2002) 11. Coimbra, Portugal (2004) 12. Regina, Saskatchewan, Canada (2005) 13. Amsterdam, the Netherlands (2006) 14. Shanghai, China (2007) 15. Cancun, Mexico (2008) 16. Pisa, Italy (2010) 17. Braunschweig, Germany (2011) 18. Providence, Rhode Island, USA (2013) 19. Seoul, Korea (2014) 20. Leuven, Belgium (2016) 21. Ames, Iowa, USA (2017) 22. Rio de Janeiro, Brazil (2019) 23. Virtual (originally planned for New Orleans, Louisiana, USA) (2021) 24. Galway, Ireland (2022) 25. Madrid, Spain (2023) 26. Kaohsiung, Taiwan (2025) Prizes and Special Lectures ILAS has three prizes named after giants in Linear Algebra. The Hans Schneider Prize. A distinctive feature of the 3rd ILAS meeting held at the University of West Florida in Pensacola, Florida, March 17–20, 1993, was the institution of the Hans Schneider Prize. This prize was initiated thanks to a donation to ILAS from Hans Schneider, the first president of ILAS and a founding editor of the journal Linear Algebra and Its Applications. 
Typically, the prize is awarded every 3 years and has evolved as a prize to recognize a person's career. The ILAS Taussky–Todd Prize. Olga Taussky-Todd and John Todd have had a decisive impact on the development of theoretical and numerical linear algebra for over half a century. The ILAS Taussky–Todd Prize honors them for their many and varied mathematical achievements and for their efforts in promoting linear algebra and matrix theory. The prize is awarded once every three to four years recognizing a linear algebra researcher in their mid career. The ILAS Taussky–Todd Prize was originally referred to as the Taussky–Todd lecture, and was instituted at the 3rd ILAS meeting held at the University of West Florida in Pensacola, Florida, March 17–20, 1993. The ILAS Richard A. Brualdi Early Career Prize. The prize is named for Richard A. Brualdi, who has had a major impact on the field, especially in combinatorial matrix theory. In addition, he has been instrumental to the success of ILAS since its inception. The ILAS Richard A. Brualdi Early Career Prize was instituted in 2021 and is awarded every three years to an outstanding early career researcher in the field of linear algebra, for distinguished contributions to the field. In addition ILAS awards Special Lectures at ILAS conferences as well as conferences of collaborating mathematics organizations. Publications ILAS publishes an electronic journal - the Electronic Journal of Linear Algebra (ELA), founded in 1996. The first Editors-in-Chief were Volker Mehrmann and Daniel Hershkowitz. ELA is a platinum open access journal, meaning that it is free to all: no subscription and no article processing fee or page charges. ELA is an all-electronic journal that welcomes high quality mathematical articles that contribute new insights to matrix analysis and the various aspects of linear algebra and its applications. 
ELA sets high refereeing standards; conventional peer review of articles is carried out electronically. ILAS also produces and distributes IMAGE, a semiannual electronic bulletin founded in 1988 with Robert C. Thompson as its first Editor. IMAGE contains: essays related to linear algebra activities; feature articles; interviews of linear algebra experts; book reviews; brief reports on conferences; ILAS business notices; announcements of upcoming workshops and conferences; problems and solutions; and news about individual members. Presidents Hans Schneider, 1987–1996 Richard A. Brualdi, 1996–2002 Daniel Hershkowitz, 2002–2008 Stephen Kirkland, 2008–2014 Peter Šemrl, 2014–2020 Daniel B. Szyld, 2020–present Collaborations with other mathematics organizations ILAS collaborates with the Society for Industrial and Applied Mathematics (SIAM), the American Mathematical Society (AMS) and the International Workshop on Operator Theory and its Applications (IWOTA). The collaboration with SIAM started in 1999. The SIAM Activity Group on Linear Algebra (SIAG/LA) holds a conference every three years (when the year minus 2000 is divisible by 3). As part of the agreement, and to encourage interaction between ILAS and SIAG/LA members, the two societies do not hold conferences in the same year. As a result, ILAS holds conferences two out of every three years. In addition, the two societies exchange speakers, with ILAS sponsoring two ILAS speakers at every triennial SIAM Applied Linear Algebra (SIAM ALA) meeting (organized by SIAG/LA) and with SIAM sponsoring a SIAM speaker at every ILAS conference. The first ILAS speakers at a SIAM ALA meeting were Hans Schneider and Hugo Woerdeman in 2000, and the first SIAM speakers at an ILAS conference were Michele Benzi and Misha Kilmer in 2002. The collaboration with AMS started in late 2020 with the establishment of ILAS as a partner in the Joint Mathematics Meetings (JMM). 
In this capacity ILAS will support a speaker for the "ILAS Lecture" at the JMM to be selected by ILAS. In addition, at least four special sessions at the JMM will be identified as ILAS special sessions, the contents of which will be determined by ILAS. The partnership took effect starting with the JMM 2022 held virtually. The collaboration with IWOTA started in 2017 with the establishment of the Israel Gohberg ILAS-IWOTA Lecture, which is funded by donations. This lecture series consists of biennial lectures either at an ILAS conference or at an IWOTA meeting. Israel Gohberg was the founding president of IWOTA and an active member of ILAS. The first Israel Gohberg ILAS-IWOTA Lecturer was Vern Paulsen at the 2021 IWOTA Lancaster UK meeting. References External links International Linear Algebra Society (ILAS) home page Electronic Journal of Linear Algebra (ELA) home page Linear algebra Matrix theory Mathematical societies Mathematics conferences Organizations established in 1989
International Linear Algebra Society
[ "Mathematics" ]
1,858
[ "Linear algebra", "Algebra" ]
63,337,222
https://en.wikipedia.org/wiki/Endangered%20species%20%28IUCN%20status%29
Endangered species, as classified by the International Union for Conservation of Nature (IUCN), are species which have been categorized as very likely to become extinct in their known native ranges in the near future. On the IUCN Red List, endangered is the second-most severe conservation status for wild populations in the IUCN's schema after critically endangered. In 2012, the IUCN Red List featured 3,079 animal and 2,655 plant species as endangered worldwide. The figures for 1998 were 1,102 and 1,197 respectively. IUCN Red List The IUCN Red List is a list of species which have been assessed according to a system of assigning a global conservation status. According to the latest system used by the IUCN, a species can be "Data Deficient" (DD) – a species for which more data and assessment are required before its situation can be determined – or it can be comprehensively assessed by the IUCN's species assessment process. A species can be "Near Threatened" (NT) or "Least Concern" (LC); these are species considered to have relatively robust and healthy populations, according to the assessment authors. "Endangered" (EN) species lie between "Vulnerable" (VU) and "Critically Endangered" (CR) species. A species must adhere to certain criteria in order to be placed in any of the aforementioned conservation status categories, according to the assessment. "Threatened" is a category including all those species determined to be Vulnerable, Endangered or Critically Endangered. Although in general conversation the terms "endangered species" and "threatened species" may mean other things, for the purposes of the current IUCN system, the List uses the terms "endangered" and "threatened" to denote species to which certain criteria apply. Note that older status systems, or other systems such as national ones, may use different criteria. 
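The ordering of assessed categories described above can be sketched in code. This is an illustrative model only, not an official IUCN tool; the integer ordering and the helper name `is_threatened` are assumptions introduced here:

```python
# Illustrative sketch of the IUCN Red List category ordering described above.
# The category codes (LC, NT, VU, EN, CR) are from the Red List; the integer
# ordering and helper function are assumptions for illustration only.
from enum import IntEnum

class RedListCategory(IntEnum):
    """Assessed categories, ordered from least to most severe."""
    LC = 0  # Least Concern
    NT = 1  # Near Threatened
    VU = 2  # Vulnerable
    EN = 3  # Endangered
    CR = 4  # Critically Endangered

def is_threatened(cat: RedListCategory) -> bool:
    """'Threatened' groups Vulnerable, Endangered and Critically Endangered."""
    return cat >= RedListCategory.VU

# Endangered lies between Vulnerable and Critically Endangered:
assert RedListCategory.VU < RedListCategory.EN < RedListCategory.CR
assert is_threatened(RedListCategory.EN)
assert not is_threatened(RedListCategory.NT)
```

Note that "Data Deficient" is deliberately left out of the ordering, since it marks species not yet assessable rather than a severity level.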
As more information becomes available, or as the conservation status criteria have changed, numerous species have been re-assessed as not endangered; nonetheless, the total number of species considered endangered has increased as more new species are assessed for the first time each year. Criteria for endangered status According to the 3.1 version of the IUCN conservation status system from 2001, a species is listed as endangered when it meets any of the following criteria from A to E. A) Reduction in population size based on any of the following: 1. An observed, estimated, inferred or suspected population size reduction of ≥ 70% over the last 10 years or three generations, whichever is the longer, where the causes of the reduction are reversible AND understood AND ceased, based on (and specifying) any of the following: a. direct observation b. an index of abundance appropriate for the taxon c. a decline in area of occupancy, extent of occurrence or quality of habitat d. actual or potential levels of exploitation e. the effects of introduced taxa, hybridisation, pathogens, pollutants, competitors or parasites. 2. An observed, estimated, inferred or suspected population size reduction of ≥ 50% occurred over the last 10 years or three generations, whichever is the longer, where the reduction or its causes may not have ceased OR may not be understood OR may not be reversible, based on (and specifying) any of (a) to (e) under A1. 3. A population size reduction of ≥ 50%, projected or suspected to be met within the next 10 years or three generations, whichever is the longer (up to a maximum of 100 years), based on (and specifying) any of (b) to (e) under A1. 4. 
An observed, estimated, inferred, projected or suspected population size reduction of ≥ 50% over any 10 year or three-generation period, whichever is longer (up to a maximum of 100 years in the future), where the time period must include both the past and the future, and where the reduction or its causes may not have ceased OR may not be understood OR may not be reversible, based on (and specifying) any of (a) to (e) under A1. B) Geographic range in the form of either B1 (extent of occurrence) OR B2 (area of occupancy) OR both: 1. Extent of occurrence estimated to be less than 5,000 km2, and estimates indicating at least two of a-c: a. Severely fragmented or known to exist at no more than five locations. b. Continuing decline, inferred, observed or projected, in any of the following: i. extent of occurrence ii. area of occupancy iii. area, extent or quality of habitat iv. number of locations or subpopulations v. number of mature individuals c. Extreme fluctuations in any of the following: i. extent of occurrence ii. area of occupancy iii. number of locations or subpopulations iv. number of mature individuals 2. Area of occupancy estimated to be less than 500 km2, and estimates indicating at least two of a-c: a. Severely fragmented or known to exist at no more than five locations. b. Continuing decline, inferred, observed or projected, in any of the following: i. extent of occurrence ii. area of occupancy iii. area, extent or quality of habitat iv. number of locations or subpopulations v. number of mature individuals c. Extreme fluctuations in any of the following: i. extent of occurrence ii. area of occupancy iii. number of locations or subpopulations iv. number of mature individuals C) Population estimated to number fewer than 2,500 mature individuals and either: 1. An estimated continuing decline of at least 20% within five years or two generations, whichever is longer, (up to a maximum of 100 years in the future) OR 2. 
A continuing decline, observed, projected, or inferred, in numbers of mature individuals AND at least one of the following (a-b): a. Population structure in the form of one of the following: i. no subpopulation estimated to contain more than 250 mature individuals, OR ii. at least 95% of mature individuals in one subpopulation b. Extreme fluctuations in the number of mature individuals D) Population size estimated to number fewer than 250 mature individuals. E) Quantitative analysis showing the probability of extinction in the wild is at least 20% within 20 years or five generations, whichever is the longer (up to a maximum of 100 years). See also Lists of IUCN Red List endangered species List of endangered amphibians List of endangered arthropods List of endangered birds List of endangered fishes List of endangered insects List of endangered invertebrates List of endangered mammals List of endangered molluscs List of endangered reptiles List of Chromista by conservation status List of fungi by conservation status References External links List of species with the category Endangered as identified by the IUCN Red List of Threatened Species Biota by conservation status IUCN Red List Environmental conservation Habitat IUCN Red List endangered species
Endangered species (IUCN status)
[ "Biology" ]
1,443
[ "Biota by conservation status", "Endangered species", "Biodiversity" ]
63,338,386
https://en.wikipedia.org/wiki/2M1510
2MASS J15104761–2818234, sometimes shortened to 2M1510, is a triple or possibly quadruple brown dwarf system, consisting of the eclipsing binary 2M1510A and the wide companion 2M1510B. 2M1510A was found to be an eclipsing binary in the first light data of the SPECULOOS telescopes. It is only the second eclipsing binary brown dwarf found so far (as of March 2020); the other is 2M0535-05. The system verified theoretical models for how brown dwarfs cool. The system is located 120 light-years away from Earth in the constellation Libra. Signs of youth 2M1510A has hydrogen-alpha emission lines, which is interpreted as a sign of youth. The system also belongs to the young Argus moving group, and the brown dwarfs have a low surface gravity, which is an additional indicator of youth. The brown dwarfs 2M1510A and 2M1510B are separated by 250 astronomical units, making them a resolved binary in 2MASS data. The components of the inner eclipsing binary are called 2M1510Aa and 2M1510Ab. Despite the lower-case letters used in this designation, these objects are not planets but brown dwarfs that burn deuterium. 2M1510A is not only an eclipsing binary, but also a double-lined spectroscopic binary. This was discovered by follow-up observations with Keck II. Follow-up observations with Keck II and the VLT UT2 showed that 2M1510Aa and 2M1510Ab have very similar masses, something that is called a near equal-mass binary. The pair orbits each other every 20.9 days. Additionally, the 2M1510A source has an elongated point spread function in VLT/SINFONI data. The naming of the brown dwarfs in Calissendorff et al. 2019 does not follow other works and the companion was called 2M1510B (here from now on: 2M1510B'). 2M1510B' is separated by about 4.4 au from 2M1510A and orbits the eclipsing binary every 30 years. This result was not considered by Triaud et al. 
2020 and it could represent a contamination of the eclipsing binary, making a test of the cooling models more challenging. See also List of nearby stellar associations and moving groups W2150AB another wide binary other triple brown dwarf systems: DENIS-P J020529.0−115925 2MASS J08381155+1511155 VHS J1256–1257 2MASS J0920+3517 References M-type brown dwarfs Eclipsing binaries Libra (constellation) Triple star systems J15104761–2818234
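As a rough plausibility check of the orbital figures quoted above (a separation of about 4.4 au and a 30-year period for 2M1510B' around the eclipsing binary), Kepler's third law in solar units gives the total enclosed mass. The calculation below is illustrative only and not from the cited papers:

```python
# Kepler's third law in solar units: M_total [M_sun] = a^3 [au] / P^2 [yr^2].
# Values are the separation and period quoted above for 2M1510B's orbit.
a_au = 4.4    # separation of 2M1510B' from the eclipsing binary (au)
p_yr = 30.0   # orbital period (years)

m_total_msun = a_au**3 / p_yr**2
m_total_mjup = m_total_msun * 1047.6  # ~1047.6 Jupiter masses per solar mass

# ~0.095 M_sun (~99 M_Jup) in total: consistent with three brown dwarfs,
# each below the ~80 M_Jup hydrogen-burning limit.
print(f"Total mass ≈ {m_total_msun:.3f} M_sun ≈ {m_total_mjup:.0f} M_Jup")
```

The result, roughly a tenth of a solar mass shared among three bodies, is consistent with the objects being brown dwarfs rather than stars.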
2M1510
[ "Astronomy" ]
631
[ "Libra (constellation)", "Constellations" ]
63,339,701
https://en.wikipedia.org/wiki/Ideal%20electrode
In electrochemistry, there are two types of ideal electrode, the ideal polarizable electrode and the ideal non-polarizable electrode. Simply put, the ideal polarizable electrode is characterized by charge separation at the electrode-electrolyte boundary and is electrically equivalent to a capacitor, while the ideal non-polarizable electrode is characterized by no charge separation and is electrically equivalent to a short. Ideal polarizable electrode An ideal polarizable electrode (also ideally polarizable electrode or ideally polarized electrode or IPE) is a hypothetical electrode characterized by an absence of net DC current between the two sides of the electrical double layer, i.e., no faradaic current exists between the electrode surface and the electrolyte. Any transient current that may be flowing is considered non-faradaic. The reason for this behavior is that the electrode reaction is infinitely slow, with zero exchange current density, so the electrode behaves electrically as a capacitor. The concept of ideal polarizability was first introduced by F. O. Koenig in 1934. Ideal non-polarizable electrode An ideal non-polarizable electrode is a hypothetical electrode in which a faradaic current can freely pass (without polarization). Its potential does not change from its equilibrium potential upon application of current. The reason for this behavior is that the electrode reaction is infinitely fast, having an infinite exchange current density, so the electrode behaves as an electrical short. Real examples of nearly ideal electrodes The classical examples of the two nearly ideal types of electrodes, polarizable and non-polarizable, are the mercury droplet electrode in contact with an oxygen-free KCl solution and the silver/silver chloride electrode, respectively. References Electrochemistry Electrodes
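The capacitor-versus-short equivalence can be illustrated numerically. The double-layer capacitance value below (20 µF, a typical textbook order of magnitude for a 1 cm² electrode) is an assumption for illustration, not a value from the text:

```python
import math

# Ideal polarizable electrode: equivalent to a capacitor, so its impedance
# magnitude |Z| = 1 / (2*pi*f*C) diverges as frequency f -> 0, which is why
# no DC faradaic current can pass. The ideal non-polarizable electrode is
# equivalent to a short: |Z| = 0 at every frequency.
C = 20e-6  # assumed double-layer capacitance in farads (illustrative)

def z_polarizable(f_hz: float) -> float:
    """|Z| of an ideal polarizable electrode modelled as a pure capacitor."""
    return 1.0 / (2 * math.pi * f_hz * C)

def z_nonpolarizable(f_hz: float) -> float:
    """|Z| of an ideal non-polarizable electrode modelled as a short."""
    return 0.0

# The capacitive impedance grows without bound toward DC, blocking current,
# while the short passes current freely at all frequencies.
assert z_polarizable(0.001) > z_polarizable(1.0) > z_polarizable(1000.0)
assert z_nonpolarizable(0.001) == z_nonpolarizable(1000.0) == 0.0
```

Real electrodes fall between these two limits, which is why the mercury droplet and silver/silver chloride electrodes are described only as *nearly* ideal.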
Ideal electrode
[ "Chemistry" ]
365
[ "Physical chemistry stubs", "Electrochemistry", "Electrodes", "Electrochemistry stubs" ]
70,548,926
https://en.wikipedia.org/wiki/Light%20in%20painting
Light in painting fulfills several objectives, both plastic and aesthetic: on the one hand, it is a fundamental factor in the technical representation of the work, since its presence determines the vision of the projected image, as it affects certain values such as color, texture and volume; on the other hand, light has a great aesthetic value, since its combination with shadow and with certain lighting and color effects can determine the composition of the work and the image that the artist wants to project. Also, light can have a symbolic component, especially in religion, where this element has often been associated with divinity. The incidence of light on the human eye produces visual impressions, so its presence is indispensable for the perception of art. At the same time, light is intrinsically found in painting, since it is indispensable for the composition of the image: the play of light and shadow is the basis of drawing and, in its interaction with color, is the primordial aspect of painting, with a direct influence on factors such as modeling and relief. The technical representation of light has evolved throughout the history of painting, and various techniques have been created over time to capture it, such as shading, chiaroscuro, sfumato, or tenebrism. On the other hand, light has been a particularly determining factor in various periods and styles, such as the Renaissance, the Baroque, Impressionism, or Fauvism. The greater emphasis given to the expression of light in painting is called "luminism", a term generally applied to various styles such as Baroque tenebrism and impressionism, as well as to various movements of the late 19th century and early 20th century such as American, Belgian, and Valencian luminism. 
Optics Light (ultimately from Proto-Indo-European *lewktom, with the meaning "brightness") is electromagnetic radiation with a wavelength between 380 nm and 750 nm, the part of the visible spectrum that is perceived by the human eye, located between infrared and ultraviolet radiation. It consists of massless elementary particles called photons, which move at a speed of 299 792 458 m/s in a vacuum, while its speed in matter depends on the medium's refractive index. The branch of physics that studies the behavior and characteristics of light is optics. Light is the physical agent that makes objects visible to the human eye. Its origin can be in celestial bodies such as the Sun, the Moon, or the stars, natural phenomena such as lightning, or in materials in combustion, ignition, or incandescence. Throughout history, human beings have devised different procedures to obtain light in spaces lacking it, such as torches, candles, candlesticks, lamps or, more recently, electric lighting. Light is both the agent that enables vision and a visible phenomenon in itself, since light is also an object perceptible by the human eye. Light enables the perception of color, which reaches the retina through light rays that the retina transmits to the optic nerve, which in turn transmits them to the brain by means of nerve impulses. The perception of light is a psychological process, and each person perceives the same physical object and the same luminosity in a different way. Physical objects have different levels of luminance (or reflectance), that is, they absorb or reflect to a greater or lesser extent the light that strikes them, which affects the color, from white (maximum reflection) to black (maximum absorption). Neither black nor white is considered a color of the conventional chromatic circle; they are gradations of brightness and darkness, whose transitions make up the shadows. 
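The figures above can be tied together in a short sketch. The visible range and vacuum speed are as given in the text; the refractive index for water used below (n ≈ 1.33) is a common reference value added for illustration:

```python
# Visible range and speed of light in a medium, per the definitions above.
C_VACUUM = 299_792_458  # speed of light in a vacuum, m/s

def is_visible(wavelength_nm: float) -> bool:
    """Visible spectrum as given above: roughly 380-750 nm."""
    return 380 <= wavelength_nm <= 750

def speed_in_medium(n: float) -> float:
    """Speed of light in matter depends on the refractive index n: v = c / n."""
    return C_VACUUM / n

assert is_visible(550)       # green light, mid-spectrum
assert not is_visible(300)   # ultraviolet: shorter than 380 nm
assert not is_visible(900)   # infrared: longer than 750 nm

# In water (n ~ 1.33, a common reference value) light travels slower:
assert speed_in_medium(1.33) < C_VACUUM
```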
When white light hits a surface of a certain color, photons of that color are reflected; if these photons subsequently hit another surface they will illuminate it with the same color, an effect known as radiance — generally perceptible only with intense light. If that object is in turn the same color, it will reinforce its level of colored luminosity, i.e. its saturation. White light from the sun consists of a continuous spectrum of colors which, when divided, forms the colors of the rainbow: violet, indigo blue, blue, green, yellow, orange, and red. In its interaction with the Earth's atmosphere, sunlight tends to scatter the shorter wavelengths, i.e. the blue photons, which is why the sky is perceived as blue. On the other hand, at sunset, when the atmosphere is denser, the light is less scattered, so that the longer wavelengths, red, are perceived. Color is a specific wavelength of white light. The colors of the chromatic spectrum have different shades or tones, which are usually represented in the chromatic circle, where the primary colors and their derivatives are located. There are three primary colors: lemon yellow, magenta red, and cyan blue. If they are mixed, the three secondary colors are obtained: orange red, bluish violet, and green. If a primary and a secondary are mixed, the tertiary colors are obtained: greenish blue, orange yellow, etc. On the other hand, complementary colors are two colors that are on opposite sides of the chromatic circle (green and magenta, yellow and violet, blue and orange) and adjacent colors are those that are close within the circle (yellow and green, red and orange). If a color is mixed with an adjacent color, it is shaded, and if it is mixed with a complementary color, it is neutralized (darkened). 
Three factors are involved in the definition of color: hue, the position within the chromatic circle; saturation, the purity of the color, which is involved in its brightness – the maximum saturation is that of a color that has no mixture with black or its complementary; and value, the level of luminosity of a color, increasing when mixed with white and decreasing when mixed with black or a complementary. The main source of light is the Sun and its perception can vary according to the time of day: the most normal is mid-morning or mid-afternoon light, generally blue, clear and diaphanous, although it depends on atmospheric dispersion and cloudiness and other climatic factors; midday light is whiter and more intense, with high contrast and darker shadows; dusk light is more yellowish, soft and warm; sunset light is orange or red, low contrast, with intense bluish shadows; evening light is a darker red, dimmer light, with weaker shadows and contrast (the moment known as alpenglow, which occurs in the eastern sky on clear days, gives pinkish tones); the light of cloudy skies depends on the time of day and the degree of cloudiness, is a dim and diffuse light with soft shadows, low contrast and high saturation (in natural environments there can be a mixture of light and shadow known as "mottled light"); finally, night light can be lunar or some atmospheric refraction of sunlight, is diffuse and dim (in contemporary times there is also light pollution from cities). We must also point out the natural light that filters indoors, a diffuse light of lower intensity, with a variable contrast depending on whether it has a single origin or several (for example, several windows), as well as a coloring also variable, depending on the time of day, the weather or the surface on which it is reflected. 
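The three factors named above (hue, saturation, value) correspond to the HSV colour model, which Python's standard `colorsys` module can illustrate; taking the complementary colour as the opposite point on the chromatic circle (a half-turn of the hue) is the usual convention and is assumed here:

```python
import colorsys

# Hue, saturation and value of a pure cyan blue (RGB components in 0-1 range).
h, s, v = colorsys.rgb_to_hsv(0.0, 1.0, 1.0)
assert abs(h - 0.5) < 1e-9   # hue: cyan sits halfway round the chromatic circle
assert s == 1.0              # saturation: pure, no mixture with white or black
assert v == 1.0              # value: maximum level of luminosity

# Complementary colour: the opposite side of the chromatic circle.
h_comp = (h + 0.5) % 1.0
r, g, b = colorsys.hsv_to_rgb(h_comp, s, v)
assert (round(r), round(g), round(b)) == (1, 0, 0)  # complement of cyan is red
```

Mixing toward white or black, as the text describes, corresponds to raising or lowering the value component while reducing saturation.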
An outstanding interior light is the so-called "north light", which is the light that enters through a north-facing window, which does not come directly from the sun (which, in the northern hemisphere, is always to the south) and is therefore a soft and diffuse, constant and homogeneous light, much appreciated by artists in times when there was no adequate artificial lighting. As for artificial light, the main ones are: fire and candles, red or orange; electric, yellow or orange – generally tungsten or wolfram – it can be direct (focal) or diffused by lamp shades; fluorescent, greenish; and photographic, white (flash light). Logically, in many environments there can be mixed light, a combination of natural and artificial light. The visible reality is made up of a play of light and shadow: the shadow is formed when an opaque body obstructs the path of the light. In general, there is a ratio between light and shadow whose gradation depends on various factors, from lighting to the presence and placement of various objects that can generate shadows; however, there are conditions in which one of the two factors can reach the extreme, as in the case of snow or fog or, conversely, at night. We speak of high key lighting when white or light tones predominate, or low key lighting if black or dark tones predominate. Shadows can be of shape (also called "self shadows") or of projection ("cast shadows"): the former are the shaded areas of a physical object, that is, the part of that object on which light does not fall; the latter are the shadows cast by these objects on some surface, usually the ground. Self shadows define the volume and texture of an object; cast shadows help define space. The darkest part of the shadow is the "umbra" and the lighter, partially lit fringe is the "penumbra". The shape and appearance of the shadow depend on the size and distance of the light source: the most pronounced shadows are from small or distant sources, while a large or close source will give more diffuse shadows. 
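The dependence of shadow softness on the size and distance of the light source follows from similar triangles. The sketch below is illustrative geometry, not from the text's sources: for an extended source, the penumbra band cast past an occluding edge widens with the source's diameter and with the surface's distance behind the occluder:

```python
def penumbra_width(source_diameter: float,
                   source_to_occluder: float,
                   occluder_to_surface: float) -> float:
    """Width of the penumbra band past an occluding edge, by similar
    triangles: a larger or closer light source smears the shadow edge more."""
    return source_diameter * occluder_to_surface / source_to_occluder

# A small or distant source gives a sharp-edged shadow (narrow penumbra)...
sharp = penumbra_width(source_diameter=0.01, source_to_occluder=10.0,
                       occluder_to_surface=1.0)
# ...while a large or close source gives a diffuse edge (wide penumbra).
diffuse = penumbra_width(source_diameter=0.5, source_to_occluder=1.0,
                         occluder_to_surface=1.0)
assert sharp < diffuse
```

This is the geometric reason the text gives for hard-edged shadows under small or distant sources and soft edges under large or near ones.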
In the first case, the shadow will have sharp edges and the dark core (the umbra) will occupy most of it; in the second, the edge will be more diffuse and the penumbra will predominate. A shadow can receive illumination from a secondary source, known as "fill light". The color of a shadow is between blue and black, and also depends on several factors, such as light contrast, transparency and translucency. The projection of shadows is different if they come from natural or artificial light: with natural light the beams are parallel and the shadow adapts both to the terrain and to the various obstacles that may intervene; with artificial light the beams are divergent, with less defined limits, and if there are several light sources, combined shadows may be produced. The reflection of light produces four derived phenomena: glints, which are reflections of the light source, be it the Sun, artificial lights or incidental sources such as doors and windows; glares, which are reflections produced by illuminated bodies as a reflective screen, especially white surfaces; color reflections, produced by the proximity between various objects, especially if they are luminous; and image reflections, produced by polished surfaces, such as mirrors or water. Another phenomenon produced by light is transparency, which occurs in bodies that are not opaque, with a greater or lesser degree depending on the opacity of the object, from total transparency to varying degrees of translucency. Transparency generates filtered light, a type of luminosity that can also be produced through curtains, blinds, awnings, various fabrics, pergolas and arbors, or through the foliage of trees. Pictorial representation of light In artistic terminology, "light" is the point or center of light diffusion in the composition of a painting, or the luminous part of a painting in relation to the shadows. 
This term is also used to describe the way a painting is illuminated: zenithal or plumb light (vertical rays), high light (oblique rays), straight light (horizontal rays), workshop or studio light (artificial light), etc. The term "accidental light" is also used to refer to light not produced by the Sun, which can be either moonlight or artificial light from candles, torches, etc. The light can come from different directions, which according to its incidence can be differentiated between: "lateral", when it comes from the side, it is a light that highlights more the texture of the objects; "frontal", when it comes from the front, it eliminates the shadows and the sensation of volume; "zenithal", a vertical light of higher origin than the object, it produces a certain deformation of the figure; "contrapicado", vertical light of lower origin, it deforms the figure in an exaggerated way; and "backlight", when the origin is behind the object, thus darkening and diluting its silhouette. In relation to the distribution of light in the painting, it can be: "homogeneous", when it is distributed equally; "dual", in which the figures stand out against a dark background; or "insertive", when light and shadows are interrelated. According to its origin, light can be intrinsic ("own or autonomous light"), when the light is homogeneous, without luminous effects, directional lights or contrasts of lights and shadows; or extrinsic ("illuminating light"), when it presents contrasts, directional lights and other objective sources of light. The first occurred mainly in Romanesque and Gothic art, and the second especially in the Renaissance and Baroque. 
In turn, the illuminating light can occur in different ways: "focal light", when it directly presents a light-emitting object ("tangible light") or comes from an external source that illuminates the painting ("intangible light"); "diffuse light", which blurs the contours, as in Leonardo's sfumato; "real light", which aims to realistically capture sunlight, an almost utopian attempt to which artists such as Claude Lorrain, J. M. W. Turner and the Impressionists especially devoted themselves; and "unreal light", which has no natural or scientific basis and is closer to a symbolic light, as in the illumination of religious figures. As for the artist's intention, light can be "compositional", when it helps the composition of the painting, as in all the previous cases; or "conceptual light", when it serves to enhance the message, for example by illuminating a certain part of the painting and leaving the rest in semi-darkness, as Caravaggio used to do. In terms of its origin, light can be "natural ambient light", in which no shadows of figures or objects appear, or "projected light", which generates shadows and serves to model the figures. It is also important to differentiate between source and focus of light: the source of light in a painting is the element that radiates the light, be it the sun, a candle or any other; the focus of light is the part of the painting that has the most luminosity and radiates it around the painting. On the other hand, in relation to the shadow, the interrelation between light and shadow is called "chiaroscuro"; if the dark area is larger than the illuminated one, it is called "tenebrism". Light in painting plays a decisive role in the composition and structuring of the painting. Unlike in architecture and sculpture, where light is real, the light of the surrounding space, in painting light is represented, so it responds to the will of the artist both in its physical and aesthetic aspect. 
The painter determines the illumination of the painting, that is to say, the origin and incidence of the light, which marks the composition and expression of the image. In turn, the shadow provides solidity and volume, while it can generate dramatic effects of various kinds. In the pictorial representation of light it is essential to distinguish its nature (natural, artificial) and to establish its origin, intensity and chromatic quality. Natural light depends on various factors, such as the season of the year, the time of day (auroral, diurnal, twilight or nocturnal light – from the Moon or stars) or the weather. Artificial light, on the other hand, differs according to its origin: a candle, a torch, a fluorescent, a lamp, neon lights, etc. As for the origin, it can be focused or act in a diffuse way, without a determined origin. The chromatism of the image depends on the light, since depending on its incidence an object can have different tonalities, as well as the reflections, ambiances and shadows projected. In an illuminated image the color is considered saturated at the correct level of illumination, while the color in shadow will always have a darker tonal value and will be the one that determines the relief and volume. Light is linked to space, so in painting it is intimately linked to perspective, the way of representing a three-dimensional space in a two-dimensional support such as painting. Thus, in linear perspective, light fulfills the function of highlighting objects, of generating volume, through modeling, in the form of luminous gradations; while in aerial perspective, the effects of light are sought as they are perceived by the spectator in the environment, as another element present in the physical reality represented. The light source can be present in the painting or not, it can have a direct or indirect origin, internal or external to the painting. 
The light defines the space through the modeling of volumes, which is achieved with the contrast between light and shadow: the relationship between the values of light and shadow defines the volumetric characteristics of the form, with a scale of values that can range from a soft fade to a hard contrast. Spatial limits can be objective, when they are produced by people, objects, architectures, natural elements and other factors of corporeality; or subjective, when they come from sensations such as atmosphere, depth, a hollow, an abyss, etc. In human perception, light creates closeness and darkness creates remoteness, so that a light-darkness gradient gives a sensation of depth.

Aspects such as contrast, relief, texture, volume, gradients or the tactile quality of the image depend on light. The play of light and shadow helps to define the location and orientation of objects in space. For their correct representation, their shape, density and extension, as well as their differences in intensity, must be taken into account. It should also be kept in mind that, apart from its physical qualities, light can generate dramatic effects and give the painting a certain emotional atmosphere. Contrast is a fundamental factor in painting; it is the language with which the image is shaped. There are two types of contrast: the "luminous", which can be by chiaroscuro (light and shadow) or by surface (a point of light that shines brighter than the rest); and the "chromatic", which can be tonal (contrast between two tones) or by saturation (a bright color with a neutral one). The two types of contrast are not mutually exclusive; in fact, they coincide in the same image most of the time. Contrast can have different levels of intensity, and its regulation is the artist's main tool to achieve the appropriate expression for his work.
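The idea of a scale of tonal values running from shadow to light can be made concrete with a short Python sketch. This is an illustrative aside, not part of the text: the palette, its names, and the use of relative luminance (the ITU-R BT.709 weighting) as a stand-in for a color's tonal value are all assumptions of the example.

```python
def tonal_value(rgb):
    """Relative luminance (ITU-R BT.709 weights) as a rough proxy
    for a color's tonal value, with components in the 0.0-1.0 range."""
    r, g, b = rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

# A hypothetical three-step palette, to be ordered from shadow to light.
palette = {
    "shadow blue": (0.10, 0.10, 0.30),
    "mid red": (0.70, 0.20, 0.20),
    "highlight": (0.95, 0.90, 0.80),
}

# Sorting by tonal value reproduces a painter's value scale: the darkest
# mixture anchors the shadows, the lightest the illuminated areas.
value_scale = sorted(palette, key=lambda name: tonal_value(palette[name]))
print(value_scale)
```

The green component dominates the weighting because the eye is most sensitive to it, which is why a "bright" green can read as a lighter tone than a saturated red of equal nominal intensity.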
The tonal expression the artist wants to give his work depends on the contrast between light and shadow; it can range from softness to hardness, giving a lesser or greater degree of dramatization. Backlighting, for example, is one of the resources that provide greater drama, since it produces elongated shadows and darker tones. The correspondence between light and shadow and color is achieved through tonal evaluation: the lightest tones are found in the most illuminated areas of the painting and the darkest in those that receive less illumination. Once the artist establishes the tonal values, he chooses the most appropriate color ranges for their representation. Colors can be lightened or darkened until the desired effect is achieved: to lighten a color, lighter related colors – such as groups of warm or cool colors – are added to it, as well as amounts of white until the right tone is found; to darken, related dark colors and some blue or shadow color are added. In general, the shadow tone is made by mixing a color with a darker shade, plus blue and a complementary of the color itself (such as yellow and dark blue, red and primary blue, or magenta and green). Color also governs the light and chromatic harmony of a painting, that is, the relationship between its parts that creates cohesion. There are several ways to harmonize: it can be done through "monochrome and tone dominant melodic ranges", with a single color as a base whose value or tone is changed; if the value is changed with white or black it is a monochrome, while if the tone is changed it is a simple melodic range: for example, taking red as the dominant tone, it can be shaded with various shades of red (vermilion, cadmium, carmine) or orange, pink, violet, maroon, salmon, warm gray, etc. Another method is the "harmonic trios", which consists of combining three colors equidistant from each other on the chromatic circle; there can also be four, in which case we speak of "quaternions".
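The "harmonic trio" construction – three colors equidistant on the chromatic circle – can be sketched with Python's standard colorsys module. This is a minimal illustration under the assumption that "equidistant on the chromatic circle" maps to hues 120 degrees apart; the helper name is ours, not a term from the text.

```python
import colorsys

def harmonic_trio(r, g, b):
    """Given one RGB color (components 0.0-1.0), return the two companion
    colors that sit 120 degrees away on the hue circle, keeping the same
    saturation and value."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return [colorsys.hsv_to_rgb((h + shift) % 1.0, s, v)
            for shift in (1 / 3, 2 / 3)]

# Starting from pure red, the trio completes with pure green and pure blue.
print(harmonic_trio(1.0, 0.0, 0.0))
```

A "quaternion" would work the same way with shifts of 1/4, 2/4 and 3/4 of the hue circle instead.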
Another way is the combination of "warm and cool thermal ranges": warm colors are for example red, orange, purple and yellowish green, as well as black; cool colors are blue, green and violet, as well as white (this perception of color with respect to its temperature is subjective and comes from Goethe's Theory of Colors). It is also possible to harmonize with "complementary colors", the combination that produces the greatest chromatic contrast. Finally, "broken ranges" consist of neutralization by mixing primary colors and their complementary colors, which produces intense luminous effects, since the chromatic vibration is more subtle and the saturated colors stand out more.

Techniques

The quality and appearance of the luminous representation is in many cases linked to the technique used. The expression and the different light effects of a work depend to a great extent on the different techniques and materials used. In drawing, whether in pencil or charcoal, the effects of light are achieved through the black-white duality, where white is generally the color of the paper (there are colored pencils, but they produce little contrast, so they are not very suitable for chiaroscuro and light effects). Pencil is usually worked with line and hatching, or by means of blurred spots. Charcoal allows the use of gouache and chalk or white chalk to add touches of light, as well as sanguine or sepia. Another monochrome technique is Indian ink, which generates very violent chiaroscuro, without intermediate values, making it a very expressive medium. Oil painting consists of dissolving the colors in an oily binder (linseed, walnut, almond or hazelnut oil; animal oils), adding turpentine to make it dry better. Oil is the technique that best allows light effects and chromatic tones to be valued. It produces vivid colors and intense effects of brightness and brilliance, and allows a free and fresh stroke, as well as a great richness of textures.
On the other hand, thanks to its long permanence in a fluid state, it allows for subsequent corrections. For its application, brushes, spatulas or scrapers can be used, allowing multiple textures, from thin layers and glazes to thick fillings, which produce a denser light. Pastel painting is made with a pigment pencil of various mineral colors, with binders (kaolin, gypsum, gum arabic, fig latex, fish glue, sugar candy, etc.), kneaded with wax and Marseilles soap and cut into sticks. The color should be spread with a smudger, a cylinder of leather or paper used to smudge the color strokes. Pastel combines the qualities of drawing and painting, and brings freshness and spontaneity. Watercolor is a technique made with transparent pigments diluted in water, with binders such as gum arabic or honey, using the white of the paper itself. Known since ancient Egypt, it has been a technique used throughout the ages, although with more intensity during the 18th and 19th centuries. As it is a wet technique, it provides great transparency, which highlights the luminous effect of the white color. Generally, the light tones are applied first, leaving spaces on the paper for the pure white; then the dark tones are applied. In acrylic paint, a plastic binder is added to the colorant, which produces a fast drying and is more resistant to corrosive agents. The speed of drying allows the addition of multiple layers to correct defects and produces flat colors and glazes. Acrylic can be worked by gradient, blurred or contrasted, by flat spots or by filling the color, as in the oil technique.

Genres

Depending on the pictorial genre, light has different considerations, since its incidence is different in interiors than in exteriors, on objects than on people. In interiors, light generally tends to create intimate environments, usually a type of indirect light filtered through doors or windows, or filtered by curtains or other elements.
In these spaces, private scenes are usually developed, which are reinforced by contrasts of light and shadow, intense or soft, natural or artificial, with areas in semi-darkness and atmospheres influenced by gravitating dust and other effects caused by these spaces. A separate genre of interior painting is the still life (in Spanish, naturaleza muerta), which usually shows a series of objects or food arranged as in a sideboard. In these works the artist can manipulate the light at will, generally with dramatic effects such as side lights, frontal lights, zenithal lights or backlighting. The main difficulty consists in the correct evaluation of the tones and textures of the objects, as well as their brightness and transparency depending on the material. In exteriors, the main genre is landscape, perhaps the most relevant in relation to light in that its presence is fundamental, since any exterior is enveloped in a luminous atmosphere determined by the time of day and the weather and environmental conditions. There are three main types: the terrestrial landscape, the seascape and the skyscape. The main challenge for the artist in these works is to capture the precise tone of the natural light according to the time of day, the season of the year, the viewing conditions – which can be affected by phenomena such as cloud cover, rain or fog – and an infinite number of variables that can occur in a medium as volatile as the landscape. On numerous occasions artists have gone out to paint in nature to capture their impressions first hand, a working method known by the French term en plein air ("in the open air", equivalent to "outdoors").
There is also the variant of the urban landscape, frequent especially since the 20th century, in which a factor to take into account is the artificial illumination of the cities and the presence of neon lights and other types of effects; in general, in these images the planes and contrasts are more differentiated, with hard shadows and artificial and grayish colors. Light is also fundamental for the representation of the human figure in painting, since it affects the volume and generates different limits according to the play of light and shadow, which delimits the anatomical profile. Light allows us to nuance the surface of the body, and provides a sensation of smoothness and softness to the skin. The focus of the light is important, since its direction influences the general contour of the figure and the illumination of its surroundings: for example, frontal light makes the shadows disappear, attenuating the volume and the sensation of depth, while emphasizing the color of the skin. On the other hand, a partially lateral illumination causes shadows and gives relief to the volumes, and if it is from the side, the shadow covers the opposite side of the figure, which appears with an enhanced volume. On the other hand, in backlighting the body is shown with a characteristic halo around its contour, while the volume acquires a weightless sensation. With overhead lighting, the projection of shadows blurs the relief and gives a somewhat ghostly appearance, just as it does when illuminated from below – although the latter is rare. A determining factor is that of the shadows, which generate a series of contours apart from the anatomical ones that provide drama to the image. Together with the luminous reflections, the gradation of shadows generates a series of effects of great richness in the figure, which the artist can exploit in different ways to achieve different results of greater or lesser effect. 
It should also be taken into account that direct light or shadow on the skin modifies the color, varying the tonality from the characteristic pale pink to gray or white. The light can also be filtered by objects that get in its path (such as curtains, fabrics, vases or various objects), which generates different effects and colors on the skin. In relation to the human figure, the characteristic genre is the portrait, in which light plays a decisive role in the modeling of the face. Its elaboration is based on the same premises as those of the human body, with the addition of a greater demand in the faithful representation of the physiognomic features and even the need to capture the psychology of the character. The drawing is essential to model the features according to the model and, from there, light and color are again the vehicle of translation of the visual image to its representation on the canvas. In the 20th century, abstraction emerged as a new pictorial language, in which painting is reduced to non-figurative images that no longer describe reality, but rather concepts or sensations of the artist himself, who plays with form, color, light, matter, space and other elements in a totally subjective way, not subject to conventionalisms. Despite the absence of concrete images of the surrounding reality, light is still present on numerous occasions, generally contributing luminosity to the colors or creating chiaroscuro effects by contrasting tonal values.

Chronological factor

Another aspect in which light is a determining factor is time, in the representation of chronological time in painting. Until the Renaissance, artists did not represent a specific time in painting and, in general, the only difference in light was between exterior and interior lights.
On many occasions it is difficult to identify the specific time of day in a work, since neither the direction of the light nor its quality nor the dimension of the shadows are decisive elements to recognize a certain time of day. Night was rarely represented until practically Mannerism and, in the cases in which a nocturnal atmosphere was used, it was because the narrative required it or because of some symbolic aspect: in Giotto's The Annunciation to the Shepherds or in Ambrogio Lorenzetti's Annunciation, the nocturnal atmosphere contributes to accentuate the halo of mystery surrounding the birth of Christ; in Uccello's Saint George and the Dragon, night represents evil, the world in which the dragon lives. On the other hand, even in narrative themes that take place at night, such as the Last Supper or the supper at Emmaus, this factor is sometimes deliberately avoided, as in Andrea del Sarto's Last Supper, set in daylight. Generally, the chronological setting of a scene has been linked to its narrative correlate, albeit in an approximate manner and with certain licenses on the part of the artist. It was not until the 19th century that industrial civilization, thanks to its advances in artificial lighting, made possible a complete and exact use of the full range of the day's hours. But just as in the contemporary age time has had a more realistic component, in the past it was more of a narrative factor, accompanying the action represented: dawn was a time of travel or hunting; noon, of action or its subsequent rest; dusk, of return or reflection; night was sleep, fear or adventure, or fun and passion; birth was morning, death was night.
The temporal dimension began to gain relevance in the 17th century, when artists such as Claude Lorrain and Salvator Rosa began to detach landscape painting from a narrative context and to produce works in which the protagonist was nature, with the only variations being the time of day or the season of the year. This new conception developed with 18th-century Vedutism and 19th-century Romantic landscape, and culminated with Impressionism. The first light of the day is that of dawn, sunrise or aurora (sometimes the aurora, which would be the first brightness of the sky, is differentiated from dawn, which would correspond to sunrise). Until the 17th century, dawn appeared only in small pieces of landscape, usually behind a door or a window, but was never used to illuminate the foreground. The light of dawn generally has a spherical effect, so until the appearance of Leonardo's aerial perspective it was not widely used. In his Dictionary of the Fine Arts of Design (1797), Francesco Milizia considered the light of dawn the most suitable for the representation of landscapes. Noon and the hours immediately before and after have always been a stable frame for an objective representation of reality, although it is difficult to pinpoint the exact moment in most paintings depending on the different light intensities. On the other hand, the exact noon was discouraged for its extreme refulgence, a caution voiced by both Leonardo and Milizia. Most art treatises recommended the afternoon light, which was the most used, especially from the Renaissance to the 18th century. Vasari advised to place the sun to the east because "the figure that is made has a great relief and great goodness and perfection is achieved". In the early days of modern painting, the sunset used to be circumscribed to a celestial vault characterized by its reddish color, without an exact correspondence with the illumination of figures and objects.
It was again with Leonardo, in his notebooks, that a more naturalistic study of twilight began. For Milizia this moment is risky, since "the more splendid these accidents are (the flaming twilight is always an excess), the more they must be observed to represent them well". Finally, the night has always been a singularity within painting, to the point of constituting a genre of its own: the nocturne. In these scenes the light comes from the Moon, the stars or from some type of artificial illumination (bonfires, torches, candles or, more recently, gas or electric light). The justification for a night scene has generally been given from iconographic themes occurring in this time period. In the 14th century painting began to move away from the symbolic and conceptual content of medieval art in search of a figurative content based on a more objective spatio-temporal axis. Renaissance artists were refractory to the nocturnal setting, since their experimentation in the field of linear perspective required an objective and stable frame in which full light was indispensable. Thus, Lorenzo Ghiberti stated that "it is not possible to be seen in darkness" and Leonardo wrote that "darkness means complete deprivation of light". Leonardo advised a night scene only with the illumination of a fire, as a mere artifice to make a night scene diurnal. However, Leonardo's sfumato opened a first door to a naturalistic representation of the night, thanks to the chromatic decrease in the distance in which the bluish white of Leonardo's luminous air can become a bluish black for the night: just as the first creates an effect of remoteness, the second provokes closeness, the dilution of the background in the gloom. This tendency would reach its climax in Baroque tenebrism, in which darkness is used to add drama to the scene and to emphasize certain parts of the painting, often with a symbolic aspect.
On the other hand, in the 17th century the representation of the night acquired a more scientific character, especially thanks to the invention of the telescope by Galileo and a more detailed observation of the night sky. Finally, advances in artificial lighting in the 19th century boosted the conquest of nighttime, which became a time for leisure and entertainment, a circumstance that was especially captured by the Impressionists.

Symbology

Throughout the history of painting, light has often had an aesthetic component, which identifies light with beauty, as well as a symbolic meaning, especially related to religion, but also to knowledge, good, happiness and life, or in general the spiritual and immaterial. Sometimes the light of the Sun has been equated with inspiration and imagination, and that of the Moon with rational thought. In contrast, shadows and darkness represent evil, death, ignorance, immorality, misfortune or secrecy. Thus, many religions and philosophies throughout history have been based on the dichotomy between light and darkness, such as Ahura Mazda and Ahriman, yin and yang, angels and demons, spirit and matter, and so on. In general, light has been associated with the immaterial and spiritual, probably because of its ethereal and weightless aspect, and that association has often been extended to other concepts related to light, such as color, shadow, radiance, evanescence, etc. The identification of light with a transcendent meaning comes from antiquity and probably existed in the minds of many artists and religious people before the idea was written down. In many ancient religions the deity was identified with light, such as the Semitic Baal, the Egyptian Ra or the Iranian Ahura Mazda. Primitive peoples already had a transcendental concept of light – the so-called "metaphor of light" – generally linked to immortality, which related the afterlife to starlight.
Many cultures sketched a place of infinite light where the souls rested, a concept also picked up by Aristotle and various Fathers of the Church such as Saint Basil and Saint Augustine. On the other hand, many religious rites were based on "illumination" to purify the soul, from ancient Babylon to the Pythagoreans. In Greek mythology Apollo was the god of the Sun and has often been depicted in art within a disk of light. On the other hand, Apollo was also the god of beauty and the arts, a clear symbolism between light and these two concepts. Also related to light is the goddess of dawn, Eos (Aurora in Roman mythology). In Ancient Greece, light was synonymous with life and was also related to beauty. Sometimes the fluctuation of light was related to emotional changes, as well as to intellectual capacity. On the other hand, the shadow had a negative component, it was related to the dark and hidden, to evil forces, such as the spectral shadows of Tartarus. The Greeks also related the sun to "intelligent light" (φῶς νοετόν), a driving principle of the movement of the universe, and Plato drew a parallel between light and knowledge. The ancient Romans distinguished between lux (luminous source) and lumen (rays of light emanating from that source), terms they used according to the context: thus, for example, lux gloriae or lux intelligibilis, or lumen naturale or lumen gratiae. In Christianity, God is also often associated with light, a tradition that goes back to the philosopher Pseudo-Dionysius Areopagite (On the Celestial Hierarchy, On the Divine Names), who adapted a similar one from Neoplatonism. For this 5th century author, "Light derives from Good and is the image of Goodness". Later, in the 9th century, John Scotus Erigena defined God as "the father of lights". Already the Bible begins with the phrase "let there be light" (Ge 1:3) and points out that "God saw that the light was good" (Ge 1:4). 
This "good" had in Hebrew a more ethical sense, but in its translation into Greek the term καλός (kalós, "beautiful") was used, in the sense of kalokagathía, which identified goodness and beauty; although later in the Latin Vulgate a more literal translation was made (bonum instead of pulchrum), it remained fixed in the Christian mentality the idea of the intrinsic beauty of the world as the work of the Creator. On the other hand, the Holy Scriptures identify light with God, and Jesus goes so far as to affirm: "I am the light of the world, he who follows me will not walk in darkness, for he will have the light of life" (John 8:12). This identification of light with divinity led to the incorporation in Christian churches of a lamp known as "eternal light", as well as the custom of lighting candles to remember the dead and various other rites. Light is also present in other areas of the Christian religion: the Conception of Jesus in Mary is realized in the form of a ray of light, as seen in numerous representations of the Annunciation; likewise, it represents the Incarnation, as expressed by Pseudo-Saint Bernard: "as the splendor of the sun passes through glass without breaking it and penetrates its solidity in its impalpable subtlety, without opening it when it enters and without breaking it when it leaves, so the Word God penetrates Mary's womb and comes forth from her womb intact." This symbolism of light passing through glass is the same concept that was applied to Gothic stained glass, where light symbolizes divine omnipresence. Another symbolism related to light is that which identifies Jesus with the Sun and Mary as the Dawn that precedes him. In addition to all this, in Christianity light can also signify truth, virtue and salvation. In patristics, light is a symbol of eternity and the heavenly world: according to Saint Bernard, souls separated from the body will be "plunged into an immense ocean of eternal light and luminous eternity". 
On the other hand, in ancient Christianity, baptism was initially called "illumination". In Orthodox Christianity, light is, more than a symbol, a "real aspect of divinity", according to Vladimir Lossky; a reality that, as Saint Simeon the New Theologian expressed, can be apprehended by the human being. Because of the opposition of light and darkness, light has also been used on occasions as a repeller of demons, and has therefore often been present in various acts and ceremonies such as circumcision, baptisms, weddings or funerals, in the form of candles or fires. In Christian iconography, light is also present in the halos of the saints, which used to be made – especially in medieval art – with a golden nimbus, a circle of light placed around the heads of saints, angels and members of the Holy Family. In Fra Angelico's The Annunciation, in addition to the halo, the artist placed rays of light radiating from the figure of the archangel Gabriel, to emphasize his divinity, the same resource he uses with the dove symbolizing the Holy Spirit. On other occasions, it is God himself who is represented in the form of rays of sunlight, as in The Baptism of Christ (1445) by Piero della Francesca. The rays can also signify God's wrath, as in The Tempest (1505) by Giorgione. On other occasions light represents eternity or divinity: in the vanitas genre, beams of light used to focus on objects whose transience was to be emphasized as a symbol of the ephemerality of life, as in Vanities (1645) by Harmen Steenwijck, where a powerful beam of light illuminates the skull in the center of the painting. Between the 14th and 15th centuries Italian painters used supernatural-looking lights in night scenes to depict miracles: for example, in the Annunciation to the Shepherds by Taddeo Gaddi (Santa Croce, Florence) or in the Stigmatization of Saint Francis by Gentile da Fabriano (1420, private collection).
In the 16th century, supernatural lights with brilliant effects were also used to point out miraculous events, as in Matthias Grünewald's Risen Christ (1512-1516, Isenheim altar, Museum Unterlinden, Colmar) or in Titian's Annunciation (1564, San Salvatore, Venice). In the following century, Rembrandt and Caravaggio identified light in their works with divine grace and as an agent of action against evil. The Baroque was the period in which light became more symbolic: in medieval art the luminosity of the backgrounds, of the halos of the saints and other objects – generally made with gold leaf – was an attribute that did not correspond to real luminosity, while in the Renaissance it responded more to a desire for experimentation and aesthetic delight; Rembrandt was the first to combine both concepts, the divine light is a real, sensory light, but with a strong symbolic charge, an instrument of revelation. Between the 17th and 18th centuries, mystical theories of light were abandoned as philosophical rationalism gained ground. From transcendental or divine light, a new symbolism of light evolved that identified it with concepts such as knowledge, goodness or rebirth, and opposed it to ignorance, evil and death. Descartes spoke of an "inner light" capable of capturing the "eternal truths", a concept also taken up by Leibniz, who distinguished between lumière naturelle (natural light) and lumière révélée (revealed light). In the 19th century light was related by the German Romantics (Friedrich Schlegel, Friedrich Schelling, Georg Wilhelm Friedrich Hegel) to nature, in a pantheistic sense of communion with nature. For Schelling, light was a medium in which the "universal soul" (Weltseele) moved. For Hegel, light was the "ideality of matter", the foundation of the material world. Between the 19th and 20th centuries, a more scientific view of light prevailed. 
Science had been trying to unravel the nature of light since the early Modern Age, with two main theories: the corpuscular theory, defended by Descartes and Newton; and the wave theory, defended by Christiaan Huygens, Thomas Young and Augustin-Jean Fresnel. Later, James Clerk Maxwell presented an electromagnetic theory of light. Finally, Albert Einstein brought together the corpuscular and wave theories. Light can also have a symbolic character in landscape painting: in general, dawn and the passage from night to day represent the divine plan – or cosmic system – that transcends the simple will of the human being; dawn also symbolizes the renewal and redemption of Christ. On other occasions, the sun and the moon have been associated with various vital forces: thus, the sun and the day are associated with the masculine, the vital force and energy; and the moon and the night with the feminine, rest, sleep and spirituality, sometimes even death. In other religions light also has a transcendent meaning: in Buddhism it represents truth and the overcoming of matter in the ascent to nirvana. In Hinduism it is synonymous with wisdom and the spiritual understanding of participation with divinity (atman); it is also the manifestation of Krishna, the "Lord of Light". In Islam it is the sacred name Nûr. According to the Koran (24:35), "Allah is the light of the heavens and the earth. Light upon light! Allah guides to his light whomever he wills". In the Zohar of the Jewish Kabbalah the primordial light Or (or Awr) appears, and it points out that the universe is divided between the empires of light and darkness; also in Jewish synagogues there is usually a lamp of "eternal light" or ner tamid. Finally, in Freemasonry, the search for light is considered the ascent to the various Masonic degrees; some of the Masonic symbols, such as the compass, the square and the holy book, are called "great lights"; also the principal Masonic officials are called "lights".
On the other hand, initiation into Freemasonry is called "receiving the light".

History

The use of light is intrinsic to painting, so it has been present directly or indirectly since prehistoric times, when cave paintings sought light and relief effects by taking advantage of the roughness of the walls on which these scenes were represented. However, serious attempts at greater experimentation in the technical representation of light did not take place until classical Greco-Roman art: Francisco Pacheco, in El arte de la pintura (1649), points out that "adumbration was invented by Saurias of Samos, who traced the shadow of a horse cast in the sunlight". On the other hand, Apollodorus of Athens is credited with the invention of chiaroscuro, a procedure of contrast between light and shadow to produce effects of luminous reality in a two-dimensional representation such as painting. The effects of light and shadow were also developed by Greek scenographers in a technique called skiagraphia, consisting of the juxtaposition of black and white to create relief, to the point that its practitioners were called "shadow painters". The first scientific studies on light also emerged in Greece: Aristotle stated in relation to colors that they are "mixtures of different forces of sunlight and the light of fire, air and water", and that "darkness is due to the deprivation of light". One of the most famous Greek painters was Apelles, one of the pioneers in the representation of light in painting. Pliny said of Apelles that he was the only one who "painted what cannot be painted, thunder, lightning and thunderbolts". Another outstanding painter was Nicias of Athens, of whom Pliny praised the "care he took with light and shade to achieve the appearance of relief".
With the emergence of landscape painting, a new method was developed to represent distance through gradations of light and shadow, contrasting more strongly the plane closest to the viewer and progressively blurring with distance. These early landscape painters created the modeling through shades of light and shadow, without mixing the colors on the palette. Claudius Ptolemy explained in his Optics how painters created the illusion of depth through distances that seemed "veiled by air". In general, the strongest contrasts were made in the areas closest to the observer and progressively reduced towards the background. This technique was picked up by early Christian and Byzantine art, as seen in the apsidal mosaic of Sant'Apollinare in Classe, and even reached as far as India, as seen in the Buddhist murals of Ajantā. In the 5th century the philosopher John Philoponus, in his commentary on Aristotle's Meteorology, outlined a theory on the subjective effect of light and shadow in painting, known today as "Philoponus' rule". This effect was already known empirically by ancient painters. Cicero was of the opinion that painters saw more than ordinary people in umbris et eminentia ("in shadows and eminences"), that is, in depth and protrusion. And Pseudo-Longinus – in his work On the Sublime – said that "although the colors of shadow and light are on the same plane, side by side, the light jumps immediately into view and seems not only to stand out but actually to be closer." Hellenistic art was fond of light effects, especially in landscape painting, as seen in the stuccoes of La Farnesina. Chiaroscuro was widely used in Roman painting, as seen in the illusory architectures of the frescoes of Pompeii, although it disappeared during the Middle Ages. Vitruvius recommended northern light as the most suitable for painting, being more constant owing to its low mutability in tone.
Later, in Paleochristian art, the taste for contrasts between light and shadow became evident – as can be seen in Christian sepulchral paintings and in the mosaics of Santa Pudenziana and Santa Maria Maggiore – in such a way that this style has sometimes been called "ancient impressionism". Byzantine art inherited the illusionistic touches of light that were used in Pompeian art, but whereas in the original their main function was naturalistic, here they became a rhetorical formula far removed from the representation of reality. In Byzantine art, as well as in Romanesque art, which it powerfully influenced, the luminosity and splendor of shines and reflections, especially of gold and precious stones, were more highly valued, with a more aesthetic than pictorial component, since these shines were synonymous with beauty, of a type of beauty more spiritual than material. These brilliances were identified with the divine light, as Abbot Suger did to justify his expenditure on jewels and precious materials. Both Greek and Roman art laid the foundations of the style known as classicism, whose main premises are truthfulness, proportion and harmony. Classicist painting is fundamentally based on drawing as a preliminary design tool, on which the pigment is applied taking into account a correct proportion of chromaticism and shading. These precepts laid the foundations of a way of understanding art that has lasted throughout history, with a series of cyclical ups and downs during which they have been followed to a greater or lesser extent: among the periods that returned to the classical canons were the Renaissance, Baroque classicism, Neoclassicism and academicism.
Medieval art

The art historian Wolfgang Schöne divided the history of painting in terms of light into two periods: "proper light" (Eigenlicht), which would correspond to medieval art; and "illuminating light" (Beleuchtungslicht), which would develop in modern and contemporary art (Über das Licht in der Malerei, Berlin, 1979). In the Middle Ages, light had a strong symbolic component in art, since it was considered a reflection of divinity. Within medieval scholastic philosophy, a current called the aesthetics of light emerged, which identified light with divine beauty and greatly influenced medieval art, especially Gothic art: the new Gothic cathedrals were brighter, with large windows that flooded the interior space, which was indefinite, without limits, an embodiment of an absolute, infinite beauty. The introduction of new architectural elements such as the pointed arch and the ribbed vault, together with the use of buttresses and flying buttresses to support the weight of the building, allowed the opening of windows covered with stained glass that filled the interior with light, which gained in transparency and luminosity. These stained-glass windows nuanced the light that entered through them, creating fantastic plays of light and color, fluctuating at different times of the day, which were reflected harmoniously in the interior of the buildings. Light was associated with divinity, but also with beauty and perfection: according to Saint Bonaventure (De Intelligentii), the perfection of a body depends on its luminosity ("perfectio omnium eorum quae sunt in ordine universo, est lux"). William of Auxerre (Summa Aurea) also related beauty and light, so that a body is more or less beautiful according to its degree of radiance. This new aesthetic ran parallel at many moments to the advances of science in subjects such as optics and the physics of light, especially thanks to the studies of Roger Bacon.
At this time the works of Alhazen also became known, which would be collected by Witelo in De perspectiva (ca. 1270–1278) and Adam Pulchrae Mulieris in Liber de intelligentiis (ca. 1230). The new prominence given to light in medieval times had a powerful influence on all artistic genres, to the point that Daniel Boorstin points out that "it was the power of light that produced the most modern artistic forms, because light, the almost instantaneous messenger of sensation, is the swiftest and most transitory element". In addition to architecture, light had a special influence on the miniature, with manuscripts illuminated in bright and brilliant colors, generally thanks to the use of pure colors (white, red, blue, green, gold and silver), which gave the image a great luminosity, without shading or chiaroscuro. The combination of these elementary colors generates light through their overall concordance, thanks to the juxtaposition of the pigments, without having to resort to shading effects to outline the contours. The light radiates from the objects, which are luminous without the need for the play of volumes that would be characteristic of modern painting. In particular, the use of gold in medieval miniatures generated areas of great light intensity, often contrasted with cold and light tones to provide greater chromaticism. However, in painting, light did not have the prominence it had in architecture: medieval "proper light" was alien to reality and without contact with the spectator, since it neither came from outside – lacking a light source – nor went outward, since it did not radiate light. Chiaroscuro was not used, since shadow was forbidden, being considered a refuge for evil. Light was considered of divine origin and conqueror of darkness, so it illuminated everything equally, with the consequent lack of modeling and volume in the objects, a fact that resulted in the weightless and incorporeal image sought to emphasize spirituality.
Although there is a greater interest in the representation of light, it is more symbolic than naturalistic. Just as in architecture the stained-glass windows created a space where illumination took on a transcendent character, in painting a spatial staging was developed through gold backgrounds, which, although not representing a physical space, did represent a metaphysical realm, linked to the sacred. This "Gothic light" was a feigned illumination and created a type of unreal image that transcended mere nature. The gold background reinforced the sacred symbolism of light: the figures are immersed in an indeterminate space of unnatural light, a setting of sacred character where figures and objects are part of the religious symbolism. Cennino Cennini, in Il libro dell'Arte, compiled various technical procedures for the use of gold leaf in painting (backgrounds, draperies, nimbuses), which remained in force until the 16th century. Gold leaf was used profusely, especially in halos and backgrounds, as can be seen in Duccio's Maestà, which shone brightly in the interior of the cathedral of Siena. Sometimes, before applying the gold leaf, a layer of red clay was spread; after wetting the surface and placing the gold leaf, it was smoothed and polished with ivory or a smooth stone. To achieve more brilliance and to catch the light, incisions were made in the gilding. It is noteworthy that in early Gothic painting there are no shadows; instead, the entire representation is uniformly illuminated: according to Hans Jantzen, "to the extent that medieval painting suppresses the shadow, it raises its sensitive light to the power of a super-sensible light". In Gothic painting there is a progressive evolution in the use of light: the linear or Franco-Gothic style was characterized by linear drawing and strong chromaticism, and gave greater importance to the luminosity of flat color than to tonality, emphasizing chromatic pigment as opposed to luminous gradation.
With the Italic or Trecentist Gothic a more naturalistic use of light began, characterized by the approach to the representation of depth – which would crystallize in the Renaissance with linear perspective – the studies on anatomy and the analysis of light to achieve tonal nuance, as seen in the work of Cimabue, Giotto, Duccio, Simone Martini, and Ambrogio Lorenzetti. In the Flemish Gothic period, the technique of oil painting emerged, which provided brighter colors and allowed their gradation in different chromatic ranges, while permitting finer rendering of detail (Jan van Eyck, Rogier van der Weyden, Hans Memling, Gerard David). Between the 13th and 14th centuries a new sensibility towards a more naturalistic representation of reality emerged in Italy, one of whose contributing factors was the study of realistic light in the pictorial composition. In the frescoes of the Scrovegni Chapel (Padua), Giotto studied how to distinguish flat and curved surfaces by the presence or absence of gradients, and how to distinguish the orientation of flat surfaces by three tones: lighter for horizontal surfaces, medium for frontal vertical surfaces and darker for receding vertical surfaces. Giotto was the first painter to represent sunlight, a type of soft, transparent illumination, but one that already served to model figures and enhance the quality of clothes and objects. For his part, Taddeo Gaddi – in his Annunciation to the Shepherds (Baroncelli Chapel, Santa Croce, Florence) – depicted divine light in a night scene with a visible light source and the rapid fall-off in the pattern of light distribution characteristic of point sources, through contrasts of yellow and violet.
In the Netherlands, the brothers Hubert and Jan van Eyck and Robert Campin sought to capture various plays of light on surfaces of different textures and sheen, imitating the reflections of light on mirrors and metallic surfaces and highlighting the brilliance of colored jewels and gems (Triptych of Mérode, by Campin, 1425–1428; Polyptych of Ghent, by Hubert and Jan van Eyck, 1432). Hubert was the first to develop a certain sense of the saturation of light in his Hours of Turin (1414–1417), in which he recreated the first "modern landscapes" of Western painting, according to Kenneth Clark. In these small landscapes the artist recreates effects such as the reflection of the evening sky on the water or the light sparkling on the waves of a lake, effects that would not be seen again until the Dutch landscape painting of the 17th century. In the Ghent Polyptych (1432, Saint Bavo's Cathedral, Ghent), by Hubert and Jan, the landscape of The Adoration of the Mystic Lamb melts into light in the celestial background, with a subtlety that only the Baroque painter Claude Lorrain would later achieve. Jan van Eyck developed his brother's light experiments and managed to capture an atmospheric luminosity of naturalistic aspect in paintings such as The Virgin of Chancellor Rolin (1435, Louvre Museum, Paris) or The Arnolfini Marriage (1434, The National Gallery, London), where he combines the natural light entering through two side windows with that of a single candle lit on the chandelier, which has a more symbolic than plastic value, since it alludes to human life. In Van Eyck's workshop, oil painting was developed, which gave a greater luminosity to the painting thanks to glazes: in general, they applied a first layer of tempera, more opaque, on which they applied the oil (pigments ground in oil), which is more transparent, in several thin layers that let the light pass through, achieving greater luminosity, depth and tonal and chromatic richness.
Other Dutch artists who stood out in the expression of light were Dirk Bouts, who in his works enhances with light the coloring and, in general, the plastic sense of the composition; Petrus Christus, whose use of light approaches a certain abstraction of forms; and Geertgen tot Sint Jans, author in some of his works of surprising light effects, as in his Nativity (1490, National Gallery, London), where the light emanates from the body of the Child Jesus in the cradle, a symbol of Divine Grace.

Modern Age Art

Renaissance

The art of the Modern Age – not to be confused with modern art, which is often used as a synonym for contemporary art – began with the Renaissance, which emerged in Italy in the 15th century (Quattrocento), a style influenced by classical Greco-Roman art and inspired by nature, with a more rational and measured component, based on harmony and proportion. Linear perspective emerged as a new method of composition and light became more naturalistic, the object of an empirical study of physical reality. Renaissance culture meant a return to rationalism, the study of nature and empirical research, with a special influence of classical Greco-Roman philosophy. Theology took a back seat and the philosopher's object of study returned to the human being (humanism). In the Renaissance, the use of canvas as a support and the technique of oil painting became widespread, especially in Venice from 1460. Oil painting provided a greater chromatic richness and facilitated the representation of brightness and light effects, which could be rendered in a wider range of shades. In general, Renaissance light tended to be intense in the foreground, diminishing progressively towards the background.
It was a fixed lighting, which implied an abstraction from reality, since it created an aseptic space subordinated to the idealizing character of Renaissance painting; to reconvert this ideal space into a real atmosphere, a slow process was followed based on the subordination of volumetric values to lighting effects, through the dissolution of the solidity of forms in the luminous space. During this period, chiaroscuro was recovered as a method of giving relief to objects, while the study of gradation as a technique for diminishing the intensity of color, and of modeling to graduate the different values of light and shadow, was deepened. Renaissance natural light not only determined the space of the pictorial composition, but also the volume of figures and objects. It is a light that loses the metaphorical character of Gothic light and becomes a tool for measuring and ordering reality, shaping a plastic space through a naturalistic representation of light effects. Even when light retains a metaphorical reference – in religious scenes – it is a light subordinated to the realistic composition. Light had a special relevance in landscape painting, a genre in which it marked the transition from a symbolic representation in medieval art to a naturalistic transcription of reality. Light is the medium that unifies all parts of the composition into a structured and coherent whole. According to Kenneth Clark, "the sun shines for the first time in the landscape of the Flight into Egypt that Gentile da Fabriano painted in his Adoration of 1423". This sun is a golden disk, reminiscent of medieval symbolism, but its light is already fully naturalistic, spilling over the hillside, casting shadows and creating the compositional space of the image. In the Renaissance, the first theoretical treatises on the representation of light in painting appeared: Leonardo da Vinci dedicated a good part of his Treatise on Painting to the scientific study of light.
Albrecht Dürer investigated a mathematical procedure to determine the location of shadows cast by objects illuminated by point-source lights, such as candlelight. Giovanni Paolo Lomazzo devoted the fourth book of his Trattato (1584) to light, in which he arranged light in descending order from primary sunlight, divine light and artificial light to the weaker secondary light reflected by illuminated bodies. Cennino Cennini took up in his treatise Il libro dell'Arte the rule of Philoponus on the creation of distance by contrasts: "the farther away you want the mountains to appear, the darker you will make your color; and the closer you want them to appear, the lighter you will make the colors". Another theoretical reference was Leon Battista Alberti, who in his treatise De pictura (1435) pointed out the indissolubility of light and color, affirming that "philosophers say that no object is visible if it is not illuminated and has no color. Therefore they affirm that between light and color there is a great interdependence, since they make each other reciprocally visible". In his treatise, Alberti pointed out three fundamental concepts in painting: circumscriptio (drawing, outline), compositio (arrangement of the elements) and luminum receptio (illumination). He stated that color is a quality of light and that to color is to "give light" to a painting. Alberti pointed out that relief in painting was achieved by the effects of light and shadow (lumina et umbrae), and warned that "on the surface on which the rays of light fall the color is lighter and more luminous, and the color becomes darker where the strength of the light gradually diminishes". Likewise, he spoke of the use of white as the main tool for creating brilliance: "the painter has nothing but white pigment (album colorem) to imitate the flash (fulgorem) of the most polished surfaces, just as he has nothing but black to represent the most extreme darkness of the night".
Thus, the darker the general tone of the painting, the more possibilities the artist has to create light effects, since they will stand out more. Alberti's theories greatly influenced Florentine painting in the mid-15th century, so much so that this style is sometimes called pittura di luce (light painting), represented by Domenico Veneziano, Fra Angelico, Paolo Uccello, Andrea del Castagno and the early works of Piero della Francesca. Domenico Veneziano, who as his name indicates was originally from Venice but settled in Florence, was the introducer of a style based more on color than on line. In one of his masterpieces, The Virgin and Child with Saint Francis, Saint John the Baptist, Saint Zenobius and Saint Lucy (c. 1445, Uffizi, Florence), he achieved a believably naturalistic representation by combining the new techniques of representing light and space. The solidity of the forms rests on the light-shadow modeling, but the image also has a serene and radiant atmosphere that comes from the clear sunlight flooding the courtyard where the scene takes place, one of the stylistic hallmarks of this artist. Fra Angelico synthesized the symbolism of the spiritual light of medieval Christianity with the naturalism of Renaissance scientific light. He knew how to distinguish between the light of dawn, noon and twilight, a diffuse and non-contrasting light, like an eternal spring, which gives his works an aura of serenity and placidity that reflects his inner spirituality. In Scenes from the Life of Saint Nicholas (1437, Pinacoteca Vaticana, Rome) he applied Alberti's method of balancing illuminated and shaded halves, especially in the figure with its back turned and the mountainous background.
Uccello was also a great innovator in the field of pictorial lighting: in his works – such as The Battle of San Romano (1456, Musée du Louvre, Paris) – each object is conceived independently, with its own lighting that defines its corporeality, in conjunction with the geometric values that determine its volume. These objects are grouped together in a scenographic composition, with a type of artificial lighting reminiscent of that of the performing arts. In turn, Piero della Francesca used light as the main element of spatial definition, establishing a system of volumetric composition in which even the figures are reduced to mere geometric outlines, as in The Baptism of Christ (1440-1445, The National Gallery, London). According to Giulio Carlo Argan, Piero did not consider "a transmission of light, but a fixation of light", which turns the figures into references of a certain definition of space. He carried out scientific studies of perspective and optics (De prospectiva pingendi) and in his works, full of a colorful luminosity of great beauty, he uses light as both an expressive and symbolic element, as can be seen in his frescoes of San Francesco in Arezzo. Della Francesca was one of the first modern artists to paint night scenes, such as The Dream of Constantine (Legend of the Cross, 1452–1466, San Francesco in Arezzo). He cleverly assimilated the luminism of the Flemish school, which he combined with Florentine spatialism: in some of his landscapes there are luminous moonscapes reminiscent of the Van Eyck brothers, although transcribed with the golden Mediterranean light of his native Umbria. Masaccio was a pioneer in using light to emphasize the drama of the scene, as seen in his frescoes in the Brancacci chapel of Santa Maria del Carmine (Florence), where he uses light to configure and model the volume, while the combination of light and shadow serves to determine the space. 
In these frescoes, Masaccio achieved a sense of perspective without resorting to geometry, as would be usual in linear perspective, but by distributing light among the figures and other elements of the representation. In The Tribute Money, for example, he placed a light source outside the painting that illuminates the figures obliquely, casting shadows on the ground with which the artist plays. Straddling the Gothic and Renaissance periods, Gentile da Fabriano was also a pioneer in the naturalistic use of light: in the predella of the Adoration of the Magi (1423, Uffizi, Florence) he distinguished between natural, artificial and supernatural light sources, using a technique of gold leaf and graphite to create the illusion of light through tonal modeling. Sandro Botticelli, a painter of Gothicizing tendencies, moved away from the naturalistic style initiated by Masaccio and returned to a certain symbolic concept of light. In The Birth of Venus (1483–1485, Uffizi, Florence), he symbolized the dichotomy between matter and spirit with the contrast between light and darkness, in line with the Neoplatonic theories of the Florentine Academy, of which he was a follower: on the left side of the painting the light corresponds to the dawn, both physical and symbolic, since the female figure embracing Zephyrus is Aurora, the goddess of dawn; on the right side, darker, are the earth and the forest, as metaphorical elements of matter, while the figure holding out a mantle to Venus is the Hour, who personifies time. Venus is in the center, between day and night, between sea and land, between the divine and the human. A remarkable pictorial school emerged in Venice, characterized by the use of canvas and oil painting, where light played a fundamental role in the structuring of forms, while great importance was given to color: chromaticism would be the main hallmark of this school, as it would be in the 16th century with Mannerism.
Its main representatives were Carlo Crivelli, Antonello da Messina and Giovanni Bellini. In the Altarpiece of Saint Job (c. 1485, Gallerie dell'Accademia, Venice), Bellini brought together for the first time Florentine linear perspective and Venetian color, combining space and atmosphere, and made the most of the new oil technique initiated in Flanders, thus creating a new artistic language that was quickly imitated. According to Kenneth Clark, Bellini "was born with the landscape painter's greatest gift: emotional sensitivity to light". In his Christ on the Mount of Olives (1459, National Gallery, London) he made the effects of light the driving force of the painting, with a shadowy valley in which the rising sun peeks through the hills. This emotive light is also seen in his Resurrection in the Staatliche Museen in Berlin (1475–1479), where the figure of Jesus radiates a light that bathes the sleeping soldiers. While his early works are dominated by sunrises and sunsets, in his mature production he favored the full light of day, in which the forms merge with the general atmosphere. However, he also knew how to take advantage of the cold and pale lights of winter, as in the Virgin of the Meadow (1505, National Gallery, London), where a pale sun struggles with the shadows of the foreground, creating a fleeting effect of marble light. The Renaissance saw the emergence of the sfumato technique, traditionally attributed to Leonardo da Vinci, which consisted of the gradation of light tones to blur the contours and thus give a sense of remoteness. This technique was intended to give greater verisimilitude to the pictorial representation, by creating effects similar to those of human vision in environments with a wide perspective.
The technique consisted of a progressive application of glazes and the feathering of the shadows to achieve a smooth gradient between the various areas of light and shadow in the painting, with a tonal gradation achieved through progressive retouching, leaving no trace of the brushstroke. It is also called "aerial perspective", since its results resemble vision in a natural environment conditioned by atmospheric and environmental effects. This technique was used, in addition to Leonardo, by Dürer, Giorgione and Bernardino Luini, and later by Velázquez and other Baroque painters. Leonardo was essentially concerned with perception, the observation of nature. He sought life in painting, which he found in color, in the light of chromaticism. In his Treatise on Painting (1540) he stated that painting is the sum of light and darkness (chiaroscuro), which gives movement, life: according to Leonardo, darkness is the body and light is the spirit, and the mixture of both is life. In his treatise he established that "painting is a composition of light and shadows, combined with the various qualities of all the simple and compound colors". He also distinguished between illumination (lume) and brilliance (lustro), and warned that "opaque bodies with a hard and rough surface never generate luster in any illuminated part". The Florentine polymath included light among the main components of painting and pointed to it as an element that articulates pictorial representation and conditions the spatial structure and the volume and chromaticism of objects and figures. He was also concerned with the study of shadows and their effects, which he analyzed together with light in his treatise, and he distinguished between shadow (ombra) and darkness (tenebre), the former being an intermediate state between light and darkness.
He also studied nocturnal painting, for which he recommended the presence of fire as a means of illumination, and he noted the different gradations of light and color required according to the distance from the light source. Leonardo was one of the first artists to be concerned with the degree of illumination of the painter's studio, suggesting that for nudes or flesh tones the studio should have uncovered lights and red walls, while for portraits the walls should be black and the light diffused by a canopy. Leonardo's subtle chiaroscuro effects are perceived in his female portraits, in which the shadows fall on the faces as if submerging them in a subtle and mysterious atmosphere. In these works he advocated intermediate lights, stating that "the contours and figures of dark bodies are poorly distinguished in the dark as well as in the light, but in the intermediate zones between light and shadow they are better perceived". Likewise, on color he wrote that "colors placed in shadows will participate to a greater or lesser degree in their natural beauty according as they are placed in greater or lesser darkness. But if the colors are placed in a luminous space, then they will possess a beauty all the greater the more splendorous the luminosity". The other great name of the early Cinquecento was Raphael, a serene and balanced artist whose work shows a certain idealism framed in a realistic technique of great virtuosity of execution. According to Giovanni Paolo Lomazzo, Raphael "has given enchanting, loving and sweet light, so that his figures appear beautiful, pleasing and intricate in their contours, and endowed with such relief that they seem to move". Some of his lighting solutions were quite innovative, with resources halfway between Leonardo and Caravaggio, as seen in The Transfiguration (1517–1520, Vatican Museums, Vatican City), in which he divides the image into two halves, the heavenly and the earthly, each with different pictorial resources.
In the Liberation of Saint Peter (1514, Vatican Museums, Vatican City) he painted a nocturnal scene in which the light radiating from the angel in the center stands out, giving a sensation of depth, while it is reflected in the breastplates of the guards, creating intense luminous effects. This was perhaps the first work to include artificial lighting with a naturalistic sense: the light radiating from the angel influences the illumination of the surrounding objects, while diluting the distant forms. Outside Italy, Albrecht Dürer was especially concerned with light in his watercolor landscapes, treated with an almost topographical detail, in which he shows a special delicacy in the capture of light, with poetic effects that prefigure the sentimental landscape of Romanticism. Albrecht Altdorfer showed a surprising use of light in The Battle of Alexander at Issus (1529, Alte Pinakothek, Munich), where the appearance of the sun among the clouds produces a supernatural refulgence, seething light effects that likewise anticipate Romanticism. Matthias Grünewald was a solitary and melancholic artist whose original work reflects a certain mysticism in the treatment of religious themes, with an emotive and expressionist style, still with medieval roots. His main work was the Isenheim altar (1512–1516, Museum Unterlinden, Colmar), in which the refulgent halo in which he places his Risen Christ stands out. Between Gothic and Renaissance lies the unclassifiable work of Bosch, a Flemish artist gifted with a great imagination, author of dreamlike images that continue to surprise with their fantasy and originality.
In his works – and especially in his landscape backgrounds – there is a great skill in the use of light in different temporal and environmental circumstances, but he also knew how to recreate in his infernal scenes fantastic effects of flames and fires, as well as supernatural lights and other original effects, especially in works such as The Last Judgment (c. 1486–1510, Groeninge Museum, Bruges), Visions of the Beyond (c. 1490, Doge's Palace, Venice), The Garden of Earthly Delights (c. 1500–1505, Museo del Prado, Madrid), The Haywain (c. 1500–1502, Museo del Prado, Madrid) or The Temptation of Saint Anthony (c. 1501, Museum of Fine Arts, Lisbon). Bosch had a predilection for the effects of light generated by fire, by the glow of flames, which gave rise to a new series of paintings in which the effects of violent and fantastic lights originated by fire stood out, as seen in a work by an anonymous artist linked to the workshop of Lucas van Leyden, Lot and his Daughters (c. 1530, Musée du Louvre, Paris), or in some works by Joachim Patinir, such as Charon Crossing the Styx (c. 1520–1524, Museo del Prado, Madrid) or Landscape with the Destruction of Sodom and Gomorrah (c. 1520, Museum Boijmans Van Beuningen, Rotterdam). These effects also influenced Giorgione, as well as some Mannerist painters such as Lorenzo Lotto, Dosso Dossi and Domenico Beccafumi. Mannerism At the end of the High Renaissance, in the middle of the 16th century, came Mannerism, a movement that abandoned nature as a source of inspiration to seek a more emotional and expressive tone, in which the artist's subjective interpretation of the work of art became more important, with a taste for sinuous and stylized forms, deformation of reality, distorted perspectives and contrived atmospheres. In this style light was used theatrically, with an unreal treatment, seeking a colored light of different origins, whether a cold moonlight or a warm firelight. 
Mannerism broke with the full Renaissance light by introducing night scenes with intense chromatic interplay between light and shadow and a dynamic rhythm far from Renaissance harmony. Mannerist light, in contrast to Renaissance classicism, took on a more expressive function, with a natural origin but an unreal treatment, a disarticulating factor of the classicist balance, as seen in the work of Pontormo, Rosso or Beccafumi. In Mannerism, the Renaissance optical scheme of light and shadow was broken by suppressing the visual relationship between the light source and the illuminated parts of the painting, as well as the intermediate steps of gradation. The result was strong contrasts of color and chiaroscuro, and an artificial and refulgent aspect of the illuminated parts, independent of the light source. Between Renaissance classicism and Mannerism lies the work of Michelangelo, one of the most renowned artists of all time. He generally used light according to plastic criteria, but sometimes employed it as a dramatic resource, especially in his frescoes in the Pauline Chapel: the Crucifixion of Saint Peter and the Conversion of Saint Paul (1549). The frescoes face each other on opposite walls, and the artist took advantage of the natural light entering the chapel, which illuminated one wall and left the other in semi-darkness: in the darkest part he placed the Crucifixion, a subject better suited to the absence of light, which emphasizes the tragedy of the scene, intensified in its symbolic aspect by the fading light of dusk perceived on the horizon; the Conversion, by contrast, receives natural light, and at the same time the pictorial composition has more luminosity, especially through the powerful ray of light that comes from the hand of Christ and is projected onto the figure of Saul, who thanks to this divine intervention is converted to Christianity. 
Another reference of Mannerism was Correggio, the first artist – according to Vasari – to apply a dark tone in contrast to light to produce effects of depth, while masterfully developing the Leonardesque sfumato through diffuse lights and gradations. In his work The Nativity (1522, Gemäldegalerie Alte Meister, Dresden) he was the first to show the birth of Jesus as a "miracle of light", an assimilation that would become habitual from then on. In The Assumption of the Virgin (1526-1530), painted on the dome of the cathedral of Parma, he created an illusionistic effect with figures seen from below (sotto in sù) that would be the forerunner of Baroque optical illusionism; in this work the subtle nuances of his flesh tones stand out, as well as the luminous break of glory in its upper part. Jacopo Pontormo, a disciple of Leonardo, developed a strongly emotional, dynamic style with unreal effects of space and scale, in which a great mastery of color and light can be glimpsed, applied in patches of color, especially red. Domenico Beccafumi stood out for his colorism, fantasy and unusual light effects, as in The Birth of the Virgin (1543, Pinacoteca Nazionale di Siena). Rosso Fiorentino also developed an unusual coloring and fanciful play of light and shadow, as in his Deposition from the Cross (1521, Pinacoteca Comunale, Volterra). Luca Cambiaso showed a great interest in nocturnal illumination, which is why he is considered a forerunner of tenebrism. Bernardino Luini, a disciple of Leonardo, showed a Leonardesque treatment of light in the Madonna of the Rosebush (c. 1525–1530, Pinacoteca di Brera). Alongside this more whimsical Mannerism, a school of a more serene style emerged in Venice that stood out for its treatment of light, subordinating plastic form to luminous values, as can be seen in the work of Giorgione, Titian, Tintoretto and Veronese. 
In this school, light and color were fused, and Renaissance linear perspective was replaced by aerial perspective, the use of which would culminate in the Baroque. The technique used by these Venetian painters is called "tonalism": it consisted of superimposing glazes to form the image through the modulation of color and light, which are harmonized through relations of tone that modulate them in a space of plausible appearance. Color assumes the function of light and shadow, and it is the chromatic relationships that create the effects of volume. In this modality, the chromatic tone depends on the intensity of light and shadow (the color value). Giorgione brought the Leonardesque influence to Venice. He was an original artist, one of the first to specialize in cabinet paintings for private collectors, and the first to subordinate the subject of the work to the evocation of moods. Vasari considered him, together with Leonardo, one of the founders of "modern painting". A great innovator, he reformulated landscape painting both in composition and iconography, with images conceived in depth with a careful modulation of chromatic and light values, as is evident in one of his masterpieces, The Tempest (1508, Gallerie dell'Accademia, Venice). Titian was a virtuoso in the recreation of vibrant atmospheres with subtle shades of light, achieved through infinite variations obtained after a meticulous study of reality and a skillful handling of the brush that demonstrated great technical mastery. In his Pentecost (1546, Santa Maria della Salute, Venice) he made rays of light emanate from the dove representing the Holy Spirit, ending in tongues of fire on the heads of the Virgin and the apostles, with surprising light effects that were innovative for his time. 
This research gradually evolved into increasingly dramatic effects, giving more emphasis to artificial lighting, as seen in The Martyrdom of Saint Lawrence (1558, Jesuit Church, Venice), where he combines the light of the torches and the fire of the grill on which the saint is martyred with the supernatural effect of a powerful flash of divine light in the sky projected onto the figure of the saint. This experimentation with light influenced the work of artists such as Veronese, Tintoretto, Jacopo Bassano and El Greco. Tintoretto liked to paint shut away in his studio, with the windows closed, by the light of candles and torches, which is why his paintings are often described as di notte e di fuoco ("by night and fire"). In his works, with their deep atmospheres and slender, elongated figures, the violent effects of artificial lights stand out, with strong chiaroscuro and phosphorescent effects. These luminous effects were adopted by other members of the Venetian school such as the Bassanos (Jacopo, Leandro and Francesco), as well as by the so-called "Lombard illuminists" (Giovanni Girolamo Savoldo, Moretto da Brescia), while influencing El Greco and Baroque tenebrism. Another artist working di notte e di fuoco was Jacopo Bassano, whose obliquely incident lights influenced Baroque naturalism. In works such as Christ in the House of Mary, Martha and Lazarus (c. 1577, Museum of Fine Arts, Houston), he combined natural and artificial lights with striking lighting effects. For his part, Paolo Veronese was heir to the luminism of Giovanni Bellini and Vittore Carpaccio, in scenes of Palladian architecture with dense morning lights, golden and warm, without prominent shadows, emphasizing the brightness of fabrics and jewels. 
In the Allegory of the Battle of Lepanto (1571) he divided the scene into two halves: the battle below and, above, the Virgin with the saints who intercede for victory in the battle, surrounded by angels hurling lightning bolts toward the fray, creating spectacular lighting effects. Outside Italy it is worth mentioning the work of Pieter Brueghel the Elder, author of genre scenes and landscapes that denote a great sensitivity towards nature. In some of his works the influence of Hieronymus Bosch can be seen in his fire lights and fantastic effects, as in The Triumph of Death (c. 1562, Museo del Prado, Madrid). In some of his landscapes he added the sun as a direct source of luminosity, such as the yellow sun of the Netherlandish Proverbs (1559, Staatliche Museen, Berlin), the red winter sun of The Census at Bethlehem (1566, Royal Museums of Fine Arts of Belgium, Brussels) or the evening sun of Landscape with the Fall of Icarus (c. 1558, Royal Museums of Fine Arts of Belgium, Brussels). In Spain during this period worked El Greco, a singular painter who developed an individual style, marked by the influence of the Venetian school – he lived in Venice for a time – as well as by Michelangelo, from whom he took his conception of the human figure. In El Greco's work, light always prevails over shadows, as a clear symbol of the preeminence of faith over unbelief. In one of his first works from Toledo, El Expolio (The Disrobing of Christ) for the sacristy of the cathedral of Toledo (1577), a zenithal light illuminates the figure of Jesus, focusing on his face, which becomes the focus of light in the painting. In the Trinity of the church of Santo Domingo el Antiguo (1577-1580) he introduced a dazzling light of glory in an intense golden yellow. 
In The Martyrdom of Saint Maurice (1580-1582, Royal Monastery of San Lorenzo de El Escorial) he created two areas of differentiated light: the natural light that surrounds the earthly characters and that of the break of glory in the sky, streaked with angels. Among his last works stands out The Adoration of the Shepherds (1612-1613, Museo del Prado, Madrid), where the focus of light is the Child Jesus, who radiates light around him, producing phosphorescent effects of strong chromatism. El Greco's illumination evolved from the Venetian school's light coming from a specific point – or diffused – to a light rooted in Byzantine art, in which the figures are illuminated without a specific light source, or even without a diffuse one. It is an unnatural light, which can come from multiple sources or none at all, an arbitrary and unequal light that produces hallucinatory effects. El Greco had a plastic conception of light: his execution went from dark to light tones, finally applying touches of white that created shimmering effects. The refulgent aspect of his works was achieved through glazes, while the whites were finished with almost dry applications. His light is mystical, subjective, almost spectral in appearance, with a taste for shimmering gleams and incandescent reflections. Baroque In the 17th century the Baroque emerged, a more refined and ornamented style, with the survival of a certain classicist rationalism but with more dynamic and dramatic forms, with a taste for the surprising and the anecdotal, for optical illusions and effects. Baroque painting had a marked geographical differentiation, since it developed in different countries, in various national schools, each with a distinctive stamp. 
However, there is a common influence coming again from Italy, where two opposing trends emerged: naturalism (also called Caravaggism), based on the imitation of natural reality, with a certain taste for chiaroscuro – the so-called tenebrism – and classicism, which is realistic but with a more intellectual and idealized concept of reality. Later, in the so-called "full Baroque" (second half of the 17th and early 18th centuries), painting evolved toward a more decorative style, with a predominance of mural painting and a certain predilection for optical effects (trompe-l'œil) and luxurious and exuberant scenographies. During this period, many scientific studies on light were carried out (Johannes Kepler, Francesco Maria Grimaldi, Isaac Newton, Christiaan Huygens, Robert Boyle), which influenced its pictorial representation. Newton showed that color comes from the spectrum contained in white light and designed the first chromatic circle showing the relationships between colors. In this period the pictorial representation of light reached its highest degree of perfection, and tactile form was diluted in favor of a greater visual impression, achieved by giving greater importance to light, with form losing the sharpness of its contours. In the Baroque, light was studied for the first time as a system of composition, articulated as a regulating element of the painting: light fulfills several functions, such as symbolism, modeling and illumination, and began to be directed as an emphatic element, selecting the part of the painting to be highlighted, so that artificial light, which can be manipulated at the artist's will, became more important. Sacred light (nimbuses, haloes) was abandoned and natural light was used exclusively, even as a symbolic element. On the other hand, the light of different times of the day (morning, twilight) began to be distinguished. 
Illumination was conceived as a luminous unit, as opposed to the multiple sources of Renaissance light; in the Baroque there may be several sources, but they are circumscribed to a global and unitary sense of the work. In the Baroque, the nocturne genre became fashionable, which implied a special difficulty in the representation of light: in the absence of daylight it was often necessary to resort to chiaroscuro and artificial lighting effects, while any natural light had to come from the moon or the stars. For artificial light, bonfires, candles, torches, lanterns, fireworks or similar elements were used. These light sources could be direct or indirect, and they could appear in the painting or illuminate the scene from outside. Naturalism Chiaroscuro resurfaced during the Baroque, especially in the Counter-Reformation, as a method of focusing the viewer's vision on the primordial parts of religious paintings, which were emphasized as didactic elements, as opposed to Renaissance "pictorial decor". An exacerbated variant of chiaroscuro was tenebrism, a technique based on strong contrasts of light and shadow, with a violent type of lighting, generally artificial, which gives greater prominence to the illuminated areas, on which a powerful focus of directed light is placed. These effects have a strong dramatism, which emphasizes the scenes represented, generally of a religious type, although mythological scenes, still lifes and vanitas also abound. One of its main representatives was Caravaggio, as well as Orazio and Artemisia Gentileschi, Bartolomeo Manfredi, Carlo Saraceni, Giovanni Battista Caracciolo, Pieter van Laer (il Bamboccio), Adam Elsheimer, Gerard van Honthorst, Georges de La Tour, Valentin de Boulogne, the Le Nain brothers and José de Ribera (lo Spagnoletto). 
Caravaggio was a pioneer in the dramatization of light, in scenes set in dark interiors with strong spotlights of directed light that used to emphasize one or more characters. With this painter, light acquired a structural character in painting, since, together with drawing and color, it would become one of its indispensable elements. He was influenced by Leonardo's chiaroscuro through The Virgin of the Rocks, which he was able to contemplate in the church of San Francesco Grande in Milan. For Caravaggio, light served to configure the space, controlling its direction and expressive force. He was aware of the artist's power to shape the space at will, so in composing a work he would establish beforehand which lighting effects he was going to use, generally opting for sharp contrasts between the figures and the background, with darkness as a starting point: the figures emerge from the dark background and it is the light that determines their position and their prominence in the scene represented. Caravaggesque light is conceptual, not imitative or symbolic, so it transcends materiality and becomes something substantial. It is a projected and solid light, which constitutes the basis of his spatial conception and becomes another volume in space. His main hallmark in depicting light was the diagonal entry of light, which he first used in Boy with a Basket of Fruit (1593-1594, Galleria Borghese, Rome). In The Fortune Teller (1595-1598, Musée du Louvre, Paris) he used a warm golden light of the sunset, which falls directly on the young man and obliquely on the gypsy woman. His pictorial maturity came with the canvases for the Contarelli Chapel in the church of San Luigi dei Francesi in Rome (1599-1600): The Martyrdom of Saint Matthew and The Calling of Saint Matthew. 
In the first, he established a composition formed by two diagonals defined by the illuminated planes and the shadows that form the volume of the figures, in a complex composition given cohesion by the light, which relates the figures to each other. In the second, a powerful beam of light entering diagonally from the upper right directly illuminates the figure of Matthew, a beam parallel to the raised arm of Jesus that seems to accompany his gesture; an open shutter of the central window cuts this beam of light at the top, leaving the left side of the image in semi-darkness. In works such as the Crucifixion of Saint Peter and the Conversion of Saint Paul (1600-1601, Cerasi Chapel, Santa Maria del Popolo, Rome) light makes objects and people glow, to the point that it becomes the true protagonist of the works; these scenes are immersed in light in a way that constitutes more than a simple attribute of reality: it is the medium through which reality manifests itself. In the final stage of his career he accentuated the dramatic tension of his works through a luminism of flashing effects, as in the Seven Works of Mercy (1607, Pio Monte della Misericordia, Naples), a nocturne with several spotlights of light that help to emphasize the acts of mercy depicted in simultaneous action. Artemisia Gentileschi trained with her father, Orazio Gentileschi, coinciding with the years when Caravaggio lived in Rome, whose work she could appreciate in San Luigi dei Francesi and Santa Maria del Popolo. Her work was channeled into tenebrist naturalism, assuming its most characteristic features: expressive use of light and chiaroscuro, dramatism of the scenes and figures of rounded anatomy. Her most famous work is Judith Beheading Holofernes (two versions: 1612–1613, Museo di Capodimonte, Naples; and 1620, Uffizi, Florence), where the light focuses on Judith, her maid and the Assyrian general, against complete darkness, emphasizing the drama of the scene. 
In the 1630s, established in Naples, her style adopted a more classicist component, without completely abandoning naturalism, with more diaphanous spaces and clearer, sharper atmospheres, although chiaroscuro remained an essential part of the composition, as a means to create space and give volume and expressiveness to the image. One of her best compositions, for the complexity of its lighting, is The Birth of Saint John the Baptist (1630, Museo del Prado, Madrid), where she mixes natural and artificial light: the light from the portal in the upper right part of the painting softens the light inside the room, in a "subtle transition of light values" – according to Roberto Longhi – that would later become common in Dutch painting. Adam Elsheimer was noted for his studies of light in landscape painting, with an interest in dawn and dusk lights, as well as night lighting and atmospheric effects such as mists and fogs. His light was strange and intense, with an enamel-like appearance typical of German painting, in a tradition ranging from Lukas Moser to Albrecht Altdorfer. His most famous painting is The Flight into Egypt (1609, Alte Pinakothek, Munich), a night scene that is considered the first moonlit landscape; four sources of light are visible in this work: the shepherds' bonfire, the torch carried by Saint Joseph, the moon and its reflection in the water; the Milky Way can also be perceived, whose depiction can likewise be considered the first done in a naturalistic way. Georges de La Tour was a magnificent interpreter of artificial light, generally lamp or candle light, with a visible and precise source, which he used to place inside the image, emphasizing its dramatic aspect. Sometimes, so as not to be dazzled, the characters place their hands in front of the candle, creating translucent effects on the skin, which acquires a reddish tone of great realism that proved his virtuosity in capturing reality. 
While his early works show the influence of Italian Caravaggism, from his stay in Paris between 1636 and 1643 he came closer to Dutch Caravaggism, more prone to the direct inclusion of the light source on the canvas. He thus began his most tenebrist period, with scenes of deep half-light where the light, generally from a candle, illuminates certain areas of the painting with greater or lesser intensity. In general, two types of composition can be distinguished: the fully visible light source (Job Mocked by his Wife, Musée Départemental des Vosges, Épinal; Woman Catching Fleas, Musée Historique Lorrain, Nancy; the Terff Magdalene, Musée du Louvre, Paris) or the light blocked by an object or character, creating a backlit illumination (the Fabius Magdalene, Fabius collection, Paris; The Dream of Saint Joseph, Musée des Beaux-Arts, Nantes; The Adoration of the Shepherds, Musée du Louvre, Paris). In his later works he reduced the characters to schematic figures of geometric appearance, like mannequins, to fully recreate the effects of light on masses and surfaces (The Repentance of Saint Peter, Cleveland Museum of Art; The Newborn, Musée des Beaux-Arts, Rennes; Saint Sebastian Tended by Saint Irene, parish church of Broglie). Despite its plausible appearance, La Tour's lighting is not fully naturalistic, but is filtered through the artist's will; at every moment he applies the amount of light and shadow needed to recreate the desired effect; in general, it is a serene and diffuse lighting, which brings out the volume without excessive drama. The light serves to unite the figures, to highlight the part of the painting that best suits the plot of the work; it is a timeless light of a poetic, transcendent character; it is just the light necessary to provide credibility, but it serves a more symbolic than realistic purpose. 
It is an unreal light, since no candle generates such a serene and diffuse glow: a conceptual and stylistic light, which serves only the compositional intention of the painter. Another French Caravaggist was Trophime Bigot, nicknamed Maître à la chandelle ("Master of the Candle") for his scenes of artificial light, in which he showed great expertise in the technique of chiaroscuro. The Valencian artist José de Ribera (nicknamed lo Spagnoletto), who lived in Naples, fully assumed the Caravaggesque light, with an anti-idealist style of pasty brushstrokes and dynamic effects of movement. Ribera assumed tenebrist illumination in a personal way, filtered through other influences, such as Venetian coloring and the compositional rigor of Bolognese classicism. In his early work he used the violent contrasts of light and shadow characteristic of tenebrism, but from the 1630s he evolved toward greater chromaticism and clearer, more diaphanous backgrounds. In contrast to the flat painting of Caravaggio, Ribera used a dense paste that gave more volume and emphasized the brightness. One of his best works, the Drunken Silenus (1626, Museo di Capodimonte, Naples), stands out for the flashes of light that illuminate the various characters, with special emphasis on the naked body of Silenus, illuminated by a flat light of soft, fleshy appearance. In addition to Ribera, Caravaggism in Spain had the figure of Juan Bautista Maíno, a Dominican friar who was drawing teacher of Philip IV, resident in Rome between 1598 and 1612, where he was a disciple of Annibale Carracci; his work stands out for its colorism and luminosity, as in The Adoration of the Shepherds (1611-1613, Museo del Prado, Madrid). Also noteworthy is the work of the still life painters Juan Sánchez Cotán and Juan van der Hamen. In general, Spanish naturalism treated light with a sense close to Caravaggism, but with a certain sensuality coming from the Venetian school and a detailing with Flemish roots. 
Francisco de Zurbarán developed a somewhat softened tenebrism, although one of his best works, Saint Hugo in the Refectory of the Carthusians (c. 1630, Museo de Bellas Artes de Sevilla), stands out for the presence of white, with a subtle play of light and shadow notable for the multiplicity of intensities applied to each figure and object. In Venice, Baroque painting did not produce such exceptional figures as in the Renaissance and Mannerism, but in the work of artists such as Domenico Fetti, Johann Liss and Bernardo Strozzi one can perceive the vibrant luminism and the enveloping atmospheres so characteristic of Venetian painting. The Caravaggist novelties had a special echo in Holland, where the so-called Caravaggist School of Utrecht emerged, a series of painters who assumed the description of reality and the chiaroscuro effects of Caravaggio as pictorial principles, on which they developed a new style based on tonal chromaticism and the search for new compositional schemes, resulting in a painting that stands out for its optical values. Among its members were Hendrick ter Brugghen, Dirck van Baburen and Gerard van Honthorst, all three trained in Rome. Ter Brugghen assumed the thematic repertoire of Caravaggio but in a sweeter tone, with sharp drawing, a grayish-silver chromatism and an atmosphere of soft, light clarity. Van Baburen sought full light effects rather than chiaroscuro contrasts, with intense volumes and contours. Honthorst was a skilled painter of night scenes, which earned him the nickname Gherardo delle Notti ("Gerard of the Nights"). In works such as Christ before the High Priest (1617), the Nativity (1622), The Prodigal Son (1623) or The Procuress (1625), he showed great mastery in the use of artificial light, generally from candles, with one or two light sources that illuminated the scene unevenly, highlighting the most significant parts of the painting and leaving the rest in semi-darkness. 
Of his Christ on the Column, Joachim von Sandrart said: "the brightness of the candles and lights illuminates everything with a naturalness that resembles life so closely that no art has ever reached such heights". One of the greatest exponents of the symbolic use of light was Rembrandt, an original artist with a strong personal stamp, with a style close to tenebrism but more diffused, without the marked contrasts between light and shadow typical of the Caravaggists, favoring instead a more subtle and diffuse penumbra. According to Giovanni Arpino, Rembrandt "invented light, not as heat, but as value. He invented light not to illuminate, but to make his world unapproachable". In general, he elaborated images where darkness predominated, illuminated in certain parts of the scene by a ray of zenithal light of divine connotation; if the light is inside the painting, it means that the world is circumscribed to the illuminated part and nothing exists outside this light. Rembrandtian light is the reflection of an external force, which strikes objects and causes them to radiate energy, like the relaying of a message. Although he started from tenebrism, his contrasts of light and shadow are not as sharp as those of Caravaggio; he preferred a kind of golden shadow that gives a mysterious air to his paintings. In Rembrandt, light was something structural, integrated into form, color and space, in such a way that it dematerializes bodies and plays with the texture of objects. It is a light not subject to the laws of physics, which he generally concentrated in one area of the painting, creating a glowing luminosity. In his work, light and shadow interact, dissolving the contours and deforming the forms, which become the sustaining object of the light. According to Wolfgang Schöne, in Rembrandt light and darkness are actually two types of light, one bright and the other dark. 
He used a canvas as a reflecting or diffusing screen, which he regulated as he wished to obtain the desired illumination in each scene. His concern for light led him not only to study it pictorially, but also to establish the correct placement of his paintings for optimal viewing; thus, in 1639 he advised Constantijn Huygens on the placement of his painting The Blinding of Samson: "hang this painting where there is strong light, so that it can be seen from a certain distance, and thus it will have the best effect". Rembrandt also masterfully captured light in his etchings, such as The Hundred Guilder Print and The Three Crosses, in which light is almost the protagonist of the scene. Rembrandt picked up the luminous tradition of the Venetian school, as did his compatriot Johannes Vermeer, although while the former stands out for his fantastic effects of light, the latter develops in his work a luminosity of great quality in the local tones. Vermeer imprinted his works – generally everyday scenes in interior spaces – with a pale luminosity that created placid and calm atmospheres. He used a technique called pointillé, a series of dots of pigment with which he enhanced the objects, often applying a luminosity that made the surfaces reflect the light in a special way. Vermeer's light softens the contours without losing the solidity of the forms, in a combination of softness and precision that few other artists have achieved. Nicknamed the "painter of light", Vermeer masterfully synthesized light and color; he knew how to capture the color of light like no one else. In his works, light is itself a color, while shadow is inextricably linked to light. Vermeer's light is always natural – he did not like artificial light – and generally has a tone close to lemon yellow, which together with dull blue and light gray were the main colors of his palette. 
It is the light that forms the figures and objects, and in conjunction with color it is what fixes the forms. As for the shadows, they are interspersed in the light, reversing the contrast: instead of fitting the luminous part of the painting into the shadows, it is the shadows that are cut out of the luminous space. Contrary to the practice of chiaroscuro, in which form is progressively lost in the half-light, Vermeer placed a foreground of dark color to increase the tonal intensity, which reaches its zenith in the middle light; from there he dissolved the color towards white, instead of towards black as was done in chiaroscuro. In Vermeer's work, the painting is an organized structure through which light circulates, absorbed and diffused by the objects on the scene. He builds the forms through the harmony between light and color, which is saturated, with a predominance of pure colors and cold tones. The light gives visual existence to the space, which in turn receives and diffuses it. Other prominent painters of the Low Countries were Frans Hals and the Fleming Jacob Jordaens. The former had a Caravaggist phase between 1625 and 1630, with a clear chromaticism and diffuse luminosity (The Merry Drinker, 1627–1628, Rijksmuseum, Amsterdam; Malle Babbe, 1629–1630, Gemäldegalerie, Berlin), later evolving to a more sober, dark and monochromatic style. Jordaens had a style characterized by a bright and fantastic coloring, with strong contrasts of light and shadow and a technique of dense impasto. Between 1625 and 1630 he had a period in which he deepened the luminous values of his images, in works such as The Martyrdom of Saint Apollonia (1628, Church of Saint Augustine, Antwerp) or The Fecundity of the Earth (1630, Royal Museums of Fine Arts of Belgium, Brussels). One should also mention Godfried Schalcken, a disciple of Gerard Dou who worked not only in his native country but also in England and Germany. 
An excellent portraitist, in many of his works he used the artificial light of candles, influenced by Rembrandt, as in Portrait of William III (1692–1697, Rijksmuseum, Amsterdam), Portrait of James Stuart, Duke of Lennox and Richmond (1692–1696, Leiden Collection, New York), Young Man and Woman Studying a Statue of Venus by Lamplight (c. 1690, Leiden Collection, New York) or Old Man Reading by Candlelight (c. 1700, Museo del Prado, Madrid). A genre that flourished exceptionally in Holland in this century was landscape painting, which, in line with the mannerist landscape painting of Pieter Brueghel the Elder and Joos de Momper, developed a new sensitivity to atmospheric effects and the reflections of the sun on water. Jan van Goyen was its first representative, followed by artists such as Salomon van Ruysdael, Jacob van Ruisdael, Meindert Hobbema, Aelbert Cuyp, Jan van de Cappelle and Adriaen van de Velde. Salomon van Ruysdael sought atmospheric capture, which he treated by tonalities, studying the light at different times of the day. His nephew Jacob van Ruisdael was endowed with a great sensitivity for natural vision, and his depressive character led him to elaborate images of great expressiveness, where the play of light and shadow accentuated the drama of the scene. His light is not the saturating, static light of the Renaissance, but a light in movement, perceptible in the effects of light and shadow in the clouds and their reflections on the plains, a light that led John Constable to formulate one of his lessons on art: "remember that light and shadow never stand still". His assistant was Meindert Hobbema, who differed from him in his chromatic contrasts and lively light effects, which reveal a certain nervousness of stroke. Aelbert Cuyp used a much lighter palette than his compatriots, with a warmer, more golden light, probably influenced by Jan Both's "Italianate landscape".
He stood out for his atmospheric effects, for the detail of the light reflections on objects and landscape elements, for the use of elongated shadows and for the use of the sun's rays diagonally and backlit, in line with the stylistic novelties produced in Italy, especially around the figure of Claude Lorrain. Another genre that flourished in Holland was the still life. One of its best representatives was Willem Kalf, author of still lifes of great precision in detail, which combined flowers, fruits and other foods with various objects, generally luxurious, such as vases, Turkish carpets and bowls of Chinese porcelain, emphasizing their play of light and shadow and the bright reflections on metallic and crystalline surfaces.

Classicism and full Baroque

Classicism emerged in Bologna, around the so-called Bolognese School, initiated by the brothers Annibale and Agostino Carracci. This trend was a reaction against Mannerism; it sought an idealized representation of nature, representing it not as it is, but as it should be. It pursued ideal beauty as its sole objective, drawing inspiration from classical Greco-Roman and Renaissance art. This ideal found a fitting subject of representation in the landscape, as well as in historical and mythological themes. In addition to the Carracci brothers, Guido Reni, Domenichino, Francesco Albani, Guercino and Giovanni Lanfranco stood out.
In the classicist trend, the use of light is paramount in the composition of the painting, although with slight nuances depending on the artist: from the Incamminati and the Academy of Bologna (Carracci brothers), Italian classicism split into several currents: one moved more towards decorativism, with the use of light tones and shiny surfaces, where the lighting is articulated in large luminous spaces (Guido Reni, Lanfranco, Guercino); another specialized in landscape painting and, starting from the Carracci influence – mainly the frescoes of Palazzo Aldobrandini – developed along two parallel lines: the first focused more on classical-style composition, with a certain scenographic character in the arrangement of landscapes and figures (Poussin, Domenichino); the other is represented by Claude Lorrain, with a more lyrical component and greater concern for the representation of light, not only as a plastic factor but as an agglutinating element of a harmonious conception of the work. Claude Lorrain was one of the baroque painters who best knew how to represent light in his works, to which he gave a primordial importance at the time of conceiving the painting: the light composition served firstly as a plastic factor, being the basis with which he organized the composition, with which he created space and time, with which he articulated the figures, the architectures, the elements of nature; secondly, it was an aesthetic factor, highlighting light as the main sensitive element, as the medium that attracts and envelops the viewer and leads him to a dream world, a world of ideal perfection recreated by the atmosphere of total serenity and placidity that Claude created with his light. 
Claude's light was direct and natural, coming from the sun, which he placed in the middle of the scene, in sunrises or sunsets that gently illuminated all parts of the painting, sometimes placing intense contrasts of light and shadow in certain areas, or backlighting that struck a particular element to emphasize it. The artist from Lorraine emphasized color and light over the material description of the elements, which to a great extent anticipates the luminous investigations of Impressionism. Claude's capture of light is unparalleled by any of his contemporaries: in the landscapes of Rembrandt or Ruisdael the light has more dramatic effects, piercing the clouds or flowing in oblique or horizontal rays, but in a directed manner, the source of which can be easily located. Claude's light, by contrast, is serene and diffuse; unlike the artists of his time, he gave it precedence whenever a stylistic choice had to be made. On numerous occasions he used the horizon line as a vanishing point, arranging there a focus of clarity that attracts the viewer, because that almost blinding luminosity acts as a focalizing element that brings the background closer to the foreground. The light is diffused from the background of the painting and, as it expands, it is enough by itself to create a sensation of depth, blurring the contours and degrading the colors to create the space of the painting. Claude preferred the serene, placid light of the sun, direct or indirect, but always through a soft and uniform illumination, avoiding sensational effects such as moonlight, rainbows or storms, which were nevertheless used by other landscape painters of his time.
His basic reference in the use of light is Elsheimer, but he differs from him in the choice of light sources and times represented: the German artist preferred exceptional light effects, nocturnal environments, moonlight or twilight, whereas Claude preferred more natural environments, the limpid light of dawn or the refulgence of a warm sunset. For his part, the Fleming Peter Paul Rubens represents serenity in the face of Tenebrist dramatism. He was a master in finding the precise tonality for the flesh tones of the skin, as well as its different textures and the multiple variants of the effects of brightness and the reflections of light on the flesh. Rubens had an in-depth knowledge of the different techniques and traditions related to light, and so he was able to assimilate both Mannerist iridescent light and Tenebrist focal light, internal and external light, homogeneous and dispersed light. In his work, light serves as an organizing element of the composition, in such a way that it agglutinates all the figures and objects in a unitary mass of the same light intensity, with different compositional systems, either with central or diagonal illumination or combining a light in the foreground with another in the background. In his early work he was influenced by Caravaggist chiaroscuro, but from 1615 he sought a greater luminosity based on the tradition of Flemish painting, accentuating the light tones and marking the contours more firmly. His images stand out for their sinuous movement, with atmospheres built with powerful lights that helped to organize the development of the action, combining the Flemish tradition with the Venetian coloring that he learned on his travels in Italy.
Perhaps where he experimented most in the use of light was in his landscapes, most of them painted in his old age, whose use of color and light with agile, vibrant brushstrokes influenced Velázquez and other painters of his time, such as Jordaens and Van Dyck, and artists of later periods such as Jean-Antoine Watteau, Jean-Honoré Fragonard, Eugène Delacroix, and Pierre-Auguste Renoir. Diego Velázquez was undoubtedly the most brilliant artist of his time in Spain, and one of the most internationally renowned. In the evolution of his style we can perceive a profound study of pictorial illumination, of the effects of light both on objects and on the environment, with which he reached heights of great realism in the representation of his scenes, a realism nevertheless not free of an air of classical idealization, revealing a clear intellectual background: for the artist, this was a vindication of the painter's craft as a creative and elevated activity. Velázquez was the architect of a space-light in which the atmosphere is a diaphanous matter full of light, freely distributed throughout a continuous space, without divisions of planes, in such a way that the light permeates the backgrounds, which acquire vitality and are as highlighted as the foreground. It is a world of instantaneous capture, alien to tangible reality, in which the light generates a dynamic effect that dilutes the contours, and together with the vibratory effect of the changing planes of light produces a sensation of movement. He usually alternated zones of light and shadow, creating a parallel stratification of space. Sometimes he even atomized the areas of light and shadow into small corpuscles, a precedent for impressionism. In his youth he was influenced by Caravaggio, evolving later towards a more diaphanous light, as shown in his two paintings of the Villa Medici, in which light filters through the trees.
Throughout his career he achieved a great mastery in capturing a type of light of atmospheric origin, of the irradiation of light and chromatic vibration, with a fluid technique that pointed to the forms rather than defining them, thus achieving a dematerialized but truthful vision of reality, a reality that transcends matter and is framed in the world of ideas. After the smoothly executed tenebrism and precise drawing of his first period in Seville (Vieja friendo huevos, 1618, National Gallery of Scotland, Edinburgh; El aguador de Sevilla, 1620, Apsley House, London), his arrival at the Madrid court marked a stylistic change influenced by Rubens and the Venetian school – whose work he was able to study in the royal collections – with looser brushstrokes and soft volumes, while maintaining a realistic tone derived from his youthful period. Finally, after his trip to Italy between 1629 and 1631, he reached his definitive style, in which he synthesized the multiple influences received, with a fluid technique of pasty brushstrokes and great chromatic richness, as can be seen in La fragua de Vulcano (1631, Museo del Prado, Madrid). The Surrender of Breda (1635, Museo del Prado, Madrid) was a first milestone in his mastery of atmospheric light, where color and luminosity achieve an accentuated protagonism. In works such as Pablo de Valladolid (1633, Museo del Prado, Madrid), he managed to define the space without any geometric reference, only with lights and shadows. The Sevillian artist was a master at recreating the atmosphere of enclosed spaces, as shown in Las Meninas (1656, Museo del Prado, Madrid), where he placed several spotlights: the light that enters through the window and illuminates the figures of the Infanta and her ladies-in-waiting, the light from the rear window that shines around the lamp hanger and the light that enters through the door in the background. 
In this work he constructed a plausible space by defining or diluting the forms according to the use of light and the nuance of color, in a display of technical virtuosity that has led the canvas to be considered one of the masterpieces in the history of painting. In a similar way, he succeeded in structuring space and forms by means of light planes in Las hilanderas (1657, Museo del Prado, Madrid). Another outstanding Spanish Baroque painter was Bartolomé Esteban Murillo, one of whose favorite themes was the Immaculate Conception, of which he produced several versions, generally with the figure of the Virgin within an atmosphere of golden light symbolizing divinity. He generally used translucent colors applied in thin layers, with an almost watercolor appearance, a procedure that denotes the influence of Venetian painting. After a youthful period of tenebrist influence, in his mature work he rejected chiaroscuro dramatism and developed a serene luminosity shown in all its splendor in his characteristic breaks of glory, of rich chromaticism and soft luminosity. The last period of this style was the so-called "full Baroque" (second half of the 17th and early 18th centuries), a decorative style in which the illusionist, theatrical and scenographic character of Baroque painting was intensified, with a predominance of mural painting – especially on ceilings – in which Pietro da Cortona, Andrea Pozzo, Giovanni Battista Gaulli (il Baciccio), Luca Giordano and Charles Le Brun stood out. It is in works such as the ceiling of the church of the Gesù, by Gaulli, or that of the Palazzo Barberini, by Cortona, "where the ability to combine extreme light and darkness in a painting was pushed to the limit," according to John Gage, who adds that "the Baroque decorator not only introduced into painting the contrasts between extreme darkness and extreme light, but also a careful gradation between the two."
Noteworthy is Andrea Pozzo's Glory of Saint Ignatius of Loyola (1691–1694), on the ceiling of the church of Saint Ignatius in Rome, a scene full of heavenly light in which Christ sends a ray of light into the heart of the saint, who in turn deflects it into four beams of light directed towards the four continents. In Spain, Francisco de Herrera el Mozo, Juan Carreño de Miranda, Claudio Coello and Francisco Ricci were exponents of this style.

18th Century

The 18th century was nicknamed the "Age of Enlightenment", as it was the period in which the Enlightenment emerged, a philosophical movement that defended reason and science against religious dogmatism. Art oscillated between the late Baroque exuberance of Rococo and neoclassicist sobriety, between artifice and naturalism. A certain autonomy of the artistic act began to take shape: art moved away from religion and the representation of power to become a faithful reflection of the artist's will, and focused more on the sensitive qualities of the work than on its meaning. In this century most national art academies were created, institutions in charge of preserving art as a cultural phenomenon, of regulating its study and conservation, and of promoting it through exhibitions and competitions; originally, they also served as training centers for artists, although over time they lost this function, which was transferred to private institutions. After the Académie royale de peinture et de sculpture, founded in Paris in 1648, this century saw the creation of the Royal Academy of Fine Arts of San Fernando in Madrid (1744), the Russian Academy of Arts in Saint Petersburg (1757), the Royal Academy of Arts in London (1768), etc. The art academies favored a classical and canonical style – academicism – often criticized for its conservatism, especially by the avant-garde movements that emerged between the 19th and 20th centuries.
During this period, when science was gaining greater interest among scholars and the general public, numerous studies of optics were carried out. In particular, the study of shadows was deepened, and sciagraphy emerged as the science that studies the perspective and two-dimensional representation of the forms produced by shadows. Claude-Nicolas Le Cat wrote in 1767: "the art of drawing proves that the mere gradation of the shadow, its distributions and its nuances with simple light, suffice to form the images of all objects". The entry on shadow in L'Encyclopédie, the great project of Diderot and d'Alembert, differentiates between several types of shadow: "inherent", that of the object itself; "cast", that which is projected onto another surface; "projected", that resulting from the interposition of a solid between a surface and the light source; and two types of tilted shadow, according to whether the angle lies on the vertical or the horizontal axis. It also classified light sources as "point", "ambient" and "extensive": point sources produce shadows with sharp edges, ambient light produces no shadow, and extensive sources produce soft-edged shadows divided into two areas: the "umbra", the fully darkened part from which the light source is entirely hidden, and the "penumbra", the partially darkened edge that receives only a portion of the light. Several treatises on painting were also written in this century that studied in depth the representation of light and shadow, such as those by Claude-Henri Watelet (L'Art de peindre, poème, avec des réflexions sur les différentes parties de la peinture, 1760) and Francesco Algarotti (Saggio sopra la pittura, 1764).
Pierre-Henri de Valenciennes (Élémens de perspective pratique, a l'usage des artistes, suivis de réflexions et conseils à un élève sur la peinture, et particulièrement sur le genre du paysage, 1799) made several studies on the rendering of light at various times of the day, and recorded the various factors affecting the different types of light in the atmosphere, from the rotation of the Earth to the degree of humidity in the environment and the various reflective characteristics of a particular place. He advised his students to paint the same landscape at different times of the day and especially recommended four distinctive moments of the day: morning, characterized by freshness; noon, with its blinding sun; twilight and its fiery horizon; and night with the placid effects of moonlight. Acisclo Antonio Palomino, in El Museo Pictórico y Escala Óptica (1715-1724), stated that light is "the soul and life of everything visible" and that "it is in painting that gives such an extension to sight that it not only sees the physical and real but also the apparent and feigned, persuading bodies, distances and bulks with the elegant arrangement of light and dark, shadows and lights". Rococo meant the survival of the main artistic manifestations of the Baroque, with a more emphasized sense of decoration and ornamental taste, which were taken to a paroxysm of richness, sophistication and elegance. Rococo painting had a special reference in France, in the court scenes of Jean-Antoine Watteau, François Boucher and Jean-Honoré Fragonard. Rococo painters preferred illuminated scenes in broad daylight or colorful sunrises and sunsets. Watteau was the painter of the fête galante, of court scenes set in bucolic landscapes, a type of shady landscape of Flemish heritage. Boucher, an admirer of Correggio, specialized in the female nude, with a soft and delicate style in which the light emphasizes the placidity of the scenes, generally mythological. 
Fragonard had a sentimental style of free technique, with which he elaborated gallant scenes of a certain frivolity. In the still life genre Jean-Baptiste-Siméon Chardin stood out, a virtuoso in the creation of atmospheres and light effects on objects and surfaces, generally with a soft, warm light achieved through glazes and fading, with which he created intimate atmospheres of deep shadows and soft gradients. In this century, one of the movements most concerned with the effects of light was Venetian vedutismo, a genre of urban views that meticulously depicted the canals, monuments and most characteristic places of Venice, alone or with the presence of the human figure, generally small in size and in large groups of people. The veduta is usually composed of wide perspectives, with a distribution of elements close to scenography and a careful use of light, which gathers the whole tradition of atmospheric representation, from Leonardo's sfumato to the chromatic ranges of Claude Lorrain's sunrises and sunsets. The work of Canaletto stands out; his sublime views of the Adriatic city captured with great precision the atmosphere of the city suspended over the water. The great precision and detail of his works was due in large part to the use of the camera obscura, a forerunner of photography. Another outstanding representative was Francesco Guardi, interested in the shimmering effects of light on the water and the Venetian atmosphere, with a light-touch technique that was a precursor of impressionism. The landscape genre continued the naturalistic experimentation begun in the Baroque in the Netherlands. Another reference was Claude Lorrain, whose influence was especially felt in England. The 18th century landscape incorporated the aesthetic concepts of the picturesque and the sublime, which gave the genre greater autonomy.
One of the first exponents was the French painter Michel-Ange Houasse, who settled in Spain and initiated a new way of understanding the role of light in the landscape: in addition to illuminating it, light "constructs" the landscape, configures it and gives it consistency, and determines the vision of the work, since the variation of the factors involved implies a specific and particular point of view. Claude Joseph Vernet specialized in seascapes, often painted in nocturnal settings by moonlight. He was influenced by Claude Lorrain and Salvator Rosa, from whom he inherited the concept of an idealized, sentimental landscape. The same type of landscape was developed by Hubert Robert, with a greater interest in the picturesque, as evidenced by his interest in ruins, which serve as the setting for many of his works. Landscape painting was also prominent in England, where the influence of Claude Lorrain was felt to such an extent that it largely determined the layout of the English garden. There was a great love of gardens there, so landscape painting was much sought after, unlike on the continent, where it was considered a minor genre. In this period many painters and watercolorists emerged who dedicated themselves to the transcription of the English landscape, capturing a new sensibility towards the luminous and atmospheric effects of nature. In this type of work the main artistic value was the capture of the atmosphere, and clients valued above all a vision comparable to the contemplation of a real landscape. Prominent artists were Richard Wilson, Alexander Cozens, John Robert Cozens, Robert Salmon, Samuel Scott, Francis Towne and Thomas Gainsborough. One of the 18th century painters most concerned with light was Joseph Wright of Derby, who was interested in the effects of artificial light, which he masterfully captured.
He spent some formative years in Italy, where he was interested in the effects of fireworks in the sky and painted the eruptions of Vesuvius. One of his masterpieces is An Experiment on a Bird in the Air Pump (1768, The National Gallery, London), where he places a powerful light source in the center that illuminates all the characters, perhaps a metaphor for the Enlightenment light that illuminates all human beings equally. The light comes from a candle hidden behind the glass jar used to perform the experiment, whose shadow falls next to a skull, both symbols of the transience of life often used in vanitas. Wright made several paintings with artificial lighting, which he called candlelight pictures, generally with violent contrasts of light and shadow. In addition – and especially in his paintings of scientific subjects, such as the one mentioned above or A Philosopher Lecturing on the Orrery (1766, Derby Museum and Art Gallery, Derby) – light symbolizes reason and knowledge, in keeping with the Enlightenment, the "Age of Enlightenment". In the transition between the 18th and 19th centuries, one of the most outstanding artists was Francisco de Goya, who evolved from a more or less rococo style towards a certain pre-Romanticism, always with a personal, expressive body of work of strongly intimate tone. Numerous scholars of his work have emphasized Goya's metaphorical use of light as the conqueror of darkness. For Goya, light represented reason, knowledge and freedom, as opposed to the ignorance, repression and superstition associated with darkness. He also said that in painting he saw "only illuminated bodies and bodies that are not, planes that advance and planes that recede, reliefs and depths".
The artist portrayed himself in his studio against the light of a large window that floods the room, and, as if that were not enough, wearing lighted candles on his hat (Autorretrato en el taller, 1793–1795, Real Academia de Bellas Artes de San Fernando, Madrid). At the same time, he felt a special predilection for nocturnal atmospheres, and in many of his works he took up a tradition that began with Caravaggist tenebrism and reinterpreted it in a personal way. According to Jeannine Baticle, "Goya is the faithful heir of the great Spanish pictorial tradition. In him, shadow and light create powerful volumes built in the impasto, clarified with brief luminous strokes in which the subtlety of the colors produces infinite variations". Among his early production, devoted mainly to the elaboration of cartoons for the Royal Tapestry Factory of Santa Barbara, El quitasol (1777, Museo del Prado, Madrid) stands out for its luminosity, in keeping with the popular and traditional tastes then in fashion at court: a boy shades a young woman with a parasol, with an intense chromatic contrast between the bluish and golden tones of the reflected light. Other works outstanding for their atmospheric light effects are La nevada (1786, Museo del Prado, Madrid) and La pradera de San Isidro (1788, Museo del Prado, Madrid). As painter of the king's chamber, his collective portrait La familia de Carlos IV (1800, Museo del Prado, Madrid) stands out, in which he seems to give a protocol order to the illumination: from the most powerful light centered on the kings in the central part, through the dimmer light on the rest of the family, to the penumbra in which the artist himself is portrayed in the left corner.
Of his mature work, Los fusilamientos del 3 de mayo de 1808 en la Moncloa (1814, Museo del Prado, Madrid) stands out, where he places the light source in a lantern located in the lower part of the painting, although it is its reflection in the white shirt of one of the executed men that becomes the most powerful focus of light, exalting his figure as a symbol of the innocent victim in the face of barbarism. The choice of night is a clearly symbolic factor, since it is related to death, a fact accentuated by the Christological appearance of the figure with his arms raised. Albert Boime commented on this work in his Historia social del arte. Among his last works is The Milkmaid of Bordeaux (1828, Museo del Prado, Madrid), where light is captured only with color, with a fluffy brushstroke that emphasizes the tonal values, a technique that points towards impressionism. Also between the two centuries, neoclassicism developed in France after the French Revolution, a style that favored the resurgence of classical forms, purer and more austere, as opposed to the ornamental excesses of the Baroque and Rococo. The discovery of the ruins of Pompeii and Herculaneum helped bring Greco-Latin culture back into fashion, along with an aesthetic ideology that advocated the perfection of classical forms as an ideal of beauty, generating a myth of classical perfection that still conditions the perception of art today. Neoclassical painting maintained an austere, balanced style, influenced by Greco-Roman sculpture and by figures such as Raphael and Poussin. Jacques-Louis David stood out, as well as François Gérard, Antoine-Jean Gros, Pierre-Paul Prud'hon, Anne-Louis Girodet-Trioson, Jean Auguste Dominique Ingres, Anton Raphael Mengs and José de Madrazo. Neoclassicism replaced the dramatic illumination of the Baroque with the restraint and moderation of classicism, with cold tones and a preponderance of drawing over color, and gave special importance to line and contour.
Neoclassical images put the idea before the feeling, the truthful description of reality before the imaginative whims of the Baroque artist. Neoclassical light is clear, cold and diffuse, bathing the scenes with uniformity, without violent contrasts; even so, chiaroscuro was sometimes used, intensely illuminating figures or certain objects in contrast with the darkness of the background. The light delimits the contours and the space, and generally gives an appearance of solemnity to the image, in keeping with the subjects treated, usually history paintings, mythological scenes and portraits. The initiator of this style was Jacques-Louis David, a sober artist who completely subordinated color to drawing. He meticulously studied the light composition of his works, as can be seen in The Oath of the Jeu de Paume (1791, Musée National du Château de Versailles) and The Rape of the Sabine Women (1794–1799, Musée du Louvre, Paris). In The Death of Marat (1793, Royal Museums of Fine Arts of Belgium, Brussels) he developed a play of light that shows the influence of Caravaggio. Anne-Louis Girodet-Trioson followed David's style, although his emotivism brought him closer to pre-Romanticism. He was interested in chromaticism and the concentration of light and shadow, as glimpsed in The Sleep of Endymion (1791, Musée du Louvre, Paris) and The Burial of Atala (1808, Musée du Louvre, Paris). Jean Auguste Dominique Ingres was a prolific author always faithful to classicism, to the point of being considered the champion of academic painting against 19th century romanticism. He devoted himself especially to portraits and nudes, which stand out for their purity of line, their marked contours and a chromatism close to enamel. Pierre-Paul Prud'hon assumed neoclassicism with a certain rococo influence, with a predilection for feminine voluptuousness inherited from Boucher and Watteau, while his work shows a strong influence of Correggio.
In his mythological paintings populated by nymphs, he showed a preference for twilight and lunar light, a dim, faint light that delicately bathes the female forms, whose white skin seems to glow. Landscape painting was considered a minor genre by the neoclassicists. Even so, it had several outstanding exponents, especially in Germany, where Joseph Anton Koch, Ferdinand Kobell and Wilhelm von Kobell are worth mentioning. Koch focused on the Alpine mountains, where he succeeded in capturing the cloudy atmosphere of the high mountains and the effects of sparkling light on plant and water surfaces. He usually incorporated the human presence, sometimes with a thematic pretext of a historical or literary type – such as Shakespeare's plays or the Ossian cycle. The light in his paintings is generally clear and cold, natural, without too much stridency. If Koch represented a type of idealistic landscape, heir to Poussin or Claude Lorrain, Ferdinand Kobell represents the realistic landscape, indebted to the Dutch Baroque landscape. His landscapes of valleys and plains with mountainous backgrounds are bathed in a translucent light, with intense contrasts between the various planes of the image. His son Wilhelm followed his style, with a greater concern for light, evident in his clear environments of cold light and elongated shadows, which gives his figures a hard consistency and a metallic appearance.

Contemporary Art

19th Century

In the 19th century an evolutionary dynamic began in which styles succeeded one another chronologically with increasing speed, and modern art emerged in opposition to academic art, with the artist placed at the forefront of the cultural evolution of humanity. The study of light was enriched by the appearance of photography and by new technological advances in artificial light, thanks to the arrival of gaslight at the beginning of the century, kerosene in the middle of the century and electricity at the end of the century.
These two phenomena brought about a new awareness of light, since this element configures visual appearance, shifting the concept of reality from the tangible to the perceptible.

Romanticism

The first style of the century was Romanticism, a movement of profound renewal in all artistic genres, which paid special attention to the field of spirituality, fantasy, sentiment and love of nature, along with a darker element of irrationality: attraction to the occult, madness, dreams. Popular culture, the exotic and the return to undervalued artistic forms of the past – especially medieval ones – were especially prized, and the landscape gained notoriety, becoming a protagonist in its own right. The Romantics held the idea of an art that arose spontaneously from the individual, emphasizing the figure of the "genius": art is the expression of the artist's emotions. The Romantics used a more expressive technique than neoclassical restraint allowed, modeling forms by means of impasto and glazes so as to release the expressiveness of the artist. In a certain pre-Romanticism we can place William Blake, an original writer and artist, difficult to classify, who devoted himself especially to illustration, in the manner of the ancient illuminators of codices. Most of Blake's images are set in a nocturnal world, in which light emphasizes certain parts of the image: a light of dawn or twilight, almost "liquid", unreal. Between Neoclassicism and Romanticism also stood Johann Heinrich Füssli, author of dreamlike images in a style influenced by Italian Mannerism, in which he used strong contrasts of light and shadow, with lighting of a theatrical character, like footlights. One of the pioneers of Romanticism was the prematurely deceased Frenchman Théodore Géricault, whose masterpiece, The Raft of the Medusa (1819, Musée du Louvre, Paris), presents a ray of light emerging from the stormy clouds in the background as a symbol of hope. 
The most prominent member of the movement in France was Eugène Delacroix, a painter influenced by Rubens and the Venetian school, who conceived of painting as a medium in which patches of light and color are related. He was also influenced by John Constable, whose painting The Hay Wain opened his eyes to a new sensitivity to light. In 1832 he traveled to Morocco, where he developed a new style that could be considered proto-Impressionist, characterized by the use of white to highlight light effects and a rapid technique of execution. In the field of landscape painting, John Constable and Joseph Mallord William Turner stood out, heirs of the rich tradition of English landscape painting of the 18th century. Constable was a pioneer in capturing atmospheric phenomena. Kenneth Clark, in Landscape into Art, credited him with the invention of the "chiaroscuro of nature", expressed in two ways: on the one hand, the contrast of light and shade that for Constable was essential in any landscape painting and, on the other, the sparkling effects of dew and breeze that the British painter captured so masterfully on his canvases, with a technique of interrupted strokes and touches of pure white made with a palette knife. Constable once said that "the form of an object is indifferent; light, shadow and perspective will always make it beautiful". Joseph Mallord William Turner was a painter with a great intuition for capturing the effects of light in nature, in environments that combine luminosity with atmospheric effects of great drama, as seen in Hannibal Crossing the Alps (1812, Tate Gallery, London). Turner had a predilection for violent atmospheric phenomena, such as storms, tidal waves, fog, rain, snow, fire and spectacles of destruction, in landscapes in which he made numerous experiments in chromaticism and luminosity, which gave his works an aspect of great visual realism. 
His technique was based on a colored light that dissolved the forms in a space-color-light relationship that gives his work an appearance of great modernity. According to Kenneth Clark, Turner "was the one who raised the key of color so that his paintings not only represented light, but also symbolized the nature of light". His early works still had a certain classical component, in which he imitated the style of artists such as Claude Lorrain, Richard Wilson, Adriaen van de Velde or Aelbert Cuyp. They are works in which he still represents light by means of contrast, executed in oil; however, his watercolors already pointed to what would be his mature style, characterized by the rendering of color and light in movement, with a clear tonality achieved by first applying a film of mother-of-pearl paint. In 1819 he visited Italy, whose light inspired him and induced him to elaborate images in which the forms dissolve in a misty luminosity, with pearly moonscapes and shades of yellow or scarlet. He then devoted himself to his most characteristic images, mainly coastal scenes in which he made a profound study of atmospheric phenomena. In Interior at Petworth (1830, British Museum, London) the basis of his design is already light and color; the rest is subordinated to these values. Of his later works Clark states that "Turner's imagination was capable of distilling, from light and color, poetry as delicate as Shelley's." Among his works are: San Giorgio Maggiore: At Dawn (1819, Tate Gallery), Regulus (1828, Tate Gallery), The Burning of the Houses of Lords and Commons (1835, Philadelphia Museum of Art), The Fighting Temeraire (1839, National Gallery), The Slave Ship (Slavers Throwing Overboard the Dead and Dying) (1840, Museum of Fine Arts, Boston), Twilight over a Lake (1840, Tate Gallery), Rain, Steam and Speed (1844, National Gallery), etc. 
Mention should also be made of Richard Parkes Bonington, a prematurely deceased artist, primarily a watercolorist and lithographer, who spent most of his life in Paris. He had a light, clear and spontaneous style. His landscapes show the same atmospheric sensibility as Constable and Turner, with a great delicacy in the treatment of light and color, to the point that he is considered a precursor of Impressionism. In Germany the figure of Caspar David Friedrich stands out, a painter with a pantheistic and poetic vision of nature, an uncorrupted and idealized nature in which the human figure plays only the role of spectator of its grandeur and infinity. From his beginnings, Friedrich developed a style marked by sure contours and subtle play of light and shadow, in watercolor, oil or sepia ink. One of his first outstanding works is The Cross in the Mountains (1808, Gemäldegalerie Neue Meister, Dresden), where a cross with Christ crucified stands on a pyramid of rocks against the light, in front of a sky furrowed with clouds and crossed by five beams of light that emerge from an invisible sun intuited behind the mountain, without it being clear whether it is sunrise or sunset; one of the beams generates reflections on the crucifix, from which it is understood to be a metal sculpture. During his early years he focused on landscapes and seascapes, with warm sunrise and sunset lights, although he also experimented with the effects of winter, stormy and foggy lights. A more mature work is Memorial Image for Johann Emanuel Bremer (1817, Alte Nationalgalerie, Berlin), a night scene with a strong symbolic content alluding to death: in the foreground appears a garden in twilight, with a fence through which the rays of the moon filter; the background, with a faint light of dawn, represents the afterlife. 
In Woman before the Rising Sun (1818-1820, Folkwang Museum, Essen) – also called Woman before the Setting Sun, since the time of day is not known with certainty – he showed one of his characteristic compositions, that of a human figure before the immensity of nature, a faithful reflection of the Romantic feeling of the sublime, with a sky of a reddish yellow of great intensity; it is usually interpreted as an allegory of life as a permanent Holy Communion, a kind of religious communion devised by August Wilhelm von Schlegel. Between 1820 and 1822 he painted several landscapes in which he captured the variation of light at different times of the day: Morning, Noon, Afternoon and Sunset, all of them in the Niedersächsisches Landesmuseum in Hannover. For Friedrich, dawn and dusk symbolized birth and death, the cycle of life. In Sea with Sunrise (1826, Hamburger Kunsthalle, Hamburg) he reduced the composition to a minimum, playing with light and color to create an image of great intensity, inspired by the engravings of the 16th and 17th centuries that recreated the appearance of light on the first day of Creation. One of his last works was The Ages of Life (1835, Museum der bildenden Künste, Leipzig), where the five characters are related to the five boats at different distances from the horizon, symbolizing the ages of life. Other outstanding works of his are: Abbey in the Oak Grove (1809, Alte Nationalgalerie, Berlin), Rainbow in a Mountain Landscape (1809-1810, Folkwang Museum, Essen), View of a Harbor (1815-1816, Charlottenburg Palace, Berlin), The Wanderer above the Sea of Fog (1818, Hamburger Kunsthalle, Hamburg), Moonrise by the Sea (1821, Hermitage Museum, Saint Petersburg), Sunset on the Baltic Sea (1831, Gemäldegalerie Neue Meister, Dresden), The Great Enclosure (1832, Gemäldegalerie Neue Meister, Dresden), etc. 
The Norwegian Johan Christian Dahl moved in the wake of Friedrich, although with a greater interest in light and atmospheric effects, which he captured in a naturalistic way, thus moving away from the romantic landscape. In his works he shows a special interest in the sky and clouds, as well as misty and moonlit landscapes. In many of his works the sky occupies almost the entire canvas, leaving only a narrow strip of land occupied by a solitary tree. Georg Friedrich Kersting made a transposition of Friedrich's pantheistic mysticism to interior scenes, illuminated by a soft light of lamps or candles that gently illuminate the domestic environments that he used to represent, giving these scenes an appearance that transcends reality to become solemn images with a certain mysterious air. Philipp Otto Runge developed his own theory of color, according to which he differentiated between opaque and transparent colors according to whether they tended to light or darkness. In his work this distinction served to highlight the figures in the foreground from the background of the scene, which was usually translucent, generating a psychological effect of transition between planes. This served to intensify the allegorical sense of his works, since his main objective was to show the mystical character of nature. Runge was a virtuoso in capturing the subtle effects of light, a mysterious light that has its roots in Altdorfer and Grünewald, as in his portraits illuminated from below with magical reflections that illuminate the character as if immersed in a halo. The Nazarene movement also emerged in Germany, a series of painters who between 1810 and 1830 adopted a style that was supposedly old-fashioned, inspired by Renaissance classicism – mainly Fra Angelico, Perugino and Raphael – and with an accentuated religious sense. The Nazarene style was eclectic, with a preponderance of drawing over color and a diaphanous luminosity, with limitation or even rejection of chiaroscuro. 
Its main representatives were Johann Friedrich Overbeck, Peter von Cornelius, Julius Schnorr von Carolsfeld and Franz Pforr. Also in Germany and the Austro-Hungarian Empire there was the Biedermeier style, a more naturalistic tendency halfway between Romanticism and Realism. One of its main representatives was Ferdinand Georg Waldmüller, an advocate of the study of nature as the only goal of painting. His paintings are brimming with a resplendent clarity, a meticulously elaborated light of almost palpable quality, an element that builds the reality of the painting, combined with well-defined shadows. Other artists of interest in this trend are Johann Erdmann Hummel, Carl Blechen, Carl Spitzweg and Moritz von Schwind. Hummel used light as a stylizing element, with a special interest in unusual light phenomena, from artificial light to glints and reflections. Blechen evolved from a typical Romanticism with a heroic and fantastic tone to a naturalism characterized by light, after a year's stay in Italy. Blechen's light is summery, a bright light that accentuates the volume of objects by giving them a tactile substance, combined with a skillful use of color. Spitzweg incorporated camera obscura effects into his paintings, in which light, whether sunlight or moonlight, appears in the form of beams that create effects that are sometimes unreal but of great visual impact. Schwind was the creator of a diaphanous and lyrical light, captured in resplendent luminous spaces with subtle tonal gradations in the reflections. Lastly, we should mention the Dane Christen Købke, author of landscapes of a delicate light reminiscent of the pointillé of Vermeer or the luminosity of Gerrit Berckheyde. In Italy in the 1830s there emerged the so-called Posillipo School, a group of anti-academic Neapolitan landscape painters, among whom Giacinto Gigante, Filippo Palizzi and Domenico Morelli stood out. 
These artists showed a new concern for light in the landscape, with a more truthful aspect, far from the classical canons, in which shimmering effects gain prominence. Inspired by Vedutism and picturesque painting, as well as by the work of the artist they considered their direct master, Anton Sminck van Pitloo, they used to paint from life, in compositions in which the chromatism stands out without losing the solidity of the drawing.

Realism

Romanticism was succeeded by Realism, a trend that emphasized reality, the description of the surrounding world, especially of workers and peasants in the new framework of the industrial era, with a certain component of social denunciation, linked to political movements such as utopian socialism. These artists moved away from the usual historical, religious or mythological themes to deal with the more mundane themes of modern life. One of the Realist painters most concerned with light was Jean-François Millet, influenced by Baroque and Romantic landscape painting, especially Caspar David Friedrich. He specialized in peasant scenes, often in landscapes set at dawn and dusk, as in On the Way to Work (1851, private collection), Shepherdess Watching Her Flock (1863, Musée d'Orsay, Paris) or A Norman Milkmaid at Gréville (1871, Los Angeles County Museum of Art). For the composition of his works he often used wax or clay figurines that he moved around to study the effects of light and volume. His technique relied on dense and vigorous brushwork, with strong contrasts of light and shadow. His masterpiece is The Angelus (1857, Musée d'Orsay, Paris): the evening setting of this work allows its author to emphasize the dramatic aspect of the scene, translated pictorially into non-contrasting tonalities, with the darkened figures standing out against the brightness of the sky, which enhances their volume and accentuates their outlines, resulting in an emotional vision that underscores the social message the artist wants to convey. 
One of his last works was Bird Hunters (1874, Philadelphia Museum of Art), a nocturnal setting in which some peasants dazzle birds with a torch to hunt them, in which the luminosity of the torch stands out, achieved with a dense application of the pictorial impasto. The champion of realism was Gustave Courbet, who in his training was nourished by Flemish, Dutch and Venetian painting of the 16th and 17th centuries, especially Rembrandt. His early works are still of romantic inspiration, in which he uses a dramatic light tone borrowed from the Flemish-Dutch tradition but reinterpreted with a more modern sensibility. His mature work, now fully realistic, shows the influence of the Le Nain brothers, and is characterized by large, meticulously worked works, with large shiny surfaces and a dense application of pigment, often done with a palette knife. At the end of his career he devoted himself more to landscape and nudes, which stand out for their luminous sensibility. Another reference was Honoré Daumier, painter, lithographer, and caricaturist with a strong satirical tone, loose and free stroke, with an effective use of chiaroscuro. In his paintings he was inspired by the light contrasts of Goya, giving his works little colorism and giving greater emphasis to light (The Fugitives, 1850; Barabbas, 1850; The Butcher, 1857; The Third Wagon, 1862). Linked to realism was the French landscape school of Barbizon (Camille Corot, Théodore Rousseau, Charles-François Daubigny, Narcisse-Virgile Díaz de la Peña), marked by a pantheistic feeling of nature, with concern for the effects of light in the landscape, such as the light that filters through the branches of trees. 
The most outstanding was Corot, who discovered light in Italy, where he dedicated himself to painting Roman landscapes outdoors, captured at different times of the day, in scenes of clean atmospheres in which he applied to the surfaces of the volumes the precise doses of light needed to achieve a panoramic vision in which the volumes stand out against the atmosphere. Corot had a predilection for a type of tremulous light reflected on water or filtered through the branches of trees, a formula that satisfied him and achieved great popularity with the public. Eugène Boudin, one of the first landscape painters to paint outdoors, especially seascapes, also stood out as an independent artist. He achieved great mastery in the elaboration of skies, shimmering and slightly misty skies of dim and transparent light, a light that is also reflected in the water with instantaneous effects that he knew how to capture with spontaneity and precision, with a fast technique that already pointed toward Impressionism – in fact, he was Monet's teacher. Naturalistic landscape painting had another outstanding representative in Germany, Adolph von Menzel, who was influenced by Constable and developed a style in which light is decisive for the visual aspect of his works, with a technique that was a precursor of Impressionism. Also noteworthy are his interior scenes with artificial light, in which he recreates a multitude of anecdotal details and luminous effects of all kinds, as in his Dinner after the Ball (1878, Alte Nationalgalerie, Berlin). Alongside him stands Hans Thoma, influenced by Courbet, who in his works combined the social vindication of Realism with a still somewhat Romantic feeling for the landscape. Thoma was an exponent of a "lyrical realism", with landscapes and paintings of peasant themes, usually set in his native Black Forest, characterized by the use of a silver-toned light. 
In the Netherlands there was the figure of Johan Barthold Jongkind, considered a pre-impressionist, whom Monet also considered his master. He was a great interpreter of atmospheric phenomena and of the play of light on water and snow, as well as of winter and night lights – his moonlit landscapes were highly valued. In Spain, Carlos de Haes, Agustín Riancho and Joaquín Vayreda deserve to be mentioned. Haes, of Belgian origin, traveled the entire Spanish geography to capture its landscapes, which he captured with an almost topographical detail. Riancho had a predilection for mountain scenery, with a coloring with a certain tendency to dark shades, free and spontaneous. Vayreda was the founder of the so-called Olot School. Influenced by the Barbizon School, he applied this style to the Girona landscape, with works of diaphanous and serene composition with a certain lyrical component of bucolic evocation. Also in Spain it is worth mentioning the work of Mariano Fortuny, who found his personal style in Morocco as a chronicler of the African War (1859-1860), where he discovered the colorfulness and exoticism that would characterize his work. Here he began to paint with quick sketches of luminous touches, with which he captured the action in a spontaneous and vigorous way, and which would be the basis of his style: a vibrantly executed colorism with flashing light effects, as is denoted in one of his masterpieces, La vicaría (1868-1870, Museo Nacional de Arte de Cataluña, Barcelona). Another landscape school was the Italian school of the Macchiaioli (Silvestro Lega, Giovanni Fattori, Telemaco Signorini), of anti-academic style, characterized by the use of stains (macchia in Italian, hence the name of the group) of color and unfinished forms, sketched, a movement that preceded Impressionism. These artists painted from life and had as their main objective the reduction of painting to contrasts of light and brilliance. 
According to Diego Martelli, one of the theorists of the group, "we affirmed that form did not exist and that, just as in light everything results from color and chiaroscuro, so it is a matter of obtaining tones, the effects of the true". The Macchiaioli revalued light contrasts and knew how to transcribe onto their canvases the power and clarity of the Mediterranean light. They captured like no one else the effects of the sun on objects and landscapes, as in the painting The Patrol by Giovanni Fattori, in which the artist uses a white wall as a luminous screen against which the figures are cut out. In Great Britain, the school of the Pre-Raphaelites emerged, who were inspired – as their name indicates – by the Italian painters before Raphael, as well as by the recently emerged art of photography, with exponents such as Dante Gabriel Rossetti, Edward Burne-Jones, John Everett Millais, William Holman Hunt and Ford Madox Brown. The Pre-Raphaelites sought a realistic vision of the world, based on images of great detail, vivid colors and brilliant workmanship; as opposed to the side lighting advocated by academicist painting, they preferred general lighting, which turned paintings into flat images, without great contrasts of light and shadow. To achieve maximum realism, they carried out numerous investigations, as in the painting The Rescue (1855, National Gallery of Victoria, Melbourne) by John Everett Millais, in which a fireman saves two girls from a fire, and for which the artist burned wood in his studio to find the right lighting. 
The almost photographic detail of these works led John Ruskin to say of William Holman Hunt's Strayed Sheep (1852, Tate Britain, London) that "for the first time in the history of art the absolutely faithful balance between color and shade is achieved, by which the actual brightness of the sun could be transported into a key by which possible harmonies with material pigments should produce on the mind the same impressions as are made by the light itself." Hunt was also the author of The Light of the World (1853, Keble College, Oxford University), in which light has a symbolic meaning, related to the biblical passage that identifies Christ with the phrase "I am the light of the world; he who follows me shall not walk in darkness, for he shall have the light of life" (John 8:12). This painter again portrayed the symbolic light of Jesus Christ in The Awakening Conscience (1853, Tate Britain), through the light of the garden streaming through the window. Romanticism and Realism were the first artistic movements to reject the official art of the time, the art taught in the academies – academicism – an art that was institutionalized and anchored in the past both in the choice of subjects and in the techniques and resources made available to the artist. In France, in the second half of the 19th century, this art was called art pompier ("fireman's art", a pejorative name derived from the fact that many authors represented classical heroes with helmets resembling firemen's helmets). Although the academies were at first in tune with the art of their time, so that one cannot speak of a distinct style, in the 19th century, as the evolutionary dynamics of styles moved away from the classical canons, academic art became constrained within a classicist style based on strict rules. Academicism was stylistically based on Greco-Roman classicism, but also on earlier classicist authors, such as Raphael, Poussin or Guido Reni. 
Technically, it was based on careful drawing, formal balance, perfect line, plastic purity and careful detailing, together with realistic and harmonious coloring. Many of its representatives had a special predilection for the nude as an artistic theme, as well as a special attraction to Orientalism. Its main representatives were William-Adolphe Bouguereau, Alexandre Cabanel, Eugène-Emmanuel Amaury-Duval and Jean-Léon Gérôme.

Impressionism

Light played a fundamental role in Impressionism, a style based on representing an image according to the "impression" that light produces on the eye. In contrast to academic art and its forms of representation based on linear perspective and geometry, the Impressionists sought to capture reality on the canvas as they perceived it visually, giving all the prominence to light and color. To this end, they used to paint outdoors (en plein air), capturing the various effects of light on the surrounding environment at different times of the day. They studied in depth the laws of optics and the physics of light and color. Their technique was based on loose brushstrokes and a combination of colors applied according to the viewer's vision, with a preponderance of contrast between the elementary colors (yellow, red and blue) and their complements (orange, green and violet). In addition, they used to apply the pigment directly on the canvas, without mixing, thus achieving greater luminosity and brilliance. Impressionism perfected the capture of light by means of fragmented touches of color, a procedure that had already been used to a greater or lesser extent by artists such as Giorgione, Titian, Guardi and Velázquez (it is well known that the Impressionists admired Velázquez, the genius of Las Meninas, whom they considered "the painter of painters"). 
For the Impressionists, light was the protagonist of the painting, so they began to paint from life, capturing at all times the variations of light on landscapes and objects, the fleeting "impression" of light at different times of the day, so they often produced series of paintings of the same place at different times. For this they dispensed with drawing and defined form and volume directly with color, in loose brushstrokes of pure tones, juxtaposed with each other. They also abandoned chiaroscuro and violent contrasts of light and shadow, for which they dispensed with colors such as black, gray or brown: the chromatic research of impressionism led to the discarding of black in painting, since they claimed that it is a color that does not exist in nature. From there they began to use a luminous range of "light on light" (white, blue, pink, red, violet), elaborating the shades with cold tones. Thus, the impressionists concluded that there is neither form nor color, the only real thing is the air-light relationship. In impressionist paintings the theme is light and its effects, beyond the anecdotal of places and characters. Impressionism was considerably influenced by research in the field of photography, which had shown that the vision of an object depends on the quantity and quality of light. Impressionist painters were especially concerned with artificial light: according to Juan Antonio Ramirez (Mass Media and Art History, 1976), "the surprise at the effect of the new phenomenon of artificial light in the street, in cafés, and in the living room, gave rise to famous paintings such as Manet's Un bar aux Folies Bergère (1882, Courtauld Gallery, London), Renoir's Dancing at the Moulin de la Galette (1876, Musée d'Orsay, Paris) and Degas' Women in a Café (1877, Musée d'Orsay, Paris). Such paintings show the lighted lanterns and that glaucous tonality that only artificial light produces". 
Numerous Impressionist works are set in bars, cafés, dances, theaters and other establishments, with lamps or candelabras of dim light that mixes with the smoky air of the atmosphere of these places, or candle lights in the case of theaters and opera houses. The main representatives were Claude Monet, Camille Pissarro, Alfred Sisley, Pierre-Auguste Renoir, and Edgar Degas, with an antecedent in Édouard Manet. The most strictly Impressionist painters were Monet, Sisley and Pissarro, the most concerned with capturing light in the landscape. Monet was a master in capturing atmospheric phenomena and the vibration of light on water and objects, with a technique of short brushstrokes of pure colors. He produced the greatest number of series of the same landscape at different times of the day, to capture all the nuances and subtle differences of each type of light, as in his series of The Station of Saint-Lazare, Haystacks, The Poplars, The Cathedral of Rouen, The Parliament of London, San Giorgio Maggiore or Water Lilies. His last works in Giverny on water lilies are close to abstraction, in which he achieves an unparalleled synthesis of light and color. In the mid-1880s he painted coastal scenes of the French Riviera with the highest degree of luminous intensity ever achieved in painting, in which the forms dissolve in pure incandescence and whose only subject is already the sensation of light. Sisley also showed a great interest in the changing effects of light in the atmosphere, with a fragmented touch similar to that of Monet. His landscapes are of great lyricism, with a predilection for aquatic themes and a certain tendency to the dissolution of form. Pissarro, on the other hand, focused more on a rustic-looking landscape painting, with a vigorous and spontaneous brushstroke that conveyed "an intimate and profound feeling for nature", as the critic Théodore Duret said of him. 
In addition to his countryside landscapes, he produced urban views of Paris, Rouen and Dieppe, and also painted series at various times of the day and night, such as those of the Avenue de l'Opéra and the Boulevard de Montmartre. Renoir developed a more personal style, notable for its optimism and joie de vivre. He evolved from a realism of Courbetian influence to an Impressionism of light and luminous colors, and for a time shared a style similar to that of Monet, with whom he spent several stays in Argenteuil. He differed from the latter especially in the greater presence of the human figure, an essential element for Renoir, as well as in the use of tones such as black that were rejected by the other members of the group. He liked the play of light and shadow, which he achieved by means of small patches, and attained great mastery in effects such as the beams of light filtering through the branches of trees, as seen in his Dance at the Moulin de la Galette (1876, Musée d'Orsay, Paris) and in Torso, Sunlight Effect (1875, Musée d'Orsay, Paris), where sunlight plays on the skin of a nude girl. Degas was a singular figure who, although he shared most of the Impressionist assumptions, never considered himself part of the group. Contrary to the preferences of his peers, he did not paint from life and used drawing as a compositional basis. His work was influenced by photography and Japanese prints, and from his beginnings he showed interest in nocturnal and artificial light, as he himself expressed: "I work a lot on night effects, lamps, candles, etc. The curious thing is not always to show the light source, but the effect of the light". In his series of works on dancers or horse races, he studied the effects of light in movement, in a disarticulated space in which the effects of lights and backlighting stand out. Many Impressionist works dealt almost exclusively with the effects of light on the landscape, which they tried to recreate as spontaneously as possible. 
However, this led in the 1880s to a certain reaction that sought a return to more classical canons of representation and to the figure as the basis of the composition. From then on, several styles derived from impressionism emerged, such as neo-impressionism (also called divisionism or pointillism) and post-impressionism. Neo-Impressionism took up the optical experimentation of Impressionism: the Impressionists used to blur the contours of objects by lowering the contrasts between light and shadow, which implied replacing objectual solidity with a disembodied luminosity, a process that culminated in Pointillism; in this technique there is no precise source of illumination, but rather each point is a light source in itself. The composition is based on juxtaposed ("divided") dots of pure color, which merge in the eye of the viewer at a given distance. When these juxtaposed colors were complementary (red-green, yellow-violet, orange-blue), a greater luminosity was achieved. Pointillism, based largely on the theories of Michel-Eugène Chevreul (The Law of Simultaneous Contrast of Colors, 1839) and Ogden Rood (Modern Chromatics, 1879), defended the exclusive use of pure and complementary colors, applied in small brushstrokes in the form of dots that composed the image on the viewer's retina at a certain distance. Its best exponents were Georges Seurat and Paul Signac. Seurat devoted his entire life to the search for a method that would reconcile science and aesthetics, a personal method that would transcend impressionism. His main concern was chromatic contrast, its gradation and the interaction between colors and their complementaries. He created a disc with all the tones of the rainbow united by their intermediate colors and placed the pure tones in the center, which he gradually lightened towards the periphery, where pure white was located, so that he could easily locate the complementary colors.
This disc allowed him to mix the colors in his mind before fixing them on the palette, thus reducing the loss of chromatic intensity and luminosity. In his works he first drew in black and white to achieve the maximum balance between light and dark masses, and applied the color by tiny dots that were mixed in the retina of the viewer by optical mixing. On the other hand, he took from Charles Henry his theory on the relationship between aesthetics and physiology, how some forms or spatial directions could express pleasure and pain; according to this author, warm colors were dynamogenic and cold ones inhibitory. From 1886 he focused more on interior scenes with artificial light. His work Chahut (1889–1890, Kröller-Müller Museum, Otterlo) had a powerful influence on Cubism for its way of modeling volumes in space through light, without the need to simulate a third dimension. Signac was a disciple of Seurat, although with a freer and more spontaneous style, not so scientific, in which the brilliance of color stands out. In his last years his works evolved to a search for pure sensation, with a chromatism of expressionist tendency, while he reduced the pointillist technique to a grid of tesserae of larger sizes than the divisionist dots. In Italy there was a variant – the so-called divisionisti – who applied this technique to scenes of greater social commitment, due to its link with socialism, although with some changes in technical execution, since instead of confronting complementary colors they contrasted them in terms of rays of light, producing images that stand out for their luminosity and transparency, as in the work of Angelo Morbelli. Gaetano Previati developed a style in which luminosity is linked to symbolism related to life and nature, as in his Maternity (1890-1891, Banca Popolare di Novara), generally with a certain component of poetic evocation. 
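The optical mixing that the divisionists relied on lends itself to a rough numerical sketch: juxtaposed dots of pure color, seen from far enough away, blend in the eye roughly as an average of their values. The short Python snippet below is an illustration only (real color perception is far more complex, and the function name is our own), averaging RGB dots the way a distant viewer's retina approximately would.

```python
# Rough sketch of the "optical mixing" the pointillists relied on:
# at viewing distance, small juxtaposed dots of pure colour blend in
# the eye, which a plain average of their RGB values loosely models.
# Illustrative only -- actual colour perception is far more complex.

def optical_mix(dots):
    """Average a list of (r, g, b) dots into the single tone a distant viewer sees."""
    n = len(dots)
    return tuple(round(sum(dot[i] for dot in dots) / n) for i in range(3))

# Complementary pairs cited by the divisionists: red-green, yellow-violet,
# orange-blue. Juxtaposed rather than mixed on the palette, each pair
# resolves into an intermediate tone while the individual dots keep
# their full purity and brilliance.
red, green = (255, 0, 0), (0, 255, 0)
yellow, violet = (255, 255, 0), (128, 0, 255)

print(optical_mix([red, green]))      # -> (128, 128, 0)
print(optical_mix([yellow, violet]))  # -> (192, 128, 128)
```

The point of the sketch is the contrast with pigment mixing: averaging happens only in the viewer's perception, so each dot on the canvas retains the luminosity of a pure color, which is what the divisionists claimed as the source of their paintings' brightness.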
Another member of the group, Vittore Grubicy de Dragon, wrote that "light is life and, if, as many rightly affirm, art is life, and light is a form of life, the divisionist technique, which tends to greatly increase the expressiveness of the canvas, can become the cradle of new aesthetic horizons for tomorrow". Post-impressionism was, rather than a homogeneous movement, a grouping of diverse artists initially trained in impressionism who later followed individual trajectories of great stylistic diversity. Its best representatives were Henri de Toulouse-Lautrec, Paul Gauguin, Paul Cézanne, and Vincent van Gogh. Cézanne established a compositional system based on geometric figures (cube, cylinder and pyramid), which would later influence Cubism. He also devised a new method of illumination, in which light is applied in the density and intensity of color, rather than in the transitional values between black and white. The one who experimented the most in the field of light was Van Gogh, author of works of strong drama and introspection, with sinuous, dense brushstrokes of intense color, in which he deformed reality and gave it a dreamlike air. Van Gogh's work shows influences as disparate as those of Millet and Hiroshige, while from the Impressionist school he was particularly influenced by Renoir. Already in his early works his interest in light is noticeable, which is why he gradually lightened his palette, until he practically reached a yellow monochrome of fierce, temperamental luminosity. In his early works, such as The Potato Eaters (1885, Van Gogh Museum, Amsterdam), the influence of Dutch realism, with its tendency to chiaroscuro and dense color applied in thick brushstrokes, is evident; here he created a dramatic atmosphere of artificial light that emphasizes the tragedy of the miserable situation of these workers marginalized by the Industrial Revolution.
Later his coloring became more intense, influenced by the divisionist technique, with a method of superimposing brushstrokes in different tones; for the most illuminated areas he used yellow, orange and reddish tones, seeking a harmonious relationship between them all. After settling in Arles in 1888 he was fascinated by the limpid Mediterranean light, and in his landscapes of that period he created clear and shining atmospheres with hardly any chiaroscuro. As was usual in impressionism, he sometimes made several versions of the same motif at different times of the day to capture its light variations. He also continued his interest in artificial and nocturnal light, as in The Night Café (1888, Yale University Art Gallery, New Haven), where the light of the lamps seems to vibrate thanks to the concentric halo-shaped circles with which he rendered their radiance; or Café Terrace at Night (1888, Kröller-Müller Museum, Otterlo), where the luminosity of the café terrace contrasts with the darkness of the sky, in which the stars seem like flowers of light. Light also plays a special role in his Sunflowers series (1888-1889), where he used all imaginable shades of yellow, which for him symbolized light and life, as he expressed in a letter to his brother Theo: "a sun, a light that, for lack of a better adjective, I can only define with yellow, a pale sulfur yellow, a pale lemon yellow". To highlight the yellow and orange, he used green and sky blue in the outlines, creating an effect of soft light intensity. In Italy during these years there was a movement called Scapigliatura (1860-1880), sometimes considered a predecessor of divisionism, characterized by its interest in the purity of color and the study of light. Artists like Tranquillo Cremona, Mosè Bianchi or Daniele Ranzoni tried to capture on canvas their feelings through chromatic vibrations and blurred contours, with characters and objects almost dematerialized.
Giovanni Segantini was a singular artist who combined a drawing of academicist tradition with a post-impressionist coloring in which light effects have great prominence. Segantini's specialty was the mountain landscape, which he painted outdoors, with a technique of strong brushstrokes and simple colors and a vibrant light that he found only in the high alpine mountains. In Germany, impressionism was represented by Fritz von Uhde, Lovis Corinth, and Max Slevogt. The first was more of a plenairist than strictly an impressionist; although he devoted himself to genre painting, especially of religious themes, more than to landscape, in these works he also showed a special sensitivity to light. Corinth had a rather eclectic career, from academic beginnings – he was a disciple of Bouguereau – through realism and impressionism, to a certain decadentism and an approach to Jugendstil, to finally end up in expressionism. Influenced by Rembrandt and Rubens, he painted portraits, landscapes and still lifes with a serene and brilliant chromatism. Slevogt assumed the fresh and brilliant chromatism of the Impressionists, although renouncing their fragmentation of colors; his technique was one of loose brushstrokes and energetic movement, with bold and original light effects that denote a certain influence of the baroque art of his native Bavaria. In Great Britain, the work of James Abbott McNeill Whistler, American by birth but established in London since 1859, stood out. His landscapes are the antithesis of the sunny French landscapes, as they recreate the foggy and taciturn English climate, with a preference for night scenes, images from which he nevertheless knew how to distill an intense lyricism, with artificial light effects reflected in the waters of the Thames. In the United States, it is worth mentioning the work of John Singer Sargent, Mary Cassatt, and Childe Hassam.
Sargent was an admirer of Velázquez and Frans Hals, and excelled as a society portraitist, with a virtuoso and elegant technique, both in oil and watercolor, the latter mainly in landscapes of intense color. Cassatt lived for a long time in Paris, where she was connected with the Impressionist circle, with whom she shared the themes more than the technique, and developed an intimate and sophisticated body of work, influenced by Japanese prints. Hassam's main motif was New York life, rendered in a fresh but somewhat cloying style. Mention should also be made of Scandinavian impressionism, many of whose artists were trained in Paris. These painters had a special sensitivity to light, perhaps due to its absence in their native lands, and they traveled to France and Italy attracted by the "light of the south". The main exponents were Peder Severin Krøyer, Akseli Gallen-Kallela, and Anders Zorn. Krøyer showed a special interest in highly complex lighting effects, such as the mixing of natural and artificial light. Gallen-Kallela was an original artist who later approached symbolism, with a personal, expressive and stylized painting with a tendency towards romanticism and a special interest in Finnish folklore. Zorn specialized in portraits, nudes and genre scenes, with a brilliant brushstroke of vibrant luminosity. In Russia, Valentin Serov and Konstantin Korovin should be mentioned. Serov had a style similar to that of Manet or Renoir, with a taste for intense chromatism and light reflections, a bright light that extols the joy of life. Korovin painted both urban and natural landscapes in which he elevates a simple sketch of chromatic impression to the category of a work of art. In Spain, the work of Aureliano de Beruete and Darío de Regoyos stands out. Beruete was a disciple of Carlos de Haes, so he was trained in the realist landscape, but adopted the impressionist technique after a period of training in France.
An admirer of Velázquez's light, he knew how to apply it to the Castilian landscape – especially the mountains of Madrid – with his own personal style. Regoyos also trained with Haes and developed an intimate style halfway between pointillism and expressionism.

Luminism and symbolism

From the mid-19th century until practically the transition to the 20th century, various styles emerged that placed special emphasis on the representation of light, which is why they were generically referred to as "luminism", with various national schools in the United States and several European countries or regions. The term luminism was introduced by John Ireland Howe Baur in 1954 to designate the landscape painting done in the United States between 1840 and 1880, which he defined as "a polished and meticulous realism in which there are no noticeable brushstrokes and no trace of impressionism, and in which atmospheric effects are achieved by infinitely careful gradations of tone, by the most exact study of the relative clarity of nearer and more distant objects, and by an accurate rendering of the variations of texture and color produced by direct or reflected rays". The first was American Luminism, which gave rise to a group of landscape painters generally grouped in the so-called Hudson River School, in which we can include to a greater or lesser extent Thomas Cole, Asher Brown Durand, Frederic Edwin Church, Albert Bierstadt, Martin Johnson Heade, Fitz Henry Lane, John Frederick Kensett, James Augustus Suydam, Francis Augustus Silva, Jasper Francis Cropsey and George Caleb Bingham. In general, their works were based on grandiose compositions, with a horizon line of great depth and a sky of veiled aspect, with atmospheres of strong expressiveness. Their light is serene and peaceful, reflecting a mood of love for nature, a nature that in the United States of the time was largely virgin and paradisiacal, yet to be explored.
It is a transcendent light, of spiritual significance, whose radiance conveys a message of communion with nature. Although they use a classical structure and composition, the treatment of light is original because of the infinity of subtle variations in tonality, achieved through a meticulous study of the natural environment of their country. According to Barbara Novak, Luminism is a more serene form of the romantic aesthetic concept of the sublime, which had its translation in the deep expanses of the North American landscape. Some historians differentiate between pure Luminism and Hudson River School landscape painting: in the former, the landscape – more centered in the New England area – is more peaceful, more anecdotal, with delicate tonal gradations characterized by a crystalline light that seems to emanate from the canvas, in neat brushstrokes that seem to recreate the surface of a mirror and in compositions in which the excess of detail is unreal due to its straightness and geometrism, resulting in an idealization of nature. Thus understood, Luminism would encompass Heade, Lane, Kensett, Suydam and Silva. Hudson River landscape painting, on the other hand, would have a more cosmic vision and a predilection for a wilder and more grandiloquent nature, with more dramatic visual effects, as seen in the work of Cole, Durand, Church, Bierstadt, Cropsey and Bingham. It must be said, however, that neither group ever accepted these labels. Thomas Cole was the pioneer of the school. English by birth, one of his main references was Claude Lorrain. Settled in New York in 1825, he began to paint landscapes of the Hudson River area, with the aim of achieving "an elevated style of landscape" in which the moral message was equivalent to that of history painting. He also painted biblical subjects, in which light has a symbolic component, as in his Expulsion from the Garden of Eden (1828, Museum of Fine Arts, Boston). 
Durand was a little older than Cole and, after Cole's premature death, was considered the best American landscape painter of his time. An engraver by trade, from 1837 he turned to natural landscape painting, with a more intimate and picturesque vision of nature than Cole's allegorical one. Church was Cole's first disciple; Cole transmitted to him his vision of a majestic and exuberant nature, which Church reflected in his scenes of the American West and the South American tropics. Bierstadt, of German origin, was influenced by Turner, whose atmospheric effects are seen in works such as Among the Sierra Nevada, California (1868, Smithsonian American Art Museum, Washington D. C.), a lake between mountains seen after a storm, with the sun's rays breaking through the clouds. Heade devoted himself to the country landscapes of Massachusetts, Rhode Island and New Jersey, in meadows of endless horizons under clear or cloudy skies, with the light of various times of day, sometimes refracted by humid atmospheres. Fitz Henry Lane is considered the greatest exponent of luminism. Disabled since childhood by polio, he focused on the landscape of his native Gloucester (Massachusetts), with works that denote the influence of the English seascape painter Robert Salmon, in which light has a special role: a placid light that gives a sense of eternity, of time stopped in serene perfection and harmony. Suydam focused on the coastal landscapes of New York and Rhode Island, in which he was able to reflect the light effects of the Atlantic coast. Kensett was influenced by Constable and devoted himself to the New England landscape with a special focus on the luminous reflections of the sky and the sea. Silva also excelled in the seascape, a genre in which he masterfully captured the subtle gradations of light in the coastal atmosphere.
Cropsey combined the panoramic effect of the Hudson River School with the more serene luminism of Lane and Heade, with a meticulous and somewhat theatrical style. Bingham masterfully captured in his scenes of the Far West the limpid and clear light of dawn, his favorite when recreating scenes with American Indians and pioneers of the conquest of the West. Winslow Homer, considered the best American painter of the second half of the 19th century, deserves special mention; he excelled in both oil and watercolor, and in both landscape and popular scenes of American society. One of his favorite genres was the seascape, in which he displayed a great interest in atmospheric effects and the changing lights of the day. His painting Moonlight, Wood Island Light (1894, Metropolitan Museum of Art, New York) was painted entirely by moonlight, in five hours of work. Another important school was Belgian Luminism. In Belgium, the influence of French Impressionism was strongly felt, initially in the work of the group called Les Vingt, as well as in the School of Tervueren, a group of landscape painters who already showed their interest in light, especially in atmospheric effects, as can be seen in the work of Isidore Verheyden. Later, Pointillism was the main influence on Belgian artists of the time, a trend embraced by Émile Claus and Théo van Rysselberghe, the main representatives of Belgian Luminism. Claus adopted Impressionist techniques, although he maintained academic drawing as the basis for his compositions, and in his work – mainly landscapes – he showed great interest in the study of the effects of light in different atmospheric conditions, with a style that sometimes recalls Monet. Rysselberghe was influenced by Manet, Degas, and Whistler, as well as by the Baroque painter Frans Hals and by Spanish painting. His technique was one of loose and vigorous brushwork, with great luminous contrasts.
A luminist school also emerged in the Netherlands, more closely linked to the incipient Fauvism, in which Jan Toorop, Leo Gestel, Jan Sluyters, and the early work of Piet Mondrian stood out. Toorop was an eclectic artist, who combined different styles in the search for his own language, such as symbolism, modernism, pointillism, Gauguinian synthetism, Beardsley's linearism, and Japanese printmaking. He was especially devoted to allegorical and symbolic themes and, since 1905, to religious themes. In Germany, Max Liebermann received an initial realist influence – mainly from Millet – and a slight impressionist inclination towards 1890, until he ended up in a luminism of personal inspiration, with violent brushstrokes and brilliant light, a light of his own research with which he experimented until his death in 1935. In Spain, luminism developed especially in Valencia and Catalonia. The main representative of the Valencian school was Joaquín Sorolla, although the work of Ignacio Pinazo, Teodoro Andreu, Vicente Castell and Francisco Benítez Mellado is also noteworthy. Sorolla was a master at capturing the light in nature, as is evident in his seascapes, painted with a gradual palette of colors and a variable brushstroke, wider for specific shapes and smaller to capture the different effects of light. An interpreter of the Mediterranean sun like no other, a French critic said of him that "never has a paintbrush contained so much sun". After a period of training, in the 1890s he began to consolidate his style, based on a genre theme with a technique of rapid execution, preferably outdoors, with a thick brushstroke, energetic and impulsive, and with a constant concern for the capture of light, on which he did not cease to investigate its more subtle effects. La vuelta de la pesca (1895) is the first work that shows a particular interest in the study of light, especially in its reverberation in the water and in the sails moved by the wind. 
It was followed by Pescadores valencianos (1895), Cosiendo la vela (1896) and Comiendo en la barca (1898). In 1900 he visited with Aureliano de Beruete the Universal Exhibition in Paris, where he was fascinated by the intense chromatism of the Nordic artists, such as Anders Zorn, Max Liebermann or Peder Severin Krøyer; from then on he intensified his coloring and, especially, his luminosity, with a light that invaded the whole painting, emphasizing the blinding whites, as in Jávea (1900), Idilio (1900), Playa de Valencia (1902, in two versions, morning and sunset), Evening Sun (1903), The Three Sails (1903), Children at the Seashore (1903), Fisherman (1904), Summer (1904), The White Boat (1905), Bathing in Jávea (1905), etc. These are mostly seascapes, bathed in a warm Mediterranean light; he felt a special predilection for the more golden light of September. From 1906 he lowered the intensity of his palette, with a more nuanced tonality and a predilection for mauve tints; he continued with the seascapes, but increased his production of other types of landscapes, as well as gardens and portraits. He summered in Biarritz, and the pale, soft light of the Atlantic Ocean led him to lower the luminosity of his works. He also continued with his Valencian scenes: Paseo a orillas del mar (1909), Después del baño (1909). Between 1909 and 1910 his stays in Andalusia induced him to blur the contours, with a technique close to pointillism, with a predominance of white, pink, and mauve. Among his last works is La bata rosa (1916), in which he unleashes an abundance of light that filters through every part of the canvas, favoring light and color over the treatment of the contours, which appear blurred. The Luminist School of Sitges emerged in Catalonia, active in this town in the Garraf between 1878 and 1892. Its most prominent members were Arcadi Mas i Fondevila, Joaquim de Miró, Joan Batlle i Amell, Antoni Almirall and Joan Roig i Soler.
Opposed in a certain way to the Olot School, whose painters treated the landscape of the interior of Catalonia with a softer and more filtered light, the Sitgetan artists opted for the warm and vibrant Mediterranean light and the atmospheric effects of the Garraf coast. Heirs to a large extent of Fortuny, the members of this school sought to faithfully reflect the luminous effects of the surrounding landscape, in harmonious compositions that combined verism and a certain poetic and idealized vision of nature, with a subtle chromaticism and a fluid brushstroke that was sometimes described as impressionist. The Sitges School is generally considered a precursor of Catalan modernism: two of its main representatives, Ramon Casas and Santiago Rusiñol, spent several seasons in the town of Sitges, where they adopted the custom of painting d'après nature and assumed as the protagonist of their works the luminosity of the environment that surrounded them, although with other formal and compositional solutions in which the influence of French painting is evident. Casas studied in Paris, where he was trained in impressionism, with special influence of Degas and Whistler. His technique stands out for the synthetic brushstroke and the somewhat blurred line, with a theme focused preferably on interiors and outdoor images, as well as popular scenes and social vindication. Rusiñol showed a special sensitivity for the capture of light especially in his landscapes and his series of Gardens of Spain – he especially loved the gardens of Mallorca (the sones) and Granada – in which he developed a great ability for the effects of light filtered between the branches of the trees, creating unique environments where light and shadow play capriciously. Likewise, Rusiñol's light shows the longing for the past, for the time that flees, for the instant frozen in time whose memory will live on in the artist's work. 
From the 1880s until the turn of the century, symbolism was a fantastic and dreamlike style that emerged as a reaction to the naturalism of the realist and impressionist currents, placing special emphasis on the world of dreams, as well as on satanic and terrifying aspects, sex and perversion. A main characteristic of symbolism was aestheticism, a reaction to the prevailing utilitarianism of the time and to the ugliness and materialism of the industrial era. Symbolism gave art and beauty an autonomy of their own, synthesized in Théophile Gautier's formula "art for art's sake" (L'art pour l'art). This current was also linked to modernism (also known as Art Nouveau in France, Modern Style in the United Kingdom, Jugendstil in Germany, Sezession in Austria or Liberty in Italy). Symbolism was an anti-scientific and anti-naturalist movement, so light lost objectivity and was used as a symbolic element, in conjunction with the rest of the visual and iconographic resources of this style. It is a transcendent light, which behind the material world suggests a spirituality, whether religious or pantheistic, or perhaps simply a state of mind of the artist, a feeling, an emotion. Light, by its dematerialization, exerted a powerful influence on these artists, a light far removed from the physical world in its conception, although for its execution they often made use of impressionist and pointillist techniques. The movement originated in France with figures such as Gustave Moreau, Odilon Redon and Pierre Puvis de Chavannes. Moreau was still trained in romanticism under the influence of his teacher, Théodore Chassériau, but evolved a personal style in both subject matter and technique, with mystical images with a strong component of sensuality, a resplendent chromaticism with an enamel-like finish and the use of a chiaroscuro of golden shadows. Redon developed a fantastic and dreamlike theme, influenced by the literature of Edgar Allan Poe, which largely preceded surrealism. 
Until the age of fifty he worked almost exclusively in charcoal drawing and lithography, although he later became an excellent colorist, both in oil and pastel. Puvis de Chavannes was an outstanding muralist, a procedure that suited him well to develop his preference for cold tones, which gave the appearance of fresco painting. His style was more serene and harmonious, with an allegorical theme evoking an idealized past, simple forms, rhythmic lines and a subjective coloring, far from naturalism. In France there was also the movement of the Nabis ("prophets" in Hebrew), formed by Paul Sérusier, Édouard Vuillard, Pierre Bonnard, Maurice Denis and Félix Vallotton. This group was influenced by Gauguin's rhythmic scheme and stood out for an intense chromatism of strong expressiveness. Another focus of symbolism was Belgium, where the work of Félicien Rops, Fernand Khnopff and William Degouve de Nuncques should be noted. The first was a painter and graphic artist of great imagination, with a predilection for a theme centered on perversity and eroticism. Khnopff developed a dreamlike-allegorical theme of women transformed into angels or sphinxes, with disturbing atmospheres of great technical refinement. Degouve de Nuncques elaborated urban landscapes with a preference for nocturnal settings, with a dreamlike component precursor of surrealism: his work The Blind House (1892, Kröller-Müller Museum, Otterlo) influenced René Magritte's The Empire of Lights (1954, Royal Museums of Fine Arts of Belgium, Brussels). In Central Europe, the Swiss Arnold Böcklin and Ferdinand Hodler and the Austrian Gustav Klimt stood out. Böcklin specialized in a theme of fantastic beings, such as nymphs, satyrs, tritons or naiads, with a somber and somewhat morbid style, such as his painting The Island of the Dead (1880, Metropolitan Museum of Art, New York), where a pale, cold and whitish light envelops the atmosphere of the island where Charon's boat is headed. 
Hodler evolved from a certain naturalism to a personal style he called "parallelism", characterized by rhythmic schemes in which line, form and color are reproduced in a repetitive way, with simplified and monumental figures. It was in his landscapes that he showed the greatest luminosity, with pure and vibrant coloring. Klimt had an academic training but arrived at a personal style that synthesized impressionism, modernism and symbolism. He had a preference for mural painting, with an allegorical theme tending towards eroticism, and a decorative style populated with arabesques, butterfly wings or peacocks, with a taste for gold that gave his works an intense luminosity. In Italy, it is worth mentioning Giuseppe Pellizza da Volpedo, formed in the divisionist environment, but who evolved to a personal style marked by an intense and vibrant light, whose starting point is his work Lost Hopes (1894, Ponti-Grün collection, Rome). In The Rising Sun or the Sun (1903-1904, National Gallery of Modern Art, Rome) he carried out a prodigious exercise in the exaltation of light: a refulgent dawn light that peeks over a mountainous horizon and seems to burst into a myriad of rays spreading in all directions, dazzling the viewer. A symbolic reading can be made of this work, given the social and political commitment of the artist, since the rising sun was taken by socialism as a metaphor for the new society to which this ideology aspired. In the Scandinavian sphere, it is worth remembering the Norwegian Christian Krohg and the Danes Vilhelm Hammershøi and Jens Ferdinand Willumsen. Krohg combined natural and artificial lights, often with theatrical effects and certain unreal connotations, as in The Sleeping Seamstress (1885, Nasjonalgalleriet, Oslo), where the double presence of a lamp next to a window through which daylight enters provokes a sensation of timelessness, of temporal indefinition.
Hammershøi was a virtuoso in the handling of light, which he considered the main protagonist of his works. Most of his paintings were set in interior spaces with light filtered through doors or windows, with figures generally seen from behind. Willumsen developed a personal style based on the influence of Gauguin, with a taste for bright colors, as in After the Storm (1905, Nasjonalgalleriet, Oslo), a seascape with a dazzling sun that seems to explode in the sky. Finally, it is worth mentioning a phenomenon between the 19th and 20th centuries that was a precedent for avant-garde art, especially in its anti-academic component: naïf art ("naïve" in French), a term applied to a series of self-taught painters who developed a spontaneous style, alien to the technical and aesthetic principles of traditional painting, sometimes labeled as childish or primitive. One of its best representatives was Henri Rousseau, a customs officer by trade, who produced a personal body of work, with a poetic tone and a taste for the exotic, in which he disregarded perspective and resorted to unreal-looking lighting, without shadows or perceptible light sources, a type of image that influenced artists such as Picasso or Kandinsky and movements such as metaphysical painting and surrealism.

20th Century

The art of the 20th century underwent a profound transformation: in a more materialistic, more consumerist society, art was directed to the senses, not to the intellect. The avant-garde movements arose, which sought to integrate art into society through a greater interrelation between artist and spectator, since it was the latter who interpreted the work, and could discover meanings that the artist did not even know. Avant-gardism rejected the traditional methods of optical representation – Renaissance perspective – to vindicate the two-dimensionality of painting and the autonomous character of the image, which implied the abandonment of spatial and light contrasts.
In their place, light and shadow would no longer be instruments of a technique of spatial representation, but integral parts of the image, of the conception of the work as a homogeneous whole. Other artistic media such as photography, film and video had a notable influence on the art of this century, as did, in relation to light, the installation, one of whose variants is light art. At the same time, the new interrelationship with the spectator means that the artist does not reflect what he sees, but lets the spectator see his vision of reality, which will be interpreted individually by each person. Advances in artificial light (carbon and tungsten filaments, neon lights) led society in general to a new sensitivity to luminous impacts and, for artists in particular, to a new reflection on the technical and aesthetic properties of the new technological advances. Many artists of the new century experimented with all kinds of lights and their interrelation, such as the mixture and interweaving of natural and artificial lights, the control of the focal point, dense atmospheres, shaded or transparent colors and other types of sensorial experiences, already initiated by the impressionists but which in the new century acquired a category of their own.

Avant-garde

The emergence of the avant-garde at the turn of the century brought a rapid succession of artistic movements, each with a particular technique and a particular vision of the function of light and color in painting: fauvism and expressionism were heirs of post-impressionism and treated light at the maximum of its saturation, with strong chromatic contrasts and the use of complementary colors for shadows; cubism, futurism and surrealism had in common a subjective use of color, giving primacy to the expression of the artist over the objectivity of the image. One of the first movements of the 20th century concerned with light and, especially, color, was Fauvism (1904-1908).
This style involved experimentation in the field of color, which was conceived in a subjective and personal way, applying emotional and expressive values to it, independent of nature. For these artists, colors had to generate emotions, through a subjective chromatic range and a brilliant execution. In this movement a new conception of pictorial illumination arose, which consisted in the negation of shadows; the light comes from the colors themselves, which acquire an intense and radiant luminosity, whose contrast is achieved through the variety of pigments used. Fauvist painters include Henri Matisse, Albert Marquet, Raoul Dufy, André Derain, Maurice de Vlaminck and Kees van Dongen. Perhaps the most gifted was Matisse, who "discovered" light in Collioure, where he understood that intense light eliminates shadows and highlights the purity of colors; from then on he used pure colors, to which he gave an intense luminosity. According to Matisse, "color contributes to expressing light, not its physical phenomenon but the only light that exists in fact, that of the artist's brain". One of his best works is Luxury, Calm and Voluptuousness (1904, Musée d'Orsay, Paris), a scene of bathers on the beach illuminated by intense sunlight, in a pointillist technique of juxtaposed patches of pure and complementary colors. Related to this style was Pierre Bonnard, who had been a member of the Nabis, an intimist painter with a predilection for the female nude, as in his Nude against the Light (1908, Royal Museums of Fine Arts of Belgium, Brussels), in which the woman's body is modeled by light, enclosed in a space formed by the vibrant light of a window sifted by a blind. Expressionism emerged as a reaction to impressionism; against it, the expressionists defended a more personal and intuitive art, where the artist's inner vision – the "expression" – prevailed over the representation of reality – the "impression".
In their works they reflected a personal and intimate theme with a taste for the fantastic, deforming reality to accentuate the expressive character of the work. Expressionism was an eclectic movement, with multiple tendencies in its midst and a diverse variety of influences, from post-impressionism and symbolism to fauvism and cubism, as well as some aniconic tendencies that would lead to abstract art (Kandinski). Expressionist light is more conceptual than sensorial; it is a light that emerges from within and expresses the artist's mentality, his consciousness, his way of seeing the world, his subjective "expression". With precedents in the figures of Edvard Munch and James Ensor, it was formed mainly around two groups: Die Brücke (Ernst Ludwig Kirchner, Erich Heckel, Karl Schmidt-Rottluff, Emil Nolde) and Der Blaue Reiter (Vasili Kandinski, Franz Marc, August Macke, Paul Klee). Other exponents were the Vienna Group (Egon Schiele, Oskar Kokoschka) and the School of Paris (Amedeo Modigliani, Marc Chagall, Georges Rouault, Chaïm Soutine). Edvard Munch was linked in his beginnings to symbolism, but his early work already reflects a certain existential anguish that would lead him to a personal painting of strong psychological introspection, in which light is a reflection of the emptiness of existence, of the lack of communication and of the subordination of physical reality to the artist's inner vision, as can be seen in the faces of his characters, with a spectral lighting that gives them the appearance of automatons. The members of Die Brücke ("The Bridge") – especially Kirchner, Heckel and Schmidt-Rottluff – developed a dark, introspective and anguished subject matter, where form, color and light are subjective, resulting in tense, unsettling works that emphasize the loneliness and rootlessness of the human being.
The light in these artists is not illuminating; it does not respond to physical criteria, as can be seen in Kirchner's Erich Heckel and Otto Mueller Playing Chess (1913, Brücke Museum, Berlin), where the lamp on the table does not radiate light and constitutes a strange object, alien to the scene. Der Blaue Reiter ("The Blue Rider") emerged in Munich in 1911, and its members shared less a common stylistic stamp than a certain vision of art, in which the creative freedom of the artist and the personal and subjective expression of his works prevailed. It was a more spiritual and abstract movement, with a technical predilection for watercolor, which gave their works an intense chromatism and luminosity. Cubism (1907-1914) was based on the deformation of reality by destroying the spatial perspective of Renaissance origin, organizing space according to a geometric grid, with simultaneous vision of objects, a range of cold and muted colors, and a new conception of the work of art, with the introduction of collage. It was the first movement that dissociated light from reality, by eliminating the tangible focus that throughout the previous history of painting had illuminated pictures, whether natural or artificial; in its place, each part of the picture, each space that has been deconstructed into geometric planes, has its own luminosity. Jean Metzinger, in On Cubism (1912), wrote that "beams of light and shadows distributed in such a way that one engenders the other plastically justify the ruptures whose orientation creates the rhythm". The main figure of this movement was Pablo Picasso, one of the great geniuses of the 20th century, along with Georges Braque, Jean Metzinger, Albert Gleizes, Juan Gris, and Fernand Léger.
Before arriving at cubism, Picasso went through the so-called blue and rose periods: in the first, the influence of El Greco can be seen in his elongated figures of dramatic appearance, with profiles highlighted by a yellowish or greenish light and shadows of thick black brushstrokes; in the second, he dealt with kinder and more human themes, characteristic being the scenes of figures immersed in empty landscapes of luminous appearance. His cubist stage is divided into two phases: in "analytical cubism" he focused on portraits and still lifes, with images broken down into planes in which light loses its modeling and volume-defining character to become a constructive element that emphasizes contrast, giving the image an iridescent appearance; in "synthetic cubism" he expanded the chromatic range and included extra-pictorial elements, such as texts and fragments of literary works. After his cubist stage, his most famous work is Guernica, entirely elaborated in shades of gray, a night scene illuminated by the lights of a light bulb on the ceiling – shaped like a sun and an eye at the same time – and of an oil lamp in the hands of the character leaning out of the window, with a light constructed by planes that serve as counterpoints of light in the midst of darkness. A movement derived from Cubism was Orphism, represented especially by Robert Delaunay, who experimented with light and color in his abstracting search for rhythm and movement, as in his series on the Eiffel Tower or in Champ de Mars: The Red Tower, where he decomposes light into the colors of the prism to diffuse it through the space of the painting. Delaunay studied optics and came to the conclusion that "the fragmentation of form by light creates planes of colors", so in his work he explored with intensity the rhythms of colors, a style he called "simultaneism", taking up the scientific concept of simultaneous contrast described by Chevreul.
For Delaunay, "painting is, properly speaking, a luminous language", which led him in his artistic evolution towards abstraction, as in his series of Windows, Disks and Circular and Cosmic Forms, in which he represents beams of light elaborated with bright colors in an ideal space. Another style concerned with optical experimentation was Futurism (1909–1930), an Italian movement that exalted the values of the technical and industrial progress of the 20th century and emphasized aspects of reality such as movement, speed and simultaneity of action. Prominent among its ranks were Giacomo Balla, Gino Severini, Carlo Carrà and Umberto Boccioni. These artists were the first to treat light in an almost abstract way, as in Boccioni's paintings, which were based on pointillist technique and the optical theories of color to carry out a study of the abstract effects of light, as in his work The City Rises (1910-1911, Museum of Modern Art, New York). Boccioni declared in 1910 that "movement and light destroy the matter of objects" and aimed to "represent not the optical or analytical impression, but the psychic and total experience". Gino Severini evolved from a still pointillist technique towards Cubist spatial fragmentation applied to Futurist themes, as in his Expansión de la luz (1912, Museo Thyssen-Bornemisza, Madrid), where the fragmentation of color planes contributes to the construction of plastic rhythms, which enhances the sensation of movement and speed. Carlo Carrà elaborated works of pointillist technique in which he experimented with light and movement, as in La salida del teatro (1909, private collection), where he shows a series of pedestrians barely sketched in their elemental forms and elaborated with lines of light and color, while in the street artificial lights gleam, whose flashes seem to cut the air. 
Balla synthesized neo-impressionist chromaticism, pointillist technique and cubist structural analysis in his works, decomposing light to achieve his desired effects of movement. In La jornada del operario (1904, private collection), he divided the work into three scenes separated by frames, two on the left and one on the right of double size. They represent dawn, noon and twilight, in which he depicts various phases of the construction of a building, recording a day's work; the two parts on the left are actually a single image separated by the frame, but with a different treatment of light for the time of day. In Arc Lamp (1911-1912, Museum of Modern Art, New York) he made an analytical study of the patterns and colors of a beam of light, an artificial light in conflict with moonlight, in a symbolism in which the electric light represents the energy of youth as opposed to the lunar light of classicism and romanticism. In this work the light seems to be observed under a microscope: from the incandescent center of the lamp sprouts a series of colored arrows that gradually lose chromatism as they move away from the bright focus until they merge with the darkness. Balla himself stated that "the splendor of light is obtained by bringing pure colors closer together. This painting is not only original as a work of art, but also scientific, since I sought to represent light by separating the colors that compose it". Outside Italy, Futurism influenced various parallel movements such as English Vorticism, whose best exponent was Christopher Richard Wynne Nevinson, a painter who showed a sensitivity for luminous effects reminiscent of Severini, as seen in his A Star Shell (1916, Tate Gallery, London); or Russian Rayonism, represented by Mikhail Larionov and Natalia Goncharova, a style that combined the interest in light beams typical of analytical cubism with the radiant dynamism of futurism, although it later evolved towards abstraction.
In Italy also emerged the so-called metaphysical painting, considered a forerunner of surrealism, represented mainly by Giorgio de Chirico and Carlo Carrà. Initially influenced by symbolism, De Chirico was the creator of a style opposed to futurism, more serene and static, with certain reminiscences of classical Greco-Roman art and Renaissance linear perspective. In his works he created a world of intellectual placidity, a dreamlike space where reality is transformed for the sake of a transcendent evocation, with spaces of wide perspectives populated by figures and isolated objects in which a diaphanous and uniform illumination creates elongated shadows of unreal aspect, producing an overwhelming sensation of loneliness. In his urban spaces, empty and geometrized, populated by faceless mannequins, the lights and shadows create strong contrasts that help to enhance the dreamlike factor of the image. Another artist of this movement is Giorgio Morandi, author of still lifes in which chiaroscuro has a clear protagonism, in compositions where light and shadow play a primordial role in building an unreal and dreamlike atmosphere. With abstract art (1910-1932) the artist no longer tries to reflect reality, but his inner world, to express his feelings. Art loses all real aspect and imitation of nature to focus on the pure expressiveness of the artist, in shapes and colors that lack any referential component. Initiated by Vasili Kandinski, it was developed by the neoplasticist movement (De Stijl), with figures such as Piet Mondrian and Theo Van Doesburg, as well as Russian Suprematism (Kazimir Malevich). The presence of light in abstract art is inherent to its evolution: although the movement dispenses with subject matter, light remains part of its works, for, after all, the human being cannot detach himself completely from the reality that shapes his existence.
Abstraction was reached by two paths: one of a psychic-emotive character originated by symbolism and expressionism, and the other objective-optical, derived from fauvism and cubism. Light played a special role in the second, since starting from the cubist light beams it was logical to isolate them from the reality that originates them and consequently express them in abstract forms. In abstract art, light loses the prominence it has in an image based on natural reality, but its presence is still perceived in the various tonal gradations and chiaroscuro games that appear in numerous works by abstract artists such as Mark Rothko, whose images of intense chromaticism have a luminosity that seems to radiate from the color of the work itself. The pioneer of abstraction, Vasili Kandinski, received the inspiration for this type of work when he woke up one day and saw one of his paintings on which the sunlight was shining brightly, diluting the forms and accentuating the chromaticism, which showed an unprecedented brightness; he then began a process of experimentation to find the perfect chromatic harmony, giving total freedom to color without any formal or thematic subordination. Kandinski's research was continued by Russian suprematism, especially by Kazimir Malevich, an artist with post-impressionist and fauvist roots who later adopted cubism, leading to a geometric abstraction in which color acquires special relevance, as shown in his Black Square (1915) and White on White (1918). In the interwar period, the New Objectivity (Neue Sachlichkeit) movement emerged in Germany, which returned to realistic figuration and the objective representation of the surrounding reality, with a marked component of social protest.
Although they advocated realism, they did not renounce the technical and aesthetic achievements of avant-garde art, such as Fauvist and expressionist coloring, Futurist "simultaneous vision" or the application of photomontage to painting. In this movement, the urban landscape, populated with artificial lights, played a special role. Among its main representatives were Otto Dix, George Grosz, and Max Beckmann. Surrealism (1924-1955) placed special emphasis on imagination, fantasy and the world of dreams, with a strong influence of psychoanalysis. Surrealist painting moved between figuration (Salvador Dalí, Paul Delvaux, René Magritte, Max Ernst) and abstraction (Joan Miró, André Masson, Yves Tanguy, Paul Klee). René Magritte treated light as a special object of research, as is evident in his work The Empire of Lights (1954, Royal Museums of Fine Arts of Belgium, Brussels), where he presents an urban landscape with a house surrounded by trees in the lower part of the painting, immersed in a nocturnal darkness, and a daytime sky furrowed with clouds in the upper part; in front of the house there is a street lamp whose light, together with that of two windows on the upper floor of the house, is reflected in a pond located at the foot of the house. The contrasting day and night represent waking and sleeping, two worlds that never come to coexist. Dalí evolved from a formative phase in which he tried different styles (impressionism, pointillism, futurism, cubism, fauvism) to a figurative surrealism strongly influenced by Freudian psychology. In his work he showed a special interest in light, a Mediterranean light that in many of his works bathes the scene with intensity: The Bay of Cadaqués (1921, private collection), The Phantom Chariot (1933, Nahmad collection, Geneva), Solar Table (1936, Boijmans Van Beuningen Museum, Rotterdam), Composition (1942, Tel Aviv Museum of Art). 
It is the light of his native Empordà, a region marked by the tramuntana wind, which, according to Josep Pla, generates a "static, clear, shining, sharp, glittering" light. Dalí's treatment of light is generally surprising, with singular fantastic effects, contrasts of light and shadow, backlighting and countershadows, always in a continuous search for new and striking effects. Around 1948 he abandoned avant-gardism and returned to classicist painting, although interpreted in a personal and subjective way, in which he continued his incessant search for new pictorial effects, as in his "atomic stage", in which he sought to capture reality through the principles of quantum physics. Among his last works, the following stand out for their luminosity: Christ of Saint John of the Cross (1951, Kelvingrove Museum, Glasgow), The Last Supper (1955, National Gallery of Art, Washington D. C.), The Perpignan Station (1965, Museum Ludwig, Cologne) and Cosmic Athlete (1968, Zarzuela Palace, Madrid). Joan Miró reflected in his works a light of magical and at the same time telluric aspect, rooted in the landscape of the Tarragona countryside that was so dear to him, as is evident in La masía (1921-1922, National Gallery of Art, Washington D. C.), illuminated by a twilight that bathes the objects in contrast with the incipient darkness of the sky. In his work he used flat and dense colors, in preferably nocturnal environments with special prominence of empty space, while objects and figures seem bathed in an unreal light, a light that seems to come from the stars, for which he felt a special devotion. In the United States, between the 1920s and 1930s, several figurative movements emerged, especially interested in everyday reality and life in cities, always associated with modern life and technological advances, including artificial lights in streets and avenues as well as commercial and indoor lights.
The first of these movements was the Ashcan School, whose leader was Robert Henri, and in which George Wesley Bellows and John French Sloan also stood out. In opposition to American Impressionism, these artists developed a style of cold tones and dark palette, with a theme centered on marginalization and the world of nightlife. This school was followed by the so-called American realism or American Scene, whose main representative was Edward Hopper, a painter concerned with the expressive power of light, in urban images of anonymous and lonely characters framed in lights and deep shadows, with a palette of cold colors influenced by the luminosity of Vermeer. Hopper took from black and white cinema the contrast between light and shadow, which would be one of the keys to his work. He had a special predilection for the light of Cape Cod (Massachusetts), his summer resort, as can be seen in Sunlight on the Second Floor (1960, Whitney Museum of American Art, New York). His scenes are notable for their unusual perspectives, strong chromaticism and contrasts of light, in which metallic and electrifying glows stand out. In New York Movie (1939, Museum of Modern Art, New York) he showed the interior of a cinema vaguely illuminated by – as he himself expressed in his notebook – "four sources of light, with the brightest point in the girl's hair and in the flash of the handrail". On one occasion, Hopper went so far as to state that the purpose of his painting was none other than to "paint sunlight on the side wall of a house." One critic defined the light in Hopper's mysterious paintings as a light that "illuminates but never warms," a light at the service of his vision of the desolate American urban landscape.

Latest trends

Since the Second World War, art has undergone a vertiginous evolutionary dynamic, with styles and movements following each other more and more rapidly in time.
The modern project, originated with the historical avant-gardes, reached its culmination with various anti-material styles that emphasized the intellectual origin of art over its material realization, such as action art and conceptual art. Once this level of analytical prospection of art was reached, the inverse effect was produced – as is usual in the history of art, where different styles confront and oppose each other, the rigor of some succeeding the excess of others, and vice versa – and a return was made to the classical forms of art, accepting its material and aesthetic component, and renouncing its revolutionary and society-transforming character. Thus postmodern art emerged, in which the artist moves freely between different techniques and styles, without any intent of protest, and returns to artisanal work as the essence of the artist. The first movements after the war were abstract, such as American abstract expressionism and European informalism (1945-1960), a set of trends based on the expressiveness of the artist, who renounces any rational aspect of art (structure, composition, preconceived application of color). It is an eminently abstract art, in which the material support of the work becomes relevant and assumes the leading role over any theme or composition. Abstract expressionism – also called action painting – was characterized by dripping, the technique of letting paint fall onto the canvas, on which the artist intervened with various tools or with his own body. Among its members, Jackson Pollock and Mark Rothko stand out. In addition to pigments, Pollock used glitter and aluminum enamel, which stands out for its brightness, giving his works a metallic light and creating a kind of chiaroscuro. For his part, Rothko worked in oil, with overlapping layers of very fluid paint, which created glazes and transparencies.
He was especially interested in color, which he combined in unprecedented ways, but with a great sense of balance and harmony, and he used white as a base to create luminosity. European informalism includes various currents such as tachism, art brut and matter painting. Georges Mathieu, Hans Hartung, Jean Fautrier, Jean Dubuffet, Lucio Fontana and Antoni Tàpies stand out. The latter developed a personal and innovative style, with a mixed technique of crushed marble powder with pigments, which he applied to the canvas and then intervened upon by means of grattage. He typically used a dark, almost "dirty" coloring, but in some of his works (such as Zoom, 1946) he added Spanish white (whiting), which gave them great luminosity. Among the last movements especially concerned with light and color was op art (optical art, also called kinetic or kinetic-luminescent art), a style that emphasized the visual aspect of art, especially optical effects, which were produced either by optical illusions (ambiguous figures, persistent images, moiré effect), or by movement or the play of light. Victor Vasarely, Jesús Rafael Soto and Yaacov Agam stood out. The technique of these artists is mixed, transcending canvas or pigment to incorporate metallic pieces, plastics and all kinds of materials; in fact, more than the material substrate of the work, the artistic matter is light, space and movement. Vasarely had a very precise and elaborate way of working, sometimes using photographs that he projected onto the canvas by means of slides, which he called "photographisms". In some works (such as Eridan, 1956) he investigated the contrasts between light and shadow, reaching high values of light achieved with white and yellow. His Cappella series (1964) focused on the opposition between light and dark combined with shapes. The Vega series (1967) was made with aluminum paint and gold and silver glitter, which made the light reverberate.
Soto carried out a type of serial painting influenced by dodecaphonism, with primary colors that stand out for their transparency and provoke a strong sensation of movement. Agam, on the other hand, was particularly interested in chromatic combinations, working with 150 different colors, in painting or sculpture-painting. Among the figurative trends is pop art (1955-1970), which emerged in the United States as a reaction against abstract expressionism. It includes a series of authors who returned to figuration, with a marked component of popular inspiration, with images inspired by the world of advertising, photography, comics, and mass media. Roy Lichtenstein, Tom Wesselmann, James Rosenquist, and Andy Warhol stood out. Lichtenstein was particularly inspired by comics, with paintings that look like vignettes, sometimes with the typical graininess of printed comics. He used flat inks, without mixtures, in pure colors. He also produced landscapes, with light colors and great luminosity. Wesselmann specialized in nudes, generally in bathrooms, with a cold and aseptic appearance. He also used pure colors, without tonal gradations, with sharp contrasts. Rosenquist had a more surrealist vein, with a preference for consumerist and advertising themes. Warhol was the most publicized and commercial artist of this group. He usually worked in silkscreen, in series ranging from portraits of famous people such as Elvis Presley, Marilyn Monroe or Mao Tse-tung to all kinds of objects, such as his series of Campbell's soup cans, made with a garish and strident colorism and a pure, impersonal technique. Abstraction resurfaced between the 1960s and 1980s with post-painterly abstraction and minimalism.
Post-painterly abstraction (also called "New Abstraction") focused on geometrism, with an austere, cold and impersonal language, due to an anti-anthropocentric tendency that could be glimpsed in those years in art and culture in general, also present in pop art, a style with which it coexisted. Post-painterly abstraction thus focuses on form and color, without offering any iconographic reading, interested only in the visual impact, without further reflection. Its artists used striking colors, sometimes of a metallic or fluorescent nature. Barnett Newman, Frank Stella, Ellsworth Kelly and Kenneth Noland stand out. Minimalism was a trend that involved a process of dematerialization that would lead to conceptual art. Its works are of marked simplicity, reduced to a minimum motif, pared down to the author's initial idea. Robert Mangold and Robert Ryman stand out, who had in common a preference for monochrome, with a refined technique in which the brushstroke is not noticed and the use of light tones, preferably pastel colors. Figuration returned again with hyperrealism – which emerged around 1965 – a trend characterized by its superlative and exaggerated vision of reality, which is captured with great accuracy in all its details, with an almost photographic aspect, in which Chuck Close, Richard Estes, Don Eddy, John Salt, and Ralph Goings stand out. These artists are concerned, among other things, with details such as glitter and reflections in cars and shop windows, as well as light effects, especially artificial city lights, in urban views with neon lights and the like. Linked to this movement is the Spaniard Antonio López García, author of academic works in which the most meticulous description of reality is combined with a vague unreal aspect close to magical realism.
His urban landscapes of wide atmospheres stand out (Madrid sur, 1965–1985; Madrid desde Torres Blancas, 1976–1982), as well as images with an almost photographic aspect such as Mujer en la bañera (1968), in which a woman takes a bath in an atmosphere of electric light reflected on the bathroom tiles, creating an intense and vibrant composition. Another movement especially concerned with the effects of light has been neo-luminism, an American movement inspired by American luminism and the Hudson River School, from which it adopts majestic skies and calm-water seascapes, as well as atmospheric effects of light rendered in subtle gradations. Its main representatives are James Doolin, April Gornik, Norman Lundin, Scott Cameron, Steven DaLuz and Pauline Ziegen. Since 1975, postmodern art has predominated on the international art scene: emerging in opposition to so-called modern art, it is the art of postmodernity, a socio-cultural theory that postulates the current validity of a historical period that would have surpassed the modern project, that is, the cultural, political and economic roots of the Contemporary Age, marked culturally by the Enlightenment, politically by the French Revolution and economically by the Industrial Revolution. These artists assume the failure of the avant-garde movements as the failure of the modern project: the avant-garde intended to eliminate the distance between art and life, to universalize art; the postmodern artist, on the other hand, is self-referential, art speaks of art, and does not intend to do social work. Postmodern painting returns to the traditional techniques and themes of art, although with a certain stylistic hybridization, taking advantage of the resources of all the preceding artistic periods and intermingling and deconstructing them, in a procedure that has been baptized as "appropriationism" or artistic "nomadism".
Individual artists such as Jeff Koons, David Salle, Jean-Michel Basquiat, Keith Haring, Julian Schnabel, Eric Fischl or Miquel Barceló stand out, as well as various movements such as the Italian trans-avant-garde (Sandro Chia, Francesco Clemente, Enzo Cucchi, Nicola De Maria, Mimmo Paladino), German Neo-Expressionism (Anselm Kiefer, Georg Baselitz, Jörg Immendorff, Markus Lüpertz, Sigmar Polke), Neo-Mannerism and free figuration, among others.

See also

Light art
Light painting
History of painting
Periods in Western art history
Light in painting
[ "Physics" ]
51,825
[ "Physical phenomena", "Spectrum (physical sciences)", "Electromagnetic spectrum", "Waves", "Light" ]
70,549,537
https://en.wikipedia.org/wiki/HD%2076236
HD 76236, also designated as HR 3543 or rarely 11 G. Chamaeleontis, is a solitary star located in the southern circumpolar constellation Chamaeleon. It is faintly visible to the naked eye as an orange-hued star with an apparent magnitude of 5.77. Based on parallax measurements from the Gaia satellite, the object is estimated to be 612 light years away. Currently, it is receding with a heliocentric radial velocity of . At its current distance, HD 76236's brightness is diminished by 0.39 magnitudes due to interstellar dust. It has an absolute magnitude of −0.13. This is an evolved red giant with a stellar classification of K5 III. It has 1.78 times the mass of the Sun and an enlarged radius of . It radiates 950 times the luminosity of the Sun from its photosphere at an effective temperature of . HD 76236 has an iron abundance nearly twice that of the Sun, making it metal-enriched. It spins modestly with a projected rotational velocity of . An infrared excess has been detected around HD 76236, indicating that the star may have a circumstellar disk. References Chamaeleon K-type giants Chamaeleontis, 11 076236 043012 3543 PD-79 352 Circumstellar disks
HD 76236
[ "Astronomy" ]
290
[ "Chamaeleon", "Constellations" ]
70,549,705
https://en.wikipedia.org/wiki/Spirographa%20ciliata
Spirographa ciliata is a species of lichenicolous fungus in the family Spirographaceae. It was first formally described by Klaus Kalb in 1993 as Cornutispora ciliata. Adam Grzegorz Flakus, Javier Etayo, and Jolanta Miądlikowska transferred it to the genus Spirographa in 2019. References Ostropales Fungi described in 1993 lichenicolous fungi Taxa named by Klaus Kalb Fungus species
Spirographa ciliata
[ "Biology" ]
101
[ "Fungi", "Fungus species" ]
70,550,498
https://en.wikipedia.org/wiki/Radicinin
Radicinin is a phytotoxin with the molecular formula C12H12O5. Radicinin is produced by the fungal plant pathogen Alternaria radicina and other Alternaria species. References Further reading Radicinin
Radicinin
[ "Chemistry" ]
54
[ "Chemical ecology", "Plant toxins", "Organic compounds", "Organic compound stubs", "Organic chemistry stubs" ]
70,551,347
https://en.wikipedia.org/wiki/Realme%207%20Pro
Realme 7 Pro (stylized as realme 7 Pro) is a dual-SIM smartphone from the Chinese company Realme. It was launched on 10 September 2020. The Realme 7 Pro became the fifth best-selling budget smartphone to date. The device has shatter-resistant Gorilla Glass 3 and a splash-resistant back. References External links 7 Pro Mobile phones introduced in 2020 Mobile phones with multiple rear cameras Mobile phones with 4K video recording Discontinued smartphones
Realme 7 Pro
[ "Technology" ]
91
[ "Mobile technology stubs", "Mobile phone stubs" ]
70,551,358
https://en.wikipedia.org/wiki/List%20of%20Asian%20countries%20by%20life%20expectancy
This is a list of Asian countries by life expectancy. United Nations (2023) Estimates from the United Nations' analytical agency. UN: Estimate of life expectancy for various ages in 2023 UN: Change of life expectancy from 2019 to 2023 World Bank Group (2022) Estimates from the World Bank Group for 2022. The data is filtered according to the list of countries in Asia. The values in the World Bank Group tables are rounded. All calculations are based on raw data, so the nuances of rounding occasionally produce illusory inconsistencies of 0.01 year between indicators. In 2014, some of the world's leading countries reached a local peak in life expectancy, so that year is chosen for comparison with 2019 and 2022. WHO (2019) Estimates from the World Health Organization for 2019. Charts See also References life expectancy Asia
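The rounding caveat above is easy to reproduce: a difference computed from raw data and then rounded can disagree by 0.01 year with the difference a reader would compute from the already-rounded table values. A minimal sketch with hypothetical numbers (not actual life-expectancy data):

```python
# Illustrative only: rounding raw values to 2 decimals can make a
# difference column look inconsistent by 0.01 year.
raw_2014 = 74.237   # hypothetical raw life expectancy, 2014
raw_2022 = 75.733   # hypothetical raw life expectancy, 2022

# Difference computed from raw data, then rounded (the list's method):
diff_from_raw = round(raw_2022 - raw_2014, 2)                          # 1.5

# Difference computed from the rounded values shown in the table:
diff_from_rounded = round(round(raw_2022, 2) - round(raw_2014, 2), 2)  # 1.49

print(diff_from_raw, diff_from_rounded)  # the two disagree by 0.01 year
```

Neither number is wrong; they simply round at different stages, which is the "illusory inconsistency" the list describes.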
List of Asian countries by life expectancy
[ "Biology" ]
189
[ "Senescence", "Life expectancy" ]
70,552,069
https://en.wikipedia.org/wiki/NGC%207410
NGC 7410 is a barred spiral galaxy located in the constellation Grus. It is about 122 million light-years away. It was discovered on 15 July 1826 by James Dunlop. References 7410 Spiral galaxies Grus (constellation)
NGC 7410
[ "Astronomy" ]
49
[ "Grus (constellation)", "Constellations" ]
70,554,713
https://en.wikipedia.org/wiki/HD%20114533
HD 114533, also known as HR 4976, is a solitary star located in the southern circumpolar constellation Chamaeleon. It has an apparent magnitude of 5.84, making it faintly visible to the naked eye. The system is relatively distant, at roughly 2,100 light years based on Gaia DR3 parallax measurements, but is drifting closer with a heliocentric radial velocity of . At its current distance, HD 114533's brightness is diminished by 0.74 magnitudes due to interstellar dust. It has an absolute magnitude of −2.0. This is an evolved supergiant with a stellar classification of G2 Ib. It has also been given a class of F8 Ib, indicating a slightly hotter star. It has 3.78 times the mass of the Sun but has expanded to 77.3 times the Sun's radius. HD 114533 radiates over 2,000 times the bolometric luminosity of the Sun from its enlarged photosphere at an effective temperature of , giving it a yellowish-orange hue. The object has near-solar metallicity and spins modestly with a projected rotational velocity of . References Chamaeleon F-type supergiants G-type supergiants Chamaeleontis, 44 PD -77 890 114533 64587 4976
HD 114533
[ "Astronomy" ]
286
[ "Chamaeleon", "Constellations" ]
70,556,057
https://en.wikipedia.org/wiki/List%20of%20organisms%20named%20after%20the%20Star%20Wars%20series
Newly created taxonomic names in biological nomenclature often reflect the discoverer's interests or honour those the discoverer holds in esteem, including fictional elements from works like Star Wars. This is a list of real organisms with scientific names chosen to reference the fictional Star Wars franchise. Named after Darth Vader Named after Yoda Named after Chewbacca Named after Luke Skywalker Named after Mandalorians Named after Han Solo Named after Padmé Amidala Named after Porgs Named after other characters and elements Named after Star Wars actors See also List of unusual biological names List of organisms named after works of fiction List of organisms named after famous people References Star Wars organisms
List of organisms named after the Star Wars series
[ "Biology" ]
131
[ "Lists of biota" ]
70,556,329
https://en.wikipedia.org/wiki/Gliese%20514
Gliese 514, also known as BD+11 2576 or HIP 65859, is an M-type main-sequence star in the constellation Virgo, 24.85 light-years away from the Sun. The distance of Gliese 514 from the Sun has been accurately known since 1988. Gliese 514's metallicity (Fe/H index) is poorly constrained, with median values from −0.4 to +0.18 reported in the literature. This discrepancy is due to peculiarities of the stellar spectrum of Gliese 514. The spectral peculiarities also affect the accuracy of the star's temperature measurement, with reported values as low as 2901 K. The spectrum of Gliese 514 shows emission lines, but the star itself has low starspot activity. Multiplicity surveys had not detected any stellar companions as of 2020. The Sun is currently calculated to be passing through the tidal tail of Gliese 514's Oort cloud; thus, future interstellar objects passing through the Solar System may originate from Gliese 514. Planetary system The existence of a planet on a 15-day orbit around Gliese 514 had been suspected since 2019, but that planet was not confirmed. Instead, in 2022, a super-Earth planet, named Gliese 514 b, was discovered on an eccentric 140-day orbit by the radial velocity method. The planetary orbit partially lies within the habitable zone of the parent star, with a planetary equilibrium temperature, averaged along the orbit, equal to K. The infrared excess of the star also indicates the possible presence of a debris disk in the system, albeit at a low signal-to-noise ratio. References Virgo (constellation) M-type main-sequence stars Planetary systems with one confirmed planet J13295979+1022376 BD+11 2576 065859 0514 Emission-line stars
Gliese 514
[ "Astronomy" ]
394
[ "Virgo (constellation)", "Constellations" ]
70,558,153
https://en.wikipedia.org/wiki/Atriplex%20spongiosa
Atriplex spongiosa, the pop saltbush (a name it shares with Atriplex holocarpa), is a species of flowering plant in the family Amaranthaceae, native to central Australia, and introduced to South Africa and Iran. A halophyte, it can grow in media having an NaCl concentration over 600 mM. References spongiosa Halophytes Endemic flora of Australia Flora of the Northern Territory Flora of Queensland Flora of South Australia Flora of New South Wales Plants described in 1858
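For scale, the 600 mM tolerance quoted above converts to a mass concentration of roughly 35 g/L of NaCl, comparable to typical seawater salinity. A quick check (the molar mass of NaCl, 58.44 g/mol, is standard; the seawater comparison is added here for context, not from the article):

```python
# Convert the NaCl tolerance quoted above from molarity to mass concentration.
MOLAR_MASS_NACL = 58.44          # g/mol (Na ~22.99 + Cl ~35.45)
concentration_molar = 0.600      # mol/L, i.e. 600 mM

grams_per_litre = concentration_molar * MOLAR_MASS_NACL
print(f"{grams_per_litre:.1f} g/L")  # ~35.1 g/L, about the salinity of seawater
```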
Atriplex spongiosa
[ "Chemistry" ]
110
[ "Halophytes", "Salts" ]
70,558,939
https://en.wikipedia.org/wiki/Redmi%2010A
The Redmi 10A is a low-end Android smartphone in the Redmi series, a sub-brand of Xiaomi Inc. The phone was announced on March 29, 2022. In India, on July 27, Redmi also announced the Redmi 10A Sport, which was sold only in a 6/128 GB memory configuration. Design The front is made of glass and the back of textured plastic. The Redmi 10A and 10A Sport use the same body as the Redmi 9C, with a changed texture and a black area, like on the Redmi 10C, that merges the camera island and fingerprint sensor. On the bottom of the smartphones there are a microUSB port, a loudspeaker, and a microphone. On the top there is only a 3.5 mm audio jack. On the left is a dual-SIM tray with a slot for a microSD card. On the right side are the volume rocker and the power button. The phones were available in the following color options: Specifications Hardware The smartphones feature the same MediaTek Helio G25 SoC as their predecessors, with the PowerVR GE8320 GPU. The Redmi 10A was available in 2/32 GB, 3/32 GB, 3/64 GB, 4/64 GB, 4/128 GB, and 6/128 GB memory configurations, while the Redmi 10A Sport was available only in a 6/128 GB configuration. Both models use a non-removable 5000 mAh battery and support 10 W charging. The smartphones have a 13 MP wide-angle lens with autofocus. Additionally, the Redmi 10A for the global market (model number 220233L2G) features a 2 MP depth sensor. The phones also feature a 5 MP front camera. The rear and front cameras can record video in 1080p at 30 fps. Software The Redmi 10A and Redmi 10A Sport were released with MIUI 12.5, based on Android 11. References External links Android (operating system) devices Phablets 10A Mobile phones introduced in 2022 Mobile phones with multiple rear cameras Discontinued smartphones
Redmi 10A
[ "Technology" ]
447
[ "Crossover devices", "Phablets" ]
70,559,414
https://en.wikipedia.org/wiki/KTDU-425
The KTDU-425 was a Soviet liquid-fuelled engine designed by Aleksei Isaev of the Isayev Design Bureau. It was mainly used on the 4MV bus, which employed it as its main propulsion system. The first version, the KTDU-425 (11D425), had a thrust of 18.85 kN and was first used in 1971 on Mars 2 and Mars 3, as well as the failed M-71C (Kosmos 419) probe. It was soon replaced by the KTDU-425A, with a thrust of 18.89 kN. This version was employed on the remaining 4MV and 5VK probes built, as well as Phobos 1 and 2. Variants KTDU-425 (11D425) - First version of the KTDU-425. Employed in 1971 on M-71 probes. Thrust: 18.85 kN, specific impulse: 312 s, burn time: 520 s. KTDU-425A (11D425A) - Improved version of the KTDU-425. Employed between 1973 and 1989. Thrust: 18.89 kN, specific impulse: 315 s, burn time: 560 s. References Soviet Mars missions Soviet Venus missions KB_KhimMash_rocket_engines Rocket_engines Rocket_engines_of_the_Soviet_Union Rocket_engines_using_hypergolic_propellant
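The thrust, specific impulse and burn-time figures above imply the engine's propellant consumption via the standard rocket relation F = ṁ · Isp · g0. The numbers below are back-of-the-envelope estimates derived here from the article's figures, not values from the source:

```python
# Propellant mass flow and total propellant implied by the figures above,
# using F = mdot * Isp * g0 with standard gravity g0 = 9.80665 m/s^2.
G0 = 9.80665  # m/s^2

def mass_flow(thrust_n, isp_s):
    """Propellant mass flow rate (kg/s) for a given thrust and specific impulse."""
    return thrust_n / (isp_s * G0)

# KTDU-425 (11D425): thrust 18.85 kN, Isp 312 s, burn time 520 s
mdot = mass_flow(18_850, 312)
total_propellant = mdot * 520

print(f"mass flow ~{mdot:.2f} kg/s")            # roughly 6.2 kg/s
print(f"propellant ~{total_propellant:.0f} kg")  # roughly 3,200 kg over a full burn
```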
KTDU-425
[ "Astronomy", "Technology" ]
304
[ "Outer space", "Engines", "Rocket engines", "Astronomy stubs", "Outer space stubs" ]
70,559,988
https://en.wikipedia.org/wiki/History%20of%20traffic%20lights
Traffic lights are signalling devices positioned at road intersections, pedestrian crossings, and other locations to control flows of traffic. The history of traffic lights is associated with the historic growth of the automobile. Traffic lights were first introduced in December 1868 on Parliament Square in London to reduce the need for police officers to control traffic. Since then, electricity and computerised control have advanced traffic light technology and increased intersection capacity. The origins of traffic signals Before traffic lights, traffic police controlled the flow of traffic. A well-documented example is that on London Bridge in 1722, where three men were given the task of directing traffic coming in and out of London and Southwark. One officer directed traffic coming out of Southwark into London, making sure all traffic stayed on the west side of the bridge; a second officer directed traffic on the east side of the bridge to control the flow of people leaving London for Southwark. On 9 December 1868, the first non-electric, gas-lit traffic lights were installed outside the Houses of Parliament in London to control the traffic in Bridge Street, Great George Street, and Parliament Street. They were proposed by the railway engineer J. P. Knight of Nottingham, who had adapted the idea from his design of railway signalling systems, and were constructed by the railway signal engineers of Saxby & Farmer. The main reason for the traffic light was that an overflow of horse-drawn traffic over Westminster Bridge forced thousands of pedestrians to walk next to the Houses of Parliament. The design combined three semaphore arms with red and green gas lamps for night-time use, on a pillar, operated by a police constable. The gas lantern was manually turned by a traffic police officer with a lever at its base so that the appropriate light faced traffic. The signal was high. 
The light was called the semaphore and had arms that extended horizontally to command drivers to "Stop"; the arms then lowered to a 45-degree angle to tell drivers to proceed with "Caution". At night a red light commanded "Stop" and a green light meant "Caution". Although it was said to be successful at controlling traffic, its operational life was brief. It exploded on 2 January 1869 as a result of a leak in one of the gas lines underneath the pavement, injuring the policeman who was operating it. Pre-electric signals Despite the failure of the world's first traffic light in London in 1869, countries all around the world continued to build traffic signals, and by the 1880s they had spread widely. However, the early traffic lights of the late 19th century were very different from those in use today. In the first two decades of the 20th century, semaphore traffic signals like the one in London were in use all over the United States, with each state having its own design of the device. One example was from Toledo, Ohio, in 1908. The words "Stop" and "Go" were in white on a green background, and the lights had red and green lenses illuminated by kerosene lamps for night travellers; the arms were above ground. It was controlled by a traffic officer who would blow a whistle before changing the commands on the signal to help alert travellers to the change. The design was also used in Philadelphia and Detroit. The example in Ohio was the first time America tried to use a more visible form of traffic control that involved the use of semaphores. The device used in Ohio was designed based on railroad signals. In 1912, a traffic control device was placed on top of a tower in Paris at the intersection of the rue Montmartre and the boulevard Montmartre. 
This tower signal was operated by a policewoman, using a revolving four-sided metal box on top of a glass showcase, with the word "Stop" painted in red and the word "Go" painted in white. Electric signals In 1912, the first electric traffic light was developed by Lester Wire, a policeman in Salt Lake City, Utah. It was installed by the American Traffic Signal Company on the corner of East 105th Street and Euclid Avenue in Cleveland, Ohio. It had two colors, red and green, and a buzzer, based on the design of James Hoge, to provide a warning for color changes. Hoge's design allowed police and fire stations to control the signals in case of emergency. The first interconnected traffic signal system was installed in Salt Lake City in 1917, with six connected intersections controlled simultaneously from a manual switch. The first four-way, three-color traffic light was created by police officer William Potts in Detroit, Michigan, in 1920. He was concerned that police officers at four different signals could not change their lights all at the same time. The answer was a third light colored amber, the same color used on the railroad. Potts also placed a timer with the light to help coordinate the lights. A tower was used to mount the lights, as the junction at which it was installed was one of the busiest in the world, with over 20,000 vehicles a day. Los Angeles installed its first automated traffic signals in October 1920 at five locations on Broadway. These early signals, manufactured by the Acme Traffic Signal Co., paired "Stop" and "Go" semaphore arms with small red and green lights. Bells played the role of today's amber lights, ringing when the flags changed, a process that took five seconds. By 1923 the city had installed 31 Acme traffic control devices. Automatic electric signals In 1922 traffic towers were beginning to be controlled by automatic timers. 
The first company to add timers to traffic lights was Crouse-Hinds. They built railroad signals and were the first company to place timers in traffic lights, in their home city of Houston. The main advantage of the timer was that it saved cities money by replacing traffic officers. The city of New York was able to reassign all but 500 of its 6,000 officers working on the traffic squad; this saved the city $12,500,000. Wolverhampton was the first British town to introduce automated traffic lights, in 1927, in Princes Square at the junction of Lichfield Street and Princess Street, on a trial basis. Great Britain's first permanent automated traffic lights were opened on 16 March 1928 in Leeds, on the corner of Park Row and Bond Street. The introduction of automated traffic signals required a change of behavior for pedestrians. Most urban groups welcomed traffic lights; signals were seen by many as preferable to police officer control because they were not affected by potential human biases such as racism or mistrust of transit companies. After witnessing an accident between an automobile and a horse-drawn carriage, inventor Garrett Morgan filed a U.S. patent for a traffic signal. Patent No. 1,475,024 was granted on 20 November 1923 for Morgan's three-position traffic signal. A further development of traffic signals was the staggered system. Staggered systems allowed the implementation of early green waves, so that vehicles travelling at a certain speed along a single street would encounter only green lights. The first staggered system was installed in 1926 on Sixteenth Street, Washington, D.C., leading to a doubling of commuting speed. The twelve-light system did not become available until 1928; another feature of that system was that hoods were placed over the lights and each lens was sand-blasted to increase daytime visibility. Both the tower and semaphores were phased out by 1930. 
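The green-wave timing behind staggered systems like the 1926 Sixteenth Street installation amounts to offsetting each signal's green start by the travel time from the previous intersection at the design speed. A minimal sketch (the spacings and speed below are hypothetical, not the Washington data):

```python
# Green-wave offsets: each signal turns green one travel-time after the
# previous one, so a car at the design speed always meets a green light.
# Distances and speed are illustrative only.
spacings_m = [200, 250, 180]   # metres between successive intersections
design_speed_ms = 12.0         # design speed, ~43 km/h

offsets_s = [0.0]              # green-start offset of the first signal
for d in spacings_m:
    # Next signal's green starts d / v seconds after the previous one.
    offsets_s.append(offsets_s[-1] + d / design_speed_ms)

print([round(t, 1) for t in offsets_s])  # cumulative offsets in seconds
```

A driver holding the design speed arrives at each intersection exactly as its green begins, which is why early staggered systems could double commuting speed.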
Towers were too big and obstructed traffic; semaphores were too small and drivers could not see them at night. Ashville, Ohio, claims to be the home of the oldest working traffic light in the world, used at an intersection of public roads from 1932 to 1982, when it was moved to a local museum. Guinness World Records backed this claim by naming it the oldest functional traffic light. In 1949, the first traffic light on the continent of Asia was installed in Haifa, Israel. The first traffic light in South India was installed at Egmore Junction, Chennai, in 1953. The city of Bangalore installed its first traffic light at Corporation Circle in 1963. Computerised signals The control of traffic lights took a major turn with the rise of computers in America in the 1950s. Computerised detection allowed the changing of the lights to improve traffic flow further. A pressure plate was placed at intersections so that computers would know that a car was waiting at the red light. Such detection included counting the number of cars waiting at the red light and measuring the length of time waited by the first vehicle at the red. One of the best historical examples of computerized control of lights was in Denver in 1952. One computer took control of 120 lights with six pressure-sensitive detectors measuring inbound and outbound traffic. The control room that housed the computer in charge of the system was in the basement of the City and County Building. As computers evolved, traffic light control also improved and became easier. In 1967, the city of Toronto was the first to use more advanced computers that were better at vehicle detection. The computers maintained control over 159 signals in the city through telephone lines. Countdown timers on traffic lights were introduced in the 1990s. 
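The pressure-plate detection described above is the ancestor of modern vehicle-actuated control, in which each car arriving on the detector stretches the current green up to a fixed maximum. A toy sketch of that logic (all timings are illustrative, not historical):

```python
# Toy vehicle-actuated green extension, in the spirit of the pressure-plate
# detection described above. All timings are illustrative, not historical.
MIN_GREEN = 10   # seconds of green guaranteed per cycle
EXTENSION = 3    # extra seconds granted per detector actuation
MAX_GREEN = 30   # cap so cross traffic is never starved

def green_time(actuation_seconds):
    """Return the green duration given the seconds at which the plate fired."""
    green = MIN_GREEN
    for t in sorted(actuation_seconds):
        # An actuation during the current green stretches it, up to the cap.
        if t <= green:
            green = min(green + EXTENSION, MAX_GREEN)
    return green

print(green_time([]))            # no demand: minimum green only
print(green_time([2, 9, 11]))    # a few extensions while cars keep arriving
print(green_time(range(0, 60)))  # heavy demand: green hits the cap
```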
Timers are useful for pedestrians, to plan whether there is enough time to cross the intersection before the end of the walk phase, and for drivers, to know the amount of time before the light switches. In the United States, timers for vehicle traffic are prohibited, but pedestrian timers are now required on new or upgraded signals on wider roadways. Some pedestrian timers can also be used by motorists to know how much time remains in the green cycle, because often when the pedestrian timer reaches zero, the signal simultaneously turns amber. Lighting technologies When incandescent lamps began to replace gas-powered lamps, it was necessary to incorporate a coloured lens in red, yellow or green to produce the signals, as incandescent bulbs emit only white light. In France in particular, the units were equipped with a reflector and a coloured lens of a type such as Fresnel or prismatic. The drawbacks of these were their short lifetime and a glare effect when the sun shone on the coloured lens: it was often impossible to identify which signal was in operation. As such, traffic lights have often since been equipped with visors. In the 1960s, a new lighting source began to be deployed, using a discharge tube; a patent filed by the Silec company in 1957 explains this technology. The advantages were that the light source did not need a coloured lens, and the technology resolved the glare effect, reduced energy consumption and lengthened the lifetime compared with incandescent sources. In 1980, incandescent lamps were improved, with a lower 12 V voltage, a better lifetime and reduced energy consumption. At the end of the 1980s, the great turning point was the introduction of light-emitting diode (LED) lights, which benefitted from an even longer replacement cycle and lower energy use. The first LED main traffic light was put in service in 1989 in California. 
The system was created by Electro-techs of Corona, California, a company founded by Raymond Deese in 1981. References Road traffic management Traffic signals Traffic management
History of traffic lights
[ "Engineering" ]
2,303
[ "Systems engineering", "Traffic management" ]
70,560,168
https://en.wikipedia.org/wiki/Ototoxic%20medication
Ototoxicity is defined as a toxic effect on the functioning of the inner ear, which may lead to temporary or permanent hearing loss (cochleotoxic) and balance problems (vestibulotoxic). Drugs or pharmaceutical agents inducing ototoxicity are regarded as ototoxic medications. There is a wide range of ototoxic medications, for example, antibiotics, antimalarials, chemotherapeutic agents, non-steroidal anti-inflammatory drugs (NSAIDs) and loop diuretics. While these drugs target different body systems, they trigger ototoxicity through different mechanisms, for example, destruction of the cellular tissues of the inner ear and disturbance of the auditory nervous system. Onset of ototoxicity ranges from taking a single dose to long-term usage of the drugs. Signs and symptoms of ototoxicity include tinnitus, hearing loss, dizziness, and nausea and/or vomiting. The diagnosis of medicine-induced ototoxicity is challenging as it usually shows only mild symptoms in the early stages. Thus, prospective ototoxicity monitoring is required when patients are using ototoxic medications. The majority of ototoxicity cases are reversible once the medication concerned is stopped. Drugs Alcohol is one of the leading substances known to have ototoxic effects. A 2023 systematic review and meta-analysis found that alcohol consumption is associated with an increased risk of hearing loss. Antibiotics and chemotherapeutic agents The most common classes of ototoxic medications include antibiotics (including aminoglycosides and glycopeptides) and chemotherapeutic agents. Aminoglycosides and some chemotherapeutic agents are associated with both cochleotoxicity and vestibulotoxicity. They are thought to damage the hair cells of the cochlea. Long-term exposure to these drugs may cause damage that progresses to the upper turn of the cochlea, impairing hearing or even causing deafness. Glycopeptides, on the other hand, are rarely associated with ototoxicity. 
Aminoglycosides Aminoglycosides are a class of antibiotics. The most frequently used aminoglycosides include gentamicin, amikacin and streptomycin. These antibiotics are usually used in combination with other antimicrobial agents to treat drug-resistant organisms. For example, they are used with a β-lactam for bacterial infections such as pneumonia. They are usually given either intravenously or intramuscularly due to their poor oral absorption. Aminoglycosides irreversibly inhibit bacterial protein synthesis, which is especially effective at killing gram-negative bacteria. The drug is first transported into the bacterial cell, where it binds to the 30S ribosomal subunit. This action interferes with the reading of codons during mRNA translation, causing misreading and premature termination of the process. This inhibits protein synthesis and ultimately leads to the death of bacterial cells. All aminoglycosides can cause either reversible or irreversible ototoxicity. Ototoxicity is more frequently observed in individuals who received the treatment for more than five days and those who have renal insufficiency. The mechanism of aminoglycoside-induced ototoxicity is not well understood. It is thought that because cochlear cells are rich in mitochondria, these antibiotics may also target cochlear cells and cause their death. Another hypothesis suggests that these drugs lead to the production of reactive oxygen species, which generate oxidative stress and damage the inner ear. Glycopeptides Glycopeptides are another class of antibiotics. Vancomycin is the class originator for the glycopeptides. Lipoglycopeptides are a subclass of glycopeptides derived from the structure of vancomycin; examples are telavancin and dalbavancin. Vancomycin and the lipoglycopeptides have slight differences in their mechanisms of action. Vancomycin inhibits bacterial cell wall synthesis by preventing peptidoglycan, a cell wall component, from elongating and cross-linking. 
With weakened peptidoglycan, the bacterial cell becomes susceptible to lysis. Lipoglycopeptides, additionally, can increase the membrane permeability of the bacterial cell and disrupt the bacterial cell membrane potential. This class of antibiotics can be used to treat skin or joint infections where gram-positive bacteria are the pathogens responsible. Vancomycin is also used as an initial empirical treatment for community-acquired bacterial meningitis in locations where penicillin-resistant S. pneumoniae is common. The drug has other clinical uses, including endocarditis and respiratory tract infections caused by methicillin-resistant Staphylococcus aureus (MRSA). Case reports have suggested that long-term use of vancomycin is associated with ototoxicity. However, there is no well-established causal link between vancomycin and ototoxicity. For instance, preclinical studies showed that vancomycin had a low risk of inducing ototoxicity. Despite these findings, the literature generally agrees that pre-existing hearing abnormalities, concomitant use of aminoglycosides and renal dysfunction are risk factors for vancomycin-induced ototoxicity. Chemotherapeutic agents Chemotherapeutic agents are drugs that are used in chemotherapy for the treatment of cancer. Many of these agents are known to have the potential to cause hearing loss. Such agents include cisplatin and bleomycin. Cisplatin Cisplatin is known as a platinum coordination complex. Carboplatin and oxaliplatin are also platinum coordination complexes, but they are less commonly associated with ototoxicity. These agents are used in the treatment of ovarian, head and neck, bladder, lung and colon cancers. Cisplatin and other platinum coordination complexes work by reacting with various sites on DNA, mainly in cancer cells, to form cross-links. The formed DNA-platinum complexes inhibit replication and transcription, leading to miscoding and cell death. 
The mechanism by which cisplatin induces ototoxicity is believed to involve the accumulation of reactive oxygen species, which exert a cytotoxic effect on cochlear cells. Some pharmacogenetic research has opened up new perspectives on the contributing factors of cisplatin-induced ototoxicity, investigating several cancer-related genes and genetic polymorphisms. Results showed that some genes are associated with a protective effect against ototoxicity, while others show no effect or even an increased risk of ototoxicity. Bleomycin Bleomycin is one of the antitumour antibiotics and is a fermentation product of Streptomyces verticillus. It has a unique mechanism of action, making it an important agent in treating Hodgkin disease and testicular cancer. This chemotherapeutic agent causes oxidative damage to nucleotides and leads to single- and double-stranded breaks in DNA. Excess breaks in DNA eventually cause cell death. A few case reports have identified the development of ototoxicity in elderly patients administered bleomycin. Since such cases are rare, the mechanisms behind them have yet to be discovered. Other examples of ototoxic medications Non-steroidal anti-inflammatory drugs Non-steroidal anti-inflammatory drugs (NSAIDs) are among the most frequently used classes of drugs clinically, indicated for anti-inflammatory, analgesic and antipyretic effects. Examples include high-dose aspirin, ibuprofen and naproxen. Their therapeutic effect is achieved by inhibiting the activity of cyclooxygenase (COX), an enzyme mediating the biosynthesis of prostaglandins (PGs) and thromboxanes (TXAs). As this enzyme is inhibited, prostaglandin and thromboxane production is reduced, hence inhibiting the inflammatory responses of pain and swelling caused by prostaglandins. Studies have shown that high-dose usage of aspirin can be associated with ototoxicity, manifesting as reversible hearing loss and tinnitus. 
The underlying mechanism is associated with a change in isolated cochlear outer hair cells (OHCs). Due to the COX inhibition, there is an increased amount of leukotrienes in the inner ear. This leukotriene elevation leads to an alteration in the shape of isolated OHCs, thus disrupting their functions. Cochlear blood flow is eventually reduced, causing hearing impairment. Phosphodiesterase-5 inhibitors Phosphodiesterase-5 (PDE-5) inhibitors are the first-line drugs indicated for erectile dysfunction (ED), the sustained impairment of erectile function, which may lead to unsatisfactory sexual performance. Specific PDE-5 inhibitors are also approved for the treatment of benign prostatic hyperplasia, pulmonary hypertension and lower urinary tract symptoms. Common drug examples include sildenafil, vardenafil and tadalafil. The enzyme PDE-5, found in the corpus cavernosum smooth muscle, is responsible for degrading cyclic guanosine monophosphate (cGMP) to 5'-GMP. Inhibitors of this enzyme compete with cGMP for binding sites, which in turn increases the level of cGMP in smooth muscle. Through this mechanism, penile erection is prolonged, resulting in a correction of ED. These drugs are known to cause headache, flushing and abnormal vision as their adverse effects. PDE-5 inhibitors are also known for inducing sudden sensorineural hearing loss. This is mainly related to the obstruction and dysfunction of the eustachian tubes, which affects middle-ear pressure. Due to the high structural similarity between the penile corpus cavernosum and nasal erectile tissue, PDE-5 inhibitors targeting the corpus cavernosum smooth muscle also act on nasal erectile tissues, which are mainly located at the inferior turbinate, the middle turbinate and the nasal septum. Hence, specific nasal areas may become congested. This mediates inflammatory responses in the eustachian tube, which connects to the middle ear, affecting middle-ear pressure. 
Such events will eventually lead to sudden hearing loss.

Antimalarials
Antimalarial drugs can be classified into several classes based on different mechanisms of action and effects, including quinoline-type drugs, naphthoquinones, antifolates, guanidine-derived drugs, sesquiterpene lactones, etc. In particular, quinoline-type drugs are known to be ototoxic. Examples include the quinine-like drugs chloroquine and hydroxychloroquine. Apart from their antimalarial effects, these drugs are also used in the treatment of other conditions, such as dermatological, immunological, rheumatological and severe infectious diseases. Various ototoxic effects are manifested by antimalarial drugs, with dizziness being one of the most common. Other effects include vestibular symptoms, hearing loss and tinnitus, which may be either temporary or permanent. Nonetheless, the underlying mechanisms of antimalarial-induced ototoxicity are still poorly understood. Studies have suggested that high doses of quinine affect the central auditory pathway and the auditory periphery, leading to elongation and subsequent contraction of isolated OHCs in the cochlea. This structural alteration affects their function and results in reduced cochlear blood flow.

Loop diuretics
Loop diuretics are a major class of diuretic drugs indicated for oedema due to heart failure, liver disease and kidney disease. They are also used for treating hypertension. Common examples include furosemide, bumetanide and ethacrynic acid. Loop diuretics act on the thick ascending limb of the loop of Henle in the kidney nephrons. The major mechanism of ion reabsorption in the thick ascending limb is the active transport of ions through Na+-K+-2Cl− co-transporters (NKCCs). By binding to and inhibiting NKCCs at the apical membrane of the loop of Henle, the reabsorption of Na+, K+ and Cl− is impaired, leaving a higher ion concentration in the lumen.
Hence, the ultimate effect of loop diuretics is a reduction in salt reabsorption and an increase in water excretion. Loop diuretic-induced ototoxicity is suggested to be associated with their action on the stria vascularis, located on the lateral wall of the cochlea. This area is responsible for maintaining the ionic balance of the endolymph. A high potassium concentration and a low sodium concentration must be maintained in the endolymph for cochlear hair cells to function normally. Because the inhibitory action of loop diuretics also targets NKCCs on the membranes of stria vascularis marginal cells, the ionic composition of the endolymph is disturbed. Once the endocochlear potential cannot be maintained, hearing is temporarily impaired. The risk of ototoxicity caused by furosemide is noted to be much higher than that of bumetanide.

Monitoring and management of ototoxicity
Several approaches can be considered in managing patients who develop ototoxicity as an adverse reaction to medication. One approach is the use of otoprotective agents. An example is sodium thiosulfate, which the US FDA approved in 2022 to minimise the risk of ototoxicity and hearing loss in newborn, child and adolescent cancer patients receiving cisplatin. Other agents being investigated for their potential to reduce ototoxicity include D-methionine and L-N-acetylcysteine. The use of D-methionine to protect against hearing loss induced by drugs like cisplatin and aminoglycosides is preliminarily supported by animal studies. NMDA antagonists have also been shown to limit aminoglycoside-induced ototoxicity. Restorative care, which aims to regenerate hair cells damaged by ototoxic drugs, can also be considered. For example, the infusion of neurotrophic factors (neurotrophin-3) was shown to produce otoprotective effects.
This protective agent was also found to be associated with the survival of cochlear spiral ganglion neurones after hearing loss or deafness. Audiological management can be implemented, for example, by providing hearing aids. In more seriously affected patients, cochlear implantation may be considered and discussed with the patient. It is also important for the healthcare team to educate affected patients and their family members on communication skills in order to minimise the impact on patients’ daily life. References External links Wikiversity page for the International Ototoxicity Management Group: https://en.wikiversity.org/wiki/International_Ototoxicity_Management_Group_(IOMG) Clinical pharmacology
https://en.wikipedia.org/wiki/Pulmonary%20drug%20delivery
Pulmonary drug delivery is a route of administration in which patients use an inhaler to inhale their medications; the drugs are absorbed into the bloodstream via the lung mucous membrane. This technique is most commonly used in the treatment of lung diseases, for example asthma and chronic obstructive pulmonary disease (COPD). Different types of inhalers include metered-dose inhalers (MDIs), dry powder inhalers (DPIs), soft mist inhalers (SMIs) and nebulizers. The rate and efficacy of pulmonary drug delivery are affected by drug particle properties, breathing patterns and respiratory tract geometry. Pulmonary drug delivery minimizes systemic side effects and increases bioavailability owing to localised absorption through the lung. The disadvantages include possible drug irritation to the lung, limited drug dissolution, relatively high drug clearance, and dependence of drug effectiveness on inhaler technique and patient compliance. Drug formulation can be challenging, since the drug has to bypass the defence mechanisms in the respiratory tract. Pharmacokinetics and pharmacodynamics of drugs in elderly patients can also be particularly difficult to predict due to age-related changes in body composition. Ongoing developments in inhaler device engineering, technology and drug formulation may improve the efficacy and overcome the challenges of pulmonary drug delivery. Recent advancements involve the utilization of the pulmonary route as an entry to the systemic circulation for treating different diseases, as well as the development of pulmonary drug formulation and particle engineering technology to increase the efficacy of pulmonary delivery.
Application / Clinical use
Pulmonary drug delivery is mainly utilized for topical applications in the lungs, such as inhaled beta-agonists, corticosteroids and anticholinergic agents for the treatment of asthma and COPD, inhaled mucolytics and antibiotics for the treatment of cystic fibrosis (CF) and respiratory viral infections, and inhaled prostacyclin analogs for the treatment of pulmonary arterial hypertension (PAH). In addition, the technique is employed for systemic applications, for example inhaled insulin for diabetes management and inhaled loxapine for the treatment of psychiatric disorders. Vaccines, such as measles-rubella vaccines, can also be delivered via inhalation.

Examples of inhalers
Metered-dose inhalers (MDIs)
Metered-dose inhalers include pressurized metered-dose inhalers (pMDIs) and breath-actuated metered-dose inhalers (BAMDIs). pMDIs are the most commonly used inhalers for treating lung diseases. They require coordination of the patient's inhalation with inhaler actuation. BAMDIs are triggered by the patient's inspiratory flow instead of hand actuation, solving the coordination issue. MDIs with spacers have similar effectiveness in drug delivery compared to nebulizers, with additional benefits in convenience and cost-effectiveness. The use of MDIs together with spacers, valved holding chambers (VHCs) or masks improves the efficacy of drug delivery into the lungs.
Advantages: high portability; a fixed dose is delivered; efficient aerosolized delivery of drug particles; inexpensive; convenient usage; quiet administration.
Disadvantages: require coordination between inhalation and inhaler actuation.
Examples: pressurized metered-dose inhalers (pMDIs); breath-actuated metered-dose inhalers (BAMDIs).

Dry powder inhalers (DPIs)
The solid drug powders in DPIs are released by the force of the patient's inspiratory flow.
Turbulent airflow generated inside the inhaler by the inhalation force is associated with the movement of airflow and the resistance inside the inhaler. Patients should inhale with adequate inspiratory flow to overcome the resistance of DPIs, leading to drug particle deaggregation for successful pulmonary delivery.
Advantages: high portability; a fixed dose is delivered; minimal coordination between inhalation and inhaler actuation is required.
Disadvantages: require adequate inspiratory flow from patients; moisture sensitive.
Examples: Turbuhaler; Accuhaler; Handihaler; Genuair; Ellipta; Breezhaler.

Soft-mist inhalers (SMIs)
Soft-mist inhalers aerosolize a fixed dose of liquid drug formulation into inhalable tiny particles through an extremely fine nozzle system, using the energy generated by a lever-compressed spring, without the use of propellants. The slow and prolonged duration of aerosolization facilitates the patient's coordination of inhaler actuation with inhalation.
Advantages: high portability; minimal coordination between inhalation and inhaler actuation is required; high drug deposition in the lungs.
Disadvantages: low availability in markets; relatively expensive.
Example: Respimat.

Nebulizers
Nebulizers are mainly used in emergencies, or by patients with poor compliance with other handheld inhalers. Nebulizers deliver medication into the lungs by converting water-based liquid drug formulations into inhalable droplets mechanically, such as with an ultrasonic system, or thermally. Major types of nebulizers include vibrating mesh nebulizers (VMN), jet nebulizers (JN) and ultrasonic nebulizers.
Advantages: patient education and coordination are not required.
Suitable for older patients and children; better patient satisfaction due to the visible aerosols; easily employed with tidal breathing.
Disadvantages: long drug delivery time; inaccurate drug dosages; bulky device; expensive; require regular maintenance; low drug delivery efficiency to the lungs.
Examples: vibrating mesh nebulizers (VMN); jet nebulizers (JN); ultrasonic nebulizers.

Factors affecting pulmonary drug delivery
To achieve successful pulmonary drug delivery, the inhaled particles should not deposit in the upper respiratory tract, since particles deposited there are swallowed or expectorated without reaching the lungs, leading to loss of pharmacological effect or unwanted systemic side effects. Factors affecting the deposition of drug particles in the lungs include drug particle properties, breathing patterns and respiratory tract geometry.

Drug particle properties
Particle diameter and particle density significantly affect the drug deposition pattern in the respiratory tract, and are the most common considerations in the formulation of pulmonary drugs. Drug particles with a diameter larger than 5 μm predominantly deposit in the upper respiratory tract, limiting the amount of drug reaching the lung. Moderate-size drug particles, with a diameter between 2 μm and 5 μm, primarily deposit in the central and small airways. Small drug particles, with a diameter smaller than 2 μm, predominantly deposit in the alveolar sacs. Other factors affecting the deposition of drugs include particle electrostatic charge, particle shape and particle volatility. Electrostatic charge of the drug particles enhances deposition due to the electrostatic force formed on the wall of the respiratory tract. Non-spherically shaped particles follow a different entry pathway compared to spherical particles, causing a change in deposition pattern. Particle volatility affects particle diameter through condensation and evaporation.
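The particle-diameter cut-offs above amount to a simple rule of thumb, which can be sketched as a small classifier. This is only an illustration of the stated thresholds, not a validated aerosol deposition model, and the function name is invented for this sketch:

```python
def deposition_region(diameter_um: float) -> str:
    """Rule-of-thumb deposition region for an inhaled particle,
    based on the aerodynamic diameter cut-offs quoted in the text."""
    if diameter_um > 5.0:
        # Deposits in the upper respiratory tract; swallowed or
        # expectorated, so little drug reaches the lung.
        return "upper respiratory tract"
    elif diameter_um >= 2.0:
        # 2-5 um: central and small (conducting) airways.
        return "central and small airways"
    else:
        # < 2 um: reaches the alveolar sacs (deep lung).
        return "alveolar sacs"

for d in (8.0, 3.5, 1.0):
    print(d, "um ->", deposition_region(d))
```

Real deposition also depends on density, charge, shape and breathing pattern, as the surrounding text notes, so a single diameter threshold is at best a first approximation.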
Breathing patterns
Drug particle deposition is associated with mean residence time and tidal volume. An increase in mean residence time or tidal volume enhances drug deposition in the lungs, while an increase in air flow decreases the mean residence time, reducing the total deposition of drug particles.

Respiratory tract geometry
The bifurcation of the trachea into bronchi of smaller diameter increases turbulent flow, leading to increased deposition in the large respiratory tract by impaction.

Advantages
Several advantages are associated with the pulmonary route of administration. For respiratory diseases, the drug can be delivered directly to the disease site for topical relief, so a rapid onset of action can be achieved with fewer systemic side effects. A lower dose of drug can also achieve a similar therapeutic effect compared to other routes of administration. For drugs designed to exert a systemic effect through the lung, the drug reaches the circulation bypassing poor gastrointestinal absorption and hepatic first-pass metabolism, which improves bioavailability. The large absorptive surface area and highly permeable membrane with rich blood supply also enable a rapid onset of action and increase the bioavailability of the drug.

Disadvantages and challenges
Despite a number of advantages of the pulmonary route compared to other routes of administration, numerous disadvantages are associated with it. As the drug needs to be delivered through the respiratory tract to the lungs, drug formulation can be challenging due to the defence mechanisms that intend to remove or inactivate exogenous chemicals. Airway constriction and mucus secretion with ciliary movement prevent drugs from reaching the lungs, while enzymes, macrophages and surfactant in the lungs may also inactivate the drugs, leading to less drug being absorbed.
Studies show that only around 20% of the drug reaches the lung for each inhalation; drug loss is mainly due to accumulation in the oropharynx for pMDIs and DPIs, and to drug retention in the device for nebulisers. Some irritating drug particles may also cause local side effects in the respiratory tract; for example, inhaled corticosteroid accumulating in the oropharynx can result in dysphonia and oral thrush. Besides, drug dosing may be inaccurate due to variations in breathing patterns between individuals and the presence of numerous factors affecting the deposition and absorption of drug particles in the lungs. In particular, elderly patients may not have enough strength to generate sufficient inspiratory flow, resulting in less drug inhaled and hence low drug bioavailability. Finally, inhalers, especially nebulizers, require regular maintenance and cleaning. The inhaler devices are relatively expensive compared to oral tablets, which may not be affordable for low-income patients. The effectiveness of drug delivery depends highly on the patient's compliance and proper inhaler technique, with no significant errors in using the inhaler. Poor compliance may lead to uncontrolled or poorly controlled disease status. For instance, a patient may feel recovered and discontinue the treatment, or may forget to take the medication, resulting in suboptimal disease management. Reducing the number of puffs with combination inhalers delivering two or more drugs in one breath, or the use of electronic data loggers, can improve compliance. Incorrect inhaler technique, such as poor coordination, no exhalation before inhaling the drug aerosol, or not holding the breath for a few seconds after inhalation, may lead to medication depositing inside the respiratory tract instead of the lungs, resulting in inefficient and inadequate treatment.
Practical demonstration instead of verbal instruction, education, and rechecking of the inhaler technique after a period of time can reduce errors and enhance true compliance.

Ongoing development
The use of the pulmonary route as an entry into the systemic circulation is constantly developing due to the additional benefits of bypassing hepatic first-pass metabolism, rapid systemic absorption, higher patient compliance and its non-invasive nature. Potent drugs able to penetrate the lung mucosa into the blood circulation may become available for treating diseases requiring systemic drug delivery. Ongoing research includes the use of inhaled nicotine for smoking cessation, the use of inhaled levodopa for the treatment of Parkinson's disease, and the pulmonary delivery of various biologics. In addition to the development of new pulmonary drugs, drug formulation and particle engineering technology are advancing, such as the use of the Ultrasound Mediated Amorphous to Crystalline transition (UMAX) process to micronize drugs into inhalable particles with better performance, the use of drug nanoparticles to minimize unwanted adverse effects and increase drug bioavailability at the target site, and the use of porous drug particles to improve pulmonary delivery efficacy.

See also
Route of administration
Inhaler
Metered-dose inhaler
Dry powder inhaler
Nebulizer
Respiratory tract
Respiratory system
Lung

References

Routes of administration Medical terminology
https://en.wikipedia.org/wiki/Hartmuth%20C.%20Kolb
Hartmuth Christian Kolb (born 10 August 1964) is a German chemist. He is considered one of the founders of click chemistry. Early life and career After graduating from high school in Marsberg in 1983, Kolb studied at the University of Hanover under the direction of Professor H.M.R. Hoffmann. He received his doctorate as an academic student of Steven Ley at Imperial College London with a thesis on preparative organic chemistry (Synthesis of the decalin fragment of azadirachtin). As a postdoctoral fellow he worked with Barry Sharpless at the Scripps Research Institute in La Jolla, California. He then worked in the research department of Ciba-Geigy in Basel from 1993 to 1997 before taking up a managerial position at Coelacanth Corporation, founded by Sharpless and A. Bader in Princeton, New Jersey. Coelacanth was eventually acquired by Lexicon Pharmaceuticals. In 2002, Kolb obtained an associate professorship in the Department of Chemistry at the Scripps Research Institute. Kolb later obtained a professorship at the University of California, Los Angeles. In 2004 Kolb returned to industry and became Vice President of Molecular Imaging Biomarker Research at Siemens Healthcare in Culver City, California. In 2013, Siemens sold two of the Positron Emission Tomography (PET) radiotracers developed there to Eli Lilly and Company, most notably the Tau PET tracer [18F]-T807 (aka AV1451, Flortaucipir, Tauvid), now FDA approved for PET imaging of the brain to estimate the density and distribution of aggregated tau neurofibrillary tangles (NFTs) in adult patients with cognitive impairment who are being evaluated for Alzheimer's disease (AD). Simultaneously, Kolb joined Avid Radiopharmaceuticals in Philadelphia, Pennsylvania, a subsidiary of Eli Lilly and Company, as vice president of research, and later Janssen Research & Development (Johnson & Johnson) as vice president of Neuroscience Biomarkers & Global Imaging. Work Together with Barry Sharpless and M.G. 
Finn, Kolb developed the concept of click chemistry, an approach that simplifies synthesis by focusing on a few chemical reactions that are similar in nature. The associated scientific publication, Click chemistry: diverse chemical function from a few good reactions, has been cited more than 14,000 times (as of 2021) and was the foundation for the 2022 Nobel Prize in Chemistry awarded to K. Barry Sharpless, Carolyn Bertozzi and Morten Meldal. Kolb refined the method by combining it, as in-situ click chemistry, with microfluidic processes. This makes it particularly easy to synthesize new inhibitors for various enzymes. Kolb's more recent work deals with the synthesis of new tracers for positron emission tomography (e.g. for detecting the tau protein in Alzheimer's disease) and with the clinical testing of these tracers, a key highlight being [18F]-T807, also known as AV1451, Flortaucipir or Tauvid, which was approved in 2020 by the US Food and Drug Administration (FDA) for imaging neurofibrillary tangles in adults being evaluated for Alzheimer's disease. Kolb's lab has developed a blood plasma assay for phospho-217-Tau (p217Tau), which shows potential as a highly accurate peripheral biomarker for amyloid and tau status in Alzheimer's disease.

Awards
Kolb was chosen as the recipient of the 2015 Alzheimer Award by the Journal of Alzheimer's Disease, and he was one of the recipients of the Royal Society of Chemistry 2021 Organic Division Horizon Prize: Robert Robinson Award in Synthetic Organic Chemistry.

References

1964 births Living people Scripps Research faculty University of California, Los Angeles faculty 21st-century German chemists 20th-century German chemists University of Hanover alumni Siemens people Alumni of Imperial College London German organic chemists Eli Lilly and Company people People from Marsberg
https://en.wikipedia.org/wiki/Snowmass%20Process
The Snowmass Process is a particle physics community planning exercise sponsored by the Division of Particles and Fields of the American Physical Society. During this process, scientists develop a collective vision for the next seven to ten years of particle physics research in the US.

History
Original planning meetings were held beginning in 1982 in Snowmass, Colorado, but that has not been the location since 2005. More recent locations of the Snowmass Process include the University of Minnesota (2013) and the University of Washington (2021), which was delayed until July 2022 due to COVID.

Description
The modern Snowmass Process consists of a series of small meetings, which culminate in a community-wide meeting. The Snowmass Process solicits reports on progress and plans within "frontiers". Snowmass 2021 identified ten frontiers: "energy; neutrino physics; rare processes and precision measurements; cosmic; theory; accelerator; instrumentation; computation; underground facilities; and community engagement". Members of the particle physics community submit letters of interest and provide input to contributed whitepapers. The frontiers use these whitepapers to produce web-based reports based on the material they receive. The final output of the Snowmass Process is a Snowmass Summary for the Public, a Snowmass Summary Report, and the Snowmass Book.

Outcomes of Snowmass 2013
The outcomes of the 2013 Snowmass Process were used to inform the decisions of the 2014 Particle Physics Project Prioritization Panel.

Outcomes of Snowmass 2021
A newsworthy outcome of the 2021 Snowmass Process was the announcement that the Deep Underground Neutrino Experiment would be pursued in two distinct phases. The outcomes of the Snowmass 2021 process, which extended into 2022, were determined at a final meeting held in July 2022 in Seattle, Washington, with 743 in-person attendees and 654 virtual participants.
Snowmass outcomes were covered in detailed articles by the scientific press. The title of the Scientific American article, "Physicists Struggle to Unite Around Future Plans", summed up the problem of convergence of opinion. The articles report that two major problems stymied the field: the lack of observation of new particles and the rocketing costs of ongoing projects. No unexpected particles were observed in the first 15 years of data-taking at the Large Hadron Collider (LHC), the highest energy accelerator on Earth—a disappointment stated by many physicists throughout the Snowmass process, and reflecting a view that has also been expressed outside of the Snowmass meetings. Although the LHC will continue to run with modest upgrades, this lack of discovery leaves no clear focus for the next decade of high energy searches, and may also point to a "nightmare scenario" in which the Standard Model that forms the present basis of particle physics is complete up to the Planck scale (an energy level far beyond the ability of any conceivable experiment to probe) and particle physics "wheeze[s] to its end". However, the 2012 discovery of an expected particle, the Higgs boson, has given the field hope of finding new physics through precision searches for unexpected Higgs interactions. As a result, during the Snowmass process, physicists argued for precision measurements at a Higgs factory built around an electron-positron collider. Many Higgs factories are proposed outside of the US, including at the European center for particle physics, CERN, as well as in China, and so "a surprise at Snowmass 'was the grassroots support for a collider on US soil'" that grew out of a new US-developed technology called the "cool copper collider". An alternative, if the worldwide competition for an electron-positron machine is too stiff, would be to invest in a muon collider that could act as a Higgs factory with an approach that is unique worldwide.
Muon colliders were discussed at the 2013 Snowmass, but shelved due to insufficiently advanced technology. However, at the 2022 final Snowmass meeting there was an "enthusiastic revival" of the concept. The possibility of establishing any major new project in the US in the 2023-2033 decade, including a Higgs Factory, is limited due to the rising costs and multi-year delays of existing projects. In particular, at Snowmass, physicists expressed deep concern about the Deep Underground Neutrino Experiment (DUNE) project, which has risen from a base cost of $1.3B in 2015 to $3.1B for a de-scoped instrument. Cost over-runs and delays of DUNE are problematic due to stiff competition from a similar experiment in Japan, leaving physicists to question the value of DUNE results when they are obtained. Worries were expressed by physicists that issues with DUNE were "smoothed over, not smoothed out". Some physicists at Snowmass suggested that the DUNE project might be cancelled, comparing the ominous cost-growth to the Superconducting Super Collider (SSC) that was cancelled when the cost tripled. The whitepapers from the Snowmass process provide input to the 2023 P5 study. The 2023 P5 committee was announced in December 2022. See also Decadal survey Scientific collaboration network References External links Snowmass Process Community Summer Study 2013 : Snowmass on the Mississippi Planning the Future of U.S. Particle Physics (2013) Decadal science surveys Physics conferences Physics organizations Particle physics Science and technology studies
https://en.wikipedia.org/wiki/List%20of%20African%20countries%20by%20life%20expectancy
This is a list of African countries by life expectancy.

United Nations (2023)
Estimates by the United Nations analytical agency.
UN: estimate of life expectancy for various ages in 2023
UN: change of life expectancy from 2019 to 2023

World Bank Group (2022)
Estimates by the World Bank Group for 2022. The data is filtered according to the list of countries in Africa. The values in the World Bank Group tables are rounded. All calculations are based on raw data, so, due to the nuances of rounding, apparent inconsistencies of 0.01 year arise in some places. In 2014, some of the world's leading countries had a local peak in life expectancy, so this year is chosen for comparison with 2019 and 2022.

WHO (2019)
Estimates by the World Health Organization for 2019.

Charts

See also

References

life expectancy Africa
https://en.wikipedia.org/wiki/Topical%20gels
Topical gels are a topical drug delivery dosage form commonly used in cosmetics and in treatments for skin diseases because of their advantages over creams and ointments. They are formed from a mixture of gelator, solvent, active drug and other excipients, and can be classified into organogels and hydrogels. Drug formulation and preparation methods depend on the properties of the gelators, solvents, drug and excipients used.

Structure of gels
A gel refers to the semi-solid, three-dimensional matrix formed from an interspersed system of colloidal particles or from the permeation of a solvent into an entwined polymer chain network. Pharmaceutical gels are formed by adding a gelator (gelling agent) to the solvent and active-ingredient mixture. Gelators used in gel formulation can be small molecules with low molecular weight or polymers (synthetic, semi-synthetic or natural). The solvent used as the dispersion medium can be aqueous, organic, inorganic, or a system of different solvents. Topical gels are used as a contact or transport medium for active drugs to act on or through the skin. The active drug molecules are entwined in the 3D mesh of the gel and delivered to the site of action.

Characteristics
Gels have certain special properties that set them apart from other dosage forms, in terms of swelling, syneresis, ageing, rigidity and rheology.

Classification
Gels can be classified through a variety of criteria, such as the nature of the colloidal phase, the nature of the solvent used, and their physical nature.

Nature of solvent classification
This is the most widely used classification of gels. They are classified into two main groups by the nature of the solvent: organogels and hydrogels.

Organogels
Organogels are not as commonly used as mediums for drugs or vaccines compared to other gel classes. This is due to the untested or pharmaceutically unacceptable solvents and gelators commonly used in organogel synthesis.
Organogels that are used pharmaceutically include microemulsion-based gels and lecithin gels. Some manufacturers use organogels as a medium for drug delivery because of their potentially emollient effect. Some organogels contain bases composed of oleaginous substances. These bases can help retain skin moisture through the formation of an occlusive layer on the area of application. This occlusive layer traps moisture, hydrating the skin and providing an emollient effect, which is particularly helpful in the formulation of topical gels for patients with dry and irritated skin.

Hydrogels
Hydrogels have a high water content, with some containing up to 90% water. Active drugs and other substances dispersed as colloids or dissolved in water can be easily taken up by hydrogels. Hydrogels are biocompatible. They also swell to a greater volume than organogels when in contact with water and other natural liquids. Hydrogels can be used as drug delivery vehicles, for transdermal application, for ophthalmic drug delivery, for cancer treatment, or for wound dressing. As water-based formulations, hydrogels are generally less greasy and easier to remove than oil-based formulations like organogels. Examples of hydrogels include aluminum oxide gels and bentonite magma.

Method of Action
Drugs administered through topical application can act locally or systemically. However, the drug molecules must first be retained in and penetrate the surface layer of the skin. Absorption of the drug through the skin surface is a passive process of diffusion. Skin penetration of the drug can take place by passive diffusion directly through the epidermis (via transcellular or intercellular routes), or by absorption through shunt routes (diffusion through hair follicles and sweat glands). Initially, drug absorption may take place via the transfollicular route.
After the drug reaches a steady state, transepidermal absorption may replace transfollicular absorption as the main pathway. Drug absorption through the skin varies depending on the concentration gradient between the surface of the skin and the body, with a higher rate of absorption resulting from a greater concentration gradient. The rate of drug absorption can be maintained at a constant level by ensuring that the drug concentration at the surface of the skin remains consistently and substantially greater than that in the body. The rate of penetration of the drug across the skin barrier depends on physiological factors, the physicochemical properties of the drug, and gel characteristics. Physiological factors include skin properties, the size of the application area, and the frequency and force of application. Physicochemical properties of the drug include drug solubility, attraction to the skin, and metabolism. Gel characteristics include stability, thermodynamic activity, and occlusive properties. Following penetration through the skin barrier, the drug may permeate through deeper skin tissues and reach the blood capillaries in the dermis. It may then enter the systemic circulation for a systemic effect.

Gel Formulation Ingredients
Formulation of topical gels is determined by important factors such as appearance, odor, spreadability, extrudability, viscosity, pH, texture, microbial contamination potential and bioavailability. The components of the vehicle should serve to make the skin surface more penetrable to the drug. Characteristics of the gel such as consistency and viscosity are affected by the formulation design. Consistency and viscosity affect the adhesion and retention properties of the gel, and are important in ensuring that the gel is retained at the site of application and that the drug is delivered effectively. The ingredients in topical gel formulation can be broadly categorized into four types: gelator, solvent, drug, and excipients.
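The dependence of the absorption rate on the skin-to-body concentration gradient, described in the Method of Action section above, can be sketched with Fick's first law of diffusion. This is a simplified steady-state illustration; the permeability coefficient and concentrations below are invented for the example, not measured values:

```python
def flux_fick(kp_cm_per_h: float, c_skin: float, c_body: float) -> float:
    """Steady-state flux J = kp * (C_skin - C_body) across the skin
    barrier (Fick's first law, lumped into a permeability coefficient kp)."""
    return kp_cm_per_h * (c_skin - c_body)

# Illustrative numbers only: a saturated gel keeps C_skin roughly
# constant, so the flux (and hence the absorption rate) stays constant.
kp = 1e-3        # cm/h, assumed permeability coefficient
c_skin = 10.0    # mg/cm^3 at the skin surface
c_body = 0.1     # mg/cm^3 in the underlying tissue
print(flux_fick(kp, c_skin, c_body))  # flux in mg/(cm^2 h)
```

Because the skin-side concentration dominates the gradient, keeping the gel saturated with drug is what holds the delivery rate near constant, which is the design goal stated in the text.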
Gelator Gelators serve as stabilizers and thickeners, thickening the gel solution while simultaneously maintaining the gel’s flexible nature. When dispersed through the solvent as a colloid, gelators give the gel a stable internal structure. Gelators are usually chosen based on their affinity for the solvent and the purpose of the gel. The nature of the gelators used determines the rigidity of the gel. There are many types of gelators, of which carbomers are the most frequently used due to their ability to thicken gels across a wide range of pH. Gelators can be classified by polymer type, namely natural, semi-synthetic and synthetic polymers. Natural gelators include tragacanth, gelatin, collagen, and guar gum; semi-synthetic gelators include methylcellulose and other cellulose derivatives; while synthetic gelators include carbomers, polyvinyl alcohol, polyethylene and its copolymers. Solvent Solvents are usually chosen based on the applications of the gel. They can be hydrophilic, lipophilic, or organic. Individual solvents can be used alone or as a mixture. Some examples of solvents include purified water, glycerin, glycols, alcohols, sucrose, toluene, and mineral oils. Drug Topical delivery is often used for drugs that are easily degraded in the GI tract, or are highly susceptible to the hepatic first-pass effect. Drugs that must be administered over long periods, or that can induce adverse drug reactions in parts of the body other than the target location, can also be formulated as topical gels. There are a number of physicochemical and biological properties that determine whether a drug is suitable for topical delivery through a gel dosage form. Physicochemical properties: The drug must: Have a molecular weight smaller than 500 daltons. Be adequately lipophilic. Have a pH value between 5 and 9 when saturated in an aqueous solution. Not be highly acidic or highly alkaline. 
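The physicochemical screening criteria above can be sketched as a simple checker. The function name, the lipophilicity cutoff (log P > 1), and the example values are illustrative assumptions, not figures from the text:

```python
# Hypothetical screen applying the physicochemical criteria stated above.
# The lipophilicity cutoff (log P > 1) and the example values are assumptions
# for illustration only.

def suitable_for_topical_gel(mol_weight_da, log_p, saturated_ph):
    """Return True if a candidate passes the stated physicochemical criteria."""
    small_enough = mol_weight_da < 500       # molecular weight under 500 daltons
    lipophilic = log_p > 1                   # "adequately lipophilic" (assumed cutoff)
    ph_ok = 5 < saturated_ph < 9             # pH of saturated aqueous solution in (5, 9)
    return small_enough and lipophilic and ph_ok

# A hypothetical candidate: 206 Da, log P 3.5, saturated-solution pH 5.2
print(suitable_for_topical_gel(206, 3.5, 5.2))  # → True
```

Any one failing criterion (for example, a molecular weight above 500 Da) rules the candidate out under this screen.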
Biological properties: The drug should be non-irritant and non-allergenic. Tolerance to the drug must not develop under a constant rate of delivery (zero-order release profile). Excipients Excipients are materials inert to the drug, which are added to dosage forms to improve the overall quality of the dosage form. Some examples include antioxidants, sweetening agents, stabilizers, dispersing agents, penetration enhancers, buffers and preservatives. Penetration enhancers are excipients that can increase skin permeability. Many classes of excipients can be used as penetration enhancers, such as glycerin, sulfoxides and related analogues, pyrrolidines, fatty acids, ethanol, and surfactants. Buffers can be added to control the pH of aqueous or hydroalcoholic gels. Examples of buffers include phosphate and citrate. Preservatives are important for their antimicrobial action, and are especially important in the formulation of hydrogels. Examples of preservatives include parabens and phenolics. Antioxidants are used to prevent gel ingredients from being oxidised. When choosing the antioxidant to be used, it is important to consider the nature of the solvent. Since the solvents of most gels are aqueous in nature, water-soluble antioxidants are more commonly used. Some common examples include sodium metabisulphite and sodium formaldehyde sulfoxylate. Sweetening agents are only used in gels that are designed for use in the oral cavity, such as dental gels. Examples include sucrose, glycerol, sorbitol and liquid glucose. Gel Preparation Methods The process of gel formation involves finding a balance between the concentrations of the gelator and the solvent. When a gelator is first added to the solvent, the mixture remains in a liquid state. As the concentration of the gelator increases to a certain critical concentration (the gelling point), gelation occurs through swelling to form the semi-solid gel. 
Further increasing the concentration of the gelator beyond the gelling point will increase gel viscosity. The exact gelling point varies depending on the properties of the gelator and the solvent, such as structure uniformity, molecular weight of the polymer, and flexibility of the polymer chain. Generally, gels are prepared by first dissolving the soluble excipients in the solvent. The solution is then mixed using a mechanical stirrer. After that, the gelator is added slowly to the stirred mixture in order to avoid aggregation. The mixture is then continuously stirred until the polymer dissolves and a gel gradually forms. The gel is allowed to settle for one to two days before it reaches its final consistency. The exact method of preparing gels depends on the properties of the formulation ingredients. Common uses of Gels Topical gels are commonly used as sustained-release dosage forms. A sustained-release dosage form reduces the need for recurrent doses while maintaining serum drug levels within the therapeutic range (between the effective and toxic concentrations), hence improving patient compliance. Some topical gels are fast-release gels, which are highly absorbent and can swell rapidly. These fast-release gels can be used to treat acute disorders. Topical gels are also used as lubricants, or carriers for pharmaceutical agents. They can be used as vehicles for different purposes, via different routes of administration, such as dental, dermatological, ophthalmic, intranasal, vaginal, rectal and others. Topical gels are commonly used in cosmetics, including shampoos, dentifrices, skin and hair care formulations and fragrance products. They can also be used to deliver anti-inflammatory steroids to the scalp in the treatment of scalp inflammation. Examples of commercially available topical gels. 
Advantages of topical gels The texture of topical gels is less greasy, as they contain a higher proportion of water compared with creams and ointments. These gels have excellent spreading properties and a cooling effect due to solvent evaporation, and also have a longer retention time on the skin. Topical gels are more stable than creams and ointments, and can adhere well to the site of application. They form an occlusive layer on the application site that can act as a form of protection. They can be washed off easily and are nontoxic due to their unique composition and structure. They have minimal side effects due to their localized effect. Topical gels are convenient and easy to apply, and their topical mode of action is non-invasive. These favorable factors improve patient compliance and tolerability. The formulation and manufacturing processes of topical gels are simpler and more cost-effective than those of other semisolid dosage forms. The release profile of the gel can be modified by altering the properties of the gelator, allowing for continuous drug delivery. Topical gels are also eco-friendly, biocompatible and biodegradable. The drug can penetrate deeply into the skin and be directly delivered to the target site, as topical application allows it to avoid hepatic first-pass metabolism. Difficulties in gastrointestinal absorption caused by pH, enzymatic activity and drug-food interactions can be minimized, while at the same time avoiding GI irritation. The topical dosage form allows stable and continuous drug delivery to the site of application, while having a faster drug release than ointments and creams. All of these factors can increase the drug’s bioavailability in the body. Limitations of topical gels Flocculation may occur in some gels, producing an unstable gel. The rheology of some gels is easily altered by environmental factors such as temperature and humidity, resulting in stricter storage requirements. 
Syneresis of the gel may occur during storage, causing the gel to shrink unpredictably or even dry out. The gelators may precipitate and salt out, and some drugs may degrade in the gel formulation due to the other ingredients present. Some additives and gelators added to the formulation may cause irritation problems, such as skin irritation, dermatitis or allergic conditions. The increased water content in gels increases the chance of microbial or fungal attack, which may contaminate the gel and make it unsuitable for use. Given the direct route of administration, drug molecules must be very small to achieve an effective plasma concentration. The particle size and other properties of the drug may also affect its absorption through the skin barrier, resulting in an unreliable effect. References Drug delivery devices Dosage forms
Topical gels
[ "Chemistry" ]
3,013
[ "Pharmacology", "Drug delivery devices" ]
73,450,975
https://en.wikipedia.org/wiki/Biochar%20carbon%20removal
Biochar carbon removal (also called pyrogenic carbon capture and storage) is a negative emissions technology. It involves the production of biochar through pyrolysis of residual biomass and the subsequent application of the biochar in soils or durable materials (e.g. cement, tar). The carbon dioxide sequestered by the plants used for biochar production is thereby stored for several hundred years, which creates carbon sinks. Definition The term refers to the practice of producing biochar from sustainably sourced biomass and ensuring that it is stored for a long period of time. The concept makes use of the photosynthesis process, through which plants remove CO2 from the atmosphere during their growth. This carbon dioxide is stabilised within the biochar during the production process and can subsequently be stored for several hundreds or thousands of years. Biochar carbon removal falls into the category of carbon dioxide removal (CDR) technologies. It is considered to be a rapidly implementable and capital-efficient negative emissions technology, well suited to smaller-scale operators such as farmers, and one that can also help rural diversification in developing countries. This is reflected, among other places, in the guidance documents of the Science Based Targets initiative. Scientifically, this process is often referred to as Pyrogenic Carbon Capture and Storage (PyCCS). The term biochar carbon removal was introduced by the European Biochar Industry Consortium in 2023 and has since been adopted by various institutions and experts. Biochar carbon removal can also be categorised as a form of Biomass Carbon Removal and Storage (BiCRS). Beyond carbon sequestration, biochar application has various other potential benefits, such as increased yield and root biomass, water use efficiency and microbial activity. Biochar production Biochar is produced through the pyrolysis process. Biomass (e.g. 
residual plant material from landscaping or agricultural processes) is reduced to smaller pieces and heated under oxygen-deficient conditions. This results in solid biochar and by-products (bio-oil, pyrogas). In order to maximise the carbon storage potential, biochar technologies that minimise combustion and avoid the loss of pyrogas into the atmosphere are typically used. In low-oxygen conditions, the thermal-chemical conversion of organic materials (including biomass) produces both volatiles, termed pyrolytic gases (pyrogases), as well as solid carbonaceous co-products, termed biochar. While the pyrogases mostly condense into liquid bio-oil, which may be used as an energy source, biochar has been proposed as a tool for sequestering carbon in soil. The global biochar market is expected to reach USD 368.85 million by 2028. Internationally, there are several voluntary standards that regulate the biochar production process and product quality. These include the following (non-exhaustive list): European Biochar Certificate (EBC) and World Biochar Certificate (WBC) developed by the Ithaka Institute The U.S. Environmental Protection Agency (EPA) The International Biochar Initiative Carbon removal potential Global scope Three main carbonaceous products are generated during pyrolysis, which can subsequently be stored in different ways to produce negative emissions: a solid biochar for various applications, a pyrolytic liquid (bio-oil) that can be pumped into depleted fossil oil repositories, and permanent-pyrogas (dominated by the combustible gases CO, H2 and CH4) that may be transferred as CO2 to geological storage after combustion. In 2022/2023, biochar carbon removal accounted for 87–92% of all delivered carbon removals. The potential extent of carbon removal with biochar is the subject of ongoing research. 
Using current waste from the farming and forestry industries worldwide, an estimated 6% of global emissions, equivalent to 3 billion tonnes of CO2, could be removed annually over a 100-year time frame. More broadly, the potential is estimated at between 0.3 and 4.9 billion tonnes of CO2 per year (GtCO2 yr−1). Permanence Permanence is a good indicator of the quality or durability of a carbon-storing material; it describes the length of time for which carbon can be stored in the material. Biochar is produced by rapidly carbonizing organic matter into maceral. There is evidence that biochar produced at sufficiently high pyrolysis temperatures resembles inertinite and is thus highly stable. The degree to which carbon dioxide is fixed and stored depends both on the biochar production process and on the subsequent application. If produced under certain conditions, 97% of the total organic carbon in biochar is highly refractory carbon, i.e. carbon that has near-infinite stability. This implies that biochar can have a very high permanence in terms of carbon dioxide storage. Although the permanence of biochar is very high, the main challenge for this process is ensuring that the pyrolysis is carried out efficiently. Applications There are several applications that are considered to store CO2 for long periods of time: Soil application: Once mixed into soil, biochar, which is less susceptible to remineralization into CO2 and CH4 than non-pyrogenic biomass, fragments into micro- and nano-particles which can be transported to deeper soil horizons, groundwater, or other compartments that further protect it from degradation. Multiple studies have demonstrated that pyrogenic carbon is stable over centennial timescales. The impact on soil fertility is context-dependent, but largely positive. It is estimated that biochar soil application could sequester 2.5 gigatons (Gt) of CO2 annually. 
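As a quick arithmetic cross-check of the headline figures above (3 Gt CO2 per year described as roughly 6% of global emissions), the implied global emissions baseline can be computed. Variable names are illustrative; all values are taken from the text:

```python
# Cross-check of the quoted biochar figures (values from the text above).
removal_gt_per_year = 3.0      # Gt CO2 removed annually from current farm/forestry waste
share_of_emissions = 0.06      # stated as ~6% of global emissions

implied_global_emissions = removal_gt_per_year / share_of_emissions
print(f"implied global emissions baseline: {implied_global_emissions:.0f} Gt CO2/yr")  # → 50

# The broader potential range quoted in the text:
low, high = 0.3, 4.9           # GtCO2 per year
print(f"potential range spans a factor of {high / low:.1f}")
```

The implied baseline of about 50 Gt CO2 per year is consistent with the order of magnitude of global greenhouse gas emissions, which is a useful sanity check that the two quoted figures agree with each other.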
Additive for construction material Cement Particleboards: Incorporating biochar into cement mixtures is still a work in progress, and exact formulations remain to be determined. In general, biochar does not impair the usefulness of cement mixtures. Biochar is a porous material and brings several benefits when mixed with cement: it is lightweight, provides good insulation and has humidity-regulating properties. Additive in asphalts While biochar-asphalt mixtures are still being studied and tested to determine more about their properties, the results so far have been promising. Asphalt is known to degrade relatively quickly when exposed to the elements and has low durability under wear and tear. Biochar additives have been shown to boost not only asphalt's overall durability but also its heat resistance. One proposed idea is using agricultural byproducts such as crop straw as a biochar additive to asphalt, thereby increasing both the economic value of the crop and the durability of the asphalt. Additive in plastics, paper and textiles The addition of biochar to plastic creates a composite, consisting of a matrix and a filler phase. Plastic is the main material in the composite, and biochar is added as a reinforcement for better thermal stability and resistance to moisture. When added to recycled plastics, biochar increases the strength and stiffness. The co-pyrolysis of plastics and biochar also decreases the cost and increases the efficiency of production. Biochar-based carbon credits Biochar carbon removal is increasingly seen as a promising negative emissions technology suitable for offsetting and carbon markets. Biochar carbon credits are considered a high-quality type of carbon credit: they improve the economic viability of the companies claiming them, and biochar stays in soil for hundreds if not thousands of years, ensuring that carbon is locked away from the atmosphere. 
Market size Trade in biochar carbon removal credits is still limited to a small number of suppliers and credit off-takers. In 2022, out of 592,969 carbon dioxide removal credits purchased on the voluntary carbon market, 40% were based on biochar carbon removal projects. Standards For the purpose of generating carbon credits, there are several internationally recognised voluntary biochar standards and methodologies. These include the following (non-exhaustive list): VERRA VM0044 Puro.Earth Biochar Methodology (Finland) CSI Global Artisanal C-Sink Nori (USA) MoorFutures (Germany) max.moor (Switzerland) Compensate (Finland) Several biochar production and carbon credit standards define criteria for permissible biomass feedstocks for biochar carbon removal. For example, the European Biochar Certificate (EBC) features a positive list of permissible biomasses for the production of biochar. This list includes agricultural residues, cultivated biomass, residues from forestry operations and sawmills, residues from landscaping activities, recycled feedstock, kitchen waste, food processing residues, textiles, anaerobic digestion residues, sludges from wastewater treatment, and animal by-products. In order for a credit to be certified, suppliers must be carbon net-negative, meaning that they sequester more carbon than they emit in the creation and distribution of biochar. See also Climate change scenario Climate engineering List of emerging technologies Dark earth Terra preta Anthrosols References External links Videos Deutsche Welle (2023) - Biochar: How burning stubble could FIGHT air pollution Biosequestration Climate engineering
Biochar carbon removal
[ "Engineering" ]
1,927
[ "Planetary engineering", "Geoengineering" ]
73,452,946
https://en.wikipedia.org/wiki/Proselenos
Proselenos () is a concept referring to the belief that the ancient Arcadians were a people older than the Moon (Selene in Greek) itself. This aspect of the Arcadian identity stood in opposition to the other groups inhabiting the Peloponnese, who claimed to be descended from the Dorians. There were some exceptions, however, such as the Eleans (who were thought to descend from Aetolia), the Cynurians (who adapted Dorian elements into their local identity), and the Achaeans (who were thought to have relocated to the northern Peloponnese following the Dorian invasion of the peninsula). The antiquity of the Arcadians was also shown in their mythical ancestry: they claimed descent from the hero Pelasgus, who sprang from the earth to become their ancestor and whose son was Lycaon, the grandfather of the region's eponymous hero, Arcas. Concept origin The oldest reference to the story of the Arcadians pre-dating the moon is attested in the Classical period (479–323 BCE) of Greek history, from the fifth-century BCE historian Hippys of Rhegium. The fragment from Hippys is preserved in the later (sixth-century CE) work of Stephanus of Byzantium. The term is also applied by the fourth-century BCE philosopher Aristotle, as well as Eudoxos of Cnidus, a fourth-century BCE astronomer and mathematician. An unknown fifth-century poet also mentioned proselenaios as an epithet of Pelasgus, the ancestor of the Arcadians; Borgeaud and Nielsen propose that this may have been the fifth-century Theban lyric poet, Pindar. Later sources The idea that the Arcadians were older than the moon is also referenced in the work of later writers, such as Apollonius of Rhodes, Statius and Lucian. 
Plutarch also mentions that the Arcadians shared a kinship with oak trees, as they were believed to be the first men to spring from the earth, existing already when the first oak was planted, further illustrating the great antiquity of the Arcadians as the people who preceded the moon. Notes References Bibliography Arcadian mythology Moon myths
Proselenos
[ "Astronomy" ]
451
[ "Astronomical myths", "Moon myths" ]
73,453,650
https://en.wikipedia.org/wiki/Institute%20of%20Theoretical%20Physics%2C%20Saclay
The Institute of Theoretical Physics ("Institut de physique théorique") (IPhT) is a research institute of the Direction of Fundamental Research (DRF) of the French Alternative Energies and Atomic Energy Commission (CEA). The Institute is also a joint research unit of the Institute of Physics (INP), a subsidiary of the French National Center for Scientific Research (CNRS). It is associated with Paris-Saclay University. IPhT is situated on the Saclay Plateau, south of Paris. History The IPhT was created in 1963 as the "Service de Physique Théorique" (SPhT), succeeding the "Service de Physique Mathématique" (SPM) of the CEA. It became an institute (and took the name IPhT) in 2008. It was initially devoted to nuclear physics and superconductivity. Particle physics quickly became an important theme. After its move in 1968 from the main CEA-Saclay site to the present site of Orme des Merisiers, quantum field theory became a major research topic, together with statistical physics. Subsequently, new topics such as conformal theories and matrix models, cosmology and string theory, condensed matter physics and out-of-equilibrium statistical physics, and quantum information found their place there. IPhT is usually considered one of the top theoretical physics research institutes in Europe. 
Present research themes Research at IPhT covers most areas of theoretical physics: Cosmology and astroparticle physics Particle physics: quantum chromodynamics, hadron physics, collider physics, scattering amplitudes, physics beyond the standard model Quantum gravity, string theory Mathematical physics: quantum field theory, conformal field theory, integrable systems, topological recursion, combinatorics, random geometries Condensed matter physics Statistical physics: out-of-equilibrium systems, complex systems, network theory, biophysics Quantum information science IPhT organizes each spring the "Itzykson Conference", an international meeting centered on a theme that changes every year. Its name is a tribute to Claude Itzykson, a former IPhT researcher. Teaching IPhT is not part of a teaching department, but graduate and postgraduate courses in theoretical physics are organized at IPhT. They are aimed at graduate students and researchers of the Paris area. The lecturers are researchers from IPhT or other Paris-area labs, and senior visitors of IPhT. Most courses are part of the curriculum of the Ecole Doctorale Physique en Ile de France (EDPIF). IPhT hosts numerous master's and graduate students, as well as postdoctoral researchers. Research dissemination and outreach Talks and conferences at IPhT are usually live-streamed and available for replay on the IPhT YouTube channel. Outreach talks and presentations for the general public are also available there. Many scientific books have been published by researchers from IPhT, aimed at students and researchers as well as at the general public. 
Researchers of IPhT Some researchers who held permanent positions at SPM/SPhT/IPhT: Claude Bloch, Édouard Brézin, Gilles Cohen-Tannoudji, Cirano de Dominicis, Bernard Derrida, Claude Itzykson, Stanislas Leibler, Madan Lal Mehta, Albert Messiah, Stéphane Nonnenmacher, Yves Pomeau, Volker Schomerus, Raymond Stora, Lenka Zdeborová, Jean Zinn-Justin, Jean-Bernard Zuber Some researchers who are presently (2023) members of IPhT: Roger Balian, Jean-Paul Blaizot, François David, Philippe Di Francesco, Michel Gaudin, David Kosower, Vincent Pasquier, Mannque Rho, Hubert Saleur, Pierre Vanhove, André Voros Directors of IPhT Claude Bloch: 1963–1971 Cirano de Dominicis: 1971–1979 Roger Balian: 1979–1987 André Morel: 1987–1992 Jean Zinn-Justin: 1993–1998 Jean-Paul Blaizot: 1998–2004 Henri Orland: 2004–2011 Michel Bauer: 2011–2016 François David: 2017–2021 Catherine Pépin: 2022– Campus The IPhT is located on the Plateau de Saclay, about 20 km southwest of Paris, on the Orme des Merisiers site, which is an annex of the main CEA-Saclay center. References External links IPhT lectures web site IPhT YouTube channel (seminars, talks, outreach presentations) French Alternative Energies and Atomic Energy Commission Theoretical physics French National Centre for Scientific Research Paris-Saclay University 1963 establishments in France
Institute of Theoretical Physics, Saclay
[ "Physics" ]
941
[ "Theoretical physics" ]
73,453,679
https://en.wikipedia.org/wiki/Indium%28II%29%20chloride
Indium(II) chloride is a hypothetical inorganic compound with the formula InCl2. Its existence has been disproved, and the substance claimed to be indium(II) chloride is a mixture of various indium subchlorides. History Indium(II) chloride was first reported in 1888 by Lars Fredrik Nilson, who claimed to have produced it from the reaction of indium metal and hydrogen chloride gas at 200 °C. However, this has been called into doubt, as characterization by X-ray diffraction and NMR failed. In 1983, an investigation found that the solid claimed to be indium(II) chloride is actually a 5:1 mixture of In5Cl9, alternatively formulated In3[In2Cl9], and InCl3. References Chlorides Metal halides Indium compounds
Indium(II) chloride
[ "Chemistry" ]
172
[ "Chlorides", "Inorganic compounds", "Metal halides", "Salts" ]
73,456,066
https://en.wikipedia.org/wiki/TMP-HTag
Trimethoprim-HaloTag (TMP-HTag) is a small-molecule chemical linker developed for the rapid and reversible control of protein localization in living cells (Ballister). TMP is a dihydrofolate reductase (DHFR) inhibitor chosen for its specificity in binding to the bacterial form of DHFR. The other half of the linker is a HaloTag ligand, a chloroalkane group that binds covalently and irreversibly to the self-labelling bacterial globular HaloTag enzyme. Positioned between the TMP group and the HaloTag ligand is a flexible linker that can be modified to optimize protein-linking efficiency. The modular structure of TMP-HTag makes it an ideal heterobifunctional tool for use in chemically induced dimerization (CID). Additionally, TMP-HTag can be modified to include photo-cleavable groups that allow for the control of CID using light. TMP-HTag Examples TMP-HTag Applications References Organic compounds
TMP-HTag
[ "Chemistry" ]
225
[ "Organic compounds" ]
73,457,377
https://en.wikipedia.org/wiki/N%2CN%27-Diallyl-L-tartardiamide
{{DISPLAYTITLE:N,N'-Diallyl-L-tartardiamide}} N,N′-Diallyl-L-tartardiamide (DATD) is a crosslinking agent for polyacrylamide gels, e.g., as used for SDS-PAGE. Compared to bisacrylamide gels, DATD gels have a stronger interaction with glass, and therefore are used in applications where the polyacrylamide gel acts as a "plug" structural component at the bottom of a gel electrophoresis apparatus, thereby preventing a weak discontinuous gel from sliding out of or otherwise moving within the apparatus. Unlike bisacrylamide-polyacrylamide gels, DATD-polyacrylamide gels can be conveniently dissolved using periodic acid due to the presence of vicinal diols in DATD. DATD is the slowest of the polyacrylamide crosslinkers tested, and can act as an inhibitor of polymerization at high concentrations. See also bisacrylamide References Acrylamides Monomers
N,N'-Diallyl-L-tartardiamide
[ "Chemistry", "Materials_science" ]
238
[ "Monomers", "Polymer chemistry" ]
73,460,839
https://en.wikipedia.org/wiki/Osmium%20tetrasulfide
Osmium tetrasulfide is an inorganic compound of osmium and sulfur with the chemical formula OsS4. Synthesis Osmium tetrasulfide can be made by passing hydrogen sulfide through acidified solutions of osmium tetroxide: Physical properties Osmium tetrasulfide forms dark brown crystals. It does not dissolve in cold water. It is soluble in dilute nitric acid. It forms hydrates. Osmium tetrasulfide decomposes upon melting. References Osmium compounds Sulfur compounds Sulfides
Osmium tetrasulfide
[ "Chemistry" ]
112
[ "Inorganic compounds", "Inorganic compound stubs" ]
73,461,429
https://en.wikipedia.org/wiki/Cerium%20monosulfide
Cerium monosulfide is a binary inorganic compound of cerium and sulfur with the chemical formula CeS. It is the simplest of the cerium sulfides. Synthesis Heating stoichiometric amounts of the pure elements at 2450 °C: Reduction of dicerium trisulfide with cerium hydride: Physical properties Cerium monosulfide forms yellow crystals of the cubic crystal system, space group Fm3m, cell parameter a = 0.5780 nm, Z = 4, with a NaCl-type structure. The compound melts congruently at 2450 °C. Chemical properties Cerium monosulfide has a wetting effect on metals, and it is relatively stable toward metals other than platinum. It can react violently with platinum to form an intermetallic compound, platinum cerium. References Cerium compounds Sulfur compounds Monosulfides Rock salt crystal structure
Cerium monosulfide
[ "Chemistry" ]
183
[ "Inorganic compounds", "Inorganic compound stubs" ]
73,462,494
https://en.wikipedia.org/wiki/Ytterbium%28II%29%20sulfide
Ytterbium(II) sulfide is a binary inorganic compound of ytterbium and sulfur with the chemical formula YbS. Synthesis Ytterbium(II) sulfide can be synthesized by a reaction of the pure elements in an inert atmosphere: An alternative synthesis is the comproportionation of ytterbium(III) sulfide and ytterbium metal in vacuum at 1000–1100 °C: Physical properties Ytterbium(II) sulfide forms black crystals of cubic symmetry, space group Fm3m, cell parameter a = 0.5658 nm, Z = 4. Ytterbium(II) sulfide demonstrates semiconductor behavior. References Ytterbium(II) compounds Sulfur compounds Monosulfides Rock salt crystal structure
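The crystallographic data above (rock-salt structure, a = 0.5658 nm, Z = 4) are enough to estimate a theoretical X-ray density for YbS. The molar mass below uses standard atomic weights; the result is an estimate derived here, not a value from the source:

```python
# Theoretical (X-ray) density of YbS from the cell data quoted above.
N_A = 6.02214076e23        # Avogadro constant, mol^-1
a_cm = 0.5658e-7           # lattice parameter: 0.5658 nm expressed in cm
Z = 4                      # formula units per unit cell
M = 173.045 + 32.06        # molar mass of YbS in g/mol (standard atomic weights)

rho = Z * M / (N_A * a_cm ** 3)   # density = mass per cell / cell volume
print(f"theoretical density of YbS ≈ {rho:.2f} g/cm^3")
```

The computation gives roughly 7.5 g/cm³, a typical value for a dense lanthanide monosulfide with the rock-salt structure.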
Ytterbium(II) sulfide
[ "Chemistry" ]
160
[ "Inorganic compounds", "Inorganic compound stubs" ]
73,463,362
https://en.wikipedia.org/wiki/Quantum%20battery
A quantum battery is a type of electric battery that uses the principles of quantum mechanics to store energy. Quantum batteries have the potential to be more efficient and powerful than traditional batteries, but are still in the early stages of development. History The concept of quantum batteries was first proposed in 2013. The maximum amount of work that can be extracted from a quantum battery is called its ergotropy. By making the battery and the device being powered inseparable, such as by using quantum entanglement, a greater power output is possible than with classical storage. The first model proposed for a quantum battery was the Dicke model in 2018. Initially, the Dicke quantum battery appeared to show a quantum advantage in charging power. However, in 2020, it was demonstrated that the battery's Hamiltonian needed to be adjusted. Researchers found that the Dicke quantum battery, in fact, does not provide any quantum advantage. The SYK quantum battery, proposed in 2020, is the first many-body quantum battery that shows a quantum advantage in the charging process. Experiments on quantum batteries are in their infancy, and to date, there is no fully functional quantum battery. Models Dicke Quantum Battery The Dicke quantum battery uses the Dicke model to store energy. This battery was first proposed due to its relation with superradiant emission and its practical feasibility. The Dicke model describes the collective interaction of an ensemble of N two-level atoms (TLSs) with a single mode of the cavity field. Cavities are typically composed of two or more mirrors that reflect light back and forth, creating a standing wave of electromagnetic radiation, with frequencies determined by the cavity’s geometry. In standard notation the Hamiltonian reads H = ω_c a†a + ω_a J_z + λ (a† + a)(J_+ + J_−). The first term describes the energy of the photons. The second term describes the energy of the qubits. The third term describes the interaction between photons and qubits. λ is the coupling parameter. 
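The Dicke model described above can be diagonalized numerically for small N. The sketch below assumes the standard collective-spin form of the Hamiltonian with the coupling rescaled by 1/√N (the normalization that stays well-defined as N grows) and a truncated photon Fock space; all parameter names and values are illustrative:

```python
import numpy as np

# Numerical sketch of a small Dicke system: N two-level atoms coupled to one
# cavity mode, H = wc * a†a + wz * Jz + (g / sqrt(N)) * (a + a†)(J+ + J-).
# Parameter names (wc, wz, g) and values are illustrative assumptions.

def dicke_hamiltonian(N=2, n_ph=4, wc=1.0, wz=1.0, g=0.1):
    # Photon operators on a Fock space truncated to n_ph levels.
    a = np.diag(np.sqrt(np.arange(1, n_ph)), k=1)   # annihilation operator
    num = a.conj().T @ a                            # number operator a†a
    # Collective spin operators in the symmetric (Dicke) subspace, spin j = N/2,
    # with basis states ordered m = j, j-1, ..., -j.
    j = N / 2
    m = np.arange(j, -j - 1, -1)
    Jz = np.diag(m)
    Jp = np.diag(np.sqrt(j * (j + 1) - m[1:] * (m[1:] + 1)), k=1)  # raising operator
    Jm = Jp.conj().T                                               # lowering operator
    I_ph, I_s = np.eye(n_ph), np.eye(len(m))
    return (wc * np.kron(num, I_s)
            + wz * np.kron(I_ph, Jz)
            + (g / np.sqrt(N)) * np.kron(a + a.conj().T, Jp + Jm))

H = dicke_hamiltonian(N=4, n_ph=6)
print("Hilbert-space dimension:", H.shape[0])
print("ground-state energy:", np.linalg.eigvalsh(H)[0])
```

Restricting to the symmetric subspace keeps the spin sector only (N + 1)-dimensional, so for N = 4 with six Fock levels the full Hilbert space is just 30-dimensional and exact diagonalization is trivial.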
This model initially seemed to show that the mean charging power scales super-extensively with the number of qubits. However, this Hamiltonian is not well-defined in the thermodynamic limit (the number of qubits N taken to infinity while the relevant intensive quantities are kept constant). To fix this, it is necessary to rescale the coupling constant by a factor of 1/√N. By doing so, scientists found that this battery does not provide any quantum advantage. SYK Quantum Battery The SYK quantum battery uses the Sachdev–Ye–Kitaev model to store energy. This battery uses the direct charging protocol, in which the total Hamiltonian is the sum of the battery Hamiltonian and an interaction Hamiltonian that is switched on during charging. This is the first many-body model that shows a super-extensive charging power. References Battery types Quantum mechanics
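Written out (again in assumed standard notation, not the article's original symbols), the thermodynamic-limit substitution and the direct charging protocol take the form:

```latex
\lambda \;\longrightarrow\; \frac{\lambda}{\sqrt{N}},
\qquad\qquad
H(t) \;=\; H_0 + \lambda(t)\, H_1
```

where H_0 is the battery Hamiltonian, H_1 is the interaction (charging) Hamiltonian, and λ(t) is a classical control parameter that is nonzero only during the charging window.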
Quantum battery
[ "Physics" ]
515
[ "Applied and interdisciplinary physics", "Quantum mechanics", "Applications of quantum mechanics" ]
73,463,381
https://en.wikipedia.org/wiki/Earl%20Hays%20Press
The Earl Hays Press is a Los Angeles company providing props to cinema and television productions. The company was established by Earl Hays in 1915 but in the 1960s was sold to employee Ralph Hernandez Senior, whose descendants retain ownership. The company specialises in producing generic printed matter such as food packages, documents and advertisements to avoid intellectual property issues with real brands. Earl Hays produces a number of generic newspapers; these often include standard layouts on inside pages, one of which has been featured in hundreds of films and television series. Other notable products are "Morley" cigarette packets, in imitation of the Marlboro brand, and facsimile currency. History The company was founded by Earl Hays in 1915 to provide props to Hollywood, Los Angeles, productions. Hays had the idea for the company after making sketches of international vehicle license plates while travelling. By 1944 the business employed a press writer and four printers solely in manufacturing newspapers, magazines and other printed matter for movie studios. Ralph Hernandez Senior joined the firm in 1964 and by the end of the decade had purchased it from Hays. The press is located at a site near Hollywood Burbank Airport. Hernandez's son and grandchildren were also involved in the firm and, by 2023, his grandson Keith Hernandez owned the company. The 2023 Writers Guild of America strike threatened the future of the business; while the company typically works on around 100 productions simultaneously, during the strike this was reduced to just two or three. Hernandez auctioned some of the company's historic rental stock to avoid redundancies. Products The Earl Hays Press specialises in "insert printing", ephemera that adds realism to theatrical scenes such as food and beverage packages, menus, magazines, currency, documents, credit cards, DVD and record covers, comic books and advertisements.
The company provides non-copyrighted versions of these for productions that cannot afford to, or are unwilling to, license real products. The company maintains a number of antique printing presses to provide authenticity to its products. In line with Earl Hays' original business, the press offers a variety of US and international license plate designs. A stock of around 25,000 items is maintained. A major product is newspapers tailored to the time period of the setting. The papers are produced in a number of titles and formats to imitate publications ranging from high-end financial papers to small local newspapers. Headlines can be customised to cover plotlines in the production, but the article text is usually nonsense and pages are filled with generic photographs and headlines such as "Million Dollar Highway Repair Underway" and "New Government Tower Planned". Inside pages are often to a generic design, and one particular layout, including the headline "She's 3rd Brightest But Hard Gal To See" and an image of a brunette woman in a sweater, has been noticed in hundreds of films and television shows since the 1970s. A newspaper prop created by Earl Hays Press has been seen in many productions, including Desperate Housewives, Modern Family, Married with Children, Scrubs, No Country for Old Men and Absolute Power. The Earl Hays Press produces "Morley" cigarette packets. These resemble the trademarked Marlboro brand and are smoked prominently by the Cigarette Smoking Man in The X-Files, Chandler Bing in Friends and Spike in Buffy the Vampire Slayer. The first known appearance of the Morley brand was in the 1960 film Psycho; the packets have also appeared in The Twilight Zone, The Walking Dead, Malcolm in the Middle and Burn Notice. The firm also produces bespoke publications such as the fake adult magazine Playpen that appears in Friends. Earl Hays Press also produces fake money for use in productions.
These are similar to regular US dollars but carry the signature of the graphic designer rather than the US Treasurer, and all have the same serial number. The US Secret Service checks and approves the designs before production to ensure they are sufficiently different from currency in circulation. Despite following this protocol, the bills provided by Earl Hays Press for the production of the 1965 film The Cincinnati Kid caused issues. The fake currency, including $100 bills, used in a poker scene was taken by crew members and distributed in the bars and clubs of New Orleans. The currency entered circulation across the world, and the Secret Service required the Earl Hays Press to burn its remaining stock and original printing plates under its supervision. References Manufacturing companies based in Los Angeles Prop design American companies established in 1915 Printing companies of the United States
Earl Hays Press
[ "Engineering" ]
886
[ "Design", "Prop design" ]
73,463,968
https://en.wikipedia.org/wiki/Ytterbium%28III%29%20sulfide
Ytterbium(III) sulfide is a binary inorganic compound of ytterbium and sulfur with the chemical formula Yb2S3. Preparation Ytterbium(III) sulfide can be obtained by reacting ytterbium and sulfur in an inert atmosphere at 450–800 °C. It can also be prepared by passing hydrogen sulfide through heated ytterbium(III) oxide. Physical properties Ytterbium(III) sulfide forms yellow crystals of rhombic symmetry, with cell parameters a = 0.678 nm, b = 0.995 nm, c = 0.361 nm. Uses The compound is used in the production of ceramics and as a catalyst. References Ytterbium(III) compounds Sulfur compounds Sesquisulfides
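The two preparation routes described correspond to the following balanced equations (standard stoichiometry, given here for illustration):

```latex
2\,\mathrm{Yb} + 3\,\mathrm{S} \;\xrightarrow{450\text{--}800\,^{\circ}\mathrm{C}}\; \mathrm{Yb_2S_3}
\qquad\qquad
\mathrm{Yb_2O_3} + 3\,\mathrm{H_2S} \;\xrightarrow{\Delta}\; \mathrm{Yb_2S_3} + 3\,\mathrm{H_2O}
```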
Ytterbium(III) sulfide
[ "Chemistry" ]
160
[ "Inorganic compounds", "Inorganic compound stubs" ]
73,464,124
https://en.wikipedia.org/wiki/Capronia%20cogtii
Capronia cogtii is a rare species of lichenicolous (lichen-dwelling) fungus in the family Herpotrichiellaceae. Found in northern Mongolia, it was described as a new species in 2019. Taxonomy Capronia cogtii belongs to the fungal family Herpotrichiellaceae and is characterized by its hyaline ascospores, which distinguish it from most other Capronia species that have pigmented ascospores. The new species is most similar to C. amylacea, C. hypotrachynae, C. normandinae, and C. pseudonormandinae, but can be distinguished by its smaller ascomata, longer hyaline ascospores, and different host genus, Vahliella (Vahliellaceae), compared to Peltigera (Peltigeraceae). The species epithet cogtii was given in honor of the late Professor Ulzii Cogt, who was a prominent figure in Mongolian lichenology. Description The vegetative hyphae of Capronia cogtii are pale brown, 2–3.5 μm wide, septate, and ramify from the lower parts of the . The ascomata are , blackish, more or less glossy, roughly spherical to ovoid, and occasionally shortly at the apex. They are above, ostiolate, 90–150 μm in diameter, and have a rough surface. The setae are dark brown, straight, not branched, 15–60 μm tall, 4–5 μm wide at base, and arise from a discrete dark foot-cell. The exciple is made of medium to dark brown cells outwardly, and somewhat hyaline, strongly elongated, radially compressed cells inwardly. The are hyaline, measure 10–20 by 2–3 μm, septate, and are not branched. The ascospores are hyaline, to very narrowly , and typically have 3 transverse septa (sometimes as few as 1 or as many as 5). They are usually constricted at the septa, smooth-walled, and overlappingly crowded in the ascus. Habitat and distribution Capronia cogtii is known only from the holotype, which was collected on the thallus of Vahliella leucophaea and occasionally on adjacent decaying mosses in sparse Larix sibirica mountain forest in northern Mongolia.
The host lichen, Vahliella leucophaea, is morphologically similar to some species of Pannariaceae and has long been placed in this family. This is the first species of Capronia known to grow on members of Vahliellaceae. Two Capronia species are known to grow on Pannariaceae hosts, C. magellanica, growing on species of Fuscopannaria, and C. paranectrioides, growing on species of Erioderma. Capronia cogtii is also similar to C. andina and C. solitaria but can be distinguished by its hyaline ascospores and larger ascomata, respectively. References Eurotiomycetes Lichenicolous fungi Fungi described in 2019 Fungi of Asia Taxa named by Mikhail Petrovich Zhurbenko Fungus species
Capronia cogtii
[ "Biology" ]
677
[ "Fungi", "Fungus species" ]
73,465,449
https://en.wikipedia.org/wiki/Process%20Biochemistry
Process Biochemistry is a monthly peer-reviewed scientific journal that covers the study of biochemical processes and their applications in industries, such as food, pharmaceuticals, and biotechnology. The journal was established in 1966 and is published by Elsevier. The editor-in-chief is Joseph Boudrant (University of Lorraine). The journal covers a wide range of topics related to biochemical processes, including enzyme and microbial technology, protein engineering, metabolic engineering, biotransformations, and bioseparations. The journal publishes research articles, review articles, and case studies. Abstracting and indexing The journal is abstracted and indexed in the Science Citation Index Expanded and Scopus. According to the Journal Citation Reports, the journal has a 2021 impact factor of 4.885. References External links English-language journals Elsevier academic journals Academic journals established in 1966 Monthly journals Biochemistry journals
Process Biochemistry
[ "Chemistry" ]
179
[ "Biochemistry stubs", "Biochemistry journals", "Biochemistry literature", "Biochemistry journal stubs" ]
76,395,086
https://en.wikipedia.org/wiki/NGC%202523B
NGC 2523B is a spiral galaxy located around 186 million light-years away in the constellation Camelopardalis. The discovery of this galaxy is credited to Philip C. Keenan, in his paper Studies of Extra-Galactic Nebulae. Part I: Determination of Magnitudes, published in The Astrophysical Journal in 1935. According to A.M. Garcia, NGC 2523B is a member of the five-member UGC 4057 galaxy group (also known as LGG 149). The other galaxies in the group are NGC 2523, UGC 4014, UGC 4028, and UGC 4057. See also List of NGC objects (2001–3000) References External links 2523B spiral galaxies Camelopardalis 023025 +12-08-030 04259 08072+7342 Astronomical objects discovered in 1935
NGC 2523B
[ "Astronomy" ]
176
[ "Camelopardalis", "Constellations" ]
76,395,135
https://en.wikipedia.org/wiki/Social%20justice%20index
A social justice index is a set of numbers which have been calculated through weighing several indicators of various entities, usually countries, but also regions or commercial firms. These indicators are considered related to social justice. The European Union Social Justice Index, published in September 2015 by Bertelsmann Stiftung, is based on 35 indicators. The highest number (7.48) is given to Sweden, whilst the lowest one (3.57) goes to Greece. The Social Justice in the EU and OECD Index, published in September 2019 also by Bertelsmann Stiftung, ranks 41 countries, from the highest one (7.90, Iceland) to the lowest one (4.76, Mexico). It considers 6 dimensions of social justice: poverty prevention, equitable education, labor market access, social inclusion and non-discrimination, intergenerational justice and health. For some countries, like Sweden, this index has been calculated since 2009 and every 2 or 3 years. The Adasina Social Justice Index is a stock market index of about 9,000 publicly traded securities. Adasina is a financial analysis firm. These securities are included in this index (or excluded from it) according to 4 criteria: racial justice, gender justice, economic justice and climate justice. The Adasina Social Justice Index is designed to support progressive movements. See also Government effectiveness index World Governance Index References Effective altruism Index numbers International rankings
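Composite indices of this kind are typically built by normalizing each indicator to a common scale and combining them as a weighted average. The sketch below is purely illustrative: the dimension names follow the six listed above, but the scores and weights are hypothetical, not the Bertelsmann Stiftung's actual data or methodology.

```python
# Toy illustration of how a composite social justice index can be computed:
# each dimension is scored on a 1-10 scale, then combined as a weighted
# average. All scores and weights below are hypothetical.

def composite_index(scores, weights):
    """Weighted average of normalized dimension scores."""
    total_weight = sum(weights.values())
    return sum(scores[k] * weights[k] for k in scores) / total_weight

# Hypothetical dimension scores for one country (1 = worst, 10 = best)
scores = {
    "poverty_prevention": 7.2,
    "equitable_education": 6.8,
    "labor_market_access": 7.5,
    "social_inclusion": 6.9,
    "intergenerational_justice": 6.1,
    "health": 7.8,
}

# Hypothetical weights; a real index may weight some dimensions more heavily
weights = {
    "poverty_prevention": 3,
    "equitable_education": 2,
    "labor_market_access": 2,
    "social_inclusion": 1,
    "intergenerational_justice": 1,
    "health": 1,
}

print(round(composite_index(scores, weights), 2))  # prints 7.1
```

Varying the weights changes a country's rank, which is why published indices document their weighting scheme alongside the indicator list.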
Social justice index
[ "Mathematics", "Biology" ]
291
[ "Effective altruism", "Behavior", "Altruism", "Mathematical objects", "Index numbers", "Numbers" ]
76,395,229
https://en.wikipedia.org/wiki/M.%20C.%20Escher%3A%20Visions%20of%20Symmetry
M. C. Escher: Visions of Symmetry is a book by mathematician Doris Schattschneider published by W. H. Freeman in 1990. The book analyzes the symmetry of M. C. Escher's colored periodic drawings and explains the methods he used to construct his artworks. Escher made extensive use of two-color and multi-color symmetry in his periodic drawings. The book contains more than 350 illustrations, half of which were never previously published. Structure and topics The book is divided into five chapters. Before the main text there is a foreword and a preface, and the book is concluded with a concordance, afterword (in the second edition only), bibliography and four indexes. The first chapter, 'The Route to Regular Division', describes Escher's early artistic development, and how Escher first became intrigued by the problem of filling the plane with interlocking shapes (tessellation). This work came to dominate his art from 1937. He was also encouraged, by a half-brother who was a professor of geology, to study papers on symmetry by Pólya and other mathematicians in the Zeitschrift für Kristallographie. These helped launch Escher into his own detailed investigations of the rules for generating the allowable patterns for tiling the plane. The second chapter 'The 1941–1942 Notebooks' presents, for the first time, the complete set of numbered drawings from the two 1941–1942 notebooks which summarize Escher's theory of the regular divisions of the plane, and details the classification system Escher used to organize his drawings. The third chapter 'The Regular Division Drawings' is the longest in the book at 118 pages. It reproduces all of the known drawings (numbers 1 to 137) and the known periodic designs (A1 to A14) from Escher's 1938–1941 notebooks together with his notes on their symmetry type. The fourth chapter 'The Use of Regular Division' explains that Escher regarded his periodic drawings as a means to an end rather than as finished works of art in their own right. 
The periodic drawings were the solutions to the question of what was possible when tiling the plane using the rules that Escher had established. Escher used the periodic drawings as a basis for developing his completed artworks. The fifth chapter 'Notes on the drawings' provides additional information on each of the drawings in chapter 3. For each drawing the following information is given: number, title, place drawn, medium, dimensions, Escher system type, symmetry group, previous publication, and notes. The book concludes with a concordance, which gives a complete tabulation of the symmetry groups represented by Escher's periodic drawings, and an afterword, in the second edition only, which outlines the developments in the subject between 1990 and 2004. Audience In her preface, the author states that her objective for the book is to answer the question "How did he do it?". The audience for the book is any person who admires, or is interested in, M. C. Escher's periodic drawings and would like to understand his methods for designing and executing his artworks. As no prior mathematical knowledge is needed to understand the material presented in the book, it is appropriate for a general audience. As Michele Emmer comments in his review: "It is important that, with this beautiful volume, artists and scientists can look at Escher's original notebooks." Reception The book was widely reviewed and its reception was very positive. Alan L. Mackay in a full-page review for Nature wrote: "This book contains very many colour reproductions of the periodic drawings and analyses the 1941–42 notebooks which show Escher's development [...] Taking Doris Schattschneider's beautiful volume with earlier books, especially that by Bruno Ernst, documentation of Escher's life, intellectual development and corpus must now be almost complete."
Roger Goodwin writing in The British Journal of Aesthetics said "This book, the product of more than fifteen years of research by its mathematician author, provides the definitive account of how Escher produced his renowned interlocking drawings, based on the regular division of the plane." Michele Emmer reviewing the book in Leonardo wrote: "Escher's theory, recorded in the notebooks of 1941–1942, has never been completely published before. All the 150 color drawings of interlocking patterns that he produced from 1937 to 1941 are reproduced in the book. It is, of course, the most essential part of the volume." Marjorie Senechal wrote the entry for Mathematical Reviews: "The development of Escher's ideas is carefully traced, the influence of his work on others, and vice versa, is discussed, and all of the notebook drawings are presented in full color. Doris Schattschneider has written the Escher book for mathematicians." John Galloway reviewing the book for New Scientist said: "Many books have been written about Escher's art. None has approached Visions of Symmetry for its scope, scale and sumptuousness. The sheer beauty and ingenuity of the pictures keep you turning the pages as though the book were a collection of detective stories whose plots are brilliantly organised patterns." In an extensive review in The American Mathematical Monthly Douglas Dunham said: "For the Escher fan, Visions of Symmetry fills a gap in the literature by showing all of his notebook patterns, answering the question "how did he do it?", and relating the patterns to his prints. For the person interested in tilings and patterns, Visions of Symmetry provides many beautiful examples (which illustrate the theory expounded in Grünbaum and Shepard's Tilings and patterns [1987])." J. Kevin Colligan reviewing the book for The Mathematics Teacher wrote: "This book sits on the boundary between mathematics and art, as did Escher. 
In fact, this book supports the argument that no such boundary exists; rather, the two disciplines coexist and intermingle, enriching both." Paul Garcia writing in The Mathematical Gazette writes: "I recommend the book highly to anyone - the price is small compared to the scope and interest of the work. Doris Schattschneider has done us all a tremendous favour by compiling this book." Influence David Topper writing of the second edition in Choice said "This beautiful book remains one of the essential studies of this most popular artist." Gerald L. Alexanderson writing in MAA Reviews said "It's an impressive piece of scholarship that is extraordinarily beautiful as well. This book is an old friend and it's good to welcome it back in such an elegant and sumptuous form." Laurence Goldstein reviewing the second edition in Print Quarterly said: "... the reader is enabled to glimpse the process through which the artist struggled towards the finished works of art that Hofstadter (and, of course, many others) find so sensuously gratifying. There is also a wealth of biographical information concerning the mathematical and artistic influences on Escher's work, and on the creative process as witnessed by people close to him and as perceived by the artist himself." A brief, unsigned review in Science said: "Escher's periodic tilings have made the artist a favorite of mathematicians and scientists. In her classic 1990 book, Schattschneider analyzed his art and notebooks to explain how Escher created his colorful, puzzle-like regular divisions of the plane [...] This new edition adds a short survey of reflections of his work in mathematics, computer graphics, the Internet, and contemporary art." An unsigned review in the Epsilon Pi Tau Journal of Technology Studies said: "A revision of a classic book that appeared in 1990, this is the most penetrating study of Escher's work in existence and the one most admired by scientists and mathematicians. 
It deals with one powerful obsession that preoccupied Escher: what he called the 'regular division of the plane', the puzzle-like interlocking of birds, fish, lizards, and other natural forms in continuous patterns. Schattschneider explores how he succeeded at this task by meticulously analyzing his notebooks." Editions First edition entitled Visions of Symmetry: Notebooks, Periodic Drawings, and Related Work of M. C. Escher published by W.H. Freeman in 1990. Second, revised edition entitled M. C. Escher: Visions of Symmetry: Notebooks, Periodic Drawings, and Related Work published by Harry N. Abrams in 2004. References External links at the Internet Archive M. C. Escher Mathematics and art Patterns Symmetry Tessellation Mathematics books 1990 non-fiction books
M. C. Escher: Visions of Symmetry
[ "Physics", "Mathematics" ]
1,785
[ "Tessellation", "Euclidean plane geometry", "Geometry", "Planes (geometry)", "Symmetry" ]
76,396,169
https://en.wikipedia.org/wiki/NGC%202012
NGC 2012 is a large lenticular galaxy in the constellation Mensa. It was discovered by John Herschel in 1836. At a distance of over 236 million light-years from Earth, NGC 2012 is not visible to the naked eye, and a large telescope is needed to observe it. No space probe has ever been sent to study the galaxy. Discovery The polymath John Herschel observed the galaxy in 1836, and it was then added to the New General Catalog (NGC). The galaxy is a relatively long distance from Earth, making Herschel's find very uncommon for the time period. References Spiral galaxies Mensa (constellation) 2012 17194 Astronomical objects discovered in 1836 Discoveries by John Herschel
NGC 2012
[ "Astronomy" ]
140
[ "Mensa (constellation)", "Constellations" ]
76,396,388
https://en.wikipedia.org/wiki/Inonotus%20hastifer
Inonotus hastifer is a species of fungus in the family Hymenochaetaceae, first described by Zdeněk Pouzar in 1981. Distribution and habitat It has been recorded in North America and Europe, with the most sightings in Europe. It grows in deciduous forest, on dead trunks of hornbeam and beech. References External links hastifer Fungi described in 1981 Fungus species
Inonotus hastifer
[ "Biology" ]
83
[ "Fungi", "Fungus species" ]
76,396,513
https://en.wikipedia.org/wiki/New%20Society%20of%20Artists
The New Society of Artists was formed in London in 1921. Its primary aim was to give a chance for artists whose work had not been accepted by the Royal Academy (RA) to exhibit their work in London and, later, in the provinces. In 1932 it became the United Society of Artists. The last known exhibition was in Margate in June 2017. History The formation of the New Society of Artists (NSA) was announced in 1921. It was initially intended for artists whose works were “crowded out” from hanging by the RA, and to give artists in the provinces a chance to exhibit in London. The Provisional Council consisted of The Hon. John Collier, Mr Alex Maclean, Mr C R Chisman, Mr Henry S Kortright, Mr Percy Edsall and Mr Stafford Leake. The inaugural exhibition was to have been at the Guildhall Art Gallery on 8 June, but instead it opened at the Royal Society of British Artists in Suffolk Street, not far from the RA, on 3 July. The location had been hired for five weeks each summer thanks to the efforts of Charles Robert Chisman and Percy Edsall, “both secretaries of well-known art societies”. A Yorkshire newspaper reported that the exhibition opened with nearly 400 paintings and drawings, “and a very ordinary lot they are, showing in several instances marked imitative tendencies”. One of the most prominent exhibitors was the Welsh artist Miss Margaret Lindsay Williams, with two works: “Lorenzo Babini” and “The Imprisoned Soul”. Charles de Lacy reported that there had been a rush for membership of the new society. The following year, the first provincial exhibition of works by NSA members was opened by the Mayor on 10 February at the Museum and Art Gallery, Burton on Trent. In April, a second such exhibition was opened in Worthing; it was greeted with lukewarm praise in the local press. In June 1922, the second annual exhibition opened. It was “an improvement on the first, and less like a collection of Academy crowded-outs. 
Women provide much of the quality…” In 1923, the Westminster Gazette commented: Others took a more parochial view. The Hampstead News, for example, said that "There is much in it to interest residents in Hampstead and St John's Wood, as so many well-known artists from these parts have sent exhibits. The Hanging Committee ... have done their work well". The regional exhibition in February 1924 was held in Northampton. From there it moved to Cheltenham in March. At the 1924 annual exhibition in London, excellent portraits by W Howard Robinson, Frank E Beresford, A Jonniaux and E Newling were highlighted among the 400 exhibits. The fifth annual show opened as usual at Suffolk Street in June 1925. One reviewer commented: A significant change came in 1926, when the annual London show opened in January instead of the summer. In June 1932 the annual exhibition opened in London, but this time under the name of the United Society of Artists; members were entitled to use the post-nominal UA. The main reasons for the name change were The Society's 65th Annual Exhibition was held in London from 30 January to 10 February 1985, with an entrance fee of £1. The last known exhibition was in Margate in June 2017. A large number of the Society's annual exhibition catalogues is held at the National Art Library in London. Known NSA/UA members Compiled from exhibitors mentioned in reviews of NSA exhibitions. NSA members (1921-1931) NSA/UA members in 1932 UA exhibitors/members, 1932 and later Entries without references are derived from the 1932 catalogue. References
New Society of Artists
[ "Engineering" ]
733
[ "Design", "Art and design organizations" ]
76,396,726
https://en.wikipedia.org/wiki/Russula%20violacea
Russula violacea is a species of agaric fungus in the family Russulaceae, first described by Lucien Quélet. Distribution and habitat R. violacea has been recorded in North America, Asia and Europe, with the most occurrences in Europe. It grows in coniferous and deciduous forests, under Quercus, Betula and Populus trees; it has also been spotted on coal dumps. It fruits between August and October. References External links violacea Fungi described in 1883 Fungus species
Russula violacea
[ "Biology" ]
101
[ "Fungi", "Fungus species" ]
76,396,899
https://en.wikipedia.org/wiki/GRB%20200522A
GRB 200522A is a large kilonova in the constellation Pisces. It was first observed in May 2020 by the Hubble Space Telescope. It is the result of the largest neutron star explosion ever recorded, and was bright enough to be visible to Hubble from 5.4 billion light-years away. Formation GRB 200522A is believed to have formed when two neutron stars collided and exploded, creating an extremely large and bright short gamma-ray burst. The brightness of the emission was 10 times that predicted, and the event released around 10,000 times more energy than the Sun will produce in its entire 10-billion-year lifetime. These findings and numbers, aided by Hubble, led to the conclusion that the kilonova is masking an extremely large and magnetized neutron star. Reactions Prominent astronomer and professor Wen-fai Fong said of the kilonova: “It's amazing to me that after 10 years of studying the same type of phenomenon, we can discover unprecedented behavior like this.” The discovery paper, as abstracted by ADS, states: "This is substantially lower than on-axis short GRB afterglow detections but is a factor of ≈8-17 more luminous than the kilonova of GW170817 and significantly more luminous than any kilonova candidate for which comparable observations exist." See also Gamma ray 2020 in science References Astronomical objects discovered in 2020 Gamma-ray bursts Hubble Space Telescope
GRB 200522A
[ "Physics", "Astronomy" ]
286
[ "Physical phenomena", "Stellar phenomena", "Astronomical events", "Gamma-ray bursts" ]
76,397,416
https://en.wikipedia.org/wiki/Europium%28III%29%20iodate
Europium(III) iodate is an inorganic compound with the chemical formula Eu(IO3)3. It can be produced by the hydrothermal reaction of europium(III) nitrate or europium(III) oxide with iodic acid in water at 230 °C. It decomposes on heating. It reacts hydrothermally with iodine pentoxide and molybdenum trioxide at 200 °C to give Eu(MoO2)(IO3)4(OH). References Europium compounds Iodates
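The hydrothermal synthesis from the oxide corresponds to the following balanced equation (standard stoichiometry, shown for illustration; the nitrate route proceeds analogously):

```latex
\mathrm{Eu_2O_3} + 6\,\mathrm{HIO_3} \;\longrightarrow\; 2\,\mathrm{Eu(IO_3)_3} + 3\,\mathrm{H_2O}
```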
Europium(III) iodate
[ "Chemistry" ]
116
[ "Iodates", "Oxidizing agents" ]
76,397,627
https://en.wikipedia.org/wiki/Volvariella%20media
Volvariella media is a species of agaric fungus in the family Pluteaceae, described by Rolf Singer in 1951. Distribution and habitat It has been recorded in Asia and Europe, with the most sightings in Europe. It grows in grass. References External links Pluteaceae Fungi described in 1951 Fungus species
Volvariella media
[ "Biology" ]
66
[ "Fungi", "Fungus species" ]
76,397,902
https://en.wikipedia.org/wiki/Volvariella%20pusilla
Volvariella pusilla is a species of agaric fungus in the family Pluteaceae, described by Rolf Singer in 1951. Morphology Cap: 1 to 3 cm in diameter, bell-shaped at first, then expanding. The skin is silky-stringy when young, slightly sticky and white; the flesh is cream-colored; the edge is more stringy and sometimes cracked. Lamellae: Free, white when young and pink when the spores mature. The lamellae come close to the stipe but do not touch it. Stipe: White and thin. Distribution and habitat It has been recorded in Asia, Europe, North America, Africa and Australia, with the most sightings in Europe. It grows in forests, parks, botanical gardens, allotment gardens and by roads, sometimes close to houses, on the ground in grass. References External links Pluteaceae Fungi described in 1951 Fungus species
Volvariella pusilla
[ "Biology" ]
192
[ "Fungi", "Fungus species" ]
76,399,631
https://en.wikipedia.org/wiki/Architects%20Registration%20Council%20of%20Nigeria
The Architects Registration Council of Nigeria (ARCON) is a statutory body tasked with regulating the architectural profession within Nigeria. It was established under the ARCON Act, Decree No 10 of 1969, amended by Decree No 43 of 1990, and currently operates under the Architects (Registration, Etc.) Act Cap A19 The Laws of the Federation of Nigeria 2004. The ARCON Act was established to set and maintain professional standards in the architectural field in Nigeria. It grants ARCON the authority to determine and periodically update the qualifications and competencies required to practice architecture. The council's primary mandate is to enforce the standards of knowledge and skill necessary for professional practice. The council maintains a register of qualified architects authorised to practice, and ensures compliance with ethical standards and accountability. It issues a code of professional conduct and may impose sanctions for cases of professional misconduct or serious incompetence. ARCON also accredits architectural education programmes and professional development initiatives in Nigeria. Its role includes setting standards of practice within the profession to safeguard the interests of both practitioners and the public. The council collaborates with stakeholders, such as educational institutions and government bodies, and also faces various operational challenges, including funding constraints and enforcement issues. History Before the establishment of the Architects Registration Council of Nigeria (ARCON), the practice of architecture in Nigeria was largely unregulated. While traditional architectural practices varied across the country, the modern profession lacked uniform standards, leading to inconsistencies in competency and professionalism. This became evident in the post-independence era as Nigeria sought to modernise and align its professional practices with international standards. 
The Nigerian Institute of Architects (NIA), founded in 1960, was one of the first formal organisations established to promote the profession. However, there was still a growing need for a regulatory framework, as the absence of defined standards for architectural education and professional qualifications allowed unqualified individuals to practice. In response to this gap, the Federal Military Government established ARCON in 1969 through Decree No. 10, to oversee the practice of architecture and ensure that practitioners meet certain standards. ARCON underwent several revisions over the years. In 1996, the founding decree was reviewed, leading to the creation of the Architects Registration Board of Nigeria (ARBON) to oversee qualifying examinations. This legal framework was further reviewed and amended with the passage of the Architects (Registration, Etc.) Act Cap A19 under the Laws of the Federation of Nigeria 2004, which updated and formalised the council's regulatory functions. In 2007, as part of its regulatory efforts, ARCON made a strategic plan to enhance the profession, ensuring that architects in Nigeria adhere to global best practices while safeguarding the interests of the public. Throughout its history, ARCON has played a central role in the development of the architectural profession in Nigeria. Legal basis The Architects Registration Council of Nigeria (ARCON) was established by Decree No. 10 of 1969, which was amended by Decree No. 43 of 1990. Following the transition to democratic governance in Nigeria, a comprehensive review of all existing laws took place, leading to the renaming and revision of the law establishing ARCON. In 2004, the law was renamed the Architects (Registration, Etc.) Act Cap A19 under the Laws of the Federation of Nigeria, aligning with the new political framework. ARCON operates as a parastatal under the Federal Ministry of Works and Housing, which is responsible for regulating the council's activities. 
This amendment integrated modern provisions to address challenges such as the globalization of architectural standards. The Act describes ARCON's responsibilities and functions, and grants it the authority to regulate, oversee, and enforce standards within the architectural profession in Nigeria. It details the eligibility criteria for registration and accreditation of educational programmes, and the enforcement of disciplinary measures. Through these provisions, ARCON ensures that only licensed individuals engage in the practice of architecture, to secure public interest and maintain professional integrity. Section 2 of the Act describes ARCON's mandate and outlines the council's powers to review and update its standards periodically, ensuring adaptability to advancements in both practice and technology. The Act is divided into two schedules and 18 sections, which collectively comprise the framework for the regulation of the study and practice of architecture. Some provisions include the process for registering architects, defining qualifications for practice, maintaining registers, and approval of institutions. The Act also created the Architects Registration Board of Nigeria (ARBON) to oversee licensing examinations. A characteristic feature of the Act is its emphasis on public protection. It empowers ARCON to investigate cases of professional incompetence or unethical behavior, with sanctions ranging from fines to the withdrawal of licensure. This regulatory scope aligns with international best practices seen in bodies like the National Council of Architectural Registration Boards (NCARB) in the United States, which also mandates adherence to strict ethical codes, and the Royal Institute of British Architects (RIBA) in the United Kingdom, which emphasizes education and professional practice. However, unlike RIBA, ARCON has legal authority embedded in national legislation, granting it enforcement powers beyond advocacy and education. 
Structure and governance ARCON, established as a policy-making organ of government, is composed of 49 members representing various sectors and stakeholders within the profession. The council's composition, qualifications, and operational procedures are detailed in the First Schedule of the ARCON Act. According to Section 2(2) of the Act, the council includes: Four persons appointed by the Minister to represent various interests in architecture. One representative appointed by the Minister from each state of the Federation and the Federal Capital Territory (FCT), Abuja. Four representatives from accredited universities with faculties of architecture, ensuring rotational representation. Four members appointed by the Nigerian Institute of Architects (NIA). Members must be fully registered architects with at least ten years of professional experience. The Minister of Housing and Urban Development oversees the appointment process and retains authority to adjust council membership to address evolving needs and circumstances. The council's leadership is anchored by the President. Arc. Dipo Ajayi has served as President since his first tenure began on 12 June 2018, succeeding Arc. Umaru Aliyu. Other key officers include the Vice President, Registrar, and Treasurer. In 2024, the Minister of Housing and Urban Development, Arc. Ahmed Musa Dangiwa, inaugurated the reconstituted council. During subsequent elections, Arc. Dipo Ajayi was re-elected as President, defeating Arc. Kabiru Ibrahim. Other elected officers included Arc. Mohammed Aminu Kani as Vice President, Arc. Umar Murnai as Registrar, and Arc. Ugwuanyi Onyedikachi Odobuma as Treasurer. Functions The Architects Registration Council of Nigeria performs the following functions: Registration of architects ARCON registers qualified architects and maintains an official register of practitioners in Nigeria. 
Accreditation of architectural programmes The council accredits architectural programmes offered by institutions in Nigeria to uphold educational standards. Regulation of architectural practice ARCON regulates architectural practice through guidelines, policies, and monitoring compliance with statutory requirements. Continuing professional development The council organizes training, workshops, and seminars to enhance the skills and knowledge of architects. Professional discipline ARCON investigates cases of professional misconduct or incompetence and imposes appropriate sanctions. Collaboration with stakeholders The council collaborates with educational institutions, government bodies, and other stakeholders to advance architectural education and practice. Oversight of examinations ARCON oversees professional examinations required for architectural licensing and practice in Nigeria. Policy formulation The council contributes to the development of policies and regulations related to architecture and the built environment in Nigeria. Registration and accreditation The registration of architects in Nigeria is governed by the Architects (Registration, Etc.) Act, which specifies the qualifications and procedures for registration with the Architects Registration Council of Nigeria (ARCON). The Act recognises various qualifications obtained from approved institutions, including Nigerian universities, Commonwealth Association of Architects recognised schools, foreign schools producing accepted Nigerian architects, and other approved qualifications. Additionally, individuals seeking registration undergo character certification and provide proof of reciprocity, if applicable. 
Individual Architect Registration Individuals seeking registration as architects with ARCON follow a prescribed procedure, which includes completing application forms, providing necessary documents such as educational certificates, passing professional practice competence examinations, and obtaining character certifications. The Council conducts a preliminary scrutiny of applications, and the Registration Committee reviews recommendations before final approval. Successful applicants are registered upon payment of prescribed fees and are subject to annual renewal. In order to achieve full registration with ARCON, individuals must fulfill several requirements. These include obtaining a Bachelor's degree from an accredited school of architecture, followed by enrollment in a two-year Master's degree programme. Additionally, candidates must possess a National Youth Service Corps (NYSC) Certificate and complete a minimum of two years of practical training, known as pupillage, under the supervision of a registered architect. ARCON maintains a register of architects categorised as fully registered, or provisionally registered (Stage 1 and Stage 2). Fully registered architects are entitled to practice independently, while provisionally registered architects may practice under supervision. These categories are largely based on qualifications and experience. Registration of Architectural Firms Architectural firms in Nigeria are also subject to registration with ARCON, based on specified criteria outlined in the Act. The Act distinguishes between different categories of architectural firms, including those incorporated under the Companies and Allied Matters Act, limited liability companies with professional architects as directors and shareholders, and multidisciplinary consultancy firms offering services in the building industry. The registration process involves submitting application forms, documentary evidence of compliance with regulations, and endorsements from sponsors. 
Accreditation of Architectural Institutions In addition to individual registration, ARCON accredits institutions offering architecture programmes in Nigeria. The accreditation process involves evaluating the facilities, faculty qualifications, curriculum, and adherence to required standards. Representatives from ARCON conduct inspections and assessments to ensure compliance with accreditation requirements. Use of the title "architect" ARCON, through the Architects (Registration, Etc.) Act 2004, establishes strict regulations regarding the use of the title "architect" in Nigeria, in relation to architectural business and practice. These regulations are designed to ensure that individuals using the title meet specific criteria and prescriptions set forth by the Act. According to Section 1 of the Act, only Nigerian citizens who are registered under the Architects Registration Council of Nigeria (ARCON) are permitted to use the title "architect." This is to safeguard the interests of both practitioners and the public. Individuals who are not registered architects under the Act are expressly prohibited from using the title "architect" in connection with architectural building plans or any business related to architecture, except in cases pertaining to ship construction, landscape, or golf-links. ARCON is tasked with the responsibility of enforcing these regulations and ensuring compliance with the provisions of the Act. Violations of the Act's provisions regarding the use of the title "architect" may result in penalties and disciplinary actions. Collaborations The Nigerian Institute of Architects (NIA) and the Architects Registration Council of Nigeria (ARCON) share a collaborative relationship for the regulation of the study and practice of architecture within Nigeria. Their collaboration spans various initiatives, promoting the interests of architectural practitioners and the public. 
Together, the NIA and ARCON work closely to conduct periodic accreditation exercises at schools offering architecture programmes across the country. Through these exercises, they ensure that architectural education meets the required standards and prepares students adequately for professional practice. Additionally, both organisations collaborate on setting and updating guidelines and regulations for architectural practice within Nigeria. The NIA and ARCON also collaborate on initiatives to enhance the membership drive and support architectural professionals throughout their careers. They provide avenues for continuous professional development, networking opportunities, and advocacy for the profession's interests at both local and international levels. In addition to collaborating with the NIA, ARCON also partners with the Association of Nigerian Chartered Architects (ANCA) to support architecture in Nigeria. ANCA provides a platform for chartered architects to interact professionally and promotes ethical discipline among its members, aligning with ARCON's objectives of regulating the profession. By working hand in hand, they uphold standards, support practitioners, and promote the profession's significance in the built environment sector. Challenges and conflicts ARCON, like many regulatory bodies in Nigeria, encounters various challenges that impede its effective functioning. Some major challenges include: Inadequate funding from the government, leading to delays in processing applications for architectural registration, corruption and lack of enforcement, which can lead to the approval of substandard buildings and unqualified architects, and lack of awareness. These challenges have significant implications for ARCON's operations and its ability to regulate the architecture profession effectively. 
Conflicts between ARCON and NIA ARCON and the Nigerian Institute of Architects (NIA) have been embroiled in various disputes over the years, affecting the architecture profession in Nigeria. The two bodies clashed over the composition of the ARCON council, leading to litigation and the Ministry's cancellation of council inaugurations in 2022. Litigation between ARCON and the NIA further exacerbated tensions between the two organisations. Legal disputes over matters such as nomination processes and the conduct of examinations resulted in prolonged conflicts and uncertainty within the profession. The unresolved disputes have had negative consequences for architectural professionals in Nigeria. Many graduates have been unable to register or proceed with licensure programmes due to the disruptions caused by the conflicts between ARCON and the NIA. However, recent attempts were made to address the enduring conflicts between ARCON and the NIA. Following discussions between ARCON, the NIA and ANCA in 2022, agreements were reached to create a unified Professional Competency Evaluation Programme and establish a Memorandum of Understanding for Higher National Diploma Certificate holders within the architectural field. See also Nigerian Institute of Architects Notes References Works cited External links Official website 1969 establishments in Nigeria Architecture-related professional associations Architecture organizations Architecture in Nigeria Professional associations based in Nigeria Government agencies of Nigeria Government agencies established in 1969
Architects Registration Council of Nigeria
[ "Engineering" ]
2,806
[ "Architecture organizations", "Architecture" ]
76,399,949
https://en.wikipedia.org/wiki/GALAX
GALAX Microsystems Limited is a computer hardware manufacturer founded in 1994 and based in Hong Kong. The company specializes in producing video cards, gaming monitors, solid-state drives, memory modules, computer coolers, and other computer accessories. Overview Galaxy Microsystems was founded in August 1994. Galaxy became one of NVIDIA's add-in-card (AIC) partners in 1999. In 2011, Galaxy launched the Hall Of Fame (HOF) product line, designed for overclocking enthusiasts and hardcore gamers. In 2013, Galaxy Microsystems changed its trademark name to GALAX. References External links Official website Graphics hardware companies Motherboard companies Privately held companies of Hong Kong Computer hardware companies Computer systems companies Hong Kong brands
GALAX
[ "Technology" ]
143
[ "Computer hardware companies", "Computer systems companies", "Computers", "Computer systems" ]
76,400,333
https://en.wikipedia.org/wiki/Natural%20Gas%20Policy%20Act%20of%201978
The Natural Gas Policy Act of 1978 (NGPA) is federal legislation enacted in response to the US natural gas shortages of 1976–77. It was enacted with the following motivations: To create a balance between natural gas supply and demand, To create a national gas market, and To transition to market-based prices. The NGPA: Authorized the Federal Energy Regulatory Commission (FERC) to regulate interstate and some intrastate natural gas production and transportation. Was designed to encourage the development of new natural gas supplies by gradually deregulating wellhead gas prices. Established maximum lawful prices (ceilings) for the sale of natural gas, which were phased out over a series of years, allowing market forces to set natural gas prices. The Natural Gas Policy Act (NGPA) was the first building block in a plan from the Carter Administration to increase energy supply while reducing domestic consumption of energy. It preceded the Energy Security Act that President Carter would sign into law in 1980. Natural gas was persistently in short supply throughout the 1970s, largely because of the following: Wellhead price regulation The 1973 OPEC oil embargo. Against this backdrop, Congress enacted the NGPA. The act was intended to raise natural gas rates to market-clearing levels. Effect on natural gas supply The NGPA set prices based on whether natural gas was old, new, or high-cost. New gas commanded a higher price than old gas, which encouraged oil and gas producers to develop new sources of natural gas. However, this pricing system created a risk of higher prices for consumers and inhibited production from the existing vast old-gas reservoirs. Wellhead price controls The wellhead price controls in the act had a direct effect on natural gas industry commerce. The following are the sections of NGPA Title 1, Wellhead Price Controls: Section 101. 
Inflation adjustment; other general price ceiling rules. Section 102. Ceiling price for new natural gas and certain natural gas produced from the Outer Continental Shelf. Section 103. Ceiling price for new, onshore production wells. Section 104. Ceiling price for sales of natural gas dedicated to interstate commerce. Section 105. Ceiling price for sales under existing intrastate contracts. Section 106. Ceiling price for sales under rollover contracts. Section 107. Ceiling price for high-cost natural gas. Section 108. Ceiling price for stripper well natural gas. Section 109. Ceiling price for other categories of natural gas. Section 110. Treatment of State severance taxes and certain production-related costs. Section 108. Ceiling price for stripper well natural gas Stripper wells are wells that are marginally productive. In recent history, stripper well production has made up about 8.2% of United States natural gas production. The NGPA defined a stripper well as, essentially, a well that produces less than 60 Mcf (60,000 cubic feet) per day during any 90-day interval. However, there was an exemption for wells to which an enhanced recovery technique, as defined in the NGPA, had been applied. Nonassociated natural gas (gas from different geological zones) which is produced from a stripper well that exceeds 60 Mcf per day during any 90-day production period may continue to qualify as stripper well natural gas if the increase in the well's production was the result of the application of a recognized enhanced recovery technique. References Natural gas industry Energy law Energy policy 1978 in American law 95th United States Congress
Natural Gas Policy Act of 1978
[ "Environmental_science" ]
702
[ "Environmental social science", "Energy policy" ]
76,400,572
https://en.wikipedia.org/wiki/Power%20Plant%20and%20Industrial%20Fuel%20Use%20Act
The Power Plant and Industrial Fuel Use Act (FUA) was an act enacted in 1978 by the U.S. Congress which prohibited: The use of natural gas or petroleum as an energy source in any new electric power plant; and Construction of any new electric power plant without the capability to use coal or any alternate fuel as a primary energy source. It also prohibited the use of natural gas or petroleum as the primary energy source in a new major fuel-burning installation (MFBI) consisting of a boiler. The legislation was part of the National Energy Plan of President Jimmy Carter. Objectives The major purposes of the FUA program were: To reduce the importation of petroleum. To increase the nation's use of indigenous energy resources. To conserve natural gas and petroleum and minimize their use as primary energy sources. To foster greater use of coal. To encourage the use of synthetic fuels. To reduce the vulnerability of the US to energy-supply interruptions. Criticism Because the act required coal capability in new power plants, environmentalists expressed concern that power plants would burn more coal without the imposition of stricter sulphur dioxide emission controls, which could increase the risk of acid rain. References Energy policy Alternative fuels
Power Plant and Industrial Fuel Use Act
[ "Environmental_science" ]
252
[ "Environmental social science", "Energy policy" ]
76,400,854
https://en.wikipedia.org/wiki/Phellinus%20arctostaphyli
Phellinus arctostaphyli, also known as the manzanita conk or the manzanita hoof polypore, is a species of shelf fungus. Native to western North America, this saprotrophic fungus only colonizes the wood of Ceanothus, Adenostoma, and Arctostaphylos. P. arctostaphyli is closely related to three other North American Phellinus species, including Phellinus tremulae and Phellinus tuberculosus. However, in part due to the "economic insignificance of its hosts," P. arctostaphyli is relatively poorly studied as an individual species. The conks or hoofs (basidiocarps) appear perennially and are themselves tough and woody, with tiny pores on the underside and black to gray rings on top that are prone to fracturing longitudinally. This species was first described by William Henry Long in 1917 as Fomes arctostaphyli. In 1954, mycologist Josiah L. Lowe argued that it was a synonym of Fomes igniarius. Tuomo Niemelä moved it into the genus Phellinus in 1975. The presence of P. arctostaphyli has been correlated with manzanita mortality in Mexico. References Sources Fungi of North America Fungus species Phellinus
Phellinus arctostaphyli
[ "Biology" ]
282
[ "Fungi", "Fungus species" ]
76,400,911
https://en.wikipedia.org/wiki/Lisa%20McNeill
Lisa Sylvia McNeill (born 1977) is a New Zealand academic, and is a full professor at the University of Otago, specialising in consumer behaviour, especially with respect to sustainability, ethical consumption and fashion. Early life and education McNeill was born in 1977. She studied at the University of Otago, graduating with a Bachelor of Arts degree in 1999 and a Bachelor of Commerce degree with first-class honours in marketing management in 2001. She completed a PhD, with a thesis titled Retail sales promotion in the supermarket industry: a tri-country comparison of New Zealand, Singapore and Malaysia, at the University of Otago in 2003. Academic career McNeill then joined the faculty of the University of Otago, rising to associate professor in 2014 and full professor in 2023. She has been the Associate Dean Postgraduate Research since 2019. McNeill is a principal investigator in the Food Waste Innovation research theme at Otago, where her interest is in sustainable packaging of food, and consumer behaviour. She is also part of an international collaboration on sustainable fashion aimed at enabling multi-disciplinary conversations on sustainable fashion. According to the university, her 2015 paper with colleague Rebecca Moore examining relationships that people have with fashion and sustainability is "internationally considered a seminal work", and she authored one of the top ten most downloaded papers in the International Journal of Consumer Studies in 2020. McNeill researches consumer behaviour, and is interested in consumer identity, sustainable fashion, and fashion ethics. Her research projects have included the concept of 'slow fashion', consumer understanding of fashion labelling, textile consumption and waste, collaborative consumption, and cultures of fashion, waste, repair and wardrobe curation in Korea, New Zealand and Canada. 
McNeill is an associate editor or on the editorial board of a number of journals, including the International Journal of Consumer Studies, Sustainability, and Young Consumers. Selected works References External links The wearable weight of being, Inaugural professorial lecture by McNeill, 5 October 2023 via YouTube 1977 births Living people New Zealand academics New Zealand women academics University of Otago alumni Academic staff of the University of Otago Consumer behaviour
Lisa McNeill
[ "Biology" ]
424
[ "Behavior", "Consumer behaviour", "Human behavior" ]
76,401,012
https://en.wikipedia.org/wiki/Topological%20deep%20learning
Topological deep learning (TDL) is a research field that extends deep learning to handle complex, non-Euclidean data structures. Traditional deep learning models, such as convolutional neural networks (CNNs) and recurrent neural networks (RNNs), excel at processing data on regular grids and sequences. However, scientific and real-world data often exhibit more intricate data domains encountered in scientific computations, including point clouds, meshes, time series, scalar fields, graphs, or general topological spaces like simplicial complexes and CW complexes. TDL addresses this by incorporating topological concepts to process data with higher-order relationships, such as interactions among multiple entities and complex hierarchies. This approach leverages structures like simplicial complexes and hypergraphs to capture global dependencies and qualitative spatial properties, offering a more nuanced representation of data. TDL also encompasses methods from computational and algebraic topology that permit studying properties of neural networks and their training process, such as their predictive performance or generalization properties. The mathematical foundations of TDL are algebraic topology, differential topology, and geometric topology. Therefore, TDL can be generalized for data on differentiable manifolds, knots, links, tangles, curves, etc. History and motivation The term "topological deep learning", including multichannel TDL and multitask TDL, was first introduced in 2017. Traditional techniques from deep learning often operate under the assumption that a dataset resides in a highly structured space (like images, where convolutional neural networks exhibit outstanding performance over alternative methods) or a Euclidean space. 
The prevalence of new types of data, in particular graphs, meshes, and molecules, resulted in the development of new techniques, culminating in the field of geometric deep learning, which originally proposed a signal-processing perspective for treating such data types. While originally confined to graphs, where connectivity is defined based on nodes and edges, follow-up work extended concepts to a larger variety of data types, including simplicial complexes and CW complexes, with recent work proposing a unified perspective of message-passing on general combinatorial complexes. An independent perspective on different types of data originated from topological data analysis, which proposed a new framework for describing structural information of data, i.e., their "shape," that is inherently aware of multiple scales in data, ranging from local information to global information. While at first restricted to smaller datasets, subsequent work developed new descriptors that efficiently summarized topological information of datasets to make them available for traditional machine-learning techniques, such as support vector machines or random forests. Such descriptors ranged from new techniques for feature engineering, to new ways of providing suitable coordinates for topological descriptors, to the creation of more efficient dissimilarity measures. Contemporary research in this field is largely concerned with either integrating information about the underlying data topology into existing deep-learning models or obtaining novel ways of training on topological domains. Learning on topological spaces Focusing on topology in the sense of point-set topology, an active branch of TDL is concerned with learning on topological spaces, that is, on different topological domains. An introduction to topological domains One of the core concepts in topological deep learning is the domain upon which data is defined and supported. 
In the case of Euclidean data, such as images, this domain is a grid, upon which the pixel values of the image are supported. In a more general setting this domain might be a topological domain. Next, we introduce the most common topological domains that are encountered in a deep learning setting. These domains include, but are not limited to, graphs, simplicial complexes, cell complexes, combinatorial complexes, and hypergraphs. Given a finite set S of abstract entities, a neighborhood function on S is an assignment that attaches to every point in S a subset of S or a relation. Such a function can be induced by equipping S with an auxiliary structure. Edges provide one way of defining relations among the entities of S. More specifically, edges in a graph allow one to define the notion of neighborhood using, for instance, the one-hop neighborhood notion. Edges, however, are limited in their modeling capacity as they can only be used to model binary relations among entities of S, since every edge typically connects two entities. In many applications, it is desirable to permit relations that incorporate more than two entities. The idea of using relations that involve more than two entities is central to topological domains. Such higher-order relations allow for a broader range of neighborhood functions to be defined on S to capture multi-way interactions among entities of S. Next, we review the main properties, advantages, and disadvantages of some commonly studied topological domains in the context of deep learning, including (abstract) simplicial complexes, regular cell complexes, hypergraphs, and combinatorial complexes. Comparisons among topological domains Each of the enumerated topological domains has its own characteristics, advantages, and limitations: Simplicial complexes Simplest form of higher-order domains. Extensions of graph-based models. Admit hierarchical structures, making them suitable for various applications. 
Hodge theory can be naturally defined on simplicial complexes. Require relations to be subsets of larger relations, imposing constraints on the structure. Cell complexes Generalize simplicial complexes. Provide more flexibility in defining higher-order relations. Each cell in a cell complex is homeomorphic to an open ball, attached together via attaching maps. Boundary cells of each cell in a cell complex are also cells in the complex. Represented combinatorially via incidence matrices. Hypergraphs Allow arbitrary set-type relations among entities. Relations are not imposed by other relations, providing more flexibility. Do not explicitly encode the dimension of cells or relations. Useful when relations in the data do not adhere to constraints imposed by other models like simplicial and cell complexes. Combinatorial complexes Generalize and bridge the gaps between simplicial complexes, cell complexes, and hypergraphs. Allow for hierarchical structures and set-type relations. Combine features of other complexes while providing more flexibility in modeling relations. Can be represented combinatorially, similar to cell complexes. Hierarchical structure and set-type relations The properties of simplicial complexes, cell complexes, and hypergraphs give rise to two main features of relations on higher-order domains, namely hierarchies of relations and set-type relations. Rank function A rank function on a higher-order domain X is an order-preserving function rk: X → Z, where rk(x) attaches a non-negative integer value to each relation x in X, preserving set inclusion in X. Cell and simplicial complexes are common examples of higher-order domains equipped with rank functions and therefore with hierarchies of relations. Set-type relations Relations in a higher-order domain are called set-type relations if the existence of a relation is not implied by another relation in the domain. Hypergraphs constitute examples of higher-order domains equipped with set-type relations. 
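To make the rank-function and set-type ideas concrete, here is a minimal Python sketch. The class and method names are illustrative only (they are not taken from any established TDL library): cells are stored as frozensets of vertices with an assigned integer rank, and adding a cell checks that the rank function stays order-preserving with respect to set inclusion.

```python
class CombinatorialComplex:
    """Toy combinatorial complex: each cell is a frozenset of vertices
    carrying a non-negative integer rank."""

    def __init__(self):
        self.rank = {}  # maps cell (frozenset) -> rank (int)

    def add_cell(self, vertices, rank):
        cell = frozenset(vertices)
        if rank < 0:
            raise ValueError("rank must be non-negative")
        # Order-preserving check: set inclusion must not decrease rank.
        for other, r in self.rank.items():
            if other < cell and r > rank:
                raise ValueError("rank function is not order-preserving")
            if cell < other and rank > r:
                raise ValueError("rank function is not order-preserving")
        self.rank[cell] = rank

    def cells_of_rank(self, k):
        return [c for c, r in self.rank.items() if r == k]

cc = CombinatorialComplex()
for v in (1, 2, 3, 4):
    cc.add_cell([v], rank=0)          # vertices
cc.add_cell([1, 2], rank=1)           # an edge-like binary relation
# Set-type flexibility: {1, 2, 3} can be added without requiring all of
# its subsets (e.g. {2, 3}) to be cells, unlike in a simplicial complex.
cc.add_cell([1, 2, 3], rank=2)
cc.add_cell([1, 2, 3, 4], rank=2)     # same rank despite strict inclusion
```

A hypergraph corresponds to dropping the rank bookkeeping entirely, while a simplicial complex would additionally demand closure under taking subsets; this sketch enforces only the order-preserving rank condition described above.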
Given the modeling limitations of simplicial complexes, cell complexes, and hypergraphs, the combinatorial complex was introduced as a higher-order domain that features both hierarchies of relations and set-type relations.

The learning tasks in TDL can be broadly classified into three categories:

Cell classification: predict targets for each cell in a complex. An example is triangular mesh segmentation, where the task is to predict the class of each face or edge in a given mesh.

Complex classification: predict targets for an entire complex, for example the class of each input mesh.

Cell prediction: predict properties of cell-cell interactions in a complex and, in some cases, predict whether a cell exists in the complex. An example is the prediction of linkages among entities in the hyperedges of a hypergraph.

In practice, to perform these tasks, deep learning models designed for specific topological spaces must be constructed and implemented. These models, known as topological neural networks, are tailored to operate effectively within these spaces.

Topological neural networks
Central to TDL are topological neural networks (TNNs), specialized architectures designed to operate on data structured in topological domains. Unlike traditional neural networks tailored for grid-like structures, TNNs are adept at handling more intricate data representations, such as graphs, simplicial complexes, and cell complexes. By harnessing the inherent topology of the data, TNNs can capture both local and global relationships, enabling nuanced analysis and interpretation.

Message passing topological neural networks
In a general topological domain, higher-order message passing involves exchanging messages among entities and cells using a set of neighborhood functions.

Definition: Higher-Order Message Passing on a General Topological Domain. Let $\mathcal{X}$ be a topological domain. We define a set $\mathcal{N} = \{\mathcal{N}_1, \ldots, \mathcal{N}_n\}$ of neighborhood functions on $\mathcal{X}$. Consider a cell $x$ and let $y \in \mathcal{N}_k(x)$ for some $\mathcal{N}_k \in \mathcal{N}$.
A message $m_{x,y}$ between cells $x$ and $y$ is a computation dependent on these two cells or on the data supported on them. Denote by $\mathcal{N}(x)$ the multi-set $\{\!\{\mathcal{N}_1(x), \ldots, \mathcal{N}_n(x)\}\!\}$, and let $\mathbf{h}_x^{(l)}$ represent some data supported on cell $x$ at layer $l$. Higher-order message passing on $\mathcal{X}$, induced by $\mathcal{N}$, is defined by the following four update rules:

1. $m_{x,y} = \alpha_{\mathcal{N}_k}(\mathbf{h}_x^{(l)}, \mathbf{h}_y^{(l)})$, for some $\mathcal{N}_k \in \mathcal{N}(x)$ and $y \in \mathcal{N}_k(x)$.
2. $m_x^{k} = \bigoplus_{y \in \mathcal{N}_k(x)} m_{x,y}$, where $\bigoplus$ is the intra-neighborhood aggregation function.
3. $m_x = \bigotimes_{\mathcal{N}_k \in \mathcal{N}(x)} m_x^{k}$, where $\bigotimes$ is the inter-neighborhood aggregation function.
4. $\mathbf{h}_x^{(l+1)} = \beta(\mathbf{h}_x^{(l)}, m_x)$, where $\alpha_{\mathcal{N}_k}$ and $\beta$ are differentiable functions.

Some remarks on the definition above are as follows.

First, Equation 1 describes how messages are computed between cells $x$ and $y$. The message $m_{x,y}$ is influenced by both the data $\mathbf{h}_x^{(l)}$ and $\mathbf{h}_y^{(l)}$ associated with cells $x$ and $y$, respectively. Additionally, it incorporates characteristics specific to the cells themselves, such as orientation in the case of cell complexes. This allows for a richer representation of spatial relationships compared to traditional graph-based message passing frameworks.

Second, Equation 2 defines how messages from neighboring cells are aggregated within each neighborhood. The function $\bigoplus$ aggregates these messages, allowing information to be exchanged effectively between adjacent cells within the same neighborhood.

Third, Equation 3 outlines the process of combining messages from different neighborhoods. The function $\bigotimes$ aggregates messages across the various neighborhoods, facilitating communication between cells that may not be directly connected but share common neighborhood relationships.

Fourth, Equation 4 specifies how the aggregated messages influence the state of a cell in the next layer. Here, the function $\beta$ updates the state of cell $x$ based on its current state $\mathbf{h}_x^{(l)}$ and the aggregated message $m_x$ obtained from neighboring cells.

Non-message passing topological neural networks
While the majority of TNNs follow the message passing paradigm from graph learning, several models have been suggested that do not follow this approach. For instance, Maggs et al.
leverage geometric information from embedded simplicial complexes, i.e., simplicial complexes with high-dimensional features attached to their vertices. This offers interpretability and geometric consistency without relying on message passing. Furthermore, a contrastive loss-based method has been suggested for learning simplicial representations.

Learning on topological descriptors
Motivated by the modular nature of deep neural networks, initial work in TDL drew inspiration from topological data analysis and aimed to make the resulting descriptors amenable to integration into deep-learning models. This led to work defining new layers for deep neural networks. Pioneering work by Hofer et al., for instance, introduced a layer that permitted topological descriptors such as persistence diagrams or persistence barcodes to be integrated into a deep neural network. This was achieved by means of end-to-end-trainable projection functions, permitting topological features to be used, for instance, to solve shape classification tasks. Follow-up work expanded on the theoretical properties of such descriptors and integrated them into the field of representation learning. Other such topological layers include layers based on extended persistent homology descriptors, persistence landscapes, or coordinate functions. In parallel, persistent homology also found applications in graph-learning tasks. Noteworthy examples include new algorithms for learning task-specific filtration functions for graph classification or node classification tasks.

Learning through alternative formulations
Most existing TDL techniques are rooted in homology. However, alternative mathematical approaches, such as topological Laplacians and topological Dirac operators, also provide valuable insights into the topological properties of data.
Topological Laplacians, including Hodge Laplacians on differentiable manifolds, combinatorial Laplacians for point clouds, and Khovanov Laplacians for knots and links, serve as powerful tools for extracting topological features from their respective data formats. Despite being defined in distinct contexts, these Laplacians share a common algebraic foundation: they are constructed from the (co-)boundary operator and its adjoint, and their kernels are isomorphic to homology groups. Consequently, the number of zero eigenvalues corresponds to the Betti numbers of the associated (co-)homology groups. Moreover, the nonzero eigenvalues provide richer insights into the data structure, particularly when analyzed through the perspective of spectral theory. Hodge Laplacians, combinatorial Laplacians, and Khovanov Laplacians draw upon the mathematical fields of differential geometry, graph theory, and geometric topology, respectively, to extend the classical theory of homology beyond the domain of algebraic topology. Each functions as a bridge, linking algebraic topology to its associated mathematical discipline. Persistent Hodge Laplacians were first introduced in 2019 to analyze data on differentiable manifolds with boundary. Additionally, persistent combinatorial Laplacians, also known as persistent Laplacians, were developed for point cloud data. These approaches extend classical persistent homology and have stimulated research interest, fueling advancements in both theory and applications. Persistent Laplacians outperform persistent homology in extensive protein engineering tasks and in the prediction of mutation-induced changes in protein-protein binding affinity. Persistent topological Laplacians have been constructed on various mathematical objects, including simplicial complexes, directed flag complexes, path complexes, cellular sheaves, hypergraphs, hyperdigraphs, and differentiable manifolds.
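As a small worked example of the shared algebraic foundation described above, the sketch below builds combinatorial (Hodge) Laplacians from boundary matrices for a hollow triangle and reads off Betti numbers as the multiplicities of the zero eigenvalue. The concrete complex, orientations, and numerical tolerance are choices made for the illustration.

```python
import numpy as np

# Hollow triangle: vertices {0, 1, 2}, oriented edges
# e0 = (0 -> 1), e1 = (1 -> 2), e2 = (0 -> 2), and no filled 2-cell.
# Boundary matrix B1: rows indexed by vertices, columns by edges.
B1 = np.array([[-1,  0, -1],
               [ 1, -1,  0],
               [ 0,  1,  1]], dtype=float)

# Combinatorial (Hodge) Laplacians L_k = B_k^T B_k + B_{k+1} B_{k+1}^T.
L0 = B1 @ B1.T   # vertex Laplacian (there is no boundary below dimension 0)
L1 = B1.T @ B1   # edge Laplacian (no 2-cells, so the B2 term vanishes)

def zero_eigenvalue_count(L, tol=1e-10):
    """Multiplicity of the zero eigenvalue, i.e. dim ker L."""
    return int(np.sum(np.linalg.eigvalsh(L) < tol))

b0 = zero_eigenvalue_count(L0)  # number of connected components
b1 = zero_eigenvalue_count(L1)  # number of independent loops
```

For this complex both Betti numbers equal 1: the triangle is connected, and its unfilled interior contributes one loop. Filling in the 2-cell would add a B2 term to L1 and remove its zero eigenvalue, leaving b1 = 0.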
Persistent Dirac operators have been constructed on various topological spaces, including simplicial complexes, path complexes, and hypergraphs. These new approaches extend the scope of TDL to manifold topological learning and curve data learning. Applications TDL is rapidly finding new applications across different domains, including data compression, enhancing the expressivity and predictive performance of graph neural networks, action recognition, and trajectory prediction. Topology inherently simplifies data, which implies the irreversible loss of certain information. Therefore, competitive performance from TDL mostly involves intrinsically complex data, such as those arising in the biological sciences. Perhaps some of the most compelling examples of applications in which TDL consistently demonstrates its advantages over competing methods are the victories of TDL in the D3R Grand Challenges, the discovery of SARS-CoV-2 evolution mechanisms, and the successful forecasting of the SARS-CoV-2 variants BA.2, BA.4, and BA.5 about two months in advance. See also Topological data analysis Deep learning References Deep learning Topology
Topological deep learning
[ "Physics", "Mathematics" ]
3,042
[ "Spacetime", "Topology", "Space", "Geometry" ]
76,401,057
https://en.wikipedia.org/wiki/Miassite
Miassite is a mineral composed of rhodium and sulfur, with the stoichiometric formula Rh17S15. It was named after the Miass River in the Urals. It is a superconductor, and an unconventional one. Naturally occurring miassite is too brittle for such measurements, so it is grown in the laboratory for superconductor research. Its unconventional superconductivity was discovered at Ames National Laboratory in 2024. Miassite, covellite, parkerite, and palladseite all occur in nature and are also made in labs as superconductors; miassite is the only one of these found to exhibit unconventional superconductivity. References External links http://www.webmineral.com/data/Miassite.shtml https://www.mindat.org/min-7250.html Superconductors Minerals
Miassite
[ "Chemistry", "Materials_science" ]
193
[ "Superconductivity", "Superconductors" ]
76,402,036
https://en.wikipedia.org/wiki/2024%20Commercial%20Bank%20of%20Ethiopia%20glitch%20incident
On 15 March 2024, the Commercial Bank of Ethiopia (CBE) reported a glitch that occurred between 12 a.m. and 3 a.m. Customers were able to withdraw large amounts of cash (more than 40 million dollars) not deposited in their accounts and to make unlimited ATM withdrawals. After the incident circulated on social media, CBE released five notes within less than 24 hours, explaining the systemic failure in its branch services. CBE President Abe Sano warned customers to return the money, saying "those who do not return money that is not theirs will be prosecuted". On 26 March, CBE announced that about $14 million had been recovered.

Incident
On 15 March 2024, the Commercial Bank of Ethiopia (CBE) encountered a systemic glitch between 12 a.m. and 3 a.m. local time. CBE released five notes within less than 24 hours, describing the failure as a systemic problem at the branch level. CBE President Abe Sano said that much of the money was withdrawn by students. News of the glitch spread across local universities via messaging apps and phone calls. According to BBC Amharic, students who withdrew money in western Ethiopia queued to access ATM machines until police officers approached the campus. A Dilla University student said a number of his peers withdrew money from CBE between midnight and 02:00 local time. CBE stated that the glitch stemmed from economic and political issues, regarding the withdrawal of large amounts of money from ATM machines as a systemic problem. About 490,000 transactions (both legitimate and illegitimate) were completed from around midnight to dawn, and freezing transactions took hours. During an interview with the BBC's Newsday programme on 20 March, President Abe warned customers to return the money they had withdrawn, saying "those who do not return money that is not theirs will be prosecuted".
Reactions
On 16 March, the National Bank of Ethiopia (NBE) issued a statement on the incident that reads: "Banks regularly carry out security checks and make updates on their systems to deliver speedy and secure services. The changes that result from these updates and inspection works may cause interruption on banking services." NBE claimed that the interruption of service was largely due to maintenance work. Elias Meseret, an AP journalist, commented on those who committed the financial breach, saying: "Many people, especially university students have taken money from the commercial bank of Ethiopia ATP or have transferred money online." On 26 March, the bank said it had recovered about three-quarters of the $14 million. References Cyberattacks on banking industry 2024 in Ethiopia 2024 in economic history Economic history of Ethiopia 2024 in computing March 2024 events in Ethiopia Internet in Ethiopia Software anomalies
2024 Commercial Bank of Ethiopia glitch incident
[ "Technology" ]
550
[ "Computer errors", "Technological failures", "Software anomalies" ]
76,403,370
https://en.wikipedia.org/wiki/NGC%203000
NGC 3000 is a double star located in the constellation Ursa Major. It was first observed by Bindon Stoney, an assistant to William Parsons, on January 25, 1851, and was initially catalogued as a nebula-like object. Since its discovery, NGC 3000 has been observed and studied using various telescopes. Discovery Bindon Stoney first described NGC 3000 as a "very faint, small, irregularly round, mottled but not resolved" object. However, its recorded position, precessed to RA 09 49 02.6, Dec +44 08 46, shows no object at that location. Analysis reveals that Stoney's recorded positions for objects in this region consistently carry a systematic error of approximately 2 arcminutes to the east-northeast. Applying this correction places the coordinates almost precisely on a pair of stars now identified as NGC 3000. References External links 3000 Ursa Major Astronomical objects discovered in 1851 Principal Galaxies Catalogue objects Double stars
NGC 3000
[ "Astronomy" ]
194
[ "Ursa Major", "Constellations" ]