id int64 39 79M | url stringlengths 31 227 | text stringlengths 6 334k | source stringlengths 1 150 ⌀ | categories listlengths 1 6 | token_count int64 3 71.8k | subcategories listlengths 0 30 |
|---|---|---|---|---|---|---|
47,471,627 | https://en.wikipedia.org/wiki/Penicillium%20rubrum | Penicillium rubrum is a species of fungus in the genus Penicillium which produces kojic acid, mitorubrin, mitorubrinol, rubratoxin A, rubratoxin B, rubralactone, and rubramin, and occurs in grain corn and soybeans. Penicillium rubrum is similar to the species Penicillium chrysogenum.
Further reading
References
rubrum
Fungi described in 1904
Fungus species | Penicillium rubrum | [
"Biology"
] | 97 | [
"Fungi",
"Fungus species"
] |
47,471,695 | https://en.wikipedia.org/wiki/Nojirimycin | Nojirimycin is the parent compound of a class of antibiotics and glycosidase inhibitors. Nojirimycin and its derivatives are mainly obtained from a class of Streptomyces species. Chemically, it is an iminosugar.
Derivatives
1-deoxynojirimycin or duvoglustat
1-deoxygalactonojirimycin or migalastat, a drug for the treatment of Fabry disease
References
Antibiotics
Streptomyces
Iminosugars | Nojirimycin | [
"Chemistry",
"Biology"
] | 112 | [
"Iminosugars",
"Carbohydrates",
"Biotechnology products",
"Antibiotics",
"Biocides"
] |
47,472,309 | https://en.wikipedia.org/wiki/NGC%2094 | NGC 94 (PGC 1423) is a lenticular galaxy in the constellation Andromeda. It was discovered by Guillaume Bigourdan in 1884. This object is extremely faint and small. A little above the galaxy is NGC 96. NGC 94 is about 260 million light-years away and 50,000 light-years across.
References
0094
?
01423
Discoveries by Guillaume Bigourdan
Andromeda (constellation)
Lenticular galaxies | NGC 94 | [
"Astronomy"
] | 92 | [
"Andromeda (constellation)",
"Constellations"
] |
47,474,443 | https://en.wikipedia.org/wiki/Alfonso%20Farina | Alfonso Farina (born January 25, 1948) is an Italian electronic engineer and former industry manager. He is most noted for the development of track while scan techniques for radars and, more generally, for the development of a wide range of signal processing techniques used for sensors where tracking plays an essential role. He is the author of about 1000 publications. His work has aimed at synergistic cooperation between industry and academia.
Biography
Alfonso Farina was born in Petrella Salto, a small town near Rieti, in 1948. He obtained a doctoral (laurea) degree in electronic engineering in 1973 at the University La Sapienza in Rome. In 1974 he joined Selenia, a Finmeccanica company that later became Selex ES. There he served as director of the integrated systems analysis unit and then as chief engineer of the large business systems division. More recently, he was the company's senior VP CTO and then senior advisor to the CTO. From 1979 to 1985 he was also professor ("incaricato") of radar techniques at the University of Naples.
He retired in October 2014 but currently works as a consultant.
Work
The activity of Alfonso Farina spans a wide range of topics in the area of radars and sensors. His pioneering work on track while scan, now widely used in radars, was recounted in a classic set of two books that, owing to their widespread relevance, have also been published in Russian and Chinese translations. A more recent publication of his also covers ideas and applications in adaptive radar signal processing.
He also contributed the chapter on ECCM, at the invitation of Merrill Skolnik, to the second edition of the Radar Handbook (Ch. 9) and to the third (Ch. 24).
Together with Artenio Russo, he generalised the well-known Swerling target fluctuation models, which become special cases of his formulation.
Together with Sergio Barbarossa, he introduced time-frequency distributions into the analysis of synthetic-aperture radar signals. The methods are useful, in particular, for the detection and imaging of objects moving on the Earth, observed from airborne or spaceborne synthetic aperture radars. The approach was later extended to multi-antenna systems, giving rise to space-time-frequency processing.
He is considered the "father" of Italian industrial PCL radar. From 2004 to 2014, he led the team of engineers in conceiving, designing and implementing successive generations of improved PCL radar systems, extensively tested over several years.
Together with Hernandez and Ristic, he extended the theory and calculation of the Posterior Cramér-Rao Lower Bound (PCRLB) to the realistic case of a detection probability less than one and a false-alarm probability greater than zero, with practical applications to target tracking.
Together with Luigi Chisci and Giorgio Battistelli he has developed target tracking for radar systems.
In the past decade, he has applied his signal processing expertise to the cyber security of integrated systems.
He was the organizer and general chairman of the 2008 IEEE-AESS Radar Conference held in Rome. This was the first time the conference had been held outside the US since its inception in 1974.
From 2017 to 2023 he was the Chair of the Italy Section Chapter, IEEE AESS-10.
Since 2017 (a three-year term), he has served on the Editorial Board of the IEEE Signal Processing Magazine.
He is a Visiting Professor at University College London and Cranfield University in the UK.
Recently, he gave an interview for the IEEE Aerospace and Electronic Systems Magazine, hosted by Fulvio Gini, recounting his professional achievements and more.
In October 2018 he was interviewed at Rai Storia for the "70° anniversario di Leonardo Company" ("70th anniversary of Leonardo Company").
He is active in research on quantum radar. Recently, he has been an associate editor of IEEE Aerospace and Electronic Systems Magazine for a special issue on quantum radar, published in two parts, together with Marco Frasca and Bhashyam Balaji.
Currently, he is ranked among the top 2% of scientists in the world.
He is President of the Radar & Sensors Academy of Leonardo S.p.A. Electronic Division.
He is President of the Underwater and Sensor Systems Academy of Leonardo S.p.A. Electronic Division.
Awards and honors
Farina has been an IEEE Fellow since 2000 and an International Fellow of the Royal Academy of Engineering since 2005, the latter with the citation "Distinguished for outstanding and continuous innovative in the development of radar signal and data processing techniques and application of these findings in practical systems". He received the award from the hands of Prince Philip, Duke of Edinburgh. He has been a Fellow of the IET since 1997 and a Fellow of EURASIP since 2010. Since 2020, he has been a fellow member of the European Academy of Sciences.
Since November 2020, he has been named "Académico Correspondiente de la Real Academia de Ingeniería de España".
He is on the Board of Governors of the IEEE Aerospace and Electronic Systems Society (2022-2024).
He serves among the IEEE Aerospace and Electronic Systems Standing Committee Chairs, responsible for “Member Service: HISTORY”.
He won the following awards:
Fred Nathanson Memorial Radar Award, 1987, with the motivation
For development of radar data processing techniques.
M. Barry Carlton Award by IEEE Aerospace and Electronic Systems Society in 2001, 2003 and 2013. This award recognizes the best paper published in the IEEE Transactions on Aerospace and Electronic Systems for the given year.
Honour of Maestro del Lavoro, with the decoration "Stella al Merito del Lavoro", presented to him by the President of the Italian Republic in recognition of his outstanding professional career, 2003.
First Prize Award for Innovation Technology of the Finmeccanica Group, 2004, as leader of the winning team; presented by the Italian Ministry of Education, Universities and Research.
Annual European Group Technical Achievement Award 2006 by EURASIP “for development and application of adaptive signal processing techniques in practical radar systems”.
IEEE Dennis J. Picard Medal for Radar Technologies and Applications, 2010, with the motivation
For continuous, innovative, theoretical and practical contributions to radar systems and adaptive signal processing techniques.
Co-recipient of Oscar Masi Award for the AULOS “green” radar by the Italian Association for Industrial Research (AIRI) (2012).
IET Achievement Medals, 2014, with the motivation
For outstanding contributions to radar system design, signal, data and image processing and data fusion.
IEEE Signal Processing Society Industrial Leader Award, 2017 (presented on 2018), with the motivation
For contributions to radar array processing and industrial leadership.
Honorary chair of IEEE RadarConf 2020, Florence.
2019 Christian Hülsmeyer Award from the German Institute of Navigation (DGON), with the motivation
In appreciation of his outstanding contribution to radar research and education.
2020 IEEE AESS Pioneer Award, with the motivation:
For pioneering contributions to the analysis, design, development, and experimentation of digital-based adaptive radar systems.
2023: International Member of the United States National Academy of Engineering (NAE), in recognition of distinguished contributions to engineering, “for contributions to the development and deployment of advanced radar systems and technology”.
2020-2024: Member of the scientific committee of the IEEE Italy Chapter of the Signal Processing Society.
22 March 2024: awarded the Research Doctorate Honoris Causa in 'ICT Information and Communication Technologies' by the University of Palermo, Department of Engineering (video on YouTube).
10 July 2024: gave the gala dinner speech at the ISIF/IEEE International Conference on Fusion 2024, Venice, Italy.
References
In June 2023 Farina collected the titles of his first 1000 scientific publications in a freely available file.
External links
Green Radar State of Art: theory, practice and way ahead. Plenary talk given at 2014 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP) held in Florence (Italy).
The future of radar: the evolution of a technology with a long history - We talk with Alfonso Farina, one of the fathers of modern radar.
Biography in Engineering and Technology History Wiki.
Interview with Pat Hindle, Editor of Microwave Journal.
Radar Role: From the Underground to Outer Space (ICASSP 2020, Barcelona, Spain).
1948 births
Electronics engineers
Italian engineers
Systems engineers
Fellows of the IEEE
Fellows of the Royal Academy of Engineering
Living people
Academic staff of the University of Naples Federico II | Alfonso Farina | [
"Engineering"
] | 1,721 | [
"Systems engineers",
"Systems engineering",
"Electronic engineering",
"Electronics engineers"
] |
49,229,350 | https://en.wikipedia.org/wiki/List%20of%20American%20Chemical%20Society%20national%20awards | The List of American Chemical Society national awards attempts to include the national awards, medals and prizes offered by the American Chemical Society (ACS). The ACS national awards program began in 1922 with the establishment of the Priestley Medal, the highest award offered by the ACS. As of 2016, the ACS offers 64 national awards, medals and prizes based on scientific and professional contributions in chemistry.
The complete list of current awards is:
ACS Award for Achievement in Research for the Teaching and Learning of Chemistry
ACS Award for Affordable Green Chemistry
ACS Award for Computers in Chemical and Pharmaceutical Research
ACS Award for Creative Advances in Environmental Science and Technology
ACS Award for Creative Invention
ACS Award for Creative Work in Fluorine Chemistry
ACS Award for Creative Work in Synthetic Organic Chemistry
ACS Award for Distinguished Service in the Advancement of Inorganic Chemistry
ACS Award for Encouraging Disadvantaged Students into Careers in the Chemical Sciences
ACS Award for Encouraging Women into Careers in the Chemical Sciences
ACS Award for Research at an Undergraduate Institution
ACS Award for Team Innovation
ACS Award in Analytical Chemistry
ACS Award in Applied Polymer Science
ACS Award in Chromatography
ACS Award in Colloid Chemistry
ACS Award in Industrial Chemistry
ACS Award in Inorganic Chemistry
ACS Award in Organometallic Chemistry
ACS Award in Polymer Chemistry
ACS Award in Pure Chemistry
ACS Award in Separations Science and Technology
ACS Award in Surface Chemistry
ACS Award in the Chemistry of Materials
ACS Award in Theoretical Chemistry
Award for Volunteer Service to the American Chemical Society
Roger Adams Award in Organic Chemistry
Alfred Bader Award in Bioinorganic or Bioorganic Chemistry
Earle B. Barnes Award for Leadership in Chemical Research Management
Ronald Breslow Award for Achievement in Biomimetic Chemistry
Herbert C. Brown Award for Creative Research in Synthetic Methods
Alfred Burger Award in Medicinal Chemistry
James Bryant Conant Award in High School Chemistry Teaching
Arthur C. Cope Award
Arthur C. Cope Scholar Awards (given for three distinct career levels)
Elias J. Corey Award for Outstanding Original Contribution in Organic Synthesis by a Young Investigator
F. Albert Cotton Award in Synthetic Inorganic Chemistry
Peter Debye Award in Physical Chemistry
Frank H. Field and Joe L. Franklin Award for Outstanding Achievement in Mass Spectrometry
Francis P. Garvin - John M. Olin Medal
James T. Grady - James H. Stack Award for Interpreting Chemistry for the Public
Harry Gray Award for Creative Work in Inorganic Chemistry by a Young Investigator
Ernest Guenther Award in the Chemistry of Natural Products
Katheryn C. Hach Award for Entrepreneurial Success
E. B. Hershberg Award for Important Discoveries in Medicinally Active Substances
Joel Henry Hildebrand Award in the Theoretical and Experimental Chemistry of Liquids
Ralph F. Hirschmann Award in Peptide Chemistry
Ipatieff Prize
Frederic Stanley Kipping Award in Silicon Chemistry
Irving Langmuir Award in Chemical Physics (awarded in even-numbered years by ACS and in odd-numbered years by the American Physical Society)
Josef Michl ACS Award in Photochemistry
E. V. Murphree Award in Industrial and Engineering Chemistry
Nakanishi Prize (awarded in odd-numbered years by ACS and in even-numbered years by the Chemical Society of Japan)
Nobel Laureate Signature Award for Graduate Education in Chemistry
James Flack Norris Award in Physical Organic Chemistry
George A. Olah Award in Hydrocarbon or Petroleum Chemistry
Charles Lathrop Parsons Award
George C. Pimentel Award in Chemical Education
Priestley Medal
Glenn T. Seaborg Award for Nuclear Chemistry
Gabor A. Somorjai Award for Creative Research in Catalysis
George and Christine Sosnovsky Award for Cancer Research
E. Bright Wilson Award in Spectroscopy
Ahmed Zewail Award in Ultrafast Science and Technology
References
American Chemical Society | List of American Chemical Society national awards | [
"Technology"
] | 768 | [
"Science and technology awards",
"Chemistry awards"
] |
49,230,602 | https://en.wikipedia.org/wiki/Ramaria%20myceliosa | Ramaria myceliosa is a species of coral fungus in the family Gomphaceae. Found in North America, it was originally described by Charles Horton Peck in 1904 with the name Clavaria myceliosa. The type was collected by botanist Edwin Bingham Copeland in the mountains near Stanford University in California. E.J.H. Corner transferred it to the genus Ramaria in 1950. Giachini and colleagues proposed that Ramaria myceliosa is the same species as the European Phaeoclavulina curta, but did not provide molecular evidence to support their suggested synonymy. In a recent (2014) publication on California fungi, the authors proposed the transfer of Ramaria myceliosa to the genus Phaeoclavulina, but this transfer has not been accepted by either MycoBank or Index Fungorum.
The fruiting body is usually about 3–6 cm tall and wide, yellowish to tan, with tips branching into two to four points, whitish flesh, mild odor, and bitter taste. The stalks are about 2 cm tall. The spores are yellowish.
The species is inedible.
Similar species include Ramaria abietina and Ramaria stricta.
References
Gomphaceae
Fungi described in 1904
Fungi of North America
Taxa named by Charles Horton Peck
Inedible fungi
Fungus species | Ramaria myceliosa | [
"Biology"
] | 274 | [
"Fungi",
"Fungus species"
] |
49,231,115 | https://en.wikipedia.org/wiki/Raya%20%28app%29 | Raya is a private, membership-based, exclusive social network application for iOS, founded by Daniel Gendelman and Atlas Benjelloun in 2014. Originally a dating app, Raya has since added features to promote professional networking and social discovery.
Raya’s membership is highly restrictive, limited to celebrities, high-profile industry figures and internet personalities, admitted through a long selection process or by invitation.
History
Raya was created by Daniel Gendelman and Atlas Benjelloun in 2014 and went live in February 2015. The app is only available on Apple devices.
In the spring of 2024, Raya’s parent company released a second app, Places, into beta. Places is a subscription-based discovery tool for travel and hospitality and is currently available by invitation only.
Raya and Places each offer various subscription options. Raya’s current one-month subscription is $24.99 USD. Places’ current annual subscription is $39.99 USD.
Format
Users link their profile to their Instagram account, add personal information, including their background, profession, and interests, and must create a photo montage for their profile. After creating a profile, users interact and connect with the global Raya community through features such as a map and member directory.
Access
Raya is not advertised and grows solely by word of mouth. After downloading the app, users submit an application to join the community, which may include referrals from Raya members. All applications are reviewed by Raya’s community team, which takes into account referrals and other factors and solicits input from a committee of long-time community members. Only a small percentage of applicants are admitted, while the rest are added to a waitlist for further review and consideration. As of 2023, the waitlist exceeded 1.5 million applicants.
The list of celebrities found on Raya includes Lamorne Morris, John Mayer, David Harbour, and Trevor Noah.
Privacy
Raya includes a number of features focused on its members’ privacy. If a user screenshots a profile within the app, Raya warns the user that they may be removed from the platform, and repeated screenshots lead to termination of the user’s account.
References
Online dating services of the United States
Mobile social software
2015 software
Internet properties established in 2015
IOS software | Raya (app) | [
"Technology"
] | 473 | [
"Mobile software stubs",
"Mobile technology stubs"
] |
49,233,060 | https://en.wikipedia.org/wiki/Laevens%201 | Laevens 1 is a faint globular cluster in the constellation Crater that was discovered in 2014. It is also known as Crater, the Crater cluster and PSO J174.0675-10.8774.
At a distance of it is the most distant Milky Way globular cluster yet known, located in the galactic halo surrounding the Milky Way galaxy. With an age of only 7.5 Gyr, it is likely to have been incorporated into our galaxy long after the formation of the Milky Way, probably during an interaction with the Small Magellanic Cloud.
Some analyses initially categorized it as a satellite galaxy, because of the presence of a handful of blue loop stars and a sparsely populated red clump; the existence of some of these types of stars in Laevens 1 would imply relatively recent star formation (within 400 Myr). Such recent formation would be very atypical for a globular cluster; more recent work suggests that some of these stars were erroneously assigned membership of the cluster, distorting the overall result. After removing these, the subsequent reanalysis concluded that Laevens 1 is a faint, intermediate-age globular cluster.
Laevens 1 is orbiting the galaxy at approximately the same distance as the ultrafaint dwarf galaxies Leo IV and Leo V. This hints that all three satellites may once have been closely associated before falling together into the Milky Way halo.
See also
Globular cluster
Dwarf spheroidal galaxy
References
External links
Globular clusters
Crater (constellation) | Laevens 1 | [
"Astronomy"
] | 312 | [
"Crater (constellation)",
"Constellations"
] |
49,233,467 | https://en.wikipedia.org/wiki/SensorThings%20API | SensorThings API is an Open Geospatial Consortium (OGC) standard providing an open and unified framework to interconnect IoT sensing devices, data, and applications over the Web. It is an open standard addressing the syntactic and semantic interoperability of the Internet of Things. It complements existing IoT networking protocols such as CoAP, MQTT, HTTP, and 6LoWPAN. While those networking protocols address the ability of different IoT systems to exchange information, OGC SensorThings API addresses their ability to use and understand the exchanged information. As an OGC standard, SensorThings API also allows easy integration into existing Spatial Data Infrastructures or Geographic Information Systems.
OGC SensorThings API has two parts: (1) Part I - Sensing and (2) Part II - Tasking. OGC SensorThings API Part I - Sensing was released for public comment on June 18, 2015. The OGC Technical Committee (TC) approved the start of the electronic vote on December 3, 2015, and SensorThings API Part I - Sensing passed the TC vote on February 1, 2016. The official OGC standard specification was published online on July 26, 2016. In 2019 the SensorThings API was also published as a United Nations ITU-T Technical Specification.
OGC SensorThings API Part II - Tasking Core was released for public comment on February 20, 2018, and it passed the TC vote on June 1, 2018. The official OGC standard specification for the SensorThings API Part II - Tasking Core was published online on January 8, 2019.
In order to offer a better developer experience, the SensorThings API Part II - Tasking Core Discussion Paper was published online on December 18, 2018. The Tasking Core Discussion paper provides 15 JSON examples showing how SensorThings API Part II - Tasking Core can be used.
Design
SensorThings API is designed specifically for resource-constrained IoT devices and the Web developer community. It follows REST principles, the JSON encoding, and the OASIS OData protocol and URL conventions. It also has an MQTT extension allowing users and devices to publish and subscribe to updates from devices, and it can use CoAP in addition to HTTP.
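As an illustrative sketch of these URL conventions, OData-style system query parameters such as $filter, $orderby, and $top can be assembled into a SensorThings request URL. The helper function and service URL below are hypothetical, not part of the standard:

```python
from urllib.parse import quote

def sensorthings_query(base, entity_set, **odata):
    # Assemble a SensorThings/OData query URL; keyword names are passed
    # without the leading '$' because of Python identifier rules.
    parts = [f"${k}={quote(str(v))}" for k, v in odata.items()]
    return f"{base}/{entity_set}?" + "&".join(parts)

url = sensorthings_query(
    "http://example.org/v1.0", "Observations",
    filter="result lt 0", orderby="phenomenonTime desc", top=10,
)
print(url)
# http://example.org/v1.0/Observations?$filter=result%20lt%200&$orderby=phenomenonTime%20desc&$top=10
```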
The foundation of the SensorThings API is its data model, which is based on ISO 19156 (ISO/OGC Observations and Measurements, O&M), the standard that defines a conceptual model for observations and for the features involved in sampling when making observations. In the context of SensorThings, the features are modelled as Things, Sensors (i.e., Procedures in O&M), and Features of Interest. As a result, the SensorThings API provides an interoperable Observation-focused view, which is particularly useful for reconciling the differences between heterogeneous sensing systems (e.g., in-situ sensors and remote sensors).
An IoT device or system is modelled as a Thing. A Thing has an arbitrary number of Locations (including 0 Locations) and an arbitrary number of Datastreams (including 0 Datastreams). Each Datastream observes one ObservedProperty with one Sensor and has many Observations collected by the Sensor. Each Observation observes one particular FeatureOfInterest. The O&M based model allows SensorThings to accommodate heterogeneous IoT devices and the data collected by the devices.
SensorThings API provides two main functionalities, each handled by a part: the Sensing part and the Tasking part. The Sensing part provides a standard way to manage and retrieve observations and metadata from heterogeneous IoT sensor systems; its functions are similar to the OGC Sensor Observation Service. The Tasking part provides a standard way for parameterizing - also called tasking - task-able IoT devices, such as sensors or actuators; its functions are similar to the OGC Sensor Planning Service. The Sensing part is designed based on the ISO/OGC Observations and Measurements (O&M) model, and allows IoT devices and applications to CREATE, READ, UPDATE, and DELETE (i.e., HTTP POST, GET, PATCH, and DELETE) IoT data and metadata in a SensorThings service.
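The CRUD-to-HTTP mapping above can be sketched as a tiny request planner. This is a hypothetical helper that only builds the verb/URL pair; nothing is sent over the network, and the base URL is invented:

```python
# Map SensorThings CRUD actions to the HTTP verbs named above.
VERBS = {"create": "POST", "read": "GET", "update": "PATCH", "delete": "DELETE"}

def plan_request(base, action, entity_set, entity_id=None):
    # Collection URL (e.g. .../Things) or addressed entity (e.g. .../Things(1)).
    url = f"{base}/{entity_set}"
    if entity_id is not None:
        url += f"({entity_id})"
    return VERBS[action], url

print(plan_request("http://example.org/v1.0", "update", "Things", 1))
# ('PATCH', 'http://example.org/v1.0/Things(1)')
```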
Entities (resources)
SensorThings API Part I - Sensing defines the following resources. As SensorThings is a RESTful web service, each entity can be created, read, updated, and deleted with the standard HTTP verbs (POST, GET, PATCH, and DELETE):
Thing: An object of the physical world (physical things) or the information world (virtual things) that is capable of being identified and integrated into communication networks.
Locations: Locates the Thing or the Things it is associated with.
HistoricalLocations: A set providing the current (i.e., last known) and previous locations of the Thing, with their times.
Datastream: A collection of Observations and the Observations in a Datastream measure the same ObservedProperty and are produced by the same Sensor.
ObservedProperty : Specifies the phenomenon of an Observation.
Sensor : An instrument that observes a property or phenomenon with the goal of producing an estimate of the value of the property.
Observation: Act of measuring or otherwise determining the value of a property.
FeatureOfInterest: An Observation results in a value being assigned to a phenomenon. The phenomenon is a property of a feature, the latter being the FeatureOfInterest of the Observation.
In addition to the above sensing resources, SensorThings API Part II - Tasking Core defines the following resources:
TaskingCapabilities: Specifies the task-able parameters of an actuator.
Tasks: A collection of Tasks that has been created.
Actuator : A type of transducer that converts a signal to some real-world action or phenomenon.
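To illustrate how the Sensing entities relate, the sketch below builds the JSON body for a "deep insert": a Thing created together with a Location and a Datastream, which in turn nests an ObservedProperty and a Sensor. The station, coordinates, and URLs are invented for the example; POSTing such a body to a service's Things collection would create all of the nested entities in one request:

```python
import json

# Hypothetical weather station, expressed with SensorThings Part I entities.
thing = {
    "name": "Backyard weather station",
    "description": "Illustrative Thing only",
    "Locations": [{
        "name": "Backyard",
        "description": "Garden location of the station",
        "encodingType": "application/vnd.geo+json",
        "location": {"type": "Point", "coordinates": [-114.06, 51.05]},
    }],
    "Datastreams": [{
        "name": "Air temperature",
        "description": "Temperature readings from the station",
        "observationType": "http://www.opengis.net/def/observationType/OGC-OM/2.0/OM_Measurement",
        "unitOfMeasurement": {
            "name": "degree Celsius", "symbol": "degC", "definition": "ucum:Cel",
        },
        "ObservedProperty": {
            "name": "Temperature",
            "definition": "http://example.org/properties/temperature",
            "description": "Air temperature near the ground",
        },
        "Sensor": {
            "name": "DHT22",
            "description": "Low-cost temperature/humidity sensor",
            "encodingType": "application/pdf",
            "metadata": "http://example.org/dht22-datasheet.pdf",
        },
    }],
}

body = json.dumps(thing)  # this string would be the POST body
print(json.loads(body)["Datastreams"][0]["Sensor"]["name"])  # DHT22
```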
Example payload
http://example.org/v1.0/Datastreams(id)/Observations
{
"@iot.count": 2,
"value": [
{
"@iot.id": 1,
"@iot.selfLink": "http://example.org/v1.0/Observations(1)",
"phenomenonTime": "2016-01-01T05:00:00.000Z",
"result": "-9",
"resultTime": null,
"Datastream@iot.navigationLink": "http://example.org/v1.0/Observations(1)/Datastream",
"FeatureOfInterest@iot.navigationLink": "http://example.org/v1.0/Observations(1)/FeatureOfInterest"
},
{
"@iot.id": 2,
"@iot.selfLink": "http://example.org/v1.0/Observations(2)",
"phenomenonTime": "2016-01-01T04:00:00.000Z",
"result": "-10",
"resultTime": null,
"Datastream@iot.navigationLink": "http://example.org/v1.0/Observations(2)/Datastream",
"FeatureOfInterest@iot.navigationLink": "http://example.org/v1.0/Observations(2)/FeatureOfInterest"
}
]
}
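A client needs only a few lines to walk such a collection response; the sketch below parses the payload shown above (abbreviated to the fields used) and extracts the numeric results:

```python
import json

# Abbreviated form of the Observations collection payload above.
payload = json.loads("""
{
  "@iot.count": 2,
  "value": [
    {"@iot.id": 1, "phenomenonTime": "2016-01-01T05:00:00.000Z", "result": "-9"},
    {"@iot.id": 2, "phenomenonTime": "2016-01-01T04:00:00.000Z", "result": "-10"}
  ]
}
""")

# "@iot.count" reports how many Observations matched; "value" holds the page.
results = [float(obs["result"]) for obs in payload["value"]]
print(payload["@iot.count"], results)  # 2 [-9.0, -10.0]
```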
Data array extensions
In order to reduce the data size transmitted over the network, the SensorThings API data array extension allows users to request multiple Observation entities formatted as a dataArray. When a SensorThings service returns a dataArray response, the service groups Observation entities by Datastream or MultiDatastream, meaning that Observation entities linked to the same Datastream or MultiDatastream are aggregated into one dataArray.
Example request for data array
http://example.org/v1.0/Observations?$resultFormat=dataArray
Example data array response
{
"value": [
{
"Datastream@iot.navigationLink": "http://example.org/v1.0/Datastreams(1)",
"components": [
"id",
"phenomenonTime",
"resultTime",
"result"
],
"dataArray@iot.count": 3,
"dataArray": [
[
1,
"2005-08-05T12:21:13Z",
"2005-08-05T12:21:13Z",
20
],
[
2,
"2005-08-05T12:22:08Z",
"2005-08-05T12:21:13Z",
30
],
[
3,
"2005-08-05T12:22:54Z",
"2005-08-05T12:21:13Z",
0
]
]
}
]
}
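Because each group carries a components list naming the columns, a dataArray response can be rehydrated into ordinary per-Observation dictionaries by zipping the column names onto each row. A sketch using the group shown above:

```python
# One dataArray group, as returned in the example response above.
group = {
    "components": ["id", "phenomenonTime", "resultTime", "result"],
    "dataArray@iot.count": 3,
    "dataArray": [
        [1, "2005-08-05T12:21:13Z", "2005-08-05T12:21:13Z", 20],
        [2, "2005-08-05T12:22:08Z", "2005-08-05T12:21:13Z", 30],
        [3, "2005-08-05T12:22:54Z", "2005-08-05T12:21:13Z", 0],
    ],
}

# Pair each row with the column names to recover full Observation records.
observations = [dict(zip(group["components"], row)) for row in group["dataArray"]]
print(observations[0]["result"], observations[2]["id"])  # 20 3
```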
Evaluation
Interoperability between OpenIoT and SensorThings
"We believe that the implementation of the SensorThing API will be a major improvement for the OpenIoT middleware. It will give OpenIoT a standardized and truly easy to use interface to sensor values.This will complement the rich semantic reasoning services with a simple resource based interface. And the consistent data model mapping gives both a common context to describe the internet of things".
Efficiency of SensorThings API
A comprehensive evaluation of the SensorThings API is published in Jazayeri, Mohammad Ali, Steve HL Liang, and Chih-Yuan Huang. "Implementation and Evaluation of Four Interoperable Open Standards for the Internet of Things." Sensors 15.9 (2015): 24343-24373.
Quotes
SensorThings API was demonstrated in a pilot project sponsored by the Department of Homeland Security Science and Technology Directorate. Dr. Reginald Brothers, the Under Secretary for Science and Technology at Homeland Security, was "impressed with the ‘state of the practical’ where these various industry sensors can be integrated today using open standards that remove the stovepipe limitations of one-off technologies."
OGC SensorThings API standard specification
OGC® SensorThings API Part 1: Sensing
Internal reference number of this OGC® document: 15-078r6
Submission Date: 2015-06-18
Publication Date: 2016-07-26
Editor: Steve Liang (University of Calgary/SensorUp)
Co-Editors: Chih-Yuan Huang (National Central University) / Tania Khalafbeigi (University of Calgary/SensorUp)
OGC® SensorThings API Part 2: Tasking Core
Internal reference number of this OGC® document: 17-079r1
Submission Date: 2017-10-13
Publication Date: 2019-01-08
Editor: Steve Liang (University of Calgary/SensorUp)
Co-Editors: Tania Khalafbeigi (University of Calgary/SensorUp)
Developer API Documentation
Part I - Sensing
Part II - Tasking
SensorThings API Sandbox
SensorThings Compliance Test Suite
Free and open source SensorThings API implementations
Whiskers
In March 2016 SensorUp and the GeoSensorWeb Lab at the University of Calgary submitted an open source software project proposal to the Eclipse Foundation, and it was approved. The project is called Whiskers. Whiskers is an OGC SensorThings API framework. It will have a JavaScript client and a lightweight server for IoT gateway devices (e.g., Raspberry Pi or BeagleBone). Whiskers aims to foster a healthy and open IoT ecosystem, as opposed to one dominated by proprietary information silos, and to make SensorThings development easy for the large and growing world of IoT developers.
GOST
GOST is an open source implementation of the SensorThings API in the Go programming language, initiated by Geodan. It contains easily deployable server software and a JavaScript client. Currently (June 2016) it is in development, but a first version can already be downloaded and deployed. The software can be installed on any device supporting Docker or Go (e.g. Windows, Linux, Mac OS and Raspberry Pi). By default, sensor data is stored in a PostgreSQL database.
FROST
FROST-Server is an Open Source server implementation of the OGC SensorThings API. FROST-Server implements the entire specification, including all extensions. It is written in Java and can run in Tomcat or Wildfly and is available as a Docker image. Among its many features is the ability to use String or UUID based entity IDs.
FROST-Client is a Java client library for communicating with a SensorThings API compatible server.
SensorThings HcDT Charting SDK
SensorThings HcDT is a JavaScript charting library for the OGC SensorThings API. It is based on the open source Highcharts library and DataTables. It is a front-end charting library that enables developers to connect to datastreams from any OGC SensorThings API service and display the sensor observations in charts, tables, or dashboard widgets for web applications.
Mozilla STA
Mozilla developed a node implementation of the OGC SensorThings API.
52°North STA
52N SensorThingsAPI is an open source implementation of the OGC SensorThings API. Its core features are the interoperability with the 52N SOS implementing the OGC Sensor Observation Service, customizable database mappings and several convenience extensions. It can be deployed as a Docker container, inside an Apache Tomcat or as a standalone application.
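Because all of the implementations above expose the same REST interface, a client can address any of them with identical entity paths and OData-style query options ($top, $filter, $orderby, and so on, as defined in the SensorThings API 1.0 entity model). The Python sketch below builds such a request URL; the host name is a placeholder, not a real service.

```python
from urllib.parse import urlencode

def sensorthings_url(base, entity_path, **options):
    """Build a SensorThings API request URL.

    `base` is the service root of any SensorThings 1.0 server
    (FROST, GOST, 52N STA, ...); keyword arguments become the
    OData-style '$'-prefixed query options defined by the spec.
    """
    query = {f"${k}": v for k, v in options.items()}
    return f"{base}/{entity_path}" + (f"?{urlencode(query)}" if query else "")

# Hypothetical service root -- substitute any SensorThings 1.0 endpoint.
base = "https://example.org/FROST-Server/v1.0"

# Latest 3 observations of Datastream 42, newest first.
url = sensorthings_url(
    base,
    "Datastreams(42)/Observations",
    top=3,
    orderby="phenomenonTime desc",
)
print(url)
# https://example.org/FROST-Server/v1.0/Datastreams(42)/Observations?%24top=3&%24orderby=phenomenonTime+desc
```

Note that `urlencode` percent-encodes the `$` prefix; servers accept both the encoded and the literal form.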
Example applications
Department of Homeland Security S&T Shaken Fury Operational Experiment
In 2019 the Shaken Fury operational experiment for the DHS Next Generation First Responder program simulated a scenario of an earthquake causing partial structural collapse and a HAZMAT leak at a stadium. OGC SensorThings API was used as the standard interface that interconnected multiple sensors and offered IoT-enabled real-time situational awareness.
Smart Citizens for Smart Cities YYC - crowd-sourced air quality sensing
On October 8, 2016, a group of volunteers (smart citizens) in Calgary gathered together, assembled their own sensors, installed them at their houses, and formed a crowd-sourced air quality sensor network. All data are publicly available via OGC SensorThings API. This citizen-sensing effort increased the number of Calgary's air quality sensors from 3 to more than 50.
Smart Emission Project in Nijmegen, NL
Smart Emission is an air quality monitoring project in the city of Nijmegen, NL. The project deployed multiple air quality sensors throughout the city. Data are published with open standards, including OGC SensorThings API. Part of the project is an open source ETL engine that loads the project's sensor data into an OGC SensorThings API service.
SensorThings Dashboard
This dashboard provides easy-to-use client-side visualisation of Internet-of-Things sensor data from OGC SensorThings API compatible servers. Various types of widgets can be arranged and configured on the dashboard. It is a web application and can be embedded into any website. A live demo is available on the project page.
https://github.com/SensorThings-Dashboard/SensorThings-Dashboard
GOST Dashboard v2
GOST Dashboard v2 is an open source library of custom HTML elements (web components) supporting SensorThings API. These elements facilitate the development of HTML applications integrating functionality and data from SensorThings API compatible services. The components are developed with Predix-UI and Polymer.
AFarCloud project OGC Connector
The connector enables interoperability between OGC-compliant data sources and the semantic middleware developed in the Horizon 2020 ECSEL project AFarCloud. It is a modular Java application with Docker-based deployment, implemented according to the 15-078r6 OGC SensorThings API 1.0 Implementation Standard.
Comparison between OGC SensorThings API and OGC Sensor Observation Services
SensorThings API provides functions similar to the OGC Sensor Observation Service, an OGC specification approved in 2005. Both standard specifications are under the OGC Sensor Web Enablement standard suite. The following table summarizes the technical difference between the two specifications.
External links
SensorThings API - GitHub
Presentation: Sensor up your connected applications with OGC SensorThings API (FOSS4G)
Chapter: Mapping the OGC SensorThings API onto the OpenIoT Middleware
Tutorial in YouTube: Getting Started Series #1, SensorThings Tutorial Series #2 and SensorThings Tutorial Series #3
Application: SensorThings Playground allows interested people and organizations to experiment with a SensorThings system via a friendly, step-by-step process.
References
Geographic information systems
Internet of things
Open Geospatial Consortium
Open standards
Web services | SensorThings API | [
"Technology"
] | 3,514 | [
"Information systems",
"Geographic information systems"
] |
49,234,192 | https://en.wikipedia.org/wiki/Frank%20Garvan | Francis G. Garvan (born March 9, 1955) is an Australian-born mathematician who specializes in number theory and combinatorics. He holds the position Professor of Mathematics at the University of Florida. He received his Ph.D. from Pennsylvania State University (January, 1986) with George E. Andrews as his thesis advisor. Garvan's thesis, Generalizations of Dyson's rank, concerned the rank of a partition and formed the groundwork for several of his later papers.
Garvan is well known for his work in the fields of q-series and integer partitions. Most famously, in 1988, Garvan and Andrews discovered a definition of the crank of a partition. The crank of a partition is an elusive combinatorial statistic, similar to the rank of a partition, which provides a key to the study of Ramanujan congruences in partition theory. The crank was first hypothesized by Freeman Dyson in a paper on ranks for the journal Eureka in 1944; Andrews and Garvan's definition was the first to satisfy the properties Dyson had conjectured for it.
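Both statistics are elementary to compute from a partition's parts. The sketch below is an illustrative Python implementation (not code from Garvan's papers), following Dyson's definition of the rank (largest part minus number of parts) and the Andrews–Garvan definition of the crank.

```python
def rank(partition):
    """Dyson's rank: largest part minus number of parts."""
    return max(partition) - len(partition)

def crank(partition):
    """Andrews-Garvan crank of a partition (a list of positive integers).

    If the partition contains no 1s, the crank is its largest part.
    Otherwise, with w = number of 1s and m = number of parts larger
    than w, the crank is m - w.
    """
    w = partition.count(1)
    if w == 0:
        return max(partition)
    m = sum(1 for part in partition if part > w)
    return m - w

# The five partitions of 4; their cranks cover every residue class
# mod 5, reflecting Ramanujan's congruence p(5n + 4) == 0 (mod 5).
partitions_of_4 = [[4], [3, 1], [2, 2], [2, 1, 1], [1, 1, 1, 1]]
print([crank(p) for p in partitions_of_4])   # [4, 0, 2, -2, -4]
```

Dividing the partitions of 5n + 4 into five equal classes by crank mod 5 is exactly the combinatorial explanation Dyson asked for in 1944.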
References
External links
http://people.clas.ufl.edu/fgarvan/
1955 births
Living people
20th-century American mathematicians
21st-century American mathematicians
Australian mathematicians
Number theorists
University of Florida faculty | Frank Garvan | [
"Mathematics"
] | 283 | [
"Number theorists",
"Number theory"
] |
49,235,470 | https://en.wikipedia.org/wiki/Leccinellum%20lepidum | Leccinellum lepidum is a species of bolete in the family Boletaceae. Originally described as Boletus lepidus in 1965, the fungus has gone through controversial taxonomic treatments over the years and was subsequently transferred to genus Krombholziella in 1985, to genus Leccinum in 1990, and to genus Leccinellum in 2003. It is the sister-species of Leccinellum corsicum, with which it had been erroneously synonymised by some authors in the past.
Like other species of Boletaceae, it has tubes and pores instead of gills in its hymenial (fertile) surface and produces large, fleshy fruit bodies up to 20 cm across. Fruit bodies have the tendency to stain orange, violaceous grey and eventually blackish brown when handled or when the flesh is exposed to the air.
Native to southern Europe, L. lepidum is abundantly present throughout the Mediterranean, growing in mycorrhizal symbiosis with various species of oak (Quercus), particularly evergreen members of the "Ilex" group. Despite its southern distribution, the fungus is notable for its late fruiting and tolerance to low temperatures, and is often the only bolete fruiting during the cold winter months.
It is an edible mushroom, though not as highly regarded as sought-after boletes of the genus Boletus.
Taxonomy and phylogeny
Originally described as Boletus lepidus by H. Essette in 1965, Leccinellum lepidum has been controversially treated by various authors, who placed it in different genera or at times synonymised it with other taxa. In 1985, the species was invalidly recombined into genus Leccinum by mycologists Marcel Bon and Marco Contu, but later in the same year Italian mycologist Carlo Alessio transferred it to Krombholziella, a genus that later became a synonym of Leccinum. Bon recombined it as a variety of Leccinum crocipodium in 1989, only to recombine it again with M. Contu as Leccinum lepidum, in 1990. Heinz Engel and colleagues, on the other hand, rejected all previous names and considered the taxon to be a synonym of Leccinum corsicum, a closely related species associated with Cistaceae shrubs.
In 2003, the species was transferred to the newly segregated genus Leccinellum by mycologists Andreas Bresinsky and Manfred Binder, together with other yellow-pored taxa formerly placed in Leccinum. Subsequent phylogenetic and chemotaxonomical analyses by Binder & Besl and Den Bakker & Noordeloos questioned the segregation of Leccinellum, but suggested that L. lepidum, L. corsicum and L. crocipodium are probably distinct species. However, the three taxa were initially represented by very few sequences and the inclusive "corsicum/lepidum" clade received high support in preliminary phylogenetic analyses. In a 2014 paper, Bertolini controversially abandoned Leccinellum and placed L. lepidum in synonymy with L. corsicum once again, only for the genus to be reinstated in the same year by Wu and colleagues, in a major contribution delineating 22 generic clades in the family Boletaceae. The confusion was finally clarified in 2019, when several collections from Corsica, Croatia, Cyprus, France and Greece were analysed in an elaborate phylogenetic, biogeographical and ecological treatment by M. Loizides and colleagues. In this study, Leccinellum was phylogenetically validated, while L. lepidum, L. corsicum and L. crocipodium formed well-supported lineages within the genus, and were confirmed as distinct species.
Etymology
The Latin epithet lèpidus, meaning "pleasant" or "charming", likely refers to the appearance or culinary qualities of the fungus.
Description
Morphology
Leccinellum lepidum produces large, fleshy fruit bodies. The cap is at first hemispherical, gradually becoming convex or convex-flat as the fungus expands, reaching a diameter of . The cap cuticle is smooth to somewhat lobed and often with a "hammered" appearance, moderately to strongly viscid in wet weather, ranging in colour from ochraceous yellow to ochraceous brown, chestnut-brown, or in very old specimens blackish brown.
The tubes are more or less free from the stem, long and pale yellow to ochraceous yellow. The pores are small and rounded, concolorous with the tubes, slowly staining rusty-brown and finally greyish brown when handled or with age.
The stem is long by wide, usually stout and short-ventricose at first, but gradually becoming longer and clavate to cylindrical, ranging in colour from ochraceous yellow to pale yellow, straw-coloured, or dirty white. Its surface is covered in tiny pustules (scabrosities), concolorous with the stem surface at first, but often staining rusty-brown or grey-brown with age and sometimes coalescing to form an incomplete pseudoreticulum (false net).
The flesh is thick and dull yellow to straw-coloured. When cut or exposed to the air it very slowly discolours orange or violaceous-grey in parts, and after a few hours darkens to greyish-brown or grey-black. The smell is weakly fungoid in young specimens, becoming stronger in old specimens, while the taste is mild to somewhat astringent. The spores are tobacco-brown in mass.
Under the microscope, the spores appear narrowly ellipsoid to fusiform (spindle-shaped) and measure 13.5–22 × 5–6 μm. The cap cuticle is a trichodermium of septate cylindrical hyphae, often finely incrusted.
Mycorrhiza
The ectomycorrhiza formed by L. lepidum with the holm oak has been described in detail. It is characterised by a Hartig net devoid of haustoria, a plectenchymatous outer mantle of warty hyphae arranged in a ring-like formation, highly differentiated rhizomorphs which are rounded in cross-section and connected to the mantle, and a negative reaction to FeSO4, KOH or guaiac.
Similar species
Leccinellum corsicum is closely related to L. lepidum, and the two taxa had been previously placed in synonymy by some authors. However, L. corsicum is a smaller species rarely exceeding in diameter, is exclusively associated with rockroses (Cistus species), and has the tendency to stain more reddish when its flesh is exposed to the air.
Leccinellum crocipodium is also similar, but typically fruits earlier in the season in association with deciduous oaks. It produces more slender and elongated fruit bodies, with a cap cuticle that has the tendency to crack extensively at maturity.
Ecology, phenology and distribution
The species is widespread in the Mediterranean region, where it forms ectomycorrhizal associations with various species of oak. It is most commonly associated with evergreen members of the "Ilex" group, particularly the holm oak (Quercus ilex), but also the golden oak (Quercus alnifolia), the kermes oak (Q. coccifera) and the Palestine oak (Q. calliprinos). In the western parts of the Mediterranean basin, it is frequently found under the cork oak (Q. suber), while collections under the semi-deciduous Portuguese oak (Q. faginea) have also been reported. It is indifferent to the substrate and occurs abundantly on both calcareous and acidic soil.
Although a southern species, the fungus is notable for its late fruiting season and tolerance to low temperatures. In a 10-year study from the island of Cyprus, L. lepidum was the most frequently recorded bolete, accounting for over half (61%) of all Boletaceae collections found during the winter months (December–February).
Edibility
Leccinellum lepidum is edible, though opinions on its culinary value vary. It is generally regarded as gastronomically inferior to other popular boletes (such as Boletus edulis or B. aereus), while the tendency of its fruit bodies to stain black makes the mushroom unappealing to some people.
References
External links
lepidum
Fungi described in 1965
Fungi of Europe
Fungus species | Leccinellum lepidum | [
"Biology"
] | 1,786 | [
"Fungi",
"Fungus species"
] |
49,235,577 | https://en.wikipedia.org/wiki/2MASS%20J2126%E2%80%938140 | 2MASS J21265040−8140293, also known as 2MASS J2126−8140, is an exoplanet orbiting the red dwarf TYC 9486-927-1, 111.4 light-years away from Earth. Its estimated mass, age (10-45 million years), spectral type (L3), and Teff (1800 K) are similar to the well-studied planet β Pictoris b. With an estimated distance of around 1 trillion kilometres from the host star, this is one of the largest solar systems ever found.
See also
COCONUTS-2b
Gliese 900
References
J21265040−8140293
Exoplanets detected by direct imaging
Giant planets
Exoplanets discovered in 2009
Octans | 2MASS J2126–8140 | [
"Astronomy"
] | 169 | [
"Octans",
"Constellations"
] |
49,236,353 | https://en.wikipedia.org/wiki/Energy%20Company%20Obligation | The Energy Company Obligation (ECO) is a British Government programme. It is designed to offset emissions created by energy company power stations. The first obligation period ran from January 2013 to 31 March 2015. The second obligation period, known as ECO2, ran from 1 April 2015 to 31 March 2017. The third obligation period, known as ECO3, ran from 3 December 2018 until 31 March 2022. The fourth iteration, ECO4, commenced on 1 April 2022 and will run until 31 March 2026.
The Government obligates the larger energy suppliers to help lower-income households improve their energy efficiency.
ECO replaced two previous schemes, the Carbon Emissions Reduction Target (CERT) and the Community Energy Saving Programme (CESP). It was announced that the programme would be replaced in 2017 by a less extensive version.
The programme focused on heating, in particular improving insulation.
Ofgem has been appointed the scheme administrator on behalf of the Department for Energy Security & Net Zero.
How does ECO work?
The ECO scheme works by placing an obligation on large and medium energy suppliers in England, Scotland and Wales to provide energy-saving measures for households deemed to be living in fuel poverty. Each supplier's share of the obligation is allocated based on its overall share of the domestic gas and electricity market.
The range of measures available through the scheme includes heating upgrades, solar panels, and wall and roof insulation. The provision of these measures is designed to help vulnerable families reduce their energy bills. The scheme is also seen as a way of helping the government reach its net-zero target by 2050.
ECO3 target reached
Ofgem's ECO3 final determination report provides details on the overall performance of the scheme and conclusions regarding energy suppliers' achievement against their obligations. The overall target for all participant suppliers was an estimated lifetime bill saving of £8.253 billion. The ECO3 final report confirms that this target was exceeded, with total estimated lifetime bill savings of £8.457 billion achieved.
The other highlights of the findings were as follows:
"All but one active supplier successfully met their HHCRO obligation and sub-obligation lifetime bill saving targets.
1.03 million energy saving measures were installed over the course of ECO3. This included:
Broken down or energy inefficient boilers being replaced in 251,741 households with energy efficient condensing boilers or low carbon heating alternatives
Cavity wall insulation installed in 152,938 households
Underfloor insulation installed in 133,173 households
Loft insulation installed in 88,588 households
It is estimated that measures installed since the first ECO scheme was introduced in 2013 will provide lifetime carbon savings of around 58.2 MtCO2e. This is equivalent to the amount of carbon absorbed by 264 million mature trees over 10 years."
ECO4
The latest iteration of the Energy Company Obligation (ECO4) came into force on 27 July 2022, covering measures installed from 1 April 2022, and will run until 31 March 2026. ECO4 focuses on improving the least energy efficient properties and targets homes with an energy rating between D and G. It also aims to provide a more complete retrofit of properties to ensure maximum carbon emission savings. A minimum project scoring methodology is in place to ensure a multi-measure, whole-house approach to each property. This is designed to encourage the installation of a variety of measures per household, including insulation, solar panels and renewable heating systems.
For homeowners looking to take advantage of the ECO4 scheme, organisations like UK Energy Management (UKEM) can assist in securing funding and arranging the installation of energy-efficient measures. These improvements are government backed, but supplied by energy companies, and may include loft insulation, solar panel installations, and energy-efficient boilers, aimed at reducing energy consumption and carbon emissions.
The eligibility criteria for ECO4 no longer include the disability benefits that qualified under ECO3. ECO4 focuses solely on households that receive income-based benefits, some tax credits and pension credits. This change was introduced to ensure that the scheme targets those households most in need of energy efficiency support, particularly those at risk of fuel poverty. However, there have been concerns that the removal of disability benefits from the eligibility criteria may leave some vulnerable households unsupported.
ECO4 qualifying benefits:
Child tax credit (CTC)
Child benefit
Housing Benefit
Jobseeker's Allowance (JSA)
Employment and Support Allowance (ESA)
Income Support (IS)
Pension Credit Guarantee Credit
Pension Credit Saving Credit
Universal Credit (UC)
Warm Home Discount Scheme Rebate
Working Tax Credit (WTC)
Local authorities can sign declarations for eligible households that apply through Flexible Energy under the programme, but the works are carried out by private companies, with funding from energy suppliers. Householders are recommended to check that installers are registered on the TrustMark website.
According to Ofgem's statistics, as of 7 May 2024 a total of 100,708 Energy Company Obligation 4 projects had been submitted. This highlights the scale of the programme in improving energy efficiency in homes across the UK. The scheme aims to provide long-term energy savings while contributing to the UK's carbon reduction targets.
The statistics on energy supplier performance at 7 May 2024 can be viewed on the Energy Saving Genie website.
References
External links
ECO4 delivery guidance for suppliers from OFGEM
Government programs
Emissions reduction
Energy in the United Kingdom
Climate change policy in the United Kingdom | Energy Company Obligation | [
"Chemistry"
] | 1,085 | [
"Greenhouse gases",
"Emissions reduction"
] |
49,236,889 | https://en.wikipedia.org/wiki/Plinian%20Core | Plinian Core is a set of vocabulary terms that can be used to describe different aspects of biological species information. Under "biological species Information" all kinds of properties or traits related to taxa—biological and non-biological—are included. Thus, for instance, terms pertaining descriptions, legal aspects, conservation, management, demographics, nomenclature, or related resources are incorporated.
Description
The Plinian Core is aimed at facilitating the exchange of information about species and higher taxa.
What is in scope?
Species level catalogs of any kind of biological objects or data.
Terminology associated with biological collection data.
Striving for compatibility with other biodiversity-related standards.
Facilitating the addition of components and attributes of biological data.
What is not in scope?
Data interchange protocols.
Non-biodiversity-related data.
Occurrence level data.
This standard is named after Pliny the Elder, a very influential figure in the study of the biological species.
Plinian Core design requirements include ease of use, self-containment, support for data integration from multiple databases, and the ability to handle different levels of granularity. In its current version, core terms can be grouped as follows:
Metadata
Base Elements
Record Metadata
Nomenclature and Classification
Taxonomic description
Natural history
Invasive species
Habitat and Distribution
Demography and Threats
Uses, Management and Conservation
associatedParty, MeasurementOrFact, References, AncillaryData
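Since the abstract model is a "menu" from which projects choose only the groups they need, a concrete species record populates a subset of the groups listed above. The Python sketch below is purely illustrative: the group names come from the list above, but the nested field names and values are hypothetical placeholders, not normative Plinian Core terms.

```python
# Illustrative species record organised by Plinian Core term groups.
# Field names inside each group are hypothetical, not the standard's
# normative terms.
record = {
    "Record Metadata": {"language": "en", "created": "2016-02-01"},
    "Nomenclature and Classification": {
        "scientificName": "Lynx pardinus",
        "kingdom": "Animalia",
    },
    "Taxonomic description": {"briefDescription": "Medium-sized felid ..."},
    "Habitat and Distribution": {"countries": ["ES", "PT"]},
    "Uses, Management and Conservation": {"conservationStatus": "threatened"},
}

# An application profile is a subset of the abstract model: a project
# keeps only the term groups it implements.
profile_groups = {"Record Metadata", "Nomenclature and Classification"}
subset = {g: v for g, v in record.items() if g in profile_groups}
print(sorted(subset))
# ['Nomenclature and Classification', 'Record Metadata']
```

Filtering the full record down to a profile's groups mirrors how the application profiles described below restrict the abstract model.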
Background
Plinian Core started as a collaborative project between Instituto Nacional de Biodiversidad and GBIF Spain in 2005. A series of iterations in which elements were defined and implemented in different projects resulted in a "Plinian Core Flat" version [now deprecated].
In 2012, a new development effort was launched to overcome its limitations, driven by new formal requirements, additional input, and a desire to better support the standard and its documentation, as well as to align it with the processes of TDWG, the world reference body for biodiversity information standards.
A new version, Plinian Core v3.x.x, was defined. It provides more flexibility to fully represent the information of a species in a variety of scenarios. New elements dealing with aspects such as IPR, related resources and references were introduced, and elements already included were better defined and documented.
Partners in this new phase of Plinian Core's development included the University of Granada (UG, Spain), the Alexander von Humboldt Institute (IAvH, Colombia), the National Commission for the Knowledge and Use of Biodiversity (Conabio, Mexico) and the University of São Paulo (USP, Brazil).
A "Plinian Core Task Group" within TDWG "Interest Group on species Information" was constituted and currently working on its development.
Levels of the standard
Plinian Core is presented in two levels: the abstract model and the application profiles.
The abstract model (AM), comprising the abstract model schema (XSD) and the terms' URIs, is the normative part. It is all-comprehensive and allows for different levels of granularity in describing species properties. The AM should be taken as a "menu" from which to choose the terms and level of detail needed in any specific project.
The subsets of the abstract model intended to be implemented in specific projects are the "application profiles" (APs). Besides containing part of the elements of the AM, APs can impose additional specifications on the included elements, such as controlled vocabularies. Some examples of APs in use follow:
Application profile CONABIO
Application profile INBIO
Application profile GBIF.ES
Application profile Banco de Datos de la Naturaleza.Spain
Application profile SIB-COLOMBIA
Relation to other standards
Plinian Core incorporates a number of elements already defined by other standards. The following table summarizes these standards and the elements used in Plinian Core:
External links
Main page
An Implementation of Plinian Core as GBIF's IPT Extensions
Plinian Core Terms Quick Reference Guide
Plinian Core Abstract Model (xsd). Current version
Biodiversity Information Standards (TDWG)
Sistema de información de la naturaleza de euskadi. Aplicación del estandar Plinian Core
Estándar Plinian Core para la gestión integrada de la información sobre especies. Ministerio de Agricultura, Alimentación y Medio Ambiente de España
Modelo conceptual de la Base Nacional de Datos de Vegetación. Ministerio del Ambiente, Ecuador
References
Knowledge representation
Interoperability
Metadata standards | Plinian Core | [
"Engineering"
] | 912 | [
"Telecommunications engineering",
"Interoperability"
] |
49,237,382 | https://en.wikipedia.org/wiki/Stellar%20halo | A stellar halo is the component of a galaxy's galactic halo that contains stars. The stellar halo extends far outside a galaxy's brightest regions and typically contains its oldest and most metal-poor stars.
Observation history
Early studies, investigating the shape of the stellar halo of the Milky Way, found some evidence that it may vary with increasing distance from the galaxy. These studies found halos with spherically shaped outer regions and flatter inner regions. Large surveys in the 21st century such as the Sloan Digital Sky Survey have allowed the shape and distribution of the stellar halo to be investigated in much more detail; this data has been used to postulate a triaxial or oblate halo. More recent studies have found the halo to be flattened with a broken power law radius dependence; evidence for triaxiality is unclear.
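A broken power law of the kind fitted in these surveys takes the form ν(r) ∝ r^(−α_in) inside a break radius and r^(−α_out) outside it, matched continuously at the break. The sketch below is illustrative only: the break radius and slopes are assumed round numbers, not the fitted Milky Way values.

```python
def broken_power_law(r, r_break=25.0, alpha_in=2.5, alpha_out=4.0, nu0=1.0):
    """Stellar number density vs. galactocentric radius r (kpc).

    Falls as r**-alpha_in inside the break radius and as the steeper
    r**-alpha_out outside it, continuous at r = r_break. All parameter
    values here are illustrative assumptions.
    """
    if r <= r_break:
        return nu0 * (r / r_break) ** -alpha_in
    return nu0 * (r / r_break) ** -alpha_out

# The profile is continuous at the break but falls off faster beyond it.
inner, at_break, outer = (broken_power_law(r) for r in (10.0, 25.0, 50.0))
print(at_break, outer / at_break)   # 1.0 0.0625
```

Doubling the radius beyond the break cuts the density by 2^4 = 16 under these assumed slopes, which is why halo star counts thin out so sharply at large radii.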
Because of their faintness, observations of stellar halos in distant galaxies have required very long exposure times, the stacking of data from numerous galaxies to obtain averaged properties, or observing only the resolved stellar populations. Individual resolved stars in stellar halos can only be measured in the Milky Way and Andromeda. The most distant stellar halos detected are at a redshift of 1.
Structure and properties
In the Lambda-CDM model of the universe, galaxies grow by mergers. Such mergers are the cause of substructure observed in the stellar halo of galaxies; streams of stars from disrupted satellite galaxies are detectable through their coherence in space or velocity; a number of these streams are observable around the Milky Way. As a result of the buildup from an assortment of satellite galaxies, variations in properties such as metallicity are present across stellar populations in halos.
Astrophysical simulations of galaxies have predicted that stellar halos should have two components: an inner region dominated by stars which formed within the galaxy, and an outer region primarily composed of stars accreted through merger events. Predictions for these components include different structure and rotation directions. Observational evidence for this dual halo in the Milky Way has been claimed but contested.
Milky Way
Studies of the Milky Way galaxy have found that approximately % of its total stellar mass is contained within the stellar halo, and that it extends to over 100 kiloparsecs from the galactic centre.
See also
Dark matter halo
Galactic corona
References
Galactic astronomy
Observational cosmology | Stellar halo | [
"Astronomy"
] | 476 | [
"Galactic astronomy",
"Astronomical sub-disciplines"
] |
49,237,509 | https://en.wikipedia.org/wiki/Garmin%20BaseCamp | Garmin BaseCamp is a map viewing / GIS software package offered free for download by Garmin, primarily intended for use with their GPS navigation devices. BaseCamp serves as a replacement to the now unsupported Garmin MapSource.
Features
View maps and satellite imagery and transfer them to the GPS device.
Plan trips by entering routes and waypoints and transferring them to the device.
Geotag photos and export them to e.g. Picasa.
Mac
The Mac version has not been updated for ARM CPUs; when launched on Apple silicon (M1/M2) Macs, it runs under Rosetta 2.
References
GIS software
Windows software
MacOS software
Freeware | Garmin BaseCamp | [
"Technology"
] | 135 | [
"Computing stubs",
"Software stubs"
] |
49,237,898 | https://en.wikipedia.org/wiki/Indoor%20Environmental%20Quality%20Global%20Alliance | The Indoor Environmental Quality Global Alliance (IEQ-GA) was initiated in 2014 aiming to improve the actual, delivered indoor environmental quality in buildings through coordination, education, outreach and advocacy. The alliance works to supply information, guidelines and knowledge on the indoor environmental quality (IEQ) in buildings and workplaces, and to provide occupants in buildings and workplaces with an acceptable indoor environmental quality (indoor air quality (IAQ), thermal conditions, visual quality, and acoustical quality) and help promote implementation in practice of knowledge from research on the field.
The group has already begun work to collect and critique IEQ standards and is organising and presenting programmes at the conferences of member organisations and others.
The Alliance was launched on June 29, 2014 during ASHRAE's 2014 Annual Conference in Seattle by the signing of a memorandum of understanding between AIHA, AIVC, ASHRAE, A&WMA, IAQA, and REHVA. The Alliance was formed by an ad hoc committee appointed by ASHRAE 2013–14 President Bill Bahnfleth to explore ways in which industry groups could work together to address all aspects of indoor environmental quality and health.
IEQ-GA was incorporated as a non-profit organization in Belgium in 2019 with ACGIH, AiCARR, AIHA, AIVC, ASHRAE, ISHRAE, and REHVA as founding members of the corporation. The incorporation ceremony took place during the 40th Annual Conference of AIVC on October 15-16, 2019.
IEQ-GA members
The IEQ-GA has the following full members:
AiCARR (Associazione Italiana Condizionemento dell’Aria, Riscaldamento e Refrigerazione),
the American Industrial Hygiene Association (AIHA),
the Air Infiltration and Ventilation Centre (AIVC),
the Acoustical Society of America (ASA),
the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE),
the Federation of Ibero-American Air Conditioning and Refrigeration Associations (FAIAR),
the Federation and Association of the Interior Environment throughout Spain and Andorra (FEDECAI),
the Institute of Inspection Cleaning and Restoration Certification (IICRC),
the Indian Society of Heating, Refrigerating and Air Conditioning Engineers (ISHRAE), and
the Federation of European Heating, Ventilation and Air Conditioning Associations (REHVA)
The IEQ-GA has the following affiliate members:
The Romanian Chamber of Energy Auditors (OAER)
References
External links
IEQ-GA website
Air pollution
Heating, ventilation, and air conditioning
Occupational safety and health
Lighting
Building engineering organizations
2014 establishments in the United States
Indoor air pollution | Indoor Environmental Quality Global Alliance | [
"Engineering"
] | 545 | [
"Building engineering",
"Building engineering organizations"
] |
49,238,421 | https://en.wikipedia.org/wiki/Accord.NET | Accord.NET is a framework for scientific computing in .NET. The source code of the project is available under the terms of the Gnu Lesser Public License, version 2.1.
The framework comprises a set of libraries that are available in source code as well as via executable installers and NuGet packages. The main areas covered include numerical linear algebra, numerical optimization, statistics, machine learning, artificial neural networks, signal and image processing, and support libraries (such as graph plotting and visualization). The project was originally created to extend the capabilities of the AForge.NET Framework, but has since incorporated AForge.NET inside itself. Newer releases have united both frameworks under the Accord.NET name.
The Accord.NET Framework has been featured in multiple books, such as Mastering .NET Machine Learning and F# for Machine Learning Applications from Packt Publishing, was featured at QCon San Francisco, and currently accumulates more than 1,500 forks on GitHub.
Multiple scientific publications have been published with the use of the framework.
See also
List of numerical libraries for .NET framework
ML.NET
References
External links
Official web site
Project home on GitHub
Accord.NET packages at NuGet
Aforge.NET site on projects using the framework, mentioning Accord.NET as extension of the framework.
.NET software
Computer libraries
Computer vision software | Accord.NET | [
"Technology"
] | 275 | [
"IT infrastructure",
"Computer libraries"
] |
49,238,958 | https://en.wikipedia.org/wiki/ULT%20freezer | An ultra low temperature (ULT) freezer is a refrigerator that stores contents at extremely low temperatures. An ultra low temperature freezer is commonly referred to as a "minus 80 freezer" or a "negative 80 freezer", referring to the most common temperature standard. ULT freezers come in upright and chest freezer formats.
Application
In contrast to short-term sample storage using standard refrigerators or freezers, many molecular biology or life science laboratories need long-term cryopreservation (including "cold chain" and/or "colder chain" infrastructures) for biological samples like DNA, RNA, proteins, cell extracts, or reagents. To reduce the risk of sample damage, these types of samples need extremely low temperatures. Mammalian cells are often stored in dewars containing liquid nitrogen. Cryogenic chest freezers can achieve even lower temperatures and may include a liquid nitrogen backup.
Biological samples in ULT freezers are often stored in polymer tubes and microtubes, generally inside storage boxes that are commonly made of cardboard, polymer plastics or other materials. Microtubes are placed in storage boxes containing a grid of dividers that typically permit 64, 81, or 100 tubes to be stored. Standard ULT freezers can store approximately 350 to 450 microtube boxes.
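The capacity figures above reduce to simple multiplication; a quick sketch (the box counts and per-box grid sizes are just the ranges quoted above, not the specification of any particular model):

```python
# Rough capacity estimate for a standard ULT freezer, using the figures
# quoted above: 350-450 microtube boxes, each holding 64, 81, or 100 tubes.
def tube_capacity(boxes: int, tubes_per_box: int) -> int:
    """Total number of microtubes the freezer can hold."""
    return boxes * tubes_per_box

low = tube_capacity(350, 64)    # smallest quoted configuration
high = tube_capacity(450, 100)  # largest quoted configuration
print(f"approximate capacity: {low:,} to {high:,} tubes")
```

With the quoted ranges, total capacity spans roughly 22,400 to 45,000 tubes.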
ULT freezers are widely used in fish and meat preservation. The tuna fishing industry requires the use of ULT freezers.
ULT freezers are commonly fitted with alarm systems that will remotely alert designated parties in the case of a freezer failure.
Pull down time
The pull down time is defined as the time necessary to cool the ULT freezer from ambient temperature down to the selected set temperature. The time strongly depends on the type of insulation, the efficiency of the compressor system, as well as the installed metal shelves within the freezer. At the start of the twenty-first century, ULT freezers were able to cool down within 3 to 5 hours. Warm-up time is typically 1/8 °C per minute.
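The quoted warm-up rate of 1/8 °C per minute gives a rough way to estimate how long a failed unit takes to drift to a given temperature; a minimal sketch, assuming a −80 °C set point and an illustrative −60 °C alarm threshold (both of which are assumptions, not figures from the text):

```python
# Time for a ULT freezer interior to warm from its set point to a given
# temperature, using the warm-up rate quoted above (1/8 degC per minute).
# The -80 degC set point and -60 degC threshold are illustrative assumptions.
WARM_UP_RATE = 1.0 / 8.0  # degC per minute

def minutes_to_reach(start_c: float, target_c: float) -> float:
    """Minutes for the interior to warm from start_c up to target_c."""
    return (target_c - start_c) / WARM_UP_RATE

print(minutes_to_reach(-80.0, -60.0), "minutes")  # 20 degC of drift
```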
Energy consumption
Due to the low temperature, ULT freezers consume high amounts of electric energy and hence are expensive to operate. In 2010, Stanford University had more than 2,000 ULT freezers, which used an estimated 40 billion BTUs of energy and cost the university $5.6 million annually. Newer ULT freezers consume less energy. Nonetheless, a comprehensive report published in 2015 by the Center for Energy Efficient Laboratories (funded by Pacific Gas & Electric, Southern California Edison, and San Diego Gas & Electric utility companies as part of their Emerging Technologies program) found that laboratories in California consumed an estimated 800 GWh/year, with ULT freezers being the greatest contributor to that total.
At least as early as 2018, some scientists suggested that laboratories set freezers to –70 °C instead of –80 °C to conserve energy and decrease wear on the freezer's compressor.
Depending on the volume of the freezer, the opening frequency of the users, as well as the number of samples, the energy consumption starts at around 11 kWh/day and can be higher. The US government calculates 20 kWh/day. A study performed at the University of Edinburgh showed that the New Brunswick U570 HEF model consumed slightly under 10 kWh/day. Without any data, the University of Michigan claimed that "older model" ULT devices could consume "up to 30 kWh/day". A sales pitch written in 2023 quoted "16-22 KWh of electricity per day".
To reduce the energy consumption, the insulation should be as efficient as possible. Additional inner doors reduce the loss of temperature when opening the main door. Icing within the ULT freezer should be reduced to a minimum. Modern ULT freezers employ variable speed drives for both the compressors and fans. This has reduced energy consumption a further 30% to typically 8.5 kWh/day.
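The consumption figures above translate directly into annual energy use and cost; a small sketch using the 20 kWh/day US-government figure and the 8.5 kWh/day figure for modern variable-speed units, with an assumed electricity price (the $0.15/kWh rate is a hypothetical, not from the text):

```python
# Annual energy use and cost for a ULT freezer, using the daily figures
# quoted above (20 kWh/day government estimate; 8.5 kWh/day for a modern
# variable-speed unit). The electricity price is an illustrative assumption.
PRICE_PER_KWH = 0.15  # USD per kWh, assumed

def annual_kwh(kwh_per_day: float) -> float:
    """Energy consumed over a 365-day year."""
    return kwh_per_day * 365

def annual_cost(kwh_per_day: float) -> float:
    """Yearly electricity cost at the assumed price."""
    return annual_kwh(kwh_per_day) * PRICE_PER_KWH

old, new = annual_cost(20.0), annual_cost(8.5)
print(f"old: ${old:.0f}/yr, new: ${new:.0f}/yr, saving: ${old - new:.0f}/yr")
```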
Refrigeration cycle
ULT freezers that employ the cascade refrigeration (CR) system use up to 20 times the energy footprint of household fridges, and used to refrigerate with greenhouse gas fluids (typically hydrofluorocarbon R-508B). Modern ULT freezers employ HC (i.e., hydrocarbon) gas mixtures: typically, ethane and propane. This technology was developed in the mid-1990s, and improved efficiency by up to 30% over the conventional CFC- or HFC-gassed freezers. Alternatively, ULT freezers may use the Stirling cycle in reverse (a Stirling cooler) for refrigeration.
See also
Dry ice
References
Cooling technology
Preservation methods | ULT freezer | [
"Chemistry"
] | 947 | [
"Cryopreservation",
"Cryobiology"
] |
49,239,358 | https://en.wikipedia.org/wiki/Kirklington%20Hall%20Research%20Station | Kirklington Hall Research Station was a geophysical research institute of BP in Kirklington, Nottinghamshire. During the 1950s it was the main research site of BP.
Background
Cricketer John Boddam-Whetham was born at the site in 1843. Sir Albert Bennett, 1st Baronet, Liberal MP for Mansfield from 1922 to 1923, lived there from 1920. The Bennett baronetcy was created in 1929. Lady Evelyn Maude Robinson was the owner from around 1930; she was the wife of Sir John Robinson of Worksop, who died aged 74 on Saturday 2 December 1944.
The previous owner died aged 73 on Friday 14 December 1945, leaving £138,365 in her will. In June 1945 it was put up for auction, with 631 acres, 15 bedrooms, 6 bathrooms, and 11 servants' rooms. It was sold for £24,500 in Derby in July 1945.
For two years Nottinghamshire County Council looked into buying the property for a residential further education college, but it was a big investment, and the council chose another site in July 1947.
History
BP
As part of the East Midlands Oil Province, oil was found in eastern Nottinghamshire. It was also known as the BP Research Centre or the Geophysical Centre, part of BP's Exploration Division.
The site was acquired in July 1949 due to its proximity to Eakring. The local church, together with Hockerton, held garden parties at the site in the summer.
The research centre was established in 1950. Its first employee was Jack Birks, later managing director of BP. From 1950 it was the main geophysical research site of BP, until BP sold the site in 1957 for £12,000. Research moved to Sunbury-on-Thames, in Surrey, in 1957. Sunbury Research Centre had been built around the same time as the Kirklington site, in the early 1950s.
Private property
It was put up for sale in November 1957. In 1958 there was the possibility of the site being a teacher training college.
From 1958 it was a private school, which had been formed in Southwell in 1945.
It was put up for sale in 1987, with a guide price of £850,000.
Kirklington Hall today is a private school.
Structure
The former site is situated north of the A617.
Function
It conducted geophysical research for exploration for BP. This part of BP is now known as BP Exploration. Work would be conducted on core samples and with seismic methods.
See also
British Geological Survey, also in Nottinghamshire
Sunbury Research Centre, where most of BP's research takes place in the UK today.
:Category:Petroleum geology
:Category:Seismology measurement
References
British Petroleum and Global Oil 1950-1975: The Challenge of Nationalism, James Bamberg, page 33
External links
Our Nottinghamshire
1950 establishments in England
1957 disestablishments in England
BP buildings and structures
Buildings and structures in Nottinghamshire
Energy research institutes
Engineering research institutes
Earth science research institutes
Petroleum industry in the United Kingdom
Petroleum organizations
Research institutes established in 1950
Research institutes in England
Science and technology in Nottinghamshire
Research stations | Kirklington Hall Research Station | [
"Chemistry",
"Engineering"
] | 615 | [
"Engineering research institutes",
"Petroleum",
"Petroleum organizations",
"Energy research institutes",
"Energy organizations"
] |
49,240,826 | https://en.wikipedia.org/wiki/Zinc%20finger%20protein%20557 | Zinc finger protein 557 is a protein that in humans is encoded by the ZNF557 gene.
References
Further reading
Proteins | Zinc finger protein 557 | [
"Chemistry"
] | 28 | [
"Biomolecules by chemical classification",
"Proteins",
"Molecular biology"
] |
49,241,168 | https://en.wikipedia.org/wiki/Ophiocordyceps%20formicarum | Ophiocordyceps formicarum is an entomopathogenic fungus belonging to the order Hypocreales (Ascomycota) in the family Ophiocordycipitaceae. The fungus was first described by mycologist George S. Kobayashi in 1939 as a species of Cordyceps. Originally found in Japan growing on an adult Hercules ant (Camponotus herculeanus var. obscuripes), it was reported from Guizhou, China, in 2003. It was transferred to the new genus Ophiocordyceps in 2007 when the family Cordycipitaceae was reorganized. A technique has been developed to grow the fungus in an agar growth medium supplemented with yeast extract, inosine, and glucose.
References
External links
Fungi described in 1939
Insect diseases
Animal fungal diseases
Ophiocordycipitaceae
Fungus species | Ophiocordyceps formicarum | [
"Biology"
] | 180 | [
"Fungi",
"Fungus species"
] |
49,241,310 | https://en.wikipedia.org/wiki/Anion%20exchanger%20family | The anion exchanger family (TC# 2.A.31, also named bicarbonate transporter family) is a member of the large APC superfamily of secondary carriers. Members of the AE family are generally responsible for the transport of anions across cellular barriers, although their functions may vary. All of them exchange bicarbonate. Characterized protein members of the AE family are found in plants, animals, insects and yeast. Uncharacterized AE homologues may be present in bacteria (e.g., in Enterococcus faecium, 372 aas; gi 22992757; 29% identity in 90 residues). Animal AE proteins consist of homodimeric complexes of integral membrane proteins that vary in size from about 900 amino acyl residues to about 1250 residues. Their N-terminal hydrophilic domains may interact with cytoskeletal proteins and therefore play a cell structural role. Some of the currently characterized members of the AE family can be found in the Transporter Classification Database.
Family overview
Bicarbonate (HCO3−) transport mechanisms are the principal regulators of pH in animal cells. Such transport also plays a vital role in acid-base movements in the stomach, pancreas, intestine, kidney, reproductive organs and the central nervous system. Functional studies have suggested different HCO3− transport modes.
Anion exchanger proteins exchange HCO3− for Cl− in a reversible, electroneutral manner.
Na+/HCO3− co-transport proteins mediate the coupled movement of Na+ and HCO3− across plasma membranes, often in an electrogenic manner.
Sequence analysis of the two families of HCO3− transporters that have been cloned to date (the anion exchangers and Na+/HCO3− co-transporters) reveals that they are homologous. This is not entirely unexpected, given that they both transport HCO3− and are inhibited by a class of pharmacological agents called disulphonic stilbenes. They share around ~25-30% sequence identity, which is distributed along their entire sequence length, and have similar predicted membrane topologies, suggesting they have ~10 transmembrane (TM) domains.
A conserved domain is found at the C terminus of many bicarbonate transport proteins. It is also found in some plant proteins responsible for boron transport. In these proteins it covers almost the entire length of the sequence.
The Band 3 anion exchange proteins that exchange bicarbonate are the most abundant polypeptide in the red blood cell membrane, comprising 25% of the total membrane protein. The cytoplasmic domain of band 3 functions primarily as an anchoring site for other membrane-associated proteins. Included among the protein ligands of this domain are ankyrin, protein 4.2, protein 4.1, glyceraldehyde-3-phosphate dehydrogenase (GAPDH), phosphofructokinase, aldolase, hemoglobin, hemichromes, and the protein tyrosine kinase (p72syk).
Anion exchangers in humans
In humans, anion exchangers fall under solute carrier family 4 (SLC4), which is composed of 10 paralogous members (SLC4A1-5; SLC4A7-11). Nine encode proteins that transport HCO3−. Functionally, eight of these proteins fall into two major groups: three Cl−/HCO3− exchangers (AE1-3) and five Na+-coupled HCO3− transporters (NBCe1, NBCe2, NBCn1, NBCn2, NDCBE). Two of the Na+-coupled transporters (NBCe1, NBCe2) are electrogenic; the other three Na+-coupled HCO3− transporters and all three AEs are electroneutral. Two others (AE4, SLC4A9 and BTR1, SLC4A11) are not characterized. Most, though not all, are inhibited by 4,4'-diisothiocyanatostilbene-2,2'-disulfonate (DIDS). SLC4 proteins play roles in acid-base homeostasis, transport of H+ or HCO3− by epithelia (e.g. absorption of HCO3− in the renal proximal tubule, secretion of HCO3− in the pancreatic duct), as well as the regulation of cell volume and intracellular pH.
Based on their hydropathy plots all SLC4 proteins are hypothesized to share a similar topology in the cell membrane. They have relatively long cytoplasmic N-terminal domains composed of a few hundred to several hundred residues, followed by 10-14 transmembrane (TM) domains, and end with relatively short cytoplasmic C-terminal domains composed of ~30 to ~90 residues. Although the C-terminal domain comprises a small percentage of the size of the protein, this domain in some cases, has (i) binding motifs that may be important for protein-protein interactions (e.g., AE1, AE2, and NBCn1), (ii) is important for trafficking to the cell membrane (e.g., AE1 and NBCe1), and (iii) may provide sites for regulation of transporter function via protein kinase A phosphorylation (e.g., NBCe1).
The SLC4 family comprises the following proteins.
SLC4A1
SLC4A2
SLC4A3
SLC4A4
SLC4A5
SLC4A7
SLC4A8
SLC4A9
SLC4A10
SLC4A11
Anion exchanger 1
The human anion exchanger 1 (AE1 or Band 3) binds carbonic anhydrase II (CAII) forming a "transport metabolon" as CAII binding activates AE1 transport activity about 10 fold. AE1 is also activated by interaction with glycophorin, which also functions to target it to the plasma membrane. The membrane-embedded C-terminal domains may each span the membrane 13-16 times. According to the model of Zhu et al. (2003), AE1 in humans spans the membrane 16 times, 13 times as α-helix, and three times (TMSs 10, 11 and 14) possibly as β-strands. AE1 preferentially catalyzes anion exchange (antiport) reactions. Specific point mutations in human anion exchanger 1 (AE1) convert this electroneutral anion exchanger into a monovalent cation conductance. The same transport site within the AE1 spanning domain is involved in both anion exchange and cation transport.
AE1 in human red blood cells has been shown to transport a variety of inorganic and organic anions. Divalent anions may be symported with H+. Additionally, it catalyzes flipping of several anionic amphipathic molecules such as sodium dodecyl sulfate (SDS) and phosphatidic acid from one monolayer of the phospholipid bilayer to the other monolayer. The rate of flipping is sufficiently rapid to suggest that this AE1-catalyzed process is physiologically important in red blood cells and possibly in other animal tissues as well. Anionic phospholipids and fatty acids are likely to be natural substrates. However, the mere presence of TMSs enhances the rates of lipid flip-flop.
Structure
The crystal structure of AE1 (CTD) at 3.5 angstroms has been determined. The structure is locked in an outward-facing open conformation by an inhibitor. Comparing this structure with a substrate-bound structure of the uracil transporter UraA in an inward-facing conformation allowed identification of the likely anion-binding position in the AE1 (CTD), and led to proposal of a possible transport mechanism that could explain why selected mutations lead to disease. The 3-D structure confirmed that the AE family is a member of the APC superfamily.
There are several crystal structures available for the AE1 protein in RCSB (links are also available in TCDB).
Other members
Renal Na+:HCO3− cotransporters have been found to be members of the AE family. They catalyze the reabsorption of HCO3− in the renal proximal tubule in an electrogenic process that is inhibited by typical stilbene inhibitors of AE such as DIDS and SITS. They are also found in many other body tissues. At least two genes encode these symporters in any one mammal. A 10 TMS model has been presented, but this model conflicts with the 14 TMS model proposed for AE1. The transmembrane topology of the human pancreatic electrogenic Na+:HCO3− transporter, NBC1, has been studied. A topology with N- and C-termini in the cytoplasm has been suggested. An extracellular loop determines the stoichiometry of Na+-HCO3− cotransporters.
In addition to the Na+-independent anion exchangers (AE1-3) and the Na+:HCO3− cotransporters (NBCs) (which may be either electroneutral or electrogenic), a Na+-driven HCO3−/Cl− exchanger (NCBE) has been sequenced and characterized. It transports Na+ + HCO3− preferentially in the inward direction and H+ + Cl− in the outward direction. This NCBE is widespread in mammalian tissues where it plays an important role in cytoplasmic alkalinization. For example, in pancreatic β-cells, it mediates a glucose-dependent rise in pH related to insulin secretion.
Animal cells in tissue culture expressing the gene-encoding the ABC-type chloride channel protein CFTR (TC# 3.A.1.202.1) in the plasma membrane have been reported to exhibit cyclic AMP-dependent stimulation of AE activity. Regulation was independent of the Cl− conductance function of CFTR, and mutations in the nucleotide-binding domain #2 of CFTR altered regulation independently of their effects on chloride channel activity. These observations may explain impaired HCO3− secretion in cystic fibrosis patients.
Anion exchangers in plants and fungi
Plants and yeast have anion transporters that in both the pericycle cells of plants and the plasma membrane of yeast cells export borate or boric acid (pKa = 9.2). In A. thaliana, boron is exported from pericycle cells into the root stellar apoplasm against a concentration gradient for uptake into the shoots. In S. cerevisiae, export is also against a concentration gradient. The yeast transporter recognizes HCO, I−, Br−, NO and Cl−, which may be substrates. Tolerance to boron toxicity in cereals is known to be associated with reduced tissue accumulation of boron. Expression of genes from roots of boron-tolerant wheat and barley with high similarity to efflux transporters from Arabidopsis and rice lowered boron concentrations due to an efflux mechanism. The mechanism of energy coupling is not known, nor is it known if borate or boric acid is the substrate. Several possibilities (uniport, anion:anion exchange and anion:cation exchange) can account for the data.
Transport reactions
The physiologically relevant transport reaction catalyzed by anion exchangers of the AE family is:
Cl− (in) + HCO3− (out) ⇌ Cl− (out) + HCO3− (in).
That for the Na+:HCO3− cotransporters is:
Na+ (out) + nHCO3− (out) → Na+ (in) + nHCO3− (in).
That for the Na+/HCO3−:H+/Cl− exchanger is:
Na+ (out) + HCO3− (out) + H+ (in) + Cl− (in) ⇌ Na+ (in) + HCO3− (in) + H+ (out) + Cl− (out).
That for the boron efflux protein of plants and yeast is:
Boron (in) → Boron (out)
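The electroneutral versus electrogenic distinction among the transport reactions above can be checked by summing the charge carried inward per transport cycle; a minimal sketch (ion charges are standard values, and the way the reactions are encoded below is my own illustration):

```python
# Net charge moved inward per transport cycle, for the reactions listed
# above. A net charge of zero means the transporter is electroneutral; a
# nonzero net charge means it is electrogenic.
CHARGE = {"Na+": +1, "H+": +1, "Cl-": -1, "HCO3-": -1}

def net_inward_charge(moves):
    """moves: list of (ion, direction), direction +1 = inward, -1 = outward."""
    return sum(CHARGE[ion] * direction for ion, direction in moves)

# Cl-/HCO3- exchanger (AE): Cl- in, HCO3- out -> electroneutral
assert net_inward_charge([("Cl-", +1), ("HCO3-", -1)]) == 0

# Electrogenic Na+:HCO3- cotransport with n = 2 (1 Na+ and 2 HCO3- inward)
print(net_inward_charge([("Na+", +1), ("HCO3-", +1), ("HCO3-", +1)]))  # -1
```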
See also
Solute carrier family
Transporter Classification Database
References
Protein families
Transmembrane transporters
Solute carrier family | Anion exchanger family | [
"Biology"
] | 2,627 | [
"Protein families",
"Protein classification"
] |
49,241,355 | https://en.wikipedia.org/wiki/Gastruloid | Gastruloids are three dimensional aggregates of embryonic stem cells (ESCs) that, when cultured in specific conditions, exhibit an organization resembling that of an embryo. They develop with three orthogonal axes and contain the primordial cells for various tissues derived from the three germ layers, without the presence of extraembryonic tissues. Notably, they do not possess forebrain, midbrain, and hindbrain structures. Gastruloids serve as a valuable model system (an embryonic organoid) for studying mammalian development, including human development, and its associated diseases.
Background
The Gastruloid model system draws its origins from work by Marikawa et al. In that study, small numbers of mouse P19 embryonal carcinoma (EC) cells were aggregated as embryoid bodies (EBs) and used to model and investigate the processes involved in anteroposterior polarity and the formation of a primitive streak region. In this work, the EBs were able to organise themselves into structures with polarised gene expression, axial elongation/organisation and up-regulation of posterior mesodermal markers. This was in stark contrast to work using EBs from mouse ESCs, which had shown some polarisation of gene expression in a small number of cases but no further development of the multicellular system.
Following this study, the Martinez Arias laboratory in the Department of Genetics at the University of Cambridge demonstrated how aggregates of mouse embryonic stem cells (ESCs) were able to generate structures that exhibited collective behaviours with striking similarity to those during early development, such as symmetry-breaking (in terms of gene expression), axial elongation and germ-layer specification. To quote from the original paper: "Altogether, these observations further emphasize the similarity between the processes that we have uncovered here and the events in the embryo. The movements are related to those of cells in gastrulating embryos and for this reason we term these aggregates ‘gastruloids’". As noted by the authors of this protocol, a crucial difference between this culture method and previous work with mouse EBs was the use of small numbers of cells, which may be important for generating the correct length scale for patterning, and the use of culture conditions derived from directed differentiation of ESCs in adherent culture.
Brachyury (T/Bra), a gene which marks the primitive streak and the site of gastrulation, is up-regulated in the Gastruloids following a pulse of the Wnt/β-Catenin agonist CHIR99021 (Chi; other factors have also been tested) and becomes regionalised to the elongating tip of the Gastruloid. From or near the region expressing T/Bra, cells expressing the mesodermal marker Tbx6 are extruded, similar to cells in the gastrulating embryo; it is for this reason that these structures are called Gastruloids.
Further studies revealed that the events that specify T/Bra expression in gastruloids mimic those in the embryo. After seven days gastruloids exhibit an organization very similar to a midgestation embryo with spatially organized primordia for all mesodermal (axial, paraxial, intermediate, cardiac, cranial and hematopoietic) and endodermal derivatives as well as the spinal cord. They also implement Hox gene expression with the same spatiotemporal coordinates as the embryo. Gastruloids lack brain as well as extraembryonic tissues, but characterisation of the cellular complexity of gastruloids at the level of single-cell and spatial transcriptomics reveals that they contain representatives of the three germ layers including neural crest, primordial germ cells and placodal primordia.
A feature of gastruloids is a disconnect between the transcriptional programs and the morphogenesis. However, changes in the culture conditions can elicit morphogenesis; most significantly, gastruloids have been shown to form somites and early cardiac structures. In addition, interactions between gastruloids and extraembryonic tissues promote an anterior, brain-like polarised tissue.
Gastruloids have recently been obtained from human ESCs, which gives developmental biologists the ability to study early human development without needing human embryos. Importantly though, the human gastruloid model is not able to form a human embryo, meaning that it is non-intact, non-viable and not equivalent to an in vivo human embryo.
The term Gastruloid has been expanded to include self-organised human embryonic stem cell arrangements on micropatterned surfaces (micropatterns) that mimic early patterning events in development; these arrangements should be referred to as 2D gastruloids.
References
Stem cells
Tissue engineering
Animal developmental biology | Gastruloid | [
"Chemistry",
"Engineering",
"Biology"
] | 1,015 | [
"Biological engineering",
"Cloning",
"Chemical engineering",
"Tissue engineering",
"Medical technology"
] |
49,242,470 | https://en.wikipedia.org/wiki/Sulfate%20permease | The sulfate permease (SulP) family (TC# 2.A.53) is a member of the large APC superfamily of secondary carriers. The SulP family is a large and ubiquitous family of proteins derived from archaea, bacteria, fungi, plants and animals. Many organisms including Bacillus subtilis, Synechocystis sp, Saccharomyces cerevisiae, Arabidopsis thaliana and Caenorhabditis elegans possess multiple SulP family paralogues. Many of these proteins are functionally characterized, and most are inorganic anion uptake transporters or anion:anion exchange transporters. Some transport their substrate(s) with high affinities, while others transport it or them with relatively low affinities. Others may catalyze SO42−:HCO3− exchange, or more generally, anion:anion antiport. For example, the mouse homologue, SLC26A6 (TC# 2.A.53.2.7), can transport sulfate, formate, oxalate, chloride and bicarbonate, exchanging any one of these anions for another. A cyanobacterial homologue can transport nitrate. Some members can function as channels. SLC26A3 (2.A.53.2.3) and SLC26A6 (2.A.53.2.7 and 2.A.53.2.8) can function as carriers or channels, depending on the transported anion. In these porters, mutating a glutamate that is also involved in transport in the ClC family (TC# 2.A.49) (E357A in SLC26A6) created a channel out of the carrier. It also changed the stoichiometry from 2Cl−/HCO3− to 1Cl−/HCO3−.
Structure
All SulPs are homodimers, in which the two subunits do not function independently. The dimeric structure probably represents the native state of SulP transporters. A low-resolution structure of a bacterial SulP transporter revealed a dimeric stoichiometry, stabilized via its transmembrane core and mobile intracellular domains. The cytoplasmic STAS domain projects away from the transmembrane domain and is not involved in dimerization. The structure suggests that large movements of the STAS domain underlie the conformational changes that occur during transport.
The bacterial proteins vary in size from 434 residues to 573 residues with only a few exceptions. The eukaryotic proteins vary in size from 611 residues to 893 residues with a few exceptions. Thus, the eukaryotic proteins are usually larger than the prokaryotic homologues. These proteins exhibit 10-13 putative transmembrane α-helical spanners (TMSs) depending on the protein.
Crystal structures
Several crystal structures are available for members of the SulP family through RCSB:
Homologues
One of the distant SulP homologues has been shown to be a bicarbonate:Na+ symporter (TC# 2.A.53.5.1). Bioinformatic work has identified additional homologues with fused domains. Some of these fused proteins have SulP homologues fused to carbonic anhydrase homologues (TC# 2.A.53.8.1). These are also presumed to be bicarbonate uptake permeases. Another has SulP fused to rhodanese, a thiosulfate:cyanide sulfurtransferase (TC# 2.A.53.9.1). This SulP homologue is presumably a sulfate transporter.
Homologues currently characterized in the SulP family can be found in the Transporter Classification Database.
SLC26A3 in mice
One member of the SulP family, SLC26A3, has been knocked out in mice. Apical membrane chloride/base exchange activity was sharply reduced, and the luminal content was more acidic in SLC26A3-null mouse colon. The epithelial cells in the colon displayed unique adaptive regulation of ion transporters; NHE3 expression was enhanced in the proximal and distal colon, whereas colonic H+/K+-ATPase and the epithelial sodium channel showed massive up-regulation in the distal colon. Plasma aldosterone was increased in SLC26A3-null mice. Thus, SLC26A3 may be the major apical chloride/base exchanger and is essential for the absorption of chloride in the colon. In addition, SLC26A3 regulates colonic crypt proliferation. Deletion of SLC26A3 results in chloride-rich diarrhea and is associated with compensatory adaptive up-regulation of ion-absorbing transporters.
MOT1
MOT1 from Arabidopsis thaliana (TC# 2.A.53.11.1, 456aas; 8-10 TMSs), a distant homologue of the SulP and BenE (2.A.46) families, is expressed in both roots and shoots, and is localized to plasma membranes and intracellular vesicles. MOT1 is required for efficient uptake and translocation of molybdate as well as for normal growth under conditions of limited molybdate supply. Kinetic studies in yeast revealed that the Km value of MOT1 for molybdate is approximately 20 nM. Mo uptake by MOT1 in yeast is not affected by the presence of sulfate. MOT1 did not complement a sulfate transporter-deficient yeast mutant strain. MOT1 is thus probably specific for molybdate. The high affinity of MOT1 allows plants to obtain scarce Mo from soil when its concentration is about 10 nM.
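Assuming simple Michaelis-Menten kinetics (an illustrative model, not stated in the text), the quoted Km of approximately 20 nM implies the fraction of maximal transport rate MOT1 would reach at the roughly 10 nM soil molybdate concentration mentioned above:

```python
# Fractional transport rate v/Vmax for a carrier obeying simple
# Michaelis-Menten kinetics: v/Vmax = S / (Km + S). Figures quoted above
# for MOT1: Km ~ 20 nM, soil molybdate ~ 10 nM. The kinetic model itself
# is an illustrative assumption.
def fractional_rate(substrate_nm: float, km_nm: float) -> float:
    """Fraction of Vmax reached at a given substrate concentration."""
    return substrate_nm / (km_nm + substrate_nm)

print(f"v/Vmax at 10 nM: {fractional_rate(10.0, 20.0):.2f}")  # ~0.33
```

So even at soil concentrations well below Km, the transporter still runs at about a third of its maximal rate under this model.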
SLC26
SLC26 proteins function as anion exchangers and Cl− channels. Ousingsawat et al. (2012) examined the functional interaction between the cystic fibrosis transmembrane conductance regulator (CFTR) and SLC26A9 in polarized airway epithelial cells and in non-polarized HEK293 cells expressing CFTR and SLC26A9 (2.A.56.2.10). They found that SLC26A9 provides a constitutively active basal Cl− conductance in polarized grown CFTR-expressing CFBE airway epithelial cells, but not in cells expressing F508del-CFTR. In polarized CFTR-expressing cells, SLC26A9 also contributes to both Ca2+- and CFTR-activated Cl− secretion. In contrast, in non-polarized HEK293 cells co-expressing CFTR/SLC26A9, the baseline Cl− conductance provided by SLC26A9 was inhibited during activation of CFTR. Thus, SLC26A9 and CFTR behave differentially in polarized and non-polarized cells, explaining earlier conflicting data.
Transport Reaction
The generalized transport reactions catalyzed by SulP family proteins are:
(1) SO42− (out) + nH+ (out) → SO42− (in) + nH+ (in).
(2) SO42− (out) + nHCO3− (in) ⇌ SO42− (in) + nHCO3− (out).
(3) I− and other anions (out) ⇌ I− and other anions (in).
(4) HCO3− (out) + nH+ (out) → HCO3− (in) + nH+ (in).
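For reaction (2) above, the exchange is electroneutral only for one value of n; a quick check (the charge bookkeeping below is my own illustration, while the stoichiometry variable n comes from the reaction as written):

```python
# For reaction (2) above -- one SO4(2-) moving in while n HCO3- move out --
# the net inward charge is (-2) - n*(-1) = n - 2, so the exchange is
# electroneutral only when n = 2.
def net_inward_charge(n: int) -> int:
    """Net charge moved inward when one SO4(2-) enters and n HCO3- leave."""
    return -2 - n * (-1)

electroneutral_n = next(n for n in range(1, 5) if net_inward_charge(n) == 0)
print("electroneutral when n =", electroneutral_n)  # n = 2
```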
See also
Solute carrier family
Transporter Classification Database
Membrane transport protein
References
Protein families
Transmembrane transporters
Integral membrane proteins | Sulfate permease | [
"Biology"
] | 1,619 | [
"Protein families",
"Protein classification"
] |
49,242,633 | https://en.wikipedia.org/wiki/Fluorocholine | 18F-Fluorocholine is a fluorinated choline derivative and an oncologic PET tracer.
References
PET radiotracers
Quaternary ammonium compounds | Fluorocholine | [
"Chemistry"
] | 39 | [
"Pharmacology",
"Medicinal radiochemistry",
"Medicinal chemistry stubs",
"Chemicals in medicine",
"PET radiotracers",
"Pharmacology stubs"
] |
49,243,724 | https://en.wikipedia.org/wiki/Remune | Remune was the first therapeutic HIV vaccine based on the killed whole virus approach.
Remune was initially invented by Jonas Salk in 1987 and was developed by Immune Response BioPharma, Inc. (IRBP).
Remune is a therapeutic HIV/AIDS vaccine that has completed over 25 clinical studies to date and shows a robust mechanism of action restoring white blood cell counts in CD4 and CD8 T cells by reducing viral load and increasing immunity.
IRBP submitted a BLA (biologics license application) to the FDA in early 2014 for therapeutic treatment of people with HIV; according to the company, Remune would, if approved, be the first therapeutic HIV vaccine brought to market. Remune was granted FDA pediatric orphan designation on February 14, 2014.
Vaccine design
Remune is made of beta-propiolactone inactive HIV-1 which has been irradiated to destroy the viral genome. The vaccine, however, does not contain gp120 surface antigen as it falls off on fixation of HIV-1.
As a result, only gp41 remains on the virion surface. The initial clinical trial demonstrated a significant difference in clinical benefit between the Remune vaccine and placebo, with a decline in viral load (P < 0.05) and HIV-1 lymphocyte proliferation (P < 0.001), both favoring the Remune group.
The International AIDS Vaccine Research (IAVA) issued a report on "Whole Killed AIDS Vaccines" in 1999 reviewing diverse aspects of the killed-virus approach.
Outcome
Trials of Remune have largely shown no benefit. Immune Response Corporation (IRC), one of the developers of Remune, tried unsuccessfully to block negative research findings. Pfizer pulled out of its partnership with IRC and there is now doubt about any further development of Remune.
References
HIV vaccine research | Remune | [
"Chemistry"
] | 397 | [
"HIV vaccine research",
"Drug discovery"
] |
55,984,865 | https://en.wikipedia.org/wiki/Riko%20Muranaka | is a medical doctor, journalist and recipient of the 2017 John Maddox Prize for fighting to reduce cervical cancer and countering misinformation about the human papilloma virus (HPV) vaccine dominating the Japanese media, despite facing safety threats. Despite the lack of evidence, the HPV vaccine is infamous in Japan due to misattributed adverse effects, with government suspending promotion and coverage. While the World Health Organization (WHO) safety and efficacy information about the vaccine is consistent with Muranaka's reporting, a court ruled against Muranaka in an unrelated slander lawsuit in 2016 for claims of alleged fabrication. Under threat of legal harassment by antivaccine activists, publishers declined some of her works including a book on the HPV vaccine (ultimately, Heibonsha accepted the book for publication).
Biography
Muranaka received an M.A. in sociology from Hitotsubashi University and an M.D. from Hokkaido University School of Medicine. According to her own profile, she became known as a journalist for her writing about Ebola fever in 2014. In February 2018, she published her first book.
Muranaka is a part-time lecturer at the Kyoto University School of Medicine in Japan. As of 2019, she lives in Germany.
Lawsuit
In 2016, Muranaka wrote in Wedge magazine about research by Shinshu University neurologist Shuichi Ikeda, alleging that some results purporting to demonstrate a link between an HPV vaccine and brain cancer in mice had been fabricated, which resulted in a slander lawsuit. While Japan's Health Ministry stated that Ikeda's results "have not proven anything about whether the symptoms that occurred after HPV vaccination were caused by the HPV vaccine," the court ruled that evidence of fabrication was absent. The university's investigation of Ikeda's work concluded that he did not commit scientific misconduct but that his conclusions may have been overstated, and it released a statement noting that the research did not conclusively establish a link relevant to vaccine safety. Muranaka lost the slander case, and Wedge magazine had to retract the claims of fabrication from the article, with both having to pay damages.
Muranaka intends to appeal, also stating that she needs to win the lawsuit for science and that the court case was still an opportunity to make friends and gain recognition despite its negative aspects. According to Heidi Larson, director of the Vaccine Confidence Project at the London School of Hygiene & Tropical Medicine, "I think what is important is that media coverage does not distort the point and imply Dr. Ikeda's science won: It was Dr. Muranaka's manners and language that lost".
John Maddox Prize delivery ceremony
At the award ceremony of the John Maddox Prize, Riko Muranaka's speech highlighted the circumstances that, in her view, had given rise to the distinction she received.
Vaccination against human papillomavirus
The WHO has evaluated the effectiveness and safety of the vaccine against human papilloma virus (HPV), concluding that it is extremely safe and not related to the adverse effects attributed to it.
As of 2016, 79 of almost 200 countries had HPV vaccination programs for girls and adolescents.
However, Japan stopped recommending vaccination even though its own technical committees found no relationship with the alleged adverse effects falsely attributed to this vaccine; as a consequence, vaccination coverage fell to levels close to zero, not seen in any other country. In November 2021, the Ministry of Health, Labour and Welfare of Japan finally resumed active recommendation of the HPV vaccine.
Works
References
Further reading
External links
Riko Muranaka on Twitter
21st-century Japanese physicians
21st-century Japanese women physicians
Japanese journalists
Living people
Physicians from Tokyo
Hokkaido University alumni
Hitotsubashi University alumni
Vaccine controversies
Year of birth missing (living people)
John Maddox Prize recipients | Riko Muranaka | [
"Chemistry",
"Biology"
] | 790 | [
"Vaccination",
"Drug safety",
"Vaccine controversies"
] |
55,985,697 | https://en.wikipedia.org/wiki/ULAS%20J1342%2B0928 | ULAS J1342+0928 is the third-most distant known quasar detected and contains the second-most distant and oldest known supermassive black hole, at a reported redshift of z = 7.54. The ULAS J1342+0928 quasar is located in the Boötes constellation. The related supermassive black hole is reported to be "780 million times the mass of the Sun". At its discovery, it was the most distant known quasar. In 2021 it was eclipsed by QSO J0313-1806 as the most distant quasar.
Discovery
On 6 December 2017, astronomers reported that they had found the quasar using data from the Wide-field Infrared Survey Explorer (WISE) combined with ground-based surveys from the United Kingdom Infrared Telescope and the DECam Legacy Survey. It was spectroscopically confirmed using data from the Magellan Telescopes at Las Campanas Observatory in Chile, as well as the Large Binocular Telescope in Arizona and the Gemini North telescope in Hawaii. The related black hole of the quasar existed when the universe was about 690 million years old (about 5 percent of its currently known age of 13.80 billion years).
The quasar comes from a time known as "the epoch of reionization", when the universe emerged from its Dark Ages. Extensive amounts of dust and gas have been detected to be released from the quasar into the interstellar medium of its host galaxy.
Description
ULAS J1342+0928 has a measured redshift of 7.54, which corresponds to a comoving distance of 29.36 billion light-years from Earth. When it was reported in 2017, it was the most distant quasar yet observed. The quasar emitted the light observed on Earth today less than 690 million years after the Big Bang, about 13.1 billion years ago.
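As a rough consistency check (an illustration, not part of the article), the quoted comoving distance follows from the redshift under a flat ΛCDM cosmology; the Hubble constant and density parameters below are assumed Planck-like values, not figures taken from the source:

```python
import math

# Assumed flat Lambda-CDM parameters (Planck-like values; not from the article)
H0 = 67.7            # Hubble constant, km/s/Mpc
OMEGA_M = 0.31       # matter density parameter
OMEGA_L = 0.69       # dark-energy density parameter
C_KM_S = 299792.458  # speed of light, km/s
MPC_TO_MLY = 3.2616  # megaparsecs to millions of light-years

def comoving_distance_gly(z, steps=100_000):
    """Comoving distance to redshift z, in billions of light-years,
    via trapezoidal integration of (c/H0) * dz' / E(z')."""
    def E(zp):
        return math.sqrt(OMEGA_M * (1 + zp) ** 3 + OMEGA_L)
    dz = z / steps
    integral = sum(0.5 * (1 / E(i * dz) + 1 / E((i + 1) * dz)) * dz
                   for i in range(steps))
    d_mpc = (C_KM_S / H0) * integral   # comoving distance in Mpc
    return d_mpc * MPC_TO_MLY / 1000.0

print(round(comoving_distance_gly(7.54), 1))  # close to the quoted 29.36
```

Slightly different assumed cosmological parameters shift the result by a few percent, so agreement at that level is all this sketch can show.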
The quasar's luminosity is estimated at solar luminosities. This energy output is generated by a supermassive black hole estimated at solar masses. According to lead astronomer Bañados, "This particular quasar is so bright that it will become a gold mine for follow-up studies and will be a crucial laboratory to study the early universe."
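For a sense of scale (again an illustration, not a claim from the article), the Eddington luminosity implied by the roughly 780-million-solar-mass figure quoted in the lead can be estimated; the physical constants are standard values assumed here:

```python
# Standard constants (assumed values, SI units)
L_EDD_PER_MSUN = 1.26e31   # Eddington luminosity per solar mass, watts
L_SUN = 3.828e26           # solar luminosity, watts

m_bh = 7.8e8                      # black-hole mass in solar masses (from the article lead)
l_edd_w = m_bh * L_EDD_PER_MSUN   # maximum luminosity for steady spherical accretion
print(f"L_Edd = {l_edd_w:.2e} W = {l_edd_w / L_SUN:.2e} L_sun")
```

This works out to a few times 10^13 solar luminosities, which is one way to see why such a bright, massive object so early in cosmic history constrains black-hole growth models.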
Significance
The light from ULAS J1342+0928 was emitted before the end of the theoretically predicted transition of the intergalactic medium from an electrically neutral to an ionized state (the epoch of reionization). Quasars may have been an important energy source in this process, which marked the end of the cosmic Dark Ages, so observing a quasar from before the transition is of major interest to theoreticians. Because of their high ultraviolet luminosity, quasars also are some of the best sources for studying the reionization process. The discovery is also described as challenging theories of black hole formation, by having a supermassive black hole much larger than expected at such an early stage in the Universe's history, though this is not the first distant quasar to offer such a challenge.
See also
List of the most distant astronomical objects
List of quasars
J0313–1806
References
External links
Carnegie Institution for Science
NASA APOD − Quasar Pictures
Quasar Image Gallery/perseus
Astronomical objects discovered in 2017
Supermassive black holes
Boötes
Quasars | ULAS J1342+0928 | [
"Physics",
"Astronomy"
] | 700 | [
"Black holes",
"Boötes",
"Unsolved problems in physics",
"Supermassive black holes",
"Constellations"
] |
55,986,440 | https://en.wikipedia.org/wiki/Paragraphia | Paragraphia is a condition which results in the use of unintended letters or phonemes, words or syllables when writing. This is typically an acquired disorder derived from brain damage and it results in a diminished ability to effectively use written expression.
Paragraphias can be classified as a function of the type of writing error: literal paragraphias, graphemic paragraphias and morphemic paragraphias.
References
Neurological disorders
Mental disorders | Paragraphia | [
"Biology"
] | 86 | [
"Mental disorders",
"Behavior",
"Human behavior"
] |
55,986,980 | https://en.wikipedia.org/wiki/Suksin%20Lee | Suksin Lee (; 6 October 1896/7 – 12 December 1944) was a Korean biochemist and physician. He is considered a pioneer of biochemistry in Korea, having been the first Korean to obtain a Ph.D. and to hold a full-time professorship in that field. His studies of glucose metabolism and the chemical composition of common foods contributed to the scientific analysis of nutrition in the Korean diet.
Biography
Early life and education
Suksin Lee was born in P'yŏngannam-do, Korea, the son of I Myŏngse and a woman of the Koksan Kang family. He earned his medical degree from Kyŏngsŏng Medical College (now Seoul National University) in 1921, and obtained his medical license in August of that year. After graduation, he studied pathology for several months at Tokyo Imperial University in Japan. He then traveled to Germany in 1922 to pursue additional studies.
After completing preliminary language instruction and various coursework in chemistry and physiological chemistry at the Friedrich Wilhelm University of Berlin, he earned a doctorate of medicine in 1926. Lee's inaugural dissertation, Ueber Glykolyse, was a study of inorganic phosphates during blood glycolysis. His thesis advisors at the time included Otto Lubarsch of the Chemistry Department at the Pathological Institute, University of Berlin.
While in his final year of studies, Lee obtained a position as a research assistant at a national hospital in Berlin, where he worked until 1927. During this time he published and co-published several papers on the effects of photosensitive substances on glucose metabolism and cellular respiration.
Scientific career
Returning to his native Korea, he began studies of the staple Korean diet and its effects on metabolism as a research assistant at Kyŏngsŏng Medical College in February 1928. He was appointed an instructor of physiology in the department of biochemistry of Severance Union Medical College (now Yonsei University College of Medicine) and an adjunct instructor of dietetics at Ewha Womans University College of Medicine.
In 1932, Suksin Lee was the first Korean to earn a Ph.D. in biochemistry for his thesis, A Study on the Eating Habits of Koreans, presented to Kyoto Imperial University on the nutrition and metabolism of prisoners in Korea. Among his advisers at the time was Professor Sato of Keijo Imperial University.
He was then appointed full-time professor of biochemistry at Severance Union Medical College in 1933, the first Korean to hold such a position. He continued to lead the department, later serving as Severance's Dean of Student Affairs, until his death from a cerebral hemorrhage on 12 December 1944, at approximately 47 years of age.
Contributions
As the first Ph.D. and full-time professor of biochemistry in Korea, Lee contributed to the establishment of biochemistry as a newly organized field of study in Korea.
He began with a study of glycolysis. In the late 1920s, the role of phosphorylated compounds in glycolysis had not yet been fully explained. Lee's work touched on early aspects of intermediary carbohydrate metabolism, which was also the subject of Nobel Prize-winning research by Otto Fritz Meyerhof, Otto Heinrich Warburg, and Hans Adolf Krebs.
Lee maintained an interest in factors affecting glucose metabolism upon his return to Korea, where he continued his research with published studies of the Korean diet. Building upon work begun in 1928, he investigated the problem of identifying and quantifying the nutritional elements of the staple Korean diet and its effects on metabolism. He identified nutritional sources in these foods for the healthy development of Korean children and adults during the Japanese occupation of Korea.
In addition to teaching and editing, Lee authored and co-authored at least 10 scientific papers and articles in several languages throughout his brief career. He did all of this despite working under conditions of widespread rationing at the end of World War II.
Legacy
In 2014 and 2016 the Department of Biochemistry and Molecular Biology, Yonsei University College of Medicine, hosted academic symposiums to commemorate his life's work.
A Special Memorial Exhibition was also held in 2015 at the Dong-Eun Museum of Medical Science in Seoul, Korea. The exhibit included a collection of papers left by the late Suksin Lee.
Bibliography
Notes
References
1890s births
1944 deaths
Biochemists
20th-century Korean physicians | Suksin Lee | [
"Chemistry",
"Biology"
] | 884 | [
"Biochemistry",
"Biochemists"
] |
55,987,530 | https://en.wikipedia.org/wiki/Mars%20Environmental%20Dynamics%20Analyzer | The Mars Environmental Dynamics Analyzer (MEDA) is an instrument on board the Mars 2020 Perseverance rover that will characterize the dust size and morphology, as well as surface weather. Specifically, the information obtained will help address future human exploration objectives, as dust sizes and shapes, daily weather report and information on the radiation and wind patterns on Mars, that are critical for proper design of in situ resource utilization systems. MEDA is a follow-on project from REMS, of the Curiosity rover mission. MEDA has an increased scope, with greater data collection on Mars dust which contributes to overall Mars program objectives and discovery goals.
The instrument suite was developed and provided by the Spanish Astrobiology Center at the Spanish National Research Council in Madrid, Spain. On April 8, 2021, NASA reported the first MEDA weather report on Mars: for April 3–4, 2021, the high was "minus-7.6 degrees, and a low of minus-117.4 degrees ... [winds] gusting to ... 22 mph".
Scientific team members
The Principal Investigator is José Antonio Rodríguez Manfredi and the Deputy Principal Investigator is Manuel de la Torre Juarez (JPL-NASA).
List of coinvestigators and their affiliations:
Overview
Dust dominates Mars' weather the way that water dominates Earth's weather. Martian weather cannot be predicted unless dust behavior is studied and understood in the weather context. MEDA is a suite of environmental sensors designed to record dust optical properties and six atmospheric parameters: wind speed/direction, pressure, relative humidity, air temperature, ground temperature, and radiation (UV, visible, and IR ranges of the spectrum).
The technology used on MEDA was inherited from the REMS package operating on the Curiosity rover and the TWINS package on InSight lander. The sensors are located on the rover's mast and on the deck, front and interior of the rover's body. It records data whether the rover is active or not, at both day and night. The instruments will collect data for 5 minutes every 30 minutes.
MEDA components
See also
Martian soil (aka regolith)
Atmosphere of Mars
Materials Adherence Experiment (1996 Mars dust experiment on Mars Pathfinder)
References
External links
Mars 2020 Home site
Mars 2020 instruments
Meteorological instrumentation and equipment
Space science experiments
INTA spacecraft instruments | Mars Environmental Dynamics Analyzer | [
"Technology",
"Engineering"
] | 471 | [
"Meteorological instrumentation and equipment",
"Measuring instruments"
] |
55,989,914 | https://en.wikipedia.org/wiki/Breakup%20Notifier | Breakup Notifier was a web application written by product developer and programmer Dan Loewenherz that enabled its registered users to track the relationship status of their Facebook friends. An email notification was sent to the user when one of their Facebook friends changed their relationship status.
The app was one of the most viral Facebook apps at the time of its release. It was mentioned in a skit on The Jay Leno Show, and news of its popularity was published in Time magazine, the New York Post, CNET, and The Globe and Mail.
Popularity and Facebook controversy
Breakup Notifier gathered 100,000 users in less than 24 hours of its launch and reached a user base of more than 3,000,000 in February 2011. Facebook then blocked the app.
Loewenherz later created an app named Crush Notifier, which differs from the original app in that users can check if they have a mutual crush. Breakup Notifier was later unblocked by Facebook and monetized.
External links
Dan Loewenherz
Crush Notifier
References
Mobile applications
Relationship breakup | Breakup Notifier | [
"Technology"
] | 218 | [
"Mobile software stubs",
"Mobile technology stubs"
] |
55,990,616 | https://en.wikipedia.org/wiki/Migration%20%28ecology%29 | Migration, in ecology, is the large-scale movement of members of a species to a different environment. Migration is a natural behavior and component of the life cycle of many species of mobile organisms, not limited to animals, though animal migration is the best known type. Migration is often cyclical, frequently occurring on a seasonal basis, and in some cases on a daily basis. Species migrate to take advantage of more favorable conditions with respect to food availability, safety from predation, mating opportunity, or other environmental factors.
Migration is most commonly seen in the form of animal migration, the physical movement by animals from one area to another. That includes bird, fish, and insect migration. However, plants can be said to migrate, as seed dispersal enables plants to grow in new areas, under environmental constraints such as temperature and rainfall, resulting in changes such as forest migration.
Mechanisms
While members of some species learn a migratory route on their first journey with older members of their group, other species genetically pass on information regarding their migratory paths. Despite many differences in organisms’ migratory cues and behaviors, “considerable similarities appear to exist in the cues involved in the different phases of migration.” Migratory organisms use environmental cues like photoperiod and weather conditions as well as internal cues like hormone levels to determine when it is time to begin a migration. Migratory species use senses such as magnetoreception or olfaction to orient themselves or navigate their route, respectively.
Factors
The factors that determine migration methods are variable due to the inconsistency of major seasonal changes and events. When an organism migrates from one location to another, its energy use and rate of migration are directly related to each other and to the safety of the organism. If an ecological barrier presents itself along a migrant's route, the migrant can either choose to use its energy to cross the barrier directly or use it to move around the barrier. If an organism is migrating to a place where there is high competition for food or habitat, its rate of migration should be higher. This indirectly helps determine an organism's fitness by increasing the likelihood of its survival and reproductive success.
Taxonomic distribution
In animals
Animal migration is the relatively long-distance movement of individual animals, usually on a seasonal basis. It is the most common form of migration in ecology. It is found in all major animal groups, including birds, mammals, fish, reptiles, amphibians, insects, and crustaceans. The cause of migration may be local climate, local availability of food, the season of the year or for mating.
To be counted as a true migration, and not just a local dispersal or irruption, the movement of the animals should be an annual or seasonal occurrence, or a major habitat change as part of their life. An annual event could include Northern Hemisphere birds migrating south for the winter, or wildebeest migrating annually for seasonal grazing. A major habitat change could include young Atlantic salmon or sea lamprey leaving the river of their birth when they have reached a few inches in size. Some traditional forms of human migration fit this pattern.
Migrations can be studied using traditional identification tags such as bird rings, or tracked directly with electronic tracking devices. Before animal migration was understood, folklore explanations were formulated for the appearance and disappearance of some species, such as that barnacle geese grew from goose barnacles.
In plants
Plants can be said to migrate, as seed dispersal enables plants to grow in new areas, under environmental constraints such as temperature and rainfall. When those constraints change, the border of a plant species's distribution may move, so the plant may be said to migrate, as for example in forest migration.
Effects
A species migrating to a new community can affect the outcome of local competitive interactions. A species that migrates to a new community can cause a top-down effect within the community. If the migratory species is abundant in the new community, it can become a main prey for a resident predator, leaving other resident species as only an alternate prey. This new source of food (migrants) can increase the predatory species’ population size, impacting population sizes of its other prey when the migratory species return to their original location. If a resident species experiences a scarcity of food due to seasonal variation, the species can decrease in population, creating an opportunity for a new species to migrate to that location as the decrease in the population of the resident species leaves an abundance of food. Migratory species can also transport diseases long-distance from their original habitat.
See also
Great American Interchange, an event in which fauna migrated between North America and South America once the continents were bridged by the Isthmus of Panama
Human migration, physical movement by humans from one area to another
References
Ecology
Broad-concept articles | Migration (ecology) | [
"Biology"
] | 953 | [
"Ethology",
"Behavior",
"Animal migration",
"Ecology"
] |
55,991,531 | https://en.wikipedia.org/wiki/Ecomedia | Ecomedia is a field of study that deals with the relationship between non-print media and the natural environment. Generally, this is divided into two domains: cultural representations of the environment in media and environmental impact of media forms.
The first domain, the environment in media, is very broad and can potentially include any form of media that deals with an environmental issue. The second domain, environmental impacts and concerns of media, focuses on the environmental impacts at every level of production of media projects and seeks to make media as sustainable as possible.
History
The field of ecomedia is still developing and remains mainly an academic discourse. It can be considered a subgroup of media studies, and it crosses the bounds of traditionally “single media” disciplines like literature, art history, and music.
The website EcomediaStudies.org was created in 2009 by Steve Rust and Salma Monani as a forum to generate conversation about ecomedia and to make a space for people writing about ecomedia to publish their work. The website's creators sketch the history of their field as developing out of "the 2009 Association for the Study of Literature and Environment (ASLE) conference in Victoria, British Columbia", which featured a special session and several panels on ecological media. Although the creators have since stopped actively running the site in order to focus on promoting the study of ecomedia, the website remains online and holds a significant amount of information about ecomedia and its development.
In 2015 Rust and Monani, along with Sean Cubitt, edited and published Ecomedia: Key Issues (Key Issues in Environment and Sustainability), a book that outlines the important factors at stake in ecomedia studies by looking from various theoretical perspectives at the interactions of environmental issues and media. The text covers multiple types of media with chapters on photography, cinema, newspaper strips, radio, television, mapping systems, advertising, and video games.
Since then, two peer-reviewed journals have emerged in the field of ecomedia studies: the open-access Media+Environment, edited by Alenda Chang, Adrian Ivakhiv, and Janet Walker, and published by the University of California Press, and the Journal of Environmental Media, edited by Hunter Vaughan and published by Intellect. Both emerged from a week-long 2017 workshop led by Ivakhiv, Vaughan, Walker, and Jim Schwoch, at the Rachel Carson Center in Munich.
Environment Through Media
Ecomedia can be a tool for understanding how environmental issues appear and evolve in culture through the study of both traditional and new media. Cultural ideas about the environment can manifest in film, television, music, visual arts, and video games. Ecomedia often aims to promote a dialogue surrounding themes of environmental topics.
Film and television
Digital media offers numerous outlets for sharing different perspectives on the environment. Within the realm of film and television, numerous approaches exist. The subcategory of television comprises news broadcasting, entertainment and educational television shows, advertisements, short films, and more. Each has different motives behind its approach to the topic, which also vary between cultures, environments, societies and political structures. There are channels, such as National Geographic, dedicated to providing educational knowledge about the environment and human interactions with it. There are also television shows meant to provide educational knowledge about the environment to children in cartoon format. All of these forms of television are useful for answering the question of how environmental topics are portrayed through media sources and with what intentions. There have been numerous studies examining whether this is a successful and educational way to promote environmental awareness.
The category of environmental films offers similar opportunities for creative approaches to the topic of the environment. Aside from published media, celebrity status can lend a voice to the environmental movement. The well-known actor and environmentalist Leonardo DiCaprio has both been featured in and directed environmentally focused films. The idea that celebrity status can draw attention to both good and bad aspects of the environment and its conservation is a growing trend in environmentalism, as with other global topics. Some argue that individuals such as celebrities are becoming part of the spectacle of conservation, driven by the consumerist aspect of media.
The overarching media forms of film and television can address the environment through undertones and hidden messages, using it to create plot, or coverage of an existing or past environmental concern.
Music
Similarly, musicians have participated in the call to action of the environmental movement. Artists have utilized their fame to publicize issues of environmental crisis through their lyrics. Iconic examples such as Joni Mitchell’s “Big Yellow Taxi” and Marvin Gaye’s “Mercy Mercy Me” have narrated reactions to the rising environmental movement. Lyrical representations of environmental crises serve to bring them to mainstream attention.
For examples see: List of songs about the environment
Visual Arts
Artistic representations of environmental crises are increasingly present in contemporary art. Environmentally focused visual art has acted as both a reaction and a stimulus to action regarding environmental crises. Environmental art is a recent development, emerging only in the late 1960s. The pieces can serve as visual pleasure as well as political statements. Artists are increasingly using recycled materials to convey their message, putting their ideas into practice by turning trash into pieces of beauty.
For examples see: Environmental Art
Video games
Another approach to raising environmental consciousness in media targets a younger generation of consumers through video games. Video game publishers are releasing games that offer lessons about human impact on the environment and ecosystems, and about solutions to climate change through building sustainable communities. Some research suggests that video games could be an effective tool for teaching environmental ethics.
Media as Environment
This approach to ecomedia focuses on examining the physical impact that our media practices have on the environment, concerning our use of natural resources in creating and distributing different forms of media. Ecomedia acts as a form of environmental critique through extended engagement with “the production, distribution, consumption, and cultural afterlife of particular media genres and works”. It seeks to reform the way in which we use natural resources to promote safer and more environmentally-friendly practices.
Examples
E-Waste
E-waste can be broadly defined as business and consumer electronic equipment that is at or nearing the end of its functioning life. It can include cellular devices, computers, televisions, printers, microwaves, copiers, stereos, and other common electronic items. According to the US Environmental Protection Agency, only 12.5% of electronic waste is recycled in the US. Additionally, the volume of global e-waste was projected to grow to 65 million tons by the end of 2017, an increase of 33% over its volume at the time. Many components of e-waste can be recycled or reused in some way, prompting the creation of a number of ecomedia measures to address this issue.
Wastefulness of Storing Electronic Media
There are around 12 million servers driving approximately 3 million data centers. Energy waste is a major problem for the media stored in these data centers. According to a study by NRDC, most electricity waste comes from smaller data centers. Currently, electricity usage is around 91 billion kilowatt-hours, which equals the power generation of 34 coal-fired power plants. Additionally, rising electronic-media demands will increase electricity consumption to around 140 billion kilowatt-hours a year. Worsening this situation, most of the 12 million servers in the United States are idle, meaning they consume electricity without performing useful work.
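The figures above imply a rough per-plant equivalence and a sizeable projected growth rate; the arithmetic below is a sketch of those implied relationships, not data from the NRDC study:

```python
total_kwh = 91e9        # reported annual US data-center electricity use, kWh
coal_plants = 34        # reported coal-fired-plant equivalent
projected_kwh = 140e9   # projected annual use, kWh

# Implied output of one equivalent coal plant, and implied growth to the projection
kwh_per_plant = total_kwh / coal_plants
growth_pct = (projected_kwh / total_kwh - 1) * 100
print(f"{kwh_per_plant / 1e9:.1f} billion kWh per plant")  # about 2.7
print(f"implied growth: {growth_pct:.0f}%")                # about 54%
```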
E-Cycling
Electronics recycling, or e-cycling, is the practice of finding other means to dispose of electronics besides regular trash services.
Reasons to E-Cycle
E-cycling provides a means to remedy the physical waste problems surrounding much of ecomedia. Producing ecomedia hardware requires intensive mining and manufacturing, which have a substantial impact on the natural areas around those sites. E-cycling can limit further mining for ecomedia resources, as many recycling programs reuse the materials from older products. Additionally, many of the products contain hazardous materials: batteries, television components and even smartphones can damage the natural environment if they are left to sit in a landfill. E-cycling facilities are able to handle this waste appropriately.
E-Cycling Options
Public
Many local government agencies enforce restrictions on what ecomedia waste is allowed to be disposed of in the general trash, because of its hazardous nature. Products such as computer components, laptops and tablets, televisions and DVD players are restricted from regular trash. Additionally, many municipalities will offer special collection dates and times for electronic waste or require residents to schedule a special pickup for those items.
Private
Private companies also offer programs to recycle and reuse discarded electronics. Similar to the public option, many private companies will accept on-location drop-offs of computer parts or cell phones. Not every company will accept every type of electronic waste product, and some may charge a small fee.
Apple
Apple has implemented a complimentary reuse and recycling service that allows users to exchange computers, iPads, and iPhones for store credit if the device qualifies for reuse. If the device does not meet the criteria for reuse, Apple will recycle it at no cost to the consumer.
The company partners with Li Tong Group, a Hong Kong electronics recycler which breaks down every component of the old phone and captures 100% of the chemicals and gas emissions that are created during the process.
References
Ecology terminology | Ecomedia | [
"Biology"
] | 1,903 | [
"Ecology terminology"
] |
55,992,515 | https://en.wikipedia.org/wiki/Transformer%20%28gene%29 | Transformer (tra) is a family of genes which regulate sex determination in insects such as flies. Among its effects, it regulates differences between males and females in Drosophila fruit flies.
The tra-2 gene is needed for sexual differentiation in female fruit flies, and for spermatogenesis in males. It is not in the same protein family as tra, but instead works together with it in the splicing enhancer complex.
References
Eukaryote genes
Sex-determination systems | Transformer (gene) | [
"Biology"
] | 107 | [
"Sex-determination systems",
"Sex"
] |
55,993,407 | https://en.wikipedia.org/wiki/Cooperative%20segmental%20mobility | Cooperative segmental mobility is a phenomenon associated with the mobility of tens to a few hundreds of repeat units of a polymer and is important in defining the transition between the “glassy” and “rubbery” states of the polymer. It is closely related to the dynamics of the polymer near its glass transition temperature. In the glassy state, the relaxation process of a polymer chain is a cooperative phenomenon, and the molecular motion of a chain depends to some degree on that of its neighbors.
Background
F. Bueche was the first to propose a theory of segmental mobility of polymers near their glass transition temperature, presenting two methods to describe the mobility of polymer segments near Tg. Based on his excluded-volume calculations, he concluded that any factor which increases the volume disturbed by a rotating segment of a polymer chain, or which increases the density of the polymer, will in turn increase Tg. Hence, increasing chain stiffness and adding bulky but stiff chain side groups will increase Tg.
Later in 1965, Adam and Gibbs formulated the temperature dependence of cooperative relaxation properties in glass forming liquids. It was theorized that the local relaxation of polymer chains occurs in cooperative rearranging regions (CRR) where there is a collective motion of small polymer segments (hence the name segmental mobility). They correlated the size of these cooperative regions to the configurational entropy of the system. Typically the size of the CRR near Tg ranges between 1 and 4 nm.
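The Adam-Gibbs temperature dependence mentioned above is commonly written in the following form (a standard statement of the result, not a verbatim quotation of the 1965 paper):

```latex
\tau(T) \;=\; \tau_0 \exp\!\left( \frac{C}{T \, S_c(T)} \right)
```

where τ(T) is the structural relaxation time, τ0 and C are constants, and S_c(T) is the configurational entropy. As S_c shrinks on cooling, the cooperative rearranging regions grow and the relaxation time increases sharply.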
Alexandrov Solunov formulated an equation to determine the average size of the cooperative rearranging regions which corresponds to the average number of structural units that relax together to cross from one configuration to another. He concluded that the glass transition temperature is proportional to the activation energy of these CRRs.
Following this work, many researchers began investigating ways to quantify cooperative segmental mobility, which in turn affects the relaxation time of polymers.
Theory
The concept of cooperative segmental mobility arises from the quantification of free volume. As temperature decreases, the free volume available to the polymer segments is reduced. Because of this loss of free volume, cooperative segmental dynamics slows significantly near the glass transition temperature and is practically arrested at the glass transition temperature; in other words, molecular rearrangements are frozen.
This cooperative segmental mobility has a large effect on the glass transition temperature of the polymer. As F. Bueche concluded, increasing the chain stiffness and/or adding a bulky, stiff side group significantly increases Tg. For example, the glass transition temperature of poly(dimethyl siloxane) is -125 °C, whereas that of poly(isobutylene) is -75 °C because of its stiffer backbone. Similarly, Tg for polypropylene is -20 °C, whereas for polystyrene it is 95 °C because of the bulkier side group.
The glass transition temperature in thin films is also affected by this phenomenon. Exposure to the free surface enhances the cooperative segmental mobility which reduces Tg. This can be detrimental in applications like nano electronics, where polymer thin films are used.
Experimentally, single-molecule fluorescence microscopy can be used to detect the motion of single molecules, which reflects the segmental mobility in their neighboring domains. From a simulation perspective, the probability of segment movement (PSM) is a parameter used to measure segmental mobility directly. PSM is defined as the probability that a segment moves, either by jumping into neighboring empty sites or by partial sliding diffusion. The value of 〈PSM〉 decreases initially as temperature decreases, and then levels off at low temperatures, indicating that the chain segments enter a completely frozen state.
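The temperature dependence of 〈PSM〉 described above can be illustrated with a toy Monte Carlo sketch; this is a single-barrier Arrhenius acceptance model chosen purely for illustration, not the lattice model used in the literature:

```python
import math
import random

def mean_psm(temperature, barrier=1.0, n_segments=500, seed=0):
    """Estimate the mean probability of segment movement (PSM) for a toy
    model in which each segment attempts one move over an activation
    barrier, accepted with Metropolis probability exp(-barrier / T)."""
    rng = random.Random(seed)
    accept = math.exp(-barrier / temperature)
    moved = sum(1 for _ in range(n_segments) if rng.random() < accept)
    return moved / n_segments

# Mobility drops sharply on cooling and is nearly frozen at low T.
for T in (2.0, 1.0, 0.5, 0.2):
    print(f"T = {T:.1f}  <PSM> = {mean_psm(T):.3f}")
```

On cooling, the acceptance probability collapses, mimicking the freezing of segmental motion at low temperature.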
References
Polymer physics | Cooperative segmental mobility | [
"Chemistry",
"Materials_science"
] | 750 | [
"Polymer physics",
"Polymer chemistry"
] |
55,993,642 | https://en.wikipedia.org/wiki/Sinosabellidites | Sinosabellidites huainanensis, part of the small shelly fauna, is a species of Early Neoproterozoic metazoan.
See also
Huainan biota
References
Ediacaran life
Neoproterozoic
Prehistoric marine animals | Sinosabellidites | [
"Biology"
] | 51 | [
"Fossil algae",
"Algae"
] |
55,993,644 | https://en.wikipedia.org/wiki/Business%20process%20outsourcing%20in%20China | The business process outsourcing industry in China including IT and other outsourcing services for onshore and export markets surpassed 1 trillion yuan (about $145 billion) in 2016 according to the Ministry of Commerce (MOC).
History
The outsourcing industry in China grew rapidly in the 2000s from an "embryonic" scale. IDC, an IT industry consultancy, estimated in 2006 that while outsourcing of IT services was growing at 30% annually, the market size was only $586 million at the end of 2005. Most IT services then were offered to domestic companies, with offshore clients concentrated in Japan.
By 2016, outsourcing was still growing fast at about 20% annually, driven by "cloud computing, big data, Internet of things and mobile Internet" according to Xinhua.
IT services
IT related services accounted for about half of the US$87 billion in total service outsourcing provided to export markets in 2015.
The largest IT outsourcing companies based in China include ChinaSoft and Pactera. One of the largest China-focused outsourcing companies but based in the US is VXI Global. ChinaSoft is backed by Huawei. Pactera was formed in 2012 by the merger of two industry leaders, VanceInfo and HiSoft. VXI Global was acquired in 2016 by the Carlyle Group, a marquee private equity firm, for around US$1 billion. The deal was seen by Dow Jones as "making a big bet that the future of the outsourcing industry, long associated with India, will be in China."
Painting
Art production is outsourced to China either to mass-produce thousands of paintings or to execute original works based on instructions from foreign artists. A center for art outsourcing is Dafen Village in Shenzhen, well known for producing imitations of masterworks but also home to artists who are commissioned to execute original works. At the high end of the art world, outsourcing to China is practiced by Kehinde Wiley, an American portrait painter, who opened a studio in Beijing in 2006. Under Wiley's direction, 4 to 10 helpers do the brushstrokes for his paintings.
Cost has been a motivation for outsourcing to China, with an art blogger suggesting in 2007 that it allowed those with limited budgets to move up from buying posters to owning oil paintings. Initially, Kehinde Wiley also opened his Beijing studio out of cost sensitivity, but by 2012 Wiley told New York magazine that saving costs was no longer the reason for relying on Beijing-based helpers.
Global competitiveness
China has the world's second largest outsourcing industry, taking 33% of global market share, according to a news release from the Ministry of Commerce of the People's Republic of China in March 2017. The Business Processing Industry Association of India in March 2017 had concurring figures, assessing that China had the second largest outsourcing industry, behind India and ahead of the Philippines.
References
Industry in China
China | Business process outsourcing in China | [
"Technology",
"Engineering"
] | 608 | [
"Computer industry",
"Software industry",
"Software engineering"
] |
55,993,725 | https://en.wikipedia.org/wiki/Fluorescence%20biomodulation | Fluorescence biomodulation is a form of photobiomodulation, which utilizes fluorescence energy to induce multiple transduction pathways that can modulate biological processes through the activation of photoacceptors found within many different cell and tissue types. According to Magalhães and Yoshimura, photoacceptors are molecules that do not explicitly specialize in light absorption but possess the ability to do so in the presence of light which in turn can enhance their functioning.
To generate fluorescence, specialized light-absorbing molecules (chromophores) are employed to translate light energy into a high-energy emission of fluorescence through a mechanism known as the Stokes shift. Fluorescence biomodulation differs from photobiomodulation in that it employs fluorescence as a photo vehicle to induce biomodulation. Fluorescence, as generated by chromophores, is displayed as a broad spectral distribution of wavelengths and/or frequencies, which can be controlled to penetrate tissues to various degrees. Tailoring fluorescence biomodulation allows compatibility between the specific emissions of fluorescence and the unique light-absorbing characteristics of different cell and tissue types in the body. Shorter wavelengths (<600 nm) within the visible spectrum cannot penetrate deep into tissue and are localized within the epidermis or dermis. Conversely, longer wavelengths (>600 nm) within the visible spectrum penetrate deeper, into the hypodermis.
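The wavelength dependence described above can be sketched numerically. The chromophore absorption and emission maxima below are hypothetical values chosen for illustration, and the 600 nm penetration threshold is the rule of thumb from the text; per the Stokes shift, emitted photons sit at longer wavelength and therefore lower energy than absorbed ones:

```python
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
EV = 1.602e-19  # joules per electronvolt

def photon_energy_ev(wavelength_nm):
    """Photon energy in eV for a wavelength given in nm."""
    return H * C / (wavelength_nm * 1e-9) / EV

def penetration_zone(wavelength_nm):
    """Rule of thumb from the text: <600 nm stays in the epidermis/dermis,
    >600 nm reaches the hypodermis."""
    return "epidermis/dermis" if wavelength_nm < 600 else "hypodermis"

absorption_nm, emission_nm = 450, 530   # hypothetical chromophore maxima
print("Stokes shift:", emission_nm - absorption_nm, "nm")
print("Absorbed photon:", round(photon_energy_ev(absorption_nm), 2), "eV",
      "->", penetration_zone(absorption_nm))
print("Emitted photon: ", round(photon_energy_ev(emission_nm), 2), "eV",
      "->", penetration_zone(emission_nm))
```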
References
Fluorescence techniques | Fluorescence biomodulation | [
"Biology"
] | 293 | [
"Fluorescence techniques"
] |
55,993,750 | https://en.wikipedia.org/wiki/Inertial%20Stellar%20Compass | Inertial Stellar Compass (ISC) is an instrument for an advanced navigation system designed to allow spacecraft to operate more autonomously by enabling precise attitude determination, with an accuracy of better than 0.1 degrees across all three axes. It also provides the capability to recover orientation after a power loss.
The ISC is small and requires little power to operate. It was developed by NASA as part of the New Millennium Program's Space Technology 6 project in collaboration with the Charles Stark Draper Laboratory.
The instrument functions with a combination of a miniaturized star tracker and gyroscopes. It uses a wide field-of-view active pixel star camera and a micro electromechanical system to determine the real-time stellar attitude (orientation) of the spacecraft. It has a mass of and requires 3.5 W power.
In 2007, it was successfully deployed and fully operational in space aboard the TacSat-2 spacecraft.
As the New Millennium Program had its budget cancelled in 2009, it is unclear whether development of this instrument is ongoing.
References
New Millennium Program
Spacecraft instruments | Inertial Stellar Compass | [
"Astronomy"
] | 223 | [
"Astronomy stubs",
"Spacecraft stubs"
] |
55,993,906 | https://en.wikipedia.org/wiki/Authorized%20Program%20Analysis%20Report | An APAR (Authorized Program Analysis Report) (pronounced A-PAR, rhymes with far) is an IBM designation of a document intended to identify situations that could result in potential problems. It also serves as a request for the correction of a defect in current releases of IBM-supplied programs.
The Process
"Occasionally" IBM software has a bug.
Once it has been ascertained that the situation has not been caused by problems in third-party hardware or software or the user's configuration errors, IBM support staff, if they suspect that a defect in a current release of an IBM program is the cause, will file a formal report confirming the existence of an issue. In addition to confirming the existence of an issue, APARs include information on known workarounds, information on whether a formal fix is scheduled to be included in future releases, and whether or not a Program Temporary Fix (PTF) is planned.
Documenting the problem
IBM has a program to facilitate documenting the problem.
Solution levels
There are at least 2 levels of fix:
an "APAR fix", which the APAR may result in;
a permanent correction called a PTF. The PTF "is a tested APAR... The PTF 'closes' the APAR." Prior to that, an APAR is "a problem with an IBM program that is formally tracked until a solution is provided."
A PTF is a permanent correction with respect to the VRM (Version, Release, Modification) level of the product to which it applies. It is a temporary fix only in the sense that the correction is first made available as a standalone fix and is later incorporated into the product base code, at which point it is no longer a separate fix, although the associated PTF and/or APAR numbers will, as a rule, be included in the source documentation of the ensuing base code update.
System Improvement/Difficulty Report
SIDR was Xerox's acronym, covering APAR and PTF.
The acronym referred to: System Improvement / Difficulty Report.
System Improvement Request
SIR (System Improvement Request) is a terminology that Digital Equipment Corporation used, much as Xerox used SIDR.
See also
List of IBM products
References
IBM software
Software maintenance
IBM mainframe operating systems | Authorized Program Analysis Report | [
"Engineering"
] | 459 | [
"Software engineering",
"Software maintenance"
] |
55,994,004 | https://en.wikipedia.org/wiki/NGC%205609 | NGC 5609 is a spiral galaxy located 1.3 billion light-years away from Earth, in the constellation Boötes. It has the largest redshift of any galaxy in the New General Catalogue. Prior to 2023, another spiral galaxy, NGC 1262, had been thought to have a higher redshift. NGC 5609 is the most distant visually observed galaxy in the NGC Catalog and was discovered by astronomer Bindon Blood Stoney on March 1, 1851.
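For scale, a distance like this can be related to redshift through the low-redshift Hubble-law approximation d ≈ cz/H0. The redshift and Hubble constant below are assumed illustrative values, not measurements quoted in this article:

```python
C_KM_S = 299_792.458   # speed of light, km/s
H0 = 70.0              # assumed Hubble constant, km/s/Mpc
MPC_TO_MLY = 3.262     # megaparsecs to millions of light-years

def hubble_distance_gly(z):
    """Low-z Hubble-law distance estimate in billions of light-years."""
    d_mpc = C_KM_S * z / H0
    return d_mpc * MPC_TO_MLY / 1000.0

# An assumed redshift of z ~ 0.09 gives roughly 1.3 Gly, consistent in
# order of magnitude with the figure quoted for NGC 5609.
print(f"{hubble_distance_gly(0.09):.2f} Gly")
```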
See also
List of the most distant astronomical objects
List of NGC objects (5001–6000)
References
External links
Spiral galaxies
Boötes
5609
3088538
Astronomical objects discovered in 1851
Discoveries by Bindon Blood Stoney | NGC 5609 | [
"Astronomy"
] | 143 | [
"Boötes",
"Constellations"
] |
55,994,240 | https://en.wikipedia.org/wiki/Hydrovinylation | In organic chemistry, hydrovinylation is the formal insertion of an alkene into the C-H bond of ethylene.
The more general reaction, hydroalkenylation, is the formal insertion of an alkene into the C-H bond of any terminal alkene. The reaction is catalyzed by metal complexes. A representative reaction is the conversion of styrene and ethylene to 3-phenyl-1-butene:
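The conversion just mentioned can be written out as a scheme (reconstructed from the text; the original figure is not reproduced here):

```
PhCH=CH2  +  CH2=CH2  --[metal catalyst]-->  PhCH(CH3)CH=CH2
 styrene      ethylene                        3-phenyl-1-butene
```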
Ethylene dimerization
The dimerization of ethylene to give 1-butene is another example of hydrovinylation. In the Dimersol and Alphabutol Processes, alkenes are dimerized for the production of gasoline and of comonomers such as 1-butene. These processes operate at several refineries across the world at scales of about 400,000 tons/year (2006 report). 1-Butene is amenable to isomerization to 2-butenes, which are used in olefin conversion technology to give propylene.
In organic synthesis
The addition can be performed with high regio- and stereoselectivity, although the choice of metal, ligands, and counterions often plays a very important role. Many metals have been demonstrated to form active catalysts, including nickel and cobalt.
In a stoichiometric version of a hydrovinylation reaction, nucleophiles add to an electrophilic transition metal alkene complex, forming a C-C bond. The resulting metal alkyl undergoes beta-hydride elimination, liberating the vinylated product.
Hydroarylation
Hydroarylation is again a special case of hydrovinylation. Hydroarylation has been demonstrated for alkyne and alkene substrates. An early example was provided by the Murai reaction, which involves the insertion of alkenes into a C-H bond of acetophenone. The keto group directs the regiochemistry, stabilizing an aryl intermediate.
When catalyzed by palladium carboxylates, a key step is electrophilic aromatic substitution to give a Pd(II) aryl intermediate. Gold behaves similarly. Hydropyridination is a similar reaction, but entails addition of a pyridyl-H bond to alkenes and alkynes.
See also
Vinylation, the process of affixing a vinyl group
References
Alkenes
Carbon-carbon bond forming reactions | Hydrovinylation | [
"Chemistry"
] | 512 | [
"Organic compounds",
"Alkenes",
"Carbon-carbon bond forming reactions",
"Organic reactions"
] |
55,994,478 | https://en.wikipedia.org/wiki/NGC%201060 | NGC 1060 is a lenticular galaxy approximately 256 million light-years away from Earth in the constellation of Triangulum. It was discovered by William Herschel on September 12, 1784.
NGC 1060 is the brightest member of the galaxy group LGG 72, which contains approximately 15 galaxies.
The intergalactic medium (IGM) in this system is highly disturbed, with separate X-ray peaks centred on the two main galaxies of the group, NGC 1060 and NGC 1066.
A ~250 kpc arc of hot gas links these two galaxies.
The system appears to be undergoing a merger, which may have triggered the nuclear activity in NGC 1060.
In 2013 a small-scale (20″/7.4 kpc) jet source was detected in NGC 1060, indicating the remnant of an old, low-power outburst; the radio emission arising from this jet was also detected.
NGC 1060 is an active galaxy, with confirmed active galactic nucleus (AGN).
Supernova SN 2004fd
Supernova SN 2004fd, of magnitude 15.70, was detected in NGC 1060 on October 22, 2004. It was discovered by Tom Boles, who was using a 0.35 m Schmidt-Cassegrain telescope during searches for the UK Nova/Supernova Patrol. The supernova was classified as type Ia and was located very close to the nucleus of its host galaxy (J2000 celestial coordinates: RA 02h 43m 15.20s, Dec +32° 25′ 26.00″).
See also
List of NGC objects (1001–2000)
References
External links
SEDS
Lenticular galaxies
Triangulum
1060
10302
Astronomical objects discovered in 1784
Discoveries by William Herschel | NGC 1060 | [
"Astronomy"
] | 353 | [
"Triangulum",
"Constellations"
] |
72,977,148 | https://en.wikipedia.org/wiki/Entotrust%20certification | The Entotrust certification is a voluntary product certification of insects as food and related insect-based foods, which allows producers to communicate their food safety and sustainability. Increasingly used in Europe, Africa, Asia, the US, Mexico, and Latin America, with the mission to recognize and report quality products based on edible insects, the logo can only be used by fully certified producers and farmers.
The Entotrust International certification covers a wide range of food products including bakery, pasta, confectionary, salted snacks and chips, protein and energy bars, whole dried insects, insect protein powder, and functional drinks. In general, it encompasses any product that might become a more sustainable and nutritious one, with the inclusion of a percentage of insect origin proteins.
The call for more sustainable proteins, healthier diets, and food innovation is driving wider adoption of insect food across the globe. A certificate and a seal of acceptance can play a major part in making customers trust edible-insect products: it is easier to persuade people to eat insects if they know the insects are safe and bred in a sustainable manner.
The insect-as-food market is expected to grow significantly, at an average CAGR of 8.9% during the forecast period (2023–2028), according to independent analysis. More than 2,100 insect species are currently eaten by two billion people in 130 countries. Insects have high-value nutritional profiles and are rich in protein, omega-3 fatty acids, iron, zinc, folic acid and vitamins B12, C and E. Commercial insect farming is considered to have a low environmental footprint, requiring minimal water, energy, and land resources. Europe and the United States of America are the leading edible-insect markets in the Western world today, with more than 400 edible-insect-related businesses in operation.
How it works
Producers and farmers of insect food intended for human consumption request to have their products analysed in accredited food laboratories for microbiological and chemical elements, to ensure food safety. A thorough audit of the farming and manufacturing processes is then performed in compliance with strict requirements; the baselines are the HACCP methodology and checklists comparable to the GFSI and BRC assessments. The aim is to verify the complete value chain from the farm to the consumer's plate.
The three dimensions of the assessment are:
Food safety
Environmental footprint
Social compliance (ILO)
Certification is obtained by responding to specific checklists concerning food safety as well as company processes and practices in terms of environmental impact and social fairness. Verification is carried out through meetings, video calls, documentary evidence and photos and, when necessary, auditors' visits to the company premises. Furthermore, the accredited laboratory carries out yearly analyses on certified products, as a necessary condition for the renewal of the certification.
Other certification and product labelings
The Global Gap farm and food certification program
The Fair Trade Federation does not certify individual products, but instead evaluates an entire business.
The Marine Stewardship Council for fisheries
The FTO Mark, launched in 2004 by World Fair Trade Organization, and identifies registered fair trade organizations.
The Global Aquaculture Alliance for aquaculture
UTZ Certified is a coffee certification program that has sometimes been dubbed "Fairtrade lite"
References
External links
FAO Edible insects report for food security
Protein classification | Entotrust certification | [
"Biology"
] | 669 | [
"Protein classification"
] |
72,977,234 | https://en.wikipedia.org/wiki/Microsegmentation%20%28network%20security%29 | Microsegmentation is a network security approach that enables security architects to construct network security zone boundaries per machine in data centers and cloud deployments, in order to segregate and secure workloads independently.
It is now also used on the client network as well as the data center network.
Types of microsegmentation
There are three main types of microsegmentation:
Native OS host-based firewall segmentation employs OS firewalls to regulate network traffic between network segments. Instead of using a router or network firewalls or deploying agents, each host firewall is used to perform both auditing and enforcement, preventing attackers from moving laterally between network machines. While Native OS host-based firewalls can implement many segmentation schemes, including microsegmentation, only recent innovations in the space have made implementation and management achievable at scale.
Host-agent segmentation: This style of microsegmentation makes use of endpoint-based agents. By having a centralized manager with access to all data flows, the difficulty of detecting obscure protocols or encrypted communications is mitigated. The use of host-agent technology is commonly acknowledged as a powerful method of microsegmentation. Because infected devices act as hosts, a solid host strategy can prevent issues from manifesting in the first place. This software, however, must be installed on every host.
Hypervisor segmentation: In this implementation of microsegmentation, all traffic passes through a hypervisor. Since hypervisor-level traffic monitoring is possible, existing firewalls can be used, and rules can be migrated to new hypervisors as instances are spun up and spun down. Hypervisor segmentation typically doesn't function with cloud environments, containers, or bare metal, which is a downside.
Network segmentation: This approach builds on the current setup by using tried-and-true techniques like access-control list (ACLs) for network segmentation.
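As a concrete, hypothetical illustration of the host-based approach, a per-machine Linux nftables ruleset might admit only the flows a workload actually needs: here, a web tier that accepts HTTPS and may talk only to one database host (all addresses and ports are illustrative assumptions, not a recommended policy):

```
table inet microseg {
  chain input {
    type filter hook input priority 0; policy drop;
    ct state established,related accept   # replies to traffic we initiated
    iif lo accept                         # local loopback
    tcp dport 443 accept                  # inbound HTTPS to this workload only
  }
  chain output {
    type filter hook output priority 0; policy drop;
    ct state established,related accept
    ip daddr 10.0.2.15 tcp dport 5432 accept  # only the database host (PostgreSQL)
    udp dport 53 accept                       # DNS lookups
  }
}
```

Because the default policy in both directions is drop, lateral movement to or from any host not named in the ruleset is blocked at the machine itself, independent of the network fabric.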
Benefits
Microsegmentation allows defenders to thwart almost any attack method by closing off attack vectors within internal networks, so that attackers are stopped in their tracks.
Microsegmentation in internet of things (IoT) environments can help businesses gain command over the increasing volume of lateral communication taking place between devices, which is currently unmanaged by perimeter-focused security measures.
Challenges
Implementing and maintaining microsegmentation can be difficult. The first deployment is always the most challenging. Some applications may not be able to support microsegmentation, and the process of implementing microsegmentation may cause other problems.
Defining policies that meet the requirements of every internal system is another potential roadblock. Internal conflicts may occur as policies and their ramifications are considered and defined, making this a difficult and time-consuming process for certain adopters.
Network connection between high and low-sensitivity assets inside the same security boundary requires knowledge of which ports and protocols must be open and in which direction. Inadvertent network disruptions are a risk of sloppy implementation.
Microsegmentation is widely compatible with environments running common operating systems including Linux, Windows, and MacOS. However, this is not the case for companies that rely on mainframes or other outdated forms of technology.
To reap the benefits of microsegmentation despite its challenges, companies have developed solutions using automation and self-service.
References
Computer network security | Microsegmentation (network security) | [
"Engineering"
] | 690 | [
"Cybersecurity engineering",
"Computer networks engineering",
"Computer network security"
] |
72,977,646 | https://en.wikipedia.org/wiki/Mode%20conversion | Mode conversion is the transformation of a wave at an interface into other wave types (modes).
Principle
Mode conversion occurs when a wave encounters an interface between materials of different impedances and the incident angle is not normal to the interface. Thus, for example, if a longitudinal wave from a fluid (e.g., water or air) strikes a solid (e.g., a steel plate), it is usually refracted and reflected as a function of the angle of incidence, but if some of the energy causes particle movement in the transverse direction, a second, transverse wave is generated, which can also be refracted and reflected. Snell's law of refraction can be formulated for the two wave types as sin(θ₁)/c₁ = sin(θ₂)/c₂, where c₁ and c₂ are the respective wave speeds.
This means that the incident wave is split into two different wave types at the interface. If we consider a wave incident on an interface of two different solids (e.g. aluminum and steel), the wave type of the reflected wave also splits.
Besides these simple mode conversions, an incident wave can also be converted into surface waves. For example, if a longitudinal wave strikes a boundary surface at an angle beyond the critical angle of total reflection, it is totally reflected, but in addition a surface wave traveling along the boundary layer is generated. The incident wave is thus converted into a reflected longitudinal wave and a surface wave.
In general, mode conversions are not discrete processes, i.e. only part of the incident energy is converted into the different wave types. The amplitudes (transmission and reflection coefficients) of the converted waves depend on the angle of incidence.
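Snell's law above can be turned into a small calculator for converted-wave angles. The velocities below are typical textbook values (water ≈ 1480 m/s; steel ≈ 5900 m/s longitudinal, ≈ 3230 m/s shear), used here as illustrative assumptions:

```python
import math

def converted_angle(theta_incident_deg, v_incident, v_converted):
    """Angle of the mode-converted wave from Snell's law:
    sin(theta_i)/v_i = sin(theta_c)/v_c.
    Returns None beyond the critical angle (total reflection)."""
    s = math.sin(math.radians(theta_incident_deg)) * v_converted / v_incident
    if abs(s) > 1.0:
        return None  # beyond the critical angle: no transmitted mode
    return math.degrees(math.asin(s))

# Longitudinal wave in water (1480 m/s) hitting steel: both a refracted
# longitudinal (5900 m/s) and a converted shear (3230 m/s) wave appear.
print(converted_angle(10.0, 1480.0, 5900.0))  # refracted L-wave, ~43.8 deg
print(converted_angle(10.0, 1480.0, 3230.0))  # converted S-wave, ~22.3 deg
print(converted_angle(30.0, 1480.0, 5900.0))  # beyond critical angle -> None
```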
Seismic waves
In seismology, wave conversion specifically refers to the conversion between P and S waves at discontinuities. Body waves are reflected and refracted when they hit a boundary layer within the Earth. P-waves can be converted into S-waves (PS-waves) at interfaces, and vice versa (SP-waves). Snell's law applies analogously for an incident P-wave: sin(i)/v_P = sin(j)/v_S, where i and j are the angles of the incident P-wave and the converted S-wave, and v_P and v_S are the respective wave speeds.
The change in amplitudes can be described with the Zoeppritz equations.
References
Wave mechanics | Mode conversion | [
"Physics"
] | 426 | [
"Wave mechanics",
"Waves",
"Physical phenomena",
"Classical mechanics"
] |
72,978,897 | https://en.wikipedia.org/wiki/Green%20track | Green track (also grassed track or lawn track) is a type of railway track in which the track bed and surrounding area are planted with grass turf or other vegetation as ground cover. It is a popular way of making railways more visually appealing, particularly for trams and light rail, and providing additional urban green space. Aside from the visual improvement, the vegetation provides a number of positive effects, such as noise reduction, less air pollution, rainwater runoff mitigation, and reduced urban heat island effect.
History
The first green tracks were installed in 1905 in Berlin. Due to the technical challenges they posed, many green tracks were phased out in the following decades, but beginning in the mid-1980s, green tracks have seen a resurgence.
Impact
Green tracks have a number of positive impacts on the urban environment. Green tracks reduce surface runoff by retaining an estimated 50-70% of precipitation, while remaining stormwater is released more slowly and with fewer pollutants. The absorbed water is released through evapotranspiration and provides a cooling effect on the surrounding urban micro-climate. Vegetation, commonly grass or sedum, provides an important surface for the deposit and capture of fine particulate matter. High-vegetation tracks offer sound reduction of up to 3 decibels under ideal conditions, and are subjectively perceived as quieter.
A light rail project in Sydney, Australia identified green track as requiring 81% less concrete compared to embedded track.
The cooling effect of the green surfaces, particularly in summer, lowers the temperature of the rails themselves, reducing the risk of buckling. The city of Dresden estimates that its total installed green track has a measurable cooling effect on the surrounding air.
Green tracks, particularly planted with endemic plant species, have positive impacts on the local biodiversity, such as the use of wildflowers that serve as food sources for wild bees, and providing habitats for insects.
Types
There are several different designs for green tracks:
Track with high vegetation, or top-of-rail (TOR): Grooved rails are placed on concrete sleepers, and the surrounding space is filled with dirt up to the grade of the top of the tracks. This technique provides the best noise abatement and a visually uniform green area, as only the track surfaces are visible. The sides of the tracks are generally protected with cavity filler blocks, to prevent direct contact between the rails and the soil, keeping the fasteners clean and preventing or slowing corrosion. The soil must be removed during track renewal operations. Soil moisture leads to corrosion in the track and fasteners, and both must typically be replaced during renewal. Because the fasteners are below the surface, it is more difficult to inspect them. The railway area is more difficult to recognize as a dangerous area, and additional protections need to be taken against stray currents. Track with high vegetation can also be crossed or driven on in exceptional circumstances.
Track with low vegetation or foot-of-rail (FOR): Concrete sleepers are placed upon longitudinal concrete beams, and the space in between the sleepers is filled with topsoil instead of track ballast, with vegetation only reaching the base of the tracks. This has a reduced sound dampening effect, and the uniform green area is visually interrupted. However, inspection and maintenance of the rails is easier and the fasteners corrode much more slowly, as they are not constantly exposed to moisture from the soil. Switches and sensors can be mounted on the tracks, and there is better control over residual electrical current flowing into the ground. This design also allows for normal flat-bottomed rails, which are typically cheaper and easier to maintain than grooved tramway track.
Track with mixed-level vegetation: This solution incorporates low vegetation between the rails and high vegetation outside. This allows for easier inspection of the rails, but creates a visible trough in the green surface area, where dead leaves and debris can collect.
The ground cover requires a stable track bed construction, where the track geometry will not shift after construction. This is commonly accomplished with ballastless track. The addition of soil to track ballast would reduce the friction between the individual ballast stones and make it easier for the track to shift out of place, while making adjustments and corrections more difficult. At the same time, extensions and changes are more involved than with conventional track ballast.
The soil surrounding the tracks is planted with grass or other suitable ground cover plant. In comparison, grass tends to require higher maintenance, such as mowing and irrigation, and a soil layer of at least . In particular, shade- and drought-tolerant species are chosen depending on local conditions; sedum is a common choice, as it requires a substrate of only . The space between the tracks can also be filled with grass pavers instead of soil. The city of Braunschweig, Germany, together with the Julius Kühn-Institut, developed a mixture of endemic grasses and wildflowers suitable for green tracks that will also provide a habitat for wild bees.
Prevalence
Green tracks are particularly common in Central Europe; in 2015, Germany had a total of of green track. In France, the government prescribes the use of green track wherever practicable for tramway construction.
Gallery
See also
Conservation development
Environmental design
Environmental planning
Green infrastructure
Sustainable urbanism
Urban green space
Urban nature
Water-sensitive urban design
References
Fills (earthworks)
Grasslands
Permanent way
Railway track layouts
Tram transport
Lawns | Green track | [
"Biology"
] | 1,095 | [
"Grasslands",
"Ecosystems"
] |
72,979,513 | https://en.wikipedia.org/wiki/Tin%28II%29%20stearate | Tin(II) stearate is a metal-organic compound with the chemical formula Sn(C17H35COO)2. The compound is classified as a metallic soap, i.e. a metal derivative of a fatty acid (stearic acid).
Physical properties
Tin(II) stearate forms colorless (white) crystals.
The compound is insoluble in water.
Chemical properties
Tin(II) stearate reacts with sodium hydroxide solution or hydrochloric acid to form tin(II) chloride or tin(II) chloride hydroxide.
Uses
The compound is used in the pharmaceuticals and cosmetics industries as a thickener, film-forming polymer, and release agent.
References
Stearates
Tin(II) compounds | Tin(II) stearate | [
"Chemistry"
] | 146 | [
"Inorganic compounds",
"Inorganic compound stubs"
] |
72,979,776 | https://en.wikipedia.org/wiki/CharaChorder | CharaChorder is an American privately held company that specializes in text input devices. Its major products include the CharaChorder One and the CharaChorder Lite, which are keyboards that allow for character and chorded entry.
History
The company's first product was the CharaChorder One. The intention of this first device was to give people with disabilities and those with limited mobility the ability to communicate with ease.
The founders cite their creation as an example of the curb cut effect, that is, technology designed to enable people with disabilities that ends up benefiting everyone. After its initial release, the company was recognized as a new and noteworthy company at the Consumer Electronics Show.
In January 2022, the company made the news when its CEO posted videos to social media demonstrating himself typing in excess of 500 wpm. The speeds are not recorded by some typing competition websites because they are not achieved with a traditional keyboard. In particular, the website Monkeytype blocks any speeds over 300 wpm from their leader boards.
Since its initial creation the company has focused on creation of technologies that enable users to perform text entry faster. The company's motto is "typing at the speed of thought." In May 2022, the company began publicly selling the CharaChorder Lite. The CharaChorder Lite is a chorded keyboard that allows for much of the same functionality of a CharaChorder One, with a more familiar QWERTY layout. In November 2022, the company began a Kickstarter campaign to fund the development of the CharaChorder X, a USB device that aims to bring chorded functionality to existing keyboards.
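The chorded entry described above can be sketched in a few lines. This is an illustrative toy, assuming nothing about CharaChorder's products: the chord mappings, function name, and data structure are my own, not the company's actual engine or chord library.

```python
# Toy model of chorded text entry: keys pressed simultaneously map to a
# whole word. The chord-to-word table here is invented for illustration.
CHORDS = {
    frozenset("th"): "the",
    frozenset("adn"): "and",
    frozenset("yuo"): "you",
}

def decode(keys_down: set) -> str:
    """Return the chorded word if the pressed-key set matches a chord;
    otherwise fall back to emitting the keys in sorted order."""
    return CHORDS.get(frozenset(keys_down), "".join(sorted(keys_down)))
```

Because a chord is a *set* of keys, press order does not matter, which is what lets chorded entry outpace sequential typing.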
It has since produced the engine used for chording, and is working on the "Forge Keyboard".
References
Companies based in Frisco, Texas
Computer companies of the United States
Computer hardware companies
Computer peripheral companies
Computer keyboard companies
Typing
2019 establishments in Texas | CharaChorder | [
"Technology"
] | 386 | [
"Computer hardware companies",
"Computers"
] |
72,980,753 | https://en.wikipedia.org/wiki/Oxana%20Kharissova | Oxana Vasilievna Kharissova (born 1969) is a Ukrainian–Mexican nanoscientist whose research involves the synthesis and solubility of nanoparticles. She is a professor and researcher in the faculty of physical and mathematical sciences at the Autonomous University of Nuevo León.
Education and career
Kharissova was born in 1969 in Ukraine, then part of the Soviet Union. She earned a bachelor's degree in geochemistry in 1993 and a master's degree in crystallography in 1994, both from Moscow State University, before emigrating to Mexico in 1995 and later becoming a Mexican citizen. She has a 2001 Ph.D. in materials science from the Autonomous University of Nuevo León, where she is a professor and researcher.
Books
Kharissova is a co-author or co-editor of books including:
Handbook of Less-Common Nanostructures (with Boris I. Kharisov and Ubaldo Ortiz-Mendez, CRC Press, 2012)
Solubilization and Dispersion of Carbon Nanotubes (with Boris I. Kharisov, Springer, 2017)
Carbon Allotropes: Metal-Complex Chemistry, Properties and Applications (with Boris I. Kharisov, Springer, 2019)
Recognition
Kharissova is a member of the Mexican Academy of Sciences, elected in 2010. In 2017 the Autonomous University of Nuevo León gave her their "Flama, Vida y Mujer" recognition, in the area of teaching and research.
References
External links
1969 births
Living people
Ukrainian emigrants
Immigrants to Mexico
Naturalized citizens of Mexico
Nanotechnologists
20th-century Mexican scientists
Moscow State University alumni
Autonomous University of Nuevo León alumni
Academic staff of the Autonomous University of Nuevo León
Members of the Mexican Academy of Sciences
21st-century Mexican scientists
20th-century Ukrainian women scientists
20th-century Mexican women scientists
21st-century Mexican women scientists | Oxana Kharissova | [
"Materials_science"
] | 383 | [
"Nanotechnology",
"Nanotechnologists"
] |
72,981,514 | https://en.wikipedia.org/wiki/Sara%20Rietti | Sara Rietti (Buenos Aires, 3 December 1930 – Buenos Aires, 28 May 2017), also known as Sara Bartfeld de Rietti, was the first nuclear chemist from Argentina. She was Chief of Staff of the Ministry of Science and Technology during the government of President Raúl Alfonsín.
Life and work
Education
Rietti began studying chemistry in 1948 after she had finished her secondary education, graduating in 1954 with her degree in nuclear chemistry. She took her last course at the National Atomic Energy Commission in 1953, a fact that by chance allowed her to be the first nuclear chemist in Argentina. Her thesis was titled Study of the reaction between diboron tetrachloride and diborane. Boranes are chemical compounds known as boron hydrides. These materials cannot be exposed to air or humidity. Because they had to be kept cold at all times, she had to constantly monitor her compounds, including on weekends, when she went to her laboratory with her children. In 1963, she finally obtained her doctorate from the University of Buenos Aires (UBA).
Research career
Rietti worked as a researcher at various Argentine universities and state agencies including the Department of Inorganic Chemistry and Physicochemistry at the University of Buenos Aires between 1955 and 1956. She was also a member of the board of directors of the Faculty of Exact and Natural Sciences of the University of Buenos Aires (UBA), where she witnessed the country's security forces breaking into the university during the events known as the Night of the Long Canes on 29 July 1966.
On that night, after police detained about 400 intellectuals, Sara and her husband Víctor moved from one police station to the next to free their colleagues from jail. As a result of the nationwide repression, numerous laboratories and libraries were destroyed and many scientists and academics were exiled or fled the country. Sara and Victor decided to remain in Argentina.
Rietti worked at the Latin American Publishing Center as director of the Scientific Collection between 1967 and 1969 and served on the board of that publisher between 1972 and 1992. Between 1973 and 1975, she was the coordination director of the National Institute of Industrial Technology.
In 1983, when science was again allowed to flourish in Argentina, the mathematician Manuel Sadosky appointed her Chief of Staff of the Secretariat of Science and Technology, over which he presided, and in 1994 she was appointed academic coordinator of the graduate program in Policy and Management of Science and Technology at the UBA. She was also a teacher advisor to the Rectorate there, a position she held until the end of her life.
Family life
Her mother was of Polish descent and her father was Ukrainian. He encouraged her in the sciences when he noticed her abilities in mathematics. Although she preferred philosophy, history or political science, she ultimately followed her father's advice because she had a cousin who had already graduated as a chemist.
While studying at the university, she met Víctor Rietti but was reluctant to establish a personal relationship with him because he was too young and they only saw each other at the university. At his insistence, they got married in 1952, had three children and sixty years later they were still together.
Honors
The main room for gender studies at the National University of Cuyo was named after Rietti in October 2010, to pay tribute to Argentine women scientists.
On 4 November 2011, the National University of Rosario awarded her an honorary doctorate.
In 2022, Rietti was one of eleven women scientists honored in the renamed Hall of Argentine Science, which is located in the Government House, first floor.
References
1930 births
2017 deaths
Scientists from Buenos Aires
Argentine scientists
Argentine chemists
20th-century Argentine scientists
20th-century Argentine women scientists
20th-century Argentine women
Women chemists
Nuclear chemists
University of Buenos Aires alumni
Academic staff of the University of Buenos Aires | Sara Rietti | [
"Chemistry"
] | 747 | [
"Nuclear chemists"
] |
72,983,103 | https://en.wikipedia.org/wiki/JAB%20Code | JAB Code (Just Another Barcode) is a colour 2D matrix symbology made of colour squares arranged in either square or rectangle grids. It was developed by .
The code contains one primary symbol and optionally multiple secondary symbols. The primary symbol contains four finder patterns located at the corners of the symbol.
The code uses either 4 or 8 colours. The 4 basic colours (cyan, magenta, yellow, black) are the 4 primary colours of the subtractive CMYK colour model which is the most widely used system in the industry for colour printing on a white base such as paper. The other 4 colours (blue, red, green, and white) are secondary colours of the CMYK model and originate as an equal mixture of a pair of basic colours.
The barcode is not subject to licensing and was submitted to ISO/IEC standardization as ISO/IEC 23634, expected to be approved at the beginning of 2021 and finalized in 2022. The software is open source and published under the LGPL v2.1 license. The specification is freely available.
Because the colour adds a third dimension to the two-dimensional matrix, a JAB Code can contain more information in the same area compared to two-colour (black and white) codes – a 4 colour code doubles the amount of data that could possibly be stored and triples it with an 8 colour code. This can allow storage of an entire message in the barcode, rather than just storing partial data with a reference to a full message somewhere else (such as a link to a website), thus eliminating the need for additional always-available infrastructure beyond the printed barcode itself. It may be used to digitally sign encrypted digital version of printed legal documents, contracts and certificates (diplomas, training), medical prescriptions or provide product authenticity assurance to increase protection against counterfeits.
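The capacity claim above is simple arithmetic, sketched below. The function name and palette labels are illustrative assumptions, not part of the JAB Code specification.

```python
import math

# With k distinguishable module colours, one module can encode log2(k) bits,
# versus 1 bit per module for a black-and-white code: 4 colours give 2 bits
# (double), 8 colours give 3 bits (triple).
def bits_per_module(colours: int) -> float:
    return math.log2(colours)

# The 8-colour palette described above: the four CMYK primaries plus their
# pairwise mixtures (the names are labels for illustration only).
PALETTE_8 = ["cyan", "magenta", "yellow", "black", "blue", "red", "green", "white"]
```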
References
Automatic identification and data capture
Barcodes
Encodings
Hypermedia | JAB Code | [
"Technology"
] | 394 | [
"Multimedia",
"Data",
"Hypermedia",
"Automatic identification and data capture"
] |
72,983,444 | https://en.wikipedia.org/wiki/Olibanic%20acid | Olibanic acid is an organic compound that is naturally found in frankincense. Even though it is present in smaller concentrations than other components, it has a highly potent odor and is believed to be one of the key components responsible for the distinctive smell of frankincense. Both the (1S,2S)-(+)-trans and (1S,2R)-(+)-cis enantiomers are present and have similar but not identical "old church"-like odors.
References
Cyclopropyl compounds
Carboxylic acids | Olibanic acid | [
"Chemistry"
] | 121 | [
"Carboxylic acids",
"Functional groups",
"Organic compounds",
"Organic compound stubs",
"Organic chemistry stubs"
] |
72,984,583 | https://en.wikipedia.org/wiki/Cullen%20Medal | The Cullen Medal, named for William Cullen, is awarded by the Royal College of Physicians of Edinburgh for “the greatest benefit done to Practical Medicine”.
References
Awards established in 1890
Medicine awards | Cullen Medal | [
"Technology"
] | 39 | [
"Science and technology awards",
"Medicine awards"
] |
72,987,392 | https://en.wikipedia.org/wiki/Maria%20D.%20Ellul | Maria D. Ellul is a retired ExxonMobil materials scientist known for her contributions to and development of commercial polyolefin and polyamide specialty thermoplastic elastomers, and recognized as one of the first prominent women scientists in the rubber industry.
Education
Ellul was born in Malta. In 1977, she earned a B.Sc. in chemistry at The Royal University of Malta and was employed as an intern with Sterling Organics. In 1979, she completed the M.Sc. degree in Polymer Science from North London Polytechnic. She earned her Ph.D. in Polymer Science from The University of Akron in 1984 under advisor Alan Gent.
Career
Ellul's first post-baccalaureate employment was in 1977 as a chemist at Dowty Seals Ltd. Her first employer following her doctorate was GenCorp/General Tire. In that role she conducted research on tire materials, adhesion of rubber, and fracture mechanics. She joined ExxonMobil Chemical Co. in 1991 with its subsidiary Advanced Elastomer Systems. She held positions of increasing responsibility, eventually rising to principal scientist, and retired as senior research associate.
She is known for both fundamental and applied contributions to polyolefin and polyamide based thermoplastic elastomer materials. She developed methods to control the morphology and rheology of polymer blends, and contributed to the understanding of the kinetics and thermodynamics of polymer blends, block/graft copolymers and dynamic vulcanization. Ellul's research led to the commercialization of many specialty polymers including Santoprene Thermoplastic Vulcanizates, Exxcore Dynamically Vulcanized Alloys, Vistamaxx, Butyl and EPDM. She is an associate editor for the scientific journal Rubber Chemistry & Technology.
Awards and recognition
1996 - Honored by the National Inventors Hall of Fame in the National Salute to Corporate Inventors.
1997 - Sparks–Thomas award from the ACS Rubber Division
1998 - Woman of the Year Award from Ohio Women's History Project
2005 - Chemistry of Thermoplastic Elastomers Award from the ACS Rubber Division
2010 - Department of Polymer Science Outstanding Alumni Award from The University of Akron
2016 - ExxonMobil Chiefs' Award
References
Polymer scientists and engineers
Year of birth missing (living people)
Living people
ExxonMobil people
Women materials scientists and engineers | Maria D. Ellul | [
"Chemistry",
"Materials_science",
"Technology"
] | 485 | [
"Polymer scientists and engineers",
"Physical chemists",
"Materials scientists and engineers",
"Polymer chemistry",
"Women materials scientists and engineers",
"Women in science and technology"
] |
72,988,129 | https://en.wikipedia.org/wiki/Diffractive%20solar%20sail | A diffractive solar sail, or diffractive lightsail, is a type of solar sail which relies on diffraction instead of reflection for its propulsion. Current diffractive sail designs use thin metamaterial films, containing micrometer-size gratings based on polarization or subwavelength refractive structures, causing light to spread out (i.e. diffract) and thereby exert radiation pressure when it passes through them.
History
The idea of using diffraction for a solar sail was first proposed in 2017 by researchers at the Rochester Institute of Technology. This was enabled in part by advances in material design and fabrication (particularly of gratings), and optoelectronic control. In 2019 a diffractive solar sail project from the Rochester Institute of Technology suggested a solar polar orbit mission with diffractive sails that could reach a higher solar inclination angle and smaller orbital radius than one with reflective sails, reaching NASA's NIAC phase II. In 2022 the NIAC project reached phase III and gained US$2 million of support from NASA, with involvement of researchers from both Johns Hopkins University and the Rochester Institute of Technology.
Advantages over reflective sails
Reflective solar sail designs tend to consist of large, thin reflective sheets. By the law of reflection, the forces acting on them will always be normal to the sheet surface; the sheets must therefore be tilted during navigation, which poses structure and control challenges, and reduces the power reaching the sail. These in turn can lower reliability, increase mass, and reduce acceleration. Furthermore, reflective sails tend to absorb a significant proportion of the light hitting them, causing them to heat up; this can cause structural problems, particularly when the sail is repeatedly heated and then allowed to cool. Also, each photon hitting the sail is used only once: it is either reflected or absorbed.
On the other hand, in a diffractive sail the grating can redirect light even when the sheet directly faces the sun, allowing much more efficient control with maximum power hitting the sail. The diffractive film can be designed to allow for optoelectronic control of the gratings, thereby reducing mass and increasing reliability relative to mechanical control. Since the film is translucent, most of the light just passes through the sail, reducing overall heating. Photons can be reused: either by passing through a second diffraction grating for more thrust, or by going to a solar cell to provide electricity.
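The contrast above can be illustrated with standard radiation-pressure bookkeeping. This is a hedged sketch under idealised assumptions (perfect mirror; all light sent into a single diffraction order); the function names and parameters are mine, not from any published sail design.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def reflective_force(power_w: float, tilt_rad: float) -> float:
    """Ideal mirror intercepting `power_w` watts, tilted `tilt_rad` from
    face-on: each photon is specularly reflected, so the force, directed
    along the sail normal, has magnitude 2 P cos(tilt) / c."""
    return 2.0 * power_w * math.cos(tilt_rad) / C

def diffractive_transverse_force(power_w: float, order_angle_rad: float) -> float:
    """Sun-facing transmissive grating sending all light into one order at
    `order_angle_rad`: the light gains transverse momentum P sin(theta) / c
    per second, and the sail feels the equal and opposite reaction -- even
    though the sail itself directly faces the sun."""
    return power_w * math.sin(order_angle_rad) / C
```

The point of the comparison: the mirror must tilt (losing intercepted power) to steer its thrust, while the grating steers thrust through its diffraction angle at full face-on illumination.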
References
Spacecraft attitude control
Spacecraft propulsion
Spacecraft components
Interstellar travel
Optoelectronics
Photonics | Diffractive solar sail | [
"Astronomy"
] | 530 | [
"Astronomical hypotheses",
"Interstellar travel"
] |
72,988,466 | https://en.wikipedia.org/wiki/Axial%20current | In particle physics, the axial current, also denoted the pseudo-vector or chiral current, is the conserved current associated to the chiral symmetry or axial symmetry of a system.
Origin
According to Noether's theorem, each symmetry of a system is associated with a conserved quantity. For example, the rotational invariance of a system implies the conservation of its angular momentum, and spacetime translation invariance implies the conservation of energy–momentum. In quantum field theory, internal symmetries also result in conserved quantities. For example, the U(1) gauge transformation of QED implies the conservation of the electric charge. Likewise, if a theory possesses an internal chiral or axial symmetry, there will be a conserved quantity, which is called the axial charge. Further, just as the motion of an electrically charged particle produces an electric current, a moving axial charge constitutes an axial current.
Definition
The axial current resulting from the motion of an axially charged moving particle is formally defined as $j^\mu_5 = \bar\psi \gamma^\mu \gamma^5 \psi$, where the particle field is represented by the Dirac spinor $\psi$ (since the particle is typically a spin-1/2 fermion) and $\gamma^\mu$ and $\gamma^5$ are the Dirac gamma matrices.
For comparison, the electromagnetic current produced by an electrically charged moving particle is $j^\mu = \bar\psi \gamma^\mu \psi$.
Meaning
As explained above, the axial current is simply the equivalent of the electromagnetic current for the axial symmetry instead of the U(1) symmetry. Another perspective is given by recalling that the
chiral symmetry is the invariance of the theory under the field rotation $\psi_L \rightarrow e^{i\theta}\psi_L$ and $\psi_R \rightarrow \psi_R$ (or alternatively $\psi_L \rightarrow \psi_L$ and $\psi_R \rightarrow e^{i\theta}\psi_R$), where $\psi_L$ denotes a left-handed field and $\psi_R$ a right-handed one.
From this, as well as the fact that $\psi = \psi_L + \psi_R$ and the definition of $j^\mu_5$ above, one sees that the axial current is the difference between the current due to left-handed fermions and that from right-handed ones, whilst the electromagnetic current is the sum.
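The left/right decomposition stated above can be made explicit with chiral projectors. This is a standard textbook computation added here for illustration; the sign convention for $\gamma^5$, and hence which of the left- and right-handed currents enters with a minus sign, varies between references.

```latex
% Chiral projectors (convention \gamma^5 \psi_{R,L} = \pm\psi_{R,L}):
P_{L,R} = \tfrac{1}{2}\left(1 \mp \gamma^5\right), \qquad
\psi_{L,R} = P_{L,R}\,\psi, \qquad \psi = \psi_L + \psi_R .
% Cross terms vanish, e.g.
\bar\psi_L \gamma^\mu \psi_R
  = \bar\psi\, P_R\, \gamma^\mu P_R\, \psi
  = \bar\psi\, \gamma^\mu P_L P_R\, \psi = 0 ,
% so the vector current is the sum and the axial current the difference:
j^\mu   = \bar\psi_L \gamma^\mu \psi_L + \bar\psi_R \gamma^\mu \psi_R , \qquad
j^\mu_5 = \bar\psi_R \gamma^\mu \psi_R - \bar\psi_L \gamma^\mu \psi_L .
```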
Chiral symmetry is exhibited by vector gauge theories with massless fermions. Since there is no known massless fermion in nature, chiral symmetry is at best an approximate symmetry in fundamental theories, and the axial current is not conserved. (Note: this explicit breaking of the chiral symmetry by non-zero masses is not to be confused with the spontaneous chiral symmetry breaking that plays a dominant role in hadronic physics.) An important consequence of such non-conservation is the neutral pion decay and the chiral anomaly, which is directly related to the pion decay width.
Applications
The axial current is an important part of the formalism describing high-energy scattering reactions. In such reaction, two particles scatter off each other by exchanging a force boson, e.g., a photon for electromagnetic scattering (see the figure).
The cross-section for such a reaction is proportional to the square of the scattering amplitude, which in turn is given by the product of the boson propagator times the two currents associated with the motions of the two colliding particles. Therefore, currents (axial or electromagnetic) are one of the two essential ingredients needed to compute high-energy scattering, the other being the boson propagator.
In electron–nucleon scattering (or more generally, charged lepton–hadron/nucleus scattering) the axial current yields the spin-dependent part of the cross-section. (The spin-average part of the cross-section comes from the electromagnetic current.)
In neutrino–nucleon scattering, neutrinos couple only via the axial current, thus accessing different nucleon structure information than with charged leptons.
Neutral pions also couple only via the axial current because pions are pseudoscalar particles and, to produce amplitudes (scalar quantities), a pion must couple to another pseudoscalar object like the axial current. (Charged pions can also couple via the electromagnetic current.)
See also
Chiral anomaly
Chiral symmetry breaking
Chiral perturbation theory
Chiral magnetic effect
Parity (physics)
QCD
References
Physical quantities
Quantum field theory
Particle physics
Nuclear physics
Quantum chromodynamics
Standard Model
Conservation equations
Conservation laws
Symmetry
Four-vectors | Axial current | [
"Physics",
"Mathematics"
] | 830 | [
"Physical phenomena",
"Physical quantities",
"Four-vectors",
"Quantum mechanics",
"Mathematical objects",
"Particle physics",
"Nuclear physics",
"Equations of physics",
"Quantity",
"Vector physical quantities",
"Geometry",
"Symmetry",
"Physics theorems",
"Standard Model",
"Quantum field ... |
72,988,654 | https://en.wikipedia.org/wiki/Vogel%E2%80%93Johnson%20agar | Vogel–Johnson agar is a type of agar growth medium selective for coagulase-positive staphylococci. It is used to isolate Staphylococcus aureus from clinical specimens and food. It was first described by Vogel and Johnson, who modified the Tellurite Glycine Agar recipe by Zebovitz et al. by doubling the mannitol concentration to 1% (w/v) and adding Phenol red as a pH indicator. It is widely available commercially.
Typical composition
Vogel–Johnson agar typically contains:
Tryptone - 10.0 g/L
Yeast extract - 5.0 g/L
Mannitol - 10.0 g/L
Dipotassium phosphate - 5.0 g/L
Lithium chloride - 5.0 g/L
Glycine - 10.0 g/L
Phenol red - 0.025 g/L
Agar - 16.0 g/L
The Modified Vogel-Johnson Agar contains, in addition to the above: 5 g/L of beef extract, 2 g/L of deoxyribonucleic acid, 2 g/L of phosphatidyl choline and 780 units/L of catalase spread on the plates before inoculation.
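As a convenience sketch (added here, not from the source), the per-litre composition above can be scaled to an arbitrary batch volume; the dictionary and function names are my own.

```python
# Typical Vogel-Johnson agar composition, grams per litre (from the list above).
VJ_AGAR_G_PER_L = {
    "tryptone": 10.0,
    "yeast extract": 5.0,
    "mannitol": 10.0,
    "dipotassium phosphate": 5.0,
    "lithium chloride": 5.0,
    "glycine": 10.0,
    "phenol red": 0.025,
    "agar": 16.0,
}

def batch_amounts(volume_l: float) -> dict:
    """Grams of each ingredient needed to prepare `volume_l` litres of medium."""
    return {name: round(g_per_l * volume_l, 4)
            for name, g_per_l in VJ_AGAR_G_PER_L.items()}
```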
References
Microbiological media | Vogel–Johnson agar | [
"Biology"
] | 271 | [
"Microbiological media",
"Microbiology equipment"
] |
72,988,820 | https://en.wikipedia.org/wiki/Garnier%20integrable%20system | In mathematical physics, the Garnier integrable system, also known as the classical Gaudin model is a classical mechanical system
discovered by René Garnier in 1919 by taking the 'Painlevé simplification' or 'autonomous limit' of the Schlesinger equations. It is a classical analogue to the quantum Gaudin model due to Michel Gaudin (similarly, the Schlesinger equations are a classical analogue to the Knizhnik–Zamolodchikov equations). The classical Gaudin models are integrable.
They are also a specific case of Hitchin integrable systems, when the algebraic curve that the theory is defined on is the Riemann sphere and the system is tamely ramified.
As a limit of the Schlesinger equations
The Schlesinger equations are a system of differential equations for matrix-valued functions $A_i(\lambda_1, \dots, \lambda_n)$, given by
$$\frac{\partial A_i}{\partial \lambda_j} = \frac{[A_i, A_j]}{\lambda_i - \lambda_j} \quad (j \neq i), \qquad \frac{\partial A_i}{\partial \lambda_i} = -\sum_{j \neq i} \frac{[A_i, A_j]}{\lambda_i - \lambda_j}.$$
The 'autonomous limit' is given by replacing the $\lambda_i$ dependence in the denominator by constants $\alpha_i$ with $\alpha_i \neq \alpha_j$ for $i \neq j$:
$$\frac{\partial A_i}{\partial \lambda_j} = \frac{[A_i, A_j]}{\alpha_i - \alpha_j} \quad (j \neq i), \qquad \frac{\partial A_i}{\partial \lambda_i} = -\sum_{j \neq i} \frac{[A_i, A_j]}{\alpha_i - \alpha_j}.$$
This is the Garnier system in the form originally derived by Garnier.
As the classical Gaudin model
There is a formulation of the Garnier system as a classical mechanical system, the classical Gaudin model, which quantizes to the quantum Gaudin model and whose equations of motion are equivalent to the Garnier system. This section describes this formulation.
As for any classical system, the Gaudin model is specified by a Poisson manifold referred to as the phase space, and a smooth function on the manifold called the Hamiltonian.
Phase space
Let $\mathfrak{g}$ be a quadratic Lie algebra, that is, a Lie algebra with a non-degenerate invariant bilinear form $\langle \cdot, \cdot \rangle$. If $\mathfrak{g}$ is complex and simple, this can be taken to be the Killing form.
The dual, denoted $\mathfrak{g}^*$, can be made into a linear Poisson structure by the Kirillov–Kostant bracket.
The phase space of the classical Gaudin model is then the Cartesian product of $N$ copies of $\mathfrak{g}^*$ for $N$ a positive integer.
Sites
Associated to each of these copies is a point in $\mathbb{C}$, denoted $\lambda_i$; these points are referred to as sites.
Lax matrix
Fixing a basis $\{t^a\}$ of the Lie algebra with structure constants $f^{ab}{}_c$, there are functions $X^{(i)}_a$ with $i = 1, \dots, N$ on the phase space satisfying the Poisson bracket
$$\{X^{(i)}_a, X^{(j)}_b\} = \delta^{ij} f^{ab}{}_c\, X^{(i)}_c.$$
These in turn are used to define $\mathfrak{g}$-valued functions $X^{(i)} = X^{(i)}_a t^a$, with implicit summation over $a$.
Next, these are used to define the Lax matrix, which is also a $\mathfrak{g}$-valued function on the phase space which in addition depends meromorphically on a spectral parameter $\lambda$,
$$\mathcal{L}(\lambda) = \sum_{i=1}^{N} \frac{X^{(i)}}{\lambda - \lambda_i} + \Omega,$$
where $\Omega$ is a constant element in $\mathfrak{g}$, in the sense that it Poisson commutes (has vanishing Poisson bracket) with all functions.
(Quadratic) Hamiltonian
The (quadratic) Hamiltonian is
$$\mathcal{H}(\lambda) = \frac{1}{2} \langle \mathcal{L}(\lambda), \mathcal{L}(\lambda) \rangle,$$
which is indeed a function on the phase space, which is additionally dependent on a spectral parameter $\lambda$. This can be written as
$$\mathcal{H}(\lambda) = \sum_{i=1}^{N} \left( \frac{\mathcal{H}_i}{\lambda - \lambda_i} + \frac{\Delta_i}{(\lambda - \lambda_i)^2} \right) + \frac{1}{2} \langle \Omega, \Omega \rangle,$$
with
$$\mathcal{H}_i = \sum_{j \neq i} \frac{\langle X^{(i)}, X^{(j)} \rangle}{\lambda_i - \lambda_j} + \langle \Omega, X^{(i)} \rangle$$
and
$$\Delta_i = \frac{1}{2} \langle X^{(i)}, X^{(i)} \rangle.$$
From the Poisson bracket relation
$$\{\mathcal{H}(\lambda), \mathcal{H}(\mu)\} = 0,$$
by varying $\lambda$ and $\mu$ it must be true that the $\mathcal{H}_i$'s, the $\Delta_i$'s and $\langle \Omega, \Omega \rangle$ are all in involution. It can be shown that the $\Delta_i$'s and $\langle \Omega, \Omega \rangle$ Poisson commute with all functions on the phase space, but the $\mathcal{H}_i$'s do not in general. These are the conserved charges in involution for the purposes of Arnold–Liouville integrability.
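As a minimal worked example (added here for illustration; the two-site case is not discussed in the text above), take $N = 2$ sites and $\Omega = 0$, with the quadratic Hamiltonians $\mathcal{H}_i = \sum_{j \neq i} \langle X^{(i)}, X^{(j)} \rangle / (\lambda_i - \lambda_j)$:

```latex
% Two sites, \Omega = 0: there is only one independent quadratic Hamiltonian.
\mathcal{H}_1 = \frac{\langle X^{(1)}, X^{(2)} \rangle}{\lambda_1 - \lambda_2},
\qquad
\mathcal{H}_2 = \frac{\langle X^{(2)}, X^{(1)} \rangle}{\lambda_2 - \lambda_1}
              = -\,\mathcal{H}_1 ,
% so \mathcal{H}_1 + \mathcal{H}_2 = 0 and the involution
% \{\mathcal{H}_1, \mathcal{H}_2\} = -\{\mathcal{H}_1, \mathcal{H}_1\} = 0
% holds trivially.
```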
Lax equation
One can show that
$$\{\mathcal{H}_i, \mathcal{L}(\lambda)\} = [M_i(\lambda), \mathcal{L}(\lambda)]$$
for suitable $\mathfrak{g}$-valued functions $M_i(\lambda)$, so the Lax matrix satisfies the Lax equation when time evolution is given by any of the Hamiltonians $\mathcal{H}_i$, as well as any linear combination of them.
Higher Hamiltonians
The quadratic Casimir corresponds to a quadratic Weyl-invariant polynomial for the Lie algebra $\mathfrak{g}$, but in fact many more commuting conserved charges can be generated using $\mathfrak{g}$-invariant polynomials. These invariant polynomials can be found using the Harish-Chandra isomorphism in the case that $\mathfrak{g}$ is complex, simple and finite-dimensional.
Integrable field theories as classical Gaudin models
Certain integrable classical field theories can be formulated as classical affine Gaudin models, where $\mathfrak{g}$ is an affine Lie algebra. Such classical field theories include the principal chiral model, coset sigma models and affine Toda field theory. As such, affine Gaudin models can be seen as a 'master theory' for integrable systems, but are most naturally formulated in the Hamiltonian formalism, unlike other master theories like four-dimensional Chern–Simons theory or anti-self-dual Yang–Mills.
Quantum Gaudin models
A great deal is known about the integrable structure of quantum Gaudin models. In particular, Feigin, Frenkel and Reshetikhin studied them using the theory of vertex operator algebras, showing the relation of Gaudin models to topics in mathematics including the Knizhnik–Zamolodchikov equations and the geometric Langlands correspondence.
References
Classical mechanics | Garnier integrable system | [
"Physics"
] | 965 | [
"Mechanics",
"Classical mechanics"
] |
72,989,088 | https://en.wikipedia.org/wiki/Environmental%20epigenetics | Environmental epigenetics is a branch of epigenetics that studies the influence of external environmental factors on the gene expression of a developing embryo. The way that genes are expressed may be passed down from parent to offspring through epigenetic modifications, although environmental influences do not alter the genome itself.
During embryonic development, epigenetic modifications determine which genes are expressed, which in turn determines the embryo's phenotype. When the offspring is still developing, genes can be turned on and off depending on exposure to certain environmental factors. While certain genes being turned on or off can increase the risk of developmental diseases or abnormal phenotypes, there is also the possibility that the phenotype will be non-functional. Environmental influence on epigenetics is highly variable, but certain environmental factors can greatly increase the risk of detrimental diseases being expressed at both early and adult life stages.
Environmental triggers for epigenetic change
The way that genes are expressed is influenced by the environment that the genome is in. These environmental influences are referred to as triggers and can involve anything that influences normal gene expression. How the genome is expressed depends on the environmental factors present during gestation. It is possible for the environmental effects of epigenetics to be deleterious or to be a natural part of the development pathway. When these environmental factors are detrimental, they can cause the deactivation of some DNA sequences, which can lead to atypical phenotypes. Some of the most common triggers include diet, temperature, exposure to harmful substances, and lifestyle. These triggers can cause low birth weight, neurological disorders, cancers, autoimmune diseases, and many other malformations.
These epigenetic triggers can change the way that an organism develops and have lifelong effects. Epigenetic changes can be passed down through offspring, present in multiple generations, and continue to change throughout a lifetime. Each sequence of affected DNA is not expressed at the same time. There are specific stages in which these expressions happen during development. The combined epigenetic mechanisms of DNA methylation and histone modification are responsible for how the genome is altered. For example, the suppression of oncogenes is regulated by DNA methylation and whether or not methylation is activated. This activation depends on the environment. When these activations happen, they can be passed down to the next generation through germ-cell differentiation.
Examples of triggers
Nutrients
Offspring can experience phenotypic changes depending on their access to nutrients. When nutrients are limited during pregnancy, the offspring's phenotypic expression can be disrupted. Nutrient intake is also important during lactation for the purpose of transferring nutrients to the offspring.
Stress during pregnancy
When the mother is exposed to stressors, DNA expression in the offspring is more likely to be altered or stunted. If the mother experiences high levels of depression or stress, it can lead to small litter sizes with lower birth rates. Decreased hormone production is the suspected cause of this.
Temperature
Changes in temperature can have varied effects on an organism. DNA methylation can be impacted by temperature when the temperature deviates from its normal value, preventing regular processes from taking place. Temperature can be considered a stressor in environmental epigenetics since it has the potential to change how offspring respond and react to their environment. Monarch butterflies are an example of how temperature can impact the survival and fitness of an organism. If exposed to stressors such as varying temperatures, these butterflies may express coloring that deviates from their normal color.
Lifestyle choices
Epigenetic marks can result from a number of exposures and choices made by an individual in their lifetime. Exposure to environmental pollutants, psychological stress, dietary choices or restrictions, working habits, and consumption of drugs or alcohol all influence the epigenetics of an individual and what may be passed down to future offspring. Such exposures can affect important processes of epigenetics such as DNA methylation and histone acetylation, influencing the risk for noncommunicable diseases such as obesity.
Exposure
Exposure to certain materials or chemicals can cause an epigenetic reaction. Such substances can alter DNA methylation, CpG islands, chromatin structure, and transcription factor activity. Environmental epigenetics aims to relate such environmental triggers or substances to phenotypic variation. Numerous studies have demonstrated how exposure to environmental pollutants, such as heavy metals, pesticides, and air pollutants, can induce epigenetic changes in various organisms. For example, research has shown that exposure to pollutants like bisphenol A (BPA) and polycyclic aromatic hydrocarbons (PAHs) can lead to DNA methylation changes and histone modifications in plants, animals, and humans.
Epigenetic mechanisms play a role in the adaptation of species to changing environmental conditions, including climate change. Studies have shown that organisms can exhibit phenotypic plasticity through epigenetic modifications in response to environmental stressors such as temperature fluctuation, drought, and habitat loss.
Environmental epigenetics has revealed the potential for transgenerational effects, where environmental exposures experienced by one generation can influence the phenotypes and health outcomes of subsequent generations through epigenetic inheritance mechanisms. Studies in various organisms, including plants, insects, and mammals, have shown transgenerational epigenetic effects resulting from parental exposure to stressors such as toxins, dietary changes, and environmental contaminants. Epigenetic modifications can influence gene expression and phenotypic traits in organisms across different trophic levels, with implications for ecosystem stability.
Lemon sharks
An example of exposure causing environmental epigenetic change can be seen in lemon sharks, Negaprion brevirostris. Due to a dredging event, lemon sharks in the Bahamas experienced an epigenetic change. Dredging is done with a machine that clears out all mud and debris found at the bottom of a body of water, and it is extremely harmful to the physical environment and the organisms living there. This dredging caused exposure to toxic metals such as manganese, along with trace amounts of other heavy metals, which then affected DNA methylation in juvenile lemon sharks. The sharks' DNA methylated at abnormal rates, altering gene expression. Scientists hypothesized that this aberrant DNA methylation could be caused by the stress of the dredging; exposure to a stressful event is itself an example of an environmental epigenetic factor.
Plants
Plants need some types of metals for their development, but when exposed to high amounts, the metals become toxic. Because plants take up the metals they need for growth, they also take up metals that serve no developmental purpose. Once metal levels become too high, they affect plants directly and indirectly, since metals cannot be broken down. Direct toxic effects of high metal levels include inhibition of cytoplasmic enzymes and damage to cell structure caused by oxidative stress, in which excess reactive species damage the cell and displace the essential nutrients the plant needs. Indirect toxic effects occur when metals displace nutrients at the plant's cation exchange sites, where the plant normally picks up the nutrients it requires; heavy metals occupying these sites leave nothing to support growth. An example of a heavy metal that is not required for plant growth is mercury. The relationship between Hg in the soil and plant uptake is complex, depending on factors such as soil pH and plant species.
Many types of heavy metals are toxic to plants, such as lead. Typically, land plants absorb lead (Pb) from the soil, most retaining it in their roots, with some evidence of foliar uptake and potential distribution to other plant parts. Calcium and phosphorus can reduce the uptake of lead, a common and toxic soil element that impacts plant growth, structure, and photosynthesis. Lead in particular inhibits seed germination, the process by which a plant grows from a seed into a seedling, in various species by interfering with crucial enzymes. Studies have shown that lead acetate considerably reduces protease and amylase activity in rice endosperm. This interferes with early seedling growth across plant species such as soybean, rice, tomato, barley, maize, and some legumes.
Furthermore, lead delays root and stem elongation and leaf expansion, with the extent of root elongation inhibition varying based on the lead concentration, the medium's ionic composition, and pH. Soils with high lead levels can also cause irregular root thickening, cell wall modifications in peas, growth reduction in sugar beets, oxidative stress due to increased reactive oxygen species (ROS) production, reduced biomass and protein content in maize, diminished leaf count and area and plant height in Portia trees, and reduced enzyme activity affecting CO2 fixation in oats.
Manganese (Mn) is crucial for plants and is involved in photosynthesis and other physiological processes. Deficiency commonly affects sandy or organic soils with a pH above six, as well as heavily weathered tropical soils. Mn moves easily from roots to shoots, though it is not efficiently redistributed from leaves to other parts of the plant. The signs of Mn toxicity are necrotic brown spots on leaves, petioles, and stems that start on the lower leaves and move upward, leading to death. Damage to young leaves and stems, coupled with chlorosis and browning, is called "crinkle leaf." In some species, toxicity can begin with chlorosis in older leaves, advancing to younger ones, and can inhibit chlorophyll synthesis by interfering with iron-related processes. Mn toxicity is more prevalent in soils with a pH below six. In the broad bean, Mn toxicity reduces shoot and root length. In spearmint, it lowers chlorophyll and carotenoid levels and increases root Mn accumulation. In the pea plant, it lowers chlorophyll a and b, growth rate, and photosynthesis. In the tomato plant, it slows growth and decreases chlorophyll concentration.
Humans
Humans have displayed evidence of epigenetic changes such as DNA methylation, differentiation in expression, and histone modification due to environmental exposures. Carcinogen development in humans has been studied in correlation to environmental inducements such as chemical and physical exposures and their transformative abilities on epigenetics. Chemical and physical environmental factors are contributors to epigenetic statuses amongst humans.
Firstly, a study was performed on populations in China exposed to arsenic through drinking water, involving three generations: the F1 generation of grandparents exposed to arsenic in adulthood, the F2 generation of parents exposed in utero and in early childhood, and the F3 generation of grandchildren exposed via germ cells. This area of China was historically known for its dangerously high levels of arsenic, so there was an opportunity to examine the timeline of As exposure across the three generations. The study was conducted to discover the link between the timing of As exposure and DNA methylation. The population and environment studied were reportedly not subject to other environmental exposures besides arsenic.
The results concluded from this experiment were that 744 CpG sites had been differentially methylated. The 744 sites were found across all three generations in the group exposed to arsenic. The concluding argument based on the results of this study is that the DNA methylation changes were more prevalent in those that developed arsenic-induced diseases.
Exposure to environmental factors during human lifetimes and its potential effect on phenotypes is a much-debated topic involving epigenetics and disease development. In humans, "unhealthy" phenotypes have been identified that carry evidence that environmental epigenetics could be a leading factor in gene regulation, disease development, cell development and differentiation, aging, and carcinogenic effects. Although the way that environmental factors and the human genome work together is not completely understood, their influence has been identified and continues to drive explanations for human genome modifications and their outcomes. Driving evidence for adverse effects caused by extrinsic environmental factors comes from studies on nutrition and exposure to toxins.
Besides arsenic exposure, other metals have been identified as causes of such hypermethylation. Concentrations of Cd, Cr, Cu, Pb, and Zn were identified in fishermen's blood and were associated with increased expression of the IGF2 gene. The IGF2 gene is responsible for making insulin-like growth factor 2, which is involved in growth and can result in disorders of abnormal cell growth and overgrowth. Such disorders include breast and lung cancers and Silver–Russell and Beckwith–Wiedemann syndromes. The significance of IGF2 gene expression lies in its relationship to human health. Uncertainty remains about the link between long-term environmental exposures and epigenetic changes, but research has shown that heavy metal exposure causes DNA methylation changes.
Exposures to certain triggers such as alcohol or drugs can disrupt the normal expression of the offspring's phenotype. Antipsychotic drugs can lead to abnormal or stunted development during the fetal or embryonic stages.
Multigenerational epigenetic inheritance
Organisms respond to the habitat around them in many different ways; one way is by changing their gene expression to whatever is most suitable for their surroundings. More often than not, this has a direct correlation to phenotypic plasticity, in which a species develops new physical features in response to the environment it is in. Because epigenetic marks persist through mitotic cell divisions, it is plausible that they can also be passed down from parents to offspring. In these cases, parents could be responsible for the development of new phenotypes.
Epigenetic inheritance
Epigenetic inheritance refers to passing down or transferring epigenetic information between the parent and offspring. Some believe that these occurrences can be passed down for many generations. For example, the language a given species utilizes develops a specific phenotype that will be passed down from generation to generation.
Cultural inheritance
Cultural inheritance is the transmission of behavior from generation to generation, analogous to genetic inheritance. For example, rat mothers that lick and groom their pups pass down this behavior to their offspring, which go on to do the same with the subsequent generation. Epigenetic inheritance is involved in this, but the two are distinct processes that work together.
Mechanisms influencing epigenetics
DNA replication
DNA replication is a highly conserved process involving the copying of genetic information from one generation to the next. Within this complex process, chromatin disassembles and reassembles in a precise and regulated manner in order to compact large amounts of genetic material into the nucleus, while also maintaining the integrity of epigenetic information carried by histone proteins bound to DNA during cell division. Half of the histones present during replication come from the chromatin of the parent DNA and thus carry the parent's epigenetic information. These epigenetic marks play a critical role in determining chromatin structure, and thus gene expression, in the newly synthesized DNA. The other half of the histones present in replication are newly synthesized.
DNA methylation
A major formative mechanism of epigenetic modification is DNA methylation. DNA methylation is the process of adding a methyl group to a cytosine base in the DNA strand, via covalent bond. This process is carried out by specific enzymes. These methyl additions can be reversed in a process known as demethylation. The presence or absence of methyl groups can attract proteins involved in gene repression, or inhibit the binding of certain transcription factors, thus preventing methylated genes from being transcribed, ultimately affecting phenotypic expression.
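In vertebrates, this methylation occurs predominantly at CpG dinucleotides (a cytosine immediately followed by a guanine, as in the CpG islands mentioned earlier). A toy script for locating candidate methylation sites in a sequence — the function name and example sequence are illustrative, not from the article:

```python
def cpg_sites(seq):
    """Return 0-based positions of CpG dinucleotides, the cytosines
    to which a methyl group can be covalently added."""
    seq = seq.upper()
    return [i for i in range(len(seq) - 1) if seq[i:i + 2] == "CG"]

sequence = "ATCGGCGTACGATCG"  # made-up example sequence
print(cpg_sites(sequence))   # -> [2, 5, 9, 13]
```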
Acetylation
Acetylation is a reaction that introduces an acetyl group into an organic chemical compound, typically by substituting an acetyl group for a hydrogen atom. Deacetylation is the removal of an acetyl group from an organic chemical compound. Histone acetylation and deacetylation affect the three-dimensional structure of chromatin. A more relaxed chromatin structure leads to greater rates of genetic transcription, whereas a tighter structure inhibits transcription.
Transcriptional regulation
Transcriptional regulation is a complex process involving the binding of transcriptional machinery to regulatory proteins—specifically chromatin remodeling or modifying proteins—directly onto a specific target. This may sometimes be facilitated by the contribution of accessory complexes that function primarily to repress and activate transcription in a cell. Transcriptional regulation additionally focuses on the epigenetic regulation of a target locus, as the epigenetic status of the locus determines either the facilitation of or prohibition of transcription. Epigenetic regulation is necessary for the precise deployment of transcriptional programs.
References
Epigenetics
Developmental biology
Embryology | Environmental epigenetics | [
"Biology"
] | 3,560 | [
"Behavior",
"Developmental biology",
"Reproduction"
] |
72,990,081 | https://en.wikipedia.org/wiki/Big-line-big-clique%20conjecture | The big-line-big-clique conjecture is an unsolved problem in discrete geometry, stating that finite sets of many points in the Euclidean plane either have many collinear points, or they have many points that are all mutually visible to each other (no third point blocks any two of them from seeing each other).
Statement and history
More precisely, the big-line-big-clique conjecture states that, for any positive integers k and ℓ, there should exist another number n, such that every set of n points contains ℓ collinear points (a "big line"), k mutually-visible points (a "big clique"), or both.
The big-line-big-clique conjecture was posed by Jan Kára, Attila Pór, and David R. Wood in a 2005 publication. It has led to much additional research on point-to-point visibility in point sets.
Partial results
Finite point sets in general position (no three collinear) do always contain a big clique, so the conjecture is true for ℓ = 3. Additionally, finite point sets that have no five mutually-visible points (such as the intersections of the integer lattice with convex sets) do always contain many collinear points, so the conjecture is true for k ≤ 5.
Generalizing the integer lattice example, projecting a -dimensional system of lattice points of size
onto the plane, using a generic linear projection, produces a set of points with no collinear points and no mutually visible points. Therefore, when exists, it must be greater than .
Related problems
The visibilities among any system of points can be analyzed by using the visibility graph of the points, a graph that has the points as vertices and that connects two points by an edge whenever the line segment connecting them is disjoint from the other points. The "big cliques" of the big-line-big-clique conjecture are cliques in the visibility graph. However, although a system of points that is entirely collinear can be characterized by having a bipartite visibility graph, this characterization does not extend to subsets of points: a subset can have a bipartite induced subgraph of the visibility graph without being collinear.
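For small integer point sets, the visibility graph described above can be built by brute force: join two points unless some third point lies strictly inside the segment between them. A minimal sketch (function and variable names are my own, not from the literature):

```python
from itertools import combinations

def collinear(p, q, r):
    """True if p, q, r lie on a common line (zero cross product)."""
    return (q[0] - p[0]) * (r[1] - p[1]) == (q[1] - p[1]) * (r[0] - p[0])

def blocked(p, q, r):
    """True if r lies strictly inside the open segment pq,
    blocking p and q from seeing each other."""
    if r == p or r == q or not collinear(p, q, r):
        return False
    return (min(p[0], q[0]) <= r[0] <= max(p[0], q[0])
            and min(p[1], q[1]) <= r[1] <= max(p[1], q[1]))

def visibility_graph(points):
    """Edge set of the visibility graph: two points are joined
    when no third point blocks the segment between them."""
    return {
        frozenset((p, q))
        for p, q in combinations(points, 2)
        if not any(blocked(p, q, r) for r in points)
    }
```

For the three collinear points (0,0), (1,0), (2,0), the middle point blocks the outer pair, so the graph has only two edges, while three points in general position form a triangle of mutually visible points.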
According to the solution of the happy ending problem, every subset of points with no three in line includes a large subset forming the vertices of a convex polygon. More generally, it can be proven using the same methods that every set of sufficiently many points either includes collinear points or points in convex position. However, some of these pairs of convex points could be blocked from visibility by points within the convex polygon they form.
Another related question asks whether points in general position (or with no lines of more than some given number of points) contain the vertices of an empty convex polygon or hole. This is a polygon whose vertices belong to the point set, but that has no other points in the intersection of the point set with its convex hull. If a hole of a given size exists, its vertices all necessarily see each other. All sufficiently large sets of points in general position contain five vertices forming an empty pentagon or hexagon, but there exist arbitrarily large sets in general position with no empty heptagons.
A strengthening of the big line big clique conjecture asks for the big clique to be a "visible island", a set of points that are mutually visible and that are formed from the given larger point set by intersecting it with a convex set. However, this strengthened version is false: if a point set in general position has no empty heptagon, then replacing each of its points by a closely-spaced triple of collinear points produces a point set with no four in a line and with no visible islands of 13 or more points.
There is no possibility of an infinitary version of the same conjecture: Pór and Wood found examples of countable sets of points with no four points on a line and with no triangle of mutually visible points.
References
Conjectures
Discrete geometry | Big-line-big-clique conjecture | [
"Mathematics"
] | 805 | [
"Discrete mathematics",
"Unsolved problems in mathematics",
"Discrete geometry",
"Conjectures",
"Mathematical problems"
] |
72,990,274 | https://en.wikipedia.org/wiki/Sharkbook | Sharkbook is a global database for identifying and tracking sharks, particularly whale sharks, using uploaded photos and videos. In addition to identifying and tracking sharks, the site allows people to "adopt a shark" and get updates on specific animals.
Creation
Sharkbook is the result of a collaboration between Simon J Pierce of the Marine Megafauna Foundation and Jason Holmberg of Wild Me. The software is open source and is now being used by other biology projects.
Identification of individual sharks
Whale sharks have unique spot patterning on their sides, similar to a human fingerprint, which allows for individual identification. Scuba divers around the world can photograph sharks and upload their identification photographs to the Sharkbook website, supporting global research and conservation efforts. Additionally, the software automatically searches social media sites like YouTube and Instagram to look for images of whale sharks and adds them to the database.
Sharkbook uses pattern-matching software to identify the unique spots on each shark. The algorithms were originally adapted from NASA star-tracking software used on the Hubble Space Telescope. The software uses a scale-invariant feature transform (SIFT) algorithm, which can cope with the complications presented by highly variable spot patterns and low-contrast photographs.
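Sharkbook's actual matcher is SIFT-based, but the core idea of comparing spot constellations in a way that is insensitive to where, and at what size, the shark appears in a photo can be illustrated with a simplified sketch. All names and the scoring rule below are illustrative assumptions, not Sharkbook's code, and rotation invariance is ignored:

```python
import math

def normalize(spots):
    """Translate a spot pattern to its centroid and rescale to unit
    spread, making comparison invariant to position and scale."""
    n = len(spots)
    cx = sum(x for x, _ in spots) / n
    cy = sum(y for _, y in spots) / n
    centered = [(x - cx, y - cy) for x, y in spots]
    scale = math.sqrt(sum(x * x + y * y for x, y in centered) / n) or 1.0
    return [(x / scale, y / scale) for x, y in centered]

def match_score(pattern_a, pattern_b):
    """Mean nearest-neighbour distance between two normalized spot
    patterns; lower scores mean a more likely match."""
    a, b = normalize(pattern_a), normalize(pattern_b)
    return sum(min(math.dist(p, q) for q in b) for p in a) / len(a)

# Example: the same shark photographed later, shifted and twice as large
shark = [(0, 0), (2, 1), (4, 0), (3, 3)]
resighted = [(10 + 2 * x, 5 + 2 * y) for x, y in shark]
```

Here `match_score(shark, resighted)` is essentially zero despite the shift and rescaling, while a genuinely different spot pattern scores higher.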
Purpose
This citizen science tool is free to use by researchers worldwide. Sharkbook represents a global initiative to centralize shark sightings and facilitate research on these vulnerable species.
See also
Manta Matcher - For Manta Rays
Flukebook - For whales and dolphins
References
Marine biology
Sharks
Citizen science | Sharkbook | [
"Biology"
] | 301 | [
"Marine biology"
] |
72,990,813 | https://en.wikipedia.org/wiki/Karen%20Lozano | Karen Lozano is a Mexican American researcher who is the Julia Beecherl Endowed Professor of Mechanical Engineering and Director of the Nanotechnology Center at the University of Texas Rio Grande Valley. She studies carbon nanofiber-reinforced thermoplastic composites. She was elected Fellow of the National Academy of Inventors in 2020 and the National Academy of Engineering in 2023.
Early life and education
Lozano was born in Mexico. Her mother was a seamstress. She studied mechanical engineering at the University of Monterrey and, the year she graduated, was the only woman to earn a degree in mechanical engineering. Researchers from Rice University visited Monterrey as part of an outreach project and recruited Lozano for a doctoral position. She was the first Latin American woman to earn a PhD from Rice.
Research and career
Lozano joined the faculty at the University of Texas–Pan American, where she worked on a new approach to mass-produce nanofibers. In 2009 she launched FibeRio, a company that could mass-produce nanofibers through a clean, cheap, and facile process coined Forcespinning. FibeRio makes use of Cyclone ForceSpinning Systems, which use centrifugal forces to pull nanofibers for industrial and medical applications. She took part in a roundtable discussion with Barack Obama about entrepreneurs in the United States.
In 2009, Lozano was awarded a National Science Foundation (NSF) award to build a partnership between the University of Texas Rio Grande Valley and University of Minnesota to create a materials science research center. The center looks to train undergraduate and graduate students from Hispanic backgrounds to pursue careers in materials science.
In 2023, Lozano was elected to the National Academy of Engineering (NAE), becoming the first UTRGV professor to receive this honor and one of only three Texans in that year's cohort of electees. The academy cited her "contributions to nanofiber research and commercialization and mentoring of undergraduate students from underserved populations."
Awards and honors
2002 Most Promising Scientist Award Hispanic Engineer National Achievement Association Conference
2011 University of Texas System Regents Teaching Award
2015 Engineer of the Year by Great Minds in STEM (Science, Technology, Engineering and Mathematics).
2017 Universidad de Monterrey Historias de Exito (Alumni Success Stories).
2017 Insight Into Diversity Inspiring Leaders in STEM Award.
2017 The University of Texas Rio Grande Valley Research Excellence Award.
2018 American Association of Hispanics in higher education Outstanding Research Award.
2018 Mexicanos Distinguidos Medal
2018 Latina of Influence
2019 Presidential Award for Excellence in Science, Mathematics, and Engineering Mentoring
2020 Elected to the National Academy of Inventors
2023 Elected to the National Academy of Engineering
2023 Honored with the Carnegie Corporation of New York's Great Immigrant Award.
Selected publications
References
Year of birth missing (living people)
Living people
Rice University alumni
University of Monterrey alumni
Fellows of the National Academy of Inventors
Members of the United States National Academy of Engineering
Mexican emigrants to the United States
Mechanical engineers
20th-century American engineers
21st-century American engineers
20th-century American women engineers
21st-century American women engineers
Mexican scientists | Karen Lozano | [
"Engineering"
] | 654 | [
"Mechanical engineers",
"Mechanical engineering"
] |
72,990,835 | https://en.wikipedia.org/wiki/Laminaria%20abyssalis | Laminaria abyssalis is a species of brown kelp, notable for its connection to rhodolith beds in the Brazilian coastline.
Distribution and ecology
Laminaria abyssalis is native to the Atlantic Ocean, off the coast of Brazil. It resides in a habitat spanning over , from upper Espírito Santo to mid Rio de Janeiro. It thrives in the waters of the continental shelf and intertidal zone, at depths of . The majority of these kelp take root in rhodolith beds, substrates formed by nodules of calcareous algae.
Morphology
The stipe of Laminaria abyssalis averages 14.3 centimeters in length, with an average width of 0.7 centimeters. The stipe supports an undivided blade, which averages 241 centimeters in length, 68 centimeters in width, and 0.65 centimeters in thickness. The holdfast, which attaches Laminaria abyssalis to the rhodolith bed in which it resides, has an average of 4 root-like extensions, each averaging 13.5 centimeters in length.
History
Laminaria abyssalis was first discovered in 1967 by A. B. Joly and E. C. Oliveira, who found it along the Brazilian coastline.
A later expedition was made by Quége N in 1988 to determine the many locations in which Laminaria abyssalis grows along the Brazilian coast.
In recent years, the kelp has been in a semi-endangered state, as sand trawling has become more common in Brazilian waters. This has disrupted the rhodolith beds in which Laminaria abyssalis grows, causing populations to decline. The RESTORESEAS project has begun collecting data and running experiments to help preserve these kelps.
References
Flora of Brazil
Laminariaceae
Seaweeds | Laminaria abyssalis | [
"Biology"
] | 375 | [
"Seaweeds",
"Algae"
] |
72,992,443 | https://en.wikipedia.org/wiki/HD%2075747 | HD 75747, also known as HR 3524 or RS Chamaeleontis (RS Cha), is a binary star located in the southern circumpolar constellation Chamaeleon. It has an average apparent magnitude of 6.05, making it barely visible to the naked eye. The system is located relatively close at a distance of 322 light years based on Gaia DR3 parallax measurements but is receding with a somewhat constrained heliocentric radial velocity of . It has an absolute magnitude of +1.21.
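The quoted absolute magnitude can be roughly cross-checked from the apparent magnitude and distance via the distance modulus, M = m − 5·log10(d/10 pc). This back-of-the-envelope sketch ignores interstellar extinction and parallax uncertainty, which plausibly account for the small gap from the quoted +1.21:

```python
import math

m = 6.05              # apparent magnitude (from the text)
d_ly = 322            # distance in light years (from the text)
d_pc = d_ly / 3.2616  # convert to parsecs

# Distance modulus: M = m - 5 * log10(d_pc / 10)
M = m - 5 * math.log10(d_pc / 10)
print(round(M, 2))  # -> 1.08, close to the quoted +1.21
```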
HD 75747 has been known to be variable since 1960, based on observations by A. W. J. Cousins. The system was first observed as an eclipsing binary in 1967 by astronomers P. A. T. Wild and H. C. Lagerweij. J. Andersen deduced a circular orbit with a period of 1.66 days for the system. Subsequent observations revealed that one of the components is a δ Scuti variable. RS Cha is an Algol-type eclipsing binary ranging from magnitude 6.02 to 6.58 or 6.68 over a 1.6699-day period, depending on the eclipse. The system is part of the η Chamaeleontis association, a group of young stars moving with Eta Cha, which lies just eight arc-minutes to the northwest of RS Cha.
Both components have a stellar classification of A8 IV, indicating that both objects are slightly evolved A-type subgiants. RS Cha A and B have masses nearly double the Sun's and 2.14 to 2.34 times the radius of the Sun. They radiate 17.3 times the luminosity of the Sun from their photospheres at effective temperatures of and respectively, giving them a white hue. RS Cha was originally thought to be 912 million years old, which would mean that both stars were evolving off the main sequence. However, astronomer E. Alecian and colleagues re-examined the age of the system and found that HD 75747 is only 9 million years old, making the components pre-main-sequence stars. The components' rotation periods are synchronized with the orbital period, with projected rotational velocities of and respectively.
References
Pre-main-sequence stars
A-type subgiants
Algol variables
Eclipsing binaries
Delta Scuti variables
Chamaeleon
Chamaeleontis, RS
CD-78 00342
075747
042794
3524
Chamaeleontis, 9 | HD 75747 | [
"Astronomy"
] | 516 | [
"Chamaeleon",
"Constellations"
] |
72,992,833 | https://en.wikipedia.org/wiki/Vibrissea%20truncorum | Vibrissea truncorum, the water club mushroom or aquatic earth-tongue, is a species of fungus in the family Vibrisseaceae. It is found throughout the Northern Hemisphere.
Taxonomy
Description
Water club mushroom is a fungus that grows up to about tall, with a cap wide. It is characterized by its yellow, orange, or reddish fruiting body, and white to bluish-gray stem, darkening to brown at the base.
Similar species
The heads of Leotia and Cudonia species are less yellow, while other Vibrissea species can lack stems.
Distribution and habitat
Vibrissea truncorum is a cosmopolitan mushroom, found throughout the Northern Hemisphere, including North America (in the Pacific Northwest and the Atlantic Northeast), Europe, Japan, and rarely in eastern Russia and Chile. It is most heavily concentrated in Norway, Sweden, Finland, and Western Russia, though it appears also in the United Kingdom and Southwestern Europe.
Water club mushroom often grows on partially or completely submerged wood in streams at higher elevations.
Ecology
References
Helotiales
Fungus species | Vibrissea truncorum | [
"Biology"
] | 220 | [
"Fungus stubs",
"Fungi",
"Fungus species"
] |
72,993,421 | https://en.wikipedia.org/wiki/Sanctioned%20Suicide | Sanctioned Suicide (SS, or SaSu) is an internet forum known for its open discussion and encouragement of suicide and suicide methods. The forum was founded on March 18, 2018, by Diego Joaquín Galante and Lamarcus Small, who go by the online pseudonyms Serge and Marquis. Galante and Small created the website after the subreddit r/SanctionedSuicide was banned by Reddit; both the website and the subreddit have been described as the successors to the Usenet newsgroup alt.suicide.holiday. The forum has over 50,000 members and was reported to receive nearly 10 million page views in September 2023. Although the forum frames itself as a "pro-choice" suicide forum, it has been widely described as "pro-suicide".
Sanctioned Suicide has generated widespread scrutiny from news outlets and government officials for the encouragement of suicide by members on the site, as well as the site's promotion of the use of sodium nitrite as a method of suicide, a previously obscure method. One New York Times report found 45 adults and children who died in connection to the site, and a later report found dozens more. BBC News has identified 50 people who died in connection to the site in the United Kingdom. Access to the forum has been restricted in Italy and Germany. Turkey has also blocked access to the site since November 2023.
History and background
The r/SanctionedSuicide subreddit and Sanctioned Suicide have been described as the successors of the Usenet newsgroup alt.suicide.holiday. On March 14, 2018, r/SanctionedSuicide was banned for breaking Reddit's rules on the promotion of violence, prompting Galante and Small to create the site on March 18, 2018. In January 2021, Sanctioned Suicide's original .com domain name was banned by the domain name registrar Epik, allegedly for the presence of minors on the site. Following the ban, the site moved to a different domain name.
The New York Times investigation
Journalists Megan Twohey and Gabriel Dance of The New York Times reportedly discovered the full names of the site founders during the October 2021 data breach of Epik. Following the breach, Twohey and Dance obtained photos of Galante and Small that matched previous appearances of their pseudonymous identities Serge and Marquis. When contacted by The New York Times, Small stated that he had no involvement with the website, suggested his brother may run the site, and denied his mother's name reportedly listed on police records. Galante acknowledged using the pseudonym Serge on the forum but denied founding or operating it, contradicting records on the site which described him as a co-founder and administrator of the website. After the two co-founders were named by The New York Times, Galante and Small announced their resignations as administrators, writing that they handed the forum over to a member going by the username RainAndSadness.
In an interview with the Poynter Institute, Twohey stated that the decision to name the website and the suicide methods promoted by the site were "two of the biggest ethical issues that we had ever dealt with". Following discussions with medical experts, law enforcement, and families, The New York Times team chose to name the website and the preservative once in the report, so as not to potentially raise the website's profile.
Galante and Small
Diego Joaquín Galante and Lamarcus Small describe themselves as incels and run a number of incel and manosphere related forums where members' discussions have been characterized as condoning, downplaying, or advocating violence against women. A September 2022 report from the Center for Countering Digital Hate described one of the forums as the largest forum dedicated to incel ideology. In response to a 2019 BuzzFeed News report which disclosed their connection to the incel forums, Small stated that the site's moderation was handled independently of the incel sites. Sanctioned Suicide has been noted as the only forum run by Galante and Small that does not restrict access by women. The New York Times investigation also reported that Small framed the site as part of a fight against censorship after the site received scrutiny for several deaths associated with the forum. An October 2023 BBC News investigation identified Small and confronted him at his home.
Site overview
The site is divided into three forums: Recovery, Suicide Discussion, and Offtopic. The recovery forum hosts recovery-related support discussions, the suicide discussion forum hosts discussions on suicide methods, and the offtopic forum hosts discussions on hobbies and other general interests. The suicide discussion forum is substantially more popular than the other two forums, including both detailed discussions of suicide methods and encouragement to commit suicide, which users refer to by the euphemism "catching the bus". The site also hosts live chats and private messaging. , the forum has over 50,000 members and was reported to receive nearly 10 million page views in September 2023.
An April 2023 study published by the Association for Computing Machinery found that most new users were active only in the first few weeks after making their accounts, and that their first posts were more likely to be about suicide and methods. According to an informal survey conducted on the site, half of the forum's userbase were 25 or younger. The forum has been widely described as pro-suicide, although the site frames itself as "pro-choice" and denies that it actively encourages suicide. While the site includes links to suicide hotlines and other mental health resources, Small noted that registrants who only sought the recovery forum would be unlikely to be approved. An investigation by the Australian ABC News noted that members have responded to attempts to direct people to hotlines or other supports with antagonism and accusations of being "pro-life".
Deaths
A December 2021 New York Times investigation identified 45 members who died by suicide in connection to the website, with a later report finding dozens more. In October 2023, BBC News identified 50 people who died in connection to the site in the United Kingdom. The New York Times investigation also identified more than 500 threads where users announced their suicide plans and did not continue to post. Among those identified by The New York Times was a 22-year-old woman from Glasgow who died by suicide after meeting a man on the site who had previously sexually assaulted and assisted in the suicides of several other women through the forum. In December 2022, he pled guilty to culpable and reckless conduct with a sexual element and is reportedly the first person in the UK convicted in connection to the site.
A PBS overview of the New York Times investigation specifically focuses on a male minor who was encouraged on the site to take his own life via the meat preservative sodium nitrite. His parents found him deceased on his bed. Prior to his death, the boy stated on the forum his worry about never recovering from an undiagnosed stomach ailment which was causing him pain after eating. While speaking about the boy, journalist Gabriel Dance stated "this Web site doesn't really help people with interventions, as much as it helps them carry out any kind of plans they have to kill themselves".
The New York Times also identified an Australian who died by suicide after members of Sanctioned Suicide taunted him and suggested he should film his death. An investigation by the Australian ABC News on the same death reported that his family believes the forum was the deciding factor in his death. Multiple parents of children who died by suicide after spending time on the site have publicly called on the forum to shut down, including a Facebook group with over 1,000 members. The moderators of the forum have since restricted the accounts of dead users to prevent family members or law enforcement from accessing them.
Response
In April 2019, the original .com domain name of Sanctioned Suicide was blocked by the Australian Federal Police under Section 313 of the Telecommunications Act 1997. Informal requests from the Australian eSafety commissioner Julie Inman Grant to Google and Microsoft Bing to delist the website from search results were declined, citing the need to balance safety with the open access of lawful information. Although both search engines have adjusted the site's ranking in its search results, the companies have stated that they will not delist the site absent a legal requirement. A November 2022 Australian ABC News investigation found that between 2017 and October 2020, there had been 20 suicide deaths from the meat preservative sodium nitrite in Australia, an increase from zero deaths across the previous 16 years. In response, the Australian Therapeutic Goods Administration reclassified the substance in 2022, further restricting its sale. Following the publication of the ABC investigation, which led some internet service providers (ISPs) to block the site, Sanctioned Suicide blocked access to the site in Australia, stating "anti-liberty countries will just be blocked".
In March 2020, the site was blocked from online search results in Germany. Prosecutors in Italy blocked access to the site in June 2021 following the deaths of two Italian teenagers by suicide. In December 2021, seven bipartisan members of the U.S. House of Representatives wrote to Attorney General Merrick Garland to clarify what action could be taken against the site under U.S. law. Following a statement from the House Committee on Energy and Commerce, Bing responded by lowering the ranking of the site in its search results; however, both Google and Bing declined requests to remove the site from search results absent a legal requirement. While many U.S. states have laws against assisting suicide, federal law provides immunity from liability for web operators for most user-generated content. April Foreman, a psychologist on the executive board of the American Association of Suicidology, argued that rather than block the site, better systems of support for people with suicidal ideation need to be created.
Following the October 2023 BBC News investigation, several British ISPs, including Sky Broadband, TalkTalk, BT and Virgin Media, announced that the site would be blocked on default safety controls. Administrators for the site partially restricted access to the site in the United Kingdom after being contacted by the British digital regulatory agency Ofcom. A banner on the site announced that new users from the UK will not be able to access content on the website that violates the Online Safety Act. In addition, audio streaming service Spotify disabled the site's Spotify social login, which the company states was added by a third party developer without their knowledge. In response to reactions in the UK, Small made a post on the web forum Kiwi Farms stating that restricting access to the site or "harass[ing] me isn't going to solve the mental health crisis".
In response to The New York Times investigation, Uruguayan law enforcement launched an investigation against Galante, who resides in Uruguay. However, sources from the prosecutor's office stated that "it is very difficult to establish a crime" since Uruguayan law requires personal involvement.
See also
Antinatalism
Assisted suicide
Death-positive movement
Internet censorship
Internet governance
Internet safety
Nihilism
Philosophy of suicide
Right to die
Suicide and the Internet
Suicide prevention
Voluntary euthanasia
Kenneth Law, a Canadian man who is alleged to have sold poison on the site
Notes
References
Further reading
Internet-related controversies
Internet ethics
Internet properties established in 2018
Suicide and the Internet
Internet forums | Sanctioned Suicide | [
"Technology"
] | 2,271 | [
"Internet ethics",
"Ethics of science and technology"
] |
59,385,967 | https://en.wikipedia.org/wiki/Azanide | Azanide is the IUPAC-sanctioned name for the anion NH2−. The term is obscure; derivatives of NH2− are almost invariably referred to as amides, despite the fact that amide also refers to the organic functional group −C(=O)NH2. The anion is the conjugate base of ammonia; it is formed by deprotonation of ammonia, usually with strong bases or an alkali metal, and also arises in the self-ionization of ammonia. Azanide has an H–N–H bond angle of 104.5°.
Alkali metal derivatives
The alkali metal derivatives are best known, although usually referred to as alkali metal amides. Examples include lithium amide, sodium amide, and potassium amide. These salt-like solids are produced by treating liquid ammonia with strong bases or directly with the alkali metals (blue liquid ammonia solutions due to the solvated electron):
2 M + 2 NH3 → 2 MNH2 + H2, where M = Li, Na, K
Silver(I) amide (AgNH2) is prepared similarly.
Transition metal complexes of the amido ligand are often produced by salt metathesis reaction or by deprotonation of metal ammine complexes.
References
Anions
Nitrogen hydrides | Azanide | [
"Physics",
"Chemistry"
] | 244 | [
"Ions",
"Matter",
"Anions"
] |
59,386,014 | https://en.wikipedia.org/wiki/Four-dimensional%20product | A four-dimensional product (4D product) considers a physical product as a life-like entity capable of changing form and physical properties autonomously over time. It is an evolving field of product design practice and research linked to similar concepts at the material scale (programmable matter and four-dimensional printing); however, 4D products typically utilize sensors and actuators in order to respond to environmental and human conditions, modifying the shape, color, character, and other physical properties of the product. In this way, 4D products share similarities with responsive architecture, at the more human scale associated with products.
History
The concept of imbuing products with similar life-like qualities has been an area of increasing research within academia and industry alike. However, researchers have used a variety of different terms to describe this research, for example transformational products, shape-changing, kinetic, or, in a more general sense, smart, connected, robotic, or having a level of artificial intelligence.
Within industry, commercial examples of products capable of adaptation have received some attention. In 2005 Adidas released the Adidas 1 shoe, which was capable of adjusting the compression characteristics in the heel with each stride, accommodating the different requirements of the foot during different activities like walking or running. More recently in 2016, Nike released the HyperAdapt 1.0 shoe, capable of self-lacing as the user puts their foot into it. Additional micro adjustments were possible using manual controls; however, the designers claim a longer-term vision for such products to come alive and respond in real time to user needs.
In 2008 BMW revealed a concept car called GINA which featured a fabric body stretched over a movable aluminium wire and carbon fiber frame, capable of flexing in certain areas to reveal details like door openings, or modify aerodynamic properties of the car in real time. The 2016 incarnation of this concept car, the BMW Vision Next 100, adopted similar capabilities with a more advanced flexible skin capable of expanding as the front wheels turn, reportedly reducing the drag coefficient of the car while cornering. Changes in product form can be used to improve product performance. While such a dynamic car body is yet to be seen on the mainstream market, elements of this transformation can be seen in modern Formula One racing cars. These vehicles have movable rear wing flaps to modify drag for overtaking in certain sections of a race (known as the Drag Reduction System or DRS). Consumer-level cars, like the Audi TT, are also capable of automatically increasing the rear spoiler angle at high speeds to increase traction and safety. This suggests these life-like movements are slowly finding their way into the mainstream.
See also
Actuator
Autonomous robot
Evolutionary robotics
Fourth Industrial Revolution
Industrial design
Responsive computer-aided design
Sensor
Smart wearable system
References
Further reading
Greenfield, Adam (2006). Everyware: The Dawning Age of Ubiquitous Computing. Berkeley, California USA: New Riders.
Kelly, Kevin (2010). What Technology Wants. New York, USA: Penguin Group.
Tibbits, S. (2016) Self-Assembly Lab: Experiments in Programming Matter. Abingdon, Oxon: Routledge.
Industrial design | Four-dimensional product | [
"Engineering"
] | 633 | [
"Industrial design",
"Design engineering",
"Design"
] |
59,386,497 | https://en.wikipedia.org/wiki/Windows%20Refund%20Day | Windows Refund Day was a one-day protest held on February 15, 1999, by Linux users who were unable to get refunds for the bundled copy of Microsoft Windows included with their computers. Multiple protests took place outside Microsoft offices in the US, with the best-documented one occurring in the San Francisco Bay Area in California.
Motives for the protests
At the time, original equipment manufacturers (OEMs) sold computers, including laptops, bundled with a Windows license. Users of Linux or other open-source operating systems who did not use Windows wanted to be refunded, as the bundled copy significantly raised the price of the computer.
When the press interviewed them, they claimed that although the end user license agreement (EULA) mentioned that "users who do not accept this EULA can return this copy of Windows to the OEM to get a refund", the OEMs would tell Linux users that Microsoft was responsible for the refunds, "as the OEM isn't the manufacturer of Windows", essentially creating a loophole. In 1997, an article was published where a Linux user described how they successfully managed to receive a refund for a laptop they had bought that had Windows bundled with it. According to them, it was a "lengthy process" and a manager at the company that manufactured the laptop, Canon Inc., claimed that they had not been told by Microsoft that the OEM is the one that is supposed to issue the refund, which indicated that Microsoft had not trained their partners correctly.
Bay Area protest
On February 15, 1999, protesters arrived outside of Microsoft's Foster City office (950 Tower Ln, Suite 9000, Foster City, California) after making signs at a local Denny's parking lot. This quickly caught the attention of members of the press and news stations. The protesters were users of Linux and other open source operating systems who did not use Windows and had no use for the bundled license that their computer included. They were holding signs, as well as bundled copies of Windows that were included with their computers. Microsoft had been expecting the protesters and had even put up a sign welcoming them. A Microsoft representative was also there, mainly to talk to the press members.
When the press interviewed a Microsoft spokesperson, he claimed that the protest was a "PR activity being used by some Linux enthusiasts to generate interest in their products". Microsoft reminded users that the OEMs were the ones who were supposed to issue the refunds for the copies of Windows and issued an official letter to the protesters which also mentioned that OEMs who sold computers with no operating system or Linux preinstalled existed.
A few protesters even attempted to enter the Microsoft office, on the 9th floor of the building. However, the elevator had been configured to disable access to the 9th floor, and when protesters tried to use the elevator to go to the 10th floor and use the stairs to go to the 9th floor, they discovered that the doors were locked from the side of the stairway.
After many attempts to get into the offices, the protesters left the building without any refunds.
Aftermath
Although the protesters could not get refunds for the Windows copies bundled with their PCs, one protester claimed that "the goal of the protest was to raise awareness to the catch 22 issue".
The protests received media attention from local news channels that covered the story. News outlets including BBC News and The New York Times also published articles on the internet which reached people worldwide.
References
External links
Text of the letter from Microsoft to protestors
Microsoft criticisms and controversies
History of free and open-source software
1999 protests | Windows Refund Day | [
"Technology"
] | 740 | [
"Computing platforms",
"Microsoft Windows"
] |
59,387,611 | https://en.wikipedia.org/wiki/Micropeptide | Micropeptides (also referred to as microproteins) are polypeptides with a length of less than 100-150 amino acids that are encoded by short open reading frames (sORFs). In this respect, they differ from many other active small polypeptides, which are produced through the posttranslational cleavage of larger polypeptides. In terms of size, micropeptides are considerably shorter than "canonical" proteins, which have an average length of 330 and 449 amino acids in prokaryotes and eukaryotes, respectively. Micropeptides are sometimes named according to their genomic location. For example, the translated product of an upstream open reading frame (uORF) might be called a uORF-encoded peptide (uPEP). Micropeptides lack N-terminal signal sequences, suggesting that they are likely to be localized to the cytoplasm. However, some micropeptides have been found in other cell compartments, as indicated by the existence of transmembrane micropeptides. They are found in both prokaryotes and eukaryotes. The sORFs from which micropeptides are translated can be encoded in 5' UTRs, small genes, or polycistronic mRNAs. Some micropeptide-coding genes were originally mis-annotated as long non-coding RNAs (lncRNAs).
Given their small size, sORFs were originally overlooked. However, hundreds of thousands of putative micropeptides have been identified through various techniques in a multitude of organisms. Only a small fraction of these with coding potential have had their expression and function confirmed. Those that have been functionally characterized, in general, have roles in cell signaling, organogenesis, and cellular physiology. As more micropeptides are discovered, so are more of their functions. One regulatory function is that of peptoswitches, which inhibit expression of downstream coding sequences by stalling ribosomes, through their direct or indirect activation by small molecules.
Identification
Various experimental techniques exist for identifying potential sORFs and their translational products. These techniques are only useful for identification of sORF that may produce micropeptides and not for direct functional characterization.
RNA sequencing
One method for finding potential sORFs, and therefore micropeptides, is through RNA sequencing (RNA-Seq). RNA-Seq uses next-generation sequencing (NGS) to determine which RNAs are expressed in a given cell, tissue, or organism at a specific point in time. This collection of data, known as a transcriptome, can then be used as a resource for finding potential sORFs. Because of the strong likelihood of sORFs less than 100 aa occurring by chance, further study is necessary to determine the validity of data obtained using this method.
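The core computational step described above — enumerating candidate sORFs in transcript sequences below a length cutoff — can be sketched in a few lines of Python. This is an illustrative sketch only: the function name, the simple ATG-to-stop scan, and the 10–100 codon cutoffs are assumptions for demonstration, not any published RNA-Seq analysis pipeline.

```python
# Hypothetical sketch: scan a transcript for short open reading frames
# (sORFs) that could encode micropeptides. Cutoffs are illustrative.

def find_sorfs(rna, max_aa=100, min_aa=10):
    """Return (start, end, n_codons) for ATG...stop ORFs with
    min_aa <= n_codons < max_aa (n_codons excludes the stop codon)."""
    seq = rna.upper().replace("U", "T")   # accept RNA or DNA alphabet
    stops = {"TAA", "TAG", "TGA"}
    sorfs = []
    for frame in range(3):                # scan all three reading frames
        i = frame
        while i + 3 <= len(seq):
            if seq[i:i + 3] == "ATG":     # candidate start codon
                j = i + 3
                while j + 3 <= len(seq):  # walk codons to the next stop
                    if seq[j:j + 3] in stops:
                        n_codons = (j - i) // 3
                        if min_aa <= n_codons < max_aa:
                            sorfs.append((i, j + 3, n_codons))
                        break
                    j += 3
            i += 3
    return sorfs
```

In practice, hits from such a scan would still need filtering (e.g. by ribosome profiling or mass spectrometry evidence), since short ORFs occur frequently by chance.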
Ribosome profiling (Ribo-Seq)
Ribosome profiling has been used to identify potential micropeptides in a growing number of organisms, including fruit flies, zebrafish, mice and humans. One method uses compounds such as harringtonine, puromycin or lactimidomycin to stop ribosomes at translation initiation sites. This indicates where active translation is taking place. Translation elongation inhibitors, such as emetine or cycloheximide, may also be used to obtain ribosome footprints which are more likely to result in a translated ORF. If a ribosome is bound at or near a sORF, it putatively encodes a micropeptide.
Mass spectrometry
Mass spectrometry (MS) is the gold standard for identifying and sequencing proteins. Using this technique, investigators are able to determine if polypeptides are, in fact, translated from a sORF.
Proteogenomic applications
Proteogenomics combines proteomics, genomics, and transcriptomics. This is important when looking for potential micropeptides. One method of using proteogenomics entails using RNA-Seq data to create a custom database of all possible polypeptides. Liquid chromatography followed by tandem MS (LC-MS/MS) is performed to provide sequence information for translation products. Comparison of the transcriptomic and proteomic data can be used to confirm the presence of micropeptides.
Phylogenetic conservation
Phylogenetic conservation can be a useful tool, particularly when sifting through a large database of sORFs. The likelihood of a sORF resulting in a functional micropeptide is more likely if it is conserved across numerous species. However, this will not work for all sORFs. For example, those that are encoded by lncRNAs are less likely to be conserved given lncRNAs themselves do not have high sequence conservation. Further experimentation will be necessary to determine if a functional micropeptide is in fact produced.
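A crude version of such a conservation screen can be sketched as follows. This is a simplified illustration under stated assumptions: real analyses use proper sequence alignment tools (e.g. BLAST or MAFFT), whereas this sketch assumes pre-aligned, equal-length peptide sequences, and the function names and 70% threshold are hypothetical.

```python
# Illustrative sketch: flag a candidate micropeptide as conserved if it
# meets a percent-identity threshold against every ortholog.

def percent_identity(a, b):
    """Percent identity between two aligned, equal-length sequences."""
    if len(a) != len(b) or not a:
        raise ValueError("sequences must be aligned and non-empty")
    matches = sum(x == y for x, y in zip(a, b))
    return 100.0 * matches / len(a)

def is_conserved(candidate, orthologs, threshold=70.0):
    """True if the candidate meets the threshold in all species."""
    return all(percent_identity(candidate, o) >= threshold
               for o in orthologs)
```

As the text notes, this filter would miss genuine micropeptides encoded in poorly conserved lncRNAs, so a low identity score alone does not rule a candidate out.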
Validating protein-coding potential
Antibodies
Custom antibodies targeted to the micropeptide of interest can be useful for quantifying expression or determining intracellular localization. As is the case with most proteins, low expression may make detection difficult. The small size of the micropeptide can also lead to difficulties in designing an epitope from which to target the antibody.
Tagging with CRISPR-Cas9
Genome editing can be used to add FLAG/MYC or other small peptide tags to an endogenous sORF, thus creating fusion proteins. In most cases, this method is beneficial in that it can be performed more quickly than developing a custom antibody. It is also useful for micropeptides for which no epitope can be targeted.
In vitro translation
This process entails cloning the full-length micropeptide cDNA into a plasmid containing a T7 or SP6 promoter. This method utilizes a cell-free protein-synthesizing system in the presence of 35S-methionine to produce the peptide of interest. The products can then be analyzed by gel electrophoresis and the 35S-labeled peptide is visualized using autoradiography.
Databases and repositories
There are several repositories and databases that have been created for both sORFs and micropeptides. A repository of small ORFs discovered by ribosome profiling can be found at sORFs.org. A repository of putative sORF-encoded peptides in Arabidopsis thaliana can be found at ARA-PEPs. A database of small proteins, especially those encoded by non-coding RNAs, can be found at SmProt.
Prokaryotic examples
To date, most micropeptides have been identified in prokaryotic organisms. While most have yet to be fully characterized, of those that have been studied, many appear to be critical to the survival of these organisms. Because of their small size, prokaryotes are particularly susceptible to changes in their environment, and as such have developed methods to ensure their existence.
Escherichia coli (E. coli)
Micropeptides expressed in E. coli exemplify bacterial environmental adaptations. Most of these have been classified into three groups: leader peptides, ribosomal proteins, and toxic proteins. Leader proteins regulate transcription and/or translation of proteins involved in amino acid metabolism when amino acids are scarce. Ribosomal proteins include L36 (rpmJ) and L34 (rpmH), two components of the 50S ribosomal subunit. Toxic proteins, such as ldrD, are toxic at high levels and can kill cells or inhibit growth, which functions to reduce the host cell's viability.
Salmonella enterica
In S. enterica, the MgtC virulence factor is involved in adaptation to low magnesium environments. The hydrophobic peptide MgrR binds to MgtC, causing its degradation by the FtsH protease.
Bacillus subtilis
The 46 aa Sda micropeptide, expressed by B. subtilis, represses sporulation when replication initiation is impaired. By inhibiting the histidine kinase KinA, Sda prevents the activation of the transcription factor Spo0A, which is required for sporulation.
Staphylococcus aureus
In S. aureus, there are a group of micropeptides, 20-22 aa, that are excreted during host infection to disrupt neutrophil membranes, causing cell lysis. These micropeptides allow the bacterium to avoid degradation by the human immune systems' main defenses.
Eukaryotic examples
Micropeptides have been discovered in eukaryotic organisms from Arabidopsis thaliana to humans. They play diverse roles in tissue and organ development, as well as maintenance and function once fully developed. While many are yet to be functionally characterized, and likely more remain to be discovered, below is a summary of recently identified eukaryotic micropeptide functions.
Arabidopsis thaliana
The POLARIS (PLS) gene encodes a 36 aa micropeptide. It is necessary for proper vascular leaf patterning and cell expansion in the root. This micropeptide interacts with developmental PIN proteins to form a critical network for hormonal crosstalk between auxin, ethylene, and cytokinin.
ROTUNDIFOLIA (ROT4) in A. thaliana encodes a 53 aa peptide, which localizes to the plasma membrane of leaf cells. The mechanism of ROT4 function is not well understood, but mutants have short rounded leaves, indicating that this peptide may be important in leaf morphogenesis.
Zea mays (maize)
Brick1 (Brk1) encodes a 76 aa micropeptide, which is highly conserved in both plants and animals. In Z. mays, it was found to be involved in morphogenesis of leaf epithelia, by promoting multiple actin-dependent cell polarization events in the developing leaf epidermis. Zm401p10 is an 89 aa micropeptide, which plays a role in normal pollen development in the tapetum. After mitosis, it is also essential in the degradation of the tapetum. Zm908p11 is a micropeptide 97 aa in length, encoded by the Zm908 gene that is expressed in mature pollen grains. It localizes to the cytoplasm of pollen tubes, where it aids in their growth and development.
Drosophila melanogaster (fruit fly)
The evolutionarily conserved polished rice (pri) gene, known as tarsal-less (tal) in D. melanogaster, is involved in epidermal differentiation. This polycistronic transcript encodes four similar peptides, which range between 11-32 aa in length. They function to truncate the transcription factor Shavenbaby (Svb). This converts Svb into an activator that directly regulates the expression of target effectors, including miniature (m) and shavenoid (sha), which are together responsible for trichome formation.
Danio rerio (zebrafish)
The Elabela gene (Ela) (a.k.a. Apela, Toddler) is important for embryogenesis. It is specifically expressed during late blastula and gastrula stages. During gastrulation, it is critical in promoting the internalization and animal-pole directed movement of mesendodermal cells. After gastrulation, Ela is expressed in the lateral mesoderm, endoderm, as well as the anterior, and posterior, notochord. Although it was annotated as a lncRNA in zebrafish, mouse, and human, the 58-aa ORF was found to be highly conserved among vertebrate species. Ela is processed by removal of its N-terminus signal peptide and then secreted in the extracellular space. Its 34-aa mature peptide serves as the first endogenous ligand to a GPCR known as the Apelin Receptor. The genetic inactivation of Ela or Aplnr in zebrafish results in heartless phenotypes.
Mus musculus (mouse)
Myoregulin (Mln) is encoded by a gene originally annotated as a lncRNA. Mln is expressed in all 3 types of skeletal muscle, and works similarly to the micropeptides phospholamban (Pln) in the cardiac muscle and sarcolipin (Sln) in slow (Type I) skeletal muscle. These micropeptides interact with sarcoplasmic reticulum Ca2+-ATPase (SERCA), a membrane pump responsible for regulating Ca2+ uptake into the sarcoplasmic reticulum (SR). By inhibiting Ca2+ uptake into the SR, they cause muscle relaxation. Similarly, the endoregulin (ELN) and another-regulin (ALN) genes code for transmembrane micropeptides that contain the SERCA binding motif, and are conserved in mammals.
Myomixer (Mymx) is encoded by the gene Gm7325, a muscle-specific peptide, 84 aa in length, which plays a role during embryogenesis in fusion and skeletal muscle formation. It localizes to the plasma membrane, associating with a fusogenic membrane protein, Myomaker (Mymk). In humans, the gene encoding Mymx is annotated as uncharacterized LOC101929726. Orthologs are found in the turtle, frog and fish genomes as well.
Homo sapiens (human)
In humans, NoBody (non-annotated P-body dissociating polypeptide), a 68 aa micropeptide, was discovered in the long intervening noncoding RNA (lincRNA) LINC01420. It has high sequence conservation among mammals, and localizes to P-bodies. It enriches proteins associated with 5’ mRNA decapping. It is thought to interact directly with Enhancer of mRNA Decapping 4 (EDC4).
ELABELA (ELA) (a.k.a. APELA) is an endogenous hormone that is secreted as a 32-amino-acid micropeptide by human embryonic stem cells. It is essential for maintaining the self-renewal and pluripotency of human embryonic stem cells. It signals in an autocrine fashion through the PI3K/AKT pathway via an as-yet-unidentified cell surface receptor. In differentiating mesendodermal cells, ELA binds to, and signals via, APLNR, a GPCR which can also respond to the hormonal peptide APLN.
The CYREN gene, conserved in mammals, is predicted to produce three micropeptides when alternatively spliced. MRI-1 was previously found to be a modulator of retrovirus infection. The second predicted micropeptide, MRI-2, may be important in non-homologous end joining (NHEJ) of DNA double-strand breaks. In co-immunoprecipitation experiments, MRI-2 bound Ku70 and Ku80, the two subunits of Ku, which plays a major role in the NHEJ pathway.
The 24 amino acid micropeptide, Humanin (HN), interacts with the apoptosis-inducing protein Bcl2-associated X protein (Bax). In its active state, Bax undergoes a conformational change which exposes membrane-targeting domains. This causes it to move from the cytosol to the mitochondrial membrane, where it inserts and releases apoptogenic proteins such as cytochrome c. By interacting with Bax, HN prevents Bax targeting of the mitochondria, thereby blocking apoptosis.
A micropeptide of 90 aa, ‘Small Regulatory Polypeptide of Amino Acid Response’ or SPAAR, was found to be encoded in the lncRNA LINC00961. It is conserved between human and mouse, and localizes to the late endosome/lysosome. SPAAR interacts with four subunits of the v-ATPase complex, inhibiting mTORC1 translocation to the lysosomal surface where it is activated. Down-regulation of this micropeptide enables mTORC1 activation by amino acid stimulation, promoting muscle regeneration.
References
Peptides | Micropeptide | [
"Chemistry"
] | 3,412 | [
"Biomolecules by chemical classification",
"Peptides",
"Molecular biology"
] |
59,389,667 | https://en.wikipedia.org/wiki/Effects%20of%20Planet%20Nine%20on%20trans-Neptunian%20objects | The hypothetical Planet Nine would modify the orbits of extreme trans-Neptunian objects via a combination of effects. On very long timescales, exchanges of angular momentum with Planet Nine cause the perihelia of anti-aligned objects to rise until their precession reverses direction, maintaining their anti-alignment, and later fall, returning them to their original orbits. On shorter timescales, mean-motion resonances with Planet Nine provide phase protection, which stabilizes their orbits by slightly altering the objects' semi-major axes, keeping their orbits synchronized with Planet Nine's and preventing close approaches. The inclination of Planet Nine's orbit weakens this protection, resulting in a chaotic variation of semi-major axes as objects hop between resonances. The orbital poles of the objects circle that of the Solar System's Laplace plane, which at large semi-major axes is warped toward the plane of Planet Nine's orbit, causing their poles to be clustered toward one side.
Apsidal anti-alignment
The anti-alignment and the raising of the perihelia of extreme trans-Neptunian objects with semi-major axes greater than 250 AU is produced by the secular effects of Planet Nine. Secular effects act on timescales much longer than orbital periods so the perturbations two objects exert on each other are the average between all possible configurations. Effectively the interactions become like those between two wires of varying thickness, thicker where the objects spend more time, that are exerting torques on each other, causing exchanges of angular momentum but not energy. Thus secular effects can alter the eccentricities, inclinations and orientations of orbits but not the semi-major axes.
Exchanges of angular momentum with Planet Nine cause the perihelia of the anti-aligned objects to rise and fall while their longitudes of perihelion librate, or oscillate within a limited range of values. When the angle between an anti-aligned object's perihelion and Planet Nine's (delta longitude of perihelion on the diagram) climbs beyond 180°, Planet Nine exerts a positive average torque on the object's orbit. This torque increases the object's angular momentum, causing the eccentricity of its orbit to decline (see blue curves on the diagram) and its perihelion to rise away from Neptune's orbit. The object's precession then slows and eventually reverses as its eccentricity declines. After delta longitude of perihelion drops below 180°, the object begins to feel a negative average torque and loses angular momentum to Planet Nine, causing its eccentricity to grow and its perihelion to fall. When the object's eccentricity is once again large, it precesses forward, returning the object to its original orbit after several hundred million years.
The behavior of the orbits of other objects varies with their initial orbits. Stable orbits exist for aligned objects with small eccentricities. Although objects in these orbits have high perihelia and have yet to be observed, they may have been captured at the same time as Planet Nine due to perturbations from a passing star. Aligned objects with lower perihelia are only temporarily stable, their orbits precess until parts of the orbits are tangent to that of Planet Nine, leading to frequent close encounters. After crossing this region the perihelia of their orbits decline, causing them to encounter the other planets, leading to their ejection.
The curves the orbits follow vary with the semi-major axis of the object and with whether the object is in resonance. At smaller semi-major axes the aligned and anti-aligned regions shrink and eventually disappear below 150 AU, leaving typical Kuiper belt objects unaffected by Planet Nine. At larger semi-major axes the region with aligned orbits becomes narrower and the region with anti-aligned orbits becomes wider. These regions also shift to lower perihelia, with perihelia of 40 AU becoming stable for anti-aligned objects at semi-major axes greater than 1000 AU. The anti-alignment of resonant objects, for example if Sedna is in a 3:2 resonance with Planet Nine as proposed by Malhotra, Volk and Wang, is maintained by a similar evolution inside the mean-motion resonances. The objects' behavior is more complex if Planet Nine and the eTNOs are in inclined orbits. Objects then undergo a chaotic evolution of their orbits, but spend much of their time in the aligned or anti-aligned regions of relative stability associated with secular resonances.
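The resonant configurations mentioned above (such as a possible 3:2 commensurability of Sedna with Planet Nine) have locations fixed by Kepler's third law, P ∝ a^(3/2). A minimal sketch follows; the value assumed for Planet Nine's semi-major axis is an illustrative placeholder only, since the planet's orbit (if it exists) is not known:

```python
def resonant_semi_major_axis(a_planet, p, q):
    """Semi-major axis of an object locked in a p:q mean-motion
    resonance with a planet of semi-major axis a_planet, i.e. the
    object completes p orbits for every q orbits of the planet.
    From Kepler's third law, a scales as the period to the 2/3 power."""
    return a_planet * (q / p) ** (2.0 / 3.0)

# Assumed, illustrative value only -- Planet Nine's orbit is unknown.
a9 = 600.0  # AU
# A 3:2 resonance (three orbits per two of the planet) lies interior
# to the planet's orbit:
print(round(resonant_semi_major_axis(a9, 3, 2), 1))  # prints 457.9
```

Exterior resonances follow from the same function with p < q, giving semi-major axes larger than the planet's.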
Evolution and long-term stability of anti-aligned orbits
The long term stability of anti-aligned extreme trans-Neptunian objects with orbits that intersect that of Planet Nine is due to their being captured in mean-motion resonances. Objects in mean-motion resonances with a massive planet are phase protected, preventing them from making close approaches to the planet. When the orbit of a resonant object drifts out of phase, causing it to make closer approaches to a massive planet, the gravity of the planet modifies its orbit, altering its semi-major axis in the direction that reverses the drift. This process repeats as the drift continues in the other direction causing the orbit to appear to rock back and forth, or librate, about a stable center when viewed in a rotating frame of reference. In the example at right, when the orbit of a plutino drifts backward it loses angular momentum when it makes closer approaches ahead of Neptune, causing its semi-major axis and period to shrink, reversing the drift.
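To lowest order, the librating resonant angle described above behaves like a pendulum: small displacements from the stable center rock back and forth, while large ones circulate through all angles and lose phase protection. The sketch below integrates a generic pendulum, phi'' = -k·sin(phi); it is a toy analogy for libration versus circulation, not a model of Planet Nine's resonances, and all parameter values are arbitrary:

```python
import math

def pendulum_trajectory(phi0, dphi0, k=1.0, dt=0.01, steps=5000):
    """Leapfrog integration of phi'' = -k*sin(phi), the standard
    lowest-order model of a resonant angle.  Returns the phi values."""
    phi, v = phi0, dphi0
    traj = [phi]
    for _ in range(steps):
        v += -k * math.sin(phi) * dt / 2.0  # half kick
        phi += v * dt                       # drift
        v += -k * math.sin(phi) * dt / 2.0  # half kick
        traj.append(phi)
    return traj

# Small displacement from the stable center: the angle librates
# (stays bounded), so the configuration avoiding close approaches
# is preserved.
libration = pendulum_trajectory(0.5, 0.0)
# A large initial drift rate: the angle circulates through 360°.
circulation = pendulum_trajectory(0.0, 2.5)
print(max(abs(p) for p in libration) < math.pi,   # True (bounded)
      max(circulation) > 2.0 * math.pi)           # True (unbounded)
```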
In a simplified model where all objects orbit in the same plane and the giant planets are represented by rings, objects captured in strong resonances with Planet Nine could remain in them for the lifetime of the Solar System. At large semi-major axes, beyond a 3:1 resonance with Planet Nine, most of these objects would be in anti-aligned orbits. At smaller semi-major axes the longitudes of perihelia of an increasing number of objects could circulate, passing through all values ranging from 0° to 360°, without being ejected, reducing the fraction of objects that are anti-aligned. may be in one of these circulating orbits.
If this model is modified with Planet Nine and the eTNOs in inclined orbits the objects alternate between extended periods in stable resonances and periods of chaotic diffusion of their semi-major axes. The distance of the closest approaches varies with the inclinations and orientations of the orbits, in some cases weakening the phase protection and allowing close encounters. The close encounters can then alter the eTNO's orbit, producing stochastic jumps in its semi-major axis as it hops between resonances, including higher order resonances. This results in a chaotic diffusion of an object's semi-major axis until it is captured in a new stable resonance and the secular effects of Planet Nine shift its orbit to a more stable region. The chaotic diffusion reduces the range of longitudes of perihelion that anti-aligned objects can reach while remaining in stable orbits.
Neptune's gravity can also drive a chaotic diffusion of semi-major axes when all objects are in the same plane. Distant encounters with Neptune can alter the orbits of the eTNOs, causing their semi-major axes to vary significantly on million year timescales. These perturbations can cause the semi-major axes of the anti-aligned objects to diffuse chaotically while occasionally sticking in resonances with Planet Nine. At semi-major axes larger than Planet Nine's, where the objects spend more time, anti-alignment may be due to the secular effects outside mean-motion resonances.
The phase protection of Planet Nine's resonances stabilizes the orbits of objects that interact with Neptune via its resonances, for example , or by close encounters for objects with low perihelia like and . Instead of being ejected following a series of encounters these objects can hop between resonances with Planet Nine and evolve into orbits no longer interacting with Neptune. A shift in the position of Planet Nine in simulations from the location favored by an analysis of Cassini data to a position near aphelion has been shown to increase the stability of some of the observed objects, possibly due to this shifting the phases of their orbits to a stable range.
Clustering of orbital poles (nodal alignment)
The clustering of the orbital poles, which produces an apparent clustering of the longitude of the ascending nodes and arguments of perihelion of the extreme TNOs, is the result of a warping of the Laplace plane of the Solar System toward that of Planet Nine's orbit. The Laplace plane defines the center around which the pole of an object's orbit precesses with time. At larger semi-major axes the angular momentum of Planet Nine causes the Laplace plane to be warped toward that of its orbit. As a result, when the poles of the eTNO orbit precess around the Laplace plane's pole they tend to remain on one side of the ecliptic pole. For objects with small inclination relative to Planet Nine, which were found to be more stable in simulations, this off-center precession produces a libration of the longitudes of ascending nodes with respect to the ecliptic making them appear clustered. In simulations the precession is broken into short arcs by encounters with Planet Nine and the positions of the poles are clustered in an off-center elliptical region. In combination with the anti-alignment of the longitudes of perihelion this can also produce clustering of the arguments of perihelion. Node-crossings may also be avoided for enhanced stability.
Objects in perpendicular orbits with large semi-major axis
Planet Nine can deliver extreme trans-Neptunian objects into orbits roughly perpendicular to the plane of the Solar System. Several objects with high inclinations, greater than 50°, and large semi-major axes, above 250 AU, have been observed. Their high-inclination orbits can be generated by a high-order secular resonance with Planet Nine involving a linear combination of the orbits' arguments and longitudes of perihelion: Δϖ - 2ω. Low-inclination eTNOs can enter this resonance after first reaching low-eccentricity orbits. The resonance causes their eccentricities and inclinations to increase, delivering them into perpendicular orbits with low perihelia where they are more readily observed. The orbits then evolve into retrograde orbits with lower eccentricities, after which they pass through a second phase of high-eccentricity perpendicular orbits before returning to low-eccentricity, low-inclination orbits. Unlike the Kozai mechanism, this resonance causes objects to reach their maximum eccentricities when in nearly perpendicular orbits. In simulations conducted by Batygin and Brown this evolution was relatively common, with 38% of stable objects undergoing it at least once. Saillenfest et al. also observed this behavior in their study of the secular dynamics of eTNOs and noted that it caused the perihelia to fall below 30 AU for objects with semi-major axes greater than 300 AU, and that with Planet Nine in an inclined orbit it could occur for objects with semi-major axes as small as 150 AU. In simulations, the arguments of perihelion of the objects with roughly perpendicular orbits reaching low perihelia are clustered near or opposite Planet Nine's, and their longitudes of ascending node are clustered around 90° in either direction from Planet Nine's. This is in rough agreement with observations, with the differences attributed to distant encounters with the known giant planets.
Nine high-inclination objects with semi-major axes greater than 250 AU and perihelia beyond Jupiter's orbit are currently known.
Dynamically coherent bodies and disrupted binaries
The presence of one or more massive perturbers orbiting the Sun well beyond Pluto may lead to the appearance of dynamically coherent minor bodies, i.e. those with similar orbits within a population of otherwise uncorrelated objects, via binary dissociation. Dynamically correlated minor bodies seem to be ubiquitous in the outer Solar System. A well-known example is the Haumea collisional family. Another, albeit less-studied, case is that of Chiang's collisional family. At least one pair of extreme trans-Neptunian objects, the one made up of 474640 Alicanto and , exhibits both similar dynamics and similar physical properties. If there are no massive planets beyond Pluto, the orbits of the ETNOs must be randomized and the statistical distributions of some of their angular elements should be compatible with a uniform distribution. This has interesting implications for what should and should not be observed when exploring the relationships between the orbits of the ETNOs. Evidence is now mounting for statistically significant deviations in the distribution of mutual nodal distances for this population. Such asymmetries are expected if one or more massive perturbers are present.
Oort cloud and comets
Numerical simulations of the migration of the giant planets show that the number of objects captured in the Oort cloud is reduced if Planet Nine was in its predicted orbit at that time. This reduction of objects captured in the Oort cloud also occurred in simulations with the giant planets on their current orbits.
The inclination distribution of Jupiter-family (or ecliptic) comets would become broader under the influence of Planet Nine. Jupiter-family comets originate primarily from the scattering objects, trans-Neptunian objects with semi-major axes that vary over time due to distant encounters with Neptune. In a model including Planet Nine, the scattering objects that reach large semi-major axes dynamically interact with Planet Nine, increasing their inclinations. As a result, the population of the scattering objects, and the population of comets derived from it, is left with a broader inclination distribution. This inclination distribution is broader than is observed, in contrast to a five-planet Nice model without a Planet Nine that can closely match the observed inclination distribution.
In a model including Planet Nine, part of the population of Halley-type comets is derived from the cloud of objects that Planet Nine dynamically controls. This Planet Nine cloud is made up of objects with semi-major axes centered on that of Planet Nine that have had their perihelia raised by the gravitational influence of Planet Nine. The continued dynamical effects of Planet Nine drive oscillations of the perihelia of these objects, delivering some of them into planet-crossing orbits. Encounters with the other planets can then alter their orbits, placing them in low-perihelion orbits where they are observed as comets. The first step of this process is slow, requiring more than 100 million years, compared to comets from the Oort cloud, which can be dropped into low-perihelion orbits in one period. The Planet Nine cloud contributes roughly one-third of the total population of comets, which is similar to that without Planet Nine due to a reduced number of Oort cloud comets.
Notes
References
Solar System
Hypothetical planets
Hypothetical trans-Neptunian objects | Effects of Planet Nine on trans-Neptunian objects | [
"Astronomy"
] | 3,040 | [
"Outer space",
"Solar System"
] |
59,389,858 | https://en.wikipedia.org/wiki/Unimog%202010 | The Unimog 2010 is a vehicle of the Unimog series made by German manufacturer Daimler-Benz from June 1951 to August 1953 in the Mercedes-Benz Gaggenau plant. It is a technical copy of its predecessor, the Unimog 70200. Despite being sold by Mercedes-Benz dealerships, the Unimog 2010 did not feature the brand's "Mercedes star" emblem. Instead, it was sold solely under the Unimog brand, bearing the ox-head Unimog emblem on the bonnet; only vehicles purchased by the Swiss army lack the Unimog emblem and have no branding at all. In total, 5,846 units were produced, and five different models were available. All Unimog 2010 vehicles have a wheelbase of and a canvas roof; a closed cab was not available as a factory option. The name "Unimog 2010" originates from the German supply firm Erhard & Söhne, which manufactured the Unimog prototypes – all technical drawings, parts and tools of that firm carried the part number 2010, which is said to be why Daimler-Benz simply named the vehicle the Unimog 2010. The Unimog 2010 was succeeded by the Unimog 401 in 1953.
History
After the final development steps of the Unimog 70200 had been completed at Erhard & Söhne, production was carried out by Gebrüder Boehringer in their Göppingen plant starting in 1948. Boehringer, originally a tool manufacturer, was not experienced in the automotive field, and therefore all Unimogs were built solely by hand. This inefficient production process led to a production figure of roughly 25 to 30 units per month, which was not enough to meet demand, so Boehringer sold the entire Unimog production to Daimler-Benz in late 1950. Production in Göppingen was halted in April 1951. During the following months, it was moved to Daimler-Benz's Mercedes-Benz Gaggenau plant. It was planned to start production in the Gaggenau plant in spring 1951, but logistical problems occurred during the creation of a new production line for the Unimog. The planned production figure of 170–180 units could not be reached. On 4 June 1951, makeshift production was started in the provisionally modified building 14 of the Gaggenau plant. This marked the introduction of the new name for the Unimog, 2010. After four weeks, the makeshift production was halted once series production could begin in building 44 of the Gaggenau plant, which was equipped with the necessary tools, equipment and an assembly line.
Daimler-Benz engineers modified the original Unimog 70200 design: the mudguard wings were upgraded with a beading, and the rear part of the bed frame was made with an edge rather than a curve to ease production. Also, axle lids were now welded instead of held in place with screws. The original diesel engine, the OM 636.912, was used until 1952, when it was replaced with the OM 636.914, which has a modified cylinder head and valve cover. The 2010's successor, the Unimog 401, was presented at the DLG exhibition in Cologne in May 1953, and Unimog 2010 production ceased in August 1953.
Military use
Like its predecessor, the Unimog 2010 was also used as a military vehicle. Some Unimog 2010s were purchased by the French army, but most of the military Unimog 2010s were operated by the Swiss army, which purchased 540 units; the Swiss army also had 44 Unimog 70200s in service. The Swiss army used its Unimog 2010s as artillery tractors, combat engineer vehicles and aircraft tractors. Being the smallest motor vehicle of the Swiss army, the Unimog 2010 received the nickname "Dieseli". The Dieseli remained in service until 1989.
Models
In total, eight different models of the Unimog 2010 were designed, but only five were ever built. The model number was added after a slash; military versions for the Swiss army received an additional letter M.
Bibliography
Carl-Heinz Vogler: Typenatlas Unimog. Alle Unimog-Klassiker seit 1946 bis 1993. GeraMond, München 2015, , p. 25
Lutz Nellinger: Der Unimog: Arbeitstier und Kultmobil. Komet, Köln 2016, .
References
Tractors
Mercedes-Benz trucks
Vehicles introduced in 1951 | Unimog 2010 | [
"Engineering"
] | 940 | [
"Engineering vehicles",
"Tractors"
] |
59,389,873 | https://en.wikipedia.org/wiki/Giacomo%20Mauro%20D%27Ariano | Giacomo Mauro D'Ariano (born 11 May 1955) is an Italian quantum physicist. He is a professor of theoretical physics at the University of Pavia, where he is the leader of the QUIT (Quantum Information Theory) group. He is a member of the Center of Photonic Communication and Computing at Northwestern University; a member of the Istituto Lombardo Accademia di Scienze e Lettere; and a member of the Foundational Questions Institute (FQXi).
His primary areas of research are Quantum information theory, the mathematical structure of quantum theory, and foundational problems of contemporary physics. As one of the pioneers of Quantum Information Theory, he has made major contributions to the informational-theoretical derivation of Quantum Theory.
Early life and career
D'Ariano was born on 11 May 1955. He received his Laurea cum laude in Physics from Pavia University in 1978. In 1978, he began a research fellowship in Polymer Science at Politecnico di Milano, and in 1979, a research fellowship at Pavia University. In 1984, he was appointed research assistant at the University of Pavia and, as a result of national competitions, became associate professor in 1992 and full professor in 2000.
At the time of his appointment, there were no PhD schools in Italy and D'Ariano became one of the first PhD supervisors in the country. He founded the Quantum Information Theory Group (QUIT) in 2000 and took on the role of the group leader. In the same year, he was also selected as a member of the Photonic Communication and Computing at Northwestern University.
Scientific contributions
Quantum information
D'Ariano and his collaborators introduced the first exact algorithm for quantum homodyne tomography of states, and subsequently generalized the technique into a universal method of quantum measurement. D'Ariano then developed the first experimental scheme—now called "ancilla-assisted tomography"—that made the characterization of quantum channels, operations, and measuring apparatuses feasible in the laboratory by exploiting a single entangled input state.
D'Ariano proposed quantum entanglement as a tool for improving the precision of quantum measurement, an idea that, parallel to works of other authors, suggested the new field of Quantum metrology. He has also introduced several new types of measurement. With his team, he solved a number of long-standing problems of quantum information theory, such as the optimal broadcasting of mixed states; the optimal phase-estimation for mixed states, and the optimal protocols for phase cloning.
D'Ariano and collaborators introduced the concept of "quantum comb", which generalizes that of "quantum operation", and has a wide range of applications in optimization of quantum measurements, communication, algorithms, and protocols. He and his group subsequently used quantum combs to find the optimal apparatuses for Quantum tomography. The quantum-comb framework also enabled a new understanding of causality in quantum mechanics and quantum field theory. This understanding had a wide and diverse impact in several areas of research, beginning with the study of quantum causal inference and causal-discovery algorithms, used in recent attempts, along quantum informational lines, at reconciling quantum theory and general relativity, one of the great outstanding problems of fundamental physics.
Quantum foundations
D'Ariano has played a major role in making quantum information theory a new paradigm for the foundations of quantum theory and fundamental physics in general. In 2010, he proposed a set of information-theoretical postulates for a rigorous derivation of (finite-dimensional) Quantum Theory, a derivation subsequently achieved in his collaboration with Giulio Chiribella and Paolo Perinotti. This project also led to a new way of understanding, working with, and developing quantum theory, presented in a comprehensive textbook entitled Quantum Theory from First Principles.
In the mid 2010s, D'Ariano extended this program to a derivation of Quantum Field Theory from informational-theoretical postulates, which enabled him and his team to derive the complete free Quantum field theory. In an article in New Scientist, Lucien Hardy wrote that "their work and their approach is extraordinary", and Časlav Brukner wrote that he was "impressed" by their work writing that "there's something deep about quantum mechanics in this work".
A historical perspective, from Dirac's discovery of quantum electrodynamics to the present time, on this work was given by Arkady Plotnitsky in The Principles of Quantum Theory, From Planck's Quanta to the Higgs Boson.
A book by Oliver Darrigol offers an extensive commentary on D'Ariano and co-workers' derivation of Quantum Mechanics, especially emphasizing how it overcomes certain ad hoc assumptions of previous derivations.
Physicist Federico Faggin based his theory of the nature of consciousness on this formulation of quantum theory from information principles.
Honors and awards
Giacomo Mauro D’Ariano is a Fellow of the Optical Society of America and of the American Physical Society. He won the third prize for the FQXi essay world competitions of 2011, 2012 and 2013. His paper on the informational derivation of quantum theory has been selected for an APS Viewpoint.
In 2022 he won, together with Mikhail Lukin (Harvard University) and Andreas Winter (Universitat Autònoma de Barcelona) the International Quantum Award.
Publications
References
Quantum physicists
1955 births
Living people
University of Pavia alumni
Academic staff of the University of Pavia
20th-century Italian physicists
21st-century Italian physicists
Fellows of the American Physical Society | Giacomo Mauro D'Ariano | [
"Physics"
] | 1,121 | [
"Quantum physicists",
"Quantum mechanics"
] |
59,392,032 | https://en.wikipedia.org/wiki/Dynamic%20texture | Dynamic texture (sometimes referred to as temporal texture) is texture with motion, as found in videos of sea waves, fire, smoke, waving trees, etc. A dynamic texture has a spatially repetitive pattern with a time-varying visual appearance. Modeling and analyzing dynamic texture is a topic of image processing and pattern recognition in computer vision.
Features extracted to describe a dynamic texture can be utilized for tasks such as image-sequence classification, segmentation, recognition and retrieval. Compared with texture found in static images, analyzing dynamic texture is a challenging problem. It is important that the features extracted from a dynamic texture combine motion and appearance descriptions, and that they be invariant to transformations such as rotation, translation and illumination changes.
Analysis methods of dynamic texture
The methods of dynamic texture recognition can be categorized as follows:
Methods based on optical flow: by applying optical flow to the dynamic texture, velocity with direction and magnitude can be detected and used to recognize the dynamic texture. Due to the simplicity of its computation, this is currently the most popular approach.
Methods computing geometric properties: these methods track the surfaces of motion trajectories in the spatiotemporal domain.
Methods based on local spatiotemporal filtering: these methods analyze local spatiotemporal patterns, their orientation and energy, and employ them as features for classification.
Methods based on global spatiotemporal transforms: these methods characterize motion at different scales using wavelets, which can decompose the motion into local and global components.
Model-based methods: these methods aim to generate a model that describes the motion by a set of parameters.
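To make the idea of combining appearance and motion statistics concrete, here is a toy pure-Python descriptor that summarizes a grayscale image sequence by its mean intensity (appearance), per-pixel temporal standard deviation, and mean absolute frame difference (motion). It is a hypothetical minimal example for illustration, not one of the published methods listed above:

```python
def dynamic_texture_features(frames):
    """Toy appearance+motion signature for a grayscale video.

    frames: list of at least two H x W frames (lists of lists of
    intensities).  Returns (mean intensity, mean per-pixel temporal
    standard deviation, mean absolute inter-frame difference).
    """
    t = len(frames)
    h, w = len(frames[0]), len(frames[0][0])
    n = h * w
    # Appearance: global mean intensity over the whole sequence.
    mean_int = sum(v for f in frames for row in f for v in row) / (t * n)
    # Motion amplitude: temporal standard deviation at each pixel.
    std_sum = 0.0
    for i in range(h):
        for j in range(w):
            series = [f[i][j] for f in frames]
            mu = sum(series) / t
            std_sum += (sum((v - mu) ** 2 for v in series) / t) ** 0.5
    temporal_std = std_sum / n
    # Motion energy: mean absolute difference of consecutive frames.
    diff = sum(abs(frames[k + 1][i][j] - frames[k][i][j])
               for k in range(t - 1) for i in range(h) for j in range(w))
    frame_diff = diff / ((t - 1) * n)
    return mean_int, temporal_std, frame_diff

# Two 2x2 frames: every pixel steps from 0 to 2 between frames.
print(dynamic_texture_features([[[0, 0], [0, 0]],
                                [[2, 2], [2, 2]]]))  # prints (1.0, 1.0, 2.0)
```

A static video yields zero for both motion statistics, while the appearance term is unchanged, which is the separation of appearance from motion that practical descriptors aim for.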
Applications
- Segmenting image sequences of natural scenes. This helps to differentiate between streets and the grass alongside them, which could be used in navigation applications.
- Motion detection: dynamic texture features extracted from footage videos can be exploited to detect abnormal crowd activities.
- Video classification: videos of natural scenes or other scenes that exhibit dynamic textures.
- Video retrieval: dynamic textures can be employed as a feature to retrieve videos that contain, for example, sea waves, smoke, clouds, or waving trees.
References
Image processing
Computer vision
Pattern recognition | Dynamic texture | [
"Engineering"
] | 430 | [
"Artificial intelligence engineering",
"Packaging machinery",
"Computer vision"
] |
59,393,164 | https://en.wikipedia.org/wiki/Uschi%20Steigenberger | Ursula "Uschi" Steigenberger (25 April 1951 — 12 December 2018) FInstP was a German condensed matter physicist and director of the ISIS neutron source. She was one of the founders of the Institute of Physics Juno Award.
Education
Steigenberger was born in Augsburg. She studied physics at the University of Würzburg, where she remained for her graduate studies. She earned a PhD in condensed matter physics under the supervision of Michael von Ortenburg. For her doctoral thesis she designed a stress rig, which allowed her to apply uniaxial pressure to single crystals of tellurium. After earning her PhD in 1981, Steigenberger worked in various magnet labs, including the CNRS Grenoble High Field Magnet Facility in Grenoble and Lebedev Physical Institute in Moscow.
Career and research
Steigenberger joined the Institut Laue–Langevin in 1982. She worked on the PRogetto dell'Istituto di Strutura della MAteria del CNR (PRISMA) spectrometer with collaborators in Italy, and continued to work with them on the motion of molecules. She worked on cadmium telluride that had been alloyed with magnesium. Here she met British physicist Keith McEwen, whom she married in 1986 before the couple moved to the United Kingdom.
Steigenberger joined the ISIS neutron source in 1986. She was initially responsible for the PRISMA spectrometer, leading a collaboration between the Engineering and Physical Sciences Research Council (EPSRC) and Italian Consiglio Nazionale delle Ricerche. PRISMA was used for inelastic neutron scattering from single crystals. It can measure the change in scattering during the application of temperature, pressure or magnetic fields. Steigenberger was made one of the ISIS Excitations Group leaders, becoming the first woman Division Head in 1994. She attended the Oxford School on Neutron Scattering in 1991. In 1996 Steigenberger coordinated a workshop on pulsed neutron experiments. She served as director at the ISIS neutron source from 2011 to 2012. She worked with scientists from Norway to create Larmor, a high-intensity small-angle scattering instrument that uses a beam of polarised neutrons to study the movement of atoms in a material. Steigenberger served as Chair of the Helmholtz-Zentrum Berlin. She retired from the ISIS neutron source in 2013. Her work was recognised by scientists all over the world, developing important partnerships with Japan and Italy.
Awards and honours
Steigenberger was shortlisted for a WISE Campaign Lifetime Achievement Award. She was a Fellow of the Institute of Physics, and part of their review of women in the physics workplace that led to the Juno award scheme as well as being a member of their Science Board. In the 2016 New Year Honours she was appointed Officer of the Order of the British Empire (OBE) for services to science, which she collected in 2017. This OBE was honorary as Steigenberger kept her German citizenship whilst working in the UK.
Death
Steigenberger was diagnosed with pancreatic cancer in January 2017, and died on 12 December 2018.
References
2018 deaths
German women physicists
Condensed matter physicists
20th-century German physicists
20th-century German women scientists
21st-century German physicists
21st-century German women scientists
1951 births | Uschi Steigenberger | [
"Physics",
"Materials_science"
] | 657 | [
"Condensed matter physicists",
"Condensed matter physics"
] |
59,393,363 | https://en.wikipedia.org/wiki/Barry%20Inglis%20Medal | The Barry Inglis Medal is awarded by the National Measurement Institute, Australia for leadership, research and/or applications of measurement techniques. It is named in honour of Barry Inglis, inaugural CEO and Chief Metrologist of Australia’s National Measurement Institute (NMI) which was set up in 2004. The inaugural medal was in 2008.
Recipients
2008 John E. Sader, University of Melbourne
2009 Michael E. Tobar, University of Western Australia
2010 Kenneth Baldwin, ANU
2011 Philip N. H. Nakashima, Monash University
2012 Not awarded
2013 Not awarded
2014 Bruce Forgan, Bureau of Meteorology
2015 Graham Jones, St Vincent's Hospital
2016 Infrared Soil Analysis Group and Ziltek Pty Ltd, led by Mike McLaughlin, CSIRO
2017 Andre Luiten, University of Adelaide
2018 Derek Abbott, University of Adelaide
2019 Wojciech Chrzanowski, University of Sydney
2020 Warwick Bowen, University of Queensland
2021 Joseph Berry, University of Melbourne
2022 Oliver Jones, RMIT
2023 Mark Taylor, University of Adelaide
2024 Vincent Wallace, University of Western Australia, and Withawat Withayachumnankul, University of Adelaide
See also
List of engineering awards
List of physics awards
List of awards named after people
References
Australian science and technology awards
Engineering awards
Awards established in 2008 | Barry Inglis Medal | [
"Technology"
] | 260 | [
"Science and technology awards",
"Engineering awards"
] |
59,395,237 | https://en.wikipedia.org/wiki/Craig%20Electronics | Craig Electronics is an American brand that specializes in consumer electronics, primarily sold at pharmacies, big-box stores and through online retailers.
History
Craig Electronics was founded in 1963 as a manufacturer of home and car stereos. The company had many celebrity sponsorships for stereo systems, which included The Beach Boys, Ray Charles, Ringo Starr, and Emerson, Lake & Palmer.
In 2001, the company was re-established as a manufacturer of low-cost consumer electronics, including MP3 players, netbooks, and speakers. Craig also has a brand licensing deal to sell certain consumer goods under the Magnavox brand. In 2013, Craig Electronics, along with Curtis International and ViewSonic were sued for patent infringement by MPEG LA.
In 2003, the company's then-president, CEO, and chairman, Richard I. Berger, was found guilty of fraud.
In 2015, Craig Electronics and Curtis International were sued by Seoul Semiconductor for patent infringement.
In 2019, Craig Electronics was acquired by Nova Capital Management, and along with Shur-Line, Bulldog, and World and Main electronics, they now operate under H2 Brands group.
In 2022, Craig Electronics was acquired by Newtech Holding from Nova Wildcat Fund.
References
External links
Official website
American companies established in 1963
History of radio in the United States
Electronics companies of the United States
Consumer electronics brands
Companies based in Miami
Manufacturing companies established in 1963
Radio manufacturers | Craig Electronics | [
"Engineering"
] | 285 | [
"Radio electronics",
"Radio manufacturers"
] |
59,400,228 | https://en.wikipedia.org/wiki/Lions%E2%80%93Magenes%20lemma | In mathematics, the Lions–Magenes lemma (or theorem) is the result in the theory of Sobolev spaces of Banach space-valued functions, which provides a criterion for moving a time derivative of a function out of its action (as a functional) on the function itself.
Statement of the lemma
Let X0, X and X1 be three Hilbert spaces with X0 ⊆ X ⊆ X1. Suppose that X0 is continuously embedded in X and that X is continuously embedded in X1, and that X1 is the dual space of X0. Denote the norm on X by || ⋅ ||X, and denote the action of X1 on X0 by ⟨ ⋅ , ⋅ ⟩. Suppose for some T > 0 that u ∈ L²((0, T); X0) is such that its time derivative u′ ∈ L²((0, T); X1). Then u is almost everywhere equal to a function continuous from [0, T] into X, and moreover the following equality holds in the sense of scalar distributions on (0, T):

(d/dt) ||u(t)||X² = 2 ⟨u′(t), u(t)⟩
The above equality is meaningful, since the functions t ↦ ||u(t)||X² and t ↦ ⟨u′(t), u(t)⟩ are both integrable on [0, T].
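For smooth u, the asserted equality reduces to the product rule in the (real) Hilbert space X, using that the duality pairing agrees with the inner product of X when both arguments lie in X; this is a heuristic sketch, not part of the lemma's proof:

```latex
\frac{d}{dt}\,\|u(t)\|_X^2
  = \frac{d}{dt}\,\bigl(u(t),\,u(t)\bigr)_X
  = \bigl(u'(t),\,u(t)\bigr)_X + \bigl(u(t),\,u'(t)\bigr)_X
  = 2\,\langle u'(t),\,u(t)\rangle .
```

The content of the lemma is that this identity survives, in the distributional sense, under the weaker regularity assumptions stated above.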
See also
Aubin–Lions lemma
Notes
It is important to note that this lemma does not extend to the case where u ∈ L²((0, T); X0) is such that its time derivative u′ ∈ L^p((0, T); X1) only for some p < 2. For example, the energy equality for the 3-dimensional Navier–Stokes equations is not known to hold for weak solutions, since a weak solution u is only known to satisfy u ∈ L²((0, T); H¹) and u′ ∈ L^(4/3)((0, T); H⁻¹) (where H¹ is a Sobolev space and H⁻¹ is its dual space), which is not enough to apply the Lions–Magenes lemma (one would need u′ ∈ L²((0, T); H⁻¹), but this is not known to be true for weak solutions).
References
(Lemma 1.2)
Lemmas in analysis | Lions–Magenes lemma | [
"Mathematics"
] | 324 | [
"Theorems in mathematical analysis",
"Lemmas",
"Lemmas in mathematical analysis"
] |
59,401,496 | https://en.wikipedia.org/wiki/Katharina%20Kohse-H%C3%B6inghaus | Katharina Kohse-Höinghaus (born 18 December 1951) is a German chemist.
Life
Kohse-Höinghaus studied chemistry at the Ruhr University Bochum from 1970 to 1975. She finished her doctorate at the Ruhr University Bochum in 1978 and her habilitation at the University of Stuttgart in 1992. Since 1994, she has been a professor of physical chemistry at Bielefeld University. She founded one of the first hands-on laboratories for schools. She served as President of The Combustion Institute between 2012 and 2016.
Research
Her research focuses on combustion diagnostics using laser spectroscopy and mass spectrometry, the deposition of functional materials from the gas-phase, and the in-situ analysis of reactive systems.
Awards and memberships
She received the following awards and memberships:
Order of Merit of the Federal Republic of Germany in 2007
Member of the Academy of Sciences Leopoldina since 2008
Member of the German Council of Science and Humanities since 2012
Member of the acatech since 2015
Member of the Göttingen Academy of Sciences and Humanities since 2016
Member of the North Rhine-Westphalian Academy of Sciences, Humanities and the Arts since 2017
Member of the European Academy of Sciences and Arts since 2017
Fellow of The Combustion Institute since 2018
Walther Nernst Memorial Medal in 2020
Member of the Chinese Academy of Sciences since 2021
References
1951 births
Living people
20th-century German chemists
German women chemists
Recipients of the Cross of the Order of Merit of the Federal Republic of Germany
20th-century German women scientists
21st-century German chemists
Fellows of the Combustion Institute
Academic staff of Bielefeld University | Katharina Kohse-Höinghaus | [
"Chemistry"
] | 330 | [
"Fellows of the Combustion Institute",
"Combustion"
] |
59,401,907 | https://en.wikipedia.org/wiki/Christel%20Marian | Christel Maria Marian (4 June 1954) is a German chemist. She is a full professor and the director of the institute of theoretical and computational chemistry at the University of Düsseldorf.
Education and professional life
Marian studied chemistry in Cologne and Bonn. She finished her doctorate in Theoretical Chemistry at the University of Bonn under the supervision of Sigrid D. Peyerimhoff in 1980. She did a postdoc in the Theoretical Physics Department of Stockholm University (Sweden) in the group of Per E. M. Siegbahn. She completed her habilitation at the University of Bonn in 1991. In 2001, she joined the University of Düsseldorf as a full professor. Between 2011 and 2015, she was Dean of the Mathematical and Natural Science Faculty at the University of Düsseldorf.
Personal life
She has two daughters.
Research
Her research focuses on the development and application of theoretical and computational excited-state electronic structure methods – also for biomolecules. She also worked on spin-orbit coupling in molecules.
Selected publications
Some of her most cited publications are:
Awards
She is a member of the North Rhine-Westphalian Academy of Sciences, Humanities and the Arts.
References
1954 births
20th-century German chemists
German women chemists
Living people
Theoretical chemists
20th-century German women scientists
21st-century German chemists | Christel Marian | [
"Chemistry"
] | 264 | [
"Quantum chemistry",
"Theoretical chemistry",
"Theoretical chemists",
"Physical chemists"
] |
59,401,999 | https://en.wikipedia.org/wiki/NGC%203367 | NGC 3367 is a barred spiral galaxy located in the constellation Leo. It is located at a distance of about 120 million light years from Earth, which, given its apparent dimensions, means that NGC 3367 is about 85,000 light years across. It was discovered by William Herschel on March 19, 1784.
Characteristics
NGC 3367 is a barred spiral galaxy with an asymmetric shape seen nearly face-on, with an inclination of 25 degrees. The inner arms begin at the ends of the bar, forming a ring with a major axis of 0.9 arcseconds, and after half of a revolution start to branch, creating a multiple-arm structure. They are studded with many bright HII regions. The star formation rate in NGC 3367 is nearly 3 solar masses per year. The outer arms form a semi-circular arch towards the south, at a distance of 50 arcseconds from the centre, more visible in the ultraviolet.
Usually, asymmetry in a galaxy is caused by interaction with other galaxies, but no large satellite has been detected near NGC 3367. The cause of the asymmetry is assumed to be a minor merger or mass accretion that took place in the last billion years. Because no low surface brightness structures, like plumes and star streams, have been detected near NGC 3367, and nor is gas seen in HI imaging near the galaxy, it is believed that the galaxy accreted cold gas in the past million years. This gas accretion resulted in increased star formation and nuclear activity.
NGC 3367 features a strong bar. The bar pattern speed was estimated to be 43 ± 6 km s−1 kpc−1 using the Tremaine-Weinberg method. The bar of the galaxy is more prominent in the near infrared, suggesting the dominance of an old population of stars in that region. There is also a knot at the eastern end of the bar, where there is a non-negligible population of red giants and asymptotic giant branch stars.
Nucleus
NGC 3367 has been categorised as an HII region or a type 2 Seyfert galaxy. However, optically there is no hint of an active galactic nucleus (AGN). The spectrum of NGC 3367 features unusually broad lines (FWHM of 490 km/s for Hβ) and a blue asymmetry. The spectrum looks like one produced by Wolf-Rayet stars. There is no clear evidence of H-alpha emission. The spectrum is also dominated by starburst features like a low [Fe II] 1.2567 μm/Paβ line ratio. The galaxy was observed by the Spitzer Space Telescope, and [Ne v] 14.3 and 24.3 μm lines, consistent with the existence of a weak AGN, were detected. Observed by the XMM-Newton telescope, the galaxy's X-ray luminosity in the 2-10 keV band was 2.0 × 10^40 erg s−1, dominated by a power law. This value is consistent with low-luminosity AGNs.
When observed in radio waves, the galaxy features two radio lobes extending from a nuclear source, a feature common to Seyfert 2 galaxies. The emission of the southwest lobe is polarised, indicating it is above the plane of the disk, while the northeast is depolarised. The southwest lobe extends 26" from the nucleus and the northeast 33". The total extent of the source is approximately 12 kpc at the distance of the galaxy. More detailed observations by the Very Large Array revealed a radio jet connecting the nucleus with the southwest lobe and a circumnuclear structure with a radius of nearly 300 pc.
In the centre of the galaxy lies a supermassive black hole whose mass is estimated to be about 10^7.2 solar masses based on Ks-band bulge luminosity. The X-ray spectra of the galaxy suggested the mass of the supermassive black hole to be in the range of 10^5 to 10^7 solar masses.
Around the nucleus, at a radius of circa 2 arcseconds, a significant amount of molecular gas has been detected using CO(1-0) emission. The total mass of molecular gas has been estimated both for that region and for the central 5 arcseconds. The presence of such a large mass could obscure the optical emission lines from an active galactic nucleus. The CO emission has an elongated shape, maybe due to the forces created by the stellar bar.
Supernovae
Six supernovae have been observed in NGC 3367:
SN 1986A (type Ia, mag 14.0) was discovered by Robert Evans on 4 February 1986, near the condensations in the spiral arms east of the centre of the galaxy. It was also discovered independently by L. Cameron and B. Leibundgut on 4 February 1986, and by Miklós Lovas on 6 February 1986.
SN 1992C (type II, mag. 16.5) was discovered by ESO astronomer Hans van Winckel on 28 January 1992. He found it on a photographic plate obtained by Guido Pizarro during a search programme carried out with the ESO 1-metre Schmidt telescope at La Silla. Spectra of the supernova, obtained by Della Valle and Christoffel Waelkens (Astronomical Institute of Leuven, Belgium), with the 2.2-metre telescope at La Silla, showed it to be of type II and that the explosion must have happened between 10 and 20 days earlier. The expansion velocity was measured at about 7000 km/sec. SN 1992c was located southeast of the centre of the galaxy, at the tip of a spiral arm.
SN 2003aa (type Ic, mag. 17.6) was discovered by LOTOSS (Lick Observatory and Tenagra Observatory Supernova Searches) on 31 January 2003.
SN 2007am (type II, mag. 17.8) was discovered by LOSS (Lick Observatory Supernova Search) on 11 March 2007.
SN 2018kp (type Ia, mag. 19.9) was discovered by the Catalina Real-time Transient Survey on 24 January 2018.
SN 2022ewj (type II, mag. 16.3) was discovered by Kōichi Itagaki on 19 March 2022.
Nearby galaxies
NGC 3367 is the foremost member in a galaxy group known as the NGC 3367 group. Other members of the group include NGC 3391, and NGC 3419. A bit further away lie the galaxies NGC 3300, and NGC 3306. NGC 3367 lies in the same region of the sky as the Leo Group, whose redshift is about a third of NGC 3367. NGC 3377, a member of the Leo Group, lies 22 arcminutes to the north of NGC 3367.
Gallery
References
External links
Barred spiral galaxies
Leo (constellation)
3367
05880
32178
Astronomical objects discovered in 1784
Discoveries by William Herschel
4C objects
Radio galaxies
Galaxies discovered in 1784
+02-28-005 | NGC 3367 | [
"Astronomy"
] | 1,440 | [
"Leo (constellation)",
"Constellations"
] |
59,402,306 | https://en.wikipedia.org/wiki/Neuronal%20cell%20cycle | The Neuronal cell cycle represents the life cycle of the biological cell, its creation, reproduction and eventual death. The process by which cells divide into two daughter cells is called mitosis. Once these cells are formed they enter G1, the phase in which many of the proteins needed to replicate DNA are made. After G1, the cells enter S phase during which the DNA is replicated. After S, the cell will enter G2 where the proteins required for mitosis to occur are synthesized. Unlike most cell types however, neurons are generally considered incapable of proliferating once they are differentiated, as they are in the adult nervous system. Nevertheless, it remains plausible that neurons may re-enter the cell cycle under certain circumstances. Sympathetic and cortical neurons, for example, try to reactivate the cell cycle when subjected to acute insults such as DNA damage, oxidative stress, and excitotoxicity. This process is referred to as “abortive cell cycle re-entry” because the cells usually die in the G1/S checkpoint before DNA has been replicated.
Cell cycle regulation
Transitions through the cell cycle from one phase to the next are regulated by cyclins binding their respective cyclin-dependent kinases (Cdks), which then activate the kinases (Fisher, 2012). During G1, cyclin D is synthesized and binds to Cdk4/6, which in turn phosphorylates retinoblastoma (Rb) protein and induces the release of the transcription factor E2F1, which is necessary for DNA replication (Liu et al., 1998). The G1/S transition is regulated by cyclin E binding to Cdk2, which phosphorylates Rb as well (Merrick and Fisher, 2011). S phase is then driven by the binding of cyclin A with Cdk2. In late S phase, cyclin A binds with Cdk1 to promote late replication origins and also initiates the condensation of the chromatin in the late G2 phase. The G2/M phase transition is regulated by the formation of the Cdk1/cyclin B complex.
Inhibition through the cell cycle is maintained by cyclin-dependent kinase inhibitors (CKIs) of the Ink and Cip/Kip families which inhibit the cyclin/CDK complex. CDK4/6 is inhibited by p15Ink4b, p16Ink4a, p18Ink4c, and p19Ink4d. These inhibitors prevent the binding of CDK4/6 with cyclin D (Cánepa et al., 2007). The Cip/Kip families (p21Cip1, p27Kip1, and p57Kip2) also bind to cyclin/CDK complexes and prohibit advancement through the cell cycle. The cell cycle uses these CDKs and CKIs to regulate the cell cycle through checkpoints. These checkpoints ensure that the cell has completed all of the tasks of the current phase before they can gain entry into the next phase of the cycle. The criteria for the checkpoints are met through a combination of activating and inhibiting cyclin/CDK complexes as the result of different signaling pathways (Besson et al., 2008; Cánepa et al., 2007; Yasutis and Kozminski, 2013). If the criteria are not met, the cell will arrest in the phase prior to the checkpoint until the criteria are met. Progression through a checkpoint without having first met the appropriate criteria can lead to cell death (Fisher, 2012; Williams and Stoeber, 2012).
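The phase-specific cyclin/Cdk pairings and inhibitor families described above can be collected into a small lookup table. This is only a restatement of the pairings named in the text, not a quantitative model of the regulation:

```python
# Driving cyclin/Cdk complex for each cell-cycle phase or transition,
# as described in the text above (simplified summary).
CYCLE_DRIVERS = {
    "G1":        ("cyclin D", "Cdk4/6"),  # phosphorylates Rb, releases E2F1
    "G1/S":      ("cyclin E", "Cdk2"),    # also phosphorylates Rb
    "S":         ("cyclin A", "Cdk2"),
    "late S/G2": ("cyclin A", "Cdk1"),    # late origins, chromatin condensation
    "G2/M":      ("cyclin B", "Cdk1"),
}

# Cyclin-dependent kinase inhibitor (CKI) families named in the text.
CKI_FAMILIES = {
    "Ink4":    ["p15Ink4b", "p16Ink4a", "p18Ink4c", "p19Ink4d"],  # block Cdk4/6
    "Cip/Kip": ["p21Cip1", "p27Kip1", "p57Kip2"],  # bind cyclin/CDK complexes
}

for phase, (cyclin, cdk) in CYCLE_DRIVERS.items():
    print(f"{phase}: {cyclin} + {cdk}")
```

Checkpoint progression can then be read as a question of which entry in the first table is active and which entries in the second are suppressing it.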
Abortive cell cycle re-entry
It is believed that neurons are permanently blocked from the cell cycle once they differentiate. As a result, neurons are typically found outside of the cell cycle in a G0 state. It has been found that various genes encoding regulators of the G1/S transition, such as cyclin D1, Cdk4, Rb proteins, E2Fs, and CKIs, can be detected in different areas of a normal human brain (Frade and Ovejero-Benito, 2015). The presence of these core cell cycle factors can be explained through their role in neuronal migration, maturation, and synaptic plasticity (Frank and Tsai, 2009). However, it is also possible that, under certain conditions, these factors can induce cell cycle re-entry. Under conditions such as DNA damage, oxidative stress, and activity withdrawal these factors have been shown to be upregulated. However, the cells usually die at the G1/S checkpoint before DNA has been replicated (Park et al., 1998).
The process by which the cell re-enters the cell cycle and dies is called “abortive cell cycle re-entry” and is characterized by the upregulation of cyclin D-cdk4/6 and downregulation of E2F, followed by cell death (Frade and Ovejero-Benito, 2015).
In cerebellar granule cells and cortical neurons, E2F1 can trigger neuronal apoptosis through activation of Bax/caspase-3 and the induction of the Cdk1/FOXO1/Bad pathway (Giovanni et al., 2000). The downregulation of p130/E2F4 (a complex which has been shown to maintain the post mitotic nature of neurons) induces neuronal apoptosis by upregulating B-myb and C-myb (Liu et al., 2005).
Cell cycle re-entry
Tetraploid neurons (neurons with 4C DNA content) are not restricted to retinal neurons, 10% of human cortical neurons have DNA higher than 2C (Frade and Ovejero-Benito, 2015). Typically differentiated neurons that replicate their DNA die. However, this is not always the case as exhibited by sensory and sympathetic neurons, which are able to replicate their DNA without neuronal death (Smith et al., 2000). Neurons that are Rb deficient have also been found to re-enter the cell cycle and survive in a 4C DNA state (Lipinski et al., 2001). Duplication of DNA can lead to neuronal diversification in vertebrates, as seen in observations in the developing chick retina.
These neurons re-enter the cell cycle as they travel to the ganglion cell layer when they are activated by p75NTR. These neurons are unable to enter mitosis and are stuck in a 4C DNA content state. Cell cycle re-entry by p75NTR is not dependent on Cdk4/6 (Morillo et al., 2012) and, therefore, differs from other cell types that re-enter the cell cycle. In retinal ganglion cells, p75NTR is mediated by p38MAPK and then phosphorylates E2F4, before progressing the cell through the cell cycle. Tetraploid neurons in mice are made in a p75NTR dependent manner in cells that contain Rb during their migration to their differentiated neuronal layers (Morillo et al., 2012). It is still unknown why these neurons are able to pass through the G1/S checkpoint and not induce apoptosis through E2F1.
Neurodegenerative diseases
Cell cycle re-entry usually causes apoptosis. However, in some neurodegenerative diseases, re-entry into the cell cycle occurs. The neurons that are able to re-enter the cell cycle are much more likely to undergo apoptosis and lead to the disease phenotypes. In Alzheimer’s disease, affected neurons show signs of DNA replication such as phosphorylated Mcm2 and cell cycle regulators cyclin D, Cdk4, phosphorylated Rb, E2F1, and cyclin E. Not much is currently known about the direct mechanism by which the cell cycle is reactivated, however it is possible that MiR26b may regulate the activation of cell cycle progression by upregulating cyclin E1 and downregulating p27Kip1 (Busser et al., 1998; Yang et al., 2003).
Neurons affected by Alzheimer's disease rarely exhibit the ability to enter mitosis and, if they do not undergo rapid mitosis, can survive for long periods of time in a tetraploid state. These neurons are able to enter the S phase and replicate their DNA; however, they become blocked in the G2 state.
In affected and unaffected tetraploid neurons, during development and during the progression of the disease, passing the G2/M checkpoint leads to cell death. This hints that the G2/M checkpoint aids in the survival of tetraploid neurons. This is supported by experiments in which the G2/M checkpoint was removed through the addition of brain-derived neurotrophic factor (BDNF) blockers in tetraploid cells, which resulted in cell death. BDNF prevents the G2/M transition through its receptor TrkB and its capacity to decrease cyclin B and Cdk1. The mechanism by which neurons undergo apoptosis after the G2/M transition is not yet fully understood; it is known that Cdk1 can activate the pro-apoptotic factor Bad by phosphorylating its Ser128 (Frade, 2000).
Interkinetic nuclear migration
Interkinetic nuclear migration is a feature of developing neuroepithelia and is characterized by the periodic movement of the cell’s nucleus with the progression of the cell cycle. Developing neuroepithelia are tissues composed of neural progenitor cells, each spanning the entire thickness of the epithelium from the ventricular surface to the laminal side. Cell nuclei occupy different positions along the apical–basal axis of the tissue. S phase occurs close to the basal side whereas mitosis exclusively occurs close to ventricular apical side. The nuclei then move to upper regions near the basal side where they proceed through S-phase.
This nuclear movement is repeated at each cell cycle and is maintained by an apical-to-basal migration during G1- phase and a reverse basal-to-apical movement during G2- phase. It was proposed that the INM maximized the amount of mitotic events in the limited space and that, since neuronal progenitors have a basal body, they need to move their nucleus to the apical side in order to assemble the mitotic spindle used in mitosis. It has been reported that the INM is not required for the cell cycle since removing the INM doesn’t change the length of the cell cycle. Interestingly, blocking or delaying the cell cycle results in the arrest or reduction of the INM respectively. Nuclear migration is not necessary for cell cycle regulation, however cell cycle regulators have tight control over the INM (Del Bene, 2011).
Bibliography
Del Bene, F. (2011). Interkinetic nuclear migration: Cell cycle on the move. EMBO J. 30, 1676–1677.
Besson, A., Dowdy, S.F., and Roberts, J.M. (2008). CDK Inhibitors: Cell Cycle Regulators and Beyond. Dev. Cell 14, 159–169.
Busser, J., Geldmacher, D.S., Herrup, K., Hof, P.R., Duff, K., and Davies, P. (1998). Ectopic cell cycle proteins predict the sites of neuronal cell death in Alzheimer’s disease brain. J. Neurosci. 18, 2801–2807.
Cánepa, E.T., Scassa, M.E., Ceruti, J.M., Marazita, M.C., Carcagno, A.L., Sirkin, P.F., and Ogara, M.F. (2007). INK4 proteins, a family of mammalian CDK inhibitors with novel biological functions. IUBMB Life 59, 419–426.
Frank, C.L., and Tsai, L.-H. (2009). Alternative functions of core cell cycle regulators in neuronal migration, neuronal maturation, and synaptic plasticity. Neuron 62, 312–326.
Fisher, R.P. (2012). The CDK network: Linking cycles of cell division and gene expression. Genes and Cancer 3, 731–738.
Frade, J.M. (2000). Unscheduled re-entry into the cell cycle induced by NGF precedes cell death in nascent retinal neurones. J Cell Sci 113, 1139–1148.
Frade, J.M., and Ovejero-Benito, M.C. (2015). Neuronal cell cycle: The neuron itself and its circumstances. Cell Cycle 14, 712–720.
Giovanni, A., Keramaris, E., Morris, E.J., Hou, S.T., O’Hare, M., Dyson, N., Robertson, G.S., Slack, R.S., and Park, D.S. (2000). E2F1 mediates death of B-amyloid-treated cortical neurons in a manner independent of p53 and dependent on Bax and caspase 3. J. Biol. Chem. 275, 11553–11560.
Lipinski, M.M., Macleod, K.F., Williams, B.O., Mullaney, T.L., Crowley, D., and Jacks, T. (2001). Cell-autonomous and non-cell-autonomous functions of the Rb tumor suppressor in developing central nervous system. EMBO J. 20, 3402–3413.
Liu, D., Nath, N., Chellappan, S., and Greene, L. (2005). Regulation of neuron survival and death by p130 and associated chromatin modifiers. GENES Dev. 18, 719–723.
Liu, N., Lucibello, F.C., Engeland, K., and Müller, R. (1998). A new model of cell cycle-regulated transcription: Repression of the cyclin A promoter by CDF-1 and anti-repression by E2F. Oncogene 16, 2957–2963.
Merrick, K. a, and Fisher, R.P. (2011). Putting one step before the other: distinct activation pathways for Cdk1 and Cdk2 bring order to the mammalian cell cycle. 9, 706–714.
Morillo, S.M., Abanto, E.P., Roman, M.J., and Frade, J.M. (2012). Nerve Growth Factor-Induced Cell Cycle Reentry in Newborn Neurons Is Triggered by p38MAPK-Dependent E2F4 Phosphorylation. Mol. Cell. Biol. 32, 2722–2737.
Park, D.S., Morris, E.J., Padmanabhan, J., Shelanski, M.L., Geller, H.M., and Greene, L. a. (1998). Cyclin-dependent kinases participate in death of neurons evoked by DNA- damaging agents. J. Cell Biol. 143, 457–467.
Smith, D.S., Leone, G., DeGregori, J., Ahmed, M.N., Qumsiyeh, M.B., and Nevins, J.R. (2000). Induction of DNA replication in adult rat neurons by deregulation of the retinoblastoma/E2F G1 cell cycle pathway. Cell Growth Differ. 11, 625–633.
Williams, G.H., and Stoeber, K. (2012). The cell cycle and cancer. 352–364.
Yang, Y., Mufson, E.J., and Herrup, K. (2003). Neuronal Cell Death Is Preceded by Cell Cycle Events at All Stages of Alzheimer’s Disease. J. Neurosci. 23, 2557–2563.
Yasutis, K.M., and Kozminski, K.G. (2013). Cell cycle checkpoint regulators reach a zillion. Cell Cycle 12, 1501–1509.
References
Cell cycle
Cellular processes
Cytoskeleton | Neuronal cell cycle | [
"Biology"
] | 3,527 | [
"Cell cycle",
"Cellular processes"
] |
51,551,039 | https://en.wikipedia.org/wiki/Platinum%20OG | Platinum OG is a 75/25 indica dominant hybrid bred by Apothecary Genetics. It is believed that it stems from the combination of Master Kush, OG Kush, and an unknown third strain – believed to be Purps strain. Its genetics mix makes Platinum OG a heavy strain which is an indicator for most strains of Kush origin. This strain is considered an impure indica.
Cultivation
Platinum OG can be grown either indoors or outdoors, requiring a Mediterranean climate. The flowering time indoors is 9 weeks, with a yield of 200-350 grams/sqm. Flowering outdoors usually occurs in early to mid-October, with a yield of 400-500 grams/sqm. It grows to a medium height. It has an average THC level of 22%.
Characteristics
The buds of this strain are medium-green and have a few orange hairs speckled throughout. The buds take on a crystal appearance due to an abundance of trichomes, creating a dense, rock-like appearance when broken open. Despite the abundance of trichomes, this strain does not have the characteristic stickiness prevalent in most high-trichome strains. Bud stems are darker than those of many marijuana strains and are covered in very long, thin hairs. Each bud has one to three leaf segments which are dark purple.
When used it causes a heady onset followed by physical sedation, making it more applicable for possible treatment of anxiety and pain. Some medical practitioners who believe in the medicinal effects of marijuana recommended this strain for sleeplessness and muscular pain.
References
Cannabis strains | Platinum OG | [
"Biology"
] | 308 | [
"Cannabis strains",
"Biopiracy"
] |
51,551,398 | https://en.wikipedia.org/wiki/Handbook%20of%20the%20New%20Zealand%20Flora | Handbook of the New Zealand Flora (abbreviated Handb. N. Zeal. Fl.) is a two volume work by English botanist Joseph Dalton Hooker with systematic botanical descriptions of plants native to New Zealand. The first part published in 1864 covers flowering plants, and the second part published in 1867 covers Hepaticae, mosses, lichens, fungi and algae.
References
Further reading
New Zealand
Botany in New Zealand
Books about New Zealand | Handbook of the New Zealand Flora | [
"Biology"
] | 90 | [
"Flora",
"Florae (publication)"
] |
51,552,534 | https://en.wikipedia.org/wiki/Data%20activism | Data activism is a social practice that uses technology and data. It emerged from existing activism sub-cultures such as hacker an open-source movements. Data activism is a specific type of activism which is enabled and constrained by the data infrastructure. It can use the production and collection of digital, volunteered, open data to challenge existing power relations. It is a form of media activism; however, this is not to be confused with slacktivism. It uses digital technology and data politically and proactively to foster social change. Forms of data activism can include digital humanitarianism and engaging in hackathons. Data activism is a social practice that is becoming more well known with the expansion of technology, open-sourced software and the ability to communicate beyond an individual's immediate community. The culture of data activism emerged from previous forms of media activism, such as hacker movements. A defining characteristic of data activism is that ordinary citizens can participate, in comparison to previous forms of media activism where elite skill sets were required to participate. By increasingly involving average users, they are a signal of a change in perspective and attitude towards massive data collection emerging within the civil society realm.
Data activism can be the act of providing data on events or issues that individuals feel have not been properly addressed by those in power. For example, the first deployment of the Ushahidi platform in 2008 in Kenya visualized the post-electoral violence that had been silenced by the government and the news media. The social practice of data activism revolves around the idea that data is political in nature. By collecting data for a particular purpose, data activists can quantify and expose specific issues. As data infrastructures and data analytics grow, data activists can use evidence from data-driven science to support claims about social issues.
Types
A twofold classification of data activism has been proposed by Stefania Milan and Miren Gutiérrez, later explored more in-depth by Milan according to the type of activists' engagement with data politics. 'Re-active data activism' can be characterized as motivated by the perception of massive data collection as a threat, for instance when activists seek to resist corporate and government snooping, whereas 'pro-active data activism' sees the increasing availability of data as an opportunity to foster social change. These differentiated approaches to datafication result in different repertoires of action, which are not at odds with each other, since they share a crucial feature: they take information as a constitutive force capable of shaping social reality and contribute to generating alternative ways of interpreting it. Examples of re-active data activism include the development and usage of encryption and anonymity networks to resist corporate or state surveillance, while instances of pro-active data activism include projects in which data is mobilized to advocate for change and contest established social narratives.
Examples
End the Backlog
It was discovered that in the United States between 180,000 and 500,000 rape kits were left unprocessed in storage in forensic warehouses. Registration and entry of criminal DNA had been inconsistent, which caused this large backlog of rape kits. The delay in analysing these DNA samples was approximately six months to two years. The information from rape kits was meant to be entered into the forensic warehouse database, but there was a disconnect between the warehouse system and the national DNA database, the Combined DNA Index System (CODIS), that left these rape kits unexamined. Testing these rape kits is important for identifying and prosecuting offenders, recognizing serial rapists, and providing justice for rape victims. The End the Backlog initiative brought attention to this issue by demanding that the data from these rape kits be processed. It was this initiative that brought the issue to the attention of the United States government, which declared the situation unacceptable and announced that $79 million in grants would be used to help eliminate the backlog of rape kits. The quantification of this data changed the ways in which the public perceived the process of analysing rape kits, and the data was then used to gain the attention of politicians.
DataKind
DataKind is a digital activism organization that brings together data scientists and people from other organizations and governments to use big data in ways similar to how corporations currently use it, namely to monetize data. Here, however, big data is used to help solve social problems, like food shortages and homelessness. DataKind was founded in 2011 and today there are chapters in the United Kingdom, India, Singapore and the United States of America. Jake Porway is the founder and executive director of DataKind.
Criticism
While data activists may have good intentions, one criticism is that when citizens generate data without training or reliable forms of measurement, the data can be skewed or presented in misleading forms.
Safecast
After the Fukushima nuclear disaster in 2011, Safecast was an organization established by a group of citizens who were concerned about high levels of radiation in the area. After receiving conflicting messages about levels of radiation from different media sources and scientists, individuals were uncertain which information was the most reliable. This brought about a movement where citizens would use Geiger counter readings to measure levels of radiation and circulate that data over the internet so that it was accessible to the public. Safecast was developed as a means of producing multiple sources of data on radiation. It was assumed that if the data was collected from similar Geiger counter measurements in large volumes, the data produced was likely to be accurate. Safecast allows individuals to download the raw radiation data, but Safecast also visualizes the data. The data that is used to create a visual map is processed and categorized by Safecast. This data is different from the raw radiation data because it has been filtered, which presents the data in a different way than the raw data. The change in presentation of data may alter the information that individuals take from it, which can pose a threat if misunderstood.
See also
Information activism
Media activism
Statactivism
Slacktivism
References
Activism
Internet activism
Data | Data activism | [
"Technology"
] | 1,214 | [
"Information technology",
"Data",
"Data activism"
] |
51,552,849 | https://en.wikipedia.org/wiki/Mieczys%C5%82aw%20Warmus | Mieczysław Warmus (born 1 June 1918 in Dobrowlany; died 20 September 2007 in Australia) was a Polish mathematician, a pioneer of computer science in Poland, professor, university lecturer, author of over a hundred scientific papers.
References
Homepage
Biography (Prof. Mieczysław Warmus (1918–2007))
Biography (Jadwiga Dutkiewicz Mieczysław Warmus Życie i praca naukowa )
20th-century Polish mathematicians
1918 births
2007 deaths
Recipients of the Medal of the 10th Anniversary of the People's Republic of Poland | Mieczysław Warmus | [
"Technology"
] | 121 | [
"Computing stubs",
"Computer science",
"Computer science stubs"
] |
51,552,927 | https://en.wikipedia.org/wiki/Non-selective%20cation%20channel-2%20family | Members of the Non-Selective Cation Channel-2 (NSCC2) Family (TC#1.A.15) have been sequenced from various yeast, fungal and animal species including Saccharomyces cerevisiae, Drosophila melanogaster and Homo sapiens. These proteins are the Sec62 proteins, believed to be associated with the Sec61 and Sec63 constituents of the general protein secretory systems of yeast microsomes. They are also the non-selective (NS) cation channels of the mammalian cytoplasmic membrane.
Function
NSCC2 channels are believed to provide entry pathways in response to growth factors. The yeast Sec62 protein has been shown to be essential for cell growth. The mammalian NS channel proteins have been implicated in the platelet-derived growth factor (PDGF)-dependent single channel current in fibroblasts. These channels are essentially closed in serum-deprived tissue-culture cells and are specifically opened by exposure to PDGF.
The channels are reported to exhibit equal selectivity for Na+, K+ and Cs+, with low permeability to Ca2+ and no permeability to anions. Channel open probability is voltage- and cytoplasmic Ca2+-independent.
Transport reaction
The generalized transport reaction catalyzed by members of the NSCC2 family is:
Cation (out) ⇌ cation (in)
Structure and location
Sequenced NSCC2 family proteins are 283-402 amino acyl residues in length and exhibit two putative transmembrane α-helical segments (TMSs). The S. cerevisiae protein, of 283 amino acyl residues, has cytoplasmic N- and C-termini with two putative TMSs at positions 159-178 and 193-213. The C-terminal 25 residues are rich in arginine and lysine. These proteins have been reported to be present in both endoplasmic reticular and cytoplasmic membranes.
References
Protein families
Membrane proteins
Transmembrane proteins
Transmembrane transporters
Transport proteins
Integral membrane proteins | Non-selective cation channel-2 family | [
"Biology"
] | 437 | [
"Protein families",
"Protein classification",
"Membrane proteins"
] |
51,553,282 | https://en.wikipedia.org/wiki/Anticipatory%20governance | Anticipatory governance is a method of decision making that uses predictive measures to anticipate possible outcomes and then make decisions based on the data provided. Anticipatory governance is a system of governing made up of processes and institutions that rely on foresight and predictions to decrease risk and to develop efficient methods of addressing events in their early conception, or of preventing them altogether.
History and applications
Anticipatory governance is a concept derived from terms of similar meaning, like forward engagement and forward deployment, which were a primary focus for decisions made by the North Atlantic Treaty Organization (NATO). More recently, anticipatory governance has become a data-oriented practice which allows citizens and governments to utilize data as contributions and evidence for decision making regarding various matters within society. For example, Finland has a parliamentary Committee for the Future, which takes advantage of foresight to predict and evaluate the impact of developments on the country.
Since 2001, the Millennium Project has maintained a project entitled the State of the Future Index, which uses a predictive methodology to foresee the future of countries worldwide based on historical data, variables and indicators, such as GDP, population, literacy rates, and unemployment.
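The idea of folding several indicators into a single forward-looking index can be sketched in a few lines. This is only a toy illustration on assumed data — the actual State of the Future Index uses a far larger indicator set, weighting, and trend extrapolation that are not reproduced here:

```python
def normalize(series):
    """Rescale a list of yearly values to the range [0, 1]."""
    lo, hi = min(series), max(series)
    return [(x - lo) / (hi - lo) for x in series]

def toy_index(indicators):
    """Average the normalized indicator series year by year.

    `indicators` maps an indicator name to a list of yearly values,
    each oriented so that higher means "better" (an indicator like
    unemployment would be inverted first).
    """
    columns = [normalize(series) for series in indicators.values()]
    return [sum(year) / len(year) for year in zip(*columns)]

# Hypothetical three-year series, purely for illustration:
index = toy_index({
    "gdp_per_capita": [30_000, 32_000, 34_000],
    "literacy_rate": [0.95, 0.96, 0.97],
})
# Both indicators improve steadily, so the composite rises from 0.0 to 1.0.
```

Note that the direction of each indicator matters: a series where higher values are worse must be inverted before it is averaged in, or the composite will move the wrong way.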
Methodology
Four part system
Anticipatory governance is a system with four components, which allow it to: use foresight; integrate foresight and policy procedures in a networked system; receive feedback in order to improve efficiency and knowledge; and remain flexible. By allowing for feedback, anticipatory governance can detect and assess the development of future programs and policy. Feedback can be gathered through audits and assessments of performance. To be effective and remain grounded in pragmatic data, the anticipatory system must adapt to consider possibilities that result from the data, even those that appear untraditional.
Indicators
Anticipatory governance utilizes various techniques to assess society, which can evaluate the continuous information produced by knowledge-based technologies such as machine learning. Anticipatory governance also takes into consideration that the future cannot be predicted with certainty; however, it can account for several possible future avenues. In order to determine these possible avenues the following indicators are required: "aggregated averages, risk assessment, sensitivity analysis of factors or decisions driving the scenarios, identification of unacceptable scenarios or worst cases, and assessment of common and different impacts among the scenarios."
Big data
Anticipatory governance allows the state to use collected public information in order to determine predictive practices and draw conclusions based on this data. Data gathered by governments in large volumes can be considered big data. Governments utilize predictive analytics to examine what kinds of behaviour and events may occur as a result of this collected data. Anticipatory governance can be used by enforcement agencies in order to proactively protect the public, for instance by estimating where future crimes may occur and identifying areas of improvement for law enforcement.
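As a deliberately minimal sketch of how such predictive practices extrapolate from collected data (the area names and the frequency-based scoring are illustrative assumptions, not any real agency's method):

```python
from collections import Counter

def hotspot_scores(incident_areas):
    """Rank areas by their share of past recorded incidents.

    This is the simplest possible extrapolation from collected data:
    tomorrow is predicted to look like yesterday's records. It also makes
    the feedback risk easy to see — areas patrolled more heavily generate
    more records, which raises their score and attracts still more patrols.
    """
    counts = Counter(incident_areas)
    total = sum(counts.values())
    return {area: n / total for area, n in counts.items()}

# Hypothetical incident log, purely for illustration:
scores = hotspot_scores(["north", "north", "harbour", "north", "harbour", "old town"])
# "north" holds half the recorded incidents and therefore half the score mass.
```

The sketch also illustrates why the quality of the underlying records matters: the scores only reflect what was recorded, not what actually occurred.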
Variation
Anticipatory governance is not always focused on finding a solution; rather, it can be focused on prevention altogether. As a result of this methodology, anticipatory governance can be an alternative to having the bureaucracy form specific groups to address issues, whereby the issue can be avoided due to precise foresight. Furthermore, anticipatory governance can also be considered a precaution, in the sense that it is a practice for preparing for the possible future.
Actors
Anticipatory governance involves a governing body to implement its practice, such as government institutions and policies. For example, education governance utilizes policy instrumentation in order to gather data about students as a means of creating predictions to improve future education. However, anticipatory governance can also be applied in similar ways by private companies and smaller organizations. For instance, Hewlett-Packard has been able to determine which employees are likely to leave the company and to identify ways of preventing this turnover.
Primarily, anticipatory governance relies on data in order to derive predictive analytical evidence to support its practice, therefore, it is necessary to have an infrastructure that sustains the produced data, such as databases, coding, computational power, and algorithms. These infrastructures can be provided by private companies that have the resources and technologies to acquire and create them.
Ethics
There is a type of ethical analysis related to anticipatory governance known as nano ethics (see Impact of nanotechnology). Within this category, anticipatory governance falls under anticipatory ethics, which originated in the 1960s. Anticipatory ethics and governance address the ethical repercussions associated with technologies in their beginning stages, assessing the risks that a technology might present and thereby informing future decision making about it (see Predictive analytics).
Anticipatory governance in the context of predictive analytics, data, and governing can be seen as controversial because its measures can be perceived as unethical. The practice presents its own ethical issues concerning the effects its methods have on the individuals influenced by it, such as discrimination and self-fulfilling prophecies. Anticipatory governance can also allow the secondary use of information by governments, which in some cases can impinge on citizens' liberties. Governments can utilize the data they gather in unintended ways, unbeknownst to citizens, in order to practice anticipatory governance.
Shortcomings
Because anticipatory governance is inherently hypothetical and the future is never certain, a measure of doubt is associated with the practice. For example, following the Great Depression, measures were taken within the United States economy to prevent a depression from ever occurring again; however, the market crash of 2008 still occurred despite these measures. Anticipatory governance also acts on information about people describing events that may never happen in actual reality. By drawing conclusions based on anticipatory predictions, certain groups in society face the consequences of this practice and are subject to prejudices by others within society. For example, predictive policing can target specific individuals within a society because the information provided by such analytics and technology supports recidivism. Recidivism is the concept that people who have committed crimes are likely to recommit offences, thus becoming individuals of interest in predictive policing data (see Predictive policing). Anticipatory governance can also target specific people and places concerning policing, which affects the behaviours of people within these areas, for instance by enforcing self-fulfilling prophecies and discrimination.
Anticipatory governance raises the concern regarding the need for traditional governments. If anticipatory governance and its associated technologies, information, and data are used to govern and make decisions within nation states, it can alter the responsibilities of government. However, without the use of anticipatory governance the alternative is to utilize a reactive form of governance, which results in a decision making process that can take longer and lead to implications that are difficult to predict and prevent.
See also
Predictive policing
Big data
Predictive analytics
References
Data collection
Governance | Anticipatory governance | [
"Technology"
] | 1,445 | [
"Data collection",
"Data"
] |
51,556,223 | https://en.wikipedia.org/wiki/NGC%20197 | NGC 197 is a lenticular galaxy located in the constellation Cetus. It was discovered on October 16, 1863 by Albert Marth.
See also
List of NGC objects (1–1000)
References
External links
SEDS
0197
0406
+00-02-110
2365
Lenticular galaxies
Astronomical objects discovered in 1863
Cetus | NGC 197 | [
"Astronomy"
] | 69 | [
"Cetus",
"Constellations"
] |
51,556,322 | https://en.wikipedia.org/wiki/Heptadecaphobia | Heptadecaphobia (from the Greek words for "seventeen" and "fear") or heptadekaphobia is the fear of the number 17. The number is considered ill-fated in Italy and other countries of Greek and Latin origins, and the date Friday the 17th is considered especially unfortunate in Italy. The number is feared due to superstition, and the fear is similar in nature to the fear of the number 13 in Anglo-Saxon countries.
History
In Ancient Greece, the number 17 was despised by followers of Pythagoras, as the number was between 16 and 18, which were perfect representations of 4×4 and 3×6 quadrilaterals, respectively.
In the Genesis flood narrative, it is written that the flood began on the 17th of the second month (Genesis, 7–11).
It has been suggested that the Romans found the number 17 disturbing because in Roman numerals XVII is an anagram of vixi, meaning "I have lived" (i.e. I am dead).
In La smorfia napoletana, a "dictionary" that associates vocabulary words with numbers to be played in the lottery, the number 17 is associated with the word for "misfortune".
Friday the 17th
In Italy, Friday the 17th (friggaheptadekaphobia or paraskevidekaeftaphobia) is a date of misfortune, as it is a date of two negatives: Friday (from Good Friday, the day of Jesus' death) and the number 17. It occurs when the 17th day of the month in the Gregorian calendar falls on a Friday, which happens at least once every year but can occur up to three times in the same year. For example, 2017 and 2023 had a Friday the 17th in February, March, and November, which also happened thrice in January, April, and July in 2020; 2015 and 2021 had two Friday the 17ths, as will 2025 through 2028; 2016, 2018, 2019, 2022 and 2024 had just one Friday the 17th.
Friday the 17th is similar to other unfortunate dates: for example, in Anglo-Saxon countries, this date is Friday the 13th, while in Spain, Greece, and South America, this date is Tuesday the 13th.
In mass media, the theme is portrayed in movies such as The Virtuous Bigamist (Italian: Era di venerdì 17) and Shriek If You Know What I Did Last Friday the 13th (Italian: Shriek - Hai impegni per venerdì 17?), where the English title refers to the number 13.
A month has a Friday the 17th if and only if it begins on a Wednesday.
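Both this equivalence and the earlier claim that Friday the 17th occurs one to three times per year can be checked with a short script (a quick sketch using Python's standard library):

```python
import calendar
from datetime import date

def friday_17ths(year):
    """Return the months of `year` whose 17th day falls on a Friday."""
    return [m for m in range(1, 13)
            if date(year, m, 17).weekday() == calendar.FRIDAY]

# The 17th falls 16 days after the 1st, and 16 ≡ 2 (mod 7), so the 17th
# is a Friday exactly when the 1st of the month is a Wednesday.
for year in range(1900, 2100):
    months = friday_17ths(year)
    for month in range(1, 13):
        starts_wed = date(year, month, 1).weekday() == calendar.WEDNESDAY
        assert starts_wed == (month in months)
    # At least once, at most three times in any year:
    assert 1 <= len(months) <= 3
```

For the years cited above, `friday_17ths(2017)` yields February, March and November, and `friday_17ths(2020)` yields January, April and July.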
See also
List of phobias, including Numerophobia
References
Phobias
Numerology
17 | Heptadecaphobia | [
"Mathematics"
] | 583 | [
"Numerology",
"Mathematical objects",
"Numbers"
] |
51,557,054 | https://en.wikipedia.org/wiki/NGC%20198 | NGC 198 is a spiral galaxy located in the constellation Pisces. It was discovered on December 25, 1790 by William Herschel.
One supernova has been observed in NGC 198: SN 2021adxd (type II-P, mag. 19.8).
See also
Spiral galaxy
List of NGC objects (1–1000)
Pisces (constellation)
References
External links
SEDS
0198
+00-02-107
00414
Unbarred spiral galaxies
Pisces (constellation)
2371
Astronomical objects discovered in 1790
Discoveries by William Herschel | NGC 198 | [
"Astronomy"
] | 118 | [
"Pisces (constellation)",
"Constellations"
] |
51,557,568 | https://en.wikipedia.org/wiki/Secret%20Path | Secret Path is a Canadian multimedia storytelling project including a ten-song music album, a graphic novel, an animated television film, and instructional materials. Released on October 18, 2016, the centrepiece of the project is a concept album about Chanie Wenjack, a young Anishinaabe boy from the Marten Falls First Nation who died in 1966 while trying to return home after escaping from an Indian residential school.
The album Secret Path was the fifth studio album by Gord Downie and the final album released during his lifetime. The album was accompanied by a graphic novel of the same name, written by Downie, illustrated by Jeff Lemire, and published by Simon & Schuster; as well as an animated television film aired on CBC Television on October 23, 2016. All proceeds from the album and book are being donated to the University of Manitoba's National Centre for Truth and Reconciliation.
Downie performed the album in a concert at Roy Thomson Hall on October 21, 2016, which was his last full concert performance in his lifetime and was attended by members of the Wenjack family. The concert was aired by CBC Television in October 2017 following Downie's death. The project was further followed in 2018 by Finding the Secret Path, a documentary film by Downie's brother Mike Downie about the creation of the original project.
Album
The centrepiece of the project is Secret Path, Downie's fifth studio album and the final album released during his lifetime. All proceeds from the album are being donated to the University of Manitoba's National Centre for Truth and Reconciliation.
Track listing
All tracks written by Gord Downie unless otherwise noted.
Personnel
Vocals, Acoustic and Electric Guitars by Gord Downie
Charles Spearin – Bass
Ohad Benchetrit – Lap Steel & Additional Guitar
Kevin Hearn – Additional Keys
Dave "Billy Ray" Koster – Drums
All other instrumentation by Kevin Drew and Dave Hamelin
Produced and Mixed by Kevin Drew and Dave Hamelin
Additional personnel:
Engineered by Nyles Spencer
Mastered by Eric Boulanger, The Bakery, Culver City, California, USA
Live performances
Downie performed the album in a concert at Roy Thomson Hall on October 21, 2016, attended by members of the Wenjack family. The concert was filmed for an hour-long special, Gord Downie's Secret Path in Concert, which also featured backstage footage and scenes from the animated film. Downie and his collaborators performed the album, again with members of the Wenjack family, in Halifax's Rebecca Cohn Auditorium on November 29, 2016. This proved to be Downie's last performance. The special aired on October 22, 2017, on CBC Television, following Downie's death earlier that week.
Book
The graphic novel, also titled Secret Path, was written by Gord Downie, illustrated by Jeff Lemire, and published by Simon & Schuster. It was released on October 18, 2016, concurrently with the album.
As with the album itself, all proceeds from the book are being donated to the University of Manitoba's National Centre for Truth and Reconciliation.
Film
The animated film, also titled The Secret Path, adapts Downie's album and Lemire's graphic novel. It is divided into ten chapters, according to the ten songs from Downie's album.
It was executive produced by Gord Downie, along with Mike Downie, Patrick Downie, and Sarah Polley. The film was produced with the participation of the Canada Media Fund and the Canadian Film or Video Production tax credit.
The Secret Path was broadcast by CBC in an hour-long television special on October 23, 2016.
Reception and impact
The project was widely adopted by many Canadian schools as a teaching tool in indigenous history lessons on the residential school system, and led to the creation of the Gord Downie & Chanie Wenjack Fund to support efforts in indigenous reconciliation.
In an opinion piece in the National Post, Peter Shawn Taylor has criticized the story for containing fictionalized elements.
Charts
Awards
The album won two Juno Awards at the Juno Awards of 2017, for Adult Alternative Album of the Year and Recording Package of the Year, and Downie won Songwriter of the Year for the songs "The Stranger", "The Only Place to Be" and "Son". The album was also a shortlisted nominee for the 2017 Polaris Music Prize.
At the 6th Canadian Screen Awards, the television film received nominations for the Donald Brittain Award and Best Music in a Non-Fiction Program. It won the Best Music award at the non-fiction programming event on March 6, 2017, and the Donald Brittain Award at the broadcast gala on March 11.
At the 7th Canadian Screen Awards, the concert special won two awards, for best variety or entertainment special and best sound in a non-fiction program.
At the 8th Canadian Screen Awards, Finding the Secret Path won the awards for Biography or Arts Documentary Program or Series and Best Direction in a Documentary Program (Mike Downie).
References
External links
Multimedia works
2010s Canadian animated films
2016 albums
2016 animated films
2016 films
2016 graphic novels
2016 television films
Album chart usages for BillboardCanada
Albums produced by Kevin Drew
Animated films set in the 1960s
Animated films set in Canada
Arts & Crafts Productions albums
Biographical graphic novels
Canadian animated television films
Canadian graphic novels
CBC Television original films
Charity albums
2010s concept albums
Donald Brittain Award winning shows
Fiction set in 1966
First Nations films
First Nations music
First Nations novels
Gordon Downie albums
Graphic novels set in Canada
Graphic novels set in the 1960s
Historical fiction graphic novels
Indigenous child displacement in Canada
Juno Award for Adult Alternative Album of the Year albums
Works about residential schools in Canada | Secret Path | [
"Technology"
] | 1,136 | [
"Multimedia",
"Multimedia works"
] |
51,557,888 | https://en.wikipedia.org/wiki/NIST%20World%20Trade%20Center%20Disaster%20Investigation | The NIST World Trade Center Disaster Investigation was a report that the National Institute of Standards and Technology (NIST) conducted to establish the likely technical causes of the three building failures that occurred at the World Trade Center following the September 11, 2001 terrorist attacks. The report was mandated as part of the National Construction Safety Team Act (NCST Act), which was signed into law on October 1, 2002 by President George W. Bush. NIST issued its final report on the collapse of the World Trade Center's twin towers in September 2005, and the agency issued its final report on 7 World Trade Center in November 2008.
NIST concluded that the collapse of each tower resulted from the combined effects of airplane impact damage, widespread fireproofing dislodgment, and the fires that ensued. The sequence of failures that NIST concluded initiated the collapse of both towers involved the heat-induced sagging of floor trusses pulling some of the exterior columns on one side of each tower inward until they buckled, after which instability rapidly spread and the upper sections then fell onto the floors below. 7 World Trade Center, which was never directly hit by an airplane, collapsed as a result of thermal expansion of steel beams and girders that were heated by uncontrolled fires caused by the collapse of the North Tower and failure of the fire-resistive material.
Context
Collapse of the World Trade Center
During the September 11 attacks, two jet airliners struck the Twin Towers of the World Trade Center (WTC), one to each tower. As a result, the two 110 stories-tall skyscrapers collapsed, causing complete destruction to the entire WTC complex and killing 2,763 people. The South Tower collapsed at 9:59 a.m. (EDT) after burning for approximately 56 minutes. 29 minutes later, the North Tower collapsed having burned for 102 minutes. When the North Tower collapsed, debris fell on the nearby 7 World Trade Center, damaging it and starting fires that burned for almost seven hours, compromising the building's structural integrity. Seven World Trade Center collapsed at 5:21 p.m. (EDT).
The collapse of the World Trade Center buildings on September 11, 2001 was unprecedented; never before had a steel-framed multi-story building suffered a complete collapse as a result of fire. In the immediate aftermath, knowledgeable structural engineers began providing a range of explanations in an attempt to help the public understand these tragic events. However, a coordinated effort would need to be organized to investigate and analyze the complex series of events that led to each collapse.
FEMA World Trade Center Building Performance Study
In the aftermath of the attacks, researchers responded immediately by traveling to ground zero where they began collecting data. Among the first was the Federal Emergency Management Agency (FEMA) and American Society of Civil Engineers (ASCE), who together formed a Building Performance Study Team to understand how the building structures failed and why. The team produced the first official government report attempting to explain the destruction of the World Trade Center complex.
They were able to make many observations and findings, including preliminary analysis of the damaged structures, analysis of the buildings' fire suppression systems, and recommendations to building codes and fire standards for including airplane impact into building design. FEMA's final report, FEMA 403 issued in May 2002, titled World Trade Center Building Performance Study: Data Collection, Preliminary Observations, and Recommendations, also provided a substantial amount of data about the event not documented elsewhere. Their findings suggested that fires, in conjunction with damage to the structural members and fire suppression systems inflicted by the jet airliners, played a key role in the collapse of the buildings.
However, the team's investigation was hampered by a number of issues. The investigators' lack of authority to impound pieces of steel for examination before they were recycled led to the loss of important evidence that was destroyed early during the search and rescue effort, and much of the steel had already been recycled in the month that had elapsed between the attack and the deployment of the team. The team was further impeded by ongoing criminal investigations by the FBI and NTSB. In a hearing before the U.S. House of Representatives' Committee on Science on March 6, 2002, a panel of witnesses and experts described the obstacles as:
No clear authority and the absence of an effective protocol for how the building performance investigators should conduct and coordinate their investigation with the concurrent search and rescue efforts, as well as any criminal investigation
Difficulty obtaining documents essential to the investigation, including blueprints, design drawings, and maintenance records
Uncertainty as a result of the confidential nature of the BPAT study
Uncertainty as to the strategy for completing the investigation and applying the lessons learned
Because of these inadequacies, the Building Performance Study team could not "definitively determine the sequence of events leading to the collapse of each tower." The report was remarkably blunt in pointing out shortcomings and missteps in the investigation and its recommendation for another, more thorough and authoritative investigation.
National Construction Safety Team act
As a result of the inconclusive FEMA Building Performance Study team's findings and concerns over missteps in its investigation, the U.S. House of Representatives drafted legislation that would give wide powers, including the right to issue subpoenas, to teams investigating building failures. The bill also mandates that the teams would be centered at the National Institute of Standards and Technology (NIST) whose building and fire research laboratory in Maryland has conducted extensive investigations of building failures in the past. The bill, cited as the National Construction Safety Team (NCST) act, passed the House and the Senate and was signed into law on Oct. 1, 2002 by President George W. Bush.
The NCST act gives NIST a clear mandate to:
establish the likely technical cause of building failures;
evaluate the technical aspects of procedures used for evacuation and emergency response;
recommend specific changes to building codes, standards, and practices;
recommend any research or other appropriate actions needed to improve the structural safety of buildings; and/or changes in emergency response and evacuation procedures; and,
make final recommendations within 90 days of completing an investigation.
Investigation
NIST began its investigation on 21 August 2002. Prior to this date, volunteers from NIST, FEMA, ASCE and others collected steel members important to the investigation from the four steel recycling facilities during the recovery effort. They collected and cataloged 236 steel artifacts, including exterior columns, core columns, floor trusses and other similar structural members. They were able to observe the metallurgical chemistry and structure and perform experiments on the recovered elements to measure their attributes such as mechanical properties under high temperatures.
NIST's Building and Fire Research Laboratory created a complex computer model to understand the collapse of the towers. Specifically, they wanted to know if the collapse of a core column could cause the progressive collapse of the whole building. They also modeled the dispersion of the jet fuel and damage to the interior of the building that was not visible from photographic evidence and eyewitnesses. The model was used to understand the hypothesis of the collapse.
Findings
Twin Towers
The investigation team integrated their metallurgy analysis, experimental results and computer simulation with video and photographs of the destruction and eyewitness accounts to form their understanding for how the buildings collapsed. They came to two conclusions:
A conventional fire should not have caused the collapse of the 110-story skyscrapers in the absence of structural and fire-proofing insulation damage.
The towers would likely not have collapsed if not for the impact and damage that the aircraft caused to the fire-proofing insulation.
The most probable collapse sequences of the South Tower and the North Tower were similar but not identical. Both, however, involved all major structural systems of the building design: the core columns, the exterior columns, and the building floors.
First, the floors that lost fire-proofing insulation due to debris impact began to sag as a result of the high temperature of the fire.
The sagging floors pulled inward on the walls
The exterior walls began to bow inward under the combined forces of the sagging floors, the fire, and the severed core columns from aircraft impact damage.
Finally, the exterior walls buckled and the buildings collapsed. The stories below offered little resistance to the tremendous energy of the falling mass above, allowing the towers to fall very quickly.
The NIST investigation's conclusions do not support the "pancake theory" of collapse initiation, in which the collapse is begun by a progressive failure of the floor system. However, "pancaking" was accepted as the mode of collapse progression.
7 World Trade Center
NIST released the final report of their investigation into the collapse of the 47-story World Trade Center 7 building on 20 August 2008.
Their conclusion is that WTC 7 collapsed primarily as a result of the fires started when WTC 1 collapsed. The collapse of the North Tower also damaged the south exterior wall of WTC 7, but that damage was not a contributing factor. Because the FDNY could not attempt to extinguish the fires burning on six floors of the building, the fire-proofing insulation began to fail. After several hours of uncontrolled fire, the steel columns, girders, and trusses absorbed heat and rapidly lost their strength. As they began to sag, deform, and buckle, the interior structure below the east penthouse came down. The load of the failed core columns was redistributed to the remaining columns, which then failed as well, leading to the progressive collapse of the building.
Reforms
The National Construction Safety Team Act of 2002, which mandated NIST to perform its investigation, specifically states that NIST, which is not a regulatory agency, is not authorized to require the adoption of building codes, standards or practices. However, NIST's final reports on the collapse of the WTC buildings provides the technical basis for new and improved standards, codes and practices on designing buildings to resist progressive collapse. Many NIST researchers are also key members of professional, standards-developing organizations, and NIST actively works with these organizations to ensure that lessons learned from investigations are put to use.
Fire Protection of Structural Members
The National Fire Protection Association has adopted many of NIST's recommended improvements to the building code. NIST's recommendation for improving a building's structural frame and support system under severe loading conditions, such as during a fire event, resulted in the adoption of a new approach to high-rise building design. Much more scrutiny is to be given to the primary and secondary structural members as well as the connections that tie them together.
The steel columns of the WTC buildings lost significant strength when they were subjected to the heat of the fires. Concrete heated to the same temperatures, however, retains far more of its strength. As a result, new high-rise buildings, including One World Trade Center, are being constructed with reinforced, high-strength concrete.
Emergency Communication Systems
Because New York City's Office of Emergency Management was located in World Trade Center 7 on the day of the attacks, and vital communications equipment was located in the North Tower, the city's emergency response was significantly impeded. The office has since moved to Brooklyn, and disaster and emergency management teams around the country are likewise moving away from possible targets of terrorist attacks, natural disasters, and other emergency scenarios.
Conspiracy theories
Critics of the conspiracy theories that dispute these findings say such theories are a form of conspiracism common throughout history after a traumatic event, in which conspiracy theories emerge as a mythic form of explanation.
A related criticism addresses the form of research on which the theories are based. Thomas W. Eagar, an engineering professor at MIT, suggested they "use the 'reverse scientific method'. They determine what happened, throw out all the data that doesn't fit their conclusion, and then hail their findings as the only possible conclusion."
List of reports released
Final Reports from the NIST World Trade Center Investigation (September 2005, November 2008)
NIST NCSTAR 1: Federal Building and Fire Safety Investigation of the World Trade Center Disaster: Final Report of the National Construction Safety Team on the Collapses of the World Trade Center Tower
NIST NCSTAR 1-1: Design, Construction, and Maintenance of Structural and Life Safety Systems
NIST NCSTAR 1-1A: Design and Construction of Structural Systems
NIST NCSTAR 1-1A appendixes A-B
NIST NCSTAR 1-1A appendixes C-G
NIST NCSTAR 1-1B: Comparison of Building Code Structural Requirements
NIST NCSTAR 1-1C: Maintenance and Modifications to Structural Systems
NIST NCSTAR 1-1C appendixes
NIST NCSTAR 1-1D: Fire Protection and Life Safety Provisions Applied to the Design and Construction of World Trade Center 1, 2, and 7 and Post-Construction Provisions Applied after Occupancy
NIST NCSTAR 1-1E: Comparison of Codes, Standards, and Practices in Use at the Time of the Design and Construction of World Trade Center 1, 2, and 7
NIST NCSTAR 1-1F: Comparison of the 1968 and Current (2003) New York City Building Code Provisions
NIST NCSTAR 1-1G: Amendments to the Fire Protection and Life Safety Provisions of the New York City Building Code by Local Laws Adopted while World Trade Center 1, 2, and 7 Were in Use
NIST NCSTAR 1-1H: Post-Construction Modification to Fire Protection and Life Safety Systems of the World Trade Center Towers
NIST NCSTAR 1-1I: Post-Construction Modifications to Fire Protection, Life Safety, and Structural Systems of World Trade Center 7
NIST NCSTAR 1-1J: Design, Installation, and Operation of Fuel Systems for Emergency Power in World Trade Center 7
NIST NCSTAR 1-2: Baseline Structural Performance and Aircraft Impact Damage Analysis of the World Trade Center Towers
NIST NCSTAR 1-2A: Reference Structural Models and Baseline Performance Analysis of the World Trade Center Towers
NIST NCSTAR 1-2B: Analysis of Aircraft Impacts into the World Trade Center Towers (Chapters 1-8)
NIST NCSTAR 1-2B: Chapters 9-11
NIST NCSTAR 1-2B: appendixes
NIST NCSTAR 1-3: Mechanical and Metallurgical Analysis of Structural Steel
NIST NCSTAR 1-3A: Contemporaneous Structural Steel Specifications
NIST NCSTAR 1-3B: Steel Inventory and Identification
NIST NCSTAR 1-3C: Damage and Failure Modes of Structural Steel Components
NIST NCSTAR 1-3C: appendixes
NIST NCSTAR 1-3D: Mechanical Properties of Structural Steels
NIST NCSTAR 1-3E: Physical Properties of Structural Steels
NIST NCSTAR 1-4: Active Fire Protection Systems
NIST NCSTAR 1-4A: Post-Construction Fires prior to September 11, 2001
NIST NCSTAR 1-4B: Fire Suppression Systems
NIST NCSTAR 1-4C: Fire Alarm Systems
NIST NCSTAR 1-4D: Smoke Management Systems
NIST NCSTAR 1-5: Reconstruction of the Fires in the World Trade Center Towers
NIST NCSTAR 1-5A: Visual Evidence, Damage Estimates, and Timeline Analysis (Chapters 1-8)
NIST NCSTAR 1-5A: Chapters 9-appendix C
NIST NCSTAR 1-5A: appendixes D-G
NIST NCSTAR 1-5A: appendixes H-M
NIST NCSTAR 1-5B: Experiments and Modeling of Structural Steel Elements Exposed to Fire
NIST NCSTAR 1-5C: Fire Tests of Single Office Workstations
NIST NCSTAR 1-5D: Reaction of Ceiling Tile Systems to Shocks
NIST NCSTAR 1-5E: Experiments and Modeling of Multiple Workstations Burning in a Compartment
NIST NCSTAR 1-5F: Computer Simulation of the Fires in the World Trade Center Towers
NIST NCSTAR 1-5G: Fire Structure Interface and Thermal Response of the World Trade Center Towers
NIST NCSTAR 1-6: Structural Fire Response and Probable Collapse Sequence of the World Trade Center Towers
NIST NCSTAR 1-6A: Passive Fire Protection
NIST NCSTAR 1-6B: Fire Resistance Tests of the Floor Truss Systems
NIST NCSTAR 1-6C: Component, Connection, and Subsystem Structural Analysis
NIST NCSTAR 1-6D: Global Structural Analysis of the Response of the World Trade Center Towers to Impact Damage and Fire
NIST NCSTAR 1-7: Occupant Behavior, Egress, and Emergency Communication
NIST NCSTAR 1-7A: Analysis of Published Accounts of the World Trade Center Evacuation
NIST NCSTAR 1-7B: Technical Documentation for Survey Administration: Questionnaires, Interviews, and Focus Groups
NIST NCSTAR 1-8: The Emergency Response Operations
NIST NCSTAR 1-8: Appendixes A-L
NIST NCSTAR 1A: Final Report on the Collapse of World Trade Center Building 7 *
NIST NCSTAR 1-9: Structural Fire Response and Probable Collapse Sequence of World Trade Center Building 7, Volume 1 and 2 *
NIST NCSTAR 1-9A: Global Structural Analysis of the Response of World Trade Center Building 7 to Fires and Debris Impact Damage *
* Errata for NIST NCSTAR 1A, NIST NCSTAR 1-9, and NIST NCSTAR 1-9A (January 2009, April 2012, and June 2012)
Draft Reports from the NIST World Trade Center Investigation - and public comments (April 2005 and August 2008)
Best Practice Guidelines for Structural Fire Resistance Design of Concrete and Steel Buildings (NIST IR 7563, November 2010)
June 2004 Progress Report on the Federal Building and Fire Safety Investigation of the World Trade Center (NIST SP 1000-5, June 2004)
Public Update on the Federal Building and Fire Safety Investigation of the World Trade Center Disaster (NIST SP 1000-4, December 2003)
Progress Report on the Federal Building and Fire Safety Investigation of the World Trade Center Disaster (NIST SP 1000-3, May 2003)
Progress Report on NIST Building and Fire Investigation into the World Trade Center Disaster (NIST IR 6942 and NIST SP 1000-2, December 2002)
NIST Final Plan: National Building and Fire Safety Investigation of the World Trade Center Disaster (NIST SP 1000-1, August 2002)
Initial Model for Fires in the World Trade Center Towers (NIST IR 6879, May 2002)
See also
9/11 Commission Report
9/11 Public Discourse Project
References
External links
Final Reports from the NIST World Trade Center Disaster Investigation
NIST Releases Final WTC 7 Investigation Report
Aftermath of the September 11 attacks
Construction law
National Institute of Standards and Technology
World Trade Center | NIST World Trade Center Disaster Investigation | [
"Engineering"
] | 3,794 | [
"Construction",
"Construction law"
] |
51,558,108 | https://en.wikipedia.org/wiki/Machine-readable%20document | A machine-readable document is a document whose content can be readily processed by computers. Such documents are distinguished from more general machine-readable data by virtue of having further structure to provide the necessary context to support the business processes for which they are created.
Definition
Data without context is meaningless and lacks the four essential characteristics of trustworthy business records specified in ISO 15489 Information and documentation – Records management:
Reliability
Authenticity
Integrity
Usability
The vast bulk of information is unstructured data and, from a business perspective, that means it is "immature", i.e., Level 1 (chaotic) of the Capability Maturity Model. Such immaturity fosters inefficiency, diminishes quality, and limits effectiveness. Unstructured information is also ill-suited for records management functions, provides inadequate evidence for legal purposes, drives up the cost of discovery in litigation, and makes access and usage needlessly cumbersome in routine, ongoing business processes.
There are at least four aspects to machine-readability:
First, words or phrases should be discretely delineated (tagged) so that computer software and/or hardware logic can be applied to them as individual conceptual elements.
Second, the semantics of each element should be specified so that computers can help human beings achieve a common understanding of their meanings and potential usages.
Third, if the relationships among the individual elements are also specified, computers can automatically apply inferences to them, thereby further relieving human beings of the burden of trying to understand them, particularly for purposes of inquiry, discovery, and analysis.
Fourth, if the structures of the documents in which the elements occur are also specified, human understanding is further enhanced and the data becomes more reliable for legal and business-quality purposes.
As early as 1983, the U.S. Government Accountability Office (GAO) began emphasizing the benefits of machine-readable information. Earlier still, in 1981, GAO began reporting on the problem of inadequate record-keeping practices in the U.S. federal government. Such deficiencies are not unique to government, and advances in information technology mean that most information is now "born digital" and thus potentially far easier to manage by automated means. However, in testimony to Congress in 2010, GAO highlighted problems with managing electronic records, and as recently as 2015, GAO continued to report inadequacies in the performance of Executive Branch agencies in meeting records management requirements. Moreover, more than two decades after a major and formerly highly respected auditing firm, Arthur Andersen, met its demise in a records-destruction scandal, record-keeping practices became a central issue in the 2016 presidential election.
On January 4, 2011, President Obama signed H.R. 2142, the Government Performance and Results Act (GPRA) Modernization Act of 2010 (GPRAMA), into law as P.L. 111-352. Section 10 of GPRAMA requires U.S. federal agencies to publish their strategic and performance plans and reports in searchable, machine-readable format.
Additionally, in 2013, he issued Executive Order 13642, Making Open and Machine Readable the New Default for Government Information in general.
On July 28, 2016, the Office of Management and Budget (OMB) followed up by including in the revised issuance of Circular A-130 direction for agencies to use open, machine-readable formats, and to publish "public information online in a manner that promotes analysis and reuse for the widest possible range of purposes", meaning that the information is both publicly accessible and machine-readable. On January 14, 2019, President Trump signed into law H.R. 4174, the OPEN Government Data Act (OGDA), which codifies in law the requirement for agencies to make their public data assets available in machine-readable format. On June 28, 2019, in Circular A-11, OMB expressed intent to begin complying with section 10 of GPRAMA.
In support of such policy direction, technological advancement is enabling more efficient and effective management and use of machine-readable electronic records. Document-oriented databases have been developed for storing, retrieving, and managing document-oriented information, also known as semi-structured data. Extensible Markup Language (XML) is a World Wide Web Consortium (W3C) Recommendation setting forth rules for encoding documents in a format that is both human-readable and machine-readable. Many XML editor tools have been developed and most, if not all major information technology applications support XML to greater or lesser degrees. The fact that XML itself is an open, standard, machine-readable format makes it relatively easy for application developers to do so.
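As a brief illustration of this point (the record structure below is hypothetical, not drawn from any standard or from the source), a few lines of Python show how XML tagging delineates each conceptual element so that software can address it individually rather than parsing free text:

```python
import xml.etree.ElementTree as ET

# A minimal machine-readable record: each concept is a tagged element
# with an explicit name, giving the value context a program can use.
doc = """
<record id="r-1">
  <title>Quarterly Report</title>
  <author>J. Doe</author>
  <created>2016-09-10</created>
</record>
"""

root = ET.fromstring(doc)
# Elements can be retrieved by name instead of by position in prose.
print(root.get("id"))          # r-1
print(root.findtext("title"))  # Quarterly Report
print(root.findtext("created"))  # 2016-09-10
```

The same discrete elements could then be validated against an XML Schema, which is what makes the format reliable for the record-keeping purposes described above.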
The W3C's accompanying XML Schema (XSD) Recommendation specifies how to formally describe the elements in an XML document. With respect to the specification of XML schemas, the Organization for the Advancement of Structured Information Standards (OASIS) is a leading standards-developing organization. However, many technical developers prefer to work with JSON, and to define the structure of JSON data for validation, documentation, and interaction control, JSON Schema was developed by the Internet Engineering Task Force (IETF).
The Portable Document Format (PDF) is a file format used to present documents in a manner independent of application software, hardware, and operating systems. Each PDF file encapsulates a complete description of the presentation of the document, including the text, fonts, graphics, and other information needed to display it. PDF/A is an ISO-standardized version of the PDF specialized for use in the archiving and long-term preservation of electronic documents. PDF/A-3 allows embedding of other file formats, including XML, into PDF/A conforming documents, thus potentially providing the best of both human- and machine-readability. The W3C's XSL-FO (XSL Formatting Objects) markup language is commonly used to generate PDF files.
Metadata, data about data, can be used to organize electronic resources, provide digital identification, and support the archiving and preservation of resources. In well-structured, machine-readable electronic records, the content can be repurposed as both data and metadata. In the context of electronic record-keeping systems, the terms "management" and "metadata" are virtually synonymous. Given proper metadata, records management functions can be automated, thereby reducing the risk of spoliation of evidence and other fraudulent manipulations of records. Moreover, such records can be used to automate the process of auditing data maintained in databases, thereby reducing the risk of single points of failure associated with the Machiavellian concept of a single source of truth.
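As a purely illustrative sketch of automating a records-management function from metadata (the field names and retention rule here are invented, not from the source), a disposal-eligibility check can be computed directly from each record's metadata instead of by manual review:

```python
from datetime import date

# Each record carries machine-readable metadata; a retention rule can
# then be applied automatically across the whole collection.
records = [
    {"id": "A-1", "created": date(2008, 3, 1), "retention_years": 7},
    {"id": "A-2", "created": date(2015, 6, 9), "retention_years": 7},
]

def eligible_for_disposal(record, today):
    """A record is disposable once its retention period has elapsed."""
    created = record["created"]
    cutoff = created.replace(year=created.year + record["retention_years"])
    return today >= cutoff

today = date(2016, 9, 10)
due = [r["id"] for r in records if eligible_for_disposal(r, today)]
print(due)  # ['A-1']
```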
Blockchains make it possible to create and maintain continuously growing lists of records secured against tampering and revision. A key feature is that every node in a decentralized system holds a copy of the blockchain, so there is no single point of failure subject to manipulation and fraud.
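The tamper-evidence property just described can be sketched in a few lines of Python (a toy illustration only, not a real blockchain implementation): each record stores the hash of its predecessor, so revising any earlier entry invalidates every hash that follows.

```python
import hashlib
import json

def add_record(chain, payload):
    """Append a record whose hash covers the payload and the previous hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"payload": payload, "prev": prev_hash}, sort_keys=True)
    chain.append({"payload": payload, "prev": prev_hash,
                  "hash": hashlib.sha256(body.encode()).hexdigest()})

def verify(chain):
    """Recompute every hash; revising an earlier record breaks the chain."""
    prev_hash = "0" * 64
    for rec in chain:
        body = json.dumps({"payload": rec["payload"], "prev": prev_hash},
                          sort_keys=True)
        expected = hashlib.sha256(body.encode()).hexdigest()
        if rec["prev"] != prev_hash or rec["hash"] != expected:
            return False
        prev_hash = rec["hash"]
    return True

chain = []
add_record(chain, "invoice #1 filed")
add_record(chain, "invoice #1 approved")
assert verify(chain)                 # intact chain verifies

chain[0]["payload"] = "invoice #1 deleted"   # attempted revision
assert not verify(chain)             # detected: hashes no longer match
```

Real blockchains add consensus and replication across nodes; this sketch shows only the hash-chaining that makes revision detectable.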
See also
Budapest Declaration on Machine Readable Travel Documents
Comparison of XML editors
Four corners (law)
Integrity and particularly Data integrity
Linked data
Machine-readable passport
Markup language
Open data
Reliability (statistics), Data integrity, Reliability (computer networking), and Reliability (research methods)
Strategy Markup Language (StratML)
Structured document
Tag (metadata)
Universal Business Language (UBL)
XBRL (eXtensible Business Reporting Language)
References
External links
OMB M-13-13, Open Data Policy: Managing Information as an Asset, which requires agencies to use open, machine-readable, data format standards
NARA Guidance on Managing Web Records, January 2005, which outlines the characteristics of trustworthy records.
Driving a Stake in the Heart of the Capone Consultancy Method of Records Management: Best Practices for Correcting Non-Records Non-Policy Nonsense, March 9, 2015
The U.S. Code, which includes the term "machine-readable" over 50 times as of September 10, 2016
Data management
Records management | Machine-readable document | [
"Technology"
] | 1,634 | [
"Data management",
"Data"
] |
51,559,227 | https://en.wikipedia.org/wiki/1%2C3-Diphenylurea | 1,3-Diphenylurea is a phenylurea-type compound with the formula (PhNH)2CO (Ph = C6H5). It is a colorless solid that is prepared by transamidation of urea with aniline.
DPU is a cytokinin, a type of plant hormone that induces flower development. The cytokinin effect of DPU is relatively low, but other more potent phenylurea-type cytokinins have been reported.
It was detected in coconut milk.
References
External links
Cytokinins
Ureas | 1,3-Diphenylurea | [
"Chemistry"
] | 129 | [
"Organic compounds",
"Organic compound stubs",
"Organic chemistry stubs",
"Ureas"
] |
51,559,936 | https://en.wikipedia.org/wiki/Realizing%20Increased%20Photosynthetic%20Efficiency | Realizing Increased Photosynthetic Efficiency (RIPE) is a translational research project that is genetically engineering plants to photosynthesize more efficiently to increase crop yields. RIPE aims to increase agricultural production worldwide, particularly to help reduce hunger and poverty in Sub-Saharan Africa and Southeast Asia by sustainably improving the yield of key food crops including soybeans, rice, cassava and cowpeas. The RIPE project began in 2012, funded by a five-year, $25-million dollar grant from the Bill and Melinda Gates Foundation. In 2017, the project received a $45 million-dollar reinvestment from the Gates Foundation, Foundation for Food and Agriculture Research, and the UK Government's Department for International Development. In 2018, the Gates Foundation contributed an additional $13 million to accelerate the project's progress.
Background
During the 20th century, the Green Revolution dramatically increased yields through advances in plant breeding and land management. This period of agricultural innovation is credited for saving millions of lives. However, these approaches are reaching their biological limits, leading to stagnation in yield improvement. In 2009, the Food and Agriculture Organization projected that global food production must increase by 70% by 2050 to feed an estimated world population of 9 billion people. Meeting the demands of 2050 is further challenged by shrinking arable land, decreasing natural resources, and climate change.
Research
The RIPE project's proof-of-concept study, published in Science, established that photosynthesis can be improved to increase yields. The Guardian named this discovery one of the 12 key science moments of 2016. Computer model simulations identify strategies to improve the basic underlying mechanisms of photosynthesis and increase yield. First, researchers transform, or genetically engineer, model plants that are tested in controlled environments, e.g. growth chambers and greenhouses. Next, successful transformations are tested in randomized, replicated field trials. Finally, transformations with statistically significant yield increases are translated to the project's target food crops. Several approaches could likely be combined to increase yield additively. "Global access" ensures smallholder farmers will be able to use and afford the project's intellectual property.
Organization
RIPE is led by the University of Illinois at the Carl R. Woese Institute for Genomic Biology. The project's partner institutions include the Australian National University, Chinese Academy of Sciences, Commonwealth Scientific and Industrial Research Organisation, Lancaster University, Louisiana State University, University of California at Berkeley, University of Cambridge, University of Essex, and the United States Department of Agriculture/Agricultural Research Service.
The Executive Committee oversees the various research strategies; its members are listed in the table below.
References
Genetic engineering and agriculture
Research projects
Photosynthesis | Realizing Increased Photosynthetic Efficiency | [
"Chemistry",
"Engineering",
"Biology"
] | 541 | [
"Biochemistry",
"Genetic engineering and agriculture",
"Genetic engineering",
"Photosynthesis"
] |
51,560,456 | https://en.wikipedia.org/wiki/List%20of%20progestogens | This is a list of progestogens that are or that have been used in clinical or veterinary medicine. They are steroids and include derivatives of progesterone and testosterone.
Progesterone derivatives
Retroprogesterone derivatives
Note that although an active progestogen, retroprogesterone is not medically used.
17α-Hydroxyprogesterone derivatives
Note that 17α-hydroxyprogesterone is inactive as a progestogen and is not used medically.
The 19-norprogesterone derivatives gestonorone caproate (gestronol hexanoate), nomegestrol acetate, segesterone acetate (nestorone, elcometrine), and norgestomet are also derivatives of 17α-hydroxyprogesterone (see below).
17α-Methylprogesterone derivatives
Note that although an active progestogen, 17α-methylprogesterone is not medically used.
The 19-norprogesterone derivatives demegestone, promegestone, and trimegestone are also derivatives of 17α-methylprogesterone (see below).
Other 17α-substituted progesterone derivatives
19-Norprogesterone derivatives
Note that although an active progestogen, 19-norprogesterone is not medically used.
Testosterone derivatives
Note that testosterone itself does not have significant progestogenic activity. Testosterone is instead classified as an anabolic-androgenic steroid and is included here purely because it is the parent structure of this group of progestins.
17α-Ethynyltestosterone derivatives
19-Nortestosterone derivatives
Note that while nandrolone (19-nortestosterone) does have significant progestogenic activity, it is not used as a progestogen. It is instead classified as an androgenic-anabolic steroid and is included here purely because it is an important parent structure of this group of progestins.
17α-Ethynyl-19-nortestosterone derivatives
Estranes
18-Methylestranes (13β-ethylgonanes)
Other 17α-substituted 19-nortestosterone derivatives
Spirolactone derivatives
Note that although an active progestogen, SC-5233 (spirolactone) is not medically used.
See also
List of steroids
List of progestogen esters
Notes
? = Chemical names that are unverified.
Further reading
Progestogens
Steroids
Progestogens | List of progestogens | [
"Chemistry"
] | 546 | [
"nan"
] |