id | url | text | source | categories | token_count | subcategories |
|---|---|---|---|---|---|---|
70,334,998 | https://en.wikipedia.org/wiki/Fatiha%20Alabau | Fatiha Alabau-Boussouira (born 1961) is a French applied mathematician specializing in the control theory of partial differential equations. She is affiliated with the Laboratoire Jacques-Louis Lions of Sorbonne University as an external member, a professor at the University of Lorraine in the mathematics department of its Metz campus, and a former president of the Société de Mathématiques Appliquées et Industrielles, a French society for applied mathematics.
Education and career
Alabau was born 27 August 1961 in Montmorency, Val-d'Oise. She earned a diplôme d'études approfondies in numerical analysis in 1984 at Pierre and Marie Curie University, where she defended her doctoral thesis in 1987 under the supervision of Roland Glowinski.
After postdoctoral research as a visiting assistant professor at Arizona State University, she became a maître de conférences at the University of Bordeaux 1 in 1988, and earned a habilitation there in 1996. She became a professor at Louis Pasteur University in 1997, and moved to Paul Verlaine University – Metz (which later became part of the University of Lorraine) in 1999.
Service and recognition
Alabau was president of the Société de Mathématiques Appliquées et Industrielles from 2014 to 2017.
References
1961 births
Living people
20th-century French mathematicians
Control theorists
21st-century French mathematicians
People from Val-d'Oise
Pierre and Marie Curie University alumni
20th-century French women mathematicians
21st-century French women mathematicians | Fatiha Alabau | [
"Engineering"
] | 306 | [
"Control engineering",
"Control theorists"
] |
70,337,248 | https://en.wikipedia.org/wiki/List%20of%20Max%20original%20films | Max is an over-the-top subscription service owned and operated by Warner Bros. Discovery. It distributes a number of original films, documentaries, and specials, alongside their slate of television series. The shows produced for Max are dubbed "Max Originals". Max Originals are specifically made for audiences outside the traditional baseline HBO brand, while simultaneously working in parity with the HBO library. Content that is based on new and existing properties from WBD's subsidiaries will be distributed through Max.
Original films
Feature films
Documentaries
Specials
Stand-up comedy
Exclusive films
The following films premiered on the service without being labeled as Max Originals.
Upcoming original films
Feature films
Documentaries
Specials
In development
Notes
References
Internet-related lists
Warner Bros. Discovery-related lists
Lists of films by studio | List of Max original films | [
"Technology"
] | 153 | [
"Computing-related lists",
"Internet-related lists"
] |
70,338,324 | https://en.wikipedia.org/wiki/List%20of%20Twitter%20features | X, commonly called under the former name Twitter, is an American microblogging and social networking service on which users post and interact with messages known as "tweets". Registered users can post, like and retweet tweets, and read those that are publicly available.
Twitter structure
Tweets
Tweets, a term for short posts, are publicly visible by default, but senders can restrict message delivery to only their followers. Users can mute users they do not wish to interact with, block accounts from viewing their tweets and remove accounts from their followers list. Users can tweet via the Twitter website, compatible external applications (such as for smartphones), or by Short Message Service (SMS) available in certain countries. Users may subscribe to other users' tweets—this is known as "following" and subscribers are known as "followers" or "tweeps", a portmanteau of Twitter and peeps. Individual tweets can be forwarded by other users to their own feed, a process known as a "retweet", a term for reposting. In 2015, Twitter launched "quote tweet" (originally called "retweet with comment"), a feature that allows users to add a comment to their retweet, nesting one tweet in the other. Users can also "like" (formerly "favorite") individual tweets.
The counters for "likes", "retweets", and replies appear next to the respective buttons in news feeds, called timelines, such as on profile pages and search results. Counters for likes and retweets also exist on a tweet's standalone page. Since September 2020, quote tweets, formerly known as "retweet with comment", have their own counter on their tweet page. Until the legacy desktop front end was discontinued in 2020, a row of miniature profile pictures of up to ten liking or retweeting users was displayed (the earliest documented implementation appeared in the December 2011 overhaul), as well as a tweet reply counter next to the corresponding button on a tweet's page.
Twitter allows users to update their profile via their mobile phone either by text messaging or by apps released for certain smartphones and tablets. Twitter has been compared to a web-based Internet Relay Chat (IRC) client. In a 2009 Time magazine essay, technology author Steven Johnson described the basic mechanics of Twitter as "remarkably simple":
According to research published in April 2014, around 44% of user accounts have never tweeted.
The first tweet was posted by Jack Dorsey (creator) at 12:50 pm PST on March 21, 2006, and read "just setting up my twttr". In 2009, the first tweet was sent from space: US astronauts Nicole Stott and Jeff Williams took part in a live 'tweetup' from the International Space Station with around 35 members of the public at NASA Headquarters in Washington, D.C.
In March 2021, Jack Dorsey listed his first tweet for sale. The highest bid for the tweet, $2.5 million, came from a Malaysian businessman, Sina Estavi. Along with the metadata of the original tweet, the buyer was to receive a certificate that was digitally signed and verified by Dorsey.
Content
San Antonio-based market-research firm Pear Analytics analyzed 2,000 tweets (originating from the United States and in English) over a two-week period in August 2009 from 11:00 am to 5:00 pm (CST) and separated them into six categories. Pointless babble made up 40%, with 38% being conversational. Pass-along value had 9%, self-promotion 6% with spam and news each making 4%.
Despite Jack Dorsey's own open contention that a message on Twitter is "a short burst of inconsequential information", social networking researcher Danah Boyd responded to the Pear Analytics survey by arguing that what the Pear researchers labeled "pointless babble" is better characterized as "social grooming" or "peripheral awareness" (which she justifies as persons "want[ing] to know what the people around them are thinking and doing and feeling, even when co-presence isn't viable"). Similarly, a survey of Twitter users found that a more specific social role of passing along messages that include a hyperlink is an expectation of reciprocal linking by followers.
Format
Hashtags, usernames, retweets and replies
Users can group posts together by topic or type by use of hashtags – words or phrases prefixed with a "#" sign. Similarly, the "@" sign followed by a username is used for mentioning or replying to other users.
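The hashtag and mention conventions described above can be illustrated with a simple extraction sketch. This is an illustrative regex only, not Twitter's actual tokenizer (the open-source twitter-text library handles Unicode word characters and many edge cases):

```python
import re

# Simplified patterns; Twitter's real tokenizer also handles Unicode
# word characters, leading punctuation, and many other edge cases.
HASHTAG = re.compile(r"#(\w+)")
MENTION = re.compile(r"@(\w+)")

def extract_entities(tweet: str) -> dict:
    """Return the hashtags and mentions found in a tweet's text."""
    return {
        "hashtags": HASHTAG.findall(tweet),
        "mentions": MENTION.findall(tweet),
    }

print(extract_entities("Great match! #WorldCup thanks @FIFAcom"))
```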
In 2014, in anticipation for the FIFA World Cup, Twitter introduced hashflags, special hashtags that automatically generate a custom emoji next to them for a certain period of time, following the success of a similar campaign during the 2010 World Cup. Hashflags may be generated by Twitter themselves (such as to raise awareness for social issues) or be purchased by corporations (such as to promote products and events).
To repost a message from another Twitter user and share it with one's own followers, a user can click the retweet button within the tweet. Users can reply to other accounts' replies. Since November 2019, users can hide replies to their messages. Since May 2020, users can select who can reply to each of their tweets before sending them: anyone, accounts that follow the poster, specific accounts, or no one. This ability was upgraded in July 2021 to make the feature applicable to tweets after they have been sent out.
Twitter Lists
In late 2009, the "Twitter Lists" feature was added, making it possible for users to follow ad hoc lists of authors instead of individual authors.
Using SMS
Through SMS, users can communicate with Twitter through five gateway numbers: short codes for the United States, Canada, India, New Zealand, and an Isle of Man-based number for international use. There is also a short code in the United Kingdom which is only accessible to those on the Vodafone, O2 and Orange networks. In India, since Twitter only supports tweets from Bharti Airtel, an alternative platform called smsTweet was set up by a user to work on all networks. A similar platform called GladlyCast exists for mobile phone users in Singapore and Malaysia.
The tweets were set to a largely constrictive 140-character limit for compatibility with SMS messaging, introducing the shorthand notation and slang commonly used in SMS messages. The 140-character limit also increased the usage of URL shortening services such as bit.ly, goo.gl, tinyurl.com, tr.im, and other content-hosting services such as TwitPic, memozu.com and NotePub to accommodate multimedia content and text longer than 140 characters. Since June 2011, Twitter has used its own t.co domain for automatic shortening of all URLs posted on its site, making other link shorteners unnecessary for staying within Twitter's 140 character limit.
In August 2019, Jack Dorsey's account was hacked by using Twitter's SMS to tweet feature to send crude messages. Days later, the ability to send a tweet via SMS was temporarily turned off.
In April 2020, Twitter discontinued the ability to receive SMS messages containing the text of new tweets in most countries.
Character limits
In 2016, Twitter announced that media such as photos, videos, and the person's handle would not count against the already constrictive 140-character limit. A photo attached to a tweet previously counted for a large chunk of it, about 24 characters. Attachments and links would also no longer be part of the character limit.
On March 29, 2016, Twitter introduced the ability to add a caption of up to 480 characters to each image attached to a tweet. This caption can be accessed by screen reading software or by hovering the mouse above a picture inside TweetDeck.
Since March 30, 2017, Twitter handles in replies sit outside the tweet itself and therefore no longer count towards the character limit; only Twitter handles newly added to the conversation count towards the limit.
In 2017, Twitter doubled their historical 140-character-limitation to 280. Under the new limit, glyphs are counted as a variable number of characters, depending upon the script they are from: most European letters and punctuation forms count as one character, while each CJK glyph counts as two so that only 140 such glyphs can be used in a tweet.
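The weighted counting rule can be sketched as follows. This is a simplification: the authoritative rules live in Twitter's open-source twitter-text library, and the "heavy" Unicode ranges below are illustrative, not exhaustive:

```python
def tweet_length(text: str) -> int:
    """Count a tweet's length, weighting CJK glyphs as two characters.

    Simplified sketch: the real twitter-text rules also normalize the
    text and use an exact list of weighted Unicode ranges.
    """
    # Illustrative heavy-weight ranges: Hiragana/Katakana,
    # CJK Unified Ideographs, Hangul syllables.
    heavy = [(0x3040, 0x30FF), (0x4E00, 0x9FFF), (0xAC00, 0xD7AF)]
    total = 0
    for ch in text:
        cp = ord(ch)
        total += 2 if any(lo <= cp <= hi for lo, hi in heavy) else 1
    return total

assert tweet_length("hello") == 5       # 5 European letters, weight 1 each
assert tweet_length("こんにちは") == 10  # 5 CJK glyphs, weight 2 each
```

Under this rule, a tweet of pure CJK text tops out at 140 glyphs, while European-script text can reach the full 280 characters.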
URL shortener
t.co is a URL shortening service created by Twitter. It is only available for links posted to Twitter and not available for general use. All links posted to Twitter use a t.co wrapper. Twitter created the service to try to protect users from malicious sites by warning users if a URL is potentially malicious before redirecting them, and uses the shortener to track clicks on links within tweets.
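The wrap-then-redirect pattern described here can be sketched as a minimal shortener with click counting. The class and method names are hypothetical; the real t.co service additionally checks destinations against malware blocklists before redirecting:

```python
import hashlib

class Shortener:
    """Minimal sketch of a wrapping URL shortener with click counting."""

    def __init__(self, domain: str = "t.co"):
        self.domain = domain
        self.links: dict[str, str] = {}   # short code -> original URL
        self.clicks: dict[str, int] = {}  # short code -> click count

    def wrap(self, url: str) -> str:
        # Derive a short, stable code from the URL.
        code = hashlib.sha1(url.encode()).hexdigest()[:8]
        self.links[code] = url
        self.clicks.setdefault(code, 0)
        return f"https://{self.domain}/{code}"

    def resolve(self, code: str) -> str:
        # The real service may first show a warning page if the
        # destination is flagged as potentially malicious.
        self.clicks[code] += 1
        return self.links[code]
```

Because every posted link is rewritten through the wrapper, the operator can both count clicks and interpose safety checks without the sender doing anything.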
Having used the services of third parties TinyURL and bit.ly, Twitter began experimenting with its own URL shortening service for private messages in March 2010 using the twt.tl domain, before it purchased the t.co domain. The service was tested on the main site using the accounts @TwitterAPI, @rsarver and @raffi. On September 2, 2010, an email from Twitter to users said they would be expanding the roll-out of the service to users. On June 7, 2011, Twitter announced that it was rolling out the feature.
t.co faced controversy under the ownership of Musk, as Twitter began blocking new Tweets from containing links to other social networks, such as Facebook, Instagram, and Mastodon. Tweets containing the networks could not be shared, and existing Tweets with links to the restricted sites would give an error upon attempting to visit the page via Twitter. The policy was soon reversed after extreme controversy.
Trending topics
A word, phrase, or topic that is mentioned at a greater rate than others is said to be a "trending topic". Trending topics become popular either through a concerted effort by users or because of an event that prompts people to talk about a specific topic. These topics help Twitter and their users to understand what is happening in the world and what people's opinions are about it. Websites that track and display trending topics, like TwitterTrend.co, provide real-time information on the most discussed topics worldwide, offering users insights into regional and global trends.
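The "greater rate than others" idea can be made concrete with a toy rate-based detector. This is illustrative only; Twitter's actual trend algorithm is proprietary and favors sharp spikes in velocity over consistently popular terms:

```python
from collections import Counter

def trending(current: Counter, baseline: Counter,
             min_count: int = 10) -> list[str]:
    """Rank terms by how far their current mention count exceeds
    their historical baseline. Toy sketch, not Twitter's algorithm."""
    scores = {}
    for term, count in current.items():
        if count < min_count:
            continue  # ignore low-volume noise
        expected = baseline.get(term, 1)  # smooth unseen terms
        scores[term] = count / expected
    return sorted(scores, key=scores.get, reverse=True)

current = Counter({"#earthquake": 500, "lol": 400})
baseline = Counter({"#earthquake": 5, "lol": 1000})
print(trending(current, baseline))
```

Note how "lol", despite more raw mentions than usual topics, ranks below "#earthquake" because its volume is normal relative to its baseline.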
Trending topics are sometimes the result of concerted efforts and manipulations by fans of certain celebrities or cultural phenomena, particularly musicians like Lady Gaga (known as Little Monsters), Justin Bieber (Beliebers), Rihanna (Rih Navy) and One Direction (Directioners), and novel series Twilight (Twihards) and Harry Potter (Potterheads). Twitter has altered the trend algorithm in the past to prevent manipulation of this type with limited success.
The Twitter web interface displays a list of trending topics on a sidebar on the home page, along with sponsored content (see image).
Twitter often censors trending hashtags that are claimed to be abusive or offensive. Twitter censored the #Thatsafrican and #thingsdarkiessay hashtags after users complained that they found the hashtags offensive. There are allegations that Twitter removed #NaMOinHyd from the trending list and added an Indian National Congress-sponsored hashtag. President Donald Trump protested trends calling them "unfair, disgusting, illegal, ridiculous" claiming the ones that are bad about him are blown up.
Examples of high-impact topics include the wildfires in San Diego, the earthquake in Japan, popular sporting events, and political uprisings in Iran and Egypt.
In 2019, 20% of the global trends were found to be fake, created automatically using fake and compromised accounts originating from Turkey. It is reported that 108,000 accounts were employed since 2015 to push 19,000 keywords such as advertisements and political campaigns, to top trends in Turkey by bulk tweeting.
Moments
In October 2015, Twitter introduced "Moments"—a feature that allows users to curate tweets from other users into a larger collection. Twitter initially intended the feature to be used by its in-house editorial team and other partners; they populated a dedicated tab in Twitter's apps, chronicling news headlines, sporting events, and other content. In September 2016, creation of moments became available to all Twitter users. On December 7, 2022, Twitter announced that it would be removing the ability to create new moments to focus on other experiences.
Adding and following content
There are numerous tools for adding content and for monitoring content and conversations, including Twitter's own TweetDeck, Salesforce.com, HootSuite, and Twitterfeed.com. Fewer than half of tweets were posted using the web user interface, with most users using third-party applications (based on an analysis of 500 million tweets by Sysomos).
Verified accounts
In June 2009, after being criticized by Kanye West and sued by Tony La Russa over unauthorized accounts run by impersonators, the company launched their "Verified Accounts" program. Twitter stated that an account with a "blue tick" verification badge indicates "we've been in contact with the person or entity the account is representing and verified that it is approved". In July 2016, Twitter announced a public application process to grant verified status to an account "if it is determined to be of public interest" and that verification "does not imply an endorsement". Verified status allows access to some features unavailable to other users, such as only seeing mentions from other verified accounts.
In November 2020, Twitter announced a relaunch of its verification system in 2021. According to the new policy, Twitter verifies six different types of accounts; for three of them (companies, brands, and influential individuals like activists), the existence of a Wikipedia page will be one criterion for showing that the account has "Off Twitter Notability". Twitter states that it will re-open public verification applications at some point in "early 2021".
Mobile
Twitter has mobile apps for iPhone, iPad, Android, Windows 10, Windows Phone, BlackBerry, and Nokia S40. Users can also tweet by sending SMS. In April 2017, Twitter introduced Twitter Lite, a progressive web app designed for regions with unreliable and slow Internet connections, with a size of less than one megabyte, designed for devices with limited storage capacity.
This was released in countries with slow internet connections, such as the Philippines.
Twitter Lite has since evolved into the main Twitter web interface (see the "Interface" section).
Third-party applications
For many years, Twitter limited the use of third-party applications accessing the service by implementing a 100,000-user limit per application. Since August 2010, third-party Twitter applications have been required to use OAuth, an authentication method that does not require users to enter their password into the authenticating application. This was done to increase security and improve the user experience. As of 2023, third-party applications are prohibited under the Twitter API terms of service, which prohibit the use of the API to "create or attempt to create a substitute or similar service or product to the Twitter Applications".
Related headlines feature
This feature adds websites to the bottom of a tweet's permalink page. If a website embeds a tweet in one of its stories, the tweet's permalink page shows the websites that mentioned it. The feature was added so that viewers who do not understand what a tweet means can click through to the sites and read more about what the person is talking about.
Polls
In 2015, Twitter began to roll out the ability to attach poll questions to tweets. Polls are open for up to 7 days, and voters are not personally identified.
Initially, polls could have only two options, with a maximum of twenty characters per option. Support for up to four options, with up to 25 characters per option, was added later.
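The current limits can be captured in a small validation sketch. The function and constant names are hypothetical; the values mirror the limits described in this section:

```python
MAX_OPTIONS = 4
MAX_OPTION_LENGTH = 25
MAX_DURATION_DAYS = 7

def validate_poll(options: list[str], duration_days: int) -> None:
    """Raise ValueError if a poll violates the limits described above."""
    if not 2 <= len(options) <= MAX_OPTIONS:
        raise ValueError("a poll needs between 2 and 4 options")
    for opt in options:
        if len(opt) > MAX_OPTION_LENGTH:
            raise ValueError(f"option too long: {opt!r}")
    if not 0 < duration_days <= MAX_DURATION_DAYS:
        raise ValueError("polls can be open for at most 7 days")
```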
Integrated photo-sharing service
On June 1, 2011, Twitter announced its own integrated photo-sharing service that enables users to upload a photo and attach it to a Tweet right from Twitter.com. Users now also have the ability to add pictures to Twitter's search by adding hashtags to the tweet. Twitter also plans to provide photo galleries designed to gather and syndicate all photos that a user has uploaded on Twitter and third-party services such as TwitPic.
Streaming video
In 2016, Twitter began to place a larger focus on live streaming video programming, hosting various events including streams of the Republican and Democratic conventions during the U.S. presidential campaign as part of a partnership with CBS News, Dreamhack and ESL esports events, and winning a bid for non-exclusive streaming rights to ten NFL Thursday Night Football games in the 2016 season.
During an event in New York in May 2017, Twitter announced that it planned to construct a 24-hour streaming video channel hosted within the service, featuring content from various partners. CEO Jack Dorsey stated that the digital video strategy was part of a goal for Twitter to be "the first place that anyone hears of anything going on that matters to them"; as of the first quarter of 2017, Twitter had over 200 content partners, who streamed over 800 hours of video over 450 events.
Twitter announced a number of new and expanded partnerships for its streaming video services at the event, including Bloomberg, BuzzFeed, Cheddar (Opening Bell and Closing Bell shows; the latter was introduced in October 2016), IMG Fashion (coverage of fashion events), Live Nation Entertainment (streaming concert events), Major League Baseball (weekly online game stream, plus a weekly program with live look-ins and coverage of trending stories), MTV and BET (red carpet coverage for their MTV Video Music Awards, MTV Movie & TV Awards, and BET Awards), NFL Network (the Monday-Thursday news program NFL Blitz Live, and Sunday Fantasy Gameday), the PGA Tour (PGA Tour Live coverage of early tournament rounds preceding television coverage), The Players' Tribune, Ben Silverman and Howard T. Owens' Propagate (daily entertainment show #WhatsHappening), The Verge (weekly technology show Circuit Breaker: The Verge's Gadget Show), Stadium (a new digital sports network being formed by Silver Chalice and Sinclair Broadcast Group) and the WNBA (weekly game).
Account archival
Twitter has offered two methods of archiving one's own Twitter account data, each with its own benefits and disadvantages. As of September 2019, only the latter archival method is available.
Browsable legacy Twitter archive format
In December 2012, Twitter introduced a "Tweet archival" feature, which created a ZIP file containing an offline-browsable archive of all tweets. Those exported tweets could be browsed and searched offline using the bundled user interface, accessible through a web browser, which used client-side, JavaScript-powered pagination. The user interface of the tweet archive browser had a design similar to Twitter's 2010–2014 desktop user interface, even until the feature's removal. The tweet text contents, IDs, time data, and source labels were located in a file called "tweets.csv". It was possible to request at least one archive per day. The ability to export this type of tweet archive, which never existed on the new layout, was removed entirely in August 2019, after co-existing with the newer 2018 data archival method. Even when accessing the legacy Twitter desktop website layout using the user agent of an older browser version, the option has disappeared from the account settings.
Spaces
Twitter Spaces is a social audio feature that enables users to host or participate in a live-audio virtual environment called space for conversation. Spaces can accommodate an unlimited number of listeners. A maximum of 13 people (1 host, 2 co-hosts and 10 speakers) are allowed onstage. The feature was initially limited to users with at least 600 followers. Since October 21, 2021, any Twitter user can create a Space from the Android or iOS app.
Fleets
In March 2020, Twitter began to test a stories feature known as "fleets" in some markets, which officially launched on November 17, 2020. Similarly to equivalent features, fleets can contain text and media, are only accessible for 24 hours after they are posted, and are accessed within the Twitter app via an area above the timeline.
In June 2021, Twitter announced it would start implementing advertising into fleets, integrating full-screen ads among user-created content. On July 14, 2021, Twitter stated that it would remove fleets by August 3. Twitter had intended for fleets to encourage more users to tweet regularly, rather than simply consume other users' tweets, but instead fleets were generally used by users who already tweeted a lot. The company stated that their spot at the top of the screen would now be occupied by currently active Spaces from the user's feed.
Twitter Blue
On June 3, 2021, Twitter announced a service known as Twitter Blue, which provides features exclusive to those who are subscribers to the Twitter Blue service. They include:
Undo Tweet, which allows users to withdraw a tweet within a short time frame before it is posted.
Bookmarks, which allows users to save individual tweets into folders.
Reader mode, which converts threads of tweets into an article-like view.
Color themes for the Twitter mobile app.
Dedicated customer support.
The service was initially released in Australia and Canada. On November 9, 2021, Twitter Blue was launched for US customers.
Twitter Zero
Twitter Zero is an initiative undertaken by Twitter in collaboration with mobile phone-based Internet providers, whereby the providers waive data (bandwidth) charges—so-called "zero-rating"—for accessing Twitter on phones when using a stripped-down text-only version of the website. The images could be loaded by using the Twitter app. The stripped-down version is available only through providers who have entered the agreement with Twitter. Partners include Ncell, Reliance Communications, Ucell, Turkcell, Vodafone, Smart Communications, and XL Axiata.
Tip Jar
In May 2021, Twitter began testing a Tip Jar feature on its iOS and Android clients. The feature allows users to send monetary tips to certain accounts, providing a financial incentive for content creators on the platform. The Tip Jar is optional and users can choose whether or not to enable tips for their account. The day the feature was launched, a user discovered that sending a tip through PayPal would reveal the sender's address to the recipient.
On September 23, 2021, Twitter announced that it would allow users to tip other users on the social network with bitcoin, with the feature initially available to iOS users. Previously, users could tip with fiat currency using services such as Square's Cash App and PayPal's Venmo. Twitter said it would integrate the Strike bitcoin Lightning wallet service, and noted that it would not take a cut of any money sent through the tips feature.
The Shop Module
In July 2021, Twitter launched a test of The Shop Module, a shopping extension that directs customers to a brand's products from its official Twitter account. The feature initially launched for US-based users only and only on iOS.
Safety Mode
On September 1, 2021, Twitter began to roll out Safety Mode, allowing users to reduce disruptive interactions. The rollout began with a small beta-feedback group on iOS, Android, and Twitter's web application.
The functionality allows users to temporarily block accounts for seven days when potentially harmful language is detected. If a user has Safety Mode enabled, authors of tweets that are identified by Twitter's technology as harmful or as uninvited behavior will be temporarily unable to follow the account, send direct messages, or see tweets from that user during the temporary block period. Jarrod Doherty, senior product manager at Twitter, stated that the technology within Safety Mode assesses existing relationships to prevent blocking accounts that the user frequently interacts with.
Twitter first revealed Safety Mode in February 2021 within the Analyst Day slide deck.
NFT digital assets
On September 23, 2021, Twitter revealed that it was experimenting with a feature that would allow users to authenticate and showcase their collections of NFT digital assets on the platform. The feature was added on January 20, 2022, allowing Twitter Blue subscribers to connect their cryptocurrency wallet to display an NFT they own as a hexagon-shaped profile picture.
The ability to set new NFT profile pictures was silently removed in January 2024.
Live shopping
On November 22, 2021, Twitter announced a live shopping feature on its platform, with Walmart as the first retailer to test the new livestream shopping platform. The company stated that it is part of its continuing efforts to bring engaging experiences to customers that allow them to shop seamlessly while also being entertained.
Shops
Twitter allows companies to showcase up to 50 products for sale on their profiles, as part of new feature testing. Shops will help Twitter gain a piece of the $45 billion US market for social commerce.
Community Notes
See also
Twitter usage—How various people and organizations use Twitter
References
Software features
Twitter | List of Twitter features | [
"Technology"
] | 5,306 | [
"Software features"
] |
70,338,934 | https://en.wikipedia.org/wiki/Kepler-289 | Kepler-289 (PH3) is a rotating variable star slightly more massive than the Sun, with an unknown spectral type, 2370 light-years away from Earth in the constellation of Cygnus. In 2014, three exoplanets were discovered orbiting it.
Planetary system
Kepler-289 hosts four planets, three confirmed (Kepler-289b, Kepler-289c, Kepler-289d) and one unconfirmed candidate (Kepler-289e). The discovery of this system was made using the transit method. The inner three planets were found in 2014 with the Kepler space telescope and the Planet Hunters team, while planet e was discovered by follow-up studies in 2017.
References
Cygnus (constellation)
Planetary systems with three confirmed planets
J19495168+4252582
273234825 | Kepler-289 | [
"Astronomy"
] | 168 | [
"Cygnus (constellation)",
"Constellations"
] |
70,338,952 | https://en.wikipedia.org/wiki/Kepler-87 | Kepler-87 is a star slightly more massive than the Sun and it is nearing the end of its main-sequence period.
Planetary system
Kepler-87 hosts four planets, two confirmed (Kepler-87b, Kepler-87c) and two unconfirmed (Kepler-87d, Kepler-87e). It is the farthest system from the Sun with two unconfirmed planet candidates, at 4,021 light-years.
References
Cygnus (constellation)
G-type subgiants
Planetary systems with two confirmed planets
Kepler objects of interest | Kepler-87 | [
"Astronomy"
] | 111 | [
"Cygnus (constellation)",
"Constellations"
] |
70,339,007 | https://en.wikipedia.org/wiki/Kepler-167 | Kepler-167 is a K-type main-sequence star located about away from the Solar System in the constellation of Cygnus. The star has about 78% the mass and 75% the radius of the Sun, and a temperature of . It hosts a system of four known exoplanets. There is also a companion red dwarf star at a separation of about , with an estimated orbital period of over 15,000 years.
Planetary system
Kepler-167 is orbited by four known transiting exoplanets, discovered using the Kepler space telescope. The inner three planets are all super-Earths of unknown composition orbiting closer to their star than Mercury is to the Sun. The outermost planet, Kepler-167e, is a Jupiter analog, with , , and an equilibrium temperature of . It is the first transiting Jupiter analog discovered.
The inner two planets were confirmed in 2014, as part of a study validating hundreds of Kepler planets, and the outer two planets were confirmed in 2016. Observations of Kepler-167e using the Spitzer Space Telescope, published in 2019, ruled out significant transit timing variations, making it easier to predict future transits and plan follow-up observations. As a rare example of a long-period transiting gas giant, Kepler-167e is a target of interest for further observations, for example to characterize its atmosphere. To date, four transits of planet e have been detected, with both space-based and ground-based observations.
References
Cygnus (constellation)
Binary stars
K-type main-sequence stars
M-type main-sequence stars
Planetary systems with four confirmed planets
0490
J19303802+3820434 | Kepler-167 | [
"Astronomy"
] | 342 | [
"Cygnus (constellation)",
"Constellations"
] |
70,339,057 | https://en.wikipedia.org/wiki/List%20of%20Hulu%20original%20films | Beginning in 2019, streaming service Hulu began to produce its own original films. Its first film was Batman & Bill, a documentary about comic book writer Bill Finger. Its first narrative feature film was Little Monsters, an independently produced zombie comedy film.
Original films
Feature films
Documentaries
Specials
These programs are one-time events or supplementary content related to original films.
Co-distributed films
These films premiered in theatres (or, in one case, on another streaming service) but were co-distributed on Hulu when they received wide releases.
Upcoming original films
Feature films
Documentaries
Specials
Notes
References
External links
Hulu
Lists of films by studio | List of Hulu original films | [
"Technology"
] | 125 | [
"Computing-related lists",
"Internet-related lists"
] |
70,339,926 | https://en.wikipedia.org/wiki/Bridgman%20Award | The Bridgman Award is a prize given every two years by the International Association for the Advancement of High Pressure Science and Technology (AIRAPT) for research in the physics, chemistry, or technology of high pressure science. The award is named in honor of Percy Williams Bridgman, Nobel Prize winner and famous pioneer of the physics of high pressure.
Recipients
1977 Harry George Drickamer
1979 Boris Vodar, France
1981 E. Ulrich Franck (1920–2004), professor for physical chemistry at the University of Karlsruhe
1983 Albert Francis Birch (1903–1992), geophysicist and mineralogist, professor at Harvard University
1985 Nestor Joseph Trappeniers (1922–2004), professor in Amsterdam
1987 Francis P. Bundy (1910–2008), diamond synthesis under high pressure in 1954 at General Electric
1989 Ho-kwang Mao, Carnegie Institution, Washington D.C.
1991 Shigeru Minomura (1923–2000), professor at the Institute for Condensed Matter Research in Tokyo, later in Hokkaido and at Okayama University
1993 Arthur L. Ruoff, professor at Cornell University
1995 Bogdan Baranowski (1927–2014), professor of physical chemistry in Warsaw
1997 William A. Bassett (born 1931), professor of geology at Cornell University
1999 Vladimir Fortov
2001 William J. Nellis
2003 Neil Ashcroft
2005 Sergei Mikhailovich Stishov, professor and director of the Institute of High Pressure Physics of the Russian Academy of Sciences
2007 Takehiko Yagi, professor at the Institute of Condensed Matter Physics, University of Tokyo
2009 Russell J. Hemley, director of the Geophysical Laboratory, Carnegie Institution, Washington D.C.
2011 Eiji Ito, emeritus professor at Okayama University
2013 Karl Syassen
2015 Paul Loubeyre
2017 Mikhail Eremets
2019 Gilbert Collins
2021 Tetsuo Irifune
References
Science and technology awards
Awards established in 1977 | Bridgman Award | [
"Technology"
] | 382 | [
"Science and technology awards"
] |
70,340,520 | https://en.wikipedia.org/wiki/Phone%20repair%20with%20rice | Submerging a mobile device into rice is a common repair advice for devices that suffered from water damage. This technique has not been shown to be effective in repairing them. Submerging these devices into a desiccant may or may not be more effective than leaving them to dry in open air. Uncooked rice is inferior to other common desiccants such as silica gel or cat litter. Despite what has been said, it is not recommended as the starch and particles from the rice can get lodged inside the phone's inner parts.
History
Rice has traditionally been used to keep camera equipment and films dry in tropical environments.
In July 2007, less than a month after the original iPhone was released, a member of MacRumors named jorsuss started a thread titled "I dropped my iPhone in water". They covered the phone in rice, which may have been the first documented attempt to use the procedure on an iPhone.
See also
IP Code
Further reading
References
Smartphones
Misconceptions
Rice | Phone repair with rice | [
"Technology"
] | 207 | [
"Computing stubs"
] |
70,341,717 | https://en.wikipedia.org/wiki/Peacenotwar | peacenotwar is a piece of malware, which has been characterized as protestware, created by Brandon Nozaki Miller. In March 2022, it was added as a dependency in an update for node-ipc, a common JavaScript dependency.
Background
Between 7 March and 8 March 2022, Brandon Nozaki Miller, the maintainer of the node-ipc package on the npm package registry, released two updates allegedly containing malicious code targeting systems in Russia and Belarus. This code recursively overwrites all files on the user's system drive with heart emojis. A week later, Miller added the peacenotwar module as a dependency to node-ipc. The function of peacenotwar was to create a text file titled WITH-LOVE-FROM-AMERICA.txt on the desktop of affected machines, containing a message in protest of the Russo-Ukrainian War; it also added a dependency on the npm colors package, which could cause a denial of service (DoS) on any server using it.
Impact
Because node-ipc was a common software dependency, it compromised several other projects which relied upon it.
Among the affected projects was Vue.js, which required node-ipc as a dependency but did not pin a specific version, so some Vue.js users received the compromised release when the dependency was freshly fetched. Unity Hub 3.1 was also affected, but a patch was issued on the same day as the release.
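The Vue.js case shows how a loose semantic-version range lets a package manager silently pull in a newly published release. The sketch below is illustrative, not npm's actual resolver, and the version numbers are hypothetical; it mimics how a caret range such as `^9.2.0` admits any later release with the same major version, including a compromised one:

```python
# Minimal sketch of npm-style caret-range resolution (e.g. "^9.2.0"):
# a caret range accepts any version with the same major number that is
# not lower than the stated minimum. The package manager installs the
# highest match, which is how a freshly published malicious release
# can be pulled in automatically.

def parse(version):
    return tuple(int(x) for x in version.split("."))

def satisfies_caret(version, minimum):
    v, m = parse(version), parse(minimum)
    return v[0] == m[0] and v >= m

def resolve(available, caret_minimum):
    """Pick the highest available version matching the caret range."""
    matches = [v for v in available if satisfies_caret(v, caret_minimum)]
    return max(matches, key=parse) if matches else None

published = ["9.2.0", "9.2.1", "9.2.2"]  # 9.2.2: hypothetical malicious release
print(resolve(published, "9.2.0"))        # prints "9.2.2"
```

Pinning an exact version, or committing a lockfile, prevents this automatic upgrade path at the cost of manual updates. (Real caret ranges have extra rules for 0.x versions and pre-release tags that this sketch omits.)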
See also
npm left-pad incident
Supply chain attack
Hacktivism
Reactions to the 2022 Russian invasion of Ukraine
Anti-Russian sentiment
References
Internet-based activism
2022 in computing
Reactions to the Russian invasion of Ukraine
Malware | Peacenotwar | [
"Technology"
] | 354 | [
"Malware",
"Computer security exploits"
] |
70,342,229 | https://en.wikipedia.org/wiki/Allium%20pervestitum | Allium pervestitum is a species of wild garlic in the family Amaryllidaceae, mainly found growing in the coastal area of the Sea of Azov. It is a halophyte.
References
pervestitum
Halophytes
Flora of Ukraine
Flora of the Crimean Peninsula
Flora of South European Russia
Plants described in 1950 | Allium pervestitum | [
"Chemistry"
] | 68 | [
"Halophytes",
"Salts"
] |
70,342,891 | https://en.wikipedia.org/wiki/Apiotrichum%20mycotoxinivorans | Apiotrichum mycotoxinivorans (synonym Trichosporon mycotoxinivorans) is a yeast species purportedly useful in the detoxification of various mycotoxins. It was first isolated from the hindgut of the termite Mastotermes darwiniensis. It has been shown to detoxify mycotoxins such as ochratoxin A and zearalenone. It can occasionally become a human pathogen.
References
Further reading
Tremellomycetes
Fungal pathogens of humans
Fungus species | Apiotrichum mycotoxinivorans | [
"Biology"
] | 112 | [
"Fungi",
"Fungus species"
] |
70,343,570 | https://en.wikipedia.org/wiki/Tim%20Hawarden | Timothy George Hawarden (24 December 1943 – 10 November 2009) was a South African astrophysicist known for his pioneering work on passive cooling techniques for space telescopes for which he won NASA's Exceptional Technology Achievement Medal.
Biography
Hawarden was born in Mossel Bay, Cape Province, South Africa. He graduated from the University of Natal in 1966 with a BSc in Physics and Applied Mathematics, and then graduated from the University of Cape Town with an MSc in Astronomy in 1970 and a PhD in 1975 on old open clusters. While undertaking his PhD he worked as an optical astronomer at the Royal Observatory, Cape of Good Hope and then from 1972 as the Deputy Head of the Photometry Department at the South African Astronomical Observatory in Cape Town. In 1975 he became the Deputy Astronomer-in-Charge of the UK Schmidt Telescope at the Siding Spring Observatory in New South Wales, Australia.
In 1978 he moved to work at the Royal Observatory in Edinburgh, Scotland, from which he was based for the rest of his career. In 1981 he began working on the United Kingdom Infrared Telescope in Hawaii. In 1987 he moved to Hawaii and led the telescope's ambitious upgrades programme throughout the 1990s. He returned to Edinburgh in 2001 and became the UK Astronomy Technology Centre Project Scientist developing extremely large telescopes (ELT) before retiring in 2006 to care for his wife Frances. He remained active in the field of astronomy until his sudden death in Edinburgh in 2009.
Passive cooling of space telescopes
Hawarden was involved in the development of the Infrared Space Observatory as the Co-Investigator for the infrared camera (ISOCAM) but he considered the cryogenic cooling system "horrendously complicated". The dependency of infrared space telescopes on cryogenic cooling limited the telescope's lifespan as well as adding significant weight. In the early 1980s Hawarden began developing the idea of using passive cooling for infrared space telescopes through a combination of radiators, sunshields, and by locating the telescope further from Earth. Having a telescope orbit the Sun–Earth L2 Lagrange point enables the sunshield to shelter the telescope from the radiant heat of the Sun, the Earth, and the Moon. A passively cooled telescope is significantly lighter and permits much larger optics and instruments.
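The benefit of radiating to deep space can be illustrated with the Stefan–Boltzmann law: a shielded telescope settles at the temperature where its radiating surface sheds the residual heat load leaking past the sunshield. The numbers below are round illustrative values, not figures for any particular mission:

```python
# Equilibrium temperature of a passively cooled radiator: the radiator
# settles where emitted power equals the incoming heat load,
#   P = epsilon * sigma * A * T**4   =>   T = (P / (epsilon * sigma * A)) ** 0.25

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def equilibrium_temperature(load_w, area_m2, emissivity):
    return (load_w / (emissivity * SIGMA * area_m2)) ** 0.25

# Illustrative values: 1 W of parasitic heat leaking past the sunshield,
# radiated by a 10 m^2 surface with emissivity 0.9.
t = equilibrium_temperature(1.0, 10.0, 0.9)
print(f"Equilibrium temperature: {t:.0f} K")  # about 37 K for these values
```

The fourth-power dependence means even a sixteenfold increase in heat load only doubles the equilibrium temperature, which is why a good sunshield alone can reach the tens of kelvin needed for infrared astronomy.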
In 1989 Hawarden proposed such a telescope, the Passively Cooled Orbiting Infrared Observatory Telescope (POIROT) to the European Space Agency but the design was rejected. In 1991 Hawarden and Harley Thronson proposed a similar design to NASA for the Edison project but the proposal was also rejected. The ideas continued to face resistance though some passive cooling was incorporated into the design of the diameter Spitzer Space Telescope launched in 2003. The ideas were later adopted in full for the diameter James Webb Space Telescope launched in 2021.
In 2010 Hawarden was posthumously awarded the NASA Exceptional Technology Achievement Medal for his work on passive cooling techniques, the award citing "the breakthrough concepts that made possible the James Webb Space Telescope and its successors". The award was accepted on behalf of Hawarden's widow Frances by the Nobel-laureate physicist John C. Mather.
References
Astrophysicists
1943 births
2009 deaths
South African astronomers
Fellows of the Royal Astronomical Society
University of Cape Town alumni
South African emigrants to the United Kingdom
People from Mossel Bay
20th-century astronomers
21st-century astronomers | Tim Hawarden | [
"Physics"
] | 673 | [
"Astrophysicists",
"Astrophysics"
] |
70,343,619 | https://en.wikipedia.org/wiki/1973%20Sale%20and%20Purchase%20Agreement | The 1973 Sale and Purchase Agreement was a 20-year agreement pressured by the Shah of Iran on the oil consortium that nullified The Consortium Agreement of 1954 and provided the National Iranian Oil Company with complete control of Iranian petroleum nationalizing the nation's oil reserves. By 1975, western oil companies complained of the agreement and demanded renegotiation marking the first time in history that oil companies rather than the oil producing nations sought to negotiate an oil contract.
Background
The Shah had ambitions for a "Great Civilization" in which Iran would one day become the largest oil producer in the world. He used Iran's development to establish himself as a strongman across the Middle East and the Persian Gulf and to create an oil oligarchy controlling the price of oil. In the 1960s, the Shah initiated the formation and organization of the large oil-exporting countries into what would become known as the Organization of the Petroleum Exporting Countries (OPEC). The Shah also sought to terminate the 1954 Consortium Agreement, an effort that was finalized with the 1973 Sale and Purchase Agreement. In response to the creation of OPEC, Maurice Bridgman of British Petroleum warned the National Iranian Oil Company about
According to Dr. Parviz Mina, an expert on Iranian oil affairs and a former director of the National Iranian Oil Company,
Result
In the summer of 1973, Iran exploited the supply shortage to double crude oil prices further reducing the power of the oil consortium. Political balance shifted from oil companies to oil producing nations.
References
Petroleum politics
Treaties of Iran
Cold War history of Iran
OPEC | 1973 Sale and Purchase Agreement | [
"Chemistry"
] | 309 | [
"Petroleum",
"Petroleum politics"
] |
73,208,141 | https://en.wikipedia.org/wiki/Keanumycin | Keanumycins are a group of chemical compounds isolated from bacteria in the genus Pseudomonas. They are classified as nonribosomal lipopeptides and they have a variety of antimicrobial activities. Keanumycin A is active against the amoebas Dictyostelium discoideum (IC50 =
4.4 nM), Acanthamoeba castellanii (IC50 = 2.0 μM), and Acanthamoeba comandoni (IC50 = 3.1 μM) which cause infections in humans.
Fermentation broth containing keanumycins is effective against the fungus Botrytis cinerea and can be directly applied to plants to stop the development of Botrytis blight.
Researchers at the Hans Knöll Institute who first isolated and characterized these compounds named them after the actor Keanu Reeves because their deadliness was perceived to be comparable to Reeves in his film roles.
Chemical structures
References
Antimicrobial peptides
Depsipeptides
Keanu Reeves | Keanumycin | [
"Chemistry"
] | 217 | [
"Molecular biology stubs",
"Molecular biology"
] |
73,211,394 | https://en.wikipedia.org/wiki/High%20Seas%20Treaty | The United Nations agreement on biodiversity beyond national jurisdiction or BBNJ Agreement, also referred to by some stakeholders as the High Seas Treaty or Global Ocean Treaty, is a legally binding instrument for the conservation and sustainable use of marine biological diversity of areas beyond national jurisdiction. There is some controversy over the popularized name of the agreement. It is an agreement under the United Nations Convention on the Law of the Sea (UNCLOS). The text was finalised during an intergovernmental conference at the UN on 4 March 2023 and adopted on 19 June 2023. Both states and regional economic integration organizations can become parties to the treaty.
In 2017, the United Nations General Assembly (UNGA) had voted to convene an intergovernmental conference (IGC) to consider establishing an international legally binding instrument (ILBI) on the conservation and sustainable use of biodiversity beyond national jurisdiction (BBNJ). This was considered necessary because UNCLOS did not provide a framework for areas beyond national jurisdiction. There was a particular concern for marine biodiversity and the impact of overfishing on global fish stocks and ecosystem stability.
The treaty addresses four themes: (1) marine genetic resources (MGRs) and their Digital sequence information, including the fair and equitable sharing of benefits; (2) area-based management tools (ABMTs), including marine protected areas (MPAs); (3) environmental impact assessments (EIAs); and (4) capacity building and transfer of marine technology (CB&TMT). The area-based management tools and environmental impact assessments relate mainly to conservation and sustainable use of marine biodiversity, while the marine genetic resources and capacity building and transfer of marine technology include issues of economic justice and equity.
Greenpeace called it "the biggest conservation victory ever". The main achievement is the new possibility to create marine protected areas in international waters. By doing so the agreement now makes it possible to protect 30% of the oceans by 2030 (part of the 30 by 30 target). Though the agreement does not directly address climate change, it also serves as a step towards protecting the ecosystems that store carbon in sediments.
The treaty has 75 articles and its main purpose is "to take stewardship of the world’s oceans for present and future generations, care for and protect the marine environment and ensure its responsible use, maintain the integrity of undersea ecosystems and conserve marine biological diversity’s inherent value". The treaty recognizes traditional knowledge. It has articles regarding the "polluter-pays" principle, and different impacts of human activities including areas beyond the national jurisdiction of the countries making those activities. The agreement was adopted by the 193 United Nations Member States.
Before the treaty can enter into force, it needs to be ratified by at least 60 UN member states, a process likely to take some time. Its parent convention, UNCLOS, was adopted in 1982 and did not enter into force until 1994; it now has 170 parties. The European Union pledged financial support for the process of ratification and implementation of the treaty.
Context
The world's oceans are facing a severe decline in biodiversity and degradation of ecosystems due to threats related to climate change and the expansion of human activities, such as shipping, overfishing, plastic pollution and deep-sea mining. Consequently, there is a pressing need for a more cohesive ocean governance framework, since the existing framework is too fragmented and incomplete to effectively secure the conservation and sustainable use of marine biodiversity in areas beyond national jurisdiction. The High Seas treaty aims to address these regulatory gaps by promoting coherence and coordination with and among existing institutions, frameworks, and bodies.
The areas beyond national jurisdiction comprise the 'high seas' (water column) and the ‘area’ (seabeds), making up about two-thirds of the ocean. The areas are currently regulated by different regional and sectoral agreements, such as regional fisheries management organisations (RFMOs). However, they can only implement measures within their own respective mandates and cooperation is lacking. Additionally, only a few areas are covered, leaving the majority effectively unregulated. The remaining one-third of the ocean falls under national jurisdiction and is situated within the exclusive economic zones (EEZs). The exclusive economic zones extend 200 nautical miles (about 370 km) from the territorial sea baseline. The zones are established under UNCLOS, giving coastal states the jurisdiction over the living and non-living resources within the water and the seabeds.
History
A new agreement under UNCLOS for areas beyond national jurisdiction has been discussed at the United Nations for almost 20 years. The United Nations began preparatory meetings in 2004 to lay the foundation for an Implementing Agreement to UNCLOS addressing governance and regulatory gaps.
On 24 December 2017, the United Nations General Assembly adopted Resolution 72/249 to convene an intergovernmental conference and undertake formal negotiations for a new international legally binding instrument under UNCLOS for the conservation and sustainable use of marine biological diversity in areas beyond national jurisdiction. Between 2018 and 2023, diplomats gathered at the UN Headquarters in New York City for six negotiating sessions.
The intergovernmental conference (IGC) convened a total of six sessions in 2018, 2019, 2022 and 2023 to negotiate the text for the BBNJ legal instrument:
During the first session in September 2018, the concept of 'Beyond National Jurisdiction' seemed to have a greater influence on positions taken than the direct concerns regarding 'Biodiversity' itself.
In the second session March/April 2019, it became clear that the principle stating that the new BBNJ agreement "should not undermine" existing institutions could be a hindrance, impeding progress towards achieving an effective instrument.
The third session in August 2019 evolved around the dichotomy between ‘the freedom of the seas’ and ‘the common heritage of mankind’ principles.
The fourth session was originally scheduled for 2020 but was postponed until March 2022 because of the COVID-19 pandemic. During the session, a lack of political will was observed, as states continued to object to substantive, key issues for a new treaty. Progress was nonetheless made on the four main elements: marine genetic resources (MGRs) including benefit sharing, area-based management tools (ABMTs) including marine protected areas (MPAs), environmental impact assessments (EIAs), and capacity building and the transfer of marine technology (CB&TMT).
The fifth round of talks in August 2022 failed to produce an agreement, due in part to significant disagreements over how to share benefits derived from marine genetic resources and digital sequence information. It was therefore agreed to suspend the session and resume it at a later date.
Agreement on a text was reached on 4 March 2023, after the sixth round of talks at the UN in New York, concluding almost two decades of work. With the words "the ship has reached the shore", Rena Lee, the president of the intergovernmental conference, announced the final agreement. The treaty opened for signature in New York City on 20 September 2023, a day after a summit on the Sustainable Development Goals, and remains open for signature for two years from that date.
In January 2024, Ambassador Ilana Seid presented Palau's ratification of the agreement, making Palau the first of the sixty ratifying states required for the treaty to enter into force.
The content of the treaty
Marine genetic resources (MGRs), including the fair and equitable sharing of benefits
Marine genetic resources (MGRs), including the fair and equitable sharing of benefits, is the first element mentioned in the treaty. Among other things, marine genetic resources can enable the production of biochemicals for use in cosmetics, pharmaceuticals and food supplements. Their economic value is for now unclear, but the potential for profit has increased stakeholder interest in their exploration and exploitation.
During the UN negotiations it has been a contentious point whether or not marine genetic resources should apply to ‘fish’ and ‘fishing activities’. If not, it would be likely to impact the ability of the High Seas treaty to address its objective, since fish are a major component of marine biodiversity and play an essential role in the functioning of marine ecosystems, according to some experts. However, the final treaty text states that the provisions about marine genetic resources do not apply to ‘fish’ and ‘fishing’ in areas beyond national jurisdiction.
The fair and equitable sharing of benefits has also been a point of dispute in the negotiations. In the end it was agreed to regulate non-monetary as well as monetary benefits. Furthermore, an access and benefit-sharing committee will be established to provide guidelines for benefit-sharing and to ensure that it is done in a transparent, fair, and equitable way.
Area-based management tools (ABMTs), including marine protected areas (MPAs)
Area-based management tools (ABMTs), including marine protected areas (MPAs) are recognized as key tools for conserving and restoring biodiversity. They can be used to protect, preserve and maintain certain areas beyond national jurisdiction. Marine protected areas offer a degree of long term conservation, and are already established in some areas. However, the protection level of biodiversity varies a lot and the protected areas only cover a small proportion of the areas beyond national jurisdiction. Area based management tools can be used for short-term and emergency measures and to address a specific sector.
The process to establish a tool or a protected area is as follows. First, a party to the High Seas treaty has to submit a proposal for an area-based management tool or a marine protected area. The proposal has to be based on the best available science and information. It will be made publicly available and transmitted to the Scientific and Technical Body for review, after which relevant stakeholders are consulted. The proposal has to be adopted by consensus or, if this is not possible, by a three-quarters majority of the representatives present and voting. The decision enters into force 120 days after the vote and is binding on all parties to the treaty; however, a party that objects within those 120 days may opt out.
After the treaty text was finalised, it was reported that the treaty, through marine protected areas, would protect 30 per cent of the oceans by 2030, a target adopted at the UN Biodiversity Conference (COP15) in December 2022. According to experts, however, this is not the case: the treaty can help implement the 30 by 30 biodiversity target in the oceans, but doing so will require a lot of action by states.
Environmental impact assessments (EIAs)
Environmental impact assessments have the potential to predict, reduce and prevent human activities affecting marine biodiversity and ecosystems. While the institutional and legal framework for environmental impact assessments is well established in areas within national jurisdiction, it is less developed in areas beyond. Under the treaty, participating parties are obliged to conduct environmental impact assessments when a planned activity may have an effect on the marine environment, or when there is insufficient knowledge about its potential effects. In such cases, the party possessing jurisdiction or control over the activity is required to conduct the assessment.
The treaty also includes provisions for Strategic Environmental Assessments (SEAs), which are assessments that are more holistic and focused on long-term environmental protection compared to the more specific focus of environmental impact assessments. Parties under the treaty have to consider conducting a strategic environmental assessment for plans and programmes related to their activities in areas beyond national jurisdiction, but are not obliged to conduct one.
Capacity building and the transfer of marine technology (CB&TMT)
Capacity building and the transfer of marine technology concerns equitable access to research conducted in international waters and enabling cooperation and participation in the activities outlined in the agreement. The agreement mentions several forms of capacity building and technology transfer, such as sharing information and research results; developing and sharing manuals, guidelines and standards; collaborating and cooperating in marine science; and developing and strengthening institutional capacity and national regulation or mechanisms.
Technology plays an important role in the implementation, making capacity building and technology transfer essential for the enforcement of the treaty. A key focus is to support developing and geographically disadvantaged states in implementing the agreement.
Furthermore, a capacity-building and transfer of marine technology committee will be established, in order to monitor and review the undertaken initiatives, under the authority of the Conference of the Parties.
Institutional Setup
The treaty introduces a new institutional framework in part VI about 'Institutional Arrangements', including the Conference of the Parties, the Scientific and Technical Body, the secretariat and the clearing-house mechanism.
The Conference of the Parties (COP) will hold its first meeting no later than one year after the treaty enters into force, at which the rules of procedure and the financial rules will be adopted. The Conference of the Parties will review and evaluate the implementation of the High Seas treaty. The Conference has to take decisions and adopt recommendations by consensus or, if consensus cannot be reached after all efforts have been exhausted, by a two-thirds majority of the parties present and voting. The Conference will also have to promote transparency in the implementation of the agreement and related activities. Five years after the treaty enters into force, the Conference of the Parties has to review the treaty.
The Scientific and Technical Body will be composed of members nominated by the parties and elected by the Conference of the Parties, serving as experts and in the best interest of the agreement. The need for multidisciplinary expertise has to be taken into account in the nomination and election of members. The Scientific and Technical Body will among other things provide scientific and technical advice to the Conference of the Parties, monitor and review area-based management tools and comment on environmental impact assessments.
The secretariat is responsible for providing administrative and logistical support to the Conference of the Parties and its subsidiary bodies. This includes tasks, such as arranging and servicing the meetings, as well as circulating information relating to the implementation of the treaty in a timely manner.
The clearing-house mechanism will work as an open-access platform, facilitating the access, provision, and dissemination of information. It will promote transparency and facilitate international cooperation and collaboration. The mechanism will be managed by the secretariat.
In addition, the treaty establishes an 'access and benefit-sharing committee', a 'capacity-building and transfer of marine technology committee', a 'finance committee on financial resources' and an 'implementation and compliance committee'. However, these are not mentioned in the section about institutional arrangements.
Financial support
The European Union pledged financial support for the process of ratification and implementation of the treaty.
See also
United Nations Convention on the Law of the Sea
2022 United Nations Biodiversity Conference
Kunming-Montreal Global Biodiversity Framework
High seas fisheries management
Nagoya Protocol to the Convention on Biological Diversity
Treaty on Intellectual Property, Genetic Resources and Associated Traditional Knowledge (GRATK)
References
External links
UN delegates reach historic agreement on protecting marine biodiversity in international waters (UN News, 5 March 2023)
Agreement under the United Nations Convention on the Law of the Sea on the conservation and sustainable use of marine biological diversity of areas beyond national jurisdiction (19 June 2023)
2023 in international relations
Anti-biopiracy treaties
Biopiracy
Law of the sea treaties
Marine conservation
Treaties of Belize
Treaties of Chile
Treaties of Cuba
Treaties of Mauritius
Treaties of the Federated States of Micronesia
Treaties of Monaco
Treaties of Palau
Treaties of Seychelles
United Nations treaties | High Seas Treaty | [
"Biology"
] | 3,157 | [
"Anti-biopiracy treaties",
"Biodiversity",
"Biopiracy"
] |
73,211,492 | https://en.wikipedia.org/wiki/Clothing%20physiology | Clothing physiology is a branch of science that studies the interaction between clothing and the human body, with a particular focus on how clothing affects the physiological and psychological responses of individuals to different environmental conditions. The goal of clothing physiology research is to develop a better understanding of how clothing can be designed to optimize comfort, performance, and protection for individuals in various settings, including outdoor recreation, occupational environments, and medical contexts.
Purpose of clothing
Human clothing motives are frequently oversimplified in cultural and sociological theories, with the assumption that they are solely motivated by modesty, adornment, protection, or sex. However, clothing is primarily motivated by the environment, with its form being influenced by human characteristics and traits, as well as physical and social factors such as sex relations, costume, caste, class, and religion. Ultimately, clothing must be comfortable in various environmental conditions to support physiological behavior. The concept of clothing has been aptly characterized as a quasi-physiological system that interacts with the human body.
Quasi-physiological systems
Clothing can be considered as a quasi-physiological system that interacts with the body in different ways, just like the distinct physiological systems of the human body, such as digestive system and nervous system, which can be analyzed systematically.
Purpose of clothing physiology
The acceptance and perceived comfort of a garment cannot be attributed solely to its thermal properties. Rather, the sensation of comfort when wearing a garment is associated with various factors, including the fit of the garment, its moisture buffering properties, and the mechanical characteristics of the fibers and fabrics used in its construction.
The field of clothing physiology concerns the complex interplay between the human body, environmental conditions, and clothing. Through the use of scientific methods, it is possible to accurately measure and quantify the effects of clothing on wearer comfort and overall well-being.
Louis Newburgh is widely recognized among thermal physiologists primarily due to his role as the editor of "Physiology of Heat Regulation and the Science of Clothing".
From a physiological perspective, the purpose of clothing is to shield the body from extreme temperatures, whether they be hot or cold. The role of clothing in affecting the wearer's comfort can be described as the connection between the body and the surroundings. When engaged in outdoor activities, the individual's comfort level is influenced by various environmental factors, such as air temperature, humidity, solar radiation, atmospheric and ground thermal radiation. The wearer's posture, metabolic rate, sweating rate, and bodily processes such as moisture absorption, sweat evaporation, and heat loss through conduction and convection via blood, are among additional factors that also play a role in determining the individual's comfort level.
Skin physiology
The contact between clothing and skin facilitates the regulation of body temperature through the control of blood flow and sweat evaporation in localized areas. However, the design of functional fabrics that efficiently regulate skin temperature must take into account crucial factors such as age, gender, and activity level.
The skin plays a vital role in safeguarding the body's homeostasis by performing a variety of crucial protective functions. Clothing and other textiles interact dynamically with the skin's functions, and the mechanical properties of the fabric, such as its surface roughness, can lead to non-specific skin reactions, such as wool intolerance or keratosis follicularis.
Thermal comfort and insulation
It is common to express metabolic activity in terms of heat production. A resting adult typically generates about 100 W of heat, much of which is dissipated through the skin. Heat production per unit area of skin for a resting individual, defined as 1 met, is around 58 W/m², based on the average European male's skin surface area of approximately 1.8 m²; the average European female's skin surface area is 1.6 m² for comparison.
Skin temperatures that correspond to comfort during stationary activities range from 91.4°F to 93.2°F (33°C to 34°C), and these temperatures decrease as the level of physical activity increases. Skin temperature that exceeds 45°C or falls below 18°C induces a sensation of pain. Internal temperatures increase with activity. The brain's temperature regulatory center is around 36.8°C when at rest and rises to about 37.4°C when walking and 37.9°C when jogging. A temperature below 28°C can cause fatal cardiac arrhythmia, while a temperature above 43°C can result in permanent brain damage. Thus, it's crucial to regulate body temperature carefully for both comfort and health.
Clothing insulation can be denoted using the unit of measurement called clo. In the absence of clothing, a thin layer of static air known as the boundary layer forms in close proximity to the skin, acting as an insulating layer that restricts heat exchange between the skin and the surrounding environment. This layer typically offers approximately 0.8 clo units of insulation in a motionless state. It's difficult to apply this generalization to very thin fabric layers or underwear, as they occupy an existing static air layer of no more than 0.5 cm thickness. Consequently, these thin layers offer minimal contribution to the clothing's intrinsic insulation.
A standard figure for clothing insulation is 1.57 clo·cm⁻¹ of fabric thickness, which is equivalent to about 4 clo·inch⁻¹.
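The unit relationships above can be checked with simple arithmetic. The following sketch uses only values quoted in the text; the constants and function names are ours, not any standard library:

```python
# Illustrative arithmetic for the met and clo units discussed above.

MET_W_PER_M2 = 58.0    # 1 met: resting heat production per unit skin area, W/m^2
CLO_PER_CM = 1.57      # typical clothing insulation per cm of fabric thickness

def resting_heat_output(skin_area_m2):
    """Total heat production (W) of a resting person with the given skin area."""
    return MET_W_PER_M2 * skin_area_m2

def insulation_clo(thickness_cm):
    """Approximate insulation (clo) of a clothing layer of given thickness."""
    return CLO_PER_CM * thickness_cm

print(resting_heat_output(1.8))      # ~104 W for the average European male
print(resting_heat_output(1.6))      # ~93 W for the average European female
print(round(CLO_PER_CM * 2.54, 2))   # ~3.99 clo/inch, matching the quoted 4
```

The first result is consistent with the "100 W" figure for a resting adult quoted above, and the last line confirms the cm/inch equivalence.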
Applications
The advancements in fibers, textiles, electronics, functional finishing, and clothing physiology are anticipated to improve human life in numerous areas such as medicine, military, firefighting, extreme sports, and other apparel applications. The study of clothing physiology has been prompted by the need to design effective clothing systems for various specialized environments such as space, polar regions, underwater operations, and industrial settings.
Clothing comfort
Comfort is a multifaceted concept that encompasses various perceptions, including physiological, social, and psychological needs. After sustenance, clothing is one of the most vital objects that can satisfy comfort requirements. This is because clothing offers a range of benefits, including aesthetic, tactile, thermal, moisture, and pressure comfort.
Protection
An athlete's physiological clothing comfort is significantly influenced by the compression exerted by their garments: the greater the compression load on the body, the more intense the sweating and the higher the resulting skin temperature.
Testing
Thermophysiological models have become a prevalent tool for forecasting human physiological reactions in varying environmental and clothing conditions.
Clothing physiology can be assessed with various advanced instruments. One example is Sherlock, a thermal manikin test device developed by the Hohenstein Institutes to evaluate clothing physiology, which is equipped with perspiration simulation capabilities.
SpaceTex experiment
In the SpaceTex experiment, novel fabrics were evaluated for their ability to enhance heat transfer and manage sweat during physical activity, based on their antibacterial properties. Quick-drying T-shirts made from such fabrics would be advantageous to athletes, firefighters, miners, and military personnel. This marks the first experiment in clothing physiology conducted in microgravity, with sportswear manufacturers aiming to improve their products accordingly. In fact, a modified polyester has already been developed for use by the Swiss military.
Certain precautions
The integumentary system is a significant immune organ, possessing both specific and non-specific activities related to immunity. Antimicrobial fabrics could potentially disrupt the skin's non-specific defense mechanisms such as antimicrobial peptides or the resident microflora.
Social psychology of clothing
The social psychology of dress entails comprehending the interconnections that exist between attire and human conduct.
See also
Technical textile
Textile performance
Personal protective equipment
Uniforms of the Canadian Armed Forces
References
Clothing
Physiology | Clothing physiology | [
"Biology"
] | 1,584 | [
"Physiology"
] |
73,212,538 | https://en.wikipedia.org/wiki/Bating%20%28leather%29 | Bating is a technical term used in the tanning industry to denote leather that has been treated with hen or pigeon manure, similar to puering (see puer), where the leather has been treated with dog excrement; in both cases the treatment was performed on the raw hide prior to tanning in order to render the skins, and the subsequent leather, soft and supple. Today, both practices are obsolete and have been replaced in the tanneries with other natural proteolytic enzymes.
Leather processing
Since early times, tanners have made use of either dog fæces, or hen and pigeon manure, in one of the early phases of leather treatment to produce a soft leather. A bath solution containing the animal extracts was made and the raw hide inserted and left there for a few days, which activated the bacteria and enzymes that reacted with the collagen in the animal skin to make the leather soft and supple. This step was followed by drenching, a term denoting skins that were thoroughly washed in a bath solution of bran (usually of barley or rye), or ash bark. This process was thought to open up the fibre, and, if lime (CaO) was used to remove hair before the actual bating, drenching removed excess or residual lime trapped in the leather.
Early inventors who concerned themselves with tanning looked upon bating as a process for removing lime from the skins, and nothing more, and since the use of animal fæces was repulsive, sought to substitute for them by inventing artificial bates. What they failed to realize, however, was that bating also acts upon the skin fibres, rendering portions of the skins soluble and bringing about the finished condition. One of the early inventions made to replicate bating was the chemical use of old lime liquors (with high levels of ammonia) neutralized with sulphuric acid, a method which more nearly approximates the conditions of the dung.
Experimentation and research
Puering fell into disuse after the enzyme pancreatin began to be produced on an industrial scale between 1895 and 1897. By 1907, it was used by Otto Röhm in the tannery. J.T. Wood, investigating the microbial properties of dog fæces, was able to isolate several species of bacteria, determining that aged dog fæces was more potent (hence, more efficacious) than fresh dog fæces. Under the right conditions, the bacteria that settle on the excrement release the principal enzyme trypsin.
Natural bates
Primitive tanning methods differed from country to country, but the use of puering and bating was not prevalent in all of them, as tanners had moved away from their use and employed vegetable tanning which achieved nearly the same result. In western societies, modern tanning techniques tried to replicate the effect of puering and bating by using a natural bate. Papain, the active proteolytic enzyme found in the latex taken from the skin of the papaya fruit (Carica papaya), is thought to replicate the action of traditional puering and bating. The protein-digesting enzyme is now used extensively in the leather industry, and follows the dehairing of the animal skin, usually with lime and other proteolytic enzymes, and the deliming of the animal hide with mineral acid. This process is thought to release traces of lime still trapped in the hide after the deliming process, in addition to removing unwanted grease, besides aiding in the subsequent tanning process by the alteration of protein.
Today, in the modern tanning industry, where almost all innovations have been made by substituting chemical agents for vegetable tanning agents, bating is the only step in leather processing where the enzymatic process cannot be replaced by a chemical one, as bating imparts certain desired characteristics to the finished leather. Large-scale use of microbial enzymes, following the introduction of fermentation technology, has become standard in the tanning industry.
Enzymatic soaking of the raw hides has been shown to loosen the scud, initiate the opening of the fibre structure, and to render a leather product with less wrinkled grain when used at an alkaline pH of less than 10.5. In rabbit skins it improves the softness and elasticity, and increases the surface area yield of the fur by 3.3%. Bating also acts to hydrolyze casein, elastin, albumin, globulin-like proteins, and nonstructural proteins that are not essential for leather making.
Primitive practices
One of the earliest references to puering is found in the old rabbinic Minor tractate, Kallah Rabbati (end of chapter 7): "What is the reason that dogs were privileged to have books of the Law and doorpost scripts prepared from their excrement? It is because it says [of them]: 'not a dog shall bark against any of the people of Israel' (Exo. 11:7)." A record of primitive tanning bequeathed in the 12th century by Abraham ben Isaac of Narbonne (1085–1158) mentions the tanning method employed in his day, in southern France, where the treatment of the rawhide by puering was still in use and done after the hairs of the animal were removed by lime in preparation for writing a Torah scroll and the hide had once again become stiff:
After taking dry [sheep]-skins whose wool had been soaked [in lime water for removal], they leave them in the water for the duration of time needed for them to become soft [=soaking]. Afterwards, they put them inside a pit made for them, and they put therein a little dog fæces, having no prescribed quantity [=puering], and a little salt [is added thereto], and then they seal the mouth of the pit, leaving it there for one day in summer months, and three days in winter months, no longer [than the duration of that time], so that they be not eaten up. They then remove them and check them for holes, and if there be a hole found, they sew it, and then lay them out over a wooden frame that is prepared in advance [for this purpose] and they rinse them thoroughly with running water [=drenching], and then bring out a heaping batch of gallnuts which they then pound or grind thoroughly. They then put on each sheet of leather three litres of the Baghdad measure, and plaster thereon the gallnuts, over its two sides, and sprinkle a little water over them, and they put more gallnuts on that side of the leather where the hairs once were (grain layer) than what they do on the flesh-side [of the leather], doing likewise with each sheet of leather, the application [of gallnuts] made twice daily, while, on the third application, they once more plaster with what remains of the gallnuts [onto the leather] and lay it out in the sun, for the duration of time that it takes for it to whiten, leaving it in that state until it dries [=tanning]. They afterwards shake-off the excess gallnuts and then cut the leather.
Tanners in Egypt in the 12th century and in Yemen of late made use of different methods in varying degrees, yet without the use of puering and bating, and without the use of gallnuts. Rather, after soaking and fleshing, tanners utilized the tannins found in the ground leaves and crushed tender stems of Acacia (Acacia etbaica and Acacia nilotica kraussiana), with which a bath solution was made and the raw hides inserted and left there for about two weeks, constantly stirring and changing the water after one week. In some places in Yemen, the leaves of African rue (Peganum harmala) were used instead of Acacia leaves. In Yemen and Ethiopia, castor-bean oil derived from the castor plant (Ricinus communis) was applied by some tanners to the finished leather product which gave additional softness and suppleness to the leather.
References
Notes
Bibliography
(reprinted in 2015, )
Further reading
External links
The Glasgow Herald, p. 6 ("Chemistry: Leather Manufacture, its Scientific Aspect", by A.E. Caunce), 14 September 1923
Another Important Role Played by Enzymes in Bating, by J.A. Wilson & H.B. Merrill. February 1926
Leathermaking
Manufacturing
Microbiology techniques
Proteases | Bating (leather) | [
"Chemistry",
"Engineering",
"Biology"
] | 1,763 | [
"Microbiology techniques",
"Manufacturing",
"Mechanical engineering"
] |
73,214,650 | https://en.wikipedia.org/wiki/NGC%203044 | NGC 3044 is a barred spiral galaxy in the equatorial constellation of Sextans. It was discovered on December 13, 1784, by German-born English astronomer William Herschel. In 1888, Danish astronomer J. L. E. Dreyer described it as "very faint, very large, very much extended 122°". It is located at an estimated distance of million light years. In the B band of the UBV photometric system, the galaxy spans with the major axis aligned along a position angle of 113°. It is a relatively isolated galaxy with no nearby companions. R. B. Tully in 1988 assigned it as a member of the widely displaced Leo Cloud.
The morphological classification of NGC 3044 is SBc, indicating a barred spiral (SB) with somewhat loosely-wound spiral arms (c). It is viewed edge-on, with a galactic plane that is inclined at an angle of to the plane of the sky. The disk appears lopsided and disturbed, suggesting a recent merger or interaction. There is diffuse ionized gas extending to above the center of the plane.
The stars in the galaxy have a combined mass of approximately , and the star formation rate is . The total mass of the atomic gas in this galaxy is , and it has a dust mass of . The galaxy as a whole has a dynamic mass of .
One supernova has been observed in NGC 3044: SN 1983E (type II, mag. 14) was discovered by Natalʹja Metlova on 13 March 1983, at an offset east, south of the galaxy.
References
Further reading
Barred spiral galaxies
Sextans
Discoveries by William Herschel
3044
05311
28517
+00-25-031
09511+0148 | NGC 3044 | [
"Astronomy"
] | 357 | [
"Sextans",
"Constellations"
] |
73,214,867 | https://en.wikipedia.org/wiki/DatalogZ | DatalogZ (stylized as ) is an extension of Datalog with integer arithmetic and comparisons. The decision problem of whether or not a given ground atom (fact) is entailed by a DatalogZ program is RE-complete (hence, undecidable), which can be shown by a reduction from Diophantine equations.
Syntax
The syntax of DatalogZ extends that of Datalog with numeric terms, which are integer constants, integer variables, or terms built up from these with addition, subtraction, and multiplication. Furthermore, DatalogZ allows comparison atoms, which are atoms of the form t < s or t <= s for numeric terms t, s.
Semantics
The semantics of DatalogZ are based on the model-theoretic (Herbrand) semantics of Datalog.
Limit DatalogZ
The undecidability of entailment of DatalogZ motivates the definition of limit DatalogZ. Limit DatalogZ restricts predicates to a single numeric position, which is marked maximal or minimal. The semantics are based on the model-theoretic (Herbrand) semantics of Datalog. The semantics require that Herbrand interpretations be limit-closed to qualify as models, in the following sense: given a ground atom P(a, k) of a limit predicate whose last (numeric) position is a max (resp. min) position, if P(a, k) is in a Herbrand interpretation I, then the ground atoms P(a, k′) for every integer k′ ≤ k (resp. k′ ≥ k) must also be in I for I to be limit-closed.
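Although limit closure makes interpretations infinite, a limit-closed interpretation over a min limit predicate can be represented finitely by the least derived value per key, with membership in the closure reduced to a comparison. A minimal Python sketch (the class and method names are ours, not part of any DatalogZ system):

```python
# Finite representation of a limit-closed interpretation for a "min" predicate:
# for each key, the interpretation contains every value >= the least derived one.

class MinLimitRelation:
    def __init__(self):
        self.best = {}                      # key -> least derived numeric value

    def add(self, key, value):
        """Record a derived atom, keeping only the least value per key."""
        if key not in self.best or value < self.best[key]:
            self.best[key] = value

    def __contains__(self, atom):
        key, value = atom
        # Limit closure: (key, v) holds for every v >= best[key].
        return key in self.best and value >= self.best[key]

r = MinLimitRelation()
r.add("a", 3)
r.add("a", 7)          # subsumed: (a, 7) already follows from (a, 3)
print(("a", 5) in r)   # True  -- in the limit closure of (a, 3)
print(("a", 2) in r)   # False -- smaller than the least derived value
```

A max limit predicate would be represented symmetrically, keeping the greatest value per key.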
Example
Given a constant w, a binary relation edge that represents the edges of a graph, and a binary relation sp with the last position of sp minimal, the following limit DatalogZ program computes the relation sp, which represents the length of the shortest path from w to any other node in the graph:
sp(w, 0) :- .
sp(y, m + 1) :- sp(x, m), edge(x, y).
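The program above can be evaluated by naive fixpoint iteration, keeping only the least value per node as the min limit position requires. A minimal sketch (the function name and edge list are ours, not part of any DatalogZ implementation):

```python
# Naive bottom-up evaluation of the limit DatalogZ shortest-path program.

def shortest_paths(w, edges):
    """edges: iterable of (x, y) pairs. Returns {node: least path length from w}."""
    sp = {w: 0}                          # sp(w, 0) :- .
    changed = True
    while changed:                       # apply the rules until a fixpoint
        changed = False
        for x, y in edges:
            if x in sp:                  # sp(y, m + 1) :- sp(x, m), edge(x, y).
                m1 = sp[x] + 1
                if y not in sp or m1 < sp[y]:   # min limit position: keep least
                    sp[y] = m1
                    changed = True
    return sp

edges = [("w", "a"), ("a", "b"), ("w", "b"), ("b", "c")]
print(shortest_paths("w", edges))   # {'w': 0, 'a': 1, 'b': 1, 'c': 2}
```

Note how the direct edge (w, b) overrides the longer derivation through a, exactly as the minimal marking on the last position of sp dictates.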
See also
Constraint logic programming
References
Notes
Sources
Logic in computer science
Computer programming | DatalogZ | [
"Mathematics",
"Technology",
"Engineering"
] | 419 | [
"Logic in computer science",
"Mathematical logic",
"Computer programming",
"Software engineering",
"Computers"
] |
73,215,008 | https://en.wikipedia.org/wiki/Empress%20%28cracker%29 | Empress (sometimes stylized EMPRESS) is a video game cracker who specializes in breaking anti-piracy software. While the true identity of Empress is unknown, she refers to herself as a young Russian woman. Empress has also released cracked games under the moniker C000005.
Empress is known as one of the few crackers who can crack Denuvo. Her motivation is to remove the software license aspect of digital games in an effort to preserve them after developers drop support. Empress also states that removing digital rights management (DRM) increases performance in-game.
Career
Empress became interested in the DRM-cracking scene in 2014. Her followers can participate in polls to select which game they want cracked next, and her work is funded through crowdsourced donations. Empress typically requests $500 for cracking a specific game, using the money to cover living costs, upgrade hardware, and purchase games that she intends to crack.
Empress rose to prominence after releasing a cracked version of Red Dead Redemption 2. Other high-profile games cracked by Empress include Mortal Kombat 11 and Anno 1800. In February 2021, Empress stated that she would soon be arrested after being allegedly caught working on a crack for Immortals Fenyx Rising. Empress blamed FitGirl Repacks, with whom she had a feud. However, that March, Empress was still able to publish a workaround for the online check-in system of Battle.net. Empress's arrest announcement was met with general skepticism by the cracking community.
She released a cracked version of Hogwarts Legacy in February 2023, just 12 days after release.
Controversies
Empress is known around the P2P scene for the "personal note" section in the NFOs of her releases, often containing a variety of slurs and other offensive language in relation to her sociopolitical views. The information file supplied with the cracked version of Hogwarts Legacy expressed dissatisfaction with what was described as the "woke system" of today, defending Harry Potter series creator J.K. Rowling's views on transgender people and accused the transgender community of being "sissy men" and progressive groups of censorship against those who express dissent against them.
References
Hackers
Warez
Unidentified people
Year of birth unknown
Living people
Digital rights management
LGBTQ-related controversies in video games
Intellectual property activism | Empress (cracker) | [
"Technology"
] | 466 | [
"Lists of people in STEM fields",
"Hackers"
] |
73,216,226 | https://en.wikipedia.org/wiki/Metallaborane | In chemistry, a metalloborane is a compound that contains one or more metal atoms and one or more boron hydride units. These compounds are related conceptually and often synthetically to the boron-hydride clusters by replacement of BHn units with metal-containing fragments. Often these metal fragments are derived from metal carbonyls or cyclopentadienyl complexes. Their structures can often be rationalized by polyhedral skeletal electron pair theory. The inventory of these compounds is large, and their structures can be quite complex.
Examples
Two simple examples are . The MB4 cores (M = Fe or Co) of these two compounds adopt structures expected for nido 5-vertex clusters. The iron compound is produced by reaction of diiron nonacarbonyl with pentaborane. and cyclobutadieneiron tricarbonyl have similar structures.
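The closo/nido/arachno labels used above follow the polyhedral skeletal electron pair (Wade's) rules: a cluster with n vertices is closo with n+1 skeletal electron pairs, nido with n+2, and arachno with n+3. A minimal sketch of that counting rule, assuming its simple vertex/pair form (the function name is ours, not a library API):

```python
# Simplified Wade's-rules classification by skeletal electron pair count.

def wade_classification(n_vertices, skeletal_pairs):
    labels = {1: "closo", 2: "nido", 3: "arachno", 4: "hypho"}
    return labels.get(skeletal_pairs - n_vertices, "outside the simple rules")

# B5H9: five BH vertices (2 skeletal electrons each) plus four extra H atoms
# give 14 skeletal electrons = 7 pairs, so the 5-vertex cage is nido -- the
# same count that rationalizes the nido 5-vertex MB4 cores above.
print(wade_classification(5, 7))   # nido
print(wade_classification(5, 6))   # closo (e.g. the B5H5(2-) trigonal bipyramid)
```

Replacing a BH vertex with an isolobal metal fragment leaves the skeletal electron count, and hence the classification, unchanged.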
Metallacarboranes
Even greater in scope than metalloboranes are metallacarboranes. These cages have carbon vertices, often CH, in addition to BH and M vertices. A well-developed class of metallacarboranes are prepared from dicarbollides, anions of the formula [C2B9H11]2-. These anions function as ligands for a variety of metals, often forming sandwich complexes.
Some metalloboranes are derived by the metalation of neutral carboranes. Illustrative are the six- and seven-vertex cages prepared from closo-. Reaction of this carborane with iron carbonyl sources gives closo Fe- and Fe2-containing products, according to these idealized equations:
A further example of insertion into a closo carborane is the synthesis of the yellow-orange solid closo-1,2,3-:
A closely related reaction involves the capping of an anionic nido carborane
The last reaction is worked up with acid and air.
References
Cluster chemistry | Metallaborane | [
"Chemistry"
] | 408 | [
"Cluster chemistry",
"Organometallic chemistry"
] |
73,217,717 | https://en.wikipedia.org/wiki/HD%2021699 | HD 21699, also known as HR 1063 and V396 Persei, is a star about 580 light years from the Earth, in the constellation Perseus. It is a 5th magnitude star, so it will be faintly visible to the naked eye of an observer far from city lights. This is a variable star, whose brightness varies slightly from 5.45 to 5.53 during its 2.4761 day rotation period. It has a remarkable dipole magnetic field which is displaced from the star's center by 0.4 stellar radii, the poles of which appear close to each other on the stellar surface. HD 21699 is a member of the Alpha Persei Cluster.
Properties
In 1967, Robert Garrison noted that the U-B color of HD 21699 is significantly bluer (more negative) than the spectral type assigned to it (B8 III) would suggest. Such a discrepancy suggests that the star is helium-weak. The star's helium-weak nature was confirmed by William Morgan et al. in 1971. HD 21699 also has an enhanced silicon abundance.
John Winzer observed HD 21699 during 1971 - 1972 and discovered that it is a variable star. He found it varied by 0.03, 0.04 and 0.05 magnitudes in the visible, blue and ultraviolet photometric bands, respectively. Though he found that the brightness varied periodically, he was unable to unambiguously assign a period to it. It was the first helium-weak star to be found to vary in brightness periodically. In 1974, HD 21699 was assigned the variable star designation V396 Persei. In 1985, John Percy established that the star's variability period is days.
Magnetic field
In 1980, Werner Weiss deduced that HD 21699 has a magnetic field, based on a heuristic relationship between photometric colors and a star's surface magnetic field. In 1984, Douglas Brown et al. announced that a magnetic field with a strength of about one kilogauss had been detected from observations of Zeeman splitting of spectral lines. That same year, Brown et al. announced that International Ultraviolet Explorer data showed evidence of a stellar wind flowing from HD 21699, which was constrained to flow from the region of the star's magnetic poles. This "plume" of gas sweeps across the line of sight for an observer on the Earth, as the star rotates.
In 2007, Yu. V. Glagolevskij and G. A. Chuntonov examined the extensive data which had been collected for HD 21699, and concluded that the star has a very peculiar magnetic field. In their model, the field is a dipole, but it is displaced by 0.4 stellar radii from the star's center. If the dipole were centered within the star, one would expect that the surface magnetic poles would be separated by 180° along a great circle which contained both poles. However, because the dipole is displaced from the star's center, the poles are separated by only 55°. Furthermore, the two magnetic poles lie almost exactly on the star's equator. Their estimate for the field's strength is kilogauss at the poles.
References
Further reading
Perseus (constellation)
16470
21699
Persei, V396
SX Arietis variables | HD 21699 | [
"Astronomy"
] | 692 | [
"Perseus (constellation)",
"Constellations"
] |
73,220,111 | https://en.wikipedia.org/wiki/Dead%20Internet%20theory | The dead Internet theory is an online conspiracy theory that asserts, due to a coordinated and intentional effort, the Internet since 2016 or 2017 has consisted mainly of bot activity and automatically generated content manipulated by algorithmic curation to control the population and minimize organic human activity. Proponents of the theory believe these social bots were created intentionally to help manipulate algorithms and boost search results in order to manipulate consumers. Some proponents of the theory accuse government agencies of using bots to manipulate public perception. The dead Internet theory has gained traction because many of the observed phenomena are quantifiable, such as increased bot traffic, but the literature on the subject does not support the full theory.
Origins and spread
The dead Internet theory's exact origin is difficult to pinpoint. In 2021, a post titled "Dead Internet Theory: Most Of The Internet Is Fake" was published onto the forum Agora Road's Macintosh Cafe esoteric board by a user named "IlluminatiPirate", claiming to be building on previous posts from the same board and from Wizardchan, and marking the term's spread beyond these initial imageboards. The conspiracy theory has entered public culture through widespread coverage and has been discussed on various high-profile YouTube channels. It gained more mainstream attention with an article in The Atlantic titled "Maybe You Missed It, but the Internet 'Died' Five Years Ago". This article has been widely cited by other articles on the topic.
The date given for the "death" of the Internet is generally around 2016 or 2017.
Claims
The dead Internet theory has two main components: that organic human activity on the web has been displaced by bots and algorithmically curated search results, and that state actors are doing this in a coordinated effort to manipulate the human population. The first part of the theory, that bots create much of the content on the internet and perhaps contribute more than organic human content, has been a concern for a while, with the original post by "IlluminatiPirate" citing the article "How Much of the Internet Is Fake? Turns Out, a Lot of It, Actually" in New York magazine. The dead Internet theory goes on to claim that Google and other search engines are censoring the web by filtering out undesirable content, limiting what is indexed and presented in search results. While Google may suggest that there are millions of search results for a query, the results available to a user do not reflect that. This problem is exacerbated by the phenomenon known as link rot, which occurs when content at a website becomes unavailable and all links to it on other sites break. This has led to the idea that Google is a Potemkin village, and that the searchable web is much smaller than we are led to believe. The dead Internet theory suggests that this is part of the conspiracy to limit users to curated, and potentially artificial, content online.
The second half of the dead Internet theory builds on this observable phenomenon by proposing that the U.S. government, corporations, or other actors are intentionally limiting users to curated, and potentially artificial AI-generated content, to manipulate the human population for a variety of reasons. In the original post, the idea that bots have displaced human content is described as the "setup", with the "thesis" of the theory itself focusing on the United States government being responsible for this, stating: "The U.S. government is engaging in an artificial intelligence-powered gaslighting of the entire world population."
Expert view
Caroline Busta, founder of the media platform New Models, was quoted in an article in The Atlantic calling much of the dead Internet theory a "paranoid fantasy," even if there are legitimate criticisms involving bot traffic and the integrity of the internet, but she said she does agree with the "overarching idea." In an article in The New Atlantis, Robert Mariani called the theory a mix between a genuine conspiracy theory and a creepypasta.
The term dead Internet theory is sometimes used, without reference to the full theory, for the observable increase in content generated via large language models (LLMs) such as ChatGPT appearing in popular Internet spaces.
Evidence
Large language models
Generative pre-trained transformers (GPTs) are a class of large language models (LLMs) that employ artificial neural networks to produce human-like content. The first of these to be well known was developed by OpenAI. These models have created significant controversy. For example, Timothy Shoup of the Copenhagen Institute for Futures Studies said in 2022, "in the scenario where GPT-3 'gets loose', the internet would be completely unrecognizable". He predicted that in such a scenario, 99% to 99.9% of content online might be AI-generated by 2025 to 2030. These predictions have been used as evidence for the dead internet theory.
In 2024, Google reported that its search results were being inundated with websites that "feel like they were created for search engines instead of people". In correspondence with Gizmodo, a Google spokesperson acknowledged the role of generative AI in the rapid proliferation of such content and that it could displace more valuable human-made alternatives. Bots using LLMs are anticipated to increase the amount of spam, and run the risk of creating a situation where bots interacting with each other create "self-replicating prompts" that result in loops only human users could disrupt.
ChatGPT
ChatGPT is an AI chatbot whose late 2022 release to the general public led journalists to call the dead internet theory potentially more realistic than before. Before ChatGPT's release, the dead internet theory mostly emphasized government organizations, corporations, and tech-literate individuals. ChatGPT gives the average internet user access to large-language models. This technology caused concern that the Internet would become filled with content created through the use of AI that would drown out organic human content.
Bot traffic
In 2016, the security firm Imperva released a report on bot traffic and found that automated programs were responsible for 52% of web traffic. This report has been used as evidence in reports on the dead Internet theory. Imperva's report for 2023 found that 49.6% of internet traffic was automated, a 2% rise on 2022 which was partly attributed to artificial intelligence models scraping the web for training content.
Facebook
In 2024, AI-generated images on Facebook, referred to as AI "slop", began going viral. Subjects of these AI-generated images included various iterations of Jesus "meshed in various forms" with shrimp, flight attendants, and black children next to artwork they supposedly created. Many of these images have hundreds or even thousands of AI-generated comments saying "Amen". These images have been cited as an example of why the Internet feels "dead."
Facebook includes an option to provide AI-generated responses to group posts. Such responses appear if a user explicitly tags @MetaAI in a post, or if the post includes a question and no other users have responded to it within an hour.
In January 2025, interest renewed in the theory following statements from Meta on their plans to introduce new AI powered autonomous accounts. Connor Hayes, vice-president of product for generative AI at Meta stated, “We expect these AIs to actually, over time, exist on our platforms, kind of in the same way that accounts do...They’ll have bios and profile pictures and be able to generate and share content powered by AI on the platform.”
Reddit
In the past, Reddit allowed free access to its API and data, which allowed users to employ third-party moderation apps and train AI in human interaction. Controversially, Reddit moved to charge for access to its user dataset. Companies training AI will likely continue to use this data for training future AI. As LLMs such as ChatGPT become available to the general public, they are increasingly being employed on Reddit by users and bot accounts. Professor Toby Walsh of the University of New South Wales said in an interview with Business Insider that training the next generation of AI on content created by previous generations could cause the content to suffer. University of South Florida professor John Licato compared this situation of AI-generated web content flooding Reddit to the dead Internet theory.
Twitter
"I hate texting" tweets
Since 2020, several Twitter accounts have posted tweets beginning with the phrase "I hate texting" followed by an alternative activity, such as "i hate texting i just want to hold ur hand", or "i hate texting just come live with me". These posts received tens of thousands of likes, many of which are suspected to be from bot accounts. Proponents of the dead internet theory have used these accounts as an example.
Elon Musk's acquisition of Twitter
The proportion of Twitter accounts run by bots became a major issue during Elon Musk's acquisition of the company. Musk disputed Twitter's claim that fewer than 5% of their monetizable daily active users (mDAU) were bots. Musk commissioned the company Cyabra to estimate what percentage of Twitter accounts were bots, with one study estimating 13.7% and another estimating 11%. CounterAction, another firm commissioned by Musk, estimated 5.3% of accounts were bots. Some bot accounts provide services, such as one noted bot that can provide stock prices when asked, while others troll, spread misinformation, or try to scam users. Believers in the dead Internet theory have pointed to this incident as evidence.
TikTok
In 2024, TikTok began discussing offering the use of virtual influencers to advertisement agencies. In a 2024 article in Fast Company, journalist Michael Grothaus linked this and other AI-generated content on social media to the Dead Internet Theory. In this article, he referred to the content as "AI-slime".
YouTube "The Inversion"
On YouTube, there is a market online for fake views to boost a video's credibility and reach broader audiences. At one point, fake views were so prevalent that some engineers were concerned YouTube's algorithm for detecting them would begin to treat the fake views as default and start misclassifying real ones. YouTube engineers coined the term "the Inversion" to describe this phenomenon. YouTube bots and the fear of "the Inversion" were cited as support for the dead Internet theory in a thread on the internet forum Melonland.
SocialAI
SocialAI, an app released on September 18, 2024, was created for the sole purpose of chatting with AI bots, without human interaction. Its creator was Michael Sayman, a former product lead at Google who also worked at Facebook, Roblox, and Twitter. An article on the Ars Technica website linked SocialAI to the dead Internet theory.
In popular culture
The dead internet theory has been discussed among users of the social media platform Twitter. Users have noted that bot activity has affected their experience. Numerous YouTube channels and online communities, including the Linus Tech Tips forums and Joe Rogan subreddit, have covered the dead Internet theory, which has helped to advance the idea into mainstream discourse. There has also been discussion and memes about this topic on the app TikTok, as AI-generated content has become more mainstream.
See also
References
Conspiracy theories
Cyberpunk themes
Hyperreality
Information society
Internet manipulation and propaganda
Mass media issues
Social influence
Social information processing
Sociology of the Internet
Technology in society
21st-century neologisms
Internet-related controversies
Internet bots | Dead Internet theory | [
"Technology"
] | 2,380 | [
"Information society",
"Science and technology studies",
"Sociology of the Internet",
"Computing and society",
"Hyperreality"
] |
73,221,578 | https://en.wikipedia.org/wiki/Leucocoprinus%20russoceps | Leucocoprinus russoceps is a species of mushroom producing fungus in the family Agaricaceae.
Taxonomy
It was described in 1871 by the English botanists and mycologists Miles Joseph Berkeley and Christopher Edmund Broome who classified it as Agaricus (Lepiota) russoceps.
In 1887 it was reclassified as Lepiota russoceps by the Italian mycologist Pier Andrea Saccardo and then as Mastocephalus russoceps in 1891 by the German botanist Otto Kuntze; however, Kuntze's Mastocephalus genus, along with most of Revisio generum plantarum, was not widely accepted by the scientific community of the age, so it remained a Lepiota.
In 1987 it was reclassified as Leucocoprinus russoceps by the mycologist Jörg Raithelhuber.
Description
Leucocoprinus russoceps is a small dapperling mushroom. Cap: 1.5–2.5 cm wide, starting campanulate before flattening and expanding to convex. The surface is yellow-brown to ochre with a pulverulent, powdery coating and striations from the edges. Gills: Pale, 'almost free' and close. Stem: 4 cm long and 1.5 mm thick at the top with a claviform taper to 4 mm wide at the base. The surface is paler than the cap, sometimes with a slight greenish tint with age, whilst the interior is stuffed with white flesh. The stem ring may disappear. Spores: Smooth, ovate to elliptic with a faint germ pore. 7.2–9 x 4.2–4.6 μm.
Habitat and distribution
The specimens were found growing on the ground in forests in Brazil.
The specimens studied by Berk and Broome were found on the ground in June 1860 in Ceylon (now Sri Lanka).
References
russoceps
Fungi of South America
Fungus species | Leucocoprinus russoceps | [
"Biology"
] | 404 | [
"Fungi",
"Fungus species"
] |
73,222,124 | https://en.wikipedia.org/wiki/Leucocoprinus%20bulbipes | Leucocoprinus bulbipes is a species of mushroom producing fungus in the family Agaricaceae.
Taxonomy
It was described in 1856 by the French mycologist Jean Pierre François Camille Montagne who classified it as Agricus (Lepiota) bulbipes. Montagne's description was based upon specimens collected by Hugh Algernon Weddell on his expeditions in South America.
In 1887 it was reclassified as Lepiota bulbipes by the Italian mycologist Pier Andrea Saccardo and then as Mastocephalus bulbipes in 1891 by the German botanist Otto Kuntze; however, Kuntze's Mastocephalus genus, along with most of Revisio generum plantarum, was not widely accepted by the scientific community of the age, so it remained a Lepiota. Likewise Kuntze's later classification as Chamaeceras bulbipes was not accepted.
In 1987 it was reclassified as Leucocoprinus bulbipes by the mycologist Jörg Raithelhuber.
Raithelhuber also notes that Lepiota cinerascens, as described by Carlo Luigi Spegazzini in 1898, may be a synonym; however, this name was invalid as it had already been used by Lucien Quélet in 1894, and so the species Spegazzini described was reclassified as Lepiota spegazzinii in 1912.
Description
Leucocoprinus bulbipes is a small dapperling mushroom. Cap: 3–4 cm wide, starting campanulate before spreading out to convex. The surface is pale with a black or blackish-brown centre disc and striations running from the edges of the cap. It is fibrous and noted as being 'somewhat fleshier' than other Leucocoprinus species. Gills: Pale, free and close, sometimes with denticulate edges. Stem: 7–8 cm long and almost cylindrical but with a bulbous base. The surface is whitish and powdery and the ring is persistent but slender. Spores: Oval with a somewhat thick wall and indistinct germ pore. 10.2–12.6 x 7–8.4 μm.
In Montagne's description of the species he notes grey radial striations and a reddish brown (rufescente) centre.
Habitat and distribution
The specimens studied by Montagne were found growing on the ground amongst rotting leaves in the humid forests of Goiás, Brazil during November.
The specimens studied by Raithelhuber were found growing in forests in Brazil.
Similar species
Raithelhuber notes that Lepiota brebissonii (now Leucocoprinus brebissonii) is similar but has narrower spores and a different cap shape. The spore size is also very similar to that of Leucocoprinus inflatus, which Raithelhuber notes may just be a variety of L. bulbipes.
References
bulbipes
Fungi of South America
Fungi described in 1856
Fungus species | Leucocoprinus bulbipes | [
"Biology"
] | 614 | [
"Fungi",
"Fungus species"
] |
73,222,405 | https://en.wikipedia.org/wiki/Cystangium%20balpineum | Cystangium balpineum, better known as the white sessile truffle, is a basidiomycete mushroom.
See also
Truffle
References
Russulales
Taxa named by Cheryl A. Grgurinovic
Fungus species | Cystangium balpineum | [
"Biology"
] | 52 | [
"Fungi",
"Fungus species"
] |
47,629,661 | https://en.wikipedia.org/wiki/Termitomyces%20bulborhizus | Termitomyces bulborhizus is a species of agaric fungus in the family Lyophyllaceae. Found in Sichuan, China, it was formally described in 2004. It has a large cap, up to in diameter. The specific epithet, derived from the Greek words bulbus ("bulbous") and rhizus ("root"), refers to the bulbous base of the stipe.
References
Lyophyllaceae
Fungi described in 2004
Fungi of China
Fungus species | Termitomyces bulborhizus | [
"Biology"
] | 106 | [
"Fungi",
"Fungus species"
] |
47,630,620 | https://en.wikipedia.org/wiki/Abbot%27s%20Kitchen%2C%20Oxford | The Abbot's Kitchen in Oxford, England, is an early chemistry laboratory based on the Abbot's Kitchen at Glastonbury Abbey, a mediaeval 14th-century octagonal building that served as the kitchen at the abbey.
History
Chemistry was first recognized as a separate discipline at Oxford University with the construction of this laboratory, attached to the Oxford University Museum of Natural History, and opening in 1860. The laboratory is a stone-built structure to the right of the museum, built in the Victorian Gothic style. The building was one of the first ever purpose-built chemical laboratories anywhere and was extended in 1878. A further major extension adding three wings was completed in 1957. It is now part of the new graduate college of the University, Reuben College, which opened in 2023.
Gallery
See also
Abbot's Kitchen, Glastonbury, on which the laboratory building was based
Balliol–Trinity Laboratories, another early Oxford chemistry laboratory
Department of Chemistry, University of Oxford
List of octagonal buildings and structures
References
External links
1860 establishments in England
Buildings and structures completed in 1860
Buildings and structures of the University of Oxford
University and college laboratories in the United Kingdom
Chemistry laboratories
Octagonal buildings in the United Kingdom
Stone buildings in the United Kingdom
Reuben College, Oxford | Abbot's Kitchen, Oxford | [
"Chemistry"
] | 250 | [
"Chemistry laboratories"
] |
47,630,713 | https://en.wikipedia.org/wiki/Penicillium%20simplicissimum | Penicillium simplicissimum is an anamorph species of fungus in the genus Penicillium which can
promote plant growth. This species occurs on food and its primary habitat is in decaying vegetations
Penicillium simplicissimum produces verruculogene, fumitremorgene B, penicillic acid, viridicatumtoxin, decarestrictine G, decarestrictine L, decarestrictine H, decarestrictine I, decarestrictine K decarestrictine M, dihydrovermistatin, vermistatin and penisimplicissin
Further reading
References
simplicissimum
Fungi described in 1930
Taxa named by Charles Thom
Fungus species | Penicillium simplicissimum | [
"Biology"
] | 157 | [
"Fungi",
"Fungus species"
] |
47,631,529 | https://en.wikipedia.org/wiki/Phoenix%20Mecano | Phoenix Mecano AG is a Swiss technology company headquartered in Stein am Rhein, operating internationally in the fields of enclosure technology and industrial components. The company manufactures technical enclosures, electronic components, actuators, and system integrations. Phoenix Mecano employs around 7,000 people and generated sales of €783.1 million in 2023. Founded in 1975, Phoenix Mecano has been listed on the SIX Swiss Exchange since 1988.
History
Foundation and IPO
The company was founded in 1975 under the name Phoenix Maschinentechnik AG and initially mainly produced and distributed technical gases for the welding industry. Through its work with gases, the company began to develop welding torches, which then became the main activity. In 1976, the company began developing enclosures for electronic devices. That same year, the company acquired its strongest competitor, Hartmann.
In 1986, the company name was changed to Phoenix Mecano AG.
The company went public in 1988. At that time, Phoenix Mecano was primarily active in the manufacture of enclosure technology, mechanical, and electromechanical components.
Expansion of business activities
In the years following its IPO, the company shifted focus to secondary products that were not considered strategic by developers, engineers, and mechanical designers. This positioned Phoenix Mecano as an outsourcing partner. The company also began serving markets outside mechanical engineering and industrial electronics. The business expanded into lifestyle and furniture, medical technology, oil and gas, and solar technology. In 1996, Phoenix Mecano entered the Chinese market and, in 1998, acquired a production facility in Tunisia.
During the 2000s, acquisitions were made in Italy and France, expanding the enclosure technology business segment. In 2014, the company acquired Redur to expand the operations into instrument transformers and current transformers.
Newer developments
On 1 January 2021, the DewertOkin Technology Group was established as a separate business division with a potential partial IPO in China considered. As a result, Phoenix Mecano restructured its business segments into three divisions in 2021: Enclosure Systems, Industrial Components, and DewertOkin Technology Group. In 2022, business activities were restructured with the sale of all shares in Phoenix Mecano Digital Elektronik GmbH and Phoenix Mecano Digital Tunisie to Swiss company Cicor.
In 2023, Phoenix Mecano divested its Rugged Computing business unit to the Kontron Group.
Also in 2023, after a five-year construction period, Phoenix Mecano opened a new industrial park in China on a 15-hectare site. This facility became the new headquarters for the DewertOkin Technology Group division, replacing five previous production sites. Phoenix Mecano invested nearly CHF100 million in the industrial park.
Company structure
Phoenix Mecano AG, based in Stein am Rhein, is part of the Phoenix-Mecano Group. The Group generated sales of €783.1 million in the 2023 financial year. Phoenix Mecano has been listed on the Swiss stock exchange since 1988. The founding family Goldkamp holds a 34.6% stake.
The group operates internationally, with its primary production sites and development centres located in Germany, Hungary, Tunisia, India, and China.
Products and field of activity
The Phoenix Mecano Group is divided into three divisions: Enclosure Systems, Industrial Components, and the DewertOkin Technology Group. The Enclosure Systems division includes industrial and electronic enclosures that are used across various industrial sectors, with explosion-proof enclosures deployed in areas with explosive atmospheres. The Industrial Components division offers components and systems for modular automation and industrial digitalisation, including software development for digitalisation of production processes. The DewertOkin Technology Group, headquartered in Jiaxing, Zhejiang, China, manufactures drive, system, and fitting technology for electrically adjustable comfort and care furniture. The company operates production facilities in Europe, North America, and Asia.
Sustainability
Since 2022, Phoenix Mecano has published an annual sustainability report documenting the group’s initiatives. The group aims to fully eliminate CO2 emissions by 2050.
References
External links
Official Website
Companies listed on the SIX Swiss Exchange
Electronics companies of Switzerland
Electrical equipment manufacturers
Companies based in the canton of Schaffhausen
Swiss brands
Stein am Rhein | Phoenix Mecano | [
"Engineering"
] | 847 | [
"Electrical engineering organizations",
"Electrical equipment manufacturers"
] |
47,633,417 | https://en.wikipedia.org/wiki/Cortinarius%20erythraeus | Cortinarius erythraeus, sometimes known as the Jammie Dodger, is a basidiomycete fungus of the genus Cortinarius native to Australia.
English botanist Miles Joseph Berkeley described this species as a "blood red" mushroom, "clothed with a thick gelatinous coat" in 1845, from the writings and specimens of James Drummond, from the vicinity of the Swan River Colony in Western Australia. The species name is derived from the Ancient Greek word erythros "red". John Burton Cleland described Cortinarius ruber in 1928 from a collection in Kinchina, South Australia. Later analysis indicated it was the same species as C. erythraeus.
The fruitbodies of this fungus have hemispherical to convex brick- to brown-red caps, with diameters up to and covered with a layer of slime. The cap centre may be depressed or raised (umbonate) with a boss. The cap margins are curved inwards and smooth. The gills on the cap underside have a subdecurrent attachment to the stipe. Initially light tan or clay-coloured, they deepen to rusty brown as the spores mature. The cinnabar red stipe is cylindrical to slightly bulbous, up to in height and in width. Its lower part, below the remnants of the veil, are covered in slime. The flesh is white. The mushroom has no particular taste or smell, and stains red-purple when potassium hydroxide is applied to it. The spore print is rust-brown, and the oval warty spores measure 8–10 by 5–7 μm.
Cortinarius erythraeus grows with marri (Corymbia calophylla), wandoo (Eucalyptus wandoo), and jarrah (E. marginata) in Western Australia.
See also
List of Cortinarius species
References
External links
erythraeus
Fungi native to Australia
Fungi described in 1845
Taxa named by Miles Joseph Berkeley
Fungus species | Cortinarius erythraeus | [
"Biology"
] | 416 | [
"Fungi",
"Fungus species"
] |
47,633,962 | https://en.wikipedia.org/wiki/Calonarius%20xanthodryophilus | Calonarius xanthodryophilus is a species of fungus in the family Cortinariaceae.
Taxonomy
The species was described in 2011 by the mycologists Dimitar Bojantchev and R. Michael Davis who classified it as Cortinarius xanthodryophilus.
In 2022 the species was transferred from Cortinarius and reclassified as Calonarius xanthodryophilus based on genomic data.
Description
The mushroom cap is wide, convex then flat or uplifted, and yellow then yellow-brown. The gills are notched, crowded, yellow then brown as the spores mature. The stalk is 5–10 cm tall and 1.5–3 cm wide, club-shaped, and sometimes tinted blue.
It should not be consumed due to its similarity to deadly poisonous species.
Habitat and distribution
It is native to North America.
See also
List of Cortinarius species
References
External links
xanthodryophilus
Fungi of North America
Fungi described in 2011
Fungus species | Calonarius xanthodryophilus | [
"Biology"
] | 215 | [
"Fungi",
"Fungus species"
] |
47,633,975 | https://en.wikipedia.org/wiki/Cortinarius%20cyanites | Cortinarius cyanites is a basidiomycete fungus of the genus Cortinarius native to Europe.
Elias Magnus Fries described this species in his 1838 book Epicrisis Systematis Mycologici seu Synopsis Hymenomycetum as Cortinarius cyanites. The species name is derived from the Ancient Greek cyanos "dark blue". Within the large genus Cortinarius, it is classified in the subgenus Phlegmacium and section Cyanites. Genetic analysis showed in 2014 that two previously-described species—C. subcyanites and C. pseudocyanites—lay within the concept of C. cyanites, but conversely revealed three distinct lineages, with two new species—C. boreicyanites and C. violaceorubens—described.
The fruitbodies of this fungus have convex caps, with diameters typically in the range , and various shades of violet, brown or grey. They are slimy when young, and later have brown scales. The pale purple stipe is bulbous, in height and in width. The flesh is purple, but turns blood red when bruised or cut. The gills on the cap underside have an adnate attachment to the stipe and purple color; later, this deepens to rusty brown as the spores mature. The smell has been described as pleasant and fruity. The lemon-shaped spores measure 8–11.5 by 5–6.5 μm. C. boreicyanites and C. violaceorubens have smaller and larger spores respectively. C. violaceorubens has a dark purple-brown cap, while that of C. boreicyanites is more bluish.
Cortinarius cyanites is found in mixed coniferous and deciduous forests in southern Finland, central Sweden southwards into France.
Cortinarius cyanites is not edible.
See also
List of Cortinarius species
References
External links
cyanites
Fungi of Europe
Fungi described in 1838
Inedible fungi
Taxa named by Elias Magnus Fries
Fungus species | Cortinarius cyanites | [
"Biology"
] | 431 | [
"Fungi",
"Fungus species"
] |
47,633,996 | https://en.wikipedia.org/wiki/Invertebrate%20mitochondrial%20code | The invertebrate mitochondrial code (translation table 5) is a genetic code used by the mitochondrial genome of invertebrates. Mitochondria contain their own DNA and reproduce independently from their host cell. Variation in translation of the mitochondrial genetic code occurs when DNA codons result in non-standard amino acids has been identified in invertebrates, most notably arthropods. This variation has been helpful as a tool to improve upon the phylogenetic tree of invertebrates, like flatworms.
The code
AAs = FFLLSSSSYY**CCWWLLLLPPPPHHQQRRRRIIMMTTTTNNKKSSSSVVVVAAAADDEEGGGG
Starts = ---M----------------------------MMMM---------------M------------
Base1 = TTTTTTTTTTTTTTTTCCCCCCCCCCCCCCCCAAAAAAAAAAAAAAAAGGGGGGGGGGGGGGGG
Base2 = TTTTCCCCAAAAGGGGTTTTCCCCAAAAGGGGTTTTCCCCAAAAGGGGTTTTCCCCAAAAGGGG
Base3 = TCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAG
Bases: adenine (A), cytosine (C), guanine (G) and thymine (T) or uracil (U).
Amino acids: Alanine (Ala, A), Arginine (Arg, R), Asparagine (Asn, N), Aspartic acid (Asp, D), Cysteine (Cys, C), Glutamic acid (Glu, E), Glutamine (Gln, Q), Glycine (Gly, G), Histidine (His, H), Isoleucine (Ile, I), Leucine (Leu, L), Lysine (Lys, K), Methionine (Met, M), Phenylalanine (Phe, F), Proline (Pro, P), Serine (Ser, S), Threonine (Thr, T), Tryptophan (Trp, W), Tyrosine (Tyr, Y), Valine (Val, V).
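The four alignment strings above fully specify the table: the i-th codon is Base1[i] + Base2[i] + Base3[i], and its translation is the i-th character of AAs (with `*` marking a stop). A minimal Python sketch (the sample sequence is illustrative, not from the source) builds the lookup and translates a short coding sequence:

```python
# Translation table 5 (invertebrate mitochondrial), built directly from
# the four 64-character alignment strings given above.
AAS   = "FFLLSSSSYY**CCWWLLLLPPPPHHQQRRRRIIMMTTTTNNKKSSSSVVVVAAAADDEEGGGG"
BASE1 = "TTTTTTTTTTTTTTTTCCCCCCCCCCCCCCCCAAAAAAAAAAAAAAAAGGGGGGGGGGGGGGGG"
BASE2 = "TTTTCCCCAAAAGGGGTTTTCCCCAAAAGGGGTTTTCCCCAAAAGGGGTTTTCCCCAAAAGGGG"
BASE3 = "TCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAGTCAG"

# Codon -> amino acid ('*' = stop), one entry per column of the alignment.
TABLE5 = {b1 + b2 + b3: aa for b1, b2, b3, aa in zip(BASE1, BASE2, BASE3, AAS)}

def translate(dna, table):
    """Translate a DNA coding sequence codon by codon."""
    return "".join(table[dna[i:i + 3]] for i in range(0, len(dna) - 2, 3))

# ATA -> Met and AGA -> Ser here, unlike the standard code (Ile, Arg);
# TGA -> Trp rather than a stop.
print(translate("ATAAGATGA", TABLE5))  # prints "MSW"
```

The same construction works for any NCBI translation table, since they are all published in this four-string format; only the AAs and Starts strings differ between tables.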
Differences from the standard code
Note: The codon AGG is absent in Drosophila.
Alternative initiation codons
ATA/AUA
ATT/AUU
ATC/AUC: Apis
GTG/GUG: Polyplacophora
TTG/UUG: Ascaris, Caenorhabditis.
Systematic range
Nematoda: Ascaris, Caenorhabditis;
Mollusca: Bivalvia); Polyplacophora;
Arthropoda/Crustacea: Artemia;
Arthropoda/Insecta: Drosophila [Locusta migratoria (migratory locust), Apis mellifera (honeybee)].
Other variations
Several arthropods translate the codon AGG as lysine instead of serine (as in the Pterobranchia Mitochondrial Code) or arginine (as in the standard genetic code).
GUG may possibly function as an initiator in Drosophila. AUU is not used as an initiator in Mytilus.
"An exceptional mechanism must operate for initiation of translation of the cytochrome oxidase subunit I mRNA in both D. melanogaster and D. yakuba, since its only plausible initiation codon, AUA, is out of frame with the rest of the gene. Initiation appears to require the "reading" of an AUAA quadruplet, which would be equivalent to initiation at AUA followed immediately by a specific ribosomal frameshift. Another possible mechanism ... is that the mRNA is "edited" to bring the AUA initiation into frame."
See also
List of genetic codes
References
Molecular genetics
Gene expression
Protein biosynthesis | Invertebrate mitochondrial code | [
"Chemistry",
"Biology"
] | 900 | [
"Protein biosynthesis",
"Gene expression",
"Molecular genetics",
"Biosynthesis",
"Cellular processes",
"Molecular biology",
"Biochemistry"
] |
47,634,064 | https://en.wikipedia.org/wiki/C5H7NO | {{DISPLAYTITLE:C5H7NO}}
The molecular formula C5H7NO (molar mass: 97.11 g/mol, exact mass: 97.0528 u) may refer to:
Furfurylamine
Oxazepine | C5H7NO | [
"Chemistry"
] | 57 | [
"Isomerism",
"Set index articles on molecular formulas"
] |
47,634,311 | https://en.wikipedia.org/wiki/Balliol-Trinity%20Laboratories | The Balliol-Trinity Laboratories in Oxford, England, was an early chemistry laboratory at the University of Oxford.
The laboratory was located between Balliol College and Trinity College, hence the name. It was especially known for physical chemistry.
Chemistry was first recognized as a separate discipline at Oxford University in the 19th century. From 1855, a chemistry laboratory existed in a basement at Balliol College. In 1879, Balliol and Trinity agreed to have a laboratory at the boundary of the two colleges. The laboratory became the strongest of the Oxford college research institutions in chemistry. It remained in operation until the Second World War when a new Physical Chemistry Laboratory (PCL) was constructed by Oxford University in the Science Area.
People
The following scientists of note worked in the Balliol-Trinity Laboratories:
E. J. Bowen
Sir John Conroy
Sir Harold Hartley
Sir Cyril Norman Hinshelwood (Nobel Prize winner)
Henry Moseley
See also
Abbot's Kitchen, Oxford, another early chemistry laboratory in Oxford
Department of Chemistry, University of Oxford
Physical Chemistry Laboratory, which replaced the Balliol-Trinity Laboratories
References
1879 establishments in England
1940 disestablishments in England
Buildings and structures completed in 1879
Buildings and structures of the University of Oxford
History of the University of Oxford
University and college laboratories in the United Kingdom
Chemistry laboratories
Demolished buildings and structures in Oxfordshire
Balliol College, Oxford
Trinity College, Oxford
Physical chemistry | Balliol-Trinity Laboratories | [
"Physics",
"Chemistry"
] | 280 | [
"Chemistry laboratories",
"Applied and interdisciplinary physics",
"nan",
"Physical chemistry",
"Physical chemistry stubs",
"Chemistry organization stubs"
] |
47,634,345 | https://en.wikipedia.org/wiki/Ferro%20%28architecture%29 | A ferro (plural ferri) or is an item of functional wrought-iron work on the façade of an Italian building. Ferri are a common feature of Medieval and Renaissance architecture in Lazio, Tuscany and Umbria. They are of three main types: have a ring for tethering horses, and are set at about from the ground; holders for standards and torches are placed higher on the façade and on the corners of the building; have a cup-shaped hook or hooks to support cloth for shade or to be dried, and are set near balconies.
In Florence, ferri da cavallo and arpioni were often made to resemble the head of a lion, the symbolic marzocco of the Republic of Florence. Later, cats, dragons, horses and fantastic animals were also represented.
References
Further reading
Assunta Maria Adorisio (1996). Per Uso e Per Decoro: L’arte del ferro a Firenze e in Toscana dal eta gotica al XX secolo. Florence: Maria Christina de Montemayor.
Giulio Ferrari. ([1920?]) Il ferro nell'arte Italiana. Centosettanta tavole riproduzioni in parte inedite di 368 soggetti, del medio evo, del rinascimento, del periodo barocco e neo-classico raccolte e ordinate con testo esplicativo. Kraus Reprint, 1973.
James Lindow (2007). The Renaissance Palace in Florence: magnificence and splendor in fifteenth-century Italy. Aldershot, England; Burlington, VT: Ashgate.
Claudio Paolini. Repertorio delle architettura civili di Firenze. [Database] Palazzo Spinelli – Ente Cassa di Risparmio di Firenze.
Augusto Pedrini (1929). Il ferro battuto, sbalzato e cesellato, nell-arte italiana, dal secolo undicesimo al secolo diciottesimo. Milan: Ulrico Hoepli. (Published in English: Decorative ironwork of Italy. Atglen PA: Schiffer Publishers, 2010.)
Urbano Quinto (1998). Gli antichi segreti del fabbro. Galleria Urbano Quinto.
Herbert Railton (1900). Pen drawings of Florence. Cleveland, Ohio: J.H. Jansen.
John Superti (2014). I Cavalli di Firenze = The Horses of Florence. Florence: Polistampa.
John Superti (2013) Florence's Ironworks - Ferri https://www.youtube.com/watch?v=zKQ5s9Lk1Bo
Architectural elements | Ferro (architecture) | [
"Technology",
"Engineering"
] | 586 | [
"Building engineering",
"Architectural elements",
"Components",
"Architecture"
] |
47,634,736 | https://en.wikipedia.org/wiki/The%20Zookeeper%27s%20Wife%20%28film%29 | The Zookeeper's Wife is a 2017 American war drama film directed by Niki Caro and written by Angela Workman. It is based on Diane Ackerman's non-fiction book of the same name. The film tells the true story of how Jan and Antonina Żabiński rescued hundreds of Polish Jews from the Germans by hiding them in their Warsaw zoo during World War II. It stars Jessica Chastain, Johan Heldenbergh, Daniel Brühl and Michael McElhatton.
The film had its world premiere on 8 March 2017 in Warsaw, Poland, the location of the story, followed by its US premiere at the Cinequest Film Festival in San Jose, California, on 12 March 2017. The film was released in the United States on 31 March 2017, by Focus Features, and by Universal Pictures International in the United Kingdom on 21 April 2017. It received mixed reviews from critics but a positive response from audiences and grossed $26 million worldwide.
Plot
Dr. Jan Żabiński is director of the Warsaw Zoo, one of the largest in 1930s Europe, assisted by his wife, Antonina.
On 1 September 1939, the aerial bombardment of Warsaw and Invasion of Poland commences. Antonina and her son Ryszard barely survive. Dr. Lutz Heck, head of the Berlin Zoo, Adolf Hitler's chief zoologist, and Jan's professional rival, visits the zoo while Jan is away. Offering to house their prized animals in Berlin until after the war, he then returns with soldiers to shoot the others, revealing his hidden brutality. He becomes infatuated with Antonina.
Warsaw Jews are forced into the Ghetto. The Żabińskis' Jewish friends, Maurycy Fraenkel and his partner Magda Gross, seek a safe place for their friend Szymon Tenenbaum's insect collection. Antonina then offers to shelter Magda. Despite the risk, Jan and Antonina use the zoo to hide others and save more lives.
They propose Heck turn the abandoned zoo into a pig farm, to feed the occupying forces, secretly hoping to sneak people out of the Ghetto. Heck, wanting a new site for his experiments in recreating aurochs as a symbol of the Reich, agrees.
When Jan collects garbage inside the Ghetto, he also hides people in the trucks, to bring them to the zoo, working with the Underground Army to later transport them to safehouses throughout the country. Jews are hidden in the zoo's cages, tunnels, and inside the Żabińskis’ house. When Antonina plays the piano late at night, it means it is safe to come out of hiding. But if it is played in the daytime, they must hide. Jan also rescues a young girl, Urszula, who was raped by German soldiers. Antonina takes a particular interest in her, treating her the same way she would treat a frightened animal, until she emerges to join other "guests" in the Zabinskis' home.
In 1942, Germans begin transporting Jews to death camps. At a loading station, Jan tries to convince Janusz Korczak, head of the Jewish children's orphanage, to escape with him, but he will not leave the children. Jan has no choice but to help load children into the cattle cars bound for the death camps. Becoming aware of Heck's obvious feelings for his wife, a rift begins to form between the couple.
In 1943, two women rescued by Jan and disguised as Aryans by Antonina are discovered and executed outside their boarding house. After the failed uprising, Germans liquidate the Ghetto on Hitler's birthday, also the first night of Passover. While hidden Jews mournfully celebrate a secret Passover Seder, the Germans burn down the Ghetto.
Jan decides to join the Warsaw Uprising, and the couple reconciles before he leaves. Antonina later gives birth to a baby girl, Teresa. During the uprising, Jan is shot and captured, presumed dead. As Heck's attraction to Antonina intensifies, she struggles to fend him off while guarding the secret "guests." Visiting the house unexpectedly, Heck finds Ryszard, questioning the boy about his parents' whereabouts, but is unable to get any information. In growing suspicion and rage, Heck pins the Nazi cross onto the boy's shirt, goading him to say "Heil Hitler." As he leaves, Ryszard cries out "Hitler ist kaput!"
In January 1945, the evacuation of Warsaw begins. Desperate for news of Jan, Antonina seeks Heck's help. When he asks what he will get in return, she begins to undress, though she obviously cannot bear him. Increasingly sure of her deceit, Heck nearly rapes her before she finally confesses that he disgusts her, and he begins to realize how much she has lied. As Heck calls for his car to go to the zoo, Antonina races home and helps her guests escape just as he arrives. Magda takes baby Teresa with her, but Ryszard insists on staying, so Antonina hides him.
Heck enters the house in fury, discovering the basement drawings: Stars of David, dates, and guests drawn with animal faces. When he finds Ryszard, he chases him through the animal tunnels, catching him at gunpoint. Locking Antonina in a cage, he ignores her pleas as he drags the boy out of sight. When a shot rings out, she collapses in grief. A moment later, Heck walks back to his car, leaving the zoo for good. Ryszard returns to her side, unharmed. The two join the march out of Warsaw.
Warsaw began rebuilding four months after the Nazi surrender. Antonina and the children return to the damaged zoo, along with Jerzyk, their loyal zookeeper. Jan returns home, having survived a prison camp. They paint Stars of David on all the cages in the zoo.
In the postscript: the Żabińskis saved 300 Jews. Heck's zoo in Berlin was destroyed by Allied bombings, and he never recreated aurochs. The Żabińskis were recognized by Israel (Yad Vashem) for their righteous acts and defiance against the Germans. They rebuilt the present day Warsaw Zoo.
Cast
Historical context
The Zookeeper's Wife is based on Diane Ackerman's non-fiction book of the same name, which relied heavily on the diaries of Antonina Żabińska, published in Poland as Ludzie i zwierzęta (translated as: People and Animals) (1968). In key aspects of historical context, the screenplay follows the story of Antonina and her husband, Jan, closely. Both worked at the Warsaw Zoo. Antonina helped her husband who was the director of the zoo. Animals were part of their family's life, and the devastation that resulted from the attack on Warsaw and the subsequent pillaging of the zoo is well documented. The actions of Lutz Heck and his animal breeding experiments were also a matter of historical record, although the intimate relationship of the protagonist, Antonina, and the antagonist, Heck, is exaggerated. However, the defiance of Nazi occupation and ultimately, the rescue of over 300 Jews from the Warsaw Ghetto were depicted accurately. The contributions and participation of the Żabinski children, Ryszard and Teresa (credited as Theresa in the film) were also notable.
Production
Development
In September 2010, it was announced that Angela Workman was adapting Diane Ackerman's non-fiction book, The Zookeeper's Wife. On 30 April 2013, Jessica Chastain was attached to play the titular role as Antonina Żabińska, while Niki Caro signed on to direct the film. On 24 August 2015, Focus Features acquired the US rights to the film, and Daniel Brühl and Johan Heldenbergh signed on to star in it.
Filming
Filming began with the animals on 9 September 2015, and principal photography with the actors began on 29 September 2015, in Prague, Czech Republic. Suzie Davies served as the production designer, Andrij Parekh as the director of photography, and Bina Daigeler as the costume designer. Filming ended on 29 November 2015.
Release
The Zookeeper's Wife had its world premiere on 8 March 2017 in Warsaw, Poland, and its US premiere at the Cinequest Film Festival on 12 March 2017. The film was released in the United States on 31 March 2017 and was released in the United Kingdom on 21 April 2017. It premiered in Spain at the Barcelona-Sant Jordi International Film Festival on 22 April 2017. It also premiered in France at the 43rd Deauville Film Festival on 7 September 2017.
A special screening was held at the US Holocaust Museum in Washington DC on 22 March 2017, with a panel discussion including speakers Diane Ackerman, Jessica Chastain, Niki Caro and Angela Workman. Prior to the film's release, Focus Features partnered with the International Rescue Committee to screen the film in cities across the country, including a special screening at the Museum of Tolerance in Los Angeles, California, and a special screening in New York City, with a panel of speakers which included Chastain, Caro and Workman. The New York screening occurred on behalf of the Anne Frank Center for Mutual Respect, and was hosted by activist Steven Goldstein. The film speakers were joined by Sarah O'Hagan of the International Rescue Committee. The evening's topic of discussion was the rescue of Jewish refugees during the Holocaust, and the current refugee crisis in Europe.
The film began running on HBO on 23 December 2017.
Reception
Box office
The Zookeeper's Wife grossed $17.6 million in the United States and Canada and $8.6 million in other territories for a worldwide total of $26.1 million, against a production budget of $20 million.
In North America, the film grossed $3.3 million in its opening weekend from 541 theaters (a per-theater average of $6,191), finishing 10th at the box office. It remained the top grossing indie film in its second, third and fourth weeks of release.
The film remained the top grossing specialty film of 2017 in its fifth week of release, with IndieWire praising the film's release strategy, saying: "Focus’ aggressive push for this Jessica Chastain Holocaust rescue story has paid off with the top result for any specialized audience release since awards season. It won't hit the level of Woman in Gold two years ago ($33 million), but that's more of a factor of the steep decline in overall upscale grosses and more competition at the moment than other differences between the two films." In its eighth and ninth weeks of release, The Zookeeper's Wife was the third highest grossing specialty release of 2017, despite a reduction in its theater count. In its tenth week of release, IndieWire said the film "has been a rare specialized standout this spring."
The film remained the top-selling war film for the first three months of its home media release.
Critical response
On review aggregator Rotten Tomatoes, the film holds an approval rating of 64% based on 183 reviews. The website's critical consensus reads: "The Zookeeper's Wife has noble intentions, but is ultimately unable to bring its fact-based story to life with quite as much impact as it deserves." On Metacritic, the film has a weighted average score of 57 out of 100, based on 36 critics, indicating "mixed or average reviews". PostTrak reported that over 90% of audience members gave the film a rating of either "excellent" or "very good".
IndieWire listed The Zookeeper's Wife on its shortlist of best indie films of the year, stating: "Niki Caro’s fact-based historical drama is a heartbreaker of the highest order, anchored by an understated performance by Jessica Chastain and a series of wrenching dramatic twists that will wring tears out of even the hardest of hearts." Mick LaSalle, writing for The San Francisco Chronicle, gave the film a 5-star review, saying that it "grabs us from its first seconds" and that: The Zookeeper's Wife achieves its grandeur not through the depiction of grand movements, but through its attentiveness to the shifts and flickers of the soul. The war was a great external event, but Caro reminds us that it was experienced internally, by the people and the animals who had to try to live through it. Kenneth Turan, in the Los Angeles Times, says "Niki Caro and Jessica Chastain create an emotionally satisfying Zookeeper's Wife". The AP, the national wire service, says the film "tells a riveting true story" that is "both inspiring and comes as a welcome reminder in this time of uncertainty that even in the face of astonishing evil, humanity and goodness can also rise to the occasion." Jacob Soll in The New Republic heralded the film as the "first feminist Holocaust film".
In a negative review, Variety's Peter Debruge said, "There’s no nice way to put it in this case, but The Zookeeper’s Wife has the unfortunate failing of rendering its human drama less interesting than what happens to the animals — and for a subject as damaging to our species as the Holocaust, that's no small shortcoming." In contrast, Variety's Kristopher Tapley wrote that the film deserved consideration as an Oscar contender.
Stephen Holden of The New York Times said the film "was like Schindler's List with pets," writing that it was "so timid and sanitized it almost feels safe for children."
Polish reviewers expressed a strong positive response to the film, which spoke to their history. The Krakow Post stated: "On a universal level (the film) is a prayer for sanity and the civilized values of charity, empathy, and humanity in any time which finds itself threatened to be ruled by mass insanity, hatred, and barbarism. Lessons derived from this darkest period of recent history can never be untimely."
Alexandra Macaaron, in Women's Voices For Change, gave the film a rave review, noting that The Zookeeper's Wife is a rarity among Holocaust films, and is distinguished by its female perspective on war and the struggle to protect every living soul, strangers and friends alike.
Accolades
At the 2016 Heartland Film Festival, held each October in Indianapolis, Indiana, The Zookeeper's Wife was awarded the "Truly Moving Picture Award"; emblematic of the festival's goal to "inspire filmmakers and audiences through the transformative power of film."
The Zookeeper's Wife was awarded the Audience Choice Award for Best Narrative Feature at the 2017 Seattle Jewish Film Festival.
In April 2017, Political Film Society USA nominated The Zookeeper's Wife for its PFS award, in the category "Human Rights".
See also
List of Holocaust films
Notes
References
Bibliography
Heck, Lutz. Animals, My Adventure. London: Methuen, 1954. .
External links
The Zookeeper’s Wife: Fact vs. Fiction
2017 biographical drama films
2017 war drama films
2017 films
American biographical drama films
American war drama films
British biographical drama films
British war drama films
Czech war drama films
War drama films based on actual events
2010s feminist films
Films about animals
Films based on non-fiction books
Films directed by Niki Caro
Films scored by Harry Gregson-Williams
Films set in Poland
Films set in 1939
Films set in the 1940s
Films set in Warsaw
Films set in zoos
Films shot in the Czech Republic
Focus Features films
Jewish Polish history
Holocaust films
Rescue of Jews during the Holocaust
World War II films based on actual events
2017 drama films
American World War II films
Czech World War II films
British World War II films
2010s English-language films
2010s American films
2010s British films
English-language Czech films
English-language biographical drama films
English-language war drama films | The Zookeeper's Wife (film) | [
"Biology"
] | 3,255 | [
"Rescue of Jews during the Holocaust",
"Behavior",
"Altruism"
] |
47,634,858 | https://en.wikipedia.org/wiki/Meltwater%20pulse%201B | Meltwater pulse 1B (MWP1b) is the name used by Quaternary geologists, paleoclimatologists, and oceanographers for a period of rapid, or at least accelerated, post-glacial sea level rise that some hypothesize to have occurred between 11,500 and 11,200 years ago at the beginning of the Holocene and after the end of the Younger Dryas. Meltwater pulse 1B is also known as catastrophic rise event 2 (CRE2) in the Caribbean Sea.
Other named, postglacial meltwater pulses are known most commonly as meltwater pulse 1A0 (meltwater pulse 19ka), meltwater pulse 1A, meltwater pulse 1C, meltwater pulse 1D, and meltwater pulse 2. These periods of proposed rapid sea level rise are known as meltwater pulses because their inferred cause was the rapid release of meltwater into the oceans from the collapse of continental ice sheets.
Sea level
There is considerable unresolved disagreement over the significance, timing, magnitude, and even existence of meltwater pulse 1B. It was first recognized by Richard G. Fairbanks in his coral reef studies in Barbados. From the analysis of data from cores of coral reefs surrounding Barbados, he concluded that during meltwater pulse 1B, sea level rose in about 500 years, about 11,300 years ago.
However, in 1996 and 2010, Bard and others published detailed analysis of data from cores from coral reefs surrounding Tahiti. They concluded that meltwater pulse 1B was, at best, just an acceleration of sea level rise at about 11,300 years ago and it was, at worst, not statistically different from a constant rate sea level rise between 11,500 and 10,200 years ago. They argued that meltwater pulse 1B was certainly not an abrupt jump in sea level, which they would consider to be a meltwater pulse. They argue that the rise in sea level estimated by Fairbanks from cores is an artifact created by differential tectonic uplift between different sides of a tectonic structure lying between the two Barbados cores used to identify meltwater pulse 1B and calculate its magnitude.
Other differing estimates of the magnitude of meltwater pulse 1B have been published. In 2010, Stanford and others found it to be "robustly expressed" as a multi-millennial interval of enhanced rates of sea-level rise between 11,500 and 8,800 years ago, with peak rates of rise of up to 25 mm/yr. In 2004, Liu and Milliman reexamined the original data from Barbados and Tahiti and reconsidered the mechanics and sedimentology of reef drowning by sea level rise. They concluded that meltwater pulse 1B occurred between 11,500 and 11,200 years ago, a 300-year interval, during which sea level rose from to , giving a mean annual rate of around 40 mm/yr. Other studies have revised the estimated magnitude of meltwater pulse 1B downward to between and less than .
Source(s) of meltwater pulse 1B
Given the disagreement over its timing, magnitude, and even existence, it has been very difficult to constrain the source of meltwater pulse 1B. In his modeling of global glacial isostatic adjustment, Peltier assumed that the predominant source for MWP-1B was the Antarctic Ice Sheet. However, no justification for this assumption is provided in his papers. In addition, Leventer and others argue that the timing of deglaciation in eastern Antarctica roughly coincides with the onset of meltwater pulse 1B and that the Antarctic Ice Sheet is a likely source. Finally, McKay and others suggested that recession of the West Antarctic Ice Sheet may have supplied the meltwater needed to start meltwater pulse 1B.
However, later studies involving the surface exposure dating of glacial erratics, nunataks, and other formerly glaciated exposures using cosmogenic dating contradicted the above arguments and assumptions. These studies tentatively concluded that the actual amount of thinning of the East Antarctic Ice Sheet is too small, , and likely too gradual and too late to have contributed any significant amount of meltwater to meltwater pulse 1B. They also concluded that ice sheet retreat and thinning accelerated for the West Antarctic Ice Sheet only after 7,000 years ago. Although other researchers have concluded that the abrupt decay of the Laurentide Ice Sheet might have been sufficient to have been responsible for meltwater pulse 1B, its sources remain an unresolved mystery. Complicating matters further, recent research in West Antarctica found that sufficient deglaciation contemporaneous with meltwater pulse 1B occurred to readily explain this rapid period of global sea level rise.
Mississippi River superflood events MWF-5
A variety of paleoclimate and paleohydrologic proxies, which can be used to reconstruct the prehistoric discharge of the Mississippi River, can be found in the sediments of the Louisiana continental shelf and slope, including the Orca and Pygmy basins, within the Gulf of Mexico. These proxies have been used by Quaternary geologists, paleoclimatologists, and oceanographers to reconstruct both the duration and the discharge at the mouth of the prehistoric Mississippi River for the Late Glacial and postglacial periods, including the time of meltwater pulse 1B. The chronology of flooding events found by the study of cores on the Louisiana continental shelf and slope is in agreement with the timing of meltwater pulses. For example, meltwater pulse 1A in the Barbados coral record matches quite well with a group of two separate Mississippi River meltwater flood events, MWF-3 (12,600 ) and MWF-4 (11,900 ). In addition, meltwater pulse 1B in the Barbados coral record matches a cluster of four Mississippi River superflood events, MWF-5, that occurred between 9,900 and 9,100 . In 2003, Aharon reported that flood event MWF-5 consists of four separate and distinct superfloods at 9,970–9,870; 9,740–9,660; 9,450–9,290; and 9,160–8,900 . The discharge at the mouth of the Mississippi River during three of the four superfloods of MWF-5 is estimated to have varied between 0.07 and 0.08 sverdrups (million cubic meters per second). The superflood at 9,450–9,290 is estimated to have had a discharge of 0.10 sverdrups. This research also shows that the Mississippi superfloods of MWF-5 occurred during the Preboreal. The same research found an absence of either meltwater floods or superfloods discharging into the Gulf of Mexico from the Mississippi River during the preceding thousand years, known as the cessation event, which corresponds with the Younger Dryas stadial.
The Pleistocene deposits blanketing the Louisiana Continental shelf and slope between the mouth of the Mississippi River and Orca and Pygmy basins largely consist of sediments transported down the Mississippi River mixed with variable additions of local biologically generated carbonate. Because of this, the provenance of the meltwater and superfloods can be readily inferred from the sediment's composition. The composition of the sediments brought into the Gulf of Mexico and deposited on the Louisiana continental shelf and slope during the superfloods of MWF-5 reflect an abrupt change in mineralogy, fossil content, organic matter, and amount after 12,900 years ago at the start of the Younger Dryas interval.
First, after 12,900 years ago, smectite-rich sediments from the Missouri River drainage are progressively and quickly replaced by sediments associated with the Great Lakes region and further south along the Mississippi River, as indicated by their clay mineralogy. Second, after 12,900 years ago, the overall quantity of sediment being transported down the Mississippi River abruptly decreases with a corresponding and significantly increased proportion of locally produced biologically generated carbonate and organic matter. Third, after 12,900 years ago, various analyses, e.g. C/N ratio and Rock–Eval Pyrolysis, indicate that the type of organic matter present changes from organic matter that was reworked from old formations by glacials to well-preserved Holocene organic matter that is mainly of marine origin. Finally, after 12,900 years ago, the presence of reworked nannofossils disappear from sediments accumulating on the Louisiana continental shelf and slope.
The above noted changes in the nature of accumulating sediments indicate that after the start of the Younger Dryas, the southern route for Laurentide Ice Sheet meltwater was largely blocked. On the rare occasions it could flow southward, glacial meltwater flowed through Lake Agassiz and sometimes the Great Lakes to the Mississippi River. As the water moved through Lake Agassiz or other proglacial lakes, these lakes completely trapped and removed any glacial outwash, along with the older, reworked organic material and reworked nannofossils that the outwash contained. As a result, the sediment carried by the Mississippi River after the start of the Younger Dryas consisted of illite- and chlorite-enriched sediments from the Great Lakes region that lacked any reworked nannofossils. These changes argue that the superfloods of MWF-5, which fed meltwater pulse 1B, are related to either rare periods of southerly discharge of meltwater through Lake Agassiz, nonglacial periods of climate-enhanced discharge within the Mississippi River Basin, or a combination of both.
Antarctic iceberg discharge events
In case of the Antarctic Ice Sheet, an equivalent well-dated, high-resolution record of the discharge of icebergs from various parts of the Antarctic Ice Sheet for the past 20,000 years is also available. Research by Weber and others constructed a record from variations in the amount of iceberg-rafted debris versus time and other environmental proxies in two cores taken from the ocean bottom within Iceberg Alley of the Weddell Sea. The cores of ocean bottom sediments within Iceberg Alley provide a spatially integrated signal of the variability of the discharge of icebergs into the marine waters by the Antarctic Ice Sheet because it is a confluence zone in which icebergs calved from the entire Antarctic Ice Sheet drift along currents, converge, and exit the Weddell Sea to the north into the Scotia Sea.
Between 20,000 and 9,000 years ago, Weber and others documented eight well-defined periods of increased iceberg calving and discharge from various parts of the Antarctic Ice Sheet. Five of these periods, AID5 through AID2 (Antarctic Iceberg Discharge events), are comparable in duration and have a repeat time of about 800–900 years. The largest of the Antarctic Iceberg Discharge events is AID2. Its peak intensity at about 11,300 years ago, which is synchronous with meltwater pulse 1B in the Barbados sea-level record, is consistent with a significant Antarctic contribution to meltwater pulse 1B. The lack of a sea level response in the Tahiti coral record might indicate a regionally specific sea-level response to a deglaciation event only from the Pacific sector of the Antarctic Ice Sheet.
See also
Deglaciation
Holocene glacial retreat
Younger Dryas
References
External links
Gornitz, V. (2007) Sea Level Rise, After the Ice Melted and Today. Science Briefs, NASA's Goddard Space Flight Center. (January 2007)
Gornitz, V. (2012) The Great Ice Meltdown and Rising Seas: Lessons for Tomorrow. Science Briefs, NASA's Goddard Space Flight Center. (June 2012)
Liu, J.P. (2004) Western Pacific Postglacial Sea-level History., River, Delta, Sea Level Change, and Ocean Margin Research Center, Marine, Earth and Atmospheric Sciences, North Carolina State University, Raleigh, NC.
Glaciology
Oceanography
Paleoclimatology
Sea level
10th millennium BC | Meltwater pulse 1B | [
"Physics",
"Environmental_science"
] | 2,452 | [
"Oceanography",
"Hydrology",
"Applied and interdisciplinary physics"
] |
47,635,273 | https://en.wikipedia.org/wiki/Ekoa | Ekoa is a natural biocomposite of flax available in dry fabrics and pre-pregs, as well as cores and resins. Ekoa can be used for a variety of applications, including the production of musical instruments like the ukulele and guitar, and the manufacturing of sports equipment such as bicycle frames and lacrosse sticks.
History
Ekoa was initially developed by Blackbird Guitars, a company that has made musical instruments out of Carbon fiber reinforced polymer, but started working on a biobased composite material that would work well for musical instruments. Blackbird worked with Entropy Resins to develop Ekoa, and released the first production musical instrument in 2013. Joe Luttwak of Blackbird and Desi Banatao of Entropy formed a separate company, Lingrove, LLC, to further develop Ekoa and expand applications. Luttwak filed for a patent for "METHOD FOR MAKING LIGHT AND STIFF PANELS AND STRUCTURES USING NATURAL FIBER COMPOSITES" on November 18, 2014, which was given A1 Kind Code status on May 15, 2015. Lingrove filed "Ekoa" as a registered trademark on November 12, 2013. The trademark was registered on February 3, 2015. The trademark is registered under two separate classes: 015 - Musical instruments, and 024 - Textiles and textile goods, not included in other classes; bed and table covers.
Applications
Ekoa was initially developed to combine the tone of wooden instruments with the durability of a composite instrument. Previously, composite instruments had been made with carbon fiber, glass fiber, or aluminum to achieve durability, but these materials did not have the same tonality of wood. To address this, Ekoa utilizes flax fibers and produces a tone more like wood. The first musical instrument product made of Ekoa was the Blackbird Clara concert ukulele, which has won a variety of awards in the composites industry, including at The Composites And Advanced Material Expo (CAMX), JEC Americas, and Industrial Designers Society of America's IDEA Award. Blackbird later introduced the El Capitan guitar model, also made with Ekoa.
For sports equipment, RockWest Composites has produced a bicycle frame in conjunction with Calfee, as well as a lacrosse stick with a hexagonal shape core wrapped in Ekoa twill.
References
Composite materials | Ekoa | [
"Physics"
] | 475 | [
"Materials",
"Composite materials",
"Matter"
] |
47,635,907 | https://en.wikipedia.org/wiki/IEEE%20Innovation%20in%20Societal%20Infrastructure%20Award | The IEEE Innovation in Societal Infrastructure Award is a Technical Field Award established by the IEEE Board of Directors in 2011. The IEEE Technical Field Awards are awarded for contributions or leadership in specific fields of interest of the IEEE.
This award is typically presented to an individual or a team of up to three people. Recipients of this award receive a bronze medal, certificate, and honorarium.
Recipients
2014: Balaji Prabhakar "For his demonstration of the innovative use of information technology and distributed computing systems to solve long-standing societal problems, in areas ranging from transportation to healthcare and recycling."
2015: Takemochi Ishii and Hirokazu Ihara and Atsunobu Ichikawa "For pioneering the concept of dependable autonomous decentralized systems and contributing to its practical application in early transport control systems."
2016: William H. Sanders "For the assessment-driven design of trustworthy cyber infrastructures for electric grid systems."
2017: Antonello (Anto) Monti "For accelerating innovation of energy, information, and communication technologies for the urban environment."
2018: David F. Ferraiolo, D. Richard Kuhn, and Ravi Sandhu "For advancing the foundations and practice of information security through creation, development, and technology transfer of role-based access control (RBAC)."
2019: Andy Vidan, Paul Breimyer, and Gregory G. Hogan "For development of real-time collaborative and distributed emergency response and recovery systems."
2020: Masaru Kitsuregawa
2021: Elisa Bertino
References
External links
IEEE Innovation in Societal Infrastructure Award page at IEEE
List of recipients of the IEEE Innovation in Societal Infrastructure Award
IEEE Innovation in Societal Infrastructure Award | IEEE Innovation in Societal Infrastructure Award | [
"Technology"
] | 345 | [
"Science and technology awards",
"Science award stubs"
] |
47,635,920 | https://en.wikipedia.org/wiki/DBGp | DBGp (Common DeBugGer Protocol), as used by Xdebug and potentially other implementations, is a simple protocol for use with language tools and engines for the purpose of debugging applications.
The protocol provides a means of communication between a debugger engine (scripting engine, Virtual Machine, etc.) and a debugger IDE.
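The wire framing is simple enough to sketch. Per the published DBGp specification, commands from the IDE are plain text terminated by a single NUL byte, while responses from the debugger engine are framed as an ASCII decimal byte count, a NUL, the XML payload, and a trailing NUL. The helper functions below are an illustrative sketch of that framing (the names are ours, not part of any DBGp implementation):

```python
# Sketch of DBGp wire framing, per the published specification.
# IDE -> engine:  COMMAND NUL
# engine -> IDE:  DATA-LENGTH NUL XML-DATA NUL

def frame_command(command: str) -> bytes:
    """Frame an IDE command, e.g. 'step_into -i 1', for the wire."""
    return command.encode("ascii") + b"\x00"

def parse_engine_messages(stream: bytes) -> list:
    """Split a byte stream of engine responses into XML payload strings."""
    messages = []
    i = 0
    while i < len(stream):
        nul = stream.index(b"\x00", i)          # end of the ASCII length field
        length = int(stream[i:nul])             # decimal byte count of payload
        xml = stream[nul + 1:nul + 1 + length]  # the XML payload itself
        assert stream[nul + 1 + length] == 0    # payload must be NUL-terminated
        messages.append(xml.decode("utf-8"))
        i = nul + 2 + length                    # skip past the trailing NUL
    return messages

# Round-trip demonstration with a made-up init packet:
payload = b'<init appid="example" language="PHP" protocol_version="1.0"/>'
wire = str(len(payload)).encode("ascii") + b"\x00" + payload + b"\x00"
print(parse_engine_messages(wire)[0])
```

Because the length field is plain ASCII rather than a fixed binary header, a reader must scan for the NUL delimiter before it knows how many bytes to expect, one aspect of the text-mode design criticized below.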
Criticisms
DBGp has not received widespread adoption as a server protocol. Most implementations are client-side so that IDEs may be compatible specifically with Xdebug, which remains popular.
Criticisms have included:
Performance (DBGp is a text-mode protocol)
Security (DBGp has a complex connection mechanism that could lead to buggy vulnerable implementations)
Generality (DBGp is designed to be compatible with multiple programming languages rather than being optimized for PHP)
A primary author of the DBGp specification has defended the design.
References
Communications protocols
Debuggers | DBGp | [
"Technology"
] | 186 | [
"Computer standards",
"Communications protocols"
] |
47,636,400 | https://en.wikipedia.org/wiki/Forum%20Geometricorum | Forum Geometricorum: A Journal on Classical Euclidean Geometry was a peer-reviewed open-access academic journal that specialized in mathematical research papers on Euclidean geometry.
Founded in 2001, it was published by Florida Atlantic University and was indexed by Mathematical Reviews and . Its founding editor-in-chief was Paul Yiu, a professor of mathematics at Florida Atlantic.
In 2019, Forum Geometricorum published what was later announced to be its final issue, and stopped accepting submissions, after the retirement of Yiu.
Prior issues are still available. Volumes for 2001 to 2009 can be accessed as a single searchable file (see below). Individual articles up to 2019 are available from Internet Archive (see below).
See also
International Journal of Geometry
Geometry, an open-access journal first published in July 2024: https://www.mdpi.com/journal/geometry
References
External links
Individual articles: https://scholar.archive.org/search?q=Forum+Geometricorum
Geometry journals
Open access journals
Academic journals established in 2001
Florida Atlantic University
English-language journals | Forum Geometricorum | [
"Mathematics"
] | 215 | [
"Geometry",
"Geometry journals"
] |
47,637,779 | https://en.wikipedia.org/wiki/Hortiboletus%20bubalinus | Hortiboletus bubalinus is a species of bolete fungus in the family Boletaceae. Originally described in 1991 as a species of Boletus, the fungus was transferred to Xerocomus in 1993. It was transferred to Hortiboletus by Bálint Dima in 2015.
References
External links
Boletaceae
Fungi described in 1991
Fungi of Europe
Fungus species | Hortiboletus bubalinus | [
"Biology"
] | 80 | [
"Fungi",
"Fungus species"
] |
47,638,279 | https://en.wikipedia.org/wiki/Data%20Analytics%20Library | oneAPI Data Analytics Library (oneDAL; formerly Intel Data Analytics Acceleration Library or Intel DAAL) is a library of optimized algorithmic building blocks for data analysis stages most commonly associated with solving Big Data problems.
The library supports Intel processors and is available for Windows, Linux and macOS operating systems. It is designed for use with popular data platforms including Hadoop, Spark, R, and MATLAB.
History
Intel launched the oneAPI Data Analytics Library (oneDAL) on December 8, 2020. Its predecessor, the Data Analytics Acceleration Library, was launched on August 25, 2015 as Intel Data Analytics Acceleration Library 2016 (Intel DAAL 2016). oneDAL is bundled with the Intel oneAPI Base Toolkit as a commercial product. A standalone version is available commercially or freely, the only difference being support and maintenance.
License
Apache License 2.0
Details
Functional categories
Intel DAAL has the following algorithms:
Analysis
Low Order Moments: Includes computing min, max, mean, standard deviation, variance, etc. for a dataset.
Quantiles: splitting observations into equal-sized groups defined by quantile orders.
Correlation matrix and variance-covariance matrix: A basic tool in understanding statistical dependence among variables. The degree of correlation indicates the tendency of one change to indicate the likely change in another.
Cosine distance matrix: Measuring pairwise distance using cosine distance.
Correlation distance matrix: Measuring pairwise distance between items using correlation distance.
Clustering: Grouping data into unlabeled groups. This is a typical technique used in "unsupervised learning", where there is no established model to rely on. Intel DAAL provides two algorithms for clustering: K-Means and "EM for GMM."
Principal Component Analysis (PCA): the most popular algorithm for dimensionality reduction.
Association rules mining: Detecting co-occurrence patterns. Commonly known as “shopping basket mining.”
Data transformation through matrix decomposition: DAAL provides Cholesky, QR, and SVD decomposition algorithms.
Outlier detection: Identifying observations that are abnormally distant from typical distribution of other observations.
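As a concrete illustration of the dimensionality-reduction building block listed above, here is a minimal PCA-via-SVD sketch in plain NumPy. This is not the Intel DAAL/oneDAL API; the function name and shapes are illustrative only.

```python
import numpy as np

# Minimal PCA sketch in plain NumPy, illustrating what a PCA building
# block computes conceptually. This is NOT the Intel DAAL/oneDAL API.
def pca(X, n_components):
    # Center the data so the components describe variance about the mean.
    Xc = X - X.mean(axis=0)
    # SVD of the centered matrix; rows of Vt are the principal directions.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:n_components]
    # Project onto the leading components to reduce dimensionality.
    return Xc @ components.T, components

X = np.random.default_rng(0).normal(size=(100, 5))
scores, components = pca(X, 2)
print(scores.shape)  # (100, 2)
```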
Training and Prediction
Regression
Linear regression: The simplest regression method. Fitting a linear equation to model the relationship between dependent variables (things to be predicted) and explanatory variables (things known).
Classification: Building a model to assign items into different labeled groups. DAAL provides multiple algorithms in this area, including Naïve Bayes classifier, Support Vector Machine, and multi-class classifiers.
Recommendation systems
Neural networks
Intel DAAL supported three processing modes:
Batch processing: When all data fits in the memory, a function is called to process the data all at once.
Online processing (also called streaming): When all data does not fit in memory, Intel DAAL can process data chunks individually and combine all partial results at the finalizing stage.
Distributed processing: DAAL supports a model similar to MapReduce. Consumers in a cluster process local data (map stage), and then the Producer process collects and combines partial results from Consumers (reduce stage). Intel DAAL offers flexibility in this mode by leaving the communication functions completely to the developer. Developers can choose to use the data movement in a framework such as Hadoop or Spark, or explicitly coding communications most likely with MPI.
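The online mode described above can be sketched as follows: each chunk yields a partial result (count, sum, sum of squares), and the partial results are combined at a finalizing stage to recover low order moments. This is a hedged illustration in plain Python, not the DAAL API.

```python
# Illustrative sketch of the "online" processing mode (NOT the DAAL API):
# per-chunk partial results are combined at the finalizing stage.
def partial_result(chunk):
    # Sufficient statistics for mean and variance of one chunk.
    return len(chunk), sum(chunk), sum(x * x for x in chunk)

def finalize(partials):
    # Combine partial results from all chunks.
    n = sum(p[0] for p in partials)
    s = sum(p[1] for p in partials)
    ss = sum(p[2] for p in partials)
    mean = s / n
    variance = ss / n - mean * mean  # population variance
    return mean, variance

chunks = [[1.0, 2.0], [3.0, 4.0, 5.0]]
mean, var = finalize([partial_result(c) for c in chunks])
print(mean, var)  # 3.0 2.0
```

The same combine step is what the reduce stage performs in the distributed mode.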
References
External links
OneAPI (compute acceleration)
oneAPI oneDAL Specification
DAAL Official Product Website
DAAL Support
DAAL User Forum
DAAL Support Channel
Intel software
Numerical software
Numerical linear algebra | Data Analytics Library | [
"Mathematics"
] | 710 | [
"Numerical software",
"Mathematical software"
] |
47,639,156 | https://en.wikipedia.org/wiki/Sharktopus%20vs.%20Whalewolf | Sharktopus vs. Whalewolf is a television film that premiered on July 19, 2015 on Syfy.
It is the third and final installment in the Sharktopus franchise, after Sharktopus (2010) and Sharktopus vs. Pteracuda (2014).
Plot
Since its fight with the Pteracuda, the Sharktopus is still at large and is lurking in the waters of the Dominican Republic. An alcoholic boat captain named Ray (Casper Van Dien) and his sidekick Pablo (Jorge Eduardo de los Santos) are enlisted by a voodoo priest named Tiny (Tony Almont) to obtain the heart of the Sharktopus. Meanwhile, Dr. Reinhart (Catherine Oxenberg), a mad scientist who studied with the late Nathan Sands and the late Rico Symes from the previous two Sharktopus movies, mixes the genes of a killer whale and a wolf (resembling the extinct Pakicetus, an ancestor of modern whales). The resulting treatment transforms Felix Rosa (Mario Arturo Hernández) into the Whalewolf, which causes havoc and results in it fighting the Sharktopus, a battle in which the Whalewolf dies.
Cast
Catherine Oxenberg as Dr. Reinhart
Casper Van Dien as Ray
Akari Endo as Officer Nita Morales
Jorge Eduardo de los Santos as Pablo
Jennifer Wenger as Betty
Tony Almont as "Tiny"
Mario Arturo Hernández as Felix Rosa
Reviews
Felix Vasquez Jr. of Cinema Crazed would not call the film "a masterpiece, but it's guilty fun". Dylan Grable of Medium.com stated that the film had comedic value, being "so crazy that it culminates in a stupor of stupidity and entertainment".
Home media
No U.S. DVDs of this film were made, but Regions 2 and 4 DVDs were made and are available online.
References
External links
Sharktopus vs. Whalewolf at Internet Movie Database
2015 television films
2015 films
American science fiction horror films
2010s English-language films
2015 horror films
American natural horror films
Syfy original films
2010s science fiction horror films
Fictional sea monsters
Fictional hybrids
Films about genetic engineering
Mad scientist films
American independent films
2010s monster movies
Giant monster films
Films about shark attacks
American monster movies
American horror television films
Sharktopus films
2015 independent films
Films set in the Dominican Republic
Films produced by Roger Corman
Films directed by Kevin O'Neill (director)
2010s American films
English-language science fiction horror films
English-language independent films | Sharktopus vs. Whalewolf | [
"Biology"
] | 493 | [
"Fictional hybrids",
"Hybrid organisms"
] |
47,639,409 | https://en.wikipedia.org/wiki/Urban%20informatics | Urban informatics refers to the study of people creating, applying and using information and communication technology and data in the context of cities and urban environments. It sits at the conjunction of urban science, geomatics, and informatics, with the ultimate goal of creating smarter and more sustainable cities. Various definitions are available, some provided in the Definitions section.
Although first mentions of the term date back as early as 1987, urban informatics did not emerge as a notable field of research and practice until 2006 (see History section). Since then, the emergence and growing popularity of ubiquitous computing, open data and big data analytics, as well as smart cities, contributed to a surge in interest in urban informatics, not just from academics but also from industry and city governments seeking to explore and apply the possibilities and opportunities of urban informatics.
Definitions
Many definitions of urban informatics have been published and can be found online. The descriptions provided by Townsend in his foreword and by Foth in his preface to the Handbook of Research on Urban Informatics emphasize two key aspects: (1) the new possibilities (including real-time data) for both citizens and city administrations afforded by ubiquitous computing, and (2) the convergence of physical and digital aspects of the city.
In this definition, urban informatics is a trans-disciplinary field of research and practice that draws on three broad domains: people, place and technology.
"People" can refer to city residents, citizens, and community groups, from various socio-cultural backgrounds, as well as the social dimensions of non-profit organisations and businesses. The social research domains that urban informatics draws from include urban sociology, media studies, communication studies, cultural studies, city planning and others.
"Place" can refer to distinct urban sites, locales and habitats, as well as to larger-scale geographic entities such as neighbourhoods, public space, suburbs, regions, or peri-urban areas. The place or spatial research domains entail urban studies, architecture, urban design, urban planning, geography, and others.
"Technology" can refer to various types of information and communication technology and ubiquitous computing / urban computing technology such as mobile phones, wearable devices, urban screens, media façades, sensors, and other Internet of Things devices. The technology research domains span informatics, computer science, software engineering, human–computer interaction, and others.
In addition to geographic data/spatial data, most common sources of data relevant to urban informatics can be divided into three broad categories: government data (census data, open data, etc.); personal data (social media, quantified self data, etc.); and sensor data (transport, surveillance, CCTV, Internet of Things devices, etc.).
Although closely related, Foth differentiates urban informatics from the field of urban computing by suggesting that the former focusses more on the social and human implications of technology in cities (similar to the community and social emphases of how community informatics and social informatics are defined), and the latter focusses more on technology and computing. Urban informatics emphasises the relationship between urbanity, as expressed through the many dimensions of urban life, and technology.
Later, with the increasing popularity of commercial opportunities under the label of smart city and big data, subsequent definitions became narrow and limited in defining urban informatics mainly as big data analytics for efficiency and productivity gains in city contexts – unless the arts and social sciences are added to the interdisciplinary mix. This specialisation within urban informatics is sometimes referred to as 'data-driven, networked urbanism' or urban science.
In the book Urban Informatics published in 2021, the term Urban Informatics has been defined in a systematical and principled way.
History
One of the first occurrences of the term can be found in Mark E. Hepworth's 1987 article "The Information City", which mentions the term "urban informatics" on page 261. However, Hepworth's overall discussion is more concerned with the broader notion of "informatics planning". Considering the article pre-dates the advent of ubiquitous computing and urban computing, it does contain some visionary thoughts about major changes on the horizon brought about by information and communications technology and the impact on cities.
The Urban Informatics Research Lab was founded at Queensland University of Technology in 2006, the first research group explicitly named to reflect its dedication to the study of urban informatics. The first edited book on the topic, the Handbook of Research on Urban Informatics, published in 2009, brought together researchers and scholars from three broad domains: people, place, and technology; or, the social, the spatial, and the technical.
There were many precursors to this transdisciplinarity of "people, place, and technology." From an architecture, planning and design background, there is the work of the late William J. Mitchell, Dean of the MIT School of Architecture and Planning, and author of the 1995 book City of Bits: Space, Place, and the Infobahn. Mitchell was influential in suggesting a profound relationship between place and technology at a time when mainstream interest was focused on the promise of the Information Superhighway and what Frances Cairncross called the "Death of Distance". Rather than a decline in the significance of place through remote work, distance education, and e-commerce, the physical / tangible layers of the city started to mix with the digital layers of the internet and online communications. Aspects of this trend have been studied under the terms community informatics and community networks.
One of the first texts that systematically examined the impact of information technologies on the spatial and social evolution of cities is Telecommunications and the City: Electronic Spaces, Urban Places, by Stephen Graham and Simon Marvin. The relationship between cities and the internet was further expanded upon in a volume edited by Stephen Graham entitled Cybercities Reader and by various authors in the 2006 book Networked Neighbourhoods: The Connected Community in Context edited by Patrick Purcell. Additionally, contributions from architecture, design and planning scholars are contained in the 2007 journal special issue on "Space, Sociality, and Pervasive Computing" published in the journal Environment and Planning B: Planning and Design, 34(3), guest edited by the late Bharat Dave, as well as in the 2008 book Augmented Urban Spaces: Articulating the Physical and Electronic City, edited by Alessandro Aurigi and Fiorella De Cindio, based on contributions to the Digital Cities 4 workshop held in conjunction with the Communities and Technologies (C&T) conference 2005 in Milan, Italy.
The first prominent and explicit use of the term "urban informatics" in the sociology and media studies literature appears in the 2007 special issue "Urban Informatics: Software, Cities and the New Cartographies of Knowing Capitalism" published in the journal Information, Communication & Society, 10(6), guest edited by Ellison, Burrows, & Parker. Later on, in 2013, Burrows and Beer argued that the socio-technical transformations described by research studies conducted in the field of urban informatics give reason for sociologists more broadly to not only question epistemological and methodological norms and practices but also to rethink spatial assumptions.
In computer science, the sub-domains of human–computer interaction, ubiquitous computing, and urban computing provided early contributions that influenced the emerging field of urban informatics. Examples include the Digital Cities workshop series (see below), Greenfield's 2006 book Everyware: The Dawning Age of Ubiquitous Computing, and the 2006 special issue "Urban Computing: Navigating Space and Context" published in the IEEE journal Computer, 39(9), guest edited by Shklovski & Chang, and the 2007 special issue "Urban Computing" published in the IEEE journal Pervasive Computing, 6(3), guest edited by Kindberg, Chalmers, & Paulos.
Digital Cities Workshop Series
The Digital Cities Workshop Series started in 1999 and is the longest running academic workshop series that has focused on, and profoundly influenced, the field of urban informatics. The first two workshops in 1999 and 2001 were both held in Kyoto, Japan, with subsequent workshops since 2003 held in conjunction with the biennial International Conference on Communities and Technologies (C&T).
Each Digital Cities workshop proceedings have become the basis for key anthologies listed below, which in turn have also been formative to a diverse set of emerging fields, including urban informatics, urban computing, smart cities, pervasive computing, internet of things, media architecture, urban interaction design, and urban science.
Research centres
Methods
The diverse range of people, groups and organisations involved in urban informatics is reflective of the diversity of methods being used in its pursuit and practice. As a result, urban informatics borrows from a wide range of methodologies across the social sciences, humanities, arts, design, architecture, planning (including geographic information systems), and technology (in particular computer science, pervasive computing, and ubiquitous computing), and applies those to the urban domain. Examples include:
Action research and participatory action research
Big data analytics and urban science
Critical theory
Cultural mapping
Grounded theory
Interaction design
Participatory design
Spatial analysis, including urban modelling, complex urban systems analysis, geographic information systems, and space syntax analysis
User-centred design
See also
Communicative ecology
Community informatics
E-government
Geoinformatics
Human–computer interaction
Interaction design
Location-based service
Locative media
Placemaking
Ubiquitous computing
Urban computing
References
Further reading
Since Foth's 2009 Handbook of Research on Urban Informatics, a number of books and special issues of academic journals have been published on the topic, which further demonstrate the increasing significance and notability of the field of urban informatics. Key works include:
External links
Big Data for Urban Informatics and Earth Observation, ISPRS International Journal of Geo-Information
The Use of Urban Informatics in Climate Risk Management, Climate Risk Management
Advances in urban informatics, Environment and Planning B: Urban Analytics and City Science
Community networks
Human–computer interaction
Information society
Interdisciplinary subfields of sociology
Urban design
Urban planning | Urban informatics | [
"Technology",
"Engineering"
] | 2,044 | [
"Information society",
"Urban planning",
"Computing and society",
"Human–machine interaction",
"Human–computer interaction",
"Architecture"
] |
47,639,454 | https://en.wikipedia.org/wiki/The%20Mystery%20of%20Matter | The Mystery of Matter: Search for the Elements is a 2014 American documentary miniseries, which premiered nationwide on August 19, 2015. The PBS documentary, in three-episodes of one hour each, was directed by Stephen Lyons and Muffie Meyer.
The series, which took ten years to make, describes the search for the basic chemical elements that form matter by focusing on the lives and times of seven scientific visionaries. Hosted by actor Michael Emerson, the series depicts the creative process of the scientists, with actors describing the process of discovery in the scientists' own words and reenacting their major discoveries using replicas of their original laboratory equipment.
Episodes
Participants
The documentary is narrated by Michael Emerson and includes the following participants (alphabetized by last name):
Matthew Amendt (actor)
Michael Aronov (actor)
Hugo Becker (actor)
Ava Deluca-Verley (actress)
Michael Emerson (narrator)
Russell Egdell (chemist)
Nick Gehlfuss (actor)
John L. Heilbron (biographer)
Roald Hoffmann (chemist)
David Kaiser (physicist/historian)
Paul Lyons (actor)
Anthony Marble (actor)
Seymour Mauskopf (historian)
Patrick Page (actor)
Gregory Petsko (chemist)
Lawrence M. Principe (chemist/historian)
Sebastian Roché (actor)
Alan Rocke (historian)
Juliet Rylance (actress)
Eric Scerri (chemist)
Neil Todd (Manchester University)
Gallery
The seven featured scientists
Cast and advisors
Reviews and criticism
According to Carman Drahl of Forbes magazine, "Chemists will quickly recognize the life stories of giants in their field. This show wasn’t designed just for chemists, however. The target audience includes teachers, students, and curious TV viewers." The series, based on a National Science Foundation project description, tells "a 'detective story' of chemistry, stretching from the ancient alchemists to today's efforts to find stable new forms of matter". Mark Dawidziak, of the Cleveland Plain Dealer, quotes the historical advisor, Alan Rocke: "[The series] portrays science as [a] very human process. People see it is a very mechanical process. A great humanity is revealed by these stories, but also the unfolding process of how science actually comes to these understandings of nature." Erica K. Jacobsen, of the Chemical Education Division of the American Chemical Society, found the series to be "an excellent tool for bringing students a different view of the periodic table and those involved in its history".
See also
Atom
Chemical element
Electron
History of chemistry
History of the periodic table
Neutron
Proton
Search for the Super Battery (2017 PBS film)
References
External links
The Mystery of Matter at the PBS website
The Mystery of Matter (videos) at the PBS website
The Mystery of Matter at Amazon.com
The Mystery of Matter – video search on YouTube
The Mystery of Matter – video search on Dailymotion
2010s American television miniseries
2010s American documentary television series
History of chemistry
Periodic table in popular culture
Science docudramas | The Mystery of Matter | [
"Chemistry"
] | 618 | [
"Periodic table",
"Periodic table in popular culture"
] |
47,640,103 | https://en.wikipedia.org/wiki/HCL%20color%20space | HCL (Hue-Chroma-Luminance) or LCh refers to any of the many cylindrical color space models that are designed to accord with human perception of color with the three parameters. LCh has been adopted by information visualization practitioners to present data without the bias implicit in using varying saturation. They are, in general, designed to have characteristics of both cylindrical translations of the RGB color space, such as HSL and HSV, and the L*a*b* color space. Some conflicting definitions of the terms are:
A name for a cylindrical transformation of CIELuv (CIELChuv) employed by Ihaka (2003) and adopted by Zeileis et al. (2009, 2020). This name appears to be the one most commonly used in information visualization. Ihaka, Zeileis, and co-authors also provide software implementations and web pages to promote its use.
A name for cylindrical CIELab (CIELChab), employed by chroma.js.
"HCL" designed in 2005 by Sarifuddin and Missaou, which is a transformation of whatever type of RGB color space is in use.
HCT, with tone as a synonym for luminance, is used within Material Design for its color system, using value ranges of 0–360°, 0–120+ and 0–100%, respectively. Its hue and chroma come from CAM16, whereas tone is actually L* from CIELab.
Derivation
Color-making attributes
HCL concerns the following attributes of color appearance:
Hue The "attribute of a visual sensation according to which an area appears to be similar to one of the perceived colors: red, yellow, green, and blue, or to a combination of two of them".
Lightness, value The "brightness relative to the brightness of a similarly illuminated white".
Luminance (Y or Lv,Ω) The radiance weighted by the effect of each wavelength on a typical human observer, measured in SI units in candela per square meter (). Often the term luminance is used for the relative luminance, Y/Yn, where Yn is the luminance of the reference white point.
Colorfulness The "attribute of a visual sensation according to which the perceived color of an area appears to be more or less chromatic".
The HSL and HSV color spaces are more intuitive translations of the RGB color space, because they provide a single hue number. However, their luminance variation does not match the way humans perceive color. Perceptually uniform color spaces outperform RGB in cases such as high noise environments.
CIE color spaces
CIE-based LCh color spaces are transformations of the two chroma values (ab or uv) into polar coordinates. The source color spaces are still very well regarded for their uniformity, and the transformation does not degrade this aspect. See the respective articles for how the underlying coordinates are derived.
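A minimal sketch of this polar transformation, assuming CIELAB inputs: lightness L is carried over unchanged, and the (a, b) chroma pair is rewritten as chroma C and a hue angle h in degrees.

```python
import math

# Sketch of the standard Lab -> LCh(ab) transformation: keep L, and
# express the (a, b) pair in polar coordinates (chroma, hue angle).
def lab_to_lch(L, a, b):
    C = math.hypot(a, b)                         # chroma: distance from the L axis
    h = math.degrees(math.atan2(b, a)) % 360.0   # hue angle in [0, 360)
    return L, C, h

print(lab_to_lch(50.0, 3.0, 4.0))  # L=50.0, C=5.0, h≈53.13
```

The CIELChuv case is identical with (u*, v*) in place of (a*, b*).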
Sarifuddin 2005
Sarifuddin, noting the lack of blue hue consistency in CIELAB (a common complaint among its users), proposed a color space of their own that combines some of the features of existing models.
According to the Stack Overflow user Tatarize, what Sarifuddin proposes as "HCL" is algorithmically similar to HSL. While pointing out advantages in computational efficiency, they argue that Sarifuddin's work does not represent a significant improvement over the CIELAB color space while showing failure to reproduce the paper's claims. They also propose what they consider to be an improved version of Sarifuddin's algorithm.
Other color appearance models
In general, any color appearance model with a lightness and two chroma components can also be transformed into an HCL-type color space by turning the chroma components into polar coordinates.
Implementations
CIELCh has been implemented in a wide range of ways: as programmatic code for generating color swatches in statistics tools, as standalone tools for designing and testing swatches, or as libraries that allow other programs to use the color space. Some implementations include:
Statistical tools:
d3.js: Data Driven Documents JavaScript library (CIELChab)
Swatch designs:
The colorspace package for the R and the Python programming languages, also with pre-made sets of swatches in hclwizard
Fabio Crameri's scientific colour maps, a set of pre-made swatches
Library:
The aforementioned colorspace library (CIELChuv)
ac-colors JavaScript library (CIELChab and CIELChuv)
chroma.js JavaScript library (CIELChab)
colorio for Python
Most other color space libraries handle at least one of CIELUV or CIELAB
References
External links
HCL Wizard online color apps
colorspace: HCL-Based Color Tools and Palettes in R
Generating random colors
How To Avoid Equidistant HSV Colors
Color Space Blues
HCL demo
Color space | HCL color space | [
"Mathematics"
] | 1,035 | [
"Color space",
"Space (mathematics)",
"Metric spaces"
] |
47,640,262 | https://en.wikipedia.org/wiki/Two%20Years%20Eight%20Months%20and%20Twenty-Eight%20Nights | Two Years Eight Months and Twenty-Eight Nights is a fantasy novel by British Indian author Salman Rushdie published by Jonathan Cape in 2015.
Plot
The novel is set in New York City in the near future. It deals with jinns, and recounts the story of a jinnia princess and her offspring during the "strangenesses". After a great storm, slits between the world of jinns and the world of men are opened and strange phenomena emerge as dark jinnis invade the Earth. The jinnia princess and her children thus need to fight to defend the Earth and the humans from them, the Grand Ifrits. All the while, the Great Philosopher Averroes (Ibn Rushd) and the famous theologian Al-Ghazali pursue a philosophical debate about reason and God.
Title
The title is a reference to the 1,001 nights Scheherazade spent telling stories in the Middle-Eastern story of One Thousand and One Nights.
Critical reception
According to Book Marks, the book received "mixed" reviews (or a "B-") based on ten critic reviews: two "rave", two "positive", one "mixed" and five "pan". In its November/December 2015 issue, Bookmarks, a magazine that aggregates critic reviews of books, gave the book 3.0 out of 5 based on critic reviews, with a critical summary saying, "Underdeveloped characters, a complicated structure characterized by abrupt shifts in perspective, and repetition bothered some critics; a few also questioned his playful treatment of religious fanaticism and his choice to use a collective, futuristic 'we' as a narrator".
In a review of the book in The Guardian, Erica Wagner said that it is a "wonderful" novel and praised Rushdie: "the dark delights that spring from his imagination in this novel have the spellbinding energy that has marked the greatest storytellers since the days of Scheherazade." Also in The Guardian, Ursula K. Le Guin praises the novel's "fierce colours, [...] boisterousness, humour and tremendous pizzazz" and Rushdie's "fractal imagination".
References
External links
Publishers Weekly Review
2015 British novels
Novels by Salman Rushdie
Novels set in the future
Novels set in New York City
Jonathan Cape books
Jinn in popular culture
Cultural depictions of Averroes
Al-Ghazali | Two Years Eight Months and Twenty-Eight Nights | [
"Astronomy"
] | 559 | [
"Cultural depictions of astronomers",
"Cultural depictions of Averroes"
] |
47,640,478 | https://en.wikipedia.org/wiki/Etelcalcetide | Etelcalcetide, sold under the brand name Parsabiv, is a calcimimetic medication for the treatment of secondary hyperparathyroidism in people undergoing hemodialysis. It is administered intravenously at the end of each dialysis session. Etelcalcetide functions by binding to and activating the calcium-sensing receptor in the parathyroid gland. Parsabiv is currently owned by Amgen and Ono Pharmaceuticals in Japan.
Medical uses
Etelcalcetide is used for the treatment of secondary hyperparathyroidism in people with chronic kidney disease (CKD) on hemodialysis. Hyperparathyroidism is the condition of elevated parathyroid hormone (PTH) levels and is often observed in people with CKD.
Pharmacodynamics
Mechanism of action
Etelcalcetide functions by binding to and activating the calcium-sensing receptor (CaSR) in the parathyroid gland as an allosteric activator, resulting in PTH reduction and suppression.
Pharmacokinetics
Etelcalcetide follows first-order elimination, with a half-life of 19 hours.
No interaction studies in humans were conducted. Studies in vitro showed no affinity of etelcalcetide to cytochrome P450 enzymes or common transport proteins. Therefore, no relevant pharmacokinetic interactions are expected.
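A minimal sketch of what first-order elimination with the stated 19-hour half-life implies for plasma concentration over time. The starting concentration of 100 arbitrary units is an assumed illustrative value, not a clinical dose.

```python
import math

# First-order elimination: C(t) = C0 * exp(-k*t), with rate constant
# k = ln(2) / t_half. The half-life of 19 h is from the text; C0 = 100
# arbitrary units is an assumed illustrative starting value.
def concentration(c0, t_half, t):
    k = math.log(2) / t_half
    return c0 * math.exp(-k * t)

print(round(concentration(100.0, 19.0, 19.0), 1))  # 50.0 after one half-life
print(round(concentration(100.0, 19.0, 38.0), 1))  # 25.0 after two
```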
Side effects
Common side effects (in more than 10% of people) are nausea, vomiting, diarrhoea, muscle spasms, and hypocalcaemia (too low blood calcium levels). In clinical studies, the latter side effect was usually mild to moderate and without symptoms. An increase of the QT interval of more than 60 ms was detected in 1.2% of people receiving etelcalcetide.
Due to the lower iPTH levels achieved by the use of this drug, it is possible that adynamic bone disease could occur at levels "below 100 pg/mL".
Contraindications
The drug is contraindicated in people with blood serum calcium levels below the norm.
Chemistry
The substance is a peptide consisting mostly of D-amino acids instead of the common L-amino acids. More specifically, it is the disulfide of N-acetyl-D-cysteinyl-D-alanyl-D-arginyl-D-arginyl-D-arginyl-D-alanyl-D-argininamide with L-cysteine.
History
Etelcalcetide was originally developed by KAI Pharmaceuticals. After positive phase II trials, Amgen acquired KAI for $315 million.
In 2011, KAI entered into an agreement with Ono Pharmaceutical for production of etelcalcetide in Japan, the deal being worth ¥1 billion.
In August 2015 Amgen Inc. announced its submission of a new drug application to the Food and Drug Administration for etelcalcetide. The European Medicines Agency approved the medication in November 2016.
In February 2017, the FDA approved Parsabiv for the treatment of secondary hyperparathyroidism.
Research
Phase II trials found that etelcalcetide lowered PTH levels in one cohort by 49%, versus a 29% increase in the placebo group. In another phase II study, "89% of patients experienced a ≥30% reduction in PTH and 56% achieved a PTH level of ≤300 pg/mL."
In 2017, two phase III trials found that using etelcalcetide showed greater symptom reduction compared to placebo. Etelcalcetide was also able to lower PTH levels below 300 pg/mL more often.
Phase I pediatric studies are planned for the US and UK for etelcalcetide.
References
Nephrology procedures
Systemic hormonal preparations
Peptides | Etelcalcetide | [
"Chemistry"
] | 802 | [
"Biomolecules by chemical classification",
"Peptides",
"Molecular biology"
] |
47,640,526 | https://en.wikipedia.org/wiki/Jemma%20Simmons | Jemma Anne Simmons is a fictional character that originated in the Marvel Cinematic Universe before appearing in Marvel Comics. The character, created by Joss Whedon, Jed Whedon and Maurissa Tancharoen, first appeared in the 2013 pilot episode of Agents of S.H.I.E.L.D. and has continually been portrayed by Elizabeth Henstridge.
In the series, Simmons is one of S.H.I.E.L.D.'s top scientific minds. Though her experience is vast, she is particularly expert in biological sciences. Many of her storylines involve her relationship with her best friend, and later husband, Leopold Fitz. Over the course of the series, she develops from a relatively young and inexperienced S.H.I.E.L.D. scientist to one of S.H.I.E.L.D.'s most senior agents, also clocking significant experience as a field agent. She is distinguished from her colleagues by taking very determined and sometimes coldly rational decisions in the pursuit of what she believes is right.
Fictional character biography
New Agent of S.H.I.E.L.D.
Jemma Simmons is brought on to S.H.I.E.L.D. agent Phil Coulson's team as a life sciences (both human and alien) specialist. She has a close bond with fellow agent Leopold Fitz, whom she met at the S.H.I.E.L.D. academy, with both being its Science and Technology division's youngest graduates. Near the end of the season, Fitz and Simmons lock themselves inside a medical unit for safety from rogue agent Grant Ward, who ejects the unit into the ocean. Trapped on the ocean floor, Fitz and Simmons send out a distress signal, and devise a controlled explosive to blow the windows open and escape. Fitz forces a distraught Simmons to take the sole oxygen tank, professing his feelings for her. He is nearly drowned after using the explosive, while Simmons swims to the surface with his unconscious body, where they are rescued by Nick Fury, who picked up their distress signal.
Joining Hydra
Following their underwater experience, Fitz struggles with technology and hallucinates the presence of Simmons, who left S.H.I.E.L.D. some time earlier because of Fitz's condition. However, it is later revealed that Simmons is working undercover within Hydra. Her identity is exposed, but Hydra security chief Bobbi Morse, another undercover S.H.I.E.L.D. agent, rescues her and Simmons reunites with a recovering Fitz. Near the end of the season, as Fitz arranges for a date with Simmons, a Kree weapon called the "Monolith", which is in S.H.I.E.L.D. custody, breaks free of containment and absorbs Simmons into itself.
Stranded on Maveth
In season three, Fitz acquires an ancient Hebrew scroll describing the Monolith that consumed Simmons as "Death", which Fitz is unable to accept. Unknown to him, Simmons is alive on a desolate alien planet. Fitz realizes that the Monolith is a portal, and with help from the Asgardian Elliot Randolph and S.H.I.E.L.D. agent Daisy Johnson, is able to enter the portal, find Simmons, and rescue her just as Daisy's power destroys the Monolith. Simmons struggles to readjust to being back on Earth, and tells Fitz about the 4,722 hours she spent stranded on the desert planet. Fitz and Simmons eventually consummate their relationship.
S.H.I.E.L.D. inner circle
Simmons is now shown working in the inner circle of the new but paranoid S.H.I.E.L.D. director Jeffrey Mace, and takes daily lie-detector tests. Mace's public approval is high after his alleged heroics during a bombing in Vienna, but when Simmons threatens to reveal the truth about his actions there, he agrees to exempt her from the lie-detection tests. When everyone on the base is kidnapped and replaced by LMDs, Simmons is one of the two remaining real people. After discovering that Fitz has been replaced, she is forced to stab him several times after crushing him with heavy equipment. Teaming up with Daisy, the other real person, the two release sleeping gas and fight off the LMDs, then wake Piper and Davis, who fly them out as the LMDs are blown up. Simmons then hacks into the Framework so that she and Daisy can enter it. Inside the Framework, Simmons wakes up in a mass grave, where she discovers that her avatar was murdered. After being thrown out of a car, she realizes that S.H.I.E.L.D. lost the war in this world, with Hydra running everything. Simmons tracks down Coulson but is unable to convince him that this world is fake; he calls Hydra on her, and she narrowly escapes. She meets up with Daisy, but their way out has been damaged.
Future and Present
An unknown group seizes Simmons and the other agents of S.H.I.E.L.D. and transports them to a space station in the future, except Fitz, who joins them later through the Chronicom Enoch's spaceship after being in stasis for 74 years. After Simmons and the team return to the present, she marries Fitz in a ceremony organized by S.H.I.E.L.D., while it is revealed that Deke Shaw, who Simmons met in the future and was somehow teleported to the present, is her grandson. After Fitz becomes a casualty during Daisy's final battle against a powered Glenn Talbot, Simmons resolves to find the present version of Fitz, who is still in stasis aboard Enoch's spaceship.
Search for Fitz
Simmons has unsuccessfully been searching for Fitz for a year. She and Fitz eventually reunite on the planet of Kitson until the assassin Malachi makes off with Fitz. For Fitz's safety, Simmons surrenders herself to Atarah, Enoch's former superior, so that the two of them can come up with a time-traveling method that the Chronicoms intend to use. Atarah traps Fitz and Simmons inside their own minds, forcing them to work together in figuring out time travel logic. The duo are ultimately freed by Enoch, who manages to overpower Atarah and the other Chronicoms. The trio then teleport away, but end up again on Kitson, where Fitz and Simmons are saved from execution by Izel, who helps them on their return to Earth while Enoch bids them farewell. Izel believes Fitz and Simmons are conspiring against her, so she commands her ship's crew to eliminate them. The two are ultimately rescued by a team led by the new S.H.I.E.L.D. director Alphonso Mackenzie, and return to Earth. While S.H.I.E.L.D. stops Izel, Simmons and Fitz are ambushed by the Chronicom Hunters, but saved by Enoch who helps them achieve time travel as well as create a Coulson LMD to help them fight the Hunters.
Chronicom War
In season seven, Simmons aids S.H.I.E.L.D. in traveling through time to stop the Chronicoms from altering history. Along the way, Deke discovers that Simmons has a memory implant that blocks her knowledge of Fitz's location while retaining information on time travel. Simmons is later kidnapped by John Garrett on behalf of the Chronicoms' ally Nathaniel Malick. Malick forces her to give up Fitz's location, as he is the key to stopping them. When she refuses, Malick attempts to use a memory machine to search Simmons' memories, but he only learns that she and Fitz spent an extended period of time together before she went back in time. After releasing her, he inadvertently causes her to forget Fitz entirely. He takes her to the Chronicom ship, where they melt her memory implant and plan for her to be rescued by S.H.I.E.L.D. to jog her memory, per Chronicom predictor Sibyl's orders. However, an impatient Malick unwittingly ruins the plan when he sends his acolyte Kora to attack Simmons' rescuers, though Kora ultimately allows them to escape. Upon regrouping in a S.H.I.E.L.D. safehouse, Simmons subconsciously builds a portal device that brings Fitz to them despite her damaged memory. While returning to their timeline, Fitz helps restore Simmons' memory, reminding her that Enoch took them away so they could build a time machine and that they had a daughter named Alya, before she returned to their friends Flint and Piper and asked them to guard Fitz and Alya while she left with the team. Following Sibyl and Nathaniel's defeat, Fitz and Simmons pick up Alya before retiring from S.H.I.E.L.D. to raise her.
Concept and creation
In November 2012, Elizabeth Henstridge was cast as Jemma Simmons. She described her character as "a biochem expert. She's young and hungry and she's a great woman to play because she's intelligent and focused and curious and she doesn't apologize for it. She's got a wonderful relationship with Fitz. They kind of bounce off each other." After the reveal during the season two premiere that Fitz was just imagining Simmons in the episode, Henstridge explained that the showrunners "tell you what you need to know to act your scenes, but anything after that, you never know." For Simmons' costume design, costume designer Ann Foley tried to have her clothes reflect her personality, without "getting too cliché...we mix the hard with the soft—we combine the feminine elements like Peter Pan collars, silk blouses and florals with the masculine touches like ties". Ava Mireille portrays a younger Simmons.
Characterization
Henstridge talked about the characters of Fitz and Simmons being separated over the course of the series, noting that they have "never been without each other. When you see them without each other, that brings a whole new dynamic just to them as characters in discovering what it's like to have to be independent". On Simmons' guilt over Fitz's brain damage, Henstridge said "She feels a huge amount of guilt. There's a lot of emotions happening. A lot of it revolves around Fitz and Ward. She feels a lot of anger and resentment at the situation. When something catastrophic happens to someone you love, or a situation arises that affects people you love the most, if that's the first time you've been in that position, you never really know what to do." As this relationship developed through the second season, Henstridge said, "I don't think they fully realize the implication of how far apart they are. There's so much hurt there. I don't think they realize what they're sacrificing by not figuring this out." Talking about the harsher side of Simmons seen later in the second season, after the reveal of the Inhumans and the subsequent death of Agent Triplett, Henstridge explained that at the beginning of the series, Simmons was "very mathematical" but throughout the first season "understood that it was more about human relationships and what it means to save someone's life". Now, "she's had a traumatic event and she's gone straight back to what she knows of trying to make everything black and white", and so "It makes sense [to her] if there are these people—call them what you want; Inhumans—that cause destruction, and you can get rid of them, then they won't be anymore....Of course it isn't" that simple.
After Simmons is trapped on the planet Maveth for six months, she becomes "profoundly different", with Henstridge describing her as "definitely still her essence—she doesn't just completely change. But she's been through so much. She's hardened. She's had to face things that she never would've imagined, also by herself without Fitz, so she's definitely changed, stronger and kind of damaged." Describing the relationship that Simmons develops with Daniels on the planet, and comparing it to that with Fitz, Henstridge said, "It's very visceral. It's more primal and intense. That just comes from having to survive in a hostile environment, only having each other on the whole planet. The stakes are always so high, so it's more physical than her relationship with Fitz. FitzSimmons is a slow burn that's taken years and years, and they connected over intellect, whereas her and Will, it's an "us against the world" kind of thing." After Daniels dies and Simmons eventually moves on with Fitz, the latter two are shown consummating their relationship after several seasons worth of build up. "We imagine they spend the morning after laughing a lot about what just happened," said Whedon and Tancharoen, "We want their relationship to feel like their friendship did, because all the best relationships are just that. So moving forward, while this change in their friendship would hopefully only deepen their connection, it is bound also to make things a bit more complicated."
Reception
Reviewing the season 1 episode "0-8-4", Eric Goldman of IGN criticized the lack of development for the majority of main characters, specifically Fitz and Simmons as he did with the pilot episode. However, he was more positive while reviewing "FZZT", praising it for finally giving the "well-needed" development of them both. Henstridge was named TVLine's "Performer of the Week" for the week of October 25, 2015, for her performance in "4,722 Hours", particularly for carrying the episode herself.
Other appearances
Live-action
Jemma Simmons appears in the digital series Agents of S.H.I.E.L.D.: Slingshot with Elizabeth Henstridge reprising her role.
Animation
Simmons appeared in the episode "Lizards" in the fourth season of Ultimate Spider-Man reprised by Henstridge.
Comics
Simmons appears in Agents of S.H.I.E.L.D.: The Chase, a tie-in comic book to the Agents of S.H.I.E.L.D television series that is set between the episodes "Seeds" and "T.R.A.C.K.S.", and "depicts a previously unseen mission of the Agents of S.H.I.E.L.D.", as they investigate a new weapon, and search for crooked billionaire Ian Quinn (who has connections to the Clairvoyant and Project: Centipede).
Simmons first appeared in the mainstream Marvel Comics in S.H.I.E.L.D. vol 3 #1 where she was adapted into the comics by Mark Waid and Carlos Pacheco. She appears as a member of Phil Coulson's team and the daughter of an unnamed Roxxon executive. Simmons joined Coulson's team to regain the Uru Sword, an ancient weapon that belonged to Heimdall. When it was revealed that Heimdall was being possessed by an alien rock, the team remove it and Simmons analyzes it afterwards. While attempting to neutralize a bomb, Simmons is attacked and infected by an unknown material. She comes to the conclusion that she only has one month to live. Henry Hayes / Deathlok finds out about her condition and asks her about it. Simmons reveals that the reason she has not told anyone is because she did not want anyone to pity her. She eventually slips into a coma, revealing her condition to the S.H.I.E.L.D. staff. Hayes and Mockingbird realize that the best way to save her life is to turn her into another Deathlok. The procedure saves her life, but in a disoriented state she begins to attack her fellow agents. Coulson arrives in time to reach out to her humanity and she regains her sanity. She then thanks Hayes for saving her life.
Video games
Simmons made her Marvel video game debut in Marvel: Future Fight, where she appears as a non-player character. In the game, she acquired degrees in quantum mechanics and systems engineering at a young age and worked for Stark Industries for two years before being kidnapped by Advanced Idea Mechanics. After being rescued by the Avengers, Dr. Simmons assists the team in analyzing the multiverse. An alternate-universe version of Jemma Simmons who became a S.H.I.E.L.D. agent was added in the Agents of S.H.I.E.L.D. tie-in patch.
Simmons appears as a playable character in Lego Marvel's Avengers. Simmons, along with 12 other characters, was added in the limited Agents of S.H.I.E.L.D. DLC pack.
Simmons appears as a playable character in Marvel Avengers Academy.
See also
Characters of the Marvel Cinematic Universe
References
External links
Jemma Simmons (Earth-616 version) at Marvel Wiki
Agents of S.H.I.E.L.D. original characters
Characters created by Joss Whedon
English female characters in television
Fictional biochemists
Fictional British spies
Fictional female doctors
Fictional female scientists
Fictional female spies
Fictional immigrants to the United States
Fictional scientists in television
Fictional slaves
Marvel Comics cyborgs
Marvel Comics scientists
S.H.I.E.L.D. agents
Television characters introduced in 2013
Time travelers | Jemma Simmons | [
"Chemistry"
] | 3,676 | [
"Fictional biochemists",
"Biochemists"
] |
47,641,692 | https://en.wikipedia.org/wiki/OGLE-LMC-CEP0227 | OGLE-LMC-CEP0227 is an eclipsing binary and Cepheid variable star, pulsating every 3.8 days. The star, in the Large Magellanic Cloud, was the first Cepheid star system found to be orbiting exactly edge on.
The OGLE-LMC-CEP0227 system contains two stars which orbit each other almost exactly 'edge on' to the line of sight from the Earth. This unique configuration has allowed astronomers to refine their understanding of classical Cepheid variable stars. Studies of this system have allowed astronomers to measure the Cepheid mass with unprecedented accuracy. There is still disagreement over whether the pulsational properties accurately match the mass derived from the observed orbit.
The two stars orbit each other every 309 days, and each has a mass close to . The primary component has an effective temperature of and the secondary a temperature of .
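The mass determination mentioned above rests on Kepler's third law applied to the 309-day orbit. The sketch below recovers the system's semi-major axis from the period; the component masses used here are hypothetical round values (the measured figures do not survive in this text), so the result is only indicative.

```python
import math

G = 6.674e-11      # gravitational constant [m^3 kg^-1 s^-2]
M_SUN = 1.989e30   # solar mass [kg]
AU = 1.496e11      # astronomical unit [m]

def semi_major_axis(period_days, total_mass_solar):
    """Semi-major axis of a binary from Kepler's third law:
    a^3 = G * M_total * P^2 / (4 * pi^2)."""
    p = period_days * 86400.0  # period in seconds
    return (G * total_mass_solar * M_SUN * p**2 / (4.0 * math.pi**2)) ** (1.0 / 3.0)

# Hypothetical: two roughly equal components of about 4 solar masses each
a = semi_major_axis(309.0, total_mass_solar=8.0)
print(f"semi-major axis = {a / AU:.2f} AU")
```

Under these assumed masses the separation comes out a little under 2 AU, consistent with a detached eclipsing pair.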
Notes
References
Stars in the Large Magellanic Cloud
Classical Cepheid variables
Mensa (constellation)
J04521567-7014313
Algol variables
Extragalactic stars
F-type supergiants
G-type bright giants | OGLE-LMC-CEP0227 | [
"Astronomy"
] | 244 | [
"Mensa (constellation)",
"Constellations"
] |
47,641,929 | https://en.wikipedia.org/wiki/Imleria%20badia | Imleria badia, commonly known as the bay bolete, is an edible, pored mushroom found in Eurasia and North America, where it grows in coniferous or mixed woods on the ground or on decaying tree stumps, sometimes in prolific numbers. Both the common and scientific names refer to the bay- or chestnut-coloured cap, which is almost spherical in young specimens before broadening and flattening out to a diameter up to . On the cap underside are small yellowish pores that turn dull blue-grey when bruised. The smooth, cylindrical stipe, measuring long by thick, is coloured like the cap, but paler. Some varieties have been described from eastern North America, differing from the main type in both macroscopic and microscopic morphology.
First described scientifically by Elias Fries in 1818, the bay bolete was reclassified as Xerocomus badius in 1931, and it is still listed thus in several sources. Modern molecular phylogenetic studies show Xerocomus to be polyphyletic (not descended from the same common ancestor), and the bay bolete is not particularly closely related to species in that genus. Often considered a poor relation of the cep (Boletus edulis), I. badia is nevertheless regarded as a choice edible mushroom by some authors, such as food expert Antonio Carluccio, and is sold in markets in Europe and central Mexico. Its mushrooms are less often infested by maggots than other boletes. Several European studies have demonstrated that the mushroom can bioaccumulate some trace metals from the soil, such as mercury, cobalt, and nickel. Additionally, the mushroom contains a pigment that concentrates radioactive caesium; specimens collected in Europe following the 1986 Chernobyl disaster contained several times more caesium-137 than those collected before the incident.
Taxonomy
The bay bolete was first named as Boletus castaneus ß badius (i.e. a subspecies of Boletus castaneus) by Elias Magnus Fries in 1818. Fries later renamed it as a variety of Boletus castaneus in 1828, before assigning it distinct species status in his 1832 work Elenchus Fungorum. The fungus has been transferred to several genera in its taxonomic history: Rostkovites by Petter Karsten in 1881; Viscipellis and Ixocomus by Lucien Quélet in 1886 and 1888, respectively; and Suillus by Otto Kuntze in 1898. In 1931, Edouard-Jean Gilbert reclassified it in the genus Xerocomus, and many sources still list it thus. Review of Xerocomus strongly suggested it was polyphyletic, and the genus was not accepted by some mycologists. The stickiness of its wet cap distinguishes the species from others classified in Xerocomus, and hence it was left in Boletus until Alfredo Vizzini placed it in its own genus in 2014. Genetic analysis published in 2013 shows that Imleria badia is related to B. pallidus and B. glabellus; the three species form a clade known informally as the badius clade within a larger group (informally called anaxoboletus) in the suborder Boletineae. Other clades within the group include the Tylopilus, porcini (= Boletus sensu stricto) and Strobilomyces clades, as well as two other groups composed of members of various genera including Xerocomus (the taxa designated as Xerocomus species in this clade are not Xerocomus species and require new taxonomic designations) and Xerocomellus.
The species Boletus limatulus, originally published by Charles Christopher Frost in 1874, was later redescribed, "with a slight tinge of irritation at the time, energy and gasoline spent", as a variety of I. badia by Wally Snell in 1945 (as Xerocomus badius var. limatulus). The taxon name comes from the Latin limatulus, "rather polished" or "refined". Varieties glaber and macrostipitatus were described from Nova Scotia, Canada, in 1976.
The starting date of fungal taxonomy had been set as January 1, 1821, to coincide with the date of the works of Swedish naturalist Elias Magnus Fries, the "father of mycology". Rolf Singer argued that setting the starting date earlier to Christiaan Persoon's 1801 publication of Synopsis would make a name change necessary, as he had originally given what is now known as Royoporus badius the combination Boletus badius Pers. and if the bay bolete was classified in the genus Boletus, the name would be unavailable and the names Boletus glutinosus Krombh. or B. spadiceus Krombh. (non Fr.) would have to be used instead.
The species name is the Latin adjective badia, meaning "chestnut brown". The common name is likewise derived from the colour of the cap, likened to the coat of a bay horse. Alternate common names of a similar derivation include bay-brown bolete and bay-capped bolete, and it is known as bolet bai in French. It is also known as the false cep. Variety glaber was named for its smooth (Latin: glaber, "without hairs") stipe, and macrostipitatus for its large (Latin: macro, "large") stipe.
Description
Imleria badia fruit bodies have a chestnut to dark brown cap, which is almost spherical in young specimens before broadening and flattening out to a diameter of up to . The cap margin is acute, and cap surface velvety when young and slightly sticky when wet or old. The cap cuticle is difficult to separate from the flesh underneath. On the cap undersurface, the pores are initially cream to pale yellow, but become greenish yellow or olive with age. They stain dull blue to bluish-grey when bruised or cut, and are easily removed from the flesh. The pores are initially circular, becoming more angular with age, and number about one or two per millimetre. The tubes are long, and are adnate to depressed around the area of attachment to the stipe.
The flesh is mostly whitish or yellowish in some places; underneath the cap cuticle, it is brownish-pink or reddish brown. Initially firm, it begins to soften under the cap in older mushrooms. In some parts of the cap, such as the junction of the cap and the stipe, the flesh stains pale blue when injured or exposed to air, particularly in damp weather. This change is sometimes faint, and not persistent, as it eventually reverts to its original colour. The stipe is long by thick, and is similar in colour to the cap but paler, and sometimes with a rose-coloured tinge. Its surface has faint longitudinal ridges, a fine powdering, and fine reticulations (a net-like pattern of ridges) at the apex. It often has a whitish region at the base and the top, and white mycelium at the base. Unlike the bulbous stipe of many other boletes, the stipe of B. badius remains relatively slim and cylindrical. The flesh of the stipe gets tougher with age. Its smell has been described as fruity.
The spore print is olive to olive-brown. The smooth spores are somewhat oblong to slightly ventricose (fattened in the middle), and measure 10–14 by 4–5 μm. The basidia (spore-bearing cells) are four-spored and measure 25–35 by 8–10 μm. Pleurocystidia (cystidia found on the faces of the tubes) are fuse-shaped and ventricose, with dimensions of 50–60 by 10–14 μm.
Variety B. b. macrostipitatus differs from the main form by its grey-orange cap, shorter stipe measuring , longer spores (15–18 by 4–5 μm), and longer pleurocystidia (30–55 by 10–14 μm). The variety B. b. glaber has a smooth (glabrous) stipe, and smaller pleurocystidia (35–40 by 10–15 μm) and cheilocystidia (25–30 by 9–12 μm).
Several chemical tests can be used to help identify the mushroom. A drop of ammonium hydroxide solution turns the cap cuticle a greenish to bluish colour. Application of iron(II) sulphate solution causes the flesh to stain a dull bluish-green, while the pores turn golden brown with a drop of dilute potassium hydroxide.
Similar species
The similar colouration may cause confusion with Boletus projectellus, but the latter species is usually more robust, and has a reticulated stipe. Additionally, B. projectellus has the largest spores in the Boletaceae, up to about 30 μm in diameter. Another lookalike is Austroboletus gracilis, but this species does not have a blue bruising reaction, and its pore surface is initially white before turning pinkish. Compared to I. badia, B. subtomentosus fruit bodies have narrower stipes, paler brown, dry caps, and wider pores that do not stain blue on bruising. This latter species is not as good to eat. In western North America, I. badia is replaced by the similar B. zelleri, which also grows both on the ground and on rotten wood. The European species Xerocomus bubalinus can be mistaken for I. badia, but it has a paler yellow-brown cap flushed with pinkish-red, and is not sticky when wet.
Ecology, distribution and habitat
Although the bay bolete is predominantly a mycorrhizal species, it does have some saprophytic tendencies and may be able to use this lifestyle in certain circumstances. The ectomycorrhizae formed between I. badia and spruce (Picea abies) have active hyphal sheaths and a higher potential to store nitrogen, phosphorus, potassium, magnesium, iron, and zinc than other mycorrhizal types, indicating the fungus is well adapted to acidic stands and its mycorrhizae are very efficient in uptake and storage of macronutrients. Mycorrhizae with Monterey pine (Pinus radiata) have also been described.
The bay bolete is common in coniferous and less commonly mixed woodlands in Europe, from the British Isles, where it is abundant throughout from August to November, east to the Black Sea Region in Turkey. In Asia, the species has been recorded from Jordan, mainland China, and Taiwan. The North American distribution extends from eastern Canada west to Minnesota and south to North Carolina, where the mushroom fruits from July to November. It also grows in central Mexico. The variety B. b. macrostipitatus is found from eastern Canada south to Maine and New York state, while variety B. b. glaber is known from the Atlantic Maritime Ecozone of eastern Canada. Fruit bodies appear singly or scattered on the ground, or on decaying tree stumps, and can be well hidden by pine needles and ferns. Fruiting tends to peak three or four days after rain during warm weather. They can be prolific, especially in highland areas that are humid and shady. It is commonly found under white pine, spruce, and hemlock, and also occurs under deciduous trees, especially beech. It can also occur in grassy or mossy areas at or near forest margins; Italian restaurateur and cook Antonio Carluccio recalled picking them in the grounds of Blenheim Palace. It does not occur on calcareous (chalky) soils.
I. badia fruit bodies are less affected by insects than other boletes. Oribatid mites such as Carabodes femoralis, Nothrus silvestris and Oribatula tibialis eat them, as do squirrels. Several microbial pathogens can damage the fruit bodies, and have had an effect on populations in China, including soft rot caused by Pseudomonas aeruginosa, and black mould caused by Mucor, Sepedonium, Paecilomyces, and Diasporangium species.
Uses
Often considered a poor relation of the cep (Boletus edulis), the bay bolete is nevertheless highly regarded as a choice edible mushroom by some authors such as Carluccio. In central Mexico, it is collected from Izta-Popo Zoquiapan National Park and sold in neighbouring markets. It may cause an allergic reaction in some people, and the blue discolouration upon bruising can be offputting, although the staining disappears from white flesh when it is cooked. The flavour is milder than its better-known relative. Younger specimens are best for eating, though more mature ones can be suitable for cutting up and drying. The tendency for the pores to absorb water means that wiping rather than washing is recommended before use in the kitchen. Unlike most boletes, I. badia can be eaten raw (though only young mushrooms should be used). Otherwise it can be fried in butter, or used with meat or fish recipes. Mushrooms can also be frozen, dried, or pickled in cider vinegar, wine, or extra virgin olive oil, and later used in sauces or soups.
The fruit bodies can be used to make mushroom dyes. Depending on the mordant used, colours ranging from yellow, orange, gold, and green-brown can be obtained. Without mordant, a yellow colour is produced.
Research
In laboratory experiments, extracts of I. badia fruit bodies have been shown to have significant antioxidative properties in vitro. Fruit bodies contain the compound theanine, an amino acid and a glutamic acid analogue found in green tea. Efforts have been made to establish a protocol for producing theanine by growing the fungus mycelium using submerged fermentation. Several indole compounds have been detected in fruit bodies. Unprocessed mushrooms contain tryptophan (0.68 mg per 100 g dry weight), tryptamine (0.47), serotonin (0.52), kynurenine sulphate (1.96), and kynurenic acid (1.57). Due to their temperature sensitivity, cooking significantly changes the contents and composition of indole compounds: cooked mushrooms contained tryptophan (1.74 mg/100 g dw), 5-methyltryptophan (6.55), melatonin (0.71), and indoleacetonitrile (2.07). Fruit body extracts have been shown to slow the growth of certain tumour cell lines in cell culture.
Polish studies found that although the mushroom bioaccumulates mercury and cobalt from the soil, occasional consumption of mushrooms should not cause maximum allowable intake doses to be exceeded. Similar conclusions about safety were made in a Polish study of the mushroom's ability to accumulate organochlorine compounds. Different methods of preparation for consumption affect the leaching rate of cadmium, lead, and mercury. After the 1986 Chernobyl disaster, several studies showed I. badia bioaccumulates radioactive caesium, 137Cs. 137Cs is produced in nuclear power plants as a fission product of 235U, via the decay chain beginning with 137Te, and has a half-life of thirty years. A German study showed that mushrooms collected from 1986 to 1988 had radiocaesium contents that were 8.3 to 13.6 times greater than mushrooms collected before the accident in 1985. This caesium-sequestering effect is caused by a brown pigment, the polyphenol compound norbadione A, which is related to a family of mushroom pigments known as pulvinic acids. Norbadione A has been investigated for its ability to provide a protective effect against the damaging effects of ionizing radiation. Tests with cell cultures and mice show that although it has some protective effect, it is toxic to cells in higher doses. A new series of alkali chelators based on the structure of norbadione A has been reported. The mushroom may have potential as a bioremediation agent to clean up contaminated sites.
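The thirty-year half-life quoted above implies that 137Cs contamination declines slowly. A short back-of-the-envelope sketch (illustrative only, not tied to any measured activity):

```python
def cs137_fraction_remaining(years, half_life=30.0):
    """Fraction of an initial 137Cs activity remaining after `years`,
    from the exponential decay law N(t) = N0 * 2**(-t / half_life)."""
    return 2.0 ** (-years / half_life)

# Decades after a deposition event, much of the activity is still present
for t in (10, 30, 60, 90):
    print(f"after {t:3d} years: {cs137_fraction_remaining(t):.2f} of initial activity")
```

After one half-life (30 years) half the activity remains; even 90 years on, about an eighth is still present, which is why mushrooms sampled years after Chernobyl still showed elevated radiocaesium.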
See also
List of North American boletes
Notes
References
External links
Boletaceae
Edible fungi
Fungi described in 1818
Fungi of Asia
Fungi of Europe
Fungi of North America
Taxa named by Elias Magnus Fries
Fungus species | Imleria badia | [
"Biology"
] | 3,448 | [
"Fungi",
"Fungus species"
] |
47,641,959 | https://en.wikipedia.org/wiki/Heat%20transfer%20enhancement | Heat transfer enhancement is the process of increasing the effectiveness of heat exchangers. This can be achieved when the heat transfer power of a given device is increased or when the pressure losses generated by the device are reduced. A variety of techniques can be applied to this effect, including generating strong secondary flows or increasing boundary layer turbulence.
Principle
During the earliest attempts to enhance heat transfer, plain (or smooth) surfaces were used. An enhanced surface, by contrast, requires a special surface geometry able to provide higher $hA$ values per unit surface area in comparison with a plain surface. The ratio of the $hA$ of an enhanced heat transfer surface to that of the plain surface is called the enhancement ratio, $E_h$. Thus,
$E_h = \dfrac{hA}{(hA)_p}$
The heat transfer rate for a two-fluid counterflow heat exchanger is given by
$q = UA\,\Delta T_{lm}$
where $UA$ is the overall conductance and $\Delta T_{lm}$ the log-mean temperature difference. In order to better illustrate the benefits of enhancement, the total length $L$ of the tube is multiplied and divided in the equation:
$q = (UA/L)\,L\,\Delta T_{lm}$
where $(UA/L)^{-1}$ is the overall thermal resistance per unit tube length, given by
$\dfrac{1}{UA/L} = \dfrac{1}{(\eta_0 h A/L)_1} + R'_w + \dfrac{1}{(\eta_0 h A/L)_2}$
The subscripts 1 and 2 denote the two different fluids, $R'_w$ is the wall thermal resistance per unit tube length, and the surface efficiency $\eta_0$ accounts for the use of extended surfaces (fins).
One aspect to take into consideration is that, for simplicity, the latter equation does not include any fouling resistances, which can be important. In order to enhance the performance of the heat exchanger, the term $UA/L$ must be increased.
For achieving a reduced thermal resistance, the enhanced surface geometry may be used to increase one or both $hA/L$ terms relative to the plain surface, leading to a reduced thermal resistance per unit tube length, $(UA/L)^{-1}$. This reduced term may be used to achieve one of the following three objectives:
1. Size reduction. Maintaining the heat exchange rate $q$ constant, the length of the heat exchanger may be reduced, providing a heat exchanger of smaller proportions.
2. Increased $UA/L$.
Reduced $\Delta T_{lm}$: maintaining both $q$ and the length constant, $\Delta T_{lm}$ can be reduced, increasing thermodynamic process efficiency and reducing operating costs.
Increased heat exchange: increasing $UA/L$ while keeping the length constant leads to an increased $q$ for fixed fluid inlet temperatures.
3. Reduced pumping power for fixed heat duty. This will require lower operating velocities than the plain surface and an increased frontal area, which is normally not desired.
Depending on the design objectives, an enhanced surface may be used to achieve any of these three performance improvements.
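The size-reduction objective can be made concrete with a short numeric sketch using the standard counterflow relation $q = (UA/L)\,L\,\Delta T_{lm}$. The duty, conductances, and terminal temperature differences below are invented for illustration:

```python
import math

def lmtd(dt1, dt2):
    """Log-mean temperature difference for a counterflow exchanger."""
    if math.isclose(dt1, dt2):
        return dt1
    return (dt1 - dt2) / math.log(dt1 / dt2)

def required_length(q, ua_per_len, dt1, dt2):
    """Tube length needed so that q = (UA/L) * L * dT_lm."""
    return q / (ua_per_len * lmtd(dt1, dt2))

# Illustrative values: 50 kW duty, terminal temperature differences 40 K and 20 K.
q = 50_000.0                                      # W
L_plain = required_length(q, 500.0, 40.0, 20.0)   # plain tube: UA/L = 500 W/(m K)
L_enh = required_length(q, 750.0, 40.0, 20.0)     # enhanced tube: 1.5x conductance

print(f"dT_lm = {lmtd(40.0, 20.0):.2f} K")        # ~28.85 K
print(f"plain tube length    = {L_plain:.2f} m")
print(f"enhanced tube length = {L_enh:.2f} m")    # 2/3 of the plain length
```

Since the duty and temperatures are fixed, the required length scales inversely with the per-length conductance, which is exactly the "size reduction" objective above.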
Internal flow
There are several available options for enhancing heat transfer. The enhancement can be achieved by increasing the surface area for convection and/or increasing the convection coefficient. For example, surface roughness can be used to enhance turbulence and thereby increase the convection coefficient. This can be achieved through machining or through inserts such as a coil-spring wire, which provides a helical roughness in contact with the surface. The convection coefficient may also be increased by inserting a twisted tape, which is periodically twisted through 360 degrees. Tangential inserts optimize the velocity of the flow near the tube wall while providing a bigger heat transfer area. Increased area and convection coefficient can both be achieved by applying spiral fin or rib inserts. Other aspects, such as pressure drop, must be taken into consideration in order to meet the fan or pump power constraints.
Helically Coiled Tube
The coil spring insert may enhance heat transfer without turbulence or additional heat transfer surface area. A secondary flow is induced in the fluid, creating two longitudinal vortices. In contrast to a straight tube, this can result in highly non-uniform local heat transfer coefficients around the periphery of the tube, so that the local heat transfer coefficient depends on the angular location along the tube circumference. Supposing that the conditions for the heat flux are constant, the mean fluid temperature can be estimated as follows,
$T_m(x) = T_{m,i} + \dfrac{q'' P}{\dot m c_p}\,x$
where $q''$ = constant surface heat flux, $P$ is the tube perimeter, $\dot m$ the mass flow rate, and $c_p$ the specific heat of the fluid.
Maximum fluid temperatures near the tube wall are present when the fluid is heated, and because the heat transfer coefficient is strongly dependent on the angle, the calculation of the maximum local temperature is not straightforward. For this purpose, correlations for the peripherally averaged Nusselt number are of little, if any, use when keeping heat flux conditions constant. On the other hand, correlations for the peripherally averaged Nusselt number for constant wall temperature are very useful.
The secondary flow:
Increases heat transfer rates.
Increases friction losses.
Decreases entrance length.
Reduces the difference between the laminar and turbulent heat transfer rates, in contrast to the straight tube case.
The coil pitch S has negligible influence on the pressure drop and the heat transfer rates. For the helical tube, the critical Reynolds number for the onset of turbulence is
$Re_{D,c,h} = Re_{D,c}\left[1 + 12\,(D/C)^{0.5}\right]$
where $Re_{D,c} \approx 2300$ is the critical Reynolds number of a straight tube in the fully developed state, $D$ is the tube diameter, and $C$ is the coil diameter.
The delays on the transition from laminar to turbulent state are strongly dependent on strong secondary flows associated with tightly wound helically coiled tubes.
The friction factor for fully developed laminar flow with is,
where C is the outer diameter of the helical coil.
and
for
and
where
For cases where , recommendations provided by Shah and Joshi are available. The heat transfer coefficient may be used in Newton's law of cooling,
and can be evaluated from the correlation,
where and
The correlations for the friction factor in the turbulent state are based on limited data. Increased heat transfer due to the secondary flow is not significant in the turbulent state, constituting less than 10% for . Furthermore, the augmentation created by the use of helically coiled tubes due to the secondary flow is usually exploited only in situations where the flow is laminar. In this state, the entrance length is 20% to 50% shorter in comparison with the straight tube. In the case of turbulent flow, the flow becomes fully developed during the first half-turn of the helically coiled tube. For this reason, the entrance region can be neglected in many engineering calculations.
If the liquid or gas is heated in a straight tube, the fluid that passes near the centerline will exit the tube in a much shorter time and will always be cooler than the fluid passing near the wall.
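As a numeric illustration, one commonly used correlation for the critical Reynolds number of helically coiled tubes — $Re_{c,h} = Re_c[1 + 12\,(D/C)^{0.5}]$, as given e.g. in Incropera's Fundamentals of Heat and Mass Transfer — shows how tighter coiling delays the transition to turbulence. The tube and coil diameters below are invented for illustration:

```python
def helical_critical_reynolds(tube_d, coil_d, re_straight=2300.0):
    """Onset-of-turbulence Reynolds number for a helically coiled tube.

    Correlation assumed here: Re_c,h = Re_c * [1 + 12*(D/C)**0.5],
    where D is the tube diameter and C the coil diameter.
    """
    return re_straight * (1.0 + 12.0 * (tube_d / coil_d) ** 0.5)

# Tighter coils (larger D/C) delay transition to turbulence:
for coil_d in (0.5, 0.2, 0.1):  # coil diameters in metres, with a 10 mm tube
    print(coil_d, round(helical_critical_reynolds(0.01, coil_d)))
```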
References
Heat transfer | Heat transfer enhancement | [
"Physics",
"Chemistry"
] | 1,223 | [
"Transport phenomena",
"Physical phenomena",
"Heat transfer",
"Thermodynamics"
] |
47,642,057 | https://en.wikipedia.org/wiki/Alfa%20mannan%20degradation | α-Mannan degradation is the breakdown of mannan from yeast cell walls. Mannan, which can be found in the cell wall of yeast, has a particular chemical structure, and has constituted a food source since humans began eating fermented foods several thousand years ago. To determine whether the intake of yeast mannans through fermented foods has promoted specific adaptations of the human gut microbiota, an international team of researchers studied the ability of Bacteroides thetaiotaomicron to specifically degrade yeast mannans.
The mannan-oligosaccharides are able to alter the composition of the microbiota present in the bowels, so they produce an increase in the growth of benign bacteria and therefore an increase in the resistance to infection by pathogens.
The B. thetaiotaomicron are bacteria that have been shown to bind polysaccharides thanks to a receptor system located on the outer membrane before introducing the polysaccharides into the periplasm for their degradation to monosaccharides. These bacteria use α-mannose as a carbon source. Transcriptional studies have identified three different PULs (Polysaccharide Utilization Loci) which are activated by α-mannan from Saccharomyces cerevisiae, and Schizosaccharomyces pombe and the yeast pathogen Candida albicans. To demonstrate the specificity of these PULs, the researchers have engineered different B. thetaiotaomicron strains which showed that mutants lacking MAN-PUL1, MAN-PUL3 or PUL2 are unable to grow in vitro with yeast mannan as the sole carbon source.
In order to assess whether the ability to degrade yeast mannan is a general feature of the microbiota or a specific adaptation of B. thetaiotaomicron, the authors analysed the growth profiles of 29 species of Bacteroidota from the human bowel. The analysis revealed that only nine are able to metabolize S. cerevisiae α-mannan, while 33 of 34 strains of B. thetaiotaomicron are able to grow on this glycan. These results show that B. thetaiotaomicron, along with some phylogenetically related species, dominates the metabolism of yeast α-mannan in the phylum Bacteroidota of the microbial flora.
References
Carbohydrates
Polysaccharides | Alfa mannan degradation | [
"Chemistry"
] | 489 | [
"Biomolecules by chemical classification",
"Carbohydrates",
"Organic compounds",
"Carbohydrate chemistry",
"Polysaccharides"
] |
59,542,058 | https://en.wikipedia.org/wiki/Unimog%20421 | The Unimog 421 is a vehicle of the Mercedes-Benz Unimog series, made by Daimler-Benz. In total, 18,995 units of the Unimog 421 were built from 1966 to 1989 in the Mercedes-Benz Gaggenau plant. It is a medium-sized vehicle bigger than the traditional Unimog 411, but smaller than the Unimog 406. Introduction of new heavy models and Unimog 411 production ceasing in the mid-1970s changed the Unimog 421's role in the Unimog lineup; it became the predecessor of the light Unimog series and thus succeeded the Unimog 411.
Both short and long wheelbase versions, as well as "front half only" OEM part versions, were made. Technically, the Unimog 421 is based on the Unimog 406 and Unimog 411. The plane ladder frame and axles are Unimog 411 parts, and the engines used for the 421-series are also passenger car engines. Cab and gearbox are Unimog 406-related. The 421 closed cab version looks almost exactly like a 406-series cab, but the cabrio version is a bit more narrow and "squat". A 406-series can be differentiated from a 421-series by the position of the air-intake: A 421-series has the air-intake on the right-hand side of the bonnet, and a 406-series on the left-hand side.
In Argentina, a copy of the Unimog 421, the 431-series, was produced under licence. From 1969 to 1971, 601 cabrios and 152 closed cab units of the Unimog 431 were made. Daimler-Benz produced CKD-kits in Gaggenau, which were then shipped to Argentina for manufacture. The 431-series was fitted with a engine, but has the short wheelbase – this combination was not available for the original 421-series.
Engines
The 421-series was made with straight-four precombustion chamber Diesel engines. Prototypes included, four different engines were used, which are all passenger car engines.
Types of the 421-series
In total 20 different types of the 421-series were made, out of which four were made as OEM parts for other manufacturers.
Mercedes-Benz types
Types made for third party manufacturers
Several Unimog 421 types were custom-made as OEM parts for third party manufacturers.
Technical specifications
Gallery
References
External links
Historical Daimler-Benz commercial video on YouTube; 9:39 min
Tractors
Mercedes-Benz trucks
Vehicles introduced in 1966 | Unimog 421 | [
"Engineering"
] | 529 | [
"Engineering vehicles",
"Tractors"
] |
59,543,266 | https://en.wikipedia.org/wiki/Alexandroff%20plank | The Alexandroff plank, in topology, an area of mathematics, is a topological space that serves as an instructive example.
Definition
The construction of the Alexandroff plank starts by defining the topological space to be the Cartesian product of and where is the first uncountable ordinal, and both carry the interval topology. The topology is extended to a topology by adding the sets of the form
where
The Alexandroff plank is the topological space
It is called a plank because it is constructed from a subspace of the product of two spaces.
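The symbols in the definition above did not survive extraction; the following reconstruction follows the standard presentation in Steen & Seebach's Counterexamples in Topology (cited in the references) and should be checked against that source:

```latex
% Standard construction of the Alexandroff plank
X = [0,\omega_1] \times [-1,1], \qquad p = (\omega_1, 0),
% with \tau the product of the interval topologies on the two factors.
% The topology \sigma is generated by \tau together with the sets
U(\alpha, n) = \{p\} \,\cup\, \bigl( (\alpha, \omega_1] \times (0, \tfrac{1}{n}) \bigr),
\qquad \alpha < \omega_1,\ n \in \mathbb{N}.
% The Alexandroff plank is the space (X, \sigma).
```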
Properties
The space has the following properties:
It is Urysohn, since is regular. The space is not regular, since is a closed set not containing while every neighbourhood of intersects every neighbourhood of
It is semiregular, since each basis rectangle in the topology is a regular open set and so are the sets defined above with which the topology was expanded.
It is not countably compact, since the set has no upper limit point.
It is not metacompact, since if is a covering of the ordinal space with no point-finite refinement, then the covering of defined by and has no point-finite refinement.
See also
References
Lynn Arthur Steen and J. Arthur Seebach, Jr., Counterexamples in Topology. Springer-Verlag, New York, 1978. Reprinted by Dover Publications, New York, 1995. (Dover edition).
S. Watson, The Construction of Topological Spaces. Recent Progress in General Topology, Elsevier, 1992.
Topological spaces | Alexandroff plank | [
"Mathematics"
] | 319 | [
"Mathematical structures",
"Space (mathematics)",
"Topological spaces",
"Topology stubs",
"Topology"
] |
59,543,651 | https://en.wikipedia.org/wiki/Arens%20square | In mathematics, the Arens square is a topological space, named for Richard Friederich Arens. Its role is mainly to serve as a counterexample.
Definition
The Arens square is the topological space where
The topology is defined from the following basis. Every point of is given the local basis of relatively open sets inherited from the Euclidean topology on . The remaining points of are given the local bases
Properties
The space is:
T2½, since neither points of , nor , nor can have the same second coordinate as a point of the form , for .
not T3 or T3½, since for there is no open set such that , because must include a point whose first coordinate is , but no such point exists in for any .
not Urysohn, since the existence of a continuous function such that and implies that the inverse images of the open sets and of with the Euclidean topology would have to be open. Hence, those inverse images would have to contain and for some . Then if , it would occur that is not in . Assuming that , then there exists an open interval such that . But then the inverse images of and under would be disjoint closed sets containing open sets which contain and , respectively. Since , these closed sets containing and for some cannot be disjoint. A similar contradiction arises when assuming .
semiregular, since the basis of neighbourhood that defined the topology consists of regular open sets.
second countable, since is countable and each point has a countable local basis. On the other hand is neither weakly countably compact, nor locally compact.
totally disconnected but not totally separated, since each of its connected components, and its quasi-components are all single points, except for the set which is a two-point quasi-component.
not scattered (every nonempty subset of contains a point isolated in ), since each basis set is dense-in-itself.
not zero-dimensional, since doesn't have a local basis consisting of open and closed sets. This is because for small enough, the points would be limit points but not interior points of each basis set.
References
Lynn Arthur Steen and J. Arthur Seebach, Jr., Counterexamples in Topology. Springer-Verlag, New York, 1978. Reprinted by Dover Publications, New York, 1995. (Dover edition).
Topological spaces | Arens square | [
"Mathematics"
] | 480 | [
"Topological spaces",
"Mathematical structures",
"Topology",
"Space (mathematics)"
] |
59,545,820 | https://en.wikipedia.org/wiki/NGC%206907 | NGC 6907 is a spiral galaxy located in the constellation Capricornus. It is located at a distance of about 120 million light-years from Earth, which, given its apparent dimensions, means that NGC 6907 is about 115,000 light-years across. It was discovered by William Herschel on July 12, 1784. The total infrared luminosity of the galaxy is , and thus it is categorised as a luminous infrared galaxy.
Characteristics
NGC 6907 is a grand design spiral galaxy with two spiral arms. It has an elliptical bulge that is skewed towards the base of the arms. The inner arms are bright and with knots, forming a bar. There are dust lanes in the arms. The disk of NGC 6907 is asymmetric. The eastern arm changes pitch angle and becomes linear after the location of the nearby galaxy NGC 6908. The western arm is less strong, but it is considerably longer, as its outermost parts form an arc with H II regions, wrapping nearly 360 degrees around the disk and forming a pseudoring. NGC 6907 also has a tidal tail with low surface brightness. The asymmetric tail extends from the north part of the disk of the galaxy towards the west and southwest. Its presence is an indicator of an ongoing unequal mass merger. The total HI mass of NGC 6907 is estimated to be .
NGC 6907 interacts with a low-luminosity lenticular galaxy, known as NGC 6908, that is superimposed on the eastern arm of NGC 6907, lying 40 arcseconds off the nucleus of NGC 6907. NGC 6908 was thought for many years to be actually part of NGC 6907, which was described as having two massive asymmetric arms; however, when observed in infrared, it becomes apparent NGC 6908 is a different galaxy. As NGC 6908 passed through the disk of NGC 6907, a stellar and gas bridge was formed between the two galaxies that has been observed as high-velocity gas. It is estimated that NGC 6908 passed through the disk approximately 35 million years ago.
Nearby galaxies
NGC 6907 is the more prominent member of a small galaxy group known as the NGC 6907 group or LGG 436. Other members of the group, apart from NGC 6908, include IC 4999 and IC 5005. These two galaxies lie 61 and 74 arcminutes off NGC 6907, respectively. The group seems to form, with some other galaxies lying at similar redshift, like ESO 462- G016, a sheet of galaxies that extends 10 degrees in the sky, which corresponds to 7 Mpc at the distance of NGC 6907.
Supernovae
NGC 6907 has been home to four supernovae:
SN 1984V (type unknown, mag 15.0) was discovered by L. E. Gonzalez on 29 May 1984.
SN 2004bv (type Ia, mag 15.6) was discovered by R. Kushida on 24 May 2004.
SN 2008fq (Type II, mag 15.4) was discovered by the Lick Observatory Supernova Search (LOSS) on 15 September 2008.
SN 2014eh (Type Ic, mag 16.0) was discovered by LOSS on 28 October 2014.
Gallery
See also
NGC 1097 – another spiral galaxy with a smaller companion
References
External links
NGC 6907 on SIMBAD
Barred spiral galaxies
Luminous infrared galaxies
Capricornus
6907
UGCA objects
64650
Astronomical objects discovered in 1784
Discoveries by William Herschel | NGC 6907 | [
"Astronomy"
] | 719 | [
"Capricornus",
"Constellations"
] |
59,545,995 | https://en.wikipedia.org/wiki/Kimberly%20Prather | Kimberly A. Prather is an American atmospheric chemist. She is a distinguished chair in atmospheric chemistry and a distinguished professor at the Scripps Institution of Oceanography and department of chemistry and biochemistry at UC San Diego. Her work focuses on how humans are influencing the atmosphere and climate. In 2019, she was elected a member of the National Academy of Engineering for technologies that transformed understanding of aerosols and their impacts on air quality, climate, and human health. In 2020, she was elected as a member of the National Academy of Sciences. She is also an elected Fellow of the American Philosophical Society, the American Geophysical Union, the American Association for the Advancement of Science, and the American Academy of Arts and Sciences.
Education and early career
Prather was born in Santa Rosa, California. She studied at Santa Rosa Junior College and University of California, Davis, earning a bachelor's degree in 1985 and a PhD in 1990. She served as a postdoctoral fellow at the University of California, Berkeley between 1990 and 1992, working with Nobel Laureate Yuan T. Lee. Prather joined University of California, Riverside as an assistant professor in 1992. During her time at UC Riverside she began to work on aerosol mass spectrometry, developing ways to make it compact and transportable. She patented the technology.
Research
In 2001, Prather joined the faculty at the University of California, San Diego as a member of the Department of Chemistry and Biochemistry and Scripps Institution of Oceanography. Prather's early research focused on determining the major sources of fine particle pollution in California as well as in the Northeastern United States. As part of this research, she explored methods to distinguish between different aerosol sources based on their single particle composition and size. She developed aerosol time-of-flight mass spectrometry (ATOFMS), a technique with high temporal and size resolution. In 1999 she began to work with the University of Rochester studying the health effects of ultrafine particles. She refined the detection technique so that it would precisely measure the size and composition of small particles. The ultrafine ATOFMS was able to examine exhaust particles from gasoline and diesel powered vehicles. She found that alongside the freeway, particles between 50 and 300 nm were mainly due to heavy-duty vehicles (51%) and light-duty vehicles (32%). She used the ultrafine ATOFMS to study atmospheric composition, combining it with ozone and NOx measurements. ATOFMS is now widely used in atmospheric studies around the world.
In 2003, she joined the advisory board of United States Environmental Protection Agency PM2.5 Clean Air. Between 2003 and 2006 Prather studied whether ATOFMS could be used to measure the carbonaceous components of aerosols (including PAHs) and help to understand atmospheric processes, distinguishing between organic (OC) and elemental carbon (EC). Prather showed it was possible to distinguish EC and OC on a single particle level, and investigated their chemical associations with ammonium, nitrate, and sulfate. Her group explored ways to calibrate the ATOFMS data, making real-time apportionment of ambient particles possible. They did this by classifying particles using an artificial neural network (ART-2a). In 2008 she became the co-lead scientist in CalWater in collaboration with F. Martin Ralph; a multi-year interdisciplinary research effort focusing on how aerosols are impacting the water supply in the West Coast of the United States. Her PhD student Kerri Pratt led the Ice in Clouds Experiment - Layer Clouds (ICE-L) study. ICE-L included the first aircraft ATOFMS, named Shirley. Pratt and Prather studied ice crystals in situ on high speed aircraft flying above Wyoming, and found that the particles were mainly composed of dust or biological particles (bacteria, fungal spores or plants). Understanding the composition of airborne particles is imperative to properly evaluate their impact on climate change, as well as provide insight into how aerosol impact cloud formation and precipitation.
In 2010 she became the founding director of the NSF Center for Aerosol Impacts on Climate and the Environment (CAICE). CAICE became a National Science Foundation Phase II Center for Chemical Innovation in 2013. In this role, Prather develops new analytical techniques for studying aerosol chemistry. Her group demonstrated that dust and bioaerosols that travel from as far away as the Sahara can enhance precipitation in Western United States. Prather's group is studying the microbes that transfer from the ocean, become airborne and contribute to the global temperature. Ocean-in-the lab experiments are conducted by transferring thousands of gallons of seawater from the Pacific Ocean, producing waves, and adding nutrients to induce the growth of microbes. As part of CAICE, her group was the first to identify the major factors controlling chemical composition of sea spray, finding that the characteristics depended on the physical forces and ocean biology of the waves. They demonstrated two types of droplets; "film" drops that were full of microbes and organic materials, and "jet" drops that mainly contained sea salt and other biological species. Prather's research team can now explore the impact of carbon dioxide on the global temperature by controlling the amount entering their ocean simulation chamber. The Scripps Ocean Atmosphere Research Simulator (SOARS) became operational in the summer of 2022 and is being used to study how wind, temperature, sunlight and pollution impact the ocean and atmosphere. CAICE funding was extended by the National Science Foundation in 2018, with a second $20 million grant allowing them to investigate the interaction of human pollution with ocean-produced gases and aerosols.
Prather received the 2024 National Academy of Sciences Award in Chemical Sciences for her work furthering the understanding of atmospheric aerosols and their impact on air quality, climate, and human health.
Awards and honors
1994 American Society for Mass Spectrometry Research Award
1994 National Science Foundation Young Investigator
1997 National Science Foundation Special Creativity Award
1998 Gesellschaft für Aerosolforschung Smoluchowski Award
1999 American Association for Aerosol Research Kenneth T. Whitby Award
2000 ACS Analytical Chemistry Arthur F. Findeis Award
2009 UCSD Faculty Sustainability Award
2009 American Association for the Advancement of Science Fellow
2010 American Geophysical Union Fellow
2010 American Academy of Arts and Sciences Fellow
2010 ACS Creative Advances in Environmental Science and Technology
2011 ACS San Diego Distinguished Scientist Award
2015 California Air Resources Board Haagen-Smit Clean Air Award
2018 UC San Diego Chancellor’s Associates Excellence Award in Research in Science and Engineering
2019 Elected to the National Academy of Engineering
2020 ACS Frank H. Field and Joe L. Franklin Award for Outstanding Achievement in Mass Spectrometry
2020 Elected to the National Academy of Sciences
2022 Elected to the American Philosophical Society
2023 Gustavus John Esselen Award for Chemistry in the Public Interest
2023 Analytical Scientist the Power list - Leaders and Advocates
2024 National Academy of Sciences Award in Chemical Sciences
2024 Analytical Scientist the Power List - Plant Protectors
References
American women chemists
Environmental scientists
University of California, Davis alumni
University of California, San Diego faculty
University of California, Riverside faculty
Living people
Year of birth missing (living people)
21st-century American women
Mass spectrometrists | Kimberly Prather | [
"Physics",
"Chemistry",
"Environmental_science"
] | 1,477 | [
"Environmental scientists",
"Spectrum (physical sciences)",
"American environmental scientists",
"Mass spectrometrists",
"Mass spectrometry",
"Biochemists"
] |
59,546,391 | https://en.wikipedia.org/wiki/Jasic%20Technology%20Co.%2C%20Ltd. | Jasic Technology Company Ltd. () is a Chinese corporation operating out of Shenzhen, in the province of Guangdong. Its headquarters are in Pingshan New District.
The company manufactures and sells inverter welding machines, engine driven welders and other welder equipment primarily used in construction. Jasic is listed on The Shenzhen Stock Exchange.
Jasic was the center of a labor and political conflict in the city of Guangdong, referred to as the Jasic Incident.
Overview
Jasic Technology Company was founded in 2005 in Shenzhen, Guangdong. Jasic presently operates three factories in Shenzhen: Jasic Industrial Park, Chongqing Yunda Industrial Park, and Chengdu Jasic Industrial Park.
2018 labour dispute
The Jasic Technology Company was at the center of a widely reported controversy regarding the treatment of employees at Jasic Industrial Park in Shenzhen. Workers at the plant cited low pay, long hours, poor working conditions and, in addition, accused the management of Jasic of violating Chinese labor laws through illegal coerced overtime and excessive company fines.
In May 2018 several employees of Jasic petitioned to form a labor union with the All-China Federation of Trade Unions, which was rejected. The workers decided to continue to build their union independently; workers reported that union organizers were attacked and beaten soon after. Tensions sparked on 27 July when twenty-nine workers and supporters were arrested and allegedly beaten by Shenzhen police.
In response to the arrests, at noon on Monday, 6 August a group of eighty demonstrators publicly protested against the detainment outside of the Yanziling police station.
"At noon on Monday, about 80 supporters staged a second rally under the scorching sun outside Yanziling police station in Shenzhen’s Pingshan district, about 50 km (31 miles) from the border with Hong Kong. More than 40 Communist Party members and retired cadres, who are part of the country’s leading Maoist internet forum, Utopia, joined the rally."
A wide range of public figures condemned Chinese suppression of labor activists including MIT professor Noam Chomsky, Chinese labor activist Li Qiang, University of Hong Kong professor Pun Ngai, Chris Chan King-chi, Jenny Chan, Neo-Hegelian philosopher Slavoj Žižek, American socialist journal Jacobin, Amnesty International, Human Rights Watch, and Cornell University.
References
External links
Official Website (English)
Companies listed on the Shenzhen Stock Exchange
Manufacturing companies based in Shenzhen
Chinese companies established in 2005
Technology companies of China
Welding
Jasic incident
2005 in Shenzhen | Jasic Technology Co., Ltd. | [
"Engineering"
] | 506 | [
"Welding",
"Mechanical engineering"
] |
59,547,490 | https://en.wikipedia.org/wiki/Doppler%20velocity%20sensor | A Doppler velocity sensor (DVS) is a specialized Doppler radar that uses the Doppler effect to measure the three orthogonal velocity components referenced to the aircraft. When aircraft true heading, pitch and roll are provided by other aircraft systems, it can function as a navigation sensor to perform stand-alone dead reckoning navigation calculations as a Doppler Navigation Set (DNS).
Doppler navigation systems are independent of surrounding conditions, perform with high accuracy over land and sea anywhere in the world, and are independent of ground-based aids and space-based satellite navigation systems.
Operational principles
To measure an aircraft three-dimensional velocity, a Doppler radar antenna is caused to radiate a minimum of three non-coplanar microwave electromagnetic beams toward the earth's surface. Some of the energy is backscattered to the radar by the earth surface. With knowledge of the beam angles, three or more beam-Doppler frequencies are combined to generate the components of aircraft velocity.
DVS transmission is performed at a center frequency of 13.325 GHz in the internationally authorized Ku band of 13.25 to 13.4 GHz.
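The combination of beam-Doppler frequencies into a velocity vector can be sketched numerically. For a monostatic radar, each beam's Doppler shift is $f_i = (2/\lambda)\,(\hat u_i \cdot \mathbf v)$, so stacking three non-coplanar unit beam vectors into a matrix $B$ gives $\mathbf v = (\lambda/2)\,B^{-1}\mathbf f$. The beam geometry and velocity below are invented for illustration; only the 13.325 GHz centre frequency comes from the text:

```python
import numpy as np

C = 299_792_458.0            # speed of light, m/s
F0 = 13.325e9                # DVS centre frequency from the text, Hz
LAM = C / F0                 # wavelength, ~2.25 cm

# Three illustrative non-coplanar unit beam vectors (aircraft body axes,
# pitched down toward the terrain) -- hypothetical geometry.
beams = np.array([
    [ 0.5,  0.5, -0.7071],
    [ 0.5, -0.5, -0.7071],
    [-0.5,  0.0, -0.8660],
])
beams /= np.linalg.norm(beams, axis=1, keepdims=True)

def doppler_shifts(v):
    """Monostatic Doppler shift of each beam: f_i = (2/lam) * (u_i . v)."""
    return (2.0 / LAM) * beams @ v

def solve_velocity(shifts):
    """Invert the beam geometry to recover the 3-D velocity vector."""
    return np.linalg.solve(beams, shifts) * LAM / 2.0

v_true = np.array([60.0, 5.0, -1.0])   # m/s: forward, lateral, vertical
v_est = solve_velocity(doppler_shifts(v_true))
print(np.allclose(v_est, v_true))      # round-trip recovers the velocity
```

With more than three beams the same relation is solved in a least-squares sense, which is one reason practical sensors often radiate four beams.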
Uses
DVS are used on helicopters for navigation, hovering, sonar dropping, target handover for weapon delivery and search and rescue. Because the Doppler radar measures velocity relative to surface, sea current and tidal effects create biases. However, for sonobuoys dropping and over water search and rescue, velocity of the aircraft relative to water movement is expected.
These radars were formally approved under the FAA TSO-65a until 2013, and are designed in accordance with the Radio Technical Commission for Aeronautics (RTCA) DO-158 standard titled Minimum Performance Standards − Airborne Doppler Radar Navigation Equipment.
Limitations
The functional operation and accuracy of Doppler velocity sensors is affected by many factors, including aircraft velocity, attitude and altitude above terrain. It is also affected by environmental factors, including the type of terrain the radar is illuminating, and precipitation in the atmosphere.
As the aircraft moves, the backscattering coefficient changes within the beam width, and this causes a shift and some skewing of the Doppler spectrum, and hence an error in the measurement of velocity. A major limitation of using DVSs for navigation is that they typically suffer from accumulated error. Because the guidance system is continually integrating velocity with respect to time to calculate position (see dead reckoning), any measurement errors, however small, are accumulated over time. This leads to 'drift': an ever-increasing difference between where the system thinks it is located and the actual location. Because of the integration, a constant error in velocity results in a linearly growing error in position.
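The linear growth of position error from a constant velocity bias can be seen in a short numeric sketch (the speed, bias, and duration below are invented for illustration):

```python
def dead_reckon(velocities, dt):
    """Integrate a velocity history into position (1-D, rectangular rule)."""
    pos = 0.0
    track = []
    for v in velocities:
        pos += v * dt
        track.append(pos)
    return track

dt = 1.0                      # s
true_v = [100.0] * 3600       # one hour at a steady 100 m/s
bias = 0.1                    # a constant 0.1 m/s sensor error

truth = dead_reckon(true_v, dt)
measured = dead_reckon([v + bias for v in true_v], dt)
errors = [m - t for m, t in zip(measured, truth)]

print(errors[599], errors[3599])   # ≈60 m after 10 min, ≈360 m after 1 h
```

The position error is just the integrated bias, so it grows in direct proportion to elapsed time — the 'drift' described above.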
See also
Continuous wave radar
Dead reckoning
Frequency modulation
Guidance systems
Radio navigation
References
External links
Doppler Navigation Radar on Friends of CRC
CMA-2012 Doppler Velocity Sensor and Navigation System on CMC Electronics
AN/ASN-157 Doppler Navigation Set on BAE Systems
ANV-353 on Leonardo S.p.A.
Radar
Aircraft instruments
Avionics
Speed sensors
Navigational equipment
Spacecraft components
Navigational aids
Velocity
Motion (physics) | Doppler velocity sensor | [
"Physics",
"Technology",
"Engineering"
] | 643 | [
"Physical phenomena",
"Physical quantities",
"Avionics",
"Measuring instruments",
"Velocity",
"Motion (physics)",
"Space",
"Vector physical quantities",
"Mechanics",
"Aircraft instruments",
"Spacetime",
"Wikipedia categories named after physical quantities",
"Speed sensors"
] |
59,551,167 | https://en.wikipedia.org/wiki/Mixed%20quantum-classical%20dynamics | Mixed quantum-classical (MQC) dynamics is a class of computational theoretical chemistry methods tailored to simulate non-adiabatic (NA) processes in molecular and supramolecular chemistry. Such methods are characterized by:
Propagation of nuclear dynamics through classical trajectories;
Propagation of the electrons (or fast particles) through quantum methods;
A feedback algorithm between the electronic and nuclear subsystems to recover nonadiabatic information.
Use of NA-MQC dynamics
In the Born-Oppenheimer approximation, the ensemble of electrons of a molecule or supramolecular system can have several discrete states. The potential energy of each of these electronic states depends on the position of the nuclei, forming multidimensional surfaces.
Under usual conditions (room temperature, for instance), the molecular system is in the ground electronic state (the electronic state of lowest energy). In this stationary situation, nuclei and electrons are in equilibrium, and the molecule naturally vibrates near harmonically due to the zero-point energy.
Particle collisions and photons with wavelengths in the range from visible to X-ray can promote the electrons to electronically excited states. Such events create a non-equilibrium between nuclei and electrons, which leads to an ultrafast response (picosecond scale) of the molecular system. During the ultrafast evolution, the nuclei may reach geometric configurations where the electronic states mix, allowing the system to transfer to another state spontaneously. These state transfers are nonadiabatic phenomena.
Nonadiabatic dynamics is the field of computational chemistry that simulates such ultrafast nonadiabatic response.
In principle, the problem can be exactly addressed by solving the time-dependent Schrödinger equation (TDSE) for all particles (nuclei and electrons). Methods like the multiconfiguration time-dependent Hartree (MCTDH) method have been developed to perform this task. Nevertheless, they are limited to small systems with about two dozen degrees of freedom due to the enormous difficulty of developing multidimensional potential energy surfaces and the cost of numerically integrating the quantum equations.
NA-MQC dynamics methods have been developed to reduce the burden of these simulations by profiting from the fact that the nuclear dynamics is near classical. Treating the nuclei classically allows simulating the molecular system in full dimensionality. The impact of the underlying assumptions depends on each particular NA-MQC method.
Most of NA-MQC dynamics methods have been developed to simulate internal conversion (IC), the nonadiabatic transfer between states of the same spin multiplicity. The methods have been extended, however, to deal with other types of processes like intersystem crossing (ISC; transfer between states of different multiplicities) and field-induced transfers.
NA-MQC dynamics has been often used in theoretical investigations of photochemistry and femtochemistry, especially when time-resolved processes are relevant.
List of NA-MQC dynamics methods
NA-MQC dynamics is a general class of methods developed since the 1970s. It encompasses:
Trajectory surface hopping (TSH; FSSH for fewest switches surface hopping);
Mean-field Ehrenfest dynamics (MFE);
Coherent Switching with Decay of Mixing (CSDM; MFE with Non-Markovian decoherence and stochastic pointer state switch);
Multiple spawning (AIMS for ab initio multiple spawning; FMS for full multiple spawning);
Coupled-Trajectory Mixed Quantum-Classical Algorithm (CT-MQC);
Mixed quantum−classical Liouville equation (QCLE);
Mapping approach;
Nonadiabatic Bohmian dynamics (NABDY);
Multiple cloning (AIMC for ab initio multiple cloning);
Global Flux Surface Hopping (GFSH);
Decoherence Induced Surface Hopping (DISH)
Integration of NA-MQC dynamics
Classical trajectories
The classical trajectories can be integrated with conventional methods, as the Verlet algorithm. Such integration requires the forces acting on the nuclei. They are proportional to the gradient of the potential energy of the electronic states and can be efficiently computed with diverse electronic structure methods for excited states, like the multireference configuration interaction (MRCI) or the linear-response time-dependent density functional theory (TDDFT).
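As a minimal sketch of the classical propagation step, here is a velocity-Verlet integrator on a one-dimensional harmonic potential; the analytic force stands in for the ab initio gradients used by real NA-MQC codes, and all parameters are illustrative.

```python
# Minimal velocity-Verlet propagation on V = k*x**2/2 (illustrative
# stand-in for forces obtained from electronic-structure gradients).
def force(x, k=1.0):
    return -k * x                      # F = -dV/dx

def velocity_verlet(x, v, m=1.0, dt=0.01, steps=1000):
    f = force(x)
    traj = []
    for _ in range(steps):
        x += v * dt + 0.5 * (f / m) * dt * dt
        f_new = force(x)
        v += 0.5 * (f + f_new) / m * dt
        f = f_new
        traj.append((x, v))
    return traj

def energy(x, v, m=1.0, k=1.0):
    return 0.5 * m * v * v + 0.5 * k * x * x

traj = velocity_verlet(x=1.0, v=0.0)
e0 = energy(1.0, 0.0)
drift = max(abs(energy(x, v) - e0) for x, v in traj)
print(drift)   # stays small: velocity Verlet is symplectic
```

The near-conservation of total energy is the usual sanity check on the nuclear integrator before any nonadiabatic machinery is added on top.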
In NA-MQC methods like FSSH or MFE, the trajectories are independent of each other. In such a case, they can be separately integrated and only grouped afterward for the statistical analysis of the results. In methods like CT-MQC or diverse TSH variants, the trajectories are coupled and must be integrated simultaneously.
Electronic subsystem
In NA-MQC dynamics, the electrons are usually treated by a local approximation of the TDSE, i.e., they depend only on the electronic forces and couplings at the instantaneous position of the nuclei.
Nonadiabatic algorithms
There are three basic algorithms to recover nonadiabatic information in NA-MQC methods:
Spawning - new trajectories are created at regions of large nonadiabatic coupling.
Hopping - trajectories are propagated on a single potential energy surface (PES), but they are allowed to change surface near regions of large nonadiabatic couplings.
Averaging - trajectories are propagated on a weighted average of potential energy surfaces. The weights are determined by the amount of nonadiabatic mixing.
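As a didactic illustration of the hopping idea — deliberately simplified, using a Landau–Zener-style switching probability rather than the fewest-switches prescription, with hypothetical numbers — a trajectory crossing a region of strong coupling can be assigned a stochastic probability of changing surface:

```python
import math, random

# Simplified hop criterion: probability of staying diabatic (i.e. of
# switching adiabatic surfaces) from the Landau-Zener formula,
# P = exp(-2*pi*V**2 / (hbar * |d(E1 - E2)/dt|)).
# Real FSSH uses the electronic wavefunction coefficients instead.
def landau_zener_prob(coupling, gap_rate, hbar=1.0):
    return math.exp(-2.0 * math.pi * coupling ** 2 / (hbar * abs(gap_rate)))

V = 0.02          # diabatic coupling (arbitrary units, hypothetical)
gap_rate = 0.01   # rate of change of the diabatic energy gap
p = landau_zener_prob(V, gap_rate)

random.seed(0)    # make the stochastic decisions reproducible
hops = sum(random.random() < p for _ in range(10_000))
print(p, hops / 10_000)   # observed hop fraction ~ p
```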
Relation to other nonadiabatic methods
NA-MQC dynamics methods are approximate ways of solving the time-dependent Schrödinger equation for a molecular system. Methods like TSH, in particular in the fewest switches surface hopping (FSSH) formulation, do not have an exact limit. Other methods like MS or CT-MQC can in principle deliver the exact non-relativistic solution.
In the case of multiple spawning, it is hierarchically connected to MCTDH, while CT-MQC is connected to the exact factorization method.
Drawbacks in NA-MQC dynamics
The most common approach in NA-MQC dynamics is to compute the electronic properties on-the-fly, i.e., at each timestep of the trajectory integration. Such an approach has the advantage of not requiring pre-computed multidimensional potential energy surfaces. Nevertheless, the costs associated with the on-the-fly approach are significantly high, leading to a systematic level downgrade of the simulations. This downgrade has been shown to lead to qualitatively wrong results.
The local approximation implied by the classical trajectories in NA-MQC dynamics also leads to failing in the description of non-local quantum effects, as tunneling and quantum interference. Some methods like MFE and FSSH are also affected by decoherence errors. New algorithms have been developed to include tunneling and decoherence effects. Global quantum effects can also be considered by applying quantum forces between trajectories.
Software for NA-MQC dynamics
Survey of NA-MQC dynamics implementations in public software.
a Development version.
References
Computational chemistry | Mixed quantum-classical dynamics | [
"Chemistry"
] | 1,476 | [
"Theoretical chemistry",
"Computational chemistry"
] |
59,551,559 | https://en.wikipedia.org/wiki/Evdokimov%27s%20algorithm | In computational number theory, Evdokimov's algorithm, named after Sergei Evdokimov, is an algorithm for factorization of polynomials over finite fields. It was the fastest algorithm known for this problem, from its publication in 1994 until 2020. It can factorize a one-variable polynomial of degree over an explicitly given finite field of cardinality . Assuming the generalized Riemann hypothesis the algorithm runs in deterministic time (see Big O notation). This is an improvement of both Berlekamp's algorithm and Rónyai's algorithm in the sense that the first algorithm is polynomial for small characteristic of the field, whearas the second one is polynomial for small ; however, both of them are exponential if no restriction is made.
The factorization of a polynomial f over a ground field F is reduced to the case when f has no multiple roots and is completely splitting over F (i.e. f has deg(f) distinct roots in F). In order to find a root of f in this case, the algorithm deals with polynomials not only over the ground field F but also over a completely splitting semisimple algebra over F (an example of such an algebra is the quotient F[x]/(f(x))). The main problem here is to find efficiently a nonzero zero-divisor in the algebra. The GRH is used only to take roots in finite fields in polynomial time. Thus the Evdokimov algorithm, in fact, solves a polynomial equation over a finite field "by radicals" in quasipolynomial time.
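As background for the "completely splitting" condition (this is not Evdokimov's algorithm itself, only a sanity check of the setting it works in): over F_p, a squarefree polynomial f splits completely if and only if gcd(f, x^p − x) = f, since x^p − x is the product of (x − a) over all a in F_p. A small self-contained sketch, with p = 13 as an arbitrary illustrative choice:

```python
# Polynomials over F_p as coefficient lists, lowest degree first.
p = 13

def trim(a):
    a = [c % p for c in a]
    while a and a[-1] == 0:
        a.pop()
    return a

def polymod(a, b):
    """Remainder of a modulo b over F_p."""
    a, b = trim(a), trim(b)
    inv = pow(b[-1], p - 2, p)          # Fermat inverse of leading coeff
    while len(a) >= len(b):
        q = (a[-1] * inv) % p
        s = len(a) - len(b)
        a = [(c - q * b[i - s]) % p if i >= s else c
             for i, c in enumerate(a)]
        a = trim(a)
        if not a:
            break
    return a

def polygcd(a, b):
    a, b = trim(a), trim(b)
    while b:
        a, b = b, polymod(a, b)
    inv = pow(a[-1], p - 2, p)          # return a monic gcd
    return [(c * inv) % p for c in a]

f = [7, 11, 7, 1]                        # (x-1)(x-2)(x-3) mod 13
xp_minus_x = [0, p - 1] + [0] * (p - 2) + [1]   # x**13 - x
g = polygcd(xp_minus_x, f)
print(g == f)                            # True: f splits completely

roots = [a for a in range(p)
         if sum(c * pow(a, i, p) for i, c in enumerate(f)) % p == 0]
print(roots)                             # [1, 2, 3]
```

Finding the roots efficiently — rather than by the brute-force scan above — is exactly the hard part that Evdokimov's algorithm addresses via zero-divisors in the auxiliary algebra.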
The analysis of Evdokimov's algorithm is closely related to some problems in association scheme theory. With the help of this approach, it was proved
that if is a prime and has a ‘large’ -smooth divisor , then a modification of the Evdokimov algorithm finds a nontrivial factor of the polynomial in deterministic time, assuming GRH and that .
References
Further reading
Computational number theory
Quasi-polynomial time algorithms | Evdokimov's algorithm | [
"Mathematics"
] | 398 | [
"Computational mathematics",
"Computational number theory",
"Number theory"
] |
59,552,027 | https://en.wikipedia.org/wiki/Microfluidic%20diffusional%20sizing | Microfluidic diffusional sizing (MDS) is a method to measure the size of particles based on the degree to which they diffuse within a microfluidic laminar flow. It allows size measurements to be taken from extremely small quantities of material (nano-grams) and is particularly useful when sizing molecules which may vary in size depending on their environment - e.g. protein molecules which may unfold or become denatured in unfavourable conditions.
Applications
MDS is primarily used in protein analyses, where size, concentration and interactions are important.
Protein size measurement
Measuring the size of a protein molecule is useful as an overall quality indicator, since misfolding, unfolding, oligomerization, aggregation or degradation can all affect size.
The literature specifically demonstrates the use of MDS in sizing protein–nanobody complexes, monitoring the formation of α-synuclein amyloid fibrils, and observing protein assembly into oligomers.
MDS can also be used to size membrane proteins, as the use of a protein specific labelling and detection system allows other species present in the solution (such as free lipid micelles or detergents) to be ignored.
Protein interactions
MDS has been used to characterise interactions between biomolecules under native conditions, and has been demonstrated to detect specific interactions within complex mixtures. It has also been used in detecting and quantifying protein-ligand interactions and protein-lipid interactions.
Protein concentration
The concentration of purified protein solutions in the laboratory is useful in determining yield and measuring the success of a prep. MDS reports concentration as well as size for each test.
Since the detection is not based on inherent fluorescence of tryptophan or tyrosine residues, MDS has been used as an alternative to A280 UV-Vis quantification.
Advantages
If protein specific labelling is applied, MDS allows membrane proteins to be sized. This is particularly useful as it is an area where other biophysical techniques can struggle - for example dynamic light scattering (DLS) is of limited use, since free detergent molecules may also scatter light and affect the results.
Furthermore, as the size reported is an average of all detectable species present there is no bias towards large species, as is found in DLS measurements.
Another key advantage is that results can be obtained with very small quantities of material which may be particularly important where samples are scarce or expensive.
With commercially available MDS instruments, testing is very simple and there is no need to input test parameters or sample conditions. This makes it a very repeatable method of testing as most of the functions such as flow rates, detector settings etc. are automated by the instrument rather than set by the operator.
In addition to size, MDS is able to calculate concentration so two parameters can be assessed in one test.
Finally the method does not require calibration, as it relies on a ratio-metric measurement to determine diffusion rate.
Theory
In an MDS analysis, a stream of liquid containing the particles to be sized is introduced alongside an auxiliary stream in a laminar flow in a microfluidic channel. Because there is no convective mixing of the two streams, the only way particles can move to the auxiliary stream is by diffusion. The rate of this diffusion is dependent on the particle's size, as determined by the Stokes–Einstein equation, so small particles diffuse quicker than large particles.
After a period of diffusion, the original and auxiliary streams are split and the degree of diffusion is fixed. The number of particles in each stream can then be detected (in the case of proteins this is achieved by addition of an amine-reactive fluorogenic dye). The ratio between the two streams is used to determine the diffusion coefficient, which is used to calculate the hydrodynamic radius. The sum of particles in both streams can also be used to measure the concentration of the analyte.
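The last step — converting a diffusion coefficient into a hydrodynamic radius via the Stokes–Einstein equation, D = k_B·T / (6π·η·r_h) — can be illustrated numerically. The temperature, viscosity, and diffusion coefficient below are illustrative values for a small protein in water, not data from any MDS instrument.

```python
import math

KB = 1.380649e-23      # Boltzmann constant, J/K

def hydrodynamic_radius(D, T=298.15, eta=8.9e-4):
    """r_h in metres from diffusion coefficient D (m^2/s);
    defaults are ~25 degC water (viscosity in Pa*s)."""
    return KB * T / (6.0 * math.pi * eta * D)

# e.g. a small globular protein with D ~ 1e-10 m^2/s
r = hydrodynamic_radius(1.0e-10)
print(r * 1e9, "nm")   # a few nanometres
```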
References
Biochemistry methods | Microfluidic diffusional sizing | [
"Chemistry",
"Biology"
] | 806 | [
"Biochemistry methods",
"Biochemistry"
] |
59,552,758 | https://en.wikipedia.org/wiki/NGC%20741 | NGC 741, also known as PGC 7252, is a formerly active radio galaxy in the constellation of Pisces. Located 74.13 Mpc away, NGC 741 is part of a group of galaxies including NGC 742 and PGC 7250. NGC 741 and NGC 742 recently collided, although the disruption was minor. Radio filaments have been found connecting NGC 741 to NGC 742, and due to the bent structure of the radio filaments, NGC 741 is estimated to be moving at 1400 km/s with respect to its local group, suggesting that ram-pressure stripping was created as a product of the former merger.
References
External links
Pisces (constellation)
741
Elliptical galaxies
007252
Radio galaxies
01413
4C objects | NGC 741 | [
"Astronomy"
] | 162 | [
"Pisces (constellation)",
"Constellations"
] |
59,553,421 | https://en.wikipedia.org/wiki/Homeric%20Minimum | The Homeric Minimum is a grand solar minimum that started about 2,800 years ago (ca. 800 BC) and lasted around 200 years. It appears to coincide with, and have been the cause of, a phase of climate change at that time, which involved a wetter Western Europe and drier eastern Europe. This had far-reaching effects on human civilization, some of which may be recorded in Greek mythology and the Old Testament.
Solar phenomenon
The Homeric Minimum is a persistent and deep grand solar minimum between about 800 and 600 BC. Cosmogenic beryllium-10 deposits in varves in a German lake show a sharp increase "2,759 ± 39 varve years before present", while carbon-14 is high starting around 830 BC. It is similar to the Spörer Minimum of around AD 1500. It is sometimes named the "Great Solar Minimum" or Homerian GSM. It has been subdivided into a stronger minimum at 2,750–2,635 years before present and a secondary minimum at 2,614–2,594 years before present. The Homeric Minimum is sometimes considered to be part of a longer "Hallstattzeit" solar minimum between 705–200 BC that also includes a second minimum between 460 and 260 BC. The Homeric Minimum, however, also coincided with a geomagnetic excursion named "Sterno-Etrussia", which may have altered the climate response to the Homeric Minimum. The name "Homeric Minimum", however, is not widely accepted in solar physics.
Mechanisms of climate effects
Variations in the solar output have effects on climate, less through the usually quite small effects on insolation and more through the relatively large changes of UV radiation and potentially also indirectly through modulation of cosmic ray radiation. The 11-year solar cycle measurably alters the behaviour of weather and atmosphere, but decadal and centennial climate cycles are also attributed to solar variation. Any direct radiation effects may have been further amplified by oceanic circulation changes, such as the El Niño-Southern Oscillation and the Atlantic Meridional Overturning Circulation. It is possible that cooling in the North Atlantic predated the Homeric Minimum, while the timing of East Asian Summer Monsoon changes coincides well with that of the Homeric Minimum.
Effects on human populations and climate
Debates on whether a climatic deterioration occurred during that time began as early as the late 19th century. The Homeric Minimum has been linked with a phase of climate change known as the Homeric Climate Anomaly or Homeric Climate Oscillation, during which western North America and Europe became colder, but whether Europe became drier or wetter is under debate; the western parts and the North Atlantic may have become wetter and the eastern parts of Europe drier. This climate oscillation has also been called the "850 BC event" or the "2.8 kyr event", and it has been associated with the Iron Age Cold Epoch, the decline of the Urartu kingdom in Armenia, and a cultural interruption in Ireland, although its effect there is still debated.
Human cultures at that time underwent changes, which also coincide with the transition from the Bronze Age to the Iron Age. The climate fallout of this prolonged solar minimum may have had a substantial impact on human societies at that time, with adaptations to changed climates and a recovery of societies after its end. Increased precipitation in Thrace and over the Eurasian steppes during the Homeric Minimum may, however, have benefitted the founding of Byzantium and the Scythians, respectively, and a cultural change in the Eurasian Koban culture has been associated with the Homeric Minimum.
It has been speculated that some ancient literary references refer to these phenomena. For example, the period saw the growth of a glacier on Mount Olympus, while Greek mythology and Homer refer to ice and storms on the mountain, which may also be reflected in the name "Olympus". Increased activity of the polar lights at the end of the Homeric Minimum may have inspired Ezekiel's vision of God in the Old Testament.
Other effects
A variety of phenomena have been linked to the Homeric Minimum:
Increasingly cold, wet and windy climate recorded from Meerfelder Maar in Germany, where the Homeric Minimum has been associated with a permanent climate transition. A wetter climate was also recognized in a bog in the Netherlands; the present-day Czech Republic, where it also became colder; and in the British Isles.
A growth in the size of lakes and downward expansion of conifer forests took place in Western North America at the time of the Homeric Minimum.
Decreased sea levels are recorded from the Homeric Minimum.
Increased storminess in Scotland, England and Sweden.
Increased precipitation in northern Iberia. Such a precipitation increase took place a few decades after the Homeric Minimum and increased wetness has been noted after other solar minima, as well.
Cold sea surface temperatures in the Santa Barbara Basin of California and a cold interval in the Campito Mountain tree ring record. The Homeric Minimum in general seems to be associated with a cold climate in California.
Decreased atmospheric pressure differences between Iceland and the subtropics, that is a decreased North Atlantic oscillation index which lasted past the end of the Homeric Minimum.
Cooling is also recorded from Asia and the Southern Hemisphere.
Increased windiness is recorded in lakes of Western Europe.
Gustier springs in Europe and increased cold air outbreaks in East Asia.
Groundwater levels rose in the Netherlands.
A weaker monsoon in East Asia, India and Tibet.
A wetter climate is recorded for Central Asia.
Lake levels in the Caspian Sea rose.
Water levels increased in lakes of the French Jura.
Cooling in the Ionian Sea.
More frequent floods and storms in the Alps.
A dry period in the Eastern Mediterranean, such as at Jerusalem, Lake Van and the Dead Sea appears to coincide with the Homeric Minimum, although the mechanisms for this are not clear.
Expansion of glaciers in the Caucasus.
A cold and arid climate in Armenia.
Increased incision along the River Soar.
Increased flooding along the Ammer river.
Increased production of carbon-14 and beryllium-10 by cosmic rays, recorded in Greenland. The carbon-14 excursion is also recorded elsewhere and constitutes the largest such spike since 2000 BCE, exceeding the Maunder Minimum. The so-called Hallstatt plateau, an anomaly in carbon-14 production that creates large imprecisions in radiocarbon dating during that time, has been related to the Homeric Minimum.
The switch from the Subboreal to the Subatlantic climate epoch in the Blytt–Sernander sequence about 2,800 years before present.
The "Göschenen I" glacier advance in the Alps relates to the Homeric Minimum.
A change in storm frequency on the Scotian Shelf.
Increased precipitation in Sicily.
The Bond event 2 is associated with the 2.8 ka event.
Cold and dry weather in China is recorded in historical records like the Bamboo Annals.
A colder climate in the Khingan Mountains of China.
Increased runoff in Corsica.
References
Sources
History of climate variability and change
Solar phenomena
9th century BC
Homer
Ezekiel | Homeric Minimum | [
"Physics"
] | 1,455 | [
"Physical phenomena",
"Stellar phenomena",
"Solar phenomena"
] |
59,553,876 | https://en.wikipedia.org/wiki/Lava%20balloon | A lava balloon is a gas-filled bubble of lava that floats on the sea surface. It can be up to several metres in size. When it emerges from the sea, it is usually hot and often steaming. After floating for some time it fills with water and sinks again.
Lava balloons can form in lava flows entering the sea and at volcanic vents, but they are not common. They have been observed in the Azores, Canary Islands, Hawaii, Japan, Mariana Islands and Mexico. Apparently, they are generated when gases trapped within magma form large bubbles that eventually rise to the sea surface. In the Canary Islands, balloons containing sediments were used to infer the age of the basement on which the volcano is constructed; these sediments were also at first misinterpreted as evidence of an impending large explosive eruption.
Appearance
Lava balloons are gas-filled bubbles surrounded by a crust formed by lava; their gas content allows them to float on the sea surface. Observed sizes range from at El Hierro (Canary Islands) during the 2011–2012 eruption to about at Terceira on their long axis with rounded shapes. They have one or sometimes several large cavities surrounded by a crust. The outer part of the crust is highly vesicular and striated and has delicate flow structures that can be seen using a scanning electron microscope. It is fragile and often breaks off the balloon. The inner part of the crust is separated from the outer part by orange and white layers. It is subdivided into three inward-thickening layers, all of which contain varying amounts of vesicles that become larger toward the interior. Recovered lava balloons and associated rocks are on display in the UGGp museum on El Hierro.
Occurrence
Lava balloons have been described from Terceira Island in the Azores, at Teishi Knoll of Izu-Tobu (Japan) in 1989, El Hierro, offshore Pantelleria (Foerstner volcano, Italy) in 1891 and Kealakekua Bay (Mauna Loa, Hawaii) in 1877. Similar floating scoria blocks containing reticulite were observed in 1993–1994 at Socorro, Mexico. , lava balloons have been observed only at these sites, although the increasing number of observations might indicate that this is a common mode of submarine volcanism.
A similar style of eruption but involving silicic magmas has also been found and christened "Tangaroan", after the research ship that carried out research on the Macauley caldera. Balloon-like structures were observed in 1934–1935 at Shin-Iwo-jima, Japan, and at West Rota in the Marianas. At Macauley Island in the Kermadec Islands such a style of eruption has been inferred and used to explain the presence of large rocks at substantial distances from the volcanic vent.
Observations
Lava balloons observed during a 1998–2000 eruption at Terceira are considered to be the most noteworthy expression of that eruption. They were described as steaming dark objects floating on the sea, hot enough to damage fishing ropes. At first, they were thought to be dead whales or tree trunks. They surfaced in batches over a span of several months, clustering in particular areas that appear to reflect the position of active volcanic vents on the seafloor but also wind- and ocean-current-driven transport. Sometimes, hundreds of balloons were observed on a given occasion, accompanied by gas bubbles (i.e. gas slugs) and particles shed by the balloons, all of which rose through the water in the form of plumes. The balloons steamed at first under their own heat, forming small vapour plumes and making hissing sounds. Their insides could reach temperatures of over and were sometimes incandescent. Balloons usually floated for less than 15 minutes before sinking again as water penetrated them through cracks in the crust and gases escaped. Sometimes, however, explosions threw fragments for tens of meters when water interacted with a hot interior. Remotely operated underwater vehicle (ROV) observations of the putative vent area found debris that may have come from lava balloons.
The Pantelleria eruption generated scoriaceous and vesicular floating structures with sizes exceeding that sank again beneath the water surface after they had become saturated with water. 1892 descriptions of lava balloons about the Pantelleria eruption resemble the Terceira balloons. The eruption was discovered thanks to its balloons. As reported by fishers, black balloons of lava floated on the sea, sometimes propelled by steam jets and sometimes exploding with up to high debris fountains. As with Terceira, they were accompanied by gas bubbles and many of them were hot enough to melt zinc. Water entering the balloons evaporated from the heat, thus delaying their filling. Eventually, the balloons filled with water and sank again.
At El Hierro, lava balloons were erupted from 27 November 2011 until 23 February 2012 and often exploded upon reaching the sea surface. On the seafloor close to the vent were balloons with various shapes, including amphora-like forms, and sizes reaching over . They had sunk to the seafloor immediately after being ejected from the vent and had sometimes spilled magma. The amphora-like shape appears to have formed when floating balloons degassed through vents at their top and the balloons deformed. On the seafloor, the balloons were buried by later pillow lavas.
Towards the end of the eruption, some lava balloons had a thin layer of solidified magma around a glassy core and appeared to float for longer times, allowing them to reach the coast. The balloons were named "restingoliths" and the glassy core "xeno-pumice". Similar balloons were observed at Teishi Knoll and appear to form when sediments are incorporated into lava and melted, forming a pumice-like structure. At El Hierro, the origin of the cores gave rise to a scientific debate about whether they originated as sediment or as silicic magma; now there is agreement that they formed out of sediments. In Socorro, the cores of lava balloons contained reticulite.
In Kealakekua Bay, over a hundred lava balloons were observed. They emitted sulfurous gases and steam and were hot inside, even incandescent. As ships were moving across the area rising balloons in the water impacted their hulls but did not do any damage.
Genesis
Large floating pumice blocks such as these observed in Kikai, Japan, in 1934–1935 may be comparable to lava balloons, but they are produced by eruptions of felsic magma, which are rich in silicates and lighter elements. By contrast, lava balloons are generally produced by eruptions of alkali basalt, although few basaltic eruptions produce them.
Lava balloons are probably limited to a depth range of : too deep, and gas bubbles do not form; too shallow, and degassing fragments the rocks. Only a few sufficiently large balloons can rise all the way to the sea surface; smaller ones fill quickly with water and sink. Overly crystalline magma may render a crust too brittle to form a lava balloon.
Several different mechanisms have been invoked to explain the genesis of lava balloons. Water that penetrates the lava can boil and the resulting vapours can inflate the balloons and make them float, although for Terceira a non-water gas composition has been inferred. They are usually observed when lava flows enter the sea. They appear to form when water is trapped in lava as it flows onto a beach with waves or enters lava tubes; in the latter case, entrained water can be transported through the tube and eventually end up in developing pillow lavas which are rendered buoyant by water vapour bubbles.
Less commonly, as in Terceira, balloons and accompanying gas bubbles appear to have formed on volcanic vents rather than at the front of lava flows, and more specifically on volcanic vents where magma ponded. There, gas emanating from a gas-rich magma accumulated below a crust on top of lava, forming blisters that eventually reached a critical buoyancy and broke off, forming lava balloons. The high gas content and low viscosity of the magma during the Terceira eruption allowed balloons to form despite the vents being located at considerable depth.
Finally, lava fountaining processes have been proposed to form balloons underwater. According to this model, slabs of magma in the water are surrounded by a thin shell which traps exoluting gases but also magma. The trapped gases inflate the shell and make it buoyant, while the remnant magma maintains the shell as it expands.
Impact
On São Miguel Island in the Azores, lava balloons are considered to be one of the main volcanic hazards stemming from submarine volcanic eruptions. Early lava balloons erupted during the 2011–2012 El Hierro eruption contained xeno-pumice, which raised concerns that evolved magmas such as phonolite and trachyte, capable of generating explosive eruptions, might be present under the volcano. As the eruption continued, these concerns together with an outburst of gas led to the evacuation of the town of La Restinga. The link between xeno-pumice and evolved magmas was contested early on; when explosive eruptions did not occur, this led to complaints that the response to the eruption had been disproportionate especially given its effect on the economy. The management of the El Hierro eruption in general attracted intense criticism.
At El Hierro, the crevice-rich submarine terrain formed by sunk lava balloons and lava bombs forms a particular habitat. Animal species encountered there include the decapod Plesionika narval.
Scientific significance
At El Hierro, foraminifera fossils found in the glassy cores of lava balloons have been inferred to originate from sediments that underlie the El Hierro volcano. These fossils indicate a Cretaceous–Pliocene age for these sediments, implying that El Hierro rests on the youngest sediment base of the archipelago. The progressively lower age of the islands from east to west reinforces the theory that the Canary Islands are on top of a hotspot. Furthermore, it has been proposed that lava balloons might be proof of shallow-water volcanic eruptions.
See also
Lithophysa
Pumice raft
Notes
References
Sources
External links
Volcanology
Floating islands
Rafts
Balloons | Lava balloon | [
"Chemistry"
] | 2,094 | [
"Balloons",
"Fluid dynamics"
] |
59,554,913 | https://en.wikipedia.org/wiki/Britaldo%20Silveira%20Soares%20Filho | Britaldo Silveira Soares Filho is a Brazilian scientist, and Professor in Environment Modeling at Universidade Federal de Minas Gerais. Britaldo is one of winners of the Georg Foster Research Awards because "he has developed innovative methods in the field of geography and cartography which make it possible to precisely predict how tropical rainforests – such as in the Amazon basin – will develop. Based on these models, the government of Brazil has implemented a variety of protective measures and is planning more for the future."
Education
BA in Geology, Universidade Federal de Minas Gerais: March, 1978-December, 1982.
MSc in Remote Sensing, Instituto Nacional de Pesquisas Espaciais – March 1987-November 1989.
DSc in Spatial Analyses, Universidade de São Paulo – March, 1993-September, 1998.
Career
Dr. Britaldo Silveira Soares-Filho has been a full professor in the Department of Cartography, Institute of Geosciences, since March 2012, and is the current coordinator of the CSR (Remote Sensing Center) of the Federal University of Minas Gerais (Universidade Federal de Minas Gerais), Brazil. He advises in the graduate courses on Production Engineering and Environmental Modeling at UFMG, and led the creation of the latter. Since 2000, he has collaborated in various research projects in the Amazon with IPAM (Instituto de Pesquisa Ambiental da Amazônia), Aliança da Terra, and the Woods Hole Research Center, where he is a distinguished visiting scientist. In addition, he is a member of the scientific board of CTI (Center of Territorial Intelligence) and a guest professor at the Center for Development Research, University of Bonn, Germany.
His research focuses on environmental modeling, in particular the development of simulation models of changes in land use and land cover, agricultural and forest profitability, urban dynamics, forest fires and carbon balance, and their applications to the design and evaluation of public policies. An important product of his research is the DINAMICA EGO software, a platform for environmental modeling used by researchers from various countries, such as Mexico, Iran, Bangladesh, Greece, and China.
Soares Filho has participated in important projects to define public policies for environmental protection and conservation in Brazil, such as the Amazon Region Protected Areas Program, in which he is part of the Arpa Scientific Advisory Panel, and the environmental impact modeling studies for the implementation of the BR-163 highway in the Amazon region.
Awards
2007. IPCC Fourth Assessment Report, Working Group III (Mitigation), contributing author, chapter 9, Forestry – the IPCC was awarded the 2007 Nobel Peace Prize jointly with Al Gore.
2015. Georg Forster Research Award, The Alexander von Humboldt Foundation
References
External links
"Nature Sustainability publishes opinion of Professors Britaldo Soares Filho and Raoni Rajão on the strategies of environmental conservation in Brazil.", Nov 2018
Britaldo Soares Filho interview with Boletim UFMG
Britaldo Soares Filho interview with Estadão evaluating the repercussions of a possible merger between the Ministries of Agriculture and the Environment.
Britaldo Soares Filho video on the 90th anniversary of the University of Minas Gerais.
In a presentation to the Federal Senate, Professor Britaldo Soares Filho discussed the advances of the environmental registry and tools to facilitate compliance with the Brazilian forest code.
Lecture given by Frank Merry and Britaldo Soares-Filho for the Climate and Land Use Alliance - CLUA. July, 2017.
Britaldo Soares Filho at ResearchGate.net
Britaldo Soares Filho article on the Climate Observatory website, "A encruzilhada das emissões do desmatamento".
Britaldo Soares Filho interview with Vet UFMG on livestock productivity and forest conservation.
Britaldo Soares Filho on Environmental modelling in support of sound policy development.
Living people
Year of birth missing (living people)
Environmental scientists | Britaldo Silveira Soares Filho | [
"Environmental_science"
] | 785 | [
"Environmental scientists"
] |
59,556,385 | https://en.wikipedia.org/wiki/Robert%20F.%20Sternitzky | Robert F. Sternitzky (August 25, 1891 - May 1980) was a United States lepidopterist and illustrator. Butterfly and moth specimens he collected are in a number of collections, including those of the Harvard Museum of Natural History, the Essig Museum of Entomology (at the University of California at Berkeley), Manitoba Museum, and the Smithsonian National Museum of Natural History. He collected primarily in California and Arizona.
In 1930, he described Plebejus icarioides moroensis (Morro Bay blue or Morro blue), having taken the type specimen at Morro Beach, in San Luis Obispo County, California, on June 1, 1929; it is now known as Aricia icarioides moroensis. In 1937, he described the "Bay checkerspot", Euphydryas editha var. bayensis; it is now known as Euphydryas editha bayensis. In 1945, he described the subspecies Parnassius clodius strohbeeni.
He was accompanied on some collecting trips by Charles Henry Ingham.
He painted a plate depicting seventeen larvae and pupae, in color, for John Adams Comstock's Butterflies of California. The book was published in 1927 in very small editions, and is now rare. A facsimile edition was published in 1989; Sternitzky's is the only plate reproduced in color.
In April 1948 he notified the Lepidopterists' Society of a change of address, to Laytonville, Mendocino County. The same issue carried his advertisement, both for specimens sold commercially and for his services as an illustrator for museums.
He died in 1980. Some of his specimens were purchased by Cyril Franklin dos Passos.
The species Nemeris sternitzkyi was named in his honor by Frederick H. Rindge in 1981, as had been Parnassius phoebus sternitzkyi (Sternitzky's parnassian), by James Halliday McDunnough, in 1936; the latter is now known as Parnassius smintheus sternitzkyi.
Claude Lemaire et al. have questioned the accuracy of some of the locations on Sternitzky's specimen labels.
Papers
References
External links
Parnassius clodius strohbeeni types - images, including labels
1891 births
1980 deaths
American lepidopterists
20th-century American zoologists
People from Laytonville, California
Taxon authorities | Robert F. Sternitzky | [
"Biology"
] | 509 | [
"Taxon authorities",
"Taxonomy (biology)"
] |
53,303,114 | https://en.wikipedia.org/wiki/Orthopole | In geometry, the orthopole of a system consisting of a triangle ABC and a line ℓ in the same plane is a point determined as follows. Let be the feet of perpendiculars dropped on ℓ from respectively. Let be the feet of perpendiculars dropped from to the sides opposite (respectively) or to those sides' extensions. Then the three lines are concurrent. The point at which they concur is the orthopole.
Due to their many properties, orthopoles have been the subject of a large literature.
Some key topics are determination of the lines having a given orthopole and orthopolar circles.
Literature
Ортополюс (Orthopole), in Russian.
References
Points defined for a triangle | Orthopole | [
"Mathematics"
] | 156 | [
"Points defined for a triangle",
"Point (geometry)"
] |
53,303,367 | https://en.wikipedia.org/wiki/Suthan%20Suthersan | Suthan Suthersan (10 June 1956 – 20 February 2017) was an environmental engineer; he served as the chief technical officer and executive vice president of Arcadis North America.
Early life
Dr. Suthersan had his childhood education at the rural schools of Sri Lanka in Mankulam, Kodikamam, Valaichenai, Wattegama and Pesalai due to his father's career at Sri Lanka Railways. He joined Jaffna Central College for his secondary education.
He entered the engineering faculty at the University of Peradeniya and went for his master's degree at the Asian Institute of Technology. He did his Ph.D. in environmental engineering at the University of Toronto.
Career
He joined Clean Harbors in Boston, a pure hazardous waste handling and removal company, and then the Groundwater Technology Inc. at their corporate office in Norwood, Massachusetts.
He later joined Geraghty and Miller, which was headquartered in Long Island. There he came under the guidance of David Miller, who introduced him to Steve Blake; both encouraged and enabled him to build the technical foundation of Geraghty and Miller, which later became Arcadis North America.
Bibliography
References
External links
Suthan Suthersan: A Journey from Jaffna (Dr. Suthersan's Blog)
Arcadis Mourns Loss of Chief Technical Officer Dr. Suthan S. Suthersan
NGWA mourns passing of Suthan Suthersan
Environmental engineers
1956 births
2017 deaths
Alumni of Jaffna Central College
Asian Institute of Technology alumni
Alumni of the University of Peradeniya
University of Toronto alumni
American people of Sri Lankan Tamil descent
Sri Lankan Hindus
American Hindus
Sri Lankan emigrants to the United States
People from Hopewell Township, Mercer County, New Jersey | Suthan Suthersan | [
"Chemistry",
"Engineering"
] | 361 | [
"Environmental engineers",
"Environmental engineering"
] |
53,303,491 | https://en.wikipedia.org/wiki/NGC%201741 | NGC 1741 is a distant pair of interacting galaxies (NGC 1741A and NGC 1741B) in the Eridanus constellation. It was discovered on 6 January 1878 by French astronomer Édouard Stephan. As a result of the collision, the galaxies are in a rapid starburst phase. The galaxies are classed as Wolf–Rayet galaxies due to their high content of rare Wolf–Rayet stars.
This pair of spiral galaxies is made up of PGC 16570 (NGC 1741B) and PGC 16574 (NGC 1741A). This pair is part of the Halton Arp catalog as Arp 259 and the Hickson Compact Group as HCG 31A (NGC 1741A) and HCG 31B (NGC 1741B).
References
External links
Interacting galaxies
Eridanus (constellation)
1741
259
Discoveries by Édouard Stephan
016574
Markarian galaxies | NGC 1741 | [
"Astronomy"
] | 177 | [
"Eridanus (constellation)",
"Galaxy stubs",
"Astronomy stubs",
"Constellations"
] |
53,304,994 | https://en.wikipedia.org/wiki/Danoprevir | Danoprevir (INN) is an orally available 15-membered macrocyclic peptidomimetic inhibitor of NS3/4A HCV protease. It contains acylsulfonamide, fluoroisoindole and tert-butyl carbamate moieties. Danoprevir is a clinical candidate based on its favorable potency profile against multiple HCV genotypes 1–6 and key mutants (GT1b, IC50 = 0.2–0.4 nM; replicon GT1b, EC50 = 1.6 nM).
Danoprevir has been evaluated in an open-label, single-arm clinical trial in combination with ritonavir for treating COVID-19, and was favourably compared to lopinavir/ritonavir in a second trial.
History
Danoprevir was initially developed by Array BioPharma and then licensed to Roche for further development and commercialization. In 2013, danoprevir was licensed to Ascletis by Roche for development and production in China under the trade name Ganovo.
References
Further reading
Anti–hepatitis C agents
Antiviral drugs
COVID-19 drug development
Macrocycles
NS3/4A protease inhibitors
Carbamates
Cyclopropyl compounds
Organofluorides
Pyrrolidines
Acylsulfonamides | Danoprevir | [
"Chemistry",
"Biology"
] | 281 | [
"Antiviral drugs",
"Drug discovery",
"Organic compounds",
"COVID-19 drug development",
"Macrocycles",
"Biocides"
] |
53,305,266 | https://en.wikipedia.org/wiki/Resonant%20converter | A resonant converter is a type of electric power converter that contains a network of inductors and capacitors called a resonant tank, tuned to resonate at a specific frequency. They find applications in electronics, in integrated circuits.
There are multiple types of resonant converter:
Series resonant converter
Parallel resonant converter
Class E resonant converter
Class E resonant rectifier
Zero-voltage switching resonant converter
Zero-current switching resonant converter
Two-quadrant ZVS resonant converter
Resonant DC-link inverter
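All of these topologies share an LC tank whose resonant frequency follows the standard relation f0 = 1/(2π√(LC)); a small illustrative calculation (the component values below are hypothetical, not from the article):

```python
import math

def resonant_frequency(inductance, capacitance):
    """Resonant frequency (Hz) of an ideal LC tank: f0 = 1/(2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance * capacitance))

# Example: a 100 uH inductor with a 100 nF capacitor resonates near 50 kHz.
f0 = resonant_frequency(100e-6, 100e-9)
print(f"{f0:.0f} Hz")
```

The converter's switching frequency is then chosen relative to f0 to place the tank in the desired (inductive or capacitive) operating region.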
See also
Inverter
Switched-mode power supply
References
Electrical engineering | Resonant converter | [
"Engineering"
] | 143 | [
"Electrical engineering"
] |
53,305,497 | https://en.wikipedia.org/wiki/HD%2057197 | HD 57197, also known as M Puppis or HR 2789, is a suspected astrometric binary located in the southern constellation Puppis, the poop deck. It has an apparent magnitude of 5.84, making it faintly visible to the naked eye under ideal conditions. Based on parallax measurements from the Gaia satellite, the system is estimated to be 629 light years away from the Solar System. The value is poorly constrained, but it appears to be receding with a heliocentric radial velocity of . At its current distance, HD 57197's brightness is diminished by 0.3 magnitudes due to interstellar dust. It has an absolute magnitude of -0.43.
The visible component has a stellar classification of B8 II/III, indicating that it is an evolved B-type star with the blended luminosity class of a bright giant and a giant star. HD 57197 is estimated to be 220 million years old, enough time for it to expand to 3.9 times the Sun's radius. It has 3.08 times the mass of the Sun and radiates 234 times the luminosity of the Sun from its photosphere at an effective temperature of , giving it a bluish-white hue. Based on its extinction in the Gaia passband, it has an iron abundance 66% that of the Sun. This makes HD 57197 metal deficient.
References
Puppis
B-type giants
Puppis, M
CD-43 3093
2789
057197
035347
B-type bright giants
Puppis, 85 | HD 57197 | [
"Astronomy"
] | 327 | [
"Puppis",
"Constellations"
] |
53,305,844 | https://en.wikipedia.org/wiki/DyNet | DyNet is the communications network and communications protocol for Dynalite lighting automation and building automation. It is now part of Signify.
Design
The network runs on a 4-twisted-pair cable of 100 Ω 100 MHz CAT5E [1] or a flat cable with an RS485 serial port, usually with an RJ-12 connector. A daisy-chain serial network topology is strongly recommended, with no stubs. The recommended cable colour-coding is:
Green/White pair = paralleled for GND
Orange/White pair = paralleled for +12V
Blue/White pair = blue for DATA+ and white for DATA-
Brown/White pair = spare or shield if unshielded cable is used.
References
Lighting
Building automation | DyNet | [
"Engineering"
] | 152 | [
"Building engineering",
"Building automation",
"Automation"
] |
53,306,110 | https://en.wikipedia.org/wiki/Glecaprevir | Glecaprevir (INN,) is a hepatitis C virus (HCV) nonstructural (NS) protein 3/4A protease inhibitor that was identified jointly by AbbVie and Enanta Pharmaceuticals. It is being developed as a treatment of chronic hepatitis C infection in co-formulation with an HCV NS5A inhibitor pibrentasvir. Together they demonstrated potent antiviral activity against major HCV genotypes and high barriers to resistance in vitro.
On 19 December 2016, AbbVie submitted a new drug application to the U.S. Food and Drug Administration for the glecaprevir/pibrentasvir (trade name Mavyret) regimen for the treatment of all major genotypes (1–6) of chronic hepatitis C. On 3 August 2017 the FDA approved the combination for hepatitis C treatment. In Europe, it was approved on 17 August 2017 for the same indication, under the trade name Maviret.
See also
Protease inhibitor (pharmacology)
References
Drugs developed by AbbVie
NS3/4A protease inhibitors
Acylsulfonamides
Carboxamides
Carbamates
Cyclopentanes
Cyclopropanes
Ethers
Organofluorides
Pyrrolidines
Quinoxalines
Tert-butyl compounds | Glecaprevir | [
"Chemistry"
] | 276 | [
"Organic compounds",
"Functional groups",
"Ethers"
] |
53,306,867 | https://en.wikipedia.org/wiki/Chassis%20dynamometer | A chassis dynamometer, informally referred to as a rolling road or a dyno, is a mechanical device that uses one or more fixed roller assemblies to simulate different road conditions within a controlled environment, and is used for a wide variety of vehicle testing and development purposes.
Chassis dynamometer types
There are many types of chassis dynamometer according to the target application - for example, emissions measurement, mileage accumulation chassis dynamometers (MACD), noise-vibration-harshness (NVH or "acoustic") applications, electromagnetic compatibility (EMC) testing, end-of-line (EOL) tests, and performance measurement and tuning. Another basic division is by type of vehicle - motorcycles, cars, trucks, tractors - or by the size of the roller - mostly 25", 48", 72", but also any other. Modern dynamometers used for development are mostly of one-roller-per-wheel construction, with the vehicle wheel placed on top of the roller. An older constructional solution uses two rollers per wheel, with the vehicle placed between them; this design is cheaper and simpler but, due to the requirements for accuracy and strict limits, it is no longer used for the development of new vehicles, only as an end-of-line test dynamometer, for measuring engine performance without dismantling, or for performance tuning in "garage" companies.
Basic modes
Tractive force control/Force constant - in this mode the dynamometer holds the set force regardless of speed or other parameters. The specified force can be distributed evenly between the axles or in different amounts between different axles in the case of multi-axle chassis dynamometers.
Speed control/Velocity constant - the dynamometer holds the set speed regardless of force or other parameters. For example, if a vehicle tries to accelerate in this mode, the dynamometer applies an opposing force to maintain the set constant speed. This mode is used, for example, in static power measurement.
Road load simulation - the dynamometer simulates the road according to set parameters (the desired simulation parameters = F0, F1, F2 or ABC parameters, simulated inertia and gradient).
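The road load in the simulation mode above is conventionally modelled as a quadratic in vehicle speed (the F0/F1/F2 or ABC coefficients) plus inertial and gradient terms; a minimal sketch (the coefficient values are illustrative assumptions, not from the article):

```python
import math

def road_load_force(v, F0, F1, F2, mass, accel=0.0, gradient_rad=0.0, g=9.81):
    """Total force (N) the dynamometer applies to simulate the road:
    coast-down polynomial + simulated inertia + gradient."""
    road = F0 + F1 * v + F2 * v ** 2          # F0/F1/F2 (ABC) coefficients
    inertia = mass * accel                    # simulated vehicle inertia
    grade = mass * g * math.sin(gradient_rad) # simulated slope
    return road + inertia + grade

# Steady 100 km/h (27.8 m/s) on a flat road, illustrative coefficients.
F = road_load_force(v=27.8, F0=120.0, F1=1.5, F2=0.35, mass=1500.0)
print(f"{F:.0f} N")
```

At steady speed on a flat road only the polynomial term remains; the inertia and gradient terms come into play during the transient and hill-simulation cases described below.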
Measured variables on a roller dynamometer
The only directly measured variables are the force at the torque transducer (i.e. load cell) and the revolutions measured by the roller encoder. All other variables are calculated from the known design (i.e. roller radius and load cell mounting).
Power measurement on Chassis dynamometer
Due to friction and mechanical losses in various parts of the power train, the measured power at the wheels is about 15 to 20 percent lower than the power measured directly at the output of the engine crankshaft (a measuring device built for this purpose is called an engine testbed).
Road load simulation principle on chassis dynamometer
Because the vehicle is secured to the chassis dynamometer, variables such as wind resistance cannot alter the data set. The chassis dynamometer is designed so that the sum of all the forces that would act on a vehicle driven on an actual road course is simulated through the tires and reflected in the test results.
Air drag increases with speed on the road; on the dynamometer this manifests as an increasing braking force at the vehicle's wheels. The aim is to make the vehicle on the dynamometer accelerate and decelerate in the same way as on a real road.
First you need to know the parameters of the "behavior" of the vehicle on a real road.
To obtain these "road parameters", the vehicle is driven on an ideally flat road with no wind from any direction, the gearbox is set to neutral, and the time needed to slow down without braking is measured over set speed intervals, e.g. 100–90 km/h, 90–80 km/h, 80–70 km/h, 70–60 km/h, etc. Slowing down from a higher speed takes a shorter time, mainly due to air resistance.
These parameters are later entered at the dynamometer workstation, together with the vehicle inertia. The vehicle is restrained and a so-called vehicle adaptation has to be performed.
During vehicle adaptation the dynamometer automatically coasts down from a set speed, changing its own "dyno parameters" and trying to achieve the same deceleration over the given intervals as on a real road. Those parameters are then valid for this vehicle type. By changing the set simulated inertia it is possible to simulate the vehicle's ability to accelerate when fully loaded; by setting a gradient it is possible to simulate the forces on a vehicle going downhill, and so on. Chassis dynamometers for climatic chambers also exist, in which the temperature can be varied over a given range, e.g. -40 to +50 °C, as well as altitude chambers in which pressure can be varied, making it possible to check fuel consumption at different temperatures or pressures and to simulate driving on mountain roads.
References
Dynamometers
Automotive technologies
Engine tuning instruments | Chassis dynamometer | [
"Technology",
"Engineering"
] | 949 | [
"Engine tuning instruments",
"Dynamometers",
"Mechanical engineering",
"Measuring instruments"
] |
53,307,338 | https://en.wikipedia.org/wiki/FlowFET | A flowFET is a microfluidic component which allows the rate of flow of liquid in a microfluidic channel to be modulated by the electrical potential applied to it. In this way, it behaves as a microfluidic analogue to the field effect transistor, except that in the flowFET the flow of liquid takes the place of the flow of electric current. Indeed, the name of the flowFET is derived from the naming convention of electronic FETs (e.g. MOSFET, FINFET etc.).
Mechanism of action
A flowFET relies on the principle of electro-osmotic flow (EOF). In many liquid-solid interfaces, there is an electrical double layer that develops due to interactions between the two phases. In the case of a microfluidic channel, this results in a charged layer of liquid on the periphery of the fluid column which surrounds the bulk of the liquid. This electric double layer has an associated potential difference known as the zeta potential. When an appropriately-oriented electrical field is applied to this interfacial double layer (i.e. parallel to the channel and in the plane of the electric double layer), the charged liquid ions experience a motive Lorentz force. Since this layer sheaths the fluid column, and since this layer moves, the entire column of liquid will begin to move with a speed v. The velocity of the fluid layer "diffuses" into the bulk of the channel from the periphery towards the centre due to viscous coupling. The speed v is related to the strength of the electric field E, the magnitude of the zeta potential ζ, the permittivity ε and the viscosity μ of the fluid: v = εζE/μ.
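For scale, the slip speed set up by electro-osmosis follows the Helmholtz–Smoluchowski relation v = εζE/μ; evaluating it with typical aqueous values (the numbers below are illustrative assumptions, not from the article) gives flows on the order of a millimetre per second:

```python
def eof_velocity(e_field, zeta, permittivity, viscosity):
    """Electro-osmotic slip speed v = eps * zeta * E / mu
    (Helmholtz-Smoluchowski relation, magnitudes only)."""
    return permittivity * zeta * e_field / viscosity

eps_water = 80.1 * 8.854e-12        # permittivity of water at ~20 C (F/m)
v = eof_velocity(e_field=3e4,       # e.g. 300 V across a 1 cm channel
                 zeta=0.05,         # 50 mV zeta potential
                 permittivity=eps_water,
                 viscosity=1.0e-3)  # water (Pa*s)
print(f"{v * 1e3:.2f} mm/s")        # on the order of 1 mm/s
```

Note that v is independent of the channel cross-section, which is why EOF is attractive for very small channels where pressure-driven flow becomes impractical.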
In a FlowFET, the zeta potential between the channel walls and the fluid can be altered by applying an electrical field perpendicular to the channel walls. This has the effect of altering the motive force experienced by the mobile liquid atoms in the double layer. This change in the zeta-potential can be used to control both the magnitude and direction of the electro-osmotic flow in the microchannel.
The controlling voltage need only be in the range of 50 V for a typical microfluidic channel, since this correlates to a gradient of 1.5 MV/cm due to the channel size.
Operational limitations
Variation of the FlowFET dimensions (e.g. the insulating layer thickness between the channel wall and the gate electrode) due to the manufacturing process can lead to inexact control of the zeta potential. This can be exacerbated in the case of wall contamination, which can alter the electrical properties of the channel wall surface adjacent to the gate electrode. This will affect the local flow characteristics, which may be especially important in chemical synthesis systems whose stoichiometry is directly related to the transport rate of reaction precursors and reaction products.
There are constraints placed on the fluid that can be manipulated in a FlowFET. Since it relies on EOF, only fluids producing an EOF in response to an applied electric field may be used.
While the controlling voltage need only be on the order of 50 V, the EOF-producing voltage along the channel axis is larger, on the order of 300 V. It has been observed experimentally that electrolysis may occur at the electrode contacts. This water electrolysis can alter the pH in the channel and adversely affect biological cells and biomolecules, while gas bubbles tend to "clog" microfluidic systems.
In further analogy with microelectronic systems, the switching time for a flowFET is inversely proportional to its size. Scaling down a flowFET reduces the time needed for the flow to equilibrate to a new flow rate following a change in the applied electrical field. The switching frequency of a flowFET is, however, many orders of magnitude lower than that of an electronic FET.
Applications
A FlowFET sees potential uses in massively parallel microfluidic manipulation, for example in DNA microarrays.
Without using a FlowFET, it is necessary to control the rate of EOF by changing the magnitude of the EOF-producing field (i.e. the field parallel to the channel's axis) while leaving the zeta potential unaltered. In this arrangement, however, simultaneous control of EOF in channels connected with each other cannot easily be accomplished.
A FlowFET provides a way of controlling microfluidic flow that uses no moving parts. This is in stark contrast to other solutions, including pneumatically-actuated peristaltic pumps such as those presented by Wu et al. Fewer moving parts means less opportunity for mechanical breakdown of a microfluidic device. This may become increasingly relevant as future iterations of microelectronic fluidic (MEF) arrays continue to increase in size and complexity.
The use of bi-directional electronically-controlled flow has interesting options for particle and bubble cleaning operations.
See also
Fluidics
Microfluidics
Electro-osmosis
Lab-on-a-chip
References
Fluid dynamics
Nanotechnology
Biotechnology | FlowFET | [
"Chemistry",
"Materials_science",
"Engineering",
"Biology"
] | 1,051 | [
"Microfluidics",
"Microtechnology",
"Chemical engineering",
"Materials science",
"Biotechnology",
"nan",
"Piping",
"Nanotechnology",
"Fluid dynamics"
] |
53,308,753 | https://en.wikipedia.org/wiki/C26H29NO2 | {{DISPLAYTITLE:C26H29NO2}}
The molecular formula C26H29NO2 (molar mass: 387.5139 g/mol) may refer to:
Afimoxifene
Droloxifene, also known as 3-hydroxytamoxifen
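The quoted molar mass can be reproduced from the formula with the standard atomic weights of that era; a small check in Python (the weight values used are 2005-vintage IUPAC figures, chosen to match the quoted number):

```python
# Standard atomic weights (g/mol); newer IUPAC tables shift the
# result slightly in the third decimal place.
WEIGHTS = {"C": 12.0107, "H": 1.00794, "N": 14.0067, "O": 15.9994}

def molar_mass(composition):
    """Sum of atomic weights for an {element: count} composition."""
    return sum(WEIGHTS[el] * n for el, n in composition.items())

m = molar_mass({"C": 26, "H": 29, "N": 1, "O": 2})
print(f"{m:.4f} g/mol")  # 387.5140 g/mol, matching the quoted 387.5139
```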
Molecular formulas | C26H29NO2 | [
"Physics",
"Chemistry"
] | 68 | [
"Molecules",
"Set index articles on molecular formulas",
"Isomerism",
"Molecular formulas",
"Matter"
] |
53,310,496 | https://en.wikipedia.org/wiki/Sergei%20V.%20Kalinin | Sergei V. Kalinin is the Weston Fulton Professor at the Department of Materials Science and Engineering at the University of Tennessee-Knoxville.
Education
Kalinin graduated with M.S. from Department of Materials Science, Moscow State University, Russia in 1998. He received his Ph.D. in Materials Science and Engineering from the University of Pennsylvania in 2002 under Prof. Dawn Bonnell.
Career
He has been a research staff member at ORNL since October 2004 (Senior since 2007, Distinguished since 2013, Corporate Fellow since 2020 and Group Leader at CNMS). Previously he was Theme leader for Electronic and Ionic Functionality at CNMS, ORNL (2007– 2015). He was a recipient of Eugene P. Wigner Fellowship (2002 - 2004).
He became Joint faculty at the Center for Interdisciplinary Research and Graduate Education, University of Tennessee, Knoxville in December 2010. He also became adjunct professor at Sung Kyun Kwan University in January 2013.
From March 2022 to February 2023 he worked at Amazon as special projects principal scientist. After his assignment at Amazon he resumed his role at University of Tennessee, Knoxville as the Weston Fulton Chair Professor in the department of Materials Science and Engineering.
Research
Big data in physics and atom by atom fabrication
Kalinin's research applies machine learning and artificial intelligence to nanometer scale and atomically resolved imaging data, aiming to extract physics of atomic, molecular, and mesoscale interactions and enable real-time feedback for controlled matter modification, patterning, and atom by atom fabrication. The research builds on modern electron and scanning probe microscopies, which provide high-veracity information on the structure and functionalities of solids. Kalinin has developed frameworks for information capture, crowd-sourced analysis, and physics extraction from imaging tools. His research aims to extract simple physical parameters from imaging data and establish causative relationships between materials properties and functionalities. Kalinin and colleagues believe that electron microscopy can transition from a purely imaging tool to a new paradigm of atomic matter control and quantum computing, enabled via atom by atom fabrication by electron beams.
Kalinin has proposed the concept of Atomic Forge, the use of the sub-atomically focused beam of Scanning Transmission Electron Microscopy for atomic manipulation and atom by atom assembly.
Nanoelectromechanics and piezoresponse force microscopy
Kalinin has contributed to the field of nanoscale electromechanics, exploring the coupling between electrical and mechanical phenomena on the nanoscale. He has made significant contributions to piezoresponse force microscopy (PFM), including the first PFM imaging in liquid and vacuum, PFM of biological tissues, and the observation of nanoscale ferroelectricity in molecular systems. He has also pioneered the development of spectroscopic imaging modes for PFM, allowing visualization of polarization switching on the sub-10 nanometer level and establishing the resolution and contrast transfer mechanisms of domain walls and spectroscopy. Kalinin led the team that pioneered the BE principle for force-based scanning probe microscopes, enabling quantitative capture of probe-material interactions. His multidimensional, multimodal spectroscopies have enabled quantitative studies of polarization dynamics and mechanical effects accompanying switching in ferroelectrics. Kalinin's work has revealed the critical role of electrochemical phenomena on ferroelectric surfaces and the emergence of chaos and intermittency during domain switching and shape symmetry breaking. His recent work includes the development of the basic theory and phase-field formulation for domain evolution and the exploration of the coupled electrochemical-ferroelectric states.
Awards and honors
He is a recipient of:
Presidential Early Career Award for Scientists and Engineers (PECASE) in 2009
Blavatnik Award Laureate (2018) and Finalist (2016, 17)
IEEE-UFFC Ferroelectrics Young Investigator Award in 2010
Burton medal of Microscopy Society of America in 2010
ISIF Young Investigator Award in 2009,
American Vacuum Society 2008 Peter Mark Memorial Award
2003 Ross Coffin Award and 2009 Robert L. Coble Awards of American Ceramics Society
RMS medal for Scanning Probe Microscopy (2015)
R&D100 Awards (2008, 2010, 2016, 2018, and 2023)
Feynman Prize in Nanotechnology - Experimental category in 2022
Peter Duncumb Award for excellence in microanalysis, Microanalysis Society (2024)
He was named a fellow of Royal Society of Chemistry (2024), AAAS (2024), Materials Research Society (2017), Foresight Institute (2017), MRS (2016), AVS (2015), APS (2015), and a senior member (2015) and Fellow (2017) of IEEE.
He is a member of editorial boards for Nanotechnology, Journal of Applied Physics/Applied Physics Letters, and Nature Partner Journal Computational Materials.
External links
Google Scholar
AE-SPM group at the University of Tennessee, Knoxville
ORNL's Institute for Functional Imaging of Materials, News Article, March 19, 2015
Big data, machine learning, and artificial intelligence in Scanning Transmission Electron Microscopy (STEM) and Scanning Probe Microscopy (SPM) Lecture Series (YouTube playlist)
Piezoresponse Force Microscopy (PFM) and Spectroscopy Lectures Series (YouTube playlist)
Electronic and Ionic Transport Measurements by Scanning Probe Microscopy (SPM) Lecture Series (YouTube playlist)
References
Year of birth missing (living people)
Living people
Oak Ridge National Laboratory people
Scientists from Moscow
Moscow State University alumni
University of Pennsylvania alumni
Russian emigrants to the United States
Microscopists
Fellows of the American Physical Society
Recipients of the Presidential Early Career Award for Scientists and Engineers | Sergei V. Kalinin | [
"Chemistry"
] | 1,134 | [
"Microscopists",
"Microscopy"
] |
53,311,012 | https://en.wikipedia.org/wiki/NGC%205264 | NGC 5264, also known as DDO 242, is an irregular galaxy in the constellation Hydra. It is part of the M83 subgroup of the Centaurus A/M83 Group, located some 15 million light years (4.5 megaparsecs) away. The galaxy was discovered on 30 March 1835 by John Herschel, and it was described as "very faint, pretty large, round, very little brighter middle" by John Louis Emil Dreyer, the compiler of the New General Catalogue.
NGC 5264 was imaged by the Hubble Space Telescope in 2016. The galaxy is relatively small: it is a dwarf galaxy, a type of galaxy much smaller than normal spiral galaxies and elliptical galaxies. In fact, it is only 11000 light years (3300 parsecs) across at its widest; our own galaxy, the Milky Way, is by comparison about ten times larger. Dwarf galaxies like these usually have about a billion stars. NGC 5264 is also relatively blue in colour; this results from its interactions with other galaxies, which supply it with gas for star formation.
References
External links
5264
048467
Centaurus A/M83 Group
Dwarf irregular galaxies
Hydra (constellation)
UGCA objects | NGC 5264 | [
"Astronomy"
] | 252 | [
"Hydra (constellation)",
"Galaxy stubs",
"Astronomy stubs",
"Constellations"
] |
53,311,744 | https://en.wikipedia.org/wiki/Miedema%27s%20model | Miedema's model is a semi-empirical approach for estimating the heat of formation of solid or liquid metal alloys and compounds in the framework of thermodynamic calculations for metals and minerals. It was developed by the Dutch scientist Andries Rinse Miedema (15 November 1933 – 28 May 1992) while working at Philips Natuurkundig Laboratorium. It may provide or confirm basic enthalpy data needed for the calculation of phase diagrams of metals, via CALPHAD or ab initio quantum chemistry methods. For a binary system composed by elements A and B, a generic Miedema Formula could be cast as where terms Phi and nwS are explained and reported below.
For a binary system the physical picture can thus be reduced to a function of the differences in these physical parameters between the two elements, although the full expression, with its composition- and volume-dependent prefactors, takes a more complex form.
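The chemical term of the Miedema formula can be sketched numerically. The following is a minimal illustration only, not the full model: the constants P and Q and the sample parameter values in the test are assumptions chosen for demonstration, and the complete Miedema expression also includes composition- and volume-dependent factors.

```python
# Illustrative sketch of the simplified Miedema chemical term,
# dH ~ -P*(d_phi*)^2 + Q*(d_nws^(1/3))^2, with hypothetical constants;
# real calculations need the tabulated phi*, n_ws and composition factors.
def miedema_interfacial_term(phi_a, phi_b, nws13_a, nws13_b, P=14.2, Q=132.5):
    """Return the (unnormalised) chemical contribution to the heat of mixing.

    phi_a, phi_b     : work-function-like electronegativities phi* (V)
    nws13_a, nws13_b : cube roots of the electron density at the Wigner-Seitz
                       cell boundary, n_ws^(1/3) (Miedema density units)
    P, Q             : empirical constants (values here are illustrative)
    """
    d_phi = phi_a - phi_b          # electronegativity difference (attractive)
    d_n = nws13_a - nws13_b        # electron-density mismatch (repulsive)
    return -P * d_phi**2 + Q * d_n**2
```

The sign structure is the point of the sketch: an electronegativity difference drives the enthalpy negative (favouring compound formation), while an electron-density mismatch drives it positive.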
History
Miedema introduced his approach in several papers, beginning in 1973 in Philips Technical Review Magazine with "A simple model for alloys".
Miedema described his motivation with "Reliable rules for the alloying behaviour of metals have long been sought. There is the qualitative rule that states that the greater the difference in the electronegativity of two metals, the greater the heat of formation - and hence the stability. Then there is the Hume-Rothery rule, which states that two metals that differ by more than 15% in their atomic radius will not form substitutional solid solutions. This rule can only be used reliably (90 % success) to predict poor solubility; it cannot predict good solubility. The author has proposed a simple atomic model, which is empirical like the other two rules, but nevertheless has a clear physical basis and predicts the alloying behaviour of transition metals accurately in 98 % of cases. The model is very suitable for graphical presentation of the data and is therefore easy to use in practice."
Free web-based applications include Entall and the Miedema Calculator. The latter was reviewed and improved in 2016, with an extension of the method, and the original Algol program was ported to Fortran.
Informatics-guided classification of miscible and immiscible binary alloy systems
Miedema's approach has been applied to the classification of miscible and immiscible binary alloy systems, which are relevant in the design of multicomponent alloys. A comprehensive classification of alloying behavior was produced for 813 binary alloy systems consisting of transition and lanthanide metals. "Impressively, the classification by the miscibility map yields a robust validation on the capability of the well-known Miedema’s theory (95% agreement) and shows good agreement with the HTFP method (90% agreement)." These 2017 results demonstrate that "a state-of-the art physics-guided data mining can provide an efficient pathway for knowledge discovery in the next generation of materials design".
Appendix: Basic Miedema Model Parameters
This table reports the three main Miedema parameters for the elements of the periodic table to which the model is applicable. These are the original parameters given on page 24 of the book by F.R. de Boer, R. Boom, W.C.M. Mattens, A.R. Miedema and A.K. Niessen, Cohesion in Metals: Transition Metal Alloys (1988). The list should be considered a starting point; results from it can be reproduced with the Fortran program made available by Emre Sururi Tasci. Improved data may be found in more recent publications, and further refinement may in future be provided by the extended open CALPHAD database collections available at NIMS, for instance the databases available for Fe–X binary phase diagrams.
References
See also
Phase diagram
Gibbs energy
CALPHAD
Intermetallic compound
Enthalpy of mixing
Computational thermodynamics
Alloys
Solid-state chemistry
Metallurgy
Materials science
Thermodynamic free energy | Miedema's model | [
"Physics",
"Chemistry",
"Materials_science",
"Engineering"
] | 836 | [
"Thermodynamic properties",
"Applied and interdisciplinary physics",
"Physical quantities",
"Metallurgy",
"Materials science",
"Thermodynamic free energy",
"Energy (physics)",
"Chemical mixtures",
"Condensed matter physics",
"nan",
"Alloys",
"Wikipedia categories named after physical quantitie... |
53,314,796 | https://en.wikipedia.org/wiki/Dihydroxydisulfane | Dihydroxydisulfane or hypodithionous acid is a reduced sulfur oxyacid with sulfur in a formal oxidation state of +1, but the valence of sulfur is 2. The structural formula is , with all atoms arranged in a chain. It is an isomer of thiosulfurous acid but is lower in energy. Other isomers include HOS(=O)SH, HOS(=S)OH, and HS(=O)2SH. Disulfur monoxide, S2O, can be considered as the anhydride. Unlike many of these other reduced sulfur acids, dihydroxydisulfane can be formed in a pure state by reacting hydrogen sulfide with sulfur dioxide at −70 °C in dichlorodifluoromethane.
H2S + SO2 → H2S2O2
Dihydroxydisulfane may exist in an equilibrium with thiosulfurous acid.
Organic derivatives such as dimethoxydisulfane, diaceto disulfide, and bis(trifluoroaceto) disulfide also exist.
The conjugate bases are called disulfanediolate and hypodithionite.
Properties
Calculations predict an S−S bond length of 2.013 Å, an O−S bond length of 1.645 Å, and an H−O bond length of 0.943 Å.
Related compounds
Related compounds include the isoelectronic substances hydrogen tetroxide HOOOOH, hydroxotrisulfane HOSSSH, HSOSSH, and tetrasulfane HSSSSH.
References
Sulfur oxoacids
Disulfides | Dihydroxydisulfane | [
"Chemistry"
] | 355 | [
"Inorganic compounds",
"Inorganic compound stubs"
] |
63,099,563 | https://en.wikipedia.org/wiki/Christofari | Christofari — are Christofari (2019), Christofari Neo (2021) supercomputers of Sberbank based on Nvidia corporation hardware Sberbank of Russia and Nvidia. Their main purpose is neural network learning. They are also used for scientific research and commercial calculations.
The supercomputers are named after the first customer of Sberbank, holder of the bank's first savings account passbook. The supercomputers are listed in the Top 500 ranking of the most powerful commercially available computer systems.
Development
Sberbank presented the supercomputers together with its subsidiary SberCloud. In December 2019, Sberbank and SberCloud commercially launched the Christofari supercomputer. Within a year, the power of Christofari became the foundation of a cloud based ML Space platform. It was configured to work with machine learning models. Sberbank and SberCloud announced this platform in December 2020.
The more powerful Christofari Neo supercomputer was presented at the AI Journey international conference in November 2021 by David Rafalovsky, then CTO of Sberbank Group; he has since left the group.
Usage
The supercomputers can be used by scientific, commercial and government organizations working in the various sectors of the economy. The machines were developed to work with artificial intelligence algorithms, neural network learning, and inference of various models.
Sber uses Christofari for internal tasks such as speech recognition and the generation of autoresponder voices in its call center (40% of customer inquiries are already answered automatically by bots), as well as for the analysis of CT scans of the lungs. The SberDevices and Sber AI teams were the first to receive access to Christofari Neo; they developed the first service based on the DALL-E neural network that generates images from queries in Russian.
The power of supercomputers is also provided to other organizations when connecting the services of the cloud platform SberCloud ML Space.
Christofari
The first supercomputer was presented by Herman Gref, CEO of Sberbank, and David Rafalovsky, CTO of Sberbank Group, on 8 November 2019, at the AI Journey conference in Moscow.
As of March 2020, this was the only supercomputer in Russia designed specifically for working with artificial intelligence algorithms. It is capable of training models based on complex neural networks in unprecedentedly short times, and at launch it was Russia's fastest supercomputer.
The Christofari machines are based on Nvidia DGX-2 nodes equipped with Tesla V100 graphics accelerators. An InfiniBand network based on Mellanox hardware is used for the interconnect. Effective performance is about 6.7 petaflops, which made it the 40th most powerful system in the world at the time of launch on 8 November 2019, the 7th in Europe, and the 1st in Russia (the results of the previous Russian leader, the Lomonosov-2 system, were exceeded by more than two times). It was the first supercomputer in the world owned by a financial organization and made available to third-party users, and the first supercomputer certified in Russia to work with personal data.
DGX-2 single node specifications
Maximum Power Usage — 10 kW
CPU — Dual Intel Xeon Platinum 8168, 2.7 GHz, 24-cores
GPUs — 16X NVIDIA Tesla V100
GPU Memory — 512 GB total
NVIDIA CUDA Cores — 81920
NVIDIA Tensor cores — 10240
System Memory — 1.5 TB
The DGX servers are connected via Mellanox switches with 36-ports, supporting up to four InfiniBand EDR connections at 100 Gbit/s.
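As a rough illustration of how such node specifications translate into aggregate throughput, the sketch below multiplies a per-GPU peak by the listed GPU count per node. The ~125 TFLOPS FP16 tensor-core figure per V100 is the commonly quoted vendor peak and is used here as an assumption; sustained performance (e.g. on LINPACK) is substantially lower.

```python
# Back-of-the-envelope peak throughput for one DGX-2 node,
# from the specification list above.
GPUS_PER_NODE = 16
V100_TENSOR_PEAK_TFLOPS = 125  # assumed vendor FP16 tensor-core peak per GPU

node_peak_tflops = GPUS_PER_NODE * V100_TENSOR_PEAK_TFLOPS
node_peak_pflops = node_peak_tflops / 1000
print(f"Per-node peak: {node_peak_pflops:.1f} PFLOPS (FP16 tensor)")  # 2.0
```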
Almost the entire machine learning stack in use is oriented on the Ubuntu operating system as the base platform. The machines utilize a modified server version of an Ubuntu 18.04 LTS operating system. It is supported by Nvidia including graphics accelerators drivers, network software stack, and the necessary tools for maintenance and diagnostics. The package also includes proprietary software from Nvidia CUDA Toolkit, cuDNN, NCCL, and Docker Engine Utility for GPU Nvidia (the entire main machine learning stack runs in containers).
The supercomputer is located in the Sberbank data center (DC) in the Skolkovo Innovation Center in Moscow, Russia. It occupies one machine room and was built in less than a year.
Positions in rankings
September 2021 — 1st of the 50th most powerful computers in CIS;
June 2022 — 80th among the 500 most powerful computers in the world (November 2021 — 72nd).
Christofari Neo
The second supercomputer is also built on Nvidia technology and is equipped with Nvidia A100 GPUs with 80 GB of memory. A high-speed InfiniBand switching network is used for the interconnect, providing data-exchange speeds of up to 1600 GB/s per compute node with minimal latency. The actual performance is 11.95 petaflops.
DGX A100 single node specifications
Maximum Power Consumption — 6.5 kW
CPU — Dual AMD Rome 7742, 128-cores, 2.25 GHz (base), 3.4 GHz (maximum)
GPUs — 8X Nvidia A100, 80 GB
GPU memory — 640 GB
System memory — 2 TB
Positions in rankings
November 2021 — 7th place in the HPL-AI rating of supercomputers and artificial intelligence;
June 2022 — 46th place among the 500 most powerful computers in the world (November 2021 — 43rd).
See also
References
Supercomputers | Christofari | [
"Technology"
] | 1,213 | [
"Supercomputers",
"Supercomputing"
] |
63,102,447 | https://en.wikipedia.org/wiki/Birkhoff%27s%20theorem%20%28equational%20logic%29 | In logic, Birkhoff's theorem in equational logic states that an equality t = u is a semantic consequence of a set of equalities E, if and only if t = u can be proven from the set of equalities. It is named after Garrett Birkhoff.
References
Logic
Formal sciences | Birkhoff's theorem (equational logic) | [
"Mathematics"
] | 64 | [
"Foundations of mathematics",
"Mathematical logic",
"Mathematical problems",
"Mathematical theorems",
"Theorems in the foundations of mathematics"
] |
63,102,592 | https://en.wikipedia.org/wiki/LTT%201445 | LTT 1445 is a triple M-dwarf system distant in the constellation Eridanus. The primary LTT 1445 A hosts two exoplanets—one discovered in 2019 that transits the star every 5.36 days, and another found in 2021 that transits the star every 3.12 days, close to a 12:7 resonance. As of October 2022 it is the second closest transiting exoplanet system discovered, with the closest being HD 219134 bc.
Stellar system
All three stars in the system are M-dwarfs, with masses between 0.16 and 0.26 solar masses. LTT 1445 A and the LTT 1445 BC pair are separated by about 34 astronomical units and orbit each other with a period of about 250 years. The B and C components orbit each other about every 36 years in an eccentric orbit (e ≈ 0.5). The alignment of the three stars, the edge-on orbit of the BC pair, and the existence of transiting planets all suggest that the entire system is co-planar, with orbits in one plane.
The TESS light curve showed stellar flares and rotational modulation due to starspots, likely on either the B or C component.
Planetary system
LTT 1445 Ab
LTT 1445 Ab is an exoplanet located approximately 22 light years away from Earth. Astrophysicists of the Harvard Center for Astrophysics discovered it in June 2019 with data from the Transiting Exoplanet Survey Satellite. The team obtained follow-up observations, including HARPS radial velocity measurements to constrain the mass of the planet.
LTT 1445 Ab takes 5 days to orbit its star, which in turn orbits two sibling stars, making a total of three stars in the system.
In July 2021, the planet's mass was measured from radial velocities, confirming an Earth-like composition.
In 2022, a planetary transmission spectrum showed no evidence for an atmosphere, although an atmosphere with high-altitude hazes cannot yet be ruled out. LTT 1445 Ab likely has a rocky composition, and because it orbits close to the M-dwarf it has a high equilibrium temperature.
LTT 1445 Ac
A second planet, LTT 1445 Ac, was also found in 2021, on a 3.12-day orbital period. Although it transits the star too, its smaller size made it difficult to detect before the radial velocity measurements, and still makes it difficult to estimate its exact size. The planets orbit near a 12:7 orbital resonance with one another, Ac orbiting 11.988 times for every 7 orbits Ab makes, oscillating one full orbit away from a 'perfect' resonance every 104 years. The planet's existence was independently confirmed in 2022.
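The near-resonance can be checked directly from the quoted periods. This small sketch uses the rounded 5.36- and 3.12-day values given above, so the result differs slightly from the more precise 11.988 figure.

```python
# Check how close the two planets are to a 12:7 period commensurability,
# using the rounded orbital periods quoted in the text (in days).
P_ab = 5.36   # LTT 1445 Ab
P_ac = 3.12   # LTT 1445 Ac

ratio = P_ab / P_ac                      # ~1.718, versus 12/7 ~ 1.714
orbits_of_ac_per_7_of_ab = 7 * ratio     # ~12.03 with these rounded periods
print(f"Period ratio: {ratio:.4f} (12/7 = {12/7:.4f})")
print(f"Orbits of Ac per 7 orbits of Ab: {orbits_of_ac_per_7_of_ab:.2f}")
```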
In 2023, observations with the Hubble Space Telescope allowed a more precise determination of the planet's size, supporting a rocky composition for both planets.
LTT 1445 Ad
A third planetary candidate on a 24.3-day orbit, LTT 1445 Ad, was found in 2022. This is a possibly rocky super-Earth orbiting within the habitable zone.
See also
List of star systems within 20–25 light-years
List of nearest exoplanets
EZ Aquarii nearby M-dwarf triple
GJ 1245 nearby M-dwarf triple
Gliese 667 Cc
References
14101
Triple star systems
M-type main-sequence stars
455
Planetary systems with two confirmed planets
3
Eridanus (constellation)
Durchmusterung objects | LTT 1445 | [
"Astronomy"
] | 737 | [
"Eridanus (constellation)",
"Constellations"
] |
63,104,549 | https://en.wikipedia.org/wiki/Xiaomi%20Mi%2010 | The Xiaomi Mi 10 and Xiaomi Mi 10 Pro are high-end Android smartphones developed by Xiaomi Inc. It is the Xiaomi's first high-end smartphone and announced on 13 February 2020.
Specifications
Design
The Mi 10 and Mi 10 Pro use an aluminum frame and Gorilla Glass 5 on the front and rear. The display is curved and larger than the Mi 9; a circular cutout in the upper left hand corner for the front-facing camera replaces the Mi 9's notch. The camera module resembles the Mi CC9 Pro/Mi Note 10 with the accent ring from the Mi 9 around the top sensor, although the flash is located below in place of the macro sensor. The bottom sensor is likewise separate from the main camera array, and both protrude slightly. The Mi 10 is available in Ice Blue, Peach Gold and Titanium Silver, while the Mi 10 Pro is available in Pearl White and Starry Blue.
Hardware
The Xiaomi Mi 10 and Mi 10 Pro are powered by the Qualcomm Snapdragon 865 processor with the Adreno 650 GPU. They have a FHD+ Super AMOLED display with a 90 Hz refresh rate and HDR10+ support, and an optical (in-display) fingerprint scanner. Both have 8 GB or 12 GB of LPDDR5 RAM, and 128 GB, 256 GB or 512 GB of non-expandable UFS 3.0 storage. The Mi 10's battery is 4780 mAh and the Mi 10 Pro's is slightly smaller at 4500 mAh. The devices can be recharged over USB-C at up to 30 W for the Mi 10 and 50 W for the Mi 10 Pro, and can also charge wirelessly at up to 30 W, with reverse wireless charging at 10 W. The Mi 10 and Mi 10 Pro feature quad camera setups. While both come with a 108 MP wide sensor, the Mi 10 has a 13 MP ultrawide sensor and 2 MP macro and depth sensors, while the Mi 10 Pro has a 20 MP ultrawide sensor, a 12 MP portrait sensor and an 8 MP telephoto sensor. Both are capable of recording video at 8K resolution. The front-facing camera on both devices uses a 20 MP sensor.
Software
The devices run on Android 10, with Xiaomi's custom MIUI 11 skin. Later they were updated to MIUI 13 based on Android 12.
As of 2024, Xiaomi Mi 10 and Xiaomi Mi 10 Pro received HyperOS 1 based on Android 13.
Reception
DXOMARK gave the Mi 10 Pro's camera an overall score of 124, with a photo score of 134 and a video score of 104, ranking it as their best smartphone camera at the time.
References
External links
Android (operating system) devices
Phablets
Mobile phones introduced in 2020
Mobile phones with multiple rear cameras
Mobile phones with 8K video recording
Mobile phones with infrared transmitter
Discontinued flagship smartphones
Xiaomi smartphones | Xiaomi Mi 10 | [
"Technology"
] | 608 | [
"Crossover devices",
"Discontinued flagship smartphones",
"Phablets",
"Flagship smartphones"
] |
63,105,084 | https://en.wikipedia.org/wiki/ASUSat | ASUSat (Arizona State University Satellite, also known as ASU-OSCAR 37) was a U.S. amateur radio satellite that was developed and built for educational purposes by students at Arizona State University. It was equipped with two digital cameras for tracking changes to Earth's coasts and forests.
ASUSat was launched on January 27, 2000, along with JAWSAT, on a Minotaur I rocket from Vandenberg Air Force Base near Lompoc, California. Its signal was received in South Africa 50 minutes after launch, and later in New Zealand and the United States. During two passes over Arizona, Arizona State University students were able to receive and control the satellite remotely. A problem with the power supply was reported on the third pass, 14 hours after launch: the solar cells did not provide any electrical energy, so the batteries were exhausted shortly afterwards.
Frequencies
Uplink: 145.900 MHz
Downlink: 436.700 MHz
See also
OSCAR
References
Satellites orbiting Earth
Amateur radio satellites
Nanosatellites
Spacecraft launched in 2000 | ASUSat | [
"Astronomy"
] | 216 | [
"Astronomy stubs",
"Spacecraft stubs"
] |
63,105,185 | https://en.wikipedia.org/wiki/Hydrodimerization | Hydrodimerization is an organic reaction that couples two alkenes to give a symmetrical hydrocarbon. The reaction is often implemented electrochemically; in that case the reaction is called electrodimerization. The reaction can also be induced with samarium diiodide, a one-electron reductant.
Hydrodimerization is the basis of the Monsanto adiponitrile synthesis:
2CH2=CHCN + 2e− + 2H+ → NCCH2CH2CH2CH2CN
The reaction applies to a number of electrophilic alkenes (Michael acceptors).
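Since the Monsanto process consumes two electrons per molecule of adiponitrile, Faraday's law gives the theoretical yield per unit of charge passed. The sketch below assumes 100% current efficiency, an idealisation that real cells do not reach.

```python
# Theoretical adiponitrile yield for a given charge via Faraday's law,
# assuming 100% current efficiency (an idealisation).
F = 96485.0     # Faraday constant, C/mol
ELECTRONS = 2   # e- consumed per adiponitrile molecule (see equation above)
M_ADN = 108.14  # molar mass of adiponitrile NC(CH2)4CN, g/mol

def adiponitrile_grams(charge_coulombs):
    """Mass of adiponitrile (g) producible from a given charge (C)."""
    moles = charge_coulombs / (ELECTRONS * F)
    return moles * M_ADN

print(f"{adiponitrile_grams(1000):.3f} g per 1000 C")  # ~0.560 g
```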
References
Electrochemistry
Industrial processes
Organic chemistry
Chemical synthesis
Chemical processes
Dimers (chemistry) | Hydrodimerization | [
"Chemistry",
"Materials_science"
] | 147 | [
"Physical chemistry stubs",
"Dimers (chemistry)",
"Chemical processes",
"Electrochemistry",
"nan",
"Chemical synthesis",
"Chemical process engineering",
"Polymer chemistry",
"Electrochemistry stubs",
"Chemical process stubs"
] |
63,105,339 | https://en.wikipedia.org/wiki/OSCAR%2044 | OSCAR 44 (also called Navy-OSCAR 44, PCSat-1, Prototype Communications SATellite and NO-44) is an American amateur radio satellite for packet radio. It was built by Bob Bruninga at the U.S. Naval Academy.
The satellite was launched on September 30, 2001 from the Kodiak Launch Complex, Alaska, on an Athena 1 rocket along with the Starshine 3, PICOSat and SAPPHIRE satellites. After the successful launch, the satellite was assigned OSCAR number 44.
The satellite carries a digipeater for APRS in the 2-meter band. OSCAR 44 usually operates with a negative power balance: it is powered by its photovoltaic cells whenever it enters sunlight, and after leaving sunlight it remains active for about another 45 minutes on the battery charged by those cells.
See also
OSCAR
External links
References
Satellites orbiting Earth
Amateur radio satellites
Spacecraft launched in 2001 | OSCAR 44 | [
"Astronomy"
] | 191 | [
"Astronomy stubs",
"Spacecraft stubs"
] |
63,106,053 | https://en.wikipedia.org/wiki/Dolf%20Zillmann | Dolf Zillmann (born March 12, 1935) is dean emeritus, and professor of information sciences, communication and psychology at the University of Alabama (UA). Zillmann predominantly conducted research in media psychology, a branch of psychology focused on the effects of media consumption on human affect, developing and expanding a range of theories within media psychology and communication. His work centred on the relation between aggression, emotion, and arousal through media consumption, predominantly in pornography and violent genres of movie and television. His research also includes the effects of music consumption, video games, and sports.
Zillmann's influence within both the fields of media psychology and communication was highlighted by Ellen Baker Derwin and Janet De Merode finding Zillmann to be the seventh most contributing media psychology author between 1999 and 2010.
Life
Early life and education (1935–1959)
Zillmann was born in the former Province of Brandenburg, in the now-Polish town of Meseritz. His birthplace was heavily contested, changing hands between German, Polish and Soviet forces throughout the Second World War. Much of his early education in the western region of Poland took place in underfunded and understaffed institutions; within Poland, roughly 20 percent of the population over the age of 10 were illiterate. Many of the primary schools had been closed and were instead used as hospitals for soldiers, while those that remained open were heavily shaped by Nazi ideology concerning the education of Slavs.
With his father's conscription into the war and later death, and the fighting in the area where his family resided, Zillmann, along with his mother and sister, spent much of the war fleeing violence, leading to poor living conditions throughout most of his youth despite his family's affluence in Meseritz. Eventually he and his family settled in Marburg, a university town in the Hessen region of Germany. Zillmann was largely self-taught, as widespread post-war resource shortages left few other means of gaining an education.
Zillmann continued on to higher education, studying architecture at the Ulm School of Design, a new school in the Bauhaus tradition established by the Swiss architect Max Bill after the original Bauhaus had been closed by the Nazi authorities. After obtaining his diploma in architecture in 1955, he began working with Max Bill, entering architecture competitions in Zurich, beginning city planning in Isfahan, and designing and planning several public projects in other European cities.
Zillmann returned to formal study at the Ulm School in the fields of communication and cybernetics, engaging with academics in the field outside of Ulm such as the German aesthetics philosopher Max Bense at the University of Stuttgart and the professor of visual science Herbert Schober at the University of Munich. Zillmann obtained his diploma in communication and cybernetics in 1959 and worked as a scientific advisor for a holding company in Zurich from 1959 to 1965, a role that predominantly involved the practical application of communications research to marketing campaigns for several of the company's subsidiaries.
Academic career (1968–2001)
In 1968, Zillmann moved to the United States, to Madison, Wisconsin, where he was a doctoral student in communication and psychology at the University of Wisconsin. He moved to Philadelphia in 1969 to work at the University of Pennsylvania, which houses the oldest psychology department in North America, and acquired a doctorate in communication and social psychology from the Annenberg School there that same year. He worked as an assistant professor there until 1971, and held the position of associate professor from 1971 to 1975, teaching a range of subjects in communication, psychology and general scientific methodology. It was during this time that the underpinnings of his excitation transfer theory were tested and published.
Following his time in Philadelphia, Zillmann accepted an appointment as an associate professor, and subsequently a full professorship in communication and psychology, at Indiana University between 1975 and 1988. While continuing his own research, Zillmann also established the Institute for Communications Research (ICR) at the university, serving as its director from 1974 to 1988. The ICR focuses on communication research and wider social-scientific research on media consumption, operating within Indiana University.
Zillmann then moved to the University of Alabama in Tuscaloosa, assuming the position of professor of communication and psychology and senior associate dean for graduate studies and research in 1989, leading the College of Communication and Information Sciences, from which he would later formally retire.
Research
General
Zillmann has conducted research in media psychology and communication for 30 years developing a wide range of psychological and communications theories and models. His research has been in a range of domains, these include:
Excitation transfer theory
Three-factor theory of emotion
Sex, aggression and emotion
Selective exposure theory
Affective disposition theory
Mood management theory
Entertainment theory
Misattribution theory of humour
Massive-exposure of media effects
Sportsfanship
Empathy theory
Three-factor theory of emotion
The three-factor theory of emotion proposed by Zillmann is an advancement of Schachter's two-factor model, which proposed that emotion and emotional excitation are the product of both one's interoception of internal physiological stimuli (e.g. blushing, sweating or shaking) and the environmental stimuli (e.g. media, people or danger) with which one is currently engaged. This cognitive approach to emotion emphasised context-specific emotional appraisal: cognition relates the stimuli present in the environment to the internal states one experiences in order to formulate the most appropriate emotional response in a particular circumstance. In Schachter's two-factor theory, as Reisenzein (1984) puts it, "physiological arousal is necessary for the experience of an emotion (feeling), but not for emotion-related behaviour". Zillmann combined Schachter's cognitive approach to emotion with Hullian drive theory, particularly Hull's concept of 'excitational residues' in the cognitive process of emotional responses, implying that both present and prior excitatory stimuli affect excitatory levels and emotional cognition. However, Zillmann argues that the individual is usually unable to recognise this residual effect of prior stimuli on the current response, leading to a disproportionate response to a current stimulus. It is from this three-factor theory of emotion that Zillmann would construct the foundation of his excitation transfer theory, which he regards as his most significant contribution to the field of media psychology.
Excitation transfer theory
Zillmann's excitation transfer theory posits that residual emotional excitation from one stimulus is carried over and applied to a subsequent stimulus. The excitatory response to the second stimulus is proportional to the level of residual excitation from the first at the moment of exposure, resulting in an exaggerated response to an otherwise mildly or moderately excitatory stimulus. Zillmann treats excitation as largely undifferentiated across distinguishable emotional states; it remains so until the brain has established which emotional response to a given stimulus is appropriate.
The theory arose in a period of advancing media technology and concern for its effects on the public, particularly children. In 1972, the Surgeon General, Jesse Leonard Steinfeld, delivered a report detailing concerns about the effects of violent media on childhood mental health, and the increased aggressive and antisocial behaviour found to be associated with its consumption. Leo Bogart (1972) drew attention to the findings of David Clark and William Blankenburg, who found that the ratings of violent programming were higher than those of other programming, Bogart stating that "Children's cartoon films are especially violent." This concern was reflected in the leading psychological research of the time, with Albert Bandura's social learning theory providing an experimentally valid association between media consumption and aggressive behaviour in children in what is referred to as his 'Bobo doll experiment', wherein Bandura demonstrated that modelling and imitation of observed aggression occurred after the viewing of a recorded clip of aggressive and violent behaviour. Zillmann's excitation transfer theory served to explain the physiological and neurological underpinnings of the Surgeon General's findings whilst also expanding upon the psychological zeitgeist of the time.
As Zillmann states in a 2002 interview, excitation transfer theory is "a clear mechanism with well-defined, measurable variables for the prediction of effects. [Whilst also having] universal and ubiquitous applicability." Zillmann's research incorporated excitation transfer theory in a range of communications and media psychology studies, constituting what Bryant describes as a "cogent, elegant, and extremely comprehensive theory of communication and emotion that explains and predicts a vast array of human communication behaviours." In his 1971 study on the effects of aggressive, non-aggressive and erotic media, Zillmann found that aggressive stimuli increased excitatory responses more significantly than non-aggressive stimuli, resulting in higher-rated aggressive behaviour. The study also found that erotic stimuli increased aggressive responses even more than the aggressive stimulus itself; Zillmann would continue to study the excitatory effects of pornographic and other erotic material in later research.
In a 1999 study, Zillmann demonstrated the effects of repeated and extended exposure to violent media on hostility in men and women, finding that both provoked and unprovoked participants presented "markedly increased hostile behavior" and that these effects of media were similar across men and women.
Pornography, aggression and emotion
Technological advances in the 1980s made the distribution of erotic media far more accessible and public, leading to large increases in the consumption of pornographic content. In a study conducted with frequent collaborator Jennings Bryant, Zillmann found an increase in the use of pornography among younger age groups, and that the majority of teens and adults had at some point been exposed to pornographic content. Zillmann's research was predominantly concerned with the effects of pornography on both behavioural and attitudinal dispositions. In a 1971 study, pornographic content was found to be more emotionally excitatory in provoking aggressive behaviour than violent television, leading Zillmann to explore this result further. In a 1982 study, again alongside Bryant, Zillmann found that continued 'massive' exposure to pornographic content over six weeks led to a loss of compassion for women as rape victims, an increase in opposition towards women's causes, a disposition towards less severe incarceration sentences for rapists, and a higher degree of callousness toward women overall. Zillmann also observed the effect of frequent pornographic media consumption on viewing habits, finding that, whether as a result of waning interest, increased curiosity, or an entanglement of the two, those who viewed larger quantities of mainstream pornography were more open to niche and fetishised pornographic material, as well as to more violent or aggressive forms.
Zillmann proposes that, just as non-pornographic media can propagate a reality heavily mediated through its curators and themes, leading to a subjective view of the world that diverges from reality, pornography alters perceptions of women, one's sexual expectations, and sexual practice. Due to a lack of 'primary experience', the distorted accounts of friends and family, and the limited academic information available, Zillmann argues that one draws from pornography, which he says "provide[s] the closest approximation to primary experience": what he calls the "pornography answer" to the private world of sexuality. Prolonged exposure to mainstream pornography depicting heterosexual intercourse in casual settings led to an increased devaluation of marriage, emotionally invested relationships, childbirth and child rearing. Instead, participants' world views are altered through continuous exposure to the narrative constructed by the pornography they watch, and they come to regard casual sexual relationships as more enjoyable and risk-free. Zillmann states, "The perceptual and evaluative changes that were evident in both genders are direct reflections of what can be considered the chief proclamation of pornography: great sexual joy without any attachment, commitment, or responsibility."
Zillmann emphasised his dissatisfaction with his research into the effects of pornography as a result of the continuous controversy and backlash that followed the publication of his results. In an interview, he stated, "Our research on the effects of pornography triggered an unimaginable avalanche of hostility from those deeming particular findings inopportune – that is, in conflict with their values regarding sexuality." Zillmann's research was attacked in the media by both liberal and conservative groups, and as a result he discontinued his research into pornography due to threats made against his fellow researchers, only recently resuming work in the area.
Publications
Books
1979 Hostility and Aggression
1984 Connections Between Sex and Aggression
1985 Selective Exposure to Communication
1989 Pornography: Research Advances and Policy Considerations
1991 Responding to the Screen: Reception and Reaction Processes
1994 Media Effects: Advances in Theory and Research
1994 Media, Children, and the Family: Social Scientific, Psychodynamic, and Clinical Perspectives
1998 Connections Between Sexuality and Aggression, 2nd ed.
2000 Media Entertainment: The Psychology of Its Appeal
2000 Exemplification in Communication: The Influence of Case Reports on the Perception of Issues
2002 Media Effects: Advances in Theory and Research, 2nd ed.
2013 Selective Exposure to Communication
Awards
Burnum Distinguished Faculty Award, 2001
References
University of Alabama faculty
1935 births
People from Międzyrzecz
Academic staff of the Ulm School of Design
University of Wisconsin–Madison alumni
University of Pennsylvania faculty
Aggression
Living people | Dolf Zillmann | [
"Biology"
] | 2,811 | [
"Behavior",
"Aggression",
"Human behavior"
] |
63,107,430 | https://en.wikipedia.org/wiki/Octasulfur%20monoxide | Octasulfur monoxide is an inorganic compound with the chemical formula S₈O, discovered in 1972. It is a type of sulfur oxide.
A crystalline compound composed of cyclooctasulfur monoxide and antimony pentachloride in equimolar quantities can be made.
References
External links
Sulfur oxides
Eight-membered rings
Substances discovered in the 1970s | Octasulfur monoxide | [
"Chemistry"
] | 78 | [
"Inorganic compounds",
"Inorganic compound stubs"
] |
63,107,772 | https://en.wikipedia.org/wiki/Complexities%3A%20Women%20in%20Mathematics | Complexities: Women in Mathematics is an edited volume on women in mathematics that "contains the stories and insights of more than eighty female mathematicians". It was edited by Bettye Anne Case and Anne M. Leggett, based on a collection of material from the Newsletter of the Association for Women in Mathematics, and published by Princeton University Press in 2005.
Topics
The book contains over 100 articles, by over 70 authors, divided into five sections. The first of these, "Inspiration", discusses the work of famous women in mathematics
(such as Sofya Kovalevskaya, Julia Robinson, and Emmy Noether) and of women mathematicians from the 18th and 19th centuries, offering insights into their personal life as well as their mathematics. Next, "Joining Together" covers the history of the Association for Women in Mathematics and related topics in the organization of women in mathematics including European Women in Mathematics
and the participation of women at the International Congress of Mathematicians.
The middle section, "Choices and Challenges", covers the problems facing women in contemporary mathematics, and includes a statistical quantification of these problems by Case and Leggett. "Celebration" is a collection of plenary talks and other materials from the Olga Taussky-Todd Celebration of Careers for Women in Mathematics, a conference held in 1999 to celebrate women in mathematics; its plenary speakers included Evelyn Boyd Granville, Lisa Goldberg, Fern Hunt, Diane Lambert, Cathleen Synge Morawetz, Linda Petzold, Helene Shapiro, Richard S. Varga, Margaret H. Wright, and Lani Wu. The final chapter, "Into a New Century", consists of essays by young women mathematicians of the time the book was published, many of them in non-academic careers. A collection of photographs from 1975 to 2003 is included as an appendix.
Despite its material on the difficulties faced by women in mathematics, the tone of the book is "factual and upbeat", in many cases covering ordinary mathematical careers with no overt discrimination, and celebratory rather than encyclopedic.
Audience and reception
The book is aimed at any woman interested in a mathematical career and anyone else "interested in the struggle and development of female mathematicians", and is "intended to encourage young women to enter mathematics". Reviewer Peggy Kidwell suggests that it would be of interest to historians of mathematics in its documentation of many current practices. And reviewer Shandelle Henson recommends it to all professional mathematicians, to provide history and context to the struggles still faced by some of their students, to help face down their own prejudices, and to avoid backsliding in the progress we have made as a society to reduce the obstacles for women in mathematics.
A small complaint of Kidwell is that there is no bibliography of related literature on women in mathematics. A. E. L. Davis, a British reviewer, criticizes the US-centric focus of the book, as does Argentinian-born mathematician Marianne Korten. Davis writes that "only parts" of it would be of interest to British readers, calls the first chapter's coverage of historical women in mathematics "rather disappointing" compared to the more encyclopedic Women of Mathematics: A Bio-Bibliographic Sourcebook by Louise Grinstein and Paul Campbell, and criticizes the Todd Celebration section as too specialized and technical for the audience of the book. Nevertheless, Davis recommends the final section of perspectives from young women mathematicians "to teachers everywhere who would like to promote mathematics to their high-flyers". And both Davis and reviewer Gwen Spencer agree that the book provides "valuable practical advice" to women mathematicians on balancing families and careers and handling two-body job searching, and examples of how to address these issues that may also be helpful for institutions aiming to treat women better. Korten singles out the essays by Cora Sadosky, Susan Landau, Karen E. Smith, and Helen Moore as speaking particularly strongly to her.
More simply, reviewer Erica Voolich, a schoolteacher, writes "The book is exactly what I need in school."
References
Women in mathematics
Mathematics books
2005 non-fiction books
Books about women | Complexities: Women in Mathematics | [
"Technology"
] | 846 | [
"Women in science and technology",
"Women in mathematics"
] |
63,107,907 | https://en.wikipedia.org/wiki/Unification%20of%20theories%20in%20physics | Unification of theories about observable fundamental phenomena of nature is one of the primary goals of physics. The two great unifications to date are Isaac Newton’s unification of gravity and astronomy, and James Clerk Maxwell’s unification of electromagnetism; the latter has been further unified with the concept of electroweak interaction. This process of "unifying" forces continues today, with the ultimate goal of finding a theory of everything.
Unification of gravity and astronomy
The "first great unification" was Isaac Newton's 17th-century unification of gravity, which brought together the understanding of observable gravity on Earth with the observed behaviour of celestial bodies in space.
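The content of this unification can be stated in a single standard formula (modern notation, not Newton's own):

```latex
F \;=\; G\,\frac{m_1 m_2}{r^2}
```

Near the Earth's surface the same law reduces to the familiar weight $F = mg$ with $g = G M_\oplus / R_\oplus^2 \approx 9.8\ \mathrm{m/s^2}$, while applied to the Moon it accounts for its orbit: one inverse-square expression covers both the terrestrial and the celestial regime.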
His work is credited with laying the foundations of future endeavors for a grand unified theory. For example, it has been stated that "If we have to take any single individual as the originator of the quest for a unified theory of physics, and, by implication, the whole of knowledge, it has to be Newton." Physicist Steven Weinberg stated that "It is with Isaac Newton that the modern dream of a final theory really begins".
Unification of magnetism, electricity, light and related radiation
The ancient Chinese observed that certain rocks, such as lodestone and magnetite, were attracted to one another by an invisible force. This effect was later called magnetism, and was first rigorously studied in the 17th century. Even before the Chinese observations of magnetism, however, the ancient Greeks knew of other objects, such as amber, that when rubbed with fur produced a similar invisible attraction. This too was studied rigorously in the 17th century and came to be called electricity. Physics had thus come to understand two observations of nature in terms of distinct root causes (electricity and magnetism). Work in the 19th century, however, revealed that these two forces were just two different aspects of one force: electromagnetism.
The "second great unification" was James Clerk Maxwell's 19th century unification of electromagnetism. It brought together the understandings of the observable phenomena of magnetism, electricity and light (and more broadly, the spectrum of electromagnetic radiation). This was followed in the 20th century by Albert Einstein's unification of space and time, and of mass and energy through his theory of special relativity. Later, Paul Dirac developed quantum field theory, unifying quantum mechanics and special relativity.
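Maxwell's unification is usually summarised by the four equations that bear his name, given here in the modern (Heaviside) SI differential form rather than Maxwell's original formulation:

```latex
\nabla\!\cdot\mathbf{E} = \frac{\rho}{\varepsilon_0}, \qquad
\nabla\!\cdot\mathbf{B} = 0, \qquad
\nabla\times\mathbf{E} = -\frac{\partial\mathbf{B}}{\partial t}, \qquad
\nabla\times\mathbf{B} = \mu_0\mathbf{J} + \mu_0\varepsilon_0\,\frac{\partial\mathbf{E}}{\partial t}
```

In vacuum these combine into a wave equation whose propagation speed is $c = 1/\sqrt{\mu_0\varepsilon_0}$, which is how light itself came to be identified as an electromagnetic wave.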
A relatively recent unification treats electromagnetism and the weak nuclear force as two aspects of the electroweak interaction.
Unification of the remaining fundamental forces: theory of everything
This process of "unifying" forces continues today, with the ultimate goal of finding a theory of everything; it remains perhaps the most prominent of the unsolved problems in physics. There remain four fundamental forces which have not been decisively unified: the gravitational and electromagnetic interactions, which produce significant long-range forces whose effects can be seen directly in everyday life, and the strong and weak interactions, which produce forces at minuscule, subatomic distances and govern nuclear interactions. Electromagnetism and the weak interaction are widely considered to be two aspects of the electroweak interaction. Attempts to unify quantum mechanics and general relativity into a single theory of quantum gravity, a program ongoing for over half a century, have not yet succeeded; current leading candidates are M-theory, superstring theory and loop quantum gravity.
References
Concepts in physics | Unification of theories in physics | [
"Physics"
] | 698 | [
"nan"
] |
63,108,627 | https://en.wikipedia.org/wiki/Tonograph | The tonograph (tonografo) is a device invented by the Italian scientist Luca de Samuele Cagnazzi (1764-1852) and presented at the Terza riunione degli scienziati italiani (the "Third Meeting of Italian Scientists"), held in Florence in September 1841.
The original device was donated by its inventor during the Third Meeting of Italian Scientists. The instrument was afterwards lost, but around 1932, thanks to the work of a scholar, it was found in a cellar and exhibited at the Museo Galileo in Piazza dei Giudici, Florence. The original device is now stored in the Museo Nazionale Scienza e Tecnologia Leonardo da Vinci, Milan. A copy of the device was commissioned by Count Celio Sabini (from Altamura) and is now displayed at the museum Archivio Biblioteca Museo Civico in Altamura.
Making
According to Luca de Samuele Cagnazzi's unpublished autobiography La mia vita, the tonograph was made by Cagnazzi himself with his own hands ("colle mie mani") in 1841. He also wrote a short essay about how it worked and its purpose; this work was first written in Latin and published under the title Tonographiae Excogitatio (1841), since Cagnazzi wanted his invention to be known in what was then Germany. He subsequently translated the essay into Italian on the occasion of the Third Meeting of Italian Scientists, held in Florence, where he presented his invention.
According to Cagnazzi, his device was very much appreciated during the Third Meeting.
Working principle
The tonograph is a device consisting of a hollow brass cylindrical section closed at one end and equipped with a hole. The cylinder is in all respects similar to an organ pipe. Air driven by foot-operated bellows flows through the cylindrical tube, producing a sound. Inside the cylinder is a piston whose position is regulated by a thin rod; as the position of the piston changes, the effective length of the cylinder also changes, and the instrument generates a different sound. A graduated scale makes it possible to "measure" the intonation and inflection of the human voice by matching one's voice with the sound made by the device.
The scale provided by Cagnazzi apparently also related to the harmonic and diatonic scales employed in music. Cagnazzi assumed that there is roughly an inverse proportionality between the length of a closed cylindrical tube and the frequency of the sound it produces. On this assumption he defined the width of the scale and related the scale of the device to the musical scale.
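The inverse proportionality Cagnazzi relied on can be illustrated with the textbook relation for a stopped pipe. This is a sketch only: the ideal closed-at-one-end pipe formula f = v / (4L) is assumed, end corrections are ignored, and Cagnazzi's actual scale construction is not reproduced.

```python
# Illustrative sketch of the acoustics behind the tonograph's graduated scale.
# Assumption: the tube behaves as an ideal stopped (closed-at-one-end) pipe,
# so the fundamental is f = v / (4 * L); end corrections are ignored.
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C (assumed value)

def stopped_pipe_frequency(length_m: float) -> float:
    """Fundamental frequency (Hz) of an ideal pipe closed at one end."""
    return SPEED_OF_SOUND / (4.0 * length_m)

# Sliding the piston changes the effective length and hence the pitch;
# halving the length doubles the frequency (inverse proportionality):
for length_m in (0.8, 0.4, 0.2):
    print(f"L = {length_m:.1f} m -> f = {stopped_pipe_frequency(length_m):6.1f} Hz")
```

Each piston position thus corresponds to one effective length and one pitch, which is what makes a graduated scale along the rod meaningful as a "measurement" of vocal tone.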
The device was meant not only to measure, but also to preserve the tones and inflections of human voice (for example, by transcribing them above or below a text). Therefore, it also represents, in a broad sense, a device which helps store some kind of information.
During the presentation of the device at the Third Meeting of Italian Scientists in 1841, during which the device was donated to the department, Professor Giovanni Alessandro Majocchi praised Cagnazzi for his invention, as it provided declamation schools with a way to record the tone and intensity of the human voice precisely and successfully. The diatonic and chromatic scales of music did not have enough notches to represent the tone and intensity of the human voice accurately. During the presentation, the chemist Giuseppe Gazzeri objected that a mechanical device could never make a sound similar to the human voice, as the material of which the human phonation system is made and the materials of which a mechanical device is made are intrinsically different.
Majocchi himself replied to Gazzeri's objection by explaining that sound is described by three factors, namely tone, intensity and timbre. The tone depends on the sound's frequency, the intensity is the "strength" of a sound, while the timbre depends on the material of the "sounding body". Different sounds, such as those of a double bass and a bell, may have the same tone and intensity, but they are often perceived as different sounds; the difference is given by the timbre. Since intensity and tone, but not timbre, are meaningful and employed in acting schools, Gazzeri's objection was, according to Majocchi, unreasonable "by itself".
Example of use
An example of the use of the tonograph is provided by Cagnazzi himself. The syllables of each word and each sentence of a text are pronounced slowly enough to be imitated with the device. Once the sound of the device seems closest to the human voice pronouncing the syllable, the corresponding value on the scale is transcribed below the syllable. At a lower level, the number of bellows pressures carried out for each word is also recorded. The "measurement" of the voice requires much diligence and a number of attempts before reaching a precise result.
Luca de Samuele Cagnazzi provides an example of the use of the tonograph based on a notable verse by Ennius (Andromache):
From the example above, it is clear that the purpose of the device is to help store the features of the human voice, its tones and music. Cagnazzi shaped his experiment on the basis of the information provided by Cicero on how the above verse was pronounced in the classical era.
Cagnazzi faced the impossibility (with some exceptions) of faithfully reconstructing the tones and inflections of the voice that the Ancient Greeks and Romans used in their surviving works. In the first part of his essay Tonografia escogitata (1841), the inventor himself made some acute observations on both linguistics and music; he also explained, in the preface, the purpose of his work as well as of his invention:
Previous attempts
According to its inventor, there had been some previous attempts to faithfully transcribe the tones of acting. Some had been carried out by the Académie des inscriptions et belles-lettres of Paris, whose perpetual secretary Charles Pinot Duclos wrote that the abbot Jean-Baptiste Dubos had proposed creating a group of experts in the field of music in order to identify and distinguish fractions of the human voice's diatonic scale.
Nevertheless, the Académie did not succeed in this purpose, since human ears (even those of the most skilled listeners) cannot go beyond a certain level of precision without a proper device. The failure led the Académie to conclude that distinguishing between fractions of the diatonic scale was simply impossible, and alternative methods based on science and mathematics were not taken into account. Cagnazzi compared the Académie to the fox in Phaedrus's fable, who said that the grapes were unripe because he could not reach them.
See also
Luca de Samuele Cagnazzi
Declamation
Acoustics
Physics
References
Bibliography
External links
LombardiaBeniCulturali - Tonografo
Strumentidellascienza.edu
Fonografo di Luca De Samuele Cagnazzi
Musical instruments
Linguistics
Acoustics
History of physics | Tonograph | [
"Physics"
] | 1,498 | [
"Classical mechanics",
"Acoustics"
] |
63,108,647 | https://en.wikipedia.org/wiki/Thierry%20Poinsot | Thierry Poinsot (born 22 March 1958) is a French researcher, research director at the CNRS, researcher at the Institute of Fluid Mechanics in Toulouse, scientific advisor at CERFACS and senior research fellow at Stanford University. He has been a member of the French Academy of Sciences since 2019.
Biography
An engineer from École Centrale de Paris (1980; now CentraleSupélec), he obtained a doctorate in engineering in 1983 and a state thesis in 1987 before working at Stanford for two years (1988-1990). He currently works in Toulouse. His areas of expertise are fluid mechanics, combustion, propulsion, acoustics and high-performance computing.
Professional positions
Poinsot has taught since 1980 at École Centrale Paris, Stanford, ISAE and ENSEEIHT in Toulouse, Princeton, Tsinghua, Kanpur, CISM, and the von Karman Institute. He was head of the MIR (reactive media) group at the Institute of Fluid Mechanics in Toulouse from 2010 to 2017 and a member of the scientific council of PRACE from 2008 to 2013.
He has been a consultant for IFP Energies Nouvelles, Air Liquide, Siemens, Daimler, and John Zink; senior research fellow at the Center for Turbulence Research at Stanford since 1990; scientific advisor at CERFACS since 1992; chief editor (with Prof. F. Egolfopoulos, University of Southern California) of Combustion and Flame since 2013; an expert for the European Commission's ERC (European Research Council) programmes since 2014; and a member of the Board of Directors of the Combustion Institute since 2016.
Scientific contributions
His work focuses mainly on combustion, fluid mechanics and energy, using both experiments and theoretical methods. He also relies on high-performance numerical simulation, which consists in creating 'virtual' digital twins of real systems (such as an airplane or helicopter engine) on supercomputers that now comprise several million processor cores (see Top500).
After his PhD thesis on the physical mechanisms controlling the curing of tyres (carried out for Michelin), he developed experimental and theoretical studies of combustion instabilities and their control in aeronautical engines under the direction of Sébastien Candel at the EM2C laboratory of Centrale Paris. He has also developed models for turbulent combustion.
During his two-year postdoctoral fellowship at Stanford, he set up the first direct numerical simulations of turbulent flames. These early academic simulations paved the way for numerical simulation tools for real combustion chambers, which run on the largest computers available today and are used to compute French aeronautical combustion chambers (rockets, helicopters, aircraft, furnaces). In addition to this numerical simulation work, he has also developed theoretical and experimental activities on combustion at the IMFT.
He is currently interested in aeronautical engines and the energy generation systems of the future as well as in the storage of renewable energies using hydrogen. He has made a major contribution to the pooling of major numerical simulation codes for fluid mechanics in France and Europe and his codes are used by hundreds of researchers and engineers. His work has been supported since 2013 by two European ERC (European Research Council) projects: INTECOCIS and SCIROCCO.
He is the author or co-author of Theoretical and Numerical Combustion (with D. Veynante), a textbook on combustion, and of 220 articles in peer-reviewed journals.
Awards
CNRS Bronze medal in 1988.
Best DRET researcher in 1991.
First Cray prize in 1993.
Edmond Brun Prize of the French Academy of sciences in 1996.
First BMW prize for the supervision of B. Caruelle's thesis in 2002.
Grand Prix de l'Académie des sciences, Paris, 2003.
AIAA Associate Fellow in 2003.
CNRS 'Prime d'excellence scientifique' in 2009–2013.
ERC advanced grant in 2013 on thermoacoustic instabilities.
ERC advanced grant in 2019 on hydrogen storage of renewable energy
Hottel plenary lecture at the 36th Symp.(Int.) Comb. 2016 (Seoul).
Zeldovich Gold medal of the Combustion Institute, 2016.
Fellow of the Combustion Institute in 2018.
EPSC Award in 2021
References
External links
http://www.cerfacs.fr/~poinsot
1958 births
Living people
French physicists
French National Centre for Scientific Research awards
Fellows of the Combustion Institute
Research directors of the French National Centre for Scientific Research
École Centrale Paris alumni
Members of the French Academy of Sciences | Thierry Poinsot | [
"Chemistry"
] | 892 | [
"Fellows of the Combustion Institute",
"Combustion"
] |