World's longest lichen declines in a national park

A unique long-term study by researchers at Umeå University shows that the pendent lichen Usnea longissima has decreased by 42% over 37 years in Skuleskogen National Park, located in the High Coast UNESCO World Heritage Site. The study has been published in the journal Forest Ecology and Management.

"It is well known that pendent lichens decrease in managed forests. This study suggests that the long-term survival of red-listed lichens may be threatened also in forests that have strong protection," says Per-Anders Esseen, professor emeritus at the Department of Ecology and Environmental Science, Umeå University.

Usnea longissima (Methuselah's beard lichen) was probably the original "tinsel" placed on Christmas trees and can reach a length of several meters. It grows in old, humid spruce forests and is an important indicator of forest biodiversity. It is red-listed as vulnerable and protected by law in Sweden. Sweden and Norway host the largest occurrences in Europe and therefore have a particular responsibility to protect the lichen.

"Long-term data on the spatial dynamics of populations of red-listed species are fundamental for understanding and predicting how these species respond to global change drivers. Such knowledge is also needed to develop effective conservation measures," says Per-Anders Esseen.

Detailed inventory

The researchers performed a detailed inventory of U. longissima in Skuleskogen in 1984. A total of 355 trees hosting the lichen were tagged with an aluminum plate buried in the ground. The inventory was repeated in 2021, using a metal detector to relocate the plates. The researchers found that the lichen had gone extinct on 81% of the tagged trees. Extinction on trees that were still standing (stochastic extinction) was more common than extinction caused by treefalls (deterministic extinction).
A total of 207 newly colonized trees were also detected, reflecting substantial turnover of host trees within local populations.

The study also provides key findings about the spatial dynamics of U. longissima in forest landscapes and shows that the lichen is strongly dispersal-limited. The lichen disperses mainly via larger fragments, which moved only a few meters over the 37 years. This poor dispersal explains the lichen's strong dependence on long continuity of forest cover and its preference for sites not subjected to fire. It also explains why the distribution of the lichen in the national park was stable over the study period.

Lichens are complex partnerships between fungi, photobionts (algae or cyanobacteria) and other bacteria. They lack roots and take up water passively. Thin pendulous lichens such as Usnea longissima are particularly sensitive to environmental hazards such as air pollution, forestry and climate change, yet they are vital components of forest canopies worldwide. Pendent lichens contribute to nutrient cycling in forests and provide habitat for insects and spiders. They also constitute important fodder for reindeer in winter, when ground lichens are inaccessible.

Pollution and climate change

Data on the composition and age of the forests indicate extensive harvesting between 1860 and 1900 in Skuleskogen, but the researchers found no evidence of large-scale disturbances during the last 80 years. Instead, the decline of U. longissima was probably driven by a combination of air pollution (mainly nitrogen deposition), climate change with milder, more snow-rich winters and summer heat waves, and denser forests. The lichen is also threatened by storms and fires, says Esseen.

The study highlights the necessity of developing a comprehensive action plan to secure the long-term survival of this unique lichen in Sweden, says Esseen.
It is also important to start a national program for monitoring red-listed lichens in both protected and managed forests.

More information: P.-A. Esseen et al, Long-term dynamics of the iconic old-forest lichen Usnea longissima in a protected landscape, Forest Ecology and Management (2023). DOI: 10.1016/j.foreco.2023.121369

Journal information: Forest Ecology and Management

Provided by Umeå University
Environmental Science
Workers install solar panels at the Port of Los Angeles in California. Mario Tama/Getty Images

The United States is poised to make much deeper cuts to the pollution that's fueling global warming than it was even a couple of years ago. That's largely because of the billions of dollars the country is spending on green technologies through the Inflation Reduction Act (IRA), which Congressional Democrats passed last summer, according to a new report from Rhodium Group.

The research firm says that by 2030, the U.S. could lower its greenhouse gas emissions by 29% to 42% compared to 2005 pollution levels. At the start of the Biden administration, Rhodium Group analysts said it looked like the country would be able to cut its emissions by only about a quarter, at most. The changed outlook reflects expectations that huge investments by the federal government will make things like renewable energy and electric vehicles a lot more affordable.

But big barriers still stand in the way. Companies that build wind and solar plants often struggle to get projects permitted by local governments because of public opposition. And there are long waiting lines to plug power plants and batteries into the country's electric grids. To make the kinds of emissions cuts that Rhodium Group says are possible, the U.S. will have to at least match its best-ever year for wind and solar development, and it will have to do it year after year.

And even if everything goes right, it still won't be enough to deliver on a pledge the U.S. made under the 2015 Paris Agreement to cut its emissions in half by the end of this decade. Meeting that target will require even more aggressive actions by states and the federal government, Rhodium Group says.
"You're gonna need to figure out how to build out a whole bunch of wind and solar, get a bunch of electric vehicles on the road and that kind of thing," says Ben King, an associate director in the firm's energy and climate practice. "The IRA is the push, the economic push that you need, and you just gotta clear the way for it and not let it encounter so many headwinds," King adds.

A recent report from the United Nations warned that the world is running out of time to keep temperatures from rising to levels that could be catastrophic for many places. The Earth is already nearly 2 degrees Fahrenheit warmer than it was in the late 1800s, and it's on track to exceed 5 degrees Fahrenheit of warming by the end of the century, according to the U.N. Beyond about 2.8 degrees Fahrenheit of warming, storms, heat waves and other climate impacts become far more destructive.

Limiting the rise in global temperatures will require an international response. But as the largest historical contributor to climate change, the U.S. "needs to lead that effort," says Aiguo Dai, a professor of atmospheric and environmental science at the University at Albany. "If the U.S. can start cutting down the emissions, steadily year over year, decade over decade, then we are on the right path to limit global warming," Dai says.

However, scientists say time is of the essence. At the current slow pace at which countries are cutting emissions, warming is on track to trigger runaway impacts that could lead to permanent changes in the Earth's ecosystems. "If we cut it too [slowly], it could be difficult to avoid catastrophic warming in the near future," Dai says.
Environmental Science
PFAS Found in Blood of Dogs, Horses Living Near Fayetteville, N.C.

For Immediate Release

In a new study, researchers from North Carolina State University detected elevated PFAS levels in the blood of pet dogs and horses from Gray’s Creek, N.C. – including dogs that only drank bottled water. The work establishes horses as an important sentinel species and is a step toward investigating connections between PFAS exposure and liver and kidney function in dogs and horses.

The study included 31 dogs and 32 horses from the community and was conducted at the behest of community members concerned about their pets’ well-being. All of the households in the study were on well water, and all of the wells had been tested and deemed PFAS-contaminated by state inspectors. The animals received a general veterinary health check and had their blood serum screened for 33 different PFAS chemicals, chosen based on their presence in the Cape Fear River basin and the availability of analytical standards.

Of the 33 PFAS of interest, researchers found 20 in the animals. All of the animals in the study had at least one chemical detected in their blood serum, and over 50% of the dogs and horses had at least 12 of the 20 detected PFAS. PFOS, a long-chain PFAS used for years in industrial and commercial products, had the highest concentrations in dog serum. The perfluorosulfonic acid PFHxS, a surfactant used in consumer products and firefighting foams, was detected in dogs but not horses. Consistent with wells being the known contamination source, some ether-containing PFAS, including HFPO-DA (colloquially known as GenX), were detected only in dogs and horses that drank well water.

In dogs who drank well water, median concentrations of two of the PFAS – PFOS and PFHxS – were similar to those of children in the Wilmington GenX exposure study, suggesting that pet dogs may serve as an important indicator of household PFAS.
Dogs who drank bottled water, on the other hand, had a different profile of PFAS in their blood serum; even so, 16 of the 20 PFAS detected in this study were found in the dogs who drank bottled water.

Overall, horses had lower concentrations of PFAS than dogs, though the horses did show higher concentrations of Nafion byproduct 2 (NBP2), a byproduct of fluorochemical manufacturing. The finding suggests that contamination of the outdoor environment, potentially from deposition of PFAS onto forage, contributed to their exposure.

“Horses have not previously been used to monitor PFAS exposure,” says Kylie Rock, postdoctoral researcher at NC State and first author of the work. “But they may provide critical information about routes of exposure from the outdoor environment when they reside in close proximity to known contamination sources.”

Finally, the veterinary blood chemistry panels for the animals showed changes in diagnostic biomarkers used to assess liver and kidney dysfunction, two organ systems that are primary targets of PFAS toxicity in humans.

“While the exposures that we found were generally low, we did see differences in concentration and composition for animals that live indoors versus outside,” says Scott Belcher, associate professor of biology at NC State and corresponding author of the work. “The fact that some of the concentrations in dogs are similar to those in children reinforces the fact that dogs are important in-home sentinels for these contaminants,” Belcher says. “And the fact that PFAS is still present in animals that don’t drink well water points to other sources of contamination within homes, such as household dust or food.”

The work appears in Environmental Science and Technology and was supported by the National Institute of Environmental Health Sciences and the North Carolina Policy Collaboratory.

-peake-

Note to editors: An abstract follows.
“Domestic Dogs and Horses as Sentinels of Per- and Polyfluoroalkyl Substance (PFAS) Exposure and Associated Health Biomarkers in Gray’s Creek, North Carolina”

Authors: Kylie D. Rock, Madison E. Polera, Hannah M. Starnes, Scott M. Belcher, North Carolina State University; Theresa C. Guillette, Oak Ridge Institute for Science and Education Research Participation Program; Kentley Dean, Southern Oaks Animal Hospital; Mike Watters, Debra Stevens-Stewart, Gray’s Creek Residents United Against PFAS in Our Wells and Rivers

Published: June 21 in Environmental Science and Technology

Abstract: Central North Carolina (NC) is highly contaminated with per- and polyfluoroalkyl substances (PFAS), in part due to local fluorochemical production. Little is known about the exposure profiles and long-term health impacts for humans and animals that live in nearby communities. In this study, serum PFAS concentrations were determined using liquid chromatography high-resolution mass spectrometry, and diagnostic clinical chemistry endpoints were assessed for 31 dogs and 32 horses that reside in Gray’s Creek, NC, at households with documented PFAS contamination in their drinking water. PFAS were detected in every sample, with 12 of the 20 PFAS detected in ≥50% of samples from each species. Average total PFAS concentrations in horses were lower than in dogs, which had higher concentrations of PFOS (dogs 2.9 ng/mL; horses 1.8 ng/mL), PFHxS (dogs 1.43 ng/mL; horses <LOD), and PFOA (dogs 0.37 ng/mL; horses 0.10 ng/mL). Regression analysis highlighted alkaline phosphatase, glucose, and globulin proteins in dogs and gamma-glutamyl transferase in horses as potential biomarkers associated with PFAS exposure. Overall, the results of this study support the utility of companion animal and livestock species as sentinels of PFAS exposure differences inside and outside of the home. As in humans, renal and hepatic health in domestic animals may be sensitive to long-term PFAS exposures.
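The abstract's detection summary (12 of the 20 PFAS found in at least half of samples) reduces to a per-compound detection frequency. A minimal sketch of that calculation in Python; the serum values below are invented for illustration (non-detects stored as None), not data from the study:

```python
# Hypothetical serum screens: one list of readings (ng/mL) per compound,
# with None marking a sample below the limit of detection.
serum = {
    "PFOS":  [2.9, 3.1, None, 2.2],
    "PFHxS": [1.4, None, None, 1.5],
    "GenX":  [None, None, 0.3, None],
}

def detection_frequency(readings):
    """Fraction of samples with a quantifiable concentration."""
    detected = sum(1 for r in readings if r is not None)
    return detected / len(readings)

# Compounds detected in at least half of the samples, mirroring the
# ">=50% of samples" criterion used in the abstract.
frequent = [c for c, r in serum.items() if detection_frequency(r) >= 0.5]
```

With these made-up numbers, PFOS is detected in 3 of 4 samples and PFHxS in exactly half, so both clear the 50% bar while GenX does not.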
Environmental Science
The International Space Station (ISS) has been home to more than 250 astronauts who have lived and worked on board the orbiting lab for months at a time. Unlike homes on Earth, however, the space station has been found to harbor elevated levels of chemical contamination lingering in its dust.

A group of scientists analyzed a dust sample from the air filters on board the ISS and found higher concentrations of potentially harmful chemical compounds than those found in floor dust in most American households. The latest findings were published Tuesday in Environmental Science and Technology Letters and could have implications for the design of future space stations that will succeed the ISS.

Some of the contaminants found in the space dust, such as polycyclic aromatic hydrocarbons (PAHs), per- and polyfluoroalkyl substances (PFAS), and polychlorinated biphenyls (PCBs), have been banned or had their use limited due to their potential effects on human health. Some PAHs, for example, can increase the risk of cancer. PFAS, also known as “forever chemicals,” are problematic on account of their persistence in the environment, potential for bioaccumulation, and association with various adverse health effects in humans.

The team of researchers behind the study, however, points out that while the concentrations of the chemicals found on the ISS were higher than those found in most homes, “levels of these compounds were generally within the range found on earth,” the scientists write.

How did these contaminants make their way to the ISS in the first place? It could be due to the use of inorganic flame retardants like ammonium dihydrogen phosphate to render fabrics and webbing on board the space station nonflammable, the researchers suggest. The chemical contaminants could also have arrived with the astronauts, who sometimes board the space station with commercial devices like cameras, MP3 players, tablet computers, or clothing.
The ISS is equipped with a ventilation and contaminant removal system, which absorbs carbon dioxide and gaseous contaminants and recirculates the air eight to 10 times an hour. The researchers, however, are not exactly sure how efficient this system is at removing the chemical contaminants detected in the space station’s dust.

The ISS is set to retire in 2030, but the space station serves as a blueprint for the design of future orbital labs currently in the making. “Our findings have implications for future space stations and habitats, where it may be possible to exclude many contaminant sources by careful material choices in the early stages of design and construction,” Stuart Harrad, a researcher at the University of Birmingham and co-author of the new paper, said in a statement.
Environmental Science
WASHINGTON – A peer-reviewed study by scientists at the Environmental Working Group estimates that more than 200 million Americans could have the toxic fluorinated chemicals known as PFAS in their drinking water at a concentration of 1 part per trillion, or ppt, or higher. Independent scientific studies have recommended a safe level for PFAS in drinking water of 1 ppt, a standard that is endorsed by EWG.

The study, published today in the journal Environmental Science & Technology Letters, analyzed publicly accessible drinking water testing results from the Environmental Protection Agency and U.S. Geological Survey, as well as state testing by Colorado, Kentucky, Michigan, New Hampshire, New Jersey, North Carolina and Rhode Island.

“We know drinking water is a major source of exposure to these toxic chemicals,” said Olga Naidenko, Ph.D., vice president for science investigations at EWG and a co-author of the new study. “This new paper shows that PFAS pollution is affecting even more Americans than we previously estimated. PFAS are likely detectable in all major water supplies in the U.S., almost certainly in all that use surface water.”

The analysis also included laboratory tests commissioned by EWG that found PFAS chemicals in the drinking water of dozens of U.S. cities. Some of the highest PFAS levels detected were in samples from major metropolitan areas, including Miami, Philadelphia, New Orleans and the northern New Jersey suburbs of New York City.

There is no national requirement for ongoing testing and no national drinking water standard for any PFAS. The EPA has issued an inadequate lifetime health advisory level of 70 ppt for the two most notorious fluorinated chemicals, PFOA and PFOS, and efforts to set an enforceable standard could take many years. In the absence of a federal standard, states have started to pass their own legal limits for some PFAS.
New Jersey was the first to issue a maximum contaminant level for the compound PFNA, at 13 ppt, and has set standards of 13 ppt for PFOS and 14 ppt for PFOA. Many states have either set or proposed limits for PFOA and PFOS, including California, Massachusetts, Michigan, New Hampshire, New Jersey, New York and Vermont.

“The first step in fighting any contamination crisis is to turn off the tap,” said Scott Faber, EWG senior vice president for government affairs. “The second step is to set a drinking water standard, and the third is to clean up legacy pollution. The PFAS Action Act passed by the House would address all three steps by setting deadlines for limiting industrial PFAS releases, setting a two-year deadline for a drinking water standard, and designating PFAS as ‘hazardous substances’ under the Superfund law. But Mitch McConnell’s Senate has refused to act to protect our communities from ‘forever chemicals.’”

PFAS are called forever chemicals because they are among the most persistent toxic compounds in existence, contaminating everything from drinking water to food, food packaging and personal care products. They are found in the blood of virtually everyone on Earth, including newborn babies. They never break down in the environment. Very low doses of PFAS chemicals in drinking water have been linked to suppression of the immune system and are associated with an elevated risk of cancer and reproductive and developmental harms, among other serious health concerns.

“When we look for PFAS contamination, we almost always find it,” said David Andrews, Ph.D., a senior scientist at EWG and one of the co-authors. “Americans should trust that their water is safe, but far too many communities have water supplies polluted by toxic PFAS chemicals. These are some of the most insidious chemicals ever produced, and they continue to be used.
Our analysis was largely limited to PFOA and PFOS, but many more PFAS are found to contaminate drinking water, and the entire class of PFAS chemicals is a concern.” The EPA has identified over 600 PFAS in active use in the U.S.

According to the most recent analysis of state and federal data by EWG, 2,230 locations in 49 states are known to have PFAS contamination, including more than 300 military installations.

PFAS contamination has raised alarms among a bipartisan group of lawmakers in Congress. The PFAS Action Act also includes a provision that would set a two-year deadline for the EPA to establish a national drinking water standard for the two most notorious PFAS chemicals – PFOA, formerly used to make DuPont’s Teflon, and PFOS, formerly an ingredient in 3M’s Scotchgard.

“It’s not too late for this Congress to protect us from the growing PFAS contamination crisis,” Faber said.

###

The Environmental Working Group is a nonprofit, non-partisan organization that empowers people to live healthier lives in a healthier environment. Through research, advocacy and unique education tools, EWG drives consumer choice and civic action.
Environmental Science
Droughts can be good for trees. Certain trees, that is. Contrary to expectation, sometimes a record-breaking drought can increase tree growth. Why and where this happens is the subject of a new paper in Global Change Biology.

A team of scientists led by Joan Dudney at UC Santa Barbara examined the drought response of endangered whitebark pine over the past century. They found that in cold, harsh environments -- often at high altitudes and latitudes -- drought can actually benefit the trees by extending the growing season. This research provides insights into where the threats from extreme drought will be greatest, and how different species and ecosystems will respond to climate change.

Many factors can constrain tree growth, including temperature, sunlight and the availability of water and nutrients. The threshold between energy-limited and water-limited systems turns out to be particularly significant. Trees that try to grow in excessively cold temperatures -- often energy-limited systems -- can freeze to death. On the other hand, too little water can also kill a tree, particularly in water-limited systems. Over time, many tree species have adapted to these extreme conditions, and their responses are broadly similar. They often reduce growth-related activities, including photosynthesis and nutrient uptake, to protect themselves until the weather improves.

"Interestingly, the transition from energy- to water-limited growth can produce highly unexpected responses," explained Dudney, an assistant professor in the Bren School of Environmental Science & Management and the Environmental Studies Program. "In cold, energy-limited environments, extreme drought can actually increase growth and productivity, even in California."

Dudney and her colleagues extracted 800 tree cores from whitebark pine across the Sierra Nevada, comparing the tree rings to historical records of climate conditions.
This climate data spanned 1900 to 2018 and included three extreme droughts: 1959-61, 1976-77 and 2012-15. They recorded where tree growth and temperature showed a positive relationship, and where the relationship was negative. The authors found a pronounced shift in growth during times of drought when the average maximum temperature between October and May was roughly 8.4° Celsius (47.1° Fahrenheit). Above this threshold, extreme drought reduced growth and photosynthesis. Below this temperature, trees grew more in response to drought.

"It's basically, 'how long is the growing season?'" Dudney said. Colder winters and higher snowpack often lead to shorter growing seasons that constrain tree growth. Even during an extreme drought, many of the trees growing in these extreme environments did not experience high water stress. This surprised the team of scientists, many of whom had observed and measured the unprecedented tree mortality that occurred at slightly lower elevations in the Sierra Nevada.

Dudney was curious whether drought impacts growth in just the main trunk, or the whole tree. Without more data, the trends they saw could be a result of disparate processes all responding to the drought differently, she explained. Fortunately, whitebark pine retains its needles for roughly eight years, providing additional data that could address the question.

The researchers shifted their attention from dendrology to chemistry. Atoms of the same element can have different weights, or isotopes, thanks to the number of neutrons they contain. Several aspects of a plant's metabolism can influence the relative abundance of heavy carbon-13 and light carbon-12 in tissues such as leaves and needles. These changes provide a rough guide to the amount of water stress a tree experienced during drought. This was a boon for the researchers, because isotopic data from the pine needles spanned drought and non-drought years.
Analyzing needle growth and carbon and nitrogen isotopes revealed that the whole tree was affected by the threshold between water-limited and energy-limited systems. Trunk growth, needle growth, photosynthesis and nutrient cycling all responded in opposite directions to drought above and below the threshold.

The future of whitebark pine is highly uncertain. The species -- recently listed as threatened under the Endangered Species Act -- faces many threats, including disease, pine beetle infestation and impacts from altered fire regimes. It's clear from this research that drought and warming will likely exacerbate these threats in water-limited regions, but warming may be beneficial for growth in energy-limited environments.

"This research can help develop more targeted conservation strategies," said Dudney, "to help restore this historically widespread tree species." Indeed, the pine's range encompasses a diverse region, stretching from California to British Columbia and east to Wyoming.

The findings also have broader implications. Approximately 21% of forests are considered energy-limited, and an even higher percentage can be classified as water-limited, so transitions between these two climatic regimes likely occur around the globe. What's more, the transition seems to have an effect on nitrogen cycling. Trees in water-limited environments appeared to rely less on symbiotic fungi for nitrogen, which is critical for tree growth in harsh, energy-limited environments.

"Droughts are leading to widespread tree mortality across the globe," Dudney said, "which can accelerate global warming." Deciphering the many ways trees respond to drought will help us better predict where ecosystems are vulnerable to climate change and how to develop more targeted strategies to protect our forests.
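The temperature threshold reported in the study lends itself to a simple decision rule. A minimal sketch in Python, assuming the roughly 8.4 °C October-May mean maximum temperature cutoff and the growth directions described above; the function name and return strings are invented for illustration:

```python
# Approximate October-May mean maximum temperature separating
# energy-limited from water-limited whitebark pine sites (from the study).
THRESHOLD_C = 8.4

def drought_growth_response(mean_oct_may_tmax_c):
    """Qualitative growth response to extreme drought at a site.

    Below the threshold (cold, energy-limited sites), drought tends to
    extend the growing season and increase growth; above it (warmer,
    water-limited sites), drought reduces growth and photosynthesis.
    """
    if mean_oct_may_tmax_c < THRESHOLD_C:
        return "energy-limited: growth tends to increase under drought"
    return "water-limited: growth tends to decrease under drought"
```

For example, a cold high-elevation site averaging 5 °C would be classified as energy-limited, while a warmer site at 12 °C would be classified as water-limited.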
Environmental Science
Earth Day in the year of the farm bill

Earth Day reminds us of the importance of the soil, water and air upon which our health, prosperity and food security depend. The good news is that, on average, humans are better nourished now than at any time in human history, although nearly 1 billion people are still undernourished. The bad news is that our agricultural achievements have come at the expense of extensive air and water pollution and a changing climate.

Crop and livestock systems always entail environmental compromises when native lands are converted to agriculture and when fertilizers are applied to increase crop production. Excess nutrients run off of farmlands and animal production facilities, ending up in streams, groundwater and estuaries, where they pose human health risks and promote damaging algal blooms. Food production contributes between 21 percent and 37 percent of the total global greenhouse gases causing global warming.

One of the greatest challenges of our time is to produce abundant, affordable and nutritious food for today’s 8 billion people, and for the 10 billion to 11 billion people expected within the next 30 to 50 years, while also sustainably stewarding land and water resources and a stable climate so that future generations will enjoy the soils, water and climate required to meet their needs for food, good health and economic prosperity.

Every five years, the U.S. Congress establishes policies and funding through the giant legislative package known as the “farm bill,” and we are in one of those years. The urgency for effective policies that serve the dual goals of agronomic productivity and environmental stewardship, including large reductions in greenhouse gas emissions, could not be greater.
More good news is that the 2023 Farm Bill will likely devote considerable attention to advancing “climate-smart agriculture” and “regenerative agriculture.” As usual, however, the devil and the goodness are in the details, including what these popular catchphrases really mean and parsing out which ideas are most worthy of support. There are dozens of potential interventions that this year’s farm bill could adopt, but three rise to the level of being truly transformational, significantly slowing climate change while increasing agricultural productivity.

1) Reduce methane emissions from livestock by more than half through existing technologies and accelerated research and development. Because methane is relatively short-lived in the atmosphere, reducing its sources would almost immediately slow climate change. While much methane mitigation focuses on the energy sector, livestock is the single largest source of methane, mainly the burps of cattle, sheep and other ruminants. Unfortunately, reducing livestock methane emissions has received paltry funding despite some promising developments with feed additives and an unrealized potential from state-of-the-art research on the rumen microbiome. An added bonus is that the energy cows now lose as they produce methane could be redirected to producing more milk and beef, reaping an economic benefit for farmers and a more efficient food system for humanity.

2) Give USDA a mandate to harmonize and coordinate monitoring, reporting and validation (MRV) of tradeable soil carbon credits. In contrast to methane, the proliferation of private and public money wishfully chasing soil carbon credits has gotten ahead of the soil science, agronomy and social science needed to design and implement a credible system for effectively farming for carbon.
The current Wild West marketplace of soil carbon credits urgently needs an MRV sheriff if this outpouring of good intentions is to actually lead to verifiable sequestration of carbon in agricultural soils and the accompanying improved soil fertility and economic benefits to farmers.

3) Authorize and increase funding for the Agricultural Advanced Research and Development Authority (Ag-ARDA) to transform management of the nitrogen cycle in agriculture. Such ground-breaking innovations include making fertilizers using renewable energy, feeding amino acids and other forms of nitrogen directly to livestock, and breeding crop varieties that make more efficient use of nitrogen. Such developments would start decoupling food production from its current dependency on inefficient uses of nitrogen that wreak havoc on the land, air and water, including emissions of nitrous oxide, a potent greenhouse gas and stratospheric ozone destroyer. It is difficult to estimate how big the effect could be — but reducing nitrate leaching and nitrous oxide emissions by half seems within reason. At the same time, such transformational improvements in efficiency would boost yields, profitability and food security, as well as reduce global demand for converting forests to croplands.

The 20th-century Earth Day discussions often focused on debates about protecting the environment or promoting economic productivity — but not both. In the 21st century, Earth Day is a time to say: Yes, we can promote productive agriculture to meet human needs, and we can do so intelligently as good stewards of the Earth.

Eric A. Davidson is a professor at the University of Maryland Center for Environmental Science, principal scientist for Spark Climate Solutions and the author of “Science for a Green New Deal: Connecting Climate, Economics and Social Justice,” Johns Hopkins University Press, 2022.

Copyright 2023 Nexstar Media Inc. All rights reserved.
This material may not be published, broadcast, rewritten, or redistributed.
Environmental Science
Hidden in plain sight: Windshield washer fluid is an unexpected emission source Exhaust fumes probably come to mind when considering vehicle emissions, but they aren't the only source of pollutants released by a daily commute. In a recent Environmental Science & Technology study, researchers report that alcohols in windshield washer fluid account for a larger fraction of real-world vehicle emissions than previous estimates have suggested. Notably, the levels of these non-fuel-derived gases will likely remain unchanged, even as more drivers transition from gas-powered to electric vehicles. Cars' average carbon dioxide emissions have dropped by 25% since the early 2000s, according to the U.S. Environmental Protection Agency, but this gas only accounts for part of the total. Another important component of emissions is volatile organic compounds (VOCs), a broad classification of carbon-based molecules that are easily vaporized and that can contribute to ozone formation. While some VOCs are released in exhaust, others may arise from an unexpected source—the products used for "car care," such as windshield washer fluid. Estimates from a national inventory of manufacturer statistics in the U.K. showed that car-care products could be an even greater source of VOCs than exhaust, but these numbers had never been verified experimentally. So, Samuel Cliff and coworkers decided to measure the amounts of vaporized windshield washer fluid ingredients from cars on a real-world road and compare them to the inventory estimates. To measure the VOCs actually emitted by vehicles, the researchers outfitted a van with several instruments, including a mass spectrometer, and parked it near a busy roadway. By comparing the van's measurements with those from a university site with minimal traffic influence, they calculated the average amount of vapor given off per car for each kilometer traveled for several key VOCs. 
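The "amount of vapor given off per car for each kilometer traveled" is an emission factor. A crude way to see how such a number can be backed out of roadside measurements is a simple box model. Everything below (the function, its parameters, and the concentration values) is an invented illustration, not the study's actual method, which accounted for dilution and traffic far more carefully:

```python
# Rough box-model estimate of a fleet-average emission factor (mg per vehicle-km)
# from roadside vs. background concentration measurements. All input values are
# made up for illustration only.

def emission_factor_mg_per_vkm(c_road_ugm3, c_bg_ugm3,
                               mixing_height_m, wind_speed_ms,
                               vehicles_per_hour):
    """Pollutant flux out of a roadside 'box', per metre of road length,
    divided by the vehicle-kilometres driven in that metre per second."""
    enhancement_ugm3 = c_road_ugm3 - c_bg_ugm3           # concentration above background
    flux_ug_s_m = enhancement_ugm3 * mixing_height_m * wind_speed_ms
    vkm_per_s_m = vehicles_per_hour / 3600 * (1 / 1000)  # each vehicle drives 1 m = 0.001 km
    return flux_ug_s_m / vkm_per_s_m / 1000              # convert ug to mg

# Hypothetical ethanol concentrations and traffic conditions:
ef_ethanol = emission_factor_mg_per_vkm(
    c_road_ugm3=6.0, c_bg_ugm3=4.0,
    mixing_height_m=10.0, wind_speed_ms=2.0,
    vehicles_per_hour=1800)
print(f"ethanol: ~{ef_ethanol:.0f} mg per vehicle-km")
```

The key idea survives the simplification: whatever the fleet emits shows up as a concentration enhancement over background, and dividing the implied flux by traffic volume yields a per-vehicle-kilometre figure that can be compared directly against inventory estimates.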
The measured values matched inventory estimates for aromatic compounds that are commonly monitored and regulated, but those for alcohols—key ingredients in windshield washer fluid—far exceeded inventory numbers. In fact, the release of two alcohols, ethanol and methanol, was nearly twice the amount of all VOCs released in exhaust. The discrepancy in alcohol emissions could be accounted for by including solvents from car-care products in the inventory estimations, suggesting that these products are a significant, if unexpected, source of vehicle-derived pollutants. The researchers say that this finding has implications for future regulatory policy especially as drivers transition to electric vehicles, which may have fewer emissions from fuels but will still need clean windshields. More information: Samuel J. Cliff et al, Unreported VOC Emissions from Road Transport Including from Electric Vehicles, Environmental Science & Technology (2023). DOI: 10.1021/acs.est.3c00845 Journal information: Environmental Science & Technology Provided by American Chemical Society
Researchers have found that one method of reducing greenhouse gas emissions is available, affordable, and capable of being implemented right now. Nitrous oxide, a potent greenhouse gas and ozone-depleting substance, could be readily abated with existing technology applied to industrial sources. "The urgency of climate change requires that all greenhouse gas emissions be abated as quickly as is technologically and economically feasible," said lead author Eric Davidson, a professor with the University of Maryland Center for Environmental Science. "Limiting nitrous oxide in an agricultural context is complicated, but mitigating it in industry is affordable and available right now. Here is a low-hanging fruit that we can pluck quickly." When greenhouse gases are released into the atmosphere, they trap the heat from the sun, leading to a warming planet. In terms of emissions, nitrous oxide is third among greenhouse gases, topped only by carbon dioxide and methane. Also known as laughing gas, it has a global warming potential nearly 300 times that of carbon dioxide and stays in the atmosphere for more than 100 years. It also destroys the protective ozone layer in the stratosphere, so reducing nitrous oxide emissions provides a double benefit for the environment and humanity. Nitrous oxide concentration in the atmosphere has increased at an accelerating rate in recent decades, mostly from increasing agricultural emissions, which contribute about two-thirds of the global human-caused nitrous oxide. However, agricultural sources are challenging to reduce. In contrast, for the industry and energy sectors, low-cost technologies already exist to reduce nitrous oxide emissions to nearly zero. Industrial nitrous oxide emissions from the chemical industry are primarily by-products from the production of adipic acid (used in the production of nylon) and nitric acid (used to make nitrogen fertilizers, adipic acid, and explosives). 
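The "nearly 300 times" figure is a global warming potential (GWP), and converting an emission to CO2-equivalent terms is a one-line multiplication. The plant emission below is hypothetical; 273 is the IPCC AR6 100-year GWP for nitrous oxide:

```python
# CO2-equivalent of a hypothetical industrial N2O emission.
GWP100_N2O = 273                     # IPCC AR6 100-year global warming potential of N2O

n2o_tonnes_per_year = 1_000          # hypothetical plant emission, tonnes N2O/yr
co2e_tonnes_per_year = n2o_tonnes_per_year * GWP100_N2O
print(f"{n2o_tonnes_per_year} t N2O/yr = {co2e_tonnes_per_year:,} t CO2e/yr")
```

This is why abating even modest industrial N2O streams is such low-hanging fruit: each tonne avoided counts for hundreds of tonnes of CO2-equivalent.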
Emissions also come from fossil fuel combustion used in manufacturing and internal combustion engines used in cars and trucks. "We know that abatement is feasible and affordable. The European Union's emissions trading system made it financially attractive to companies to remove nitrous oxide emissions in all adipic acid and nitric acid plants," said co-author Wilfried Winiwarter of the International Institute for Applied Systems Analysis. "The German government is also helping to fund abatement of nitrous oxide emissions from nitric acid plants in several low-income and middle-income countries." The private sector could also play a key role in nitrous oxide emissions reduction, encouraged by trends in consumer preferences for purchasing climate-friendly products. For example, 65% of the nitrous oxide emissions embodied in nylon products globally are attributable to passenger cars and light vehicles. Automobile manufacturers could require supply chains to source nylon exclusively from plants that deploy efficient nitrous oxide abatement technology. "Urgent abatement of industrial sources of nitrous oxide" is published in Nature Climate Change by Eric Davidson of the University of Maryland Center for Environmental Science, Spark Climate Solutions, Wilfried Winiwarter of the International Institute for Applied Systems Analysis, Austria, and the Institute for Environmental Engineering, University of Zielona Góra, Poland. Story Source: Materials provided by University of Maryland Center for Environmental Science.
Image: Ten years of progress - forest ecosystem restoration on an abandoned agricultural field at Mon Cham, northern Thailand, by Chiang Mai University's Forest Restoration Research Unit. Credit: Stephen Elliott. On average, about half of trees planted in tropical and sub-tropical forest restoration efforts do not survive more than five years, but there is enormous variation in outcomes, new research has found. The study analysed tree survival and growth data from 176 restoration sites in tropical and sub-tropical Asia, where natural forests have suffered degradation. The team found that, on average, 18% of planted saplings died within the first year, rising to 44% after five years. However, survival rates varied greatly amongst sites and species, with some sites seeing over 80% of trees still alive after five years, whereas at others, a similar percentage had died. The findings are published today in the Philosophical Transactions of the Royal Society B: Biological Sciences. Forest restoration is a powerful tool to tackle biodiversity loss and climate change, by locking away carbon and supporting important habitats. Reforestation projects are also used widely for carbon offsetting. While the main measurement used for many projects is the number of trees initially planted, the research shows that many of these trees are not surviving long-term. In some sites, survival rates were high, showing that with the right approach restoration has the potential to be successful. About 15% of the world’s tropical forests are found in Southeast Asia and they are amongst the most carbon-dense and species-rich in the world, providing habitat for tigers, primates and elephants. However, in recent decades the region has also seen major deforestation, with forest cover reducing by an estimated 32 million hectares between 1990 and 2010. The region has therefore become an important focus for forest restoration projects. 
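The averages of 18% mortality in year one and 44% by year five imply that losses are heaviest immediately after planting. A short calculation makes this explicit, assuming (purely for illustration) a constant annual mortality rate over years two to five; on that assumption the later annual rate works out to roughly 9%, about half the first-year rate:

```python
# Survival implied by the study's average figures: 18% of saplings die in
# year 1, and 44% have died by year 5. Assuming (for illustration only) a
# constant annual survival rate r over years 2-5, we have s1 * r**4 = s5.
s1 = 1 - 0.18        # fraction surviving year 1 (study average)
s5 = 1 - 0.44        # fraction surviving to year 5 (study average)

r = (s5 / s1) ** 0.25            # constant annual survival rate, years 2-5
print("first-year mortality:              18.0%")
print(f"implied annual mortality, yrs 2-5: {1 - r:.1%}")
```

The real mortality trajectory at any one site will differ, but the aggregate numbers alone suggest that the first year after planting is the critical window for protection and maintenance.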
The research – by an international team of scientists from 29 universities and research centres – is the first to bring together data to evaluate the long-term outcomes of restoration projects. Dr Lindsay Banin, co-lead author based at the UK Centre for Ecology & Hydrology, said: “The large variability in survival we found across sites could be for a number of reasons, including planting densities, the choice of species, the site conditions, extreme weather events or differences in management and maintenance. Local socio-economic factors may also be important. What’s clear is that success is very site-dependent – we need to understand what works and why and share that information, so we can bring all sites up to the level of the most successful and harness the full potential for restoration. There’s likely no one-size-fits-all approach and restoration action should be tailored to local conditions. This will help ensure the scarce resources and land available to restoration are used to best effect.” The team found that, when an area had been fully deforested, reforestation efforts were less successful than in areas where some trees remained. Saplings planted in areas with existing mature trees had roughly 20% higher chance of survival. In more disturbed areas, more intensive measures for protection and maintenance may be needed. The study also found some evidence that active restoration provides faster results than simply letting nature take its course. Sites which included tree planting activities gained forest cover more quickly than sites which were left to regenerate naturally. But many more studies tracked the fate of planted trees rather than structural properties of the whole community. The research team believes that collating both types of data in the same study areas will help to determine acceptable levels of mortality that will still deliver a return of forest cover. 
More experiments are needed to help hone the most appropriate and cost-effective methods of restoration across sites under different conditions. Prof David Burslem, co-author based at the University of Aberdeen in the UK, said: “The sites where active restoration is most needed – those that have already been cleared of trees – are also those where restoration is most risky and prone to higher numbers of trees dying. We need to understand better how to improve the survival chances of saplings on these sites, to ensure restoration has positive outcomes. But the study also provides a warning, to protect our remaining forests as much as possible, both because restoration outcomes are uncertain and to provide the diverse seed sources needed for restoration activities.” Prof Robin Chazdon, a co-author based at the University of the Sunshine Coast, Queensland, Australia, said: “Replanting is only going to be an answer to excess carbon dioxide in the atmosphere if we can guarantee that carbon is being successfully drawn out of the atmosphere and locked away – and be able to quantify the amounts and timescales involved. This is why assessing restoration outcomes over the long-term, and gathering information that helps to maximise success rates, are so important. We need the focus to shift away from simply planting trees toward growing them and helping our forests thrive.” The study was supported by UKRI Natural Environment Research Council funding. Notes to editors: “The road to recovery: a synthesis of outcomes from ecosystem restoration in tropical and sub-tropical Asian forests” by Banin et al is published in the Philosophical Transactions of the Royal Society B. 
DOI: 10.1098/rstb.2021.0090 The institutions involved in the research were: UK Centre for Ecology & Hydrology, UK; University of Exeter, UK; University of the Sunshine Coast, Australia; Nanyang Technological University, Singapore; University of Brighton, UK; Biomathematics and Statistics Scotland, UK; Permian Global Research Limited, UK; Swedish University of Agricultural Sciences, Sweden; Research Centre for Ecology and Ethnobiology, Indonesia; National University of Singapore, Singapore; University of Dundee, UK; Chiang Mai University, Thailand; PT Restorasi Ekosistem Indonesia; Universiti Malaysia Sabah, Malaysia; Borneo Orangutan Survival Foundation, Indonesia; University of Stirling, UK; University of Oxford, UK; Nature Conservation Foundation, India; ETH Zürich, Switzerland; Bioversity International, Italy; Forever Sabah, Malaysia; Universidad Rey Juan Carlos, Spain; South East Asia Rainforest Research Partnership, Malaysia; Yayasan Sabah Group, Malaysia; University of Göttingen, Germany; University of Cambridge, UK; Centre for International Forestry Research (CIFOR), Indonesia; University of Sheffield, UK; University of Aberdeen, UK. About the UK Centre for Ecology & Hydrology (UKCEH) The UK Centre for Ecology & Hydrology is a centre for excellence in environmental science across water, land and air. Our 500-plus scientists seek to understand the environment, how it sustains life and the human impact on it, so that together, people and nature can prosper. We have a long history of investigating, monitoring and modelling environmental change, and our science makes a positive difference in the world. www.ceh.ac.uk / Twitter: @UK_CEH / LinkedIn: UK Centre for Ecology & Hydrology Journal: Philosophical Transactions of the Royal Society of London B Biological Sciences Article Title: The road to recovery: a synthesis of outcomes from ecosystem restoration in tropical and sub-tropical Asian forests Article Publication Date: 14-Nov-2022
CNN  —  Today’s kids have the power and skills to change the world. But when it comes to topics like the climate crisis, reaching children in a way that sparks their interest — and empowers them to believe they can truly make a difference — can be tricky. “Kids need to hear about climate change. But the greatest danger is that it just sounds so dreadful and people tune it out, and that’s bad news,” explained Anita Sanchez, author of the new book “MELTDOWN: Discover Earth’s Irreplaceable Glaciers and Learn What You Can Do to Save Them,” designed for young readers ages 8 to 12. Featuring emotional and informative illustrations by Brooklyn-based artist Lily Padula, the book taps into the sheer beauty and wonder of the world’s glaciers, paired with accessible science to give kids tools they can use to help save them. Sanchez was on a “typical tourist trip to Iceland,” visiting the Vatnajokull glacier — Iceland’s largest ice cap — when she realized there was a children’s book in what she was witnessing. “We were on the moraine, the land near the glacier,” said Sanchez, who worked for 25 years for the New York State Department of Environmental Conservation and has written 19 books on environmental science. “It was muddy, dirty and gray. Then, when we came to the side of the glacier itself, there was a sense of climbing up onto the back of a big animal.” Glacier mice, which are clumps of moss Sanchez said looked like “green furry ping pong balls,” were being blown across the landscape in herd-like formations. “There was this freshness to the air, there was a small crevasse you could look into and see this really deep sapphire blue, down deep,” she said. “I came off the glacier going, ‘There’s gotta be a book here somewhere.’” At first, she thought she’d create a picture book on the topic, before realizing she “had to go bigger and deeper for slightly older kids.” Middle-grade students, she decided, were the ideal audience to reach. 
“They’re old enough to really take in some tough science but still young enough to have that enthusiasm and love of animals and adventure that some of us lose when we reach adulthood,” Sanchez said. “Kids need to get excited about the wild places of the world before they become activists to help preserve them.” Speaking with CNN, Sanchez shared how she reaches out to young readers about the climate crisis, how she’s inspired by ecosystems that glaciers support and how adults can inspire the young people in their lives to act, too, in the most accessible ways. This conversation has been edited and condensed for clarity. CNN: How did you make the concept of climate change relatable and interesting to younger readers? Anita Sanchez: This is the toughest book I’ve ever written because there’s a lot of tough science — what climate change is, why it’s happening, and what the evidence is that glacial melt is human caused. So I tried to imagine I’m talking to readers rather than writing a textbook. I approached the book and topic like a fiction writer. The first paragraph describes the sounds — crunch, crunch, crunch — your crampons make as you walk in the hard-packed snow of a glacier. There’s the sense of the glacier being something alive. You don’t actually feel it moving, but there’s this sense that it’s different from a big pile of snow. You get the sense that it’s powerful and that you’re in a dramatic place. It’s like being on the back of a big frozen animal. Throughout the book, I try to make the glacier a sort of character, even though the book is non-fiction. Indigenous people who have lived with glaciers have beliefs that glaciers speak and that they can hear us. When you hear the sounds they make, it’s impressive. I was visiting Northern Cascades National Park in Washington state in early May, when the (annual spring) melt had started, and there was crackling and roaring and a big ‘hmmm’ sound as part of the glacier broke off and came crackling down. 
It was like a distant thunderstorm, a sense of power. I try to get these things across (to my readers). CNN: What do you think might surprise young readers to learn about glaciers in the book? Sanchez: To keep it relevant to young readers, I keep bringing the focus back to the animals that depend on glaciers. In Washington state, the salmon depend on glacial melt, and grizzly bears are in turn dependent on salmon. There are ice worms that live on glaciers, they’re as tiny as an eyelash. And rosy finches depend on those ice worms for food. … I try to make the glaciers seem alive to make them more relatable, and along with it the message of why we need to save them. CNN: For most people, the chance to walk on a real glacier will remain a pipe dream. What are some ways to be awed by nature wherever we live? Sanchez: Ideally, I wish every child could take a walk on a glacier. Sadly, it’s not that accessible for most people. In the United States, you need to go to a national park in the northwest or to Alaska. To places like North Cascades National Park or Glacier National Park in Montana. For many people, it’s beyond their means. I usually write about places closer to (my) home (in New York), so even for me glaciers were a stretch. And while it’s wonderful to write about glaciers and rainforests and faraway places, I think that real love of the outdoors can come from your backyard, your local park, the squirrel at your local nature center. I’ve taught for 25 years in the outdoor realm, and so many kids have barely been off the sidewalks. Getting out into any form of nature is a real adventure for them. CNN: What are some of the tools your book gives young readers to take action to save the glaciers? Sanchez: I think most people of any age hear the words ‘climate change’ on the news and reach for the mute button. It’s like we can’t take any more bad news. 
I really think the way to inspire activism and this message that we conserve only what we love is to get excited about the beauty and fun of nature. It starts with the small stuff, like working with your family to live more sustainably. Think about where you buy your food, where does your food come from? The book explains (the concept of your) carbon footprint. But realistically, we need bigger action. We need to harness the power of our governments to take action and get big corporations to take action. I really stress the power of voting. Kids can’t vote, of course, but they can be huge advocates for getting people to vote — and for candidates who take strong action on climate change. Kids can find out how registration works in their state. How city ballots work in their state. It’s all online, and most kids are computer savvy. Do you know where your polling place is, do you know the hours? Many kids are masters of social media and can help spread these messages. They really know how to use social media to spread the word in a way my generation doesn’t. There are, of course, limits to what a 10-year-old can do. But they have power, even if they don’t know they have it. CNN: What’s the most important message you hope young readers will take away from your book? Sanchez: The last section of the book is about how to help, how to become an activist. Kids need to get active. They can really make a difference. I think it’s easy for kids to feel like they have no power, or they have no say. That scientists know everything, that everything has been done. But it’s not true. There are so many more mysteries. Nobody knows how ice worms survive on glaciers, for example, how they manage not to freeze. For kids, this world is their future. They really need to be active, and there are so many things they can do. One letter from a kid to a school board or to an editor can really start the ball rolling for change in their town. 
I would love to hear from just one young person that reading the book made them take a definite action, made them do something. It doesn’t matter what, it’s that first baby step.
Scientists advocate synergistic approach to address climate change and air pollution in China In a new research perspective published in Environmental Science and Ecotechnology, a research team emphasizes that the key to a synergistic approach lies in understanding that carbon dioxide and air pollutants predominantly originate from the same sources, namely, the combustion and use of fossil fuels. Therefore, policies designed to mitigate climate change and control air pollution can generate considerable synergies. In particular, as China has committed to peaking its CO2 emissions by 2030 and achieving carbon neutrality by 2060, the team believes that these goals can serve as powerful drivers for future air quality improvement. To facilitate the tracking and analysis of the synergistic governance of air pollution and climate change, the researchers developed a suite of 18 indicators. These indicators span five critical areas: air pollution and associated weather-climate conditions; progress in structural transition; sources, sinks, and mitigation pathway of atmospheric composition; health impacts and benefits of coordinated control; and synergistic governance system and practices. One of the key insights from the research perspective is the recognition of the intricate links between climate change and air pollution. Changes in meteorological factors induced by climate change can significantly impact the formation, accumulation, and dispersion of air pollution. Simultaneously, these changes can affect emissions from natural sources like vegetation, dust, and wildfires, which significantly contribute to air pollution. In terms of public health, both climate change and air pollution pose substantial risks. Increased frequency of extreme weather events, higher risk of infectious diseases, and exposure to air pollutants can all lead to increased mortality and morbidity. 
Recognizing this, the researchers argue that public health protection should serve as the starting point for coordinated governance. The researchers further underline the importance of aligning the control measures for greenhouse gases and air pollutants, given their shared sources and processes. Strategies that target the reduction of fossil fuel consumption and carbon emissions will inherently mitigate air pollutant emissions, offering synergistic benefits for air quality improvement. However, the team also cautions that future carbon sink changes should be factored into planning carbon reduction pathways to avoid compromising air quality. The research perspective also identifies considerable health and economic benefits that can arise from a synergistic approach to addressing climate change and air pollution. These include reducing the incidence of extreme weather events, saving pollution control costs, improving the structure of the economy, promoting new industries, and creating jobs. In terms of governance, the researchers advocate for a system that coordinates climate change mitigation and air pollution control, integrating strategic planning, laws, regulations, standards, and economic policies. They recommend piloting this approach in select cities and industries to gain practical experience before a broader implementation. The research team's ultimate goal is to establish a theoretical framework for the synergetic governance of carbon neutrality and clean air, identify potential challenges in developing a synergetic roadmap for China, and provide corresponding policy recommendations. This comprehensive, synergistic approach could be a game-changer in China's ongoing battles against air pollution and climate change. More information: Qiang Zhang et al, Synergetic roadmap of carbon neutrality and clean air for China, Environmental Science and Ecotechnology (2023). DOI: 10.1016/j.ese.2023.100280 Provided by Chinese Society for Environmental Sciences
New research this month adds more fuel to the debate over gas stoves. The study found that gas and propane stoves emitted detectable amounts of benzene, a common air pollutant and carcinogen. Benzene levels were often above recommended safety thresholds and could linger for hours after the stoves were turned off in some homes. Earlier this year, the Consumer Product Safety Commission made headlines after one of its commissioners, Richard Trumka Jr, suggested that the agency might explore a number of options to better regulate gas stoves due to their health impacts, up to and including a ban. Trumka later clarified that any such ban would only apply to new gas stoves, though that wasn’t enough to dissuade the wave of right-wing lawmakers and pundits who fearmongered about the government marching in to steal people’s stoves. Eventually, the chair of the CPSC, Alexander Hoehn-Saric, walked back Trumka’s comments and stated that a ban was not on the table currently. Controversy aside, there has been growing research pointing to the dangers of gas stoves. A study last December, for instance, estimated that one in every eight cases of childhood asthma in the U.S. could be attributed to the indoor pollution caused by gas stoves. The authors of this new study, published in the journal Environmental Science & Technology, decided to take a closer look at one of the specific pollutants linked to stoves: benzene. Benzene is a colorless or slightly yellowish flammable liquid at room temperature. It’s naturally found in the environment and is also commonly used in manufacturing. But it’s toxic in high doses, causing symptoms like dizziness, vomiting, and tremors, and long-term exposure can raise a person’s risk of developing certain cancers. The researchers went into 87 homes throughout California and Colorado that had gas, propane, or other types of stoves. 
They then measured benzene levels directly emitted by the stoves across a variety of scenarios, such as after turning on the burners to high temperatures. In some homes, they also measured airborne concentrations of benzene in the kitchen and as far away as the bedroom. The team found that gas and propane stoves consistently produced noticeable levels of benzene, with emissions 10 to 25 times higher than those from electric coil or radiant stoves. Meanwhile, neither induction stoves nor the foods they cooked produced detectable benzene. Often, these levels were higher than the safety benchmarks for indoor exposure established by the European Union and other countries. The researchers also found that benzene levels could reach these unsafe exposure levels in the kitchen and bedroom, depending on the size of the house and ventilation conditions. In some cases, people could be exposed to benzene outside of the kitchen even hours after the stove was no longer in use. “Combustion of gas and propane from stoves may be a substantial benzene exposure pathway and can reduce indoor air quality,” the authors wrote. This is only one study, so its findings shouldn’t be taken as gospel. But other research has similarly shown that gas stoves can release unhealthy levels of benzene into the home, and they aren’t good for our environment either. Even if you disagree with an outright ban on gas stoves, it’s tough to argue that we wouldn’t be better off with fewer of these pollution-spewing appliances in our homes.
Chestnuts roasting on an open fire kindle iconic imagery of the season. However, the American chestnut (Castanea dentata) that once produced them is all but absent from the Appalachian forests that it once dominated. An invading fungus in the early 20th century virtually eliminated the American chestnut from its native range, spanning from Georgia to Maine. Forests east of the Mississippi River were once filled with old-growth American chestnut trees. Credit: KnoxTNToday. Over the last several decades, conservation-minded scientists developed a biotechnology-based solution that would return the tree to this critical ecosystem. Tested, validated and poised for deployment, the effort now rests in the uncertain hands of regulators. Your voice may be instrumental in their decision. A USDA/APHIS public comment period is open only until December 27, 2022. Regulators place significant weight on sound scientific arguments. However, good evidence and reason are usually rare among the avalanche of boilerplate anti-GMO pseudoscience spam that pollutes the space of scientific discourse. Your words will stand out with exceptional impact. Here’s what you need to know. Over the last twenty years, efforts at the nexus of conservation and biotechnology have genetically engineered (GE) the American chestnut to resist the fungus. The goal is to repatriate the forests and restore them to pre-1900s composition, adding back a key species that supported significant biological diversity. Scientists developed the GE trees at the State University of New York College of Environmental Science and Forestry Experiment Station in Syracuse, NY. After years of study, they are poised to be released. The genetic tweak is the addition of a gene from wheat known as oxalate oxidase. The fungal pathogen generates oxalic acid to break the defenses of wild chestnut trees. 
The genetically engineered trees fight back by producing the added oxalate oxidase, a natural enzyme that breaks down oxalic acid. The tree doesn’t kill the pathogen, which remains part of the ecosystem, and because the solution removes the acid rather than attacking the pathogen, there is virtually no chance of the fungus becoming resistant. The added gene simply protects the GE tree against fungal invasion. It works beautifully in the lab and in field trials — the fungus and the tree living in perfect harmony. Photo, taken in the mid- to late 19th century, shows how large and abundant the American chestnut was in the forests of the eastern United States. Credit: Forest History Society, Durham, NC. The critics’ response has been typical, with claims that the development is too fast, too risky and “we just don’t know” what might happen. Many suggest breeding in resistance genes from the Chinese chestnut as an alternative, believing that introduction of thousands of genes from another part of the world is somehow less risky than maintaining the American chestnut’s genetic integrity, except for the addition of a single natural plant gene to solve one easy-to-solve problem. But the GE trees have been extensively tested in the lab and controlled field trials. Scientists understand where the gene is inserted and observe no collateral effects. The trees have been carefully examined for unanticipated effects against fungal symbionts, insect herbivores, pollinating bees, and other animals that may consume the plant materials. The trees were shown to present no risk above traditionally bred trees, a view shared by the USDA after extensive review. A typical USDA/APHIS public comment period on a GE crop is usually splattered with the diversionary rhetoric of patents, Monsanto, profits, monocultures and health claims. 
The GE chestnut, developed by state university faculty, was created to restore critical ecosystems, enhance biodiversity, and fortify native forests against an invasive pathogen. No patents, no Monsanto, no profits: just GE trees crossing with wild relatives and new Chinese-American chestnut hybrids to expand the native genetics.  The GE chestnut represents a stellar application of biotechnology, and your note of support will help regulators justify their decision to allow widespread release.  Again, the public comment page is here. Even a sentence or two may have a profound positive impact.  Kevin M. Folta is a professor, keynote speaker and podcast host. Follow Professor Folta on Twitter @kevinfolta
Environmental Science
A recent study published in the Journal of Environmental Psychology indicates that spacious natural landscapes can enhance feelings of selflessness and connectedness and boost positive emotions. The research, conducted using immersive virtual reality technology, found that participants felt a diminished sense of body boundaries, leading to increased selflessness. Mental health issues, such as stress, anxiety, and depression, are on the rise globally. Understanding how nature can contribute to alleviating these conditions is of paramount importance, especially in a world where many people live in urbanized, high-stress environments. “Rumination, excessive worrying, not being able to stop worrying: these are things I get back from a lot of people/students, and I’m also myself struggling with that from time to time,” said study author Thomas J.L. van Rompay, an associate professor at the University of Twente. Nature has long been celebrated for its soothing and restorative qualities, but the precise mechanisms behind these effects have remained a subject of scientific inquiry. Researchers have sought to understand how different aspects of natural settings influence our mental state and overall well-being. Previous studies have explored the concept of awe—a profound emotional experience often associated with nature—and its potential benefits, such as increased feelings of selflessness and connectedness. However, many of these studies focused on iconic natural wonders like the Grand Canyon, leaving questions about the effects of more common natural settings unanswered. The researchers sought to explore how different aspects of nature could influence our emotional state and connectedness with the world around us. They were particularly interested in understanding the role of spaciousness in natural environments. 
Van Rompay and his colleagues designed an experiment that combined the power of immersive virtual reality (VR) with insights from psychology and environmental science. The study involved 80 participants, primarily students, who were invited to explore virtual natural environments created by the researchers. The participants were divided into four groups, each experiencing a different combination of spaciousness and nature type. Some explored dense natural landscapes, while others ventured into spacious ones. Additionally, the researchers considered two types of nature settings: tended and wild. The tended landscapes featured signs of human intervention, such as paths, while the wild ones remained untouched by human hands. Each participant’s journey into these virtual landscapes was meticulously crafted to maintain a sense of realism and immersion. The researchers measured various aspects of the participants’ mental well-being to gauge the impact of spaciousness and nature type. Participants in the spacious condition experienced a reduction in the salience of their perceived body boundaries. Participants also reported a greater sense of selflessness (e.g. “I felt my sense of self shrink”) when immersed in spacious natural environments compared to dense ones. Importantly, perceived body boundaries fully accounted for the effect of spaciousness on selflessness. In other words, feeling that their body boundaries were less distinct or more integrated with the environment is the reason why participants exhibited greater selflessness in spacious settings. Van Rompay told PsyPost he was surprised to observe “the connection between a very abstract notion (loss of self, selflessness) and its grounding in a very concrete bodily sensation.” Similar to selflessness, participants also reported feeling more connected in general (“I had the sense of being connected to everything”) and more connected to their community after being immersed in spacious environments. 
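The full-mediation result described above can be illustrated with a toy analysis. The data below are synthetic, invented purely for illustration (they are not the study's measurements), and are constructed so that spaciousness affects selflessness only through weakened body boundaries; two least-squares regressions then recover the pattern the authors report: the effect of spaciousness vanishes once the mediator is controlled for.

```python
import numpy as np

# Synthetic illustration of full mediation -- invented data, not the
# study's measurements. Spaciousness (0 = dense, 1 = spacious) weakens
# perceived body boundaries, which in turn raises selflessness;
# spaciousness has no direct path to selflessness by construction.
rng = np.random.default_rng(0)
n = 80  # matches the study's sample size
spacious = rng.integers(0, 2, n).astype(float)
body_boundaries = -1.5 * spacious + rng.normal(0.0, 0.3, n)
selflessness = -1.0 * body_boundaries + rng.normal(0.0, 0.3, n)

def slopes(y, *predictors):
    """Least-squares coefficients of y on the predictors (intercept included)."""
    X = np.column_stack((np.ones(len(y)),) + predictors)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]  # drop the intercept

total_effect = slopes(selflessness, spacious)[0]
direct_effect = slopes(selflessness, spacious, body_boundaries)[0]
print(f"total effect:  {total_effect:.2f}")
print(f"direct effect: {direct_effect:.2f}")
```

Run on these synthetic data, the total effect is large while the direct effect (with the mediator held constant) collapses toward zero, which is the signature of full mediation the study reports.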
Stress levels significantly decreased after participants experienced VR nature exposure in both spacious and dense conditions. However, stress reduction was more pronounced in spacious settings, indicating the potential of open landscapes to reduce stress. Anxiety levels were notably lower in the spacious condition compared to the dense condition. Furthermore, in tended, human-managed nature settings, anxiety was lower compared to wild, untouched landscapes. This finding highlights the potential of both spaciousness and human interventions in natural environments to reduce anxiety. Participants also reported higher levels of positive emotions in spacious nature environments compared to dense ones. The findings provide evidence “that there are ‘easy’ things that can be done to get out of that mental prison; going outside and seeking out a spacious setting (open field, large water surface) is one of them,” van Rompay said. Moreover, the study demonstrates the value of virtual reality technology in providing immersive nature experiences, particularly for those who may have limited access to wide open spaces. Virtual reality could help to bridge the gap between urban living and the natural world, offering opportunities for relaxation, stress reduction, and emotional well-being. However, the study also has some limitations to consider, such as the relatively small sample size. In addition, the use of virtual reality, while innovative, cannot fully replicate the complexity and sensory richness of real-life natural environments. Future research could explore these effects in more diverse and realistic settings to provide a comprehensive understanding of the relationship between spaciousness and well-being. “I feel that interventions such as these can not only keep us sane (in terms of mental health), but they can also strengthen our connection to nature and in doing so make us care for the environment and for the health of our planet in general,” van Rompay added. 
The study, “Lose yourself: Spacious nature and the connected self“, was authored by Thomas J.L. van Rompay, Sandra Oran, Mirjam Galetzka, and Agnes E. van den Berg.
Environmental Science
Toxic PFAS chemicals have been detected in seven out of 10 insecticides tested in the US, according to new research. Six contained what the study’s lead author characterized as “screamingly high” levels of PFOS, one of the most dangerous PFAS compounds. The Environmental Protection Agency (EPA) has known about the findings for more than 18 months but appears to have not yet investigated the products or taken any action against the manufacturer. PFAS, also known as forever chemicals, can be taken up by crops. Such high levels in pesticides create a health risk if spread on fields where food is grown, public health advocates say. “We know PFOS is a carcinogen, we know it’s a deadly chemical and there’s no safe level in drinking water,” said Kyla Bennett, a former EPA official and science policy director with the non-profit Public Employees for Environmental Responsibility (PEER), which issued a press release on the study. “Our soil and water are now contaminated.” In a statement, the EPA told the Guardian it’s reviewing active ingredients used in pesticides – those which kill pests – to determine if any are PFAS. However, PFOS could be an inert ingredient. Per- and polyfluoroalkyl substances, or PFAS, are a class of about 12,000 chemicals typically used to make thousands of products water-, stain- and heat-resistant. They do not naturally break down and accumulate in humans and the environment. A growing body of evidence links them to serious health problems such as cancer, birth defects, liver disease, kidney disease, autoimmune disorders, high cholesterol and decreased immunity. Researchers from Texas Tech University checked 10 insecticides that were being used on cotton, but can also be used on food and other crops. The peer-reviewed study, published in the Journal of Hazardous Materials Letters, found PFAS in seven of these “widely used” insecticides, said environmental toxicologist and lead author Steve Lasee, who was at Texas Tech University at the time of the study. 
He is now an independent consultant with Lasee Research and Consulting and a research fellow for the EPA. Testing revealed PFOS at a level as high as 19 million parts per trillion (ppt) in one insecticide. The EPA hasn’t set limits for PFAS in pesticides, but in June it lowered its advisory health limit in drinking water to 0.02 ppt, a level so low as to suggest no amount of exposure to the compound is safe. Lasee said he presented his findings in March 2021 to staff members at the EPA’s Office of Research and Development and at a conference attended by environmental science professionals and EPA staff. He said he received an email from leadership in one of EPA’s divisions asking him to present his study to more EPA staff, but never heard anything beyond that. Lasee said he named the insecticides’ active ingredients but never received requests for the brand names, meaning the EPA could not know which companies had sold tainted products. The EPA did not respond to direct questions about the study’s findings or about Lasee’s presentation to the agency. Lasee said the Massachusetts department of environmental protection (DEP) contacted him after his presentation to say it was interested in learning more about the research. A DEP spokesperson told the Guardian that the agency had been testing some pesticides for PFAS and had “discontinued use” of those that contain the chemicals. The agency is reviewing the information in the Texas Tech study and determining what’s next, the spokesperson said. It’s unclear what purpose PFAS in insecticides may serve, but Lasee said they could be used as a dispersing agent, to help the pesticide spread evenly. The study was published amid increased scrutiny of PFAS in pesticides because of the potential for widespread food and water contamination. Multiple studies have established that crops take up PFAS, which can then be ingested by humans. 
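To get a sense of scale, the two figures quoted above can be compared directly. Note the caveat: the numbers apply to different media (an insecticide formulation versus drinking water), so this is a back-of-the-envelope contrast, not a risk calculation.

```python
# Sense-of-scale comparison only: the two figures apply to different
# media (an insecticide formulation vs. drinking water), so this is a
# back-of-the-envelope contrast, not a risk calculation.
insecticide_pfos_ppt = 19_000_000  # highest PFOS level found in one insecticide
advisory_limit_ppt = 0.02          # EPA's 2022 interim drinking-water advisory

ratio = insecticide_pfos_ppt / advisory_limit_ppt
print(f"factor of ~{ratio:.1e}")   # roughly nine orders of magnitude apart
```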
The Food and Drug Administration (FDA) began monitoring PFAS in food in 2019 and has detected them in fruits and vegetables, but has not set any limits. The EPA earlier this year found that PFAS added to plastic barrels and containers used to store pesticides can leach into the products. An EPA spokesperson said the agency alerted companies that they may be in violation of the law. However, Lasee said the type of PFAS compounds he found are different from those that leached from plastic containers, and the level of PFAS the Texas Tech study found is several orders of magnitude higher, suggesting that the chemicals are from a different source. In September, the EPA proposed banning some PFAS that had been approved for use as inert ingredients in pesticide products, but it said active ingredients are still being reviewed. “EPA will share results of that investigation as soon as possible,” an agency spokesperson said. The agency also updated a webpage with information about PFAS in pesticides in September that claims PFOS is not used in the products. “The EPA Office of Pesticide Programs previously determined that there were no pesticide active or inert ingredients with structures similar to prominent PFAS such as PFOS,” it reads. That could be contradicted by Lasee’s research. The reason for the presence of PFOS in the insecticides is unclear. It could be the result of chemical companies illegally adding the compound, Bennett said. It could also be that a different PFAS compound is added to the pesticide, then breaks down into PFOS after the pesticide is manufactured. 
The EPA did not reply to specific questions about the statement on its site. Bennett said there was little consumers can do to immediately protect themselves beyond eating organic food, though she noted that many people don’t have access to organic products or cannot afford them. That leaves it up to the EPA to take swifter and more forceful action to get PFAS out of pesticides, Bennett added. “We have to get the EPA to stop allowing PFAS in pesticides,” she said. “We’ve got a toxic chemical in them that doesn’t need to be there, and pesticides are bad enough on their own without adding another carcinogen.”
Environmental Science
Study finds bottom-dwellers thrive at foundations of offshore wind farms Offshore wind farms host more soil animals per square meter than the North Sea floor, Leiden researchers have discovered. After 25 years, a hundred times more animals and twice the number of species could live on the foundations of wind turbines. The researchers published their findings in Environmental Science & Technology. Until now, little was known about the long-term effects of wind farms on marine life. "Previous studies focused only on several years, not a whole life cycle of a wind turbine," says research leader and industrial ecologist Chen Li. A wind turbine lasts about 25 to 30 years, so Li examined the effects of turbines on the soil after 25 years. Wind turbine foundations host more soil animals That soil animals feel at home in wind farms was known, "but now we have detailed numbers and a method to quantify the effect of wind farms on biodiversity," says Li. His calculations show that foundations of wind turbines are more popular among soil animals than the North Sea floor. After 25 years, the wind turbine foundations could host a hundred times more animals and lead to a doubling of species richness. "It is great that, along with the contributions of offshore wind energy to renewable energy, there can also be co-benefits for marine biodiversity," says co-author and environmental scientist Laura Scherer. Bottom-dwellers feel at home in wind farms because they can find more food there. Plants can grow undisturbed on the foundations, since bottom trawling is prohibited in many farms. 11 years of data from different research institutes Li did not have to dive into the sea to collect the soil life samples himself for his research. He used data from Wageningen Marine Research, Ghent University and the Royal Belgian Institute of Natural Sciences from six wind farms in the North Sea. 
Samples had been collected on German, Belgian, Danish and Dutch wind farms over a period of up to 11 years. Li compared the samples of seabed life inside the wind farms with samples taken just outside the farms. He then estimated, using a model, how the seabed life will continue to develop until the end of the wind turbine's lifespan. Vibrations and noise Li's results do not mean wind turbines are solely positive for marine life. Scherer states, "It depends on which species are lost or gained. An advantage for one species can be a disadvantage for another." For example, some seabirds avoid wind farms and may lose their feeding or wintering grounds. In addition, the effect of building wind turbines was not included in the study. Li says, "Installing a turbine creates a lot of vibrations and noise." This can disorient fish and mammals. And what do you do with a turbine at the end of its life? "If a wind turbine foundation is fully removed, the biodiversity gains we found will be totally damaged," says Li. This requires careful consideration. However, Scherer is "cautiously optimistic" about the impact of the increasing number of wind turbines in the North Sea on biodiversity—"if the locations are carefully chosen." Wind farms produce renewable energy, slowing down climate change. And less climate change benefits marine life. Therefore, "even if there were net negative impacts of the infrastructure and operation of offshore wind power, the benefits from less climate change could possibly compensate for them." More information: Chen Li et al, Offshore Wind Energy and Marine Biodiversity in the North Sea: Life Cycle Impact Assessment for Benthic Communities, Environmental Science & Technology (2023). DOI: 10.1021/acs.est.2c07797 Journal information: Environmental Science & Technology Provided by Leiden University
Environmental Science
A tailored and rapid approach for ozonation catalyst design In a new study published in the journal Environmental Science and Ecotechnology, researchers from the Chinese Research Academy of Environmental Sciences employed machine learning, specifically an artificial neural network (ANN) model, to predict catalyst performance based on data collected from 52 different catalysts. The ANN model demonstrated a strong correlation and generalization ability, indicating its robustness in predicting catalyst behavior. Additionally, fluorescence spectroscopy, which provides valuable information on the composition and concentration of organics in wastewater, was integrated with the machine learning model. This combined approach leads to more efficient and effective treatment of refractory organics. Using the Mn/γ-Al2O3 catalyst as an example, the researchers successfully screened a range of catalyst formulations using fluorescence spectroscopy. They determined the optimal impregnation concentration and time of Mn(NO3)2 for specific wastewater compositions. The ANN model then generated an optimized formulation for the Mn/γ-Al2O3 catalyst, resulting in improved catalytic performance. The predicted and experimental values for total organic carbon removal were closely aligned, confirming the effectiveness of the optimized catalyst. The study also identified the synergistic effect of reactive oxygen species (•OH and 1O2) and the Mn/γ-Al2O3 catalyst as the key factor contributing to the improved performance. This innovative approach offers a rapid and tailored solution for designing ozonation catalysts based on the unique characteristics of wastewater quality. By combining machine learning and fluorescence spectroscopy, researchers can optimize catalyst formulation more efficiently, leading to enhanced treatment of refractory organics in industrial wastewater. 
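As a rough illustration of the modeling step (not the paper's code, data, or architecture — the inputs, targets, and network size below are all invented placeholders), a small feed-forward network can be trained to map catalyst preparation parameters to TOC removal:

```python
import numpy as np

# Illustrative sketch only: a tiny feed-forward network mapping catalyst
# preparation parameters to TOC removal. The inputs, targets, and network
# size are invented placeholders, not the study's data or architecture.
rng = np.random.default_rng(1)
n = 52                                # the study fit its ANN to 52 catalysts
conc = rng.uniform(0.1, 2.0, n)       # hypothetical Mn(NO3)2 concentration
time_h = rng.uniform(1.0, 12.0, n)    # hypothetical impregnation time (h)
# Placeholder "measurements": removal peaks at a mid-range concentration.
y = 0.6 - 0.3 * (conc - 1.0) ** 2 + 0.02 * time_h

X = np.column_stack([conc, time_h])
X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize inputs

# One hidden tanh layer trained by full-batch gradient descent on MSE.
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)               # hidden activations
    pred = (h @ W2 + b2).ravel()           # network output
    err = pred - y
    g_out = 2.0 * err[:, None] / n         # gradient of MSE w.r.t. pred
    gW2 = h.T @ g_out; gb2 = g_out.sum(axis=0)
    g_h = (g_out @ W2.T) * (1.0 - h ** 2)  # backprop through tanh
    gW1 = X.T @ g_h; gb1 = g_h.sum(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((pred - y) ** 2))
print(f"training MSE: {mse:.5f} (target variance: {np.var(y):.5f})")
```

Once trained, such a model can be evaluated over a grid of candidate preparation parameters to pick a formulation with the highest predicted removal, which mirrors the screening-then-optimization workflow the study describes.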
Moreover, applying the ANN model combined with fluorescence spectroscopy holds great potential for further advancements in catalyst development, performance prediction, and process simulation in complex wastewater systems. This approach provides a valuable strategy for researchers and practitioners in wastewater treatment, enabling the development of more sustainable and efficient treatment methods. More information: Min Li et al, A tailored and rapid approach for ozonation catalyst design, Environmental Science and Ecotechnology (2023). DOI: 10.1016/j.ese.2023.100244 Provided by TranSpread
Environmental Science
Nuclear weapons tests found to contribute to persistent radioactivity in German wild boars Shaggy-haired, tusked pigs roam free in the woods of Germany and Austria. Although these game animals look fine, some contain radioactive cesium at levels that render their meat unsafe to eat. Previously, scientists hypothesized that the contamination stemmed from the 1986 Chernobyl nuclear power plant accident. But now, researchers report in Environmental Science & Technology that nuclear weapon fallout from 60 to 80 years ago also contributes significantly to the wild boars' persistent radioactivity. Radioactive cesium, a byproduct of nuclear weapons explosions and nuclear energy production, poses risks to public health when it enters the environment. And the environment across Europe got a large pulse of radioactive cesium contamination following the Chernobyl power plant accident 37 years ago. Most of that radioactivity originated from cesium-137, but a much longer-lived form, called cesium-135, can also be produced during nuclear fission. Over time, cesium-137 has declined in most game animals, but wild boars' radioactivity levels haven't changed substantially. Their meat continues to exceed regulatory limits for consumption, in some places leading to less hunting and consequently contributing to the overpopulation of the animals in Europe. Because the radioactive cesium levels haven't changed as expected, Georg Steinhauser, Bin Feng and colleagues wanted to investigate the amount and origin of that contamination in wild boars from Germany. The researchers worked with hunters to collect wild boar meat from across Southern Germany and then measured the samples' cesium-137 levels with a gamma-ray detector. To determine the origin of the radioactivity, the team compared the amount of cesium-135 to cesium-137 with a sophisticated mass spectrometer. 
Previous studies showed that this ratio clearly indicates sources: A high ratio points to nuclear weapons explosions, whereas a low ratio implicates nuclear reactors. The team observed that 88% of the 48 meat samples exceeded German regulatory limits for radioactive cesium in food. For the samples with elevated levels, the researchers calculated the ratios of cesium-135 to cesium-137 and found that nuclear weapons testing supplied between 10% and 68% of the contamination. And in some samples, the amount of cesium from weapons alone exceeded regulatory limits. The researchers propose that the mid-20th century weapons tests were an underappreciated source of radioactive cesium to German soil, which was also unevenly impacted by the Chernobyl accident. Contamination from both sources has been taken up by the wild boars' food, such as underground truffles, contributing to their persistent radioactivity. The researchers say that future nuclear accidents or explosions could worsen these animals' contamination, potentially impacting food safety for decades. More information: Disproportionately High Contributions of 60 Year Old Weapons-137Cs Explain the Persistence of Radioactive Contamination in Bavarian Wild Boars, Environmental Science & Technology (2023). DOI: 10.1021/acs.est.3c03565 Journal information: Environmental Science & Technology Provided by American Chemical Society
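The source-attribution logic above — a high cesium-135 to cesium-137 ratio means weapons fallout, a low ratio means reactor-derived cesium — can be sketched as a linear two-endmember mixing model. The endmember ratios below are illustrative placeholders chosen for the example, not the values used in the paper:

```python
# Two-endmember mixing sketch for ratio-based source attribution.
# The endmember ratios are illustrative placeholders, not the paper's values.
R_WEAPONS = 2.7    # assumed 135Cs/137Cs ratio of weapons fallout
R_CHERNOBYL = 0.5  # assumed 135Cs/137Cs ratio of Chernobyl-derived cesium

def weapons_fraction(measured_ratio: float) -> float:
    """Fraction of a sample's 137Cs attributable to weapons fallout.

    Linear mixing: measured = f * R_WEAPONS + (1 - f) * R_CHERNOBYL,
    solved for f and clamped to [0, 1].
    """
    f = (measured_ratio - R_CHERNOBYL) / (R_WEAPONS - R_CHERNOBYL)
    return min(max(f, 0.0), 1.0)

for r in (0.7, 1.3, 2.2):
    print(f"measured ratio {r:.1f} -> weapons share {weapons_fraction(r):.0%}")
```

The mixing is linear in the fraction of cesium-137 because the measured ratio is a cesium-137-weighted average of the two endmember ratios, which is why a single measured ratio suffices to apportion each sample between the two sources.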
Environmental Science
It’s no surprise that the climate crisis is fueling more intense weather, but new research shows how warming ocean temperatures are boosting tropical storms, causing hurricanes to become bigger and stronger at faster rates. A study published this week in the journal Scientific Reports found that hurricanes that form in the Atlantic Ocean are now twice as likely to grow from a small storm to a strong Category 3 hurricane in just a day. Andra Garner, an assistant professor of environmental science at Rowan University and the paper’s author, studied tropical storms that formed in the Atlantic Ocean from 2001 to 2020. Data showed that 8.1% of these storms upgraded from a Category 1 to a Category 3 hurricane or stronger within 24 hours. She compared that to storms that formed from 1971 to 2000, of which only 3.2% strengthened that quickly. The study noted that the increase in quickly strengthening storms occurred alongside rising ocean temperatures, which are known to fuel tropical storms. Stronger storms of course mean greater damage to infrastructure and displacement of coastal communities, but faster-intensifying storms also make it harder for people to adequately prepare or evacuate. “The rapid intensification of TCs [tropical cyclones] in a warmer climate is particularly concerning, given that such events can be difficult to forecast and predict, leading to potentially escalated damages as well as difficulties when communicating the approaching hazard to coastal residents who may be in the TC’s path,” the study reads. We’ve seen several hurricanes get bigger in alarmingly short periods of time these past few years. Hurricane Maria formed in September 2017 and intensified from a Category 1 storm to a Category 5 in less than 24 hours. It slammed into Puerto Rico with maximum sustained winds of about 155 miles per hour. And last month, Hurricane Lee was barely a hurricane when it formed, but it grew to a Category 5 within a day. 
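The era comparison at the heart of the study reduces to simple arithmetic on the two reported shares:

```python
# Shares of Atlantic storms that intensified from Category 1 to
# Category 3+ within 24 hours in each era, as reported in the study.
share_1971_2000 = 0.032  # 3.2% of storms
share_2001_2020 = 0.081  # 8.1% of storms

relative_increase = share_2001_2020 / share_1971_2000
print(f"Storms are now {relative_increase:.1f}x as likely to rapidly intensify")
# ~2.5x, i.e. more than twice as likely, consistent with the headline figure
```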
There are other factors involved, including whether we’re in an El Niño or La Niña year. Remember how we ran through storm names so quickly that we had to speedrace through the Greek alphabet in 2020? That was a La Niña year. This year is an El Niño year, which means fewer hurricanes in the Atlantic and more in the Pacific. But because ocean temperatures have been so warm this year, the National Oceanic and Atmospheric Administration (NOAA) had to upgrade its original forecast for hurricane season from “near-normal” to “above average.” This year was expected to see 12 to 17 named storms; the update increases that prediction to an expectation of 14 to 21 named storms. “Forecasters believe that current ocean and atmospheric conditions, such as record-warm Atlantic sea surface temperatures, are likely to counterbalance the usually limiting atmospheric conditions associated with the ongoing El Nino event,” NOAA explained in its statement. There is a solution: mitigating climate change by phasing out oil and gas infrastructure to stop the planet from becoming even warmer. “One of the messages from this work is that there is an urgency,” Garner said in a statement. “If we don’t make some pretty big changes and rapidly move away from fossil fuels, this is something we can expect to see worsen in the future.”
Environmental Science
Bioinspired self-assembled colloidal collectives of active matter systems Active matter systems feature unique behaviors that include collective self-assembly and collective migration. However, realizing collective entities that can perform three-dimensional locomotion without dispersing, in spaces without wall-adhered support, remains challenging. In a new study, published in Science Advances, Mengmeng Sun and a research team in mechanical engineering and physical intelligence in China and Germany drew bioinspiration from the migration mechanisms of plankton and proposed a bimodal actuation strategy combining magnetic and optical fields. While the magnetic field triggered the self-assembly of magnetic colloidal particles to maintain numerous colloids as a dynamically stable entity, the optical field allowed the colloidal collectives to generate convective flow through photothermal effects for 3D drifting. The collectives performed 3D locomotion underwater, providing insights into the design of smart devices and intelligent materials for synthetic active matter that can regulate collective movement in 3D space. Active living matter Active living matter is ubiquitous in nature, offering self-assembled collectives that can accomplish complex tasks surpassing individual capabilities, such as bird flocks and colonies of bacteria. Bioinspired by natural collectives, it is possible to examine colloids as building blocks for materials, much like atoms form the building blocks of molecules and crystals. Colloidal self-assembly can be studied as a method to fabricate nanostructures with technical implications for nanoscale electronics, energy conversion and storage, drug delivery and catalysts. In this work, Sun and a team of scientists presented a new approach to achieve 3D motility of colloidal collectives without dispersion. 
The colloidal collective consisted of ferrofluidic iron colloidal particles with a diameter below 1 μm, driven by a tailored rotating magnetic field to self-assemble into a dynamically stable collective. The team focused on optically induced convective flow, using fluid currents for 3D drifting—bioinspired by plankton. Sun and the team discussed methods for transitioning the colloidal collectives between environments and examined their locomotion capabilities on water surfaces. The outcomes culminated in colloidal collectives with 3D mobility that adapt to complex environments with physical intelligence for locomotion, self-assembly and regulation. Bimodal actuation strategy Sun and the research team adopted a bimodal actuation strategy of magnetic and optical fields to realize 3D locomotion of colloidal collectives. In the first step, they triggered the formation of colloidal collectives by applying a magnetic field with three adjustable parameters: pitch angle, frequency, and strength. At first, in the absence of a magnetic field, the ferrofluidic colloids exhibited Brownian motion after settling. Once energized by the tailored rotating magnetic field, they self-assembled to form small primitive collectives, known as nonequilibrium colloidal collectives, that continued to increase in size and merge with neighboring particles; the scientists confirmed this growth using simulations. The morphology of the colloidal collective depended on the strength and frequency of the applied magnetic field, which allowed the collective to maintain its integrity and dynamic stability. Temperature gradient The dispersed ferrofluid colloidal particles absorbed near-infrared light and converted it to heat, giving rise to a local temperature gradient. The temperature gradient induced a convective flow that carried the particles upward to gather into a collective with an enhanced photothermal effect. 
This resulted in the maintenance of a dynamically stable entity that did not disintegrate. In the absence of the near-infrared optical field, the colloidal collective cooled down, its hydrodynamic force weakened, and it sank progressively under gravity. The researchers therefore adjusted the optical field to regulate convection and achieved vertical upward motion, hovering, and directional horizontal motion. When the hydrodynamic force exceeded gravity, the convection pushed the collective vertically upward, and balancing the two forces allowed the colloidal collective to hover underwater. By regulating the optical field, Sun and team directed the motion of the colloidal collective and adjusted its position underwater. Transitions through the air-water interface The scientists investigated the ability of the colloidal collective to break through the water surface using induced convective flow, showing how the samples successfully exited the water by overcoming its surface tension. The colloidal collectives overcame surface tension and gravity for well-regulated transitions through the water surface, allowing them to dive into the water at a desired location and time. The researchers analyzed the constructs by accounting for buoyancy, hydrodynamic force from convection, surface tension, and gravity. Sun and team explored these effects on conventional microrobot collectives to introduce spatially symmetrical interactions for locomotion underwater and on the water's surface. The team used magnetic and optical fields to drive the movement of such microrobot collectives on the water surface, where they climbed the water meniscus for transport driven by an optical field. Such instruments, known as surface walkers, can cross obstacles larger than their own size and bypass high barriers, with applications in environmental science, medicine, and engineering. 
Outlook In this way, Mengmeng Sun and colleagues were bioinspired by the migration mechanisms of plankton to propel colloidal collectives through 3D space without boundaries. The team combined magnetic and optical fields for well-formed and well-regulated 3D locomotion of active colloidal collectives in an aquatic environment. Such colloidal systems provide a powerful process to explore the physics of self-assembly and a practical method to synthesize functional materials. The systems can form self-assembled colloidal collectives under external magnetic fields, creating structures that can be guided through spaces and interfaces to attain unusual geometries and patterns. Sun and team intend to investigate these collectives and their complexity for materials synthesis and design. These dual-responsive constructs can function as microrobot collectives with environmental adaptability, with practical applications in biofluids of high viscosity and high ionic concentration, and broad relevance to biomedical engineering. More information: Mengmeng Sun et al, Bioinspired self-assembled colloidal collectives drifting in three dimensions underwater, Science Advances (2023). DOI: 10.1126/sciadv.adj4201 Journal information: Science Advances © 2023 Science X Network
Environmental Science
Published November 4, 2022 2:46PM Updated 3:41PM What are microplastics? Microplastics are pieces of plastic that measure less than 5 millimeters, and these tiny particles are finding their way into our drinking water, food, and even our blood. A recent study found that those wonderfully convenient non-stick pans you’ve been using could actually be releasing thousands of microplastics and/or nanoplastics with each use, especially if they’ve been damaged.  Researchers in Australia used a specialized imaging process called Raman imaging on pans coated with Teflon and found that, with each use, thousands and maybe even millions of microplastic particles are estimated to be released.  Teflon is a synthetic plastic that is chemically composed of carbon and fluorine atoms, according to the study. It has an extremely low level of friction and notable chemical, thermal and electrical stability.  However, Teflon is a member of PFAS, which are also known as "forever chemicals" because they last so long in the environment. They have been associated with serious health conditions, including cancer and reduced birth weight.  FILE - Chef cooks sausage and veggies in a pan. (Marie Ostrosky for The Washington Post via Getty Images) PFAS is short for per- and polyfluoroalkyl substances, which are used in nonstick frying pans, water-repellent sports gear, stain-resistant rugs, firefighting foam and many other products. The chemical bonds are so strong that they don’t degrade, or do so only slowly, in the environment and remain in a person's bloodstream indefinitely.  "As emerging contaminants, microplastic/nanoplastic is a big concern with lots of unknowns. Teflon is a member of PFAS, so Teflon microplastic/nanoplastics make the concern even bigger," said co-author of the study and senior research fellow at the University of Newcastle Australia, Dr. Cheng Fang.  RELATED: Microplastics found in human blood for the 1st time, study says What are microplastics and nanoplastics? 
Microplastics are pieces of plastic that measure less than 5 millimeters, and nanoplastics measure less than 1 micrometer. In short, they are very small pieces of plastic that are not easily detectable to the naked eye.  Teflon’s teeny-tiny plastics From left to right - Tested cookware with steel spatula (L), a scanning electron microscope (SEM) image (M) and the Raman imaging results detecting particles.  (Dr. Cheng Fang) Fang and his team tested six different non-stick pots and pans — four new and two used — and mimicked a typical cooking or cleaning process. Researchers noted no actual food or cooking oils were used in this experiment and said the process was akin to what they called a "dry run."  A steel spatula, a barbecue clamp, a stainless steel wool scrubber and a wooden spatula were used on the tested cookware.  Researchers estimated about 9,100 particles of microplastics/nanoplastics could be released from small scratches during cooking, and a whopping 2.3 million particles could be released from significantly damaged areas such as cracks or fractures.  Fang also noted that even if there was no damage to the cookware, the non-stick coating could still release particles over time as the coating wears down with each use.  "It depends but most likely is inevitable from our current test, unfortunately," Fang said about the non-stick cookware releasing particles over time. "You might note that your non-sticking pot/pan gradually (after years of use) becomes yellow and looks no longer brand-new, which is supposedly releasing particles over its lifetime."  These estimates could also be more or less "depending on cooking style/habit, turner/pan materials/approaches etc.," Fang said.  Separately, Americans eat, drink and breathe in an estimated 74,000 to 121,000 microplastics each year, depending on their age and gender, according to a study published in Environmental Science & Technology.  
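The per-use release figures above are point estimates for a single cooking event. As a rough illustration of how such numbers compound over time (the per-event counts are the study's as quoted; assuming one cooking event per day is my own illustrative assumption, not a figure from the study):

```python
# Toy cumulative-release arithmetic based on the figures quoted above.
# Per-event particle counts come from the article; one cooking event per
# day is an illustrative assumption, not something the study reports.

RELEASE_SMALL_SCRATCHES = 9_100      # particles per event, light wear
RELEASE_BROKEN_COATING = 2_300_000   # particles per event, cracked coating

def yearly_release(per_event, events_per_day=1, days=365):
    """Total particles released over a year of repeated use."""
    return per_event * events_per_day * days

print(yearly_release(RELEASE_SMALL_SCRATCHES))  # prints 3321500
print(yearly_release(RELEASE_BROKEN_COATING))   # prints 839500000
```

Even under the mild small-scratch scenario, daily use would put the annual figure in the millions of particles, which is why the researchers flag gradual coating wear as a concern.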
Those who only drink bottled water over tap water can ingest an additional 90,000 plastic particles each year, the study found.  "Individuals who meet their recommended water intake through only bottled sources may be ingesting an additional 90,000 microplastics annually, compared to 4,000 microplastics for those who consume only tap water," the study stated.  RELATED: Infants ingest 15 times more microplastics than adults, new study finds What’s next? Further research is warranted on better detecting microplastics and nanoplastics, according to Fang, as well as on the long-term health impacts of ingesting such tiny plastic particles.  "We are actually being surrounded by lots of plastic items in our daily lives. We need and enjoy them. However, most of them can release particles/fragments in their lifetime as microplastics and even nanoplastics. Microplastics/nanoplastics contamination are everywhere," Fang said. "While the risk assessment is not yet done, we should be cautious to avoid the potential contamination in our kitchen and in our food."  Kelly Hayes contributed to this report. This story was reported from Los Angeles.
Insta-worthy catch? Social media helps researchers track changes in fisheries Among the many changes brought on by the pandemic, a shift in fishing around Hawai'i Island showed researchers at the University of Hawai'i at Hilo, thanks to photos posted on social media, just how important the activity is to residents. The researchers were part of a team examining pandemic-driven changes in nearshore, non-commercial use of fisheries on Hawai'i Island. The study, "Pandemic-driven changes in the nearshore non-commercial fishery in Hawai'i: catch photos posted to social media capture changes in fisher behavior," was published on March 28 in the journal PeerJ. While studying new fishing habits that arose during the pandemic and how those changes were impacting local fisheries, the researchers also confirmed something interesting about data collection in the age of social media. Catch photos posted to Instagram told a story about changes in fishing behavior much more quickly than conventional approaches to data collection. The images showed that people were fishing more during the COVID-19 lockdown, and they started targeting different species than their normal catch. The study's lead author is adjunct associate professor of marine science Tim Grabowski, who is also unit leader at the U.S. Geological Survey's Hawai'i Cooperative Fishery Research Unit in Hilo. "This project illustrates the importance of nearshore fisheries to the people of Hawai'i Island as a source of both sustenance and recreation during a time of hardship, uncertainty and stress," said Grabowski. "Not only were people fishing more during the COVID-19 lockdown, but they started targeting different species. From a management perspective, understanding how fishers change their behavior during hard times will allow better planning to ensure these resources are available to people when they are needed most." 
The study's co-authors are Michelle Shuey, an instructor of geography and environmental science; Andrew Curley, an anthropology alumnus; Michelle Benedum, a political scientist and doctoral candidate from University of Colorado at Boulder; and Cole Dill-De Sa, an undergraduate student at Stanford University's earth systems program. Curley and Dill-De Sa were able to participate in the study through UH Hilo's Pacific Internship Programs for Exploring Science program. Study highlights The study produced three main takeaways. First, resource users (people fishing) posted to social media nearly three times as often during the pandemic with nearly double the number of fish pictured per post. Second, individuals who fished for subsistence were more likely to increase the amount of time spent fishing and relied more on their catch for food security. Last, individuals fishing exclusively for subsistence were more likely to fish for different species. In addition to examining catch photos on social media, researchers also collected oral histories directly from fishers. Information gleaned from those interviews validated the social media findings, suggesting that social media can be used to rapidly collect data and predict changes in nearshore fisheries due to large-scale disturbances. "Our analysis of photos posted to social media allowed us to quickly and efficiently detect many of the changes that occurred in the fishery that were only apparent after spending much more time and effort talking story with local fishers around Hilo Bay," said Grabowski. The researchers emphasize the importance of this quick data collecting—as climate change threatens additional disturbances, it will be necessary for resource managers to collect reliable data quickly to prevent unsustainable fishing pressures and to better target management plans. "Climate change is only likely to increase the frequency and severity of acute social disruptions, like the COVID-19 pandemic," said Grabowski. 
"Therefore, understanding how fishers respond to these disruptions is critical to ensuring that resource management can be responsive and adaptive to changing use patterns." More information: Timothy Grabowski et al, Pandemic-driven changes in the nearshore non-commercial fishery in Hawai'i: catch photos posted to social media capture changes in fisher behavior, PeerJ (2023). DOI: 10.7717/peerj.14994 Journal information: PeerJ Provided by University of Hawaii at Manoa
The constellation will be used to generate a complete, high-resolution 3D point cloud of the Earth's surface (Image: Shutterstock/rommma) Orlando-based startup Nuview has emerged from stealth to build what it says is the world’s first commercial lidar satellite constellation, which will be used to map the entirety of Earth’s land surface in 3D. The high-quality, accurate mapping data will serve the environmental science, infrastructure, agriculture and forestry industries, helping address critical challenges in climate change mitigation, disaster response, sustainable farming, conservation and forestry. The majority of current satellite imagery provides a 2D view of the planet, with only an estimated 5% of the Earth’s landmass having ever been mapped with lidar. Nuview, which already has $1.2 billion in early adopter agreements, plans to change this via its new constellation. The firm sees great potential in the addressable geospatial market, which it expects to grow to a $1.7 trillion industry. "Nuview is thrilled to be leading a new era in geospatial technology to provide the first, most complete, high-resolution 3D point cloud of the Earth's surface," said Clint Graumann, CEO & co-founder of Nuview. "Our lidar satellite constellation will offer a wealth of information that has never before been available at scale, driving innovation and progress throughout numerous industries and revolutionising the way we understand and interact with our planet.” The constellation will collect data more than 100 times faster than current commercial aerial solutions in an “always on” approach, according to Nuview. The high-resolution 3D point cloud data could enable farmers to optimise crop yields and water usage, while city planners could use it to create more efficient and sustainable urban environments. In addition, the ability to quickly gather precise data about disaster-affected areas could help emergency responders and aid organisations to better coordinate their efforts. 
The Nuview team is also looking to collaborate with partners such as other satellite operators and analytics companies across various industries to explore new applications and possibilities.
Almost 5 miles above sea level in the Himalayan mountains, the rocky dip between Mount Everest and its sister peak, Lhotse, lies windswept, free of snow. It is here at the South Col where hundreds of adventurers pitch their final camp each year before attempting to scale the world's tallest peak from the southeastern side. According to new University of Colorado Boulder-led research, they're also leaving behind a frozen legacy of hardy microbes, which can withstand harsh conditions at high elevations and lie dormant in the soil for decades or even centuries. The research not only highlights an invisible impact of tourism on the world's highest mountain, but could also lead to a better understanding of environmental limits to life on Earth, as well as where life may exist on other planets or cold moons. The findings were published last month in Arctic, Antarctic, and Alpine Research, a journal published on behalf of the Institute of Arctic and Alpine Research (INSTAAR) at CU Boulder. "There is a human signature frozen in the microbiome of Everest, even at that elevation," said Steve Schmidt, senior author on the paper and professor of ecology and evolutionary biology. In decades past, scientists were unable to conclusively identify human-associated microbes in samples collected above 26,000 feet. This study marks the first time that next-generation gene sequencing technology has been used to analyze soil from such a high elevation on Mount Everest, enabling researchers to gain new insight into nearly everything present in the samples. The researchers weren't surprised to find microorganisms left by humans. Microbes are everywhere, even in the air, and can easily blow around and land some distance away from nearby camps or trails. "If somebody even blew their nose or coughed, that's the kind of thing that might show up," said Schmidt. 
What they were impressed by, however, was that certain microbes which have evolved to thrive in warm and wet environments like our noses and mouths were resilient enough to survive in a dormant state in such harsh conditions. Life in the cryosphere This team of CU Boulder researchers -- including Schmidt, lead author Nicholas Dragone and Adam Solon, both graduate students in the Department of Ecology and Evolutionary Biology and the Cooperative Institute for Research in Environmental Science (CIRES) -- study the cryobiosphere: Earth's cold regions and the limits to life in them. They have sampled soils everywhere from Antarctica and the Andes to the Himalayas and the high Arctic. Usually, human-associated microbes don't show up in these places to the extent they appeared in the recent Everest samples. Schmidt's work over the years connected him with researchers who were headed to Everest's South Col in May of 2019 to set up the planet's highest weather station, established by the National Geographic and Rolex Perpetual Planet Everest Expedition. He asked his colleagues: Would you mind collecting some soil samples while you're already there? So Baker Perry, co-author, professor of geography at Appalachian State University and a National Geographic Explorer, hiked as far away from the South Col camp as possible to scoop up some soil samples to send back to Schmidt. Extremes on Earth, and elsewhere Dragone and Solon then analyzed the soil in several labs at CU Boulder. Using next-generation gene sequencing technology and more traditional culturing techniques, they were able to identify the DNA of almost any living or dead microbes in the soils. They then carried out extensive bioinformatics analyses of the DNA sequences to determine the diversity of organisms, rather than their abundances. Most of the microbial DNA sequences they found were similar to hardy, or "extremophilic" organisms previously detected in other high-elevation sites in the Andes and Antarctica. 
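The article doesn't detail the bioinformatics pipeline, but the distinction it draws, diversity of organisms rather than their abundances, can be illustrated with a toy calculation contrasting taxon richness (presence/absence) with an abundance-weighted Shannon index. The taxon names and read counts below are invented for illustration:

```python
import math

# Toy sample: sequence read counts per taxon (counts entirely invented;
# the genera are ones named elsewhere in the article).
reads = {"Naganishia": 950, "Staphylococcus": 30, "Streptococcus": 20}

# Richness ignores abundances: it only asks how many taxa are present.
richness = sum(1 for n in reads.values() if n > 0)

# The Shannon index instead weights each taxon by relative abundance,
# so one dominant taxon pulls the value down toward low diversity.
total = sum(reads.values())
shannon = -sum((n / total) * math.log(n / total) for n in reads.values())

print(richness)           # prints 3
print(round(shannon, 3))  # prints 0.232
```

A richness-style analysis treats the rare human-associated genera as fully present, which is why presence/absence surveys like this one can surface a faint "human signature" that abundance-weighted measures would nearly erase.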
The most abundant organism they found using both old and new methods was a fungus in the genus Naganishia that can withstand extreme levels of cold and UV radiation. But they also found microbial DNA for some organisms heavily associated with humans, including Staphylococcus, one of the most common skin and nose bacteria, and Streptococcus, a dominant genus in the human mouth. At high elevation, microbes are often killed by ultraviolet light, cold temperatures and low water availability. Only the hardiest critters survive. Most -- like the microbes carried up great heights by humans -- go dormant or die, but there is a chance that organisms like Naganishia may grow briefly when water and the perfect ray of sunlight provide enough heat to help them momentarily prosper. But even for the toughest of microbes, Mount Everest is a Hotel California: "You can check out any time you like/ But you can never leave." The researchers don't expect this microscopic impact on Everest to significantly affect the broader environment. But this work does carry implications for the potential for life far beyond Earth, if one day humans set foot on Mars or beyond. "We might find life on other planets and cold moons," said Schmidt. "We'll have to be careful to make sure we're not contaminating them with our own." Additional authors on this publication include: Anton Seimon, Department of Geography and Planning, Appalachian State University; and Tracie Seimon, Wildlife Conservation Society, Zoological Health Program, Bronx, New York. This work was supported by the National Geographic and Rolex Perpetual Planet Everest Expedition, the Department of Ecology and Evolutionary Biology, and the University of Colorado Boulder Libraries Open Access Fund.
Influence of residual aluminum hydrolyzed species on activated sludge properties in industrial wastewater pre-treatment Given that previous studies have neglected the mechanism by which differences in aluminum content affect the activated sludge system, a research team from Wuhan University of Technology has investigated the species distributions of residual aluminum salts (RAS) and their effects on the removal efficiency of activated sludge (AS) under different polyaluminum chloride (PAC) concentrations, and revealed the internal mechanisms by which excessive and appropriate concentrations of RAS affect the AS system. The paper is published in the journal Frontiers of Environmental Science & Engineering. The work showed that the dominance of medium polymeric RAS, formed under an appropriate PAC dose of 20 mg/L, enhanced the hydrophobicity, flocculation, and sedimentation performance of AS, as well as the enzymatic activity of cells in the sludge system, improving the treatment system's removal efficiency for the main pollutants. The activated sludge (AS) method has been extensively utilized in industrial wastewater treatment systems worldwide owing to its cost-effectiveness, excellent treatment efficiency, and strong adaptability in practical applications. However, as the content of organic suspended matter in industrial wastewater increases, physical and chemical methods must be introduced as auxiliary steps to biological treatment. Among them is the coagulation-sedimentation process, a pretreatment technology with the advantages of simple operation and cost-effectiveness. It is worth noting, however, that a small amount of metal salts still remains in the subsequent biological treatment units. Early studies confirmed that long-term accumulation of residual metals entering the AS system negatively affects the properties and bioactivity of the sludge and the community structure of its microorganisms, and in turn the system's treatment efficiency. 
Nevertheless, as research deepened, it was found that low levels of toxic metals could coordinate with the freely dissolved extracellular polymeric substances (EPS) secreted by cells, illustrating that the accumulation of metals within an appropriate concentration range might benefit certain aspects of AS. However, few studies have comprehensively explored the internal mechanism behind differences in Al content or assessed the correlations among the various physical-chemical and biological parameters influencing AS properties. To fill these gaps, the Wuhan University of Technology researchers addressed the following questions: 1) determine the coagulant dosage beneficial to the reaction through various indicators of the reactor effluent and AS; 2) evaluate the influences of RAS on the AS system using several indicators, including EPS, relative hydrophobicity, surface charge, and electrolyte seepage volume; and 3) explore the internal mechanisms of the effects of excessive and appropriate concentrations of RAS on the AS system by analyzing the distribution of Al hydrolysis species. Based on analysis of the species distribution of aluminum salt hydrolysis, their study focused on the effect of different RAS concentrations on the AS system in the biochemical treatment unit following coagulation pretreatment. Their results showed that the medium polymeric RAS formed at an appropriate PAC dose of 20 mg/L was dominant, stimulating the secretion of EPS and decreasing the surface charge of the sludge. 
In addition, the heightened AS hydrophobicity and the neutralization of the sludge surface charge by Al salts resulted in better flocculation and settling performance, improving the treatment system's removal efficiency for the main pollutants. Comparatively, the species composition dominated by monomer and dimer/high-polymer RAS under an overdosed PAC concentration of 55 mg/L resulted in excessive secretion of EPS with a loose floc structure and conspicuous inhibition of cellular activity, leading to deterioration of the physicochemical and biological properties of AS. This study not only reveals the influence of RAS hydrolyzed species distributions on the comprehensive properties of AS, which is closely tied to the Al dosage, but also provides a theoretical reference for the precise control of coagulant dosage in industrial wastewater pretreatment. More information: Ziqi Zhao et al, Evaluation of activated sludge properties' changes in industrial-wastewater pre-treatment: role of residual aluminum hydrolyzed species with different polymerization degree, Frontiers of Environmental Science & Engineering (2023). DOI: 10.1007/s11783-023-1675-3 Provided by Higher Education Press
For years, beavers have been treated as an annoyance for chewing down trees and shrubs and blocking up streams, leading to flooding in neighborhoods and farms. But the animal is increasingly being seen as nature's helper in the midst of climate change. California recently changed its tune and is embracing the animals that can create lush habitats that lure species back into now-urban areas, enhance groundwater supplies and buffer against the threat of wildfires. A new policy that went into effect last month encourages landowners and agencies dealing with beaver damage to seek solutions such as putting flow devices in streams or protective wrap on trees before seeking permission from the state to kill the animals. The state is also running pilot projects to relocate beavers to places where they can be more beneficial. The aim is to preserve more beavers, along with their nature-friendly behaviors. “There's been this major paradigm shift throughout the West where people have really transitioned from viewing beavers strictly as a nuisance species, and recognizing them for the ecological benefits that they have,” said Valerie Cook, beaver restoration program manager for California's Department of Fish and Wildlife. The program was funded by Gov. Gavin Newsom's administration last year. The push follows similar efforts in other Western states including Washington, which has a pilot beaver relocation program, Cook said. It marks a new chapter in Californians' lengthy history with the animals, which experts say used to be everywhere, but after years of trapping, attempts at reintroduction, and then removal under depredation permits, are found in much smaller numbers than they once were — largely in the Central Valley and northern part of the state. It is unknown how many beavers live in California, but hundreds of permits are sought by landowners each year that typically allowed them to kill the animals. 
According to the state's Department of Fish and Wildlife, the beaver population in North America used to range between 100 million and 200 million but now totals between 10 million and 15 million. Kate Lundquist, director of the WATER Institute at the Occidental Arts & Ecology Center, said she expects California's changes will lead to fewer beavers killed in the state and a growth in wetland spaces. She said she believes the past three years of drought and devastating wildfires contributed to the state's shift on beavers. “There has been increased motivation to identify and fund the implementation of nature-based climate smart solutions,” she said. “Beaver restoration is just that.” Beavers live in family units and quickly build dams on streams, creating ponds. The pools help slow the flow of water, replenishing groundwater supplies, and can also stall the spread of wildfires — a critical issue for a state plagued by fires in recent years, said Emily Fairfax, professor of environmental science and management at California State University, Channel Islands. “You talk to anyone who has lived near beaver ponds. They’ll tell you: These things don’t burn,” said Fairfax, who has researched beavers and the ponds they build. The animals are not a protected species but help create habitat that is critical for others such as the coho salmon, which is listed under the Endangered Species Act. Young salmon grow and thrive in beaver ponds before heading to the ocean, which gives them a better shot at survival, said Tom Wheeler, executive director of the Environmental Protection Information Center, which has long pushed for California to try to resolve problems with beavers without killing them. Officials at the California Farm Bureau said they were studying the change and have not yet taken a position on it. California will continue to issue depredation permits as needed, but the state wants people to try other solutions before resorting to killing the animals, officials said. 
Those could be wrapping trees with wire mesh or using flow devices on streams to control beaver pond levels to prevent flooding. In some cases, it may involve relocating beavers to places that want them. Vicky Monroe, statewide conflict programs coordinator for California’s Department of Fish and Wildlife, said her office has long received requests from groups that want beavers, but the state didn’t have a mechanism to legally move them until recently. California has planned two pilot relocation projects, including one to bring beavers back to the Tule River. Kenneth McDarment, a councilmember for the Tule River Indian Tribe, said the tribe started seeking ways to reintroduce beavers nearly a decade ago due to drought and hopes to see them relocated later this year. “We’re going to give these beavers a chance to do what they do naturally in a place where they’re wanted,” he said. The state is also hoping to educate people about the benefits of beavers. Rusty Cohn, a 69-year-old retired auto parts businessman, said he knew little about the animals before he spotted chewed trees on a walk through the Northern California city of Napa in a region better known for winemaking than the critters. He later observed beavers building a dam on a trickling stream, converting the area into a lush pond for heron, mink and other species, and became a fan. “It was like a little magical place with an incredible amount of wildlife,” Cohn said. That was eight years ago, he said, adding that beaver sightings in that spot are becoming rarer amid increased development, but he can still find them on streams throughout Napa.
ChatGPT's citation approach may amplify the Matthew Effect in environmental science ChatGPT (GPT) has become one of the most talked-about innovations in recent years, with over 100 million users worldwide. However, there is still limited knowledge about the sources of information GPT utilizes. As a result, we carried out a study focusing on its sources of information within the field of environmental science. Our study, available on the arXiv preprint server, addresses the research question: "Does ChatGPT predominantly cite the most-cited publications in environmental science?" In the study, we asked GPT to identify the ten most significant subdisciplines within the field of environmental science. We then asked it to compose a scientific review article on each subdiscipline, including 25 references, and analyzed those references, focusing on factors such as the number of citations, publication date, and the journal in which the work was published. The findings indicate that GPT tends to cite highly cited publications in environmental science, with a median citation count of 1184.5. It also exhibits a preference for older publications, with a median publication year of 2010, and predominantly refers to well-respected journals in the field, with Nature being the most cited journal. Interestingly, our findings suggest that GPT seems to rely exclusively on citation count data from Google Scholar for the works it cites, rather than utilizing citation information from other scientific databases such as Web of Science or Scopus. The study suggests that Google Scholar citations are a significant predictor of a study being mentioned in GPT-generated content. This finding reinforces the dominance of Google Scholar among scientific databases and perpetuates the Matthew Effect in science, where the rich get richer in terms of citations. 
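The headline statistics above (median citation count, median publication year) are simple order statistics over the references GPT produced. A minimal sketch of that analysis step, using an invented reference list rather than the study's actual data:

```python
from statistics import median

# Hypothetical reference list: (Google Scholar citations, publication year).
# These tuples are invented for illustration; the study itself reports a
# median citation count of 1184.5 and a median publication year of 2010.
refs = [(2400, 2006), (1500, 2009), (869, 2012), (1100, 2010), (300, 2015)]

citations = [c for c, _ in refs]
years = [y for _, y in refs]

print(median(citations))  # prints 1100
print(median(years))      # prints 2010
```

The median is the natural choice here because citation counts are heavily skewed: a single blockbuster paper would drag a mean far upward while barely moving the median.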
With many scholars already utilizing GPT for literature review purposes, we can anticipate further disparities and an expanding gap between lesser-cited and highly-cited publications. More information: Eduard Petiska, ChatGPT cites the most-cited articles and journals, relying solely on Google Scholar's citation counts. As a result, AI may amplify the Matthew Effect in environmental science, arXiv (2023). DOI: 10.48550/arxiv.2304.06794 Provided by CUECR
Alumni Spotlight: Jonathan Rubin Takes Vertical Farming to New Heights This story was originally published by Columbia’s School of International and Public Affairs. Jonathan Rubin’s deep-rooted affinity for nature and the environment blossomed during his formative years in Florida. From volunteering at a sea turtle hospital to embarking on exhilarating bike rides through the awe-inspiring Everglades, he forged an unbreakable bond with the natural world. After moving to Israel — where he studied government, diplomacy, and strategy as an undergraduate — Rubin pursued a career in policy roles, both in the Israeli parliament, known as the Knesset, and in US congressional internships. Looking to combine his political experience with his love for the environment, he enrolled in the one-year MPA in Environmental Science and Policy program, which is offered jointly by Columbia’s School of International and Public Affairs and Climate School. As a student, Rubin was one of the leaders of the Israel Trek, a week-long trip that exposed him to groundbreaking practices such as water recycling, solar farms, and algae farms. These innovative sustainability approaches reinforced the invaluable lessons he learned in the classroom. “At SIPA, a lot of our courses were focused on economics, environmental policy, and biology,” Rubin remembers. “In one course, [adjunct professor] Howard Apsan showed us vertical farms. And I said, ‘OK, let me focus on vertical farms from all these different angles.’ So whenever we had to write papers, instead of focusing on different environmental spheres so large, I focused specifically on vertical farms.” Rubin received Columbia travel grants to further research sustainable farming. Among other things he studied aquaponics, an integrated growing ecosystem where fish and plants coexist harmoniously, with the fish waste serving as a natural fertilizer for the plants. In return, the plants filter and purify the water. 
Rubin’s aquaponics experiment laid the foundation for a hydroponic system, which cultivates plants in a nutrient-rich water solution without the need for soil. In 2021 Rubin launched Fresh Florida Farms, which grows non-GMO hydroponic lettuce, microgreens, sprouts, herbs, and other leafy greens in Boca Raton — supplying fresh products to caterers, restaurants, supermarkets, and food banks in South Florida. Growing crops in vertically stacked layers increases crop yields while reducing the amount of space, water, and energy required compared to traditional agriculture and allows for year-round crop production. Because crops are grown in a controlled environment, there is less need for pesticides and herbicides. Fresh Florida Farms, Rubin says, now has the capacity to produce 100,000 heads of lettuce per year in “a very small space.” The remarkable growth of the vertical farming industry, projected to reach $9.7 billion in revenue by 2026 (up from $3.1 billion in 2021), is a testament to its potential. “There are many parts to being a farmer. Only 30 percent is actually growing the product. A lot has to do with policy, logistical support and dealing with the food safety regulations. Farmers will also spend lots of time researching and collaborating on projects with the USDA.” While vertical farming undeniably benefits the environment, Rubin, with his astute understanding of policy matters, emphasizes its broader geopolitical implications. “Many countries are exploring the viability of developing vertical farms in hope to address rising food costs and national food security threats,” he explains, pointing to UAE and Singapore, which have little farmable land. “Many smaller countries may import over 80 percent of their produce. If there were to be a war and borders were to be closed, people would starve. 
Vertical farms have the potential to lower costs of farming making fresh produce more affordable for the masses.” Rubin is entrepreneurial, for sure — always looking to maximize growth times and space capacity, and he even designed his own automated watering system — but Fresh Florida Farms also has a social mission. Rubin works with special needs students to teach them about the farm and donates surplus crops to local food banks. “I think it’s a beautiful thing to see when the community comes together and is able to help benefit different aspects of the population,” Rubin says. “There are many social benefits, besides environmental benefits from this kind of operation.” He is also eager to maintain his ties to the Columbia community, offering advice to current and future students interested in launching their own sustainability ventures: “Find a professor to mentor you and help guide you in how to get along further, network at events, try to win grants.” And, of course, eat your leafy greens!
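The market projection mentioned above, from $3.1 billion in 2021 to $9.7 billion by 2026, implies a compound annual growth rate that is easy to back out. A minimal sketch, using only the article's figures (the arithmetic is ours, not the industry forecast's):

```python
# Compound annual growth rate (CAGR) implied by the vertical farming
# market figures quoted in the article: $3.1B (2021) -> $9.7B (2026).

start_value = 3.1  # USD billions, 2021
end_value = 9.7    # USD billions, projected 2026
years = 5

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 26% per year
```

A growth rate of roughly a quarter per year, sustained over five years, is what "remarkable growth" cashes out to here.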
Environmental Science
Scientists say an experimental tower in northern China, dubbed the world's biggest air purifier, appears to be working. Researchers at the Institute of Earth Environment at the Chinese Academy of Sciences say they have seen improvements in the air quality over an area of more than three square miles in the past few months. The tower works through greenhouse coverings: polluted air is sucked in, heated up by solar energy, and then circulated through multiple layers of cleaning filters. A number of locals say they have noticed a difference in the air quality, even during the winter, when the city is especially prone to pollution. The experimental tower, over 100 meters (328 feet) high and dubbed the world's biggest air purifier by its operators, has brought a noticeable improvement in air quality, according to the scientist leading the project, as authorities seek ways to tackle the nation's chronic smog problem. The tower has been built in Xian in Shaanxi province and is undergoing testing by researchers at the Institute of Earth Environment at the Chinese Academy of Sciences. The head of the research, Cao Junji, said improvements in air quality had been observed over an area of 10 square kilometers (3.86 square miles) in the city over the past few months, and the tower has managed to produce more than 10 million cubic meters (353 million cubic feet) of clean air a day since its launch. Cao added that on severely polluted days the tower was able to reduce smog close to moderate levels. The system works through greenhouses covering an area about half the size of a soccer field around the base of the tower. Polluted air is sucked into the glasshouses and heated up by solar energy.
The hot air then rises through the tower and passes through multiple layers of cleaning filters. "The tower has no peer in terms of size … the results are quite encouraging," said Cao. Xian can experience heavy pollution in winter, with much of the city's heating relying on coal. The tower's operators say, however, that the system still works in the cold months, as coatings on the greenhouses enable the glass to absorb solar radiation at a much higher efficiency. Cao's team set up more than a dozen pollution monitoring stations in the area to test the tower's impact. Levels of PM2.5 – the fine particles in smog deemed most harmful to health – fell an average of 15 percent during heavy pollution. Cao said the results were preliminary because the experiment is still ongoing. The team plans to release more detailed data in March with a full scientific assessment of the facility's overall performance. The Xian smog tower project was launched by the academy in 2015 and construction was completed last year at a development zone in the Chang'an district. The purpose of the project was to find an effective, low-cost method to artificially remove pollutants from the atmosphere. The cost of the project was not disclosed. What was previously thought to be the largest smog tower in China was built last year by Dutch artist Daan Roosegaarde at 798, a creative park in Beijing. The seven-meter (23-foot) tall tower produced about eight cubic meters (282.5 cubic feet) of clean air per second.
The Smog Free Tower can run on solar energy and does not need to use electricity from coal plants. Cao, however, said the tower in Xian required little power to run. "It barely requires any power input throughout daylight hours. The idea has worked very well in the test run," he said. Several people in Xian told the South China Morning Post they had noticed the difference since the tower started operating. A manager at a restaurant about 1 km (0.62 miles) northwest of the facility said she had noticed an improvement in air quality this winter, although she was previously unaware of the purpose of the tower. "I do feel better," she said. A student studying environmental science at Shaanxi Normal University, also a few hundred meters from the tower, said the improvement was quite noticeable. "I can't help looking at the tower each time I pass. It's very tall, very eye-catching, but it's also very quiet. I can't hear any wind going in or out," she said. "The air quality did improve. I have no doubt about that." However, a teacher at the Meilun Tiancheng Kindergarten on the edge of the 10-square-kilometer (3.86-square-mile) zone said she had felt no change. "It's just as bad as elsewhere," she said. The experimental facility in Xian is a scaled-down version of a much bigger smog tower that Cao and his colleagues hope to build in other cities in China in the future.
A full-sized tower would reach 500 meters (1,640 feet) high with a diameter of 200 meters (656 feet), according to a patent application they filed in 2014. The greenhouses could cover nearly 30 square kilometers (11.6 square miles), and the plant would be powerful enough to purify the air for a small city.
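To put the reported figures in perspective, here is a back-of-the-envelope sketch comparing the tower's daily output with the 10-square-kilometer zone where improvements were measured. The numbers come from the article; the comparison itself is our own illustration, not the researchers' analysis:

```python
# Back-of-the-envelope comparison of the Xian tower's reported output
# (10 million m^3 of clean air per day) with the 10 km^2 zone where
# air quality improvements were observed.

clean_air_m3_per_day = 10_000_000   # reported daily output of the tower
zone_area_m2 = 10 * 1_000_000       # 10 km^2 monitoring zone, in m^2

# The daily output corresponds to an air column of this depth over the zone:
column_depth_m = clean_air_m3_per_day / zone_area_m2
print(f"Equivalent air-column depth treated per day: {column_depth_m:.1f} m")
```

The output works out to about a one-meter-deep layer of air over the zone per day, which suggests the measured PM2.5 reductions reflect continual circulation and mixing rather than a single pass over the whole area; that reading is our inference, not a claim from the research team.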
Environmental Science
SAN DIEGO (AP) — Relentless storms from a series of atmospheric rivers have saturated the steep mountains and bald hillsides scarred from wildfires along much of California’s long coastline, causing hundreds of landslides this month. So far the debris has mostly blocked roads and highways and has not harmed communities as it did in 2018, when mudslides roared through Montecito, killing 23 people and wiping out 130 homes. But more rain is in the forecast, increasing the threat. Experts say California has learned important lessons from the Montecito tragedy: it now has more tools to pinpoint the hot spots, and more basins and nets are in place to capture falling debris before it hits homes. The recent storms are putting those efforts to the test as climate change produces more severe weather. Why is California prone to mudslides? California has relatively young mountains from a geology standpoint, meaning much of its steep terrain is still in motion and covered in loose rocks and soil that can be sloughed off easily, especially when the ground is wet, according to geologists. Almost all of the state has received rainfall totals of 400% to 600% above average since Christmas, with some areas receiving as much as 30 inches of precipitation, causing massive flooding. The severe weather has killed at least 19 people since late December. Since New Year’s Eve, the California Department of Conservation’s landslide mapping team has documented more than 300 landslides. The state’s prolonged drought has made matters worse. Dan Shugar, an associate professor of geoscience at the University of Calgary, said drought can have a counterintuitive effect when combined with the incredible rainfall California has seen in recent days. “You’d think if the ground is dry it should be able to absorb a lot of water, but when ground becomes too dry, the permeability of the ground actually decreases,” he said.
As water runs off the hardened soil, moving downward and picking up energy, it can begin carrying soil and debris away, he said. Added to that, wildfires have left some hillsides with little to no vegetation to hold the soil in place. What are the most vulnerable areas? The most vulnerable areas are hillsides that have burned in the past two to three years with communities below them, said Jeremy Lancaster, who leads the California Department of Conservation’s geological and landslide mapping team. That includes areas that recently burned in Napa, Mariposa, and Monterey counties, he said. In 2018, the deadly mudslides in Montecito occurred about a month after one of the largest fires in California’s history tore through the same area, charring 280,000 acres. Montecito is sandwiched between the Santa Ynez mountains and the Pacific coast. On the fifth anniversary of that tragedy, the entire community was ordered to evacuate on Jan. 9 as rains pummeled the area and debris blocked roads. Lancaster warned that the threat of landslides will linger long after the rains have subsided as the water seeps 50 to 100 feet into the soil, dislodging material. “They can occur weeks later, if not months,” he said. What can be done to protect communities? Lancaster said California has dramatically increased its efforts to identify hot spots since the Montecito mudslides. His department continually updates its map so local communities are aware and can make decisions, including whether to evacuate an entire community. The state is also working on a system to better pinpoint how much rain might trigger a landslide.
Marten Geertsema, who studies natural hazards and terrain analysis at the University of Northern British Columbia, said agencies use a variety of tools to gauge the likelihood of landslides in a given area, including terrain maps and lidar, which uses pulsed laser light to penetrate foliage and map the ground. They can then watch for early warnings, such as changes over time in photos taken from the air or from satellites, or in data from GPS monitoring stations, tilt meters, and other on-site instrumentation. What is the most effective defense against mudslides? One of the best ways to manage landslides is with debris basins – pits carved out of the landscape to catch material flowing downhill. But basins, which can require a lot of land, can also disrupt the natural ecosystem: by trapping sediment that would otherwise flow out of the canyons, they can leave beaches needing to be replenished, according to experts. They are also costly, said Douglas Jerolmack, a professor of environmental science and mechanical engineering at the University of Pennsylvania. And if old debris isn’t removed, basins can be overwhelmed by new landslides or mudslides. Some might also not be big enough to deal with future slides worsened by climate change, Jerolmack said. After the 2018 mudslides hit Montecito, the Los Angeles Times reported that debris basins above the community were undersized and hadn’t been sufficiently emptied. The tragedy galvanized the community, which raised millions to address the problem, said Patrick McElroy, a retired Santa Barbara fire chief who founded the nonprofit organization The Project for Resilient Communities. The organization hired an engineering company to map the canyons and installed debris nets. He said the recent storms put them to the test: one net measuring 25 feet tall filled nearly to capacity.
McElroy said he’s still haunted by memories from 2018 but feels better knowing that the community might be safer now. “I’m not over it yet. But to wake up, you know, the other day and see no injuries and no fatalities. I just can’t tell you how impressed I am,” he said of the nets. The best solution for the Montecito and Santa Barbara area is to have both nets and debris basins, according to Larry Gurrola, the engineering geologist hired by the organization. But nothing is cheap. Santa Barbara County spent $20 million on a new basin after 2018, while McElroy’s organization spent close to $2 million on installing the nets, a figure that includes liability insurance and other fees. They have a five-year permit for the nets, which will be removed if it is not renewed. Gurrola said the alternative is more costly. With the recent storms, more than half of California’s 58 counties have been declared disaster areas, and repairing the damage may cost more than $1 billion. “Most importantly these things protect the community and save lives,” he said. Glass reported from Minneapolis.
Environmental Science
The total amount of microplastics deposited on the bottom of oceans has tripled in the past two decades, with a progression that corresponds to the type and volume of consumption of plastic products by society. This is the main conclusion of a study developed by the Institute of Environmental Science and Technology of the Universitat Autònoma de Barcelona (ICTA-UAB) and the Department of the Built Environment of Aalborg University (AAU-BUILD), which provides the first high-resolution reconstruction of microplastic pollution from sediments obtained in the northwestern Mediterranean Sea. Although the seafloor is considered the final sink for microplastics floating on the sea surface, the historical evolution of this pollution in the sediment compartment, and particularly the sequestration and burial rate of smaller microplastics on the ocean floor, was unknown. This new study, published in the journal Environmental Science and Technology (ES&T), shows that microplastics are retained unaltered in marine sediments, and that the microplastic mass sequestered in the seafloor mimics global plastic production from 1965 to 2016. "Specifically, the results show that, since 2000, the amount of plastic particles deposited on the seafloor has tripled and that, far from decreasing, the accumulation has not stopped growing, mimicking the production and global use of these materials," explains ICTA-UAB researcher Laura Simon-Sánchez. The researchers explain that the sediments analysed have remained unaltered on the seafloor since they were deposited decades ago. "This has allowed us to see how, since the 1980s, but especially in the past two decades, the accumulation of polyethylene and polypropylene particles from packaging, bottles and food films has increased, as well as polyester from synthetic fibres in clothing fabrics," explains Michael Grelaud, ICTA-UAB researcher.
The amount of these three types of particles reaches 1.5 mg per kilogram of sediment collected, with polypropylene being the most abundant, followed by polyethylene and polyester. Despite awareness campaigns on the need to reduce single-use plastic, data from annual marine sediment records show that we are still far from achieving this; policies at the global level could contribute to improving this serious problem. Although smaller microplastics are very abundant in the environment, constraints in analytical methods have limited robust evidence on their levels in previous studies targeting marine sediment. In this study, they were characterised by applying state-of-the-art imaging to quantify particles down to 11 µm in size. The degradation status of the buried particles was investigated, and it was found that, once trapped in the seafloor, they no longer degrade, whether due to lack of erosion, oxygen, or light. "The process of fragmentation takes place mostly in the beach sediments, on the sea surface or in the water column. Once deposited, degradation is minimal, so plastics from the 1960s remain on the seabed, leaving the signature of human pollution there," says Patrizia Ziveri, ICREA professor at ICTA-UAB. The investigated sediment core was collected in November 2019, on board the oceanographic vessel Sarmiento de Gamboa, in an expedition that went from Barcelona to the coast of the Ebro Delta, in Tarragona, Spain. The research group selected the western Mediterranean Sea as a study area, and in particular the Ebro Delta, because rivers are recognized as hotspots for several pollutants, including microplastics. In addition, the influx of sediment from the Ebro River provides higher sedimentation rates than in the open ocean. Simon-Sánchez, L., Grelaud, M., Lorenz, C., Garcia-Orellana, J., Vianello, A., Liu, F., Vollertsen, J., Ziveri, P. (2022). Can a Sediment Core Reveal the Plastic Age?
Microplastic Preservation in a Coastal Sedimentary Record. Environmental Science & Technology. https://doi.org/10.1021/acs.est.2c04264 ICTA-UAB's strategic research program, promoted within the framework of the María de Maeztu Unit of Excellence 2020-2023 granted by the Spanish Ministry of Science and Innovation, is structured around five interrelated Societal Challenges, focused on Oceans, Land, Cities, Consumption, and Policies. Investigating these Societal Challenges is critical to envision a transition towards a sustainable Earth.
Environmental Science
In an article published in the Proceedings of the National Academy of Sciences, Michael Mann, professor in the Department of Earth and Environmental Science in the University of Pennsylvania's School of Arts & Sciences, and colleagues from Clemson University, the University of California Los Angeles, and Columbia University investigate the effects of climate change on exacerbating compounding heat and drought situations. Their findings offer new insights into predicting their interplay, which will provide scientists and policymakers with a clearer and more holistic approach to preventing and preparing for extreme-weather events. "We wanted to see how the state-of-the-art climate models used in the most recent assessment reports of the Intergovernmental Panel on Climate Change address the episodes of heat waves and droughts that have given rise to some of the worst wildfires we've witnessed in recent history," Mann says. "We also wanted to get a better understanding of how often these events were occurring, their typical durations, and their intensity to improve not only our forecasting but approaches to mitigating further damage to human life." Compound drought and heat wave events and their effects The researchers document the deleterious effects of increasingly severe droughts and wildfires occurring in the past three years. "Two standout events," Mann says, "were the 2020 California wildfires and the 2019-20 Australian bush fire season, which lasted nearly one whole year and came to be known as the Black Summer. These are known as compound drought and heat wave (CDHW) events and refer to situations wherein a region experiences both prolonged hot temperatures and a shortage of water." These conditions can occur together and worsen each other's impacts, the researchers say, and could potentially lead to heat-related illnesses and deaths, water scarcity for drinking and agriculture, reduced crop yields, increased wildfire risk, and ecological stress. 
They also note that anthropogenic climate change -- climate change that is driven by human activity -- can contribute to the frequency and severity of these events. Projected impact of a worst-case versus moderate-case scenario The researchers compared two contrasting socioeconomic pathways: the high-end or worst-case scenario, wherein society fails to mitigate the effects of anthropogenic climate change, and a moderate scenario, wherein some conservative measures are put in place and efforts are made to abide by them. In the worst-case scenario, they found that by the late 21st century approximately 20% of global land areas are expected to witness approximately two CDHW events per year. These events could last around 25 days and show a fourfold increase in severity. "Comparatively, the average CDHW frequency over the recent observed reference period was approximately 1.2 events per year, lasting less than 10 days, with far less severity," Mann says. The most vulnerable geographical regions, such as eastern North America, southeastern South America, Central Europe, East Africa, Central Asia, and northern Australia, are projected to experience the largest increases in CDHW frequency by the end of the 21st century. "Interestingly, places like Philadelphia and some of the regions in the eastern U.S. are where we expect to see an increase in these sorts of events; urban environments in the summertime will witness the highest relative frequency of these events," Mann says. Critical need for proactive measures The researchers emphasize the profound threat posed by more frequent and intense CDHW events in the coming decades and how strongly the severity of these events depends on the emissions pathway chosen. As climate change continues to unfold, addressing the escalating risks associated with CDHW events becomes crucial.
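One way to read the frequency and duration figures above is as cumulative CDHW days per year. The sketch below multiplies frequency by duration using only the numbers quoted in the article; this combined "days per year" metric is our own illustration, not a quantity defined in the study:

```python
# Cumulative CDHW exposure implied by the article's figures.
# "CDHW days per year" (frequency x duration) is an illustrative
# metric of ours, not one defined in the study.

baseline_events, baseline_days = 1.2, 10    # recent reference period ("less than 10 days")
projected_events, projected_days = 2.0, 25  # worst-case scenario, late 21st century

baseline_total = baseline_events * baseline_days
projected_total = projected_events * projected_days

print(f"Baseline:  about {baseline_total:.0f} CDHW days per year")
print(f"Projected: about {projected_total:.0f} CDHW days per year")
print(f"Increase:  about {projected_total / baseline_total:.1f}x, before counting severity")
```

On these figures, affected regions go from roughly 12 to roughly 50 compound drought-and-heat-wave days per year, a more than fourfold rise even before the projected increase in severity is counted.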
This study contributes to the growing understanding of the projected changes in CDHWs and highlights the need for proactive measures, including emission reductions and adaptation strategies, to build resilience and safeguard vulnerable regions from the impacts of compound drought and heat wave events. "Our findings provide important scientific context for the record heat and wildfire that we're witnessing right now here in the United States," Mann says. "They underscore that we need to get off fossil fuels as quickly as possible to prevent a worsening of these dangerous combinations of heat and drought." Michael E. Mann is the inaugural Presidential Distinguished Professor in the Department of Earth and Environmental Science in the School of Arts & Sciences at the University of Pennsylvania, director of the Penn Center for Science, Sustainability, and the Media, and holds a secondary appointment in the Annenberg School for Communication. This work was supported by the National Science Foundation (Grant 1653841) and National Oceanic and Atmospheric Administration Modeling, Analysis, Prediction, and Planning (Grant NA 190AR4310278).
Environmental Science
Left and right figures show warming in Europe of the summer half-year during the latest four decades, subdivided for clear-sky and all-sky conditions, respectively. Credit: Paul Glantz/Stockholm University The warming during the summer months in Europe has been much faster than the global average, shows a new study published in the Journal of Geophysical Research Atmospheres. As a consequence of human emissions of greenhouse gases, the climate across the continent has also become drier, particularly in southern Europe, leading to worse heat waves and an increased risk of fires. According to the UN’s Intergovernmental Panel on Climate Change (IPCC), warming over land areas occurs significantly faster than over oceans, with 1.6 degrees Celsius and 0.9 degrees Celsius on average, respectively. It means that the global greenhouse gas emissions budget to stay under a 1.5-degree Celsius warming on land has already been used up. Now, a new study by researchers at Stockholm University shows that the emissions budget to avoid a 2-degree Celsius warming over large parts of Europe during the summer half-year (April-September) has also been used up. In fact, measurements reveal that the warming during the summer months in large parts of Europe during the last four decades has already surpassed two degrees Celsius. Left and right figures show changes in sensible and latent heat fluxes, respectively, in Europe during the latest four decades of the summer half-year. Credit: Paul Glantz/Stockholm University “Climate change is serious as it leads to, among other things, more frequent heat waves in Europe. These, in turn, increase the risk of fires, such as the devastating fires in southern Europe in the summer of 2022,” says Paul Glantz, Associate Professor at the Department of Environmental Science, Stockholm University, and main author of the study. In southern Europe, a clear positive feedback caused by global warming is evident, i.e.
warming is amplified due to drier soil and decreased evaporation. Moreover, there has been less cloud coverage over large parts of Europe, probably as a result of less water vapor in the air. “What we see in southern Europe is in line with what the IPCC has predicted, which is that an increased human impact on the greenhouse effect would lead to dry areas on Earth becoming even drier,” says Paul Glantz. Left and right figures show the decrease in clouds in Europe during the latest four decades, for the summer half-year, with respect to low-level clouds and the total amount of clouds throughout the atmosphere, respectively. Credit: Paul Glantz/Stockholm University Impact of aerosol particles The study also includes a section about the estimated impact of aerosol particles on the temperature increase. According to Paul Glantz, the rapid warming in, for example, Central and Eastern Europe is first and foremost a consequence of human emissions of long-lived greenhouse gases, such as carbon dioxide. But since emissions of short-lived aerosol particles from, for example, coal-fired power plants have decreased greatly over the past four decades, the combined effect has led to an extreme temperature increase of over two degrees Celsius. Coal power plants on Earth emit over 12 Gt of carbon dioxide each year, nearly one-third of total carbon dioxide emissions; they therefore constitute the single largest source of global warming. Coal power plants also emit sulfur dioxide, which forms aerosols in the atmosphere. Coal power plants have decreased and increased substantially in Europe and East Asia, respectively, during the latest four decades. Credit: Tomasz Matuszewski/Mostphotos “The airborne aerosol particles, before they began to decrease in the early 1980s in Europe, have masked the warming caused by human greenhouse gases by just over one degree on average for the summer half-year.
As the aerosols in the atmosphere decreased, the temperature increased rapidly. Human emissions of carbon dioxide are still the biggest threat as they affect the climate for hundreds to thousands of years,” says Paul Glantz. According to Paul Glantz, this effect provides a harbinger of future warming in areas where aerosol emissions are high, such as in India and China. Warming, and the consequently drier conditions in Europe, particularly in southern countries, raise the risk of fires. The 2022 report “Spreading like wildfires: the rising threat of extraordinary landscape fires,” by the UN Environment Programme (UNEP) and GRID-Arendal (a UNEP partner), concludes that climate change and land-use change are making wildfires worse. The report anticipates a global increase of extreme fires, even in areas previously unaffected. https://www.unep.org/resources/report/spreading-wildfire-rising-threat-extraordinary-landscape-fires Background facts — The greenhouse effect and aerosol effect Fossil fuel burning leads to the release of both aerosol particles and greenhouse gases. Although their source is common, their effects on climate differ. About the greenhouse effect Greenhouse gases are largely unaffected by solar radiation while they absorb infrared radiation efficiently, leading to re-emission towards the Earth’s surface. The Earth absorbs both solar radiation and infrared radiation, which leads to the warming of the lower part of the atmosphere in particular. Time-space: Greenhouse gases are generally long-lived in the atmosphere, and this applies above all to carbon dioxide, where human emissions affect the climate for hundreds to thousands of years. It also means that greenhouse gases spread evenly over the entire planet.
About the aerosol effect In contrast to greenhouse gases, aerosol particles affect incoming solar radiation, i.e. they scatter part of the sunlight back into space, causing a cooling effect. Human emissions of aerosols can enhance this cooling effect. Time-space: Airborne human aerosol particles have a lifetime of about a week, which means that they mainly cool the climate locally or regionally and in the short term. According to the Paris Agreement, all parties must commit to drastically reduce their greenhouse gas emissions, but it is also important to decrease concentrations of aerosol particles because, in addition to their effects on climate, aerosol particles in polluted air cause approximately eight million premature deaths each year around the world. Reference: “Unmasking the Effects of Aerosols on Greenhouse Warming Over Europe” by P. Glantz, O. G. Fawole, J. Ström, M. Wild and K. J. Noone, 4 November 2022, Journal of Geophysical Research Atmospheres. DOI: 10.1029/2021JD035889 Funding: Svenska Forskningsrådet Formas (Formas)
Environmental Science
Sea level rise is transforming the U.S. coastline across the country, but researchers have noticed that the rate of sea level rise has increased faster in the last decade around the Gulf and Southeastern coasts. In a new study published in Nature Communications, researchers from Tulane University found rates of sea level rise of about 10 mm (0.4 inches) per year around Gulf states and the Southeast since 2010. They compared a combination of field and satellite measurements from 1900 to 2021 and noticed record rates of sea level rise in the last 12 years. Researchers described the accelerated rate as “unprecedented in at least 120 years.” A little under half an inch of sea level rise per year may seem small, but global average sea level has risen by only about 0.14 inches a year since the early 1990s, according to the Environmental Protection Agency. They also noticed that the acceleration spans from the Gulf of Mexico up to Cape Hatteras in North Carolina. Higher-than-average rates of sea level rise were also recorded in the Caribbean. The team examined different factors that could have affected ice-mass loss and air pressure in the region but couldn’t connect those to the sea level rise in the Gulf and Southeast. Because of this, they concluded that the rise is a result of human-caused climate change combined with natural variability in the ocean. Over the last 12 years, the accelerated rates of sea level rise have correlated with the expansion of the subtropical gyre that spans the North Atlantic Ocean and the Caribbean, according to the study. Gyres are systems of rotating ocean currents. Ocean warming and changing wind patterns have altered currents in the gyre, according to the study. Warming water expands, which contributes to some of the rapid sea level rise. It is not clear whether, or how, the ocean variability will continue. The observed rate of sea level rise has worsened coastal flooding, a trend already set in motion by climate change.
“These high rates of sea-level rise have put even more stress on these vulnerable coastlines, particularly in Louisiana and Texas where the land is also sinking rapidly,” Torbjörn Törnqvist, co-author and an environmental science professor at Tulane, said in a statement. Törnqvist is right. Gulf states like Louisiana are struggling with erosion. According to data from the U.S. Geological Survey, that state lost an estimated 2,000 square miles of land between 1932 and 2016, an area larger than the state of Rhode Island. According to the Texas General Land Office, 64% of the Texas coast is eroding at about 6 feet per year, and the average rate of erosion for that state is about 4 feet per year. Sea level rise has been a known risk for Florida’s coast for a long time, but flooding and king tides are projected to happen more often in the Sunshine State in the near future. Increased flooding has also messed with the real estate market, as homeowners in flood zones could see property values plunge. “Results, once again, demonstrate the urgency of the climate crisis for the Gulf region. We need interdisciplinary and collaborative efforts to sustainably face these challenges,” said Sönke Dangendorf, study author and a Tulane professor.
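The rate comparison above can be sanity-checked with a quick unit conversion. A minimal sketch, using only the two figures quoted in the article (10 mm/yr regional, 0.14 in/yr global); the variable names are illustrative:

```python
# Compare the Gulf/Southeast rate (~10 mm/yr since 2010, per the study)
# with the global average since the early 1990s (~0.14 in/yr, per the EPA).
MM_PER_INCH = 25.4

gulf_rate_mm = 10.0                         # mm per year
gulf_rate_in = gulf_rate_mm / MM_PER_INCH   # convert to inches per year
global_rate_in = 0.14                       # inches per year

ratio = gulf_rate_in / global_rate_in
print(f"{gulf_rate_in:.2f} in/yr, about {ratio:.1f}x the global average")
```

The conversion confirms the article's rounding: 10 mm is about 0.39 inches, i.e. roughly "0.4 inches", and nearly three times the cited global rate.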
Environmental Science
A million hectares of kelp forests need planting by 2040, and scientists are asking for help Marine ecologists from UNSW Sydney working to regrow kelp seaweed are calling on the public to participate in a global challenge to restore a million hectares of lost underwater forest by 2040. The researchers are terming it The Kelp Forest Challenge to raise awareness of kelp forests and challenge everyone to work together towards a unified target. Individuals and businesses can share their time, resources, and expertise to help meet the goal. The target area of kelp—which equates to an area about a sixth of the size of Tasmania—needs to be regrown in the next two decades to reverse a decline that has seen up to 95% of the canopy disappear in places like Tasmania and California. The grassroots movement also aims to protect an additional three million hectares of existing kelp forests in the same timeframe. While similar habitats like coral reefs often get more attention, kelp forests are among the most distinctive and productive ecosystems on Earth. These vibrant underwater jungles of brown algae live in shallow waters off a third of the world's coastlines and are incredible hubs of biodiversity. But threats like climate change and pollution have brought some kelp forests to the brink of extinction. "Terrestrial forests and coral reefs have often been the focus of much-needed protection and restoration in recent years, but kelp forests are just as vital and are disappearing by the minute," says Dr. Aaron Eger, Founder/Program Director of the Kelp Forest Alliance (KFA) and a marine ecologist from the School of Biological, Earth & Environmental Science, UNSW Science. "On land, we have powerful high-level initiatives like the Bonn Challenge to restore deforested landscapes," says Professor Adriana Vergés, a marine ecologist from the School of Biological, Earth & Environmental Science, UNSW Science, and a KFA director.
"The Kelp Forest Challenge represents an equivalent ambitious target to protect and revitalize our underwater forests." Dr. Eger says while a pledge can be monetary, any positive contribution towards kelp forest conservation projects can count. "This initiative aims to encourage and facilitate positive actions and communities that can protect what is remaining and restore what has been lost with an ambitious shared vision for ensuring our kelp forests and the benefits they provide thrive into the future," Dr. Eger says. "We have media and marketing companies working to help promote kelp forests, dive companies loaning the needed equipment, and aquaculture groups helping produce seed stock." A coastal restoration movement According to the Alliance's best estimates, restoring 1 million hectares of lost kelp forest will require an initial investment of $40 billion but will produce tens of billions of dollars each year through a coastal restoration industry comprising fisheries, blue carbon, and tourism. "If we are successful, we can restore billions of dollars in ecosystem services, create hundreds of thousands of jobs, and rebalance the ocean to a place of abundance and beauty," Dr. Eger says. While most restoration projects to date have taken place on less than a hectare, larger-scale restoration is becoming more viable. Twenty organizations have already made pledges, including a 30,000-hectare restoration project in South Korea. "We have seen an acceleration in the size of projects and scale of success above 100 hectares now," Dr. Eger says. "As methods like transplanting are refined, knowledge is shared, and economies of scale emerge, the feasibility of this work will increase, and this project will only help accelerate it." The Kelp Forest Challenge targets were developed from intensive consultation among experts and those living and working in kelp ecosystems worldwide. It is led by the KFA, a UNSW-supported, research-driven not-for-profit founded by Dr. Eger that brings together 450 kelp forest experts from 25 countries to accelerate the protection and restoration of kelp forests worldwide. "We compiled the best available information on the known distribution of kelp, their past declines, the costs of restoration, and the technical capacity to carry out restoration to propose this target value. We then considered how these scenarios fit in with other global initiatives to protect and restore ecosystems, like the UN Decade for Ecosystem Restoration," Dr. Eger says. The targets also align with the newly announced Kunming-Montreal Global Biodiversity Framework to protect and restore ecosystems. The potential for kelp to sequester carbon dioxide means that it may also contribute to commitments under the Paris Agreement. "We would love to work with governments to achieve these targets and mobilize funding for restoration projects," Dr. Eger says. "But we do not have time to wait for an international coalition to form and create a target, the need to act is too pressing." While restoring kelp forests alone cannot halt climate change, any progress towards their restoration is still a positive contribution. "We further hope that the positive action we are putting forward helps shift the conversation to a more equitable and sustainable future, not just for kelp forests but all ecosystems under threat," Dr. Eger says. "Alone we might be a drop, but together, we are an ocean."
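Taken at face value, the Alliance's estimate implies a back-of-the-envelope average cost per hectare. A rough sketch using only the two figures quoted above ($40 billion initial investment, 1 million hectares); this derived per-hectare figure is not stated in the article:

```python
# Back-of-the-envelope calculation from the Kelp Forest Alliance estimates:
# $40 billion initial investment to restore 1 million hectares by 2040.
total_investment_usd = 40e9   # $40 billion
target_area_ha = 1e6          # 1 million hectares

cost_per_ha = total_investment_usd / target_area_ha
print(f"Implied average cost: ${cost_per_ha:,.0f} per hectare")
```

Actual per-project costs will vary widely with method, location, and scale, which is part of why the Alliance expects economies of scale to matter.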
Environmental Science
Gas stoves raise indoor levels of cancer-causing benzene, study finds Using gas-powered stoves and ovens can raise the levels of the cancer-causing chemical benzene in a home, according to a new study. The study, published in the journal Environmental Science and Technology, looked at 87 stoves in California and Colorado. Researchers found that gas and propane stoves and ovens emitted significantly more benzene than electric alternatives. Long-term exposure to benzene can cause blood cancer, according to the Centers for Disease Control and Prevention. Gas stoves have previously been found to leak benzene and have also been linked to childhood asthma. Whether to regulate gas-powered appliances has been a topic of debate in Washington, particularly after one regulator floated a ban earlier this year. Many Republicans have been particularly staunch opponents of bans or regulations on gas stoves, while the White House has also said that it opposes a ban. Looking at a subset of 17 homes with 33 distinct burners or ovens, the researchers found that setting the gas or propane burners on high or the ovens to 350 degrees Fahrenheit for 45 minutes raised benzene levels in all kitchens. They also found that in 29 percent of cases — judging each burner or oven as an individual case — kitchen benzene concentrations were above the chemical levels found in secondhand tobacco smoke. That’s not to say that the health impacts are the same as secondhand smoke, however, since tobacco has a different makeup. Tobacco smoke contains a number of other carcinogens such as arsenic and formaldehyde. Nevertheless, lead author Yannai Kashtan said that the findings do suggest that there could be an elevated cancer risk. “If you’re breathing in benzene you’re at an elevated risk,” said Kashtan, a graduate student at Stanford’s Doerr School of Sustainability.
“The World Health Organization says that from a cancer point of view there is no totally safe level of exposure.” The study also looked at bedrooms in six of the homes. After setting ovens to 475 degrees Fahrenheit for 90 minutes and measuring for several hours afterwards, the researchers found elevated concentrations in the bedrooms as well. Asked for comment, a leading gas industry group said that it is reviewing the study. “As we have with every study related to the use of natural gas and the health and safety of customers and communities, we are taking the time to evaluate this study to understand its methodology and the merits of its findings,” said Karen Harbert, CEO and president of the American Gas Association, in a written statement. “Customers deserve access to transparent information and sound science to help make decisions about the health and safety of their families, and the natural gas industry continues to contribute objective, thorough and meticulous scientific analysis,” Harbert added.
Environmental Science
A huge ash plume rises into the sky as the Lava Fire explodes in Weed, California, on July 1, 2021. Photo: Josh Edelson/AFP (Getty Images) Regional wildfire smoke is significantly lowering air quality for millions of people across the country. A new study published in Environmental Science & Technology found that wildfires are creating more air pollution every year. They especially create particle pollution known as PM2.5, which can cause short-term health concerns like nose, eye, and lung irritation. Long-term exposure can create or worsen existing respiratory and cardiovascular issues. According to the study, about 25 million people were exposed to wildfire particles during an especially bad day in 2020. “So this is just people exposed on what we call an extreme day of wildfire smoke,” Marshall Burke, one of the study’s co-authors and a Stanford University scientist, told Earther. The number of people in the U.S. exposed to dangerous levels of wildfire-related particles, which the study categorized as PM2.5 concentrations reaching at least 100 micrograms per cubic meter, has increased more than 27-fold in the last decade. The number of people exposed to extreme levels of pollution from wildfire smoke grew 11,000-fold during that time; these days were categorized by PM2.5 concentrations that reached 200 micrograms per cubic meter of air. Burke and other researchers looked at satellite images of wildfire smoke from 2006 to 2020. They compared the images to data from air quality monitors to see if the spikes in pollution coincided with wildfire smoke.
But air quality monitors are not distributed equally across the country, so the scientists filled in data gaps by training a machine learning model to use satellite data to accurately predict PM2.5 concentrations. They also found that high-income communities and Hispanic communities have been disproportionately impacted by particles from wildfire smoke. Burke explained that this reflects demographics in heavily impacted states, like California: there are a lot of Hispanic communities in that state, along with extremely wealthy zip codes along the West Coast that are being impacted by wildfires. Researchers worry that the increased air pollution from wildfires is reversing the strides the U.S. has made in improving air quality since the passing of pollution regulations in the 1970s. “We’ve been really successful in reducing pollution from point sources—from factories, from tailpipes, energy producing units. Wildfire is a whole different animal,” Burke said. “It’s not regulated by the Clean Air Act, and it’s a growing source of pollution.” The worst pockets of polluted air are unsurprisingly in western states. The climate crisis has fueled conditions that have made wildfires increasingly destructive. There’s an ongoing megadrought drying out major water sources like the Colorado River. This year has also seen several dangerous heat waves, creating even drier and hotter conditions. But the increased intensity and frequency of wildfires has far-reaching consequences. “As you move east, out of the Rockies into the Midwest, we still see the impact of wildfire smoke on air quality there,” Burke said.
“These impacts are smaller, but they still exist… this is not just a West Coast problem anymore.” Wildfire smoke can travel even farther than the Midwest. For example, last July, New York’s air quality was one of the worst in the world after wildfire smoke from the West Coast traveled thousands of miles. Some of the smoke reached as far east as Greenland that week. As wildfire seasons become worse over time, Burke and other researchers are working to answer questions that came up as they compiled the recently released study. He hopes to find out just how much pollution wildfires are creating and how much it is rolling back air quality improvements in the U.S.
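The study's two smoke-day categories described above can be expressed as a simple threshold check. A minimal sketch: the 100 and 200 µg/m³ cut-offs are the ones reported in the article, while the function name and labels are illustrative:

```python
def classify_smoke_day(pm25_ug_m3: float) -> str:
    """Categorize a day by wildfire-smoke PM2.5 concentration,
    using the thresholds cited in the study: 'dangerous' at
    100 ug/m3 and 'extreme' at 200 ug/m3."""
    if pm25_ug_m3 >= 200:
        return "extreme"
    if pm25_ug_m3 >= 100:
        return "dangerous"
    return "below study thresholds"

# Example readings in micrograms per cubic meter:
for level in (35, 120, 250):
    print(level, "->", classify_smoke_day(level))
```

In the study itself these categories were applied to daily concentration estimates, with gaps between monitors filled in by the machine-learning model described above.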
Environmental Science
SYDNEY, Nov 4 (Reuters) - A former chef turned farmer has begun to supply Sydney restaurants with sustainable herbs and microgreens grown in a carpark beneath the city's harbourside business district. Noah Verin set up his Urban Green business in Sydney's Barangaroo district in early 2020 with around 40 different plant species growing side by side. Now, he is riding an industry push to make sustainability a top menu ingredient. "I always knew that when people heard the story of the fact there's a farm in a basement in Barangaroo growing food... I knew it would leave an impact," Verin, who also holds an environmental science degree, told Reuters. While vertical farms have been seen as a potential answer to the food crisis, Verin said the conversation has now shifted to how those same farms can also be sustainable. "There's no point in setting up a farm to help solve these problems if we are not also creating sustainable farms," he said, ahead of the 2022 United Nations Climate Change Conference, COP27, to be held from Nov. 6 in Sharm El Sheikh, Egypt. Verin is pushing to make his business fully sustainable and aims to make Urban Green carbon neutral by 2026. So far, he has halved his power usage from LED lights, while the fibre he grows plants in - coconut coir - is a byproduct of the coconut industry.
He is shifting towards e-bike delivery and fully biodegradable plant pots so the business can be plastic free. "Noah's product comes in still alive, still in its pot, and he doesn't use a lot of plastics or any throwaway products like that, so it's all very sustainable, which I like," head chef Logan Campbell of Sydney restaurant Botswana Butchery told Reuters. Verin aims to one day open carpark farms for products such as chilis and strawberries, and more carpark microgreen and herb farms. "We want to derive a minimum 50 percent of our deliveries within a one kilometre radius of the farm because that's a major advantage... we are surrounded by hundreds and hundreds of food service restaurants," he said. Reporting by Stefica Nicol Bikes; editing by Melanie Burton and Tomasz Janowski.
Environmental Science
Tijuana River sewage may be contaminating air along Southern California coast: study Chronic coastal contamination from the Tijuana River can end up in the atmosphere as “sea spray aerosol” — spreading far beyond the San Diego County beaches where it has long polluted the water, a new study has found. For decades, storms occurring along the U.S.-Mexico border have been diverting sewage through the Tijuana River and into the ocean in south Imperial Beach, according to the study, published on Thursday in Environmental Science & Technology. But researchers have now determined that sewage-polluted coastal waters can transfer to the atmosphere as aerosol — generated “by breaking waves and bursting bubbles.” And while the level of threat to human health remains uncertain, this so-called “sea spray aerosol” contains bacteria, viruses and chemical compounds. “We’ve shown that up to three-quarters of the bacteria that you breathe in at Imperial Beach are coming from aerosolization of raw sewage in the surf zone,” Kim Prather, an atmospheric chemist at the University of California San Diego’s Scripps Institution of Oceanography, said in a statement. While residents typically consider coastal pollution to be a waterborne issue, Prather warned that there may be additional cause for concern beyond the swimming and surfing communities. Contaminated coastal waters have been found to cause more than 100 million annual illnesses worldwide, but such pollution could also be reaching an uncounted number of individuals through the air, according to the study. “Aerosols can travel long distances and expose many more people than those just at the beach or in the water,” Prather said. The findings are particularly timely after wet winter weather last week exacerbated a sewage spill that had recently occurred across the border. 
In addition to coping with storm-related wastewater issues, the region has long been enduring infrastructural breakdowns that shuttle Tijuana’s sewage toward San Diego. The Tijuana River Watershed originates in the U.S., before crossing the border into Mexico and then returning to California. A pipeline rupture near Tijuana last month led Mexican authorities to shutter pumping stations in its water conveyance systems for two weeks — creating a “transboundary flow” of wastewater, the International Boundary and Water Commission explained at the time. While that acute problem was temporarily resolved late last week, a storm had already begun by the time the initial repair concluded. Since Dec. 28 alone, an estimated 13 billion gallons of sewage-polluted waters have flowed into the ocean from the Tijuana River, according to Prather. On Thursday, the International Boundary and Water Commission updated that flow to 20 billion gallons. The atmospheric chemist and her colleagues sampled coastal aerosols at Imperial Beach and water from the Tijuana River between January and May 2019. The scientists then used DNA sequencing and mass spectrometry to connect bacteria and chemical compounds in sea spray aerosol back to the polluted river water. They are now conducting follow-up research to identify viruses and other airborne pathogens. Although the authors identified the presence of certain bacteria and chemicals in the sea spray, they stressed that this does not necessarily mean people are getting sick from their exposure, as most bacteria and viruses are harmless. Meanwhile, the presence of bacteria in these aerosols does not automatically indicate that pathogenic microbes become airborne, according to the scientists. Characteristics like infectivity, exposure levels and other risk factors require further investigation, they concluded. 
“More research is necessary to determine the level of risk posed to the public by aerosolized coastal water pollution,” lead author Matthew Pendergraft, a recent PhD recipient from Prather’s group, said in a statement. “These findings provide further justification for prioritizing cleaning up coastal waters,” Pendergraft added.
Environmental Science
People dealing with the most socioeconomic disadvantages in greater Los Angeles also face higher levels of toxic air pollution, according to a new UCLA-led study. Researchers collected air samples from 54 locations over two-week periods in September 2019 and February 2020, and then analyzed the samples to determine how much PM 2.5 pollution was present, and how toxic it was. PM 2.5 refers to particles smaller than 2.5 microns, which can penetrate deep into lungs. The paper, published in the journal Environmental Science and Technology, found that air from census tracts in the 25% of communities facing the most socioeconomic disadvantages not only contained a greater amount of pollution, but that the pollution in these areas was more toxic. “Overall, people living in these places experience about 65% higher toxicity than people in the most advantaged group,” said Suzanne Paulson, the senior author of the study and a UCLA professor of atmospheric and oceanic sciences. Based on a combination of socioeconomic factors, researchers sorted the communities into quartiles, from those with the least socioeconomic advantages to those with the most. They found that the amount of dangerous pollution decreased as socioeconomic advantages increased. Air toxicity was measured by its ability to induce oxidative stress. It has long been known that exposure to particles in air pollution contributes to an increased risk for a wide range of cardiovascular, developmental, metabolic and respiratory diseases and conditions. Oxidative stress underlies many of these conditions. Researchers took samples from four types of locations: near major roadways, in urban communities, in so-called “background” locations away from dense urban development and desert locations in the San Bernardino Valley. 
They determined that across all of the areas sampled, 42% of total toxicity came from tailpipe emissions and 21% from brake or tire wear — meaning that a combined 63% of the pollution came from vehicles. Another 20% of the toxicity came from soil dust and 17% was from various other sources — including industrial sites and ports. Gasoline burns cleaner now thanks to state regulations and research from chemistry experts like Paulson, so the paper’s authors were surprised that tailpipe emissions still represented the largest source of particle toxicity, she said. Pollution in high-traffic locations was about 50% higher than it was in urban community locations; vehicle-related pollution made up the largest amount of that difference. Desert locations had slightly lower overall pollution levels than urban community sites, with a much larger proportion of the pollution coming in the form of dust. While it might seem obvious that dust would negatively affect air in desert regions, Paulson said it was important to recognize that much of that dust is contaminated with other particles that make it potentially toxic. “Dust in urban areas gets mixed in with industrial and tire pollution that is suspended in the dirt,” she said. The types of pollutants measured in the study significantly correlate with medical conditions such as respiratory, cardiovascular and metabolic illnesses, as well as with low birth weight and other pregnancy-related issues. Pollution that is more toxic typically contains more metals and may contain more toxic organic compounds, although those were not measured in the study. The research is the first U.S. study to measure air pollution toxicity against people’s socioeconomic conditions. Forthcoming UCLA studies will focus in greater detail on related health issues. Jiaqi Shen, the study’s lead author and a UCLA graduate student, said higher levels of toxic pollution compound other public health challenges. 
“Disadvantaged areas can face a situation where the environment is worse, and there is also less access to health care and good nutrition, increasing their health risks,” Shen said. Other study co-authors include UCLA professors Michael Jerrett, Beate Ritz and Yifang Zhu, and current and former graduate and undergraduate students.
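The source attribution reported in the UCLA study above can be tallied to confirm the combined vehicle-related share. A minimal sketch using the percentages quoted in the article; the category labels are paraphrased:

```python
# Average PM2.5 toxicity shares by source, as reported in the study.
toxicity_shares = {
    "tailpipe emissions": 0.42,
    "brake/tire wear": 0.21,
    "soil dust": 0.20,
    "other (industrial sites, ports, etc.)": 0.17,
}

# Tailpipe plus brake/tire wear gives the vehicle-related total.
vehicle_share = (toxicity_shares["tailpipe emissions"]
                 + toxicity_shares["brake/tire wear"])
print(f"Vehicle-related: {vehicle_share:.0%}")

# The four categories should account for all measured toxicity.
assert abs(sum(toxicity_shares.values()) - 1.0) < 1e-9
```

The tally matches the article's "combined 63% of the pollution came from vehicles", and the four categories sum to 100% of the measured toxicity.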
Environmental Science
A study conducted by Duke University researchers suggests that glyphosate, the active ingredient in Roundup, may be a potential cause of the mysterious kidney disease CKDu that has affected rural communities in Sri Lanka and similar regions around the world. The findings have been published in Environmental Science and Technology Letters. Phys.Org reports: Roundup is a glyphosate-based herbicide used to control weeds and other pests. Because it is supposed to break down in the environment within a few days to weeks, its use is relatively under-regulated by most public health agencies. But when glyphosate encounters certain trace metal ions that make water hard -- like magnesium and calcium -- glyphosate-metal ion complexes can form. Those complexes can persist up to seven years in water and 22 years in soil. In certain agricultural areas of Sri Lanka, the high, dry climate combined with its geological formations creates the perfect conditions for hard water. It is also in these regions that CKDu has reached epidemic levels, with as many as 10% of children aged 5-11 years exhibiting signs of early onset kidney damage. [Nishad Jayasundara, the Juli Plant Grainger Assistant Professor of Global Environmental Health at Duke] believed that glyphosate may play a role in CKDu incidence because of the region's hard water, even though Sri Lanka has banned use of the herbicide. To test his hypothesis, Jayasundara teamed up with environmental chemist Lee Ferguson, an associate professor of civil and environmental engineering at Duke and his Ph.D. student Jake Ulrich. In collaboration with Mangala De Silva, a professor at the University of Ruhuna, Sri Lanka, the Duke team sampled more than 200 wells across four regions in Sri Lanka. Ferguson's lab at Duke employs high-resolution and tandem mass spectrometry to identify contaminants -- even the barest trace of them -- by their molecular weights. 
It's a highly sensitive method of identification and quantitation that allows a broad view into the pollutants present in a water system. Through this technique, the researchers found significantly higher levels of the herbicide in 44% of wells within the affected areas versus just 8% of those outside it. "We really focused on drinking water here, but it's possible there are other important routes of exposure -- direct contact from agricultural workers spraying the pesticide, or perhaps food or dust," said Ferguson. "I'd like to see increased study with more emphasis looking at the links among these exposure routes. It still seems like there might be things we're missing." To this point, Ulrich also found elevated levels of fluoride and vanadium -- both of which are linked to kidney damage -- in the drinking water of most of the communities with high incidence of CKDu. The researchers agree that more attention must be paid to the potential contributions each of these contaminants is playing, either individually or in concert with others. But given the reasoning for their glyphosate-based hypothesis going into the study and the herbicide's high levels of use worldwide, they also believe these results should serve as a serious warning when considering risk of exposure to glyphosate.
Environmental Science
Image: Left and right figures show warming in Europe during the summer half-year over the latest four decades, subdivided into clear-sky and all-sky conditions, respectively. Credit: Paul Glantz/Stockholm University

The warming during the summer months in Europe has been much faster than the global average, shows a new study by researchers at Stockholm University published in the Journal of Geophysical Research: Atmospheres. As a consequence of human emissions of greenhouse gases, the climate across the continent has also become drier, particularly in southern Europe, leading to worse heat waves and an increased risk of fires. According to the UN's Intergovernmental Panel on Climate Change (IPCC), warming over land areas occurs significantly faster than over oceans, with 1.6 degrees and 0.9 degrees on average, respectively. This means that the global greenhouse gas emissions budget to stay under a 1.5-degree warming on land has already been used up. Now, the new study shows that the emissions budget to avoid a 2-degree warming over large parts of Europe during the summer half-year (April-September) has also been used up. In fact, measurements reveal that the warming during the summer months in large parts of Europe during the last four decades has already surpassed two degrees. “Climate change is serious as it leads to, among other things, more frequent heat waves in Europe. These, in turn, increase the risk of fires, such as the devastating fires in southern Europe in the summer of 2022,” says Paul Glantz, Associate Professor at the Department of Environmental Science, Stockholm University, and main author of the study. In southern Europe, a clear positive feedback caused by global warming is evident: warming is amplified by drier soil and decreased evaporation. Moreover, there has been less cloud coverage over large parts of Europe, probably as a result of less water vapour in the air.
“What we see in southern Europe is in line with what the IPCC has predicted, which is that an increased human impact on the greenhouse effect would lead to dry areas on Earth becoming even drier,” says Paul Glantz.

Impact of aerosol particles

The study also includes a section about the estimated impact of aerosol particles on the temperature increase. According to Paul Glantz, the rapid warming in, for example, Central and Eastern Europe is first and foremost a consequence of human emissions of long-lived greenhouse gases, such as carbon dioxide. But since emissions of short-lived aerosol particles from, for example, coal-fired power plants have decreased greatly over the past four decades, the combined effect has led to an extreme temperature increase of over two degrees. “The airborne aerosol particles, before they began to decrease in the early 1980s in Europe, masked the warming caused by human greenhouse gases by just over one degree on average for the summer half-year. As the aerosols in the atmosphere decreased, the temperature increased rapidly. Human emissions of carbon dioxide are still the biggest threat as they affect the climate for hundreds to thousands of years,” says Paul Glantz. According to Paul Glantz, this effect provides a harbinger of future warming in areas where aerosol emissions are high, such as in India and China.

Background facts: the greenhouse effect and the aerosol effect

Fossil fuel burning leads to the release of both aerosol particles and greenhouse gases. Although their source is common, their effects on climate differ.

About the greenhouse effect: Greenhouse gases are largely unaffected by solar radiation, while they absorb infrared radiation efficiently, leading to re-emission towards the Earth's surface. The Earth absorbs both solar radiation and infrared radiation, which leads to the warming of the lower part of the atmosphere in particular.
Time-space: Greenhouse gases are generally long-lived in the atmosphere, and this applies above all to carbon dioxide, where human emissions affect the climate for hundreds to thousands of years. It also means that greenhouse gases spread evenly over the entire planet.

About the aerosol effect: In contrast to greenhouse gases, aerosol particles affect incoming solar radiation, i.e. they scatter part of the sunlight back into space, causing a cooling effect. Human emissions of aerosols can enhance this cooling effect. Time-space: Airborne human aerosol particles have a lifetime of about a week, which means that they mainly cool the climate locally or regionally and in the short term. According to the Paris Agreement, all parties must commit to drastically reduce their greenhouse gas emissions, but it is also important to decrease concentrations of aerosol particles, because, in addition to their effects on climate, aerosol particles in polluted air cause approximately eight million premature deaths each year around the world.

Journal: Journal of Geophysical Research: Atmospheres. Article title: "Unmasking the effects of aerosols on greenhouse warming over Europe." Method of research: data/statistical analysis. Publication date: 22 November 2022.
Environmental Science
Air pollution is a major public health problem: The World Health Organization has estimated that it leads to over 4 million premature deaths worldwide annually. Still, it is not always extensively measured. But now an MIT research team is rolling out an open-source version of a low-cost, mobile pollution detector that could enable people to track air quality more widely. The detector, called Flatburn, can be made by 3D printing or by ordering inexpensive parts. The researchers have now tested and calibrated it in relation to existing state-of-the-art machines, and are publicly releasing all the information about it -- how to build it, use it, and interpret the data. "The goal is for community groups or individual citizens anywhere to be able to measure local air pollution, identify its sources, and, ideally, create feedback loops with officials and stakeholders to create cleaner conditions," says Carlo Ratti, director of MIT's Senseable City Lab. "We've been doing several pilots around the world, and we have refined a set of prototypes, with hardware, software, and protocols, to make sure the data we collect are robust from an environmental science point of view," says Simone Mora, a research scientist at Senseable City Lab and co-author of a newly published paper detailing the scanner's testing process. The Flatburn device is part of a larger project, known as City Scanner, using mobile devices to better understand urban life. "Hopefully with the release of the open-source Flatburn we can get grassroots groups, as well as communities in less developed countries, to follow our approach and build and share knowledge," says An Wang, a researcher at Senseable City Lab and another of the paper's co-authors. The paper, "Leveraging Machine Learning Algorithms to Advance Low-Cost Air Sensor Calibration in Stationary and Mobile Settings," appears in the journal Atmospheric Environment. 
In addition to Wang, Mora, and Ratti, the study's authors are: Yuki Machida, a former research fellow at Senseable City Lab; Priyanka deSouza, an assistant professor of urban and regional planning at the University of Colorado at Denver; Tiffany Duhl, a researcher with the Massachusetts Department of Environmental Protection and a Tufts University research associate at the time of the project; Neelakshi Hudda, a research assistant professor at Tufts University; John L. Durant, a professor of civil and environmental engineering at Tufts University; and Fabio Duarte, principal research scientist at Senseable City Lab. The Flatburn concept at Senseable City Lab dates back to about 2017, when MIT researchers began prototyping a mobile pollution detector, originally to be deployed on garbage trucks in Cambridge, Massachusetts. The detectors are battery-powered and rechargeable, either from power sources or a solar panel, with data stored on a card in the device that can be accessed remotely. The current extension of that project involved testing the devices in New York City and the Boston area, by seeing how they performed in comparison to already-working pollution detection systems. In New York, the researchers used five detectors to collect 1.6 million data points over four weeks in 2021, working with state officials to compare the results. In Boston, the team used mobile sensors, evaluating the Flatburn devices against a state-of-the-art system deployed by Tufts University along with a state agency. In both cases, the detectors were set up to measure concentrations of fine particulate matter as well as nitrogen dioxide, over an area of about 10 meters. Fine particulate matter refers to tiny particles often associated with burning matter, from power plants, internal combustion engines in autos, fires, and more.
The research team found that the mobile detectors estimated somewhat lower concentrations of fine particulate matter than the devices already in use, but with a strong enough correlation so that, with adjustments for weather conditions and other factors, the Flatburn devices can produce reliable results. "After following their deployment for a few months we can confidently say our low-cost monitors should behave the same way [as standard detectors]," Wang says. "We have a big vision, but we still have to make sure the data we collect is valid and can be used for regulatory and policy purposes," Duarte adds: "If you follow these procedures with low-cost sensors you can still acquire good enough data to go back to [environmental] agencies with it, and say, 'Let's talk.'" The researchers did find that using the units in a mobile setting -- on top of automobiles -- means they will currently have an operating life of six months. They also identified a series of potential issues that people will have to deal with when using the Flatburn detectors generally. These include what the research team calls "drift," the gradual changing of the detector's readings over time, as well as "aging," the more fundamental deterioration in a unit's physical condition. Still, the researchers believe the units will function well, and they are providing complete instructions in their release of Flatburn as an open-source tool. That even includes guidance for working with officials, communities, and stakeholders to process the results and attempt to shape action. "It's very important to engage with communities, to allow them to reflect on sources of pollution," says Mora. "The original idea of the project was to democratize environmental data, and that's still the goal," Duarte adds. "We want people to have the skills to analyze the data and engage with communities and officials."
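The calibration idea described above, fitting low-cost sensor readings against a reference instrument while adjusting for weather covariates, can be sketched with a toy example. This is an illustrative reconstruction, not the paper's actual machine-learning model: the synthetic data, the bias coefficients, and the ordinary-least-squares correction below are all assumptions.

```python
import numpy as np
from numpy.linalg import lstsq

# Synthetic example: calibrate a low-cost PM2.5 sensor against a reference
# monitor, using temperature and relative humidity as covariates.
rng = np.random.default_rng(0)
n = 500
temp = rng.uniform(0, 35, n)       # air temperature, deg C
rh = rng.uniform(20, 90, n)        # relative humidity, %
true_pm = rng.gamma(2.0, 6.0, n)   # reference PM2.5, ug/m3

# Assumed sensor behavior: reads low, with humidity- and temperature-dependent
# bias plus noise (coefficients chosen for illustration only).
raw = 0.7 * true_pm + 0.05 * rh - 0.02 * temp + rng.normal(0, 1.0, n)

# Fit reference = a*raw + b*temp + c*rh + d by ordinary least squares.
X = np.column_stack([raw, temp, rh, np.ones(n)])
coef, *_ = lstsq(X, true_pm, rcond=None)
corrected = X @ coef

rmse_raw = np.sqrt(np.mean((raw - true_pm) ** 2))
rmse_cal = np.sqrt(np.mean((corrected - true_pm) ** 2))
print(f"RMSE before calibration: {rmse_raw:.2f} ug/m3")
print(f"RMSE after calibration:  {rmse_cal:.2f} ug/m3")
```

The Flatburn paper evaluates richer machine-learning calibrations, but even a linear correction with weather terms removes much of the systematic bias in a synthetic setup like this one.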
Environmental Science
Chemical contamination on International Space Station is out of this world, study shows Concentrations of potentially harmful chemical compounds in dust collected from air filtration systems on the International Space Station (ISS) exceed those found in floor dust from many American homes, a new study reveals. In the first study of its kind, scientists analyzed a sample of dust from air filters within the ISS and found levels of organic contaminants which were higher than the median values found in US and Western European homes. Publishing their results in Environmental Science & Technology Letters, researchers from the University of Birmingham, UK, as well as the NASA Glenn Research Center, U.S., say their findings could guide the design and construction of future spacecraft. Contaminants found in the "space dust" included polybrominated diphenyl ethers (PBDEs), hexabromocyclododecane (HBCDD), "novel" brominated flame retardants (BFRs), organophosphate esters (OPEs), polycyclic aromatic hydrocarbons (PAH), perfluoroalkyl substances (PFAS), and polychlorinated biphenyls (PCBs). BFRs and OPEs are used in many countries to meet fire safety regulations in consumer and commercial applications like electrical and electronic equipment, building insulation, furniture fabrics and foams. PAH are present in hydrocarbon fuels and emitted from combustion processes, PCBs were used in building and window sealants and in electrical equipment as dielectric fluids, while PFAS have been used in applications like stain proofing agents for fabrics and clothing. However, their potential human health effects have led to some of them being banned or limited in use. PCBs, some PFAS, HBCDD and the Penta- Octa-, and Deca-BDE commercial formulations of PBDEs, are classed as persistent organic pollutants (POPs) under the UNEP Stockholm Convention. 
In addition, some PAH are classified as human carcinogens, while some OPEs are under consideration for restriction by the European Chemicals Agency. Co-author Professor Stuart Harrad, from the University of Birmingham, said, "Our findings have implications for future space stations and habitats, where it may be possible to exclude many contaminant sources by careful material choices in the early stages of design and construction. "While concentrations of organic contaminants discovered in dust from the ISS often exceeded median values found in homes and other indoor environments across the US and western Europe, levels of these compounds were generally within the range found on earth." Researchers note that PBDE concentrations in the dust sample falling within the range of concentrations detected in US house dust may reflect use on the ISS of inorganic FRs like ammonium dihydrogen phosphate to make fabrics and webbing flame retardant. They believe that the use of commercially available off-the-shelf items brought on board for the personal use of astronauts, such as cameras, MP3 players, tablet computers, medical devices, and clothing, are potential sources of many of the chemicals detected. Air inside the ISS is constantly recirculated with eight to 10 changes per hour. While CO2 and gaseous trace contaminant removal occurs, the degree to which this removes chemicals like BFRs is unknown. High levels of ionizing radiation can accelerate aging of materials, including breakdown of plastic goods into micro and nanoplastics that become airborne in the microgravity environment. This may cause concentrations and relative abundance of PBDEs, HBCDD, NBFRs, OPEs, PAH, PFAS, and PCBs in ISS dust to differ notably from those in dust from terrestrial indoor microenvironments. Scientists measured concentrations of a range of target chemicals in dust collected from the ISS. 
In a microgravity environment, particles float around according to ventilation system flow patterns, eventually depositing on surfaces and air intakes. Screens covering the ISS HEPA filters accumulate this debris, requiring weekly vacuuming to maintain efficient filtration. Material in ISS vacuum bags comprises previously airborne particles, clothing lint, hair and other debris generally identified as spacecraft cabin dust. Some vacuum bags were returned to Earth for studies of this unique dust, with a small sample shipped to the University of Birmingham for analysis in the study. More information: Stuart Harrad et al., Persistent Organic Contaminants in Dust from the International Space Station, Environmental Science & Technology Letters (2023). Journal information: Environmental Science & Technology Letters Provided by University of Birmingham
Environmental Science
Farms that create habitat key to food security and biodiversity It seems intuitive that forests would provide better habitat for forest-dwelling wildlife than farms. Yet, in one of the longest-running studies of tropical wildlife populations in the world, Stanford researchers found that over 18 years, smaller farms with varying crop types—interspersed with patches or ribbons of forest—sustain many forest-dependent bird populations in Costa Rica, even as populations decline in forests. In a paper published Sept. 4 in the Proceedings of the National Academy of Sciences, Nicholas Hendershot and colleagues compared trends in specific bird populations across three landscape types in Costa Rica: forests, diversified farms, and intensive agriculture. The steepest declines were found in forests, then in intensive agriculture (and the species succeeding in intensive agriculture were often invasive). But on diversified farms, a significant subset of bird species typically found in forests, including some of conservation concern, actually increased over time. "Birds are kind of a proxy we use to track the health of ecosystems. And the birds we're seeing today aren't the same as we saw 18 to 20 years ago. This paper really documents this pattern," said Hendershot, a postdoctoral fellow at the time of this research in Stanford's Department of Biology in the School of Humanities and Sciences (H&S), the Stanford Center for Conservation Biology (CCB), and the Stanford-based Natural Capital Project (NatCap). Food security at stake While this research implies that diversified farming could be key for biodiversity, the relationship goes both ways: biodiversity is key for food security. In this case, that means having a variety of types of birds feeding on insects and helping to pollinate crops. "Identity does seem to matter a lot for pest control and other ecosystem services birds provide. These species are not interchangeable," said Hendershot. 
"We need a constant stream of pollinators servicing farms. About three-quarters of the world's crops require pollinators to some extent, and that 75% is our most nutritious food—think of all the vitamins and minerals packed into fruits, nuts, and veggies," explained Gretchen Daily, faculty director of NatCap and CCB, Bing Professor of Environmental Science in H&S, and a senior author on the paper. "We need a constant stream of birds, bats, and other wildlife to help control pests: they suppress the vast majority naturally. And we need to start building flood protection, water purification, carbon storage, and many other vital benefits back into agricultural landscapes, way beyond what can be achieved in protected areas alone." Daily also noted that, in terms of food production, diversified farms are not necessarily lower yielding than intensive agriculture. "This is a recent assumption that is being overturned," she said. Beyond protected areas It has become increasingly apparent around the world that while protected areas remain critical, they are too few and far between to provide the ecosystem services people and nature need to thrive. Working landscapes are crucial now for preserving biodiversity and its benefits. "People, including scientists, had the idea that farmland would not support a meaningful amount of biodiversity," said Daily. In this case, not only are diversified farms themselves providing habitat, they connect otherwise fragmented forested areas. Over time, Hendershot said, "I have moved away from the 'fortress conservation' model, which focused more on creating protected areas separate from human activities, and see more and more how much potential there is outside of forests. The forests are key—we need them, of course. But in addition to that, I'm always surprised by how important 'how' you manage a farm is for biodiversity." 
"We believe the findings of our research are new to science, but in a sense, it merely confirms what Indigenous communities around the world have already known for a long time, which is that humans can and should have reciprocal relationships with the rest of the local ecological community they are part of," said Tadashi Fukami, a professor of biology in H&S and of Earth system science in the Stanford Doerr School of Sustainability and a co-author of the paper. Incentivizing farmers In the 1980s and 90s, deforestation was occurring in Costa Rica at the fastest rate ever seen on a country scale. Then, they turned it around—becoming a renowned model of success. By setting up the world's first countrywide payment for ecosystem services (PES) program, Costa Rica reversed this trend: today, forests cover almost 60% of its land, up from 40% in 1987. The country currently aims to double the amount of protected forest in just a few years. In its existing PES program, any landowner can receive money for reforesting even small parts of their land. Now, the government is also working toward a new PES program to incentivize farmers to adopt best management practices. This study will help inform Costa Rican policymakers in understanding the benefits provided over time by different farming practices. Said Daily, "We need to recognize the vital work many farmers are doing that supports biodiversity." Nicholas Hendershot was a postdoctoral researcher with the Center for Conservation Biology at Stanford and is now a forest ecologist with The Nature Conservancy-California. Gretchen Daily is also a senior fellow in the Stanford Woods Institute for the Environment. Other co-authors on the paper are Alejandra Echeverri, a senior scientist at the Natural Capital Project, Luke Frishkoff of the University of Texas at Arlington, and prominent Costa Rican ornithologist Jim Zook.
More information: J. Nicholas Hendershot et al, Diversified farms bolster forest-bird populations despite ongoing declines in tropical forests, Proceedings of the National Academy of Sciences (2023). DOI: 10.1073/pnas.2303937120 Journal information: Proceedings of the National Academy of Sciences Provided by Stanford University
Environmental Science
Gas stoves in California homes are leaking cancer-causing benzene, researchers found in a new study published on Thursday, though they say more research is needed to understand how many homes have leaks. In the study, published in Environmental Science and Technology on Thursday, researchers also estimated that over 4 tons of benzene per year are being leaked into the atmosphere from outdoor pipes that deliver the gas to buildings around California - the equivalent of the benzene emissions from nearly 60,000 vehicles. And those emissions are unaccounted for by the state. The researchers collected samples of gas from 159 homes in different regions of California and measured to see what types of gases were being emitted into homes when stoves were off. They found that all of the samples they tested had hazardous air pollutants, like benzene, toluene, ethylbenzene and xylene (BTEX), all of which can have adverse health effects in humans with chronic exposure or acute exposure in larger amounts. Of most concern to the researchers was benzene, a known carcinogen that can lead to leukemia and other cancers and blood disorders, according to the National Cancer Institute. The finding could have major implications for indoor and outdoor air quality in California, which has the second-highest level of residential natural gas use in the United States. "What our science shows is that people in California are exposed to potentially hazardous levels of benzene from the gas that is piped into their homes," said Drew Michanowicz, a study co-author and senior scientist at PSE Healthy Energy, an energy research and policy institute. "We hope that policymakers will consider this data when they are making policy to ensure current and future policies are health-protective in light of this new research." Homes in the Greater Los Angeles, the North San Fernando Valley, and the Santa Clarita Valley areas had the highest levels of benzene in their gas.
Leaks from stoves in these regions could emit enough benzene to significantly exceed the limit determined to be safe by the California Office of Environmental Health Hazard Assessment. This finding in particular didn't surprise residents and health care workers in the region who spoke to The Associated Press about the study. That's because many of them experienced the largest-known natural gas leak in the nation in Aliso Canyon in 2015. Back then, 100,000 tons of methane and other gases, including benzene, leaked from a failed well operated by Southern California Gas Co. It took nearly four months to get the leak under control and resulted in headaches, nausea and nosebleeds. Dr. Jeffrey Nordella was a physician at an urgent care in the region during this time and remembers being puzzled by the variety of symptoms patients were experiencing. "I didn't have much to offer them," except to help them try to detox from the exposures, he said. That was an acute exposure of a large amount of benzene, which is different from chronic exposure to smaller amounts, but "remember what the World Health Organization said: there's no safe level of benzene," he said. Kyoko Hibino was one of the residents exposed to toxic air pollution as a result of the Aliso Canyon gas leak. After the leak, she started having a persistent cough and nosebleeds and eventually was diagnosed with breast cancer, which has also been linked to benzene exposure. Her cats also started having nosebleeds, and one recently passed away from leukemia. "I'd say let's take this study really seriously and understand how bad (benzene exposure) is," she said. Copyright © 2022 by The Associated Press. All Rights Reserved.
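As a back-of-envelope check on the vehicle equivalence quoted above, the article's rounded figures imply a small per-vehicle quantity. The reading of "tons" as metric tonnes is an assumption here; the arithmetic is purely illustrative.

```python
# Implied per-vehicle benzene emissions from the article's rounded figures.
# Assumption: "tons" read as metric tonnes (1 tonne = 1000 kg); illustrative only.
tons_per_year = 4.0   # estimated benzene leaked statewide from outdoor pipes
vehicles = 60_000     # "nearly 60,000 vehicles" quoted as the equivalent

kg_per_vehicle = tons_per_year * 1000 / vehicles
print(f"implied benzene per vehicle: {kg_per_vehicle * 1000:.0f} g/yr")  # about 67 g/yr
```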
Environmental Science
Climate change once seemed a distant threat. No more. We now know its face, and all too well. We see it in every hurricane, torrential rainstorm, flood, heatwave, wildfire and drought. It’s even detectable in our daily weather. Climate disruption has changed the background conditions in which all weather occurs: the oceans and air are warmer, there’s more water vapor in the atmosphere and sea levels are higher. Hurricane Ian is the latest example. Ian made landfall as one of the five most powerful hurricanes in recorded history to strike the US, and with its 150 mile per hour winds at landfall, it tied with 2004’s Hurricane Charley as the strongest to ever hit the west coast of Florida. In isolation, that might seem like something we could dismiss as an anomaly or fluke. But it’s not – it’s part of a larger pattern of stronger hurricanes, typhoons and superstorms that have emerged as the oceans continue to set record levels of warmth. Many of the storms of the past five years – Harvey, Maria, Florence, Michael, Ida and Ian - aren’t natural disasters so much as human-made disasters, whose amplified ferocity is fueled by the continued burning of fossil fuels and the increase in heat-trapping carbon pollution, a planet-warming “greenhouse gas”. This Atlantic hurricane season, although it started out slow, has heated up, thanks to unusually warm ocean waters. Fiona hit Puerto Rico as a powerful category 4 storm, and hundreds of thousands of people there are still without power. The storm barreled on into the open Atlantic, eventually making landfall in the maritime provinces to become Canada’s strongest ever storm. Then came Ian, which feasted on a deep layer of very warm water in the Gulf of Mexico. Human-caused warming is not just heating the surface of the oceans; the warmth is diffusing down into the depths of the ocean, leading to year after year of record ocean heat content.
That means that storms are less likely to churn up colder waters from below, inhibiting one of the natural mechanisms that dampen strengthening. It also leads to the sort of rapid intensification we increasingly see with these storms, where they balloon into major hurricanes in a matter of hours. Too often we still hear, even from government scientists, the old saw that we cannot link individual hurricanes to climate change. There was a time when climate scientists believed that to be true. But they don’t anymore. We have developed powerful tools to attribute the degree to which global warming affects extreme events. One study found, for example, that the devastating flooding from Hurricane Florence as it made landfall in North Carolina four years ago was as much as 50% greater and 80 km (50 miles) larger due to the warmer ocean. We can also draw upon basic physics, as we explained in Scientific American in 2017. Warmer oceans mean more fuel to strengthen hurricanes, with an average increase in wind speeds of major hurricanes of about 18 mph for each 1C (1.8F) of ocean surface warming, a roughly 13% increase. Since the power of the storm increases roughly as the cube of the wind speed, that amounts to a roughly 44% increase in the destructive potential of these storms. There is also evidence that human-caused warming is increasing the size of these storms. All else being equal, larger storms pile up greater amounts of water, leading to larger storm surges like the 12 to 18 feet estimated for Ian in some locations. Add sea level rise, and that’s the better part of a foot of additional coastal flooding baked into every single storm surge. If humanity continues to warm the planet, and destabilize the Greenland and west Antarctic ice sheets, we could see yards, not feet, of eventual sea-level rise.
Think of that as a perpetual coastal flooding event. Then there is the flooding rainfall, like the 20 inches (50cm) of it we’re seeing across a large swath of Florida with Ian. Simple physics tells us that the amount of moisture that evaporates off the ocean into the atmosphere increases about 7% for each 1C of ocean surface warming. That means 7% more moisture to turn into flooding rains. But that’s not the whole story. Stronger storms can entrain more moisture into them – a double whammy that produced the record flooding we saw in Philadelphia a year ago with Hurricane Ida, and the flooding we saw with Harvey in Texas in 2017 and Florence in the Carolinas in 2018, the two worst flooding events on record in the US. Tampa’s wide, shallow coastal shelf and low topography, combined with rising sea levels and vulnerable infrastructure, make it particularly vulnerable to a landfalling major hurricane. Tampa Bay has dodged multiple bullets in recent years in the form of major hurricanes that ultimately weakened or swerved away from the city. Ian is the latest example, as it passed to the east rather than to the west of Tampa Bay, sparing the sprawling urban population a devastating storm surge that would have flooded the homes of millions. Unfortunately, Tampa’s luck will eventually run out. We must prepare for the inevitable calamity that will occur when the city is at the receiving end of a losing roll of the weather dice. It is important to take steps to increase resilience and adapt to the changes that are inevitable, taking all of the precautions we can to spare our coasts from the devastating consequences of sea-level rise combined with stronger, more damaging hurricanes. But no amount of adaptation can shield Florida, or anywhere else, from the devastating consequences of the continued warming of our planet. Only mitigation – the dramatic reduction of heat-trapping pollution – can prevent things from getting worse.
We’ve seen some progress on that front recently, both in the US and globally. The climate provisions of the recently passed Inflation Reduction Act are a great start, but they’re not adequate on their own for the US to meet its obligations to cut carbon emissions in half by 2030. We need more aggressive climate action to pass Congress. And that means we need politicians who are willing to support that action, rather than act as apologists for powerful fossil fuel interests. That’s something for all Americans to think about as they go to the voting booths in a matter of weeks.

Michael E Mann is presidential distinguished professor of earth and environmental science at the University of Pennsylvania. He is author of The New Climate War: The Fight to Take Back Our Planet. Susan Joy Hassol is director of the nonprofit Climate Communication. She publishes Quick Facts on the links between climate change and extreme weather events.
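The wind-speed and moisture scalings cited in the piece can be checked with a short calculation. The 13% and 7% figures come straight from the text; cubing the wind-speed ratio reproduces the quoted 44% increase in destructive potential.

```python
# Reproduce the arithmetic behind the figures quoted in the op-ed.
wind_increase_per_C = 0.13   # ~13% faster major-hurricane winds per 1 C of ocean warming
moisture_per_C = 0.07        # ~7% more evaporated moisture per 1 C (Clausius-Clapeyron)

# Destructive potential scales roughly as the cube of wind speed:
destructive_increase = (1 + wind_increase_per_C) ** 3 - 1

print(f"destructive potential: +{destructive_increase:.0%}")  # +44%
print(f"atmospheric moisture:  +{moisture_per_C:.0%} per 1 C of warming")
```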
Environmental Science
Sask. oil facilities' methane emissions nearly 4 times more than reported, research suggests

Team from Carleton University in Ottawa measured emissions from 962 heavy oil facilities in Sask.

New research using advanced technology suggests heavy oil facilities in Saskatchewan are releasing almost four times as much of a powerful greenhouse gas as they report to government. The research, published in the journal Environmental Science and Technology, pioneers new methods of measuring methane emissions that call current industry practice into question, said author Matthew Johnson, an engineering professor at Carleton University in Ottawa. "A lot of these [reports] are done on … estimates," said Johnson. "Clearly, they're not very accurate."

Methane is a gas emitted as a byproduct of oil production. It is often rated as 25 times more potent a greenhouse gas than carbon dioxide. Industry and government are trying to cut those emissions by three-quarters, but measuring them has been difficult. "These are hard measurements," said Johnson. Industry generally relies on an estimate of how much methane comes to the surface for each barrel of oil, then multiplies that measurement by how much oil is produced. In recent years, several studies using direct measurement from overflying aircraft have thrown doubt on that method. Johnson said the amount of methane associated with oil is highly variable, which makes calculations based on that ratio unreliable.

Johnson and his colleagues used the latest airborne technology as well as ground-based sensors to measure methane emissions from 962 heavy oil facilities in Saskatchewan that use the so-called CHOPS technology, which uses sand to help force oil to the surface. They found those sites released 3.9 times as much methane as was reported to government inventories. That's more than 10,000 kilograms per hour, as compared to the nearly 2,700 kilograms per hour industry reports.
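The headline discrepancy is a straightforward ratio of the measured aggregate rate to the reported one. A minimal sketch using the round figures quoted above (the study's precise factor of 3.9 comes from the exact totals; the rounded numbers here give a slightly lower ratio):

```python
# Ratio of airborne-measured methane emissions to government-inventory
# reports for Saskatchewan CHOPS heavy oil sites, using the article's figures.
measured_kg_per_h = 10_000  # "more than 10,000 kilograms per hour" (measured)
reported_kg_per_h = 2_700   # "nearly 2,700 kilograms per hour" (reported)

underreporting_factor = measured_kg_per_h / reported_kg_per_h
print(f"Measured emissions are ~{underreporting_factor:.1f}x the reported rate")
```

Since both quoted figures are bounds ("more than" and "nearly"), the true factor is somewhat higher than this rounded estimate, consistent with the study's 3.9.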
"That methane, on its own, would be a significant contribution to the entire inventory of Saskatchewan," said Johnson. Getting an accurate handle on how much methane industry releases to the atmosphere is important for a couple of reasons, Johnson said. First, industry and the federal government have agreed to cut those emissions by 75 per cent by 2030. Regulations to achieve that goal are expected this year, and measuring an accurate starting point will be crucial. Second, Johnson said getting a reliable, well-by-well analysis of emissions will be important for industry in the future. Methane emissions do not face the same taxes as carbon dioxide releases, but that's changing. The United States is discussing putting a price on released methane under its Inflation Reduction Act. Good information will be key to knowing which wells will remain profitable as such price regimes spread, said Johnson. "If you imagine a price on methane … a lot of these wells would be uneconomic." However, Johnson's calculations suggest the cost of reducing that methane is low enough that avoided methane charges alone could pay back the investment in about two years. And if the value of the oil produced is included, the payback period drops to nine months for many wells. Even burning the methane off would help, Johnson said. "Just installing basic combustion mitigation technology is not going to be a deal-breaker for the well, and you can get quite significant methane reductions."

Province says it reduced methane emissions by 66%

Meanwhile, an official with the provincial government said she hasn't examined the study in detail yet, but appreciates that it's Saskatchewan-specific research. Debby Westerman, executive director of resource management with the Ministry of Energy and Resources, said the type of wells in the study represent only about 10 per cent of the production in the province.
Emissions from CHOPS wells are known to be difficult to measure, so it's not surprising that samples taken at any given time will be higher or lower than the average, Westerman said. "We will certainly take this report into consideration and have a look at it. We're always looking to improve our measurement and reporting on associated gas." Westerman said the province has made "great progress" with its methane action plan, adding that emissions were cut by about 66 per cent between 2015 and 2021.

With files from the CBC's Nicholas Frew and Radio-Canada's Bryanna Frankel
Environmental Science
So much on this planet depends on a simple matter of density. In the Atlantic Ocean, a conveyor belt of warm water heads north from the tropics, reaching the Arctic and chilling. That makes it denser, so it sinks and heads back south, finishing the loop. This system of currents, known as the Atlantic Meridional Overturning Circulation, or AMOC, moves 15 million cubic meters of water per second.

In recent years, researchers have suggested that because of climate change, the AMOC current system could be slowing down and may eventually collapse. A paper published yesterday in the journal Nature Communications warns that the collapse of the AMOC isn’t just possible, but imminent. By this team’s calculations, the circulation could shut down as early as 2025, and no later than 2095. That’s a tipping point that would come much sooner than anyone thought. “We got scared by our own results,” says Susanne Ditlevsen, a statistician at the University of Copenhagen and coauthor of the new paper. “We checked and checked and checked and checked, and I do believe that they're right. Of course, we might be wrong, and I hope we are.” But there’s vigorous debate in the scientific community over just how quickly the AMOC might decline, and how best to even figure that out.

It’s abundantly clear to researchers that the Arctic is warming up to four and a half times faster than the rest of the planet. Arctic ice is melting at a pace of about 150 billion metric tons per year, says Marlos Goes, an oceanographer from the University of Miami and NOAA's Atlantic Oceanographic and Meteorological Laboratory who was not involved with the new paper. Greenland’s ice sheet is also rapidly declining, injecting more freshwater into the sea. That deluge of freshwater is less dense than saltwater, meaning less water sinks and less power goes into the AMOC conveyor belt. The consequences would be brutal and global.
Without these warm waters, weather in Europe would get significantly colder—more like that of similar latitudes in Canada and the northern United States. “In model simulations, the collapse of the AMOC cools the North Atlantic and warms the South Atlantic, which may result in drastic precipitation changes throughout the world,” Goes says. “There would be changes in storm patterns over the continental areas, affecting the monsoon systems. Therefore, a future AMOC shutdown could bring massive migration, impacting ecological and agricultural production, and fish population displacement.”

Ditlevsen did her team’s calculation by using measurements of Atlantic sea surface temperatures as a proxy for the AMOC. These readings go all the way back to the 1870s, thanks to measurements taken by ship crews. This meant researchers could compare temperatures before and after the start of the wide-scale burning of fossil fuels and the ensuing changes to the climate. Because the AMOC system involves warm water heading north from the tropics, if the circulation is slowing down, you’d expect to find cooler temperatures in the North Atlantic over time. And indeed, that’s what Ditlevsen’s group found, once they compensated for the overall warming of the world’s oceans due to climate change. “When it is established that the sea surface temperature record is the fingerprint of the AMOC, we can calculate the early warning signals of the forthcoming collapse and extrapolate to the tipping point,” says University of Copenhagen climate scientist Peter Ditlevsen, coauthor of the new paper. (The Ditlevsens are siblings.)

The result echoes previous studies finding early warning signals in the circulation, says Stefan Rahmstorf, who studies the AMOC current system at the Potsdam Institute for Climate Impact Research.
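The "early warning signals" referred to here are typically statistical symptoms of critical slowing down: as a system loses resilience on the way to a tipping point, its fluctuations become larger and more persistent, which shows up as rising variance and lag-1 autocorrelation in the record. A generic sketch of that diagnostic on synthetic data (this is the standard indicator from the tipping-point literature, not the Ditlevsens' exact method, and the series below is simulated, not real AMOC data):

```python
import random

def lag1_autocorrelation(xs):
    """Lag-1 autocorrelation: a classic early-warning indicator that rises
    as a system loses resilience near a tipping point."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs)
    cov = sum((xs[i] - mean) * (xs[i + 1] - mean) for i in range(n - 1))
    return cov / var

# Synthetic AR(1) series whose persistence (phi) slowly increases,
# mimicking critical slowing down before a collapse.
random.seed(42)
series, x = [], 0.0
for t in range(2000):
    phi = 0.2 + 0.6 * t / 2000  # persistence creeps upward over the record
    x = phi * x + random.gauss(0, 1)
    series.append(x)

# Compare the indicator in an early vs a late window of the record.
early = lag1_autocorrelation(series[:500])
late = lag1_autocorrelation(series[-500:])
print(f"lag-1 autocorrelation, early window: {early:.2f}, late window: {late:.2f}")
```

A rising trend in this statistic over successive windows, rather than its absolute value, is what analyses of this kind extrapolate toward a tipping point.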
“As always in science, a single study provides limited evidence, but when multiple approaches lead to similar conclusions, this must be taken very seriously, especially when we're talking about a risk that we really want to rule out with 99.9 percent certainty,” says Rahmstorf. “The scientific evidence now is that we can't even rule out crossing a tipping point already in the next decade or two.”

Still, scientists don’t agree about whether sea surface temperature (SST) is a good indicator of the health of this massively consequential circulation. “Fundamentally, I am deeply skeptical that SST is actually a proxy of AMOC,” says climate scientist Hali Kilbourne, who studies the current system at the University of Maryland Center for Environmental Science. “But there's certainly a school of thought of people who think it's the best thing going—and it may be the best thing going right now. I don't think we have a good alternative, which is why people are using it.”

“I really question whether [SST] is an adequate proxy for AMOC itself,” agrees Kevin Trenberth, a climate scientist at the National Center for Atmospheric Research. “But the trouble is there aren't really adequate measurements.”

The core of the issue is that sea surface temperatures are just one component of the AMOC system; other factors also help determine Atlantic temperatures. Warm waters flowing north have an effect, but so does the atmosphere touching the water. “There's a lot of what we call air-sea interactions—the heat exchange between the atmosphere and the ocean,” Kilbourne says. “And that's not at all related to ocean circulation.”

“This SST fingerprint, although sensitive to the AMOC, is not solely driven by it, so these changes may be overestimated,” agrees Goes, the oceanographer from the University of Miami and NOAA.
“Current climate models do not give a strong probability of the collapse of the AMOC this century.”

The beauty of the SST dataset is that it stretches back 150 years, so scientists can see longer-term trends in temperatures. However, those early shipboard measurements were made by people hauling buckets of water aboard and sticking a thermometer in—not exactly the precision that modern science demands. “It is not ideal, but it’s the best we can do,” says Peter Ditlevsen, “since we need measurements to go back to the pre-industrial era to assess the natural state of the AMOC, before it began slowing down toward the collapse.”

Satellite measurements of SST began in the late 1970s, providing much better coverage across oceans. And it wasn’t until 20 years ago that scientists deployed a dedicated AMOC sensor array, known as RAPID, which also measures current velocities and salinity—another factor that influences the density of water. By comparing this modern data to the historical SST data, Peter Ditlevsen says, they can compensate for the influence of the atmosphere on the sea surface, isolating the signal of the AMOC system.

When the RAPID array went online, the assumption was that it’d take 40 years to get an idea of whether the current system was in decline. “It's just hard to tease apart, because we really don't know what the intrinsic timescales of AMOC are,” says Nicholas Foukal, an assistant scientist at Woods Hole Oceanographic Institution, who wasn’t involved in the new paper. “We haven't had an AMOC collapse in the past 20 years, so it’s like trying to predict a hurricane—having never seen a hurricane.” Since RAPID started operating, scientists have seen a good amount of variability. “We've been directly measuring AMOC since 2004, and we don't have any evidence of long-term decline,” says Foukal. “The first six years, there was a very strong decline. And people jumped on that, saying that it's declining, and we have observational evidence of it.
But since then, it has recovered.”

Scientists also use models to simulate how the current system might change as the climate does. Compared to the studies indicating a slowdown and eventual collapse of the circulation, models indicate more stability, says Oluwayemi Garuba, a climate scientist who studies ocean-atmosphere interactions at the Pacific Northwest National Laboratory. “Observations are showing more statistically significant early-warning signals of a collapse of the AMOC, whereas most models are not showing that,” says Garuba. “So, it could be that the overturning circulation in models is just more stable than in observation, as earlier studies have suggested.”

Going forward, Greenland will be a major wildcard. Last week, scientists reported how they used ice cores from an abandoned military base to determine that around 400,000 years ago, northwest Greenland was ice-free. Back then, temperatures were about the same as they are today, yet atmospheric carbon dioxide concentrations were far lower. That raises the alarm that the decline of Greenland’s ice sheet could accelerate. If it does, the melt would load the north Atlantic with astonishing amounts of freshwater, fast-tracking the decline of the AMOC and adding many feet to sea levels.

It’s complexity and uncertainty all the way down. “The fact that, with continued warming, AMOC will slow down is a very robust result. The uncertainty—and where science still needs to figure things out—is when,” says Kilbourne. “But I kind of think that by the time we figure out when, it'll already have happened.”
Environmental Science
U.S. August 24, 2022 / 2:34 PM / CBS News

Cities on Fire: The Urban Heat Island Effect

This summer's deadly heat waves have left people throughout the U.S. and Europe desperate for relief. It's urban centers that have felt the intense temperatures the most — and where experts say it's only expected to get worse. That's because of something called the urban heat island effect. "Simply put, it means that urban centers are hotter than the surrounding suburban areas," Liv Yoon, a postdoctoral research scholar at Columbia University's Climate School, told CBS News. She explained that the effect is like a "dome overarching the city" that's created by carbon emissions, air getting trapped amid tall buildings, and a lack of open space and greenery, among other things. "All of that is building and building and it has nowhere to escape," she said.

This image shows how the temperature can differ between rural, industrial, downtown, park and suburban areas on the same day. Climate Central

This dome effect can cascade over entire cities, although it also impacts specific neighborhoods within that urban area very differently. According to the recently launched heat.gov website, neighborhoods in the same city can have temperatures differing by roughly 15° to 20°F at the same time depending on their levels of tree cover and other factors. Pretty much every metropolitan area experiences this effect to some degree. About 85% of the U.S. population currently lives in metro areas, and the heat island effect is felt most intensely in New Orleans, New York City, Houston, San Francisco and Newark, New Jersey, according to a 2021 report by the nonprofit Climate Central. "Today, there are over 200 to 250 million people that experience temperatures of over 35°C [95°F] every summer, living in about 318 urban areas" across the U.S., environmental scientist and climate resilience specialist Deborah Brosnan said.
"So, it's a lot of people experiencing it." Yoon, the heat island researcher, is part of a project participating in a nationwide campaign by the National Oceanic and Atmospheric Administration to "make heat visible." Teams in various U.S. cities — including Yoon's in New York City — have been tracking temperatures to show the impact of excessive heat and the heat island effect. A map of their findings in New York shows how by the afternoon and evening, the heat index is substantially higher in the neighborhoods of Washington Heights, Harlem and the South Bronx when compared to the Upper East Side, Central Park and the Upper West Side.

"Within a city, just because there's a heat wave going on, no one's experiencing that the same exact way everywhere," Jeremy Hoffman, an environmental science professor and scientist at the Science Museum of Virginia, told CBS News. In his own research in Richmond, Hoffman's team found a 16°F difference between the coolest and warmest places — less than 3 miles apart — at the exact same time during a heat wave.

The phenomenon isn't confined to summer heat waves. It's year-round. "Urban areas tend to be several degrees warmer than their outlying rural areas throughout the year," Hoffman said. "That tends to be actually kind of a good thing for our energy needs in the wintertime, but in the summertime … when it's on our hottest days with the brightest sunshine, the least amount of wind and the least amount of cloud cover, these conditions tend to become the recipe for these extreme temperature differences to become the most stark."

Yvette Johnson, 54, sits next to a fan outside her family's home in Houston on June 10, 2022, when Texas was under a heat wave alert. BRANDON BELL / Getty Images

"Incompatible with human life"

Ben Zaitchik, a professor of earth and planetary sciences at Johns Hopkins University, told CBS News that urban heat lingers, and is usually maximized, at night.
The sustained high temperatures prevent people from getting relief when their bodies desperately need it. The negative impact extends beyond sheer discomfort. Areas with extreme and prolonged heat see increased cases of kids with asthma going to the emergency room, older adults with chronic lung issues having complications, and decreased worker productivity because of heat exhaustion, Zaitchik said. It can also be deadly. "There's a literal, physiological limit, which is that once you get above a certain threshold of temperature and humidity, you can't sweat," he said. The body's internal temperature is about 98.6°F, but without the ability to sweat, the body can't cool down, and body temperature can rapidly increase to 106°F or more within 10 minutes, according to the Centers for Disease Control and Prevention. "At that point, you're really kind of incompatible with human life," he said.

According to the CDC, high temperatures can lead to heart and lung complications, renal failure and kidney stones, and can even impact fetal health and lead to preterm birth. Every year in the U.S., the heat causes an average of more than 700 deaths, more than 67,500 emergency department visits and more than 9,200 hospitalizations. Those who are Black or Native American have the highest rates of death, according to CDC tracking from 2004 to 2018.

This table shows the number of heat-related deaths by race and ethnicity and the level of urbanization from 2004 to 2018 in the U.S. Those who are Black or Hispanic are shown to have the highest rates of death. Heat-Related Deaths — United States, 2004–2018/CDC

Yoon has seen these health issues in New York, where she specifically researches the social inequities associated with extreme heat. She said that areas with more extreme heat tend to see the most heat-related deaths and illnesses, and also tend to be communities with lower income that are historically redlined.
Even in neighborhoods where the heat difference isn't necessarily drastic, the impact can be more intense because there are fewer buffers. On the Upper East Side, for example, Yoon found that more people are likely to have climate-controlled homes or offices and have jobs consistent with good quality health care. That's not the case just a few miles away in the Bronx. "People are supposed to seek refuge from the heat indoors," Yoon said. "But if you live in dilapidated housing or public housing, they're notorious for their lack of cooling infrastructure. You just have nowhere to go, so extreme heat becomes even more of a problem."

Most of those who live in the Bronx are people of color, according to the U.S. Census, and the issues the borough experiences with heat echo those in similar communities nationwide. Brosnan said that areas of lower socioeconomic status and higher populations of people of color usually have "significantly higher" heat indexes. They also usually have more air pollution, fewer green spaces, are often closer to traffic or factories, and have less cooling infrastructure. "If you're a single mother, if you're from a minority race, you are more likely to experience heat stress and to be affected by extreme heat or heat waves than people in more affluent or larger percentage White areas," she said. "And that's the real issue, is that the impact of rising temperatures is borne disproportionately by the poor and minorities and those who can least afford to pay for them."

"Very high level of danger"

Since 1970, 96% of the 246 locations analyzed by Climate Central have seen increases in their average summer temperatures. "Now versus just 50, 60 years ago," Hoffman said, "our summers are much hotter, much more intense." The hotter temperatures are not just a danger to public health, but to infrastructure as well, he said.
It's particularly noticeable in regions that were historically cooler but over the past few years have been sweltering under unprecedented heat. "There is a very high level of danger right now, for vulnerable groups in particular," Weather Channel meteorologist Carl Parker told CBS News. When last year's Pacific Northwest heat wave hit, he said as an example, only two-thirds of households in the region had air conditioning. As of 2020, 12% of U.S. households didn't have air conditioning, according to the U.S. Energy Information Administration, and of those who do, just 66% have central AC. Described by the National Weather Service as "oppressive and unprecedented heat," the wave brought record-breaking temperatures to the region, resulting in buckled highways and numerous deaths. That area was among many that experienced yet another deadly surge in heat this summer.

"Now, suddenly, you're looking at days on end with temperatures above 100 degrees," Parker said. "That's when it gets really dangerous, when you don't have this sort of built-in infrastructure. And then even when you do have this infrastructure, sometimes it will fail." "Roads begin to buckle," Hoffman said. "Airlines can't operate because they're unable to take off in such warm air." "We see things like rolling blackouts," he added, "because so many people are needing to push their air conditioning beyond what they're kind of used to dealing with." And when the infrastructure crumbles, it only compounds on the existing issues, Zaitchik said, pointing to last August's deadly Hurricane Ida, a Category 4 storm that trampled numerous states. "When you look at New Orleans, where it hit landfall, the heat killed more people than the storm itself," he said, "because the storm knocked out power and then it got really hot and then people died."

"No city is immune"

The past few years have shown that excessive heat is a nationwide — and global — issue.
Utah residents recently had more than two weeks straight of triple-digit temperatures, and last month, the U.K. hit its hottest day on record at more than 104°F. "No city is immune to this," Yoon said. "...It's becoming more of a problem. One, temperatures are getting hotter, but two, more and more people are congregating in urban centers. So, it's becoming more of an issue everywhere." Parker told CBS News that by mid-century, major cities such as Dallas, Oklahoma City, Tampa and New Orleans will see a month or more with heat index values of 105°F or greater. That's less than 30 years from now. And these urban areas, where the heat is most intense, are only growing. "As bad as it will be, in general, it's going to be that much worse in these urban areas. … By the middle of the century, somewhere between 60 and 70% of the world's population will be in urban areas," Parker said. "… And that's gonna mean a lot of dangerous weather for millions and millions of people." Population density will only add to the heat. And a recent report by the nonprofit research firm First Street Foundation found that a "heat belt" will soon emerge in the U.S., stretching from the Gulf Coast to Chicago and encompassing nearly one-third of U.S. adults. "We're already operating on a knife's edge in a lot of ways," Parker said. "... So the big question will be adaptation — do city planners really start thinking about what they need to do to make cities more livable as temperatures are rising?"

Since 1970, 96% (235) of 246 U.S. locations had an increase in their summer average temperature and 81% (200) had 7 or more summer days above normal, according to Climate Central. Climate Central

"Billions" will be exposed to hotter temperatures

Research from 2019 found that, within 30 years, the heat island effect will raise city temperatures about half of whatever they experience from climate change — and in some places, twice the amount.
That means if a city's temperature increased by 2 degrees because of climate change, it can expect an additional degree of warming because of the heat island effect. "This previously unexamined extra warming will expose billions of urban dwellers, primarily in the tropical global South, to greater extreme heat risks," the researchers said. Minimizing greenhouse gas emissions is the primary way to reduce global warming as a whole. In July, President Biden announced $2.3 billion in funding to help in disaster response and expand home energy assistance and offshore wind opportunities. He also signed the Inflation Reduction Act, which provides almost $400 billion to fund energy and climate projects to help reduce carbon emissions by 40% by 2030. But these actions will only work to help prevent more warming in the future; they won't erase the damage that's already been done. Experts say that means the world must prepare for a new reality, and now. "Unless cities can adapt … we will reach a situation where, at least in some cities, some of the densest cities, that it will be unsafe for people to live there during the summer," Brosnan told CBS News.

Cities impacted by the urban heat island effect must develop three main components of heat resiliency, Brosnan said: more green spaces to provide shade and help absorb the heat; retrofitted buildings that are more economical and efficient at keeping people cool during extreme heat; and better assistance for communities during times of extreme heat, providing water, cooling centers and anything else residents might need to survive. "It's a long-term investment and it's a cost," Brosnan said, noting that all of these things will require lots of money and manpower. "The benefit is in human survival."

Li Cohen is a social media producer and trending reporter for CBS News, focusing on social justice issues.
Environmental Science
Exxon Mobil’s scientists were remarkably accurate in their predictions about global warming, even as the company made public statements that contradicted its own scientists’ conclusions, a new study says.

The study, published Thursday in the journal Science, looked at research that Exxon funded that didn’t just confirm what climate scientists were saying, but used more than a dozen different computer models that forecast the coming warming with precision equal to or better than government and academic scientists. This was during the same time that the oil giant publicly doubted that warming was real and dismissed climate models’ accuracy. Exxon said its understanding of climate change evolved over the years and that critics are misunderstanding its earlier research.

Scientists, governments, activists and news sites, including Inside Climate News and the Los Angeles Times, several years ago reported that “Exxon knew” about the science of climate change since about 1977, all while publicly casting doubt. What the new study does is detail how accurate Exxon-funded research was. From 63% to 83% of those projections fit strict standards for accuracy and generally predicted correctly that the globe would warm about 0.36 degrees Fahrenheit (0.2 degrees Celsius) a decade.

The Exxon-funded science was “actually astonishing” in its precision and accuracy, said study co-author Naomi Oreskes, a Harvard science history professor. But she added so was the “hypocrisy because so much of the Exxon Mobil disinformation for so many years ... was the claim that climate models weren’t reliable.”

Study lead author Geoffrey Supran, who started the work at Harvard and now is an environmental science professor at the University of Miami, said this is different than what was previously found in documents about the oil company. “We’ve dug into not just the language, the rhetoric in these documents, but also the data. And I’d say in that sense, our analysis really seals the deal on ‘Exxon knew’,” Supran said.
It “gives us airtight evidence that Exxon Mobil accurately predicted global warming years before, then turned around and attacked the science underlying it.” The paper quoted then-Exxon CEO Lee Raymond in 1999 as saying future climate “projections are based on completely unproven climate models, or more often, sheer speculation,” while his successor in 2013 called models “not competent.”

Exxon’s understanding of climate science developed along with the broader scientific community, and its four decades of research in climate science resulted in more than 150 papers, including 50 peer-reviewed publications, said company spokesman Todd Spitler. “This issue has come up several times in recent years and, in each case, our answer is the same: those who talk about how ‘Exxon Knew’ are wrong in their conclusions,” Spitler said in an emailed statement. “Some have sought to misrepresent facts and Exxon Mobil’s position on climate science, and its support for effective policy solutions, by recasting well intended, internal policy debates as an attempted company disinformation campaign.”

Exxon, one of the world’s largest oil and gas companies, has been the target of numerous lawsuits that claim the company knew about the damage its oil and gas would cause to the climate, but misled the public by sowing doubt about climate change. In the latest such lawsuit, New Jersey accused five oil and gas companies including Exxon of deceiving the public for decades while knowing about the harmful toll fossil fuels take on the climate.

An Exxon Mobil oil refinery in Baton Rouge, La., in 2020. Barry Lewis / In Pictures via Getty Images file

Similar lawsuits from New York to California have claimed that Exxon and other oil and gas companies launched public relations campaigns to stir doubts about climate change.
In one, then-Massachusetts Attorney General Maura Healey said Exxon’s public relations efforts were “reminiscent of the tobacco industry’s long denial campaign about the dangerous effects of cigarettes.” Oil giants including Exxon and Shell were accused in congressional hearings in 2021 of spreading misinformation about climate, but executives from the companies denied the accusations.

University of Illinois atmospheric scientist professor emeritus Donald Wuebbles told The Associated Press that in the 1980s he worked with Exxon-funded scientists and wasn’t surprised by what the company knew or the models. It’s what science and people who examined the issue knew. “It was clear that Exxon Mobil knew what was going on,” Wuebbles said. “The problem is at the same time they were paying people to put out misinformation. That’s the big issue.”

There’s a difference between the “hype and spin” that companies do to get you to buy a product or politicians do to get your vote and an “outright lie ... misrepresenting factual information and that’s what Exxon did,” Oreskes said.

Several outside scientists and activists said what the study showed about Exxon’s actions is serious. “The harm caused by Exxon has been huge,” said University of Michigan environment dean Jonathan Overpeck. “They knew that fossil fuels, including oil and natural gas, would greatly alter the planet’s climate in ways that would be costly in terms of lives, human suffering and economic impacts.
And yet, despite this understanding, they chose to publicly downplay the problem of climate change and the dangers it poses to people and the planet.” Cornell University climate scientist Natalie Mahowald asked: “How many thousands (or more) of lives have been lost or adversely impacted by Exxon Mobil’s deliberate campaign to obscure the science?” Critics say Exxon’s past actions on climate change undermine its claims that it’s committed to reducing emissions. After tracking Exxon’s and hundreds of other companies’ corporate lobbying on climate change policies, InfluenceMap, a firm that analyzes data on how companies are impacting the climate crisis, concluded that Exxon is lobbying overall in opposition to the goals of the Paris Agreement and that it’s currently among the most negative and influential corporations holding back climate policy. “All the research we have suggests that the effort to thwart climate action continues to this day, prioritizing the protection of the oil and gas industry value chain from the ‘potentially existential’ threat of climate change, rather than the other way around,” said Faye Holder, program manager for InfluenceMap. “The messages of denial and delay may look different, but the intention is the same.”
Environmental Science
The 'plastic paradox': Some clean-up technologies do more harm than good, researchers say The ever-increasing problem of plastic pollution has prompted widespread efforts to combat it through innovative clean-up technologies. These advancements, however, often seen as the silver bullet to solve our plastic crisis, sometimes do more harm than good. This plastic clean-up paradox is addressed in a recent publication in Environmental Science & Technology, where a group of stakeholders representing different perspectives were brought together to discuss this pressing issue. The consensus emerging from the dialogue is clear: clean-up technologies must be regulated within the framework of an international plastics treaty to ensure they genuinely benefit the environment. In other words: we must adopt a philosophy of "clean it up, not mess it up." Capture plastics, not turtles So why the caution? When we target litter, we inevitably encounter ecosystems teeming with life. Dragging a net across the ocean to capture plastics may unintentionally trap the very organisms we aim to protect, like the unfortunate turtle ensnared in our well-intentioned efforts. Moreover, a technology that is effective in one place may be impractical in another. Consider the clean-up equipment supplied to the Sri Lankan government following the X-Press Pearl disaster, when plastic nurdles inundated the environment. The technology was designed for dry surfaces, but the nurdles had seeped into wet substrates, rendering the equipment inadequate. A lack of funds and capacity for repairs meant that manual clean-ups were more cost-efficient. This underscores the importance of evaluating cost-effectiveness before selecting a clean-up approach for a specific area. Litter concentration also plays a key role. Many clean-up technologies are tailored for oceanic debris, but the densest accumulations are often found on shorelines.
The cost of implementing clean-up technologies also increases the more difficult an area is to access, with seafloor and open-ocean clean-ups carrying very high capital costs. To get more bang for the buck, we should therefore support projects focusing on areas that are the most polluted and can be cleaned relatively effortlessly. Post clean-up issues We also know very little about what happens to litter after it has been removed from the environment. The litter must be sorted, transported, and processed, and all of these steps can bring unexpected hiccups. In many cases, most of what is trapped is organic material. This must be removed, and the litter cleaned and sorted into fractions that can hopefully be recycled. Transporting litter across national jurisdictions may not be easy. Safe disposal or recycling facilities might be unavailable locally, increasing the risk that the recovered plastic ends up in places it shouldn't—like back in the ocean. It has also been shown that plastic that has been in the ocean is of low quality, making it difficult to recycle. To ensure that clean-ups provide a net benefit, we must carefully consider these factors. How to maximize the impact of clean-up technologies Efforts to reduce litter in the environment, including the use of clean-up technologies, hold great promise. However, we must consider how to harness their full potential. First, understanding the types of litter found provides valuable insights for decision-makers aiming to prevent further littering. Data collection is paramount. Additionally, the operation and effectiveness of clean-up technologies can inform outreach programs, inspiring greater public involvement in addressing the plastic crisis. Managing technology, encouraging communication, and promoting litter reuse and recycling can also create economic opportunities and meaningful employment.
To ensure we make the most of these efforts, we advocate for the implementation of guidelines and regulations related to clean-up technologies within the international plastics treaty. This step is vital for robust evaluation processes, efficient deployment of clean-up technologies, proper documentation of litter's fate, and enhanced monitoring and outreach efforts. By doing this, clean-up technologies can be part of the solution to plastic pollution, allowing us to clean up without messing up. More information: Jannike Falk-Andersson et al, Cleaning Up without Messing Up: Maximizing the Benefits of Plastic Clean-Up Technologies through New Regulatory Approaches, Environmental Science & Technology (2023). DOI: 10.1021/acs.est.3c01885 Journal information: Environmental Science & Technology Provided by Norwegian Institute for Water Research (NIVA)
Environmental Science
Research team realizes rapid detection of low-concentration BOD in an oxygen-rich water environment Researchers at the College of Environmental Science and Engineering of Nankai University have introduced a new strategy for rapid biochemical oxygen demand (BOD) measurement that uses the competitive switching between the electrotrophic and heterotrophic pathways of facultative bacteria. The findings were published in Water Research under the title "Switchover of Electrotrophic and Heterotrophic Respirations Enables the Biomonitoring of Low Concentration of BOD in Oxygen-rich Environment." Over a period of five years, the research team isolated a strain of Acinetobacter venetianus RAG-1, capable of both electrotrophic and heterotrophic respiration, from microbial electroactive biofilms (EABs). The findings reveal that this bacterium can respire with a polarized graphite electrode in the absence of a degradable organic carbon source, and the current generated in this state can be used as the sensor baseline. When degradable pollutants are present in the water, RAG-1 swiftly switches to heterotrophic respiration, resulting in a decrease in current. The decrease in current is proportional to the concentration of organic pollutants. Based on this, the research team developed a novel bio-cathode BOD sensor, which has a linear response to common pollutants such as organic acids, sugars, proteins and humic acids, as well as mixtures such as low-concentration domestic sewage and lake sediments. It enables sensitive monitoring of oxygen-rich, low-BOD water, with a test time of less than three hours. This study further explains the switchover mechanism of facultative electrotrophic bacteria's metabolic pathways and their adaptability and resilience to contaminated environments.
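The study itself does not publish its calibration procedure; the sketch below is a minimal illustration, with made-up numbers, of the linear relationship the article describes: the drop from the baseline current is fit against known BOD standards, and the fitted line is then inverted to estimate BOD from an observed current drop.

```python
# Hypothetical sketch of calibrating a sensor whose current drop is
# proportional to BOD concentration. All numbers are illustrative,
# not taken from the Water Research study.

def fit_calibration(bod_mg_l, current_drop_ua):
    """Least-squares slope/intercept for current drop vs. BOD standards."""
    n = len(bod_mg_l)
    mean_x = sum(bod_mg_l) / n
    mean_y = sum(current_drop_ua) / n
    slope = (sum((x - mean_x) * (y - mean_y)
                 for x, y in zip(bod_mg_l, current_drop_ua))
             / sum((x - mean_x) ** 2 for x in bod_mg_l))
    intercept = mean_y - slope * mean_x
    return slope, intercept

def estimate_bod(current_drop_ua, slope, intercept):
    """Invert the calibration line: observed current drop -> BOD estimate."""
    return (current_drop_ua - intercept) / slope

# Illustrative calibration points (BOD in mg/L, current drop in microamps)
bod = [0.5, 1.0, 2.0, 4.0, 8.0]
drop = [1.1, 2.0, 4.1, 7.9, 16.2]

slope, intercept = fit_calibration(bod, drop)
unknown = estimate_bod(10.0, slope, intercept)  # BOD estimate for a 10 µA drop
```

In a real deployment the calibration would be repeated per pollutant class, since the article notes the sensor responds linearly to organic acids, sugars, proteins and humic acids with presumably different sensitivities.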
Based on this new principle, the research team will develop more electrotrophic-heterotrophic microorganisms as sensing elements to support rapid BOD monitoring under complex environmental scenarios. This technique is expected to be used in water quality regulation for aquaculture, water quality monitoring for reclaimed water, and more. More information: Yilian Han et al, Switchover of electrotrophic and heterotrophic respirations enables the biomonitoring of low concentration of BOD in oxygen-rich environment, Water Research (2023). DOI: 10.1016/j.watres.2023.119897 Provided by Nankai University
Environmental Science
High levels of chemicals called per- and polyfluoroalkyl substances (PFAS) were detected in water-proof or stain-resistant school uniforms in the United States and Canada, according to a new study published Wednesday in the journal Environmental Science & Technology. PFAS chemicals, often called “forever chemicals” because of their slow breakdown, are widely used for their non-stick properties. They are ubiquitous and found in a range of everyday products such as non-stick cookware, stain and water repellents on carpets, food packaging and personal care products such as shampoos and cosmetics. Researchers studied more than 72 products from nine different brands, finding that school uniforms had high amounts of these potentially harmful chemicals. The highest levels were detected in clothing labeled as 100% cotton or cotton/spandex. Because of their widespread use and slow breakdown, these chemicals can build up in humans and the environment over time. Current scientific research suggests that exposure to high levels of certain PFAS may cause a range of health problems, from developmental delays in children to increased risk of some cancers, with the highest risk associated with drinking contaminated water or eating contaminated food over an extended time. Scientists, however, are continuing to learn about the health effects of exposure to different types and levels of PFAS. Researchers are especially concerned about possible high exposure for children. "Our findings are concerning as school uniforms are worn directly on the skin for about eight hours per day by children, who are particularly vulnerable to harmful chemicals," said Dr.
Arlene Blum, a study co-author and the executive director of the Green Science Policy Institute. It's not clear if PFAS chemicals cause health problems through skin exposure, but the researchers who led the study said the chemicals may end up in children’s bodies through skin absorption, eating with unwashed hands, hand-to-mouth behaviors and mouthing of fabric by younger children. "These chemicals are not well studied. We still have a lot to learn, and we are not sure what harmful effects, if any, these chemicals have through skin exposure and clothing," said Dr. Stephanie Widmer, a medical toxicologist and an emergency medicine physician. According to the study, the PFAS levels in some uniforms exceeded the tolerable daily intake set by European regulators. In the United States, regulators have yet to set similar allowable limits for clothing. But given these concerns, state lawmakers in New York and California have passed bills requiring the phase-out of PFAS in textiles, including school uniforms, by Jan. 1, 2025. The Centers for Disease Control and Prevention says exposure to PFAS chemicals may be associated with increased cholesterol levels, changes in liver enzymes, decreases in infant birth weights, decreased vaccine response in kids, increased risk of birth complications in pregnant women and increased risk of some cancers. “The reality is the health concerns that have been reported in association with PFAS cannot be ignored, and while we are learning more about PFAS and their potential dangers, we should all try to limit our exposures as much as reasonably possible," said Widmer. Added Blum: "Concerned parents should check if any of their children’s uniforms are labeled 'stain-resistant.'
If so, they should ask school administrators to update their uniform policies and, when purchasing new uniforms, specify PFAS-free uniform options." ABC News' Youri Benadjaoud contributed to this report. Khushali Jhaveri, MD, a board-certified internal medicine physician, is a hematology/oncology fellow at Moffitt Cancer Center and is a member of the ABC News Medical Unit.
Environmental Science
The planet’s demand for salt comes at a cost to the environment and human health, according to a new scientific review led by University of Maryland Geology Professor Sujay Kaushal. Published in the journal Nature Reviews Earth & Environment, the paper revealed that human activities are making Earth’s air, soil and freshwater saltier, which could pose an “existential threat” if current trends continue. Geologic and hydrologic processes bring salts to Earth’s surface over time, but human activities such as mining and land development are rapidly accelerating the natural “salt cycle.” Agriculture, construction, water and road treatment, and other industrial activities can also intensify salinization, which harms biodiversity and makes drinking water unsafe in extreme cases. “If you think of the planet as a living organism, when you accumulate so much salt it could affect the functioning of vital organs or ecosystems,” said Kaushal, who holds a joint appointment in UMD’s Earth System Science Interdisciplinary Center. “Removing salt from water is energy intensive and expensive, and the brine byproduct you end up with is saltier than ocean water and can’t be easily disposed of.” Kaushal and his co-authors described these disturbances as an “anthropogenic salt cycle,” establishing for the first time that humans affect the concentration and cycling of salt on a global, interconnected scale. “Twenty years ago, all we had were case studies. We could say surface waters were salty here in New York or in Baltimore’s drinking water supply,” said study co-author Gene Likens, an ecologist at the University of Connecticut and the Cary Institute of Ecosystem Studies. “We now show that it’s a cycle—from the deep Earth to the atmosphere—that’s been significantly perturbed by human activities.” The new study considered a variety of salt ions that are found underground and in surface water. 
Salts are compounds with positively charged cations and negatively charged anions, with some of the most abundant ones being calcium, magnesium, potassium and sulfate ions. “When people think of salt, they tend to think of sodium chloride, but our work over the years has shown that we’ve disturbed other types of salts, including ones related to limestone, gypsum and calcium sulfate,” Kaushal said. When dislodged in higher doses, these ions can cause environmental problems. Kaushal and his co-authors showed that human-caused salinization affected approximately 2.5 billion acres of soil around the world—an area about the size of the United States. Salt ions also increased in streams and rivers over the last 50 years, coinciding with an increase in the global use and production of salts. Salt has even infiltrated the air. In some regions, lakes are drying up and sending plumes of saline dust into the atmosphere. In areas that experience snow, road salts can become aerosolized, creating sodium and chloride particulate matter. Salinization is also associated with “cascading” effects. For example, saline dust can accelerate the melting of snow and harm communities—particularly in the western United States—that rely on snow for their water supply. Because of their structure, salt ions can bind to contaminants in soils and sediments, forming “chemical cocktails” that circulate in the environment and have detrimental effects. “Salt has a small ionic radius and can wedge itself between soil particles very easily,” Kaushal said. “In fact, that’s how road salts prevent ice crystals from forming.” Road salts have an outsized impact in the U.S., which churns out 44 billion pounds of the deicing agent each year. Road salts represented 44% of U.S. salt consumption between 2013 and 2017, and they account for 13.9% of the total dissolved solids that enter streams across the country. 
This can cause a “substantial” concentration of salt in watersheds, according to Kaushal and his co-authors. To prevent U.S. waterways from being inundated with salt in the coming years, Kaushal recommended policies that limit road salts or encourage alternatives. Washington, D.C., and several other U.S. cities have started treating frigid roads with beet juice, which has the same effect but contains significantly less salt. Kaushal said it is becoming increasingly important to weigh the short- and long-term risks of road salts, which play an important role in public safety but can also diminish water quality. “There's the short-term risk of injury, which is serious and something we certainly need to think about, but there’s also the long-term risk of health issues associated with too much salt in our water,” Kaushal said. “It’s about finding the right balance." The study’s authors also called for the creation of a “planetary boundary for safe and sustainable salt use” in much the same way that carbon dioxide levels are associated with a planetary boundary to limit climate change. Kaushal said that while it’s theoretically possible to regulate and control salt levels, it comes with unique challenges. “This is a very complex issue because salt is not considered a primary drinking water contaminant in the U.S., so to regulate it would be a big undertaking,” Kaushal said. “But do I think it’s a substance that is increasing in the environment to harmful levels? Yes.” In addition to Kaushal, other UMD-affiliated co-authors included Carly Maas (M.S. ’22, geology), geology master’s student Joseph Malin, Jenna Reimer (B.S. ’19, geology), Ruth Shatkay (B.S. ’19 architecture; M.S. ’21, environmental science and technology), geology Ph.D. student Sydney Shelton, and Alexis Yaculak (B.S. ’21, geology). Their paper, “The Anthropogenic Salt Cycle,” was published in Nature Reviews Earth & Environment on October 31, 2023.
This research was supported by the National Science Foundation (Award Nos. GCR 2021089 and 2021015), Maryland Sea Grant (Award No. SA75281870W) and the Washington Metropolitan Council of Governments (Contract No. 21-001). This article does not necessarily reflect the views of these organizations.
Environmental Science
‘Colossal amount’ of leaked methane, twice initial estimates, is equivalent to a third of Denmark’s annual CO2 emissions or 1.3m cars. Scientists fear methane erupting from the burst Nord Stream pipelines into the Baltic Sea could be one of the worst natural gas leaks ever and pose significant climate risks. Neither of the two breached Nord Stream pipelines, which run between Russia and Germany, was operational, but both contained natural gas. This mostly consists of methane – a greenhouse gas that is the biggest cause of climate heating after carbon dioxide. The extent of the leaks is still unclear, but rough estimates by scientists, based on the volume of gas reportedly in one of the pipelines, vary between 100,000 and 350,000 tonnes of methane. Jasmin Cooper, a research associate at Imperial College London’s department of chemical engineering, said a “lot of uncertainty” surrounded the leak. “We know there are three explosions but we don’t know if there are three holes in the sides of the pipe or how big the breaks are,” said Cooper. “It’s difficult to know how much is reaching the surface. But it is potentially hundreds of thousands of tonnes of methane: quite a big volume being pumped into the atmosphere.” Nord Stream 2, which was intended to increase the flow of gas from Russia to Germany, reportedly contained 300m cubic metres of gas when Berlin halted the certification process shortly before Russia invaded Ukraine. That volume alone would translate to 200,000 tonnes of methane, Cooper said. If it all escaped, it would exceed the 100,000 tonnes of methane vented by the Aliso Canyon blowout, the biggest gas leak in US history, which happened in California in 2015. Aliso had the warming equivalent of half a million cars. “It has the potential to be one of the biggest gas leaks,” said Cooper.
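The volume-to-mass figures quoted here can be sanity-checked with the density of methane from the ideal gas law. The script below is an illustrative back-of-the-envelope check, not the researchers' calculation; it assumes pure methane at roughly 0 °C and 1 atm, whereas real pipeline gas composition and conditions differ.

```python
# Convert a quoted pipeline gas volume to tonnes of methane, assuming
# pure methane at ~standard conditions (0 °C, 1 atm).

MOLAR_MASS_CH4 = 16.04      # g/mol
MOLAR_VOLUME_STP = 22.414   # litres/mol at 0 °C, 1 atm

def methane_tonnes(volume_m3):
    """Approximate mass of methane (tonnes) in a given volume of gas."""
    density_kg_m3 = MOLAR_MASS_CH4 / MOLAR_VOLUME_STP  # ~0.716 kg/m^3
    return volume_m3 * density_kg_m3 / 1000.0          # kg -> tonnes

# Nord Stream 2 alone: 300 million cubic metres
ns2 = methane_tonnes(300e6)    # ~215,000 tonnes (article: ~200,000)

# Both pipelines per the Danish Energy Agency: 778 million cubic metres
both = methane_tonnes(778e6)   # ~557,000 tonnes as a crude upper bound
```

The upper-bound figure for both pipelines comes out higher than the article's "more than 400,000 tonnes", which is consistent: not all the gas escaped before the leaks were plugged by seawater, and the agency's own estimate will use actual pipeline conditions rather than this idealized density.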
“The climate risks from the methane leak are quite large. Methane is a potent greenhouse gas, 30 times stronger than CO2 over 100 years and more than 80 times stronger over 20 years.” Prof Grant Allen, an expert in Earth and environmental science at Manchester University, said it was unlikely that natural processes, which convert small amounts of methane into carbon dioxide, would be able to absorb much of the leak. Allen said: “This is a colossal amount of gas, in really large bubbles. If you have small sources of gas, nature will help out by digesting the gas. In the Deepwater Horizon spill, there was a lot of attenuation of methane by bacteria. My scientific experience is telling me that – with a big blow-up like this – methane will not have time to be attenuated by nature. So a significant proportion will be vented as methane gas.” Unlike an oil spill, gas will not have as polluting an effect on the marine environment, Allen said. “But in terms of greenhouse gases, it’s a reckless and unnecessary emission to the atmosphere.” Germany’s environment agency said there were no containment mechanisms on the pipeline, so the entire contents were likely to escape. The Danish Energy Agency said on Wednesday that the pipelines contained 778m cubic metres of natural gas in total – the equivalent of 32% of Danish annual CO2 emissions. This is almost twice the volume initially estimated by scientists, and it would significantly bump up estimates of methane leaked to the atmosphere, from 200,000 to more than 400,000 tonnes. More than half the gas had left the pipes and the remainder is expected to be gone by Sunday, the agency said. Jean-Francois Gauthier, vice-president of measurements at the commercial methane-measuring satellite firm GHGSat, said evaluating the total gas volume emitted was “challenging”. “There is little information on the size of the breach and whether it is still going on,” Gauthier said.
“If it’s a significant enough breach, it would empty itself. It’s safe to say that we’re talking about hundreds of thousands of tonnes of methane. In terms of leaks, it’s certainly a very serious one. The catastrophic instantaneous nature of this one – I’ve certainly never seen anything like that before.” In terms of the climate impact, 250,000 tonnes of methane was equivalent to the impact of 1.3m cars driven on the road for a year, Gauthier said.
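Gauthier's cars comparison can be reproduced with two stated assumptions, neither of which comes from the article: a 100-year global warming potential (GWP) for methane of 25 (GWP values vary between IPCC assessment reports; the scientist quoted earlier uses "30 times") and average passenger-car emissions of about 4.6 tonnes of CO2 per year, a commonly cited US EPA figure.

```python
# Rough reproduction of the "1.3m cars" comparison. Both constants below
# are illustrative assumptions, not figures from the article.

GWP_100_CH4 = 25              # tonnes CO2-equivalent per tonne of methane
CAR_TONNES_CO2_PER_YEAR = 4.6 # typical passenger car, per US EPA

def car_equivalents(tonnes_methane):
    """Car-years of driving with the same 100-year warming impact."""
    co2e = tonnes_methane * GWP_100_CH4
    return co2e / CAR_TONNES_CO2_PER_YEAR

cars = car_equivalents(250_000)  # ~1.36 million cars (article: 1.3m)
```

With the higher GWP of 30 quoted earlier in the article the figure rises to about 1.6 million cars, so the published "1.3m" is at the conservative end of these assumptions.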
Environmental Science
Thousands of tourists spill onto a boardwalk in Alaska's capital city every day from cruise ships towering over downtown. Vendors hawk shoreside trips and rows of buses stand ready to whisk visitors away, with many headed for the area's crown jewel: the Mendenhall Glacier. A craggy expanse of gray, white and blue, the glacier gets swarmed by sightseeing helicopters and attracts visitors by kayak, canoe and foot. So many come to see the glacier and Juneau's other wonders that the city's immediate concern is how to manage them all as a record number are expected this year. Some residents flee to quieter places during the summer, and a deal between the city and cruise industry will limit how many ships arrive next year. But climate change is melting the Mendenhall Glacier. It is receding so quickly that by 2050, it might no longer be visible from the visitor center it once loomed outside. That's prompted another question Juneau is only now starting to contemplate: What happens then? "We need to be thinking about our glaciers and the ability to view glaciers as they recede," said Alexandra Pierce, the city's tourism manager. There also needs to be a focus on reducing environmental impacts, she said. "People come to Alaska to see what they consider to be a pristine environment and it's our responsibility to preserve that for residents and visitors." The glacier pours from rocky terrain between mountains into a lake dotted by stray icebergs. Its face retreated eight football fields between 2007 and 2021, according to estimates from University of Alaska Southeast researchers. Trail markers memorialize the glacier's backward march, showing where the ice once stood. Thickets of vegetation have grown in its wake. While massive chunks have broken off, most ice loss has come from the thinning due to warming temperatures, said Eran Hood, a University of Alaska Southeast professor of environmental science. The Mendenhall has now largely receded from the lake that bears its name. 
Scientists are trying to understand what the changes might mean for the ecosystem, including salmon habitat. There are uncertainties for tourism, too. Most people enjoy the glacier from trails across Mendenhall Lake near the visitor center. Caves of dizzying blues that drew crowds several years ago have collapsed and pools of water now stand where one could once step from the rocks onto the ice. Manoj Pillai, a cruise ship worker from India, took pictures from a popular overlook on a recent day off. "If the glacier is so beautiful now, how would it be, like, 10 or 20 years before? I just imagine that," he said. Officials with the Tongass National Forest, under which the Mendenhall Glacier Recreation Area falls, are bracing for more visitors over the next 30 years even as they contemplate a future when the glacier slips from casual view. The agency is proposing new trails and parking areas, an additional visitor center and public use cabins at a lakeside campground. Researchers do not expect the glacier to disappear completely for at least a century. "We did talk about, 'Is it worth the investment in the facilities if the glacier does go out of sight?'" said Tristan Fluharty, the forest's Juneau district ranger. "Would we still get the same amount of visitation?" A thundering waterfall that is a popular place for selfies, salmon runs, black bears and trails could continue attracting tourists when the glacier is not visible from the visitor center, but "the glacier is the big draw," he said. Around 700,000 people are expected to visit this year, with about 1 million projected by 2050. Other sites offer a cautionary tale. Annual visitation peaked in the 1990s at around 400,000 to the Begich, Boggs Visitor Center, southeast of Anchorage, with the Portage Glacier serving as a draw. 
But now, on clear days, a sliver of the glacier remains visible from the center, which was visited by about 30,000 people last year, said Brandon Raile, a spokesperson with the Chugach National Forest, which manages the site. Officials are discussing the center's future, he said. "Where do we go with the Begich, Boggs Visitor Center?" Raile said. "How do we keep it relevant as we go forward when the original reason for it being put there is not really relevant anymore?" At the Mendenhall, rangers talk to visitors about climate change. They aim to "inspire wonder and awe but also to inspire hope and action," said Laura Buchheit, the forest's Juneau deputy district ranger. After pandemic-stunted seasons, about 1.6 million cruise passengers are expected in Juneau this year, during a season stretching from April through October. The city, nestled in a rainforest, is one stop on what are generally weeklong cruises to Alaska beginning in Seattle or Vancouver, British Columbia. Tourists can leave the docks and move up the side of a mountain in minutes via a popular tram, see bald eagles perch on light posts and enjoy a vibrant Alaska Native arts community. On the busiest days, about 20,000 people, equal to two-thirds of the city's population, pour from the boats. City leaders and major cruise lines agreed to a daily five-ship limit for next year. But critics worry that won't ease congestion if the vessels keep getting bigger. Some residents would like one day a week without ships. As many as seven ships a day have arrived this year. Juneau Tours and Whale Watch is one of about two dozen companies with permits for services like transportation or guiding at the glacier. Serene Hutchinson, the company's general manager, said demand has been so high that she neared her allotment halfway through the season. Shuttle service to the glacier had to be suspended, but her business still offers limited tours that include the glacier, she said. 
Other bus operators are reaching their limits, and tourism officials are encouraging visitors to see other sites or get to the glacier by different means. Limits on visitation can benefit tour companies by improving the experience rather than having tourists "shoehorned" at the glacier, said Hutchinson, who doesn't worry about Juneau losing its luster as the glacier recedes. "Alaska does the work for us, right?" she said. "All we have to do is just kind of get out of the way and let people look around and smell and breathe." Pierce, Juneau's tourism manager, said discussions are just beginning around what a sustainable southeast Alaska tourism industry should look like. In Sitka, home to a slumbering volcano, the number of cruise passengers on a day earlier this summer exceeded the town's population of 8,400, overwhelming businesses, dragging down internet speeds and prompting officials to question how much tourism is too much. Juneau plans to conduct a survey that could guide future growth, such as building trails for tourism companies. Kerry Kirkpatrick, a Juneau resident of nearly 30 years, recalls when the Mendenhall's face was "long across the water and high above our heads." She called the glacier a national treasure for its accessibility and noted an irony in carbon-emitting helicopters and cruise ships chasing a melting glacier. She worries the current level of tourism isn't sustainable. As the Mendenhall recedes, plants and animals will need time to adjust, she said. So will humans. "There's too many people on the planet wanting to do the same things," Kirkpatrick said. "You don't want to be the person who closes the door and says, you know, 'I'm the last one in and you can't come in.' But we do have to have the ability to say, 'No, no more.'"
Environmental Science
Scientists find ‘forever chemicals’ in the blood of North Carolina dogs and horses A team of North Carolina State University scientists has identified elevated levels of “forever chemicals” in the blood of every pet dog and horse it tested in a recent community study. The research, published Wednesday in Environmental Science & Technology, establishes horses, and confirms dogs, as important sentinel species for gauging human exposure to cancer-linked per- and polyfluoroalkyl substances (PFAS) inside and outside the home. Blood chemistry panels conducted on the animals also revealed changes in the biological indicators used to assess liver and kidney dysfunction — two systems that are the primary targets of PFAS toxicity in humans, according to the study. The region of central North Carolina where the dogs and horses reside is highly contaminated with PFAS, due to the local production of these long-lasting, synthetic compounds, the authors explained. Known for their ability to linger in both the body and the environment, PFAS are found in industrial discharge, certain firefighting foams and a variety of household items. Many of these substances — of which there are thousands — are linked to kidney cancer, thyroid disease, testicular cancer and other illnesses. “Bolstering their utility as sentinels for human health effects, domestic animals have substantial overlap in shared health risks,” the authors stated. While this method has historically been used to study disease spread, the researchers described “growing interest in its application for chemical-induced health risks to inform regulatory and public health response to chemical hazards like PFAS.” The NC State researchers evaluated PFAS blood levels for 31 dogs and 32 horses from Gray’s Creek, N.C., at the request of community members who had voiced concerns about the well-being of their pets.
All of the households included in the study were on well water, and all of these wells had been tested by state inspectors and deemed contaminated with PFAS, according to the study. After receiving a general veterinary health check, the animals underwent a blood serum screening for 33 different types of PFAS — chosen based on compounds present in the adjacent Cape Fear River Basin, the authors explained. Among those 33 compounds of interest, the scientists identified 20 different PFAS in the pets. While all animals participating in the study had at least one such substance in their blood, more than 50 percent of the subjects had at least 12 of the 20 types of PFAS. One of the most notorious types of PFAS, an industrial and commercial product ingredient called PFOS, had the highest concentrations in dog serum, according to the study. The researchers found PFHxS, a surfactant used in certain firefighting foams and consumer products, in the blood of dogs but not of horses. Other specific types of PFAS — including the compound known colloquially as GenX — were identified only in the blood of dogs and horses that drank well water. In dogs that drank well water, median levels of two types of PFAS — PFOS and PFHxS — were similar to those of children included in the university’s GenX Exposure Study, also conducted in the Cape Fear River Basin. Such parallels suggest that pet dogs could serve as significant indicators of household PFAS, the authors explained. Dogs that drank bottled water, on the other hand, had different types of PFAS in their blood, and were by no means free of these substances, according to the study. In fact, the scientists identified 16 out of the 20 PFAS in dogs that drank bottled water.
“The fact that some of the concentrations in dogs are similar to those in children reinforces the fact that dogs are important in-home sentinels for these contaminants,” corresponding author Scott Belcher, an associate professor of biology at NC State, said in a statement. “And the fact that PFAS is still present in animals that don’t drink well water points to other sources of contamination within homes, such as household dust or food,” Belcher added. In comparison to the dogs, the horses overall had lower concentrations of PFAS in their blood, the authors found. Nonetheless, these animals had higher levels of a substance called NBP2, a byproduct of fluorochemical manufacturing. This heightened presence of a manufacturing byproduct suggests that contamination of the outdoor environment — potentially the discharge of PFAS onto forage — could be linked to their exposure, according to the study. “Horses have not previously been used to monitor PFAS exposure,” first author Kylie Rock, a postdoctoral researcher at NC State, said in a statement. “But they may provide critical information about routes of exposure from the outdoor environment when they reside in close proximity to known contamination sources,” Rock added. Copyright 2023 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
Environmental Science
Optimizing nitrogen application for sustainable rice production in China Addressing the dual challenge of food security and environmental quality highlights the importance of managing nitrogen inputs in rice production in a responsible and efficient manner. An underexplored aspect in previous research is that, in areas where smallholder farmers dominate the agricultural landscape, there is often significant variability in yield performance among different fields. Implementing uniform nitrogen fertilizer optimization measures may therefore result in yield fluctuations or economic risk for small farms. Thus, mobilizing millions of small farms to reduce nitrogen fertilizer inputs poses a significant challenge. It is crucial to conduct a risk analysis that considers both the potential yield risk in rice production and the environmental impact of adopting a nitrogen fertilizer optimization approach. In a study published in Nature, a research team led by Prof. Zhao Xu and Yan Xiaoyuan from the Institute of Soil Science of the Chinese Academy of Sciences (ISSCAS), together with collaborators from the University of California, Davis, the University of Maryland Center for Environmental Science and China Agricultural University, has proposed an optimal nitrogen rate strategy using new subregion-specific models, while ensuring that the social, economic, and environmental benefits of fertilizer use are integrated into the analysis. They also analyzed an extensive dataset from on-farm studies to assess the risk of yield losses among smallholders and the challenges associated with implementing the strategy. The researchers found that meeting China's national rice production targets in 2030 is possible under the optimal nitrogen rate strategy while concurrently reducing nationwide nitrogen consumption by 10%–27%, mitigating reactive nitrogen losses by 7%–24%, and increasing nitrogen use efficiency by 30%–36%.
Moreover, the national reactive nitrogen pollution from rice systems can be kept under the proposed environmental thresholds without compromising soil nitrogen stocks or economic benefits for smallholders. The researchers also suggested a multi-faceted approach to facilitate the implementation of the annually revised subregional nitrogen rate strategy, which is tailored to the current state of rice cropping systems in China: first, building a nationwide large-scale monitoring network for crop yield response to nitrogen application and an intelligent management system for "nitrogen control" decision-making; and second, establishing a nitrogen fertilizer quota management system with purchase quotas, and issuing incentives and subsidies to all farmers for optimizing nitrogen use. "This approach is aimed at reducing the spatial and temporal heterogeneity of fields, improving the accuracy and applicability of optimized nitrogen application, reducing the cost of technology popularization, ensuring the scientific and accurate implementation of fertilizer nitrogen zoning macro-control, and maximizing regional benefits," said Prof. Zhao from ISSCAS. "Our study may provide a preferable nitrogen strategy allocated to each region based on the trade-off between economic risk and environmental benefit," said Yan Xiaoyuan from ISSCAS. More information: Siyuan Cai et al, Optimal nitrogen rate strategy for sustainable rice production in China, Nature (2023). DOI: 10.1038/s41586-022-05678-x
Environmental Science
An artificial intelligence system enables robots to conduct autonomous scientific experiments -- as many as 10,000 per day -- potentially driving a drastic leap forward in the pace of discovery in areas from medicine to agriculture to environmental science. The work, reported today in Nature Microbiology, was led by a professor now at the University of Michigan. The artificial intelligence platform, dubbed BacterAI, mapped the metabolism of two microbes associated with oral health -- with no baseline information to start with. Bacteria consume some combination of the 20 amino acids needed to support life, but each species requires specific nutrients to grow. The U-M team wanted to know which amino acids the beneficial microbes in our mouths need, so that researchers can promote their growth. "We know almost nothing about most of the bacteria that influence our health. Understanding how bacteria grow is the first step toward reengineering our microbiome," said Paul Jensen, U-M assistant professor of biomedical engineering, who was at the University of Illinois when the project started. Figuring out the combination of amino acids that bacteria like is tricky, however. Those 20 amino acids yield more than a million possible combinations, just based on whether each amino acid is present or not. Yet BacterAI was able to discover the amino acid requirements for the growth of both Streptococcus gordonii and Streptococcus sanguinis. To find the right formula for each species, BacterAI tested hundreds of combinations of amino acids per day, honing its focus and changing combinations each morning based on the previous day's results. Within nine days, it was producing accurate predictions 90% of the time. Unlike conventional approaches that feed labeled data sets into a machine-learning model, BacterAI creates its own data set through a series of experiments.
By analyzing the results of previous trials, it comes up with predictions of what new experiments might give it the most information. As a result, it figured out most of the rules for feeding bacteria with fewer than 4,000 experiments. "When a child learns to walk, they don't just watch adults walk and then say 'Ok, I got it,' stand up, and start walking. They fumble around and do some trial and error first," Jensen said. "We wanted our AI agent to take steps and fall down, to come up with its own ideas and make mistakes. Every day, it gets a little better, a little smarter." Little to no research has been conducted on roughly 90% of bacteria, and the amount of time and resources needed to learn even basic scientific information about them using conventional methods is daunting. Automated experimentation can drastically speed up these discoveries. The team ran up to 10,000 experiments in a single day. But the applications go beyond microbiology. Researchers in any field can set up questions as puzzles for AI to solve through this kind of trial and error. "With the recent explosion of mainstream AI over the last several months, many people are uncertain about what it will bring in the future, both positive and negative," said Adam Dama, a former engineer in the Jensen Lab and lead author of the study. "But to me, it's very clear that focused applications of AI like our project will accelerate everyday research." The research was funded by the National Institutes of Health with support from NVIDIA.
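The "more than a million possible combinations" and "fewer than 4,000 experiments" figures above can be checked with simple combinatorics. A quick sketch using the article's numbers (the percentage comparison is illustrative, not from the study itself):

```python
# With 20 amino acids, each either present or absent in a growth medium,
# the number of distinct presence/absence combinations is 2**20.
NUM_AMINO_ACIDS = 20
total_combinations = 2 ** NUM_AMINO_ACIDS
print(total_combinations)  # 1048576 -- the "more than a million" in the article

# BacterAI reportedly learned most of the feeding rules in fewer than
# 4,000 experiments, a small fraction of the full search space.
experiments_used = 4000
print(f"{experiments_used / total_combinations:.2%}")  # about 0.38%
```

This is why the active-learning approach matters: exhaustively testing every medium would take orders of magnitude more lab work than the AI-guided search needed.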
Environmental Science
Recovering tropical forests offset just 25% of carbon emissions from new tropical deforestation and forest degradation A pioneering global study has found deforestation and forests lost or damaged due to human and environmental change, such as fire and logging, are fast outstripping current rates of forest regrowth. Tropical forests are vital ecosystems in the fight against both climate and ecological emergencies. The research, published today (March 15) in Nature and led by the University of Bristol, highlights the carbon storage potential of recovering forests and the current limits of forest regrowth in addressing such crises. The findings showed degraded forests recovering from human disturbances, and secondary forests regrowing in previously deforested areas, are annually removing at least 107 million metric tons of carbon from the atmosphere across the tropics. The team of international researchers has quantified the rates of aboveground carbon stock recovery using satellite data across the world's three largest tropical forests. Although the results demonstrate the important carbon value of conserving recovering forests across the tropics, the total amount of carbon being taken up in aboveground forest growth was only enough to counterbalance around a quarter (26%) of the current carbon emissions from tropical deforestation and degradation. Lead author Dr. Viola Heinrich, who recently gained a Ph.D. in physical geography at the University of Bristol School of Geographical Sciences, said, "Our study provides the first pan-tropical estimates of aboveground carbon absorption in tropical forests recovering from degradation and deforestation. "While protecting ancient tropical forests remains the priority, we demonstrate the value in sustainably managing forest areas that can recover from human disturbances."
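Together, the two figures above imply a rough estimate of gross emissions from tropical deforestation and degradation. A back-of-envelope check, assuming both quantities are expressed in the same units (millions of metric tons of carbon per year):

```python
# If recovering forests remove ~107 million metric tons of carbon per year,
# and that uptake offsets about 26% of emissions from tropical deforestation
# and degradation, the implied gross emissions are:
uptake_mt = 107          # Mt C per year absorbed by recovering forests (from the study)
offset_fraction = 0.26   # share of emissions counterbalanced
implied_emissions_mt = uptake_mt / offset_fraction
print(round(implied_emissions_mt))  # roughly 412 Mt C per year
```

The derived ~412 Mt figure is an illustration of the arithmetic, not a number reported by the study.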
Environmental scientists at the University of Bristol worked with experts from Brazil's National Institute for Space Research (INPE), in collaboration with scientists from across the U.K., Europe, and U.S. The team used satellite datasets that can distinguish deforestation from other human-induced disturbances, such as logging and fire, to determine the types of forests regrowing. Combined with information on aboveground carbon from the European Space Agency, and environmental variables, the team modeled the spatial patterns of forest regrowth in the Amazon, Central Africa and Borneo. They found that human disturbances in Borneo resulted in the greatest carbon reductions in degraded forests, compared with the Amazon and Central Africa, primarily due to the high-intensity logging of economically valuable trees. The climate and environment on Borneo also result in carbon accumulating about 50% faster than in the other regions. "The carbon recovery models we developed can inform scientists and policy makers on the carbon storage potential of secondary and degraded forests if they are protected and allowed to recover," said Viola, now a Research Associate at the University of Exeter. The team also found that one third of forests degraded by logging or fire were later completely deforested, emphasizing the vulnerability of the carbon sink in these recovering forests. "Tropical forests provide many vital direct resources for millions of people and animals. At large scales we need to protect and restore tropical forests for their carbon and climate value. On the local scale, people need to be allowed to continue to use the forests sustainably," added Viola. Co-author Dr.
Jo House, Reader in Environmental Science and Policy at the University of Bristol, who has authored many international assessments on climate change and forests, said, "Countries have repeatedly made pledges to reduce deforestation and forest degradation and restore deforested areas. "This is the most cost-effective and immediately available way to remove carbon from the atmosphere, alongside many co-benefits such as biodiversity, flood control and protection of indigenous peoples' livelihoods. Yet targets are repeatedly missed due to a lack of serious international coordinated support and political will. Our research demonstrates that time is running out." At COP27, hosted by Egypt last November, Brazil, Indonesia, and Congo forged a South-South alliance to protect rainforests. January 2023 saw the inauguration of Brazil's new president Luiz Inácio Lula da Silva, who has pledged to undo the damage caused by preceding policies and revert to protecting and restoring the Amazon. Co-author Dr. Luiz Aragão, Head of Earth Observation and Geoinformatics Division at the National Institute for Space Research (INPE) in Brazil, said, "Focusing on the protection and restoration of degraded and secondary tropical forests is an efficient solution for building robust mechanisms for sustainable development of tropical countries. This aggregates monetary value for the local to global environmental services provided by these forests, in turn benefiting local populations economically and socially." The team now plans to build on this research, improving the estimates of carbon losses and gains from different types and intensities of forest disturbance across the tropics. Journal information: Nature. Provided by University of Bristol
Environmental Science
A startup based in Tacoma, Washington has devised a portable system capable of removing the vast majority of per- and polyfluoroalkyl substances, or PFAS, from water. Housed within a 10-by-8-foot corrugated shipping container, the “PFAS Destruction Unit” is already helping tackle pollution around the state. As the health risks and ubiquity of PFAS become more apparent, scientists have increasingly sought ways to remove the persistent “forever chemicals” from water, soil, and other mediums. While they’ve made significant progress over the last few years, most PFAS removal methods are barely past the experimental stage. This means commercially available, scalable PFAS removal operations are still in demand—providing a perfect opportunity for Aquagga to swoop in and help. Aquagga’s PFAS Destruction Unit has already earned the startup nearly $7 million in crowdfunding, angel investments, government contracts, awards, and demonstration agreements. The system uses hydrothermal alkaline treatment, or HALT, to eliminate 99% of forever chemicals from water, as documented by scientific journals like Chemosphere and Environmental Science & Technology Letters. After the PFAS Destruction Unit has been supplied with contaminated water, it heats that water to 570 degrees Fahrenheit and applies roughly 25 megapascals of pressure. The system then creates a caustic environment by adding caustic soda, otherwise known as lye. After just 10 minutes in these harsh conditions, the molecular bonds that comprise PFAS break apart, separating carbon from fluoride. While the PFAS Destruction Unit captures carbon as-is, it combines fluoride with calcium or sodium to make harmless salts, which can be removed and used to create toothpaste, dietary supplements, and more. Aquagga recently returned from Alaska’s Fairbanks International Airport, where its system treated 20,000 gallons of water contaminated by PFAS-heavy firefighting foam, also called AFFF. 
The PFAS Destruction Unit successfully reduced the pool down to 1,000 gallons of foam; although “weather conditions and technical difficulties” forced the startup to pause operations, it’ll soon return to the airport to finish the job. Opportunities for Aquagga to shrink water and soil pollution are abundant. The startup’s website points to AFFF, which is frequently stored on military bases, airport grounds, and in landfills following fires or fire readiness training, as one prevalent use case; industrial runoff, often found near manufacturing sites, presents another. Aquagga is working on creating even smaller versions of its PFAS Destruction Unit to deploy short-term or periodically at these locations.
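For readers more used to metric or atmospheric units, the HALT operating conditions quoted above convert as follows (the figures are the article's; the conversions are standard):

```python
# HALT operating conditions from the article, converted to other units.
temp_f = 570.0
temp_c = (temp_f - 32.0) * 5.0 / 9.0
print(round(temp_c))  # about 299 C

pressure_mpa = 25.0
pressure_atm = pressure_mpa * 1.0e6 / 101325.0  # 1 atm = 101,325 Pa
print(round(pressure_atm))  # about 247 atm

# At ~299 C and 25 MPa, the water remains a compressed liquid: the temperature
# sits below water's critical point (~374 C), which is what allows the
# alkaline reaction to proceed in the liquid phase despite the extreme heat.
```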
Environmental Science
PROVIDENCE, R.I. (AP) — The plastics industry says there is a way to help solve the crisis of plastic waste plaguing the planet's oceans, beaches and lands — recycle it, chemically. Chemical recycling typically uses heat or chemical solvents to break down plastics into liquid and gas to produce an oil-like mixture or basic chemicals. Industry leaders say that mixture can be made back into plastic pellets to make new products. “What we are trying to do is really create a circular economy for plastics because we think it is the most viable option for keeping plastic out of the environment,” said Joshua Baca, vice president of the plastics division at the American Chemistry Council, the industry trade association for American chemical companies. ExxonMobil, New Hope Energy, Nexus Circular, Eastman, Encina and other companies are planning to build large plastics recycling plants. Seven smaller facilities across the United States already recycle plastic into new plastic, according to the ACC. A handful of others convert hard-to-recycle used plastics into alternative transportation fuels for aviation, marine and auto uses. But environmental groups say advanced recycling is a distraction from real solutions like producing and using less plastic. They suspect the idea of recyclable plastics will enable the steep ramp-up in plastic production to continue. And while the amount produced globally grows, recycling rates for plastic waste are abysmally low, especially in the United States. Plastic packaging, multi-layered films, bags, polystyrene foam and other hard-to-recycle plastic products are piling up in landfills and in the environment, or going to incinerators. Judith Enck, the founder and president of Beyond Plastics, says plastics recycling doesn't work and never will. Chemical additives and colorants used to give plastic different properties mean that there are thousands of types, she said.
That’s why they can’t be mixed together and recycled in the conventional, mechanical way. Nor is there much of a market for recycled plastic, because virgin plastic is cheap, she said. So what is more likely to happen than actual recycling, said Enck, a former regional administrator at the U.S. Environmental Protection Agency, is the industry will shift to burning plastics as waste or as fuel. Lee Bell, a policy advisor for the International Pollutants Elimination Network, thinks chemical recycling is a public relations exercise by the petrochemical industry. The purpose is to dissuade regulators from capping plastics production. Making plastic could become even more important to the fossil fuel industry as climate change puts pressure on their transportation fuels, Bell said. The industry has made roughly 11 billion metric tons of plastic since 1950, with half of that produced since 2006, according to industrial ecologist Roland Geyer. Global plastic production is expected to more than quadruple by 2050, according to the United Nations Environment Programme and GRID-Arendal in Norway. The international Organisation for Economic Co-operation and Development says the share of plastic waste that is successfully recycled is projected to rise to 17% in 2060 from 9% in 2019 if no additional policies are enacted to restrain plastic demand and enhance recycling, but that wouldn't begin to keep up with the projected growth in plastic waste. With more ambitious policies, the amount of plastic waste that is recycled could rise to 40% to 60%, according to the OECD. Two groups working to reduce plastic pollution, the Last Beach Clean Up and Beyond Plastics, estimated that the U.S. rate for recycling plastic waste in 2021 was even lower — 5% to 6% — after China stopped accepting other countries' waste in 2018. The U.S. national recycling strategy says no option, including chemical recycling, should be ruled out.
The way to think of these new plants, the industry says, is as manufacturing plants. They should be legally defined that way, and not as waste management. About 20 states have adopted laws in the past five years consistent with that wish. Opponents say it’s a way to skirt the more stringent environmental regulations that apply to waste management facilities.

EXISTING PLANTS

The U.S. facilities currently recycling plastic into new plastic are small — the largest is Alterra Energy's 60-ton-per-day plant in Akron, Ohio, according to the ACC. Alterra Energy says it takes in the hard-to-recycle plastics, like flexible pouches, multi-layered films and rigid plastics from automobiles — everything except plastic water bottles, since those are recycled mechanically, or plastics marked with a “3,” since they contain polyvinyl chloride, or PVC. “Our mission is to solve plastic pollution,” said Jeremy DeBenedictis, company president. “That is not just a tag line. We all truly want to solve plastic pollution.” The Ohio facility typically takes in 40 tons to 50 tons per day, heating and liquefying the plastic to turn it back into an oil or hydrocarbon liquid, about 10,000 gallons to 12,000 gallons daily. About 75% of what comes into the facility can be liquefied like that. Another 15% is turned into a synthetic natural gas to heat the process, while the remainder — paper, metals, dyes, inks and colorants — exits the reactor as a byproduct, or carbon char, DeBenedictis said.
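The throughput figures quoted above imply a simple daily mass balance for the Akron plant. A sketch using the article's percentages (the 50-ton intake is the upper end of the reported 40-50 ton range):

```python
# Rough daily mass balance for the Alterra facility, per the article:
# ~75% of intake is liquefied into synthetic oil, ~15% becomes syngas
# that heats the process, and the remainder exits as carbon char/byproduct.
intake_tons = 50.0  # upper end of the reported 40-50 tons per day
oil_tons = 0.75 * intake_tons
syngas_tons = 0.15 * intake_tons
char_tons = intake_tons - oil_tons - syngas_tons
print(oil_tons, syngas_tons, char_tons)  # 37.5 7.5 5.0 (tons per day)
```

So roughly five tons per day of char and other residue would leave the reactor at full intake, consistent with the disposal discussion that follows.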
The char is disposed of as nonhazardous waste, though in the future some hope to sell it to the asphalt industry. The process doesn’t involve oxygen so there’s no combustion or incineration of plastics, DeBenedictis said, and their product is trucked as a synthetic oil to petrochemical companies, essentially the “building blocks on a molecular level for new plastic production." The materials they take in, that haven't been able to be recycled until now, should not be sent to landfills, dumped in the ocean or incinerated, DeBenedictis said. “That next level has to be a new technology, what you call chemical recycling or advanced recycling. That’s the next frontier,” he said. “Let’s not kid ourselves here. This is the right time to do it,” added company CEO Fred Schmuck. “There is absolutely no way we can meet our climate goals without addressing plastic waste.” DeBenedictis said he’s licensing the technology to try to grow the industry because that’s the “best way to make the quickest impact to the world.” A Finnish oil and gas company, Neste, is currently working to commercialize Alterra’s technology in Europe. The main chemical recycling technologies use pyrolysis, gasification or depolymerization. Neil Tangri, the science and policy director at the Global Alliance for Incinerator Alternatives, is skeptical. He says he has been hearing that pyrolysis is going to change everything since the 1990s, but it hasn’t happened. Instead, plastic production keeps climbing. GAIA views chemical recycling as a false solution that will facilitate greater production of virgin plastic — a high-energy process with high-carbon emissions that releases hazardous air pollutants, Tangri said. Instead, GAIA wants plastic production to be dramatically scaled back and only recyclable plastics to be produced. “Nobody needs more plastic,” Tangri said. “We keep trying to solve these production problems with recycling when really we need to change how much we make and what we make.
That’s where the solution lies.”

EQUITY ISSUES IN SITING PLANTS

In Rhode Island, state lawmakers considered a bill this year to exempt such facilities from solid waste licensing requirements. It was vigorously opposed by environmental activists and residents near the port of Providence who feared it would lead to a new plant in their neighborhood. State environmental officials sided with them. Monica Huertas, executive director of The People’s Port Authority, helped lead the opposition. The neighborhood is already overburdened by industry, she said, so much so that she sometimes has asthma attacks after walking around. Dwayne Keys said it’s unfair that he and his neighbors always have to be on guard for proposals like these, unlike residents in some of the state’s wealthy, white neighborhoods. The port area has enough environmental hazards that residents don't benefit from economically, he added. Keys calls it environmental racism. “The assessment is, we're the path of least resistance,” he said. “Not that there's no resistance, but the least. We're a coalition of individuals volunteering our time. We don't have wealth or access to resources or the legal means, as opposed to our white counterparts in higher income, higher net worth communities.” The chemistry council's Baca said the facilities operate at the highest standards, the industry believes everyone deserves clean air and water, and he would invite any detractors to one of the facilities so they can see that firsthand. U.S. plastics producers have said they will recycle or recover all plastic packaging used in the United States by 2040, and have already announced more than $7 billion in investments in both mechanical and chemical recycling. “I think we are on the cusp of a sustainability revolution where circularity will be the centerpiece of that,” Baca said.
“And innovative technologies like advanced recycling will be what makes this possible.” Kate O’Neill wrote the book on waste, called “Waste.” A professor in the Department of Environmental Science, Policy and Management at the University of California, Berkeley, she has thought a lot about whether chemical recycling should be part of the solution to the plastic crisis. She said she has concluded yes, even though she knows saying so would “piss off the environmentalists.” “With some of these big problems,” she said, “we can’t rule anything out.” ___ Associated Press climate and environmental coverage receives support from several private foundations. The AP is solely responsible for all content.
Environmental Science
Killer whales' diet more important than location for pollutant exposure, study finds Both elegant and fierce, killer whales are some of the oceans' top predators, but even they can be exposed to environmental pollution. Now, in the largest study to date on North Atlantic killer whales, researchers in Environmental Science & Technology report the levels of legacy and emerging pollutants in 162 individuals' blubber. The animals' diet, rather than location, greatly impacted contaminant levels and potential health risks—information that's helpful to conservation efforts. As the largest member of the dolphin family, killer whales, also known as orcas, are found worldwide. Marine vessel traffic can disturb the hunting and communication of these black-and-white marine mammals. But they face another type of human threat—legacy and emerging persistent organic pollutants (POPs) in their environments. POPs include chlorinated hydrocarbons and flame retardants, and can accumulate in animals' fat stores as the contaminants move up the food chain through a process called biomagnification. Previous studies have shown that some Pacific orca populations can carry POP loads in their blubber that pose potential health risks, including reduced immunity, hormonal imbalances and reproductive issues. But information on orcas living in the North Atlantic is lacking. So, Anaïs Remili, Melissa McKinney and colleagues wanted to assess the contaminants present in animals spanning from Eastern Canada to Norway. The researchers collected skin and blubber biopsies from over a hundred free-ranging killer whales across the North Atlantic Ocean, from Canada and Greenland to Iceland and Norway. They analyzed half of each tissue sample for five classes of POPs, including polychlorinated biphenyls (PCBs). The other portion was used to evaluate the animals' diets.
Multiple features stood out in the data:
- Specimens from orcas in the western North Atlantic contained substantially higher contaminant loads than ones from orcas on the eastern side—a pattern that contrasts with previously reported POP levels in other Arctic marine organisms.
- The pattern could be attributed to individuals' diet rather than location. Specifically, killer whales foraging on fish had the lowest POP levels, and animals consuming marine mammals, such as seals or other whales, had the highest.
- PCB-associated health risks were highest for killer whales that ate primarily marine mammals, with most animals' levels exceeding the threshold for a higher risk of female reproductive failure.
- The levels of one POP, known as α-HBCDD, were the highest reported for any marine mammal to date, despite the fact that this brominated flame retardant was banned a decade ago.

The researchers say the findings support the need for proper waste disposal to prevent contaminants from entering the oceans' food chains and reaching the top predators. They explain that the findings of their study underscore the need for action to protect North Atlantic killer whales and their ecosystems. More information: Varying Diet Composition Causes Striking Differences in Legacy and Emerging Contaminant Concentrations in Killer Whales across the North Atlantic, Environmental Science & Technology (2023). DOI: 10.1021/acs.est.3c05516 Journal information: Environmental Science & Technology Provided by American Chemical Society
Environmental Science
A satellite image of the location of the Zaap-C offshore platform, with other offshore platforms visible flaring in the Gulf of Mexico, December 28, 2021. Copernicus Sentinel data (2015)/ESA, CC BY-SA 3.0 IGO/Handout via REUTERS

MEXICO CITY, Sept 13 (Reuters) - Scientists who detected a massive methane leak at an offshore platform run by Mexico's Pemex said Tuesday there was "no way" they had made a mistake, roundly rebutting claims by the state oil company that the emissions were smaller and less polluting. Pemex, which is under increasing international pressure over its environmental record, last week issued a statement calling the study published in the Environmental Science & Technology Letters journal incorrect, arguing it had mistaken nitrogen - also a colorless, odorless gas - for methane in its calculations. But in a response sent to Reuters, the scientists behind the study dismissed Pemex's position, saying the sensors they used to detect the methane leak at the Ku-Maloob-Zaap oil field cluster in the Gulf of Mexico cannot see nitrogen. "There is no way of mistaking one for the other," said two of the authors, Itziar Irakulis-Loitxate and Luis Guanter, both at the Polytechnic University of Valencia in Spain.
"The startling emissions we reported were 100% methane, plain and simple."The research, which was republished by the European Space Agency, found 40,000 tons of methane were emitted in December.It is part of a wider study funded by the agency that aims to detect and quantify human-made emissions from space using satellite data.Methane, the main component of natural gas, is considered a much more potent driver of global warming in the short term than carbon dioxide because it traps more heat in the atmosphere.Pemex (PEMX.UL) did not respond to a Reuters request for comment.Irakulis-Loitxate and Guanter said the satellite methods behind their study were bringing emissions to light that previously would have gone unreported."Methane is a huge challenge across the industry. Ideally, operators would embrace this new information," they said.Curbing methane emissions is considered a vital part of global attempts to limit global warming.Mexican President Andres Manuel Lopez Obrador has promised to dramatically reduce methane emissions and is facing increasing international pressure to do so. In June, U.S. Special Presidential Envoy for Climate John Kerry raised the matter during a visit to Mexico.Irakulis-Loitxate and Guanter said their satellite observations also showed the flare at the Zaap-C platform, used to burn off the excess natural gas and minimize methane's harmful impact, was unlit for 17 days in December."This is a matter of simple visual confirmation," the statement said. 
"Data from two other satellites confirm that the unlit flare was emitting large volumes of methane during that same period."Pemex had said the flare was unlit for just a few hours.Pemex also posted a video on its official Twitter account in which Chief Executive Officer Octavio Romero repeated that most of the gas that was being burnt on the platform was nitrogen."Here, we're not doing any irrational flaring - and even less so polluting in the way this publication claims," he said in the video, which included footage of the platform and the gas flare.Earlier this month, the scientists shared new data with Reuters that showed there was another leak of a similar magnitude from the same location during six days in August. read more Register now for FREE unlimited access to Reuters.comReporting by Stefanie Eschenbacher; Editing by Stephen Eisenhammer, Aurora Ellis and Edmund KlamannOur Standards: The Thomson Reuters Trust Principles.
Environmental Science
What happens when people unknowingly eat, drink or inhale nearly invisible pieces of plastic? Although it’s unclear what impact this really has on humans, researchers have now taken a step toward answering that question. In ACS’ Environmental Science & Technology, a team reports laboratory results indicating that tiny plastic particles could enter liver and lung cells and disrupt their regular processes, potentially causing adverse health outcomes. Plastic can’t be avoided in daily life. Many products that we bring into our homes are made of plastic or wrapped in plastic packaging — all of which could release micro- and nanometer-sized pieces that could be accidentally consumed or inhaled. Although the health risks to humans from taking in nanoplastics aren’t entirely clear, researchers have recently shown that particles less than 100 nm wide can enter animals’ blood and organs, causing inflammation, toxicity and neurological changes. So, Zongwei Cai, Chunmiao Zheng and colleagues wanted to examine the molecular-level and metabolic impacts when human lung and liver cells are exposed to similarly sized nanoplastics. The researchers cultured human liver and lung cells separately in laboratory plates and treated them with different amounts of 80 nm-wide plastic particles. After two days, electron microscopy images showed that nanoplastics had entered both types of cells without killing them. To learn more about what happened to the cells, the researchers looked at the compounds released during metabolism by mitochondria — crucial energy-producing organelles that are thought to be sensitive to nanoplastics. As liver and lung cells were exposed to more nanoplastics, they produced more reactive oxygen species and different amounts of nucleotides, nucleosides, amino acids, peptides and carboxylic acids, indicating that multiple metabolic processes were disturbed. In some cases, mitochondrial pathways appeared to be dysfunctional.
These observations demonstrate that while nanoplastics exposure doesn’t kill human lung and liver cells, it could disrupt critical processes, potentially causing negative impacts to organs, the researchers say. The authors acknowledge funding from the Hong Kong General Research Fund and the National Science Foundation of China. The American Chemical Society (ACS) is a nonprofit organization chartered by the U.S. Congress. ACS’ mission is to advance the broader chemistry enterprise and its practitioners for the benefit of Earth and all its people. The Society is a global leader in promoting excellence in science education and providing access to chemistry-related information and research through its multiple research solutions, peer-reviewed journals, scientific conferences, eBooks and weekly news periodical Chemical & Engineering News. ACS journals are among the most cited, most trusted and most read within the scientific literature; however, ACS itself does not conduct chemical research. As a leader in scientific information solutions, its CAS division partners with global innovators to accelerate breakthroughs by curating, connecting and analyzing the world’s scientific knowledge. ACS’ main offices are in Washington, D.C., and Columbus, Ohio. Journal: Environmental Science & Technology. Article Title: Metabolomics Reveal Nanoplastic-Induced Mitochondrial Damage in Human Liver and Lung Cells. Article Publication Date: 25-Aug-2022.
Environmental Science
Deaf scientists and sign language experts have created hundreds of new signs for British Sign Language (BSL). The expanded BSL vocabulary now includes climate-related terms like "greenhouse gas" and "carbon footprint", for which there were no official signs. That meant children, teachers and scientists would often have to finger-spell long, complex, scientific terms. "We're trying to create the perfect signs that visualise scientific concepts," explains Dr Audrey Cameron. Dr Cameron, who is profoundly deaf, leads the sign language project at Edinburgh University, which has just added 200 new environmental science terms to the BSL dictionary. She described how, in her own scientific career, a lack of vocabulary meant she was excluded from important meetings and conversations. "I was involved in research for 11 years and went to numerous meetings but was never truly involved because I couldn't understand what people were saying," she told BBC News. "I wanted to talk with people about chemistry and I just wasn't able to." Glasgow-based biology teacher Liam McMulkin has also been involved in the sign-creation workshops, hosted by the Scottish Sensory Centre. "The beauty of sign language - particularly for science - is that it's a visual language," he explained. "Some of the concepts are abstract, but sign language can really help children to understand them." Mr McMulkin used the sign for "photosynthesis" as an example, which uses one flat hand-shape to represent a leaf, while projecting the fingers - like the sun's rays - from the other hand. "When I do this [move the sun hand towards the leaf hand], you can see that the energy is being absorbed by the leaf," he explained. The science glossary project, funded in part by the Royal Society, has been running since 2007 and has added about 7,000 new signs to BSL.
Describing the process by which signs are developed, Dr Cameron explained: "We take a list of terms from the school curriculum and then work together to come up with something accurate but also visual of the meaning." The newest signs are themed around biodiversity, ecosystems, the physical environment and pollution. There is an online video glossary demonstrating the terms.

Missing words

The glossary is designed to support deaf children in schools. As 13-year-old Melissa, a deaf student at a mainstream school in Glasgow, explained: "They really help you understand what's happening." Melissa showed me the difference between laboriously finger-spelling greenhouse gases (G-R-E-E-N-H-O-U-S-E G-A-S-E-S) and using the new sign, which involves moving her closed fists around like gas molecules in the air. "With the sign I can see something is happening with the gas," she said. Mr McMulkin, who is Melissa's science teacher and is also profoundly deaf, added that hearing people were "constantly learning and acquiring knowledge" wherever they go, "but deaf people miss out on so much information". "That's why it's so important to use sign language in science lessons in schools," he said. "It allows deaf children to learn in their natural language." Dr Cameron also highlighted the value in education of depicting intricate scientific concepts in hand movements - for both hearing and deaf children. She recalled observing a class in which five-year-olds were learning about how things float or sink. "They were learning about how things that are less dense will float, which is quite complex," she explained. "And the teacher was using the sign for 'density'." The sign explains that concept by using one closed fist and wrapping the other hand around it - squeezing and releasing to represent different densities. "I thought - these five-year-olds are not going to get this.
But some time after the end of the lesson, they were asked a question about why things float or sink and they all used the sign for density," Dr Cameron said. "So I've seen how much of an impact this can have. And my passion has just grown as the glossary has grown." Prof Jeremy Sanders, chair of the Royal Society diversity and inclusion committee, said: "We hope these new signs will inspire and empower the next generation of BSL-using students and allow practising scientists to share their vital work with the world." Additional reporting by Kate Stephens and Maddie Molloy Hear more about the mission to create this visual vocabulary on Radio 4's Inside Science on BBC Sounds
Environmental Science
Humic substances affect iron wheel-driven hydroxyl radical production in soil

The hydroxyl radical (·OH) is the most reactive oxidant and plays important roles in the biogeochemical cycling of elements and the attenuation of contaminants in the environment. In recent years, the redox reaction of iron in subsurface sediments was found to produce ·OH naturally, even in dark conditions without the presence of exogenous hydrogen peroxide. However, the effect of mineral-associated soil organic matter (SOM) on this process is not well understood. In a study published in Environmental Science & Technology, a research team led by Prof. Zhu Yongguan and Li Gang from the Institute of Urban Environment of the Chinese Academy of Sciences reported the influence of humic substances, the major components of SOM, on microbially mediated iron reduction and reoxidation processes, and established the pathways of ·OH production in different SOM-containing systems. The researchers used fulvic acid (FA), humic acid (HA), and humin (HM), components of humic substances operationally separated from soil, to evaluate the influence of SOM characteristics on iron redox processes. They found that the high electron exchange capacities of FA and HA promoted the microbial iron reduction process, while HA, with its high electron donating capacity, inhibited the yield of ·OH. Using scavengers of the possible intermediates involved in ·OH production, the researchers established different pathways for ·OH production in SOM-containing systems. They found that a one-electron transfer process dominated ·OH production in the FA-containing system, while both one- and two-electron transfer processes were present in the HA- and HM-containing systems. Additionally, the researchers found that microbially mediated iron redox processes changed the properties of the dissolved fractions of SOM, and the aromaticity of the dissolved fraction of HA decreased due to its high reactivity with ·OH.
High-resolution transmission electron microscopy and X-ray diffraction showed that ferrous secondary minerals formed and that SOM inhibited their transformation to more stable, crystalline iron oxy(hydr)oxides. This work advances the understanding of SOM-involved iron redox processes and ·OH production. The mechanisms revealed need to be considered when evaluating the effect of potentially produced ·OH on pollutant degradation in redox-fluctuating environments. More information: Qiao Xu et al, Enhanced Formation of 6PPD-Q during the Aging of Tire Wear Particles in Anaerobic Flooded Soils: The Role of Iron Reduction and Environmentally Persistent Free Radicals, Environmental Science & Technology (2023). DOI: 10.1021/acs.est.2c08672 Journal information: Environmental Science & Technology Provided by Chinese Academy of Sciences
Environmental Science
Korea is regarded as a "water-stressed nation." Although the country receives annual precipitation of approximately 1,300 mm, it falls in concentrated periods and in specific regions, giving rise to challenges stemming from water scarcity. The lack of drinking water extends beyond mere inconvenience, posing life-threatening implications for certain individuals. In March 2023, the United Nations Children's Fund (UNICEF) released a report highlighting the plight of roughly 190 million children in Africa who suffer from an absence of safe water, resulting in the tragic daily loss of 1,000 children under the age of five. Nations across the globe are employing diverse approaches in an endeavor to mitigate this issue. However, seawater desalination is energy intensive, relies predominantly on fossil fuels, and causes environmental pollution such as the discharge of concentrated brine into the sea. Harvesting atmospheric water also presents challenges, particularly in regions where humidity is less than 70%, as condensing the vapor necessitates a substantial amount of energy, rendering it an ineffective solution. Recently, a joint team of researchers led by Professor Woochul Song from the Division of Environmental Science & Engineering at Pohang University of Science and Technology (POSTECH) and Omar M. Yaghi, Professor of Chemistry at UC Berkeley, accomplished successful atmospheric water harvesting using ambient sunlight in the Death Valley desert. The achievement signifies a promising breakthrough in tackling water scarcity, as it harnesses an effectively infinite resource without polluting the environment. The research findings were published in the international journal Nature Water on July 6 (local time), 2023. Metal-organic frameworks (MOFs) are porous materials characterized by minuscule pores measuring 1-2 nm. The high surface areas of MOFs allow them to function as a sorbent, capturing atmospheric water vapor.
To this end, the research team devised a water harvesting device, referred to as a water harvester, based on the MOF, leveraging its capability to draw in water from the atmosphere during nighttime while condensing the absorbed water into drinkable liquid using ambient sunlight throughout the day. The team's water harvester takes the form of a cylindrical structure, unlike conventional rectangular designs. This configuration ensures that the device's surface area aligns with the trajectory of the sun, maximizing the utilization of ambient sunlight from sunrise to sunset. The research team tested the harvester through water collection experiments conducted in two different locations: Berkeley in June 2022 and the Death Valley desert in August 2022. The Death Valley desert represents one of the world's hottest and most arid regions. With persistently elevated temperatures reaching 40 degrees even at midnight, soaring to a scorching 57 degrees during the day, and a relative humidity below 7%, the area experiences exceptionally dry conditions. During the experiments, the device harvested up to 285 g and 210 g of water per kilogram of MOF in Berkeley and the Death Valley desert, respectively. This represents a twofold increase in water production compared to previous harvesters. Notably, the research team's harvester successfully extracted water without generating any carbon footprint even in extremely dry weather conditions, including a highest temperature of 60 degrees and an average night humidity of 14%. The team's unique condenser and MOF-sorbent system powered the water harvesting process solely with ambient sunlight, rendering additional energy sources or power supplies unnecessary. The research holds substantial significance because the experiments were conducted in genuinely extreme environments, effectively showcasing the practicality of the technology.
The ability to harvest water from atmospheric water vapor has universal applicability, offering contributions toward environmental and public health objectives and thereby serving as a pivotal technology for human security. Professor Woochul Song at POSTECH explained, "We have substantiated the technology's potential to address the escalating challenges of water scarcity, further compounded by environmental issues." He added, "This technology embodies sustainability as it provides a reliable water resource regardless of geographic or weather conditions in the world." Story Source: Materials provided by Pohang University of Science & Technology (POSTECH). Note: Content may be edited for style and length.
Environmental Science
Can the UAE help get a planet-saving methane deal at COP28?

Environmental advocates are rightfully skeptical that the head of a national oil company can credibly steer the parties at the upcoming UN climate summit COP28 toward the strong climate mitigation needed to meet the climate emergency, especially the head of a company that is planning a major oil and gas expansion inconsistent with the 2015 Paris Agreement goal of limiting warming to no more than 1.5 degrees Celsius above pre-industrial levels. What can the United Arab Emirates, led by Sultan Al Jaber, Ph.D., the CEO of the Abu Dhabi National Oil Company, realistically deliver to keep the planet safe as head of COP28 this fall? There's a lot riding on Al Jaber, and many environmental advocates don't believe he can deliver. But… maybe he can. Maybe he can lead the charge to cut methane — the fastest way to cool the planet in time to avoid having self-amplifying feedback loops push the planet past the series of tipping points that are just ahead. At COP28, maybe the UAE can bring along other state-owned oil giants, which have opposed climate action, and lay the foundation for a global methane agreement. This wouldn't satisfy the many climate advocates who consider efforts to reduce methane leaks from the fossil fuel sector a moral hazard that could extend the life of the industry that is killing the planet. These advocates want, as we all should, to achieve 100 percent clean energy as soon as possible and create a just clean energy transition to a zero-emissions future. The challenge is that, as fast as we're expanding clean energy, it is not yet replacing the fossil fuel energy system. The legacy industry is not going to be pushed off the stage until there is enough clean energy to replace it. How fast can this transition happen? Under the most optimistic scenario, this will take until 2040. Public pressure and growing litigation might hasten that.
But state-owned companies control nearly three-quarters of global crude oil and gas production; and an estimated 75 percent of the oil and gas industry’s global methane emissions come from the countries they operate in, according to the International Energy Agency. Even if we succeed in transitioning to clean energy and net-zero emissions by 2040, a tremendously ambitious goal, our continuing emissions will continue the self-amplifying feedback loops where the planet warms itself and soon sends us careening past a series of irreversible tipping points in the next decade that we may not be able to recover from. Importantly, even the most aggressive decarbonization will only avoid 0.1 degree C of warming at mid-century. This is because much of fossil fuel CO2 is co-emitted with cooling sulfate aerosols, and when the sulfates fall out, which they do in a matter of days once a fossil power plant shuts down, this unmasks existing warming. The result: net warming for the first decade. We’ve got to do it, but the deceive-and-delay tactics of the fossil fuel industry have left us too little time for decarbonization on its own to limit the near-term temperature. In contrast, cutting methane can avoid nearly 0.3 degrees C of warming before 2050 — three times more than cutting CO2. It’s the only way we currently know of to slow near-term warming. Cutting methane is the near-term sprint we need to win in this critical decade, while cutting CO2 is a mid-term marathon. The real moral hazard we face is failing to solve climate change in time — failing to move fast enough to win the sprint and prevent the devastating impacts that will hit us when we start passing tipping points. The fossil fuel sector is ripe for action today, with more than 100 oil and gas companies, including those in the Oil and Gas Methane Partnership, already committed to reducing their aggregate upstream oil and gas methane emissions. Many companies are also supporting Zero Routine Flaring by 2030. 
These actions make economic sense since half of methane mitigation can be done at negative cost, leaving more product to sell in tight markets. Right now, 150 countries have also committed to reducing human-caused methane emissions by at least 30 percent by 2030 under the Global Methane Pledge. In support of the pledge, this week the United States and the EU Energy Council noted in their joint statement "the need to develop effective global schemes to limit leakage, venting, and flaring," and the joint progress in developing international standards for leak detection and quantification of methane emissions. But the pledge remains voluntary, and the world's largest methane emitters — Russia, China and India — have yet to commit to it. Because cutting methane is now the single most important strategy for slowing near-term warming, it is essential to move from a pledge to strong sectoral commitments building toward a mandatory methane agreement. Human-caused methane emissions come from three sectors — fossil fuels (35 percent), waste (20 percent) and agriculture (40 percent). Oil, gas and coal should be the first sector covered by a global methane agreement. (Methane mitigation is already becoming a key measure of near-term oil and gas industry climate change efforts.) The waste and agriculture sectors would come later under separate protocols. Al Jaber should use COP28 to forge an agreement to mitigate methane in the oil and gas sector to the maximum extent possible in the shortest time. This requires getting the state-owned energy companies on board with climate solutions, as U.S. Climate Envoy John Kerry recently noted. COP28 presents the UAE and the global climate community with the opportunity. The UAE is in a unique position to bring other nationally owned oil and gas companies together in support of methane emissions reductions as a first step toward a sectoral agreement on methane.
Such action by national oil and gas exporting companies is crucial to show a willingness to take climate change action, as well as to compete with private sector companies and countries like the United States and European Union that are taking aggressive methane mitigation action. Lower methane emissions from the oil and gas sector will increasingly confer a competitive advantage in the global marketplace as importers grow more concerned about the lifecycle emissions of their imports. Moreover, nationally owned companies have in recent years increasingly turned to private equity markets for capital; as global investors begin to prioritize investments with low emissions, the lifecycle emissions of nationally owned oil and gas companies will come under increased scrutiny, and the companies could find their financing constrained if they do not reduce emissions in keeping with publicly traded private sector companies. An announcement of a "Dubai Methane Fund" also would be welcome, as would help with other methane sectors and funding for the research and development needed for methane removal technology. But as Al Jaber said recently, "The oil and gas sector needs to up its game, do more and do it faster." He added, "Let's aim to reach net-zero methane emissions by 2030." We agree. The UAE and all nations should make a binding international agreement on methane reductions from the oil and gas sector a key outcome of COP28. Paul Bledsoe is professorial lecturer at American University's Center for Environmental Policy and a former Clinton White House climate official. Durwood Zaelke is president of the Institute for Governance & Sustainable Development (IGSD) in Washington, D.C. and Paris, and adjunct professor at University of California, Santa Barbara Bren School of Environmental Science & Management. Copyright 2023 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
Environmental Science
Chinese scientists are combating a glacier's melting by covering it with a blanket

One of China's most-visited glaciers in the Tibetan region, the Dagu Glacier, is now covered with white sheets, also known as geotextiles, in an attempt to slow its melting. Scientists at Nanjing University are leading the effort, and in July installed a white reflective material over 400 square meters of the glacier, located in Sichuan province in southwestern China. The project comes as the Dagu, one of the largest glacier tourism sites in China, and other glaciers around the country face severe melting due to rising temperatures. In the past 50 years, glaciers on the Tibetan Plateau, where the Dagu is located, have shrunk by about 15%, according to research from the Institute of Tibetan Plateau Research of the Chinese Academy of Sciences. The effort is funded by video game giant Tencent Holdings' Carbon Neutrality Lab, as part of the company's stated goal to be a leader in encouraging society to develop sustainably. The company says it is committed to achieving carbon neutrality by 2030 and seeks to foster sustainable innovation beyond its own goals, according to its website. Although glacier blankets are a novel method of glacier preservation in Tibet, they have been employed in the past, primarily at European ski resorts. At the Rhone Glacier in Switzerland, glacier blankets have been installed seasonally for the last 13 years in an attempt to preserve ski slopes and tourism. "We have received overwhelming support from the local communities over the past years," Bin said. "The protection of the glacier is more important than ever as it underpins the entire ecosystem as well as the livelihood of the locals." The use of blankets in other regions has come with controversy, as many people have criticized them for their lack of long-term feasibility. Blankets are often hard to scale up, and cannot protect glaciers from the effects of warming for long periods of time.
"They're a little bit of a desperation measure," said Mauri Pelto, a professor of environmental science at Nichols College who studies glaciers. "If you've only covered just a tiny part of the glacier, the [rest] is melting. That doesn't sustain your operation." There are also obstacles depending on the region, including the fact that higher elevation glaciers are harder to reach. Combined with the high expenses of the project, many say that geotextiles offer too little, too late. A study published last year in the journal Remote Sensing by researchers from the Chinese Academy of Sciences showed that a blanket installed over part of the Dagu Glacier was effective at slowing melting, with the covered area showing 15% less mass loss than uncovered areas. However, as the study noted, high expenses, harsh geography and aging of textiles limit the feasibility of using blankets in widespread areas. Geotextiles must be manually installed, and many glaciers are in areas that are difficult to reach. Although the textiles used in this project are designed to be eco-friendly, according to researchers, they are still made from materials that create carbon emissions. Although glacier blankets cannot stop melting, some have said that, especially in areas with high tourism, like the Dagu Glacier, they can be effective visual reminders of the impact of climate change. "I think actually ultimately, this is more of a visual reminder to tourists and recreationists that climate change is making an impact in those areas," said Mark Carey, a historian at the University of Oregon who studies glacier retreat and climate change. "It's an educational tool." Researchers on the project team are aware that glacier blankets can only do so much. "All the human intervention methods that we're working on, even if they prove effective, are only going to slow down [the melting]," Bin said. "If the earth keeps getting warmer, in the end, there is no way to protect the glaciers forever." 
Provided by State of the Planet. This story is republished courtesy of the Earth Institute, Columbia University (http://blogs.ei.columbia.edu).
Environmental Science
The number of students on renewables-related courses in Scotland has soared by 70% in four years, figures reveal. Scottish Renewables found that 22,000 undergraduates were studying subjects which cover the sector, ranging from engineering to maths. The same survey in 2019 reported around 13,000 young people studying in similar areas. Scottish Renewables said it demonstrated the attractiveness of the industry. The figures come from a Freedom of Information (FOI) request to 33 colleges and universities. The research also showed the sector is dominated by men, as only 28% of students are female. Ellice Mentiplay was among the minority but she is now a commercial graduate with EDF Renewables in Edinburgh. The 24-year-old was born in Perth but her father's job in oil and gas meant she lived across the world before finally returning to Scotland. Her degree at Abertay University was in environmental science, which she followed with a masters in energy, society and sustainability at Edinburgh. 'It is going to be huge' On wind energy, she said: "I think offshore, particularly in Scotland, is just going to be huge in the future, especially with developments into things like floating offshore wind. "I wasn't really aware when I was younger that that was a job career path that I could go down. "Working in renewables means that I can also help the UK transition over to clean net zero energy." EDF runs Scotland's only nuclear power station at Torness, in East Lothian, and is now building the Neart na Gaoithe wind farm, off the coast of Fife. It is expected to generate its first electricity this year. And on the quayside at Dundee harbour, towers and turbine blades are now stacking up, ready to be installed. Millie Anderson is a fourth year student in mechanical engineering with renewables at Dundee University. She is from the city and has no family connection with the energy sector but has always had a keen interest in engineering.
The expanding nature of the renewables sector was the key selling point as she believes the petroleum side is probably not going to grow. She said: "I definitely think it's more exciting than daunting because you've got so much more potential. It might mean certain things don't work out but that's part of the fun of it is figuring out what works. "It's not 'same old, same old.' It's different challenges, new challenges." The industry body Scottish Renewables said wind, solar and hydropower already provided the "vast majority" of Scotland's electricity and contributed more than £5.6bn to the economy. From their FOI figures, the most popular courses are engineering with 5,373 students followed by business and management. The institutions with the highest number of students studying renewables-related courses are Glasgow Caledonian followed by St Andrews and Glasgow. For Aaron Wilson, from Dunfermline, it was a family trip to a hydro power station as a child which sparked his interest in energy. The 22-year-old is now studying a masters in sustainability at Dundee and wants to enter into environmental consultancy covering areas of policy and law. He said: "At the start I definitely believed that it might be difficult for me to actually be able to get a job but now I've seen the growth and it's becoming much more prominent. "Even consultancy companies have grown rapidly." Aaron believes there is a generational shift in attitudes towards tackling climate change through sustainability and that younger people want roles which will help the energy transition. A report by Skills Development Scotland found that in 2021 there were about 100,000 "green jobs" although it said the figure could be far smaller. It said identifying the true number was difficult for multiple reasons, not least because there's no formal definition of a "green job." 
Some might be directly linked to sustainability or renewables engineering whereas others might have greener elements such as a plumber installing heat pumps. But it said almost 40% of all vacancies could now be described as "green jobs". Johnjo Morgan, 23, is in the second of a three-year apprenticeship in automotive engineering. He is a self-confessed petrol head and landed a dream job restoring classic cars at Errol in Perthshire. One day a week he studies at Dundee and Angus College on a course which now includes teaching students how to work on electric vehicles. He has noticed an increase in people converting older cars to run with battery powered electric motors. Johnjo explained: "It's grown in the last two years. It didn't exist and now everyone's starting to hear about it so I'd imagine it's going to get bigger and bigger. "Older cars were more interesting with petrol and diesel. The engines were more interesting but they're of their time. The future is definitely electric."
Environmental Science
[Image caption: Climate change is making extreme weather including flooding more likely, scientists say (Getty Images)]

One of the world's largest oil companies accurately forecast how climate change would cause global temperatures to rise as long ago as the 1970s, researchers claim. ExxonMobil's private research predicted how burning fossil fuels would warm the planet, but the company publicly denied the link, they suggest. The academics analysed data in the company's internal documents. ExxonMobil denied the allegations.

"This issue has come up several times in recent years and, in each case, our answer is the same: those who talk about how 'Exxon Knew' are wrong in their conclusions," the company told BBC News.

Corporations including ExxonMobil have made billions from selling fossil fuels that release emissions that scientists, governments and the UN say cause global warming. The findings suggest that ExxonMobil's predictions were often more accurate than even those of world-leading Nasa scientists.

"It really underscores the stark hypocrisy of ExxonMobil leadership, who knew that their own scientists were doing this very high quality modelling work and had access to that privileged information while telling the rest of us that climate models were bunk," Naomi Oreskes, professor of the history of science at Harvard University, told BBC News.

The findings are a "smoking gun", suggests co-author Geoffrey Supran, associate professor of environmental science and policy at the University of Miami. "Our analysis allows us for the first time to actually put a number on what Exxon knew, which is that the burning of their fossil fuel products was going to heat the planet by about 0.2C of warming every decade," he said. Researchers have never before quantified the scientific evidence in ExxonMobil's documents, he says.
In response, ExxonMobil pointed to a 2019 US court ruling that concluded: "ExxonMobil executives and employees were uniformly committed to rigorously discharging their duties in the most comprehensive and meticulous manner possible."

"ExxonMobil is committed to being part of the solution to climate change and the risks it poses," a spokesperson said.

[Image caption: A chart that researchers say compares ExxonMobil's predictions of temperature rise with actual temperature increase (Geoffrey Supran/Naomi Oreskes)]

"Their excellent climate modelling was at least comparable in performance to one of the most influential and well-regarded climate scientists of modern history," Prof Supran said, comparing ExxonMobil's work to Nasa's James Hansen, who sounded the alarm on climate in 1988.

Prof Oreskes said the findings show that ExxonMobil "knowingly misled" the public and governments. "They had all this information at their disposal but they said very, very different things in public," she explained.

Previous investigations have unearthed Exxon documents that suggest the company sought to spread doubt about the science.
One internal paper set out the "Exxon position" to "emphasise the uncertainty in scientific conclusions" about the greenhouse effect.

The research, published in the academic journal Science, also suggests that ExxonMobil had reasonable estimates for how emissions would need to be reduced in order to avoid the worst effects of climate change in a world warmed by 2C or more. Their scientists also correctly rejected the theory that an ice age was coming at a time when other researchers were still debating the prospect.

Prof Oreskes and Prof Supran carried out the research after journalists in 2015 uncovered evidence suggesting ExxonMobil knew about climate change, but were accused by ExxonMobil of "cherry-picking" the truth. They plotted scientific data in more than 100 publications from Exxon and ExxonMobil between 1977 and 2014 to calculate their predictions of global temperature rise. Prof Oreskes suggests that this showed the company was internally using climate science when publicly it called the models "speculative" or "bad science".

The findings add to ongoing pressure on the company over what it knew about climate change. Campaigners allege it spread misinformation in order to protect its business interests in fossil fuels and are suing the company in a number of US courts. In May, a court in Massachusetts ruled that ExxonMobil must face trial over accusations it lied about climate change.
Environmental Science
[Image captions: Although the study utilized feathers obtained from Manx shearwaters, it is believed that the findings should also apply to other species (Jamie Darby, School of Biological Earth and Environmental Science, UCC); A microscope image shows how oil exposure causes feather structures known as barbules to clump, instead of interlocking to form a waterproof barrier (Dr. Richard Unitt, School of Biological Earth and Environmental Science, UCC)]

It's always upsetting to see images of seabirds covered in crude oil as the result of an accidental spill. According to a new study, however, even tiny amounts of routinely released waterborne oil may seriously damage such birds' feathers.

Led by researcher Emma Murphy, a team at Ireland's University College Cork started by collecting feathers from live Manx shearwaters, which are a type of seabird believed to be threatened by oil pollution. In lab tests, the scientists measured how quickly water would pass through those feathers after exposure to four different thicknesses of surface crude oil; the thicknesses corresponded to those commonly observed at sea. Additionally, high-powered microscopes were used to assess how exposure to that oil affected the structure of the feathers.

It was found that even very thin surface films of oil did structurally harm the feathers, decreasing the ability of feather structures known as barbules to interlock with one another; this made the feathers significantly less waterproof. The films in question were just 0.1 to 3 microns thick. Putting those figures in context, the thickness of a human hair is approximately 70 microns (a micron is one millionth of a meter).
According to previous studies, oil-exposure-related loss of feather waterproofing increases the likelihood of seabirds becoming waterlogged, less buoyant, and less able to retain body heat. And while major oil spills may not be a constant occurrence, activities such as shipping and offshore oil extraction do routinely release smaller amounts of oil into the ocean.

"Chronic small-scale oil pollution is commonly overlooked in the marine environment, though it has been shown to have serious implications for the fitness and survival of seabirds," said Murphy. "This study examined one species, but the results can be extended to other species that rely on waterproofing to stay healthy when at sea for long periods."

A paper on the research was recently published in the journal Royal Society Open Science.

Source: University College Cork

Based out of Edmonton, Canada, Ben Coxworth has been writing for New Atlas since 2009 and is presently Managing Editor for North America. An experienced freelance writer, he previously obtained an English BA from the University of Saskatchewan, then spent over 20 years working in various markets as a television reporter, producer and news videographer. Ben is particularly interested in scientific innovation, human-powered transportation, and the marine environment.
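The scale comparison in the study above can be restated as a quick back-of-envelope calculation; this illustrative snippet uses only the two figures quoted in the article (film thicknesses of 0.1 to 3 microns, and a roughly 70-micron human hair):

```python
# Back-of-envelope comparison of the oil-film thicknesses from the study
# with the approximate thickness of a human hair (figures from the article).
HAIR_UM = 70.0               # human hair thickness in microns (approximate)
FILM_RANGE_UM = (0.1, 3.0)   # surface oil films tested, in microns

for film_um in FILM_RANGE_UM:
    ratio = HAIR_UM / film_um
    print(f"A human hair is roughly {ratio:.0f}x thicker than a {film_um} micron oil film")
```

Even the thickest film tested is more than twenty times thinner than a hair, which underlines how little oil is needed to disrupt the barbule structure.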
Environmental Science
An upcoming NASA mission will provide an unprecedented look at ice clouds at high altitudes in Earth's atmosphere. NASA's Polarized Submillimeter Ice-cloud Radiometer (PolSIR) is an instrument designed to study ice clouds that form high above tropical and subtropical regions of the Earth. A pair of these relatively low-cost sensors will be mounted on two small satellites and launched into low Earth orbit, where they will collect data on how ice clouds change over the course of a day. The data will help scientists better understand both how these ice clouds are responding to climate change and how they might influence our climate in the future. "Studying ice clouds is crucial for improving climate forecasts — and this will be the first time we can study ice clouds in this level of detail," Nicola Fox, associate administrator for the Science Mission Directorate at NASA Headquarters in Washington, said in a statement. The equipment for the mission consists of two identical pairs of radiometers, which will measure electromagnetic radiation coming off of the clouds. The radiometers will record submillimeter radiation at two different frequencies: 325 and 680 gigahertz. Each pair of radiometers will travel aboard a cubesat, a mini satellite a little over a foot tall. The two cubesats will orbit between three and nine hours apart, enabling them to continuously collect data on the ice clouds over a 24-hour period. "The radiometers, which measure the radiant energy emitted by clouds, will significantly improve our understanding of how ice clouds change and respond throughout the day," Karen St. Germain, who leads NASA's Earth Sciences Division, said in the statement. PolSIR is part of NASA's Earth Venture class of missions, a group of relatively low-cost missions to explore the Earth and improve our ability to predict future changes. Earth Venture missions are selected through open, competitive grant applications.
The PolSIR proposal was submitted by a group at Vanderbilt University, and the team will receive a grant of $37 million to cover operation costs (not including the cost of launch). Ralf Bennartz, chair of the department of earth and environmental science at Vanderbilt, will lead the mission, along with Dong Wu of NASA's Goddard Space Flight Center in Greenbelt, Maryland. The mission joins NASA's many other Earth-focused missions, including the recently launched TROPICS experiment and the TEMPO mission, both also Earth Venture missions. At its inaugural climate change summit in December 2022, NASA highlighted several earth science missions which will help us understand the many impacts of climate change on our planet. PolSIR is scheduled to launch in 2027, according to a Vanderbilt University statement.
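As a rough sanity check on the instrument's "submillimeter" designation, the two observing frequencies quoted above can be converted to wavelength with the standard relation λ = c/f. This is an illustrative calculation only, using just the 325 and 680 GHz figures from the article:

```python
# Convert the two PolSIR observing frequencies (from the article) to
# wavelength, confirming both fall in the submillimeter band (< 1 mm).
C = 299_792_458.0  # speed of light in m/s

for freq_ghz in (325, 680):
    wavelength_mm = C / (freq_ghz * 1e9) * 1e3  # meters -> millimeters
    print(f"{freq_ghz} GHz corresponds to about {wavelength_mm:.2f} mm")
```

Both wavelengths come out below one millimeter (about 0.92 mm and 0.44 mm), consistent with the "Submillimeter" in the instrument's name.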
Environmental Science
Dozens of species of frogs, salamanders and other amphibians quietly disappeared from parts of Latin America in the 1980s and 2000s, with little notice from humans outside of a small group of ecologists. Yet the amphibian decline had direct health consequences for people, according to a study from the University of California, Davis. The study, published in the journal Environmental Research Letters, links an amphibian die-off in Costa Rica and Panama with a spike in malaria cases in the region. At the spike's peak, up to 1 person per 1,000 annually contracted malaria who would not have, had the amphibian die-off not occurred, the study found. "Stable ecosystems underpin all sorts of aspects of human well-being, including regulating processes important for disease prevention and health," said lead author Michael Springborn, a professor in the UC Davis Department of Environmental Science and Policy. "If we allow massive ecosystem disruptions to happen, it can substantially impact human health in ways that are difficult to predict ahead of time and hard to control once they're underway."

[Figure 1 from the UC Davis study shows the spike in annual total malaria cases from 1976-2016 for Costa Rica and Panama. (Michael Springborn et al./UC Davis)]

A natural experiment

From the early 1980s to the mid-1990s, a deadly fungal pathogen called Batrachochytrium dendrobatidis, or "Bd," traveled across Costa Rica, devastating amphibian populations. This amphibian chytrid fungus continued its path eastward across Panama through the 2000s. Globally, the pathogen led to the extinction of at least 90 amphibian species, and to the decline of at least 500 additional species. Shortly after the mass die-off of amphibians in Costa Rica and Panama, both countries experienced a spike in malaria cases.

[Image caption: A microscopic view of skin from a dead frog collected during the die-off at El Cope, Panama in 2004. The round cells are the fungal pathogen Bd. The grey-green irregular cells are frog skin. Notice there are more fungal cells than frog cells. (Forrest Brem, University of Memphis)]

[Image caption: The Chiriqui harlequin frog is among the many species of amphibians that disappeared from the Talamanca highlands of Costa Rica and Panama following the arrival of Bd, a fungal pathogen. (Marcos Guerra/Smithsonian Tropical Research Institute)]

Some frogs, salamanders and other amphibians eat hundreds of mosquito eggs each day. Mosquitoes are a vector for malaria. Scientists wondered: could the crash in amphibians have influenced the rise in malaria cases? To find out, the researchers combined their knowledge of amphibian ecology, newly digitized public health record data, and data analysis methods developed by economists to leverage this natural experiment. "We've known for a while that complex interactions exist between ecosystems and human health, but measuring these interactions is still incredibly hard," said co-author Joakim Weill, a Ph.D. candidate at UC Davis when the study was conducted. "We got there by merging tools and data that don't usually go together. I didn't know what herpetologists studied before collaborating with one!" The results show a clear connection between the time and location of the spread of the fungal pathogen and the time and location of increases in malaria cases. The scientists note that while they cannot fully rule out another confounding factor, they found no evidence of other variables that could both drive malaria and follow the same pattern of die-offs.

[Figure 2 from the study shows the date of the pathogen-driven amphibian decline across Costa Rica and Panama. Colored shading indicates the earliest date of decline at the county level. (Michael Springborn, et al./UC Davis)]

Tree cover loss was also associated with an increase in malaria cases, but not nearly to the same extent as the loss of amphibians.
Typical levels of tree canopy loss increase annual malaria cases by up to 0.12 cases per 1,000 people, compared to 1 in 1,000 for the amphibian die-off.

Trade threats

Researchers were motivated to conduct the study by concerns about the future spread of similar diseases through international wildlife trade. For instance, Batrachochytrium salamandrivorans, or "Bsal," similarly threatens to invade ecosystems through global trade markets. Springborn said measures that could help prevent the spread of pathogens to wildlife include updating trade regulations to better target species that host such diseases, as our knowledge of threats evolves. "The costs of putting those protective measures in place are immediate and evident, but the long-term benefits of avoiding ecosystem disruptions like this one are harder to assess but potentially massive, as this paper shows," Springborn said. Additional co-authors include Karen Lips of the University of Maryland, Roberto Ibáñez of the Smithsonian Tropical Research Institute in Panamá, and Aniruddha Ghosh of UC Davis and the Alliance of Bioversity International and CIAT in Kenya. The study was funded by the National Science Foundation and the UC Davis Institute of the Environment.
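The two effect sizes quoted above (up to 1 extra case per 1,000 people from the amphibian die-off versus up to 0.12 from typical tree canopy loss) can be compared directly; a quick illustrative calculation using only the article's figures:

```python
# Compare the two excess-malaria effect sizes reported in the study
# (annual cases per 1,000 people, figures taken from the article).
amphibian_dieoff = 1.0   # up to ~1 extra case per 1,000 people
canopy_loss = 0.12       # up to ~0.12 extra cases per 1,000 people

ratio = amphibian_dieoff / canopy_loss
print(f"The die-off effect is roughly {ratio:.0f}x the canopy-loss effect")
```

That is, at their upper bounds, the amphibian die-off's effect on malaria was roughly eight times larger than that of tree cover loss.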
Environmental Science
Researchers demand European Parliament take action to fight pollution in the Mediterranean Sea

The implementation of effective policies at local and regional level, and the cooperation of all countries in the Mediterranean Sea basin, are urgently needed to successfully reverse the environmental problems in this marine area. This is evidenced by a report carried out by the Institute of Environmental Science and Technology of the Universitat Autònoma de Barcelona (ICTA-UAB) and presented in the European Parliament by oceanographer Patrizia Ziveri, who stresses the need to urgently fight the growing pollution caused by marine litter and plastics in the Mediterranean, to improve current legislation, and to monitor new pollutants that require immediate regulation. The study, requested by the Policy Department for Structural and Cohesion Policies of the European Parliament's Committee on Regional Development (REGI), provides an exhaustive analysis of the current situation of the Mediterranean Sea, a global pollution hotspot, as well as the actions taken by the cities and regions of the Mediterranean countries of the European Union to reduce the generation and dispersion of marine pollutants. The study makes policy recommendations and points out that pollution affects the marine environment and fauna, as well as human health. The Mediterranean is one of the world's marine areas under the greatest human pressure. Its high rates of population and urbanization (150 million inhabitants on its coasts), industrial activity, tourism (one third of the world's volume) and fishing have led to a rapid increase in pollution. It accounts for up to 30% of global shipping activity. This, combined with a geomorphological configuration in the form of a semi-enclosed basin and its specific oceanic circulation, has made the Mediterranean Sea one of the most polluted spots on the planet and a natural trap for marine litter, mainly plastics.
Between 80 and 90 percent of marine litter in the basin is plastic, and an estimated 230,000 tons of land-sourced plastic leak into the sea each year. Tourism is the main sector contributing to beach litter (up to 60%) followed by fishing and aquaculture (5-10%). Only 10 types of items account for 66.4% of the beach litter in the Mediterranean Sea, 9 of them are made partly or entirely of plastic, and 7 of them of single-use plastic. Cigarette butts and cigarette filters are the most common (27.3%). Shipping activities are estimated to contribute up to 20,000 tons of plastic per year. The ICTA-UAB report, "Actions of cities and regions in the Mediterranean Sea area to fight sea pollution," indicates that the main cause of this situation is the massive waste generation and its mismanagement. Other causes include industrial and urban waste discharge, sewage, agricultural run-off, shipping, fishing, and maritime traffic, as well as tourism. "To tackle pollution, management policies must be applied to waste reduction and treatment, tourism, pollution from plastics and other pollutants, sewage and other waste from rivers," explains Patrizia Ziveri, oceanographer at ICTA-UAB. It is necessary to target the production model, consumption patterns and waste disposal practices. In this context, "it is essential that the fight against pollution in the Mediterranean Sea is endorsed not only by EU countries, but that regulations are implemented by all Mediterranean countries through effective cooperation", she says. The implementation and success of the actions to fight marine pollution should be monitored at different stages. Best practices should be highlighted, shared, and implemented in different suitable Mediterranean regions. The scientists stress that significant progress has been made in terms of treatment and prevention, including the implementation of the single-use plastic directive and the promotion of recycling. However, more and continued efforts are needed. 
The study examines the implementation of the EU's single-use plastics directive in France, Spain, Italy, and Greece, and calls for a strategy to reduce plastics which includes market restrictions, improved waste management and agreements between consumers and producers. "Efforts to reduce the use of plastics must continue in order to meet environmental targets. There is an urgent need to focus on the EU's strategy targets for key sectors, such as consumption patterns, production, and waste management," says Michael Grelaud, ICTA-UAB oceanographer and co-author of the report. "Some actions to limit marine-based pollution (fisheries, aquaculture, shipping, mining) already exist, but they often face challenges in terms of effective implementation because this is often reduced to voluntary collaborations by states," says Jorge Pato, also co-author of the report. Some of the other measures they propose in different areas are: - Emerging pollutants. This refers to new pollutants such as pharmaceuticals, UV filters, flame retardants or pesticides that reach the sea through agricultural, urban and industrial runoff or coastal wastewater treatment plants. - Microplastics. They point out that there are no regulations for the growing problem of microplastics. "Microplastic pollution should be established as a priority issue in the Mediterranean agenda, capable of leading to binding agreements". They point to the establishment of bans and reduction targets in the manufacture of fabrics and cosmetics, monitoring the entry of microplastics into the sea in all water-channels, including rivers and sewage outflows. Strict regulation of ship paint and antifouling coatings is needed. - Marine noise pollution. Shipping, oil and gas exploration, construction and maintenance of offshore structures, and military activities are a dangerous source of noise pollution affecting marine fauna, causing behavioral disturbances, communication disruption, hearing damage, stress and even death. 
They propose the creation of particularly sensitive sea areas where noise levels are restricted (with special attention to migratory routes, breeding grounds and biodiversity hotspots), the use of quieter ship models and the reduction of ship speeds. - Rivers, wastewater treatment and harbors. The challenge in managing water pollution lies in the implementation of policies by the signatory countries. This is particularly evident given the varying levels of economic development among the Mediterranean nations. They are committed to the cyclical reuse of treated effluent for agriculture to reduce spending on fertilizers and the recovery of organic wastewater from urban areas as a valuable agricultural resource. - Aquaculture. Pollutes by discharging untreated waste, using chemicals, and releasing excess nutrients. This harms aquatic life, promotes harmful algal blooms, and poisons fish and other marine species with antibiotics and heavy metals, so regulation of these excess nutrients in aquaculture is needed. EU policies for Mediterranean countries should implement the Voluntary Guidelines on the Marking of Fishing Gear to eliminate abandoned, lost or otherwise discarded fishing gear and encourage the recovery of marine litter through compensation. - Implementation of initiatives in coastal cities on waste characterization and monitoring. Examples include the use of smart waste bins that alert waste management teams when they are full; awareness-raising campaigns oriented to beach users; monitoring of debris and litter on the main commercial routes in the Mediterranean or the adaptation of packaging that is not possible to ban with alternative sustainable solutions. - Mediterranean islands. 
Promote sustainable tourism; limit the generation of coastal litter by improving general awareness of the problem; limit the impact of tourism by introducing a visiting fee for litter-free coastal attractions; develop comprehensive waste management plans with the involvement of local communities; and introduce regulations to create smoke-free beaches.

More information: Report: www.europarl.europa.eu/RegData … U(2023)733123_EN.pdf

Provided by Autonomous University of Barcelona
Environmental Science
Sitting calmly in their webs, many spiders wait for prey to come to them. Arachnids along lakes and rivers eat aquatic insects, such as dragonflies. But, when these insects live in mercury-contaminated waterways, they can pass the metal along to the spiders that feed on them. Now, researchers reporting in ACS’ Environmental Science & Technology Letters have demonstrated how some shoreline spiders can move mercury contamination from riverbeds up the food chain to land animals. Most mercury that enters waterways originates from industrial pollution and other human activities, but it can also come from natural sources. Once in the water, microbes transform the element into methylmercury, a more toxic form, which biomagnifies and increases in organisms up the food chain. Scientists increasingly recognize spiders living on lakeshores and riverbanks as a potential link between contamination in waterways and animals that mostly live on land, such as birds, bats and amphibians, which eat the insects. So, Sarah Janssen and colleagues wanted to assess if shoreline spiders’ tissues contain mercury from nearby riverbeds and establish how these animals could connect mercury pollution in water and land animals. The researchers collected long-jawed spiders along two tributaries to Lake Superior, and they sampled sediments, dragonfly larvae and yellow perch fish from these waterways. Next, the team measured and identified the mercury sources, including direct industrial contamination, precipitation and runoff from soil. The team observed that the origin of mercury in the sediments was the same up the aquatic food chain in wetlands, reservoir shorelines and urban shorelines. For instance, when sediment contained a higher proportion of industrial mercury, so did the dragonfly larvae, spider and yellow perch tissues that were collected. 
Based on the data, the researchers say that long-jawed spiders could indicate how mercury pollution moves from aquatic environments to terrestrial wildlife. The implication of these findings is that spiders living next to the water provide clues to the sources of mercury contamination in the environment, informing management decisions and providing a new tool for monitoring of remediation activities, explain the researchers. The team also collected and analyzed tissues from two other types of arachnids from some sites: fishing spiders and orb-weaver spiders. A comparison of the data showed that the mercury sources varied among the three taxa. The team attributes this result to differences in feeding strategies. Fishing spiders hunt near water but primarily on land; orb-weavers eat both aquatic and terrestrial insects; but it’s the long-jawed species that feed most heavily on adult aquatic insects. These results suggest that although long-jawed spiders can help monitor aquatic contaminants, not every species living near the shore is an accurate sentinel, the researchers say. The authors acknowledge funding from the U.S. Geological Survey Environmental Health Program and the U.S. Environmental Protection Agency Great Lakes Restoration Initiative.
Environmental Science
Extreme events in Antarctica such as ocean heatwaves and ice loss will almost certainly become more common and more severe, researchers say. With drastic action now needed to limit global warming to the Paris Agreement target of 1.5°C, the scientists warn that recent extremes in Antarctica may be the tip of the iceberg. The study reviews evidence of extreme events in Antarctica and the Southern Ocean, including weather, sea ice, ocean temperatures, glacier and ice shelf systems, and biodiversity on land and sea. It concludes that Antarctica's fragile environments "may well be subject to considerable stress and damage in future years and decades" -- and calls for urgent policy action to protect it. "Antarctic change has global implications," said lead author Professor Martin Siegert, from the University of Exeter. "Reducing greenhouse gas emissions to net zero is our best hope of preserving Antarctica, and this must matter to every country -- and individual -- on the planet." Professor Siegert said the rapid changes now happening in Antarctica could place many countries in breach of an international treaty. "Signatories to the Antarctic Treaty (including the UK, USA, India and China) pledge to preserve the environment of this remote and fragile place," he said. "Nations must understand that by continuing to explore, extract and burn fossil fuels anywhere in the world, the environment of Antarctica will become ever more affected in ways inconsistent with their pledge." The researchers considered the vulnerability of Antarctica to a range of extreme events, to understand the causes and likely future changes -- following a series of recent extremes. For example, the world's largest recorded heatwave (38.5°C above the mean) occurred in East Antarctica in 2022 and, at present, winter sea ice formation is the lowest on record. Extreme events can also affect biodiversity. 
For example, high temperatures have been linked to years with lower krill numbers, leading to breeding failures of krill-reliant predators -- evidenced by many dead fur seal pups on beaches. Co-author Professor Anna Hogg, from the University of Leeds, said: "Our results show that while extreme events are known to impact the globe through heavy rainfall and flooding, heatwaves and wildfires, such as those seen in Europe this summer, they also impact the remote polar regions. "Antarctic glaciers, sea ice and natural ecosystems are all impacted by extreme events. Therefore, it is essential that international treaties and policy are implemented in order to protect these beautiful but delicate regions." Dr Caroline Holmes, a sea ice expert at British Antarctic Survey, said: "Antarctic sea ice has been grabbing headlines in recent weeks, and this paper shows how sea ice records -- first record highs but, since 2017, record lows -- have been tumbling in Antarctica for several years. "On top of that, there are deep interconnections between extreme events in different aspects of the Antarctic physical and biological system, almost all of them vulnerable to human influence in some way." The retreat of Antarctic sea ice will make new areas accessible by ships, and the researchers say careful management will be required to protect vulnerable sites. The European Space Agency and European Commission Copernicus Sentinel satellites are an essential tool for regular monitoring of the whole Antarctic region and Southern Ocean. This data can be used to measure ice speed, sea ice thickness and ice loss at exceptionally fine resolution. The paper, published in the journal Frontiers in Environmental Science, is entitled, Antarctic Extreme Events.
Environmental Science
Fertiliser made from human faeces and urine is safe to use in agriculture and has "huge potential" to replace 25% of current synthetic products in some countries, according to research. The findings come as farmers continue to struggle with rising fertiliser costs due to a combination of climate change and the war in Ukraine. Researchers screened human waste for 310 chemicals - including rubber additives, insect repellents and pharmaceuticals - and found them in only 6.5% of the samples examined, and then only at low concentrations. Scientists said low levels of the painkiller ibuprofen and the mood-stabilising drug carbamazepine were found - but added that someone would have to eat more than 500,000 cabbage heads to accumulate a dose equal to one pill. Author Franziska Hafner, a student at the University of Hohenheim in Stuttgart, said products made from human urine and faeces "are viable and safe nitrogen fertilisers" and "did not show any risk regarding transmission of pathogens or pharmaceuticals". The work by experts in Germany also looked at modern products already being made from human urine, which is turned into ammonium and nitrate. This included Aurin, which was recently approved for use in agriculture in Switzerland, Liechtenstein and Austria, and CROP - combined regenerative organic food production - which is part of ongoing space projects to recycle wastewater for future bases on the moon and Mars.
The lead author of the study, Dr Ariane Krause, a scientist at the Leibniz Institute of Vegetable and Ornamental Crops in Germany, said: "If correctly prepared and quality-controlled, up to 25% of conventional synthetic mineral fertilisers in Germany could be replaced by recycling fertilisers from human urine and faeces. Combined with an agricultural transition involving the reduction of livestock farming and plant cultivation for fodder, even less synthetic fertiliser would be necessary, resulting, for example, in lower consumption of fossil natural gas. Our study results demonstrate that nitrified urine fertilisers such as Aurin and CROP have a huge potential as fertiliser in agriculture." They argue for a greater use of these recycled products in the future. The peer-reviewed research has been published in the journal Frontiers In Environmental Science and comes amid record-breaking food inflation, with many shoppers struggling with supermarket bills. Last week, a study by researchers at Edinburgh University warned soaring farming costs, mostly driven by high fertiliser prices, could leave an extra 100 million people starving around the world. The scientists said the cost of fertiliser, along with climate change, would have the biggest impact on food security and could cause up to one million more people to die from malnutrition. Artificial fertilisers are made from or through using fossil fuels - they contribute to global emissions and can be harmful to their immediate environment.
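The "500,000 cabbage heads per pill" figure above is a back-of-envelope dose-equivalence calculation. A minimal sketch of that style of arithmetic follows; the residue and dose numbers in it are hypothetical placeholders, not the study's measured values:

```python
# Back-of-envelope dose-equivalence arithmetic, in the style of the
# "500,000 cabbage heads per pill" comparison quoted above.
# All input numbers are hypothetical placeholders, NOT the study's data.

def heads_per_pill(residue_ug_per_head: float, pill_dose_mg: float) -> float:
    """Number of cabbage heads whose drug residue adds up to one pill's dose."""
    pill_dose_ug = pill_dose_mg * 1000.0  # milligrams -> micrograms
    return pill_dose_ug / residue_ug_per_head

# Hypothetical: 0.8 µg of drug residue per cabbage head, 400 mg per pill.
print(f"{heads_per_pill(0.8, 400.0):,.0f} cabbage heads per pill")
```

With these placeholder inputs the sketch yields 500,000 heads; the study's actual figure depends on the concentrations it measured.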
Environmental Science
Air, water, soil, food and even blood – microplastics have found their way virtually everywhere on Earth, and now that list includes clouds. Bits of plastic particles were recently discovered above eastern China, with new research showing that these microplastics could influence cloud formation and the weather. A group of scientists from Shandong University in China collected cloud water atop Mount Tai, finding microplastics in 24 out of 28 samples. They include polyethylene terephthalate (otherwise known as PET), polypropylene, polyethylene and polystyrene, all particles commonly found in synthetic fibers, clothing and textiles, as well as packaging and face masks. “This finding provides significant evidence of the presence of abundant MP’s [microplastics] in clouds,” the researchers stated in the paper published today in Environmental Science and Technology Letters. Earlier this year, a study out of Japan showed that microplastics were present at the peak of Mount Fuji and Mount Oyama, suggesting that the particles may have originated from plastics in the ocean and been transported via air masses. The concentration of microplastics in Mount Tai cloud water was up to 70 times that of Japan’s mountains’ cloud water. “Most pollution we tend to think of is in liquid form,” said Fay Couceiro, a professor of environmental pollution at the University of Portsmouth. “We tend to think of that going into the river and the sea. Whereas microplastics, because they are a physical particle, are not following the normal rules. We’re finding microplastics in these pristine environments at the tops of these extremely hard-to-reach mountains.” So, how are they getting there? Other than contamination from people visiting these sites, the particles may be transported through the air. Samples from low-altitude and denser clouds had larger amounts of microplastics in them. 
Aged plastics – in other words, ones that have already been weathered from ultraviolet radiation – were smaller in size and had rougher surfaces. They also contained more lead, mercury and oxygen compared to pristine, untouched plastics. Scientists found that clouds can modify microplastics, possibly resulting in these particles affecting cloud formation. “Cloud formation has a huge implication for not just our local weather patterns, but for our global temperatures,” said Couceiro, who was not involved in the study. Clouds affect the climate in a plethora of ways. They produce precipitation and snow, affecting global rainfall and vegetation. Clouds block sunlight, cooling the surface of the planet and providing shade on the ground. But they can also trap heat and humidity, subsequently warming the air. The study authors state that more research is needed to fully determine the impact of microplastics on the weather, but what remains clear is that more can be done to address this. “There’s only one group of animals on this planet that use plastic, and that’s us human beings,” said Couceiro. “We really need a global response to this, as it’s not going to be solved by a single country, because air doesn’t respect boundaries.”
Environmental Science
The future has arrived: We have mushroom leather hats now. Slowly but surely, leather alternatives made from mycelium—essentially the root structure of a mushroom—instead of animal hides have entered the market. In 2021, MycoWorks partnered with Hermès to make the first object out of a fine mycelium leather alternative, which looks and acts pretty similarly to the real thing. The bag—the French label’s Victoria style—was an attention-grabbing prototype that is slated to be released later this year. (A second harbinger came that year, when another biofabrication company called Bolt Threads partnered with Stella McCartney on two ready-to-wear garments made from a similar material). Today, MycoWorks and milliner Nick Fouquet have unveiled a capsule collection of hats: the first commercially available products made of the flagship material, Reishi. It’s an optimistic milestone for leather alternatives, and the people who want to shop consciously but don’t want to forgo their favorite boots. Reishi is animal-free, plastic-free, and vegetable tanned (meaning fewer harmful chemicals are used in the finishing process). Mycelium is an infinitely renewable resource, and MycoWorks harnesses the structure in order to grow a durable material in whatever size container it’s put in. Reishi is grown in San Francisco in rectangular sheets, but you could place the mycelium in a container that is shaped like, say, a leather shoe upper, and eliminate the waste that cutting that shape from a cow hide creates. MycoWorks was founded in 2013 by artists Philip Ross and Sophia Wang. Ross had used mycelium as a material for his sculptures and furniture since the 1990s. Wang, who was getting a Ph.D. at Berkeley in poetry and starting a dance company, came on board to create a biotech company (quite the pivot).
The proprietary process to make Fine Mycelium was developed with process engineers, biochemists, experts in fermentation and mycology, and material science, but Wang says that they retained the artists’ eye that the company started with. The result is a leather alternative that is on its own biodegradable (although finishings and flourishes that a designer adds may change that), and beautiful. In my personal experience, it looks real—comparable to high-end PVC leather I’ve seen luxury brands use, but without the plastic. The eco-minded customers and designers interested in the textile are certainly curious about the carbon footprint of Reishi vs. traditional leather. CEO Matt Scullin says, “While MycoWorks’ materials show a 95% reduction in carbon emissions as compared to the average cowhide, it is important to note that there are other attributes that matter in the conversation: for example, end-of-life and biodegradability and keeping our material plastic-free. In fact, Reishi is the only alternative leather on the market that is both low-carbon and plastic-free. MycoWorks has not only conducted a third-party LCA [life-cycle assessment] validating the amazing sustainability profile of the material, but it is now being peer-reviewed and will be published in the future.” With all the excitement around the potential of mycelium leather—so close to becoming a reality—MycoWorks is scaling up. They’ve outgrown their space in California that produces 10,000 sheets of Reishi a year, and are opening a full-scale factory in South Carolina with the capacity to produce several million sheets a year. The partnership with Fouquet is the beginning of this process. Fouquet, who has a degree in environmental science and sustainable development and is passionate about alternative textiles, reached out to MycoWorks through their email form on the website, hoping to collaborate.
Nine months later, here we are with three hats: the Boletus, Coprinus, and Morchella (each named after a species of mushroom). The $810 Boletus is a bucket hat made from just Reishi and a lining, and the $875 Coprinus and $1,725 Morchella use Reishi as trimming. “It’s indistinguishable,” Fouquet says of the product. He adds that his seamstresses couldn’t tell that they were using a different material other than leather. For Fouquet, the biggest similarity is how workable the material is. The only difference is the grain of the leather alternative, “but it’s minor,” Fouquet says. The grain can also be engineered differently. Unlike cow leather, Reishi is customizable when it comes to size, shape, softness, flexibility, and density. After all, as Wang points out, “a cow can only grow to a certain shape.” But there’s one other crucial aspect to determining the veracity of leather. Does it have that smell? “The first thing I do with leather is feel it and smell it. Leather, especially tanned leather, has a very specific smell,” Fouquet says. “It’s hard to describe the smell of Reishi. It’s shocking because it’s a very raw, earthy scent.” Maybe that’s the next level of customization.
Environmental Science
Climate change is primarily caused by humans burning fossil fuels, as well as other activities that produce greenhouse gases. But that blame is not evenly distributed amongst the entire human species. A recent study published in the journal PLOS Climate emphasizes that society's elites are disproportionately responsible for the extreme weather events linked to climate change like heatwaves, droughts, floods, tropical storms, hurricanes and rising sea levels. Indeed, as corresponding author Jared Starr described American greenhouse gas pollution, "the top 1% of households are responsible for more emissions (15-17%) than the lower earning half of American households put together (14% of national emissions)." "Our study is the first to link US households to the greenhouse gas (GHG) pollution generated when creating their incomes," Starr, a graduate student at the University of Massachusetts Amherst's Department of Environmental Conservation, told Salon in an email. "I think this offers a fundamentally different perspective on carbon pollution responsibility and new insights into emissions inequality. We found that the highest earning top 10% of households are responsible for about 40% of U.S. GHG." Although the United States only includes roughly five percent of the world's population, it is accountable for more than a quarter of the activity fueling climate change. This is in large part because of America's dominance as the world's foremost economic power — a dominance reflected in its large investor class, which because of its wealth is figuratively steering Earth off of the climate cliff. "For the first time, we also quantify the share of emissions related to investment," Starr explained. "The share of emissions coming from investments increases as we move up the income ladder.
For the top 0.1% households, more than half of their emissions are coming from investment income." Starr used a visual analogy to illustrate his point. "If we picture this on a graph and imagine the bottom 10% households' emissions (1.6 metric tons) are the size of an average home, then the top 1 percent's emissions would be the size of [five] Empire State buildings stacked on top of each other and the top 0.1%'s emissions would be taller than Mount Everest," Starr told Salon. "This scale of emissions inequality was unknown before our study. I think it is a climate justice issue and it poses a fundamental challenge to our political system to respond to this level of emissions disparity." The study has a term for the super-rich who emit so much carbon: "super-emitters," which covers households with emissions greater than 3,000 metric tons of CO2 per year. As the authors write, "For pre-tax income, we estimate about 43,200 U.S. households or 34% of the top 0.1% households are super emitters with the supplier framework." They pointed out that "almost all super emitting households come from the top 0.1% income group" and "had average incomes of over $10.6 million (supplier) and $11.5 [million] (producer)." Overall, "in 2019, fully 40% of total U.S. emissions were associated with income flows to the highest earning 10% of households." Dr. Michael E. Mann, a professor of Earth and Environmental Science at the University of Pennsylvania, wrote a 2021 book called "The New Climate War" that discussed how climate change "inactivists" (fossil fuel businesses, political conservatives and others financially or ideologically inclined to oppose climate change science) spread misinformation about climate change to muddle the public discourse.
The book specifically delves into how these groups falsely claim that all of humanity is collectively responsible for this pollution crisis — therefore making it seem like an individual problem where rich and poor alike are equally culpable — as opposed to a crisis predominantly caused by the wealthy's behavior. "The inequality in energy use is a great argument for progressive climate pricing," Mann, who was not involved in the recent study, told Salon. "For example, Canada has instituted a progressive carbon tax where revenue is preferentially returned to low-income earners and families." Mann added that "the solution to the climate crisis isn't going to be voluntary behavioral change — it's going to have to be systemic change, including climate policies such as a carbon tax, fee and dividend, etc. that disincentivize carbon-intensive lifestyles." He also pointed out that when people talk about lifestyle changes, it's important to stress that only the super-affluent should be making those sacrifices. "If we want a just transition, we need to make sure that the pricing structure is progressive, so low income earners who have had the least role in creating this problem not only avoid financial hardship but potentially benefit financially," Mann wrote. As Starr explained, the new study provides crucial support to the observation Mann and other climate scientists have made about the scientific evidence: It simply does not support the notion that individual choices make much of a difference when it comes to stopping the overheating of the planet. Only the wealthy can make that difference.
"I think for the last couple decades the dominant cultural argument has been that fixing climate change is everyone's responsibility and that if we just make different choices as consumers then we can 'fix this,'" Starr told Salon. "I think we should all obviously try to make less carbon intensive choices as consumers. But as consumers we often have very limited choices, time, knowledge, etc. over the carbon content of the goods and services we buy. This is not a problem that individual consumers can solve alone — it is a systemic problem." Because fossil fuel use is baked into our current economic system, the vast majority of people are disempowered even as a small group of Americans become "exceedingly rich." "While we all may bear some responsibility for climate change, our work shows that very wealthy households should bear the majority of responsibility, since these emissions are occurring to enrich them and they are reaping the most benefits," Starr pointed out. "While we often think in this 'consumer responsibility' way, when we think about why businesses actually produce goods and services it isn't to benefit the consumer, but to create value for shareholders (shareholder primacy)." The study also identified racial inequities in how different groups are responsible for carbon emissions, reporting that Black households, on average, have a carbon footprint of 19 metric tons of carbon dioxide equivalent from both supplier and producer emissions. In comparison, White Hispanic households show slightly higher emissions with 26 metric tons of carbon dioxide equivalent from suppliers and 25 metric tons of carbon dioxide equivalent from producers but the most significant emissions can be observed within White non-Hispanic households.
They have 40 metric tons of carbon dioxide equivalent from suppliers and 36 metric tons of carbon dioxide equivalent from producers. This is not the first study to demonstrate that wealthy individuals are literally destroying the planet. A study from earlier this year in the journal Nature Sustainability demonstrated that "urban elites are able to overconsume water while excluding less-privileged populations from basic access." Similarly, a study from the journal Cleaner Production Letters found that wealthy individuals produce more greenhouse gases than poor individuals, particularly due to their extensive use of private aircraft and yachts, as well as their massive real estate holdings all over the planet. Meanwhile in 2020, a study in the journal Nature Communications detailed how the "affluent citizens of the world are responsible for most environmental impacts and are central to any future prospect of retreating to safer environmental conditions." Starr was unambiguous when describing to Salon the political implications of the latest scientific evidence. "What we have done in this study is revealed to the public and policymakers the scale of emissions disparity," Starr pointed out, adding that the research is open source and available to the public. "Let the political class have that information and then explain to the rest of the public why they think it is ok for a small group of people to enrich themselves while leaving the rest of society and future generations an uninhabitable planet. Let them show the other 99% of us whose interests they are there to represent. And let us hold them to political account for their choice."
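The scale of the disparity described above can be checked with two numbers quoted in the article: the bottom-10% household figure of 1.6 metric tons of CO2-equivalent per year, and the 3,000-ton "super-emitter" threshold. A quick sketch of that ratio:

```python
# Ratio arithmetic using the per-household figures quoted in the article.
bottom_10_tons = 1.6        # metric tons CO2e/yr for a bottom-10% household
super_emitter_tons = 3000   # "super-emitter" threshold, metric tons CO2e/yr

ratio = super_emitter_tons / bottom_10_tons
print(f"A household at the super-emitter threshold matches roughly "
      f"{ratio:,.0f} bottom-decile households")
```

That works out to roughly 1,875 bottom-decile households per threshold super-emitter; the gap to the top 0.1% average, whose emissions the authors liken to a stack taller than Mount Everest, is larger still.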
Environmental Science
Humans are leaving behind a 'frozen signature' of microbes on Mount Everest Almost 5 miles above sea level in the Himalayan mountains, the rocky dip between Mount Everest and its sister peak, Lhotse, lies windswept, free of snow. It is here at the South Col where hundreds of adventurers pitch their final camp each year before attempting to scale the world's tallest peak from the southeastern side. According to new University of Colorado Boulder-led research, they're also leaving behind a frozen legacy of hardy microbes, which can withstand harsh conditions at high elevations and lie dormant in the soil for decades or even centuries. The research not only highlights an invisible impact of tourism on the world's highest mountain, but could also lead to a better understanding of environmental limits to life on Earth, as well as where life may exist on other planets or cold moons. The findings were published last month in Arctic, Antarctic, and Alpine Research. "There is a human signature frozen in the microbiome of Everest, even at that elevation," said Steve Schmidt, senior author on the paper and professor of ecology and evolutionary biology. For decades, scientists were unable to conclusively identify human-associated microbes in samples collected above 26,000 feet. This study marks the first time that next-generation gene sequencing technology has been used to analyze soil from such a high elevation on Mount Everest, enabling researchers to gain new insight into nearly everything, living or dead, in the samples. The researchers weren't surprised to find microorganisms left by humans. Microbes are everywhere, even in the air, and can easily blow around and land some distance away from nearby camps or trails. "If somebody even blew their nose or coughed, that's the kind of thing that might show up," said Schmidt.
What they were impressed by, however, was that certain microbes that have evolved to thrive in warm and wet environments like our noses and mouths were resilient enough to survive in a dormant state in such harsh conditions. Life in the cryosphere This team of CU Boulder researchers—including Schmidt and lead author Nicholas Dragone and Adam Solon, both graduate students in the Department of Ecology and Evolutionary Biology and the Cooperative Institute for Research in Environmental Science (CIRES)—study the cryobiosphere: Earth's cold regions and the limits to life in them. They have sampled soils everywhere from Antarctica and the Andes to the Himalayas and the high Arctic. Usually, human-associated microbes don't show up in these places to the extent they appeared in the recent Everest samples. Schmidt's work over the years connected him with researchers who were headed to Everest's South Col in May of 2019 to set up the planet's highest weather station, established by the National Geographic and Rolex Perpetual Planet Everest Expedition. He asked his colleagues: Would you mind collecting some soil samples while you're already there? So Baker Perry, co-author, professor of geography at Appalachian State University and a National Geographic Explorer, hiked as far away from the South Col camp as possible to scoop up some soil samples to send back to Schmidt. Extremes on Earth, and elsewhere Dragone and Solon then analyzed the soil in several labs at CU Boulder. Using next-generation gene sequencing technology and more traditional culturing techniques, they were able to identify the DNA of almost any living or dead microbes in the soils. They then carried out extensive bioinformatics analyses of the DNA sequences to determine the diversity of organisms, rather than their abundances. Most of the microbial DNA sequences they found were similar to hardy, or "extremophilic" organisms previously detected in other high-elevation sites in the Andes and Antarctica. 
The most abundant organism they found using both old and new methods was a fungus in the genus Naganishia that can withstand extreme levels of cold and UV radiation. But they also found microbial DNA for some organisms heavily associated with humans, including Staphylococcus, one of the most common skin and nose bacteria, and Streptococcus, a dominant genus in the human mouth. At high elevation, microbes are often killed by ultraviolet light, cold temperatures and low water availability. Only the hardiest critters survive. Most—like the microbes carried up great heights by humans—go dormant or die, but there is a chance that organisms like Naganishia may grow briefly when water and a well-placed ray of sunlight provide enough heat to let them momentarily prosper. But even for the toughest of microbes, Mount Everest is a "Hotel California": "You can check out any time you like / But you can never leave." The researchers don't expect this microscopic impact on Everest to significantly affect the broader environment. But this work does carry implications for the potential for life far beyond Earth, if one day humans set foot on Mars or beyond. "We might find life on other planets and cold moons," said Schmidt. "We'll have to be careful to make sure we're not contaminating them with our own." Additional authors on this publication include: Anton Seimon, Department of Geography and Planning, Appalachian State University; and Tracie Seimon, Wildlife Conservation Society, Zoological Health Program, Bronx, New York. More information: Nicholas B. Dragone et al, Genetic analysis of the frozen microbiome at 7900 m a.s.l., on the South Col of Sagarmatha (Mount Everest), Arctic, Antarctic, and Alpine Research (2023). DOI: 10.1080/15230430.2023.2164999 Provided by University of Colorado at Boulder
Environmental Science
The Antarctic is turning from a fridge for the world - keeping temperatures down - to 'a radiator', scientists warned yesterday. Ice on land and sea in the Antarctic would normally reflect the sun's rays back into space, but as water forms ponds and lakes on Antarctica, this absorbs heat. Professor Martin Siegert said the South Pole continent 'is an enormous, vast white surface...it does an enormous job for the planet in terms of a lot of solar radiation bounced back into space'. The same process - where ice has melted exposing the ocean - is already happening in the Arctic, he said. Professor Siegert, a glaciologist at the University of Exeter, said sea ice around the frozen continent is currently at its lowest level since satellites began observing it in 1979, beating the previous minimum record set last year. A winter heatwave in March 2022 saw temperatures soar nearly 40C above the norm in East Antarctica, from around -50C to -10C, and had it happened in summer it would have begun melting the surface of the ice sheets, something scientists say they have never seen before. Professor Siegert said: 'I think the scientific community has been shocked by this season's lack of sea ice, so much lower than has happened in previous years'. Because of Antarctica's harsh environment and remote location, there is less data available to unequivocally link events like these with human-induced climate change, but scientists say they are to be expected on a warming planet. Professor Siegert added: 'I think it's reasonable to assume that with the Antarctic heat event that we've seen, that is the sort of thing that has been expected with global heating because of burning fossil fuels and it has happened. 'It could be, because we've done a lot of scientific evidence, that it was just one of those one-in-1,000 year events, but that's so unlikely, and I think it's perfectly scientifically reasonable to make the assumption that it is linked to our heating planet.'
Together with scientists from across the UK, Chile and South Africa, Professor Siegert has been examining evidence of extreme events in Antarctica and said it is 'virtually certain' that their severity will increase unless greenhouse gas emissions are controlled. Publishing their work in the journal Frontiers in Environmental Science, they identified nearly a dozen ways that human impacts are changing the Antarctic, from melting sea and land ice and collapsing ice shelves to warming oceans and atmosphere, the near-extinction of marine animals, and the introduction of foreign species such as moss and grass. Scientists are particularly concerned about what might happen over the next few years as the warming effects of El Nino take hold. Dr Anna Hogg of the University of Leeds said: 'As somebody who watches this happen on a day-to-day basis, I'm finding it really surprising and staggering to see the changes occur at the scale that they are already.' She said it would take centuries for collapsed ice shelves to recover, if that is even possible. These collapses do not directly add to sea level rise, as the ice is already floating, but they allow ice from the land to pour into the sea much faster via glaciers, which is speeding up the rate of sea level rise. If all the ice in Antarctica were to melt - although scientists do not believe this will happen anytime soon - it would push up the global sea level by 57 metres. Extreme events such as ice shelf collapse or heatwaves combine into cascading, multiplying effects that reach across the world and also threaten native species. The team of scientists is calling for more environmental protection measures to be put in place to help conserve increasingly fragile ecosystems that are becoming more exposed. Melting ice could, for example, give ships better access, bringing more people who must then take greater care not to carry non-native seeds in on their boots.
The UK Foreign Office is looking to give better protection to emperor penguins, a 'climate vulnerable' species, said the department's head of polar regions, Jane Rumble.
Environmental Science
Blue lakes in North America and Europe will likely turn green-brown as global temperatures rise 22 September 2022 The world’s blue lakes, many of which are located at high northern latitudes, are at risk of losing their blue hues as a result of climate change, according to a new paper in Geophysical Research Letters. AGU press contact: Liza Lester, +1 (202) 777-7494, [email protected] (UTC-4 hours) Contact information for the researchers: Xiao Yang, Southern Methodist University, Dallas, [email protected], (UTC-5 hours) Catherine M. O’Reilly, Illinois State University, [email protected], (UTC-5 hours) WASHINGTON — If global warming persists, blue lakes worldwide are at risk of turning green-brown, according to a new study which presents the first global inventory of lake color. Shifts in lake water color can indicate a loss of ecosystem health. While substances such as algae and sediments can affect the color of lakes, the new study finds air temperature, precipitation, lake depth and elevation also play important roles in determining a lake’s most common water color. Blue lakes, which account for less than one-third of the world’s lakes, tend to be deeper and are found in cool, high-latitude regions with high precipitation and winter ice cover. Green-brown lakes, which are 69% of all lakes, are more widespread, and are found in drier regions, continental interiors, and along coastlines, the study finds. The new research was published in Geophysical Research Letters, AGU’s journal publishing high-impact, short-format reports with immediate implications spanning all Earth and space sciences. The researchers used 5.14 million satellite images for 85,360 lakes and reservoirs around the world from 2013 to 2020 to determine their most common water color. “No one has ever studied the color of lakes at a global scale,” said Xiao Yang, remote sensing hydrologist at Southern Methodist University and author of the study.
“There were past studies of maybe 200 lakes across the globe, but the scale we’re attempting here is much, much larger in terms of the number of lakes and also the coverage of small lakes. Even though we’re not studying every single lake on Earth, we’re trying to cover a large and representative sample of the lakes we have.” The new study presents the most extensive map of lake color, revealing that most of the world’s lakes are already green-brown rather than blue. A lake’s color can change seasonally, in part, due to changes in algal growth, so the authors characterized lake color by assessing the most frequent lake color over seven years. The results can be explored through an interactive map the authors developed. Additionally, the new study explored how different degrees of warming could affect water color if climate change persists. The study finds climate change may decrease the percentage of blue lakes, many of which are found in the Rocky Mountains, northeastern Canada, northern Europe and New Zealand. “Warmer water, which produces more algal blooms, will tend to shift lakes towards green colors,” said Catherine O’Reilly, an aquatic ecologist at Illinois State University and author of the new study. “There are lots of examples of where people have actually seen this happen when they studied one individual lake.” For example, the North American Great Lakes are experiencing increased algal blooms and are also among the fastest warming lakes, O’Reilly said. Previous research has also shown remote Arctic regions have lakes with “intensifying greenness,” said Yang. While prior studies have used more complex and finer scale metrics to understand overall lake ecosystem health, water color is a simple yet viable metric for water quality that can be viewed from satellites at the global scale, the authors said. This approach provides a way to study how remote lakes are changing with climate.
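The "most frequent color" measurement at the heart of the study can be sketched in a few lines. This is an illustration of the modal-color idea only, with invented observations; the actual study derived color categories from reflectance across 5.14 million satellite scenes:

```python
from collections import Counter

def dominant_color(observations):
    """Most frequent color label among a lake's per-scene observations."""
    return Counter(observations).most_common(1)[0][0]

# A hypothetical lake that greens up in summer but reads blue most of the year:
scenes = ["blue"] * 8 + ["green"] * 3 + ["brown"]
print(dominant_color(scenes))
```

A seasonal algal bloom does not change the characteristic color until green scenes outnumber blue ones, which is why the authors assessed the most frequent color over seven years rather than any single season.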
“If you’re using lakes for fisheries or sustenance or drinking water, changes in water quality that are likely happening when lakes become greener are probably going to mean it’s going to be more expensive to treat that water,” said O’Reilly. “There might be periods where the water isn’t usable, and fish species might no longer be present, so we’re not going to get the same ecosystem services essentially from those lakes when they shift from being blue to being green.” Additionally, changes to water color may have recreational and cultural implications in locations such as Sweden and Finland where lakes are culturally prevalent, O’Reilly said. As warming continues, lakes in northern Europe will likely lose their winter ice cover, which could affect winter and cultural activities. “Nobody wants to go swim in a green lake,” said O’Reilly, “so aesthetically, some of the lakes that we might have always thought of as a refuge or spiritual places, those places might be disappearing as the color changes.” Notes for Journalists: This research article will be available for free until 10/15. Neither the paper nor this press release is under embargo.
Paper title: “The Color of Earth’s Lakes” Authors: Xiao Yang, (corresponding author) Department of Earth, Marine and Environmental Sciences, University of North Carolina at Chapel Hill, Chapel Hill, NC-27514, USA, Current affiliation: Department of Earth Sciences, Southern Methodist University, Dallas, TX, USA Catherine M. O’Reilly, Department of Geography, Geology, and the Environment, Illinois State University, Normal, IL, USA John R. Gardner, Department of Geology and Environmental Science, University of Pittsburgh, Pittsburgh, PA, USA Matthew R.V. Ross, Department of Ecosystem Science and Sustainability, Colorado State University, Fort Collins, CO, USA Simon N. Topp, Department of Earth, Marine and Environmental Sciences, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA Jida Wang, Department of Geography and Geospatial Sciences, Kansas State University, Manhattan, KS, USA Tamlin M. Pavelsky, Department of Earth, Marine and Environmental Sciences, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA
Environmental Science
Alumni Spotlight: Centering Humans in Asia's Energy Transition Srishti Mahajan is an energy specialist at the Asian Development Bank in New Delhi, where she leads clean energy projects that help countries in the Asia Pacific region transition to a low-carbon economy and mobilizes finance to support these initiatives. Mahajan is also a 2022 graduate of Columbia University’s MPA in Environmental Science and Policy (MPA-ESP) at the School of International and Public Affairs. Following her graduation, Mahajan worked at Sustainable Energy for All (SEforALL), where she provided technical and analytical support on subjects such as clean energy access and net-zero energy transition, among other responsibilities. For Mahajan, the MPA-ESP program helped broaden her perspective on the human impact of her chosen career. “It redefined my pursuit of sustainability, anchoring it within a tapestry woven from diverse threads—technology, policy and above all, the intricate human dynamics that underlie every endeavor,” she says. This interview has been edited for clarity and length. What interests you about the sustainability field? My interest in sustainability stems from my nomadic upbringing in India. I moved across seven provinces in 16 years, experiencing a wide variety of cultures. I vividly remember adapting to rice-centric diets in states where paddy fields painted the scenery, or donning silk garments in regions famed for sericulture to ward off the cold. Those moves significantly influenced the path I chose during my undergraduate years. My exposure to diverse cultures, climates and ways of life steered me toward environmental engineering. In academia, I explored the intricacies of air, water and soil, viewing these subjects not only as engineering challenges but also as critical components of policy and sustainability landscapes. Can you give us a quick overview of your previous work experience and educational background?
My career is grounded in my dual expertise as an environmental engineer and a renewable energy engineer. I began my journey as an analyst, crunching data and numbers to demonstrate the feasibility of implementing decentralized renewable energy systems in India. I then transitioned to a role with a grant-making organization where I oversaw projects focused on how renewable energy could positively impact the health, education and livelihoods of rural communities. Following my studies at Columbia, I ventured into the international arena of climate mitigation with SEforALL. Presently, I am with the Asian Development Bank, where I work to facilitate the adoption of emerging and innovative technologies to expedite energy transition in the Asia-Pacific (APAC) region. Why did you decide to pursue your MPA-ESP at Columbia? My initial tech-focused perspective on sustainability, shaped by my engineering background, underwent a significant transformation. In the past, I firmly believed that introducing a technology such as solar in peri-urban and rural areas would solve both electricity shortages and other challenges of inequality and poverty. I saw electricity as the key to unlocking opportunities and boosting income levels. And it is. But beyond technology and economics, I discovered a complex human element that defies easy categorization. This realization dawned on me during a visit to a community that was benefitting from a solar installation, a project I had overseen. The MPA-ESP program served as a gateway to a panorama of perspectives, enriching my understanding of sustainability beyond the confines of technology. What were the most valuable skills you took from the program, and how have they translated to your professional life? One of the most striking aspects of my current role at the Asian Development Bank is the ability to forge connections on a global scale—both in understanding challenges and devising comprehensive solutions. 
The exposure I gained opened my eyes to the strategies pursued by the U.S. and also to the initiatives undertaken in Europe and Africa. This global perspective isn’t just valuable; it provides insights that enrich the way I design and implement projects. This perspective adds immense value, offering a unique vantage point through which I proudly contribute to our team’s endeavors, particularly as we engage in projects here in India. Where would you like to see your career take you? My focus is on my current position. The vast landscape of the Asia Pacific region beckons, brimming with untapped potential for driving energy transition. This region holds the key to substantial advancements, and it is precisely where the most impactful work lies. I am committed to positively impacting the energy transition within the region. What advice would you give to current students? As you embark on your academic journey, remember to savor the experience. Take it one step at a time. While the notion of networking is emphasized, I urge you to focus on something more profound—cultivating genuine relationships. Seek connections that endure beyond the surface, ones that are built on shared experiences and mutual understanding. Engage with individuals not solely to expand your professional circle, but with the intention of forging meaningful bonds. Have an authentic desire to connect, learn and grow alongside your peers. And lastly, be kind—it doesn’t cost a thing.
Environmental Science
We should put the environment on the balance sheet Decoupling, carbon taxes, and unrealistic environmentalists In a world where businesses focus on quarterly earnings and daily stock price trends, the market is largely blind to the fact that it’s driving us toward a climate cliff. The damage caused by CO2 emissions is not priced into our use of fossil fuel products. Market failures like this arise from what economists call externalities, and we need to use tried-and-true fiscal policies to get the environment on the balance sheet. When economists find externalities like this in the market, they tend to use “carrots” and “sticks” (incentives and penalties) to steer the market in a healthier direction. A good example of this sort of “steering” is how we tax tobacco products. We know that cigarettes cause cancer and increase healthcare costs for the public, so we tax tobacco to (1) reduce the number of people who use tobacco, and (2) raise money for healthcare costs. We should be treating the fossil fuel industry the same way. The Inflation Reduction Act (IRA) is an example of fiscal carrots in action. The IRA is helping us finance more clean energy and transportation projects, especially at the state level. While many states have put a price on carbon, America is lacking a federal climate-friendly fiscal stick. We have a gas tax, but not a federal tax on industry-related CO2 pollution. There are many policies we need to be using to reduce climate-changing emissions, and pricing carbon is one of them. Pricing carbon at the federal level will help us decouple GDP growth and CO2 emissions, and grow the economy responsibly. GDP and CO2 emissions The carbon intensity of our GDP is high, but that doesn’t mean all economic growth is inherently bad. Calls for “degrowth” are growing in the EU, because advocates think that slowing GDP to a halt will reduce emissions. Degrowth advocates suggest that there is no possible way to decouple GDP and emissions.
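Decoupling has a concrete measure: carbon intensity, emissions per unit of GDP. A toy series (the numbers below are invented, purely to illustrate the ratio) shows how GDP can grow while emissions and intensity both fall:

```python
# Invented figures: GDP in trillions of dollars, emissions in gigatons of CO2.
years = [2000, 2010, 2020]
gdp = [10.0, 13.0, 16.0]
emissions = [6.0, 5.7, 5.0]

# Carbon intensity = emissions / GDP; decoupling means this ratio falls
# even as the economy grows.
intensity = [e / g for e, g in zip(emissions, gdp)]
for year, i in zip(years, intensity):
    print(f"{year}: {i:.3f} Gt CO2 per $T of GDP")
```

In this toy series GDP rises 60% while emissions fall outright, i.e., absolute decoupling rather than merely slowed growth.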
Maybe this is my MBA talking, but I don’t think we need to give up GDP growth and increased prosperity in our fight against climate change. We have shown that decoupling greenhouse gases from economic growth is possible and is already happening. Noah Smith has a recent post about this that I recommend reading: Degrowth: we can’t let it happen here! Not only is degrowth a bad idea, we don’t even need to settle for slowed growth. Republicans and Democrats both think carbon pricing would slow economic growth, but it’s not true. In fact, that’s not true with other types of taxation either. Carbon pricing is a tool that directs our economy to grow in a way that acknowledges the importance of preserving our environment. How do you put a price on carbon? The concept of putting the environment on the balance sheet brings to mind the old adage, “Measure what matters.” If we care about the environment so much, why don’t we have a uniform way to measure our ability to preserve it? Well, it may not be uniform, but we do have the social cost of carbon (SCC). Professor William Nordhaus won the Nobel in economics for his work on the SCC, which has helped states construct fiscal policies that put our climate on the balance sheet. Nordhaus developed the Dynamic Integrated Climate-Economy (DICE) model, a complex statistical model that estimates, in dollars, the economic damages that come from adding an additional ton of CO2 to the atmosphere. The DICE model’s cost-benefit analysis quantifies the negative and positive long-term effects of adding more CO2 into the atmosphere. Yes, there are some positive economic outcomes that come with warming temperatures, like higher crop yields in some regions, but it’s mostly bad. Really bad, actually. Inputs to the DICE model are often tinkered with, and Republicans and Democrats tend to disagree over the inputs. One of the main assumptions in the DICE model is the discount rate.
A higher discount rate means we expect higher economic growth in the future and it will be easier for us to pay for the damages caused by climate change. So, a high discount rate = a lower social cost of carbon. On the other hand, assuming a low discount rate means we expect the economy to grow slowly and it will be more expensive for us to pay climate damages in the future. Therefore, a lower discount rate = a higher social cost of carbon. Unsurprisingly, the Trump-era EPA adopted a high discount rate in an attempt to dismiss the negative effects of CO2 as part of the administration's deregulation agenda (and it’s possible that Trump thought his economy would grow faster than any economy ever). The Trump-era EPA’s take on the SCC was $3-5 per ton of CO2. So far, the Biden-era EPA has adopted a moderate discount rate and the official SCC estimate is $51 per ton, although the EPA proposed a much higher $190 in November of 2022. I’m no expert on the SCC, but $3-5 per ton is laughable - especially when you think about the uncertainties of everlasting economic growth and all of the negative effects that the DICE model excludes. As the model is further developed by interdisciplinary scientists, I expect the known damages from greenhouse gases to increase, and the SCC to increase. Now, just because the EPA recognizes that there is a social cost to emitting CO2 doesn’t mean Congress has to do anything about it. In fact, the feds have done nothing, and only 14 US states have written some sort of SCC use into law, as shown in the chart of state climate policies in my most recent post. Several states, mostly in the liberal Northeast and on the West Coast, have enacted carbon-trading schemes. Conservative states have not adopted a carbon tax, or any climate policies for that matter, but what is surprising is the pushback against carbon pricing from some environmentalists - largely environmental justice (EJ) advocates.
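The discount-rate mechanics can be shown with a back-of-the-envelope present-value calculation. This is not the DICE model (which couples full climate and economic modules); it is just the discounting step, with a made-up damage stream:

```python
# Present value of a constant stream of future climate damages, discounted
# at a given annual rate. A higher rate makes future damages count for less,
# which is exactly why it produces a lower social cost of carbon.
def present_value(annual_damage, years, rate):
    return sum(annual_damage / (1 + rate) ** t for t in range(1, years + 1))

# Suppose one extra ton of CO2 causes $1 of damage per year for a century:
for rate in (0.02, 0.05, 0.07):
    pv = present_value(1.0, 100, rate)
    print(f"discount rate {rate:.0%}: present value of damages = ${pv:.2f}")
```

The same physical damages are worth roughly three times as much at a 2% rate as at 7%, which is why the choice of discount rate can swing the SCC from a few dollars per ton to hundreds.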
Environmental Justice Concerns

A paper by Boyce et al from February of this year summarizes EJ advocates’ concerns succinctly:

In brief, critics have argued that carbon pricing (i) fails to reduce carbon emissions significantly, (ii) fails to reduce the disproportionate impacts of hazardous co-pollutants on people of color and low-income communities, (iii) harms the purchasing power of low-income households, and (iv) commodifies nature.[3, 4] Proponents of carbon pricing often, and in our view hastily, have dismissed these criticisms as baseless.

Boyce and the boys are right - these claims are baseless. Here are my quick rebuttals to the four claims (i, ii, iii, and iv):

(i) fails to reduce carbon emissions significantly

i. This is like saying seat belts are bad because they don’t save more lives. Nobody is saying carbon pricing is the only policy that needs to be enacted, but the fact that it will reduce emissions at all is a great start. An omnibus policy approach with many mechanisms - a carbon tax, mass transit investments, a low carbon fuel standard, and so on - would be the most beneficial policy approach to climate change, and carbon pricing is a key piece of that.

(ii) fails to reduce the disproportionate impacts of hazardous co-pollutants on people of color and low-income communities

ii. Again, nobody is saying carbon pricing is the only policy that needs to be enacted. There are many other policies that can help us deal with particulate matter in EJ communities. Furthermore, revenue generated from a carbon tax or a low carbon fuel standard could and should go towards electrifying EJ communities - thus reducing the disproportionate impacts of hazardous co-pollutants on people of color and low-income communities.

(iii) harms the purchasing power of low-income households

iii. It is true that gasoline and airplane tickets would likely become more expensive, but that’s the point.
We want people to start choosing greener alternatives to fossil fuels - many of which are cheaper, by the way. To ease the disproportionate burden for low-income households, we need to be subsidizing electric vehicles, building more public transit, and incentivizing clean fuels through a low-carbon fuel standard. Also, some or all revenue generated from a carbon tax could and should be redistributed to low-income families via a dividend. I’m a little wary of a dividend tied to carbon emissions because, eventually, we hope to have a zero-emission economy and we don’t want emissions to be perceived as a payday. Ultimately, a dividend may be the best way to gain initial political support for a carbon tax.

(iv) commodifies nature

iv. Boyce and the boys squash this notion perfectly:

There is a fundamental difference, however, between valuing nature and turning it into a commodity. When we fail to put a price on carbon and allow emissions free-of-charge, we effectively value the resulting climate impacts on present and future generations at zero. This is not treating Nature as sacred; it is treating it as worthless.

Yes, in a perfect world, companies wouldn’t have the ability to emit CO2 at all, and maybe someday we can get there, but it is not economically viable unless you want to lose your job and live in a cave. We need to take the baby steps where we can get them. Why the opponents are wrong, in a nutshell: Putting a price on carbon doesn’t have to be a fix-all for it to be worthwhile; a carbon tax or cap-and-trade system can be structured in a way that benefits EJ communities; and suggesting that a carbon price “commodifies nature” is a distraction from the fact that it’s a step toward more ideal environmentalism. Nordhaus gave us the ability to put a price on, arguably, the most important externality in our economy, but there are still blind spots in the model.
Even outside the DICE model, there are largely untouched areas of environmental fiscal policy development that need some attention. For example: what is the economic value of preserving biodiversity? or culturally important landscapes? I’m looking to academia to help us with these problems that are part economic, part environmental science, and part philosophical in nature. In the meantime, the climate is changing and we haven’t acted in enough meaningful ways. Luckily, we already have most of the technology and policy ideas we need to slow and reverse climate change. We need to stop climate change and embrace strategic growth by putting the environment on the balance sheet.
Environmental Science
Communities of color disproportionately exposed to PFAS pollution in drinking water People who live in communities with higher proportions of Black and Hispanic/Latino residents are more likely to be exposed to harmful levels of per- and polyfluoroalkyl substances (PFAS) in their water supplies than people living in other communities, according to a new study led by researchers from Harvard T.H. Chan School of Public Health. The researchers link this finding to the disproportionate siting of sources of PFAS pollution—such as major manufacturers, airports, military bases, wastewater treatment plants, and landfills—near watersheds serving these communities. The study will be published online May 15, 2023, in Environmental Science & Technology. In March, the EPA proposed the first-ever national drinking water regulation for six PFAS, which it anticipates finalizing by the end of 2023. The regulation would establish maximum contaminant levels of two PFAS compounds, PFOA and PFOS, at 4 parts per trillion (4 ng/L) and limit the other four. The public comment period ends on May 30. "Our work suggests that the sociodemographic groups that are often stressed by other factors, including marginalization, racism, and poverty, are also more highly exposed to PFAS in drinking water," said first author Jahred Liddie, a Ph.D. student in population health sciences at Harvard Chan School. "Environmental justice is a major emphasis of the current administration and this work shows it should be considered in the upcoming regulations for PFAS in drinking water." This is the first peer-reviewed study to show sociodemographic disparities in drinking water PFAS exposures and to statistically link sources such as landfills and airports to PFAS concentrations in community water systems over broad geographic scales. 
PFAS—dubbed "forever chemicals" because of their extreme persistence in the environment due to their characteristic fluorine-carbon backbone—are artificial compounds widely used for their stain-resistant and water-resistant properties. PFAS exposure has been associated with numerous adverse health outcomes, including diabetes, cardiovascular disease, and cancer. The researchers used PFAS monitoring data from 7,873 U.S. community water systems in the 18 states in which such data is widespread: California, Colorado, Illinois, Indiana, Kentucky, Maine, Maryland, Massachusetts, Michigan, New Hampshire, New Jersey, New York, Ohio, Pennsylvania, South Carolina, Utah, Vermont, and Wisconsin. Their analysis included 44,111 samples collected between January 2016 and August 2022. They also looked at the geographic locations of PFAS sources from multiple databases. The study found that PFAS detection was positively associated with the number of PFAS sources and proportions of people of color who are served by a water system. Each additional industrial facility, military fire training area, and airport in a community water system's watershed was associated with a 10%−108% increase in perfluorooctanoic acid and a 20%−34% increase in perfluorooctane sulfonic acid in drinking water. According to the researchers, about 25% of the population in the 18 states considered in their study were served by community water systems that had levels of PFAS above 5 ng/L. Per this estimate, if the EPA's new proposed level of 4 ng/L is implemented, more than 25% of all Americans are likely to be considered exposed to dangerous levels of PFAS. 
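The per-source percentage increases quoted above are the kind of figures that typically come from regressions on log-transformed concentrations. The paper's exact model isn't reproduced here, so treat this as an illustration of the conversion only: a coefficient β on the log scale corresponds to a percent change of (e^β − 1) × 100 per additional source, and the coefficients below are hypothetical values chosen to span the reported range.

```python
import math

def pct_change_per_source(beta: float) -> float:
    """Convert a log-scale regression coefficient into the percent change
    in concentration associated with one additional PFAS source."""
    return (math.exp(beta) - 1.0) * 100.0

# Hypothetical coefficients chosen to span the 10%-108% range in the article:
for beta in (0.095, 0.30, 0.73):
    print(f"beta = {beta:.3f} -> {pct_change_per_source(beta):6.1f}% per additional source")
```

As a quick sanity check on the formula, β = 0 maps to a 0% change and β = ln 2 maps to a 100% (doubling) change.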
"Our findings are particularly concerning because past work on environmental disparities for other pollutants shows marginalized populations are susceptible to greater risks of adverse health outcomes compared to other populations, even at the same exposure levels," said senior author Elsie Sunderland, Fred Kavli Professor of Environmental Chemistry and professor of earth and planetary sciences at the Harvard John A. Paulson School of Engineering and Applied Sciences and professor of environmental science and engineering in the Department of Environmental Health at Harvard Chan School. "Regulating releases from PFAS sources and ensuring that people have safe drinking water is especially important in the most vulnerable communities to protect public health." Laurel Schaider at Silent Spring Institute was a co-author. More information: Sociodemographic Factors Are Associated with the Abundance of PFAS Sources and Detection in U.S. Community Water Systems, Environmental Science & Technology (2023). DOI: 10.1021/acs.est.2c07255 Journal information: Environmental Science & Technology Provided by Harvard T.H. Chan School of Public Health
Environmental Science
AI could run a million microbial experiments per year, says study An artificial intelligence system enables robots to conduct autonomous scientific experiments—as many as 10,000 per day—potentially driving a drastic leap forward in the pace of discovery in areas from medicine to agriculture to environmental science. Reported today in Nature Microbiology, the research was led by a professor now at the University of Michigan. That artificial intelligence platform, dubbed BacterAI, mapped the metabolism of two microbes associated with oral health—with no baseline information to start with. Bacteria consume some combination of the 20 amino acids needed to support life, but each species requires specific nutrients to grow. The U-M team wanted to know which amino acids the beneficial microbes in our mouths need, so that researchers can promote their growth. "We know almost nothing about most of the bacteria that influence our health. Understanding how bacteria grow is the first step toward reengineering our microbiome," said Paul Jensen, U-M assistant professor of biomedical engineering who was at the University of Illinois when the project started. Figuring out the combination of amino acids that bacteria like is tricky, however. Those 20 amino acids yield more than a million possible combinations, just based on whether each amino acid is present or not. Yet BacterAI was able to discover the amino acid requirements for the growth of both Streptococcus gordonii and Streptococcus sanguinis. To find the right formula for each species, BacterAI tested hundreds of combinations of amino acids per day, honing its focus and changing combinations each morning based on the previous day's results. Within nine days, it was producing accurate predictions 90% of the time. Unlike conventional approaches that feed labeled data sets into a machine-learning model, BacterAI creates its own data set through a series of experiments.
By analyzing the results of previous trials, it comes up with predictions of what new experiments might give it the most information. As a result, it figured out most of the rules for feeding bacteria with fewer than 4,000 experiments. "When a child learns to walk, they don't just watch adults walk and then say 'Okay, I got it,' stand up, and start walking. They fumble around and do some trial and error first," Jensen said. "We wanted our AI agent to take steps and fall down, to come up with its own ideas and make mistakes. Every day, it gets a little better, a little smarter." Little to no research has been conducted on roughly 90% of bacteria, and the amount of time and resources needed to learn even basic scientific information about them using conventional methods is daunting. Automated experimentation can drastically speed up these discoveries. The team ran up to 10,000 experiments in a single day. But the applications go beyond microbiology. Researchers in any field can set up questions as puzzles for AI to solve through this kind of trial and error. "With the recent explosion of mainstream AI over the last several months, many people are uncertain about what it will bring in the future, both positive and negative," said Adam Dama, a former engineer in the Jensen Lab and lead author of the study. "But to me, it's very clear that focused applications of AI like our project will accelerate everyday research." More information: Adam C. Dama et al, BacterAI maps microbial metabolism without prior knowledge, Nature Microbiology (2023). DOI: 10.1038/s41564-023-01376-0 Journal information: Nature Microbiology Provided by University of Michigan
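BacterAI itself is not reproduced here, but the trial-and-error loop the article describes can be sketched in miniature. In this toy version (the hidden nutrient rule and all parameters are invented for illustration), a simulated species grows only when a hidden subset of the 20 amino acids is present, and the agent alternates random exploration with "drop one ingredient" probes of media that already grew:

```python
import random

AMINO_ACIDS = 20
REQUIRED = {2, 5, 11, 17}  # hidden ground truth, invented for this toy example

def grows(medium: frozenset) -> bool:
    """Simulated wet-lab experiment: growth iff every required acid is present."""
    return REQUIRED <= medium

def run_campaign(batch_size=100, days=9, seed=0):
    rng = random.Random(seed)
    results = {}  # medium (frozenset of acid indices) -> grew?
    for _ in range(days):
        batch = set()
        while len(batch) < batch_size:
            growing = [m for m, g in results.items() if g]
            if growing and rng.random() < 0.7:
                # Exploit: drop one acid from a medium known to grow, to test
                # whether that acid is actually required.
                medium = set(rng.choice(growing))
                medium.discard(rng.choice(sorted(medium)))
            else:
                # Explore: a random medium, each acid present with probability 0.5.
                medium = {a for a in range(AMINO_ACIDS) if rng.random() < 0.5}
            batch.add(frozenset(medium))
        for medium in batch:  # run the day's batch of experiments
            results[medium] = grows(medium)
    # Acids present in every growing medium are the inferred requirements.
    growing = [set(m) for m, g in results.items() if g]
    inferred = set.intersection(*growing) if growing else set()
    return inferred, len(results)

inferred, n_experiments = run_campaign()
print(sorted(inferred), n_experiments)
```

The real system learns far subtler rules than a fixed required subset, but the structure is the same: each day's batch is chosen using everything learned so far, so the agent builds its own data set rather than being handed one.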
Environmental Science
Rowan researcher documents 50-year trend in hurricane escalation due to climate change For decades, climate scientists have warned that greenhouse gas emissions are causing worsening and more frequent severe weather patterns, with many studies focusing on trends since the start of the Industrial Revolution. New research by Rowan University climate scientist Dr. Andra Garner indicates that there have been great changes to Atlantic hurricanes in just the past 50 years, with storms developing and strengthening faster. An assistant professor of environmental science in Rowan’s School of Earth & Environment, Garner documented in the journal Nature Scientific Reports (“Observed Increases in North Atlantic Tropical Cyclone Peak Intensification Rates”) that from 1971 through 2020, intensification rates of Atlantic hurricanes changed as human-caused greenhouse gas emissions warmed the planet and its oceans. Atlantic hurricanes now develop faster, jumping from a weak Category 1 to a major Category 3 or stronger within a 24-hour period, and they are more likely to strengthen quickly along the east coast of the U.S., Garner found. She also concluded that better communication methods are needed to warn at-risk communities, as it’s difficult to predict when hurricanes will strengthen fastest. Garner showed that from 2001 through 2020, considered for the study’s purposes the “modern era,” hurricane intensification rates were up to 28.7 percent greater than they were from 1971 through 1990, a period she identified as the “historical era.” “In the modern era, it is about as likely for hurricanes to intensify by at least 57 mph in 24 hours, and more likely for hurricanes to intensify by at least 23 mph within 24 hours than it was for storms to intensify by these amounts in 36 hours in the historical era,” she said.
“The number of times that hurricanes strengthen from a Category 1 storm (or weaker) into a major hurricane (Category 3 or greater) within 36 hours has also more than doubled in the modern era relative to the historical era.” Garner said ever-warming ocean waters, such as the record-high temperatures reported this summer off the coast of Florida, are especially troubling, because tropical storms feed off energy in ocean water and the warmer the water, the greater the amount of energy such storms can draw. For example, she said, September’s Hurricane Lee, a massive Category 5 that was the third-fastest intensifying storm in recorded history, virtually exploded because of the unnaturally warm Atlantic waters. “More than 90 percent of the warming we’re seeing from human-caused greenhouse gases goes into our oceans,” Garner said. Based on data from the National Hurricane Center, Garner’s research shows that rapid escalation is becoming more common. “The increase in the number of times hurricanes turned from Category 1 or weaker to a major storm (Category 3 or greater) is particularly concerning, since major hurricanes often produce the most damage in our coastal communities,” she said. Ultimately, she said, evidence that hurricanes intensify more quickly as the planet warms should serve as a warning. “One of the messages from this work is that there is an urgency,” Garner said. “If we don’t make some pretty big changes and rapidly move away from fossil fuels, this is something we can expect to see worsen in the future.”
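Intensification rates like the ones Garner reports are computed from best-track records of maximum sustained wind, sampled every six hours. A minimal sketch (the track values below are invented; for reference, the National Hurricane Center's working definition of rapid intensification is an increase of at least 30 kt in 24 hours):

```python
def max_intensification(winds_kt, hours_per_step=6, window_h=24):
    """Largest increase in maximum sustained wind (knots) over any
    `window_h`-hour window, given winds sampled every `hours_per_step` hours."""
    steps = window_h // hours_per_step
    return max(winds_kt[i + steps] - winds_kt[i]
               for i in range(len(winds_kt) - steps))

# Hypothetical 6-hourly track: Category 1 (65 kt) up to Category 4 (135 kt).
track = [65, 70, 80, 95, 115, 130, 135]
print(max_intensification(track))  # -> 60 (knots gained in the fastest 24 h)
```

Comparing the distribution of these per-storm maxima between two periods, as the study does for its "historical" and "modern" eras, is then a straightforward aggregation over each era's storms.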
Environmental Science
In satellite pictures, they look like the pale blue and gray eggs of a giant butterfly, laid in tight patterns on some dismal leaf. The eggs, made of steel, are tanks brimming with radioactive fluid—contaminated water from Japan’s Fukushima nuclear plant. The water will soon be diluted and pumped into the sea. Núria Casacuberta Arola, of ETH Zürich, is among those who will be watching. Closely. “We have access to a ship that goes to the coast of Fukushima every year, sometimes once, sometimes twice,” she says. Casacuberta Arola and her colleagues regularly drop an assembly of jars into waters near the incapacitated power plant to collect samples at different depths. The lids of the jars close automatically, one by one, as the device is slowly pulled back up to the surface. By doing this, and also taking sediment samples from the seabed, they hope to be able to tell in the coming months and years whether the disposal of water from Fukushima is causing a noticeable rise in radiation in this corner of the Pacific Ocean. The water release could start as early as next month. If there is a significant bump in radiation levels in the surrounding waters, it will mean things have gone very wrong. In 2011, a massive tsunami struck Fukushima Daiichi Nuclear Power Station. The defensive sea wall intended to protect the plant from such an onslaught was many meters too low to stop the monster wave. Seawater flooded the facility, ultimately leading to partial meltdowns and huge explosions at some of the reactors. It is considered one of the worst nuclear accidents in history. In the years since, workers have had to constantly pump water into Fukushima’s stricken reactors, which still contain hot nuclear fuel. This water has, thankfully, done its job of keeping the reactors cool, but it has become irradiated in the process, meaning it can’t just be flushed away. Workers have kept the used cooling water on-site, building tank after tank in which to store it.
All the while, they have known that they will eventually have to dispose of it. Today, there are 1.3 million metric tons of contaminated water on-site. And no space for any more tanks. The time to do something about it is here. It has taken years of research, modeling, and sampling, but earlier this month the International Atomic Energy Agency gave its approval for a discharge plan. Japan’s Nuclear Regulation Authority signed off on the proposals at the same time, meaning that the Tokyo Electric Power Co (Tepco), which is in charge of the plant and its cleanup, has full authority to begin slowly releasing the water into the ocean via a 1-km-long underwater pipe. Some aren’t happy. Local fishers are strongly opposed to the plan, and there have been street protests in South Korea. Yet many scientists are highly confident that the discharge will be perfectly safe. The contaminated water, enough to fill more than 30,000 fuel-truck semi-trailers, contains a mix of unstable chemical elements, known as radionuclides, that emit radiation. To keep these radioactive components to a minimum, Tepco has installed special water purification technology that treats the water before storage. In essence, it involves passing the contaminated water through a series of chambers containing materials that can adsorb radionuclides. The isotopes stick to those materials and the water flows on, a little cleaner than before. However, it is not 100 percent effective, and many of the radionuclides it’s designed to extract, such as the isotopes caesium-137 and strontium-90, can still be found in the stored water.
There are also some isotopes the system can’t remove at all, such as carbon-14 and tritium, a form of hydrogen with two neutrons and one proton in its nucleus (hydrogen usually contains just one proton). Despite this, the water is extremely safe because the concentrations of radionuclides are so low, explains Jim Smith, a professor of environmental science at the University of Portsmouth. “I’m not concerned,” he says of the plan to discharge the water. Many of the above radioactive isotopes were released into the ocean at the time of the disaster in 2011—and some traveled. One study found them floating around 3,000 km away in the Arctic Ocean six years after the accident. Once the discharge begins, radionuclides will undoubtedly spread out into the Pacific, but this is very unlikely to have a noticeable effect on the environment, Smith says. For context, he points out that he has many years of experience studying the effects of radiation on living things near the destroyed nuclear power plant in Chernobyl. Even there, where exposure to radiation is much greater, the impact appears to be tiny. “We know radiation damages DNA, probably there are subtle effects of radiation at these levels, but we don’t generally see a significant effect on the ecosystem,” he says, referring to that work. Plus, tritium—one of the isotopes that can’t be removed from the stored water—is already present all around us at low concentrations, though higher levels are associated with nuclear-related activities.
The authors of one 2018 study speculated that unusually high levels of tritium in the Rhône river delta in France were down to historical pollution from the watchmaking industry—tritium has been used to make glow-in-the-dark paint for watch dials. What many people don’t realize is that water containing tritium is actually routinely released into the sea—sometimes in vastly greater quantities than are to be discharged from Fukushima—by nuclear facilities all around the world, including in the US, Europe, and East Asia. The Cap de la Hague nuclear processing site in France releases 11,400 terabecquerels (TBq) of tritium every year, which is more than 13 times the total radioactivity of the tritium across every storage tank at Fukushima. Tepco is regularly testing the stored water ahead of the release, the company says. The water will be re-treated, multiple times if necessary, and diluted more than 100 times to bring its tritium radioactivity concentration down to no more than 0.0000000015 TBq per liter, a level equivalent to 1/40 of Japan’s national safety standards. Roughly 70 percent of the stored water also contains radionuclides other than tritium at concentrations exceeding regulatory limits, says the Japanese government—levels of these will also be brought down to below Japan’s regulatory standards. The water will then be tested again before being discharged. For a final point of comparison, Smith calculates that cosmic rays interacting with the Earth’s atmosphere over the Pacific Ocean annually cause the natural deposition of 2,000 times more tritium than will be introduced by the gradual Fukushima release. Tatsujiro Suzuki at Nagasaki University remembers watching in horror as the disaster unfolded back in 2011. “We all thought that this kind of thing would never happen in Japan,” he says. At the time, he was working for the government. He recalls the confusion over what was happening to the reactors in the days following the tsunami.
Everyone was gripped by fear. “Once you experience that kind of accident, you don’t want to see another one,” he says. The long shadow cast by the disaster means that, for the water release plan, the stakes—at least in terms of public trust—could not be higher. Suzuki argues that it’s not quite fair to compare the Fukushima water to fluids discharged from other nuclear facilities elsewhere in the world because of the challenge of cleaning up the many different radionuclides here. “This is an unprecedented event, we have not done this before,” he says, adding that he thinks the procedure is “probably safe” but that there is still room for human error or an accident, such as another tsunami, that could cause an uncontrolled release of the water into the sea. Tepco and the International Atomic Energy Agency have considered such possibilities and still judge the risk to human and marine life to be extremely low. Sameh Melhem, now at the World Nuclear Association, formerly worked for the Atomic Energy Agency and was involved in some of the research to evaluate the discharge plan. “I think it’s very safe for the operators themselves and also for the public,” he says, adding: “The radionuclide concentrations coming from this release, it’s negligible.” Last November, Casacuberta Arola and her colleagues collected samples of seawater off the coast of Fukushima, and they have recently begun to analyze them. The scientists measure the levels of various radionuclides that might be present. For tritium, that means removing all helium from the sample and waiting to see how much new helium emerges from the water as a product of radioactivity. This makes it possible to extrapolate the amount of tritium that must be present, explains Casacuberta Arola. She and her team have records of radionuclide measurements like this from the sea off Fukushima going back years. “We already know that the values that we see now close to Fukushima are close to the background values,” she says.
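The helium-ingrowth measurement described above relies on tritium's decay to helium-3, with a half-life of about 12.32 years: after degassing a sample, the ³He that accumulates over a known waiting period fixes the amount of tritium that must have been present. A sketch of the back-calculation (illustrative numbers only):

```python
import math

HALF_LIFE_YEARS = 12.32                 # tritium decays to helium-3
DECAY = math.log(2) / HALF_LIFE_YEARS   # decay constant, per year

def tritium_from_helium3(helium3_atoms: float, wait_years: float) -> float:
    """Tritium atoms present when the degassed sample was sealed, inferred
    from the helium-3 atoms that grew in over `wait_years`."""
    return helium3_atoms / (1.0 - math.exp(-DECAY * wait_years))

# Round trip: 1e9 tritium atoms sealed for six months should be recovered
# from the helium-3 they produce in that time.
sealed = 1e9
ingrown = sealed * (1.0 - math.exp(-DECAY * 0.5))
print(tritium_from_helium3(ingrown, 0.5))  # recovers ~1e9, up to float rounding
```

Any rise above the background values she describes would show up directly in tritium numbers inferred this way.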
If that changes, they should find out fairly quickly, as will the International Atomic Energy Agency and other observers, who separately intend to sample water and wildlife in the area in the coming years to keep an eye on things. Smith says that despite overwhelming evidence that the water release will be entirely safe and heavily scrutinized at every turn, it is not surprising that some people are skeptical of the plan. They have a right to be, he adds, given the troubled history of the plant. At the same time, the threat posed by the release—even in a worst-case scenario where everything goes wrong—is minuscule compared to some of the other environmental risks in the region, such as the effects of the climate crisis on the Pacific Ocean, Smith says. Casacuberta Arola agrees. Negative coverage of the discharge plan has been used to “brainwash” people, she argues, and to instill fear against the nuclear energy industry. “To me,” she adds, “it’s been very much exaggerated.”
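The figures quoted earlier in this piece can be cross-checked with a few lines of arithmetic. The 860 TBq tank inventory below is an assumed value, chosen only because it is consistent with the "more than 13 times" comparison to La Hague; Japan's 60,000 Bq/L national standard for tritium in discharged water is the benchmark behind the "1/40" figure.

```python
TBQ = 1e12  # becquerels per terabecquerel

la_hague_annual_tbq = 11_400   # La Hague's yearly tritium discharge (from the article)
fukushima_tanks_tbq = 860      # assumed total tritium inventory across the tanks
print(la_hague_annual_tbq / fukushima_tanks_tbq)   # just over 13x

target_tbq_per_liter = 0.0000000015            # discharge target quoted above
target_bq_per_liter = target_tbq_per_liter * TBQ
japan_limit_bq_per_liter = 60_000              # national safety standard for tritium
print(target_bq_per_liter)                             # ~1500 Bq/L
print(japan_limit_bq_per_liter / target_bq_per_liter)  # ~40, i.e. 1/40 of the limit
```

Expressed in everyday units, the target works out to 1,500 becquerels per liter, which is why the plan can dilute more than 100-fold and still sit far below the regulatory line.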
Environmental Science
The Reality of the Transition to an Environmentally Sustainable Economy in New York City Economic transitions take time but do take place. New York City first became prominent as a trading city, with America’s resources and farm goods delivered via the Erie Canal shipped out of the port of New York. Then, we became a manufacturing city, at one time making nearly all the clothing worn in America. Today, we are a center of finance, communication, fashion design, entertainment, the arts, health care, research, and education. We are no longer a shipping port or a center of manufacturing but a global capital. Slowly but certainly, we are on the path to environmental sustainability. Along with New York state, we have ambitious carbon reduction goals, and our utilities are struggling to make our energy grid more renewable and our energy system more efficient and reliable. The path is not direct, since inflation, mismanagement, NIMBY opposition, and pandemics are upending some of our plans. But not all of them. Local Law 97 requires large buildings over 25,000 square feet to reduce their carbon emissions by 40% by 2030 and 80% by 2050. Most of New York City’s large buildings are moving toward compliance, and only 10% have done nothing. The City’s Department of Buildings held a hearing in late October to take comments on the rules envisioned for enforcing the law and to present the operational definition of the “good faith compliance” building owners must demonstrate to show they are working toward adhering to the law. If compliance requires raising housing costs for people on low or fixed incomes, we could be paying for carbon reduction with homelessness. These realities of the transition must be understood, and we should allow for exceptions or the development of subsidies. We must understand the trade-off choices we are making and not turn poor people or the elderly into victims of the sustainability transition. Energy and climate change may be the lead story, but it’s not the only story.
Since 2013, large food companies and restaurants have been required to separate and recycle their food waste. Today, people living in Brooklyn and Queens are required to recycle their food waste. Curbside pickup of food waste will be expanded from Brooklyn and Queens to Staten Island and the Bronx on March 25, 2024, and to Manhattan on October 7, 2024. At that point, food recycling will be required city-wide. This waste will be converted into gas and fertilizer via anaerobic digestion or directly into fertilizer via composting. It also won’t end up in landfills, leaching toxics into groundwater and releasing methane into the atmosphere. And despite the shameful pandering of New Jersey’s Governor Murphy and less prominent but equally obtuse politicos, congestion pricing is at long last arriving in New York City’s Central Business District. Freight and cars will now be able to move along the streets more rapidly—polluting less and saving billions of dollars in lost productivity—and an extra billion dollars a year will be available to subsidize and improve mass transit. Lots of time and energy have been wasted on fears of unanticipated impacts from congestion pricing that will never materialize. My new favorite is the claim that trucks will divert around Manhattan to avoid the tolls. The cost of trucks sitting in Manhattan gridlock today dwarfs the cost of the tolls that are envisioned for the future. The reduction in travel time will be well worth the toll. Moreover, by improving mass transit, we make the city more energy-efficient and environmentally sustainable. This past April, New York City updated the city’s sustainability plan with a comprehensive document entitled PlaNYC: Getting Sustainability Done. New York City is a worldwide leader in the generations-long transition to urban environmental sustainability.
The updated plan includes 32 initiatives ranging from achieving a 30% tree canopy cover to pursuing fossil fuel-free city operations, from assisting city building owners with solar installation and other clean energy projects to developing new markets for recycling. These initiatives will be implemented through over 50 practical, operational actions, largely within the control of city government, such as: expanding the tree risk management program, installing heat pumps in 10,000 NYCHA apartments, phasing out city investment in fossil fuel equipment, and “[e]valuat[ing] all City roofs undergoing repair work for climate infrastructure installation by 2025…Install[ing] solar energy, electric building infrastructure, green roofs, or other renewable energy on all viable City-owned property by 2035.” The plan is ambitious and visionary and attempts to embed environmental sustainability in the routine operations of city government. While I would be impressed if half of what is outlined is achieved as planned, whatever is accomplished will represent important steps in the right direction. New York City government is a huge and unwieldy operation, subject to multiple cross-pressures and led by a mayor who is far more political than managerial. And “getting sustainability done” would be a huge challenge even for a mayor who was a genius at management. Nevertheless, the direction, intent, and accomplishments of the city’s government are significant elements of NYC’s transition to environmental sustainability. Fortunately for New Yorkers, the city and state governments are not going it alone. The use of renewable energy and the adoption of electric vehicles are growing in New York City. For one hour in May 2023, 20% of New York State’s electrical demand was met by solar power. Typically, over 25% of our electricity is drawn from hydropower.
The city's major nonprofit and private sector institutions see the need for environmental sustainability and, like the city government, are looking for practical operational opportunities to move us toward a circular economy. Brooklyn-based Revel (whose CEO, Frank Reig, is a graduate of the MPA in Environmental Science and Policy program I direct at Columbia's School of International and Public Affairs) is an electric vehicle innovator offering electric ride shares, mopeds, and high-speed charging hubs, with 40 chargers at two Brooklyn locations and plans to expand city-wide in 2024. Their bright blue ride-share Teslas and mopeds are hard to miss in my neighborhood. There are business opportunities in the transition to environmental sustainability, and many talented young entrepreneurs like Frank Reig are reaching for those opportunities. As long as I'm highlighting alums, I should mention that Jeff Prosserman, a graduate of Columbia's Sustainability Management master's program, has founded a company called Voltpost, which is installing charging stations built into urban streetlamps. Other young people are dreaming and studying in universities all over the world, getting ready to turn their sustainability ideals into operational realities. But it is not only government, entrepreneurs, and individuals who are moving New York forward. New York's major institutions (its cultural landmarks, universities, hospitals, and many private businesses) are exploring ways to decarbonize and recycle. Columbia University and many other institutions are working to reduce waste and greenhouse gas pollution and are slowly starting to invest in power systems and infrastructure that reduce their environmental impact. Here at Columbia, we are in the midst of a transition to electric vehicles, including buses, maintenance vehicles, and even the President's personal vehicle (a shiny red Tesla parked by the President's House). The reality is that our culture is changing.
Environmental impact has gone from the fringes to the center of our consciousness. When I started working in environmental policy back in 1975, few people cared about environmental quality, and people making decisions about energy, waste, manufacturing, and transportation rarely, if ever, factored environmental impacts into their decision-making. Today, it is routine for people in organizations to ask about environmental impact and sustainability when they design products or services. It is not that every decision is dominated by environmental considerations, but fewer and fewer decision-makers ignore environmental impacts. That is a sea change, and it is the best guarantee that the transition that has begun will continue until it is completed. The pace may be slow, but it is steady, and I've heard that's what wins the race.
Environmental Science
TL;DR: A wide range of online courses from Dartmouth College are available for free on edX. Learn about Linux, environmental science, philosophy, and more, without spending anything. There is an absolutely massive bank of online courses out there, just waiting for you to enroll, and there has never been a better time to pursue a passion or learn something new. Some of the best examples of these online courses can be found on edX. This online course provider has partnered with some of the world's greatest academic institutions to provide lessons on everything from artificial intelligence to poetry, and you can become a student for free. We have checked out everything on offer from edX and lined up a selection of standout courses from Dartmouth College. You can learn all about Linux programming, environmental science, philosophy, and much more, without spending anything. These are the best free online Dartmouth College courses as of June 20: It's important to note that these free courses do not include a certificate of completion, so if you really need something to stick on your CV, you will need to pay a small fee. But don't let that stop you from enrolling, because you can still learn at your own pace with unlimited access to all the course material. Find the best free Dartmouth College courses on edX.
Environmental Science
Biosphere 2 experiment reveals that soils in drought stress leak more volatile organic compounds into the atmosphere

Microbes are doing a lot under the soil surface that can't be seen with the naked eye, from sequestering carbon to building the foundation of Earth's crust. But even tiny microbes are feeling the stress of a hotter, drier future. According to a new study by University of Arizona researchers, published in Nature Microbiology, soil microbes release more volatile organic compounds into the atmosphere in response to drought stress. The study is just one part of the B2 Water, Atmosphere, and Life Dynamics (B2 WALD) project, which brought over 90 researchers from around the world to the University of Arizona's enclosed rainforest at Biosphere 2 to conduct a controlled drought experiment and better understand what happens to the world's ecosystems when water is scarce. Uncovering how soil microbes process carbon and interact with the atmosphere under environmental stress helps scientists predict and support how ecosystems will adapt in the face of increasing temperatures and prolonged drought.

Volatile isn't what you may think

When most people think of volatile organic compounds, they think of aerosols, which can contribute to warming and have negative impacts on air quality. But the term "volatile" simply refers to how easily a chemical or compound can change from the liquid to the gas phase, explained lead study author Linnea Honeker, a postdoctoral researcher who worked with Malak Tfaily, an associate professor of environmental science in the College of Agriculture and Life Sciences, during the B2 WALD project. Many volatile organic compounds are naturally produced and are released in our breath, from trees, or by microbes that live in the soil. Microbes naturally consume carbon as part of their life cycle and, in turn, produce volatile metabolites.
As part of the B2 WALD project, led by Laura Meredith, an associate professor and ecosystem genomics expert in the School of Natural Resources and the Environment, Honeker and a team of international soil and atmospheric scientists used a labeled carbon isotope to track the movement of carbon and water throughout the rainforest ecosystem during the simulated drought experiment. Using soil flux chambers, the team was able to measure the consumption and release of volatile organic compounds in the soil.

Less CO2, more VOCs

While microbes worked to break down volatile organic compounds produced in the soils during ambient, pre-drought conditions, these same microbes appeared to ramp up production and decrease consumption of volatile metabolites under drought stress. "What we found is microbial production of CO2 decreased during drought, but there was a net increase of emissions of the volatile metabolites acetate, acetone and diacetyl," said Honeker, who recently accepted a postdoctoral position in soil microbiome bioinformatics at the Lawrence Livermore National Laboratory. Overall, the study revealed that soil carbon cycling efficiency decreased during drought, which may be a result of microbes diverting more of their resources to producing volatile organic compounds and other protective compounds to help support themselves during the drought, she said. It is not yet clear what specific role the volatile organic compounds found in the study play in soil-atmosphere dynamics, but the findings are an important step toward understanding how small but mighty microbes beneath the surface are responding to environmental stress. "These results bring us one step closer to understanding how droughts, which are expected to increase in frequency and duration, can impact microbial carbon cycling in the soil, which, in turn, can have large-scale impacts on ecosystem services and even atmospheric processes," Honeker said. More information: Linnea K.
Honeker et al., "Drought re-routes soil microbial carbon metabolism towards emission of volatile metabolites in an artificial tropical rainforest," Nature Microbiology (2023). DOI: 10.1038/s41564-023-01432-9. Provided by the University of Arizona.
Environmental Science