the energy [r]evolution
The climate change imperative demands nothing short of an Energy [R]evolution. The expert consensus is that this fundamental shift must begin immediately and be well underway within the next ten years in order to avert the worst impacts. What is needed is a complete transformation of the way we produce, consume and distribute energy, while maintaining economic growth. Nothing short of such a revolution will enable us to limit global warming to a temperature rise of less than 2° Celsius, above which the impacts become devastating.
Current electricity generation relies mainly on burning fossil fuels, with their associated CO2 emissions, in very large power stations which waste much of their primary input energy. More energy is lost as the power is moved around the electricity grid network and converted from high transmission voltage down to a supply suitable for domestic or commercial consumers. The system is innately vulnerable to disruption: localised technical, weather-related or even deliberately caused faults can quickly cascade, resulting in widespread blackouts. Whichever technology is used to generate electricity within this old-fashioned configuration, it will inevitably be subject to some, or all, of these problems. At the core of the Energy [R]evolution there therefore needs to be a change in the way that energy is both produced and distributed.
4.1 key principles
the energy [r]evolution can be achieved by adhering to five key principles:
1. respect natural limits – phase out fossil fuels by the end of this century. We must learn to respect natural limits. There is only so much carbon that the atmosphere can absorb. Each year humans emit over 25 billion tonnes of carbon dioxide equivalent; we are literally filling up the sky. Geological resources of coal could provide several hundred years of fuel, but we cannot burn them and keep within safe limits. Oil and coal development must be ended. The global Energy [R]evolution scenario has a target to reduce energy-related CO2 emissions to a maximum of 10 Gigatonnes (Gt) by 2050 and phase out fossil fuels by 2085.
2. equity and fairness. As long as there are natural limits there needs to be a fair distribution of benefits and costs within societies, between nations and between present and future generations. At one extreme, a third of the world’s population has no access to electricity, whilst the most industrialised countries consume much more than their fair share.
The effects of climate change on the poorest communities are exacerbated by massive global energy inequality. If we are to address climate change, one of the core principles must be equity and fairness, so that the benefits of energy services – such as light, heat, power and transport – are available for all: north and south, rich and poor. Only in this way can we create true energy security, as well as the conditions for genuine human wellbeing.
The Advanced Energy [R]evolution scenario has a target to achieve energy equity as soon as technically possible. By 2050 the average per capita emission should be between 1 and 2 tonnes of CO2.
3. implement clean, renewable solutions and decentralise energy systems. There is no energy shortage. All we need to do is use existing technologies to harness energy effectively and efficiently. Renewable energy and energy efficiency measures are ready, viable and increasingly competitive. Wind, solar and other renewable energy technologies have experienced double-digit market growth for the past decade.
Just as climate change is real, so is the renewable energy sector. Sustainable decentralised energy systems produce less carbon emissions, are cheaper and involve less dependence on imported fuel. They create more jobs and empower local communities. Decentralised systems are more secure and more efficient. This is what the Energy [R]evolution must aim to create.
To stop the earth’s climate spinning out of control, most of the world’s fossil fuel reserves – coal, oil and gas – must remain in the ground. Our goal is for humans to live within the natural limits of our small planet.
4. decouple growth from fossil fuel use. Starting in the developed countries, economic growth must be fully decoupled from fossil fuel usage. It is a fallacy to suggest that economic growth must be predicated on their increased combustion.
We need to use the energy we produce much more efficiently, and we need to make the transition to renewable energy and away from fossil fuels quickly in order to enable clean and sustainable growth.
5. phase out dirty, unsustainable energy. We need to phase out coal and nuclear power. We cannot continue to build coal plants at a time when emissions pose a real and present danger to both ecosystems and people. And we cannot continue to fuel the myriad nuclear threats by pretending nuclear power can in any way help to combat climate change. There is no role for nuclear power in the Energy [R]evolution.
Source: http://www.energyblueprint.info/1332.0.html?L=0
What is Rainwater Harvesting?
Rainwater harvesting is an ancient practice of catching and holding rain for later use. In a rainwater harvesting system, rain is gathered from a building rooftop or other source and is held in large containers for future use, such as watering gardens or washing cars. This practice reduces the demand on water resources and is excellent during times of drought.
Why is it Important?
In addition to reducing the demand on our water sources (especially important during drought), rainwater harvesting also helps prevent water pollution. Surprised?
Here’s why: the success of the 1972 Clean Water Act has meant that the greatest threat to New York’s waterbodies comes not from industrial sources, but rather from the small actions we all make in our daily lives. For example, in a rain storm, the oil, pesticides, animal waste, and litter from our lawns, sidewalks, driveways, and streets are washed down into our sewers. This is called non-point source (NPS) pollution because the pollutants come from too many sources to be identified. Rainwater harvesting diverts water from becoming polluted stormwater; instead, this captured rainwater may be used to irrigate gardens near where it falls.
In New York City, keeping rainwater out of the sewer system is very important. That’s because the city has an old combined sewer system that uses the same pipes to transport both household waste and stormwater to sewage treatment plants. During heavy rains, the system overloads; then untreated sewage and contaminated stormwater overflow into our rivers and estuary, with serious consequences.
Who is Harvesting Rainwater in New York City?
Back in 2002, a drought emergency pushed many community gardens to the brink of extinction. For the first time in twenty years, community gardeners were denied permission to use fire hydrants, the primary source of water for most community gardens. This crisis led to the formation of the Water Resources Group (WRG), an open collaboration of community gardening and environmental organizations. With help from the WRG, rainwater harvesting systems have now been built as demonstration sites in twenty NYC community gardens.
At community gardens that harvest rainwater, rain is diverted from the gutters of adjacent buildings and is stored in tanks in the gardens. A 1-inch rainfall on a 1,000-square-foot roof produces 600 gallons of water. The tanks are mosquito proof, so the standing water does not encourage West Nile virus. Because rainwater is chlorine free, it is better than tap water for plant growth, meaning healthier plants. And it’s free!
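The arithmetic behind that figure is easy to check. Here is a minimal sketch in Python; the 7.48 gallons-per-cubic-foot conversion is a standard constant, and the function name is ours, not from any harvesting toolkit:

```python
# Rough rainwater-capture estimate: roof area (sq ft) x rainfall (in) / 12
# gives cubic feet, then convert to gallons. Ignores gutter and filter losses.

GALLONS_PER_CUBIC_FOOT = 7.48

def harvested_gallons(roof_sqft: float, rainfall_inches: float) -> float:
    """Gallons of rain falling on a roof, before collection losses."""
    cubic_feet = roof_sqft * (rainfall_inches / 12.0)
    return cubic_feet * GALLONS_PER_CUBIC_FOOT

# A 1-inch rainfall on a 1,000-square-foot roof:
print(round(harvested_gallons(1000, 1)))  # -> 623, close to the article's 600 gallons
```

In practice the quoted 600 gallons builds in a small allowance for collection losses, which is why it sits just under the theoretical 623.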
What are Other Cities Doing?
Many cities have adopted creative, low-cost ways to stop wasting rainwater by diverting it from their sewage systems and putting it to use where it falls.
What Can I Do?
Spread the word! Educate those around you on the importance of lifestyle decisions.
Tell people not to litter, dump oil down storm drains, or overfertilize their lawns.
Install a rainwater harvesting system at your home, school, business, or local community center.
Contact your local elected officials, and let them know you support rainwater harvesting!
Source: http://www.waterresourcesgroup.org/
Japan has been hit by its worst crisis since 1945: an earthquake and tsunami have killed 10,000 people, destroyed tens of thousands of buildings, displaced hundreds of thousands, and left millions without power or water. As the nation braces for more aftershocks, workers have resorted to using sea water to cool the stricken nuclear plant – which has already leaked radiation and forced a mass evacuation – in an attempt to prevent a meltdown from adding a third catastrophe. According to Greenpeace,
"We are told by the nuclear industry that things like this cannot happen with modern reactors, yet Japan is in the middle of a nuclear crisis with potentially devastating consequences… The evolving situation at Fukushima remains far from clear, but what we do know is that contamination from the release of Cesium-137 poses a significant health risk to anyone exposed. Cesium-137 has been one of the isotopes causing the greatest health impacts following the Chernobyl disaster, because it can remain in the environment and food chain for 300 years.”
Whereas the first two catastrophes were natural and unpredictable, a nuclear meltdown is entirely unnatural and entirely predictable, according to the local anti-nuclear group, the Citizens’ Nuclear Information Centre.
The nuclear crisis comes a month before the 25th anniversary of the Chernobyl disaster, the largest nuclear meltdown in history, which showered Europe in a radioactive cloud causing a quarter of a million cancers, 100,000 of them fatal. As of this writing the disaster in Japan is already the third worst in history, behind Chernobyl and the Three Mile Island partial meltdown in 1979, and comes only 12 years after a fatal overexposure of workers at a nuclear plant in Tokaimura, Japan. Even without the inherent risk of a meltdown, nuclear power is a threat to health. The problem is not just the few terrible times when they don't work, but the daily experience of when they do work. As climate campaigner George Monbiot wrote more than a decade ago,
“The children of women who have worked in nuclear installations, according to a study by the National Radiological Protection Board, are eleven times more likely to contract cancer than the children of workers in non-radioactive industries. You can tell how close to [the nuclear plant in] Sellafield children live by the amount of plutonium in their teeth.”
Add to this the morbidity and mortality of working in uranium mines and the dangers of disposing of radioactive waste, and you have negative health impacts at every stage of nuclear power (for a summary see the UK’s Campaign for Nuclear Disarmament). Despite this, governments have invested massively in the nuclear industry and globalized the risk. Canada has exported nuclear reactors while building seven of its own, and despite concerns about safety the Ontario government plans on investing $36 billion into nuclear power at the same time as it backs off wind power.
REASONS AND EXCUSES
While nuclear power is a clear and present danger to the health of the planet and its people, it is a thriving industry driven by economic and military competition. Vandana Shiva – who studied nuclear physics and now leads the climate justice movement in India – has exposed the hypocrisy of US hostility to Iranian nuclear power when it is doing the same thing to promote nuclear power and weapons in India as a bulwark against China.
As Shiva summarized in her book Soil Not Oil, “nuclear winter is not an alternative to global warming”, and it is a tragedy that Japan has become the test case against both military and civilian arms of the nuclear industry – from the atomic bomb 65 years ago to the nuclear meltdown today. But instead of admitting the problems of nuclear power, the nuclear industry and its supporters have greenwashed it and presented it as a solution to global warming. Some environmentalists, such as Gaia theorist James Lovelock, have fallen prey to these claims. Lovelock, whose ideas are driven by apocalyptic predictions and an extreme pessimism, has gone so far as to claim that “nuclear power is the only green solution”. While former US president George Bush defended his country’s 103 nuclear power plants as not producing “a single pound of air pollution or greenhouse gases”, Dr. Helen Caldicott has refuted the claim in her important book Nuclear Power is Not the Answer, which argues that even without meltdowns nuclear power is a threat to the planet.
The false dichotomy between carbon emissions and nuclear power is also refuted by those developing the Tar Sands, who have proposed using nuclear power to pump Tar Sands oil.
PEOPLE POWER, GREEN JOBS
Fortunately there are growing anti-nuclear campaigns uniting indigenous groups, NGOs and the broader climate justice movement to challenge nuclear power in all its stages – from mining to use to waste disposal – as Vandana Shiva writes in Soil Not Oil.
Meanwhile in Canada, indigenous groups are leading opposition to the transportation of nuclear waste through the Great Lakes and their surrounding communities, declaring “what we do to the land, we do to ourselves.” Last year the German government extended the operating lives of its nuclear plants against the will of the majority, but after news of the leak in Japan, 50,000 people formed a human chain from a nuclear reactor to Stuttgart demanding an end to nuclear power.
Uniting these campaigns with the labour movement raises the demands of good green jobs for all, to transform our oil and nuclear economy into one based on ecological and social sustainability and justice. Instead of the billions in subsidies for the nuclear industry, governments could be investing in solar, wind and clean electricity, while retrofitting buildings, which could solve the economic and climate crises without the inherent dangers of nuclear power. As Greenpeace wrote,
"Our thoughts continue to be with the Japanese people as they face the threat of a nuclear disaster, following the already devastating earthquake and tsunami. The authorities must focus on keeping people safe, and avoiding any further releases of radioactivity... Greenpeace is calling for the phase-out of existing reactors, and no construction of new commercial nuclear reactors. Governments should invest in renewable energy resources that are not only environmentally sound but also affordable and reliable.”
Source: http://yourheartsontheleft.blogspot.com/2011/03/nuclear-meltdown-is-not-alternative-to.html
Barbara Heath Land Race – 2012
By the time Barbara Heath visited Horsham, the town and the surrounding Wimmera District of Western Victoria were in the process of recovering from a decade-long drought. To inform her work, which was initially to address issues of drought, Heath held a number of planned and fortuitous conversations with the assistance of Horsham Regional Art Gallery staff, which came to focus on the changes in agricultural practices in the area.
The list of people with whom Heath consulted is lengthy, but Dr Bob Redden, curator of the Australian Temperate Field Crops Collection at the Grains Innovation Park, became her main contact. In an email of August 2011, Dr Redden wrote to Heath: ‘Now with unprecedented population levels and growth, there is a risk of disconnect and taking food supply for granted, even with climate change. Humans will need to change if they wish to continue their increasingly diverse interests, but will need to prioritise agricultural research, better understanding our available genetic resources, plant growth and development, and imaginative paths to harnessing science, and truly earn the title “Homo sapiens”.’
Land race is a direct response to the urgency of maintaining biodiversity. Agriculture today requires economies of scale that change the social landscape and limit population diversity. This results in the erasure of many small communities, loss of connection to the past and cultural loss. Dr Redden explained his department’s work to ensure plant gene diversity by sourcing and saving seed from land race crops. ‘Land race’ is the term used to describe heritage seed varieties now being displaced by International Seed Uniformity Standards.
Heath’s Land Race series shows distinct levels, from biodiversity in the soils to the patterns of farming practices above. Each Land Race also features a remnant plant species that reaches up and through the tractor track patterns: briar, apple and aloe.
There are numerous hero shots (one above) and details prepared (below); we will wait for the show to get under way and publicise a little later. The preliminary research is in an earlier blog post – click here.
Source: http://viewersite.wordpress.com/2012/02/
Participatory Video created by members of various indigenous communities in Itogon, Philippines, tracking the impacts of large-scale mining and now climate change on their environment and culture.
This film was created by members of various indigenous communities in the Cordillera region of the Philippines, during a Participatory Video project facilitated by InsightShare. The participants were taught to use video cameras during an intensive 9-day PV workshop in the barangay of Garrison, in Itogon, and created this 24-minute film to communicate the devastating impacts of large-scale mining wrought on their communities by various companies over the years, and now the increasingly alarming impacts of climate change.
This project was part of the Conversations with the Earth project. Launched in April 2009, Conversations with the Earth is a collective opportunity to build a global movement for an indigenous-controlled community media network. CWE works with a growing network of indigenous groups and communities living in critical ecosystems around the world, from the Atlantic Rainforest to Central Asia, from the Philippines to the Andes, from the Arctic to Ethiopia. Through CWE, these indigenous communities are able to share their story of climate change. Through the creation of sustainable autonomous indigenous media hubs in these regions, CWE fosters a long-term relationship with these communities, based on principles of local control and supporting indigenous media capacity.
Source: http://www.insightshare.org/watch/video/voices-experience
As the years tick by with most of the planet doing little in the way of reducing carbon emissions, researchers are getting increasingly serious about the possibility of carbon sequestration. If it looks like we're going to be burning coal for decades, carbon sequestration offers us the best chance of limiting its impact on climate change and ocean acidification. A paper that will appear in today's PNAS describes a fantastic resource for carbon sequestration that happens to be located right next to many of the US' major urban centers on the East Coast.
Assuming that capturing the carbon dioxide is financially and energetically feasible, the big concern becomes where to put it so that it will stay out of the atmosphere for centuries. There appear to be two main schools of thought here. One is that areas that hold large deposits of natural gas should be able to trap other gases for the long term. The one concern here is that, unlike natural gas, CO2 readily dissolves in water, and may escape via groundwater that flows through these features. The alternative approach turns that problem into a virtue: dissolved CO2 can react with minerals in rocks called basalts (the product of major volcanic activity), forming insoluble carbonate minerals. This should provide an irreversible chemical sequestration.
The new paper helpfully points out that if we're looking for basalts, the East Coast of the US, home to many of its major urban centers and their associated carbon emissions, has an embarrassment of riches. The rifting that broke up the supercontinent called Pangea and formed the Atlantic Ocean's basin triggered some massive basalt flows at the time, which are now part of the Central Atlantic Magmatic Province, or CAMP. The authors estimate that prior to some erosion, CAMP had the equivalent of the largest basalt flows we're currently aware of, the Siberian and Deccan Traps.
Some of this basalt is on land—anyone in northern Manhattan can look across the Hudson River and see it in the sheer cliffs of the Palisades. But much, much more of it is off the coast under the Atlantic Ocean. The authors provide some evidence in the form of drill cores and seismic readings that indicate there are large basalt deposits in basins offshore of New Jersey and New York, extending up to southern New England.
These areas are now covered with millions of years of sediment, which should provide a largely impermeable barrier that will trap any gas injected into the basalt for many years. The deposits should also have reached equilibrium with the seawater above, which will provide the water necessary for the chemical reactions that precipitate out carbonate minerals.
Using a drill core from an onshore deposit, the authors show that the basalt deposits are also composed of many distinct flows of material. Each of these flows would have undergone rapid cooling on both its upper and lower surface, which fragmented the rock. The core samples show porosity levels between 10 and 20 percent, which should allow any CO2 pumped into the deposits to spread widely.
The authors estimate that New Jersey's Sandy Hook basin, a relatively small deposit, is sufficient to house 40 years' worth of emissions from coal plants that produce 4GW of electricity. And the Sandy Hook basin is dwarfed by one that lies off the Carolinas and Georgia. They estimate that the South Georgia Rift basin covers roughly 40,000 square kilometers.
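As a rough sanity check on the "40 years of emissions from 4 GW of coal plants" claim, one can estimate the tonnage involved. The capacity factor and the emission factor below are our assumptions (typical values for coal-fired generation), not figures from the paper:

```python
# Back-of-envelope estimate of 40 years of CO2 from 4 GW of coal capacity.
# 0.95 kg CO2/kWh and an 85% capacity factor are assumed typical values.

GW = 4.0
CAPACITY_FACTOR = 0.85
KG_CO2_PER_KWH = 0.95
YEARS = 40

kwh_per_year = GW * 1e6 * 8760 * CAPACITY_FACTOR          # GW -> kW, x hours/year
tonnes_per_year = kwh_per_year * KG_CO2_PER_KWH / 1000.0  # kg -> tonnes
total_gt = tonnes_per_year * YEARS / 1e9                  # tonnes -> gigatonnes

print(f"{tonnes_per_year / 1e6:.0f} Mt CO2/yr, ~{total_gt:.1f} Gt over {YEARS} years")
```

Under these assumptions the basin would need to hold on the order of a gigatonne of CO2, which gives a sense of the scale even a "relatively small" deposit is being asked to absorb.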
The authors argue that although laboratory simulations suggest the basic idea of using basalts for carbon sequestration is sound, the actual effectiveness in a given region can depend on local quirks of geology, so pilot tests in the field are absolutely essential for determining whether a given deposit is suitable. So far, only one small-scale test has been performed on any of the CAMP deposits.
Given the area's proximity to significant sources of CO2 and the infrastructure that could be brought into play if full-scale sequestration is attempted, it seems like one of the most promising proposals to date.
PNAS, 2010. DOI: 10.1073/pnas.0913721107
Source: http://arstechnica.com/science/2010/01/pangea-era-rift-makes-east-coast-perfect-for-carbon-storage/
Bundelkhand’s ravine wastelands. Photo: Keya Acharya/IPS
BUNDELKHAND, India – Narrow, cobblestoned lanes separate the rows of mud houses with cool interiors and mud-smoothened patios, some with goats tethered to the wooden posts. This is Tajpura village, deep in this water-stressed, drought-prone region of northern India.
An area of stark beauty marked by deep ravines in central India, Bundelkhand spans the states of Uttar Pradesh and Madhya Pradesh. The ruins of stone fortresses dotting the landscape betray a history of constant warfare just as the remnants of water courses and irrigation systems speak of peaceable and prosperous times gone by.
Bundelkhand suffers from manmade problems, starting with the government’s misplaced land and water policies that have worsened an already stressed climatic situation caused by prolonged droughts and erratic rainfall.
Air-dropping of ‘Prosopis juliflora’ seeds as a soil-conservation measure in the 1960s resulted in the plant becoming an invasive species that killed indigenous shrubs and trees, making the soft soils of the ravines leach water rapidly and turning vast areas into wastelands.
Thoughtless promotion by the government of water-intensive crops like mentha (mint) encouraged richer farmers to dig deep tube wells while neglecting groundwater recharge, resulting in a disastrous lowering of the water table.
Marginalised farmers, unable to afford expensive infrastructure and inputs, suffer as groundwater depletion adds to problems caused by the ancient rainwater storage and distribution systems going defunct.
Drought is now a familiar spectre in this region, and less than half of its one-million-hectare arable spread remains cultivable, causing distress to its mainly farming population of 50 million people.
“What you have is very high water consumption in an area suffering from water crisis,” says Anil Singh, coordinator of Parmarth, an organisation working to revive traditional systems of water and cropping among marginalised communities that inhabit the ravines of Bundelkhand.
In Tajpura village, as though in denial of Bundelkhand’s stark conditions, 36-year-old Mamtadevi, wife of Ajan Singh, serves up a meal of steaming hot chappatis (Indian flat bread) smeared with clarified butter, a cool, green salad and a dish of smoked brinjal, boiled potato, fresh tomato and green chilli.
“That extra taste in the vegetables is because they are grown sustainably and without chemicals,” explains Mamtadevi.
Ajan Singh and Mamtadevi were among the first to adopt Parmarth’s ‘low external input sustainable agriculture’ (LEISA), which is now standing them in good stead as rainfall becomes scantier and average temperatures rise.
LEISA involves such practices as efficient recycling of nitrogen and other plant nutrients, managing pests through natural means, maintaining ideal soil conditions and ensuring that local farmers are aware of the environment and the value of preserving ecosystems.
The soundness of this method shows in the freshness of Ajan Singh’s vegetable crops, in biodiversity conservation through the use of hardy indigenous seeds, and in the avoidance of chemicals to maintain soil health.
Ajan Singh is also able to beat the vagaries of the weather, and this year’s drought, caused by the failure of the monsoons, holds no great terror for him or for other farmers who follow LEISA.
Bhartendu Prakash, steering committee member of the Organic Farmers Association of India (OFAI) and in-charge of its northern branch based in Bundelkhand, says the region was hit by frost last winter but organic farmlands using LEISA were the least affected.
“I did not know this system previously. I would grow ‘gehu’ (wheat) and manage 200-300 kg on this same plot,” says Ajan Singh.
Parmarth helped the community in contouring the lands for rainwater run-off and storage and constructed a well for irrigation. Its volunteers also taught farmers like Ajan Singh how to make vermicompost and set up pheromone traps to catch insects.
Most farmers, though, already had their own methods of making biopesticide – usually a mix of neem leaves and garlic soaked in buffalo buttermilk. “But before the pheromone traps were laid, the spraying had to be done once every three days; now once a week is enough,” says Mamtadevi.
By 2009, the couple’s vegetables had such a reputation for quality that they sold at the local market 10 km away at higher than prevailing rates, earning them nearly 80,000 Indian rupees (then approximately 1,800 dollars) yearly.
Three years later, Ajan Singh bought another ‘bigha’ (approximately 2.2 acres) of land. He now takes his produce to two markets and also sells milk from five buffaloes that he bought with his earnings.
Fifteen more farmers from Tajpura are now following Ajan Singh’s methods.
Along with this, the women of the community have banded together into self-help groups that maintain a savings and loan account to assist women find simple livelihood alternatives like livestock rearing.
The women also run a grain bank that sells surplus grain in the open market and gives grain free to distressed families in times of need.
“We are now trying to link the community to government schemes wherever possible, such as obtaining sprinklers, and getting some benefit from the state-run Bundelkhand Relief Package, which does help with drought-proofing,” says Parmarth’s Anil Singh.
Released in 2009 by the federal government, the package worth 1.5 billion dollars supports rainwater harvesting, proper utilisation of river systems, irrigation canals and water bodies over a three-year period.
But Bundelkhand’s natural farming methods need to get more support as the funding period comes to an end.
“Bundelkhand is too entrenched in northern Indian chemical farming methods,” says OFAI’s Prakash. In contrast, OFAI is deluged with requests for training in organic farming methods from farmers in Punjab and Haryana, the ‘mother zone’ of the so-called ‘green revolution’ that transformed agriculture in India after its introduction in the 1960s.
Rajesh Krishnan, campaigner for Greenpeace in India, is optimistic that the government will see the wisdom of promoting organic agriculture as a counter measure to the numerous fallouts of chemical agriculture that fuelled the green revolution.
Krishnan is hopeful for the probable financing of sustainable agriculture in India’s 12th Five-Year Plan, due to be rolled out in November.
Prakash is confident that sustainable agricultural farming will survive through a growing demand for organically-grown crops.
Source: http://climate-connections.org/2012/08/22/india-beating-the-weather-with-sustainable-crops/
Scientists have long projected that areas north and south of the tropics will grow drier in a warming world –- from the Middle East through the European Riviera to the American Southwest, from sub-Saharan Africa to parts of Australia.
These regions are too far from the equator to benefit from the moist columns of heated air that result in steamy afternoon downpours. And the additional precipitation foreseen as more water evaporates from the seas is mostly expected to fall at higher latitudes. Essentially, a lot of climate scientists say, these regions may start to feel more like deserts under the influence of global warming.
Now scientists have measured a rapid recent expansion of desert-like barrenness in the subtropical oceans –- in places where surface waters have also been steadily warming. There could be a link to human-driven climate change, but it’s too soon to tell, the scientists said.
Source: http://dotearth.blogs.nytimes.com/tag/deserts/
The Seine, the scenic river running through Paris, has inspired artists, attracted tourists and served as the soul of the city, and now it will also be a source of renewable energy. Paris officials have announced a plan to place river turbines beneath four bridges on the Seine.
The Pont du Garigliano, Pont de la Tournelle, Pont Marie and Pont au Change will each have two turbines installed underwater at their base. These bridges were chosen because the speed of the current accelerates in those locations. While river currents don't produce the kind of electricity that wave power can, the current-harvesting technology has come a long way and more devices are being introduced that can generate energy from even the slowest moving waters.
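The reason bridge piers make attractive sites is the standard hydrokinetic power relationship, P = ½ρAv³Cp: available power scales with the cube of current speed. A short sketch, where the rotor area, speeds and power coefficient are illustrative assumptions, not specifications of the Paris project:

```python
# Power available to a hydrokinetic (river-current) turbine:
#   P = 0.5 * rho * A * v^3 * Cp
# All numbers below are illustrative, not Paris project specs.

RHO_WATER = 1000.0  # density of fresh water, kg/m^3

def turbine_power_kw(rotor_area_m2: float, current_m_s: float, cp: float = 0.35) -> float:
    """Extractable power in kW; cp is the turbine's power coefficient."""
    return 0.5 * RHO_WATER * rotor_area_m2 * current_m_s**3 * cp / 1000.0

# The cubic dependence on speed is why accelerated flow under a bridge matters:
print(turbine_power_kw(4.0, 1.0))  # slow open reach
print(turbine_power_kw(4.0, 2.0))  # doubled current -> 8x the power
```

Doubling the current speed yields eight times the power, which is why even a modest acceleration around a pier can make an otherwise slow river worth tapping.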
City officials have put a call out to power companies to come up with the best plan for installing the turbines, with a winner being chosen in January and installations starting next spring.
via The Guardian
|
<urn:uuid:e338e7ab-37e4-40ee-98f0-254c81baa630>
|
http://ecogeek.org/component/content/article/3207-paris-putting-turbines-in-the-seine
| 0.9792
|
fineweb
|
Countless lakes dot the marshy Arctic tundra regions. Now, in the latest addition to the growing body of evidence that global warming is significantly affecting the Arctic, two recent studies suggest that thawing permafrost is the cause of two seemingly contradictory observations: both rapidly growing and rapidly shrinking lakes.
Thawing permafrost is altering the lakes that dominate Arctic landscapes, such as this one in western Siberia. Courtesy of Laurence C. Smith.
The first study is a historical analysis of changes to 10,000 Siberian lakes over the past 30 years, a period of warming air and soil temperatures. Using satellite images, Laurence Smith, a geographer at the University of California, Los Angeles, and colleagues found that, since the early 1970s, 125 Siberian lakes vanished completely, and those that remain averaged a 6 percent loss in surface area, a total of 930 square kilometers.
They report in the June 3 Science that the spatial pattern of lake disappearance suggests that the lakes drained away when the permafrost below them thawed, allowing the lake water to seep down into the groundwater. However, the team also found that lakes in northwestern Siberia actually grew by 12 percent, and 50 new lakes formed. Both of the rapid changes are due to warming, they say, and if the warming trend continues, the northern lakes will eventually shrink as well.
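The study's figures are easy to sanity-check against one another; the back-of-envelope arithmetic below is mine, not the authors'.

```python
# If the surviving lakes lost 6 percent of their surface area on average, and
# that loss totalled 930 square kilometers, the implied original combined area
# of those lakes is loss / fraction.
loss_km2 = 930.0
loss_fraction = 0.06
implied_original_area = loss_km2 / loss_fraction
print(f"{implied_original_area:,.0f} km^2")  # ~15,500 km^2 of lake surface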
These two processes are similar, in that were witnessing permafrost degradation in both regions, says co-author Larry Hinzman, a hydrologist at the University of Alaska in Fairbanks, who in previous studies documented shrinking lakes in southern Alaska. In the warmer, southern areas, we get groundwater infiltration, but in the northern areas, where the permafrost is thicker and colder, its going to take much, much longer for that to occur. So instead of seeing lakes shrinking there, were seeing lakes growing.
That finding is consistent with the second study, which focused on a set of unusually oriented, rapidly growing lakes in northern Alaska, an area of continuous permafrost. Jon Pelletier, a geomorphologist at the University of Arizona in Tucson, reports in the June 30 Journal of Geophysical Research Earth Surface that the odd alignment of the lakes is caused not by wind direction but by permafrost melting faster at the downhill end of the lake, which has shallower banks.
Since the 1950s, scientists have attributed the odd alignment of the egg-shaped lakes to winds blowing perpendicularly to the long axes of the lakes, which then set up currents that caused waves to break at the northwest and southeast ends, thus preferentially eroding them. The prevailing wind direction idea has been around so long that we dont even think about it, Smith says, but Jons [Pelletier] work is challenging that. Its a very interesting paper.
Wind-driven erosion occurs in the Great Lakes, but at rates of about a meter a year. The Alaskan oriented thaw lakes grow at rates of 5 meters or more per year. Pelletier says this rate difference suggests a different process is at work.
According to the model, the direction and speed of growth depend on where and how quickly the permafrost thaws, which is determined by two factors: how the water table intersects the slope of the landscape and how fast the summer temperature increases. If the permafrost thaws abruptly, the shorter, downhill bank is more likely to thaw first. The soggy soil slumps into the water, and the perimeter of the lake is enlarged. Its not just the [global] warming trend, but also how quickly the warming takes place in the summertime, Pelletier says.
Hinzman says that the lakes are just one part of the Arctic water cycle, which has seen an increasing number of perturbations in recent years. The whole hydrologic cycle is changing and this is just one component of that.
Understanding how the hydrologic cycle is changing is important, Hinzman says, because the amount of freshwater runoff into the Arctic Ocean impacts global ocean circulation and the amount of sea ice, thus affecting climate worldwide. If global warming continues to the point where permafrost goes away, there will be fewer lakes, Smith says. And a drier, less marshy Arctic could alter weather patterns and ecosystems, researchers say, affecting everything from the subsistence lifestyle of native people to the hazard of fire on the tundra.
Geotimes contributing writer
Back to top
|
<urn:uuid:5fdf99e1-ac10-4897-aae4-baeb9600a36e>
|
http://www.geotimes.org/sept05/NN_arcticlakes.html
| 0.9633
|
fineweb
|
A new world record wind gust: 253 mph in Australia's Tropical Cyclone Olivia
The 6,288-foot peak of New Hampshire's Mount Washington is a forbidding landscape of wind-swept barren rock, home to some of planet Earth's fiercest winds. As a 5-year old boy, I remember being blown over by a terrific gust of wind on the summit, and rolling out of control towards a dangerous drop-off before a fortuitously-placed rock saved me. Perusing the Guinness Book of World Records as a kid, three iconic world weather records always held a particular mystique and fascination for me: the incredible 136°F (57.8°C) at El Azizia, Libya in 1922, the -128.5°F (-89.2°C) at the "Pole of Cold" in Vostok, Antarctica in 1983, and the amazing 231 mph wind gust (103.3 m/s) recorded in 1934 on the summit of Mount Washington, New Hampshire. Well, the legendary winds of Mount Washington have to take second place now, next to the tropical waters of northwest Australia. The World Meteorological Organization (WMO) has announced that the new world wind speed record at the surface is a 253 mph (113.2 m/s) wind gust measured on Barrow Island, Australia. The gust occurred on April 10, 1996, during passage of the eyewall of Category 4 Tropical Cyclone Olivia.
Figure 1. Instruments coated with rime ice on the summit of Mt. Washington, New Hampshire. Image credit: Mike Theiss.
Tropical Cyclone Olivia
Tropical Cyclone Olivia was a Category 4 storm on the U.S. Saffir-Simpson scale, and generated sustained winds of 145 mph (1-minute average) as it crossed over Barrow Island off the northwest coast of Australia on April 10, 1996. Olivia had a central pressure of 927 mb and an eye 45 miles in diameter at the time, and generated waves 21 meters (69 feet) high offshore. According to Black et al. (1999), the eyewall likely had a tornado-scale mesovortex embedded in it that caused the extreme wind gust of 253 mph. The gust was measured at the standard measuring height of 10 meters above ground, on ground at an elevation of 64 meters (210 feet). A similar mesovortex was encountered by a Hurricane Hunter aircraft in Hurricane Hugo of 1989, and a mesovortex was also believed to be responsible for the 239 mph wind gust measured at 1400 meters by a dropsonde in Hurricane Isabel in 2003. For reference, 200 mph is the threshold for the strongest category of tornado, the EF-5, and any gusts of this strength are capable of causing catastrophic damage.
Figure 2. Visible satellite image of Tropical Cyclone Olivia a few hours before it crossed Barrow Island, Australia, setting a new world-record wind gust of 253 mph. Image credit: Japan Meteorological Agency.
Figure 3. Wind trace taken at Barrow Island, Australia during Tropical Cyclone Olivia. Image credit: Buchan, S.J., P.G. Black, and R.L. Cohen, 1999, "The Impact of Tropical Cyclone Olivia on Australia's Northwest Shelf", paper presented at the 1999 Offshore Technology Conference in Houston, Texas, 3-6 May, 1999.
Why did it take so long for the new record to be announced?
The instrument used to take the world record wind gust was funded by a private company, Chevron, and Chevron's data was not made available to forecasters at Australia's Bureau of Meteorology (BOM) during the storm. After the storm, the tropical cyclone experts at BOM were made aware of the data, but it was viewed as suspect, since the gusts were so extreme and the data was taken with equipment of unknown accuracy. Hence, the observations were not included in the post-storm report. Steve Buchan from RPS MetOcean believed in the accuracy of the observations, and coauthored a paper on the record gust, presented at the 1999 Offshore Technology Conference in Houston (Buchan et al., 1999). The data lay dormant until 2009, when Joe Courtney of the Australian Bureau of Meteorology was made aware of it. Courtney wrote up a report, coauthored with Steve Buchan, and presented this to the WMO extremes committee for ratification. The report has not been made public yet, and is awaiting approval by Chevron. The verified data will be released next month at a World Meteorological Organization meeting in Turkey, when the new world wind record will become official.
New Hampshire residents are not happy
Residents of New Hampshire are understandably not too happy about losing their cherished claim to fame. The current home page of the Mount Washington Observatory reads, "For once, the big news on Mount Washington isn't our extreme weather. Sadly, it's about how our extreme weather--our world record wind speed, to be exact--was outdone by that of a warm, tropical island".
Comparison with other wind records
Top wind in an Atlantic hurricane: 239 mph (107 m/s) at an altitude of 1400 meters, measured by dropsonde in Hurricane Isabel (2003).
Top surface wind in an Atlantic hurricane: 211 mph (94.4 m/s), Hurricane Gustav, Paso Real de San Diego meteorological station in the western Cuban province of Pinar del Rio, Cuba, on the afternoon of August 30, 2008.
Top wind in a tornado: 302 mph (135 m/s), measured via Doppler radar at an altitude of 100 meters (330 feet), in the Bridge Creek, Oklahoma tornado of May 3, 1999.
Top surface wind not associated with a tropical cyclone or tornado: 231 mph (103.3 m/s), April 12, 1934 on the summit of Mount Washington, New Hampshire.
Top wind in a typhoon: 191 mph (85.4 m/s) on Taiwanese Island of Lanya, Super Typhoon Ryan, Sep 22, 1995; also on island of Miyakojima, Super Typhoon Cora, Sep 5, 1966.
Top surface wind not measured on a mountain or in a tropical cyclone: 207 mph (92.5 m/s) measured in Greenland at Thule Air Force Base on March 6, 1972.
Top wind measured in a U.S. hurricane: 186 mph (83.1 m/s) measured at Blue Hill Observatory, Massachusetts, during the 1938 New England Hurricane.
Buchan, S.J., P.G. Black, and R.L. Cohen, 1999, "The Impact of Tropical Cyclone Olivia on Australia's Northwest Shelf", paper presented at the 1999 Offshore Technology Conference in Houston, Texas, 3-6 May, 1999.
Black, P.G., Buchan, S.J., and R.L. Cohen, 1999, "The Tropical Cyclone Eyewall Mesovortex: A Physical Mechanism Explaining Extreme Peak Gust Occurrence in TC Olivia, 4 April 1996 on Barrow Island, Australia", paper presented at the 1999 Offshore Technology Conference in Houston, Texas, 3-6 May, 1999.
|
<urn:uuid:3cf8391c-7628-4b73-b23d-af8d16292401>
|
http://www.wunderground.com/blog/JeffMasters/comment.html?entrynum=1420&page=7
| 0.9948
|
fineweb
|
July 18, 2012
Since the Industrial Revolution, ocean acidity has risen by 30 percent as a direct result of fossil-fuel burning and deforestation. And within the last 50 years, human industry has caused the world’s oceans to experience a sharp increase in acidity that rivals levels seen when ancient carbon cycles triggered mass extinctions, which took out more than 90 percent of the oceans’ species and more than 75 percent of terrestrial species.
Rising ocean acidity is now considered to be just as much of a formidable threat to the health of Earth’s environment as the atmospheric climate changes brought on by pumping out greenhouse gases. Scientists are now trying to understand what that means for the future survival of marine and terrestrial organisms.
In June, ScienceNOW reported that out of the 35 billion metric tons of carbon dioxide released annually through fossil fuel use, one-third of those emissions diffuse into the surface layer of the ocean. The effects those emissions will have on the biosphere is sobering, as rising ocean acidity will completely upset the balance of marine life in the world’s oceans and will subsequently affect humans and animals who benefit from the oceans’ food resources.
The damage to marine life is due in large part to the fact that higher acidity dissolves naturally-occurring calcium carbonate that many marine species–including plankton, sea urchins, shellfish and coral–use to construct their shells and external skeletons. Studies conducted off Arctic regions have shown that the combination of melting sea ice, atmospheric carbon dioxide and subsequently hotter, CO2-saturated surface waters has led to the undersaturation of calcium carbonate in ocean waters. The reduction in the amount of calcium carbonate in the ocean spells out disaster for the organisms that rely on those nutrients to build their protective shells and body structures.
The link between ocean acidity and calcium carbonate is a directly inverse relationship, which allows scientists to use the oceans’ calcium carbonate saturation levels to measure just how acidic the waters are. In a study by the University of Hawaii at Manoa published earlier this year, researchers calculated that the level of calcium carbonate saturation in the world’s oceans has fallen faster in the last 200 years than has been seen in the last 21,000 years–signaling an extraordinary rise in ocean acidity to levels higher than would ever occur naturally.
The authors of the study continued on to say that currently only 50 percent of the world’s ocean waters are saturated with enough calcium carbonate to support coral reef growth and maintenance, but by 2100, that proportion is expected to drop to a mere five percent, putting most of the world’s beautiful and diverse coral reef habitats in danger.
In the face of so much mounting and discouraging evidence that the oceans are on a trajectory toward irreparable marine life damage, a new study offers hope that certain species may be able to adapt quick enough to keep pace with the changing make-up of Earth’s waters.
In a study published last week in the journal Nature Climate Change, researchers from the ARC Center of Excellence for Coral Reef Studies found that baby clownfish (Amphiprion melanopus) are able to cope with increased acidity if their parents also lived in higher acidic water, a remarkable finding after a study conducted last year on another clownfish species (Amphiprion percula) suggested acidic waters reduced the fish’s sense of smell, making it likely for the fish to mistakenly swim toward predators.
But the new study will require further research to determine whether or not the adaptive abilities of the clownfish are also present in more environmentally-sensitive marine species.
While the news that at least some baby fish may be able to adapt to changes provides optimism, there is still much to learn about the process. It is unclear through what mechanism clownfish are able to pass along this trait to their offspring so quickly, evolutionarily speaking. Organisms capable of generation-to-generation adaptations could have an advantage in the coming decades, as anthropogenic emissions push Earth to non-natural extremes and place new stresses on the biosphere.
Sign up for our free email newsletter and receive the best stories from Smithsonian.com each week.
|
<urn:uuid:d5fc8f97-1ffe-4404-b9ee-d359c5162435>
|
http://blogs.smithsonianmag.com/science/2012/07/ocean-acidity-rivals-climate-change-as-environmental-threat/
| 0.9999
|
fineweb
|
Hot Weather Gets Scientists' Attention
Originally published on Wed July 11, 2012 5:30 am
RENEE MONTAGNE, HOST:
Across America people are sweltering through extreme heat this year, continuing a long-term trend of rising temperatures. Inevitably, many are wondering if the scorching heat is due to global warming. Scientists are expected to dig into the data and grapple with that in the months to come. They've already taken a stab at a possible connection with last year's extreme weather events, like the blistering drought in Texas. NPR's Richard Harris reports.
RICHARD HARRIS, BYLINE: Weather researchers from around the world are now taking stock of what happened in 2011. It was not the hottest year on record, but it was still in the top 15. Jessica Blunden from the National Climatic Data Center says 2011 had its own memorable characteristics.
JESSICA BLUNDEN: People may very well remember this year as a year of extreme weather and climate.
HARRIS: There were devastating droughts in Africa, Mexico, and Texas. In Thailand, massive flooding kept people's houses underwater for two months.
BLUNDEN: Here in the United States, we had one of our busiest and most destructive seasons on record in 2011. There were seven different tornado and severe weather outbreaks that each caused more than a billion dollars in damages.
HARRIS: So what's going on here? Federal climate scientist, Tom Karl, said one major feature of the global weather last year was a La Nina event. That's a period of cooler Pacific Ocean temperatures and it has effects around the globe, primarily in producing floods in some parts of the world and droughts in others.
TOM KARL: By no means did it explain all of the activity in 2011, but it certainly influenced a considerable part of the climate and weather.
HARRIS: Karl and Blunden are part of a huge multinational effort to sum up last year's weather and say what it all means. They provided an update by conference call. Clearly, long-term temperature trends are climbing as you'd expect as a result of global warming. Tom Peterson from the Federal Climate Data Center says the effort now is to look more closely at individual events.
TOM PETERSON: You've probably all heard the term you can't attribute any single event to global warming, and while that's true, the focus of the science now is evolving and moving onto how is the probability of event change.
HARRIS: And there researchers report some progress. For example, last year's record-breaking drought in Texas wasn't simply the result of La Nina. Peter Stott from the British Meteorology Office says today's much warmer planet played a huge role as well, according to the study the group released on Tuesday.
PETER STOTT: The result that they find is really quite striking, in that they find that such a heat wave is now about 20 times more likely during a La Nina year than it was during the 1960s.
HARRIS: A second study found that an extraordinary warm spell in London last November was 60 times more likely to occur on our warming planet than it would have been over the last 350 years. But that's not to say everything is related to climate change. There's no clear link between the spate of tornadoes and global warming, and devastating floods in Thailand last year, turn out to be the result of poor land use practices.
Even so, Kate Willett of the British Weather Service says there is a global trend consistent with what scientists expect climate change to bring.
KATE WILLETT: So, in simple terms, we can say that the dry regions are getting drier and the wet regions are getting wetter.
HARRIS: This year's extreme events are different from last year's, but they all fit into a coherent picture of global change. Richard Harris, NPR News. Transcript provided by NPR, Copyright NPR.
|
<urn:uuid:e8e46237-1e26-4326-b62c-a25477bd0d59>
|
http://kacu.org/post/hot-weather-gets-scientists-attention
| 1
|
fineweb
|
Climate change has already pushed the nation's wildlife into crisis, according to a report released Wednesday from the National Wildlife Federation (NWF), and further catastrophe, including widespread extinction, can only be curbed with swift action to curb the carbon pollution that has the planet sweltering.
Entitled Wildlife in a Warming World: Confronting the Climate Crisis, the report looks at 8 regions across the U.S. where "the underlying climatic conditions to which species have been accustomed for thousands of years," the report explains, have been upturned by human-caused climate change.
“Some of America’s most iconic species—from moose to sandhill cranes to sea turtles – are seeing their homes transformed by rapid climate change,” stated Dr. Amanda Staudt, climate scientist at the National Wildlife Federation.
Feb 15, 2013 Living on Earth: STARVING POLAR BEARS Polar Bears have long been the poster species for the problem of climate change. But a new paper in Conservation Letters argues that supplemental feeding may be necessary to prevent polar bear populations from going extinct. Polar bear expert Andrew Derocher from the University of Alberta joins Host Steve Curwood to discuss how we can save the largest bear on the planet.http://www.loe.org/shows/segments.html?programID=13-P13-00007&segmentID=2
|
<urn:uuid:8f73ff6f-28d6-4d1c-b460-f3a592885a8d>
|
http://www.scoop.it/t/why-has-putin-closed-the-archives-relating-to-the-holocaust-and-why-has-russian-joined-the-wto/p/3371169705/israel-shells-syria-and-gaza-sabbah-report
| 1
|
fineweb
|
Pricing Carbon Emissions
A bill before Congress may prove a costly way to reduce greenhouse gases.
- Friday, June 5, 2009
- By Kevin Bullis
Experts are applauding a sweeping energy bill currently before the United States Congress, saying that it could lead to significant cuts in greenhouse-gas emissions and improve the likelihood of a comprehensive international agreement to cut greenhouse gases. "It's real climate-change legislation that's being taken seriously," says Gilbert Metcalf, a professor of economics at Tufts University. But many warn that the bill's market-based mechanisms and more conventional regulations could make these emissions reductions more expensive than they need to be.
The bill, officially called the American Clean Energy and Security Act of 2009, is also referred to as the Waxman-Markey Bill, after its sponsors, Henry Waxman (D-Ca.) and Edward Markey (D-Mass.). The legislation would establish a cap and trade system to reduce greenhouse gases, an approach favored by most economists over conventional regulatory approaches because it provides a great deal of flexibility in how emissions targets are met. But it also contains mandates that could significantly reduce the cost savings that the cap and trade approach is supposed to provide.
In a cap and trade system, the government sets a cap on total emissions of greenhouse gases from various industrial and utility sources, including power plants burning fossil fuels to generate electricity. It then issues allowances to polluters allowing them to emit carbon dioxide and other greenhouse gases; total emissions are meant to stay under the cap. Over a period of time, the government gradually reduces the cap and the number of allowances until it reaches its target. If companies' emissions exceed their allowances, they must buy more.
Economists like the system because companies can choose to either lower their emissions, such as by investing in new technology, or buy more allowances from the government or from companies that don't need them--whichever makes the best economic sense. It is meant to create a carbon market, putting a value on emissions.
In the proposed energy bill, the government will set caps to reduce greenhouse-gas emissions by 17 percent by 2020 (compared with 2005 levels) and by 80 percent by 2050--targets chosen to prevent the worst effects of climate change. Setting caps will make electricity more expensive, as companies turn to cleaner technologies to meet ever lower caps or have to spend money to buy allowances from others with lower emissions. But the bill has some provisions for cushioning the blow, especially at first. For one thing, it gives away most of the allowances rather than charging for them, and it also requires that any profits gained from these free allowances be passed on to electricity customers. It also allows companies to buy "offsets" that permit them to pay to reduce emissions outside the United States.
If the program is designed right, there are fewer allowances than the total emissions when the program starts. At first, when the caps are relatively easy to meet, the prices for allowances on the carbon market will be low. But eventually, they will get higher as the allowances become scarcer. In an ideal world, companies will predict what the price of the allowances will be, and plan accordingly.
|
<urn:uuid:ecbdee27-d586-4d08-a03d-036829352851>
|
http://www.technologyreview.in/energy/22755/page1/
| 1
|
fineweb
|
Green building facts
- Buildings consume 32% of the world’s resources including 12% of fresh water and 40% of the world’s energy (7).
- In Australia commercial buildings produce almost 9% of our national Greenhouse gas emissions (8).
- To make way for the new Law School the Edgeworth David building and the Stephen Roberts lecture theatre were demolished in 2006. Over 80% of the materials from these buildings were recycled including the valuable copper from the roof of the lecture theatre
The new Law School Building
7. “Environmentally Sustainable Buildings: Challenges and Policies” OECD (2003)
8. “Australia State of the Environment Report” Department of Environment & Heritage (2001)
|
<urn:uuid:354e6914-456e-4541-8c57-6410f91b48cd>
|
http://sydney.edu.au/facilities/sustainable_campus/buildings/index.shtml
| 0.947
|
fineweb
|
The drought in Texas, during March, was the worst since 1895.
That is about the time my parents were born 120 years ago.
I never thought it could be worse than the drought of the 1950s, but it is. Drive out into grazing country where mesquite aren't too thick and all you can see is dry, cracked soil with an occasional fire ant or a gopher mound in the sandier soil.
Comparing the current drought with the seven-year drought in the 1950s, old-timers say the current drought sapped the soil of moisture faster than it did in the 1950s.
It just stopped raining last July, and pasture after pasture was hit by wildfires.
Right now, there is no potential to produce hay, harvest wheat or plant cotton or grain sorghum this May. Unless there is a week of rain fairly soon there is no hope for agriculture this year.
The Texas Ag Extension Service says that, despite a few recent showers in some areas, the cotton growing in Texas and Oklahoma is still in a drought. Any crop planted in southern Texas earlier in the year that got up out of the ground is now being sand blasted by hot, dry winds.
Wildfires have burned at least 1.5 million acres in the state since Jan. 1.
In addition to grazing losses, ranchers are facing rangeland stock water tanks that are dry or nearly dry. Streams are not flowing and lakes and big tanks are turning to deep mud.
|
<urn:uuid:2ea6b2e4-22cb-4c80-8462-ec4f7a51e6d6>
|
http://www.timesrecordnews.com/news/2011/may/01/drought-worst-since-1895/
| 0.9997
|
fineweb
|
Forest Ecosystems: Current Research
Regional Fire/Climate Relationships in the Pacific Northwest and Beyond
Fire exerts a strong influence on the structure and function of many terrestrial ecosystems. In forested ecosystems, the factors controlling the frequency, intensity, and size of fires are complex and operate at different spatial and temporal scales. Since climate strongly influences most of these factors (such as vegetation structure and fuel moisture), understanding the past and present relationships between climate and fire is essential to developing strategies for managing fire-prone ecosystems in an era of rapid climate change. The influence of climate change and climate variability on fire regimes and large fire events in the Pacific Northwest (PNW) and beyond is the focus of this project.
There is mounting evidence that a detectable relationship exists between extreme fire years in the West and Pacific Ocean circulation anomalies. The El Niño/Southern Oscillation (ENSO) influences fire in the Southwest (SW) and the Pacific Decadal Oscillation (PDO) appears to be related to fire in the PNW and Northern Rockies (NR). However, there are reasons to expect that processes driving fire in PNW, SW, and NR are not constant in their relative influence on fire through time or across space and that their differentiation is not stationary through time or across space.
- How regionally specific is the relationship between large fire events and precipitation/atmospheric anomalies associated with ENSO and PDO during the modern record?
- What do tree-ring and other paleo-records tell us about the temporal variability of the patterns of fire/climate relationships?
- How is climate change likely to influence climate/fire relationships given the demonstrated influences of climate variability?
Figure 1 A simple model of climate–fire-vegetation linkages. This project emphasizes the mechanisms and variability indicated by (1).
For publications on climate impacts on PNW forest ecosystems, please see CIG Publications.
Gedalof, Z. 2002. Links between Pacific basin climatic variability and natural systems of the Pacific Northwest. PhD dissertation, School of Forestry, University of Washington, Seattle.
Littell, J.S. 2002. Determinants of fire regime variability in lower elevation forests of the northern greater Yellowstone ecosystem. M.S. Thesis, Big Sky Institute/Department of Land Resources and Environmental Sciences, Montana State University, Bozeman.
Mote, P.W., W.S. Keeton, and J.F. Franklin. 1999. Decadal variations in forest fire activity in the Pacific Northwest. In Proceedings of the 11th Conference on Applied Climatology, pp. 155-156, Boston, Massachusetts: American Meteorological Society.
|
<urn:uuid:e4092633-013e-4995-97f5-6212c2dac106>
|
http://cses.washington.edu/cig/res/fe/fireclimate.shtml
| 1
|
fineweb
|
“A remote Indian village is responding to global warming-induced water shortages by creating large masses of ice, or “artificial glaciers,” to get through the dry spring months. (See a map of the region.)
Located on the western edge of the Tibetan plateau, the village of Skara in the Ladakh region of India is not a common tourist destination.
“It’s beautiful, but really remote and difficult to get to,” said Amy Higgins, a graduate student at the Yale School of Forestry & Environmental Studies who worked on the artificial glacier project.
“A lot of people, when I met them in Delhi and I said I was going to Ladakh, they looked at me like I was going to the moon,” said Higgins, who is also a National Geographic grantee.
People in Skara and surrounding villages survive by growing crops such as barley for their own consumption and for sale in neighboring towns. In the past, water for the crops came from meltwater originating in glaciers high in the Himalaya.”
Read more: National Geographic
|
<urn:uuid:5050ac83-4770-4e9c-9b44-38ba46d2466e>
|
http://peakwater.org/2012/02/artificial-glaciers-water-crops-in-indian-highlands/
| 0.9923
|
fineweb
|
Will the US Face Blackouts as Electricity Generation Suffers in Drought?
Well, its official – the U.S. government has acknowledged that the U.S. is in the worst drought in over 50 years, since December 1956, when about 58 percent of the contiguous U.S. was in moderate to extreme drought.
According to the National Oceanic and Atmospheric Administration National Climatic Data Center’s “State of the Climate Drought July 2012″ report, “Based on the Palmer Drought Index, severe to extreme drought affected about 38 percent of the contiguous United States as of the end of July 2012, an increase of about 5 percent from last month… About 57 percent of the contiguous U.S. fell in the moderate to extreme drought categories (based on the Palmer Drought Index) at the end of July… According to the weekly U.S. Drought Monitor, about 63 percent of the contiguous U.S. (about 53 percent of the U.S. including Alaska, Hawaii, and Puerto Rico) was classified as experiencing moderate to exceptional (D1-D4) drought at the end of July.”
Much business writing on the effects of the drought has focused on its agricultural aspects. To give but one example: the hottest, driest summer since 1936, scorching the Midwest, has diminished projected corn and soybean crop yields in the U.S. for a third straight year, to their lowest levels in nine years. Accordingly, the price of a bushel of corn has jumped 62 percent since 15 June, while soybeans gained 32 percent over the same period.
But as consumers fret about the inevitable rise in food prices to come, the drought is unveiling another, darker threat to the American lifestyle, as it is now threatening U.S. electricity supplies.
Why? Virtually all power plants, whether nuclear, coal or natural gas-fired, are completely dependent on water for cooling, and hydroelectric plants require continuous water flow to operate their turbines. Given the drought, many facilities are overheating, and utilities are shutting them down or running their plants at lower capacity. Few Americans know (or, up to this point, have cared) that the country's power plants account for about half of all the water used in the United States. For every gallon of residential water used in the average U.S. household, five times more is used to provide that home with electricity via hydropower turbines and fossil fuel power plants: roughly 40,000 gallons each month.
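As a sanity check on those figures, the implied split works out as follows. This is a minimal sketch; the 8,000-gallon result is derived from the article's numbers, not quoted in it.

```python
# Back-of-the-envelope check of the article's water-for-electricity figures.
# Both constants below come straight from the text; the household tap-water
# figure is inferred from them.

WATER_FOR_POWER_GAL_PER_MONTH = 40_000  # gallons used to generate one home's electricity
POWER_TO_TAP_RATIO = 5                  # water for electricity vs. direct residential use

# Implied direct residential water use per household:
residential_gal_per_month = WATER_FOR_POWER_GAL_PER_MONTH / POWER_TO_TAP_RATIO
print(f"Implied direct household water use: {residential_gal_per_month:,.0f} gal/month")
# -> Implied direct household water use: 8,000 gal/month
```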
Michael Webber, associate director of the Center for International Energy and Environmental Policy at the University of Texas at Austin, is under no such illusions, stating that the summer’s record high heat and drought have worked together to overtax the nation’s electrical grid, adding that families use more water to power their homes than they use from their tap. Webber said, “In summer you often get a double whammy. People want their air-conditioning and drought gets worse. You have more demand for electricity and less water available to produce it. That is what we are seeing in the Midwest right now, power plants on the edge.”
In July U.S. nuclear-power production hit its lowest seasonal levels in nine years as drought and heat forced nuclear power plants from Ohio to Vermont to slow output. Nuclear Regulatory Commission spokesman David McIntyre explained: "Heat is the main issue, because if the river is getting warmer the water going into the plant is warmer and makes it harder to cool. If the water gets too warm, you have to dial back production. That's for reactor safety, and also to regulate the temperature of discharge water, which affects aquatic life."
Nuclear is the thirstiest power source. According to the National Energy Technology Laboratory (NETL) in Morgantown, West Virginia, the average nuclear power plant, which generates 12.2 million megawatt-hours of electricity, requires far more water to cool its turbines than other kinds of plant: 2,725 liters of water per megawatt-hour. Coal and natural gas plants need, on average, only 1,890 and 719 liters respectively to produce the same amount of energy.
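Scaled to that 12.2 million megawatt-hours of annual generation, the per-megawatt-hour intensities imply the following totals. This is a quick sketch using only the NETL figures quoted above:

```python
# Annual cooling-water requirements implied by the NETL intensities quoted
# in the text, applied to the article's "average NPP" output figure.

ANNUAL_OUTPUT_MWH = 12_200_000  # 12.2 million megawatt-hours

liters_per_mwh = {"nuclear": 2725, "coal": 1890, "natural gas": 719}

for fuel, intensity in liters_per_mwh.items():
    total_liters = intensity * ANNUAL_OUTPUT_MWH
    print(f"{fuel:>11}: {total_liters / 1e9:5.1f} billion liters/year")
```

At these rates a nuclear plant needs roughly 33 billion liters of cooling water a year, versus about 23 billion for coal and under 9 billion for natural gas.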
The outlook offers little relief. The National Weather Service Climate Prediction Center, in its 16 August "U.S. Seasonal Drought Outlook," wrote, "The Drought Outlook valid through the end of November 2012 indicates drought conditions will remain essentially unchanged in large sections of the central Mississippi Valley, the central and southwestern Great Plains, most of the High Plains, the central Rockies, the Great Basin, and parts of the Far West…" The lack of rain and the incessant heat have also increased the need for irrigation water for farming, meaning increasing competition between the agricultural and power-generation sectors for the same shrinking water "pool."
But, every cloud has a silver lining. California’s Pacific Gas and Electric Co. utility, commonly known as PG&E, that provides natural gas and electricity to most of the northern two-thirds of California, from Bakersfield almost to the Oregon border, is on the case. PG&E has informed its customers that its “Diablo Canyon (nuclear) Power Plant, the largest source of generation in the utility’s service area, is cooled by ocean water, not by rivers that could dry up.”
Never mind the fact that by the time the Diablo Canyon NPP was completed in 1973, engineers discovered that it was several miles away from the Hosgri seismic fault, which had a 7.1 magnitude earthquake on 4 November 1927.
But ocean water as a coolant is not necessarily the answer either.
On 12 August Dominion Resources’ Millstone NPP, situated on Connecticut’s Niantic Bay on Long Island Sound, was forced to shut down one of two reactor units because seawater used to cool down the plant was too warm, averaging 1.7 degrees above the NRC limit of 75 degrees Fahrenheit. The Millstone NPP, which provides half of all power used in Connecticut and 12 percent in New England, was only restarted twelve days later.
The federal government is hardly known for its scaremongering tactics, but it would seem that Mother Nature is forcing Americans to belatedly consider making some lifestyle changes, as the choice seems to be devolving into energy conservation, turning down the air conditioner and digging deeper into the wallet for food costs.
It might also be time for serious national discussion about renewable energy, including wind and solar.
If the sun stops shining, all bets are off.
By John C.K. Daly of Oilprice.com
Sea ice is frozen seawater that floats on the ocean surface. Blanketing millions of square kilometers, sea ice forms and melts with the polar seasons, affecting both human activity and biological habitat. In the Arctic, some sea ice persists year after year, whereas almost all Southern Ocean or Antarctic sea ice is "seasonal ice," meaning it melts away and reforms annually. While both Arctic and Antarctic ice are of vital importance to the marine mammals and birds for which they are habitats, sea ice in the Arctic appears to play a more crucial role in regulating climate.
Because they are composed of ice originating from glaciers, icebergs are not considered sea ice. Most of the icebergs infesting North Atlantic shipping lanes originate from Greenland glaciers.
Global Sea Ice Extent and Concentration: What sensors on satellites are telling us about sea ice
Sea ice regulates exchanges of heat, moisture and salinity in the polar oceans. It insulates the relatively warm ocean water from the cold polar atmosphere except where cracks, or leads, in the ice allow exchange of heat and water vapor from ocean to atmosphere in winter. The number of leads determines where and how much heat and water are lost to the atmosphere, which may affect local cloud cover and precipitation.
The seasonal sea ice cycle affects both human activities and biological habitats. For example, companies shipping raw materials such as oil or coal out of the Arctic must work quickly during periods of low ice concentration, navigating their ships towards openings in the ice and away from treacherous multi-year ice that has accumulated over several years. Many arctic mammals, such as polar bears, seals, and walruses, depend on the sea ice for their habitat. These species hunt, feed, and breed on the ice. Studies of polar bear populations indicate that declining sea ice is likely to decrease polar bear numbers, perhaps substantially (Stirling and Parkinson 2006).
Ice thickness, its spatial extent, and the fraction of open water within the ice pack can vary rapidly and profoundly in response to weather and climate. Sea ice typically covers about 14 to 16 million square kilometers in late winter in the Arctic and 17 to 20 million square kilometers in the Antarctic Southern Ocean. The seasonal decrease is much larger in the Antarctic, with only about three to four million square kilometers remaining at summer's end, compared to approximately seven to nine million square kilometers in the Arctic. These maps provide examples of late winter and late summer ice cover in the two hemispheres.
Monitoring sea ice
Passive microwave satellite data represent the best method to monitor sea ice because of the ability to show data through most clouds and during darkness. Passive microwave data allow scientists to monitor the inter-annual variations and trends in sea ice cover. Observations of polar oceans derived from these instruments are essential for tracking the ice edge, estimating sea ice concentrations, and classifying sea ice types. In addition to the practical use of this information for shipping and transport, these data add to the meteorological knowledge base required for better understanding climate.
Decline in Arctic sea ice extent
Passive microwave satellite data reveal that, since 1979, winter Arctic ice extent has decreased about 3.6 percent per decade (Meier et al. 2006). Antarctic ice extent is increasing (Cavalieri et al. 2003), but the trend is small.
Satellite data from the SMMR and SSM/I instruments have been combined with earlier observations from ice charts and other sources to yield a time series of Arctic ice extent from the early 1900s onward. While the pre-satellite records are not as reliable, their trends are in good general agreement with the satellite record and indicate that Arctic sea ice extent has been declining since at least the early 1950s.
In recent years, satellite data have indicated an even more dramatic reduction in regional ice cover. In September 2002, sea ice in the Arctic reached a record minimum (Serreze et al. 2003), 4 percent lower than any previous September since 1978, and 14 percent lower than the 1979-2000 mean. In the past, a low ice year would be followed by a rebound to near-normal conditions, but 2002 was followed by two more low-ice years, both of which almost matched the 2002 record. Taking these three years into account, the September ice extent trend for 1979-2004 declined by 7.7 percent per decade (Stroeve et al. 2005). The year 2005 set a new record, dropping the estimated decline in end-of-summer Arctic sea ice to approximately 8 percent per decade. Although sea ice did not set a new record low in 2006, it did fall below normal for the fifth consecutive year. In 2007, sea ice broke all prior satellite records, reaching a record low a month before the end of melt season. Through 2007, the September decline trend is now over 10 percent per decade. (For current sea ice trends, visit NSIDC's Sea Ice Index Cryospheric Climate Indicators.)
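The "percent per decade" figures in this section come from fitting a trend line to the September extent series and expressing the slope relative to a reference mean. A minimal sketch of that calculation, using made-up extent values rather than actual NSIDC data:

```python
# Sketch of a "percent per decade" trend: ordinary least-squares slope of
# extent vs. year, scaled to a decade and normalized by a reference mean.
# The extent values below are illustrative placeholders, not NSIDC data.

def trend_percent_per_decade(years, extents, ref_mean):
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(extents) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, extents))
             / sum((x - mean_x) ** 2 for x in years))   # extent units per year
    return slope * 10 / ref_mean * 100                  # percent of reference mean per decade

years = [1979, 1989, 1999, 2007]
extents = [7.2, 6.9, 6.2, 4.3]   # million km^2, illustrative only
print(f"Trend: {trend_percent_per_decade(years, extents, ref_mean=7.0):.1f} % per decade")
```

With these placeholder values the fit yields a decline of roughly 14 percent per decade, steeper than the satellite-era trends quoted above, as the made-up series deliberately mimics the accelerating post-2002 losses.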
Combined with record low summertime extent, Arctic sea ice exhibited a new pattern of poor winter recovery. In the past, a low-ice year would be followed by a rebound to near-normal conditions, but 2002 was followed by two more low-ice years, both of which almost matched the 2002 record (see Arctic Sea Ice Decline Continues). Although wintertime recovery of Arctic sea ice improved somewhat after 2006, wintertime extents have remained well below the long-term average.
Decline in Arctic Sea Ice Thickness
Sea ice thickness has likewise shown substantial decline in recent decades (Rothrock et al. 1999). Using data from submarine cruises, Rothrock and collaborators determined that the mean ice draft at the end of the melt season in the Arctic has decreased by about 1.3 meters between the 1950s and the 1990s.
According to estimates based on measurements taken by NASA's ICESat laser altimeter, first-year ice that formed after the autumn of 2007 had a mean thickness of 1.6 meters. The ice formed relatively late in the autumn of 2007, and NSIDC researchers had anticipated that this first-year ice would be thinner, but it nearly equaled the thickness of the 2006 and 2007 first-year ice. Snow accumulation on sea ice helps insulate the ice from frigid air overhead, so sparse snowfall during the winter of 2007-2008 might actually have accelerated the sea ice's growth.
Greenhouse gases emitted through human activities and the resulting increase in global mean temperatures are the most likely underlying cause of the sea ice decline, but the direct cause is a complicated combination of factors resulting from the warming, and from climate variability. The Arctic Oscillation (AO) is a see-saw pattern of alternating atmospheric pressure at polar and mid-latitudes. The positive phase produces a strong polar vortex, with the mid-latitude jet stream shifted northward. The negative phase produces the opposite conditions. From the 1950s to the 1980s, the AO flipped between positive and negative phases, but it entered a strong positive pattern between 1989 and 1995. So the acceleration in the sea ice decline since the mid 1990s may have been partly triggered by the strongly positive AO mode during the preceding years (Rigor et al. 2002 and Rigor and Wallace 2004) that flushed older, thicker ice out of the Arctic, but other factors also played a role.
Since the mid-1990s, the AO has largely been a neutral or negative phase, and the late 1990s and early 2000s brought a weakening of the Beaufort Gyre. However, the longevity of ice in the gyre began to change as a result of warming along the Alaskan and Siberian coasts. In the past, sea ice in this gyre could remain in the Arctic for many years, thickening over time. Beginning in the late 1990s, sea ice began melting in the southern arm of the gyre, thanks to warmer air temperatures and more extensive summer melt north of Alaska and Siberia. Moreover, ice movement out of the Arctic through Fram Strait continued at a high rate despite the change in the AO. Thus warming conditions and wind patterns have been the main drivers of the steeper decline since the late 1990s. Sea ice may not be able to recover under the current persistently warm conditions, and a tipping point may have been passed where the Arctic will eventually be ice-free during at least part of the summer (Lindsay and Zhang 2005).
Examination of the long-term satellite record dating back to 1979 and earlier records dating back to the 1950s indicate that spring melt seasons have started earlier and continued for a longer period throughout the year (Serreze et al. 2007). Even more disquieting, comparison of actual Arctic sea ice decline to IPCC AR4 projections show that observed ice loss is faster than any of the IPCC AR4 models have predicted (Stroeve et al. 2007).
Disclaimer: This article is taken wholly from, or contains information that was originally published by, the National Snow and Ice Data Center. Topic editors and authors for the Encyclopedia of Earth may have edited its content or added new information. The use of information from the National Snow and Ice Data Center should not be construed as support for or endorsement by that organization for any new information added by EoE personnel, or for any editing of the original content.
Unfortunately, the modern buildings we live and work in rival cars and factories as sources of harm to the environment, contributing to deforestation, global warming, overuse of water and energy and carbon dioxide emissions.
Sustainable building refers to those buildings that are built to have the least impact on the natural environment, both in terms of the building itself, its immediate surroundings and the broader global setting. To construct in a sustainable way, some basic rules need to be followed: (a) minimization of non-renewable resource consumption; (b) enhancement of the natural environment; and (c) elimination or minimization of toxic emissions.
Almost every step of the green building process is heavily focused on how building elements fit together to optimize efficiency and sustainability.
Sustainable development marries two important themes:
1. Environmental protection does not preclude economic development.
2. Economic development must be ecologically viable now and in the long term.
“Sustainable design” involves the planning and development of projects in a manner that minimizes impact on natural resources, such as water and energy. There are many aspects to the sustainable process, one of which involves “LEED” principles – Leadership in Energy and Environmental Design, with standards for selecting materials and designing facilities established by the U.S. Green Building Council. Zurn strongly encourages organizations to consider including cost-effective and environmentally friendly practices in the design, construction and retrofit of buildings and facilities. In this way, your buildings and facilities not only exemplify your care for the environment and the well-being of the community that you serve, but they also decrease facility operating and maintenance costs.
There are nearly two million known species on the planet. But many of those won't be around much longer; one out of every eight known bird species, one in four mammal species, and one in three amphibian species are at risk for extinction, according to the World Conservation Union (IUCN), which maintains the Red List, a catalog of the world's species classified according to their risk of extinction.
"It's supposed to inform conservation practice, to be a wake-up call for the extinctions that are happening," says Caroline Pollock, a program officer with the Red List unit. Animals that are classified as "critically endangered" are at the highest risk--their numbers in the wild may be extraordinarily low or their territories incredibly small. "It is possible to bring them back," Pollock says, "but it is quite work-intensive and financially expensive." Here, a look at five species on the brink.
Native to Spain and Portugal, there are fewer than 250 of these felines left in the wild. Habitat destruction has been a major cause of its decline as agriculture spreads through its homeland. Additionally, disease has claimed a large percentage of the region's rabbits, one of the lynx's primary food sources. Intensive captive breeding programs are currently underway to help save the lynx, Pollock says. If they do disappear, the lynx will be the first wild cat to go extinct in more than 2,000 years.
The wild population of these frogs has declined more than 80 percent in the last decade. The plummeting numbers of the frogs, which are endemic to Panama, is largely a result of chytridiomycosis, an infectious fungal disease that seems to be causing mass amphibian die-offs. The disease is still spreading, and deforestation is adding to the pressures faced by the frogs. Though there are captive-breeding programs in place for these amphibians, they will not be released into the wild until conditions improve.
Fewer than 100 of these birds, which are confined to one small island in Cape Verde, remain in the wild. The birds have been threatened by drought and increasing desertification on the island, conditions that may worsen as a result of global climate change. Because they build their nests on the ground, they also face risks from cats, dogs, and rats that have been introduced to the island.
Only 34 of these trees, native to Mexico, remain. The plants have a low rate of pollination--and don't reach maturity until they are approximately 25 years old--and are also profoundly threatened by agriculture. One tree was cut down in 2006 to expand farmland, and insecticides decrease the number of pollinators available to help the trees spread. Human-caused fires have also destroyed or damaged a number of these plants.
It could already be too late for the Yangtze River dolphin, or baiji. There has not been a documented sighting of these cetaceans, which lived in China's Yangtze River and nearby lakes, since 2002. A search for the dolphin--and the signature sounds that they make--was conducted in late 2006 but turned up no evidence of the mammals. However, further surveys are still needed to determine whether the dolphins truly have disappeared forever. The baiji's population decline is due, in large part, to the development of Chinese waterways and the expansion of commercial fishing.
Read more on helping endangered species by breeding captive animals in DISCOVER's Recall of the Wild
Since 1993, RAN’s Protect-an-Acre program (PAA) has distributed more than one million dollars in grants to more than 150 frontline communities, Indigenous-led organizations, and allies, helping their efforts to secure protection for millions of acres of traditional territory in forests around the world.
Rainforest Action Network believes that Indigenous peoples are the best stewards of the world’s rainforests and that frontline communities organizing against the extraction and burning of dirty fossil fuels deserve the strongest support we can offer. RAN established the Protect-an-Acre program to protect the world’s forests and the rights of their inhabitants by providing financial aid to traditionally under-funded organizations and communities in forest regions.
Indigenous and frontline communities suffer disproportionate impacts to their health, livelihood and culture from extractive industry mega-projects and the effects of global climate change. That’s why Protect-an-Acre provides small grants to community-based organizations, Indigenous federations and small NGOs that are fighting to protect millions of acres of forest and keep millions of tons of CO2 in the ground.
Our grants support organizations and communities that are working to regain control of and sustainably manage their traditional territories through land title initiatives, community education, development of sustainable economic alternatives, and grassroots resistance to destructive industrial activities.
PAA is an alternative to “buy-an-acre” programs that seek to provide rainforest protection by buying tracts of land, but which often fail to address the needs or rights of local Indigenous peoples. Uninhabited forest areas often go unprotected, even if purchased through a buy-an-acre program. It is not uncommon for loggers, oil and gas companies, cattle ranchers, and miners to illegally extract resources from so-called “protected” areas.
Traditional forest communities are often the best stewards of the land because their way of life depends upon the health of their environment. A number of recent studies add to the growing body of evidence that Indigenous peoples are better protectors of their forests than governments or industry.
Based on the success of Protect-an-Acre, RAN launched The Climate Action Fund (CAF) in 2009 as a way to direct further resources and support to frontline communities and Indigenous peoples challenging the fossil fuel industry.
Additionally, RAN has been a Global Advisor to Global Greengrants Fund (GGF) since 1995, identifying recipients for small grants to mobilize resources for global environmental sustainability and social justice using the same priority and criteria as we use for PAA and CAF.
Through these three programs each year we support grassroots projects that result in at least:
Walking and cycling have long been considered the most environmentally sound methods of getting around. They still are but some environmentalists have argued that food production has become so fossil-fuel intensive that driving could be considered greener than walking (though the analysis has been debunked as flawed).
What of other, more obviously polluting, modes of transport? The data below gives an idea of how your carbon footprint might grow depending on how you make a journey. If you were to take an average domestic flight rather than a high-speed electric train, you'd be personally responsible for 29 times as much carbon dioxide.
The data also highlights how the UK government's plans to electrify parts of the rail network could cut emissions. Diesel trains are responsible for more greenhouse gases than electric trains, even taking into account Britain's carbon-heavy electricity production.
On the roads, next-generation hybrid and electric vehicles can help those of us behind the wheel to be that little bit greener. However, no journey is completely carbon free.
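The "29 times" figure above is simply a ratio of per-passenger emission factors. The comparison can be sketched as follows; the gram-per-passenger-kilometre factors below are illustrative stand-ins (chosen so that flight divided by high-speed rail gives 29), not the values from the underlying dataset:

```python
# Per-journey CO2 comparison in the spirit of the article's flight-vs-train
# ratio. Emission factors are assumed illustrative values, not real data;
# only the ratio logic is the point.

DISTANCE_KM = 500  # a typical domestic trip

g_co2_per_passenger_km = {
    "domestic flight": 174.0,
    "diesel train": 60.0,
    "high-speed electric train": 6.0,
}

footprints_kg = {mode: f * DISTANCE_KM / 1000 for mode, f in g_co2_per_passenger_km.items()}
for mode, kg in footprints_kg.items():
    print(f"{mode:>25}: {kg:6.1f} kg CO2")

ratio = footprints_kg["domestic flight"] / footprints_kg["high-speed electric train"]
print(f"flight / high-speed rail: {ratio:.0f}x")  # 174 / 6 = 29x
```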
But Keller and her colleagues say their research proves otherwise.
Keller has studied the Chicxulub site and other impact-crater sites around the world for the past decade. She believes that the asteroid impact behind Chicxulub coincided with a "time of massive volcanism, which led to greenhouse warming."
Keller says those three events (the Chicxulub asteroid impact, volcanism, and climate change) "led to high biotic stress and caused the decline of many tropical species populations," but not mass extinctions. That die-off didn't occur until later. However, Keller does believe that the initial confluence of volcanic activity, global warming, and the Chicxulub asteroid impact ultimately contributed to the mass extinction.
Key to Keller's assertions is a 20-inch-thick (50-centimeter-thick) layer of limestone found between the K-T boundary and the impact breccia, or molten lava and rocky debris, laid down when the Chicxulub asteroid collided with Earth.
Keller and her colleagues believe that the thickness of the limestone layer (limestone is a type of sedimentary rock characteristically formed under large bodies of water like oceans, seas, and lakes) indicates that it accumulated in the crater over some 300,000 years after the impact. As proof, Keller points to fossils of microscopic organisms called foraminifera and fossil burrows present in the limestone layer.
According to Keller, those fossils indicate the sediment was deposited after the asteroid impact but before the period of mass extinction that marked the end of the Cretaceous.
Many other scientists disagree with that interpretation, however. They say the layer of fossil-rich limestone was deposited quickly as backwash and infill caused by a huge tsunami that followed the Chicxulub asteroid's impact with Earth. The layer, they say, did not take 300,000 years to accumulate.
In her defense, Keller says the quick-accumulation theory is unsupported by evidence that would have been found during her analysis of core samples gathered at Chicxulub and 45 localities in northeast Mexico.
But Alan Hildebrand, a proponent of the quick-accumulation theory, says the burrows were "made by organisms digging after the fireball layer was deposited."
Thomas R. Holtz, Jr., a vertebrate paleontologist at the University of Maryland in College Park, supports the view that the limestone was quickly laid down as crater infill. He said he is not surprised that Cretaceous fossils were found in the limestone layer.
"If an asteroid clobbered the Eastern seaboard of the U.S. today, I would expect that most of the infilling would be Chevys and Hondas and shopping malls and houses and cows and McDonald's burger wrappers," Holtz said. "Only a tiny bit might be mastodons and Clovis points and Miocene whales." In other words, the crater would quickly fill with objects common on Earth at the time of impact.
So where do researchers in the Keller camp look next for the possible K-T crater? Keller says she's unsure, although "some scientists have suggested it could be a structure called Shiva, in India. We have no convincing evidence so far that this is the case."
SOURCES AND RELATED WEB SITES
1. Reduce our personal carbon footprint by 20 percent in the next year.
2. Stop subsidizing fossil fuels.
3. Mitigate politics and polarization.+
4. Shift towards more vegetarian diets.
5. Scientists better communicate the scientific facts underlying climate change.+
6. Scientists and engineers develop cheap alternative energy sources to reduce dependence on fossil fuels.+
7. Reduce waste water treatment costs.*
8. Reduce costs to absorb CO2 from industrial activities.*
9. Manage the timing, magnitude, and speed of reservoir drawdowns in order to mitigate methane releases to the atmosphere.^
+ http://www.newswise.com/articles/view/591970/ ...
* http://www.typicallyspanish.com/news/publish/ ...
This page lists climate science and climate impact claims that have either not been proven, or have had the claim modified, moved, or expanded to protect the claimant from having to admit the original claim was wrong.
This will always be a work in progress. New items will be added as they are examined and will include:
- The claim itself – what was stated as factual or predicted? A clear unambiguous statement, such as “50 million climate refugees by 2010”
- Proof of the original claim – website, documents, photos, audio, video that clearly and unambiguously show the claim being made sometime in the past.
- A test of the claim, and the results – website, documents, photos, audio, video that clearly and unambiguously show the claim not coming true or not meeting the claim.
- Proof of change in the claim (if applicable) – often, when the claim fails to materialize, goalposts get moved, such as we saw with the “50 million climate refugees” story that was originally set with a due date of 2010, is now set for the year 2020.
The Claim: 50 million climate refugees will be produced by climate change by the year 2010. Especially hard hit will be river delta areas, and low lying islands in the Caribbean and Pacific. The UN 62nd General Assembly in July 2008 said: …it had been estimated that there would be between 50 million and 200 million environmental migrants by 2010.
The Test: Did population go down in these areas during that period, indicating climate refugees were on the move? The answer, no.
The Proof: Population actually gained in some Caribbean islands for which 2010 census figures were available. Then, when challenged on these figures, the UN tried to hide the original claim from view. See: The UN “disappears” 50 million climate refugees, then botches the disappearing attempt
The Change in claim: Now it is claimed that it will be 10 years into the future, and there will be 50 million refugees by the year 2020.
The Coca-Cola System Announces New Global Targets for Water Conservation and Climate Protection in Partnership With WWF
The Coca-Cola Company, in partnership with World Wildlife Fund (WWF), today announced ambitious new targets to improve water efficiency and reduce carbon emissions within its system-wide operations, while promoting sustainable agricultural practices and helping to conserve the world’s most important freshwater basins.
“Our sustainability as a business demands a relentless focus on efficiency in our use of natural resources. These performance targets are one way we are engaging to improve our management of water and energy,” said Muhtar Kent, president and CEO of The Coca-Cola Company.
“In this resource constrained world, successful businesses will find ways to achieve growth while using fewer resources,” said Carter Roberts, president and CEO of WWF-US. “The Coca-Cola Company’s commitment to conservation responds to the imperative to solve the global water and climate crisis.”
The partnership, announced by WWF and The Coca-Cola Company in 2007 with $20 million in funding, has now been extended an additional two years (through 2012) with the Company providing $3.75 million in new funding.
The Coca-Cola Company also joined WWF’s Climate Savers program, in which leading corporations from around the world work with WWF to dramatically reduce their greenhouse gas emissions. By 2010, Climate Savers companies will collectively cut carbon emissions by 14 million tons annually – the equivalent of taking more than 3 million cars off the road each year.
Water Efficiency — Saving 50 billion liters in 2012
The Coca-Cola system will improve its water efficiency 20 percent by 2012, compared to a baseline year 2004. While water use is expected to increase as the business grows, this water efficiency target will eliminate approximately 50 billion liters of that increase in 2012.
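The arithmetic behind the 50-billion-litre figure can be sketched as follows. The baseline water-use ratio and projected 2012 sales volume below are illustrative round numbers chosen to reproduce the headline figure, not values taken from the release.

```python
# Illustrative check of the water-efficiency target.
# baseline_ratio and volume_2012 are assumed values, not from the release.
baseline_ratio = 2.5   # litres of water per litre of beverage in 2004 (assumed)
volume_2012 = 100e9    # projected beverage volume in 2012, litres (assumed)

improved_ratio = baseline_ratio * (1 - 0.20)   # 20% efficiency gain

water_without_target = volume_2012 * baseline_ratio
water_with_target = volume_2012 * improved_ratio
saved = water_without_target - water_with_target

print(f"Water avoided in 2012: {saved / 1e9:.0f} billion litres")
```

With these assumed inputs, a 20 percent efficiency gain on roughly 250 billion litres of baseline water use yields the quoted saving of about 50 billion litres.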
To support this efficiency target, The Coca-Cola Company and WWF have developed a Water Efficiency Toolkit to help reduce water consumption within bottling plants. This software-based instruction manual has been distributed to managers and operators throughout the Coca-Cola system, providing strategies to shrink the water footprint of their operations.
Climate Protection — Preventing 2 million tons of CO2 emissions
The Company has set two emissions reduction targets: 1) grow the business, not the carbon, system-wide; and 2) achieve a 5 percent absolute reduction in Annex 1 (developed) countries. The emissions targets apply to manufacturing operations in the year 2015 compared to a baseline year of 2004.
The Coca-Cola Company and its bottlers anticipate substantial volume growth globally during this period, so growing the business without growing the carbon is a significant commitment. Without intervention, emissions would grow in proportion to volume and reach 7.3 million metric tons in 2015. Thus, the global commitment will prevent the release of more than 2 million metric tons of CO2 in 2015 – the equivalent of planting 600,000 acres of trees.
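The 2-million-tonne figure follows from capping 2015 emissions at the 2004 level while volume grows. The 2004 baseline is not stated in the release; the value below is inferred from the business-as-usual figure and the "more than 2 million" claim.

```python
# Business-as-usual 2015 emissions, from the text (million metric tons CO2).
bau_2015 = 7.3
# "Grow the business, not the carbon" caps 2015 emissions at the 2004 level.
# ~5.3 Mt is an inferred baseline, not a figure from the release.
baseline_2004 = 5.3

prevented = bau_2015 - baseline_2004
print(f"Emissions prevented in 2015: {prevented:.1f} million metric tons CO2")
```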
Supply Chain Sustainability
The Coca-Cola Company also will work with WWF to promote more sustainable agricultural practices in an effort to reduce the impact of its supply chain on water resources. This work will initially focus on sugarcane production. The Coca-Cola Company and WWF are working with the Better Sugarcane Initiative to establish standards, evaluate suppliers and set goals for the purchase of sugar. The Coca-Cola Company will identify two additional commodities on which to work in 2009.
The Coca-Cola system and WWF are working together to conserve some of the world’s most important freshwater resources, including the Yangtze, Mekong, Danube, Rio Grande/Rio Bravo, Lakes Niassa and Chiuta, the Mesoamerican Reef catchments, and the rivers and streams in the southeastern region of the United States. More than a dozen production plants and/or bottlers in the areas surrounding these rivers are developing and implementing water stewardship plans to serve as models throughout the Coca-Cola system.
“Water and energy conservation are areas where we can truly make a difference. Last year, we set a goal to return to communities and to nature an amount of water equal to what we use in our beverages and their production. These targets support our work to achieve that goal,” said Kent. “The expansion of our partnership with WWF demonstrates our shared dedication to achieving large-scale results, and a grounded understanding that collaboration is key if we are to help address the world’s water challenges.”
To learn more about the partnership, please visit www.thecoca-colacompany.com or www.worldwildlife.org.
About The Coca-Cola Company
The Coca-Cola Company is the world’s largest beverage company, refreshing consumers with more than 450 sparkling and still brands. Along with Coca-Cola, recognized as the world’s most valuable brand, the Company’s portfolio includes 12 other billion dollar brands, including Diet Coke, Fanta, Sprite, Coca-Cola Zero, vitaminwater, Powerade, Minute Maid and Georgia Coffee. Globally, we are the No.1 provider of sparkling beverages, juices and juice drinks and ready-to-drink teas and coffees. Through the world’s largest beverage distribution system, consumers in more than 200 countries enjoy the Company’s beverages at a rate of 1.5 billion servings a day. With an enduring commitment to building sustainable communities, our Company is focused on initiatives that protect the environment, conserve resources and enhance the economic development of the communities where we operate. For more information about our Company, please visit our Web site at www.thecoca-colacompany.com.
About World Wildlife Fund
WWF is the world’s largest conservation organization, working in 100 countries for nearly half a century. With the support of almost 5 million members worldwide, WWF is dedicated to delivering science-based solutions to preserve the diversity and abundance of life on Earth, stop the degradation of the environment and combat climate change. Visit www.worldwildlife.org to learn more.
The factors behind the calving process were not well understood
US researchers have come up with a way to predict the rate at which ice shelves break apart into icebergs.
These sometimes spectacular occurrences, called calving events, are a key step in the process by which climate change drives sea level rise.
Computer models that simulate how ice sheets might behave in a warmer world do not describe the calving process in much detail, Science journal reports.
Until now, the factors controlling this process have not been well understood.
Ice sheets, such as those in Antarctica and Greenland, spread under their own weight and flow off land over the ocean water.
Ice shelves are the thick, floating lips of ice sheets or glaciers that extend out past the coastline.
Timelapse footage of an iceberg breaking away from a glacier in July 2008. The event took approximately 15 minutes (Video: Fahnestock/UNH)
The Ross Ice Shelf in Antarctica floats for as much as 800km (500 miles) over the ocean before the edges begin to break and create icebergs. But other ice shelves may only edge over the water for a few kilometres.
A team led by Richard Alley at Pennsylvania State University, US, analysed factors such as thickness, calving rate and strain rate for 20 different ice shelves.
"The problem of when things break is a really hard problem because there is so much variability," said Professor Alley.
"Anyone who has dropped a coffee cup knows this. Sometimes the coffee cup breaks and sometimes it bounces."
The team's results show that the calving rate of an ice shelf is primarily determined by the rate at which the ice shelf is spreading away from the continent.
The researchers were also able to show that narrower shelves should calve more slowly than wider ones.
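The two findings can be illustrated with a toy scaling in which calving rate grows with both the along-flow spreading rate and the shelf width. The linear form and the constant below are assumptions for illustration only, not the fitted law from the Alley et al. study.

```python
def calving_rate(spreading_rate, width, k=1.0):
    """Toy calving-rate scaling: shelves that spread faster, and wider
    shelves, calve faster. k is an arbitrary constant and the linear
    form is an assumption, not the published fit."""
    return k * spreading_rate * width

# A wider shelf calves faster than a narrower one at the same spreading rate,
# i.e. narrower shelves calve more slowly, as the study found.
wide = calving_rate(spreading_rate=0.005, width=100e3)
narrow = calving_rate(spreading_rate=0.005, width=20e3)
print(wide > narrow)
```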
Ice cracking off into the ocean from Antarctica and Greenland could play a significant role in future sea level rise.
Floating ice that melts does not of itself contribute to the height of waters (because it has already displaced its volume), but the shelf from which it comes acts as a brake to the land-ice behind. Removal of the shelf will allow glaciers heading to the ocean to accelerate - a phenomenon documented when the Larsen B shelf on the Antarctic Peninsula shattered in spectacular style in 2002. This would speed sea level rise.
The UN Intergovernmental Panel on Climate Change in its 2007 assessment forecast that seas could rise by 18 to 59 cm (7-23 in) this century. However, in giving those figures, it conceded that ice behaviour was poorly understood.
Facing an uncertain future
How forests and people can adapt to climate change. Center for International Forestry Research (CIFOR), Bogor, Indonesia.
The most prominent international responses to climate change focus on mitigation (reducing the accumulation of greenhouse gases) rather than adaptation (reducing the vulnerability of society and ecosystems). However, with climate change now inevitable, adaptation is gaining importance in the policy arena, and is an integral part of ongoing negotiations towards an international framework. This report presents the case for adaptation for tropical forests (reducing the impacts of climate change on forests and their ecosystem services) and tropical forests for adaptation (using forests to help local people and society in general to adapt to inevitable changes). Policies in the forest, climate change and other sectors need to address these issues and be integrated with each other—such a cross-sectoral approach is essential if the benefits derived in one area are not to be lost or counteracted in another. Moreover, the institutions involved in policy development and implementation need themselves to be flexible and able to learn in the context of dynamic human and environmental systems. And all this needs to be done at all levels from the local community to the national government and international institutions. The report includes an appendix covering climate scenarios, concepts, and international policies and funds.
Karuk Tribe: Learning from the First Californians for the Next California
Editor's Note: This is part of series, Facing the Climate Gap, which looks at grassroots efforts in California low-income communities of color to address climate change and promote climate justice.
This article was published in collaboration with GlobalPossibilities.org.
The three sovereign entities in the United States are the federal government, the states and indigenous tribes, but according to Bill Tripp, a member of the Karuk Tribe in Northern California, many people are unaware of both the sovereign nature of tribes and the wisdom they possess when it comes to issues of climate change and natural resource management.
“A lot of people don’t realize that tribes even exist in California, but we are stakeholders too, with the rights of indigenous peoples,” says Tripp.
Tripp is an Eco-Cultural Restoration specialist at the Karuk Tribe Department of Natural Resources. In 2010, the tribe drafted an Eco-Cultural Resources Management Plan, which aims to manage and restore “balanced ecological processes utilizing Traditional Ecological Knowledge supported by Western Science.” The plan addresses environmental issues that affect the health and culture of the Karuk tribe and outlines ways in which tribal practices can contribute to mitigating the effects of climate change.
Before climate change became a hot topic in the media, many indigenous and agrarian communities, because of their dependence upon and close relationship to the land, began to notice troubling shifts in the environment such as intense drought, frequent wildfires, scarcer fish flows and erratic rainfall.
There are over 100 government-recognized tribes in California, representing more than 700,000 people. The Karuk is the second largest Native American tribe in California and has over 3,200 members. Their tribal lands include over 1.48 million acres within and around the Klamath and Six Rivers National Forests in Northwest California.
Tribes like the Karuk are among the hardest hit by the effects of climate change, despite their traditionally low-carbon lifestyles. The Karuk, in particular have experienced dramatic environmental changes in their forestlands and fisheries as a result of both climate change and misguided Federal and regional policies.
The Karuk have long depended upon the forest to support their livelihood, cultural practices and nourishment. While wildfires have always been a natural aspect of the landscape, recent studies have shown that fires in northwestern California forests have risen dramatically in frequency and size due to climate related and human influences. According to the California Natural Resources Agency, fires in California are expected to increase 100 percent due to increased temperatures and longer dry seasons associated with climate change.
Some of the other most damaging human influences to the Karuk include logging activities, which have depleted old growth forests, and fire suppression policies created by the U.S. Forest Service in the 1930s that have limited cultural burning practices. Tripp says these policies have been detrimental to tribal traditions and the forest environment.
“It has been huge to just try to adapt to the past 100 years of policies that have led us to where we are today. We have already been forced to modify our traditional practices to fit the contemporary political context,” says Tripp.
Further, the construction of dams along the Klamath River by PacifiCorp (a utility company) has impeded access to salmon and other fish that are central to the Karuk diet. Fishing regulations have also had a negative impact.
Though the Karuk’s dependence on the land has left them vulnerable to the projected effects of climate change, it has also given them and other indigenous groups incredible knowledge to impart to western climate science. Historically, though, tribes have been largely left out of policy processes and decisions. The Karuk decided to challenge this historical pattern of marginalization by formulating their own Eco-Cultural Resources Management Plan.
The Plan provides over twenty “Cultural Environmental Management Practices” that are based on traditional ecological knowledge and the “World Renewal” philosophy, which emphasizes the interconnectedness of humans and the environment. Tripp says the Plan was created in the hopes that knowledge passed down from previous generations will help strengthen Karuk culture and teach the broader community to live in a more ecologically sound way.
“It is designed to be a living document…We are building a process of comparative learning, based on the principles and practices of traditional ecological knowledge, to revitalize culturally relevant information as passed through oral transmission and intergenerational observations,” says Tripp.
One of the highlights of the plan is to re-establish traditional burning practices in order to decrease fuel loads and the risk for more severe wildfires when they do happen. Traditional burning was used by the Karuk to burn off specific types of vegetation and promote continued diversity in the landscape. Tripp notes that these practices are an example of how humans can play a positive role in maintaining a sound ecological cycle in the forests.
“The practice of utilizing fire to manage resources in a traditional way not only improves the use quality of forest resources, it also builds and maintains resiliency in the ecological process of entire landscapes” explains Tripp.
Another crucial aspect of the Plan is the life cycle of fish, like salmon, that are central to Karuk food traditions and ecosystem health. Traditionally, the Karuk regulated fishing schedules to allow the first salmon to pass, ensuring that those most likely to survive made it to prime spawning grounds. There were also designated fishing periods and locations to promote successful reproduction. Tripp says regulatory agencies have established practices that are harmful to this cycle.
“Today, regulatory agencies permit the harvest of fish that would otherwise be protected under traditional harvest management principles and close the harvest season when the fish least likely to reach the very upper river reaches are passing through,” says Tripp.
The Karuk tribe is now working closely with researchers from universities such as University of California, Berkeley and the University of California, Davis as well as public agencies so that this traditional knowledge can one day be accepted by mainstream and academic circles dealing with climate change mitigation and adaptation practices.
According to the Plan, these land management practices are more cost effective than those currently practiced by public agencies; and, if implemented, they will greatly reduce taxpayer cost burdens and create employment. The Karuk hope to create a workforce development program that will hire tribal members to implement the plan’s goals, such as multi-site cultural burning practices.
The Plan has a long way to full realization and Federal recognition. According to the National Indian Forest Resources Management Act and the National Environmental Protection Act, it must go through a formal review process. Besides that, the Karuk Tribe is still solidifying funding to pursue its goals.
The work of California’s environmental stewards will always be in demand, and the Karuk are taking the lead in showing how community wisdom can be used to generate an integrated approach to climate change. Such integrated and community engaged policy approaches are rare throughout the state but are emerging in other areas. In Oakland, for example, the Oakland Climate Action Coalition engaged community members and a diverse group of social justice, labor, environmental, and business organizations to develop an Energy and Climate Action Plan that outlines specific ways for the City to reduce greenhouse gas emissions and create a sustainable economy.
In the end, Tripp hopes the Karuk Plan will not only inspire others and address the global environmental plight, but also help to maintain the very core of his people. In his words: “Being adaptable to climate change is part of that, but primarily it is about enabling us to maintain our identity and the people in this place in perpetuity.”
Dr. Manuel Pastor is Professor of Sociology and American Studies & Ethnicity at the University of Southern California, where he also directs the Program for Environmental and Regional Equity and co-directs USC’s Center for the Study of Immigrant Integration. His most recent books include Just Growth: Inclusion and Prosperity in America’s Metropolitan Regions (Routledge 2012; co-authored with Chris Benner), Uncommon Common Ground: Race and America’s Future (W.W. Norton 2010; co-authored with Angela Glover Blackwell and Stewart Kwoh), and This Could Be the Start of Something Big: How Social Movements for Regional Equity are Transforming Metropolitan America (Cornell 2009; co-authored with Chris Benner and Martha Matsuoka).
The Geological Perspective On Global Warming: A Debate
Dr Colin P. Summerhayes, Vice-President of the Geological Society of London
Dear Dr Peiser,
In the interest of contributing to the evidence-based debate on climate change I thought it would be constructive to draw to your attention the geological evidence regarding climate change, and what it means for the future. This evidence was published in November 2010 by the Geological Society of London in a document entitled “Climate Change: Evidence from the Geological Record”, which can be found on the Society’s web page.
A variety of techniques is now available to document past levels of CO2 in the atmosphere, past global temperatures, past sea levels, and past levels of acidity in the ocean. What the record shows is this. The Earth’s climate has been cooling for the past 50 million years from 6-7°C above today’s global average temperatures to what we see now. That cooling led to the formation of ice caps on Antarctica 34 million years ago and in the northern hemisphere around 2.6 million years ago. The cooling was directly associated with a decline in the amount of CO2 in the atmosphere. In effect we moved from a warm “greenhouse climate” when CO2, temperature and sea level were high, and there were no ice caps, to an “icehouse climate” in which CO2, temperature and sea level are low, and there are ice caps. The driver of that change is the balance between the emission of CO2 into the atmosphere from volcanoes, and the mopping up of CO2 from the atmosphere by the weathering of rocks, especially in mountains. There was more volcanic activity in the past and there are more mountains now.
Superimposed on this broad decline in CO2 and temperature are certain events. Around 55 million years ago there was a massive additional input of carbon into the atmosphere – about 4 times what humans have put there. It caused temperatures to rise by a further 6°C globally and 10°C at the poles. Sea level rose by some 15 metres. Deep ocean bottom waters became acid enough to dissolve carbonate sediments and kill off calcareous bottom dwelling organisms. It took over 100,000 years for the Earth to recover from this event. More recently, during the Pliocene, around 3 million years ago, CO2 rose to levels a little higher than today’s, global temperature rose to 2-3°C above today’s level, Antarctica’s Ross Ice Shelf melted, and sea level rose by 10-25 metres.
The icehouse climate that characterised the past 2.6 million years averaged 9°C colder in the polar regions and 5°C colder globally. It was punctuated by short warm interglacial periods. We are living in one of these warm periods now – the Holocene – which started around 11,000 years ago. The glacial to interglacial variations are responses to slight changes in solar energy meeting the Earth’s surface with changes in: our planet’s orbit from circular to elliptical and back; the position of the Earth relative to the sun around the Earth’s orbit; and the tilt of the Earth’s axis. These changes recur on time scales of tens to hundreds of thousands of years. CO2 plays a key role in these changes. As the Earth begins to warm after a cold period, sea ice melts allowing CO2 to emerge from the ocean into the atmosphere. There it acts to further warm the planet through a process known as positive feedback. The same goes for another greenhouse gas, methane, which is given off from wetlands that grow as the world warms. As a result the Earth moves much more rapidly from cold to warm than it does from warm to cold. We are currently in a cooling phase of this cycle, so the Earth should be cooling slightly. Evidently it is not.
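The positive feedback described above can be illustrated with the standard feedback-amplification relation, in which an initial warming dT0 is amplified to dT0 / (1 - f) for a net feedback fraction f. The numerical values below are placeholders for illustration, not quantities from the letter.

```python
def amplified_warming(dT0, f):
    """Equilibrium warming after feedback: dT = dT0 / (1 - f),
    valid for net feedback fraction f < 1."""
    if f >= 1:
        raise ValueError("f >= 1 implies a runaway feedback")
    return dT0 / (1 - f)

# Placeholder values: 1.0 C of initial orbitally-triggered warming,
# amplified by an assumed net positive feedback fraction of 0.5
# (CO2 outgassing plus wetland methane), doubling the response.
print(amplified_warming(1.0, 0.5))
print(amplified_warming(1.0, 0.0))  # no feedback: warming unchanged
```

The asymmetry of the glacial cycles follows naturally: on the warming leg the feedbacks reinforce the orbital trigger, while on the cooling leg they unwind much more slowly.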
The Geological Society deduced that by adding CO2 to the atmosphere as we are now doing, we would be likely to replicate the conditions of those past times when natural emissions of CO2 warmed the world, melted ice in the polar regions, and caused sea level to rise and the oceans to become more acid. The numerical models of the climate system that are used by the meteorological community to predict the future give much the same result by considering modern climate variation alone. Thus we arrive at the same solution by two entirely independent methods. Under the circumstances the Society concluded that “emitting further large amounts of CO2 into the atmosphere over time is likely to be unwise, uncomfortable though that fact may be.”
Dr Colin P. Summerhayes
Vice-President Geological Society of London and Emeritus Associate Scott Polar Research Institute, Cambridge.
8 February 2013
Professor Robert Carter and Professor Vincent Courtillot respond:
Dear Dr Peiser,
Thank you for your invitation on behalf of the Foundation to reply to Dr Summerhayes’ letter about geological evidence in relation to the hypothesis of dangerous anthropogenic global warming (DAGW) that is favoured by the Intergovernmental Panel on Climate Change (IPCC).
We are in agreement with many of Dr Summerhayes’ preliminary remarks about the geological context of climate change. This reflects that a large measure of scientific agreement and shared interpretation exists amongst most scientists who consider the global warming issue.
Points of commonality in the climate discussion include:
* that climate has always changed and always will,
* that Earth has often been warmer than it is today, and that its present climatic condition is that of a warm interglacial during a punctuated icehouse world,
* that carbon dioxide is a greenhouse gas and warms the lower atmosphere (though debate remains as to the magnitude and timescale of the warming),
* that a portion of human emissions are accumulating in the atmosphere,
* that a global warming of around 0.5°C occurred in the 20th century, but that there has been no global temperature rise over the last 16 years.
The first two points are rooted in geological evidence (as discussed in more detail by Dr Summerhayes), the third is based upon physical principle and the last three are mostly matters of instrumental measurement (i.e. observation). Despite the disparate scientific disciplines involved, all these points are relevant to achieving a quantitative understanding of climate change, together with several other disputed scientific matters such as those that we discuss below.
One of the disputed scientific matters is represented by Dr Summerhayes’ assertion that cooling over the last 34 million years “was directly associated with a decline in the amount of CO2 in the atmosphere”.
The word “associated” is ambiguous. It may simply mean that temperature and CO2 were correlated, in the sense that their trends were parallel. But, as everyone knows, correlation is not causation; whether one drives the other, whether both are driven by a third forcing factor, or whether the correlation is the result of chance requires careful analysis and argument. Though it may be true that a broad correlation exists between atmospheric CO2 content and global temperature, at least on some timescales, it remains unclear whether the primary effect is one of increasing CO2 causing warming (via the greenhouse effect) or of warming causing CO2 increase (via outgassing from the ocean). We are familiar with the argument that the currently decreasing carbon isotope ratio in the atmosphere is consistent with a fossil fuel source for incremental CO2 increases, and therefore with the first of these two possibilities, but do not find it compelling because other natural sources (soil carbon, vegetation) also contribute isotopically negative carbon to the atmosphere.
A second area of uncertainty, related to the point just discussed, is the rate, scope and direction of the various feedbacks that apply during a natural glacial-interglacial climatic cycle. Dr Summerhayes provides a confident, and perhaps plausible, account as to how changing insolation (controlled by orbital change), melting sea-ice and increasing CO2 and CH4 jointly drive the asymmetrical glacial-interglacial cycles that have characterised recent planetary history. However, our knowledge of the climate system and its history currently remains incomplete; some of the forcing mechanisms and feedbacks may not be known accurately, or even at all. For example, we do not yet know whether clouds exert a net warming or cooling effect on the climate. Similarly, variations in ultraviolet radiation and high-energy particle emission from the Sun, in atmospheric electricity and in galactic cosmic rays may all play larger roles in controlling climate change than is currently assumed, yet these effects are absent from most of the current generation of deterministic computer models of the future climate. The temperature projections made by these models may well be affected by our ignorance of the magnitude, the sign, or even the existence of some of the forcings and feedbacks that are actually involved.
Thirdly, Dr Summerhayes also briefly discusses the issue of sea level change. He quotes an estimated increase of 15 m in sea level associated with a temperature increase of 6–10°C 55 million years ago. He then quotes a range of 10–25 m rise for a 2–3°C warming 3 million years ago. To this we might add the further examples of the 125 m sea level rise that has accompanied the 6°C temperature rise since the last glacial maximum, and the 0.2-m rise associated with the ~0.5°C 20th century warming. It appears from these examples that a 1°C temperature rise can be associated with a sea level rise of as little as 0.4 m or as much as 8 m, and all values in between! This indicates an uncertainty in our understanding of the temperature/CO2/sea-level connection that surely lessens its value for contributing to policy formulation.
Figure 1. Temperature curve reconstructed from oxygen isotope measurements in a Greenland ice core over the last 10,000 years (Lappi 2010, after Alley 2000).
Fourth, and last, Dr Summerhayes says that because orbitally-forced climate periodicity is currently in a cooling phase “the Earth should be cooling slightly. Evidently it is not”. The statement is tendentious, because whether Earth is seen to be cooling or warming depends upon the length of climate record that is considered. Trends over 1, 10, 100 or 1000 years are not the same thing, and their differences must be taken into account carefully. We reproduce two figures that may be used to demonstrate that Earth is currently not warming on either the longer-term millennial timescale (Figure 1) or the short-term decadal/meteorological timescale (Figure 2). We note also that on the intermediate centennial timescale (1850–2010) the temperature trend has been one of a slight (0.5°C) rise. In assessing which of these timescales is the “proper” one to consider in formulating climate policy, we observe that the results conveyed in Figure 2 have little scientific (and therefore policy) meaning unless they are assessed in the context of the data in Figure 1.
Figure 2. Mean temperature of lower atmosphere: HadCRUT4 annual means 1997-2011
We acknowledge that the data in Figure 1, which are drawn from a Greenland ice core, represent regional rather than global climate. But a similar pattern of Holocene long-term cooling is seen in many other records from around the world, including from Antarctic ice cores. Also, evidence for a millenial solar cycle has been accumulating over the past years, and, representing that rhythm, the Medieval Warming (also called Medieval Climatic Optimum) appears to have been both global and also warmer than today’s climate.
Regarding Figure 2, the data demonstrate that no warming has occurred since 1997. In response, some leading IPCC scientists have already acknowledged that should the temperature plateau continue, or turn into a statistically significant cooling trend, then the mainstream IPCC view will need revision. It is noteworthy, too, that over the 16 years during which global temperature has remained unchanged (1997-2012), atmospheric carbon dioxide levels have increased by 8%, from 364 ppm to c.394 ppm. Given a mixing time for the atmosphere of about 1 year, these data would invalidate the hypothesis that human-related carbon dioxide emissions are causing dangerous global warming. In any case, observed global temperatures are currently more remote than ever from the most recent predictions set out in IPCC AR4.
The areas of uncertainty in the prevailing argument over DAGW are therefore not only geological but also instrumental and physical. Current debate, which needs to be resolved before climate policy is set, centres on the following three issues:
* whether any definite evidence exists for dangerous warming of human causation over the last 50 years,
* the amount of net warming that is, or will be, produced by human-related emissions (the climate sensitivity issue), and
* whether the IPCC’s computer models can provide accurate climate predictions 100 years into the future.
In assessing these issues, our null hypothesis is that the global climate changes that have occurred over the last 150 years (and continue to occur today) are mainly natural in origin. As summarised in the reports of the Nongovernmental International Panel on Climate Change (NIPCC), literally thousands of papers published in refereed journals contain facts or writings consistent with this null hypothesis, and plausible natural explanations exist for all the post-1850 global climatic changes that have been described so far. In contrast, no direct evidence exists, and nor does the Geological Society point to any, that a measurable part of the mild late 20th century warming was definitely caused by human-related carbon dioxide emissions.
The possibility of human-caused global warming nonetheless remains, because carbon dioxide is indubitably a greenhouse gas. The major unknown is the actual value of climate sensitivity, i.e. the amount of temperature increase that would result from doubling the atmospheric concentration of CO2 compared to pre-industrial levels. IPCC models estimate that water vapour increases the 1°C effect that would be seen in a dry atmosphere to 2.5-4.5°C, whereas widely cited papers by Lindzen & Choi (2011) and Spencer & Braswell (2010) both describe empirical data that is consistent with negative feedback, i.e. sensitivity less than 1°C. The conclusion that climate sensitivity is significantly less than argued by the IPCC is also supported by a range of other empirical or semi-empirical studies (e.g., Forster & Gregory, 2006; Aldrin et al., 2012; Ring et al., 2012).
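The sensitivity debate can be made concrete with the widely used simplified forcing expression dF = 5.35 ln(C/C0) W/m², which yields about 3.7 W/m² per CO2 doubling. The sketch below is illustrative only, not drawn from any of the papers cited above; it scales an assumed sensitivity-per-doubling logarithmically to an arbitrary CO2 change:

```python
import math

def forcing_from_co2(c_ppm, c0_ppm=278.0):
    """Radiative forcing (W/m^2) from a CO2 change, using the common
    simplified expression dF = 5.35 * ln(C/C0) (Myhre et al., 1998)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

def equilibrium_warming(c_ppm, sensitivity_per_doubling, c0_ppm=278.0):
    """Warming implied by a given climate sensitivity (degrees C per
    CO2 doubling), scaled logarithmically to an arbitrary CO2 change."""
    return sensitivity_per_doubling * math.log2(c_ppm / c0_ppm)

doubled = 2 * 278.0
print(f"Forcing for doubled CO2: {forcing_from_co2(doubled):.2f} W/m^2")
# Contrast a low (sub-1 C) sensitivity with the IPCC model range:
for s in (1.0, 2.5, 4.5):
    print(f"S = {s:.1f} C/doubling -> {equilibrium_warming(doubled, s):.1f} C")
```

The entire disagreement described above reduces to the value of `sensitivity_per_doubling`; the logarithmic scaling itself is common to both camps.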
Gathering these various thoughts together, we conclude that the risk of occurrence of damaging human-caused global warming is but a small one within the much greater and proven risks of dangerous natural climate-related events (not to mention earthquakes, volcanic eruptions, tsunamis and landslides, since we are dealing here with geological topics). Moreover, the property damage and loss of life that occurred in the floods in the UK in 2007; in the 2005 Katrina and 2012 Sandy storms in the USA; and in deadly bushfires in Australia in 2009 and 2013 all attest that even wealthy and technologically sophisticated nations are often inadequately prepared to deal with climate-related hazard.
The appropriate response to climate hazard is to treat it in the same way as other geological hazards: that is, national policies are needed that are based on preparing for and adapting to all climate events as and when they happen, irrespective of their presumed cause. Every country needs to develop its own understanding of, and plans to cope with, the unique combination of climate hazards that applies within its own boundaries. The planned responses should be based on adaptation, with mitigation where appropriate to cushion citizens affected in undesirable ways.
The idea that there can be a one-size-fits-all global solution to deal with just one possible aspect of future climate hazard, as recommended by the IPCC, and apparently supported by Dr Summerhayes on behalf of the Geological Society, fails to deal with the real climate and climate-related hazards to which all parts of the world are episodically exposed.
Professor Robert (Bob) Carter
Professor Vincent Courtillot
14 February 2013
Aldrin, M. et al. 2012. Bayesian estimation of climate sensitivity based on a simple climate model fitted to observations on hemispheric temperature and global ocean heat content. Environmetrics, doi:10.1002/env.2140.
Alley, R.B. 2000. The Younger Dryas cold interval as viewed from central Greenland. Quaternary Science Reviews 19, 213-226.
Forster, P.M. & Gregory, J.M. 2006. The climate sensitivity and its components diagnosed from Earth radiation budget data. Journal of Climate 19, 39-52.
Lappi, D. 2010. 65 million years of cooling
Lindzen, R.S. & Choi, Y-S. 2011. On the observational determination of climate sensitivity and its implications. Asia-Pacific Journal of Atmospheric Sciences 47, 377-390.
Ring, M.J. et al. 2012. Causes of the global warming observed since the 19th century. Atmospheric and Climate Sciences 2, 401-415.
Spencer R. W. & Braswell, W.D. 2010. On the diagnosis of radiative feedback in the presence of unknown radiative forcing. Journal of Geophysical Research 115, D16109.
The name of the island is Lohachara, not that many of us are going to remember it for long. We'll probably recall it as that little island off India, the first once-inhabited island to disappear from the surface of the earth due to rising sea levels attributed to global warming.
Lohachara was a small island that supported a population of 10,000 in the Bay of Bengal, near where the Ganges and Brahmaputra rivers meet the sea. It is believed that other islands in the area will soon also be submerged, displacing some 70,000 more islanders.
Several uninhabited islands have disappeared in recent years, notably in the South Pacific. Lohachara is unique because it was inhabited.
PAO: Policy Activities » Briefing (November 2, 2011)
Using Science to Improve Flood Management
On November 2, 2011, ESA sponsored a congressional briefing: “Using Science to Improve Flood Management.” Emily Stanley (University of Wisconsin, Madison) and Jeff Opperman (The Nature Conservancy, Ohio Field Office) addressed the function of floodplains and managing rivers as systems and for multiple benefits.
Emily Stanley’s presentation focused on the work of rivers and the function of floodplains. Industry, transportation, and recreation all constitute work done by rivers. Less well known and valued is the work done by floodplains, responsible for desirable services such as flood attenuation, fish production, improved water quality and groundwater recharge. Stanley noted that aging US levees and other infrastructure provide an opportunity to move beyond structural flood control and take greater advantage of the functions of floodplains.
Jeff Opperman’s presentation focused on the logic of managing rivers as a system and for multiple benefits. These include risk reduction for people and infrastructure as well as benefits such as water storage during droughts and increased fisheries production. Opperman said that because the Mississippi River is managed as a comprehensive system, the recent flood was far less damaging than that of 1927 even though a greater volume of water passed through the system in 2011. Opperman also pointed to the success story of California’s Yolo Bypass, which has reduced flood risk while increasing goods and services.
Almost all of the 33 developed and developing countries surveyed in a new study had introduced or progressed with significant climate-related legislation within their own borders in the past year.
In 2012, 18 countries made significant progress, according to the report by the Grantham Research Institute at LSE and Globe International, which brings together legislators from different countries.
Only Canada had gone backwards on climate change, by repealing the Act implementing its targets under the Kyoto Protocol treaty to cut global emissions.
Despite a tough economic year for many countries, such as those in the eurozone, some progress was made by developed nations.
The EU as a whole made progress through its new directive on energy efficiency, while the US pushed forward with regulating carbon dioxide through its Clean Air Act.
Action by developing countries was more significant, with Mexico leading the way with a new climate law to cut emissions by 30% compared to "business as usual" by 2020, and major progress by countries ranging from Kenya to Pakistan.
Plants flower faster than climate change models predict
Scientific models are failing to accurately predict the impact of global warming on plants, says a new report.
Researchers found in long-term studies that some are flowering up to eight times faster than models anticipate.
The authors say that poor study design and a lack of investment in experiments partly account for the difference.
They suggest that spring flowering and leafing will continue to advance at the rate of 5 to 6 days for every degree Celsius of warming.
The results are published in the journal Nature.
For more than 20 years, scientists have been carrying out experiments to mimic the impacts of rising temperatures on the first leafing and flowering of plant species around the world.
Researchers had assumed that plants would respond in essentially the same way to experimental warming with lamps and open top chambers as they would to changes in temperatures in the real world.
Very little had been done to test that assumption until this study, led by Dr Elizabeth Wolkovich, who is now at the University of British Columbia in Vancouver.
With her colleagues she studied the timing of the flowering and leafing of plants in observational studies and warming experiments spanning four continents and 1,634 plant species.
According to Dr Wolkovich, the results were a surprise.
"What we found is that the experiments don't line up with the long term data, and in fact they greatly underestimate how much plants change their leafing and flowering with warming," she said.
"So for models based on experimental data, then we would expect that plants are leafing four times faster and flowering eight times faster in the long term historical record than what we're using in some of the models."'Consistent message'
Observational data have been gathered by scientific bodies for many years. In the UK, the systematic recording of flowering times dates back to 1875, when the Royal Meteorological Society established a national network of observers.
Since then, data has also been recorded by full-time biologists and part-time enthusiasts, and in recent years there have been mass-participation projects such as BBC Springwatch.
This new research suggests that these observations of flowering and leafing, carried out in many different parts of the world over the past thirty years, are remarkably similar, according to Dr Wolkovich.
"In terms of long term observations, the records are very coherent and very consistent and they suggest for every degree celsius of warming we get we are going to get a five- to six-day change in how plants leaf and flower."
She argues that the difficulties in mimicking the impacts of nature in an artificial setting are much greater than many scientists estimate. The team found that in some cases the use of warming chambers to artificially raise temperatures can sometimes have the opposite effect.
"In the real world, we don't just see changes in temperature - we see changes in precipitation and cloud patterns and other factors - so certainly when you think about replicating changes in clouds, we are very, very far away from being able to do that.
"I guess we will never get to perfectly match nature, but I am hopeful as scientists we can do much, much better, given funding resources."
The team found that the greater investment in the design and monitoring of experiments, the more accurate the result.
"We have a very consistent message from the long-term historical records about how plants are changing, but we need to think more critically about how we fund and invest in and really design experiments," said Dr Wolkovich.
"We do need them in the future, they are the best way going forward to project how species are changing but right now what we're doing isn't working as well as I think it could."
Other researchers were equally surprised by the results.
Dr This Rutishauser is at the Oeschger Centre for Climate Change Research at the University of Bern in Switzerland. He says that in light of this work scientists will have to rethink the impacts of global warming.
"The bottom line is that the impacts might be bigger than we have believed until now. That's going to provoke a lot of work to probably revise modelling results for estimations of what's going to happen in the future for food production especially."
Dr Wolkovich agrees that if the models are so significantly underestimating the real-world observations, there could also be impacts on water supplies the world over.
"If a whole plant community starts growing a week earlier than we expect according to these experiments, it's going to take up a lot more water over the growing season and if you add to that many years of the model projections, you are going to see big changes in the water supply."
She appeals to people to get involved in citizen science projects and help gather data on flowering and leafing, especially in remote areas.
The National Phenology Network in the US logged its millionth observation this week, and similar programmes are underway in the UK, Sweden, Switzerland, and the Netherlands, and a pan-European database is under development.
"We have very few monitoring networks. We need many, many people out there observing this because it is changing faster and across more habitats than we are currently measuring - we need more help!"
"Many environmentalists believe that wind and solar power can be scaled to meet the rising demand [of billions emerging from poverty], especially if coupled with aggressive efforts to cut waste," reports Justin Gillis. "But a lot of energy analysts have crunched the numbers and concluded that today’s renewables, important as they are, cannot get us even halfway there."
Gillis discusses the most promising innovations in nuclear power, which many technologists see as the most viable option for providing a reliable source of electricity without carbon emissions. These include "a practicable type of nuclear fusion", "a fission reactor that could run on today’s nuclear waste", and "a safer reactor based on an abundant element called thorium."
"Beyond the question of whether they will work," he adds, "these ambitious schemes pose a larger issue: How much faith should we, as a society, put in the idea of a big technological fix to save the world from climate change?"
And as is appropriate for a nuclear-related news item that appeared on the two-year anniversary of the Tohoku earthquake, we offer a reminder of the twelve different nuclear power "near miss" events that occurred in the United States in 2012.
What Is Air Pollution?
Air pollution in its great magnitude has existed throughout the 20th century, from the coal-burning industries of the early century to the fossil-fuel-burning technology of the new century. It is a major problem for highly developed nations, whose large industrial bases and highly developed infrastructures generate much of the air pollution.
Every year, billions of tonnes of pollutants are released into the atmosphere; the sources range from power plants burning fossil fuels to the effects of sunlight on certain natural materials. But the air pollutants released from natural materials pose very little health threat; only the natural radioactive gas radon poses any risk. Most of the air pollutants released into the atmosphere are therefore the result of man's activities.
In the United Kingdom, traffic is the major cause of air pollution in cities; eighty-six percent of families own one or two vehicles. Because of the high-density population of cities and towns, the number of people exposed to air pollutants is great. This has led to an increase in chronic diseases over recent years, as car ownership in the UK has nearly trebled. These diseases include asthma and respiratory complaints, affecting the whole population demographic, from children to elderly people, who are most at risk. Those suffering from asthma will notice the effects most strongly if they live in inner-city or industrial areas, or even near major roads. Asthma is already the fourth biggest killer in the UK, after heart diseases and cancers, and currently affects more than 3.4 million people.
In the past, severe pollution in London during 1952, combined with low winds and high-pressure air, took more than four thousand lives, and another seven hundred died in 1962, in what were called the 'Dark Years' because of the dense, dark polluted air.
Air pollution is also causing devastation in the environment; much of it comes from man-made gases such as sulphur dioxide, which results from electricity plants burning fossil fuels. In the UK, industries and utilities that use tall smokestacks as a means of removing air pollutants only boost them higher into the atmosphere, reducing the concentration only at their own site. These pollutants are often transported over the North Sea and produce adverse effects in western Scandinavia, where sulphur dioxide and nitrogen oxides from the UK and central Europe generate acid rain, especially in Norway and Sweden. The pH level, or relative acidity, of many Scandinavian freshwater lakes has been altered dramatically by acid rain, causing the destruction of entire fish populations. In the UK, acid rain formed by sulphur dioxide emissions has led to acidic erosion of limestone in north-western Scotland and marble in northern England.
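Because pH is a logarithmic scale, even a modest-looking drop implies a large change in acidity. A short illustration (the lake pH values here are hypothetical, chosen only to show the scale):

```python
def hydrogen_ion_molar(ph):
    """H+ concentration (mol/L) from pH; pH is defined as -log10[H+]."""
    return 10 ** (-ph)

# A hypothetical lake acidifying from pH 6.5 to pH 5.0:
before = hydrogen_ion_molar(6.5)
after = hydrogen_ion_molar(5.0)
print(f"Acidity increased about {after / before:.0f}-fold")  # ~32-fold
```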
In 1998, the London Metropolitan Police launched the 'Emissions Controlled Reduction' scheme, whereby traffic police would monitor the amount of pollutants being released into the air by vehicle exhausts. The plan was for traffic police to stop vehicles at random on roads leading into the city of London; the officer would then measure the pollutants being released using a CO2 reader fitted to the vehicle's exhaust. If the exhaust exceeded the legal amount (based on micrograms of pollutants), the driver would be fined around twenty-five pounds. The scheme proved unpopular with drivers, especially those driving to work, and did little to improve the city's air quality.
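The roadside check described above reduces to a simple threshold test. The sketch below is hypothetical: the numeric legal limit is an invented assumption, since the scheme's actual threshold is not given in the text; only the roughly twenty-five pound fine is.

```python
# Hypothetical sketch of the roadside emissions check described above.
LEGAL_LIMIT_MICROGRAMS = 200.0  # assumed limit; the real value is not stated
FINE_POUNDS = 25                # "around twenty-five pounds"

def check_exhaust(reading_micrograms):
    """Return the fine due (in pounds) for one roadside exhaust reading."""
    if reading_micrograms > LEGAL_LIMIT_MICROGRAMS:
        return FINE_POUNDS
    return 0

print(check_exhaust(185.0))  # 0  (within the assumed limit)
print(check_exhaust(240.0))  # 25 (over the limit, so fined)
```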
In Edinburgh, the main cause of poor air quality was the vast number of vehicles passing through the city centre from west to east. In 1990, the Edinburgh council developed the city bypass at a cost of nearly seventy-five million pounds. The bypass was ringed around the outskirts of the city; its main aim was to limit the number of vehicles going through the city centre by diverting them onto the bypass to reach their destinations. This relieved much of the congestion within the city but did very little to solve the city's overall air quality problem.
To further decrease the number of vehicles on the roads, the government promoted public transport. Over two hundred million pounds was devoted to developing the country's public transport network, much of which went on more bus lanes in London, which increased the pace of bus services. Gas- and electric-powered buses were introduced in Birmingham to decrease air-pollutant emissions around the city centre. Because children and the elderly are most at risk of chronic diseases such as asthma, major diversion roads were built to divert vehicles away from residential areas, schools and institutions for the elderly. In some councils, trees were planted along the sides of roads to reduce the amount of carbon monoxide in the air. Other ways of improving air quality included restricting the amounts of air pollutants released into the atmosphere by industries: tough regulations were put in place whereby, if air quality dropped below a certain level around an industrial area, a heavy penalty would be levied.
© Copyright 2000, Andrew Wan.
Forest governance and climate policies
Fred Stolle of the World Resources Institute looks at the need for REDD to address forest governance issues as well as creating market incentives.
Policy-makers are recognizing the essential role that the world’s remaining forests play in maintaining the global climate system. The political momentum generated by the Bali Action Plan under the UN Framework Convention on Climate Change (UNFCCC) will create a unique opportunity to put in place a framework of incentives that could curb deforestation, slow forest degradation, and improve the way forests are managed. To succeed, these incentives must strike at the main drivers of rampant deforestation and must also recognize the dependency of local communities on forest ecosystems for their livelihoods.
In the coming months, climate change negotiators have agreed to explore a mechanism for providing compensation for “Reducing Emissions from Deforestation and Forest Degradation in Developing Countries” (REDD). Under most REDD proposals, compensation would be financed by the sale of these emission reductions as ‘carbon offsets’ to be used by regulated countries or companies to remain within their emissions limits.
However, will the promise of money for carbon alone create the conditions necessary to counteract the drivers of deforestation?
If a REDD mechanism is to succeed, competing pressures on forests will need to be managed fairly and effectively. REDD needs to strike at the heart of the drivers of deforestation, which are not always directly linked to markets but as often stem from governance problems: illegal logging, poor planning, weak law enforcement, the absence of tenure rights, the lack of accountability, the limited coordination and capacity of the institutions that manage forest resources, and the loss of revenues.
It thus seems apparent that REDD will need to do more than create market incentives. To make REDD effective, efficient and capable of achieving lasting impacts, these governance issues need to be addressed. Countries will need assistance to make these difficult governance improvements, yet the improvements cannot be directly translated into reduced emissions and so cannot be paid for with carbon credits. For REDD to be successful, there is therefore a need for a payment-mechanism phase, either in parallel with or prior to a market mechanism.
Although this phase could not be measured in tonnes of carbon removed, it clearly needs to be measured (and reported and verified), so as not to fall into the same trap as general development assistance (ODA), which over recent decades has had a low rate of success. The concept of this governance phase has been getting more attention lately, and one option for it has been described recently in the Norwegian government-Meridian Institute Options Assessment Report (2009) as the 'Implementation of policies and measures phase'.
To make this governance phase measurable and successful, governance indicators (qualitative and/or quantitative) need to be developed and agreed upon to be able to identify areas of improvement and hold governments accountable (both governments that supply funds and governments that receive funds). These indicators should cover a wide range of governance topics such as institutions, management, tenure, planning, etc.
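Such indicators could, in principle, be combined into a single accountability score. The sketch below is purely illustrative: the indicator names and weights are invented for the example, not drawn from any actual REDD framework.

```python
# Hypothetical weighted scoring of a country against agreed governance
# indicators (weights sum to 1.0; all names and numbers are invented).
indicators = {
    "institutional capacity": 0.25,
    "land tenure clarity":    0.25,
    "law enforcement":        0.30,
    "revenue transparency":   0.20,
}

def governance_score(ratings):
    """Weighted 0-100 score from per-indicator ratings (each 0-100)."""
    return sum(weight * ratings[name] for name, weight in indicators.items())

ratings = {"institutional capacity": 60, "land tenure clarity": 40,
           "law enforcement": 55, "revenue transparency": 70}
print(f"Governance score: {governance_score(ratings):.1f}/100")
```

Tracking such a score over time is one way a funding government could hold a recipient accountable, and vice versa.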
Addressing climate change and especially deforestation worldwide will depend on the right incentives and the governance capacity to effectively use these incentives. To improve governance and ensure progress and accountability of governance, we need to develop measurable and agreed upon governance indicators.
Stopping Carbon Pollution
Whether you live in a city, on a farm, or anywhere in between, climate change is affecting your weather and damaging the natural world around you. In order to work towards a clean energy future, America needs carbon pollution controls on the largest industrial sources. The U.S. Environmental Protection Agency is taking long overdue steps to limit greenhouse gas emissions from oil refineries and coal-fired power plants, but right now, these highly polluting sources are allowed to release carbon into the atmosphere without any limits.
The National Wildlife Federation’s top priority is to stop the primary cause of climate change – carbon pollution – before it’s too late. NWF is currently fighting major campaigns to:
Electricity generation is the single largest source of global warming pollution in the United States, representing 41 percent of all carbon dioxide emissions. EPA’s newly announced plan to set standards for this sector could require the clean-up of our oldest, dirtiest, least efficient coal power plants. NWF is engaged in a major effort to finalize these rules and dramatically ratchet down our carbon pollution, and we are also supporting EPA's work to address emissions from oil refineries -- the second largest "stationary source" (as opposed to mobile sources like cars and trucks) of global warming pollution in the United States. Strong controls over these big sources will begin holding polluters accountable for their contribution to the climate crisis.
Reducing Emissions in the US and Worldwide
Recognizing that this is a global problem that demands national and international leadership, NWF’s long-term goal is to adopt a national plan that rapidly cuts carbon pollution from all major sources in the US, and safeguards communities and wildlife from the mounting impacts of climate change. The last effort at national legislation – the American Clean Energy and Security Act – passed the House of Representatives in 2009 but stalled in the Senate. Since that time, the impacts of climate change have rapidly escalated. NWF is working hard to get Congress to step up and take action to solve our nation’s most urgent environmental issue.
Avoiding the worst consequences of this disaster also requires a global solution. NWF is partnering with organizations around the world to promote an international agreement that clamps down on carbon pollution, while ensuring that all countries can protect their citizens and wildlife from the impacts of climate change.
What determines how much coverage a climate study gets?
It probably goes without saying that it isn’t strongly related to the quality of the actual science, nor to the clarity of the writing. Appearing in one of the top journals does help (Nature, Science, PNAS and occasionally GRL), though that in itself is no guarantee. Instead, it most often depends on the ‘news’ value of the bottom line. Journalists and editors like stories that surprise, that give something ‘new’ to the subject and are therefore likely to be interesting enough to readers to make them read past the headline. It particularly helps if a new study runs counter to some generally perceived notion (whether that is rooted in fact or not). In such cases, the ‘news peg’ is clear.
And so it was for the Steig et al “Antarctic warming” study that appeared last week. Mainstream media coverage was widespread and generally did a good job of covering the essentials. The most prevalent peg was the fact that the study appeared to reverse the “Antarctic cooling” meme that has been a staple of disinformation efforts for a while now.
It’s worth remembering where that idea actually came from. Back in 2001, Peter Doran and colleagues wrote a paper about the Dry Valleys long term ecosystem responses to climate change, in which they had a section discussing temperature trends over the previous couple of decades (not the 50 years time scale being discussed this week). The “Antarctic cooling” was in their title and (unsurprisingly) dominated the media coverage of their paper as a counterpoint to “global warming”. (By the way, this is a great example to indicate that the biggest bias in the media is towards news, not any particular side of a story). Subsequent work indicated that the polar ozone hole (starting in the early 80s) was having an effect on polar winds and temperature patterns (Thompson and Solomon, 2002; Shindell and Schmidt, 2004), showing clearly that regional climate changes can sometimes be decoupled from the global picture. However, even then both the extent of any cooling and the longer term picture were more difficult to discern due to the sparse nature of the observations in the continental interior. In fact we discussed this way back in one of the first posts on RealClimate back in 2004.
This ambiguity was of course a gift to the propagandists. Thus for years the Doran et al study was trotted out whenever global warming was being questioned. It was of course a classic ‘cherry pick’ – find a region or time period when there is a cooling trend and imply that this contradicts warming trends on global scales over longer time periods. Given a complex dynamic system, such periods and regions will always be found, and so as a tactic it can always be relied on. However, judging from the take-no-prisoners response to the Steig et al paper from the contrarians, this important fact seems to have been forgotten (hey guys, don’t worry you’ll come up with something new soon!).
Actually, some of the pushback has been hilarious. It’s been a great example for showing how incoherent and opportunistic the ‘antis’ really are. Exhibit A is an email (and blog post) sent out by Senator Inhofe’s press staff (i.e. Marc Morano). Within this single email there are misrepresentations, untruths, unashamedly contradictory claims and a couple of absolutely classic quotes. Some highlights:
Dr. John Christy of the University of Alabama in Huntsville slams new Antarctic study for using [the] “best estimate of the continent’s temperature”
Perhaps he’d prefer it if they used the worst estimate? ;)
[Update: It should go without saying that this is simply Morano making up stuff and doesn't reflect Christy's actual quotes or thinking. No-one is safe from Morano's misrepresentations!]
[Further update: They've now clarified it. Sigh....]
Morano has his ear to the ground of course, and in his blog piece dramatically highlights the words “estimated” and “deduced” as if that was some sign of nefarious purpose, rather than a fundamental component of scientific investigation.
Internal contradictions are par for the course. Morano has previously been convinced that “… the vast majority of Antarctica has cooled over the past 50 years.”, yet he now approvingly quotes Kevin Trenberth who says “It is hard to make data where none exist.” (It is indeed, which is why you need to combine as much data as you can find in order to produce a synthesis like this study). So which is it? If you think the data are clear enough to demonstrate strong cooling, you can’t also believe there is no data (on this side of the looking glass anyway).
It’s even more humorous, since even the more limited analysis available before this paper showed pretty much the same amount of Antarctic warming. Compare the IPCC report with the same values from the new analysis (under various assumptions about the methodology).
(The different versions are the full reconstruction, a version that uses detrended satellite data for the co-variance, a version that uses AWS data instead of satellites, and one that uses PCA instead of RegEM. All show positive trends over the last 50 years).
Further contradictions abound: Morano, who clearly wants it to have been cooling, hedges his bets with a “Volcano, Not Global Warming Effects, May be Melting an Antarctic Glacier” Hail Mary pass. Good luck with that!
It always helps if you haven’t actually read the study in question. That way you can just make up conclusions:
Scientist adjusts data — presto, Antarctic cooling disappears
Nope. It’s still there (as anyone reading the paper will see) – it’s just put into a larger scale and longer term context (see figure 3b).
Inappropriate personalisation is always good fodder. Many contrarians seemed disappointed that Mike was only the fourth author (the study would have been much easier to demonise if he’d been the lead). Some pretended he was anyway, and just for good measure accused him of being a ‘modeller’ as well (heaven forbid!).
Others also got in on the fun. A chap called Ross Hays posted a letter to Eric on multiple websites and on many comment threads. On Joe D’Aleo’s site, this letter was accompanied with this little bit of snark:
Icecap Note: Ross shown here with Antarctica’s Mount Erebus volcano in the background was a CNN forecast Meteorologist (a student of mine when I was a professor) who has spent numerous years with boots on the ground working for NASA in Antarctica, not sitting at a computer in an ivory tower in Pennsylvania or Washington State
This is meant as a slur against academics of course, but is particularly ironic, since the authors of the paper have collectively spent over 8 seasons on the ice in Antarctica, 6 seasons in Greenland and one on Baffin Island in support of multiple ice coring and climate measurement projects. Hays’ one or two summers there, his personal anecdotes and misreadings of the temperature record, don’t really cut it.
Neither do rather lame attempts to link these results with the evils of “computer modelling”. According to Booker (for it is he!) because a data analysis uses a computer, it must be a computer model – and probably the same one that the “hockey stick” was based on. Bad computer, bad!
The proprietor of the recently named “Best Science Blog”, also had a couple of choice comments:
In my opinion, this press release and subsequent media interviews were done for media attention.
This remarkable conclusion is followed by some conspiratorial gossip implying that a paper that was submitted over a year ago was deliberately timed to coincide with a speech in Congress from Al Gore that was announced last week. Gosh these scientists are good.
All in all, the critical commentary about this paper has been remarkably weak. Time will tell of course – confirming studies from ice cores and independent analyses are already published, with more rumoured to be on their way. In the meantime, floating ice shelves in the region continue to collapse (the Wilkins will be the tenth in the last decade or so) – each of them with their own unique volcano no doubt – and gravity measurements continue to show net ice loss over the Western part of the ice sheet.
Nonetheless, the loss of the Antarctic cooling meme is clearly bothering the contrarians much more than the loss of 10,000 year old ice. The poor level of their response is not surprising, but it does exemplify the tactics of the whole ‘bury one’s head in the sand’ movement – they’d much rather make noise than actually work out what is happening. It would be nice if this demonstration of intellectual bankruptcy got some media attention itself.
That’s unlikely though. It’s just not news.
id: <urn:uuid:0fe1fa4f-99f0-436d-ba48-6f2e07ec325e> | url: http://www.realclimate.org/index.php/archives/2009/01/warm-reception-to-antarctic-warming-story/langswitch_lang/de?wpmp_switcher=desktop | climate_prob: 0.5784 | source: fineweb
DENVER – Put on your poodle skirts and tune in Elvis on the transistor radio, because it’s starting to look a lot like the 1950s.
Unfortunately, this won’t be the nostalgic ’50s of big cars and pop music.
The 1950s that could be on the way to Colorado is the decade of drought.
So says Brian Bledsoe, a Colorado Springs meteorologist who studies the history of ocean currents and uses what he learns to make long-term weather forecasts.
“I think we’re reliving the ’50s, bottom line,” Bledsoe said Friday morning at the annual meeting of the Colorado Water Congress.
Bledsoe studies the famous El Niño and La Niña ocean currents. But he also looks at other, less well-known cycles, including long-term temperature cycles in the oceans.
In the 1950s, water in the Pacific Ocean was colder than normal, but it was warmer than usual in the Atlantic. That combination caused a drought in Colorado that was just as bad as the Dust Bowl of the 1930s.
The ocean currents slipped back into their 1950s pattern in the last five years, Bledsoe said. The cycles can last a decade or more, meaning bad news for farmers, ranchers, skiers and forest residents.
“Drought feeds on drought. The longer it goes, the harder it is to break,” Bledsoe said.
The outlook is worst for Eastern Colorado, where Bledsoe grew up and his parents still own a ranch. They recently had to sell half their herd when their pasture couldn’t provide enough feed.
“They’ve spent the last 15 years grooming that herd for organic beef stock,” he said.
Bledsoe looks for monsoon rains to return to the Four Corners and Western Slope in July. But there’s still a danger in the mountains in the summer.
“Initially, dry lightning could be a concern, so obviously, the fire season is looking not so great right now,” he said.
Weather data showed the last year’s conditions were extreme.
Nolan Doesken, Colorado’s state climatologist, said the summer of 2012 was the hottest on record in Colorado. And it was the fifth-driest winter since record-keeping began more than 100 years ago.
Despite recent storms in the San Juan Mountains, this winter hasn’t been much better.
“We’ve had a wimpy winter so far,” Doesken said. “The past week has been a good week for Colorado precipitation.”
However, the next week’s forecast shows dryness returning to much of the state.
Reservoir levels are higher than they were in 2002 – the driest year since Coloradans started keeping track of moisture – but the state is entering 2013 with reservoirs that were depleted last year.
“You don’t want to start a year at this level if you’re about to head into another drought,” Doesken said.
It was hard to find good news in Friday morning’s presentations, but Bledsoe is happy that technology helps forecasters understand the weather better than they did during past droughts. That allows people to plan for what’s on the way.
“I’m a glass-half-full kind of guy,” he said.
id: <urn:uuid:6b5ff0a8-5351-4289-bb86-d7195a7837dc> | url: http://durangoherald.com/article/20130201/NEWS01/130209956/0/20120510/Drought-is-making-itself-at-home | climate_prob: 0.8756 | source: fineweb
In an era when almost every energy technology is unpopular with somebody, the people who don’t want wind turbines, generating stations or new transmission lines installed in their neighborhoods often raise the idea of improving energy efficiency as an alternative.
That argument is particularly common in New York State and in Vermont, where state governments are trying to close nuclear reactors within their borders. So, how effectively can efficiency replace a reactor, making up for the loss of this zero-carbon energy source?
Not very, according to a new study of carbon dioxide output in Japan in the months around the Fukushima disaster.
Figures collected by the Breakthrough Institute, a group that often presents contrarian views on environmentalism and energy conservation, found that despite stringent efforts to use less energy, Japan emitted 4 percent more carbon dioxide in November 2011 than it did in the same month the previous year. After a quake and tsunami in March 2011 led to three meltdowns at the Fukushima nuclear plant, Japan began closing other plants as well, one because it appeared vulnerable to tsunami and others because local officials did not want them running.
Energy consumption dropped sharply and was nearly 10 percent lower last November than in November 2010, the institute’s figures show. But with natural gas, oil and coal substituting for about 46 reactors, the production of carbon dioxide per unit of energy produced ran about 15 percent higher.
The pattern was the same all year after the March 11 tsunami and quake: consumption dropped but fuel burn increased. This was true even though Japan ran office air-conditioners at far reduced levels last summer and some demand had disappeared because of damage from the disaster.
What analogy can be drawn to Indian Point, 30 miles north of New York City, or to Vermont Yankee, near Brattleboro? This month, a New York State Assembly committee concluded that Indian Point was replaceable, an assertion sharply disputed by a business consumer group.
Jason Grumet, an air pollution expert and founder of the Bipartisan Policy Center, said it was hard to draw direct parallels. “The circumstances in the United States are obviously different from Japan,’’ he said. For one thing, Japan was parsimonious in its use of electricity even before Fukushima, and American consumers probably have more fat to cut.
But in either country, he said, it is true that “a decrease in nuclear production in favor of fossil fuels will increase carbon intensity of the power sector, and total carbon dioxide emissions.’’
“It’s an incredibly difficult public policy challenge’’ for the United States, Mr. Grumet said, with different imperatives colliding. “One is to ensure that the aging fleet of nuclear plants is held to the highest safety standards, and the second is to reduce greenhouse gas emissions,’’ he said. “And the third is to keep the lights on.”
id: <urn:uuid:4cda4899-535c-49d1-ba21-2625b4bde643> | url: http://green.blogs.nytimes.com/2012/02/13/can-efficiency-counter-a-loss-of-nuclear-power/?ref=sustainableliving | climate_prob: 0.856 | source: fineweb
Greenland ice a benchmark for warming
Core data: Greenland was about eight degrees warmer 130,000 years ago than it is today, an analysis of an almost three-kilometre-long Greenland ice core has revealed.
The finding by an international team of 38 institutions from 14 nations provides an important benchmark for climate change modelling and gives an insight into how the natural world will respond to global warming in the future.
The study, which involves CSIRO researchers, also suggests Antarctica's ice sheets may be more vulnerable to warming than previously thought.
Published in today's Nature journal, the results flow out of a four-year expedition known as the North Greenland Eemian Ice Drilling operation (NEEM).
Dr David Etheridge, principal research scientist with CSIRO Marine and Atmospheric Research who has worked on the project, says the NEEM program is the first to successfully reach down into Greenland's ice core into the Eemian period, which stretched from 130,000 years to 115,000 years ago.
"It has been something of a holy grail for Greenland work to achieve this … we are getting to ice close to the bedrock where you get melting and mixing of the ice layers."
Etheridge says in a process similar to assembling a jigsaw puzzle, scientists used comparisons with gas elements in Antarctica's deep ice core records to re-assemble the layers in their original sequence. Deep ice drilling in the Antarctic has reached as far back as 800,000 years.
Past and future
It is important to understand what happened in Greenland during the Eemian period because the temperatures experienced then are "within the realms of where we are heading", says Etheridge.
However, he says the previous warming was due to the Earth receiving more of the Sun's radiation due to its orbit at the time, while today's warming is being driven by increases in greenhouse gases in the atmosphere.
Nature paper co-author Dr Mauro Rubino, of CSIRO Marine and Atmospheric Research, says it had been previously estimated that Greenland's temperature was about 4°C warmer during the Eemian than now.
But this latest work used analysis of water-stable isotopes to estimate "the temperature 130,000 years ago was up to 8°C warmer [in Greenland] than what it is today", says Rubino.
It also shows sea levels were on average 6 metres higher.
The results provide "important benchmarks for future climate change projections" in temperature and the contribution of the two main ice sheets to sea level rises, Rubino says.
He says the study also reveals the Greenland ice sheet did not melt as much as previously thought so was not the major contributor to sea level at that time.
"It shows the major contribution to sea level rises was not coming from the Greenland ice shelf," he says.
"It was previously believed that Greenland melted entirely [during the Eemian], but in fact the ice sheet was not that much different from what it is now.
"Most of the contribution to sea level rise comes from these two big ice reserves [in Greenland and the Antarctica] so one of the possible interpretations is Antarctica is more susceptible to climate change than we thought."
Etheridge agrees. He says the work shows the Greenland ice sheet survived during the Eemian - although it was about 400 metres thinner.
"From that figure you can deduce how much it contributed to the sea level rise and it is not as much as was thought.
"That throws things back to Antarctica ... previously the thought was Antarctica was too cold and too stable to be impacted."
Etheridge says CSIRO was invited by lead institution, the University of Copenhagen, to be involved in NEEM at its formation because of its expertise in analysing air composition in air bubbles trapped in deep ice.
Rubino says their team began analysis of gas bubbles from the first 80 to 100 metres of ice core down to the final 2540 metre depth.
This helped track changes in climate and temperature on a year-by-year basis.
He says the concentration of greenhouse gases such as carbon dioxide, methane and nitrous oxide in the air bubbles from the Eemian was much lower than what it is today.
id: <urn:uuid:96cf8d51-9a89-4975-ae4f-fecfebf943e1> | url: http://www.abc.net.au/science/articles/2013/01/24/3675740.htm | climate_prob: 1 | source: fineweb
Carbon With That Latte?
Sonia Narang 07.03.07, 6:00 AM ET
How Starbucks hopes to trim its emissions footprint.
In its shop in downtown San Mateo, Calif., for instance, baristas serve up about 40,000 cups of coffee drinks every month. Just based on utility bills alone, that means Starbucks is serving up about 4,900 pounds of carbon with its drinks--or about two ounces per cup.
Starbucks executives say they are looking for ways to trim those carbon emissions. But they are reluctant to say just how much Starbucks' worldwide carbon footprint is--and how it has changed over the past few years. Starbucks has calculated the carbon footprint of its North American locations only once, in 2003. Since then, its number of U.S. company-owned stores has almost doubled to 6,281. Its international company-owned locations, also left out of the calculation, now number more than 1,500.
"Although we have grown in size, the nature of our business remains the same--the operation of retail stores and roasting coffee," says Jim Hanna, environmental affairs manager at Starbucks in Seattle. While Starbucks chooses not to calculate its carbon footprint every year, the company does conduct annual progress checks, but these numbers are not publicly reported.
Other eco-friendly companies are also surprisingly coy. Last month, for instance, Google led a group of 40 other companies (including Starbucks) in kicking off the "Climate Savers Computing Initiative," a project aimed at building and buying more energy-efficient PCs.
Google is nonetheless keeping a watch on the size of its carbon footprint and hopes to achieve carbon neutrality by the end of this year by using non-carbon energy sources for much of its power needs and purchasing carbon offsets for the rest. Recently, Google flipped the switch on 1.6 megawatts of solar power modules on the roof of its Mountain View headquarters.
Starbucks was early among eco-sensitive companies. Executives became convinced early in this decade that atmospheric carbon could wreak havoc on the global climate--and so on the supply and price of coffee beans. "We're facing environmental risks posed by climate change that could negatively affect many aspects of our company, including our ability to procure coffee," Hanna says.
Temperature and rainfall dictate how much coffee comes out of regions including Latin America and Asia. "As we hope to increase to 40,000 stores worldwide in the next 10 years, we're going to need a larger supply," Hanna says.
In 2003, Starbucks hired Denver-based engineering firm CH2M Hill to calculate the carbon footprint of the approximately 3,700 stores it then had in North America. CH2M Hill began measuring corporate footprints in the late 1990s and has done comparable calculations for a few dozen companies, including Nike (nyse: NKE), 3M (nyse: MMM), SC Johnson and energy firm Kinder Morgan (nyse: KMI).
Doing such calculations is still something of a black art. CH2M Hill's Lisa Grice, who worked on the coffee company's carbon footprint, says the final number primarily includes electricity used in retail stores. Carbon calculators take into account stores' geographic locations. That's because electricity generated at power plants in one state may come from a different source than a power plant in another state. Some stores may get electricity from coal-fired plants, which results in greater carbon emissions, while others may depend on hydroelectric power, which has a lower carbon byproduct.
Starbucks decided to leave out the additional 81,000 tons of carbon dioxide it emitted through transporting coffee materials and disposing solid waste. According to Starbucks Environmental Affairs Manager Ben Packard, the company can only control and manage carbon emissions from energy used in retail stores and coffee-roasting plants.
It took about half a year of data collection and complex calculations to figure out that Starbucks emitted 295,000 tons of carbon into the atmosphere in 2003.
Starbucks attributes 81% of its greenhouse gas emissions to purchased electricity and 18% to coffee roasting at its three North American plants and natural gas usage in stores.
That 295,000-ton figure gives Starbucks a small carbon footprint, among a list of about 1,000 companies compiled by the Carbon Disclosure Project, a London-based nonprofit. Near the top of the list is energy giant American Electric Power (nyse: AEP) with 146.5 million tons of carbon emissions. Next in line are oil and gas companies Royal Dutch/Shell and British Petroleum (nyse: BP) with 105 million tons and 92 million tons.
Comparatively, General Electric's (nyse: GE) 12.4 million ton footprint makes it a medium-size emitter. The smallest carbon emitters weighed in at a few thousand tons. Most of the lower footprints belong to insurance companies, retailers and banks.
Starbucks execs say that even as they've been growing the number of outlets, they've been trying to be more energy efficient. In 2005, Starbucks joined the World Resources Institute's Green Power Market Development Group, a consortium of 15 companies ranging from Staples (nasdaq: SPLS) to Google. The group helps its members purchase renewable energy at lower prices. Last year, the coffee company increased its wind power to 20% of the total energy usage in North American stores. This offset 62,000 tons of carbon dioxide.
But to track progress in reducing carbon emissions accurately, companies need to update those footprints frequently, says Marcus Peacock of the U.S. Environmental Protection Agency. "We've asked companies to check their numbers annually," he says.
A number of companies are doing just that. Both Intel (nasdaq: INTC) and Sun Microsystems (nasdaq: SUNW), which are also part of the Climate Savers Computing Initiative, report their carbon footprints annually. Intel's carbon footprint added up to 4 million tons in 2006, a number that includes worldwide operations. Sun first calculated its footprint at 255,000 tons last year, and used past data to figure out carbon emissions dating back four years. The company also reports up-to-date carbon numbers on its Web site.
"We calculate this monthly so that we can make sure we're on track with improving emissions," says Sun's VP of Eco Responsibility Dave Douglas.
Both Intel and Sun are part of the EPA's Climate Leaders Program, a group of companies that sets tangible carbon reduction goals. Climate Leaders began five years ago, when few companies even knew the meaning of carbon footprint. Now, the program boasts 132 members.
In the meantime, Starbucks executives insist they are looking for ways to improve energy efficiency and encourage their customers to do the same. This summer, Starbucks told its customers to go green through a number of high-profile campaigns, including "Green Umbrellas for a Green Cause" and the online Planet Green Game (planetgreengame.com). Starbucks will also start monitoring the energy usage of specific equipment at some stores later this year. "We'll install individual meters on espresso machines, refrigerators, water filtration systems and other components," Hanna says.
This doesn't necessarily mean you'll see a green espresso maker at a Starbucks near you anytime soon. "Quality and performance come first," Hanna says.
id: <urn:uuid:f118693f-8e25-48c9-a4fa-ea787f1d53e7> | url: http://www.forbes.com/2007/07/02/starbucks-emissions-environment-biz-cz_sn_0703green_carbon.html | climate_prob: 1 | source: fineweb
FineWeb-Edu v2 - FastText Climate Filtered
A climate and environment-focused subset of sraj/finewebedu-climate-v2, further filtered using a trained FastText binary classifier.
Overview
This dataset applies a supervised FastText climate classifier to the FineWeb-Edu climate v2 dataset. Each record includes a climate probability score from the classifier, providing a confidence measure for climate relevance.
Pipeline
- Source: sraj/finewebedu-climate-v2 (pre-filtered FineWeb-Edu)
- Classifier: FastText supervised model trained on 10K GPT-labeled samples
- Threshold: Records with climate_prob >= 0.5 are included
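The inclusion rule in the last step is simple to state in code. A minimal sketch of the thresholding step — the records and scores here are invented for illustration, and the real pipeline's FastText classifier is what supplies the probabilities:

```python
def filter_by_climate_prob(records, probs, threshold=0.5):
    """Keep records whose classifier probability clears the threshold,
    attaching the score under "climate_prob" as this dataset does."""
    kept = []
    for record, prob in zip(records, probs):
        if prob >= threshold:
            row = dict(record)           # copy so the caller's data is untouched
            row["climate_prob"] = prob
            kept.append(row)
    return kept


docs = [{"id": "a"}, {"id": "b"}, {"id": "c"}]
scores = [0.58, 0.42, 0.88]              # made-up scores for illustration
print(filter_by_climate_prob(docs, scores))  # "a" and "c" survive the 0.5 cutoff
```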
Fields
| Field | Type | Description |
|---|---|---|
| text | string | The document text |
| id | string | Original document ID |
| url | string | Source URL |
| climate_prob | float | FastText classifier probability (0-1) |
| source | string | Source dataset identifier |
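Put together, a record looks like the sketch below, with values taken from one of the preview rows above (the `text` field is truncated here):

```python
record = {
    "text": "DENVER - Put on your poodle skirts and tune in Elvis...",  # truncated
    "id": "<urn:uuid:6b5ff0a8-5351-4289-bb86-d7195a7837dc>",
    "url": "http://durangoherald.com/article/20130201/NEWS01/130209956/0/20120510/Drought-is-making-itself-at-home",
    "climate_prob": 0.8756,
    "source": "fineweb",
}

# Every record in this dataset clears the 0.5 inclusion threshold:
assert record["climate_prob"] >= 0.5
```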
Usage
```python
from datasets import load_dataset

dataset = load_dataset("Michaelyya/fineweb-edu-v2-FastT")
```
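For a stricter subset, you can filter on `climate_prob` after loading. The predicate below is the whole trick; the `0.9` cutoff is an arbitrary choice for illustration, not something the dataset defines:

```python
def is_high_confidence(row, threshold=0.9):
    """Predicate suitable for datasets.Dataset.filter: keep rows above a stricter cutoff."""
    return row["climate_prob"] >= threshold

# With the dataset loaded as above (not run here, since it needs a network fetch):
#   high_conf = dataset.filter(is_high_confidence)

print(is_high_confidence({"climate_prob": 0.8756}))  # False at the 0.9 cutoff
```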
Model Details
- Architecture: FastText supervised classifier
- Training: 10,000 samples labeled via GPT weak supervision
- Labels: Binary classification (climate vs. other)
- Hyperparameters: lr=0.5, epochs=25, wordNgrams=2, dim=100
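FastText's supervised mode expects one `__label__`-prefixed example per line, so the GPT-labeled samples have to be flattened into that format before training. A minimal sketch under assumptions: binary labels `climate`/`other`, a hypothetical `train.txt` filename, and example texts borrowed from the preview rows. Note that fastText's Python API spells the epoch parameter `epoch`, not `epochs`:

```python
def to_fasttext_line(label, text):
    """Format one labeled example in fastText's supervised input format:
    one example per line, a "__label__<name>" prefix, no embedded newlines."""
    return f"__label__{label} " + " ".join(text.split())

examples = [
    ("climate", "Greenland was about eight degrees warmer 130,000 years ago."),
    ("other", "Put on your poodle skirts and tune in Elvis on the radio."),
]
lines = [to_fasttext_line(lab, txt) for lab, txt in examples]
print(lines[0][:16])  # prints "__label__climate"

# With the lines written to train.txt, training with the hyperparameters listed
# above would look like this (needs the fasttext package; not run here):
#   import fasttext
#   model = fasttext.train_supervised("train.txt", lr=0.5, epoch=25,
#                                     wordNgrams=2, dim=100)
#   labels, probs = model.predict("some document text")
```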
License
MIT