the energy [r]evolution
The climate change imperative demands nothing short of an Energy [R]evolution. The expert consensus is that this fundamental shift must begin immediately and be well underway within the next ten years in order to avert the worst impacts. What is needed is a complete transformation of the way we produce, consume and distribute energy, while at the same time maintaining economic growth. Nothing short of such a revolution will enable us to limit global warming to a temperature rise of less than 2° Celsius, above which the impacts become devastating.
Current electricity generation relies mainly on burning fossil fuels, with their associated CO2 emissions, in very large power stations which waste much of their primary input energy. More energy is lost as the power is moved around the electricity grid network and converted from high transmission voltage down to a supply suitable for domestic or commercial consumers. The system is innately vulnerable to disruption: localised technical, weather-related or even deliberately caused faults can quickly cascade, resulting in widespread blackouts. Whichever technology is used to generate electricity within this old-fashioned configuration, it will inevitably be subject to some, or all, of these problems. At the core of the Energy [R]evolution there therefore needs to be a change in the way that energy is both produced and distributed.
4.1 key principles
the energy [r]evolution can be achieved by adhering to five key principles:
1. respect natural limits – phase out fossil fuels by the end of this century. We must learn to respect natural limits. There is only so much carbon that the atmosphere can absorb. Each year humans emit over 25 billion tonnes of carbon equivalent; we are literally filling up the sky. Geological resources of coal could provide several hundred years of fuel, but we cannot burn them and keep within safe limits. Oil and coal development must be ended. The global Energy [R]evolution scenario has a target to reduce energy-related CO2 emissions to a maximum of 10 Gigatonnes (Gt) by 2050 and phase out fossil fuels by 2085.
2. equity and fairness. As long as there are natural limits, there needs to be a fair distribution of benefits and costs within societies, between nations and between present and future generations. At one extreme, a third of the world’s population has no access to electricity, whilst the most industrialised countries consume much more than their fair share.
The effects of climate change on the poorest communities are exacerbated by massive global energy inequality. If we are to address climate change, one of the core principles must be equity and fairness, so that the benefits of energy services – such as light, heat, power and transport – are available for all: north and south, rich and poor. Only in this way can we create true energy security, as well as the conditions for genuine human wellbeing.
The Advanced Energy [R]evolution scenario has a target to achieve energy equity as soon as technically possible. By 2050 the average per capita emission should be between 1 and 2 tonnes of CO2.
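A quick arithmetic check shows how the 10 Gt cap relates to the per-capita target. This sketch is not part of the scenario itself; the 2050 population figure is an assumption, roughly in line with mid-range UN projections.

```python
# Rough per-capita check on the scenario targets above.
# POPULATION_2050 is an assumed figure, not one taken from the scenario.
EMISSIONS_CAP_TONNES = 10e9    # 10 Gt energy-related CO2 by 2050
POPULATION_2050 = 9.0e9        # assumed world population in 2050

per_capita_tonnes = EMISSIONS_CAP_TONNES / POPULATION_2050
print(f"{per_capita_tonnes:.1f} t CO2 per person per year")  # ~1.1 t
```

The result lands at the bottom of the 1 to 2 tonne band, which is why the cap and the equity target are consistent with each other.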
3. implement clean, renewable solutions and decentralise energy systems. There is no energy shortage. All we need to do is use existing technologies to harness energy effectively and efficiently. Renewable energy and energy efficiency measures are ready, viable and increasingly competitive. Wind, solar and other renewable energy technologies have experienced double-digit market growth for the past decade.
Just as climate change is real, so is the renewable energy sector. Sustainable decentralised energy systems produce less carbon emissions, are cheaper and involve less dependence on imported fuel. They create more jobs and empower local communities. Decentralised systems are more secure and more efficient. This is what the Energy [R]evolution must aim to create.
To stop the earth’s climate spinning out of control, most of the world’s fossil fuel reserves – coal, oil and gas – must remain in the ground. Our goal is for humans to live within the natural limits of our small planet.
4. decouple growth from fossil fuel use. Starting in the developed countries, economic growth must be fully decoupled from fossil fuel usage. It is a fallacy to suggest that economic growth must be predicated on their increased combustion.
We need to use the energy we produce much more efficiently, and we need to make the transition to renewable energy and away from fossil fuels quickly in order to enable clean and sustainable growth.
5. phase out dirty, unsustainable energy. We need to phase out coal and nuclear power. We cannot continue to build coal plants at a time when emissions pose a real and present danger to both ecosystems and people. And we cannot continue to fuel the myriad nuclear threats by pretending nuclear power can in any way help to combat climate change. There is no role for nuclear power in the Energy [R]evolution.
Source: http://www.energyblueprint.info/1332.0.html?L=0
What is Rainwater Harvesting?
Rainwater harvesting is an ancient practice of catching and holding rain for later use. In a rainwater harvesting system, rain is gathered from a building rooftop or other source and is held in large containers for future use, such as watering gardens or washing cars. This practice reduces the demand on water resources and is excellent during times of drought.
Why is it Important?
In addition to reducing the demand on our water sources (especially important during drought), rainwater harvesting also helps prevent water pollution. Surprised?
Here’s why: the success of the 1972 Clean Water Act has meant that the greatest threat to New York’s waterbodies comes not from industrial sources, but rather through the small actions we all make in our daily lives. For example, in a rain storm, the oil, pesticides, animal waste, and litter from our lawns, sidewalks, driveways, and streets are washed down into our sewers. This is called non-point source (NPS) pollution because the pollutants come from too many sources to be identified. Rainwater harvesting diverts water from becoming polluted stormwater; instead, this captured rainwater may be used to irrigate gardens near where it falls.
In New York City, keeping rainwater out of the sewer system is very important. That’s because the city has an old combined sewer system that uses the same pipes to transport both household waste and stormwater to sewage treatment plants. During heavy rains, the system overloads; then untreated sewage and contaminated stormwater overflow into our rivers and estuary, with serious consequences.
Who is Harvesting Rainwater in New York City?
Back in 2002, a drought emergency pushed many community gardens to the brink of extinction. For the first time in twenty years, community gardeners were denied permission to use fire hydrants, the primary source of water for most community gardens. This crisis led to the formation of the Water Resources Group (WRG), an open collaboration of community gardening and environmental organizations. With help from the WRG, rainwater harvesting systems have now been built as demonstration sites in twenty NYC community gardens.
At community gardens that harvest rainwater, rain is diverted from the gutters of adjacent buildings and is stored in tanks in the gardens. A 1-inch rainfall on a 1,000-square-foot roof produces 600 gallons of water. The tanks are mosquito proof, so the standing water does not encourage West Nile virus. Because rainwater is chlorine free, it is better than tap water for plant growth, meaning healthier plants. And it’s free!
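The 600-gallon figure can be reproduced from the geometry alone. The sketch below is illustrative: the conversion constants are standard, and the efficiency parameter is a hypothetical knob for real-world losses such as first-flush diversion and gutter overflow.

```python
GALLONS_PER_CUBIC_FOOT = 7.48052  # US liquid gallons in one cubic foot

def harvest_gallons(roof_area_sqft, rainfall_inches, efficiency=1.0):
    """Gallons of rain landing on a roof; scale by efficiency for losses."""
    cubic_feet = roof_area_sqft * (rainfall_inches / 12.0)
    return cubic_feet * GALLONS_PER_CUBIC_FOOT * efficiency

# 1 inch of rain on a 1,000-square-foot roof:
print(round(harvest_gallons(1000, 1)))  # ~623 gallons before losses
```

Real systems capture somewhat less than the geometric maximum, which is why the rounded 600-gallon rule of thumb is the figure commonly quoted.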
What are Other Cities Doing?
Many cities have adopted creative, low-cost ways to stop wasting rainwater by diverting it from their sewage systems and putting it to use where it falls.
What Can I Do?
Spread the word! Educate those around you on the importance of lifestyle decisions.
Tell people not to litter, dump oil down storm drains, or overfertilize their lawns.
Install a rainwater harvesting system at your home, school, business, or local community center.
Contact your local elected officials, and let them know you support rainwater harvesting!
Source: http://www.waterresourcesgroup.org/
|
Japan has been hit by the worst crisis since 1945, as an earthquake and tsunami have killed 10,000 people, destroyed tens of thousands of buildings, displaced hundreds of thousands, and left millions without power or water. As the nation braces for more aftershocks, workers have resorted to pumping sea water into the damaged Fukushima reactors in an attempt to prevent a nuclear meltdown from becoming a third catastrophe; the plant has already leaked radiation and caused a mass evacuation. According to Greenpeace,
"We are told by the nuclear industry that things like this cannot happen with modern reactors, yet Japan is in the middle of a nuclear crisis with potentially devastating consequences…The evolving situation at Fukushima remains far from clear, but what we do know is that contamination from the release of Cesium-137 poses a significant health risk to anyone exposed. Cesium-137 has been one of the isotopes causing the greatest health impacts following the Chernobyl disaster, because it can remain in the environment and food chain for 300 years.”
Whereas the first two catastrophes were natural and unpredictable, a nuclear meltdown is entirely unnatural and entirely predictable. According to the local anti-nuclear group, the Citizens’ Nuclear Information Centre:
The nuclear crisis comes a month before the 25th anniversary of the Chernobyl disaster, the largest nuclear meltdown in history, which showered Europe in a radioactive cloud causing a quarter of a million cancers, 100,000 of them fatal. As of this writing the disaster in Japan is already the third worst in history, behind Chernobyl and the Three Mile Island partial meltdown in 1979, and comes only 12 years after a fatal overexposure of workers at a nuclear plant in Tokaimura, Japan. Even without the inherent risk of a meltdown, nuclear power is a threat to health. The problem is not just the few terrible times when they don't work, but the daily experience of when they do work. As climate campaigner George Monbiot wrote more than a decade ago,
“The children of women who have worked in nuclear installations, according to a study by the National Radiological Protection Board, are eleven times more likely to contract cancer than the children of workers in non-radioactive industries. You can tell how close to [the nuclear plant in] Sellafield children live by the amount of plutonium in their teeth.”
Add to this the morbidity and mortality of working in uranium mines and the dangers of disposing of radioactive waste, and you have negative health impacts at every stage of nuclear power (for a summary see the UK’s Campaign for Nuclear Disarmament). Despite this, governments have invested massively in the nuclear industry and globalized the risk. Canada has exported nuclear reactors while building seven of its own, and despite concerns about safety the Ontario government plans on investing $36 billion into nuclear power at the same time as it is backing off wind power.
REASONS AND EXCUSES
While nuclear power is a clear and present danger to the health of the planet and its people, it is a thriving industry driven by economic and military competition. Vandana Shiva—who studied nuclear physics and now leads the climate justice movement in India—has exposed the hypocrisy of US hostility to Iranian nuclear power when it is doing the same thing to promote nuclear power and weapons in India as a bulwark against China:
As Shiva summarized in her book Soil Not Oil, “nuclear winter is not an alternative to global warming”, and it is a tragedy that Japan has become the test case against both military and civilian arms of the nuclear industry, from the atomic bomb 65 years ago to the nuclear meltdown today. But instead of admitting the problems of nuclear power, the nuclear industry and its supporters have greenwashed it and presented it as a solution to global warming. Some environmentalists, such as Gaia theorist James Lovelock, have fallen prey to these claims. Lovelock, whose ideas are driven by apocalyptic predictions and an extreme pessimism, has gone so far as to claim that “nuclear power is the only green solution”. While former US president George Bush defended his country’s 103 nuclear power plants as not producing “a single pound of air pollution or greenhouse gases”, Dr. Helen Caldicott has refuted the claim in her important book Nuclear Power is Not the Answer, which argues that even without meltdowns nuclear power is a threat to the planet:
The false dichotomy between carbon emissions and nuclear power is also refuted by those developing the Tar Sands, who have proposed using nuclear power to pump Tar Sands oil.
PEOPLE POWER, GREEN JOBS
Fortunately there are growing anti-nuclear campaigns uniting indigenous groups, NGOs and the broader climate justice movement to challenge nuclear power in all its stages—from mining to use to waste disposal. As Vandana Shiva writes in Soil Not Oil,
Meanwhile in Canada indigenous groups are leading opposition to the transportation of nuclear waste through the Great Lakes and their surrounding communities, declaring “what we do to the land, we do to ourselves.” Last year the German government extended the operating lives of its nuclear plants against the will of the majority, but after news of the leak in Japan, 50,000 people formed a human chain from a nuclear reactor to Stuttgart demanding an end to nuclear power.
Uniting these campaigns with the labour movement raises the demands of good green jobs for all, to transform our oil and nuclear economy into one based on ecological and social sustainability and justice. Instead of the billions in subsidies for the nuclear industry, governments could be investing in solar, wind and clean electricity, while retrofitting buildings, which could solve the economic and climate crises without the inherent dangers of nuclear power. As Greenpeace wrote,
"Our thoughts continue to be with the Japanese people as they face the threat of a nuclear disaster, following already devastating earthquake and tsunami. The authorities must focus on keeping people safe, and avoiding any further releases of radioactivity...Greenpeace is calling for the phase out of existing reactors, and no construction of new commercial nuclear reactors. Governments should invest in renewable energy resources that are not only environmentally sound but also affordable and reliable.”
Source: http://yourheartsontheleft.blogspot.com/2011/03/nuclear-meltdown-is-not-alternative-to.html
Barbara Heath Land Race – 2012
By the time Barbara Heath visited Horsham, the town and the surrounding Wimmera District of Western Victoria were in the process of recovering from a decade-long drought. To inform her work, which was initially to address issues of drought, Heath held a number of planned and fortuitous conversations with the assistance of Horsham Regional Art Gallery staff, which came to focus on the changes in agricultural practices in the area.
The list of people with whom Heath consulted is lengthy, but Dr Bob Redden, curator of the Australian Temperate Field Crops Collection at the Grains Innovation Park, became her main contact. In an email of August 2011, Dr Redden wrote to Heath: ‘Now with unprecedented population levels and growth, there is a risk of disconnect and taking food supply for granted, even with climate change. Humans will need to change if they wish to continue their increasing diverse interests, but will need to prioritise agricultural research, better understanding our available genetic resources, plant growth and development, and imaginative paths to harnessing science and truly earn the title “Homo sapiens”.’
Land race is a direct response to the urgency of maintaining biodiversity. Agriculture today requires economies of scale that change the social landscape and limit population diversity. This results in the erasure of many small communities, loss of connection to the past and cultural loss. Dr Redden explained his department’s work to ensure plant gene diversity by sourcing and saving seed from land race crops. ‘Land race’ is the term used to describe heritage seed varieties now being displaced by International Seed Uniformity Standards.
Heath’s Land Race series shows distinct levels, from biodiversity in the soils to the patterns of farming practices above. Each Land Race also features a remnant plant species that reaches up and through the tractor track patterns: briar, apple and aloe.
There are numerous hero shots (one above) and details prepared (below); we will wait for the show to get under way and publicise a little later. The preliminary research is in an earlier blog post – click here.
Source: http://viewersite.wordpress.com/2012/02/
|
Participatory Video created by members of various indigenous communities in Itogon, Philippines, tracking the impacts of large-scale mining and now climate change on their environment and culture.
This film was created by members of various indigenous communities in the Cordillera region of the Philippines, during a Participatory Video project facilitated by InsightShare. The participants were taught to use video cameras during an intensive 9-day PV workshop in the barangay of Garrison, in Itogon, and created this 24-minute film to communicate the devastating impacts that large-scale mining by various companies has wrought on their communities over the years, and now the increasingly alarming impacts of climate change.
This project was part of the Conversations with the Earth (CWE) project. Launched in April 2009, Conversations with the Earth is a collective opportunity to build a global movement for an indigenous-controlled community media network. CWE works with a growing network of indigenous groups and communities living in critical ecosystems around the world, from the Atlantic Rainforest to Central Asia, from the Philippines to the Andes, from the Arctic to Ethiopia. Through CWE, these indigenous communities are able to share their story of climate change. Through the creation of sustainable autonomous indigenous media hubs in these regions, CWE fosters a long-term relationship with these communities, based on principles of local control and supporting indigenous media capacity.
Source: http://www.insightshare.org/watch/video/voices-experience
|
As the years tick by with most of the planet doing little in the way of reducing carbon emissions, researchers are getting increasingly serious about the possibility of carbon sequestration. If it looks like we're going to be burning coal for decades, carbon sequestration offers us the best chance of limiting its impact on climate change and ocean acidification. A paper that will appear in today's PNAS describes a fantastic resource for carbon sequestration that happens to be located right next to many of the US' major urban centers on the East Coast.
Assuming that capturing the carbon dioxide is financially and energetically feasible, the big concern becomes where to put it so that it will stay out of the atmosphere for centuries. There appear to be two main schools of thought here. One is that areas that hold large deposits of natural gas should be able to trap other gases for the long term. The one concern here is that, unlike natural gas, CO2 readily dissolves in water, and may escape via groundwater that flows through these features. The alternative approach turns that problem into a virtue: dissolved CO2 can react with minerals in rocks called basalts (the product of major volcanic activity), forming insoluble carbonate minerals. This should provide an irreversible chemical sequestration.
The new paper helpfully points out that if we're looking for basalts, the East Coast of the US, home to many of its major urban centers and their associated carbon emissions, has an embarrassment of riches. The rifting that broke up the supercontinent called Pangea and formed the Atlantic Ocean's basin triggered some massive basalt flows at the time, which are now part of the Central Atlantic Magmatic Province, or CAMP. The authors estimate that prior to some erosion, CAMP had the equivalent of the largest basalt flows we're currently aware of, the Siberian and Deccan Traps.
Some of this basalt is on land—anyone in northern Manhattan can look across the Hudson River and see it in the sheer cliffs of the Palisades. But much, much more of it is off the coast under the Atlantic Ocean. The authors provide some evidence in the form of drill cores and seismic readings that indicate there are large basalt deposits in basins offshore of New Jersey and New York, extending up to southern New England.
These areas are now covered with millions of years of sediment, which should provide a largely impermeable barrier that will trap any gas injected into the basalt for many years. The deposits should also have reached equilibrium with the seawater above, which will provide the water necessary for the chemical reactions that precipitate out carbonate minerals.
Using a drill core from an onshore deposit, the authors show that the basalt deposits are also composed of many distinct flows of material. Each of these flows would have undergone rapid cooling on both its upper and lower surface, which fragmented the rock. The core samples show porosity levels between 10 and 20 percent, which should allow any CO2 pumped into the deposits to spread widely.
The authors estimate that New Jersey’s Sandy Hook basin, a relatively small deposit, is sufficient to house 40 years’ worth of emissions from coal plants producing 4 GW of electricity. And the Sandy Hook basin is dwarfed by one that lies off the Carolinas and Georgia. They estimate that the South Georgia Rift basin covers roughly 40,000 square kilometers.
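For a rough sense of scale, a back-of-envelope sketch follows. This is not the paper's calculation: the emission intensity, the density of CO2 stored at depth, and the capacity factor are all assumed values chosen for illustration; only the 4 GW, 40-year, and 10-20 percent porosity figures come from the article.

```python
# All parameters below are illustrative assumptions, not the paper's values.
HOURS_PER_YEAR = 8760
EMISSION_T_PER_MWH = 0.95   # assumed coal emission intensity, t CO2 per MWh
CO2_DENSITY_KG_M3 = 700.0   # assumed density of CO2 stored at depth
POROSITY = 0.15             # midpoint of the 10-20% core measurements

def lifetime_co2_tonnes(plant_gw, years, capacity_factor=1.0):
    """Total CO2 emitted by a coal fleet of the given size, in tonnes."""
    mwh = plant_gw * 1000 * HOURS_PER_YEAR * capacity_factor * years
    return mwh * EMISSION_T_PER_MWH

def bulk_rock_km3(co2_tonnes):
    """Basalt volume whose pore space could hold that mass of CO2."""
    pore_m3 = co2_tonnes * 1000 / CO2_DENSITY_KG_M3
    return pore_m3 / POROSITY / 1e9

tonnes = lifetime_co2_tonnes(4, 40)
print(f"{tonnes / 1e9:.2f} Gt CO2")          # ~1.33 Gt
print(f"{bulk_rock_km3(tonnes):.0f} km^3")   # ~13 km^3 of basalt
```

Even with these crude numbers, the required rock volume is tiny next to a 40,000-square-kilometer basin, which is consistent with the authors' point that capacity is not the limiting factor.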
The authors argue that although laboratory simulations suggest the basic idea of using basalts for carbon sequestration is sound, the actual effectiveness in a given region can depend on local quirks of geology, so pilot tests in the field are absolutely essential for determining whether a given deposit is suitable. So far, only one small-scale test has been performed on any of the CAMP deposits.
Given the area's proximity to significant sources of CO2 and the infrastructure that could be brought into play if full-scale sequestration is attempted, it seems like one of the most promising proposals to date.
PNAS, 2010. DOI: 10.1073/pnas.0913721107
Source: http://arstechnica.com/science/2010/01/pangea-era-rift-makes-east-coast-perfect-for-carbon-storage/
|
Bundelkhand’s ravine wastelands. Photo: Keya Acharya/IPS
BUNDELKHAND, India – Narrow, cobblestoned lanes separate the rows of mud houses with cool interiors and mud-smoothened patios, some with goats tethered to the wooden posts. This is Tajpura village, deep in this water-stressed, drought-prone region of northern India.
An area of stark beauty marked by deep ravines in central India, Bundelkhand spans the states of Uttar Pradesh and Madhya Pradesh. The ruins of stone fortresses dotting the landscape betray a history of constant warfare just as the remnants of water courses and irrigation systems speak of peaceable and prosperous times gone by.
Bundelkhand suffers from manmade problems, starting with the government’s misplaced land and water policies that have worsened an already stressed climatic situation caused by prolonged droughts and erratic rainfall.
Air dropping of ‘Prosopis juliflora’ seeds as a soil-conservation measure in the 1960s resulted in the plant becoming an invasive species that killed indigenous shrubs and trees, making the soft soils of the ravines leach water rapidly and turned vast areas into wastelands.
Thoughtless promotion by the government of water-intensive crops like mentha (mint) encouraged richer farmers to dig deep tube wells while neglecting groundwater recharge, resulting in a disastrous lowering of the water table.
Marginalised farmers, unable to afford expensive infrastructure and inputs, suffer as groundwater depletion adds to problems caused by the ancient rainwater storage and distribution systems going defunct.
Drought is now a familiar spectre in the region, and less than half of its one million hectares of arable land remains cultivable, causing distress to its mainly farming population of 50 million people.
“What you have is very high water consumption in an area suffering from water crisis,” says Anil Singh, coordinator of Parmarth, an organisation working to revive traditional systems of water and cropping among marginalised communities that inhabit the ravines of Bundelkhand.
In Tajpura village, as though in denial of Bundelkhand’s stark conditions, 36-year-old Mamtadevi, wife of Ajan Singh, serves up a meal of steaming hot chappatis (Indian flat bread) smeared with clarified butter, a cool, green salad and a dish of smoked brinjal, boiled potato, fresh tomato and green chilli.
“That extra taste in the vegetables is because they are grown sustainably and without chemicals,” explains Mamtadevi.
Ajan Singh and Mamtadevi were among the first to adopt Parmarth’s ‘low external input sustainable agriculture’ (LEISA), which is now standing them in good stead as rainfall becomes scantier and average temperatures rise.
LEISA involves such practices as efficient recycling of nitrogen and other plant nutrients, managing pests through natural means, maintaining ideal soil conditions and ensuring that local farmers are aware of the environment and the value of preserving ecosystems.
The soundness of this method shows in the freshness of Ajan Singh’s vegetable crops, in biodiversity conservation through the use of hardy indigenous seeds and avoiding chemicals for maintaining soil health.
Ajan Singh is also able to beat the vagaries of the weather and this year’s drought, caused by failure of the monsoons, holds no great terror for him or for other farmers who follow LEISA.
Bhartendu Prakash, steering committee member of the Organic Farmers Association of India (OFAI) and in-charge of its northern branch based in Bundelkhand, says the region was hit by frost last winter but organic farmlands using LEISA were the least affected.
“I did not know this system previously. I would grow ‘gehu’ (wheat) and manage 200-300 kg on this same plot,” says Ajan Singh.
Parmarth helped the community in contouring the lands for rainwater run-off and storage and constructed a well for irrigation. Its volunteers also taught farmers like Ajan Singh how to make vermicompost and set up pheromone traps to catch insects.
Most farmers though, already had their own methods of making biopesticide – usually a mix of neem leaves and garlic soaked in buffalo buttermilk. “But before the pheromone traps were laid, the spraying had to be done once every three days, now once a week is enough,” says Mamtadevi.
By 2009, the couple’s vegetables had such a reputation for quality that they sold at the local market 10 km away at higher than prevailing rates, earning them nearly 80,000 Indian rupees (then approximately 1,800 dollars) yearly.
Three years later, Ajan Singh bought another ‘bigha’ (approximately 2.2 acres) of land. He now takes his produce to two markets and also sells milk from five buffaloes that he bought with his earnings.
Fifteen more farmers from Tajpura are now following Ajan Singh’s methods.
Along with this, the women of the community have banded together into self-help groups that maintain a savings and loan account to help women find simple livelihood alternatives like livestock rearing.
The women also run a grain bank that sells surplus grain in the open market and gives grain free to distressed families in times of need.
“We are now trying to link the community to government schemes wherever possible, such as obtaining sprinklers, and getting some benefit from the state-run Bundelkhand Relief Package which does help with drought-proofing,” says Anil Singh who works for Parmarth.
Released in 2009 by the federal government, the package worth 1.5 billion dollars supports rainwater harvesting, proper utilisation of river systems, irrigation canals and water bodies over a three-year period.
But Bundelkhand’s natural farming methods need to get more support as the funding period comes to an end.
“Bundelkhand is too entrenched in northern Indian chemical farming methods,” says OFAI’s Prakash. In contrast, OFAI is deluged with requests for training in organic farming methods from farmers in Punjab and Haryana, the ‘mother zone’ of the so-called ‘green revolution’ that transformed agriculture in India after its introduction in the 1960s.
Rajesh Krishnan, campaigner for Greenpeace in India, is optimistic that the government will see the wisdom of promoting organic agriculture as a counter measure to the numerous fallouts of chemical agriculture that fuelled the green revolution.
Krishnan is hopeful that sustainable agriculture will receive financing in India’s 12th Five-Year Plan, due to be rolled out in November.
Prakash is confident that sustainable agricultural farming will survive through a growing demand for organically-grown crops.
Source: http://climate-connections.org/2012/08/22/india-beating-the-weather-with-sustainable-crops/
|
Scientists have long projected that areas north and south of the tropics will grow drier in a warming world – from the Middle East through the European Riviera to the American Southwest, from sub-Saharan Africa to parts of Australia.
These regions are too far from the equator to benefit from the moist columns of heated air that result in steamy afternoon downpours. And the additional precipitation foreseen as more water evaporates from the seas is mostly expected to fall at higher latitudes. Essentially, a lot of climate scientists say, these regions may start to feel more like deserts under the influence of global warming.
Now scientists have measured a rapid recent expansion of desert-like barrenness in the subtropical oceans – in places where surface waters have also been steadily warming. There could be a link to human-driven climate change, but it’s too soon to tell, the scientists said.
Source: http://dotearth.blogs.nytimes.com/tag/deserts/
|
The Seine, the scenic river running through Paris, has inspired artists, attracted tourists and served as the soul of the city, and now it will also be a source of renewable energy. Paris officials have announced a plan to place river turbines beneath four bridges on the Seine.
The Pont du Garigliano, Pont de la Tournelle, Pont Marie and Pont au Change will each have two turbines installed underwater at their base. These bridges were chosen because the speed of the current accelerates in those locations. While river currents don't produce the kind of electricity that wave power can, the current-harvesting technology has come a long way and more devices are being introduced that can generate energy from even the slowest moving waters.
City officials have put a call out to power companies to come up with the best plan for installing the turbines, with a winner being chosen in January and installations starting next spring.
via The Guardian
|
<urn:uuid:e338e7ab-37e4-40ee-98f0-254c81baa630>
|
http://ecogeek.org/component/content/article/3207-paris-putting-turbines-in-the-seine
| 0.9792
|
fineweb
|
Thousands of lakes dot the marshy Arctic tundra regions. Now, in the latest addition to the growing body of evidence that global warming is significantly affecting the Arctic, two recent studies suggest that thawing permafrost is the cause of two seemingly contradictory observations: both rapidly growing and rapidly shrinking lakes.
Thawing permafrost is altering the lakes that dominate Arctic landscapes, such as this one in western Siberia. Courtesy of Laurence C. Smith.
The first study is a historical analysis of changes to 10,000 Siberian lakes over the past 30 years, a period of warming air and soil temperatures. Using satellite images, Laurence Smith, a geographer at the University of California, Los Angeles, and colleagues found that, since the early 1970s, 125 Siberian lakes vanished completely, and those that remain averaged a 6 percent loss in surface area, a total of 930 square kilometers.
They report in the June 3 Science that the spatial pattern of lake disappearance suggests that the lakes drained away when the permafrost below them thawed, allowing the lake water to seep down into the groundwater. However, the team also found that lakes in northwestern Siberia actually grew by 12 percent, and 50 new lakes formed. Both of the rapid changes are due to warming, they say, and if the warming trend continues, the northern lakes will eventually shrink as well.
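The reported figures can be sanity-checked with simple arithmetic (a sketch based only on the numbers quoted above): a 6 percent average loss totalling 930 square kilometers implies the surveyed lakes originally covered roughly 15,500 square kilometers.

```python
# Back-of-the-envelope check of the Siberian lake figures quoted above.
loss_fraction = 0.06   # 6 percent average loss in surface area
area_lost_km2 = 930    # total reported loss, in square kilometers

# Implied original total surface area of the surveyed lakes
original_area_km2 = area_lost_km2 / loss_fraction
print(f"Implied original lake area: {original_area_km2:,.0f} km^2")
```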
"These two processes are similar, in that we're witnessing permafrost degradation in both regions," says co-author Larry Hinzman, a hydrologist at the University of Alaska in Fairbanks, who in previous studies documented shrinking lakes in southern Alaska. "In the warmer, southern areas, we get groundwater infiltration, but in the northern areas, where the permafrost is thicker and colder, it's going to take much, much longer for that to occur. So instead of seeing lakes shrinking there, we're seeing lakes growing."
That finding is consistent with the second study, which focused on a set of unusually oriented, rapidly growing lakes in northern Alaska, an area of continuous permafrost. Jon Pelletier, a geomorphologist at the University of Arizona in Tucson, reports in the June 30 Journal of Geophysical Research Earth Surface that the odd alignment of the lakes is caused not by wind direction but by permafrost melting faster at the downhill end of the lake, which has shallower banks.
Since the 1950s, scientists have attributed the odd alignment of the egg-shaped lakes to winds blowing perpendicularly to the long axes of the lakes, which then set up currents that caused waves to break at the northwest and southeast ends, thus preferentially eroding them. "The prevailing wind direction idea has been around so long that we don't even think about it," Smith says, "but Jon's [Pelletier's] work is challenging that. It's a very interesting paper."
Wind-driven erosion occurs in the Great Lakes, but at rates of about a meter a year. The Alaskan oriented thaw lakes grow at rates of 5 meters or more per year. Pelletier says this rate difference suggests a different process is at work.
According to the model, the direction and speed of growth depend on where and how quickly the permafrost thaws, which is determined by two factors: how the water table intersects the slope of the landscape and how fast the summer temperature increases. If the permafrost thaws abruptly, the shorter, downhill bank is more likely to thaw first. The soggy soil slumps into the water, and the perimeter of the lake is enlarged. "It's not just the [global] warming trend, but also how quickly the warming takes place in the summertime," Pelletier says.
Hinzman says that the lakes are just one part of the Arctic water cycle, which has seen an increasing number of perturbations in recent years. "The whole hydrologic cycle is changing and this is just one component of that."
Understanding how the hydrologic cycle is changing is important, Hinzman says, because the amount of freshwater runoff into the Arctic Ocean impacts global ocean circulation and the amount of sea ice, thus affecting climate worldwide. If global warming continues to the point where permafrost goes away, there will be fewer lakes, Smith says. And a drier, less marshy Arctic could alter weather patterns and ecosystems, researchers say, affecting everything from the subsistence lifestyle of native people to the hazard of fire on the tundra.
Geotimes contributing writer
Source: http://www.geotimes.org/sept05/NN_arcticlakes.html
A new world record wind gust: 253 mph in Australia's Tropical Cyclone Olivia
The 6,288-foot peak of New Hampshire's Mount Washington is a forbidding landscape of wind-swept barren rock, home to some of planet Earth's fiercest winds. As a 5-year-old boy, I remember being blown over by a terrific gust of wind on the summit, and rolling out of control towards a dangerous drop-off before a fortuitously-placed rock saved me. Perusing the Guinness Book of World Records as a kid, three iconic world weather records always held a particular mystique and fascination for me: the incredible 136°F (57.8°C) at El Azizia, Libya in 1922, the -128.5°F (-89.2°C) at the "Pole of Cold" in Vostok, Antarctica in 1983, and the amazing 231 mph wind gust (103.3 m/s) recorded in 1934 on the summit of Mount Washington, New Hampshire. Well, the legendary winds of Mount Washington now have to take second place, behind the tropical waters of northwest Australia. The World Meteorological Organization (WMO) has announced that the new world wind speed record at the surface is a 253 mph (113.2 m/s) wind gust measured on Barrow Island, Australia. The gust occurred on April 10, 1996, during passage of the eyewall of Category 4 Tropical Cyclone Olivia.
Figure 1. Instruments coated with rime ice on the summit of Mt. Washington, New Hampshire. Image credit: Mike Theiss.
Tropical Cyclone Olivia
Tropical Cyclone Olivia was a Category 4 storm on the U.S. Saffir-Simpson scale, and generated sustained winds of 145 mph (1-minute average) as it crossed over Barrow Island off the northwest coast of Australia on April 10, 1996. Olivia had a central pressure of 927 mb and an eye 45 miles in diameter at the time, and generated waves 21 meters (69 feet) high offshore. According to Black et al. (1999), the eyewall likely had a tornado-scale mesovortex embedded in it that caused the extreme wind gust of 253 mph. The gust was measured at the standard measuring height of 10 meters above ground, on ground at an elevation of 64 meters (210 feet). A similar mesovortex was encountered by a Hurricane Hunter aircraft in Hurricane Hugo of 1989, and a mesovortex was also believed to be responsible for the 239 mph wind gust measured at 1400 meters by a dropsonde in Hurricane Isabel in 2003. For reference, 200 mph is the threshold for the strongest category of tornado, the EF-5, and any gusts of this strength are capable of causing catastrophic damage.
Figure 2. Visible satellite image of Tropical Cyclone Olivia a few hours before it crossed Barrow Island, Australia, setting a new world-record wind gust of 253 mph. Image credit: Japan Meteorological Agency.
Figure 3. Wind trace taken at Barrow Island, Australia during Tropical Cyclone Olivia. Image credit: Buchan, S.J., P.G. Black, and R.L. Cohen, 1999, "The Impact of Tropical Cyclone Olivia on Australia's Northwest Shelf", paper presented at the 1999 Offshore Technology Conference in Houston, Texas, 3-6 May, 1999.
Why did it take so long for the new record to be announced?
The instrument used to take the world record wind gust was funded by a private company, Chevron, and Chevron's data was not made available to forecasters at Australia's Bureau of Meteorology (BOM) during the storm. After the storm, the tropical cyclone experts at BOM were made aware of the data, but it was viewed as suspect, since the gusts were so extreme and the data was taken with equipment of unknown accuracy. Hence, the observations were not included in the post-storm report. Steve Buchan from RPS MetOcean believed in the accuracy of the observations, and coauthored a paper on the record gust, presented at the 1999 Offshore Technology Conference in Houston (Buchan et al., 1999). The data lay dormant until 2009, when Joe Courtney of the Australian Bureau of Meteorology was made aware of it. Courtney wrote up a report, coauthored with Steve Buchan, and presented this to the WMO extremes committee for ratification. The report has not been made public yet, and is awaiting approval by Chevron. The verified data will be released next month at a World Meteorological Organization meeting in Turkey, when the new world wind record will become official.
New Hampshire residents are not happy
Residents of New Hampshire are understandably not too happy about losing their cherished claim to fame. The current home page of the Mount Washington Observatory reads, "For once, the big news on Mount Washington isn't our extreme weather. Sadly, it's about how our extreme weather--our world record wind speed, to be exact--was outdone by that of a warm, tropical island".
Comparison with other wind records
Top wind in an Atlantic hurricane: 239 mph (107 m/s) at an altitude of 1400 meters, measured by dropsonde in Hurricane Isabel (2003).
Top surface wind in an Atlantic hurricane: 211 mph (94.4 m/s), Hurricane Gustav, Paso Real de San Diego meteorological station in the western Cuban province of Pinar del Rio, Cuba, on the afternoon of August 30, 2008.
Top wind in a tornado: 302 mph (135 m/s), measured via Doppler radar at an altitude of 100 meters (330 feet), in the Bridge Creek, Oklahoma tornado of May 3, 1999.
Top surface wind not associated with a tropical cyclone or tornado: 231 mph (103.3 m/s), April 12, 1934 on the summit of Mount Washington, New Hampshire.
Top wind in a typhoon: 191 mph (85.4 m/s) on the Taiwanese island of Lanyu, Super Typhoon Ryan, Sep 22, 1995; also on the island of Miyakojima, Super Typhoon Cora, Sep 5, 1966.
Top surface wind not measured on a mountain or in a tropical cyclone: 207 mph (92.5 m/s) measured in Greenland at Thule Air Force Base on March 6, 1972.
Top wind measured in a U.S. hurricane: 186 mph (83.1 m/s) measured at Blue Hill Observatory, Massachusetts, during the 1938 New England Hurricane.
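The mph and m/s figures in the list above are related by the exact definition 1 mph = 0.44704 m/s; a quick conversion script shows the pairings are consistent, with small differences due to rounding in the official figures.

```python
# Cross-check the mph and m/s figures quoted in the records above.
MPH_TO_MS = 0.44704  # exact definition: 1 mph = 0.44704 m/s

def mph_to_ms(mph: float) -> float:
    """Convert miles per hour to meters per second."""
    return mph * MPH_TO_MS

records_mph = {
    "Barrow Island, TC Olivia (1996)": 253,
    "Mount Washington (1934)": 231,
    "Bridge Creek tornado, Doppler (1999)": 302,
    "Thule AFB, Greenland (1972)": 207,
}
for name, mph in records_mph.items():
    print(f"{name}: {mph} mph = {mph_to_ms(mph):.1f} m/s")
```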
Buchan, S.J., P.G. Black, and R.L. Cohen, 1999, "The Impact of Tropical Cyclone Olivia on Australia's Northwest Shelf", paper presented at the 1999 Offshore Technology Conference in Houston, Texas, 3-6 May, 1999.
Black, P.G., Buchan, S.J., and R.L. Cohen, 1999, "The Tropical Cyclone Eyewall Mesovortex: A Physical Mechanism Explaining Extreme Peak Gust Occurrence in TC Olivia, 4 April 1996 on Barrow Island, Australia", paper presented at the 1999 Offshore Technology Conference in Houston, Texas, 3-6 May, 1999.
Source: http://www.wunderground.com/blog/JeffMasters/comment.html?entrynum=1420&page=7
July 18, 2012
Since the Industrial Revolution, ocean acidity has risen by 30 percent as a direct result of fossil-fuel burning and deforestation. And within the last 50 years, human industry has caused the world’s oceans to experience a sharp increase in acidity that rivals levels seen when ancient carbon cycles triggered mass extinctions, which took out more than 90 percent of the oceans’ species and more than 75 percent of terrestrial species.
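The "30 percent rise in acidity" refers to the hydrogen-ion concentration, and pH is its negative base-10 logarithm, so the equivalent pH change is easy to compute. (The 30 percent figure is from the article; the pH framing is standard chemistry, and the commonly cited drop from about pH 8.2 to 8.1 matches it.)

```python
import math

# A 30% increase in hydrogen-ion concentration expressed as a pH change.
# pH = -log10([H+]), so multiplying [H+] by 1.3 lowers pH by log10(1.3).
increase_factor = 1.30
delta_pH = math.log10(increase_factor)
print(f"pH drop for a 30% rise in [H+]: {delta_pH:.2f}")  # about 0.11
```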
Rising ocean acidity is now considered to be just as much of a formidable threat to the health of Earth’s environment as the atmospheric climate changes brought on by pumping out greenhouse gases. Scientists are now trying to understand what that means for the future survival of marine and terrestrial organisms.
In June, ScienceNOW reported that of the 35 billion metric tons of carbon dioxide released annually through fossil fuel use, one-third diffuses into the surface layer of the ocean. The effect those emissions will have on the biosphere is sobering: rising ocean acidity will upset the balance of marine life in the world's oceans and will subsequently affect the humans and animals who benefit from the oceans' food resources.
The damage to marine life is due in large part to the fact that higher acidity dissolves naturally-occurring calcium carbonate that many marine species–including plankton, sea urchins, shellfish and coral–use to construct their shells and external skeletons. Studies conducted off Arctic regions have shown that the combination of melting sea ice, atmospheric carbon dioxide and subsequently hotter, CO2-saturated surface waters has led to the undersaturation of calcium carbonate in ocean waters. The reduction in the amount of calcium carbonate in the ocean spells out disaster for the organisms that rely on those nutrients to build their protective shells and body structures.
Ocean acidity and calcium carbonate saturation are inversely related, which allows scientists to use the oceans' calcium carbonate saturation levels to measure just how acidic the waters are. In a study by the University of Hawaii at Manoa published earlier this year, researchers calculated that the level of calcium carbonate saturation in the world's oceans has fallen faster in the last 200 years than at any time in the last 21,000 years, signaling an extraordinary rise in ocean acidity to levels higher than would ever occur naturally.
The authors of the study continued on to say that currently only 50 percent of the world’s ocean waters are saturated with enough calcium carbonate to support coral reef growth and maintenance, but by 2100, that proportion is expected to drop to a mere five percent, putting most of the world’s beautiful and diverse coral reef habitats in danger.
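The saturation logic above can be made concrete with the saturation state Ω = [Ca²⁺][CO₃²⁻]/Ksp, which shell-building organisms depend on staying above 1. The concentrations and solubility product below are illustrative round numbers, not measurements from the studies cited:

```python
# Illustrative only (hypothetical round numbers): the carbonate
# saturation state Omega = [Ca2+][CO3 2-] / Ksp. Shells dissolve more
# easily as Omega falls toward 1; added CO2 converts carbonate ions to
# bicarbonate, lowering [CO3 2-] and therefore Omega.
def omega(ca_mol: float, co3_mol: float, ksp: float) -> float:
    return (ca_mol * co3_mol) / ksp

CA = 0.0103             # mol/kg; roughly constant in seawater
KSP_ARAGONITE = 6.7e-7  # illustrative solubility product

for co3 in (2.0e-4, 1.5e-4, 1.0e-4):  # falling carbonate-ion levels
    print(f"[CO3] = {co3:.1e} mol/kg -> Omega = {omega(CA, co3, KSP_ARAGONITE):.1f}")
```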
In the face of so much mounting and discouraging evidence that the oceans are on a trajectory toward irreparable damage to marine life, a new study offers hope that certain species may be able to adapt quickly enough to keep pace with the changing make-up of Earth's waters.
In a study published last week in the journal Nature Climate Change, researchers from the ARC Center of Excellence for Coral Reef Studies found that baby clownfish (Amphiprion melanopus) are able to cope with increased acidity if their parents also lived in more acidic water. That is a remarkable finding, because a study conducted last year on another clownfish species (Amphiprion percula) suggested acidic waters reduced the fish's sense of smell, making the fish likely to mistakenly swim toward predators.
But the new study will require further research to determine whether or not the adaptive abilities of the clownfish are also present in more environmentally-sensitive marine species.
While the news that at least some baby fish may be able to adapt to changes provides optimism, there is still much to learn about the process. It is unclear through what mechanism clownfish are able to pass along this trait to their offspring so quickly, evolutionarily speaking. Organisms capable of generation-to-generation adaptations could have an advantage in the coming decades, as anthropogenic emissions push Earth to non-natural extremes and place new stresses on the biosphere.
Source: http://blogs.smithsonianmag.com/science/2012/07/ocean-acidity-rivals-climate-change-as-environmental-threat/
Hot Weather Gets Scientists' Attention
Originally published on Wed July 11, 2012 5:30 am
RENEE MONTAGNE, HOST:
Across America people are sweltering through extreme heat this year, continuing a long-term trend of rising temperatures. Inevitably, many are wondering if the scorching heat is due to global warming. Scientists are expected to dig into the data and grapple with that in the months to come. They've already taken a stab at a possible connection with last year's extreme weather events, like the blistering drought in Texas. NPR's Richard Harris reports.
RICHARD HARRIS, BYLINE: Weather researchers from around the world are now taking stock of what happened in 2011. It was not the hottest year on record, but it was still in the top 15. Jessica Blunden from the National Climatic Data Center says 2011 had its own memorable characteristics.
JESSICA BLUNDEN: People may very well remember this year as a year of extreme weather and climate.
HARRIS: There were devastating droughts in Africa, Mexico, and Texas. In Thailand, massive flooding kept people's houses underwater for two months.
BLUNDEN: Here in the United States, we had one of our busiest and most destructive seasons on record in 2011. There were seven different tornado and severe weather outbreaks that each caused more than a billion dollars in damages.
HARRIS: So what's going on here? Federal climate scientist, Tom Karl, said one major feature of the global weather last year was a La Nina event. That's a period of cooler Pacific Ocean temperatures and it has effects around the globe, primarily in producing floods in some parts of the world and droughts in others.
TOM KARL: By no means did it explain all of the activity in 2011, but it certainly influenced a considerable part of the climate and weather.
HARRIS: Karl and Blunden are part of a huge multinational effort to sum up last year's weather and say what it all means. They provided an update by conference call. Clearly, long-term temperature trends are climbing as you'd expect as a result of global warming. Tom Peterson from the Federal Climate Data Center says the effort now is to look more closely at individual events.
TOM PETERSON: You've probably all heard the term you can't attribute any single event to global warming, and while that's true, the focus of the science now is evolving, moving on to how the probability of events is changing.
HARRIS: And there researchers report some progress. For example, last year's record-breaking drought in Texas wasn't simply the result of La Nina. Peter Stott from the British Meteorology Office says today's much warmer planet played a huge role as well, according to the study the group released on Tuesday.
PETER STOTT: The result that they find is really quite striking, in that they find that such a heat wave is now about 20 times more likely during a La Nina year than it was during the 1960s.
HARRIS: A second study found that an extraordinarily warm spell in London last November was 60 times more likely to occur on our warming planet than it would have been over the last 350 years. But that's not to say everything is related to climate change. There's no clear link between the spate of tornadoes and global warming, and the devastating floods in Thailand last year turned out to be the result of poor land-use practices.
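Statements like "20 times more likely" are risk ratios, and attribution studies often convert them to a fraction of attributable risk, FAR = 1 - 1/RR (a standard formula; the risk ratios below are the ones quoted in this transcript):

```python
# Fraction of attributable risk (FAR) implied by the risk ratios quoted
# above: FAR = 1 - 1/RR, the share of the event's probability
# attributable to the changed climate.
def far(risk_ratio: float) -> float:
    return 1.0 - 1.0 / risk_ratio

events = [
    ("Texas 2011 heat wave (vs. 1960s La Nina years)", 20),
    ("Warm November spell in London", 60),
]
for label, rr in events:
    print(f"{label}: RR = {rr} -> FAR = {far(rr):.3f}")
```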
Even so, Kate Willett of the British Weather Service says there is a global trend consistent with what scientists expect climate change to bring.
KATE WILLETT: So, in simple terms, we can say that the dry regions are getting drier and the wet regions are getting wetter.
HARRIS: This year's extreme events are different from last year's, but they all fit into a coherent picture of global change. Richard Harris, NPR News. Transcript provided by NPR, Copyright NPR.
Source: http://kacu.org/post/hot-weather-gets-scientists-attention
Climate change has already pushed the nation's wildlife into crisis, according to a report released Wednesday from the National Wildlife Federation (NWF), and further catastrophe, including widespread extinction, can only be curbed with swift action to curb the carbon pollution that has the planet sweltering.
Entitled Wildlife in a Warming World: Confronting the Climate Crisis, the report looks at 8 regions across the U.S. where "the underlying climatic conditions to which species have been accustomed for thousands of years," the report explains, have been upturned by human-caused climate change.
“Some of America’s most iconic species—from moose to sandhill cranes to sea turtles – are seeing their homes transformed by rapid climate change,” stated Dr. Amanda Staudt, climate scientist at the National Wildlife Federation.
Feb 15, 2013, Living on Earth: "Starving Polar Bears." Polar bears have long been the poster species for the problem of climate change. But a new paper in Conservation Letters argues that supplemental feeding may be necessary to prevent polar bear populations from going extinct. Polar bear expert Andrew Derocher from the University of Alberta joins host Steve Curwood to discuss how we can save the largest bear on the planet. http://www.loe.org/shows/segments.html?programID=13-P13-00007&segmentID=2
Pricing Carbon Emissions
A bill before Congress may prove a costly way to reduce greenhouse gases.
- Friday, June 5, 2009
- By Kevin Bullis
Experts are applauding a sweeping energy bill currently before the United States Congress, saying that it could lead to significant cuts in greenhouse-gas emissions and improve the likelihood of a comprehensive international agreement to cut greenhouse gases. "It's real climate-change legislation that's being taken seriously," says Gilbert Metcalf, a professor of economics at Tufts University. But many warn that the bill's market-based mechanisms and more conventional regulations could make these emissions reductions more expensive than they need to be.
The bill, officially called the American Clean Energy and Security Act of 2009, is also referred to as the Waxman-Markey Bill, after its sponsors, Henry Waxman (D-Ca.) and Edward Markey (D-Mass.). The legislation would establish a cap and trade system to reduce greenhouse gases, an approach favored by most economists over conventional regulatory approaches because it provides a great deal of flexibility in how emissions targets are met. But it also contains mandates that could significantly reduce the cost savings that the cap and trade approach is supposed to provide.
In a cap and trade system, the government sets a cap on total emissions of greenhouse gases from various industrial and utility sources, including power plants burning fossil fuels to generate electricity. It then issues allowances to polluters allowing them to emit carbon dioxide and other greenhouse gases; total emissions are meant to stay under the cap. Over a period of time, the government gradually reduces the cap and the number of allowances until it reaches its target. If companies' emissions exceed their allowances, they must buy more.
Economists like the system because companies can choose to either lower their emissions, such as by investing in new technology, or buy more allowances from the government or from companies that don't need them--whichever makes the best economic sense. It is meant to create a carbon market, putting a value on emissions.
In the proposed energy bill, the government will set caps to reduce greenhouse-gas emissions by 17 percent by 2020 (compared with 2005 levels) and by 80 percent by 2050--targets chosen to prevent the worst effects of climate change. Setting caps will make electricity more expensive, as companies turn to cleaner technologies to meet ever lower caps or have to spend money to buy allowances from others with lower emissions. But the bill has some provisions for cushioning the blow, especially at first. For one thing, it gives away most of the allowances rather than charging for them, and it also requires that any profits gained from these free allowances be passed on to electricity customers. It also allows companies to buy "offsets" that permit them to pay to reduce emissions outside the United States.
If the program is designed right, there are fewer allowances than the total emissions when the program starts. At first, when the caps are relatively easy to meet, the prices for allowances on the carbon market will be low. But eventually, they will get higher as the allowances become scarcer. In an ideal world, companies will predict what the price of the allowances will be, and plan accordingly.
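The bill's two stated targets pin down the end points of the cap trajectory; a minimal sketch, assuming (purely for illustration) a linear decline between milestones, looks like this:

```python
# Sketch of the emissions-cap trajectory implied by the bill's targets
# (17% below 2005 levels by 2020, 80% below by 2050). The linear decline
# between milestones is an assumption for illustration; the bill sets
# its own year-by-year schedule.
BASELINE_2005 = 100.0  # index 2005 emissions at 100

def cap(year: int) -> float:
    if year <= 2020:
        # interpolate from 100 in 2005 down to 83 in 2020
        return BASELINE_2005 - 17.0 * (year - 2005) / 15
    # interpolate from 83 in 2020 down to 20 in 2050
    return 83.0 - 63.0 * (year - 2020) / 30

for y in (2012, 2020, 2030, 2050):
    print(f"{y}: cap = {cap(y):.1f}% of 2005 emissions")
```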
Source: http://www.technologyreview.in/energy/22755/page1/
Green building facts
- Buildings consume 32% of the world’s resources including 12% of fresh water and 40% of the world’s energy (7).
- In Australia commercial buildings produce almost 9% of our national Greenhouse gas emissions (8).
- To make way for the new Law School the Edgeworth David building and the Stephen Roberts lecture theatre were demolished in 2006. Over 80% of the materials from these buildings were recycled including the valuable copper from the roof of the lecture theatre
The new Law School Building
7. “Environmentally Sustainable Buildings: Challenges and Policies” OECD (2003)
8. “Australia State of the Environment Report” Department of Environment & Heritage (2001)
Source: http://sydney.edu.au/facilities/sustainable_campus/buildings/index.shtml
The drought in Texas, during March, was the worst since 1895.
That is about the time my parents were born 120 years ago.
I never thought it could be worse than the drought of the 1950s, but it is. Drive out into grazing country where mesquite aren't too thick and all you can see is dry, cracked soil with an occasional fire ant or a gopher mound in the sandier soil.
Comparing the current drought with the seven-year drought in the 1950s, old-timers say the current drought sapped the soil of moisture faster than it did in the 1950s.
It just stopped raining last July, and pasture after pasture was hit by wildfires.
Right now, there is no potential to produce hay, harvest wheat or plant cotton or grain sorghum this May. Unless there is a week of rain fairly soon there is no hope for agriculture this year.
The Texas Ag Extension Service says that, despite a few recent showers in some areas, the cotton growing in Texas and Oklahoma is still in a drought. Any crop planted in southern Texas earlier in the year that got up out of the ground is now being sandblasted by hot, dry winds.
Wildfires have burned at least 1.5 million acres in the state since Jan. 1.
In addition to grazing losses, ranchers are facing rangeland stock water tanks that are dry or nearly dry. Streams are not flowing and lakes and big tanks are turning to deep mud.
Source: http://www.timesrecordnews.com/news/2011/may/01/drought-worst-since-1895/
Forest Ecosystems: Current Research
Regional Fire/Climate Relationships in the Pacific Northwest and Beyond
Fire exerts a strong influence on the structure and function of many terrestrial ecosystems. In forested ecosystems, the factors controlling the frequency, intensity, and size of fires are complex and operate at different spatial and temporal scales. Since climate strongly influences most of these factors (such as vegetation structure and fuel moisture), understanding the past and present relationships between climate and fire is essential to developing strategies for managing fire-prone ecosystems in an era of rapid climate change. The influence of climate change and climate variability on fire regimes and large fire events in the Pacific Northwest (PNW) and beyond is the focus of this project.
There is mounting evidence of a detectable relationship between extreme fire years in the West and Pacific Ocean circulation anomalies. The El Niño/Southern Oscillation (ENSO) influences fire in the Southwest (SW), and the Pacific Decadal Oscillation (PDO) appears to be related to fire in the PNW and Northern Rockies (NR). However, there are reasons to expect that the processes driving fire in these regions vary in their relative influence through time and across space.
- How regionally specific is the relationship between large fire events and precipitation/atmospheric anomalies associated with ENSO and PDO during the modern record?
- What do tree-ring and other paleo-records tell us about the temporal variability of the patterns of fire/climate relationships?
- How is climate change likely to influence climate/fire relationships given the demonstrated influences of climate variability?
Figure 1 A simple model of climate–fire-vegetation linkages. This project emphasizes the mechanisms and variability indicated by (1).
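For the first research question above, a common starting point is simply correlating annual burned area with a climate index such as the PDO. The series below are invented to demonstrate the calculation, not actual PNW fire statistics:

```python
import math

# Hypothetical ten-year series, made up purely to show the calculation:
# an annual PDO index and annual area burned (thousands of hectares).
pdo_index   = [-1.2, -0.5, 0.3, 1.1, 0.8, -0.9, 0.4, 1.5, -1.0, 0.2]
burned_area = [ 40,   55,  80, 120,  95,   50,  85, 150,   45,  70]

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

print(f"r(PDO, burned area) = {pearson_r(pdo_index, burned_area):.2f}")
```

A real analysis would also test whether the relationship is stationary across subperiods, which is exactly the variability the project's second question targets.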
For publications on climate impacts on PNW forest ecosystems, please see CIG Publications.
Gedalof, Z. 2002. Links between Pacific basin climatic variability and natural systems of the Pacific Northwest. PhD dissertation, School of Forestry, University of Washington, Seattle.
Littell, J.S. 2002. Determinants of fire regime variability in lower elevation forests of the northern greater Yellowstone ecosystem. M.S. Thesis, Big Sky Institute/Department of Land Resources and Environmental Sciences, Montana State University, Bozeman.
Mote, P.W., W.S. Keeton, and J.F. Franklin. 1999. Decadal variations in forest fire activity in the Pacific Northwest. In Proceedings of the 11th Conference on Applied Climatology, pp. 155-156, Boston, Massachusetts: American Meteorological Society.
Source: http://cses.washington.edu/cig/res/fe/fireclimate.shtml
“A remote Indian village is responding to global warming-induced water shortages by creating large masses of ice, or “artificial glaciers,” to get through the dry spring months. (See a map of the region.)
Located on the western edge of the Tibetan plateau, the village of Skara in the Ladakh region of India is not a common tourist destination.
“It’s beautiful, but really remote and difficult to get to,” said Amy Higgins, a graduate student at the Yale School of Forestry & Environmental Studies who worked on the artificial glacier project.
“A lot of people, when I met them in Delhi and I said I was going to Ladakh, they looked at me like I was going to the moon,” said Higgins, who is also a National Geographic grantee.
People in Skara and surrounding villages survive by growing crops such as barley for their own consumption and for sale in neighboring towns. In the past, water for the crops came from meltwater originating in glaciers high in the Himalaya.”
Read more: National Geographic
Source: http://peakwater.org/2012/02/artificial-glaciers-water-crops-in-indian-highlands/
Will the US Face Blackouts as Electricity Generation Suffers in Drought?
Well, it's official – the U.S. government has acknowledged that the U.S. is in the worst drought in over 50 years, that is, since December 1956, when about 58 percent of the contiguous U.S. was in moderate to extreme drought.
According to the National Oceanic and Atmospheric Administration National Climatic Data Center’s “State of the Climate Drought July 2012″ report, “Based on the Palmer Drought Index, severe to extreme drought affected about 38 percent of the contiguous United States as of the end of July 2012, an increase of about 5 percent from last month… About 57 percent of the contiguous U.S. fell in the moderate to extreme drought categories (based on the Palmer Drought Index) at the end of July… According to the weekly U.S. Drought Monitor, about 63 percent of the contiguous U.S. (about 53 percent of the U.S. including Alaska, Hawaii, and Puerto Rico) was classified as experiencing moderate to exceptional (D1-D4) drought at the end of July.”
Much business writing on the effects of the drought has focused on its agricultural aspects. To give but one example, the hottest, driest summer since 1936 scorching the Midwest has diminished projected corn and soybean crop yields in the U.S. for a third straight year, to their lowest levels in nine years. Accordingly, the price of a bushel of corn has jumped 62 percent since 15 June, and soybeans gained 32 percent in the same period.
But as consumers fret about the inevitable rise in food prices to come, the drought is unveiling another, darker threat to the American lifestyle, as it is now threatening U.S. electricity supplies.
Virtually all power plants, whether nuclear, coal- or natural gas-fired, are completely dependent on water for cooling, and hydroelectric plants require continuous water flow to operate their turbines. Given the drought, many facilities are overheating, and utilities are shutting them down or running their plants at lower capacity. Few Americans know (or up to this point have cared) that the country’s power plants account for about half of all the water used in the United States. For every gallon of residential water used in the average U.S. household, five times more is used to provide that home with electricity via hydropower turbines and fossil fuel power plants – roughly 40,000 gallons each month.
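The 5-to-1 ratio and the 40,000-gallon figure quoted above imply a direct household use number. A quick sketch of that arithmetic (the derived tap-water figure is an inference from the article's numbers, not a separately sourced statistic):

```python
# Household water arithmetic implied by the article's figures.
# The 5:1 ratio and the ~40,000 gal/month figure come from the article;
# the direct-use figure below is derived from them.
POWER_TO_TAP_RATIO = 5            # gallons used for electricity per gallon from the tap
monthly_power_water_gal = 40_000  # water used to generate one home's electricity, per month

# Implied direct residential use per month
monthly_tap_water_gal = monthly_power_water_gal / POWER_TO_TAP_RATIO
print(f"Implied direct household use: {monthly_tap_water_gal:,.0f} gal/month")  # 8,000
```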
Michael Webber, associate director of the Center for International Energy and Environmental Policy at the University of Texas at Austin, is under no such illusions, stating that the summer’s record high heat and drought have worked together to overtax the nation’s electrical grid, adding that families use more water to power their homes than they use from their tap. Webber said, “In summer you often get a double whammy. People want their air-conditioning and drought gets worse. You have more demand for electricity and less water available to produce it. That is what we are seeing in the Midwest right now, power plants on the edge.”
In July U.S. nuclear-power production hit its lowest seasonal levels in nine years as drought and heat forced nuclear power plants from Ohio to Vermont to slow output. Nuclear Regulatory Commission spokesman David McIntyre explained, “Heat is the main issue, because if the river is getting warmer the water going into the plant is warmer and makes it harder to cool. If the water gets too warm, you have to dial back production. That’s for reactor safety, and also to regulate the temperature of discharge water, which affects aquatic life.”
Nuclear is the thirstiest power source. According to the National Energy Technology Laboratory (NETL) in Morgantown, West Virginia, a nuclear power plant (NPP) generating 12.2 million megawatt hours of electricity requires far more water to cool its turbines than other power plants. NPPs need 2,725 liters of water per megawatt hour for cooling; coal and natural gas plants need, on average, only 1,890 and 719 liters respectively to produce the same amount of energy.
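Scaling those per-megawatt-hour figures to the example plant output gives a sense of annual cooling-water demand. The calculation below uses only the numbers quoted above:

```python
# Scale the NETL per-MWh cooling-water figures to the article's example
# plant output (12.2 million MWh/year). All numbers are from the article.
LITERS_PER_MWH = {"nuclear": 2725, "coal": 1890, "natural gas": 719}
ANNUAL_MWH = 12_200_000

annual_water = {fuel: rate * ANNUAL_MWH for fuel, rate in LITERS_PER_MWH.items()}
for fuel, liters in annual_water.items():
    print(f"{fuel:>11}: {liters / 1e9:5.1f} billion liters/year")

# Per MWh, nuclear needs roughly 1.4x the water of coal and 3.8x that of gas
ratio_vs_gas = LITERS_PER_MWH["nuclear"] / LITERS_PER_MWH["natural gas"]
```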
And oh, the National Weather Service Climate Prediction Center, in its 16 August “U.S. Seasonal Drought Outlook,” wrote, “The Drought Outlook valid through the end of November 2012 indicates drought conditions will remain essentially unchanged in large sections of the central Mississippi Valley, the central and southwestern Great Plains, most of the High Plains, the central Rockies, the Great Basin, and parts of the Far West…” The lack of rain and the incessant heat have also increased the need for irrigation water for farming, meaning increasing competition between the agricultural and power-generation sectors for the same shrinking water “pool.”
But, every cloud has a silver lining. California’s Pacific Gas and Electric Co. utility, commonly known as PG&E, which provides natural gas and electricity to most of the northern two-thirds of California, from Bakersfield almost to the Oregon border, is on the case. PG&E has informed its customers that its “Diablo Canyon (nuclear) Power Plant, the largest source of generation in the utility’s service area, is cooled by ocean water, not by rivers that could dry up.”
Never mind the fact that by the time the Diablo Canyon NPP was completed in 1973, engineers discovered that it was only several miles away from the Hosgri seismic fault, which had produced a 7.1 magnitude earthquake on 4 November 1927.
But ocean water as a coolant is not necessarily the answer either.
On 12 August Dominion Resources’ Millstone NPP, situated on Connecticut’s Niantic Bay on Long Island Sound, was forced to shut down one of two reactor units because seawater used to cool down the plant was too warm, averaging 1.7 degrees above the NRC limit of 75 degrees Fahrenheit. The Millstone NPP, which provides half of all power used in Connecticut and 12 percent in New England, was only restarted twelve days later.
The federal government is hardly known for its scaremongering tactics, but it would seem that Mother Nature is forcing Americans to belatedly consider making some lifestyle changes, as the choice seems to be devolving into energy conservation, turning down the air conditioner and digging deeper into the wallet for food costs.
It might also be time for serious national discussion about renewable energy, including wind and solar.
If the sun stops shining, all bets are off.
By John C.K. Daly of Oilprice.com
Source: http://www.solarthermalmagazine.com/2012/10/05/will-the-us-face-blackouts-as-electricity-generation-suffers-in-drought/
Sea ice is frozen seawater that floats on the ocean surface. Blanketing millions of square kilometers, sea ice forms and melts with the polar seasons, affecting both human activity and biological habitat. In the Arctic, some sea ice persists year after year, whereas almost all Southern Ocean or Antarctic sea ice is "seasonal ice," meaning it melts away and reforms annually. While both Arctic and Antarctic ice are of vital importance to the marine mammals and birds for which they are habitats, sea ice in the Arctic appears to play a more crucial role in regulating climate.
Because they are composed of ice originating from glaciers, icebergs are not considered sea ice. Most of the icebergs infesting North Atlantic shipping lanes originate from Greenland glaciers.
Global Sea Ice Extent and Concentration: What sensors on satellites are telling us about sea ice
Sea ice regulates exchanges of heat, moisture and salinity in the polar oceans. It insulates the relatively warm ocean water from the cold polar atmosphere except where cracks, or leads, in the ice allow exchange of heat and water vapor from ocean to atmosphere in winter. The number of leads determines where and how much heat and water are lost to the atmosphere, which may affect local cloud cover and precipitation.
The seasonal sea ice cycle affects both human activities and biological habitats. For example, companies shipping raw materials such as oil or coal out of the Arctic must work quickly during periods of low ice concentration, navigating their ships towards openings in the ice and away from treacherous multi-year ice that has accumulated over several years. Many arctic mammals, such as polar bears, seals, and walruses, depend on the sea ice for their habitat. These species hunt, feed, and breed on the ice. Studies of polar bear populations indicate that declining sea ice is likely to decrease polar bear numbers, perhaps substantially (Stirling and Parkinson 2006).
Ice thickness, its spatial extent, and the fraction of open water within the ice pack can vary rapidly and profoundly in response to weather and climate. Sea ice typically covers about 14 to 16 million square kilometers in late winter in the Arctic and 17 to 20 million square kilometers in the Antarctic Southern Ocean. The seasonal decrease is much larger in the Antarctic, with only about three to four million square kilometers remaining at summer's end, compared to approximately seven to nine million square kilometers in the Arctic. These maps provide examples of late winter and late summer ice cover in the two hemispheres.
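Using the midpoints of the extent ranges quoted above (an illustrative simplification, not a calculation from actual data), the seasonal loss works out to roughly half of the Arctic's winter ice but over four-fifths of the Antarctic's:

```python
# Seasonal ice loss implied by the extent ranges above, using range midpoints
# (an illustrative simplification; units are millions of square kilometers).
arctic    = {"winter": (14 + 16) / 2, "summer": (7 + 9) / 2}
antarctic = {"winter": (17 + 20) / 2, "summer": (3 + 4) / 2}

def seasonal_loss_pct(region):
    """Percent of winter ice extent lost by late summer."""
    return 100 * (region["winter"] - region["summer"]) / region["winter"]

print(f"Arctic:    ~{seasonal_loss_pct(arctic):.0f}% of winter extent melts away")
print(f"Antarctic: ~{seasonal_loss_pct(antarctic):.0f}% of winter extent melts away")
```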
Monitoring sea ice
Passive microwave satellite data represent the best method to monitor sea ice because the sensors can observe the surface through most clouds and during darkness. These data allow scientists to monitor the inter-annual variations and trends in sea ice cover. Observations of polar oceans derived from these instruments are essential for tracking the ice edge, estimating sea ice concentrations, and classifying sea ice types. In addition to the practical use of this information for shipping and transport, these data add to the meteorological knowledge base required for better understanding climate.
Decline in Arctic sea ice extent
Passive microwave satellite data reveal that, since 1979, winter Arctic ice extent has decreased about 3.6 percent per decade (Meier et al. 2006). Antarctic ice extent is increasing (Cavalieri et al. 2003), but the trend is small.
Satellite data from the SMMR and SSM/I instruments have been combined with earlier observations from ice charts and other sources to yield a time series of Arctic ice extent from the early 1900s onward. While the pre-satellite records are not as reliable, their trends are in good general agreement with the satellite record and indicate that Arctic sea ice extent has been declining since at least the early 1950s.
In recent years, satellite data have indicated an even more dramatic reduction in regional ice cover. In September 2002, sea ice in the Arctic reached a record minimum (Serreze et al. 2003), 4 percent lower than any previous September since 1978, and 14 percent lower than the 1979-2000 mean. In the past, a low ice year would be followed by a rebound to near-normal conditions, but 2002 was followed by two more low-ice years, both of which almost matched the 2002 record. Taking these three years into account, the September ice extent trend for 1979-2004 declined by 7.7 percent per decade (Stroeve et al. 2005). The year 2005 set a new record, dropping the estimated decline in end-of-summer Arctic sea ice to approximately 8 percent per decade. Although sea ice did not set a new record low in 2006, it did fall below normal for the fifth consecutive year. In 2007, sea ice broke all prior satellite records, reaching a record low a month before the end of melt season. Through 2007, the September decline trend is now over 10 percent per decade. (For current sea ice trends, visit NSIDC's Sea Ice Index Cryospheric Climate Indicators.)
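The "percent per decade" figures quoted above are derived from regression over the full satellite record. A simplified two-endpoint version of the same calculation, with illustrative numbers rather than the actual data, might look like:

```python
# Simplified "percent per decade" trend from two endpoints. The real figures
# quoted above come from regression over the full satellite series; the
# extents below are illustrative, not actual data.
def pct_per_decade(extent_start, extent_end, years):
    total_pct_change = 100 * (extent_end - extent_start) / extent_start
    return total_pct_change / (years / 10)

# e.g. a September extent falling from 7.0 to 5.6 million km^2 over 25 years
trend = pct_per_decade(7.0, 5.6, 25)
print(f"Trend: {trend:.1f}% per decade")  # -8.0% per decade
```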
Combined with record low summertime extent, Arctic sea ice exhibited a new pattern of poor winter recovery (see Arctic Sea Ice Decline Continues). Although wintertime recovery of Arctic sea ice improved somewhat after 2006, wintertime extents have remained well below the long-term average.
Decline in Arctic Sea Ice Thickness
Sea ice thickness has likewise shown substantial decline in recent decades (Rothrock et al. 1999). Using data from submarine cruises, Rothrock and collaborators determined that the mean ice draft at the end of the melt season in the Arctic has decreased by about 1.3 meters between the 1950s and the 1990s.
According to estimates based on measurements taken by NASA's ICESat laser altimeter, first-year ice that formed after the autumn of 2007 had a mean thickness of 1.6 meters. The ice formed relatively late in the autumn of 2007, and NSIDC researchers had anticipated that this first-year ice would be thinner, but it nearly equaled the thickness of the first-year ice of 2006 and 2007. Snow accumulation on sea ice helps insulate the ice from frigid air overhead, so sparse snowfall during the winter of 2007-2008 might have actually accelerated the sea ice's growth.
Greenhouse gases emitted through human activities and the resulting increase in global mean temperatures are the most likely underlying cause of the sea ice decline, but the direct cause is a complicated combination of factors resulting from the warming, and from climate variability. The Arctic Oscillation (AO) is a see-saw pattern of alternating atmospheric pressure at polar and mid-latitudes. The positive phase produces a strong polar vortex, with the mid-latitude jet stream shifted northward. The negative phase produces the opposite conditions. From the 1950s to the 1980s, the AO flipped between positive and negative phases, but it entered a strong positive pattern between 1989 and 1995. So the acceleration in the sea ice decline since the mid 1990s may have been partly triggered by the strongly positive AO mode during the preceding years (Rigor et al. 2002 and Rigor and Wallace 2004) that flushed older, thicker ice out of the Arctic, but other factors also played a role.
Since the mid-1990s, the AO has largely been in a neutral or negative phase, and the late 1990s and early 2000s brought a weakening of the Beaufort Gyre. However, the longevity of ice in the gyre began to change as a result of warming along the Alaskan and Siberian coasts. In the past, sea ice in this gyre could remain in the Arctic for many years, thickening over time. Beginning in the late 1990s, sea ice began melting in the southern arm of the gyre, thanks to warmer air temperatures and more extensive summer melt north of Alaska and Siberia. Moreover, ice movement out of the Arctic through Fram Strait continued at a high rate despite the change in the AO. Thus warming conditions and wind patterns have been the main drivers of the steeper decline since the late 1990s. Sea ice may not be able to recover under the current persistently warm conditions, and a tipping point may have been passed where the Arctic will eventually be ice-free during at least part of the summer (Lindsay and Zhang 2005).
Examination of the long-term satellite record dating back to 1979 and earlier records dating back to the 1950s indicate that spring melt seasons have started earlier and continued for a longer period throughout the year (Serreze et al. 2007). Even more disquieting, comparison of actual Arctic sea ice decline to IPCC AR4 projections show that observed ice loss is faster than any of the IPCC AR4 models have predicted (Stroeve et al. 2007).
Disclaimer: This article is taken wholly from, or contains information that was originally published by, the National Snow and Ice Data Center. Topic editors and authors for the Encyclopedia of Earth may have edited its content or added new information. The use of information from the National Snow and Ice Data Center should not be construed as support for or endorsement by that organization for any new information added by EoE personnel, or for any editing of the original content.
Source: http://www.eoearth.org/article/Climate_change_and_sea_ice
Unfortunately, the modern buildings we live and work in rival cars and factories as sources of harm to the environment, contributing to deforestation, global warming, overuse of water and energy and carbon dioxide emissions.
Sustainable building refers to those buildings that are built to have the least impact on the natural environment, both in terms of the building itself, its immediate surroundings and the broader global setting. To construct in a sustainable way, some basic rules need to be followed: (a) minimization of non-renewable resource consumption; (b) enhancement of the natural environment; and (c) elimination or minimization of toxic emissions.
Almost every step of the green building process is heavily focused on how building elements fit together to optimize efficiency and sustainability.
Sustainable development marries two important themes:
1. Environmental protection does not preclude economic development.
2. Economic development must be ecologically viable now and in the long term.
“Sustainable design” involves the planning and development of projects in a manner that minimizes impact on natural resources, such as water and energy. There are many aspects to the sustainable process, one of which involves “LEED” principles – Leadership in Energy and Environmental Design, with standards for selecting materials and designing facilities established by the U.S. Green Building Council. Zurn strongly encourages organizations to consider including cost-effective and environmentally friendly practices in the design, construction and retrofit of buildings and facilities. In this way, your buildings and facilities not only exemplify your care for the environment and the well-being of the community that you serve, but they also decrease facility operating and maintenance costs.
Source: http://www.zurn.com/Pages/SustainabilityProcess.aspx
There are nearly two million known species on the planet. But many of those won't be around much longer; one out of every eight known bird species, one in four mammal species, and one in three amphibian species are at risk for extinction, according to the World Conservation Union (IUCN), which maintains the Red List, a catalog of the world's species classified according to their risk of extinction.
"It's supposed to inform conservation practice, to be a wake-up call for the extinctions that are happening," says Caroline Pollock, a program officer with the Red List unit. Animals that are classified as "critically endangered" are at the highest risk--their numbers in the wild may be extraordinarily low or their territories incredibly small. "It is possible to bring them back," Pollock says, "but it is quite work-intensive and financially expensive." Here, a look at five species on the brink.
Native to Spain and Portugal, there are fewer than 250 of these felines left in the wild. Habitat destruction has been a major cause of its decline as agriculture spreads through its homeland. Additionally, disease has claimed a large percentage of the region's rabbits, one of the lynx's primary food sources. Intensive captive breeding programs are currently underway to help save the lynx, Pollock says. If they do disappear, the lynx will be the first wild cat to go extinct in more than 2,000 years.
The wild population of these frogs has declined more than 80 percent in the last decade. The plummeting numbers of the frogs, which are endemic to Panama, is largely a result of chytridiomycosis, an infectious fungal disease that seems to be causing mass amphibian die-offs. The disease is still spreading, and deforestation is adding to the pressures faced by the frogs. Though there are captive-breeding programs in place for these amphibians, they will not be released into the wild until conditions improve.
Fewer than 100 of these birds, which are confined to one small island in Cape Verde, remain in the wild. The birds have been threatened by drought and increasing desertification on the island, conditions that may worsen as a result of global climate change. Because they build their nests on the ground, they also face risks from cats, dogs, and rats that have been introduced to the island.
Only 34 of these trees, native to Mexico, remain. The plants have a low rate of pollination--and don't reach maturity until they are approximately 25 years old--and are also profoundly threatened by agriculture. One tree was cut down in 2006 to expand farmland, and insecticides decrease the number of pollinators available to help the trees spread. Human-caused fires have also destroyed or damaged a number of these plants.
It could already be too late for the Yangtze River dolphin, or baiji. There has not been a documented sighting of these cetaceans, which lived in China's Yangtze River and nearby lakes, since 2002. A search for the dolphin--and the signature sounds that they make--was conducted in late 2006 but turned up no evidence of the mammals. However, further surveys are still needed to determine whether the dolphins truly have disappeared forever. The baiji's population decline is due, in large part, to the development of Chinese waterways and the expansion of commercial fishing.
Read more on helping endangered species by breeding captive animals in DISCOVER's Recall of the Wild
Source: http://discovermagazine.com/galleries/zen-photo/e/endangered-species
Since 1993, RAN’s Protect-an-Acre program (PAA) has distributed more than one million dollars in grants to more than 150 frontline communities, Indigenous-led organizations, and allies, helping their efforts to secure protection for millions of acres of traditional territory in forests around the world.
Rainforest Action Network believes that Indigenous peoples are the best stewards of the world’s rainforests and that frontline communities organizing against the extraction and burning of dirty fossil fuels deserve the strongest support we can offer. RAN established the Protect-an-Acre program to protect the world’s forests and the rights of their inhabitants by providing financial aid to traditionally under-funded organizations and communities in forest regions.
Indigenous and frontline communities suffer disproportionate impacts to their health, livelihood and culture from extractive industry mega-projects and the effects of global climate change. That’s why Protect-an-Acre provides small grants to community-based organizations, Indigenous federations and small NGOs that are fighting to protect millions of acres of forest and keep millions of tons of CO2 in the ground.
Our grants support organizations and communities that are working to regain control of and sustainably manage their traditional territories through land title initiatives, community education, development of sustainable economic alternatives, and grassroots resistance to destructive industrial activities.
PAA is an alternative to “buy-an-acre” programs that seek to provide rainforest protection by buying tracts of land, but which often fail to address the needs or rights of local Indigenous peoples. Uninhabited forest areas often go unprotected, even if purchased through a buy-an-acre program. It is not uncommon for loggers, oil and gas companies, cattle ranchers, and miners to illegally extract resources from so-called “protected” areas.
Traditional forest communities are often the best stewards of the land because their way of life depends upon the health of their environment. A number of recent studies add to the growing body of evidence that Indigenous peoples are better protectors of their forests than governments or industry.
Based on the success of Protect-an-Acre, RAN launched The Climate Action Fund (CAF) in 2009 as a way to direct further resources and support to frontline communities and Indigenous peoples challenging the fossil fuel industry.
Additionally, RAN has been a Global Advisor to Global Greengrants Fund (GGF) since 1995, identifying recipients for small grants to mobilize resources for global environmental sustainability and social justice using the same priority and criteria as we use for PAA and CAF.
Through these three programs each year we support grassroots projects that result in at least:
Source: http://ran.org/protect-an-acre
Walking and cycling have long been considered the most environmentally sound methods of getting around. They still are but some environmentalists have argued that food production has become so fossil-fuel intensive that driving could be considered greener than walking (though the analysis has been debunked as flawed).
What of other, more obviously polluting, modes of transport? The data below gives an idea of how your carbon footprint might grow depending on how you make a journey. If you were to take an average domestic flight rather than a high-speed electric train, you'd be personally responsible for 29 times as much carbon dioxide.
The data also highlights how the UK government's plans to electrify parts of the rail network could cut emissions. Diesel trains are responsible for more greenhouse gases than electric trains, even taking into account Britain's carbon-heavy electricity production.
On the roads, next-generation hybrid and electric vehicles can help those of us behind the wheel to be that little bit greener. However, no journey is completely carbon free.
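A per-passenger comparison in the spirit of the datablog can be sketched as follows. The emission factors (grams of CO2 per passenger-kilometre) are illustrative assumptions, not the Guardian's dataset, so the resulting flight-versus-rail ratio differs from the 29x figure quoted above:

```python
# Per-passenger emissions comparison. The g CO2 per passenger-km factors
# below are illustrative assumptions, NOT the Guardian's dataset, so the
# flight/train ratio here differs from the article's 29x figure.
G_CO2_PER_PKM = {
    "domestic flight": 250,
    "diesel train": 90,
    "high-speed electric train": 10,
}

journey_km = 500
for mode, factor in G_CO2_PER_PKM.items():
    kg = factor * journey_km / 1000
    print(f"{mode:>25}: {kg:6.1f} kg CO2 over {journey_km} km")

flight_vs_rail = G_CO2_PER_PKM["domestic flight"] / G_CO2_PER_PKM["high-speed electric train"]
```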
Source: http://www.guardian.co.uk/environment/datablog/2009/sep/02/carbon-emissions-per-transport-type
But Keller and her colleagues say their research proves otherwise.
Keller has studied the Chicxulub site and other impact-crater sites around the world for the past decade. She believes that the asteroid impact behind Chicxulub coincided with a "time of massive volcanism, which led to greenhouse warming."
Keller says those three events – the Chicxulub asteroid impact, volcanism, and climate change – "led to high biotic stress and caused the decline of many tropical species populations," but not mass extinctions. That die-off didn't occur until later. However, Keller does believe that the initial confluence of volcanic activity, global warming, and the Chicxulub asteroid impact ultimately contributed to the mass extinction.
Key to Keller's assertions is a 20-inch-thick (50-centimeter-thick) layer of limestone found between the K-T boundary and the impact breccia, or molten lava and rocky debris, laid down when the Chicxulub asteroid collided with Earth.
Keller and her colleagues believe that the thickness of the limestone layer – a type of sedimentary rock characteristically formed under large bodies of water like oceans, seas, and lakes – indicates that it accumulated in the crater over some 300,000 years after the impact. As proof, Keller points to fossils of microscopic organisms called foraminifera and fossil burrows present in the limestone layer.
According to Keller, those fossils indicate the sediment was deposited after the asteroid impact but before the period of mass extinction that marked the end of the Cretaceous.
Many other scientists disagree with that interpretation, however. They say the layer of fossil-rich limestone was deposited quickly as backwash and infill caused by a huge tsunami that followed the Chicxulub asteroid's impact with Earth. The layer, they say, did not take 300,000 years to accumulate.
In her defense, Keller says the quick-accumulation theory is unsupported by evidence that would have been found during her analysis of core samples gathered at Chicxulub and 45 localities in northeast Mexico.
But Alan Hildebrand, a proponent of the quick-accumulation theory, says the burrows were "made by organisms digging after the fireball layer was deposited."
Thomas R. Holtz, Jr., a vertebrate paleontologist at the University of Maryland in College Park, supports the view that the limestone was quickly laid down as crater infill. He said he is not surprised that Cretaceous fossils were found in the limestone layer.
"If an asteroid clobbered the Eastern seaboard of the U.S. today, I would expect that most of the infilling would be Chevys and Hondas and shopping malls and houses and cows and McDonald's burger wrappers," Holtz said. "Only a tiny bit might be mastodons and Clovis points and Miocene whales." In other words, the crater would quickly fill with objects common on Earth at the time of impact.
So where do researchers in the Keller camp look next for the possible K-T crater? Keller says she's unsure, although "some scientists have suggested it could be a structure called Shiva, in India. We have no convincing evidence so far that this is the case."
Source: http://news.nationalgeographic.com/news/2004/03/0309_040309_chicxulubdinos_2.html
1. Reduce our personal carbon footprint by 20 percent in the next year.
2. Stop subsidizing fossil fuels.
3. Mitigate politics and polarization.+
4. Shift towards more vegetarian diets.
5. Scientists better communicate the scientific facts underlying climate change.+
6. Scientists and engineers develop cheap alternative energy sources to reduce dependence on fossil fuels.+
7. Reduce waste water treatment costs.*
8. Reduce costs to absorb CO2 from industrial activities.*
9. Manage the timing, magnitude, and speed of reservoir drawdowns in order to mitigate methane releases to the atmosphere.^
+ http://www.newswise.com/articles/view/591970/ ...
* http://www.typicallyspanish.com/news/publish/ ...
Source: http://www.topix.net/forum/news/drought/TUGBRVERP7JUKITNE
This page lists climate science and climate impact claims that have either not been proven, or have been modified, moved, or expanded to protect the claimant from having to admit the original claim was wrong.
This will always be a work in progress. New items will be added as they are examined and will include:
- The claim itself – what was stated as factual or predicted? A clear unambiguous statement, such as “50 million climate refugees by 2010”
- Proof of the original claim – website, documents, photos, audio, video that clearly and unambiguously show the claim being made sometime in the past.
- A test of the claim, and the results – website, documents, photos, audio, video that clearly and unambiguously show the claim not coming true or not meeting the claim.
- Proof of change in the claim (if applicable) – often, when the claim fails to materialize, goalposts get moved, as we saw with the “50 million climate refugees” story that was originally given a due date of 2010 and is now set for the year 2020.
The Claim: 50 million climate refugees will be produced by climate change by the year 2010. Especially hard hit will be river delta areas and low-lying islands in the Caribbean and Pacific. The UN 62nd General Assembly in July 2008 said: …it had been estimated that there would be between 50 million and 200 million environmental migrants by 2010.
The Test: Did population go down in these areas during that period, indicating climate refugees were on the move? The answer, no.
The Proof: Population actually increased in some Caribbean islands for which 2010 census figures were available. Then, when challenged on these figures, the UN tried to hide the original claim from view. See: The UN “disappears” 50 million climate refugees, then botches the disappearing attempt
The Change in claim: Now it is claimed that it will be 10 years into the future, and there will be 50 million refugees by the year 2020.
Source: http://wattsupwiththat.com/climate-fail-files/?like=1&source=post_flair&_wpnonce=8ef4095fcf
The Coca-Cola System Announces New Global Targets for Water Conservation and Climate Protection in Partnership With WWF
The Coca-Cola Company, in partnership with World Wildlife Fund (WWF), today announced ambitious new targets to improve water efficiency and reduce carbon emissions within its system-wide operations, while promoting sustainable agricultural practices and helping to conserve the world’s most important freshwater basins.
“Our sustainability as a business demands a relentless focus on efficiency in our use of natural resources. These performance targets are one way we are engaging to improve our management of water and energy,” said Muhtar Kent, president and CEO of The Coca-Cola Company.
“In this resource constrained world, successful businesses will find ways to achieve growth while using fewer resources,” said Carter Roberts, president and CEO of WWF-US. “The Coca-Cola Company’s commitment to conservation responds to the imperative to solve the global water and climate crisis.”
The partnership, announced by WWF and The Coca-Cola Company in 2007 with $20 million in funding, has now been extended an additional two years (through 2012) with the Company providing $3.75 million in new funding.
The Coca-Cola Company also joined WWF’s Climate Savers program, in which leading corporations from around the world work with WWF to dramatically reduce their greenhouse gas emissions. By 2010, Climate Savers companies will collectively cut carbon emissions by 14 million tons annually – the equivalent of taking more than 3 million cars off the road each year.
Water Efficiency — Saving 50 billion liters in 2012
The Coca-Cola system will improve its water efficiency by 20 percent by 2012, compared with a 2004 baseline. While water use is expected to increase as the business grows, this water efficiency target will eliminate approximately 50 billion liters of that increase in 2012.
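The stated target can be sanity-checked with simple arithmetic. The sketch below is illustrative only: the production volume and the 2004 water-use ratio are assumed numbers chosen to reproduce the release’s roughly 50-billion-litre figure, not values published by the company.

```python
# Illustrative arithmetic only: the volume and ratio below are assumptions,
# not figures from the press release.
def water_saved(volume_l, ratio_2004, efficiency_gain):
    """Water avoided in a year, given beverage production volume (litres),
    the 2004 water-use ratio (litres of water per litre of beverage), and
    the fractional efficiency improvement."""
    ratio_new = ratio_2004 * (1 - efficiency_gain)
    return volume_l * (ratio_2004 - ratio_new)

# Assumed 2012 volume of 100 billion litres and a 2004 ratio of 2.5 L/L:
saved = water_saved(100e9, 2.5, 0.20)
print(saved)  # ~50 billion litres avoided relative to 2004 efficiency
```

With these assumed inputs, a 20 percent efficiency gain yields the order of magnitude the release quotes.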
To support this efficiency target, The Coca-Cola Company and WWF have developed a Water Efficiency Toolkit to help reduce water consumption within bottling plants. This software-based instruction manual has been distributed to managers and operators throughout the Coca-Cola system, providing strategies to shrink the water footprint of their operations.
Climate Protection — Preventing 2 million tons of CO2 emissions
The Company has set two emissions reduction targets: 1) to grow the business, not the carbon, system-wide; and 2) to achieve a 5 percent absolute reduction in Annex 1 (developed) countries. Both targets apply to manufacturing operations in 2015, compared with a 2004 baseline.
The Coca-Cola Company and its bottlers anticipate substantial volume growth globally during this period, so growing the business without growing the carbon is a significant commitment. Without intervention, emissions would grow in proportion to volume and reach 7.3 million metric tons in 2015. Thus, the global commitment will prevent the release of more than 2 million metric tons of CO2 in 2015 – the equivalent of planting 600,000 acres of trees.
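The avoided-emissions figure follows from holding 2015 emissions at the 2004 level while volume grows. A minimal check, where the 2004 baseline of 5.3 million metric tons is an assumption inferred from the two figures the release does give (7.3 Mt business-as-usual, more than 2 Mt avoided):

```python
bau_2015_mt = 7.3        # business-as-usual 2015 emissions (from the release)
baseline_2004_mt = 5.3   # assumed 2004 baseline implied by the stated figures
avoided_mt = bau_2015_mt - baseline_2004_mt
print(avoided_mt)  # ~2.0 Mt CO2 avoided if emissions are held at the 2004 level
```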
Supply Chain Sustainability
The Coca-Cola Company also will work with WWF to promote more sustainable agricultural practices in an effort to reduce the impact of its supply chain on water resources. This work will initially focus on sugarcane production. The Coca-Cola Company and WWF are working with the Better Sugarcane Initiative to establish standards, evaluate suppliers and set goals for the purchase of sugar. The Coca-Cola Company will identify two additional commodities on which to work in 2009.
The Coca-Cola system and WWF are working together to conserve some of the world’s most important freshwater resources, including the Yangtze, Mekong, Danube, Rio Grande/Rio Bravo, Lakes Niassa and Chiuta, the Mesoamerican Reef catchments, and the rivers and streams in the southeastern region of the United States. More than a dozen production plants and/or bottlers in the areas surrounding these rivers are developing and implementing water stewardship plans to serve as models throughout the Coca-Cola system.
“Water and energy conservation are areas where we can truly make a difference. Last year, we set a goal to return to communities and to nature an amount of water equal to what we use in our beverages and their production. These targets support our work to achieve that goal,” said Kent. “The expansion of our partnership with WWF demonstrates our shared dedication to achieving large-scale results, and a grounded understanding that collaboration is key if we are to help address the world’s water challenges.”
To learn more about the partnership, please visit www.thecoca-colacompany.com or www.worldwildlife.org.
About The Coca-Cola Company
The Coca-Cola Company is the world’s largest beverage company, refreshing consumers with more than 450 sparkling and still brands. Along with Coca-Cola, recognized as the world’s most valuable brand, the Company’s portfolio includes 12 other billion dollar brands, including Diet Coke, Fanta, Sprite, Coca-Cola Zero, vitaminwater, Powerade, Minute Maid and Georgia Coffee. Globally, we are the No.1 provider of sparkling beverages, juices and juice drinks and ready-to-drink teas and coffees. Through the world’s largest beverage distribution system, consumers in more than 200 countries enjoy the Company’s beverages at a rate of 1.5 billion servings a day. With an enduring commitment to building sustainable communities, our Company is focused on initiatives that protect the environment, conserve resources and enhance the economic development of the communities where we operate. For more information about our Company, please visit our Web site at www.thecoca-colacompany.com.
About World Wildlife Fund
WWF is the world’s largest conservation organization, working in 100 countries for nearly half a century. With the support of almost 5 million members worldwide, WWF is dedicated to delivering science-based solutions to preserve the diversity and abundance of life on Earth, stop the degradation of the environment and combat climate change. Visit www.worldwildlife.org to learn more.
http://www.redorbit.com/news/science/1595040/the_cocacola_system_announces_new_global_targets_for_water_conservation/
The factors behind the calving process were not well understood
US researchers have come up with a way to predict the rate at which ice shelves break apart into icebergs.
These sometimes spectacular occurrences, called calving events, are a key step in the process by which climate change drives sea level rise.
Computer models that simulate how ice sheets might behave in a warmer world do not describe the calving process in much detail, Science journal reports.
Until now, the factors controlling this process have not been well understood.
Ice sheets, such as those in Antarctica and Greenland, spread under their own weight and flow off land over the ocean water.
Ice shelves are the thick, floating lips of ice sheets or glaciers that extend out past the coastline.
Timelapse footage of an iceberg breaking away from a glacier in July 2008. The event took approximately 15 minutes (Video: Fahnestock/UNH)
The Ross Ice Shelf in Antarctica floats for as much as 800km (500 miles) over the ocean before the edges begin to break and create icebergs. But other ice shelves may only edge over the water for a few kilometres.
A team led by Richard Alley at Pennsylvania State University, US, analysed factors such as thickness, calving rate and strain rate for 20 different ice shelves.
"The problem of when things break is a really hard problem because there is so much variability," said Professor Alley.
"Anyone who has dropped a coffee cup knows this. Sometimes the coffee cup breaks and sometimes it bounces."
The team's results show that the calving rate of an ice shelf is primarily determined by the rate at which the ice shelf is spreading away from the continent.
The researchers were also able to show that narrower shelves should calve more slowly than wider ones.
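These two findings can be caricatured with a toy scaling in which the calving rate grows with both the along-flow spreading (strain) rate and the shelf width. The linear form and the constant k below are illustrative assumptions, not the regression actually published by Alley’s team:

```python
def calving_rate(strain_rate_per_yr, width_m, k=1.0):
    """Toy scaling: calving rate (m/yr) grows with the along-flow spreading
    (strain) rate and with shelf width. The linear form and the constant k
    are illustrative assumptions, not the published fit."""
    return k * strain_rate_per_yr * width_m

wide = calving_rate(0.005, 100_000)   # spreading shelf, 100 km wide
narrow = calving_rate(0.005, 20_000)  # same spreading rate, 20 km wide
print(wide, narrow)  # the narrower shelf calves more slowly
```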
Ice cracking off into the ocean from Antarctica and Greenland could play a significant role in future sea level rise.
Floating ice that melts does not of itself contribute to the height of waters (because it has already displaced its volume), but the shelf from which it comes acts as a brake to the land-ice behind. Removal of the shelf will allow glaciers heading to the ocean to accelerate - a phenomenon documented when the Larsen B shelf on the Antarctic Peninsula shattered in spectacular style in 2002. This would speed sea level rise.
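The displacement point is just Archimedes’ principle, sketched below with fresh water assumed for simplicity (the small correction from seawater salinity is ignored):

```python
RHO_ICE = 917.0     # kg/m^3
RHO_WATER = 1000.0  # kg/m^3 (fresh water, for simplicity)

def meltwater_minus_displaced(ice_volume_m3):
    """Extra water volume added when floating ice melts: meltwater volume
    minus the volume the ice already displaced while afloat. Archimedes'
    principle makes these equal, so the result is 0 (salinity ignored)."""
    displaced = ice_volume_m3 * RHO_ICE / RHO_WATER  # volume displaced afloat
    meltwater = ice_volume_m3 * RHO_ICE / RHO_WATER  # same mass as liquid
    return meltwater - displaced

print(meltwater_minus_displaced(1e9))  # 0.0: floating ice melt adds no height
```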
The UN Intergovernmental Panel on Climate Change in its 2007 assessment forecast that seas could rise by 18 to 59 cm (7-23ins) this century. However, in giving those figures, it conceded that ice behaviour was poorly understood.
http://news.bbc.co.uk/2/hi/science/nature/7753228.stm
Facing an uncertain future
How forests and people can adapt to climate change. Center for International Forestry Research (CIFOR), Bogor, Indonesia.
The most prominent international responses to climate change focus on mitigation (reducing the accumulation of greenhouse gases) rather than adaptation (reducing the vulnerability of society and ecosystems). However, with climate change now inevitable, adaptation is gaining importance in the policy arena, and is an integral part of ongoing negotiations towards an international framework. This report presents the case for adaptation for tropical forests (reducing the impacts of climate change on forests and their ecosystem services) and tropical forests for adaptation (using forests to help local people and society in general to adapt to inevitable changes). Policies in the forest, climate change and other sectors need to address these issues and be integrated with each other—such a cross-sectoral approach is essential if the benefits derived in one area are not to be lost or counteracted in another. Moreover, the institutions involved in policy development and implementation need themselves to be flexible and able to learn in the context of dynamic human and environmental systems. And all this needs to be done at all levels from the local community to the national government and international institutions. The report includes an appendix covering climate scenarios, concepts, and international policies and funds.
http://www.cifor.org/online-library/browse/view-publication/publication/2600.html
Karuk Tribe: Learning from the First Californians for the Next California
Editor's Note: This is part of series, Facing the Climate Gap, which looks at grassroots efforts in California low-income communities of color to address climate change and promote climate justice.
This article was published in collaboration with GlobalPossibilities.org.
The three sovereign entities in the United States are the federal government, the states and indigenous tribes, but according to Bill Tripp, a member of the Karuk Tribe in Northern California, many people are unaware of both the sovereign nature of tribes and the wisdom they possess when it comes to issues of climate change and natural resource management.
“A lot of people don’t realize that tribes even exist in California, but we are stakeholders too, with the rights of indigenous peoples,” says Tripp.
Tripp is an Eco-Cultural Restoration specialist at the Karuk Tribe Department of Natural Resources. In 2010, the tribe drafted an Eco-Cultural Resources Management Plan, which aims to manage and restore “balanced ecological processes utilizing Traditional Ecological Knowledge supported by Western Science.” The plan addresses environmental issues that affect the health and culture of the Karuk tribe and outlines ways in which tribal practices can contribute to mitigating the effects of climate change.
Before climate change became a hot topic in the media, many indigenous and agrarian communities, because of their dependence upon and close relationship to the land, began to notice troubling shifts in the environment such as intense drought, frequent wildfires, scarcer fish flows and erratic rainfall.
There are over 100 government-recognized tribes in California, representing more than 700,000 people. The Karuk is the second largest Native American tribe in California and has over 3,200 members. Their tribal lands include over 1.48 million acres within and around the Klamath and Six Rivers National Forests in Northwest California.
Tribes like the Karuk are among the hardest hit by the effects of climate change, despite their traditionally low-carbon lifestyles. The Karuk, in particular have experienced dramatic environmental changes in their forestlands and fisheries as a result of both climate change and misguided Federal and regional policies.
The Karuk have long depended upon the forest to support their livelihood, cultural practices and nourishment. While wildfires have always been a natural aspect of the landscape, recent studies have shown that fires in northwestern California forests have risen dramatically in frequency and size due to climate related and human influences. According to the California Natural Resources Agency, fires in California are expected to increase 100 percent due to increased temperatures and longer dry seasons associated with climate change.
Some of the other most damaging human influences to the Karuk include logging activities, which have depleted old growth forests, and fire suppression policies created by the U.S. Forest Service in the 1930s that have limited cultural burning practices. Tripp says these policies have been detrimental to tribal traditions and the forest environment.
“It has been huge to just try to adapt to the past 100 years of policies that have led us to where we are today. We have already been forced to modify our traditional practices to fit the contemporary political context,” says Tripp.
Further, the construction of dams along the Klamath River by PacifiCorp (a utility company) has impeded access to salmon and other fish that are central to the Karuk diet. Fishing regulations have also had a negative impact.
Though the Karuk’s dependence on the land has left them vulnerable to the projected effects of climate change, it has also given them and other indigenous groups incredible knowledge to impart to western climate science. Historically, though, tribes have been largely left out of policy processes and decisions. The Karuk decided to challenge this historical pattern of marginalization by formulating their own Eco-Cultural Resources Management Plan.
The Plan provides over twenty “Cultural Environmental Management Practices” that are based on traditional ecological knowledge and the “World Renewal” philosophy, which emphasizes the interconnectedness of humans and the environment. Tripp says the Plan was created in the hopes that knowledge passed down from previous generations will help strengthen Karuk culture and teach the broader community to live in a more ecologically sound way.
“It is designed to be a living document…We are building a process of comparative learning, based on the principals and practices of traditional ecological knowledge to revitalize culturally relevant information as passed through oral transmission and intergenerational observations,” says Tripp.
One of the highlights of the plan is to re-establish traditional burning practices in order to decrease fuel loads and the risk for more severe wildfires when they do happen. Traditional burning was used by the Karuk to burn off specific types of vegetation and promote continued diversity in the landscape. Tripp notes that these practices are an example of how humans can play a positive role in maintaining a sound ecological cycle in the forests.
“The practice of utilizing fire to manage resources in a traditional way not only improves the use quality of forest resources, it also builds and maintains resiliency in the ecological process of entire landscapes” explains Tripp.
Another crucial aspect of the Plan is the life cycle of fish, like salmon, that are central to Karuk food traditions and ecosystem health. Traditionally, the Karuk regulated fishing schedules to allow the first salmon to pass, ensuring that those most likely to survive made it to prime spawning grounds. There were also designated fishing periods and locations to promote successful reproduction. Tripp says regulatory agencies have established practices that are harmful to this cycle.
“Today, regulatory agencies permit the harvest of fish that would otherwise be protected under traditional harvest management principles and close the harvest season when the fish least likely to reach the very upper river reaches are passing through,” says Tripp.
The Karuk tribe is now working closely with researchers from universities such as University of California, Berkeley and the University of California, Davis as well as public agencies so that this traditional knowledge can one day be accepted by mainstream and academic circles dealing with climate change mitigation and adaptation practices.
According to the Plan, these land management practices are more cost effective than those currently practiced by public agencies; and, if implemented, they will greatly reduce taxpayer cost burdens and create employment. The Karuk hope to create a workforce development program that will hire tribal members to implement the plan’s goals, such as multi-site cultural burning practices.
The Plan has a long way to full realization and Federal recognition. According to the National Indian Forest Resources Management Act and the National Environmental Protection Act, it must go through a formal review process. Besides that, the Karuk Tribe is still solidifying funding to pursue its goals.
The work of California’s environmental stewards will always be in demand, and the Karuk are taking the lead in showing how community wisdom can be used to generate an integrated approach to climate change. Such integrated and community engaged policy approaches are rare throughout the state but are emerging in other areas. In Oakland, for example, the Oakland Climate Action Coalition engaged community members and a diverse group of social justice, labor, environmental, and business organizations to develop an Energy and Climate Action Plan that outlines specific ways for the City to reduce greenhouse gas emissions and create a sustainable economy.
In the end, Tripp hopes the Karuk Plan will not only inspire others and address the global environmental plight, but also help to maintain the very core of his people. In his words: “Being adaptable to climate change is part of that, but primarily it is about enabling us to maintain our identity and the people in this place in perpetuity.”
Dr. Manuel Pastor is Professor of Sociology and American Studies & Ethnicity at the University of Southern California where he also directs the Program for Environmental and Regional Equity and co-directs USC’s Center for the Study of Immigrant Integration. His most recent books include Just Growth: Inclusion and Prosperity in America’s Metropolitan Regions (Routledge 2012; co-authored with Chris Benner) Uncommon Common Ground: Race and America’s Future (W.W. Norton 2010; co-authored with Angela Glover Blackwell and Stewart Kwoh), and This Could Be the Start of Something Big: How Social Movements for Regional Equity are Transforming Metropolitan America (Cornell 2009; co-authored with Chris Benner and Martha Matsuoka).
http://www.resilience.org/stories/2012-10-19/karuk-tribe-learning-from-the-first-californians-for-the-next-california
The Geological Perspective On Global Warming: A Debate
Dr Colin P. Summerhayes, Vice-President of the Geological Society of London
Dear Dr Peiser,
In the interest of contributing to the evidence-based debate on climate change I thought it would be constructive to draw to your attention the geological evidence regarding climate change, and what it means for the future. This evidence was published in November 2010 by the Geological Society of London in a document entitled “Climate Change: Evidence from the Geological Record”, which can be found on the Society’s web page.
A variety of techniques is now available to document past levels of CO2 in the atmosphere, past global temperatures, past sea levels, and past levels of acidity in the ocean. What the record shows is this. The Earth’s climate has been cooling for the past 50 million years from 6-7°C above today’s global average temperatures to what we see now. That cooling led to the formation of ice caps on Antarctica 34 million years ago and in the northern hemisphere around 2.6 million years ago. The cooling was directly associated with a decline in the amount of CO2 in the atmosphere. In effect we moved from a warm “greenhouse climate” when CO2, temperature and sea level were high, and there were no ice caps, to an “icehouse climate” in which CO2, temperature and sea level are low, and there are ice caps. The driver of that change is the balance between the emission of CO2 into the atmosphere from volcanoes, and the mopping up of CO2 from the atmosphere by the weathering of rocks, especially in mountains. There was more volcanic activity in the past and there are more mountains now.
Superimposed on this broad decline in CO2 and temperature are certain events. Around 55 million years ago there was a massive additional input of carbon into the atmosphere – about 4 times what humans have put there. It caused temperatures to rise by a further 6°C globally and 10°C at the poles. Sea level rose by some 15 metres. Deep ocean bottom waters became acid enough to dissolve carbonate sediments and kill off calcareous bottom dwelling organisms. It took over 100,000 years for the Earth to recover from this event. More recently, during the Pliocene, around 3 million years ago, CO2 rose to levels a little higher than today’s, global temperature rose to 2-3°C above today’s level, Antarctica’s Ross Ice Shelf melted, and sea level rose by 10-25 metres.
The icehouse climate that characterised the past 2.6 million years averaged 9°C colder in the polar regions and 5°C colder globally. It was punctuated by short warm interglacial periods. We are living in one of these warm periods now – the Holocene – which started around 11,000 years ago. The glacial-to-interglacial variations are responses to slight changes in the solar energy reaching the Earth’s surface, caused by changes in: our planet’s orbit from circular to elliptical and back; the position of the Earth relative to the Sun around that orbit (precession); and the tilt of the Earth’s axis. These changes recur on time scales of tens to hundreds of thousands of years. CO2 plays a key role in these changes. As the Earth begins to warm after a cold period, sea ice melts, allowing CO2 to emerge from the ocean into the atmosphere. There it acts to further warm the planet through a process known as positive feedback. The same goes for another greenhouse gas, methane, which is given off from wetlands that grow as the world warms. As a result the Earth moves much more rapidly from cold to warm than it does from warm to cold. We are currently in a cooling phase of this cycle, so the Earth should be cooling slightly. Evidently it is not.
The Geological Society deduced that by adding CO2 to the atmosphere as we are now doing, we would be likely to replicate the conditions of those past times when natural emissions of CO2 warmed the world, melted ice in the polar regions, and caused sea level to rise and the oceans to become more acid. The numerical models of the climate system that are used by the meteorological community to predict the future give much the same result by considering modern climate variation alone. Thus we arrive at the same solution by two entirely independent methods. Under the circumstances the Society concluded that “emitting further large amounts of CO2 into the atmosphere over time is likely to be unwise, uncomfortable though that fact may be.”
Dr Colin P. Summerhayes
Vice-President Geological Society of London and Emeritus Associate Scott Polar Research Institute, Cambridge.
8 February 2013
Professor Robert Carter and Professor Vincent Courtillot respond:
Dear Dr Peiser,
Thank you for your invitation on behalf of the Foundation to reply to Dr Summerhayes’ letter about geological evidence in relation to the hypothesis of dangerous anthropogenic global warming (DAGW) that is favoured by the Intergovernmental Panel on Climate Change (IPCC).
We are in agreement with many of Dr Summerhayes’ preliminary remarks about the geological context of climate change. This reflects that a large measure of scientific agreement and shared interpretation exists amongst most scientists who consider the global warming issue.
Points of commonality in the climate discussion include:
* that climate has always changed and always will,
* that Earth has often been warmer than it is today, and that its present climatic condition is that of a warm interglacial during a punctuated icehouse world,
* that carbon dioxide is a greenhouse gas and warms the lower atmosphere (though debate remains as to the magnitude and timescale of the warming),
* that a portion of human emissions are accumulating in the atmosphere,
* that a global warming of around 0.5°C occurred in the 20th century, but that there has been no global temperature rise over the last 16 years.
The first two points are rooted in geological evidence (as discussed in more detail by Dr Summerhayes), the third is based upon physical principle and the last three are mostly matters of instrumental measurement (i.e. observation). Despite the disparate scientific disciplines involved, all these points are relevant to achieving a quantitative understanding of climate change, together with several other disputed scientific matters such as those that we discuss below.
One of the disputed scientific matters is represented by Dr Summerhayes’ assertion that cooling over the last 34 million years “was directly associated with a decline in the amount of CO2 in the atmosphere”.
The word “associated” is ambiguous. It may simply mean that temperature and CO2 were correlated, in the sense that their trends were parallel. But as everyone knows correlation is not causality and whether one drives the other, or the two are driven by a third forcing factor, or the correlation is the result of chance, requires careful analysis and argument. Though it may be true that a broad correlation exists between atmospheric CO2 content and global temperature, at least on some timescales, it remains unclear whether the primary effect is one of increasing CO2 causing warming (via the greenhouse effect) or of warming causing CO2 increase (via outgassing from the ocean). We are familiar with the argument that the currently decreasing carbon isotope ratio in the atmosphere is consistent with a fossil fuel source for incremental CO2 increases, and therefore with the first of these two possibilities, but do not find it compelling because other natural sources (soil carbon, vegetation) also contribute isotopically negative carbon to the atmosphere.
A second area of uncertainty, related to the point just discussed, is the rate, scope and direction of the various feedbacks that apply during a natural glacial-interglacial climatic cycle. Dr Summerhayes provides a confident, and perhaps plausible, account as to how changing insolation (controlled by orbital change), melting sea-ice and increasing CO2 and CH4 jointly drive the asymmetrical glacial-interglacial cycles that have characterised recent planetary history. However, our knowledge of the climate system and its history currently remains incomplete; some of the forcing mechanisms and feedbacks may not be known accurately, or even at all. For example, we do not yet know whether clouds exert a net warming or cooling effect on the climate. Similarly, variations in ultraviolet radiation and high-energy particle emission from the Sun, in atmospheric electricity and in galactic cosmic rays may all play larger roles in controlling climate change than is currently assumed, yet these effects are absent from most of the current generation of deterministic computer models of the future climate. The temperature projections made by these models may well be affected by our ignorance of the magnitude, the sign, or even the existence of some of the forcings and feedbacks that are actually involved.
Thirdly, Dr Summerhayes also briefly discusses the issue of sea level change. He quotes an estimated increase of 15 m in sea level associated with a temperature increase of 6–10°C 55 million years ago. He then quotes a range of 10–25 m rise for a 2–3°C warming 3 million years ago. To this we might add the further examples of the 125 m sea level rise that has accompanied the 6°C temperature rise since the last glacial maximum, and the 0.2-m rise associated with the ~0.5°C 20th century warming. It appears from these examples that a 1°C temperature rise can be associated with a sea level rise of as little as 0.4 m or as much as 8 m, and all values in between! This indicates an uncertainty in our understanding of the temperature/CO2/sea-level connection that surely lessens its value for contributing to policy formulation.
Figure 1. Temperature curve reconstructed from oxygen isotope measurements in a Greenland ice core over the last 10,000 years (Lappi 2010, after Alley 2000).
Fourth, and last, Dr Summerhayes says that because orbitally-forced climate periodicity is currently in a cooling phase “the Earth should be cooling slightly. Evidently it is not”. The statement is tendentious, because whether Earth is seen to be cooling or warming depends upon the length of climate record that is considered. Trends over 1, 10, 100 or 1000 years are not the same thing, and their differences must be taken into account carefully. We reproduce two figures that may be used to demonstrate that Earth is currently not warming on either the longer-term millennial timescale (Figure 1) or the short-term decadal/meteorological timescale (Figure 2). We note also that on the intermediate centennial timescale (1850–2010) the temperature trend has been one of a slight (0.5°C) rise. In assessing which of these timescales is the “proper” one to consider in formulating climate policy, we observe that the results conveyed in Figure 2 have little scientific (and therefore policy) meaning unless they are assessed in the context of the data in Figure 1.
Figure 2. Mean temperature of lower atmosphere: HadCRUT4 annual means 1997-2011
We acknowledge that the data in Figure 1, which are drawn from a Greenland ice core, represent regional rather than global climate. But a similar pattern of Holocene long-term cooling is seen in many other records from around the world, including from Antarctic ice cores. Also, evidence for a millennial solar cycle has been accumulating over the past years, and, representing that rhythm, the Medieval Warming (also called the Medieval Climatic Optimum) appears to have been both global and warmer than today’s climate.
Regarding Figure 2, the data demonstrate that no warming has occurred since 1997. In response, some leading IPCC scientists have already acknowledged that should the temperature plateau continue, or turn into a statistically significant cooling trend, then the mainstream IPCC view will need revision. It is noteworthy, too, that over the 16 years during which global temperature has remained unchanged (1997-2012), atmospheric carbon dioxide levels have increased by 8%, from 364 ppm to c.394 ppm. Given a mixing time for the atmosphere of about 1 year, these data would invalidate the hypothesis that human-related carbon dioxide emissions are causing dangerous global warming. In any case, observed global temperatures are currently more remote than ever from the most recent predictions set out in IPCC AR4.
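The authors’ point that a trend’s sign depends on the chosen averaging window can be demonstrated with an ordinary least-squares slope computed over windows of different lengths. The series below is purely synthetic, invented for illustration, not real temperature data:

```python
def trend(series):
    """Ordinary least-squares slope per step of a 1-D sequence."""
    n = len(series)
    mx = (n - 1) / 2
    my = sum(series) / n
    num = sum((x - mx) * (y - my) for x, y in enumerate(series))
    den = sum((x - mx) ** 2 for x in range(n))
    return num / den

# Synthetic record: a slow long-term cooling with a short recent uptick.
data = [10.0 - 0.01 * t for t in range(100)]
data[-10:] = [data[-10] + 0.05 * i for i in range(10)]

print(trend(data) < 0, trend(data[-10:]) > 0)  # sign depends on the window
```

Over the full record the fitted slope is negative (cooling); over only the last ten points it is positive (warming), which is the Figure 1 versus Figure 2 contrast in miniature.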
The areas of uncertainty in the prevailing argument over DAGW are therefore not only geological but also instrumental and physical. Current debate, which needs to be resolved before climate policy is set, centres on the following three issues:
* whether any definite evidence exists for dangerous warming of human causation over the last 50 years,
* the amount of net warming that is, or will be, produced by human-related emissions (the climate sensitivity issue), and
* whether the IPCC’s computer models can provide accurate climate predictions 100 years into the future.
In assessing these issues, our null hypothesis is that the global climate changes that have occurred over the last 150 years (and continue to occur today) are mainly natural in origin. As summarised in the reports of the Nongovernmental International Panel on Climate Change (NIPCC), literally thousands of papers published in refereed journals contain facts or writings consistent with this null hypothesis, and plausible natural explanations exist for all the post-1850 global climatic changes that have been described so far. In contrast, no direct evidence exists, and nor does the Geological Society point to any, that a measurable part of the mild late 20th century warming was definitely caused by human-related carbon dioxide emissions.
The possibility of human-caused global warming nonetheless remains, because carbon dioxide is indubitably a greenhouse gas. The major unknown is the actual value of climate sensitivity, i.e. the amount of temperature increase that would result from doubling the atmospheric concentration of CO2 compared to pre-industrial levels. IPCC models estimate that water vapour increases the 1°C effect that would be seen in a dry atmosphere to 2.5-4.5°C, whereas widely cited papers by Lindzen & Choi (2011) and Spencer & Braswell (2010) both describe empirical data that is consistent with negative feedback, i.e. sensitivity less than 1°C. The conclusion that climate sensitivity is significantly less than argued by the IPCC is also supported by a range of other empirical or semi-empirical studies (e.g., Forster & Gregory, 2006; Aldrin et al., 2012; Ring et al., 2012).
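To see what the disputed sensitivity values imply in practice, the standard logarithmic forcing relation gives equilibrium warming as the sensitivity per doubling times log2 of the concentration ratio. A hedged sketch (the 280 ppm pre-industrial baseline is a conventional figure, and the function name is our own; 394 ppm is the present-day value quoted earlier in the letter):

```python
import math

def warming_from_co2(c_now_ppm, c_pre_ppm=280.0, sensitivity_c=3.0):
    """Equilibrium warming implied by a CO2 rise under the standard
    logarithmic relation: dT = S * log2(C / C0), where S is the
    climate sensitivity (warming per doubling of CO2)."""
    return sensitivity_c * math.log(c_now_ppm / c_pre_ppm, 2)

# Contrast an IPCC-range sensitivity (~3 C) with the <1 C argued by
# Lindzen & Choi and Spencer & Braswell, at 394 ppm.
for s in (1.0, 3.0):
    dt = warming_from_co2(394.0, sensitivity_c=s)
    print(f"S = {s} C/doubling -> {dt:.2f} C of equilibrium warming")
```

By construction, a full doubling (280 to 560 ppm) yields exactly the sensitivity value, which is what makes the parameter the crux of the debate.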
Gathering these various thoughts together, we conclude that the risk of occurrence of damaging human-caused global warming is but a small one within the much greater and proven risks of dangerous natural climate-related events (not to mention earthquakes, volcanic eruptions, tsunamis and landslides, since we are dealing here with geological topics). Moreover, the property damage and loss of life that occurred in the floods in the UK in 2007; in the 2005 Katrina and 2012 Sandy storms in the USA; and in deadly bushfires in Australia in 2009 and 2013 all attest that even wealthy and technologically sophisticated nations are often inadequately prepared to deal with climate-related hazard.
The appropriate response to climate hazard is to treat it in the same way as other geological hazards. That is to say, national policies are needed that are based on preparing for and adapting to all climate events as and when they happen, irrespective of their presumed cause. Every country needs to develop its own understanding of, and plans to cope with, the unique combination of climate hazards that apply within its own boundaries. The planned responses should be based upon adaptation, with mitigation where appropriate to cushion citizens who are adversely affected.
The idea that there can be a one-size-fits-all global solution to deal with just one possible aspect of future climate hazard, as recommended by the IPCC, and apparently supported by Dr Summerhayes on behalf of the Geological Society, fails to deal with the real climate and climate-related hazards to which all parts of the world are episodically exposed.
Professor Robert (Bob) Carter
Professor Vincent Courtillot
14 February 2013
Aldrin, M. et al. 2012. Bayesian estimation of climate sensitivity based on a simple climate model fitted to observations on hemispheric temperature and global ocean heat content. Environmetrics, doi:10.1002/env.2140.
Alley, R.B. 2000. The Younger Dryas cold interval as viewed from central Greenland. Quaternary Science Reviews 19: 213–226
Forster, P.M. & Gregory, J.M. 2006. The climate sensitivity and its components diagnosed from Earth radiation budget data. Journal of Climate 19, 39-52.
Lappi, D. 2010. 65 million years of cooling
Lindzen, R.S. & Choi, Y-S. 2011. On the observational determination of climate sensitivity and its implications. Asia-Pacific Journal of Atmospheric Sciences 47, 377-390.
Ring, M.J. et al. 2012. Causes of the global warming observed since the 19th century. Atmospheric and Climate Sciences 2, 401-415.
Spencer R. W. & Braswell, W.D. 2010. On the diagnosis of radiative feedback in the presence of unknown radiative forcing. Journal of Geophysical Research 115, D16109.
http://www.thegwpf.org/geological-perspective-global-warming-debate/
The name of the island is Lohachara, though not many of us are going to remember it for long. We'll probably recall it as that little island off India, the first once-inhabited island to disappear from the surface of the earth due to rising sea levels attributed to global warming.
Lohachara was a small island that supported a population of 10,000 in the Bay of Bengal, near where the Ganges and Brahmaputra rivers meet the sea. It is believed that other islands in the area will soon also be submerged, displacing some 70,000 more islanders.
Several uninhabited islands have disappeared in recent years, notably in the South Pacific. Lohachara is unique because it was inhabited.
http://the-mound-of-sound.blogspot.com/2006/12/global-warming-milestone.html
|
PAO: Policy Activities » Briefing (November 2, 2011)
Using Science to Improve Flood Management
On November 2, 2011, ESA sponsored a congressional briefing: “Using Science to Improve Flood Management.” Emily Stanley (University of Wisconsin, Madison) and Jeff Opperman (The Nature Conservancy, Ohio Field Office) addressed the function of floodplains and managing rivers as systems and for multiple benefits.
Emily Stanley’s presentation focused on the work of rivers and the function of floodplains. Industry, transportation, and recreation all constitute work done by rivers. Less well known and valued is the work done by floodplains, responsible for desirable services such as flood attenuation, fish production, improved water quality and groundwater recharge. Stanley noted that aging US levees and other infrastructure provide an opportunity to move beyond structural flood control and take greater advantage of the functions of floodplains.
Jeff Opperman’s presentation focused on the logic of managing rivers as a system and for multiple benefits. These include risk reduction for people and infrastructure as well as benefits such as water storage during droughts and increased fisheries production. Opperman said that because the Mississippi River is managed as a comprehensive system, the recent flood was far less damaging than that of 1927 even though a greater volume of water passed through the system in 2011. Opperman also pointed to the success story of California’s Yolo Bypass, which has reduced flood risk while increasing goods and services.
http://www.esa.org/pao/policyActivities/briefing11072011.php
|
Almost all of the 33 developed and developing countries surveyed in a new study had introduced or progressed with significant climate-related legislation within their own borders in the past year.
In 2012, 18 countries made significant progress, according to the report by the Grantham Research Institute at LSE and Globe International, which brings together legislators from different countries.
Only Canada had gone backwards on climate change, by repealing the Act implementing its targets under the Kyoto Protocol treaty to cut global emissions.
Despite a tough economic year for many countries, such as those in the eurozone, some progress was made by developed nations.
The EU as a whole made progress through its new directive on energy efficiency, while the US pushed forward with regulating carbon dioxide through its Clean Air Act.
Action by developing countries was more significant, with Mexico leading the way with a new climate law to cut emissions by 30% compared to "business as usual" by 2020, and major progress by countries ranging from Kenya to Pakistan.
http://www.heraldscotland.com/news/environment/global-climate-change-progress.19909473
|
Plants flower faster than climate change models predict
Scientific models are failing to accurately predict the impact of global warming on plants, says a new report.
Researchers found in long-term studies that some are flowering up to eight times faster than models anticipate.
The authors say that poor study design and a lack of investment in experiments partly account for the difference.
They suggest that spring flowering and leafing will continue to advance at a rate of five to six days for every degree Celsius of warming.
The results are published in the journal Nature.
For more than 20 years, scientists have been carrying out experiments to mimic the impacts of rising temperatures on the first leafing and flowering of plant species around the world.
Researchers had assumed that plants would respond in essentially the same way to experimental warming with lamps and open top chambers as they would to changes in temperatures in the real world.
Very little had been done to test the assumption until this study, led by Dr Elizabeth Wolkovich, who is now at the University of British Columbia in Vancouver.
With her colleagues she studied the timing of the flowering and leafing of plants in observational studies and warming experiments spanning four continents and 1,634 plant species.
According to Dr Wolkovich, the results were a surprise.
"What we found is that the experiments don't line up with the long term data, and in fact they greatly underestimate how much plants change their leafing and flowering with warming," she said.
"So for models based on experimental data, then we would expect that plants are leafing four times faster and flowering eight times faster in the long term historical record than what we're using in some of the models."

'Consistent message'
Observational data have been gathered by scientific bodies for many years. In the UK, the systematic recording of flowering times dates back to 1875, when the Royal Meteorological Society established a national network of observers.
Since then, data has also been recorded by full-time biologists and part-time enthusiasts, and in recent years there have been mass-participation projects such as BBC Springwatch.
This new research suggests that these observations of flowering and leafing carried out in many different parts of the world over the past thirty years are remarkably similar according to Dr Wolkovich.
"In terms of long term observations, the records are very coherent and very consistent and they suggest for every degree celsius of warming we get we are going to get a five- to six-day change in how plants leaf and flower."
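Applied linearly, the observational rate quoted here is easy to turn into an estimate of how much earlier plants would flower under a given warming. A minimal sketch (the 5.5 days/degree midpoint and the helper name are our own illustrative assumptions):

```python
def flowering_advance_days(warming_c, days_per_degree=5.5):
    """Advance in leafing/flowering implied by the observational rate of
    five to six days per degree Celsius (midpoint 5.5 used by default)."""
    return warming_c * days_per_degree

# E.g. 2 C of warming would imply roughly 10-12 days earlier flowering.
print(flowering_advance_days(2.0))  # 11.0
```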
She argues that the difficulties in mimicking the impacts of nature in an artificial setting are much greater than many scientists estimate. The team found that in some cases the use of warming chambers to artificially raise temperatures can sometimes have the opposite effect.
"In the real world, we don't just see changes in temperature - we see changes in precipitation and cloud patterns and other factors - so certainly when you think about replicating changes in clouds, we are very, very far away from being able to do that.
"I guess we will never get to perfectly match nature, but I am hopeful as scientists we can do much, much better, given funding resources."
The team found that the greater investment in the design and monitoring of experiments, the more accurate the result.
"We have a very consistent message from the long-term historical records about how plants are changing, but we need to think more critically about how we fund and invest in and really design experiments," said Dr Wolkovich.
"We do need them in the future, they are the best way going forward to project how species are changing but right now what we're doing isn't working as well as I think it could."
Other researchers were equally surprised by the results.
Dr This Rutishauser is at the Oeschger Centre for Climate Change Research at the University of Bern in Switzerland. He says that in light of this work scientists will have to rethink the impacts of global warming.
"The bottom line is that the impacts might be bigger than we have believed until now. That's going to provoke a lot of work to probably revise modelling results for estimations of what's going to happen in the future for food production especially."
Dr Wolkovich agrees that if the models are so significantly underestimating the real-world observations, there could also be impacts on water supplies the world over.
"If a whole plant community starts growing a week earlier than we expect according to these experiments, it's going to take up a lot more water over the growing season and if you add to that many years of the model projections, you are going to see big changes in the water supply."
She appeals to people to get involved in citizen science projects and help gather data on flowering and leafing, especially in remote areas.
The National Phenology Network in the US logged its millionth observation this week, and similar programmes are underway in the UK, Sweden, Switzerland, and the Netherlands, and a pan-European database is under development.
"We have very few monitoring networks. We need many, many people out there observing this because it is changing faster and across more habitats than we are currently measuring - we need more help!"
http://www.bbc.co.uk/news/science-environment-17924653
|
"Many environmentalists believe that wind and solar power can be scaled to meet the rising demand [of billions emerging from poverty], especially if coupled with aggressive efforts to cut waste," reports Justin Gillis. "But a lot of energy analysts have crunched the numbers and concluded that today’s renewables, important as they are, cannot get us even halfway there."
Gillis discusses the most promising innovations in nuclear power, which many technologists see as the most viable option for providing a reliable source of electricity without carbon emissions. These include "a practicable type of nuclear fusion", "a fission reactor that could run on today’s nuclear waste", and "a safer reactor based on an abundant element called thorium."
"Beyond the question of whether they will work," he adds, "these ambitious schemes pose a larger issue: How much faith should we, as a society, put in the idea of a big technological fix to save the world from climate change?"
And as is appropriate for a nuclear-related news item that appeared on the two-year anniversary of the Tohoku earthquake, we offer a reminder of the twelve different nuclear power "near miss" events that occurred in the United States in 2012.
http://www.planetizen.com/node/61170
|
What Is Air Pollution?
Air pollution in its greatest magnitude has existed throughout the 20th century, from the coal-burning industries of the early century to the fossil-fuel-burning technology of the new century. Air pollution is a major problem for highly developed nations, whose large industrial bases and highly developed infrastructures generate much of it.

Every year, billions of tonnes of pollutants are released into the atmosphere; the sources range from power plants burning fossil fuels to the effects of sunlight on certain natural materials. But the air pollutants released from natural materials pose very little health threat; only the natural radioactive gas radon poses any threat to health. Most of the air pollutants released into the atmosphere are therefore the result of human activity.

In the United Kingdom, traffic is the major cause of air pollution in British cities. Eighty-six per cent of families own one or two vehicles. Because of the high population density of cities and towns, the number of people exposed to air pollutants is great. This has led to an increased number of people contracting chronic diseases in recent years, as car ownership in the UK has nearly trebled. These diseases include asthma and respiratory complaints, ranging across the population from children to the elderly, who are most at risk. Those suffering from asthma will notice the effects most if they live in inner-city or industrial areas, or near major roads. Asthma is already the fourth biggest killer in the UK, after heart diseases and cancers, and currently affects more than three point four million people.

In the past, severe pollution in London in 1952, combined with low winds and high-pressure air, took more than four thousand lives, and another seven hundred in 1962, in what were called the 'Dark Years' because of the dense, dark, polluted air.

Air pollution is also devastating the environment; much of the damage is caused by man-made gases such as the sulphur dioxide produced by electric plants burning fossil fuels. In the UK, industries and utilities that use tall smokestacks as a means of removing air pollutants only boost them higher into the atmosphere, reducing the concentration solely at their own site. These pollutants are often transported over the North Sea and produce adverse effects in western Scandinavia, where sulphur dioxide and nitrogen oxides from the UK and central Europe generate acid rain, especially in Norway and Sweden. The pH level, or relative acidity, of many Scandinavian freshwater lakes has been altered dramatically by acid rain, destroying entire fish populations. In the UK, acid rain formed by sulphur dioxide emissions has led to acidic erosion of limestone in north-western Scotland and of marble in northern England.

In 1998, the London Metropolitan Police launched the 'Emissions Controlled Reduction' scheme, whereby traffic police would monitor the amount of pollutants being released into the air by vehicle exhausts. The plan was for traffic police to stop vehicles at random on roads leading into the City of London; the officer would then measure the amount of air pollutants being released, using a CO2 reader fitted to the vehicle's exhaust. If the exhaust exceeded the legal amount (based on micrograms of pollutants), the driver would be fined around twenty-five pounds. The scheme proved unpopular with drivers, especially those driving to work, and did little to improve the city's air quality.

In Edinburgh, the main cause of bad air quality was the vast number of vehicles passing through the city centre from west to east. In 1990, Edinburgh council developed the city bypass at a cost of nearly seventy-five million pounds. The bypass was ringed around the outskirts of the city; its main aim was to limit the number of vehicles going through the city centre by diverting them onto the bypass so that they could reach their destinations without entering the centre. This relieved much of the congestion within the city but did very little to improve the city's overall air quality.

To further decrease the number of vehicles on the roads, the government promoted public transport. Over two hundred million pounds was devoted to developing the country's public transport network, much of which went on additional bus lanes in London, increasing the pace of bus services. Gas- and electric-powered buses were introduced in Birmingham to cut emissions of air pollutants around the city centre. Because children and the elderly are most at risk of chronic diseases such as asthma, major diversion roads were built to route vehicles away from residential areas, schools and institutions for the elderly. In some council areas, trees were planted along roadsides to reduce carbon monoxide concentrations.

Other ways of improving air quality included restrictions on the amounts of air pollutants that industries could release into the atmosphere; tough regulations were introduced whereby, if air quality dropped below a certain level around an industrial area, a heavy penalty would be levied against the operator.
© Copyright 2000, Andrew Wan.
http://everything2.com/user/KS/writeups/air+pollution
|
Forest governance and climate policies
Fred Stolle of the World Resources Institute looks at the need for REDD to address forest governance issues as well as creating market incentives.
Policy-makers are recognizing the essential role that the world’s remaining forests play in maintaining the global climate system. The political momentum generated by the Bali Action Plan under the UN Framework Convention on Climate Change (UNFCCC) will create a unique opportunity to put in place a framework of incentives that could curb deforestation, slow forest degradation, and improve the way forests are managed. To succeed, these incentives must strike at the main drivers of rampant deforestation and must also recognize the dependency of local communities on forest ecosystems for their livelihoods.
In the coming months, climate change negotiators have agreed to explore a mechanism for providing compensation for “Reducing Emissions from Deforestation and Forest Degradation in Developing Countries” (REDD). Under most REDD proposals, compensation would be financed by the sale of these emission reductions as ‘carbon offsets’ to be used by regulated countries or companies to remain within their emissions limits.
However, will the promise of money for carbon alone create the conditions necessary to counteract the drivers of deforestation?
If a REDD mechanism is to succeed, competing pressures on forests will need to be managed fairly and effectively. REDD needs to strike at the heart of the drivers of deforestation, which are not always directly linked to markets but as often stem from problems such as illegal logging, poor planning, lack of law enforcement, the absence of tenure rights, lack of accountability, weak coordination and capacity in the institutions that manage forest resources, loss of revenues, and other governance failures.
It thus seems apparent that REDD will need to do more than create market incentives. To make REDD effective, efficient and capable of achieving lasting impacts, these governance issues need to be addressed. To make these difficult governance improvements, however, countries will need assistance, and because such improvements cannot be directly translated into reduced emissions, they cannot be paid for by carbon credits. For REDD to be successful, there is thus a need for a payment-mechanism phase running either in parallel with or prior to a market mechanism.
Although this phase could not be measured in tonnes of carbon removed, it is clear that it needs to be measured (and reported and verified), so as not to fall into the same trap as general official development assistance (ODA), which over recent decades has had a low success rate. The concept of this governance phase has been getting more attention lately, and one option for such a phase was described recently in the Norwegian government/Meridian Institute Options Assessment Report (2009) as the 'Implementation of policies and measures' phase.
To make this governance phase measurable and successful, governance indicators (qualitative and/or quantitative) need to be developed and agreed upon to be able to identify areas of improvement and hold governments accountable (both governments that supply funds and governments that receive funds). These indicators should cover a wide range of governance topics such as institutions, management, tenure, planning, etc.
Addressing climate change and especially deforestation worldwide will depend on the right incentives and the governance capacity to effectively use these incentives. To improve governance and ensure progress and accountability of governance, we need to develop measurable and agreed upon governance indicators.
http://www.iucn.org/news_homepage/news_by_date/2009_news/october_2009/?3960/Forest-governance-and-climate-policies
|
Stopping Carbon Pollution
Whether you live in a city, on a farm, or anywhere in between, climate change is affecting your weather and damaging the natural world around you. In order to work towards a clean energy future, America needs carbon pollution controls on the largest industrial sources. The U.S. Environmental Protection Agency is taking long overdue steps to limit greenhouse gas emissions from oil refineries and coal-fired power plants, but right now, these highly polluting sources are allowed to release carbon into the atmosphere without any limits.
The National Wildlife Federation’s top priority is to stop the primary cause of climate change – carbon pollution – before it’s too late. NWF is currently fighting major campaigns to:
Electricity generation is the single largest source of global warming pollution in the United States, representing 41 percent of all carbon dioxide emissions. EPA’s newly announced plan to set standards for this sector could require the clean-up of our oldest, dirtiest, least efficient coal power plants. NWF is engaged in a major effort to finalize these rules and dramatically ratchet down our carbon pollution, and we are also supporting EPA's work to address emissions from oil refineries -- the second largest "stationary source" (as opposed to mobile sources like cars and trucks) of global warming pollution in the United States. Strong controls over these big sources will begin holding polluters accountable for their contribution to the climate crisis.
Reducing Emissions in the US and Worldwide
Recognizing that this is a global problem that demands national and international leadership, NWF’s long-term goal is to adopt a national plan that rapidly cuts carbon pollution from all major sources in the US, and safeguards communities and wildlife from the mounting impacts of climate change. The last effort at national legislation – the American Clean Energy and Security Act – passed the House of Representatives in 2009 but stalled in the Senate. Since that time, the impacts of climate change have rapidly escalated. NWF is working hard to get Congress to step up and take action to solve our nation’s most urgent environmental issue.
Avoiding the worst consequences of this disaster also requires a global solution. NWF is partnering with organizations around the world to promote an international agreement that clamps down on carbon pollution, while ensuring that all countries can protect their citizens and wildlife from the impacts of climate change.
http://www.nwf.org/What-We-Do/Energy-and-Climate/Reducing-Emissions/~/link.aspx?_id=9963E404A6C54F749ABA02164F76CF07&_z=z
|
What determines how much coverage a climate study gets?
It probably goes without saying that it isn’t strongly related to the quality of the actual science, nor to the clarity of the writing. Appearing in one of the top journals does help (Nature, Science, PNAS and occasionally GRL), though that in itself is no guarantee. Instead, it most often depends on the ‘news’ value of the bottom line. Journalists and editors like stories that surprise, that give something ‘new’ to the subject and are therefore likely to be interesting enough to readers to make them read past the headline. It particularly helps if a new study runs counter to some generally perceived notion (whether that is rooted in fact or not). In such cases, the ‘news peg’ is clear.
And so it was for the Steig et al “Antarctic warming” study that appeared last week. Mainstream media coverage was widespread and generally did a good job of covering the essentials. The most prevalent peg was the fact that the study appeared to reverse the “Antarctic cooling” meme that has been a staple of disinformation efforts for a while now.
It’s worth remembering where that idea actually came from. Back in 2001, Peter Doran and colleagues wrote a paper about the Dry Valleys long term ecosystem responses to climate change, in which they had a section discussing temperature trends over the previous couple of decades (not the 50 years time scale being discussed this week). The “Antarctic cooling” was in their title and (unsurprisingly) dominated the media coverage of their paper as a counterpoint to “global warming”. (By the way, this is a great example to indicate that the biggest bias in the media is towards news, not any particular side of a story). Subsequent work indicated that the polar ozone hole (starting in the early 80s) was having an effect on polar winds and temperature patterns (Thompson and Solomon, 2002; Shindell and Schmidt, 2004), showing clearly that regional climate changes can sometimes be decoupled from the global picture. However, even then both the extent of any cooling and the longer term picture were more difficult to discern due to the sparse nature of the observations in the continental interior. In fact we discussed this way back in one of the first posts on RealClimate back in 2004.
This ambiguity was of course a gift to the propagandists. Thus for years the Doran et al study was trotted out whenever global warming was being questioned. It was of course a classic ‘cherry pick’ – find a region or time period when there is a cooling trend and imply that this contradicts warming trends on global scales over longer time periods. Given a complex dynamic system, such periods and regions will always be found, and so as a tactic it can always be relied on. However, judging from the take-no-prisoners response to the Steig et al paper from the contrarians, this important fact seems to have been forgotten (hey guys, don’t worry you’ll come up with something new soon!).
Actually, some of the pushback has been hilarious. It’s been a great example for showing how incoherent and opportunistic the ‘antis’ really are. Exhibit A is an email (and blog post) sent out by Senator Inhofe’s press staff (i.e. Marc Morano). Within this single email there are misrepresentations, untruths, unashamedly contradictory claims and a couple of absolutely classic quotes. Some highlights:
Dr. John Christy of the University of Alabama in Huntsville slams new Antarctic study for using [the] “best estimate of the continent’s temperature”
Perhaps he’d prefer it if they used the worst estimate? ;)
[Update: It should go without saying that this is simply Morano making up stuff and doesn't reflect Christy's actual quotes or thinking. No-one is safe from Morano's misrepresentations!]
[Further update: They've now clarified it. Sigh....]
Morano has his ear to the ground of course, and in his blog piece dramatically highlights the words “estimated” and “deduced” as if that was some sign of nefarious purpose, rather than a fundamental component of scientific investigation.
Internal contradictions are par for the course. Morano has previously been convinced that “… the vast majority of Antarctica has cooled over the past 50 years.”, yet he now approvingly quotes Kevin Trenberth who says “It is hard to make data where none exist.” (It is indeed, which is why you need to combine as much data as you can find in order to produce a synthesis like this study). So which is it? If you think the data are clear enough to demonstrate strong cooling, you can’t also believe there is no data (on this side of the looking glass anyway).
It’s even more humorous, since even the more limited analysis available before this paper showed pretty much the same amount of Antarctic warming. Compare the IPCC report with the same values from the new analysis (under various assumptions about the methodology).
(The different versions are the full reconstruction, a version that uses detrended satellite data for the co-variance, a version that uses AWS data instead of satellites, and one that uses PCA instead of RegEM. All show positive trends over the last 50 years.)
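For readers curious what this kind of gap-filling looks like in practice, here is a deliberately simplified, hypothetical sketch of PCA-style infilling. It is not the paper's RegEM code, just the general idea: use the dominant spatial pattern in the data you do have to estimate the values you don't.

```python
import numpy as np

def pca_infill(data, n_modes=1, n_iter=100):
    """Iteratively fill NaN gaps using a truncated SVD -- a toy
    stand-in for RegEM-style infilling, not the study's method."""
    col_means = np.nanmean(data, axis=0)
    filled = np.where(np.isnan(data), col_means, data)  # first guess
    gaps = np.isnan(data)
    for _ in range(n_iter):
        mean = filled.mean(axis=0)
        U, s, Vt = np.linalg.svd(filled - mean, full_matrices=False)
        # Rebuild the field from its leading mode(s) only
        approx = (U[:, :n_modes] * s[:n_modes]) @ Vt[:n_modes] + mean
        filled[gaps] = approx[gaps]  # update only the missing entries
    return filled

# Demo on synthetic rank-1 "station data": 10 time steps x 3 stations
truth = np.outer(np.arange(1.0, 11.0), np.array([1.0, 2.0, 3.0]))
obs = truth.copy()
obs[4, 1] = np.nan              # knock out one observation
est = pca_infill(obs, n_modes=1)
print(round(est[4, 1], 2))      # converges close to the true value, 10.0
```

On clean synthetic data the recovery is near-exact; real station networks are noisier, which is why the paper tests several methodological variants.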
Further contradictions abound: Morano, who clearly wants it to have been cooling, hedges his bets with a “Volcano, Not Global Warming Effects, May be Melting an Antarctic Glacier” Hail Mary pass. Good luck with that!
It always helps if you haven’t actually read the study in question. That way you can just make up conclusions:
Scientist adjusts data — presto, Antarctic cooling disappears
Nope. It’s still there (as anyone reading the paper will see) – it’s just put into a larger scale and longer term context (see figure 3b).
Inappropriate personalisation is always good fodder. Many contrarians seemed disappointed that Mike was only the fourth author (the study would have been much easier to demonise if he’d been the lead). Some pretended he was anyway, and just for good measure accused him of being a ‘modeller’ as well (heaven forbid!).
Others also got in on the fun. A chap called Ross Hays posted a letter to Eric on multiple websites and on many comment threads. On Joe D’Aleo’s site, this letter was accompanied with this little bit of snark:
Icecap Note: Ross shown here with Antarctica’s Mount Erebus volcano in the background was a CNN forecast Meteorologist (a student of mine when I was a professor) who has spent numerous years with boots on the ground working for NASA in Antarctica, not sitting at a computer in an ivory tower in Pennsylvania or Washington State
This is meant as a slur against academics of course, but is particularly ironic, since the authors of the paper have collectively spent over 8 seasons on the ice in Antarctica, 6 seasons in Greenland and one on Baffin Island in support of multiple ice coring and climate measurement projects. Hays’ one or two summers there, his personal anecdotes and misreadings of the temperature record, don’t really cut it.
Neither do rather lame attempts to link these results with the evils of “computer modelling”. According to Booker (for it is he!) because a data analysis uses a computer, it must be a computer model – and probably the same one that the “hockey stick” was based on. Bad computer, bad!
The proprietor of the recently named “Best Science Blog”, also had a couple of choice comments:
In my opinion, this press release and subsequent media interviews were done for media attention.
This remarkable conclusion is followed by some conspiratorial gossip implying that a paper that was submitted over a year ago was deliberately timed to coincide with a speech in Congress from Al Gore that was announced last week. Gosh these scientists are good.
All in all, the critical commentary about this paper has been remarkably weak. Time will tell of course – confirming studies from ice cores and independent analyses are already published, with more rumoured to be on their way. In the meantime, floating ice shelves in the region continue to collapse (the Wilkins will be the tenth in the last decade or so) – each of them with their own unique volcano no doubt – and gravity measurements continue to show net ice loss over the Western part of the ice sheet.
Nonetheless, the loss of the Antarctic cooling meme is clearly bothering the contrarians much more than the loss of 10,000 year old ice. The poor level of their response is not surprising, but it does exemplify the tactics of the whole ‘bury one’s head in the sand’ movement – they’d much rather make noise than actually work out what is happening. It would be nice if this demonstration of intellectual bankruptcy got some media attention itself.
That’s unlikely though. It’s just not news.
DENVER – Put on your poodle skirts and tune in Elvis on the transistor radio, because it’s starting to look a lot like the 1950s.
Unfortunately, this won’t be the nostalgic ’50s of big cars and pop music.
The 1950s that could be on the way to Colorado is the decade of drought.
So says Brian Bledsoe, a Colorado Springs meteorologist who studies the history of ocean currents and uses what he learns to make long-term weather forecasts.
“I think we’re reliving the ’50s, bottom line,” Bledsoe said Friday morning at the annual meeting of the Colorado Water Congress.
Bledsoe studies the famous El Niño and La Niña ocean currents. But he also looks at other, less well-known cycles, including long-term temperature cycles in the oceans.
In the 1950s, water in the Pacific Ocean was colder than normal, but it was warmer than usual in the Atlantic. That combination caused a drought in Colorado that was just as bad as the Dust Bowl of the 1930s.
The ocean currents slipped back into their 1950s pattern in the last five years, Bledsoe said. The cycles can last a decade or more, meaning bad news for farmers, ranchers, skiers and forest residents.
“Drought feeds on drought. The longer it goes, the harder it is to break,” Bledsoe said.
The outlook is worst for Eastern Colorado, where Bledsoe grew up and his parents still own a ranch. They recently had to sell half their herd when their pasture couldn’t provide enough feed.
“They’ve spent the last 15 years grooming that herd for organic beef stock,” he said.
Bledsoe looks for monsoon rains to return to the Four Corners and Western Slope in July. But there’s still a danger in the mountains in the summer.
“Initially, dry lightning could be a concern, so obviously, the fire season is looking not so great right now,” he said.
Weather data showed that the past year's conditions were extreme.
Nolan Doesken, Colorado’s state climatologist, said the summer of 2012 was the hottest on record in Colorado. And it was the fifth-driest winter since record-keeping began more than 100 years ago.
Despite recent storms in the San Juan Mountains, this winter hasn’t been much better.
“We’ve had a wimpy winter so far,” Doesken said. “The past week has been a good week for Colorado precipitation.”
However, the next week’s forecast shows dryness returning to much of the state.
Reservoir levels are higher than they were in 2002 – the driest year since Coloradans started keeping track of moisture – but the state is entering 2013 with reservoirs that were depleted last year.
“You don’t want to start a year at this level if you’re about to head into another drought,” Doesken said.
It was hard to find good news in Friday morning’s presentations, but Bledsoe is happy that technology helps forecasters understand the weather better than they did during past droughts. That allows people to plan for what’s on the way.
“I’m a glass-half-full kind of guy,” he said.
In an era when almost every energy technology is unpopular with somebody, the people who don’t want wind turbines, generating stations or new transmission lines installed in their neighborhoods often raise the idea of improving energy efficiency as an alternative.
That argument is particularly common in New York State and in Vermont, where state governments are trying to close nuclear reactors within their borders. So, how effectively can efficiency replace a reactor, making up for the loss of this zero-carbon energy source?
Not very, according to a new study of carbon dioxide output in Japan in the months around the Fukushima disaster.
Figures collected by the Breakthrough Institute, a group that often presents contrarian views on environmentalism and energy conservation, found that despite stringent efforts to use less energy, Japan emitted 4 percent more carbon dioxide in November 2011 than it did in the same month the previous year. After a quake and tsunami in March 2011 led to three meltdowns at the Fukushima nuclear plant, Japan began closing other plants as well, one because it appeared vulnerable to tsunami and others because local officials did not want them running.
Energy consumption dropped sharply and was nearly 10 percent lower last November than in November 2010, the institute’s figures show. But with natural gas, oil and coal substituting for about 46 reactors, the production of carbon dioxide per unit of energy produced ran about 15 percent higher.
The pattern was the same all year after the March 11 tsunami and quake: consumption dropped but fuel burn increased. This was true even though Japan ran office air-conditioners at far reduced levels last summer and some demand had disappeared because of damage from the disaster.
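The seeming paradox in these figures, lower consumption but higher emissions, follows directly from the arithmetic. A quick sanity check using the rounded percentages quoted above:

```python
# Back-of-envelope check on the Breakthrough Institute figures:
# energy use ~10% lower, CO2 per unit of energy ~15% higher.
consumption_factor = 1 - 0.10   # ~10% less energy consumed
intensity_factor = 1 + 0.15     # ~15% more CO2 per unit of energy

emissions_change = consumption_factor * intensity_factor - 1
print(f"net CO2 change: {emissions_change:+.1%}")  # roughly +3.5%
```

A 10 percent drop in consumption multiplied by a 15 percent rise in carbon intensity nets out to roughly the 4 percent emissions increase the institute reported for November.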
What analogy can be drawn at Indian Point, 30 miles north of New York City, or Vermont Yankee, near Brattleboro? This month, a New York State Assembly committee concluded that Indian Point was replaceable, an assertion sharply disputed by a business consumer group.
Jason Grumet, an air pollution expert and founder of the Bipartisan Policy Center, said it was hard to draw direct parallels. “The circumstances in the United States are obviously different from Japan,’’ he said. For one thing, Japan was parsimonious in its use of electricity even before Fukushima, and American consumers probably have more fat to cut.
But in either country, he said, it is true that “a decrease in nuclear production in favor of fossil fuels will increase carbon intensity of the power sector, and total carbon dioxide emissions.’’
“It’s an incredibly difficult public policy challenge’’ for the United States, Mr. Grumet said, with different imperatives colliding. “One is to ensure that the aging fleet of nuclear plants is held to the highest safety standards, and the second is to reduce greenhouse gas emissions,’’ he said. “And the third is to keep the lights on.”
Greenland ice a benchmark for warming
Core data Greenland was about eight degrees warmer 130,000 years ago than it is today, an analysis of an almost three-kilometre-long ice core in Greenland has revealed.
The finding by an international team of 38 institutions from 14 nations provides an important benchmark for climate change modelling and gives an insight into how the natural world will respond to global warming in the future.
The study, which involves CSIRO researchers, also suggests Antarctica's ice sheets may be more vulnerable to warming than previously thought.
Published in today's Nature journal, the results flow out of a four-year expedition known as the North Greenland Eemian Ice Drilling operation (NEEM).
Dr David Etheridge, principal research scientist with CSIRO Marine and Atmospheric Research who has worked on the project, says the NEEM program is the first to successfully reach down into Greenland's ice core into the Eemian period, which stretched from 130,000 years to 115,000 years ago.
"It has been something of a holy grail for Greenland work to achieve this … we are getting to ice close to the bedrock where you get melting and mixing of the ice layers."
Etheridge says in a process similar to assembling a jigsaw puzzle, scientists used comparisons with gas elements in Antarctica's deep ice core records to re-assemble the layers in their original sequence. Deep ice drilling in the Antarctic has reached as far back as 800,000 years.
Past and future
It is important to understand what happened in Greenland during the Eemian period because the temperatures experienced then are "within the realms of where we are heading", says Etheridge.
However, he says the previous warming was due to the Earth receiving more of the Sun's radiation due to its orbit at the time, while today's warming is being driven by increases in greenhouse gases in the atmosphere.
Nature paper co-author Dr Mauro Rubino, of CSIRO Marine and Atmospheric Research, says it had been previously estimated that Greenland's temperature was about 4°C warmer during the Eemian than now.
But this latest work used analysis of water-stable isotopes to estimate "the temperature 130,000 years ago was up to 8°C warmer [in Greenland] than what it is today", says Rubino.
It also shows sea levels were on average 6 metres higher.
The results provide "important benchmarks for future climate change projections" in temperature and the contribution of the two main ice sheets to sea level rises, Rubino says.
He says the study also reveals the Greenland ice sheet did not melt as much as previously thought so was not the major contributor to sea level at that time.
"It shows the major contribution to sea level rises was not coming from the Greenland ice shelf," he says.
"It was previously believed that Greenland melted entirely [during the Eemian], but in fact the ice sheet was not that much different from what it is now.
"Most of the contribution to sea level rise comes from these two big ice reserves [in Greenland and Antarctica] so one of the possible interpretations is Antarctica is more susceptible to climate change than we thought."
Etheridge agrees. He says the work shows the Greenland ice sheet survived during the Eemian - although it was about 400 metres thinner.
"From that figure you can deduce how much it contributed to the sea level rise and it is not as much as was thought.
"That throws things back to Antarctica ... previously the thought was Antarctica was too cold and too stable to be impacted."
Etheridge says CSIRO was invited by lead institution, the University of Copenhagen, to be involved in NEEM at its formation because of its expertise in analysing air composition in air bubbles trapped in deep ice.
Rubino says their team began analysis of gas bubbles from the first 80 to 100 metres of ice core down to the final 2540 metre depth.
This helped track changes in climate and temperature on a year-by-year basis.
He says the concentration of greenhouse gases such as carbon dioxide, methane and nitrous oxide in the air bubbles from the Eemian was much lower than what it is today.
Carbon With That Latte?
Sonia Narang 07.03.07, 6:00 AM ET
How Starbucks hopes to trim its emissions footprint.
In its shop in downtown San Mateo, Calif., for instance, baristas serve up about 40,000 cups of coffee drinks every month. Just based on utility bills alone, that means Starbucks is serving up about 4,900 pounds of carbon with its drinks--or about two ounces per cup.
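The per-cup figure can be reproduced from the article's own numbers; a trivial check:

```python
# Reproducing the article's per-cup estimate for the San Mateo store.
carbon_lbs_per_month = 4900     # figure implied by utility bills
cups_per_month = 40_000
OZ_PER_LB = 16

oz_per_cup = carbon_lbs_per_month * OZ_PER_LB / cups_per_month
print(f"{oz_per_cup:.2f} oz of CO2 per cup")  # about two ounces
```

4,900 pounds spread over 40,000 cups works out to 1.96 ounces per drink, matching the "about two ounces per cup" above.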
Starbucks executives say they are looking for ways to trim those carbon emissions. But they are reluctant to say just how much Starbucks' worldwide carbon footprint is--and how it has changed over the past few years. Starbucks has calculated the carbon footprint of its North American locations only once, in 2003. Since then, its number of U.S. company-owned stores has almost doubled to 6,281. Its international company-owned locations, also left out of the calculation, now number more than 1,500.
"Although we have grown in size, the nature of our business remains the same--the operation of retail stores and roasting coffee," says Jim Hanna, environmental affairs manager at Starbucks in Seattle. While Starbucks chooses not to calculate its carbon footprint every year, the company does conduct annual progress checks, but these numbers are not publicly reported.
Other eco-friendly companies are also surprisingly coy. Last month, for instance, Google led a group of 40 other companies (including Starbucks) in kicking off the "Climate Savers Computing Initiative," a project aimed at building and buying more energy-efficient PCs.
Google is nonetheless keeping a watch on the size of its carbon footprint and hopes to achieve carbon neutrality by the end of this year by using non-carbon energy sources for much of its power needs and purchasing carbon offsets for the rest. Recently, Google flipped the switch on 1.6 megawatts of solar power modules on the roof of its Mountain View headquarters.
Starbucks was early among eco-sensitive companies. Executives became convinced early in this decade that atmospheric carbon could wreak havoc on the global climate--and so on the supply and price of coffee beans. "We're facing environmental risks posed by climate change that could negatively affect many aspects of our company, including our ability to procure coffee," Hanna says.
Temperature and rainfall dictate how much coffee comes out of regions including Latin America and Asia. "As we hope to increase to 40,000 stores worldwide in the next 10 years, we're going to need a larger supply," Hanna says.
In 2003, Starbucks hired Denver-based engineering firm CH2M Hill to calculate the carbon footprint of the approximately 3,700 stores it then had in North America. CH2M Hill began measuring corporate footprints in the late 1990s and has done comparable calculations for a few dozen companies, including Nike (nyse: NKE - news - people ), 3M (nyse: MMM - news - people ), SC Johnson and energy firm Kinder Morgan (nyse: KMI - news - people ).
Doing such calculations is still something of a black art. CH2M Hill's Lisa Grice, who worked on the coffee company's carbon footprint, says the final number primarily includes electricity used in retail stores. Carbon calculators take into account stores' geographic locations. That's because electricity generated at power plants in one state may come from a different source than a power plant in another state. Some stores may get electricity from coal-fired plants, which results in greater carbon emissions, while others may depend on hydroelectric power, which has a lower carbon byproduct.
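As a rough illustration of the geography-dependent accounting Grice describes, here is a hypothetical sketch: each store's electricity use is weighted by the emission factor of its regional grid. All store values and factors below are made up for the sketch, not CH2M Hill's actual data.

```python
# Illustrative regional grid emission factors (tons CO2 per MWh).
# Real factors vary by utility and year; these are invented examples.
GRID_FACTOR_TONS_PER_MWH = {
    "coal_heavy": 0.95,
    "mixed": 0.55,
    "hydro_heavy": 0.10,
}

# (annual electricity in MWh, grid region) -- made-up store data
stores = [
    (120, "coal_heavy"),
    (110, "mixed"),
    (130, "hydro_heavy"),
]

total_tons = sum(mwh * GRID_FACTOR_TONS_PER_MWH[region]
                 for mwh, region in stores)
print(f"Estimated footprint: {total_tons:.1f} tons CO2")
```

Note how the hydro-served store emits a small fraction of what the coal-served store does on similar electricity use, which is why the footprint calculation cannot simply multiply total kWh by one national average.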
It took about half a year of data collection and complex calculations to figure out that Starbucks emitted 295,000 tons of carbon into the atmosphere in 2003. The company decided to leave out an additional 81,000 tons of carbon dioxide it emitted by transporting coffee materials and disposing of solid waste. According to Starbucks Environmental Affairs Manager Ben Packard, the company can only control and manage carbon emissions from energy used in retail stores and coffee-roasting plants.
Starbucks attributes 81% of its greenhouse gas emissions to purchased electricity and 18% to coffee roasting at its three North American plants and natural gas usage in stores.
That 295,000-ton figure gives Starbucks a small carbon footprint, among a list of about 1,000 companies compiled by the Carbon Disclosure Project, a London-based nonprofit. Near the top of the list is energy giant American Electric Power (nyse: AEP - news - people ) with 146.5 million tons of carbon emissions. Next in line are oil and gas companies Royal Dutch/Shell and British Petroleum (nyse: BP - news - people ) with 105 million tons and 92 million tons.
Comparatively, General Electric's (nyse: GE - news - people ) 12.4 million ton footprint makes it a medium-size emitter. The smallest carbon emitters weighed in at a few thousand tons. Most of the lower footprints belong to insurance companies, retailers and banks.
Starbucks execs say that even as they've been growing the number of outlets, they've been trying to be more energy efficient. In 2005, Starbucks joined the World Resources Institute's Green Power Market Development Group, a consortium of 15 companies ranging from Staples (nasdaq: SPLS - news - people ) to Google. The group helps its members purchase renewable energy at lower prices. Last year, the coffee company increased its wind power to 20% of the total energy usage in North American stores. This offset 62,000 tons of carbon dioxide.
But to track progress in reducing carbon emissions accurately, companies need to update those footprints frequently, says Marcus Peacock of the U.S. Environmental Protection Agency. "We've asked companies to check their numbers annually," he says.
A number of companies are doing just that. Both Intel (nasdaq: INTC - news - people ) and Sun Microsystems (nasdaq: SUNW - news - people ), which are also part of the Climate Savers Computing Initiative, report their carbon footprints annually. Intel's carbon footprint added up to 4 million tons in 2006, a number that includes worldwide operations. Sun first calculated its footprint at 255,000 tons last year, and used past data to figure out carbon emissions dating back four years. The company also reports up-to-date carbon numbers on its Web site.
"We calculate this monthly so that we can make sure we're on track with improving emissions," says Sun's VP of Eco Responsibility Dave Douglas.
Both Intel and Sun are part of the EPA's Climate Leaders Program, a group of companies that sets tangible carbon reduction goals. Climate Leaders began five years ago, when few companies even knew the meaning of carbon footprint. Now, the program boasts 132 members.
In the meantime, Starbucks executives insist they are looking for ways to improve energy efficiency and encourage their customers to do the same. This summer, Starbucks told its customers to go green through a number of high-profile campaigns, including "Green Umbrellas for a Green Cause" and the online Planet Green Game (planetgreengame.com). Starbucks will also start monitoring the energy usage of specific equipment at some stores later this year. "We'll install individual meters on espresso machines, refrigerators, water filtration systems and other components," Hanna says.
This doesn't necessarily mean you'll see a green espresso maker at a Starbucks near you anytime soon. "Quality and performance come first," Hanna says.
The “presidi” translates as “garrisons” (from the French word, “to equip”), as protectors of traditional food production practices
Monday, March 23, 2009
This past year, I have had rewarding opportunities to observe traditional food cultures in varied regions of the world. These are:
Athabascan Indian in the interior of Alaska (the traditional Tanana Chiefs Conference tribal lands) in July, 2008 (for more, read below);
Swahili coastal tribes in the area of Munje village (population about 300), near Msambweni, close to the Tanzania border, in December, 2008-January, 2009 (for more, read below); and the Laikipia region of Kenya (January, 2009), a German canton of Switzerland (March, 2009), and the Piemonte-Toscana region of northern/central Italy (images only, February-March, 2009).
In Fort Yukon, Alaska, salmon is a mainstay of the diet. Yet, among the Athabascan Indians, threats to subsistence foods and stresses on household economics abound. In particular, high prices for external energy sources (as of July, 2008, almost $8 for a gallon of gasoline and $6.50 for a gallon of diesel, which is essential for home heating), as well as low Chinook salmon runs and moose numbers.
Additional resource management issues pose threats to sustaining village life – for example, stream bank erosion along the Yukon River, as well as uneven management in the Yukon Flats National Wildlife Refuge. People are worried about ever-rising prices for fuels and store-bought staples, and fewer and fewer sources of wage income. The result? Villagers are moving out from outlying areas into “hub” communities like Fort Yukon -- or another example, Bethel in Southwest Alaska – even when offered additional subsidies, such as for home heating. But, in reality, “hubs” often offer neither much employment nor relief from high prices.
In Munje village in Kenya, the Digo, a Bantu-speaking, mostly Islamic tribe in the southern coastal area of Kenya, enjoy the possibilities of a wide variety of fruits, vegetables, and fish/oils.
Breakfast in the village typically consists of mandazi (a fried bread similar to a doughnut), and tea with sugar. Lunch and dinner are typically ugali and samaki (fish), maybe with some dried cassava or chickpeas.
On individual shambas (small farms), tomatoes, cassava, maize, cowpeas, bananas, mangos, and coconut are typically grown. Ugali is consumed every day, as are cassava, beans, oil, fish -- and rice, coconut, and chicken, depending on availability.
Even with their own crops, villagers today want very much to enter the market economy and will sell products from their shambas to buy staples and the flour needed to make mandazis, which they in turn sell. Sales of mandazis (and mango and coconut, to a lesser extent) bring in some cash for villagers.
A treasured food is, in fact, the coconut. This set of pictures shows how coconut is used in the village. True, coconut oil now is reserved only for frying mandazi. But it also is used as a hair conditioner, and the coconut meat is eaten between meals. I noted also that dental hygiene and health were good in the village. Perhaps the coconut and fish oils influence this (as per the work of Dr. Weston A. Price).
Photos L-R: Using a traditional conical basket (kikatu), coconut milk is pressed from the grated meat; Straining coconut milk from the grated meat, which is then heated to make oil; Common breakfast food (and the main source of cash income), the mandazi, is still cooked in coconut oil
Note: All photos were taken by G. Berardi
Thursday, February 19, 2009
Despite maize in the fields, it is widely known that farmers are hoarding stocks in many districts. Farmers are refusing the NCPB/government price of Sh1,950 per 90-kg bag. They are waiting to be offered at least the same amount of money as that which was being assigned to imports (Bii, 2009b). “The country will continue to experience food shortages unless the Government addresses the high cost of farm inputs to motivate farmers to increase production,” said Mr. Jonathan Bii of Uasin Gishu (Bartoo & Lucheli, 2009; Bii, 2009a, 2009b; Bungee, 2009).
Pride and politics, racism and corruption are to blame for food deficits (Kihara & Marete, 2009; KNA, 2009; Muluka, 2009; Siele, 2009). Clearly, what are needed in Kenya are food system planning, disaster management planning, and protection and development of agricultural and rural economies.
Click here for the full text.
Photos taken by G. Berardi
Cabbage, an imported food (originally), and susceptible to much pest damage.
Camps still remain for Kenya’s Internally Displaced Persons resulting from post-election violence forced migrations. Food security is poor.
Lack of sustained recent short rains has resulted in failed maize harvests.
Friday, January 16, 2009
Today I went to a lunch time discussion of sustainability. This concept promotes development with an equitable eye to the triple bottom line - financial, social, and ecological costs. We discussed how it seemed relatively easier to discuss the connections between financial and ecological costs than between social costs and other costs. Sustainable development often comes down to "green" designs that consider environmental impacts or critiques of the capitalist model of financing.
As I thought about sustainable development, or sustainable community management if you are a bit queasy about the feasibility of continuous expansion, I considered its corollaries in the field of disaster risk reduction. It struck me again that it is somewhat easier to focus on some components of the triple bottom line in relation to disasters.
The vulnerability approach to disasters has rightly brought into focus the fact that not all people are equally exposed to or impacted by disasters. Rather, it is often the poor or socially marginalized most at risk and least able to recover. This approach certainly brings into focus the social aspects of disasters.
The disaster trap theory, likewise, brings into focus the financial bottom line. This perspective is most often discussed in international development and disaster reduction circles. It argues that disasters destroy development gains and cause communities to de-develop unless both disaster reduction and development occur in tandem. Building a cheaper, non-earthquake-resistant school in an earthquake zone may make short-term financial sense. However, over the long term, this approach is likely to result in loss of physical infrastructure, human life, and learning opportunities when an earthquake does occur.
What seems least developed to me, though I would enjoy being rebutted, is the ecological bottom line of disasters. Perhaps it is an oxymoron to discuss the ecological costs of disasters, given that many disasters are triggered by natural ecological processes like cyclones, forest fires, and floods. It might also be an oxymoron simply because a natural hazard disaster is really looking at an ecological event from an almost exclusively human perspective. It's not a disaster if it doesn't destroy human lives and human infrastructure. But the lunch-time discussion made me wonder if there wasn't something of an ecological bottom line to disasters in there somewhere. Perhaps it is in the difference between an ecological process heavily or lightly impacted by human ecological modification. Is a forest fire in a heavily managed forest different from one in an unmanaged forest? Certainly logging can heighten the impacts of heavy rains by inducing landslides, resulting in a landscape heavily rather than lightly impacted by the rains. Similar processes might also be true in the case of heavily managed floodplains. Flooding is concentrated and increased in areas outside of levee systems. What does that mean for the ecology of these locations? Does a marsh manage just as well in low as in high flooding? My guess would be no.
And of course, there is the big, looming disaster of climate change. This is a human-induced change that may prove quite disastrous to many an ecological system, everything from our pine forests here, to arctic wildlife, and tropical coral reefs.
Perhaps, we disaster researchers, need to also consider a triple bottom line when making arguments for the benefits of disaster risk reduction.
Tuesday, January 13, 2009
This past week the Northwest experienced a severe barrage of weather systems back to back. Everyone seemed to be affected. Folks were re-routed on detours, got soaked, slipped on ice, or had to spend money to stay a little warmer. In Whatcom and Skagit Counties, hundreds to thousands of people are currently in the process of recovering and cleaning up after the floods. These people live in the rural areas throughout the county, where their devastation is less widely known and their vulnerability to flood hazards greater.
Luckily, there are local agencies and non-profits who are ready at a moment’s call to help anyone in need. The primary organization that came to the aid of the flood victims was the American Red Cross.
The last week I began interning and volunteering with one of these non-profits, the Mt. Baker American Red Cross (ARC) Chapter. While I am still in the process of getting screened and officially trained, I received first-hand experience and saw how important this organization is to the community.
With the flood waters rising throughout the week, people were flooded out of their homes and rescued from the overflowing rivers and creeks. As the needs for help increased, hundreds of ARC volunteers were called to service. Throughout the floods there have been several shelters opened to accommodate the needs of these flood victims. On Saturday I was asked to help staff one of these shelters overnight in Ferndale.
While I talked with parents and children, I became more aware of the stark reality of how these people have to recover from having all their possessions covered in sewage and mud and damaged by flood waters. In the meantime, these flood victims have all their privacy exposed to others in a public shelter, while they work to find stability in the middle of all the traumas of the events. As I sat talking and playing with the children, another thought struck me. Children are young and resilient, but it must be very difficult when they connect with a volunteer and then lose that connection soon after. Sharing a shelter with the folks over the weekend showed a higher degree of reality and humanity to the situation than the news coverage ever could.
I posted this bit about my volunteer experience because it made me realize something about my education and degree track in disaster reduction and emergency planning. We look at ways to create a more sustainable community, and we need to remember that community service is an important part of creating this ideal. Underlying sustainable development is the triple bottom line (social, economic, and environmental). Volunteers and non-profits are a major part of this social line of sustainability. Organizations like the American Red Cross only exist because of volunteers. So embrace President-elect Obama's call for a culture of civil service this coming week and make a commitment to the organization of your choice with your actions or even your pocketbook. Know that sustainable development cannot exist without social responsibility.
Thursday, January 8, 2009
It's been two days now that schools have been closed in Whatcom County, not for snow, but for rain and flooding. This unusual event coincides with record flooding throughout Western Washington, just a year after record flooding closed I-5 for three days and Lewis County businesses experienced what they then called an unprecedented 500-year flood. I guess not.
There are many strange things about flood risk notation, and this idea of a 500-year flood often trips people up. People often believe a flood of that size will happen only once in 500 years. On a probabilistic level, this is inaccurate. A 500-year flood simply has a 0.2% probability of happening each year. A more useful analogy might be to tell people they are rolling a 500-sided die every year and hoping that it doesn't come up with a 1. Next year they'll be forced to roll again.
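The die-rolling framing can be carried one step further: over any multi-year horizon, the chance of seeing at least one such flood compounds. A short calculation (assuming independent years, as the analogy does) makes the point concrete:

```python
# Probability of at least one "500-year" flood over a span of years,
# treating each year as an independent 0.2% chance (a fresh die roll).

def prob_at_least_one_flood(annual_prob: float, years: int) -> float:
    """Return P(at least one event in `years` independent annual trials)."""
    return 1.0 - (1.0 - annual_prob) ** years

# Over a 30-year mortgage, the odds are far from negligible:
p30 = prob_at_least_one_flood(0.002, 30)
print(f"Chance of a 500-year flood within 30 years: {p30:.1%}")   # ~5.8%

# And even over 500 years, the flood is not guaranteed to occur:
p500 = prob_at_least_one_flood(0.002, 500)
print(f"Chance within 500 years: {p500:.1%}")                     # ~63.2%
```

So a homeowner on a standard mortgage faces roughly a 1-in-17 chance of the "once in 500 years" event, which is why the label misleads.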
But this focus on misunderstandings of probability often hides an even larger societal misunderstanding: flood risk changes when we change the environment in which it occurs. If a flood map tells you that you are not in the floodplain, better check the date of the map. Most maps are utterly out of date and many vastly underestimate present flood risk. There are several reasons this happens. Urban development, especially development with a lot of parking lots and buildings that don't let water seep into the ground, causes rainwater to move quickly into rivers rather than seep into the ground and release slowly. Developers might counter that they are required to create runoff catchment wetlands when they do build. They do, but these requirements may very well be based upon outdated data on flood risk. Thus, each new development never fully compensates for its runoff, a small problem for each site but a mammoth problem when compounded downstream.
Deforestation can have the same effect, with the added potential for house-crushing and river-clogging mudslides. Timber harvesting is certainly an important industry in our neck of the woods. Not only is commercial logging an important source of jobs for many rural and small towns, logging on state Department of Natural Resources land is the major source of funding for K-12 education. Yet commercial logging, like other industries, suffers from a problem of cost externalization. When massive mudslides occurred during last year's storm, Weyerhaeuser claimed that it wasn't its logging practices, but an unprecedented, out-of-the-blue 500-year storm that caused them. While it is doubtful the slides would have occurred on uncut land, that isn't the only fallacy. When the slides did occur, the costs of repairing roads, treatment plants, and bridges went to the county and were often passed on to the nation's taxpayers through state and federal recovery grants. Thus, what should have been paid by Weyerhaeuser, 500-year probability or not, was paid by someone else.
Finally, there is local government. Various folks within local governments set regulations for zoning, deciding what will be built and where. Here is the real crux of the problem. Local government also gets an increase in revenue in the form of property, sales, and business income taxes. Suppress the updating of floodplain maps, and you get a short-term profit and, often, a steady supply of happy voters. You might think these local governments will have to pay when the next big flood comes, but often that can be avoided. Certainly, they must comply with federal regulations on floodplain management to be part of the National Flood Insurance Program, but that program has significant leeway and little monitoring. Like commercial logging, disaster-stricken local governments can often push the recovery costs off to individual homeowners through the FEMA homeowner's assistance program, and off to state and federal agencies by receiving disaster recovery and community development grants and loans. Certainly, some communities are so regularly devastated, and have so few resources, that disasters simply knock them down before they can even stand up again. But others have found loopholes and can profit by continuing to use old flood maps and failing to aggressively control floodplain development.
What is it going to take to really change this system and make it unprofitable to profit from bad land use management?
Here’s a good in-depth article on last year’s landslides in Lewis County. http://seattletimes.nwsource.com/html/localnews/2008048848_logging13m.html
An interesting article on the failure of best management practices in development catchment basins can be found here: Hur, J. et al (2008) Does current management of storm water runoff adequately protect water resources in developing catchments? Journal of Soil and Water Conservation, 63 (2) pp. 77-90.
Monday, December 29, 2008
It’s difficult to imagine a more colorful book celebrating locally grown and locally marketed foods than David Westerlund’s Simone Goes to the Market: A Children’s Book of Colors Connecting Face and Food. This book is aimed at families and the foods they eat. Who doesn’t want to know where their food is coming from – the terroir, the kind of microclimate it’s produced in, as well as who’s selling it? Gretchen sells her pole beans (purple), Maria her Serrano peppers (green), Dana and Matt their freshly roasted coffee (black), Katie her carrots (orange), a blue poem from Matthew, brown potatoes from Roslyn, yellow patty pan squash from Jed, red tomatoes (soft and ripe) from Diana, and golden honey from Bill (and his bees). This is a book perfect for children of any age who want to connect with the food systems that sustain community. Order from firstname.lastname@example.org.
Grassland in Mabi County destroyed by a glacial lake outburst flood (GLOF). This area used to be farmland; now it is covered with black glacial deposits left when the glacial lake burst. Global warming is causing Himalayan glaciers to melt at an unprecedented rate, making GLOFs more frequent. The latest research (2009) indicates that the Chinese Himalayan region currently has 143 glacial lakes, 44 of which are at very high risk of bursting.
© Greenpeace / Du Jiang
Electric vehicles have been touted as the dream technology to solve our suburban transport challenges and rescue us from oil dependence and environmental threats. Yet technology use occurs in a social context. Almost no discussion of electric vehicles has addressed the uneven suburban social patterns among which electric vehicles might be adopted.
The evidence that my colleagues Neil Sipe, Terry Li and I have assembled suggests the socio-economic structure of Australian suburbia, in combination with the distribution of public transport infrastructure, constitutes a major barrier to the widespread adoption of electric vehicles, especially among the most car-dependent households.
Relying on electric vehicles as a solution to energy and environmental problems may perpetuate suburban social disadvantage in a period of economic and resource insecurity.
Australia’s five largest cities are, as a group, the most car-dependent outside the United States. Our previous studies (Dodson and Sipe 2007; 2008) have shown that outer suburban residents, especially those with lower socio-economic capacity, are among those most exposed to the pressures of higher transport fuel prices.
Future transport fuel costs are likely to be even higher (currently oil is approximately US$100 per barrel). Unconventional oil sources such as shale or tar sands may be abundant, but they have much higher production costs than conventional light crude. Their current production boom is underpinned by expectations that global oil prices will remain high or increase further over the long term.
Higher oil prices and the need to constrain carbon emissions will likely lead to much higher transport fuel costs than have prevailed in the past decade.
Electric vehicles are often presented as the most likely way to resolve this transport conundrum. Australia’s 2012 Energy White Paper alludes to a transition to electric vehicles as the economy of conventional fuels wanes.
Much of the Energy White Paper and the rhetoric around electric vehicles assumes an unproblematic transition – consumers will change their behaviour in response to price pressures. There is little discussion of potential barriers and impediments to this comforting, convenient narrative.
It makes sense that households who are most car dependent and least able to afford higher fuel prices would be the most eager to switch to an electric car. But, it turns out, the social structure of Australian suburbia means these groups are poorly placed to lead such a transition.
In our study of Brisbane we created datasets linking vehicle fuel efficiency with household socio-economic status. In our analysis, high vehicle fuel efficiency, including hybrids, serves as a proxy for future electric vehicles. We linked motor vehicle registration data with the Green Vehicle dataset on fuel efficiency, plus travel and socio-economic data from the ABS Census.
Our analysis builds a rich picture of how the spatial distribution of vehicle efficiency intersects with suburban socio-spatial patterns, using Brisbane and Sydney as case studies.
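The pattern the analysis uncovers can be sketched with toy numbers. The suburb records, field names, and trip assumptions below are hypothetical illustrations, not the study's actual data or datasets:

```python
# Illustrative per-suburb records: distance from the CBD (km), mean one-way
# commute (km), fleet fuel consumption (L/100 km), and a socio-economic index.
# All values are invented to mirror the gradient the study describes.
suburbs = [
    {"name": "Inner",  "cbd_km": 5,  "commute_km": 8,  "l_per_100km": 7.5,  "ses_index": 1100},
    {"name": "Middle", "cbd_km": 15, "commute_km": 14, "l_per_100km": 9.0,  "ses_index": 1000},
    {"name": "Outer",  "cbd_km": 35, "commute_km": 24, "l_per_100km": 11.0, "ses_index": 900},
]

def annual_fuel_litres(commute_km, l_per_100km, trips_per_year=460):
    """Fuel burned commuting: two trips a day over roughly 230 working days."""
    return commute_km * trips_per_year * l_per_100km / 100.0

for s in suburbs:
    litres = annual_fuel_litres(s["commute_km"], s["l_per_100km"])
    print(f'{s["name"]:>6}: {litres:,.0f} L/year (SES index {s["ses_index"]})')
```

With these invented figures the outer suburb burns more than four times the commuting fuel of the inner one, on a lower socio-economic index, which is the compounding disadvantage the article describes.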
We found that the average commuting distance increases with distance from the CBD while average fuel efficiency of vehicles declines. So outer suburban residents travel further, in less efficient vehicles, than more centrally situated households. Outer suburban residents are also likely to be on relatively lower incomes than those closer in.
The result is those living in the outer suburbs have relatively weaker socio-economic status but are paying more for transport. For example, one-third of the most disadvantaged suburbs in greater Brisbane also have the most energy-intensive motor vehicle use.
A socially equitable transition to highly fuel efficient or electric vehicles ought to favour those with the highest current exposure to high fuel prices. Yet our research finds it’s not likely to happen.
Outer suburban groups also own the oldest vehicles in the fleet – they can’t afford newer ones – and this also contributes to poor fuel efficiency and big transport bills. The newest, most fuel-efficient vehicles are more commonly purchased by wealthier inner-urban households. They can afford the car, but have less need of the efficiency because they don’t travel as far. If such patterns carry over to electric vehicles, their high cost and novelty status means they are likely to be taken up by this more advantaged group as well. Any subsidies offered to spur their uptake will be largely captured by the wealthy.
The implication of our analysis is that the intersection of new fuel and vehicle technology costs with the social and travel patterns of Australian cities means suburban households will face continued socio-economic stress even as these new vehicles become more widely adopted.
So if new technologies such as electric cars aren’t the solution, how can we secure suburban households against higher fuel prices?
We need a sustained strategy to redress the grossly inequitable supply of public transport to our suburbs. We also need to decentralise our cities, getting jobs and services out into the suburbs and reducing the distances people need to travel by car.
Electric vehicles may be fantastic technology but they risk heading up a cul-de-sac of real suburban vulnerability.
The full paper on which this article is based can be downloaded for free until 6 March 2013.
Jago Dodson receives funding from the Australian Research Council, the National Climate Change Adaptation Research Facility, Logan City Council, Springfield Land Corporation and Lend Lease Communities.
WFP's office in Kathmandu is leading the way towards carbon neutrality with an ambitious solar project that will cut down its greenhouse emissions by over a third. Copyright: WFP/Meghbar Chemjong
Frequent power outages and melting glaciers are a constant reminder in Nepal of the shortcomings of fossil fuels. In an effort to reduce its carbon footprint and become more energy-efficient, WFP's office there is looking to solar power as a sustainable alternative.
by Deepesh Shrestha, Public Information Officer and Tyler McMahon, Solar Project Coordinator
KATHMANDU -- For much of the world, climate change is still an abstract concept. But in Nepal it is a visible and ominous reality — just ask WFP Nepal Country Director Richard Ragan. A mountaineer who first came to Nepal in the early 1990s to climb the Himalayas, Ragan saw first-hand how much had changed when he came back with WFP in 2006.
On World Environment Day 2007, UN Secretary General Ban Ki-moon called on all agencies and programmes to “go green” and become climate neutral. See how far we’ve come at the new Greening the Blue website.
“Twenty years ago, when I first came here climbing, there was no lake at the bottom of Mt. Imja Tse,” he says. But there is now. Runoff from a melting glacier has created a body of water over one kilometre wide in less than 16 years.
One of many examples of climate change in Nepal, the country also struggles with major energy shortages that leave much of the country without electricity for long stretches of time.
“Nepal is a country with abundant energy potential yet most people still live without power for 50% of the day. This just didn't make sense to me so I felt we needed to demonstrate that there were alternatives,” said Ragan.
In an effort to shrink its carbon footprint and become less reliant on the national energy grid, the Kathmandu office launched an ambitious project to cut greenhouse emissions by 30% using solar energy. This will eliminate at least 25 tonnes of CO2 emissions per year and will ultimately pay for itself through reduced electricity costs.
The project got off the ground in late May with the installation of a 10 kilowatt peak grid-interactive power system and stand-alone solar-powered security lights. This alone will power the office’s server room, satellite and telephone communication systems, and 11 computers, saving around 30,000 watt-hours per day.
The second phase, to be completed by mid-August, will raise the peak to 22 kilowatts, providing enough solar energy to power all of the lights, computers and printers for more than 80 staff. At that point, WFP Nepal will be able to do without its generator altogether, along with the 11,000 litres of fuel it consumes every year.
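A back-of-envelope calculation using a typical diesel emission factor of about 2.7 kg CO2 per litre (an assumed value, not taken from the article) suggests the generator fuel alone accounts for the stated savings:

```python
# Rough cross-check on the article's figures. The diesel emission factor
# is an assumption; the fuel volume and daily solar output are from the text.
DIESEL_KG_CO2_PER_LITRE = 2.7   # assumed typical factor
generator_fuel_litres = 11_000  # litres per year, per the article

diesel_co2_tonnes = generator_fuel_litres * DIESEL_KG_CO2_PER_LITRE / 1000
print(f"CO2 avoided by retiring the generator: ~{diesel_co2_tonnes:.0f} t/year")

# Phase 1 alone: ~30 kWh/day of solar displacing generator and grid power.
daily_kwh = 30
annual_kwh = daily_kwh * 365
print(f"Phase 1 solar output: ~{annual_kwh:,} kWh/year")
```

Retiring the generator alone would avoid on the order of 30 tonnes of CO2 a year, consistent with the article's figure of at least 25 tonnes.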
The system is being installed by Solar Solutions Nepal, whose Managing Director Raj Thapa said it was the “biggest urban grid-interactive project in the country”.
Reducing our footprint
In order to meet the UN-wide goal of climate neutrality, WFP has launched a wide range of initiatives intended to reduce our carbon footprint and protect the environment in the countries where we work. Here are a few examples:
Huffington Post released an article today that looks at the impact of sewage overflow from Superstorm Sandy in New York and New Jersey. The article points out how storm surge results in overflow events and how rising sea levels will only exacerbate these events, resulting in more severe discharges.
The 11 billion gallons of untreated or partially treated sewage spilled due to storm surge in New York and New Jersey must be seen as a warning for all coastal cities. We must consider this warning as we rebuild Miami-Dade’s sewage system. As sea levels rise storm surges will increase in intensity and frequency. Miami-Dade’s facilities must be built to withstand these storm surges to avoid the kind of spills seen in the northeast.
“Princeton, N.J.-based Climate Central said that future sewage leaks are a major risk because rising sea levels can make coastal flooding more severe…The collective overflows – almost all in New York and New Jersey and due to storm surges – would be enough to cover New York City’s Central Park with a pile of sewage 41 feet high, Climate Central said.”
Read More Here:
Sandy Sewage Report: 11 Billion Gallons Of Untreated or Partially Treated Waste Was Released. www.huffingtonpost.org
On April 9, 2013 Biscayne Bay Waterkeeper, along with 131 other organizations, undersigned a letter to the United States Senate urging them to oppose advancing the Water Resources Development Act of 2013 (S. 601). This letter, put together by the Water Protection Network, points out significant problems in this bill:
“Particularly troubling are the streamlining provisions (Sections 2033 and 2032) which will force agency staff to make uninformed decisions, to rubber stamp unacceptable projects, and prioritize deadline compliance over effective review. They do this by:
Requiring the Corps of Engineers to carry out the shortest review possible;
Establishing arbitrary and unreasonably short deadlines for the public and resource agencies to comment;
Establishing arbitrary deadlines for resource agency decisions and recommendations;
Allowing the Corps to elevate multiple technical and substantive disagreements all the way to the President; and
Directing the Corps to impose multiple and ongoing fines on resource agencies that miss deadlines or disagree with the Corps on issues fully within the expertise of the resource agencies.
These provisions also could give the Corps control over reviews that are clearly outside of its jurisdiction, including consultation under Section 7 of the Endangered Species Act, review under the Fish and Wildlife Coordination Act, and reviews under laws governing activities in coastal areas and public lands.
Additionally, the bill threatens to exacerbate our nation’s fiscal deficits by rolling back long- established cost-sharing rules and expanding federal responsibilities into areas that have been the financial responsibility of non-federal project sponsors. If enacted as reported, the bill will result in overspending, overcapacity, and substantial and unnecessary damage to the nation’s major estuaries and harbors. Title VIII of the bill would immediately more than double spending on harbor maintenance without assurance of the cost-effectiveness or true need for the dredging. In addition, the Title eliminates the current 50 percent non-federal cost share for maintaining deep draft harbors from 45 to 50 feet of depth, making these costs 100 percent federal responsibility. The provision also makes dredging and maintenance of all approach channels to berths along federal navigation channels and all upland confined disposal of contaminated dredged sediments a 100 percent federal responsibility, rather than the current 100 percent non-federal responsibility. No one has ever even estimated the costs of such an expansion. This would likely cause increases in dredging of contaminated areas that otherwise never would have been contemplated, increasing toxic releases into the nation’s bays and estuaries. We strongly urge rejection of this title as representing a major setback for the nation’s water policy that will be both environmentally-damaging and represents an improper shift of spending and water project responsibility to the taxpayers.”
For more information on the 2013 WRDA see: http://www.waterprotectionnetwork.org/sitepages/downloads/WRDA_2013_NWF_Memo_EPW_Committee_3-18-13_Final.pdf
It is difficult to consider ourselves surrounded by nature in Miami, FL. In the city, on the interstate, or in the supermarket, it is easy to think of ourselves as removed from the nature of Muir’s Yosemite or Thoreau’s Walden Pond. An essay called “Thirteen Ways of Seeing Nature” by Jenny Price suggests that we reconsider how we think about nature in our city. She writes about nature in L.A., but her message applies to all cities.
Miami is confronted with a decrepit sewage system and the problems that this system is causing for the health of our environment. Our connection to nature is real whether we recognize it or not. We must consider difficult questions like “how are we connected to the nature around us?”, “how do we affect the health of the nature around us?”, and “how do we depend on the nature around us?”. As we move into a future full of challenges like Climate Change these questions are going to become more and more important.
I would encourage everyone to read this article by Jenny Price:
As Biscayne Bay Waterkeeper reflects on a successful clean-up this past week-end, it seems appropriate to consider another clean-up that happened two weeks ago.
On Sunday, March 3rd, Sean Bignami, was jogging on Virginia Key and came across an enormous pile of trash left over from the 9 mile music festival the night before.
Sean spoke with staff who were standing around the festival site, who said they could not pick up the trash because the wind was blowing it around. Sean took pictures and videos of the scene with his phone and posted them online along with a request that people join him the next day to help clean up the area.
Four graduate students joined Sean the following morning and picked up enough trash to fill 25 garbage bags!
Sean was unable to get a satisfactory response from the festival supervisor or the Miami parks department regarding accountability for this trash or penalties for the negligence on the part of the festival organizers.
The systems in place that are designed to prevent the festival from leaving piles of trash failed, and it is unclear if the festival will be held accountable. Regardless of this failure, the immediate response from concerned residents must be seen as a message to institutions who ignore the sanctity of our Bay. Biscayne Bay is home to concerned stewards, like Sean Bignami, who will not stand quietly while polluters leave trash on our shores.
Biscayne Bay Waterkeeper wishes to celebrate the stewardship shown in this story. Thank you Sean, and all who came out to help clean up after the 9 mile festival left their trash to be blown into the Bay!
See the article Miami Newtimes blog posted about this story here:
On Sunday, March 17, 2013, Biscayne Bay Waterkeeper and Sierra Club put on a clean-up at Peacock Park in the Grove. Volunteers paddled nearby waters and gathered a huge amount of trash. Thank you to the stewards of Biscayne Bay who volunteered their time to put a dent in the amount of trash in our waters.
Thank you for a successful clean-up!
There is plenty of trash to pick up in Biscayne Bay.
Join Biscayne Bay Waterkeeper and the Sierra Club this Sunday, March 17th, for a paddle clean up at Peacock Park in the Grove (2820 Mcfarland Road, Miami, FL 33133). The clean up will start at 9 am and end at 2 pm. We will launch next to the boardwalk. Please bring your own gloves and trash bags. The Sierra club has a limited number of canoes, so we are encouraging attendees to bring their own kayaks, canoes, or paddle boards. If you do not have a boat, please contact Mark at Sierra Club to reserve a canoe. (contact Mark with any questions: email@example.com/ 305 632 7514)
(Miami, February 28, 2013) - Samples of beach water collected at Dog Beach on Virginia Key did not meet the recreational water quality standard for enterococci. By state regulation, the Florida Department of Health is required to issue an advisory to inform the public in a specific area when this standard is not met.
An advisory for Dog Beach on Virginia Key has been issued because two consecutive samples collected at the beach exceeded the federal and State recommended standard for enterococci (greater than 104 colony forming units per 100ml for a single sample).
Additional beach water samples at the Dog Beach on Virginia Key have been collected and further results are pending.
The advisory issued recommends not swimming at this location at this time. The results of the sampling indicate that water contact may pose an increased risk of illness, particularly for susceptible individuals.
The Florida Department of Health in Miami-Dade County has been conducting marine beach water quality monitoring at 17 sites, including Dog Beach on Virginia Key, weekly since August 2002, through the Florida Healthy Beaches Program. The sampling sites are selected based on the frequency and intensity of recreational water use and their proximity to pollution sources. The water samples are analyzed for enterococci, enteric bacteria that normally inhabit the intestinal tract of humans and animals and may cause human disease, infections, or illness. The prevalence of enteric bacteria is an indicator of fecal pollution, which may come from stormwater run-off, wildlife, pets and human sewage. The purpose of the Florida Healthy Beaches Program is to determine whether Florida has significant beach water quality concerns.
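The advisory trigger described above (two consecutive samples over 104 CFU per 100 mL) can be expressed as a simple check. This is an illustrative sketch, not the Department's actual procedure, and the sample values are hypothetical:

```python
# Single-sample standard for enterococci, per the advisory notice.
ENTEROCOCCI_LIMIT = 104  # colony-forming units per 100 mL

def advisory_needed(samples):
    """True if any two consecutive samples both exceed the standard."""
    return any(a > ENTEROCOCCI_LIMIT and b > ENTEROCOCCI_LIMIT
               for a, b in zip(samples, samples[1:]))

weekly_counts = [35, 180, 210, 60]     # hypothetical CFU/100 mL readings
print(advisory_needed(weekly_counts))  # True: 180 and 210 are consecutive exceedances
```

Note that a single spike, or two exceedances separated by a clean sample, would not trigger the advisory under this rule; the consecutiveness requirement filters out one-off anomalies.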
For more information please visit the Florida Healthy Beaches Program Website: http://www.doh.state.fl.us and Select “Beach Water Quality”, from the A-Z Topics List.
We just posted the second edition of the Paddle Out Guide. We are excited to be able to provide you with this updated material. Keep this guide close to your kayak or canoe as an aid in your exploration of our beautiful Biscayne Bay. We have posted the guide below for your convenience, but you can always find the Paddle Out Guide at bbwk.org/paddle-out. Go out and enjoy our Bay!
Thank you Julie for speaking at the Grassroots festival on behalf of BBWK
Thank you to everyone who came out to see Biscayne Bay Waterkeeper (BBWK) speak at the Sustainability Fair this weekend at the Grassroots festival!
Julie Dick, a BBWK representative, spoke about our current projects and initiatives, helping connect the festival to some of the issues that face the Bay that surrounded the event.
BBWK was invited to speak alongside the Center for Biological Diversity, Surfrider Miami, and the United States Green Building Council Florida Chapter. We are honored to have shared the stage with such great organizations.
Thousands of people attended the festival, many of whom camped along the water. We are happy such a festive event took place amidst the beauty of our Bay.
Virginia Key and Key Biscayne are barrier islands which are, by their nature, exposed to the elements.
On February 15, 2013, the Village of Key Biscayne sent Mayor Carlos Gimenez a letter asking Miami-Dade County to take another look at the plans to improve the central wastewater treatment plant located on Virginia Key. Key Biscayne is concerned that the plans do not adequately consider the impacts of climate change, such as rising sea levels and stronger storm surges, and do not include funding for flood mitigation. That Virginia Key is a barrier island, and therefore more vulnerable to weather and flooding, makes these oversights in planning for a wastewater treatment plant there particularly alarming.
Key Biscayne supports the County’s immediate plans to address Clean Water Act outflow violations, deteriorated conditions at the Virginia Key facility, and sewer lines identified as being at risk of rupturing, including the 54-inch under-bay line from Miami Beach to Fisher Island to Virginia Key. At the same time, the Village of Key Biscayne, situated just south of Virginia Key, is relying on the County to protect its natural environment. As long as infrastructure improvement plans do not address these long-term issues, the residents of the adjacent island community of Key Biscayne will be understandably concerned for their quality of life. Key Biscayne is already plagued by foul odors from the central wastewater facility and occasional sewage spills.
Community voices like Key Biscayne’s, calling for better sewage infrastructure, are the impetus for Biscayne Bay Waterkeeper’s legal initiatives on this issue. If the County will not address the concerns of local residential and business communities, or the needs of our fragile natural resources, then legal action may be the only way we can ensure that the County properly addresses these issues.
A Stanford scientist has spelled out for the first time the direct links between increased levels of carbon dioxide in the atmosphere and increases in human mortality, using a state-of-the-art computer model of the atmosphere that incorporates scores of physical and chemical environmental processes. The new findings, to be published in Geophysical Research Letters, come to light just after the Environmental Protection Agency’s recent ruling against states setting specific emission standards for this greenhouse gas based in part on the lack of data showing the link between carbon dioxide emissions and their health effects.
While it has long been known that carbon dioxide emissions contribute to climate change, the new study details how for each increase of one degree Celsius caused by carbon dioxide, the resulting air pollution would lead annually to about a thousand additional deaths and many more cases of respiratory illness and asthma in the United States, according to the paper by Mark Jacobson, a professor of civil and environmental engineering at Stanford. Worldwide, upward of 20,000 air-pollution-related deaths per year per degree Celsius may be due to this greenhouse gas.
“This is a cause and effect relationship, not just a correlation,” said Jacobson of his study, which on Dec. 24 was accepted for publication in Geophysical Research Letters. “The study is the first specifically to isolate carbon dioxide’s effect from that of other global-warming agents and to find quantitatively that chemical and meteorological changes due to carbon dioxide itself increase mortality due to increased ozone, particles and carcinogens in the air.”
Jacobson said that the research has particular implications for California. This study finds that the effects of carbon dioxide’s warming are most significant where the pollution is already severe. Given that California is home to six of the 10 U.S. cities with the worst air quality, the state is likely to bear an increasingly disproportionate burden of death if no new restrictions are placed on carbon dioxide emissions.
On Dec. 19, the Environmental Protection Agency denied California and 16 other states a waiver that would have allowed the states to set their own emission standards for carbon dioxide, which are not currently regulated. The EPA denied the waiver partly on the grounds that no special circumstances existed to warrant an exception for the states.
Stephen L. Johnson, the EPA administrator, was widely quoted as saying that California’s petition was denied because the state had failed to prove the “extraordinary and compelling conditions” required to qualify for a waiver. While previous published research has focused on the global effect on pollution—but not health—of all the greenhouse gases combined, the EPA noted that, under the Clean Air Act, it has to be shown that there is a reasonable anticipation of a specific pollutant endangering public health in the United States for the agency to regulate that pollutant.
Jacobson’s paper offers concrete evidence that California is facing a particularly dire situation if carbon dioxide emissions increase. “With six of the 10 most polluted cities in the nation being in California, that alone creates a special circumstance for the state,” he said, explaining that the health-related effects of carbon dioxide emissions are most pronounced in areas that already have significant pollution. As such, increased warming due to carbon dioxide will worsen people’s health in those cities at a much faster clip than elsewhere in the nation.
According to Jacobson, more than 30 percent of the 1,000 excess deaths (mean death rate value) due to each degree Celsius increase caused by carbon dioxide occurred in California, which holds only about 12 percent of the U.S. population. That concentration indicates a much stronger effect of carbon dioxide-induced warming on health in California than in the nation as a whole.
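The disproportionality can be checked with simple arithmetic on the figures the article cites (the 1,000-death and 30/12-percent numbers are from the article; the calculation below is only an illustrative back-of-envelope check, not part of the study):

```python
# Back-of-envelope check: California absorbs >30% of the ~1,000 excess
# US deaths per degree C while holding ~12% of the US population.
excess_deaths_us_per_degc = 1000   # mean value cited for the US
ca_share_of_deaths = 0.30          # ">30 percent" per the article
ca_share_of_population = 0.12      # "about 12 percent"

ca_deaths = excess_deaths_us_per_degc * ca_share_of_deaths

# Per-capita risk ratio: a Californian vs. a resident of the rest of the US
risk_ratio = (ca_share_of_deaths / ca_share_of_population) / \
             ((1 - ca_share_of_deaths) / (1 - ca_share_of_population))

print(f"CA excess deaths per degree C: ~{ca_deaths:.0f}")
print(f"Per-capita risk, CA vs rest of US: ~{risk_ratio:.1f}x")
```

On these numbers, a Californian's per-capita risk comes out roughly three times that of a resident elsewhere in the country, which is what makes the "special circumstance" argument quantitative rather than rhetorical.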
Jacobson added that much of the population of the United States already has been directly affected by climate change through the air they have inhaled over the last few decades and that, of course, the health effects would grow worse if temperatures continue to rise.
Jacobson’s work stands apart from previous research in that it uses a computer model of the atmosphere that takes into account many feedbacks between climate change and air pollution not considered in previous studies. Developed by Jacobson over the last 18 years, it is considered by many to be the most complex and complete atmospheric model worldwide. It incorporates principles of gas and particle emissions and transport, gas chemistry, particle production and evolution, ocean processes, soil processes, and the atmospheric effects of rain, winds, sunlight, heat and clouds, among other factors.
For this study, Jacobson used the computer model to determine the amounts of ozone and airborne particles that result from temperature increases, caused by increases in carbon dioxide emissions. Ozone causes and worsens respiratory and cardiovascular illnesses, emphysema and asthma, and many published studies have associated increased ozone with higher mortality. “[Ozone] is a very corrosive gas, it erodes rubber and statues,” Jacobson said. “It cracks tires. So you can imagine what it does to your lungs in high enough concentrations.” Particles are responsible for cardiovascular and respiratory illness and asthma.
Jacobson arrived at his results of the impact of carbon dioxide globally and, at higher resolution, over the United States by modeling the changes that would occur when all current human and natural gas and particle emissions were considered versus considering all such emissions except human-emitted carbon dioxide.
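The attribution method described above is, in outline, a difference of two model runs: the same simulation with and without human-emitted carbon dioxide, with the difference in a pollution metric attributed to CO2. The sketch below is only a schematic of that logic; the `toy_model` function and its numbers are invented placeholders, not Jacobson's atmospheric model:

```python
# Schematic of attribution-by-differencing: run the same model twice,
# once with all emissions and once with human CO2 removed, and attribute
# the difference in the output metric to CO2. Numbers are placeholders.
def toy_model(co2_forcing_ppm: float) -> float:
    """Stand-in urban ozone index for a given CO2 perturbation (arbitrary units)."""
    baseline_ozone = 60.0   # made-up baseline
    sensitivity = 0.02      # made-up ozone response per ppm of CO2
    return baseline_ozone + sensitivity * co2_forcing_ppm

with_co2 = toy_model(co2_forcing_ppm=100.0)   # all emissions included
without_co2 = toy_model(co2_forcing_ppm=0.0)  # human-emitted CO2 removed

attributed_to_co2 = with_co2 - without_co2
print(f"Ozone change attributed to CO2: {attributed_to_co2:.2f} units")
```

The design point is the one Jacobson makes later in the article: because CO2 is the only input varied between the two runs, any difference in the output can be attributed to it.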
Jacobson simultaneously calculated the effects of increasing temperatures on pollution. He observed two important effects:
- Higher temperatures due to carbon dioxide increased the chemical rate of ozone production in urban areas
- Increased water vapor due to carbon dioxide-induced higher temperatures boosted chemical ozone production even more in urban areas.
Interestingly, neither effect was so important under the low pollution conditions typical of rural regions, though other factors, such as higher organic gas emissions from vegetation, affected ozone in low-pollution areas. Higher emissions of organic gases also increased the quantity of particles in the air, as organic gases can chemically react to form particles.
And in general, where there was an increase in water vapor, particles that were present became more deadly, as they swelled from absorption of water. “That added moisture allows other gases to dissolve in the particles—certain acid gases, like nitric acid, sulfuric acid and hydrochloric acid,” Jacobson said. That increases the toxicity of the particles, which are already a harmful component of air pollution.
Jacobson also found that air temperatures rose more rapidly due to carbon dioxide than did ground temperatures, changing the vertical temperature profile, which decreased pollution dispersion, thereby concentrating particles near where they formed.
In the final stage of the study, Jacobson used the computer model to factor in the spatially varying population of the United States with the health effects that have been demonstrated to be associated with the aforementioned pollutants.
“The simulations accounted for the changes in ozone and particles through chemistry, transport, clouds, emissions and other processes that affect pollution,” Jacobson said. “Carbon dioxide definitely caused these changes, because that was the only input that was varied.”
“Ultimately, you inhale a greater abundance of deleterious chemicals due to carbon dioxide and the climate change associated with it, and the link appears quite solid,” he said. “The logical next step is to reduce carbon dioxide: That would reduce its warming effect and improve the health of people in the U.S. and around the world who are currently suffering from air pollution health problems associated with it.”
Source: Stanford University
As the planet becomes increasingly covered with concrete, Tarun Naik says at least some of it could be used to help the environment rather than hurt it.
Some of the negative consequences are worrisome. The production of one ton of cement, the paste used to make concrete, creates almost an equal amount of greenhouse gases. That's more than 1.2 billion tons of carbon dioxide a year, says Naik, a University of Wisconsin-Milwaukee engineering professor.
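The two figures quoted above can be tied together with quick arithmetic: if each ton of cement emits "almost an equal" amount of CO2, the 1.2 billion tons of annual CO2 implies cement output of the same order. The 0.9 ratio below is an illustrative assumption standing in for "almost equal":

```python
# Quick arithmetic behind the figures above. The ratio is assumed,
# chosen to reflect "almost an equal amount" of CO2 per ton of cement.
co2_per_ton_cement = 0.9   # tons CO2 per ton cement (assumption)
annual_co2_tons = 1.2e9    # figure cited in the article

implied_cement_tons = annual_co2_tons / co2_per_ton_cement
print(f"Implied global cement production: ~{implied_cement_tons / 1e9:.2f} billion tons/yr")
```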
Concrete roads and buildings have been linked to "hot city syndrome," a condition in which temperatures keep rising in urban areas. There's a direct link between concrete and global warming, according to Naik, who has spent decades studying greener alternatives to conventional cement and concrete.
What's more, entire geographical regions of the world are running out of limestone to make cement, Naik says.
"As limestone becomes a limited resource, employment and construction associated with the concrete industry will decline," he wrote in a recent study about the sustainability of the cement and concrete industries.
Naik and his colleagues are seeking ways to make the use of concrete more sustainable and environmentally friendly. It's a huge task since concrete is second only to water when it comes to the consumption of materials worldwide.
"It's a global issue. China alone plans to double or triple its cement capacity in the near future," said Rudolph Kraus, assistant director of the UWM Center for By-Products Utilization, which is doing research on cement and concrete.
Researchers are studying things such as porous pavement that allows the earth to breathe and take in water. Stone and soil underneath porous pavement acts as a reservoir and cleans runoff water like the filter on a fish tank.
"It's a sore point with me that we spend millions of dollars a year flushing storm water into Lake Michigan," Naik said.
Porous concrete study
Highway barriers made from porous concrete could absorb sound and act as sponges that soak up greenhouse gases, according to Naik.
Currently, he and other UWM researchers are trying to quantify how well porous concrete absorbs carbon dioxide. They're exposing crushed concrete to carbon dioxide in the atmosphere, triggering a chemical reaction that sequesters the gas and keeps it from leaching out.
Through the reaction, which converts calcium hydroxide to limestone, porous concrete gets stronger. But eventually the material becomes saturated with carbon dioxide and stops absorbing it.
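The carbonation reaction described above, Ca(OH)2 + CO2 → CaCO3 + H2O, puts a stoichiometric upper bound on how much CO2 a given mass of free calcium hydroxide can absorb. The sketch below computes that bound; real uptake stops earlier once the material saturates, as the article notes, and the 100 kg figure is purely illustrative:

```python
# Stoichiometry of carbonation: Ca(OH)2 + CO2 -> CaCO3 + H2O
# One mole of calcium hydroxide binds one mole of CO2.
M_CA_OH_2 = 74.09   # g/mol, calcium hydroxide
M_CO2 = 44.01       # g/mol, carbon dioxide

def max_co2_uptake_kg(ca_oh_2_kg: float) -> float:
    """Upper bound on CO2 sequestered by fully carbonating the given Ca(OH)2 mass."""
    return ca_oh_2_kg * (M_CO2 / M_CA_OH_2)

# e.g. 100 kg of free calcium hydroxide in a crushed-concrete barrier
print(f"Max CO2 uptake: {max_co2_uptake_kg(100.0):.1f} kg")
```

The ratio works out to roughly 0.59 kg of CO2 per kg of calcium hydroxide, which is why carbonation also strengthens the material: the bound CO2 ends up as solid limestone.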
Researchers don't know how long it takes to reach the saturation point, though it might be years. When a porous highway barrier stopped absorbing gas, it could be torn down and recycled to make new concrete.
"Even if it only sequestered a small amount of carbon dioxide, it would be a step in the right direction," Naik said.
Buildings made from porous concrete could help cleanse indoor air. Ordinary concrete walls absorb some carbon dioxide, but not much of it because the surface isn't very porous.
Porous concrete could help filter storm water in parking lots. Studies have shown that most pollutants would be trapped in the material rather than passed through to the aquifer.
Crushed concrete could be used to cleanse waste gases from power plants. And fly ash, a waste product from coal-burning power plants, has been used as a substitute for cement in making concrete.
"By using fly ash in place of cement, we cut the corresponding amount of CO2 emissions," Naik said. "It has been a holy grail to use 100 percent fly ash. We have had successes with this idea in lab-produced concrete. However, it has not yet been implemented in real-world practice."
The building materials industry recognizes it has environmental issues, but not everyone agrees that porous concrete provides many solutions.
The material isn't necessarily compatible with climates that have frequent freeze-and-thaw cycles, said David Schulz, an engineering professor at Northwestern University and former Milwaukee County executive.
"Road builders spend a lot of time and money trying to keep water out of concrete," Schulz said. "In the winter it freezes and expands with incredible pressure. That's what causes the breakup of a lot of concrete."
But in tests done in Wisconsin, porous concrete has held up well to freezing and thawing, Naik said.
One of the tests, in Port Washington, involved the use of porous concrete on a road. After 10 years of exposure to year-round weather, the road was undamaged.
Another test, in Green Bay, is under way to evaluate porous concrete in a parking lot and truck loading area. So far, after four years, there's not been any damage from freeze and thaw cycles, Naik said.
The U.S. Department of Energy concluded that porous concrete would not suffer winter damage, assuming the proper mix of materials is used and water is not allowed to saturate the porous material.
As long as water drains through it and into the ground below, there should not be a problem, according to Naik.
The Portland Cement Association, which represents U.S. cement companies, is at the center of the environmental debate. Cement companies have improved their environmental performance and are constantly evaluating new materials and ideas, said David Shepherd, the association's director of sustainable development.
Industrial waste that used to clog landfills is now added to concrete mixes to reduce the reliance on raw materials. When it's constantly recycled, concrete produces very little waste, Shepherd said.
"You certainly would not make concrete as a remedial solution for carbon dioxide, but I think the industry will be spending money trying to figure out what our overall impact is" on greenhouse gases, Shepherd said.
Latitude And Rain Dictated Where Species Lived
More than 200 million years ago, mammals and reptiles lived in their own separate worlds on the supercontinent Pangaea, despite little geographic reason to do so. Mammals lived in areas of twice-yearly seasonal rainfall; reptiles stayed in areas where rains came just once a year. Mammals lose more water when they excrete, and thus need water-rich environments to survive. Results are published in the Proceedings of the National Academy of Sciences.
Encompassing nearly the entire landmass of Earth, Pangaea was a continent the likes of which our planet has not seen in the last 200 million years. Its size meant there was a lot of space for animals to roam, for there were few geographical barriers, such as mountains or ice caps, to contain them.
Yet, strangely, animals confined themselves. Studying a transect of Pangaea stretching from about three degrees south to 26 degrees north (a long swath in the center of the continent covering tropical and semiarid temperate zones), a team of scientists led by Jessica Whiteside at Brown University has determined that reptiles, represented by a species called procolophonids, lived in one area, while mammals, represented by a precursor species called traversodont cynodonts, lived in another. Though similar in many ways, their paths evidently did not cross.
“We’re answering a question that goes back to Darwin’s time,” said Whiteside, assistant professor of geological sciences at Brown, who studies ancient climates. “What controls where organisms live? The two main constraints are geography and climate.”
Turning to climate, the frequency of rainfall along lines of latitude directly influenced where animals lived, the scientists write in a paper published this week in the online early edition of the Proceedings of the National Academy of Sciences. In the tropical zone where the mammal-relative traversodont cynodonts lived, monsoon-like rains fell twice a year. But farther north on Pangaea, in the temperate regions where the procolophonids predominated, major rains occurred only once a year. It was the difference in the precipitation, the researchers conclude, that sorted the mammals’ range from that of the reptiles.
The scientists focused on an important physiological difference between the two: how they excrete. Mammals lose water when they excrete and need to replenish what they lose. Reptiles (and birds) get rid of bodily waste in the form of uric acid in a solid or semisolid form that contains very little water.
On Pangaea, the mammals needed a water-rich area, so the availability of water played a decisive role in determining where they lived. “It’s interesting that something as basic as how the body deals with waste can restrict the movement of an entire group,” Whiteside said.
In water-limited areas, “the reptiles had a competitive advantage over mammals,” Whiteside said. She thinks the reptiles didn’t migrate into the equatorial regions because they already had found their niche.
The researchers compiled a climate record for Pangaea during the late Triassic period, from 234 million years ago to 209 million years ago, using samples collected from lakes and ancient rift basins stretching from modern-day Georgia to Nova Scotia. Pangaea was a hothouse then: Temperatures were about 20 degrees Celsius hotter in the summer, and atmospheric carbon dioxide was five to 20 times greater than today. Yet there were regional differences, including rainfall amounts.
The researchers base the rainfall gap on variations in the Earth’s precession, or the wobble on its axis, coupled with the eccentricity cycle, based on the Earth’s orbital position to the sun. Together, these Milankovitch cycles influence how much sunlight, or energy, reaches different areas of the planet. During the late Triassic, the equatorial regions received more sunlight, thus more energy to generate more frequent rainfall. The higher latitudes, with less total sunlight, experienced less rain.
The research is important because climate change projections show that some areas will receive less precipitation, which could put the mammals living there under stress.
“There is evidence that climate change over the last 100 years has already changed the distribution of mammal species,” said Danielle Grogan, a graduate student in Whiteside’s research group. “Our study can help us predict negative climate effects on mammals in the future.”
Contributing authors include Grogan, Paul Olsen from Columbia University, and Dennis Kent from Rutgers. The National Science Foundation and the Richard Salomon Foundation funded the research.
Image 1 Caption: More than 200 million years ago, nearly all the land on Earth was part of Pangaea. Animals could roam freely, yet they appear to have sorted themselves into regions. Researchers at Brown are figuring out why. (Credit: Brown University)
Image 2 Caption: The skull of the procolophonid Hypsognathus was found in Fundy basin, Nova Scotia, which was hotter and drier when it was part of Pangaea. Mammals, needing more water, chose to live elsewhere. (Credit: Brown University)
|
Virginia continues to be a top coal producer, but new rules and renewables challenge the industry's future
April 27, 2012 6:00 AM
by Garry Kranz
Coal mining took off in Virginia when the railroads arrived in the 1880s. More than 130 years later, coal still fuels Virginia's economy. Northern Virginia's tech sector depends on a steady supply of low-cost electricity, much of it derived from bituminous Appalachian coal.
And Virginia’s deepwater port in Hampton Roads — the largest coal exporting terminal in the U.S. — is a huge economic asset that neighboring coal states can only envy.
Coal mining is one of Virginia's oldest continuously operating industries, but the biggest question facing coal companies now is this: how do they adapt to the future? Just like the commodity they coax from the earth, Virginia's coal companies and their national counterparts are under intense pressure. Coal has fallen into disfavor amid fears that it contributes to carbon pollution and global warming, prompting new environmental regulations.
In late March, the Environmental Protection Agency (EPA) announced the first-ever limits on carbon dioxide emissions for new coal power plants. Under the New Source Performance Standard, emissions could not exceed 1,000 pounds of carbon dioxide per megawatt hour of electricity produced. That benchmark, say some industry observers, is impossible to attain without costly equipment to capture carbon emissions.
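Why observers call the 1,000 lb/MWh cap unattainable for conventional coal becomes clear from a rough emission-rate calculation: emission rate equals heat rate times fuel carbon intensity. The heat rates and carbon intensities below are typical published ballpark values, not figures from the article:

```python
# Rough compliance check against the 1,000 lb CO2/MWh cap.
# Heat rates (Btu/kWh) and fuel carbon intensities (lb CO2/MMBtu)
# are ballpark values for typical plants, assumed for illustration.
def emissions_lb_per_mwh(heat_rate_btu_per_kwh: float,
                         fuel_co2_lb_per_mmbtu: float) -> float:
    """CO2 emission rate = heat rate x fuel carbon intensity, unit-converted."""
    return heat_rate_btu_per_kwh * 1000 / 1e6 * fuel_co2_lb_per_mmbtu

coal = emissions_lb_per_mwh(10_000, 205)   # conventional pulverized coal
gas_cc = emissions_lb_per_mwh(7_000, 117)  # natural gas combined cycle

CAP = 1000  # lb CO2 per MWh
print(f"Coal:   ~{coal:.0f} lb/MWh ({'over' if coal > CAP else 'under'} the cap)")
print(f"Gas CC: ~{gas_cc:.0f} lb/MWh ({'over' if gas_cc > CAP else 'under'} the cap)")
```

On these rough numbers, an uncontrolled coal plant emits roughly twice the cap while a gas combined-cycle plant clears it comfortably, which is why the standard effectively requires carbon capture for new coal.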
The regulations face certain opposition in Congress as well as legal challenges. Still, they're indicative of the enormous challenges facing the industry, challenges that could have an impact on a new $6 billion coal-power plant proposed for Surry County. The uncertainty could delay, if not altogether doom, plans by Old Dominion Electric Cooperative (ODEC) in Glen Allen. ODEC already has received zoning permits to build a 1,200- to 1,500-megawatt coal plant in Dendron, and it owns another potential site in Sussex County. The proposed Cypress Creek Power Station would supply electricity for 330,000 to 375,000 homes, depending on its eventual location.
The plant is needed to keep pace with new demand for electricity, forecast to increase about 3 percent a year, says David Hudgins, ODEC’s director of member and external relations. “We need the base-load energy, and we need it to be as fuel-efficient as possible,” he says. It’s unlikely the plant will be built before 2021-22, he adds, but the company has spent about $25 million so far on land acquisitions, site work and preliminary testing. The new spate of environmental regs puts the plant’s future in doubt, though. “The reality is we’re waiting for clarification of multiple new EPA rules. We can’t spend big money on a new plant unless we know what the standard is going to be,” Hudgins says.
For now and the near future, experts say coal-fired electricity will contribute significantly to Virginia’s energy supply, to say nothing of America in general. That’s because it’s cheap and abundant, even in the face of plummeting prices for natural gas. “The United States has about 250 billion to 300 billion tons of recoverable coal reserves. We are to coal what Saudi Arabia is to oil,” says Nino Ripepi, a Virginia Tech professor with the Virginia Center for Coal and Energy Research.
About 1 billion tons of coal gets extracted annually in the U.S. Even in Virginia — where production is in the midst of a two-decade decline — Ripepi says coal accounts for about 40 percent of all in-state generation of electric power, a figure not expected to change quickly in coming years. Virginia imports about one-third of its electricity, second only to California. If coal use is discouraged, Virginia probably will be forced to import even more electric power, increasing energy costs, Ripepi says.
Not surprisingly, big changes are afoot in Virginia’s coal industry. Despite howls of protest from environmentalists and citizens groups, Dominion Virginia Power, the state’s largest regulated electric utility, is set to open a $1.8 billion “clean-coal” plant in the Southwest Virginia hamlet of St. Paul this summer.
Consolidation also is reshaping the state’s coal sector. Bristol-based Alpha Natural Resources last year became the second-largest U.S. coal company, in terms of revenue — and the third largest in production — after it bought Massey Energy Co., a longtime Virginia rival. The deal swelled Alpha’s revenue to $7.1 billion in 2011, second only to St. Louis-based Peabody Energy Inc.
Changes in Virginia are emblematic of the flux within the coal industry in general. A relatively mild winter cut into demand in 2011 and drove domestic coal prices lower than normal. Good news for consumers — not so much for coal companies. In mid-April, stock prices for Alpha Natural Resources had dropped by 75 percent and were down 80 percent at James River Coal Co. over the past year.
Coal prices are being depressed even further by a sudden surplus of natural gas, including newly unearthed shale deposits in Pennsylvania, North Dakota, Wyoming and other states. In addition, the fast-growing economies of Brazil, China and India, among others, are fueling a sustained building boom, supported by imports of high-quality U.S. metallurgical coal, used in the making of steel.
In Virginia, the macroeconomic challenges are magnified by bad geological luck. Although it is consistently a top 10 producer, Virginia is running out of available, easily accessible coal seams. Virginia producers extracted 22.4 million tons of coal in 2011, according to the federal Energy Information Administration. That’s less than half of Virginia’s all-time high of 46.6 million tons in 1990. During the same time frame, lower-cost surface mines in western states have doubled their output, siphoning off business from Appalachian states like Virginia, where most mining occurs underground.
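A quick check of the production figures above confirms the "less than half" claim and puts the decline in percentage terms (numbers are from the article):

```python
# Virginia coal production: 2011 output vs. the 1990 peak.
peak_1990 = 46.6    # million tons, all-time high
output_2011 = 22.4  # million tons, per the EIA

share_of_peak = output_2011 / peak_1990
decline = 1 - share_of_peak
print(f"2011 output as share of 1990 peak: {share_of_peak:.0%}")
print(f"Decline from peak: {decline:.0%}")
```

The 2011 figure works out to about 48 percent of the 1990 peak, a decline of roughly 52 percent over two decades.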
“Our coal seams are thinner and deeper than our sister coal states. That makes it tougher and more expensive for us to recover mineable coal,” says W. Thomas Hudson, president of the Virginia Coal Association Inc., a trade group in Richmond.
Unfriendly environment for coal
But the biggest worry is an uncertain regulatory climate, says Brooks Smith, an attorney with Hunton & Williams in Richmond who represents several of the nation’s largest coal companies, including Alpha Natural Resources. Mining permits have slowed to a trickle as the EPA implements tougher restrictions on mining, Smith says.
“We believe the EPA is trying to turn this situation on its head and upset the process,” he says, referring to recent attempts by the agency to modify how permits are issued under the federal Clean Water Act.
The debate mostly centers on a 2008 study in which EPA scientists concluded that entire orders of aquatic insects were being wiped out downstream of mountaintop-removal mining operations in Central Appalachia, which encompasses Kentucky, Virginia and West Virginia. The 20-page report, “Downstream Effects of Mountaintop Mining,” prompted the EPA to impose stricter rules designed to protect streams and waterways. Critics say the new limits are based on assumptions, not science.
“It’s an attempt by the EPA to establish a water-quality standard that no mining company can meet,” says Gene Kitts, senior vice president of environmental affairs for Alpha, who is based in Charleston, W.Va.
Not even a court decision has resolved the issue. The National Mining Association, a trade group in Washington, D.C., sued the EPA over the matter. The first phase of the litigation alleged that EPA unlawfully obstructed Clean Water Act permits, thus creating an unofficial moratorium on coal mining in the Central Appalachian region. Late last year, U.S. District Court Judge Reggie Walton sided with the industry, ruling the EPA had overstepped its authority and failed to follow established guidelines for federal rulemaking.
The matter finally may be resolved in June, Smith says. That’s when a motion of summary judgment is scheduled to be heard on the so-called “guidance” phase of the lawsuit. As the complex process gets sorted out, about 200 water-discharge permits for coal mining projects in Virginia are in abeyance, Smith says.
Hudson says it’s a double whammy on the coal industry: delaying the opening of new mines, while older mines can’t get renewals of their five-year operating permits. Virginia’s mining sector already is feeling the impact. In February, Hudson says, A&G Coal Corp. in Wise County laid off 108 miners because of a slowdown in mining.
Then in March Pennsylvania-based Consol Inc. announced it was idling production of metallurgical coal at its long-wall mine in Buchanan County, which employs about 700 people. Hudson says Consol hasn’t announced layoffs, instead shifting employees to non-mining duties, “but how long will it be before that work runs out?”
Power outage ahead?
The coal industry faces another hit, at least indirectly, from regulations aimed at U.S. power companies. ICF International Inc., a Fairfax-based consulting firm, forecasts that America could lose up to 20 percent of its coal-fired electric generation during this decade. The main impetus: four newly proposed rules by the EPA that could force utilities to either shut down older coal plants or make expensive upgrades. The report was prepared before EPA announced its new caps in March on carbon emissions for new coal-fired plants.
“It’s going to require flexibility, careful planning and lots of discussion by all players in the industry to make sure the power grid continues to operate as these coal plants get decommissioned,” says John Blaney, a senior vice president with ICF International.
Roanoke-based Appalachian Power Co. plans to shutter its two Virginia coal-fired generating plants by 2015 to comply with the Mercury and Air Toxics Standards Rule and the Cross-State Air Pollution Rule, company spokesman Todd Burns says.
The company, a subsidiary of Columbus, Ohio-based American Electric Power, or AEP, serves about 500,000 customers in western Virginia. The closures in Virginia include a 335-megawatt coal unit in Giles County and a 235-megawatt coal station in Russell County. Two other Russell County coal plants will be converted to burn natural gas. The shift is expected to result in the loss of 85 jobs in Virginia, Burns says.
It is part of a larger effort by AEP to retire nearly 6,000 megawatts of coal generation in nine states. Appalachian Power customers could see their monthly electric bills jump 10 percent to 15 percent as a result, Burns says.
Likewise Dominion has announced plans to retire several coal-fired units at its Chesapeake Energy Center in Chesapeake and Yorktown Power Station by 2016. The units total about 360 megawatts. Output from a proposed $1 billion, 1,300-megawatt, natural gas-fired plant in Brunswick County would replace electricity from those coal units and help Dominion meet growing demand while also satisfying the latest federal clean-air standards. The company also plans to retire the 74-megawatt North Branch Power Station in late 2014 when a new 1,300-megawatt, natural gas Warren County Power Station is complete.
Coal plants and controversies
While U.S. energy producers wrestle with these potential impacts, Dominion is testing the boilers at its soon-to-be-opened Virginia City Hybrid Energy Center in Wise County — one of seven Southwest counties that account for nearly all the coal mined in Virginia. Situated on a reclaimed strip mine, Dominion's unit will burn a blend of freshly mined coal, waste coal and biomass to produce up to 585 megawatts, or enough electricity for about 146,000 homes, say company officials.
It will use a technology known as circulating fluidized bed, a process for burning “clean coal” that has been approved by the U.S. Department of Energy’s Office of Fossil Energy. “This type of boiler enables us to burn a very broad range of coal, and it’s designed for very low heating values,” says Diane Leopold, senior vice president of Dominion’s transmission business. The hybrid plant will consume about 2.8 million tons of coal annually, most of it mined in and around Wise and adjacent counties, Leopold says. Under the air permit issued to Dominion by the SCC, biomass — wood chips, in layman’s terms — must comprise 5 percent of the fuel mix at the facility by its third year of operation, increasing 1 percent a year to a minimum of 10 percent and a maximum of 20 percent.
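The permit's biomass schedule can be expressed as a small function. This is a reading of the article's summary of the permit (5 percent by year three, rising one percentage point per year to a required minimum of 10 percent, with 20 percent the maximum allowed), not the permit text itself:

```python
# Minimum biomass share of the fuel mix by operating year, per the
# article's description of Dominion's air permit. Assumes no stated
# requirement before year 3 of operation.
def required_biomass_pct(year_of_operation: int) -> float:
    """Minimum required biomass percentage in the given operating year."""
    if year_of_operation < 3:
        return 0.0
    pct = 5 + (year_of_operation - 3)  # 5% in year 3, +1 point per year
    return min(pct, 10)                # requirement tops out at the 10% minimum

for yr in (2, 3, 5, 8, 12):
    print(f"Year {yr}: {required_biomass_pct(yr):.0f}%")
```

On that reading, the required share reaches its 10 percent floor in the plant's eighth year of operation; the facility may burn up to 20 percent biomass but is not obligated to.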
The plant is a giant stride forward in the use of clean coal, Dominion spokesman James Norvelle says. Because of its fuels mix and strict permitting limits, he says, the plant will emit fewer toxins and use less water.
Environmentalists thus far are not impressed. “The biomass component is negligible compared to the amount of coal that will be burned, resulting in mountains being destroyed by mountaintop removal and the release of mercury and smog-inducing pollutants,” says Glen Besa, president of the Virginia chapter of the Sierra Club.
While controversial, coal is an economic linchpin for the seven counties that comprise the Virginia Coalfields — a region where high-wage jobs are in short supply.
Mechanization has increased mine production and reduced employment in recent decades. Even so, the region’s coal industry pays an average yearly wage of nearly $85,000, according to a recent report by Chmura Economics and Analytics, a Richmond-based econometrics firm. That’s more than double the $37,757 average for all industries in the region. The report, prepared on behalf of the Virginia Coal Association, says nearly 12 percent of local tax revenue in the region stems from coal.
Coal’s impact is not confined to the coalfields, however. “Virginia gets the double benefit from coal because of the Port of Virginia,” says Paul Grossman, director of international trade and investment at the Virginia Economic Development Partnership.
State coal exports shot up 30 percent year over year in 2011, generating nearly $1.3 billion in revenue. Grossman traces the spectacular one-year leap to growing worldwide coal consumption, notably for Virginia’s high-quality metallurgical coal. Global supplies are constrained as Australia, the world’s largest coal exporter, recovers from flooding that disrupted its mining industry two years ago.
All told, 40.9 million short tons (a short ton is equivalent to 2,000 pounds) of coal were shipped through harbors in Hampton Roads, most of it coking coal bound for emerging economies. China, the Netherlands, Spain and Sweden all imported dramatically higher quantities of Virginia coal. Exports to Japan — Virginia’s 18th-largest import customer — were up an eye-popping 2,406 percent as the island nation used coal to replenish base load electric power lost when nuclear reactors were damaged by a tsunami on March 11, 2011.
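Percentage jumps this large read more easily as multipliers. A minimal sketch of the conversion — the tonnage figure below is a made-up placeholder, since the article gives only the percentage for Japan:

```python
# Convert a percentage increase into a growth multiplier.
# A "2,406 percent" year-over-year rise means roughly 25x the prior volume.

def pct_increase_to_multiplier(pct):
    """Return 1 + pct/100, e.g. a 100% increase doubles the quantity."""
    return 1 + pct / 100

mult = pct_increase_to_multiplier(2406)   # ~25.06x
prior_year_tons = 100.0                   # hypothetical tonnage, for illustration only
this_year_tons = prior_year_tons * mult   # ~2,506 under that assumption
```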
Railway giant Norfolk Southern Corp. steamed to record revenues of $11.2 billion in 2011, with increased coal shipments accounting for nearly one-third of the revenue growth, according to the company’s annual financial statement.
Prices for metallurgical coal tend to far outstrip prices for thermal coal (used in making electricity), thus making it more profitable to recover and process. After acquiring Massey Energy last June, Alpha has become the new heavyweight in metallurgical coal, with total reserves of 1.4 billion tons. “That’s a lot of high-quality metallurgical coal in the ground that we can develop over time,” Alpha spokesman Ted Pile says.
Despite access to Massey’s metallurgical coal, there was a downside to the deal: one of the worst safety records in the industry. Alpha paid a record fine of $209.3 million last year to settle ongoing criminal and civil suits related to a massive explosion at a Massey mine. Twenty-nine miners died at Upper Big Branch in West Virginia on April 5, 2010. To improve worker safety, Alpha enrolled all 7,000 former Massey employees in its safety-training program, called “Running Right.”
For the foreseeable future, Virginia is poised to remain a top-ranked coal producer. “And we have some of the highest quality coal in the world,” says Ripepi of Virginia Tech. Still, America is marching to a drumbeat for cleaner, renewable energy. To adapt, Virginia’s coal companies, like the industry at large, will have to dig in and dig deep.
Source: http://www.virginiabusiness.com/index.php/news/article/old-king-coal
(MENAFN Press) Plastic is a natural product derived from hydrocarbons and is as natural as petrol or diesel, argues the All India Plastic Manufacturers Association (AIPMA), the co-organiser of Plastivision Arabia, scheduled to be held at Expo Centre Sharjah from May 14 to 17, 2012.
"It is the misuse and specifically the littering of plastic which is unwarranted and harmful to the environment," said Mr. Jayesh Rambhia, President of AIPMA.
The AIPMA was making a case for putting an end to littering habits and working towards educating the public that plastics have a lower carbon footprint compared to other materials.
For example, plastic bags generate 39 per cent less greenhouse gas emissions than uncomposted paper bags; plastic bags consume less than 6 per cent of the water needed to make paper bags; plastic grocery bags consume 71 per cent less energy during production than paper bags; and using paper sacks generates almost five times more solid waste than plastic bags.
"More than 90 per cent of plastic bags, or any plastic that is disposed of properly, is recycled and will not harm the environment. Any plastic that is not recycled will find its way to designated landfills, where, if disposed of properly, it will eventually biodegrade," he added.
Some argue that plastic helps the environment in several ways. After all, plastic has been used to make cars lighter. As a result, less oil is used to move the cars and less CO2 is emitted. In addition, plastic containers provide safe ways of disposing of toxic waste products.
"In our society and communities, people litter the landscape with plastics. This not only creates an eyesore with plastic floating at beaches or flying in the desert, but also poses a serious risk to the environment," said Mr Saif Mohammed Al Midfa, Director-General of Expo Centre Sharjah, the co-organiser of Plastivision Arabia.
Mr Midfa cited a report in the 'Community Reports' section of Gulf News dated April 11, 2012, which had a photograph of a bird struggling to remove pieces of plastic entangled around its neck. "It is a heart-wrenching sight. Had the plastic been disposed of properly, it wouldn't have caused this hardship to this poor creature," Mr Midfa said.
"The municipalities are doing their best in trying to curb this menace and the absence of awareness among the public is the main reason for such practices. The recent announcement of Sharjah Municipality to intensify imposing littering fine will go a long way in reducing littering and promoting recycling of plastic," Mr Midfa said.
"There is no point in hating all plastic products, instead we should actually hate the abuse of plastics. We should hate the way how this incredibly versatile product is made the enemy of public health and environment," Mr Midfa said.
Among the most crucial improvements in technology over the past few hundred years has been the creation of plastic. Today, plastic is used in just about everything we use, and we pay it almost no notice at all.
"Plastic is quite indispensable and there is no complete alternative for it," said Eng. Mohamed Saleh Badri, Director General of the Emirates Authority for Standardization & Metrology (ESMA).
"We use plastic in almost everything in our daily life, in our cars, at the workplace, in hospitals. However, it is important to manage disposable plastics like shopping bags, packaging, cups and cutlery," he added.
"We have introduced a law to use Oxo bio-degradable plastic in shopping bags and garbage bags across the UAE since the beginning of 2012. Information on this, the proper usage of additives, and registration of additive suppliers will be available at our stand at the forthcoming Plastivision Arabia," he said.
ESMA will also hold a workshop on the specific requirements for registration of Oxo biodegradable plastic bags as part of Plastivision Arabia.
Plastivision Arabia is organised by Expo Centre Sharjah and the All India Plastic Manufacturers Association with the support of the Sharjah Chamber of Commerce and Industry.
Source: http://www.menafn.com/menafn/1093506237/Plastic-bags-which-escapes-collection-results-environmental-hazards
Some of Palawan’s reefs are sad reflections of warming ocean temperatures. White skeletons are all that remain of previously colorful and varied coral reefs around the island. The phenomenon, known as ‘coral bleaching’, is caused by abnormally warm ocean temperatures.
Scientists cited in the article below hold out hope for these damaged reefs. Apparently, some corals can adapt to warmer temperatures, and even thrive in them. Studies are being done in Kiribati, an island nation in the central Pacific very close to the equator, where ocean temperatures are among the hottest. An international team of scientists, including lead researchers from Canada and Australia, published an article on March 30 in the journal PLoS ONE.
Click on the link below to read the article from ScienceDaily.com:
An excerpt from the article says the study:
. . . paves the way towards an important road map on the impacts of ocean warming, and will help scientists identify the habitats and locations where coral reefs are more likely to adapt to climate change.
“We’re starting to identify the types of reef environments where corals are more likely to persist in the future,” says study co-author Simon Donner, an assistant professor in UBC’s Department of Geography and organizer of the field expedition. “The new data is critical for predicting the future for coral reefs, and for planning how society will cope in that future.”
When water temperatures get too hot, the tiny algae that provides coral with its colour and major food source is expelled. This phenomenon, called coral bleaching, can lead to the death of corals. The researchers say coral reefs may be better able to withstand the expected rise in temperature in locations where heat stress is naturally more common. This will benefit the millions of people worldwide who rely on coral reefs for sustenance and livelihoods, they say.
“Until recently, it was widely assumed that coral would bleach and die off worldwide as the oceans warm due to climate change,” says lead author Jessica Carilli, a post-doctoral fellow in Australian Nuclear Science and Technology Organisation’s (ANSTO) Institute for Environmental Research. “This would have very serious consequences, as loss of live coral — already observed in parts of the world — directly reduces fish habitats and the shoreline protection reefs provide from storms.”
This is very good news for Palawan. DonnaOnPalawan wishes these scientists and their studies continuing success. My novel’s plot revolves around Palawan’s coral reefs and fish life, as I am very concerned about this issue.
Palawan’s coral reefs are a precious resource. We hope the damage will be halted, and the reefs will thrive on into the future.
Source: http://donnaonpalawan.wordpress.com/2012/04/04/corals-hot-heat-stress-help-coral-reefs-survive-climate-change/
Threats to Frogs
One of the most pressing threats to frogs today is the chytrid fungus, a deadly skin fungus that has spread across the globe, causing amphibian declines in Australia, South America, North America, Central America, New Zealand, Europe and Africa and killing frogs by the millions. The chytrid fungus is responsible for over 100 frog and other amphibian species extinctions since the 1970s. It has been detected on at least 285 species of amphibians (including frogs) from 36 countries.
Climate change is also having an impact on frogs that live on mountain tops. They are being hit hard because they depend on the moist leaf litter found in cloud forests as a suitable place to lay their eggs. As temperatures increase further up the mountainsides, clouds are pushed higher and leaves dry out, leaving less suitable habitat for frogs to lay their eggs. As frogs migrate further up the mountain, they face the inevitable problem that once they reach the top, unlike birds, they can go no further.
Frogs are also facing many threats from many different environmental factors: pollution, infectious diseases, habitat loss, invasive species, climate change, and over-harvesting for the pet and food trades are all contributing to the rapid rise of frog extinctions since 1980.
Reasons for Hope
Chytrid fungus has been recognized as one of the largest threats to amphibian populations around the world. In 2009 a group of organizations came together to respond to the crisis. Defenders of Wildlife (Washington DC), Africam Safari Park (Mexico), Cheyenne Mountain Zoo (Colorado), the Smithsonian National Zoological Park (Washington DC), the Smithsonian Tropical Research Institute (Panama), Zoo New England (Massachusetts) and Houston Zoo (Texas) have launched the Panama Amphibian Rescue and Conservation Project.
There are yet undiscovered species of frogs in the world. A new species of flying frog was discovered in the Himalayan Mountains in 2008.
Source: http://www.defenders.org/frogs/threats
A way of life is feeling the heat
International development policies are undermining the long term survival of some of the globe's poorest communities, argues Masego Madzwamuse, IUCN's regional programme development officer and focal person for southern African drylands. She says the skills and knowledge needed to survive in the world's harsh drylands are being sacrificed in the name of progress.
The world's poorest of the poor live in the toughest areas of the planet - the drylands.
These areas all have key factors in common: water is scarce, and rainfall is unpredictable - or it rains only during a very short period every year.
Drylands cover more than 40% of the Earth's surface and are home to more than two billion people.
These areas are also home to a disproportionate number of people without secure access to food.
Why are 43% of the world's cultivated lands found in dry areas? And why have decades of development not led to significant improvements?
Rather than improving, it would appear that the situation is getting worse, with more frequent droughts, such as those in Ethiopia and Northern Kenya. Another important issue that strikes me about drylands is that these areas have been completely neglected despite being the world's home of the poor.
While one international agreement - the United Nations Convention to Combat Desertification (UNCCD) - has been dedicated solely to the drylands of this world, little attention has been paid by the media, development or conservation organisations, or the international donor community.
The only time attention is paid is when droughts (a regular climatic phenomenon in such lands) are allowed to proceed to famine, which in this day and age can only be the result of political failure.
Humanitarian and food relief follow the TV headlines, creating more dependencies rather than developing viable and sustainable economies.
It is expected that these areas will be hardest hit by climate change in the future. The influential Stern Review noted that a 3C (5.4F) increase in global temperature was likely to result in an extra 150-550m people becoming exposed to the risk of hunger.
The review also said that climate change was likely to result in up to four billion people suffering water shortages. The world's drylands are likely to bear the brunt of this gloomy prognosis.
In my opinion, the world will only successfully fight poverty and achieve the Millennium Development Goals (MDGs) if we pay more attention to these unique ecosystems and learn from the mistakes of the past. This means moving away from a colonially biased view of drylands.
It is unfortunately still common to equate drylands with deserts and wastelands, as these areas might not look at first sight very productive, especially during a period of drought. So, what are the ingredients for success in developing the poorest regions of this world?
First of all, development interventions need to be adapted to the realities of drylands. Crop production, whether rain-fed or irrigated, will always be a limited opportunity. Yet the major effort in "development" is a green revolution for the desert.
Has half a century of development not taught us the reality for cultivation in the drylands? Livestock is much more suitable to arid environments and more likely to support rural livelihoods in arid regions.
For instance, Turkana pastoralists of Kenya know that livestock is their mainstay, even though they have some of the fastest maturing varieties of sorghum in the world.
Secondly, we should work with the knowledge and institutional systems of the people who have lived there for centuries. We need to understand why they have complex common property systems for land and resource management that may span and cover very large territories, and guarantee that a variety of stakeholders can use these scarce resources and survive.
It is important to also understand why they place more emphasis on livestock than crops. Livestock is a better converter of biomass in such harsh lands. We must not sweep aside this knowledge and experience. Instead, we should build on those systems and support them with so-called "modern and scientific knowledge" to improve productivity and create market opportunities.
Yet we ignore their complex risk management and resilience enhancement strategies. One classical example has been the numerous efforts to use inappropriate policies to settle nomadic people and restrict their movements.
Nomadic livestock herding has been a key sustainable survival strategy in the more arid areas. Once grass and water become scarce, these communities move with their animals to the next area. Thus, they are able to use resources sustainably without leaving themselves exposed to the effects of droughts.
While livestock farming in drylands contributes significantly to national economies, most subsidies go to unsustainable ranching projects rather than the small livestock holders.
Pastoralism is one of the few land use systems that can be compatible with wildlife conservation.
Yet where are many of the world's national parks? More than 70% of Kenya's are in drylands, which includes a number of important dry season grazing areas for pastoralists.
Dryland peoples depend on the surrounding environment, and they should be able to benefit from conservation through community conserved areas and tourism, rather than having their best lands taken away from them in the name of conservation.
Thirdly, nature's contribution to the survival of the poor needs to be recognised as an important asset. It is nature that provides food, fodder for livestock, construction material for shelter, medicinal plants, emergency food and climate regulation (shade is highly valued at 40°C).
Opportunities for sustainable development exist
Sudan is the world's largest producer of gum arabic, a principal ingredient of colas and chewing gum, which stems from a 2,000-year agroforestry tradition. And the arid lands of the Horn of Africa produce the highest quality frankincense and myrrh in the world.
In one district in Botswana that has an average annual rainfall of just 200mm, dryland ecosystem services contributed $190,000 (£95,000) to the national income. Almost 50% of this came from wild plants such as the medicinal devil's claw.
Instead of building on this natural capital, development and government interventions tend to replace and disregard them. Even worse, they are not reflected in the national GDP figures. As a consequence, most policy frameworks provide incentives for their exploitation rather than their sustainable use.
We cannot continue to let the world's poor dryland dwellers down. Panaceas, history tells us, don't work. Instead, we need to invest in the innovative and sustainable use of natural assets.
This article first appeared on the BBC's Green Room. Visit http://news.bbc.co.uk/1/hi/in_depth/sci_tech/green_room/default.stm
Source: http://cms.iucn.org/es/noticias/noticias_por_tema/gestion_de_ecosistemas_news/?1170/A-way-of-life-is-feeling-the-heat
Issues: Environment & Climate
California’s rich and diverse soils, vast farm and ranch lands, climate, air, water, and native species must be protected and in many cases restored so that future generations can enjoy the same quality of life that we have. ROC believes that, among several other key dynamics, the food system will not be sustainable until agriculture greatly reduces or eliminates its huge impacts on climate, air and water pollution. Roots of Change joins with farmers and ranchers, nonprofit organizations, public officials, entrepreneurs, and concerned citizens to ensure that stewardship incentives exist, that vital environmental research is funded, and that the knowledge gained is shared.
To learn more about how agriculture can be a powerful tool to fight climate change and reduce nitrogen pollution, click here.
Check out our forum page on the environment & climate.
The movement for healthy food and agriculture began and has grown largely as a result of non-profit organizations. The non-profits we have highlighted here reflect the breadth of issues covered by the movement.
Roots of Change believes that abundant, safe, healthy, fresh, and affordable food is a foundation for a positive future for all Californians. In a market-based economy, powerful solutions must come from entrepreneurs who apply sustainable principles and practices in their businesses.
Roots of Change is working together with California’s farmers and ranchers to ensure that every aspect of our food—from the time it’s grown to the time it’s eaten—is healthy, safe, profitable, and fair for those who grow it and for the state where it’s grown.
Source: http://www.rootsofchange.org/content/issues-0/56
LEDs & the environment
Climate change is commonly accepted to be the greatest threat to our environment. It will result in us all experiencing more extreme weather – with wetter winters and drier summers. This has been caused by the levels of greenhouse gases, including carbon dioxide (CO2), which have been released into our atmosphere. In the UK, business produces almost half of our carbon dioxide (CO2) emissions. Even one small office can emit three to five tonnes of carbon dioxide a year.
It is well known that industry can realise significant savings simply by upgrading old lighting systems with new, more energy-efficient lighting; of greater importance, however, is the resulting reduction in greenhouse gases emitted into the atmosphere. By drastically reducing the electricity that local authorities, business and industry demand from the power utilities, a substantial reduction can be made in CO2 emissions and in industry’s carbon footprint as a whole. Add the option of responsibly reusing existing lighting fixtures, and project costs can be reduced further while placing even less of a burden on the environment. Both the environment and your project win: UKLED offers a balance of purchase cost, performance and low operating costs over the long term, with a favourably short return on investment.
One kilowatt-hour of electricity causes 1.34 pounds (610 g) of CO2 emissions. A GU10 halogen downlighter rated at 50W, on for an average of 8 hours a day, will over a year cause 195 pounds (89 kg) of CO2. The 3-watt LED equivalent will cause only 11 pounds (5 kg) of CO2 over the same time span, a reduction of around 94%. A building’s carbon footprint from lighting can typically be reduced by between 64% and 95% by exchanging all legacy lamps and tubes for new LED lamps and tubes.
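The lamp arithmetic above can be reproduced directly. A minimal sketch, assuming the article's own figures of 0.61 kg CO2 per kWh and 8 hours' use per day over a 365-day year (not an official grid emission factor):

```python
# Yearly CO2 from a lamp, using the article's factor of 1.34 lb (0.61 kg) per kWh.

KG_CO2_PER_KWH = 0.61

def annual_co2_kg(watts, hours_per_day=8, days=365):
    """Return yearly CO2 in kg for a lamp of the given wattage."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh * KG_CO2_PER_KWH

halogen = annual_co2_kg(50)  # ~89 kg, the figure quoted for a 50W GU10
led = annual_co2_kg(3)       # ~5 kg for the 3-watt LED equivalent
saving = 1 - led / halogen   # ~0.94, i.e. the quoted ~94% reduction
```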
Source: http://ukled-ltd.co.uk/our-products/leds-the-environment/
No coal-fired power stations. No SUVs.
And they are warning the planet's atmosphere could have similar levels of the greenhouse gas within hundreds of years.
An international team led by German scientists and involving University of Queensland Environmental Geologist Dr Kevin Welsh has found tropical palms grew on the coast of Antarctica 52 million years ago.
At that warm period in the earth's history, there was twice as much CO2 in the atmosphere as there is now and winter temperatures of 10C meant Antarctica's 4km thick ice sheet didn't exist.
Fancy that, no ice in Antarctica 52 million years ago.
Below is what I wrote on the same subject for Menzies House on 24th July 2011:
Global warming. Rising sea levels. Massive volcanic activity around the world. Widespread climate change.
It’s not a scene from the Hollywood disaster film, The Day After Tomorrow, but the Earth as it appeared during the mid-to late-Cretaceous geological period, 145 million to 65 million years ago, when the largest dinosaurs such as Tyrannosaurus Rex ruled the planet.
Our planet during the late Cretaceous period was very different than it is today. Not only were dinosaurs like T-Rex present, but the climate was extremely warm and global sea levels were significantly higher than they are today. This was a time when there were no glaciers in either the Arctic or Antarctic.
Late Cretaceous atmospheric carbon dioxide levels were two to four times higher than today, which resulted in a greenhouse climate with tropical sea-surface temperatures rising to more than 34 degrees Celsius, 3 to 7 degrees Celsius warmer than today.
Caldeira and Rampino concluded in their 1991 paper - The mid-Cretaceous super plume, carbon dioxide, and global warming - that carbon dioxide emissions resulting from super-plume tectonics could have produced atmospheric carbon dioxide levels from 3.7 to 14.7 times the modern pre-industrial value of 285 ppm. Carbon dioxide levels today are around 390 ppm. According to Caldeira and Rampino, the temperature sensitivity to carbon dioxide increases used in their weathering-rate formulations would imply global warming of 2.8 to 7.7°C over today's global mean temperature.
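For scale, the multiples quoted from Caldeira and Rampino translate into ppm as follows (the pre-industrial 285 ppm and present-day 390 ppm figures are the article's own values):

```python
# Turn the quoted multiples of pre-industrial CO2 into ppm figures.

PRE_INDUSTRIAL_PPM = 285
TODAY_PPM = 390  # the article's circa-2012 value

low_ppm = 3.7 * PRE_INDUSTRIAL_PPM    # ~1,055 ppm at the low end of the range
high_ppm = 14.7 * PRE_INDUSTRIAL_PPM  # ~4,190 ppm at the high end
today_multiple = TODAY_PPM / PRE_INDUSTRIAL_PPM  # ~1.37x pre-industrial
```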
Further supporting Caldeira and Rampino's 1991 paper is the 1998 paper by John Tarduno and his collaborators - Evidence for Extreme Climatic Warmth from Late Cretaceous Arctic Vertebrates.
In 1996, Tarduno’s expedition team literally stumbled across a unique fossil find: vertebrate remains from fish, turtles and Champsosaurs.
The fossils indicate that at least once in Earth's history, high amounts of the greenhouse gas warmed Earth to much higher temperatures than usual.
The highlight of the expedition's find is a set of bones that belonged to an eight-foot Champsosaur, a now-extinct crocodile-like beast with a long snout and razor-sharp teeth.
The reptiles, which were tied to their freshwater environment on Axel Heiberg Island, needed an extended warm period each summer to survive and reproduce. Based on the numbers and sizes of the animals found, Tarduno's team estimated that the annual mean temperature in the Arctic during the late Cretaceous period, from about 92 million to 86 million years ago, was about 14 degrees Celsius. That means it was rarely if ever freezing during the winter, and summer temperatures consistently reached between 27 and 32°C.
The Arctic today is defined as the area where the average temperature for the warmest month (July) is below 10°C.
The fossils of the Champsosaur are a record of what was happening in the Arctic just as extreme volcanism on Earth was winding down.
Most of the volcanic activity didn't resemble spectacular eruptions like Mt. Pinatubo. Instead, the eruptions were "basaltic" – billions of tons of lava oozed out, and carbon dioxide floated skyward. Besides huge amounts of lava in the Arctic, where hardened lava rock today measures more than a kilometre thick in some places, magma oozed from volcanoes in the Caribbean, in the Pacific Ocean northeast of Australia, in the Indian Ocean, off the coasts of Madagascar and Brazil, in South Africa and in the Southwestern United States.
Understanding how our past atmosphere, land and ocean system interacted while in this global greenhouse mode is very relevant if we want to understand the fate of our future climate.
It also further illustrates that we live on a dynamic planet whose climate is always changing over the millennia.
Whilst no one denies that the world's industrialisation has considerably increased the output of greenhouse gases, to ascribe the current phase of our ever-changing climate to one single variable (carbon dioxide) or, more specifically, to a very small proportion of one variable (i.e. human-produced carbon dioxide) is not science, for it requires us to abandon all we know about our planet Earth, the Sun, our Galaxy and the Cosmos.
And believing that putting a price on carbon dioxide will make any difference to the Earth's climate is madness. The only sensible response to climate change is adaptation, as trying to prevent it is a fool's game.
Source: http://www.andysrant.com/2012/08/what-do-you-knowantarctica-had-rainforests-50-million-years-ago.html
Conception/Leipzig. Even the snow on Aconcagua Mountain in the Andes is polluted with PCBs. An international team of researchers detected low concentrations of these toxic, carcinogenic chlorine compounds in samples taken from America's highest mountain. The snow samples taken at an altitude of 6200 metres are among the highest traces found anywhere in the world of these substances, which have been banned since 2001. In particular, the samples contained more persistent compounds like hexachlorobiphenyl (PCB 138) and heptachlorobiphenyl (PCB 180). Mountain ranges could be a natural trap for persistent organic pollutants that are transported by the atmosphere all over the world, say the scientists from IIQAB in Barcelona (Now IDAEA), the UFZ in Leipzig and the University of Concepcion in Chile, writing in the journal Environmental Chemistry Letters. According to the researchers, these findings highlight the need to investigate further the role of mountains in the spread of these pollutants and the associated risks. Just a few weeks ago, Swiss researchers found similar persistent environmental pollutants in glacial lakes in the Alps and pointed to potential risks to drinking water supplies.
Polychlorinated biphenyls (PCBs) are among the 'dirty dozen' persistent organic pollutants banned worldwide under the Stockholm Convention. Until the 1980s, PCBs were used primarily in transformers and capacitors and as hydraulic fluids and diluents. As well as causing chronic effects like acne, hair loss and liver damage, PCBs are also a suspected cause of male infertility. The toxin also represents a danger to a large number of animals because it accumulates in fatty tissue and is passed on via the food chain.
The study of environmental pollution in remote mountain regions is difficult because they are not easily accessible. "This is compounded by the fact that the concentrations are often so small that researchers have to bring back large quantities of snow just to reach the detection limit. While conventional extraction methods need at least a litre of snow, the solvent-free method we used works with 40 ml," explains Peter Popp of the Helmholtz Centre for Environmental Research (UFZ), who analysed the samples in the laboratory in Leipzig. Roberto Quiroz of IIQAB, the Spanish research institute for environmental chemistry (now researcher at the EULA Chile Environmental Sciences Centre), adds, "On expeditions to high mountain peaks, every gram counts. We would never have been able to carry 40 litres of snow per sample. So we were very pleased that only 40 ml per sample were required for analysis in Leipzig." Aconcagua is in the southern Andes, close to the Chile-Argentina border, and has five large glaciers. It was a holy mountain of the Incas. As one of the Seven Summits (the highest mountains of each of the seven continents) Aconcagua is now a popular destination for mountaineers. The first to reach the summit was Swiss mountaineer Matthias Zurbriggen in 1897.
During the 2003 expedition, the Chilean researchers took samples at altitudes of 3500, 4300, 5000, 5800 and 6200 metres. The concentrations measured do not represent any immediate danger to mountaineers, who melt small quantities of snow to obtain water. The PCB concentration on Aconcagua was less than half a nanogram per litre. Compared with the values measured in other mountain and polar regions, the concentrations on the mountain peak in the Andes were relatively low. Concentrations four times higher have been measured in the Italian Alps, for instance – an indication that pollution in the southern hemisphere is less severe than in the northern hemisphere.
The PCB concentrations measured around the peak of Mount Aconcagua were approximately one-tenth of those found in earlier samples taken from Sierra Velluda, a mountain just 3500 metres high on the west side of the Andes in Chile. "This could be because of the way in which these pollutants accumulate in the snow. But it could also have something to do with the three hydroelectric power stations on the lower slopes of Sierra Velluda. Their transformers are potential sources of PCBs," suggests Ricardo Barra of the EULA-Chile Centre for Environmental Research at Concepcion University. "However, detecting PCBs in the snow on top of Aconcagua clearly shows that these compounds are transported to the Andes by the atmosphere and accumulate there."
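A back-of-envelope comparison of the concentrations quoted above also shows why such a sensitive method matters: at the summit value, a 40 ml sample contains only hundredths of a nanogram of PCB. The numbers below are the article's approximate figures, not measured data:

```python
# Relative PCB concentrations from the article, plus the analyte mass
# actually present in a 40 ml snow sample at the Aconcagua value.

aconcagua_ng_per_l = 0.5                           # "less than half a nanogram per litre"
alps_ng_per_l = 4 * aconcagua_ng_per_l             # "four times higher" -> ~2 ng/L
sierra_velluda_ng_per_l = 10 * aconcagua_ng_per_l  # summit was ~one-tenth of Sierra Velluda

sample_litres = 0.040
mass_ng = aconcagua_ng_per_l * sample_litres       # 0.02 ng in a 40 ml sample
```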
The research findings are also relevant in relation to climate change: "The shrinking of the glaciers could lead to the pollutants stored in the glacier snow being carried down with the melt water," fears Roberto Quiroz. South America is not the only part of the world in which water from melting glaciers plays an important role in irrigation for farming and as a source of drinking water.
|
<urn:uuid:c3e80b2d-2be0-4118-b330-c15d3c60178f>
|
http://www.sciencecodex.com/white_but_not_pure
| 0.6332
|
fineweb
|
Buried inside Robert Bryce’s relatively new book entitled Power Hungry is a call to “aggressively pursue taxes or caps on the emissions of neurotoxins, particularly those that come from burning coal” to generate electricity, such as mercury and lead. This is notable not only because Bryce agrees with many environmental and human health experts, but also because the book credibly debunks the move to tax or cap carbon dioxide emissions from both technical and political perspectives.
The word “neurotoxic” literally translates as “nerve poison”. Broadly described, a neurotoxicant is any chemical substance which adversely acts on the structure or function of the human nervous system.
As its subtitle signals, Power Hungry also declares policies subsidizing renewable sources of electricity, biofuels and electric vehicles as too costly and impractical to make a significant difference in making the U.S. power and transportation systems more sustainable.
So why take aim at mercury and lead, which is certain to drive up the cost of coal-fired electricity just as a carbon cap or tax would? Because, Bryce asserts, “arguing against heavy metal contaminants with known neurotoxicity will be far easier than arguing against carbon dioxide emissions. Cutting the output of mercury and the other heavy metals may, in the long run, turn out to have far greater benefits for environmental and human health.” Bryce draws a parallel to the U.S. government ordering oil refiners to remove lead from gasoline starting in the 1970s.
In the book, which has received predominantly good reviews on Amazon.com, Bryce makes some valid points about the carbon density of our energy sources. Among his overarching messages is that the carbon density of the world’s major economies is actually declining (see graph below). Not to be missed: his attack on carbon sequestration, pp. 160-165. His case about the threat of neurotoxins begins on p. 167.
There’s a lot more to the challenge of reducing America’s reliance on coal-fired power plants than this. But considering the failure by the U.S. Congress to agree on a carbon tax or cap, his idea has serious merit and deserves a broad discussion, especially as Congress reassesses its budget priorities. These priorities include billions of dollars of tax breaks and incentives for oil and other fossil fuels.
|
<urn:uuid:ed7842f6-485f-401b-96c7-6ca3e6045411>
|
http://www.theenergyfix.com/2011/05/07/tax-toxins-not-carbon-dioxide-from-coal-fired-power-plants/
| 0.973
|
fineweb
|
Deaths in Moscow have doubled to an average of 700 people a day as the Russian capital is engulfed by poisonous smog from wildfires and a sweltering heat wave, a top health official said today, according to the Associated Press.
The Russian newspaper Pravda reported: “Moscow is suffocating. Thick toxic smog has been covering the sky above the city for days. The sun in Moscow looks like the moon during the day: it’s not that bright and yellow, but pale and orange with misty outlines against the smoky sky. Muscovites have to experience both the smog and sweltering heat at once.”
“Russia has recently seen its longest heat wave for at least one thousand years,” said the head of the Russian Meteorological Center, the news site Ria Novosti reported.
Various news sites report that foreign embassies have reduced activities or shut down, with many staff leaving Moscow to escape the toxic atmosphere.
Russian heatwave: This NASA map released today shows areas of Russia experiencing above-average temperatures this summer (orange and red). The map was released on NASA’s Earth Observatory website.
NASA Earth Observatory image by Jesse Allen, based on MODIS land surface temperature data available through the NASA Earth Observations (NEO) Website. Caption by Michon Scott.
According to NASA:
In the summer of 2010, the Russian Federation had to contend with multiple natural hazards: drought in the southern part of the country, and raging fires in western Russia and eastern Siberia. The events all occurred against the backdrop of unusual warmth. Bloomberg reported that temperatures in parts of the country soared to 42 degrees Celsius (108 degrees Fahrenheit), and the Wall Street Journal reported that fire- and drought-inducing heat was expected to continue until at least August 12.
This map shows temperature anomalies for the Russian Federation from July 20-27, 2010, compared to temperatures for the same dates from 2000 to 2008. The anomalies are based on land surface temperatures observed by the Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA’s Terra satellite. Areas with above-average temperatures appear in red and orange, and areas with below-average temperatures appear in shades of blue. Oceans and lakes appear in gray.
Not all parts of the Russian Federation experienced unusual warmth on July 20-27, 2010. A large expanse of northern central Russia, for instance, exhibits below-average temperatures. Areas of atypical warmth, however, predominate in the east and west. Orange- and red-tinged areas extend from eastern Siberia toward the southwest, but the most obvious area of unusual warmth occurs north and northwest of the Caspian Sea. These warm areas in eastern and western Russia continue a pattern noticeable earlier in July, and correspond to areas of intense drought and wildfire activity.
Bloomberg reported that 558 active fires covering 179,596 hectares (693 square miles) were burning across the Russian Federation as of August 6, 2010. Voice of America reported that smoke from forest fires around the Russian capital forced flight restrictions at Moscow airports on August 6, just as health officials warned Moscow residents to take precautions against the smoke inhalation.
Posted by David Braun
|
<urn:uuid:10b103c1-284b-41c5-8dc9-bc9d1b7577ea>
|
http://newswatch.nationalgeographic.com/2010/08/09/russia_chokes_as_fires_rage_worst_summer_ever/
| 0.922
|
fineweb
|
CORRECTED-Warming temperatures could multiply Katrina-like hurricanes -study
(Corrects that IPCC assesses but does not run computer simulations, and expands range of warming, paragraph 5)
* Extreme storms most sensitive to rising temperatures
* Number of strong hurricanes could increase seven-fold
By Environment Correspondent Deborah Zabarenko
WASHINGTON, March 18 (Reuters) - The number of Atlantic storms with magnitude similar to killer Hurricane Katrina, which devastated the U.S. Gulf Coast in 2005, could rise sharply this century, environmental researchers reported on Monday.
Scientists have long studied the relationship between warmer sea surface temperatures and cyclonic, slowly spinning storms in the Atlantic Ocean, but the new study attempts to project how many of the most damaging hurricanes could result from warming air temperatures as well.
The extreme storms are highly sensitive to temperature changes, and the number of Katrina-magnitude events could double due to the increase in global temperatures that occurred in the 20th century, the researchers reported in the journal Proceedings of the National Academy of Sciences.
If temperatures continue to warm in the 21st century, as many climate scientists project, the number of Katrina-strength hurricanes could at least double, and possibly rise much more, with every 1.8 degree F (1 degree C) rise in global temperatures, the researchers said.
Computer projections assessed by the U.N. Intergovernmental Panel on Climate Change suggest that global temperatures could rise by between 1.8 degrees and 10.8 degrees F (1 degree and 6 degrees C) by century's end.
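Read together, the study's headline numbers are consistent with a simple compounding model: if Katrina-magnitude surge events at least double with each 1 degree C of warming, a few degrees of warming multiplies their frequency quickly. The sketch below is illustrative only; the factor of 2 per degree and the assessed warming range come from the article, while treating the doubling as multiplicative is an assumption.

```python
# Illustrative sketch: apply the "at least double per 1 degree C" sensitivity
# multiplicatively. The factor of 2 is quoted in the article; treating the
# doubling as compounding across degrees is an assumption.

def frequency_multiplier(delta_t_celsius, doubling_per_degree=2.0):
    """Relative frequency of Katrina-scale surge events after warming."""
    return doubling_per_degree ** delta_t_celsius

for dt in (1.0, 2.0, 3.0):
    print(f"+{dt:.0f} C -> {frequency_multiplier(dt):.0f}x baseline frequency")
```

Under this reading, roughly 3 degrees C of warming gives an eight-fold increase, in the vicinity of the seven-fold figure cited for strong hurricanes.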
To figure out how many of the most extreme hurricanes these higher temperatures might spawn, Aslak Grinsted of the Centre for Ice and Climate at the University of Copenhagen and his co-authors looked at storm surges, which are often the most damaging aspect of these monster storms.
A storm surge is the abnormal rise in water, over and above normal high tide, pushed toward shore by the winds whipping around a big cyclonic storm. Much of the damage from Hurricane Katrina, an estimated $108 billion, was caused by high storm surges across a wide area of the Gulf of Mexico coast, according to the National Oceanic and Atmospheric Administration (NOAA).
Superstorm Sandy, which plowed into the northeastern U.S. coast with hurricane-strength winds last year, cost an estimated $75 billion, NOAA said.
The researchers looked at storm surges going back to 1923, and related those to how warm air temperatures were when the surges occurred. Then, using computer models, they projected how storm surges might be influenced by future warming.
Storm surges can be a more accurate gauge of a hurricane's severity than wind-speed measures such as the Saffir-Simpson hurricane wind scale, Grinsted said by phone from Denmark.
"When people talk about (hurricane) intensity normally, then they mean wind speed," he said. "But that is not what is causing the most damage only. Sometimes it's about how fast it is traveling."
He said that was the case with Sandy, which traveled so slowly and stretched over such a wide area that its impact was intense, even though wind speeds abated somewhat by landfall.
Previous research on the link between climate change and hurricanes has suggested that there may be fewer hurricanes overall but more strong ones as global temperatures rise.
This study indicates there will be an increase of hurricanes of all magnitudes, but the increase will be greatest for the most extreme events. (Reporting by Deborah Zabarenko; Editing by Ros Krasny, Jackie Frank and Eric Beech)
|
<urn:uuid:da86b2d3-85d0-4b75-8994-21e41bccc284>
|
http://www.reuters.com/article/2013/03/19/usa-climate-hurricanes-idUSL1N0CA8MS20130319?feedType=RSS
| 1
|
fineweb
|
An oil executive once observed that burning oil for energy is like burning Picassos for heat. Oil is extraordinarily valuable as the basis for so many products we use every day that the thought of simply burning it ought to be unthinkable. So versatile are oil molecules that they can be transformed into substances that serve as clothing, medicines, building materials, carpet, skin care products, sporting goods, agricultural chemicals, perfumes, and myriad other products.
Increasingly, when we make oil-based products for homes and businesses, we are finding ways to reuse those products or recycle the materials they are made from (think: recyclable plastics). But, burning oil is always a one-time, irreversible act that leaves nothing of value behind and produces greenhouse gases and pollutants that harm us. And yet, because oil remains the most cost-effective and widely available source of liquid fuels, we are hooked on it for transportation with little prospect of substitutes on the scale we would require--unless we consider electricity.
It is worth remembering that electricity was a strong contender for powering automobiles at the beginning of the last century and that it ran the trolleys of the era (and still runs many today). Electricity was actually preferred over gasoline for powering cars at the time, especially cars that were used exclusively for local trips. Battery exchange was already available as a quick way to "charge" a car. But improvements in the internal combustion engine and the increasing availability and affordability of gasoline led to the demise of the electric car by the 1930s.
More recently, despite all the hand waving about marginal gains in U.S. oil production, we have been experiencing a plateau in worldwide oil production since 2005. Ongoing tightness in oil supplies has led to high prices for gasoline and diesel, and so the world is turning once again to electricity to power transportation. Of course, many hybrid gas-electric vehicles are already in use, and some all-electric vehicles are now being produced for the mass market. But in a world increasingly faced with energy constraints and climate change, continuing to rely on the automobile as the main source of transportation may be a poor policy choice.
First, astute observers will note that electric vehicles of whatever kind are actually powered primarily by fossil fuels. According to the U.S. Energy Information Administration two-thirds of all electric power worldwide is generated using fossil fuels. That means coal and natural gas are being burned to produce the lion's share of electricity. Some oil is still used, especially in countries that export it and so have cheap supplies available to them.
To reduce overall greenhouse gas emissions, we would have to burn less overall fossil fuel. Only one-third of the heat energy produced in a typical fossil-fueled power plant actually gets turned into electricity. The rest is expelled as waste heat which is why we see huge volumes of steam coming from cooling towers wherever fossil-fueled generating plants operate. Were it not for the fact that renewable energy can be employed to make electricity, electric-powered vehicles on a mass scale would provide little advantage when it comes to pollution and greenhouse gas emissions. These vehicles would, however, still reduce dependence on petroleum.
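The arithmetic behind this caveat is easy to make explicit. Using the two figures above (two-thirds of world electricity generated from fossil fuels, and one-third of a fossil plant's heat converted to electricity), each kilowatt-hour of grid electricity implies roughly two kilowatt-hours of fossil heat burned. This is a rough sketch, not a well-to-wheel analysis; transmission losses and the exact fuel mix are deliberately ignored.

```python
# Rough sketch of the point above. The two input fractions are the figures
# cited in the text; omitting grid losses and fuel-mix detail is a
# simplifying assumption, not a full well-to-wheel analysis.

FOSSIL_SHARE = 2 / 3       # fraction of world electricity from fossil fuels
PLANT_EFFICIENCY = 1 / 3   # fraction of plant heat converted to electricity

def fossil_heat_per_kwh(delivered_kwh=1.0):
    """kWh of fossil heat burned per kWh of electricity generated."""
    return delivered_kwh * FOSSIL_SHARE / PLANT_EFFICIENCY

print(fossil_heat_per_kwh())  # 2.0
```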
There are two obvious moves that would substantially reduce our reliance on fossil-fuel produced electricity. One already mentioned would be vastly expanding renewable energy sources such as wind, solar and hydroelectric. Naturally, there are the problems of load-balancing and storage related to intermittent power sources such as wind and solar. These problems would have to be overcome in the long term in order to allow the electrification of transportation based primarily on renewable energy. But, there are plausible paths to such an outcome, especially if overall reductions in energy use are part of the path, something I'll discuss below. Naturally, nuclear generated electricity can also be used to power vehicles. But I am doubtful that in the post-Fukushima era, nuclear power will be a viable option for increasing nonfossil fuel-based electricity production, both for political and technical reasons.
A second move that would reduce our reliance on fossil-fuel based electricity would be a vast expansion of our mass transit systems. Done properly, this expansion would reduce overall energy use in transportation by moving people from energy-intensive automobiles into more efficient mass transit. An overall reduction in energy use is important because, for many reasons, it is unlikely that renewable energy production will be able to match the huge quantities of energy we currently get from fossil fuels. The expansion of mass transit would need to be executed in a way that would make such systems so ubiquitous, convenient and inviting that people would prefer them over cars as many do in major American and European cities.
Much of the mass transit infrastructure can run on electricity and already does including electric-powered subways, commuter trains, buses and trams. To that infrastructure we would need to add electric-powered, high-speed passenger rail service between major cities. That's already in place in Europe and Japan. In the United States such a high-speed rail system would reduce the need for short-haul air travel and thus reduce jet fuel use. And, we'd want to expand and electrify freight traffic over rails, something that would lessen the need for long-haul trucking. Even in trucking, hybrid trucks are starting to appear in commercial fleets, something that can further reduce use of diesel and gasoline.
Of course, some modes of transport are not going to be amenable to electric power. Electric-powered planes are not impossible, but would probably not be able to carry much weight given the current state of battery technology. Ocean-going freighters will likely continue to need liquid fuels, though sails are starting to appear on some to reduce fuel use.
On land we will almost certainly need some liquid fuels for four categories of vehicles: rural transport, farm machinery, heavy equipment and emergency vehicles. It probably isn't cost-effective to string wires in rural areas for local transportation because population densities are too low. Some people are working on electric farm machinery charged using solar cells. But, the work needs to progress further before it can be widely adopted. For some farm tasks, liquid-fueled engines may continue to be the most practical approach for a long time to come. Where construction and mining take place away from sources of electricity, heavy equipment will have to operate using liquid fuels. Emergency vehicles could use electricity, but would have to have liquid-fuel capabilities in case the electricity is unavailable.
In the United States 71 percent of the petroleum products consumed are used in transportation. If the country were able to run its transportation system entirely without oil, the United States would not only cease to import oil, but would have significant surplus oil production. Of course, such a change could only take place over many years. But the advantages to such a transition are so numerous that we should not dismiss it as too difficult or costly.
Only 5 percent of all oil is used to produce petrochemicals--chemicals which form the basis for the almost miraculous materials and substances that we now take for granted. By ceasing to burn the bulk of our oil to move goods and people, we could sustain the production of these products for a very long time. And, properly formulated, many could be recycled almost indefinitely. That seems like a much better use of an energy source that doubles as the "renaissance man" of the chemical industry.
When you add in the reduction in greenhouse gas emissions and air pollution; an end to oil imports for the United States and possibly many other countries adopting the same strategy; and the financial boost of keeping funds previously spent on imports at home, it's hard to see why electrifying transportation would not be a good idea--so long as it is done with any eye toward increasing renewable energy production while reducing overall energy consumption in the transportation sector.
Kurt Cobb is an author, speaker, and columnist focusing on energy and the environment. He is a regular contributor to the Energy Voices section of The Christian Science Monitor and author of the peak-oil-themed novel Prelude. In addition, he writes columns for the Paris-based science news site Scitizen, and his work has been featured on Energy Bulletin, The Oil Drum, OilPrice.com, Econ Matters, Peak Oil Review, 321energy, Common Dreams, Le Monde Diplomatique and many other sites. He maintains a blog called Resource Insights.
|
<urn:uuid:1df04d55-82a4-4684-b3ab-370306a531a3>
|
http://resourceinsights.blogspot.com/2012/11/burning-picassos-for-heat-why-we-need.html?showComment=1352055129140
| 0.8868
|
fineweb
|
Saturday 30 October, 6.15pm until 7.30pm, Upper Gulbenkian Gallery
Contemporary fears about climate change have brought historical concerns about global population numbers back onto the agenda. There has been much discussion about the need for lifestyle change, particularly in the Western world, to reduce the amount we consume. But a growing number of voices argue that this skirts around an equally important consideration: the need to reduce the absolute number of ‘carbon footprints’ left on the planet. With the global population set to reach seven billion in a few years’ time, some argue we are heading for a crisis, as food supplies and energy sources wane in the face of increasing demand.
On the other hand, it is pointed out that similar arguments have been made throughout history – most notably by Thomas Malthus – and have been proven wrong, as development and human ingenuity have solved the problems posed by apparently natural limits. Critics object to the way more people are seen as a burden on the planet, rather than a source of creativity. Moreover, the world population is growing in the developing world rather than the richer countries, and there is a concern that population reduction arguments might be tainted with racist undertones.
The Optimum Population Trust produces calculations to show how reducing population levels will ameliorate the environmental and social crises provoked by growing numbers of people. Others argue controlling population has immediate benefits – to women, who in some parts of the world lack access to modern contraception; and to families on low incomes struggling to support the children they already have. Some family planning organisations have brought the environmental argument together with the arguments for reproductive choice, claiming the number of ‘births averted’ through abortion is a boon. But what, if any, is the link between individuals’ reproductive choices and the state of the natural environment? Is it irresponsible for people to have large numbers of children in the knowledge they will consume more resources? Is there anything wrong with promoting voluntary strategies for limiting family size?
chairman, Optimum Population Trust
editor, spiked; author, Can I Recycle My Granny and 39 Other Eco-Dilemmas
chief executive, British Pregnancy Advisory Service
In the years after Cairo, population issues essentially fell off the international agenda. Now that is beginning to change. (Catholics for Choice, Conscience, 2010)
The Chinese policy of birth control has reduced the number of children with some alarming consequences. State control of reproduction is both wrong and ineffective. (The Times, 12 September 2010)
We have met the enemy, and in our ever-growing, voracious multitudes, it is us! We have nine billion -- or is it 12? -- things to start talking about, asap. (David Katz, Huffington Post, 1 September 2010)
Politicians of western countries avoid talking about population control, but if we invest in family planning we might just save our planet. (Mary Fitzgerald, New Statesman, 31 August 2010)
The Royal Society’s two-year study of population seems to have already decided that there are ‘too many people’. (Brendan O'Neill, spiked, 19 July 2010)
There are too many humans and disease may restore the balance, the actor claims. (Amy Turner, The Times, 24 May 2010)
Brendan O’Neill says that the state’s cruel and antiquated one-child policy is being propped up by British environmentalists with an agenda — but the Chinese are striking back. (Brendan O'Neill, Spectator, 20 May 2010)
Since 200 AD, scaremongers have been describing human beings as ‘burdensome to the world’. They were wrong then, and they’re still wrong today. (Brendan O'Neill, spiked, 20 November 2009)
‘Can you name a single environmental problem that would not be easier to solve with fewer people, and doesn’t get harder –- and ultimately impossible –- to solve with ever more?’ (Roger Martin, Reuters, 17 October 2009)
An Optimum Population Trust Briefing. (Carter Dillard, Optimum Population Trust, 2 July 2008)
The most ominous reality of 21st-century life may be the fall in human birth rates almost everywhere in the world. (Jeff Jacoby, International Herald Tribune, 23 June 2008)
World population growth—and how to slow it—continues to be a subject of great controversy. The planet's poorest nations have yet to find effective ways to check their population increase—at least without restricting citizens' rights and violating such traditions as the custom of having large families as insurance in old age. (Time, 25 October 1977)
"As ever, the Battle of Ideas is full of stimulating and lively argument. It's fun to be able to clash robustly in a good-humoured atmosphere."
Martin Wright, Editor in chief, Green Futures
|
<urn:uuid:42d856df-9b98-4a61-af51-232618abaa69>
|
http://www.battleofideas.org.uk/index.php/2010/session_detail/4130/
| 0.7608
|
fineweb
|
Climate Change Information Project - February 2013 Digest Vol. 8
WASHINGTON (Reuters) - Thousands of protesters gathered on Washington's National Mall on Sunday calling on President Barack Obama to reject the controversial Keystone XL oil pipeline proposal and honor his inaugural pledge to act on climate change.
Organizers of the Forward on Climate event estimated that 35,000 people from 30 states turned out in cold, blustery conditions for what they said was the biggest climate rally in U.S. history.
For the full story visit http://www.reuters.com/assets/print?aid=USBRE91G0GZ20130217
Below are articles from the Climate Change Information Project. The focus of the project is to get scientifically-based articles relating to climate change disseminated to the broader public. Currently, most of this information tends to stay within the scientific community.
2012 Sustained Long-Term Climate Warming Trend, NASA Finds
Jan. 15, 2013 — NASA scientists say 2012 was the ninth warmest year since 1880, continuing a long-term trend of rising global temperatures. With the exception of 1998, the nine warmest years in the 132-year record all have occurred since 2000, with 2010 and 2005 ranking as the hottest years on record. For complete story http://www.sciencedaily.com/releases/2013/01/130115190218.htm
Increases in Extreme Rainfall Linked to Global Warming
Feb. 1, 2013 A worldwide review of global rainfall data led by the University of Adelaide has found that the intensity of the most extreme rainfall events is increasing across the globe as temperatures rise. For complete story go to http://www.sciencedaily.com/releases/2013/02/130201100036.htm
Unprecedented Glacier Melting in the Andes Blamed On Climate Change
Jan. 22, 2013 Glaciers in the tropical Andes have been retreating at an increasing rate since the 1970s, scientists write in the most comprehensive review to date of Andean glacier observations. The researchers blame the melting on rising temperatures as the region has warmed... For complete article http://www.sciencedaily.com/releases/2013/01/130122101907.htm
Loss of Arctic Sea Ice Speeds Domino Effect of Warming Temperatures at High Latitudes
Jan. 23, 2013 Melting Arctic sea ice is no longer just evidence of a rapidly warming planet -- it's also part of the problem.
Alan Werner, professor of geology at Mount Holyoke College, said that decreasing amounts of Arctic snow and ice in summer will lead to a greater degree of heat absorption at the North Pole. For complete story:
Feb. 11, 2013 Ancient carbon trapped in Arctic permafrost is extremely sensitive to sunlight and, if exposed to the surface when long-frozen soils melt and collapse, can release climate-warming carbon dioxide gas into the atmosphere much faster than previously thought. For complete article http://www.sciencedaily.com/releases/2013/02/130211162116.htm
Heat Waves, Storms, Flooding: Climate Change to Profoundly Affect U.S. Midwest in Coming Decades
Jan. 18, 2013 In the coming decades, climate change will lead to more frequent and more intense Midwest heat waves while degrading air and water quality and threatening public health. Intense rainstorms and floods will become more common, and existing risks to the Great Lakes will be exacerbated. For complete article http://www.sciencedaily.com/releases/2013/01/130118104121.htm
Predictions of the Human Cost of Climate Change
Feb. 8, 2013 A new book, Overheated: The Human Cost of Climate Change, predicts a grim future for billions of people in this century. It is a factual account of a staggering human toll, based on hard data. For complete article:
|
<urn:uuid:50c5f80e-7064-4178-8b19-5a048e9a26fb>
|
http://www.sunyit.edu/apps/weblog/?blog=bremerm&mode=viewpost&id=21778
| 1
|
fineweb
|
Nuclear is clean energy
Illustration: Andrew Dyson
At a conference in Melbourne last October, global energy expert Sir Chris Llewellyn Smith, the director of energy research at Oxford University, urged Australia to embrace civilian nuclear power. He said it was the logical pathway to clean, secure and cost-effective energy.
He also reflected on the ''hard sell'' of a nuclear power policy in this country due to fear, pseudoscience, political pragmatism, poor education and the dominant hydrocarbon energy lobby. Risk-conscious Australians were urged to remember: ''The nuclear accident in Japan has not killed anybody. There may be one or two people who will die of cancer, but we are talking of very small numbers, if any.''
There is a ridiculous paradox in the energy policy of a nation which makes symbolic gestures to the United Nations about embracing ''clean renewable energy'' but continues to maintain its position as the planet's premier exporter of dirty coal. Delegate nations to last month's UN climate change conference in Doha were challenged by the growing dangers of an as-yet-uncontrolled increase in emissions from the burning of hydrocarbon fuels.
The International Energy Agency estimates that, during 2010, about 30.6 gigatonnes of carbon dioxide poured into the Earth's atmosphere as a result of burning fossil fuels. This represents an increase of about 1.6 gigatonnes over the 2009 levels, despite the economic effects globally of the most serious recession for the past 80 years.
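A quick check of what these two figures imply (the 2009 level below is derived from the article's numbers, not quoted directly):

```python
# Derive the implied 2009 emissions level and year-on-year growth from the
# two IEA figures cited above. Both derived values are arithmetic
# consequences of those figures, not additional data.

emissions_2010_gt = 30.6   # Gt CO2 from fossil-fuel burning, 2010 (IEA)
increase_gt = 1.6          # rise over 2009 levels

emissions_2009_gt = emissions_2010_gt - increase_gt
growth_pct = 100 * increase_gt / emissions_2009_gt
print(f"implied 2009 level: {emissions_2009_gt:.1f} Gt, "
      f"growth: {growth_pct:.1f}%")
```

In other words, emissions grew roughly 5.5 percent in a single year, despite the recession.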
This is the highest level of greenhouse gas emissions in recorded human history, and it will make it almost impossible to achieve the UN panel's goals of a mean global maximum temperature rise of not more than 2 degrees, with a carbon concentration below 350 parts per million.
These targets cannot be attained without nuclear power. In 2013, it is just as foolish to be a ''nuclear denier'' as a ''climate sceptic''.
A recent International Energy Agency report commended the role of nuclear power in combating climate change and providing global energy security at the end of the hydrocarbon-fuel age. In summary, it said, ''nuclear power is the technology which must be accelerated, promoted and relied upon if the world is to stabilise carbon dioxide emissions at an acceptable level''.
Australia's Energy and Resources Minister, Martin Ferguson, has often endorsed this view. He chaired a recent meeting of the international agency, where he said: ''The only proven form of clean energy of a baseload and a reliable nature is actually nuclear from a global point of view.''
Also at stake in Australia is the gradual elimination of more than 250 million tonnes of carbon dioxide a year from hydrocarbon-fuelled power stations. This could more than double by the year 2050, when Australia's energy supply will need to exceed 100GW(e). Of this, at least one-quarter should be nuclear to meet the UN's climate criteria. Without such a policy, Prime Minister Julia Gillard's ''clean energy pathway'', Climate Change Minister Greg Combet's ''clean coal'' and ''renewable'' mantra, and former prime minister Kevin Rudd's concern for ''the greatest ethical problem facing humanity'' are merely symbolic gestures and political spin.
The International Energy Agency's calculations indicate that to avoid climate change ''tipping points'' and to avert environmental disasters, annual energy-related greenhouse gas emissions must not exceed 32 gigatonnes by the year 2020. Our own calculations indicate that, for Australia, a steadily decreasing carbon price of $7 per tonne of coal will facilitate the introduction of nuclear power into the country and help achieve all the nation's carbon abatement goals. An energy policy based on clean coal and renewables will see only a steady rise in the carbon price to hundreds of dollars without any significant carbon abatement but with a concomitant loss in energy security. The Productivity Commission's recent report on carbon pricing has confirmed these issues.
The UN climate panel has proposed that the world should halve its carbon-dioxide emissions by the year 2050. At the request of the Group of Eight countries - Canada, France, Germany, Italy, Japan, Russia, Britain and the US - the International Energy Agency produced a report detailing the optimal solution path to this objective. It is focused on nuclear power and plays down the effectiveness of clean coal and renewables.
Australia is a country hungry for energy and thirsty for water. The nation's sustainable development, its value-adding industries and its rural production are largely dependent on these two commodities. And now, among the world's leading scientists and engineers, there is a growing consensus that a greenhouse-gas-free and cost-effective supply of energy, water and even hydrogen can best be sourced from a ''generation four'' nuclear power plant. And the generation costs would be a fraction of those for renewables or clean coal. Electricity could be produced at under 2¢ a kilowatt hour and potable water typically at under $2 per cubic metre.
In 2011, on the 25th anniversary of the nuclear accident in Chernobyl, the former head of the International Atomic Energy Agency, Mohamed ElBaradei, summed up the present global civilian nuclear power situation as follows: ''Today, nuclear power is the only real alternative to fossil fuels as a source of sustainable and reliable supply.'' To this he added: ''Fukushima represents a potentially significant setback to nuclear power'' but stressed that ''confidence would be re-established in due course'', saying ''Chernobyl and Fukushima will be shown to be aberrations''.
>> Professor Leslie Kemeny is the Australian foundation member of the International Nuclear Energy Academy. He was the Australian observer and assessor at Chernobyl, which he visited in 1987.
|
<urn:uuid:f8044344-6612-452f-903e-c0d060f9d128>
|
http://www.canberratimes.com.au/opinion/nuclear-is-clean-energy-20130110-2cinm.html
| 1
|
fineweb
|
What is Green Energy?
What is Green Energy?
Green Tags are created when wind power or other renewable energy is substituted for traditional power. The result is a shift away from our dependence on burning fossil fuel to produce electricity. Using clean renewable energy is friendly to the environment and reduces emissions of carbon dioxide and other greenhouse gases. Green Tags represent the real savings in carbon dioxide and other pollutants that occur when green power replaces burning fossil fuel.
Renewable energy is still a little more expensive than buying traditional power, so Green Tags are purchased in addition to the electricity that you are now using. Buying Green Tags has the same effect as buying green power. Both replace fossil fuel generators with clean renewables, and both have exactly the same environmental benefits.
The purchase of Green Tags - which are also called renewable energy certificates - supports the production of renewable energy in the United States and Canada. Participants continue to receive a separate electricity bill from that provided by their utility company. For every unit of renewable energy generated, equivalent amounts of Green Tags (renewable certificates) are produced. Green Tags support new renewable electricity generation, which offsets the environmental effects of burning coal, gas, and other fossil fuels in the region where the renewable generator is located, and helps shift the overall energy mix toward more renewable resources. Also, Green Tags help build a market for renewable energy, reduce global climate change, and may have other environmental benefits such as reducing regional air pollution.
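The certificate bookkeeping described above is simple enough to sketch in code. The snippet below is an illustrative model only: it assumes the common convention of one certificate per megawatt-hour, and the grid emission factor is a placeholder, not a figure from any actual certificate program.

```python
import math

# Illustrative Green Tag (REC) bookkeeping. One certificate per MWh is the
# common convention; the emission factor below is an assumed placeholder.
GRID_EMISSION_FACTOR_T_PER_MWH = 0.7  # assumed tonnes of CO2 displaced per MWh

def green_tags_for(kwh_consumed: float) -> int:
    """Whole certificates needed to cover a given electricity consumption."""
    return math.ceil(kwh_consumed / 1000.0)

def co2_offset_tonnes(tags: int) -> float:
    """Estimated CO2 displaced by the renewable generation behind the tags."""
    return tags * GRID_EMISSION_FACTOR_T_PER_MWH

tags = green_tags_for(10_500)   # e.g. roughly a year of household use
print(tags)                     # 11 certificates
print(co2_offset_tonnes(tags))  # about 7.7 tonnes of CO2
```

Because certificates are sold whole, consumption is rounded up; this is why buying Green Tags on top of a regular electricity bill can fully cover a household's usage.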
The 100% renewable Green Tag energy that ultimately supplies the HostPapa servers comes from multiple sources, including:
- Condon Wind Facility (Gilliam County, OR)
- Foote Creek Wind Facility (Carbon County, WY)
- Klondike Wind Facility (Sherman County, OR)
- Northwest Small Wind Co-op (WA, MT)
- Northwest Solar Co-op (OR, WA)
- Portland Brewery Blocks (Portland, OR)
- Solar Ashland (Ashland, OR)
- Stateline Wind Facility (Walla Walla County, WA; Umatilla County, OR)
- Summerview Wind Facility (Pincher Creek, Alberta, Canada)
- Tillamook Animal Waste to Energy (Tillamook, OR)
- Washington State School for the Blind (Vancouver, WA)
- White Bluffs/Hanford (White Bluffs, WA)
The above facilities offset our energy consumption by supplying the power grid with the equivalent of our energy requirements in 100% green energy from wind and solar providers.
We at HostPapa are proud of the fact that our company is making the world a cleaner and healthier place. All of our equipment including servers, data centers, power allotment, and even our office computers and laptops are powered by 100% renewable and environmentally-friendly energy. We intend to continue investing in these alternative energy sources, even with the rapid growth and expansion of our company and related equipment.
Next: Spread the Word!
|
<urn:uuid:d42389a5-574e-4576-956f-fd4a8b67feea>
|
http://www.hostpapa.com/all-about-green/what-is-green-energy/
| 1
|
fineweb
|
Kids at Sidwell Friends School in Washington, D.C., love their new 'green' campus.
What has light, fresh air, and is a great place to learn?
An ecofriendly middle school.
At Sidwell Friends Middle School in Washington, D.C., one of the newest teachers on campus this year is a building. From top to bottom it's energy efficient, environmentally friendly, and an inviting place to learn everything from science to singing.
It all started when the school needed more classroom space. Instead of tearing down the existing building, a construction crew brought in a bulldozer to clear out the interior, and an L-shaped addition went up beside it. The new, U-shaped building is filled with earth-friendly features, but the spacious rooms with huge windows are the first things you notice.
"The extra natural light in the classrooms really keeps you awake and enjoying the day," says Isabel Dorval, a ninth-grader at Sidwell.
Walkways made of what?
The architects chose natural, recycled, and renewable materials wherever possible. Most of these could be used with minimal impact on the environment. Doors were made with a veneer of bamboo (a fast-growing grass), bulletin boards with cork (which can be harvested without cutting down trees), and cabinets from wheatboard (which is made of wheat straw – the part of the plant that's left over after the grain is harvested).
Old materials were also reused in new ways. Bleachers from another school were used to make the window trim. Wood for walkways came from a pier in Baltimore. The "skin" on the outside of the building was made with wood from wine casks. The sun is turning the boards a beautiful silvery gray.
On the roof, tall, glass-sided chimneys vent warm air, creating a current that pulls cool, fresh air through the building's north-facing windows. Sixth-graders tend rooftop beds of herbs that they cut and bring to the cafeteria. Native plants help insulate the building and filter rain-water that flows through downspouts to the landscaped area below.
Instead of planting a grass lawn, the school created a terraced wetland area and pond by the main entrance. The area has become a hands-on science lab where students take water samples, identify microorganisms, and study wildlife.
Another important purpose of the wetlands is to treat the water from sinks and toilets.
Waste goes into an underground tank, where tiny organisms begin to break it down. Then it filters through plants, rock, and sand in the wetland and back through the building to be used in sinks and toilets and to cool machinery. Fresh water in drinking fountains comes from the city supply. The school uses about 90 percent less water than a traditionally built school of the same size.
"My favorite place is probably the benches outside by the wetlands," says Isabel. "It feels like it's a little habitat out there because you're enclosed on three sides by the building. There's a mural that illustrates the sedimentation process. That brings awareness of what's happening right in front of you. That's very neat."
Lessons from the building
Mechanical controls, vents, and pipes in plain view make it easier to understand how everything works. Along the wide, open hallways filled with natural light, wind chimes in vents signal when fresh air is being taken into the building. In science class, everyone reads the monitors to note how temperature and levels of carbon dioxide change throughout the day.
The building is a great place for environmental detective work, too.
"I asked the students to look around and tell me where paper was used to make something in the classroom," says Jennifer Mitchell, who teaches fifth-grade science. "One student looked up and said it was in the ceiling tiles, and he was right. The ceiling tiles are made from recycled newspaper."
Let the sunshine in
The building's greatest energy saving is in its use of light. The large windows have light shelves above that reflect natural light farther into rooms without letting in heat from the sun. On the south side of the building, where the sun is strongest, horizontal screens shade classrooms from glare. On the east and west sides, vertical screens shade windows when the sun is low.
Some days, the overhead fluorescent lights never need to be turned on. That saves not only the energy it takes to keep lights on; it also saves the energy it takes to cool down the building from all the heat that lights can generate. The result is that such an efficient building has helped the school cut its energy use by 60 percent.
All about the environment
From the day the doors to the new building opened, changes have echoed through Sidwell Friends School. The cafeteria has been serving more organic and locally grown food.
There's an environmental club called ECO, and students have begun to teach their parents about more energy-efficient ways of doing things at home.
"We have such an opportunity here," says Ms. Mitchell. "As you learn about the building, it makes you think how much sense it makes to do things this way."
Isabel likes science very much and says the new building has made science even more interesting for her.
"It went from little in-class experiments to really learning about the school itself as an experiment," she says. "I think I can speak for our whole grade, saying that suddenly you just understood your environment and how you affect it."
|
<urn:uuid:47bb1a35-670c-437e-ade1-659e793bfd3d>
|
http://www.csmonitor.com/The-Culture/The-Home-Forum/2008/0311/p18s03-hfks.html
| 0.5122
|
fineweb
|
The Obama Administration has just ramped up the nation’s transition to clean energy resources such as wind power with a new Great Lakes wind power initiative that joins federal agencies with Illinois, Michigan, Minnesota, New York, and Pennsylvania in a regional development plan for installing offshore wind turbines in the Great Lakes. In addition to supplying clean, renewable wind power to local grids, the new plan could provide the mid-continent states in this group with an economic boost from selling low cost renewable energy to other regions of the U.S.
Wind energy as a regional asset
The five states have joined in a Memorandum of Understanding that sets forth an interstate cooperative mechanism similar to the Atlantic Wind Consortium for East Coast states spearheaded by the Obama Administration last year. Other mid-continent states have also begun leveraging their wind resources for sale to the national grid, including North Dakota and Kansas, which has been enthusiastically touting its “Grain Belt Express” wind energy initiative.
Wind energy and national security
Along with the expected participation of the U.S. departments of Energy and Environmental Protection, the Memorandum of Understanding includes the Department of Defense, the Army, and the Coast Guard. That could be at least partly related to the military's cautious approach to wind energy, primarily due to concerns over radar interference from wind turbines. However, that doesn't necessarily mean that DoD will only play the devil's advocate in terms of siting and permitting new offshore wind farms. In recent years the military has begun to embrace wind power, and more is on the way through the U.S. Army's Net Zero initiative.
Wind energy in the Great Lakes
Wind farms have already proven to provide landlocked rural communities with economic benefits in terms of job creation and tax revenues, with virtually none of the risks posed by fossil fuel harvesting. For communities without sufficient land for wind farms, offshore wind power is the solution. The Obama Administration estimates that the Great Lakes could provide about one fifth of the total U.S. wind energy potential, or more than 700 gigawatts (one gigawatt of electricity can run about 300,000 typical homes).
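The figures quoted above can be sanity-checked with quick arithmetic. The snippet below uses only the numbers from this paragraph (the 700 GW potential, the 300,000-homes-per-gigawatt ratio, and the one-fifth share of U.S. potential); everything it prints follows from them.

```python
# Rough check of the Great Lakes wind figures quoted in the article.
GREAT_LAKES_POTENTIAL_GW = 700   # "more than 700 gigawatts"
HOMES_PER_GW = 300_000           # "one gigawatt ... about 300,000 typical homes"
GREAT_LAKES_SHARE = 0.2          # "about one fifth of the total U.S. wind energy potential"

homes_supplied = GREAT_LAKES_POTENTIAL_GW * HOMES_PER_GW
implied_us_total_gw = GREAT_LAKES_POTENTIAL_GW / GREAT_LAKES_SHARE

print(f"{homes_supplied:,} homes")               # 210,000,000 homes
print(f"{implied_us_total_gw:,.0f} GW US total") # 3,500 GW US total
```

At the stated ratio, the Great Lakes potential alone would serve on the order of 210 million typical homes, and the implied national potential is roughly 3,500 GW.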
Missing the boat on wind power
Not all of the eight Great Lakes states are included in the MOU. Indiana is not on the list, nor is Ohio, which makes sense given Governor John Kasich's "that's just dumb" position on offshore wind development back in 2010, though his stance appears to have tempered in the last couple of years.
Also notably absent from the Great Lakes plan is Wisconsin, where last March the state legislature abruptly suspended new wind power rules that were just coming to fruition after a year of hard work by numerous stakeholders. Though the legislature caved in just a couple of weeks ago and let the rules stand, the damage has already been done. The state’s wind industry – and its economy – will not be able to take full advantage of surging interest in wind power, at least not this year. The damage has already been done to the legislature, too, as a number of members are facing a recall election along with the state’s governor, Scott Walker.
Follow Tina Casey on Twitter: @TinaMCasey.
Tina Casey specializes in military and corporate sustainability, advanced technology, emerging materials, biofuels, and water and wastewater issues. Tina’s articles are reposted frequently on Reuters, Scientific American, and many other sites. You can also follow her on Twitter @TinaMCasey and Google+.
|
<urn:uuid:8d380d12-7a8e-4d81-a95c-481792189876>
|
http://cleantechnica.com/2012/03/30/obama-enlists-wind-power-to-breathe-new-life-into-great-lakes-economy/
| 0.9789
|
fineweb
|
Time to think big
Did the designation of 2010 as the first-ever International Year of Biodiversity mean anything at all? Is it just a publicity stunt, with no engagement on the real, practical issues of conservation, asks Simon Stuart, Chair of IUCN’s Species Survival Commission.
Eight years ago 183 of the world’s governments committed themselves “to achieve by 2010 a significant reduction of the current rate of biodiversity loss at the global, regional and national level as a contribution to poverty alleviation and to the benefit of all life on Earth”. This was hardly visionary—the focus was not on stopping extinctions or loss of key habitats, but simply on slowing their rate of loss—but it was, at least, the first time the nations of the world had pledged themselves to any form of concerted attempt to face up to the ongoing degradation of nature.
Now the results of all the analyses of conservation progress since 2002 are coming in, and there is a unanimous finding: the world has spectacularly failed to meet the 2010 Biodiversity Target, as it is called. Instead species extinctions, habitat loss and the degradation of ecosystems are all accelerating. To give a few examples: declines and extinctions of amphibians due to disease and habitat loss are getting worse; bleaching of coral reefs is growing; and large animals in South-East Asia are moving rapidly towards extinction, especially from over-hunting and degradation of habitats.
|This month the world’s governments will convene in Nagoya, Japan, for the Convention on Biological Diversity’s Conference of the Parties. Many of us hope for agreement there on new, much more ambitious biodiversity targets for the future. The first test of whether or not the 2010 International Year of Biodiversity means anything will be whether or not the international community can commit itself to a truly ambitious conservation agenda.|
The early signs are promising. Negotiating sessions around the world have produced 20 new draft targets for 2020. Collectively these are nearly as strong as many of us hoped, and certainly much stronger than the 2010 Biodiversity Target. They include: halving the loss and degradation of forests and other natural habitats; eliminating overfishing and destructive fishing practices; sustainably managing all areas under agriculture, aquaculture and forestry; bringing pollution from excess nutrients and other sources below critical ecosystem loads; controlling pathways introducing and establishing invasive alien species; managing multiple pressures on coral reefs and other vulnerable ecosystems affected by climate change and ocean acidification; effectively protecting at least 15 per cent of land and sea, including the areas of particular importance for biodiversity; and preventing the extinction of known threatened species. We now have to keep up the pressure to prevent these from becoming diluted.
We at IUCN are pushing for urgent action to stop biodiversity loss once and for all. The well-being of the entire planet—and of people—depends on our committing to maintain healthy ecosystems and strong wildlife populations. We are therefore proposing, as a mission for 2020, “to have put in place by 2020 all the necessary policies and actions to prevent further biodiversity loss”. Examples include removing government subsidies which damage biodiversity (as many agricultural ones do), establishing new nature reserves in important areas for threatened species, requiring fisheries authorities to follow the advice of their scientists to ensure the sustainability of catches, and dramatically cutting carbon dioxide emissions worldwide to reduce the impacts of climate change and ocean acidification.
If the world makes a commitment along these lines, then the 2010 International Year of Biodiversity will have been about more than platitudes. But it will still only be a start: the commitment needs to be implemented. We need to look for signs this year of a real change from governments and society over the priority accorded to biodiversity.
|One important sign will be the amount of funding that governments pledge this year for replenishing the Global Environment Facility (GEF), the world’s largest donor for biodiversity conservation in developing countries. Between 1991 and 2006, it provided approximately $2.2 billion in grants to support more than 750 biodiversity projects in 155 countries. If the GEF is replenished at much the same level as over the last decade we shall know that the governments are still in “business as usual” mode. But if it is doubled or tripled in size, then we shall know that they are starting to get serious.|
IUCN estimates that even a tripling of funding would still fall far short of what is needed to halt biodiversity loss. Some conservationists have suggested that developed countries need to contribute 0.2 per cent of gross national income in overseas biodiversity assistance to achieve this. That would work out at roughly $120 billion a year—though of course this would need to come through a number of sources, not just the GEF. It is tempting to think that this figure is unrealistically high, but it is small change compared to the expenditures governments have committed to defence and bank bail outs.
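The 0.2 per cent figure and the $120 billion estimate can be checked against each other; the calculation below simply inverts the percentage to find the implied combined gross national income of the developed countries (illustrative arithmetic only, using the article's own numbers).

```python
SHARE_OF_GNI = 0.002   # 0.2 per cent of gross national income
TARGET_USD = 120e9     # "roughly $120 billion a year"

# The GNI base implied by the two figures quoted above.
implied_gni = TARGET_USD / SHARE_OF_GNI
print(f"${implied_gni / 1e12:.0f} trillion combined GNI")  # $60 trillion
```

A $120 billion annual contribution at 0.2 per cent implies a combined developed-country GNI of about $60 trillion, which is the right order of magnitude for the figures being discussed.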
It is time for the conservation movement to think big. We are addressing problems that are hugely important for the future of this planet and its people, and they will not be solved without a huge increase in funds.
|
<urn:uuid:2d3e80a0-ca7b-4358-80a9-0f5129e87a3e>
|
http://cms.iucn.org/es/recursos/focus/enfoques_anteriores/cbd_2010/noticias/opinion/?6131/time-to-think-big
| 0.9928
|
fineweb
|
— Sheila Watt-Cloutier
Chair, Inuit Circumpolar Conference (February 12, 2005)
For millennia, the Inuit have lived in the Arctic coastal areas of Alaska, Canada, Russia and Greenland. Like many indigenous peoples, the Inuit are the product of the physical environment in which they live. The culture, economy and identity of the Inuit as an indigenous people depend upon the ice and snow. Climate change now threatens the Inuit's human rights to culture, life, personal security, health, housing, and food.
The Arctic is warming much more rapidly than previously known, at nearly twice the rate as the rest of the globe, according to the Arctic Climate Impact Assessment (ACIA), a four-year scientific study conducted by an international team of 300 scientists under the direction of a high-level intergovernmental forum including the United States. Increasing greenhouse gases from human activities are projected to make the Arctic warmer still, according to this unprecedented report.
These changes will have major global impacts, such as contributing to global sea-level rise and intensifying global warming, according to the ACIA final report.
Earthjustice's International Program, along with the Center for International Environmental Law, has worked with the Inuit Circumpolar Conference to submit a petition to the Inter-American Commission on Human Rights seeking relief from the impacts of climate change resulting from the United States' failure to take effective action to reduce its greenhouse gas emissions. A positive decision on the petition will establish the responsibility of the United States and other major greenhouse gas-emitting nations for the human rights violations resulting from climate change. Such responsibility creates an international obligation to take action to prevent such violations.
Learn more about this issue:
- Summary of the petition (PDF)
- Download the full petition (2 MB PDF file)
- The latest science on the issue
- Additional resources
- Getting the Inuit Message on Global Warming to the World: Confessions of a Press Guy
- Climate Change Denial: A Note to Journalists
- The Snow Must Go On (Grist Magazine, 7/26/05)
|
<urn:uuid:d73b868f-77c2-466b-95c2-999a9d72aedc>
|
http://earthjustice.org/print/features/inuit-human-rights-and-climate-change
| 1
|
fineweb
|
The current cycle of global warming is changing the rhythms of climate that all living things have come to rely upon. What will we do to slow this warming? How will we cope with the changes we've already set into motion? While we struggle to figure it all out, the face of the Earth as we know it—coasts, forests, farms, and snowcapped mountains—hangs in the balance.
More About Global Warming
See National Geographic's full coverage of the 2010 Gulf of Mexico oil spill: pictures, news reports, and first-person accounts.
Burning fossil fuels, humans pump CO2 into the atmosphere. Fortunately, plants and ocean waters gather it in. But what if this great recycling system went awry?
See the effects global warming has had on Antarctic glaciers and the wildlife that depends on them.
Learn more about these underground reservoirs of steam and hot water that can be tapped to generate electricity or to heat and cool buildings directly.
The Great Energy Challenge
An initiative to help you understand our current energy situation.
See how you measure up against others, and how changes at home could do tons to protect the planet.
The World's Water
NG's new Change the Course campaign launches. When individuals pledge to use less water in their own lives, our partners carry out restoration work in the Colorado River Basin.
A special series on how grabbing water from poor people and future generations threatens global food security, environmental sustainability, and local cultures.
|
<urn:uuid:58a5802e-df03-455e-a035-890db1abafc1>
|
http://environment.nationalgeographic.com/environment/global-warming/
| 1
|
fineweb
|
Powering up for Wildlife
“Smart-from-the-start” solar-energy plan ensures fewer risks to wildlife
Building solar-energy fields on already developed land can help protect desert tortoise habitat. (Photo: © Mark A Wilson)
The desert landscape is looking even brighter now that the Interior Department has released its first roadmap for environmentally responsible solar-energy development on public lands in the West. According to the map, unfurled this fall, this means the Bureau of Land Management (BLM) will direct solar-energy projects to areas that avoid or have reduced impacts to wildlife and wild lands in Arizona, California, Colorado, Nevada, New Mexico and Utah.
“The decision to direct solar projects to low-conflict areas is a much more rational approach to renewable energy development on public lands,” says Jamie Rappaport Clark, Defenders’ president. “This is better for wildlife, energy developers, utilities and investors alike because it offers a more efficient way to get environmentally friendly renewable energy on line and greater certainty for all involved.”
But she adds that the impact of the plan on the federally threatened desert tortoise is still a concern because the plan allows for some vital tortoise habitat to remain open to potential development. “The BLM and U.S. Fish and Wildlife Service should have taken a more cautious approach to exclude these areas or delay solar development until more scientific information becomes available on the impact of these projects on the species,” says Clark. “Instead, they chose to merely ‘discourage’ development in these areas. This creates greater uncertainty for developers and could undermine the survival and long-term sustainability of a unique and iconic desert species.”
GET UP STAND UP
You can help wildlife by adding solar panels to your rooftop. Defenders is partnering with SolarCity to encourage the switch to solar energy and will receive a $400 donation for every Defenders member that installs a home solar system. SolarCity will install your system for free—you pay for electricity by the month, just like your utility bill but lower. Learn more and sign up for a free consultation at www.defenders.org/solarcity.
In total, 17 solar-energy zones were finalized over approximately 280,000 acres, which represents a significant reduction from earlier proposals.
Development can also occur on a case-by-case basis on an additional 19 million acres of BLM lands outside the solar development zones—although the plan technically discourages it.
Over the past two years, Defenders urged the Obama administration to adopt upfront, “smart-from-the-start” principles for planning, designing and managing renewable energy projects that avoid or minimize the impact to sensitive desert land and wildlife—like the desert tortoise, bighorn sheep, golden eagle and Mojave ground squirrel.
Since some solar projects can sprawl over 8,000 acres or more, if located in sensitive habitats their sheer size can result in significant habitat loss and fragmentation. This can make it difficult for wildlife to find food, water, shelter, mates and protection from predators. Fragmented habitat can also lead to smaller, isolated populations of wildlife, making long-term survival more difficult.
“In our efforts to switch to clean energy and reduce greenhouse gas pollution, we must ensure that development of utility-scale solar power does not preclude wildlife from migrating to lands essential for climate change adaptation,” says Erin Lieberman, Defenders’ western policy advisor for renewable energy and wildlife. “We need to encourage the development of solar, wind and geothermal energy for all the benefits they provide. But we must not do it at the expense of our nation’s rich wildlife legacy.”
Defenders will continue working with stakeholders and the Department of the Interior to ensure that large-scale solar development occurs in places of least conflict so that wildlife and natural resources, like the desert tortoise, are protected.
|
<urn:uuid:f7a4226e-1055-406f-a2eb-2c10aeaba121>
|
http://www.defenders.org/magazine/winter-2013/powering-wildlife
| 0.9935
|
fineweb
|
The U.S. had the world's top two costliest natural disasters in 2012, according to a report released Thursday by global reinsurance firm Aon Benfield, based in London.
The largest global disasters of 2012 were Hurricane Sandy (with a cost of $65 billion) and the year-long Midwest/Plains drought ($35 billion), according to the company's Annual Global Climate and Catastrophe Report, which was prepared by Aon Benfield's Impact Forecasting division.
The $35 billion figure is one of the first estimates of the U.S. drought cost, which "comes from a combination of anticipated losses sustained by the agricultural sector and other factors such as business interruption," says Aon Benfield meteorologist and senior scientist Steve Bowen.
Sandy and the drought accounted for nearly half of the world's economic losses but, owing to higher levels of insurance coverage in the U.S., 67% of insured losses globally, the report states. Total economic losses include the entire cost of an event, while insured losses are the amount of economic losses that are covered by insurance, says Bowen.
The U.S. alone accounted for nearly 90% of all the world's insured losses in 2012. In addition to the drought and Sandy, several severe weather events and Hurricane Isaac contributed to this total.
The U.S. typically represents 64% of the insured losses. Why does the U.S. percentage tend to be so high each year? "From an insurance perspective, the United States has generally been the dominant region of the world for costliest natural disaster events," Bowen says. "The U.S. has a higher level of insurance penetration than most countries, which in turn leads to more of the economic losses being covered."
Global natural disasters in 2012 combined to cause economic losses of $200 billion, which is just above the 10-year average of $187 billion. There were 295 separate events, compared to an average of 257.
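The report's proportions hang together: the two costliest events set against the global total reproduce the "nearly half" claim made earlier in the article. A quick check, using only figures quoted here:

```python
# All values in billions of USD, as quoted in the article.
SANDY = 65
DROUGHT = 35
GLOBAL_ECONOMIC_LOSSES = 200

top_two_share = (SANDY + DROUGHT) / GLOBAL_ECONOMIC_LOSSES
print(f"Top two events: {top_two_share:.0%} of global economic losses")  # 50%
```

Sandy and the drought together account for exactly half of the $200 billion global total, consistent with the article's "nearly half" characterisation.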
A report in October 2012 from Munich Re, the world's largest reinsurance firm, said that climate change was driving the increase in natural disasters since 1980, and predicted that those influences will continue in years ahead.
However, the Aon Benfield report states: "Despite growing support for 'the new normal' theory of a world dominated by rapidly escalating global catastrophe losses, our study highlights that 2012 returned to a more normal level of losses after the extreme economic and insured losses of 2011." That was the year of the terrible earthquake and tsunami in Japan, an earthquake in New Zealand, and floods and landslides in Thailand.
The report goes on to say that while nominal catastrophe losses are increasing at an alarming rate, economic losses as a percentage of global GDP, a measure normalized for inflation and economic development, have remained relatively stable over the past 30 years.
The deadliest event of 2012 was Super Typhoon Bopha, which left more than 1,900 people dead after making landfall in the Philippines in December. The number of human fatalities caused by natural disasters in 2012 was approximately 8,800, with nine of the top 10 events occurring outside of the U.S.
Copyright 2013 USATODAY.com
Read the original story: Hurricane Sandy, drought cost U.S. $100 billion
|
<urn:uuid:49be57ab-0a7c-4016-869f-8cde070e7530>
|
http://www.postcrescent.com/usatoday/article/1862201
| 1
|
fineweb
|
|© UNICEF Nepal/Shrestha|
|Children crossing the road near Garbari tole in the flood-affected town of Nepalgunj, Banke District, Nepal.|
KATHMANDU, Nepal, 29 August 2006 – UNICEF has started distributing supplies to families affected by devastating floods in the Mid-Western and Far-Western Regions of Nepal.
The Nepal Red Cross Society, UNICEF’s partner in the aid distribution, estimates that some 5,800 households have been affected by flooding in the district of Banke, with a further 5,000 households affected in neighouring Bardiya.
Some 30 per cent of the houses in Banke and 1,030 homes in Bardiya reportedly have been destroyed. Families have been taking shelter on the high ground next to main roads, and six local schools are being used as shelters.
“Our immediate concern for the families is that they have shelter and warmth, and to ensure they can purify their drinking water,” UNICEF’s Representative in Nepal, Dr. Suomi Sakai, said today. “The districts most affected by the flooding are some of the poorest districts in the country. The families living there have been heavily hit both by the internal conflict of the last 10 years and by a recent drought.
“Diarrhoea is a major worry,” Dr. Sakai added. “Some 45 children die each day [from diarrhoea] in Nepal as it is, even without a natural disaster. It is vital that the families can purify water easily.”
© UNICEF Nepal/Shrestha: Flooding in Nepalgunj, Banke District, Nepal.
Joint mission to Bardiya
Along with the World Food Programme (WFP), the Red Cross and other organizations, UNICEF is working closely with the Government of Nepal to assess the situation and coordinate relief efforts.
The organization is sending technical experts in hygiene and water supply to help with the assessment in flood-affected areas, and its field staff on the ground is already assisting with preliminary surveys.
UNICEF staff also joined a joint mission to Bardiya today with the UN Office for the Coordination of Humanitarian Affairs, WFP and the United Nations Development Programme.
“If the rains start to ease up soon, there is the possibility that the flood waters will start to recede within a week,” noted Dr. Sakai. “But if the rains continue, the plight of these families may become much worse.”
More supplies en route
Prior to the flooding, UNICEF had emergency supplies pre-positioned at its field office in Nepalgunj, one of the affected districts. These supplies, now being distributed, include 60 tarpaulins for shelter, 375 blankets, 25 boxes of water-purification tablets and 4,000 sachets of oral rehydration salts for treating diarrhoea.
Two further truckloads of supplies are due to arrive in Nepalgunj on Wednesday morning. They will bring a further 1,300 tarpaulins, 1,000 blankets, 159 plastic sheets, 200,000 sachets of water-purification powder, 2 boxes of water-treatment tablets, 1,500 buckets, 1,000 plastic mugs and 540 sets of kitchen utensils.
A final truckload containing 1,000 hygiene kits should be dispatched by the weekend. These kits include soap, towels and clothes, toothbrushes and toothpaste, sanitary napkins, combs, nail cutters and other items.
“We are standing ready to provide further help, should it be needed,” said Dr. Sakai.
Source: http://www.unicef.org/infobycountry/nepal_35484.html
RIO CLARO, BRAZIL (AP) - The cash cows on Carlos Marques' farm used to be nothing but that: herds of dairy cattle that grazed the grassy, rolling hills of his property, where most of the dense tropical forest was long ago cut down for pastures and cropland.
But now the trees are starting to put money in his pocket as well.
The 68-year-old farmer is part of a pilot project that aims to reverse the economics of environmental destruction by paying farmers to preserve the forests that protect a crucial watershed, using money from some of the millions of people who use that water.
It's the sort of initiative that is at the heart of the United Nations' Rio+20 earth summit, the three-day mega-conference that ends Friday and is aimed at pushing sustainable development to the top of the world's agenda.
"It used to be that the forest was worth nothing," said Fernando Veiga, water funds manager at The Nature Conservancy, the environmental organization that helped spearhead the Rio Claro-area project along with a Brazilian NGO and the state and municipal governments. "But we know how crucial living trees are to the planet, and now they have a monetary value."
Proponents insist that sustainable development, which allows economic growth to meet people's current needs while preserving natural resources for the future, is the only way to prevent an environmental meltdown that could prove catastrophic for the planet and humanity.
But critics contend that the idea often serves as a front that permits governments and companies to make noise about protecting the environment while permitting business to continue as usual.
Looking out onto rounded hills that surround Marques' farm near the tiny town of Rio Claro, 130 kilometers (80 miles) south of Rio de Janeiro, it's hard to believe this entire region was once swathed in dense vegetation. Devastated by centuries of deforestation (first for coffee plantations, then for charcoal, and now for cattle raising and urban sprawl), Brazil's Atlantic Forest has been whittled down to just 12 percent of its original size, and scientists say it ranks among the world's most threatened ecosystems.
The hills around Rio Claro are now almost bald, with just a sparse covering of grass that's often chewed down to the root by the rangy cattle that graze here. With little to anchor the earth into place, erosion has cut vivid gashes of rusty red soil.
This desolate landscape is the source for the Guandu River, which provides 80 percent of Rio's water. Because of deforestation and erosion, water is less abundant than locals say it once was, and silt from the erosion and other pollutants seep into the tributaries of the Guandu, as well as the river itself. That forces water officials to heavily treat the water to make it usable, costing the city $500 million per year, according to environmentalists. And still, most Rio residents who can afford it drink bottled water.
On Marques' property, for example, the brook that once babbled its way across his land had dried up, as had many other streams in the area, the farmer said.
The Nature Conservancy and partner organization Instituto Terra developed the Guandu Water Fund to protect Rio's water supply by investing in the forests that help generate the water itself.
Under the pilot project, inaugurated in 2009, $500,000 in fees paid by big water consumers are being doled out to small farmers around Rio Claro who pledge to conserve their forests or allow swaths of their land to be reforested.
Farmers sign a contract promising to keep their animals out of protected plots, and organizers send out teams of locally hired employees to fence in the areas and plant thousands of saplings from a potpourri of some 80 native plant species.
The payouts are mostly small (Marques receives just $640 a year for his 62 protected acres), but advocates say even symbolic amounts help change people's attitudes toward conservation.
"I used to think of the trees as mine, to use as I saw fit, but now I see things differently," said Marques, a father of five and grandfather of five. "The trees that grow here are mine, but lots of other people depend on them, too, so by saving even just one single tree, I'm performing a service for all of humanity."
Since he joined the project three years ago, the dried-up stream has been resuscitated. At first it was a mere trickle, he said, but now it's grown into a thick rope of water.
With real, measurable gains for 9 million consumers in Rio de Janeiro, for the forest and for the locals who call it their home, the Guandu Water Fund embodies the win-win situation for people and the environment that sustainable development aspires to be.
Such initiatives are gaining traction among policymakers as a way to slow the kind of wholesale environmental destruction that has been blamed for recent years' rise in devastating droughts, floods and other natural disasters.
The notion of sustainable development was born well ahead of the current conference's precursor, the U.N.'s 1992 Earth Summit, which helped put climate change on the world agenda. Still, it remains an amorphous, and divisive, concept.
"Definitions of just what is sustainable development vary, society by society," said Jeffrey Sachs, the economist and who heads Columbia University's Earth Institute. "But while there are big debates about the specifics and how to balance ... the economy, the environmental and the social concern, I think that the basic idea that we have three bottom lines, not one, is the most important idea."
Still, it has been hard to agree on how to implement it.
Weeks of bickering between rich and poor countries delayed agreement on the final summit conclusions and the result has disappointed environmental groups, who have lambasted it as toothless and inadequate.
"What most people at the summit are talking about when they talk about sustainable development is nothing but business as usual under a different name, something that will deliver misery to many and profit to a few," said Daniel Mittler, a political director at Greenpeace who is heading the environmental group's delegation at Rio+20.
"But it doesn't have to be: Sustainability is an agenda that can deliver for people and the planet at the same time," Mittler said, adding that political will and direction are needed to make it work on a global scale. "The tragedy of Rio+20 is that governments are failing to grasp that opportunity."
While decision makers squabble over language, farmer Marques in Rio Claro says he's sold on sustainable development.
"I need money to live, but I also need clean air and clean water," he said. "This project gives me all three at the same time."
Source: http://www.washingtontimes.com/news/2012/jun/21/cash-cows-disputes-explain-un-development-summit/print/
NOAA: November Warmer than Average in U.S., January-November Temperature Near Average for U.S.
December 11, 2008
The November 2008 temperature for the contiguous United States was warmer than the long-term average, according to NOAA’s National Climatic Data Center in Asheville, N.C. The January-November 2008 temperature was near average.
The average November temperature of 44.5 degrees F was 2.0 degrees F above the 20th Century average. Precipitation across the contiguous United States in November averaged 1.93 inches, which is 0.20 inch below the 1901-2000 average.
For the January-November period, the average temperature of 54.9 degrees F was 0.3 degree above the 20th Century average. The nation’s January-November temperature has increased at a rate of 0.12 degrees per decade since 1895, and at a faster rate of 0.41 degrees each decade during the last 50 years. All findings are based on a preliminary analysis of data based on records dating back to 1895.
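Per-decade rates like the 0.12- and 0.41-degree figures above come from fitting a straight line to the annual temperature series. Here is a minimal sketch of that calculation using ordinary least squares; the series below is synthetic, constructed to have a 0.41-degree-per-decade slope, and is not NOAA's actual record:

```python
# Estimate a per-decade temperature trend from an annual series with
# ordinary least squares. The numbers below are synthetic, not NOAA data.

def trend_per_decade(years, temps):
    """Slope of the least-squares line temps ~ years, in degrees per decade."""
    n = len(years)
    mean_y = sum(years) / n
    mean_t = sum(temps) / n
    cov = sum((y - mean_y) * (t - mean_t) for y, t in zip(years, temps))
    var = sum((y - mean_y) ** 2 for y in years)
    return 10.0 * cov / var  # per-year slope scaled to per-decade

years = list(range(1959, 2009))                      # a 50-year window
temps = [54.0 + 0.041 * (y - 1959) for y in years]   # warming at 0.41 F/decade

print(round(trend_per_decade(years, temps), 2))  # -> 0.41
```

NOAA's published trends also account for data adjustments and quality control; this sketch shows only the core slope calculation.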
U.S. Temperature Highlights
- November temperatures were cooler than average across the Southeast and Central regions, and much warmer than average in the Southwest, Northwest and West regions.
- The West region had its fourth warmest November on record. This contrasted with the Southeast, which was much below normal.
- Persistent above-average temperatures for the last six months have resulted in a record warm June-November period for the West region. California set a record for its warmest June-November, while both Nevada and Utah had their fifth warmest June-November period.
- Based on NOAA's Residential Energy Demand Temperature Index, the contiguous U.S. temperature-related energy demand was 0.6 percent below average in November.
U.S. Precipitation Highlights
- The United States measured above-normal precipitation across the northern Great Plains from eastern Montana to western Minnesota. However, November was drier than normal across much of the South and Central regions.
- Precipitation across most of the Midwest was only 50-75 percent of normal and some areas from southern Missouri through central Illinois received less than 50 percent of normal precipitation.
- The January-November period has been persistently wet across much of the country from the central Plains to the Northeast. The 11-month period was the wettest on record for New Hampshire and Massachusetts, second wettest for Missouri, third wettest for Vermont and Illinois, and fifth wettest for Maine and Iowa.
- At the end of November, 22 percent of the contiguous United States was in moderate-to-exceptional drought, about the same as October. Meanwhile, extreme-to-exceptional drought conditions continued in the western Carolinas, northeast Georgia, eastern Tennessee, southern Texas, and Hawai’i.
- About 26 percent of the contiguous United States was in moderately-to-extremely wet conditions at the end of November, according to the Palmer Index. This was a decrease of about three percent compared to October.
- It was the wettest November on record in Yuma, Ariz., with 2.2 inches (5.6 cm) of precipitation – all of it falling on November 26. This was more than five times the November average.
- An early November blizzard forced more than 100 businesses and schools, and Interstate 90, to close in western South Dakota on Nov. 5 and 6. The blizzard brought total snow accumulations of 3 to 4 feet and drifts up to 20 feet in places.
- Several periods of strong northwesterly winds during the month resulted in mountain-enhanced snowfalls across the mountains of western Virginia, North Carolina, and extreme northern Georgia. Banner Elk, N.C. recorded 6.2 inches (15.7 cm) of snow during the month making it the snowiest November since 1983.
- Three separate wildfires, which scorched 41,000 acres in Southern California, destroyed 1,000 homes and prompted 15,000 people to evacuate from November 13-17.
NCDC’s preliminary reports, which assess the current state of the climate, are released soon after the end of each month. These analyses are based on preliminary data, which are subject to revision. Additional quality control is applied to the data when late reports are received several weeks after the end of the month and as increased scientific methods improve NCDC’s processing algorithms.
NOAA understands and predicts changes in the Earth's environment, from the depths of the ocean to the surface of the sun, and conserves and manages our coastal and marine resources.
Source: http://www.noaanews.noaa.gov/stories2008/20081211_novstats.html
Hockerton Housing Project
The Hockerton Housing Project (HHP) is the UK’s first earth sheltered, self-sufficient ecological housing development. The residents of the five houses generate their own clean energy, harvest their own water and recycle waste materials causing minimal pollution or carbon dioxide emissions. The houses are amongst the most energy efficient, purpose-built dwellings in Europe. The houses are the focus of a holistic way of living, which combines the production of organic foods, low intensity fish farming, promotion of wildlife, and the planting of thousands of trees.
The project was conceived in the early 1990s. It took two years to complete the planning agreement with the local authority and a further two years to build the homes and facilities.
Over the years the project has established itself as an exemplar of sustainable development. As a result of this, it has developed a range of services through the creation of a small on-site business. This workers’ co-operative provides a level of employment for its members, whilst promoting sustainable development. Its activities include running guided tours, workshops, talks, consultancy and, soon to be launched, a match-making service.
Although each family has their own home, the community share food growing, site maintenance, managing the facilities and a common sustainable business.
Source: http://www.diggersanddreamers.org.uk/index.php?fld=super_region&val=The%20Midlands&one=dat&two=det&sel=hockertn
A massive weather formation has developed that is nearly nine thousand miles long, stretching from the equatorial Pacific into the northern part of the northern hemisphere, driving moist tropical air from the Pacific deep into the United States.
The formation is causing flooding in previously drought-afflicted Texas, and is driving a flow of warm, extremely moist air far to the north. It is unusual for a weather system this large to develop in earth's atmosphere, and its presence could be another sign of climate change.
In The Coming Global Superstorm, Art Bell and Whitley Strieber referred to theories that very large scale atmospheric phenomena like this would develop as the earth warmed, and the storm described in the book starts when a gigantic flow of warm, moist tropical air drives into the far north, a flow that is interrupted when ocean currents cease to support it. There is then a collision between the tropical air mass and cold air pouring down from the arctic in the wake of the collapsed currents.
There has been evidence that surface features of the North Atlantic Current have been weakening since 1999, and the recent severe storms that swept northern Europe may be connected with this process.
At the present time, arctic weather conditions are relatively normal for this time of year. However, if a significant temperature spike should now appear in the arctic, it would be cause for concern. In any case, the penetration of warm, humid tropical air so deeply into the northern hemisphere at this time of year sets the stage for further dramatic weather over the next few weeks, possibly taking the form of strong blizzards across the midwest.
Source: http://www.unknowncountry.com/news/unusual-weather-formation-appears
China has worked actively and seriously to tackle global climate change and build capacity to respond to it. We believe that every country has a stake in dealing with climate change and every country has a responsibility for the safety of our planet. China is at a critical stage of building a moderately prosperous society on all fronts, and a key stage of accelerated industrialization and urbanization. Yet, despite the huge task of developing the economy and improving people’s lives, we have joined global actions to tackle climate change with the utmost resolve and a most active attitude, and have acted in line with the principle of common but differentiated responsibilities established by the United Nations. China voluntarily stepped up efforts to eliminate backward capacity in 2007, and has since closed a large number of heavily polluting small coal-fired power plants, small coal mines and enterprises in the steel, cement, paper-making, chemical and printing and dyeing sectors. Moreover, in 2009, China played a positive role in the success of the Copenhagen conference on climate change and the ultimate conclusion of the Copenhagen Accord. In keeping with the requirements of the Copenhagen Accord, we have provided the Secretariat of the United Nations Framework Convention on Climate Change with information on China’s voluntary actions on emissions reduction and joined the list of countries supporting the Copenhagen Accord.
The targets released by China last year for greenhouse gas emissions control require that by 2020, CO2 emissions per unit of GDP should go down by 40% - 45% from the 2005 level, non-fossil energy should make up about 15% of primary energy consumption, and forest coverage should increase by 40 million hectares and forest stock volume by 1.3 billion cubic meters, both from the 2005 level. The measure to lower energy consumption alone will help save 620 million tons of standard coal in energy consumption in the next five years, which will be equivalent to the reduction of 1.5 billion tons of CO2 emissions. This is what China has done to step up the shift in economic development mode and economic restructuring. It contributes positively to Asia’s and the global effort to tackle climate change.
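As a quick plausibility check of the two figures quoted above (a back-of-envelope sketch, not an official methodology), the savings imply an emission factor of about 2.42 tonnes of CO2 per tonne of standard coal equivalent, a plausible order of magnitude for coal:

```python
# Back-of-envelope check of the quoted figures: 620 million tonnes of
# standard coal equivalent saved, said to avoid 1.5 billion tonnes of CO2.

coal_saved_t = 620e6     # tonnes of standard coal equivalent (tce)
co2_avoided_t = 1.5e9    # tonnes of CO2

implied_factor = co2_avoided_t / coal_saved_t  # t CO2 per tce
print(round(implied_factor, 2))  # -> 2.42
```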
Ladies and Gentlemen,
Green and sustainable development represents the trend of our times. To achieve green and sustainable development in Asia and beyond and ensure the sustainable development of resources and the environment such as the air, fresh water, ocean, land and forest, which are all vital to human survival, we countries in Asia should strive to balance economic growth, social development and environmental protection. To that end, we wish to work with other Asian countries and make further efforts in the following six areas.
First, shift development mode and strive for green development. To accelerate the shift in economic development mode and economic restructuring provides an important precondition for our efforts to actively respond to climate change, achieve green development and secure the sustainable development of the population, resources and the environment. It is the shared responsibility of governments and enterprises of all countries in Asia and around the world. We should actively promote a conservation culture and raise awareness for environmental protection. We need to make sure that the concept of green development, green consumption and a green lifestyle and the commitment to taking good care of Planet Earth, our common home are embedded in the life of every citizen in society.
Second, value the importance of science and technology as the backing of innovation and development. We Asian countries have a long way to go before we reach the advanced level in high-tech-powered energy consumption reduction and improvement of energy and resource efficiency. Yet, this means we have a huge potential to catch up. It is imperative for us to quicken the pace of low-carbon technology development, promote energy efficient technologies and raise the proportion of new and renewable energies in our energy mix so as to provide a strong scientific and technological backing for green and sustainable development of Asian countries. As for developed countries, they should facilitate technology transfer and share technologies with developing countries on the basis of proper protection of intellectual property rights.
Third, open wider to the outside world and realize harmonious development. In such an open world as ours, development of Asian countries and development of the world are simply inseparable. It is important that we open our markets even wider, firmly oppose and resist protectionism in all forms and uphold a fair, free and open global trade and investment system. At the same time, we should give full play to the role of regional and sub-regional dialogue and cooperation mechanisms in Asia to promote harmonious and sustainable development of Asia and the world.
Fourth, strengthen cooperation and sustain common development. Pragmatic, mutually beneficial and win-win cooperation is a sure choice of all Asian countries if we are to realize sustainable development. No country could stay away from or manage to meet on its own severe challenges like the international financial crisis, climate change and energy and resources security. We should continue to strengthen macro-economic policy coordination and vigorously promote international cooperation in emerging industries, especially in energy conservation, emissions reduction, environmental protection and development of new energy sources to jointly promote sustainable development of the Asian economy and the world economy as a whole.
Fifth, work vigorously to eradicate poverty and gradually achieve balanced development. A major root cause for the loss of balance in the world economy is the seriously uneven development between the North and the South. Today, 900 million people in Asia, or roughly one fourth of the entire Asian population, are living below the 1.25 dollars a day poverty line. We call for greater efforts to improve the international mechanisms designed to promote balanced development, and to scale up assistance from developed countries to developing countries, strengthen South-South cooperation, North-South cooperation and facilitate attainment of the UN Millennium Development Goals. This will ensure that sustainable development brings real benefits to poor regions, poor countries and poor peoples.
Sixth, bring forth more talents to promote comprehensive development. The ultimate goal of green and sustainable development is to improve people’s living environment, better their lives and promote their comprehensive development. Success in this regard depends, to a large extent, on the emergence of talents with an innovative spirit. We need to build institutions, mechanisms and a social environment to help people bring out the best of their talents, and to intensify education and training of professionals of various kinds. This will ensure that as Asia achieves green and sustainable development, our people will enjoy comprehensive development.
Ladies and Gentlemen,
We demonstrated solidarity as we rose up together to the international financial crisis in 2009. Let us carry forward this great spirit, build up consensus, strengthen unity and cooperation and explore a path of green and sustainable development. This benefits Asia. It benefits the world, too.
In conclusion, I wish this annual conference of the Boao Forum for Asia a complete success.
Source: http://news.xinhuanet.com/english2010/china/2010-04/11/c_13245754_2.htm
An atlas report from Australia's Commonwealth Scientific and Industrial Research Organization (CSIRO) outlines the monstrous wave energy potential of the country's southern coastline.
The report, Ocean Power for Australia -- Waves, Tides, and Ocean Currents [pdf], estimates that if Australia could harness just 10% of the wave energy produced, it could meet all of its current electricity consumption.
"If we look at the sustained energy resource along the southern coastline - and we're looking between Geraldton in West Australia and southern tip of Tasmania -- that has a sustained wave energy resource of about five times larger than Australia's present day electricity consumption," said Dr. Mark Hemer from The Centre for Australian Weather and Climate Research.
"We figured out that if we could harness just 10 per cent of the wave energy along a 1,000km strip of the southern coast, then that would be enough to meet the Australian Government's renewable energy targets of 20 per cent renewable energy before 2020."
Of course, this won't happen overnight: the predicted timeline is somewhere in the neighborhood of a decade, and Australia's recent wave energy projects have had considerable troubles. But Dr. Hemer attributes these setbacks to the technology's infancy.
"Wave energy really is a baby at the moment -- there's currently only about four megawatts of wave energy generating capacity installed globally," he said.
"If you compare that to wind energy, there's about 200,000 megawatts of installed capacity, or 50,000 times more, so wave energy is a long way behind on the cost learning curve."
But the latest report provides a promising foundation for ongoing research, as efforts and further study can now be concentrated on specific areas.
After that comes the more arduous task of convincing investors about the relatively new technology and its durability.
Learn more about wave energy at eBoom's Emerging Energy Learning Page.
Any opinion contained in this article is solely that of the writers, and does not necessarily shape or reflect the editorial opinions of Energy Boom. Energy Boom content is for informational purposes only and is not intended to be advice regarding the investment merits of, or a recommendation regarding the purchase or sale of, any security identified on, or linked through, this site.
Source: http://www.energyboom.com/emerging/australias-wave-energy-hotspots-mapped-out-new-atlas
Analysis: Evidence for climate extremes, costs, gets more local
OSLO (Reuters) - Scientists are finding evidence that man-made climate change has raised the risks of individual weather events, such as floods or heatwaves, marking a big step towards pinpointing local costs and ways to adapt to freak conditions.
"We're seeing a great deal of progress in attributing a human fingerprint to the probability of particular events or series of events," said Christopher Field, co-chairman of a U.N. report due in 2014 about the impacts of climate change.
Experts have long blamed a build-up of greenhouse gas emissions for raising worldwide temperatures and causing desertification, floods, droughts, heatwaves, more powerful storms and rising sea levels.
But until recently they have said that naturally very hot, wet, cold, dry or windy weather might explain any single extreme event, like the current drought in the United States or a rare melt of ice in Greenland in July.
But for some extremes, that is now changing.
A study this month, for instance, showed that greenhouse gas emissions had raised the chances of the severe heatwave in Texas in 2011 and unusual heat in Britain in late 2011. Other studies of extremes are under way.
Growing evidence that the dice are loaded towards ever more severe local weather may make it easier for experts to explain global warming to the public, pin down costs and guide investments in everything from roads to flood defenses.
"One of the ironies of climate change is that we have more papers published on the costs of climate change in 2100 than we have published on the costs today. I think that is ridiculous," said Myles Allen, head of climate research at Oxford University's Environmental Change Institute.
"We can't (work out current costs) without being able to make the link to extreme weather," he said. "And once you've worked out how much it costs that raises the question of who is going to pay."
Industrialized nations agree they should take the lead in cutting emissions since they have burnt fossil fuels, which release greenhouse gases, since the Industrial Revolution. But they oppose the idea of liability for damage.
Almost 200 nations have agreed to work out a new deal by the end of 2015 to combat climate change, after repeated setbacks. China, the United States and India are now the top national emitters of greenhouse gases.
Field, Professor of Biology and Environmental Earth System Science at the University of Stanford, said that the goal was to carry out studies of extreme weather events almost immediately after they happen, helping expose the risks.
"Everybody who needs to make decisions about the future - things like building codes, infrastructure planning, insurance - can take advantage of the fact that the risks are changing but we have a lot of influence over what those risks are."
Another report last year indicated that floods 12 years ago in Britain, among the countries most easily studied because it has long records, were made more likely by warming. Climate shifts also reduced the risks of flooding in 2001.
Previously, the European heatwave of 2003 that killed perhaps 70,000 people was the only extreme where scientists had discerned a human fingerprint. In 2004, they said that global warming had at least doubled the risks of such unusual heat.
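Statements like "at least doubled the risks" are commonly expressed through the fraction of attributable risk (FAR), a standard quantity in event-attribution work: FAR = 1 - p_natural / p_actual. The probabilities below are illustrative, not figures from the 2004 study:

```python
# Fraction of attributable risk (FAR): the share of an event's probability
# attributable to human influence. Probabilities here are illustrative.

def fraction_attributable_risk(p_natural, p_actual):
    return 1.0 - p_natural / p_actual

p_nat = 0.05  # assumed chance of such a heatwave without human influence
p_act = 0.10  # assumed chance with human influence: the risk has doubled

print(fraction_attributable_risk(p_nat, p_act))  # -> 0.5
```

A doubling of risk thus means half of the event's probability is attributable to the warming influence.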
The new statistical reviews are difficult because they have to tease out the impact of greenhouse gases from natural variations, such as periodic El Nino warmings of the Pacific, sun-dimming volcanic dust or shifts in the sun's output.
So far, extreme heat is the easiest to link to global warming after a research initiative led by the U.S. National Oceanic and Atmospheric Administration and the British Meteorological Office.
"Heatwaves are easier to attribute than heavy rainfall, and drought is very difficult given evidence for large droughts in the past," said Gabriele Hegerl of the University of Edinburgh.
Scientists often liken climate change to loading dice to get more sixes, or a baseball player on steroids who hits more home runs. That analysis is now moving from the global to the local scale.
Field said climate science would always include doubt since weather is chaotic. It is not as certain as physics, where scientists could this month express 99.999 percent certainty they had detected the Higgs boson elementary particle.
"This new attribution science is showing the power of our understanding, but it also illustrates where the limits are," he said.
A report by Field's U.N. group last year showed more weather extremes that can be linked to greenhouse warming, such as a rising number of high-temperature extremes and a growing fraction of rainfall falling in downpours.
But scientists warn against going too far in blaming climate change for extreme events.
Unprecedented floods in Thailand last year, for instance, that caused $45 billion in damage according to a World Bank estimate, were caused by people hemming in rivers and raising water levels rather than by climate change, a study showed.
"We have to be a bit cautious about blaming it all on climate change," Peter Stott, head of climate monitoring and attribution at the Met Office's Hadley Centre, said of extremes in 2012.
Taken together, many extremes are a sign of overall change.
"If you look all over the world, we have a great disastrous drought in North America ... you have the same situation in the Mediterranean... If you look at all the extremes together you can say that these are indicators of global warming," said Friedrich-Wilhelm Gerstengabe, a professor at the Potsdam Institute for Climate Impact Research.
(Additional reporting by Sara Ledwith in London; Editing by Louise Ireland)
Source: http://www.reuters.com/article/2012/07/27/us-climate-extremes-idUSBRE86Q0PI20120727?feedType=RSS&feedName=environmentNews
18 April 2012
Regreening is being monitored by village committees in Niger
World leaders must promote effective land use methods to mitigate drought, says Luc Gnacadja of the UN Convention to Combat Desertification.
Severe droughts in Africa are a stark reminder of global unfairness. About 13 million people still struggle to have enough food in the Horn of Africa, and about the same number, most of them children, suffer from hunger in the Sahel region, which stretches across Africa below the Sahara.
Droughts now hit these parts of Sub-Saharan Africa more frequently than the usual ten-year cycle, and more severely. And people living there are the least responsible for this climatic change.
Last year, the response of the international community to the Horn of Africa crisis was catastrophically late and slow. Tens of thousands of people may have been rescued if we had not waited for the food crisis across East Africa to escalate into famine in Somalia.
To make things worse, humanitarian relief eventually leaves communities that depend on agriculture even more vulnerable to the next drought. Food aid disrupts local markets — farmers lose income and a major incentive to grow crops when local people can get food for free.
This article is part of our coverage of preparations for Rio+20 — the UN Conference on Sustainable Development — which takes place on 20-22 June 2012. For other articles, go to Science at Rio+20
Now, the latest reports from early warning systems predict a crisis in both the Horn of Africa and Sahel again this year, when they have not yet recovered from the 2010 and 2011 droughts. The question is, what are we going to do about it?
Protect and thrive
Farmers in the Maradi and Zinder regions of Niger know what to do. During the past 20 years they have protected trees on some five million hectares of farmland. Where they had no trees or only a few per hectare, they now have up to 120. These trees not only improve soil fertility but also provide about a million households with fodder, fruit and firewood.
A recent survey shows that the farmers who preserve trees are able to cope better with drought than other farmers in the same area. Some of them even produced a modest cereal surplus in 2011.
This is just one example of highly successful sustainable land management on a grassroots level. Another is Yacouba Sawadogo, a farmer in Burkina Faso who featured in the documentary film, The Man Who Stopped the Desert. He has combined tree and crop planting techniques to turn the barren land in his village into a 15 hectare cultivated forest within three decades.
We should not wait until the next food crisis emerges — we need to disseminate this experience and scale it up to national and regional levels.
Drought is predictable. The tools and knowledge farmers need to cope with it are already there. What is missing is the political will and awareness of existing, sustainable land-use practices.
First steps are local
The first step is to empower local communities and foster farmer-to-farmer communication.
In 2004, the International Fund for Agricultural Development helped to create the first village committee in the Aguie district of the Maradi region in Niger to monitor regreening activities. The initiative was recognised at the national level, prompting the establishment of similar committees in neighbouring villages.
Today, these committees regularly meet to share experience in land management and protect trees from theft.
Then in 2008, several farmers from Senegal visited re-greened areas in Niger. On their return, they used what they learned to protect young trees on their farmland on about 40,000 hectares. Local authorities should encourage such experience sharing.
These successes should be communicated by local, national and international media — especially radio, which is the most accessible medium for farmers across Africa.
National and international strategy
But strategies that work on the level of individual farmers are not enough. We need to make sure that each drought-prone country has a national drought policy, based on the principles of early warning, preparedness, risk management and response. This effort is led by the UN Convention to Combat Desertification and the World Meteorological Organization.
Such policies would embrace insurance schemes, for example, allowing farmers and herders affected by drought to receive state subsidies.
Most importantly, we should empower smallholder farmers to become 'champions' in the race against the disastrous effects of climate change. In most African countries, the land that local people have been cultivating for generations is legally owned by the government. But farmers will preserve their trees if they have clearly defined rights to them. So governments in Africa need to recognise these rights in forestry and agriculture laws.
We know from numerous studies that building long-term resilience is much more cost-effective than ad-hoc crisis response. Still, it seems easier for donors to justify spending money to feed a starving child rather than supporting his or her father to grow enough crops.
With the global community discussing the green economy and sustainable development in the UN Conference on Sustainable Development (Rio+20), it is unthinkable that we would continue standing by and allow tens of thousands of people to die of hunger.
We should act at local, national and global levels to give farmers the lead, and promote sustainable land use and re-greening initiatives.
The Rio+20 summit should make this a top priority. It is essential to averting the next food crisis in Africa, and meeting the global challenge of feeding nine billion people by 2050.
Luc Gnacadja is executive secretary of the UN Convention to Combat Desertification (UNCCD), which promotes a global response to desertification, land degradation and drought.
Al Jazeera: Aid agency warns of West Africa food crisis (9 March 2012)
African Re-greening Initiatives: Food security and water in Africa's drylands (2012)
1080 Films: The Man Who Stopped the Desert (2010)
Source: http://www.scidev.net/es/science-communication/farming-practices/opinions/uso-sostenible-del-suelo-debe-ser-prioridad-para-r-o-20.html
(Phys.org)—Research indicates the out-of-Africa spread of humans was dictated by the appearance of favourable climatic windows.
By integrating genetics with high resolution historical climate reconstructions, scientists have been able to predict the timing and routes taken by modern humans during their expansion out of Africa. Their research reveals that the spread of humans out of Africa was dictated by climate, with their entry into Europe possibly delayed by competition with Neanderthals. The research is published today, 17 September, in the journal PNAS.
Dr Anders Eriksson, from the University of Cambridge, the lead author of the paper said: "By combining extensive genetic information with climate and vegetation models, we were able to build the most detailed reconstruction of human history so far."
The role of climate change in determining the timing of the expansion of human populations has been long debated. The oldest fossil remains of anatomically modern humans are found in Africa and date back to around 200 thousand years ago, but there is no trace outside Africa until 100 thousand years later.
The newly published model provides the first direct link between climate change and the timing of the expansion out of Africa, as well as the routes taken.
To investigate the role of climate, the Cambridge scientists built a highly detailed model tracking the fate of all individuals on the planet. The project involved specialists from a variety of fields. Working together with climatologists and vegetation modellers, they reconstructed climate and sea level changes and their effect on food availability through time, with a resolution of 100km. After exploring several million demographic scenarios (e.g. birth rates, local movement rates, link between food availability and population sizes), they were able to identify the scenarios that were most compatible with the geographic patterns of genetic diversity in modern humans. Working with anthropologists and archaeologists, they were then able to compare these scenarios against the dates and localities of known archaeological and fossil finds.
The demographic scenarios chosen by the model revealed that the link between food availability and population density in the past was very similar to the link found in present-day hunter-gatherers. Based on this link, the model found that climate prevented humans from exiting Africa until a favourable window appeared in North-East Africa approximately 70,000-55,000 years ago. Most movement occurred through the so-called Southern Route, exiting Africa via the Bab-el-Mandeb strait into the Arabian Peninsula.
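The climate-window idea can be caricatured in a few lines: given a reconstructed time series of food availability along the exit corridor, an exit becomes possible once availability first crosses the minimum needed to sustain a migrating population. This is only a toy sketch under our own assumptions; all names and numbers are illustrative and are not the paper's actual model:

```python
def first_exit_window(corridor_food, food_threshold):
    """Return the first time (in thousands of years ago, scanning from the
    past toward the present) at which reconstructed food availability in
    the exit corridor reaches the threshold, or None if it never does."""
    for kya, food_index in corridor_food:
        if food_index >= food_threshold:
            return kya
    return None

# Illustrative reconstruction: (time in kya, food availability index 0-1).
corridor = [(100, 0.2), (90, 0.3), (80, 0.4), (70, 0.8), (60, 0.9)]
print(first_exit_window(corridor, 0.7))  # opens at ~70 kya in this toy series
```

The real study inverted this logic at scale: it simulated millions of demographic scenarios on a 100 km grid and kept those whose predicted expansion best matched present-day genetic diversity.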
The dating of the out-of-Africa exit, as well as the arrival times for other continents identified by the model, was also found to largely agree with archaeological and fossil evidence, with the notable exception of Europe. For Europe, the model based on climate predicted arrival times approximately 10 thousand years earlier than the available archaeological evidence. This discrepancy could be explained by competition with Neanderthals, which was not accounted for in the model and would likely have slowed down the colonization of Europe by modern humans.
Dr Manica, who co-led the study, said: "The idea that we can reconstruct climate, and estimate food availability and finally figure out the demographic changes and movements of our ancestors all over the world is simply amazing. The fact that most of our results are in agreement with archaeological and anthropological evidence – which was not used to generate our model – points to the fact that our reconstructions based on genetics are quite realistic."
Explore further: New research raises doubts about whether modern humans and Neanderthals interbred
More information: "Late Pleistocene climate change and the global expansion of anatomically modern humans," by Anders Eriksson et al. PNAS, 2012.
Source: http://phys.org/news/2012-09-scientists-genetics-climate-reconstructions-track.html
Jellyfish threaten to 'dominate' oceans
Giant jellyfish are taking over parts of the world's oceans due to overfishing and other human activities, say researchers.
"We need to take management action to avert the marine systems of the world flipping over to being jellyfish dominated," says Richardson, a marine biologist at the University of Queensland.
Richardson says jellyfish numbers are increasing, particularly in Southeast Asia, the Black Sea, the Gulf of Mexico and the North Sea.
He says the Japanese have a real problem with giant jellyfish that burst through fishing nets.
"[They're] a jellyfish called Nomura, which is the biggest jellyfish in the world. It can weigh 200 kilograms, as big as a sumo wrestler, and is 2 metres in diameter," says Richardson.
Overfishing and eutrophication
Richardson and colleagues reviewed literature linking jellyfish blooms with overfishing and eutrophication - high levels of nutrients.
Jellyfish are normally kept in check by fish, which eat small jellyfish and compete for jellyfish food such as zooplankton, he says.
But, with overfishing, jellyfish numbers are increasing. Jellyfish feed on fish eggs and larvae, further impacting on fish numbers.
To add insult to injury, nitrogen and phosphorous in run-off cause red phytoplankton blooms, which create low-oxygen dead zones where jellyfish survive, but fish can't.
"You can think of them like a protected area for jellyfish," says Richardson.
Richardson and colleagues say climate change may also encourage more jellyfish.
They have postulated for the first time that these conditions can lead to what they call a "jellyfish stable state", in which jellyfish rule the oceans.
Richardson and colleagues recommend a number of actions in their paper, to coincide with World Oceans Day.
They say it's important to reduce overfishing, especially of small pelagic fish, like sardines, and to reduce run-off.
They also say it's important to control the transport of jellyfish around the world in ballast water and aquariums.
Richardson says researchers are experimenting with different ways of controlling jellyfish.
Some methods involve sound waves to explode jellyfish, while others use special nets to try and cut them up.
Jellyfish are considered simple jelly-like sea animals, which are related to the microscopic animals that form coral.
They generally start their life as a plant-like polyp on the sea bed before budding off into the well-known bell-shaped medusa.
Jellyfish have tentacles containing nematocyst cells, which act like little harpoons that lodge in prey to sting and kill them.
The location and number of nematocysts dictate whether jellyfish are processed for human consumption.
While dried jellyfish with soya sauce is a delicacy served in Chinese weddings and banquets, not all kinds of jellyfish can be eaten, says Richardson.
According to Richardson, the species increasing in number aren't generally eaten.
Source: http://www.abc.net.au/science/articles/2009/06/08/2592139.htm
Climate change to take heavy toll on Bangladesh: U.N.
By Ruma Paul
DHAKA (Reuters) - Disaster-prone Bangladesh is among the countries most vulnerable to climate change, which could worsen water scarcity and force mass displacement, the United Nations said on Tuesday.
The U.N. Development Programme in its latest report warned that climate change will hit the world's poorest countries by breaking down agricultural systems, worsening water scarcity, increasing risks of diseases and triggering mass displacement due to recurring floods and storms.
The report said more than 70 million Bangladeshis, 22 million Vietnamese, and 6 million Egyptians could be affected by global warming-related flooding.
"The near-term vulnerabilities are not concentrated in lower Manhattan and London, but in flood-prone areas of Bangladesh and drought-prone parts of sub-Saharan Africa," said Kevin Watkins, the lead author of the Human Development Report.
Dhaka has proposed setting up an International Centre for Adaptation to study countries most at risk from climate change, C.S. Karim, a government adviser, said.
British High Commissioner Anwar Chowdury said on Wednesday his government welcomed the proposal, and plans to organize a conference in Dhaka early next year on climate change.
Bangladesh has suffered a double blow in the last few months, first from devastating floods in July and then two weeks ago when the worst cyclone since 1991 killed some 3,500 people and displaced millions.
"Bangladesh faces several vulnerabilities from climate change during this century," K.B. Sajjadur Rasheed, a Bangladeshi environment specialist, told Reuters.
"The sea-level rise of even by 40 cm (16 inches) in the Bay of Bengal would submerge 11 percent of the country's land area in the coastal zone, displacing 7 to 10 million people."
Secondly, the frequency, extent, depth and duration of floods could increase because of more monsoon rains triggered by climate change, he said.
That would cause a significant decrease in crop production and undermine food security.
This century should also see the flow of water decreasing in the Ganges, one of the major river systems in riverine Bangladesh, due to glacial retreat from global warming, he said.
It would force millions to seek shelter further inland in the densely populated country of more than 140 million people.
"The implication is that, while Bangladesh could be subjected to increased flooding in the next two to four decades, the country could face drought-like conditions from low flows in the rivers during the latter half of the century," Rasheed said.
(Additional reporting by Masud Karim; Editing by Sanjeev Miglani)
Source: http://www.enn.com/climate/article/26032
Upland Bird Regional Forecast
When considering upland game population levels during the fall hunting season, two important factors impact population change. First is the number of adult birds that survived the previous fall and winter and are considered viable breeders in the spring. The second is the reproductive success of this breeding population. Reproductive success consists of nest success (the number of nests that successfully hatched) and chick survival (the number of chicks recruited into the fall population). For pheasant and quail, annual population turnover is relatively high; therefore, the fall population is more dependent on reproductive success than breeding population levels. For grouse (prairie chickens), annual population turnover is not as rapid although reproductive success is still the major population regulator and important for good hunting. In the following forecast, breeding population and reproductive success of pheasants, quail, and prairie chickens will be discussed. Breeding population data were gathered during spring breeding surveys for pheasants (crow counts), quail (whistle counts), and prairie chickens (lek counts). Data for reproductive success were collected during late summer roadside surveys for pheasants and quail. Reproductive success of prairie chickens cannot be easily assessed using the same methods because they generally do not associate with roads like the other game birds.
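The relationship described above — a fall population made up of surviving breeders plus that year's recruits from nest success and chick survival — can be written as a simple index. The function and sample values below are an illustrative sketch, not the agency's actual survey arithmetic:

```python
def fall_population_index(breeding_hens, nest_success, avg_brood_size,
                          chick_survival, adult_survival=1.0):
    """Fall index = surviving adults + recruits, where recruits =
    hens * fraction of nests hatching * brood size * chick survival."""
    recruits = breeding_hens * nest_success * avg_brood_size * chick_survival
    return breeding_hens * adult_survival + recruits

# For species with high annual turnover (pheasant, quail) the recruit term
# dominates, so the index tracks reproductive success more than breeder
# numbers (illustrative inputs).
good_year = fall_population_index(100, 0.5, 10, 0.5)    # 100 + 250 = 350
bad_year = fall_population_index(100, 0.5, 10, 0.25)    # 100 + 125 = 225
```

Halving chick survival here cuts the recruit term in half, which is why a hot, insect-poor brood-rearing season depresses the fall population even when the spring breeding population was adequate.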
Kansas experienced extreme drought this past year. Winter weather was mild, but winter precipitation is important for spring vegetation, which can impact reproductive success, and most of Kansas did not get enough winter precipitation. Pheasant breeding populations showed significant reductions in 2012, especially in primary pheasant range in western Kansas. Spring came early and hot this year, but also included fair spring moisture until early May, when the precipitation stopped, and Kansas experienced record heat and drought through the rest of the reproductive season. Early nesting conditions were generally good for prairie chickens and pheasants. However, the primary nesting habitat for pheasants in western Kansas is winter wheat, and in 2012, Kansas had one of the earliest wheat harvests on record. Wheat harvest can destroy nests and very young broods. The early harvest likely lowered pheasant nest and early brood success. The intense heat and lack of rain in June and July resulted in a decrease in brooding cover and insect populations, causing lower chick survival for all upland game birds.
Because of drought, all counties in Kansas were opened to Conservation Reserve Program (CRP) emergency haying or grazing. CRP emergency haying requires fields that are hayed to leave at least 50 percent of the field in standing grass cover. CRP emergency grazing requires 25 percent of the field (or contiguous fields) to be left ungrazed or grazing at 75-percent normal stocking rates across the entire field. Many CRP fields, including Walk In Hunting Areas (WIHA), may be affected across the state. WIHA property is privately-owned land open to the public for hunting access. Kansas has more than one million acres of WIHA. Often, older stands of CRP grass are in need of disturbance, and haying and grazing can improve habitat for the upcoming breeding season, and may ultimately be beneficial if weather is favorable.
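The emergency haying and grazing rules quoted above reduce to two simple checks. A sketch (function names are ours; the thresholds are taken from the paragraph):

```python
def haying_compliant(fraction_left_standing):
    """Emergency haying: at least 50 percent of the field must remain
    in standing grass cover."""
    return fraction_left_standing >= 0.50

def grazing_compliant(fraction_ungrazed, stocking_rate_fraction):
    """Emergency grazing: either 25 percent of the field (or contiguous
    fields) is left ungrazed, or the whole field is grazed at no more
    than 75 percent of the normal stocking rate."""
    return fraction_ungrazed >= 0.25 or stocking_rate_fraction <= 0.75

print(haying_compliant(0.50))          # compliant: half the field standing
print(grazing_compliant(0.00, 0.75))   # compliant: reduced stocking throughout
print(grazing_compliant(0.10, 0.90))   # not compliant under either option
```

Either residual-cover option leaves part of the stand undisturbed, which is why hayed or grazed CRP and WIHA fields can still hold birds this fall.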
Due to continued drought, Kansas will likely experience a below-average upland game season this fall. For those willing to hunt hard, there will still be pockets of decent bird numbers, especially in the northern Flint Hills and northcentral and northwestern parts of the state. Kansas has approximately 1.5 million acres open to public hunting (wildlife areas and WIHA combined). The regular opening date for the pheasant and quail seasons will be Nov. 10 for the entire state. The previous weekend will be designated for the special youth pheasant and quail season. Youth participating in the special season must be 16 years old or younger and accompanied by a non-hunting adult who is 18 or older. All public wildlife areas and WIHA tracts will be open for public access during the special youth season. Please consider taking a young person hunting this fall, so they might have the opportunity to develop a passion for the outdoors that we all enjoy.
PHEASANT – Drought in 2011 and 2012 has taken its toll on pheasant populations in Kansas. Pheasant breeding populations dropped by nearly 50 percent or more across pheasant range from 2011 to 2012 resulting in fewer adult hens in the population to start the 2012 nesting season. The lack of precipitation has resulted in less cover and insects needed for good pheasant reproduction. Additionally, winter wheat serves as a major nesting habitat for pheasants in western Kansas, and a record early wheat harvest this summer likely destroyed many nests and young broods. Then the hot, dry weather set in from May to August, the primary brood-rearing period for pheasants. Pheasant chicks need good grass and weed cover and robust insect populations to survive. Insufficient precipitation and lack of habitat and insects throughout the state’s primary pheasant range resulted in limited production. This will reduce hunting prospects compared to recent years. However, some good opportunities still exist to harvest roosters in the sunflower state, especially for those willing to work for their birds. Though the drought has taken its toll, Kansas still contains a pheasant population that will produce a harvest in the top three or four major pheasant states this year.
The best areas this year will likely be pockets of northwest and northcentral Kansas. Populations in southwest Kansas were hit hardest by the 2011-2012 drought (72 percent decline in breeding population), and a very limited amount of production occurred this season due to continued drought and limited breeding populations.
QUAIL – The bobwhite breeding population in 2012 was generally stable or improved compared to 2011. Areas in the northern Flint Hills and parts of northeast Kansas showed much improved productivity this year. Much of eastern Kansas has seen consistent declines in quail populations in recent decades. After many years of depressed populations, this year’s rebound in quail reproduction in eastern Kansas is welcomed, but overall populations are still below historic averages. The best quail hunting will be found throughout the northern Flint Hills and parts of central Kansas. Prolonged drought undoubtedly impacted production in central and western Kansas.
PRAIRIE CHICKEN – Kansas is home to greater and lesser prairie chickens. Both species require a landscape of predominately native grass. Lesser prairie chickens are found in westcentral and southwestern Kansas in native prairie and nearby stands of native grass within the conservation reserve program (CRP). Greater prairie chickens are found primarily in the tallgrass and mixed-grass prairies in the eastern one-third and northern one-half of the state.
The spring prairie chicken lek survey indicated that most populations remained stable or declined from last year. Declines were likely due to extreme drought throughout 2011. Areas of northcentral and northwest Kansas fared the best, while areas in southcentral and southwest Kansas experienced the sharpest declines where drought was most severe. Many areas in the Flint Hills were not burned this spring due to drought. This resulted in far more residual grass cover for much improved nesting conditions compared to recent years. There have been some reports of prairie chicken broods in these areas, and hunting will likely be somewhat improved compared to recent years.
Because of recent increases in prairie chicken (both species) populations in northwest Kansas, regulations have been revised this year. The early prairie chicken season (Sept. 15-Oct. 15) and two-bird bag limit has been extended into northwest Kansas. The northwest unit boundary has also been revised to include areas north of U.S. Highway 96 and west of U.S. Highway 281. Additionally, all prairie chicken hunters are now required to purchase a $2.50 prairie chicken permit. This permit will allow KDWPT to better track hunters and harvest, which will improve management activities. Both species of prairie chicken are of conservation concern and the lesser prairie chicken is a candidate species for federal listing under the Endangered Species Act.
This region has 11,809 acres of public land and 339,729 acres of WIHA open to hunters this fall.
Pheasant – Spring breeding populations declined almost 50 percent from 2011 to 2012, reducing fall population potential. Early nesting conditions were decent due to good winter wheat growth, but early wheat harvest and severe heat and drought through the summer reduced populations. While this resulted in a significant drop in pheasant numbers, the area will still have the highest densities of pheasants this fall compared to other areas in the state. Some counties — such as Graham, Rawlins, Decatur, and Sherman — showed the highest relative densities of pheasants during summer brood surveys. Much of the cover will be reduced compared to previous years due to drought and resulting emergency haying and grazing in CRP fields. Good hunting opportunities will also be reduced compared to recent years, and harvest will likely be below average.
Quail – Populations in this region have been increasing in recent years although the breeding population had a slight decline. This area is at the extreme northwestern edge of bobwhite range in Kansas, and densities are relatively low compared to central Kansas. Some counties — such as Graham, Rawlins, and Decatur — will provide hunting opportunities for quail.
Prairie Chicken – Prairie chicken populations have expanded in both numbers and range within the region over the past 20 years. The better hunting opportunities will be found in the central and southeastern portions of the region in native prairies and nearby CRP grasslands. Spring lek counts in that portion of the region were slightly depressed from last year and nesting conditions were only fair this year. Extreme drought likely impaired chick survival.
This region has 75,576 acres of public land and 311,182 acres of WIHA open to hunters this fall.
Pheasant – The Smoky Hills breeding population dropped about 40 percent from 2011 to 2012, reducing overall fall population potential. While nesting conditions were fair due to good winter wheat growth, the drought and early wheat harvest impacted the number of young recruited into the fall population. Certain areas had decent brood production, including portions of Mitchell, Rush, Rice, and Cloud counties. Across the region, hunting opportunities will likely be below average and definitely reduced from recent years. CRP was opened to emergency haying and grazing, reducing available cover.
Quail – Breeding populations increased nearly 60 percent from 2011 to 2012, increasing fall population potential. However, drought conditions were severe, likely impairing nesting and brood success. There are reports of fair quail numbers in certain areas throughout the region. Quail populations in northcentral Kansas are naturally spotty due to habitat characteristics. Some areas, such as Cloud County, showed good potential while other areas in the more western edges of the region did not fare as well.
Prairie Chicken – Greater prairie chickens occur throughout the Smoky Hills in large areas of native rangeland and some CRP. This region includes some of the highest densities and greatest hunting opportunities in the state for greater prairie chickens. Spring counts indicated that numbers were stable or slightly reduced from last year. Much of the rangeland cover is significantly reduced due to drought, which likely impaired production, resulting in reduced fall hunting opportunities.
This region has 60,559 acres of public land and 54,170 acres of WIHA open to hunters this fall.
Pheasant – Spring crow counts this year showed a significant increase in breeding populations of pheasants. While this increase is welcome, this region was nearing all-time lows in 2011. Pheasant densities across the region are still low, especially compared to other areas in western Kansas. Good hunting opportunities will exist in only a few pockets of good habitat.
Quail – Breeding populations stayed relatively the same as last year, and some quail were detected during the summer brood survey. The long-term trend for this region has been declining, largely due to unfavorable weather and degrading habitat. This year saw an increase in populations. Hunting opportunities for quail will be improved this fall compared to recent years in this region. The best areas will likely be in Marshall and Jefferson counties.
Prairie Chickens – Very little prairie chicken range occurs in this region, and opportunities are limited. The best areas are in the western edges of the region, in large areas of native rangeland.
This region has 80,759 acres of public land and 28,047 acres of WIHA open to hunters this fall.
Pheasant – This region is outside the primary pheasant range and has very limited hunting. A few birds can be found in the northwestern portion of the region.
Quail – Breeding populations were relatively stable from 2011 to 2012 for this region, although long-term trends have been declining. In the last couple of years, quail populations throughout much of the region have been on the increase. Specific counties that showed relatively higher numbers are Coffey, Osage, and Wilson. However, populations remain far below historic levels across the bulk of the region due to extreme habitat degradation.
Prairie Chicken – Greater prairie chickens occur in the central and northwest parts of this region in large areas of native rangeland. Breeding population densities were up nearly 40 percent from last year, and opportunities may increase accordingly. However, populations have been in consistent decline over the long term. Infrequent burning has resulted in woody encroachment of native grasslands in the area, gradually reducing the amount of suitable habitat.
This region has 128,371 acres of public land and 63,069 acres of WIHA open to hunters this fall.
Pheasant – This region is on the eastern edge of pheasant range in Kansas and well outside the primary range. Pheasant densities have always been relatively low throughout the Flint Hills. Spring breeding populations were down nearly 50 percent, and reproduction was limited this summer. The best pheasant hunting will be in the northwestern edge of this region in Marion and Dickinson counties.
Quail – This region contains some of the highest densities of bobwhite in Kansas. The breeding population in this region increased 25 percent compared to 2011, and the long-term trend (since 1998) has been stable due to steadily increasing populations over the last four or five years. High reproductive success was reported in the northern half of this region, and some of the best opportunities for quail hunting will be found in the northern Flint Hills this year. In the south, Cowley County showed good numbers of quail this summer.
Prairie Chickens – The Flint Hills is the largest intact tallgrass prairie left in North America. It has served as a core habitat for greater prairie chickens for many years. Since the early 1980s, intensive annual range burning has consistently reduced nest success in the area, and prairie chicken numbers have been declining as a result. Because of the drought this spring, many areas that are normally burned annually were left unburned this year. This left more residual grass cover for nesting and brood rearing. There are some good reports of prairie chicken broods, and hunting opportunities will likely increase throughout the region this year.
This region has 19,534 acres of public land and 73,341 acres of WIHA open to hunters this fall.
Pheasant – The breeding population declined about 40 percent from 2011 to 2012. Two years of prolonged drought and very poor vegetation conditions resulted in poor reproductive success this year. All summer indices showed a depressed pheasant population in this region, especially compared to other regions. Some of the relatively better counties in this area will be Reno, Pawnee, and Pratt, although these counties have not been immune to recent declines. There will likely be few good hunting opportunities this fall.
Quail – The breeding population dropped over 30 percent this year from 2011, although long-term trends (since 1998) have been stable in this region. This region generally has some of the highest quail densities in Kansas, but prolonged drought and reduced vegetation have caused significant declines in recent years. Counties such as Reno, Pratt, and Stafford will likely have the best opportunities in the region. While populations may be down compared to recent years, this region will continue to provide fair hunting opportunities for quail.
Prairie Chicken – This region is almost entirely occupied by lesser prairie chickens. The breeding population declined nearly 50 percent from 2011 to 2012. Reproductive conditions were not good for the region due to extreme drought and heat for the last two years, and production was limited. The best hunting opportunities will likely be in the sand prairies south of the Arkansas River.
This region has 2,904 acres of public land and 186,943 acres of WIHA open to hunters this fall.
Pheasant – The breeding population plummeted more than 70 percent in this region from 2011 to 2012. Last year was one of the worst on record for pheasant reproduction. However, last fall there were some carry-over roosters (second-year birds) from a record-high season in 2010. Those carry-over birds are mostly gone now, which will hurt hunting opportunities this fall. Although reproduction was slightly improved from 2011, chick recruitment was still fair to below average this summer due to continued extreme drought conditions. Moreover, there were not enough adult hens in the population yet to make a significant rebound. Generally, hunting opportunity will remain well below average in this region. Haskell and Seward counties showed some improved reproductive success, especially compared to other counties in the region.
Quail – The breeding population in this region tends to be highly variable depending on available moisture and resulting vegetation. The region experienced an increase in breeding populations from 2011 to 2012, although 2011 was a record low for the region. While drought likely held back production, the weather was better than last year, and some reproduction occurred. Indices are still well below average for the region. There will be some quail hunting opportunities in the region although good areas will be sparse.
Prairie Chicken – While breeding populations in the eastern parts of this region were generally stable or increasing, areas of extreme western and southwest portions (Cimarron National Grasslands) saw nearly 30-percent declines last year and 65 percent declines this year. Drought remained extreme in this region, and reproductive success was likely very low. Hunting opportunities in this region will be extremely limited this fall.
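The year-over-year swings quoted throughout this forecast are simple percent changes in spring breeding-survey indices (such as the crow counts mentioned above). A minimal sketch of that arithmetic, using hypothetical counts rather than actual KDWPT survey data:

```python
def percent_change(previous, current):
    """Percent change in a breeding-survey index between two years."""
    return (current - previous) / previous * 100.0

# Hypothetical spring survey counts for a region (not actual KDWPT data).
count_2011 = 40
count_2012 = 22

change = percent_change(count_2011, count_2012)
print(f"Breeding index changed {change:+.0f}% from 2011 to 2012")  # → -45%
```

A drop from 40 to 22 birds counted is the kind of "declined nearly 50 percent" figure reported for several regions above.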
Posted: Feb 18, 2013 10:06 AM by Suzanne Philippus - MTN News
Updated: Feb 18, 2013 10:07 AM
BOZEMAN - Several Montana State University scientists recently returned from the summer research season in Antarctica.
MTN reporter Suzanne Philippus has a rare glimpse of what takes place on the ice at the bottom of the world.
A group left Bozeman right before Thanksgiving and was in Antarctica until the middle of February.
After 37 hours in the air, traveling over 10,000 miles, the ski-equipped military cargo plane landed on the southern-most continent in the world.
"In the broadest sense, Antarctica is the place to come to study global climate change because it's the end member environment, because it's the coldest place," explained Reed Scherer a scientist on the Wissard Project.
Scientists from across the world descend upon this vast continent during the Antarctic summer, for a chance to work in one of the most unique and minimally disturbed laboratories on earth.
From ancient geologic microbes to climate change, scientists are conducting more than 120 research projects; none of which can be conducted anywhere else on earth.
"The science we think is really exciting and so ... we're ready to go and get into action here," said Ross Powell, who's working on the Wissard Project.
During the Austral summer, which ranges from October to February, scientists take advantage of 24 hours of sunlight in a race against a mandatory, late-February departure off the ice.
Data reported by the weather station: 39160
Latitude: 55.18 | Longitude: -6.16 | Altitude: 156
To calculate annual averages, we analyzed data from 357 days (97.54% of the year).
If an average or annual total is missing data for 10 or more days, it is not displayed.
A total rainfall value of 0 (zero) may indicate that no measurement was taken and/or that the weather station does not report this data.
|Annual average temperature:||8.3°C||357|
|Annual average maximum temperature:||10.8°C||357|
|Annual average minimum temperature:||5.9°C||357|
|Annual average humidity:||-||-|
|Annual total precipitation:||1716.87 mm||357|
|Annual average visibility:||-||-|
|Annual average wind speed:||18.5 km/h||357|
Number of days with extraordinary phenomena.
|Total days with rain:||0|
|Total days with snow:||0|
|Total days with thunderstorm:||0|
|Total days with fog:||0|
|Total days with tornado or funnel cloud:||0|
|Total days with hail:||0|
Days of extreme historical values in 1988
The highest temperature recorded was 19.3°C on May 15.
The lowest temperature recorded was -2°C on March 17.
The maximum wind speed recorded was 111.1 km/h on February 9.
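The suppression rule described above (an average or annual total is withheld when 10 or more days of data are missing) can be sketched as follows; the daily values here are placeholders, not the station's actual records:

```python
def annual_average(daily_values, year_days=366, max_missing=9):
    """Average of daily observations for 1988 (a 366-day leap year);
    returns None when 10 or more days are missing, mirroring the
    suppression rule described for this station's annual summary."""
    missing = year_days - len(daily_values)
    if missing > max_missing:
        return None
    return sum(daily_values) / len(daily_values)

# 357 of 366 days reported (9 missing): the average is displayed.
print(annual_average([8.0] * 357))   # 8.0

# 356 of 366 days reported (10 missing): the value is suppressed.
print(annual_average([8.0] * 356))   # None
```

With 357 days reported, 1988 falls just inside the threshold, which is why the averages above are shown at all.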
How Global Warming Will Make Hurricanes Like Irene Worse
Stay up to date with the latest headlines via email.
Climate science suggests that global warming will make hurricanes like Irene more destructive in three ways (all things being equal):
On the third point, warming also extends the range of warm SSTs, which can help sustain the strength of a hurricane as it steers on a northerly track. As meteorologist and former hurricane hunter Dr. Jeff Masters has explained:
… this year sea surface temperatures 1 – 3°F warmer than average extend along the East Coast from North Carolina to New York. Waters of at least 26°C extend all the way to Southern New Jersey, which will make it easier for Irene to maintain its strength much farther to the north than a hurricane usually can. During the month of July, ocean temperature off the mid-Atlantic coast (35°N – 40°N, 75°W – 70°W) averaged 2.6°F (1.45°C) above average, the second highest July ocean temperatures since record keeping began over a century ago (the record was 3.8°F above average, set in 2010.) These warm ocean temperatures will also make Irene a much wetter hurricane than is typical, since much more water vapor can evaporate into the air from record-warm ocean surfaces.
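The quote above mixes Fahrenheit and Celsius anomalies. Converting a temperature *difference* between the scales uses only the 5/9 factor (the 32° offset applies to absolute temperatures, not differences). A quick sketch checking the quoted figures:

```python
def anomaly_f_to_c(delta_f):
    """Convert a temperature difference (anomaly) from °F to °C.
    No 32° offset: that applies only to absolute temperatures."""
    return delta_f * 5.0 / 9.0

print(round(anomaly_f_to_c(2.6), 2))   # 1.44 °C for the 2.6°F July anomaly
print(round(anomaly_f_to_c(3.8), 2))   # 2.11 °C for the 3.8°F record set in 2010
```

The 1.44°C result is a hair under the quoted 1.45°C, presumably because the published figures were rounded independently.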
Also, hurricanes tend to be self-limiting, in that they churn up deeper (usually cooler) water, which can stop them from gaining strength and also weaken them. Since global warming also warms the deeper ocean, it further helps hurricanes stay stronger longer.
One says, “all things being equal,” because, among other things, it is possible that global warming will increase wind shear, which can disrupt hurricanes.
The media prefer to ask the wrong question — as Politico did Friday with its piece, “Was Hurricane Irene caused by global warming?” But they do have a good quote from perhaps the leading expert on the subject:
“I think the evidence is fairly compelling that we’re seeing a climate change signal in the Atlantic, ” said Kerry Emanuel, a professor of atmospheric science at the Massachusetts Institute of Technology. Citing other recent trends of extreme weather, including hailstorms and catastrophic tornadoes, “one begins to wonder, if you add all those up, maybe you are seeing a global warming effect.”
Still, he adds, “I would be reluctant myself to say anything about global warming and Irene” — but again, I think that is a function of asking the wrong question. That’s a point Climate Central makes in its post on this subject, “Irene’s Potential for Destruction Made Worse by Global Warming, Sea Level Rise”:
At the moment, the immediate question for anyone in the path of the storm is — or should be — “how can I keep myself and my loved ones safe?” But another question may be lingering in the background. It’s the same question that came up in April, when a series of killer tornadoes tore up the South, and in May, when floods ravaged the entire Mississippi River basin, and in July, when killer heat waves seared the Midwest and Northeast, and in August, when Texas officially completed its worst one-year drought on record — a drought that isn’t over by a long shot.
On September 19, the Cary Institute hosted a one-day conference on the impacts of tropical storms Irene and Lee on the Hudson River. Organized by the Hudson River Environmental Society, with leadership from Cary's Stuart Findlay, the forum examined how the river and estuary responded to the storms, which dropped an estimated 12-18 inches of rainfall throughout the Hudson Valley and Catskill regions. Topics included dredging, sediment transport, water quality, impacts to fish, and future management practices.
In late October, Gary Lovett will present his assessment of the health of the Catskill Forest at the second Catskill Environmental Research & Monitoring Conference (CERM). The forum brings together research on the region, to better understand the effects of extreme weather, air pollution, invasive species, biodiversity loss, and habitat fragmentation. The Catskills provide the majority of New York City's drinking water supply; CERM forums help coordinate research and identify research agendas to protect these resources.
In November, Cary Institute will hold a two-day conference examining the effects of climate change on plant, animal, and microbial species. The invitation-only event is being organized by Richard Ostfeld, Shannon LaDeau, and Amy Angert (University of British Columbia). With more than 50 invited experts, the conference's goal is to identify tools that will help lessen the negative effects of climate change on biodiversity, disease risk, extinction, and ecosystem function.
The Amazon a Key Ecosystem.
Amazonia – an example of a key ecosystem for climate regulation (based on a presentation by Antonio Nobre, 2007)
The climate of Amazonia is strongly dependent on the presence of the forest. Amazonia has been described as a “Green Ocean” with satellite imagery revealing very high cloud cover and rainfall over the region in comparison with the surrounding oceans. The forest is also a very large carbon store.
Amazon basin map
In comparison to unforested land, forest cover can enhance evapotranspiration through the extraction of moisture deep in the soil by plant roots. The canopy can capture a greater fraction of rainfall which is then re-evaporated back to the atmosphere, compared to bare soil which holds less water on the surface before runoff and infiltration. Furthermore, the higher aerodynamic roughness of a forested land surface can promote the flux of moisture to the atmosphere through enhanced turbulence.
Biogenic volatile organic compounds (VOCs) are emitted by many different plant species, and may act as cloud condensation nuclei, potentially enhancing cloud cover. VOCs can also affect concentrations of ground-level ozone, an important greenhouse gas, leading to ozone destruction when NOx levels are low, but net ozone production when NOx levels are higher (Sanderson et al 2003). Aerosols arising as a result of biomass burning may change rainfall regimes and maintain a dry fire-prone land surface.
Amazon rainforest mist
Deforestation in the Amazon region accounts for 5-10% of global CO2 emissions. Global climate change may also lead to changes in Amazonian vegetation cover, especially if there is a significant reduction in rainfall in the region. The relationship between the warming of global average temperature and changes in regional rainfall patterns is highly uncertain, but a number of climate models suggest that global warming could lead to particular patterns of warming in North Atlantic and tropical east Pacific sea surface temperatures (SSTs), which change the atmospheric circulation and reduce rainfall across large parts of Amazonia.
Strong drying of Amazonia or North East South America is simulated by variants of the Hadley Centre climate model (Cox et al 2004) and feedbacks between the forest loss and regional and global climate contribute to the strength of this drying (Betts et al 2004).
Deforestation or degradation of the forest as a result of habitat fragmentation or climate change may therefore significantly alter the climate of the Amazon region and also contribute to global climate change.
Reducing emissions of data centre activity
Microsoft Ireland Research has been working on electricity grid research that allows the exact energy consumption and emissions to be measured for any piece of computation performed in a data centre. This allows past or predicted emissions to be calculated for any computation performed in the cloud and opens the doorway to measuring and reducing the emissions produced by data centres around the world.
In a paper presented at the 11th international conference on Electrical Power Quality And Utilisation, Conor Kelly of Microsoft Ireland and Antonio Ruzzelli of University College Dublin outlined how data centre emissions can be calculated, and presented sample measurements showing that performing computation when ample wind energy was available on the grid emitted less than 1% of the emissions of the same piece of computation performed at another time when the bulk of the electricity consumed was produced by fossil fuel power plants.
As carbon taxes, emissions trading and environmental imperatives become an increasing influence around the world, and the EU looks to slash carbon emissions by 30 percent from 1990 levels by 2020, measuring and controlling the emissions resulting from the energy consumed in data centres will become as important as controlling the energy consumption itself. The research puts in place the calculations required to understand these emissions, which can in turn be used to inform their reduction. This opens up another path to reducing the growing emissions produced by data centres worldwide.
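The paper's method is not reproduced here, but its core idea — that emissions attributable to a job equal the energy it consumed times the grid's carbon intensity while it ran — can be sketched with placeholder intensity figures (not values from the paper):

```python
def job_emissions_g(energy_kwh, grid_intensity_g_per_kwh):
    """Grams of CO2 attributable to a computation, given the grid's
    average carbon intensity while it ran."""
    return energy_kwh * grid_intensity_g_per_kwh

# Hypothetical intensities: a wind-dominated hour vs. a fossil-heavy hour.
wind_hour = job_emissions_g(2.0, 4.0)      # same 2 kWh job, clean grid
fossil_hour = job_emissions_g(2.0, 500.0)  # same job, fossil-heavy grid

print(f"Wind-hour emissions are {wind_hour / fossil_hour:.1%} of fossil-hour emissions")
# → 0.8%, consistent with the "less than 1%" result reported above
```

The point of the sketch is that the job itself is identical; only the timing, and hence the grid mix behind the electricity, changes the emissions figure.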
Putting a CO2 Figure on a Piece of Computation has been published by the IEEE and can be viewed here.
Most natural disasters today are linked to climate change, says John Holmes, UN emergency relief coordinator and head of the UN Office for the Coordination of Humanitarian Affairs, OCHA. OCHA today launched a campaign to raise awareness of the humanitarian implications of climate change, calling for improved disaster preparedness and response measures in countries that suffer most from extreme weather events. "This campaign highlights our huge concerns about the humanitarian impact of climate change," said Holmes. "Any credible vision of the future must recognize that humanitarian needs are increasing and that climate change is the main driver. We are already seeing its effects, in terms of the numbers of people affected and in the rising cost of response."
…From 1988 through 2007, over 75 percent of all disaster events were climate-related and accounted for 45 percent of deaths and 80 percent of the economic losses caused by natural hazards. The most vulnerable are impoverished people living in risk-prone hotspot countries, where the risks from extreme climatic events overlap with human vulnerability. In 2007, OCHA issued an unprecedented 15 funding appeals for sudden natural disasters, five more than the previous annual record - all but one due to climatic events. "So welcome to the 'new normal' of extreme weather. Climate change may well exacerbate chronic hunger and malnutrition across much of the developing world," wrote Holmes in the current issue of "The Economist" magazine. "And it will almost certainly precipitate battles over resources."
…In the last 20 years, the number of recorded disasters has doubled from about 200 to more than 400 per year. Disasters caused by floods are more frequent - up from about 50 in 1985 to more than 200 in 2005 - and floods damage larger areas than they did 20 years ago.
Cyclones Nancy and Olaf get together in 2005 and make a night of it. NASA
By Brian Palmer
Special to the Washington Post
— It's that time of year when even environmentalists committed to saving trees proudly display a massive tree carcass in the living room, bejeweled and topped with a star. American cities are rarely greener than during Christmastime, when every other street corner can seem to be occupied by a tree peddler.
Christmas trees play into a wider debate among environmentalists: Are tree farms better or worse at carbon sequestration than untouched forests?
The pro-tree-farm argument goes like this: When you plant a tree, it goes from seedling to full-grown plant by rapidly extracting carbon from the atmosphere, including carbon that humans have emitted by burning fossil fuels and raising cattle. (When a climatologist looks at a tree, he sees a leafy pillar of solidified greenhouse gases.) Once the tree reaches maturity, though, it slows its consumption of carbon. By way of comparison, think of the appetites of a growing teenager and a senior citizen. When you're done growing, you stop consuming as many calories. The best move, according to some tree-farm advocates, is to replace the mature tree with a new sapling and start the growth process over again.
Tree farmers have been making this claim for more than two decades, but many climate experts think it's bunk. The most obvious objection to the theory is: What becomes of the trees once they're cut? According to research out of Oregon in the 1990s, 58 percent of felled trees are used for paper, mulch, firewood or other short-term purposes. In those cases, the tree's sequestered carbon quickly reenters the atmosphere after decomposing or burning. The remaining 42 percent is used in ways that keep the wood intact more than five years, such as homebuilding and furniture production. Even in those cases, though, the carbon doesn't stay sequestered forever.
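The Oregon figures above imply a simple split of a felled tree's sequestered carbon between a quickly re-released pool and a longer-lived products pool. A back-of-envelope sketch, using a hypothetical 100 kg of stored carbon per tree (not a figure from the research):

```python
def carbon_fate(stored_kg, short_term_fraction=0.58):
    """Split a felled tree's sequestered carbon into the share re-released
    quickly (paper, mulch, firewood) vs. held in durable wood products,
    using the 58%/42% split from the Oregon research."""
    quick = stored_kg * short_term_fraction
    return quick, stored_kg - quick

# Hypothetical tree holding 100 kg of carbon (illustrative only).
quick, durable = carbon_fate(100.0)
print(f"{quick:.0f} kg re-enters the atmosphere soon; {durable:.0f} kg stays in products")
# → 58 kg re-enters the atmosphere soon; 42 kg stays in products
```

Even the "durable" 42 kg is only deferred, as the article notes: carbon in lumber and furniture eventually returns to the atmosphere too.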
New forests also seem to emit significant levels of carbon dioxide, rather than only absorbing and storing it. When we plant or replant a tree farm, we turn over the soil and kill off roots and ground-level plants. That vegetation was also storing carbon, and it begins to decompose. In some cases, the dying plant matter emits more carbon dioxide than the newly planted trees extract from the atmosphere.
There has also been research suggesting that old-growth forests are more active than they appear. According to a scientific letter published in the journal Nature in 2008, forests continue to add woody matter — both new branches on existing trees and new, smaller plants — for centuries, sequestering carbon from the atmosphere in the process. The net carbon budget — the amount of carbon sequestered minus the carbon emitted through decomposition of downed plant matter — is more favorable in a forest's 300th year than in its fifth year. Overall, the data seem to suggest that old-growth forests keep more carbon out of the atmosphere than high-turnover tree farms, but there is probably significant variation depending on locale and how foresters manage the stock.
This doesn't mean you should forsake a Christmas tree or turn to an artificial alternative. (Fake Christmas trees often include chemicals that are especially harmful to the environment when discarded and are responsible for more greenhouse gas emissions than natural trees.)
A few special considerations set Christmas tree farms apart from producers of trees grown for paper. Christmas tree farmers typically plant more trees than they harvest, giving the new crop a better chance at out-sequestering the ones they replaced.
Evergreens aren't the best arboreal carbon sequestration tools — that title goes to hardwood trees — so the difference in greenhouse gas emissions between a long-lived evergreen forest and a Christmas tree farm aren't likely to be significant. (Razing a hardwood forest to grow Christmas trees would be a bigger problem, but this is a relatively rare event.)
If you're concerned about the impacts of your tannenbaum on global climate, consider renting a living tree that spends two to three weeks in your home over the holidays, then summers at business parks or other locales. If you're looking for a long-term relationship with a single tree, some companies will bring back the same tree year after year. You should start with something small, though. The trees grow between two and three inches per year, and your living-room ceiling probably doesn't.
In other cases, rented trees are permanently retired to a nice farm or city planter after a single Christmas with a family. Before you decide to rent, be aware that you might not get a classic Christmas variety such as the Douglas fir or Scotch pine. Many companies offer less traditional species including the small-leaf tristania. You should also seek out a local farm, minimizing the gas burned on the way from the farm to your home.
Without innovation, it will be very difficult and very costly to achieve the transformation to a greener economy. There is a vast amount of scientific and empirical evidence suggesting that reducing global greenhouse gas (GHG) emissions will require innovation and large-scale adoption of green technologies throughout the global energy system.
How to foster green technologies and innovation is perhaps the most crucial challenge for a green economy. Recent efforts show that OECD governments as well as emerging economies are giving priority to R&D activities and incentives for specific technologies such as renewable energy and environmental technologies.
The key challenge for policy makers in the area of science, technology and innovation is to identify the specific policies that will be needed to achieve broad technological change. The OECD Innovation Strategy outlines that strengthening innovation requires a policy response on several fronts. Besides market-based measures and regulatory policies such as carbon pricing that work at the end of the innovation cycle, policies that enhance the supply of available knowledge and technologies will also be needed.
As innovation will be an important driver of the transition to a green economy, the OECD Committee for Scientific and Technological Policy (CSTP) organised a workshop with the following objectives:
- Identify opportunities and challenges for funding and performing breakthrough research in the public as well as the business sector, including identifying the major gaps.
- Identify good practices for the diffusion of breakthrough green technology and innovation, including the coherence of supply and demand side policy measures.
- Develop a vision and medium-term strategy for CSTP-wide work on green research, technology and innovation.
For further information, please contact Mario Cervantes, Science and Technology Policy Division, OECD Directorate for Science, Technology and Industry
OECD work on green growth
The OECD Innovation Strategy