karam singh

on 1 May 2015


Probable Future
Increasing Ocean Deadzones
The Arctic's First Ozone Hole
Located in the stratosphere, the ozone layer protects life on Earth from harmful ultraviolet radiation. In the spring of 2011 scientists observed the largest and most severe ozone destruction ever witnessed in the Arctic since records began in 1978. Researchers reported in an article in the journal Nature on October 2, 2011 that in the northern spring of 2011, massive ozone destruction of 80% occurred 18-20 kilometers above the Arctic ice sheet in the Earth's stratosphere. This made 2011 the first year in which an ozone hole has been observed in the Arctic. They specifically stated: "For the first time, sufficient loss occurred to reasonably be described as an Arctic ozone hole." However, they conceded that some degree of loss above the Arctic may have occurred as annual events, measured in past decades during the polar winters. As mentioned previously, the ozone layer protects living things on Earth from harmful ultraviolet radiation. If there were no ozone layer, skin cancers and crop failures would increase dramatically, and earthly life as we know it would be unable to survive. Researchers have already begun to speculate that the 2011 Arctic ozone hole might have caused reductions in Europe's winter wheat crop.

When temperatures become colder, the chances of cloud development in the stratosphere increase. From March 2011 a polar vortex (a strong spin of swirling winds around the pole) was observed spinning above the Arctic. When this occurs it also blocks out the warmer air of the troposphere (the lowest layer of the atmosphere, nearest the Earth's surface) and keeps colder air in the stratosphere. These colder conditions then created more stratospheric clouds, which acted as a surface on which stable chlorine gases could turn into chlorine monoxide. The constant cold of the stratospheric clouds and the generation of this ozone-destroying chlorine monoxide eventually caused the depletion of the ozone in the Arctic.

From this simple explanation, as well as the two graphs, it is quite evident that as the troposphere warms (as the climate continues to warm due to the continued release of greenhouse gases) the stratosphere cools, which, as previously explained, will lead to a bigger hole in the ozone layer in the Arctic. So the more greenhouse gases are emitted and the warmer the troposphere becomes, the larger the ozone hole could potentially become. Below are the two graphs suggesting this correlation:

Snow in the Northern Hemisphere
Pure snow cover is the whitest surface on Earth and, because of its color, it reflects 90% of the sunlight that reaches it. Snow-free ground absorbs four to six times more radiation from the sun (otherwise known as solar radiation) than ground that is covered with snow. Its cooling impact on the planet aside, snow also affects soil moisture and run-off, including critical drinking and irrigation water in dry areas. In addition, it insulates plant roots from winter conditions. The majority of Earth's seasonal snow is located in the Northern Hemisphere. As of 2012, Northern Hemisphere snow cover on land (snow cover can also occur over sea ice) was around 24.6 million square kilometers, which was 0.3 million square kilometers below the 1967-2012 average. As a result, 2012 ranked as the 12th lowest in snow cover over the 1967-2012 period, and at the end of spring 2012 snow cover was the lowest on record.
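The absorption figures above can be sanity-checked with a few lines of arithmetic. In the sketch below (Python), the 90% reflectivity and the four-to-six-times factors are taken from the text; the implied ground albedos are derived from them, not measured values:

```python
# Sanity check of the snow-albedo figures quoted above.
# Assumption from the text: snow reflects 90% of incoming sunlight.
SNOW_ALBEDO = 0.90
snow_absorbed = 1.0 - SNOW_ALBEDO  # so snow absorbs ~10%

# Snow-free ground absorbing "four to six times more radiation" implies:
for factor in (4, 6):
    ground_absorbed = snow_absorbed * factor
    print(f"x{factor}: ground absorbs {ground_absorbed:.0%} "
          f"(implied ground albedo {1.0 - ground_absorbed:.0%})")
```

So the quoted range implies bare ground reflecting only 40-60% of sunlight, which is why snow loss matters for the planet's energy balance.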

Stratospheric Temperature

Through recording temperature patterns in the lower stratosphere, researchers gain clues regarding the planet's ever-changing climate. Short-term spikes in these temperatures occur in response to volcanic eruptions. Increasing greenhouse gases and the decline of the stratospheric ozone layer cool the stratosphere, so a long-term cooling trend would suggest that increasing levels of greenhouse gases are changing the Earth's climate. Global average temperatures in the lower stratosphere in 2013 were slightly below the 1981-2010 average. Satellite datasets determined that 2013 was warmer than 2012, a year ranked as the coldest, or nearly the coldest, on record in the lower stratosphere. Cold temperatures dominated the lower stratosphere in the lower latitudes as well as the subtropics, with a region of cooler-than-usual temperatures extending across the atmosphere from Canada to Alaska in the Northern Hemisphere.

Colder temperatures in the lower stratosphere can allow reactive, ozone-destroying chemicals to build up, contributing to ozone holes over the polar regions, which expose people to harmful UV radiation. Conversely, warmer-than-average temperatures in the lower stratosphere over the South Pole from September to December resulted in a smaller-than-average ozone hole in 2014.

Antarctic Ice
Preferable Future
Probable Future Assessment
'Dead zones', otherwise known as 'hypoxic zones', are areas of the ocean with such low oxygen concentrations that animal life eventually suffocates and dies, which is what gave rise to their infamous nickname. If the life there is mobile, it migrates to better environments. The result is that habitats originally teeming with life essentially become biological deserts. These zones can indeed occur naturally, but they can also be enhanced by human activity. A good example can be observed in the Gulf of Mexico, where the run-off from the Mississippi River causes a 22,000-square-kilometer dead zone. Fertilizer as well as sewage stimulates the growth of algae, which then decomposes; this decomposition depletes the oxygen in the water and reduces the supply readily available to marine life. However, there is also a direct link with climate change. Warming and several other aspects of climate change have increased the number of dead zones by the hundreds since the 1960s, reaching a peak in 2010. Temperature is perhaps the factor that most affects the formation of dead zones: warmer waters hold less dissolved oxygen in general. Warmer air heats up the surface of the water, making it more buoyant and reducing the likelihood that the top layer will mix with the colder waters below, and these deep waters are often where hypoxia develops. Therefore, as temperatures rise, aquatic life in the oceans will require more and more oxygen to survive, but because less oxygen is available, mortality rates will increase and could drive several ecosystems to collapse, as a 2009 study in Nature Geoscience reported.

"Such dead zones were rare 40 years ago but now there are hundreds, and at this point it will take a thousand years for the oceans to cool down, so it is imperative to pull the emergency brake on emissions."

Stephen Leahy, Interpress Service, July 31st, 2010

In the initial three months of 2012, snow cover over Eurasia was high, ranking third most extensive in February and ninth most extensive in January and March. Over this same three-month period, snow cover in North America fell far below average: it was third lowest in January, 16th lowest in February and fourth lowest in March. After the spring melt began in 2012, snow cover reduced dramatically over Europe and Asia (it dropped to seventh lowest in April and to the lowest on record during May and June). In North America, snow cover remained below average. Northern Hemisphere snow cover ranked eighth highest in October, fifth highest in November and highest on record in December. Snow cover extents were higher in Eurasia than in North America, where snow cover was below average in October and fourteenth highest in November. From the graph below it is also evident that this trend had held for quite some time and was broken only twice before: snow cover generally ran above the 1966-2010 average from the 1960s through the 1980s, fell below it beginning in the 1990s, and did not exceed the long-term average after 2003 (2012 would be the first time since then).

In 2014 Arctic sea ice reached its minimum of 5.01 million square kilometres, the sixth lowest extent since records began. The minimum ever recorded at the North Pole was 3.29 million square kilometres in 2012, and the eight lowest years have unfortunately been the last eight years. Meanwhile, ice in the Antarctic has advanced beyond 20 million square kilometres for the first time and is still expanding. Ice levels in the Arctic have recovered from their all-time low but are still on a shrinking trend. Julienne Stroeve of the National Snow and Ice Data Centre says: "We have been telling this story for a long time, and we are still telling it." The records show that in 2014 the ice momentarily dipped below 5 million square kilometres, to 4.98 million, on 16 September. Satellite data suggests that part of the Laptev Sea was completely clear of sea ice for the first time in the summer of 2014. An important question being asked is when the Arctic will experience its first ice-free summer. Peter Wadhams, a speaker from Cambridge, predicted it would happen in 2015: "I would still nail my colours to 2015 as a possibility [for an ice free Arctic in the summer]." The majority of scientists at the conference disagreed with this, and NASA's Gavin Schmidt described such predictions as unjustified and not based in physics. The cause of the growth in Antarctic ice is also hotly debated, whether it is due to geothermal heat or winds; many have attempted to explain it by running computer simulations with the stratosphere cooling, but it is still unclear, and scientists are divided about why it has grown.
Permafrost Melting
An irreversible tipping point will inevitably occur by 2020 or 2030, scientists have found, due to the release of huge quantities of organic carbon (part of the carbon cycle) locked away and frozen in plant matter in the vast permafrost region of the Arctic. The tipping point is where billions of tons of frozen leaves and roots containing carbon, which have lain undisturbed for thousands of years in the permanently frozen ground of the Northern Hemisphere tundra, thaw out due to climate change. As the Earth warms, microbes in the permafrost will begin to consume this carbon (which, as previously mentioned, is in the form of dead plants) and reprocess it into carbon dioxide and methane. This will warm the Earth even more, creating a 'runaway' effect. The implications for climate change are so catastrophic that the amount released could potentially dwarf the amount that humans release. A study of the speed at which the permafrost is melting strongly suggests that the tipping point will most likely come in 2020. That year will mark the point at which the Arctic region goes from being a net sink for carbon dioxide to a source of carbon dioxide emissions, hastening global warming as we know it. Once the frozen carbon in the Arctic finally thaws and decays, there is no way for us to physically put the carbon molecules back into the permafrost; scientists state that this will essentially be the point of irreversibility. The videos expand on this:
Permafrost Melting (Tipping point)
This is the year when the melting of the previously mentioned permafrost is predicted to reach an irreversible tipping point, initiating an irreversible release of around 190 gigatonnes of greenhouse gases such as carbon dioxide and methane into the atmosphere. At this point the thawing permafrost threatens to overwhelm any attempt at keeping the planet from heating up too much. What makes the situation even bleaker is that, without major reductions in the use of fossil fuels, almost two thirds of the world's gigantic warehouse of frozen carbon could potentially be released, according to a new study titled "Amount and timing of permafrost carbon release in response to climate warming". The initiation of this tipping point would push global temperatures several degrees higher, making large parts of the planet uninhabitable. After the Arctic reaches a certain level of heat, the emissions of carbon dioxide and methane will trigger a feedback process that will "amplify the current warming rate," as stated by Kevin Schaefer of the National Snow and Ice Data Center in Colorado. Rising temperatures in the Arctic have extended the length of the Arctic summer; this has stimulated plant growth and the subsequent uptake of CO2. Specifically, by 2025 this process will reverse and the thawing permafrost will release more carbon than is taken up by the vegetation growing over it. This point will definitely be irreversible. The 'tipping point' refers to the point where the permafrost located in parts of Canada, Siberia and parts of Europe becomes a major source of greenhouse gases. Several mathematical models have been created to predict and quantify the extent of the thawing of the permafrost. The model assumed favorable environmental conditions, with fossil fuel emissions lower than those currently being released.
Even then, the study revealed that between 29 and 60 percent of the world's permafrost will thaw by 2020, irreversibly releasing an extra 190 gigatonnes, as mentioned previously. Published in the journal Tellus, this was the first study to quantify the amount.

Arctic on thin ice
It is predicted that by 2030 the Arctic will reach critical levels of heat and will potentially even be ice free. Global climate models have long projected that surface (tropospheric) warming resulting from increased atmospheric concentrations of CO2, methane and various other greenhouse gases will be greater at the poles (North and South) than at lower latitudes, otherwise known as the tropics. This is known as 'polar amplification'. It is primarily due to positive feedbacks from the retreat of ice and snow as the region warms, and it is a consistent factor across many models and assessments. In the 2007 Intergovernmental Panel on Climate Change simulations, mean Arctic warming exceeded the global mean warming by around a factor of two, and Arctic winter warming was a factor of four larger than the global annual mean. The IPCC AR4 models project that under a 'current emissions' scenario there will be Arctic warming of around 5 degrees Celsius; mean Arctic warming for another scenario is larger, at 5.9 degrees Celsius. These figures compare with global changes of 2.8, 3.4 and 1.8 degrees Celsius, and a global warming of 4.0 degrees Celsius under another scenario (A1FI) suggests an increase in the mean annual Arctic temperature of 9 to 13 degrees. Assuming a polar amplification factor for the annual mean of roughly two, the Arctic in 2030 can reasonably be expected to be from 1.28 to 1.38 degrees Celsius warmer than it was in 2011; as a result we can assume the Arctic in 2030 will be roughly 1.3 degrees Celsius warmer than in 2011. This was concluded in a study using changes consistent with trends observed over the past century.
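The roughly 1.3-degree estimate above follows from simple scaling. A minimal sketch (Python; the global-mean trend value is an assumption chosen to be consistent with the figure quoted in the text, not a measured rate):

```python
# Back-of-the-envelope polar amplification estimate.
# Assumption: a global-mean warming trend of ~0.034 degrees C per year,
# chosen so the result matches the ~1.3 degree figure quoted above.
def arctic_warming(global_trend_per_year, amplification, years):
    """Scale global warming over a period by the polar amplification factor."""
    return global_trend_per_year * years * amplification

delta = arctic_warming(0.034, 2.0, 2030 - 2011)
print(round(delta, 2))  # about 1.3 degrees C of Arctic warming by 2030
```

The point of the sketch is simply that an amplification factor of two turns a modest global trend into a substantially larger Arctic one.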

Heat Waves: Hotter and more Frequent
A Stanford-based study predicts that by 2039, hot temperature extremes could become extremely common, not to mention extremely dangerous, across the world and especially in the United States. Exceptionally long heat waves and other hot events could quickly become commonplace in the United States in the next thirty years, according to a new study ("Intensification of Hot Extremes in the United States") by Stanford University climate specialists Noah Diffenbaugh and Moetasim Ashfaq. The study concluded that hot extremes could be very frequent by the year 2039, which poses major agricultural concerns for several farming states in the United States. Within the next 30 years, it is quite possible that the country could see an increase in heat waves like those occurring in the east of the United States and similar to the one that struck Europe in 2003, which caused tens of thousands of casualties. Such heat waves will put gargantuan amounts of stress on crops: major ones such as soybean, corn, cotton and wine grapes could see very large reductions in yields. From 2031 to 2040, most areas in the states of Utah, Colorado, Arizona and New Mexico could potentially suffer seven seasons as intense as the hottest season recorded from 1951 to 1999. The researchers additionally stated that by 2040, heat waves like the one in 2003 could occur nearly every year in Europe. Switzerland's Federal Office of Meteorology and Climatology revealed findings at an international conference showing that since 1880 the duration of heat waves in Western Europe has doubled and the number of unusually hot days has tripled.
Furthermore, researchers from Britain's Hadley Centre for Climate Prediction and Research have produced mathematical models that clearly indicate that by 2040 there will be a very high probability that events such as the dangerous 2003 heat wave could take place in Western Europe (Britain included) every other year. These researchers have also shown that half of the risk of the heat wave recurring can be traced to human influence on the planet's changing climate. They suggest that if concentrations of greenhouse gases from factories, power plants and the like continue to increase at even a modest pace, then by 2040 Western Europe will be forced to endure summers that break even the record-high temperatures of 2003.

Exponential Increase in Wildfire Rates
Although this particular future event is more America-centric, it is still relevant in the global context. By the year 2050, wildfire burn area (the area over which a wildfire burns, as implied by the name) is predicted to increase by more than 50%, and by as much as 175% in some areas of the United States. This was stated in a report titled "Impacts of Climate Change from 2000 to 2050 on Wildfire Activity and Carbonaceous Aerosol Concentrations in the Western United States". It specifically predicts that the worst-affected areas will be the forests of the Pacific Northwest and the Rocky Mountains, where the area of forest destroyed by wildfire is projected to increase by 78% in the Pacific Northwest and 175% in the Rocky Mountains. This research is based on a conservative temperature increase of 1.6 degrees Celsius over the next forty years. The scientists who authored the study in the Journal of Geophysical Research state that the increase in wildfires will lead to significant deterioration of air quality in the western United States due to the greater presence of smoke. The study was funded by the U.S. Environmental Protection Agency and was led by Dr. Dominick Spracklen of Harvard's School of Engineering and Applied Sciences.

Because of this exponential increase in wildfire rates, it is predicted that property losses in California specifically would increase by $2 billion a year by 2050. In Yosemite National Park alone, the wildfire rate (in other words, frequency) and burn area would increase by around 20% due to climate change. The International Journal of Wildland Fire states that warmer temperatures pose a double-edged threat: in addition to triggering fires, the increased warming could also melt the snow that covers the forest in the winter, and lightning strikes would then trigger even more fires, which would burn more intensely. This prediction was made on the basis that higher temperatures will make vegetation more flammable and will therefore allow larger fires to catch on. Dr. Lutz also predicts that warmer temperatures will cause a 20% increase in the area burned as well as in the frequency of fires. This will happen because of, as previously mentioned, lightning strikes: there is evidence that the increasing levels of carbon dioxide resulting from emissions lead to more lightning strikes. Additionally, if snowpack cover in the national park falls by 17% by 2050 (which could well happen as a result of rising temperatures and has been predicted by conservative global warming models), then lightning strikes will inevitably lead to far more wildfires. To conclude, the number of wildfires ignited by lightning is projected to increase by 19.1% from 2020 through 2049, while the area that will burn at high severity is projected to increase by 21.9%. This projection was based on a global warming model known as the B1 scenario.
This shows the percentage increase in areas burned due to wildfires in 2050.
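As a simple illustration of what the two B1-scenario percentages above mean in practice, the sketch below applies them to an arbitrary baseline (the 100-unit baseline values are hypothetical, chosen purely for illustration):

```python
# Applying the quoted B1-scenario increases to a hypothetical baseline.
baseline_ignitions = 100.0       # lightning-ignited fires (hypothetical units)
baseline_high_severity = 100.0   # area burned at high severity (hypothetical units)

ignitions_2049 = baseline_ignitions * (1 + 0.191)          # +19.1% by 2049
high_severity_2049 = baseline_high_severity * (1 + 0.219)  # +21.9% by 2049

print(round(ignitions_2049, 1), round(high_severity_2049, 1))
```

In other words, for every 100 lightning-ignited fires today, the scenario implies roughly 119 by mid-century, with the severely burned area growing slightly faster still.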
By 2015, 10 billion tonnes is projected to be the maximum amount of carbon that humans may emit while remaining below the critical threshold for global warming of two degrees Celsius. With the help of new mathematical and computer models of prescribed atmospheric carbon dioxide concentrations, scientists from Europe have formulated for the first time the extent to which global carbon dioxide emissions must be reduced to halt global warming. What most people don't realize is that the biggest impact of climate change will be shifts in precipitation, not temperature increase. These changes have already been documented, with increasing frequency and severity of flooding and droughts. It was reported in 2009 that the normal band of heavy rainfall around the equator has been moving northwards; this leaves areas that once received abundant rainfall now dry. According to the model, admissible carbon dioxide emissions increase from around 7 billion tonnes of carbon in the year 2000 to a maximum of around ten billion tonnes in 2015. Global energy use is expected to climb 55 percent between 2005 and 2015, and most of that will come from the use of fossil fuels, which could push carbon emissions past the point of no return.
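The 7-to-10-gigatonne figures above imply only a modest allowed growth rate over the period. A rough calculation (Python; the constant-compound-growth assumption is an illustration of the quoted numbers, not part of the study):

```python
# Implied average annual growth if admissible emissions rise from ~7 Gt of
# carbon in 2000 to the ~10 Gt ceiling in 2015, assuming constant compound
# growth (an illustrative assumption, not the study's own model).
start_gt, end_gt, years = 7.0, 10.0, 15
annual_growth = (end_gt / start_gt) ** (1.0 / years) - 1.0
print(f"{annual_growth:.1%} per year")  # roughly 2.4% per year
```

That is, even the "admissible" pathway allows emissions to grow only a couple of percent per year before the ceiling is reached.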
Greenhouse Gas Emissions
How to take action to prevent continuously thawing permafrost
The crux of the majority of these matters is that we essentially have to keep decreasing the amount of CO2 we emit into our atmosphere. However, there are also specific solutions being developed to tackle the issue of the ever-melting permafrost. Engineers in Alaska have been developing technology to tackle exactly this issue. The solution was, believe it or not, created out of necessity for the trans-Alaska pipeline, because the melting permafrost in Alaska was proving to be a major problem for the stability of business in the region: it was going to cause the pipeline to sink, and the cost of repairing one of its support posts would be around 85,000 dollars. As a result, out of financial necessity, technology was developed to solve the problem: snow-making machines. These machines would drain the heat-absorbing melt-water ponds and at the same time place a reflective white layer of snow over the pipeline's supports. The machines would thereby serve a triple purpose: reducing the heat-absorbing permafrost melt-water ponds, increasing the reflectivity of the tundra, and keeping the trans-Alaska oil pipeline supports from needing replacement.

This project showed that the sinking of the pipeline could be completely halted, and as a result the pipeline managed to remain operational. Having observed this, it is indeed possible that the same technological solution could be applied at a larger scale to tackle the melting of the permafrost. Of course, this basic idea would need to be wholeheartedly adopted by the governments of the countries where the permafrost is located, such as Canada and Russia. The best case would be an international union, and it would require the implementation of policies.
Best case scenario
The best case scenario, which would most likely take place if the melting of the permafrost were truly halted, would be the avoidance of releasing a gigantic amount of methane and carbon dioxide into the air, which would otherwise warm our atmosphere and accelerate global warming drastically. By avoiding this scenario completely, the world could begin focusing on actually stabilizing the amount of carbon dioxide and other greenhouse gases we release on a regular basis. Essentially, preventing this problem from escalating any further will allow us as a society to begin dealing with the other pressing matters at hand.
Saving the Arctic
As stated by many researchers, the Arctic is essentially the canary in the coal mine, and scientists observe what happens there extremely vigilantly. In addition, the ice there, as mentioned previously, reflects a significant amount of solar radiation, and the expansion of sea water due to heat could have even longer-reaching implications. As a result, it is absolutely imperative that we begin to implement changes immediately if we want to see an Arctic that still has ice in the future. At present there are two things that can be done immediately to prevent a future in which the Arctic is completely ice-free. The first is considering the role of soot in accelerating the melting of the ice.

Soot and other greenhouse agents such as ozone and methane are known as 'short-lived climate forcers'. They are named so because, though they linger in the atmosphere for a relatively short time, they can have an extremely powerful greenhouse effect. Soot, otherwise known as 'black carbon', remains in the atmosphere for about six days, whilst CO2 lasts for centuries and possibly thousands of years. Despite this, black carbon has an absurdly high warming effect in the snowy Arctic, because the dark soot, after being deposited by rain or snow onto bright snow or ice, continues to absorb heat. The UN Environment Programme estimates that reducing soot as well as methane emissions could quite possibly cut Arctic warming by two thirds over the next three decades; this would delay the disappearance of summer ice by at least 20 years. Reducing the amount of soot we put into the air should therefore be a priority. Research from Stanford University has revealed that black carbon is second only to carbon dioxide in its contribution to global warming, and that by reducing the amount of soot released we would also prevent around 1.5 million premature deaths. Soot, which comes from older diesel engines, the burning of fossil fuels, industrial sources, and the inefficient biomass cook stoves used in many developing nations, can come out of the atmosphere in a matter of a week (as mentioned previously). With this knowledge we can conclusively state that the way to solve this problem (and it definitely is a problem that can be tackled right now) is to go 'clean': the solution is clean energy, namely the use of electric vehicles and rural electrification.
Mark Jacobson from Stanford observed the effects of two different kinds of soot (black and brown), from burning fossil fuels and from burning biomass, on the heating of clouds, snow and ice, and concluded that the combination of these soots has a greater warming effect than even methane, and that eliminating soot could reduce warming by 1.7 degrees Celsius. Jacobson's study also concluded that the most low-cost and effective way of reducing soot is to place particle traps on vehicles burning fossil fuels. Additionally, in areas where biomass cook stoves are commonplace, providing electricity from clean sources would reduce the use of biofuels for heating and cooking, and would therefore reduce the amount of soot produced.

The "Quick Fix"
A recent study suggests that one way we could slow the melting of the sea ice would be by rerouting international flights so they no longer cross the Arctic Circle. Doing this would produce immediate results; however, it is definitely not a long-term solution and should not be treated as such. This solution once again relates to decreasing the amount of black carbon/soot being emitted into our atmosphere. These cross-polar flights are, strangely, a huge source of black carbon pollution in the Arctic Circle, and if those planes were to divert their course it could help prevent an ice-free Arctic in the near future. Commercial flights did not start travelling over the Arctic on a regular basis until 1998, when Russia gave other countries permission to fly over the region. Since then, the route has become increasingly popular for planes travelling between North America and Europe; in 2012 it was recorded that more than 50,000 flights a year were crossing the poles. This has had a huge impact on the Arctic: although these cross-polar flights are a very small source of the greenhouse-gas emissions that are warming the planet, they are a significant source of pollutants like black carbon, because the planes fly through the stratosphere. In the paper, Mark Jacobson (from Stanford) suggests that airlines should begin to divert flights away from the Arctic Circle. The model described in the paper suggests that if this were to happen, less soot would remain in the atmosphere of the Arctic Circle, the Arctic would cool down slightly (by around 0.0015 degrees Celsius, to be exact), and the Arctic sea ice would even recover slightly over the next two decades. Now consider the economic aspect: these diversions would not be entirely cost-free.
Planes currently fly over the Arctic because it is simply faster; if all cross-polar flights were diverted, the airlines would end up using more fuel than usual. This would cost them around $99 million per annum and, rather ironically, would slightly increase the amount of greenhouse gases around the rest of the world, along the planes' new diverted routes. As previously mentioned, this should not be viewed as a permanent and sustainable solution; it is simply a way of buying a bit more time. According to the modelling, the ice recovers over the next 22 years, but the world would still need to cut greenhouse emissions from the main sources, such as transport and energy.
This is an example of a diverted flight from Frankfurt to Anchorage.
Solution for the worst case scenario
Another solution, also formulated in Alaska, is a technology known as thermosyphoning. It uses passive cooling through heat-exchange pipes, applied by the U.S. Army Corps of Engineers to preserve foundations built on permafrost. It has since been applied widely in Alaska, as well as in Russia and Canada, to cool and preserve permafrost in those regions. The original design was simply a 20 mm vertical pipe with a radiator in the air; the design has since evolved to include thermopiles, sloped-pipe thermosyphons and evaporator-pipe syphons. The thermosyphon itself was originally developed in Canada in 1994. Based on this technology, it is possible that if governments in the countries where the permafrost is located implemented policies under which thermosyphoning is done on a grander scale, the permafrost could be effectively preserved. However, this should only be implemented at the point where the snow-machine policy is implemented and the permafrost gets closer to its tipping point.
How geo-engineering can reduce the severity of heat waves
A field known as geo-engineering has become increasingly important and increasingly researched. The idea behind geo-engineering is to "hack" the planet. Weather problems such as droughts and storms could be tackled through extensive use of geo-engineering, and since the same aspects of global warming that cause droughts and storms also cause heatwaves, whatever geo-engineering is applied to solve those problems would also address the issue of severe heatwaves. One method which researchers speculate could be of use is the utilization of aerosols in the air. Aerosols reflect solar radiation back into space, thereby lowering the Earth's surface temperature. They also provide seeds around which water droplets coalesce to form clouds, further increasing the planet's reflectivity. The particles are long lasting in the stratosphere, which makes the use of aerosols as a worldwide planet cooler very attractive. The actual effects of aerosols are known quite well, since volcanic eruptions produce aerosols naturally and have cooled the planet in the past. For example, Mount Pinatubo in the Philippines, which erupted in 1991, released so much sulfur dioxide that the planet cooled by 1 degree Fahrenheit and stayed cool for more than two years. It is estimated that if a Pinatubo-sized amount of sulfur (around 20 million tons) were pumped into the stratosphere, it could linger in the atmosphere for 3-4 years; it could reverse the Arctic melting and would obviously reduce the severity of heatwaves. Cambridge scientists have also published a paper proposing that the best way to pump the aerosols into the air would be via a large tethered balloon with a hose attached to a high-pressure pump.
The study also concluded that stratospheric aerosol injection would be the most effective, fastest and cheapest of these climate-engineering approaches, costing around $50 billion per year. However, large-scale climate engineering solutions would require tests in the field, so it is important that governments begin funding research programs related to climate engineering immediately.
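The Pinatubo-style cooling described above can be sketched with the simplest possible zero-dimensional energy-balance relation, ΔT = λ·ΔF, where ΔF is the radiative forcing from the aerosol layer and λ is a climate-sensitivity parameter. Both numbers below are illustrative assumptions, chosen to be roughly in the range estimated for the 1991 eruption; this is a toy sketch, not a climate model:

```python
# Toy zero-dimensional estimate of surface cooling from a stratospheric
# aerosol layer: Delta_T = lambda * Delta_F. Both parameters below are
# illustrative assumptions, not values from the studies cited above.

CLIMATE_SENSITIVITY = 0.5   # K per (W/m^2), assumed sensitivity parameter
AEROSOL_FORCING = -1.1      # W/m^2, assumed sustained global-mean forcing

delta_t_c = CLIMATE_SENSITIVITY * AEROSOL_FORCING  # temperature change in Celsius
delta_t_f = delta_t_c * 9 / 5                      # the same *change* in Fahrenheit

print(f"Estimated cooling: {delta_t_c:.2f} C ({delta_t_f:.2f} F)")
```

With these assumed values the estimate comes out near the roughly 1 degree Fahrenheit of cooling observed after Pinatubo, which is the sense in which volcanic eruptions serve as a natural calibration for aerosol geo-engineering.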
Like many of the solutions before it, this one, if implemented, would allow the planet to be cooled down, and as a result many weather-related problems arising from climate change, such as hurricanes and droughts, not just severe heatwaves, would be averted. However, this does not mean that the human race can continue emitting the amount of greenhouse gases that it does; the long-term goal must still be to reduce the amount we are emitting. This solution simply gives us some breathing room to take care of the real problem at hand.
The Solution
Previously we clarified that climate change directly affects both the number and the severity of the wildfires that occur around the world. As a result, to reach the end goal of reducing the quantity and severity of wildfires, we should be attempting to reduce the levels of greenhouse gases in order to properly mitigate the impacts of climate change on our planet. We know from the data that the biggest source of greenhouse gases is the production and generation of electricity, which accounts for around 31%, as pictured below:
As a result, what we should in fact be looking to do is increase widespread use of renewable energy: energy that can sustainably generate electricity without the need to burn any fossil fuels. In truth, people have been doing this for years now, extensively using solar and wind power; Denmark is a notable example, with approximately 47.7% of the nation's electricity coming entirely from wind power. Countries have adopted renewables and will continue to adopt them thanks to policies such as the Kyoto Protocol, which has stated very specific targets and agendas that need to be met regarding renewable energy, and there is no doubt that such policies will continue to be enacted by governments in the future. What should instead be focused on is ensuring a secure supply of the materials required by manufacturers and countries around the world to keep building things like wind turbines. The magnets essential for making them depend on what are known as rare-earth metals; for example, turbines and electric vehicles both rely on dysprosium and neodymium for their magnets. These materials have unusual configurations of electrons orbiting their nuclei and thus have incredibly powerful magnetic properties; motors or generators made from other materials would be heavier and far less efficient. The issue here is actually geopolitical. At the moment the majority of all rare-earth metals are produced by one country: China. Due to this, the Chinese have control over any shortages and embargoes they may want to impose on other nations. For example, international prices for rare-earth metals like cerium and lanthanum fell by two-thirds in August 2011, and prices of neodymium dropped by a third, as can be seen from the graphic below. They have this power because 94 percent of rare-earth metals are mined in China.
Throughout 2008, China supplied almost all of the global annual demand outside China of 50,000 to 55,000 tons. But then the nation suddenly cut export quotas by more than 30,000 tons last year alone, and once again this year imposed some hefty taxes, which resulted in a shortage in the rest of the world. This, in conjunction with a two-month Chinese embargo on shipments to Japan during the territorial dispute over the Senkaku islands, resulted in prices outside China reaching as much as 15 times the level within China last winter. If China were to impose more trade restrictions like these in the future, it could cause a major drop in the rare-earth metal supply, hindering nations from creating their energy in a sustainable way and thereby accelerating the impact of climate change. As a result, what should be done is to find a substitute for these rare-earth metals, and this is exactly what many researchers have been doing. Laura Lewis, a Professor of Chemical Engineering at Northeastern University in Boston, is leading a $3.3 million Department of Energy research project to synthesize new supermagnetic materials and then work out how to mass-produce them for a $20 billion market; her team is currently looking for synthetic replacements for rare-earth materials. In addition, research consortiums have found a way to replace indium with a silicon-doped zinc oxide. If governments were to fund such programs, they would ensure that renewable energy technologies could continue to be manufactured, and the total amount of greenhouse gases being pumped into the atmosphere could be brought down.
Preferable Future
Doing this would essentially ensure that greenhouse gases are reduced by the year 2050, as the supply of rare-earth materials would allow renewable energy technologies to be produced without interruption. This would mitigate the impacts of climate change such as the exponential increase in wildfires and droughts, if not completely remove the chance of any such events taking place in 2050.
Carbon Tax
The idea of a tax can deter anyone from doing anything: speeding, littering and even loitering. So why not tax greenhouse gas emissions? Greenhouse gas emissions can be reduced most cost-effectively through market-based approaches that put a price on carbon. The two systems most discussed by economists are a cap-and-trade system and a carbon tax. By establishing a price for greenhouse gas emissions, either system can help to correct the market failure that arises when the cost of environmental damage is not reflected in the market value of fossil fuels. The main greenhouse gas produced by humans is CO2, which comes largely from burning fossil fuels. A CO2 tax would, for example, impose a tax on coal, oil and natural gas in proportion to the amount of carbon they contain; this tax would be passed on to the price of electricity, petroleum products and energy-intensive goods. A broader carbon tax could also be designed to apply to non-energy sources of CO2 emissions (because, as mentioned before, energy sources account for only around 31% of all the greenhouse gases emitted into our atmosphere) and to other greenhouse gases based on their climate change potential relative to CO2. The costs that result from climate change, such as damaging weather events, rising sea levels and loss of biodiversity, are not currently included in market prices; at present these effects are borne by society as a whole, and a carbon tax would attempt to include these costs in market prices. Secondly, as technologies that reduce CO2 emissions during combustion are not yet widely available or even well researched, the primary way to reduce CO2 is to switch to fuel sources with lower carbon content or to reduce consumption of fossil fuels.
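The mechanics of a tax "in proportion to the amount of carbon they contain" can be sketched in a few lines. The tax rate and the per-fuel carbon fractions below are illustrative assumptions (real fuels vary widely in carbon content); the 44/12 factor is the exact molar-mass ratio of CO2 to carbon, since each tonne of carbon burned yields 44/12 tonnes of CO2:

```python
# Sketch of a carbon tax computed per tonne of fuel, proportional to
# the fuel's carbon content. Tax rate and carbon fractions are
# illustrative assumptions; the 44/12 factor is exact chemistry.

CO2_PER_CARBON = 44.0 / 12.0      # tonnes of CO2 per tonne of carbon burned
TAX_USD_PER_TONNE_CO2 = 30.0      # assumed tax rate

# Assumed carbon mass fraction of each fuel (illustrative values only).
carbon_fraction = {"coal": 0.70, "oil": 0.85, "natural gas": 0.75}

def tax_per_tonne_fuel(fuel: str) -> float:
    """Tax owed per tonne of fuel, proportional to its carbon content."""
    co2_tonnes = carbon_fraction[fuel] * CO2_PER_CARBON
    return co2_tonnes * TAX_USD_PER_TONNE_CO2

for fuel in carbon_fraction:
    print(f"{fuel:12s} ${tax_per_tonne_fuel(fuel):6.2f} per tonne")
```

Because the tax is levied on the fuel itself, the carbon-heavier the fuel, the larger the levy, which is exactly the mechanism that pushes producers and consumers toward lower-carbon alternatives.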
Using a market-based policy to establish a common price on greenhouse gas emissions is necessary in order to provide incentives for dramatic emission reductions across a variety of industries. Some emission reductions will be achieved by corporations as they switch from higher- to lower-CO2 fuels and invest in research into energy-saving technologies. Other reductions will come from consumers, who will begin purchasing less energy-intensive goods and changing their behaviour to use energy more efficiently. These pricing policies will also encourage manufacturers to develop new technologies such as carbon capture and geological storage, or alternatively zero-carbon energy sources.
Preferable Future
By implementing this carbon tax we would discourage society from overusing energy, because the tax would force society to bear the environmental cost of fossil fuels. More importantly, it would discourage corporations from producing energy from fossil fuels, since they would have to pay more; instead, they would begin to invest in other, less carbon-intensive technologies for producing energy. If this is implemented on a global level and well enforced, it is possible that the other climate change events mentioned further along this timeline may simply not occur, because greenhouse gas emissions would be so dramatically cut thanks to this tax.
Solution for the worst case scenario
Similar to the preferable future of the previous climate change event, if the thermosyphoning is done on a large scale it is possible that the permafrost will remain cool and stop melting, and the melting of the Arctic Circle could also possibly be reversed. As a result, the next climate change event may not occur at all.