Wind Will Blow You Away

The use of wind energy dates back to around 500 B.C., when the Persians used it to pump water and grind grain. Around 1000 A.D., Europeans started using wind technology to help drain lakes and rivers in marshy lands. Moving forward roughly 850 years, Daniel Halladay started the first windmill company in the United States. It wasn’t until the 1893 Chicago World’s Fair that wind power was showcased as a way to provide energy for homes and businesses. The largest wind turbine of its time was erected in Vermont in the 1940s; known as the Grandpa’s Knob turbine, it generated 1.25 megawatts of power. After the Second World War, oil prices were very low, which destroyed most interest in renewable energy sources for a few decades. It wasn’t until the oil shortages of the 1970s that the outlook on energy changed for the United States and the rest of the world. Because of these shortages, wind energy was able to re-enter the energy sector. Wind energy continued to grow: by 1980 the first large wind farm had been installed in California, and by 2008, 25.4 gigawatts of wind turbines had been installed in the United States. Capacity jumped significantly, to 60 gigawatts, in 2012. Finally, by 2015, the United States had 66 gigawatts of wind energy installed and had pledged to get 20% of its energy from wind by 2030.

Wind power has taken huge steps in the energy sector, but there are many ways the industry can improve. The power produced from wind turbines supplies only about five percent of the nation’s energy demand, and it does so in only 39 states. On the bright side, the cost of wind technology has decreased by 90% since the 1980s, so there is little excuse not to increase funding and support for the wind energy sector. The picture below shows the potential of wind energy across the United States, with blue representing the most potential/stability and white/grey the least. Based on the figure, there are 700,000 square miles, about one-fifth of the land area of the United States, that have the potential to provide adequate wind energy.

New map shows how taller wind turbines could help unlock wind's potential in all 50 states, especially in the southeastern U.S. | Map courtesy of National Renewable Energy Laboratory.
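As a quick back-of-the-envelope check on that one-fifth figure (assuming a total U.S. land area of roughly 3.5 million square miles, a commonly cited value that is not stated in the post itself), the arithmetic works out:

\[
\frac{700{,}000\ \text{mi}^2}{3{,}500{,}000\ \text{mi}^2} = 0.20 \approx \frac{1}{5}
\]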

The potential for total energy production has only just been tapped into. Wind power could supply 35% of the United States’ electricity by 2050 and support 600,000 jobs in the process.

Now, why do people dislike wind turbines, and are there any drawbacks to wind energy? One of the top reasons people dislike turbines is that they do not believe climate change exists. Another is that wind turbines and wind farms scar the landscape. While I do agree that turbines can obstruct the view, humans were building and living next to objects that obstruct the landscape long before turbines existed: oil pump-jacks, power lines, cell phone towers, water towers, buildings, and so on. People also say wind energy is too expensive to produce. While this was true only a few years ago, wind energy is decreasing in price year after year, and within the next five years it is projected to cost less than oil and coal. People also complain that wind turbines kill too many birds. While turbines do kill birds, buildings and cats alone kill exponentially more birds than turbines do. The biggest problem that still needs to be solved is how to keep turbines from killing bats. Bats do not fly into the blades; they die before they reach them, because the sudden drop in barometric pressure around the blades damages their bodies. Ultimately, people are scared of change and do not have the correct facts about wind energy and turbines.

Overall, the wind sector is heading in the right direction. More and more people see wind energy as a positive alternative to fossil fuels, and it is becoming cheaper for United States citizens. As oil and other fossil fuels increase in price, more resources will be pumped into the wind sector, increasing efficiency and total gigawatts produced. Wind power has the potential to provide thousands more jobs than oil, along with even more jobs tied to the broader wind energy sector. As more people are educated about the wind sector, they will be more willing to accept the increase in turbine construction and installation, and happier about paying less for their electricity bills.

Works Cited

History of Wind Energy

Why Do People Hate Wind Turbines?

The Billion-Year Background of Coral Reefs

The history of coral reefs is a very long one, spanning billions of years. The origins of reef-building organisms can be traced back to the first living organisms on the planet. Reef-building organisms have endured numerous extinction events, climatic changes, and global continental shifts, and they have continued to create the beautiful and vibrant reefs we are familiar with. But within the past few decades there has been significant damage and change to these essential ecosystems. The extinction-level events and climatic changes they endured in the past happened gradually over thousands of years; now they are experiencing change so quickly that it may have devastating and permanent consequences for these ancient organisms.

The planet Earth is estimated to be about 4.5 billion years old. The first organisms to appear on our planet, single-celled microorganisms such as bacteria and archaea, arose about 3.5 billion years ago. Around this time, microbialites begin to appear in the fossil record. For the next 2.5 billion years these organisms are represented by photosynthesizing cyanobacteria in the form of stromatolites. Even though these were still single-celled organisms, they formed colonies and built boulder-like structures. Stromatolites are the first evidence of reef-building organisms in the fossil record.

It was not until about 600 million years ago that more complex reef-building organisms evolved and began building reef ecosystems. Since then there have been four main cycles in the geological history of reefs, each generally separated by an extinction event. These events each changed the organisms in the communities and the way reefs were structured. One of the most significant extinction events in history is the Permian-Triassic mass extinction, believed to have occurred over a span of about 200,000 years or less. This one event affected both terrestrial and marine organisms and accounted for the loss of 79% of marine invertebrate genera. There are many hypotheses about why it so severely affected marine species, but the main cause appears to have been a combination of changes in ocean oxygen levels, CO2, pH, and temperature. The event had such a large impact on marine organisms that it completely changed the way reefs were built. Before it, reefs were mainly made up of sponges, rugose corals, and tabulate corals; afterward, and even to this day, reefs are mainly built by scleractinian corals, also known as hard corals. Even though these events have caused massive biodiversity loss, evolution carries on and reefs are built again.

One thing that has remained constant throughout these events is the symbiotic relationship between reef-building organisms and algae. Today this symbiosis has become essential to the survival of reef-building corals. In modern corals, the algae live within the cells of the coral, and they are often the source of the beautiful colors we see in coral reefs. The algae are essential to the corals: 90% of the material the algae produce photosynthetically goes to the coral, helping it grow and thrive. Although this is beneficial to the corals, it can also be a downfall. Just as during the Permian-Triassic extinction, these organisms remain very vulnerable and sensitive to changes in ocean chemistry and temperature, and coral bleaching is just one consequence of such drastic changes. Knowing how these beautiful and diverse ecosystems came to be, as well as their history of extinction and evolution, can help us better understand what is really going on with our coral reef ecosystems today, and what we can do to prevent the loss of such an essential part of our planet.

Works Cited

“Coral Reefs: Past, Present and Future.” Coral Reefs: Past, Present and Future. N.p., n.d. Web.

“Coral Reefs.” MarineBio Conservation Society. Web. Accessed 18 Oct. 2016.

Payne, J. L., and M. E. Clapham. 2012. End-Permian Mass Extinction in the Oceans: An Ancient Analog for the Twenty-First Century? Annual Review of Earth and Planetary Sciences, 40:89–111.

Landfill Leachates

Landfill leaching has occurred ever since humans first started piling their trash in landfills. It wasn’t until the 1970s, however, that people began to notice these leachates and the environmental problems they can cause. Over the years, engineers and scientists have made significant improvements in preventing leachates from contaminating groundwater; however, modern systems are still not always reliable and can be prone to failure. The problem is only going to get worse, because the trash buried beneath landfills remains for hundreds of years, and so an ever-increasing number of past and present landfills will need to be monitored.

By definition, a leachate is any liquid that extracts soluble or suspended solids as it passes through a medium. This liquid, usually water, poses numerous threats to wildlife when it enters the environment. Furthermore, leachate that reaches the water table can be carried far from the initial spot of contamination. Because of this, a number of laws govern how solid waste is to be confined in landfills. The first of these was the Solid Waste Disposal Act (SWDA), passed in 1965, which mandated disposal methods and technologies intended to limit leachates entering the water table. This act was later found to be obsolete, and so the Resource Conservation and Recovery Act was passed in 1976. It added a number of provisions to the SWDA, the most notable of which was the “cradle to grave” requirement for waste disposal. This amendment directs the EPA to establish controls on the management of hazardous waste and requires strict recordkeeping by waste generators, transporters, and disposers. Furthermore, the law sets standards such as double liners and groundwater monitoring, as well as standards for release detection, prevention, and correction, spill and overfill control, and restrictions on land disposal of untreated hazardous waste products.

Modern landfills must also have a leachate collection system in place. This includes a leachate collection layer of porous sand or gravel, or a thick plastic mesh sheet. A geotextile fabric must also be in place to capture solids that pass through. Finally, a leachate collection pipe system must be installed to remove the leachate and carry it to treatment facilities. Groundwater monitoring wells are placed both upgradient and downgradient of the solid waste facility. After closure of the landfill, the EPA requires the site to be monitored for 30 years.

With all these safety regulations in place, it is hard to imagine that any leachate could possibly leak into the environment. However, most of these regulations were not in place prior to 1976, so many old disposal sites can still pose environmental problems. Furthermore, modern leachate collection systems can fail, which could also lead to an environmental disaster.

Leachate forms when water percolates through waste and promotes decomposition. This decomposition in turn uses up any available oxygen, making the landfill anoxic. As a result, the temperature rises and the pH drops quickly. The higher temperature and lower pH cause heavy metals that are usually insoluble to dissolve. Leachate can then react with other materials to create even more dangerous substances; for example, if leachate reacts with material containing gypsum, hydrogen sulfide gas can be released. Most leachate consists of dissolved organic matter, inorganic macrocomponents (such as sulfides and chlorides), heavy metals (Pb, Ni, Cu, etc.), and organic compounds.

Leachate with a high iron content in a polluted stream


A Brief History of Walkability

Origins of Walkability
A walkable city does not come about overnight. Some of the most walkable cities in the world are the ones that have stood the test of time. Take for example the city of Rome, founded as far back as 753 B.C.E. Its street infrastructure is a system of informal, weathered, and meandering roads that are among the most walkable on earth. To distinguish the successes of Rome’s walkability from others around the globe, it is imperative to trace it back to the city’s origins and the planning it required. Rome is a city built upon the topography of the land – it grew out of seven hills and along the River Tiber, a source of transport and water for the city. The Forma Urbis Romae (Urban Form of Rome), an ancient map of Rome dating to 203-211 A.D., illustrates the building plans within the city as well as the hills and networks of water that grace its landscape. Settlements were consolidated east of the river, within the valleys that dip between the hills. Encompassing the limits of Rome is the Aurelian Wall; completed in 282 A.D., it functioned as a fortress for the city and its inhabitants. With both natural and man-made parameters, the city was able to limit development within a contained land area. As the population increased, Rome grew more and more dense. The decision to build up, rather than tear down for new construction projects, has played a part in the fabrication of a city that is uniquely Rome. Undeterred by development, the roads that wind through the historic city have stubbornly remained compact and narrow. The fluctuation of Roman culture through the millennia has morphed the city into a hybrid of a city of the past and of the twenty-first century. Pressured to move forward (quite literally), the city has adopted a number of transport lines moving through the city (trams and buses) and in and out of it (trains). Learning from Rome, a walkable city explores its parameters, both geophysical and historical, and from there, paths of walkability will unwind themselves.

The Seven Hills of Rome – Rome Today

A More Perfect City: The Beginnings of Suburban Sprawl
A walkable city cannot be every other city on the map, nor can it be drafted and orchestrated with a straightedge and a sweep of a pen. Walkable cities are more complex than that. As old as global cities are, it was not until the turn of the twentieth century that urban planning was born as a discipline. Over the past century, urbanists have conceptualized grand plans for cities across the globe. Assuming a societal position and a call for agency, urbanists have made both successful and unsuccessful attempts at planning holistic cities. Take, for instance, Le Corbusier’s Voisin Plan for Paris (1925) – simply put, the concept was to lift auto-centric streets and pedestrian sidewalks above the ground. Le Corbusier envisioned a vertical city with offices in the air and residents elevated stories high. In retrospect, we can make an educated guess that this sort of plan would not work in terms of walkability. Divorcing the horizontal streets from the vertical city foretold one of the most shape-shifting pieces of infrastructure that would change the American landscape.

Voisin Plan for Paris

In 1961, critic Jane Jacobs released her widely renowned book, The Death and Life of Great American Cities. During this period, America was experiencing a transition in urban culture – from industrial and economically prosperous city centers to quaint and picturesque suburban towns. The 1950s marked the beginning of the post-World War II era of leisure and recreation. Before the war ended, nearly 70 percent of Americans lived in metropolitan cities; by the 1990s, only 40 percent still did. It was during this period that cars became a comfort, and from suburbia grew segregation. One city affected by suburban sprawl was St. Louis in the 1960s: the white population fled to southern suburban counties while much of the black population remained in the urban north of St. Louis. Part of the reason for this was the popularization of car culture. Many suburban dwellers could afford to commute to work; money was at their disposal, and suburban towns offered activities that included drive-in theaters, restaurants, hotels, and banks. With developments such as these occurring in areas of lower density, the question becomes – what price were people willing to pay to remove themselves from the once-booming city that St. Louis had been?
Jane Jacobs felt compelled to write about such cities and why they must not be devalued through neglect. Cities can be powerful forces that captivate the interests of many, but for a city to do so, it has to have its participants. For instance, Jacobs begins by introducing the role of sidewalks as a stage for a number of activities, all of which must be realized in order for a sidewalk to function properly. The realms of sidewalk functionality are safety, the assimilation of children, and contact. To be safe is to have eyes on the path; the actors include the walker and the watcher. How would this work in suburbia, where neighborhoods are focused inward rather than outward toward the streets? People must frequent the sidewalks; sidewalks need users. Which brings us to the next realm – assimilating children. Parks are pockets along sidewalks, when really the streets themselves should be the parks. In order for streets to be safe enough for young children, they must not be devoted to car usage alone. With minimal car activity, alternate activities can occur within the streetscape that facilitate contact and exchange between pedestrians.

The Theory of Walkability
Fast forward half a century and enter Jeff Speck’s Walkable City: How Downtown Can Save America, One Step at a Time (2012). Following Jacobs’ publication, Speck has expressed the urgency for planners to sell the idea of walkability to the rest of the city. His General Theory of Walkability rests on four main conditions – a walk should be useful, safe, comfortable, and interesting: useful in that it serves as a way for inhabitants to move through a city and reach everyday conveniences; safe in that sidewalks are free from the dangers of automobiles; comfortable in that the outdoors become livable; interesting enough that the architecture and pedestrian interactions convey a sense of humanity and authenticity. Speck sees the potential for many cities across the nation to adopt walkability as the solution to today’s pressing issues of health and environmentalism. According to data collected by Walk Score, the ten most walkable cities today are as follows: New York, San Francisco, Boston, Philadelphia, Miami, Chicago, Washington D.C., Seattle, Oakland, and Long Beach. Learning from the successes of these walkable cities, it is clear that plenty of opportunity lies in many other great American cities. The only tricky part is achieving walkability and convincing a population stuck in car culture to rediscover their downtowns and historic centers.

Works Cited

Bliss, Laura. “The Trade-Offs of Suburban Sprawl Have Been Plain for 50 Years.” CityLab. The Atlantic, 10 Dec. 2015. Web.
Jacobs, Jane. The Death and Life of Great American Cities. New York: Random House, 1961. Print.
Le Corbusier. “Plan Voisin, Paris, France, 1925.” Fondation Le Corbusier. Fondation Le Corbusier, n.d. Web.
Lewyn, Michael E. “Suburban Sprawl: Not Just an Environmental Issue.” Marquette Law Review 84.2 (2000): 301-82. Web.
Speck, Jeff. Walkable City: How Downtown Can Save America, One Step at a Time. New York: D&M, 2012. Print.

Swine Production: The Basics

In order to give a clear idea of why antibiotic use in swine production is important, it may be necessary to give a general description of a production swine operation. For simplicity, we will look only at operations that produce swine for slaughter at the level of a small-scale farrow-to-feeder enterprise, not at show operations. In the farrow-to-feeder approach, sows are bred and the piglets are then sold when they reach a weight of roughly 30-60 lbs. This is also about the weight range at which show swine are bought.

The facilities necessary for this type of operation may include a well-insulated barn with pens constructed of metal and concrete flooring. Insulation is important because the barn can be neither too hot nor too cold. Piglets require temperatures of about 85-90°F in order to keep warm; their temperature requirements decrease to about 55-70°F as they mature. The producer may choose to use heat lamps to keep piglets warm, but this can be dangerous: bedding such as straw or shavings is highly flammable, so precautions must be taken to lower fire risks. In hot weather it is important to keep the swine cool, because they do not have functioning sweat glands; to cool off they like to wallow in mud and water, and for an indoor facility fans may be necessary. The producer may choose to install an automatic watering system or use water barrels. As for feeding, the producer will need a safe place to store feed, buckets, and feeders. Preferably, the feeders will be made of hard, thick plastic or metal so the pigs will not break them as easily. A farrowing crate will be helpful when the sow is farrowing her piglets, as it helps prevent her from crushing them.

There are additional measures that should be taken to protect the health and management of the herd. It may be important to have a pair of barn-designated boots to lower the risk of tracking in illness, and if visitors walk into the barns, disposable boots and disinfectant are a worthwhile additional biosecurity measure. Herd management is facilitated through the use of an ear notching system: the notches provide identification for record-keeping purposes, with the right ear carrying the litter number and the left ear the number of the pig within the litter. Most processors will not accept hogs from a producer who is not Pork Quality Assurance certified. Producers with this certification are educated in good production practices such as having an appropriate relationship with a veterinarian, a health management plan, proper swine care, and the correct use of antibiotics.

The process of the farrow-to-feeder enterprise can be explained simply. First, the producer needs the proper facilities, then sows or gilts and boars or semen for breeding. Only healthy swine should be bred. The sow can be artificially inseminated by the producer or live bred. The gestation period will be approximately 3 months, 3 weeks, and 3 days. When the sow nears her due date, she is placed in a farrowing crate; these crates allow the sow little movement but help keep the piglets safe. She then farrows her litter and raises them until they are weaned at approximately 4 weeks old. After birth, the piglets should be given an iron injection, castrated, tail-docked, have their needle teeth clipped, and be ear notched. Although these procedures are not strictly necessary, they are done for the long-term welfare of the animal. After weaning, the piglets are separated from their mother and put on a starter feed that they will be raised on until they are 30-60 lbs. The piglets are then sold to a processor, a feeder-to-finish operation, or the public to be raised until maturity and slaughter. Throughout this entire process, the health of the swine may be maintained through the correct use of antibiotics.

The history of antibiotic use in production livestock in the United States dates back to the 1940s.[1] The use of antibiotics has helped in the prevention, control, and treatment of illness in production animals. As the population of the United States continues to increase, so does the demand for food, and the rational use of antibiotics plays a vital role in improving feed efficiency, growth, and the control of illness. It is important to note that when antibiotics are used irresponsibly, the risks of contamination and antibiotic resistance rise. A producer should always contact a veterinarian before administration.

By using antibiotics, producers can improve nutrition, disease prevention, and metabolic performance in swine.[2] Feeding or administering antibiotics can improve nutrient absorption and suppress illness in the hogs’ environment, and it reduces mortality and morbidity rates in piglets.[3]

The United Kingdom was among the first to take precautionary measures, restricting the use of growth-promoting antibiotics in 1973.[4] The fear was that these medicines would promote microbes capable of resisting antibiotics used in human medicine, and that such microbes could pass from swine to humans through handling or consumption. Sweden, Norway, Finland, and Denmark also imposed such bans.[5] In 2006, a European Union-wide ban on growth promoters went into effect. However, a 1999 study by the National Research Council stated that these antibiotics pose a low risk to humans when used properly. Currently, hog production in the United States is under increasing scrutiny from the Food and Drug Administration and public interest groups seeking to ban the use of antibiotics.


McBride, William D., Nigel Key, and Kenneth Mathews. “Sub-therapeutic Antibiotics and Productivity in U.S. Hog Production.” July 23-26, 2006. Accessed October 12, 2016.

Hao, Haihong. “Benefits and Risks of Antimicrobial Use in Food-Producing Animals.” NCBI. January 12, 2014. Accessed October 12, 2016.

[1] Hao, Haihong. “Benefits and Risks of Antimicrobial Use in Food-Producing Animals.” NCBI. January 12, 2014. Accessed October 12, 2016.

[2] McBride, William D., Nigel Key, and Kenneth Mathews. “Sub-therapeutic Antibiotics and Productivity in U.S. Hog Production.” July 23-26, 2006. Accessed October 12, 2016.

[3] “Benefits and Risks of Antimicrobial Use in Food-Producing Animals.”

[4] “Benefits and Risks of Antimicrobial Use in Food-Producing Animals.”

[5] “Sub-therapeutic Antibiotics and Productivity in U.S. Hog Production.”

Plastic to Stainless Steel: A History of Plastic Water Bottles

Water is essential to living, but paying for bottled water is not. Ways of carrying water from one location to another have been an issue since the days of the cavemen. Before plastic water bottles, people used horns, animal skins, mud, and clay to carry their water. Dating back to 5000 B.C., ancient people used porcelain because it was strong enough to hold gallons and gallons of water, and the pots would not leak the way animal skins or horns did. Beginning around 1600 B.C., glass became the newest way of holding water and other liquids, and as time went on glass slowly replaced porcelain altogether. Glass bottles held everything from water to doctors’ pills. Access to fresh, clean drinking water is a necessity, and without it humans suffer tremendously.

The first known use of water bottles dates back to Boston’s Jackson’s Spa in 1767. From that point forward, the U.S. became infatuated with the idea of taking water along wherever people went. The first plastic (PVC) was identified in 1838, and the plastic water bottle fad later skyrocketed. Back in 1856, companies like Saratoga Springs sold seven million water bottles in a single year. This fad of consuming mass quantities of bottled water has only increased since that first production in 1767. The invention of the glass blowing machine in 1900 is what took water bottle consumption from a new trend to an overwhelming habit, not only in the U.S. but around the world. Glass blowing machines made production simpler and more efficient, giving companies the upper hand in producing mass quantities of water bottles on a daily basis. Water bottle companies like Saratoga Springs faced hard times when chlorinated water filtration systems for homes became available in 1913; chlorination provided households with safe, clean, and easily accessible water right from their faucets. But households did not stray from the water bottle industry for long. In a matter of months, the water bottle was back, booming, and selling more bottles than ever before.

Plastic water bottles all boil down to one thing: oil. In 2008, 15 million barrels of oil were used to produce enough water bottles for the United States alone. This number is outrageous. Can you think of what we could use all that oil for if we weren’t using it for plastics? Essentially, we are paying for water. Why would we continue to do this when we can buy a reusable bottle like a S’well bottle and continuously fill it up at the purified water stations we see all over campus? Water bottle manufacturers are making a killing on individuals (like myself, from time to time) who continue to buy massive packs of water bottles. It takes a company like S’well to stir up change and help us kick this plastic water bottle habit for good.

I have linked a video showing the story of water bottles and their devastating effects on our society.

Sources Cited:

“The History of Water Bottles and Bottled Water.” (2015). Retrieved October 10, 2016.

Summarized Tribal Sovereignty


To start from the beginning, it seems fitting to discuss the history of the Native Americans and the US government, especially since it is right around Columbus Day. Native American tribes had inhabited the “land of the free and home of the brave” long before Christopher Columbus ever set foot on the North American continent, and yet he is given credit for “discovering” America. When the United States was still just an angry mob of men who hated taxes, the tribes had developed relationships with the different groups of settlers that had taken over the New England region and traded with them. Eventually the Revolutionary War took place, America gained its independence from the evil British, and from there the Native American tribes had to deal with the newly formed US government. Often forgotten is how the expansion of the United States toward the western territories led to the tribes being displaced from land they resided on but did not believe belonged to any person. Then came the Indian Removal Act, endorsed by President Andrew Jackson, which essentially gave the tribal lands to settlers and offered the Native Americans some land west of the Mississippi River. This relocation is essentially the beginning of the long line of broken promises and assimilation tactics the US used to try to persuade the Native Americans that the “protection” of the United States was worth losing their land and culture for. Over the years, many battles and injustices against the Native Americans took place, including the Trail of Tears, Wounded Knee, and many other instances where innocent Native lives were lost to the settler expansion agenda. After the last tribes had either been forcibly removed or assimilated into United States society, all that remained of their sovereign land were the small, disconnected reservations spread across the US. These reservations are considered sovereign nations that reside within the US and must abide by federal laws. These “federally recognized tribes are recognized as possessing certain inherent rights of self-government (i.e., tribal sovereignty) and are entitled to receive certain federal benefits, services, and protections because of their special relationship with the United States.” In addition, there are a number of treaties in place to ensure that the tribes get their fair share from the United States. However, laws like the Indian Mineral Leasing Act of 1938 gave non-Natives the ability to mine, and eventually drill, on Native lands if they were the highest bidder. “In 1982 Congress enacted the Indian Mineral Development Act, authorizing Indian tribes to enter into forms of agreements for oil and gas development in addition to leases under the IMLA.” The tribes have a choice when it comes to regulating the environmental harms that come from developing oil and gas on their lands, but if they choose not to provide any provisions, then the EPA implements the standards it would for a state. The tribes that live on reservations have difficulty regulating the drilling that takes place on their lands due to the complex web of jurisdictions that has formed between the federal government and a sovereign Native government. This leads to undeniable environmental harm when techniques like fracking become the new way for the US to stay competitive in oil and gas production.
Although many tribes have deep roots when it comes to their connection with nature and their spirituality, they do not have many options other than to lease out the land and receive a generous profit from doing so. In 2013, a reported $971 million was paid to various tribes for the oil and natural gas extracted from their lands. Often we hear stories about third-world countries that have to deal with ecological devastation and insane amounts of pollution in exchange for desperately needed income, but it is happening within the borders of the US on these sovereign Native reservations. Many factors have led to the decline in quality of living on some of these reservations, but the dealings with the US federal government are mostly responsible for why Native Americans now allow drilling on lands that were once sacred to them.
