air pollution, bottled drinking water safety, corruption in the US, drinking water safety, eco-tech, FDA, food poisoning, health care industry, health care insurance, health care reform, medical system reform, nuclear contamination, nuclear industries, pollution, US Congress, US government policy
And the first question is: which two-faced lying bastards? Because there are too many to choose from –
the list is long.
The politicians who claim they want good health reform while allowing food producers to keep turning out poisoned, tainted and filthy foods far too often for me to even want to consider.
The agencies of the government that were paid to protect us, that were given the best of resources, education and manpower to do so, who have not done their jobs and then covered their involvement with rhetoric and political positioning, and by “framing” studies and statistics to favor themselves rather than show the truth.
The lobbyists . . .
The insurance companies that have literally destroyed the health care system in America who are about to be given a free ticket to 8000% profits because for some reason their 400% profits aren’t enough as it is.
The corporate executives in the United States who have managed to rob every last dime and dollar from their companies, every asset, every future profit, every raise from the employees actually doing their jobs and to thieve every last resource of the American people on top of it.
The corporate businesses with no conscience, no ethics, no morality, no common sense and no concept of accountability from large pharmaceutical companies to manufacturers of all kinds to logging companies and mining companies and petroleum corporate giants to Wall Street to bankers and on and on and on . . .
The thousands of supposedly unbiased and objective educated professionals who have let politics decide rather than any objective standards.
And, the layers upon layers of agencies, non-profits, associations, organizations, government entities and others who were entrusted to be ethical, decent, conscientious, honorable, honest, truthful, objective, educated, reasonable, sane, sober and use common sense who did whatever would continue bringing them money whether it was right or wrong.
And, the list goes on longer than I am tall . . .
But, what brought these things to my mind? Was it watching the “Triple Cross” show on the History Channel or National Geographic or whatever channel I was watching, which showed the complete incompetence of our government agencies – the State Department, the FBI, the CIA, the military, the immigration department and the police – at getting anything right? Was it that, if bin Laden’s spy in the US had been in a car with a broken taillight, he might have been caught like any of the rest of us – but short of that, he could do anything, get away with anything, and never be stopped?
Was it the knowledge that the CDC put out a story through the UPI group expressing the number of foodborne illnesses from 2006 as far lower than the real numbers that were reported – by “framing” the parameters of what they studied – and then releasing that story to the public as if the situation was, and is, better than it actually is?
ATLANTA, June 11 (UPI) — There were 1,270 reported U.S. food borne disease outbreaks in 2006, resulting in 27,634 illnesses and 11 deaths, federal officials said.
Analysis was done on data from the 243 outbreaks in which a single food commodity was identified and reported to CDC. Twenty-one percent of all outbreak-associated cases involved poultry, 17 percent involved leafy vegetables and 16 percent involved fruits or nuts.
from a story with this title (making it look like that is all there is):
CDC: 1,270 U.S. food borne outbreaks
Published: June 11, 2009 at 2:17 PM
Was it seeing the story about the “pay czar” approving the AIG CEO to get another $10.5 million and reading in the article his quote –
In a meeting with employees, Benmosche reportedly said, “The money is about what I am worth, and what my job is worth to be your leader. And that sets the tone for all of you in this room.” from UPI story, 10-02-09
Or was it seeing a timeline of the space race in chronological order? I started looking at the linked information and realized that after eight colleges, thousands of library books, hundreds of hours online researching and a family involved in aerospace work, I had never, ever seen anything about most of the events on the list . . .
Timeline of planetary exploration by date of launch.
like this one –
For people who check facts
Zond 7 (Soyuz 7K-L1)
Zond 7, a member of the Soviet Union’s Zond program and the only truly successful test of the Soyuz 7K-L1, was launched towards the Moon from a mother spacecraft (69-067B) on a mission of further studies of the Moon and circumlunar space, to obtain color photography of Earth and the Moon from varying distances, and to flight test the spacecraft systems. Earth photos were obtained on August 9, 1969. On August 11, 1969, the spacecraft flew past the Moon at a distance of 1984.6 km and conducted two picture taking sessions. Zond 7 reentered Earth’s atmosphere on August 14, 1969, and achieved a soft landing in a preset region south of Kustanai.
* Launch Date/Time: 1969-08-07 at 23:48:06 UTC
* On-orbit dry mass: 5979 kg
This article was originally based on material from NASA (NSSDC) information on Zond 7
This is the “Zond 7” reference article from the English Wikipedia. All text is available under the terms of the GNU Free Documentation License. See also our Disclaimer.
(And this one, which is disturbing for the attitude it has about exposing millions of people to radioactive fallout – )
The Nuclear pulse propulsion reference article from the English Wikipedia on 24-Jul-2004
(provided by Fixed Reference: snapshots of Wikipedia from wikipedia.org)
Nuclear pulse propulsion
An artist’s conception of a spacecraft powered by nuclear pulse propulsion
Nuclear pulse propulsion (or External Pulsed Plasma Propulsion, as it is termed in recent NASA documents) is a proposed method of spacecraft propulsion that uses nuclear explosions for thrust. It was briefly developed as Project Orion by ARPA. It was invented by Stanislaw Ulam in 1957, and is the invention of which he was most proud.
Calculations show that this form of rocket would combine both high thrust and a high specific impulse, a rarity in rocket design. Specific impulses from 2,000 seconds (easy to achieve, yet ten times that of chemical rockets) to 100,000 seconds (requiring specialized nuclear explosives and spacecraft design) are possible, with thrusts in the millions of tons.
This is possible because Orion uses nuclear power to make thrust without requiring the power to be held within a rocket chamber. Thus, very high temperatures, exhaust velocities and efficiencies are possible. Orion directs the thrust by using directional nuclear explosives, so it achieves reasonable efficiencies without a rocket bell.
An Orion drive is the only known method of performing manned interstellar exploration with current technology. It would be slow, requiring several generations to get to Alpha Centauri (the closest star system to our own), but it would arrive, assuming it had no accidents.
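To put those specific-impulse figures in perspective, here is a small Python sketch using the Tsiolkovsky rocket equation. The 450-second chemical Isp and the mass ratio of 10 are illustrative assumptions of mine, not figures from the studies quoted above:

```python
import math

G0 = 9.80665  # standard gravity, m/s^2


def delta_v(isp_s: float, mass_ratio: float) -> float:
    """Tsiolkovsky rocket equation: total delta-v in m/s for a given
    specific impulse (in seconds) and initial-to-final mass ratio."""
    return isp_s * G0 * math.log(mass_ratio)


# A good chemical rocket: Isp ~ 450 s, mass ratio 10 (assumed values)
chemical = delta_v(450, 10)
# The "easy" low-end Orion design: Isp ~ 2000 s, same mass ratio
orion = delta_v(2000, 10)
print(f"chemical: {chemical / 1000:.1f} km/s, Orion: {orion / 1000:.1f} km/s")
```

With the same mass ratio, the tenfold gain in specific impulse translates directly into ten times the delta-v, which is why even the low-end Orion figure dwarfs chemical rocketry.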
The most likely real application for an Orion craft is to deflect an Earth-crossing asteroid from hitting the Earth. The extreme specific impulse is a major advantage, because it permits the missile to launch late, and still have a hope of arriving in time. Simply hitting the asteroid would be enough to deflect it. A kinetic missile could transfer greater energies than a nuclear explosion, with less risk of breaking up the target. Such craft could be unmanned, and inexpensive (no shock absorbers or shielding), launched from orbits outside the magnetosphere to minimize radioactives in the biosphere.
Carrying through the mass ratios, Orion could be built of steel, without special fittings, and carry crews of hundreds. In 1960, the proposed contractor was Electric Boat, the maker of nuclear submarines.
The design reference model proposed by General Atomics could likely be built today, and land a thousand tons on Mars in several weeks. If reaction mass such as water were gathered from a local moon, the same design could explore the moons of Jupiter or Saturn with a human crew.
A more advanced design was proposed for the Project Daedalus interstellar probe study in the 1970s, using electron beams to detonate pellets of deuterium/helium-3.
In the 1954 explosion at Bikini Atoll, a crucial experiment by Lew Allen proved that nuclear explosives could be used for propulsion. Two graphite-covered steel spheres were suspended near the bomb. After the explosion, they were found intact some distance away, proving that engineered structures could survive a nuclear fireball.
A 1959 report by General Atomics, “Dimensional Study of Orion Type Spaceships,” (Dunne, Dyson and Treshow), GAMD-784 explored the parameters of three different sizes of hypothetical Orion spacecraft:
Ship diameter          17-20 m    40 m          400 m
Ship mass              300 T      1-2000 T      8,000,000 T
Number of bombs        540        1080          1080
Individual bomb mass   0.22 T     0.37-0.75 T   3000 T
The most amazing design to consider is the “super” Orion: at 8 million tons, it could easily be a city. In interviews, the designers contemplated the large ship as a possible interstellar ark. This extreme design was buildable with materials and techniques that could be obtained or anticipated in 1958. The real upper limit is probably larger now.
Most of the three thousand tons of each of the “super” Orion’s propulsion units would be inert material such as polyethylene, or boron salts, used to transmit the force of the propulsion unit’s detonation to the Orion’s pusher plate, and absorb neutrons to minimize fallout. One design proposed by Freeman Dyson for the “Super Orion” called for the pusher plate to be composed of uranium or a largely transuranic element so that upon reaching a nearby star system the plate could be converted to nuclear fuel.
From 1957 through 1964 this information was used to design a spacecraft propulsion system called “Orion” in which nuclear explosives would be thrown through a pusher-plate mounted on the bottom of a spacecraft and exploded underneath. The shock wave and radiation from the detonation would impact against the underside of the pusher plate, giving it a powerful “kick,” and the pusher plate would be mounted on large two-stage shock absorbers which would transmit the acceleration to the rest of the spacecraft in a smoother manner.
Radiation shielding for the crews was thought to be a problem, but on ships that mass more than a thousand tons, the material of the pusher plate is thick enough to shield the crew from the explosives’ radiation, since shielding effectiveness increases exponentially with thickness (see gamma ray for a discussion of shielding).
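The exponential claim is just the standard attenuation law, I = I0 · e^(−μx). A minimal Python sketch follows; the attenuation coefficient used for steel is an illustrative ballpark of mine (roughly right for ~1 MeV gammas), not a value from the article:

```python
import math


def transmitted_fraction(thickness_cm: float, mu_per_cm: float) -> float:
    """Fraction of gamma-ray intensity passing through a shield,
    using simple exponential attenuation: I/I0 = exp(-mu * x)."""
    return math.exp(-mu_per_cm * thickness_cm)


# Assumed linear attenuation coefficient for steel at ~1 MeV (per cm).
MU_STEEL = 0.46

for t_cm in (5, 10, 20):
    frac = transmitted_fraction(t_cm, MU_STEEL)
    print(f"{t_cm:>2} cm of steel transmits {frac:.2%} of the gammas")
```

Each added increment of thickness cuts the transmitted dose by the same factor, which is why a massive pusher plate doubles so effectively as crew shielding.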
At low altitudes, during take-off, the fallout was extremely dirty, and there was a grave danger of fluidic shrapnel being reflected from the ground. The solution was to use a flat plate of explosives spread over the pusher plate, to get two or three detonations from the ground before going nuclear. This would lift the ship far enough into the air that a focused nuclear blast would avoid harming the ship.
A preliminary design for the explosives was produced. It used a fusion-boosted fission explosive. The explosive was wrapped in a beryllium oxide “channel filler”, which was surrounded by a uranium radiation mirror. The mirror and channel filler opened out to an open end. In the open end, a flat plate of tungsten propellant was placed. The whole thing was wrapped in a can so that it could be handled by machinery scaled-up from a soft-drink vending machine.
At 1 microsecond after ignition, the gamma bomb plasma and neutrons would heat the channel filler, and be somewhat contained by the uranium shell. At 2-3 microseconds, the channel filler would transmit some of the energy to the propellant, which would form a cigar-shaped explosion aimed at the pusher plate.
The plasma would cool to 25,000 °F (14,000 °C) as it traversed the 75 ft (25 m) distance to the pusher plate, and then reheat to 120,000 °F (67,000 °C) as (at about 300 microseconds) it hit the pusher plate and recompressed. This temperature emits ultraviolet, which is poorly transmitted through most plasmas. This helps keep the pusher plate cool. The cigar shape and low density of the plasma reduce the shock to the pusher plate.
The pusher plate’s thickness decreases by about a factor of 6 from the center to the edge, so that the net velocity of the inner and outer parts of the plate are the same, even though the momentum transferred by the plasma increases from the center outwards.
Deep in the atmosphere, gamma scattering might pose a hazard to the crew.
Stability was thought to be a problem, but it turned out that random placement errors of the bombs would cancel out.
A one-meter model using RDX (chemical explosives), called “put-put”, flew a controlled flight for 23 seconds, to a height of 185 feet at Point Loma.
The shock absorber was at first merely a ring-shaped airbag. However, if an explosion should fail, the 1000 ton pusher plate would tear away the airbag on the rebound. A two-stage, detuned shock absorber design proved more workable. On the reference design, the mechanical absorber was tuned to 1/2 the bomb frequency, and the air-bag absorber was tuned to 4.5 times the bomb expulsion frequency.
Another problem was finding a way to push the explosives past the pusher plate fast enough that they would explode 20 to 30m beyond it, and do so every 1.1 seconds. The final reference design used a gas gun to shoot the devices through a hole in the pusher plate.
The expense of the fissionables was thought high, until Ted Taylor proved that with the right designs for explosives, the amount of fissionables used on launch was close to constant for every size of Orion, from 2000 tons to 8,000,000 tons. Smaller ships actually use more fissionables, because they cannot use fusion bombs (though the later Project Daedalus design used fusion explosives detonated by electron beam inertial confinement, which could be scaled down much smaller than self-contained bombs). The large size bombs used more explosives to super-compress the fissionables (reducing the fallout). The extra explosives simply served as propulsion mass. The expense of launch for the largest size of Orion was 5 cents per pound (11 cent/kg) to Earth orbit in 1958 dollars.
Exposure to repeated nuclear blasts raises the problem of ablation (erosion) of the pusher plate. However, calculations and experiments indicate that a steel pusher plate would ablate less than 1 mm if unprotected. If sprayed with an oil, it need not ablate at all. The absorption spectra of carbon and hydrogen minimize heating. The design temperature of the shockwave, 120,000 °F (67,000 °C), emits ultraviolet. Most materials and elements are opaque to ultraviolet, especially at the 50,000 psi (340 MPa) pressures the plate experiences. This prevents the plate from melting or ablating.
One issue that remained unresolved at the conclusion of the project was whether the turbulence created by the combination of the propellant and the ablated pusher plate would dramatically increase the total ablation of the pusher plate. According to Freeman Dyson, while in the 1960s they would have had to actually perform a test with a real nuclear explosive to determine this, with modern simulation technology it could be determined fairly accurately without one.
The unsolved problem for a launch from the surface of the Earth is the nuclear fallout. Freeman Dyson, an early worker on the project, estimated that with conventional nuclear weapons, each launch would cause fatal cancers in ten human beings from the fallout. To keep this in perspective, roughly 600 people die of cancer each year from eating spices.
However, the fallout for the entire launch of a 6000 ton Orion was only equal to a ten-megaton blast, and he was assuming use of weapon-type nuclear explosives.
With special designs of the nuclear explosive, Ted Taylor estimated that it could be reduced ten-fold, or even to zero if a pure fusion explosive could be constructed. However, bomb designers are reluctant to design such an explosive, because it is thought to be destabilizing, and tempting to terrorists. Project Daedalus solved this problem through the use of electron beam inertial confinement, which is not suitable for use in weaponized explosives.
The vehicle and its test program would violate the International test ban treaty as currently written. This could almost certainly be solved, if the fallout problem were solved.
The launch of such a craft from the ground or from low Earth orbit would generate an electromagnetic pulse that could cause significant damage to computers and satellites, as well as flooding the Van Allen belts with high-energy radiation. This problem might be solved by launching from very remote areas. EMP footprints are only a few hundred miles wide. The Earth is well-shielded from the Van Allen belts.
True engineering tests of the vehicle systems were said to be impossible because several thousand nuclear explosions could not be performed in any one place. However, experiments were designed to test pusher plates in nuclear fireballs. Long-term tests of pusher plates could occur in space. Several of these almost flew. The shock-absorber designs could be tested full-scale on Earth using chemical explosives.
Assembling a pulse drive spacecraft in orbit by more conventional means and only activating its main drive at a safer distance would be a less destructive approach. Such a system would be much less efficient than the pure pulse approach, because no chemical rocket could conceivably launch a big enough pusher plate to take full advantage of the thrust of the explosions. Adverse public reaction to any use of nuclear explosives is likely to remain a hindrance even if all practical and legal difficulties are overcome.
The “Medusa” design is a type of nuclear pulse propulsion which shares more in common with solar sails than with conventional rockets. It was proposed in the 1990s. A Medusa spacecraft would deploy a large sail ahead of it, attached by cables, and then launch nuclear explosives forward to detonate between itself and its sail. The sail would be accelerated by the impulse, and the spacecraft would follow.
Medusa performs better than the classical Orion design because its “pusher plate” intercepts more of the bomb’s blast, because its shock-absorber stroke is much longer, and because all its major structures are in tension and hence can be quite lightweight. It also scales down better. Medusa-type ships would be capable of a specific impulse between 50,000 and 100,000 seconds.
The Jan 1993 and June 1994 issues of JBIS have articles on Medusa. (There is also a related paper in the Nov/Dec 2000 issue.)
The Plumbbob Test
A test similar to the test of a pusher plate apparently happened by accident during a series of nuclear containment tests called “Plumbbob” in 1957. A low-yield nuclear explosive accelerated a massive (900 kg) steel capping plate above escape velocity. See the account by the experimental designer, Dr. Robert Brownlee. Although his calculations showed that the plate would reach six times escape velocity, and the plate was never found, he believes that the plate never left the atmosphere. It probably vaporized from friction. The calculated velocity was sufficiently interesting that the crew trained a high-speed camera on the plate, which unfortunately captured it in only one frame. From that frame, Brownlee estimated a lower bound of two times escape velocity.
Appearance in Fiction
A Project Orion spaceship features prominently in the science fiction novel Footfall by Larry Niven and Jerry Pournelle. In the face of an alien siege/invasion of Earth, the humans must resort to drastic measures to get a fighting ship into orbit to face the alien fleet.
* “Project Orion: The True Story of the Atomic Spaceship”, George Dyson, 2002, ISBN 0805072845
* “Nuclear Pulse Propulsion (Project Orion) Technical Summary Report” RTD-TDR-63-3006 (1963-1964); GA-4805 Vol. 1, Reference Vehicle Design Study, Vol. 2, Interaction Effects, Vol. 3, Pulse Systems, Vol. 4, Experimental Structural Response. (From the National Technical Information Service, U.S.A.)
* “Nuclear Pulse Propulsion (Project Orion) Technical Summary Report” 1 July 1963- 30 June 1964, WL-TDR-64-93; GA-5386 Vol. 1, Summary Report, Vol. 2, Theoretical and Experimental Physics, Vol. 3, Engine Design, Analysis and Development Techniques, Vol. 4, Engineering Experimental Tests. (From the National Technical Information Service, U.S.A.)
spacecraft propulsion, nuclear weapon
This is the “Nuclear pulse propulsion” reference article from the English Wikipedia. All text is available under the terms of the GNU Free Documentation License. See also our Disclaimer.
Did the “two-faced bastards” thought come to me because I had been researching the failing dams or from listening to the jackasses in Congress drop an amendment to the health care reform bill that would have prevented insurance companies from charging any more than double on premiums to women and people over 50 years old – when right now they are charging many times more based on age and gender? I think that was it . . .
First, legislators have no right to force every American by law to purchase something – anything – from a private company or industry that profits from it. That isn’t American; it’s something else.
And, second of all – the discrimination of charging from five times to twenty-eight times more for insurance premiums based on age or gender is a practice in violation of the Federal Code, of every equal opportunity law, every fairness and fair practices doctrine, and every guarantee against discrimination based on age, gender, nationality, religion and disability written into the law and the Constitution.
And, third of all, I watched the Republican who argued for the insurance companies’ discrimination based on age and gender, since they would no longer be allowed to discriminate based on previous health or the lack thereof. Instead of honoring his oath of office to protect us against discrimination and to support the equality, democracy, rights and freedoms of our Constitution, he was arguing to deny those things in support of the interests of the insurance corporate giants, as if he worked for them. That man has no business in the halls of Congress representing anyone.
federal mandate backed by the full capacity of the enforcement regime
Senate Finance Committee Health Care Markup
The Chairman’s Mark requires every American to buy health insurance coverage
Debate in the Senate Finance Cmte. has concluded for the week. Today, they discussed and voted on amendments to the legislation. Chairman Sen. Max Baucus (D-MT), announced that when the committee returns on Tuesday they will bring up several amendments which include a public option.
Washington, DC : 2 hr.
Discrimination Rating System – against elderly and women
John Kerry amendment
withdrawn – the actual hearing can be seen on the link below –
31:52 and thereabouts – Senator Kerry withdrew the amendment so there will be nothing to hinder the insurance companies from discriminating based on age and gender to whatever degree they choose
09 – 25 – 09
I think that was when my level of disgust finally peaked to be able to form a question at all. And, the question is this –
Does Washington, Congress and their business friends just want all of our money – 100% of our incomes sent to them and then they can decide if we deserve a couple percent back to live on? Is that it? Haven’t the insurance companies driven the prices up to where they are today and allowed the nastiest, unfit, unhealthy health system in the free world to be paid to stay that way? Isn’t it their profit based business model that created that problem in the first place? And, isn’t it “Cinderella thinking” that by any plan favoring the insurance companies things will ever be different? How could that even begin to work?
And, then I think about the ways our government has already spent our money, and it pisses me off that educated, professional, highly paid people with all those resources at their disposal would do it that way.
Like this –
E. coli’s path exposes safety lapses
Published: Oct. 4, 2009 at 12:06 PM
WASHINGTON, Oct. 4 (UPI) — A virulent strain of E. coli bacteria known as O157:H7 still sickens thousands of Americans each year despite regulations, The New York Times reported Sunday.
The newspaper said that through interviews and government and corporate records it traced the path of a batch of hamburger that sickened a 22-year-old Minnesota woman in 2007, and found that regulatory safeguards meant to prevent food contamination are not what consumers have been led to believe.
The Times said the frozen hamburgers the woman ate were made by the food giant Cargill; confidential grinding logs and other company records showed they were made from a mix of trimmings and a mash-like product from slaughterhouses in Nebraska, Texas and Uruguay. A South Dakota company that processes fatty trimmings treated them with ammonia to kill bacteria.
Using the combination of sources reportedly allowed Cargill to spend about 25 percent less than it would have for cuts of whole meat. Despite the low-grade ingredients, Cargill, like most meat companies, relies on suppliers to check for the bacteria.
The Times said food scientists contend that federal guidelines urging consumers to cook meat thoroughly and to wash up afterward are not sufficient to kill the O157:H7 bacteria.
And this –
Government wants cheese plant closed
Published: July 7, 2009 at 11:43 PM
NEW YORK, July 7 (UPI) — Federal authorities say they have gone to court to stop a New York cheese company from making and distributing its products, alleging contamination problems.
The U.S. Justice Department, acting for the U.S. Food and Drug Administration, filed a complaint in U.S. District Court alleging Peregrina Cheese Inc. has a history of operating under insanitary conditions. The government alleges cheese produced by Peregrina is contaminated with Listeria monocytogenes, a Justice Department release Tuesday said.
Listeria is a food-borne pathogen. It can cause serious illness and has been known to cause death in young children and those who are frail or elderly.
The release alleges FDA investigators found Listeria monocytogenes within the factory and in finished cheese products on numerous occasions since 2004. It also contends New York State Department of Agriculture and Markets conducted laboratory testing which found Listeria in Peregrina products.
(it’s 2009 – and they’ve profited since 2004 without restraint, knowing they were producing something tainted that could kill people. What is the FDA – the food and drug producers’ protection, legal representation and anti-liability agency?)
(And this one – )
from 2008 – 2009
As of Friday night, 474 people had been reported infected by a salmonella outbreak linked to peanut butter by public health authorities in 43 of the 50 U.S. states, the Centers for Disease Control and Prevention said.
The very young, elderly and immuno-compromised were the most severely affected, he said in the teleconference. The reported illnesses began in September and 21 cases were reported on Friday.
Results for “peanut butter deaths”
By Alex Nussbaum
Jan. 30 (Bloomberg) — The U.S. has opened a criminal investigation of the peanut butter manufacturer tied to a salmonella outbreak that has sickened 529 people and killed eight, White House and health officials said.
White House spokesman Robert Gibbs confirmed the investigation during a briefing with journalists, when asked about an Associated Press report that the FDA knew in April about a shipment of peanuts from the plant containing pieces of metal and never tested by inspectors. Agency records also found that an outside lab uncovered salmonella at the plant as recently as last year, AP reported. A second round of testing by a different company turned up negative for salmonella, the news agency said.
“I think the revelations have no doubt been alarming, that whether it was our own regulatory system or a company that repeatedly found salmonella in its own testing would continue to ship out that product is beyond disturbing for millions of parents,” Gibbs said.
Peanut Corp. shipped crackers and other foods from the plant after tests on a dozen occasions in 2007 and 2008 showed salmonella, the FDA and U.S. Centers for Disease Control and Prevention said.
Last Updated: January 30, 2009 17:47 EST
(And this one – which is a real winner – )
We decided to collect data ourselves and began by surveying the nation’s 50 largest cities, along with the nation’s largest water providers, which added another dozen major utilities to our list. We also called on at least one smaller community water provider in each of the 50 states. Even though the AP has reporters in every state to whom such calls could have been assigned, the three PharmaWater reporters divvied them up along with the e-mails. We wanted to be absolutely certain that our questions, and the answers, were apples and apples.
Some of our initial interviews left us unable to confirm even that the water in specific cities had been tested. From there, deeper reporting problems emerged: As with industry association folks, several local water utilities and city governments acknowledged they had tested the water but would not reveal results. In some places, officials wouldn’t speak to us at all. Repeated calls were met with repeated brushbacks. For example, New York City water officials declined repeated requests for an interview and waited more than three months before participating in the AP survey, supplying information only after being informed that every other major city in the nation had cooperated. We shamed them into talking to us.
Even before New York City officials reluctantly spoke with us, Donn had discovered that the New York state health department and the U.S. Geological Survey had detected heart medicine, infection fighters, estrogen, anticonvulsants, a mood stabilizer, and the active ingredient in an antianxiety medication in the city’s watershed upstate. Ultimately, the city’s Department of Environmental Protection informed us that it does not test its downstate drinking water.
In Emporia, Kansas, Ron Rhodes, the city’s water treatment plant supervisor, explained why he wouldn’t disclose whether his community’s source water or drinking water had been tested for pharmaceuticals. “Well, it’s because of 9/11,” he said. “We want everybody to guess.” When we asked how it would endanger anyone if the public knew whether Emporia’s water has been screened for minute concentrations of pharmaceutical compounds, he replied, “We’re not putting out more information than we have to put out. How about that?”
In conversations with other water officials, we heard much the same. Philadelphia officials balked at first, then relented, but not before a city water department official declared: “It would be irresponsible to communicate to the public about this issue, as doing so would only generate questions that scientific research has not yet answered. We don’t want to create the perception where people would be alarmed.”
Security-conscious officials in Arlington, Texas, gave us information in drips and drabs. First, they said they’d detected drugs in the city’s source waters but wouldn’t say which ones, or in what amounts, or whether any such drugs had survived the treatment process. Next, the mayor told us a trace amount of one pharmaceutical had survived the treatment process and had been detected in drinking water. He declined to name the drug, saying identifying it could prompt a terrorist to intentionally release more of it, causing significant harm to residents.
Three months later, after we’d filed public records requests—and after assurances from the Texas Attorney General that the terrorism concerns were not well founded—the secret was revealed: Drinking water in Arlington had tested positive for the antianxiety medication meprobamate. The public announcement was made in June 2008; the water samples had been taken in October 2006.
- At least 46 million Americans consume water contaminated with prescription and over-the-counter drugs. That number is no doubt a gross undercount—most cities and water suppliers do not test. (Our first series, published in March 2008, had tallied 41 million; a follow-up survey six months later added five million.) This year we are comparing the communities in our results against a new research project, which will probably document an additional 10 million or so.
- In the wild, scientists have found reason to blame pharmaceuticals in the water for severe reproductive problems in many types of fish—razorback suckers and male fathead minnows with lower sperm counts, male carp now called feminized fish, and female fish developing male genital organs. There are problems with other wildlife: kidney failure in vultures, impaired reproduction in mussels, and inhibited growth in algae.
- In the laboratory, there are growing indications that small amounts of medication have affected human embryonic kidney cells, human blood cells, and human breast cancer cells. The cancer cells proliferated too quickly, the kidney cells grew too slowly, and the blood cells showed biological activity associated with inflammation.
- Our follow-up series, in September 2008, revealed that hospitals and long-term care facilities annually dump an estimated 250 million pounds of unused or outdated pharmaceuticals and contaminated packaging. Again, we had to gather this data and do the calculations; up to half of that total could be the drugs themselves.
“The richest 1% in the US have more wealth than the lowest 95% combined.”
From Larry King Live with Michael Moore – promoting his movie – “Capitalism: A Love Story”
“There’s a foreclosure filing in America, once every 7 and a half seconds.”
My Note –
I have always wondered: where would we be today if health care had been done differently? If doctors had made the same amount of money whether they saw 40 people once a year or saw those 40 people 8,000 times a year – would they have helped people be healthy, get healthy, and stay healthy so they wouldn’t have had to see them more than once or twice? Honestly, if hospitals and doctors were paid once per person per year, regardless of how much or how little they did, and if each were responsible for keeping that person healthy in order to get that money – wouldn’t they do things differently?
If pharmaceutical companies had to give back all the profits from drugs deemed dangerous and found to have caused damage to people, plus fines and penalties on top of that, wouldn’t they stop producing drugs manufactured in ways known to cause unnecessary side effects and health dangers? When they are fined three or four million dollars over something that has brought them $5 billion in profits, there isn’t any reason to change the choices they are making – especially considering they continue to sell drugs that have been found to cause permanent damage to people’s health and quality of life.
Even if 90% of the citizens of the United States disagree with what they are doing, the Senators and Representatives in Washington, D.C. will do it anyway. There are no American citizens at the table with them. We are not invited. Our interests and concerns are ignored. We are of no importance to them except as a portion of the cattle to be taxed and routed through their profit-driven investments to be robbed of our money while giving us little or nothing in return.
Our voices don’t count for anything, as evidenced by the actions of the Bush administration and its agencies’ policy applications, and now by the disdain Congressional members have shown for the outcry against mandated health insurance while doing nothing to actually reform a pathetic and dangerous health care system.
The CNN and NPR Interns Incident
In the 1990s it came to light that soldiers from the 4th Psychological Operations Group had been interning at the American news networks Cable News Network (CNN) and National Public Radio (NPR). The Army described the program as an attempt to provide its PSYOP personnel with the expertise developed by the private sector under its Training with Industry program. The program caused concern about the influence these soldiers might have on American news, and the placements were terminated.
National Public Radio reported on April 10, 2000:
The U.S. Army’s Psychological Operations unit placed interns at CNN and NPR in 1998 and 1999. The placements at CNN were reported in the European press in February of this year and the program was terminated. The NPR placements will be reported this week in TV Guide. 
[from – ]
Perception management is a term originated by the U. S. military. The U. S. Department of Defense (DOD) gives this definition:
Actions to convey and/or deny selected information and indicators to foreign audiences to influence their emotions, motives, and objective reasoning as well as to intelligence systems and leaders at all levels to influence official estimates, ultimately resulting in foreign behaviors and official actions favorable to the originator’s objectives. In various ways, perception management combines truth projection, operations security, cover and deception, and psychological operations.
The phrase “perception management” has often functioned as a “euphemism” for “an aspect of information warfare.” A scholar in the field notes a distinction between “perception management” and public diplomacy, which “does not, as a rule, involve falsehood and deception, whereas these are important ingredients of perception management; the purpose is to get the other side to believe what one wishes it to believe, whatever the truth may be.”
More recently, the U.S. government has used perception management techniques to promote the belief that weapons of mass destruction were indeed being manufactured in Iraq, and that Iraq had aided and assisted the al-Qaeda terrorists responsible for the September 11, 2001 attacks on the World Trade Center. These “facts” were, in part, the government’s justification for invading Iraq and beginning the war. A man named John Rendon has been very influential in creating the conditions necessary to justify the war in Iraq. Rendon’s firm, the Rendon Group, has had close ties with the U.S. government ever since 1991, when the CIA hired the firm to help “create the conditions for the removal of Hussein from power.”
Perception management includes all actions used to influence the attitudes and objective reasoning of foreign audiences and consists of Public Diplomacy, Psychological Operations (PSYOPS), Public Information, Deception and Covert Action. The Department of Defense describes “perception management” as a type of psychological operation. It is supposed to be directed at foreign audiences, and involves providing or discarding information to influence their emotions, motives, and objective reasoning in a way that is favorable to the originator of the information. The main goal is to influence friends and enemies, provoking them to engage in the behavior that you want. DOD sums it up: “Perception management combines truth projection, operations security, cover and deception, and psychological operations.” 
The U.S. government already has checks in place to dissuade perception management conducted by the state toward domestic populations, such as the Smith-Mundt Act of 1948, which “forbids the domestic dissemination of U.S. Government authored or developed propaganda… deliberately designed to influence public opinion or policy.”
(but they did it anyway – my note)
Perception management can be used as a propaganda strategy for controlling how people view political events. This practice was refined by U.S. intelligence services as they tried to manipulate foreign populations, but it eventually made its way into domestic U.S. politics as a tool to manipulate post-Vietnam-War-era public opinion. For example, in the early 1980s, the Reagan administration saw the “Vietnam Syndrome” (a reluctance to commit military forces abroad) as a strategic threat to its Cold War policies. This caused the administration to launch an extraordinary effort to change people’s perception of foreign events, essentially by exaggerating threats from abroad and demonizing selected foreign leaders. The strategy proved to be very successful.
Beginning in the 1950s, more than 800 news and public information organizations and individuals carried out assignments to manage the public’s perception for the CIA, according to the New York Times. By the mid-80s, CIA Director William Casey had taken the practice to the next level: an organized, covert “public diplomacy” apparatus designed to sell a “new product” — Central America — while stoking fear of communism, the Sandinistas, Libyan leader Muammar Qaddafi, and anyone else considered an adversary during the Ronald Reagan presidential administration. Sometimes it involved so-called “white propaganda,” stories and op-eds secretly financed by the government. But they also went “black,” pushing false story lines, such as how the Sandinistas were actually anti-Semitic drug dealers. That campaign included altered photos and blatant disinformation dispersed by public officials as high as the president himself.
The Trans-Alaska Pipeline Authorization Act is a United States federal law signed by Richard Nixon on November 16, 1973 that authorized the building of an oil pipeline connecting the North Slope of Alaska to Port Valdez. Specifically, it halted all legal challenges – filed primarily by environmental activists – against the construction of the pipeline.
The act is found in title 43, section 1651 of the United States Code (43 U.S.C. § 1651). Eventually, the Trans-Alaska Pipeline System was built as a result of this act.
The Trans-Alaska Pipeline System (TAPS), includes the Trans-Alaska Pipeline, 11 pump stations, several hundred miles of feeder pipelines, and the Valdez Marine Terminal. It is commonly called the Alaska Pipeline, Trans-Alaska Pipeline, Alyeska Pipeline or The Pipeline (in Alaska), but those terms technically apply only to the 800.302 miles (1,287.961 km) of 48-inch (122 cm) pipe that convey oil from Prudhoe Bay, to Valdez, Alaska.
The pipeline was built between 1974 and 1977 after the 1973 Oil Crisis caused a sharp rise in oil prices in the United States. This rise made exploration of the Prudhoe Bay Oil Field economically feasible. Environmental, legal, and political debates followed the discovery of oil at Prudhoe Bay in 1968, and the pipeline was built only after the oil crisis provoked the passage of legislation designed to remove legal challenges to the project.
The task of building the pipeline had to address a wide range of difficulties, stemming mainly from the extreme cold and the difficult, isolated terrain. This was one of the first large-scale projects to deal with problems caused by permafrost, and special construction techniques had to be developed to cope with the frozen ground. The project attracted tens of thousands of workers to Alaska, causing a boomtown atmosphere in Valdez, Fairbanks, and Anchorage.
The first barrel of oil traveled through the pipeline in 1977, and full-scale production began by the end of the year. Several notable oil-leakage incidents have occurred since, including sabotage, maintenance failures, and holes caused by gunshot. The most significant oil spill associated with the pipeline was caused by the Exxon Valdez, and did not directly involve the pipeline. As of 2009, the pipeline has shipped almost 16 billion barrels (2.5×10⁹ m³) of oil.
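The excerpts above quote oil volumes in US barrels with metric equivalents in parentheses. The conversion behind those parentheticals is straightforward (one US oil barrel is 42 US gallons, about 0.158987 m³); a minimal sketch to sanity-check the quoted figures:

```python
import math

# One US oil barrel = 42 US gallons ≈ 0.158987 cubic meters.
BARREL_M3 = 0.158987

def barrels_to_m3(barrels: float) -> float:
    """Convert US oil barrels to cubic meters."""
    return barrels * BARREL_M3

# ~16 billion barrels shipped through the pipeline as of 2009:
total_m3 = barrels_to_m3(16e9)      # ≈ 2.5 × 10⁹ m³, matching the text
# The 1978 Steele Creek sabotage spill, ~16,000 barrels:
spill_m3 = barrels_to_m3(16_000)    # ≈ 2,500 m³, matching the text
print(f"{total_m3:.3g} m3 total shipped, {spill_m3:.0f} m3 spilled")
```

The same factor reproduces the other parenthetical figures in these excerpts (e.g. 1,152 barrels ≈ 183.2 m³).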
Accordingly, President Warren G. Harding established a series of naval petroleum reserves across the United States. These reserves were areas thought to be rich in oil and set aside for future drilling by the U.S. government. Naval Petroleum Reserve No. 4 was sited in Alaska’s far north, just south of Barrow, and encompassed 23,000,000 acres (93,078 km2).
The petroleum reserve lay dormant until the Second World War provided an impetus to explore new oil prospects. Starting in 1944, the U.S. Navy funded oil exploration near Umiat Mountain, on the Colville River in the Brooks Range. Surveyors from the U.S. Geological Survey spread across the petroleum reserve and worked to determine its extent until 1953, when the Navy suspended funding for the project. The USGS found several small oil fields, most notably the Umiat Oil Field, but none were deemed particularly feasible to develop.
Four years after the Navy suspended its survey, Richfield Oil Corporation (later Atlantic Richfield and ARCO) drilled an enormously successful oil well near the Swanson River in southern Alaska, near Kenai. The resulting Swanson River Oil Field was Alaska’s first commercially producing oil field, and it spurred the exploration and development of many others. By 1965, five oil and 11 natural gas fields had been developed. This success and the previous Navy exploration of its petroleum reserve led petroleum engineers to the conclusion that the area of Alaska north of the Brooks Range surely held large amounts of oil and gas. The problems came from the area’s remoteness and harsh climate. It was estimated that between 200 million and 500 million barrels of oil would have to be recovered to make a North Slope oil field commercially viable.
In 1967, Atlantic Richfield (ARCO) began detailed survey work in the Prudhoe Bay area. By January 1968, reports began circulating that a discovery well had struck natural gas. On March 12, 1968, an Atlantic Richfield drilling crew hit paydirt: a discovery well began flowing at the rate of 1,152 barrels (183.2 m3) of oil per day. On June 25, ARCO announced that a second discovery well was likewise producing oil at a similar rate. Together, the two wells confirmed the existence of the Prudhoe Bay Oil Field. The new field contained more than 25 billion barrels of oil, making it the largest in North America and the 18th largest in the world.
The pipeline has at times been damaged due to sabotage, human error, maintenance failures, and natural disasters. By law, Alyeska is required to report significant oil spills to regulatory authorities. The Exxon Valdez oil spill is the best-known accident involving Alaska oil, but it did not involve the pipeline itself. Following the spill, Alyeska created a rapid response force that is paid for by the oil companies, including ExxonMobil, which was found liable for the spill.
The largest oil spill involving the main pipeline took place on February 15, 1978, when an unknown individual blew a 1-inch (2.54-centimeter) hole in it at Steele Creek, just east of Fairbanks. Approximately 16,000 barrels (2,500 m³) of oil leaked out of the hole before the pipeline was shut down. After more than 21 hours, it was restarted.
The steel pipe is resistant to gunfire and has withstood it on several occasions, but on October 4, 2001, a drunken gunman named Daniel Carson Lewis shot a hole into a weld near Livengood, causing the second-largest mainline oil spill in the pipeline’s history. Approximately 258,000 US gallons (980 m3) leaked from the pipeline; 178,000 US gallons (670 m3) were recovered and reinjected into the pipeline. Nearly 2 acres (8,100 m2) of tundra were soiled and were removed in the cleanup. The pipeline was repaired and restarted more than 60 hours later. Lewis was found guilty in December 2002 of criminal mischief, assault, drunken driving, oil pollution, and misconduct. He was sentenced to 16 years in jail and ordered to repay the $17 million cleanup costs.
The pipeline was built to withstand earthquakes, forest fires, and other natural disasters. The 2002 Denali earthquake damaged some of the pipeline sliders designed to absorb similar quakes, and it caused the pipeline to shut down for more than 66 hours as a precaution. In 2004, wildfires overran portions of the pipeline, but it was not damaged and did not shut down.
In March 2006, corroded feeder pipelines on the North Slope gave way, spilling at least 265,000 US gallons (1,000 m3) of oil. In August 2006, during an inspection mandated by the United States Department of Transportation after the leak, severe corrosion was discovered. The transit pipelines were shut down for several days that month, and replacement of 16 miles (26 km) of transit pipeline began. The project was completed before Christmas Day 2008 at a cost of $500 million to British Petroleum.
My Note –
I could’ve been influenced to a high level of disgust about the two-faced bastards by seeing the show on the Redwood trees and the Redwood forests too, which stated that 95% of the Redwood forest is now gone. It was logged to build picnic tables and give profits to those who did not earn them, because not one of them owned those trees, three hundred to thousands of years old. The citizens of the United States – and the future of our species, who may not have air to breathe once they are gone – actually owned the rights to those trees.
And, too, I was looking up fracture mechanics to understand how the dams that are in danger because of disrepair and degraded integrity could be fixed, and how they are currently being checked for integrity. I shouldn’t have bothered, because the general attitude is that a lot is known, but that it isn’t worth the money, the effort, or the time that would be involved in doing it right (as in correctly, accurately, and to the highest standard for public safety).
Integrity is a term which refers to the quality of being whole and complete, or the state of being unimpaired [1,2]. Structural Integrity Assessment is an approach to assess whether a structure is fit to withstand its service conditions safely and reliably throughout its predicted lifetime.
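The kind of check the definition above refers to often comes down to linear-elastic fracture mechanics: a flaw is tolerable while its stress intensity factor K_I = Y·σ·√(πa) stays safely below the material’s fracture toughness K_IC. A minimal sketch of that screening calculation (all numbers illustrative, not actual dam data):

```python
import math

def stress_intensity(sigma_mpa: float, crack_m: float, Y: float = 1.12) -> float:
    """K_I in MPa·sqrt(m) for applied stress sigma (MPa), crack depth a (m),
    and geometry factor Y (1.12 is the classic surface-crack value)."""
    return Y * sigma_mpa * math.sqrt(math.pi * crack_m)

def crack_is_acceptable(sigma_mpa: float, crack_m: float,
                        k_ic: float, safety_factor: float = 2.0) -> bool:
    """True if K_I, multiplied by a safety factor, stays below toughness K_IC."""
    return stress_intensity(sigma_mpa, crack_m) * safety_factor < k_ic

# Illustrative check: 60 MPa stress, 5 mm surface crack, K_IC = 50 MPa·sqrt(m)
print(crack_is_acceptable(60.0, 0.005, 50.0))   # a small crack passes
print(crack_is_acceptable(200.0, 0.05, 50.0))   # a deep crack at high stress fails
```

This is the screening step; a real fitness-for-service assessment also tracks crack growth over time, which is exactly the ongoing inspection work the note above laments being skipped.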
This describes these attitudes – although these GAO reports are from 2008 and 1983 respectively – they actually describe the intellectual inbreeding of contempt for the American people and for the safety of the public which is at the heart of most misplaced and misdirected policies and resources –
Physical Infrastructure: Challenges and Investment Options for the Nation’s Infrastructure
Information on Unsafe Conditions at Specific Dams Located on Federal Lands
RCED-83-209 August 1, 1983
Full Report (PDF, 13 pages)
In response to a congressional request, GAO provided information on safety deficiencies identified at four dams on National Park Service and Forest Service lands, the status of agency actions to correct the identified safety deficiencies, and the reasons for failure to take corrective actions.
GAO found that, although Federal officials have been aware of the unsafe conditions at these dams for at least 4 years, only minimal corrective action has been taken to repair the dams. Interim actions have not been taken to diminish the dangers posed by the dams pending their repair. The Park Service has not taken this action because, while it agrees with the assessment of the danger the dams present, it does not believe that the conditions justify immediate repair. Furthermore, it does not believe that interim action, such as lowering the level of the lake, would diminish the dangers enough to justify reducing the benefits provided by the dams.
The Forest Service has not required the private owner of one dam to take all of the recommended actions because the regional forester decided in 1980 that it would not be fair to hold owners responsible until Federal or State funding became available to prove the extent of the unsafe conditions. Forest Service officials agreed to review the adequacy of this decision after GAO brought it to their attention.
(And this one makes it thoroughly obvious how little regard the State of California and the government agencies have for human life – and for the lives and well-being of people in the United States.)
In early May 2007, a Federal Court in San Francisco issued a major ruling which concluded that DOE has not been cleaning up the site to proper standards, and that the site would have to be cleaned up to higher standards if DOE ever wanted to release the site to Boeing, which in turn, would most likely release the land for unrestricted residential development. From the L.A. Times ( Judge assails Rocketdyne cleanup print edition, California section, May 3, 2007): Judge Conti’s ruling requires DOE to prepare a more stringent review of the lab, which is on the border of Los Angeles County. Conti wrote that the department’s decision to prepare a less-stringent environmental document prior to cleanup is in violation of the National Environmental Policy Act and noted that the lab ‘is located only miles away from one of the largest population centers in the world.’
On July 26, 2007, staff at the Los Angeles Regional Water Quality Control Board recommended a $471,190 fine against Boeing Co. for 79 violations of the California Water Code during an 18-month period. From October 2004 to January 2006, wastewater and storm water runoff coming from the lab had increased levels of chromium, dioxin, lead, mercury and other pollutants, the water board said. The contaminated water flowed into Bell Creek and the Los Angeles River in violation of a July 1, 2004, permit that allowed release of wastewater and storm water runoff as long as it didn’t contain high levels of pollutants.
On October 15, 2007, Boeing announced that In a landmark agreement between Boeing and California officials, nearly 2,400 acres (10 km2) of land that is currently Boeing’s Santa Susana Field Laboratory will become state parkland. According to the plan jointly announced by California Gov. Arnold Schwarzenegger, Boeing and state Sen. Sheila Kuehl, the property will be donated and preserved as a vital undeveloped open-space link in the Santa Susana Mountains above Simi Valley and the San Fernando Valley. The agreement will permanently restrict the land for nonresidential, noncommercial use.
Future use of the land SSFL sits on is also a source of much debate. The site’s current owner, the Boeing Company, has issued statements suggesting that the land may be sold for future unrestricted residential development without the site having been cleaned up to Environmental Protection Agency (EPA) cleanup standards. On August 2, 2005, Pratt & Whitney purchased Rocketdyne from Boeing, but refused to acquire SSFL as part of the sale.
In 1989, DOE found widespread chemical and radioactive contamination at the site, and a cleanup program commenced. In 1995 EPA and DOE announced that they had entered into a Joint Policy Agreement to assure that all DOE sites would be cleaned up to standards consistent with EPA’s Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) standards, also known as Superfund.
However, in March 2003, DOE reversed its position and announced that SSFL would not be cleaned up to EPA Superfund standards. While DOE simultaneously claimed compliance with the 1995 Joint Policy Agreement, the new plan included a cleanup of only 1% of the contaminated soil, and the release of SSFL for unrestricted residential use in as little as ten years.
EPA responded to this announcement by claiming that DOE was not subject to EPA regulation because DOE exists as a separate entity under the Executive Branch of the Federal Government, and refused to take steps to force DOE adherence to the 1995 agreement.
National Security Archive
From Wikipedia, the free encyclopedia
The National Security Archive is a 501(c)(3) non-governmental, non-profit research and archival institution located within The George Washington University in Washington, D.C. Founded in 1985 by Scott Armstrong, it archives and publishes declassified U.S. government files concerning selected topics of American foreign policy. The Archive collects and analyzes the documents of various government institutions obtained via the Freedom of Information Act. The Archive then selects documents to be published in the form of manuscripts and microfiche, as well as made available through their website, which receives a half-million downloads daily. According to a Washington Post feature story, the Archive files roughly 2,000 FOIA requests annually, collecting about 75,000 documents. The Archive appealed 549 FOIA decisions in 2006, and has filed more than 40 lawsuits to obtain compliance with its requests.
The Archive operates under an advisory board which is directed by Tom Blanton and is overseen by a board of directors. The Archive’s research was recognized in late 2005 with an Emmy Award for its work on the documentary Declassified: Nixon in China. More recently, the Archive uncovered a secret reclassification program, operating since 1999, to reclassify documents at the National Archives and Records Administration related to American foreign policy during the 1940s and 1950s. The materials in question had all been declassified during the Clinton administration.
From 1985 until 1998, the Fund for Peace, Inc. was the Archive’s fiscal sponsor. Among the Archive’s more prominent institutional supporters today are the Carnegie Corporation of New York, the Ford Foundation, the Freedom Forum, the John D. and Catherine T. MacArthur Foundation, Congressional Quarterly, and Cox Enterprises. The Archive receives funding from these and other organizations via their donations to the National Security Archive Fund, established to administer the Archive’s finances.
On October 1, 2007, U.S. District Judge Colleen Kollar-Kotelly reversed George Bush on archive secrecy, ruling in a 38-page opinion that the U.S. Archivist’s reliance on the executive order to delay release of the papers of former presidents is “arbitrary, capricious, an abuse of discretion and not in accordance with law.” The National Security Archive, at George Washington University, alleged that the Bush order severely slowed or prevented the release of historic presidential papers.
* Family jewels (Central Intelligence Agency), documents unclassified in June 2007
* United States intervention in Chile
* Operation Condor
* Operation Northwoods
* National Security Archive
* Digital National Security Archive Collections
* Charity Navigator overview of the National Security Archive Fund
* National Security Archive Sues CIA, 2006
* NSA Director Tom Blanton speaks on Secrecy in the United States: Priorities for the Next President , Rappaport Center for Law and Public Service, Suffolk University Law School, October 12, 2008.
1. ^ Carlson, Peter (2008-05-08). “Eyes Only: (redacted) – In Its (redacted) Offices, the National Security Archive Houses Stockpiles of (redacted), Gotten From the Government by (redacted).” The Washington Post. http://www.washingtonpost.com/wp-dyn/content/article/2008/05/07/AR2008050703965.html. Retrieved 2008-05-09.
2. ^ Reuters, “Court reverses Bush on archive secrecy.”
Globally, a total of 52 nuclear reactors were under construction as of Jan. 1, according to the Japan Atomic Industrial Forum Inc. Last year was the first time in the history of commercial nuclear power that no new reactors came into operation, according to International Atomic Energy Agency figures. Some 33 new plants came online in 1984 and that number has declined almost every year since.
(My Note – the US has 103 nuclear reactors, with new ones being built and going online now, and there are 430 nuclear reactors worldwide, plus the new ones mentioned above as under construction as of Jan. 1.)
Japan Steel Works is spending 80 billion yen ($864 million) at its Muroran plant in the country’s northern island of Hokkaido by March 2012 to increase capacity to make parts for 12 nuclear reactors a year, compared with 5.5 units now, the president said.
The investment will increase annual sales from Japan Steel Works’ cast and forged steel for electric and nuclear power to 70 billion yen from the year starting April 2012, up from 45.5 billion yen expected for the current year, Sato said.
To contact the reporters on this story: Masumi Suga in Tokyo at firstname.lastname@example.org; Shunichi Ozasa in Tokyo at email@example.com.
Last Updated: September 7, 2009 08:39 EDT
The United States also led in arms sales to the developing world, signing 70.1 percent of these weapons agreements at a value of $29.6 billion in 2008, the report said.
Such deals with the developing world included a $6.5 billion air defense system for the United Arab Emirates, a $2.1 billion jet fighter for Morocco and a $2 billion attack helicopter for Taiwan.
India, Iraq, Saudi Arabia, Egypt, South Korea and Brazil also reached weapons deals with the United States, the Times said.
The report revealed the United Arab Emirates was the top buyer of arms in the developing world with $9.7 billion in arms purchases in 2008.
Saudi Arabia ranked second with $8.7 billion in weapons agreements, and Morocco was third with $5.4 billion in deals.
(Reporting by Jasmin Melvin; Editing by Chris Wilson)
(and this is what our USGS money and resources and manpower are being used to do – if you look at the map, and if the petroleum companies get their way, most of the top third of Alaska will be developed for the hydrates to be recovered, at the decimation of the environment – )
Fact Sheet 2008-3073| Podcast (Episode 74)
Slide Presentation (Flash document, 10.6 MB)
Gas Hydrates Website
There are several energy-related efforts currently under way in Alaska. Geographically, these range from the Alaska Peninsula to the North Slope (see graphic on left) and several are collaborative efforts with Federal and State agencies and Alaska Native villages. A brief description of these projects:
Circum-Arctic Basins Oil & Gas Assessment – An ongoing effort of the World Energy Project that includes northern Alaska.
NEW Circum-Arctic Resource Appraisal: Estimates of Undiscovered Oil and Gas North of the Arctic Circle
Fact Sheet 2008-3049| Press Release (7/23/08)
Podcast (Episode 55) | Slide Presentation (Flash document, 4.39 MB)
Geologic Framework and Assessment Studies, North Slope of Alaska
These studies will increase our understanding of the petroleum geology and improve our estimates of undiscovered oil and gas resources. This is a multi-disciplinary investigation that uses concepts of basin analysis, sequence stratigraphy, fluid-flow modeling, petroleum systems, and structural and geophysical analysis. Assessments of the NPRA and the central North Slope were completed in May 2002 and May 2005, respectively. Current work is focused on assessment of the area west of NPRA and aggregation of all North Slope assessments with an update of the economics, including natural gas.
Gas Hydrate Studies in northern Alaska
These studies will investigate the technical aspects of gas production from gas hydrates, which contain gas trapped with water in ice-like structures. The presence of huge volumes of gas in hydrate form is known in the Prudhoe Bay region from earlier USGS studies. The current work is a collaborative effort involving the USGS Coastal and Marine Geology Program, Bureau of Land Management (BLM), the State of Alaska, the U.S. Department of Energy, and private industry. Collaborative gas hydrate work has also been conducted with the multinational Mallik Drilling Consortium in the Mackenzie Delta region. In 2004, the Alaska State Legislature requested the U.S. Geological Survey (USGS) to provide a technical briefing on the energy resource potential of gas hydrates in northern Alaska at a Federal Energy Regulatory Commission (FERC) technical conference, USGS Open-File Report 2004–1454.
Coalbed Gas Studies
A cooperative project with the State, partly funded by the BLM and DOE, to evaluate coalbed gas resources near Native villages and on Federal lands in rural Alaska. Coalbed gas may be a viable local energy source for Native villages and a commercial resource in Alaska. Shallow coalbed gas wells have been drilled near Chignik, Fort Yukon, and the Dalton Highway south of Prudhoe Bay. Current work involves continued evaluation of drill sites and collecting and analyzing coal samples for their methane potential from wells drilled for oil and gas in Cook Inlet and on the North Slope. A new coal assessment of Alaska was released in 2003.
Digital Geologic Map Compilation
Compilation of existing geologic maps of the northern foothills of the Brooks Range, from the Chukchi Sea eastward to the Canadian border. This work is a collaboration between the USGS and the Alaska Department of Natural Resources, Division of Geological and Geophysical Surveys (DGGS) and the Division of Oil and Gas. It will result in a synthesis of geologic mapping that was conducted independently over several decades by the USGS and DGGS and will be produced at a fraction of the cost of new, field-based geologic mapping of the same area. A report of revised stratigraphic nomenclature for common use on all maps was completed in 2003, the Umiat quadrangle map was released in 2004, and the Ikpikpuk River quadrangle map, in 2005. A digital compilation of northeastern NPRA surficial geology was completed in 2005 at the request of the BLM.
Interior Alaska Province Review and Yukon Flats Assessment
An effort to provide essential geologic, geophysical, geochemical, and historical information in preparation for the next USGS assessment of the oil and gas resources in this province. Assessment of the Yukon Flats basin was released in 2004. A comprehensive review and compilation of oil and gas related information for the entire province was completed in 2002.
South Alaska Province Review
A new effort initiated in 2003 and focused on Cook Inlet. It is designed to provide essential geologic, geophysical, geochemical, and historical information in preparation for the next USGS assessment of the oil and gas resources in this province.
Collaboration with State of Alaska
Although not a separate project, the Energy Resources Program provides staff, analytical capabilities, and financial support for Alaskan petroleum studies and geologic mapping conducted by the Alaska Department of Natural Resources, Division of Geologic and Geophysical Surveys and Division of Oil and Gas.
NASA satellite images: Alaska North Slope in winter and in spring.
A blanket of snow gives the Brooks Range mountains in northern Alaska an etched appearance in this true-color Moderate Resolution Imaging Spectroradiometer (MODIS) image from October 15, 2002 (Credit: Jacques Descloitres, MODIS Rapid Response Team, NASA/GSFC). Northern Alaska glows green in the summertime true-color Terra MODIS image, acquired July 29, 2002. Prominent in that image is the Brooks Range, which stretches all the way across northern Alaska from the western shore to the border of Canada’s Yukon Territory, a distance of about 600 miles (Credit: Jacques Descloitres, MODIS Rapid Response Team, NASA/GSFC).
Fact Sheet 2008-3082
The U.S. Geological Survey (USGS) recently completed the first assessment of the undiscovered technically recoverable gas-hydrate resources on the North Slope of Alaska. Using a geology-based assessment methodology, the USGS estimates that there are about 85 trillion cubic feet (TCF) of undiscovered, technically recoverable gas resources within gas hydrates in northern Alaska.
RECENT PUBLICATIONS
The Yukon Flats Cretaceous(?)-Tertiary Extensional Basin, East-Central Alaska: Burial and Thermal History Modeling
Scientific Investigations Report 2007–5281
Sentinel Hill Core Test 1: Facies Descriptions and Stratigraphic Reinterpretations of the Prince Creek and Schrader Bluff Formations, North Slope, Alaska
Professional Paper 1747
Stratigraphy and Facies of Cretaceous Schrader Bluff and Prince Creek Formations in Colville River Bluffs, North Slope, Alaska
Professional Paper 1748
Sedimentology and Sequence Stratigraphy of the Lower Cretaceous Fortress Mountain and Torok Formations Exposed Along the Siksikpuk River, North-Central Alaska
Professional Paper 1739-D
Lithofacies, Age, and Sequence Stratigraphy of the Carboniferous Lisburne Group in the Skimo Creek Area, Central Brooks Range
Professional Paper 1739-B
Oil and Gas Resources of the Arctic Alaska Petroleum Province
Professional Paper 1732-A
Regional Fluid Flow and Basin Modeling in Northern Alaska
Color Shaded-Relief and Surface-Classification Maps of the Fish Creek Area, Harrison Bay Quadrangle, Northern Alaska
Scientific Investigations Map 2948
Alaska Division of Geological & Geophysical Surveys (DGGS) USGS scanning project
Virtually all U.S. Geological Survey Bulletins and Professional Papers for Alaska are now viewable and retrievable online through the Alaska Division of Geological & Geophysical Surveys (DGGS). USGS scanning project press release (PDF 20KB).
USGS Alaska Science Center
Center of Excellence for the Department of the Interior to address important natural resources issues and natural hazards assessments in Alaska and circumpolar regions through long-term data collection and monitoring, research and development, and assessments and applications.
Assessment of Gas Hydrate Resources on the North Slope, Alaska, 2008
Posted October 2008
* Fact Sheet PDF (6 MB)
For further information:
This fact sheet and assessment results are available at the USGS Energy Program website, http://energy.usgs.gov
Collett, T.S., Agena, W.F., Lee, M.W., Zyrianova, M.V., Bird, K.J., Charpentier, T.C., Houseknecht, D.W., Klett, T.R., Pollastro, R.M., and Schenk, C.J., 2008, Assessment of gas hydrate resources on the North Slope, Alaska, 2008: U.S. Geological Survey Fact Sheet 2008-3073, 4 p.
Worldwide distribution of confirmed or inferred offshore gas hydrate-bearing sediments (USGS Open-File Report 96-272). Image produced by the USGS and the Naval Research Laboratory.
(and this – which explains so much – this is how much our good health matters: zero)
From Wikipedia, the free encyclopedia
For gas generated by oil shale pyrolysis, see Oil shale gas.
Shale gas is natural gas produced from shale. Shale gas has become an increasingly important source of natural gas in the United States over the past decade, and interest has spread to potential gas shales in Canada and Europe. One analyst expects shale gas to supply as much as half the natural gas production in North America by 2020.
Some analysts expect that North American shale gas will affect the worldwide energy supply. A study by the Baker Institute for Public Policy at Rice University concluded that increased shale gas production in the US and Canada could help prevent Russia from dictating higher prices for the gas it exports to European countries.
Because shales ordinarily have insufficient permeability to allow significant fluid flow to a well bore, most shales are not commercial sources of natural gas. Shale gas is one of a number of “unconventional” sources of natural gas; other unconventional sources of natural gas include coalbed methane, tight sandstones, and methane hydrates. Shale gas areas are often known as resource plays (as opposed to exploration plays). The geological risk of not finding gas is low in resource plays, but the potential profits per well are usually also lower.
Shale has low matrix permeability, so gas production in commercial quantities requires fractures to provide permeability. Shale gas has been produced for years from shales with natural fractures; the shale gas boom in recent years has been due to modern technology in hydraulic fracturing to create extensive artificial fractures around well bores. Horizontal drilling is often used with shale gas wells.
Shales that host economic quantities of gas have a number of common properties. They are rich in organic material, and are usually mature petroleum source rocks in the thermogenic gas window. They are sufficiently brittle and rigid to maintain open fractures. In some areas, shale intervals with high natural gamma radiation are the most productive.
Some of the gas produced is held in natural fractures, some in pore spaces, and some is adsorbed onto the organic material. The gas in the fractures is produced immediately; the gas adsorbed onto organic material is released as the formation pressure declines.
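The pressure dependence of that adsorbed fraction is commonly described by a Langmuir isotherm. The sketch below uses hypothetical parameter values (not taken from any particular shale) to show why adsorbed gas is released only as formation pressure declines:

```python
def langmuir_gas_content(pressure_psi, v_l_scf_per_ton, p_l_psi):
    """Adsorbed gas content (scf/ton) at a given formation pressure,
    from the Langmuir isotherm V = V_L * P / (P_L + P)."""
    return v_l_scf_per_ton * pressure_psi / (p_l_psi + pressure_psi)

# Hypothetical parameters for illustration only (not from any named shale):
V_L = 100.0  # Langmuir volume, scf/ton: adsorbed gas capacity at high pressure
P_L = 600.0  # Langmuir pressure, psi: pressure at which half of V_L is adsorbed

# As formation pressure declines, adsorbed gas is progressively released:
for p in (3000, 2000, 1000, 500):
    print(f"{p:>5} psi -> {langmuir_gas_content(p, V_L, P_L):.1f} scf/ton")
```

Because the curve is steepest at low pressure, much of the adsorbed gas desorbs late in a well's life, which is one reason shale gas wells can have long, shallow decline tails.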
* 1 Environment
* 2 Economics
* 3 Canada
o 3.1 Utica Shale, Quebec
o 3.2 Muskwa Shale, British Columbia
o 3.3 Montney Shale, British Columbia
o 3.4 Horton Bluff Shale, Nova Scotia
* 4 Europe
* 5 United States
o 5.1 Antrim Shale, Michigan
o 5.2 Barnett Shale, Texas
o 5.3 Caney Shale, Oklahoma
o 5.4 Conasauga Shale, Alabama
o 5.5 Fayetteville Shale, Arkansas
o 5.6 Floyd Shale, Alabama
o 5.7 Gothic Shale, Colorado
o 5.8 Haynesville Shale, Louisiana
o 5.9 New Albany Shale, Illinois Basin
o 5.10 Pearsall Shale, Texas
o 5.11 Upper Devonian shales, Appalachian Basin
o 5.12 Woodford Shale, Oklahoma
* 6 References
* 7 External links
Chemicals are added to the water to facilitate the underground fracturing process that releases natural gas. The resulting volume of contaminated water is generally kept in above-ground ponds until it is removed by tanker or injected back into the earth.
Although shale gas has been produced for more than 100 years in the Appalachian Basin and the Illinois Basin, the wells were often economically marginal. Higher natural gas prices in recent years and advances in hydraulic fracturing and horizontal completions have made shale gas wells more profitable. Shale gas tends to cost more to produce than gas from conventional wells, because of the expense of massive hydraulic fracturing treatments required to produce shale gas, and of horizontal drilling.
The prices required to make drilling and producing shale gas economic differ for each shale area. One study concluded that a wellhead gas price above $4.25 per thousand cubic feet (MCF) was required to make wells completed in the Fayetteville Shale in Arkansas economic, while wells to the Woodford Shale in Oklahoma required a price above $6.50. Another study concluded that the Fayetteville Shale required a NYMEX gas price above $5.95 per million British thermal units (MMBTU), and the Woodford Shale a price above $7.24; the same study arrived at break-even NYMEX prices of between $5.40 and $7.39 for the Barnett, and $6.31 for Appalachian gas shale. (The conclusions may appear to differ, but one study quotes wellhead prices per MCF while the other quotes NYMEX prices per MMBTU.)
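As a rough illustration of why figures in the two units are hard to compare directly, here is a minimal unit-conversion sketch. The 1.03 MMBTU-per-MCF heat content is an assumed typical value for pipeline-quality gas (actual heat content varies by field), and note that wellhead and NYMEX hub prices also differ by transportation costs and basis, which a unit conversion alone cannot reconcile:

```python
# Assumed heat content of pipeline-quality gas; varies by field.
MMBTU_PER_MCF = 1.03

def per_mcf_to_per_mmbtu(price_per_mcf):
    """Convert a $/MCF price to an approximate $/MMBTU figure."""
    return price_per_mcf / MMBTU_PER_MCF

def per_mmbtu_to_per_mcf(price_per_mmbtu):
    """Convert a $/MMBTU price to an approximate $/MCF figure."""
    return price_per_mmbtu * MMBTU_PER_MCF

# The $6.50/MCF Woodford wellhead figure expressed in $/MMBTU:
print(f"${per_mcf_to_per_mmbtu(6.50):.2f}/MMBTU")
```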
To date, all successful shale gas wells have been in rocks of Paleozoic and Mesozoic age.
North America has been the leader in developing and producing shale gas because of high gas prices in that market. The great economic success of the Barnett Shale play in Texas in particular has spurred the search for other sources of shale gas across the United States and Canada.
Canada has a number of prospective shale gas targets in various stages of exploration and exploitation in British Columbia, Alberta, Saskatchewan, Ontario, Quebec, and Nova Scotia.
Utica Shale, Quebec
The Ordovician Utica Shale in Quebec potentially holds 4×10^12 cu ft (110 km³) at production rates of 1 MMCF per day. Gastem, one of the Utica Shale producers, has announced plans to explore for Utica Shale gas across the border in New York State.
The Quebec shale play focuses on an area south of the St. Lawrence River between Montreal and Quebec City. Interest has grown in the region since Denver-based Forest Oil Corp. announced a significant discovery there after testing two vertical wells. Forest Oil said its Quebec assets may hold as much as four trillion cubic feet of gas reserves, and that the Utica shale has similar rock properties to the Barnett shale in Texas. Quebec has been known to have natural gas reserves, but advanced horizontal drilling techniques and higher gas prices are only now making the play potentially economically viable, observers say. Forest Oil, which has several junior partners in the region, will drill three horizontal wells in Quebec this summer. It has targeted its first production for next year, and full-scale drilling for 2010. Calgary-based Talisman Energy Inc. also plans to drill in Quebec in late summer.
Muskwa Shale, British Columbia
The Devonian Muskwa Shale of the Horn River Basin in northeast British Columbia is said to contain 6×10^12 cu ft (170 km³) of recoverable gas. Major leaseholders in the play are EOG Resources, Encana, and Apache Corp. The government of British Columbia recently announced lease proceeds for 2008 to be in excess of CDN$2.2 billion, a record high for the province, with the majority of the proceeds coming from shale gas prospects.
Montney Shale, British Columbia
The Montney Shale play is in east-central British Columbia.
Horton Bluff Shale, Nova Scotia
In 2009, Triangle Petroleum Corporation completed two gas wells in the Horton Bluff Shale, of the Windsor Basin, Nova Scotia.
While Europe has no shale gas production as yet, the success of shale gas in North America has prompted geologists in a number of European countries to examine the productive possibilities of their own organic-rich shales. Potential host formations for shale gas include shales in northeast France, the Alum Shale in Northern Europe and Carboniferous shales in Germany and the Netherlands.
Shell Oil is evaluating the viability of the Alum Shale in southern Sweden as a source of shale gas.
Eurenergy Resource Corporation has announced plans to drill for shale gas in southern England’s Weald Basin.
ConocoPhillips has announced plans to explore for shale gas in Poland.
The first commercial gas wells drilled in the US, starting in 1821 in Fredonia, New York, produced gas from shales. After the Drake Oil Well in 1859, however, shale gas production was overshadowed by much larger volumes produced from conventional gas reservoirs.
In 1996, shale gas wells in the United States produced 0.3 TCF (trillion cubic feet), 1.6% of US gas production; by 2006, production had more than tripled to 1.1 TCF per year, 5.9% of US gas production. By 2005 there were 14,990 shale gas wells in the US. A record 4,185 shale gas wells were completed in the US in 2007. In 2007, shale gas fields included the #2 (Barnett/Newark East) and #13 (Antrim) sources of natural gas in the United States in terms of gas volumes produced.
Antrim Shale, Michigan
The Antrim Shale of Upper Devonian age produces along a belt across the northern part of the Michigan Basin. Although the Antrim Shale has produced gas since the 1940s, the play was not active until the late 1980s. During the 1990s, the Antrim became the most actively drilled shale gas play in the US, with thousands of wells drilled. To date, the shale has produced more than 2.5 TCF from more than 9,000 wells. Antrim Shale wells produced almost 140×10^9 cu ft (4.0×10^9 m³) in 2006. The shale appears to be most economic at depths of 1,000-2,000 feet. Wells are developed on 80-acre (320,000 m²) units. Horizontal drilling is not widely used. Unlike other shale gas plays such as the Barnett Shale, the natural gas from the Antrim appears to be biogenic gas generated by the action of bacteria on the organic-rich rock.
In 2007, the Antrim gas field produced 136 billion cubic feet of gas, making it the 13th largest source of natural gas in the United States.
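The parenthetical metric equivalents used throughout this article follow from the standard cubic-foot-to-cubic-meter conversion factor. A minimal sketch, using the Antrim's 2006 production figure as a sanity check:

```python
# Standard conversion: one cubic foot is 0.0283168 cubic meters.
CUBIC_METERS_PER_CUBIC_FOOT = 0.0283168

def cuft_to_m3(cubic_feet):
    """Convert a gas volume in cubic feet to cubic meters."""
    return cubic_feet * CUBIC_METERS_PER_CUBIC_FOOT

# The Antrim's 2006 production of about 140 billion cubic feet is
# roughly 4.0 billion cubic meters, matching the figure in the text:
print(f"{cuft_to_m3(140e9):.2e} m^3")
```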
Barnett Shale, Texas
Barnett Shale gas drilling rig near Alvarado, Texas (2008)
The Barnett Shale of the Fort Worth Basin is the most active shale gas play in the United States. The first Barnett Shale well was completed in 1981 in Wise County. Drilling expanded greatly in the past several years due to higher natural gas prices and use of horizontal wells to increase production. In contrast to older shale gas plays, such as the Antrim Shale, the New Albany Shale, and the Ohio Shale, the Barnett Shale completions are much deeper (up to 8,000 feet). The thickness of the Barnett varies from 100 to 1,000 feet (300 m), but most economic wells are located where the shale is between 300 and 600 feet (180 m) thick. The success of the Barnett has spurred exploration of other deep shales.
In 2007, the Barnett shale (Newark East) gas field produced 1.11 trillion cubic feet of gas, making it the second-largest source of natural gas in the United States. The Barnett shale currently produces more than 6% of US natural gas production.
Texas Shale Forum
Caney Shale, Oklahoma
The Caney Shale in the Arkoma Basin is the stratigraphic equivalent of the Barnett Shale in the Ft. Worth Basin. The formation has become a gas producer since the large success of the Barnett play.
* Bill Grieser: Caney Shale, Oklahoma’s shale challenge, PDF file, retrieved 25 February 2009.
Conasauga Shale, Alabama
Wells are currently being drilled to produce gas from the Cambrian Conasauga shale in northern Alabama. Activity is in St. Clair, Etowah, and Cullman counties.
Fayetteville Shale, Arkansas
The Mississippian Fayetteville Shale produces gas in the Arkansas part of the Arkoma Basin. The productive section varies in thickness from 50 to 550 feet (170 m), and in depth from 1,500 to 6,500 feet (2,000 m). The shale gas was originally produced through vertical wells, but operators are increasingly going to horizontal wells in the Fayetteville. Producers include SEECO, a subsidiary of Southwestern Energy Co., which discovered the play; Chesapeake Energy; Noble Energy Corp.; XTO Energy Inc.; Contango Oil & Gas Co.; Edge Petroleum Corp.; Triangle Petroleum Corp.; and Kerogen Resources Inc.
* Geology.Com: Fayetteville shale
* Fayetteville shale: reducing environmental impacts
Floyd Shale, Alabama
The Floyd Shale of Mississippian age is a current gas exploration target in the Black Warrior Basin of northern Alabama and Mississippi.
Gothic Shale, Colorado
Bill Barrett Corporation has drilled and completed several gas wells in the Gothic Shale. The wells are in Montezuma County, Colorado, in the southeast part of the Paradox basin. A horizontal well in the Gothic flowed 5,700 MCF per day.
Haynesville Shale, Louisiana
Although the Jurassic Haynesville Shale of northwest Louisiana has produced gas since 1905, it has been the focus of modern shale gas activity only since a gas discovery drilled by Cubic Energy in November 2007. The Cubic Energy discovery was followed by a March 2008 announcement by Chesapeake Energy that it had completed a Haynesville Shale gas well. Haynesville shale wells have also been drilled in northeast Texas, where it is also known as the Bossier Shale.
* Geology.Com: Haynesville Shale: news, map, videos, lease and royalty information
* Go Haynesville Shale, a forum for petroleum professionals and landowners to discuss the Haynesville shale.
New Albany Shale, Illinois Basin
The Devonian-Mississippian New Albany Shale produces gas in the southeast Illinois Basin in Illinois, Indiana, and Kentucky. The New Albany has been a gas producer in this area for more than 100 years, but recent higher gas prices and improved well completion technology have increased drilling activity. Wells are 250 to 2,000 feet (610 m) deep. The gas is described as having a mixed biogenic and thermogenic origin.
Pearsall Shale, Texas
Operators have completed approximately 50 wells in the Pearsall Shale in the Maverick Basin of south Texas. The most active company in the play has been TXCO Resources, although EnCana and Anadarko Petroleum have also acquired large land positions in the basin. The gas wells had all been vertical until 2008, when TXCO drilled and completed a number of horizontal wells.
Upper Devonian shales, Appalachian Basin
Drilling a horizontal shale gas well in Appalachia
The Upper Devonian shales of the Appalachian Basin, which are known by different names in different areas, have produced gas since the early 20th century. The main producing area straddles the state lines of Virginia, West Virginia, and Kentucky, but extends through central Ohio and along Lake Erie into the panhandle of Pennsylvania. More than 20,000 wells produce gas from Devonian shales in the basin. The wells are commonly 3,000 to 5,000 feet (1,500 m) deep. The shale most commonly produced is the Chattanooga Shale, also called the Ohio Shale. The US Geological Survey estimated a total resource of 12.2 trillion cubic feet (350 km³) of natural gas in Devonian black shales from Kentucky to New York.
The Marcellus shale in West Virginia, Pennsylvania, and New York, once thought to be played out, is now estimated to hold 168-516 TCF still available with horizontal drilling. It has been suggested that the Marcellus shale and other Devonian shales of the Appalachian Basin, could supply the northeast U.S. with natural gas. In November 2008, Chesapeake Energy, which held 1.8 million net acres of oil and gas leases in the Marcellus trend, sold a 32.5% interest in its leases to StatoilHydro of Norway, for $3.375 billion.
* Geology.Com: Marcellus shale
* Go Marcellus Shale A forum for the Marcellus Shale.
* Mammoth resource partners
Woodford Shale, Oklahoma
The Devonian Woodford Shale in Oklahoma is from 50 to 300 feet (91 m) thick. Although the first gas production was recorded in 1939, by late 2004, there were only 24 Woodford Shale gas wells. By early 2008, there were more than 750 Woodford gas wells. Like many shale gas plays, the Woodford started with vertical wells, then became dominantly a play of horizontal wells. The play is mostly in the Arkoma Basin of southeast Oklahoma, but some drilling has extended the play west into the Anadarko Basin and south into the Ardmore Basin. The largest gas producer from the Woodford is Newfield Exploration; other operators include Devon Energy, Chesapeake Energy, Cimarex Energy, Antero Resources, St. Mary Land and Exploration, XTO Energy, Pablo Energy, Petroquest Energy, Continental Resources, and Range Resources.
* Oklahoma Geological Survey: Map of Woodford shale wells, accessed 25 February 2009.
* Brian J. Cardott: Overview of Woodford gas-shale play in Oklahoma, 2008 update, PDF file, retrieved 25 February 2009.
1. ^ Shaun Polczer, Shale expected to supply half of North America’s gas, Calgary Herald, 9 April 2009, accessed 27 August 2009.
2. ^ Rice University, News and Media Relations (8 May 2009): US-Canadian shale could neutralize Russian energy threat to Europeans, accessed 27 May 2009.
3. ^ Don Lyle, Shale gas plays expand, E&P, Mar. 2007, p.77-79.
4. ^ Michael Godec, Tyler Van Leeuwen, and Vello A. Kuuskraa, Rising drilling, stimulation costs pressure economics, Oil & Gas Journal, 15 Oct. 2007, p.45-51.
5. ^ Kevin Heffernen, Shale gas in North America, emerging shale opportunities, PDF file, retrieved 15 April 2009.
6. ^ Forest Oil Corporation – Press Releases and Notices
7. ^ Press release > Investors > Junex
8. ^ "New York to get Utica shale exploration," Oil & Gas Journal (PennWell Corporation) 106 (12): 41, 2008-03-24. http://www.ogj.com/index/article-display/323864/s-articles/s-oil-gas-journal/s-volume-106/s-issue-12/s-exploration-development/s-new-york-to-get-utica-shale-exploration.html. Retrieved 2009-07-07.
9. ^ Alan Petzet (2008-03-24), "BC’s Muskwa shale shaping up as Barnett gas equivalent," Oil & Gas Journal (PennWell Corporation) 106 (12): 40-41. http://www.ogj.com/index/article-display/323863/s-articles/s-oil-gas-journal/s-volume-106/s-issue-12/s-exploration-development/s-bcrsquos-muskwa-shale-shaping-up-as-barnett-gas-equivalent.html. Retrieved 2009-07-07.
10. ^ "Study analyzes nine US, Canada shale gas plays," Oil & Gas Journal, 10 Nov. 2008, p.50.
11. ^ Reuters (6 Mar. 2009): Triangle Petroleum provides update on its Nova Scotia shale gas prospect, accessed 16 April 2009.
12. ^ David Jolly, Europe starting search for shale gas, International Herald Tribune, 22 August 2008, accessed 18 March 2009.
13. ^ Doris Leblond, PennEnergy (29 May 2009): European shale gas prospects heat up, accessed 10 June 2009.
14. ^ AAPG Annual Convention (8 June 2009): Shale gas in Europe – overview, potential and research (abs.), accessed 26 March 2009.
15. ^ Royal Dutch Shell, Annual Report and Form 20-F for the year ending December 31, 2008, p.25, PDF file, retrieved 16 April 2009.
16. ^ Svenska Shell Skane natural gas website, accessed 27 May 2009.
17. ^ Reuters (18 Mar. 2009) Eurenergy Resources Corporation awarded oil & gas concessions in Europe’s East Paris Basin and Weald Basin, accessed 15 April 2009.
18. ^ Reuters, Conoco sees promise in Polish shale gas-exec, 9 September 2009.
19. ^ Vello A. Kuuskraa, Reserves, production grew greatly during last decade Oil & Gas Journal, 3 Sept. 2007, p.35-39
20. ^ Louise S. Durham, Prices, technology make shales hot, AAPG Explorer, July 2008, p.10.
21. ^ US Energy Information Administration, Top 100 oil and gas fields, PDF file, retrieved 18 February 2009.
22. ^ Michigan DEQ map: Antrim, PDF file, downloaded 12 February 2009.
23. ^ US Energy Information Administration, Top 100 oil and gas fields, PDF file, retrieved 18 February 2009.
24. ^ Scott R. Reeves and others, New basins invigorate U.S. gas shales play, Oil & Gas Journal, 22 Jan. 1996, p.53-58.
25. ^ US Energy Information Administration, Top 100 oil and gas fields, PDF file, retrieved 18 February 2009.
26. ^ US Energy Information Administration: Is U.S. natural gas production increasing?, Accessed 20 March 2009.
27. ^ Alabama State Oil and Gas Board (Nov. 2007): An overview of the Conasauga shale gas play in Alabama, PDF file, downloaded 10 June 2009.
28. ^ Operators chase gas in three Alabama shale formations, Oil & Gas Jour., 21 Jan. 2008, p.49-50.
29. ^ Nina M. Rach, Triangle Petroleum, Kerogen Resources drilling Arkansas’ Fayetteville shale gas, Oil & Gas Journal, 17 Sept. 2007, p.59-62.
30. ^ Barrett may have Paradox Basin discovery, Rocky Mountain Oil Journal, 14 Nov. 2008, p.1.
31. ^ Louise S. Durham, Louisiana play a ‘company maker’, AAPG Explorer, July 2008, p.18-36.
32. ^ Alan Petzet (2007-08-13), "More operators eye Maverick shale gas, tar sand potential," Oil & Gas Journal (PennWell Corporation) 107: 38-40. http://www.ogj.com/index/article-display/303130/s-articles/s-oil-gas-journal/s-volume-105/s-issue-30/s-exploration-development/s-more-operators-eye-maverick-shale-gas-tar-sand-potential.html. Retrieved 2009-07-07.
33. ^ "Maverick fracs unlock gas in Pearsall Shale," Oil & Gas Journal (PennWell Corporation) 107: 32-34, 2007-08-25. http://www.ogj.com/index/article-display/337639/s-articles/s-oil-gas-journal/s-volume-106/s-issue-32/s-exploration-development/s-maverick-fracs-unlock-gas-in-pearsall-shale.html. Retrieved 2009-07-07.
34. ^ Richard E. Peterson (1982) A Geologic Study of the Appalachian Basin, Gas Research Institute, p.40, 45.
35. ^ "Unconventional natural gas reservoir in Pennsylvania poised to dramatically increase US production," 2008-01-17.
36. ^ Arthur J. Pyron (2008-04-21), "Appalachian basin’s Devonian: more than a new Barnett shale," Oil & Gas Journal (PennWell Corporation) 106 (15): 38-40. http://www.ogj.com/index/article-display/326309/s-articles/s-oil-gas-journal/s-volume-106/s-issue-15/s-exploration-development/s-appalachian-basinrsquos-devonian-more-than-a-lsquonew-barnett-shalersquo.html. Retrieved 2009-07-07.
37. ^ "Chesapeake announces joint venture agreement," World Oil, December 2008, p.106.
38. ^ Travis Vulgamore and others, Hydraulic fracturing diagnostics help optimize stimulations of Woodford Shale horizontals, American Oil and Gas Reporter, Mar. 2008, p.66-79.
39. ^ David Brown, Big potential boosts Woodford, AAPG Explorer, July 2008, p.12-16.
* Jackson School of Geosciences (Jan. 2007): Barnett Boom Ignites Hunt for Unconventional Gas Resources
* AAPG Explorer (Mar. 2001): Shale Gas Exciting Again
* Oil and Gas Investor (Jan. 2006): Shale Gas
* Centre for Energy: What is Shale Gas?
* US Energy Investor (1 Jan 2005): The Bright Future of Shale Gas
* Marcellus Shale: horizontal drilling and hydrofracing
* The Haynesville Shale of Louisiana, Texas and Arkansas
* West Virginia Geological and Economic Survey: Geology of the Marcellus Shale PDF file, retrieved 2 January 2009.
* West Virginia Geological and Economic Survey: Enhancement of the Appalachian Basin Devonian shale resource base in the GRI hydrocarbon model PDF file, retrieved 2 January 2009.
Retrieved from http://en.wikipedia.org/wiki/Shale_gas
Petroleum History – a Canadian site, but it includes some links to US information.
National Air Toxics Assessments
What is NATA?
The National-Scale Air Toxics Assessment (NATA) is EPA’s ongoing comprehensive evaluation of air toxics in the U.S. EPA developed NATA as a state-of-the-science screening tool for State, Local, and Tribal agencies to prioritize pollutants, emission sources, and locations of interest for further study in order to gain a better understanding of risks. NATA assessments do not incorporate refined information about emission sources; rather, they use general information about sources to develop risk estimates that are more likely to overestimate impacts than to underestimate them.
NATA provides estimates of the risk of cancer and other serious health effects from breathing (inhaling) air toxics in order to inform both national and more localized efforts to identify and prioritize the air toxics, emission source types, and locations that are of greatest potential concern in terms of contributing to population risk. This in turn helps air pollution experts focus limited analytical resources on the areas and populations where the potential for health risks is highest.
Assessments include estimates of cancer and non-cancer health effects based on chronic exposure from outdoor sources, including assessments of non-cancer health effects for Diesel Particulate Matter (PM). Assessments provide a snapshot of the outdoor air quality and the risks to human health that would result if air toxic emissions levels remained unchanged.
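At its core, a screening estimate of this kind multiplies a modeled ambient concentration by a pollutant's inhalation unit risk, assuming continuous lifetime exposure. The sketch below uses illustrative numbers only, not official EPA unit-risk values:

```python
def lifetime_cancer_risk(concentration_ug_m3, unit_risk_per_ug_m3):
    """Screening-level lifetime inhalation cancer risk: modeled ambient
    concentration (ug/m^3) times the pollutant's inhalation unit risk
    (risk per ug/m^3), assuming continuous lifetime exposure."""
    return concentration_ug_m3 * unit_risk_per_ug_m3

# Illustrative numbers only (not official EPA unit-risk values): a pollutant
# with a unit risk of 5e-6 per ug/m^3 at an ambient level of 2 ug/m^3 gives
# a screening risk of 1e-5, i.e. 10 in a million.
risk = lifetime_cancer_risk(2.0, 5e-6)
print(f"{risk * 1e6:.0f} in a million")
```

Because the concentrations are modeled from general emissions data rather than measured for individuals, results like this are suited to ranking pollutants and source types, not to characterizing any one person's risk.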
How do I access NATA assessments?
EPA has completed three assessments that characterize the nationwide chronic cancer risk estimates and noncancer hazards from inhaling air toxics. The latest, the 2002 NATA, was made available to the public in June of 2009. You can access any of the NATA assessments by clicking below on the specific year of interest.
* 2002 National-Scale Air Toxics Assessment
* 1999 National-Scale Air Toxics Assessment
* 1996 National-Scale Air Toxics Assessment
Why was NATA developed?
The NATA assessments were designed to help guide efforts to cut toxic air pollution and build upon the already significant emissions reductions achieved in the US since 1990.
NATA was developed as a tool to inform both national and more localized efforts to collect air toxics information, characterize emissions, and help prioritize pollutants/geographic areas of interest for more refined data collection and analyses.
The goal is to identify those air toxics which are of greatest potential concern in terms of contribution to population risk. Ambient and exposure concentrations, and estimates of risk and hazard for air toxics in each State are typically generated at the census tract level.
What NATA is not.
NATA results provide answers to questions about emissions, ambient air concentrations, exposures and risks across broad geographic areas (such as counties, states and the Nation) at a moment in time. As such, they help the EPA identify specific air toxics compounds, and specific source sectors such as stationary sources or mobile sources, which generally produce the highest exposures and risks in the country.
These assessments are based on assumptions and methods that limit the range of questions that can be answered reliably. The results cannot be used to identify exposures and risks for specific individuals, or even to identify exposures and risks in small geographic regions such as a specific census block, i.e., hotspots.
These assessments use emissions data for a single year as inputs to models which yield concentration and risk estimates. These estimates reflect chronic exposures resulting from inhalation of the air toxics emitted and do not consider exposures which may occur indoors or as a result of routes other than inhalation, e.g., dermal contact or ingestion.
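As a rough illustration of the screening-level arithmetic this kind of assessment relies on (a sketch, not EPA’s actual NATA implementation), a lifetime inhalation cancer risk is commonly estimated as ambient concentration times a unit risk estimate (URE), and a noncancer hazard quotient as concentration divided by a reference concentration (RfC). The pollutant values below are purely illustrative.

```python
# Illustrative screening-level risk arithmetic (hypothetical inputs,
# not EPA's actual NATA implementation).

def cancer_risk_per_million(conc_ugm3: float, ure_per_ugm3: float) -> float:
    """Lifetime inhalation cancer risk, expressed per million people:
    ambient concentration (ug/m3) x unit risk estimate (per ug/m3)."""
    return conc_ugm3 * ure_per_ugm3 * 1_000_000

def hazard_quotient(conc_ugm3: float, rfc_ugm3: float) -> float:
    """Noncancer hazard quotient: concentration / reference concentration.
    Values above 1 flag a potential concern for further study."""
    return conc_ugm3 / rfc_ugm3

# Hypothetical pollutant at 1.5 ug/m3; URE and RfC values are illustrative.
risk = cancer_risk_per_million(1.5, 7.8e-6)
hq = hazard_quotient(1.5, 30.0)
print(f"cancer risk: {risk:.1f} in a million, hazard quotient: {hq:.3f}")
```

This is why the text stresses that NATA is a prioritization tool: the same simple multiplication applied with conservative inputs tends to overestimate rather than underestimate impacts.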
Estimated County Level Carcinogenic Risk (PDF) (1pg, 2.1 MB) – PDF version of map below.
Estimated County Level Noncancer (Respiratory) Risk (PDF) (1pg, 2.1 MB) – PDF version of map below.
The Clean Air Act Amendments of 1990 List of Hazardous Air Pollutants
79107 Acrylic acid
107051 Allyl chloride
71432 Benzene (including benzene from gasoline)
100447 Benzyl chloride
117817 Bis(2-ethylhexyl)phthalate (DEHP)
156627 Calcium cyanamide
105602 Caprolactam (See Modification)
75150 Carbon disulfide
56235 Carbon tetrachloride
463581 Carbonyl sulfide
79118 Chloroacetic acid
107302 Chloromethyl methyl ether
1319773 Cresols/Cresylic acid (isomers and mixture)
94757 2,4-D, salts and esters
111444 Dichloroethyl ether (Bis(2-chloroethyl)ether)
121697 N,N-Diethyl aniline (N,N-Dimethylaniline)
64675 Diethyl sulfate
60117 Dimethyl aminoazobenzene
119937 3,3′-Dimethyl benzidine
79447 Dimethyl carbamoyl chloride
68122 Dimethyl formamide
57147 1,1-Dimethyl hydrazine
131113 Dimethyl phthalate
77781 Dimethyl sulfate
534521 4,6-Dinitro-o-cresol, and salts
123911 1,4-Dioxane (1,4-Diethyleneoxide)
106898 Epichlorohydrin (l-Chloro-2,3-epoxypropane)
140885 Ethyl acrylate
100414 Ethyl benzene
51796 Ethyl carbamate (Urethane)
75003 Ethyl chloride (Chloroethane)
106934 Ethylene dibromide (Dibromoethane)
107062 Ethylene dichloride (1,2-Dichloroethane)
107211 Ethylene glycol
151564 Ethylene imine (Aziridine)
75218 Ethylene oxide
96457 Ethylene thiourea
75343 Ethylidene dichloride (1,1-Dichloroethane)
7647010 Hydrochloric acid
7664393 Hydrogen fluoride (Hydrofluoric acid)
7783064 Hydrogen sulfide (See Modification)
58899 Lindane (all isomers)
108316 Maleic anhydride
74839 Methyl bromide (Bromomethane)
74873 Methyl chloride (Chloromethane)
71556 Methyl chloroform (1,1,1-Trichloroethane)
78933 Methyl ethyl ketone (2-Butanone) (See Modification)
60344 Methyl hydrazine
74884 Methyl iodide (Iodomethane)
108101 Methyl isobutyl ketone (Hexone)
624839 Methyl isocyanate
80626 Methyl methacrylate
1634044 Methyl tert butyl ether
101144 4,4-Methylene bis(2-chloroaniline)
75092 Methylene chloride (Dichloromethane)
101688 Methylene diphenyl diisocyanate (MDI)
82688 Pentachloronitrobenzene (Quintobenzene)
85449 Phthalic anhydride
1336363 Polychlorinated biphenyls (Aroclors)
1120714 1,3-Propane sultone
114261 Propoxur (Baygon)
78875 Propylene dichloride (1,2-Dichloropropane)
75569 Propylene oxide
75558 1,2-Propylenimine (2-Methyl aziridine)
96093 Styrene oxide
127184 Tetrachloroethylene (Perchloroethylene)
7550450 Titanium tetrachloride
95807 2,4-Toluene diamine
584849 2,4-Toluene diisocyanate
8001352 Toxaphene (chlorinated camphene)
108054 Vinyl acetate
593602 Vinyl bromide
75014 Vinyl chloride
75354 Vinylidene chloride (1,1-Dichloroethylene)
1330207 Xylenes (isomers and mixture)
0 Antimony Compounds
0 Arsenic Compounds (inorganic including arsine)
0 Beryllium Compounds
0 Cadmium Compounds
0 Chromium Compounds
0 Cobalt Compounds
0 Coke Oven Emissions
0 Cyanide Compounds1
0 Glycol ethers2
0 Lead Compounds
0 Manganese Compounds
0 Mercury Compounds
0 Fine mineral fibers3
0 Nickel Compounds
0 Polycyclic Organic Matter4
0 Radionuclides (including radon)5
0 Selenium Compounds
NOTE: For all listings above which contain the word compounds and for glycol ethers, the following applies: Unless otherwise specified, these listings are defined as including any unique chemical substance that contains the named chemical (i.e., antimony, arsenic, etc.) as part of that chemical’s infrastructure.
1 X’CN where X = H’ or any other group where a formal dissociation may occur. For example KCN or Ca(CN)2
2 Includes mono- and di- ethers of ethylene glycol, diethylene glycol, and triethylene glycol R-(OCH2CH2)n -OR’ where
n = 1, 2, or 3
R = alkyl or aryl groups
R’ = R, H, or groups which, when removed, yield glycol ethers with the structure: R-(OCH2CH)n-OH. Polymers are excluded from the glycol category.(See Modification)
3 Includes mineral fiber emissions from facilities manufacturing or processing glass, rock, or slag fibers (or other mineral derived fibers) of average diameter 1 micrometer or less.
4 Includes organic compounds with more than one benzene ring, and which have a boiling point greater than or equal to 100 º C.
5 A type of atom which spontaneously undergoes radioactive decay.
If you are aware of, or participate in, any air toxics emission reduction activities in your community please feel free to contact us.
Estimated County Level Noncancer (Neurological) Risk (PDF) (1pg, 2.1 MB) – PDF version of map below.
Diesel Particulate Matter:
Diesel Particulate Matter (PM) is a mixture of particles that is a component of diesel exhaust. EPA lists diesel exhaust as a mobile source air toxic due to the cancer and noncancer health effects associated with exposure to whole diesel exhaust. EPA believes that exposure to whole diesel exhaust is best described, as many researchers have done over the years, by diesel particulate concentrations.
Dispersion Model:
A computerized set of mathematical equations that uses emissions and meteorological information to simulate the behavior and movement of air pollutants in the atmosphere. The results of a dispersion model are estimated outdoor concentrations of individual air pollutants at specified locations.
Emission Density:
Represents tons per year within a given area on a per square mile basis. In this assessment, total county emissions are divided by the total square mileage of the county. Emission density is often used to show emissions information graphically because it provides a more consistent basis for comparison than emissions totals alone.
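The emission-density calculation described above is a single division; a minimal sketch, with hypothetical county figures:

```python
def emission_density(total_tons_per_year: float, county_sq_miles: float) -> float:
    """Tons per year per square mile: total county emissions divided by
    the county's total area, as described in the assessment."""
    return total_tons_per_year / county_sq_miles

# Hypothetical county: 1,240 tons/yr of emissions over 620 square miles.
print(emission_density(1240.0, 620.0))  # -> 2.0 tons/yr per square mile
```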
Emissions Modeling System For Hazardous Air Pollutants (EMS-HAP):
This modeling system processes the National Emission Inventory to provide model-ready emissions for input into the ASPEN model. These inputs consist of tract-level emissions and point source emissions for each toxic air pollutant, temporalized into eight 3-hour time blocks for an annually-averaged year. For purposes of this tool, the EMS-HAP temporalized emission outputs are summed into annual emissions.
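The last step described above, collapsing the eight 3-hour time blocks back into annual emissions, is a straightforward sum. A minimal sketch (the block values are hypothetical, and this is not the EMS-HAP code itself):

```python
# Sketch of summing EMS-HAP-style temporalized outputs (eight 3-hour
# blocks for an annually-averaged year) into a single annual total.

def annual_emissions(block_totals):
    """Sum the eight 3-hour time-block emission totals into an annual figure."""
    if len(block_totals) != 8:
        raise ValueError("expected eight 3-hour time blocks")
    return sum(block_totals)

# Hypothetical per-block emission totals (tons) for one pollutant.
blocks = [12.5, 11.0, 14.2, 18.9, 20.1, 17.4, 15.0, 13.3]
print(annual_emissions(blocks))
```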
Exposure Assessment:
Identifying the ways in which chemicals may reach individuals (e.g., by breathing); estimating how much of a chemical an individual is likely to be exposed to; and estimating the number of individuals likely to be exposed.
TRI Early Data Sharing
EPA has adopted a new strategy to give the public quicker access to Toxics Release Inventory (TRI) data. The data sets provided below are preliminary, but users may want them for study and analysis. EPA encourages this use but cautions users to review and understand the limitations of the data. The Agency will release the data incrementally as they are processed through the reporting system, and updates will be posted on this page.
My Note –
Sources of iatrogenesis
Examples of iatrogenesis:
- medical error, poor prescription handwriting
- faulty procedures, techniques, information, or methods
- prescription drug interaction
- adverse effects of prescription drugs
- over-use of drugs leading to antibiotic resistance in bacteria
- nosocomial infection
- blood transfusion
- harmful emotional distress from the ascription of mental pathology nomenclature for transient personal problems
In the United States alone, recorded deaths per year (2000):
- 12,000—unnecessary surgery
- 7,000—medication errors in hospitals
- 20,000—other errors in hospitals
- 80,000—infections in hospitals
- 106,000—non-error, negative effects of drugs
These figures total 225,000 deaths per year from iatrogenic causes, which would constitute the third leading cause of death in the United States, after heart disease and cancer, and well ahead of the next leading cause (cerebrovascular disease). In interpreting these numbers, note the following:
- most data were derived from studies in hospitalized patients.
- the estimates are for deaths only and do not include negative effects that are associated with disability or discomfort.
- the estimates of death due to error are lower than those in the IOM report. If higher estimates are used, the deaths due to iatrogenic causes would range from 230,000 to 284,000.
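The component figures listed above sum as claimed; a quick arithmetic check:

```python
# Recorded US deaths per year (2000) by iatrogenic cause, from the list above.
deaths = {
    "unnecessary surgery": 12_000,
    "medication errors in hospitals": 7_000,
    "other errors in hospitals": 20_000,
    "infections in hospitals": 80_000,
    "non-error, negative effects of drugs": 106_000,
}
total = sum(deaths.values())
print(total)  # -> 225000
```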
From October 2004 to January 2006, wastewater and storm water runoff coming from the lab had increased levels of chromium, dioxin, lead, mercury and other pollutants, the water board said. The contaminated water flowed into Bell Creek and the Los Angeles River in violation of a July 1, 2004, permit that allowed release of wastewater and storm water runoff as long as it didn’t contain high levels of pollutants.
Federal and state regulation of the quality of bottled water. FDA’s bottled water standard of quality regulations generally mirror EPA’s national primary drinking water regulations under the Safe Drinking Water Act, as required by the Federal Food, Drug, and Cosmetic Act (FFDCA) as amended, although the case of DEHP (an organic compound widely used in the manufacture of polyvinyl chloride plastics) is a notable exception.
Specifically, FDA deferred action on DEHP in a final rule published in 1996, and has yet to either adopt a standard or publish a reason for not doing so, even though FDA’s statutory deadline for acting on DEHP was more than 15 years ago. More broadly, we found that FDA’s regulation of bottled water (including its implementation and enforcement), particularly when compared with EPA’s regulation of tap water, reveals key differences in the agencies’ statutory authorities. Of particular note, FDA does not have the specific statutory authority to require bottlers to use certified laboratories for water quality tests or to report test results, even if violations of the standards are found. Among our other findings, the states’ requirements to safeguard bottled water often exceed those of FDA, but are still often less comprehensive than state requirements to safeguard tap water.
Phthalates are a class of chemical compounds primarily used as plasticizers, added to plastics to increase flexibility, transparency, durability, and longevity; they are found in a variety of food containers and packaging.
We specifically cited FDA’s resource constraints, noting in 2008 that while the number of domestic firms under FDA’s jurisdiction increased from about 51,000 to more than 65,500 over fiscal years 2001 through 2007, the number of firms inspected declined from 14,721 to 14,566 during the same period. We cited resource constraints as a contributing factor, noting that the number of full-time-equivalent positions at FDA devoted to food safety oversight had decreased by about 19 percent from fiscal years 2003 through 2007.
Ultimately, as our January 2007 report recommended, a fundamental reexamination of the federal food safety system will be needed to look across the activities of individual programs within specific agencies with responsibilities related to food safety.
Toward that end, we had previously recommended in 2001 that the Congress, among other things, enact comprehensive, uniform, and risk-based food safety legislation and commission the National Academy of Sciences or a blue-ribbon panel to analyze alternative organizational food safety structures in detail.