A new report from the Government Accountability Office (GAO) confirms what many small-government environmentalists have been saying for years: States are more effective at regulating the disposal of wastewater from hydraulic fracturing operations than is the Environmental Protection Agency.
Hydraulic fracturing, also known as “fracking,” has led to a surge in oil and natural gas production in the United States. The process uses water, sand, and a few chemical additives to create fissures in oil- and gas-bearing rocks thousands of feet underground, allowing these resources to flow up to the surface.
Each hydraulically fractured well typically requires 2 to 4 million gallons of water, with 15 to 50 percent of this water flowing back to the surface after the process is complete. This water is typically briny, and it contains remnants of the sand and chemical compounds used to fracture the well.
This water must be disposed of or recycled. Disposal typically means injecting the wastewater into deep, underground wells regulated by EPA under the Underground Injection Control (UIC) program. GAO concluded EPA’s injection-well safeguards sufficiently protect drinking water: Few allegations of drinking water contamination, and fewer confirmed cases of groundwater contamination, have been reported.
However, GAO’s report stressed EPA has failed to be proactive regarding emerging challenges, such as induced seismicity (manmade earthquakes) and excessive pressurization of rock formations. GAO urged EPA to update its regulations to reflect state laws.
EPA cannot help enforce state regulations unless they are incorporated into federal rules, which is why GAO is urging EPA to update its rules to reflect the superior wastewater-injection protections adopted by states.
Amazingly, EPA responded by stating, “Incorporating changes into federal regulations, particularly through the rulemaking process, was burdensome and time-consuming.” This is the same EPA that is seeking to expand its authority (and therefore its control over your everyday life) by creating rules to regulate carbon dioxide emissions from power plants and micromanaging prairie potholes and the puddles that form in your driveway after a summer rain. Yet it considers its current duties “too burdensome.”
Claiming protection of the environment is “too burdensome” is not an option for state regulators, which is why they are more effective than federal regulators on these matters.
For example, Ohio passed regulations allowing the chief of the state’s Division of Oil and Gas to require a number of tests or evaluations addressing potential induced-seismicity risks from companies seeking permits for brine injection wells.
Other state regulations considered “too burdensome” for EPA adoption include on-site inspections of all injection wells to review their condition and operation. California, Colorado, and North Dakota require monthly reporting on injection pressure, injection volume, and the type of fluid being injected.
It makes little sense to entrust EPA to handle more responsibility when it has been incapable of fulfilling the responsibilities it already has. This is especially obvious when the responsibility involves essentially copying the example already implemented by state agencies.
EPA claims it does not have the resources to implement this program properly, but EPA’s budget request for 2014 was $8.153 billion, more than the entire annual budgets of 20 percent of states nationwide … yet these states, which have fewer resources at their disposal, manage to get the job done just fine.
It is time to seriously consider replacing EPA with a Committee of the Whole of the 50 state environmental protection agencies, an idea suggested by Jay Lehr, science director and senior fellow at The Heartland Institute, where I serve as research fellow. According to the GAO, we might as well do so, since the states seem to be doing all the heavy lifting already.
Isaac Orr (email@example.com) is a research fellow for energy and environmental policy at The Heartland Institute.
The 2010 introduction of Common Core, a set of requirements for what elementary and secondary school children should know in math and English language arts, has turned schools in one state after another into battlefields as its complexity and other factors led to protests against it. Even so, by mid-2014, an NBC/Wall Street Journal poll found that nearly half of those asked about it had never even heard of it. A number of states, including Missouri, Indiana, Oklahoma, and South Carolina, have withdrawn from it.
Schools today are often under fire for one reason or another. Ever since the 1960s, when teachers unions began to secure more and more of the control formerly exercised by local and state school boards, Americans have been engaged in efforts to improve the elementary and secondary education systems. Many have elected to homeschool their children. Others have pushed for school choice to permit their children to attend a school that was clearly doing a better job than the one to which their children were assigned.
As youngsters settle into their classes, there are a number of trends worth noting.
Perhaps one of the most interesting trends is the expansion of online classes into K-12. As Ashley Bateman noted in a recent issue of School Reform News, “In 2013 ten million students of all ages participated in more than 1,200 massive, open, online courses offered by more than 200 universities.” Of value to self-motivated students in particular, online classes are sure to find a larger audience among students who have grown up in the virtual world of game playing.
Another trend was noted by Marcy C. Tillotson, an education reporter for Watchdog.org: the increasing collection of data about each student. Parents worry that things done at a very young age, like a schoolyard fight or emotional problems, will follow their children into college long after they have outgrown the problems or behaviors of childhood. Parents want to know what data is being collected and who has access to it. As often as not, they cannot find out.
Increasingly, school choice – a parent’s right to enroll a child in a selected public, private, or parochial school – has become an issue in state legislatures, where some support it and some forbid it. In Louisiana and Texas, for example, school choice programs and scholarship credits have gained support as a political issue. In Florida, the teachers union has initiated a lawsuit “to eliminate school choice for many low-income students and effectively kill a program to help students with autism and other special needs.” In North Carolina, the state Supreme Court rendered a decision that permits more than 2,000 low-income parents to send their children to schools of their choice.
Attention to the quality of teachers, as opposed to letting tenure keep poorly performing ones in the classroom, is a growing trend. Last year in California, a first-of-its-kind teacher quality lawsuit was decided in favor of the education reformers who brought it, striking down tenure protections; a similar lawsuit has been announced for New York.
As Ms. Tillotson reported, “Vergara v. California struck down state laws that required teacher layoffs based solely on seniority with no regard to teacher effectiveness, gave teachers permanent status after two years on the job, and made it difficult for school administrators to dismiss ineffective teachers.” As this trend expands to other states, a major complaint regarding poor performance will be addressed.
At the heart of the issue of teacher quality are the programs that prepare teachers to teach. As Ms. Tillotson noted, “A week after a California judge ruled on a case involving teacher tenure, dismissals and layoffs, the National Council for Teacher Quality released its annual report on another fundamental problem, the poor quality of teacher preparation programs.” The report found that, as a whole, the programs need improvement: only a quarter of them expect aspiring teachers to be in the top half of their college’s academic pool, and on a 125-point scale, NCTQ ranked most programs as earning fewer than 50 points.
Increasingly, the quality and content of various educational programs are being questioned and challenged. One example is the College Board’s Advanced Placement U.S. History Framework (APUSH) and the questions about who wrote the curriculum that is taught to 500,000 students in more than 8,000 high schools every year.
When Larry Krieger, a retired teacher praised by the College Board, and Jane Robbins, a senior fellow at The American Principles Project, asked the College Board who authored the program, all they got was a reference to a web page listing 19 college professors and teachers who served on two College Board committees but were listed not as authors but under “Acknowledgements.” Krieger and Robbins call the history program “biased, poorly written, and ineptly organized”; one that “has raised alarms from state and national leaders.” We keep hearing about the importance of “transparency,” but apparently the College Board does not think it applies to them.
It has long been known that U.S. schools tend to perform more poorly than those in other nations. Joy Pullman, a research fellow of The Heartland Institute and managing editor of School Reform News, reported that “According to two recently released studies, the schools middle-income families send their kids to are not as good as parents think.”
“A national study,” wrote Ms. Pullman, “found U.S. students whose parents have college degrees perform worse than peers from comparable families in other countries. In the United States, 43 percent of such children tested ‘proficient’ in math on an international test, compared to 71 percent of comparable students from Poland, 68 percent in Japan, and 64 percent in Germany.” Overall, U.S. students performed better than those in only six countries.
Not surprisingly, Ms. Bateman has reported that “Accepting federal mandates in exchange for funding is the crux of the problem” of ever-growing educational bureaucracies at the state level. “States report that 40 percent of the paperwork burden they deal with is to comply with federal regulations,” said Lindsey Burke, the Will Skillman Fellow in Education at The Heritage Foundation.
When one considers how much tax revenue is collected for the purpose of educating our youth, one would hope for better results. Fortunately, there are many individuals, parents, and organizations seeking to improve the quality of education, and our schools are going to remain battlefields for many years to come.
© Alan Caruba, 2014
[Originally published at Warning Signs]
Their chants, rants and placards demanded that we stop climate change (that’s been ongoing throughout Earth and human history), eliminate fossil fuels (that supply 80% of the energy that makes their modern living standards possible), ban fracking (which is largely responsible for reducing the carbon dioxide emissions they blame for global warming that ended at least 18 years ago), and abolish capitalism!
* Al Gore grinning for a photo op with NYC Mayor Bill de Blasio and UN Secretary General Ban Ki-moon. This is the same Al Gore who got a C and a D in his two college science courses, told “Tonight Show” audiences that the Earth’s interior is “several million degrees” (the core is actually about nine thousand degrees Fahrenheit), and refuses to debate anyone on climate change or even take audience questions he has not preapproved.
* Actor Leonardo DiCaprio basking in the NYC limelight, releasing a series of movies claiming that climate change is immediate and dangerous, and marching with other people’s anti-tar sands and “100% for the planet” signs – after arriving in the Big Apple not via commercial jetliner and subway.
* Actor Mark Ruffalo denouncing Climate Depot director Marc Morano for daring to ask whether celebrities like Messrs. Gore and DiCaprio are appropriate spokesmen for “stop global warming” campaigns – considering how much they enjoy multiple mansions, global vacations, and private jets, yachts, SUVs, helicopters and limos. Questions like that are “off-limits,” Ruffalo declared. “That is a question you shouldn’t be asking here today, because that defies the spirit of what this is about,” he said. “Anyone who attacks Leonardo DiCaprio is either a coward or an ideologue.”
Wow! I wasn’t aware that asking inconvenient questions or pointing out inconvenient truths was improper – especially when posed to people who put themselves forward as paragons of virtue for leading campaigns that inevitably restrict access to energy, lower developed country living standards, and keep the Third World impoverished – while the leaders enjoy lifestyles that are many times more profligate, carbon-intensive and carbon dioxide-spewing than the average American or African citizen’s.
But surely the most surreal episode of the march was Robert F. Kennedy, Jr. saying Morano and I and thousands like us should be jailed for expressing doubts about “dangerous manmade climate change.”
“I think they should be in jail … with all the other war criminals.” Republican politicians too – “those guys are doing the Koch brothers’ bidding and are against all the evidence, saying global warming does not exist. They are contemptible human beings,” he fumed, for our “war on science,” I presume.
So RFK the younger wants to punish us for the “crimes” of exercising our First Amendment rights, demanding actual evidence to support alarmist assertions, saying people’s needs for reliable, affordable energy must be part of the conversation – and insisting that those needs take precedence over absurd claims that climate change is “the world’s most fearsome weapon of mass destruction,” posing “greater long-term consequences” than ISIL, terrorism or Ebola, as Secretary of State John Kerry insists.
Mr. Kennedy needs to read the Constitution, reflect on the once proud history of free speech and civil rights in the United States, and acknowledge the harm his policies are causing. He also needs to get his facts straight.
None of us says global warming or climate change “does not exist.” Global warming, global cooling, “climate disruption” and “wild weather” have been “real” since Earth began. What we challenge is alarmist assertions that human carbon dioxide emissions have replaced the powerful, complex natural forces that caused repeated ice ages, little ice ages, warm periods, droughts, storms and other fluctuations throughout history. We dispute claims that any climate changes will be dangerous, and are our fault.
We vigorously refute claims that CO2 is “pollution.” This is what we exhale. It’s the trace gas (0.04% of our atmosphere) that enables plants to grow, and makes all life on Earth possible.
We debunk talk of countless “disasters” that Climate Armageddonites – from President Obama on down – blame on fossil fuels and insist “are happening right now.” The planet hasn’t warmed for 18 years. The nearly nine years since Wilma in October 2005 is the longest period since 1900 (and maybe the US Civil War) without a category 3-5 hurricane hitting the United States. Floods, droughts and other events are all within historic patterns, as readers can see in my new report, Climate Hype Exposed – how pseudo-science is used to justify policies that hurt jobs, liberties and people.
Just as crazy, RFK Jr. made it clear that he and his wife will not give up their $5,000,000 Malibu home or “reduce the, uh, our quality of life in order to have a, uh, rational free market, in order to, um, stop the use of carbon and to divorce ourselves from a fuel that is destroying our planet.” But they, many of the NYC marchers and climate alarm leaders are surely doing all they can to reduce your quality of life.
The policies RFK & Comrades demand would raise the price of fossil fuel energy that powers our modern world, creates and preserves jobs, and improves, enhances and safeguards lives. In Europe, they’ve made energy so expensive that millions of pensioners and other poor families cannot afford to heat their homes properly – and thousands die needlessly from hypothermia every winter. We’re heading there, too.
They cause millions of deaths every year in developing countries – by preventing construction of state-of-the-art coal and gas-fired power plants, and depriving people of reliable, affordable energy. More than 2.5 billion people worldwide must still use wood, charcoal, coal and dung in open fires to heat and cook; well over a billion still do not have electricity, still do not enjoy its wondrous blessings.
As a result, millions die every year from lung diseases due to constantly breathing polluted smoke from cooking and heating fires, from intestinal diseases caused by spoiled food and tainted water, and from countless other diseases of energy deprivation and poverty. The vast majority are women and children.
My colleagues and I would gladly go on trial and even serve time for “treasonous” speech against the climate alarm establishment … and for “polluting” the atmosphere with plant-fertilizing, life-giving CO2.
But then we would insist that Mr. Kennedy and his comrades also be tried and sentenced: for eco-manslaughter and crimes against humanity, for the disease and death their policies cause and perpetuate.
The International Criminal Court might be the proper venue, just as RFK suggested for us. But perhaps the climate demagogues and anti-fossil fuel zealots should be tried – and serve their sentences – in countries that have suffered the most at their hands, for their war on women, children and the poor. Conditions in those Third World prisons are notoriously worse than in the zealots’ mansions, and in the comparatively posh modern jails and prisons found in most of the USA and Europe.
Alternatively, these true climate criminals could be sentenced to do community service, while living like the natives: in mud huts, breathing their air, drinking their water, being bitten by disease-infested insects, and having to walk miles to basic medical services when they inevitably contract malaria, pneumonia or dysentery. That could make alternative community service a death sentence – akin to what Mr. Kennedy and his self-righteous friends are imposing on so many unfortunate people.
It’s time to refocus. The world needs abundant, reliable, affordable energy, to create opportunity and prosperity, improve and save lives, and enable us to adapt to whatever climate changes might come. Misguided noise about climate change “deniers” and humans replacing natural forces in controlling Earth’s climate serve only to distract us from the critical job at hand.
Paul Driessen is senior policy analyst for the Committee For A Constructive Tomorrow (CFACT) and Congress of Racial Equality (CORE), and author of Eco-Imperialism: Green power – Black death.
What does it tell you when the President sends 3,000 U.S. troops on a “humanitarian” mission to West Africa? It tells me he has put the U.S. at risk if any or a portion of these troops return after having been infected.
As always, history has lessons that cannot be ignored. In 1918 and 1919, a pandemic of Spanish influenza caught nations by surprise, infecting an estimated 500 million people and killing between 50 and 100 million of them in three waves. It began in the U.S. in March 1918 at a crowded army camp, Fort Riley, Kansas.
As these troops, living in close proximity to one another, were transported between camps, the disease spread quickly, even before they were assembled at East Coast ports en route to France. They in turn brought it to the trenches of the war in Europe.
The second wave struck in 1918 at a naval facility in Boston and at the Camp Devens military base in Massachusetts. October 1918 was the deadliest month: 195,000 Americans died. The Harvard University Open Library notes that the supply of health care workers, morticians, and grave diggers dwindled, and mass graves were often dug to bury the dead. There were subsequent outbreaks in 1957 and 1968.
And, at some point, 3,000 U.S. troops will be returning from West Africa to military facilities here at home.
Thus far we have been fortunate to have identified the case of the Ebola victim who entered the nation from Liberia, but there are few guarantees that more cases will not appear or go undetected. The October 4 Washington Post reports that “Since July, hospitals around the country have reported more than 100 cases involving Ebola-like symptoms to the federal Centers for Disease Control and Prevention.”
Largely unknown is that 90,000 Americans die annually from preventable infections they acquire while in hospitals!
The concern about illnesses entering the U.S. applies particularly to our southern border, which remains porous. Thank goodness Texas has taken measures to tighten its border security, but I am reminded that the Obama administration sued Arizona when it attempted to increase its security against the influx of illegal aliens.
Obama is the President who engineered an invasion of thousands of children and others from Latin America and then distributed them to various states without informing their governors or other authorities of who and where they were. Not surprisingly, in recent months cases of an enterovirus respiratory disease affecting school-age children have been reported around the nation.
Obama has no regard for the sovereignty of the nation or its immigration laws.
This is the same President who has made it clear that he intends to extend amnesty by executive order to an estimated eleven million illegal aliens, but not until after the midterm elections in November. I doubt that he has the constitutional power to do this. I hope the U.S. Congress has the means and the will to negate this.
The U.S. has a healthcare system that is the envy of the world, but the introduction of ObamaCare is already having negative effects on its administration and the former system of privately purchased healthcare insurance. Hundreds of thousands of Americans who had such insurance have lost it and those who signed up for ObamaCare are discovering it is far more expensive.
Perhaps the most under-reported story thus far regarding Ebola is the fact that in 2010, according to The Daily Caller, “the administration of President Barack Obama moved with virtually no fanfare to abandon a comprehensive set of regulations which the Centers for Disease Control and Prevention (CDC) had called essential to preventing international travelers from spreading deadly diseases inside the United States.” Among the viral diseases of concern was Ebola.
I want to have confidence in the Centers for Disease Control, but after witnessing the failures of one government agency after another including the Secret Service, I wish I felt better about them.
I have no doubt its staff are seriously concerned and doing what they can to respond to the threat, but I also think they and the rest of us are at risk from a regime led by a man whose incompetence has written a new chapter in the history of the presidency.
I wish that I felt confident that the Obama administration will take such steps as are necessary to keep the Ebola threat from harming the health of the nation such as not issuing visas to those from the affected nations in Africa, but the record to date limits that confidence.
© Alan Caruba, 2014
[Originally published at Warning Signs]
“When the Dunes Sagebrush Lizard (DSL) was being considered for listing under the Endangered Species Act (ESA),” Chris Bryan, agency spokesman for the Texas Comptroller, told me, “significant parts of the Texas economy were placed at risk.”
On September 30, District of Columbia District Court Judge Rudolph Contreras ruled against the Center for Biological Diversity (CBD) and the Defenders of Wildlife. The groups brought litigation in the hopes of requiring the Fish and Wildlife Service (FWS) to reverse its 2012 decision not to list the lizard as endangered.
The 2012 decision was the first time that community engagement beat back a proposed ESA listing—a stinging defeat to a movement that has historically used lawsuits as an effective weapon.
In August 2013, Texas Comptroller Susan Combs was granted intervenor status in the case. In October, several regional and national oil and gas associations joined Combs.
The DSL story represents a new chapter in ESA compliance that allows conservation and productive activity to coexist. Previously, presence of an ESA-listed species would shut down activity with harsh consequences for landowners and communities.
The spotted owl stands as the posterbird for bad ESA policy. More than 20 years ago, the spotted owl was listed under the ESA. As a result, much of the logging industry in the Pacific Northwest is gone—leaving thousands unemployed and hundreds of communities decimated. Fifty percent of the nation’s forestry jobs lost from 1990 to 2009 were in just two states: Oregon and Washington. Yet, the listing did not stop the decline of the spotted owl. And, as a result of the listing, forest management in the West changed—leaving thousands of acres overgrown and unhealthy, resulting in the devastating wildfires we see today.
Texas decided to do it differently.
Aware that the Dunes Sagebrush Lizard was an ESA target, Texans began conservation efforts in 2008. Private land in the Permian Basin of West Texas and Southeastern New Mexico—an area that produces 15 percent of U.S. oil and 5 percent of its natural gas and is also a prime ranching and farming region—makes up about half of the DSL habitat. Locals were very worried that if the lizard were listed, the regulations would seriously impact their operations and impose substantial costs.
Stories of individual losses, like the spotted owl’s, prompted the Texas State Legislature to pass a bill creating the Interagency Task Force on Economic Growth and Endangered Species to help municipalities and regional governmental bodies cope with the ESA.
Additionally, the Comptroller’s Office provided funds to survey the lizard’s habitat—which revealed 28 more Texas DSL populations, in addition to the three known populations.
The surveys were possible because of a special provision the legislature passed in 2011 allowing DSL population locations to remain confidential. Without the force of state law, landowners would have resisted cooperating in conservation efforts out of fear their property would be rendered unusable.
By being proactive, Texas was able to enact voluntary conservation programs that brought about the 2012 FWS decision not to list the lizard. Addressing the Texas approach, Brian Seasholes, director of the Endangered Species Project at the Reason Foundation, says: “The Texas approach protects landowners from the ESA and the federal government, while finding a balance between economic activity and species conservation.”
Comptroller Combs is elated with the court’s decision, especially considering the pushback she received when she took a risky stand and embarked on the experimental plan to forge an innovative, flexible, and successful conservation plan for the DSL. Responding to the court ruling, Combs said: “It supports our basic belief that the TCP provides appropriate conservation for the lizard and reaffirms that the research conducted by Texas A&M University about the DSL helped to provide Fish and Wildlife the best scientific data available to make the decision not to list the species as endangered.”
New Mexico Congressman Steve Pearce, who spearheaded much of the public education on the potential impacts the DSL listing would have on communities in his district, likewise welcomed the court’s decision:
It is about time the courts stood up for private landowners over radical environmental groups that continually use sue-and-settle tactics to exploit taxpayer money to pay lawyers and fund themselves instead of recovering species. This decision ensures that sound conservation efforts are carried out in Eastern New Mexico without sacrificing the economic activity that the area depends on. The plan itself is a great example of how cooperative conservation efforts between private industry, state officials, landowners, and the federal government are more than adequate to protect species. This decision differs from the Fish and Wildlife’s listing of the lesser prairie chicken in March that severely hindered a successful cooperative conservation effort. I hope the Fish and Wildlife Service along with the courts continue to allow future efforts like this to succeed.
Now that Texas’s proactive efforts—such as those engaged to protect the Dunes Sagebrush Lizard—have withstood legal challenge, other states may take similar legislative and conservation actions preventing environmental groups (under the guise of conservation) from using lawsuits to block growth in the United States.
The author of Energy Freedom, Marita Noon serves as the executive director for Energy Makes America Great Inc. and the companion educational organization, the Citizens’ Alliance for Responsible Energy (CARE). She hosts a weekly radio program, America’s Voice for Energy, which expands on the content of her weekly column.
Photo credit: U.S. Fish & Wildlife Service
[Originally published at Breitbart]
Imagine police seize your money, your car, even your house. Imagine this happens without you being convicted of a crime or even charged with one. Imagine being told you must sue the government to get back your property and prove you did nothing wrong, and the government can do nothing – nothing – and still keep the property.
This happens thousands of times a year across the country. But it will soon happen less often in Minnesota, which has taken a small but important step toward ending one of the most abusive law enforcement practices in the nation. It’s a step the federal and other state governments should take to protect citizens from abusive police and prosecutors and restore a fundamental principle of life in these United States: that we are presumed innocent until proven guilty.
States and local governments have stolen billions of dollars of property from people who have never been convicted of a crime or charged with one. They’ve done it under a practice called “civil forfeiture.” It’s an outgrowth of the nation’s “war on drugs,” which has been raging and failing since President Richard Nixon launched it more than 40 years ago.
Civil forfeiture defenders say it’s another way to get at criminals—usually drug users or sellers—while helping to fund law enforcement. Under civil forfeiture, police and prosecutors may seize property, sell it, and use the proceeds to pad their budgets. To get back their property, forfeiture victims must spend thousands of dollars in legal fees to sue. In many instances, the legal costs would exceed the value of the property.
Politicians eager to look tough on crime decided to structure civil forfeiture so police and prosecutors may take property on the mere suspicion it could be linked to a drug crime or certain other nefarious activities. Police and prosecutors don’t have to prove anything. All they have to do is claim they “suspect” the person losing the property might have been planning to use it in an illegal way or might have used or obtained it illegally. Their “suspicions” often are so flimsy no arrest or criminal charge is made. They just take the property.
The final straw for Minnesota legislators came after the Minneapolis Star-Tribune broke a scandal in the state’s Metro Gang Strike Force by reporting on the brutality of its raids and the apparent police thefts of cash and other property, including at least 13 seized cars that went “missing.” A state court later ordered $840,000 in seized property returned to forfeiture victims, and the strike force was disbanded.
Effective August 1, a bill signed into law by Gov. Mark Dayton will require people in Minnesota to be convicted of a drug crime before their property can be seized through forfeiture.
Civil forfeiture for other reasons is still possible, but reining in property seizures under the pretext of drug activity is a good start.
Through civil forfeiture our local, state, and federal governments effectively declare we are presumed guilty until proven innocent. In other words, it is a system of tyranny by police and prosecutors. Americans should never accept tyranny, no matter what excuses people in government give for imposing it.
North Carolina is the only state with no civil forfeiture. Let us hope Minnesota and the rest of the states and the federal government get to where North Carolina is and ban all civil forfeitures.
[Originally published at Inside Sources]
Below is a video from John Oliver’s HBO show “Last Week Tonight” discussing civil forfeiture laws.
Government violates the Wallet Rule. Which is:
You go out on a Friday night with your wallet. You go out the following Friday night with my wallet. On which night are you going to have more fun?
Government is always working with our wallet – its own is empty until it first fleeces ours. It will thus never spend our money as prudently, wisely, or well as we do.
Government is just another organism. Like any other, its first priority is self-preservation – its second self-expansion. And worse than just about any other – it will do whatever it takes to accomplish these priorities.
Including lying its collective face off.
The Barack Obama Administration is the most government-expansive administration in our nation’s history. To that end, it has used any means necessary – including lying its collective face off. For instance:
This Administration’s obsessive government expansion occurs despite its being, like any other administration and government entity, incessantly and serially incompetent at doing just about anything.
All of which has led people – well beyond conservative and libertarian circles – here:
Which brings us to the current debate over the government grabbing huge new authority over the Internet.
President Obama’s Federal Communications Commission (FCC) – and its Obama-appointee Chairman Tom Wheeler – are contemplating fundamentally transforming how the government regulates the Web. It’s called Title II Reclassification.
Title II is the uber-regulatory superstructure with which we have strangled landline phones – you know, that bastion of technological and economic innovation. Which do you find more impressive – your desktop dialer or your iPhone?
Title II regulations date back to the 1930s – so you know they’ll be a perfect fit for the ultra-modern, incredibly dynamic, expanding-like-the-universe World Wide Web.
This would be the most detrimental of all Information Superhighway road blocks. Rather than the omni-directional, on-the-fly innovation that now constantly occurs, Title II is a Mother-May-I-Innovate, top-down traffic congest-er. Imagine taking a 16-lane Autobahn down to just a grass shoulder.
But fret not, the regulators tell us. They will wield just some – and not all – of their massive new powers. They will practice “forbearance.”
“(F)orbearance” refers to a special magic power that Congress gave the FCC…which gives the FCC the power to say “you know that specific provision of law that Congress passed? We decide it really doesn’t make sense for us to enforce it in some particular case, so we will “forbear” (hence the term ‘forbearance’) from enforcing it.”
Can we trust government to – forever and for always – leave regulatory powers on the table unused?
Can we trust this Administration – the most government-expansive ever – to do so?
Can we trust this particular FCC?
In a letter sent today to Federal Communications Commission Chairman Tom Wheeler, a coalition of groups expressed concerns over the agency’s loss of objectivity and impartiality in recent proceedings, especially the FCC’s ongoing Open Internet rulemaking.
The letter urges the Commission to keep partisan politics out of its decision-making process, to avoid spinning media coverage, and to focus on substance, not the total number of comments filed in controversial proceedings.
We certainly cannot. In what can we trust?
So when the government tells us – as it ramps up new, massive government power grabs – “If you like your Internet – you can keep your Internet?”
[Originally published at Human Events]
Australians are supposed to feel guilty because some bureaucrat in the climate industry has calculated that we have a very high per capita “carbon footprint.” By “carbon footprint,” they mean the amount of carbon dioxide gas produced by whatever we do. Every human activity contributes to our carbon footprint – even just lying on the beach breathing gently produces carbon dioxide.
Producing carbon dioxide is not bad – it is an essential gas in the cycle of life, and beneficial for all life. There is no proof whatsoever that human emissions cause dangerous global warming. Moreover, it is not per capita emissions that could affect the climate – it is total emissions, and on that measure Australia’s small contribution is largely irrelevant. This is just another PR weapon in the extreme green alarmist arsenal.
Even if carbon footprints were important, not all footprints are environmentally equal – some are good, some are bad and some are just plain ugly.
“Good” carbon footprints are the result of producing unsubsidised things for the benefit of others. An example is a grazier in outback Australia whose family lives frugally and works hard but has a high carbon footprint producing wool, mutton and beef from sustainable native grasslands, and who may use quad bikes, diesel pumps, electricity, tractors, trucks, trains, planes and ships to supply distant consumers. Many productive Australians with good carbon footprints produce food and fibres, seafood and timber, minerals and energy for grateful consumers all over the world. Activities like this create a large “per capita carbon footprint” for Australia. That so few people can produce so much is an achievement to be proud of.
A “bad” carbon footprint is produced when government subsidies, grants, hand-outs, tax breaks or mandates keep unproductive or unsustainable activities alive, leaving their footprint, but producing little useful in return. The prime examples are subsidised green energy and the government climate industry, but there are examples in all nationalised or subsidised industries and activities. (Russia and East Germany easily met their initial Kyoto targets by closing decrepit Soviet-era nationalised industries.)
An “ugly” carbon footprint is produced by green hypocrites who preach barefoot frugalism to us peasants while they live the opulent life style. Examples are the mansions, yachts and jet-setting of prominent green extremists such as Al Gore and Leonardo DiCaprio.
The ultimate ugly carbon hypocrites are those who organise and attend the regular meetings, conferences and street protests, drawing thousands of globe-trotting alarmists and “environmentalists” from all over the world by plane, yacht, car, bus, train and taxi to eat, drink, chant and dance while they protest about over-population, excessive consumption and heavy carbon footprints of “all those other people”.
Maybe they should lead by example and stop travelling, eating, drinking and breathing.
Celebrating The Work Of Nobel Prize Winning Economist, F.A. Hayek – A Man Who Has Made the 21st Century a Freer and More Prosperous Time
Forty years ago, on October 9, 1974, the Nobel Prize committee announced that the co-recipient of that year’s award for economics was the Austrian economist, Friedrich A. Hayek. Never was there a more deserving recognition for one of the truly great free market thinkers of modern times.
The Nobel committee recognized his contributions, including “pioneering work in the theory of money and economic fluctuations and for [his] penetrating analysis of the interdependence of economic, social and institutional phenomena.”
Over a scholarly and academic career that spanned seven decades, Hayek was one of the leading challengers against Keynesian economics, a profound critic of socialist central planning, and a defender of the open, competitive free society.
The awarding of the Nobel Prize for Economics in 1974 represented capstone recognition to an intellectual life devoted to understanding the workings and superiority of social systems grounded in the idea and ideals of human freedom and voluntary association.
“Austrian” Influences on Hayek
Friedrich August von Hayek was born on May 8, 1899 in Vienna, Austria. He briefly served in the Austrian Army on the Italian front during World War I. Shortly after returning from the battlefield in 1918 he entered the University of Vienna and earned two doctorates, one in jurisprudence in 1921 and the other in political science in 1923. While at the university, he studied with one of the founders of the Austrian school of economics, Friedrich von Wieser.
But perhaps the most important intellectual influence on his life began in 1921, when he met Ludwig von Mises while working for the Austrian Reparations Commission. It is not meant to detract from Hayek’s own contributions to suggest that many areas in which he later made his profoundly important mark were initially stimulated by the writings of Mises. This is most certainly true of Hayek’s work in monetary and business-cycle theory, his criticisms of socialism and the interventionist state, and in some of his writings on the methodology of the social sciences.
In 1923 and 1924, Hayek visited New York to learn about the state of economics in the United States. After he returned to Austria, Mises helped arrange the founding of the Austrian Institute for Business Cycle Research, with Hayek as the first director.
Though Hayek initially operated the institute with almost no staff and only a modest budget primarily funded by the Rockefeller Foundation, it was soon recognized as a leading center for the study of economic trends and forecasting in central Europe. Hayek and the Institute were frequently asked to prepare studies on economic conditions in Austria and central Europe for the League of Nations.
Hayek as Opponent of Keynesian Economics
In early 1931, Hayek traveled to Great Britain to deliver a series of lectures at the London School of Economics. The lectures created such a sensation that he was invited to permanently join the faculty of the LSE. In the early fall of 1931 these lectures appeared in book form under the title Prices and Production. So widely influential did this book and his other writings become at the time that through a good part of the 1930s, Hayek was the third-most frequently cited economist in the English-language economics journals. (John Maynard Keynes and his Cambridge University colleague Dennis Robertson came in first and second.)
This began his decade-long challenge to Keynes’ emerging “new economics” of macroeconomics and its rationale for activist government manipulation through monetary and fiscal policy.
In 1931–1932, Hayek wrote a lengthy two-part review of Keynes’s Treatise on Money for the British journal Economica. It was considered a devastating critique of Keynes’ work, one that forced Keynes to rethink his ideas and go back to the drawing board.
At the same time, the Great Depression of the early 1930s served as the backdrop against which Hayek explained his own theory and criticized Keynes.
Monetary Mismanagement and the Great Depression
In Prices and Production (1931) and Monetary Theory and the Trade Cycle (1933) Hayek argued that in the 1920s the American Federal Reserve System had followed a monetary policy geared toward stabilizing the general price level. But that decade had been one of major technological innovations and increases in productivity. If the Federal Reserve had not increased the money supply, the prices for goods and services would have gently fallen to reflect the increased ability of the American economy to produce greater quantities of output at lower costs of production.
Instead, the Federal Reserve increased the money supply just sufficiently to prevent prices from falling and to create the illusion of economic stability under an apparently stable price level. But the only way the Fed could succeed in this task was to increase reserves to the banking system, which then served as additional funds lent primarily for investment purposes to the business community.
To attract borrowers to take these funds off the market, interest rates had to be lowered. Beneath the calm surface of a stable price level, interest rates had been artificially pushed below real market-clearing levels. That generated a misdirection of labor and investment resources into long-term capital projects that eventually would be revealed as unsustainable because there was not enough real savings available to complete and maintain them.
The break finally came in 1928 and 1929, when the Fed became concerned that prices in general were finally beginning to rise. The Fed stopped increasing the money supply, investment profitability became uncertain, and the stock market crashed in October 1929.
Hayek argued that the economic downturn that then began was the inevitable consequence of the investment distortions caused by the earlier monetary inflation. A return to economic balance required the writing down of unprofitable capital investments, a downward adjustment of wages and prices, and a reallocation of labor and other resources to uses reflecting actual supply and demand in the market.
But the political and ideological climate of the 1930s was one increasingly dominated by collectivist and interventionist ideas. Governments in Europe as well as the United States did everything in their power to resist these required market adjustments. Business interests as well as trade unions called for protection from foreign competition, as well as government support of various types to keep prices and wages at their artificial inflationary levels. International trade collapsed, industrial output fell dramatically, and unemployment increased and became permanent for many of those now out of work.
Throughout the 1930s Keynes presented arguments to justify activist monetary and fiscal policies to try to overcome the imbalances the earlier monetary manipulation and interventions had created. This culminated in Keynes’ 1936 book, The General Theory of Employment, Interest and Money, which soon became the bible of a new macroeconomics that claimed that capitalism was inherently unstable and could only be saved through government “aggregate demand management.”
Hayek and other critics of Keynesian economics were rapidly swept away in the euphoric belief that government had the ability to demand-manage a return to full employment.
Hayek as Critic of Socialist Central Planning
But while seemingly “defeated” in the area of macroeconomics, Hayek realized that what was at stake was the wider question of whether in fact government had the wisdom and ability to successfully plan and guide an economy. This also led him to ask profoundly important questions about how markets successfully function and what institutions are essential for economic coordination to be possible in a complex system of division of labor.
In 1935, Hayek edited a collection of essays titled Collectivist Economic Planning, which included a translation of Ludwig von Mises’ famous 1920 article, “Economic Calculation in the Socialist Commonwealth” on why a socialist planned economy was functionally unworkable. For the volume, Hayek wrote an introduction summarizing the history of the question of whether socialist central planning could work and a concluding chapter on “the present state of the debate” in which he challenged many of the newer arguments in support of planning.
This was followed by a series of articles over the next several years on the same theme: “Economics and Knowledge” (1937), “Socialist Calculation: The Competitive ‘Solution’” (1940), “The Use of Knowledge in Society” (1945), and “The Meaning of Competition” (1946). Along with other writings, they were published in a volume entitled, Individualism and Economic Order (1948).
Divided Knowledge and Market Prices
In this work Hayek emphasized that the division of labor has a counterpart: the division of knowledge. Each individual comes to possess specialized and local knowledge in his corner of the division of labor that he alone may fully understand and appreciate how to use. Yet if all of these bits of specialized knowledge are to serve everyone in society, some method must exist to coordinate the activities of all these interdependent participants in the market.
The market’s solution to this problem, Hayek argued, was the competitive price system. Prices not only served as an incentive to stimulate work and effort, they also informed individuals about opportunities worth pursuing. Hayek clearly and concisely explained this in “The Use of Knowledge in Society”:
“We must look at the price system as such a mechanism for communicating information if we want to understand its real function . . . The most significant fact about this system is the economy of knowledge with which it operates, or how little the individual participants need to know in order to be able to take the right action.”
In elaborating his point, Hayek wrote that “The marvel is that in a case like that of a scarcity of one raw material, without an order being issued, without more than perhaps a handful of people knowing the cause, tens of thousands of people whose identity could not be ascertained by months of investigation, are made to use the material or its products more sparingly.”
Hayek added: “I am convinced that if it [the price system] were the result of deliberate human design, and if the people guided by the price changes understood that their decisions have significance far beyond their immediate aim, this mechanism would have been acclaimed as one of the greatest triumphs of the human mind.”
It was in this period, as well, that Hayek applied his thinking about central planning to current politics. In 1944 he published what became his most famous book, The Road to Serfdom, in which he warned of the danger of tyranny that inevitably results from government control of economic decision-making through central planning. His message was clear: Nazism and fascism were not the only threats to liberty. The little book was condensed in Reader’s Digest and read by millions, and resulted in Hayek going on a nationwide lecture tour in the United States that was a resounding success.
In 1949 Hayek moved to the United States and took a position at the University of Chicago in 1950 as professor of social and moral science. He remained there until 1962, when he returned to Europe, where he held positions at various times at the University of Freiburg in West Germany and the University of Salzburg in Austria.
The Spontaneous Order of Human Society
The realization that something so significant—the price system—was undesigned and not intended to serve the purpose it serves so well became the centerpiece of Hayek’s writings for the rest of his life. He developed the idea in several directions in another series of works, including, The Counter-Revolution of Science (1952); The Constitution of Liberty (1960); Law, Legislation and Liberty in three volumes (1973–1979); in various essays collected in Studies in Philosophy, Politics and Economics (1967) and New Studies in Philosophy, Politics, Economics and the History of Ideas (1978); and in his final work, The Fatal Conceit: The Errors of Socialism (1988).
His underlying theme was that most institutions in society and the rules of interpersonal conduct are, as the eighteenth-century Scottish philosopher Adam Ferguson expressed it, “the result of human action, but not the execution of any human design.” In developing this idea, Hayek consciously took up the task of extending and improving the notion of the “invisible hand” as first formulated by Adam Smith in The Wealth of Nations and refined in the nineteenth century by Carl Menger, the founder of the Austrian school of economics.
Hayek argued that many forms of social interaction are coordinated through institutions that at one level are unplanned and are part of a wider “spontaneous order.” To a large extent, he explained, language, customs, traditions, rules of conduct, and exchange relationships have all evolved and developed without any conscious design guiding them. Yet without such unplanned rules and institutions, society would have found it impossible to progress beyond a rather primitive level.
Another way of expressing this is that, in Hayek’s view, the unique characteristic of an advanced civilization is that no one mind (or group of minds) controls or directs it. In a small tribal society all members often share basically one scale of values and preferences; the chief or leader can know the potentialities of each member and can assign roles and duties so that the tribe’s physical and mental means can be applied more or less successfully to the common hierarchy of ends.
However, once the group passes beyond a simple level of development, any further social progress will require a radical revision of the social rules and order: the complexity of social and economic activity will make it impossible for any individual to master the information necessary to coordinate the members of the group. Nor will the members continue to agree on preferences and values; their actions and interests will become more diverse.
An advanced society, therefore, must always be a “planless” society, that is, a society in which no one overall “plan” is superimposed over the actions and plans of the individuals making up the society. Instead, civilization is by necessity a “spontaneous order,” in which the participants use their own special knowledge and pursue their own individually chosen plans without a higher will or mind guiding them.
The Fallacy of Social Justice
The very complexity that makes it impossible to know all the information required to guide society, Hayek reasoned, makes it equally impossible to judge the “justice” or “worthiness” of an individual’s total actions. As a result, the popular call for “social,” or “distributive,” justice is inapplicable in a free society. Social justice requires not merely that individuals receive what is rightly theirs in general terms, but that individuals and groups also receive some stipulated distributional share of the society’s total output or wealth.
However, Hayek showed that in the market economy, distributions of income are not based on some standard of “deservedness,” but rather on the degree to which the individual has directly or indirectly satisfied consumer demand within the general rules of individual rights and property.
To attempt to distribute income shares by “deservedness” would require the government to establish some overarching standard for disbursing “social justice,” and would necessitate an economic system in which that government had the authority and the power to investigate, measure, and judge each person’s “right” to a share of the society’s wealth.
Hayek suggested that such a system would involve a return to the mentality and the rules of a tribal society: government would have to impose a single hierarchy of ends and would decide what each member should have and what should be expected from him in return. It would mean the end of the free and open society.
Hayek’s Appeal to Intellectual Humility
At the Nobel Prize ceremonies held in December 1974, at which the recipients received their awards, Hayek delivered a brief banquet dinner address in which he said that he wondered if there should be a Nobel Prize in a field like economics because the media often expects the award winner to deliver omniscient-like remarks on all the social and economic problems of the world.
The usefulness of Hayek receiving that Nobel Prize was that it enabled him to present a more formal lecture at the Nobel ceremonies on what he called “The Pretense of Knowledge,” a reminder that economists and policy-makers should remember that we all know far too little to presume to know enough to successfully plan and regulate the world through any political authority.
Thanks to his ideas, the 21st century can be a freer and more prosperous place in which to live, if only we take to heart his appeal to intellectual humility, allow each of us the liberty and latitude to plan our own lives with our individual limited knowledge, and rely upon the open market to coordinate all that we do through the competitive price system.
[Originally published at EpicTimes]
Part 1, published yesterday at Illinois Review, Thursday, October 1, recounted the stellar and above-board behavior of Attorney General Edwin Meese when serving President Reagan, as remembered by Joseph Morris, an honored member of the Chicago Federalist Society, who served as Reagan’s Assistant Attorney General during part of Meese’s tenure as Attorney General. Introductory comments by John Fund were included as a teaser to whet the appetite for further Fund reflections and trustworthy opinions in what is now Part 2.
John Fund reflected on how it was only six years ago that Obama was “a citizen of the world.” His candidacy was like the third coming. However, now the tide seems to be turning. Fund’s observation was linked to CBS’s “60 Minutes” TV program that he had viewed the night before, Sunday, 9/28. Fund drew a blank when he inquired whether any in attendance had viewed the program.
On the broadcast, Obama was asked by Steve Kroft why the U.S. had not anticipated the Islamic State’s threat. Although Obama acknowledged there had been an underestimation of what had been taking place in Syria, he proceeded to place blame on his Director of National Intelligence, James Clapper, and others in the intel community.
Shortly after the interview Ron Fournier, a supporter of President Obama, tweeted:
“I, me, my. It’s their fault.”
To further note: Americans did turn away from watching Obama talk about the nation’s new war on ISIS on CBS’ “60 Minutes.” The sympathetic news media mentioned the massive collapse in ratings for the program, but placed blame on the absence of a preceding football game. It was not surprising that not one individual present to hear John Fund speak had watched Obama on “60 Minutes.”
Fund related an anecdote about James O’Keefe, an American conservative activist who gained national attention for his release of video recordings of workers at ACORN offices in 2009. Entering a polling place in Washington, D.C., dressed as a scruffy-looking individual, O’Keefe, seeing people in line, walked up and asked if Eric Holder was on the voter rolls.
A quick check showed Holder as a registered voter. Without further communication, O’Keefe was handed Holder’s ballot. Deciding to play along, O’Keefe asked whether he had to show his ID and even offered to go out to his car to fetch it, only to be told that poll workers weren’t permitted to ask for IDs. O’Keefe didn’t vote in Holder’s name, but Holder, upon hearing about O’Keefe’s stunt, responded later that “You don’t need an ID to come to the Justice Department.” He further explained that no one had to show an ID to go up to see him in his office whenever they wished.
The Obama legacy of Al Sharpton
Another disappointing aspect of the Obama legacy was Holder, Obama, and Jarrett building up Al Sharpton as the most important civil rights leader in America today. As the new black leader and one whom Obama leans on for advice, Al Sharpton sits in on meetings at the White House; Sharpton’s telephone number is on Eric Holder’s speed dial; Sharpton vacations near where Valerie Jarrett spends her time at Martha’s Vineyard; and Sharpton, now that Holder has tendered his resignation, is engaged in conversation as to Holder’s replacement. All this, and Sharpton claims to have no income, claims to borrow all his suits from friends, and has never apologized for any of the shady and dishonest things he has done, e.g., the Tawana Brawley case of 1987 and the ‘Jena 6’ protest held in Jena, Louisiana, in 2007.
Al Sharpton’s behavior has left this nation divided, politically disconnected, and cynical. Nearly 50 years after Martin Luther King’s March on Washington, race relations remain poor due to hucksters like Al Sharpton and Jesse Jackson. Some liberals are finally getting it, including Margaret Carlson, who said when appearing in August of last year on PBS’s Inside Washington: “We’ve gone from Martin Luther King to the Reverend Al Sharpton, and as a leader . . . it’s very dispiriting.” Rather than having the welcome mat rolled out for Al Sharpton at the White House, John Fund suggested that Sharpton should be run out of town. According to Fund, only a small segment of blacks support Sharpton.
Questions entertained by John Fund
- In referring jokingly to Valerie Jarrett as President Jarrett, John Fund revealed that Jarrett has a 24-hour secret service detail, something Obama’s Chief of Staff doesn’t even receive. “But after all the President does play a lot of golf and is often delayed.” Fund left it at that.
- Regarding the possibility of finding out about the scandal involving Lois Lerner: The Justice Department, having started an investigation in May of 2013, seems to have no intention of following through. The same holds true for the host of other scandals associated with the Obama administration.
- In response to a question about Holder’s replacement, Fund thought the individual would be cut from the same cloth as Holder. After all, President Obama plans to issue a slew of executive orders after the November election. He will need someone to defend his elastic view of executive power divorced from the Constitution.
Related as fact by John Fund was how Tom Perez, as Secretary of Labor, makes use of a separate private computer in his agency which operates as a shadow government. Other officials have been caught using private alias email accounts to hide sent messages. Two top EPA officials used alias email accounts when corresponding with environmental groups: EPA Region 8 Administrator James Martin, as well as EPA Administrator Lisa Jackson, who has since resigned.
The vast majority of the federal government’s statutory inspectors general were appointed by Democratic administrations. It is telling that forty-seven out of seventy-two signed a letter in August of this year stating they can no longer do their jobs effectively because of their inability to get the documents they need to conduct their individual investigations.
The Inspector General Act of 1978, as amended, establishes the responsibilities and duties of an IG. The IG Act has been amended to increase the number of agencies with statutory IGs. In 1988 came the establishment of IGs in smaller, independent agencies and there are now 72 statutory IGs.
The primary job of an inspector general is to detect and prevent fraud, waste, abuse, and violations of law, and to promote economy, efficiency, and effectiveness in the operations of the federal government. Given the dissatisfaction of two-thirds of this nation’s inspectors general, it does not bode well for this nation, nor does it create the environment necessary to rebuild the trust in government so lacking among the American people.
What happened to the promise candidate Obama made to the American people that his government would be the most open and transparent ever, if he were elected? This promise has fallen by the wayside. Perhaps it was never meant to be.
[Originally published at Illinois Review]
Recently two cities, Chattanooga, Tennessee, and Wilson, North Carolina, petitioned the federal government, via the FCC, complaining that state laws constrain them from the municipal provision of broadband services, that is, from building a government-owned network (GON). These municipalities want to expend resources to build and operate broadband systems without following any of the regulations that govern private-sector providers. To overcome the states’ rightful authority, the city governments have proposed that the FCC preempt state law and empower municipalities in ways that upset the political structure of the U.S.
While models of municipality creation vary widely around the world, in the United States the process is fairly clear. The U.S. Constitution recognizes the states as the primary political entities. The federal government itself is a creation of the states, and of the people, with the Constitution placing restraints on government broadly, at the agreement of the states. States in turn are empowered to create subdivisions as they see fit, generically referred to as municipalities. Ultimately, then, responsibility for municipalities falls to the states.
FCC intervention into this relationship between states and municipalities would have profound negative effects as was explained in the FCC filing by Madery Bridge. Municipalities, untethered from responsibility to the state, could partake in risky schemes of tax funded adventurism placing the entire state and all its citizens at risk. And government owned networks have proven risky indeed.
For years, municipalities around the country have tried, and ultimately failed, either to set up their own communications networks or to partner with private companies to get into the broadband business. The list of failures is long and growing; it includes Utah's UTOPIA, Burlington, Vermont; Chicago; Seattle; Tacoma; Minnesota's FiberNet; the Northern Florida Broadband Authority; Philadelphia; and Orlando. To be clear, this is a partial list, and it does not include the many systems whose operators will not disclose whether they are already being bailed out with taxpayers' money. The reasons for the failures are numerous, but they typically end with taxpayer funds being wasted. Some would nitpick the details of the failures, but the fact remains that taxpayer money was put at risk, often without taxpayers' approval, and most often squandered.
Even still, some municipalities want to plow forward, heedless of these lessons, believing they are somehow different. As mentioned, some have been frustrated by laws in at least twenty states designed to prevent fiscal folly on the part of localities, laws that shield all citizens of a state from financial risk. Adopting the failed model of municipal provision of communications services is the wrong idea, as many municipalities across the country can attest.
Municipalities face many risks in building and operating broadband networks. As has been seen in the routine failures, governments chronically underestimate the cost of building out and maintaining networks, and chronically overestimate adoption rates.
Technology infrastructure investment, like most infrastructure investment, is not for the faint of heart or the partially committed. Municipalities and states across the country are constantly challenged by maintaining the relatively static infrastructure that they have already taken on, such as streets, sidewalks, bridges and buildings.
Technology is vastly more challenging. One must jump in with both feet, constantly updating the technology and business models. As online services grow more sophisticated, customers have become accustomed to regular upgrades, challenging the ability of governments to keep up with demand. Those challenges are multiplied a hundredfold when the complications of delivering video and voice are added. Video services alone are in a constant state of upgrade, whether in providing more channels, more programming, or services such as video on demand that allow customers to customize their own viewing experience.
Of course, the greater the variety and complexity of technology and services offered, the more expensive building and operating the system becomes. In turn, even more taxpayer money is placed at risk, because when these systems fail it is not private investors who lose money but taxpayers across the state, often without any say in the matter, and the vast majority of whom received no benefit whatsoever. When local and state coffers are depleted because of these risky government bets, the cry goes up for more tax revenue or for an outright bailout.
In general, technological innovation continues to far outpace the speed of government, which simply cannot compete with the market. So, in the case where a municipal system is competing against a private system, about the time the municipal system is up and running, private networks will offer something better, cheaper, and faster. Even in cases where there is no private sector competition, government operated networks will never keep pace with public expectations. Broadband systems are not like a water public utility where the same pipes are used for one hundred years to deliver the same product in the same way.
The challenge of preserving free speech on government-owned networks is also daunting. The theoretical became real in San Francisco, a city that often brags of its rich tradition of civil liberties. There, a municipal communications system was purposely shut down to prevent people from engaging in specific, legal communications. As one chilling report put it, “Cellphone users may not have liked being incommunicado, but BART officials told the SF Appeal, an online paper, that it was well within its rights. After all, since it pays for the cell service underground, it can cut it off.”
Whether San Francisco should be paying for municipal communications systems at all is a question for the city and state. The more pressing concern is the freedom of speech problems that arise when a municipality owns a communications system.
Importantly, it is rarely the case that government is trying to serve someone with no Internet access option at all. Rather, the most common spark for the government-owned network fantasy is an economic development group swayed by traveling consultants. Their siren song is too hard for some communities to resist, and past failures tend to go unmentioned.
Such localities could better use their time and resources by providing clearer and more rapid approval decisions for wireless facilities, as wireless rapidly has become, and continues to be, the favored method of accessing broadband. Policymakers should sponsor initiatives to encourage broadband deployment into unserved areas using incentives for private sector companies that risk their own capital.
Where state officials of any sort are calling for FCC action, their arguments are merely an attempt to end-run the state's political process and the will of the people. They seek to create public policy they were unable to enact within their own state through proper channels. This is policymaking by the ruling class rather than by the will of the people. State policies should be determined through state legislation or at least through state rulemaking.
Allowing the states to continue to experiment with how broadband will be delivered to the greatest number of their residents is absolutely the right policy to pursue. The FCC should stand on the side of greater creativity and innovation, and of the law, and not intervene in state law.
Pence spent 12 years in Congress building his conservative street cred. He railed against Bush-era policies such as the bailouts and No Child Left Behind, and he launched a quixotic bid to replace John Boehner as minority leader in 2006.
It’s difficult for social conservatives or tax-cutting supply-siders not to love Mike Pence. Only such a self-proclaimed “happy warrior for conservatism” could buck his own party, become the third-highest-ranking Republican in the House, and set fundraising records while becoming the odds-on favorite to replace party darling Mitch Daniels as governor of the Hoosier State.
When asked about his White House ambitions, Pence brushes off the question and says he’s focused on Indiana. But his record says otherwise. The person now occupying the governor’s mansion in Indianapolis doesn’t look like the Mike Pence conservatives originally came to trust. Over the past 18 months, a Pence 2.0 has emerged—one who embraces big-government solutions while claiming to fight against them.
Consider, for example, the Common Core education standards, against which parents around the nation have protested as a nationalization of curriculum with politicized, dumbed-down requirements. In April, Pence penned a triumphant piece in the Indianapolis Star touting Indiana’s first-in-the-nation exit from the controversial program to impose national academic standards.
What happened next came courtesy of Pence 2.0. Indiana didn’t return to its rigorous pre-Common Core standards, described by The Heritage Foundation as “state-driven and, most importantly, supported by teachers and parents.”
Instead, Pence 2.0 implemented Common Core by another name.
Writing at National Review, Stanley Kurtz says Indiana’s new standards are “nothing but a slightly mangled and rebranded version of what they supposedly replace.” Some education experts actually proclaimed the new standards to be even worse than Common Core. Indiana voters responded by ousting two Pence-backed Common Core supporters in the GOP legislative primary.
Unfortunately, Pence 2.0 is just getting started. Faced with a decision about Obamacare, the once-staunch fiscal conservative who led the charge against the law is currently in talks with the federal government to implement Obamacare’s massive Medicaid expansion.
Pence’s proposal, the “Healthy Indiana Plan 2.0,” would use Obamacare dollars to give Medicaid benefits to more than 375,000 able-bodied adults—more than three-quarters of whom have no children. In doing so, Pence 2.0 affirms there’s no principle limiting who should be eligible for government-funded welfare.
But to hear Pence 2.0 tell it, you would think his move to expand Medicaid is positively Reaganesque. In fact, he invoked the Gipper repeatedly while unveiling the plan at the American Enterprise Institute.
A growing number of health policy experts aren’t buying Pence’s revisionism. Writers at Forbes debunked Pence’s argument that his plan is a block grant, noting,
“By definition, Medicaid block grants give states a fixed, lump sum of federal dollars in exchange for broad autonomy in providing Medicaid benefits. Pence’s plan features neither of these elements. Under Pence’s ObamaCare expansion, Indiana will draw down increasing amounts of ObamaCare in exchange for adding more people to the Medicaid rolls.”
The Heritage Foundation calls the plan “disappointing.” And over at The Federalist, Dean Clancy writes, “Two camps are emerging among GOP governors: those who oppose the Obamacare expansion, and those who pretend to. Pence has now officially joined the pretender camp.”
People often say politicians’ campaign promises disappear when they start to govern. That’s certainly the case with Mike Pence 2.0, who has been rebranding big-government policies as conservative and apparently hoping voters won’t notice. Here’s hoping they do.
John Nothdurft (firstname.lastname@example.org) is director of government relations for The Heartland Institute.
Earlier this year, the Obama administration asked ICANN (Internet Corporation for Assigned Names and Numbers) to create a means of overseeing the Internet after U.S. governance is scheduled to end in another year. The administration decided not to maintain the current U.S. minimum-oversight role.
This decision led to a debate over whether transferring control of the Internet root zone functions from the U.S. Department of Commerce to some yet-to-be-determined multi-stakeholder organization is a good thing, especially since some governments want to undermine the permissionless, free-speech Internet built under U.S. oversight.
ICANN is a non-profit organization created to manage, according to its bylaws, “the coordination of maintenance and methodology of several databases of unique identifiers related to the namespaces of the Internet, and ensuring the network’s stable and secure operation…including policy development for internationalization of the DNS system, introduction of new generic top-level domains (TLDs), and the operation of root name servers.”
And as described in the memorandum of understanding with the U.S. government, “ICANN’s primary principles of operation have been described as helping preserve the operational stability of the Internet; to promote competition; to achieve broad representation of the global Internet community; and to develop policies appropriate to its mission through bottom-up, consensus-based processes.”
But things began changing rapidly shortly after a Commerce Department official downplayed the threat of top-down control of the Internet by authoritarian governments in the absence of U.S. oversight.
ICANN has advisory committees that advise the board on the needs of various stakeholders. One of these is the Governmental Advisory Committee, made up of representatives of national governments from around the world. Right now all they do is offer advice; their recommendations can be rejected by a majority vote of the board.
But last month ICANN proposed changing its rules so that government opinion must be followed unless two-thirds of the board objects. Not wasting a moment, Iran, a virtual leader among repressive authoritarian governments, proposed that government opinion become a mandate if a simple majority agrees. The Obama administration's view is not just being ignored but essentially mocked, as authoritarian governments move to make ICANN another puppet of government.
This is not the UN taking control of the Internet, yet. There is, however, intense international pressure to have a UN-type organization take control of Internet governance, fundamentally changing ICANN into an international governmental regulatory agency. There is every reason to believe the core ICANN functions, which the International Telecommunication Union (a small U.N. agency) is already coveting and which many countries are already lobbying to have turned over to U.N. control, will eventually be consumed. Such a change would end a long history of independence for multi-stakeholder organizations set up to perform technical functions of interest to the global community, absorbing them into the U.N.
ICANN has agreed to provide a brief comment period before moving to make the change. But realistically, this fight is just beginning, and the debate will likely go on for years—an ongoing battle between those who favor freedom for individuals to direct their own destiny and those who stand against such liberty.
The U.S. will need to hang tough, stay committed to its fundamental principles, recognize real threats instead of ignoring them, and work to convince the members of the international Internet community to stand steadfast against a bloc of repressive regimes. Another option is to sacrifice the open Internet, our freedoms, and the Internet industry. A third option, disconnecting from the global Internet, may prove to be the only way if the U.S. fails.
[Originally published at Institute for Policy Innovation]
CHICAGO, October 2, 2014 — In 1997 during the Kyoto Protocol Treaty negotiations in Japan, Dr. Robert Watson, then Chairman of the Intergovernmental Panel on Climate Change, was asked about scientists who challenge United Nations conclusions that global warming was man-made. He answered, “The science is settled…we’re not going to reopen it here.” Thus began one of the greatest propaganda lines in support of the theory of human-caused global warming.
On June 19 this year, the University of Northern Iowa held a debate on climate change titled, “Climate Instability: Interpretations of Scientific Evidence.” Dr. Jerry Schnoor of the University of Iowa presented an effective case for the theory of man-made warming and I presented the case for climate change driven by natural causes. The video contains 30 minutes of presentation by each side and then 30 minutes of questions and rebuttal, presented to a small audience of faculty and students.
Formal debates on the theory of human-caused warming are somewhat rare in our society today. Former Vice President Al Gore stated on the CBS Early Show on May 31, 2006:
…the debate among the scientists is over. There is no more debate. We face a planetary emergency. There is no more scientific debate among serious people who’ve looked at the science…Well, I guess in some quarters, there’s still a debate over whether the moon landing was staged in a movie lot in Arizona, or whether the earth is flat instead of round.
EPA Administrator Lisa Jackson declared to Congress in 2010, “The science behind climate change is settled, and human activity is responsible for global warming.” Even President Obama in his 2014 State of the Union address said, “But the debate is settled. Climate change is a fact.”
The Los Angeles Times announced last year that they will not print opinions that challenge the concept that humans are the cause of climate change. The BBC has taken a similar position. Many of our universities will not allow an open debate on climate change. The Department of Meteorology and Climate Science at San Jose State University posted an image last year of two professors holding a match to my book.
In contrast to the “no debate” positions of our political leaders, news media, and many universities, the event at the University of Northern Iowa was a breath of fresh air. Thanks to Dr. Catherine Zeman and the Center for Energy and Environmental Education at UNI for their sponsorship of an open debate on the “settled science” of climate change.
[Originally published at Communities Digital News]
On August 6, 2014, Sean Parnell did a presentation about his new book, The Self-Pay Patient: Affordable Healthcare Choices in the Age of Obamacare as a part of The Heartland Institute’s Author Series. During the presentation, Parnell explained why he wrote the book, what it means to be a self-pay patient, why one might want to be a self-pay patient, and what the book means for the free-market healthcare movement.
Being a self-pay patient means operating without depending on the government or traditional insurance. A self-pay patient may seek healthcare by paying in cash or by using an insurance plan with a high deductible. In short, a self-pay patient wants to operate in something closer to a free market for healthcare.
So, why become a self-pay patient? Parnell gives three main reasons. First, many people do not want the government involved in their healthcare: they don't want their medical information in a national database, and they don't want the government telling them what type of insurance they need or what it must cover. Second, being a self-pay patient eliminates the bureaucracy and the associated headache of dealing with a third party. The last reason Parnell discusses is cost: self-pay is simply less expensive.
The lower cost of being a self-pay patient is the main focus of Parnell’s book and presentation. He gives many examples and resources of how this can save money. For example, Parnell refers to a man who saved thousands of dollars by traveling to a doctor in Oregon after comparing prices for a procedure. He also mentions a website that compiles price lists for those who are looking for cheaper options. These examples are more in line with a real free market that Parnell desires.
Patients are not the only ones who may want to operate this way. Parnell describes doctors and clinics that also cut out the middleman. These clinics would rather accept only cash, check, or card than deal with insurance companies and the government. In many cases, they are intended to handle a limited number of procedures.
But what about dealing with major issues? Parnell describes various mechanisms a person could use to ensure coverage when dealing with a catastrophic event. One example of this is called “a healthcare sharing ministry.” This organization is a group that shares medical expenses. While it operates in a similar manner as a traditional insurance company, the savings could be substantial.
When listening to Parnell, it is obvious that he has a wealth of information regarding the healthcare system. The solutions that he provides are a breath of fresh air for those who are concerned with the continuing implementation of Obamacare. While we may still be far from a free market in healthcare, becoming a self-pay patient may get us a little closer to that goal.
The continuing improvement in international traffic congestion data makes comparisons between cities around the world far easier. The annual (2013) Tom Tom reports have been expanded to include China, adding the world's second largest economy to the previously produced array of reports on the Americas, Europe, South Africa, and Australia/New Zealand. A total of 160 cities are now rated in these Tom Tom Traffic Index reports, providing an opportunity to list the world's 10 most congested and 10 least congested cities among those rated.
Tom Tom provides all-day congestion indexes and indexes for peak hours (the heaviest-traffic morning and evening hours). The indexes rate congestion by the additional time a trip takes compared with the same trip under free flow conditions. For example, an index of 10 indicates that a 30 minute trip would take 10 percent longer, or 33 minutes. An index of 50 means a 30 minute trip will, on average, take 45 minutes.
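The index arithmetic can be sketched in a few lines of Python (a minimal illustration; the function name is ours, and the index of 126 matches the Moscow figure cited below):

```python
def congested_minutes(free_flow_minutes: float, index: float) -> float:
    """Travel time under congestion for a Tom Tom-style index.

    The index is the percentage of extra travel time relative to
    free flow: an index of 50 means a trip takes 50 percent longer.
    """
    return free_flow_minutes * (1 + index / 100)

# The examples from the text, for a 30 minute free flow trip:
print(round(congested_minutes(30, 10)))   # index 10 -> 33 minutes
print(congested_minutes(30, 50))          # index 50 -> 45.0 minutes
print(round(congested_minutes(30, 126)))  # index 126 -> 68 minutes
```

The same formula reproduces each per-city travel time quoted in this article from its index.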
Congestion in Peak Hours: 10 Most Congested Cities
This article constructs an average peak hour index, using the morning and evening peak period Tom Tom Traffic Indexes for the 125 rated metropolitan areas with principal urban areas of more than 1,000,000 residents. The peak hour index is used because peak hour congestion is generally of more public policy concern than all day congestion. This congestion occurs because of the concentration of work trips in relatively short periods of time. Work trips are by no means the majority of trips, but it can be argued that they cause the most congestion. Many cities have relatively little off-peak traffic congestion.
The two most congested cities are in Eastern Europe, Moscow and Istanbul (which stretches across the Bosporus into Asia). Four of the most congested cities are in China, three in Latin America (including all that are rated) and one is in Western Europe (Figure 1).
Moscow is the most congested city, with a peak hour index of 126. This means the average 30 minute trip in free flow conditions will take 68 minutes during peak hours. Moscow has a limited freeway system, but its ambitious plans could relieve congestion. The city has undertaken a huge geographical expansion program, with the intention of relocating many jobs outside the primary ring road. This dispersion of employment, if supported by sufficient road infrastructure, could lead to improved traffic conditions.
Istanbul is the second most congested city with a peak hour traffic index of 108. The average free flow 30 minute trip would take 62 minutes during peak hours.
Rio de Janeiro is the third most congested city, with a peak hour traffic index of 99.5. The average free flow 30 minute trip takes 60 minutes due to congestion during peak hours.
Tianjin, which will achieve megacity status in 2015, and which is adjacent to Beijing, is the fourth most congested city, with an index of 91. In Tianjin, the peak hour congestion extends a free flow 30 minute trip to 57 minutes.
Mexico City is the fifth most congested city, with a peak hour traffic index of 88.5. The average free flow 30 minute trip takes 57 minutes due to congestion.
Hangzhou (capital of Zhejiang, China), which is adjacent to Shanghai, has the sixth worst traffic congestion, with a peak period traffic index of 87. The average 30 minute trip in free flow takes 56 minutes during peak hours.
Sao Paulo has the seventh worst traffic congestion, with a peak hour index of 80.5. The average 30 minute trip in free flow takes 54 minutes during peak periods. Sao Paulo’s intense traffic congestion has long been exacerbated by truck traffic routed along the “Marginale” near the center of the city. A ring road now is mostly complete, but the section most critical to relieving traffic congestion from trucks is yet to be opened.
Chongqing has the eighth worst traffic congestion, with a peak hour index of 78.5. As a result, a trip that would take 30 minutes in free flow conditions takes 54 minutes during peak hours.
Beijing has the ninth worst traffic congestion, with a peak hour index of 76.5. As a result, a trip that should take 30 minutes in free flow is likely to take 53 minutes during peak hour. In spite of recent reports of its intense traffic congestion, Beijing rates better than some other cities. There are likely two causes. With its seventh ring road now planned, Beijing has a top-flight freeway system. Its traffic is also aided by its dispersion of employment: the lower-density, government-oriented employment core is flanked on both sides by major business centers (“edge cities”) on the Second and Third Ring Roads. This disperses traffic.
Brussels has the 10th worst peak hour traffic congestion, with an index of 75. A trip that would take 30 minutes at free flow takes 53 minutes in peak hour congestion.
Seven of the 10 most congested cities are megacities (urban areas with populations over 10 million). The exceptions are Hangzhou, Chongqing, and Brussels. Brussels has by far the smallest population, at only 2.1 million residents, little more than one-third the size of the second smallest city, Hangzhou.
Most Congested Cities in the US and Canada
The most congested US and Canadian cities rank far down the list. Los Angeles is tied with Paris, Marseille, and Ningbo (China) at a peak hour congestion index of 65. It may be surprising that Los Angeles does not rank much higher. Los Angeles has been the most congested city in the United States since displacing Houston in the 1980s. The intensity of Los Angeles traffic congestion is driven by the highest urban area density in the United States and by important gaps left where parts of the planned freeway system were canceled. Nonetheless, Los Angeles is aided by a strong dispersion of employment, which helps make its overall work trip travel times the lowest among the world megacities for which data are available. Part of the Los Angeles advantage is its high automobile usage, which shortens travel times relative to megacities with much larger transit market shares (such as Tokyo, New York, London, and Paris).
Vancouver is Canada's most congested city, with a peak period index of 62.5, giving it the 27th worst traffic congestion, tied with Stockholm. Vancouver had exceeded Los Angeles in traffic congestion in the 2013 mid-year Tom Tom Traffic Index report.
Least Congested Cities
All but one of the 10 least congested large cities in the Tom Tom report are in the United States. The least congested is Kansas City, with a peak period index of 19.5, indicating that a 30 minute trip in free flow is likely to take 36 minutes due to congestion. Kansas City has one of the most comprehensive freeway systems in the United States and has a highly dispersed employment base. US cities also occupy the second through the sixth least congested positions (Cleveland, Indianapolis, Memphis, Louisville and St. Louis). Spain’s Valencia is the seventh least congested city, while the eighth through 10th positions are taken by Salt Lake City, Las Vegas and Detroit.
Cities Not Rated
A number of other highly congested cities are not yet included in international traffic congestion ratings. Data in the 1999 publication Cities and Automobile Dependence: A Sourcebook indicated that the greatest density of traffic among rated cities was in Seoul, Bangkok, and Hong Kong, while Singapore, Kuala Lumpur, Jakarta, Tokyo, Surabaya (Indonesia), Zürich, and Munich also had intense traffic congestion. Later data would doubtless add Manila to the list. The cities of the Indian subcontinent also experience extreme, though as yet unrated, traffic congestion. It is hoped that traffic indexes will soon be available for these and other international cities.
Determinants of Traffic Congestion
An examination (regression analysis) of the peak period traffic indexes indicates an association between higher urban area population densities and greater traffic congestion, with a coefficient of determination (R²) of 0.48, significant at the one percent level of confidence (Figure 2). This is consistent with other research equating lower densities with faster travel times and showing increasing automobile use in response to higher densities.
At the regional level, a similar association is apparent. The United States, with the lowest urban population densities, has the least traffic congestion. Latin America, Eastern Europe, and China, with higher urban densities, have worse traffic congestion. Density does not explain all the differences, however, especially among geographies outside the United States. Despite its high density, China's traffic congestion is less intense than that of Eastern European and Latin American cities. It seems likely that this is due, at least in part, to the better matching of roadway supply with demand in China, with its extensive urban freeway systems. Further, the cities of China often have a more polycentric employment distribution (see table).

Traffic Congestion & Urban Population Density

Region                     Peak Hour     Urban Population Density
                           Congestion    Per Square Mile   Per KM2
Australia & New Zealand    49.2          4,600             1,800
Canada                     49.4          5,000             1,900
China                      64.9          15,700            6,100
Eastern Europe             80.8          11,800            4,500
Latin America              89.5          19,600            7,600
United States              37.1          3,100             1,200
Western Europe             47.4          8,700             3,400
South Africa               52.4          8,300             3,200

Peak Hour Congestion: Average of Tom Tom Peak Hour Congestion Indexes 2013
Population Densities: Demographia World Urban Areas
Both of these factors, high-capacity roadways and the dispersion of population as well as jobs, are also important contributors to the lower congestion levels in the United States.
Wendell Cox is principal of Demographia, an international public policy and demographics firm. He is co-author of the “Demographia International Housing Affordability Survey” and author of “Demographia World Urban Areas” and “War on the Dream: How Anti-Sprawl Policy Threatens the Quality of Life.” He was appointed to three terms on the Los Angeles County Transportation Commission, where he served with the leading city and county leadership as the only non-elected member. He was appointed to the Amtrak Reform Council to fill the unexpired term of Governor Christine Todd Whitman and has served as a visiting professor at the Conservatoire National des Arts et Metiers, a national university in Paris.
Photo: On the Moscow MKAD Ring Road
[Originally published at New Geography]
In this 3-minute video, titled “Stay away from Muni Broadband,” Scott Cleland lays out multiple reasons why we should be wary about Municipal Broadband.
First of all, it is preposterous to think of the government as a competitor. The government does not operate under the same rules as its private sector counterparts; in fact, the government makes the rules. It sets taxes, regulates industry, and can erect barriers to entry. By using these powers, the government holds a large advantage over the competition. As Cleland states, “you can’t fight city hall.”
Competing with the government can easily become an unfair fight. If the government begins to lose, it can use its powers to gain an edge. By rewriting regulations or fees, the government can injure its competition, an advantage its private-sector competitors do not have. And these advantages come at the expense of the taxpayer.
At best, municipal broadband is a waste of taxpayer money. As Cleland states, “In the past, these municipal broadband things have been boondoggles and have cost local and state coffers millions upon millions upon tens-of-millions of bonds that then were essentially bankrupt.” It is also a gamble: the government proposes to spend money competing with the private sector in a field in which it has no experience.
The last argument Cleland gives against municipal broadband is the possibility of government snooping. With the government running the internet connection, it would have the opportunity to check your email or monitor your internet searches.
This short video by Cleland does a great job of explaining some of the arguments against municipal broadband. The government should stick to tasks we deem necessary while leaving the internet in the hands of the private sector.
Would you want the textbooks at your child’s school to teach your kids the Soviet Union still exists and is the greatest danger your child will face in their lifetime? Would you want the computers at your child’s school to use dial-up internet or run on Windows 98? Of course not, because these resources are out of date, and parents want their kids to get the best education possible.
To ensure the quality of the education provided to students, the Texas State Board of Education has begun the process of updating its textbooks to reflect the latest information and advancements in history and science, because part of giving kids the best education possible means giving them access to the best resources available.
We no longer teach kids about the current status of the Berlin Wall, so why would we teach our kids about climate change by using climate models and textbooks that are similarly out-of-date and out-of-touch with reality?
Many people don’t know that mean global temperatures have not risen significantly for the last 17 years, meaning no significant global warming has occurred since you first bought Windows 98. Some peer-reviewed scientific studies suggest the current period of no global warming has lasted as long as 20 years. The scientific analyses and climate models that predicted drastic global warming over this period were simply wrong, so it makes sense to reexamine the issue in light of new evidence and teach our children accordingly.
This common-sense approach to science and education has somehow managed to ruffle the feathers of left-leaning special-interest groups who want to protect the status quo. These groups repeat the tired and thoroughly debunked claim that 97 percent of climate scientists believe climate change is real, man-made, and dangerous. But citing cherry-picked survey data and flawed studies does not help children understand the science of the climate, nor does relying on climate models that have been so inaccurate in predicting temperatures for the past two decades. The reason these models are such poor predictors of global temperatures is that they assume we have complete knowledge of the climate system, or at least enough to predict the future accurately. But as we’ve seen, that isn’t true.
Assuming we have complete knowledge of the global climate system is like assuming we know everything there is to know about the creatures in the oceans or the ecosystems of the rainforests. We just don’t have all that information. Scientists are constantly discovering new species of animals we never knew existed, such as the Yin-Yang Frog from Vietnam and the subterranean blind fish, which lives in an underground river that runs for 4.3 miles through limestone caves.
Appealing to "scientific consensus" is problematic, too, because scientists don’t always know what they think they know. For example, scientists have rediscovered several species of animals they had long considered extinct, such as the Coelacanth, a species of fish thought to have disappeared 65 million years ago until it was rediscovered in 1938. Another species, the Bermuda petrel, was believed to have been extinct since the 1620s, but it was rediscovered in 1951, falsifying more than 330 years of "scientific consensus." Although perhaps not as cute as the New Caledonian crested gecko, all these animals demonstrate science is never settled, and we must adjust our opinions to reflect new evidence as it becomes available. That is the scientific way.
No one is saying humans have zero effect on the climate, but there is legitimate disagreement over how much. Considering CO2 emissions have increased dramatically but temperatures have remained steady or fallen slightly in the past 20 years, it is reasonable to argue natural forces have an impact on the climate that is equal to or significantly greater than that of humans.
With the U.S. falling behind the rest of the world in science education, we should applaud, not condemn, the Texas Board of Education for trying to teach kids how to think instead of what to think – especially when the “consensus” is still running on Windows 98.
[Originally published at The Houston Chronicle]
As president of the Chicago Lawyers’ Chapter of the Federalist Society, founded in 1982 as a group of conservatives and libertarians interested in the current state of the legal order, Laura Kotelman welcomed those who had come to have “Lunch with Author John Fund” on Monday, September 29 at the Union League Club, 65 West Jackson, Chicago, IL. John Fund is a National Affairs columnist for National Review magazine and on-air analyst on the Fox News Channel. He is considered a notable expert on American politics and the nexus between politics and economics and legal issues. Previously Fund served as a columnist and editorial board member for The Wall Street Journal.
While John Fund was in Chicago speaking to the Chicago Lawyers’ Chapter of the Federalist Society, co-author Hans von Spakovsky was at his venue in Toledo, Ohio, doing the same to promote their book, Obama’s Enforcer: Eric Holder’s Justice Department, which catalogues the abuses of power at the Department of Justice under Attorney General Holder. The book sets forth how Attorney General Eric Holder Jr. has politicized the Justice Department and put the interests of left-wing ideology and his political party ahead of the fair and impartial administration of justice.
Remarks made by Federalist member Joseph A. Morris, prior to his introduction of John Fund, provided a perfect segue to what Fund later shared about Eric Holder as President Obama’s Attorney General. Morris, a former Assistant Attorney General and Director of the Office of Liaison Services with the U.S. Department of Justice under Ronald Reagan, was eminently qualified to paint an accurate profile of Edwin Meese III, who served as U.S. Attorney General under President Reagan. In how they conducted the office of Attorney General, Edwin Meese III under Reagan and Eric Holder under Obama were indeed worlds apart.
It was out of great respect for Joseph Morris by members of The Chicago Lawyers’ Chapter of the Federalist Society that Laura Kotelman introduced Morris as “our home town hero.” Joseph Morris is a partner with Morris & De La Rosa in Chicago.
Joseph Morris comments on Edwin Meese as Reagan’s Attorney General
Joseph Morris directed those in attendance to an Opinion piece that appeared on the morning of the Fund event (9/29) in the Wall Street Journal, “Holder’s Legacy of Racial Politics,” by Edwin Meese III and Kenneth Blackwell, former Ohio Secretary of State. The article relates how Eric Holder battled against state voter-ID laws despite all the evidence of their fairness and popularity. According to Morris, the only reason for opposing sensible voter-ID laws is a “desire for votes.”
Joe Morris, in reflecting upon Edwin Meese III, spoke of Meese as Governor Reagan’s legal advisor and head of Reagan’s campaign committee in 1980. When Reagan took office, Meese went along with Reagan as one of his three staff assistants. Howard Baker later became Chief of Staff in Reagan’s second administration. With the surfacing of the Iran Contra scandal in Reagan’s 2nd administration, Edwin Meese, having been appointed Attorney General well before the scandal broke, was assigned by President Reagan to investigate the matter. Unlike Messrs. Obama and Holder, Meese saw the job of the Attorney General as one to pursue the truth, not to cover up an internal administration scandal.
It was during Reagan’s second term with the emergence of the Iran Contra scandal that Joe Morris, serving under Reagan at the time as both Chief of Staff and General Counsel of USIA (United States Information Agency), was asked to assist the Reagan White House. Morris recalls receiving two envelopes from the White House asking that he and his entire staff at USIA assist in the Iran Contra investigation by preserving all the facts (documents, dates, etc.). The instructions to be forthcoming about preserving records and being cooperative in the Iran-Contra investigation were received not only by Joseph Morris at USIA, but were also sent by President Reagan and Attorney General Meese to every other relevant agency of the U.S. Government.
Edwin Meese, as Attorney General, told Morris that he wanted the investigation to be taken seriously. It was through Morris’ involvement in the Iran Contra Scandal, while performing his dual roles at the USIA, that he was brought into Reagan’s Justice Department as Assistant Attorney General under Edwin Meese. Because of this relationship with Edwin Meese, Morris was able to present an accurate account of the way Meese conducted himself in his role of Attorney General under Reagan. Meese, in his role as Attorney General, sat in on the meetings of the NSC (National Security Council) responsible for coordinating policy on national security issues. It didn’t take long for Meese to observe that as the only lawyer among the participants, he alone was able to advise in a way that was consistent with the Constitution.
Four principles championed by Attorney General Meese
Joseph Morris set out these four principles followed by Attorney General Meese under the Reagan administration:
- Rule of Law must always follow the truth, wherever it goes, letting the facts speak for themselves.
- The structure of the government (system of procedure) was revamped so staff members could be brought together in an open channel of communication.
- No stranger to controversy, Edwin Meese did not shrink from what he considered his responsibility. On December 4, 1986, Attorney General Edwin Meese III requested that an independent counsel be appointed to investigate Iran-Contra matters. On December 19, the three judges on the appointing panel named Lawrence Walsh, a former judge and deputy attorney general in the Eisenhower Administration, to the post.
- Fighting a battle of ideas, Meese was willing to debate the “originalist” perspective of the Constitution. In 1985, Attorney General Edwin Meese III delivered a series of speeches challenging the then-dominant view of constitutional jurisprudence and calling for judges to embrace a “jurisprudence of original intention.” There ensued a vigorous debate in the academy, as well as in the popular press, and in Congress itself over the prospect of an “originalist” interpretation of the Constitution.
John Fund speaks
In introducing John Fund, Joseph Morris spoke of Fund as being a hard worker and a close student of the Department of Justice for thirty years, with a particular interest in the soft underbelly of the election system. Morris recalled how John Fund would call him, asking to have lunch to talk about Chicago politics. John Fund would, without fail, have with him a list of well-thought-out questions to ask, such as: “Could this Blagojevich person really become mayor?” Later on: “What about Rahm Emanuel running for mayor in Chicago with all his ties to Obama?”
The above reference made by Morris to Emanuel’s mayoral candidacy became the focus of John Fund’s opening remarks. Fund related how Rahm Emanuel was one of only a few individuals who had ever apologized to him over something he had written. What prompted Emanuel’s apology was a debate with Fund at Northwestern University in Evanston, IL, at which Emanuel called Fund names that could only be described as over-the-top.
Expressing his delight to be back in Chicago again, while his co-author was in Toledo, Ohio, John Fund felt he had drawn the better half of the straw. There followed a pithy comment by Fund about the resignation the week before (Thursday, September 25) of Eric Holder as Attorney General due to a conflict of forces. Fund suggested that Holder’s new job title be “Permanent Witness.”
Part 2: John Fund’s knowledge and wit will be shared as he elaborates on the way Eric Holder viewed his position as Attorney General, as reflected by his behavior while serving President Obama. Additional thoughts on the direction of this nation will also be covered.
[Originally published at Illinois Review]
Senior citizens (age 65 and over) are dispersing throughout major metropolitan areas, and specifically away from the urban cores. This is the opposite of the trend suggested by some planners and media sources who claim that seniors are moving to the urban cores. For example, one headline, “Millions of Seniors Moving Back to Big Cities,” sits atop a story with no data, only anecdotes that are at least as much suburban (Auburn Hills, in the Detroit area) and college town (Oxford, Mississippi and Lawrence, Kansas) as they are big city. Another article, “Why Seniors are Moving to the Urban Core and Why It’s Good for Everyone,” is also anecdote-based, giving prominence to a solitary housing development in downtown Phoenix (more about Phoenix below).
Senior Metropolitan Growth Trails National
Between 2000 and 2010, the nation’s senior population increased approximately 5.4 million, an increase of 15 percent. Major metropolitan areas accounted for approximately 50 percent of the increase (2.7 million) and also saw their senior population increase 15 percent. By contrast, these same metropolitan areas accounted for 60 percent of overall growth between 2000 and 2010, indicating that most senior growth is in smaller metropolitan areas and rural areas.
Senior Metropolitan Population Dispersing
The number of senior citizens living in suburbs and exurbs of major metropolitan areas (over 1,000,000 population) increased between 2000 and 2010, according to census data. The senior increases were strongly skewed away from the urban cores. Suburbs and exurbs gained 2.82 million senior residents over the period, while functional urban cores lost 112,000. The largest increase was in the later suburbs, which added 1.64 million seniors. The second largest increase was in exurban areas, with a gain of 0.88 million seniors. The earlier suburbs (generally inner suburbs) added just under 300,000 seniors (Figure 1).
During that period, the share of senior citizens living in the later suburbs increased 35 percent. The senior citizen population share in the exurbs rose nearly 15 percent. By contrast, the share of seniors living in the functional urban cores declined 17 percent. Their share in the earlier suburbs declined 11 percent.
This is based on an analysis of small area data for major metropolitan areas using the City Sector Model.
City Sector Model analysis avoids the exaggeration of urban core data that necessarily occurs from reliance on the municipal boundaries of core cities (which are themselves nearly 60 percent suburban or exurban, ranging from as little as three percent to virtually 100 percent). It also avoids the use of the newer “principal cities” designation of larger employment centers within metropolitan areas, nearly all of which are suburbs, but are inappropriately joined with core municipalities in some analyses. The City Sector Model small-area analysis method is described in greater detail in the Note below.
Pervasive Suburban and Exurban Senior Gains
The gains in functional suburban and exurban senior population were pervasive. Among the 52 major metropolitan areas, there were gains in 50. In two areas (New Orleans and Pittsburgh), there were losses. However, in each of these cases there was an even greater senior loss in the functional urban cores. In no case did urban cores gain more or lose fewer seniors than the suburbs and exurbs. Eight of the functional urban cores experienced gains in senior population, while 44 experienced losses (Figure 2).
Largest Urban Cores
The major metropolitan areas with the largest urban cores (more than 20 percent of the population in the functional urban cores) would tend to be the most attractive to seniors seeking an urban core lifestyle. But they still saw their seniors heading to the suburbs and exurbs (Figure 3). Senior populations declined in the functional urban cores of all but two of these nine areas, New York and San Francisco. However, in both of these metropolitan areas, the increases in suburban and exurban senior populations overwhelmed the increases in the urban cores. All nine of these major metropolitan areas experienced increases in their suburban and exurban senior populations.
Moreover, the Phoenix anecdote cited above is at odds with the reality that the later suburbs and exurbs gained 165,000 seniors between 2000 and 2010. The earlier suburbs lost 7,000 seniors. (No part of Phoenix has sufficient density or transit market share to be classified as functional urban core.)
Consistency of Seniors Trend with Other Metropolitan Indicators
As has been indicated in previous articles, there continues to be a trend toward dispersal and decentralization in US major metropolitan areas. There was an overall population dispersion from 1990 to 2000 and from 2000 to 2010, continuing trends that have been evident since World War II and even before, as pre-automobile era urban cores have lost their dominance. Jobs continued to follow the suburbanization and exurbanization of the population over the past decade, as cities became less monocentric, less polycentric, and more “non-centric.” As a result, work trip travel times are generally shorter for residents where population densities are lower. Baby boomers and Millennials have been shown to be dispersing as well, despite anecdotes to the contrary (Figure 4). The same applies to seniors.
Note: The City Sector Model allows a more representative functional analysis of urban core, suburban and exurban areas, by the use of smaller areas, rather than municipal boundaries. The more than 30,000 zip code tabulation areas (ZCTA) of major metropolitan areas and the rest of the nation are categorized by functional characteristics, including urban form, density and travel behavior. There are four functional classifications, the urban core, earlier suburban areas, later suburban areas and exurban areas. The urban cores have higher densities, older housing and substantially greater reliance on transit, similar to the urban cores that preceded the great automobile oriented suburbanization that followed World War II. Exurban areas are beyond the built up urban areas. The suburban areas constitute the balance of the major metropolitan areas. Earlier suburbs include areas with a median house construction date before 1980. Later suburban areas have later median house construction dates.
Urban cores are defined as areas (ZCTAs) that have high population densities (7,500 or more per square mile or 2,900 per square kilometer or more) and high transit, walking and cycling work trip market shares (20 percent or more). Urban cores also include non-exurban sectors with median house construction dates of 1945 or before. All of these areas are defined at the zip code tabulation area (ZCTA) level.
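The classification rules in the Note above can be summarized in a short sketch. The field names and data structure below are illustrative assumptions, not the actual City Sector Model dataset; the thresholds (7,500 persons per square mile, 20 percent transit/walk/cycle share, 1945 and 1980 median construction dates) are taken from the definitions given in the Note.

```python
# Hypothetical sketch of the City Sector Model's four-way ZCTA
# classification, based on the thresholds described in the Note.
# Attribute names are illustrative assumptions, not the model's schema.
from dataclasses import dataclass

@dataclass
class ZCTA:
    density_per_sq_mile: float       # population density
    transit_walk_cycle_share: float  # work-trip market share, 0 to 1
    median_house_year: int           # median house construction date
    in_built_up_area: bool           # inside the built-up urban area

def classify(z: ZCTA) -> str:
    """Assign a ZCTA to one of the four functional sectors."""
    # Exurban areas are beyond the built-up urban area.
    if not z.in_built_up_area:
        return "exurban"
    # Urban core: high density AND high transit/walk/cycle share,
    # or a non-exurban area with median construction of 1945 or before.
    if (z.density_per_sq_mile >= 7500
            and z.transit_walk_cycle_share >= 0.20) \
            or z.median_house_year <= 1945:
        return "urban core"
    # Remaining suburbs are split by median house construction date.
    return "earlier suburban" if z.median_house_year < 1980 else "later suburban"
```

For example, a dense, transit-heavy tract built around 1960 would fall in the urban core, while a low-density tract of the same vintage inside the built-up area would be classified as earlier suburban.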
Wendell Cox is principal of Demographia, an international public policy and demographics firm. He is co-author of the “Demographia International Housing Affordability Survey” and author of “Demographia World Urban Areas” and “War on the Dream: How Anti-Sprawl Policy Threatens the Quality of Life.” He was appointed to three terms on the Los Angeles County Transportation Commission, where he served with the leading city and county leadership as the only non-elected member. He was appointed to the Amtrak Reform Council to fill the unexpired term of Governor Christine Todd Whitman and has served as a visiting professor at the Conservatoire National des Arts et Metiers, a national university in Paris.
Photo: Later Suburbs of Cincinnati (where most senior growth occurred from 2000 to 2010). By Author
[Originally published at New Geography]